What are Regression Algorithms
- 03:09
The fundamentals of regression algorithms, and how they are used to make estimates based on characteristics of data.
Transcript
So what actually are regression algorithms? Well, regression algorithms estimate things. For example, if you're performing equity research, you might want to estimate that a stock will return 5%, 15%, or lose 10%. Or if you're inviting institutional investors to commit 100 million dollars each to a debt offering, you might estimate that they will ultimately commit 180 million, 100 million, or 65 million, or maybe decline altogether and give you nothing.

Each of these estimates in a regression algorithm is based on certain characteristics of each thing that you're estimating. So if that thing is a stock, you're estimating based on characteristics like the stock's current price, the target price, the beta of the underlying stock, and the industry that the company is in. Or if you're estimating something like an investor's behavior, you might look at characteristics like the fees, credit ratings, or covenants of their past transactions.

At first, your algorithm doesn't know how to estimate anything. You have to train it. You train your algorithm by giving it examples with characteristics linked to known values. So, for example, stocks that returned 25% or lost 10%, along with the historical price, historical target, historical beta, and the industry that you would've used at the time to make an estimate. So you're looking at the result, that 25% return or the 10% loss, and then you're pairing it with the characteristics that you would've used to make a guess at the time. Or you're looking at investors that committed a certain amount of money to historical debt offerings, along with the characteristics of those offerings: the fees, credit ratings, and covenants that you would've used to make an estimate at the time.

So you give those characteristics to your algorithm, and when you train your algorithm, it invents a rule that it can use to make an estimate based on given characteristics with a high degree of accuracy.
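To make the training step concrete, here is a minimal sketch in plain Python. The numbers are entirely hypothetical, and for simplicity it uses a single characteristic (a stock's beta) and fits a straight line by ordinary least squares; the transcript does not specify any particular algorithm, so this is just one simple way a regression rule can be "invented" from historical examples.

```python
# Hypothetical training data: each position pairs a characteristic you would
# have known at the time (the stock's beta) with the known outcome (its
# realized return). 0.25 means a 25% return; -0.10 means a 10% loss.
historical_betas = [0.8, 1.0, 1.2, 1.5, 0.6]
historical_returns = [0.05, 0.08, 0.15, 0.25, -0.10]

def train(xs, ys):
    """Fit a line y = slope * x + intercept by ordinary least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    # This (slope, intercept) pair is the "rule" that training invents.
    return slope, intercept

slope, intercept = train(historical_betas, historical_returns)
```

In practice you would use many characteristics at once (price, target, beta, industry) and a library such as scikit-learn rather than hand-rolled math, but the idea is the same: known outcomes plus historical characteristics go in, a reusable rule comes out.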
Then you can give your algorithm different things with unknown values and ask your algorithm to predict the actual values based on their characteristics. So, for example, the trained algorithm might apply this rule that it created to new stocks and their new characteristics, and make an estimate about the return so that you can make investment decisions. Or the trained algorithm might apply its rule to investors' characteristics and estimate that they're going to commit a certain dollar amount to a transaction. That'll help you predict if your debt offering will succeed or fail, and then you can change the characteristics of the deal that you're trying to structure, by increasing fees or tightening covenants, until you're confident that your debt offering will be fully subscribed.

In terms of machine learning vocabulary, each thing is called an observation. Each estimate is called a prediction. Each characteristic in your dataset is called a feature. And the rule is a trained model.

Ask yourself: where could you use a regression algorithm in your job? If you can't think of an example, consider this. Have you ever performed an analysis using historical data to predict a numeric value? That could be an opportunity to use regression. Once you find an opportunity, the rest is easy. All you need to do is name the ingredients. What are your predictions? What are the observations that you're estimating? What are the features of those observations that you use to make a prediction? And do you think a model could describe the relationship between these features and the prediction?
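The prediction step and the vocabulary can be sketched the same way. This continues the single-feature illustration: the fitted (slope, intercept) pair plays the role of the trained model, each new stock is an observation, its beta is a feature, and each estimated return is a prediction. The specific numbers are hypothetical stand-ins for values a real training step would produce.

```python
# The "trained model": a rule learned earlier, here just a slope and intercept.
trained_model = (0.36, -0.28)

def predict(model, feature):
    """Apply the trained model's rule to one feature value."""
    slope, intercept = model
    return slope * feature + intercept

# New observations with unknown outcomes: betas of stocks not seen in training.
new_betas = [0.9, 1.3]

# One prediction per observation: the model's estimate of each stock's return.
predictions = [predict(trained_model, beta) for beta in new_betas]
```

With multiple features, `predict` would combine them (for example, a weighted sum), but the shape of the workflow is unchanged: observations in, predictions out.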