Archive for the ‘Modelling’ Category
A great example of a historical forecasting methodology is PECOTA. PECOTA forecasts a player's future performance by first going back in time and finding historical players who are “similar”. It then uses information on how those historical players evolved over their careers to project how the current player will evolve. For forecasting player performance, this approach is about as good as it gets.
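The comparables idea can be sketched in a few lines. This is a loose illustration only, assuming a toy nearest-neighbour scheme; the players, statistics, and numbers below are all hypothetical and are not the actual PECOTA methodology.

```python
# A loose sketch of comparables-based forecasting, in the spirit of PECOTA.
# All players, stats, and numbers here are invented for illustration.

def similarity(a, b):
    """Negative squared distance across the attributes shared by a and b."""
    return -sum((a[k] - b[k]) ** 2 for k in a if k in b)

def forecast(current, history, k=2):
    """Project next-season value for `current` from the observed
    year-over-year changes of its k most similar historical players."""
    comps = sorted(history,
                   key=lambda h: similarity(current, h["profile"]),
                   reverse=True)[:k]
    avg_change = sum(h["next_year_change"] for h in comps) / k
    return current["value"] + avg_change

# Hypothetical historical players: profile stats, plus how each
# player's value actually changed the following season.
history = [
    {"profile": {"age": 27, "value": 0.300}, "next_year_change": -0.010},
    {"profile": {"age": 28, "value": 0.310}, "next_year_change": -0.015},
    {"profile": {"age": 22, "value": 0.250}, "next_year_change": +0.020},
]

player = {"age": 27, "value": 0.305}
print(forecast(player, history))
```

The real system would use many more attributes, sensible scaling of each attribute, and weighting of comparators by similarity; the point is only that the forecast is built directly out of selected history.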
Not all forecasts are this explicitly historical, but every forecast is, at bottom, an intelligent selection of historical comparators.
This key relationship indicates why forecasts will always need both quantitative and qualitative components. Quantitative components — from numerical data that describes the past — are key to anchoring estimates of magnitude in an objective way.
Qualitative components are necessary to adjust for limitations in the data and to accommodate the possibility that this time “really is different”. Frequently, historical data exists because of convenience or some other business purpose; rarely is it directly applicable to the current problem. Judging when “this time really is different” requires, as is perhaps obvious, a significant degree of wisdom.
This post may appear to be a truism, but I've found that model interpretation and forecasting errors frequently stem from a lack of appreciation regarding the relationship between history and prediction. Curiosity, energy, and time are all required to investigate the past in a comprehensive way. It is difficult — even in retrospect — to identify key causes for historical events. It is exponentially more difficult to select and measure which of those causal relationships will be the key drivers in the future.
Companies would do well to keep in mind that forecasts are as much about the past as they are about the future. The better you know where you have been, and why, the better you will be able to navigate to where you are going.
Even quantitative measures are susceptible to subjective interpretation and to biases in how the data is selected. Nevertheless, used wisely, quantitative evaluation provides a degree of dispassion.
“The Machine Knows” is a classic clip from The Office that teaches us about actuarial modeling:
There are many lessons embedded in these two minutes.
1) Be careful with model interpretation. Michael wanted to interpret the result literally; be warned, if you do this, you may get very wet.
2) Models are only guides. The Map is Not the Territory; the model is not reality. From the clip, it seems possible that the GPS system (a.k.a. the Map, a.k.a. the Model) was wrong and that Michael was following it into disaster (as these individuals did in real life). Even if the GPS was correct, the clip illustrates that you must pay attention to reality itself, regardless of what you believe any model says.
2-alternate) Michael fell prey to the Reification Fallacy, one of the most prevalent and powerful modeling fallacies.
3) If you can’t understand a model, then be warned that disastrous results may follow. Sometimes, models erroneously embolden you. Always be humble when interpreting model results, and be open to contrary evidence. All models are wrong; some are useful.
4) Don’t be a passenger in a car driven by someone who takes his models literally. Unless you make your living in disaster recovery.
Tomorrow I’ll post on the analysis of risk, and how that could be applied to this video.
Russ Roberts recently interviewed Sam Altman, of YCombinator. In this EconTalk episode, Sam offered the following insights into what level of planning he expects to see from those who apply to become part of YCombinator:
Sam Altman: I’ve never written [a business plan] in my life. At the stage that we are operating at, it’s irrelevant. Like financial projections also we never look at. … We would rather them spend the time working on their product, talking to users. What we care about is: Have you built a product? Have you spoken to users? Can we see that? Can we talk about where it may evolve?
I think this is exactly right. As actuaries, our first instinct is to measure, quantify, and plan. However, you don’t have to have a detailed financial plan before engaging in an activity. What you have to have is a rational basis for believing that the activity has substantial merit. The level of modeling and projections must be related to (a) the ability of the forecaster to model accurately, and (b) the relative cost of producing the forecast. There is art in knowing when to model and when not to.
For young start-ups, the ability to forecast accurately is low, and the cost of forecasting is high, especially the opportunity cost. Further, if the business case has merit, the value proposition has to be easy to explain or it won’t take off. Business plans and pro formas should properly be viewed as a means of communication, not as ends in themselves. Frequently, the idea and business prospects are best communicated in words, with examples, or with simple math that demonstrates scalability, and the simplest communication vehicle is often the most persuasive.
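As a hypothetical example of the kind of simple scalability math that can replace a full pro forma (every figure below is invented for illustration):

```python
# Hypothetical unit economics for a start-up pitch. All numbers invented.
# One line of arithmetic can communicate scalability better than a
# hundred-row financial projection.

price_per_user = 10.0         # monthly revenue per user ($), assumed
variable_cost_per_user = 2.0  # hosting/support cost per user ($), assumed
fixed_costs = 50_000.0        # monthly fixed costs ($), assumed

def monthly_profit(users):
    """Contribution margin times users, minus fixed costs."""
    return users * (price_per_user - variable_cost_per_user) - fixed_costs

# Break-even: fixed costs divided by per-user margin.
break_even = fixed_costs / (price_per_user - variable_cost_per_user)
print(f"Break-even at {break_even:,.0f} users")  # Break-even at 6,250 users

for users in (10_000, 100_000, 1_000_000):
    print(f"{users:>9,} users -> ${monthly_profit(users):>12,.0f}/month")
```

If the margin per user holds as the user count grows, profit scales linearly past a fixed break-even point, which is exactly the sort of story a start-up can tell in a sentence.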