
Category: Forecasting

  • Inventory management – Some effects of risk pooling


    This entry is part 3 of 4 in the series Predictive Analytics

    Introduction

    The newsvendor described in the previous post has decided to branch out, placing newsboys at strategic corners in the neighborhood. He will first consider three locations, but has six in his sights.

    The question to be pondered is how many newspapers he should order for these three locations and the possible effects on profit and risk (Eppen, 1979) and (Chang & Lin, 1991).

    He assumes that the demand distribution he experienced at the first location will also apply to the two others and that all locations (points of sale) can be served from a centralized inventory. For the sake of simplicity he further assumes that all points of sale can be restocked instantly (i.e. zero lead time) at zero cost, if necessary or advantageous, by shipment from one of the other locations, and that demand at the different locations will be uncorrelated. The individual points of sale will initially have a working stock, but will have no need of safety stock.

    In short, this is equivalent to having one inventory serve newspaper sales generated by three (or six) copies of the original demand distribution:

    The aggregated demand distribution for the three locations is still positively skewed (0.32), but much less than the original (0.78), and has a lower coefficient of variation – 27% against 45% for the original ((The quartile variation has been reduced by 37%.)):

    The demand variability has thus been substantially reduced by this risk pooling ((We distinguish between ten main types of risk pooling that may reduce total demand and/or lead time variability (uncertainty): capacity pooling, central ordering, component commonality, inventory pooling, order splitting, postponement, product pooling, product substitution, transshipments, and virtual pooling. (Oeser, 2011)))  and the question now is how this will influence the vendor’s profit.
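The size of this pooling effect is easy to check with a small simulation. The sketch below uses a lognormal distribution as a stand-in for the skewed single-location demand (an assumption – the post's actual demand distribution is not given), with the spread chosen to give a CV close to the 45% above:

```python
import random
import statistics

def cv(xs):
    """Coefficient of variation: standard deviation divided by the mean."""
    return statistics.pstdev(xs) / statistics.fmean(xs)

random.seed(1)
N = 20_000

# Stand-in for the skewed demand at a single location.
single = [random.lognormvariate(7.5, 0.45) for _ in range(N)]

# Aggregate demand over three uncorrelated locations, same distribution.
pooled = [sum(random.lognormvariate(7.5, 0.45) for _ in range(3))
          for _ in range(N)]

print(f"CV, single location : {cv(single):.2f}")
print(f"CV, three pooled    : {cv(pooled):.2f}")  # about cv(single)/sqrt(3)
```

For uncorrelated, identically distributed demands the CV of the pooled demand falls by a factor of the square root of the number of locations, consistent with the drop from 45% to 27% reported above.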

    Profit and Inventory level with Risk Pooling

    As in the previous post we have calculated profit and loss as:

    Profit = sales less production costs of both sold and unsold items
    Loss = value of lost sales (stock-out) and the cost of having produced and stocked more than can be expected to be sold

    The figure below indicates what will happen as we change the inventory level. We can see as we successively move to higher levels (from left to right on the x-axis) that expected profit (blue line) will increase to a point of maximum – ¤16541 at a level of 7149 units:

    Compared to the point of maximum profit for a single warehouse (profit ¤4963 at a level of 2729 units, see previous post), this risk pooling has increased the vendor’s profit by 11.1% while reducing his inventory by 12.7%. Centralization of the three inventories has thus been a successful operational hedge ((Risk pooling can be considered a form of operational hedging. Operational hedging is risk mitigation using operational instruments.)) for our newsvendor, mitigating some, but not all, of the demand uncertainty.

    Since this risk mitigation was a success the newsvendor wants to calculate the possible benefits from serving six newsboys at different locations from the same inventory.

    Under the same assumptions, it turns out that this gives an even better result, with an increase in profit of almost 16% and at the same time reducing the inventory by 15%:

    The inventory ‘centralization’ has then both increased profit and reduced inventory level compared to a strategy with inventories held at each location.

    Centralizing inventory (inventory pooling) in a two-echelon supply chain may thus reduce costs and increase profits for the newsvendor carrying the inventory, but the individual newsboys may lose profits due to the pooling. On the other hand, the newsvendor will certainly lose profit if he allows the newsboys to decide the level of their own inventory and the centralized inventory.

    One of the reasons behind this conflict of interests is that both the newsvendor and the newsboys can benefit one-sidedly from shifting the demand risk to the other party, even though overall performance may suffer as a result (Kemahlioğlu-Ziya, 2004) and (Anupindi & Bassok, 1999).

    In real life, the actual risk pooling effects will depend on the correlation between the locations’ demands. Positive correlation reduces the effect, while negative correlation increases it. If all locations were perfectly positively correlated the effect would be zero, while a correlation coefficient of minus one would maximize it.
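This correlation effect can also be illustrated by simulation. In the sketch below the locations' log-demands share a common factor, giving an approximate pairwise correlation rho between them; all distribution parameters are assumptions, and this simple construction only covers non-negative correlations:

```python
import math
import random
import statistics

def cv(xs):
    return statistics.pstdev(xs) / statistics.fmean(xs)

def pooled_cv(rho, n_loc=3, n_runs=20_000, mu=7.5, sigma=0.45):
    """CV of aggregate demand over n_loc locations whose log-demands
    share a common factor, giving pairwise correlation rho (rho >= 0)."""
    totals = []
    for _ in range(n_runs):
        common = random.gauss(0.0, 1.0)
        total = 0.0
        for _ in range(n_loc):
            z = math.sqrt(rho) * common + math.sqrt(1 - rho) * random.gauss(0.0, 1.0)
            total += math.exp(mu + sigma * z)
        totals.append(total)
    return cv(totals)

random.seed(2)
for rho in (0.0, 0.5, 1.0):
    print(f"rho = {rho:.1f}: CV of pooled demand = {pooled_cv(rho):.2f}")
```

At rho = 1 pooling gives no reduction at all (the pooled CV equals the single-location CV), at rho = 0 it gives the full square-root-of-three reduction, and negative correlation would reduce the pooled CV even further.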

    The third effect

    The third direct effect of risk pooling is the reduced variability of expected profit. We can see this by plotting the profit variability, measured by its coefficient of variation ((The coefficient of variation is defined as the ratio of the standard deviation to the mean – also known as unitized risk.)) (CV), for the three strategies discussed above: one single inventory (warehouse), three inventories centralized versus held separately, and six inventories centralized versus held separately.

    The graph below depicts the situation. The three curves show the CV for corporate profit given the three alternatives, and the vertical lines the point of maximum profit for each alternative.

    The slope of each curve shows the profit’s sensitivity to changes in the inventory level, and the curve’s location shows each strategy’s impact on the predictability of realized profit.

    A single warehouse strategy (blue) clearly gives much less ability to predict future profit than the ‘six centralized warehouses’ strategy (purple), while the ‘three centralized warehouses’ strategy (green) falls somewhere in between:

    So in addition to reduced costs and increased profits, centralization also gives a more predictable result and a lower sensitivity to inventory level, and hence greater leeway in the practical application of different policies for inventory planning.

    Summary

    We have thus shown, through Monte Carlo simulation, that the benefits of pooling increase with the number of locations and that the benefits of risk pooling can be calculated without knowing the closed form ((In mathematics, an expression is said to be a closed-form expression if it can be expressed analytically in terms of a finite number of certain “well-known” functions.)) of the demand distribution.

    Since we do not need the closed form of the demand distribution, we are not limited to low demand variability or forced to accept the possibility of negative demand (normal distributions etc.). Expanding the scope of the analysis to include stochastic supply, supply disruptions, information sharing, localization of inventory etc. is a natural extension of this method ((We will return to some of these issues in later posts.)).

    This opens for the use of robust and efficient methods and techniques for solving problems in inventory management, unrestricted by the form of the demand distribution. Best of all, the results, given as graphs, will be more easily communicated to all parties than purely mathematical descriptions of the solutions.

    References

    Anupindi, R. & Bassok, Y. (1999). Centralization of stocks: Retailers vs. manufacturer.  Management Science 45(2), 178-191. doi: 10.1287/mnsc.45.2.178, accessed 09/12/2012.

    Chang, Pao-Long & Lin, C.-T. (1991). Centralized Effect on Expected Costs in a Multi-Location Newsboy Problem. Journal of the Operational Research Society of Japan, 34(1), 87–92.

    Eppen,G.D. (1979). Effects of centralization on expected costs in a multi-location newsboy problem. Management Science, 25(5), 498–501.

    Kemahlioğlu-Ziya, E. (2004). Formal methods of value sharing in supply chains. PhD thesis, School of Industrial and Systems Engineering, Georgia Institute of Technology, Atlanta, GA. http://smartech.gatech.edu/bitstream/1853/4965/1/kemahlioglu ziya_eda_200407_phd.pdf, accessed 09/12/2012.

    Oeser, G. (2011). Methods of Risk Pooling in Business Logistics and Their Application. Europa-Universität Viadrina Frankfurt (Oder). URL: http://opus.kobv.de/euv/volltexte/2011/45, accessed 09/12/2012.


  • Inventory Management: Is profit maximization right for you?


    This entry is part 2 of 4 in the series Predictive Analytics

     

    Introduction

    In the following we will exemplify how sales forecasts can be used to set inventory levels in single or multilevel warehousing. By inventory we will mean a stock or store of goods: finished goods, raw materials, purchased parts, and retail items. Since the problem discussed is the same for both production and inventory, the two terms will be used interchangeably.

    Good inventory management is essential to the successful operation of most organizations, both because of the amount of money the inventory represents and because of the impact that inventories have on daily operations.

    An inventory can serve many purposes, among them the ability:

    1. to support independence of operations,
    2. to meet both anticipated demand and variation in demand,
    3. to decouple components of production and allow flexibility in production scheduling and
    4. to hedge against price increases, or to take advantage of quantity discounts.

    The many advantages of stock keeping must however be weighed against the costs of keeping the inventory. This can best be described as the “too much/too little problem”: order too much and inventory is left over; order too little and sales are lost.

    This can be a single-period (a one-time purchasing decision) or a multi-period problem, involving a single warehouse or geographically dispersed multilevel warehousing. The task can then be to minimize the organization’s total cost, maximize the level of customer service, minimize ‘loss’ or maximize profit, etc.

    Whatever the purpose, the calculation will have to be based on knowledge of the sales distribution. In addition, sales will usually have a seasonal variance, creating a balancing act between production, logistics and warehousing costs. In the example given below the sales forecast will have to be viewed as a periodic forecast (month, quarter, etc.).

    We have intentionally selected a ‘simple problem’ to highlight the optimization process and the properties of the optimal solution. The latter is seldom described in the standard texts.

    The News-vendor problem

    The news-vendor is facing a one-time purchasing decision: to order the quantity Q that maximizes expected profit, so that the expected loss on the Qth unit equals the expected gain on the Qth unit:

    I.  Co * F(Q) = Cu * (1-F(Q)) , where

    Co = The cost of ordering one more unit than what would have been ordered if demand had been known – or the increase in profit enjoyed by having ordered one fewer unit,

    Cu = The cost of ordering one fewer unit than what would have been ordered if demand had been known  – or the increase in profit enjoyed by having ordered one more unit, and

    F(Q) = the probability of demand q <= Q. By rearranging terms in the above equation we find:

    II.  F(Q) = Cu/{Co+Cu}

    This ratio is often called the critical ratio (CR). The usual way of solving this is to assume that demand is normally distributed, giving Q as:

    III.    Q = m + z * s, where z = {Q-m}/s is standard normally distributed (zero mean and variance equal to one).
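Under the normal assumption the solution is a one-liner. A sketch using Python's standard library, with the mean and CV taken from the demand summary in this post and the critical ratio of 0.8 from the markup example below:

```python
from statistics import NormalDist

m = 2096            # expected demand
s = 0.45 * m        # standard deviation implied by a CV of 0.45
CR = 0.8            # critical ratio, F(Q) = CR

z = NormalDist().inv_cdf(CR)   # z such that the standard normal CDF equals CR
Q = m + z * s                  # equation III
print(f"z = {z:.3f}, Q = {Q:.0f}")
```

As the next paragraph points out, this only works when the normal distribution is a reasonable description of demand – here it is not, since the actual demand distribution is heavily right-skewed.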

    Unfortunately, demand rarely has a normal distribution and, to make things worse, we usually don’t know the exact distribution at all. We can only ‘find’ it by Monte Carlo simulation and thus have to find the Q satisfying equation (I) by numerical methods.
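With a Monte Carlo sample of demand in hand, the numerical solution of equation (I) is simply the CR-quantile of the sample. A minimal sketch, where the lognormal demand sample is an assumed stand-in for the simulated forecast:

```python
import random

random.seed(3)

# Simulated demand sample standing in for the Monte Carlo demand forecast.
demand = sorted(random.lognormvariate(7.55, 0.45) for _ in range(20_000))

markup = 300                      # %, as in the example below
Co = 1.0                          # cost of one unit too many (scale is arbitrary)
Cu = Co * (1 + markup / 100)      # equation IV
CR = Cu / (Co + Cu)               # equation II

# F(Q) = CR: take the empirical CR-quantile of the demand sample.
Q = demand[int(CR * len(demand)) - 1]
print(f"critical ratio = {CR:.2f}, order quantity Q = {Q:.0f}")
```

No closed form of the demand distribution is needed – only the sorted simulation output.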

    For the news-vendor the inventory level should be set to maximize profit given the sales distribution. This implies that the cost of lost sales will have to be weighed against the cost of adding more to the stock.

    If we for the moment assume that all these costs can be regarded as fixed and independent of the inventory level, then the product markup (% of cost) will determine the optimal inventory level:

    IV. Cu= Co * (1+ {Markup/100}) 

    In the example given here the critical ratio is approx. 0.8. The question then is whether the inventory levels indicated by that critical ratio will always be the best for the organization.

    Expected demand

    The following graph shows the news-vendor’s demand distribution. Expected demand is 2096 units ((Median demand is 1819 units and the demand lies most typically in the range of 1500 to 2000 units.)), but the distribution is heavily skewed to the right ((The demand distribution has a skewness of 0.78, with a coefficient of variation of 0.45, a lower quartile of 1432 units and an upper quartile of 2720 units.)), so there is a possibility of demand far exceeding the expected demand:

    By setting the product markup – in the example below it is set to 300% – we can calculate profit and loss based on the demand forecast.

    Profit and Loss (of opportunity)

    In the following we have calculated profit and loss as:

    Profit = sales less production costs of both sold and unsold items
    Loss = value of lost sales (stock-out) and the cost of having produced and stocked more than can be expected to be sold

    The figure below indicates what will happen as we change the inventory level. We can see as we successively move to higher levels (from left to right on the x-axis) that expected profit (blue line) will increase to a point of maximum  ¤4963 at a level of 2729 units:

    At that point we can expect to have some excess stock and in some cases also lost sales. But regardless, it is at this point that expected profit is maximized, so this gives the optimal stock level.
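The shape of this profit curve can be reproduced with a simple grid search over inventory levels. The sketch below uses assumed unit costs and an assumed lognormal demand sample, so its numbers will not reproduce the post's ¤4963 / 2729 figures; the point is the mechanism of sweeping the level from left to right and picking the maximum:

```python
import random
import statistics

random.seed(4)
COST = 1.0                       # unit production cost (assumed)
PRICE = COST * (1 + 300 / 100)   # 300% markup, as in the example

# Assumed stand-in for the right-skewed Monte Carlo demand forecast.
demand = [random.lognormvariate(7.55, 0.45) for _ in range(5_000)]

def expected_profit(q):
    """Profit = sales revenue less production cost of all q units."""
    return statistics.fmean(PRICE * min(d, q) - COST * q for d in demand)

# Sweep inventory levels from left to right and pick the maximum.
levels = range(1000, 4001, 25)
best_q = max(levels, key=expected_profit)
print(f"profit-maximizing level ≈ {best_q}, "
      f"expected profit ≈ {expected_profit(best_q):.0f}")
```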

    Since we include the costs of both sold and unsold items, the point giving maximum expected profit will be below the point minimizing expected loss (¤1460 at a production level of 2910 units).

    Given the optimal inventory level (2729 units) we find the actual sales frequency distribution as shown in the graph below. At this level we expect an average sale of 1920 units – ranging from 262 to 2729 units ((Having a lower quartile of 1430 units and an upper quartile of 2714 units.)).

    The graph shows that the distribution possesses two different modes ((The most common value in a set of observations.)), or two local maxima. This bimodality is created by the fact that the demand distribution is heavily skewed to the right, so that any demand exceeding 2729 units gives 2729 units sold, with the rest as lost sales.

    This bimodality will of course be reflected in the distribution of realized profits. Bear in mind that the (blue) line giving maximum profit is an average of all realized profits during the Monte Carlo simulation, given the demand distribution and the selected inventory level. We can therefore expect realized profit both below and above this average (¤4963) – as shown in the frequency graph below:

    Expected (average) profit is ¤4963, with a minimum of −¤1681 and a maximum of ¤8186; the range of realized profits, ¤9867, is therefore very large ((Having a lower quartile of ¤2991 and an upper quartile of ¤8129.)).
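The mechanism behind the bimodality is easy to see in code: every run where demand reaches the inventory level is capped at Q units sold, so a point mass builds up at the maximum profit of (price − cost) × Q. A sketch with assumed unit prices/costs and an assumed demand sample:

```python
import random

random.seed(5)
PRICE, COST, Q = 4.0, 1.0, 2729   # Q from the post; unit price and cost assumed

# Assumed stand-in for the right-skewed demand distribution.
demand = [random.lognormvariate(7.55, 0.45) for _ in range(20_000)]

# Realized profit per run: revenue on units sold less cost of all Q units.
profit = [PRICE * min(d, Q) - COST * Q for d in demand]

capped = sum(d >= Q for d in demand) / len(demand)
print(f"runs with demand >= Q (profit pinned at its maximum): {capped:.0%}")
print(f"maximum realized profit: {max(profit):.0f}")   # (PRICE - COST) * Q
```

All the runs in the `capped` share land on the same profit value, producing the second mode at the right end of the profit distribution.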

    So even if we maximize profit, we must expect a large variation in realized profits; there is no way that the original uncertainty in the demand distribution can be reduced or removed.

    Risk and Reward

    Increased profit comes at a price: increased risk. The graph below describes the situation; the blue curve shows how expected profit increases with the production or inventory (service) level. The spread between the green and red curves indicates the band where actual profit will fall with 80% probability. As is clear from the graph, this band increases in width as we move to the right, indicating an increased upside (area up to the green line) but also an increased probability of a substantial downside (area down to the red line):

    For some companies – depending on the shape of the demand distribution – other concerns than profit maximization might therefore be of more importance, such as predictability of results (profit). The act of setting inventory or production levels should accordingly be viewed as an element of the board’s risk assessments.

    On the other hand, the uncertainty band around loss will decrease as the service level increases. This lies in the fact that loss due to lost sales diminishes as the service level increases and that the high markup easily covers the cost of over-production.

    Thus a strategy of ‘loss’ minimization will falsely give a sense of ‘risk minimization’, while in reality it increases the uncertainty of future realized profit.

    Product markup

    The optimal stock or production level will be a function of the product markup. A high markup will give room for a higher level of unsold items, while a low markup will necessitate a focus on cost reduction and the acceptance of stock-outs:

    The relation between markup (%) and the production level is quadratic ((Markup (%) = 757.5 – 0.78 * production level + 0.00023 * production level².)), implying that the markup will have to be increasingly higher the further out on the right tail we fix the production level.

    The Optimal inventory (production) level

    If we put it all together we get the chart below. In this the green curve is the accumulated sales giving the probability of the level of sales and the brown curve the optimal stock or production level given the markup.

    The optimal stock level is then found by drawing a line from the markup axis (right y-axis) to the curve for the optimal stock level, and down to the x-axis, giving the stock level. By continuing the line from the markup axis to the probability axis (left y-axis) we find the probability of stock-out (1 − the cumulative probability) and the probability of having a stock level in excess of demand:

    By using the sales distribution we can find the optimal stock/production level given the markup. This would not have been possible with single-point sales forecasts, which could have ended up almost anywhere on the curve for forecasted sales.

    Even if a single point forecast managed to find expected sales – as mean, mode or median – it would have given wrong answers about the optimal stock/production level, since the shape of the sales distribution would have been unknown.

    In this case, with the sales distribution having a right tail, the level would have been too low – or, with a low markup, too high. With a left-skewed sales distribution the result would have been the other way around: the level would have been too high, and with a low markup probably too low.

    In the case of multilevel warehousing, the above analyses have to be performed on all levels and solved as a simultaneous system.

    The state of affairs at the point of maximum

    To have the full picture of the state of affairs at the point of maximum we have to take a look at what we can expect of over- and under-production. At the level giving maximum expected profit we will on average have an underproduction of 168 units, ranging from zero to nearly 3000 ((Having a coefficient of variation of almost 250%.)). On the face of it this could easily be interpreted as having set the level too low, but as we shall see that is not the case.

    Since we have a high markup, lost sales will weigh heavily in the profit maximization and, as a result, we can expect to have unsold items in stock at the end of the period. On average we will have a little over 800 units left in stock, ranging from zero to nearly 2500. The lower quartile is 14 units and the upper 1300 units, so in 75% of the cases we will have an overproduction of less than 1300 units. In the remaining 25% of the cases the overproduction will be in the range of 1300 to 2500 units.

    Even with the possibility of ending the period with a large number of unsold units, the strategy of profit maximization will on average give the highest profit – however, as we have seen, with a very high level of uncertainty about the profit actually being realized.

    Now, since a lower inventory level in this case will only reduce profit by a small amount but lower the confidence limit by a substantial amount, other strategies giving more predictability for the actual result should be considered.

  • Budgeting Revisited


    This entry is part 2 of 2 in the series Budgeting

     

    Introduction

    Budgeting is one area that is well suited for Monte Carlo simulation. Budgeting involves personal judgments about the future values of a large number of variables – sales, prices, wages, downtime, error rates, exchange rates etc. – variables that describe the nature of the business.

    Everyone who has been involved in a budgeting process knows that it is an exercise in uncertainty; however, it is seldom described this way, and even more seldom is uncertainty actually calculated as an integrated part of the budget.

    Good budgeting practices are structured to minimize errors and inconsistencies, drawing in all the necessary participants to contribute their business experience and the perspective of each department. Best practice in budgeting entails a mixture of top-down guidelines and standards, combined with bottom-up individual knowledge and experience.

    Excel, the de facto tool for budgeting, is a powerful personal productivity tool. Its current capabilities, however, are often inadequate to support the critical nature of budgeting and forecasting. There will come a point when a company’s reliance on spreadsheets for budgeting leads to severely ineffective decision-making, lost productivity and lost opportunities.

    Spreadsheets can accommodate many tasks, but over time some of the models running in Excel may grow too big for the spreadsheet application. Programming in a spreadsheet model often requires embedded assumptions and complex macros, creating opportunities for formula errors and broken links between workbooks.

    It is common for spreadsheet budget models and their intricacies to be known and maintained by a single person, who becomes a vulnerability point with no backup. And there are other maintenance and usage issues:

    A.    Spreadsheet budget models are difficult to distribute and even more difficult to collect and consolidate.
    B.    Data confidentiality is almost impossible to maintain in spreadsheets, which are not designed to hide or expose data based upon each user’s role.
    C.    Financial statements are usually not fully integrated leaving little basis for decision making.

    These are serious drawbacks for corporate governance and make the audit process more difficult.

    These are a few of the many reasons why we use a dedicated simulation language for our models – one that specifically does not mix data and code.

    The budget model

    In practice budgeting can be performed on different levels:
    1.    Cash Flow
    2.    EBITDA
    3.    EBIT
    4.    Profit or
    5.    Company value.

    The most efficient is the EBITDA level, since taxes, depreciation and amortization are mostly given in the short term. This is also the level where consolidation of daughter companies is most easily achieved. An EBITDA model describing the firm’s operations can in turn be used as a subroutine for more detailed and encompassing analysis through P&L and balance simulation.

    The aim will then be to estimate the firm’s equity value and its probability distribution. This can in turn be used for strategy selection etc.

    Forecasting

    In today’s fast-moving and highly uncertain markets, forecasting has become the single most important element of the budget process.

    Forecasting, or predictive analytics, can best be described as statistical modeling enabling the prediction of future events or results, using present and past information and data.

    1. Forecasts must integrate both external and internal cost and value drivers of the business.
    2. Absolute forecast accuracy (i.e. small confidence intervals) is less important than the insight about how current decisions and likely future events will interact to form the result.
    3. Detail does not equal accuracy with respect to forecasts.
    4. The forecast is often less important than the assumptions and variables that underpin it – those are the things that should be traced to provide advance warning.
    5. Never rely on single-point or scenario forecasting.

    All uncertainty about market sizes, market shares, costs and prices, interest rates, exchange rates, taxes etc. – and their correlations – will finally end up contributing to the uncertainty in the firm’s budget forecasts.

    The EBITDA model

    The EBITDA model has to be detailed enough to capture all important cost and value drivers, but simple enough to be easy to update with new data and assumptions.

    Input to the model can come from different sources: any internal reporting system or spreadsheet. The easiest way to communicate with the model is by using Excel spreadsheet templates.

    Such templates will be pre-defined in the sense that the information the model needs is in a pre-determined place in the workbook. This makes it easy if the budgets for daughter companies are reported (and consolidated) in a common system (e.g. SAP) that can ‘dump’ the data onto an Excel spreadsheet. If the budgets are communicated directly to the head office or the mother company, they can be read directly by the model.

    Standalone models and dedicated subroutines

    We usually construct our EBITDA models so that they can be used both as a standalone model and as a subroutine for balance simulation. The model can then be used both for short term budgeting and long-term EBITDA forecasting and simulation and for short/long term balance forecasting and simulation. This means that the same model can be efficiently reused in different contexts.

    Rolling budgets and forecasts

    The EBITDA model can be constructed to give rolling forecasts based on updated monthly or quarterly values, taking into consideration the seasonality of the operations. This will give new forecasts (a new budget) for the remainder of the year and/or the next twelve months. By forecasts we again mean the probability distributions for the budget variables.

    Even if the variables have not changed, the fact that we move towards the end of the year will reduce the uncertainty of the end-year results and also of the forecast for the next twelve months.

    Uncertainty

    The most important part of budgeting with Monte Carlo simulation is the assessment of the uncertainty in the budgeted (forecasted) cost and value drivers. This uncertainty is given as the most likely value (usually the budget figure) and the interval within which it is assessed, with a high degree of confidence (approx. 95%), to fall.

    We will then use these lower and upper limits (5% and 95%) for sales, prices and other budget items, together with the budget values, as indicators of the shape of the probability distributions for the individual budget items. Together they describe the range and uncertainty in the EBITDA forecast.

    This gives us the opportunity to simulate (Monte Carlo) a number of possible outcomes – by a large number of runs of the model, usually 1000 – of net revenue, operating expenses and finally EBITDA. This in turn gives us their probability distributions.
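A minimal sketch of this simulation step, using a triangular distribution as a simple stand-in for the shape implied by the reported low / most likely / high figures (all item names and numbers below are hypothetical):

```python
import random

random.seed(6)

# Hypothetical budget items: (5% low, most likely / budget value, 95% high).
revenue_items = {"sales A": (800, 1000, 1500), "sales B": (300, 400, 700)}
cost_items = {"wages": (500, 550, 650), "materials": (250, 300, 420)}

def draw(low, mode, high):
    """One draw from a triangular stand-in for the item's distribution."""
    return random.triangular(low, high, mode)

ebitda = []
for _ in range(1000):                     # 1000 runs, as in the text
    revenue = sum(draw(*v) for v in revenue_items.values())
    cost = sum(draw(*v) for v in cost_items.values())
    ebitda.append(revenue - cost)

ebitda.sort()
print(f"expected EBITDA ≈ {sum(ebitda) / len(ebitda):.0f}")
print(f"90% interval    ≈ [{ebitda[50]:.0f}, {ebitda[949]:.0f}]")
```

The sorted `ebitda` list is the data behind a cumulative (S-shaped) curve; strictly, the 5%/95% figures should be fitted as percentiles of the item distribution rather than used as its endpoints, but the triangular shortcut keeps the sketch short.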

    Most managers and their staff have, based on experience, a good grasp of the range within which the values of their variables will fall. It is not based on any precise computation but is a reasonable assessment by knowledgeable persons. Selecting the budget value, however, is more difficult. Should it be the “mean” or the “most likely value”, or should the manager just delegate the fixing of the values to the responsible departments?

    Now, we know that the budget values might be biased for a number of reasons – most simply by bonus schemes etc. – and that budgets based on average assumptions are wrong on average.

    This is therefore where the individual manager’s intent and culture will be manifested, and it is here that the greatest learning effect for both the managers and the mother company will be found, as under-budgeting and overconfidence will stand out as excessively large deviations from the model-calculated expected value (the probability-weighted average over the interval).

    Output

    The output from the Monte Carlo simulation will be in the form of graphs that put all the runs in the simulation together to form the cumulative distribution for the operating expenses (red line):

    In the figure we have computed the frequencies of observed (simulated) values for operating expenses (blue frequency plot) – the x-axis gives the operating expenses and the left y-axis the frequency. By summing up from left to right we can compute the cumulative probability curve. The S-shaped curve (red) gives, for every point, the probability (on the right y-axis) of having operating expenses less than the corresponding point on the x-axis. The shape of this curve and its range on the x-axis give us the uncertainty in the forecast.

    A steep curve indicates little uncertainty and a flat curve indicates greater uncertainty.  The curve is calculated from the uncertainties reported in the reporting package or templates.

    Large uncertainties in the reported variables will contribute to the overall uncertainty in the EBITDA forecast and thus to a flatter curve, and vice versa. If the reported uncertainty in sales and prices has a marked downside and the costs a marked upside, the resulting EBITDA distribution might very well have a portion on the negative side of the x-axis – that is, with some probability the EBITDA might end up negative.

    In the figure below the lines give the expected EBITDA and the budget value. The expected EBITDA can be found by drawing a horizontal line from the 0.5 (50%) point on the y-axis to the curve and a vertical line from this point on the curve to the x-axis. This point gives us the expected EBITDA value – the point where there is a 50% probability of having a value of EBITDA below it and 100% − 50% = 50% of having it above.

    The second set of lines gives the budget figure and the probability that the result will end up lower than budget. In this case it is almost 100% probable that it will be much lower than the management has expected.

    This distribution’s location on the EBITDA axis (x-axis) and its shape give a large amount of information about what we can expect of possible results and their probability.

    The following figure, which gives the EBIT distributions for a number of subsidiaries, exemplifies this. One will most probably never earn money (grey), three are cash cows (blue, green and brown) and the last (red) can earn a lot of money:

    Budget revisions and follow up

    Normally – if something extraordinary does not happen – we would expect both the budget and the actual EBITDA to fall somewhere in the region of the expected value. We do however have to expect some deviation both from budget and from expected value due to the nature of the industry. Bearing in mind the possibility of unanticipated events, or events “outside” the subsidiary’s budget responsibilities but affecting the outcome, this implies that:

    • An actual result deviating from budget is not necessarily a sign of bad budgeting.
    • A result close to or on budget is not necessarily a sign of good budgeting.

    However:

• Large deviations between budget and actual result need looking into – especially if the deviation from the expected value also is large.
• Large deviations between budget and expected value can imply either that the limits are set "wrong" or that the budget EBITDA does not reflect the downside risk or upside opportunity expressed by the limits.

Another way of looking at the distributions is by the probability of having the actual result below budget – that is, how far off line the budget ended up. In the graph below, country #1's budget was set at a level with a 72% probability of the actual result falling below it. It turned out that the actual figure would have been undershot with only 36% probability. The length of the bars thus indicates the budget discrepancies.

For country #2 it is the other way around: the probability of having had a result lower than the final result is 88%, while the budgeted figure had a 63% probability of being too low. In this case the market was seriously misjudged.

In the following we have measured the deviation of the actual result both from the budget values and from the expected values. In the figures, the left axis gives the deviation from expected value and the bottom axis the deviation from budget value.

1. If the deviation for a country falls in the upper right quadrant, the deviations are positive for both budget and expected value – and the country is overachieving.
2. If the deviation falls in the lower left quadrant, the deviations are negative for both budget and expected value – and the country is underachieving.
3. If the deviation falls in the upper left quadrant, the deviation is negative for budget and positive for expected value – and the country is overachieving but has had too high a budget.

With a left-skewed EBITDA distribution there should not be any observations in the lower right quadrant; that will only happen when the distribution is skewed to the right – and then there will not be any observations in the upper left quadrant:
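The quadrant classification can be sketched as a small helper; the function name and the sign conventions (budget deviation on the x-axis, expected-value deviation on the y-axis) are assumptions made for illustration:

```python
def quadrant(dev_budget: float, dev_expected: float) -> str:
    """Classify a country's actual result by its deviation from
    budget (x-axis) and from expected value (y-axis)."""
    if dev_budget >= 0 and dev_expected >= 0:
        return "overachieving"                   # upper right
    if dev_budget < 0 and dev_expected < 0:
        return "underachieving"                  # lower left
    if dev_budget < 0 and dev_expected >= 0:
        return "overachieving, budget too high"  # upper left
    return "underachieving, budget too low"      # lower right
```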

As the managers get more experienced in assessing the uncertainty they face, we see that the budget figures come more in line with the expected values and that the intervals given are shorter and better oriented.

If the budget is in line with the expected value given the described uncertainty, the upside potential ratio should be approximately one. A high value indicates a potential for higher EBITDA and vice versa. Using this measure we can numerically describe the management's budgeting behavior:

    Rolling budgets

If the model is set up to give rolling forecasts of the budget EBITDA as new – in this case monthly – data arrive, we will get successive forecasts as in the figure below:

As data for a new month are received, the curve gets steeper since the uncertainty is reduced. From the squares on the lines indicating expected value we see that the value is moving slowly to the right, towards higher EBITDA values.

    We can of course also use this for long term forecasting as in the figure below:

As should now be evident, the EBITDA Monte Carlo model has multiple fields of use, and all of them increase the management's possibilities for control and foresight, giving ample opportunity for prudent planning for the future.


  • Uncertainty modeling

    Uncertainty modeling

    This entry is part 2 of 3 in the series What We Do

    Prediction is very difficult, especially about the future.
    Niels Bohr. Danish physicist (1885 – 1962)

Strategy @ Risk's models provide the possibility to study risk and uncertainties related to operational activities: costs, prices, suppliers, markets, sales channels etc.; financial issues like interest rate risk, exchange rate risk, translation risk, taxes etc.; strategic issues like investments in new or existing activities, valuation, M&As etc.; and a wide range of budgeting purposes.

    All economic activities have an inherent volatility that is an integrated part of its operations. This means that whatever you do some uncertainty will always remain.

The aim is to estimate the economic impact that such critical uncertainty may have on corporate earnings at risk. This adds a third dimension – probability – to all forecasts and gives new insight: the ability to deal with uncertainties in an informed way, and thus benefits beyond ordinary spreadsheet exercises.

The results from these analyses can be presented in the form of B/S and P&L looking at the coming one to five years (short term) or five to fifteen years (long term), showing the impacts on e.g. equity value, company value, operating income etc., with the purpose of:

• Improving predictability of operating earnings and their expected volatility
• Improving budgeting processes, predicting budget deviations and their probabilities
• Evaluating alternative strategic investment options at risk
• Identifying and benchmarking investment portfolios and their uncertainty
• Identifying and benchmarking individual business units' risk profiles
• Evaluating equity values and enterprise values and their uncertainty in M&A processes, etc.

    Methods

    To be able to add uncertainty to financial models, we also have to add more complexity. This complexity is inevitable, but in our case, it is desirable and it will be well managed inside our models.

    People say they want models that are simple, but what they really want is models with the necessary features – that are easy to use. If something is complex but well designed, it will be easy to use – and this holds for our models.

Most companies have some sort of model describing the company's operations. They are mostly used for budgeting, but in some cases also for forecasting cash flow and other important performance measures. Almost all are deterministic models based on expected or average values of input data: sales, costs, interest and currency rates etc.

We know, however, that forecasts based on average values are on average wrong. In addition, deterministic models miss the important uncertainty dimension that conveys both the different risks facing the company and the opportunities they bring forth.

S@R has set out to create models that can give answers to both deterministic and stochastic questions, by linking dedicated EBITDA models to holistic balance simulation taking into account all important factors describing the company. The basis is a real balance simulation model – not a simple cash flow forecast model.

Both the deterministic and the stochastic balance simulation can be set up in two different ways:

1. by using an EBITDA model to describe the company's operations, or
2. by using coefficients of fabrication (e.g. kg flour per 1000 bread etc.) as direct input to the balance model – the 'short cut' method.
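As an illustration of the 'short cut' input, one simulation draw of direct cost from coefficients of fabrication might look as follows. All coefficients, variations and prices below are invented for the sketch:

```python
import random

random.seed(1)

def direct_cost_per_1000_bread() -> float:
    """One draw of direct cost from coefficients of fabrication
    and their assumed (triangular) variation."""
    flour_kg = random.triangular(420.0, 460.0, 440.0)  # kg flour per 1000 bread
    flour_price = random.triangular(0.50, 0.70, 0.58)  # price per kg flour
    energy = random.triangular(30.0, 45.0, 35.0)       # energy cost per 1000 bread
    return flour_kg * flour_price + energy

# Repeating the draw gives the cost distribution the balance model needs.
costs = [direct_cost_per_1000_bread() for _ in range(1000)]
mean_cost = sum(costs) / len(costs)
```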

The first approach implies setting up a dedicated EBITDA subroutine to the balance model. This will give detailed answers to a broad range of questions about markets, capacity-driven investments, operational performance and uncertainty, but entails a higher degree of effort from both the company and S@R. This is a tool for long term planning and strategy development.

The second ('the short cut') uses coefficients of fabrication and their variations, and is a low effort (cost) alternative, usually using the internal accounting as a basis. This will in many cases give a 'good enough' description of the company – its risks and opportunities. It can be based on existing investment and market plans. The data needed for the company's economic environment (taxes, interest rates etc.) will be the same in both alternatives:

The 'short cut' approach is especially suited for quick appraisals of M&A cases where time and data are limited and where one wishes to limit efforts in an initial stage. Later the data and assumptions can be augmented for much more sophisticated analyses within the same 'short cut' framework. In this way analyses can be successively built in the direction the previous studies suggested.

This also makes it a good tool for short-term (3-5 year) analysis and even for budget assessment. Since it will use a limited number of variables – usually fewer than twenty – describing the operations, it is easy to maintain and operate. The variables describing financial strategy and the economic environment come in addition, but will be easy to obtain.

Used in budgeting it will give the opportunity to evaluate budget targets, their probable deviation from the expected result, and the probable upside or downside given the budget target (upside/downside ratio).

Done this way, analyses can be run for subsidiaries across countries, translating the P&L and balance to any currency for benchmarking, investment appraisals, risk and opportunity assessments etc. The final uncertainty distributions can then be 'aggregated' to show global risk for the mother company.

An interesting feature is the model's ability to start simulations with an empty opening balance. This can be used to assess divisions that do not have an independent balance, since the model will call for equity/debt etc. based on a target ratio, according to the simulated production and sales and the necessary investments. Questions about further investment in divisions or product lines can be studied this way.

Since every run (500 to 1000) in the simulation produces a complete P&L and balance, the uncertainty curve (distribution) for any financial metric like 'yearly result', 'free cash flow', 'economic profit', 'equity value', 'IRR' or 'translation gain/loss' etc. can be produced.

In some cases we have used both approaches for the same client, using the 'short cut' approach for smaller daughter companies with production structures differing from the main company's.
The second approach can also be considered an introduction and stepping stone to a more holistic EBITDA model.

    Time and effort

The workload for the client is usually limited to a small team of people (1 to 3 persons) acting as project leaders and principal contacts, ensuring that all necessary information describing value and risks for the client's operations can be collected as a basis for modeling and calculations. However, the type of data will have to be agreed upon depending on the scope of the analysis.

Very often, key people from the controller group will be adequate for this work, and if they don't have the direct knowledge they usually know whom to ask. The work for this team, depending on the scope and choice of method (see above), can vary in effective time from a few days to a couple of weeks, but this can be stretched over anything from three to four weeks to the same number of months.

For S@R the time frame will depend on the availability of key personnel from the client and the availability of data. It can take from one to three weeks of normal work for the second alternative, and three to six months for the first alternative with more complex models. The total time will also depend on the number of analyses that need to be run and the type of reports that have to be delivered.

    S@R_ValueSim

    Selecting strategy

Models like this are excellent for selection and assessment of strategies. Since we can find the probability distribution for equity value, the changes in it brought about by different strategies will form a basis for selection or adjustment of the current strategy. Models including real option strategies are a natural extension of these simulation models:

If there is a strategy with a curve to the right of, and under, all other feasible strategies, this will be the stochastically dominant one. If the curves cross, further calculations need to be done before a stochastically dominant or preferable strategy can be found:
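A simple way to test for first-order stochastic dominance between two simulated equity-value samples is to compare their empirical cumulative curves on a common grid. A sketch with invented samples:

```python
import numpy as np

def first_order_dominates(a: np.ndarray, b: np.ndarray, grid_size: int = 200) -> bool:
    """True if sample a first-order stochastically dominates sample b:
    a's empirical cumulative curve lies at or below b's everywhere,
    and strictly below it somewhere (i.e. a's curve is 'to the right')."""
    grid = np.linspace(min(a.min(), b.min()), max(a.max(), b.max()), grid_size)
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / a.size
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / b.size
    return bool(np.all(cdf_a <= cdf_b) and np.any(cdf_a < cdf_b))

rng = np.random.default_rng(1)
strat_a = rng.normal(120.0, 20.0, 5000)  # invented equity-value outcomes
strat_b = strat_a - 15.0                 # same shape, shifted left
```

If neither sample dominates (the curves cross), higher-order criteria or the company's risk profile must decide, as the text notes.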

    Types of problems we aim to address:

The effects of uncertainties on the P&L and balance, and the effects of the Board's strategies (market, hedging etc.) on future P&L and balance sheets, evaluating:

    • Market position and potential for growth
    • Effects of tax and capital cost
    • Strategies
    • Business units, country units or product lines –  capital allocation – compare risk, opportunity and expected profitability
    • Valuations, capital cost and debt requirements, individually and effect on company
    • The future cash-flow volatility of company and the individual BU’s
    • Investments, M&A actions, their individual value, necessary commitments and impact on company
    • Etc.

The aim, regardless of approach, is to quantify not only the company's single and aggregated risks, but also its potential, thus making the company capable of performing detailed planning and of executing earlier and more apt actions against uncertain factors.

Used in budgeting, this will improve budget stability through greater insight into cost-side risks and income-side potentials. This is achieved by an active budget-forecast process; the control-adjustment cycle will teach the company to better target realistic budgets – with better stability and increased company value as a result.

This is most clearly seen when effort is put into correctly evaluating the effects of strategies, projects and investments on the enterprise. The best way to do this is by comparing and choosing strategies through analyzing the individual strategies' risks and potential – and selecting the alternative that is (stochastically) dominant given the company's chosen risk profile.

    A severe depression like that of 1920-1921 is outside the range of probability. –The Harvard Economic Society, 16 November 1929

  • Concession Revenue Modelling and Forecasting

    Concession Revenue Modelling and Forecasting

    This entry is part 2 of 4 in the series Airports


Concessions are an important source of revenue for all airports. An airport simulation model should therefore be able to give a good forecast of revenue from different types of concessions – given a small set of assumptions about local future price levels and income development for its international Pax. Since we already have a good forecast model for the expected number of international Pax (and its variation), we will attempt to forecast the airport's revenue per Pax from one type of concession and use both forecasts to estimate the airport's revenue from that concession.

The theory behind it is simple: the concessionaires' sales are a function of product price and the customers' (Pax) income level. Some other airport-specific variables also enter the equation; however, they will not be discussed here. As a proxy for change in Pax income we will use the individual countries' change in GDP. The price movement is represented by the corresponding movements of a price index.

We assume that changes in the trend of the airport's revenue are a function of the changes in the general income level, and that the seasonal variance is caused by the seasonal changes in the passenger mix (business/leisure travel).

It is of course impossible to forecast the exact level of revenue, but that is, as we shall see, where Monte Carlo simulation proves its worth.

The first step is a time series analysis of the observed revenue per Pax, decomposing the series into trend and seasonal factors:

    Concession-revenue
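The decomposition in step one can be approximated with a centred moving average for the trend and averages of the detrended series by position in the yearly cycle for the seasonal factors. A sketch on an invented monthly series (a production analysis would use a proper seasonal-adjustment method such as X-12 or STL):

```python
import numpy as np

rng = np.random.default_rng(2)
# Invented monthly revenue-per-Pax series: trend + seasonality + noise.
months = np.arange(48)
series = 10 + 0.05 * months + 2 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 0.2, 48)

# 12-month moving average as a (roughly centred) trend estimate.
trend = np.convolve(series, np.ones(12) / 12, mode="valid")  # 37 values

# Seasonal factors: average detrended value by position in the yearly cycle.
detrended = series[6:6 + trend.size] - trend
seasonal = np.array([detrended[m::12].mean() for m in range(12)])
```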

The time series fit turns out to be very good, explaining more than 90% of the series' variation. At this point, however, our only interest is the trend movements and their relation to changes in prices, income and a few other airport-specific variables. We will here only look at income – the most important of the variables.

Step two is a time series analysis of income (a weighted average of GDP development in the countries with the majority of Pax), separating trend and seasonal factors. This trend is what we are looking for; we want to use it to explain the trend movements in the revenue.

Step three is then a regression of the revenue trend on the income trend, as shown in the graph below. The revenue trend was estimated assuming a quadratic relation over time, and we can see that the fit is good. In fact, 98% of the variance in the revenue trend can be explained by the change in the income (+) trend:

    Concession-trend
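Step three reduces to an ordinary least squares regression of one trend on the other. A linear sketch with invented trend series (the article's data are not public, and its actual trend specification was quadratic over time):

```python
import numpy as np

rng = np.random.default_rng(3)
# Invented quarterly trend levels for the income proxy and revenue.
income_trend = np.linspace(100.0, 130.0, 24)
revenue_trend = 5.0 + 0.8 * income_trend + rng.normal(0.0, 0.5, 24)

# Ordinary least squares of the revenue trend on the income trend.
slope, intercept = np.polyfit(income_trend, revenue_trend, 1)
fitted = intercept + slope * income_trend

# R-squared: share of the revenue-trend variance explained.
ss_res = np.sum((revenue_trend - fitted) ** 2)
ss_tot = np.sum((revenue_trend - revenue_trend.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
```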

    Now the model will be as follows – step four:

1. We will collect the central banks' GDP forecasts (baseline scenario) and use them to forecast the most likely change in the income trend.
2. More and more central banks are now producing fan charts giving the possible event space (with probabilities) for their forecasts. We will use these to establish a probability distribution for our income proxy.

    Below is given an example of a fan chart taken from the Bank of England’s inflation report November 2009. (Bank of England, 2009) ((The fan chart depicts the probability of various outcomes for GDP growth.  It has been conditioned on the assumption that the stock of purchased assets financed by the issuance of central bank reserves reaches £200 billion and remains there throughout the forecast period.  To the left of the first vertical dashed line, the distribution reflects the likelihood of revisions to the data over the past; to the right, it reflects uncertainty over the evolution of GDP growth in the future.  If economic circumstances identical to today’s were to prevail on 100 occasions, the MPC’s best collective judgement is that the mature estimate of GDP growth would lie within the darkest central band on only 10 of those occasions.  The fan chart is constructed so that outturns are also expected to lie within each pair of the lighter green areas on 10 occasions.  In any particular quarter of the forecast period, GDP is therefore expected to lie somewhere within the fan on 90 out of 100 occasions.  The bands widen as the time horizon is extended, indicating the increasing uncertainty about outcomes.  See the box on page 39 of the November 2007 Inflation Report for a fuller description of the fan chart and what it represents.  The second dashed line is drawn at the two-year point of the projection.))

    Bilde1

3. We will then use the relation between the historic revenue and income trends to forecast the revenue trend.
4. Adding the seasonal variation, using the estimated seasonal factors, gives us a forecast of the periodic revenue.

    For our historic data the result is shown in the graph below:

    Concession-revenue-estimate

The calculated revenue series has a very high correlation with the observed revenue series (R=0.95), explaining approximately 90% of the series' variation.

Step five: now we can forecast the revenue from the concession, per Pax, for the next periods (months, quarters or years), using Monte Carlo simulation:

1. From the income proxy distribution we draw a possible change in yearly income and calculate the new trend.
2. Using the estimated relation between the historic revenue and income trends, we forecast the most likely revenue trend and calculate the 95% confidence interval. We then use this to establish a probability distribution for the period's trend level and draw a value. This value is adjusted by the period's seasonal factor and becomes our forecasted value for the airport's revenue from the concession – for this period.

Running through this a thousand times we get a distribution as given below:

Concession-revenue-distribu

In the airport EBITDA model this is only a small but important part of forecasting future airport revenue. As the model's data are updated (monthly), all the time series analyses and regressions are redone dynamically to capture changes in trends and seasonal factors.
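The simulation loop described in step five can be sketched as follows. All parameters here – the income-proxy distribution (which would come from the fan chart), the trend response, the residual standard error and the seasonal factor – are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(11)
n_runs = 1000

income_mu, income_sd = 0.02, 0.01  # yearly change in the income proxy
beta = 1.5                         # revenue-trend response to income change
trend_level = 10.0                 # current revenue trend, per Pax
resid_se = 0.3                     # std. error around the fitted trend
seasonal_factor = 1.12             # this period's seasonal factor

# Step 1: draw an income change per run and project the new trend level.
income_change = rng.normal(income_mu, income_sd, n_runs)
# Step 2: add the trend uncertainty, then apply the seasonal factor.
trend = trend_level * (1.0 + beta * income_change) + rng.normal(0.0, resid_se, n_runs)
revenue = trend * seasonal_factor
```

The `revenue` array is the forecast distribution for the period; its percentiles give the confidence bands shown in the figure.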

The level of monthly revenue from the concession is obviously more complex than can be described with a small set of variables and assumptions. Our model has, with high probability, specification errors, and we may or may not have violated some of the statistical methods' assumptions (the model produces output to monitor this). But we feel that we are far better off than having put all our money on a single figure as a forecast. At least we know something about the forecast's uncertainty.

    References

Bank of England. (2009, November). Inflation Report November 2009. Retrieved from http://www.bankofengland.co.uk/publications/inflationreport/ir09nov5.ppt

  • Budgeting

    Budgeting

    This entry is part 1 of 2 in the series Budgeting


Budgeting is one area that is well suited for Monte Carlo simulation. Budgeting involves personal judgments about the future values of a large number of variables like sales, prices, wages, downtime, error rates, exchange rates etc. – variables that describe the nature of the business.

Everyone who has been involved in a budgeting process knows that it is an exercise in uncertainty; however, it is seldom described in this way, and even more seldom is uncertainty actually calculated as an integrated part of the budget.

Admittedly, a number of large public building projects are calculated this way, but more often than not the aim is only to calculate some percentile (usually the 85th) as the expected budget cost.

Most managers and their staff have, based on experience, a good grasp of the range in which the values of their variables will fall. A manager's subjective probability describes his personal judgement about how likely a particular event is to occur. It is not based on any precise computation but is a reasonable assessment by a knowledgeable person. Selecting the budget value, however, is more difficult. Should it be the "mean" or the "most likely value", or should the manager just delegate the fixing of the values to the responsible departments?

Now we know that the budget values might be biased for a number of reasons – most simply by bonus schemes etc. – and that budgets based on average assumptions are wrong on average. ((Savage, Sam L. "The Flaw of Averages", Harvard Business Review, November (2002): 20-21.))

When judging probability, people can locate the source of the uncertainty either in their environment or in their own imperfect knowledge ((Kahneman D, Tversky A. "On the psychology of prediction." Psychological Review 80(1973): 237-251)). When assessing uncertainty, people tend to underestimate it – tendencies often called overconfidence and hindsight bias.

    Overconfidence bias concerns the fact that people overestimate how much they actually know: when they are p percent sure that they have predicted correctly, they are in fact right on average less than p percent of the time ((Keren G.  “Calibration and probability judgments: Conceptual and methodological issues”. Acta Psychologica 77(1991): 217-273.)).

    Hindsight bias concerns the fact that people overestimate how much they would have known had they not possessed the correct answer: events which are given an average probability of p percent before they have occurred, are given, in hindsight, probabilities higher than p percent ((Fischhoff B.  “Hindsight=foresight: The effect of outcome knowledge on judgment under uncertainty”. Journal of Experimental Psychology: Human Perception and Performance 1(1975) 288-299.)).

We will, however, not endeavor to ask for the managers' subjective probabilities, only for the range of possible values (5-95%) and their best guess of the most likely value. We will then use this to generate an appropriate log-normal distribution for sales, prices etc. For investments we will use triangular distributions to avoid long tails. Where most likely values are hard to guesstimate, we will use rectangular distributions.
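Given a manager's 5% and 95% guesses, the two log-normal parameters follow directly from the percentiles of the underlying normal. Note that this sketch uses the range only; pinning the most likely value (the mode) as well would require a different, three-parameter fit. All figures are invented:

```python
import math
import random

def lognormal_from_range(low: float, high: float):
    """Log-normal parameters (mu, sigma) such that low and high are the
    5th and 95th percentiles: ln(low) and ln(high) lie 1.645 standard
    deviations below and above the mean of the log."""
    z = 1.645  # standard normal 95th percentile
    mu = (math.log(low) + math.log(high)) / 2.0
    sigma = (math.log(high) - math.log(low)) / (2.0 * z)
    return mu, sigma

random.seed(42)
mu, sigma = lognormal_from_range(80.0, 150.0)          # manager's sales range
sales = [random.lognormvariate(mu, sigma) for _ in range(1000)]

# For investments: triangular distributions to avoid long tails.
invest = [random.triangular(10.0, 20.0, 12.0) for _ in range(1000)]
```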

We will then proceed as if the distributions were known (Keynes):

[Under uncertainty] there is no scientific basis on which to form any calculable probability whatever. We simply do not know. Nevertheless, the necessity for action and for decision compels us as practical men to do our best to overlook this awkward fact and to behave exactly as we should if we had behind us a good Benthamite calculation of a series of prospective advantages and disadvantages, each multiplied by its appropriate probability waiting to be summed. ((John Maynard Keynes. "The General Theory of Employment", Quarterly Journal of Economics (1937)))

    budget_actual_expected

The data collection can easily be embedded in the ordinary budget process, by asking the managers to set the lower and upper 5% values for all variables determining the budget, and assuming that the budget figures are the most likely values.

This gives us the opportunity to simulate (Monte Carlo) a number of possible outcomes – usually 1000 – of net revenue, operating expenses and finally EBIT(DA).

In this case the budget was optimistic, with ca 84% probability of the outcome falling below it and only 26% probability of it falling above. The accounts also proved it to be high, with the final (actual) EBIT falling closer to the expected value. In our experience the expected value is a better estimator of the final result than the budget EBIT.

However, the most important part of this exercise is the shape of the cumulative distribution curve for EBIT. The shape gives a good picture of the uncertainty the company faces in the year to come; a flat curve indicates more uncertainty, both in the budget forecast and the final result, than a steeper curve.

Wisely used, the curve (distribution) can serve both to inform stakeholders about the risk being faced and to make contingency plans foreseeing adverse events.

percieved-uncertainty-in-ne

Having the probability distributions for net revenue and operating expenses, we can calculate and plot the managers' perceived uncertainty by using coefficients of variation.

In our material we find, on average, twice as much uncertainty in the forecasts for net revenue as for operating expenses.
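The coefficient of variation (standard deviation over mean) is what makes this comparison unit-free. An invented example reproducing a roughly two-to-one relation:

```python
import numpy as np

rng = np.random.default_rng(9)
# Invented forecast distributions for one subsidiary.
net_revenue = rng.lognormal(mean=4.6, sigma=0.2, size=1000)
operating_expenses = rng.lognormal(mean=4.4, sigma=0.1, size=1000)

def cv(x: np.ndarray) -> float:
    """Coefficient of variation: relative spread, independent of units."""
    return float(np.std(x) / np.mean(x))

ratio = cv(net_revenue) / cv(operating_expenses)
```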

As many often have budget values above the expected value, they are exposed to a downside risk. We can measure this risk by the upside potential ratio, which is the expected return above the budget value per unit of downside risk. It can be found using the upper and lower partial moments calculated at the budget value.
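One common way to compute such a ratio is the first upper partial moment over the square root of the second lower partial moment, both taken at the budget value. The text does not spell out the exact moments used, so treat this as an assumption; the sample is invented:

```python
import numpy as np

def upside_potential_ratio(outcomes: np.ndarray, budget: float) -> float:
    """Expected result above budget per unit of downside risk below it."""
    upside = np.mean(np.maximum(outcomes - budget, 0.0))                   # 1st upper partial moment
    downside = np.sqrt(np.mean(np.minimum(outcomes - budget, 0.0) ** 2))   # rooted 2nd lower partial moment
    return float(upside / downside)

rng = np.random.default_rng(5)
ebitda = rng.normal(100.0, 25.0, 10000)

# Setting the budget above the expected value lowers the ratio:
# less upside potential per unit of downside risk.
upr_at_mean = upside_potential_ratio(ebitda, 100.0)
upr_high_budget = upside_potential_ratio(ebitda, 120.0)
```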

    References