A short presentation of S@R – Strategy @ Risk

Series: A short presentation of S@R

  • A short presentation of S@R


    This entry is part 1 of 4 in the series A short presentation of S@R

     

    My general view would be that you should not take your intuitions at face value; overconfidence is a powerful source of illusions. Daniel Kahneman (“Strategic decisions: When can you trust your gut?”, 2010)

    Most companies have some sort of model describing the company’s operations. These are mostly used for budgeting, but in some cases also for forecasting cash flow and other important performance measures. Almost all are deterministic models based on expected or average values of input data: sales, costs, interest and currency rates, etc. We know, however, that forecasts based on average values are on average wrong. In addition, deterministic models miss the important uncertainty dimension that reveals both the risks facing the company and the opportunities they produce.
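This “flaw of averages” is easy to demonstrate with a minimal Monte Carlo sketch (the figures and the capacity-limited revenue function are hypothetical, chosen only for illustration): when the outcome is a non-linear function of an uncertain input, the outcome at the average input differs from the average outcome.

```python
import random

random.seed(1)

PRICE, CAPACITY = 10.0, 100.0          # hypothetical figures

def revenue(demand):
    # sales are capped by capacity, so revenue is a concave function of demand
    return PRICE * min(demand, CAPACITY)

# uncertain demand: 100 units on average, but varying from day to day
demands = [random.gauss(100, 30) for _ in range(100_000)]

rev_at_avg_demand = revenue(sum(demands) / len(demands))       # deterministic
avg_revenue = sum(revenue(d) for d in demands) / len(demands)  # stochastic

print(rev_at_avg_demand > avg_revenue)  # prints True: the average-input
                                        # forecast overstates revenue
```

Because sales are capped, high-demand days cannot compensate for low-demand days, so the deterministic estimate is systematically optimistic.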

    S@R has set out to create models (see Pdf: Short presentation of S@R) that can answer both deterministic and stochastic questions, by linking dedicated EBITDA models to a holistic balance simulation that takes into account all important factors describing the company. The basis is a real balance simulation model – not a simple cash flow forecast model.

    [Figure: Generic simulation model]

    Both the deterministic and the stochastic balance simulation can be set up in two different ways:

    1. by using an EBITDA model to describe the company’s operations, or
    2. by using coefficients of fabrication as direct input to the balance model.

    The first approach implies setting up a dedicated EBITDA subroutine to the balance model. This will give detailed answers to a broad range of questions about operational performance and uncertainty, but entails a higher degree of effort from both the company and S@R.

    The use of coefficients of fabrication and their variations is a low-effort (and low-cost) alternative, using internal accounting data as its basis. This will in many cases give a ‘good enough’ description of the company – its risks and opportunities. The data needed to describe the company’s economic environment (taxes, interest rates, etc.) will be the same in both alternatives.

    [Figure: EBITDA model]

    In some cases we have used both approaches for the same client, applying the second approach to smaller daughter companies with production structures differing from the main company’s.
    The second approach can also serve as an introduction and stepping stone to a more holistic EBITDA model.

    What problems do we solve?

    • The aim, regardless of approach, is to quantify not only the company’s single and aggregated risks but also its potential, making the company capable of detailed planning and of executing earlier and more apt actions against risk factors.
    • This will improve budget stability through greater insight into cost-side risks and income-side potentials. It is achieved by an active budget–forecast process; the control-and-adjustment cycle teaches the company to set realistic budgets – with better stability and increased company value as a result.
    • Experience shows that the mere act of quantifying uncertainty throughout the company – and, through modelling, describing the interactions and their effects on profit – in itself over time reduces total risk and increases profitability.
    • This is most clearly seen when effort is put into correctly evaluating the effects of strategies, projects and investments on the enterprise. The best way to do this is to compare strategies by analysing each strategy’s risks and potential – and to select the alternative that is stochastically dominant given the company’s chosen risk profile.
    • Our aim is therefore to transform enterprise risk management from merely safeguarding enterprise value to contributing to the increase and maximization of the firm’s value within its feasible set of possibilities.

    References

    Strategic decisions: When can you trust your gut? (2010, March). McKinsey Quarterly.

  • The Case of Enterprise Risk Management


    This entry is part 2 of 4 in the series A short presentation of S@R

     

    The underlying premise of enterprise risk management is that every entity exists to provide value for its stakeholders. All entities face uncertainty and the challenge for management is to determine how much uncertainty to accept as it strives to grow stakeholder value. Uncertainty presents both risk and opportunity, with the potential to erode or enhance value. Enterprise risk management enables management to effectively deal with uncertainty and associated risk and opportunity, enhancing the capacity to build value. (COSO, 2004)

    The evils of a single point estimate

    Enterprise risk management is a process, effected by an entity’s board of directors, management and other personnel, applied in strategy setting and across the enterprise, designed to identify potential events that may affect the entity, and manage risk to be within its risk appetite, to provide reasonable assurance regarding the achievement of entity objectives. (COSO, 2004)

    Traditionally, when estimating costs, project value, equity value, or when budgeting, one number is generated – a single point estimate. There are many problems with this approach. In budget work this point is too often given as the best management can expect, but in some cases budgets are set artificially low, generating bonuses for later performance beyond budget. The following graph depicts the first case.

    [Figure: Budget, actual and expected EBITDA]

    Here, based on the production and market structure and on management’s assumptions about the variability of all relevant input and output variables, we have simulated the probability distribution for next year’s EBITDA. The graph shows the budgeted value, the actual result and the expected value. Both budget and actual value lie above the expected value, but the budgeted value was far too high: with more than 80% probability, realized EBITDA would fall below budget. In this case the board will be misled about the company’s ability to earn money, and all subsequent decisions based on the budgeted EBITDA can endanger the company.

    The organization’s ERM system should function to bring to the board’s attention the most significant risks affecting entity objectives and allow the board to understand and evaluate how these risks may be correlated, the manner in which they may affect the enterprise, and management’s mitigation or response strategies. (COSO, 2009)

    It would have been far preferable for the board to be given both the budgeted value and the accompanying probability distribution, allowing it to make an independent judgment about the possible size of next year’s EBITDA. Only then – from the shape of the distribution, its location and the point estimate of budgeted EBITDA – will the board be able to assess the risk and opportunity facing the company.
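The kind of comparison described above can be sketched with a toy Monte Carlo model. All figures and the one-line EBITDA formula below are invented for illustration; a real S@R model simulates the full balance.

```python
import random

random.seed(42)

# Toy EBITDA model: uncertain volume, price and unit cost (invented figures).
def simulate_ebitda():
    volume = random.gauss(1000, 120)        # units sold
    price = random.gauss(4.0, 0.3)          # price per unit
    unit_cost = random.gauss(2.5, 0.2)      # variable cost per unit
    return volume * (price - unit_cost) - 900.0   # less fixed costs

runs = sorted(simulate_ebitda() for _ in range(50_000))

expected = sum(runs) / len(runs)
budget = runs[int(0.8 * len(runs))]         # a budget set at the 80th percentile

# probability that realized EBITDA falls short of the budget
p_below_budget = sum(r < budget for r in runs) / len(runs)
print(f"expected {expected:.0f}, budget {budget:.0f}, "
      f"P(EBITDA < budget) = {p_below_budget:.0%}")
```

A board shown only the single budget number would miss that, by construction, this company falls short of budget four years out of five.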

    Will point estimates cancel out errors?

    In the following we measure the deviation of the actual result from both the budgeted value and the expected value. The blue dots represent daughter companies located in different countries. For each company we plot the deviation (in percent) of the budgeted EBITDA (horizontal axis) and of the expected value (vertical axis) from the actual EBITDA observed 1½ years later.

    If a company’s deviation falls in the upper right quadrant, the deviations are positive for both budget and expected value – the company is overachieving.

    If it falls in the lower left quadrant, the deviations are negative for both budget and expected value – the company is underachieving.

    If it falls in the upper left quadrant, the deviation is negative for budget and positive for expected value – the company is overachieving but had too high a budget.

    With left-skewed EBITDA distributions there should be no observations in the lower right quadrant; those occur only when the distribution is skewed to the right – and then there will be no observations in the upper left quadrant.

    The graph below shows that two companies seriously underperformed and that the budget process did not catch the risk they were facing. The rest of the companies have done very well, though some seriously underestimated the opportunities manifested in the actual result. From an economic point of view, the mother company would of course have preferred all companies (blue dots) to lie above the x-axis, but due to the stochastic nature of EBITDA it has to accept that some will always fall below. Risk-wise, it would have preferred the companies to fall to the right of the y-axis, but due to budget uncertainties it has to accept that some will always fall to the left. However, large deviations below the x-axis or to the left of the y-axis add to the company’s risk.

    [Figure: Budget and expected-value deviations from actual EBITDA, by daughter company]

    A situation like the one given in the graph below is much to be preferred from the board’s point of view.

    [Figure: Budget and expected-value deviations from actual EBITDA – preferred situation]

    The graphs above, taken from real life, show that budgeting errors will not cancel out, even across similar daughter companies. Consolidating the companies will give the mother company a left-skewed EBITDA distribution. They also show that you need to be prepared for both positive and negative deviations – you need a plan. So how do you get a plan? You build a simulation model! (See Pdf: Short-presentation-of-S@R#2)

    Simulation

    The Latin verb simulare means “to make like”, “to create an exact representation” or to imitate. The purpose of a simulation model is to imitate the company and its environment, so that its functioning can be studied. The model can be a test bed for assumptions and decisions about the company. By creating a representation of the company, a modeler can perform experiments that are impossible or prohibitively expensive in the real world. (Sterman, 1991)

    There are many different simulation techniques, including stochastic modeling, system dynamics, discrete simulation, etc. Despite the differences among them, all simulation techniques share a common approach to modeling.

    Key issues in simulation include acquisition of valid source information about the company, selection of key characteristics and behaviors, the use of simplifying approximations and assumptions within the simulation, and fidelity and validity of the simulation outcomes.

    Optimization models are prescriptive, but simulation models are descriptive. A simulation model does not calculate what should be done to reach a particular goal; it clarifies what could happen in a given situation. The purpose of a simulation may be foresight (predicting how a system might behave in the future under assumed conditions) or policy design (designing new decision-making strategies or organizational structures and evaluating their effects on the behavior of the system). In other words, simulation models are “what if” tools. Often such “what if” information is more important than knowledge of the optimal decision.
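As a toy illustration of the “what if” idea, the sketch below sweeps a single assumption – cost inflation – and reads off the resulting mean profit and probability of loss. All figures are our own illustrative assumptions, not from the article.

```python
import random

random.seed(0)

# "What if" sweep: how does the profit distribution respond to different
# cost-inflation scenarios?  All figures are illustrative assumptions.
def profit(cost_inflation):
    demand = random.gauss(100, 12)                 # uncertain daily demand
    unit_margin = 2.0 * (1 - cost_inflation)       # margin eroded by inflation
    return demand * unit_margin - 150              # less a fixed cost of 150

for inflation in (0.0, 0.10, 0.20):
    runs = sorted(profit(inflation) for _ in range(20_000))
    mean = sum(runs) / len(runs)
    p_loss = sum(r < 0 for r in runs) / len(runs)
    print(f"inflation {inflation:.0%}: mean profit {mean:6.1f}, "
          f"P(loss) {p_loss:5.1%}")
```

No scenario is “optimal” here; the value of the exercise is seeing how quickly the probability of loss grows as the assumption changes.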

    However, even with simulation models it is possible to mismanage risk by (Stulz, 2009):

    • Over-reliance on historical data
    • Using too narrow risk metrics – measures such as value at risk, probably the single most important in financial services, have underestimated risks
    • Overlooking knowable risks
    • Overlooking concealed risks
    • Failure to communicate effectively – failing to appreciate the complexity of the risks being managed
    • Not managing risks in real time – you have to be able to monitor changing markets and respond appropriately. You need a plan.

    Being fully aware of these possible pitfalls, we have methods and techniques that can overcome them, and since we estimate the full probability distributions we can deploy a number of risk metrics without having to rely on simple measures like value at risk – which we in fact never use.

    References

    COSO, (2004, September). Enterprise risk management — integrated framework. Retrieved from http://www.coso.org/documents/COSO_ERM_ExecutiveSummary.pdf

    COSO, (2009, October). Strengthening enterprise risk management for strategic advantage. Retrieved from http://www.coso.org/documents/COSO_09_board_position_final102309PRINTandWEBFINAL_000.pdf

    Sterman, J. D. (1991). A Skeptic’s Guide to Computer Models. In Barney, G. O. et al. (eds.),
    Managing a Nation: The Microcomputer Software Catalog. Boulder, CO: Westview Press, 209-229.

    Stulz, R. M. (2009, March). Six ways companies mismanage risk. Harvard Business Review. Retrieved from http://hbr.org/2009/03/six-ways-companies-mismanage-risk/ar/1


  • Risk, price and value


    This entry is part 3 of 4 in the series A short presentation of S@R

     

    Having arrived at the probability distribution for the value of equity (see full story) we are able to calculate expected gain, loss and their probability when investing in a company where the capitalized value (price) is known. (see “The Probability of Gain and Loss”)

    In the figure below we have illustrated the investment and speculative areas. The investment area comprises the part of the cumulative probability distribution below 50%.

     

    [Figure: Investment and speculative areas]

    The speculative area is the area above 50%. The expected value is given at the 50% probability point (dashed line). The literature advises, and successful investors insist on, a safety margin (discount) of at least 20% between expected (intrinsic) value and the market price, as shown by the yellow area in the figure below. Graham and Dodd introduced the concept of a margin of safety in Security Analysis in 1934.

    In a stochastic framework such as ours it is better to set the safety margin at one of the percentiles or quartiles, which gives the value of the safety margin directly. A fixed-percentage safety margin will give a different probability of gain (loss) depending on the shape of the cumulative probability distribution.

    An investor holding a portfolio of stocks should thus use percentiles as a margin, giving the same probability of gain (loss) throughout the portfolio. In the case below, a 20% safety margin coincides with the first quartile, giving a 25% probability of loss and a 75% probability of gain. The expected value of the company is 1.452 and the first quartile is 1.160, giving an expected gain of 292 or more with 75% probability (dotted lines).
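The percentile-based margin can be sketched on a simulated value distribution. The lognormal form and its parameters below are our assumptions, chosen only so that the mean lands near the article’s expected value of 1.452:

```python
import random
import statistics

random.seed(7)

# Simulated distribution for the value of equity (assumed lognormal).
values = sorted(random.lognormvariate(7.25, 0.25) for _ in range(100_000))

expected = statistics.fmean(values)
q1 = values[len(values) // 4]        # first quartile as the safety-margin limit

margin_pct = 1 - q1 / expected       # the implied percentage discount
p_loss = sum(v < q1 for v in values) / len(values)

print(f"expected value {expected:.0f}, first quartile {q1:.0f}")
print(f"implied safety margin {margin_pct:.0%}, "
      f"P(loss at that price) = {p_loss:.0%}")
```

Buying at or below the first quartile fixes the probability of loss at 25% regardless of the distribution’s shape, whereas a fixed 20% discount would imply a different loss probability for every company.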

    We know that the total risk of any individual asset is the sum of the systematic and unsystematic risk. When computing the figure above we have used the company’s appropriate beta to account for the systematic risk (in calculating WACC). The unsystematic risk is given by the variance in the figure above.

    In a well-diversified portfolio the expected value of the unsystematic return is assumed to be zero. When investing in a single asset we should be looking for assets with a high unsystematic return – in our context, companies with a capitalized value below the percentile set as the limit of the safety margin.

    References

    1. Graham, B., & Dodd, D. L. (1996). Security Analysis: The Classic 1934 Edition. McGraw-Hill Professional Publishing. ISBN 0070244960
    2. and an interesting website: The Graham-Buffett Teaching Endowment
  • The Value of Information


    This entry is part 4 of 4 in the series A short presentation of S@R

     

    Enterprise risk management (ERM) only has value to those who know that the future is uncertain

    Businesses have three key needs:

    First, they need to have a product or service that people will buy. They need revenues.

    Second, they need to have the ability to provide that product or service at a cost less than what their customers will pay. They need profits. Once they have revenues and profits, their business is a valuable asset.

    So third, they need to have a system to avoid losing that asset because of unforeseen adverse experience. They need risk management.

    The top CFO concern is the firm’s ability to forecast results, and the first stepping-stone in forecasting results is to forecast demand – this is where ERM starts.

    The main risk any firm faces is the variability (uncertainty) of demand. Since all production activities – procurement of raw materials, sizing of the work force, investment in machinery, etc. – are based on expected demand, the task of forecasting future demand is crucial. It is of course difficult, and in most cases impossible, to forecast demand perfectly, but it is always possible to make forecasts that give better results than mere educated guesses.

    We will attempt in the following to show the value of making good forecasts by estimating the daily probability distribution for demand. We will do this using a very simple model, assuming that:

    1. Daily demand is normally distributed with expected sales of 100 units and a standard deviation of 12 units,
    2. the product cannot be stocked,
    3. it sells at $4 per unit, has a variable production cost of $2 per unit and a fixed production cost of $50.

    Now we need to forecast the daily sales. If we had perfect information about the demand, we would have a probability distribution for daily profit as given by the red histogram and line in the graphs below.

    • One form of forecast (average) is the educated guess of using average daily sales (blue histogram). As the graphs show, this forecast method gives a large downside (production too high) and no upside (production too low).
    • A better method (limited information) would be to forecast demand by its relation to some other observable variable. Let us assume a forecast method that gives a near-perfect forecast in 50% of the cases and, for the rest, a forecast that is normally distributed with the same expected sales as demand but with a standard deviation of six units (green histogram).
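One reading of this setup can be simulated directly. The sketch below assumes the “limited information” forecast is near-perfect half the time and otherwise unbiased around actual demand with a 6-unit standard deviation; this interpretation, and the resulting percentage gain, are ours – the article’s own model reports the gains from its histograms.

```python
import random
import statistics

random.seed(3)

PRICE, VAR_COST, FIXED = 4.0, 2.0, 50.0

def profit(demand, production):
    # the product cannot be stocked: unsold production is lost
    return PRICE * min(demand, production) - VAR_COST * production - FIXED

N = 100_000
demands = [random.gauss(100, 12) for _ in range(N)]

perfect = [profit(d, d) for d in demands]       # produce exactly demand
average = [profit(d, 100) for d in demands]     # always produce the mean
# limited information: near-perfect half the time, otherwise an unbiased
# forecast with a 6-unit standard deviation around actual demand (assumed)
limited = [profit(d, d if random.random() < 0.5 else random.gauss(d, 6))
           for d in demands]

for name, runs in (("perfect", perfect), ("average", average),
                   ("limited", limited)):
    print(f"{name:8s} mean profit {statistics.fmean(runs):6.2f}")

gain = statistics.fmean(limited) / statistics.fmean(average) - 1
print(f"limited-information forecast beats the average forecast by {gain:.0%}")
```

The ordering of the three means – perfect information first, limited information second, the plain average last – reflects the ranking the series describes.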

    [Figure: Profit histograms]

    With the knowledge we have from (selecting strategy), we clearly see that the last forecast strategy is stochastically dominant over using average demand as the forecast.

    [Figure: Profit]

    So, what is the value to the company of more informed forecasts than the mere use of expected sales? The graph below gives the distribution of the difference in profit (in percent) between the two methods. Over time, the second method will on average give an 8% higher profit than just using average demand as the forecast.

    [Figure: Difference in profit]

    However, there is still another seven to eight percent of room for further improvement in the forecasting procedure.

    If the company could be reasonably sure that a better forecast model than the average exists, it would be a good strategy to invest in such an improvement. In fact, it could spend up to 8% of all future profits if it knew that a method as good as or better than our second method existed.