Risk – Strategy @ Risk

Tag: Risk

  • We’ve Got Mail! (2)


    This entry is part 2 of 2 in the series Self-applause

    From: slideshare

Congrats – you are in the Top 25% of Most-Viewed on SlideShare:

[Image: SlideShare statistics table]

    Even if we did not post anything new on SlideShare in 2014, our contribution ended up in the upper quartile of the most viewed articles on SlideShare. We believe that this shows the great current interest in uncertainty and risk related to economic issues.

The article The Weighted Average Cost of Capital was originally published in the January/February 2003 issue of Financial Engineering News, and was many years later reprinted among “The Best of Financial Engineering News”:

[Image: reprint in “The Best of Financial Engineering News”]

    And it is still a relevant and popular article both on SlideShare and here. Since we put up our home page in 2009 the number of visitors and page loads has increased steadily:

[Image: visitors and page loads, 2009–2014]

At the end of 2014, almost 80,000 people had visited our home page and read one or more of the articles posted there. The growth in visitors and articles read again shows the increasing interest in uncertainty, risk and Monte Carlo simulation – topics we try to cover to the best of our ability.

    Thanks
    S@R

  • Risk tolerance

One of the most important concepts within risk management is risk tolerance. Without clearly defining risk tolerance it is virtually impossible to implement good risk management, since we do not know what to measure risk against.

Defining risk tolerance means defining how much risk the business can live with. Risk tolerance is vitally important in the choice of strategy and in the implementation of the chosen strategy. It may well be that the business is unable to take strategic opportunities because it does not have the ability to carry the risk inherent in the desired strategy.

The risk-carrying ability must therefore be mapped, and preferably quantified, at the beginning of the strategy process, and throughout the process possible strategic choices must be measured against the risk-carrying ability of the business. For example, if the financing ability puts a stop to further expansion, it limits the strategic choices the business may make.

Risk tolerance must be measured against the key figures for which the business is most vulnerable. Assessing risk tolerance as a more or less arbitrary number (say, 1 million) makes it close to impossible to understand risk tolerance in an appropriate way. Hence, the business needs a good understanding of what drives its value creation, and of what sets limits on strategic choices. If the most vulnerable key figure for a business is its equity ratio, then risk tolerance needs to be measured against that ratio.

The fact that risk tolerance must be measured against something means that it is a great advantage for a business to have models that can estimate risk quantitatively, showing clearly which variables and relationships have the biggest impact on the key figures most at risk.

    Originally published in Norwegian.

  • The tool that would improve everybody’s toolkit


Edge, which every year ((http://www.edge.org/questioncenter.html)) invites scientists, philosophers, writers, thinkers and artists to opine on a major question of the moment, asked this year: “What scientific concept would improve everybody’s cognitive toolkit?”

The questions are designed to provoke fascinating and inspiring answers, and are typically open-ended, such as “What will change everything?” (2008), “What are you optimistic about?” (2007), and “How is the internet changing the way you think?” (last year’s question). Often these questions ((Since 1998)) are turned into paperback books.

This year many of the 151 contributors pointed to risk and uncertainty in their answers. In the following we bring excerpts from some of them, but we advise the interested reader to look up the complete answers:

    A Probability Distribution

    The notion of a probability distribution would, I think, be a most useful addition to the intellectual toolkit of most people.

    Most quantities of interest, most projections, most numerical assessments are not point estimates. Rather they are rough distributions — not always normal, sometimes bi-modal, sometimes exponential, sometimes something else.

    Related ideas of mean, median, and variance are also important, of course, but the simple notion of a distribution implicitly suggests these and weans people from the illusion that certainty and precise numerical answers are always attainable.

    JOHN ALLEN PAULOS, Professor of Mathematics, Temple University, Philadelphia.

    Randomness

    The First Law of Randomness: There is such a thing as randomness.
    The Second Law of Randomness: Some events are impossible to predict.
The Third Law of Randomness: Random events behave predictably in aggregate even if they’re not predictable individually.

    CHARLES SEIFE, Professor of Journalism, New York University; formerly journalist, Science magazine; Author, Proofiness: The Dark Arts of Mathematical Deception.
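Seife’s third law is easy to demonstrate with a quick simulation (a hypothetical illustration, not part of his answer): no single coin flip can be predicted, yet the aggregate behaves very predictably.

```python
import random

random.seed(42)

def fraction_heads(n):
    """Simulate n fair coin flips and return the fraction of heads."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

# Individual flips are unpredictable, but with 100,000 flips the
# fraction of heads lands within 1% of 0.5.
assert abs(fraction_heads(100_000) - 0.5) < 0.01
```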

    The Uselessness of Certainty

Every knowledge, even the most solid, carries a margin of uncertainty. (I am very sure about my own name … but what if I just hit my head and got momentarily confused?) Knowledge itself is probabilistic in nature, a notion emphasized by some currents of philosophical pragmatism. Better understanding of the meaning of probability, and especially realizing that we never have, nor need, ‘scientifically proven’ facts, but only a sufficiently high degree of probability, in order to take decisions and act, would improve everybody’s conceptual toolkit.

    CARLO ROVELLI, Physicist, University of Aix-Marseille, France; Author, The First Scientist: Anaximander and the Nature of Science.

    Uncertainty

    Until we can quantify the uncertainty in our statements and our predictions, we have little idea of their power or significance. So too in the public sphere. Public policy performed in the absence of understanding quantitative uncertainties, or even understanding the difficulty of obtaining reliable estimates of uncertainties usually means bad public policy.

    LAWRENCE KRAUSS, Physicist, Foundation Professor & Director, Origins Project, Arizona State University; Author, A Universe from Nothing; Quantum Man: Richard Feynman’s Life in Science.

    Risk Literacy

    Literacy — the ability to read and write — is the precondition for an informed citizenship in a participatory democracy. But knowing how to read and write is no longer enough. The breakneck speed of technological innovation has made risk literacy as indispensable in the 21st century as reading and writing were in the 20th century. Risk literacy is the ability to deal with uncertainties in an informed way.

    GERD GIGERENZER, Psychologist; Director of the Center for Adaptive Behavior and Cognition at the Max Planck Institute for Human Development in Berlin; Author, Gut Feelings.

    Living is fatal

    The ability to reason clearly in the face of uncertainty. If everybody could learn to deal better with the unknown, then it would improve not only their individual cognitive toolkit (to be placed in a slot right next to the ability to operate a remote control, perhaps), but the chances for humanity as a whole.

    SETH LLOYD, Quantum Mechanical Engineer, MIT; Author, Programming the Universe.

    Uncalculated Risk

    We humans are terrible at dealing with probability. We are not merely bad at it, but seem hardwired to be incompetent, in spite of the fact that we encounter innumerable circumstances every day which depend on accurate probabilistic calculations for our wellbeing. This incompetence is reflected in our language, in which the common words used to convey likelihood are “probably” and “usually” — vaguely implying a 50% to 100% chance. Going beyond crude expression requires awkwardly geeky phrasing, such as “with 70% certainty,” likely only to raise the eyebrow of a casual listener bemused by the unexpected precision. This blind spot in our collective consciousness — the inability to deal with probability — may seem insignificant, but it has dire practical consequences. We are afraid of the wrong things, and we are making bad decisions.

    GARRETT LISI, Independent Theoretical Physicist

    And there is more … much more at the Edge site

  • Solving Uncertainty in Simulation Models


    I shall be telling this with a sigh
    Somewhere ages and ages hence:
    Two roads diverged in a wood, and I–
    I took the one less travelled by,
    And that has made all the difference

    …Robert Frost, 1916


Uncertainty in your operations is most likely complex and will need systematic treatment through simulation modeling. S@R carries out thorough analyses of a company’s risks and uncertainties with the aim of producing good decision support tools – making sure the client takes a large step beyond scenario analysis.

    The Four Levels of Uncertainty

    The uncertainty that remains after the best possible analysis has been done is what we call residual uncertainty (Courtney, Kirkland & Viguerie, 1997).

    In our world ‘the best possible analysis’ means that we have a model ‘good enough’ to describe the business under study. The question then is – do we need to take into account the uncertainties that always will be inherent in its operations and markets?  And if we have to, is it possible?

    A useful distinction between the different situations that can arise is given by Courtney et al.  as four levels of residual uncertainty (see figure below, McKinsey Quarterly, Dec. 2008):

    1. A Clear-Enough Future; managers can develop a single forecast of the future that is precise enough for strategy development.
    2. Alternate Futures; the future can be described as one of a few discrete scenarios. Analysis cannot identify which outcome will occur, although it may help establish probabilities.
    3. A Range of Futures; a range of potential futures can be identified. That range is defined by a limited number of key variables, but the actual outcome may lie anywhere along a continuum bounded by that range.
    4. True Ambiguity; multiple dimensions of uncertainty interact to create an environment that is virtually impossible to predict.

In real life, however, there is a problem with identifying the level we are facing. The definition of 1st-level uncertainty implies that the residual uncertainty is irrelevant to the strategic decisions under study. But how is it possible to know this before an uncertainty analysis has been performed?

The answer has to be that the best possible analysis performed is a risk/uncertainty analysis taking into account all known uncertainties in the business’s environment, and that of all the business’s feasible strategies one is always best (1st-order stochastic dominance).

The best strategy will then be the one giving a probability distribution for the business’s equity value that lies to the right of, and under, the distributions for all other strategies. In this case the exact equity value is of less importance, since it will in any case be larger than under any other strategy. With this established, the actual analysis can be performed as a deterministic calculation.
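First-order stochastic dominance can be checked directly on simulated equity-value distributions: one strategy dominates another when its cumulative distribution lies at or to the right of the other's everywhere. A minimal sketch (the strategies and all figures are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical simulated equity values (10,000 runs) under two strategies.
strategy_a = rng.normal(120, 15, 10_000)
strategy_b = rng.normal(100, 15, 10_000)

def dominates_fosd(a, b, tol=0.02):
    """Empirical first-order stochastic dominance: a dominates b when
    a's empirical CDF lies at or below (i.e. to the right of) b's at
    every point. tol absorbs sampling noise in the empirical CDFs."""
    grid = np.linspace(min(a.min(), b.min()), max(a.max(), b.max()), 400)
    cdf_a = np.searchsorted(np.sort(a), grid) / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid) / len(b)
    return bool(np.all(cdf_a <= cdf_b + tol))

assert dominates_fosd(strategy_a, strategy_b)
assert not dominates_fosd(strategy_b, strategy_a)
```

When neither distribution dominates, the choice has to fall back on the company’s risk profile rather than on dominance alone.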

For 2nd-level uncertainties with alternate futures, a scenario analysis is often advocated. However, the same applies to each alternate future as to the 1st-level uncertainties (also see scenario analysis). In addition, some assumptions have to be made about the probabilities of each of the alternate futures.

As an example, take a company analyzing investment in production facilities in two alternative countries. In one country there is a sovereign risk of a future new tax scenario, and if it is imposed two different scenarios are possible. In the other country the tax scenario is fixed – not expected to change. In this case you will need at least three (at most five) models, all taking into account the inherent risk in the business, giving the probability distribution for equity value for:

1. current operations,
2. current operations + investment in the country with no sovereign risk, and
3. current operations + investment in the country with sovereign risk, under:
  1. no new tax scenario, and
  2. each of the two different new tax scenarios.

The reason for different models, even though the operations in the new facility will be the same regardless of country, is that the business strategy might differ between countries and the investment strategy might differ between tax scenarios. The model with sovereign risk will switch between the tax-scenario models according to the probability of their occurrence, generating the distribution for equity value given the sovereign risk.
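The switching logic amounts to a probability-weighted mixture of the scenario models. A sketch (all scenario distributions and probabilities here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(11)
n = 10_000

# Hypothetical equity-value simulations (same business risk) under the
# three tax scenarios in the country with sovereign risk.
no_new_tax   = rng.normal(110, 20, n)
tax_regime_1 = rng.normal(95, 20, n)
tax_regime_2 = rng.normal(85, 25, n)

# Assumed probabilities of each sovereign-risk outcome.
p = [0.60, 0.25, 0.15]

# Switch between the scenario models draw by draw, according to the
# probability of each scenario occurring.
scenario = rng.choice(3, size=n, p=p)
equity = np.choose(scenario, [no_new_tax, tax_regime_1, tax_regime_2])

# The mixture distribution reflects both business risk and sovereign risk.
print(round(equity.mean(), 1))
```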

To invest, at least one of the equity distributions for ‘current operations + investment’ should lie to the right of, and under, the distribution for ‘current operations’ (or be stochastically dominant). Likewise, the best investment alternative will have an equity distribution lying to the right of, and under, the distribution for the other alternative (or be stochastically dominant).

Having the equity distribution for the dominant strategy opens the way for measuring the strategy’s inherent risk beyond simple value-at-risk calculations, putting emphasis on the possibility of large losses and further unwanted capital infusions.

    As we now can see, directly applying a standard scenario analysis can quickly lead decision makers astray.

The above classification of the first two levels can in general only be made after a full risk/uncertainty analysis, and can never be used ex ante to select the appropriate method.

The 3rd level of uncertainty describes the normal situation, where all exogenous variables have a range of possible values. Assuming that we can find (estimate or guesstimate) the probability distribution over that range, we can attack the problem by Monte Carlo simulation and calculate the probability distributions of our endogenous variables.
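A minimal sketch of such a level-3 Monte Carlo simulation, propagating assumed input distributions through to an endogenous variable (all the distributions and figures below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000

# Hypothetical exogenous inputs, each with its own estimated distribution.
price     = rng.normal(10.0, 1.0, n)            # sales price per unit
volume    = rng.triangular(800, 1000, 1300, n)  # units sold
unit_cost = rng.uniform(6.0, 7.5, n)            # variable cost per unit
fixed     = 1500.0                              # fixed costs (certain)

# Endogenous variable: EBITDA, now a distribution rather than a point.
ebitda = (price - unit_cost) * volume - fixed

print(f"P(EBITDA < 0) = {np.mean(ebitda < 0):.1%}")
print("5%/50%/95% percentiles:", np.percentile(ebitda, [5, 50, 95]).round(0))
```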

The 4th level of uncertainty comprises at least two different situations: one where the probabilities are unknown but knowable, and one where they are unknown and unknowable:

    Ambiguity is uncertainty about probability, created by missing information that is relevant and could be known (Camerer & Weber, 1992).

    This leads us to a more comprehensive discussion of the situations that will arise in decision making processes:

    More generally, we propose that in most decision problems, “choice” is nothing but the terminal act of a problem-solving activity, preceded by the formulation of the problem itself, the identification of the relevant information, the application of pre-existing competences or the development of new ones to the problem solution and, finally, the identification of alternative courses of action. (Dosi & Egidi, 1991)

    The origin of uncertainty

    Uncertainty may have two origins:

    1. the lack of all the information which would be necessary to make decisions with certain outcomes (substantive uncertainty), and
    2. limitations on the computational and knowledge based capabilities, given the available information (procedural uncertainty).

    The first source of uncertainty comes from information incompleteness, and the second from the inability to recognize, interpret and act on the relevant information, even when it is available – knowledge incompleteness.

To distinguish between the two situations giving Courtney’s 4th level of uncertainty we follow Dosi & Egidi:

1. Weak substantive uncertainty (analogous to Knight’s “risk”) covers all circumstances where uncertainty simply derives from lack of information about the occurrence of a particular event with a known (or at least knowable) probability distribution, and
2. Strong substantive uncertainty (analogous to Knight’s and Keynes’ “uncertainty”) covers all cases involving unknown events, or the impossibility, even in principle, of defining the probability distributions of the events themselves.

    Types of Uncertainty

Uncertainty estimation usually means estimating the uncertainty of the output parameters from the uncertainty of the input parameters, by estimating a probability distribution of the error. This is fairly straightforward as long as the input parameters have values. However, the uncertainty of a model does not lie only in its parameters; there may also be uncertainty in the structure of the model, e.g. in which variables and parameters belong in it.

Adopting the distinction between parametric and structural uncertainty (Kyläheiko et al., 2002) we can further specify model uncertainty:

    1. Parametric; uncertainty or imperfect knowledge about the parameters in the decision model, and
    2. Structural (epistemic); uncertainty or imperfect knowledge about the structure of the model.

    Combining the above we can describe the types of risk and uncertainty facing both the decision maker and the decision support model as in the following picture:

The purpose is then to solve weak substantive parametric and structural uncertainty using good methods and models. The model will constitute a mix of facts ((In a simulation the opening balance is usually considered certain, but the balance sheet often contains highly uncertain items. In fact, auditors should give interval estimates for the most critical items in the yearly balance report.)) (certain values), risks with known (objective) probability distributions, uncertainties given by subjective probability distributions, and a script of the firm’s operations.

    Modeling

However, models will always have some structural uncertainty – even if it were possible to remove it all by introducing more and more variables and relations. Occam’s razor can usually be applied with good results: select the model that introduces the fewest assumptions and postulates the fewest entities while still sufficiently answering the question. Borrowing from multidimensional scaling the term ‘stress’ – the violation done to the actual decision structure by removing parameters or variables from the model – we can visualize this in the following figure:

Reducing the dimensionality of the model will not necessarily reduce or move (distort) the endogenous variables’ event space, since correlation exists between the variables omitted and the variables kept in the model, and – depending on estimation methods – the standard errors of the estimated relationships will increase, maintaining the original model variability.

    Strategy

    Maybe the world and the uncertainties we face haven’t changed all that much as a result of the financial crisis, but our perception of risks has. That means there is a real opportunity to rethink the way we make strategic decisions, the way we plan under uncertainty. (Courtney, McKinsey Quarterly, Dec. 2008)

    The development of strategy requires the courage to accept uncertainty. Strategists must accept that they will not have all of the information and not see the full spectrum of possible events, yet be committed to create and implement strategy. The uncertainty that exists is not only a product of not having complete information and being able to predict future events, it also is a product of the events generated by dynamic and thinking competitors.

    By its nature, uncertainty invariably involves the estimation and acceptance of risk. Risk is equally common to action and inaction. Risk may be related to gain; greater potential gain often requires greater risk. However, we should clearly understand that the acceptance of risk does not equate to the imprudent willingness to gamble the entire likelihood of success on improbable events.

One important step in the direction of better and more informed decision making is the removal of procedural uncertainty by having good models capable of framing the circumstances under which decisions are made – giving the best possible analysis.

    It is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail. (Maslow, 1966)


    References

A fresh look at strategy under uncertainty: An interview. (2008, December). McKinsey Quarterly. http://www.mckinseyquarterly.com/fresh_look_at_strategy_under_uncertainty_2256

    Camerer, C. & Weber, M., (1992). Recent Developments in Modelling Preferences: Uncertainty and Ambiguity, Journal of Risk and Uncertainty, Springer, vol. 5(4), 325-70.

    Courtney, H., (2001). 20/20 Foresight. Boston: Harvard Business School Press.

    Courtney, H. G., Kirkland, J., & Viguerie, P. S., (1997). Strategy Under Uncertainty. Harvard Business Review, 75(6), 67-79.

    Dequech, D., (2000), Fundamental Uncertainty and Ambiguity, Eastern Economic Journal, 26(1), 41-60.

Dosi, G. & Egidi, M., (1991). Substantive and Procedural Uncertainty: An Exploration of Economic Behaviours in Changing Environments, Journal of Evolutionary Economics, Springer, 1(2), 145-68.

    Frost, R., (1916). Mountain interval. Henry Holt And Company.

    Keynes, J., (2004). A Treatise on Probability. New York: Dover Publications.

    Knight, F. (1921). Risk, Uncertainty and Profit. Boston: Houghton Mifflin.

Kyläheiko, K., Sandström, J. & Virkkunen, V., (2002). Dynamic capability view in terms of real options. International Journal of Production Economics, 80(1), 65-83.

    Maslow, A., (1966). The Psychology of Science. South Bend: Gateway Editions, Ltd.


  • The Case of Enterprise Risk Management


    This entry is part 2 of 4 in the series A short presentation of S@R


    The underlying premise of enterprise risk management is that every entity exists to provide value for its stakeholders. All entities face uncertainty and the challenge for management is to determine how much uncertainty to accept as it strives to grow stakeholder value. Uncertainty presents both risk and opportunity, with the potential to erode or enhance value. Enterprise risk management enables management to effectively deal with uncertainty and associated risk and opportunity, enhancing the capacity to build value. (COSO, 2004)

    The evils of a single point estimate

    Enterprise risk management is a process, effected by an entity’s board of directors, management and other personnel, applied in strategy setting and across the enterprise, designed to identify potential events that may affect the entity, and manage risk to be within its risk appetite, to provide reasonable assurance regarding the achievement of entity objectives. (COSO, 2004)

Traditionally, when estimating costs, project value or equity value, or when budgeting, one number is generated – a single point estimate. There are many problems with this approach. In budget work this point is too often given as the best management can expect, but in some cases budgets are set artificially low, generating bonuses for later performance beyond budget. The following graph depicts the first case.

[Image: budgeted, actual and expected EBITDA against the simulated distribution]

Here we have simulated the probability distribution for next year’s EBITDA, based on the production and market structure and on management’s assumptions about the variability of all relevant input and output variables. The graph gives the budgeted value, the actual result and the expected value. Both budget and actual value are above the expected value, but the budgeted value was far too high, giving a realized EBITDA lower than budget with more than 80% probability. In this case the board will be misled with regard to the company’s ability to earn money, and all subsequent decisions based on the budgeted EBITDA can endanger the company.
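With the simulated distribution at hand, the probability of falling short of budget is a one-line calculation. A sketch with invented figures (not the client’s data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical simulated EBITDA distribution (10,000 Monte Carlo runs).
ebitda = rng.lognormal(mean=np.log(100), sigma=0.25, size=10_000)

budget = 130.0            # an optimistic single-point budget
expected = ebitda.mean()  # the distribution's expected value

# Share of simulated outcomes that fall below the budgeted figure.
p_below_budget = np.mean(ebitda < budget)
print(f"Expected EBITDA: {expected:.1f}")
print(f"P(actual < budget) = {p_below_budget:.0%}")
```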

    The organization’s ERM system should function to bring to the board’s attention the most significant risks affecting entity objectives and allow the board to understand and evaluate how these risks may be correlated, the manner in which they may affect the enterprise, and management’s mitigation or response strategies. (COSO, 2009)

It would have been much better for the board to be given both the budget value and the accompanying probability distribution, allowing it to make an independent judgment about the possible size of next year’s EBITDA. Only then – from the shape of the distribution, its location and the point estimate of budgeted EBITDA – will the board be able to assess the risk and opportunity facing the company.

    Will point estimates cancel out errors?

In the following we measure the deviation of the actual result both from the budget value and from the expected value. The blue dots represent daughter companies located in different countries. For each company we have the deviation (in percent) of the budgeted EBITDA (bottom axis) and of the expected value (left axis) from the actual EBITDA observed 1½ years later.

If the deviations for a company fall in the upper right quadrant, they are positive for both budget and expected value – the company is overachieving.

If they fall in the lower left quadrant, they are negative for both budget and expected value – the company is underachieving.

If they fall in the upper left quadrant, the deviation is negative for budget and positive for expected value – the company is overachieving but had too high a budget.

With left-skewed EBITDA distributions there should not be any observations in the lower right quadrant; that will only happen when the distributions are skewed to the right – and then there will not be any observations in the upper left quadrant.

The graph below shows that two companies have seriously underperformed and that the budget process did not catch the risk they were facing. The rest of the companies have done very well; some, however, have seriously underestimated the opportunities manifested in the actual result. From an economic point of view the mother company would of course have preferred all companies (blue dots) to fall above the x-axis, but due to the stochastic nature of EBITDA it has to accept that some always will fall below. Risk-wise, it would have preferred the companies to fall to the right of the y-axis, but due to budget uncertainty it has to accept that some always will fall to the left. However, large deviations below the x-axis or to the left of the y-axis add to the company’s risk.
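The quadrant classification above can be applied mechanically to each subsidiary’s pair of deviations (the companies and figures below are invented for illustration):

```python
# Hypothetical percentage deviations of actual EBITDA from budget (x)
# and from the simulated expected value (y), one pair per subsidiary.
companies = {
    "A": (12.0, 8.0),     # above budget, above expected
    "B": (-15.0, -10.0),  # below both
    "C": (-5.0, 4.0),     # below budget, above expected
}

def quadrant(dev_budget, dev_expected):
    """Classify a (budget deviation, expected-value deviation) pair."""
    if dev_budget >= 0 and dev_expected >= 0:
        return "overachieving"
    if dev_budget < 0 and dev_expected < 0:
        return "underachieving"
    if dev_budget < 0 and dev_expected >= 0:
        return "overachieving, but budget too high"
    # Lower right: only possible with right-skewed distributions.
    return "beat budget, but fell short of expected value"

for name, (db, de) in companies.items():
    print(name, quadrant(db, de))
```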

[Image: budget and expected-value deviations by company]

    A situation like the one given in the graph below is much to be preferred from the board’s point of view.

[Image: budget and expected-value deviations by company – preferred situation]

The graphs above, taken from real life, show that budgeting errors will not cancel out, even across similar daughter companies. Consolidating the companies will give the mother company a left-skewed EBITDA distribution. They also show that you need to be prepared for deviations both positive and negative – you need a plan. So how do you get a plan? You make a simulation model! (See Pdf: Short-presentation-of-S@R#2)

    Simulation

The Latin verb simulare means “to make like” or “to create an exact representation” – to imitate. The purpose of a simulation model is to imitate the company and its environment, so that its functioning can be studied. The model can be a test bed for assumptions and decisions about the company. By creating a representation of the company, a modeler can perform experiments that are impossible or prohibitively expensive in the real world. (Sterman, 1991)

    There are many different simulation techniques, including stochastic modeling, system dynamics, discrete simulation, etc. Despite the differences among them, all simulation techniques share a common approach to modeling.

    Key issues in simulation include acquisition of valid source information about the company, selection of key characteristics and behaviors, the use of simplifying approximations and assumptions within the simulation, and fidelity and validity of the simulation outcomes.

Optimization models are prescriptive, but simulation models are descriptive. A simulation model does not calculate what should be done to reach a particular goal, but clarifies what could happen in a given situation. The purpose of simulations may be foresight (predicting how systems might behave in the future under assumed conditions) or policy design (designing new decision-making strategies or organizational structures and evaluating their effects on the behavior of the system). In other words, simulation models are “what if” tools. Often such “what if” information is more important than knowledge of the optimal decision.

    However, even with simulation models it is possible to mismanage risk by (Stulz, 2009):

• Over-reliance on historical data
• Using too narrow risk metrics, such as value at risk – probably the single most important measure in financial services – which have underestimated risks
• Overlooking knowable risks
• Overlooking concealed risks
• Failing to communicate effectively – failing to appreciate the complexity of the risks being managed
• Not managing risks in real time; you have to be able to monitor changing markets and respond appropriately – you need a plan

Being fully aware of the possible pitfalls, we have methods and techniques that can overcome these issues, and since we estimate full probability distributions we can deploy a number of risk metrics, not having to rely on simple measures like value at risk – which we actually never use.
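Given a full simulated distribution, tail metrics such as value at risk and expected shortfall – which, as argued above, should not be relied on alone – fall out directly. A hypothetical sketch with an invented profit distribution:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical simulated profit distribution (negative = loss).
profit = rng.normal(50, 60, 100_000)
losses = -profit
alpha = 0.95

# Value at risk: the loss not exceeded with 95% probability ...
var_95 = np.quantile(losses, alpha)

# ... and expected shortfall (CVaR): the average loss beyond VaR,
# which captures the tail that VaR ignores.
cvar_95 = losses[losses >= var_95].mean()

# A simpler, often more telling figure: the probability of any loss.
p_loss = np.mean(profit < 0)
print(f"VaR(95%) = {var_95:.1f}, CVaR(95%) = {cvar_95:.1f}, "
      f"P(loss) = {p_loss:.1%}")
```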

    References

    COSO, (2004, September). Enterprise risk management — integrated framework. Retrieved from http://www.coso.org/documents/COSO_ERM_ExecutiveSummary.pdf

    COSO, (2009, October). Strengthening enterprise risk management for strategic advantage. Retrieved from http://www.coso.org/documents/COSO_09_board_position_final102309PRINTandWEBFINAL_000.pdf

    Sterman, J. D. (1991). A Skeptic’s Guide to Computer Models. In Barney, G. O. et al. (eds.),
    Managing a Nation: The Microcomputer Software Catalog. Boulder, CO: Westview Press, 209-229.

    Stulz, R.M. (2009, March). Six ways companies mismanage risk. Harvard Business Review (The Magazine), Retrieved from http://hbr.org/2009/03/six-ways-companies-mismanage-risk/ar/1


  • A short presentation of S@R


    This entry is part 1 of 4 in the series A short presentation of S@R


    My general view would be that you should not take your intuitions at face value; overconfidence is a powerful source of illusions. Daniel Kahneman (“Strategic decisions: when,” 2010)

Most companies have some sort of model describing the company’s operations. These models are mostly used for budgeting, but in some cases also for forecasting cash flow and other important performance measures. Almost all are deterministic models based on expected or average values of input data: sales, costs, interest and currency rates, etc. We know, however, that forecasts based on average values are on average wrong. In addition, deterministic models miss the important uncertainty dimension that reveals both the different risks facing the company and the opportunities they produce.

    S@R has set out to create models (See Pdf: Short presentation of S@R) that can give answers to both deterministic and stochastic questions, by linking dedicated EBITDA models to holistic balance simulation taking into account all important factors describing the company. The basis is a real balance simulation model – not a simple cash flow forecast model.

[Image: generic simulation model]

Both the deterministic and the stochastic balance simulation can be set up in two different ways:

1. by using an EBITDA model to describe the company’s operations, or
2. by using coefficients of fabrication as direct input to the balance model.

The first approach implies setting up a dedicated EBITDA subroutine to the balance model. This will give detailed answers to a broad range of questions about operational performance and uncertainty, but entails a higher degree of effort from both the company and S@R.

The use of coefficients of fabrication and their variations is a low-effort (low-cost) alternative, using the internal accounts as basis. This will in many cases give a ‘good enough’ description of the company – its risks and opportunities. The data needed for the company’s economic environment (taxes, interest rates, etc.) will be the same in both alternatives.

[Image: EBITDA model]

In some cases we have used both approaches for the same client, using the second approach for smaller daughter companies with production structures differing from the main company’s. The second approach can also be considered an introduction and stepping stone to a more holistic EBITDA model.

    What problems do we solve?

• The aim, regardless of approach, is to quantify not only the company’s individual and aggregated risks, but also its potential, making the company capable of detailed planning and of executing earlier and more apt actions against risk factors.
• This will improve budget stability through better insight into cost-side risks and income-side potential. It is achieved by an active budget-forecast process; the control-adjustment cycle teaches the company to target realistic budgets – with better stability and increased company value as a result.
• Experience shows that the mere act of quantifying uncertainty throughout the company – and, through modelling, describing the interactions and their effects on profit – in itself over time reduces total risk and increases profitability.
• This is most clearly seen when effort is put into correctly evaluating the effects of strategies, projects and investments on the enterprise. The best way to do this is to compare and choose strategies by analysing each strategy’s risks and potential – and to select the alternative that is (stochastically) dominant given the company’s chosen risk profile.
• Our aim is therefore to transform enterprise risk management from merely safeguarding enterprise value to contributing to the increase and maximization of the firm’s value within its feasible set of possibilities.

    References

Strategic decisions: When can you trust your gut? (2010, March). McKinsey Quarterly.