

  • Working Capital and the Balance Sheet

    Working Capital and the Balance Sheet

    This entry is part 2 of 3 in the series Working Capital

     

    The conservation-of-value principle says that it doesn’t matter how you slice the financial pie with financial engineering, share repurchases, or acquisitions; only improving cash flows will create value. (Dobbs, Huyett & Koller, 2010).

The above, taken from "The CEO's guide to corporate finance", will be our starting point, with Occam's razor as the tool for simplifying the balance sheet using the concepts of working and operating capital.

To get a better grasp of the firm's real activities we will also separate non-operating assets from operating assets – since it is the latter that define the firm's operations.

The operating current assets are the sum of the minimum cash level, inventories and accounts receivable. The difference between total and operating current assets is assumed to be placed in excess marketable securities – and will not be included in the working capital.

Many firms have cash levels well beyond what is actually needed as working capital, tying up capital that could have had better uses, generating higher returns than mere short-term placements.

The net working capital, found by deducting non-interest-bearing current liabilities from operating current assets, is the actual amount of working capital needed to safely run the firm's operations – no more and no less.

By summing net property, plant and equipment and other operating fixed assets we find the total amount of fixed assets involved in the firm's operations. This, together with net working capital, forms the firm's operating assets – the assets that will generate the cash flow and return on equity that the owners expect.
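In compact form, the reclassification reads (a restatement of the definitions above; 'net fixed operating assets' denotes net property, plant and equipment plus other operating fixed assets):

    \begin{aligned}
    \text{Operating current assets} &= \text{Minimum cash} + \text{Inventories} + \text{Accounts receivable}\\
    \text{Net working capital} &= \text{Operating current assets} - \text{Non-interest-bearing current liabilities}\\
    \text{Operating assets} &= \text{Net fixed operating assets} + \text{Net working capital}
    \end{aligned}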

The non-operating part – excess marketable securities and non-operating investments – should be kept as small as possible, since it will at best give only an average market return. The rest of the above calculations gives us the firm's total liabilities and equity, which we will use to set up the firm's ordinary balance sheet:

However, by introducing operating, non-operating and working capital we can get a clearer picture of the firm's activities ((Used in yearly reports by Stora Enso, a large international pulp & paper company, listed on NASDAQ OMX in Stockholm and Helsinki.)):

The balance sheet's bottom line has been reduced by the smaller of operating current assets and non-interest-bearing debt, and the difference between them – the working capital – will appear as an asset or a liability depending on which of the two is larger:

The above calculations are an integral part of our balance simulation model, and the report that can be produced from the simulation for planning, strategy and risk assessment can be viewed here: report for the most likely outcome (Pdf, pp 32). This report can be produced for every run in the simulation, giving the opportunity to look at tail events that might arise, distorting expectations.

    Simplicity is the ultimate sophistication. — Leonardo da Vinci

    References

Dobbs, R., Huyett, B., & Koller, T. (2010). The CEO's guide to corporate finance. McKinsey Quarterly, 4. Retrieved from http://www.mckinseyquarterly.com/home.aspx

    Endnotes

  • Uncertainty modeling

    Uncertainty modeling

    This entry is part 2 of 3 in the series What We Do

    Prediction is very difficult, especially about the future.
    Niels Bohr. Danish physicist (1885 – 1962)

Strategy @ Risk's models provide the possibility to study risk and uncertainty related to operational activities (costs, prices, suppliers, markets, sales channels etc.), financial issues (interest rate risk, exchange rate risk, translation risk, taxes etc.), strategic issues (investments in new or existing activities, valuation, M&A etc.) and a wide range of budgeting purposes.

All economic activity has an inherent volatility that is an integral part of its operations. This means that whatever you do, some uncertainty will always remain.

The aim is to estimate the economic impact that such critical uncertainty may have on corporate earnings at risk. This adds a third dimension – probability – to all forecasts and gives new insight: the ability to deal with uncertainties in an informed way, and thus benefits beyond ordinary spreadsheet exercises.

The results from these analyses can be presented in the form of B/S and P&L looking at the coming one to five years (short term) or five to fifteen years (long term), showing the impact on e.g. equity value, company value, operating income etc. The purpose is to:

• Improve predictability in operating earnings and their expected volatility
• Improve budgeting processes, predicting budget deviations and their probabilities
• Evaluate alternative strategic investment options at risk
• Identify and benchmark investment portfolios and their uncertainty
• Identify and benchmark individual business units' risk profiles
• Evaluate equity values and enterprise values and their uncertainty in M&A processes, etc.

    Methods

    To be able to add uncertainty to financial models, we also have to add more complexity. This complexity is inevitable, but in our case, it is desirable and it will be well managed inside our models.

    People say they want models that are simple, but what they really want is models with the necessary features – that are easy to use. If something is complex but well designed, it will be easy to use – and this holds for our models.

Most companies have some sort of model describing the company's operations. They are mostly used for budgeting, but in some cases also for forecasting cash flow and other important performance measures. Almost all are deterministic models based on expected or average values of the input data: sales, costs, interest and currency rates etc.

We know, however, that forecasts based on average values are on average wrong. In addition, deterministic models will miss the important uncertainty dimension that reveals both the different risks facing the company and the opportunities they bring forth.

S@R has set out to create models that can give answers to both deterministic and stochastic questions, by linking dedicated EBITDA models to a holistic balance simulation taking into account all important factors describing the company. The basis is a real balance simulation model – not a simple cash flow forecast model.

    Both the deterministic and stochastic balance simulation can be set about in two different alternatives:

1. by using an EBITDA model to describe the company's operations, or
2. by using coefficients of fabrication (e.g. kg flour per 1000 bread etc.) as direct input to the balance model – the 'short cut' method.

The first approach implies setting up a dedicated EBITDA subroutine to the balance model. This will give detailed answers to a broad range of questions about markets, capacity-driven investments, operational performance and uncertainty, but entails a higher degree of effort from both the company and S@R. This is a tool for long-term planning and strategy development.

The second ('the short cut') uses coefficients of fabrication and their variations, and is a low-effort (cost) alternative, usually using the internal accounting as its basis. This will in many cases give a 'good enough' description of the company – its risks and opportunities. It can be based on existing investment and market plans. The data needed for the company's economic environment (taxes, interest rates etc.) will be the same in both alternatives:

The 'short cut' approach is especially suited for quick appraisals of M&A cases where time and data are limited and where one wishes to limit the effort in an initial stage. Later, the data and assumptions can be extended into much more sophisticated analyses within the same 'short cut' framework. In this way an analysis can be successively built in the direction the previous studies suggested.

This also makes it a good tool for short-term (3-5 year) analysis and even for budget assessment. Since it uses a limited number of variables – usually less than twenty – to describe the operations, it is easy to maintain and operate. The variables describing financial strategy and the economic environment come in addition, but are easy to obtain.

Used in budgeting, it gives the opportunity to evaluate budget targets, their probable deviation from the expected result, and the probable upside or downside given the budget target (the upside/downside ratio).
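As an illustration, such a budget assessment could be computed from the simulated outcomes along the following lines (a minimal sketch; the variable names and the exact ratio definition are our assumptions here, not a description of the S@R implementation):

    import numpy as np

    def budget_assessment(simulated_results, budget_target):
        """Probability of missing the target, expected deviation, and an
        upside/downside ratio (expected excess over the target relative
        to the expected shortfall below it)."""
        dev = np.asarray(simulated_results) - budget_target
        p_miss = np.mean(dev < 0)                   # probability of falling short of budget
        upside = dev[dev > 0].sum() / dev.size      # expected excess over target
        downside = -dev[dev < 0].sum() / dev.size   # expected shortfall below target
        return p_miss, dev.mean(), upside / downside if downside > 0 else float('inf')

    # Example: 1000 simulated yearly results against a budget target of 100
    p_miss, mean_dev, ud_ratio = budget_assessment(np.random.normal(105, 20, 1000), 100.0)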

Done this way, analyses can be run for subsidiaries across countries, translating the P&L and balance to any currency for benchmarking, investment appraisals, risk and opportunity assessments etc. The final uncertainty distributions can then be 'aggregated' to show the global risk for the mother company.

An interesting feature is the model's ability to start simulations with an empty opening balance. This can be used to assess divisions that do not have an independent balance, since the model will call for equity/debt etc. based on a target ratio, according to the simulated production and sales and the necessary investments. Questions about further investment in divisions or product lines can be studied this way.

Since every run (500 to 1000) in the simulation produces a complete P&L and balance, the uncertainty curve (distribution) for any financial metric – 'yearly result', 'free cash flow', 'economic profit', 'equity value', 'IRR', 'translation gain/loss' etc. – can be produced.

In some cases we have used both approaches for the same client, using the second approach for smaller daughter companies with production structures differing from the main company's.
The second approach can also be considered an introduction and stepping stone to a more holistic EBITDA model.

    Time and effort

The workload for the client is usually limited to a small team (1 to 3 persons) acting as project leaders and principal contacts, ensuring that all the necessary information describing the value and risks of the client's operations can be collected as a basis for modeling and calculations. The type of data will have to be agreed upon depending on the scope of the analysis.

Very often, key people from the controller group will be adequate for this work, and if they don't have the direct knowledge they usually know whom to ask. The work for this team, depending on the scope and choice of method (see above), can vary in effective time from a few days to a couple of weeks, but the elapsed time can stretch from three to four weeks to the same number of months.

For S@R the time frame will depend on the availability of key personnel from the client and the availability of data. The second alternative can take from one to three weeks of normal work, while the first alternative can take three to six months for more complex models. The total time will also depend on the number of analyses that need to be run and the type of reports to be delivered.


    Selecting strategy

Models like this are excellent for the selection and assessment of strategies. Since we can find the probability distribution for equity value, the changes in it brought about by different strategies will form a basis for selecting or adjusting the current strategy. Models including real option strategies are a natural extension of these simulation models:

If one strategy's curve lies to the right of, and under, the curves of all other feasible strategies, it is the stochastically dominant one. If the curves cross, further calculations need to be done before a stochastically dominant or preferable strategy can be found:

    Types of problems we aim to address:

The effects of uncertainties on the P&L and balance, and the effects of the board's strategies (market, hedging etc.) on future P&L and balance sheets, evaluating:

    • Market position and potential for growth
    • Effects of tax and capital cost
    • Strategies
    • Business units, country units or product lines –  capital allocation – compare risk, opportunity and expected profitability
    • Valuations, capital cost and debt requirements, individually and effect on company
    • The future cash-flow volatility of company and the individual BU’s
    • Investments, M&A actions, their individual value, necessary commitments and impact on company
    • Etc.

The aim, regardless of approach, is to quantify not only the company's single and aggregated risks but also its potential, making the company capable of detailed planning and of executing earlier and more apt actions against uncertain factors.

Used in budgeting, this will improve budget stability through better insight into cost-side risks and income-side potentials. This is achieved by an active budget-forecast process; the control-adjustment cycle will teach the company to target realistic budgets – with better stability and increased company value as a result.

This is most clearly seen when effort is put into correctly evaluating the effects of strategies, projects and investments on the enterprise. The best way to do this is by comparing strategies, analyzing each strategy's risks and potential, and selecting the alternative that is (stochastically) dominant given the company's chosen risk profile.

    A severe depression like that of 1920-1921 is outside the range of probability. –The Harvard Economic Society, 16 November 1929

• Big Issues Need Big Tools

Big Issues Need Big Tools

    This entry is part 3 of 3 in the series What We Do

     

You can always amend a big plan, but you can never expand a little one. I don't believe in little plans. I believe in plans big enough to meet a situation which we can't possibly foresee now. – Harry S. Truman, American statesman (33rd US president, 1945-53)

We believe you know your business best and that you will, in your capacity, implement the necessary resources, competence, tools and methods for running a successful and efficient organization.

Still, issues related to uncertainty – whether in finance, stakeholders, production, purchasing or sales – have in most cases increased due to a more complex business environment. Excellent systems for a range of processes (consolidation, customer relationship management, accounting) have kept up with these increasingly complex environments, and so has your most important tool – people.

    But we believe you do not possess the best available method and tool for bringing people – competence, experience, economic/financial facts, assumptions and economic/financial tools together.

You know that your budgets, valuations, projections, estimates and scenario analyses are all made and presented with valuable information about uncertainty left out along the way. The cause may be that human experience of risk is hard to capture, that analyzing, understanding and projecting macro or micro risks is hard, or that the tools were never designed to capture risk. The fact remains that most of the complex, big issues important to companies are decided on insufficient information, a portion of luck, gut feeling, and beliefs in market turns/stability/cycles or other comforting assumptions shared by peers.

Or you are restricted to giving guidelines, min/max orders and specifications, and to trusting third-party experts whom you hope are better at capturing risk and potential in a narrow area of expertise. Regardless of whether this risk spreading or differentiation works – you need the best assistance in setting the guidelines and road map for both your internal and your external resources.

Systems and methods (( A Skeptic's Guide to Computer Models (Pdf, pp 25) , by John D. Sterman. This paper is reprinted from Sterman, J. D. (1991). A Skeptic's Guide to Computer Models. In Barney, G. O. et al., Managing a Nation: The Microcomputer Software Catalog. Boulder, CO: Westview Press, 209-229.)) are never better than human experience, knowledge and excellence. But if you want a method and tool that can capture the best of your existing decision-making process and bring it to a new level, you should look closer at a stochastic, complete P&L/balance simulation model for those big issues and big decisions.

If you are not familiar with stochastic simulation and probability distributions, take a look at a report for the most likely outcome (Pdf, pp 32) from the simulations. Similar reports could have been made for the outcomes you would not have liked to see, giving a heads-up on the sources of downside risk, or for outcomes you would have loved to see – explaining the generators of upside possibilities.

    Endnotes

  • Working Capital Strategy

    Working Capital Strategy

    This entry is part 1 of 3 in the series Working Capital

     

    Passion is inversely proportional to the amount of real information available. See Benford’s law of controversy.

    The annual “REL ((REL Consultancy. (2010). Wikipedia. Retrieved October 10, 2010, from http://en.wikipedia.org/wiki/REL_Consultancy)) /CFO Working Capital Survey” made its debut in 1997 in the CFO Magazine. The magazine identifies working capital management as one of the key issues facing financial executives in the 21st century (Filbeck, Krueger, & Preece, 2007).

    The 2010 Working Capital scorecard (Katz, 2010) and its accompanying data ((http://www.cfo.com/media/201006/1006WCcompletev2.xls)) gives us an opportunity to look at working capital management ((Data from 1,000 of the largest U.S. public companies)); that is the effect of working capital management on the return on capital employed (ROCE):

ROCE = EBIT / Capital employed, or:

ROCE = EBIT / (Operating fixed assets + Net operating working capital)

    From the last formula we can see that – all else kept constant – a reduction in net operating working capital should imply an increased return on capital employed.
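A stylized numerical illustration (figures chosen only to show the arithmetic): with EBIT of 100 and operating fixed assets of 400, halving net operating working capital from 100 to 50 lifts ROCE from

    \text{ROCE} = \frac{100}{400 + 100} = 20.0\%

to

    \text{ROCE} = \frac{100}{400 + 50} \approx 22.2\%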

    Gross and Net Operating Working Capital

A firm's gross working capital comprises its total current assets. One part consists of financial current assets held for reasons other than operations; the other part consists of receivables from operations and the inventory and cash necessary to run them. It is this last part that interests us.

The firm's operations will have been financed long term by equity from owners and loans from lenders. Firms usually also have short-term financing from banks (short-term credit + overdraft facilities/credit lines) and almost always trade credit from suppliers. The rest of the current liabilities – current tax and dividends – will not be considered part of operating current liabilities, since they comprise only non-recurring payments.

    Net working capital is defined as the difference between current assets and current liabilities (see figure below). It can be both positive and negative depending on the firm’s strategic position in the market.

Usually, however, a positive net working capital is required to ensure that the firm is able to continue its operations and has sufficient funds to satisfy both maturing short-term debt and upcoming operational expenses. In the following we assume that any positive net working capital is held as cash and that all excess cash is held as marketable securities.

    By removing from both current assets and liabilities all items not directly related to and necessary for the operations, we arrive at net operating working capital as the difference between operating current assets and operating current liabilities:

    Net operating working capital = Operating current assets – Operating current liabilities


    Working Capital Management

    Working capital management is the administration of current assets as well as current liabilities. It is the main part of a firm’s short-term financial planning since it involves the management of cash, inventory and accounts receivable. Therefore, working capital management will reflect the firm’s short-term financial performance.

Current assets often account for more than half of a company's total assets and hence represent a major investment for small firms, as they cannot be avoided in the way investments in fixed assets can – by renting or leasing. A large inventory ties up capital, but it protects the company against lost sales or production stoppages due to stock-outs. A high level of current assets hence means less risk to the company, but also lower earnings due to higher capital tie-up – the risk-return trade-off (Weston & Copeland, 1986).

Since the needed amount of working capital will differ between industries and also depends on company size, it is easier to base comparisons of working capital management between companies and industries on their cash conversion cycle (CCC).

    The Cash Conversion Cycle

The term "cash conversion cycle" (CCC) refers to the time span between a firm's disbursing and collecting cash. It will thus be 'unrelated' to the firm's size, but dependent on the firm's type of business (see figure below).

Companies that have high inventory turnover and do business on a cash basis usually have a low or negative CCC and hence need very little working capital.

For companies that make investment products the situation is completely different. As these businesses sell expensive items on a long-term payment basis, they tend to have a high CCC and must keep enough working capital on hand to get through any unforeseen difficulties.

    The CCC cannot be directly observed in the cash flows, because these are also influenced by investment and financing activities and must be derived from the firm’s balance sheet:

+ Inventory conversion period (DSI)
+ Receivables conversion period (DSO)
– Payables conversion period (DPO)
= Cash conversion cycle (days)

    Where:

DSI = days sales of inventory, DSO = days sales outstanding, DPO = days payables outstanding, WIP = work in progress, Period = accounting period, and COGS = cost of goods sold ((COGS = Opening inventory + Purchase of goods – Closing inventory)), or:

    + Average inventory+WIP / [COGS/days in period]
    + Average Accounts Receivable / [Revenue / days in period]
    + Average Accounts Payable / [(Inventory increase + COGS)/ days in period]
    = Cash Conversion Cycle (days)
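As a sketch, the cycle can be computed directly from P&L and balance sheet aggregates (the function and its argument names are illustrative assumptions; a 365-day period is assumed):

    def cash_conversion_cycle(avg_inventory_wip, avg_receivables, avg_payables,
                              cogs, revenue, inventory_increase, days=365.0):
        """Cash conversion cycle in days, from period averages."""
        dsi = avg_inventory_wip / (cogs / days)                    # inventory conversion period
        dso = avg_receivables / (revenue / days)                   # receivables conversion period
        dpo = avg_payables / ((inventory_increase + cogs) / days)  # payables conversion period
        return dsi + dso - dpo

    # Example: inventory 1.0, receivables 1.5, payables 0.8 (period averages, EUR M),
    # COGS 7.3, revenue 10.95 and an inventory increase of 0.2 over a 365-day period
    ccc = cash_conversion_cycle(1.0, 1.5, 0.8, 7.3, 10.95, 0.2)    # approx. 61 days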

    The Observations

Even if not all of the working capital is determined by the cash conversion cycle, there should be a tendency towards higher return on operating capital with lower CCC. However, the data from the annual survey (Katz, 2010) does not support this ((Data used with permission from REL/CFO. Twenty of the one thousand observations have been removed as outliers, to give a better picture of the relation)):

The scatter graph shows no direct relation between return on operating capital and the cash conversion cycle. A closer inspection of the data for the survey's different industries confirms this.

Since the total amount of capital invested in the CCC is:

Cap(CCC) = CCC × Sales × (1 + VAT) / days per period

it is a function of sales, and company size will certainly play a role when we only look at the yearly data. The survey, however, also gives the change from 2008 to 2009 for all the companies, so we can remove the size effect by looking at the change (%) in ROCE from a change (%) in CCC:

The graph still shows no obvious relation between the change (%) in CCC and the change (%) in ROCE. Now, we know that the shorter this cycle, the fewer resources the company needs to lock up; reduced debtor levels (DSO), decreased inventory levels (DSI) and/or increased creditor levels (DPO) must have an effect on ROCE – but will it be lost in the clutter of all the other effects of the company's operations on ROCE?

    Cash Management

Net operating working capital is the cash plus cash equivalents needed to pay for the day-to-day operation of the business. This includes demand deposits, money market accounts, currency holdings and highly liquid short-term investments such as marketable securities ((Marketable securities with a maturity of less than three months are referred to as 'cash equivalents' on the balance sheet, those with a longer maturity as 'short-term investments')) – portfolios of highly liquid, near-cash assets which serve as a backup to the cash account.

There are many reasons why holding cash is important: to act as a buffer when daily cash inflows do not match cash outflows (transaction motive), as a safety stock against forecast errors and unforeseen expenses (precautionary motive), or to be able to react immediately when opportunities arise (speculative motive). If the cash level is too low and unexpected outflows occur, the firm will have to either borrow funds or, in the case of an investment, forgo the opportunity.

Such short-term borrowing can be costly, as can an opportunity lost through a rejected investment's foregone returns. Holding cash, however, also carries opportunity costs in the form of lost interest.

Cash management therefore aims at optimizing cash availability and interest income on any idle funds. Cash budgeting – as part of the firm's short-term planning – constitutes the starting point for all cash management activities, as it represents the forecast of cash in- and outflows and therefore reflects the firm's expected availability of, and need for, cash.

    Working Capital Strategy

In the following we will look closer at working capital management using balance simulation ((In the Monte Carlo simulation we have used 200 runs, as that was sufficient to give a good enough approximation of the distributions)). The data is from a company with large fixed assets in infrastructure. The demand for its services is highly seasonal, as schematically depicted in the figure below:

    A company like this will need a flexible working capital strategy with a low level of working capital in the off-seasons and high levels in the high seasons. As the company wants to maximize its equity value it is looking for working capital strategies that can do just that.

The company has been working on its cash conversion cycle and has succeeded: on average only 11.1 days (standard deviation 0.2 days), across seasons, for turning supplied goods and services into cash:

All the same, a substantial amount of the company's resources – on average €4.1M (standard deviation €1.8M) – is invested in the cash conversion cycle:

In addition the company needs a fair amount of cash to meet its other obligations. Its first strategy was to hold cash instead of using short-term financing in the high seasons. In the off-seasons this strategy gives a large portfolio of marketable securities – giving a low return and thereby a low contribution to ROCE. This strategy can be described as being close to the red line in the seasonal graph above.

When we plot the two hundred observed (simulated) values of working capital and the corresponding ROE (from now on we use return on equity (ROE), since this is of more interest to the owners), we get the picture below:

This lax strategy shows little relation between the amount of working capital and the ROE, and from just looking at the graph it would be easy to conclude that working capital management is a waste of time and effort.

Now we turn to a stricter strategy: keeping a low level of cash through all seasons, using short-term financing in the high seasons, and always keeping cash closely tied to expected sales. Again plotting the two hundred observed values, we get the graph below:

From this graph we can clearly see that if we reduce the working capital we will increase the ROE – even in a stochastic environment. By removing some of the randomness in the amount of working capital – keeping it close to what is absolutely needed – we get a much clearer picture of the effect. This strategy is best described as being close to the green line in the seasonal graph.

Since we use pseudo-random numbers ((A pseudo-random number generator (PRNG), also known as a deterministic random bit generator, is an algorithm for generating a sequence of numbers that approximates the properties of random numbers. The sequence is not truly random in that it is completely determined by a set of initial values, called the seed)) we have replicated the first simulation (blue line) for the stricter strategy (green line).

This means that the same events happened under both strategies: changes in sales, prices, costs, interest and exchange rates etc. The effects on the amount of working capital are shown in the graph below:

The lax strategy (blue line) has an average working capital of €4.8M with a standard deviation of €3.0M, while the strict strategy (green line) has an average working capital of €1.4M with a standard deviation of €3.3M.
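The replication mechanism itself is simple (a toy sketch of common random numbers; the two policy functions below are placeholders, not the actual strategies):

    import numpy as np

    def simulate(policy, seed, runs=200):
        """Run a strategy against the same sequence of random events by
        reusing the seed, so differences stem from the policy alone."""
        rng = np.random.default_rng(seed)
        events = rng.normal(size=(runs, 12))    # monthly shocks: sales, prices, rates etc.
        return np.array([policy(e) for e in events])

    lax    = simulate(lambda e: 4.8 + 3.0 * e.sum() / 12, seed=42)
    strict = simulate(lambda e: 1.4 + 3.3 * e.sum() / 12, seed=42)   # same events, stricter policy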

Even if the stricter strategy seems to associate lower amounts of working capital with higher return on equity (see figure), and the amount of working capital is always lower than under the laxer strategy, we have not yet established that it is the better strategy.

To do this we need to simulate the strategies over a number of years and compare the equity values under the two strategies. Doing this, we get the probability distribution for the difference in equity value shown below:

The expected value of the strict strategy over the lax strategy is €3.4M, with a standard deviation of €6.1M. The distribution is skewed to the right, so there is also a possible additional upside. From this we can conclude that the stricter strategy is stochastically dominant over the laxer strategy. However, there might be other strategies that prove to be better still.
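Empirically, first-order stochastic dominance between two simulated samples of equity value can be checked by comparing their empirical distribution functions on a common grid; a sketch:

    import numpy as np

    def first_order_dominates(a, b):
        """True if sample a first-order stochastically dominates sample b,
        i.e. F_a(x) <= F_b(x) for all x, with strict inequality somewhere."""
        grid = np.union1d(a, b)
        f_a = np.searchsorted(np.sort(a), grid, side='right') / len(a)   # empirical CDF of a
        f_b = np.searchsorted(np.sort(b), grid, side='right') / len(b)   # empirical CDF of b
        return bool(np.all(f_a <= f_b) and np.any(f_a < f_b))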

This brings us to the question: does an optimal working capital strategy exist? What we do know is that there will be strategies that are stochastically dominant, but proving one to be optimal might be difficult. Given the uncertainty in any firm's future operations, you will probably first have to establish a set of strategies that can be applied depending on the set of events the firm may experience.

    References

Filbeck, G., Krueger, T., & Preece, D. (2007). CFO Magazine's "Working Capital Survey": Do selected firms work for shareholders? Quarterly Journal of Business and Economics, (March). Retrieved from http://www.allbusiness.com/company-activities-management/financial/5846250-1.html

Katz, D. M. (2010). Working it out: The 2010 Working Capital Scorecard. CFO Magazine, June. Retrieved from http://www.cfo.com/article.cfm/14499542

Weston, J. & Copeland, T. (1986). Managerial Finance (8th ed.). Hinsdale: The Dryden Press.

    Footnotes

  • Solving Uncertainty in Simulation Models

    Solving Uncertainty in Simulation Models

    I shall be telling this with a sigh
    Somewhere ages and ages hence:
    Two roads diverged in a wood, and I–
    I took the one less travelled by,
    And that has made all the difference

    …Robert Frost, 1916


Uncertainty in your operations is most likely complex and will need systematic treatment through simulation modeling. S@R carries out thorough analyses of a company's risks and uncertainties with the aim of producing good decision-support tools, making sure the client takes a big step beyond scenario analysis.

    The Four Levels of Uncertainty

    The uncertainty that remains after the best possible analysis has been done is what we call residual uncertainty (Courtney, Kirkland & Viguerie, 1997).

In our world 'the best possible analysis' means that we have a model 'good enough' to describe the business under study. The question then is: do we need to take into account the uncertainties that will always be inherent in its operations and markets? And if we have to, is it possible?

A useful distinction between the different situations that can arise is given by Courtney et al. as four levels of residual uncertainty (see figure below; McKinsey Quarterly, Dec. 2008):

    1. A Clear-Enough Future; managers can develop a single forecast of the future that is precise enough for strategy development.
    2. Alternate Futures; the future can be described as one of a few discrete scenarios. Analysis cannot identify which outcome will occur, although it may help establish probabilities.
    3. A Range of Futures; a range of potential futures can be identified. That range is defined by a limited number of key variables, but the actual outcome may lie anywhere along a continuum bounded by that range.
    4. True Ambiguity; multiple dimensions of uncertainty interact to create an environment that is virtually impossible to predict.

In real life there is, however, a problem with identifying the level we are facing. The definition of 1st-level uncertainty indicates that the residual uncertainty is irrelevant to the strategic decisions under study. But how is it possible to know this before an uncertainty analysis has been performed?

The answer has to be that the best possible analysis performed was a risk/uncertainty analysis taking into account all known uncertainties in the business's environment, and that of all the business's feasible strategies one is always best (first-order stochastic dominance).

The best strategy will then be the one giving a probability distribution for the business's equity value that is located to the right of, and under, the distributions for all other strategies. In this case the exact resulting equity value is of less importance, since it will in any case be larger than under any other strategy. With this established, the actual analysis can be performed as a deterministic calculation.
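Formally, strategy A (first-order) stochastically dominates strategy B when A's cumulative distribution function for equity value lies everywhere at or below B's – 'to the right and under':

    F_A(x) \le F_B(x) \quad \text{for all } x, \qquad F_A(x) < F_B(x) \ \text{for some } x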

For 2nd-level uncertainties with alternate futures, a scenario analysis is often advocated. However, the same applies to each alternative future as to the 1st-level uncertainties (also see scenario analysis). In addition, some assumptions have to be made about the probabilities of each of the alternative futures.

As an example, take a company analyzing investment in production facilities in two alternative countries. In one country there is a sovereign risk of a future new tax regime, and if it is imposed two different scenarios are possible. In the other country there is a fixed tax regime – not expected to change. In this case you will need at least three (at most five) models, all taking into account the inherent risk in the business, giving the probability distribution for equity value for:

    1. current operations,
    2. current operations + Investment in the country with no sovereign risk, and
    3. current operations + Investment in the country with sovereign risk;
      1. no new tax scenario and
      2. with each of the two different new tax scenarios.

The reason for different models, even if the operations in the new facility will be the same regardless of country, lies in the fact that the business strategy might differ between countries and the investment strategy might differ between tax scenarios. The model with sovereign risk will switch between the different tax scenario models according to the probability of their occurrence – generating the distribution for equity value given the sovereign risk.
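The switching can be sketched as a mixture simulation (the probabilities and the three scenario models below are purely illustrative placeholders):

    import numpy as np

    # Placeholder equity-value models for the three tax scenarios (illustrative only)
    equity_no_new_tax     = lambda rng: rng.normal(100, 15)
    equity_tax_scenario_1 = lambda rng: rng.normal(90, 18)
    equity_tax_scenario_2 = lambda rng: rng.normal(80, 20)

    def sovereign_risk_equity(p_new_tax, p_scenario_1, runs=1000, seed=7):
        """Mix the tax-scenario models according to their probabilities,
        giving the equity-value distribution under sovereign risk."""
        rng = np.random.default_rng(seed)
        values = np.empty(runs)
        for i in range(runs):
            if rng.random() < p_new_tax:             # the new tax regime is imposed
                model = (equity_tax_scenario_1 if rng.random() < p_scenario_1
                         else equity_tax_scenario_2)
            else:                                    # the current regime persists
                model = equity_no_new_tax
            values[i] = model(rng)
        return values

    values = sovereign_risk_equity(p_new_tax=0.3, p_scenario_1=0.5)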

To invest, at least one of the equity distributions for 'current operations + investment' should be located to the right of, and under, the distribution for 'current operations' (i.e. be stochastically dominant). Likewise, the best investment alternative will have an equity distribution located to the right of, and under, the distribution for the other alternative.

Having the equity distribution for the dominant strategy opens up measurement of the strategy's inherent risk beyond simple value-at-risk calculations, putting emphasis on the possibility of large losses and further unwanted capital infusions.

    As we now can see, directly applying a standard scenario analysis can quickly lead decision makers astray.

The above classification of the first two levels can in general only be performed after a full risk/uncertainty analysis, and can never be used ex ante to select the appropriate method.

The 3rd-level uncertainties describe the normal situation, where all exogenous variables have a range of possible values. Assuming that we can find (estimate or guesstimate) the probability distribution over that range, we can attack the problem by Monte Carlo simulation and calculate the probability distributions for our endogenous variables.

The 4th-level uncertainties comprise at least two different situations: where there are unknown but knowable probabilities, and where there are unknown and unknowable probabilities:

    Ambiguity is uncertainty about probability, created by missing information that is relevant and could be known (Camerer & Weber, 1992).

    This leads us to a more comprehensive discussion of the situations that will arise in decision making processes:

    More generally, we propose that in most decision problems, “choice” is nothing but the terminal act of a problem-solving activity, preceded by the formulation of the problem itself, the identification of the relevant information, the application of pre-existing competences or the development of new ones to the problem solution and, finally, the identification of alternative courses of action. (Dosi & Egidi, 1991)

    The origin of uncertainty

    Uncertainty may have two origins:

    1. the lack of all the information which would be necessary to make decisions with certain outcomes (substantive uncertainty), and
2. limitations on the computational and knowledge-based capabilities, given the available information (procedural uncertainty).

    The first source of uncertainty comes from information incompleteness, and the second from the inability to recognize, interpret and act on the relevant information, even when it is available – knowledge incompleteness.

To distinguish between the two different situations that give Courtney's 4th-level uncertainty, we will follow Dosi & Egidi:

1. Weak substantive uncertainty (analogous to Knight's "risk") covers all circumstances where uncertainty simply derives from lack of information about the occurrence of a particular event with a known (or at least knowable) probability distribution, and
2. Strong substantive uncertainty (analogous to Knight's and Keynes' "uncertainty") covers all cases involving unknown events, or the impossibility, even in principle, of defining the probability distributions of the events themselves.

    Types of Uncertainty

Uncertainty estimation usually means estimating the uncertainty of the output parameters from the uncertainty of the input parameters, by estimating a probability distribution of the error. This is fairly straightforward as long as the input parameters have values. However, the uncertainty of a model may not come from the parameters alone; there may also be uncertainty in the structure of the model, e.g. which variables and parameters are important in the model.

Adopting the distinction between parametric and structural uncertainty (Kyläheiko et al., 2002) we can further specify model uncertainty:

1. Parametric: uncertainty or imperfect knowledge about the parameters in the decision model, and
2. Structural (epistemic): uncertainty or imperfect knowledge about the structure of the model.

    Combining the above we can describe the types of risk and uncertainty facing both the decision maker and the decision support model as in the following picture:

The purpose is then to resolve weak substantive parametric and structural uncertainty using good methods and models. The model will constitute a mix of facts ((In a simulation the opening balance is usually considered certain, but the balance sheet often contains highly uncertain items. In fact, auditors should give interval estimates for the most critical items in the yearly balance report)) (certain values), risks with known (objective) probability distributions, uncertainties given by subjective probability distributions, and a script of the firm's operations.

    Modeling

However, models will always have some structural uncertainty – even if it were possible to remove it all by introducing more and more variables and relations. Occam's razor can usually be applied with good results: select the model that introduces the fewest assumptions and postulates the fewest entities while still sufficiently answering the question. Borrowing from multidimensional scaling the term 'stress' – the violation done to the actual decision structure by removing parameters or variables from the model – we can visualize this in the following figure:

Reducing the dimensionality of the model will not necessarily reduce or move (distort) the endogenous variables' event space, since correlation exists between the variables omitted and the variables kept in the model, and – depending on the estimation methods – the standard errors of the estimated relationships will increase, maintaining the original model variability.

    Strategy

    Maybe the world and the uncertainties we face haven’t changed all that much as a result of the financial crisis, but our perception of risks has. That means there is a real opportunity to rethink the way we make strategic decisions, the way we plan under uncertainty. (Courtney, McKinsey Quarterly, Dec. 2008)

    The development of strategy requires the courage to accept uncertainty. Strategists must accept that they will not have all of the information and not see the full spectrum of possible events, yet be committed to create and implement strategy. The uncertainty that exists is not only a product of not having complete information and being able to predict future events, it also is a product of the events generated by dynamic and thinking competitors.

    By its nature, uncertainty invariably involves the estimation and acceptance of risk. Risk is equally common to action and inaction. Risk may be related to gain; greater potential gain often requires greater risk. However, we should clearly understand that the acceptance of risk does not equate to the imprudent willingness to gamble the entire likelihood of success on improbable events.

One important step in the direction of better and more informed decision making is the removal of procedural uncertainty by having good models capable of framing the environment and the circumstances under which the decisions are made – giving the best possible analysis.

    It is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail. (Maslow, 1966)


    References

A fresh look at strategy under uncertainty: An interview. (2008, December). McKinsey Quarterly. Retrieved from http://www.mckinseyquarterly.com/fresh_look_at_strategy_under_uncertainty_2256

    Camerer, C. & Weber, M., (1992). Recent Developments in Modelling Preferences: Uncertainty and Ambiguity, Journal of Risk and Uncertainty, Springer, vol. 5(4), 325-70.

    Courtney, H., (2001). 20/20 Foresight. Boston: Harvard Business School Press.

    Courtney, H. G., Kirkland, J., & Viguerie, P. S., (1997). Strategy Under Uncertainty. Harvard Business Review, 75(6), 67-79.

    Dequech, D., (2000), Fundamental Uncertainty and Ambiguity, Eastern Economic Journal, 26(1), 41-60.

    Dosi, G & Egidi, M, (1991). Substantive and Procedural Uncertainty: An Exploration of Economic Behaviours in Changing Environments, Journal of Evolutionary Economics, Springer, 1(2), 145-68.

    Frost, R., (1916). Mountain interval. Henry Holt And Company.

    Keynes, J., (2004). A Treatise on Probability. New York: Dover Publications.

    Knight, F. (1921). Risk, Uncertainty and Profit. Boston: Houghton Mifflin.

Kyläheiko, K., Sandström, J., & Virkkunen, V. (2002). Dynamic capability view in terms of real options. International Journal of Production Economics, 80(1), 65-83.

    Maslow, A., (1966). The Psychology of Science. South Bend: Gateway Editions, Ltd.

    Endnotes

  • Stochastic Balance Simulation

    Stochastic Balance Simulation

    This entry is part 1 of 6 in the series Balance simulation

    Introduction

Most companies have some sort of model describing the company's operations. They are mostly used for budgeting, but in some cases also for forecasting cash flow and other important performance measures. Almost all are deterministic models based on single-value forecasts – the expected or average values of the input data: sales, costs, interest and currency rates etc. We know, however, that forecasts based on average values are on average wrong (Savage, 2002). In addition, deterministic models miss the important dimension of uncertainty – the dimension that reveals both the different risks facing the company and the opportunities they produce.

In contrast, a stochastic model is calculated a large number of times with different values for the input variables, drawn from the possible values of each individual variable. Each run then gives one probable realization of future cash flow, of the company's equity value, etc. With thousands of runs we can plot the relative frequencies of the calculated values:

and thus we have succeeded in generating the probability distribution for the company's equity value. In insurance this type of technique is often called Dynamic Financial Analysis (DFA), which is actually a fitting name.
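Schematically the procedure looks like this (a toy sketch, not the S@R model itself; the company model here is a deliberately crude placeholder):

    import numpy as np

    def toy_company_model(rng):
        """One run: draw uncertain inputs and return an equity value."""
        sales = rng.lognormal(mean=4.6, sigma=0.2)    # uncertain sales
        costs = rng.normal(70.0, 10.0)                # uncertain costs
        return max(sales - costs, 0.0) * 8.0          # crude value multiple

    rng = np.random.default_rng(1)
    values = np.array([toy_company_model(rng) for _ in range(1000)])
    freq, edges = np.histogram(values, bins=30)       # relative frequencies for the plot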

    The Balance Simulation Model

The main tool in the S@R toolbox is the balance model. The starting point is the company's balance sheet, which is treated as the simulation's opening balance. In the case of a greenfield project – new factories, power plants, airports etc. built from scratch – the opening balance is empty.

The successive balances are then built from the profit & loss statement, by simulating the company's operations through an EBITDA model mimicking the real-life operations. Investments can be driven by demand (capacity calculations) or by investment programs giving the necessary or planned production capacity. Throughout the simulation, the model will raise debt (short and/or long term) or equity (domestic or foreign) according to the financial strategy set out by the company and the difference between cash outflow and inflow, adjusted for the minimum cash level.

Since this is a dynamic model, it will raise equity when losses occur and/or the maximum debt/equity ratio has been exceeded. On the other hand, it will repay loans, pay dividends, repurchase shares or purchase excess marketable securities (cash above the operational need) – all in line with the board's shareholder strategy.
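A stylized version of the funding rule (the priority order, names and signature are our assumptions for illustration, not the model's actual code):

    def finance_gap(cash_in, cash_out, min_cash, debt, equity, max_debt_equity):
        """Close the gap between outflows and inflows plus the minimum cash
        level: raise debt while the debt/equity ceiling allows, then equity."""
        gap = cash_out - cash_in + min_cash
        if gap <= 0:
            return debt, equity, -gap                  # surplus: repay loans, pay dividends etc.
        debt_room = max(max_debt_equity * equity - debt, 0.0)
        new_debt = min(gap, debt_room)                 # borrow up to the maximum D/E ratio
        new_equity = gap - new_debt                    # the remainder is raised as equity
        return debt + new_debt, equity + new_equity, 0.0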

The Ledger and Double-Entry Bookkeeping

The activity described in the EBITDA model – investments, purchase of raw materials, production, payment of wages, income from sales, payment of special taxes on investments etc. – is registered as transactions in the ledger, following a standard chart of accounts with double-entry bookkeeping. In a similar fashion, all financial transactions – loan repayments, cash movements, taxes paid and deferred, agio and disagio, etc. – are posted in the ledger. Currently, approximately 400 accounts are in use.

    The Trial Balance and the Financial Statements

The trial balance (post-closing) is compiled and checked for balance between total debits and total credits. The income statement is then prepared using the revenue and expense accounts from the trial balance, and the balance sheet is prepared from the asset and liability accounts by including net income with the other equity accounts – using the International Financial Reporting Standards (IFRS).

The general purpose of producing the trial balance is to ensure that the entries in the ledger are mathematically correct. Bear in mind that every run in a simulation will produce a number of entries in the ledger, and that they might differ not only in size but also in type, depending on the realized states of the company's operations (see above). We therefore need to be sure that the final financial statements – for every run – are correctly produced, since they will be the basis for all further financial analysis of the company.

There are of course other sources of error in bookkeeping – compensating errors, errors of omission, errors of principle etc. – but after many years of use, with millions of runs, we feel confident that the ledger and financial statements are produced correctly. The point is that serious problems need serious models.

However, there are more benefits to be had from simulating the ledger and trial balance:

1. It increases the model's transparency; the trial balance can be printed out and audited. Together with the model's extensive reporting and error/consistency controls, it is no longer a 'black box' to the user.
2. It makes it easy to plug in new EBITDA models for other types of industry, with an automated check for consistency with the main balance simulation model.
3. It is used to ensure correct solving of all implicit equations in the model. The most obvious is the interest and bank balance equation (interest depends on the bank balance and the bank balance depends on the interest; see the sketch after this list), but others, like translation hedging and limits set by the company's financial strategy, create large and complicated systems of simultaneous equations.
4. The trial balance changes from year to year are also used to ensure correct year-to-year balance transitions.
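The interest/bank-balance circularity in point 3 can be illustrated with a simple fixed-point iteration (a toy sketch of the principle only; the model's actual solver handles far larger simultaneous systems):

    def solve_interest(balance_before_interest, rate, tol=1e-10, max_iter=100):
        """Solve the circular pair: interest depends on the closing balance,
        and the closing balance includes the interest."""
        interest = 0.0
        for _ in range(max_iter):
            closing = balance_before_interest + interest   # balance including interest
            new_interest = rate * closing                  # interest on that balance
            if abs(new_interest - interest) < tol:
                return closing, new_interest
            interest = new_interest
        raise RuntimeError('fixed-point iteration did not converge')

    closing, interest = solve_interest(1_000_000.0, 0.04)
    # Closed form for this toy case: closing = balance / (1 - rate)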

    Financial Analysis, Financial Measures and Valuation

Given the framework described above, financial analysis can be performed and the expected values, variability and probability distributions for the different types of ratios – profitability, liquidity, activity, debt and equity etc. – can be calculated and given as graphs. All important measures are calculated at least twice, from different starting points, to ensure consistency and correct solving of implicit equations.

The following table shows the reconciliation of economic profit, initially calculated from (ROIC – WACC) multiplied by invested capital:
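The identity behind the reconciliation is the standard one (since ROIC = NOPLAT / invested capital, the two starting points must agree):

    \text{Economic profit} = (\text{ROIC} - \text{WACC}) \times \text{Invested capital} = \text{NOPLAT} - \text{WACC} \times \text{Invested capital}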

The motivation for all these consistency controls – in all nearly one hundred – lies in previous experience with cash flow/valuation models written in Excel. Their level of detail is more often than not so low that there is no way to establish whether they are right or wrong.

More interesting than the ratios are the yearly distributions for EBITDA, EBIT, NOPLAT, profit (loss) for the period, free cash flow, economic profit, ROIC, WACC, debt, equity and equity value etc., giving a visual picture of the uncertainties and risks the company faces:

Financial analysis is the conversion of financial data into useful information for decision making. Virtually any use of financial statements or other financial data for some purpose is financial analysis, and it is the primary focus of accounting and finance. Financial analysis can be internal (e.g., decision analysis by a company using internal data to understand or improve management and operating results) or external (e.g., comprehensive analysis for the purposes of commercial lending, mergers and acquisitions, or investment activities). The key is how to analyze the available data to make correct decisions.

     

    Input

As input the model needs parameter values and operational data. The parameter values fall into several groups:

1. Parameters describing investors' preferences: market risk premium etc.
2. Parameters describing the company's financial strategy: leverage, long/short-term debt ratio, expected foreign/domestic debt ratio, economic depreciation, maximum dividend pay-out ratio, translation hedging strategy etc.
3. Parameters describing the economic regime under which it operates: taxes, depreciation schemes etc.
4. The opening balance etc.

Since the model has to produce stochastic forecasts of interest and exchange rates, it will need, for every currency involved (including lower and upper 5% probability limits):

1. the yield curve,
2. the expected yearly inflation, and
3. depending on the forecast method(s) chosen for the exchange rates: the different currencies' expected risk premiums, real exchange rates etc.

Since there is a large number of parameters, they are usually read from an Excel template, but the program will, if necessary, ask for missing values or report inconsistent ones.

The company's operations are best described through an EBITDA model, even if prices, costs, production coefficients and their variability can be read from an Excel template. A dedicated EBITDA model will always give the opportunity for a more detailed and, in some cases, more complex description of the operations, including forecast and demand models, 'exotic' taxes, real option strategies etc.

    Output

S@R has set out to create models that can give answers to both deterministic and stochastic questions; tables will answer most deterministic issues, while graphs must be used to answer the risk- and uncertainty-related questions:


1. In all, 27 different reports with more than 70 pages describing the operations and their economics.
2. In addition, the probability distributions for all input and output variables are produced.

    Use

Answers are produced by linking dedicated EBITDA models to a holistic balance simulation, taking into account all important factors describing the company. The basis is a real balance simulation model – not a simple cash flow forecast model.

Both the deterministic and stochastic balance simulation can be set about in two different alternatives:
1. by using an EBITDA model to describe the company's operations, or
2. by using coefficients of fabrication (e.g. kg flour per 1000 bread etc.) as direct input to the balance model.

The first approach implies setting up a dedicated EBITDA subroutine to the balance model, giving detailed answers about markets, investments, operational performance and uncertainty, but it entails a higher degree of effort from both the company and S@R.

The use of coefficients of fabrication and their variations is a low-effort (cost) alternative, using the internal accounting as its basis. This will in many cases give a 'good enough' description of the company – its risks and opportunities. The data needed for the company's economic environment (taxes, interest rates etc.) will be the same in both alternatives.

In some cases we have used both approaches for the same client, using the second approach for smaller daughter companies with production structures differing from the main company's.
The second approach can also be considered an introduction and stepping stone to a more holistic EBITDA model.

What problems do we solve?

• The aim, regardless of approach, is to quantify not only the company's single and aggregated risks but also its potential, making the company capable of detailed planning and of executing earlier and more apt actions against risk factors.
• This will improve budget stability through better insight into cost-side risks and income-side potentials. This is achieved by an active budget-forecast process; the control-adjustment cycle will teach the company to target realistic budgets – with better stability and increased company value as a result.
• Experience shows that the mere act of quantifying uncertainty throughout the company – and, through modeling, describing the interactions and their effects on profit – in itself over time reduces total risk and increases profitability.
• This is most clearly seen when effort is put into correctly evaluating the effects of strategies, projects and investments on the enterprise. The best way to do this is by comparing strategies, analyzing each strategy's risks and potential, and selecting the alternative that is (stochastically) dominant given the company's chosen risk profile.
• Our aim is therefore to transform enterprise risk management from only safeguarding enterprise value to contributing to the increase and maximization of the firm's value, within the firm's feasible set of possibilities.

Strategy@Risk takes advantage of a programming language developed and used for financial risk simulation. We have used this language for over 25 years, and have developed a series of simulation models for industry, banks and financial institutions.

One of the language's strengths is its ability to solve implicit equations in multiple dimensions. For the specific problems we seek to solve this is a necessity, providing the degrees of freedom needed to formulate our approach.

The Strategy@Risk tools have highly advanced properties:

• Models written in a dedicated financial simulation language (with code and data separated; see The risk of spreadsheet errors).
• Solving implicit systems of equations, giving a unique WACC calculated for every period and ensuring that the "free cash flow" value always equals the "economic profit" value.
• Programs and models in "Windows end-user" style.
• Extended tests for consistency in input, calculations and results.
• Transparent reporting of assumptions and results.

    References

Savage, S. L. (2002). The Flaw of Averages. Harvard Business Review, November, 20-21.

Mukherjee, M. (2003). Financial Accounting. New York: Harper Perennial. ISBN 9780070581555.