Examples of Uses for Sensitivity Analysis and What-if Scenarios
What if Analysis
Excel VBA, solver, and other advanced worksheet tools
Giovanni Romeo , in Elements of Numerical Mathematical Economics with Excel, 2020
1.3 What-if analysis: scenario manager, Goal Seek, Data Table, and contour lines
Under the Ribbon Tab Data, we find another important advanced analysis tool offered by Excel, namely the What-if Analysis (see Fig. 1.3-1).
Figure 1.3-1. What-if analysis.
Essentially, this tool answers the question of what will happen to the output function if we change the values of the inputs.
The What-if Analysis allows us to perform three types of activities:
- i. Scenarios
- ii. Goal Seek
- iii. Data Table (of key importance in the standard calculus, involving two independent variables)
The best way to explain the What-if Analysis is through an example. We decide to use a basic Capital Budgeting case of NPV (Net Present Value) and IRR (Internal Rate of Return) as it fits well with the explanation of the above points i, ii, and iii. Chapter 14 will go deeper into the simulation techniques, using the methods of Monte Carlo.
Example 1 (what-if analysis on NPV and IRR calculation)
Let us consider the Cash Flows schedule of Table 1.3-1, where the first cash flow is represented by the initial investment:
Table 1.3-1. Initial Outlay and Cash Flow Projection of the project.
We assume a required rate of return equal to 9%. The required rate of return is the discount rate that investors should require given the riskiness of the project. This discount rate is also called the cost of capital and a weighted average cost of capital (Wacc) is usually calculated, which depends on how the project is financed. This is essentially an opportunity cost.
Unless an investment earns more than the cost of capital, the investment should not be undertaken.
The Net Present Value is the present value of the future after-tax cash flows minus the investment outlay:

$$\mathrm{NPV} = \sum_{t=1}^{n} \frac{CF_t}{(1+r)^{t}} - \mathrm{Outlay}$$

where $CF_t$ is the cash flow in period t and r is the required rate of return.
Using an Excel formula of the form =NPV(B5, C2:K2) + B2 (with the required rate of return in Cell B5, the cash flows of periods 1 to n in Range C2:K2, and the initial outlay in Cell B2), we obtain an NPV equal to £2,292. The project should be undertaken, as the NPV is positive. Here, the cash outlay is not discounted, as it occurs at t = 0: it is added separately, with its negative sign, to the result of Excel's NPV function (whose behavior is slightly different from the pure theoretical NPV we defined before, since the function discounts every value in its range).
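To make the mechanics concrete, here is a minimal Python sketch of the same calculation; the cash flow values are hypothetical placeholders, since Table 1.3-1 is not reproduced here, but the Excel convention of adding the undiscounted outlay to the NPV of periods 1 to n is preserved.

```python
# Minimal sketch of the NPV calculation described above (Python).
# The cash flow values are placeholders: Table 1.3-1 is not reproduced here,
# so only the structure of the calculation is illustrated.

def npv(rate, cash_flows):
    """Present value of cash flows occurring at t = 1, 2, ..., n
    (same convention as Excel's NPV function)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

outlay = -2000.0                      # hypothetical initial outlay at t = 0
future_cfs = [500, 700, 800, 900]     # hypothetical after-tax cash flows, t = 1..4
wacc = 0.09                           # required rate of return from the example

# As in the worksheet, the undiscounted outlay is added separately to the NPV of
# the future cash flows: =NPV(B5, C2:K2) + B2
project_npv = npv(wacc, future_cfs) + outlay
print(f"NPV = {project_npv:,.0f}")    # positive NPV -> undertake the project
```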
i. Scenario Analysis
Suppose we need to simulate the NPV output, changing both the Cash Outlay and the first negative Cash Flow.
The new values are −1,500 for the Cash Outlay and −300 for the first Cash Flow (see Fig. 1.3-3).
Now, we just go to the Scenario Manager under What-if Analysis, following the steps in Figs. 1.3-2 to 1.3-6.
Figure 1.3-2. Adding and Editing a Scenario.
Figure 1.3-3. Changing CF 0 and CF 1 to −1500 and −300.
Figure 1.3-4. Show summary.
Figure 1.3-5. Define the Resulting Cell (NPV): we select Scenario Summary (Scenario in a Pivot Table format is also possible).
Figure 1.3-6. Scenario summary.
It is worth noting that several scenarios can be added at the same time.
Finally, the summary report of Fig. 1.3-6 is created.
ii. Goal Seek (IRR calculation)
The internal rate of return (IRR) is the discount rate that makes the present value of the future after-tax cash flows equal to the investment outlay.
The IRR is, in other words, the actual return of the project (ex post, or expected, based on the cash flow projection), which is found by solving the following equation:

$$\sum_{t=1}^{n} \frac{CF_t}{(1+\mathrm{IRR})^{t}} - \mathrm{Outlay} = 0$$
A potential supplier of capital will not invest in the project unless its actual return (i.e., the IRR) meets or exceeds what the supplier could earn elsewhere (the Wacc) on an investment of comparable risk.
We can use the Excel Goal Seek as shown in Figs. 1.3-7 and 1.3-8 to solve the above equation (still considering the data of Table 1.3-1).
Figure 1.3-7. Goal Seek (IRR calculation).
Figure 1.3-8. Goal Seek (IRR calculation).
Goal Seek is essentially an optimizer, and it can sometimes be used as an alternative to the Solver.
The calculated IRR is equal to 34.51% and corresponds to the value returned by the Excel built-in formula =IRR(B2:K2).
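Numerically, Goal Seek is doing nothing more than searching for the rate that drives the NPV (outlay included) to zero. A small Python sketch of the same root search, again on hypothetical cash flows:

```python
# What Goal Seek does numerically: find the discount rate that drives the NPV
# (including the outlay) to zero. A simple bisection sketch; cash flows are the
# same hypothetical placeholders as above.

def npv_total(rate, cash_flows):
    """NPV including the t = 0 cash flow (cash_flows[0] is the outlay)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-8):
    """Bisection on the NPV profile; assumes exactly one sign change in [lo, hi]."""
    f_lo = npv_total(lo, cash_flows)
    for _ in range(200):
        mid = (lo + hi) / 2
        f_mid = npv_total(mid, cash_flows)
        if abs(f_mid) < tol:
            return mid
        if (f_lo > 0) == (f_mid > 0):
            lo, f_lo = mid, f_mid
        else:
            hi = mid
    return (lo + hi) / 2

cash_flows = [-2000, 500, 700, 800, 900]   # hypothetical: outlay first, then CF1..CF4
print(f"IRR = {irr(cash_flows):.2%}")      # Excel equivalent: =IRR(B2:K2)
```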
iii. Data Table Simulation
The third tool offered by Excel to run a simulation on the output variable is the Data Table. Throughout the book we will use the Data Table tool within the bivariate optimization analysis.
In Fig. 1.3-9, we have set Cell D9 = Cell B4 = NPV. This is the simulated output. Now we need to choose the input variables to be changed, to see how the NPV output will be affected.
Figure 1.3-9. Data Table NPV simulation (NPV as a function of the Wacc).
This can be easily done as in Fig. 1.3-9, selecting the Range C9:D20 and inserting a Data Table from the What-if Analysis.
The Data Table asks for two input cells as a basis for the simulation, namely the Row Input Cell and the Column Input Cell.
To begin, we decide to change only one variable, the Wacc. This will be our Column Input Cell (i.e., Cell B5) because we have arranged the range of possible Wacc under Column C (from Cell C10 to Cell C20).
The output is in Fig. 1.3-10, from which we notice that the IRR must lie in the range where the NPV changes sign.
Figure 1.3-10. Data Table NPV output.
The NPV profile is plotted in Fig. 1.3-11, from which we notice that at a discount rate of 0% we simply have the sum of the undiscounted cash flows, equal to £4,490. As calculated before, at a discount rate of 34.51% the NPV is instead zero, so 34.51% is the IRR of the project; compared with the Wacc of 9% required by investors for projects in a similar risk class, this leads them to undertake the project.
Figure 1.3-11. NPV profile as a function of various discount rates: if NPV = 0 then Discount Rate = IRR.
We can also exploit the possibility offered by the Data Table of changing two variables instead of only one.
For example, we decide to simulate the NPV changing both the Wacc and the initial outlay. In this case the Row Input Cell is Cell B2.
Now we can construct a pure two-input-variable Data Table, as in Fig. 1.3-12, where the gray area indicates that the entire table has been selected before inserting the Data Table from the What-if Analysis.
Figure 1.3-12. Arranging the input variable by row (initial outlay) and by Column (Wacc).
In Fig. 1.3-13, the output returned inside the table is Cell B4 (the NPV), and Table 1.3-2 is the final calculated output table.
Figure 1.3-13. Data Table construction (changing Wacc by Column and initial outlay by row).
Table 1.3-2. Final Data Table NPV output simulation.
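The two-variable Data Table is simply a grid of NPV recalculations. A Python sketch of the same construction, with hypothetical cash flows and illustrative ranges for the Wacc and the initial outlay:

```python
# A Python sketch of the two-variable Data Table: NPV recomputed over a grid of
# Wacc values (column input) and initial outlays (row input). Cash flows and the
# simulated ranges are hypothetical placeholders.
import numpy as np

future_cfs = np.array([500, 700, 800, 900])          # hypothetical CF1..CF4
periods = np.arange(1, len(future_cfs) + 1)

waccs = np.arange(0.00, 0.55, 0.05)                  # column input (discount rates)
outlays = np.array([-2500, -2000, -1500, -1000])     # row input (initial outlay, t = 0)

# Each cell of the table is NPV(wacc) + outlay, exactly what Excel fills in.
table = np.array([[future_cfs @ (1 + w) ** -periods + o for o in outlays]
                  for w in waccs])

print(table.round(0))   # rows: Wacc values; columns: initial outlays
```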
We can now also plot the NPV as a function of the two input variables, as in Fig. 1.3-14, where the NPV (on the vertical axis) is essentially a function of the two inputs, namely NPV = f(Wacc, CF0).
Figure 1.3-14. NPV profile in a three-dimensional space, i.e., NPV = f(Wacc, CF0).
In the three-dimensional chart of Fig. 1.3-14 the different colors represent the Contour Map (or Contour Lines Map).
A Contour Line is the locus of points (or a curve) along which the output function (i.e., the NPV) does not change its value; this will be analyzed in more detail in the bivariate constrained optimization analysis.
The Excel Contour Diagram (see Fig. 1.3-15 for the NPV example) can be inserted by selecting Contour from the 3D Surface Excel charts, as shown in Fig. 1.3-16, or alternatively by selecting a Wireframe Contour (Fig. 1.3-17).
Figure 1.3-15. Contour Diagram.
Figure 1.3-16. How to insert a Contour Diagram.
Figure 1.3-17. How to insert a Wireframe Contour.
Notice how in Fig. 1.3-15 (reading the diagram from the right), the first line separating the darkest area from the next, lighter one gives the pairs of Wacc and initial outlay for which the NPV is zero (i.e., it identifies the IRR of the project for each level of initial outlay). The obvious result is that the less negative the initial outlay (read on the vertical axis), the higher the IRR (read on the horizontal axis), moving upward from the left-hand side to the right-hand side along the first contour line.
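For readers working outside Excel, the same contour view can be reproduced with a short Python/matplotlib sketch (hypothetical cash flows again); the NPV = 0 level curve is exactly the IRR locus discussed above:

```python
# Sketch of the contour view of Fig. 1.3-15: level curves of NPV = f(Wacc, outlay).
# The NPV = 0 contour traces the IRR for each level of initial outlay.
# Inputs are the same hypothetical placeholders used above.
import numpy as np
import matplotlib.pyplot as plt

future_cfs = np.array([500, 700, 800, 900])
periods = np.arange(1, len(future_cfs) + 1)

W, O = np.meshgrid(np.linspace(0.0, 0.6, 61), np.linspace(-2500, -1000, 61))
NPV = np.vectorize(lambda w, o: future_cfs @ (1 + w) ** -periods + o)(W, O)

cs = plt.contourf(W, O, NPV, levels=10)
plt.contour(W, O, NPV, levels=[0], colors="black")   # NPV = 0 line (the IRR locus)
plt.colorbar(cs, label="NPV")
plt.xlabel("Wacc")
plt.ylabel("Initial outlay (CF0)")
plt.show()
```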
URL: https://www.sciencedirect.com/science/article/pii/B9780128176481000013
Querying the Data Warehouse
Lilian Hobbs , ... Pete Smith , in Oracle 10g Data Warehousing, 2005
Hypothetical RANK and Distribution Functions
Business intelligence often involves what-if analysis, where you make a hypothetical change to the business and analyze its impact. For instance, we are introducing a new product in the HDRW category and have a projected sales figure of $7,600 based on market surveys. Based on this information, we would like to know how this product would rank among other products in its category. Oracle provides a family of hypothetical rank and distribution functions for this purpose. With these functions, you can ask to compute the RANK, PERCENT_RANK, or CUME_DIST of a given value, as if it were hypothetically inserted into a set of values.
To illustrate this, we first query the sales for the different products in the HDRW category, in ascending order of sales, together with their respective ranks.
Now, suppose we want to find the hypothetical rank of a product with sales of $7,600. From that output, we can see that this value, if inserted into the data, would get a rank of 8; a hypothetical rank query returns the same result directly.
Hypothetical rank functions take an ordering condition and a constant data value to be inserted into the ordered set. The way to recognize a hypothetical rank function in a query is the WITHIN GROUP clause and a constant expression within the RANK function. Similarly, you can use CUME_DIST or PERCENT_RANK to find the distribution or percentile of a quantity inserted hypothetically into a result.
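Conceptually, a hypothetical rank is just the rank the constant would receive if it were inserted into the ordered set. A small Python sketch of that logic, with illustrative sales figures rather than the actual HDRW data:

```python
# Outside the database, the hypothetical RANK can be reproduced by inserting the
# candidate value into the ordered set and ranking it. Sales figures below are
# illustrative, not the book's actual HDRW data.

def hypothetical_rank(values, candidate):
    """Rank of `candidate` if it were inserted into `values`, ascending order
    (rank 1 = smallest), mirroring RANK(candidate) WITHIN GROUP (ORDER BY col)."""
    return sum(1 for v in values if v < candidate) + 1

hdrw_sales = [1200, 2500, 3100, 4400, 5050, 6200, 7000, 8100, 9400, 10200]
print(hypothetical_rank(hdrw_sales, 7600))   # -> 8, as in the example above
```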
URL: https://www.sciencedirect.com/science/article/pii/B9781555583224500084
Advanced Analytics
Rick Sherman , in Business Intelligence Guidebook, 2015
Abstract
Advanced analytics focuses on gauging the future and allowing what-if analysis. Predictive analytics and data mining are the processes by which one performs advanced analytics for forecasting and modeling. Data visualization is an analytics technique that presents information pictorially, often graphically, helping to communicate a huge volume of information in a more intuitive way. Enterprises need two distinct types of self-service data environments based on different business analytical requirements: analytical sandboxes, to enable business users to be able to add data and derive metrics, and analytical hubs, which accommodate more extensive and intensive data gathering and analysis. Analytics for Big Data require special attention to the scope, program, architecture, and team.
URL: https://www.sciencedirect.com/science/article/pii/B9780124114616000150
A Monte Carlo approach applied to sensitivity analysis of criteria impacts on solar PV site selection
Hassan Z. Al Garni , Anjali Awasthi , in Handbook of Probabilistic Models, 2020
3.2 Monte Carlo simulation approach
MCS, which can be considered a methodical approach to what-if analysis, was used to measure the reliability of the MCDM analysis results and to draw insightful conclusions about the relationship between the variation in decision criteria values and the decision results. MCS, a very useful statistical tool for analyzing uncertain scenarios and providing deep analysis of multiple different scenarios, was first used by John von Neumann and Ulam in the 1940s. These days MCS denotes any simulation that involves repeated random generation of samples and analysis of the behavior of statistical values over the population samples (Martinez and Martinez, 2002). Information obtained from the random samples is used to estimate the distributions and obtain statistical properties for different situations. In this research, the MCS approach is applied as depicted in Fig. 20.2, starting from Step 4, "Create a sample of the selected distribution (m)." An explicit assumption was made about the distribution of the population of the criteria values by fitting each criterion to the available distributions using Matlab software, whereas no assumptions were made about the distribution of the sample mean (Y) and sample variance. Nevertheless, using a large number of sample simulations, the sample distributions of the mean and the variance are close to a normal distribution, as depicted in Fig. 20.3.
Figure 20.3. The mean of the suitability index after 1000 simulations.
Figs. 20.4 and 20.5 illustrate the histograms of sample means for the criteria Xs and Y for all simulations, whereas the correlations between the simulated criteria and the suitability index are reported in Table 20.4. The correlation between Y and each criterion X quantifies the degree to which the two quantities, the criterion X and the sample mean Y, are linked to each other. The correlation coefficient, which ranges between −1 and 1, indicates that the strongest linear relationship between the suitability index and the decision criteria occurs for proximity to cities (X5), followed by the aspect of the site, whereas the weakest linear relationship is for proximity to roads (X6), as shown in Table 20.4.
Figure 20.4. The histograms of sample means for criteria Xs and Y for all simulations.
Figure 20.5. Comparison between the empirical data of criteria and their corresponding simulated values based on the selected distributions.
Table 20.4. Correlation coefficient of the associated decision criteria.
| Decision criteria | Correlation coefficient |
|---|---|
| Solar (X1) | 0.0523 |
| Temp. (X2) | 0.0387 |
| Slope (X3) | 0.0553 |
| Aspect (X4) | 0.0642 |
| Urban (X5) | 0.0662 |
| Roads (X6) | −0.009 |
| Power (X7) | 0.0164 |
Furthermore, because the potential sites have different amounts of solar irradiation and different slopes, both of which have a significant impact on the final suitability index of the site, these criteria show moderate positive correlations.
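The overall loop (sample the criteria from their distributions, aggregate them into a suitability index, and correlate each criterion with that index) can be summarized in a short Python sketch; the distributions and weights below are illustrative assumptions, not the fitted values used in the study.

```python
# A minimal sketch of the MCS loop described above: sample the decision criteria
# from assumed distributions, combine them into a suitability index Y, and measure
# the correlation between each criterion and Y. Distributions and weights are
# illustrative assumptions, not the fitted values from the study.
import numpy as np

rng = np.random.default_rng(0)
n_sim = 1000

# Seven criteria X1..X7 (solar, temperature, slope, aspect, cities, roads, power),
# each sampled from a placeholder normal distribution on a 0-1 suitability scale.
X = rng.normal(loc=0.6, scale=0.15, size=(n_sim, 7)).clip(0, 1)
weights = np.full(7, 1 / 7)                 # assumed equal criterion weights

Y = X @ weights                             # suitability index per simulation

# Correlation between each criterion and the suitability index (cf. Table 20.4).
corr = [np.corrcoef(X[:, i], Y)[0, 1] for i in range(7)]
for i, c in enumerate(corr, start=1):
    print(f"X{i}: {c:+.3f}")
```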
URL: https://www.sciencedirect.com/science/article/pii/B9780128165140000205
The Analysis Process
Chris Sanders , Jason Smith , in Applied Network Security Monitoring, 2014
Key Assumptions Check
Almost all sciences tend to make assumptions based on generally accepted facts. This method of questioning is designed to challenge key assumptions and examine how they affect the investigation of a scenario. It most often pairs with the What If Analysis method. As an example, in the spread of malware, it has generally been assumed that malware operating within a virtual machine does not have the ability to escape to the host or to other virtual machines residing on it. Given an incident in which a virtual machine has been infected with malware, a peer might pose the question of what action would be taken if this malware did indeed escape the virtual environment and infect other virtual machines on the host, or the host itself.
URL: https://www.sciencedirect.com/science/article/pii/B9780124172081000155
OLAP
Lilian Hobbs , ... Pete Smith , in Oracle 10g Data Warehousing, 2005
15.1.1 OLAP Applications
Online Analytical Processing involves analysis along multiple dimensions. The most basic OLAP operations are aggregation and analysis, such as ranking (e.g., top-10 products), time-series calculations (e.g., moving average), and interrow calculations (such as period-over-period comparisons).
As we discussed in Chapter 6, these calculations can be done using SQL analytical functions. You can also use powerful end-user tools such as Discoverer to perform this analysis graphically. When done using SQL, these types of operations may require multiple passes over the data; hence, with the Oracle OLAP option, it may be possible to perform them faster, because the data storage format is optimized for analysis.
Other business applications, such as financial modeling, sales forecasting, what-if analysis, and budget allocation, require more specialized storage and analysis models and cannot be done efficiently using SQL. Let us review what each of these applications involves.
Forecasting: Forecasting, as the name suggests, involves predicting a quantity based on available historical figures—for instance, forecasting sales for the next quarter based on results of the past year. These applications use advanced statistical algorithms, such as linear and nonlinear regressions, single and double exponential smoothing, and the Holt-Winters method.
Allocation: Allocation, also known as reverse aggregation, is used to divide a quantity such as a budget or a quota into several parts. Allocation is an important part of business planning applications. For example, at the beginning of each quarter, each department head may be given a budget for purchasing new equipment, which must then be further apportioned among the managers within that department and so on.
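As a toy illustration of allocation, the sketch below apportions a department budget among managers in proportion to assumed share weights; the figures and keys are hypothetical.

```python
# A small sketch of allocation (reverse aggregation): a department budget split
# among managers in proportion to assumed shares. Figures are illustrative only.
def allocate(total, shares):
    """Apportion `total` across keys in proportion to their share weights."""
    weight_sum = sum(shares.values())
    return {k: total * w / weight_sum for k, w in shares.items()}

quarterly_budget = 120_000
manager_shares = {"manager_a": 3, "manager_b": 2, "manager_c": 1}
print(allocate(quarterly_budget, manager_shares))
# {'manager_a': 60000.0, 'manager_b': 40000.0, 'manager_c': 20000.0}
```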
Financial Calculations: These are calculations that can be conveniently done in a spreadsheet environment—for example, interest calculations and payment schedules.
Modeling: Modeling involves describing a quantity using a set of equations. The model can then be used to compute other quantities by plugging data into these equations. The equations may have an implied dependency order among them and can compute new values of dimensions and facts. For example, you may have a model to calculate the peak sales for different countries or regions based on different holiday months. With Oracle Database 10g, you can now also do some modeling using the SQL Model Clause.
What-if Analysis: What-if analysis, or scenario management, is a very important aspect of advanced analytical applications. It involves analyzing data under hypothetical scenarios to determine its impact on the business. For instance, how much will it cost the company if we were to close down some of our retail stores and start an online outlet store? What will be the impact on revenues if we made a change to our sales organization? What-if analysis requires a transactional model different from that provided by relational databases and SQL. Users must be able to change the structure and content of the data in a localized fashion within the session, without making it visible to the entire database. Further, the changes may be temporary and the user should be able to restore the data back to the way it was.
Regardless of the type of analysis being performed, OLAP typically involves analyzing data across multiple dimensions. The question then arises—what is the best way to store data to facilitate such multidimensional analysis?
URL: https://www.sciencedirect.com/science/article/pii/B9781555583224500175
A Decision-Support System Approach to Economics-Driven Modularity Evaluation
Yuanfang Cai , ... Hong-Mei Chen , in Economics-Driven Software Architecture, 2014
6.6 Conclusions
In this chapter, we have presented an options-based decision-support system to help manage modularity debt and to help resolve the refactoring decision dilemma faced by software practitioners today. Using the Datar–Mathews real-options valuation technique as our core economic model, our proposed MDM-DSS allows for what-if analysis, integrating history-based analysis, text mining, and input from the project manager to identify "hot spots" and to calculate the value of refactoring activities interactively.
We have advanced existing research in two specific areas to enable the realization of the MDM-DSS: (1) the introduction of three new proxy measures of effort and (2) the automated identification of refactoring candidates. These research advances allow the MDM-DSS to: (1) predict maintenance effort variation based on file metrics variations and (2) identify the parts of the system that need to be refactored. We also presented two pilot industrial case studies where refactoring decisions are supported based on the key modules of our MDM-DSS. The case studies served to improve our MDM-DSS prototype and to give them a solid empirical basis for finding potential modularity "hot spots" in an existing code base. These hot spots are the likely sites of future complexity, high risk, high cost, high number of bugs, and difficulty in finding and fixing these bugs.
The MDM-DSS vision is ambitious. It is inherently multidisciplinary. We have articulated the difficulty of managing modularity debt and the uncertainty it involves. We have presented two key economics-driven modularity evaluation research results underpinning the key modules of the MDM-DSS. The MDM-DSS is, of course, open to other kinds of metrics and analyses, and we expect to be continually incorporating new research results to the MDM-DSS through our own research and the research of others.
URL: https://www.sciencedirect.com/science/article/pii/B9780124104648000064
28th European Symposium on Computer Aided Process Engineering
Saad A. Al-Sobhi , Ahmed AlNouss , in Computer Aided Chemical Engineering, 2018
2 Overall Methodology
The sequential strategy consists mainly of five major steps.
Step One: Process Simulation: steady-state simulation of six NGL process schemes/configurations using ASPEN HYSYS V9 under different operating conditions to estimate material and energy requirements. ASPEN HYSYS, a powerful simulation package, is used to estimate the material and energy balances and helps to address what-if analysis and carry out sensitivity analysis.
Step Two: Economic Evaluation: estimate the capital and operational costs using the ASPEN Economic Evaluation tool and perform a profitability analysis by calculating the total annualized cost and ROI, as shown in Eq. (1).
$$\mathrm{ROI} = \frac{AEP}{TCI} \qquad (1)$$

where AEP is the annual net economic profit and TCI is the total capital investment.
Step Three: Environmental Impact Assessment: estimate the annual carbon emissions in kg/h and the reduction targets in all recovery configurations using ASPEN software embedded CO2 estimation algorithm.
Step Four: Process Integration and Improvements: perform heat integration techniques to estimate potential energy savings using ASPEN Energy Analyzer. It is an energy management software for performing optimal heat exchanger network design to minimize process energy. Also, the energy intensity before and after heat integration can be estimated.
Step Five: Sustainability Incorporation and Analysis: apply the sustainability metric, with weights expressing the relative importance of each indicator as a ratio to the economic profit, as shown in Eq. (2).
$$\mathrm{Sustainability\ weighted\ ROI}_p = \frac{AEP_p \left[ 1 + \sum_{i} w_i \, \frac{\mathrm{Indicator}_{p,i}}{\mathrm{Indicator}_i^{\mathrm{Target}}} \right]}{TCI_p} \qquad (2)$$
where i is an index for the different sustainability indicators (other than the annual net economic profit), with i = 1, 2, …, N indicators. The weighting factor w_i is a ratio representing the relative importance of the ith sustainability indicator compared with the annual net economic profit. The term Indicator_{p,i} represents the value of the ith sustainability indicator associated with the pth project, and the term Indicator_i^Target corresponds to the target value of the ith sustainability indicator (obtained from process integration benchmarking, taken as the largest value from all projects, or set by the company as a goal).
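A minimal Python sketch of Steps Two and Five, assuming the reconstructed forms of Eqs. (1) and (2) above; all monetary figures, indicator values, targets, and weights are illustrative assumptions rather than values from the study.

```python
# A minimal sketch of Steps Two and Five under the reconstructed equations above:
# ROI = AEP / TCI, extended with weighted sustainability indicators normalized by
# their targets. All numbers, weights and targets are illustrative assumptions.

def roi(aep, tci):
    """Eq. (1): return on investment = annual net economic profit / total capital investment."""
    return aep / tci

def sustainability_weighted_roi(aep, tci, indicators, targets, weights):
    """Sustainability-weighted metric in the spirit of Eq. (2): the economic return is
    scaled by the weighted ratios of each indicator to its target value."""
    bonus = sum(w * ind / tgt for w, ind, tgt in zip(weights, indicators, targets))
    return aep * (1 + bonus) / tci

aep, tci = 12.5e6, 80.0e6                    # hypothetical $ figures
indicators = [0.8, 1.1]                      # e.g. CO2 reduction, energy-intensity reduction
targets = [1.0, 1.0]
weights = [0.2, 0.1]                         # assumed relative importance vs. profit

print(f"ROI          = {roi(aep, tci):.2%}")
print(f"Weighted ROI = {sustainability_weighted_roi(aep, tci, indicators, targets, weights):.2%}")
```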
URL: https://www.sciencedirect.com/science/article/pii/B9780444642356500279
Marketing Information Systems
Robert R. Harmon , in Encyclopedia of Information Systems, 2003
II.A. Marketing Decision Support System Functions
The marketing decision support system (MDSS) provides analytical models for forecasting, simulation, and optimization. MDSS tools include simple spreadsheets such as Excel, statistical analysis packages such as SPSS and SAS, on-line analytical processing (OLAP) tools, data mining applications, and neural networks. The MDSS provides the user with the ability to explore multiple options. Typical MDSS functions include models and tools for:
- 1. Sensitivity analysis. Decision makers can explore changes in a strategic variable, such as price, and model its impact on demand or competitive behavior.
- 2. What-if analysis. This analysis can be easily accomplished with a spreadsheet. Revenues and costs can be manipulated to show the impact of each variable on profits and cash flows.
- 3. Goal setting. Analysis focuses on the desired result and builds the resource base necessary to accomplish the goal.
- 4. Exception reporting. Analysis looks for results that exceed or fall short of stated goals or benchmarks (sometimes called gap analysis). Which products or segments exceeded sales forecasts?
- 5. Pareto analysis. Analysis looks for activities that generate disproportionate results. For instance, the top 20% of customers may account for 80% of sales revenues.
- 6. Forecasting models. Econometric models are used to analyze time series data for the purpose of predicting future sales and market share levels.
- 7. Simulation models. Monte Carlo simulations address marketing decision making under conditions of uncertainty. Variables such as the market price, unit variable cost, and quantity sold are not known ahead of the product investment decision. Simulation models allow the marketer to analyze risk and assess the probabilities of likely outcomes of their decisions (a minimal sketch follows this list).
- 8. Scorecards and dashboards. Scorecard systems present a consistent framework for tracking the effectiveness of marketing activities. They often have different modules for senior executives, marketing managers, product managers, and customer service managers. Scorecard systems allow the user to "drill down" on an analytic and workflow basis to determine the status of any strategic initiative. Dashboards allow frontline managers to monitor their critical performance indicators. These systems are often used in conjunction with "best practice" standards for call-center-based customer support.
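As referenced in item 7, here is a minimal Monte Carlo sketch of a marketing profit simulation; the distributions and parameters are illustrative assumptions only.

```python
# A minimal Monte Carlo sketch of the simulation models in item 7: price, unit cost
# and quantity sold are uncertain, and the resulting profit distribution is examined.
# All distributions and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

price = rng.normal(25.0, 2.0, n)          # market price per unit
unit_cost = rng.normal(14.0, 1.5, n)      # unit variable cost
quantity = rng.normal(50_000, 8_000, n)   # units sold
fixed_cost = 300_000.0

profit = (price - unit_cost) * quantity - fixed_cost

print(f"Expected profit: {profit.mean():,.0f}")
print(f"P(loss)        : {(profit < 0).mean():.1%}")
print(f"5th-95th pct   : {np.percentile(profit, 5):,.0f} to {np.percentile(profit, 95):,.0f}")
```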
URL: https://www.sciencedirect.com/science/article/pii/B0122272404001106
Power Estimation of Embedded Systems: A Hardware/Software Codesign Approach
William Fornaciari , ... Member, IEEE, in Readings in Hardware/Software Co-Design, 2002
B The TOSCA Codesign Flow
The design flow of the TOSCA codesign environment, into which the present work is going to be integrated, is shown in Fig. 1. Its main goal is to reduce the impact of the system-integration and design-constraint-verification bottlenecks on the global design time, thus allowing a cost-effective evaluation of alternative designs.
Fig. 1. The TOSCA codesign roadmap.
The design capture is performed via a mixed textual/graphical editor based on an OCCAM II customization [6], improving user friendliness and gathering in the same design database timing constraints, design requirements, design goals, and possibly an initial HW versus SW allocation of the modules composing the system. If the latter information is left unspecified by the user, an initial allocation is decided based on the results of a heuristic, by statically inspecting the properties of the system description. The main part of the codesign flow is the design space exploration, i.e., a "what-if" analysis of alternative architectural solutions to discover an acceptable final system modularization and HW versus SW allocation fulfilling the initial requirements and goals. This is obtained by evaluating system properties through a set of metrics and by applying system-level transformations, producing new modularizations of the system specification that are semantically equivalent to the original one. When an acceptable partitioning is found, synthesis of the HW and SW parts can be performed. The SW synthesis passes through an intermediate uncommitted format, called the virtual instruction set (VIS) [8], allowing the designer to consider the timing performance when different CPU cores are employed and making possible a flexible simulation of the cooperating HW and SW based on the same VHDL simulator engine. HW-bound modules and interfaces are automatically converted into suitable VHDL templates. Finally, simulation of the HW/SW system is performed, considering the side effects due to the HW/SW bused communication and the different performance of the HW and SW technologies.
The task of system-level partitioning should provide alternative solutions in terms of the cost/performance ratio. To carry out the partitioning process with respect to the design constraints, it is necessary to define a cost function based on some metrics. Thus, a preliminary and iterated phase is a metric-based analysis of the system-level description. Design metrics, considering the contribution of both the HW and SW parts, can be conceived to evaluate the quality of a partitioning solution in terms of the fulfillment of several design optimization criteria [6], such as performance, cost, resource exploitation, communication, and power consumption. The current version of TOSCA evaluates a set of static and dynamic metrics, based on the analysis of the object-oriented representation of the specification, high-level simulation, and profiling. Metrics to evaluate area and performance are described in [6], while metrics for power analysis are the subject of this paper.
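As a rough illustration of such a metric-based cost function (not TOSCA's actual formulation), the sketch below scores candidate HW/SW partitionings as a weighted sum of normalized metrics:

```python
# A sketch of a metric-based cost function of the kind described above: each candidate
# HW/SW partitioning is scored as a weighted sum of normalized metrics (performance,
# area/cost, communication, power). Metric names, weights and values are assumptions,
# not TOSCA's actual formulation.
from typing import Dict

def partition_cost(metrics: Dict[str, float], weights: Dict[str, float]) -> float:
    """Weighted sum of normalized metrics; lower is better."""
    return sum(weights[name] * value for name, value in metrics.items())

candidate_a = {"latency": 0.40, "hw_area": 0.70, "communication": 0.30, "power": 0.55}
candidate_b = {"latency": 0.65, "hw_area": 0.35, "communication": 0.45, "power": 0.40}
weights     = {"latency": 0.4, "hw_area": 0.2, "communication": 0.2, "power": 0.2}

for name, cand in [("A", candidate_a), ("B", candidate_b)]:
    print(f"candidate {name}: cost = {partition_cost(cand, weights):.3f}")
```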
URL: https://www.sciencedirect.com/science/article/pii/B9781558607026500211
Source: https://www.sciencedirect.com/topics/computer-science/what-if-analysis