Analytica (software)

Analytica
Developer(s) Lumina Decision Systems
Initial release January 16, 1992
Written in C
Operating system Windows
Platform x86, x64
Available in English
Type Decision-making software, statistics, information visualization, user interface creation, numerical analysis
License Proprietary
Website www.lumina.com

Analytica is a visual software package developed by Lumina Decision Systems for creating, analyzing and communicating quantitative decision models.[1] As a modeling environment, it combines hierarchical influence diagrams for the visual creation and inspection of models, intelligent arrays for working with multidimensional data, Monte Carlo simulation for analyzing risk and uncertainty, and optimization, including linear and nonlinear programming. Its design, especially its influence diagrams and treatment of uncertainty, is based on ideas from the field of decision analysis. As a computer language, it combines a declarative (non-procedural) structure for referential transparency, array abstraction, and automatic dependency maintenance for efficient sequencing of computation.

Hierarchical influence diagrams

Analytica models are organized as influence diagrams. Variables (and other objects) appear as nodes of various shapes on a diagram, connected by arrows that provide a visual representation of dependencies. Analytica influence diagrams may be hierarchical, in which a single module node on a diagram represents an entire submodel.

Hierarchical influence diagrams in Analytica serve as a key organizational tool. Because the visual layout of an influence diagram plays to people's natural abilities to grasp spatial arrangement and levels of abstraction, viewers can take in far more information about a model's structure and organization at a glance than is possible with less visual paradigms, such as spreadsheets and mathematical expressions. Managing the structure and organization of a large model can be a significant part of the modeling process, and it is substantially aided by the visualization that influence diagrams provide.

Influence diagrams also serve as a tool for communication. Once a quantitative model has been created and its final results computed, an understanding of how the results are obtained, and of how various assumptions affect them, is often far more important than the specific numbers computed. The ability of a target audience to understand these aspects is critical to the modeling enterprise. The visual representation of an influence diagram quickly communicates an understanding at a level of abstraction that is normally more appropriate than detailed representations such as mathematical expressions or cell formulae. When more detail is desired, users can drill down progressively, guided by the visual depiction of the model's structure.

The existence of an easily understandable and transparent model supports communication and debate within an organization, and this effect is one of the primary benefits of investing in quantitative model building. When all interested parties are able to understand a common model structure, debates and discussions will often focus more directly on specific assumptions, can cut down on "cross-talk", and therefore lead to more productive interactions within the organization. The influence diagram serves as a graphical representation that can help to make models accessible to people at different levels.

Intelligent multidimensional arrays

Analytica uses index objects to track the dimensions of multidimensional arrays. An index object has a name and a list of elements. When two multidimensional values are combined, for example in an expression such as

Profit = Revenue − Expenses

where Revenue and Expenses are each multidimensional, Analytica repeats the profit calculation over each dimension, but recognizes when the same dimension occurs in both values and treats it as a single dimension during the calculation, in a process called intelligent array abstraction. Unlike most programming languages, there is no inherent ordering to the dimensions in a multidimensional array. This avoids duplicated formulas and explicit FOR loops, both common sources of modeling errors. The simplified expressions made possible by intelligent array abstraction make the model more accessible, interpretable, and transparent.
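
Analytica's array engine is proprietary, but the behaviour can be sketched with a rough analogue in Python using the xarray library, whose DataArray objects also carry named dimensions. The variable names and figures below are invented for illustration and are not taken from an Analytica model:

    # Rough Python/xarray analogue of intelligent array abstraction;
    # all names and numbers are hypothetical.
    import xarray as xr

    years = [2021, 2022, 2023]
    regions = ["North", "South"]

    # Revenue is dimensioned by Year and Region; Expenses only by Year.
    revenue = xr.DataArray(
        [[10.0, 8.0], [12.0, 9.0], [14.0, 11.0]],
        dims=["Year", "Region"],
        coords={"Year": years, "Region": regions},
    )
    expenses = xr.DataArray([6.0, 7.0, 8.0], dims=["Year"], coords={"Year": years})

    # Subtraction matches the shared Year dimension by name and repeats the
    # calculation over Region automatically, with no loops or index bookkeeping.
    profit = revenue - expenses
    print(profit)   # a 3 x 2 result dimensioned by Year and Region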

Another consequence of intelligent array abstraction is that new dimensions can be introduced or removed from an existing model, without requiring changes to the model structure or changes to variable definitions. For example, while creating a model, the model builder might assume a particular variable, for example discount_rate, contains a single number. Later, after constructing a model, a user might replace the single number with a table of numbers, perhaps discount_rate broken down by Country and by Economic_scenario. These new divisions may reflect the fact that the effective discount rate is not the same for international divisions of a company, and that different rates are applicable to different hypothetical scenarios. Analytica automatically propagates these new dimensions to any results that depend upon discount_rate, so for example, the result for Net present value will become multidimensional and contain these new dimensions. In essence, Analytica repeats the same calculation using the discount rate for each possible combination of Country and Economic_scenario.
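
Continuing the same xarray analogy (again with hypothetical names and figures), replacing a scalar discount rate with an array indexed by Country and Economic_scenario makes a downstream net-present-value result pick up those dimensions without any change to the formula itself:

    # Hypothetical sketch: the NPV formula is unchanged whether discount_rate
    # is a single number or a multidimensional array.
    import xarray as xr

    periods = [0, 1, 2, 3]
    time = xr.DataArray(periods, dims=["Time"], coords={"Time": periods})
    cash_flow = xr.DataArray([-100.0, 40.0, 50.0, 60.0],
                             dims=["Time"], coords={"Time": periods})

    def npv(discount_rate):
        # Discount each period's cash flow and sum over the Time dimension.
        return (cash_flow / (1 + discount_rate) ** time).sum(dim="Time")

    print(npv(0.05))   # a single rate gives a single NPV

    # Replace the single number with a table per Country and Economic_scenario;
    # the result automatically gains those two dimensions.
    discount_rate = xr.DataArray(
        [[0.04, 0.07], [0.06, 0.09]],
        dims=["Country", "Economic_scenario"],
        coords={"Country": ["US", "DE"], "Economic_scenario": ["base", "stress"]},
    )
    print(npv(discount_rate))   # NPV dimensioned by Country x Economic_scenario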

This flexibility is important when exploring computation tradeoffs between the level of detail, computation time, available data, and overall size or dimensionality of parametric spaces. Such adjustments are common after models have been fully constructed as a way of exploring what-if scenarios and overall relationships between variables.

Uncertainty analysis

Incorporating uncertainty into model outputs helps to provide more realistic and informative projections. Uncertain quantities in Analytica can be specified using a probability distribution. When evaluated, distributions are sampled using either Latin hypercube or Monte Carlo sampling, and the samples are propagated through the computations to the results. The resulting sample distribution and summary statistics can then be viewed directly, including the mean, fractile bands, probability density function (PDF), and cumulative distribution function (CDF). Analytica supports collaborative Decision Analysis and Probability Management through the use of the DIST standard.[2][3]
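
The mechanics of sampling and propagation can be illustrated in Python with NumPy and SciPy; the model, distributions, and sample size below are hypothetical and are only a sketch of the general idea, not of Analytica's built-in engine:

    # Minimal sketch of propagating uncertainty through a hypothetical model
    # cost = fixed + quantity * unit_price, using Latin hypercube sampling.
    import numpy as np
    from scipy.stats import norm, qmc

    n = 1000
    sampler = qmc.LatinHypercube(d=2, seed=0)   # two uncertain inputs
    u = sampler.random(n)                       # n points in the unit square

    # Map the uniform samples through inverse CDFs of the chosen distributions.
    quantity = norm.ppf(u[:, 0], loc=100, scale=15)
    unit_price = norm.ppf(u[:, 1], loc=8, scale=1)

    cost = 500 + quantity * unit_price          # propagate the samples

    print("mean:", cost.mean())
    print("5th-95th fractile band:", np.percentile(cost, [5, 95]))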

System dynamics modeling

System dynamics is an approach to simulating the behaviour of complex systems over time. It deals with how feedback loops and time delays affect the behaviour of an entire system. The Dynamic() function in Analytica allows the definition of variables with cyclic dependencies, such as feedback loops. It extends the influence diagram notation, which does not normally allow cycles. At least one link in each cycle includes a time lag, depicted as a gray influence arrow to distinguish it from standard black arrows without time lags.
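
A conceptual stand-in for such a time-lagged feedback loop is a discrete-time simulation in which each period's value depends on the previous period's. The plain-Python sketch below uses invented parameters and is not Analytica's Dynamic() syntax:

    # Discrete-time feedback loop: this period's growth depends on last
    # period's population. Parameter values are invented for illustration.
    steps = 10
    growth_rate = 0.05

    population = [1000.0]                      # value at time 0
    for t in range(1, steps):
        # the time lag: period t depends on the value at period t - 1
        population.append(population[t - 1] * (1 + growth_rate))

    print(population)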

As a programming language

Analytica includes a general language of operators and functions for expressing mathematical relationships among variables. Users can define functions and libraries to extend the language.

Analytica has several features as a programming language designed to make it easy to use for quantitative modeling:

It is a visual programming language, in which users view programs (or "models") as influence diagrams, which they create and edit visually by adding and linking nodes.

It is a declarative language, meaning that a model declares a definition for each variable without specifying an execution sequence, as conventional imperative languages require. Analytica determines a correct and efficient execution sequence from the dependency graph, as sketched below.

It is a referentially transparent functional language, in that evaluating functions and variables has no side effects, i.e., it does not change the values of other variables.

It is an array programming language, in which operations and functions generalize to work on multidimensional arrays.
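
Analytica's evaluation engine is not publicly specified; the following toy Python sketch only illustrates the general idea of declaring definitions and letting a dependency-driven evaluator work out the computation order on demand:

    # Toy sketch of declarative, dependency-driven evaluation (not Analytica's
    # actual engine). Each variable is just a definition; values are computed
    # on demand and cached, so no execution order is ever written down.
    definitions = {
        "revenue": lambda get: 120.0,
        "expenses": lambda get: 80.0,
        "profit": lambda get: get("revenue") - get("expenses"),
        "margin": lambda get: get("profit") / get("revenue"),
    }

    cache = {}

    def get(name):
        if name not in cache:
            cache[name] = definitions[name](get)
        return cache[name]

    print(get("margin"))   # evaluates revenue, expenses and profit as needed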

Applications of Analytica

Analytica has been used for policy analysis, business modeling, and risk analysis.[4] Areas in which Analytica has been applied include energy,[5][6][7][8][9][10] health and pharmaceuticals,[11][12][13][14][15][16][17][18][19][20][21][22][23][24][25][26] environmental risk and emissions policy analysis,[27][28][29][30][31][32][33][34][35] wildlife management,[36][37][38][39] ecology,[40][41][42][43][44][45][46] climate change,[47][48][49][50][51][52][53][54][55][56] technology and defense,[57][58][59][60][61][62][63][64][65][66][67][68][69][70][71][72][73][74] strategic financial planning,[75][76] R&D planning and portfolio management,[77][78][79] financial services, aerospace,[80] manufacturing[81] and environmental health impact assessment.[82]

Editions

The Analytica software runs on Microsoft Windows operating systems. Three editions (Professional, Enterprise, and Optimizer), each with more features at a higher price, are sold to users who want to build models. A free edition, Analytica Free 101, allows users to build medium-sized models of up to 101 user objects. Free 101 can also open models with more than 101 objects, change inputs, and compute results, which enables free sharing of models for review. The paid Power Player edition additionally lets users save inputs and use database connections. The Analytica Cloud Player allows models to be shared over the web and accessed and run through a web browser.

The most recent release of Analytica is version 4.6, released in May 2015.

History

Analytica's predecessor, called Demos,[83] grew from the research on tools for policy analysis by Max Henrion as a PhD student and later professor at Carnegie Mellon University between 1979 and 1990. Henrion founded Lumina Decision Systems in 1991 with Brian Arnold. Lumina continued to develop the software and apply it to environmental and public policy analysis applications. Lumina first released Analytica as a product in 1996.

References

  1. Granger Morgan and Max Henrion (1998), Analytica: A Software Tool for Uncertainty Analysis and Model Communication, Chapter 10 of Uncertainty: A Guide to Dealing with Uncertainty in Quantitative Risk and Policy Analysis, second edition, Cambridge University Press, New York.
  2. The DIST™ Standard, ProbabilityManagement.org
  3. Paul D. Kaplan and Sam Savage (2011), Monte Carlo, A Lightbulb for Illuminating Uncertainty, in Investments & Wealth Monitor
  4. Jun Long, Baruch Fischhoff (2000), Setting Risk Priorities: A Formal Model Risk Analysis, Risk Analysis 20(3):339–352.
  5. Stadler M., Marnay C., Azevedo I.L., Komiyama R., Lai J. (2009), The Open Source Stochastic Building Simulation Tool SLBM and Its Capabilities to Capture Uncertainty of Policymaking in the U.S. Building Sector
  6. Ye Li and H. Keith Florig (Sept. 2006), Modeling the Operation and Maintenance Costs of a Large Scale Tidal Current Turbine Farm, Oceans (2006):1-6
  7. L. F. Miller, Brian Thomas, J. McConn, J. Hou, J. Preston, T. Anderson, and M. Humberstone (2007), Uncertainty Analysis Methods for Equilibrium Fuel Cycles, ANS Summer Abstract.
  8. Gregory A. Norris and Peter Yost (Fall 2001), Journal of Industrial Ecology 5(4):15–28, MIT Press Journals.
  9. Jouni T Tuomisto and Marko Tainio (2005), An economic way of reducing health, environmental, and other pressures of urban traffic: a decision analysis on trip aggregation, BMC Public Health 5:123. doi:10.1186/1471-2458-5-123
  10. Yurika Nishioka, Jonathan I. Levy, Gregory A. Norris, Andrew Wilson, Patrick Hofstetter, John D. Spengler (Oct 2002), Integrating Risk Assessment and Life Cycle Assessment: A Case Study of Insulation, Risk Analysis 22(5):1003–1017.
  11. Igor Linkov, Richard Wilson and George M. Gray (1998), Anticarcinogenic Responses in Rodent Cancer Bioassays Are Not Explained by Random Effects, Toxicological Sciences 43(1), Oxford University Press.
  12. M. Loane and R. Wootton (Oct 2001), A simulation model for analysing patient activity in dermatology, Journal of Telemedicine and Telecare 7(1):23–25(3), Royal Society of Medicine Press.
  13. Davis Bu, Eric Pan, Janice Walker, Julia Adler-Milstein, David Kendrick, Julie M. Hook, Caitlin M. Cusack, David W. Bates, and Blackford Middleton (2007), Benefits of Information Technology–Enabled Diabetes Management, Diabetes Care 30:1137–1142, American Diabetes Association.
  14. Julia Adler-Milstein, Davis Bu, Eric Pan, Janice Walker, David Kendrick, Julie M. Hook, David W. Bates, Blackford Middleton. The Cost of Information Technology-Enabled Diabetes Management, Disease Management. June 1, 2007, 10(3): 115–128. doi:10.1089/dis.2007.103640.
  15. E. Ekaette, R.C. Lee, K-L Kelly, P. Dunscombe (Aug 2006), A Monte Carlo simulation approach to the characterization of uncertainties in cancer staging and radiation treatment decisions, Journal of the Operational Research Society 58:177–185.
  16. Lyon, Joseph L.; Alder, Stephen C.; Stone, Mary Bishop; Scholl, Alan; Reading, James C.; Holubkov, Richard; Sheng, Xiaoming; White, George L. Jr; Hegmann, Kurt T.; Anspaugh, Lynn; Hoffman, F Owen; Simon, Steven L.; Thomas, Brian; Carroll, Raymond; Meikle, A Wayne (Nov 2006), Thyroid Disease Associated With Exposure to the Nevada Nuclear Weapons Test Site Radiation: A Reevaluation Based on Corrected Dosimetry and Examination Data, Epidemiology 17(6):604–614.
  17. Negar Elmieh, Hadi Dowlatabadi, Liz Casman (Jan 2006), A model for Probabilistic Assessment of Malathion Spray Exposures (PAMSE) in British Columbia, CMU EEP.
  18. Detlof von Winterfeldt, Thomas Eppel, John Adams, Raymond Neutra, and Vincent Del Pizzo (2004), Managing Potential Health Risks from Electric Powerlines: A Decision Analysis Caught in Controversy, Risk Analysis 24(6):1487–1502.
  19. Rebecca Montville, Yuhuan Chen and Donald W. Schaffner (March 2002), Risk assessment of hand washing efficacy using literature and experimental data, International Journal of Food Microbiology 73(2–3):305–313.
  20. DC Kendrick, D Bu, E Pan, B Middleton (2007), Crossing the Evidence Chasm: Building Evidence Bridges from Process Changes to Clinical Outcomes, Journal of the American Medical Informatics Association, Elsevier.
  21. Louis Anthony (Tony) Cox, Jr. (May 2005), Potential human health benefits of antibiotics used in food animals: a case study of virginiamycin, Environment International 31(4):549–563.
  22. Jan Walker, Eric Pan, Douglas Johnston, Julia Adler-Milstein, David W. Bates, and Blackford Middleton (19 Jan 2005), The Value Of Health Care Information Exchange And Interoperability, Health Affairs.
  23. Doug Johnston, Eric Pan, Blackford Middleton, Finding the Value in Healthcare Information Technologies, Center for Information Technology Leadership (C!TL) whitepaper.
  24. Chrisman, L., Langley, P., Bay, S., and Pohorille, A. (Jan 2003), "Incorporating biological knowledge into evaluation of causal regulatory hypotheses", Pacific Symposium on Biocomputing (PSB).
  25. Jan Walker, Eric Pan, Douglas Johnson, Julia Adler-Milstein, David W. Bates and Blackford Middleton (2005), The Value of Health Care Information Exchange and Interoperability, Health Affairs.
  26. Steve Lohr, Road Map to a Digital System of Health Records, New York Times, January 29, 2005
  27. C. Bloyd, J. Camp, G. Conzelmann, J. Formento, J. Molburg, J. Shannon, M. Henrion, R. Sonnenblick, K. Soo Hoo, J. Kalagnanam, S. Siegel, R. Sinha, M. Small, T. Sullivan, R. Marnicio, P. Ryan, R. Turner, D. Austin, D. Burtraw, D. Farrell, T. Green, A. Krupnick, and E. Mansur (Dec 1996), Tracking and Analysis Framework (TAF) Model Documentation and User’s Guide: An Interaction Model for Integrated Assessment of Title IV of the Clean Air Act Amendments, Decision and Information Sciences Division, Argonne National Laboratory.
  28. Max Henrion, Richard Sonnenblick, Cary Bloyd (Jan 1997), Innovations in Integrated Assessment: The Tracking and Analysis Framework (TAF), Air and Waste Management Conference on Acid Rain and Electric Utilities, Scottsdale, AZ.
  29. Richard Sonnenblick and Max Henrion (Jan 1997), Uncertainty in the Tracking and Analysis Framework Integrated Assessment: The Value of Knowing How Little You Know, Air and Waste Management Conference on Acid Rain and Electric Utilities, Scottsdale, Arizona.
  30. R. Sinha, M. J. Small, P. F. Ryan, T. J. Sullivan and B. J. Cosby (July 1998), Reduced-Form Modelling of Surface Water and Soil Chemistry for the Tracking and Analysis Framework, Water, Air, & Soil Pollution 105 (3–4).
  31. Dallas Burtraw and Erin Mansur (Mar 1999), The Effects of Trading and Banking in the SO2 Allowance Market, Discussion paper 99–25, Resources for the Future.
  32. Galen McKinley, Miriam Zuk, Morten Höjer, Montserrat Avalos, Isabel González, Rodolfo Iniestra, Israel Laguna, Miguel A. Martínez, Patricia Osnaya, Luz M. Reynales, Raydel Valdés, and Julia Martínez (2005), Quantification of Local and Global Benefits from Air Pollution Control in Mexico City, Environ. Sci. Technol. 39:1954–1961.
  33. Luis A. Cifuentes, Enzo Sauma, Hector Jorquera and Felipe Soto (2000), Preliminary Estimation of the Potential Ancillary Benefits for Chile, Ancillary Benefits and Costs of Greenhouse Gas Mitigation.
  34. Marko Tainio, Jouni T Tuomisto, Otto Hänninen, Juhani Ruuskanen, Matti J Jantunen, and Juha Pekkanen (2007), Parameter and model uncertainty in a life-table model for fine particles (PM2.5): a statistical modeling study, Environ Health 6(24).
  35. L. Basson and J.G. Petrie (Feb 2007), An integrated approach for the consideration of uncertainty in decision making supported by Life Cycle Assessment, Environmental Modeling & Software 22(2):167–176, Environmental Decision Support Systems, Elsevier.
  36. Matthew F. Bingham, Zhimin Li, Kristy E. Mathews, Colleen M. Spagnardi, Jennifer S. Whaley, Sara G. Veale and Jason C. Kinnell (2011), An Application of Behavioral Modeling to Characterize Urban Angling Decisions and Values, North American Journal of Fisheries Management 31:257–268.
  37. Peter B. Woodbury, James E. Smith, David A. Weinstein and John A. Laurence (Aug 1998), Assessing potential climate change effects on loblolly pine growth: A probabilistic regional modeling approach, Forest Ecology and Management 107 (1–3), 99–116.
  38. P.R. Richard, M. Power, M. Hammilton (2003), Eastern Hudson Bay Beluga Precautionary Approach Case Study: Risk analysis models for co-management, Canadian Science Advisory Secretariat Research Document.
  39. P.R. Richard (2003), Incorporating Uncertainty in Population Assessments, Canadian Science Advisory Secretariat Research Document.
  40. O'Ryan R., Diaz M. (2008), The Use of Probabilistic Analysis to Improve Decision-Making in Environmental Regulation in a Developing Context: The Case of Arsenic Regulation in Chile, Human and Ecological Risk Assessment: An International Journal, Vol 14, Issue 3, pg: 623–640.
  41. Andrew Gronewold and Mark Borsuk, "A probabilistic modeling tool for assessing water quality standard compliance", submitted to EMS Oct 2008.
  42. Mark E. Borsuk, Peter Reichert, Armin Peter, Eva Schager and Patricia Burkhardt-Holm (Feb 2006), Assessing the decline of brown trout (Salmo trutta) in Swiss rivers using a Bayesian probability network, Ecological Modelling 192 (1–2):224–244.
  43. Mark E. Borsuk, Craig A. Stow and Kenneth H. Reckhow (Apr 2004), A Bayesian network of eutrophication models for synthesis, prediction, and uncertainty analysis, Ecological Modelling 173 (2–3):219–239.
  44. Mark E. Borsuk, Sean P. Powers, and Charles H. Peterson (2002), A survival model of the effects of bottom-water hypoxia on the population density of an estuarine clam (Macoma balthica), Canadian Journal of Fisheries and Aquatic Sciences (59):1266–1274.
  45. Rebecca Montville and Donald Schaffner (Feb 2005), Monte Carlo Simulation of Pathogen Behavior during the Sprout Production Process, Applied and Environmental Microbiology 71(2):746–753.
  46. S. K. J. Rasmussen, T. Ross, J. Olley and T. McMeekin (2002), A process risk model for the shelf life of Atlantic salmon fillets, International Journal of Food Microbiology 73(1):47–60.
  47. David G. Groves and Robert J. Lempert (Feb 2007), A new analytic method for finding policy-relevant scenarios, Global Environmental Change 17(1):73–85.
  48. Maged Senbel, Timothy McDaniels, and Hadi Dowlatabadi (July 2003), The ecological footprint: a non-monetary metric of human consumption applied to North America, Global Environmental Change 13(2):83–100.
  49. Dowlatabadi, H. (1998). Sensitivity of Climate Change Mitigation Estimates to Assumptions About Technical Change. Energy Economics 20: 473–93.
  50. West, J. J. and H. Dowlatabadi (1998). On assessing the economic impacts of sea level rise on developed coasts. Climate, change and risk. London, Routledge. 205–20.
  51. Leiss, W., H. Dowlatabadi, and Greg Paoli (2001). Who's Afraid of Climate Change? A guide for the perplexed. Isuma 2(4): 95–103.
  52. Morgan, M. G., M. Kandlikar, J. Risbey and H. Dowlatabadi (1999). Why conventional tools for policy analysis are often inadequate for problems of global change. Climatic Change 41: 271–81.
  53. Casman, E. A., M. G. Morgan and H. Dowlatabadi (1999). Mixed Levels of Uncertainty in Complex Policy Models. Risk Analysis 19(1): 33–42.
  54. Dowlatabadi, H. (2003). Scale and Scope In Integrated Assessment: lessons from ten years with ICAM. Scaling in Integrated Assessment. J. Rotmans and D. S. Rothman. Lisse, Swetz & Zeitlinger: 55–72.
  55. Dowlatabadi, H. (2000). Bumping against a gas ceiling. Climatic Change 46(3): 391–407.
  56. Morgan, M. G. and H. Dowlatabadi (1996). Learning From Integrated Assessment of Climate Change. Climatic Change 34: 337–368.
  57. Henry Neimeier (1996), A New Paradigm For Modeling The Precision Strike Process, published in MILCOM96.
  58. Russell F. Richards, Henry A. Neimeier, W. L. Hamm, and D. L. Alexander, "Analytical Modeling in Support of C4ISR Mission Assessment (CMA)," Third International Symposium on Command and Control Research and Technology, National Defense University, Fort McNair, Washington, DC, June 17–20, 1997, pp. 626–639.
  59. Henry Neimeier and C. McGowan (1996), "Analyzing Processes with HANQ", Proceedings of the International Council on Systems Engineering '96.
  60. Kenneth P. Kuskey and Susan K. Parker (2000), "The Architecture of CAPE Models", MITRE technical paper. See Abstract.
  61. Henry Neimeier (1994), "Analytic Queuing Network", Conference Proceedings of the 12th International Conference on the System Dynamics Society, in Stirling, Scotland.
  62. Henry Neimeier (1996), "Analytic Uncertainty Modeling Versus Discrete Event Simulation", PHALANX.
  63. Rahul Tongia, "Can broadband over powerline carrier (PLC) compete?". The author uses Analytica to model the economic viability of the introduction of a PLC service.
  64. Promises and False Promises of PowerLine Carrier (PLC) Broadband Communications – A Techno-Economic Analysis http://tprc.org/papers/2003/246/Tongia-PLC.pdf
  65. Kanchana Wanichkorn and Marvin Sirbu (1998), The Economics of Premises Internet Telephony, CMU-EPP.
  66. E.L. Kyser, E.R. Hnatek, M.H. Roettgering (2001), The politics of accelerated stress testing, Sound and Vibration 35(3):24–29.
  67. Kevin J. Soo Hoo (June 2000), How Much Is Enough? A Risk-Management Approach to Computer Security, Working Paper, Consortium for Research on Information Security and Policy (CRISP), Stanford University.
  68. M. Steinbach and S. Giles of MITRE (2005), A Model for Joint Infrastructure Investment, AIAA-2005-7309, in AIAA 5th ATIO and 16th Lighter-than-air sys tech and balloon systems conferences, Arlington VA, Sep 26–28, 2005.
  69. Bloomfield, R., Guerra, S. (2002), Process modelling to support dependability arguments, Proceedings. International Conference on Dependable Systems and Networks, pg. 113–122. DSN 2002.
  70. Christopher L Weber and Sanath K Kalidas (Fall 2004), Cost-Benefit Analysis of LEED Silver Certification for New House Residence Hall at Carnegie Mellon University, Civil Systems Investment Planning and Pricing Project, Dept. of Civil & Environmental Engineering, Carnegie Mellon University.
  71. J. McMahon, X. Liu, I. Turiel (Jun 2000), Uncertainty and sensitivity analyses of ballast life-cycle cost and payback period, Technical Report LBNL–44450, Lawrence Berkeley Labs, Berkeley CA.
  72. Paul K. Davis (2000), Dealing with complexity: exploratory analysis enabled by multiresolultion, multiperspective modeling, Proceedings of the 32nd Conference on Winter Simulation, pg. 293–302.
  73. Paul K. Davis (2000), Exploratory Analysis Enabled by Multiresolution, Multiperspective Modeling, Proceedings of the 2000 Winter Simulation Conference J. A. Joines, R. R. Barton, K. Kang, and P. A. Fishwick, eds.
  74. NASA (1994), Schedule and Cost Risk Analysis Modeling (SCRAM) System, NASA SBIR Successes.
  75. "Cubeplan case studies". Cubeplan.com. Retrieved 2011-07-12.
  76. "Novix consulting services". Novix.com. Retrieved 2011-07-12.
  77. Enrich Consulting, publications on Portfolio Management
  78. "Bicore, Inc". Bicore.nl. Retrieved 2011-07-12.
  79. "R&D evaluation tools at W.L. Gore". Lumina. Archived from the original on October 17, 2013.
  80. Speeding turnaround of the Space Shuttle, Lumina case studies
  81. Auto maker saves $250M on warranty costs, Lumina case studies
  82. James Grellier, Paolo Ravazzani, and Elisabeth Cardis (2014), Potential health impacts of residential exposures to extremely low frequency magnetic fields in Europe, Environment International 62, pp. 55–63. doi:10.1016/j.envint.2013.09.017
  83. Neil Wishbow and Max Henrion, "Demos User's Manual", Department of Engineering and Public Policy, Carnegie Mellon University, 1987.
