Integrated information theory

Integrated information theory (IIT) attempts to explain what consciousness is and why it might be associated with certain physical systems. Given any such system, the theory predicts whether that system is conscious, to what degree it is conscious, and what particular experience it is having (see Central Identity). According to IIT, a system's consciousness is determined by its causal properties and is therefore an intrinsic, fundamental property of certain causal systems.
IIT was proposed by neuroscientist Giulio Tononi in 2004 and has been developed continuously since then. The latest version of the theory, labeled IIT 3.0, was published in 2014.[1][2]
Overview
Relationship to the "Hard Problem of Consciousness"
David Chalmers has argued that any attempt to explain consciousness in purely physical terms (i.e. to start with the laws of physics as they are currently formulated and derive the necessary and inevitable existence of consciousness) eventually runs into the so-called "hard problem". Rather than try to start from physical principles and arrive at consciousness, IIT "starts with consciousness" (accepts the existence of consciousness as certain) and reasons about the properties that a postulated physical substrate would have to have in order to account for it. The ability to perform this jump from phenomenology to mechanism rests on IIT's assumption that if a conscious experience can be fully accounted for by an underlying physical system, then the properties of the physical system must be constrained by the properties of the experience.
Specifically, IIT moves from phenomenology to mechanism by attempting to identify the essential properties of conscious experience (dubbed "axioms") and, from there, the essential properties of conscious physical systems (dubbed "postulates").
Axioms: essential properties of experience
The axioms are intended to capture the essential aspects of every conscious experience. Every axiom should apply to every possible experience.
The wording of the axioms has changed slightly as the theory has developed; the most recent and complete statement of the axioms is as follows:
In brief, the five axioms are intrinsic existence, composition, information, integration, and exclusion: every experience exists intrinsically, is structured, is differentiated from other possible experiences, is unified, and is definite. The verbatim statement of the axioms is given by Tononi in Scholarpedia.[2]
Postulates: properties required of the physical substrate
The axioms describe regularities in conscious experience, and IIT seeks to explain these regularities. What could account for the fact that every experience exists, is structured, is differentiated, is unified, and is definite? IIT argues that the existence of an underlying causal system with these same properties offers the most parsimonious explanation. Thus a physical system, if conscious, is so by virtue of its causal properties.
The properties required of a conscious physical substrate are called the "postulates," since the existence of the physical substrate is itself only postulated (remember, IIT maintains that the only thing one can be sure of is the existence of one's own consciousness). In what follows, a "physical system" is taken to be a set of elements, each with two or more internal states, inputs that influence that state, and outputs that are influenced by that state (neurons or logic gates are the natural examples). Given this definition of "physical system", the postulates are:
In brief, the postulates mirror the axioms: a physical substrate of consciousness must have cause-effect power upon itself (intrinsic existence), its subsets of elements must also have cause-effect power (composition), it must specify a particular cause-effect structure (information), that structure must be irreducible to the structures specified by its parts (integration), and it must be maximal over elements and spatio-temporal grains (exclusion). The verbatim statement of the postulates is given by Tononi in Scholarpedia.[2]
Mathematics: formalization of the postulates
For a complete account of the mathematical formalization of IIT, see Oizumi, Albantakis and Tononi.[1] What follows is a brief summary, adapted from Albantakis and Tononi,[3] of the most important quantities involved. Pseudocode for the algorithms used to calculate these quantities is available online.[4]
A system refers to a set of elements, each with two or more internal states, inputs that influence that state, and outputs that are influenced by that state. A mechanism refers to a subset of system elements. The mechanism-level quantities below are used to assess the integration of any given mechanism, and the system-level quantities are used to assess the integration of sets of mechanisms ("sets of sets").
In order to apply the IIT formalism to a system, its full transition probability matrix (TPM) must be known. The TPM specifies the probability with which any state of a system transitions to any other system state. Each of the following quantities is calculated in a bottom-up manner from the system's TPM.
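As an illustration of what such a TPM looks like, the following minimal sketch builds the state-by-state TPM of a hypothetical three-element network of deterministic logic gates; the particular update rules (OR, AND, XOR) and labels are invented for the example and are not drawn from the theory itself.

```python
import itertools
import numpy as np

# Hypothetical 3-element system: each element is a binary logic gate whose
# next state is a function of the current system state (illustrative choice):
# A <- OR(B, C), B <- AND(A, C), C <- XOR(A, B)

def next_state(state):
    a, b, c = state
    return (b | c, a & c, a ^ b)

n = 3
states = list(itertools.product((0, 1), repeat=n))   # all 2^n system states
index = {s: i for i, s in enumerate(states)}

# State-by-state TPM: entry [i, j] is the probability that state i
# transitions to state j. Deterministic gates give 0/1 entries.
tpm = np.zeros((2 ** n, 2 ** n))
for s in states:
    tpm[index[s], index[next_state(s)]] = 1.0

print(tpm)                                  # each row sums to 1
assert np.allclose(tpm.sum(axis=1), 1.0)
```

All of the quantities defined below are derived, directly or indirectly, from this matrix together with the system's connectivity.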
Mechanism-level quantities

- A cause-effect repertoire $CER(y_t, Z_{t\pm 1}) = \{p_{\text{cause}}(z_{t-1} \mid y_t),\; p_{\text{effect}}(z_{t+1} \mid y_t)\}$ is the pair of probability distributions specifying how a mechanism $Y_t$, in its current state $y_t$, constrains the past and future states of a set of elements $Z$ (the mechanism's purview). Note that these distributions are derived from the system's TPM by perturbation, so they capture causal constraints rather than merely observed correlations.
- A partition $P = \{(Y^1_t, Z^1_{t\pm 1}), (Y^2_t, Z^2_{t\pm 1})\}$ divides a mechanism and its purview into independent parts; the causal connections between the parts are severed (replaced by noise), and the partitioned cause-effect repertoire $CER^P$ is the product of the repertoires of the parts.
- The earth mover's distance $EMD(p, q)$ measures the distance between two probability distributions $p$ and $q$ over system states: the minimum cost of transforming one distribution into the other, where moving probability mass between two states costs the amount moved multiplied by the Hamming distance between those states. (A numerical sketch of the EMD is given after this list.)
- Integrated information $\varphi(y_t, Z_{t\pm 1}, P) = EMD\!\left(CER(y_t, Z_{t\pm 1}),\, CER^P(y_t, Z_{t\pm 1})\right)$ measures the irreducibility of a cause-effect repertoire with respect to a partition $P$. The irreducibility of the cause repertoire with respect to $P$ is denoted $\varphi_{\text{cause}}$, and that of the effect repertoire $\varphi_{\text{effect}}$. Combined, $\varphi = \min(\varphi_{\text{cause}}, \varphi_{\text{effect}})$.
- The minimum-information partition of a mechanism and its purview is given by $MIP = \arg\min_P \varphi(y_t, Z_{t\pm 1}, P)$, the partition that makes the least difference to the cause-effect repertoire. Note that the minimum-information "partition", despite its name, is really a pair of partitions. We call these partitions $MIP_{\text{cause}}$ and $MIP_{\text{effect}}$: the partitions that least affect the cause repertoire and the effect repertoire, respectively.
- There is at least one choice of elements over which a mechanism's cause-effect repertoire is maximally irreducible (in other words, over which its $\varphi^{MIP}$ value is highest). This repertoire is called the mechanism's maximally irreducible cause-effect repertoire. Formally, $\varphi^{\max}(y_t) = \max_{Z_{t\pm 1}} \varphi^{MIP}(y_t, Z_{t\pm 1})$.
- The concept specified by a mechanism $Y_t$ in state $y_t$ is its maximally irreducible cause-effect repertoire, together with the purview over which it is computed and its $\varphi^{\max}$ value. The intrinsic cause-effect power of $Y_t$ in state $y_t$ is $\varphi^{\max}(y_t)$.
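To make the earth mover's distance concrete, here is a minimal sketch that computes the EMD between two distributions over the states of a hypothetical two-element purview, using the Hamming distance between states as the ground metric and a generic linear-programming solver. The example distributions are invented, and this is not the reference implementation used by the IIT software.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

def hamming(s, t):
    return sum(a != b for a, b in zip(s, t))

def emd(p, q, states):
    """Earth mover's distance between distributions p and q over `states`,
    with the Hamming distance between states as the ground metric."""
    m = len(states)
    cost = np.array([[hamming(s, t) for t in states] for s in states]).ravel()
    A_eq, b_eq = [], []
    for i in range(m):                      # outgoing flow from state i equals p[i]
        row = np.zeros(m * m); row[i * m:(i + 1) * m] = 1
        A_eq.append(row); b_eq.append(p[i])
    for j in range(m):                      # incoming flow into state j equals q[j]
        col = np.zeros(m * m); col[j::m] = 1
        A_eq.append(col); b_eq.append(q[j])
    res = linprog(cost, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=(0, None), method="highs")
    return res.fun

states = list(itertools.product((0, 1), repeat=2))   # purview of 2 binary elements
p = np.array([0.5, 0.5, 0.0, 0.0])                   # intact repertoire (made up)
q = np.array([0.25, 0.25, 0.25, 0.25])               # partitioned repertoire (made up)
print(emd(p, q, states))                             # 0.5
```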
System-level quantities

- A cause-effect structure $C(x_t)$ is the set of concepts specified by all mechanisms $Y_t \subseteq X_t$ with $\varphi^{\max} > 0$ within a system $X$ in state $x_t$.
- A unidirectional partition $P_{\rightarrow} = \{X^1, X^2\}$ divides the system into two parts and severs the causal connections from one part to the other in a single direction, rendering them causally ineffective.
- The extended earth mover's distance $XEMD(C, C')$ measures the distance between two cause-effect structures $C$ and $C'$. In the XEMD, the "earth" to be transported is intrinsic cause-effect power ($\varphi^{\max}$), and the distance over which it is transported between two concepts is the EMD between their cause-effect repertoires.
- Integrated (conceptual) information $\Phi(x_t, P_{\rightarrow}) = XEMD\!\left(C(x_t),\, C^{P_{\rightarrow}}(x_t)\right)$ measures the irreducibility of a cause-effect structure with respect to a unidirectional partition: the distance between the intact structure and the structure specified by the partitioned system.
- The minimum-information partition of a set of elements in a state is given by $MIP(x_t) = \arg\min_{P_{\rightarrow}} \Phi(x_t, P_{\rightarrow})$, the unidirectional partition that makes the least difference to the cause-effect structure (see the sketch after this list).
- The intrinsic cause-effect power of a set of elements in a state is given by $\Phi^{\max}(x_t) = \Phi(x_t, MIP(x_t))$, the irreducibility of its cause-effect structure with respect to its minimum-information partition.
- A complex is a set of elements $X_t$ with $\Phi^{\max}(x_t) > 0$ that is not contained within a larger set of elements with higher $\Phi^{\max}$. The maximally irreducible cause-effect structure specified by a complex is called its conceptual structure.
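The system-level search for the minimum-information partition is, at its core, a minimization over all unidirectional bipartitions. The sketch below enumerates those cuts for a small set of elements; the `evaluate` callback is a hypothetical stand-in for the full XEMD-based computation of $\Phi$, which is not implemented here.

```python
import itertools

def unidirectional_bipartitions(elements):
    """Yield ordered cuts (part_a, part_b): connections from part_a to part_b
    are severed. Enumerating every non-empty proper subset as part_a covers
    each bipartition once per cut direction."""
    elements = tuple(elements)
    for r in range(1, len(elements)):
        for part_a in itertools.combinations(elements, r):
            part_b = tuple(e for e in elements if e not in part_a)
            yield part_a, part_b

def minimum_information_partition(elements, evaluate):
    """Return the unidirectional cut with the smallest value of
    evaluate(part_a, part_b), a placeholder for the XEMD-based computation
    of integrated conceptual information under that cut."""
    return min(unidirectional_bipartitions(elements),
               key=lambda cut: evaluate(*cut))

# Toy usage with a placeholder evaluation function (NOT real big Phi):
fake_phi = lambda a, b: len(a) * len(b)
print(minimum_information_partition(("A", "B", "C"), fake_phi))
```

The number of such cuts grows exponentially with the number of elements, which is one source of the computational cost discussed under Related Experimental Work.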
Cause-effect space
For a system of $n$ simple binary elements, cause-effect space is formed by $2 \times 2^n$ axes, one for each possible past and future state of the system. Any cause-effect repertoire, which specifies the probability of each possible past and future state of the system, can be plotted as a point in this high-dimensional space: the position of the point along each axis is given by the probability of that state as specified by the repertoire. If a point is also taken to have a scalar magnitude (which can informally be thought of as the point's "size"), then it can easily represent a concept: the concept's cause-effect repertoire specifies the location of the point in cause-effect space, and the concept's $\varphi^{\max}$ value specifies that point's magnitude.

In this way, a conceptual structure can be plotted as a constellation of points in cause-effect space. Each point is called a star, and each star's magnitude ($\varphi^{\max}$) is its size.
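As a minimal illustration of these coordinates, the sketch below counts the axes of cause-effect space for a hypothetical two-element system and assembles a concept's "star" from made-up cause and effect repertoires and a made-up $\varphi^{\max}$ value.

```python
import itertools
import numpy as np

n = 2                                         # binary elements in the system
past_states = list(itertools.product((0, 1), repeat=n))
future_states = past_states                   # same state space
axes = len(past_states) + len(future_states)  # 2 * 2**n axes of cause-effect space
print(axes)                                   # 8 for n = 2

# Hypothetical cause and effect repertoires (placeholder numbers):
p_cause  = np.array([0.5, 0.5, 0.0, 0.0])     # probabilities over past states
p_effect = np.array([0.0, 0.25, 0.25, 0.5])   # probabilities over future states

# The concept's position in cause-effect space is the concatenation of the
# two repertoires; its phi_max value gives the point's scalar magnitude.
point = np.concatenate([p_cause, p_effect])
phi_max = 0.17                                # placeholder irreducibility value
star = (point, phi_max)
print(star)
```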
Central Identity
IIT addresses the mind-body problem by proposing an identity between phenomenological properties of experience and causal properties of physical systems: The conceptual structure specified by a complex of elements in a state is identical to its experience.
Specifically, the form of the conceptual structure in cause-effect space completely specifies the quality of the experience, while the irreducibility of the conceptual structure specifies the level to which it exists (i.e., the complex's level of consciousness). The maximally irreducible cause-effect repertoire of each concept within a conceptual structure specifies what the concept contributes to the quality of the experience, while its irreducibility $\varphi^{\max}$ specifies how much the concept is present in the experience.
According to IIT, an experience is thus an intrinsic property of a complex of mechanisms in a state.
Extensions
The calculation of even a modestly sized system's $\Phi^{\max}$ is often computationally intractable, so efforts have been made to develop heuristic or proxy measures of integrated information. For example, Masafumi Oizumi and colleagues have developed a quantity, $\Phi^*$, which serves as a practical proxy for integrated information.[5]
Adam Barrett and Anil Seth have used Tononi's ideas to develop similar measures of integrated information, such as "phi empirical".[6]
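To give a flavour of how such proxies work, without reproducing $\Phi^*$ or "phi empirical" themselves, the following sketch computes a rough "whole minus sum of parts" quantity from time-lagged mutual information under a Gaussian assumption; the bipartition and the time-series data are placeholders.

```python
import numpy as np

def gaussian_mi(x, y):
    """Mutual information (in nats) between multivariate x and y
    (samples in rows), assuming the variables are jointly Gaussian."""
    cov = lambda a: np.atleast_2d(np.cov(a, rowvar=False))
    det = np.linalg.det
    return 0.5 * np.log(det(cov(x)) * det(cov(y)) / det(cov(np.hstack([x, y]))))

def whole_minus_sum(data, parts, lag=1):
    """Time-lagged mutual information of the whole system minus the sum over
    the parts -- a rough, purely illustrative integration measure, not one of
    the published measures cited above."""
    past, present = data[:-lag], data[lag:]
    whole = gaussian_mi(past, present)
    summed = sum(gaussian_mi(past[:, p], present[:, p]) for p in parts)
    return whole - summed

# Placeholder time series (4 channels of noise) and a made-up bipartition;
# for unstructured noise the result should hover near zero.
rng = np.random.default_rng(0)
data = rng.standard_normal((1000, 4))
print(whole_minus_sum(data, parts=[[0, 1], [2, 3]]))
```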
Related Experimental Work
While the algorithm[4] for assessing a system's $\Phi^{\max}$ and conceptual structure is relatively straightforward, its high time complexity makes it computationally intractable for many systems of interest. Heuristics and approximations can sometimes be used to provide ballpark estimates of a complex system's integrated information, but precise calculations are often impossible. These computational challenges, combined with the already difficult task of reliably and accurately assessing consciousness under experimental conditions, make testing many of the theory's predictions difficult.
Despite these challenges, researchers have attempted to use measures of information integration and differentiation to assess levels of consciousness in a variety of subjects.[7][8] For instance, a recent study using a less computationally intensive proxy for $\Phi$ was able to reliably discriminate between varying levels of consciousness in wakeful, sleeping (dreaming vs. non-dreaming), anesthetized, and comatose (vegetative vs. minimally conscious vs. locked-in) individuals.[9]
IIT also makes several predictions that fit well with existing experimental evidence and can be used to explain some counterintuitive findings in consciousness research.[10] For example, IIT can explain why some brain regions, such as the cerebellum, do not appear to contribute to consciousness despite their size and functional importance. IIT can also help to explain why severing the corpus callosum appears to lead to two separate consciousnesses in split-brain patients.
Reception
Integrated Information Theory has received both broad criticism and support.
Support
Neuroscientist Christof Koch has called IIT "the only really promising fundamental theory of consciousness".[11]
Criticism
Some critics have argued that IIT proposes conditions that may be necessary for consciousness but are not sufficient.[12] Objections have also been made to the claim that all of IIT's axioms are self-evident.[13] Since IIT is not a functionalist theory of consciousness, many historical criticisms of non-functionalism have been applied to it.[13] Disagreements over the definition of consciousness itself also lead to criticism of the theory.[12][13]
References
1. Oizumi, Masafumi; Albantakis, Larissa; Tononi, Giulio (2014-05-08). "From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0". PLoS Comput Biol 10 (5): e1003588. doi:10.1371/journal.pcbi.1003588. PMC 4014402. PMID 24811198.
2. "Integrated information theory - Scholarpedia". www.scholarpedia.org. Retrieved 2015-11-23.
3. Albantakis, Larissa; Tononi, Giulio (2015-07-31). "The Intrinsic Cause-Effect Power of Discrete Dynamical Systems—From Elementary Cellular Automata to Adapting Animats". Entropy 17 (8): 5472–5502. doi:10.3390/e17085472.
4. "CSC-UW/iit-pseudocode". GitHub. Retrieved 2016-01-29.
5. Oizumi, Masafumi; Amari, Shun-ichi; Yanagawa, Toru; Fujii, Naotaka; Tsuchiya, Naotsugu (2015-05-17). "Measuring integrated information from the decoding perspective". arXiv.
6. Barrett, A. B.; Seth, A. K. (2011). "Practical measures of integrated information for time-series data". PLoS Comput Biol 7 (1): e1001052.
7. Massimini, M.; Ferrarelli, F.; Murphy, M. J.; Huber, R.; Riedner, B. A.; Casarotto, S.; Tononi, G. (2010-09-01). "Cortical reactivity and effective connectivity during REM sleep in humans". Cognitive Neuroscience 1 (3): 176–183. doi:10.1080/17588921003731578. ISSN 1758-8936. PMC 2930263. PMID 20823938.
8. Ferrarelli, Fabio; Massimini, Marcello; Sarasso, Simone; Casali, Adenauer; Riedner, Brady A.; Angelini, Giuditta; Tononi, Giulio; Pearce, Robert A. (2010-02-09). "Breakdown in cortical effective connectivity during midazolam-induced loss of consciousness". Proceedings of the National Academy of Sciences of the United States of America 107 (6): 2681–2686. doi:10.1073/pnas.0913008107. ISSN 1091-6490. PMC 2823915. PMID 20133802.
9. Casali, Adenauer G.; Gosseries, Olivia; Rosanova, Mario; Boly, Mélanie; Sarasso, Simone; Casali, Karina R.; Casarotto, Silvia; Bruno, Marie-Aurélie; Laureys, Steven (2013-08-14). "A Theoretically Based Index of Consciousness Independent of Sensory Processing and Behavior". Science Translational Medicine 5 (198): 198ra105. doi:10.1126/scitranslmed.3006294. ISSN 1946-6234. PMID 23946194.
10. "Integrated information theory - Scholarpedia". www.scholarpedia.org. Retrieved 2016-01-28.
11. Zimmer, Carl (2010-09-20). "Sizing Up Consciousness by Its Bits". The New York Times. ISSN 0362-4331. Retrieved 2015-11-23.
12. "Shtetl-Optimized » Blog Archive » Why I Am Not An Integrated Information Theorist (or, The Unconscious Expander)". www.scottaaronson.com. Retrieved 2015-11-23.
13. Cerullo, Michael A.; Kording, Konrad P. (2015-09-17). "The Problem with Phi: A Critique of Integrated Information Theory". PLOS Computational Biology 11 (9): e1004286. doi:10.1371/journal.pcbi.1004286.
External links
Related papers
- From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0
- Integrated Information Theory: An Updated Account (2012) (First presentation of IIT 3.0)
- Integrated Information Theory: A Provisional Manifesto (2008) (IIT 2.0)
- An Information Integration Theory of Consciousness (2004) (IIT 1.0)
Websites
- IntegratedInformationTheory.org - Maintains software for calculating IIT quantities.
News articles
- MIT Technology Review (2014): What It Will Take for Computers to Be Conscious
- Wired (2013): A Neuroscientist’s Radical Theory of How Networks Become Conscious
- The New Yorker (2013): How Much Consciousness Does an iPhone Have?
- New York Times (2010): Sizing Up Consciousness by Its Bits
- Scientific American (2009): A “Complex” Theory of Consciousness
- IEEE Spectrum (2008): A Bit of Theory: Consciousness as Integrated Information Theory
Talks
- David Chalmers (2014): How do you explain consciousness?
- Christof Koch (2014): The Integrated Information Theory of Consciousness