Hard Problem metaphors

The Hard Problem of consciousness is an unsolved question in the philosophy of mind that concerns a seemingly fundamental aspect of consciousness: the apparently irreducible subjective qualities of experience known as qualia.

Philosophers and scientists have tried to solve the problem by adding further layers of explanation, by reducing it away or denying that it exists at all,[1] or by minimizing its importance.[2][3] In the last view, the mind can be treated as just a "black box" linking the senses to the muscles.

As of the late 2010s, proposed explanations for the essential mystery of awareness remain incomplete. They often rely on vague metaphors that are, at best, possible steps toward an answer rather than answers in themselves. If a genuine theory of awareness is ever developed, these metaphors may become obsolete and unnecessary. It is also possible that all current metaphors of awareness are simply wrong.[4]

The following list is incomplete.

Awareness metaphors

Quantity is quality

An immense number of signals is exchanged every second in the human brain, arguably more than in any supercomputer. This information is also more organized and integrated[5] than in any human-built computer. The subjective strangeness of awareness is fully equivalent to this immense complexity: the quality of awareness represents a vast quantity of knowledge. This organized complexity could be seen as a neural correlate of consciousness. The perception of reality is no more complex than the sum of what could be described about it in words, given enough time; someone could spend years describing every aspect of their life, yet their awareness can represent that complexity instantaneously. This is related to Strong Reductionism.

Strong Reductionism

The existence of awareness is no stranger than the existence of any object. The same basic essence underlies both matter and information. Matter IS information.[6] The actions that matter performs in the physical universe are just as complex as the perception and understanding of these actions. This is related to Transparency theory.

Transparency theory

When thinking about something, a mind perceives only a detailed representation of that thing, not any extra qualia or additional mental properties.[7] Qualia are artifacts that exist only when they are specifically being considered. This could be an aspect of Panpsychism.

Panpsychism

All information has a qualia-like aspect, but some qualia are more elaborate and therefore more intense than others. Everything physical has an underlying phenomenal nature. Full awareness involves the most detailed or elaborate qualia. These combine and organize the output of many lower qualia in the mind and the environment.[8]

Awareness is tautological

To be aware, a mind must create an elaborate and constantly updated description of its reality (which could be seen as the mind's focus). This model may be wrong or incomplete, but it will necessarily appear as the highest truth to the mind creating it, since no other truth is available to it.[9]

Experiencing is reliving

According to this theory, awareness is a function of procedural memory. The current awareness of carrying out an often-repeated action is immediately merged in memory with all the previous times that action has been carried out.[10]

This may also be a case of shared awareness.

Awareness is reverse memory

Emotions and feelings are undefined knowledge: a distinct sense of something that could be described in words only with a great deal of effort, or not at all (for example, the reason for a phobia). Emotions somehow describe these logical concepts before they have been logically defined, but they may be imprecise and unreliable.

This is known as first-order representationalism. Conscious states attempt to present elements of reality "nonconceptually", even if the mind lacks the mental concepts to rationally categorize or describe them, as happens in intuition.[11]

In the theory of Neural Darwinism,[12] awareness combines direct perceptions and memories to create more elaborate but loosely defined higher perceptions about reality. It does this through neural reentry. The purpose is to prepare for the future, aiding the organism's and hence the mind's survival.

Awareness is false memory

While a mind can remember being aware in the past, awareness does not actually exist at any single moment in time, since any such moment is infinitesimally brief.

The perception that perception is inexplicable could itself be a mental simplification.[13] Perceptions of the past self are unreliable, as described in the multiple drafts model. This concept is part of Gerald Edelman's 1989 book about consciousness, "The Remembered Present".[14]

Both the external world and the past self are represented as something fundamentally different from the present self, which is the only version of the self which is undeniably aware (an extreme version would be momentary solipsism[15]). This may be related to the illusion of time passing and the paradox of free will.

Awareness is the illusion of freedom

Minds have no control over past actions or events, and the freedom to act in the present (free will) is itself determined by the past self.[16] Philosophically, this is known as the argument from free will. Many experiments in the neuroscience of free will suggest that introspections about the reasons for an action are mostly illusions created after the fact.

Awareness is a self representational loop

Awareness may be fundamentally illogical. Every element is defined in terms of other elements. Such a recursive string of definitions may loop back on itself, creating the perceived self as a type of bootstrap model.[17]
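
As a loose illustration only (not part of the theory itself), the hypothetical sketch below defines each term of a tiny "mind" purely in terms of the others and then follows the chain of definitions until it closes on itself; the terms and the structure of the dictionary are invented for the example.

  # Toy illustration: every term is "defined" only by pointing at other
  # terms, so any chain of definitions eventually loops back on itself.
  definitions = {
      "self":       ["observer", "memories"],
      "observer":   ["perception", "self"],
      "perception": ["observer"],
      "memories":   ["perception", "self"],
  }

  def find_loop(term, path=()):
      """Follow definitions from `term` until some term repeats."""
      if term in path:
          return path[path.index(term):] + (term,)   # the closed loop
      for part in definitions[term]:
          loop = find_loop(part, path + (term,))
          if loop:
              return loop
      return None

  print(find_loop("self"))   # -> ('observer', 'perception', 'observer')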

Mind layers

Damasio's theory of consciousness and other higher-order theories of consciousness state that the mind is composed of many automatic response systems that are not themselves conscious. These are correlated, controlled, and manipulated by increasingly abstract control layers, each of which acts on the systems of the next lower layer. At the top (the third layer, in Damasio's account) is abstract thought and planning, which is only partially aware of the lower layers and separate from them, but uses their results. It may do this through the mind's global workspace.[18]
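
A minimal sketch of this layered picture is given below, assuming invented layer names, a single shared dictionary standing in for the global workspace, and a one-number stimulus; none of these details come from Damasio or any other published model.

  # Hypothetical three-layer sketch: each layer sees only a summary of the
  # layer below it, and the top layer reads from a shared "workspace"
  # rather than from the raw signals.
  workspace = {}

  def reflex_layer(stimulus):
      # Automatic responses: fast, not conscious, no access to the workspace.
      return {"threat": stimulus > 0.8, "raw": stimulus}

  def regulation_layer(reflex_out):
      # Correlates and controls the layer below, and posts a summary upward.
      summary = {"alert": reflex_out["threat"]}
      workspace["body_state"] = summary        # broadcast into the workspace
      return summary

  def planning_layer():
      # Abstract thought: sees only what was broadcast, not the raw signals.
      state = workspace.get("body_state", {})
      return "avoid" if state.get("alert") else "explore"

  regulation_layer(reflex_layer(stimulus=0.9))
  print(planning_layer())   # -> avoid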

Integrated information theory

Integrated information theory is a mathematical theory of the degree to which information can be organized and integrated in a finite system.[19] In this view, awareness begins as a basic mind function that drives higher thoughts and cannot exist separately from them. Awareness becomes stronger the more integrated a mind becomes, but may decrease again if the mind becomes over-organized.
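
As a rough numerical illustration of the underlying intuition, the sketch below computes the total correlation of two binary variables: zero when the parts are independent, positive when the whole carries information that the parts alone do not. This is a deliberately simple proxy chosen for the example, not Tononi's phi measure.

  # Toy proxy for "integration": the total correlation (multi-information)
  # of two binary variables.  This is NOT integrated information theory's
  # phi; it only illustrates that an integrated whole can carry more
  # information than its parts taken separately.
  from math import log2
  from itertools import product

  def total_correlation(joint):
      """joint[(a, b)] = probability of observing the pair (a, b)."""
      pa = {a: sum(p for (x, _), p in joint.items() if x == a) for a in (0, 1)}
      pb = {b: sum(p for (_, y), p in joint.items() if y == b) for b in (0, 1)}
      return sum(p * log2(p / (pa[a] * pb[b]))
                 for (a, b), p in joint.items() if p > 0)

  independent = {pair: 0.25 for pair in product((0, 1), repeat=2)}
  correlated = {(0, 0): 0.5, (1, 1): 0.5, (0, 1): 0.0, (1, 0): 0.0}

  print(total_correlation(independent))   # 0.0 bits: the parts tell the whole story
  print(total_correlation(correlated))    # 1.0 bit: the whole exceeds its parts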

Enactive or embodied awareness theory

Mental processes do not depend only on inner representations. Awareness is a dynamic process, with essential information stored in the surrounding environment. Consciousness is a physical action: the brain interacting with the body and the world. A brain in a vat would be fundamentally less aware than a brain in a body.[20][21]

Universal Improver

A hypothetical computer program that could take any true statement and change it into a different statement that would also be true, while adding insight about the original statement. The new statement could be shorter, organizing or compressing the original; or it could be longer, elaborating on and explaining it. Any software consistently capable of doing this might have to be as complex as a conscious mind.
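
To make the thought experiment concrete, the hypothetical sketch below shows what the program's interface might look like, together with a deliberately trivial "improver" that only performs guaranteed truth-preserving rewrites. It handles the easy half of the task (preserving truth) while adding no insight, which is the half that might require the complexity of a conscious mind; all names are invented.

  # Hypothetical interface for the Universal Improver thought experiment.
  # The trivial version below restates the input and appends a tautology,
  # so its output is true whenever its input is, but no insight is added.
  def trivial_improver(statement: str) -> str:
      """Return a different statement that is true whenever `statement` is."""
      return f"It is the case that {statement}, and either it rains or it does not."

  print(trivial_improver("7 is prime"))
  # A genuine Universal Improver would instead return something like
  # "7 is prime because no integer from 2 to 6 divides it":
  # shorter or longer than the input, but with added insight.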

Bibliography

  • Consciousness Explained (1991) by Daniel Dennett
  • Explaining Consciousness: The Hard Problem (1999) by Jonathan Shear

References

  1. Rey 1997; Dennett 1978, 1988; Wilkes 1984; Ryle 1949. http://www.iep.utm.edu/hard-con/#H3
  2. The theory of Epiphenomenalism states that conscious properties can be caused by physical events, but they cannot in turn cause physical events.
  3. Explaining Consciousness: The Hard Problem (1999, A Bradford Book); Jonathan Shear (Editor) https://www.amazon.com/Explaining-Consciousness-Problem-Jonathan-Shear/dp/026269221X
  4. The Atlantic; Michael Graziano (Mar 9, 2016) https://www.theatlantic.com/science/archive/2016/03/phlegm-theories-of-consciousness/472812/
  5. "Brain Bank North West" quoting Giulio Tononi | http://thebrainbank.scienceblog.com/2013/03/04/what-is-consciousness-a-scientists-perspective/
  6. Dual aspect theory: Spinoza 1677/2005, P. Strawson 1959, Nagel 1986. Neutral monism: Russell 1926, Feigl 1958, Maxwell 1979, Lockwood 1989, Stubenberg 1998, Stoljar 2001, G. Strawson 2008.
  7. Harman 1990, Kind 2003
  8. Panpsychism: Leibniz 1714/1989, Whitehead 1929, Griffin 1998, Rosenberg 2005, Skrbina 2007.
  9. (Feb 2014) https://www.quora.com/Is-personal-consciousness-the-only-absolute-truth
  10. The Role of Consciousness in Memory; Franklin, Baars, et al. (2013) http://www.brains-minds-media.org/archive/150
  11. Dretske 1995; Tye 1995, 2000.
  12. original research (2005) http://protoscience.wikia.com/wiki/Consciousness_and_Memory
  13. "Is Conscious Awareness Inexplicable? The 'Hard Problem of Consciousness' Further Pinpointed" (Mar 30, 2015) David Navon; University of Haifa; Department of Psychology; https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2586656
  14. Bobby Matherne (1997) http://www.doyletics.com/arj/trprev.shtml
  15. registration required (retrieved Apr 21, 2017) https://www.coursehero.com/file/p4b7n9i/There-are-three-main-possibilities-a-Personal-Solipsism-b-Momentary-Solipsism/
  16. The Independent (Apr 30, 2016) http://www.independent.co.uk/news/science/free-will-could-all-be-an-illusion-scientists-suggest-after-study-that-shows-choice-could-just-be-a7008181.html
  17. Frederic Peters; "Consciousness as recursive, spatiotemporal self-location"; Psychological Research (Sep 2009)
  18. http://www.livescience.com/47096-theories-seek-to-explain-consciousness.html
  19. Scientific American; Christof Koch (Jul 01, 2009) https://www.scientificamerican.com/article/a-theory-of-consciousness/
  20. Hurley 1998; Noë 2005, 2009.
  21. Stanford Encyclopedia of Philosophy; retrieved Feb 13, 2017. https://plato.stanford.edu/entries/embodied-cognition/