Neurolaw

An example of an fMRI brain scan: fMRI outputs (yellow) are overlaid on brain anatomy (gray) averaged across several humans. Similar images are used in a variety of applications, now including law.

Neurolaw is an emerging field of interdisciplinary study that explores the effects of discoveries in neuroscience on legal rules and standards. Drawing from neuroscience, philosophy, social psychology, cognitive neuroscience, and criminology, neurolaw practitioners address not only the descriptive and predictive questions of how neuroscience is and will be used in the legal system, but also the normative questions of how it should and should not be used. The most prominent questions to emerge from this exploration include: To what extent can a tumor or brain injury mitigate criminal punishment? Can neuroscience inform sentencing and rehabilitation policy? Who is permitted access to images of a person's brain? Neuroscience is beginning to address these questions in its effort to understand human behavior, and it may shape future aspects of legal processes.[1][2]

Functional magnetic resonance imaging (fMRI) has provided new insights into the psychology and cognition of the brain, marking a break from the conventional and comparatively primitive views of the brain that prevailed in the legal system for centuries. Brain imaging offers much deeper insight into thought processes and will affect the law because it challenges customary beliefs about mental development. Because the science is still developing and because there is substantial opportunity for misuse, the legal community recognizes the need to proceed cautiously.

Neurolaw proponents are quickly finding ways to apply neuroscience to a variety of contexts; for example, intellectual property could be better evaluated through neuroscience. Major areas of current research include applications in the courtroom, how neuroscience can and should be used legally, and how the law itself is created and applied.[3][4]

History

Neuroscience and the law have interacted over a long history, but interest spiked in the late 1990s. After the term neurolaw was first coined by J. Sherrod Taylor in 1991,[5] scholars from both fields began to network through presentations and dialogues, which led to a growing body of books, articles, and other literature. The Gruter Institute for Law and Behavioral Research and the Dana Foundation were the first groups to fund the new interdisciplinary field. In parallel with the expansion of neurolaw, an ethics specific to neuroscience was also emerging. The intersection of neurolaw and ethics came under closer scrutiny with the launch of the Law and Neuroscience Project in 2007.[4] The MacArthur Foundation launched Phase I of the project with a $10 million grant in the hope of integrating the two fields. The initiative supported forty projects addressing a multitude of issues, including experimental and theoretical work intended to provide further evidence of how neuroscience may eventually shape the law.[6] The field has also piqued the interest of several universities. Baylor College of Medicine's Initiative on Neuroscience and the Law pursues research, education, and policy change.[7] The University of Pennsylvania's Center for Neuroscience and Society, founded in July 2009, works to confront the social, legal, and ethical implications of neuroscience.[8]

Neurocriminology

The term neurolaw was first put into practice by the neuroscientist and attorney J. Sherrod Taylor in 1991.[5] Taylor's book, Neurolaw: Brain and Spinal Cord Injury (1997), served as a resource for attorneys seeking to introduce medical terminology properly into the courtroom and further developed the implications of neuroscience for litigation. Taylor also explained the consequences of Daubert v. Merrell Dow Pharmaceuticals,[9] the United States Supreme Court case that produced what is now known as the Daubert standard, which governs the admissibility of scientific evidence in the courtroom.

Crime prediction

Behavioral testing and neuroimaging evidence offer a potentially accurate way to predict human behavior. Such predictions could be useful in determining criminal sentences and in deciding which offenders should be released on parole or kept in custody because of the risk of future offenses. Besides helping to assess the risk of recidivism, these methods could also indicate a need for individual rehabilitation. In light of this information and its potential applications, the legal system must strike a balance between just punishment and penalties based on predictions of additional criminal activity.[1]
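
To make the idea of behavioral prediction concrete, the sketch below shows how a simple logistic model could combine behavioral and imaging-derived measures into a single risk probability. It is a minimal illustration only: the features, weights, and values are invented for this example and are not drawn from any validated risk-assessment instrument or from the sources cited here.

```python
# Illustrative sketch: a logistic risk model over hypothetical features.
# Nothing here comes from a real instrument; all numbers are placeholders.
import math

def reoffense_probability(features, weights, intercept):
    """Logistic model: P = 1 / (1 + exp(-(intercept + sum_i w_i * x_i)))."""
    score = intercept + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-score))

weights = {
    "prior_offenses": 0.8,     # hypothetical weight per prior offense
    "age_at_release": -0.05,   # older age lowers the modeled risk
    "impulsivity_score": 0.6,  # e.g., a behavioral-test measure
}
features = {"prior_offenses": 2, "age_at_release": 30, "impulsivity_score": 1.5}

p = reoffense_probability(features, weights, intercept=-1.0)
print(f"modeled probability of reoffending: {p:.2f}")
# Such a number could inform a parole decision, but it cannot by itself
# settle questions of just punishment.
```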

Insanity defense

The tendency of the United States criminal justice system has been to limit the degree to which one can claim innocence based on mental illness. During the middle of the 20th century, many courts, through the Durham rule and the American Law Institute's Model Penal Code test, among others, regarded impaired volition as legitimate grounds for the insanity defense. However, after John Hinckley was acquitted by reason of insanity following his 1981 attempt to assassinate President Ronald Reagan, opinion reversed and the legal definition of mental illness narrowed. Decisions became increasingly based on the M'Naghten rules, under which an insanity defense fails unless the defendant proves that a mental illness kept him or her from knowing that the act was wrong or from understanding the nature of the criminal act. Contemporary research on the prefrontal cortex (PFC) has challenged this standard because it points to impaired volition as a genuine factor in mental illness, and many courts are again considering "irresistible impulse" as legitimate grounds for an insanity claim.[10] One claim neuroscience has added to the insanity defense is that the brain "made someone do it"; in these cases the argument rests on the understanding that decisions are made before the person is able to consciously realize what is happening. Further research on control and inhibition mechanisms will allow additional refinements of the insanity defense.[4] Impaired functioning of the PFC is cited as evidence that volition is a prime factor in mental illness. Many MRI experiments show that one function of the PFC is to bias a person toward the more difficult action, the one representing a long-term reward, when it competes with an action that leads to immediate satisfaction; the PFC is also involved in moral reasoning, including regret. Individual variations that impair the PFC are extremely detrimental to decision making and make a person more likely to commit a crime he or she would otherwise not have committed.[10]

Brain death

Injuries and illnesses that lead to a persistent vegetative state have pushed many ethical, legal, and scientific questions about brain death to the forefront.[11] It is difficult to know when someone is beyond hope of recovery, and to decide who has the right to determine when death is most appropriate. Research into patients' cognitive states has improved understanding of the vegetative state: a patient may be awake, with normal sleep and wake cycles, yet show no signs of awareness or recognition in response to external stimulation. In 2005, researchers studied a 23-year-old woman who had suffered severe head trauma in an automobile accident. She was diagnosed as being in a vegetative state; after five months she remained unresponsive but showed normal sleep and wake cycles. Using fMRI, researchers concluded that she was able to process external stimuli, responding with activity in specific regions of the brain; for example, there was increased activity in the middle and superior temporal gyri similar to that exhibited by control subjects. This positive response reveals the potential of medical imaging to clarify the implications of brain death and to help answer the legal, scientific, and ethical questions surrounding it.[12]

Nootropics

In addition to questions about how neuroscience should influence criminal and civil law, neurolaw also encompasses ethical questions regarding nootropics, more commonly known as mind-enhancing drugs. Many drugs are already known to affect the brain, caffeine's stimulant action being a familiar example, and current research suggests that the future may hold even more powerful medications that specifically target and alter brain function.[13] The potential to significantly improve concentration, memory, or cognition has raised numerous questions about the legality of these substances and their appropriateness for various uses, such as studying for an exam. In a controversy analogous to the use of anabolic steroids in professional sports, many high schools and universities are wary of students eventually using nootropics to artificially boost academic performance.

Some of the questions raised regarding the use of nootropics include:[14]

  • How will these enhancers affect performance gaps between income classes?
  • Will it become necessary to use an enhancing drug simply to remain competitive in society?
  • How does society distinguish between acceptable substances (e.g., caffeine) and unacceptable substances for altering one's mind?
  • Do people have the right to experiment with substances to modify their own cognition?

Scientists and ethicists have attempted to answer these questions while analyzing the overall effect on society. It is largely accepted that mind-enhancing drugs may be used by patients with cognitive disorders, as in the case of prescribing Adderall to children and adults with ADHD. At the same time, Adderall and Ritalin have become popular black-market drugs, most notably on college campuses, where students use them to maintain focus while completing large amounts of schoolwork.[15]

Technology

Much of neurolaw depends on state-of-the-art medical technology adapted to a new role in the legal system. Among the most prominent technologies and disciplines are magnetic resonance imaging (MRI), functional magnetic resonance imaging (fMRI), positron emission tomography (PET), and epigenetics. MRI and fMRI are particularly important because they allow detailed mapping of the human brain, raising the prospect of inferring aspects of another person's thoughts. fMRI, an extension of MRI, maps blood oxygenation to show which areas of the brain are most active at a given moment. Combined with knowledge of how the brain behaves in different situations (lying, remembering, and so on), functional neuroimaging could potentially serve as a modern form of lie detection. PET scans, by contrast, use a radioactive tracer injected into the body to image activity in brain tissue.[11]
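
As a rough illustration of how activation maps like the one pictured at the top of this article are produced, the sketch below uses the open-source nilearn library to display a thresholded fMRI statistical map over an anatomical template. The input file "stat_map.nii.gz" is a hypothetical result of an fMRI analysis, not data from any study discussed here, and the threshold value is arbitrary.

```python
# Minimal sketch, assuming a precomputed fMRI statistical map on disk:
# overlay it on an average anatomical template, hiding weak voxels.
from nilearn import datasets, plotting

template = datasets.load_mni152_template()   # gray anatomical background

display = plotting.plot_stat_map(
    "stat_map.nii.gz",    # hypothetical statistical map (e.g., a t-map)
    bg_img=template,      # averaged anatomy, as in the image above
    threshold=3.0,        # suppress voxels with weak statistics
    display_mode="z",     # axial slices
    cut_coords=5,
    title="Thresholded fMRI activation over anatomy",
)
display.savefig("overlay.png")  # save the figure for use elsewhere
plotting.show()
```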

Lie detection

Overview of the varying patterns of brain activity detected by fMRI in lie-detection studies: highlighted (yellow) regions show where the most activity occurs. TR and CR signals represent telling the truth, LT signals indicate a subject is withholding information, and LN signals indicate a subject is making up information.

With regard to neuroscience as a form of lie detection, specific regions of the brain have been analyzed to uncover patterns of truth telling, deception, and false memory. Notably, an important obstacle to any form of lie detection is that subjects may inadvertently recall false memories.[citation needed] This can be induced experimentally by presenting subjects with a list of semantically related words: while subjects believe their responses to be true, their recollections are in fact false. For instance, reading a long list of words including "moon," "sun," and "Mars" may lead a subject to believe incorrectly that the word "Earth" was on the list, even if it was not. This is a normal psychological occurrence, but it poses problems for a jury attempting to sort out the facts of a case. Researchers have therefore attempted to distinguish genuine truths from such "false truths." Subjects are quizzed on the word list while specific regions of the brain are monitored for activity. The dorsolateral prefrontal cortex, for instance, has been shown to activate when subjects pretend to know information they do not know, in contrast to truth telling and false recognition, whereas the right anterior hippocampus activates when a subject exhibits false recognition, in contrast to lying or accurately telling the truth. There remain limits, however, to how well brain imaging can distinguish among the many forms of truth and deception; future research aims, for example, to uncover patterns that differentiate genuinely forgetting an experience from actively choosing to withhold information.[16]
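
Conceptually, studies of this kind treat lie detection as a pattern-classification problem: estimate activation in a set of regions of interest (ROIs) for each trial, then test whether a classifier can separate the response types better than chance. The sketch below illustrates that framing with scikit-learn; the data are synthetic stand-ins, and the number of trials, ROIs, and labels are assumptions for illustration rather than the design of the cited research.

```python
# Hedged sketch: cross-validated classification of trial types
# (truth / lie / false memory) from synthetic ROI activation features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical dataset: one row per trial, one column per ROI
# (e.g., dorsolateral prefrontal cortex, right anterior hippocampus, ...).
n_trials, n_rois = 120, 8
X = rng.normal(size=(n_trials, n_rois))        # ROI activation estimates
y = rng.integers(0, 3, size=n_trials)          # 0=truth, 1=lie, 2=false memory

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)      # chance level here is about 1/3
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

With random data as above, accuracy hovers around chance; the scientific question is whether real ROI activations lift it meaningfully and reliably above that level.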

Research

The Stanford Center for Biomedical Ethics (SCBE) specifically analyzes the contribution of fMRI to legal, ethical, and social challenges so that its conclusions may support a reliable transfer of fMRI findings into policy recommendations and the clinical realm. The research focuses on identifying emerging trends in emotion, moral judgment, and other complex human behaviors. From the information the researchers obtain, an advisory board will compile a set of guidelines for interpreting the results. In analyzing the use of fMRI, they will assess the risks and accuracy of using these machines to quantitatively detect mental illness, with major depressive disorder as their main experimental model.[17]

Criticism

The use of neuroimaging in the legal system divides critics: many argue for its potential, while others argue that it cannot accurately replace human investigation in establishing criminal decision-making processes. Neuroimaging is still inadequately understood; the many variables it reflects, including medication, nutrition, and hormones, produce images that are highly complex and often impossible to interpret accurately. Other critics point out that the images do not reveal the brain's intentionality during the illegal act. Functional neuroimaging was not designed to measure volition, and while it may offer insight into the processes that cause behavior, it is debated whether the images can objectively capture human reasoning.[18] There are also many concerns about privacy.

Application in practice

Neurolaw has already been applied in various situations in the United States and other countries. Two companies, No Lie MRI and Cephos Corp, offer lie-detection services using fMRI. Advertising to lawyers, prosecutors, and other firms, they attempt to provide a twenty-first-century version of the traditional polygraph.[19] Because of variation in individuals' responses, however, the technology is not foolproof, and many are skeptical of its uses, even though it is often considered more advanced than the polygraph. United States courts rarely admit this kind of evidence; citing concerns about its scientific validity, judges have so far declined to allow Cephos and No Lie MRI tests into the courtroom.[4]

Criminal law

In Mumbai, India, the legal system has applied neuroscience more rapidly, already incorporating it into criminal convictions. In 2008, an Indian woman was convicted of murder based on strong circumstantial evidence, including a brain scan that was said to suggest her guilt. The conviction was sharply criticized by Hank Greely, a professor of law at Stanford University, who contested the scan, which had been produced by a brain electrical oscillations signature (BEOS) profiling test. No peer-reviewed scientific studies demonstrating the efficacy of BEOS had ever been published, raising questions about its reliability in such an important decision.[19]

In the United States, convicts have introduced brain scan results during the sentencing phase of trials. Because the court system allows nearly any mitigating evidence at sentencing, brain scans have faced fewer hurdles in this setting. In two instances, in California and New York, defendants had convictions of first-degree murder reduced to manslaughter after presenting brain scans suggesting impaired neurological function in the hope of mitigating their responsibility for the crime.[19] Brain images were also used as evidence for the defense in Harrington v. State of Iowa in 2003.[4]

Government and military

The United States military has become increasingly interested in the possibilities offered by neuroscience. In an effort to combat terrorism, officials hope to use modern technologies for a variety of purposes. Brain imaging might help distinguish enemy combatants from those who pose no risk, or help determine the mental stability of the military's own soldiers. Nootropic drugs could also be used to enhance soldiers' focus and memory, allowing better recognition of dangers and improved performance. However, these possibilities raise questions about the personal privacy of soldiers and detainees. While the general population normally has the right to refuse medication, soldiers may eventually face compulsory medication for the benefit of the overall mission.[citation needed] Questions about the accuracy of brain imaging also arise when detainees are tested for concealed information. Although the civilian court system is reluctant to use unproven technologies, military reliance on them could generate controversy over the possible innocence or guilt of enemy combatants.[20]

With the advent of novel technologies and findings in neuroscience, the military has begun to anticipate specific uses for such research. These approaches, which could alter human cognitive abilities and infringe on an individual's right to the privacy of his or her own thoughts, are still new and early in development, so their precise effects and potential influence are not yet well understood.[21] Existing treaties, such as the United Nations' Universal Declaration of Human Rights and the Chemical Weapons Convention, address only the use of certain chemical agents and therefore cannot adequately regulate the fast-paced evolution of cognitive science research.[21] Because of this ambiguity and the potential for misuse, it has become increasingly pressing to establish regulations controlling the extent to which neuroscience research can be employed in military functions.

Another area of military interest is the use of human-enhancement drugs. DARPA (the Defense Advanced Research Projects Agency), an agency of the United States Department of Defense, is responsible for developing emerging technologies for military use. One current DARPA effort, the Preventing Sleep Deprivation Program, studies the molecular processes and changes in the brain involved in sleep deprivation, with the ultimate aim of maintaining warfighters' cognitive abilities even when they are sleep deprived.[22] As a result of this research, drugs that counter the effects of sleep deprivation, such as modafinil and the ampakine CX717, have grown in significance. However, because these drugs directly affect natural chemical reactions and receptors in the body, both the ethics and the safety of their use are in question.[23]

Caveats

Neuroscience is a complex field and not well understood by the general public. Although experts recognize both the possibilities and the drawbacks of brain imaging reasonably well, others may either place too much confidence in the field or reject it entirely. Judges must decide whether particular neurological evidence is valid enough to enter the courtroom, and juries must not be too willing to place all their faith in neuroscience.[19] Owing to glorified depictions of forensics labs on popular television shows, brain imaging has drawn criticism for encouraging a "CSI effect": society may develop a false sense of what is possible with contemporary technologies and misjudge the value of the evidence being presented.[24]

References


