Human subject research

[Image: 1946 military human subject research on the effects of wind on humans]

Human subject research is systematic, scientific investigation that can be either interventional (a "trial") or observational (no "test article") and involves human beings as research subjects. Human subject research can be either medical (clinical) research or non-medical (e.g., social science) research.[1] Systematic investigation incorporates both the collection and analysis of data in order to answer a specific question. Medical human subject research often involves analysis of biological specimens, epidemiological and behavioral studies, and medical chart review studies.[1] (A specific, and especially heavily regulated, type of medical human subject research is the "clinical trial", in which drugs, vaccines and medical devices are evaluated.) Human subject research in the social sciences often involves surveys, questionnaires, interviews, and focus groups.

Human subject research is used in various fields, including research into basic biology, clinical medicine, nursing, psychology, sociology, political science, and anthropology. As research has become formalized, the academic community has developed formal definitions of "human subject research", largely in response to abuses of human subjects.

Human subjects

The United States Department of Health and Human Services (HHS) defines a human research subject as a living individual about whom a research investigator (whether a professional or a student) obtains (1) data through intervention or interaction with the individual, or (2) identifiable private information (32 C.F.R. 219.102(f)). (Lim, 1990)[2]

As defined by HHS regulations:

"Intervention"- physical procedures by which data is gathered and the manipulation of the subject and/or their environment for research purposes [45 C.F.R. 46.102(f)][2]

"Interaction"- communication or interpersonal contact between investigator and subject [45 C.F.R. 46.102(f)])[2]

"Private Information"- information about behavior that occurs in a context in which an individual can reasonably expect that no observation or recording is taking place, and information which has been provided for specific purposes by an individual and which the individual can reasonably expect will not be made public [45 C.F.R. 46.102(f)] )][2]

"Identifiable information"- specific information that can be used to identify an individual[2]

Human subject rights

In 2010, the National Institute of Justice in the United States published recommended rights of human subjects:

  • Voluntary, informed consent
  • Respect for persons: treatment as autonomous agents
  • The right to end participation in research at any time[3]
  • The right to safeguard integrity[3]
  • Benefits that outweigh costs
  • Protection from physical, mental and emotional harm
  • Access to information regarding the research[3]
  • Protection of privacy and well-being[4]

Ethical guidelines

Ethical guidelines that govern the use of human subjects in research are a relatively recent construct. The first regulations protecting subjects from abuses in the United States followed the passage of the Pure Food and Drug Act in 1906; regulatory bodies and mechanisms such as the Food and Drug Administration (FDA) and institutional review boards (IRBs) were gradually institutionalized thereafter. The policies these institutions implemented served to minimize harm to participants' mental and physical well-being.

Nuremberg Code

In 1947, German physicians who conducted deadly or debilitating experiments on concentration camp prisoners were prosecuted as war criminals at the Nuremberg Trials. That same year, the Allies established the Nuremberg Code, the first international document to support the concept that "the voluntary consent of the human subject is absolutely essential". The Nuremberg Code emphasized individual consent in order to prevent prisoners of war, patients, prisoners, and soldiers from being coerced into becoming human subjects, and to ensure that participants were informed of the risks and benefits of experimentation.

Declaration of Helsinki

The Declaration of Helsinki was established in 1964 to regulate international research involving human subjects. Established by the World Medical Association, the declaration recommended guidelines for medical doctors conducting biomedical research with human subjects. Among these guidelines were the principles that "research protocols should be reviewed by an independent committee prior to initiation" and that "research with humans should be based on results from laboratory and animal experimentation".

The Declaration of Helsinki is widely regarded as the cornerstone document on human research ethics.[5][6][7]

Clinical trials

Clinical trials are experiments done in clinical research. Such prospective biomedical or behavioral research studies on human participants are designed to answer specific questions about biomedical or behavioral interventions, including new treatments (such as novel vaccines, drugs, dietary choices, dietary supplements, and medical devices) and known interventions that warrant further study and comparison. Clinical trials generate data on safety and efficacy.[8] They are conducted only after they have received health authority/ethics committee approval in the country where approval of the therapy is sought. These authorities are responsible for vetting the risk/benefit ratio of the trial; their approval does not mean that the therapy is 'safe' or effective, only that the trial may be conducted.

Depending on product type and development stage, investigators initially enroll volunteers and/or patients into small pilot studies, and subsequently conduct progressively larger scale comparative studies. Clinical trials can vary in size and cost, and they can involve a single research center or multiple centers, in one country or in multiple countries. Clinical study design aims to ensure the scientific validity and reproducibility of the results.

Trials can be quite costly, depending on a number of factors. The sponsor may be a governmental organization or a pharmaceutical, biotechnology or medical device company. Certain functions necessary to the trial, such as monitoring and lab work, may be managed by an outsourced partner, such as a contract research organization or a central laboratory.

Human subjects in psychology and sociology

Stanford prison experiment

A study conducted by Philip Zimbardo in 1971 examined the effect of social roles on college students at Stanford University. Twenty-four male students were randomly assigned the role of prisoner or guard in a mock prison set up in the basement of a Stanford building. After only six days, the abusive behavior of the guards and the psychological suffering of the prisoners proved significant enough to halt the planned two-week experiment.[9]

Milgram experiment

In 1961, Yale University psychologist Stanley Milgram led a series of experiments to determine to what extent an individual would obey instructions given by an experimenter. Placed in a room with the experimenter, subjects played the role of a "teacher" to a "learner" situated in a separate room. The subjects were instructed to administer an electric shock to the learner whenever the learner answered a question incorrectly, and to increase the intensity of the shock with each wrong answer. The learner was a confederate (i.e., an actor), and the shocks were faked, but the subjects were led to believe otherwise. Both prerecorded sounds of electric shocks and the confederate's pleas for the punishment to stop were audible to the "teacher" throughout the experiment. When the subject raised questions or paused, the experimenter insisted that the experiment must continue. Despite widespread speculation that most participants would not continue to "shock" the learner, 65 percent of participants in Milgram's initial trial complied until the end of the experiment, continuing to administer shocks with purported intensities of up to "450 volts".[10][11] Although many participants questioned the experimenter and displayed various signs of discomfort, 65 percent of subjects were again willing to obey instructions to administer the shocks through the final level when the experiment was repeated.[12]

Asch conformity experiments

Psychologist Solomon Asch's classic conformity experiment in 1951 involved one subject participant and several confederates, who were asked to answer a series of simple, low-difficulty questions.[13] In each trial, the confederates gave their answers in turn, and the subject participant answered last. In a control group of participants, the error rate was less than one percent. However, when the confederates unanimously chose an incorrect answer, 75 percent of the subject participants agreed with the majority at least once. The study is regarded as significant evidence of the power of social influence and conformity.[14]

Robber's Cave study

A classic demonstration of realistic conflict theory, Muzafer Sherif's Robber's Cave experiment shed light on how group competition can foster hostility and prejudice.[15] In the 1961 study, two groups of ten boys each, who were not "naturally" hostile, were brought to Robber's Cave State Park, Oklahoma, without knowledge of one another.[16] The twelve-year-old boys bonded with their own groups for a week before the groups were set in competition with each other in games such as tug-of-war and football. Amid this competition, the groups resorted to name-calling and other displays of resentment, such as burning the other group's team flag. The hostility continued and worsened until the end of the three-week study, when the groups were forced to work together to solve problems.[16]

Bystander effect

The bystander effect was demonstrated in a series of famous experiments by Bibb Latané and John Darley.[16] In each of these experiments, participants were confronted with a type of emergency, such as witnessing a seizure or smoke entering through air vents. A common phenomenon was observed: as the number of witnesses or "bystanders" increased, so did the time it took individuals to respond to the emergency. The effect has been attributed to a diffusion of responsibility: when surrounded by others, an individual expects someone else to take action.[16]

Cognitive dissonance

Human subjects have been commonly used in experiments testing the theory of cognitive dissonance since the landmark study by Leon Festinger and Merrill Carlsmith.[17] In 1959, Festinger and Carlsmith devised a situation in which participants would undergo excessively tedious and monotonous tasks. After completing these tasks, the subjects were instructed to help the experiment continue in exchange for a variable amount of money: all they had to do was inform the next "student" waiting outside the testing area (who was secretly a confederate) that the tasks involved in the experiment were interesting and enjoyable. It was expected that the participants would not fully agree with the information they were imparting to the student; after complying, half of the participants were awarded $1, and the others were awarded $20. A subsequent survey showed that, by a large margin, those who received less money for essentially "lying" to the student came to believe that the tasks were far more enjoyable than their highly paid counterparts did.[17]

Unethical human experimentation

Unethical human experimentation violates the principles of medical ethics. It has been performed by countries including Nazi Germany, Imperial Japan, North Korea, the United States, and the Soviet Union. Examples include Project MKUltra, Unit 731, Totskoye nuclear exercise[18] and the experiments of Josef Mengele.

Nazi Germany performed human experimentation on large numbers of prisoners (including children) in its concentration camps, mainly in the early 1940s, during World War II and the Holocaust. The victims were largely Jews from across Europe, but also included Romani, Sinti, ethnic Poles, Soviet POWs, and disabled Germans. Prisoners were forced into participating; they did not willingly volunteer, and no consent was given for the procedures. Typically, the experiments resulted in death, trauma, disfigurement or permanent disability, and as such they are considered examples of medical torture. After the war, these crimes were tried at what became known as the Doctors' Trial, and the abuses perpetrated led to the development of the Nuremberg Code.[19] During the trial, 23 Nazi doctors and scientists were prosecuted for the unethical treatment of concentration camp inmates, who were often used as research subjects with fatal consequences. Of those 23, 16 were found guilty: 7 were condemned to death, 9 received prison sentences ranging from 10 years to life, and 7 were acquitted.[20]

Unit 731, a department of the Imperial Japanese Army located near Harbin (then in the puppet state of Manchukuo, in northeast China), experimented on prisoners by conducting vivisections, dismemberments, and bacterial inoculations. It induced epidemics on a very large scale from 1932 onward through the Second Sino-Japanese War.[21] It also conducted biological and chemical weapons tests on prisoners and captured POWs. With the expansion of the empire during World War II, similar units were set up in conquered cities such as Nanking (Unit 1644), Beijing (Unit 1855), Guangzhou (Unit 8604) and Singapore (Unit 9420). After the war, the Supreme Commander of the Occupation, Douglas MacArthur, gave immunity in the name of the United States to Shiro Ishii and all members of the units in exchange for all of the results of their experiments.[21]

During World War II, Fort Detrick in Maryland was the headquarters of US biological warfare experiments. Operation Whitecoat involved the injection of infectious agents into military personnel to observe their effects on human subjects.[22] Subsequent human experiments in the United States have also been characterized as unethical. They were often performed illegally, without the knowledge or informed consent of the test subjects. Public outcry over the discovery of government experiments on human subjects led to numerous congressional investigations and hearings, including the Church Committee, the Rockefeller Commission, and the Advisory Committee on Human Radiation Experiments, among others. The Tuskegee syphilis experiment, widely regarded as the "most infamous biomedical research study in U.S. history,"[23] was performed from 1932 to 1972 by the Tuskegee Institute under contract with the United States Public Health Service. The study followed more than 600 African-American men who were not told they had syphilis and were denied access to the known treatment of penicillin.[24] The revelations led to the 1974 National Research Act, which provided for the protection of human subjects in experiments. The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research was established and tasked with delineating the boundary between research and routine practice, the role of risk-benefit analysis, guidelines for participation, and the definition of informed consent. Its Belmont Report established three tenets of ethical research: respect for persons, beneficence, and justice.[25]

Notes

  1. [citation unavailable]
  2. [citation unavailable]
  3. [citation unavailable]
  4. [citation unavailable]
  5. WMA Press Release: WMA revises the Declaration of Helsinki. 9 October 2000.
  6. [citation unavailable]
  7. [citation unavailable]
  8. [citation unavailable]
  9. Zimbardo, P. G. (2007). The Lucifer Effect: Understanding How Good People Turn Evil. New York: Random House.
  10. [citation unavailable]
  11. [citation unavailable]
  12. [citation unavailable]
  13. Asch, S. E. (1951). "Effects of group pressure on the modification and distortion of judgments". In H. Guetzkow (Ed.), Groups, Leadership and Men (pp. 177–190). Pittsburgh, PA: Carnegie Press.
  14. Milgram, S. (1961). "Nationality and conformity". Scientific American, 205(6).
  15. [citation unavailable]
  16. Mook, Douglas (2004). Classic Experiments in Psychology. Greenwood Press.
  17. Cooper, Joel (2007). Cognitive Dissonance: Fifty Years of a Classic Theory. SAGE Publications.
  18. [citation unavailable]
  19. [citation unavailable]
  20. [citation unavailable]
  21. [citation unavailable]
  22. [citation unavailable]
  23. [citation unavailable]
  24. Gray, Fred D. (1998). The Tuskegee Syphilis Study. Montgomery: New South Books.
  25. [citation unavailable]
