Computer ethics

Computer ethics is a part of practical philosophy concerned with how computing professionals should make decisions regarding professional and social conduct.[1] Margaret Anne Pierce, a professor in the Department of Mathematics and Computers at Georgia Southern University, has categorized the ethical decisions related to computer technology and usage into three primary influences:

  1. The individual's own personal code.
  2. Any informal code of ethical conduct that exists in the work place.
  3. Exposure to formal codes of ethics.[2]

Foundation

To understand the foundation of computer ethics, it is important to look into the different schools of ethical theory. Each school of ethics pushes the final outcome of a situation in a particular direction.

Relativism is the belief that there are no universal moral norms of right and wrong. Ethicists in the relativist school divide it into two connected but distinct forms: moral (subjective) relativism and anthropological (cultural) relativism. Moral relativism is the idea that each person decides what is right and wrong for themselves. Anthropological relativism is the idea that right and wrong are decided by a society's actual moral belief structure.

Deontology is the belief that people's actions are to be guided by moral laws, and that these moral laws are universal. The origins of Deontological Ethics are generally attributed to the German philosopher Immanuel Kant and his ideas concerning the Categorical Imperative. Kant believed that in order for any ethical school of thought to apply to all rational beings, it must have a foundation in reason. Kant split this school into two categorical imperatives. The first categorical imperative states to act only from moral rules that you can at the same time will to be universal moral laws. The second categorical imperative states to act so that you always treat both yourself and other people as ends in themselves, and never only as a means to an end.

Utilitarianism is the belief that an action is good if it benefits someone and bad if it harms someone. This ethical belief can be broken down into two different schools: Act Utilitarianism and Rule Utilitarianism. Act Utilitarianism is the belief that an action is good if its overall effect is to produce more happiness than unhappiness. Rule Utilitarianism is the belief that we should adopt moral rules that, if followed by everybody, would lead to a greater level of overall happiness.

Social contract is the concept that, for a society to arise and maintain order, a morality-based set of rules must be agreed upon. Social contract theory has influenced modern government and is heavily involved with societal law. Philosophers like John Rawls, Thomas Hobbes, John Locke, and Jean-Jacques Rousseau helped create the foundation of social contract theory.

Virtue Ethics is the belief that ethics should be more concerned with the character of the moral agent (virtue), rather than focusing on a set of rules dictating right and wrong actions, as in the cases of deontology and utilitarianism, or a focus on social context, such as is seen with Social Contract ethics. Although concern for virtue appears in several philosophical traditions, in the West the roots of the tradition lie in the work of Plato and Aristotle, and even today the tradition’s key concepts derive from ancient Greek philosophy.

The conceptual foundations of computer ethics are investigated by information ethics, a branch of philosophical ethics established by Luciano Floridi.[citation needed] The term computer ethics was first coined by Walter Maner,[1] a professor at Bowling Green State University.[citation needed] Since the 1990s the field has started being integrated into professional development programs in academic settings.[citation needed]

History

The concept of computer ethics originated in the 1940s with MIT professor Norbert Wiener. While working on anti-aircraft artillery during World War II, Wiener and his fellow engineers developed a system of communication between the part of a cannon that tracked a warplane, the part that performed calculations to estimate a trajectory, and the part responsible for firing.[1] Wiener termed the science of such information feedback systems "cybernetics," and he discussed this new field with its related ethical concerns in his 1948 book, Cybernetics.[1][3] In 1950, Wiener's second book, The Human Use of Human Beings, delved deeper into the ethical issues surrounding information technology and laid out the basic foundations of computer ethics.[3]

In 1966, Joseph Weizenbaum, also a professor at MIT, published a simple program called ELIZA which performed natural language processing. In essence, the program functioned like a psychotherapist, using only open-ended questions to encourage patients to respond. The program applied pattern-matching rules to human statements to figure out its reply.
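
The pattern-matching approach behind ELIZA can be illustrated with a minimal sketch in Python. The rules, templates, and function below are hypothetical and greatly simplified, not Weizenbaum's original implementation; they only show how a reply can be built by matching a statement against canned patterns.

    import re

    # A minimal, hypothetical sketch of ELIZA-style pattern matching (not
    # Weizenbaum's original program, which was far more elaborate). Each rule
    # pairs a regular expression with a reply template that reuses captured text.
    RULES = [
        (re.compile(r"\bI am (.*)", re.IGNORECASE), "Why do you say you are {0}?"),
        (re.compile(r"\bI feel (.*)", re.IGNORECASE), "How long have you felt {0}?"),
        (re.compile(r"\bmy (.*)", re.IGNORECASE), "Tell me more about your {0}."),
    ]
    DEFAULT_REPLY = "Please go on."  # open-ended prompt when no rule matches

    def respond(statement):
        """Return a reply by applying the first matching pattern rule."""
        for pattern, template in RULES:
            match = pattern.search(statement)
            if match:
                return template.format(match.group(1).rstrip(".!?"))
        return DEFAULT_REPLY

    print(respond("I am worried about my exams"))
    # -> "Why do you say you are worried about my exams?"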

A bit later during the same year, the world's first computer crime was committed: a programmer was able to use a bit of computer code to stop his bank account from being flagged as overdrawn. However, there were no laws in place at that time to stop him, and as a result he was not charged.[4][unreliable source?] To make sure others did not follow suit, an ethics code for computers was needed.

In 1973, the Association for Computing Machinery (ACM) adopted its first code of ethics.[1] SRI International's Donn Parker,[5] an author on computer crimes, led the committee that developed the code.[1]

By the middle of the 1970s, new privacy and computer crime laws had been put in place in the United States as well as in Europe.[citation needed]

In 1976, medical teacher and researcher Walter Maner noticed that ethical decisions are much harder to make when computers are added. He saw a need for a separate branch of ethics to deal with computers, and the term "computer ethics" was thus coined.[1][3]

Also in 1976, Joseph Weizenbaum made his second significant addition to the field of computer ethics. He published a book titled Computer Power and Human Reason,[6] which argued that while artificial intelligence can be good for the world, it should never be allowed to make the most important decisions because it lacks human qualities such as wisdom. By far the most important point he makes in the book is the distinction between choosing and deciding. He argued that deciding is a computational activity while making choices is not, and that the ability to make choices is what makes us human.

Later during the same year, Abbe Mowshowitz, a professor of Computer Science at the City College of New York, published an article titled "On approaches to the study of social issues in computing," which identified and analyzed technical and non-technical biases in research on social issues in computing.

During 1978, the Right to Financial Privacy Act was adopted by the United States Congress, drastically limiting the government's ability to search bank records.[7]

During the same year, Terrell Ward Bynum, a professor of Philosophy at Southern Connecticut State University and Director of its Research Center on Computing and Society, developed the first curriculum for a university course on computer ethics.[citation needed] Bynum was also editor of the journal Metaphilosophy.[1] In 1983 the journal held an essay contest on the topic of computer ethics and published the winning essays in its best-selling 1985 special issue, "Computers and Ethics."[1]

In 1984, the United States Congress passed the Small Business Computer Security and Education Act, which created a Small Business Administration advisory council to focus on computer security related to small businesses.[8]

In 1985, James Moor, Professor of Philosophy at Dartmouth College in New Hampshire,[citation needed] published an essay called "What is Computer Ethics?"[3] In this essay Moor states that computer ethics includes the following: "(1) identification of computer-generated policy vacuums, (2) clarification of conceptual muddles, (3) formulation of policies for the use of computer technology, and (4) ethical justification of such policies."[this quote needs a citation]

During the same year, Deborah Johnson, Professor of Applied Ethics and Chair of the Department of Science, Technology, and Society in the School of Engineering and Applied Sciences at the University of Virginia,[citation needed] published the first major computer ethics textbook.[3] Johnson's textbook identified the major issues for research in computer ethics for more than 10 years after the publication of its first edition.[3]

In 1988, Robert Hauptman, a librarian at St. Cloud State University, coined the term "information ethics," which was used to describe the storage, production, access and dissemination of information.[4][unreliable source?] Around the same time, the Computer Matching and Privacy Protection Act was adopted, restricting United States government programs identifying debtors.[9]

In 1992, the ACM adopted a new set of ethical rules, the "ACM Code of Ethics and Professional Conduct,"[10] which consisted of 24 statements of personal responsibility.

Three years later, in 1995, Krystyna Górniak-Kocikowska, a Professor of Philosophy at Southern Connecticut State University, Coordinator of the Religious Studies Program, and Senior Research Associate in the Research Center on Computing and Society,[citation needed] proposed that computer ethics would eventually become a global ethical system and, soon after, would replace ethics altogether as the standard ethics of the information age.[3]

In 1999, Deborah Johnson expressed a view quite contrary to Górniak-Kocikowska's, arguing that computer ethics would not evolve into something new but would instead remain our old ethics with a slight twist.[4][unreliable source?]

Internet Privacy

Internet privacy is one of the key issues that has emerged since the evolution of the World Wide Web. Millions of internet users expose personal information online in order to sign up or register for thousands of different services. In doing so, they expose themselves on the internet in ways some may not realize.[11][better source needed]

Another example of a privacy concern is Google's tracking of searches. Google keeps track of searches so that advertisements can be matched to a user's search criteria, which in turn means using people as products. If you are not paying for a service online, you may very well be the product rather than the consumer.

There is an ongoing discussion about what privacy means and whether it is still needed. With the increase in social networking sites, more and more people are allowing their private information to be shared publicly. On the surface, this may look like an individual choosing to list private information about themselves on a social networking site, but below the surface it may be the site that is sharing the information, not the individual. This is the idea of an Opt-In versus Opt-Out situation. Many privacy statements specify whether their policy is Opt-In or Opt-Out. Typically, an Opt-In policy means that an individual's information is not shared unless the individual explicitly tells the company it may be shared; an Opt-Out policy means that the information will be shared unless the individual tells the company not to share it.
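
The practical difference between the two policies comes down to the default value of a sharing setting, as the hypothetical Python sketch below illustrates; the class and field names are illustrative and not drawn from any particular site's policy.

    from dataclasses import dataclass

    @dataclass
    class UserPrivacySettings:
        """Hypothetical account settings illustrating Opt-In vs. Opt-Out defaults."""
        share_data: bool  # whether the site may share this user's information

    # Opt-In policy: sharing is off by default; the user must explicitly enable it.
    opt_in_user = UserPrivacySettings(share_data=False)

    # Opt-Out policy: sharing is on by default; the user must explicitly disable it.
    opt_out_user = UserPrivacySettings(share_data=True)

    def may_share(settings):
        """A site should check the current setting before sharing anything."""
        return settings.share_data

    print(may_share(opt_in_user))   # False until the user opts in
    print(may_share(opt_out_user))  # True until the user opts out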

Software Conduct

Due to the unintentional presence of software bugs, computer programs have varying levels of reliability. Depending on the intent of a program, this sometimes raises ethical concerns relating to the production and testing of software. If a program does not behave predictably, as a result of misrepresentation or misimplementation, then a user may, for example:

  • Destroy personal or important data
  • Broadcast personal or inaccurate information (see ripple effect)
  • Misuse robotic devices and potentially damage the device, self or others

To prevent these issues and potential dilemmas, developers can employ software testing to detect and fix defects, along with effective standards of software design and maintenance to systematically ensure that software is created as intended the first time. Consumer testing can also resolve issues of software representation and help ensure that sufficient and accurate instructions are provided with a program.
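
To make the role of testing concrete, the short Python sketch below shows how a unit test can catch a defect before it reaches users; the overdraft helper and its tests are hypothetical illustrations, not code from any real system.

    import unittest

    def is_overdrawn(balance_cents):
        """Hypothetical helper: an account is overdrawn when its balance is negative."""
        return balance_cents < 0

    class OverdraftTests(unittest.TestCase):
        """Unit tests that would catch a defect (or tampering) in the flagging logic."""

        def test_negative_balance_is_flagged(self):
            self.assertTrue(is_overdrawn(-1))

        def test_zero_or_positive_balance_is_not_flagged(self):
            self.assertFalse(is_overdrawn(0))
            self.assertFalse(is_overdrawn(2500))

    if __name__ == "__main__":
        unittest.main()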

Identifying issues

Identifying ethical issues as they arise, as well as defining how to deal with them, has traditionally been problematic. For solving problems relating to ethical issues, Michael Davis proposed a problem-solving method in which the ethical problem is stated, the facts are checked, and a list of options is generated by considering relevant factors relating to the problem. The action actually taken should then be guided by specific ethical standards.[citation needed]

Some questions in computer ethics

There are a number of computer-based ethical dilemmas that are frequently discussed. One set of issues deals with some of the new ethical dilemmas that have emerged, or taken on new form, with the rise of the Internet and social networking. There are now many ways to gain information about others that were not available, or not easily available, before the rise of computers. Ethical issues about the storage of personal information are thus becoming an ever-increasing problem, and with more storage of personal data for social networking arises the problem of selling that information for monetary gain. This gives rise to different ethical situations regarding access, security, and the use of hacking in positive and negative situations.

Situations regarding the copyright infringement of software, music, and movies are increasingly discussed with the rise of file-sharing programs such as Napster, Kazaa, and the BitTorrent protocol. The ethical question that arises from such piracy is: is it immoral or wrong to copy software, music, or movies?

A second set of questions pertaining to the Internet and its societal influence concerns the values that some may wish to promote via the Internet. Some have claimed that the Internet is a "democratic technology." Does the Internet foster democracy and freedom of speech? What are the ethical implications of this process for the world? Does the digital divide raise ethical issues, such that society is morally obligated to spread the ability to access different forms of electronic communication?

Ethical standards

Various national and international professional societies and organizations have produced code of ethics documents to give basic behavioral guidelines to computing professionals and users. They include:

Association for Computing Machinery
ACM Code of Ethics and Professional Conduct[10]
Australian Computer Society
ACS Code of Ethics[12]
ACS Code of Professional Conduct[13]
British Computer Society
BCS Code of Conduct[14]
Code of Good Practice (retired May 2011)[15]
Computer Ethics Institute
Ten Commandments of Computer Ethics
IEEE
IEEE Code of Ethics[16]
IEEE Code of Conduct[17]
League of Professional System Administrators
The System Administrators' Code of Ethics[18]

References

  1. 1.0 1.1 1.2 1.3 1.4 1.5 1.6 1.7 1.8 Lua error in package.lua at line 80: module 'strict' not found.
  2. Lua error in package.lua at line 80: module 'strict' not found. (subscription required)
  3. 3.0 3.1 3.2 3.3 3.4 3.5 3.6 Lua error in package.lua at line 80: module 'strict' not found.
  4. 4.0 4.1 4.2 Lua error in package.lua at line 80: module 'strict' not found.
  5. Lua error in package.lua at line 80: module 'strict' not found.
  6. Lua error in package.lua at line 80: module 'strict' not found.
  7. Lua error in package.lua at line 80: module 'strict' not found.
  8. Small Business Computer Security and Education Act of 1984 at Congress.gov
  9. Computer Matching and Privacy Protection Act of 1988 at Congress.gov
  10. 10.0 10.1 Lua error in package.lua at line 80: module 'strict' not found.
  11. CSC300 Lecture Notes, University of Toronto, 2011. See also the Electronic Privacy Information Center website.
  12. Lua error in package.lua at line 80: module 'strict' not found.
  13. Lua error in package.lua at line 80: module 'strict' not found.
  14. Lua error in package.lua at line 80: module 'strict' not found.
  15. Lua error in package.lua at line 80: module 'strict' not found.
  16. Lua error in package.lua at line 80: module 'strict' not found.
  17. Lua error in package.lua at line 80: module 'strict' not found.
  18. Lua error in package.lua at line 80: module 'strict' not found.
