Chapter 04
Ethical and Social Issues in Information Systems
Ethical and Social Issues
• Google is the largest Web tracker, monitoring thousands of Web sites.
Information Systems and Ethics
Moral Dimensions
The introduction of new information technology has a ripple effect, raising new ethical, social, and legal issues that must be dealt with on the individual, social, and political levels. These issues have five moral dimensions: information rights and obligations, property rights and obligations, system quality, quality of life, and accountability and control.
– Information rights and obligations
• What information rights do individuals possess with respect to themselves? What can they protect?
– Property rights and obligations
• How will traditional intellectual property rights be protected in a digital society in which tracing and accounting for ownership are difficult?
– System quality
– Quality of life
– Accountability and control
Technology Trends Affecting Ethics
– Advances in data analysis techniques
• Profiling
– Combining data from multiple sources to create dossiers of detailed information on individuals
• Nonobvious relationship awareness (NORA)
– Combining data from multiple sources to find obscure hidden connections that might help identify criminals or terrorists
– Mobile device growth
• Tracking of individual cell phones
NORA technology can take information about people from disparate sources and find obscure, nonobvious relationships. It might discover, for example, that an applicant for a job at a casino shares a telephone number with a known criminal and issue an alert to the hiring manager.
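To make the NORA idea concrete, here is a minimal Python sketch that correlates records from two hypothetical sources on a shared telephone number. The data, field names, and alert step are illustrative assumptions, not any vendor's actual NORA product.

# Minimal sketch of the NORA idea: correlate records from disparate sources
# on a shared attribute (here, a telephone number). All data, field names,
# and the alert step are hypothetical.

job_applicants = [
    {"name": "A. Applicant", "phone": "555-0142"},
    {"name": "B. Applicant", "phone": "555-0199"},
]

criminal_watchlist = [
    {"name": "C. Offender", "phone": "555-0142"},
]

def find_nonobvious_links(applicants, watchlist):
    """Return (applicant, watchlist entry) pairs that share a phone number."""
    by_phone = {rec["phone"]: rec for rec in watchlist}
    return [(a, by_phone[a["phone"]]) for a in applicants if a["phone"] in by_phone]

for applicant, match in find_nonobvious_links(job_applicants, criminal_watchlist):
    # In a real system this would raise an alert to the hiring manager.
    print(f"ALERT: {applicant['name']} shares phone {applicant['phone']} "
          f"with watchlist entry {match['name']}")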
Ethics in an Information Society
Ethics is a concern of humans who have freedom of choice.
• Five-step ethical analysis
1. Identify and describe the facts clearly.
2. Define the conflict or dilemma and identify the higher-order values involved.
3. Identify the stakeholders: players in the game who have an interest in the outcome, who have invested in the situation, etc.
4. Identify the options that you can reasonably take: arriving at a good or ethical solution may not always be a balancing of consequences to stakeholders.
5. Identify the potential consequences of your options: some options may be ethically correct but disastrous from other points of view; other options may work in one instance but not in other similar instances.
Once your analysis is complete, what ethical approaches or rules should you use to make a decision?
Although you are the only one who can decide which among many ethical principles you will follow, and how you will prioritize them, it is helpful to consider some ethical approaches with deep roots in many cultures that have survived throughout recorded history.
• Candidate ethical principles
– Golden Rule
• Do unto others as you would have them do unto you: put yourself in the place of others.
– Categorical Imperative
• If an action is not right for everyone to take, it is not right for anyone. Ask yourself, “If everyone did this, could the organization, or society, survive?”
– Rule of Change
• If an action cannot be taken repeatedly, it is not right to take at all. An action may bring about a small change now that is acceptable, but if it is repeated, it would bring unacceptable changes in the long run.
– Utilitarian Principle
• Take the action that achieves the higher or greater value. This rule assumes you can prioritize values in rank order and understand the consequences of various courses of action.
– Risk Aversion Principle
• Take the action that produces the least harm or potential cost. Some actions have extremely high failure costs of very low probability (e.g., building a nuclear generator).
– Ethical “No Free Lunch” Rule
• Assume that virtually all tangible and intangible objects are owned by someone unless there is a specific declaration otherwise (this relates to copyrights, patents, etc.).
• Professional codes of conduct
– Proclaimed by associations of professionals
• Examples: American Medical Association (AMA)
– Promises by professions to regulate themselves in the general
interest of society (e.g. avoiding harm to others)
– These professional groups take responsibility for the partial regulation of their professions by determining entrance qualifications and competence (i.e., they set the rules for who is qualified to practice).
• Real-world ethical dilemmas
– One set of interests pitted against another
• Example: the right of a company to maximize the productivity of its workers versus the workers’ right to use the Internet for short personal tasks; replacing people with technology; reducing the size of the workforce; etc.
Moral Dimensions – Information Rights
• Information rights: privacy and freedom in the Internet age
– Privacy:
• Claim of individuals to be left alone, free from surveillance
or interference from other individuals, organizations, or
state; claim to be able to control information about yourself
• Information technology and systems threaten individual
claims to privacy
• Fair information practices (FIP):
– Set of principles governing the collection and use of information
• Basis of most U.S. and European privacy laws
• Based on mutuality of interest between the record holder and the individual: once information is gathered by the record holder, the individual maintains an interest in the record, and the record may not be used to support other activities without the individual’s consent.
• Restated and extended by FTC (Federal Trade Commission) in
1998 to provide guidelines for protecting online privacy
– Used to drive changes in privacy legislation
• COPPA, Gramm-Leach-Bliley Act, HIPAA, Do-Not-Track Online Act of 2011
• FTC FIP principles:
– Notice/awareness (core principle)
• Web sites must disclose their practices before collecting data, e.g., uses of data, other recipients of data, etc.
– Choice/consent (core principle)
• Consumers must be able to choose how information is used for secondary purposes.
– Access/participation
• Consumers must be able to review and contest the accuracy of personal data.
– Security
• Data collectors must take steps to ensure the accuracy and security of personal data.
– Enforcement
• There must be a mechanism in place to enforce FIP principles.
• European Directive on Data Protection:
– Companies must inform people that information is
collected and disclose how it is stored and used.
– Requires informed consent of customer.
– EU member nations cannot transfer personal data to
countries without similar privacy protection (e.g., the
United States).
• Internet challenges to privacy:
– Cookies
• Identify browser and track visits to site
• Super cookies (Flash cookies)
– Web beacons (Web bugs)
• Tiny graphics embedded in e-mails and Web pages
• Monitor who is reading an e-mail message or visiting a site (see the sketch after this list)
– Spyware
• Installed on the user’s computer
• May transmit the user’s keystrokes or display unwanted ads
– Google services and behavioral targeting
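As a concrete illustration of the Web beacon item above, here is a minimal Python sketch of the server side of a tracking pixel: an e-mail embeds a 1x1 image whose URL carries a recipient identifier, so fetching the image reveals that the message was opened. The URL path, the uid parameter, the port, and the log format are illustrative assumptions.

# Minimal sketch of the server side of a Web beacon (tracking pixel).
# An e-mail would embed it as: <img src="http://tracker.example/beacon.gif?uid=123">
# The path, the "uid" parameter, and the log format are hypothetical.
import base64
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# A 1x1 transparent GIF, small enough to go unnoticed in a message.
PIXEL = base64.b64decode("R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7")

class BeaconHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        params = parse_qs(urlparse(self.path).query)
        uid = params.get("uid", ["unknown"])[0]
        # The request itself is the signal: this recipient opened the message.
        print(f"beacon hit: uid={uid}, client={self.client_address[0]}")
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Content-Length", str(len(PIXEL)))
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    HTTPServer(("localhost", 8001), BeaconHandler).serve_forever()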
Cookies are written by a Web site on a visitor’s hard drive. When the visitor
returns to that Web site, the Web server requests the ID number from the cookie
and uses it to access the data stored by that server on that visitor. The Web site
can then use these data to display personalized information.
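The sketch below shows this cookie round trip using only the Python standard library: the server assigns an ID on the first visit, the browser returns it on later visits, and the server looks up the data it has stored under that ID. The cookie name, port, and in-memory visit store are illustrative assumptions.

# Minimal sketch of the cookie mechanism described above.
# The cookie name ("visitor_id") and the in-memory store are hypothetical.
import uuid
from http import cookies
from http.server import BaseHTTPRequestHandler, HTTPServer

visit_counts = {}  # visitor_id -> number of visits (stands in for server-side data)

class CookieDemoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        jar = cookies.SimpleCookie(self.headers.get("Cookie", ""))
        if "visitor_id" in jar:
            visitor_id = jar["visitor_id"].value      # returning visitor: read the ID
        else:
            visitor_id = uuid.uuid4().hex             # first visit: assign a new ID

        visit_counts[visitor_id] = visit_counts.get(visitor_id, 0) + 1

        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        # Ask the browser to store (or refresh) the ID for future visits.
        self.send_header("Set-Cookie", f"visitor_id={visitor_id}; Path=/")
        self.end_headers()
        body = f"Visitor {visitor_id[:8]}..., visit number {visit_counts[visitor_id]}\n"
        self.wfile.write(body.encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CookieDemoHandler).serve_forever()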
>> Google has been using behavioral targeting to help it
display more relevant ads based on users’ search activities.
>> An opt-out model permits the collection of personal information until
the consumer specifically requests that the data not be collected.
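To tie the two points above together, here is a minimal Python sketch of behavioral targeting gated by a consent model, contrasting opt-out with opt-in. The user records, ad choices, and default-consent rules are illustrative assumptions, not Google's actual implementation.

# Minimal sketch contrasting opt-out and opt-in consent models for
# behavioral targeting. All data and rules are hypothetical.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    user_id: str
    search_history: list = field(default_factory=list)
    opted_out: bool = False   # opt-out model: tracking allowed until this is set
    opted_in: bool = False    # opt-in model: tracking forbidden until this is set

def may_track(user: UserProfile, model: str) -> bool:
    """Return True if behavioral data may be used under the given consent model."""
    if model == "opt-out":
        return not user.opted_out   # collection is the default
    if model == "opt-in":
        return user.opted_in        # collection requires explicit consent
    raise ValueError(f"unknown consent model: {model}")

def choose_ad(user: UserProfile, model: str) -> str:
    if may_track(user, model) and user.search_history:
        # Behavioral targeting: pick an ad based on recent searches.
        return f"ad for {user.search_history[-1]}"
    return "generic ad"

alice = UserProfile("alice", search_history=["hiking boots"])
print(choose_ad(alice, "opt-out"))   # targeted: she never asked to stop collection
alice.opted_out = True
print(choose_ad(alice, "opt-out"))   # generic: she opted out

bob = UserProfile("bob", search_history=["laptops"])
print(choose_ad(bob, "opt-in"))      # generic: he never opted in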
Moral Dimensions – Property Rights
• Property rights: Intellectual property
– Intellectual property: intangible property of any
kind created by individuals or corporations
– Information technology has made it difficult to
protect intellectual property because
computerized information can be so easily copied
or distributed on networks.
– Three main ways that intellectual property is
protected:
• Trade secret: intellectual work or product belonging to a business (e.g., the formula for Coke)
• Copyright: a statutory grant protecting intellectual property from being copied for the life of the author plus 70 years (e.g., the copyright on a photo or book)
• Patents: grant the owner an exclusive monopoly on the ideas behind an invention for 20 years (e.g., Amazon’s One-Click shopping)
• Challenges to intellectual property rights
– Digital media different from physical media (e.g., books)
• Digital media differ from books, periodicals, and other media in terms of ease
of replication; ease of transmission; ease of alteration; difficulty in classifying
a software work as a program, book, or even music; compactness—making
theft easy; and difficulties in establishing uniqueness.
• Mechanisms are being developed to sell and distribute
books, articles, and other intellectual property legally on
the Internet, and the Digital Millennium Copyright Act
(DMCA) of 1998 is providing some copyright protection.
Internet service providers (ISPs) are required to take down
sites of copyright infringers that they are hosting once they
are notified of the problem.
Moral Dimensions – Accountability, Liability, Control
• Computer-related liability problems
– If software fails, who is responsible?
– If a person is injured by a machine controlled, in part,
by software, who should be held accountable and,
therefore, held liable?
– When offensive material is uploaded, is responsibility borne by the website developers (e.g., YouTube) or by the broadcasters?
– In conclusion, it is difficult to ascribe liability to software developers for the same reason that it is difficult to ascribe liability to a publisher for the effects of a book.
Moral Dimensions – System Quality
• System quality: Data quality and system errors
– Liability and accountability for unintentional consequences lead to the next moral dimension: system quality.
– What is an acceptable, technologically feasible level of
system quality?
– Individuals and organizations may be held responsible for avoidable and foreseeable consequences, which they have a duty to perceive and correct. The gray area is that some system errors are foreseeable and correctable only at very great expense, an expense so great that pursuing this level of perfection is not feasible economically: no one could afford the product (so in gray areas, what should organizations do?).
– Three principal sources of poor system
performance:
• Software bugs, errors
• Hardware failures
• Poor input data quality
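To illustrate the last of these sources, here is a minimal Python sketch of an input data quality check; the record fields, validation rules, and sample data are illustrative assumptions.

# Minimal sketch of an input data quality check, one source of poor
# system performance. Fields, rules, and sample records are hypothetical.
from datetime import date

def validate_customer_record(record: dict) -> list[str]:
    """Return a list of data quality problems found in one customer record."""
    problems = []
    if not record.get("customer_id"):
        problems.append("missing customer_id")
    if "@" not in record.get("email", ""):
        problems.append(f"malformed email: {record.get('email')!r}")
    birth_year = record.get("birth_year")
    if birth_year is None or not (1900 <= birth_year <= date.today().year):
        problems.append(f"implausible birth_year: {birth_year!r}")
    return problems

records = [
    {"customer_id": "C-001", "email": "ann@example.com", "birth_year": 1985},
    {"customer_id": "",      "email": "bob-at-example",  "birth_year": 1850},
]

for rec in records:
    issues = validate_customer_record(rec)
    status = "OK" if not issues else "; ".join(issues)
    print(rec.get("customer_id") or "<no id>", "->", status)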
Moral Dimensions – Quality of Life
• Quality of life: equity, access, boundaries
• Negative social consequences of systems
- Employment:
• Reengineering work resulting in lost jobs
- Equity and access (the digital divide):
• Certain ethnic and income groups in the United States are less likely to have computers or Internet access. For example, a similar digital divide exists in U.S. schools, with schools in high-poverty areas less likely to have computers, high-quality educational technology programs, or Internet access for their students.
- Health risks:
• Repetitive stress injury (RSI)
– Largest source is computer keyboards
– Carpal tunnel syndrome (CTS)
• Computer vision syndrome (CVS)
– Eyestrain and headaches related to screen use
• Technostress
– Aggravation, impatience, fatigue