A Framework For The Ethical Impact Assessment of Information Technology
D. Wright, Trilateral Research and Consulting, 22 Argyll Court, 82-84 Lexham Gardens, London W8 5JB, UK. E-mail: [email protected]
DOI 10.1007/s10676-010-9242-6
Introduction
Objective
The objective of this paper is to propose an ethical impact
assessment framework that could be used by those developing
new technologies, services, projects, policies or programmes
as a way to ensure ethical implications are adequately examined by stakeholders before deployment and so that mitigating
measures can be taken as necessary. The framework could be
used in many different contexts, wherever the decision-maker
perceives a need to take the ethical considerations of stakeholders into account.
Here are some examples of where an ethical impact
assessment could help or could have helped project managers or policy-makers identify ethical issues before
deploying a technology or service:
- Google introduced its Buzz social network in February 2010 without adequate consideration of the ethical or privacy impacts. Google developed Buzz as a rival to Facebook by creating instant and automatic social networks for users of its Gmail service. The snag was that it did not ask users whether they wanted a social network composed of the people whom they e-mailed, no matter how frequently. As a New York Times reporter observed, "E-mail, it turns out, can hold many secrets, from the names of personal physicians and illicit lovers to the identities of whistle-blowers and antigovernment activists" (Helft 2010). Surprised by the firestorm of criticism, Google had to make changes to Buzz within a few days of its introduction. If it had carried out an ethical impact assessment in advance of making Buzz operational, it might have avoided the flak.
- Is it ethically acceptable to electronically tag those with incipient dementia who may go wandering from assisted living facilities? While it may be ethically correct not to hold such people as virtual prisoners within the confines of a residence, is it ethically acceptable to keep them under constant surveillance? Even if they consented to be tagged, can their consent be regarded as informed?
Target audience
The ethical impact assessment proposed in this paper is
primarily aimed at those who are developing or intend to
develop an information technology project, policy or programme that may have ethical implications. More specifically, this would include industry players when they are
developing a new technology or planning a new service as
well as policy-makers and regulatory authorities when they
are considering a new policy or regulation. In addition, the
ethical impact assessment framework should be of interest
to civil society organisations, so that when they become
aware of proposals or plans for new technologies, they can
advocate the framework's use and their involvement in the
decision-making process. Other stakeholders, such as academics, may find the ethical impact assessment framework
of interest too and, as a consequence, may be able to
suggest improvements or to analyse its use. It might also be
of interest to the media as background to any stories they
prepare on the introduction of a new technology or service,
which in turn will help raise the awareness of the public
and other stakeholders about the associated ethical issues.
Nominally, an ethical impact assessment of a new or
emerging technology should target stakeholders interested
in or affected by the outcome. In the first instance, the
policy-maker or technology developer or project manager
should identify the stakeholders he or she thinks relevant,
but in most cases he or she should be open to or even
encourage other stakeholders to contribute to the assessment. To ensure those participating in an ethical impact assessment are truly representative of the relevant stakeholder groups, the technology developer or policy-maker may need to make some special efforts to engage the relevant stakeholders in order to avoid something akin to regulatory capture.
The construction of an ethical impact assessment framework, as proposed in this paper, draws on various sources
with regard to values, different types of impact assessment
and the role that IT plays.
With specific regard to values, it draws on those stated
in the EU Reform Treaty, signed by Heads of State and
Government at the European Council in Lisbon on 13
December 2007, such as human dignity, freedom, democracy, human rights protection, pluralism, non-discrimination, tolerance, justice, solidarity and gender equality (http://eur-lex.europa.eu/JOHtml.do?uri=OJ:C:2007:306:SOM:EN:HTML). These values are also stated in the Charter of Fundamental Rights of the European Union (http://www.europarl.europa.eu/charter/pdf/text_en.pdf) and are key features of the ethical impact assessment framework proposed here too.
The notion of examining the ethical impacts of information technology has been gaining traction ever since Moor (1985) published his article more than a quarter of a century ago. Of more recent provenance is the work done by Skorupinski and Ott (2002), who argued that technology assessment (TA) is "not detachable" (their descriptor) from ethical questions, for several reasons.

Technology assessments as an instrument for counselling political decision-makers were given a major impetus with the establishment of the Office of Technology Assessment (OTA) by the US Congress in 1972. Similar organisations were subsequently established in Europe, both at the Member State level (e.g., the Danish Board of Technology) and at the European level (e.g., the European Parliament's office of Science and Technology Options Assessment (STOA)). STOA is a member of the European Parliamentary Technology Assessment Network (EPTA), whose other members include the national parliamentary technology assessment bodies of Denmark, Finland, France, Germany, Greece, Italy, the Netherlands and the United Kingdom.
More recently, Beekman et al. view the ethical assessment of the application of new technologies as complementary to, rather than an alternative to, scientific risk assessments and economic cost-benefit assessments. Taken together, they say, these ethical, scientific and economic assessments should provide a sound basis for socio-political decision-making.
Ethical principles
The framework is structured on the four principles posited by Beauchamp and Childress (2001), together with a separate section on privacy and data protection. Under these major principles are some values and/or issues, followed by some brief explanatory text and a set of questions aimed at the technology developer or policy-maker to facilitate a consideration of the ethical issues which may arise in their undertaking. Values and issues are clustered together because of their relation to the overarching principles and because they will generate debate among stakeholders. For example, everyone would subscribe to the shared value of dignity, but dignity could also become an issue in particular contexts: does an emerging technology respect the dignity of the individual? Is dignity compromised? What is meant by dignity in the given context?

The framework draws on various sources (see the References) in compiling these questions. No doubt more issues and questions could be added, and some questions could be framed differently; that's fine. To some extent, the issues and questions set out here should be regarded as indicative rather than comprehensive.
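Because the framework is essentially a structured checklist (principles containing values/issues, each with explanatory text and questions), a team that wants to manage it alongside a project can represent it directly as data. The following Python sketch is purely illustrative and is not part of the framework itself; the class and field names and the sample content are assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Issue:
    # A value or issue (e.g., "Dignity") clustered under an overarching
    # principle, with brief explanatory text and its set of questions.
    name: str
    explanation: str
    questions: list[str] = field(default_factory=list)

@dataclass
class Principle:
    # One of the overarching ethical principles.
    name: str
    issues: list[Issue] = field(default_factory=list)

# Hypothetical fragment of the framework, for illustration only.
framework = [
    Principle(
        name="Respect for autonomy",
        issues=[
            Issue(
                name="Dignity",
                explanation="Human dignity is inviolable; it must be "
                            "respected and protected.",
                questions=[
                    "Does the emerging technology respect the dignity "
                    "of the individual?",
                    "Is dignity compromised?",
                    "What is meant by dignity in the given context?",
                ],
            ),
        ],
    ),
]

# Walk the checklist, e.g., to generate an assessment template.
for principle in framework:
    for issue in principle.issues:
        for question in issue.questions:
            print(f"[{principle.name} / {issue.name}] {question}")
```

A structure like this makes it straightforward to record answers per question, track which issues remain open, and regenerate the assessment as the project evolves.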
Dignity
Dignity is a key value, as evidenced by its being the subject of Article 1 ("Human dignity is inviolable. It must be respected and protected.") of the Charter of Fundamental Rights, as well as Article 25, which specifically refers to the rights of the elderly ("The Union recognises and respects the rights of the elderly to lead a life of dignity and independence and to participate in social and cultural life."). Dignity also features in Article 1 of the UN's Universal Declaration of Human Rights, which states that "All human beings are born free and equal in dignity and rights."
Informed consent
Questions
- Will the project obtain the free and informed consent of those persons to be involved in or affected by the project? If not, why not?
- Will the person be informed of the nature, significance, implications and risks of the project or technology?
- Will such consent be evidenced in writing, dated and signed, or otherwise marked, by that person so as to indicate his consent?
- If the person is unable to sign or to mark a document so as to indicate his consent, can his consent be given orally in the presence of at least one witness and recorded in writing?
- Does the consent outline the use for which data are to be collected, how the data are to be collected, instructions on how to obtain a copy of the data, a description of the mechanism to correct any erroneous data, and details of who will have access to the data?
- If the individual is not able to give informed consent (because, for example, the person suffers from dementia) to participate in a project or to use a technology, will the project representatives consult with close relatives, a guardian with powers over the person's welfare or professional carers? Will written consent be obtained from the patient's legal representative and his doctor?
- Will the person have an interview with a project representative in which he will be informed of the objectives, risks and inconveniences of the project or research activity and the conditions under which the project is to be conducted?
- Will the person be informed of his right to withdraw from the project or trial at any time, without being subject to any resulting detriment or the foreseeable consequences of declining to participate or withdrawing?
- Will the project ensure that persons involved in the project give their informed consent, not only in relation to the aims of the project, but also in relation to the process of the research, i.e., how data will be collected and by whom, where it will be collected, and what happens to the results?
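Projects that collect such consent often record it in a structured form so it can be audited later. The Python sketch below is a minimal illustration of how the elements the questions above ask about (purpose, collection method, access, witnessing, withdrawal) might be captured; the field names are hypothetical and not prescribed by the framework.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ConsentRecord:
    # Who consented and when; consent should be dated and signed or
    # otherwise marked, or witnessed if it was given orally.
    subject_id: str
    date_given: date
    signed: bool
    witness: Optional[str] = None      # required when consent is oral

    # What the person was told, mirroring the questions above.
    purpose_of_collection: str = ""
    collection_method: str = ""
    copy_instructions: str = ""        # how to obtain a copy of the data
    correction_mechanism: str = ""     # how to correct erroneous data
    who_has_access: tuple[str, ...] = ()

    # Consent can be withdrawn at any time without detriment.
    withdrawn: bool = False

    def is_valid(self) -> bool:
        # Consent must be evidenced (signed or witnessed) and not withdrawn.
        return (self.signed or self.witness is not None) and not self.withdrawn

# Example: orally given, witnessed consent for a hypothetical trial.
record = ConsentRecord(
    subject_id="participant-042",
    date_given=date(2010, 6, 1),
    signed=False,
    witness="J. Smith",
    purpose_of_collection="Trial of an assisted-living monitoring service",
)
print(record.is_valid())  # True
```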
Nonmaleficence

Beauchamp and Childress say that "The principle of nonmaleficence asserts an obligation not to inflict harm on others" and that "Nonmaleficence only requires intentionally refraining from actions that cause harm. Rules of nonmaleficence, therefore, take the form of 'Do not do X'" (Beauchamp and Childress 2001). Under this broad principle, this framework includes several ethical values and issues, as follows.
Safety
Article 38 of the Charter of Fundamental Rights deals with consumer protection: "Union policies shall ensure a high level of consumer protection." It is the subject of Article 153 of the EC Treaty: "In order to promote the interests of consumers and to ensure a high level of consumer protection, the Community shall contribute to protecting the health, safety and economic interests of consumers, as well as to promoting their right to information, education and to organise themselves in order to safeguard their interests." Consumer protection at European level is also provided by (amongst others) Directive 93/13 on unfair terms in consumer contracts, Directive 97/7 on consumer protection in respect of distance contracts and the Directive on liability for defective products (85/374/EEC).
Questions
- Is there any risk that the technology or project may cause any physical or psychological harm to consumers? If so, what measures can be adopted to avoid or mitigate the risk?
- Have any independent studies already been carried out or, if not, are any planned which will address the safety of the technology or service or trials? If so, will they be made public?
- To what extent is scientific or other objective evidence used in making decisions about specific products, processes or trials?
Universal service
Universal service means an obligation imposed on one or more operators of electronic communications networks and/or services to provide a minimum set of services to all users, regardless of their geographical location within the national territory, at an affordable price (see Directive 2002/22/EC). Universal service is broader than basic telephony service: the notion of universal service in Europe now encompasses broadband and Internet access for all. The European Commission and various Member States have recognised that it makes economic and social sense to extend broadband Internet access to all citizens. It is also the ethically correct thing to do. They have made commitments with specific deadlines to achieve this objective. Finland has recently made broadband access to the Internet a basic right.
Questions
- Will the project or service be made available to all citizens? When and how will this be done?
- Will training be provided to those who do not (yet) have computer skills or knowledge of the Internet? Who should provide the training and under what conditions?
- Will the service cost the same for users who live in remote or rural areas as for users who live in urban areas? How should a cost differential be paid?
Accessibility
With some exceptions, industry is reluctant to factor the needs of the disabled and senior citizens into its design of technologies and services and to adopt a design-for-all approach. The accessibility (user-friendliness) of devices and services is a prerequisite for the e-inclusion of citizens in the Information Society. Markets tend to overlook the needs of senior citizens and the disabled: there are few guidelines, voluntary or mandatory standards and related regulatory frameworks.

Others have said commitment to accessibility is widespread throughout the ICT industry, and that there is a strong willingness on the part of software and hardware vendors to create accessible products; however, vendors' ability to develop and deploy accessible products is held back by the need to comply with multiple standards. Thus, there needs to be greater convergence among accessibility standards.
Value sensitive design
Questions
- Is the project or technology or service being designed taking into account values such as human well-being, dignity, justice, welfare, human rights, trust, autonomy and privacy?
- Have the technologists and engineers discussed their project with ethicists and other experts from the social sciences to ensure value-sensitive design?
- Does the new technology, service or application empower users?
Sustainability
Sustainability, as used here, refers to a condition whereby a project or service can be sustained, can continue into the future, either because it can generate the financial return necessary for doing so or because it has external support (e.g., government funding) which is not likely to go away in the foreseeable future. In addition to economic and social sustainability, more conventional understandings of sustainability should also be considered, i.e., decisions made today should be defensible in relation to coming generations and the depletion of natural resources. Often new technological products can be improved, for instance, through the use of more recyclable materials (Palm and Hansson, p. 553; see also van Gorp 2009, p. 41).
Questions
- Is the project, technology or service economically or socially sustainable? If not, and if the technology or service or project appears to offer benefits, what could be done to make it sustainable?
- Should a service provided by means of a research project continue once the research funding comes to an end?
- Does the technology have obsolescence built in? If so, can it be justified?
- Has the project manager or technology developer discussed their products with environmentalists with a view to determining how their products can be recycled or how their products can be designed to minimise impact on the environment?
Justice
Beauchamp and Childress draw a distinction between the terms justice and distributive justice as follows: "The terms fairness, desert (what is deserved), and entitlement have been used by various philosophers in attempts to explicate justice. These accounts interpret justice as fair, equitable, and appropriate treatment in light of what is due or owed to persons" (Beauchamp and Childress 2001).
Privacy and data protection

Privacy and data protection are addressed in numerous legal instruments, including the EU's Data Protection Directive (95/46/EC) and the e-Privacy Directive (2002/58/EC).
Article 12 of the Universal Declaration of Human Rights says "No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence." Article 8 of the Council of Europe's Convention for the Protection of Human Rights and Fundamental Freedoms, as amended by Protocol No. 11 (Rome, 4.XI.1950), addresses the right to respect for private and family life (http://conventions.coe.int/treaty/en/Treaties/Html/005.htm). The 1980 OECD Guidelines on the Transborder Flows of Personal Data and the EU's Data Protection Directive (95/46/EC) identify a set of fair information practices or principles which are important in any consideration of ethical issues that might arise in matters affecting privacy and data protection.
The complexities and intricacies of issues relating to privacy and data protection have received huge attention from policy-makers, regulators, academia, the mass media and many other stakeholders, including ethicists. "Privacy is now recognized by many computer ethicists as requiring more attention than it has previously received in moral theory. In part this is due to reconceptualizations of the private and public sphere brought about by the use of computer technology, which has resulted in inadequacies in existing moral theory about privacy" (Brey 2000). Previous to this, Moor (1997) commented that "From the point of view of ethical theory, privacy is a curious value. On the one hand, it seems to be something of very great importance and something vital to defend, and, on the other hand, privacy seems to be a matter of individual preference, culturally relative, and difficult to justify in general." He went on to argue that privacy has both instrumental value (that which is good because it leads to something else which is good) and intrinsic value (that which is good in itself).
Although privacy in the sense of protection of personal data has received lots of attention in the computer age, privacy extends beyond computers and data protection. Some years ago, Roger Clarke identified four dimensions of privacy (Clarke 2007):

- privacy of the person;
- privacy of personal behaviour;
- privacy of personal communications; and
- privacy of personal data.
http://conventions.coe.int/treaty/en/Treaties/Html/005.htm
Brey (2000). Previous to this, Moor commented that From the
point of view of ethical theory, privacy is a curious value. On the one
hand, it seems to be something of very great importance and
something vital to defend, and, on the other hand, privacy seems to be
a matter of individual preference, culturally relative, and difficult to
justify in general. He goes onto argue that privacy has both
instrumental value (that which is good because it leads to something
else which is good) and intrinsic value (that which is good in itself).
Moor (1997).
57
Clarke (2007).
211
56
58
123
212
D. Wright
Purpose specification
Questions
- Regarding the project, technology or service, are individuals aware that personal information is being (is to be) collected, who seeks it, and why?
- Has the purpose of collecting personal data been clearly specified?
- Has the project given individuals a full explanation of the purpose of the project or technology in a way that is clear and understandable?
- Has the person been informed of the purpose of the research, its expected duration and the procedures by means of which the data are being (will be) collected?
- Is there an appropriate balance between the importance of the project's objectives and the cost of the means?
- How have the goals of the data collection been legitimated?
- Is there a clear link between the information collected and the goal sought? (Marx, p. 174)
Security and confidentiality

Questions
- Has the project taken measures to ensure protection of personal data, e.g., by means of encryption and/or access control? If so, what are they? (An illustrative sketch follows this list.)
- Who will have access to any personal data collected for the project or service?
- What safeguards will be put in place to ensure that those who have access to the information treat the information in confidence?
- Many service providers who provide service via the telephone say that conversations are monitored for training or quality control purposes. Will that happen in this project or service? What happens (will happen) to such recorded conversations?
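As a minimal illustration of the kind of measure the first question asks about, the following Python sketch encrypts a personal-data record at rest using the third-party `cryptography` package's Fernet recipe (symmetric, authenticated encryption). It is a sketch only, not a recommendation of a particular product or architecture; key management and access control are assumed to be handled elsewhere.

```python
# Requires the third-party package:  pip install cryptography
from cryptography.fernet import Fernet

# In practice the key would be held in a key-management system with
# access control, so that only authorised staff can decrypt.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a personal-data record before storing it.
record = b"participant-042: physician=Dr Jones; phone=+44 20 ..."
token = fernet.encrypt(record)

# Only a holder of the key can recover the plaintext.
assert fernet.decrypt(token) == record
```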
Use limitation
The OECD guidelines state that personal data should not be "disclosed, made available or otherwise used for purposes other than those specified" except with the consent of the data subject or by the authority of law. Similarly, Article 6 of the EU's Data Protection Directive says that personal data must be "adequate, relevant and not excessive in relation to the purposes for which they are collected and/or further processed".
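One way a system might enforce the use-limitation principle in code is to bind each datum to the purposes specified at collection and to check every use against them. The Python sketch below is illustrative only; the names and the purpose vocabulary are assumptions, not anything prescribed by the OECD guidelines or the Directive.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PersonalDatum:
    value: str
    allowed_purposes: frozenset  # purposes specified at collection time

def use(datum: PersonalDatum, purpose: str) -> str:
    # Use limitation: the datum may only be used for the purposes
    # specified at collection, absent consent or legal authority.
    if purpose not in datum.allowed_purposes:
        raise PermissionError(
            f"purpose '{purpose}' was not specified at collection")
    return datum.value

email = PersonalDatum("[email protected]", frozenset({"service delivery"}))
print(use(email, "service delivery"))  # permitted
# use(email, "marketing")  would raise PermissionError
```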
Questions
- Is the personal information used for the purposes given for its collection, and do the data stay with the original collector, or do they migrate elsewhere? (Marx, p. 174)
- Is the personal data collected used for profit without permission from or benefit to the person who provided it? (Marx, p. 174)
- Who will have access to or use of the data collected?
- Will the data be transferred to or shared with others?
Transparency (openness)
Transparency is a precondition to public trust and confidence. A lack of transparency risks undermining support
for or interest in a technology or service.
The OECD guidelines contain an openness principle which states that "There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available for establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller."
While the Data Protection Directive does not explicitly
mention openness in this way, recital 63 does say that data
protection supervisory authorities must help to ensure
transparency of processing in the Member States within
whose jurisdiction they fall.
Vedder and Custers (2009) have opined that "With the growing speed of the information and communication networks, two characteristics of the Internet are further enlarged. First, as the number of content providers and the ease of uploading information further increases, assessing the true nature of […]"
Questions
- If a new database is to be created or an existing database extended, has the data controller informed the data protection supervisory authority?
- Has the data controller made known publicly that he has or intends to develop a new database, the purpose of the database, how the database will be used and what opportunities exist for persons to rectify inaccurate personal information?
- If a database is breached or if the data controller has lost any data, has he informed the persons whose data have been compromised and/or the data protection authority?
- What activities will be carried out in order to promote awareness of the project, technology or service?
- Will such activities be targeted at those interested in or affected by the project, technology or service?
- Has an analysis been made of who are the relevant stakeholders?
- Are studies about the pros and cons of the project or technology available to the public?
Individual participation and access to data
Questions
- Have measures been put in place to facilitate the person's access to his or her personal data?
- Is there a charge for access to data and, if so, how has that charge been determined?
- Is the charge stated on the website of the project or service?
- Will the charge be perceived as reasonable by those whose data are collected and by the data protection supervisory authority?
- How long should it usually take to respond to requests for access to personal data and to provide such data?
- Can the person whose data are collected easily rectify errors in those data? What procedures are in place for doing so?
Anonymity
According to the ISO/IEC 15408 standard on evaluation criteria for IT security, anonymity ensures that "a subject may use a resource or service without disclosing his or her identity".
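Where full anonymity is impractical (e.g., records must remain linkable across a study), projects sometimes approximate it by replacing direct identifiers with keyed pseudonyms. The Python sketch below shows one illustrative approach, assuming a secret key held separately from the data; it is not drawn from the ISO standard, and the names are hypothetical.

```python
import hashlib
import hmac

# Secret key, held by the data controller separately from the dataset.
PSEUDONYM_KEY = b"replace-with-a-randomly-generated-secret"

def pseudonymise(identifier: str) -> str:
    # Keyed hash (HMAC-SHA256): the same identifier always maps to the
    # same pseudonym, so records stay linkable across a study, but the
    # mapping cannot be recomputed by anyone who lacks the key.
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"name": "Jane Doe", "reading": 42}
record["name"] = pseudonymise(record["name"])
print(record)  # {'name': '3f1a…', 'reading': 42}
```

Note that, because the controller retains the key, such pseudonymised records would generally still count as personal data; the technique narrows the risk but does not by itself satisfy the ISO definition of anonymity.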
Expert workshops
The European Commission, European agencies (such as ENISA) and many other organisations convene expert workshops or stakeholder panels, often to complement consultations and sometimes surveys. Ideally, such workshops bring together representatives from various stakeholder groups to discuss issues. The workshops often consist of a mixture of presentations by those representatives and discussions on one or two or, at least, a limited number of issues, which can be addressed in the course of a one- or two-day meeting. Sometimes, just a single workshop is convened.
An ethical impact assessment should not consist of questions only. A process for engaging and consulting with
stakeholders should be put in place to help policy-makers,
technology developers and project managers in ensuring
that ethical issues are identified, discussed and dealt with,
preferably as early in the project development as possible.
There are various reasons why project managers should
engage stakeholders and undertake a consultation when
developing new technologies or projects. For one thing,
Article 41 of the Charter of Fundamental Rights of the European Union, entitled the "Right to good administration", makes clear that this right includes "the right of every person to be heard, before any individual measure which would affect him or her adversely is taken", which suggests that consultation with stakeholders is not only desirable but necessary.
But there are other reasons too. Stakeholders may bring new information which the project manager might not have considered and may have some good suggestions for resolving complex issues. Also, technology development is often too complex to be fully understood by a single discipline (Sollie 2007, p. 302). Moor (2005, p. 118) similarly supports better collaboration among ethicists, scientists, social scientists and technologists; in the same paper, he proposes the hypothesis he calls Moor's Law: "As technological revolutions increase their social impact, ethical problems increase."
Ultimately, it is the policy-maker or technology developer who must take a decision whether to proceed with the technology or to modify it or to build some safeguards into its use in order to accommodate the concerns raised by stakeholders. It is the policy-maker or technology developer alone who will be held accountable for the decision.
Palm and Hansson caution that "the search for consensus in controversial issues should not be overemphasized since it may lead to the closure of issues at a too early stage. In ethical TA, conflicts and different opinions should be highlighted rather than evened out." They also urge that the assessment should seek to identify "all relevant stakeholders, i.e., a broad spectrum of agents and therefore also a broad spectrum of responsibilities". They see the task of an ethical assessment as being "to delineate and analyze the issues and point out the alternative approaches for the final analysis that are available" (Palm and Hansson).
It would make life easier, undoubtedly, if the stakeholders reached a consensus about how to deal with the ethical considerations raised and if the decision-maker agreed with the consensus. In real life, that does not always happen, so the decision-maker will need to decide which considerations are given greatest weight and to explain why he or she took that decision. The decision-maker should make clear to stakeholders, when he or she first reaches out to them, what the rules of the game will be, and how and by whom the ultimate decision will be made.
When a decision-maker ends up disagreeing with the results of the consultation processes, this calls for explicit argument, as Beekman et al. point out: "It does not follow that the decision-makers should always follow the results of the use of ethical tools. Ethical tools are not decision-making machines for ethics. However, when such a situation occurs, the great advantage of ethical tools is that they force the decision-maker to state why he or she prefers a different conclusion."
Questions
- Has the policy-maker or technology developer developed a process for identifying and considering ethical issues?
- Will the project engage in consultations with stakeholders? If so, when?
- Have all relevant stakeholders (i.e., those affected by or with an interest in the technology or project) been identified?
- Have they been invited to participate in a consultation and/or to provide their views on the project or technology?
- Is the process by means of which decisions are made clearly articulated to stakeholders?
- How many and what kinds of opportunities do stakeholders and citizens have to bring up concerns about values or non-technical impacts?
- How long will the consultation last? Will there be sufficient time for stakeholders to conduct any research which they may need to do in order to represent their views to the project manager?
- How will conflicting views of stakeholders be taken into account or resolved? Are some stakeholders (e.g., industry) given more weight than others (e.g., civil society organisations)?
- Has the project manager made known to the public the options, and the pros and cons of each option, available with regard to the development or deployment of the project, technology, service, etc.?
- Is there a process in place for considering ethical issues at later stages in the project or technology development that may not have been considered at the outset?
Much has been written about risk assessment over the last few decades. One of the best guides is Ortwin Renn's recent book on risk governance (Renn 2008). While risk experts, such as Renn, have considered how to deal with uncertainty, uncertainty is a concept scarcely scrutinised in ethics in general and the ethics of technology in particular, according to Paul Sollie (2007). He says the uncertainty arising from the unpredictable, unforeseen and unanticipated nature of technology development has many causes, one of which is that technology designed for specific purposes often ends up being used for completely different activities. He notes that uncertainty is not simply the absence of knowledge. Uncertainty can prevail even in situations where a lot of information is available. New information does not necessarily increase certainty, but might also augment uncertainty by revealing the presence of uncertainties that were previously unknown or understated.

The European Commission's Communication on the precautionary principle (European Commission 2000) aims to build a common understanding of how to assess, appraise, manage and communicate risks that science is not yet able to evaluate adequately. It says the precautionary principle should be considered within a structured approach of risk assessment, management and communication. Decision-makers need to be aware of the scientific uncertainties, but judging what is an acceptable level of risk for society is, in the words of the Communication, "an eminently political responsibility".
Accountability
The Data Protection Directive says the data controller should be accountable for complying with the principles stated in the Directive. In the development of new technologies and services, however, "many of the actors and stakeholders involved (in their development) only have a very restricted insight into the opportunities and risks involved. Moreover, many of them have restricted means to respond. For instance, engineers are involved in the first phases (of research and development), but have limited influence on the introduction of new technologies into the market/society. End users may have effect on how the new technologies are introduced into society and how the new technologies are actually used. However, end users have restricted means to influence research, development and production of new technologies" (Vedder and Custers 2009). Vedder and Custers argue that it is undesirable to assign all responsibilities to just one group of stakeholders. Instead, they argue in favour of joint responsibilities: "Instead of creating gaps in the responsibilities, i.e., parts of the research and development process where nobody is responsible, this may create joint responsibilities. We consider overlapping responsibilities an advantage rather than a drawback in these cases."

René von Schomberg (2007) also argues along these lines. He claims that the idea of role responsibility cannot be used any longer in the complex society in which we live. No one person has an overview of all consequences of a technological development, and therefore he argues for an ethics of knowledge policy and knowledge assessment and says that citizens should be involved in the assessment and policy-making.
Questions
- Does the project make clear who will be responsible for any consequences of the project?
- Who is responsible for identifying and addressing positive and negative consequences of the project or technology or service?
- Does the project make clear where responsibility lies for liability, equality, property, privacy, autonomy, accountability, etc.?
- If the project or technology is complex and responsibility is distributed, can mechanisms be created to ensure accountability?
- Are there means for discovering violations and penalties to encourage responsible behaviour by those promoting or undertaking the project?
- If personal data are transferred outside the European Union, what measures will be put in place to ensure accountability to the requirements of the Data Protection Directive?
Third-party ethical review

An example of an independent ethics advisory body at European level is the European Group on Ethics in Science and New Technologies (EGE). Article 2 of the mandate given to the EGE states: "The task of the EGE shall be to advise the Commission on ethical questions relating to sciences and new technologies, either at the request of the Commission or on its own initiative. The Parliament and the Council may draw the Commission's attention to questions which they consider to be of major ethical importance. The Commission shall, when seeking the opinion of the EGE, set a time limit within which an opinion shall be given." (http://ec.europa.eu/european_group_ethics/mandate/index_en.htm; see also http://ec.europa.eu/european_group_ethics/link/index_en.htm#4)
Questions
- Have the project, its objectives and its procedures in regard to the treatment of ethical issues been reviewed by independent evaluators to ensure that ethical issues have been adequately considered?
- Has the decision-maker considered evaluation of its ethical impact assessment with a view to improving the process of conducting such an assessment?
- If the project involves the development and deployment of complex technologies, an ethical impact assessment may need to be ongoing or, at least, conducted again (perhaps several times). When does the project manager envisage submitting its ethical impact assessment to a review by an independent third party?
Good practice
Examples of good practice in ethical assessments may be strategically important from a policy point of view in the sense that they might encourage other organisations to undertake similar assessments, which might also be an objective of policy-makers. Examples of good practice are also practically important in the sense that they provide guidance on how to undertake ethical assessments. The utility of good practices depends on how well information about such good practices is disseminated and how easy it is for project managers to find relevant good practices.
Questions
- Would the project, technology or service be generally regarded as an example of ethical good practice?
- Will the technology or project inspire public trust and confidence?
- Have the designers or proponents of the project examined other relevant good practices?
Conclusions
This paper has proposed an ethical impact assessment framework that could be used by those developing new technologies, services, projects, policies or programmes as a way to ensure that their ethical implications are adequately examined by stakeholders before possible deployment and so that mitigating measures can be taken as necessary.
Palm and Hansson (pp. 547-548, p. 550) and Moor make similar points about the limits of foresight. As Moor (2005, p. 118) puts it: "We can foresee only so far into the future… We cannot anticipate every ethical issue that will arise from the developing technology… our ethical understanding of developing technology will never be complete. Nevertheless, we can do much to unpack the potential consequences of new technology. We have to do as much as we can while realizing applied ethics is a dynamic enterprise that continually requires reassessment of the situation." See also Brey (2000).
References
Article 29 Data Protection Working Party, Recommendation 3/97:
Anonymity on the Internet (WP 6), Adopted on 3 December
1997. http://ec.europa.eu/justice_home/fsj/privacy/.
Article 29 Working Party, Opinion on data protection issues related to search engines, 00737/EN, WP 148, Adopted on 4 April 2008. http://ec.europa.eu/justice_home/fsj/privacy/workinggroup/wpdocs/2008_en.htm.
Beauchamp, T. L., & Childress, J. F. (2001). Principles of biomedical
ethics (5th ed.). New York: Oxford University Press.
Beekman, V., et al. (2006). Ethical bio-technology assessment tools for agriculture and food production, Final Report of the Ethical Bio-TA Tools project, LEI, The Hague, February. http://www.ethicaltools.info.
Beekman, V., & Brom, F. W. A. (2007). Ethical tools to support systematic public deliberations about the ethical aspects of agricultural biotechnologies. Journal of Agricultural and Environmental Ethics, 20(1), 3–12.
Boddy, K., LOCOMOTION Ethical Study Report, Deliverable D 3.3, Final Version, September 2004. http://cordis.europa.eu/search/index.cfm?fuseaction=proj.document&PJ_LANG=EN&PJ_RCN=6099060&pid=37&q=6AF6FCCDA9FE6C99B48B10861AFEBDDA&type=sim.
Brey, P. (2000). Method in computer ethics: Towards a multi-level interdisciplinary approach. Ethics and Information Technology, 2(2), 125–129.
Clarke, R. (2007). Introduction to dataveillance and information privacy, and definitions of terms, Aug. http://www.rogerclarke.com/DV/Intro.html.
Dekker, M. (2004). The role of ethics in interdisciplinary technology assessment. Poiesis & Praxis, 2(2–3), 139–156.
European Commission, Ageing well in the Information Society,
Action Plan on Information and Communication Technologies
and Ageing, An i2010 Initiative, Communication from the
Commission to the European Parliament, the Council, the
European Economic and Social Committee and the Committee
of the Regions, COM (2007) 332 final, Brussels, 14 June 2007.
European Commission, Communication on the precautionary principle, COM (2000)1, Brussels, 2 Feb 2000.
European Commission, Commission earmarks €1bn for investment in broadband – Frequently Asked Questions, Press release, MEMO/09/35, Brussels, 28 January 2009. http://europa.eu/rapid/pressReleasesAction.do?reference=MEMO/09/35.
European Commission, The European Research Area: New Perspectives, Green Paper, COM(2007) 161 final, Brussels, 4 Apr 2007.
European Commission, European i2010 initiative on e-Inclusion: To
be part of the information society, Communication from the
Commission to the European Parliament, the Council, the
European Economic and Social Committee and the Committee
of the Regions, COM (2007) 694 final, Brussels, 8 Nov 2007.
European Council resolution on e-Inclusion, exploiting the opportunities of the information society for social inclusion, 2001/C 292/
02, OJ 18 Oct 2001.
European Group on Ethics in Science and New Technologies (EGE),
Opinion No. 20 on Ethical Aspects of ICT Implants in the
Human Body, Adopted on 16 March 2005.
European Parliament and Council, Directive 2001/20/EC of 4 April
2001 on the approximation of the laws, regulations and
administrative provisions of the Member States relating to the
implementation of good clinical practice in the conduct of
clinical trials on medicinal products for human use, OJ L 121/34,
Brussels, 1 May 2001.
European Parliament and Council, Directive 2002/22/EC of 7 March 2002 on universal service and users' rights relating to electronic communications networks and services (Universal Service Directive), OJ L 108/51, 24 Apr 2002.
Rowe, G., & Frewer, L. J. (2005). A typology of public engagement mechanisms. Science, Technology & Human Values, 30(2), 251–290. http://sth.sagepub.com/cgi/content/abstract/30/2/251.
Skorupinski, B., & Ott, K. (2002). Technology assessment and ethics. Poiesis & Praxis, 1, 95–122.
Sollie, P. (2007). Ethics, technology development and uncertainty: An outline for any future ethics of technology. Journal of Information, Communication and Ethics in Society, 5(4), 293–306.
Sollie, P., & Düwell, M. (2009). Evaluating new technologies: Methodological problems for the ethical assessment of technology developments. Dordrecht: Springer.
Stern, P. C., & Fineberg, H. V. (Eds.). (1996). Understanding risk:
Informing decisions in a democratic society. Washington, DC:
Committee on Risk Characterization, National Research Council, National Academy Press.
Treasury Board of Canada Secretariat, Privacy Impact Assessment Guidelines: A Framework to Manage Privacy Risks, Ottawa, 31 Aug 2002.
UK Information Commissioner's Office (ICO), Privacy Impact Assessment Handbook, Version 2.0, June 2009. http://www.ico.gov.uk/for_organisations/topic_specific_guides/pia_handbook.aspx.
US National Research Council, Committee on Risk Perception and Communications, Improving Risk Communication, National Academy Press, Washington, D.C., 1989. http://www.nap.edu/openbook.php?record_id=1189&page=R1.
Van Gorp, A. (2009). Ethics in and during technological research: An addition to IT ethics and science ethics. In P. Sollie & M. Düwell (Eds.), Evaluating new technologies (pp. 35–50). Dordrecht: Springer.
Vedder, A., & Custers, B. (2009). Whose responsibility is it anyway? Dealing with the consequences of new technologies. In P. Sollie & M. Düwell (Eds.), Evaluating new technologies. Dordrecht: Springer.
Verbeek, P.-P. (2009). The moral relevance of technological artifacts. In P. Sollie & M. Düwell (Eds.), Evaluating new technologies: Methodological problems for the ethical assessment of technology developments (pp. 63–79). Dordrecht: Springer.
von Schomberg, R. (2007). From the ethics of technology towards an
ethics of knowledge policy & knowledge assessment. Working
document from the European Commission Services, Jan.