
6/15/2021 | Discuss Ethical Issues In Data Science Covering ... | Chegg.com

Question: Discuss Ethical Issues in Data Science covering privacy, security, ethics, and next-generation data scientists.

Expert Answer:

Ethical challenges
Unfair discrimination.

What is a fair defendant evaluation, and how can data scientists develop a corresponding algorithm that works for any defendant?
Northpointe's definition of a fair algorithm states that the proportion of defendants who re-offend in each risk category is approximately the same regardless of race. In other words, a risk score of, say, 7 predicts an equal likelihood of re-offending for black and white defendants. Any alternative definition would, by definition, discriminate against white defendants by artificially boosting their risk levels. Researchers from Stanford and Berkeley have validated that a single risk score represents an approximately equal risk of actual recidivism, based on roughly 5,000 samples from Broward County, Florida.
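Northpointe's fairness criterion is often called calibration within groups: for each risk score, the observed re-offense rate should be roughly equal across racial groups. A minimal sketch of how such a check might look (the data and group labels below are hypothetical toy values, not COMPAS data):

```python
from collections import defaultdict

def calibration_by_group(records):
    """For each (risk_score, group) pair, compute the observed re-offense rate.

    `records` is an iterable of (risk_score, group, reoffended) tuples.
    Calibration holds when rates for the same score are similar across groups.
    """
    counts = defaultdict(lambda: [0, 0])  # (score, group) -> [reoffenders, total]
    for score, group, reoffended in records:
        counts[(score, group)][0] += int(reoffended)
        counts[(score, group)][1] += 1
    return {key: n_reoffend / n_total
            for key, (n_reoffend, n_total) in counts.items()}

# Hypothetical toy data: at score 7, both groups re-offend at the same rate,
# so score 7 is calibrated in Northpointe's sense.
data = [(7, "A", True), (7, "A", False), (7, "B", True), (7, "B", False)]
rates = calibration_by_group(data)
# rates[(7, "A")] == 0.5 and rates[(7, "B")] == 0.5
```

Note that calibration is only one of several competing fairness definitions; as the text says, alternatives exist and cannot all be satisfied at once.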

Reinforcing human biases.

Both aspects of Data Science defined earlier (vast amounts of data and statistical models) lead to an inherent limitation for making decisions based on patterns in past data. COMPAS' developers presumably fed the system with recidivism data to let the model identify variables correlating with a likelihood to re-offend, a standard way to train classification algorithms. If the data were perfectly unbiased there would be no problems, but all COMPAS can do is parrot back to us our own biases. These include the severe, racist biases of judges who see black defendants as "more likely to kill again than whites" (Epps, 2017), or who assume a Mexican defendant to have committed sexual assault "because he's Mexican and Mexican men take whatever they want."
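The point that a model can only parrot back patterns in its training data can be illustrated with a deliberately biased toy dataset (all data and labels below are hypothetical): even a trivial classifier trained on historically biased labels reproduces that bias at prediction time.

```python
from collections import Counter

def train_majority_by_group(examples):
    """'Train' a trivial classifier that predicts, for each group,
    the majority label seen in the training data."""
    by_group = {}
    for group, label in examples:
        by_group.setdefault(group, Counter())[label] += 1
    return {g: c.most_common(1)[0][0] for g, c in by_group.items()}

# Biased historical labels: group "A" was disproportionately marked high-risk.
history = [("A", "high"), ("A", "high"), ("A", "low"),
           ("B", "low"), ("B", "low"), ("B", "high")]
model = train_majority_by_group(history)
# The 'model' simply reproduces the historical bias:
# model == {"A": "high", "B": "low"}
```

Real classifiers are far more sophisticated, but the failure mode is the same: biased labels in, biased predictions out.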
Lack of transparency.



Hiding COMPAS' inner workings from the public prevents understanding and discussion around mitigating learned biases. However, Northpointe understandably argues that publishing its proprietary algorithms would lead to a competitive disadvantage "because it's certainly a core piece of our business" (Liptak, 2017), as an executive is quoted in the New York Times. This dilemma comes as no surprise when government agencies authorise private companies to apply Data Science to sensitive governmental data.

There are two key areas requiring transparency.

Privacy.

At first glance, one might not categorise electricity consumption as particularly sensitive data. However, granular meter readings can be used to determine whether a person is at home or not, which appliances are used at what time (Molina-Markham et al., 2010), whether you leave appliances on for longer than required, and even features of buildings (Beckel et al., 2014). Hence, intimate details of a user's daily life could be exposed and used in ways that invade individual privacy.
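The occupancy-inference risk can be sketched in a few lines: flag the intervals where consumption exceeds an idle baseline, which tends to reveal when someone is home. The threshold and readings below are illustrative values, not taken from Molina-Markham et al.:

```python
def infer_occupancy(readings_kwh, baseline_kwh=0.2):
    """Return True for each interval whose consumption exceeds the
    idle baseline, a crude proxy for 'someone is home and active'."""
    return [r > baseline_kwh for r in readings_kwh]

# Hypothetical half-hourly readings: low overnight, spikes in the evening.
readings = [0.1, 0.1, 0.15, 0.9, 1.2, 0.8, 0.1]
occupied = infer_occupancy(readings)
# occupied == [False, False, False, True, True, True, False]
```

Even this crude heuristic recovers a daily presence pattern; published attacks go further and identify individual appliances from their load signatures.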

Lack of transparency.

Opportunities for new tariffs, for example, raise a problem of transparency. Say my neighbour's and my tariffs vary even though we have the same flat size and data parameters. Is the difference due to me using more electricity on the weekends than she does? What data define the tariff models?

Consent and power.

New tariff models would likely connect data granularity with tariff prices. In other words, the more granular the data, the cheaper the tariff. Hence the consumer can trade privacy for electricity costs, potentially pushing low-income households to consent to greater data sharing.
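The privacy-for-price trade could be modelled as a discount schedule keyed to the meter's reporting interval. The function and all discount figures below are purely hypothetical, invented to make the mechanism concrete:

```python
def tariff_rate(base_rate, granularity_minutes):
    """Hypothetical tariff: finer-grained meter data earns a larger discount.
    `granularity_minutes` is the reporting interval; smaller = more data shared."""
    discounts = {60: 0.00, 30: 0.05, 15: 0.10, 1: 0.20}
    return base_rate * (1 - discounts.get(granularity_minutes, 0.0))

# Sharing per-minute data cuts the unit rate by 20% in this toy model;
# hourly data earns no discount. This is the trade described above:
# households under price pressure are nudged toward maximal data sharing.
hourly = tariff_rate(0.30, 60)   # full rate
minute = tariff_rate(0.30, 1)    # discounted rate
```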

Next-generation sequencing (NGS) is expected to revolutionize health care. NGS allows for sequencing of the whole genome more cheaply and quickly than previous techniques. NGS offers opportunities to advance medical diagnostics and treatments, but also raises complicated ethical questions that need to be addressed.

https://www.chegg.com/homework-help/questions-and-answers/discuss-ethical-issues-data-science-covering-privacy-security-ethics-next-generation-d…

Areas considered:

This article draws from the literature on research and clinical ethics, as well as next-generation sequencing, in order to provide an overview of the ethical challenges involved in next-generation sequencing. This article includes a discussion of the ethics of NGS in research and clinical contexts.

Expert opinion:

The use of NGS in clinical and research contexts has features that pose challenges for traditional ethical frameworks for protecting research participants and patients. NGS generates massive amounts of data and results that vary in terms of known clinical relevance. It is important to determine appropriate processes for protecting, managing and communicating the data. The use of machine learning for sequencing and interpretation of genomic data also raises concerns in terms of the potential for bias and potential implications for fiduciary obligations. NGS poses particular challenges in three main ethical areas: privacy, informed consent, and return of results.

Bringing better ethics into data science

Despite bringing our society the benefits of advancement and progress, and changing the way we live, there are aspects in which data science fails to keep its ethical grounds. In an era in which societies are subject to rapid technological change, privacy is often compromised. Along with privacy, concepts of fairness and justice fall short, as data analysis makes significant assumptions about people's lives. Aspects of life considered private become open for evaluation in the form of quantifiable data.

Many organizations and universities that carry out surveys and use large amounts of data to make assessments about people's lives are now working towards creating a robust code of ethics.

On the basis of this code of ethics, researchers agree that they should employ the right tools and methodologies to make sound deductions about people's lives. More precisely, these researchers strive to examine how their values shape their data collection and analytical tools, and how far those tools conform to their ethical values.

If anything in this solution is unclear, please leave a comment and I will help you out.
