Business Assignment 3
Farhana
Contents
Introduction
TASK 1(a)
  Effect on Privacy of Data
  Effect on the Professional Conduct and Ethics Code of the ACS
TASK 1(b)
  Suggestions to Chelsea
TASK 1(c)
TASK 2(a)
  Benefits of Big-Data Driven Loan Evaluation That Are Ethically Significant
TASK 2(b)
  Josh and Hannah's Ethically Significant Harms
Task 2(c)
  Broader Social Harms Resulting from Widespread Use of Opaque Loan Evaluation
Task 2(d)
  Openness and Explainability
  Fairness in Algorithms
  Appeal Procedures
References
Introduction:
In today's data-driven, technologically advanced era, financial institutions rely increasingly
on complex algorithms to assess loan applications (Ethics, 2022). This report examines the ethical issues
surrounding the use of big-data driven systems in loan evaluation through a case study involving
Josh and Hannah, a couple pursuing their ambition of opening a fashion boutique. We examine in detail
the potential benefits as well as the serious ethical harms that applicants such as Josh and Hannah
may encounter. We also evaluate the wider societal implications and propose measures to address the
ethical issues raised. By combining theoretical ideas from lectures with practical applications, this
report highlights the importance of aligning technological advances with ethical norms in the
financial sector.
TASK 1(a):
Effect on Privacy of Data:
The Australian Privacy Act 1988 is the legislative framework that governs how businesses handle
personal data. It sets out guidelines for the appropriate management of data and places strong
emphasis on protecting individuals' personal information. In this case, selecting a less secure
approach has several effects on data privacy:
Risk of Data Breaches: The risk of data breaches increases greatly when a less secure solution
is chosen for sensitive data such as customer information (including credit card
numbers), employee salaries, and performance reviews. Data breaches can result in unauthorized
access, theft, or exposure of personal information, with serious repercussions for the
affected parties (Australia, 1988).
Non-compliance with the Privacy Act: The Privacy Act requires businesses to take reasonable
steps to protect personal data from misuse, loss, and unauthorized alteration. If the CTO
and IT Manager select a less secure solution while aware of the sensitivity of the data and
the presence of security flaws, they might be seen as failing to take these reasonable steps.
The organization may face penalties and legal ramifications as a consequence of this non-
compliance (Australia, 1988).
Reputation and Trust: Data breaches and privacy violations can damage the organization's
reputation and erode trust. If stakeholders, including employees and customers, lose faith in
the company's capacity to protect their personal information, lost revenue and opportunities
may follow ((OAIC), 2023).
Effect on the Professional Conduct and Ethics Code of the ACS:
Do No Harm: Principle 3 of the ACS Code of Professional Conduct places strong emphasis on the
duty of IT professionals to "do no harm." Chelsea's recommendation of an advanced IT security
solution supports this principle by prioritizing the safeguarding of private information and
preventing potential harm to individuals. Choosing the less secure option violates this
principle, because it undermines data integrity and raises the likelihood of harm from data
breaches ((ACS), 2022).
Privacy and Security: The ACS Code of Ethics emphasizes that protecting the privacy and
security of personal data is crucial, and IT professionals are expected to safeguard private
data from exposure or unauthorized access. Adopting a less secure solution for sensitive data
is therefore ethically problematic, since it compromises the security of personal information
and fails to meet ethical norms for protecting people's data (Union, 2016).
In conclusion, the CTO and IT Manager's choice has a substantial impact on data privacy, may
violate the Australian Privacy Act, and is at odds with the values and guidelines set out in the ACS
Code of Professional Conduct and Ethics. It calls into question whether the company is meeting its
ethical and legal obligations.
TASK 1(b):
Suggestions to Chelsea:
Remain Firm on Ethical Grounds: Chelsea is entitled to stand by her ethically grounded,
professionally sound recommendation of an advanced IT security solution. Her primary
responsibility is to ensure that sensitive data, such as client credit card information, staff
salaries, and performance reviews, is protected ((ACS), 2022).
Clearly State the Risks: Chelsea should continue to outline the dangers associated with the
less secure option in an understandable, truthful, and data-driven way. She should back up her
statements with evidence, such as previous security breaches or industry standards.
Highlighting the real dangers makes her recommendation more persuasive (Australia, 1988).
Hold a Constructive Conversation: Chelsea should engage the IT Manager and CTO in a
constructive dialogue. Her goal should be to promote understanding rather than to frame the
conversation as a conflict. By asking open-ended questions to better understand the reasons
behind their choice, she can work with them toward a solution that balances security and
financial concerns (Australia, 1988).
Provide Cost-Effective Alternatives: Chelsea can investigate and propose more affordable
options that still offer a high degree of protection. One option may be a staged approach,
implementing the most critical security measures first and deferring others. This strategy may
address the client's financial concerns without compromising data security (Zealand, 2013).
Record Communication: Chelsea must keep a record of every conversation on this matter. This
documentation serves as evidence of her ethical diligence and professional care, and it may
prove valuable in the event of future legal or ethical disputes (Zealand, 2013).
Seek Assistance from Professional Organizations: If the matter is not resolved and Chelsea
believes the decision carries significant moral or legal risk, she may consider seeking
assistance from professional associations such as the ACS. In situations involving ethical
difficulties, these organizations can offer advice, resources, or even mediation services
((OAIC), 2023).
Ethical Considerations: Chelsea must closely assess the moral implications of her own conduct
and ensure that everything she does complies with the ACS Code of Professional Conduct and
Ethics. If she believes that adopting the less secure solution would violate her ethical
commitments, she should seek legal counsel and, if required, weigh her own ethical duties and
potential professional liability ((ACS), 2022).
In summary, Chelsea needs to give ethical and data-security concerns top priority in her decision,
while also looking for opportunities for productive discussion and compromise. As a professional,
she has an obligation to secure sensitive data, respect the client's financial constraints, and
work with the client to find the best solution.
TASK 1(c):
K-anonymity is a key concept in data privacy, especially for safeguarding individuals' privacy when
data is shared. A dataset is k-anonymous if every combination of quasi-identifiers (attributes that,
taken together, may identify someone) is shared by at least k records. This indicates that the
information has been sufficiently anonymized to prevent individuals from being re-identified,
protecting their privacy (Sweeney L., 2002).
The dataset includes the sensitive attribute Income as well as the quasi-identifiers {Sex, Age,
Postcode}. To evaluate the k-anonymity level for each value of the sensitive attribute, we consider
successive values of k, i.e., 1-anonymous, 2-anonymous, 3-anonymous, and so on (Sweeney P. S.,
1998).
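To make this concrete, the following minimal sketch (in Python) shows one way to compute the
k-anonymity level achieved for each sensitive value: group the records by Income, then take the size
of the smallest equivalence class (the set of records sharing one quasi-identifier combination)
within each group. The rows shown are illustrative stand-ins, not the actual case-study table.

    from collections import Counter, defaultdict

    def k_level_per_sensitive_value(rows):
        """For each sensitive value, return the size of its smallest
        equivalence class, i.e. the k-anonymity level achieved for the
        records carrying that value."""
        groups = defaultdict(Counter)
        for sex, age, postcode, income in rows:
            groups[income][(sex, age, postcode)] += 1
        return {income: min(classes.values()) for income, classes in groups.items()}

    # Illustrative rows (Sex, Age, Postcode, Income) -- not the case-study data.
    rows = [
        ("male",   "20-25", "318*", 80000),
        ("male",   "20-25", "318*", 80000),  # two records share one combination
        ("female", "26-30", "319*", 95000),  # a unique combination caps k at 1
    ]
    print(k_level_per_sensitive_value(rows))  # {80000: 2, 95000: 1}

Applied to the case-study dataset, a check of this kind would return 2 only for Income = $80,000,
matching the analysis below.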
1-Anonymous: The k-anonymity privacy model requires that every combination of quasi-identifiers
belong to at least k individuals. For k = 1 this requirement is trivial, since each
quasi-identifier combination need only appear at least once; any dataset, including this one,
is therefore 1-anonymous (Sweeney D. K., 2007). On its own, however, 1-anonymity offers no
real protection: a record with a unique combination of {Sex, Age, Postcode} can still be linked
to a single individual, and the supplied dataset contains several such unique combinations.
For those records, 1-anonymity is the most that can be claimed.
2-Anonymous: To achieve 2-anonymity, each combination of quasi-identifiers must be associated
with at least two individuals who share the same sensitive attribute value (Sweeney D. K.,
2007). The dataset contains one such occurrence, for Income = $80,000: two entries with this
income share the same combination {20-25, 318*, male}. Since at least two people have the same
quasi-identifiers for this particular Income value, the data for Income = $80,000 is
2-anonymous.
3-Anonymous: To achieve 3-anonymity, each combination of quasi-identifiers must be shared by at
least three individuals with the same sensitive attribute value (Sweeney D. K., 2007). A closer
inspection of the data shows that no {Sex, Age, Postcode} combination has three or more records
for any sensitive attribute value; there are simply not enough people with the same
quasi-identifiers for any given Income value. The data is therefore not 3-anonymous for any
Income value.
4-Anonymous: The same pattern holds for higher k-anonymity levels. For 4-anonymity, each
quasi-identifier combination must match at least four individuals with the same sensitive
attribute value (Sweeney D. K., 2007), but the dataset contains no combination of {Sex, Age,
Postcode} with four or more records for any sensitive attribute value. The data is therefore
not 4-anonymous.
5-Anonymous: The procedure is repeated for 5-anonymity, which requires each set of
quasi-identifiers to match at least five individuals with the same sensitive attribute value
(Sweeney D. K., 2007). Again, no sensitive attribute value has five or more records for any
combination of {Sex, Age, Postcode}, so the data is not 5-anonymous.
In conclusion, the dataset satisfies the 2-anonymity threshold only for Income = $80,000, where at
least two people share the same quasi-identifiers for that sensitive attribute value. The dataset
does not attain any higher k-anonymity level (3, 4, or 5) for any Income value. This means much of
the information is at risk of re-identification, particularly for individuals whose combination of
quasi-identifier values is unique (Venkatasubramanian, 2007).
Further anonymization or data aggregation measures, such as generalizing age bands or suppressing
parts of the postcode, should be considered to improve the privacy protection of this dataset and
strengthen its resistance to re-identification, particularly for the sensitive values that fall
short of the 2-anonymity criterion. This would lessen the possibility of re-identification and
safeguard each person's privacy within the dataset (Abou-el-ela Abdou Hussien, 2013).
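As a rough illustration of the generalization idea, with invented helper names and data: coarsening
the quasi-identifiers (widening age bands and truncating postcodes) merges small equivalence classes
into larger ones, raising the achievable k.

    from collections import Counter

    # Illustrative generalization: widen age bands to decades and truncate
    # postcodes so that previously unique combinations merge into one class.
    def generalize(row):
        sex, age, postcode = row
        lo = (int(age.split("-")[0]) // 10) * 10         # "20-25" -> "20-29" band
        return (sex, f"{lo}-{lo + 9}", postcode[:2] + "**")

    rows = [("male", "20-25", "3181"),
            ("male", "26-30", "3182"),
            ("male", "23-25", "3189")]
    print(min(Counter(rows).values()))                   # 1: every row is unique
    print(min(Counter(map(generalize, rows)).values()))  # 3: one class of size 3

The trade-off is information loss: the coarser the quasi-identifiers become, the less useful the
data is for analysis, which is why generalization is usually applied only as far as the target k
requires.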
To sum up, k-anonymity is a key concept in data privacy that seeks to safeguard personal information
by ensuring it is sufficiently anonymized. Since the case study's dataset does not reach the higher
degrees of k-anonymity, additional privacy protection precautions are imperative when working with
such sensitive material (Abou-el-ela Abdou Hussien, 2013).
TASK 2(a):
Benefits of Big-Data Driven Loan Evaluation That Are Ethically Significant:
Evaluating loan applications with a big-data driven system can offer banks and borrowers the
following ethically significant advantages:
Efficiency and Speed: The capacity to process enormous volumes of data quickly is one of the
main advantages of big-data driven systems. This can considerably expedite the loan
application process, giving borrowers quicker access to decisions. This efficiency benefits
borrowers who need rapid access to funds, whether to launch a business or to meet urgent
financial demands (Hsinchun Chen, 2012).
Objectivity and Consistency: Because big-data driven systems rely on data analysis and
algorithms, they can bring a degree of objectivity and consistency to the decision-making
process. Human loan officers may be swayed by conscious or unconscious biases and personal
judgments. When developed correctly, algorithms may reduce these biases and help guarantee
that decisions rest on objective criteria rather than arbitrary ones (Nadeem U. Shahid, 2021).
Greater Financial Inclusion: Banks may be able to provide loans to a wider spectrum of
applicants by utilizing big data, including people who have little or no traditional credit history. In
addition to promoting financial inclusion, this can open doors for people who might find it
difficult to obtain credit through conventional channels (Shmatikov, 2008).
Risk Mitigation: Banks may get a more complete picture of a borrower's financial situation and
risk profile from big-data driven platforms. This facilitates enhanced risk evaluation and handling.
Genuinely creditworthy borrowers may be eligible for more favorable loan conditions and lower
interest rates, which will eventually cut borrowing costs (Matt J. Kusner, 2018).
Customized Products: Banks may use big-data analysis to customize loan products to meet the
demands of specific borrowers. This customization may result in better loan terms and
arrangements that meet the goals and financial circumstances of borrowers (Nadauld, 2014).
Fraud Detection: Big-data technologies are very useful for spotting possible identity theft or
fraud. This lowers the possibility of financial loss and shields lenders and borrowers against
dishonest loan applications (Shmatikov, 2008).
Decreased Human Error: Big-data algorithms are less vulnerable to human mistakes that can
happen during manual underwriting. This lowers the possibility of making bad choices that
might hurt lenders or borrowers (Bank, 2014).
TASK 2(b):
Josh and Hannah's Ethically Significant Harms:
Josh and Hannah's loan refusal by an opaque big-data driven system may cause several ethically
significant harms:
Unfair Discrimination: Discriminatory results may arise from the broad and perhaps
inappropriate usage of personal data in loan decision-making. Discrimination on the basis of
race, gender, or health status is often extremely hurtful and unfair. In Josh and Hannah's
instance, unfair discrimination may result if the algorithm drew false conclusions about their
racial heritage (Khanna, 2016).
Lack of Transparency: When a loan is denied without reasons or an avenue of appeal, borrowers
have no chance to correct errors or problems in their application. This lack of openness
matters morally because it makes it harder for borrowers to protect their rights and make any
necessary corrections (Administration, 2022).
Impact on Life Goals: Missing out on a loan can seriously interfere with opportunities and
life goals. Josh and Hannah want to own and run a fashion business; a loan refusal not only
puts their business goals in jeopardy, but may also undermine their general financial
stability, affecting their well-being and family life (Brown, 2018).
Violation of Privacy: The system's access to a wealth of personal information, such as
inferences drawn from online genetic registries or pharmacy transactions, may be regarded as
a violation of privacy. Accessing such data without individuals' knowledge or consent raises
ethical questions about protecting their privacy and autonomy (Bureau C. F., 2018).
Diminished Economic Mobility: Loan denials can leave borrowers with fewer options and less
economic mobility. If the opaque decision has damaged Josh and Hannah's creditworthiness,
their refusal might lead to missed business opportunities and make it difficult for them to
find other sources of funding (Agarwal, 2011).
Emotional and Psychological Stress: Borrowers' mental health and general well-being may be
negatively impacted by loan denials, which can cause emotional and psychological stress. Such
rejections might leave one feeling unsettled and disappointed for a long time (Bureau C. F.,
2018).
Inaccurate Assessments: Inaccurate data or flawed algorithms may produce incorrect loan
judgments that harm borrowers needlessly. Erroneous assumptions may lead to Josh and Hannah
being wrongly classified as "moderate-to-high risk," potentially denying them opportunities
they deserve (Center, 2017).
Task 2(c):
Broader Social Harms Resulting from Widespread Use of Opaque Loan Evaluation:
In addition to the direct effects on individuals like Josh and Hannah, the widespread use of opaque
loan evaluation systems carries broader negative effects for society:
Erosion of Trust and a Sense of Unfair Treatment: Inconsistent and unexplainable loan
decisions can leave borrowers feeling unfairly treated and undermine their faith in financial
institutions. This can compromise the integrity and stability of the financial system as a
whole (ProPublica, 2016).
Task 2(d):
Openness and Explainability:
Justification: Ensuring openness and explainability in the loan evaluation process is one of
the most crucial steps in reducing harm. Financial organizations ought to offer
comprehensible and transparent explanations of the algorithms, data sources, and other
variables that go into their judgments. By giving borrowers insight into the decision-making
process, transparency helps surface and correct any biases or mistakes (Narayanan, 2014); a
simple illustration follows this subsection.
Challenges: Upholding openness while preserving confidential algorithms and data sources
requires careful balancing. Financial firms may refuse to reveal details, citing security and
intellectual-property concerns (Regulation, 2016).
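As a concrete, hypothetical illustration of explainability, the sketch below produces
reason-code-style explanations from a simple linear scoring model; the feature names, weights, and
applicant data are invented for illustration and do not represent any real lender's system.

    # Hypothetical reason codes for a linear credit score: report the features
    # that pushed the score down the most. Weights and names are assumptions.
    WEIGHTS = {
        "years_of_credit_history": 0.4,
        "on_time_payment_rate": 0.5,
        "debt_to_income_ratio": -0.6,
    }

    def reason_codes(applicant, n=2):
        contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
        # The most negative contributions are the main reasons for a low score.
        return sorted(contributions, key=contributions.get)[:n]

    applicant = {"years_of_credit_history": 2,
                 "on_time_payment_rate": 0.9,
                 "debt_to_income_ratio": 0.8}
    print(reason_codes(applicant))  # ['debt_to_income_ratio', 'on_time_payment_rate']

Even this simple form of explanation would have told Josh and Hannah which factors drove their
refusal, giving them something concrete to correct or contest.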
Fairness in Algorithms:
Justification: Preventing prejudice and discrimination requires that loan assessment
algorithms be fair. Financial institutions ought to invest in the creation and application of
algorithms with an explicit focus on fairness. Routine audits and monitoring can help find
and correct biases, ensuring fair decision-making (Bureau U. C., 2019); a rough audit sketch
follows this subsection.
Difficulties: Removing biases from algorithms can be hard, particularly when biases are
already embedded in the historical data. Developing and maintaining fair algorithms demands a
substantial investment of time and expertise (Nadauld, 2014).
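As a rough sketch of such an audit, assuming hypothetical decisions and group labels, the code below
computes a demographic-parity gap: the difference between the highest and lowest per-group approval
rates.

    # A minimal demographic-parity audit: compare approval rates across groups.
    # The decisions and group labels are hypothetical illustrations.
    def approval_rate_gap(decisions):
        """decisions: list of (group, approved) pairs. Returns the gap between
        the highest and lowest per-group approval rates (0.0 means parity)."""
        totals, approved = {}, {}
        for group, ok in decisions:
            totals[group] = totals.get(group, 0) + 1
            approved[group] = approved.get(group, 0) + (1 if ok else 0)
        rates = {g: approved[g] / totals[g] for g in totals}
        return max(rates.values()) - min(rates.values()), rates

    decisions = [("A", True), ("A", True), ("A", False),
                 ("B", True), ("B", False), ("B", False)]
    gap, rates = approval_rate_gap(decisions)
    print(rates, gap)  # A: ~0.67, B: ~0.33; gap ~0.33 -> worth reviewing

A large gap does not by itself prove discrimination, but it flags where historical bias in the
training data may be leaking into decisions.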
Appeal Procedures:
Rationale: Granting borrowers the ability to challenge loan decisions is an essential step in
averting unfair loan rejections. Appeal channels let borrowers contest decisions and request
a review of their application. This is especially crucial when automated systems are
involved, since it provides a means of addressing mistakes or inconsistencies that arise
during the assessment process (Union, 2016).
Challenges: Financial institutions may need to invest substantial resources in creating
effective and easily accessible appeal processes. Clearly established procedures and
expertise are needed to ensure that appeals are handled swiftly and equitably (Schwartz, 2015).
To summarize, three essential measures are needed to mitigate and prevent the negative effects of
opaque loan evaluation systems: openness and explainability, algorithmic fairness, and appeal
procedures. Transparency gives borrowers insight into the decision-making process, algorithmic
fairness promotes equal treatment in decision-making, and appeal mechanisms give borrowers a way
to contest judgments and seek redress. Although these measures are necessary for evaluating loans
ethically, they are not without difficulties: the need to balance openness against proprietary
concerns, the difficulty of removing biases from algorithms, and the resource-intensive nature of
creating effective appeal processes. In the era of big data, addressing these issues is crucial to
advancing just and ethical lending practices.
References
Australian Computer Society (ACS). (2022). ACS Code of Professional Conduct and Ethics.
Abou-el-ela Abdou Hussien, N. H. (2013). Protecting Privacy When Disclosing Information: k-Anonymity
and Its Enforcement through Generalization and Suppression. Journal of Information Security,
4(2).
Agarwal, R. &. (2011). The economic cost of credit denial. NBER Working Paper.
Australia, Parliament of. (1988). Privacy Act 1988 (Cth). Commonwealth of Australia.
Brown, W. &. (2018). The impact of credit denial on consumer credit scores. Federal Reserve Bank of
New York Staff Report.
Ethics, A. (2022). Algorithmic Decision-making in Financial Services: Economic and Normative Outcomes
in Consumer Credit.
Hsinchun Chen, R. H. (2012). Business intelligence and analytics: From big data to big impact. Morgan
Kaufmann.
Khanna, S. &. (2016). The impact of credit denial on small business outcomes. Federal Reserve Bank of
San Francisco Working Paper.
Matt J. Kusner, J. R. (2018). Counterfactual Fairness. In Proceedings of the 21st International
Conference on Artificial Intelligence and Statistics, 846-854.
Nadauld, T. D.-M. (2014). Big data and financial inclusion. Federal Reserve Bank of St. Louis
Review, 96(1), 3-16.
Nadeem U. Shahid, N. J. (2021). Impact of Big Data on Innovation, Competitive Advantage,
Productivity, and Decision Making. Open Journal of Business and Management, 9(2).
Narayanan, A. &. (2014). The case for transparent big data. Communications of the ACM, 57(11), 89-97.
ProPublica. (2016). Machine bias: There's software used across the country to predict future criminals.
And it's biased against blacks.
Schwartz, D. J. (2015). PII: Privacy in the information age. Stanford Law Books.
Selbst, S. B. (2016). The visible machine: Unveiling the hidden biases in machine learning. Harvard
University Press.
Shmatikov, A. N. (2008). Robust de-anonymization of large datasets. In Proceedings of the 2008 ACM
SIGSAC Conference on Computer and Communications Security. ACM, 122-138.
Sweeney, D. K. (2007). A unified framework for privacy-preserving data mining. ACM Transactions on
Knowledge Discovery from Data (TKDD), 1(1), 1-33.
Sweeney, L. (2002). Achieving k-anonymity privacy protection using generalization and suppression.
International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 10(5), 571-588.
Sweeney, P. S. (1998). Protecting privacy when disclosing information: k-anonymity and its enforcement
through generalization and suppression. Technical Report SRI-CSL-98-04.
Union, E. P. (2016). General Data Protection Regulation (GDPR). Official Journal of the European Union.