Case Study: Facebook
Facebook’s stated mission is “to give people the power to build community and bring
the world closer together.” But a deeper look at their business model suggests that it is
far more profitable to drive us apart.
Does Facebook actually filter out “disinformation that endangers public safety or
the integrity of the democratic process?”
Does Facebook actually filter out cyberbullying?
Does Facebook actually filter out calls to violence?
Does Facebook actually filter out hate speech?
Facebook has had a profound impact on our access to ideas, information, and one
another. It has unprecedented global reach, and in many markets serves as a de-facto
monopolist. The influence it has over individual and global affairs is unique in human
history.
Questions:
Facebook is a social media platform that enables people to connect and share with
each other, and it has become one of the most widely used channels for following the
latest news. Facebook has added message filters to its Messages feature to help users
screen out unwanted posts. Critics, however, argue that Facebook's conduct is
unethical. Through "filter bubbles" (social media algorithms that maximize engagement
and create echo chambers), inflammatory content is given the most visibility.
Facebook profits from the radicalism, bullying, hate speech, disinformation,
conspiracy theories, and verbal violence that spread on the platform. Political
extremism, malicious disinformation, and false stories have crept into mainstream
politics and manifested as deadly, real-world violence because of Facebook's failure
to control them. Because so many people behave unethically on Facebook, these actions
have far-reaching consequences for countries, governments, the global population, and
every corporation.
Facebook has become the de facto online medium for communication and social
engagement in many regions of the world. The primary platform passed the 2 billion
monthly-active-user mark in 2017, and global user growth has continued since then,
reaching roughly 2.6 billion by April 2020. Furthermore, Facebook has become an
important component of preserving social interaction in many nations.
2. What courses of action were recommended? What decisions had to be made,
and who was responsible for making them?
1. Eliminating violations
Facebook needs to block and delete fraudulent accounts, detect and eliminate
malicious activity, restrict the spread of false news and disinformation, and bring
unprecedented transparency to political advertising. Facebook must also enhance its
machine-learning capabilities, which will allow it to be more successful at detecting
and eliminating illegal activity. While skilled investigators manually uncover
increasingly complicated networks, these technological advances help identify and
stop illegal behavior earlier.
As a result, Facebook will be able to make significant progress. It should also
focus on deleting the millions of false accounts created every day, preventing them
from engaging in the kinds of coordinated information operations that are commonly
used to influence voters. Creating a fact-checking program that rates content as
false will also help limit fake news. While political meddling is a common motive,
others use fake news for a variety of purposes, including making money by fooling
people into clicking on something.
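The fake-account detection described above can be sketched as a toy scoring heuristic. This is purely illustrative: the feature names, thresholds, and weights below are invented for the example, and Facebook's real systems rely on large-scale machine learning rather than hand-written rules.

```python
# Toy sketch of heuristic fake-account flagging (illustrative only;
# all features and thresholds here are assumptions, not Facebook's).

def suspicion_score(account: dict) -> float:
    """Score an account from 0.0 (benign) to 1.0 (highly suspicious)."""
    score = 0.0
    if account["age_days"] < 7:        # brand-new accounts are riskier
        score += 0.4
    if account["posts_per_day"] > 50:  # unusually high posting rate
        score += 0.3
    if account["friends"] < 5:         # few real social connections
        score += 0.3
    return min(score, 1.0)

def flag_accounts(accounts: list, threshold: float = 0.6) -> list:
    """Return the names of accounts that meet the review threshold."""
    return [a["name"] for a in accounts if suspicion_score(a) >= threshold]

accounts = [
    {"name": "bot123", "age_days": 2, "posts_per_day": 120, "friends": 1},
    {"name": "alice", "age_days": 900, "posts_per_day": 3, "friends": 250},
]
print(flag_accounts(accounts))  # → ['bot123']
```

In practice such rules would only be one signal feeding a trained classifier, with flagged accounts routed to the human investigators mentioned above rather than deleted automatically.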
2. Using AI-based technology more responsively
Facebook must be more responsive in its use of AI-based technologies to filter out
hate speech, calls to violence, bullying, and disinformation that endangers public
safety or the integrity of the democratic process. Eli Pariser coined the phrase
"filter bubble" and wrote a book about how social media algorithms are designed to
maximize engagement while also creating echo chambers. Filter bubbles are not only an
algorithmic result; we often filter our own lives by associating with individuals
(both online and offline) who share our philosophical, religious, and political beliefs.
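The filter-bubble dynamic described above can be illustrated with a minimal sketch, assuming a feed that ranks purely by predicted engagement. The post data and scores are invented for the example; this is not Facebook's actual ranking algorithm, which weighs many more signals.

```python
# Toy model of an engagement-ranked feed (illustrative assumption:
# ranking by a single predicted-engagement score). Inflammatory
# content that draws more reactions rises to the top of the feed.

posts = [
    {"title": "Local bake sale", "engagement": 0.2, "inflammatory": False},
    {"title": "Outrage headline", "engagement": 0.9, "inflammatory": True},
    {"title": "Science explainer", "engagement": 0.4, "inflammatory": False},
]

def rank_feed(posts: list) -> list:
    """Sort posts by predicted engagement, highest first."""
    return sorted(posts, key=lambda p: p["engagement"], reverse=True)

feed = rank_feed(posts)
print([p["title"] for p in feed])
# The inflammatory post gets the most visibility, even though it is
# the least informative: this is the echo-chamber incentive in miniature.
```

Because users then react most to what they see first, the next round of engagement predictions reinforces the same ranking, which is the feedback loop Pariser's "filter bubble" describes.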
If you were the consultant in this case study, what other recommendations
would you add to ensure that the problems faced by the company are solved?