
Building Consentful Tech

Consent must be: FREELY GIVEN · REVERSIBLE · INFORMED · ENTHUSIASTIC · SPECIFIC
Contents

• What does consent have to do with technology?
• Content Warning
• Consentful technology is about having control over our digital bodies
• Consent makes technology more just and equitable
• Consent from the ground up
• Consent is an ongoing process
• Consentful technology relies on community and accountability
• Consentful technology moves us towards consent culture
• Resources
• Acknowledgments
• Who wrote this?
• Feedback?

2017

You are free to:

Share — copy and redistribute the material in any medium or format.
Adapt — remix, transform, and build upon the material for any purpose, even commercially.

Under the following terms:

Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
No additional restrictions — You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.

What does consent have to do with technology?

A lot of us know about consent with regard to our physical bodies, like in the context of medical decisions or sexual activities. But when it comes to our digital lives, there's a lack of discussion about what consent means for our data, our identities, and our online interactions.

This zine is intended for anyone who uses, makes, or is affected by digital technologies and wants to build a more consentful world. It is by no means a comprehensive resource, but rather a collection of thoughts and questions we've gathered in the hopes of growing this conversation.
Content Warning

We've tried as much as possible not to reproduce harm and violence in this zine, however we do make several references to sexual violence. We have used the symbol at the top of the relevant pages to indicate where this content appears.

Consentful technology is about having control over our digital bodies

When it comes to our physical bodies, we know there is more to consent than a simple yes or no. Medical procedures like surgery require the informed consent of the patient — they must be aware of both the benefits and risks in order to really consent. With sexual activity, if someone says yes to one form of intimacy but is coerced into performing another, the sex is not consensual.

But what about consent beyond our physical bodies? These days, we also have digital bodies. Digital bodies are made up of pieces of personal data. Like our physical bodies, our digital bodies exist in relationship with others and can participate in communities. They can also experience harm. Although the harm to them might not be physical, our digital bodies are frequently acted upon in non-consensual ways:

• Apps like Uber track our location even when we are not using them
• "Revenge pornography": posting intimate images without the subject's consent
• "Doxing": distributing a target's private information, like their social network account passwords
• Private information such as biometric data being shared across government databases, which particularly impacts people with disabilities, immigrants, and the poor
• Unchecked threats of sexual assault in digital spaces like Twitter

CONSENT IN THE DICTIONARY

Consent /kən'sent/
Noun: Permission for something to happen or agreement to do something.
Verb: Give permission for something to happen.
With our digital bodies, there is also more to consent than a "yes" or "no." As our physical bodies become increasingly interlinked with our digital bodies, harm can't be prevented by trying to avoid technology. And harm can't be justified because someone checked a box that said "I agree to these terms and conditions."

Instead, cases like those above point to a need for a cultural and technological shift in how we understand digital consent, as well as a political shift in how we advocate for control over our digital bodies. We want to offer up the idea of consentful technologies to help us move toward this. Consentful technologies are applications and spaces in which consent underlies all aspects, from the way they are developed, to how data is stored and accessed, to the way interactions happen between users.

We use consentful instead of "consensual" because the latter implies a singular ask or interaction. Consentful technology is about a holistic and ongoing approach to consent.

WHAT IS A DIGITAL BODY?

Digital bodies are like physical bodies in that they're composed of smaller bits. Instead of cells and organs, digital bodies have data and metadata. However, unlike a physical body that exists in one place, our digital bodies are scattered throughout the servers that make up the internet. Also unlike physical bodies, our body parts are controlled exclusively by the environment they live in.

Some types of data: photos, facial recognition information, search history, email contents, contacts and friends.

Some servers your data lives in: Internet service providers, law enforcement agencies, social media companies.

What would a future look like in which the cells of our digital bodies have more autonomy?
UNDERSTANDING CONSENT IS AS EASY AS F.R.I.E.S.

For the sake of starting with a common but robust definition of consent, we turn to Planned Parenthood's FRIES acronym.* According to this definition, consent must be:

Freely given. Doing something with someone is a decision that should be made without pressure, force, manipulation, or while incapacitated. In technology, if an interface is designed to mislead people into doing something they normally wouldn't do, the application is not consentful.

Reversible. Anyone can change their mind about what they want to do, at any time. In technology, you should have the right to limit access or entirely remove your data at any time.

Informed. Be honest. For example, if someone says they'll use protection and then they don't, that's not consent. Consentful applications use clear and accessible language to inform users about the risks they present and the data they are storing, rather than burying these important details in, e.g., the fine print of terms & conditions.

Enthusiastic. If someone isn't excited, or really into it, that's not consent. If people are giving up their data because they have to in order to access necessary services and not because they want to, that is not consentful.

Specific. Saying yes to one thing doesn't mean they've said yes to others. A consentful app only uses data the user has directly given, not data acquired through other means like scraping or buying, and uses it only in ways the user has consented to.

How might we expand this definition to address the intangible and networked qualities of digital technologies?

* Adapted from http://plannedparenthood.tumblr.com/post/148506806862/understanding-consent-is-as-easy-as-fries-consent
Consent makes technology more just and equitable

Think of a technology you use on a day to day basis. Can it have unjust or inequitable impacts on anyone? Who owns the technology, and who participates in the making of it? Who makes it? Who profits? Who is harmed?

In a lot of cases, you'll find that those who might experience harm such as harassment or surveillance are not the owners of the technology. Sometimes there is overlap between those who work on the building of the technology and those who could be harmed, but often there isn't.

There are many ways to make technology more just and equitable, and consent is one important consideration. Non-consentful features and interactions can be minor nuisances for some people, but can be very harmful to others. When Facebook introduced photo tagging, anyone could tag you in a photo, whether or not you were okay with it. For some users, that could lead to embarrassment if the photo wasn't particularly flattering. But for other people, the harm could be much more serious. For trans users, tagging photos from their pre-transition lives without their consent could lead to them being outed, which can have consequences for employment, housing, safety, and more.

In response to user outcry, Facebook eventually implemented a process by which users can approve tagged photos. However, it required a critical mass of complaints to make this happen. And, Facebook still stores photos that are tagged with your face in its database, which informs its facial recognition algorithms. Whether you consented to being tagged or not, Facebook has a 98% accurate idea of what your face looks like. [1]

Consider the technology you were thinking of earlier. What would it look like if it was built to ensure that everyone had an equitable experience, and some users were not negatively impacted more than others? Who would need to own and build the technology for this to happen?

[1] Facebook's image recognition algorithms can "recognize human faces with 98% accuracy, even if they [aren't] directly facing the camera...[It can] identify a person in one picture out of 800 million in less than five seconds." 'Inside Facebook's Biggest Artificial Intelligent Project Ever,' Fortune Magazine, April 13, 2016. http://fortune.com/facebook-machine-learning/
WHAT THIS ZINE BUILDS UPON

Our thinking around consentful technology has been shaped by numerous other ideas. This zine builds upon the following concepts and movements:

Consent culture. The anti-violence against women movement has given us the concept of consent culture. Consent culture is "a culture in which asking for consent is normalized and condoned in popular culture. It is respecting the person's response even if it isn't the response you had hoped for. We will live in a consent culture when we no longer objectify people and we value them as human beings. Consent culture is believing that you and your partner(s) have the right over your own bodily autonomies and understanding that each of you know what is best for yourselves." [1] We believe that consentful technology has an important role to play in creating a consent culture.

Design justice. Design justice is an approach to design that is rooted in equity and community. The Design Justice Network is "striving to create design practices that center those who stand to be most adversely impacted by design decisions in design processes." [2] Design processes that are led by and centered around people who can be unjustly impacted by technology are a cornerstone of consentful tech.

Digital justice. According to the Detroit Digital Justice Coalition, "communication is a fundamental human right. We are securing that right for the digital age by promoting access, participation, common ownership, and healthy communities." [3] Consentful technology is modelled on equitable access, participation in the design process, ownership and control of our digital bodies, and communities based in consenting interactions.

Community technology. The Detroit Community Technology Project defines community technology as "a method of teaching and learning about technology with the goal of restoring relationships and healing neighborhoods." [4] Consentful technology builds healthy digital communities through consenting interactions and relationships.

[1] Only with Consent. http://onlywithconsent.org/blog/consent-culture
[2] Design Justice Network. http://designjusticenetwork.org/
[3] Detroit Digital Justice Coalition. http://detroitdjc.org/about/story/
[4] Detroit Community Technology Project, Teaching Community Technology Handbook.

BUILDING CONSENTFUL TECHNOLOGIES TOGETHER

It can be hard for people who aren't developers or investors to imagine how they can be involved in building consentful tech. It can also be challenging for the people who are making technology to imagine how to center their work around people who are most vulnerable. But technologists and non-technologists alike have important roles to play.

Non-technology folks can contribute to building consentful tech by:

• Holding the platforms we use accountable to how they use our data
• Advocating for consent-focused policy and legislation
• Intervening in development processes through community organizing (petitions, demonstrations, etc.)
• Signing on to platforms that are consentful
• Learning more about code, policies, and legislation

Tech folks can contribute to building consentful tech by:

• Advocating for diverse teams
• Opening up design & development processes to people who are vulnerable to harm
• Working towards a culture of consent in our companies and organizations
• Mentoring newcomers, particularly those who are often excluded or marginalized from mainstream tech communities
• Growing our knowledge on concepts like collaborative design processes and intersectionality
• Consistently reviewing our development processes
Consent from the ground up

How might the technologies we are most reliant upon look if they were designed with consent at their core? What if, before writing a single line of code, the following questions were asked:

Are people Freely giving us their consent to access and store parts of their digital bodies? Can potentially harmful personal information about a person be displayed or stored without their consent?

Does our system allow for Reversible consent? How easy is it for people to withdraw both their consent and their data?

How are we fully and clearly Informing people about what they're consenting to? Is important information about the risks a user might be exposed to buried in the fine print of the terms & conditions?

How are we making sure that the consent is Enthusiastic? Is there an option not to use this technology, which means that people use it because they prefer to use it? In many places one can only access social service benefits online. Declining to register with these online services is not an option for those who need these benefits most.

Can people consent to Specific things in this system and not others? Can people select which aspects of their digital bodies they want to have exposed and have stored?

When technology is built without asking these questions from the beginning, serious harm can happen. It often takes multiple instances of harm for a patch to be designed. Popular photo and video sharing platforms, for instance, have been used to circulate images of acts of sexual violence, which re-harms people who have experienced violence and perpetuates violence in our culture. It is extremely difficult to delete these images once they have been distributed. These are obviously not the use cases the developers and owners of these platforms were intending, but they do illustrate the harm that can happen when we fail to design with consent in mind from the ground up, and foreground the concerns of users who could be severely and unjustly impacted. We can and must do better.

DEALING WITH RISK

Unlike our physical bodies, a digital body can be in many places at once. It can be at rest in a database, socializing in the cloud, or traveling through the tubes. You should have a good understanding of the risks to your digital body that are involved in its activities, so you can make informed decisions, just like the decisions you make about where to travel, where to stay, and how to get there with your physical body.

Historically speaking, technology providers have done a poor job of acknowledging those risks. We see this in data breaches, where intimate details of digital bodies (passwords, credit card information, medical information, private photos etc) are exposed publicly or sold to the highest bidder. We also see it in cases where companies make use of your digital body in ways you didn't intend, and whether that's malicious (using your profile pic to advertise dating services) or seemingly benign (adding a new feature that exposes you to additional risk) it is still an act taken upon part of your digital body without your consent.

We can do better. By designing the system so certain things are impossible, we lower the trust barrier for that system. For example, your personal information could be stored encrypted, with the decryption keys residing only on your own devices. The application sends data and code to your device, and your consent is requested for each operation.

If we can't make it safer then we can acknowledge the remaining risks and educate users about them. Let's build industry standards for reporting risks at rest, in transit and during processing. Additional standards for functionality-based risk would help too. This is a big problem: it requires software developers, industry groups, advocates and users all working together, and it starts by having this conversation. Let's talk about how risky it currently is to use software, and how we can make it safer and more accessible for everyone.

IDEAS FOR TECHNICAL MECHANISMS

A technique called differential privacy [1] provides a way to measure the likelihood of negative impact and also a way to introduce plausible deniability, which in many cases can dramatically reduce risk exposure for sensitive data.

Modern encryption techniques allow a user's information to be fully encrypted on their device, but using it becomes unwieldy. Balancing the levels of encryption is challenging, but can create strong safety guarantees. Homomorphic encryption [2] can allow certain types of processing or aggregation to happen without needing to decrypt the data.

Creating falsifiable security claims allows independent analysts to validate those claims, and invalidate them when they are compromised. For example, by using subresource integrity to lock the code on a web page, the browser will refuse to load any compromised code. By then publishing the code's hash in an immutable location, any compromise of the page is easily detectable (and automatically, with a service worker or external monitor).

Taken to their logical conclusion these techniques suggest building our applications in a more decentralized [3] way, which not only provides a higher bar for security, but also helps with scaling: if everyone is sharing some of the processing, the servers can do less work. In this model your digital body is no longer spread throughout servers on the internet; instead the applications come to you and you directly control how they interact with your data.

[1] https://www.cis.upenn.edu/~aaroth/Papers/privacybook.pdf
[2] https://crypto.stanford.edu/craig/craig-thesis.pdf
[3] http://scuttlebot.io

Consent is an ongoing process

The process of asking for consent does not stop at the first yes.

"Is it cool if I Snapchat this video of you dancing with that hot dog?"

Saying yes to an interaction once — whether it's a hug or linking your user account with your Facebook profile — should not imply that the consent was provided for an indefinite period of time.

Platforms like Google are incorporating periodic check-ins with users about what they've consented to, which is a good start. But Google's account holders aren't the only ones impacted by non-consent on their platform: for example, anyone who has their name and email address added to an open Google Sheet has potential exposure.

This is because many of the technologies we rely on only require the consent of a user to the system, or of users to each other. What about people who are impacted who are not users? We have found that asking people directly, as one would in a physical interaction, is a strong practice. How might your experience of the Internet shift if people who had access to your digital body, whether in the form of photos or contact information, were to check in with you from time to time about it? What technologies would we need to build to help us manage ongoing and direct consent processes?
AN EQUITABLE ITERATION PROCESS

"Fail faster" is a maxim of application developers these days. It means putting something out into the world quickly and responding to user feedback in future iterations. This is a great way to optimize the value of your application to your users, by starting with something simple and experimenting until you get the right features.

Unfortunately, while this process can increase positive impacts, it does nothing to diminish negative impacts. The fail faster approach experiments not only with features but also with the lives of people using those features. Consider the release of the Alexa app for Amazon Echo, which did not allow for blocking calls or texts. This raises immediate red flags for anyone who has been doxed or stalked, and may have directly led to harm for Alexa users.

It isn't enough to iterate features in response to harm — we must also iterate the process that led to those features being released. What would that process look like if it was centered around the privacy and security of survivors of violence? Of people from communities that are regularly subject to state surveillance?

Consentful technology relies on community and accountability

We have talked a lot about what we can do to build more consentful technologies. But implementing these measures can't guarantee that non-consensual actions will not happen. This is why community and accountability are critical in addressing harm.

DIGITAL COMMUNITIES

More and more, our digital bodies exist in digital networks and communities. Whenever there are multiple relationships between people, a type of community or network is created. One type is formed when people sign up for a service — as users, they are now in relationship with whoever owns and works on the technology. Another type of community is created when users interact with each other. Digital communities can overlap with physical communities.

A COMMUNITY ACCOUNTABILITY APPROACH

Accountability means being held responsible for your actions. The accountability mechanisms available in most technologies are not good enough. Blocking users who are harassing you does not easily stop them from harassing others. Reporting an image that is harmful to you does not stop that image from being posted by others and to other platforms.

Some people have called for police departments to become more knowledgeable about current technology, and for lawmakers to create harsher punishments for people who are committing violence online. But the problems with this approach mirror those that are rampant in enforcement of sexual assault laws. Often it is the person who experienced the harm who is blamed — why did you send nude photos to your ex, or why didn't you just ignore that troll? And, for Black and Indigenous people, racialized immigrants, LGBTQ people and more, police and prisons are key vectors of violence in daily life. What if we built community-based responses to harm and violence into our technologies?

When we act harmfully against others, whether it is intentional or not, there is an impact on both that person and the community as a whole. This is true whether the harm is interpersonal or caused by algorithms. So we must be responsible to each other as individuals as well as members of a community. This is what is meant by community accountability.

Community accountability is a community-based strategy, rather than a police/prison-based strategy, to address violence within our communities. Community accountability is a process in which a community — a group of friends, a family, a church, a workplace, an apartment complex, a neighborhood, etc — work together to do the following things:

• Create and affirm values & practices that resist abuse and oppression and encourage safety, support, and accountability
• Develop sustainable strategies to address community members' abusive behavior, creating a process for them to account for their actions and transform their behavior
• Commit to ongoing development of all members of the community, and the community itself, to transform the political conditions that reinforce oppression and violence
• Provide safety & support to community members who are violently targeted that respects their self-determination [1]

What would a community accountability approach to digital communities look like? How would it work for both people who are users of the technology in question, as well as people who might be impacted by it? How could this change the way that the creators of algorithms are held accountable for the harms that their biases cause?

DIGITAL BODIES ARE CONNECTED TO EACH OTHER IN DIGITAL RELATIONSHIPS AND COMMUNITIES

Our digital bodies interact with each other, intermediated by the servers they inhabit. Currently all the control is in the environment, and the data that makes up our digital bodies is passive and lacks agency. By binding that data into a cell with its own logic, protected by encryption, we could restore autonomy to our digital bodies, allowing interactions to involve us instead of acting upon us.

[1] http://www.incite-national.org/page/community-accountability
STRONG COMMUNITIES GIVE RISE TO MORE CONSENTFUL TECHNOLOGIES

When attention is paid to relationships, stronger communities result. This is the case in physical communities as well as digital. Users and makers can strengthen their communities and improve consent therein by asking:

• How can we better protect each other? For example, is there a technical way to have other community members see and respond to harassing messages, so the person who is targeted does not have to deal with the barrage alone?
• How can we hold each other accountable as a community? What are some community-based strategies for addressing non-consensual actions that work on the roots of the issue?
• How can we better support and uplift each other? How can we normalize asking for consent on our platform?

Small changes can make a big difference when we add a little friction to pathways used for abusive behaviour, and when we make it easier for people to help each other. For example, new users might have a quieter voice until they've been around awhile, or messages mentioning you could be downvoted by your friends so you won't see them.

Consentful technology moves us towards consent culture

Currently, achieving some measure of privacy and security in technology requires active participation from users, which means when that trust is violated it is the users who pay the price, and often the users who are blamed.

The cost of interacting with technology securely is quite high, and those least able to pay that cost are also those most at risk of harm when things break. Just as we should not blame survivors for sexual violence, we must not place the burden of safety on users in terms of who is responsible and who suffers the consequences.

We see an alternative to this in consent culture. Consent culture is a culture in which asking for consent permeates all our interactions small and large — whether it's asking before going in for a hug, checking in about taking a photo, or asking whether a sexual activity feels okay. With technology mediating so many of our daily interactions, it plays an increasingly large part in establishing the type of culture we live in. Building consentful technology is not just about our applications and data; it is about creating a culture of consent for the entire world to share in.
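The "small changes" mentioned earlier, quieter voices for brand-new accounts and friend downvotes on messages that mention you, amount to a ranking rule. One sketch of such friction, where every threshold and weight is a made-up illustration rather than a recommendation:

```python
from datetime import timedelta

def visibility_score(account_age: timedelta, friend_downvotes: int) -> float:
    # Returns a 0..1 multiplier on how prominently a message is shown.
    # New accounts ramp from 0.2 to full volume over their first 30 days,
    # and each downvote from the recipient's friends halves prominence.
    age_factor = min(1.0, 0.2 + 0.8 * (account_age / timedelta(days=30)))
    downvote_factor = 0.5 ** friend_downvotes
    return age_factor * downvote_factor

print(visibility_score(timedelta(days=1), 0))    # brand-new account: quiet voice
print(visibility_score(timedelta(days=90), 0))   # -> 1.0, established account
print(visibility_score(timedelta(days=90), 3))   # -> 0.125, flagged by friends
```

The design choice here is that friction is graduated rather than binary: nothing is censored outright, but a barrage loses its volume before it reaches the person targeted, so they do not have to deal with it alone.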


Resources

READING

Consent of the Networked: The Worldwide Struggle for Internet Freedom. Rebecca MacKinnon. Lebanon, IN: Basic Books, 2012.

Design Justice Zine. Design Justice Network. http://designjusticenetwork.org/zine

From Paranoia to Power. Our Data Bodies Project 2016 Report. https://www.odbproject.org/wp-content/uploads/2016/12/ODB-Community-Report-7-24.pdf

Learning Good Consent: On Healthy Relationships and Survivor Support. Edited by Cindy Crabb. Chico, CA: AK Press, 2016.

Teaching Community Technology Handbook. Detroit Community Technology Project. https://detroitcommunitytech.org/teachcommtech

ORGS & PROJECTS

Crash Override Network — http://www.crashoverridenetwork.com
INCITE! — http://www.incite-national.org
Hold Your Boundaries project — https://www.holdyourboundaries.com
Troll Busters — https://www.troll-busters.com

Who wrote this?

This zine was written by Una Lee and Dann Toliver, who are the team behind the Ripple Mapping Tool. Una is a design practitioner, a collaborative design facilitator, and a design justice advocate. Dann spends a lot of time talking to people about computers, and to computers about people.

FEEDBACK?

We welcome your comments on this zine. To send feedback, please visit http://bit.ly/2yXQaZy
Acknowledgments

This zine would not exist without funding and moral support from Allied Media Projects and the Mozilla Foundation.

Much gratitude to the extended Allied Media Conference family for generously lending your time and insights about consent.

Thanks to Shameela Zaman, Lupe Pérez, Hisayo Horie, Erin Toliver, Tyler Sloane, and Alex Leitch for reviewing the content of this zine.

Thank you also to the Difference Engine Initiative participants — you were the spark that inspired this all.

Graphic Design: And Also Too

To download a PDF of this zine, please visit ripplemap.io/zine
Consent must be... REVERSIBLE · FREELY GIVEN · INFORMED · ENTHUSIASTIC · SPECIFIC
