Social Media NUDGE - Klobuchar-Lummis - MUR22061

MUR22061 W95 S.L.C.

117TH CONGRESS
2D SESSION

S. ll

To require the Federal Trade Commission to identify content-agnostic platform interventions to reduce the harm of algorithmic amplification and social media addiction on covered platforms, and for other purposes.

IN THE SENATE OF THE UNITED STATES

llllllllll

Ms. KLOBUCHAR (for herself and Ms. LUMMIS) introduced the following bill; which was read twice and referred to the Committee on llllllllll

A BILL

To require the Federal Trade Commission to identify content-agnostic platform interventions to reduce the harm of algorithmic amplification and social media addiction on covered platforms, and for other purposes.

Be it enacted by the Senate and House of Representatives of the United States of America in Congress assembled,

SECTION 1. SHORT TITLE.

This Act may be cited as the “Nudging Users to Drive Good Experiences on Social Media Act” or the “Social Media NUDGE Act”.

SEC. 2. FINDINGS.

Congress finds the following:


(1) Social media platforms can have significant impacts on their users, both positive and negative. However, social media usage can be associated with detrimental outcomes, including on a user’s mental and physical health. Design decisions made by social media platforms, such as decisions affecting the content a user might see on a social media platform, may drive or exacerbate these negative or detrimental outcomes.
(2) Viral harmful content often spreads on social media platforms. Social media platforms do not consistently enforce their terms of service and content policies, leading to supposedly prohibited content often being shown to users and amplified by such platforms.
(3) Social media platforms often rely heavily on automated measures for content detection and moderation. These social media platforms may rely on such automated measures due to the large quantity of user-generated content on their platforms. However, evidence suggests that even state-of-the-art automated content moderation systems currently do not fully address the harmful content on social media platforms.
(4) Significant research has found that content-agnostic interventions, if made by social media platforms, may help significantly mitigate these issues. These interventions could be readily implemented by social media platforms to provide safer user experiences. Such interventions include the following:
(A) Nudges to users and increased platform viewing options, such as screen time alerts and grayscale phone settings, which may reduce addictive platform usage patterns and improve user experiences online.
(B) Labels and alerts that require a user to read or review user-generated content before sharing such content.
(C) Prompts to users, which may help users identify manipulative and microtargeted advertisements.
(D) Other research-supported content-agnostic interventions.
(5) Evidence suggests that increased adoption of content-agnostic interventions would lead to improved outcomes of social media usage. However, social media platforms may be hesitant to independently implement content-agnostic interventions that will reduce negative outcomes associated with social media use.

SEC. 3. STUDY ON CONTENT-AGNOSTIC INTERVENTIONS.

(a) STUDY TO IDENTIFY CONTENT-AGNOSTIC INTERVENTIONS.—The Director of the National Science Foundation (in this section referred to as the “Director”) shall enter into an agreement with the National Academies of Sciences, Engineering, and Medicine (in this section referred to as the “Academies”) to conduct ongoing studies to identify content-agnostic interventions that covered platforms could implement to reduce the harms of algorithmic amplification and social media addiction on covered platforms. The initial study shall—
(1) identify ways to define and measure the negative mental or physical health impacts related to social media, including harms related to algorithmic amplification and social media addiction, through a review of—
(A) a wide variety of studies, literature, reports, and other relevant materials created by academic institutions, civil society groups, and other appropriate sources; and
(B) relevant internal research conducted by a covered platform or third party research in the possession of a covered platform that is voluntarily submitted to the Academies by the covered platform (through a process, established by the Academies, with appropriate privacy safeguards);
(2) identify research-based content-agnostic interventions, such as reasonable limits on account creation and content sharing, to combat problematic smartphone use and other negative mental or physical health impacts related to social media, including through a review of the materials described in subparagraphs (A) and (B) of paragraph (1);
(3) provide recommendations on how covered platforms may be separated into groups of similar platforms for the purpose of implementing content-agnostic interventions, taking into consideration factors including any similarity among the covered platforms with respect to—
(A) the number of monthly active users of the covered platform and the growth rate of such number;
(B) how user-generated content is created, shared, amplified, and interacted with on the covered platform;
(C) how the covered platform generates revenue; and
(D) other relevant factors for providing recommendations on how covered platforms may be separated into groups of similar platforms;
(4) for each group of covered platforms recommended under paragraph (3), provide recommendations on which of the content-agnostic interventions identified in paragraph (2) are generally applicable to the covered platforms in such group;
(5) for each group of covered platforms recommended under paragraph (3), provide recommendations on how the covered platforms in such group could generally implement each of the content-agnostic interventions identified for such group under paragraph (4) in a way that does not alter the core functionality of the covered platforms, considering—
(A) whether the content-agnostic intervention should be offered as an optional setting or feature that users of a covered platform could manually turn on or off with appropriate default settings to reduce the harms of algorithmic amplification and social media addiction on the covered platform without altering the core functionality of the covered platform; and
(B) other means by which the content-agnostic intervention may be implemented and any associated impact on the experiences of users of the covered platform and the core functionality of the covered platform;
(6) for each group of covered platforms recommended under paragraph (3), define metrics generally applicable to the covered platforms in such group to measure and publicly report in a privacy-preserving manner the impact of any content-agnostic intervention adopted by the covered platform; and
(7) identify data and research questions necessary to further understand the negative mental or physical health impacts related to social media, including harms related to algorithmic amplification and social media addiction.
(b) REQUIREMENT TO SUBMIT ADDITIONAL RESEARCH.—If a covered platform voluntarily submits internal research to the Academies under subsection (a)(1)(B), the covered platform shall, upon the request of the Academies and not later than 60 days after receiving such a request, submit to the Academies any other research in the platform’s possession that is closely related to such voluntarily submitted research.
(c) REPORTS.—
(1) INITIAL STUDY REPORT.—Not later than 1 year after the date of enactment of this Act, the Academies shall submit to the Director, Congress, and the Commission a report containing the results of the initial study conducted under subsection (a), including recommendations for how the Commission should establish rules for covered platforms related to content-agnostic interventions as described in paragraphs (1) through (5) of subsection (a).
(2) UPDATES.—Not later than 2 years after the effective date of the regulations promulgated under section 4, and every 2 years thereafter during the 10-year period beginning on such date, the Academies shall submit to the Director, Congress, and the Commission a report containing the results of the ongoing studies conducted under subsection (a). Each such report shall—
(A) include analysis and updates to earlier studies conducted, and recommendations made, under such subsection;
(B) be based on—
(i) new academic research, reports, and other relevant materials related to the subject of previous studies, including additional research identifying new content-agnostic interventions; and
(ii) new academic research, reports, and other relevant materials about harms occurring on covered platforms that are not being addressed by the content-agnostic interventions being implemented by covered platforms as a result of the regulations promulgated under section 4;
(C) include information about the implementation of the content-agnostic interventions by covered platforms and the impact of the implementation of the content-agnostic interventions; and
(D) include an analysis of any entities that have newly met the criteria to be considered a covered platform under this Act since the last report submitted under this subsection.

SEC. 4. IMPLEMENTATION OF CONTENT-AGNOSTIC INTERVENTIONS.

(a) DETERMINATION OF APPLICABLE CONTENT-AGNOSTIC INTERVENTIONS.—
(1) IN GENERAL.—Not later than 60 days after the receipt of the initial study report under section 3(c)(1), the Commission shall initiate a rulemaking proceeding for the purpose of promulgating regulations in accordance with section 553 of title 5, United States Code—
(A) to determine how covered platforms should be grouped together;
(B) to determine which content-agnostic interventions identified in such report shall be applicable to each group of covered platforms identified in the report; and
(C) to require each covered platform to implement and measure the impact of such content-agnostic interventions in accordance with subsection (b).
(2) CONSIDERATIONS.—In the rulemaking proceeding described in paragraph (1), the Commission—
(A) shall consider the report under section 3(c)(1) and its recommendations; and
(B) shall not promulgate regulations requiring any covered platform to implement a content-agnostic intervention that is not discussed in such report.
(3) NOTIFICATION TO COVERED PLATFORMS.—The Commission shall, not later than 30 days after the promulgation of the regulations under this subsection, provide notice to each covered platform of the content-agnostic interventions that are applicable to the platform pursuant to the regulations promulgated under this subsection.
(b) IMPLEMENTATION OF CONTENT-AGNOSTIC INTERVENTIONS.—
(1) IN GENERAL.—
(A) IMPLEMENTATION PLAN.—
(i) IN GENERAL.—Not later than 60 days after the date on which a covered platform receives the notice from the Commission required under subsection (a)(3), the covered platform shall submit to the Commission a plan to implement each content-agnostic intervention applicable to the covered platform (as determined by the Commission) in an appropriately prompt manner. If the covered platform reasonably believes that any aspect of an applicable intervention is not technically feasible for the covered platform to implement, would substantially change the core functionality of the covered platform, or would pose a material privacy or security risk to consumer data stored, held, used, processed, or otherwise possessed by such covered platform, the covered platform shall include in its plan evidence supporting these beliefs in accordance with paragraph (2).
(ii) COMMISSION DETERMINATION.—Not later than 30 days after receiving a covered platform’s plan under clause (i), the Commission shall determine whether such plan includes details related to the appropriately prompt implementation of each content-agnostic intervention applicable to the covered platform, except for any aspect of an intervention for which the Commission determines the covered platform is exempt under paragraph (2).
(iii) APPEAL OR REVISED PLAN.—
(I) IN GENERAL.—Subject to subclause (II), if the Commission determines under clause (ii) that a covered platform’s plan does not satisfy the requirements of this subsection, not later than 90 days after the issuance of such determination, the covered platform shall—
(aa) appeal the determination by the Commission to the United States Court of Appeals for the Federal Circuit; or
(bb) submit to the Commission a revised plan for a Commission determination pursuant to clause (ii).
(II) LIMITATION.—If a covered platform submits 3 revised plans to the Commission for a determination pursuant to clause (ii) and the Commission determines that none of the revised plans satisfy the requirements of this subsection, the Commission may find that the platform is not acting in good faith in developing an implementation plan and may require the platform to implement, pursuant to a plan developed for the platform by the Commission, each content-agnostic intervention applicable to the platform (as determined by the Commission) in an appropriately prompt manner.
(B) STATEMENT OF COMPLIANCE.—Not less frequently than annually, each covered platform shall make publicly available on its website and submit to the Commission, in a machine-readable format and in a privacy-preserving manner, the details of—
(i) the covered platform’s compliance with the required implementation of content-agnostic interventions; and
(ii) the impact (using the metrics defined by the Director of the National Science Foundation and the National Academies of Sciences, Engineering, and Medicine pursuant to section 3(a)(6)) of such content-agnostic interventions on reducing the harms of algorithmic amplification and social media addiction on covered platforms.
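
The statement of compliance above must be machine-readable, but the bill leaves the actual format to the Commission's rulemaking. Purely as an illustration, a minimal Python sketch of one plausible shape for such a filing follows; every field name, the JSON encoding, and the example metric are assumptions, not requirements drawn from the bill.

    import json

    def compliance_statement(platform, interventions):
        """Serialize a hypothetical annual statement of compliance.

        `interventions` maps each applicable content-agnostic
        intervention to its implementation status and the impact
        metrics defined under section 3(a)(6), already aggregated
        in a privacy-preserving manner (no user-level records).
        """
        return json.dumps(
            {"platform": platform, "interventions": interventions},
            indent=2,
        )

    # Hypothetical usage with an assumed metric name.
    print(compliance_statement("example-platform", {
        "screen_time_alerts": {
            "implemented": True,
            "impact": {"mean_daily_minutes_change": -4.2},
        },
    }))
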
(2) FEASIBILITY, FUNCTIONALITY, PRIVACY, AND SECURITY EXEMPTIONS.—
(A) STATEMENT OF INAPPLICABILITY.—Not later than 60 days after the date on which a covered platform receives the notice from the Commission required under subsection (a)(3), a covered platform seeking an exemption from any aspect of such rule may submit to the Commission—
(i) a statement identifying any specific aspect of a content-agnostic intervention applicable to such covered platform (as determined by the Commission under subsection (a)) that the covered platform reasonably believes—
(I) is not technically feasible for the covered platform to implement;
(II) will substantially change the core functionality of the covered platform; or
(III) will create a material and imminent privacy or security risk to the consumer data stored, held, used, processed, or otherwise possessed by such covered platform; and
(ii) specific evidence supporting such belief, including any relevant information regarding the core functionality of the covered platform.
(B) DETERMINATION BY THE COMMISSION.—Not later than 30 days after receiving a covered platform’s statement under subparagraph (A), the Commission shall determine whether the covered platform shall be exempt from any aspect of a content-agnostic intervention discussed in the covered platform’s statement.
(C) APPEAL OR REVISED PLAN.—Not later than 90 days after a determination issued under subparagraph (B), a covered platform may—
(i) appeal the determination by the Commission to the United States Court of Appeals for the Federal Circuit; or
(ii) submit to the Commission a revised plan, including details related to the prompt implementation of any content-agnostic intervention for which the covered platform requested an exemption that the Commission subsequently denied, for a Commission determination pursuant to paragraph (1)(A)(ii).

SEC. 5. TRANSPARENCY REPORT.

Not later than 180 days after the date of enactment of this Act, and semiannually thereafter, each covered platform shall publish a publicly available, machine-readable report about the content moderation efforts of the covered platform with respect to each language spoken by not less than 100,000 monthly active users of the covered platform in the United States. Such report shall include the following:
(1) CONTENT MODERATORS.—The total number of individuals employed or contracted by the covered platform during the reporting period to engage in content moderation for each language, broken down by the number of individuals retained as full-time employees, part-time employees, and contractors of the covered platform and reported in a privacy-preserving manner.
(2) RANDOM SAMPLE OF VIEWED CONTENT.—Information related to a random sample of publicly visible content accounting for 1,000 views each month. Each month, covered platforms shall calculate the total number of views for each piece of publicly visible content posted during the month and sample randomly from the content in a manner such that the probability of a piece of content being sampled is proportionate to the total number of views of that piece of content during the month. Covered platforms shall report the following information about each piece of sampled content (with appropriate redactions to exclude the disclosure of illegal content):
(A) The text, images, audio, video, or other creative data associated with each such piece of content.
(B) The details of the account or accounts that originally posted the content.
(C) The total number of views of each such piece of content during the month.
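
Paragraph (2) above effectively prescribes view-weighted random sampling: each draw picks a piece of publicly visible content with probability proportional to its view count for the month. A minimal Python sketch of that draw follows; the data shape and field names are hypothetical, and sampling with replacement is one reasonable reading, since the bill fixes only the per-draw probability.

    import random

    def sample_by_views(posts, k=1000):
        """Draw k pieces of content, each with probability
        proportional to its monthly view count (with replacement,
        so heavily viewed content can be drawn more than once).

        `posts` is a hypothetical list of dicts such as
        {"id": "abc", "views": 4210, "account": "user1"}.
        """
        weights = [post["views"] for post in posts]
        return random.choices(posts, weights=weights, k=k)

A without-replacement scheme weighted by views would also be defensible under the text; the bill does not say which the Commission should require.
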
(3) HIGH REACH CONTENT.—Content moderation metrics broken down by language to assess the prevalence of harmful content on the covered platform, including, for each language, the 1,000 most viewed pieces of publicly visible content each month, including the following (with appropriate redactions to exclude the disclosure of illegal content):
(A) The text, images, audio, video, or other creative data associated with each such piece of content.
(B) The details of—
(i) the account that originally posted the content; and
(ii) any account whose sharing or reposting of the content accounted for more than 5 percent of the views of the content.
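
The high reach requirement in paragraph (3) amounts to a per-language top-k selection over each month's publicly visible content. A sketch, again with hypothetical field names, using a heap so the full corpus never needs a complete sort:

    import heapq
    from collections import defaultdict

    def top_viewed_by_language(posts, k=1000):
        """Return the k most-viewed posts for each language.

        `posts` is a hypothetical iterable of dicts like
        {"id": "abc", "language": "en", "views": 4210}.
        """
        by_language = defaultdict(list)
        for post in posts:
            by_language[post["language"]].append(post)
        # heapq.nlargest keeps only k candidates per language in memory.
        return {
            lang: heapq.nlargest(k, group, key=lambda p: p["views"])
            for lang, group in by_language.items()
        }
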
(4) REMOVED AND MODERATED CONTENT.—
(A) IN GENERAL.—Aggregate metrics for user-generated content that is affected by any automated or manual moderation system or decision, including, as calculated on a monthly basis and reported in a privacy-preserving manner, the number of pieces of user-generated content and the number of views of such content that were—
(i) reported to the covered platform by a user of the covered platform;
(ii) flagged by the covered platform by an automated content detection system;
(iii) removed from the covered platform and not restored;
(iv) removed from the covered platform and later restored; or
(v) labeled, edited, or otherwise moderated by the covered platform following a user report or flagging by an automated content detection system.
(B) REQUIREMENTS FOR METRICS.—The metrics reported under subparagraph (A) shall be broken down by—
(i) the language of the user-generated content;
(ii) the topic of the user-generated content, such as bullying, hate speech, drugs and firearms, violence and incitement, or any other category determined by the covered platform to categorize such content; and
(iii) if the covered platform has a process for publicly verifying that an account on the platform belongs to a prominent user or public figure, whether the creator of the content is a politician or journalist with a verified account.
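
Subparagraphs (A) and (B) together describe a pivot table: counts of moderated pieces of content, and of their views, keyed by moderation outcome, language, and topic. A minimal sketch of that monthly aggregation follows; the event fields are hypothetical, and the outcome labels simply track clauses (i) through (v).

    from collections import Counter

    # Hypothetical outcome labels tracking clauses (i)-(v).
    OUTCOMES = ("user_reported", "auto_flagged", "removed",
                "removed_then_restored", "labeled_or_edited")

    def aggregate_moderation(events):
        """Count moderated pieces and views, broken down by
        (outcome, language, topic) as subparagraph (B) requires.

        `events` is a hypothetical iterable of dicts like
        {"outcome": "removed", "language": "en",
         "topic": "bullying", "views": 120}.
        """
        pieces, views = Counter(), Counter()
        for event in events:
            if event["outcome"] not in OUTCOMES:
                continue  # ignore events outside clauses (i)-(v)
            key = (event["outcome"], event["language"], event["topic"])
            pieces[key] += 1
            views[key] += event["views"]
        return pieces, views
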

SEC. 6. ENFORCEMENT.

(a) UNFAIR OR DECEPTIVE ACTS OR PRACTICES.—A violation of section 3(b), 4, or 5 or a regulation promulgated under section 4 shall be treated as a violation of a rule defining an unfair or deceptive act or practice prescribed under section 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)).
(b) POWERS OF THE COMMISSION.—
(1) IN GENERAL.—The Commission shall enforce this Act in the same manner, by the same means, and with the same jurisdiction, powers, and duties as though all applicable terms and provisions of the Federal Trade Commission Act (15 U.S.C. 41 et seq.) were incorporated into and made a part of this Act.
(2) PRIVILEGES AND IMMUNITIES.—Any person who violates section 4 or 5 or a regulation promulgated under section 4 shall be entitled to the privileges and immunities provided in the Federal Trade Commission Act (15 U.S.C. 41 et seq.).
(3) ENFORCEMENT GUIDELINES AND UPDATES.—Not later than 1 year after the date of enactment of this Act, the Commission shall issue guidelines that outline any policies and practices of the Commission related to the enforcement of this Act in order to promote transparency and deter violations of this Act. The Commission shall update the guidelines as needed to reflect current policies, practices, and changes in technology, but not less frequently than once every 4 years.
(4) AUTHORITY PRESERVED.—Nothing in this Act shall be construed to limit the authority of the Commission under any other provision of law.

SEC. 7. DEFINITIONS.

In this Act:
(1) ALGORITHMIC AMPLIFICATION.—The term “algorithmic amplification” means the promotion, demotion, recommendation, prioritization, or deprioritization of user-generated content on a covered platform to other users of the covered platform through a means other than presentation of content in a reverse-chronological or chronological order.
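
The definition turns entirely on ordering: presenting content in (reverse-)chronological order falls outside it, while any other ranking counts as algorithmic amplification. A two-sort illustration in Python, with hypothetical post fields:

    posts = [
        {"id": 1, "posted_at": "2022-02-01", "predicted_engagement": 0.9},
        {"id": 2, "posted_at": "2022-02-03", "predicted_engagement": 0.1},
    ]

    # Reverse-chronological presentation: outside the definition.
    chronological = sorted(posts, key=lambda p: p["posted_at"], reverse=True)

    # Ranking by anything else, e.g. a hypothetical engagement score,
    # is "algorithmic amplification" under paragraph (1).
    amplified = sorted(posts, key=lambda p: p["predicted_engagement"],
                       reverse=True)
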
(2) COMMISSION.—The term “Commission” means the Federal Trade Commission.
(3) CONTENT MODERATION.—The term “content moderation” means the intentional removal, labeling, or altering of user-generated content on a covered platform by the covered platform or an automated or human system controlled by the covered platform, including decreasing the algorithmic ranking of user-generated content, removing user-generated content from algorithmic recommendations, or any other action taken in accordance with the covered platform’s terms of service, community guidelines, or similar materials governing the content allowed on the covered platform.
(4) CONTENT-AGNOSTIC INTERVENTION.—The term “content-agnostic intervention” means an action that can be taken by a covered platform to alter a user’s experience on the covered platform or the user interface of the covered platform that does not—
(A) rely on the substance of user-generated content on the covered platform; or
(B) alter the core functionality of the covered platform.
(5) COVERED PLATFORM.—The term “covered platform” means any public-facing website, desktop application, or mobile application that—
(A) is operated for commercial purposes;
(B) provides a forum for user-generated content;
(C) is constructed such that the core functionality of the website or application is to facilitate interaction between users and user-generated content; and
(D) has more than 20,000,000 monthly active users in the United States for a majority of the months in the previous 12-month period.
(6) PRIVACY-PRESERVING MANNER.—The term “privacy-preserving manner” means, with respect to a report made by a covered platform, that the information contained in the report is presented in a manner in which it is not reasonably capable of being used, either on its own or in combination with other readily accessible information, to uniquely identify an individual.
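
This definition is close to the re-identification standard in the privacy literature, and a k-anonymity test is one conservative proxy for it. A sketch follows; the threshold k=5 and the choice of quasi-identifying columns are assumptions, not values the bill supplies.

    from collections import Counter

    def passes_k_anonymity(rows, quasi_identifiers, k=5):
        """Return True if every combination of quasi-identifying
        values is shared by at least k rows, so no single row can
        be used, alone or joined with outside data, to single out
        one individual.

        `rows` is a hypothetical list of dicts; `quasi_identifiers`
        names the columns an adversary could plausibly join on.
        """
        counts = Counter(
            tuple(row[column] for column in quasi_identifiers)
            for row in rows
        )
        return all(count >= k for count in counts.values())
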
(7) USER.—The term “user” means a person that uses a covered platform, regardless of whether that person has an account or is otherwise registered with the platform.
(8) USER-GENERATED CONTENT.—The term “user-generated content” means any content, including text, images, audio, video, or other creative data that is substantially created, developed, or published on a covered platform by any user of such covered platform.
