Social Media NUDGE - Klobuchar-Lummis - MUR22061
S. ll
117TH CONGRESS
2D SESSION
A BILL
To require the Federal Trade Commission to identify content-agnostic platform interventions to reduce the harm of algorithmic amplification and social media addiction on covered platforms, and for other purposes.
(1) Social media platforms can have significant impacts on their users, both positive and negative. However, social media usage can be associated with detrimental outcomes, including on a user’s mental and physical health. Design decisions made by social media platforms, such as decisions affecting the content a user might see on a social media platform, may drive or exacerbate these negative or detrimental outcomes.
(2) Viral harmful content often spreads on social media platforms. Social media platforms do not consistently enforce their terms of service and content policies, leading to supposedly prohibited content often being shown to users and amplified by such platforms.
(3) Social media platforms often rely heavily on automated measures for content detection and moderation. These social media platforms may rely on such automated measures due to the large quantity of user-generated content on their platforms. However, evidence suggests that even state-of-the-art automated content moderation systems currently do not fully address the harmful content on social media platforms.
MUR22061 W95 S.L.C.
(4) Significant research has found that content-agnostic interventions, if made by social media platforms, may help significantly mitigate these issues. These interventions could be readily implemented by social media platforms to provide safer user experiences. Such interventions include the following:
(A) Nudges to users and increased platform viewing options, such as screen time alerts and grayscale phone settings, which may reduce addictive platform usage patterns and improve user experiences online.
(B) Labels and alerts that require a user to read or review user-generated content before sharing such content.
(C) Prompts to users, which may help users identify manipulative and microtargeted advertisements.
(D) Other research-supported content-agnostic interventions.
(5) Evidence suggests that increased adoption of content-agnostic interventions would lead to improved outcomes of social media usage. However, social media platforms may be hesitant to independently implement content-agnostic interventions that will reduce negative outcomes associated with social media use.
SEC. 3. STUDY ON CONTENT-AGNOSTIC INTERVENTIONS.
voluntarily submitted to the Academies by the covered platform (through a process, established by the Academies, with appropriate privacy safeguards);
(2) identify research-based content-agnostic interventions, such as reasonable limits on account creation and content sharing, to combat problematic smartphone use and other negative mental or physical health impacts related to social media, including through a review of the materials described in subparagraphs (A) and (B) of paragraph (1);
(3) provide recommendations on how covered platforms may be separated into groups of similar platforms for the purpose of implementing content-agnostic interventions, taking into consideration factors including any similarity among the covered platforms with respect to—
(A) the number of monthly active users of the covered platform and the growth rate of such number;
(B) how user-generated content is created, shared, amplified, and interacted with on the covered platform;
(C) how the covered platform generates revenue; and
(D) other relevant factors for providing recommendations on how covered platforms may be separated into groups of similar platforms;
(4) for each group of covered platforms recommended under paragraph (3), provide recommendations on which of the content-agnostic interventions identified in paragraph (2) are generally applicable to the covered platforms in such group;
(5) for each group of covered platforms recommended under paragraph (3), provide recommendations on how the covered platforms in such group could generally implement each of the content-agnostic interventions identified for such group under paragraph (4) in a way that does not alter the core functionality of the covered platforms, considering—
(A) whether the content-agnostic intervention should be offered as an optional setting or feature that users of a covered platform could manually turn on or off with appropriate default settings to reduce the harms of algorithmic amplification and social media addiction on the covered platform without altering the core functionality of the covered platform; and
(B) other means by which the content-agnostic intervention may be implemented and any associated impact on the experiences of users of the covered platform and the core functionality of the covered platform;
(6) for each group of covered platforms recommended under paragraph (3), define metrics generally applicable to the covered platforms in such group to measure and publicly report in a privacy-preserving manner the impact of any content-agnostic intervention adopted by the covered platform; and
(7) identify data and research questions necessary to further understand the negative mental or physical health impacts related to social media, including harms related to algorithmic amplification and social media addiction.
(b) REQUIREMENT TO SUBMIT ADDITIONAL RESEARCH.—If a covered platform voluntarily submits internal research to the Academies under subsection (a)(1)(B), the covered platform shall, upon the request of the Academies and not later than 60 days after receiving such a request, submit to the Academies any other research in the platform’s possession that is closely related to such voluntarily submitted research.
(c) REPORTS.—
(1) INITIAL STUDY REPORT.—Not later than 1 year after the date of enactment of this Act, the Academies shall submit to the Director, Congress, and the Commission a report containing the results of the initial study conducted under subsection (a), including recommendations for how the Commission should establish rules for covered platforms related to content-agnostic interventions as described in paragraphs (1) through (5) of subsection (a).
(2) UPDATES.—Not later than 2 years after the effective date of the regulations promulgated under section 4, and every 2 years thereafter during the 10-year period beginning on such date, the Academies shall submit to the Director, Congress, and the Commission a report containing the results of the ongoing studies conducted under subsection (a). Each such report shall—
(A) include analysis and updates to earlier studies conducted, and recommendations made, under such subsection;
(B) be based on—
(i) new academic research, reports, and other relevant materials related to the subject of previous studies, including additional research identifying new content-agnostic interventions;
(ii) new academic research, reports, and other relevant materials about harms occurring on covered platforms that are not being addressed by the content-agnostic interventions being implemented by covered platforms as a result of the regulations promulgated under section 4;
(C) include information about the implementation of the content-agnostic interventions by covered platforms and the impact of the implementation of the content-agnostic interventions; and
(D) include an analysis of any entities that have newly met the criteria to be considered a covered platform under this Act since the last report submitted under this subsection.
SEC. 4. IMPLEMENTATION OF CONTENT-AGNOSTIC INTERVENTIONS.
(1) IN GENERAL.—Not later than 60 days after the receipt of the initial study report under section 3(c)(1), the Commission shall initiate a rulemaking proceeding for the purpose of promulgating regulations in accordance with section 553 of title 5, United States Code—
(A) to determine how covered platforms should be grouped together;
(B) to determine which content-agnostic interventions identified in such report shall be applicable to each group of covered platforms identified in the report; and
(C) to require each covered platform to implement and measure the impact of such content-agnostic interventions in accordance with subsection (b).
(2) CONSIDERATIONS.—In the rulemaking proceeding described in paragraph (1), the Commission—
(A) shall consider the report under section 3(c)(1) and its recommendations; and
(B) shall not promulgate regulations requiring any covered platform to implement a content-agnostic intervention that is not discussed in such report.
(3) NOTIFICATION TO COVERED PLATFORMS.—
of the covered platform, or would pose a material privacy or security risk to consumer data stored, held, used, processed, or otherwise possessed by such covered platform, the covered platform shall include in its plan evidence supporting these beliefs in accordance with paragraph (2).
(ii) COMMISSION DETERMINATION.—
(I) IN GENERAL.—Subject to subclause (II), if the Commission determines under clause (ii) that a covered platform’s plan does not satisfy the requirements of this subsection, not later than 90 days after the issuance of such determination, the covered platform shall—
(aa) appeal the determination by the Commission to the United States Court of Appeals for the Federal Circuit; or
(bb) submit to the Commission a revised plan for a Commission determination pursuant to clause (ii).
(II) LIMITATION.—If a covered platform submits 3 revised plans to the Commission for a determination pursuant to clause (ii) and the Commission determines that none of the revised plans satisfy the requirements of this subsection, the Commission may find that the platform is not acting in good faith in developing an implementation plan and may require the platform to implement, pursuant to a plan developed for the platform by the Commission, each content-agnostic intervention applicable to the platform (as determined by the Commission) in an appropriately prompt manner.
(B) STATEMENT OF COMPLIANCE.—Not
Commission required under subsection (a)(3), a covered platform seeking an exemption from any aspect of such rule may submit to the Commission—
(i) a statement identifying any specific aspect of a content-agnostic intervention applicable to such covered platform (as determined by the Commission under subsection (a)) that the covered platform reasonably believes—
(I) is not technically feasible for the covered platform to implement;
(II) will substantially change the core functionality of the covered platform; or
(III) will create a material and imminent privacy or security risk to the consumer data stored, held, used, processed, or otherwise possessed by such covered platform; and
(ii) specific evidence supporting such belief, including any relevant information regarding the core functionality of the covered platform.
(B) DETERMINATION BY THE COMMISSION.—
SEC. 5. TRANSPARENCY REPORT.
that piece of content during the month. Covered platforms shall report the following information about each piece of sampled content (with appropriate redactions to exclude the disclosure of illegal content):
(A) The text, images, audio, video, or other creative data associated with each such piece of content.
(B) The details of the account or accounts that originally posted the content; and
(C) The total number of views of each such piece of content during the month.
(3) HIGH REACH CONTENT.—Content moderation metrics broken down by language to assess the prevalence of harmful content on the covered platform, including, for each language, the 1,000 most viewed pieces of publicly visible content each month, including the following (with appropriate redactions to exclude the disclosure of illegal content):
(A) The text, images, audio, video, or other creative data associated with each such piece of content.
(B) The details of—
(i) the account that originally posted the content; and
(ii) any account whose sharing or reposting of the content accounted for more than 5 percent of the views of the content.
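As an illustrative sketch only (not bill text), the high-reach report described in paragraph (3) could be assembled roughly as follows for one language and one month. The function name, record fields, and data shapes here are editorial assumptions; the bill specifies only the 1,000-item cutoff and the 5 percent threshold.

```python
def high_reach_report(views, top_n=1000, share_threshold=0.05):
    """Sketch of the paragraph (3) report (illustrative assumptions only).

    `views` maps content_id -> list of (account_id, view_count) pairs,
    each pair recording views attributable to one account's posting or
    re-sharing of that content.
    """
    # Total monthly views per piece of content.
    totals = {cid: sum(v for _, v in pairs) for cid, pairs in views.items()}
    # The `top_n` (1,000 in the bill) most viewed pieces of content.
    top = sorted(totals, key=totals.get, reverse=True)[:top_n]
    report = []
    for cid in top:
        total = totals[cid]
        # Accounts whose sharing or re-posting accounted for more than
        # 5 percent of the views of the content (clause (B)(ii)).
        significant = [a for a, v in views[cid] if v > share_threshold * total]
        report.append({"content": cid, "views": total, "accounts": significant})
    return report
```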
(4) REMOVED AND MODERATED CONTENT.—
(B) REQUIREMENTS FOR METRICS.—The
(1) IN GENERAL.—The Commission shall enforce this Act in the same manner, by the same means, and with the same jurisdiction, powers, and duties as though all applicable terms and provisions of the Federal Trade Commission Act (15 U.S.C. 41 et seq.) were incorporated into and made a part of this Act.
(2) PRIVILEGES AND IMMUNITIES.—Any person who violates section 4 or 5 or a regulation promulgated under section 4 shall be entitled to the privileges and immunities provided in the Federal Trade Commission Act (15 U.S.C. 41 et seq.).
(3) ENFORCEMENT GUIDELINES AND UP-
SEC. 7. DEFINITIONS.
In this Act:
(1) ALGORITHMIC AMPLIFICATION.—The term ‘‘algorithmic amplification’’ means the promotion, demotion, recommendation, prioritization, or deprioritization of user-generated content on a covered platform to other users of the covered platform through a means other than presentation of content in a reverse-chronological or chronological order.
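To make the definition in paragraph (1) concrete, here is a minimal editorial sketch (not bill text; the class and score field are assumptions) contrasting the chronological presentation the definition carves out with an engagement-ranked ordering that would fall within it:

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    created_at: int      # e.g., a Unix timestamp (assumed field)
    engagement: float    # a platform's internal score (assumed field)

def reverse_chronological(posts):
    # Presentation in reverse-chronological order: outside the
    # definition of "algorithmic amplification".
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def engagement_ranked(posts):
    # Prioritization by any other means (here, an engagement score):
    # within the definition of "algorithmic amplification".
    return sorted(posts, key=lambda p: p.engagement, reverse=True)
```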
(2) COMMISSION.—The term ‘‘Commission’’ means the Federal Trade Commission.
(3) CONTENT MODERATION.—The term ‘‘content moderation’’ means the intentional removal, labeling, or altering of user-generated content on a covered platform by the covered platform or an automated or human system controlled by the covered platform, including decreasing the algorithmic ranking of user-generated content, removing user-generated content from algorithmic recommendations, or any other action taken in accordance with the covered platform’s terms of service, community guidelines, or similar materials governing the content allowed on the covered platform.
(4) CONTENT-AGNOSTIC INTERVENTION.—The
a user’s experience on the covered platform or the user interface of the covered platform that does not—
(A) rely on the substance of user-generated content on the covered platform; or
(B) alter the core functionality of the covered platform.
(5) COVERED PLATFORM.—The term ‘‘covered platform’’ means any public-facing website, desktop application, or mobile application that—
(A) is operated for commercial purposes;
(B) provides a forum for user-generated content;
(C) is constructed such that the core functionality of the website or application is to facilitate interaction between users and user-generated content; and
(D) has more than 20,000,000 monthly active users in the United States for a majority of the months in the previous 12-month period.
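The threshold test in subparagraph (D) can be sketched as follows (an editorial illustration, not bill text; the helper name and input shape are assumptions). A majority of the previous 12 months means at least 7 of them:

```python
def meets_mau_threshold(monthly_us_users, threshold=20_000_000):
    """Sketch of the subparagraph (D) test: `monthly_us_users` is a list
    of 12 monthly active user counts for the United States over the
    previous 12-month period. Returns True if the count exceeds the
    threshold in a majority (7 or more) of those months."""
    assert len(monthly_us_users) == 12
    months_over = sum(1 for n in monthly_us_users if n > threshold)
    return months_over > 6
```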
(6) PRIVACY-PRESERVING MANNER.—The term ‘‘privacy-preserving manner’’ means, with respect to a report made by a covered platform, that the information contained in the report is presented in a manner in which it is not reasonably capable of being used, either on its own or in combination with other readily accessible information, to uniquely identify an individual.
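The bill does not prescribe a technique for meeting this definition. One common approximation, shown here purely as an editorial sketch under assumptions, is a minimum-cell-size rule that suppresses any reported count small enough to single out an individual:

```python
def suppress_small_cells(counts, min_cell_size=10):
    """Sketch of privacy-preserving reporting: replace any count below
    `min_cell_size` with None so a published figure never isolates a
    small, potentially identifiable group. The threshold of 10 is an
    assumption, not a figure from the bill."""
    return {key: (n if n >= min_cell_size else None)
            for key, n in counts.items()}
```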
(7) USER.—The term ‘‘user’’ means a person that uses a covered platform, regardless of whether that person has an account or is otherwise registered with the platform.
(8) USER-GENERATED CONTENT.—The term ‘‘user-generated content’’ means any content, including text, images, audio, video, or other creative data, that is substantially created, developed, or published on a covered platform by any user of such covered platform.