
1AC - Plan

The United States federal government should strengthen its protection of patents
by passing the Patent Eligibility Restoration Act.
1AC – Competitiveness Advantage

Advantage 1 is competitiveness
The US and China are competing for global technological leadership. However, four
Supreme Court decisions, known collectively as the “Mayo-Alice framework,” created significant
uncertainty over whether high-tech innovation could receive patents. This will destroy
US tech leadership
James Edwards, 2024 - PhD, is executive director of Conservatives for Property Rights
(@4PropertyRights) and patent policy advisor to Eagle Forum Education & Legal Defense Fund. “America
has a chance to regain the innovation lead” The Carolina Journal, 3/4,
https://www.carolinajournal.com/opinion/america-has-a-chance-to-regain-the-innovation-lead/ //DH

Like it or not, the United States and China are locked in high-stakes global competition over trade,
technology, and national security. Ironically, at this crucial time, China appears to have found an
unwitting ally in the US Supreme Court.
The challenge couldn’t be more serious. But if Congress can move with bipartisan resolve, a legislative
solution is at hand. Sen. Thom Tillis of North Carolina has a plan for that.
First, the context. The Patent Act, under Section 101, spells out the categories of inventions eligible for
patenting; however, a series of Supreme Court decisions between 2010 and 2014 changed the test for
what could be patented. It expanded the list of what would be ineligible, chiefly abstract ideas,
mathematical formulas, and products of nature. The tests involve highly subjective decisions for which
the Supreme Court has provided no further guidance.
Together, the Supreme Court rulings have led to inconsistent case decisions. The resulting uncertainty
has acted like kryptonite in innovation and investment communities. Patent rights across nearly all
technologies have become much less reliable. Worse, inventors’ ability to obtain patents in sectors of
crucial importance to the United States — including computer software, AI, and life sciences — has been
virtually wiped out.
Not only are foreign governments well aware of America’s self-inflicted restrictions on patent eligibility,
but they have seized upon the judicially created opportunity to their own competitive advantage. Many
of the inventions the United States deemed ineligible are being patented in China and Europe.
Between 2014 and 2019, more than 17,000 US patent applications were rejected based on ineligibility,
according to an analysis by George Mason University Law Professor Adam Mossoff. He found that 1,694
of those applications were subsequently granted patents in China or the EU.
China currently leads in 37 of 44 critical and emerging technologies, according to a study funded by the
US State Department. There is no shortage of know-how or inventors in the United States, yet it’s
stunning to think that the United States is ceding the global technology race because of dubious court
decisions.
It doesn’t have to be that way. Current and retired judges on the US Court of Appeals for the Federal Circuit have expressed deep concerns over
the patent eligibility chaos, as have the Solicitor General and Patent & Trademark Office directors. And in our deeply divided Congress, there’s a
broad, bipartisan call for reforms to reestablish America’s global primacy in innovation and technology.
Tillis understands that if the United States is to beat China in emerging technologies — such as artificial intelligence,
advanced computing, biotechnology, medical diagnostics, and 5G — we must make sure we aren’t depriving our top
inventors of the free-market incentives and rewards that drive our competitive advantage.
That’s why he introduced S.2140, the Patent Eligibility Restoration Act, with Senate Judiciary IP
Subcommittee Chairman Chris Coons of Delaware. Tillis’s bipartisan bill would resolve confusion by
retaining Section 101’s existing statutory categories for patent-eligible subject matter (i.e., process,
machine, manufacture, and composition of matter) and by replacing the ambiguous judicially created
exceptions with more clearly defined exceptions.
Tillis’ PERA legislation provides specific exceptions to eligible subject matter and ensures that they will be the only exceptions. These exceptions
include pure mathematical formulas, certain economic or social processes, processes that can be performed solely in the human mind,
processes that occur in nature independent of human activity, unmodified genes, and unmodified natural material.
Tillis’s bill also clarifies the narrow conditions under which otherwise unpatentable processes, genes, and materials may be patentable. For
example, under PERA, a process that cannot be practically performed without the use of a machine or computer may be patentable. The bill
also clarifies that human genes and natural materials that are “isolated, purified, enriched, or otherwise altered by human activity” or
“employed in a useful invention or discovery” are patent-eligible.
PERA would simplify eligibility determinations by requiring patent claims be read as a whole and
prohibiting the consideration of other patentability factors (e.g., novelty and nonobviousness), ensuring
Section 101 focuses solely on subject-matter eligibility. Under current law, patent examiners and courts
determining whether a claimed invention is eligible for a patent under Section 101 must consider vague
factors, including whether portions of a claim include elements that are “conventional” or “routine.”
Confusing, constricted and unclear patent eligibility court rulings have stymied the nation’s innovative
progress. Tillis’s PERA bill lays down clear, consistent rules about what subject matter is eligible. Adding
more certainty to the process ensures greater confidence in IP investments, which will grow America’s
economic pie and raise the standard of living in our country. At a time of emerging challenges to US
national security, Congress should make passage of PERA an urgent priority.

The Mayo-Alice framework generates investment-killing uncertainty about patent
protection and drives investment overseas
Adam Mossoff, 24 - Professor of Law, Antonin Scalia Law School George Mason University. Answers from
Adam Mossoff to Questions for the Record from Senator Thom Tillis, Senate Committee on the Judiciary,
Subcommittee on Intellectual Property, United States Senate “The Patent Eligibility Restoration Act – Restoring
Clarity, Certainty, and Predictability to the U.S. Patent System” 1/24,
https://www.judiciary.senate.gov/imo/media/doc/2024-01-23_-_qfr_responses_-_mossoff.pdf //DH
italics in original

Beyond the blithe acceptance of a 90% invalidation rate for patent claims as representative of normal
operating conditions in courts, there are several problems with the article by Professors Datzov and
Rantanen. First, Professors Datzov and Rantanen conflate legal unpredictability and commercial
unpredictability. As I have explained in numerous amicus briefs in patent eligibility cases over the past decade, the Mayo-Alice inquiry is
indeterminate insofar as courts are using it to invalidate patents on inventions that have long been upheld by courts as patent eligible before
the Mayo-Alice inquiry was created by the Supreme Court. I have also acknowledged in these amicus briefs that
the Mayo-Alice inquiry is predictable in a more narrow sense: courts almost always invalidate patents as ineligible when they
apply this legal framework. This creates unpredictability for innovators in the biopharma and high-tech
sectors who rely on the patent system to make long-term investments costing billions of dollars.25
Innovators cannot reasonably predict that they will receive reliable and effective property rights to
secure a return on their investments in creating and commercializing new innovations; in fact, their only
safe prediction is that there is a 90% chance their patent will be invalidated by a court. Thus, a
reasonable businessperson concludes not to make the investment in the new technology or drug, or to
shift the investment overseas to a country in which they can be certain they will receive legal
protection for the fruits of their inventive labors.
This is what one scholar has identified as “investment-killing uncertainty,” which he rightly recognizes
as the primary form of uncertainty in a legal system that “creates a problem that must be addressed by
policy makers and suggests appropriate action” in enacting reform.26 Investment-killing uncertainty,
which can be the result of a legal doctrine being applied consistently, is the unpredictability and
indeterminacy that lawyers and economists speak about when they speak of the function of property
rights in incentivizing investments and commercialization activities (as I explained above in my answer to Question
3(a)). Datzov and Rantanen agree with judges and others calling for reform when they find a 90% invalidation rate in the Federal Circuit’s
application of the Mayo-Alice inquiry, and this agreement confirms there is an unbalanced, anti-patent bias inherent in the Mayo-Alice inquiry
that must be reformed, just as nonobviousness doctrine required reform in the 1952 Patent Act given the high rates of invalidation of patents
under the Supreme Court’s “flash of creative genius” test.
Datzov and Rantanen’s questionable decision to blithely accept massively lopsided decisions lacking dissents as evidence of investment-
spurring predictability is further confirmed by their review of the Patent Trial and Appeal Board (PTAB) decisions. Datzov and Rantanen’s review
of the Federal Circuit’s affirmance rate of the PTAB’s § 101 decisions is especially relevant given the PREVAIL Act, and the STRONGER Patents
Act before this, introduced by members of this committee. The PREVAIL Act imposes important procedural and substantive reforms on an
agency tribunal that many lawyers, judges, and stakeholders in the innovation economy have repeatedly identified as institutionally and legally
unbalanced, and which the Federal Circuit has not reined in. Yet, Datzov and Rantanen find nothing wrong with a 100% affirmance by the
Federal Circuit of PTAB decisions invalidating patents under § 101 in nine years over a span of ten years (2012-2022)—there was only one year
the rate dropped to 93.3% (2020). Datzov and Rantanen conclude that “PTAB judges have been doing a good job of correctly and predictably
determining when the patent claims are ineligible.”27 Given the extensive, necessary reforms of the PTAB in the PREVAIL Act, this is a surprising
claim that further highlights the problematic nature of the study by these two academics.
There is much else that can be said about this study beyond its fundamental and improper equivocation
between different senses of predictability in the patent eligibility debates. For reform advocates
supporting PERA, there is massive commercial and investment unpredictability created by a decade of
decisions applying the Mayo-Alice inquiry to invalidate patents at rates of 90% or more. (Whether these
inventions or discoveries would satisfy the patentability requirements for novelty, nonobviousness, and disclosure is a separate question, and
one that the USPTO and a court would assess if they moved past the threshold legal inquiry of whether an invention is patent eligible.) For
Datzov and Rantanen, a doctrine producing 90% patent invalidation rates—and 100% affirmance rates of § 101 invalidation decisions by the
PTAB—are acceptable because they reflect legal predictability. But this is not predictable protection of property rights
that secures investments and promotes the creation of new commercial business models in the
innovation economy that drive economic growth.

*Tech innovation drives national power and patents lock in tech leadership. Chinese
leadership ends the liberal international order
Brad Glosserman, 23 – deputy director of and visiting professor at the Tama University Center for Rule
Making Strategies in Japan, and a senior advisor at Pacific Forum International, a Honolulu-based think tank “De-
Risking Is Not Enough: Tech Denial Toward China Is Needed” The Washington Quarterly, Volume 46, 2023 - Issue 4,
published 12/19, online: Taylor and Francis database, accessed via University of Michigan //DH

The world is on the brink of yet another industrial revolution, one that will be shaped by new and
emerging technologies such as artificial intelligence, biotechnology, quantum computing, and other
technologies which, analysts from the McKinsey Global Institute summarize, will “affect billions of
consumers, hundreds of millions of workers, and trillions of dollars of economic activity across
industries.” 5 That transition—a revolution in many ways—will be the fulcrum upon which global power
and the world order will balance. As the National Intelligence Council summarized in its last Global Trends Report, “some
technological areas appear to offer the potential for transformative change . . . advances in these areas will combine with other technologies,
such as energy storage, to shape societies, economies and perhaps even the nature of power.” 6
Most simply, there are economic advantages that flow from leading the way. Leadership in those new
and emerging technology sectors will yield revenues which facilitate a virtuous cycle of investment,
research and development, securing dominance in this and future generations of technology. Leading
promotes the creation of international standards which lock in that principal position; patent holders
can generate substantial revenues from licensing—Qualcomm generated about €5.2 billion from
licensing in 2017, more than 20 percent of its profit—and those royalty payments constitute a de facto
tax on second-place competitors, providing an immediate advantage in subsequent research into next-
generation technology. Standards also create dependencies, as they encourage the use of related
technology throughout a network even as new generations of equipment develop. An integral part of
China’s Belt and Road Initiative (BRI) is the proliferation of and reliance upon Chinese standards to
encourage integration with and dependence on Chinese tech providers such as Huawei, the
telecommunications giant. Tech supremacy confers legitimacy on that country’s innovation model,
reinforcing its soft power.
The green technology industry provides insight into how this future could look. There is consensus that China is already pacing development of
these vital technologies, holding “a commanding lead” in manufacturing most low-carbon technologies. China is responsible for the production
of about 90 percent of the world’s rare earth elements critical to this transition. It is also responsible for at least 80 percent of all stages in
making solar panels, and 60 percent of production of wind turbines and electric-vehicle (EV) batteries—and increasing capacity in both areas. In
some niche components, its share is even higher. Economists applaud China’s efforts to lower costs and speed the green transition, but the
massive subsidies afforded China’s solar panel producers have led European competitors to warn that they are being pushed to the brink of
bankruptcy by these unfair practices.7
This “cornering of the clean tech supply chain” has been compared to Saudi Arabia’s power over the oil market and the geopolitical influence it created in the 20th century. 8
If China’s dominance continues and even extends into other new technologies,
Beijing would accelerate its own growth and position in the global economy today, while also taking the
lion’s share of revenues generated in multi-trillion-dollar industries and laying the foundations for future
generations of China-created green tech. This would then endow China with enormous soft power as
the leader of the energy transition critical to the planet’s survival.
What makes some new and emerging technologies different is that they not only generate wealth and prosperity, thereby validating the
companies, countries and social systems that create them. That would be powerful enough. But some, such as AI and quantum
computing, also provide insight into and potential control over the processes by which future decisions
—no matter how distant—are made. Their effect is not just temporal, but enduring potentially for
generations. Because they are capable of tipping the balance of power in a variety of ways, pre-
eminence in these new technologies will determine who makes the rules and how the world works.
Failure by the United States and its allies to lead in this competition will undercut their ability to
construct or maintain a global order that favors their values and interests. That is what makes these technologies
different.
Russian President Vladimir Putin gets it. As early as 2017, he argued that “Artificial intelligence is the future, not only for Russia, but for all
humankind. … Whoever becomes the leader in this sphere will become the ruler of the world.” 9
China’s supreme leader Xi Jinping gets it. Xi has said that new technologies—artificial intelligence, big
data, quantum information and biotechnology—will trigger “earth-shaking changes” that will give China
an “important opportunity to promote leapfrog development,” and overtake competitors. 10 According
to China technology scholar Tai Ming Cheung, Xi’s mindset is “a Hobbesian backdrop of a life or death
struggle for the economic and strategic renaissance of China … an intensive zero-sum technological
revolution … to effectively compete for the global commanding heights.” 11
China scholar Rush Doshi, now serving as deputy senior director for China and Taiwan at the National
Security Council, has warned that “Beijing believes that the competition over technology is about more than whose
companies will dominate particular markets. It is also about which country will be best positioned to lead the world.” While party officials are
reticent to speak bluntly, Doshi pointed to “commentaries and think tank pieces [that] seem to suggest that surpassing the United
States in high technology would end its era of global leadership, and presumably, usher in one of
Chinese leadership.” 12
Consider one sobering scenario for 2033: a world in which China has surpassed the United States as the
leading tech power and is typically first to announce breakthrough scientific discoveries and turn them
into technologies. 13 Shenzhen has eclipsed Silicon Valley as the world’s leading source of innovation.
China has closed the defense gap and can field weaponry as good as—if not better than—that of the US.
Its technology is preferred across much of the developing world and has been eagerly adopted by
autocrats and authoritarians who use it to impose China’s political model. The spread of “smart cities” that rely on
its data, algorithms and technology provide Beijing with the ability to manipulate even mundane decisions on the platforms China provides.
Imagine the mischief that can be done with—or the intelligence that could be gleaned from—control over computer systems that process visas,
for example.
Moreover, the insights afforded by access to all those systems accelerates the development of artificial intelligence in China and extends its
reach even further. Domination of China’s home market, combined with unfair trading practices abroad, ensures that Chinese
companies maintain a competitive advantage over other businesses. This lead facilitates the spread of
Chinese-supported international standards across multiple types of technology, favoring the power of
the state over individual freedom globally and providing a technological underpinning to an increasingly
illiberal international order.

Russia and China are revisionist powers. The mere perception of displacing US tech
leadership could cause them to attack, risking nuclear war
Matthew Kroenig, 2021 – professor of government and foreign service at Georgetown University and
the director of the Scowcroft Strategy Initiative at the Atlantic Council “Will Emerging Technology Cause
Nuclear War?: Bringing Geopolitics Back In” Strategic Studies Quarterly, Winter,
https://www.airuniversity.af.edu/Portals/10/SSQ/documents/Volume-15_Issue-4/D-Kroenig.pdf //DH
4IR technologies = 4th Industrial Revolution

New Tech Arms Race


Many analysts believe the emerging technology of the 4IR could profoundly affect military capabilities and operational concepts.35 New
technology has had revolutionary effects on warfare and international politics throughout history from the Bronze Age to the gunpowder and
nuclear revolutions.36
New technologies with direct military application are in development, including AI, quantum information technology, hypersonic missiles,
directed energy, additive manufacturing, and biotechnology. How exactly these technologies will affect the future of warfare is still uncertain.
The National Defense Strategy Commission report charges that the United States lacks clear operational concepts for combat with Russia and
China.37 Still, there is reason to believe these new technologies could have meaningful military applications but perhaps not to the advantage
of the United States and its Allies and partners. At present, Russia and especially China might transcend the United
States and its Allies and partners in some key 4IR technologies.
Indeed, AI could transform the future of warfare, including through the development of lethal autonomous systems.38 These
“killer robots” may lower the threshold of conflict by allowing political leaders to take a country to war without risking the lives of human
soldiers. When produced in large numbers, these drones could operate in swarms that overwhelm enemy military platforms and targets.39
Artificial intelligence could also be employed to rapidly sort through vast quantities of data, improving intelligence, surveillance, and
reconnaissance and making it easier to track and target enemy forces. The United States retains important advantages in AI, including through
its world-leading university system. But China, with its large population and surveillance tactics, has access to more data to train its AI
algorithms.40 Beijing is also less constrained by ethical and moral concerns and has the lead in some applications of AI, including facial-
recognition technology.
Quantum computing promises information advantages including the ability to have secure, encrypted
communications and to decode enemy communications. In its 2021 Military Balance report, the International Institute for
Strategic Studies states, “the integration of quantum technologies currently represents one of the most anticipated advances for armed
forces. . . . There is little doubt that they will have disruptive effect when they are employed at scale.”41 China may have the edge in this area,
as it was the first country to conduct a successful test of a quantum satellite.42
Space and cyber are increasingly important military domains. Space-based weapons, sensors, defensive interceptors, and the diffusion of
counterspace capabilities will make space an increasingly contested military environment.43 The United States is relatively more dependent on
space-based assets and computers than its rivals, and the US Department of Defense warns Russia and China will likely employ
cyber and counterspace attacks in the early stage of any conflict with the United States in a bid to
disrupt US command, control, communications, computers, intelligence, surveillance, and reconnaissance (C4ISR).44
Hypersonic missiles, maneuverable and able to travel at over five times the speed of sound, could allow states to conduct low- or no-warning
attacks and to evade missile defenses.45 These weapons could also execute large-scale, nonnuclear strategic attacks, the rate of speed
compressing the decision-making time leaders have to respond to such attacks. Although the United States developed the initial concepts for
these weapons, Russia and China have prioritized their production, testing, and deployment. China has conducted more hypersonic tests than
any other nation, and Moscow and Beijing have deployed hypersonic weapons.46
Many other emerging technologies have military applications. Directed-energy microwaves and lasers could allow states to develop more
effective integrated air and missile defense systems or to degrade an enemy’s command and control.47 Additive manufacturing could greatly
reduce the cost of producing component parts of military platforms and creates the potential for large and rapid quantitative increases in
weapons systems, from drones and tanks to submarines and nuclear weapons.48
Biotechnology could be exploited to produce “super soldiers.” China has genetically engineered beagles with three
times the muscle mass of a typical canine, a technology that could possibly be applied to humans.49 Exoskeletons could provide soldiers with
superhuman strength, and brain implants promise superior cognitive performance. China employed exoskeletons in combat in its 2020 border
conflict with India.50
It is not yet clear how these new technologies, when combined with novel operational concepts, will affect the future of warfare, but it is likely
they will. A future state may, for example, be able to use additive manufacturing to produce masses of inexpensive drones directed by new AI
algorithms to swarm and overwhelm adversaries.51 The attack might be preceded by cyber and counterspace attacks that blind an adversary
and disrupt its command and control.
Following a successful advance, the country could then employ directed-energy weapons, autonomous mines, and other advanced defenses to
lock in territorial gains and thwart enemy attempts to roll back its aggression. It is possible that the first state to hone these technologies and
devise effective operational concepts will have a military edge over its opponents.
Novel Applications
How will states use such a newfound advantage? Technology rarely fundamentally changes the nature or objectives of states. More often,
states use technology to advance preexisting geopolitical aims. Moreover, enhanced power can result in
greater ambition. Given the geopolitical landscape described, it is likely the United States and its Allies
and partners at the core of the international system will behave differently with new military
technologies than will revisionist powers, such as Russia and China.
The spread of new technology to the United States and its Allies and partners would likely serve, on
balance, to reinforce the existing sources of stability in the prevailing international system. At the end of
the Cold War, the United States and its Allies and partners achieved a technological-military advantage over its great power rivals,
with the US using its unipolar position to deepen and expand a rules-based system. They also employed their military dominance to counter
perceived threats from rogue states and terrorist networks. The United States, its Allies, and partners did not, however, engage in military
aggression against great power, nuclear- armed rivals or their allies.
In the future, these status quo powers are apt to use military advantages to reinforce their position in
the international system and to deter attacks against Allies and partners in Europe and the Indo- Pacific. These states might
also employ military power to deal with threats posed by terrorist networks or by regional revisionist powers such as Iran and North Korea. But
it is extremely difficult to imagine scenarios in which Washington or its Allies or partners would use newfound military advantages provided by
emerging technology to conduct an armed attack against Russia or China.
Similarly, Moscow and Beijing would likely use any newfound military strength to advance their
preexisting geopolitical aims. Given their very different positions in the international system, however,
these states are likely to employ new military technologies in ways that are destabilizing. These states
have made clear their dissatisfaction with the existing international system and their desire to revise it.
Both countries have ongoing border disputes with multiple neighboring countries.
If Moscow developed new military technologies and operational concepts that shifted the balance of power in its favor, it would likely use this
advantage to pursue revisionist aims. If Moscow acquired a newfound ability to more easily invade and occupy territory in Eastern Europe, for
example (or if Putin believed Russia had such a capability), it is more likely Russia would be tempted to engage in aggression.
Likewise, if China acquired an enhanced ability through new technology to invade and occupy Taiwan or
contested islands in the East or South China Seas, Beijing’s leaders might also find this opportunity
tempting. If new technology enhances either power’s anti-access, area-denial network, then its leaders
may be more confident in their ability to achieve a fait accompli attack against a neighbor and then
block a US- led liberation.
These are precisely the types of shifts in the balance of power that can lead to war. As mentioned previously,
the predominant scholarly theory on the causes of war—the bargaining model—maintains that imperfect information on the balance of power
and the balance of resolve and credible commitment problems result in international conflict.52 New technology can exacerbate
these causal mechanisms by increasing uncertainty about, or causing rapid shifts in, the balance of
power. Indeed as noted above, new military technology and the development of new operational concepts have shifted the balance of power
and resulted in military conflict throughout history.
Some may argue emerging military technology is more likely to result in a new tech arms race than in
conflict. This is possible. But Moscow and Beijing may come to believe (correctly or not) that new
technology provides them a usable military advantage over the United States and its Allies and partners.
In so doing, they may underestimate Washington.
If Moscow or Beijing attacked a vulnerable US Ally or partner in their near abroad, therefore, there
would be a risk of major war with the potential for nuclear escalation. The United States has formal treaty
commitments with several frontline states as well as an ambiguous defense obligation to Taiwan. If Russia or China were to attack these states,
it is likely, or at least possible, that the United States would come to the defense of the victims. While many question the wisdom or credibility
of America’s global commitments, it would be difficult for the United States to simply back down. Abandoning a
treaty ally could cause fears that America’s global commitments would unravel. Any US president, therefore,
would feel great pressure to come to an Ally’s defense and expel Russian or Chinese forces.
Once the United States and Russia or China are at war, there would be a risk of nuclear escalation. As
noted previously, experts assess the greatest risk of nuclear war today does not come from a bolt-out-
of-the-blue strike but from nuclear escalation in a regional, conventional conflict.53 Russian leaders
may believe it is in their interest to use nuclear weapons early in a conflict with the United States and NATO.54 Russia
possesses a large and diverse arsenal, including thousands of nonstrategic nuclear weapons, to support this nuclear strategy.
In the 2018 Nuclear Posture Review, Washington indicates it could retaliate against any Russian nuclear “de-escalation” strikes with limited
nuclear strikes of its own using low-yield nuclear weapons.55 The purpose of US strategy is to deter Russian strikes. If deterrence fails,
however, there is a clear pathway to nuclear war between the United States and Russia. As Henry Kissinger pointed out decades ago, there is
no guarantee that, once begun, a limited nuclear war stays limited.56
There are similar risks of nuclear escalation in the event of a US-China conflict. China has traditionally possessed a
relaxed nuclear posture with a small “lean and effective” deterrent and a formal “no first use” policy. But China is relying more on its strategic
forces. It is projected to double—if not triple or quadruple—the size of its nuclear arsenal in the coming decade.57
Chinese experts have acknowledged there is a narrow range of contingencies in which China might use nuclear weapons first.58 As in the case
of Russia, the US Nuclear Posture Review recognizes the possibility of limited Chinese nuclear attacks and also holds out the potential of a
limited US reprisal with low-yield nuclear weapons as a deterrent.59 If the nuclear threshold is breached in a conflict between the United
States and China, the risk of nuclear exchange is real.
In short, if a coming revolution in military affairs provides a real or perceived battlefield advantage for
Russia or China, such a development raises the likelihood of armed aggression against US regional allies,
major power war, and an increased risk of nuclear escalation.
Implications
Future scholarship should incorporate geopolitical conditions and the related foreign policy goals of the states in question when theorizing the
effects of technology on international politics. Often scholars attempt to conceptualize the effects of weapons systems in isolation from the
political context in which they are embedded.
Studies treat technology as disembodied from geopolitics and as exerting independent effects on the international system. But technology does
not float freely. Technology is a tool different actors can use in different ways. Bakers and arsonists employ fire in their crafts to strikingly
different ends. In the current international environment, Russia and China would tend to employ technology toward
advancing revisionist aims. Technological advances in these countries are therefore much more likely to disrupt the prevailing
international order and nuclear strategic stability.
This approach also suggests the potential threat new technology poses to nuclear strategic stability is more pervasive than previously
understood. To undermine strategic stability, new technology need not directly impact strategic capabilities. Rather, any technology that
promises to shift the local balance of power in Eastern Europe or the Indo- Pacific has the potential to threaten nuclear strategic stability.
This understanding of the issue leads to different policy prescriptions. If the technology itself is the problem, then it must be controlled and
should not be allowed to spread to any states. In contrast, the framework outlined here suggests a different recommendation: preserve the
prevailing balance of power in Europe and Asia. Technological change that, on balance, reinforces the prevailing international system should
strengthen stability.
Leading democracies, therefore, should increase investments in emerging technology to maintain a
technological edge over their adversaries. Export control and nonproliferation measures should be designed to deny emerging
military technology to Russia and China. Arms control should be negotiated with the primary objective of sustaining the current international
distribution of power. Making progress in these areas will be difficult. But the consequences of failure could be shifts in the
international balance of power, conflict among great powers, and an increased risk of nuclear war.
The plan solves. Passing the Patent Eligibility Restoration Act creates certainty and
predictability by removing patent eligibility decisions from the courts.
Andrei Iancu, 24 – Partner, Sullivan & Cromwell; former Under Secretary of Commerce for Intellectual
Property and Director of the United States Patent and Trademark Office. Testimony before the
Subcommittee on Intellectual Property in the Senate Judiciary Committee, “The Patent Eligibility
Restoration Act – Restoring Clarity, Certainty, and Predictability to the U.S. Patent System” 1/23,
https://www.judiciary.senate.gov/imo/media/doc/2024-01-23_-_testimony_-_iancu.pdf //DH

As Chairman Coons and Ranking Member Tillis recognized in introducing PERA, “all 12 judges of the
United States Court of Appeals for the Federal Circuit have lamented the state of the law” when it
comes to patent eligibility under Section 101.1 The current state of the law is the result of many court
decisions over the decades trying to determine whether modern technologies—such as computer
software, DNA processing, and many others—fit into the categories for a patent defined in 1793, or
whether they are subject to certain exceptions courts have imposed since then. The patchwork of decisions over time,
struggling to keep up with fast-changing technologies, has created significant confusion and uncertainty
as to what is in and what is outside the bounds of the statute. These court decisions also have resulted
in certain de facto rules—such that diagnostic techniques, for example, are generally not eligible for a
patent in the United States—that Congress has never considered, debated, or passed into law. If the
United States Government is not to issue patents for certain categories of inventions that would
otherwise be part of the categories outlined in Section 101, then it is up to Congress to make that rule.
In other words, Congress defined the categories of patent subject matter; if there are to be exceptions
to those categories, they must likewise come from Congress.
The current state of the law has caused profound uncertainty amongst inventors, investors, and patent-
law practitioners alike. In turn, this uncertainty and confusion has hurt American innovation,
competition, and the economy. It also has threatened the Constitutional right to patent protection—a
right that James Madison and the Founders saw as vital to the economic strength and growth of our
nation.
The current state of the law has even sown confusion amongst the expert ranks of the hardworking
patent examiners at the United States Patent and Trademark Office (“USPTO”). To address this issue, the USPTO
promulgated guidelines in 2019 that synthesized the relevant caselaw and provided examiners and applicants a significantly improved
framework for analyzing eligibility under Section 101. This has dramatically improved the analysis at the USPTO. For example, a study by the
USPTO’s Chief Economist has shown that “uncertainty about determinations of patent subject matter eligibility in the first action stage of
patent examination for the relevant technologies decreased by 44% over the first year following publication of the 2019 [Revised Patent Subject
Matter Eligibility Guidance] compared to the previous year.”2
Courts, however, are independent and not bound by administrative guidelines. As a result,
Congressional action is needed to determine affirmatively which categories of inventions should be
deemed as statutorily unpatentable. Otherwise, as Justice Clarence Thomas warned, “exclusionary
principle[s]” espoused by the Judiciary about Section 101 risk “swallow[ing] all of patent law.”3
Fortunately, PERA provides the legislative vehicle for the United States to correct the state of the law. PERA expressly outlines
certain categories that are not considered to be inventions eligible for a patent, and further indicates
that courts are not to create any exceptions that are not in the statute. Once the categories—after debate and
adjustment as appropriate—are settled on and passed into law, PERA will bring immediate certainty to Section 101. And
it will prevent future uncertainty by “eliminat[ing]” “[a]ll judicial exceptions to patentability” and
returning patent-eligibility decision making to Congress and legislative debate.
1AC – Innovation Advantage

Advantage 2 is innovation
Eligibility rejections on subject matter grounds create a broad preemptive effect
against entire industries. Uncertainty shifts industry to trade secrets protection,
destroying innovation
Chad Rafetto, 2024 - Judicial Law Clerk for Chief Judge Colm F. Connolly of the United States District
Court for the District of Delaware. J.D., May 2022, New York University School of Law “Fostering
Innovation Through a Legislative Overhaul of Patentable Subject Matter”, 32 Fed. Cir. B.J. 93, Nexis Uni,
accessed via University of Michigan //DH

B. Policy Issues
Mark Twain once wrote, "[A] country without a patent office and good patent laws was just a crab, and
couldn't travel any way but sideways or backwards."172 The United States used to be the "gold
standard" for patent [*115] protection and was the world leader for promoting innovation.173 But the
U.S. patent system is becoming a crab.
The purpose of the U.S. patent system is to incentivize innovation and the disclosure of new ideas.174
But currently § 101 jurisprudence is antithetical to the purpose of our patent system because it is
stalling innovation in certain fields. Not only is this jurisprudence contrary to the purpose of patent law,
but it is also causing America's patent system to fall behind that of other countries.
1. Our System Is Not Incentivizing Innovation in All Fields
Currently, there is a perception that certain kinds of inventions are precluded from receiving a patent
because those inventions cannot pass the patentable subject matter requirement.175 This perception
creates two issues. First, without certainty that downstream products in a field will be patentable, there
is little incentive to invest in either the building blocks or the downstream products.176 So there are
inadequate incentives to undergo the costs of innovation in certain fields. Second, when a company
decides to invest in innovation, the uncertainty of whether its inventions are patentable incentivizes the
hoarding of information in the form of trade secrets as opposed to the sharing of information that
occurs in the patent system.177
Both of these issues are contrary to the core of patent law policy and are evidence that the system is not
functioning optimally. The cause of these issues is that patentable subject matter functions best as a low
bar, but the bar has been raised and now § 101 is doing exclusionary work that is best left to other
doctrines. And the effect is that patents are now being granted or invalidated inconsistently across
different fields.
a. Patentable Subject Matter Is No Longer a Coarse Filter
The idea of patentable subject matter serving as a coarse filter has been turned on its head. The coarse
filter idea once meant that patentable subject matter was a low threshold to achieve.178 Thus, all
inventions but the [*116] bulky building blocks of science (i.e., laws of nature, physical phenomena, or
abstract ideas) would pass through the filter. But now this coarse filter is being used more frequently. A
lot more frequently. Since the Roberts Court's four cases on patentable subject matter from 2010 to
2014, the number of patent validity challenges on subject matter grounds has grown twentyfold.179
Further, one study found that out of a sample of 104 patent eligibility cases before the Federal Circuit
post-Alice, the court determined that only 7.7% of those patents were valid.180 This has created a
perception that certain types of inventions cannot receive patent protection, and this perception has
stalled innovation in those fields.
Patent application rejections and patent invalidity rulings on subject matter grounds are more
problematic for innovation than rejections and rulings based on other grounds because of the scope of
the inquiry. Patentable subject matter asks whether inventions of this general kind are patent eligible;
the other doctrines of patentability ask only whether a specific claim is patentable.181 So a rejection on
subject matter grounds acts as a cudgel and sends a signal that all inventions of that "general kind" are
not patentable, whereas a rejection on one of the other grounds acts as a scalpel and relates only to the
specific claim at hand. Because patentable subject matter has been used too frequently to invalidate or
reject a patent, even though a rejection based on a different doctrine would suffice, there is now a
perception that patents are unavailable in certain fields.182
b. Technical Fields Are Being Treated Unequally Under Patent Law
The Supreme Court has refused to categorically ban certain fields of innovation from being patentable because inventions are often
unforeseeable.183 As Judge Rader put it in his dissent in In re Bilski,184 "[W]hy should some [*117] categories of invention deserve no
protection?"185 The Supreme Court retained this policy by rejecting a categorical rule about the patentability of business methods because it
would "frustrate the purposes of the patent law."186 Yet the current § 101 jurisprudence is foreclosing inventions in certain industries from the
patent system.187
The current subject matter doctrine is preventing inventions in certain fields from being patented not because the advances in these fields are
not valuable to our society, but rather because these inventions do not map well onto the subject matter requirements for patent law.188
Alice has had a particularly negative effect on software, business methods, and biotechnology
patents.189 Post-Alice, the number of patent applications in each field decreased, yet even with fewer
applications, the number of § 101 rejections increased.190 Another study looked at the invalidation rates under § 101 of
over 800 patents before the Federal Circuit and found that 65% of software or IT patents were invalidated whereas only 50% of biotechnology
or life science patents were invalidated.191 These studies exemplify just one of the discrepancies across fields.
[*118] That is not to say that all patents should be granted at an equal rate. But if no field is
categorically banned from receiving a patent, then the subject matter rejection rates should be
comparable. As explained previously, a rejection on subject matter grounds has a broader preemptive
effect, so having higher subject matter rejection rates in certain fields forecloses a broader sphere of the
field from patent protection than a rejection under any other doctrine.
One example of this inequality is in the field of diagnostics. In Athena Diagnostics, Inc. v. Mayo Collaborative Services, LLC,192 Mayo v.
Prometheus, and Association for Molecular Pathology v. Myriad, patents for diagnostic inventions were invalidated based on subject matter
grounds.193 This trio of cases has signaled to inventors that applying for a diagnostic patent is a risky business. The patent may not be granted
at all. Alternatively, even if the patent is granted, it can be invalidated when one attempts to enforce it.194 If the patent is invalidated, then the
inventor disclosed his invention to the public but reaped none of the rewards of the patent. This uncertainty incentivizes
inventors in the field of diagnostics to seek out other forms of protection or to abandon research in that
field altogether in favor of a more patent-friendly field. For example, the Cleveland Clinic hospital
system has shifted its research focus away from diagnostics because it thinks that inventions in this
space will not be patentable under § 101.195
The shift to trade secrets protection weakens follow-on innovation


Courtenay C. Brinckerhoff, 24 – registered patent attorney and have been representing chemical,
biotech, and pharmaceutical clients before the USPTO for over 30 years. Answers To Questions For The
Record from Senator Tillis, before the U.S. Senate Committee on the Judiciary Subcommittee on
Intellectual Property, “The Patent Eligibility Restoration Act”, 1/23,
www.judiciary.senate.gov/imo/media/doc/2024-01-23_-_qfr_responses_-_brinckerhoff.pdf //DH

While many U.S. companies continue to innovate in the diagnostic space, many doing so are maintaining
their discoveries as trade secrets. This will not have the same impact on driving innovation as patent
protection would, because the discoveries are not being made known to others who might improve
upon them or build new innovations on them. In his written and live testimony, Mr. Blaylock argued that
PERA would undesirably “permit the privatization of natural phenomena in the form of knowledge of
new biomarkers and their clinical relevance,” but it is the current state of Section 101 that incentivizes
innovators to keep their discoveries to themselves. Mr. Blaylock’s position forgets that patent rights
come with the cost of public disclosure, and only last for a limited time. Without PERA, innovators have
little reason to share what they learn about newly discovered disease biomarkers and personalized
medicine, and more reason to develop their technologies in-house while keeping the underlying
methodology a trade secret. It is the trade secret paradigm, not the patent system, that rewards
“privatization” of knowledge.
I am aware of an empirical study reported in 2022 in the Washington and Lee Law Review2 that
determined that, “in the four years following Mayo, investment in disease diagnostic technologies was
nearly $9.3 billion dollars lower than it would have been absent Mayo.” The study focused on venture
capital investment in the United States, in particular. While it did not consider whether money was
invested in other countries instead, it at least indicates reduced investment in specific U.S. industries
impacted by Mayo, which is of grave concern.
Trade secrets protection reduces microbial innovation necessary for environmental remediation
Courtenay C. Brinckerhoff, 24 – registered patent attorney and have been representing chemical,
biotech, and pharmaceutical clients before the USPTO for over 30 years. Written Testimony before the
U.S. Senate Committee on the Judiciary Subcommittee on Intellectual Property, “The Patent Eligibility
Restoration Act”, 1/23, https://www.judiciary.senate.gov/imo/media/doc/2024-01-23_-_testimony_-
_brinckerhoff.pdf //DH

Although the U.S. is considered a leader on the global stage, it is noteworthy that other countries have
not followed the U.S. very far down this road of patent ineligibility. Isolated and purified forms of
naturally-occurring products remain eligible for patenting everywhere else in the world, although some
countries have specific exceptions for isolated genes.15 For example, the European Patent Office permits patenting of isolated genes and gene
fragments as long as the patent’s description “indicate[s] the way in which the invention is capable of exploitation in industry.”16 Likewise,
Australia, China, Japan, and Korea (for example) continue to grant patents on isolated natural products. Most countries permit patenting of
diagnostic methods unless they are excepted on public policy grounds. For example, the European Patent Office permits patenting of diagnostic
methods as long as they are not “practised on the human or animal body,”17 and so permits patenting of diagnostic methods conducted using
saliva or blood samples (for example). Australia permits patenting of diagnostic methods without restriction (similar to the U.S. prior to Mayo),
and methods of detecting specific markers of a disease or condition in a biological sample may be patented in China, Japan, and Korea
(although methods of “diagnosing” a patient are excepted on public policy grounds).
This means that, since the changes in U.S. patent eligibility law flowing from Mayo, Myriad, and Alice,18
there are inventions that cannot be patented in the U.S. that can be patented in other countries. For
example, as outlined above, isolated natural products that may be useful as medications, diagnostic
agents, vaccines, antibiotics, or in industrial applications, can be patented around the world, except in
the U.S. This leads to an imbalance in intellectual property rights that can be obtained in the U.S. versus
elsewhere.
Regardless of where you seek to obtain a patent, you have to describe your invention in your patent
application in a level of detail sufficient for others to practice the invention. This means that inventors
who pursue patent protection have to disclose their inventions to the whole world, but cannot protect
them to the same extent in the U.S. if they are caught in the § 101 snares of Mayo, Myriad, and Alice.
This imbalance may cause innovators to think twice before pursuing a patent. If they do not expect to be
able to adequately protect their investments, they may decide to maintain the technology as a trade
secret, or shelve it altogether.
The imbalance in the “quid pro quo” of disclosure in return for patent rights19 is particularly acute for
technologies related to isolated microorganisms, such as bacteria determined to have specific properties
that make them particularly useful in commercial and industrial processes. (A few examples include
bacteria used in brewing, baking, cheese- and yogurt-making, oil- and plastic-degrading bacteria used
for environmental remediation, and carbon-fixing microbes used to address CO2 emissions.) In order to
satisfy the 35 U.S.C. § 112 “written description” requirement for obtaining a patent relating to a newly
discovered microorganism that is not readily available to the public, the patent applicant must “deposit”
a sample of the bacteria with a qualified depository—such as the American Type Culture Collection
(ATCC) in Gaithersburg, Maryland.20 The patent applicant also must assure that “all restrictions imposed
by the depositor on the availability to the public of the deposited material will be irrevocably removed
upon the granting of the patent.”21
Prior to Myriad, isolated bacteria were included in patent-eligible subject matter. Thus, even though
members of the public could obtain a sample of the bacteria once a patent granted, the public’s
freedom to use the bacteria often was limited by patent rights that covered any and all uses of the
isolated bacteria. Under Myriad, however, it is no longer possible to obtain a patent on isolated bacteria
per se. That means that a U.S. patent granted today might only cover a specific method of using the
bacteria. Nevertheless, the patent owner still must irrevocably remove “all restrictions ... on the
availability to the public of the deposited material” once the patent grants.22 This is another imbalance
that has arisen in the wake of Mayo, Myriad, and Alice that may cause innovators to hesitate before
pursuing a U.S. patent or developing technology for the U.S. market.
*Microbial innovation reduces chemical pollution


Teklit Gebregiorgis Ambaye et al, 2024 – PhD. Postdoctoral researcher, Technical University of
Denmark “Emerging technologies for the removal of pesticides from contaminated soils and their reuse
in agriculture” Chemosphere, May, Science Direct, accessed via University of Michigan //DH

During biodegradation, organic substances are completely broken down into inorganic components by
microorganisms. Microorganisms are the most abundant in nature. Their diversity, catalytic processes,
and capacity to function without oxygen make them efficient and economically viable for the
degradation of pesticides and other complex organic pollutants 93. Bacteria, fungi, and actinomycetes
are the main pesticide degraders and transformants. They can make structural modifications to the
pesticide molecule to reduce its toxicity. In addition, fungi typically biotransform insecticides and other
xenobiotics, which bacteria can further degrade and make harmless before being released into the
environment 186.
Fungi and bacteria produce several extracellular enzymes that act on various organic chemicals, making them ideal bioremediation agents.
Microbial extracellular enzymes, particularly oxidases, laccases, and manganese peroxidases, play essential roles in the degradation process.
Cytochrome P450, glutathione S-transferases (GSTs), and esterases have been reported as the primary enzyme families associated with
pesticide degradation. Several pesticide-degrading microorganisms have been identified 187. For instance, recent studies by Ambreen and
Yasmin 188 using Bacillus thuringiensis strain MB497 showed complete chlorpyrifos degradation (up to 99.9% in nine days). The results of this
study also revealed the production of a novel metabolite, di-isopropyl methane phosphonate, which was obtained through
alkylation/desulfurization reactions and then mineralized to produce carbon and phosphorus to support bacterial growth 188.
Enzyme-based pesticide degradation is a new treatment method for removing pesticides from polluted environments. Fungi and bacteria,
particularly white rot fungi, are considered extracellular enzyme-producing microorganisms for this process. Environmental effects can produce
and secrete these enzymes, which are involved in lignin degradation. Several microbial strains, including transferases,
isomerases, hydrolases, and ligases enzymes, are responsible for the biodegradation of pesticides,
making them a promising bioremediation agent.189,190. In addition, enzymes are responsible for various metabolic
reactions, such as hydrolysis, oxidation, oxygen addition, amino group oxidation, benzene ring hydroxyl group addition, dehalogenation, nitro
group reduction, sulfur replacement, side chain metabolism, and ring cleavage, all of which are catalyzed by these enzymes. The
metabolic ability of microbes to detoxify or modify pollutant molecules, which is based on accessibility
and bioavailability, drives biodegradation191.
In general, three steps are involved in pesticide metabolism. Step I involves reduction, oxidation, or hydrolysis to create a more water-soluble
and less poisonous compound. Step II involves the conjugation of a pesticide or pesticide metabolite to a sugar or amino acid, which boosts its
water solubility and lowers its toxicity. Step III involves the conversion of step II metabolites into non-toxic secondary conjugates. Bacteria and
fungi produce intracellular and extracellular enzymes 192. Although microbial biodegradation is an efficient, cost-effective, and environmentally
beneficial method, some challenges remain. One of the biggest challenges is the inability to cultivate some microbial species, which makes it
difficult to identify the environmental pollutants that cause the most damage. In addition, temperature, pH, and humidity influence the
performance of microbes in the field, and rivalry between the target microorganism and other native microorganisms may restrict their growth.
Microbial consortiums can create an efficient system for pesticide bioremediation; however, selecting a partner bacterium that works well
together is essential for constructing a strong system. Hence, to further evaluate this symbiotic association, a thorough investigation of the
molecular dynamics, metabolic transitions and shifts, and small-molecule interactions between the microorganism and the surrounding
conditions and environmental factors is required 193. Microbial catabolism is diverse and capable of breaking down a wide range of chemical
substances; filling all the knowledge gaps is challenging. However, potential metabolic pathways and toxicity endpoints can be predicted using
in silico investigations, system modeling, and molecular dynamics, as shown in Figure 11.
The development of genetic
bioaugmentation, which involves introducing donor bacteria into polluted soil to increase the capacity
of native bacteria to break down pesticides, is a successful strategy 194. A general overview of the
intricate biodegradation processes is provided by multi-OMICS techniques (genomics, transcriptome,
and metabolomics), making it feasible to identify and characterize various molecular mechanisms,
metabolic pathways, their control, and potential interconnections, as shown in Figure 11.
Currently, most research is focused on the application of molecular docking in environmental
remediation for the intermolecular investigation of microbial biodegradation of pesticides, as shown in
Figure 12, which has enabled the discovery of new genes and enzymes involved in remediation
mechanisms, bridging research gaps and expanding our understanding of the remediation of xenobiotics
192.
The impact is agriculture and ecosystem collapse, and it risks extinction


Ravi Naidu, 2021 – Global Centre for Environmental Remediation (GCER), The University of Newcastle
“Chemical pollution: A growing peril and potential catastrophic risk to humanity” Environment
International, Volume 156, November 2021, Science Direct,
https://www.sciencedirect.com/science/article/pii/S0160412021002415 //DH
6. Contamination of the food chain
6.1. Food chain pollution
According to the American Academy of Paediatrics more than 10,000 chemicals are used or find their
way into the modern food supply (Trasande et al. 2018). Food chain pollution poses direct risks to humans from ingestion of
contaminated food (Fig. 3). The risk may be passed on to the next generation as pollutants were detected in human breast milk (van den Berg
et al. 2017) and were associated with cognitive and other health disorders, or by epigenetic means (Baccarelli and Bollati 2009). The adverse
effects of pollutants on the human gut microbiome are also a warning about potential long-term impacts on immunity and metabolism (Jin et
al. 2017).
Food can be contaminated at several stages before consumption - during crop or forage or animal production and harvesting, or post-harvest
during storage, processing, transport and processing. Heavy metal(loid)s, pesticides, dioxin, PCBs, antibiotics, growth-promoting substances,
packaging residues, preservatives and excess nutrients (e.g. nitrate) have all been found to contaminate food at higher than acceptable levels
(Awata et al., 2017, EFSA, 2018, Islam et al., 2017, Licata et al., 2004). This affects vegetables, grains, fish, and livestock via soil, surface water,
groundwater or aerial deposition (Zhong et al. 2015) (Fig. 3). For example, Cd concentrations of various foodstuffs in China, including
vegetables, rice, and seafood, were as high as 0.93 mg/kg and contributed 1.007 µg/kg bodyweight to the daily intake for children, which is 1.2
times higher than the acceptable limit recommended by WHO and the FAO (Zhong et al. 2015). Dioxin and PCB-like contaminants in food are
also a concern to human health according to a report commissioned by the European Food Safety Authority (EFSA). Similarly, human exposure
to pesticides can occur from residues in food or from legacy or inadvertent contamination during production and processing. Such
contamination of food products can have chronic impacts on human health (The Guardian 2004). A recent study of pesticide
pollution at global scale reported that 64% of agricultural land was at risk of pollution caused by multiple
active ingredients of pesticides. The risk includes adverse effects on food and water quality, biodiversity
and human health (Tang et al. 2021).
Post-harvest protection of food can also result in contamination by fumigants, formalin and other insecticides and preservatives (e.g. calcium
carbide, cyanide, sodium cyclamate, urea, melamine, aflatoxin and detergents), especially when they are used incorrectly, illegally or
accidentally. Serious examples have been reported from numerous countries, including China, India, and Brazil (Handford et al. 2016). Even in
countries with well-defined and established regulatory systems, such as those of the EU, chemical contamination in food and animal feed can
occur to an extent sufficient to cause concern, due to intentional and unintentional use of post-harvest chemicals (Silano and Silano 2017).
6.2. Loss of soil productivity
Healthy soils are essential for safe, healthy food, ecosystem service delivery, climate change abatement,
abundant food and fibre production, pollutant attenuation and freshwater storage, all of which are key
to the sustainability of the world food supply. Reduced food availability and security in less-developed
countries can occur when productive land is lost due to chemical contamination (Fig. 3). In the last 40
years nearly one-third of the Earth’s total arable land has been lost to soil erosion, desertification, urban
expansion, and contamination (Cameron et al. 2015). Soils contaminated with heavy metals and
pesticides cause loss of productive agricultural land and compromise food production and quality (Fig. 3).
There is no global estimate of the areal losses of arable land attributed to chemical pollution, but regional reports indicate significant loss or
potential loss. For example, in Europe, 137,000 km2 of agricultural lands are at risk of being abandoned due to heavy metal(loid)s pollution
(Tóth et al. 2016). This situation is exacerbated in developing countries by inadequate waste treatment and uncontrolled exploitation of natural
resources (Lu et al., 2015, Tóth et al., 2016). China lost 0.13% of its total arable land due to chromium (Cr) pollution during 2005–2013 and 1.3%
remains at risk (Lu et al., 2015, Tóth et al., 2016). Yet, key policy instruments and initiatives for sustainable development rarely recognise that
contaminated soils compromise food and water security.
6.3. Biodiversity loss and damage to crops and livestock
Biodiversity in the Earth’s surface layer from bedrock to the vegetation canopy provides the primary source of services for the support of life on
Earth (Banwart et al. 2019; Cardinale et al. 2012). The acute and chronic impact of excessive current and historical use
of agrichemicals
and other industrial pollutants is contributing to a substantial loss of Earth’s biodiversity. The global loss
of honeybee communities due to neonicotinoid pesticides has caused an international crisis for crop
pollination (Dave 2013), for example. There are reports of pesticide pollutants causing the loss of more
than 40% of the total taxonomic pools of stream invertebrates in some regions (Beketov et al. 2013).
Residues of more persistent chemicals, including many pesticides, may have long-term ecological
impacts, especially in highly contaminated areas (Gevao et al. 2000) with significant threats of pollution
of groundwater and marine water (Arias-Estévez et al., 2008, Jamieson et al., 2017). Losses of up to 78%
of insect species have been reported from 290 sites in Germany (Seibold et al. 2019). Such ecological
impacts and their persistence may profoundly alter biological processes such as decomposition and soil
formation in natural environments, leading to unfavourable or challenging settings for human food production.
Reactive nitrogen pollution of the atmosphere and its deposition are responsible for declining biodiversity at regional (Hernández et al. 2016)
and global scales (Condé et al. 2015). For example, assessing more than 15,000 sites, including forest, shrubland, woodland and grassland in the
USA, Simkin et al. (2016) found that 24% of the sites had losses of vulnerable species as a result of atmospheric nitrogen deposition, in
particular when the deposition was above 8.7 kg N/ha/yr. A similar study in the UK also revealed that species richness had declined with
increases of nitrogen deposition in the range of 5.9 to 32.4 kg N/ha/yr (Southon et al. 2013). Excess loading of nutrient pollutants
by human activities affects hundreds of coastal and marine ecosystems and has been linked to a
'missing' biomass of flora and fauna (Diaz and Rosenberg 2008).
At a global scale there is also evidence that low crop yields may be caused by surface (tropospheric) ozone (O3) pollution (Tai et al. 2014);
elevated O3 levels are also linked to chemical pollutants. It was projected that by 2030 O3 precursors could cause crop yield losses for
wheat (4–26%), soybeans (9.5–19%) and maize (2.5–8.7%) globally (Avnery et al. 2011). Reduction of crop yield due to O3 exposure has also
been reported by several regional experimental and model studies (Debaje, 2014, Hollaway et al., 2012, Kumari et al., 2020). The yield losses
occur as a result of plant physiological interference with the O3 molecules such as the production of reactive oxygen species mainly through the
diffusion of O3 into the intercellular air space of plant leaves (Ainsworth 2017).
7. The chemical pollutant challenge for humanity: Discussion and questions
From a toxicological point of view, exposure to the vast array of modern chemicals and their billions of mixtures may cause acute or chronic
toxicity, or may pose no toxic risk to humans at all. This wide range of threats can be addressed using a risk-based approach (Siegrist and Bearth
2019). Due to methodological constraints and the varying susceptibility to toxins among humans, there are only a few reports showing direct
quantitative, whole life-span analyses of fatalities attributed to environmental pollutants (sections 2–6). Nevertheless, compilation of the
substantial evidence of the health burden caused by chemical pollution both shows and predicts the impairment of normal human life expectancy
by direct exposure to pollutants, food contamination and fertility decline (Fig. 4) (Aitken et al., 2004, Hou and Ok, 2019, Rabl, 2006).
Rockström et al. (2009) described nine planetary boundaries that humans ought not to breach for our own safety: climate change, ocean
acidification, stratospheric ozone, global phosphorus and nitrogen cycles, atmospheric aerosol loading, freshwater use, land-use change,
biodiversity loss, and chemical pollution. Later, 'chemical pollution' was no longer treated as a single, separate entry (Condé et al. 2015), since chemical pollutants also cause
climate change (e.g. emissions of CO2, methane and other greenhouse gases), ocean acidification due to elevated CO2, depletion of
stratospheric ozone due to released halocarbons, and interruption of P and N cycles. As pointed out here, atmospheric aerosol loading is
another aspect of anthropogenic chemical pollution (Singh et al. 2017), and ambient air pollutants are responsible for millions of premature
deaths and cost many billions of dollars (West et al. 2016).
Every year thousands of new chemicals are produced and most of them remain beyond current risk assessment regulations
(Sala and Goralczyk, 2013, Wang et al., 2020). The effects of mixed pollutants are especially unclear (Heys et al., 2016, Konkel, 2017). This is due
to inadequate methodology to assess the interaction of chemical mixtures and the risk factors for human health (Heys et al. 2016), although the
effects of mixed pollutants on human health are probably physiologically more relevant than those of any single pollutant (Carpenter et al.
2002). Global climate change, including warming and extreme climatic conditions, will exacerbate human
exposure to chemical pollutants present in soil and water (Biswas et al. 2018). Erosion and aerial transport of polluted soil
or acidification of soil and water causing mobilisation of toxic heavy metal(loid)s are two mechanisms by which this can occur. There is in
general far too long a delay between scientific discovery of pollution problems and their effects, and regulations and actions to abate them.
It is likely humanity is approaching a dangerous tipping point due to our release of geogenic and
anthropogenic synthetic chemicals (Table 1, Table 2 and SI Box S1). This raises the issue that, as yet, no scientifically credible
estimate has been made of humanity’s combined chemical impact on the Earth and on human health. This gap was highlighted by Rockström et
al. (2009) whose popular ‘global boundaries’ chart was unable to include a boundary for chemical emissions because of a lack of data and
suitable methodology. Public awareness is constrained by several issues, including the fact that toxic chemicals are now so widely dispersed
throughout the Earth’s biosphere that their origins are untraceable, that cases of poisoning may take decades to be officially noticed,
researched and proven, that the polluters may not be aware of or well-equipped to curb the pollution, and that consumers and many professionals
may be insufficiently educated in the risks. Aftermath analyses of several local incidents have revealed the insufficiency of
knowledge regarding the effects of synthetic chemicals. The Bhopal Union Carbide gas disaster of 1984 is one such case, in which
gaseous contaminant levels were so high that people died immediately following exposure.
Consequently, humanity is unaware of how near or far it is from exceeding the Earth’s capacity to
‘absorb’ or safely process our total chemical releases, which grows by many billions of tonnes with each
passing year. This represents a potential catastrophic risk to the human future and merits global
scientific scrutiny on the same scale and urgency as the effort devoted to climate change.
The Patent Eligibility Restoration Act solves by creating a more predictable innovation
environment. Limiting judicial discretion means innovators are less likely to rely on
trade secrets and will disclose their inventions
Maxwell H. Terry, 2023 – Managing Editor, Minnesota Law Review, Vol. 108 J.D. Candidate 2024,
University of Minnesota Law School. “Hello, World? Domestic Software Patent Protection Stands Alone
Due to Uncertain Subject Matter Eligibility Jurisprudence” 108 Minn. L. Rev. 403, Nexis Uni, Accessed via
University of Michigan //DH
C. Recalibrating the Domestic Jurisprudence Through Lessons Learned from Domestic Legislative
Proposals and Global Trends
The United States has a patentable subject matter issue. Rather than base eligibility determinations
along predictable lines, the United States instead utilizes ever-changing common law requirements that
blur patentability requirements rather than clearly define them. Legislative proposals and global trends
offer tremendous insight into how to better support American innovation. First, Congress should
statutorily define categories of patent-ineligible subject matter such that judicial discretion is limited to
not extend beyond explicit bounds. Second, due to software's importance and prevalence in today's
economy,309 the Patent Act should specifically consider software and computer-implemented
inventions. Finally, the Patent Act should clarify that patentable subject matter determinations are a
distinct requirement apart from, and without consideration of, other patentability requirements. Such
changes would align the United States' patent system with global standards while simultaneously
increasing predictability before the USPTO and federal courts.
First, the Patent Act should specifically enumerate categories of ineligible subject matter and limit a
court's discretion to make eligibility determinations beyond those defined. Judicially created categories
are amorphous, unworkable, and ultimately not essential considering the remainder of the Patent
Act.310 By moving away from common law development in this area, the standards for eligibility will be
clearer and more easily applied.311 Such a change also aligns with other IP5 members, who explicitly
enumerate ineligible subject matter categories.312 Additionally, by limiting court discretion to the mere
application of the statute to the invention at hand, a prospective inventor can more easily discern
whether their invention would be patentable.
Second, the Patent Act should specifically address software and computer-implemented inventions. One approach, as used by other IP5
members and proposed by PERA, is to eliminate patentability for software "as such," but to allow patents on discrete applications of
software.313 PERA further divided this analysis depending on the type of process sought to be patented, as non-technological processes remain
patent-eligible if they are solely capable of being performed through the use of a machine or manufacture, and a judge's evaluation of other
processes is constrained by statutory considerations.314 As it stands, software is often patented as a "process" under §101. Specifically
defining what types of software inventions are patentable, as Japan has done,315 would promote
consistency, simplify the application process, and incentivize inventors in computer-related fields to
publish their inventions through the USPTO rather than hold onto them as proprietary trade
secrets.316 Defining software subject matter eligibility would also limit a court's ability to hold such
patents invalid on patentable subject matter grounds, further promoting predictability, consistency, and
uniformity.
Finally, the Patent Act should make it abundantly clear that the patentable subject matter analysis, along
with its many considerations, is a doctrine distinct from other patentability requirements. One major
complaint with the current Alice/Mayo framework is that it wrongfully conflates subject matter eligibility issues with requirements embodied
by other portions of the Patent Act, leading to inconsistent applications of §101.317 A PERA approach would correct this by
limiting a court's discretion to make considerations outside of §101.318 Alternatively, the Patent Act could follow
Japan or Europe's approach, wherein the subject matter eligibility determination goes to whether the claimed subject matter constitutes a
statutory "invention" at all, and a court only applies other patentability doctrines upon a finding that the claimed subject matter is, in fact, an
invention.319 Either outcome would increase predictability as an inventor will know which metrics their patent will be evaluated by. While
this may cause fewer infringement cases to be dismissed early in litigation, §101 eligibility should not
draw from other portions of the Patent Act for the sake of judicial efficiency.
CONCLUSION
Software innovation is paramount to the development of modern society and will only continue to grow
in importance as emerging technologies such as quantum computing, artificial intelligence, and
automation become increasingly prevalent. The United States' patent system currently stands to hinder
the innovation of software and computer-implemented inventions more than it stands to help it, as the
common law development of the patentable subject matter doctrine has enshrouded the patent system
in a cloud of unpredictability. To right the course, and better align the United States with global trends,
Congress should take action to modify §101. Specifically, Congress should amend the Patent Act to
explicitly define categories of ineligible subject matter, provide specific clauses directed to the
patentability of software and computer-implemented inventions, and statutorily prevent courts from
making eligibility determinations based on patentability requirements outside of the four corners of
§101 itself. The Patent Eligibility Restoration Act accomplishes many of these goals and has already been
introduced in the Senate.320 These changes would promote a more consistent approach and ensure
that the United States retains its powerful position at the center of the global technology market.
*Other doctrines of patentability will exclude bad patents. But a broad standard for
subject matter eligibility prevents preempting entire fields of technology
Chad Rafetto, 2024 - Judicial Law Clerk for Chief Judge Colm F. Connolly of the United States District
Court for the District of Delaware. J.D., May 2022, New York University School of Law “Fostering
Innovation Through a Legislative Overhaul of Patentable Subject Matter”, 32 Fed. Cir. B.J. 93, Nexis Uni,
accessed via University of Michigan //DH
D. Applying this Article's Proposal to Recent Supreme Court Cases to Demonstrate Its Practicality
One of the goals of this Article's proposal is to shift the work currently being done by the subject matter
doctrine to the other doctrines of patentability. This shift requires that the other doctrines can perform
that exclusionary work. In Mayo, the Supreme Court said:
We recognize that, in evaluating the significance of additional steps, the § 101 patent-eligibility inquiry and, say, the § 102 novelty inquiry might
sometimes overlap. But that need not always be so. And to shift the patent-eligibility inquiry entirely to these later sections risks creating
significantly greater legal uncertainty, while assuming that those sections can do work that they are not equipped to do.231
[*126] But the Supreme Court is wrong. The other doctrines are adequately equipped to prevent
undesirable patents. To prove this theory, this section will analyze recent Supreme Court decisions and
demonstrate how the same result could have been reached through rulings based on other doctrines of
patentability besides § 101.
In reaching the same result as the Court did, the emphasis is not on whether the Court made the correct
decision, if such a thing could even be proven.232 The purpose of this exercise is to show that patent law
can be effectuated in the same way without relying on § 101 as heavily as the Court currently does.
This overuse is significant because § 101 acts as a coarse filter; when decisions are rendered on a § 101
basis, there is a larger preemptive effect than for rejections based on the other doctrines of
patentability.233 Thus, invalidating a patent under the novelty, nonobviousness, written description, or
enablement doctrines means that the specific claim was not patentable, but rejecting a patent on
subject matter grounds means that inventions of this general kind are not eligible for patent protection.
Professor Michael Risch previously conducted this exercise in 2008 and discussed how the pre-2008 patentable subject matter case law could
have been decided on different grounds.234 To avoid being duplicative, the current analysis will focus on the post-2008 Supreme Court cases
Bilski, Mayo, Myriad, and Alice.
In Bilski, the patent dealt with the business method of hedging.235 Bilski is a rare case because though the patent had been pending for eleven
years, the USPTO had never examined the patent for any element of patentability except for subject matter.236 Because § 101 is a threshold
inquiry, the USPTO, the parties litigating the issue, and each of the courts never even reached the [*127] other doctrines nor presented
arguments on these other doctrines.237 Despite this dearth of argumentation on the other aspects of patentability, both Judge Newman's
dissent at the Federal Circuit and Justice Stevens's concurrence at the Supreme Court mentioned § 112 for the proposition that the claim
language was too broad to be supported by the invention itself.238 Both Newman's dissent and Stevens's concurrence seem to suggest that the
majority opinions' reasonings were wrongly conflating preemption with claim breadth. So Bilski most likely could have been decided as a § 112
case because the invention did not support the breadth of the patent's claims and thus the patent still would have been denied.
Mayo, a case about using thiopurines to treat autoimmune disease, was more straightforward than Bilski because the government argued in an
amicus brief that the invention met the patentable subject matter bar, but that the patent is likely invalid under novelty or nonobviousness.239
The government argued this point because the only element that was distinct from the prior art was the last step of the process, the doctor's
mental inference.240 Because of the similarity between the prior art and the invention, either novelty or nonobviousness could have been used
to decide Mayo. It is also worth noting that the Supreme Court's only reasons for rejecting the government's position, besides their misguided
belief that the other sections are not capable of performing this function, were because it was inconsistent with prior doctrine and because the
Court did not wish to risk "creating significantly greater legal uncertainty."241 A legislative change would nullify the first concern because the
prior doctrine would no longer be relevant. As to the second concern, [*128] the Supreme Court has it backwards: the prior inconsistent case
law has created legal uncertainty.242
Myriad, a case about the patentability of isolated DNA and cDNA,243 can also be resolved based on novelty and nonobviousness grounds. First,
the cDNA was deemed to be patentable, so a lower subject matter bar would not change that result.244 The isolated DNA segments, however,
were deemed not patentable because the DNA segment was a product of nature and isolating it did not make it patent eligible.245 Both of
these arguments are similar to the arguments that would be made for novelty and nonobviousness, respectively. The isolated DNA segments
have occurred before and are anticipated. And the act of isolating the DNA is obvious because it is not distinct enough from the prior art. Thus,
Myriad also could have been decided based on novelty or nonobviousness grounds.
Last, Alice was a case about a computer-implemented scheme for mitigating settlement risk.246 The patent was ruled ineligible on subject
matter grounds because it did nothing more than "instruct the practitioner to implement the abstract idea of intermediated settlement on a
generic computer."247 At the Federal Circuit, then Judge Moore's dissenting opinion specifically argued, "Section 102's novelty or § 103's
nonobviousness requirements are the means to challenge a system claim that does no more than take a familiar, well known concept and put it
on a computer."248 Thus, these doctrines could also have been used to rule this patent invalid.
Conclusion
Patent law's current § 101 jurisprudence is broken. The Federal Circuit and district courts cannot
effectively apply it, its case law is inconsistent, and, most importantly, it is harming innovation. By
harming innovation, § 101 is acting contrary to the goals of patent law. Accordingly, change is needed.
Congress is best situated to right the doctrine because of the Supreme Court's reluctance to touch upon
the topic, the innate difficulty of correcting the doctrine judicially, and the need for complex technical
knowledge.
[*129] This Article proposes a legislative change to patentable subject matter that turns the inquiry into
solely whether the invention fits within one of the inclusion categories, with no weight to be given to
the current carve-outs. In doing so, more of the exclusionary work will be shifted to the other doctrines
of patentability. This shift is preferred because of the smaller preemptive effect that a rejection under
one of these doctrines has compared to the preemptive effect of a subject matter rejection. Thus, even
if the rejection rates remain the same, fewer industries will think they are foreclosed from the patent
system while the same "bad inventions" that patent policy would aim to exclude can still be rejected.
*Patents are empirically central to innovation and have been responsible for prior
tech revolutions
Adam Mossoff, 24 - Professor of Law, Antonin Scalia Law School George Mason University. Statement
before the Committee on the Judiciary, Subcommittee on Intellectual Property, United States Senate
“The Patent Eligibility Restoration Act – Restoring Clarity, Certainty, and Predictability to the U.S. Patent
System” 1/24, https://www.judiciary.senate.gov/imo/media/doc/2024-01-23_-_testimony_-
_mossoff.pdf //DH
Before addressing a specific legal or policy debate in the patent system, it is necessary to first review the legal and economic evidence that sets
the framework for evaluating the current legal disputes and data. This is important, if only because there is widespread confusion today about
the key role of a patent system in promoting innovation and driving economic growth.
The patent system has been a key driver
of the U.S. innovation economy for over 200 years, as economists, historians, and legal scholars have
repeatedly demonstrated.16 The patent system was central to the successes of the Industrial Revolution
in the nineteenth century, the pharmaceutical and computer revolutions in the twentieth century, and
the biotech and mobile revolutions in the early twenty-first century. Studies have consistently shown
that patent systems that secure reliable and effective property rights to inventors strongly correlate
with successful innovation economies.17
Dr. Zorina Khan, an award-winning economist, has demonstrated that reliable and effective property
rights in innovation—patents—were a key factor in thriving markets for technology in the United States
in the nineteenth century.18 Other economists have also identified features of these robust nineteenth-
century markets in new technologies—such as an increase in “venture capital” investment in patent
owners, the rise of a secondary market in the sale of patents as assets, and the embrace of
specialization via licensing business models—as indicators of value-maximizing economic activity that
were made possible by reliable and effective patents.19 All of this remains true today: a twenty-first-
century startup with a patent more than doubles its chances of securing venture capital financing
compared to a startup without a patent, and this patent-based startup has statistically-significant increased chances of success
in the marketplace as well.20 These are the academic and scholarly analysis that confirm the everyday experience of what most people have
seen in Shark Tank, in which the venture capitalists always ask the entrepreneurs if they have patents for their inventions as a precondition to
investing in their new products or services.
The real-world results of reliable and effective property rights—whether in land or in inventions—are
extensive private investments, development of new products and services, and the creation and growth
of new commercial markets in which consumers benefit from new products and services. These have
been the consistent features of the U.S. innovation economy from the Industrial Revolution through
today’s mobile revolution. They were made possible by a patent system that was as innovative itself as
the inventions it promoted and secured in the marketplace.