
CHAPPiE WORKBOOK

12 ENGLISH 2021

1
2
Explain how this character changes in the movie. What did you
learn from this character? Make sure you use evidence to prove
what you say.

3
Explain how this character changes in the movie.
What did you learn from this character? Make sure
you use evidence to prove what you say.

4
Explain how this character changes in the movie. What
did you learn from this character? Make sure you use
evidence to prove what you say.

5
Explain how this character changes in the movie. What
did you learn from this character? Make sure you use
evidence to prove what you say.

6
Explain how this character changes in the movie. What
did you learn from this character? Make sure you use
evidence to prove what you say.

7
Explain how this character changes in the movie. What
did you learn from this character? Make sure you use
evidence to prove what you say.

8
Why was the abandoned power station used as a location? How does it
make the audience feel about what is happening?

9
Close viewing: analysis of scene (Practice)
Verbal or oral feature | Description of feature | Effect of feature/impact on audience | Director's purpose/message – link to ideas/themes
Music

Sound Effects

Dialogue

10
Visual feature | Description of feature | Effect of feature/impact on audience | Director's purpose/message – link to ideas/themes
Cinematography

Actions

Costume

11
Lighting

Symbolism

12
Brainstorm the main ideas explored in the film

13
Brainstorm the main ideas explored in the film

14
Important techniques used in the film (visual and verbal)

15
Choose 4 techniques and explain in detail how and why they are used in
the sequence.
Technique | How and why used

16
Technique | How and why used

17
Choose an idea and explain how it is important to the world.

18
Choose an idea and explain how it is important to the world.

19
Compare CHAPPiE to another film you have seen. Think about both
similarities and differences.
Film title:

20
Compare CHAPPiE to another film you have seen. Think about both
similarities and differences.
Film title:

21
Fearing Bombs That Can Pick Whom to Kill
By John Markoff
Nov 11, 2014
On a bright fall day last year off the coast of Southern California, an Air Force B-1
bomber launched an experimental missile that may herald the future of warfare.
Initially, pilots aboard the plane directed the missile, but halfway to its destination, it
severed communication with its operators. Alone, without human oversight, the missile
decided which of three ships to attack, dropping to just above the sea surface and
striking a 260-foot unmanned freighter.
Warfare is increasingly guided by software. Today, armed drones can be operated by
remote pilots peering into video screens thousands of miles from the battlefield. But
now, some scientists say, arms makers have crossed into troubling territory: They are
developing weapons that rely on artificial intelligence, not human instruction, to decide
what to target and whom to kill.
As these weapons become smarter and nimbler, critics fear they will become
increasingly difficult for humans to control — or to defend against. And while pinpoint
accuracy could save civilian lives, critics fear weapons without human oversight could
make war more likely, as easy as flipping a switch.
Britain, Israel and Norway are already deploying missiles and drones that carry out
attacks against enemy radar, tanks or ships without direct human control. After launch,
so-called autonomous weapons rely on artificial intelligence and sensors to select
targets and to initiate an attack.
Britain’s “fire and forget” Brimstone missiles, for example, can distinguish among tanks
and cars and buses without human assistance, and can hunt targets in a
predesignated region without oversight. The Brimstones also communicate with one
another, sharing their targets.
Armaments with even more advanced self-governance are on the drawing board,
although the details usually are kept secret. “An autonomous weapons arms race is
already taking place,” said Steve Omohundro, a physicist and artificial intelligence
specialist at Self-Aware Systems, a research center in Palo Alto, Calif. “They can
respond faster, more efficiently and less predictably.”
Concerned by the prospect of a robotics arms race, representatives from dozens of
nations will meet on Thursday in Geneva to consider whether development of these
weapons should be restricted by the Convention on Certain Conventional Weapons.
Christof Heyns, the United Nations special rapporteur on extrajudicial, summary or
arbitrary executions, last year called for a moratorium on the development of these
weapons.

22
The Pentagon has issued a directive requiring high-level authorization for the
development of weapons capable of killing without human oversight. But fast-moving
technology has already made the directive obsolete, some scientists say.
“Our concern is with how the targets are determined, and more importantly, who
determines them,” said Peter Asaro, a co-founder and vice chairman of
the International Committee for Robot Arms Control, a group of scientists that
advocates restrictions on the use of military robots. “Are these human-designated
targets? Or are these systems automatically deciding what is a target?”
Weapons manufacturers in the United States were the first to develop advanced
autonomous weapons. An early version of the Tomahawk cruise missile had the ability
to hunt for Soviet ships over the horizon without direct human control. It was withdrawn
in the early 1990s after a nuclear arms treaty with Russia.
Back in 1988, the Navy test-fired a Harpoon antiship missile that employed an early
form of self-guidance. The missile mistook an Indian freighter that had strayed onto
the test range for its target. The Harpoon, which did not have a warhead, hit the bridge
of the freighter, killing a crew member.
Despite the accident, the Harpoon became a mainstay of naval armaments and
remains in wide use.
In recent years, artificial intelligence has begun to supplant human decision-making in
a variety of fields, such as high-speed stock trading and medical diagnostics, and even
in self-driving cars. But technological advances in three particular areas have made
self-governing weapons a real possibility.
New types of radar, laser and infrared sensors are helping missiles and drones better
calculate their position and orientation. “Machine vision,” resembling that of humans,
identifies patterns in images and helps weapons distinguish important targets. This
nuanced sensory information can be quickly interpreted by sophisticated artificial
intelligence systems, enabling a missile or drone to carry out its own analysis in flight.
And computer hardware hosting it all has become relatively inexpensive — and
expendable.
The missile tested off the coast of California, the Long Range Anti-Ship Missile, is
under development by Lockheed Martin for the Air Force and Navy. It is intended to
fly for hundreds of miles, maneuvering on its own to avoid radar, and out of radio
contact with human controllers.

23
Images from a computer showing a strike by a Brimstone missile, a British weapon,
on an Islamic State armed truck in Iraq. The “fire and forget” missile can distinguish
among tanks and cars and buses without human assistance.
In a directive published in 2012, the Pentagon drew a line between semiautonomous
weapons, whose targets are chosen by a human operator, and fully autonomous
weapons that can hunt and engage targets without intervention.
Weapons of the future, the directive said, must be “designed to allow commanders
and operators to exercise appropriate levels of human judgment over the use of force.”
The Pentagon nonetheless argues that the new antiship missile is only
semiautonomous and that humans are sufficiently represented in its targeting and
killing decisions. But officials at the Defense Advanced Research Projects Agency,
which initially developed the missile, and Lockheed declined to comment on how the
weapon decides on targets, saying the information is classified.
“It will be operating autonomously when it searches for the enemy fleet,” said Mark A.
Gubrud, a physicist and a member of the International Committee for Robot Arms
Control, and an early critic of so-called smart weapons. “This is pretty sophisticated
stuff that I would call artificial intelligence outside human control.”
Paul Scharre, a weapons specialist now at the Center for a New American Security
who led the working group that wrote the Pentagon directive, said, “It’s valid to ask if
this crosses the line.”
Some arms-control specialists say that requiring only “appropriate” human control of
these weapons is too vague, speeding the development of new targeting systems that
automate killing.
Mr. Heyns, of the United Nations, said that nations with advanced weapons should
agree to limit their weapons systems to those with “meaningful” human control over
the selection and attack of targets. “It must be similar to the role a commander has
over his troops,” Mr. Heyns said.
Systems that permit humans to override the computer’s decisions may not meet that
criterion, he added. Weapons that make their own decisions move so quickly that
human overseers soon may not be able to keep up. Yet many of them are explicitly
designed to permit human operators to step away from controls. Israel’s antiradar
missile, the Harpy, loiters in the sky until an enemy radar is turned on. It then attacks
and destroys the radar installation on its own.

24
Norway plans to equip its fleet of advanced jet fighters with the Joint Strike Missile,
which can hunt, recognize and detect a target without human intervention. Opponents
have called it a “killer robot.”
Military analysts like Mr. Scharre argue that smarter weapons should be embraced
because they may result in fewer mass killings and civilian casualties. Smart weapons,
they say, do not commit war crimes.
On Sept. 16, 2011, for example, British warplanes fired two dozen Brimstone missiles
at a group of Libyan tanks that were shelling civilians. Eight or more of the tanks were
destroyed simultaneously, according to a military spokesman, saving the lives of many
civilians.
It would have been difficult for human operators to coordinate the swarm of missiles
with similar precision.
“Better, smarter weapons are good if they reduce civilian casualties or indiscriminate
killing,” Mr. Scharre said.

http://www.nytimes.com/2014/11/12/science/weapons-directed-by-robots-not-humans-raise-ethical-questions.html

25