HW Ted 2

1) Check part one 😉

1.1 TED Talk Part 1 script

Our emotions influence every aspect of our lives, from our health and how we learn to how we do
business and make decisions, big ones and small. Our emotions also influence how we connect with
one another. We’ve evolved to live in a world like this, but instead, we’re living more and more of our
lives like this. So, I’m on a mission to change that. I want to bring emotions back into our digital
experiences.
I started on this path 15 years ago. I was a computer scientist in Egypt, and I had just gotten accepted
to a Ph.D. program at Cambridge University. So, I did something quite unusual for a young newlywed
Muslim Egyptian wife: with the support of my husband, who had to stay in Egypt, I packed my bags and
I moved to England. At Cambridge, thousands of miles away from home, I realized I was spending more
hours with my laptop than I did with any other human. Yet despite this intimacy, my laptop had
absolutely no idea how I was feeling. It had no idea if I was happy, having a bad day, or stressed and
confused, and so that got frustrating. Even worse, as I communicated online with my family back
home, I felt that all my emotions disappeared in cyberspace. I was homesick, I was lonely, and on some
days, I was actually crying, but all I had to communicate these emotions was this. So, that got me
thinking, what if our technology could sense our emotions? What if our devices could sense how we felt
and react accordingly, just the way an emotionally intelligent friend would?
Our human face happens to be one of the most powerful channels that we all use to communicate
social and emotional states, everything from enjoyment and surprise to empathy and curiosity. In emotion
science, we call each facial muscle movement an action unit. So, for example, action unit 12, it’s not a
Hollywood blockbuster, it is actually a lip corner pull, which is the main component of a smile. Try it,
everybody. Let's get some smiles going on. So, we have about 45 of these action units, and they
combine to express hundreds of emotions.
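
As an editor's aside, a minimal Python sketch of the idea in this paragraph: action units are coded facial muscle movements, and combinations of them map to expressions. The AU numbers below follow the Facial Action Coding System used in emotion science; the tiny catalogue and the rules are simplified assumptions for illustration, not the speaker's system.

# Action units (AUs) are coded facial muscle movements (FACS numbering).
ACTION_UNITS = {
    6: "cheek raiser",
    12: "lip corner puller",   # the "lip corner pull" from the talk: the main component of a smile
    14: "dimpler",             # often involved in a smirk (an assumption in this sketch)
}

# Hypothetical rules: each expression is a combination of action units.
EXPRESSIONS = {
    "smile": {6, 12},          # a felt (Duchenne) smile combines AU6 and AU12
    "smirk": {14},
}

def classify(active_aus):
    """Return the first expression whose required AUs are all active."""
    for name, required in EXPRESSIONS.items():
        if required <= active_aus:
            return name
    return "neutral"

print(classify({6, 12}))       # -> smile

With the roughly 45 real action units the talk mentions, the same lookup idea scales to hundreds of combinations.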
Teaching a computer to read these facial emotions is hard, because these action units, they can be fast,
they’re subtle, and they combine in many different ways. So, take, for example, the smile and the
smirk. They look somewhat similar, but they mean very different things. So, the smile is positive, a
smirk is often negative. Sometimes a smirk can make you famous. But seriously, it's important
for a computer to be able to tell the difference between the two expressions.
So how do we do that? We give our algorithms tens of thousands of examples of people we know to be
smiling, from different ethnicities, ages, genders, and we do the same for smirks. And then, using deep
learning, the algorithm looks for all these textures and wrinkles and shape changes on our face, and
basically learns that all smiles have common characteristics, all smirks have subtly different
characteristics. And the next time it sees a new face, it essentially learns that, you know, this face has
the same characteristics as a smile, and it says, ‘Aha, you know, I recognize this. This is a smile
expression.’
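
The training recipe in this last paragraph (tens of thousands of labelled examples, deep learning over textures, wrinkles, and shape changes) can be sketched as a small supervised classifier. This is an editor's illustration in PyTorch, not the speaker's actual pipeline: the network size, the 48x48 grayscale face crops, the two-class smile/smirk setup, and the random stand-in tensors are all assumptions.

import torch
import torch.nn as nn

class ExpressionNet(nn.Module):
    # A tiny convolutional network: the conv layers pick up local texture,
    # wrinkle, and shape cues; the linear head maps them to expression classes.
    def __init__(self, n_classes=2):                     # 0 = smile, 1 = smirk
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 12 * 12, n_classes)   # 48x48 input -> 12x12 feature maps

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = ExpressionNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-ins for real labelled face crops; in practice these would be the
# tens of thousands of smile and smirk examples the talk describes.
faces = torch.randn(64, 1, 48, 48)
labels = torch.randint(0, 2, (64,))

for _ in range(5):                                       # a few illustrative steps
    opt.zero_grad()
    loss = loss_fn(model(faces), labels)
    loss.backward()
    opt.step()

With real labelled data in place of the random tensors, a loop like this is what lets the model learn the common characteristics of smiles and the subtly different ones of smirks, and then score a new face it has never seen.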
2) Do the gap-fill task (parts 2 and 3)

http://wowzahttp.cengage.com/natgeo/ngl/perspectives/per_bre_int_u01_full_support.mp4

1.2 TED Talk Part 2 script

So, the best way to demonstrate how this technology works is to try a live demo, so I need a volunteer,
preferably somebody with a face. Cloe’s going to be our volunteer today.
So, let’s give this a try.
1___________, the algorithm has essentially found Cloe’s face, so it’s this white bounding box, and it’s
tracking the main feature points on her face, so her eyebrows, her eyes, her mouth and her nose. The
question is, can it recognize her expression? So, we’re going to test the machine. So, first of all, give me
your poker face. Yep, awesome. And then as she smiles, this is 2_____________, it’s great. So, you can
see the green bar go up as she smiles. Now that was a big smile. Can you try like a 3_______ smile to
see if the computer can recognize? It does recognize subtle smiles as well. We’ve worked really hard to
make that happen. And then 4_______________, indicator of surprise. 5_____________, which is an
indicator of 6___________. 7_______. Yes, perfect. So, on the right side of the demo – look like you're
happy. So, that's joy. Joy fires up. And then give me a 8___________ face. Yeah, 9_________ your
nose. Awesome.
So, so far, we have 10___________ 12 billion of these emotion data points. It's the largest emotion
database in the world. We’ve collected it from 2.9 million face videos, people who have agreed to share
their emotions with us, and from 75 countries around the world. It’s growing every day. It
11________________ that we can now 12_________ something as personal as our emotions, and we
can do it at this 13___________.

1.3 TED Talk Part 3 script

So, what have we learnt to date? Gender. Our data confirms something that you might 14_________.
Women are more 15_____________ than men. Let’s do culture. So, in the United States, women are 40
percent more expressive than men, but curiously, we don’t 16___________________ the UK between
men and women. Age: people who are 50 years and older are 25 percent more emotive than younger
people. Women in their 20s smile a lot more than men the same age, perhaps
17___________________ dating.
Where is this data used today? I want to share some examples that are especially 18__________ my
heart. 19___________________ wearable glasses can help individuals who are 20________________
read the faces of others, and it can help individuals on the autism spectrum interpret emotion,
something that they really 21_____________ with. What if your 22_____________ tracked your mood,
or your car sensed that you’re tired, or perhaps your fridge knows that you’re stressed, so it auto-locks
to 23___________ you from 24_________________ eating. I would like that, yeah.
I think in five years 25__________ the line, all our devices are going to have an emotion chip. As more
and more of our lives become digital, we are fighting a losing battle trying to 26__________ our usage
of devices in order to 27_____________ our emotions. So, what I’m trying to do instead is to bring
emotions into our technology and make our technologies more 28____________. So, I want those
devices that have 29____________ us to bring us 30_______________. And by humanizing technology,
we have this 31___________________________ to 32______________ how we connect with
machines, and 33__________, how we, as human 34___________, connect with
35__________________.
Thank you.
