EP2638460B1
Note: Within nine months of the publication of the mention of the grant of the European patent in the European Patent
Bulletin, any person may give notice to the European Patent Office of opposition to that patent, in accordance with the
Implementing Regulations. Notice of opposition shall not be deemed to have been filed until the opposition fee has been
paid. (Art. 99(1) European Patent Convention).

portable (e.g., a notebook computer, tablet computer, or handheld device). In some examples, the device has a touchpad. In some examples, the device has a touch-sensitive display (also known as a "touch screen" or "touch screen display"). In some examples, the device has a graphical user interface (GUI), one or more processors, memory and one or more modules, programs or sets of instructions stored in the memory for performing multiple functions. In some examples, the user interacts with the GUI primarily through finger contacts and gestures on the touch-sensitive surface. In some examples, the functions may include image editing, drawing, presenting, word processing, website creating, disk authoring, spreadsheet making, game playing, telephoning, video conferencing, e-mailing, instant messaging, workout support, digital photographing, digital videoing, web browsing, digital music playing, and/or digital video playing. Executable instructions for performing these functions may be included in a non-transitory computer readable storage medium or other computer program product configured for execution by one or more processors.

[0008] In accordance with some examples, a method is performed at an electronic device with a display and a touch-sensitive surface. The method includes: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion.

[0009] In accordance with the claimed invention, a method is performed at an electronic device with a display and a touch-sensitive surface. The method includes: displaying a first keyboard on the display, the first keyboard comprising a first plurality of keys; detecting a key activation gesture at a first time at a location on the touch-sensitive surface that corresponds to a location of a first key in the first keyboard; in response to detecting the key activation gesture at the first time, activating the first key; detecting one or more contacts on the touch-sensitive surface at a second time after the first time, the one or more contacts corresponding to a keyboard selection gesture; and, in response to detecting the one or more contacts that correspond to the keyboard selection gesture at the second time after the first time: replacing the first keyboard with a second keyboard when the second time exceeds a predefined period of time after the first time; and maintaining display of the first keyboard when the second time is less than the predefined period of time after the first time.

[0010] In accordance with some examples, a method is performed at an electronic device with a display and a touch-sensitive surface. The method includes: concurrently displaying a first text entry area and an integrated input area on the display, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion; detecting a first contact on the touch-sensitive surface at a location that corresponds to the center portion of the integrated input area; detecting movement of the first contact along the touch-sensitive surface; in response to detecting movement of the first contact along the touch-sensitive surface, moving the integrated input area in accordance with the movement of the first contact when the movement of the first contact exceeds a first movement threshold; detecting a second contact, distinct from the first contact, on the touch-sensitive surface at a location that corresponds to the split keyboard; detecting movement of the second contact along the touch-sensitive surface; and, in response to detecting movement of the second contact along the touch-sensitive surface, moving the integrated input area in accordance with the movement of the second contact when the movement of the second contact exceeds a second movement threshold, the second movement threshold being greater than the first movement threshold.

[0011] In accordance with some examples, a method is performed at an electronic device with a display and a touch-sensitive surface. The method includes: concurrently displaying on the display an application content area with a first size, and an input area with a keyboard, the input area being adjacent to and separate from the application content area with the first size, the input area being at a bottom of the display; detecting a gesture on the touch-sensitive surface; in response to detecting the gesture on the touch-sensitive surface: moving the input area away from the bottom of the display over the application content area; and increasing the application content area to a second size larger than the first size.

[0012] In accordance with some examples, a method is performed at an electronic device with a display and a touch-sensitive surface. The method includes: concurrently displaying a text entry area, a left side of a split keyboard, and a right side of a split keyboard, the left side of the split keyboard including a plurality of rows of keys and the right side of the split keyboard including a corresponding plurality of rows of keys; detecting a gesture at a location on the touch-sensitive surface that corresponds to a predefined area adjacent to and to the right of a rightmost key in a respective row of the left side of the split keyboard, wherein the rightmost key in the respective row of the left side of the split keyboard is unique to the left side of the split keyboard; and, in response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the predefined area adjacent to and to the right of the rightmost key in the respective row of the left side of the split keyboard, entering in the text entry area a character that corresponds to a leftmost key in a corresponding respective row of the right side of the split keyboard.

[0013] In accordance with some examples, a method is performed at an electronic device with a display and a
touch-sensitive surface. The method includes: concurrently displaying a first text entry area and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion with a second text entry area, the center portion in between the left portion and the right portion; detecting a gesture at a location on the touch-sensitive surface that corresponds to a location of a character key in the split keyboard; and, in response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the location of the character key in the split keyboard, inputting and concurrently displaying the corresponding character in the first text entry area and the second text entry area on the display.

[0014] In accordance with some examples, a method is performed at an electronic device with a display and a touch-sensitive surface. The method includes: concurrently displaying on the display an application content area that includes one or more text entry areas, and an input area with a keyboard that is displayed over the application content area; detecting a drag gesture on the touch-sensitive surface at a location that corresponds to the input area on the display; in response to detecting the drag gesture, moving the input area on the display in accordance with the drag gesture; detecting a flick gesture on the touch-sensitive surface at a location that corresponds to the input area on the display; and, in response to detecting the flick gesture, moving the input area on the display with inertia in accordance with the flick gesture such that the input area comes to rest at a location adjacent to and just below a text entry area in the application content area.

[0015] In accordance with some examples, a method is performed at an electronic device with a display and a touch-sensitive surface. The method includes: concurrently displaying on the display a first text entry area, and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion; detecting a first input on the touch-sensitive surface; in response to detecting the first input, entering a reconfiguration mode for the integrated input area; and, while in the reconfiguration mode for the integrated input area: detecting a second input by a first thumb and/or a second thumb; in response to detecting the second input, adjusting the size of at least one of the left side and the right side of the split keyboard in the integrated input area; detecting a third input; and, in response to detecting the third input, exiting the reconfiguration mode for the integrated input area.

[0016] In accordance with some examples, an electronic device includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion.

[0017] In accordance with the claimed invention, an electronic device includes a display including a touch-sensitive surface, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: displaying a first keyboard on the display, the first keyboard comprising a first plurality of keys; detecting a key activation gesture at a first time at a location on the touch-sensitive surface that corresponds to a location of a first key in the first keyboard; in response to detecting the key activation gesture at the first time, activating the first key; detecting one or more contacts on the touch-sensitive surface at a second time after the first time, the one or more contacts corresponding to a keyboard selection gesture; and, in response to detecting the one or more contacts that correspond to the keyboard selection gesture at the second time after the first time: replacing the first keyboard with a second keyboard when the second time exceeds a predefined period of time after the first time; and maintaining display of the first keyboard when the second time is less than the predefined period of time after the first time.

[0018] In accordance with some examples, an electronic device includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: concurrently displaying a first text entry area and an integrated input area on the display, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion; detecting a first contact on the touch-sensitive surface at a location that corresponds to the center portion of the integrated input area; detecting movement of the first contact along the touch-sensitive surface; in response to detecting movement of the first contact along the touch-sensitive surface, moving the integrated input area in accordance with the movement of the first contact when the movement of the first contact exceeds a first movement threshold; detecting a second contact, distinct from the first contact, on the touch-sensitive surface at a location that corresponds to the split keyboard; detecting movement of the second contact along the touch-sensitive surface; and, in response to detecting movement of the second contact along the touch-sensitive surface, moving the integrated
input area in accordance with the movement of the second contact when the movement of the second contact exceeds a second movement threshold, the second movement threshold being greater than the first movement threshold.

[0019] In accordance with some examples, an electronic device includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: concurrently displaying on the display an application content area with a first size, and an input area with a keyboard, the input area being adjacent to and separate from the application content area with the first size, the input area being at a bottom of the display; detecting a gesture on the touch-sensitive surface; in response to detecting the gesture on the touch-sensitive surface: moving the input area away from the bottom of the display over the application content area; and increasing the application content area to a second size larger than the first size.

[0020] In accordance with some examples, an electronic device includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: concurrently displaying a text entry area, a left side of a split keyboard, and a right side of a split keyboard, the left side of the split keyboard including a plurality of rows of keys and the right side of the split keyboard including a corresponding plurality of rows of keys; detecting a gesture at a location on the touch-sensitive surface that corresponds to a predefined area adjacent to and to the right of a rightmost key in a respective row of the left side of the split keyboard, wherein the rightmost key in the respective row of the left side of the split keyboard is unique to the left side of the split keyboard; and, in response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the predefined area adjacent to and to the right of the rightmost key in the respective row of the left side of the split keyboard, entering in the text entry area a character that corresponds to a leftmost key in a corresponding respective row of the right side of the split keyboard.

[0021] In accordance with some examples, an electronic device includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: concurrently displaying a first text entry area and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion with a second text entry area, the center portion in between the left portion and the right portion; detecting a gesture at a location on the touch-sensitive surface that corresponds to a location of a character key in the split keyboard; and, in response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the location of the character key in the split keyboard, inputting and concurrently displaying the corresponding character in the first text entry area and the second text entry area on the display.

[0022] In accordance with some examples, an electronic device includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: concurrently displaying on the display an application content area that includes one or more text entry areas, and an input area with a keyboard that is displayed over the application content area; detecting a drag gesture on the touch-sensitive surface at a location that corresponds to the input area on the display; in response to detecting the drag gesture, moving the input area on the display in accordance with the drag gesture; detecting a flick gesture on the touch-sensitive surface at a location that corresponds to the input area on the display; and, in response to detecting the flick gesture, moving the input area on the display with inertia in accordance with the flick gesture such that the input area comes to rest at a location adjacent to and just below a text entry area in the application content area.

[0023] In accordance with some examples, an electronic device includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: concurrently displaying on the display a first text entry area, and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion; detecting a first input on the touch-sensitive surface; in response to detecting the first input, entering a reconfiguration mode for the integrated input area; and, while in the reconfiguration mode for the integrated input area: detecting a second input by a first thumb and/or a second thumb; in response to detecting the second input, adjusting the size of at least one of the left side and the right side of the split keyboard in the integrated input area; detecting a third input; and, in response to detecting the third input, exiting the reconfiguration mode for the integrated input area.

[0024] In accordance with some examples, a graphical user interface on an electronic device with a display, a touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in the memory includes concurrently displayed: a first text entry area and an unsplit keyboard; wherein: in response to detection of a gesture on the touch-sensitive surface,
the unsplit keyboard is replaced with an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion.

[0025] In accordance with some examples, a graphical user interface on an electronic device with a display, a touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in the memory includes a first keyboard, the first keyboard comprising a first plurality of keys; wherein: a key activation gesture is detected at a first time at a location on the touch-sensitive surface that corresponds to a location of a first key in the first keyboard; in response to detecting the key activation gesture at the first time, the first key is activated; one or more contacts are detected on the touch-sensitive surface at a second time after the first time, the one or more contacts corresponding to a keyboard selection gesture; and, in response to detecting the one or more contacts that correspond to the keyboard selection gesture at the second time after the first time: the first keyboard is replaced with a second keyboard when the second time exceeds a predefined period of time after the first time; and display of the first keyboard is maintained when the second time is less than the predefined period of time after the first time.

[0026] In accordance with some examples, a graphical user interface on an electronic device with a display, a touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in the memory includes concurrently displayed: a first text entry area and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion; wherein: a first contact is detected on the touch-sensitive surface at a location that corresponds to the center portion of the integrated input area; movement of the first contact is detected along the touch-sensitive surface; in response to detecting movement of the first contact along the touch-sensitive surface, the integrated input area is moved in accordance with the movement of the first contact when the movement of the first contact exceeds a first movement threshold; a second contact, distinct from the first contact, is detected on the touch-sensitive surface at a location that corresponds to the split keyboard; movement of the second contact is detected along the touch-sensitive surface; and, in response to detecting movement of the second contact along the touch-sensitive surface, the integrated input area is moved in accordance with the movement of the second contact when the movement of the second contact exceeds a second movement threshold, the second movement threshold being greater than the first movement threshold.

[0027] In accordance with some examples, a graphical user interface on an electronic device with a display, a touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in the memory includes concurrently displayed: an application content area with a first size, and an input area with a keyboard, the input area being adjacent to and separate from the application content area with the first size, the input area being at a bottom of the display; wherein: a gesture is detected on the touch-sensitive surface; in response to detecting the gesture on the touch-sensitive surface: the input area is moved away from the bottom of the display over the application content area; and the application content area is increased to a second size larger than the first size.

[0028] In accordance with some examples, a graphical user interface on an electronic device with a display, a touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in the memory includes concurrently displayed: a text entry area, a left side of a split keyboard, and a right side of a split keyboard, the left side of the split keyboard including a plurality of rows of keys and the right side of the split keyboard including a corresponding plurality of rows of keys; wherein: a gesture is detected at a location on the touch-sensitive surface that corresponds to a predefined area adjacent to and to the right of a rightmost key in a respective row of the left side of the split keyboard, wherein the rightmost key in the respective row of the left side of the split keyboard is unique to the left side of the split keyboard; and, in response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the predefined area adjacent to and to the right of the rightmost key in the respective row of the left side of the split keyboard, a character that corresponds to a leftmost key in a corresponding respective row of the right side of the split keyboard is entered in the text entry area.

[0029] In accordance with some examples, a graphical user interface on an electronic device with a display, a touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in the memory includes concurrently displayed: a first text entry area, and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion with a second text entry area, the center portion in between the left portion and the right portion; wherein: a gesture is detected at a location on the touch-sensitive surface that corresponds to a location of a character key in the split keyboard; and, in response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the location of the character key in the split keyboard, the corresponding character is inputted and concurrently displayed in the first text entry area and the second text entry area on the display.

[0030] In accordance with some examples, a graphical user interface on an electronic device with a display, a touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in
the memory includes concurrently displayed: an application content area that includes one or more text entry areas, and an input area with a keyboard that is displayed over the application content area; wherein: a drag gesture is detected on the touch-sensitive surface at a location that corresponds to the input area on the display; in response to detecting the drag gesture, the input area is moved on the display in accordance with the drag gesture; a flick gesture is detected on the touch-sensitive surface at a location that corresponds to the input area on the display; and, in response to detecting the flick gesture, the input area is moved on the display with inertia in accordance with the flick gesture such that the input area comes to rest at a location adjacent to and just below a text entry area in the application content area.

[0031] In accordance with some examples, a graphical user interface on an electronic device with a display, a touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in the memory includes concurrently displayed: a first text entry area, and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion; wherein: a first input is detected on the touch-sensitive surface; in response to detecting the first input, a reconfiguration mode for the integrated input area is entered; and, while in the reconfiguration mode for the integrated input area: a second input by a first thumb and/or a second thumb is detected; in response to detecting the second input, the size of at least one of the left side and the right side of the split keyboard in the integrated input area is adjusted; a third input is detected; and, in response to detecting the third input, the reconfiguration mode for the integrated input area is exited.

[0032] In accordance with some examples, a non-transitory computer readable storage medium has stored therein instructions which when executed by an electronic device with a display and a touch-sensitive surface, cause the device to: concurrently display a first text entry area and an unsplit keyboard on the display; detect a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replace the unsplit keyboard with an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion.

[0033] In accordance with the claimed invention, a non-transitory computer readable storage medium has stored therein instructions which when executed by an electronic device with a display including a touch-sensitive surface, cause the device to: display a first keyboard on the display, the first keyboard comprising a first plurality of keys; detect a key activation gesture at a first time at a location on the touch-sensitive surface that corresponds to a location of a first key in the first keyboard; in response to detecting the key activation gesture at the first time, activate the first key; detect one or more contacts on the touch-sensitive surface at a second time after the first time, the one or more contacts corresponding to a keyboard selection gesture; and, in response to detecting the one or more contacts that correspond to the keyboard selection gesture at the second time after the first time: replace the first keyboard with a second keyboard when the second time exceeds a predefined period of time after the first time; and maintain display of the first keyboard when the second time is less than the predefined period of time after the first time.

[0034] In accordance with some examples, a non-transitory computer readable storage medium has stored therein instructions which when executed by an electronic device with a display and a touch-sensitive surface, cause the device to: concurrently display a first text entry area and an integrated input area on the display, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion; detect a first contact on the touch-sensitive surface at a location that corresponds to the center portion of the integrated input area; detect movement of the first contact along the touch-sensitive surface; in response to detecting movement of the first contact along the touch-sensitive surface, move the integrated input area in accordance with the movement of the first contact when the movement of the first contact exceeds a first movement threshold; detect a second contact, distinct from the first contact, on the touch-sensitive surface at a location that corresponds to the split keyboard; detect movement of the second contact along the touch-sensitive surface; and, in response to detecting movement of the second contact along the touch-sensitive surface, move the integrated input area in accordance with the movement of the second contact when the movement of the second contact exceeds a second movement threshold, the second movement threshold being greater than the first movement threshold.

[0035] In accordance with some examples, a non-transitory computer readable storage medium has stored therein instructions which when executed by an electronic device with a display and a touch-sensitive surface, cause the device to: concurrently display on the display an application content area with a first size, and an input area with a keyboard, the input area being adjacent to and separate from the application content area with the first size, the input area being at a bottom of the display; detect a gesture on the touch-sensitive surface; in response to detecting the gesture on the touch-sensitive surface: move the input area away from the bottom of the display over the application content area; and increase the application content area to a second size larger than the first size.

[0036] In accordance with some examples, a non-transitory computer readable storage medium has stored therein instructions which when executed by an electronic device with a display and a touch-sensitive surface,
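The two-tier drag behavior recited above (a first, lower movement threshold for a contact that starts in the center portion of the integrated input area, and a greater second movement threshold for a contact that starts over the split keyboard itself) can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the function name and the threshold values are assumptions, since the specification states only that the second threshold is greater than the first.

```python
# Illustrative sketch of the two-threshold drag logic. The names and the
# numeric thresholds are hypothetical; the specification requires only that
# the second threshold be greater than the first.

FIRST_MOVEMENT_THRESHOLD = 10.0   # points; contact starting in the center portion
SECOND_MOVEMENT_THRESHOLD = 40.0  # points; contact starting over the split keyboard

def should_move_input_area(start_region: str, movement: float) -> bool:
    """Return True if the integrated input area should follow the contact.

    start_region: 'center' if the contact began in the center portion,
                  'keyboard' if it began over the split keyboard.
    movement: distance the contact has moved along the touch-sensitive surface.
    """
    if start_region == "center":
        return movement > FIRST_MOVEMENT_THRESHOLD
    if start_region == "keyboard":
        return movement > SECOND_MOVEMENT_THRESHOLD
    return False
```

The higher threshold for contacts over the keyboard means a small drag there is ignored (it is likely an ordinary typing contact), while the same small drag from the center portion moves the input area.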
[0036] In accordance with some examples, a non-transitory computer readable storage medium has stored therein instructions which when executed by an electronic device with a display and a touch-sensitive surface, cause the device to: concurrently display a text entry area, a left side of a split keyboard, and a right side of a split keyboard, the left side of the split keyboard including a plurality of rows of keys and the right side of the split keyboard including a corresponding plurality of rows of keys; detect a gesture at a location on the touch-sensitive surface that corresponds to a predefined area adjacent to and to the right of a rightmost key in a respective row of the left side of the split keyboard, wherein the rightmost key in the respective row of the left side of the split keyboard is unique to the left side of the split keyboard; and, in response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the predefined area adjacent to and to the right of the rightmost key in the respective row of the left side of the split keyboard, enter in the text entry area a character that corresponds to a leftmost key in a corresponding respective row of the right side of the split keyboard.

[0037] In accordance with some examples, a non-transitory computer readable storage medium has stored therein instructions which when executed by an electronic device with a display and a touch-sensitive surface, cause the device to: concurrently display a first text entry area and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion with a second text entry area, the center portion in between the left portion and the right portion; detect a gesture at a location on the touch-sensitive surface that corresponds to a location of a character key in the split keyboard; and, in response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the location of the character key in the split keyboard, input and concurrently display the corresponding character in the first text entry area and the second text entry area on the display.

[0038] In accordance with some examples, a non-transitory computer readable storage medium has stored therein instructions which when executed by an electronic device with a display and a touch-sensitive surface, cause the device to: concurrently display on the display an application content area that includes one or more text entry areas, and an input area with a keyboard that is displayed over the application content area; detect a drag gesture on the touch-sensitive surface at a location that corresponds to the input area on the display; in response to detecting the drag gesture, move the input area on the display in accordance with the drag gesture; detect a flick gesture on the touch-sensitive surface at a location that corresponds to the input area on the display; and, in response to detecting the flick gesture, move the input area on the display with inertia in accordance with the flick gesture such that the input area comes to rest at a location adjacent to and just below a text entry area in the application content area.

[0039] In accordance with some examples, a non-transitory computer readable storage medium has stored therein instructions which when executed by an electronic device with a display and a touch-sensitive surface, cause the device to: concurrently display on the display a first text entry area, and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion; detect a first input on the touch-sensitive surface; in response to detecting the first input, enter a reconfiguration mode for the integrated input area; and, while in the reconfiguration mode for the integrated input area: detect a second input by a first thumb and/or a second thumb; in response to detecting the second input, adjust the size of at least one of the left side and the right side of the split keyboard in the integrated input area; detect a third input; and, in response to detecting the third input, exit the reconfiguration mode for the integrated input area.

[0040] In accordance with some examples, an electronic device includes: a display; a touch-sensitive surface; means for concurrently displaying a first text entry area and an unsplit keyboard on the display; means for detecting a gesture on the touch-sensitive surface; and, means for, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion.

[0041] In accordance with some examples, an electronic device includes: a display; a touch-sensitive surface; means for displaying a first keyboard on the display, the first keyboard comprising a first plurality of keys; means for detecting a key activation gesture at a first time at a location on the touch-sensitive surface that corresponds to a location of a first key in the first keyboard; means for, in response to detecting the key activation gesture at the first time, activating the first key; means for detecting one or more contacts on the touch-sensitive surface at a second time after the first time, the one or more contacts corresponding to a keyboard selection gesture; and, in response to detecting the one or more contacts that correspond to the keyboard selection gesture at the second time after the first time: means for replacing the first keyboard with a second keyboard when the second time exceeds a predefined period of time after the first time; and means for maintaining display of the first keyboard when the second time is less than the predefined period of time after the first time.
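The time-gated keyboard selection recited in [0041] (and in [0033] above) can be sketched as follows. This is a hedged illustration: the function name and the 0.3-second period are assumptions, as the specification speaks only of a "predefined period of time".

```python
# Illustrative sketch of the time-gated keyboard selection gesture in [0041].
# PREDEFINED_PERIOD and the function name are hypothetical.

PREDEFINED_PERIOD = 0.3  # seconds; assumed value for the predefined period

def keyboard_after_selection_gesture(first_time: float, second_time: float,
                                     current: str, requested: str) -> str:
    """Decide which keyboard to display after a keyboard selection gesture.

    first_time: time of the most recent key activation gesture.
    second_time: time of the keyboard selection gesture (after first_time).
    The requested keyboard replaces the current one only if enough time has
    passed since typing; otherwise the current keyboard is maintained, so
    stray multi-finger contacts during fast typing do not switch keyboards.
    """
    if second_time - first_time > PREDEFINED_PERIOD:
        return requested   # replace the first keyboard with the second
    return current         # maintain display of the first keyboard
```

The design point is debouncing: a multi-contact gesture arriving immediately after a keystroke is treated as accidental, while the same gesture after a pause is honored as a deliberate keyboard switch.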
[0042] In accordance with some examples, an electronic device includes: a display; a touch-sensitive surface; means for concurrently displaying a first text entry area and an integrated input area on the display, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion; means for detecting a first contact on the touch-sensitive surface at a location that corresponds to the center portion of the integrated input area; means for detecting movement of the first contact along the touch-sensitive surface; means for, in response to detecting movement of the first contact along the touch-sensitive surface, moving the integrated input area in accordance with the movement of the first contact when the movement of the first contact exceeds a first movement threshold; means for detecting a second contact, distinct from the first contact, on the touch-sensitive surface at a location that corresponds to the split keyboard; means for detecting movement of the second contact along the touch-sensitive surface; and, means for, in response to detecting movement of the second contact along the touch-sensitive surface, moving the integrated input area in accordance with the movement of the second contact when the movement of the second contact exceeds a second movement threshold, the second movement threshold being greater than the first movement threshold.

[0043] In accordance with some examples, an electronic device includes: a display; a touch-sensitive surface; means for concurrently displaying on the display an application content area with a first size, and an input area with a keyboard, the input area being adjacent to and separate from the application content area with the first size, the input area being at a bottom of the display; means for detecting a gesture on the touch-sensitive surface; in response to detecting the gesture on the touch-sensitive surface: means for moving the input area away from the bottom of the display over the application content area; and means for increasing the application content area to a second size larger than the first size.

[0044] In accordance with some examples, an electronic device includes: a display; a touch-sensitive surface; means for concurrently displaying a text entry area, a left side of a split keyboard, and a right side of a split keyboard, the left side of the split keyboard including a plurality of rows of keys and the right side of the split keyboard including a corresponding plurality of rows of keys; means for detecting a gesture at a location on the touch-sensitive surface that corresponds to a predefined area adjacent to and to the right of a rightmost key in a respective row of the left side of the split keyboard, wherein the rightmost key in the respective row of the left side of the split keyboard is unique to the left side of the split keyboard; and, means for, in response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the predefined area adjacent to and to the right of the rightmost key in the respective row of the left side of the split keyboard, entering in the text entry area a character that corresponds to a leftmost key in a corresponding respective row of the right side of the split keyboard.

[0045] In accordance with some examples, an electronic device includes: a display; a touch-sensitive surface; means for concurrently displaying a first text entry area and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion with a second text entry area, the center portion in between the left portion and the right portion; means for detecting a gesture at a location on the touch-sensitive surface that corresponds to a location of a character key in the split keyboard; and, means for, in response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the location of the character key in the split keyboard, inputting and concurrently displaying the corresponding character in the first text entry area and the second text entry area on the display.

[0046] In accordance with some examples, an electronic device includes: a display; a touch-sensitive surface; means for concurrently displaying on the display an application content area that includes one or more text entry areas, and an input area with a keyboard that is displayed over the application content area; means for detecting a drag gesture on the touch-sensitive surface at a location that corresponds to the input area on the display; means for, in response to detecting the drag gesture, moving the input area on the display in accordance with the drag gesture; means for detecting a flick gesture on the touch-sensitive surface at a location that corresponds to the input area on the display; and, means for, in response to detecting the flick gesture, moving the input area on the display with inertia in accordance with the flick gesture such that the input area comes to rest at a location adjacent to and just below a text entry area in the application content area.

[0047] In accordance with some examples, an electronic device includes: a display; a touch-sensitive surface; means for concurrently displaying on the display a first text entry area, and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion; means for detecting a first input on the touch-sensitive surface; means for, in response to detecting the first input, entering a reconfiguration mode for the integrated input area; and, while in the reconfiguration mode for the integrated input area: means for detecting a second input by a first thumb and/or a second thumb; means for, in response to detecting the second input, adjusting the size of at least one of the left side and the right side of the split keyboard in the integrated input area; means for detecting a third input; and, means for, in response to detecting the third input, exiting the reconfiguration mode for the integrated input area.
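The reconfiguration mode recited in [0047] (enter on a first input, resize the split-keyboard halves on thumb inputs while in the mode, exit on a third input) amounts to a small state machine. The sketch below is illustrative only; the class and method names are hypothetical, and the "scale" representation of size is an assumption.

```python
# Minimal state sketch of the reconfiguration mode in [0047]. Names and the
# scale-based size model are assumptions for illustration.

class IntegratedInputArea:
    def __init__(self, left_scale: float = 1.0, right_scale: float = 1.0):
        self.left_scale = left_scale     # size of the left side of the split keyboard
        self.right_scale = right_scale   # size of the right side
        self.reconfiguring = False

    def first_input(self) -> None:
        """Enter the reconfiguration mode for the integrated input area."""
        self.reconfiguring = True

    def second_input(self, left_scale: float, right_scale: float) -> None:
        """While reconfiguring, adjust the size of one or both keyboard sides."""
        if self.reconfiguring:
            self.left_scale = left_scale
            self.right_scale = right_scale

    def third_input(self) -> None:
        """Exit the reconfiguration mode."""
        self.reconfiguring = False
```

Gating the resize on the mode flag captures the claimed structure: outside the reconfiguration mode, the same thumb inputs leave the keyboard geometry unchanged.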
[0048] In accordance with some examples, an information processing apparatus for use in an electronic device with a display and a touch-sensitive surface includes: means for concurrently displaying a first text entry area and an unsplit keyboard on the display; means for detecting a gesture on the touch-sensitive surface; and, means for, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion.

[0049] In accordance with some examples, an information processing apparatus for use in an electronic device with a display and a touch-sensitive surface includes: means for displaying a first keyboard on the display, the first keyboard comprising a first plurality of keys; means for detecting a key activation gesture at a first time at a location on the touch-sensitive surface that corresponds to a location of a first key in the first keyboard; means for, in response to detecting the key activation gesture at the first time, activating the first key; means for detecting one or more contacts on the touch-sensitive surface at a second time after the first time, the one or more contacts corresponding to a keyboard selection gesture; and, in response to detecting the one or more contacts that correspond to the keyboard selection gesture at the second time after the first time: means for replacing the first keyboard with a second keyboard when the second time exceeds a predefined period of time after the first time; and means for maintaining display of the first keyboard when the second time is less than the predefined period of time after the first time.

[0050] In accordance with some examples, an information processing apparatus for use in an electronic device with a display and a touch-sensitive surface includes: means for concurrently displaying a first text entry area and an integrated input area on the display, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion; means for detecting a first contact on the touch-sensitive surface at a location that corresponds to the center portion of the integrated input area; means for detecting movement of the first contact along the touch-sensitive surface; means for, in response to detecting movement of the first contact along the touch-sensitive surface, moving the integrated input area in accordance with the movement of the first contact when the movement of the first contact exceeds a first movement threshold; means for detecting a second contact, distinct from the first contact, on the touch-sensitive surface at a location that corresponds to the split keyboard; means for detecting movement of the second contact along the touch-sensitive surface; and, means for, in response to detecting movement of the second contact along the touch-sensitive surface, moving the integrated input area in accordance with the movement of the second contact when the movement of the second contact exceeds a second movement threshold, the second movement threshold being greater than the first movement threshold.

[0051] In accordance with some examples, an information processing apparatus for use in an electronic device with a display and a touch-sensitive surface includes: means for concurrently displaying on the display an application content area with a first size, and an input area with a keyboard, the input area being adjacent to and separate from the application content area with the first size, the input area being at a bottom of the display; means for detecting a gesture on the touch-sensitive surface; in response to detecting the gesture on the touch-sensitive surface: means for moving the input area away from the bottom of the display over the application content area; and means for increasing the application content area to a second size larger than the first size.

[0052] In accordance with some examples, an information processing apparatus for use in an electronic device with a display and a touch-sensitive surface includes: means for concurrently displaying a text entry area, a left side of a split keyboard, and a right side of a split keyboard, the left side of the split keyboard including a plurality of rows of keys and the right side of the split keyboard including a corresponding plurality of rows of keys; means for detecting a gesture at a location on the touch-sensitive surface that corresponds to a predefined area adjacent to and to the right of a rightmost key in a respective row of the left side of the split keyboard, wherein the rightmost key in the respective row of the left side of the split keyboard is unique to the left side of the split keyboard; and, means for, in response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the predefined area adjacent to and to the right of the rightmost key in the respective row of the left side of the split keyboard, entering in the text entry area a character that corresponds to a leftmost key in a corresponding respective row of the right side of the split keyboard.

[0053] In accordance with some examples, an information processing apparatus for use in an electronic device with a display and a touch-sensitive surface includes: means for concurrently displaying a first text entry area and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion with a second text entry area, the center portion in between the left portion and the right portion; means for detecting a gesture at a location on the touch-sensitive surface that corresponds to a location of a character key in the split keyboard; and, means for, in response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the location of the character key in the split keyboard, inputting and concurrently displaying the corresponding character in the first text entry area and the second text entry area on the display.
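The dual text entry behavior in [0053] (a character typed on the split keyboard is input and concurrently displayed both in the first text entry area and in the second text entry area of the center portion) can be sketched minimally as below. The class and attribute names are hypothetical, and the specification does not say whether the second area mirrors all text or only recent text; the sketch simply mirrors every character.

```python
# Sketch of the dual text entry behavior in [0053]. Names are illustrative;
# mirroring the full text into the second area is an assumption.

class DualTextEntry:
    def __init__(self) -> None:
        self.first_area = ""   # first (main) text entry area in the application
        self.second_area = ""  # second text entry area in the center portion

    def type_character(self, char: str) -> None:
        """Input the character and display it concurrently in both areas."""
        self.first_area += char
        self.second_area += char
```

The second area keeps the typed text near the thumbs, so the user need not look away from the split keyboard to see what was entered.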
[0054] In accordance with some examples, an information processing apparatus for use in an electronic device with a display and a touch-sensitive surface includes: means for concurrently displaying on the display an application content area that includes one or more text entry areas, and an input area with a keyboard that is displayed over the application content area; means for detecting a drag gesture on the touch-sensitive surface at a location that corresponds to the input area on the display; means for, in response to detecting the drag gesture, moving the input area on the display in accordance with the drag gesture; means for detecting a flick gesture on the touch-sensitive surface at a location that corresponds to the input area on the display; and, means for, in response to detecting the flick gesture, moving the input area on the display with inertia in accordance with the flick gesture such that the input area comes to rest at a location adjacent to and just below a text entry area in the application content area.

[0055] In accordance with some examples, an information processing apparatus for use in an electronic device with a display and a touch-sensitive surface includes: means for concurrently displaying on the display a first text entry area, and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard; a right portion with a right side of the split keyboard; and a center portion in between the left portion and the right portion; means for detecting a first input on the touch-sensitive surface; means for, in response to detecting the first input, entering a reconfiguration mode for the integrated input area; and, while in the reconfiguration mode for the integrated input area: means for detecting a second input by a first thumb and/or a second thumb; means for, in response to detecting the second input, adjusting the size of at least one of the left side and the right side of the split keyboard in the integrated input area; means for detecting a third input; and, means for, in response to detecting the third input, exiting the reconfiguration mode for the integrated input area.

[0056] In accordance with some examples, an electronic device includes a display unit configured to concurrently display a first text entry area and an unsplit keyboard, a touch-sensitive surface unit configured to receive user gestures, and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to detect a gesture on the touch-sensitive surface unit, and, in response to detecting the gesture on the touch-sensitive surface unit, replace the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion.

[0057] In accordance with some examples, an electronic device includes a display unit configured to display a first keyboard, the first keyboard including a first plurality of keys; a touch-sensitive surface unit configured to receive user gestures; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to detect a key activation gesture at a first time at a location on the touch-sensitive surface unit that corresponds to a location of a first key in the first keyboard; in response to detecting the key activation gesture at the first time, activate the first key; detect one or more contacts on the touch-sensitive surface unit at a second time after the first time, the one or more contacts corresponding to a keyboard selection gesture; and in response to detecting the one or more contacts that correspond to the keyboard selection gesture at the second time after the first time: replace the first keyboard with a second keyboard on the display unit when the second time exceeds a predefined period of time after the first time; and maintain display of the first keyboard on the display unit when the second time is less than the predefined period of time after the first time.

[0058] In accordance with some examples, an electronic device includes a display unit configured to concurrently display a first text entry area and an integrated input area, the integrated input area including a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion; a touch-sensitive surface unit configured to receive user contacts and movements of the user contacts; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to detect a first contact on the touch-sensitive surface unit at a location that corresponds to the center portion of the integrated input area; detect movement of the first contact along the touch-sensitive surface unit; in response to detecting the movement of the first contact along the touch-sensitive surface unit, move the integrated input area on the display unit in accordance with the movement of the first contact when the movement of the first contact exceeds a first movement threshold; detect a second contact, distinct from the first contact, on the touch-sensitive surface unit at a location that corresponds to the split keyboard; detect movement of the second contact along the touch-sensitive surface unit; and, in response to detecting the movement of the second contact along the touch-sensitive surface unit, move the integrated input area on the display unit in accordance with the movement of the second contact when the movement of the second contact exceeds a second movement threshold, the second movement threshold being greater than the first movement threshold.

[0059] In accordance with some examples, an electronic device includes a display unit configured to concurrently display an application content area with a first size, and an input area with a keyboard, the input area being adjacent to and separate from the application content area with the first size, the input area being at a bottom of the display unit; a touch-sensitive surface unit configured to receive user gestures; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to detect a gesture on the touch-sensitive surface unit; and, in response to detecting the gesture on the touch-sensitive surface unit: move the input area away from the bottom of the display unit over the application content area, and increase a size of the application content area to a second size larger than the first size.
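The undocking behavior in [0059] (move the input area away from the bottom of the display, and grow the application content area from a first size to a larger second size) can be sketched geometrically. All numbers and names below are illustrative assumptions; the specification gives no dimensions.

```python
# Geometry sketch of [0059]. Display and input-area heights are hypothetical.

DISPLAY_HEIGHT = 1024     # assumed display height in points
INPUT_AREA_HEIGHT = 300   # assumed height of the input area with the keyboard

def undock_input_area(gesture_y: int):
    """Return (content_area_height, input_area_top) after the gesture.

    While docked at the bottom, the content area excludes the keyboard
    (the first size). Once the input area floats over the content at
    gesture_y, the content area can use the full display height (the
    larger second size).
    """
    docked_content_height = DISPLAY_HEIGHT - INPUT_AREA_HEIGHT  # first size
    undocked_content_height = DISPLAY_HEIGHT                    # second size
    assert undocked_content_height > docked_content_height
    return undocked_content_height, gesture_y
```

The key relationship is simply that the second size exceeds the first: floating the keyboard frees the strip of display it previously occupied.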
[0060] In accordance with some examples, an electronic device includes a display unit configured to concurrently display a text entry area, a left side of a split keyboard, and a right side of a split keyboard, the left side of the split keyboard including a plurality of rows of keys and the right side of the split keyboard including a corresponding plurality of rows of keys; a touch-sensitive surface unit configured to receive user gestures; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to detect a gesture at a location on the touch-sensitive surface unit that corresponds to a predefined area adjacent to and to the right of a rightmost key in a respective row of the left side of the split keyboard; and, in response to detecting the gesture at the location on the touch-sensitive surface unit that corresponds to the predefined area adjacent to and to the right of the rightmost key in the respective row of the left side of the split keyboard, enter in the text entry area a character that corresponds to a leftmost key in a corresponding respective row of the right side of the split keyboard.

[0061] In accordance with some examples, an electronic device includes a display unit configured to concurrently display a first text entry area and an integrated input area, the integrated input area including a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion with a second text entry area, the center portion in between the left portion and the right portion; a touch-sensitive surface unit configured to receive user gestures; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect a gesture at a location on the touch-sensitive surface unit that corresponds to a location of a character key in the split keyboard; and, in response to detecting the gesture at the location on the touch-sensitive surface unit that corresponds to the location of the character key in the split keyboard, input and enable concurrent display of the corresponding character in the first text entry area and the second text entry area on the display unit.

[0062] In accordance with some examples, an electronic device includes a display unit configured to concurrently display an application content area that includes one or more text entry areas, and an input area with a keyboard that is displayed over the application content area; a touch-sensitive surface unit configured to receive user gestures; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect a drag gesture on the touch-sensitive surface unit at a location that corresponds to the input area on the display unit; in response to detecting the drag gesture, move the input area on the display unit in accordance with the drag gesture; detect a flick gesture on the touch-sensitive surface unit at a location that corresponds to the input area on the display unit; and, in response to detecting the flick gesture, move the input area on the display unit with inertia in accordance with the flick gesture such that the input area comes to rest at a location adjacent to and just below a text entry area in the application content area.

[0063] In accordance with some examples, an electronic device includes a display unit configured to concurrently display a first text entry area, and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion; a touch-sensitive surface unit configured to receive user inputs; a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect a first input on the touch-sensitive surface unit; in response to detecting the first input, enter a reconfiguration mode for the integrated input area; while in the reconfiguration mode for the integrated input area: detect a second input by a first thumb and/or a second thumb; in response to detecting the second input, adjust the size of at least one of the left side and the right side of the split keyboard in the integrated input area; and detect a third input; and, in response to detecting the third input, exit the reconfiguration mode for the integrated input area.

[0064] In accordance with some examples, a method is performed at an electronic device with a display and a touch-sensitive surface. The method includes: concurrently displaying on the display an application content area and an unsplit keyboard, the unsplit keyboard being located at a bottom of the display; detecting a first gesture on the touch-sensitive surface; and, in response to detecting the first gesture on the touch-sensitive surface: converting the unsplit keyboard into a split keyboard and moving the split keyboard away from the bottom of the display over the application content area in accordance with the first gesture.

[0065] In accordance with some examples, an electronic device includes a display, a touch-sensitive surface, one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and configured to be executed by the one or more processors. The one or more programs include instructions for: concurrently displaying on the display an application content area and an unsplit keyboard, the unsplit keyboard being located at a bottom of the display; detecting a first gesture on the touch-sensitive surface; and, in response to detecting the first gesture on the touch-sensitive surface: converting the unsplit keyboard into a split keyboard and moving the split keyboard away from the bottom of the display over the application content area in accordance with the first gesture.
12
sensitive surface: convert the unsplit keyboard into a split keyboard and move the split keyboard away from the bottom of the display over the application content area in accordance with the first gesture.

[0067] In accordance with some examples, a graphical user interface on an electronic device with a display, a touch-sensitive surface, a memory, and one or more processors to execute one or more programs stored in the memory includes an application content area and an unsplit keyboard. The unsplit keyboard is located at a bottom of the display. A first gesture is detected on the touch-sensitive surface. In response to detecting the first gesture on the touch-sensitive surface, the unsplit keyboard is converted into a split keyboard and the split keyboard is moved away from the bottom of the display over the application content area in accordance with the first gesture.

[0068] In accordance with some examples, an electronic device includes: a display; a touch-sensitive surface; means for concurrently displaying on the display an application content area and an unsplit keyboard, the unsplit keyboard being located at a bottom of the display; means for detecting a first gesture on the touch-sensitive surface; and, means, enabled in response to detecting the first gesture on the touch-sensitive surface, including: means for converting the unsplit keyboard into a split keyboard and means for moving the split keyboard away from the bottom of the display over the application content area in accordance with the first gesture.

[0069] In accordance with some examples, an information processing apparatus for use in an electronic device with a display and a touch-sensitive surface includes: means for concurrently displaying on the display an application content area and an unsplit keyboard, the unsplit keyboard being located at a bottom of the display; means for detecting a first gesture on the touch-sensitive surface; and, means, enabled in response to detecting the first gesture on the touch-sensitive surface, including: means for converting the unsplit keyboard into a split keyboard, and means for moving the split keyboard away from the bottom of the display over the application content area in accordance with the first gesture.

[0070] In accordance with some examples, an electronic device includes a display unit configured to display concurrently an application content area and an unsplit keyboard, the unsplit keyboard being located at a bottom of the display unit; a touch-sensitive surface unit configured to receive gestures; and a processing unit coupled to the display unit and the touch-sensitive surface unit. The processing unit is configured to: detect a first gesture on the touch-sensitive surface unit; in response to detecting the first gesture on the touch-sensitive surface unit: convert the unsplit keyboard into a split keyboard, and move the split keyboard away from the bottom of the display unit over the application content area in accordance with the first gesture.

[0071] Thus, electronic devices with displays and touch-sensitive surfaces are provided with faster, more efficient methods and interfaces for manipulating soft keyboards, thereby increasing the effectiveness, efficiency, and user satisfaction with such devices. Such methods and interfaces may complement or replace conventional methods for manipulating soft keyboards.

BRIEF DESCRIPTION OF THE DRAWINGS

[0072] For a better understanding of the aforementioned embodiments of the invention as well as additional embodiments thereof, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.

Figure 1A is a block diagram illustrating a portable multifunction device with a touch-sensitive display in accordance with some embodiments.

Figure 1B is a block diagram illustrating exemplary components for event handling in accordance with some embodiments.

Figure 2 illustrates a portable multifunction device having a touch screen in accordance with some embodiments.

Figure 3 is a block diagram of an exemplary multifunction device with a display and a touch-sensitive surface in accordance with some embodiments.

Figure 4A illustrates an exemplary user interface for a menu of applications on a portable multifunction device in accordance with some embodiments.

Figure 4B illustrates an exemplary user interface for a multifunction device with a touch-sensitive surface that is separate from the display in accordance with some embodiments.

Figures 5A-5TTT illustrate exemplary user interfaces for manipulating soft keyboards in accordance with some embodiments.

Figures 6A-6B are flow diagrams illustrating a method of replacing an unsplit keyboard with an integrated input area in accordance with some embodiments.

Figures 7A-7B are flow diagrams illustrating a method of responding to a keyboard selection gesture in accordance with some embodiments.

Figures 8A-8B are flow diagrams illustrating a method of moving an integrated input area in accordance with some embodiments.
Figure 9 is a flow diagram illustrating a method of moving an input area and adjusting the size of an application content area in accordance with some embodiments.

Figures 10A-10B are flow diagrams illustrating a method of entering characters with a split soft keyboard in accordance with some embodiments.

Figures 11A-11D are flow diagrams illustrating a method of using a center portion of an integrated input area in accordance with some embodiments.

Figures 12A-12B are flow diagrams illustrating a method of moving an input area that includes a keyboard over an application content area in accordance with some embodiments.

Figures 13A-13B are flow diagrams illustrating a method of reconfiguring an integrated input area in accordance with some embodiments.

Figure 14 is a flow diagram illustrating a method of automatically converting between an unsplit keyboard and a split keyboard in accordance with some embodiments.

Figure 15 is a functional block diagram of an electronic device in accordance with some embodiments.

Figure 16 is a functional block diagram of an electronic device in accordance with some embodiments.

Figure 17 is a functional block diagram of an electronic device in accordance with some embodiments.

Figure 18 is a functional block diagram of an electronic device in accordance with some embodiments.

Figure 19 is a functional block diagram of an electronic device in accordance with some embodiments.

Figure 20 is a functional block diagram of an electronic device in accordance with some embodiments.

Figure 21 is a functional block diagram of an electronic device in accordance with some embodiments.

Figure 22 is a functional block diagram of an electronic device in accordance with some embodiments.

Figure 23 is a functional block diagram of an electronic device in accordance with some embodiments.

DESCRIPTION OF EMBODIMENTS

[0073] The claims of the present application are illustrated by Figs. 5H-5K and 7A-7B, and corresponding passages of the description such as paragraphs [00200]-[00201] and [00286]-[00301]. Other portions of the specification, however, provide important context, references, and explanations of related embodiments and the devices and systems by which the invention may be implemented.

[0074] Many electronic devices have graphical user interfaces with soft keyboards for character entry. On a relatively large portable device, such as a tablet computer, typing on an unsplit soft keyboard may be fine in certain situations, such as when the computer is resting on a solid surface, but problematic in other situations. For example, unsplit keyboards are not convenient for typing when both hands are holding onto the device. Split soft keyboards may be better in these situations. But the use of split keyboards for two-thumb typing when both hands are holding onto the device raises new issues that have not been recognized and/or properly addressed, such as:

• Easily converting between an unsplit keyboard and an integrated input area that includes a split keyboard;

• Preventing accidentally changing soft keyboards while typing (e.g., from an unsplit keyboard to an integrated input area with a split keyboard, or vice versa);

• Moving an integrated input area when desired, but preventing accidental movement of the integrated input area when a user contact moves during typing with the split keyboard;

• Moving an input area and adjusting the size of an application content area to display more of the application;

• More efficiently entering characters during two-thumb typing with a split soft keyboard;

• Using the center portion of an integrated input area to make character entry faster and more efficient during two-thumb typing;

• Moving an input area that includes a keyboard over an application content area so that the input area is just below a text entry area in the application;

• Easily customizing a split keyboard in an integrated
input area to the size of the user’s thumbs; and

• Automatically converting between an unsplit keyboard and a split keyboard.

[0075] The embodiments described below address these issues and related issues.

[0076] Below, Figures 1A-1B, 2, 3, and 15-23 provide a description of exemplary devices. Figures 4A-4B and 5A-5TTT illustrate exemplary user interfaces for manipulating soft keyboards. Figures 6A-6B, 7A-7B, 8A-8B, 9, 10A-10B, 11A-11D, 12A-12B, 13A-13B, and 14 are flow diagrams illustrating methods of manipulating soft keyboards. The user interfaces in Figures 5A-5TTT are used to illustrate the processes in Figures 6A-6B, 7A-7B, 8A-8B, 9, 10A-10B, 11A-11D, 12A-12B, 13A-13B, and 14.

EXEMPLARY DEVICES

[0077] Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.

[0078] It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the scope of the present invention. The first contact and the second contact are both contacts, but they are not the same contact.

[0079] The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "includes," "including," "comprises," and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

[0080] As used herein, the term "if" may be construed to mean "when" or "upon" or "in response to determining" or "in response to detecting," depending on the context. Similarly, the phrase "if it is determined" or "if [a stated condition or event] is detected" may be construed to mean "upon determining" or "in response to determining" or "upon detecting [the stated condition or event]" or "in response to detecting [the stated condition or event]," depending on the context.

[0081] Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Exemplary embodiments of portable multifunction devices include, without limitation, the iPhone®, iPod Touch®, and iPad® devices from Apple Inc. of Cupertino, California. Other portable electronic devices, such as laptops or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touch pads), may also be used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touch pad).

[0082] In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device may include one or more other physical user-interface devices, such as a physical keyboard, a mouse and/or a joystick.

[0083] The device typically supports a variety of applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.

[0084] The various applications that may be executed on the device may use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device may be adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device may support the variety of applications with user interfaces that are intuitive and transparent to the user.

[0085] Attention is now directed toward embodiments of portable devices with touch-sensitive displays. Figure 1A is a block diagram illustrating portable multifunction device 100 with touch-sensitive displays 112 in accordance with some embodiments. Touch-sensitive display
112 is sometimes called a "touch screen" for convenience, and may also be known as or called a touch-sensitive display system. Device 100 may include memory 102 (which may include one or more computer readable storage mediums), memory controller 122, one or more processing units (CPU’s) 120, peripherals interface 118, RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, input/output (I/O) subsystem 106, other input or control devices 116, and external port 124. Device 100 may include one or more optical sensors 164. These components may communicate over one or more communication buses or signal lines 103.

[0086] It should be appreciated that device 100 is only one example of a portable multifunction device, and that device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components. The various components shown in Figure 1A may be implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.

[0087] Memory 102 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of device 100, such as CPU 120 and the peripherals interface 118, may be controlled by memory controller 122.

[0088] Peripherals interface 118 can be used to couple input and output peripherals of the device to CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for device 100 and to process data.

[0089] In some embodiments, peripherals interface 118, CPU 120, and memory controller 122 may be implemented on a single chip, such as chip 104. In some other embodiments, they may be implemented on separate chips.

[0090] RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 108 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 108 may communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication may use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.

[0091] Audio circuitry 110, speaker 111, and microphone 113 provide an audio interface between a user and device 100. Audio circuitry 110 receives audio data from peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 111. Speaker 111 converts the electrical signal to human-audible sound waves. Audio circuitry 110 also receives electrical signals converted by microphone 113 from sound waves. Audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to peripherals interface 118 for processing. Audio data may be retrieved from and/or transmitted to memory 102 and/or RF circuitry 108 by peripherals interface 118. In some embodiments, audio circuitry 110 also includes a headset jack (e.g., 212, Figure 2). The headset jack provides an interface between audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).

[0092] I/O subsystem 106 couples input/output peripherals on device 100, such as touch screen 112 and other input control devices 116, to peripherals interface 118. I/O subsystem 106 may include display controller 156 and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input control devices 116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 may be coupled to any (or none) of the following: a keyboard, infrared port, USB port, and a pointer device such as a mouse. The one or more buttons (e.g., 208, Figure 2) may include an up/down button for volume control of speaker 111 and/or microphone 113. The one or more buttons may include a push button (e.g.,

[...]
tions (or sets of instructions) 136. Furthermore, in some embodiments memory 102 stores device/global internal state 157, as shown in Figures 1A and 3. Device/global internal state 157 includes one or more of: active application state, indicating which applications, if any, are currently active; display state, indicating what applications, views or other information occupy various regions of touch screen display 112; sensor state, including information obtained from the device’s various sensors and input control devices 116; and location information concerning the device’s location and/or attitude.

[0103] Operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.

[0104] Communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by RF circuitry 108 and/or external port 124. External port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with the 30-pin connector used on iPod (trademark of Apple Inc.) devices.

[0105] Contact/motion module 130 may detect contact with touch screen 112 (in conjunction with display controller 156) and other touch sensitive devices (e.g., a touchpad or physical click wheel). Contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred (e.g., detecting a finger-down event), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Contact/motion module 130 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, may include determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., "multi-touch"/multiple finger contacts). In some embodiments, contact/motion module 130 and display controller 156 detect contact on a touchpad.

[0106] Contact/motion module 130 may detect a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns. Thus, a gesture may be detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (lift off) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (lift off) event.

[0107] Graphics module 132 includes various known software components for rendering and displaying graphics on touch screen 112 or other display, including components for changing the intensity of graphics that are displayed. As used herein, the term "graphics" includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.

[0108] In some embodiments, graphics module 132 stores data representing graphics to be used. Each graphic may be assigned a corresponding code. Graphics module 132 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 156.

[0109] Text input module 134, which may be a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, browser 147, and any other application that needs text input).

[0110] GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing, to camera 143 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).

[0111] Applications 136 may include the following modules (or sets of instructions), or a subset or superset thereof:

• contacts module 137 (sometimes called an address book or contact list);

• telephone module 138;

• video conferencing module 139;

• e-mail client module 140;

• instant messaging (IM) module 141;

• workout support module 142;

• camera module 143 for still and/or video images;

[...]
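The tap/swipe classification in paragraph [0106], and the derivation of velocity from a series of contact data in paragraph [0105], can be illustrated with a minimal sketch. The patent does not specify an implementation; the event tuple format, function names, and the tap-tolerance parameter below are invented for illustration only.

```python
# Hedged sketch of the gesture detection described in paragraphs
# [0105]-[0106]: a tap is a finger-down event followed by a finger-up
# event at (substantially) the same position; a swipe additionally
# includes one or more finger-dragging events in between. All names
# and data shapes here are hypothetical, not from the patent.

def classify_gesture(events, tap_tolerance=10):
    """events: time-ordered (kind, x, y, t_ms) tuples,
    kind in {"down", "drag", "up"}."""
    kinds = [e[0] for e in events]
    if kinds[0] != "down" or kinds[-1] != "up":
        return "unknown"
    down, up = events[0], events[-1]
    dx, dy = up[1] - down[1], up[2] - down[2]
    # Tap: no dragging, and lift-off substantially at the touch-down point.
    if "drag" not in kinds and abs(dx) <= tap_tolerance and abs(dy) <= tap_tolerance:
        return "tap"
    if "drag" in kinds:
        return "swipe"
    return "unknown"

def velocity(events):
    """Velocity (magnitude and direction) of the point of contact,
    per paragraph [0105], from the first to the last contact sample."""
    (x0, y0, t0), (x1, y1, t1) = events[0][1:], events[-1][1:]
    dt = (t1 - t0) or 1  # avoid division by zero
    return ((x1 - x0) / dt, (y1 - y0) / dt)

tap = [("down", 100, 200, 0), ("up", 102, 201, 80)]
swipe = [("down", 100, 200, 0), ("drag", 160, 200, 50), ("up", 300, 200, 100)]
print(classify_gesture(tap))    # tap
print(classify_gesture(swipe))  # swipe
print(velocity(swipe))          # (2.0, 0.0) pixels per millisecond
```

The two-event test distinguishes the patterns exactly as the text describes: the presence of finger-dragging events between down and up is what separates a swipe from a tap, and the velocity pairs magnitude with direction as a signed component vector.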
into memory 102, modify characteristics of a still image or video, or delete a still image or video from memory 102.

[0120] In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, and camera module 143, image management module 144 includes executable instructions to arrange, modify (e.g., edit), or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.

[0121] In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, browser module 147 includes executable instructions to browse the Internet in accordance with user instructions, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.

[0122] In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, e-mail client module 140, and browser module 147, calendar module 148 includes executable instructions to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to do lists, etc.) in accordance with user instructions.

[0123] In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, widget modules 149 are mini-applications that may be downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user-created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).

[0124] In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 may be used by a user to create widgets (e.g., turning a user-specified portion of a web page into a widget).

[0125] In conjunction with touch screen 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, search module

[...]

the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files, and executable instructions to display, present or otherwise play back videos (e.g., on touch screen 112 or on an external, connected display via external port 124). In some embodiments, device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Inc.).

[0127] In conjunction with touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, notes module 153 includes executable instructions to create and manage notes, to do lists, and the like in accordance with user instructions.

[0128] In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, map module 154 may be used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data) in accordance with user instructions.

[0129] In conjunction with touch screen 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, online video module 155 includes instructions that allow the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen or on an external, connected display via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video.

[0130] Each of the above identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various embodiments. In some embodiments, memory 102 may store a subset
151 includes executable instructions to search for text, of the modules and data structures identified above. Fur-
music, sound, image, video, and/or other files in memory 50 thermore, memory 102 may store additional modules and
102 that match one or more search criteria (e.g., one or data structures not described above.
more user-specified search terms) in accordance with [0131] In some embodiments, device 100 is a device
user instructions. where operation of a predefined set of functions on the
[0126] In conjunction with touch screen 112, display device is performed exclusively through a touch screen
system controller 156, contact module 130, graphics 55 and/or a touchpad. By using a touch screen and/or a
module 132, audio circuitry 110, speaker 111, RF circuit- touchpad as the primary input control device for operation
ry 108, and browser module 147, video and music player of device 100, the number of physical input control de-
module 152 includes executable instructions that allow vices (such as push buttons, dials, and the like) on device
20
39 EP 2 638 460 B1 40
[0132] The predefined set of functions that may be performed exclusively through a touch screen and/or a touchpad include navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates device 100 to a main, home, or root menu from any user interface that may be displayed on device 100. In such embodiments, the touchpad may be referred to as a "menu button." In some other embodiments, the menu button may be a physical push button or other physical input control device instead of a touchpad.

[0133] Figure 1B is a block diagram illustrating exemplary components for event handling in accordance with some embodiments. In some embodiments, memory 102 (in Figure 1A) or 370 (Figure 3) includes event sorter 170 (e.g., in operating system 126) and a respective application 136-1 (e.g., any of the aforementioned applications 137-151, 155, 380-390).

[0134] Event sorter 170 receives event information and determines the application 136-1 and application view 191 of application 136-1 to which to deliver the event information. Event sorter 170 includes event monitor 171 and event dispatcher module 174. In some embodiments, application 136-1 includes application internal state 192, which indicates the current application view(s) displayed on touch-sensitive display 112 when the application is active or executing. In some embodiments, device/global internal state 157 is used by event sorter 170 to determine which application(s) is (are) currently active, and application internal state 192 is used by event sorter 170 to determine application views 191 to which to deliver event information.

[0135] In some embodiments, application internal state 192 includes additional information, such as one or more of: resume information to be used when application 136-1 resumes execution, user interface state information that indicates information being displayed or that is ready for display by application 136-1, a state queue for enabling the user to go back to a prior state or view of application 136-1, and a redo/undo queue of previous actions taken by the user.

[0136] Event monitor 171 receives event information from peripherals interface 118. Event information includes information about a sub-event (e.g., a user touch on touch-sensitive display 112, as part of a multi-touch gesture). Peripherals interface 118 transmits information it receives from I/O subsystem 106 or a sensor, such as proximity sensor 166, accelerometer(s) 168, and/or microphone 113 (through audio circuitry 110). Information that peripherals interface 118 receives from I/O subsystem 106 includes information from touch-sensitive display 112 or a touch-sensitive surface.

[0137] In some embodiments, event monitor 171 sends requests to the peripherals interface 118 at predetermined intervals. In response, peripherals interface 118 transmits event information. In other embodiments, peripherals interface 118 transmits event information only when there is a significant event (e.g., receiving an input above a predetermined noise threshold and/or for more than a predetermined duration).

[0138] In some embodiments, event sorter 170 also includes a hit view determination module 172 and/or an active event recognizer determination module 173.

[0139] Hit view determination module 172 provides software procedures for determining where a sub-event has taken place within one or more views, when touch-sensitive display 112 displays more than one view. Views are made up of controls and other elements that a user can see on the display.

[0140] Another aspect of the user interface associated with an application is a set of views, sometimes herein called application views or user interface windows, in which information is displayed and touch-based gestures occur. The application views (of a respective application) in which a touch is detected may correspond to programmatic levels within a programmatic or view hierarchy of the application. For example, the lowest level view in which a touch is detected may be called the hit view, and the set of events that are recognized as proper inputs may be determined based, at least in part, on the hit view of the initial touch that begins a touch-based gesture.

[0141] Hit view determination module 172 receives information related to sub-events of a touch-based gesture. When an application has multiple views organized in a hierarchy, hit view determination module 172 identifies a hit view as the lowest view in the hierarchy which should handle the sub-event. In most circumstances, the hit view is the lowest level view in which an initiating sub-event occurs (i.e., the first sub-event in the sequence of sub-events that form an event or potential event). Once the hit view is identified by the hit view determination module, the hit view typically receives all sub-events related to the same touch or input source for which it was identified as the hit view.

[0142] Active event recognizer determination module 173 determines which view or views within a view hierarchy should receive a particular sequence of sub-events. In some embodiments, active event recognizer determination module 173 determines that only the hit view should receive a particular sequence of sub-events. In other embodiments, active event recognizer determination module 173 determines that all views that include the physical location of a sub-event are actively involved views, and therefore determines that all actively involved views should receive a particular sequence of sub-events. In other embodiments, even if touch sub-events were entirely confined to the area associated with one particular view, views higher in the hierarchy would still remain as actively involved views.

[0143] Event dispatcher module 174 dispatches the event information to an event recognizer (e.g., event recognizer 180). In embodiments including active event recognizer determination module 173, event dispatcher module 174 delivers the event information to an event recognizer determined by active event recognizer determination module 173. In some embodiments, event dispatcher module 174 stores in an event queue the event information, which is retrieved by a respective event receiver module 182.
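The hit view search performed by hit view determination module 172 can be sketched as a depth-first walk over a view hierarchy: the lowest view whose bounds contain the initiating sub-event's location handles it. The `View` class, its rectangular `frame` geometry, and the function name below are illustrative assumptions, not structures from the patent:

```python
class View:
    """Illustrative view node: a rectangle (x, y, width, height) plus subviews."""
    def __init__(self, name, frame, subviews=()):
        self.name = name
        self.frame = frame
        self.subviews = list(subviews)

    def contains(self, point):
        x, y, w, h = self.frame
        px, py = point
        return x <= px < x + w and y <= py < y + h

def hit_view(view, point):
    """Return the lowest view in the hierarchy containing the point,
    i.e., the view that should handle the initiating sub-event."""
    if not view.contains(point):
        return None
    for sub in view.subviews:  # depth-first: prefer a deeper match
        found = hit_view(sub, point)
        if found is not None:
            return found
    return view  # no subview matched; this view is the hit view

keyboard = View("keyboard", (0, 500, 320, 180),
                [View("key_T", (50, 520, 30, 40))])
root = View("root", (0, 0, 320, 680), [keyboard])
print(hit_view(root, (60, 530)).name)  # -> key_T
```

Per [0141], once identified, this view would then receive all subsequent sub-events of the same touch.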
[0144] In some embodiments, operating system 126 includes event sorter 170. Alternatively, application 136-1 includes event sorter 170. In yet other embodiments, event sorter 170 is a stand-alone module, or a part of another module stored in memory 102, such as contact/motion module 130.

[0145] In some embodiments, application 136-1 includes a plurality of event handlers 190 and one or more application views 191, each of which includes instructions for handling touch events that occur within a respective view of the application's user interface. Each application view 191 of the application 136-1 includes one or more event recognizers 180. Typically, a respective application view 191 includes a plurality of event recognizers 180. In other embodiments, one or more of event recognizers 180 are part of a separate module, such as a user interface kit (not shown) or a higher level object from which application 136-1 inherits methods and other properties. In some embodiments, a respective event handler 190 includes one or more of: data updater 176, object updater 177, GUI updater 178, and/or event data 179 received from event sorter 170. Event handler 190 may utilize or call data updater 176, object updater 177 or GUI updater 178 to update the application internal state 192. Alternatively, one or more of the application views 191 includes one or more respective event handlers 190. Also, in some embodiments, one or more of data updater 176, object updater 177, and GUI updater 178 are included in a respective application view 191.

[0146] A respective event recognizer 180 receives event information (e.g., event data 179) from event sorter 170, and identifies an event from the event information. Event recognizer 180 includes event receiver 182 and event comparator 184. In some embodiments, event recognizer 180 also includes at least a subset of: metadata 183, and event delivery instructions 188 (which may include sub-event delivery instructions).

[0147] Event receiver 182 receives event information from event sorter 170. The event information includes information about a sub-event, for example, a touch or a touch movement. Depending on the sub-event, the event information also includes additional information, such as location of the sub-event. When the sub-event concerns motion of a touch, the event information may also include speed and direction of the sub-event. In some embodiments, events include rotation of the device from one orientation to another (e.g., from a portrait orientation to a landscape orientation, or vice versa), and the event information includes corresponding information about the current orientation (also called device attitude) of the device.

[0148] Event comparator 184 compares the event information to predefined event or sub-event definitions and, based on the comparison, determines an event or sub-event, or determines or updates the state of an event or sub-event. In some embodiments, event comparator 184 includes event definitions 186. Event definitions 186 contain definitions of events (e.g., predefined sequences of sub-events), for example, event 1 (187-1), event 2 (187-2), and others. In some embodiments, sub-events in an event 187 include, for example, touch begin, touch end, touch movement, touch cancellation, and multiple touching. In one example, the definition for event 1 (187-1) is a double tap on a displayed object. The double tap, for example, comprises a first touch (touch begin) on the displayed object for a predetermined phase, a first lift-off (touch end) for a predetermined phase, a second touch (touch begin) on the displayed object for a predetermined phase, and a second lift-off (touch end) for a predetermined phase. In another example, the definition for event 2 (187-2) is a dragging on a displayed object. The dragging, for example, comprises a touch (or contact) on the displayed object for a predetermined phase, a movement of the touch across touch-sensitive display 112, and lift-off of the touch (touch end). In some embodiments, the event also includes information for one or more associated event handlers 190.

[0149] In some embodiments, event definition 187 includes a definition of an event for a respective user-interface object. In some embodiments, event comparator 184 performs a hit test to determine which user-interface object is associated with a sub-event. For example, in an application view in which three user-interface objects are displayed on touch-sensitive display 112, when a touch is detected on touch-sensitive display 112, event comparator 184 performs a hit test to determine which of the three user-interface objects is associated with the touch (sub-event). If each displayed object is associated with a respective event handler 190, the event comparator uses the result of the hit test to determine which event handler 190 should be activated. For example, event comparator 184 selects an event handler associated with the sub-event and the object triggering the hit test.

[0150] In some embodiments, the definition for a respective event 187 also includes delayed actions that delay delivery of the event information until after it has been determined whether the sequence of sub-events does or does not correspond to the event recognizer's event type.

[0151] When a respective event recognizer 180 determines that the series of sub-events do not match any of the events in event definitions 186, the respective event recognizer 180 enters an event impossible, event failed, or event ended state, after which it disregards subsequent sub-events of the touch-based gesture. In this situation, other event recognizers, if any, that remain active for the hit view continue to track and process sub-events of an ongoing touch-based gesture.

[0152] In some embodiments, a respective event recognizer 180 includes metadata 183 with configurable properties, flags, and/or lists that indicate how the event delivery system should perform sub-event delivery to actively involved event recognizers. In some embodiments,
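An event definition such as the double tap in event definitions 186 can be modeled as a predefined sequence of sub-events, with the recognizer tracking a state much like the failed and recognized states described above: it stays "possible" while sub-events match, becomes "recognized" when the full sequence matches, and becomes "failed" on any mismatch, after which further sub-events are disregarded. The sub-event names, the `Recognizer` class, and the omission of the per-phase timing are simplifying assumptions:

```python
# Double tap as a sub-event sequence, in the spirit of event 1 (187-1).
DOUBLE_TAP = ["touch_begin", "touch_end", "touch_begin", "touch_end"]

class Recognizer:
    def __init__(self, definition):
        self.definition = definition
        self.index = 0
        self.state = "possible"

    def feed(self, sub_event):
        """Compare one incoming sub-event against the definition."""
        if self.state != "possible":
            return self.state  # disregard subsequent sub-events
        if sub_event == self.definition[self.index]:
            self.index += 1
            if self.index == len(self.definition):
                self.state = "recognized"
        else:
            self.state = "failed"
        return self.state

r = Recognizer(DOUBLE_TAP)
for e in ["touch_begin", "touch_end", "touch_begin", "touch_end"]:
    r.feed(e)
print(r.state)  # -> recognized
```

A dragging definition like event 2 (187-2) would simply use a different sub-event sequence, e.g. touch begin, touch movement, touch end.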
that interconnects and controls communications between system components. Device 300 includes input/output (I/O) interface 330 comprising display 340, which is typically a touch screen display. I/O interface 330 also may include a keyboard and/or mouse (or other pointing device) 350 and touchpad 355. Memory 370 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices; and may include non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Memory 370 may optionally include one or more storage devices remotely located from CPU(s) 310. In some embodiments, memory 370 stores programs, modules, and data structures analogous to the programs, modules, and data structures stored in memory 102 of portable multifunction device 100 (Figure 1), or a subset thereof. Furthermore, memory 370 may store additional programs, modules, and data structures not present in memory 102 of portable multifunction device 100. For example, memory 370 of device 300 may store drawing module 380, presentation module 382, word processing module 384, website creation module 386, disk authoring module 388, and/or spreadsheet module 390, while memory 102 of portable multifunction device 100 (Figure 1) may not store these modules.

[0162] Each of the above identified elements in Figure 3 may be stored in one or more of the previously mentioned memory devices. Each of the above identified modules corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various embodiments. In some embodiments, memory 370 may store a subset of the modules and data structures identified above. Furthermore, memory 370 may store additional modules and data structures not described above.

[0163] Attention is now directed towards embodiments of user interfaces ("UI") that may be implemented on portable multifunction device 100.

[0164] Figure 4A illustrates an exemplary user interface for a menu of applications on portable multifunction device 100 in accordance with some embodiments. Similar user interfaces may be implemented on device 300. In some embodiments, user interface 400 includes the following elements, or a subset or superset thereof:

• Signal strength indicator(s) 402 for wireless communication(s), such as cellular and Wi-Fi signals;

• Time 404;

• Bluetooth indicator 405;

• Battery status indicator 406;

• Tray 408 with icons for frequently used applications, such as:

  + Phone 138, which may include an indicator 414 of the number of missed calls or voicemail messages;

  + E-mail client 140, which may include an indicator 410 of the number of unread e-mails;

  + Browser 147; and

  + Video and music player 152, also referred to as iPod (trademark of Apple Inc.) module 152; and

• Icons for other applications, such as:

  + IM 141;

  + Image management 144;

  + Camera 143;

  + Weather 149-1;

  + Stocks 149-2;

  + Workout support 142;

  + Calendar 148;

  + Alarm clock 149-4;

  + Map 154;

  + Notes 153;

  + Settings 412, which provides access to settings for device 100 and its various applications 136; and

  + Online video module 155, also referred to as YouTube (trademark of Google Inc.) module 155.

[0165] Figure 4B illustrates an exemplary user interface on a device (e.g., device 300, Figure 3) with a touch-sensitive surface 451 (e.g., a tablet or touchpad 355, Figure 3) that is separate from the display 450 (e.g., touch screen display 112). Although many of the examples which follow will be given with reference to inputs on touch screen display 112 (where the touch-sensitive surface and the display are combined), in some embodiments, the device detects inputs on a touch-sensitive surface that is separate from the display, as shown in Figure 4B.
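The correspondence between locations on a separate touch-sensitive surface and locations on the display can be sketched as a proportional mapping along each primary axis. The patent only states that surface locations correspond to display locations; the linear scaling and the function name here are assumptions:

```python
def map_to_display(point, surface_size, display_size):
    """Map a contact on a separate touch-sensitive surface to the
    corresponding display location by scaling along each axis."""
    sx, sy = surface_size
    dx, dy = display_size
    px, py = point
    return (px * dx / sx, py * dy / sy)

# A contact at the center of a 400x300 touchpad maps to the center
# of a 1024x768 display.
print(map_to_display((200, 150), (400, 300), (1024, 768)))  # -> (512.0, 384.0)
```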
In some embodiments the touch-sensitive surface (e.g., 451 in Figure 4B) has a primary axis (e.g., 452 in Figure 4B) that corresponds to a primary axis (e.g., 453 in Figure 4B) on the display (e.g., 450). In accordance with these embodiments, the device detects contacts (e.g., 460 and 462 in Figure 4B) with the touch-sensitive surface 451 at locations that correspond to respective locations on the display (e.g., in Figure 4B, 460 corresponds to 468 and 462 corresponds to 470). In this way, user inputs (e.g., contacts 460 and 462, and movements thereof) detected by the device on the touch-sensitive surface (e.g., 451 in Figure 4B) are used by the device to manipulate the user interface on the display (e.g., 450 in Figure 4B) of the multifunction device when the touch-sensitive surface is separate from the display. It should be understood that similar methods may be used for other user interfaces described herein.

USER INTERFACES AND ASSOCIATED PROCESSES

[0166] Attention is now directed towards embodiments of user interfaces ("UI") and associated processes that may be implemented on an electronic device with a display and a touch-sensitive surface, such as device 300 or portable multifunction device 100.

[0167] Figures 5A-5TTT illustrate exemplary user interfaces for manipulating soft keyboards in accordance with some embodiments. The user interfaces in these figures are used to illustrate the processes described below, including the processes in Figures 6A-6B, 7A-7B, 8A-8B, 9, 10A-10B, 11A-11D, 12A-12B, 13A-13B, and 14.

[0168] Figure 5A depicts user interface (UI) 5000A displayed on touch-sensitive display 112 of a device (e.g., device 100). UI 5000A may be a user interface in an application (e.g., a notes application, a web browser application, etc.) on device 100. UI 5000A includes text entry area 5002. Cursor 5004 and input text 5006 are displayed in text entry area 5002.

[0169] Unsplit soft keyboard 5008 is displayed on display 112. The keys of unsplit soft keyboard 5008 are not split amongst two or more distinct keyboard portions. In some embodiments, unsplit soft keyboard 5008 includes keyboard split key 5012. In some embodiments, keyboard split key 5012 shows an icon or other graphical indicia (e.g., an icon or graphic showing two halves moving apart, as in splitting) indicating that keyboard split key 5012 may be used to switch to an integrated input area that includes a split soft keyboard.

[0170] Figure 5A also depicts exemplary gestures that, if detected, activate splitting of unsplit soft keyboard 5008 in response. Gesture 5010 is a two-finger de-pinch gesture performed on unsplit soft keyboard 5008. Gesture 5010 includes finger contacts 5010-A and 5010-B moving apart from each other in directions 5011-A and 5011-B, respectively, on display 112. On the other hand, gesture 5014 is a tap gesture on keyboard split key 5012.

[0171] In response to detection of either gesture 5010 or 5014 on display 112, device 100 changes unsplit soft keyboard 5008 (Figure 5A) into integrated input area 5016 (Figure 5C). In some embodiments, an animation showing the transition from unsplit soft keyboard 5008 to integrated input area 5016 is displayed on display 112. For example, the transitional animation may show unsplit soft keyboard 5008 splitting into halves and center area 5016-C appearing between the halves, with the halves moving apart from each other in directions 5017-A and 5017-B (Figure 5B). The halves become split keyboard portions 5016-A and 5016-B (Figure 5B) and the keys of unsplit soft keyboard 5008 are divided amongst the halves. In some embodiments, some keys are included in both left and right portions 5016-A and 5016-B of integrated input area 5016. Figure 5B depicts user interface 5000B at an instant in the transitional animation from unsplit soft keyboard 5008 to integrated input area 5016. What was unsplit soft keyboard 5008 has transitioned into integrated input area 5016, with the keys of unsplit soft keyboard 5008 divided between two opposing portions. Upon completion of the transition animation, integrated input area 5016 is displayed, as shown in Figure 5C.

[0172] Figure 5C depicts user interface 5000C, with integrated input area 5016 displayed after completion of the transition animation. Integrated input area 5016 includes left split keyboard portion 5016-A and right split keyboard portion 5016-B, and center area 5016-C between split keyboard portions 5016-A and 5016-B. Split keyboard portions 5016-A and 5016-B and center area 5016-C form integrated input area 5016. In some embodiments, integrated input area 5016 includes keyboard unsplit key 5018, replacing keyboard split key 5012. In some embodiments, a character key is included in both split keyboard portions 5016-A and 5016-B. For example, "G" key 5019 is included in both portions 5016-A and 5016-B. In some embodiments, some keys in unsplit soft keyboard 5008 are not displayed in integrated input area 5016. For example, hide keyboard key 5009 (Figure 5A) in unsplit soft keyboard 5008 is not displayed in integrated input area 5016.

[0173] In some embodiments, center area 5016-C displays duplicate cursor 5020 and duplicate input text 5022. Duplicate cursor 5020 and duplicate input text 5022 mirror cursor 5004 and at least a portion of input text 5006, respectively. The portion of input text 5006 that is visible in center area 5016-C as duplicate input text 5022 at any moment is typically the portion of input text 5006 that is in the immediate vicinity of cursor 5004. In some embodiments, duplicate cursor 5020 and duplicate input text 5022 are displayed at larger sizes than cursor 5004 and input text 5006, respectively. In some other embodiments, center area 5016-C is empty, and text entry area 5002 is visible through center area 5016-C, as shown in Figure 5K.
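A two-finger de-pinch (contacts moving apart, as in gesture 5010) versus a pinch (contacts moving together, as in gesture 5028 described below) can be distinguished by comparing the distance between the two contacts at the start and end of the gesture. The `slack` tolerance and the function name are illustrative assumptions, not details from the patent:

```python
import math

def classify_two_finger_gesture(start_a, end_a, start_b, end_b, slack=2.0):
    """Classify a two-contact gesture as 'de-pinch' (contacts moving
    apart, which splits the keyboard) or 'pinch' (contacts moving
    together, which merges it); `slack` ignores small jitter."""
    d0 = math.dist(start_a, start_b)  # contact separation at gesture start
    d1 = math.dist(end_a, end_b)     # contact separation at gesture end
    if d1 > d0 + slack:
        return "de-pinch"
    if d1 < d0 - slack:
        return "pinch"
    return "none"

print(classify_two_finger_gesture((100, 300), (40, 300), (140, 300), (200, 300)))
# -> de-pinch
```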
[0174] Figure 5D depicts user interface 5000D, with gesture 5026 detected on display 112. Gesture 5026 is detected on "T" key 5024 in left split keyboard portion 5016-A. In response to detection of gesture 5026 on "T" key 5024, "T" key 5024 is activated and a "t" character is entered into input text 5006. Duplicate input text 5022 in center area 5016-C also shows the "t" character being entered, mirroring the entering of the "t" character into input text 5006.

[0175] Figure 5E depicts user interface 5000E, showing gestures that, if detected, activate un-splitting of the split soft keyboard in integrated input area 5016 in response. Gesture 5028 is a two-finger pinch gesture performed on integrated input area 5016. Gesture 5028 includes finger contacts 5028-A and 5028-B moving toward each other in directions 5030-A and 5030-B, respectively, on display 112. On the other hand, gesture 5032 is a tap gesture on keyboard unsplit key 5018.

[0176] In response to detection of either gesture 5028 or 5032 on display 112, device 100 changes integrated input area 5016 into unsplit soft keyboard 5008. In some embodiments, an animation showing the transition from integrated input area 5016 to unsplit soft keyboard 5008 is displayed on display 112. For example, the transitional animation may show split keyboard portions 5016-A and 5016-B merging together and center area 5016-C reducing in size and eventually ceasing to be displayed. Figure 5F depicts user interface 5000F at a point in the transitional animation. Integrated input area portions 5016-A and 5016-B are merging together in directions 5034-A and 5034-B and center area 5016-C continually reduces in size. Upon completion of the transition animation, unsplit soft keyboard 5008, including keyboard split key 5012, is displayed in UI 5000G, as shown in Figure 5G.

[0177] Figure 5H depicts user interface 5000H displayed on display 112. Tap gesture 5038 is detected at some time on "T" key 5036 in unsplit soft keyboard 5008. In response, "T" key 5036 is activated; a "t" character is entered into input text 5006 displayed in UI 5000I, as shown in Figure 5I.

[0178] Figure 5I also depicts gesture 5010 detected on display 112 at a time after the time of detection of gesture 5038. In some embodiments, whether integrated input area 5016 is displayed in place of unsplit soft keyboard 5008 in response to detection of gesture 5010 depends on the time when gesture 5038 is detected and the time when gesture 5010 is detected. If the time period between the time when gesture 5038 is detected and the time when gesture 5010 is detected exceeds a predefined period of time (e.g., 0.3, 0.5, or 1.0 seconds), then integrated input area 5016 is displayed in UI 5000K, as shown in Figure 5K. In some embodiments, a transitional animation is displayed in UI 5000J, as shown in Figure 5J. Figure 5J is similar to Figure 5B, and thus a detailed description of Figure 5J is omitted for brevity. If the time period between the time when gesture 5038 is detected and the time when gesture 5010 is detected does not exceed the predefined period of time, then the display of unsplit soft keyboard 5008 is maintained. This use of a time threshold prevents accidental conversion of the unsplit keyboard 5008 into an integrated input area 5016 when the user is in the middle of typing using the unsplit keyboard.

[0179] Figure 5L depicts user interface 5000L displayed on display 112. User interface 5000L may be a user interface in an application (e.g., a notes application, a web browser application, etc.) on device 100. User interface 5000L includes text entry area 5002 and integrated input area 5039. Integrated input area 5039 includes split keyboard portions 5039-A and 5039-B, and center area 5039-C. In some embodiments, integrated input area 5039 is integrated input area 5016, with duplicate input text and a duplicate cursor displayed in center area 5039-C.

[0180] In some embodiments, integrated input area 5039 may move to a different location on display 112 in response to a gesture (e.g., a dragging gesture). In Figure 5L, two gestures are shown: gesture 5040 and gesture 5044. Gesture 5040 is a dragging gesture where the finger contact begins in center area 5039-C, as indicated by finger contact position 5040-1, and moves to position 5040-2. Gesture 5044 is a dragging gesture where the finger contact begins in split keyboard portion 5039-B, as indicated by finger contact position 5044-1, and moves to position 5044-2.

[0181] Whether integrated input area 5039 does move in response to detection of gesture 5040 or gesture 5044 depends on whether the magnitude of the respective gesture, i.e., the distance the finger contact moves in the gesture, exceeds a respective predefined threshold. The amount of the threshold depends on whether the detected gesture begins in center area 5039-C or in split keyboard portion 5039-A or 5039-B. For a gesture that starts in center area 5039-C, threshold 5042 is a predefined distance from the start of the gesture. For a gesture that starts in split keyboard portion 5039-B or 5039-A, threshold 5046 is a predefined distance from the start of the gesture. The distance for threshold 5046 is longer than the distance for threshold 5042. Thus, the integrated input area will start moving in accordance with the movement of the contact in gesture 5040 before the integrated input area will start moving in accordance with the movement of the contact in gesture 5044. The distance threshold is greater over the split keyboard portions of the integrated input area (as compared to the center portion of the integrated input area) to prevent accidental movement of the integrated input area when a user contact moves during typing with the split keyboard.

[0182] In some embodiments, a distance threshold is compared against the vertical distance component of the gesture. In some other embodiments, the distance threshold is compared against the complete gesture, including both the horizontal and vertical distance components; that is, the threshold is compared against the absolute distance of the gesture. Thus, for example, alternative thresholds 5046 and 5048 are shown for gesture 5044. Threshold 5046 is compared to the vertical component of gesture 5044. Threshold 5048, on the other hand, is compared to the complete gesture 5044.
compared to the complete gesture 5044.

[0183] If either gesture 5040 or 5044 is detected, and the detected gesture exceeds a respective threshold in accordance with where the detected gesture began, then integrated input area 5039 moves vertically in accordance with the direction of the detected gesture in UI 5000M, as shown in Figure 5M. Split keyboard portions 5039-A, 5039-B, and center area 5039-C move together as one integrated unit in a vertical direction. Movement of integrated input area 5039 is typically restricted to vertical (up or down) movement, which keeps the left and right split keyboard portions adjacent to the left and right sides of the display, respectively, when the integrated input area is moved. In turn, this keeps the keys in the left and right split keyboard portions easily reachable by the left and right thumbs, respectively, during two-thumb typing.

[0184] Figure 5N illustrates two charts showing the amount of the threshold based on where on display 112 the gesture begins. Chart 5050 shows the amount of the threshold according to some embodiments. Chart 5050 has an x-axis representing the position along the width of display 112 and a y-axis representing the magnitude of the threshold, with the ends of the x-axis representing the left and right edges of display 112. Line 5052 marks the center axis of display 112. Span 5055 between lines 5054-A and 5054-B represents the width of center area 5039-C. The areas outside of span 5055 represent the widths of split keyboard portions 5039-A and 5039-B. A first value is defined for threshold 5056-B, for gestures that begin in center area 5039-C. Threshold 5056-B is constant for the width of center area 5039-C. A second value is defined for threshold 5056-A, for gestures that begin in either split keyboard portion 5039-A or 5039-B. Threshold 5056-A is constant for the widths of split keyboard portions 5039-A and 5039-B.

[0185] Chart 5058 shows the amount of the threshold according to some alternative embodiments. Chart 5058 has an x-axis representing the position along the width of display 112 and a y-axis representing the magnitude of the threshold, with the ends of the x-axis representing the left and right edges of display 112. Span 5055 between lines 5054-A and 5054-B represents the width of center area 5039-C. The areas outside of span 5055 represent the widths of split keyboard portions 5039-A and 5039-B. Chart 5058 shows threshold amounts 5060-A and 5060-B defined to be a particular amount at center line 5052 and increasing linearly from that amount with the distance from center line 5052. Under either chart 5050 or 5058, the threshold within center area 5039-C is lower than the threshold in split keyboard portion 5039-A or 5039-B. In center area 5039-C, the distance threshold is lower because there is less opportunity for confusing a dragging gesture (for moving integrated input area 5039) with a key activation gesture (for entering a character).

[0186] Figure 5O depicts UI 5000O. UI 5000O includes application content area 5062, with height 5064, displayed on display 112. UI 5000O also includes integrated input area 5039, with height 5065, docked at the bottom of display 112. The sum of height 5064 and height 5065 is equal (or substantially equal, e.g., within 10 pixels) to height 5066 of display 112. Gesture 5068 is detected on display 112. Gesture 5068 includes a finger contact moving on display 112 from position 5068-1 to position 5068-2.

[0187] In response to detecting gesture 5068, integrated input area 5039 is undocked and moves vertically in accordance with the direction of gesture 5068 in UI 5000P, as shown in Figure 5P. Integrated input area 5039 is displayed as an overlay over application content area 5062 in Figure 5P. Also in response to detecting gesture 5068, application content area 5062 increases in size to occupy height 5066. As a result of the increase in size, content in application content area 5062 that was previously not visible in UI 5000O absent scrolling may become visible in UI 5000P. For example, instant messaging (IM) field 5067 is visible in UI 5000P without scrolling. Thus, when an input area with a keyboard is "undocked" from the bottom of the display, more display area is used to display the application content area and the input area "floats" over the application content area.

[0188] Figure 5Q depicts UI 5000Q displayed on display 112. UI 5000Q includes text entry area 5002 and split soft keyboard 5069. In some embodiments, split soft keyboard 5069 is part of integrated input area 5016 or 5039. Cursor 5004 and input text 5006 are displayed in text entry area 5002. Split keyboard 5069 includes left split keyboard portion 5069-A and right split keyboard portion 5069-B. In left split keyboard portion 5069-A, the rightmost letter keys include "T" key 5070, "F" key 5072, and "V" key 5074. In right split keyboard portion 5069-B, the leftmost letter keys include "Y" key 5076, "G" key 5078, and "B" key 5080.

[0189] To the right of "T" key 5070, "F" key 5072, and "V" key 5074 in left split keyboard portion 5069-A are undisplayed key activation areas 5082, 5084, and 5086, respectively. Activation area 5082 corresponds to "Y" key 5076. Activation area 5084 corresponds to "G" key 5078. Activation area 5086 corresponds to "B" key 5080. The undisplayed key activation areas are typically the same size as, or slightly larger (e.g., up to 10% larger) than, the corresponding key. The diagonal lines in activation areas 5082, 5084, and 5086 in the figures are used to indicate that these activation areas are not displayed to the user, whereas the corresponding keys are displayed.

[0190] To the left of "Y" key 5076, "G" key 5078, and "B" key 5080 are undisplayed key activation areas 5088, 5090, and 5092, respectively. Activation area 5088 corresponds to "T" key 5070. Activation area 5090 corresponds to "F" key 5072. Activation area 5092 corresponds to "V" key 5074. The undisplayed key activation areas are typically the same size as, or slightly larger (e.g., up to 10% larger) than, the corresponding key. The diagonal lines in activation areas 5088, 5090, and 5092 in the figures are used to indicate that these activation areas are not displayed to the user, whereas the corresponding keys are displayed.

[0191] Figure 5Q also shows gesture 5096 (e.g., a tap gesture) detected on "O" key 5094 in right split keyboard portion 5069-B. In response to detection of gesture 5096, the character "o" is entered into input text 5006 in UI 5000R, as shown in Figure 5R.

[0192] Figure 5R also shows gesture 5098 detected on key activation area 5086 following detection of gesture 5096. In some embodiments, in response to detection of gesture 5098, the character "b" is entered into input text 5006 in UI 5000S, as shown in Figure 5S, as key activation area 5086 corresponds to "B" key 5080.

[0193] In some other embodiments, the character "b" is entered into input text 5006 in response to detection of gesture 5098 only if the time period between the time of detection of gesture 5096 and the time of detection of gesture 5098 is less than a predefined period of time. If the time period exceeds the predefined period, entry of the character "b" is forgone; gesture 5098 is ignored.

[0194] Figure 5T shows UI 5000T, with gesture 5100 detected on key activation area 5088. In response to detection of gesture 5100, the character "t" is entered into input text 5006 in UI 5000U, as shown in Figure 5U, as key activation area 5088 corresponds to "T" key 5070.

[0195] Figure 5V depicts UI 5000V. UI 5000V includes text entry area 5002 and integrated input area 5016. Integrated input area 5016 includes split keyboard portions 5016-A (the left portion) and 5016-B (the right portion) and center area 5016-C. Cursor 5004 and input text 5006 are also displayed in text entry area 5002. Cursor 5004 and input text 5006 are mirrored in center area 5016-C as duplicate cursor 5020 and duplicate input text 5022, respectively. In some embodiments, duplicate cursor 5020 and duplicate input text 5022 are displayed at larger sizes than cursor 5004 and input text 5006, respectively.

[0196] Figure 5V also shows gesture 5102 (e.g., a tap gesture) detected on "T" key 5024 in left split keyboard portion 5016-A. In response to detection of gesture 5102, the character "t" is entered into input text 5006 in UI 5000W, as shown in Figure 5W. Input text 5006, including the newly entered character "t," is mirrored in center area 5016-C, as duplicate input text 5022 also shows the newly entered letter "t."

[0197] Figure 5X shows UI 5000X, with gesture 5102 detected on duplicate cursor 5020 in center area 5016-C, while duplicate cursor 5020 is located at the end of duplicate input text 5022 (mirroring cursor 5004 being located at the end of input text 5006). Gesture 5102 is a dragging gesture, with the finger contact moving in direction 5103. In response to detection of gesture 5102 on duplicate cursor 5020, cursor 5004 is moved to a different position in input text 5006 in accordance with direction 5103 in UI 5000Y, as shown in Figure 5Y. The result of the movement of cursor 5004 is mirrored in center area 5016-C, as duplicate input text 5022 is moved so that duplicate cursor 5020 is in the same position relative to duplicate input text 5022 as cursor 5004 is relative to input text 5006. In some embodiments, duplicate cursor 5020 remains stationary within center area 5016-C, and its position relative to duplicate input text 5022 changes by advancing or retreating duplicate input text 5022, as shown in Figure 5Y.

[0198] Figure 5Z depicts UI 5000Z. Editing control object 5104, corresponding to a text editing operation, is displayed in UI 5000Z near input text 5006. Editing control object 5104 corresponds to a text pasting operation to be performed on input text 5006 if activated. Duplicate editing control object 5106, corresponding to editing control object 5104, is displayed in center area 5016-C near duplicate input text 5022. Gesture 5108 (e.g., a tap gesture) is detected on duplicate editing control object 5106.

[0199] In response to detection of gesture 5108 on duplicate editing control object 5106, the text pasting operation corresponding to editing control object 5104 is activated in UI 5000AA, as shown in Figure 5AA. The text "ahead" is pasted into input text 5006. The text pasting is mirrored in center area 5016-C, as duplicate input text 5022 is updated to also include the pasted text "ahead."

[0200] Figure 5BB depicts UI 5000BB. UI 5000BB includes form 5110 displayed in text entry area 5002. Form 5110 includes one or more text entry fields 5114 and one or more checkboxes 5112, each respective checkbox 5112 associated with an option in form 5110. For example, checkbox 5112-A in form 5110 is associated with the "Vegetarian" option. Duplicate checkboxes 5116 and the corresponding options mirror checkboxes 5112 and the corresponding options and are displayed in center area 5016-C. For example, duplicate checkbox 5116-A corresponds to checkbox 5112-A, duplicate checkbox 5116-B corresponds to checkbox 5112-B, and so forth. In some embodiments, duplicate checkboxes 5116 and the corresponding options are displayed at larger sizes than their corresponding checkboxes 5112. Gesture 5118 (e.g., a tap gesture) is detected on duplicate checkbox 5116-A, which is unselected (i.e., not checked), as is checkbox 5112-A.

[0201] In response to detection of gesture 5118 on duplicate checkbox 5116-A, checkbox 5112-A is selected in UI 5000CC, as shown in Figure 5CC. Duplicate checkbox 5116-A is also selected, mirroring the selection of checkbox 5112-A.

[0202] Figure 5DD depicts UI 5000DD. UI 5000DD shows gesture 5120 detected in center area 5016-C, away from duplicate cursor 5020 and duplicate input text 5022. In some embodiments, gesture 5120 is a double tap gesture or a finger contact held in place. In response to detection of gesture 5120 in center area 5016-C, pop-up menu 5122 is displayed in UI 5000EE, as shown in Figure 5EE. Duplicate menu 5124 is displayed in center area 5016-C. Selection of an option in duplicate menu 5124 has the same effect as selecting the corresponding option in pop-up menu 5122. In other embodiments (not shown), in response to detection of gesture 5120, menu 5124 is displayed in center area 5016-C without concurrently displaying pop-up menu 5122; in other words, a pop-up menu is just displayed in the center area 5016-C.
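The undisplayed activation areas and the tap-timing rule described above for Figures 5Q-5U can be sketched in code. This is a minimal illustration under stated assumptions, not the patented implementation: the function and variable names, the rectangle layout, and the 0.8-second window are invented for the example. The description only specifies that activation areas are about the same size as, or up to 10% larger than, their corresponding keys, and that in some embodiments a tap in an activation area enters a character only within a predefined period after the previous key tap.

```python
# Hit-testing sketch for a split keyboard with undisplayed key activation
# areas (cf. Figures 5Q-5U). Keys and activation areas are modeled as
# (character, (x, y, width, height)) tuples; all coordinates, sizes, and
# the 0.8 s window are illustrative assumptions, not values from the patent.

def _contains(rect, x, y):
    rx, ry, rw, rh = rect
    return rx <= x < rx + rw and ry <= y < ry + rh

def char_for_tap(keys, activation_areas, x, y, now, last_key_time, window=0.8):
    """Return the character a tap at (x, y) should enter, or None.

    A tap on a displayed key is always accepted. A tap on an undisplayed
    activation area enters the corresponding key's character, but (as in
    some described embodiments) only if it follows the previous key tap
    within a predefined period; otherwise the tap is ignored.
    """
    for char, rect in keys:
        if _contains(rect, x, y):
            return char
    for char, rect in activation_areas:
        if _contains(rect, x, y):
            if now - last_key_time <= window:
                return char
            return None  # too long since the last key tap: forgo the character
    return None

# The left portion's rightmost key is "V"; the right portion's leftmost key
# is "B". The undisplayed area just right of "V" activates "B", and the one
# just left of "B" activates "V"; each is slightly larger than its key.
keys = [("V", (60, 40, 20, 20)), ("B", (140, 40, 20, 20))]
activation_areas = [("B", (80, 40, 22, 20)), ("V", (118, 40, 22, 20))]

print(char_for_tap(keys, activation_areas, 65, 50, now=10.0, last_key_time=9.9))  # V
print(char_for_tap(keys, activation_areas, 85, 50, now=10.0, last_key_time=9.9))  # B
print(char_for_tap(keys, activation_areas, 85, 50, now=10.0, last_key_time=5.0))  # None
```

Checking displayed keys before activation areas ensures that, wherever the two regions might overlap, the visible key always wins, so the invisible hit regions can never steal a tap from a key the user can actually see.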
[0203] Figure 5FF depicts UI 5000FF. UI 5000FF in- characters 5146, are displayed in center area 5016-C
cludes handwriting 5126 made using one or more finger near duplicate input text 5022. Gesture 5149 (e.g., a tap
strokes 5128 made within center area 5016-C. Handwrit- gesture) is detected on duplicate Unicode character
ing 5126 is recognized to resemble a Chinese character, 5148-A, which corresponds to Unicode character 5146-
and one or more candidate characters (not shown) may 5 A.
be displayed in center area 5016-C. In response to se- [0210] In response to detection of gesture 5149 on du-
lection of a candidate character, the selected candidate plicate Unicode character 5148-A, Unicode character
character is entered into input text 5006 in UI 5000GG, 5146-A is entered into input text 5006 in UI 500000, as
as shown in Figure 5GG. In UI 5000GG, the character shown in Figure 5OO. The changed text input 5006 is
10 mirrored in center area 5016-C, as duplicate input text
" " is entered into input text 5006. In some embodi-
5022 also includes duplicate Unicode character 5148-A.
ments, duplicate input text 5022 also shows the character
In other embodiments (not shown), Unicode characters
" " being entered. 5148 are displayed in center area 5016-C, without con-
[0204] Figure 5HH depicts UI 5000HH. UI 5000HH in- currently displaying Unicode characters 5146; in other
cludes cursor 5004 in text entry area 5002 and drawing 15 words, the Unicode characters available for input are just
5130, made using one or more finger strokes 5132, in displayed in the center area 5016-C.
center area 5016-C. After completion of drawing 5130, [0211] Figure 5PP depicts UI 5000PP, which includes
drawing 5133 corresponding to drawing 5130 is entered input text 5006 and cursor 5004 displayed in text entry
into text entry area 5002 in UI 5000II, as shown in Figure area 5002. Cursor 5004 and at least a portion of input
5II. 20 text 5006 are mirrored in center area 5016-C as duplicate
[0205] Figure 5JJ depicts UI 5000JJ. UI 5000JJ in- cursor 5020 and duplicate input text 5022, respectively.
cludes input text 5006 and cursor 5004 displayed in text Gesture 5150 is detected in center area 5016-C. Gesture
entry area 5002. Suggested word 5134 for input text 5006 5150 includes a finger contact moving from position
is displayed. Duplicate suggested word 5136, corre- 5150-A to position 5150-B, going over duplicate input
sponding to suggested word 5134, is displayed in center 25 text 5022.
area 5016-C near duplicate input text 5022. Gesture [0212] In response to detection of gesture 5150 over
5138 (e.g., a tap gesture) is detected on X-icon 5137 duplicate input text 5022, duplicate input text 5022 is dis-
(which corresponds to X-icon 5135 accompanying sug- played with shading 5154 (or other highlighting) (Figure
gested word 5134) accompanying duplicate suggested 5QQ), indicating that duplicate input text 5022 is select-
word 5136. 30 ed. The corresponding text in input text 5006 is displayed
[0206] In response to detection of gesture 5138 on X- with shading 5152 (or other highlighting) as well.
icon 5137, suggested word 5134 is rejected, and input [0213] Figure 5RR depicts UI 5000RR, which includes
text 5006 remains as is in UI 5000KK, as shown in Figure input text 5006 and cursor 5004 displayed in text entry
5KK. area 5002. Cursor 5004 and input text 5006 is mirrored
[0207] Figure 5LL depicts UI 5000LL. UI 5000LL in- 35 in center area 5016-C as duplicate cursor 5020 and du-
cludes input text 5006 and cursor 5004 displayed in text plicate input text 5022, respectively. Gesture 5156 (e.g.,
entry area 5002. One or more emoticons 5140 available a double tap gesture, a triple tap gesture, or a tap and
for entry into input text 5006 are displayed. Duplicate hold gesture) is detected in center area 5016-C on du-
emoticons 5142, corresponding to emoticons 5140, are plicate input text 5022. Gesture 5156 is associated with
displayed in center area 5016-C near duplicate input text 40 a text formatting operation (e.g., making text bold, un-
5022. Gesture 5144 (e.g., a tap gesture) is detected on derling text, italicizing text, etc.).
duplicate emoticon 5142-A, which corresponds to emoti- [0214] In response to detection of gesture 5156 on du-
con 5140-A. plicate input text 5022, the formatting of input text 5006
[0208] In response to detection of gesture 5144 on du- is changed in UI 5000SS, as shown in Figure 5SS. Input
plicate emoticon 5142-A, emoticon 5140-A is entered into 45 text 5006 is changed to bold text. Duplicate input text
input text 5006 in UI 5000MM, as shown in Figure 5MM. 5022 is also changed to bold text to mirror the change in
The changed text input 5006 is mirrored in center area the formatting of input text 5006.
5016-C, as duplicate emoticon 5142-A is entered into [0215] Figure 5TT depicts UI 5000TT. Displayed in UI
duplicate input text 5022. In other embodiments (not 5000TT is a menu 5158 of input options for center area
shown), emoticons 5142 are displayed in center area 50 5016-C. Menu 5158 includes options for allowable user
5016-C, without concurrently displaying emoticons 5140; inputs in center area 5016-C, such as gestures associ-
in other words, the emoticons available for input are just ated with text formatting operations, gestures associated
displayed in the center area 5016-C. with text editing operations, Chinese character handwrit-
[0209] Figure 5NN depicts UI 5000NN, which includes ing, drawing, emoticons, and so forth. A user may select,
input text 5006 and cursor 5004 displayed in text entry 55 in menu 5158, inputs that they want enabled or disabled
area 5002. One or more Unicode characters 5146 avail- in center area 5016-C.
able for entry into input text 5006 are displayed. Duplicate [0216] Figure 5UU depicts UI 5000UU. UI 5000UU in-
Unicode characters 5148, corresponding to Unicode cludes application content area 5160 (e.g., a content area
29
57 EP 2 638 460 B1 58
in a web browser, content area in a word processing ap- keyboard stops just below text entry area 5162-A, none
plication, etc.). Displayed in application content area of text entry area 5162-A is obscured by soft keyboard
5160 are one or more text entry areas 5162 (e.g., text 5164. Even though trajectory 5174 and the associated
fields in an online form). For example, application content inertia would otherwise carry soft keyboard 5164 to ter-
area 5160 includes text entry areas 5162-A thru 5162-F 5 mination point 5175, which is above the bottom of text
that are a part of an online form. Soft keyboard 5164 is entry area 5162-A (Figure 5YY), trajectory 5174 is ad-
also displayed. While soft keyboard 5164 is shown as an justed so that soft keyboard 5164 stops just below text
unsplit soft keyboard, in some embodiments, the input entry area 5162-A.
area is a split soft keyboard or an integrated input area [0222] In some embodiments, when the keyboard is
that includes a split keyboard, such as integrated input 10 "thrown" or "flung," the keyboard bounces off the top or
area 5016 or 5039. bottom of the display by some amount (e.g., an amount
[0217] Gesture 5166 is detected on display 112. Ges- corresponding to a top or bottom toolbar height, respec-
ture 5166 is a dragging gesture that includes a finger tively) if there is some appreciable velocity component
contact starting at position 5166-A and moving to position of the touch at the time the touch lifts off. Conversely,
5166-B. In response to detection of gesture 5166, soft 15 when the keyboard is dragged to the edge of the display
keyboard 5164 moves vertically in accordance with the and released with no or very little velocity, the keyboard
direction of gesture 5166 and final position 5166-B in UI "docks" flush with the edge of the display (not shown).
5000VV, as shown in Figure 5VV. After drag gesture [0223] Figure 5AAA depicts UI 5000AAA displayed on
5166, soft keyboard 5164 may partially obscure a text display 112. UI 5000AAA includes text entry area 5002,
entry area 5162. For example, in UI 5000VV, soft key- 20 with cursor 5004 and input text 5006 displayed in text
board 5164 partially obscures text entry area 5162-F. entry area 5002. Integrated input area 5039 is also dis-
[0218] Figure 5UU depicts UI 5000WW. In UI played in UI 5000AAA. Integrated input area 5039 in-
5000WW, gesture 5168 is detected. Gesture 5168 is a cludes split keyboard portions 5039-A (the left portion)
flick gesture in a vertical direction starting from a location and 5039-B (the right portion) and center area 5039-C.
on soft keyboard 5164. In response to detection of ges- 25 [0224] Gesture 5176 is detected on display 112. In
ture 5168, movement trajectory 5170, including move- some embodiments, gesture 5176 is a two-thumb tap
ment inertia, is imparted to soft keyboard 5164 in accord- gesture, with one thumb on location 5176-A over split
ance with gesture 5168. For example, a short trajectory keyboard portion 5039-A and the other thumb on location
is imparted in response to a small flicking gesture, and 5176-B over split keyboard portion 5039-B.
a long trajectory is imparted in response to a large flicking 30 [0225] In response to detection of gesture 5176, device
gesture. Movement trajectory 5170 has termination point 100 enters a reconfiguration mode for integrated input
5171. area 5039 in UI 5000BBB (Figure 5BBB). While device
[0219] Soft keyboard 5164 moves with inertia in ac- 100 is in the reconfiguration mode, gesture 5178 is de-
cordance with movement trajectory 5170 in UI 5000XX, tected on display 112. Gesture 5178 includes a left thumb
as shown in Figure 5XX, and comes to rest at a position 35 contact moving toward the left edge of display 112. In
adjacent to and below text entry area 5162-F. As soft response to detection of gesture 5178, split keyboard
keyboard 5164 stops just below text entry area 5162-F, portion 5039-A reduces in size (e.g., in width, height, or
none of text entry area 5162-F is obscured by soft key- both width and height) in UI 5000CCC, as shown in Figure
board 5164. Even though trajectory 5170 and the asso- 5CCC, and the keys in split keyboard portion 5039-A res-
ciated movement inertia would otherwise carry soft key- 40 cale in accordance with the size reduction of split key-
board 5164 to termination point 5171, which is above the board portion 5039-A. In some embodiments, center area
bottom of text entry area 5162-F (Figure 5WW), trajectory 5039-C increases in size and split keyboard portion 5039-
5170 is adjusted so that soft keyboard 5164 stops just B maintains the same size in response. In some other
below text entry area 5162-F. embodiments, both center area 5039-C and split key-
[0220] Figure 5YY depicts UI 5000YY. In UI 5000YY, 45 board portion 5039-B increases in size in response.
gesture 5172 is detected. Gesture 5172 is a flicking ges- [0226] Figure 5DDD depicts UI 5000DDD, which
ture in a vertical direction starting from a location on soft shows gesture 5180 detected on display 112 while in the
keyboard 5164, and is a larger flicking gesture than ges- reconfiguration mode. Gesture 5180 includes a left
ture 5168. In response to detection of gesture 5172, thumb contact moving away from the left edge of display
movement trajectory 5174, including movement inertia, 50 112. In response to detection of gesture 5180, split key-
is imparted onto soft keyboard 5164 in accordance with board portion 5039-A increases in size (e.g., in width,
gesture 5172. Movement trajectory 5174 is larger than height, or both width and height) in UI 5000EEE, as
movement trajectory 5170 and has termination point shown in Figure 5EEE, and the keys in split keyboard
5175. portion 5039-A rescale in accordance with the size in-
[0221] Soft keyboard 5164 moves with inertia in ac- 55 crease of split keyboard portion 5039-A. In some embod-
cordance with movement trajectory 5174 in UI 5000ZZ, iments, center area 5039-C decreases in size and split
as shown in Figure 5ZZ, and comes to rest at a position keyboard portion 5039-B maintains the same size in re-
adjacent to and below text entry area 5162-A. As soft sponse. In some other embodiments, both center area
30
59 EP 2 638 460 B1 60
5039-C and split keyboard portion 5039-B decreases in one thumb on location 5190-A over split keyboard portion
size in response. 5039-A and the other thumb on location 5190-B over split
[0227] Figure 5FFF depicts UI 5000FFF, which shows keyboard portion 5039-B. In response to detection of ges-
gesture 5182 detected on display 112 while in the recon- ture 5190, device 100 exits the reconfiguration mode for
figuration mode. Gesture 5182 includes a right thumb 5 integrated input area 5039.
contact moving toward the right edge of display 112. In [0232] It should be appreciated that, while the details
response to detection of gesture 5182, split keyboard of Figures 5A and 5NNN were described in the context
portion 5039-B reduces in size (e.g., in width, height, or of display 112 in portrait orientation, the details of Figures
both width and height) in UI 5000GGG, as shown in Fig- 5A-5NNN also apply in an analogous manner to a display
ure 5GGG, and the keys in split keyboard portion 5039- 10 (e.g., display 112) in landscape orientation.
B rescale in accordance with the size reduction of split [0233] Figure 5OOO depicts UI 5000OOO displayed
keyboard portion 5039-B. In some embodiments, center on touch-sensitive display 112 of device 100. UI
area 5039-C increases in size and split keyboard portion 5000OOO may be a user interface in an application (e.g.,
5039-A maintains the same size in response. In some a notes application, a web browser application, etc.) on
other embodiments, both center area 5039-C and split 15 device 100. UI 5000OOO includes text entry area 5002.
keyboard portion 5039-A increases in size in response. A cursor and input text may be displayed in text entry
[0228] Figure 5HHH depicts UI 5000HHH, which area 5002. Unsplit soft keyboard 5008 is displayed on
shows gesture 5184 detected on display 112 while in the touch-sensitive display 112 at the bottom of the display.
reconfiguration mode. Gesture 5184 includes a right In some embodiments, unsplit soft keyboard includes
thumb contact moving away from the right edge of display 20 keyboard split key 5012, the details of which are de-
112. In response to detection of gesture 5184, split key- scribed above and are not repeated here.
board portion 5039-B increases in size (e.g., in width, [0234] Gesture 5194 is detected on touch-sensitive
height, or both width and height) in UI 5000III, as shown display 112. Gesture 5194 is a dragging or flick gesture
in Figure 5III, and the keys in split keyboard portion 5039- starting from a location on touch-sensitive display 112
B rescale in accordance with the size reduction of split 25 corresponding to unsplit keyboard 5008. Gesture 5194
keyboard portion 5039-B. In some embodiments, center moves in direction 5196 away from the bottom of touch-
area 5039-C decreases in size and split keyboard portion sensitive display 112.
5039-A maintains the same size in response. In some [0235] In response to detection of gesture 5194 on dis-
other embodiments, both center area 5039-C and split play 112, device 100 changes unsplit soft keyboard 5008
keyboard portion 5039-A decreases in size in response. 30 (Figure 5OOO) into split keyboard 5198 (Figure 5RRR).
[0229] Figure 5JJJ depicts UI 5000JJJ, which shows In some embodiments, an animation showing the transi-
gesture 5186 being detected on display 112 while in the tion from unsplit soft keyboard 5008 to split keyboard
reconfiguration mode, in some other embodiments. Ges- 5198 is displayed on display 112. For example, the tran-
ture 5186 includes a left thumb contact moving toward sitional animation may show unsplit soft keyboard 5008
the left edge of display 112. In response to detection of 35 moving away from the bottom of display 112 over text
gesture 5186, both split keyboard portions 5039-A and entry area 5002, in direction 5199, and splitting into split
5039-B reduce in size (e.g., in width, height, or both width keyboard portions 5198-A and 5198-B, as shown in Fig-
and height) in UI 5000KKK, as shown in Figure 5KKK, ures 5PPP and 5QQQ. As unsplit soft keyboard 5008
and the keys in split keyboard portions 5039-A and 5039- splits into portions 5198-A and 5198-B, portions 5198-A
B rescale in accordance with the size reduction of split 40 and 5198-B move apart in directions 5200-A and 5200-
keyboard portions 5039-A and 5039-B, respectively. B, respectively. In some embodiments, unsplit keyboard
Center area 5039-C also increases in size as a result. 5008 is moved away from the bottom of display 112 be-
[0230] Figure 5LLL depicts UI 5000LLL, which shows fore the splitting begins, as shown in Figure 5PPP. Upon
gesture 5188 being detected on display 112 while in the completion of the transition animation, split keyboard
reconfiguration mode, in some other embodiments. Ges- 45 5198 is displayed, as shown in Figure 5RRR.
ture 5188 includes a left thumb contact moving away from [0236] Figure 5PPP depicts UI 5000PPP, which shows
the left edge of display 112. In response to detection of an instant in the transition from unsplit keyboard 5008 to
gesture 5188, both split keyboard portions 5039-A and split keyboard 5198. Unsplit keyboard 5008 is moved
5039-B increase in size (e.g., in width, height, or both away from the bottom of display 112, in direction 5199
width and height) in UI 5000MMM, as shown in Figure 50 (in accordance with direction 5196 of gesture 5194), but
5MMM, and the keys in split keyboard portions 5039-A splitting has yet to begin.
and 5039-B rescale in accordance with the size increase [0237] Figure 5QQQ depicts UI 5000QQQ, which
of split keyboard portions 5039-A and 5039-B, respec- shows an instant, subsequent to the instant depicted in
tively. Center area 5039-C also decreases in size as a Figure 5PPP, in the transition from unsplit keyboard 5008
result. 55 to split keyboard 5198. Splitting of unsplit keyboard 5008
[0231] Figure 5NNN depicts UI 5000NNN, where ges- is ongoing, as split keyboard portions 5198-A and 5198-
ture 5190 is detected on display 112. In some embodi- B are displayed and continue to move in direction 5199.
ments, gesture 5190 is a two-thumb tap gesture, with Split keyboard portions 5198-A and 5198-B also move
31
61 EP 2 638 460 B1 62
away from each other in directions 5200-A and 5200-B, respectively.

[0238] Figure 5RRR depicts UI 5000RRR, in which split keyboard 5198 is displayed. In some embodiments, split keyboard 5198 also includes keyboard unsplit key 5018, the details of which are described above and are not repeated here. Gesture 5202, starting from a location on display 112 corresponding to split keyboard 5198 and moving in direction 5204, is detected. In some embodiments, gesture 5202 is a dragging or a flick gesture.

[0239] Figure 5SSS depicts UI 5000SSS, which shows an instant in the transition from split keyboard 5198 to unsplit keyboard 5008. Split keyboard 5198 is moved toward the bottom of display 112, in direction 5206 (in accordance with direction 5204 of gesture 5202), but the merging of split keyboard portions 5198-A and 5198-B has yet to begin.

[0240] Figure 5TTT depicts UI 5000TTT, which shows an instant, subsequent to the instant depicted in Figure 5SSS, in the transition from split keyboard 5198 to unsplit keyboard 5008. Merging of split keyboard portions 5198-A and 5198-B is ongoing, as split keyboard portions 5198-A and 5198-B move toward each other in directions 5208-A and 5208-B, respectively, as well as move in direction 5206. When the merging is complete, unsplit keyboard 5008 is displayed at the bottom of display 112, as in Figure 5OOO.

[0241] Figures 6A-6B are flow diagrams illustrating a method 600 of replacing an unsplit keyboard with an integrated input area in accordance with some embodiments. The method 600 is performed at an electronic device (e.g., device 300, Figure 3, or portable multifunction device 100, Figure 1) with a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 600 may be combined and/or the order of some operations may be changed.

[0242] As described below, the method 600 provides an intuitive way to replace an unsplit keyboard with an integrated input area for text entry. The method is particularly useful when a user is typing with a tablet computer and wants to change from using an unsplit keyboard (e.g., for ten-finger typing when the tablet computer is resting on a solid surface) to using an integrated input area with a split keyboard (e.g., for two-thumb typing when the tablet computer is being held by the user’s remaining eight fingers) or vice versa. The method reduces the cognitive burden on a user when manipulating soft keyboards, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to manipulate a soft keyboard faster and more efficiently conserves power and increases the time between battery charges.

[0243] The device concurrently displays (602) a first text entry area (e.g., in an application content area) and an unsplit keyboard (e.g., a single, unitary, or merged keyboard that includes character keys from the left and right sides of a split keyboard) on the display. Figure 5A, for example, shows text entry area 5002 and unsplit soft keyboard 5008 being displayed concurrently on display 112.

[0244] The device detects (604) a gesture on the touch-sensitive surface. For example, in Figure 5A, gesture 5010 or 5014 is detected on display 112, which is a touch screen.

[0245] In some embodiments, the gesture is (606) a multifinger (i.e., more than one finger) depinch gesture at a location on the touch-sensitive surface that corresponds to the location of the unsplit keyboard on the display. For example, in Figure 5A, gesture 5010 is a two-finger depinch gesture on unsplit soft keyboard 5008. In some embodiments, the two-finger depinch gesture requires symmetric horizontal movement (or movement within a predetermined angle of horizontal, such as 5°, 10°, 15° or 20°) on the touch-sensitive surface. Requiring symmetric horizontal movement helps filter out anchored depinch gestures where only one touch moves, non-horizontal depinch gestures, and other gestures that may not be intended to replace the unsplit keyboard with an integrated input area.

[0246] In some embodiments, the gesture is (608) a tap gesture on a keyboard selection icon (e.g., a finger tap gesture on an icon that toggles between the unsplit keyboard, the integrated input area, and possibly other types of keyboard input areas; a finger tap gesture on an icon that activates replacement of the unsplit keyboard with the integrated input area). For example, in Figure 5A, gesture 5014 is a tap gesture on keyboard split key 5012, which shows an icon of two halves moving apart in a split.

[0247] In response to detecting the gesture on the touch-sensitive surface, the device replaces (610) the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion. For example, in Figures 5A-5C, in response to detection of gesture 5010, unsplit soft keyboard 5008 is replaced with integrated input area 5016. Integrated input area 5016 includes left split keyboard portion 5016-A, right split keyboard portion 5016-B, and center area 5016-C situated between the left and right portions 5016-A, 5016-B.

[0248] In some embodiments, the width of the integrated input area is the same (or substantially the same, e.g., 90% or 95%) as the width of the display. This width makes the left side of the split keyboard more accessible to the left thumb of a user. Similarly, this width makes the right side of the split keyboard more accessible to the right thumb of a user.

[0249] In some embodiments, the integrated input area is visually distinguished from other user interface elements in the display, for example by providing the left portion, center portion, and right portion of the integrated input area with a common distinct shading, background color or pattern, and/or by providing a distinctive border around the left portion, center portion, and right portion of the integrated input area.

[0250] In some embodiments, the integrated input area includes a second text entry area (612). The second text entry area typically displays a duplicate portion of the first text entry area, such as an area near the cursor in the first text entry area. For example, the second text entry area may contain a duplicate of the cursor/insertion point in the first text entry area and one or more words from the most recent text entered by the user adjacent to the cursor/insertion point. For example, integrated input area 5016 includes center area 5016-C. Center area 5016-C displays duplicate input text 5022, making center area 5016-C a second text entry area to the first text entry area 5002.

[0251] In some embodiments, the first text entry area displays (614) text at a first size, and the second text entry area displays a portion of the text in the first text entry area at a second size that is larger than the first size. For example, in Figure 5C, duplicate input text 5022 is displayed in center area 5016-C at a larger size than input text 5006 in text entry area 5002.

[0252] In some embodiments, while displaying the integrated input area, the device detects (616) a gesture (e.g., a tap gesture 5026 on the "T" key 5024, Figure 5D) at a location on the touch-sensitive surface that corresponds to a location of a character key in the split keyboard. In response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the location of the character key in the split keyboard, the device inputs and concurrently displays (618) the corresponding character in the first text entry area and the second text entry area on the display. For example, in Figure 5D, in response to detection of gesture 5026, a character "t" is entered into input text 5006 and duplicate input text 5022.

[0253] In some embodiments, replacing the unsplit keyboard with the integrated input area includes displaying (620) an animation that transitions the unsplit keyboard to the integrated input area. For example, Figure 5B shows an instant in a transition animation from unsplit soft keyboard 5008 to integrated input area 5016.

[0254] In some embodiments, the transition for each character key is a linear interpolation between two states, the unsplit (or merged) state and the split state. In some embodiments, at least one character key is duplicated during the transition so that the left portion of the split keyboard and the right portion of the split keyboard contain at least one common character key (e.g., the "g" keys 5019-A and 5019-B in Figure 5C). In some embodiments, some keys in the unsplit keyboard are not displayed in the split keyboard (e.g., hide keyboard key 5009 in the unsplit keyboard (Figure 5A) is not displayed in the split keyboard (Figure 5C)). In some embodiments, during the animated transition to the integrated input area, keys that are not displayed in the split keyboard appear to be pushed off-screen as they fade to zero opacity during the transition.

[0255] In some embodiments, while displaying the integrated input area, the device detects (622) a second gesture on the touch-sensitive surface (e.g., a gesture 5028 or 5032 on display 112, Figure 5E).

[0256] In some embodiments, the second gesture is (624) a multifinger (i.e., more than one finger) pinch gesture at a location on the touch-sensitive surface that corresponds to the location of the integrated input area on the display. For example, in Figure 5E, gesture 5028 is a two-finger pinch gesture on display 112. In some embodiments, the two-finger pinch gesture requires symmetric horizontal movement (or movement within a predetermined angle of horizontal, such as 5°, 10°, 15° or 20°) on the touch-sensitive surface. Requiring symmetric horizontal movement helps filter out anchored pinch gestures where only one touch moves, non-horizontal pinch gestures, and other gestures that may not be intended to replace the integrated input area with the unsplit keyboard.

[0257] In some embodiments, the second gesture is (626) a tap gesture on a keyboard selection icon (e.g., a finger tap gesture on an icon that toggles between the unsplit keyboard, the integrated input area, and possibly other types of keyboard input areas; a finger tap gesture on an icon that activates replacement of the unsplit keyboard with the integrated input area). For example, in Figure 5E, gesture 5032 is a tap gesture on keyboard unsplit key 5018, which shows an icon of two halves merging together.

[0258] In response to detecting the second gesture on the touch-sensitive surface, the device replaces (628) the integrated input area with the unsplit keyboard. For example, in Figures 5E-5G, in response to gesture 5028 or 5032, integrated input area 5016 is replaced with unsplit keyboard 5008.

[0259] In some embodiments, replacing the integrated input area with the unsplit keyboard includes displaying (630) an animation that transitions the integrated input area to the unsplit keyboard. For example, Figure 5F shows an instant in a transition animation from integrated input area 5016 to unsplit soft keyboard 5008.

[0260] In some embodiments, the electronic device is (632) a portable electronic device (e.g., a tablet computer such as the iPad® device from Apple Inc. of Cupertino, California). For example, device 100 or 300 may be a portable tablet computer.

[0261] In some embodiments, the display is (634) a touch-sensitive display that includes the touch-sensitive surface. For example, display 112 is a touch screen.

[0262] Details in method 600 apply to the methods described below, and are omitted for brevity.

[0263] Figures 7A-7B are flow diagrams illustrating a method 700 of responding to a keyboard selection gesture in accordance with some embodiments. The method 700 is performed at an electronic device (e.g., device 300, Figure 3, or portable multifunction device 100, Figure 1) with a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 700 may be combined and/or the order of some operations may be changed.

[0264] As described below, the method 700 provides a way to prevent accidentally changing soft keyboards while typing (e.g., from an unsplit keyboard to an integrated input area with a split keyboard, or vice versa). The method reduces the cognitive burden on a user when manipulating and using soft keyboards, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to manipulate and use soft keyboards faster and more efficiently conserves power and increases the time between battery charges.

[0265] The device displays (702) a first keyboard on the display, the first keyboard comprising a first plurality of keys (e.g., a split keyboard or, conversely, a single, unitary, or merged keyboard that includes character keys from the left and right portions of the split keyboard). For example, in Figure 5H, unsplit keyboard 5008 is displayed. Unsplit keyboard 5008 includes a plurality of letter keys, an example of which is "T" key 5036.

[0266] The device detects (704) a key activation gesture at a first time at a location on the touch-sensitive surface that corresponds to a location of a first key in the first keyboard (e.g., a tap gesture 5038 on the "T" key 5036, Figure 5H).

[0267] In response to detecting the key activation gesture at the first time, the device activates (706) the first key (e.g., entering a character that corresponds to the first key or performing an action that corresponds to the first key). For example, in Figures 5H-5I, in response to gesture 5038 on "T" key 5036, a character "t" is entered into input text 5006.

[0268] In some embodiments, in response to detecting the key activation gesture at the first time, the device disables (708) a gesture recognizer for the keyboard selection gesture for the predefined period of time. For example, in Figure 5H, in response to detection of gesture 5038, the device disables a gesture recognizer for the keyboard selection gesture for the predefined period of time.

[0269] The device detects (710) one or more contacts on the touch-sensitive surface at a second time after the first time, the one or more contacts corresponding to a keyboard selection gesture. For example, in Figure 5I, gesture 5010, which includes two contacts, is detected.

[0270] In some embodiments, the keyboard selection gesture is (712) a multifinger gesture at a location on the touch-sensitive surface that corresponds to the location of the first keyboard on the display. For example, in Figure 5I, gesture 5010 is a two-finger depinch gesture on unsplit soft keyboard 5008.

[0271] In response to detecting the one or more contacts that correspond to the keyboard selection gesture at the second time after the first time (713), the device replaces (714) the first keyboard with a second keyboard when the second time exceeds a predefined period of time after the first time.

[0272] In some embodiments, replacing the first keyboard with the second keyboard includes displaying (716) an animation that transitions the first keyboard to the second keyboard. For example, Figure 5J shows an instant in a transition animation from unsplit soft keyboard 5008 to integrated input area 5016.

[0273] In some embodiments, although the contacts that correspond to a keyboard selection gesture are detected on the touch-sensitive surface, the keyboard selection gesture is not recognized because the gesture recognizers for the keyboard selection gesture are disabled for a predefined period of time after a key activation gesture is detected.

[0274] In response to detecting the one or more contacts that correspond to the keyboard selection gesture at the second time after the first time, the device maintains (718) display of the first keyboard when the second time is less than the predefined period of time after the first time.

[0275] For example, in response to detecting a multifinger pinch gesture to select a merged keyboard when a split keyboard is currently displayed, the split keyboard is replaced by the merged keyboard if more than a predefined period of time (e.g., 0.3, 0.4, 0.5, 0.6 seconds or some other reasonable period of time) has elapsed since a key in the split keyboard was activated. But the split keyboard remains displayed if less than the predefined period of time has elapsed since a key in the split keyboard was activated, thereby preventing accidentally changing the keyboard when the user is actively typing. Conversely, in response to detecting a multifinger depinch gesture to select a split keyboard when a merged keyboard is currently displayed, the merged keyboard is replaced by the split keyboard if more than the predefined period of time has elapsed since a key in the merged keyboard was activated. But the merged keyboard remains displayed if less than the predefined period of time has elapsed since a key in the merged keyboard was activated.

[0276] As an example, in Figures 5H-5I, if the time period from gesture 5038 to gesture 5010 exceeds the predefined period of time, then unsplit soft keyboard 5008 is replaced with split soft keyboard area 5016, as shown in Figure 5K. If the time period from gesture 5038 to gesture 5010 does not exceed the predefined period of time, then unsplit soft keyboard 5008 remains displayed.

[0277] In some embodiments, the electronic device is (720) a portable electronic device (e.g., a tablet computer). For example, device 100 or 300 may be a portable tablet computer.

[0278] In some embodiments, the display is (722) a touch-sensitive display that includes the touch-sensitive surface. For example, display 112 is a touch screen.
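The timing rule of method 700 above (operations 706-718) can be sketched in code. This is an illustrative sketch only, not an implementation from the patent; the class name, method names, and the 0.5-second constant are hypothetical (the text suggests 0.3-0.6 seconds as reasonable values):

```python
# Illustrative sketch of method 700's timing rule: a keyboard selection
# gesture is honored only if more than a predefined period of time has
# elapsed since the last key activation gesture. All names are hypothetical.

PREDEFINED_PERIOD = 0.5  # seconds; the text suggests 0.3-0.6 s

class KeyboardSelectionFilter:
    def __init__(self, period=PREDEFINED_PERIOD):
        self.period = period
        self.last_key_activation = None  # the "first time" (key activation)

    def on_key_activation(self, t):
        """Record a key activation gesture at time t (cf. operations 706/708)."""
        self.last_key_activation = t

    def on_selection_gesture(self, t):
        """Return True to replace the keyboard (714), False to keep it (718)."""
        if self.last_key_activation is None:
            return True  # no recent typing, so honor the gesture
        return (t - self.last_key_activation) > self.period

f = KeyboardSelectionFilter()
f.on_key_activation(10.0)                     # user taps the "T" key
assert f.on_selection_gesture(10.2) is False  # too soon: keep first keyboard
assert f.on_selection_gesture(10.8) is True   # period elapsed: replace it
```

The same filter covers both directions described in paragraph [0275] (pinch to merge and depinch to split), since only the elapsed time since the last key activation matters.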
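The distance-dependent movement thresholds and vertical-only movement constraint described for method 800 below (operations 808-818) can be sketched as follows. This is an illustrative sketch, with hypothetical function names and constants; the linearly increasing threshold mirrors chart 5058 of Figure 5N:

```python
# Illustrative sketch of method 800's movement rule: a contact moves the
# integrated input area only after its movement exceeds a threshold that
# grows linearly with the contact's horizontal distance from the area's
# vertical centerline, and the area's movement is constrained to the
# vertical axis. All names and constants here are hypothetical.

def movement_threshold(horizontal_distance_from_centerline,
                       base=10.0, slope=0.25):
    """Linearly increasing threshold, as in chart 5058 (Figure 5N)."""
    return base + slope * abs(horizontal_distance_from_centerline)

def vertical_displacement(contact_start, contact_now, centerline_x):
    """Return the vertical displacement to apply to the input area, or 0.0
    if the contact's movement has not yet exceeded its threshold (808/814)."""
    dx = contact_now[0] - contact_start[0]
    dy = contact_now[1] - contact_start[1]
    distance_moved = (dx * dx + dy * dy) ** 0.5
    threshold = movement_threshold(contact_start[0] - centerline_x)
    if distance_moved <= threshold:
        return 0.0  # below threshold: the input area stays put
    return dy      # horizontal component ignored: vertical-only motion (818)

# A contact at the centerline (center area) has a low threshold, so a
# 15-point movement moves the area...
assert vertical_displacement((400, 300), (400, 285), centerline_x=400) == -15
# ...while the same movement far from the centerline (split keyboard portion)
# stays below its larger threshold, so the area does not move.
assert vertical_displacement((600, 300), (600, 285), centerline_x=400) == 0.0
```

The larger threshold under the split keyboard halves is what prevents accidental repositioning while the user is typing with two thumbs, as paragraph [0280] explains.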
[0279] Figures 8A-8B are flow diagrams illustrating a method 800 of moving an integrated input area in accordance with some embodiments. The method 800 is performed at an electronic device (e.g., device 300, Figure 3, or portable multifunction device 100, Figure 1) with a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 800 may be combined and/or the order of some operations may be changed.

[0280] As described below, the method 800 provides a way to prevent accidental movement of an integrated input area when a user contact moves during typing with the split keyboard. The method reduces the cognitive burden on a user when repositioning and using an integrated input area that includes a split keyboard, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to reposition and use a soft keyboard faster and more efficiently conserves power and increases the time between battery charges.

[0281] The device concurrently displays (802) a first text entry area and an integrated input area on the display, the integrated input area including: a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion. For example, in Figure 5L, text entry area 5002 and integrated input area 5039 are displayed on display 112. Integrated input area 5039 includes left split keyboard portion 5039-A, right split keyboard portion 5039-B, and center area 5039-C between left and right split keyboard portions 5039-A and 5039-B.

[0282] The device detects (804) a first contact on the touch-sensitive surface at a location that corresponds to the center portion of the integrated input area. For example, in Figure 5L, a finger contact corresponding to gesture 5040 is detected at position 5040-1 in center area 5039-C.

[0283] The device detects (806) movement of the first contact along the touch-sensitive surface. For example, in Figure 5L, the finger contact in gesture 5040 moves from position 5040-1 to position 5040-2.

[0284] In response to detecting movement of the first contact along the touch-sensitive surface, the device moves (808) the integrated input area in accordance with the movement of the first contact when the movement of the first contact exceeds a first movement threshold. For example, in Figures 5L-5M, when the movement of gesture 5040 exceeds threshold 5042, then integrated input area 5039 moves in accordance with the movement of gesture 5040.

[0285] The device detects (810) a second contact, distinct from the first contact, on the touch-sensitive surface at a location that corresponds to the split keyboard. For example, in Figure 5L, a finger contact corresponding to gesture 5044 is detected at position 5044-1 in right split keyboard portion 5039-B.

[0286] The device detects (812) movement of the second contact along the touch-sensitive surface. For example, in Figure 5L, the finger contact in gesture 5044 moves from position 5044-1 to position 5044-2.

[0287] In response to detecting movement of the second contact along the touch-sensitive surface, the device moves (814) the integrated input area in accordance with the movement of the second contact when the movement of the second contact exceeds a second movement threshold, the second movement threshold being greater than the first movement threshold. For example, in Figures 5L-5M, when the movement of gesture 5044 exceeds threshold distance 5046 (or threshold distance 5048, depending on the implementation), then integrated input area 5039 moves in accordance with the movement of gesture 5044.

[0288] In some embodiments, a respective movement threshold is a function of a horizontal distance of a respective contact from a vertical centerline of the integrated input area (816). For example, in Figure 5N, charts 5050 and 5058 show the threshold distance as a function of distance from the centerline of integrated input area 5039. In some embodiments, the movement threshold increases as the horizontal distance of the contact from the vertical centerline increases. For example, in Figure 5L, threshold distance 5046 in right soft keyboard portion 5039-B is greater than threshold distance 5042 in center area 5039-C. As another example, chart 5058 in Figure 5N shows the threshold distance varying linearly with distance from a centerline of integrated input area 5039.

[0289] In some embodiments, the integrated input area is constrained to vertical movement on the display (e.g., when the width of the integrated input area is the same (or substantially the same, e.g., 90% or 95%) as the width of the display) and the integrated input area moves in accordance with a vertical component of movement of a respective contact when a movement threshold for the respective contact is exceeded (818). For example, in Figures 5L-5M, even with gesture 5044 having an angular movement (and thus having a horizontal component and a vertical component), movement of integrated input area 5039 is constrained to vertical movement. The horizontal movement of gesture 5044 is ignored.

[0290] In some embodiments, the left side of the split keyboard and the right side of the split keyboard maintain fixed positions relative to each other within the integrated input area during movement of the integrated input area (820). For example, in Figures 5L-5M, left and right split keyboard portions 5039-A and 5039-B remain in fixed positions relative to each other during the movement.

[0291] Figure 9 is a flow diagram illustrating a method 900 of moving an input area and adjusting the size of an application content area in accordance with some embodiments. The method 900 is performed at an electronic device (e.g., device 300, Figure 3, or portable multifunction device 100, Figure 1) with a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 900 may be combined and/or the order of some operations may be changed.

[0292] As described below, the method 900 provides a way to increase the size of an application content area (e.g., an application view or window) when an input area with a keyboard is moved from the bottom of a display. The method is particularly useful when a user is typing with a tablet computer with a limited display area because it allows for more of an application to be viewed, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to manipulate a soft keyboard and see more of an application lets the user work faster, which conserves power and increases the time between battery charges.

[0293] The device concurrently displays (902) on the display an application content area with a first size and an input area with a keyboard (e.g., a split keyboard or a merged keyboard), with the input area being adjacent to and separate from the application content area with the first size and the input area being at a bottom of the display. For example, in Figure 5O, application content area 5062 with height 5064 and integrated input area 5039 with height 5065 are displayed. Integrated input area 5039 is docked at the bottom of display 112 and is adjacent to and separate from application content area 5062.

[0294] The device detects (904) a gesture on the touch-sensitive surface (e.g., an upward flick or drag gesture at a location on the touch-sensitive surface that corresponds to the input area at the bottom of the display). For example, in Figure 5O, gesture 5068, which is a dragging gesture, is detected on display 112.

[0295] In response to detecting the gesture on the touch-sensitive surface, the device moves (906) the input area away from the bottom of the display over the application content area, and increases (908) the application content area to a second size larger than the first size. For example, in Figure 5P, in response to detection of gesture 5068, integrated input area 5039 moves away from the bottom of display 112 and over application content area 5062. Additionally, application content area 5062 (Figure 5P) increases in size to a size with height 5066, which is larger than the size of the application content area with height 5064 (Figure 5O).

[0296] In some embodiments, the first size of the application content area has a first height, the second size of the application content area has a second height, the input area has an input area height, and the second height is greater than the first height by an amount equal to (or substantially equal to (e.g., up to 5% or 10% difference)) the input area height (910). For example, in Figure 5O, application content area 5062 has height 5064, and integrated input area 5039 has height 5065. In Figure 5P, application content area 5062 has height 5066, which is the sum of height 5064 and height 5065.

[0297] In other words, when the keyboard is "anchored" at the bottom of the screen, an application treats the keyboard as being an area with non-zero height that cannot be used to display the application content area. Thus, the application reduces the size of its application content area accordingly. But, when the keyboard is moved away from the bottom of the screen (becomes "unanchored" from the bottom of the display), the application treats the keyboard as being an area with zero height (even though the actual displayed height of the input area is non-zero) and so the application increases its content display area to use more of the display (e.g., to use all or substantially all of the display area). When unanchored, the keyboard floats over the application content area. The keyboard moves vertically in response to detecting an upward finger gesture. The keyboard may move with inertia if the velocity upon liftoff of the finger is above a predefined threshold.

[0298] Figures 10A-10B are flow diagrams illustrating a method 1000 of entering characters with a split soft keyboard in accordance with some embodiments. The method 1000 is performed at an electronic device (e.g., device 300, Figure 3, or portable multifunction device 100, Figure 1) with a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 1000 may be combined and/or the order of some operations may be changed.

[0299] As described below, the method 1000 makes two-thumb typing with a split keyboard (e.g., on a tablet computer that is being held by the user’s remaining eight fingers) faster and more efficient by letting a user easily activate certain keys on the right side of a split keyboard with a left thumb (and, conversely, letting a user easily activate certain keys on the left side of a split keyboard with a right thumb), thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a two-thumb typist to enter characters in a split soft keyboard faster and more efficiently conserves power and increases the time between battery charges.

[0300] The device concurrently displays (1002) a text entry area, a left side of a split keyboard, and a right side of a split keyboard, with the left side of the split keyboard including a plurality of rows of keys and the right side of the split keyboard including a corresponding plurality of rows of keys. A row of keys on the left side of the split keyboard and a row of keys on the right side of the keyboard are corresponding if the rows belong in the same row in the unsplit keyboard corresponding to the split keyboard.

[0301] For example, in Figure 5Q, text entry area 5002 and split soft keyboard 5069 are displayed. Split soft keyboard 5069 includes left split keyboard portion 5069-A and right split keyboard portion 5069-B. Split soft keyboard 5069 includes letter keys arranged in accordance with the QWERTY layout. In left portion 5069-A, the top row of letter keys includes "Q," "W," "E," "R," and "T" keys. The middle row of letter keys includes "A," "S," "D," and "F" keys. The lower row of letter keys includes "Z," "X," "C," and "V" keys. In right portion 5069-B, the top row of letter keys includes "Y," "U," "I," "O," and "P" keys. The middle row of letter keys includes "G," "H," "J," "K," and "L" keys. The lower row of letter keys includes "B," "N," and "M" keys. The row with "Q," "W," "E," "R," and "T" keys on left portion 5069-A and the row with "Y," "U," "I," "O," and "P" keys on right portion 5069-B are corresponding because these rows belong to the same top row in an unsplit QWERTY keyboard. Similarly, the row with "A," "S," "D," and "F" keys corresponds to the row with "G," "H," "J," "K," and "L" keys, and the row with "Z," "X," "C," and "V" keys corresponds to the row with "B," "N," and "M" keys. On the other hand, the row with "Q," "W," "E," "R," and "T" keys on left portion 5069-A does not correspond to the row with "G," "H," "J," "K," and "L" keys or to the row with "B," "N," and "M" keys in right portion 5069-B because they do not belong to the same row in an unsplit QWERTY keyboard.

[0302] The device detects (1004) a gesture at a location on the touch-sensitive surface that corresponds to a predefined area adjacent to and to the right of a rightmost key in a respective row of the left side of the split keyboard. For example, in Figure 5R, gesture 5098 is detected on predefined key activation area 5086, which is to the right of "V" key 5074 in the lower row of letter keys in left split keyboard portion 5069-A.

[0303] In some embodiments, the predefined area adjacent to and to the right of the rightmost key in the respective row of the left side of the split keyboard is an undisplayed key activation area that corresponds to the leftmost key in the corresponding respective row of the right side of the split keyboard (1006).

right of the leftmost key in the corresponding respective row of the right side of the split keyboard is entered instead. For example, returning to the lower rows of letter keys in left and right split keyboard portions 5069-A and 5069-B in Figure 5Q, if undisplayed key activation area 5092 corresponding to "V" key 5074 is instead a displayed duplicate "V" key, then in response to detection of a gesture on key activation area 5086 to the right of the rightmost "V" key 5074 in the lower row in left portion 5069-A, a character "b" (corresponding to "B" key 5080) is entered instead of a character "v" corresponding to the duplicate "V" key.

[0306] In some embodiments, the device detects (1010) a gesture at a location on the touch-sensitive surface that corresponds to a predefined area adjacent to and to the left of a leftmost key in a respective row of the right side of the split keyboard. For example, in Figure 5T, gesture 5100 is detected on predefined key activation area 5088, which is to the left of the leftmost "Y" key in the top row of letter keys in right split keyboard portion 5069-B.

[0307] In some embodiments, the predefined area adjacent to and to the left of the leftmost key in the respective row of the right side of the split keyboard is an undisplayed key activation area that corresponds to the rightmost key in the corresponding respective row of the left side of the split keyboard (1012). For example, in Figure 5Q, key activation areas 5088, 5090, and 5092, which are adjacent to and to the left of keys 5076, 5078, and 5080, respectively, correspond to keys 5070, 5072, and 5074, respectively.

[0308] In response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the predefined area adjacent to and to the left of the leftmost key in the respective row of the right side of the split keyboard, the device enters (1014) in the text entry area a character that corresponds to a rightmost key in
Figure 5Q, key activation areas 5082, 5084, and 5086, a corresponding respective row of the left side of the split
which are adjacent to and to right of keys 5070, 5072, keyboard. For example, in Figures 5T-5U, in response
and 5074, respectively, correspond to keys 5076, 5078, 40 to detection of gesture 5100, a character "t" is entered
and 5080, respectively. into input text 5006, as key activation area 5088 corre-
[0304] In response to detecting the gesture at the lo- sponds to "T" key 5070 in left split keyboard portion 5069-
cation on the touch-sensitive surface that corresponds A; "T" key 5070 is the rightmost key in the corresponding
to the predefined area adjacent to and to the right of the row in left split keyboard portion 5069-B.
rightmost key in the respective row of the left side of the 45 [0309] If the rightmost key in the corresponding respec-
split keyboard, the device enters (1008) in the text entry tive row of the left side of the split keyboard is a duplicate
area a character that corresponds to a leftmost key in a of the leftmost key in the respective row of the right side
corresponding respective row of the right side of the split of the split keyboard, then the key adjacent to and to the
keyboard. For example, in Figures 5R-5S, in response left of the rightmost key in the corresponding respective
to detection of gesture 5098, a character "b" is entered 50 row of the left side of the split keyboard is entered instead.
into input text 5006, as key activation area 5086 corre- For example, in the middle rows of letter keys in left and
sponds to "B" key 5080 in right split keyboard portion right split keyboard portions 5069-A and 5069-B in Figure
5069-B; "B" key 5080 is the leftmost key in the corre- 5Q, if undisplayed key activation area 5084 correspond-
sponding row in right split keyboard portion 5069-B. ing to "G" key 5078 is instead a displayed duplicate "G"
[0305] If the leftmost key in the corresponding respec- 55 key, then in response to detection of a gesture on key
tive row of the right side of the split keyboard is a duplicate activation area 5090 to the left of the leftmost "G" key
of the rightmost key in the respective row of the left side 5078 in the middle row in right portion 5069-B, a character
of the split keyboard, then the key adjacent to and to the "f" (corresponding to "F" key 5072) is entered instead of
a character "g" corresponding to the duplicate "G" key.

[0310] In some embodiments, prior to detecting the gesture, the device detects (1016) a key activation gesture at a first time at a location on the touch-sensitive surface that corresponds to a location of a visible key in the split keyboard. For example, in Figures 5Q-5R, prior to detection of gesture 5098, gesture 5096 is detected on "O" key 5094. A character "o" is entered into input text 5006 in response to detection of gesture 5096.

[0311] In response to detecting the gesture at a second time after the first time, the device enters (1018) in the text entry area the character that corresponds to the leftmost key in the corresponding respective row of the right side of the split keyboard when the second time is less than a predefined period of time after the first time, but the device foregoes (1020) entering in the text entry area the character that corresponds to the leftmost key in the corresponding respective row of the right side of the split keyboard when the second time exceeds the predefined period of time after the first time. In some embodiments, the undisplayed key areas are only activatable when the user is actively typing, as determined by detecting activation of visible keys in the split keyboard within a predefined period of time (e.g., 0.5, 1.0, or 2.0 seconds, or some other reasonable period of time) of detecting the gesture in the undisplayed key area. When the gesture in the undisplayed key area is detected after the predefined period of time has elapsed since detecting activation of a visible key, the character corresponding to the undisplayed key is not entered. This prevents accidental text entry of characters that correspond to the undisplayed key areas when the user is not actively typing.

[0312] For example, if the time period between when gesture 5096 is detected and when gesture 5098 is detected is less than the predefined period of time, then a character "b" is entered in response to gesture 5098. On the other hand, if the time period between when gesture 5096 is detected and when gesture 5098 is detected is more than the predefined period of time, then the character "b" is not entered in response to gesture 5098.

[0313] Figures 11A-11D are flow diagrams illustrating a method 1100 of using a center portion of an integrated input area in accordance with some embodiments. The method 1100 is performed at an electronic device (e.g., device 300, Figure 3, or portable multifunction device 100, Figure 1) with a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 1100 may be combined and/or the order of some operations may be changed.

[0314] As described below, the method 1100 provides a way to use the center portion of an integrated input area to make character entry faster and more efficient. The method is particularly useful when a user is performing two-thumb typing with a tablet computer. The method makes additional character entry functions readily accessible to the user's thumbs, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to perform character entry faster and more efficiently conserves power and increases the time between battery charges.

[0315] The device concurrently displays (1102) a first text entry area and an integrated input area, the integrated input area including a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion with a second text entry area, the center portion in between the left portion and the right portion. For example, in Figure 5V, text entry area 5002 and integrated input area 5016 are displayed on display 112. Integrated input area 5016 includes left split keyboard portion 5016-A, right split keyboard portion 5016-B, and center area 5016-C between left and right portions 5016-A and 5016-B. Center area 5016-C serves as a second text entry area, as duplicate cursor 5020 and duplicate input text 5022 are displayed in center area 5016-C.

[0316] The device detects (1104) a gesture at a location on the touch-sensitive surface that corresponds to a location of a character key in the split keyboard (e.g., a tap gesture 5102 on "T" key 5024 on left portion 5016-A, Figure 5V).

[0317] In response to detecting the gesture at the location on the touch-sensitive surface that corresponds to the location of the character key in the split keyboard, the device inputs and concurrently displays (1106) the corresponding character in the first text entry area and the second text entry area on the display. In Figure 5W, in response to detection of gesture 5102 on "T" key 5024, a character "t" is entered into input text 5006 and concurrently displayed. A character "t" is also entered into duplicate input text 5022 and concurrently displayed in center area 5016-C. Having a second text entry area in the center portion of the integrated input area that shows a portion of the text being entered in the first text entry area makes text input faster, more efficient, and less stressful by reducing the amount of eye movement when a user is thumb typing with the split keyboard.

[0318] In some embodiments, the first text entry area displays text at a first size, and the second text entry area displays a portion of the text in the first text entry area at a second size that is larger than the first size (1108). For example, in Figure 5V, duplicate input text 5022 is displayed in center area 5016-C at a larger size than input text 5006 in text entry area 5002.

[0319] In some embodiments, the width of the integrated input area is the same (or substantially the same, e.g., 90% or 95%) as the width of the display (1110). Integrated input area 5016 in Figure 5V, for example, has a width that spans the width of display 112.

[0320] In some embodiments, the left side of the split keyboard and the right side of the split keyboard maintain fixed positions relative to each other within the integrated input area during movement of the integrated input area (1112). Having the left and right portions maintain fixed
positions relative to each other keeps the left and right portions at relative positions on the display that are more familiar to the user, and thus less cognitive readjustment is needed on the part of the user to maintain typing efficiency.

[0321] For example, Figure 5M shows integrated input area 5039 moving in response to detection of a gesture (e.g., gesture 5040). Within integrated input area 5039, left portion 5039-A and right portion 5039-B maintain fixed positions relative to each other during the movement. Integrated input areas 5039 and 5016 are similar, and thus when integrated input area 5016 moves, left and right portions 5016-A and 5016-B maintain fixed positions relative to each other.

[0322] In some embodiments, the second text entry area includes an insertion point that remains stationary on the display as text is entered (1114). A stationary insertion point in the second text entry area provides a stable focal point for the user that helps reduce lateral eye movement. For example, in Figures 5X-5Y, duplicate cursor 5020 is stationary within center area 5016-C; when the position of duplicate cursor 5020 relative to duplicate input text 5022 changes, duplicate input text 5022 is displayed as advancing or retreating relative to duplicate cursor 5020.

[0323] In some embodiments, the second text entry area includes an insertion point (1116). The device detects (1118) a gesture on the touch-sensitive surface at a location that corresponds to the insertion point in the second text entry area, and moves (1120) the insertion point in the second text entry area in accordance with the gesture on the touch-sensitive surface at the location that corresponds to the insertion point in the second text entry area. The insertion point in the first text entry area is also moved in accordance with the gesture on the touch-sensitive surface at the location that corresponds to the insertion point in the second text entry area. For example, a leftward swipe moves the insertion point to the beginning of a word, while a rightward swipe moves the insertion point to the end of a word. Adjusting the insertion point via gestures in the second text entry area has the advantage of requiring the user to move their thumb only a small distance from where they are typing (versus the alternative of reaching to the first text entry area, or a menu bar at the top of the display, each time an adjustment is needed).

[0324] For example, in Figures 5X-5Y, gesture 5102, moving in direction 5103, is detected on duplicate cursor 5020 in center area 5016-C. In response to detection of gesture 5102, duplicate cursor 5020 changes position relative to duplicate input text 5022 within center area 5016-C in accordance with gesture 5102 (and cursor 5004 does the same relative to input text 5006).

[0325] In some embodiments, the device displays (1122) text editing controls (e.g., icons, or graphical user interface objects for selecting, cutting, copying, and/or pasting text) in the center portion of the integrated input area. In some embodiments, the text editing controls in the center portion of the integrated input area have corresponding text editing controls in the first text entry area (1124). For example, the text editing controls in the center portion may be duplicates of text editing controls in the first text entry area. For example, in Figure 5Z, text editing control 5104, corresponding to text editing control 5106, is displayed in text entry area 5002. In some embodiments, the device detects (1126) a gesture on the touch-sensitive surface at a location that corresponds to a text editing control in the center portion of the integrated input area, and executes (1128) a text editing command in accordance with the gesture on the touch-sensitive surface at the location that corresponds to the text editing control. Editing the text via gestures in the center portion of the integrated input area has the advantage of requiring the user to move their thumb only a small distance from where they are typing (versus the alternative of reaching to the first text entry area, or a menu bar at the top of the display, each time an edit function is needed).

[0326] For example, in Figures 5Z-5AA, text editing control 5106, corresponding to a paste operation, is displayed in center area 5016-C. Gesture 5108 on text editing control 5106 is detected in center area 5016-C. In response to detection of gesture 5108, a paste operation is executed; text "ahead" is pasted into duplicate input text 5022 and input text 5006.

[0327] In some embodiments, the device displays (1130) user-selectable input elements (e.g., radio buttons, check boxes, pick lists, time pickers, and/or date pickers) in the center portion of the integrated input area, detects (1132) a gesture on the touch-sensitive surface at a location that corresponds to a user-selectable input element in the center portion of the integrated input area, and, in response to detecting the gesture on the touch-sensitive surface at the location that corresponds to the user-selectable input element, selects (1134) the user-selectable input element. Selecting radio buttons, check boxes, items in pick lists, times, and dates via gestures in the center portion of the integrated input area has the advantage of requiring the user to move their thumb only a small distance from where they are typing each time a selection is needed.

[0328] For example, in Figures 5BB-5CC, form 5110 is displayed in text entry area 5002. Form 5110 includes checkboxes 5112, each respective checkbox corresponding to an option in the form. At least some of checkboxes 5112 are displayed in center area 5016-C as duplicate checkboxes 5116. Gesture 5118 is detected on duplicate checkbox 5116-A in center area 5016-C. In response to detection of gesture 5118, checkbox 5116-A is selected.

[0329] In some embodiments, the device detects (1136) a gesture on the touch-sensitive surface at a location that corresponds to the center portion of the integrated input area (e.g., a tap gesture on an icon to activate a popup menu or a predefined multifinger gesture within the center portion), and, in response to detecting the ges-
ture on the touch-sensitive surface at the location that corresponds to the center portion of the integrated input area, displays (1138) a popup view. Accessing a popup view (e.g., a window or menu in an application) via a gesture in the center portion of the integrated input area has the advantage of requiring the user to move their thumb only a small distance from where they are typing (versus the alternative of reaching to the first text entry area, or a menu bar at the top of the display, each time a popup view is needed).

[0330] For example, in Figures 5DD-5EE, gesture 5120 is detected in center area 5016-C. In response to detection of gesture 5120, popup menu 5124 is displayed. In some embodiments, popup menu 5122, corresponding to popup menu 5124, is displayed in text entry area 5002.

[0331] In some embodiments, the device detects (1140) a plurality of gestures on the touch-sensitive surface at a location that corresponds to the center portion of the integrated input area (e.g., a series of finger strokes and taps that correspond to Chinese, Japanese, Korean, or other characters), and enters (1142) in the first text entry area and the second text entry area a character that corresponds to the plurality of gestures. Exemplary characters include alphabetic characters, numeric characters, symbols, punctuation characters, Arabic script characters, Cyrillic characters, Greek characters, emoji symbols, emoticon symbols, Asian characters (such as sinographs, Japanese kanji, katakana, or hiragana), Devanagari characters, Perso-Arabic characters, Gurmukhi characters, and Hebrew characters. Drawing characters via gestures in the center portion of the integrated input area has the advantage of requiring the user to move their finger only a small distance from where they are typing.

[0332] For example, in Figures 5FF-5GG, one or more gestures 5128 corresponding to handwriting 5126 are detected in center area 5016-C. Character " " corresponding to handwriting 5126 is entered into input text 5006 and duplicate input text 5022.

[0333] In some embodiments, the device detects (1144) a plurality of gestures on the touch-sensitive surface at a location that corresponds to the center portion of the integrated input area (e.g., a series of finger strokes and taps that correspond to a simple drawing), and makes (1146) a drawing in accordance with the plurality of gestures. Drawing via gestures in the center portion of the integrated input area has the advantage of requiring the user to move their finger only a small distance from where they are typing.

[0334] For example, in Figures 5HH-5II, one or more gestures 5132 corresponding to drawing 5130 are detected in center area 5016-C. Drawing 5133, corresponding to drawing 5130, is entered into input text 5006.

[0335] In some embodiments, the device displays (1148) a suggested word in the center portion of the integrated input area (e.g., a word automatically suggested to complete or correct a series of characters displayed in the first text entry area and the second text entry area), detects (1150) a gesture on the touch-sensitive surface at a location that corresponds to the suggested word, and executes (1152) a text editing command in accordance with the gesture on the touch-sensitive surface at the location that corresponds to the suggested word. In some embodiments, tapping the suggested word accepts and inputs the suggested word. In some embodiments, tapping the suggested word (or an X icon) rejects and terminates display of the suggested word. Editing the text via gestures in the center portion of the integrated input area has the advantage of requiring the user to move their thumb only a small distance from where they are typing (versus the alternative of reaching to the first text entry area, or a menu bar at the top of the display, each time an edit function is needed).

[0336] For example, in Figures 5JJ-5KK, suggested word 5134 is displayed in text entry area 5002, and duplicate suggested word 5136, corresponding to suggested word 5134, is displayed in center area 5016-C. Suggested word 5136 is displayed with X icon 5137. Gesture 5138 is detected on X icon 5137 associated with suggested word 5136. In response to detection of gesture 5138, suggested word 5136 is rejected and display of suggested word 5136 and suggested word 5134 is terminated.

[0337] In some embodiments, the device displays (1154) a plurality of emoji characters in the center portion of the integrated input area, detects (1156) a gesture on the touch-sensitive surface at a location that corresponds to an emoji character in the plurality of emoji characters (e.g., a tap gesture on the emoji character), and, in response to detecting the gesture on the touch-sensitive surface at the location that corresponds to the emoji character, inputs and displays (1158) the emoji character in the first text entry area and the second text entry area on the display. Entering emoji characters via gestures in the center portion of the integrated input area has the advantage of requiring the user to move their thumb only a small distance from where they are typing (versus the alternative of reaching to the first text entry area, or a menu bar at the top of the display, each time an emoji character is needed).

[0338] For example, in Figures 5LL-5MM, emoji characters 5140, including emoji character 5140-A, are displayed in text entry area 5002. Duplicate emoji characters 5142 are displayed in center area 5016-C. Duplicate emoji character 5142-A is a duplicate of emoji character 5140-A. Gesture 5144 is detected on duplicate emoji character 5142-A. In response to detection of gesture 5144, emoji character 5140-A is entered into input text 5006 and displayed, and duplicate emoji character 5142-A is entered into duplicate input text 5022 and displayed.

[0339] In some embodiments, the device displays (1160) a plurality of Unicode characters in the center portion of the integrated input area, detects (1162) a gesture on the touch-sensitive surface at a location that corre-
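The center-portion behaviors described in paragraphs [0315] through [0326] (one insertion point driving both text entry areas, a duplicate cursor that stays visually stationary while the duplicated text advances past it, swipe-based insertion-point movement, and edits such as paste applied to both areas at once) can be sketched as a single shared text model. This is a minimal illustrative sketch, not an implementation from the patent; all class, method, and variable names are assumptions.

```python
# Minimal sketch of the mirrored first/second text entry areas: both
# areas render one underlying buffer, and the center-area view keeps
# its duplicate insertion point visually stationary by windowing the
# text around the cursor. Names are illustrative, not from the patent.

class MirroredTextModel:
    def __init__(self, window=10):
        self.text = ""
        self.cursor = 0       # one insertion point drives both views
        self.window = window  # chars shown to the left of the duplicate cursor

    def type_char(self, ch):
        # (1106): a typed character appears in both text entry areas.
        self.text = self.text[:self.cursor] + ch + self.text[self.cursor:]
        self.cursor += 1

    def swipe(self, direction):
        # [0323]: leftward swipe -> start of word; rightward -> end of word.
        if direction == "left":
            self.cursor = self.text.rfind(" ", 0, self.cursor) + 1
        else:
            i = self.text.find(" ", self.cursor)
            self.cursor = len(self.text) if i == -1 else i

    def paste(self, clip):
        # [0326]: a paste triggered in the center portion updates the
        # underlying buffer, so both areas show the pasted text.
        self.text = self.text[:self.cursor] + clip + self.text[self.cursor:]
        self.cursor += len(clip)

    def center_view(self):
        # [0322]: the duplicate cursor ("|") stays put on screen; the
        # duplicated text advances or retreats relative to it.
        return self.text[max(0, self.cursor - self.window):self.cursor] + "|"

m = MirroredTextModel()
for ch in "the year ":
    m.type_char(ch)
m.paste("ahead")
print(m.text)           # the year ahead
print(m.center_view())  # year ahead|
```

Rendering both areas from one buffer keeps them consistent by construction, which matches the patent's description of the second area as a duplicate of a portion of the first.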
touch-sensitive surface at a location that corresponds to the input area on the display. For example, in Figure 5WW, gesture 5168 is detected on display 112. Gesture 5168 starts from a location corresponding to soft keyboard 5164.

[0352] The device, in response to detecting the flick gesture, moves (1210) the input area on the display with inertia in accordance with the flick gesture such that the input area comes to rest at a location adjacent to and just below a text entry area in the application content area.

[0353] In other words, when a drag gesture is applied to the input area, the input area tracks (follows) the movement of the finger making the drag gesture. When lift-off of the finger making the drag gesture is detected, the movement of the input area stops. In contrast, when a flick gesture is applied to the input area, the input area is 'thrown' in the direction of the flick gesture with some simulated inertia and friction, so the input area does not stop at a location that corresponds to where lift-off of the finger making the flick gesture occurred. Instead, the input area continues to move in the direction of the flick gesture, gradually slows down, and comes to rest at a location adjacent to and just below a text entry area in the application. Thus, an imprecise flick gesture results in automatic, precise placement of a keyboard just below a text entry area in an application, whereas a more precise drag gesture enables a user to manually position the input area.

[0354] For example, in Figures 5WW-5XX, in response to gesture 5168, soft keyboard 5164 moves with trajectory 5170, including movement inertia, in accordance with gesture 5168. Soft keyboard 5164 comes to rest adjacent to and below text entry area 5162-F; soft keyboard 5164 docks just below text entry area 5162-F.

[0355] In some embodiments, moving the input area on the display with inertia in accordance with the flick gesture includes calculating (1212) a trajectory of the input area based on the flick gesture, searching (1214) for one or more text entry areas in the application content area that meet predefined candidate criteria, and, when one or more candidate text entry areas are found, identifying (1216) a respective candidate text entry area as the text entry area that the input area will come to rest adjacent to and just below and adjusting the trajectory accordingly. For example, in Figures 5WW-5XX, trajectory 5170 with termination point 5171 is calculated based on gesture 5168. One or more text entry areas 5162 in application content area 5160 that meet predefined criteria (e.g., within a predefined distance from termination point 5171) are searched. Among the candidate text entry areas that are found, one (e.g., 5162-F) is selected as the text entry area under which soft keyboard 5164 will come to rest. Trajectory 5170 is adjusted so that soft keyboard 5164 comes to rest under text entry area 5162-F.

[0356] As another example, in Figures 5YY-5ZZ, trajectory 5174 with termination point 5175 is calculated based on flick gesture 5172. One or more text entry areas 5162 in application content area 5160 that meet predefined criteria (e.g., within a predefined distance from termination point 5175) are searched. Among the candidate text entry areas that are found, one (e.g., 5162-A) is selected as the text entry area under which soft keyboard 5164 will come to rest. Trajectory 5174 is adjusted so that soft keyboard 5164 comes to rest under text entry area 5162-A.

[0357] In some embodiments, identifying one of the candidate text entry areas as the text entry area that the input area will come to rest adjacent to and just below includes selecting (1218) a respective candidate text entry area that is closest to a termination point of the trajectory as the text entry area that the input area will come to rest adjacent to and just below. For example, in Figure 5YY, text entry area 5162-A is selected as the text entry area under which soft keyboard 5164 comes to rest; text entry area 5162-A is closest to termination point 5175. In Figure 5ZZ, soft keyboard 5164 comes to rest adjacent to and under text entry area 5162-A.

[0358] In some embodiments, the respective candidate text entry area is identified as the text entry area that the input area will come to rest adjacent to and just below based on proximity of the respective candidate text entry area to a termination point of the trajectory (1220). For example, in Figures 5YY-5ZZ, text entry area 5162-A is selected as the text entry area under which soft keyboard 5164 comes to rest based on the fact that it is the closest among text entry areas 5162 to termination point 5175.

[0359] In some embodiments, the trajectory is calculated based on simulated physical properties of the input area (1222). For example, one or more simulated physical properties are associated with application content area 5160. Examples of simulated physical properties include properties that affect the motion of an object, such as density, friction coefficient, and so forth. Values for the properties are predefined. A trajectory (e.g., trajectory 5170) is calculated based on the properties and the corresponding gesture (e.g., gesture 5168).

[0360] In some embodiments, the candidate criteria are met for a respective text entry area if the respective text entry area is within a predefined distance of a termination point of the trajectory (1224). For example, in Figure 5YY, text entry areas that are within a predefined distance (e.g., distance 5177) from termination point 5175 are identified as candidate text entry areas under which soft keyboard 5164 may come to rest in accordance with trajectory 5174. Text entry areas 5162-A and 5162-B are within distance 5177 and are thus identified as candidate text entry areas.

[0361] Figures 13A-13B are flow diagrams illustrating a method 1300 of reconfiguring an integrated input area in accordance with some embodiments. The method 1300 is performed at an electronic device (e.g., device 300, Figure 3, or portable multifunction device 100, Figure 1) with a display and a touch-sensitive surface. In
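The inertial placement of paragraphs [0352] through [0360] (calculate a termination point for the flick trajectory, search for candidate text entry areas within a predefined distance of that point, and adjust the trajectory so the keyboard docks just below the closest candidate) can be sketched as follows. The friction model, the constants, and all names are illustrative assumptions, not values taken from the patent.

```python
# Illustrative sketch of flick-with-inertia keyboard placement:
# 1) derive a trajectory termination point from the flick velocity and
#    simulated friction (1212);
# 2) collect candidate text entry areas within a predefined distance
#    of that point (1214, candidate criteria of [0360]);
# 3) snap to the candidate closest to the termination point (1218).

FRICTION = 2.0              # 1/s, simulated deceleration constant (assumed)
CANDIDATE_DISTANCE = 120.0  # points; plays the role of distance 5177

def termination_point(start_y, velocity_y, friction=FRICTION):
    # With exponential velocity decay, total travel is velocity/friction.
    return start_y + velocity_y / friction

def rest_position(start_y, velocity_y, text_entry_bottoms):
    """text_entry_bottoms: y coordinates of the bottom edges of the
    text entry areas; returns the y where the keyboard comes to rest."""
    term = termination_point(start_y, velocity_y)
    # Candidate criteria: within a predefined distance of the termination point.
    candidates = [b for b in text_entry_bottoms
                  if abs(b - term) <= CANDIDATE_DISTANCE]
    if candidates:
        # Adjust the trajectory so the keyboard docks just below the
        # candidate closest to the termination point.
        return min(candidates, key=lambda b: abs(b - term))
    return term  # no candidate found: rest at the raw termination point

# Flick upward from y=900 at -800 pt/s; areas end at y=420, 520, 760.
print(rest_position(900, -800, [420, 520, 760]))  # 520
```

In the example, the raw termination point is y=500; areas at 420 and 520 fall within the candidate distance, and the closer one (520) wins, mirroring how text entry area 5162-A is chosen over 5162-B in Figures 5YY-5ZZ.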
[0371] Analogously, in some embodiments, movement by the right thumb toward the right vertical side of the display and away from the right vertical side of the display decreases and increases, respectively, the size of the right side of the split keyboard, as shown in Figures 5FFF-5III.

[0372] In some embodiments, the second input includes a horizontal movement of the first thumb towards a vertical side of the display closest to the first thumb (e.g., moving the left thumb towards the left vertical side of the display or moving the right thumb towards the right vertical side of the display), and in response to detecting the horizontal movement of the first thumb towards the vertical side of the display closest to the first thumb, the device reduces the size of the left side and the right side of the split keyboard (1320). In some embodiments, movement by just one thumb concurrently reduces the size of both the left side and the right side of the split keyboard. For example, in Figures 5JJJ-5KKK, gesture 5186 is detected on display 112. Gesture 5186 includes a left thumb moving toward the left vertical side of display 112. In response to detection of gesture 5186, left split keyboard portion 5039-A and right split keyboard portion 5039-B are concurrently reduced in size.

[0373] In some embodiments, the left edge of the left side of the split keyboard maintains its position (which is typically near the left vertical side of the display) as the left side of the split keyboard is reduced. Thus, the right edge of the left side of the split keyboard moves closer to the left vertical side of the display as the left side of the split keyboard is reduced. This makes it easier for the left thumb to reach the keys near the right edge of the left side of the split keyboard and eliminates the need for the user to reposition the left edge of the left side of the keyboard after the left side of the keyboard is reduced. Similarly, in some embodiments, the right edge of the right side of the split keyboard maintains its position (which is typically near the right vertical side of the display) as the right side of the split keyboard is reduced. Thus, the left edge of the right side of the split keyboard

increases the size of both the left side and the right side of the split keyboard. For example, in Figures 5LLL-5MMM, gesture 5188 is detected on display 112. Gesture 5188 includes a left thumb moving away from the left vertical side of display 112. In response to detection of gesture 5188, left split keyboard portion 5039-A and right split keyboard portion 5039-B increase in size.

[0375] Figure 14 is a flow diagram illustrating a method 1400 of automatically converting between an unsplit keyboard and a split keyboard in accordance with some embodiments. The method 1400 is performed at an electronic device (e.g., device 300, Figure 3, or portable multifunction device 100, Figure 1) with a display and a touch-sensitive surface. In some embodiments, the display is a touch screen display and the touch-sensitive surface is on the display. In some embodiments, the display is separate from the touch-sensitive surface. Some operations in method 1400 may be combined and/or the order of some operations may be changed.

[0376] As described below, the method 1400 provides an efficient way to manipulate a soft keyboard into a position and configuration that best suits the user's needs. The method is particularly useful when the user alternates between placing a tablet computer on a surface (where typing on an unsplit keyboard at the bottom of the display is more efficient) and holding the tablet computer with both hands (where two-thumb typing on a split keyboard away from the bottom of the display is more efficient). The method reduces the cognitive burden on a user when configuring the soft keyboard for use, thereby creating a more efficient human-machine interface. For battery-operated electronic devices, enabling a user to configure a soft keyboard faster and more efficiently conserves power and increases the time between battery charges.

[0377] The device concurrently displays (1402) on the display an application content area, and an unsplit keyboard (e.g., a single, unitary, or merged keyboard that includes character keys from the left and right sides of the split keyboard). The unsplit keyboard is located at
moves closer to the right vertical side of the display as the bottom of the display. For example, in Figure 5OOO,
the right side of the split keyboard is reduced. This makes text entry area 5002 (e.g., of a notes application) and
it easier for the right thumb to reach the keys near the unsplit keyboard 5008 are displayed on display 112. Un-
left edge of the right side of the split keyboard and elim- split keyboard 5008 is displayed at the bottom of display
inates the need for the user to reposition the right edge 45 112.
of the right side of the keyboard after the right side of the [0378] The device detects (1404) a first gesture on the
keyboard is reduced. touch-sensitive surface (e.g., an upward flick or drag ges-
[0374] In some embodiments, the second input in- ture at a location on the touch-sensitive surface that cor-
cludes a horizontal movement of the first thumb away responds to the unsplit keyboard at the bottom of the
from a vertical side of the display closest to the first thumb 50 display, or an upward drag gesture that starts at a location
(e.g., moving the left thumb away from the left vertical on the touch-sensitive surface that corresponds to a lo-
side of the display or moving the right thumb away from cation of a predefined key on the unsplit keyboard, such
the right vertical side of the display), and in response to as key 5012 in Figure 5OOO). For example, in Figure
detecting the horizontal movement of the first thumb 5OOO, gesture 5194, with upward movement 5196 away
away from the vertical side of the display closest to the 55 from the bottom of display 112, is detected on display
first thumb, the device increases (1322) the size of the 112.
left side and the right side of the split keyboard. In some [0379] In response to detecting the first gesture on the
embodiments, movement by just one thumb concurrently touch-sensitive surface, the device converts the unsplit
keyboard into a split keyboard and moves the split keyboard away from the bottom of the display over the application content area in accordance with the first gesture (1406). For example, in response to the detection of gesture 5194, unsplit keyboard 5008 is converted to split keyboard 5198, and split keyboard 5198 is moved away from the bottom of display 112, as shown in Figures 5PPP-5RRR. In some embodiments, the application content area remains the same size before and after the first gesture, as shown in Figure 5OOO. In some embodiments, in response to detecting the first gesture, the application content area changes from a first size to a second size larger than the first size, as described above with respect to method 900.

[0380] In some embodiments, in response to detecting the first gesture on the touch-sensitive surface and prior to converting the unsplit keyboard into the split keyboard, the device moves (1408) the unsplit keyboard away from the bottom of the display over the application content area in accordance with the first gesture. For example, in Figure 5PPP, unsplit keyboard 5008 is moved away from the bottom of display 112 prior to the start of the split into split keyboard portions 5198-A and 5198-B.

[0381] In some embodiments, in response to detecting the first gesture on the touch-sensitive surface, the device displays (1410) an animation of the unsplit keyboard converting into the split keyboard while moving the keyboard away from the bottom of the display. For example, Figures 5PPP and 5QQQ illustrate instants in an animated transition from unsplit keyboard 5008 to split keyboard 5198.

[0382] In some embodiments, while concurrently displaying on the display the application content area, and the split keyboard, the split keyboard being located away from the bottom of the display, the device detects a second gesture on the touch-sensitive surface (e.g., a downward flick or drag gesture at a location on the touch-sensitive surface that corresponds to the split keyboard, or a downward drag gesture that starts at a location on the touch-sensitive surface that corresponds to a location of a predefined key on the split keyboard, such as key 5018 in Figure 5RRR). In response to detecting the second gesture on the touch-sensitive surface, the device converts the split keyboard into the unsplit keyboard and moves the unsplit keyboard to the bottom of the display in accordance with the second gesture (1412). For example, in Figure 5RRR, split keyboard 5198 is displayed on display 112. Split keyboard 5198 is displayed away from the bottom of display 112. Gesture 5202, with downward movement 5204 toward the bottom of display 112, is detected on display 112. In response to the detection of gesture 5202, split keyboard 5198 is converted to unsplit keyboard 5008, and unsplit keyboard 5008 is moved to the bottom of display 112, as shown in Figures 5SSS-5TTT and 5OOO.

[0383] In some embodiments, in response to detecting the second gesture on the touch-sensitive surface and prior to converting the split keyboard into the unsplit keyboard, the device moves (1414) the split keyboard towards the bottom of the display over the application content area in accordance with the second gesture. For example, in Figure 5SSS, split keyboard 5198 is moved toward the bottom of display 112 prior to the start of the merging of split keyboard portions 5198-A and 5198-B.

[0384] In some embodiments, in response to detecting the second gesture on the touch-sensitive surface, the device displays (1416) an animation of the split keyboard converting into the unsplit keyboard while moving the keyboard towards the bottom of the display. For example, Figures 5SSS and 5TTT illustrate instants in an animated transition from split keyboard 5198 to unsplit keyboard 5008.

[0385] In accordance with some embodiments, Figure 15 shows a functional block diagram of an electronic device 1500 configured in accordance with the principles of the invention as described above. The functional blocks of the device may be implemented by hardware, software, or a combination of hardware and software to carry out the principles of the invention. It is understood by persons of skill in the art that the functional blocks described in Figure 15 may be combined or separated into sub-blocks to implement the principles of the invention as described above. Therefore, the description herein may support any possible combination or separation or further definition of the functional blocks described herein.

[0386] As shown in Figure 15, an electronic device 1500 includes a display unit 1502 configured to concurrently display a first text entry area and an unsplit keyboard, a touch-sensitive surface unit 1504 configured to receive user gestures, and a processing unit 1506 coupled to the display unit 1502 and the touch-sensitive surface unit 1504. In some embodiments, the processing unit 1506 includes a detecting unit 1508, a replacing unit 1510, an inputting unit 1512, and a display enabling unit 1514.

[0387] The processing unit 1506 is configured to detect a gesture on the touch-sensitive surface unit 1504 (e.g., with the detecting unit 1508), and, in response to detecting the gesture on the touch-sensitive surface unit 1504, replace the unsplit keyboard with an integrated input area (e.g., with the replacing unit 1510). The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion.

[0388] In some embodiments, the integrated input area includes a second text entry area.

[0389] In some embodiments, the first text entry area is a text entry area that displays text at a first size, and the second text entry area is a text entry area that displays a portion of the text in the first text entry area at a second size that is larger than the first size.

[0390] In some embodiments, the processing unit 1506 is configured to: while displaying the integrated input area, detect a gesture at a location on the touch-
sensitive surface unit 1504 that corresponds to a location of a character key in the split keyboard (e.g., with the detecting unit 1508); and, in response to detecting the gesture at the location on the touch-sensitive surface unit 1504 that corresponds to the location of the character key in the split keyboard, input and enable concurrent display of the corresponding character in the first text entry area and the second text entry area on the display unit 1502 (e.g., with the inputting unit 1512 and the display enabling unit 1514).

[0391] In some embodiments, the electronic device 1500 is a portable electronic device.

[0392] In some embodiments, the display unit 1502 is a touch-sensitive display unit that includes a touch-sensitive surface unit 1504.

[0393] In some embodiments, the gesture is a multifinger depinch gesture at a location on the touch-sensitive surface unit 1504 that corresponds to the location of the unsplit keyboard on the display unit 1502.

[0394] In some embodiments, the gesture is a tap gesture on a keyboard selection icon.

[0395] In some embodiments, replacing the unsplit keyboard with the integrated input area includes enabling display of an animation that transitions the unsplit keyboard to the integrated input area.

[0396] In some embodiments, the processing unit 1506 is configured to: while displaying the integrated input area, detect a second gesture on the touch-sensitive surface unit 1504 (e.g., with the detecting unit 1508); and, in response to detecting the second gesture on the touch-sensitive surface unit 1504, replace the integrated input area with the unsplit keyboard (e.g., with the replacing unit 1510).

[0397] In some embodiments, the second gesture is a multifinger pinch gesture at a location on the touch-sensitive surface unit 1504 that corresponds to the location of the integrated input area on the display unit 1502.

[0398] In some embodiments, the second gesture is a tap gesture on a keyboard selection icon.

[0399] In some embodiments, replacing the integrated input area with the unsplit keyboard includes enabling display of an animation that transitions the integrated input area to the unsplit keyboard.

[0400] In accordance with some embodiments, Figure 16 shows a functional block diagram of an electronic device 1600 configured in accordance with the principles of the invention as described above. The functional blocks of the device may be implemented by hardware, software, or a combination of hardware and software to carry out the principles of the invention. It is understood by persons of skill in the art that the functional blocks described in Figure 16 may be combined or separated into sub-blocks to implement the principles of the invention as described above. Therefore, the description herein may support any possible combination or separation or further definition of the functional blocks described herein.

[0401] As shown in Figure 16, an electronic device 1600 includes a display unit 1602 configured to display a first keyboard, the first keyboard including a first plurality of keys; a touch-sensitive surface unit 1604 configured to receive user gestures; and a processing unit 1606 coupled to the display unit 1602 and the touch-sensitive surface unit 1604. In some embodiments, the processing unit 1606 includes a detecting unit 1608, an activating unit 1610, a replacing unit 1612, a maintaining unit 1614, and a disabling unit 1616.

[0402] The processing unit 1606 is configured to detect a key activation gesture at a first time at a location on the touch-sensitive surface unit 1604 that corresponds to a location of a first key in the first keyboard (e.g., with the detecting unit 1608); in response to detecting the key activation gesture at the first time, activate the first key (e.g., with the activating unit 1610); detect one or more contacts on the touch-sensitive surface unit 1604 at a second time after the first time, the one or more contacts corresponding to a keyboard selection gesture (e.g., with the detecting unit 1608); and in response to detecting the one or more contacts that correspond to the keyboard selection gesture at the second time after the first time: replace the first keyboard with a second keyboard on the display unit 1602 when the second time exceeds a predefined period of time after the first time (e.g., with the replacing unit 1612); and maintain display of the first keyboard on the display unit 1602 when the second time is less than the predefined period of time after the first time (e.g., with the maintaining unit 1614).

[0403] In some embodiments, the processing unit 1606 is configured to: in response to detecting the key activation gesture at the first time, disable a gesture recognizer for the keyboard selection gesture for the predefined period of time (e.g., with the disabling unit 1616).

[0404] In some embodiments, the electronic device 1600 is a portable electronic device.

[0405] In some embodiments, the display unit 1602 is a touch-sensitive display unit that includes the touch-sensitive surface unit 1604.

[0406] In some embodiments, the keyboard selection gesture is a multifinger gesture at a location on the touch-sensitive surface unit 1604 that corresponds to the location of the first keyboard on the display unit 1602.

[0407] In some embodiments, replacing the first keyboard with the second keyboard includes enabling display of an animation that transitions the first keyboard to the second keyboard.

[0408] In accordance with some embodiments, Figure 17 shows a functional block diagram of an electronic device 1700 configured in accordance with the principles of the invention as described above. The functional blocks of the device may be implemented by hardware, software, or a combination of hardware and software to carry out the principles of the invention. It is understood by persons of skill in the art that the functional blocks described in Figure 17 may be combined or separated into sub-blocks to implement the principles of the invention as described above. Therefore, the description here-
in may support any possible combination or separation or further definition of the functional blocks described herein.

[0409] As shown in Figure 17, an electronic device 1700 includes a display unit 1702 configured to concurrently display a first text entry area and an integrated input area, the integrated input area including a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion; a touch-sensitive surface unit 1704 configured to receive user contacts and movements of the user contacts; and a processing unit 1706 coupled to the display unit 1702 and the touch-sensitive surface unit 1704. In some embodiments, the processing unit 1706 includes a detecting unit 1708, and a moving unit 1710.

[0410] The processing unit 1706 is configured to detect a first contact on the touch-sensitive surface unit 1704 at a location that corresponds to the center portion of the integrated input area (e.g., with the detecting unit 1708); detect movement of the first contact along the touch-sensitive surface unit 1704 (e.g., with the detecting unit 1708); in response to detecting the movement of the first contact along the touch-sensitive surface unit 1704, move the integrated input area on the display unit 1702 in accordance with the movement of the first contact when the movement of the first contact exceeds a first movement threshold (e.g., with the moving unit 1710); detect a second contact, distinct from the first contact, on the touch-sensitive surface unit 1704 at a location that corresponds to the split keyboard (e.g., with the detecting unit 1708); detect movement of the second contact along the touch-sensitive surface unit 1704 (e.g., with the detecting unit 1708); and, in response to detecting the movement of the second contact along the touch-sensitive surface unit 1704, move the integrated input area on the display unit 1702 in accordance with the movement of the second contact when the movement of the second contact exceeds a second movement threshold, the second movement threshold being greater than the first movement threshold (e.g., with the moving unit 1710).

[0411] In some embodiments, a respective movement threshold is a function of a horizontal distance of a respective contact from a vertical centerline of the integrated input area.

[0412] In some embodiments, the integrated input area is constrained to vertical movement on the display unit 1702 and the integrated input area moves in accordance with a vertical component of movement of a respective contact when a movement threshold for the respective contact is exceeded.

[0413] In some embodiments, the left side of the split keyboard and the right side of the split keyboard maintain fixed positions relative to each other within the integrated input area during movement of the integrated input area.

[0414] In accordance with some embodiments, Figure 18 shows a functional block diagram of an electronic device 1800 configured in accordance with the principles of the invention as described above. The functional blocks of the device may be implemented by hardware, software, or a combination of hardware and software to carry out the principles of the invention. It is understood by persons of skill in the art that the functional blocks described in Figure 18 may be combined or separated into sub-blocks to implement the principles of the invention as described above. Therefore, the description herein may support any possible combination or separation or further definition of the functional blocks described herein.

[0415] As shown in Figure 18, an electronic device 1800 includes a display unit 1802 configured to concurrently display an application content area with a first size, and an input area with a keyboard, the input area being adjacent to and separate from the application content area with the first size, the input area being at a bottom of the display unit 1802; a touch-sensitive surface unit 1804 configured to receive user gestures; and a processing unit 1806 coupled to the display unit 1802 and the touch-sensitive surface unit 1804. In some embodiments, the processing unit 1806 includes a detecting unit 1808, a moving unit 1810, and an increasing unit 1812.

[0416] The processing unit 1806 is configured to detect a gesture on the touch-sensitive surface unit 1804 (e.g., with the detecting unit 1808); and, in response to detecting the gesture on the touch-sensitive surface unit 1804: move the input area away from the bottom of the display unit 1802 over the application content area (e.g., with the moving unit 1810), and increase a size of the application content area to a second size larger than the first size (e.g., with the increasing unit 1812).

[0417] In some embodiments, the first size of the application content area has a first height, the second size of the application content area has a second height, the input area has an input area height, and the second height is greater than the first height by an amount equal to the input area height.

[0418] In accordance with some embodiments, Figure 19 shows a functional block diagram of an electronic device 1900 configured in accordance with the principles of the invention as described above. The functional blocks of the device may be implemented by hardware, software, or a combination of hardware and software to carry out the principles of the invention. It is understood by persons of skill in the art that the functional blocks described in Figure 19 may be combined or separated into sub-blocks to implement the principles of the invention as described above. Therefore, the description herein may support any possible combination or separation or further definition of the functional blocks described herein.

[0419] As shown in Figure 19, an electronic device 1900 includes a display unit 1902 configured to concurrently display a text entry area, a left side of a split keyboard, and a right side of a split keyboard, the left side of the split keyboard including a plurality of rows of keys and the right side of the split keyboard including a corre-
sponding plurality of rows of keys; a touch-sensitive surface unit 1904 configured to receive user gestures; and a processing unit 1906 coupled to the display unit 1902 and the touch-sensitive surface unit 1904. In some embodiments, the processing unit 1906 includes a detecting unit 1908, an entering unit 1910, and a foregoing unit 1912.

[0420] The processing unit 1906 is configured to detect a gesture at a location on the touch-sensitive surface unit 1904 that corresponds to a predefined area adjacent to and to the right of a rightmost key in a respective row of the left side of the split keyboard (e.g., with the detecting unit 1908); and, in response to detecting the gesture at the location on the touch-sensitive surface unit 1904 that corresponds to the predefined area adjacent to and to the right of the rightmost key in the respective row of the left side of the split keyboard, enter in the text entry area a character that corresponds to a leftmost key in a corresponding respective row of the right side of the split keyboard (e.g., with the entering unit 1910).

[0421] In some embodiments, the predefined area adjacent to and to the right of the rightmost key in the respective row of the left side of the split keyboard is an undisplayed key activation area that corresponds to the leftmost key in the corresponding respective row of the right side of the split keyboard.

[0422] In some embodiments, the processing unit 1906 is configured to detect a gesture at a location on the touch-sensitive surface unit 1904 that corresponds to a predefined area adjacent to and to the left of a leftmost key in a respective row of the right side of the split keyboard (e.g., with the detecting unit 1908); and, in response to detecting the gesture at the location on the touch-sensitive surface unit 1904 that corresponds to the predefined area adjacent to and to the left of the leftmost key in the respective row of the right side of the split keyboard, enter in the text entry area a character that corresponds to a rightmost key in a corresponding respective row of the left side of the split keyboard (e.g., with the entering unit 1910).

[0423] In some embodiments, the predefined area adjacent to and to the left of the leftmost key in the respective row of the right side of the split keyboard is an undisplayed key activation area that corresponds to the rightmost key in the corresponding respective row of the left side of the split keyboard.

[0424] In some embodiments, the processing unit 1906 is configured to: prior to detecting the gesture, detect a key activation gesture at a first time at a location on the touch-sensitive surface unit 1904 that corresponds to a location of a visible key in the split keyboard (e.g., with the detecting unit 1908); and, in response to detecting the gesture at a second time after the first time: enter in the text entry area the character that corresponds to the leftmost key in the corresponding respective row of the right side of the split keyboard when the second time is less than a predefined period of time after the first time (e.g., with the entering unit 1910); and forego entering in the text entry area the character that corresponds to the leftmost key in the corresponding respective row of the right side of the split keyboard when the second time exceeds the predefined period of time after the first time (e.g., with the foregoing unit 1912).

[0425] In accordance with some embodiments, Figure 20 shows a functional block diagram of an electronic device 2000 configured in accordance with the principles of the invention as described above. The functional blocks of the device may be implemented by hardware, software, or a combination of hardware and software to carry out the principles of the invention. It is understood by persons of skill in the art that the functional blocks described in Figure 20 may be combined or separated into sub-blocks to implement the principles of the invention as described above. Therefore, the description herein may support any possible combination or separation or further definition of the functional blocks described herein.

[0426] As shown in Figure 20, an electronic device 2000 includes a display unit 2002 configured to concurrently display a first text entry area and an integrated input area, the integrated input area including a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion with a second text entry area, the center portion in between the left portion and the right portion; a touch-sensitive surface unit 2004 configured to receive user gestures; and a processing unit 2006 coupled to the display unit 2002 and the touch-sensitive surface unit 2004. In some embodiments, the processing unit 2006 includes a detecting unit 2008, a display enabling unit 2010, an inputting unit 2012, a moving unit 2014, an executing unit 2016, a selecting unit 2018, a making unit 2020, a formatting unit 2022, and an entering unit 2024.

[0427] The processing unit 2006 is configured to: detect a gesture at a location on the touch-sensitive surface unit 2004 that corresponds to a location of a character key in the split keyboard (e.g., with the detecting unit 2008); and, in response to detecting the gesture at the location on the touch-sensitive surface unit 2004 that corresponds to the location of the character key in the split keyboard, input and enable concurrent display of the corresponding character in the first text entry area and the second text entry area on the display unit 2002 (e.g., with the inputting unit 2012 and the display enabling unit 2010).

[0428] In some embodiments, the first text entry area displays text at a first size, and the second text entry area displays a portion of the text in the first text entry area at a second size that is larger than the first size.

[0429] In some embodiments, the width of the integrated input area is the same as the width of the display unit 2002.

[0430] In some embodiments, the left side of the split keyboard and the right side of the split keyboard maintain fixed positions relative to each other within the integrated input area during movement of the integrated input area.
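The "undisplayed key activation area" behavior of paragraphs [0420]-[0423] above can be sketched as follows. This is an illustrative model only: the QWERTY row split, the function name, and the data layout are assumptions for the sketch, not taken from the patent.

```python
# Sketch of the undisplayed key-activation areas in [0420]-[0423]:
# a tap in the predefined gap just right of a row on the left half
# enters the character of the LEFTMOST key in the matching row of the
# right half; a tap just left of a row on the right half enters the
# character of the RIGHTMOST key in the matching row of the left half.
# The row contents below are an assumed QWERTY split.

LEFT_ROWS = ["qwert", "asdfg", "zxcvb"]
RIGHT_ROWS = ["yuiop", "hjkl", "nm"]

def gap_tap_character(side, row_index):
    """Character entered for a tap in the predefined area beyond the
    inner edge of the given row on the given side of the split keyboard."""
    if side == "left":                   # gap right of the left half's row
        return RIGHT_ROWS[row_index][0]  # leftmost key of right half's row
    return LEFT_ROWS[row_index][-1]      # rightmost key of left half's row

print(gap_tap_character("left", 0))   # 'y'
print(gap_tap_character("right", 0))  # 't'
```

In effect the seam between the two halves has no dead space: a slightly overshot thumb tap still activates the nearest hidden neighbor key from the other half.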
[0431] In some embodiments, the second text entry area includes an insertion point that remains stationary on the display unit 2002 as text is entered.

[0432] In some embodiments, the second text entry area includes an insertion point, wherein the processing unit 2006 is configured to: detect a gesture on the touch-sensitive surface unit 2004 at a location that corresponds to the insertion point in the second text entry area (e.g., with the detecting unit 2008); and, move the insertion point in the second text entry area in accordance with the gesture on the touch-sensitive surface unit 2004 at the location that corresponds to the insertion point in the second text entry area (e.g., with the moving unit 2014).

[0433] In some embodiments, the processing unit 2006 is configured to: enable display of text editing controls in the center portion of the integrated input area (e.g., with the display enabling unit 2010); detect a gesture on the touch-sensitive surface unit 2004 at a location that corresponds to a text editing control in the center portion of the integrated input area (e.g., with the detecting unit 2008); and, execute a text editing command in accordance with the gesture on the touch-sensitive surface unit 2004 at the location that corresponds to the text editing control (e.g., with the executing unit 2016).

[0434] In some embodiments, the text editing controls in the center portion of the integrated input area have corresponding text editing controls in the first text entry area.

[0435] In some embodiments, the processing unit 2006 is configured to: enable display of user-selectable input elements in the center portion of the integrated input area (e.g., with the display enabling unit 2010); detect a gesture on the touch-sensitive surface unit 2004 at a location that corresponds to a user-selectable input element in the center portion of the integrated input area (e.g., with the detecting unit 2008); and, in response to detecting the gesture on the touch-sensitive surface unit 2004 at the location that corresponds to the user-selectable input element, select the user-selectable input element (e.g., with the selecting unit 2018).

[0436] In some embodiments, the processing unit 2006 is configured to: detect a gesture on the touch-sensitive surface unit 2004 at a location that corresponds to the center portion of the integrated input area (e.g., with the detecting unit 2008); and, in response to detecting the gesture on the touch-sensitive surface unit 2004 at

[…]

2006 is configured to: detect a plurality of gestures on the touch-sensitive surface unit 2004 at a location that corresponds to the center portion of the integrated input area (e.g., with the detecting unit 2008); and, make a drawing in accordance with the plurality of gestures (e.g., with the making unit 2020).

[0439] In some embodiments, the processing unit 2006 is configured to: enable display of a suggested word in the center portion of the integrated input area (e.g., with the display enabling unit 2010); detect a gesture on the touch-sensitive surface unit 2004 at a location that corresponds to the suggested word (e.g., with the detecting unit 2008); and, execute a text editing command in accordance with the gesture on the touch-sensitive surface unit 2004 at the location that corresponds to the suggested word (e.g., with the executing unit 2016).

[0440] In some embodiments, the processing unit 2006 is configured to: enable display of a plurality of emoji characters in the center portion of the integrated input area (e.g., with the display enabling unit 2010); detect a gesture on the touch-sensitive surface unit 2004 at a location that corresponds to an emoji character in the plurality of emoji characters (e.g., with the detecting unit 2008); and, in response to detecting the gesture on the touch-sensitive surface unit 2004 at the location that corresponds to the emoji character, input and enable display of the emoji character in the first text entry area and the second text entry area on the display unit 2002 (e.g., with the inputting unit 2012 and the display enabling unit 2010).

[0441] In some embodiments, the processing unit 2006 is configured to: enable display of a plurality of unicode characters in the center portion of the integrated input area (e.g., with the display enabling unit 2010); detect a gesture on the touch-sensitive surface unit 2004 at a location that corresponds to a unicode character in the plurality of unicode characters (e.g., with the detecting unit 2008); and, in response to detecting the gesture on the touch-sensitive surface unit 2004 at the location that corresponds to the unicode character, input and enable display of the unicode character in the first text entry area and the second text entry area on the display unit 2002 (e.g., with the inputting unit 2012 and the display enabling unit 2010).

[0442] In some embodiments, the processing unit 2006 is configured to: detect a gesture on the touch-sen-
the location that corresponds to the center portion of the sitive surface unit 2004 at a location that corresponds to
integrated input area, enable display of a popup view the second text entry area (e.g., with the detecting unit
(e.g., with the display enabling unit 2010). 2008); and, in response to detecting the gesture on the
[0437] In some embodiments, the processing unit 50 touch-sensitive surface unit 2004 at the location that cor-
2006 is configured to: detect a plurality of gestures on responds to the second text entry area, select a range
the touch-sensitive surface unit 2004 at a location that of text (e.g., with the select unit 2018).
corresponds to the center portion of the integrated input [0443] In some embodiments, the processing unit
area (e.g., with the detecting unit 2008); and, enter in the 2006 is configured to: detect a gesture on the touch-sen-
first text entry area and the second text entry area a char- 55 sitive surface unit 2004 at a location that corresponds to
acter that corresponds to the plurality of gestures (e.g., one or more words in the second text entry area (e.g.,
with the entering unit 2024). with the detecting unit 2008); and, in response to detect-
[0438] In some embodiments, the processing unit ing the gesture on the touch-sensitive surface unit 2004
49
97 EP 2 638 460 B1 98
at the location that corresponds to the second text entry area, format the one or more words (e.g., with the formatting unit 2022).

[0444] In some embodiments, input elements in the center portion of the integrated input area are user-configurable.

[0445] In accordance with some embodiments, Figure 21 shows a functional block diagram of an electronic device 2100 configured in accordance with the principles of the invention as described above. The functional blocks of the device may be implemented by hardware, software, or a combination of hardware and software to carry out the principles of the invention. It is understood by persons of skill in the art that the functional blocks described in Figure 21 may be combined or separated into sub-blocks to implement the principles of the invention as described above. Therefore, the description herein may support any possible combination or separation or further definition of the functional blocks described herein.

[0446] As shown in Figure 21, an electronic device 2100 includes a display unit 2102 configured to concurrently display an application content area that includes one or more text entry areas, and an input area with a keyboard that is displayed over the application content area; a touch-sensitive surface unit 2104 configured to receive user gestures; and a processing unit 2106 coupled to the display unit 2102 and the touch-sensitive surface unit 2104. In some embodiments, the processing unit 2106 includes a detecting unit 2108 and a moving unit 2110.

[0447] The processing unit 2106 is configured to: detect a drag gesture on the touch-sensitive surface unit 2104 at a location that corresponds to the input area on the display unit 2102 (e.g., with the detecting unit 2108); in response to detecting the drag gesture, move the input area on the display unit 2102 in accordance with the drag gesture (e.g., with the moving unit 2110); detect a flick gesture on the touch-sensitive surface unit 2104 at a location that corresponds to the input area on the display unit 2102 (e.g., with the detecting unit 2108); and, in response to detecting the flick gesture, move the input area on the display unit 2102 with inertia in accordance with the flick gesture such that the input area comes to rest at a location adjacent to and just below a text entry area in the application content area (e.g., with the moving unit 2110).

[0448] In some embodiments, moving the input area on the display unit 2102 with inertia in accordance with the flick gesture includes: calculating a trajectory of the input area based on the flick gesture; searching for one or more text entry areas in the application content area that meet predefined candidate criteria; and, when one or more candidate text entry areas are found, identifying a respective candidate text entry area as the text entry area that the input area will come to rest adjacent to and just below and adjusting the trajectory accordingly.

[0449] In some embodiments, the respective candidate text entry area is identified as the text entry area that the input area will come to rest adjacent to and just below based on proximity of the respective candidate text entry area to a termination point of the trajectory.

[0450] In some embodiments, the trajectory is calculated based on simulated physical properties of the input area.

[0451] In some embodiments, the candidate criteria are met for a respective text entry area if the respective text entry area is within a predefined distance of a termination point of the trajectory.

[0452] In some embodiments, identifying one of the candidate text entry areas as the text entry area that the input area will come to rest adjacent to and just below includes selecting a respective candidate text entry area that is closest to a termination point of the trajectory as the text entry area that the input area will come to rest adjacent to and just below.

[0453] In accordance with some embodiments, Figure 22 shows a functional block diagram of an electronic device 2200 configured in accordance with the principles of the invention as described above. The functional blocks of the device may be implemented by hardware, software, or a combination of hardware and software to carry out the principles of the invention. It is understood by persons of skill in the art that the functional blocks described in Figure 22 may be combined or separated into sub-blocks to implement the principles of the invention as described above. Therefore, the description herein may support any possible combination or separation or further definition of the functional blocks described herein.

[0454] As shown in Figure 22, an electronic device 2200 includes a display unit 2202 configured to concurrently display a first text entry area, and an integrated input area, the integrated input area including: a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion; a touch-sensitive surface unit 2204 configured to receive user inputs; a processing unit 2206 coupled to the display unit 2202 and the touch-sensitive surface unit 2204. In some embodiments, the processing unit 2206 includes a detecting unit 2208, an entering unit 2210, an adjusting unit 2212, an exiting unit 2214, a reducing unit 2216, and an increasing unit 2218.

[0455] The processing unit 2206 is configured to: detect a first input on the touch-sensitive surface unit 2204 (e.g., with the detecting unit 2208); in response to detecting the first input, enter a reconfiguration mode for the integrated input area (e.g., with the entering unit 2210); while in the reconfiguration mode for the integrated input area: detect a second input by a first thumb and/or a second thumb (e.g., with the detecting unit 2208); in response to detecting the second input, adjust the size of at least one of the left side and the right side of the split keyboard in the integrated input area (e.g., with the adjusting unit 2212); and detect a third input (e.g., with the
detecting unit 2208); and, in response to detecting the third input, exit the reconfiguration mode for the integrated input area (e.g., with the exiting unit 2214).

[0456] In some embodiments, the first input includes a first thumb and a second thumb, distinct from the first thumb.

[0457] In some embodiments, the third input includes the first thumb and/or the second thumb.

[0458] In some embodiments, the second input includes a horizontal movement of a left thumb towards a left vertical side of the display unit 2202; and the processing unit 2206 is configured to, in response to detecting the horizontal movement of the left thumb towards the left vertical side of the display unit 2202, reduce the size of the left side of the split keyboard (e.g., with the reducing unit 2216).

[0459] In some embodiments, the second input includes a horizontal movement of a left thumb away from a left vertical side of the display unit 2202; and the processing unit 2206 is configured to, in response to detecting the horizontal movement of the left thumb away from the left vertical side of the display unit 2202, increase the size of the left side of the split keyboard (e.g., with the increasing unit 2218).

[0460] In some embodiments, the second input includes a horizontal movement of the first thumb towards a vertical side of the display unit 2202 closest to the first thumb; and the processing unit 2206 is configured to, in response to detecting the horizontal movement of the first thumb towards the vertical side of the display unit 2202 closest to the first thumb, reduce the size of the left side and the right side of the split keyboard (e.g., with the reducing unit 2216).

[0461] In some embodiments, the second input includes a horizontal movement of the first thumb away from a vertical side of the display unit 2202 closest to the first thumb; and the processing unit 2206 is configured to, in response to detecting the horizontal movement of the first thumb away from the vertical side of the display unit 2202 closest to the first thumb, increase the size of the left side and the right side of the split keyboard (e.g., with the increasing unit 2218).

[0462] In accordance with some embodiments, Figure 23 shows a functional block diagram of an electronic device 2300 configured in accordance with the principles of the invention as described above. The functional blocks of the device may be implemented by hardware, software, or a combination of hardware and software to carry out the principles of the invention. It is understood by persons of skill in the art that the functional blocks described in Figure 23 may be combined or separated into sub-blocks to implement the principles of the invention as described above. Therefore, the description herein may support any possible combination or separation or further definition of the functional blocks described herein.

[0463] As shown in Figure 23, an electronic device 2300 includes a display unit 2302 configured to display concurrently an application content area and an unsplit keyboard, the unsplit keyboard being located at a bottom of the display unit 2302; a touch-sensitive surface unit 2304 configured to receive gestures; and a processing unit 2306 coupled to the display unit 2302 and the touch-sensitive surface unit 2304. In some embodiments, the processing unit 2306 includes a detecting unit 2308, a converting unit 2310, a moving unit 2312, and a display enabling unit 2314.

[0464] The processing unit 2306 is configured to: detect a first gesture on the touch-sensitive surface unit 2304 (e.g., with the detecting unit 2308); in response to detecting the first gesture on the touch-sensitive surface unit 2304: convert the unsplit keyboard into a split keyboard (e.g., with the converting unit 2310), and move the split keyboard away from the bottom of the display unit 2302 over the application content area in accordance with the first gesture (e.g., with the moving unit 2312).

[0465] In some embodiments, the processing unit 2306 is configured to, in response to detecting the first gesture on the touch-sensitive surface unit 2304: prior to converting the unsplit keyboard into the split keyboard, move the unsplit keyboard away from the bottom of the display unit 2302 over the application content area in accordance with the first gesture (e.g., with the moving unit 2312).

[0466] In some embodiments, the processing unit 2306 is configured to: in response to detecting the first gesture on the touch-sensitive surface unit 2304, enable display of an animation of the unsplit keyboard converting into the split keyboard while moving the keyboard away from the bottom of the display unit 2302 (e.g., with the display enabling unit 2314).

[0467] In some embodiments, the processing unit 2306 is configured to, while concurrently displaying on the display unit 2302 the application content area and the split keyboard, the split keyboard being located away from the bottom of the display unit 2302: detect a second gesture on the touch-sensitive surface unit 2304 (e.g., with the detecting unit 2308); in response to detecting the second gesture on the touch-sensitive surface unit 2304: convert the split keyboard into the unsplit keyboard (e.g., with the converting unit 2310), and move the unsplit keyboard to the bottom of the display unit 2302 in accordance with the second gesture (e.g., with the moving unit 2312).

[0468] In some embodiments, the processing unit 2306 is configured to, in response to detecting the second gesture on the touch-sensitive surface unit 2304: prior to converting the split keyboard into the unsplit keyboard, move the split keyboard towards the bottom of the display unit 2302 over the application content area in accordance with the second gesture (e.g., with the moving unit 2312).

[0469] In some embodiments, the processing unit 2306 is configured to, in response to detecting the second gesture on the touch-sensitive surface unit 2304: enable display of an animation of the split keyboard converting
into the unsplit keyboard while moving the keyboard towards the bottom of the display unit 2302 (e.g., with the display enabling unit 2314).

[0470] The operations in the information processing methods described above may be implemented by running one or more functional modules in information processing apparatus such as general purpose processors or application specific chips. These modules, combinations of these modules, and/or their combination with general hardware (e.g., as described above with respect to Figures 1A and 3) are all included within the scope of protection of the invention.

[0471] The operations described above with reference to Figures 6A-6B, 7A-7B, 8A-8B, 9, 10A-10B, 11A-11D, 12A-12B, 13A-13B, and 14 may be implemented by components depicted in Figures 1A-1B. For example, detection operation 604 and replacing operation 610 may be implemented by event sorter 170, event recognizer 180, and event handler 190. Event monitor 171 in event sorter 170 detects a contact on touch-sensitive display 112, and event dispatcher module 174 delivers the event information to application 136-1. A respective event recognizer 180 of application 136-1 compares the event information to respective event definitions 186, and determines whether a first contact at a first location on the touch-sensitive surface corresponds to a predefined event or sub-event, such as selection of an object on a user interface. When a respective predefined event or sub-event is detected, event recognizer 180 activates an event handler 190 associated with the detection of the event or sub-event. Event handler 190 may utilize or call data updater 176 or object updater 177 to update the application internal state 192. In some embodiments, event handler 190 accesses a respective GUI updater 178 to update what is displayed by the application. Similarly, it would be clear to a person having ordinary skill in the art how other processes can be implemented based on the components depicted in Figures 1A-1B.

[0472] The foregoing description has, for purposes of explanation, been presented with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The scope of the invention is defined by the appended claims.

Claims

1. A method, comprising:

   at an electronic device with a display (112) including a touch-sensitive surface:

      displaying a first keyboard (5008) on the display, the first keyboard comprising a first plurality of keys;
      detecting a key activation gesture (5038) at a first time at a location on the touch-sensitive surface that corresponds to a location of a first key (5036) in the first keyboard;
      in response to detecting the key activation gesture at the first time, activating the first key; and
      detecting one or more contacts on the touch-sensitive surface at a second time after the first time, the one or more contacts corresponding to a keyboard selection gesture (5010);

   characterized by:

      in response to detecting the one or more contacts that correspond to the keyboard selection gesture (5010) at the second time after the first time:

         replacing the first keyboard (5008) with a second keyboard (5016) when the second time exceeds a predefined period of time after the first time; and
         maintaining display of the first keyboard (5008) when the second time is less than the predefined period of time after the first time.

2. The method of claim 1, including:

   in response to detecting the key activation gesture (5038) at the first time, disabling a gesture recognizer for the keyboard selection gesture for the predefined period of time.

3. The method of any of claims 1-2, wherein the electronic device is a portable electronic device.

4. The method of any of claims 1-3, wherein the keyboard selection gesture (5010) is a multifinger gesture at a location on the touch-sensitive surface that corresponds to the location of the first keyboard (5008) on the display.

5. The method of any of claims 1-4, wherein replacing the first keyboard (5008) with the second keyboard (5016) includes displaying an animation that transitions the first keyboard to the second keyboard.

6. An electronic device, comprising:

   a display including a touch-sensitive surface;
   one or more processors;
   memory; and
   one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for performing the method of any of claims 1-5.

7. A computer readable medium comprising one or more programs, the one or more programs comprising instructions which, when executed by an electronic device with a display including a touch-sensitive surface, cause the device to perform the method of any of claims 1-5.
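Paragraphs [0433]-[0441] describe gestures on elements in the center portion of the integrated input area: text editing controls execute editing commands, user-selectable input elements are selected, and emoji or unicode characters are input into both the first and the second text entry area. The dispatcher below is a hypothetical sketch of that mapping; the element kinds and return strings are invented for illustration and are not from the patent.

```python
def handle_center_gesture(element, first_area, second_area):
    """Apply the action for a gesture on a center-portion element.
    Character-like elements (emoji, unicode characters) are entered into
    both text entry areas, mirroring paragraphs [0440] and [0441]."""
    kind, value = element
    if kind in ("emoji", "unicode"):
        first_area.append(value)
        second_area.append(value)  # displayed in both text entry areas
        return "input"
    if kind == "text_editing_control":
        return "execute:" + value  # e.g. a copy or paste command ([0433])
    if kind == "input_element":
        return "select:" + value   # user-selectable element ([0435])
    return "ignore"
```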
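Paragraphs [0447]-[0452] describe moving the input area with inertia after a flick: a trajectory is computed from simulated physical properties, and the area comes to rest just below the candidate text entry area that is closest to the trajectory's termination point, provided it lies within a predefined distance. The sketch below is one possible reading under assumed constants (the friction factor and candidate distance are illustrative, not from the patent).

```python
FRICTION = 0.9            # per-step velocity decay (assumed physical model)
CANDIDATE_DISTANCE = 100  # predefined distance in points (assumed)


def termination_point(y, velocity, friction=FRICTION, steps=100):
    """Simulate simple decaying vertical motion of the input area and
    return the position where it would stop ([0450])."""
    for _ in range(steps):
        y += velocity
        velocity *= friction
    return y


def resting_text_entry_area(term_y, text_entry_areas,
                            max_distance=CANDIDATE_DISTANCE):
    """Among areas within max_distance of the termination point ([0451]),
    pick the one closest to it ([0452]); None if no candidate qualifies."""
    candidates = [a for a in text_entry_areas
                  if abs(a["bottom"] - term_y) <= max_distance]
    if not candidates:
        return None
    return min(candidates, key=lambda a: abs(a["bottom"] - term_y))
```

Per [0448], once a candidate is chosen the trajectory would then be adjusted so the input area lands adjacent to and just below it.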
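Paragraphs [0458]-[0461] give the resize rules of the reconfiguration mode: a thumb moving toward its closest vertical display edge shrinks the split keyboard, a thumb moving away enlarges it, and in the [0460]/[0461] variants one thumb's movement resizes both sides. A minimal sketch, with an assumed step size (the patent gives no numeric values):

```python
STEP = 10  # points of resize per detected movement; illustrative


def resize_split_keyboard(left_w, right_w, movement, both_sides, step=STEP):
    """movement: 'toward_edge' or 'away_from_edge' for the moving thumb.
    Returns the new (left_width, right_width) of the split keyboard."""
    delta = -step if movement == "toward_edge" else step
    if both_sides:
        # [0460]/[0461]: one thumb resizes both sides together
        return left_w + delta, right_w + delta
    # [0458]/[0459]: the left thumb resizes only the left side
    return left_w + delta, right_w
```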
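Paragraphs [0464]-[0469] describe two symmetric transitions: a first gesture converts the unsplit keyboard into a split keyboard and moves it up from the bottom of the display, and a second gesture converts it back and returns it to the bottom. As a minimal state sketch (an assumption for illustration, not the claimed implementation):

```python
def apply_gesture(state, gesture):
    """state: (layout, position), with layout in {'unsplit', 'split'} and
    position in {'bottom', 'raised'}; gesture in {'first', 'second'}."""
    layout, position = state
    if gesture == "first" and layout == "unsplit":
        return ("split", "raised")    # convert and move up ([0464])
    if gesture == "second" and layout == "split":
        return ("unsplit", "bottom")  # convert back and move down ([0467])
    return state                      # gestures that do not apply are ignored
```

The animation embodiments ([0466], [0469]) would interpolate between these two states while the keyboard moves.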