Creating an iPhone Application
At a high level, the process for creating an iPhone application is similar to that for creating a Mac OS X application.
Both use the same tools and many of the same basic libraries. Despite the similarities, there are also significant
differences. An iPhone is not a desktop computer; it has a different purpose and requires a very different design
approach. That approach needs to take advantage of the strengths of iOS and forego features that might be irrelevant
or impractical in a mobile environment. The smaller size of the iPhone and iPod touch screens also means that your
application’s user interface should be well organized and always focused on the information the user needs most.
iOS lets users interact with iPhone and iPod touch devices in ways that are not possible with desktop
applications. The Multi-Touch interface reports each separate finger that touches the screen, making it
possible to handle multifinger gestures and other complex input easily. In addition, built-in hardware features such
as the accelerometers, although present in some desktop systems, are used more extensively in iOS to track the
screen’s current orientation and adjust your content accordingly. Understanding how you can use these features in
your applications will help you focus on a design that is right for your users.
The best way to understand the design of an iPhone application is to look at an example. This article takes you on a
tour of the MoveMe sample application. This sample demonstrates many of the typical behaviors of an iPhone
application, including:
Displaying a window
Drawing custom content
Handling touch events
Performing animations
Figure 1 shows the interface for this application. Touching the Welcome button triggers an animation that causes the
button to pulse and center itself under your finger. As you drag your finger around the screen, the button follows
your finger. Lift your finger from the screen and, using another animation, the button snaps back to its original
location. Double-tapping anywhere outside the button changes the language of the button’s greeting.
Figure 1 The MoveMe application window
Before reading the other sections of this article, you should download the sample (MoveMe) so that you can follow
along directly in the source code. You should also have already read the following orientation pages in the iOS Dev
Center to get a basic understanding of iOS and the tools and language you use for development:
iOS Overview
Tools for iOS Development
If you are not familiar with the Objective-C programming language, you should also have read Learning
Objective-C: A Primer to familiarize yourself with the basic syntax of Objective-C.
In iOS, the ultimate target of your Xcode project is an application bundle, which is a special type of directory that
houses your application’s binary executable and supporting resource files. Bundles in iOS have a relatively flat
directory structure, with most files residing at the top level of the bundle directory. However, a bundle may also
contain subdirectories to store localized versions of strings and other language-specific resource files. You do not
need to know the exact structure of the application bundle for the purposes of this article, but you can find that
information in “Build-Time Configuration Details” in iOS Application Programming Guide if you are interested in
it.
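In practice, application code locates bundled resources through the NSBundle class rather than by hard-coding paths. A minimal sketch, using the Placard.png image resource that appears later in this article:

```objc
// Locate a resource in the application bundle and load it.
// NSBundle automatically searches the bundle's localized
// subdirectories before falling back to the top level.
NSString *path = [[NSBundle mainBundle] pathForResource:@"Placard" ofType:@"png"];
UIImage *placardImage = [UIImage imageWithContentsOfFile:path];
```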
1. Open the MoveMe project in Xcode.
2. In the project toolbar, make sure the simulator option is selected in the Active SDK menu. (If the Active
SDK menu does not appear in the toolbar, choose Project > Set Active SDK > Simulator.)
3. Select Build > Build and Go (Run) from the menu, or simply click the Build and Go button in the toolbar.
When the application finishes building, Xcode loads it into the iOS Simulator and launches it. Using your mouse,
you can click the Welcome button and drag it around the screen to see the application’s behavior. If you have a
device configured for development, you can also build your application and run it on that device. For information
about how to configure devices for development and load applications, see iOS Development Guide.
Note: iOS does not support memory management using the garbage collection feature that is in Mac OS X v10.5
and later.
If you want to allocate generic blocks of memory—that is, memory not associated with an object—you can do so
using the standard malloc library. As is the case with any memory you allocate using malloc, you are
responsible for releasing that memory when you are done with it by calling the free function. The system does not
release malloc-based blocks for you.
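For example (a minimal C sketch, not taken from the MoveMe sample), a helper that copies a string into a malloc-based block leaves its caller responsible for calling free:

```c
#include <stdlib.h>
#include <string.h>

// Copy a string into a newly allocated block of memory.
// The caller owns the returned block and must release it with free();
// the system never reclaims malloc-based blocks automatically.
char *copy_string(const char *source)
{
    size_t length = strlen(source) + 1;   // include the NUL terminator
    char *buffer = malloc(length);        // generic block, not an object
    if (buffer != NULL)
        memcpy(buffer, source, length);
    return buffer;                        // NULL if the allocation failed
}
```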
Regardless of how you allocate memory, managing your overall memory usage is important. Although iOS has a
virtual memory system, it does not use a swap file. This means that code pages can be flushed as needed but your
application’s data must all fit into memory at the same time. The system monitors the overall amount of free
memory and does what it can to give your application the memory it needs. If memory usage becomes too critical
though, the system may terminate your application. However, this option is used only as a last resort, to ensure that
the system has enough memory to perform critical operations such as receiving phone calls.
For more information about how to allocate objects in iOS, see Cocoa Fundamentals Guide. For information and tips
on how to improve your application’s memory usage, see “Using Memory Efficiently” in iOS Application
Programming Guide.
Listing 1 shows the main function for the MoveMe application. The main function is located in the
project’s main.m file. Every application you create has a main function that is almost identical to this one. This
function performs two key tasks. First, it creates the application’s top-level autorelease pool used by the memory
management reference counting system. Second, it calls the UIApplicationMain function to create the MoveMe
application’s key objects, initialize those objects, and start the event-processing loop. The application does not
return from this function until it quits.
[pool release];
return retVal;
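The fragment above shows only the final statements of main. The complete function, as generated by the standard iPhone application template, is essentially the following:

```objc
#import <UIKit/UIKit.h>

int main(int argc, char *argv[])
{
    // Create the application's top-level autorelease pool,
    // used by the reference-counting memory management system
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    // Create and initialize the application's key objects and
    // start the event-processing loop; this call returns only
    // when the application quits
    int retVal = UIApplicationMain(argc, argv, nil, nil);

    [pool release];
    return retVal;
}
```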
To launch Interface Builder and see how the application delegate object’s role is defined, double-click
the MainWindow.xib file (under MoveMe > Resources) in the Groups & Files pane of the Xcode project
window. MainWindow.xib is the nib file that contains your application’s window and defines the relationships
among several important objects in your application, including the application delegate. To see how the application
delegate relationship is established, click the File’s Owner icon in the nib file document window (titled
“MainWindow.xib”), show the Inspector window (choose Tools > Inspector), and click the Inspector window’s
Application Connections tab. As shown in Figure 3, the Inspector shows that the File’s Owner object (which
represents the application in the nib file) has a delegate outlet connected to the MoveMeAppDelegate object.
The application delegate object works in tandem with the standard UIApplication object to respond to changing
conditions in the application. The application object does most of the heavy lifting, but the delegate is responsible
for several key behaviors, including the following:
Setting up the application’s window and initial user interface
Performing any additional initialization tasks needed for your custom data engine
Saving state and performing an orderly shutdown when the application quits
At launch time, the most immediate concern for the delegate object is to set up and present the application window
to the user, which is described in “Creating the Application Window”. The delegate should also perform any tasks
needed to prepare your application for immediate use, such as restoring the application to a previous state or creating
any required objects. When the application quits, the delegate needs to perform an orderly shutdown of the
application and save any state information needed for the next launch cycle.
For more information about the fundamental architecture and life cycle of an iPhone application, see “Core
Application Architecture” in iOS Application Programming Guide.
Windows provide the drawing surface for your user interface, but view objects provide the actual content. A view
object is an instance of the UIView class that draws some content and responds to interactions with that content. iOS
defines standard views to represent things such as tables, buttons, text fields, and other types of interactive controls.
You can add any of these views to your window, or you can define custom views by subclassing UIView and
implementing some custom drawing and event-handling code. The MoveMe application defines two such views—
represented by the MoveMeView and PlacardView classes—to display the application’s interface and handle user
interactions.
At launch time, the goal is to create the application window and display some initial content as quickly as possible.
The window is unarchived from the MainWindow.xib nib file. When the application has launched and is ready to
start processing events, the UIApplication object sends the delegate
an applicationDidFinishLaunching: message. This message is the delegate’s cue to put content in its window and
perform any other initialization the application might require.
At launch time, the delegate creates a view controller object whose job is to manage the content view of the window.
Listing 2 shows the applicationDidFinishLaunching: method for the MoveMe application, which is defined in the
application delegate’s implementation file, MoveMeAppDelegate.m. This method creates the main content view for
the window and makes the window visible. Showing the window lets the system know that your application is ready
to begin handling events.
Listing 2 Creating the content view
- (void)applicationDidFinishLaunching:(UIApplication *)application
{
    // Create the view controller that manages the window's content view
    MoveMeViewController *aViewController = [[MoveMeViewController alloc]
            initWithNibName:@"MoveMeViewController" bundle:[NSBundle mainBundle]];
    self.viewController = aViewController;
    [aViewController release];

    // Add the controller's view to the window and show the window
    UIView *controllersView = [viewController view];
    [window addSubview:controllersView];
    [window makeKeyAndVisible];
}
Note: You can use the applicationDidFinishLaunching: method to perform other tasks besides setting up your
application user interface. Many applications use it to initialize required data structures, read any user preferences,
or return the application to the state it was in when it last quit.
Although the preceding code creates the window's background view and then shows the window, what it does not
show is the creation of the PlacardView object that displays the Welcome button. That behavior is
handled by the setUpPlacardView method of the MoveMeView class, which is called from
the initWithCoder: method when the MoveMeView object is unarchived from its nib file.
The setUpPlacardView method is shown in Listing 3. Part of the initialization of this view includes the creation of
a PlacardView object. Because the MoveMeView class provides the background for the entire application, it adds
the PlacardView object as a subview. The relationship between the two views not only causes the Welcome button
to be displayed on top of the application’s background, it also allows the MoveMeView class to handle events that
are targeted at the button.
- (void)setUpPlacardView
{
    // Create the placard view -- it calculates its own frame based on its image
    PlacardView *aPlacardView = [[PlacardView alloc] init];
    self.placardView = aPlacardView;
    [aPlacardView release];

    // Center the placard in this view and add it as a subview
    placardView.center = self.center;
    [self addSubview:placardView];
}
For detailed information about creating windows and views, see “What Are Windows and Views?” in iOS
Application Programming Guide.
The PlacardView class in the MoveMe application draws the Welcome button and manages its location on the
screen. Although the PlacardView class could draw its content using
embedded UIImageView and UILabel objects, it instead draws the content explicitly, to demonstrate the overall
process. As a result, this class implements a drawRect: method, which is where all custom drawing for a view takes
place.
By the time a view’s drawRect: method is called, the drawing environment is configured and ready to go. All you
have to do is specify the drawing commands to draw any custom content. In the PlacardView class, the content
consists of a background image (stored in the Placard.png resource file) and a custom string, the text for which can
change dynamically. To draw this content, the class takes the following steps:
1. Draw the background image at the view’s current origin. (Because the view is already sized to fit the
image, this step provides the entire button background.)
2. Compute the position of the welcome string so that it is centered in the button. (Because the string size can
change, the position needs to be computed each time based on the current string size.)
- (void)drawRect:(CGRect)rect
{
    // Draw the placard at 0, 0
    [placardImage drawAtPoint:CGPointMake(0.0, 0.0)];

    /*
     Draw the current display string twice, slightly offset, to give it
     an embossed appearance. The size of the font and text are calculated
     in setupNextDisplayString.
     */

    // Find point at which to draw the string so it will be in the center of the view
    // (placardImage, textSize, font, and fontSize are instance variables)
    CGPoint point;
    point.x = self.bounds.size.width/2 - textSize.width/2;
    point.y = self.bounds.size.height/2 - textSize.height/2;

    // First pass: draw the string in a dark color, offset slightly
    // (the offset value here is illustrative)
    [[UIColor darkGrayColor] set];
    point.y += 0.5;
    [currentDisplayString drawAtPoint:point
              forWidth:(self.bounds.size.width-STRING_INDENT)
              withFont:font
              fontSize:fontSize
              lineBreakMode:UILineBreakModeMiddleTruncation
              baselineAdjustment:UIBaselineAdjustmentAlignBaselines];

    // Second pass: draw the string in white at the centered point
    [[UIColor whiteColor] set];
    point.y -= 0.5;
    [currentDisplayString drawAtPoint:point
              forWidth:(self.bounds.size.width-STRING_INDENT)
              withFont:font
              fontSize:fontSize
              lineBreakMode:UILineBreakModeMiddleTruncation
              baselineAdjustment:UIBaselineAdjustmentAlignBaselines];
}
When you need to draw content that is more complex than images and strings, you can use Quartz or OpenGL ES.
Quartz works with UIKit to handle the drawing of vector-based paths, images, gradients, PDF, and other complex
content that you want to create dynamically. Because Quartz and UIKit are based on the same drawing environment,
you can call Quartz functions directly from the drawRect: method of your view and even mix Quartz drawing calls
with drawing done through UIKit classes.
OpenGL ES is an alternative to Quartz and UIKit that lets you render 2D and 3D content using a set of functions
that resemble (but are not exactly like) those found in OpenGL for Mac OS X. Unlike Quartz and UIKit, you do not
use your view’s drawRect: method to do your drawing. You still use a view, but you use that view object primarily
to provide the drawing surface for your OpenGL ES code. How often you update the drawing surface, and which
objects you use to do so, are your decision.
For detailed information about each of the drawing technologies and how you use them, see “Supporting High-
Resolution Screens” in iOS Application Programming Guide.
Because there may be multiple fingers touching the device at one time, it is possible for you to use those events to
identify complex user gestures. The system provides some help in detecting common gestures such as swipes, but
you are responsible for detecting more complex gestures. When the event system generates a new touch event, it
includes information about the current state of each finger that is either touching or was just removed from the
surface of the device. Because each event object contains information about all active touches, you can monitor the
actions of each finger with the arrival of each new event. You can then track the movements of each finger from
event to event to detect gestures, which you can apply to the contents of your application. For example, if the events
indicate the user is performing a pinch-close or pinch-open gesture (as shown in Figure 4) and the underlying view
supports magnification, you could use those events to change the current zoom level.
Figure 4 Using touch events to detect gestures
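The MoveMe sample does not detect pinches, but a hedged sketch of the approach just described, for a view with multi-touch enabled, might look like the following. (The previousDistance instance variable and the zoomIn/zoomOut methods are hypothetical.)

```objc
// Track the distance between two fingers across touchesMoved: events
// to distinguish pinch-open from pinch-close.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches count] == 2) {
        NSArray *two = [touches allObjects];
        CGPoint p1 = [[two objectAtIndex:0] locationInView:self];
        CGPoint p2 = [[two objectAtIndex:1] locationInView:self];
        CGFloat distance = hypotf(p1.x - p2.x, p1.y - p2.y);

        // A growing distance indicates pinch-open (zoom in);
        // a shrinking distance indicates pinch-close (zoom out).
        if (distance > previousDistance)
            [self zoomIn];
        else if (distance < previousDistance)
            [self zoomOut];
        previousDistance = distance;
    }
}
```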
The system delivers events to the application’s responder objects, which are instances of the UIResponder class. In
an iPhone application, your application’s views form the bulk of your custom responder objects. The MoveMe
application implements two view classes, but only the MoveMeView class actually responds to event messages.
This class detects taps both inside and outside the bounds of the Welcome button by overriding the following
methods of UIResponder:
touchesBegan:withEvent:
touchesMoved:withEvent:
touchesEnded:withEvent:
touchesCancelled:withEvent:
To simplify its own event-handling behavior, the MoveMe application tracks only the first finger to touch the
surface of the device. It does this with the support of the UIView class, which disables multi-touch events by default.
For applications that do not need to track multiple fingers, this feature is a great convenience. When multi-touch
events are disabled, the system delivers events only related to the first finger to touch the device. Events related to
additional touches in a sequence are never delivered to the view. If you want the information for those additional
touches, however, you can reenable multi-touch support using the setMultipleTouchEnabled: method of
the UIView class.
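Opting back in is a one-line change, typically made during the view's initialization (a minimal sketch):

```objc
// Reenable multi-touch delivery, which UIView disables by default
[self setMultipleTouchEnabled:YES];
```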
The MoveMe application handles touches as follows:
1. When a touch first arrives, the application checks to see where the event occurred.
Double-taps outside the Welcome button update the string displayed by the button.
Single taps inside the button center the button underneath the finger and trigger an initial
animation to enlarge the button.
2. If the finger moves and is inside the button, the button’s position is updated to match the new position of
the finger.
3. If the finger was inside the button and then lifts off the surface of the device, an animation moves the
button back to its original position.
Listing 5 shows the touchesBegan:withEvent: method for the MoveMeView class. The system calls this method
when a finger first touches the device. This method gets the set of all touches and extracts the one and only touch
object from it. The information in the UITouch object is used to identify in which view the touch occurred
(the MoveMeView object or the PlacardView object) and the number of taps associated with the touch. If the touch
represents a double tap outside the button, the touchesBegan:withEvent: method calls
the setupNextDisplayString method to change the welcome string of the button. If the event occurred inside the
Welcome button, it uses the animateFirstTouchAtPoint: method to grow the button and track it to the touch location.
All other touch-related events are ignored.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Only move the placard view if the touch was in the placard view
    if ([touch view] != placardView) {
        // A double tap outside the placard view changes the display string
        if ([touch tapCount] == 2) {
            [placardView setupNextDisplayString];
        }
        return;
    }
    CGPoint touchPoint = [touch locationInView:self];
    [self animateFirstTouchAtPoint:touchPoint];
}
The touchesMoved:withEvent: method moves the button to track the finger:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // If the touch was in the placardView, move the placardView to its location
    if ([touch view] == placardView) {
        CGPoint location = [touch locationInView:self];
        placardView.center = location;
        return;
    }
}
When the user’s finger finally lifts from the screen, the MoveMe application responds by triggering an animation to
move the button back to its starting position in the center of the application’s window. Listing 7 shows
the touchesEnded:withEvent: method that initiates the animation.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([[touches anyObject] view] == placardView) {
        // Disable touch delivery while the button animates back to center
        self.userInteractionEnabled = NO;
        [self animatePlacardViewToCenter];
        return;
    }
}
To simplify the event handling process for the application, the touchesEnded:withEvent: method disables touch
events for the view temporarily while the button animates back to its original position. If it did not do this, each of
the event-handling methods would need to include logic to determine whether the button was in the middle of an
animation and, if so, cancel the animation. Disabling user interactions for the short time it takes the button to travel
back to the center of the screen simplifies the event handling code and eliminates the need for the extra logic. When
the button reaches its original position, the animationDidStop:finished: method of the MoveMeView class reenables
user interactions so that the event cycle can begin all over again.
If the application is interrupted for some reason—for example, by an incoming phone call—the view is sent
a touchesCancelled:withEvent: message. In this situation, the application should try to do as little work as possible
to avoid competing for device resources. In the example implementation, the placard view’s center and
transformation are simply set to their original values.
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    // Restore the placard view's original position and transform
    placardView.center = self.center;
    placardView.transform = CGAffineTransformIdentity;
}
For more information on handling events in iOS, see iOS Application Programming Guide.
Because of its importance, support for animation is built directly into the classes of UIKit. The MoveMe application
takes advantage of this support by using it to animate the different aspects of the Welcome button. When the user
first touches the button, the application applies an animation that causes the size of the button to grow briefly. When
the user lets go of the button, another animation snaps it back to its original position. The basic steps for creating
these animations are essentially the same:
Begin an animation block.
Set the animation duration and, if needed, a delegate to notify when the animation finishes.
Change the desired animatable properties of the view, such as its transform or center.
Commit the animation block.
Listing 8 shows the animation code used to pulse the Welcome button when it is first touched. This method sets the
duration of the animation and then applies a transform to the button that scales it to its new size. When this
animation completes, the animation infrastructure calls the growAnimationDidStop:finished:context: method of the
animation delegate, which completes the pulse animation by shrinking the button slightly and moving the placard
view under the touch.
Listing 8 Animating the Welcome button
- (void)animateFirstTouchAtPoint:(CGPoint)touchPoint
{
    // Pass the touch point to the delegate method via the animation context
    NSValue *touchPointValue = [[NSValue valueWithCGPoint:touchPoint] retain];
    [UIView beginAnimations:nil context:touchPointValue];
    [UIView setAnimationDuration:GROW_ANIMATION_DURATION_SECONDS];
    [UIView setAnimationDelegate:self];
    [UIView setAnimationDidStopSelector:@selector(growAnimationDidStop:finished:context:)];
    // Scale the placard view up while the finger is down
    placardView.transform = CGAffineTransformMakeScale(1.2, 1.2);
    [UIView commitAnimations];
}

- (void)growAnimationDidStop:(NSString *)animationID finished:(NSNumber *)finished context:(void *)context
{
    [UIView beginAnimations:nil context:NULL];
    [UIView setAnimationDuration:MOVE_ANIMATION_DURATION_SECONDS];
    // Shrink the placard view slightly and center it under the touch point
    placardView.transform = CGAffineTransformMakeScale(1.1, 1.1);
    NSValue *touchPointValue = (NSValue *)context;
    placardView.center = [touchPointValue CGPointValue];
    [touchPointValue release];
    [UIView commitAnimations];
}
For more information about using the built-in view-based animations, see “Animating Views” in iOS Application
Programming Guide. For more information about Core Animation, see “Applying Core Animation Effects” in iOS
Application Programming Guide.
Listing 9 shows the contents of the Info.plist file for the MoveMe application. This file identifies the name of the
executable, the image file to display on the user’s Home screen, and the string that identifies the application
uniquely to the system. Because the MoveMe application is a full-screen application—in other words, it does not
display the status bar—it also includes the UIStatusBarHidden key and assigns to it the value true. Setting this key
to true lets the system know that it should not display the application status bar at launch time or while the
application is running. Although the MoveMe application could configure this same behavior programmatically, that
behavior would not take effect until after the application was already launched, which might look odd.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
"http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>CFBundleDevelopmentRegion</key>
<string>en</string>
<key>CFBundleDisplayName</key>
<string>${PRODUCT_NAME}</string>
<key>CFBundleExecutable</key>
<string>${EXECUTABLE_NAME}</string>
<key>CFBundleIconFile</key>
<string>Icon.png</string>
<key>CFBundleIdentifier</key>
<string>com.yourcompany.${PRODUCT_NAME:identifier}</string>
<key>CFBundleInfoDictionaryVersion</key>
<string>6.0</string>
<key>CFBundleName</key>
<string>${PRODUCT_NAME}</string>
<key>CFBundlePackageType</key>
<string>APPL</string>
<key>CFBundleSignature</key>
<string>????</string>
<key>CFBundleVersion</key>
<string>1.0</string>
<key>UIStatusBarHidden</key>
<true/>
<key>NSMainNibFile</key>
<string>MainWindow</string>
</dict>
</plist>
Note: You can edit the contents of your application’s Info.plist file using TextEdit, which displays the XML
contents of the file as shown in Listing 9, or the Property List Editor, which displays the file’s keys and values in a
table. Xcode also provides access to some of these attributes in the information window for your application target.
To view this window, select your application target (in the Targets group) and choose File > Get Info. The
Properties tab contains some (but not all) of the properties in the Info.plist file.
For information about configuring your application’s Info.plist file, see “The Information Property List” in iOS
Application Programming Guide.
With this final piece in place, you now have all of the basic information needed to create your own functional
iPhone application. The next step is to expand on the information you learned here by learning more about the
features of iOS. The applications you create should take advantage of the built-in features of iOS to create a pleasant
and intuitive user experience. Some of these features are described in “Taking Your Applications Further”, but for a
complete list, and for information on how to use them, see iOS Application Programming Guide.
The system uses the accelerometers to monitor a device’s current orientation and to notify your application when
that orientation changes. If your application’s interface can be displayed in both landscape and portrait mode, you
should incorporate view controllers into your basic design. The UIViewController class provides the infrastructure
needed to rotate your interface and adjust the position of views automatically in response to orientation changes.
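For example, a UIViewController subclass can declare which orientations it supports by overriding a single method (a minimal sketch):

```objc
// Allow this view controller's interface to rotate to any orientation;
// return YES only for specific orientations to restrict rotation
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    return YES;
}
```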
If you want access to the raw accelerometer data directly, you can do so using the shared UIAccelerometer object in
UIKit. The UIAccelerometer object reports the current accelerometer values at a configurable interval. You can also
use the data to detect the device’s orientation or to detect other types of instantaneous motion, such as the user
shaking the device back and forth. You can then use this information as input to a game or other application. For
examples of how to configure the UIAccelerometer object and receive accelerometer events, see “Accessing
Accelerometer Events” in iOS Application Programming Guide.
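A minimal sketch of that configuration, in a class that adopts the UIAccelerometerDelegate protocol (the startAccelerometerUpdates method name is hypothetical):

```objc
// Configure the shared accelerometer for updates at 50 Hz
- (void)startAccelerometerUpdates
{
    UIAccelerometer *accelerometer = [UIAccelerometer sharedAccelerometer];
    accelerometer.updateInterval = 1.0 / 50.0;
    accelerometer.delegate = self;
}

// Delegate method: called at each update interval with the raw values
- (void)accelerometer:(UIAccelerometer *)accelerometer
        didAccelerate:(UIAcceleration *)acceleration
{
    // acceleration.x, acceleration.y, and acceleration.z are measured in g's
}
```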
You access the user’s contact information using the Address Book and Address Book UI frameworks. For more
information about these frameworks, see Address Book Framework Reference for iOS and Address Book UI
Framework Reference for iOS.
The Core Location framework monitors signals coming from cell phone towers and Wi-Fi hotspots and uses them to
triangulate the user’s current position. You can use this framework to grab an initial location fix only, or you can be
notified whenever the user’s location changes. With this information, you can filter the information your application
provides or use it in other ways.
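A minimal sketch of both styles, assuming a class that adopts the CLLocationManagerDelegate protocol (the startLocationUpdates method name is hypothetical):

```objc
#import <CoreLocation/CoreLocation.h>

// Begin receiving location updates
- (void)startLocationUpdates
{
    CLLocationManager *manager = [[CLLocationManager alloc] init];
    manager.delegate = self;
    manager.desiredAccuracy = kCLLocationAccuracyHundredMeters;
    [manager startUpdatingLocation];
}

// Era-appropriate delegate method, called whenever the location changes
- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation
{
    // For a one-shot fix, stop updates after the first good location
    [manager stopUpdatingLocation];
}
```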
For an example of how to get location data in your application, see “Getting the User’s Current Location” in iOS
Application Programming Guide.
The Media Player framework is what you use to play back full-screen video files. This framework supports the
playback of many standard movie file formats and gives you control over the playback environment, including
whether to display user controls and how to configure the aspect ratio of video content. Game developers might use
this framework to play cut scenes or other prerendered content, while media-based applications can also use this
framework to play back movie files.
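A minimal sketch of full-screen playback using the era-appropriate Media Player API (the movie file name is hypothetical):

```objc
#import <MediaPlayer/MediaPlayer.h>

// Play a bundled movie file full screen
NSString *moviePath = [[NSBundle mainBundle] pathForResource:@"CutScene"
                                                      ofType:@"m4v"];
MPMoviePlayerController *player = [[MPMoviePlayerController alloc]
        initWithContentURL:[NSURL fileURLWithPath:moviePath]];
player.scalingMode = MPMovieScalingModeAspectFit;
[player play];  // presents the full-screen movie player
```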
For information on how to use the picker interfaces, see “Taking Pictures with the Camera” in iOS Application
Programming Guide and “Picking a Photo from the Photo Library” in iOS Application Programming Guide.