Java™ Media Framework API Guide
November 19, 1999
JMF 2.0 FCS
Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xiii
About JMF . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xiii
Design Goals for the JMF API . . . . . . . . . . . . . . . . . . . . . . . xiv
About the JMF RTP APIs . . . . . . . . . . . . . . . . . . . . . . . . . . . . xv
Design Goals for the JMF RTP APIs . . . . . . . . . . . . . . . xvi
Partners in the Development of the JMF API . . . . . . . . . .xvii
Contact Information . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .xvii
About this Document . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .xvii
Guide to Contents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .xvii
Change History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xix
Comments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xx
Part 1: Java™ Media Framework . . . . . . . . . . . . . . . . . . . . . . . .1
Working with Time-Based Media . . . . . . . . . . . . . . . . . . . . . . .3
Streaming Media . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
Content Type . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
Media Streams . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
Common Media Formats . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
Media Presentation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
Presentation Controls . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
Latency . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
Presentation Quality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
Media Processing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .8
Demultiplexers and Multiplexers . . . . . . . . . . . . . . . . . . . . . .9
Codecs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .9
Effect Filters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .9
Renderers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .9
Compositing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .9
Media Capture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .10
Capture Devices. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .10
Capture Controls . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .10
Understanding JMF . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
High-Level Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .11
Time Model. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .13
Managers. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .14
Event Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .15
Data Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .16
Push and Pull Data Sources . . . . . . . . . . . . . . . . . . . . . . .17
Specialty DataSources . . . . . . . . . . . . . . . . . . . . . . . . . . . .18
Data Formats . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .19
Controls. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .20
Standard Controls . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .20
User Interface Components . . . . . . . . . . . . . . . . . . . . . . . . . .23
Extensibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .23
Presentation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .24
Players . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .25
Player States . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .26
Methods Available in Each Player State. . . . . . . . . . . . .28
Processors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .29
Presentation Controls . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .29
Standard User Interface Components . . . . . . . . . . . . . .30
Controller Events . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .30
Processing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .32
Processor States . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .33
Methods Available in Each Processor State. . . . . . . . . .35
Processing Controls . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .36
Data Output . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .36
Capture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
Media Data Storage and Transmission . . . . . . . . . . . . . . . . . . . . 37
Storage Controls . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
Extensibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
Implementing Plug-Ins . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
Implementing MediaHandlers and DataSources. . . . . . . . 39
MediaHandler Construction . . . . . . . . . . . . . . . . . . . . . . 39
DataSource Construction . . . . . . . . . . . . . . . . . . . . . . . . . 42
Presenting Time-Based Media with JMF . . . . . . . . . . . . . . . . 43
Controlling a Player. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
Creating a Player. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
Blocking Until a Player is Realized. . . . . . . . . . . . . . . . . 44
Using a ProcessorModel to Create a Processor . . . . . . 44
Displaying Media Interface Components . . . . . . . . . . . . . . 45
Displaying a Visual Component. . . . . . . . . . . . . . . . . . . 45
Displaying a Control Panel Component . . . . . . . . . . . . 45
Displaying a Gain-Control Component. . . . . . . . . . . . . 46
Displaying Custom Control Components. . . . . . . . . . . 46
Displaying a Download-Progress Component. . . . . . . 47
Setting the Playback Rate. . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Setting the Start Position . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
Frame Positioning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
Preparing to Start . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
Realizing and Prefetching a Player. . . . . . . . . . . . . . . . . 49
Determining the Start Latency . . . . . . . . . . . . . . . . . . . . 50
Starting and Stopping the Presentation . . . . . . . . . . . . . . . . 50
Starting the Presentation . . . . . . . . . . . . . . . . . . . . . . . . . 50
Stopping the Presentation . . . . . . . . . . . . . . . . . . . . . . . . 50
Stopping the Presentation at a Specified Time . . . . . . . 51
Releasing Player Resources . . . . . . . . . . . . . . . . . . . . . . . . . . 52
Querying a Player . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
Getting the Playback Rate . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
Getting the Media Time . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
Getting the Time-Base Time . . . . . . . . . . . . . . . . . . . . . . . . . 54
Getting the Duration of the Media Stream . . . . . . . . . . . . . 54
RTPUtil . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 223
Glossary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 229
Index. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 241
Preface
About JMF
The JMF 1.0 API (the Java Media Player API) enabled programmers to develop Java programs that presented time-based media. The JMF 2.0 API extends the framework to provide support for capturing and storing media data, controlling the type of processing that is performed during playback, and performing custom processing on media data streams. In addition, JMF 2.0 defines a plug-in API that enables advanced developers and technology providers to more easily customize and extend JMF functionality.
The classes and interfaces new in JMF 2.0 include AudioFormat, BitRateControl, and Buffer, among many others.
In addition, the MediaPlayer Java Bean has been included with the JMF
API in javax.media.bean.playerbean. MediaPlayer can be instantiated
directly and used to present one or more media streams.
Future versions of the JMF API will provide additional functionality and
enhancements while maintaining compatibility with the current API.
• Be easy to program
• Support capturing media data
• Enable the development of media streaming and conferencing applications in Java
New RTP classes include GlobalTransmissionStats and TransmissionStats.
The RTP packages have been reorganized and some classes, interfaces, and methods have been renamed to make the API easier to use. The name changes consist primarily of the removal of the RTP and RTCP prefixes from class and interface names and the elimination of non-standard abbreviations. For example, RTPRecvStreamListener has been renamed to ReceiveStreamListener. For a complete list of the changes made to the RTP packages, see the JMF 2.0 Beta release notes.
In addition, changes were made to the RTP APIs to make them compatible
with other changes in JMF 2.0:
¥ javax.media.rtp.session.io and
javax.media.rtp.session.depacketizer have been removed. Custom
RTP packetizers and depacketizers are now supported through the
JMF 2.0 plug-in architecture. Existing depacketizers will need to be
ported to the new plug-in architecture.
¥ Buffer is now the basic unit of transfer between the SessionManager
and other JMF objects, in place of DePacketizedUnit and
DePacketizedObject. RTP-formatted Buffers have a specific format
for their data and header objects.
¥ BaseEncodingInfo has been replaced by the generic JMF Format object.
An RTP-specific Format is differentiated from other formats by its
encoding string. Encoding strings for RTP-specific Formats end in
_RTP. Dynamic payload information can be provided by associating a
dynamic payload number with a Format object.
Design Goals for the JMF RTP APIs
The RTP APIs in JMF 2.0 support the reception and transmission of RTP streams and address the needs of application developers who want to use RTP to implement media streaming and conferencing applications.
Contact Information
For the latest information about JMF, visit the Sun Microsystems, Inc.
website at:
http://java.sun.com/products/java-media/jmf/
http://www.software.ibm.com/net.media/
Guide to Contents
This document is split into two parts:
• Part 1 describes the features provided by the JMF 2.0 API and illustrates how you can use JMF to incorporate time-based media in your Java applications and applets.
• Part 2 describes the support for real-time streaming provided by the JMF RTP APIs and illustrates how to send and receive streaming media across the network.
At the end of this document, you'll find appendices that contain complete sample code for some of the examples used in these chapters and a glossary of JMF-specific terms.
Change History
Version JMF 2.0 FCS
Comments
Please submit any comments or suggestions you have for improving this
document to [email protected].
Part 1:
Java™ Media Framework
1
Working with Time-Based Media
Any data that changes meaningfully with respect to time can be characterized as time-based media. Audio clips, MIDI sequences, movie clips, and animations are common forms of time-based media. Such media data can be obtained from a variety of sources, such as local or network files, cameras, microphones, and live broadcasts.
This chapter describes the key characteristics of time-based media and describes the use of time-based media in terms of a fundamental data processing model: media is captured or read from a source, processed, and then presented or stored.
Streaming Media
A key characteristic of time-based media is that it requires timely delivery and processing. Once the flow of media data begins, there are strict timing deadlines that must be met, both in terms of receiving and presenting the data. For this reason, time-based media is often referred to as streaming media: it is delivered in a steady stream that must be received and processed within a particular timeframe to produce acceptable results.
For example, when a movie is played, if the media data cannot be delivered quickly enough, there might be odd pauses and delays in playback. On the other hand, if the data cannot be received and processed quickly enough, the movie might appear jumpy as data is lost or frames are intentionally dropped in an attempt to maintain the proper playback rate.
Content Type
The format in which the media data is stored is referred to as its content type. QuickTime, MPEG, and WAV are all examples of content types. Content type is essentially synonymous with file type; the term content type is used because media data is often acquired from sources other than local files.
Media Streams
A media stream is the media data obtained from a local file, acquired over the network, or captured from a camera or microphone. Media streams often contain multiple channels of data called tracks. For example, a QuickTime file might contain both an audio track and a video track. Media streams that contain multiple tracks are often referred to as multiplexed or complex media streams. Demultiplexing is the process of extracting individual tracks from a complex media stream.
A track's type identifies the kind of data it contains, such as audio or video. The format of a track defines how the data for the track is structured.
A media stream can be identified by its location and the protocol used to access it. For example, a URL might be used to describe the location of a QuickTime file on a local or remote system. If the file is local, it can be accessed through the FILE protocol. On the other hand, if it's on a web server, the file can be accessed through the HTTP protocol. A media locator provides a way to identify the location of a media stream when a URL can't be used.
Media streams can also be categorized according to how the data is delivered:
• Pull: data transfer is initiated and controlled from the client side. For example, Hypertext Transfer Protocol (HTTP) and FILE are pull protocols.
• Push: the server initiates data transfer and controls the flow of data. For example, Real-time Transport Protocol (RTP) is a push protocol used for streaming media. Similarly, the SGI MediaBase protocol is a push protocol used for video-on-demand (VOD).
Common Media Formats
(The table of common media formats is not reproduced here; for each format it lists the content type, quality, CPU requirements, and bandwidth requirements.)
Media Presentation
Most time-based media is audio or video data that can be presented through output devices such as speakers and monitors. Such devices are the most common destination for media data output. Media streams can also be sent to other destinations, for example, saved to a file or transmitted across the network. An output destination for media data is sometimes referred to as a data sink.
Presentation Controls
While a media stream is being presented, VCR-style presentation controls
are often provided to enable the user to control playback. For example, a
control panel for a movie player might offer buttons for stopping, starting,
fast-forwarding, and rewinding the movie.
Latency
In many cases, particularly when presenting a media stream that resides on the network, the presentation of the media stream cannot begin immediately. The time it takes before presentation can begin is referred to as the start latency. Users might experience this as a delay between the time that they click the start button and the time when playback actually starts.
Multimedia presentations often combine several types of time-based media into a synchronized presentation. For example, background music might be played during an image slide-show, or animated text might be synchronized with an audio or video clip. When the presentation of multiple media streams is synchronized, it is essential to take into account the start latency of each stream; otherwise the playback of the different streams might actually begin at different times.
Presentation Quality
The quality of the presentation of a media stream depends on several factors, including the compression scheme used, the processing capability of the playback system, and the bandwidth available for receiving the data.
Traditionally, the higher the quality, the larger the file size and the greater the processing power and bandwidth required. Bandwidth is usually represented as the number of bits that are transmitted in a certain period of time, known as the bit rate.
To achieve high-quality video presentations, the number of frames displayed in each period of time (the frame rate) should be as high as possible. Movies at a frame rate of 30 frames per second are usually considered indistinguishable from regular TV broadcasts or video tapes.
Media Processing
In most instances, the data in a media stream is manipulated before it is presented to the user. Generally, a series of processing operations occur before presentation: a multiplexed stream is first demultiplexed into individual tracks, each track is decoded and possibly modified by effect filters, and the resulting tracks are then delivered to the appropriate output device. If the media stream is to be stored instead of rendered to an output device, the processing stages might differ slightly. For example, if you wanted to capture audio and video from a video camera, process the data, and save it to a file, the captured tracks would be processed, encoded, multiplexed, and then written to the file.
Codecs
A codec performs media-data compression and decompression. When a track is encoded, it is converted to a compressed format suitable for storage or transmission; when it is decoded it is converted to a non-compressed (raw) format suitable for presentation.
Each codec has certain input formats that it can handle and certain output formats that it can generate. In some situations, a series of codecs might be used to convert from one format to another.
Effect Filters
An effect filter modifies the track data in some way, often to create special effects such as blur or echo.
Effect filters are classified as either pre-processing effects or post-processing effects, depending on whether they are applied before or after the codec processes the track. Typically, effect filters are applied to uncompressed (raw) data.
Renderers
A renderer is an abstraction of a presentation device. For audio, the presentation device is typically the computer's hardware audio card that outputs sound to the speakers. For video, the presentation device is typically the computer monitor.
Compositing
Certain specialized devices support compositing. Compositing time-based media is the process of combining multiple tracks of data onto a single presentation medium. For example, overlaying text on a video presentation is one common form of compositing. Compositing can be done in either hardware or software. A device that performs compositing can be abstracted as a renderer that can receive multiple tracks of input data.
Media Capture
Time-based media can be captured from a live source for processing and
playback. For example, audio can be captured from a microphone or a
video capture card can be used to obtain video from a camera. Capturing
can be thought of as the input phase of the standard media processing
model.
A capture device might deliver multiple media streams. For example, a video camera might deliver both audio and video. These streams might be captured and manipulated separately or combined into a single, multiplexed stream that contains both an audio track and a video track.
Capture Devices
To capture time-based media you need specialized hardware. For example, to capture audio from a live source, you need a microphone and an appropriate audio card. Similarly, capturing a TV broadcast requires a TV tuner and an appropriate video capture card. Most systems provide a query mechanism to find out what capture devices are available.
Capture devices can be characterized as either push or pull sources. For example, a still camera is a pull source: the user controls when to capture an image. A microphone is a push source: the live source continuously provides a stream of audio.
The format of a captured media stream depends on the processing performed by the capture device. Some devices do very little processing and deliver raw, uncompressed data. Other capture devices might deliver the data in a compressed format.
Capture Controls
Controls are sometimes provided to enable the user to manage the capture
process. For example, a capture control panel might enable the user to
specify the data rate and encoding type for the captured stream and start
and stop the capture process.
2
Understanding JMF
High-Level Architecture
Devices such as tape decks and VCRs provide a familiar model for recording, processing, and presenting time-based media. When you play a movie using a VCR, you provide the media stream to the VCR by inserting a video tape. The VCR reads and interprets the data on the tape and sends appropriate signals to your television and speakers.
Figure 2-1: Recording, processing, and presenting time-based media. (The figure shows a video camera as the capture device delivering media to output devices as the destination.)
JMF uses this same basic model. A data source encapsulates the media stream much like a video tape and a player provides processing and control mechanisms similar to a VCR. Playing and capturing audio and video with JMF requires the appropriate input and output devices such as microphones, cameras, speakers, and monitors.
Data sources and players are integral parts of JMF's high-level API for managing the capture, presentation, and processing of time-based media. JMF also provides a lower-level API that supports the seamless integration of custom processing components and extensions. This layering provides Java developers with an easy-to-use API for incorporating time-based media into Java programs while maintaining the flexibility and extensibility required to support advanced media applications and future media technologies.
Time Model
JMF keeps time to nanosecond precision. A particular point in time is typically represented by a Time object, though some classes also support the specification of time in nanoseconds.
Classes that support the JMF time model implement Clock to keep track of time for a particular media stream. The Clock interface defines the basic timing and synchronization operations that are needed to control the presentation of media data.
A Clock uses a TimeBase to keep track of the passage of time while a media stream is being presented. A TimeBase provides a constantly ticking time source, much like a crystal oscillator in a watch. The only information that a TimeBase provides is its current time, which is referred to as the time-base time. To keep track of the current media time, a Clock uses:
• The time-base start-time: the time that its TimeBase reports when the presentation begins.
• The media start-time: the position in the media stream where presentation begins.
• The playback rate: how fast the Clock is running in relation to its TimeBase. The rate is a scale factor that is applied to the TimeBase. For example, a rate of 1.0 represents the normal playback rate for the media stream, while a rate of 2.0 indicates that the presentation will run at twice the normal rate. A negative rate indicates that the Clock is running in the opposite direction from its TimeBase; for example, a negative rate might be used to play a media stream backward.
When the presentation stops, the media time stops, but the time-base time
continues to advance. If the presentation is restarted, the media time is
remapped to the current time-base time.
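The relationship between the media time and the time-base time while a Clock is started can be summarized by the following sketch (the variable names are illustrative, not part of the API):

// Media time is derived from the time-base time, the two start times,
// and the playback rate (times in nanoseconds).
long mediaTimeNanos = mediaStartTimeNanos
        + (long) (rate * (timeBaseTimeNanos - timeBaseStartTimeNanos));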
Managers
The JMF API consists mainly of interfaces that define the behavior and interaction of objects used to capture, process, and present time-based media. Implementations of these interfaces operate within the structure of the framework. By using intermediary objects called managers, JMF makes it easy to integrate new implementations of key interfaces that can be used seamlessly with existing classes.
To write programs based on JMF, you'll need to use the Manager create methods to construct the Players, Processors, DataSources, and DataSinks for your application. If you're capturing media data from an input device, you'll use the CaptureDeviceManager to find out what devices are available and access information about them. If you're interested in controlling what processing is performed on the data, you might also query the PlugInManager to determine what plug-ins have been registered.
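For example, a Player is typically obtained through a Manager create method rather than instantiated directly. A minimal sketch (the file name is illustrative and exception handling is omitted):

// Ask the Manager to construct a Player appropriate for the media.
Player player = Manager.createPlayer(new MediaLocator("file:sample.mov"));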
Event Model
JMF uses a structured event reporting mechanism to keep JMF-based programs informed of the current state of the media system and enable JMF-based programs to respond to media-driven error conditions, such as out-of-data and resource-unavailable conditions. Whenever a JMF object needs to report on the current conditions, it posts a MediaEvent. MediaEvent is subclassed to identify many particular types of events. These objects follow the established Java Beans patterns for events.
For each type of JMF object that can post MediaEvents, JMF defines a corresponding listener interface. To receive notification when a MediaEvent is posted, you implement the appropriate listener interface and register your listener class with the object that posts the event by calling its addListener method.
Controller objects (such as Players and Processors) and certain Control
objects such as GainControl post media events.
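For example, a minimal listener for Controller events might look like the following sketch (the class name is illustrative):

import javax.media.*;

public class PlayerEventLogger implements ControllerListener {
    // Called whenever the Controller posts a ControllerEvent.
    public void controllerUpdate(ControllerEvent event) {
        if (event instanceof RealizeCompleteEvent) {
            System.out.println("Controller realized");
        } else if (event instanceof EndOfMediaEvent) {
            System.out.println("End of media reached");
        }
    }
}

The listener is registered by calling player.addControllerListener(new PlayerEventLogger()).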
(Figure: a Controller posts ControllerEvents, which extend MediaEvent, to registered ControllerListeners; addControllerListener and removeControllerListener manage the listeners, controllerUpdate(ControllerEvent) receives the events, and getSourceController identifies the posting Controller.)
Data Model
JMF media players usually use DataSources to manage the transfer of media content. A DataSource encapsulates both the location of media and the protocol and software used to deliver the media. Once obtained, the source cannot be reused to deliver other media.
A DataSource is identified by either a JMF MediaLocator or a URL (universal resource locator). A MediaLocator is similar to a URL and can be constructed from a URL, but can be constructed even if the corresponding protocol handler is not installed on the system. (In Java, a URL can only be constructed if the corresponding protocol handler is installed on the system.)
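For example, a DataSource can be constructed from a MediaLocator through the Manager (a sketch; the URL is illustrative and exception handling is omitted):

MediaLocator locator = new MediaLocator("file:sample.mov");
DataSource source = Manager.createDataSource(locator);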
A DataSource manages a set of SourceStream objects. A standard data source uses a byte array as the unit of transfer. A buffer data source uses a Buffer object as its unit of transfer. JMF defines several types of DataSource objects:
(Figure: the DataSource class hierarchy. PullDataSource (including URLDataSource) delivers data through PullSourceStreams such as InputSourceStream; PullBufferDataSource uses PullBufferStreams; PushDataSource uses PushSourceStreams; and PushBufferDataSource uses PushBufferStreams. DataSources implement the Controls and Duration interfaces.)
• Pull Data-Source: the client initiates the data transfer and controls the flow of data from pull data-sources. Established protocols for this type of data include Hypertext Transfer Protocol (HTTP) and FILE. JMF defines two types of pull data sources: PullDataSource and PullBufferDataSource, which uses a Buffer object as its unit of transfer.
• Push Data-Source: the server initiates the data transfer and controls the flow of data from a push data-source. Push data-sources include broadcast media, multicast media, and video-on-demand (VOD). For broadcast data, one protocol is the Real-time Transport Protocol (RTP), under development by the Internet Engineering Task Force (IETF). The MediaBase protocol developed by SGI is one protocol used for VOD. JMF defines two types of push data sources: PushDataSource and PushBufferDataSource, which uses a Buffer object as its unit of transfer.
The degree of control that a client program can extend to the user depends on the type of data source being presented. For example, an MPEG file can be repositioned and a client program could allow the user to replay the video clip or seek to a new position in the video. In contrast, broadcast media is under server control and cannot be repositioned. Some VOD protocols might support limited user control; for example, a client program might be able to allow the user to seek to a new position, but not fast forward or rewind.
Specialty DataSources
JMF defines two types of specialty data sources, cloneable data sources and merging data sources.
A cloneable data source can be used to create clones of either a pull or push DataSource. To create a cloneable DataSource, you call the Manager createCloneableDataSource method and pass in the DataSource you want to clone. Once a DataSource has been passed to createCloneableDataSource, you should only interact with the cloneable DataSource and its clones; the original DataSource should no longer be used directly.
Cloneable data sources implement the SourceCloneable interface, which defines one method, createClone. By calling createClone, you can create any number of clones of the DataSource that was used to construct the cloneable DataSource. The clones can be controlled through the cloneable DataSource used to create them: when connect, disconnect, start, or stop is called on the cloneable DataSource, the method calls are propagated to the clones.
The clones don't necessarily have the same properties as the cloneable data source used to create them or the original DataSource. For example, a cloneable data source created for a capture device might function as a master data source for its clones; in this case, unless the cloneable data source is used, the clones won't produce any data. If you hook up both the cloneable data source and one or more clones, the clones will produce data at the same rate as the master.
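A minimal sketch of creating and using a cloneable DataSource (exception handling is omitted, and source stands for any existing DataSource):

// Wrap an existing DataSource so that it can be cloned.
DataSource cloneable = Manager.createCloneableDataSource(source);
// Create a clone; from now on, interact with the cloneable DataSource, not the original.
DataSource clone = ((SourceCloneable) cloneable).createClone();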
A MergingDataSource can be used to combine the SourceStreams from several DataSources into a single DataSource. This enables a set of DataSources to be managed from a single point of control: when connect, disconnect, start, or stop is called on the MergingDataSource, the method calls are propagated to the merged DataSources.
To construct a MergingDataSource, you call the Manager createMergingDataSource method and pass in an array that contains the data sources you want to merge. To be merged, all of the DataSources must be of the same type; for example, you cannot merge a PullDataSource and a PushDataSource. The duration of the merged DataSource is the maximum of the merged DataSource objects' durations. The ContentType is application/mixed-media.
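A minimal sketch of merging two DataSources of the same type (the variables are illustrative and exception handling is omitted):

DataSource[] sources = { audioSource, videoSource };  // must all be the same type
DataSource merged = Manager.createMergingDataSource(sources);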
Data Formats
The exact media format of an object is represented by a Format object. The format itself carries no encoding-specific parameters or global timing information; it describes the format's encoding name and the type of data the format requires.
JMF extends Format to define audio- and video-specific formats.
(Figure: JMF Format classes. AudioFormat and VideoFormat extend Format; VideoFormat is further extended by IndexedColorFormat, RGBFormat, YUVFormat, JPEGFormat, H261Format, and H263Format. A FormatControl provides getFormat, setFormat, getSupportedFormats, isEnabled, and setEnabled.)
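For example, specific formats can be described by constructing AudioFormat and VideoFormat objects (a sketch; the parameter values are illustrative):

// 44.1 kHz, 16-bit, stereo audio in linear (PCM) encoding.
AudioFormat pcm = new AudioFormat(AudioFormat.LINEAR, 44100, 16, 2);
// JPEG-encoded video, with the remaining attributes left unspecified.
VideoFormat jpeg = new VideoFormat(VideoFormat.JPEG);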
Controls
JMF Control provides a mechanism for setting and querying attributes of an object. A Control often provides access to a corresponding user interface component that enables user control over an object's attributes. Many JMF objects expose Controls, including Controller objects, DataSource objects, DataSink objects, and JMF plug-ins.
Any JMF object that wants to provide access to its corresponding Control objects can implement the Controls interface. Controls defines methods for retrieving associated Control objects. DataSource and PlugIn use the Controls interface to provide access to their Control objects.
Standard Controls
JMF defines the standard Control interfaces shown in Figure 2-8, "JMF controls."
CachingControl enables download progress to be monitored and displayed. If a Player or Processor can report its download progress, it implements this interface so that a progress bar can be displayed to the user.
GainControl enables audio volume adjustments such as setting the level
and muting the output of a Player or Processor. It also supports a listener
mechanism for volume changes.
(Figure: GainControl posts a GainChangeEvent, which extends MediaEvent, to registered GainChangeListeners via gainChange(GainChangeEvent); addGainChangeListener and removeGainChangeListener manage the listeners, and getDB, getLevel, and getMute query the current gain settings.)
Figure 2-8: JMF controls. The standard Control interfaces are BitRateControl, BufferControl, CachingControl, FormatControl (and its subinterface TrackControl), FrameGrabbingControl, FramePositioningControl, FrameProcessingControl, FrameRateControl, GainControl, H261Control, H263Control, KeyFrameControl, MonitorControl, MpegAudioControl, PacketSizeControl, PortControl, QualityControl, RTPControl, SilenceSuppressionControl, and StreamWriterControl.
Extensibility
Advanced developers and technology providers can extend JMF functionality in two ways: by implementing custom plug-ins, or by implementing custom MediaHandlers and DataSources.
Note: JMF Players and Processors are not required to support plug-ins. Plug-ins won't work with JMF 1.0-based Players and some Processor implementations might choose not to support them. The reference implementation of JMF 2.0 provided by Sun Microsystems, Inc. and IBM Corporation fully supports the plug-in API.
Presentation
In JMF, the presentation process is modeled by the Controller interface. Controller defines the basic state and control mechanism for an object that controls, presents, or captures time-based media. It defines the phases that a media controller goes through and provides a mechanism for controlling the transitions between those phases. A number of the operations that must be performed before media data can be presented can be time consuming, so JMF allows programmatic control over when they occur.
A Controller posts a variety of controller-specific MediaEvents to provide notification of changes in its status. To receive events from a Controller such as a Player, you implement the ControllerListener interface. For more information about the events posted by a Controller, see "Controller Events" on page 30.
The JMF API defines two types of Controllers: Players and Processors. A Player or Processor is constructed for a particular data source and is normally not re-used to present other media data.
(Figure: the Controller interfaces. Controller extends Clock and Duration; a Clock has a TimeBase. Player extends both Controller and MediaHandler, and Processor extends Player.)
Players
A Player processes an input stream of media data and renders it at a precise time. A DataSource is used to deliver the input media stream to the Player. The rendering destination depends on the type of media being presented.
A Player does not provide any control over the processing that it performs or how it renders the media data.
Player supports standardized user control and relaxes some of the operational restrictions imposed by Clock and Controller.
(Figure: the Player interfaces. Clock, which has a TimeBase, defines syncStart, stop, getMediaTime, getTimeBase, setTimeBase, and setRate; Duration defines getDuration; Controller extends both and adds prefetch, realize, deallocate, close, and addControllerListener; MediaHandler defines setSource; Player extends Controller and MediaHandler.)
Player States
A Player can be in one of six states. The Clock interface defines the two primary states: Stopped and Started. To facilitate resource management, Controller breaks the Stopped state down into five standby states: Unrealized, Realizing, Realized, Prefetching, and Prefetched.
(Figure: Player states. A RealizeCompleteEvent (RCE) and a PrefetchCompleteEvent (PFCE) mark the transitions toward the Started state, and a StopEvent (SE) returns a Started Player to a Stopped state.)
In normal operation, a Player steps through each state until it reaches the
Started state:
• A Player in the Unrealized state has been instantiated, but does not yet know anything about its media. When a media Player is first created, it is Unrealized.
• When realize is called, a Player moves from the Unrealized state into the Realizing state. A Realizing Player is in the process of determining its resource requirements. During realization, a Player acquires the resources that it only needs to acquire once. These might include rendering resources other than exclusive-use resources. (Exclusive-use resources are limited resources such as particular hardware devices that can only be used by one Player at a time; such resources are acquired during Prefetching.) A Realizing Player often downloads assets over the network.
• When a Player finishes Realizing, it moves into the Realized state. A Realized Player knows what resources it needs and information about the type of media it is to present. Because a Realized Player knows how to render its data, it can provide visual components and controls. Its connections to other objects in the system are in place, but it does not own any resources that would prevent another Player from starting.
• When prefetch is called, a Player moves from the Realized state into the Prefetching state. A Prefetching Player is preparing to present its media. During this phase, the Player preloads its media data, obtains exclusive-use resources, and does whatever else it needs to do to prepare itself to play. Prefetching might have to recur if a Player object's media presentation is repositioned, or if a change in the Player object's rate requires that additional buffers be acquired or alternate processing take place.
• When a Player finishes Prefetching, it moves into the Prefetched state. A Prefetched Player is ready to be started.
• Calling start puts a Player into the Started state. A Started Player object's time-base time and media time are mapped and its clock is running, though the Player might be waiting for a particular time to begin presenting its media data.
Processors
Processors can also be used to present media data. A Processor is just a
specialized type of Player that provides control over what processing is
performed on the input media stream. A Processor supports all of the
same presentation controls as a Player.
Presentation Controls
In addition to the standard presentation controls defined by Controller, a Player or Processor might also provide a way to adjust the playback volume. If so, you can retrieve its GainControl by calling getGainControl. A GainControl object posts a GainChangeEvent whenever the gain is modified. By implementing the GainChangeListener interface, you can respond to gain changes. For example, you might want to update a custom gain control Component.
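A sketch of adjusting the volume and listening for gain changes, assuming the Player provides a GainControl:

GainControl gain = player.getGainControl();
if (gain != null) {
    gain.setLevel(0.8f);  // level is a value between 0.0 and 1.0
    gain.addGainChangeListener(new GainChangeListener() {
        public void gainChange(GainChangeEvent event) {
            System.out.println("New gain level: " + event.getLevel());
        }
    });
}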
Additional custom Control types might be supported by a particular
Player or Processor implementation to provide other control behaviors
and expose custom user interface components. You access these controls
through the getControls method.
For example, the CachingControl interface extends Control to provide a mechanism for displaying a download progress bar. If a Player can report its download progress, it implements this interface. To find out if a Player supports CachingControl, you can call getControl(CachingControl) or use getControls to get a list of all the supported Controls.
You can also implement custom user interface components, and use the
event listener mechanism to determine when they need to be updated.
Controller Events
The ControllerEvents posted by a Controller such as a Player or Processor fall into three categories: change notifications, closed events, and transition events:
(Figure: the ControllerEvent hierarchy, including AudioDeviceUnavailableEvent, CachingControlEvent, ControllerClosedEvent and its error subclasses (ControllerErrorEvent, ConnectionErrorEvent, InternalErrorEvent, ResourceUnavailableEvent, DataLostErrorEvent), DurationUpdateEvent, RateChangeEvent, SizeChangeEvent, StopTimeChangeEvent, MediaTimeSetEvent, and TransitionEvent with its subclasses ConfigureCompleteEvent, PrefetchCompleteEvent, RealizeCompleteEvent, StartEvent, StopEvent (DataStarvedEvent, EndOfMediaEvent, RestartingEvent, StopAtTimeEvent, StopByRequestEvent), and DeallocateEvent.)
Processing
A Processor is a Player that takes a DataSource as input, performs some user-defined processing on the media data, and then outputs the processed media data.
(Figure: the Processor interface extends Player, which has a DataSource and defines start, setSource, addController, getVisualComponent, and getControlPanelComponent.)
Figure 2-16: Processor stages. (Each track in the input stream is passed through a chain of plug-ins and delivered to a Renderer.)
Processor States
A Processor has two additional standby states, Configuring and Configured, which occur before the Processor enters the Realizing state.
(Figure: Processor states, from Unrealized through Configuring, Configured, Realizing, Realized, Prefetching, Prefetched, and Started. Transition events: ConfigureCompleteEvent (CCE), RealizeCompleteEvent (RCE), PrefetchCompleteEvent (PFCE), and StopEvent (SE).)
Processing Controls
You can control what processing operations the Processor performs on a track through the TrackControl for that track. You call Processor getTrackControls to get the TrackControl objects for all of the tracks in the media stream.
Through a TrackControl, you can explicitly select the Effect, Codec, and Renderer plug-ins you want to use for the track. To find out what options are available, you can query the PlugInManager to find out what plug-ins are installed.
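A sketch of selecting a plug-in for a video track; MyGrayEffect is a hypothetical Effect plug-in, not part of JMF, and the Processor is assumed to be Configured:

TrackControl[] tracks = processor.getTrackControls();
for (int i = 0; i < tracks.length; i++) {
    if (tracks[i].getFormat() instanceof VideoFormat) {
        try {
            // Insert the custom effect into the processing chain for this track.
            tracks[i].setCodecChain(new Codec[] { new MyGrayEffect() });
        } catch (UnsupportedPlugInException e) {
            System.err.println("Effect not supported: " + e);
        }
    }
}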
To control the transcoding that's performed on a track by a particular Codec, you can get the Controls associated with the track by calling the TrackControl getControls method. This method returns the codec controls available for the track, such as BitRateControl and QualityControl. (For more information about the codec controls defined by JMF, see "Controls" on page 20.)
If you know the output data format that you want, you can use the setFormat method to specify the Format and let the Processor choose an appropriate codec and renderer. Alternatively, you can specify the output format when the Processor is created by using a ProcessorModel. A ProcessorModel defines the input and output requirements for a Processor. When a ProcessorModel is passed to the appropriate Manager create method, the Manager does its best to create a Processor that meets the specified requirements.
Data Output
The getDataOutput method returns a Processor object's output as a DataSource. This DataSource can be used as the input to another Player or Processor or as the input to a data sink. (For more information about data sinks, see "Media Data Storage and Transmission" on page 37.)
A Processor object's output DataSource can be of any type: PushDataSource, PushBufferDataSource, PullDataSource, or PullBufferDataSource.
Not all Processor objects output data; a Processor can render the processed data instead of outputting the data to a DataSource. A Processor that renders the media data is essentially a configurable Player.
Capture
A multimedia capturing device can act as a source for multimedia data delivery. For example, a microphone can capture raw audio input or a digital video capture board might deliver digital video from a camera. Such capture devices are abstracted as DataSources. For example, a device that provides timely delivery of data can be represented as a PushDataSource.
Any type of DataSource can be used as a capture DataSource: PushDataSource, PushBufferDataSource, PullDataSource, or PullBufferDataSource.
Storage Controls
A DataSink posts a DataSinkEvent to report on its status. A DataSinkEvent can be posted with a reason code, or the DataSink can post one of the following DataSinkEvent subtypes: EndOfStreamEvent or DataSinkErrorEvent.
Extensibility
You can extend JMF by implementing custom plug-ins, media handlers,
and data sources.
Implementing Plug-Ins
By implementing one of the JMF plug-in interfaces, you can directly access and manipulate the media data associated with a Processor. The plug-in interfaces include Demultiplexer, Codec, Effect, Multiplexer, and Renderer.
Note: The JMF Plug-In API is part of the official JMF API, but JMF Players and Processors are not required to support plug-ins. Plug-ins won't work with JMF 1.0-based Players and some Processor implementations might choose not to support them. The reference implementation of JMF 2.0 provided by Sun Microsystems, Inc. and IBM Corporation fully supports the plug-in API.
MediaHandler Construction
Players, Processors, and DataSinks are all types of MediaHandlers: they all read data from a DataSource. A MediaHandler is always constructed for a particular DataSource, which can be either identified explicitly or with a MediaLocator. When one of the createMediaHandler methods is called, Manager uses the content-type name obtained from the DataSource to find and create an appropriate MediaHandler object.
(Figure: Manager uses the PackageManager (getContentPrefixList, getProtocolPrefixList) to locate classes. Its create methods (createDataSource, createPlayer, createRealizedPlayer, createProcessor, createRealizedProcessor, createDataSink) construct DataSources and MediaHandlers such as Players, Processors, DataSinks, MediaProxys, and DataSinkProxys.)
• Manager steps through each class in the search list until it finds a class named Handler that can be constructed and to which it can attach the DataSource.
<content package-prefix>.media.content.<content-type>.Handler
<content package-prefix>.media.processor.<content-type>.Handler
<content package-prefix>.media.datasink.protocol.Handler
<content package-prefix>.media.datasink.protocol.<content-type>.Handler
DataSource Construction
Manager uses the same mechanism to construct DataSources that it uses to construct MediaHandlers, except that it generates the search list of DataSource class names from the list of installed protocol package-prefixes. For each protocol package-prefix, Manager adds to the search list a class name of the form:
<protocol package-prefix>.media.protocol.<protocol>.DataSource
Manager steps through each class in the list until it finds a DataSource that it can instantiate and to which it can attach the MediaLocator.
3
Presenting Time-Based Media with JMF
To present time-based media such as audio or video with JMF, you use a Player. Playback can be controlled programmatically, or you can display a control-panel component that enables the user to control playback interactively. If you have several media streams that you want to play, you need to use a separate Player for each one. To play them in sync, you can use one of the Player objects to control the operation of the others.
A Processor is a special type of Player that can provide control over how the media data is processed before it is presented. Whether you're using a basic Player or a more advanced Processor to present media content, you use the same methods to manage playback. For information about how to control what processing is performed by a Processor, see "Processing Time-Based Media with JMF" on page 71.
The MediaPlayer bean is a Java Bean that encapsulates a JMF Player to provide an easy way to present media from an applet or application. The MediaPlayer bean automatically constructs a new Player when a different media stream is selected, which makes it easier to play a series of media clips or allow the user to select which media clip to play. For information about using the MediaPlayer bean, see "Presenting Media with the MediaPlayer Bean" on page 66.
Controlling a Player
To play a media stream, you need to construct a Player for the stream, configure the Player and prepare it to run, and then start the Player to begin playback.
Creating a Player
You create a Player indirectly through the media Manager. To display the Player, you get the Player object's components and add them to your applet's presentation space or application window.
When you need to create a new Player, you request it from the Manager by calling createPlayer or createProcessor. The Manager uses the media URL or MediaLocator that you specify to create an appropriate Player. A URL can only be successfully constructed if the appropriate corresponding URLStreamHandler is installed. MediaLocator doesn't have this restriction.
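Using a ProcessorModel to Create a Processor
The following sketch illustrates the approach discussed below, creating a realized Processor from a ProcessorModel that requests IMA4-encoded audio (a reconstruction; the output ContentDescriptor shown is illustrative and exception handling is omitted):

Format[] formats = { new AudioFormat(AudioFormat.IMA4) };
ProcessorModel model = new ProcessorModel(formats,
        new ContentDescriptor(ContentDescriptor.RAW));
Processor processor = Manager.createRealizedProcessor(model);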
Since the ProcessorModel does not specify a source URL in this example, Manager implicitly finds a capture device that can capture audio and then creates a Processor that can encode that into IMA4.
Note that when you create a Realized Processor with a ProcessorModel you will not be able to specify processing options through the Processor object's TrackControls. For more information about specifying processing options for a Processor, see "Processing Time-Based Media with JMF" on page 71.
You can access the Player object's display properties, such as its x and y coordinates, through its visual component. The layout of the Player components is controlled through the AWT layout manager.
Every Player provides a default control panel. To display the default control panel, call getControlPanelComponent and add the returned Component to your applet or application window.
Note that getControls does not return a Player object's GainControl. You can only access the GainControl by calling getGainControl.
Example 3-2: Using getControls to find out what Controls are supported.
Control[] controls = player.getControls();
for (int i = 0; i < controls.length; i++) {
    // Check each Control object to see if it is a CachingControl.
    if (controls[i] instanceof CachingControl) {
        cachingControl = (CachingControl) controls[i];
    }
}
A rate of 2.0, for example, means that media time passes twice as fast as the time-base time when the Player is started.
In theory, a Player object's rate could be set to any real number, with negative rates interpreted as playing the media in reverse. However, some media formats have dependencies between frames that make it impossible or impractical to play them in reverse or at non-standard rates.
To set the rate, you call setRate and pass in the temporal scale factor as a float value. When setRate is called, the method returns the rate that is actually set, even if it has not changed. Players are only guaranteed to support a rate of 1.0.
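For example (a sketch):

float actual = player.setRate(2.0f);  // request playback at twice the normal speed
// 'actual' holds the rate the Player is really using, which might still be 1.0.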
Frame Positioning
Some Players allow you to seek to a particular frame of a video. This enables you to easily set the start position to the beginning of a particular frame without having to specify the exact media time that corresponds to that position. Players that support frame positioning implement the FramePositioningControl.
Some Players can convert between media times and frame positions. You can use the FramePositioningControl mapFrameToTime and mapTimeToFrame methods to access this information, if it's available. (Players that support FramePositioningControl are not required to export this information.) Note that there is not a one-to-one correspondence between media times and frames: a frame has a duration, so several different media times might map to the same frame. (See "Getting the Media Time" on page 53 for more information.)
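A sketch of using FramePositioningControl, assuming the Player exposes it:

FramePositioningControl fpc = (FramePositioningControl)
        player.getControl("javax.media.control.FramePositioningControl");
if (fpc != null) {
    fpc.seek(100);                     // position the Player at frame 100
    Time t = fpc.mapFrameToTime(100);  // media time of frame 100, if available
}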
Preparing to Start
Most media Players cannot be started instantly. Before the Player can start, certain hardware and software conditions must be met. For example, if the Player has never been started, it might be necessary to allocate buffers in memory to store the media data. Or, if the media data resides on a network device, the Player might have to establish a network connection before it can download the data. Even if the Player has been started before, the buffers might contain data that is not valid for the current media position.
You call realize to move the Player into the Realizing state and begin the realization process. You call prefetch to move the Player into the Prefetching state and initiate the prefetching process. The realize and prefetch methods are asynchronous and return immediately. When the Player completes the requested operation, it posts a RealizeCompleteEvent or PrefetchCompleteEvent. "Player States" on page 26 describes the operations that a Player performs in each of these states.
A Player in the Prefetched state is prepared to start and its start-up latency cannot be further reduced. However, setting the media time through setMediaTime might return the Player to the Realized state and increase its start-up latency.
Keep in mind that a Prefetched Player ties up system resources. Because some resources, such as sound cards, might only be usable by one program at a time, having a Player in the Prefetched state might prevent other Players from starting.
1. Specify the point in the media stream at which you want to start by calling setMediaTime.
2. Call start on the Player.
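For example (a sketch):

player.setMediaTime(new Time(5.0));  // begin playback five seconds into the stream
player.start();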
When a Player is stopped, its media time is frozen if the source of the media can be controlled. If the Player is presenting streamed media, it might not be possible to freeze the media time. In this case, only the receipt of the media data is stopped; the data continues to be streamed and the media time continues to advance.
When a Stopped Player is restarted, if the media time was frozen, presenta-
tion resumes from the stop time. If media time could not be frozen when
the Player was stopped, reception of the stream resumes and playback
begins with the newly-received data.
To stop a Player immediately, you call the stop method. If you call stop on
a Stopped Player, the only effect is that a StopByRequestEvent is posted in
acknowledgment of the method call.
If the Started Player already has a stop time, setStopTime throws an error.
You can call getStopTime to get the currently scheduled stop time. If the
clock has no scheduled stop time, getStopTime returns Clock.RESET. To
remove the stop time so that the Player continues until it reaches end-of-
media, call setStopTime(Clock.RESET).
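For example (a sketch):

player.setStopTime(new Time(20.0));  // stop when media time 20.0 is reached
// Later, remove the scheduled stop time so playback continues to end-of-media:
player.setStopTime(Clock.RESET);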
Querying a Player
A Player can provide information about its current parameters, including
its rate, media time, and duration.
(Figure: getMediaTime and frame duration. Each frame spans a range of media times; in this example, the first frame lasts from media time 0.0 to 5.0.)
If you start the Player at time 0.0, while the first frame is displayed, the media time advances from 0.0 to 5.0. If you start at time 2.0, the first frame is displayed for 3 seconds, until time 5.0 is reached.
myCurrentTBTime = player1.getTimeBase().getTime();
When a Player is running, you can get the time-base time that corresponds to a particular media time by calling mapToTimeBase.
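A minimal sketch (mapToTimeBase throws a ClockStoppedException if the Player is not started):

try {
    Time mediaTime = player.getMediaTime();
    Time timeBaseTime = player.mapToTimeBase(mediaTime);
} catch (ClockStoppedException e) {
    // The mapping is only defined while the Player is started.
}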
This filters out the events that you are not interested in. If you have registered as a listener with multiple Controllers, you also need to determine which Controller posted the event. ControllerEvents come "stamped" with a reference to their source that you can access by calling getSource.
When you receive events from a Controller, you might need to do some additional processing to ensure that the Controller is in the proper state before calling a control method. For example, before calling any of the methods that are restricted to Stopped Players, you should check the Player object's target state by calling getTargetState. If start has been called, the Player is considered to be in the Started state, though it might be posting transition events as it prepares to present the media.
Some types of ControllerEvents contain additional state information. For example, the StartEvent and StopEvent classes each define a method that allows you to retrieve the media time at which the event occurred.
Using ControllerAdapter
ControllerAdapter is a convenience class that implements ControllerListener and can be easily extended to respond to particular events. To use it, you extend ControllerAdapter, override the methods for the events you are interested in, and register your adapter with the Controller.
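For example, a sketch that rewinds and restarts playback when the end of the media is reached:

player.addControllerListener(new ControllerAdapter() {
    public void endOfMedia(EndOfMediaEvent event) {
        // Rewind and restart the Player.
        player.setMediaTime(new Time(0));
        player.start();
    }
});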
player1.setTimeBase(player2.getTimeBase());
When you synchronize Players by associating them with the same TimeBase, you must still manage the control of each Player individually. Because managing synchronized Players in this way can be complicated, JMF provides a mechanism that allows a Player to assume control over any other Controller. The Player manages the states of these Controllers automatically, allowing you to interact with the entire group through a single point of control. For more information, see "Using a Player to Synchronize Controllers".
When you interact with a managing Player, your instructions are automatically passed along to the managed Controllers as appropriate. The managing Player takes care of the state management and synchronization for all of the other Controllers.
This mechanism is implemented through the addController and removeController methods. When you call addController on a Player, the Controller you specify is added to the list of Controllers managed by the Player. Conversely, when you call removeController, the specified Controller is removed from the list of managed Controllers.
Adding a Controller
You use the addController method to add a Controller to the list of Controllers managed by a particular Player. To be added, a Controller must be in the Realized state; otherwise, a NotRealizedError is thrown. Two Players cannot be placed under control of each other. For example, if player1 is placed under the control of player2, player2 cannot be placed under the control of player1 without first removing player1 from player2's control.
player2.addController(player1);
setRate
When the managing Player is Stopped: invokes setRate on all managed Controllers. Returns the actual rate that was supported by all Controllers and set.
When the managing Player is Started: stops all managed Controllers, invokes setRate, and restarts the Controllers. Returns the actual rate that was supported by all Controllers and set.
Removing a Controller
You use the removeController method to remove a Controller from the
list of controllers managed by a particular Player.
player2.removeController(player1);
• Start the Player objects by calling syncStart with a time that takes into account the maximum latency.
You must listen for transition events for all of the Player objects and
keep track of which ones have posted events. For example, when you
prefetch the Player objects, you need to keep track of which ones have
posted PrefetchComplete events so that you can be sure all of them are
Prefetched before calling syncStart. Similarly, when you request that the
synchronized Player objects stop at a particular time, you need to listen
for the stop event posted by each Player to determine when all of them
have actually stopped.
In some situations, you need to be careful about responding to events
posted by the synchronized Player objects. To be sure of the state of all of
the Player objects, you might need to wait at certain stages for all of them
to reach the same state before continuing.
For example, assume that you are using one Player to drive a group of
synchronized Player objects. A user interacting with that Player sets the
media time to 10, starts the Player, and then changes the media time to 20.
You then:
1. Pass along the first setMediaTime call to all of the synchronized Player
objects.
2. Call prefetch on each Player to prepare them to start.
3. Call stop on each Player when the second set media time request is re-
ceived.
4. Call setMediaTime on each Player with the new time.
5. Restart the prefetching operation.
6. When all of the Player objects have been prefetched, start them by call-
ing syncStart, taking into account their start latencies.
In this case, just listening for PrefetchComplete events from all of the Player objects before calling syncStart isn't sufficient. You can't tell whether those events were posted in response to the first or second prefetch operation. To avoid this problem, you can block when you call stop and wait for all of the Player objects to post stop events before continuing. This guarantees that the next PrefetchComplete events you receive are the ones that you are really interested in.
The Player object's visual presentation and its controls are displayed within the applet's presentation space in the browser window. If you create a Player in a Java application, you are responsible for creating the window to display the Player object's components.
Note: While PlayerApplet illustrates the basic usage of a Player, it does not perform the error handling necessary in a real applet or application. For a more complete sample suitable for use as a template, see "JMF Applet" on page 173.
Overview of PlayerApplet
The APPLET tag is used to invoke PlayerApplet in an HTML file. The WIDTH and HEIGHT fields of the HTML APPLET tag determine the dimensions of the applet's presentation space in the browser window. The PARAM tag identifies the media file to be played.
Example 3-5: Invoking PlayerApplet.
<APPLET CODE=ExampleMedia.PlayerApplet
WIDTH=320 HEIGHT=300>
<PARAM NAME=FILE VALUE="sample2.mpg">
</APPLET>
When a user opens a web page containing PlayerApplet, the applet loads automatically and runs in the specified presentation space, which contains the Player object's visual component and default controls. The Player starts and plays the MPEG movie once. The user can use the default Player controls to stop, restart, or replay the movie. If the page containing the applet is closed while the Player is playing the movie, the Player automatically stops and frees the resources it was using.
To accomplish this, PlayerApplet extends Applet and implements the ControllerListener interface. PlayerApplet defines five methods:
• init: creates a Player for the file that was passed in through the PARAM tag and registers PlayerApplet as a controller listener so that it can observe media events posted by the Player. (This causes the PlayerApplet controllerUpdate method to be called whenever the Player posts an event.)
• start: starts the Player when PlayerApplet is started.
• stop: stops and deallocates the Player when PlayerApplet is stopped.
• destroy: closes the Player when PlayerApplet is removed.
Deallocating the Player releases any resources that would prevent another
Player from being started. For example, if the Player uses a hardware
device to present its media, deallocate frees that device so that other
Players can use it.
playback. This makes it easy to play a series of media clips or enable the
user to select the media clip that they want to play.
A MediaPlayer bean has several properties that you can set, including the
media source:
Show control panel (Boolean, default Yes): Controls whether or not the video control panel is visible.
Media location (String, no default): The location of the media clip to be played. It can be a URL or a relative address. For example:
    file:///e:/video/media/Sample1.mov
    http://webServer/media/Sample1.mov
    media/Sample1.mov
Fixed Aspect Ratio (Boolean, default Yes): Controls whether or not the media's original fixed aspect ratio is maintained.
mp1.setMediaLocation(new java.lang.String("file:///E:/jvideo/media/Sample1.mov"));
mp1.start();
mp1.stop();
if (mrl == null) {
System.err.println("Can't build MRL for RTP");
return false;
}
vSize = visualComp.getPreferredSize();
vSize.width = (int)(vSize.width * defaultScale);
vSize.height = (int)(vSize.height * defaultScale);
framePanel.add(visualComp);
visualComp.setBounds(0,
0,
vSize.width,
vSize.height);
addPopupMenu(visualComp);
}
}
controlComp = player.getControlPanelComponent();
if (controlComp != null)
{
if (oldComp != controlComp)
{
framePanel.remove(oldComp);
framePanel.add(controlComp);
if (controlComp != null) {
int prefHeight = controlComp
.getPreferredSize()
.height;
controlComp.setBounds(0,
vSize.height,
vSize.width,
prefHeight);
}
}
}
}
}
4
Processing Time-Based
Media with JMF
Configuring a Processor
In addition to the Realizing and Prefetching phases that any Player moves through as it prepares to start, a Processor also goes through a Configuring phase. You call configure to move an Unrealized Processor into the Configuring state.
While in the Configuring state, a Processor gathers the information it needs to construct TrackControl objects for each track. When it's finished, it moves into the Configured state and posts a ConfigureCompleteEvent. Once a Processor is Configured, you can set its output format and TrackControl options. When you're finished specifying the processing options, you call realize to move the Processor into the Realizing state and begin the realization process.
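A rough sketch of this sequence (locator is assumed to identify the input media, the RAW output descriptor is only an example, and in practice you wait for the ConfigureCompleteEvent before touching the track controls):

try {
    Processor p = Manager.createProcessor(locator);
    p.configure();
    // ... wait for ConfigureCompleteEvent via a ControllerListener ...
    p.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW));
    TrackControl[] tracks = p.getTrackControls();
    // ... select per-track formats and plug-ins here ...
    p.realize();
} catch (IOException e) {
    System.exit(-1);
} catch (NoProcessorException e) {
    System.exit(-1);
}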
Once a Processor is Realized, further attempts to modify its processing options are not guaranteed to work. In most cases, a FormatChangeException will be thrown.
When you use setCodecChain to specify the codec and effect plug-ins for a Processor, the order in which the plug-ins actually appear in the processing chain is determined by the input and output formats each plug-in supports.
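For example, a sketch of installing a chain on one track (GainEffect is a hypothetical Effect implementation, and tracks is assumed to come from getTrackControls on a Configured Processor):

Codec[] chain = { new GainEffect() };
try {
    tracks[0].setCodecChain(chain);
} catch (UnsupportedPlugInException e) {
    // The track cannot accept this chain.
}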
To control the transcoding that's performed on a track by a particular Codec, you can use the codec controls associated with the track. To get the codec controls, you call the TrackControl getControls method. This returns all of the Controls associated with the track, including codec controls such as H263Control, QualityControl, and MPEGAudioControl. (For a list of the codec controls defined by JMF, see "Standard Controls" on page 20.)
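A sketch of searching a track's controls for one of them (the 0.8 quality value is only an illustration):

Object[] controls = tracks[0].getControls();
for (int i = 0; i < controls.length; i++) {
    if (controls[i] instanceof QualityControl) {
        ((QualityControl) controls[i]).setQuality(0.8f);
    }
}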
Selecting a Renderer
To select the Renderer that you want to use, you get the TrackControl for the track and call setRenderer on it, as in the sketch below.
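A minimal sketch of that step, assuming the Processor p is Configured and MyRenderer is a hypothetical Renderer implementation:

TrackControl[] tracks = p.getTrackControls();
try {
    tracks[0].setRenderer(new MyRenderer());
} catch (UnsupportedPlugInException e) {
    // This Renderer cannot handle the track's format.
}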
A Processor can enable user control over the maximum number of bytes that it can write to its destination by implementing the StreamWriterControl. You find out if a Processor provides a StreamWriterControl by calling getControl("javax.media.control.StreamWriterControl") on the Processor.
You can use JMF to capture media data from a capture device such as a
microphone or video camera. Captured media data can be processed and
rendered or stored for future use.
To capture media data, you:
1. Locate the capture device you want to use by querying the CaptureDe-
viceManager.
When you use a capture DataSource with a Player, you can only render the
captured media data. To explicitly process or store the captured media
data, you need to use a Processor.
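A sketch of locating a capture device and constructing a Processor from it (the audio format used to query the CaptureDeviceManager is only an example):

Vector devices = CaptureDeviceManager.getDeviceList(
    new AudioFormat(AudioFormat.LINEAR, 44100, 16, 2));
if (devices.size() > 0) {
    CaptureDeviceInfo di = (CaptureDeviceInfo) devices.elementAt(0);
    try {
        Processor p = Manager.createProcessor(di.getLocator());
    } catch (IOException e) {
        System.exit(-1);
    } catch (NoProcessorException e) {
        System.exit(-1);
    }
}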
A PortControl provides a way to select the port from which data will be captured. A MonitorControl provides a means for displaying the device's capture monitor.
Like other Control objects, if there's a visual component that corresponds to the PortControl or MonitorControl, you can get it by calling getControlComponent. Adding the Component to your applet or application window will enable users to interact with the capture control.
You can also display the standard control-panel component and visual component associated with the Player or Processor you're using.
Example 5-1: Displaying GUI components for a processor.
Component controlPanel, visualComponent;
if ((controlPanel = p.getControlPanelComponent()) != null)
add(controlPanel);
if ((visualComponent = p.getVisualComponent()) != null)
add(visualComponent);
1. Get the MediaLocator for the capture device and construct a Processor.
2. Call configure on the Processor.
3. Once the Processor is in the Configured state, call getTrackControls.
4. Call setFormat on each track until you find one that can be converted
to IMA4. (For setFormat to succeed, appropriate codec plug-ins must
be available to perform the conversion.)
5. Realize the Processor and use its output DataSource to construct a
DataSink to write the data to a file.
if (!encodingPossible) {
sh.close();
System.exit(-1);
}
// Realize the processor
if (!sh.realize(10000))
System.exit(-1);
try {
p = Manager.createRealizedProcessor(new ProcessorModel(formats,
outputType));
} catch (IOException e) {
System.exit(-1);
} catch (NoProcessorException e) {
System.exit(-1);
} catch (CannotRealizeException e) {
System.exit(-1);
}
// get the output of the processor
DataSource source = p.getDataOutput();
// create a File protocol MediaLocator with the location
// of the file to
// which bits are to be written
MediaLocator dest = new MediaLocator("file://foo.mov");
// create a datasink to do the file writing & open the
// sink to make sure
// we can write to it.
DataSink filewriter = null;
try {
filewriter = Manager.createDataSink(source, dest);
filewriter.open();
} catch (NoDataSinkException e) {
System.exit(-1);
} catch (IOException e) {
System.exit(-1);
} catch (SecurityException e) {
System.exit(-1);
}
// now start the filewriter and processor
try {
filewriter.start();
} catch (IOException e) {
System.exit(-1);
}
p.start();
// stop and close the processor when done capturing...
// close the datasink when EndOfStream event is received...
6
Extending JMF
You can extend JMF by implementing one of the plug-in interfaces to perform custom processing on a Track, or by implementing completely new DataSources and MediaHandlers.
Note: JMF Players and Processors are not required to support plug-ins; plug-ins won't work with JMF 1.0-based Players, and some 2.0-based implementations might choose not to support plug-ins. The reference implementation of JMF 2.0 provided by Sun Microsystems, Inc. and IBM Corporation fully supports the plug-in API.
if (streams.length == 0) {
throw new IOException("Got an empty stream array from the DataSource");
}
this.source = source;
this.streams = streams;
if (!supports(streams))
throw new IncompatibleSourceException("DataSource not supported: " + source);
}
if (tracks[0] != null)
return tracks;
stream = (PullSourceStream) streams[0];
readHeader();
bufferSize = bytesPerSecond;
tracks[0] = new GsmTrack((AudioFormat) format,
/*enabled=*/ true,
new Time(0),
numBuffers,
bufferSize,
minLocation,
maxLocation
);
return tracks;
}
Effect Plug-ins
An Effect plug-in is actually a specialized type of Codec that performs
some processing on the input Track other than compression or decom-
pression. For example, you might implement a gain effect that adjusts the
volume of an audio track. Like a Codec, an Effect is a single-input, single-
output processing component and the data manipulation that the Effect
performs is implemented in the process method.
An Effect plug-in can be used as either a pre-processing effect or a post-
processing effect. For example, if a Processor is being used to render a
compressed media stream, the Effect would typically be used as a post-
processing effect and applied after the stream has been decoded. Con-
versely, if the Processor was being used to output a compressed media
stream, the Effect would typically be applied as a pre-processing effect
before the stream is encoded.
When you implement an Effect, you need to:
/**
* Return the control based on a control type for the effect.
**/
public Object getControl(String controlType) {
try {
Class cls = Class.forName(controlType);
Object cs[] = getControls();
for (int i = 0; i < cs.length; i++) {
if (cls.isInstance(cs[i]))
return cs[i];
}
return null;
} catch (Exception e) { // no such controlType or such control
return null;
}
}
/************** format methods *************/
/** set the input format **/
public Format setInputFormat(Format input) {
// the following code assumes valid Format
inputFormat = (AudioFormat)input;
return (Format)inputFormat;
}
/** set the output format **/
public Format setOutputFormat(Format output) {
// the following code assumes valid Format
outputFormat = (AudioFormat)output;
return (Format)outputFormat;
}
/** get the input format **/
protected Format getInputFormat() {
return inputFormat;
}
if (!iaf.matches(supportedInputFormats[0]))
return new Format[0];
// == prolog
byte[] inData = (byte[])inputBuffer.getData();
int inLength = inputBuffer.getLength();
int inOffset = inputBuffer.getOffset();
// == main
if (sample>32767) // saturate
sample = 32767;
else if (sample < -32768)
sample = -32768;
// == epilog
updateOutput(outputBuffer,outputFormat, samplesNumber, 0);
return BUFFER_PROCESSED_OK;
}
/**
* Utility: validate that the Buffer object's data size is at least
* newSize bytes.
* @return array with sufficient capacity
**/
protected byte[] validateByteArraySize(Buffer buffer,int newSize) {
Object objectArray=buffer.getData();
byte[] typedArray;
if (objectArray instanceof byte[]) { // is correct type AND not null
typedArray=(byte[])objectArray;
if (typedArray.length >= newSize ) { // is sufficient capacity
return typedArray;
}
}
System.out.println(getClass().getName()+
" : allocating byte["+newSize+"] ");
typedArray = new byte[newSize];
buffer.setData(typedArray);
return typedArray;
}
/** utility: update the output buffer fields **/
protected void updateOutput(Buffer outputBuffer,
Format format,int length, int offset) {
3. Implement process to actually process the data and render it to the out-
put device that this Renderer represents.
Example: AWTRenderer
This example implements the Renderer plug-in to create a Renderer for
RGB images that uses AWT Image.
Example 6-5: Implementing a Renderer plug-in (1 of 7)
import javax.media.*;
import javax.media.renderer.VideoRenderer;
import javax.media.Format;
import javax.media.format.VideoFormat;
import javax.media.format.RGBFormat;
import java.awt.*;
import java.awt.image.*;
import java.awt.event.*;
import java.util.Vector;
/**
* Constructor
**/
public SampleAWTRenderer() {
// Prepare supported input formats and preferred format
int rMask = 0x000000FF;
int gMask = 0x0000FF00;
int bMask = 0x00FF0000;
/**
* Return the control based on a control type for the PlugIn.
*/
public Object getControl(String controlType) {
try {
Class cls = Class.forName(controlType);
Object cs[] = getControls();
for (int i = 0; i < cs.length; i++) {
if (cls.isInstance(cs[i]))
return cs[i];
}
return null;
} catch (Exception e) { // no such controlType or such control
return null;
}
}
/**
* PlugIn implementation
**/
Graphics g = component.getGraphics();
if (g != null) {
if (reqBounds == null) {
bounds = component.getBounds();
bounds.x = 0;
bounds.y = 0;
} else
bounds = reqBounds;
g.drawImage(destImage, bounds.x, bounds.y,
bounds.width, bounds.height,
0, 0, inWidth, inHeight, component);
}
return BUFFER_PROCESSED_OK;
}
/**
* VideoRenderer implementation
**/
/**
* Returns an AWT component that it will render to. Returns null
* if it is not rendering to an AWT component.
*/
public java.awt.Component getComponent() {
if (component == null) {
component = new Canvas() {
public Dimension getPreferredSize() {
return new Dimension(getInWidth(), getInHeight());
}
};
}
return component;
}
/**
* Sets the region in the component where the video is to be
* rendered to. Video is to be scaled if necessary. If
* <code>rect</code> is null, then the video occupies the entire
* component.
*/
public void setBounds(java.awt.Rectangle rect) {
reqBounds = rect;
}
/**
* Returns the region in the component where the video will be
* rendered to. Returns null if the entire component is being used.
*/
public java.awt.Rectangle getBounds() {
return reqBounds;
}
/**
* Local methods
**/
int getInWidth() {
return inWidth;
}
int getInHeight() {
return inHeight;
}
If you want to make your plug-in available to other users, you should create a Java applet or application that performs this registration process and distribute it with your plug-in.
You can remove a plug-in either temporarily or permanently with the removePlugIn method. To make the change permanent, you call commit.
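For example, a sketch of removing a hypothetical codec and committing the change:

PlugInManager.removePlugIn("COM.mybiz.media.MyCodec", PlugInManager.CODEC);
try {
    PlugInManager.commit();
} catch (java.io.IOException e) {
    // The plug-in registry could not be saved.
}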
<protocol package-prefix>.media.protocol.<protocol>.DataSource
The protocol package-prefix is a unique identifier for your code that you register with the JMF PackageManager (for example, COM.mybiz) as a protocol package-prefix. The protocol identifies the protocol for your new DataSource. For more information, see "Integrating a Custom Data Source with JMF" on page 103.
For example, to integrate a new DataSource for the protocol type xxx, you would create and install a package called:
<protocol package-prefix>.media.protocol.xxx.DataSource
that contains the new DataSource class. You also need to add your package prefix (an identifier for your code, such as COM.mybiz) to the protocol package-prefix list managed by the PackageManager.
Example 6-7: Registering a protocol package-prefix.
Vector packagePrefix = PackageManager.getProtocolPrefixList();
String myPackagePrefix = new String("COM.mybiz");
// Add new package prefix to end of the package prefix list.
packagePrefix.addElement(myPackagePrefix);
PackageManager.setProtocolPrefixList(packagePrefix);
// Save the changes to the package prefix list.
PackageManager.commitProtocolPrefixList();
If you want to make your new DataSource available to other users, you should create a Java applet or application that performs this registration process and distribute it with your DataSource.
Implementing a DataSink
JMF provides a default DataSink that can be used to write data to a file. Other types of DataSink classes can be implemented to facilitate writing data to the network or to other destinations.
To create a custom DataSink, you implement the DataSink interface. A DataSink is a type of MediaHandler, so you must also implement the MediaHandler setSource method.
To use your DataSink with JMF, you need to add your package-prefix to the content package-prefix list maintained by the PackageManager. For more information, see "Integrating a Custom Media Handler with JMF".
<content package-prefix>.media.content.mpeg.sys.Handler
that contains the new Player class. The package prefix is an identifier for your code, such as COM.mybiz. You also need to add your package prefix to the content package-prefix list managed by the PackageManager.
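A sketch of that content package-prefix registration, mirroring the protocol-prefix example above (COM.mybiz stands in for your own prefix):

Vector contentPrefix = PackageManager.getContentPrefixList();
contentPrefix.addElement("COM.mybiz");
PackageManager.setContentPrefixList(contentPrefix);
PackageManager.commitContentPrefixList();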
If you want to make your new MediaHandler available to other users, you should create a Java applet or application that performs this registration process and distribute it with your MediaHandler.
7
Working with Real-Time
Media Streams
Streaming Media
When media content is streamed to a client in real-time, the client can begin to play the stream without having to wait for the complete stream to download. In fact, the stream might not even have a predefined duration; downloading the entire stream before playing it would be impossible. The term streaming media is often used to refer to both this technique of delivering content over the network in real-time and the real-time media content that's delivered.
Streaming media is everywhere you look on the web: live radio and television broadcasts and webcast concerts and events are being offered by a rapidly growing number of web portals, and it's now possible to conduct audio and video conferences over the Internet. By enabling the delivery of dynamic, interactive media content across the network, streaming media is changing the way people communicate and access information.
large delays in receiving the data. This is very different from accessing static data such as a file, where the most important thing is that all of the data arrive at its destination. Consequently, the protocols used for static data don't work well for streaming media.
The HTTP and FTP protocols are based on the Transmission Control Protocol (TCP). TCP is a transport-layer protocol1 designed for reliable data communications on low-bandwidth, high-error-rate networks. When a packet is lost or corrupted, it's retransmitted. The overhead of guaranteeing reliable data transfer slows the overall transmission rate.
For this reason, underlying protocols other than TCP are typically used for streaming media. One that's commonly used is the User Datagram Protocol (UDP). UDP is an unreliable protocol; it does not guarantee that each packet will reach its destination. There's also no guarantee that the packets will arrive in the order that they were sent. The receiver has to be able to compensate for lost data, duplicate packets, and packets that arrive out of order.
Like TCP, UDP is a general transport-layer protocol: a lower-level networking protocol on top of which more application-specific protocols are built. The Internet standard for transporting real-time data such as audio and video is the Real-Time Transport Protocol (RTP).
RTP is defined in IETF RFC 1889, a product of the AVT working group of the Internet Engineering Task Force (IETF).
1. In the seven layer ISO/OSI data communications model, the transport layer is level
four. For more information about the ISO/OSI model, see Understanding OSI.
Larmouth, John. International Thompson Computer Press, 1996. ISBN 1850321760.
RTP can be used over both unicast and multicast network services. Over a unicast network service, separate copies of the data are sent from the source to each destination. Over a multicast network service, the data is sent from the source only once and the network is responsible for transmitting the data to multiple locations. Multicasting is more efficient for many multimedia applications, such as video conferences. The standard Internet Protocol (IP) supports multicasting.
RTP Services
RTP enables you to identify the type of data being transmitted, determine
what order the packets of data should be presented in, and synchronize
media streams from different sources.
RTP data packets are not guaranteed to arrive in the order that they were sent; in fact, they're not guaranteed to arrive at all. It's up to the receiver to reconstruct the sender's packet sequence and detect lost packets using the information provided in the packet header.
While RTP does not provide any mechanism to ensure timely delivery or provide other quality of service guarantees, it is augmented by a control protocol (RTCP) that enables you to monitor the quality of the data distribution. RTCP also provides control and identification mechanisms for RTP transmissions.
If quality of service is essential for a particular application, RTP can be used over a resource reservation protocol that provides connection-oriented services.
RTP Architecture
An RTP session is an association among a set of applications communicating with RTP. A session is identified by a network address and a pair of ports. One port is used for the media data and the other is used for control (RTCP) data.
A participant is a single machine, host, or user participating in the session.
Participation in a session can consist of passive reception of data
(receiver), active transmission of data (sender), or both.
Each media type is transmitted in a different session. For example, if both audio and video are used in a conference, one session is used to transmit the audio data and a separate session is used to transmit the video data. This enables participants to choose which media types they want to receive; for example, someone who has a low-bandwidth network connection might only want to receive the audio portion of a conference.
Data Packets
The media data for a session is transmitted as a series of packets. A series of data packets that originate from a particular source is referred to as an RTP stream. Each RTP data packet in a stream contains two parts, a structured header and the actual data (the packet's payload).
[Figure: RTP data packet header layout (bits 0 through 31): V, P, X, CC, M, PT, Sequence Number, and Timestamp fields.]
• The RTP version number (V): 2 bits. The version defined by the current specification is 2.
• Padding (P): 1 bit. If the padding bit is set, there are one or more bytes
at the end of the packet that are not part of the payload. The very last
byte in the packet indicates the number of bytes of padding. The
padding is used by some encryption algorithms.
• Extension (X): 1 bit. If the extension bit is set, the fixed header is fol-
lowed by one header extension. This extension mechanism enables
implementations to add information to the RTP Header.
• CSRC Count (CC): 4 bits. The number of CSRC identifiers that follow
the fixed header. If the CSRC count is zero, the synchronization source
is the source of the payload.
• Marker (M): 1 bit. A marker bit defined by the particular media
profile.
• Payload Type (PT): 7 bits. An index into a media profile table that
describes the payload format. The payload mappings for audio and
video are specified in RFC 1890.
• Sequence Number: 16 bits. A unique packet number that identifies
this packet's position in the sequence of packets. The packet number
is incremented by one for each packet sent.
• Timestamp: 32 bits. Reflects the sampling instant of the first byte in
the payload. Several consecutive packets can have the same
timestamp if they are logically generated at the same time; for
example, if they are all part of the same video frame.
• SSRC: 32 bits. Identifies the synchronization source. If the CSRC count
is zero, the payload source is the synchronization source. If the CSRC
count is nonzero, the SSRC identifies the mixer.
• CSRC: 32 bits each. Identifies the contributing sources for the payload.
The number of contributing sources is indicated by the CSRC count
field; there can be up to 16 contributing sources. If there are multiple
contributing sources, the payload is the mixed data from those
sources.
Control Packets
In addition to the media data for a session, control data (RTCP) packets
are sent periodically to all of the participants in the session. RTCP packets
can contain information about the quality of service for the session partic-
ipants, information about the source of the media being transmitted on the
data port, and statistics pertaining to the data that has been transmitted so
far.
There are several types of RTCP packets:
• Sender Report
• Receiver Report
• Source Description
• Bye
• Application-specific
RTCP packets are "stackable" and are sent as a compound packet that contains at least two packets, a report packet and a source description packet.
All participants in a session send RTCP packets. A participant that has
recently sent data packets issues a sender report. The sender report (SR)
contains the total number of packets and bytes sent as well as information
that can be used to synchronize media streams from different sessions.
Session participants periodically issue receiver reports for all of the sources
from which they are receiving data packets. A receiver report (RR) con-
tains information about the number of packets lost, the highest sequence
number received, and a timestamp that can be used to estimate the round-
trip delay between a sender and the receiver.
The first packet in a compound RTCP packet has to be a report packet, even if no data has been sent or received, in which case an empty receiver report is sent.
All compound RTCP packets must include a source description (SDES) element that contains the canonical name (CNAME) that identifies the source. Additional information might be included in the source description, such as the source's name, email address, phone number, geographic location, application name, or a message describing the current state of the source.
When a source is no longer active, it sends an RTCP BYE packet. The BYE
notice can include the reason that the source is leaving the session.
RTCP APP packets provide a mechanism for applications to define and send custom information via the RTP control port.
RTP Applications
RTP applications are often divided into those that need to be able to
receive data from the network (RTP Clients) and those that need to be able
References
The RTP specification is a product of the Audio Video Transport (AVT)
working group of the Internet Engineering Task Force (IETF). For addi-
tional information about the IETF, see http://www.ietf.org. The AVT
working group charter and proceedings are available at
http://www.ietf.org/html.charters/avt-charter.html.
IETF RFC 1889, RTP: A Transport Protocol for Real Time Applications
Current revision: http://www.ietf.org/internet-drafts/draft-ietf-avt-rtp-new-04.txt
IETF RFC 1890: RTP Profile for Audio and Video Conferences with Minimal Control
Current revision: http://www.ietf.org/internet-drafts/draft-ietf-avt-profile-new-06.txt
JMF enables the playback and transmission of RTP streams through the APIs defined in the javax.media.rtp, javax.media.rtp.event, and javax.media.rtp.rtcp packages. JMF can be extended to support additional RTP-specific formats and dynamic payloads through the standard JMF plug-in mechanism.
You can play incoming RTP streams locally, save them to a file, or both.
[Figure: An incoming RTP stream's DataSource can be rendered locally (Console), written to a File through a DataSink, or both.]
For example, the RTP APIs could be used to implement a telephony application that answers calls and records messages like an answering machine. Similarly, you can use the RTP APIs to transmit captured or stored media streams across the network. Outgoing RTP streams can originate from a file or a capture device. The outgoing streams can also be played locally, saved to a file, or both.
RTP Architecture
The JMF RTP APIs are designed to work seamlessly with the capture, presentation, and processing capabilities of JMF. Players and processors are used to present and manipulate RTP media streams just like any other media content. You can transmit media streams that have been captured from a local capture device using a capture DataSource or that have been stored to a file using a DataSink. Similarly, JMF can be extended to support additional RTP formats and payloads through the standard plug-in mechanism.
[Figure: JMF RTP architecture. The RTP APIs, the session manager, and RTP packetizer/depacketizer codecs build on the core JMF API.]
In JMF, a SessionManager is used to coordinate an RTP session. The session
manager keeps track of the session participants and the streams that are
being transmitted.
The session manager maintains the state of the session as viewed from the
local participant. In effect, a session manager is a local representation of a
distributed entity, the RTP session. The session manager also handles the
RTCP control channel, and supports RTCP for both senders and receivers.
The SessionManager interface defines methods that enable an application to initialize and start participating in a session, remove individual streams created by the application, and close the entire session.
Session Statistics
The session manager maintains statistics on all of the RTP and RTCP packets sent and received in the session. Statistics are tracked for the entire session and on a per-stream basis. The session manager provides access to global reception and transmission statistics.
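For example, a sketch of querying them (mgr is assumed to be an active SessionManager):

GlobalReceptionStats rx = mgr.getGlobalReceptionStats();
GlobalTransmissionStats tx = mgr.getGlobalTransmissionStats();
System.out.println("packets received: " + rx.getPacketsRecd());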
Session Participants
The session manager keeps track of all of the participants in a session.
Each participant is represented by an instance of a class that implements
the Participant interface. SessionManagers create a Participant whenever an RTCP packet arrives that contains a source description (SDES) with a canonical name (CNAME) that has not been seen before in the session (or has timed-out since its last use). Participants can be passive (sending
control packets only) or active (also sending one or more RTP data
streams).
There is exactly one local participant that represents the local client/server
participant. A local participant indicates that it will begin sending RTCP
control messages or data and maintain state on incoming data and control
messages by starting a session.
A participant can own more than one stream, each of which is identified by the synchronization source identifier (SSRC) used by the source of the stream.
Session Streams
The SessionManager maintains an RTPStream object for each stream of RTP
data packets in the session. There are two types of RTP streams:
RTP Events
Several RTP-specific events are defined in javax.media.rtp.event. These events are used to report on the state of the RTP session and streams.
[Figure: RTP event hierarchy]
MediaEvent
  RTPEvent
    ReceiveStreamEvent
      ActiveReceiveStreamEvent
      ApplicationEvent
      InactiveReceiveStreamEvent
      NewReceiveStreamEvent
      RemotePayloadChangeEvent
      StreamMappedEvent
      TimeoutEvent
    RemoteEvent
      ReceiverReportEvent
      SenderReportEvent
      RemoteCollisionEvent
    SendStreamEvent
      ActiveSendStreamEvent
      InactiveSendStreamEvent
      NewSendStreamEvent
      LocalPayloadChangeEvent
    SessionEvent
      LocalCollisionEvent
      NewParticipantEvent
Session Listener
You can implement SessionListener to receive notification about events that pertain to the RTP session as a whole, such as the addition of new participants.
There are two types of session-wide events:
Remote Listener
You can implement RemoteListener to receive notification of events or RTP control messages received from a remote participant. You might want to implement RemoteListener in an application used to monitor the session; it enables you to receive RTCP reports and monitor the quality of the session reception without having to receive data or information on each stream.
There are three types of events associated with a remote participant:
RTP Data
The streams within an RTP session are represented by RTPStream objects.
There are two types of RTPStreams: ReceiveStream and SendStream. Each
RTP stream has a buffer data source associated with it. For ReceiveS-
treams, this DataSource is always a PushBufferDataSource.
Data Handlers
The JMF RTP APIs are designed to be transport-protocol independent. A custom RTP data handler can be created to enable JMF to work over a specific transport protocol. The data handler is a DataSource that can be used as the media source for a Player.
The abstract class RTPPushDataSource defines the basic elements of a JMF RTP data handler. A data handler has both an input data stream (PushSourceStream) and an output data stream (OutputDataStream). A data handler can be used for either the data channel or the control channel of an RTP session. If it is used for the data channel, the data handler implements the DataChannel interface.
An RTPSocket is an RTPPushDataSource that has both a data and a control channel. Each channel has an input and output stream to stream data to and from the underlying network. An RTPSocket can export RTPControls to add dynamic payload information to the session manager.
<protocol package-prefix>.media.protocol.rtpraw.DataSource
RTP Controls
The RTP API defines one RTP-specific control, RTPControl. RTPControl is typically implemented by RTP-specific DataSources. It provides a mechanism to add a mapping between a dynamic payload and a Format. RTPControl also provides methods for accessing session statistics and getting the current payload Format.
SessionManager also extends the Controls interface, enabling a session
manager to export additional Controls through the getControl and get-
Controls methods. For example, the session manager can export a Buffer-
Control to enable you to specify the buffer length and threshold.
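For example, a sketch of adjusting buffering through such a control, assuming the session manager mgr exports one (the values are only illustrative):

BufferControl bc = (BufferControl)
    mgr.getControl("javax.media.control.BufferControl");
if (bc != null) {
    bc.setBufferLength(350);
    bc.setMinimumThreshold(100);
}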
Reception
The presentation of an incoming RTP stream is handled by a Player. To receive and present a single stream from an RTP session, you can use a MediaLocator of the form:
rtp://address:port[:ssrc]/content-type/[ttl]
The Player is constructed and connected to the first stream in the session. If there are multiple streams in the session that you want to present, you need to use a session manager. You can receive notification from the session manager whenever a stream is added to the session and construct a Player for each new stream. Using a session manager also enables you to directly monitor and control the session.
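For example, a rough sketch of presenting the first stream in a session (the address and port are placeholders, and listener is assumed to implement ControllerListener):

try {
    Player p = Manager.createPlayer(
        new MediaLocator("rtp://224.144.251.104:49150/audio/1"));
    p.addControllerListener(listener);
    p.realize();
} catch (Exception e) {
    System.err.println(e.getMessage());
}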
Transmission
A session manager can also be used to initialize and control a session so
that you can stream data across the network. The data to be streamed is
acquired from a Processor.
For example, to create a send stream to transmit data from a live capture
source, you would:
You control the transmission through the SendStream start and stop
methods.
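A rough sketch, assuming mgr is a SessionManager whose session has already been initialized and started, and p is a Realized Processor whose tracks are set to RTP formats:

try {
    SendStream ss = mgr.createSendStream(p.getDataOutput(), 0);
    ss.start();
} catch (Exception e) {
    e.printStackTrace();
}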
When it is first started, the SessionManager behaves as a receiver (sends out RTCP receiver reports). As soon as a SendStream is created, it begins to send out RTCP sender reports and behaves as a sender host as long as one or more send streams exist. If all SendStreams are closed (not just stopped), the session manager reverts to being a passive receiver.
Extensibility
Like the other parts of JMF, the RTP capabilities can be enhanced and
extended. The RTP APIs support a basic set of RTP formats and payloads.
Advanced developers and technology providers can implement JMF
plug-ins to support dynamic payloads and additional RTP formats.
JMF Players and Processors provide the presentation, capture, and data
conversion mechanisms for RTP streams.
A separate player is used for each stream received by the session manager.
You construct a Player for an RTP stream through the standard Manager
createPlayer mechanism. You can either:
• Use a MediaLocator that has the parameters of the RTP session and construct a Player by calling Manager.createPlayer(MediaLocator)
• Construct a Player for a particular ReceiveStream by retrieving the DataSource from the stream and passing it to Manager.createPlayer(DataSource).
If you use a MediaLocator to construct a Player, you can only present the first RTP stream that's detected in the session. If you want to play back multiple RTP streams in a session, you need to use the SessionManager directly and construct a Player for each ReceiveStream.
Note: Because a Player for an RTP media stream doesn't finish realizing until data is detected in the session, you shouldn't try to use Manager.createRealizedPlayer to construct a Player for an RTP media stream. No Player would be returned until data arrives, and if no data is detected, attempting to create a Realized Player would block indefinitely.
if (mrl == null) {
System.err.println("Can't build MRL for RTP");
return false;
}
if (player != null) {
if (this.player == null) {
framePanel.remove(oldVisualComp);
vSize = visualComp.getPreferredSize();
vSize.width = (int)(vSize.width * defaultScale);
vSize.height = (int)(vSize.height * defaultScale);
framePanel.add(visualComp);
visualComp.setBounds(0,
0,
vSize.width,
vSize.height);
addPopupMenu(visualComp);
}
controlComp = player.getControlPanelComponent();
if (controlComp != null)
{
if (oldComp != controlComp)
{
framePanel.remove(oldComp);
framePanel.add(controlComp);
if (controlComp != null) {
int prefHeight = controlComp
.getPreferredSize()
.height;
controlComp.setBounds(0,
vSize.height,
vSize.width,
prefHeight);
}
}
}
}
}
mgr.addFormat(new AudioFormat(AudioFormat.DVI_RTP,
44100,
4,
1),
18);
if (listener) mgr.addReceiveStreamListener(this);
if (sendlistener) new RTPSendStreamWindow(mgr);
try {
username = System.getProperty("user.name");
} catch (SecurityException e){
username = "jmf-user";
}
try{
InetAddress destaddr = InetAddress.getByName(address);
new SourceDescription(SourceDescription
.SOURCE_DESC_CNAME,
cname,
1,
false),
new SourceDescription(SourceDescription
.SOURCE_DESC_TOOL,
"JMF RTP Player v2.0",
1,
false)
};
mgr.initSession(localaddr,
userdesclist,
0.05,
0.25);
mgr.startSession(sessaddr,ttl,null);
} catch (Exception e) {
System.err.println(e.getMessage());
return null;
}
return mgr;
}
try
{
// get a handle over the ReceiveStream
stream =((NewReceiveStreamEvent)event)
.getReceiveStream();
playerlist.addElement(newplayer);
newplayer.addControllerListener(this);
if (newplayer != null) {
// stop player and wait for stop event
newplayer.stop();
// remove controllerlistener
newplayer.removeControllerListener(listener);
try {
// when the player was closed, its datasource was
// disconnected. Now we must reconnect the data-
// source before a player can be created for it.
rtpsource.connect();
newplayer = Manager.createPlayer(rtpsource);
if (newplayer == null) {
System.err.println("Could not create player");
return;
}
newplayer.addControllerListener(listener);
newplayer.realize();
To play an RTP stream from the RTPSocket, you pass the socket to Manager.createPlayer to construct the Player. Alternatively, you could construct a Player by calling createPlayer(MediaLocator) and passing in a MediaLocator with a new protocol that is a variant of RTP, "rtpraw". For example:
Manager.createPlayer(new MediaLocator("rtpraw://"));
<protocol package-prefix>.media.protocol.rtpraw.DataSource
This must be the RTPSocket. The content type of the RTPSocket should be set to rtpraw. Manager will then attempt to create a player of type <content-prefix>.media.content.rtpraw.Handler and set the RTPSocket on it.
RTPControl interfaces for the RTPSocket can be used to add dynamic pay-
load information to the RTP session manager.
The following example implements an RTP over UDP player that can
receive RTP UDP packets and stream them to the Player or session man-
ager, which is not aware of the underlying network/transport protocol.
import javax.media.*;
import javax.media.format.*;
import javax.media.protocol.*;
import javax.media.rtp.*;
import javax.media.rtp.event.*;
import javax.media.rtp.rtcp.*;
public RTPSocketPlayer() {
// create the RTPSocket
rtpsocket = new RTPSocket();
// set the RTP Session address and port of the RTP data
rtp = new UDPHandler(address, port);
// set the RTP Session address and port of the RTCP data
rtcp = new UDPHandler(address, port +1);
if (player != null) {
player.addControllerListener(this);
// send this player to out playerwindow
// playerWindow = new PlayerWindow(player);
}
}
try {
addr = InetAddress.getByName(sockaddress);
if (addr.isMulticastAddress()) {
MulticastSocket msock = new MulticastSocket(sockport);
msock.joinGroup(addr);
sock = (DatagramSocket)msock;
}
else {
sock = new DatagramSocket(sockport,addr);
}
return sock;
}
catch (SocketException e) {
e.printStackTrace();
return null;
}
catch (UnknownHostException e) {
e.printStackTrace();
return null;
}
catch (IOException e) {
e.printStackTrace();
return null;
}
}
while(true) {
if (closed) {
cleanup();
return;
}
try {
do {
dp = new DatagramPacket( new byte[maxsize],
maxsize);
mysock.receive(dp);
if (closed){
cleanup();
return;
}
len = dp.getLength();
if (outputHandler != null) {
outputHandler.transferData(this);
}
}
}
// methods of PushSourceStream
public Object[] getControls() {
return new Object[0];
}
return dp.getData().length;
}
try {
addr = InetAddress.getByName(myAddress);
} catch (UnknownHostException e) {
e.printStackTrace();
}
return dp.getLength();
}
}
• Use a MediaLocator that has the parameters of the RTP session to construct an RTP DataSink by calling Manager.createDataSink.
• Use a session manager to create send streams for the content and control the transmission.
packets introduce less header overhead but higher delay and make packet
loss more noticeable. For non-interactive applications such as lectures, or
for links with severe bandwidth constraints, a higher packetization delay
might be appropriate.
A receiver should accept packets representing between 0 and 200 ms of
audio data. (For framed audio encodings, a receiver should accept packets
with 200 ms divided by the frame duration, rounded up.) This restriction
allows reasonable buffer sizing for the receiver. Each packetizer codec has
a default packetization interval appropriate for its encoding.
If the codec allows modification of this interval, it exports a corresponding PacketSizeControl. The packetization interval can be changed or set through the setPacketSize method.
For video streams, a single video frame is transmitted in multiple RTP
packets. The size of each packet is limited by the Maximum Transmission
Unit (MTU) of the underlying network. This parameter is also set using
the setPacketSize method of the packetizer codec's PacketSizeControl.
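A sketch of adjusting it, assuming the packetizer exports the control through the Processor (the 1024-byte value is only an example):

PacketSizeControl psc = (PacketSizeControl)
    processor.getControl("javax.media.control.PacketSizeControl");
if (psc != null) {
    psc.setPacketSize(1024);
}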
if (devices.size() > 0) {
di = (CaptureDeviceInfo) devices.elementAt( 0);
}
else {
// exit if we could not find the relevant capturedevice.
System.exit(-1);
}
processor.setContentDescriptor(
new ContentDescriptor( ContentDescriptor.RAW));
track[i].setEnabled(false);
}
else {
encodingOk = true;
}
} else {
// we could not set this track to gsm, so disable it
track[i].setEnabled(false);
}
}
try {
ds = processor.getDataOutput();
} catch (NotRealizedError e) {
System.exit(-1);
}
d.open();
d.start();
} catch (Exception e) {
System.exit(-1);
}
}
from each cloned DataSource, its tracks encoded in the desired format,
and the stream sent out over an RTP session.
processor.configure();
processor.setContentDescriptor(
new ContentDescriptor( ContentDescriptor.RAW));
if (((FormatControl)track[i]).
setFormat( new AudioFormat(AudioFormat.ULAW_RTP,
8000,
8,
1)) == null) {
track[i].setEnabled(false);
}
else {
encodingOk = true;
}
}
else {
// we could not set this track to gsm, so disable it
track[i].setEnabled(false);
}
}
DataSource ds = null;
try {
ds = processor.getDataOutput();
} catch (NotRealizedError e){
System.exit(-1);
}
try {
rtpsm.createSendStream(ds, 0);
} catch (IOException e){
e.printStackTrace();
} catch( UnsupportedFormatException e) {
e.printStackTrace();
}
if (devices.size() > 0) {
di = (CaptureDeviceInfo) devices.elementAt( 0);
}
else {
// exit if we could not find the relevant capturedevice.
System.exit(-1);
}
processor.setContentDescriptor(
new ContentDescriptor( ContentDescriptor.RAW));
if (((FormatControl)track[i]).
setFormat( new AudioFormat(AudioFormat.GSM_RTP,
8000,
8,
1)) == null) {
track[i].setEnabled(false);
}
else {
encodingOk = true;
}
}
if (encodingOk) {
processor.realize();
try {
origDataSource = processor.getDataOutput();
} catch (NotRealizedError e) {
System.exit(-1);
}
cloneableDataSource
= Manager.createCloneableDataSource(origDataSource);
clonedDataSource
= ((SourceCloneable)cloneableDataSource).createClone();
SessionManager rtpsm1
= new com.sun.media.rtp.RTPSessionMgr();
} catch (IOException e) {
e.printStackTrace();
} catch( UnsupportedFormatException e) {
e.printStackTrace();
}
try {
cloneableDataSource.connect();
cloneableDataSource.start();
} catch (IOException e) {
e.printStackTrace();
}
// rtpsm2.initSession(...);
// rtpsm2.startSession(...);
try {
rtpsm2.createSendStream(clonedDataSource,0);
} catch (IOException e) {
e.printStackTrace();
} catch( UnsupportedFormatException e) {
e.printStackTrace();
}
}
}
else {
// we failed to set the encoding to gsm. So deallocate
// and close the processor before we leave.
processor.deallocate();
processor.close();
}
Example 10-4 encodes captured audio in several formats and then sends it
out in multiple RTP sessions. It assumes that there is one stream in the
input DataSource.
The input DataSource is cloned and a second processor is created from the
clone. The tracks in the two Processors are individually set to gsm and
dvi and the output DataSources are sent to two different RTP session man-
if (devices.size() > 0) {
di = (CaptureDeviceInfo) devices.elementAt( 0);
}
else {
// exit if we could not find the relevant capture device.
System.exit(-1);
}
try {
origDataSource = Manager.createDataSource(di.getLocator());
} catch (IOException e) {
System.exit(-1);
} catch (NoDataSourceException e) {
System.exit(-1);
}
if (streams.length == 1) {
cloneableDataSource
= Manager.createCloneableDataSource(origDataSource);
Processor p1 = null;
try {
p1 = Manager.createProcessor(cloneableDataSource);
} catch (IOException e) {
System.exit(-1);
} catch (NoProcessorException e) {
System.exit(-1);
}
p1.configure();
track[i].setEnabled(false);
}
else {
encodingOk = true;
}
}
else {
track[i].setEnabled(false);
}
}
try {
ds = processor.getDataOutput();
} catch (NotRealizedError e) {
System.exit(-1);
}
SessionManager rtpsm1
= new com.sun.media.rtp.RTPSessionMgr();
// rtpsm1.initSession(...);
// rtpsm1.startSession(...);
try {
rtpsm1.createSendStream(ds, // first datasource
0); // first sourcestream of
// first datasource
} catch (IOException e) {
e.printStackTrace();
} catch( UnsupportedFormatException e) {
e.printStackTrace();
}
}
// Now repeat the above with the cloned data source and
// set the encoding to dvi. i.e create a processor with
// inputdatasource clonedDataSource
// and set encoding of one of its tracks to dvi.
// create SessionManager giving it the output datasource of
// this processor.
<protocol package-prefix>.media.datasink.rtpraw.Handler
The session manager prepares individual RTP packets that are ready to be
transmitted across the network and sends them to the RTPSocket created
from:
<protocol package-prefix>.media.protocol.rtpraw.DataSource
You're responsible for transmitting the RTP packets out on the underlying network.
In the following example, an RTPSocket is used to transmit captured audio:
Example 10-5: Transmitting RTP data with RTPSocket (1 of 3)
// Find a capture device that will capture linear audio
// data at 8bit 8Khz
try {
processor = Manager.createProcessor(di.getLocator());
} catch (IOException e) {
System.exit(-1);
} catch (NoProcessorException e) {
System.exit(-1);
}
processor.setContentDescriptor(
new ContentDescriptor( ContentDescriptor.RAW));
if (((FormatControl)track[i]).
setFormat( new AudioFormat(AudioFormat.GSM_RTP,
8000,
8,
1)) == null) {
track[i].setEnabled(false);
}
else {
encodingOk = true;
}
}
else {
// we could not set this track to gsm, so disable it
track[i].setEnabled(false);
}
}
d.open();
d.start();
} catch (Exception e) {
System.exit(-1);
}
}
11
Importing and Exporting
RTP Media Streams
Many applications need to be able to read and write RTP streams. For example, a conferencing application might record a conference and broadcast it at a later time, or telephony applications might transmit stored audio streams for announcement messages or hold music.
You can save RTP streams received from the network to a file using an
RTP file writer DataSink. Similarly, you can read saved files and either
present them locally or transmit them across the network.
processor
= Manager.createProcessor( new MediaLocator(url));
} catch (IOException e) {
System.exit(-1);
if (((FormatControl)track[i]).
setFormat( new AudioFormat(AudioFormat.ULAW_RTP,
8000,
8,
1)) == null) {
track[i].setEnabled(false);
}
else {
encodingOk = true;
}
}
else {
// we could not set this track to ulaw, so disable it
track[i].setEnabled(false);
}
}
if (encodingOk) {
processor.realize();
try {
ds = processor.getDataOutput();
} catch (NotRealizedError e) {
System.exit(-1);
}
try {
String url= "rtp://224.144.251.104:49150/audio/1";
d.open();
d.start();
} catch (Exception e) {
System.exit(-1);
}
}
If you want to transcode the data before storing it, you can use the Data-
Source retrieved from the ReceiveStream to construct a Processor. You
then:
This example handles a single track. To write a file that contains both audio and video tracks, you need to retrieve the audio and video streams from the separate session managers and create a merging DataSource that carries both of the streams. Then you hand the merged DataSource to Manager.createDataSink.
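A sketch of that step, assuming audioDS and videoDS were retrieved from the audio and video ReceiveStreams (the output file name is a placeholder):

try {
    DataSource merged = Manager.createMergingDataSource(
        new DataSource[] { audioDS, videoDS });
    DataSink sink = Manager.createDataSink(merged,
        new MediaLocator("file://session.mov"));
    sink.open();
    sink.start();
} catch (Exception e) {
    System.err.println(e.getMessage());
}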
try {
// get the ReceiveStream
stream =((NewReceiveStreamEvent)event)
.getReceiveStream();
Manager.createDataSink(dsource, f);
} catch (Exception e) {
System.err.println("newReceiveStreamEvent exception "
+ e.getMessage());
return;
}
}
}
12
Creating Custom
Packetizers and
Depacketizers
Note: The RTP 1.0 API supported custom packetizers and depacketizers through RTP-specific APIs. These APIs have been replaced by the generic JMF plug-in API, and any custom packetizers or depacketizers created for RTP 1.0 will need to be ported to the new architecture.
RTP packetizers are responsible for taking entire video frames or multiple
audio samples and distributing them into packets of a particular size that
can be streamed over the underlying network. Video frames are divided
into smaller chunks, while audio samples are typically grouped together.
RTP depacketizers reverse the process and reconstruct complete video
frames or extract individual audio samples from a stream of RTP packets.
The RTP session manager itself does not perform any packetization or
depacketization. These operations are performed by the Processor using
specialized Codec plug-ins.
[Figure: The RTP packetization pipeline within the JMF architecture. A Demux feeds a Codec/Packetizer whose JPEG_RTP output is passed to a MUX. At the point between the Packetizer and the Multiplexer, the packetized JPEG_RTP encoded data has:
Buffer Format = JPEG_RTP
Buffer Header = RTPHeader (javax.media.rtp.RTPHeader)
Buffer Data = JPEG Payload header + JPEG Payload]
1. See the IETF RTP payload specifications for more information about how particular
payloads are to be carried in RTP.
All source streams streamed out on RTP DataSources have their content descriptor set to an empty content descriptor of "" and their format set to the appropriate RTP-specific format and encoding. To be able to intercept or depacketize this data, plug-in codecs must advertise this format as one of their input formats.
For packets being sent over the network, the Processor's format must be set to one of the RTP-specific formats (encodings). The plug-in codec must advertise this format as one of its supported output formats. All Buffer objects passed to the SessionManager through the DataSource sent to createSendStream must have an RTP-specific format. The header of the Buffer is as described in javax.media.rtp.RTPHeader.
This Java Applet demonstrates proper error checking in a Java Media pro-
gram. Like PlayerApplet, it creates a simple media player with a media
event listener.
When this applet is started, it immediately begins to play the media clip.
When the end of media is reached, the clip replays from the beginning.
Example A-1: TypicalPlayerApplet with error handling. (1 of 5)
import java.applet.Applet;
import java.awt.*;
import java.lang.String;
import java.net.URL;
import java.net.MalformedURLException;
import java.io.IOException;
import javax.media.*;
/**
* This is a Java Applet that demonstrates how to create a simple
* media player with a media event listener. It will play the
* media clip right away and continuously loop.
*
* <!-- Sample HTML
* <applet code=TypicalPlayerApplet width=320 height=300>
* <param name=file value="Astrnmy.avi">
* </applet>
* -->
*/
/**
* Read the applet file parameter and create the media
* player.
*/
/**
* Start media file playback. This function is called the
* first time that the Applet runs and every
* time the user re-enters the page.
*/
/**
* Stop media file playback and release resources before
* leaving the page.
*/
/**
* This controllerUpdate function must be defined in order
* to implement a ControllerListener interface. This
* function will be called whenever there is a media event.
*/
if (progressBar == null)
if ((progressBar = cc.getProgressBarComponent()) != null)
{
add("North", progressBar);
validate();
}
player.setMediaTime(new Time(0));
player.start();
}
else if (event instanceof ControllerErrorEvent)
{
// Tell TypicalPlayerApplet.start() to call it a day
player = null;
Fatal (((ControllerErrorEvent)event).getMessage());
}
}
public StateHelper(Player p) {
player = p;
p.addControllerListener(this);
}
/**
* Demultiplexer for GSM file format
*/
/**
* GSM
* 8000 samples per sec.
* 160 samples represent 20 milliseconds and GSM represents them
* in 33 bytes. So frameSize is 33 bytes and there are 50 frames
* in one second. One second is 1650 bytes.
*/
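Stated as code, the arithmetic behind these constants is simply the following (an illustrative snippet, not part of the parser):

// Illustrative arithmetic only: GSM audio is 8000 samples per second,
// 160 samples (20 ms) per frame, 33 bytes per encoded frame.
public class GsmMath {
    static final int SAMPLE_RATE       = 8000;
    static final int SAMPLES_PER_FRAME = 160;
    static final int BYTES_PER_FRAME   = 33;

    public static void main(String[] args) {
        int framesPerSecond = SAMPLE_RATE / SAMPLES_PER_FRAME;   // 50
        int bytesPerSecond  = framesPerSecond * BYTES_PER_FRAME; // 1650
        System.out.println(framesPerSecond + " frames/s, "
                + bytesPerSecond + " bytes/s");
    }
}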
if (streams == null) {
throw new IOException("Got a null stream from the DataSource");
}
if (streams.length == 0) {
throw new IOException("Got an empty stream array from the DataSource");
}
this.source = source;
this.streams = streams;
if (!supports(streams))
throw new IncompatibleSourceException("DataSource not supported: " + source);
}
/**
* A Demultiplexer may support pull only or push only or both
* pull and push streams.
*/
/**
* Opens the plug-in software or hardware component and acquires
* necessary resources. If all the needed resources could not be
* acquired, it throws a ResourceUnavailableException. Data should not
* be passed into the plug-in without first calling this method.
*/
public void open() {
// throws ResourceUnavailableException;
}
/**
* Closes the plug-in component and releases resources. No more data
* will be accepted by the plug-in after a call to this method. The
* plug-in can be reinstated after being closed by calling
* <code>open</code>.
*/
public void close() {
if (source != null) {
try {
source.stop();
source.disconnect();
} catch (IOException e) {
// Internal error?
}
source = null;
}
}
/**
* This gets called when the player/processor is started.
*/
public void start() throws IOException {
if (source != null)
source.start();
}
/**
* Resets the state of the plug-in. Typically at end of media
* or when media is repositioned.
*/
public void reset() {
}
if (tracks[0] != null)
return tracks;
stream = (PullSourceStream) streams[0];
readHeader();
bufferSize = bytesPerSecond;
tracks[0] = new GsmTrack((AudioFormat) format,
/*enabled=*/ true,
new Time(0),
numBuffers,
bufferSize,
minLocation,
maxLocation
);
return tracks;
}
} else {
maxLocation = Long.MAX_VALUE;
}
if (time < 0)
time = 0;
if (remainder > 0) {
switch (rounding) {
case Positionable.RoundUp:
newPos += blockSize;
break;
case Positionable.RoundNearest:
if (remainder > (blockSize / 2.0))
newPos += blockSize;
newPos += minLocation;
((BasicTrack) tracks[0]).setSeekLocation(newPos);
return where;
}
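Taken by itself, the rounding step above amounts to the helper below. This is an illustrative sketch with assumed names, not the parser's actual code:

// Illustrative helper: align a byte position to a whole frame boundary,
// rounding down by default, or up or to the nearest frame on request.
public class FrameAlign {
    public static long align(long bytePos, int blockSize,
                             boolean roundUp, boolean roundNearest) {
        long remainder = bytePos % blockSize;
        long newPos = bytePos - remainder;            // round down by default
        if (remainder > 0
                && (roundUp || (roundNearest && remainder > blockSize / 2.0))) {
            newPos += blockSize;                      // move up to the next frame
        }
        return newPos;
    }
}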
/**
* Returns a descriptive name for the plug-in.
* This is a user readable string.
*/
public String getName() {
return "Parser for raw GSM";
}
/**
* Read numBytes from offset 0
*/
public int readBytes(PullSourceStream pss, byte[] array,
int numBytes) throws IOException {
remainingLength = numBytes;
while (remainingLength > 0) {
////////////////////////////////////////////////////////////
// Inner classes begin
abstract private class BasicTrack implements Track {
BasicTrack(SampleDeMux parser,
Format format, boolean enabled,
Time duration, Time startTime,
int numBuffers, int dataSize,
PullSourceStream stream) {
this(parser, format, enabled, duration, startTime,
numBuffers, dataSize, stream,
0L, Long.MAX_VALUE);
}
/**
* Note to implementors who want to use this class.
* If the maxLocation is not known, then
* specify Long.MAX_VALUE for this parameter
*/
public BasicTrack(SampleDeMux parser,
Format format, boolean enabled,
Time duration, Time startTime,
int numBuffers, int dataSize,
PullSourceStream stream,
long minLocation, long maxLocation) {
this.parser = parser;
buffer.setFormat(format);
Object obj = buffer.getData();
byte[] data;
long location;
boolean needToSeek;
synchronized(this) {
if (seekLocation != -1) {
location = seekLocation;
seekLocation = -1;
needToSeek = true;
} else {
location = parser.getLocation(stream);
needToSeek = false;
}
}
int needDataSize;
if ( (obj == null) ||
(! (obj instanceof byte[]) ) ||
( ((byte[])obj).length < needDataSize) ) {
data = new byte[needDataSize];
buffer.setData(data);
} else {
data = (byte[]) obj;
}
try {
if (needToSeek) {
long pos =
((javax.media.protocol.Seekable)stream).seek(location);
float bytesPerSecond;
float bytesPerFrame;
float samplesPerFrame;
import javax.media.protocol.PullDataSource;
import javax.media.protocol.SourceStream;
import javax.media.protocol.PullSourceStream;
import javax.media.Time;
import javax.media.Duration;
import java.io.*;
import java.net.*;
import java.util.Vector;
// file to retrieve
protected String fileString;
if (readReply() == FTP_ERROR)
{
throw new IOException("connection failed");
}
try
{
issueCommand("QUIT");
controlSocket.close();
}
catch (IOException e)
{
// do nothing, we just want to shutdown
}
controlSocket = null;
controlIn = null;
controlOut = null;
}
dataSocket = serverSocket.accept();
serverSocket.close();
}
public void stop()
{
try
{
// issue ABORt command
issueCommand("ABOR");
dataSocket.close();
}
catch(IOException e) {}
}
if (extension.equals("avi"))
typeString = "video.x-msvideo";
else if (extension.equals("mpg") ||
extension.equals("mpeg"))
typeString = "video.mpeg";
else if (extension.equals("mov"))
typeString = "video.quicktime";
catch(IOException e)
{
System.out.println("error getting streams");
}
return streams;
}
{
this.user = user;
}
/**
* Pulls the response from the server and returns the code as a
* number. Returns -1 on failure.
*/
response.setSize(0);
while (true)
{
while ((c = controlIn.read()) != -1)
{
if (c == '\r')
{
if ((c = controlIn.read()) != '\n')
{
buff.append('\r');
if (c == '\n')
{
break;
}
}
responseStr = buff.toString();
buff.setLength(0);
try
{
code = Integer.parseInt(responseStr.substring(0, 3));
}
catch (NumberFormatException e)
{
code = -1;
}
catch (StringIndexOutOfBoundsException e)
{
/* this line doesn't contain a response code, so
* we just completely ignore it
*/
continue;
}
response.addElement(responseStr);
if (continuingCode != -1)
{
/* we've seen a XXX- sequence */
if (code != continuingCode ||
(responseStr.length() >= 4 &&
responseStr.charAt(3) == '-'))
{
continue;
}
else
{
/* seen the end of code sequence */
continuingCode = -1;
break;
}
}
else if (responseStr.length() >= 4 &&
responseStr.charAt(3) == '-')
{
continuingCode = code;
continue;
}
else
{
previousReplyCode = code;
return code;
}
Source Stream
Example D-2:
package intel.media.protocol.ftp;
import java.io.*;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.PullSourceStream;
import javax.media.protocol.SourceStream;
// SourceStream methods
// PullSourceStream methods
public int read(byte[] buffer, int offset, int length) throws
IOException
{
int n = dataIn.read(buffer, offset, length);
if (n == -1)
{
eofMarker = true;
}
return n;
}
• EventPoster
TimeLineController
int ourState;
long timeLine[];
int currentSegment = -1;
long duration;
Thread myThread;
for (;;)
{
if (min == max) return min;
int current = min + ((max - min) >> 1);
else
{
min = current + 1;
}
}
}
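The listing above is abridged; a complete version of this search usually has the shape shown below. This is a sketch with assumed names, not the original class:

// Sketch: locate the first segment whose cumulative end time
// (timeLine[index]) is at or after the given media time.
public class SegmentSearch {
    public static int whichSegment(long[] timeLine, long time) {
        if (timeLine == null || timeLine.length == 0) return -1;
        int min = 0;
        int max = timeLine.length - 1;
        for (;;) {
            if (min == max) return min;
            int current = min + ((max - min) >> 1);
            if (time <= timeLine[current]) {
                max = current;           // answer lies in the lower half
            } else {
                min = current + 1;       // answer lies in the upper half
            }
        }
    }
}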
// These are all simple...
if (factor == 0.0f)
{
factor = 1.0f;
}
// From Controller
boolean endOfMedia;
float rate = ourClock.getRate ();
else
{
endOfMedia = false;
}
// We face the same possible problem with being past the stop
// time. If so, we stop immediately.
boolean pastStopTime;
long stopTime = ourClock.getStopTime().getNanoseconds();
if ((stopTime != Long.MAX_VALUE) &&
((startTime >= stopTime && rate >= 0.0f) ||
(startTime <= stopTime && rate <= 0.0f)))
{
pastStopTime = true;
}
else
{
pastStopTime = false;
}
if (endOfMedia)
{
postEvent (new EndOfMediaEvent (this,
Controller.Started,
Controller.Prefetched, Controller.Prefetched,
new Time(startTime)));
}
else if (pastStopTime)
{
postEvent (new StopAtTimeEvent (this, Controller.Started,
Controller.Prefetched, Controller.Prefetched,
new Time(startTime)));
}
else
{
myThread = new Thread (this, "TimeLineController");
// This one is also pretty easy. We stop and tell the running
// thread to exit.
if (ourState == Controller.Prefetched)
{
myThread = null;
notifyAll ();
break;
}
long endOfMediaTime;
else if (currentSegment == 0)
{
timeToNextSegment = timeNow;
}
else
{
timeToNextSegment = timeNow - timeLine[currentSegment-1];
}
}
long waitTime;
if (ourRate > 0)
{
waitTime = (long) ((float) mediaTimeToWait / ourRate) /
1000000;
}
else
{
waitTime = (long) ((float) mediaTimeToWait / -ourRate) /
1000000;
}
// Add one because we just rounded down and we don't
// really want to waste CPU being woken up early.
waitTime++;
if (waitTime > 0)
{
// A bug in some systems deals poorly with really large numbers.
// We will cap our wait() to 1000 seconds, at which point we will
// probably decide to wait again.
TimeLineEvent
EventPostingBase
// import COM.yourbiz.media.EventPoster;
ListenerList iter;
for (iter = olist; iter != null; iter = iter.next)
{
if (iter.observer == observer) return;
}
ListenerList
class ListenerList
{
ControllerListener observer;
ListenerList next;
}
EventPoster
RTPUtil demonstrates how to create separate RTP players for each stream
in a session so that you can play the streams. To do this, you need to listen
for NewReceiveStreamEvents and retrieve the DataSource from each new
stream. (See "Creating an RTP Player for Each New Receive Stream" on
page 132 for more information about this example.)
Example F-1: RTPUtil (1 of 5)
import javax.media.rtp.*;
import javax.media.rtp.rtcp.*;
import javax.media.rtp.event.*;
import javax.media.*;
import javax.media.protocol.*;
import java.net.InetAddress;
import javax.media.format.AudioFormat;
// for PlayerWindow
import java.awt.*;
import com.sun.media.ui.*;
import java.util.Vector;
mgr.addFormat(new AudioFormat(AudioFormat.DVI_RTP,
44100,
4,
1),
18);
if (listener) mgr.addReceiveStreamListener(this);
if (sendlistener) new RTPSendStreamWindow(mgr);
try {
username = System.getProperty("user.name");
} catch (SecurityException e){
username = "jmf-user";
}
try{
InetAddress destaddr = InetAddress.getByName(address);
new SourceDescription(SourceDescription
.SOURCE_DESC_TOOL,
"JMF RTP Player v2.0",
1,
false)
};
mgr.initSession(localaddr,
userdesclist,
0.05,
0.25);
mgr.startSession(sessaddr,ttl,null);
} catch (Exception e) {
System.err.println(e.getMessage());
return null;
}
return mgr;
}
public void update( ReceiveStreamEvent event)
{
Player newplayer = null;
RTPPlayerWindow playerWindow = null;
try
{
// get a handle over the ReceiveStream
stream =((NewReceiveStreamEvent)event)
.getReceiveStream();
playerlist.addElement(newplayer);
newplayer.addControllerListener(this);
if (!terminatedbyClose){
if (playerlist.contains(p))
playerlist.removeElement(p);
if ((playerlist.size() == 0) && (mgr != null))
mgr.closeSession("All players are closed");
}
}
}
broadcast
Transmit a data stream that multiple clients can receive if they choose
to.
Buffer
The container for a chunk of media data.
CachingControl
A media control that is used to monitor and display download
progress information.
capture device
A microphone, video capture board, or other source that generates or
provides time-based media data. A capture device is represented by a
DataSource.
CaptureDeviceControl
A media control that enables the user to control a capture device.
CaptureDeviceInfo
An information object that maintains information about a capture
device, such as its name, the formats it supports, and the MediaLocator
needed to construct a DataSource for the device.
CaptureDeviceManager
The manager that provides access to the capture devices available to
JMF.
Clock
A media object that defines a transformation on a TimeBase.
close
Release all of the resources associated with a Controller.
codec
A compression/decompression engine used to convert media data
between compressed and raw formats. The JMF plug-in architecture
enables technology providers to supply codecs that can be seamlessly
integrated into JMF's media processing.
compositing
Combining multiple sources of media data to form a single finished
product.
configured
A Processor state that indicates that the Processor has been connected
to its data source and the data format has been determined.
configuring
A Processor state that indicates that configure has been called and
the Processor is connecting to the DataSource, demultiplexing the
input stream, and accessing information about the data format.
content name
A string that identifies a content type.
content package-prefix
A package prefix in the list of package prefixes that the PackageManager
maintains for content extensions such as new DataSource implementations.
content package-prefix list
The list of content package prefixes maintained by the PackageManager.
content type
A multiplexed media data format such as MPEG-1, MPEG-2, QuickTime,
AVI, WAV, AU, or MIDI. Content types are usually identified by
MIME types.
Control
A JMF construct that can provide access to a user interface component
to support user interaction. JMF controls implement the Control
interface.
control-panel component
The user interface component that enables the user to control the
media presentation.
Controller
The key construct in the JMF Player/Processor API. The Controller
interface defines the basic state and control mechanism for an object
that controls, presents, or captures time-based media.
ControllerAdapter
An event adapter that receives ControllerEvents and dispatches
them to an appropriate stub-method. Classes that extend this adapter
can easily replace only the message handlers they are interested in.
ControllerClosedEvent
An event posted by a Controller when it shuts down. A
ControllerErrorEvent is a special type of ControllerClosedEvent.
ControllerEvent
The ControllerEvent class is the base class for events posted by a
Controller object. To receive ControllerEvents, you implement the
ControllerListener interface.
ControllerListener
An object that implements the ControllerListener interface to receive
notification whenever a Controller posts a ControllerEvent. See also
ControllerAdapter.
data
The actual media data contained by a Buffer object.
DataSink
An object that implements the DataSink interface to read media con-
tent from a DataSource and render the media to a destination.
DataSource
An object that implements the DataSource interface to encapsulate
the location of media and the protocol and software used to deliver
the media.
deallocate
Release any exclusive resources and minimize the use of non-exclu-
sive resources.
decode
Convert a data stream from a compressed type to an uncompressed
type.
demultiplex
Extract individual tracks from a multiplexed media stream.
Demultiplexer
A JMF plug-in that parses the input stream. If the stream contains
interleaved tracks, they are extracted and output as separate tracks.
duration
The length of time it takes to play the media at the default rate.
Effect
A JMF plug-in that applies an effect algorithm to a track and outputs
the modified track in the same format.
encode
Convert a data stream from an uncompressed type to a compressed
type.
end of media (eom)
The end of a media stream.
StreamWriterControl
A Control implemented by data sinks and multiplexers that generate
output data. This Control enables users to specify a limit on the
amount of data generated.
format
A structure for describing a media data type.
frame
One unit of data in a track. For example, one image in a video track.
frame rate
The number of frames that are displayed per second.
GainChangeEvent
An event posted by a GainControl whenever the volume changes.
GainChangeListener
An object that implements the GainChangeListener interface to
receive GainChangeEvents from a GainControl.
GainControl
A JMF Control that enables programmatic or interactive control over
the playback volume.
JMF (Java Media Framework)
An application programming interface (API) for incorporating media
data types into Java applications and applets.
key frame
A frame of video that contains the data for the entire frame rather
than just the differences from the previous frame.
latency
See start latency.
managed controller
A Controller that is synchronized with other Controllers through a
managing Player. The managing Player drives the operation of each
managed Controller; while a Controller is being managed, you
should not directly manipulate its state.
Manager
The JMF access point for obtaining system dependent resources such
as Players, Processors, DataSources and the system TimeBase.
managing player
A Player that is driving the operation of other Controllers in order to
synchronize them. The addController method is used to place
Controllers under the control of a managing Player.
media time
The current position in a media stream.
MediaHandler
An object that implements the MediaHandler interface, which defines
how the media source that the handler uses to obtain content is
selected. There are currently three supported types of MediaHandlers:
Player (including Processor), MediaProxy, and DataSink.
MediaLocator
An object that describes the media that a Player presents. A
MediaLocator is similar to a URL and can be constructed from a URL. In
the Java programming language, a URL can only be constructed if the
corresponding protocol handler is installed on the system. MediaLocator
doesn't have this restriction.
MediaProxy
An object that processes content from one DataSource to create
another. Typically, a MediaProxy reads a text configuration file that
contains all of the information needed to make a connection to a
server and obtain media data.
MIME type
A standardized content type description based on the Multipurpose
Internet Mail Extensions (MIME) specification.
MonitorControl
A Control that provides a way to display the capture monitor for a
particular capture device.
multicast
Transmit a data stream to a select group of participants. See also
broadcast, unicast.
multiplex
Merge separate tracks into one multiplexed media stream.
multiplexed media stream
A media stream that contains multiple channels of media data.
Multiplexer
A JMF plug-in that combines multiple tracks of input data into a sin-
gle interleaved output stream and delivers the resulting stream as an
output DataSource.
pre-process
Apply an effect algorithm before the media stream is encoded.
prefetch
Prepare a Player to present its media. During this phase, the Player
preloads its media data, obtains exclusive-use resources, and does
anything else it needs to prepare itself to play.
prefetched
A Player state in which the Player is ready to be started.
prefetching
A Player state in which the Player is in the process of preparing itself
to play.
Processor
A special type of JMF Player that can provide control over how the
media data is processed before it is presented.
Processor state
One of the eight states that a Processor can be in. A Processor has
two more Stopped states than a Player: Configuring and Configured. See
also Player state.
ProcessorModel
An object that defines the input and output requirements for a
Processor. When a Processor is created using a ProcessorModel, the
Manager does its best to create a Processor that meets these requirements.
protocol
A data delivery mechanism such as HTTP, RTP, FILE.
protocol package-prefix
A package prefix in the list of package prefixes that the PackageManager
maintains for protocol extensions such as new MediaHandlers.
pull
Initiate the data transfer and control the flow of data from the client
side.
PullBufferDataSource
A pull DataSource that uses a Buffer object as its unit of transfer.
PullDataSource
A DataSource that enables the client to initiate the data transfer and
control the flow of data.
PullBufferStream
A SourceStream managed by a PullBufferDataSource.
PullSourceStream
A SourceStream managed by a PullDataSource.
push
Initiate the data transfer and control the flow of data from the server
side.
PushBufferDataSource
A push DataSource that uses a Buffer object as its unit of transfer.
PushDataSource
A DataSource that enables the server to initiate the data transfer and
control the flow of data.
PushBufferStream
A SourceStream managed by a PushBufferDataSource.
PushSourceStream
A SourceStream managed by a PushDataSource.
rate
A temporal scale factor that determines how media time changes
with respect to time-base time. A Player object's rate defines how
many media time units advance for every unit of time-base time.
raw media format
A format that can be directly rendered by standard media rendering
devices without the need for decompression. For audio, a PCM sam-
ple representation is one example of a raw media format.
realize
Determine resource requirements and acquire the resources that the
Player only needs to acquire once.
realized
The Player state in which the Player knows what resources it needs
and information about the type of media it is to present. A Realized
Player knows how to render its data and can provide visual components
and controls. Its connections to other objects in the system are
in place, but it doesn't own any resources that would prevent another
Player from starting.
realizing
The Player state in which the Player is determining what resources it
needs and gathering information about the type of media it is to
present.
render
Deliver media data to some destination, such as a monitor or speaker.
Renderer
A JMF plug-in that delivers media data to some destination, such as a
monitor or speaker.
RTCP
RTP Control Protocol.
RTP
Real-time Transport Protocol.
session
In RTP, the association among a set of participants communicating
with RTP. A session is defined by a network address plus a port pair
for RTP and RTCP.
source
A provider of a stream of media data.
SourceStream
A single stream of media data.
SSRC
See synchronization source.
start
Activate a Player. A Started Player’s time-base time and media time
are mapped and its clock is running, though the Player might be
waiting for a particular time to begin presenting its media data.
start latency
The time it takes before a Player can begin presenting media data.
started
One of the two fundamental Clock states. (The other is Stopped.)
Controller breaks the Stopped state down into several resource allocation
phases: Unrealized, Realizing, Realized, Prefetching, and Prefetched.
status change events
Controller events such as RateChangeEvent, SizeChangeEvent, and
StopTimeChangeEvent that indicate that the status of a Controller has
changed.
stop
Halt a Player’s presentation of media data.
stop time
The media time at which a Player should halt.
synchronization source
The source of a stream of RTP packets, identified by a 32-bit numeric
SSRC identifier carried in the RTP header.
synchronize
Coordinate two or more Controllers so that they can present media
data together. Synchronized Controllers use the same TimeBase.
target state
The state that a Player is heading toward. For example, when a
Player is in the Realizing state, its target state is Realized.
time-based media
Media such as audio, video, MIDI, and animations that change with
respect to time.
TimeBase
An object that defines the flow of time for a Controller. A TimeBase is
a constantly ticking source of time, much like a crystal.
time-base time
The current time returned by a TimeBase.
track
A channel in a multiplexed media stream that contains media or con-
trol data. For example, a multiplexed media stream might contain an
audio track and a video track.
TrackControl
A Control used to query, control and manipulate the data of individ-
ual media tracks.
track format
The format associated with a particular track.
transcode
Convert a data stream from an uncompressed type to a compressed
type or vice-versa.
transition events
ControllerEvents posted by a Controller as its state changes.
unicast
Transmit a data stream to a single recipient.
unrealized
The initial state of a Player. A Player in the Unrealized state has been
instantiated, but does not yet know anything about its media.
URL
Uniform Resource Locator.
user-interface component
An instance of a class that implements the Component interface. JMF
Players have two types of default user-interface components, a
ControlPanelComponent and a VisualComponent.
visual component
The user interface component that displays the media or information
about the media.
VOD
Video on Demand.
Index
A
addController method 57
adjusting audio gain 29
applet 173
APPLET tag 62
AU 11
AVI 11
B
blocking realize 44
broadcast media 17
Buffer 16
C
CachingControl 46, 47
CachingControlEvent 31, 47
capture controls 78
capture device
    registering 106
CaptureDeviceInfo 77, 78
CaptureDeviceManager 77
capturing media data 77, 78
change notifications 30
clearing the stop time 52
Clock 13
    getTimeBase 56
    setTimeBase 56
close method 52
closed events 30
closing a Player 52
Codec 33
    implementing 88
ConfigureCompleteEvent 34
configured state 33
configuring state 33
ConnectionErrorEvent 31
content-type name 41
Control 29
control panel 45
Controller
    implementing 104, 207
    state
        prefetched 27
        prefetching 27
        realized 27
        realizing 27
        started 26, 27
        stopped 26
        unrealized 27
ControllerAdapter 55
ControllerClosedEvent 31
ControllerErrorEvent 31
ControllerEvent 25
    getSource method 55
    state information 55
ControllerListener 25, 27, 47
    implementing 47, 54, 173
    registering 54, 66
Controllers
    synchronizing multiple 57
controllerUpdate method 56
    implementing 55, 66
controlling the media presentation 45
createPlayer method 44, 64
creating a Player 44, 64, 173
D
data format
    output 36
data, writing 37
DataSink 37
DataSinkErrorEvent 37
DataSinkEvent 37
DataSource
R
rate 51
RateChangeEvent 31
realize
    blocking on 44
realize method 27, 49
RealizeCompleteEvent 31, 49, 66
realized state 27, 49
realizing 27
realizing a Player 49
realizing state 27
Real-time Transport Protocol (RTP) 5, 17
registering a plug-in, plug-in
    registering 101
registering as a ControllerListener 54, 66
releasing resources 65
removeController method 57
Renderer 33
    implementing 95
Rendering 33
ResourceUnavailableEvent 31
RestartingEvent 31
reverse, playing in 48
RTP 5, 17
S
sample program, PlayerApplet 61
saving media data to a file 79
Seekable 103
setFormat 73
setLevel method 46
setMute method 46
setOutputContentDescriptor 71
setOutputContentType 33
setRate method 48
setSource method 105
setStopTime method 52
setTimeBase method 56
setting
    audio gain 29
    stop time 51
shutting down a Player 52
SourceStream 103
start method 27, 50, 65
started state 26, 27
StartEvent 31, 50
starting a Player 50
state
    configuring 33
    prefetched 27
    prefetching 27
    realized 27
    started 26, 27
    stopped 26
    unrealized 27
stop method 50, 65
stop time 51
    clearing 52
StopAtTimeEvent 31
StopByRequestEvent 31
StopEvent 31
stopped state 26
stopping
    Player 50
StopTimeChangeEvent 31
StreamWriterControl 37
synchronization 50
synchronizing Controllers 57
syncStart 50, 60, 61
T
temporal scale factor 47
time
    getting 53
time-base time
    getting 54
To 78
TrackControl 34, 36
transcoding 33
transition events 30
TransitionEvent 31
U
unit of transfer 16
unrealized state 27
URL 16, 44
    instantiating 44
user-interface 66
    custom 46
V
validate method 66
video-on-demand (VOD) 17
VOD (video-on-demand) 17
W
WAV 11