
Dereje Teferi (PhD)

dereje.teferi@aau.edu.et

1
Characteristics of a Multimedia System
A multimedia system has four basic characteristics:
 Multimedia systems must be computer controlled.
 Multimedia systems are integrated.
 The information they handle must be represented digitally.
 The interface to the final presentation of media is usually interactive.
2
Challenges for Multimedia Systems
 Multimedia systems may have to render a variety of media at the same instant -- a distinction from normal applications.
 There is a temporal relationship between many forms of media (e.g. video and audio).
There are two forms of problems here:
 Sequencing within the media -- playing frames in the correct order/time frame in video
 Synchronisation -- inter-media scheduling (e.g. video and audio)
 Lip synchronisation is clearly important when humans watch playback of video with audio, and even animation with audio.
3
Classes of Multimedia Applications
 Hypertext and Hypermedia

 Streaming Stored Audio and Video

 Streaming Live Audio and Video

 Real-Time Interactive Audio and Video

 Others

4
Class: Hypertext and Hypermedia
 Hypertext is text that contains links to other texts.
 Hypertext is usually non-linear.
 Hypermedia is not constrained to be text-based.
 It can include other media, e.g. graphics, images, and especially the continuous media -- sound and video.
 Ted Nelson was the first to use both of these terms.
 The World Wide Web (WWW) is the best example of a hypermedia application.

5
Class: Streaming Stored Audio and Video
 The multimedia content has been prerecorded and stored on a server
 Users may pause, rewind, fast-forward, etc.
 The time between the initial request and the start of display can be 1 to 10 seconds
 Constraints:
 after display starts, playout must be continuous
 if the Internet connection is lost, transmission stops
 requires sufficient bandwidth to play, especially at higher quality, e.g. Netflix, Prime Video, DSTV, etc.

6
Class: Streaming Live Audio and Video
 Similar to traditional broadcast TV/radio, but delivered over the Internet
 Non-interactive: just view/listen
 Can pause or rewind
 Often combined with multicast
 The time between the initial request and the start of display can be up to 10 seconds
 Constraints:
 requires a substantial amount of bandwidth
 if the Internet connection is lost, transmission stops
 like stored streaming, after display starts, playout must be continuous
7
Class: Real-Time Interactive Audio and Video
 Phone conversation/Video conferencing
 Examples
 Zoom
 Google Meet
 MS Teams
 Jitsi Meet
 Cisco Webex, etc.
 Constraint: the delay between the initial request and the start of display must be small
 Video: <150 ms acceptable
 Audio: <150 ms not perceived, <400 ms acceptable
 Needs a continuous and reliable Internet connection
 after display starts, playout must be continuous

8
Class: Others
 Multimedia sharing applications
 Download-and-then-play applications
 e.g. Napster, Gnutella, Freenet, etc.
 Games
 Distance learning applications
 Coordinate video, audio and data

9
Problems with Internet: Challenges
 The TCP/UDP/IP suite provides best-effort service: no guarantees on the expectation or variance of packet delay
 Performance deteriorates if links are congested (especially trans-oceanic links)
 Most router implementations use only First-Come-First-Served (FCFS) packet processing and transmission scheduling

10
Problems and solutions
 Limited bandwidth
 Solution: Compression
 Packet Jitter
 Solution: Fixed/adaptive playout delay for
Audio (example: voice over IP)
 Packet loss
 Solution: FEC, Interleaving

11
Problem: Limited bandwidth
Intro: Digitization
 Audio
 x samples every second (x = sampling frequency)
 The value of each sample is rounded to one of a finite number of values (for example 256 levels: quantization)
 Video
 Each pixel has a color
 Each color has a value
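As a toy illustration of sampling and 8-bit quantization (the test signal, sampling rate, and level count below are assumptions for the example, not values from the slides), a minimal Python sketch:

import numpy as np

fs = 8000                                   # assumed sampling rate: 8000 samples per second
t = np.arange(fs) / fs                      # one second of sample instants
signal = np.sin(2 * np.pi * 440 * t)        # 440 Hz test tone, amplitude in [-1, 1]

levels = 256                                # 8-bit quantization: 256 possible values
quantized = np.round((signal + 1) / 2 * (levels - 1)).astype(np.uint8)
# every sample is now one of 256 discrete values (0..255), i.e. 8 bits per sample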

12
Problem: Limited bandwidth
Solution: Compression
 Audio
 CD quality: 44100 samples per second with 16 bits per sample, stereo sound
 44100 * 16 * 2 = 1.411 Mbps
 For a 3-minute song: 1.411 Mbps * 180 s = 254 Mb = 31.75 MB
 Video
 For 320*240 images with 24-bit colors
 320 * 240 * 24 bits = 230 KB/image
 15 frames/sec: 15 * 230 KB = 3.45 MB/s
 3 minutes of video: 3.456 MB/s * 180 s = 622 MB
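These uncompressed sizes can be reproduced with a few lines of arithmetic; a quick Python sketch using the slide's own numbers:

# Audio: CD quality
audio_bps = 44100 * 16 * 2                    # samples/s * bits/sample * channels
print(audio_bps / 1e6)                        # ~1.411 Mbps
song_bits = audio_bps * 180                   # a 3-minute song
print(song_bits / 1e6, song_bits / 8 / 1e6)   # ~254 Mb, ~31.75 MB

# Video: 320x240, 24-bit color, 15 frames per second
frame_bytes = 320 * 240 * 24 / 8              # ~230 KB per image
per_second = 15 * frame_bytes                 # ~3.45 MB/s
print(per_second * 180 / 1e6)                 # ~622 MB for 3 minutes of video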
13
Audio compression
 Several techniques
 GSM (13 kbps), G.729 (8 kbps), G.723.1 (6.3 and 5.3 kbps)
 MPEG-1 Layer 3 (also known as MP3)
 Typical compressed rates: 96 kbps, 128 kbps, 160 kbps
 Very little sound degradation
 If the file is broken up, each piece is still playable
 Complex (psychoacoustic masking, redundancy reduction, and bit reservoir buffering)
 3-minute song (128 kbps) = 2.8 MB
14
Image compression: JPEG
 Divide the digitized image into 8x8 pixel blocks
 Pixel blocks are transformed into frequency blocks using the DCT (Discrete Cosine Transform)
 The quantization phase limits the precision of the frequency coefficients
 The encoding phase packs this information in a dense fashion
 Entropy-based compression such as Huffman and run-length encoding is then applied (a sketch of the transform-and-quantize steps follows below)
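A minimal sketch of the transform-and-quantize steps on a single 8x8 block, using numpy and scipy; the level shift and the uniform quantization step are simplifying assumptions (real JPEG uses an 8x8 quantization table):

import numpy as np
from scipy.fftpack import dct

def dct2(block):
    # 2-D DCT-II: apply the 1-D DCT along rows, then along columns
    return dct(dct(block, axis=0, norm='ortho'), axis=1, norm='ortho')

block = np.random.randint(0, 256, (8, 8)).astype(float) - 128   # level-shifted 8x8 pixel block
coeffs = dct2(block)                                            # frequency-domain coefficients
q_step = 16                                                     # assumed uniform quantization step
quantized = np.round(coeffs / q_step).astype(int)
# 'quantized' (mostly small integers and zeros) is what run-length + Huffman coding then packs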
15
Video compression
 Popular techniques

 MPEG-1 for CD-ROM-quality video (1.5 Mbps)

 MPEG-2 for high-quality DVD video (3-6 Mbps)

 MPEG-4 for object-oriented video compression

16
Video Compression: MPEG
 MPEG uses inter-frame encoding
 Exploits the similarity between consecutive frames
 Three frame types
 I frame: independent encoding of the frame (JPEG)
 P frame: encodes the difference relative to an I-frame (predicted)
 B frame: encodes the difference relative to an interpolated frame
 Note that frames will have different sizes
 Complex encoding, e.g. motion of pixel blocks, scene changes, …
 Decoding is easier than encoding
 MPEG often uses fixed-rate encoding
 Example frame sequence (group of pictures): I B B P B B P B B I B B P B B
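As a toy sketch of the idea behind predicted frames (pure difference encoding; real MPEG adds block-based motion compensation, which is omitted here, and the frame data below is made up):

import numpy as np

ref = np.random.randint(0, 256, (240, 320)).astype(np.int16)   # a decoded reference (I) frame
cur = ref.copy()
cur[100:120, 50:90] += 10                                      # a small region changed between frames

residual = cur - ref          # what a P-frame carries: mostly zeros, so it compresses well
decoded = ref + residual      # the decoder adds the residual back onto the reference frame
assert np.array_equal(decoded, cur)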
17
MPEG System Streams
 Combine MPEG video and audio streams in a single synchronized stream
 Consists of a hierarchy with metadata at every level describing the data
 The system level contains synchronization information
 The video level is organized as a stream of groups of pictures
 Pictures are organized in slices
 …

18
MPEG System Streams (cont.)

19
MPEG System Streams (cont.)

20
Problem: Packet Jitter
 Jitter: variation in delay
 (Diagram: the sender emits packets 1-6 evenly spaced; with no jitter the receiver sees the same spacing, while with jitter the inter-arrival gaps vary and packets may even be reordered, e.g. packet 6 arriving before packet 5.)
21
Dealing with packet jitter
 How do voice over IP applications limit the effect of jitter?
 A sequence number is added to each packet
 A timestamp is added to each packet
 Playout is delayed (a sketch of a fixed playout schedule follows below)
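A minimal sketch of a fixed playout delay schedule (the 150 ms delay and the function and parameter names are illustrative assumptions, not from any particular VoIP stack):

PLAYOUT_DELAY = 0.150   # assumed fixed playout delay of 150 ms

def playout_time(chunk_timestamp_s, first_timestamp_s, first_arrival_s):
    """Play every chunk a fixed delay after its nominal generation time."""
    offset = chunk_timestamp_s - first_timestamp_s   # when the chunk was generated, relative to the first
    return first_arrival_s + PLAYOUT_DELAY + offset

# A chunk that arrives after its scheduled playout time is simply treated as lost.
print(playout_time(chunk_timestamp_s=0.40, first_timestamp_s=0.0, first_arrival_s=10.02))  # ~10.57 s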

22
Problem: Packet loss
 Loss in a broader sense: a packet never arrives, or arrives later than its scheduled playout time
 Since retransmission is inappropriate for real-time applications,
 FEC (Forward Error Correction) or interleaving is used to reduce the impact of loss.

23
Recovering from packet loss
Forward Error Correction
 Send a redundant encoded chunk after every n chunks (the XOR of the n original chunks)
 If 1 packet in this group is lost, it can be reconstructed (see the sketch below)
 If >1 packets are lost, recovery is not possible
 Disadvantages
 The smaller the group size, the larger the overhead
 Playout delay is increased

24
Recovering from packet loss
Receiver-based Repair
 The simplest form: packet repetition
 Replaces lost packets with copies of the packet that arrived immediately before the loss
 A more computationally intensive form: interpolation
 Uses the audio before and after the loss to interpolate a suitable packet to cover the loss
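A sketch of both repairs, with audio chunks represented as small numpy arrays (an assumed representation chosen only for illustration):

import numpy as np

def repair(prev_chunk, next_chunk, method="repeat"):
    """Fill in a lost chunk from its neighbours: repetition or simple interpolation."""
    if method == "repeat":
        return prev_chunk.copy()   # replay the last chunk that arrived before the loss
    # interpolation: average the chunks on either side of the gap
    return ((prev_chunk.astype(int) + next_chunk.astype(int)) // 2).astype(prev_chunk.dtype)

prev_chunk = np.array([10, 12, 14], dtype=np.int16)
next_chunk = np.array([20, 22, 24], dtype=np.int16)
print(repair(prev_chunk, next_chunk))                   # [10 12 14]
print(repair(prev_chunk, next_chunk, "interpolate"))    # [15 17 19]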
25
Real-time Transport Protocol (RTP)
 RTP logically extends UDP
 Sits between UDP and application
 Implemented as an application library
 What does it do?
 Framing
 Multiplexing
 Synchronization
 Feedback (RTCP i.e. with control)

26
RTP packet format
 Payload Type: 7 bits, providing 128 possible encoding types; e.g. PCM, MPEG-2 video, etc.
 Sequence Number: 16 bits; used to detect packet loss

27
RTP packet format (cont)
 Timestamp: 32 bits; gives the sampling instant of the first audio/video byte in the packet; used to remove jitter introduced by the network
 Synchronization Source identifier (SSRC): 32 bits; an ID for the source of a stream; assigned randomly by the source
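A minimal sketch of packing these fixed-header fields (version, payload type, sequence number, timestamp, SSRC) with Python's struct module; the field values below are illustrative:

import struct

def rtp_header(payload_type, seq, timestamp, ssrc, marker=0):
    byte0 = 2 << 6                                  # version 2, no padding, no extension, CSRC count 0
    byte1 = (marker << 7) | (payload_type & 0x7F)   # marker bit + 7-bit payload type
    # network byte order: two single bytes, 16-bit sequence number, 32-bit timestamp, 32-bit SSRC
    return struct.pack('!BBHII', byte0, byte1, seq & 0xFFFF, timestamp, ssrc)

header = rtp_header(payload_type=0, seq=42, timestamp=160, ssrc=0x1234ABCD)   # PT 0 = PCM mu-law
print(len(header))   # 12-byte fixed RTP header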

28
Audio silence example
 Consider the audio data type
 What do you want to send during silence?
 Sending nothing is a problem:
 the other side needs to distinguish between loss and silence
 The receiver uses timestamps and sequence numbers to figure out what happened (see the sketch below)
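A sketch of that receiver-side check (the helper name and the 160-samples-per-packet figure are assumptions): consecutive sequence numbers with a jump in the timestamp mean a silence period was suppressed, while a gap in the sequence numbers means packets were lost.

def classify_gap(prev_seq, prev_ts, seq, ts, samples_per_packet=160):
    if seq == prev_seq + 1 and ts > prev_ts + samples_per_packet:
        return "silence"   # nothing was sent during the gap, but nothing was lost
    if seq > prev_seq + 1:
        return "loss"      # one or more packets never arrived
    return "normal"

print(classify_gap(prev_seq=10, prev_ts=1600, seq=11, ts=4800))   # silence
print(classify_gap(prev_seq=10, prev_ts=1600, seq=13, ts=2080))   # loss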

29
RTP Control Protocol (RTCP)
 Used in conjunction with RTP; used to exchange control information between the sender and the receiver.
 Three reports are defined: receiver reception, sender, and source description
 Reports contain statistics such as the number of packets sent, the number of packets lost, and the inter-arrival jitter
 Typically, RTCP bandwidth is limited to 5% of the session bandwidth; approximately one sender report for every three receiver reports
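A rough sketch of how the 5% budget and the 3:1 receiver/sender split turn into a report interval (the session bandwidth, average RTCP packet size, and receiver count are assumptions; the formula is a simplification of the RFC 3550 rules):

session_bw = 1_000_000               # assumed 1 Mbps media session
rtcp_bw = 0.05 * session_bw          # 5% of the session bandwidth reserved for RTCP
receiver_share = 0.75 * rtcp_bw      # receivers share 75%, the sender keeps 25%

avg_rtcp_bits = 8 * 120              # assumed average RTCP report size (120 bytes)
receivers = 30
interval = receivers * avg_rtcp_bits / receiver_share   # seconds between reports per receiver
print(round(interval, 2))            # ~0.77 s with these assumed numbers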
30
Streaming Stored Multimedia
Example
 The audio/video file is segmented and sent over either TCP or UDP; public segmentation protocol: the Real-time Transport Protocol (RTP)
 User interactive control is provided, e.g. by the public Real Time Streaming Protocol (RTSP)

31
Streaming Stored Multimedia
Example
 Helper application: displays content, which is typically requested via a Web browser; e.g. RealPlayer; typical functions:
 Decompression
 Jitter removal
 Error correction: uses redundant packets to reconstruct the original stream
 GUI for user control
 etc.

32
Streaming from Web Servers
 Audio: sent in files as HTTP objects
 Video (interleaved audio and images in one file, or two separate files that the client synchronizes for display) sent as HTTP object(s)
 A simple architecture is to have the browser request the object(s) and, after their reception, pass them to the player for display

33
Streaming from a Web Server …
 Alternative: set up a connection between the server and the player, then download
 The Web browser requests and receives a meta file (a file describing the object) instead of the file itself
 The browser launches the appropriate player and passes it the meta file
 The player sets up a TCP connection with a streaming server and downloads the file

34
Using a Streaming Server

35
36
