Title: Video and Streaming Media
1. Video and Streaming Media
Andy Dozier
2. Approach
- Video Standards
- Analog Video
- Digital Video
- Video Quality Parameters
- Frame Rate
- Color Depth
- Resolution
- Encoding/Decoding Standards
3. Video Standard Summary
- Analog Video
- Composite
- Component
- Digital Video
4. Composite Video Overview
- Optimized for wireless broadcast operation
- Frequency allocations are controlled by the FCC
- 54 MHz to 806 MHz (68 Channels)
- Allocate 6 MHz/Channel
- Utilizes a single communication channel
- Coaxial cable transmission
- Terrestrial broadcast
- Lowest resolution
5. Composite Video Overview (cont'd)
- Defined by the National Television Systems Committee (NTSC)
- Interface standard (System M-NTSC) documented in ANSI T1.502-1988
- M-NTSC features:
- Color or monochrome
- 30 frames/second
- 525 horizontal scan lines (483 usable)
6. Interlacing
- A refresh rate of 30 frames/second exhibits flicker
- One frame is a complete image at a point in time
- The solution is to divide each frame into two fields
- One field consists of either all odd, or all even, scan lines
- Odd and even scan lines are interlaced
- 262.5 horizontal scan lines/field
- Each field is refreshed at a rate of 30/second
- 60 fields/second total
- Phosphor persistence allows the eye to perceive both fields at the same time
- Eliminates the flicker problem
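The interlacing arithmetic above can be checked with a short sketch (a minimal illustration, not any standard API):

```python
# Interlaced scanning arithmetic for the NTSC (System M) raster.
total_scan_lines = 525     # lines per complete frame
frames_per_second = 30     # complete frames per second
fields_per_frame = 2       # odd-line field + even-line field

lines_per_field = total_scan_lines / fields_per_frame
fields_per_second = frames_per_second * fields_per_frame

print(lines_per_field)    # 262.5 scan lines per field
print(fields_per_second)  # 60 fields per second
```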
7. Composite Video Resolution
- Horizontal/vertical dimension ratio is 4:3
- Usable horizontal scan lines: 483
- To render a horizontal line consistently in an image, the image line must cover more than one scan line
- The number of resolvable horizontal image lines is about 70% of the number of horizontal scan lines
- Vertical resolution is 0.7 X 483, or 338 horizontal line/space pairs
- The horizontal resolution must match: 4/3 X 338, or 450 vertical line/space pairs
- Composite video resolution is therefore equivalent to 450 X 338 pixels
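The resolution figures above follow from simple arithmetic; the ~70% figure is known as the Kell factor. A quick sketch:

```python
# Equivalent pixel resolution of NTSC composite video.
usable_scan_lines = 483
kell_factor = 0.7        # fraction of scan lines resolvable as image lines
aspect_ratio = 4 / 3     # horizontal/vertical dimension ratio

# Truncate to whole line/space pairs, matching the slide's figures.
vertical_resolution = int(kell_factor * usable_scan_lines)
horizontal_resolution = int(aspect_ratio * vertical_resolution)

print(f"{horizontal_resolution} x {vertical_resolution}")  # 450 x 338
```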
8. Composite Video
Features:
- Single wire or channel
- NTSC standard
- Suitable for broadcasting
- Lowest resolution: equivalent to 450 X 338 pixels
9. Color Theory
- Color theory is based on the psychophysical properties of human color vision
- First stated by Hermann Grassmann of Germany in 1854
- Any color can be matched by an additive combination of different amounts of three additive primary colors
- Additive primary colors are different from subtractive primary colors
- The additive primaries are Red/Green/Blue (RGB)
- In video, phosphors emit light; therefore we use additive primaries
10. Definitions
- The intrinsic nature of a color is called Hue (U)
- The intensity of a color is called Saturation (V)
- Hue and saturation taken together define color, or Chrominance (C): Hue + Saturation = Chrominance
- Brightness is described as luminous flux, or Luminance (Y)
- C and Y together totally describe the color sensation
11. Color Spatial Resolution
- For most images, the fine detail picked up by the human eye is conveyed by changes in Luminance
- The eye cannot pick up the color of small objects
- This implies that for very small areas in a scene, the human eye is much more sensitive to changes in Luminance, or brightness, of the scene
- For large areas, the eye responds mostly to colors
12. Analog Component Video
- The NTSC committee desired to design a color TV signal system that was compatible with the black and white (monochrome) system
- Split the signal into components:
- Luminance (Y)
- Chrominance (C)
- This signal system accounts for the variation in sensitivity of the eye to different colors:

Y = 0.30R + 0.59G + 0.11B
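The luminance equation weights each primary by the eye's sensitivity to it; a minimal sketch:

```python
def luminance(r, g, b):
    """NTSC luminance from RGB components (each in the range 0..1)."""
    return 0.30 * r + 0.59 * g + 0.11 * b

# Pure white (equal R, G, B) yields full luminance...
print(luminance(1.0, 1.0, 1.0))  # ~1.0
# ...while pure green contributes far more luminance than pure blue,
# reflecting the eye's greater sensitivity to green.
print(luminance(0.0, 1.0, 0.0))  # 0.59
print(luminance(0.0, 0.0, 1.0))  # 0.11
```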
13. Analog Component Video (cont'd)
- A variety of signal systems are used to provide color displays
- Composite signal systems embed the Chrominance information into the transmitted signal
- Systems which separate the Y, C, U, and V information are referred to as Component Video systems
- Digital and analog versions exist
- Component video provides higher fidelity
14. Analog Component Video: YUV
Features:
- Separates Y, U, and V
- Current color TV system
- Combines YUV for transmission
- Used for color TV receivers
15. Analog Component Video: Y/C
Features:
- Separates Y and C
- Intermediate quality
- 2-wire system, called S-Video
- Used for hand-held cameras (Hi-8, Super VHS)
16. Analog Component Video: RGB
Features:
- Separates the R, G, and B signals
- Easily transformed into other signal systems (Y/C, YUV)
- Used for color monitors
17. Digital Video
- Major disadvantages of analog techniques are:
- Susceptibility to electromagnetic noise
- Quality degrades with multiple generations of copies
- Digital video techniques represent component signals as streams of 1s and 0s
- Eliminates degradation over multiple copy generations
- Excellent noise immunity
- Can be stored on hard disk drives, DVD, and CD-ROM
- Can be transported via data networks
18. Digital Video Features
- Generated by digitizing analog video signals
- Composite Digital - D2 Standard
- Component Digital - D1 Standard
- Image quality is defined by three parameters
- Frame Resolution and Scaling
- Color Depth
- Frame Rate
19. Frame Resolution and Scaling
- Each frame (image) is represented by an array of pixels
- If the pixel array is equal to the monitor resolution, the image fills the monitor screen
- Example: 640 X 480 pixels
- Partial screen images may be displayed (scaled)
- Using a full screen resolution of 640 X 480 pixels:
- 320 X 240 pixels would fill 1/4 of the screen
- 160 X 120 pixels would fill 1/16 of the screen
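The screen fractions above come from the ratio of pixel counts; a quick sketch:

```python
# Fraction of a 640 x 480 screen covered by a scaled-down image.
FULL_W, FULL_H = 640, 480

def screen_fraction(width, height):
    return (width * height) / (FULL_W * FULL_H)

print(screen_fraction(640, 480))  # 1.0    (full screen)
print(screen_fraction(320, 240))  # 0.25   (1/4 screen)
print(screen_fraction(160, 120))  # 0.0625 (1/16 screen)
```

Halving each linear dimension quarters the area, which is why 320 X 240 covers 1/4 of the screen, not 1/2.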
20. Scaling of Image Size
(Figure: full screen, 1/4 screen, and 1/16 screen image sizes)
21. Color Depth
- Color depth is defined by the number of bits used to represent the color of each pixel
- This determines the maximum number of colors that can be represented, and therefore the realism of the image
- As an example: Red 8 bits/pixel, Green 8 bits/pixel, Blue 8 bits/pixel
- Using 24 bits/pixel allows representation of 16.7 million colors
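The 16.7 million figure is simply 2 raised to the number of bits per pixel:

```python
# Number of distinct colors representable at a given color depth.
def color_count(bits_per_pixel):
    return 2 ** bits_per_pixel

print(color_count(8))   # 256 levels per 8-bit channel
print(color_count(24))  # 16777216 -- about 16.7 million colors
```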
22. Frame Rate
- The number of times/second an image is refreshed controls image quality:
- Flicker
- Jerkiness of motion
- Some encoding systems allow adjustment of the frame rate to stay within the bandwidth allocated by the network
- Basic Rate ISDN allows a maximum of 128 kbps
- Most high quality video conferencing systems use at least 384 kbps
23. Digital Video Bandwidth Requirements
- Frame rate: 60 frames/second
- Color depth: 24 bits/pixel
- Frame size: 640 X 480 pixels
- This example would require 442.37 Mbps to transmit uncompressed video in real time
- We have to consider compression techniques to transmit video affordably
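The 442.37 Mbps figure is the product of the three parameters above:

```python
# Uncompressed video bandwidth = width x height x color depth x frame rate.
def bandwidth_mbps(width, height, bits_per_pixel, frames_per_second):
    return width * height * bits_per_pixel * frames_per_second / 1e6

print(bandwidth_mbps(640, 480, 24, 60))  # 442.368 Mbps
```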
24. Digital Video Bandwidth Requirements (cont'd)
- Uncompressed D-1 video requires 270 Mbps
- This implies that it is still impractical to transport an uncompressed D-1 signal over the wide area
- Bandwidth is too expensive
- It is also difficult to transport over the local area
- Requires Gigabit Ethernet
25. Video Stream Bandwidth
26. Intraframe Compression
- The eye is not as sensitive to small-scale changes in color as it is to small-scale changes in intensity
- This implies that a video imaging system can throw away some of the color information in each frame and still appear realistic to the human eye
- Color sampling can be easily changed (sub-sampling)
- If this is done consistently for each frame, the technique is referred to as Intraframe compression
27. Intraframe Compression: Color Subsampling
- The previous example would require 221 Mbps at 4:1:1 chroma subsampling
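The 221 Mbps figure follows from the average bits per pixel under 4:1:1 subsampling, where luminance is kept at every pixel but each chroma channel is sampled once per four pixels:

```python
# Average bits/pixel under 4:1:1 chroma subsampling:
# luma (Y) sampled at every pixel (8 bits); each chroma channel
# (U and V, 8 bits each) sampled once per 4 pixels.
y_bits = 8
chroma_bits = (8 + 8) / 4             # U and V shared across 4 pixels
bits_per_pixel = y_bits + chroma_bits  # 12 bits/pixel average

mbps = 640 * 480 * bits_per_pixel * 60 / 1e6
print(mbps)  # 221.184 -- half the 442 Mbps of the uncompressed example
```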
28. Alternative Intraframe Compression Techniques
- The key to successful intraframe techniques is that each frame be preserved at the highest resolution possible
- Allows editing on a frame-by-frame basis
- The approach is to throw away information that cannot be perceived by the human eye by adjusting parameters
29. Alternative Intraframe Compression (cont'd)
30. JPEG
- The Joint Photographic Experts Group (JPEG) developed a compression standard for 24-bit True Color photographic images
- Single frame encoding technology
- This technique utilizes Intraframe compression
- Subsampling of Chroma information
- The algorithm quantizes 8 X 8 blocks of pixels
- Achieves an image compression ratio of 2:1 to 30:1 over uncompressed images
- One image equals one video frame
31. Motion JPEG
- Utilizes JPEG encoding for each frame
- 30 frames/sec
- Variable compression ratios (2:1 to 30:1)
- This allows editing on a frame-by-frame basis
- Industry standard for high definition storage and retrieval
- One drawback is that the MJPEG standard does not encode audio
- A proprietary solution is required
- One hour of broadcast video utilizing a 6:1 compression ratio requires 13 GBytes
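The 13 GByte figure can be roughly reproduced if we assume the broadcast source is D-1 active picture (720 X 486 pixels, 16 bits/pixel after 4:2:2 chroma subsampling, 30 frames/s); the slide does not state the source format, so this is an assumption:

```python
# One hour of broadcast video at 6:1 compression.
# ASSUMED source: D-1 active picture, 720 x 486 pixels,
# 16 bits/pixel (4:2:2 chroma), 30 frames/second.
uncompressed_bps = 720 * 486 * 16 * 30   # ~168 Mbit/s
compressed_bps = uncompressed_bps / 6    # 6:1 compression ratio
gbytes_per_hour = compressed_bps * 3600 / 8 / 1e9

print(round(gbytes_per_hour, 1))  # 12.6 -- roughly the 13 GBytes quoted
```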
32. Interframe Compression
- Significant compression must be achieved to transport and handle video streams via wide area networks (WANs)
- Achieved by Interframe compression
- Adjustment of image parameters
- Data compression achieved by dropping information between frames
- The most common interframe compression technique available today is MPEG
33. MPEG Compression
- In order to achieve significant compression ratios, predictive techniques are required
- These techniques encode one complete frame periodically, and predict the changes between these key frames
- MPEG encodes a complete frame every 16th frame
- Example: a "talking head," where only the lips and head of the speaker are moving
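The key-frame idea can be illustrated with a toy sketch. This is not the actual MPEG algorithm (which also uses motion compensation and bidirectional prediction); it only shows the principle of sending a complete frame periodically and per-pixel differences in between:

```python
# Toy interframe compression: emit a full key ("I") frame every N frames,
# and only the changed pixels (deltas) for the frames in between.
KEY_INTERVAL = 16  # the slides state MPEG encodes a complete frame every 16th frame

def encode(frames):
    """Yield ('I', frame) for key frames, ('P', deltas) otherwise."""
    previous = None
    for i, frame in enumerate(frames):
        if i % KEY_INTERVAL == 0:
            yield ('I', frame)
        else:
            # Only pixels that differ from the previous frame are sent.
            deltas = {j: p for j, (p, q) in enumerate(zip(frame, previous)) if p != q}
            yield ('P', deltas)
        previous = frame

# A "talking head": only pixel 2 changes from frame to frame.
frames = [[0, 0, 0], [0, 0, 1], [0, 0, 2]]
kinds = [kind for kind, _ in encode(frames)]
print(kinds)  # ['I', 'P', 'P'] -- one complete frame, then deltas only
```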
34. MPEG Encoding Scheme
35. MPEG Disadvantages
- Since complete information is only available every sixteen frames (every ½ second), video editing is more difficult
- Sound may need to be correlated to the frame of choice
36. Encoding Techniques
- Encoders are now available at reasonable prices that bring the compression ratios into an affordable range (< 1.5 Mbit/s)
- Two types of encoders are available:
- Symmetric
- Asymmetric
- Symmetric encoders can encode in real time
- Used for video streaming applications
- Asymmetric encoders cannot encode in real time
- Used for CD and DVD applications
37. Encoder/Decoder (Codec) Types
38. Streaming Video
- Originally, video was played via the "Download and Play" method
- For long video clips, it is more desirable to start playing before the entire file has downloaded
- Streaming video
- Requires isochronous playback
- This is achieved by buffering
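The buffering idea can be illustrated with a toy simulation (all the numbers are hypothetical): playback begins only after a pre-roll buffer has filled, so short dips in network arrival rate do not interrupt the steady isochronous playback rate.

```python
# Toy stream buffer: fill during pre-roll, then drain at a constant
# (isochronous) playback rate while the network keeps refilling it.
PLAYBACK_RATE = 10   # units of video consumed per tick (hypothetical)
PREROLL = 30         # buffer level required before playback starts

buffer_level, playing, underruns = 0, False, 0
network_rates = [15, 15, 15, 12, 8, 6, 12, 15, 15, 15]  # varying arrival rate

for arrived in network_rates:
    buffer_level += arrived
    if not playing and buffer_level >= PREROLL:
        playing = True  # pre-roll complete; isochronous playback begins
    if playing:
        if buffer_level >= PLAYBACK_RATE:
            buffer_level -= PLAYBACK_RATE
        else:
            underruns += 1  # buffer ran dry; playback would stall

print(underruns)  # 0 -- the pre-roll absorbed the slow network period
```

Without the pre-roll, the ticks where only 6-8 units arrive would stall a player consuming 10 units per tick; the buffered surplus carries playback through them.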
39. Download and Play
40. Isochronous Playback
41. Video Streaming
42. Video Editing and Authoring
- In order to create useful applications, it is necessary to capture multiple streams and combine them into one
- Multiple rates may also be required for different users
- After the streams are captured, an Editing and Authoring process is required
43. Video Editing Process