Figure 1: System Video Flow for Analog/Digital Sources and Displays.
In the reverse flow, a compressed stream is retrieved from a network or from mass storage. It is then decompressed via a software decode operation and either sent directly to a digital output display (such as a TFT-LCD panel) or first converted to analog form by a video encoder for display on a conventional CRT.
Keep in mind that compression/decompression represent only a subset of possible video processing algorithms that might run on the media processor. Still, for our purposes, they set a convenient template for discussion. Let's examine in more detail the video-specific portions of these data flows.
Analog Video Sources
Embedded processor cores cannot deal with analog video directly. Instead, the video must be digitized first, via a video decoder. This device converts an analog video signal (e.g., NTSC, PAL, CVBS, S-Video) into a digital form (usually of the ITU-R BT.601/656 YCbCr or RGB variety). This is a complex, multi-stage process. It involves extracting timing information from the input, separating luma from chroma, separating chroma into Cr and Cb components, sampling the output data, and arranging it into the appropriate format. A serial interface such as SPI or I2C configures the decoder's operating parameters. Figure 2 shows a block diagram of a representative video decoder.
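To make the BT.656 format more concrete, here is a minimal sketch in C of how software might locate the timing reference codes that BT.656 embeds in the data stream: a 0xFF 0x00 0x00 preamble followed by an "XY" status byte carrying the field (F), vertical blanking (V), and SAV/EAV (H) flags. It is illustrative only; in a real system this framing is usually handled by the processor's video port hardware rather than in software.

/* Sketch: find the next ITU-R BT.656 timing reference code in a buffer. */
#include <stdint.h>
#include <stddef.h>

typedef struct {
    int field;      /* F bit: field 1 or field 2           */
    int v_blank;    /* V bit: set during vertical blanking */
    int is_eav;     /* H bit: 1 = EAV, 0 = SAV             */
} bt656_code;

/* Returns the offset of the XY status byte, or -1 if no code is found. */
static ptrdiff_t find_bt656_code(const uint8_t *buf, size_t len, bt656_code *out)
{
    for (size_t i = 0; i + 3 < len; i++) {
        if (buf[i] == 0xFF && buf[i + 1] == 0x00 && buf[i + 2] == 0x00) {
            uint8_t xy = buf[i + 3];
            out->field   = (xy >> 6) & 1;
            out->v_blank = (xy >> 5) & 1;
            out->is_eav  = (xy >> 4) & 1;
            return (ptrdiff_t)(i + 3);
        }
    }
    return -1;
}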
Digital Video Sources
Camera sources today are overwhelmingly based on either Charge-Coupled Device (CCD) or CMOS technology. Both of these technologies convert light into electrical signals, but they differ in how this conversion occurs.
CMOS sensors ordinarily output a parallel digital stream of pixel components in either YCbCr or RGB format, along with horizontal and vertical synchronization and a pixel clock. Sometimes, they allow for an external clock and sync signals to control the transfer of image frames out from the sensor.
CCDs, on the other hand, usually hook up to an "Analog Front End" (AFE) chip, such as the AD9948, that processes the analog output signal from the CCD array, digitizes it, and generates appropriate timing to scan the CCD array. A processor supplies synchronization signals to the AFE, which needs this control to manage the CCD array. The digitized parallel output stream from the AFE might be in 10-bit, or even 12-bit, resolution per pixel component.
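As a small illustration of handling this higher-resolution data, the following C sketch assumes right-justified 12-bit samples from the AFE and reduces them to 8 bits per component, as a simple pipeline might do before BT.601-style processing. The buffer layout is an assumption for illustration, not tied to the AD9948 or any other particular device.

/* Sketch: reduce 12-bit AFE samples (0..4095) to 8-bit components (0..255). */
#include <stdint.h>
#include <stddef.h>

static void afe12_to_8(const uint16_t *in, uint8_t *out, size_t count)
{
    for (size_t i = 0; i < count; i++)
        out[i] = (uint8_t)(in[i] >> 4);   /* keep the 8 most significant bits */
}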
For a more detailed discussion of the tradeoffs between CMOS and CCD sensors, as well as an overview of a typical image processing pipeline, please refer to the article "CCD and CMOS image sensor processing pipeline."
Analog Video Displays
Video Encoder
A video encoder converts a digital video stream into an analog video signal. It typically accepts a YCbCr or RGB video stream in either ITU-R BT.656 or BT.601 format and converts it to a signal compliant with one of several different output standards (e.g., NTSC, PAL, SECAM). A host processor controls the encoder via a 2- or 3-wire serial interface like SPI or I2C, programming such settings as pixel timing, input/output formats, and luma/chroma filtering. Figure 3 shows a block diagram of a representative encoder.
Figure 3: Block diagram of ADV7179 video encoder.
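The following is a hedged sketch of what encoder configuration over I2C might look like from the host's point of view. The i2c_write_reg() helper, the device address, and the register addresses and values are hypothetical placeholders, not taken from the ADV7179 datasheet; a real driver would use the documented register map.

/* Sketch: host-side configuration of a video encoder over I2C. */
#include <stdint.h>

/* Platform-supplied primitive (assumed): write one byte to an 8-bit register. */
extern int i2c_write_reg(uint8_t dev_addr, uint8_t reg, uint8_t value);

#define ENC_I2C_ADDR  0x2A   /* hypothetical 7-bit device address */

static int video_encoder_init(void)
{
    int err = 0;
    err |= i2c_write_reg(ENC_I2C_ADDR, 0x00, 0x01); /* e.g., select NTSC output       */
    err |= i2c_write_reg(ENC_I2C_ADDR, 0x01, 0x10); /* e.g., select BT.656 input mode */
    err |= i2c_write_reg(ENC_I2C_ADDR, 0x02, 0x04); /* e.g., enable luma/chroma filters */
    return err;
}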
Video encoders commonly output in one or more of the following analog formats:
CVBS – This acronym stands for Composite Video Baseband Signal (or Composite Video Blanking and Syncs). Composite video connects through the ubiquitous yellow RCA jack shown in Figure 4a. It contains Luma, Chroma, Sync and Color Burst information all on the same wire.
S-Video – Using the jack shown in Figure 4b, this format sends the luma and chroma content separately. Separating the brightness information from the color difference signals dramatically improves image quality, which accounts for the popularity of S-Video connections on today's home theater equipment.
Component Video – Also known as YPbPr, this is the analog version of YCbCr digital video. Here, the luma and each chroma channel are output separately, each with its own timing. This offers maximum image quality for analog transmission. Component connections are very popular on higher-end home theatre system components like DVD players and A/V receivers (Figure 4c).
Analog RGB – This format carries separate channels for the Red, Green, and Blue signals. Although it offers image quality similar to that of Component Video, it is generally used in the computer graphics realm, whereas Component Video is primarily employed in the consumer electronics arena. RGB connectors are usually of the BNC variety, shown in Figure 4d. (A brief sketch relating YCbCr to RGB follows Figure 4.)
Figure 4: Common Analog Video Connectors.
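Because component video (YPbPr) and analog RGB are both generated from the same digital YCbCr data, it can help to see the color-space relationship explicitly. Below is a minimal C sketch of the commonly used ITU-R BT.601 conversion from studio-range YCbCr to 8-bit RGB, using rounded fixed-point coefficients; it is meant as an illustration, not as any particular encoder's implementation.

/* Sketch: ITU-R BT.601 YCbCr (studio range) to 8-bit RGB conversion. */
#include <stdint.h>

static uint8_t clamp8(int v)
{
    return (uint8_t)(v < 0 ? 0 : (v > 255 ? 255 : v));
}

static void ycbcr_to_rgb(uint8_t y, uint8_t cb, uint8_t cr,
                         uint8_t *r, uint8_t *g, uint8_t *b)
{
    int c = y - 16, d = cb - 128, e = cr - 128;

    *r = clamp8((298 * c + 409 * e + 128) >> 8);             /* ~1.164*Y' + 1.596*Cr'              */
    *g = clamp8((298 * c - 100 * d - 208 * e + 128) >> 8);   /* ~1.164*Y' - 0.392*Cb' - 0.813*Cr'  */
    *b = clamp8((298 * c + 516 * d + 128) >> 8);             /* ~1.164*Y' + 2.017*Cb'              */
}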
Cathode Ray Tubes (CRTs)
On the display side, RGB is the most popular interface to computer monitors and LCD panels. Most older computer monitors accept analog RGB inputs on three separate pins from the PC video card and modulate three separate electron-gun beams to generate the image. Depending on which beam(s) excite a given phosphor point on the screen, that point glows red, green, blue, or some combination of these colors. This is different from analog television, where a composite signal (one that includes all color information superimposed on a single input) modulates a single electron beam. Newer computer monitors use DVI, or Digital Visual Interface, to accept RGB information in both digital and analog formats.
The main advantages of CRTs are that they are very inexpensive and can produce more colors than a comparably sized LCD panel. Also, unlike LCDs, they can be viewed from any angle. On the downside, CRTs are very bulky, emit considerable electromagnetic radiation, and can cause eyestrain due to their refresh-induced flicker.
Liquid Crystal Display (LCD) Panels
There are two main categories of LCD technology: passive matrix and active matrix. In the former (whose common family members include STN, or "Super Twisted Nematic," derivatives), a glass substrate imprinted with rows forms a "liquid crystal sandwich" with a substrate imprinted with columns. Pixels are constructed as row-column intersections. Therefore, to activate a given pixel, a timing circuit energizes the pixel's column while grounding its row. The resultant voltage differential untwists the liquid crystal at that pixel location, which causes it to become opaque and block light from coming through.
Straightforward as it is, passive matrix technology does have some shortcomings. For one, screen refresh times are relatively slow (which can result in "ghosting" for fast-moving images). Also, there is a tendency for the voltage at a row-column intersection to "bleed" over into neighboring pixels, partly untwisting the liquid crystals and blocking some light from passing through the surrounding pixel area. To the observer, this blurs the image and reduces contrast. Moreover, the viewing angle is relatively narrow.
Active matrix LCD technology greatly improves upon passive technology in these respects. Basically, each pixel consists of a capacitor and transistor switch. This arrangement gives rise to the more popular term, "Thin-Film Transistor (TFT) Display." To address a particular pixel, its row is enabled, and then a voltage is applied to its column. This has the effect of isolating only the pixel of interest, so others in the vicinity don't turn on. Also, since the current to control a given pixel is reduced, pixels can be switched at a faster rate, which leads to faster refresh rates for TFTs compared to passive displays. What's more, modulating the voltage level applied to the pixel allows many discrete levels of brightness. Today, it is common to have 256 levels, corresponding to 8 bits of intensity.
Connecting to a TFT-LCD panel can be a confusing endeavor due to all of the different components involved. First, there's the panel itself, which houses an array of pixels arranged for strobing by row and column, referenced to the pixel clock frequency. Then there's the backlight, often a CCFL (Cold Cathode Fluorescent Lamp), which excites gas molecules to emit bright light while generating very little heat. Other reasons for the suitability of CCFLs to LCD panel applications are their durability, long life, and straightforward drive requirements. LEDs are also a popular backlight method, mainly for small- to mid-sized panels. They have the advantages of low cost, low operating voltage, long life, and good intensity control. However, for larger panel sizes, LED backlights can draw considerably more power than CCFL solutions.
An LCD controller contains most of the circuitry needed to convert an input video signal into the proper format for display on the LCD panel. It usually includes a timing generator that controls the synchronization and pixel clock timing of the individual pixels on the panel. However, in order to meet the LCD panel size and cost requirements, sometimes timing generation circuitry needs to be supplied externally in a "Timing Generator" or "Timing ASIC" chip. In addition to the standard synchronization and data lines, timing signals are needed to drive the individual rows and columns of the LCD panel. Sometimes, spare general-purpose PWM (pulse-width modulation) timers on a media processor can substitute for this separate chip, saving system cost.
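As a rough illustration of what that timing generation involves, the sketch below computes the pixel clock for a hypothetical QVGA panel; the geometry and porch values are made-up placeholders, and real figures always come from the panel datasheet. A PWM channel standing in for the timing chip would then be programmed with a period of h_total pixel clocks for the line (HSYNC) signal and h_total * v_total pixel clocks for the frame (VSYNC) signal.

/* Sketch: deriving LCD timing from assumed (hypothetical) panel parameters. */
#include <stdint.h>

typedef struct {
    uint32_t h_active, h_front_porch, h_sync, h_back_porch;  /* in pixels */
    uint32_t v_active, v_front_porch, v_sync, v_back_porch;  /* in lines  */
    uint32_t refresh_hz;
} panel_timing;

/* Illustrative QVGA numbers only; not taken from any real panel. */
static const panel_timing demo_panel = {
    .h_active = 320, .h_front_porch = 20, .h_sync = 30, .h_back_porch = 38,
    .v_active = 240, .v_front_porch = 4,  .v_sync = 3,  .v_back_porch = 15,
    .refresh_hz = 60,
};

/* Pixel clock = total pixels per line * total lines per frame * refresh rate.
 * For demo_panel: 408 * 262 * 60 ~= 6.41 MHz. */
static uint32_t pixel_clock_hz(const panel_timing *t)
{
    uint32_t h_total = t->h_active + t->h_front_porch + t->h_sync + t->h_back_porch;
    uint32_t v_total = t->v_active + t->v_front_porch + t->v_sync + t->v_back_porch;
    return h_total * v_total * t->refresh_hz;
}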
Additional features of LCD controller chips include on-screen display support, graphics overlay blending, color lookup tables, dithering, and image rotation. The more elaborate chips can be very expensive, often surpassing the cost of the processor to which they're connected.
An LCD driver chip is necessary to generate the proper voltage levels for the LCD panel. It serves as the "translator" between the output of the LCD controller and the panel itself. The rows and columns are usually driven separately, with timing controlled by the timing generator. Liquid crystal must be driven with periodic polarity inversions, because a DC bias stresses the crystal structure and ultimately degrades it. Therefore, the voltage polarity applied to each pixel varies on a per-frame, per-line, or per-pixel basis, depending on the implementation.
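The inversion schemes themselves are simple to describe. The conceptual C sketch below shows how the drive polarity for a pixel could be chosen under frame, line, or dot (per-pixel) inversion; in practice this logic lives inside the driver hardware, not in software.

/* Conceptual sketch only: illustrates the three common inversion schemes. */
enum inversion_mode { FRAME_INVERSION, LINE_INVERSION, DOT_INVERSION };

static int drive_polarity(enum inversion_mode mode, unsigned frame,
                          unsigned line, unsigned pixel)
{
    switch (mode) {
    case FRAME_INVERSION: return frame & 1;                   /* flips every frame */
    case LINE_INVERSION:  return (frame + line) & 1;          /* flips every line  */
    case DOT_INVERSION:   return (frame + line + pixel) & 1;  /* flips every pixel */
    }
    return 0;
}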
With the trend toward smaller, cheaper multimedia devices, there has been a push to integrate these various components into the LCD system. Today, there exist integrated TFT-LCD modules that include timing generation and drive circuitry, requiring only a data bus connection, clocking/synchronization lines, and power supplies. The electrical interface on an integrated TFT-LCD display module is straightforward. It typically consists of data lines, synchronization lines, power supply lines, and a clock. Some panels are also available with a composite analog video input, instead of parallel digital inputs.
OLED (Organic Light-Emitting Diode) Displays
The "Organic" in OLED refers to the material that's encased between two electrodes. When charge is applied through this organic substance, it emits light. This display technology is still very new, but it holds promise because it improves upon several deficiencies in LCD displays. For one, it's a self-emissive technology and does not require a backlight. This has huge implications for saving panel power, cost, and weight—an OLED panel can be extremely thin. Additionally, it can support a wider range of colors than comparable LCD panels can, and its display of moving images is also superior to that of LCDs. What's more, it supports a wide viewing angle and provides high contrast. OLEDs have an electrical signaling and data interface similar to that of TFT-LCD panels.
For all its advantages, so far the most restrictive aspect of the OLED display is its limited lifetime. The organic material breaks down after a few thousand hours of use, although this number has now improved in some displays to over 10,000 hours—quite suitable for many portable multimedia applications. It is here that OLEDs have their brightest future—in cellphones, digital cameras, and the like. It is also quite possible that in the near future we'll see televisions or computer monitors based on OLED technology. For the time being, though, as LCD panel technology keeps improving, the OLED mass production timeline keeps getting pushed out incrementally.
Now that we've covered the basics of connecting video streams within a system, it's time to take a look inside the processor, to see how it handles video efficiently. This is the subject of Part 4.
This series is adapted from the book "Embedded Media Processing" (Newnes 2005) by David Katz and Rick Gentile. See the book's web site for more information.