Interlace

From Wikipedia, the free encyclopedia

For the method of incrementally displaying raster graphics, see Interlace (bitmaps).
For the decorative motif used in ancient European and Celtic art, see Migration Period art and Celtic knot.

Interlace is a technique for improving the picture quality of a video signal, primarily on CRT devices, without consuming extra bandwidth. Interlacing causes problems on certain display devices such as LCDs.[1] The concept of breaking a single video frame into interlaced 'fields' was first demonstrated by Russian inventor Léon Theremin in 1927,[2] and the technology was first patented by RCA engineer Randall C. Ballard in 1932.[3][4] Commercial implementation began in 1934 as cathode ray tube screens became brighter, increasing the level of flicker caused by progressive (sequential) scanning.[5] Interlacing was ubiquitous in television until the 1970s, when the needs of computer monitors resulted in the reintroduction of progressive scan. Interlace is still used for most standard-definition TVs and for the 1080i HDTV broadcast standard, but not for LCD, micromirror (DLP), or plasma displays; these displays do not use a raster scan to create an image and so cannot benefit from interlacing: in practice, they have to be driven with a progressive-scan signal. The deinterlacing circuitry needed to derive a progressive signal from an ordinary interlaced broadcast can add to the cost of a television set using such displays. Currently, progressive displays dominate the HDTV market.

Interlaced scan refers to one of two common methods for "painting" a video image on an electronic display screen (the other being progressive scan) by scanning or displaying each line or row of pixels. The technique uses two fields to create a frame: one field contains all the odd-numbered lines in the image, the other contains all the even-numbered lines. A PAL-based television display, for example, scans 50 fields every second (25 odd and 25 even). The two sets of 25 fields work together to create a full frame every 1/25th of a second, resulting in a display rate of 25 frames per second.

Description

With progressive scan, an image is captured, transmitted and displayed in a path similar to text on a page: line by line, from top to bottom.

The interlaced scan pattern on a CRT (cathode ray tube) display also traces the image from the top left corner to the bottom right, but covers only every second line. The scan is then repeated, this time starting at the second row, to fill in the gaps left by the first pass over the alternate rows.

This scanning of every second line is called interlacing. A field is an image that contains only half of the lines needed to make a complete picture. The afterglow of the CRT's phosphor, combined with the persistence of vision, causes the two fields to be perceived as a single continuous image. This allows full horizontal detail to be viewed with half the bandwidth that a full progressive scan would require, while maintaining the refresh rate needed to prevent CRT flicker.
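
As a rough illustration of the field concept, the following Python sketch splits a frame, represented simply as a list of scanlines, into its two fields and weaves them back together. The function names and data layout are purely illustrative, not part of any broadcast API.

    # Split a frame into its two interlaced fields and weave them back.
    # A frame is modelled as a list of scanlines (each a list of pixels).

    def split_fields(frame):
        """Return (top_field, bottom_field): the even- and odd-indexed lines."""
        top_field = frame[0::2]     # lines 0, 2, 4, ... (the "upper" field)
        bottom_field = frame[1::2]  # lines 1, 3, 5, ...
        return top_field, bottom_field

    def weave_fields(top_field, bottom_field):
        """Interleave two fields back into a full progressive frame."""
        frame = []
        for top_line, bottom_line in zip(top_field, bottom_field):
            frame.append(top_line)
            frame.append(bottom_line)
        return frame

    # Example: a toy "frame" of 6 scanlines, each labelled by its row number.
    frame = [[row] * 4 for row in range(6)]
    top, bottom = split_fields(frame)
    assert weave_fields(top, bottom) == frame

In a real interlaced system the two fields are not captured from the same frame: they are captured one after the other, which is the source of the motion artifacts discussed below.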

Only CRTs can display interlaced video directly – other display technologies require some form of deinterlacing.

History

When motion picture film was developed, it was observed that the movie screen had to be illuminated at a high rate to prevent visible flicker. The exact rate necessary varies with brightness: 40 Hz is acceptable in dimly lit rooms, while up to 80 Hz may be necessary for bright displays that extend into peripheral vision. The film solution was to project each frame three times using a three-bladed shutter: a movie shot at 16 frames per second would thus illuminate the screen 48 times per second. Later, when sound film became available, the higher projection speed of 24 frames per second enabled a two-bladed shutter to maintain the same 48 illuminations per second, though only in projectors incapable of running at the lower speed.

But this solution could not be used for television: storing a full video frame and scanning it twice would require a frame buffer, a method that did not become feasible until the late 1980s. In addition, avoiding on-screen interference patterns caused by studio lighting and the limits of vacuum tube technology required that CRTs for TV be scanned at the AC line frequency (60 Hz in the US, 50 Hz in Europe). In 1936, when the analog standards were being set in the UK, CRTs could only scan at around 200 lines in 1/50th of a second. By using interlace, a pair of 202.5-line fields could be superimposed to become a sharper 405-line frame. The vertical scan frequency remained 50 Hz, so flicker was not a problem, and visible detail was noticeably improved. As a result, this system was able to supplant John Logie Baird's 240-line mechanical progressive scan system that was also in use at the time.

From the 1940s onward, improvements in technology allowed the US and Europe to adopt systems using progressively more bandwidth to scan higher line counts and achieve better pictures. The fundamentals of interlaced scanning remained at the heart of all of these systems. The US adopted the 525-line system known as NTSC, Europe adopted a 625-line system, and the UK switched from its 405-line system to 625 lines in order to avoid having to develop a unique method of colour TV. France switched from its unique 819-line system to the more common European standard of 625 lines. Although the term PAL is often used to describe the line and frame standard of the TV system, this is in fact incorrect: it refers only to the method of superimposing the colour information on the standard 625-line broadcast. France adopted its own SECAM system, which was also taken up by some other countries, notably the Soviet Union and its satellites. PAL colour encoding has also been used on otherwise NTSC-like 525-line broadcasts, notably in Brazil (PAL-M).

Application

Interlacing is used by all the analogue TV broadcast systems in current use:

  • PAL: 50 fields per second, 625 lines, upper field drawn first
  • SECAM: 50 fields per second, 625 lines
  • NTSC: 59.94 fields per second, 525 lines
  • PAL-M: 59.94 fields per second, 525 lines

Benefits of interlacing

With any video system there are trade-offs. One of the most important factors is bandwidth, measured in megahertz (for analog video) or bit rate (for digital video). The greater the bandwidth, the more expensive and complex the entire system: cameras, storage systems such as tape recorders or hard disks, transmission systems such as cable television or the Internet, and displays such as television monitors or digital monitors.

Interlaced video reduces the signal bandwidth by a factor of two, for a given line count and refresh rate.

Alternatively, a given bandwidth can be used to provide an interlaced video signal with twice the display refresh rate for a given line count (versus progressive scan video). A higher refresh rate reduces flicker on CRT monitors. The higher refresh rate improves the portrayal of motion, because objects in motion are captured and their position is updated on the display more often. The human visual system averages the rapidly displayed still pictures into a moving picture image, and so interlace artifacts aren't usually objectionable when viewed at the intended field rate, on an interlaced video display.

For a given bandwidth and refresh rate, interlaced video can be used to provide a higher spatial resolution than progressive scan. For instance, 1920x1080 pixel interlaced HDTV with a 60 Hz field rate (known as 1080i60) has a similar bandwidth to 1280x720 pixel progressive scan HDTV with a 60 Hz frame rate (720p60), but approximately 50% more spatial resolution in each dimension (1920x1080 versus 1280x720 pixels per frame).

Note that this is assuming an analog or uncompressed digital video signal. With digital video compression, as used in all current digital TV standards, interlacing introduces some additional inefficiencies over fully progressive video, and so the bandwidth savings are significantly less than half.
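
As a back-of-the-envelope check of the 1080i60 versus 720p60 comparison above, the following Python sketch computes uncompressed active-pixel rates. It ignores blanking intervals, chroma subsampling and compression, and the function name is purely illustrative.

    # Compare raw active-pixel rates of an interlaced and a progressive format.

    def pixel_rate(width, height, images_per_second, interlaced):
        """Pixels transmitted per second; an interlaced image carries half the lines."""
        lines_per_image = height // 2 if interlaced else height
        return width * lines_per_image * images_per_second

    rate_1080i60 = pixel_rate(1920, 1080, 60, interlaced=True)   # 60 fields/s
    rate_720p60  = pixel_rate(1280,  720, 60, interlaced=False)  # 60 frames/s

    print(f"1080i60: {rate_1080i60:,} pixels/s")        # 62,208,000
    print(f"720p60:  {rate_720p60:,} pixels/s")         # 55,296,000
    print(f"ratio:   {rate_1080i60 / rate_720p60:.2f}") # ~1.13

    # Full-frame spatial resolution, by contrast, differs by a factor of 2.25:
    print(f"frame pixels: {1920 * 1080:,} vs {1280 * 720:,}")  # 2,073,600 vs 921,600

The pixel rates are within about 13% of each other, while the full-frame pixel count of 1080i is 2.25 times that of 720p; this is the sense in which interlace trades temporal completeness for spatial resolution at roughly equal bandwidth.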

Problems caused by interlacing

Interlaced video is designed to be captured, transmitted or stored and displayed in the same interlaced format. Because each frame of interlaced video is composed of two fields that are captured at different moments in time, interlaced video frames will exhibit motion artifacts if the recorded objects are moving fast enough to be in different positions when each individual field is captured. These artifacts may be more visible when interlaced video is displayed at a slower speed than it was captured or when still frames are presented.
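
The following toy Python sketch (all names and the tiny text "frame" are illustrative) shows why these motion artifacts take the form of "combing" when the two fields of a moving object are woven into a single still frame.

    # An object moves between the moments the two fields are captured,
    # so weaving the fields into one frame shows it at two positions
    # on alternating lines. Scanlines are strings for readability.

    def capture_field(position, parity, height=6, width=12):
        """Capture the even (parity=0) or odd (parity=1) lines of a scene
        containing a 2-pixel-wide 'object' at the given horizontal position."""
        lines = []
        for y in range(height):
            if y % 2 == parity:
                row = ["."] * width
                row[position] = row[position + 1] = "#"
                lines.append("".join(row))
        return lines

    def weave(top, bottom):
        """Interleave a top (even-line) field and a bottom (odd-line) field."""
        frame = []
        for t, b in zip(top, bottom):
            frame.extend([t, b])
        return frame

    top    = capture_field(position=2, parity=0)   # object at x=2 at time t
    bottom = capture_field(position=6, parity=1)   # object at x=6, 1/60 s later

    for line in weave(top, bottom):
        print(line)
    # The '#' block sits at x=2 on even lines and x=6 on odd lines:
    # the classic saw-tooth "comb" seen on still frames of interlaced video.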

Because modern computer video displays are progressive-scan systems, interlaced video has visible artifacts when displayed on them. Computers are frequently used to edit video, and this mismatch between computer displays and television signal formats means that the video content being edited cannot be viewed properly unless separate interlaced video display hardware is used.

To minimize the artifacts caused by displaying interlaced video on a progressive-scan monitor, a process called deinterlacing can be used. The process is not perfect, and it generally results in lower resolution, particularly in areas containing objects in motion. Deinterlacing systems are integrated into progressive-scan television displays in order to provide the best possible picture quality for interlaced video signals.
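
As a minimal sketch of one simple deinterlacing strategy, the Python below implements "bob" line doubling, rebuilding a full-height frame from a single field by interpolating the missing lines. Real deinterlacers are typically motion-adaptive or motion-compensated and far more elaborate; the names here are illustrative only.

    # "Bob" deinterlacing: stretch one field back to full frame height by
    # interpolating each missing line from its vertical neighbours.

    def bob_deinterlace(field):
        """Rebuild a full-height frame from one field (a list of scanlines,
        each a list of pixel values)."""
        frame = []
        for i, line in enumerate(field):
            frame.append(list(line))                  # the real scanline
            if i + 1 < len(field):                    # interpolate the gap below it
                below = field[i + 1]
                frame.append([(a + b) // 2 for a, b in zip(line, below)])
            else:
                frame.append(list(line))              # bottom edge: repeat last line
        return frame

    field = [[0, 0], [10, 10], [20, 20]]              # three scanlines of one field
    print(bob_deinterlace(field))
    # [[0, 0], [5, 5], [10, 10], [15, 15], [20, 20], [20, 20]]

The interpolation is what costs vertical resolution: the reconstructed lines are estimates, not the lines that were actually captured in the other field.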

Interlace introduces a potential problem called interline twitter. This aliasing effect shows up only under certain circumstances: when the subject being shot contains vertical detail that approaches the vertical resolution limit of the video format. For instance, a person on television wearing a shirt with fine dark and light stripes may appear on a video monitor as if the stripes on the shirt are "twittering". Television professionals are trained to avoid wearing clothing with fine striped patterns for this reason. High-end video cameras and computer-generated imagery systems apply a vertical low-pass filter to the signal in order to prevent interline twitter.
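
The vertical filtering mentioned above can be sketched as follows. The [1/4, 1/2, 1/4] kernel is a common textbook choice used here purely for illustration, not a value mandated by any particular camera or standard.

    # Vertical low-pass filter: blend each scanline with its neighbours so
    # that no detail is confined to a single line (and hence a single field).

    def vertical_lowpass(frame):
        """Blend each scanline with its neighbours using 1/4, 1/2, 1/4 weights."""
        filtered = []
        for y, line in enumerate(frame):
            above = frame[max(y - 1, 0)]
            below = frame[min(y + 1, len(frame) - 1)]
            filtered.append([
                0.25 * a + 0.5 * c + 0.25 * b
                for a, c, b in zip(above, line, below)
            ])
        return filtered

    # One-pixel-high alternating stripes are the worst case for twitter:
    stripes = [[255], [0], [255], [0], [255], [0]]
    print(vertical_lowpass(stripes))
    # Interior lines collapse toward 127.5, removing the one-line alternation
    # that would otherwise appear in only one field and flicker at the frame rate.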

This animation demonstrates the interline twitter effect. The two interlaced images use half the bandwidth of the progressive one. The interlaced scan (second from left) precisely duplicates the pixels of the progressive image (far left), but interlace causes details to twitter. Real interlaced video blurs such details to prevent twitter, as seen in the third image from the left, but such softening (or anti-aliasing) comes at the cost of resolution. A line doubler could never restore the third image to the full resolution of the progressive image. Note – Because the frame rate has been slowed down, you will notice additional flicker in simulated interlaced portions of this image.

Interline twitter is the primary reason that interlacing is unacceptable for a computer display. Each scanline on a high-resolution computer monitor typically displays discrete pixels that do not span the scanlines above or below. With an interlaced signal running at 30 frames (60 fields) per second, a detail that spans only one scanline appears in just one of the two fields, so it is redrawn only 30 times per second rather than 60 and flickers at half the refresh rate of the display as a whole.

To avoid this problem, sharp detail is typically never displayed on standard interlaced television. When computer graphics are shown on a standard television, the screen is treated as if it had half its actual vertical resolution, or even less. If text is displayed, it is made large enough that horizontal strokes are never just one scanline tall. Most fonts used in television programming have wide, heavy strokes and do not include the fine-detail serifs that would make twittering more visible.

Despite arguments against it, and calls by prominent technology companies such as Microsoft to leave interlacing to history, interlacing continues to be supported by the television standard-setting organizations and is still included in new digital video transmission formats such as DV, DVB (including its HD variants), and ATSC.

Interlace and computers

In the 1970s, computers and home video game systems began using TV sets as display devices. At that point, a 480-line NTSC signal was well beyond the graphics abilities of low-cost computers, so these systems used a simplified video signal that caused each video field to scan directly on top of the previous one, rather than offset between the lines of the previous field. This marked the return of progressive scanning, not seen since the 1920s. Since each field became a complete frame on its own, modern terminology would call this 240p on NTSC sets and 288p on PAL. While consumer devices were permitted to create such signals, broadcast regulations prohibited TV stations from transmitting video like this. Computer monitor standards such as CGA were further simplifications of NTSC, which improved picture quality by omitting color modulation and allowing a more direct connection between the computer's graphics system and the CRT.

By the mid-1980s computers had outgrown these video systems and needed better displays. The Apple IIgs suffered from its use of the old scanning method: its highest display resolution was 640x200, resulting in severely distorted, tall and narrow pixels that made the display of realistically proportioned images difficult. Solutions from various companies varied widely. Because PC monitor signals did not need to be broadcast, they could consume far more than the 6, 7 and 8 MHz of bandwidth to which NTSC and PAL signals were confined. Apple Inc. built a custom 342p display into the Macintosh, and EGA for IBM-compatible PCs was 350p. The Commodore Amiga instead generated a true interlaced NTSC signal (as well as RGB variations). This ability resulted in the Amiga dominating the video production field until the mid-1990s, but the interlaced display mode caused flicker problems for more traditional PC applications where single-pixel detail is required. 1987 saw the introduction of VGA, on which PCs soon standardized; Apple followed suit some years later, when the VGA standard was improved to match Apple's proprietary 24-bit colour video standard, also introduced in 1987.

In the late 1980s and early 1990s, monitor and graphics card manufacturers introduced newer high-resolution standards that once again included interlace. These monitors ran at very high refresh rates, in the expectation that this would alleviate flicker problems. Such monitors proved very unpopular: while flicker was not obvious on them at first, eyestrain and lack of focus nevertheless became a serious problem. The industry quickly abandoned the practice, and for the rest of the decade all monitors included the assurance that their stated resolutions were "non-interlaced". This experience is why the PC industry today remains opposed to interlace in HDTV and lobbied for the 720p standard. The industry is also lobbying for formats beyond 720p: 1080/60p for NTSC-legacy countries and 1080/50p for PAL-legacy countries.

References

  1. ^ 21st-Impact, "Interlaced vs. Progressive".
  2. ^ Albert Glinsky, Theremin, University of Illinois Press, 2000, p. 43. http://books.google.com/books?id=6DHlQJcMpBQC&pg=PA43.
  3. ^ "Pioneering in Electronics", David Sarnoff Collection. http://www.davidsarnoff.org/kil-chapter09.htm. Retrieved 2006-07-27.
  4. ^ U.S. patent 2,152,234. Interestingly, reducing flicker is listed only fourth in the patent's list of objectives of the invention.
  5. ^ R.W. Burns, Television: An International History of the Formative Years, IET, 1998, p. 425. ISBN 9780852969144.
