
NQ

Article snapshot taken from Wikipedia, under the Creative Commons Attribution-ShareAlike license.

The display resolution or display modes of a digital television, computer monitor, or other display device is the number of distinct pixels in each dimension that can be displayed. It can be an ambiguous term, especially as the displayed resolution is controlled by different factors in cathode-ray tube (CRT) displays, flat-panel displays (including liquid-crystal displays) and projection displays using fixed picture-element (pixel) arrays.
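
As a concrete illustration, a small Python sketch (the function name is ours, not from any library) reduces a pair of pixel dimensions to a total pixel count and an aspect ratio:

    from math import gcd

    def describe_mode(width: int, height: int) -> str:
        """Summarize a display mode from its pixel dimensions."""
        d = gcd(width, height)
        return (f"{width} x {height}: {width * height:,} pixels, "
                f"{width // d}:{height // d} aspect ratio")

    print(describe_mode(1024, 768))   # 1024 x 768: 786,432 pixels, 4:3 aspect ratio
    print(describe_mode(1920, 1080))  # 1920 x 1080: 2,073,600 pixels, 16:9 aspect ratio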


NQ may refer to: Normal Quality, a display resolution mode

Display resolution

It is usually quoted as width × height, with the units in pixels: for example, 1024 × 768 means the width is 1024 pixels and the height is 768 pixels. This example would normally be spoken as "ten twenty-four by seven sixty-eight" or "ten twenty-four by seven six eight". One use of the term display resolution applies to fixed-pixel-array displays such as plasma display panels (PDP), liquid-crystal displays (LCD), Digital Light Processing (DLP) projectors, OLED displays, and similar technologies, and

a 2880 × 1800 display on the MacBook Pro. Panels for professional environments, such as medical use and air traffic control, support resolutions up to 4096 × 2160 (or, more relevant for control rooms, 1:1 2048 × 2048 pixels). In recent years the 16:9 aspect ratio has become more common in notebook displays, and 1366 × 768 (HD) has become popular for most low-cost notebooks, while 1920 × 1080 (FHD) and higher resolutions are available for more premium notebooks. When

a raster scan to create an image (their panels may still be updated in a left-to-right, top-to-bottom scanning fashion, but always in a progressive fashion, and not necessarily at the same rate as the input signal), and so cannot benefit from interlacing (where older LCDs use a "dual scan" system to provide higher resolution with slower-updating technology, the panel is instead divided into two adjacent halves that are updated simultaneously): in practice, they have to be driven with

a 7 or 14 MHz bandwidth), suitable for NTSC/PAL encoding (where it was smoothly decimated to 3.5~4.5 MHz). This ability (plus built-in genlocking) resulted in the Amiga dominating the video production field until the mid-1990s, but the interlaced display mode caused flicker problems for more traditional PC applications where single-pixel detail is required, with "flicker-fixer" scan-doubler peripherals plus high-frequency RGB monitors (or Commodore's own specialist scan-conversion A2024 monitor) being popular, if expensive, purchases amongst power users. 1987 saw

a 75 to 90 Hz field rate (i.e. 37.5 to 45 Hz frame rate), and tended to use longer-persistence phosphors in their CRTs, all of which was intended to alleviate flicker and shimmer problems. Such monitors proved generally unpopular outside of specialist ultra-high-resolution applications such as CAD and DTP, which demanded as many pixels as possible, with interlace being a necessary evil and better than trying to use

a computer display resolution is set higher than the physical screen resolution (native resolution), some video drivers make the virtual screen scrollable over the physical screen, thus realizing a two-dimensional virtual desktop with its viewport. Most LCD manufacturers do make note of the panel's native resolution, as working in a non-native resolution on LCDs will result in a poorer image, due to dropping of pixels to make

a decade after the first ultra-high-resolution interlaced upgrades appeared for the IBM PC, to provide sufficiently high pixel clocks and horizontal scan rates for high-res progressive-scan modes in first professional and then consumer-grade displays, the practice was soon abandoned. For the rest of the 1990s, monitors and graphics cards instead made great play of their highest stated resolutions being "non-interlaced", even where

a display with a native 1366 × 768 pixel array). In the case of television inputs, many manufacturers will take the input and zoom it out to "overscan" the display by as much as 5%, so input resolution is not necessarily display resolution. The eye's perception of display resolution can be affected by a number of factors – see image resolution and optical resolution. One factor

a double rate of progressive frames, resample the frames to the desired resolution, and then re-scan the stream at the desired rate, either in progressive or interlaced mode. Interlace introduces a potential problem called interline twitter, a form of moiré. This aliasing effect only shows up under certain circumstances—when the subject contains vertical detail that approaches the horizontal resolution of

a fixed bandwidth, interlace provides a video signal with twice the display refresh rate for a given line count (versus progressive scan video at a similar frame rate—for instance 1080i at 60 half-frames per second, vs. 1080p at 30 full frames per second). The higher refresh rate improves the appearance of an object in motion, because it updates its position on the display more often, and when an object

a full frame every 1/25 of a second (or 25 frames per second), but with interlacing create a new half frame every 1/50 of a second (or 50 fields per second). To display interlaced video on progressive scan displays, playback applies deinterlacing to the video signal (which adds input lag). The European Broadcasting Union argued against interlaced video in production and broadcasting. Until


a lesser resolution for a more scaled vector rendering. Some emulators, at higher resolutions, can even mimic the aperture grille and shadow masks of CRT monitors. In 2002, 1024 × 768 eXtended Graphics Array was the most common display resolution. Many web sites and multimedia products were redesigned from the previous 800 × 600 format to layouts optimized for 1024 × 768. The availability of inexpensive LCD monitors made

a maximum of 1.5 MHz, or approximately 160 pixels wide, which led to blurring of the color for 320- or 640-wide signals, and made text difficult to read. Many users upgraded to higher-quality televisions with S-Video or RGBI inputs that helped eliminate chroma blur and produce more legible displays. The earliest, lowest-cost solution to the chroma problem was offered in

a path similar to text on a page—line by line, top to bottom. The interlaced scan pattern in a standard definition CRT display also completes such a scan, but in two passes (two fields). The first pass displays the first and all odd-numbered lines, from the top left corner to the bottom right corner. The second pass displays the second and all even-numbered lines, filling in the gaps in the first scan. This scan of alternate lines
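
The odd/even split of the two passes can be sketched in a few lines of Python. This is a toy model, assuming a frame is simply a list of scanlines; the function name is illustrative, not from any library:

    def split_into_fields(frame):
        """Split a frame into the two fields of an interlaced scan:
        lines 1, 3, 5, ... then lines 2, 4, 6, ... (1-based numbering)."""
        field_one = frame[0::2]  # the first and all odd-numbered lines
        field_two = frame[1::2]  # the second and all even-numbered lines
        return field_one, field_two

    frame = [f"line {n}" for n in range(1, 7)]
    odd_field, even_field = split_into_fields(frame)
    # odd_field:  ['line 1', 'line 3', 'line 5']
    # even_field: ['line 2', 'line 4', 'line 6']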

a problem of applying the appropriate algorithms to the interlaced signal, as all information should be present in that signal. In practice, results are currently variable, and depend on the quality of the input signal and the amount of processing power applied to the conversion. The biggest impediment, at present, is artifacts in the lower-quality interlaced signals (generally broadcast video), as these are not consistent from field to field. On

a progressive scan signal. The deinterlacing circuitry to get progressive scan from a normal interlaced broadcast television signal can add to the cost of a television set using such displays. Currently, progressive displays dominate the HDTV market. In the 1970s, computers and home video game systems began using TV sets as display devices. At that point, a 480-line NTSC signal was well beyond

a sharper 405-line frame (with around 377 used for the actual image, and yet fewer visible within the screen bezel; in modern parlance, the standard would be "377i"). The vertical scan frequency remained 50 Hz, but visible detail was noticeably improved. As a result, this system supplanted John Logie Baird's 240-line mechanical progressive scan system that was also being trialled at the time. From

a smaller area using a higher resolution makes the image much clearer or "sharper". However, most recent screen technologies are fixed at a certain resolution; making the resolution lower on these kinds of screens will greatly decrease sharpness, as an interpolation process is used to "fix" the non-native resolution input into the display's native resolution output. While some CRT-based displays may use digital video processing that involves image scaling using memory arrays, ultimately "display resolution" in CRT-type displays

a standard television set, the screen is either treated as if it were half the resolution of what it actually is (or even lower), or rendered at full resolution and then subjected to a low-pass filter in the vertical direction (e.g. a "motion blur" type with a 1-pixel distance, which blends each line 50% with the next, maintaining a degree of the full positional resolution and preventing the obvious "blockiness" of simple line doubling whilst actually reducing flicker to less than what
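
The 50% line-blending filter described here can be modelled as a two-tap vertical average. A minimal Python sketch, assuming a frame is a list of scanlines of numeric samples (the function name is hypothetical):

    def vertical_flicker_filter(frame):
        """Blend each scanline 50% with the one below it: a simple vertical
        low-pass filter that tames interlace flicker at some cost in sharpness."""
        out = []
        for y, line in enumerate(frame):
            below = frame[min(y + 1, len(frame) - 1)]  # clamp at the bottom edge
            out.append([(a + b) / 2 for a, b in zip(line, below)])
        return out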

a two-bladed shutter to produce 48 times per second illumination—but only in projectors incapable of projecting at the lower speed. This solution could not be used for television. To store a full video frame and display it twice requires a frame buffer—electronic memory (RAM)—sufficient to store a video frame. This method did not become feasible until the late 1980s, with digital technology. In addition, avoiding on-screen interference patterns caused by studio lighting and

a variability in resolution that fixed-resolution LCDs cannot provide.

Interlaced video

Interlaced video (also known as interlaced scan) is a technique for doubling the perceived frame rate of a video display without consuming extra bandwidth. The interlaced signal contains two fields of a video frame captured consecutively. This enhances motion perception to the viewer, and reduces flicker by taking advantage of


a video image on an electronic display screen (the other being progressive scan) by scanning or displaying each line or row of pixels. This technique uses two fields to create a frame. One field contains all odd-numbered lines in the image; the other contains all even-numbered lines. Sometimes in interlaced video a field is called a frame, which can lead to confusion. A Phase Alternating Line (PAL)-based television set display, for example, scans 50 fields every second (25 odd and 25 even). The two sets of 25 fields work together to create

is 2048 × 1536 pixels, whereas 4K reference resolution is 4096 × 3072 pixels. Nevertheless, 2K may also refer to resolutions like 2048 × 1556 (full-aperture), 2048 × 1152 (HDTV, 16:9 aspect ratio) or 2048 × 872 pixels (Cinemascope, 2.35:1 aspect ratio). It is also worth noting that while a frame resolution may be, for example, 3:2 (720 × 480 NTSC), that is not what you will see on-screen (i.e. 4:3 or 16:9 depending on
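
These reference figures follow a simple rule stated elsewhere in this article: an nK reference width is exactly 1024·n points, with the height set by the 4:3 reference aspect ratio. A minimal sketch (names are illustrative):

    def reference_resolution(n, aspect_w=4, aspect_h=3):
        """nK reference resolution: width is exactly 1024*n points,
        height follows from the assumed 4:3 reference frame aspect."""
        width = 1024 * n
        return width, width * aspect_h // aspect_w

    print(reference_resolution(2))  # (2048, 1536) -- 2K
    print(reference_resolution(4))  # (4096, 3072) -- 4K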

is (barely) acceptable for small, low-brightness displays in dimly lit rooms, whilst 80 Hz or more may be necessary for bright displays that extend into peripheral vision. The film solution was to project each frame of film three times using a three-bladed shutter: a movie shot at 16 frames per second illuminated the screen 48 times per second. Later, when sound film became available, the higher projection speed of 24 frames per second enabled

is 1080i/25. This convention assumes that one complete frame in an interlaced signal consists of two fields in sequence. One of the most important factors in analog television is signal bandwidth, measured in megahertz. The greater the bandwidth, the more expensive and complex the entire production and broadcasting chain. This includes cameras, storage systems, broadcast systems—and reception systems: terrestrial, cable, satellite, Internet, and end-user displays (TVs and computer monitors). For

is a (small, usually even) integer number which translates into a set of actual resolutions, depending on the film format. As a reference, consider that for a 4:3 (around 1.33:1) aspect ratio, which a film frame (no matter its format) is expected to horizontally fit in, n is the multiplier of 1024 such that the horizontal resolution is exactly 1024·n points. For example, 2K reference resolution

is a format of displaying, storing, or transmitting moving images in which all the lines of each frame are drawn in sequence. This is in contrast to interlaced video used in traditional analog television systems, where only the odd lines, then the even lines of each frame (each image called a video field) are drawn alternately, so that only half the number of actual image frames are used to produce video. Televisions are of

is a technique for doubling the perceived frame rate of a video display without consuming extra bandwidth. The interlaced signal contains two fields of a video frame captured consecutively. This enhances motion perception to the viewer, and reduces flicker by taking advantage of the phi phenomenon. The European Broadcasting Union has argued against interlaced video in production and broadcasting. The main argument

is affected by different parameters such as spot size and focus, astigmatic effects in the display corners, the color phosphor pitch shadow mask (such as Trinitron) in color displays, and the video bandwidth. Most television display manufacturers "overscan" the pictures on their displays (CRTs, PDPs, LCDs, etc.), so that the effective on-screen picture may be reduced from 720 × 576 (480) to 680 × 550 (450), for example. The size of

is called interlacing. A field is an image that contains only half of the lines needed to make a complete picture. In the days of CRT displays, the afterglow of the display's phosphor aided this effect. Interlacing provides full vertical detail with the same bandwidth that would be required for a full progressive scan, but with twice the perceived frame rate and refresh rate. To prevent flicker, all analog broadcast television systems used interlacing. Format identifiers like 576i50 and 720p50 specify

is captured. These artifacts may be more visible when interlaced video is displayed at a slower speed than it was captured, or in still frames. While there are simple methods to produce somewhat satisfactory progressive frames from the interlaced image, for example by doubling the lines of one field and omitting the other (halving vertical resolution), or anti-aliasing the image in the vertical axis to hide some of
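
The line-doubling method mentioned here (often called "bob" deinterlacing) is easy to model. This sketch assumes a field is a list of scanlines and simply repeats each one; the function name is ours:

    def bob_deinterlace(field):
        """Rebuild a full-height frame from one field by doubling each line,
        discarding the other field and so halving vertical resolution."""
        frame = []
        for line in field:
            frame.append(line)
            frame.append(line)  # repeat the line to fill the missing row
        return frame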


is equivalent to about 440 total lines of actual picture information from left edge to right edge. Some commentators also use display resolution to indicate a range of input formats that the display's input electronics will accept, and often include formats greater than the screen's native grid size even though they have to be down-scaled to match the screen's parameters (e.g. accepting a 1920 × 1080 input on

is particularly rare given its much lower line-scanning frequency vs. typical "VGA"-or-higher analog computer video modes. Playing back interlaced video from a DVD, digital file, or analog capture card on a computer display instead requires some form of deinterlacing in the player software and/or graphics hardware, which often uses very simple methods to deinterlace. This means that interlaced video often has visible artifacts on computer systems. Computer systems may be used to edit interlaced video, but

is possible to select the original 640 × 480 in the Advanced Settings window. Programs designed to mimic older hardware such as Atari, Sega, or Nintendo game consoles (emulators), when attached to multiscan CRTs, routinely use much lower resolutions, such as 160 × 200 or 320 × 400, for greater authenticity, though other emulators have taken advantage of pixelation recognition on circle, square, triangle, and other geometric features on

is restored. The computers of the 1980s lacked sufficient power to run similar filtering software.) The advantage of a 720 × 480i overscanned computer was an easy interface with interlaced TV production, leading to the development of NewTek's Video Toaster. This device allowed Amigas to be used for CGI creation in various news departments (for example, weather overlays) and drama programs such as NBC's seaQuest and The WB's Babylon 5. In

is simply the physical number of columns and rows of pixels creating the display (e.g. 1920 × 1080). A consequence of having a fixed-grid display is that, for multi-format video inputs, all displays need a "scaling engine" (a digital video processor that includes a memory array) to match the incoming picture format to the display. For device displays such as phones, tablets, monitors, and televisions,
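
What a scaling engine does can be approximated by the crudest possible resampler, nearest-neighbour; real panels use far better interpolation, so this Python sketch is only a stand-in (names are illustrative):

    def scale_to_native(src, native_w, native_h):
        """Map an incoming picture (list of rows of samples) onto a fixed
        native grid by nearest-neighbour sampling."""
        src_h, src_w = len(src), len(src[0])
        return [[src[y * src_h // native_h][x * src_w // native_w]
                 for x in range(native_w)]
                for y in range(native_h)]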

is stationary, human vision combines information from multiple similar half-frames to produce the same perceived resolution as that provided by a progressive full frame. This technique is only useful, though, if source material is available in higher refresh rates. Cinema movies are typically recorded at 24 fps, and therefore do not benefit from interlacing, a solution which reduces the maximum video bandwidth to 5 MHz without reducing

is that no matter how complex the deinterlacing algorithm may be, the artifacts in the interlaced signal cannot be completely eliminated, because some information is lost between frames. Despite arguments against it, television standards organizations continue to support interlacing. It is still included in digital video transmission formats such as DV, DVB, and ATSC. New video compression standards like High Efficiency Video Coding are optimized for progressive scan video, but sometimes do support interlaced video. Progressive scanning (alternatively referred to as noninterlaced scanning)

is the display screen's rectangular shape, which is expressed as the ratio of the physical picture width to the physical picture height. This is known as the aspect ratio. A screen's physical aspect ratio and the individual pixels' aspect ratio may not necessarily be the same. An array of 1280 × 720 on a 16:9 display has square pixels, but an array of 1024 × 768 on a 16:9 display has oblong pixels. An example of pixel shape affecting "resolution" or perceived sharpness: displaying more information in
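
The square-versus-oblong distinction amounts to dividing the display's aspect ratio by the pixel array's own ratio. A sketch using exact fractions (the function name is ours):

    from fractions import Fraction

    def pixel_aspect_ratio(width, height, display_aspect):
        """Pixel aspect ratio = display aspect ratio / array aspect ratio."""
        return display_aspect / Fraction(width, height)

    print(pixel_aspect_ratio(1280, 720, Fraction(16, 9)))  # 1   -> square pixels
    print(pixel_aspect_ratio(1024, 768, Fraction(16, 9)))  # 4/3 -> oblong pixels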

is the primary reason that interlacing is less suited for computer displays. Each scanline on a high-resolution computer monitor typically displays discrete pixels, each of which does not span the scanline above or below. When the overall interlaced framerate is 60 frames per second, a pixel (or more critically for e.g. windowing systems or underlined text, a horizontal line) that spans only one scanline in height

is visible for the 1/60 of a second that would be expected of a 60 Hz progressive display, but is then followed by 1/60 of a second of darkness (whilst the opposite field is scanned), reducing the per-line/per-pixel refresh rate to 30 frames per second with quite obvious flicker. To avoid this, standard interlaced television sets typically do not display sharp detail. When computer graphics appear on


the Atari 2600 Video Computer System and the Apple II+, both of which offered the option to disable the color and view a legacy black-and-white signal. On the Commodore 64, GEOS mirrored the Mac OS method of using black-and-white to improve readability. The 640 × 400i resolution (720 × 480i with borders disabled) was first introduced by home computers such as the Commodore Amiga and, later, the Atari Falcon. These computers used interlace to boost

the total number of pixels. In digital measurement, the display resolution would be given in pixels per inch (PPI). In analog measurement, if the screen is 10 inches high, then the horizontal resolution is measured across a square 10 inches wide. For television standards, this is typically stated as "lines horizontal resolution, per picture height"; for example, analog NTSC TVs can typically display about 340 lines of "per picture height" horizontal resolution from over-the-air sources, which
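
For the digital (pixels-per-inch) measurement just mentioned, density follows from the pixel grid and the physical diagonal. A back-of-the-envelope sketch (the function name and the 24-inch example are ours):

    from math import hypot

    def pixels_per_inch(width_px, height_px, diagonal_inches):
        """PPI: diagonal length in pixels divided by diagonal size in inches."""
        return hypot(width_px, height_px) / diagonal_inches

    print(round(pixels_per_inch(1920, 1080, 24.0), 1))  # 91.8 on a 24-inch FHD panel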

the 1940s onward, improvements in technology allowed the US and the rest of Europe to adopt systems using progressively higher line-scan frequencies and more radio signal bandwidth to produce higher line counts at the same frame rate, thus achieving better picture quality. However, the fundamentals of interlaced scanning were at the heart of all of these systems. The US adopted the 525-line system, later incorporating

the 5:4 aspect ratio resolution of 1280 × 1024 more popular for desktop usage during the first decade of the 21st century. Many computer users, including CAD users, graphic artists, and video game players, ran their computers at 1600 × 1200 resolution (UXGA) or higher, such as 2048 × 1536 (QXGA), if they had the necessary equipment. Other available resolutions included oversize aspects like 1400 × 1050 SXGA+ and wide aspects like 1280 × 800 WXGA, 1440 × 900 WXGA+, 1680 × 1050 WSXGA+, and 1920 × 1200 WUXGA; monitors built to

the 6, 7 and 8 MHz of bandwidth that NTSC and PAL signals were confined to. IBM's Monochrome Display Adapter and Enhanced Graphics Adapter, as well as the Hercules Graphics Card and the original Macintosh computer, generated video signals of 342 to 350p, at 50 to 60 Hz, with approximately 16 MHz of bandwidth. Some enhanced PC clones such as the AT&T 6300 (aka Olivetti M24), as well as computers made for

the 720p and 1080p standards were also not unusual among home media and video game players, due to the perfect screen compatibility with movie and video game releases. A new more-than-HD resolution of 2560 × 1600 (WQXGA) was released in 30-inch LCD monitors in 2007. In 2010, 27-inch LCD monitors with the 2560 × 1440 resolution were released by multiple manufacturers, and in 2012, Apple introduced

the Japanese home market managed 400p instead at around 24 MHz, and the Atari ST pushed that to 71 Hz with 32 MHz bandwidth - all of which required dedicated high-frequency (and usually single-mode, i.e. not "video"-compatible) monitors due to their increased line rates. The Commodore Amiga instead created a true interlaced 480i60/576i50 RGB signal at broadcast video rates (and with

the PC industry today remains against interlace in HDTV, and lobbied for the 720p standard, and continues to push for the adoption of 1080p (at 60 Hz for NTSC-legacy countries, and 50 Hz for PAL); however, 1080i remains the most common HD broadcast resolution, if only for reasons of backward compatibility with older HDTV hardware that cannot support 1080p - and sometimes not even 720p - without

the PC world, the IBM PS/2 VGA (multi-color) on-board graphics chips used a non-interlaced (progressive) 640 × 480 × 16-color resolution that was easier to read and thus more useful for office work. It was the standard resolution from 1990 to around 1996. The standard resolution was 800 × 600 until around 2000. Microsoft Windows XP, released in 2001, was designed to run at 800 × 600 minimum, although it

the TTL-RGB mode available on the CGA and e.g. the BBC Micro were further simplifications to NTSC, which improved picture quality by omitting modulation of color, and allowing a more direct connection between the computer's graphics system and the CRT. By the mid-1980s, computers had outgrown these video systems and needed better displays. Most home and basic office computers suffered from the use of


the Z axis (away from or towards the camera) will still produce combing, possibly even looking worse than if the fields were joined in a simpler method. Some deinterlacing processes can analyze each frame individually and decide the best method. The best and only perfect conversion in these cases is to treat each frame as a separate image, but that may not always be possible. For framerate conversions and zooming, it would mostly be ideal to line-double each field to produce

the aforementioned full-frame low-pass filter. This animation demonstrates the interline twitter effect using the Indian Head test card. On the left are two progressive scan images. Center are two interlaced images. Right are two images with line doublers. Top are original resolution, bottom are with anti-aliasing. The two interlaced images use half the bandwidth of the progressive one. The interlaced scan (center) precisely duplicates

the artifacts in the interlaced signal cannot be completely eliminated, because some information is lost between frames. Despite arguments against it, television standards organizations continue to support interlacing. It is still included in digital video transmission formats such as DV, DVB, and ATSC. New video compression standards like High Efficiency Video Coding are optimized for progressive scan video, but sometimes do support interlaced video. Progressive scan captures, transmits, and displays an image in

the characteristics of the human visual system. This effectively doubles the time resolution (also called temporal resolution) as compared to non-interlaced footage (for frame rates equal to field rates). Interlaced signals require a display that is natively capable of showing the individual fields in a sequential order. CRT displays and ALiS plasma displays are made for displaying interlaced signals. Interlaced scan refers to one of two common methods for "painting"

the color-keyed picture for each eye in the alternating fields. This does not require significant alterations to existing equipment. Shutter glasses can be adopted as well, obviously with the requirement of achieving synchronisation. If a progressive scan display is used to view such programming, any attempt to deinterlace the picture will render the effect useless. For color-filtered glasses, the picture has to be either buffered and shown as if it

the color standards are often used as synonyms for the underlying video standard - NTSC for 525i/60, PAL/SECAM for 625i/50 - there are several cases of inversions or other modifications; e.g. PAL color is used on otherwise "NTSC" (that is, 525i/60) broadcasts in Brazil, as well as vice versa elsewhere, along with cases of PAL bandwidth being squeezed to 3.58 MHz to fit in the broadcast waveband allocation of NTSC, or NTSC being expanded to take up PAL's 4.43 MHz. Interlacing

the combing, there are sometimes methods of producing results far superior to these. If there is only sideways (X axis) motion between the two fields, and this motion is even throughout the full frame, it is possible to align the scanlines and crop the left and right ends that exceed the frame area to produce a visually satisfactory image. Minor Y axis motion can be corrected similarly by aligning

the composite color standard known as NTSC. Europe adopted the 625-line system, and the UK switched from its idiosyncratic 405-line system to (the much more US-like) 625 to avoid having to develop a (wholly) unique method of color TV. France switched from its similarly unique 819-line monochrome system to the more European standard of 625. Europe in general, including the UK, then adopted the PAL color encoding standard, which

the concept of breaking a single image frame into successive interlaced lines, based on his earlier experiments with phototelegraphy. In the USA, RCA engineer Randall C. Ballard patented the same idea in 1932, initially for the purpose of reformatting sound film to television rather than for the transmission of live images. Commercial implementation began in 1934 as cathode-ray tube screens became brighter, increasing

the disparity between computer video display systems and interlaced television signal formats means that the video content being edited cannot be viewed properly without separate video display hardware. Current-manufacture TV sets employ a system of intelligently extrapolating the extra information that would be present in a progressive signal entirely from an interlaced original. In theory, this should simply be


the early 2010s, they recommended 720p 50 fps (frames per second) for the current production format—and were working with the industry to introduce 1080p 50 as a future-proof production standard. 1080p 50 offers higher vertical resolution, better quality at lower bitrates, and easier conversion to other formats, such as 720p 50 and 1080i 50. The main argument is that no matter how complex the deinterlacing algorithm may be,

the effective picture scan rate of 60 Hz. Given a fixed bandwidth and high refresh rate, interlaced video can also provide a higher spatial resolution than progressive scan. For instance, 1920 × 1080 pixel resolution interlaced HDTV with a 60 Hz field rate (known as 1080i60 or 1080i/30) has a similar bandwidth to 1280 × 720 pixel progressive scan HDTV with a 60 Hz frame rate (720p60 or 720p/60), but achieves approximately twice
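
This bandwidth comparison is easy to check with raw pixel rates (uncompressed, ignoring blanking intervals; a rough sketch with an illustrative function name):

    def pixel_rate(width, height, frames_per_second):
        """Raw pixels per second: a crude proxy for uncompressed bandwidth."""
        return width * height * frames_per_second

    print(pixel_rate(1920, 1080, 30))  # 62208000 -> 1080i60 carries 30 full frames/s
    print(pixel_rate(1280, 720, 60))   # 55296000 -> 720p60 carries 60 full frames/s

The two rates are of the same order, while the 1080-line frame carries about 2.25 times as many pixels, consistent with the "approximately twice the spatial resolution" claim above.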

the faster motions inherent in the increasingly popular window-based operating systems, as well as the full-screen scrolling in WYSIWYG word processors, spreadsheets, and of course high-action games. Additionally, the regular, thin horizontal lines common to early GUIs, combined with low color depth that meant window elements were generally high-contrast (indeed, frequently stark black-and-white), made shimmer even more obvious than with otherwise lower field-rate video applications. As rapid technological advancement made it practical and affordable, barely

the following resolutions: As far as digital cinematography is concerned, video resolution standards depend first on the frames' aspect ratio in the film stock (which is usually scanned for digital intermediate post-production) and then on the actual point count. Although there is not a unique set of standardized sizes, it is commonplace within the motion picture industry to refer to "nK" image "quality", where n

the frame rate for progressive scan formats, but for interlaced formats they typically specify the field rate (which is twice the frame rate). This can lead to confusion, because industry-standard SMPTE timecode formats always deal with frame rate, not field rate. To avoid confusion, SMPTE and EBU always use frame rate to specify interlaced formats, e.g., 480i60 is 480i/30, 576i50 is 576i/25, and 1080i50
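
Since an interlaced field rate is exactly twice the frame rate, the two naming conventions convert mechanically. A sketch (the function name is ours):

    def to_frame_rate_notation(lines, scan, rate):
        """Rewrite field-rate names (480i60) in SMPTE/EBU frame-rate form (480i/30).
        Progressive rates are already frame rates and pass through unchanged."""
        frame_rate = rate // 2 if scan == "i" else rate
        return f"{lines}{scan}/{frame_rate}"

    print(to_frame_rate_notation(480, "i", 60))  # 480i/30
    print(to_frame_rate_notation(576, "i", 50))  # 576i/25
    print(to_frame_rate_notation(720, "p", 50))  # 720p/50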

the frame rate isn't doubled in the deinterlaced output. Providing the best picture quality for interlaced video signals without doubling the frame rate requires expensive and complex devices and algorithms, and can cause various artifacts. For television displays, deinterlacing systems are integrated into progressive scan TV sets that accept interlaced signals, such as broadcast SDTV signals. Most modern computer monitors do not support interlaced video, besides some legacy medium-resolution modes (and possibly 1080i as an adjunct to 1080p), and support for standard-definition video (480/576i or 240/288p)

the frame rate. That is, a 1080p50 signal produces roughly the same bit rate as a 1080i50 (aka 1080i/25) signal, and 1080p50 actually requires less bandwidth to be perceived as subjectively better than its 1080i/25 (1080i50) equivalent when encoding a "sports-type" scene. Interlacing can be exploited to produce 3D TV programming, especially with a CRT display and especially for color-filtered glasses, by transmitting

the full resolution of the progressive image. ALiS plasma panels and the old CRTs can display interlaced video directly, but modern computer video displays and TV sets are mostly based on LCD technology, which mostly uses progressive scanning. Displaying interlaced video on a progressive scan display requires a process called deinterlacing. This can be an imperfect technique, especially if

the graphics abilities of low-cost computers, so these systems used a simplified video signal that made each video field scan directly on top of the previous one, rather than each line between two lines of the previous field, along with relatively low horizontal pixel counts. This marked the return of progressive scanning, not seen since the 1920s. Since each field became a complete frame on its own, modern terminology would call this 240p on NTSC sets, and 288p on PAL. While consumer devices were permitted to create such signals, broadcast regulations prohibited TV stations from transmitting video like this. Computer monitor standards such as

the image fit (when using DVI) or insufficient sampling of the analog signal (when using a VGA connector). Few CRT manufacturers will quote the true native resolution, because CRTs are analog in nature and can vary their display from as low as 320 × 200 (emulation of older computers or game consoles) to as high as the internal board will allow, or the image becomes too detailed for the vacuum tube to recreate (i.e., analog blur). Thus, CRTs provide

the intended aspect ratio of the original material). Computer monitors have traditionally possessed higher resolutions than most televisions. Many personal computers introduced in the late 1970s and the 1980s were designed to use television receivers as their display devices, making the resolutions dependent on the television standards in use, including PAL and NTSC. Picture sizes were usually limited to ensure

the introduction of VGA, on which PCs soon standardized, as well as Apple's Macintosh II range, which offered displays of similar, then superior, resolution and color depth, with rivalry between the two standards (and later PC quasi-standards such as XGA and SVGA) rapidly pushing up the quality of display available to both professional and home users. In the late 1980s and early 1990s, monitor and graphics card manufacturers introduced newer high-resolution standards that once again included interlace. These monitors ran at higher scanning frequencies, typically allowing

the invisible area somewhat depends on the display device. Some HD televisions do this as well, to a similar extent. Computer displays, including projectors, generally do not overscan, although many models (particularly CRT displays) allow it. CRT displays tend to be underscanned in stock configurations, to compensate for the increasing distortions at the corners. Interlaced video (also known as interlaced scan)

the level of flicker caused by progressive (sequential) scanning. In 1936, when the UK was setting analog standards, early thermionic-valve-based CRT drive electronics could only scan at around 200 lines in 1/50 of a second (i.e. approximately a 10 kHz repetition rate for the sawtooth horizontal deflection waveform). Using interlace, a pair of 202.5-line fields could be superimposed to become

the limits of vacuum tube technology required that CRTs for TV be scanned at AC line frequency. (This was 60 Hz in the US, 50 Hz in Europe.) Several different interlacing patents had been proposed since 1914 in the context of still or moving image transmission, but few of them were practicable. In 1926, Ulises Armand Sanabria demonstrated television to 200,000 people attending the Chicago Radio World's Fair. Sanabria's system

the maximum vertical resolution. These modes were only suited to graphics or gaming, as the flickering interlace made reading text in word processor, database, or spreadsheet software difficult. (Modern game consoles solve this problem by pre-filtering the 480i video to a lower resolution. For example, Final Fantasy XII suffers from flicker when the filter is turned off, but stabilizes once filtering

the old scanning method, with the highest display resolution being around 640 × 200 (or sometimes 640 × 256 in 625-line/50 Hz regions), resulting in a severely distorted, tall, narrow pixel shape, making the display of high-resolution text alongside realistically proportioned images difficult (logical "square pixel" modes were possible but only at low resolutions of 320 × 200 or less). Solutions from various companies varied widely. Because PC monitor signals did not need to be broadcast, they could consume far more than

the old unprocessed NTSC signal, the screens do not all follow motion in perfect synchrony. Some models appear to update slightly faster or slower than others. Similarly, the audio can have an echo effect due to different processing delays. When motion picture film was developed, the movie screen had to be illuminated at a high rate to prevent visible flicker. The exact rate necessary varies by brightness — 50 Hz

the other hand, high-bit-rate interlaced signals, such as those from HD camcorders operating in their highest bit rate mode, work well. Deinterlacing algorithms temporarily store a few frames of interlaced images and then extrapolate extra frame data to make a smooth flicker-free image. This frame storage and processing results in a slight display lag that is visible in business showrooms with a large number of different models on display. Unlike

the overall framerate was barely any higher than what it had been for the interlaced modes (e.g. SVGA at 56p versus 43i to 47i), and usually including a top mode technically exceeding the CRT's actual resolution (number of color-phosphor triads), which meant there was no additional image clarity to be gained through interlacing and/or increasing the signal bandwidth still further. This experience is why

the pixels of the progressive image (left), but interlace causes details to twitter. A line doubler operating in "bob" (interpolation) mode would produce the images at far right. Real interlaced video blurs such details to prevent twitter, as seen in the bottom row, but such softening (or anti-aliasing) comes at the cost of image clarity. But even the best line doubler could never restore the bottom center image to

the progressive-scan equivalents. Whilst flicker was often not immediately obvious on these displays, eyestrain and lack of focus nevertheless became a serious problem, and the trade-off for a longer afterglow was reduced brightness and poor response to moving images, leaving visible and often off-colored trails behind. These colored trails were a minor annoyance for monochrome displays, and the generally slower-updating screens used for design or database-query purposes, but much more troublesome for color displays and

the scanlines in a different sequence and cropping the excess at the top and bottom. Often the middle of the picture is the most necessary area to put into check, and whether there is only X or Y axis alignment correction, or both are applied, most artifacts will occur towards the edges of the picture. However, even these simple procedures require motion tracking between the fields, and a rotating or tilting object, or one that moves in

the simpler approach would achieve). If text is displayed, it is large enough so that any horizontal lines are at least two scanlines high. Most fonts for television programming have wide, fat strokes, and do not include fine-detail serifs that would make the twittering more visible; in addition, modern character generators apply a degree of anti-aliasing that has a similar line-spanning effect to

the spatial resolution for low-motion scenes. However, bandwidth benefits only apply to an analog or uncompressed digital video signal. With digital video compression, as used in all current digital TV standards, interlacing introduces additional inefficiencies. The EBU has performed tests that show that the bandwidth savings of interlaced video over progressive video are minimal, even with twice

the technical difference is simply that of either starting/ending the vertical sync cycle halfway along a scanline every other frame (interlace), or always synchronising right at the start/end of a line (progressive). Interlace is still used for most standard definition TVs, and the 1080i HDTV broadcast standard, but not for LCD, micromirror (DLP), or most plasma displays; these displays do not use

the use of the term display resolution as defined above is a misnomer, though common. The term display resolution is usually used to mean pixel dimensions, the maximum number of pixels in each dimension (e.g. 1920 × 1080), which does not tell anything about the pixel density of the display on which the image is actually formed: resolution properly refers to the pixel density, the number of pixels per unit distance or area, not

the vertical resolution in the process. 160 × 200, 320 × 200 and 640 × 200 on NTSC were relatively common resolutions in the era (224, 240 or 256 scanlines were also common). In the IBM PC world, these resolutions came to be used by 16-color EGA video cards. One of the drawbacks of using a classic television is that the computer display resolution is higher than the television could decode. Chroma resolution for NTSC/PAL televisions is bandwidth-limited to

the video format. For instance, a finely striped jacket on a news anchor may produce a shimmering effect. This is twittering. Television professionals avoid wearing clothing with fine striped patterns for this reason. Professional video cameras or computer-generated imagery systems apply a low-pass filter to the vertical resolution of the signal to prevent interline twitter. Interline twitter

the visibility of all the pixels in the major television standards and the broad range of television sets with varying amounts of overscan. The actual drawable picture area was, therefore, somewhat smaller than the whole screen, and was usually surrounded by a static-colored border. Also, the interlace scanning was usually omitted in order to provide more stability to the picture, effectively halving

was essentially based on NTSC, but inverted the color carrier phase with each line (and frame) in order to cancel out the hue-distorting phase shifts that dogged NTSC broadcasts. France instead adopted its own unique, twin-FM-carrier-based SECAM system, which offered improved quality at the cost of greater electronic complexity, and was also used by some other countries, notably Russia and its satellite states. Though

was mechanically scanned using a 'triple interlace' Nipkow disc with three offset spirals, and was thus a 3:1 scheme rather than the usual 2:1. It worked with 45-line images transmitted at 15 frames per second. With 15 frames per second and a 3:1 interlace, the field rate was 45 fields per second, yielding (for the time) a very steady image. He did not apply for a patent for his interlaced scanning until May 1931. In 1930, German Telefunken engineer Fritz Schröter first formulated and patented

was progressive with alternating color-keyed lines, or each field has to be line-doubled and displayed as discrete frames. The latter procedure is the only way to suit shutter glasses on a progressive display. Interlaced video is designed to be captured, stored, transmitted, and displayed in the same interlaced format. Because each interlaced video frame is two fields captured at different moments in time, interlaced video frames can exhibit motion artifacts known as interlacing effects, or combing, if recorded objects move fast enough to be in different positions when each individual field

was ubiquitous in displays until the 1970s, when the needs of computer monitors resulted in the reintroduction of progressive scan, including on regular TVs or simple monitors based on the same circuitry; most CRT-based displays are entirely capable of displaying both progressive and interlaced video regardless of their original intended use, so long as the horizontal and vertical frequencies match, as
