
Blue screen

Article snapshot taken from Wikipedia under the Creative Commons Attribution-ShareAlike license. Give it a read and then ask your questions in the chat. We can research this topic together.

In visual effects, match moving is a technique that allows the insertion of 2D elements, other live-action elements, or computer graphics (CG) into live-action footage with correct position, scale, orientation, and motion relative to the photographed objects in the shot. It also allows for the removal of live-action elements from the live-action shot. The term is used loosely to describe several different methods of extracting camera motion information from a motion picture. Also referred to as motion tracking or camera solving, match moving is related to rotoscoping and photogrammetry. Match moving is sometimes confused with motion capture, which records the motion of objects, often human actors, rather than the camera. Typically, motion capture requires special cameras and sensors and a controlled environment (although recent developments such as the Kinect camera and Apple's Face ID have begun to change this). Match moving is also distinct from motion control photography, which uses mechanical hardware to execute multiple identical camera moves. Match moving, by contrast, is typically a software-based technology, applied after the fact to normal footage recorded in uncontrolled environments with an ordinary camera.


Blue screen, Blue Screen or bluescreen may refer to: Chroma key or blue-screen compositing, a technique for combining two still images or video frames; Blue screen of death, a fatal system error screen in Microsoft Windows; Blue–white screen, an assay useful in biotechnology; Blue Screen (novel),

162-598: A "key". Green is used as a backdrop for TV and electronic cinematography more than any other colour because television weather presenters tended to wear blue suits. When chroma keying first came into use in television production, the blue screen that was then the norm in the movie industry was used out of habit, until other practical considerations caused the television industry to move from blue to green screens. Broadcast-quality colour television cameras use separate red, green and blue image sensors, and early analog TV chroma keyers required RGB component video to work reliably. From

a 2006 novel by Robert B. Parker; Bluescreen, a 2016 novel by Dan Wells.

a bright and saturated image. There are several different quality- and speed-optimised techniques for implementing colour keying in software. In most versions, a function f(r, g, b) → α is applied to every pixel in the image. α (alpha) has a meaning similar to that in alpha compositing techniques. α ≤ 0 means the pixel is fully in the green screen, α ≥ 1 means

a camera in a real or virtual world. Therefore, a camera is a vector that includes as its elements the position of the camera, its orientation, focal length, and other possible parameters that define how the camera focuses light onto the film plane. Exactly how this vector is constructed is not important as long as there is a compatible projection function P. The projection function P takes as its input

a camera vector (denoted camera) and another vector, the position of a 3-D point in space (denoted xyz), and returns a 2-D point that has been projected onto a plane in front of the camera (denoted XY). We can express this as XY = P(camera, xyz). The projection function transforms the 3-D point and strips away the component of depth. Without knowing the depth of the component, an inverse projection function can only return
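
To make the projection function concrete, here is a minimal sketch assuming a simple pinhole model, with the camera vector reduced to a position, a world-to-camera rotation matrix and a focal length (the dictionary keys and function names are illustrative, not from any particular match moving package):

```python
import numpy as np

def project(camera, xyz):
    """P(camera, xyz) -> XY: project a 3-D point onto the image plane of a
    pinhole camera described by a position, a rotation and a focal length."""
    p_cam = camera["rotation"] @ (np.asarray(xyz, dtype=float) - camera["position"])
    # Perspective divide: the depth component is stripped away here.
    return camera["focal_length"] * p_cam[:2] / p_cam[2]

def inverse_project(camera, XY):
    """P'(camera, XY): without depth, all we can recover is a ray, i.e. the
    camera position plus a unit direction along which the 3-D point must lie."""
    direction = camera["rotation"].T @ np.array([XY[0], XY[1], camera["focal_length"]])
    return camera["position"], direction / np.linalg.norm(direction)
```

A production solver also models lens distortion, film back and other parameters, but the essential asymmetry is the same: P discards depth, so P' can only return a ray of candidate points.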

567-424: A camera vector that when every parameter is free we still might not be able to narrow F down to a single possibility no matter how many features we track. The more we can restrict the various parameters, especially focal length, the easier it becomes to pinpoint the solution. In all, the 3D solving process is the process of narrowing down the possible solutions to the motion of the camera until we reach one that suits

648-478: A chroma-key background and inserted into the background shot with a distortion effect, in order to create a cloak that is marginally detectable. Difficulties emerge with blue screen when a costume in an effects shot must be blue, such as Superman 's traditional blue outfit. In the 2002 film Spider-Man , in scenes where both Spider-Man and the Green Goblin are in the air, Spider-Man had to be shot in front of

a computer can use these markers to compute the camera's position and thus render an image that matches the perspective and movement of the foreground perfectly. Modern advances in software and computational power have eliminated the need to accurately place the markers: the software figures out their position in space. A potential disadvantage of this is that it requires camera movement, possibly contributing to modern cinematographic techniques whereby

a default value of 1.0. A very simple g() is (r, min(g, b), b). This is fairly close to the capabilities of analog and film-based screen pulling. Modern examples of these functions are best described by two closed nested surfaces in 3D RGB space, often quite complex. Colours inside the inner surface are considered green screen. Colours outside the outer surface are opaque foreground. Colours between
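
As a rough sketch of the simple f() and g() just described (assuming RGB values normalised to [0, 1]; clamping α to [0, 1] is an assumption about how the alpha is consumed downstream):

```python
import numpy as np

def simple_alpha(rgb, A=1.0, B=1.0):
    """f(r, g, b) -> alpha, using the simple form A*(r + b) - B*g with the
    user-adjustable constants A and B defaulting to 1.0."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return np.clip(A * (r + b) - B * g, 0.0, 1.0)  # 0 = green screen, 1 = foreground

def simple_spill_removal(rgb):
    """g(r, g, b) -> (r, min(g, b), b): green is clamped so it never exceeds blue,
    which suppresses most green spill on foreground objects."""
    out = rgb.copy()
    out[..., 1] = np.minimum(rgb[..., 1], rgb[..., 2])
    return out
```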

891-446: A field monitor, to the side of the screen, to see where they are putting their hands against the background images. A newer technique is to project a faint image onto the screen. Some films make heavy use of chroma key to add backgrounds that are constructed entirely using computer-generated imagery (CGI). Performances from different takes can be composited together, which allows actors to be filmed separately and then placed together in


972-425: A filter or the high contrast film's colour sensitivity to expose only blue (and higher) frequencies. Blue light only shines through the colour negative where there is not blue in the scene, so this left the film clear where the blue screen was, and opaque elsewhere, except it also produced clear for any white objects (since they also contained blue). Removing these spots could be done by a suitable double-exposure with

a green screen and the Green Goblin had to be shot in front of a blue screen, because Spider-Man's costume is red and blue while the Green Goblin's costume is entirely green. If both were shot in front of the same screen, parts of one character would be erased from the shot. For a clean division of foreground from background, it

1134-509: A green top to make it appear that the subject has no body), because the clothing may be replaced with the background image/video. An example of intentional use of this is when an actor wears a blue covering over a part of his body to make it invisible in the final shot. This technique can be used to achieve an effect similar to that used in the Harry Potter films to create the effect of an invisibility cloak . The actor can also be filmed against

1215-506: A human can. A large number of points can be analyzed with statistics to determine the most reliable data. The disadvantage of automatic tracking is that, depending on the algorithm, the computer can be easily confused as it tracks objects through the scene. Automatic tracking methods are particularly ineffective in shots involving fast camera motion such as that seen with hand-held camera work and in shots with repetitive subject matter like small tiles or any sort of regular pattern where one area

1296-433: A narrow frequency band, which can then be separated from the other light using a prism, and projected onto a separate but synchronized film carrier within the camera. This second film is high-contrast black and white, and is processed to produce the matte. A newer technique is to use a retroreflective curtain in the background, along with a ring of bright LEDs around the camera lens . This requires no light to shine on

a quarter of the time needed for other methods. In principle, any type of still background can be used as a chroma key instead of a solid colour. First the background is captured without actors or other foreground elements; then the scene is recorded. The image of the background is used to cancel the background in the actual footage; for example, in a digital image, each pixel will have a different chroma key. This
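
A minimal sketch of such a difference matte, assuming the clean plate and the actual footage are pixel-aligned float images in [0, 1] and that a simple Euclidean colour distance with a user-chosen tolerance is good enough (real tools are considerably more forgiving of noise):

```python
import numpy as np

def difference_matte(frame, clean_plate, tolerance=0.1):
    """Per-pixel key against a previously captured background: pixels whose
    colour stays within `tolerance` of the clean plate are treated as background."""
    distance = np.linalg.norm(frame - clean_plate, axis=-1)
    return (distance > tolerance).astype(frame.dtype)  # 1 = foreground, 0 = background
```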

1458-463: A reasonable match. For outdoor scenes, overcast days create a diffuse, evenly coloured light which can be easier to match in the studio, whereas direct sunlight needs to be matched in both direction and overall colour based on time of day. A studio shot taken in front of a green screen will naturally have ambient light the same colour as the screen, due to its light scattering. This effect is known as spill . This can look unnatural or cause portions of

1539-445: A reference for placing synthetic objects or by a reconstruction program to create a 3-D version of the actual scene. The camera and point cloud need to be oriented in some kind of space. Therefore, once calibration is complete, it is necessary to define a ground plane. Normally, this is a unit plane that determines the scale, orientation and origin of the projected space. Some programs attempt to do this automatically, though more often

1620-476: A scene featuring a genie escaping from a bottle was the first use of a proper bluescreen process to create a travelling matte for The Thief of Bagdad (1940), which won the Academy Award for Best Special Effects that year. In 1950, Warner Brothers employee and ex- Kodak researcher Arthur Widmer began working on an ultraviolet travelling matte process. He also began developing bluescreen techniques: one of

a scene from incidental footage. A reconstruction program can create three-dimensional objects that mimic the real objects from the photographed scene. Using data from the point cloud and the user's estimation, the program can create a virtual object and then extract a texture from the footage that can be projected onto the virtual object as a surface texture. Match moving has two forms. Some compositing programs, such as Shake, Adobe Substance, Adobe After Effects, and Discreet Combustion, include two-dimensional motion tracking capabilities. Two-dimensional match moving only tracks features in two-dimensional space, without any concern for camera movement or distortion. It can be used to add motion blur or image stabilization effects to footage. This technique
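
As a loose illustration of what a purely two-dimensional track can be used for, here is a sketch that stabilizes footage by shifting each frame so a tracked feature stays put (the per-frame (x, y) positions are assumed to come from a 2-D tracker; a real tool would resample at sub-pixel accuracy):

```python
import numpy as np

def stabilize(frames, track):
    """Shift each frame so the tracked 2-D feature stays where it was in frame 0.
    `frames` is a list of H x W (x channels) arrays, `track` a list of (x, y) positions."""
    ref = np.asarray(track[0], dtype=float)
    out = []
    for frame, pos in zip(frames, track):
        dx, dy = np.round(ref - np.asarray(pos, dtype=float)).astype(int)
        out.append(np.roll(frame, shift=(dy, dx), axis=(0, 1)))  # integer-pixel shift only
    return out
```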


1782-420: A scene where an actor walks in front of a background, the tracking artist will want to use only the background to track the camera through the scene, knowing that motion of the actor will throw off the calculations. In this case, the artist will construct a tracking matte to follow the actor through the scene, blocking that information from the tracking process. Since there are often multiple possible solutions to

a set of points {xyz_{i,0}, ..., xyz_{i,n}} and {xyz_{j,0}, ..., xyz_{j,n}}, where i and j still refer to frames and n is an index to one of the many tracking points we are following. We can derive a set of camera vector pair sets {C_{i,j,0}, ..., C_{i,j,n}}. In this way multiple tracks allow us to narrow the possible camera parameters. The set of possible camera parameters that fit, F,

a set of possible 3D points that form a line emanating from the nodal point of the camera lens and passing through the projected 2-D point. We can express the inverse projection as xyz ∈ P'(camera, XY), or equivalently P'(camera, XY) = {xyz : P(camera, xyz) = XY}. Let's say we are in a situation where the features we are tracking are on the surface of a rigid object such as a building. Since we know that the real point xyz will remain in the same place in real space from one frame of

a take. They no longer need to perform to green/blue screens with no feedback of the result. Eye-line references, actor positioning, and CGI interaction can now be done live on set, giving everyone confidence that the shot is correct and going to work in the final composite. To achieve this, a number of components from hardware to software need to be combined. Software collects all six degrees of freedom of movement of

a technological perspective it was equally possible to use the blue or green channel, but because blue clothing was an ongoing challenge, the green screen came into common use. Newscasters sometimes forget the chroma key dress code, and when the key is applied to clothing of the same colour as the background, the person seems to disappear into the key. Because green clothing is less common than blue, it soon became apparent that it

2187-407: A white backdrop to include human actors with cartoon characters and backgrounds in his Alice Comedies . The blue screen method was developed in the 1930s at RKO Radio Pictures . At RKO, Linwood Dunn used an early version of the travelling matte to create "wipes" – where there were transitions like a windshield wiper in films such as Flying Down to Rio (1933). Credited to Larry Butler ,

2268-434: A white void and a camera. For any position in space that we place the camera, there is a set of corresponding parameters (orientation, focal length, etc.) that will photograph that black point exactly the same way. Since C has an infinite number of members, one point is never enough to determine the actual camera position. As we start adding tracking points, we can narrow the possible camera positions. For example, if we have

is a small set. (The set of possible camera vectors that solve the equation at frames i and j is denoted C_{i,j}.) So there is a set of camera vector pairs C_{i,j} for which the intersection of the inverse projections of two points XY_i and XY_j is a non-empty, hopefully small, set centering on a theoretical stationary point xyz. In other words, imagine a black point floating in

is achieved by a simple numerical comparison between the video and the pre-selected colour. If the colour at a particular point on the screen matches (either exactly, or in a range), then the video at that point is replaced by the alternate background. In order to create an illusion that characters and objects filmed are present in the intended background scene, the lighting in the two scenes must be
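
In a digital image that comparison can be as simple as the following sketch, where the key colour and tolerance are user-chosen and all images are assumed to be floats in [0, 1]:

```python
import numpy as np

def hard_chroma_key(frame, background, key_colour, tolerance=0.15):
    """Replace every pixel whose colour lies within `tolerance` of the
    pre-selected key colour with the corresponding background pixel."""
    distance = np.linalg.norm(frame - np.asarray(key_colour), axis=-1)
    keyed = distance < tolerance
    out = frame.copy()
    out[keyed] = background[keyed]
    return out
```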

2511-451: Is also important that clothing and hair in the foreground shot have a fairly simple silhouette, as fine details such as frizzy hair may not resolve properly. Similarly, partially transparent elements of the costume cause problems. Blue was originally used for the film industry as making the separations required a film that would only respond to the screen colour, and film that responded only to blue and higher frequencies (ultraviolet, etc.)


2592-496: Is best to have as narrow a colour range as possible being replaced. A shadow would present itself as a darker colour to the camera and might not register for replacement. This can sometimes be seen in low-budget or live broadcasts where the errors cannot be manually repaired or scenes reshot. The material being used affects the quality and ease of having it evenly lit. Materials which are shiny will be far less successful than those that are not. A shiny surface will have areas that reflect

Chroma key compositing, or chroma keying, is a visual-effects and post-production technique for compositing (layering) two or more images or video streams together based on colour hues (chroma range). The technique has been used in many fields to remove a background from

2754-526: Is extremely difficult for an automatic tracker to correctly find features with high amounts of motion blur. The disadvantage of interactive tracking is that the user will inevitably introduce small errors as they follow objects through the scene, which can lead to what is called "drift". Professional-level motion tracking is usually achieved using a combination of interactive and automatic techniques. An artist can remove points that are clearly anomalous and use "tracking mattes" to block confusing information out of

2835-421: Is not very distinct. This tracking method also suffers when a shot contains a large amount of motion blur, making the small details it needs harder to distinguish. The advantage of interactive tracking is that a human user can follow features through an entire scene and will not be confused by features that are not rigid. A human user can also determine where features are in a shot that suffers from motion blur; it

2916-410: Is primarily used to track the movement of a camera through a shot so that an identical virtual camera move can be reproduced in a 3D animation program. When new animated elements are composited back into the original live-action shot, they will appear in perfectly matched perspective and therefore appear seamless. As it is mostly software-based, match moving has become increasingly affordable as

2997-409: Is some use of the specific full-intensity magenta colour #FF00FF in digital colour images to encode (1-bit) transparency; this is sometimes referred to as "magic pink". This is not a photographic technique and the extraction of the foreground from the background is trivial. The biggest challenge when setting up a blue screen or green screen is even lighting and the avoidance of shadow because it

3078-437: Is sometimes referred to as a difference matte . However, this makes it easy for objects to be accidentally removed if they happen to be similar to the background, or for the background to remain due to camera noise or if it happens to change slightly from the reference footage. A background with a repeating pattern alleviates many of these issues, and can be less sensitive to wardrobe colour than solid-colour backdrops. There

is sufficient to create realistic effects when the original footage does not include major changes in camera perspective. For example, a billboard deep in the background of a shot can often be replaced using two-dimensional tracking. Three-dimensional match moving tools make it possible to extrapolate three-dimensional information from two-dimensional photography. These tools allow users to derive camera movement and other relative motion from arbitrary footage. The tracking information can be transferred to computer graphics software and used to animate virtual cameras and simulated objects. A number of programs are capable of 3-D match moving. There are two methods by which motion information can be extracted from an image. Interactive tracking, sometimes referred to as "supervised tracking", relies on

is the intersection of all sets: F = C_{i,j,0} ∩ ... ∩ C_{i,j,n}. The fewer elements there are in this set, the closer we can come to extracting the actual parameters of the camera. In reality, errors introduced by the tracking process require a more statistical approach to determining a good camera vector for each frame; optimization algorithms and bundle block adjustment are often utilized. Unfortunately there are so many elements to
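
A heavily simplified sketch of that statistical approach: pack the per-frame camera parameters and the 3-D point positions into one parameter vector and let a generic least-squares optimizer minimize the reprojection error against the observed tracks. This stands in for full bundle block adjustment; the six-parameter camera encoding and the `project` callback are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

def reprojection_residuals(params, n_cameras, n_points, observations, project):
    """Residuals between observed 2-D track positions and the projections of the
    current camera/point estimates. `observations` holds (frame, point, xy) tuples."""
    cameras = params[:n_cameras * 6].reshape(n_cameras, 6)   # e.g. 3 rotation + 3 translation
    points = params[n_cameras * 6:].reshape(n_points, 3)
    residuals = []
    for frame_idx, point_idx, xy_observed in observations:
        residuals.extend(project(cameras[frame_idx], points[point_idx]) - xy_observed)
    return np.asarray(residuals)

# Refine an initial guess (from the geometric solve) by minimizing total error:
# result = least_squares(reprojection_residuals, initial_params,
#                        args=(n_cameras, n_points, observations, project))
```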

3321-455: The BBC ), or by various terms for specific colour-related variants such as green screen or blue screen ; chroma keying can be done with backgrounds of any colour that are uniform and distinct, but green and blue backgrounds are more commonly used because they differ most distinctly in hue from any human skin colour . No part of the subject being filmed or photographed may duplicate the colour used as


3402-407: The focal length of the lenses used can affect the success of chroma key. Another challenge for blue screen or green screen is proper camera exposure . Underexposing or overexposing a coloured backdrop can lead to poor saturation levels. In the case of video cameras, underexposed images can contain high amounts of noise , as well. The background must be bright enough to allow the camera to create

3483-405: The actor in front of a blue screen together with the background footage, one frame at a time. In the early 1970s, American and British television networks began using green backdrops instead of blue for their newscasts. During the 1980s, minicomputers were used to control the optical printer. For the film The Empire Strikes Back , Richard Edlund created a "quad optical printer" that accelerated

3564-423: The automatic tracking process. Tracking mattes are also employed to cover areas of the shot which contain moving elements such as an actor or a spinning ceiling fan. A tracking matte is similar in concept to a garbage matte used in traveling matte compositing. However, the purpose of a tracking matte is to prevent tracking algorithms from using unreliable, irrelevant, or non-rigid tracking points. For example, in

the background other than the LEDs, which use an extremely small amount of power and space unlike big stage lights, and require no rigging. This advance was made possible by the invention in the 1990s of practical blue LEDs, which also allow for emerald green LEDs. There is also a form of colour keying that uses a light spectrum invisible to the human eye, called Thermo-Key, which uses infrared as

3726-445: The background video. Chroma keying is also common in the entertainment industry for visual effects in movies and video games. Rotoscopy may instead be carried out on subjects that are not in front of a green (or blue) screen. Motion tracking can also be used in conjunction with chroma keying, such as to move the background as the subject moves. Prior to the introduction of travelling mattes and optical printing , double exposure

3807-459: The backing, or the part may be erroneously identified as part of the backing. It is commonly used for live weather forecast broadcasts in which a news presenter is seen standing in front of a large CGI map which is really a large blue or green background. Using a blue screen, different weather maps are added on the parts of the image in which the colour is blue. If the news presenter wears blue clothes, their clothes will also be replaced with

3888-407: The calibration process and a significant amount of error can accumulate, the final step to match moving often involves refining the solution by hand. This could mean altering the camera motion itself or giving hints to the calibration mechanism. This interactive calibration is referred to as "refining". Most match moving applications are based on similar algorithms for tracking and calibration. Often,

3969-418: The camera as well as metadata such as zoom, focus, iris and shutter elements from many different types of hardware devices, ranging from motion capture systems such as active LED marker based system from PhaseSpace, passive systems such as Motion Analysis or Vicon, to rotary encoders fitted to camera cranes and dollies such as Technocranes and Fisher Dollies, or inertia & gyroscopic sensors mounted directly to

4050-478: The camera being more sensitive to green light. In analog television , colour is represented by the phase of the chroma subcarrier relative to a reference oscillator. Chroma key is achieved by comparing the phase of the video to the phase corresponding to the pre-selected colour. In-phase portions of the video are replaced by the alternate background video. In digital colour TV , colour is represented by three numbers (red, green, blue intensity levels). Chroma key

4131-452: The camera is always in motion. The principal subject is filmed or photographed against a background consisting of a single colour or a relatively narrow range of colours, usually blue or green because these colours are considered to be the furthest away from skin tone. The portions of the video which match the pre-selected colour are replaced by the alternate background video. This process is commonly known as " keying ", "keying out" or simply


4212-413: The camera. There are also laser based tracking systems that can be attached to anything, including Steadicams, to track cameras outside in the rain at distances of up to 30 meters. Motion control cameras can also be used as a source or destination for 3D camera data. Camera moves can be pre-visualised in advance and then converted into motion control data that drives a camera crane along precisely

4293-454: The characters to disappear, so must be compensated for, or avoided by using a larger screen placed far from the actors. The depth of field used to record the scene in front of the coloured screen should match that of the background. This can mean recording the actors with a larger depth of field than normal. A chroma key subject must avoid wearing clothes which are similar in colour to the chroma key colour(s) (unless intentional e.g., wearing

4374-445: The cleanest key. In the digital television and cinema age, much of the tweaking that was required to make a good quality key has been automated. However, the one constant that remains is some level of colour coordination to keep foreground subjects from being keyed out. Before electronic chroma keying, compositing was done on (chemical) film. The camera colour negative was printed onto high-contrast black and white negative, using either

4455-409: The colour positive (thus turning any area containing red or green opaque), and many other techniques. The result was film that was clear where the blue screen was, and opaque everywhere else. This is called a female matte , similar to an alpha matte in digital keying. Copying this film onto another high-contrast negative produced the opposite male matte . The background negative was then packed with

4536-646: The cost of computer power has declined; it is now an established visual-effects tool and is even used in live television broadcasts as part of providing effects such as the yellow virtual down-line in American football . The process of match moving can be broken down into two steps. The first step is identifying and tracking features. A feature is a specific point in the image that a tracking algorithm can lock onto and follow through multiple frames ( SynthEyes calls them blips ). Often features are selected because they are bright/dark spots, edges or corners depending on

the female matte and exposed onto a final strip of film, then the camera negative was packed with the male matte and was double-printed onto this same film. These two images combined create the final effect. The most important factor for a key is the colour separation of the foreground (the subject) and background (the screen) – a blue screen will be used if the subject is predominantly green (for example plants), despite

4698-503: The first films to use them was the 1958 adaptation of the Ernest Hemingway novella, The Old Man and the Sea , starring Spencer Tracy . The name "Chroma-Key" was RCA 's trade name for the process, as used on its NBC television broadcasts, incorporating patents granted to RCA's Albert N. Goldsmith. A very early broadcast use was NBC's George Gobel Show in fall 1957. Petro Vlahos

4779-427: The green channel. Green can also be used outdoors where the light colour temperature is significantly blue. Red is avoided as it is in human skin, and any other colour is a mix of primaries and thus produces a less clean extraction. A so-called " yellow screen " is accomplished with a white backdrop. Ordinary stage lighting is used in combination with a bright yellow sodium lamp. The sodium light falls almost entirely in

4860-414: The green screen two stops higher than the subject, or vice versa. Sometimes a shadow can be used to create a visual effect. Areas of the blue screen or green screen with a shadow on them can be replaced with a darker version of the desired background video image, making it look like the person is casting a shadow on them. Any spill of the chroma key colour will make the result look unnatural. A difference in

the image to the next, we can make the point a constant even though we do not know where it is. So xyz_i = xyz_j, where the subscripts i and j refer to arbitrary frames in the shot we are analyzing. Since this is always true, we know that P'(camera_i, XY_i) ∩ P'(camera_j, XY_j) ≠ {}. Because the value of XY_i has been determined for all frames that the feature is tracked through by the tracking program, we can solve the reverse projection function between any two frames as long as P'(camera_i, XY_i) ∩ P'(camera_j, XY_j)
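
Because tracking noise means the two inverse-projection rays rarely intersect exactly, the "intersection" is usually taken as the point of closest approach. A small sketch, assuming each ray is given as an origin and a unit direction (as the earlier inverse-projection sketch returns):

```python
import numpy as np

def closest_point_between_rays(o1, d1, o2, d2):
    """Approximate the stationary point xyz as the midpoint of the shortest
    segment between two rays (origin o, unit direction d)."""
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b
    if abs(denom) < 1e-12:          # near-parallel rays give no useful intersection
        return None
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))
```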


the initial results obtained are similar. However, each program has different refining capabilities. On-set, real-time camera tracking is becoming more widely used in feature film production to allow elements that will be inserted in post-production to be visualised live on set. This has the benefit of helping the director and actors improve performances by actually seeing set extensions or CGI characters whilst (or shortly after) they do

5103-423: The key colour, which would not be replaced by background image during postprocessing . For Star Trek: The Next Generation , an ultraviolet light matting process was proposed by Don Lee of CIS Hollywood and developed by Gary Hutzel and the staff of Image G . This involved a fluorescent orange backdrop which made it easier to generate a holdout matte , thus allowing the effects team to produce effects in

5184-409: The lights making them appear pale, while other areas may be darkened. A matte surface will diffuse the reflected light and have a more even colour range. In order to get the cleanest key from shooting green screen, it is necessary to create a value difference between the subject and the green screen. In order to differentiate the subject from the screen, a two-stop difference can be used, either by making

5265-410: The motion of the camera by solving the inverse-projection of the 2-D paths for the position of the camera. This process is referred to as calibration . When a point on the surface of a three-dimensional object is photographed, its position in the 2-D frame can be calculated by a 3-D projection function. We can consider a camera to be an abstraction that holds all the parameters necessary to model

5346-414: The needs of the composite we are trying to create. Once the camera position has been determined for every frame it is then possible to estimate the position of each feature in real space by inverse projection. The resulting set of points is often referred to as a point cloud because of its raw appearance like a nebula . Since point clouds often reveal some of the shape of the 3-D scene they can be used as

the particular tracking algorithm. Popular programs use template matching based on NCC score and RMS error. What is important is that each feature represents a specific point on the surface of a real object. As a feature is tracked it becomes a series of two-dimensional coordinates that represent the position of the feature across a series of frames. This series is referred to as a "track". Once tracks have been created they can be used immediately for 2-D motion tracking, or then be used to calculate 3-D information. The second step involves solving for 3D motion. This process attempts to derive
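
A small sketch of the template matching such trackers rely on, using OpenCV's normalized correlation matcher on grayscale frames (OpenCV is just one possible implementation; the patch and search-window sizes are illustrative, and the feature is assumed to lie far enough from the image border):

```python
import cv2

def track_feature_ncc(prev_gray, next_gray, feature_xy, patch=15, search=40):
    """Find where the patch around `feature_xy` in the previous frame reappears in
    the next frame by maximizing normalized cross-correlation in a search window."""
    x, y = map(int, feature_xy)
    template = prev_gray[y - patch:y + patch + 1, x - patch:x + patch + 1]
    region = next_gray[y - search:y + search + 1, x - search:x + search + 1]
    scores = cv2.matchTemplate(region, template, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_loc = cv2.minMaxLoc(scores)
    # Convert the best match back into full-frame coordinates of the patch centre.
    return (x - search + best_loc[0] + patch,
            y - search + best_loc[1] + patch), best_score
```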

the pixel is fully in the foreground object, and intermediate values indicate the pixel is partially covered by the foreground object (or it is transparent). A further function g(r, g, b) → (r, g, b) is needed to remove green spill on the foreground objects. A very simple f() function for green screen is A(r + b) − Bg, where A and B are user-adjustable constants with

5589-462: The process considerably and saved money. He received a special Academy Award for his innovation. For decades, travelling matte shots had to be done "locked-down", so that neither the matted subject nor the background could shift their camera perspective at all. Later, computer-timed, motion-control cameras alleviated this problem, as both the foreground and background could be filmed with the same camera moves. Meteorologists on television often use

the same path as the 3-D camera. Encoders on the crane can also be used in real time on set to reverse this process to generate live 3D cameras. The data can be sent to any number of different 3D applications, allowing 3D artists to modify their CGI elements live on set as well. The main advantage is that set design issues that would be time-consuming and costly later down

5751-441: The same scene. Chroma key allows performers to appear to be in any location without leaving the studio. Advances in computer technology have simplified the incorporation of motion into composited shots, even when using handheld cameras. Reference points such as a painted grid, X's marked with tape, or equally spaced tennis balls attached to the wall, can be placed onto the coloured background to serve as markers. In post-production,

the scene, and values from user-drawn masks. These produce closed surfaces in space with more than three dimensions. A different class of algorithm tries to figure out a 2D path that separates the foreground from the background. This path can be the output, or the image can be drawn by filling the path with α = 1 as a final step. An example of such an algorithm is the use of active contours. Most research in recent years has been into these algorithms. Match moving

5913-461: The subject of a photo or video – particularly the newscasting , motion picture , and video game industries. A colour range in the foreground footage is made transparent, allowing separately filmed background footage or a static image to be inserted into the scene. The chroma keying technique is commonly used in video production and post-production. This technique is also referred to as colour keying , colour-separation overlay ( CSO ; primarily by

5994-409: The surfaces are partially covered, they are more opaque the closer they are to the outer surface. Sometimes more closed surfaces are used to determine how to remove green spill. It is also very common for f () to depend on more than just the current pixel's colour, it may also use the ( x ,  y ) position, the values of nearby pixels, the value from reference images or a statistical colour model of

6075-419: The user defines this plane. Since shifting ground planes does a simple transformation of all of the points, the actual position of the plane is really a matter of convenience. 3-D reconstruction is the interactive process of recreating a photographed object using tracking data. This technique is related to photogrammetry . In this particular case we are referring to using match moving software to reconstruct

the user to follow features through a scene. Automatic tracking relies on computer algorithms to identify and track features through a shot. The tracked points' movements are then used to calculate a "solution". This solution is composed of all the camera's information such as the motion, focal length, and lens distortion. The advantage of automatic tracking is that the computer can create many points faster than
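
As a sketch of what automatic tracking can look like in practice, here is corner detection plus pyramidal Lucas-Kanade optical flow via OpenCV; this is one common approach, not necessarily what any particular match moving package uses:

```python
import cv2

def auto_track(prev_gray, next_gray, max_points=200):
    """Detect trackable corners in the previous frame and follow them into the
    next frame, returning only the point pairs the tracker reports as reliable."""
    points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_points,
                                     qualityLevel=0.01, minDistance=10)
    if points is None:
        return [], []
    new_points, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, points, None)
    good = status.ravel() == 1
    return points[good], new_points[good]
```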

6237-423: The window areas. In order to have figures in one exposure actually move in front of a substituted background in the other, a travelling matte was needed, to occlude the correct portion of the background in each frame. In 1918 Frank Williams patented a travelling matte technique, again based on using a black background. This was used in many films, such as The Invisible Man . In the 1920s, Walt Disney used

6318-410: Was awarded an Academy Award for his refinement of these techniques in 1964. His technique exploits the fact that most objects in real-world scenes have a colour whose blue-colour component is similar in intensity to their green-colour component. Zbigniew Rybczyński also contributed to bluescreen technology. An optical printer with two projectors, a film camera and a "beam splitter", was used to combine

6399-427: Was easier to use a green matte screen than it was to constantly police the clothing choices of on-air talent. Also, because the human eye is more sensitive to green wavelengths, which lie in the middle of the visible light spectrum, the green analog video channel typically carried more signal strength, giving a better signal to noise ratio compared to the other component video channels, so green screen keys could produce

6480-507: Was far easier to manufacture and make reliable than film that somehow excluded both frequencies higher and lower than the screen colour. In television and digital film making, however, it is equally easy to extract any colour, and green quickly became the favoured colour. Bright green is less likely to be in the foreground objects, colour film emulsions usually had much finer grain in the green, and lossy compression used for analog video signals and digital images and movies retain more detail in

6561-464: Was used to introduce elements into a scene which were not present in the initial exposure. This was done using black draping where a green screen would be used today. George Albert Smith first used this approach in 1898. In 1903, The Great Train Robbery by Edwin S. Porter used double exposure to add background scenes to windows which were black when filmed on set, using a garbage matte to expose only
