
Fleet Satellite Communications System


FLTSATCOM (also FLTSAT) is a satellite communication system controlled by the U.S. Space Force (formerly by SPAWARSYSCOM) which was used for UHF radio communications between ships, submarines, airplanes and ground stations of the Navy.


Most of the transponders on these satellites were simple repeaters with no authentication or control over what they retransmitted. Altogether eight satellites were launched from 1978 to 1989 by Atlas-Centaur rockets into geostationary orbit. The system became operational in 1981. The satellites were manufactured by TRW. The solar array of each satellite had a span of over 13.2 m. A special characteristic

A bent pipe (i.e., u-bend) principle, sending back to Earth what goes into the conduit with only amplification and a shift from uplink to downlink frequency. However, some modern satellites use on-board processing, where the signal is demodulated, decoded, re-encoded and modulated aboard the satellite. This type, called a "regenerative" transponder, is more complex, but has many advantages, such as improving
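To make the bent-pipe principle concrete, the following is a minimal sketch of a repeater that does nothing but amplify and frequency-translate the received carrier. The frequencies, gain figure, and function name are illustrative assumptions, not values from any real payload.

```python
# Minimal sketch of a "bent pipe" transponder: the payload only amplifies the
# received signal and translates it from the uplink to the downlink frequency.
# All numbers here are illustrative, not taken from a real satellite.

UPLINK_HZ = 6.0e9      # example uplink carrier
DOWNLINK_HZ = 4.0e9    # example downlink carrier
GAIN_DB = 100.0        # end-to-end transponder gain

def bent_pipe(carrier_hz: float, power_w: float) -> tuple[float, float]:
    """Return (downlink carrier, output power) for a bent-pipe repeater."""
    translated = carrier_hz - (UPLINK_HZ - DOWNLINK_HZ)  # frequency translation
    amplified = power_w * 10 ** (GAIN_DB / 10)           # amplification only
    return translated, amplified

# A regenerative transponder would instead demodulate, decode, re-encode and
# re-modulate the signal on board before retransmission.
print(bent_pipe(6.02e9, 1e-12))  # (4.02e9 Hz, 0.01 W)
```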

A bandwidth of 36 MHz, 8 with 54 MHz, and 4 with 72 MHz, which totals to 1152 MHz, or 32 TPE (i.e., 1152 MHz divided by 36 MHz).

Video

Video is an electronic medium for the recording, copying, playback, broadcasting, and display of moving visual media. Video was first developed for mechanical television systems, which were quickly replaced by cathode-ray tube (CRT) systems, which, in turn, were replaced by flat-panel displays of several types. Video systems vary in display resolution, aspect ratio, refresh rate, color capabilities, and other qualities. Analog and digital variants exist and can be carried on

A closed-circuit system as an analog signal. Broadcast or studio cameras use a single or dual coaxial cable system using serial digital interface (SDI). See List of video connectors for information about physical connectors and related signal standards. Video may be transported over networks and other shared digital communications links using, for instance, MPEG transport stream, SMPTE 2022 and SMPTE 2110. Digital television broadcasts use

A natively progressive broadcast or recorded signal, the result is the optimum spatial resolution of both the stationary and moving parts of the image. Interlacing was invented as a way to reduce flicker in early mechanical and CRT video displays without increasing the number of complete frames per second. Interlacing retains detail while requiring lower bandwidth compared to progressive scanning. In interlaced video,

A number is available. Analog video is a video signal represented by one or more analog signals. Analog color video signals include luminance (Y) and chrominance (C). When combined into one channel, as is the case among others with NTSC, PAL, and SECAM, it is called composite video. Analog video may be carried in separate channels, as in two-channel S-Video (YC) and multi-channel component video formats. Analog video

A particular refresh rate, display resolution, and color space. Many analog and digital recording formats are in use, and digital video clips can also be stored on a computer file system as files, which have their own formats. In addition to the physical format used by the data storage device or transmission medium, the stream of ones and zeros that is sent must be in a particular digital video coding format, for which

A ratio between width and height. The ratio of width to height for a traditional television screen is 4:3, or about 1.33:1. High-definition televisions use an aspect ratio of 16:9, or about 1.78:1. The aspect ratio of a full 35 mm film frame with soundtrack (also known as the Academy ratio) is 1.375:1. Pixels on computer monitors are usually square, but pixels used in digital video often have non-square aspect ratios, such as those used in

A sequence of miniature photographic images visible to the eye when the film is physically examined. Video, by contrast, encodes images electronically, turning the images into analog or digital electronic signals for transmission or recording. Video technology was first developed for mechanical television systems, which were quickly replaced by cathode-ray tube (CRT) television systems. Video

A variety of media, including radio broadcasts, magnetic tape, optical discs, computer files, and network streaming. The word video comes from the Latin video (I see). Video evolved from facsimile systems developed in the mid-19th century. Early mechanical video scanners, such as the Nipkow disk, were patented as early as 1884; however, it took several decades before practical video systems could be developed, many decades after film. Film records using

Is in rough chronological order. All formats listed were sold to and used by broadcasters, video producers, or consumers; or were important historically. Digital video tape recorders offered improved quality compared to analog recorders. Optical storage media offered an alternative, especially in consumer applications, to bulky tape formats. A video codec is software or hardware that compresses and decompresses digital video. In



Is less sensitive to details in color than brightness, the luminance data for all pixels is maintained, while the chrominance data is averaged for a number of pixels in a block, and the same value is used for all of them. For example, this results in a 50% reduction in chrominance data using 2-pixel blocks (4:2:2) or 75% using 4-pixel blocks (4:2:0). This process does not reduce the number of possible color values that can be displayed, but it reduces
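As a rough illustration of the arithmetic above, the sketch below computes the chroma-data reduction for common J:a:b subsampling schemes (it only counts samples; it does not process real video).

```python
# Chroma-data reduction relative to 4:4:4 (no subsampling).
# In J:a:b notation, for every J luma samples across two rows,
# 'a' chroma samples are kept in the first row and 'b' in the second.

def chroma_reduction(j: int, a: int, b: int) -> float:
    """Fraction of chroma data discarded compared with 4:4:4 sampling."""
    kept = (a + b) / (2 * j)   # chroma samples kept per luma sample
    return 1.0 - kept

print(chroma_reduction(4, 4, 4))  # 4:4:4 -> 0.0  (no reduction)
print(chroma_reduction(4, 2, 2))  # 4:2:2 -> 0.5  (50% of chroma removed)
print(chroma_reduction(4, 2, 0))  # 4:2:0 -> 0.75 (75% of chroma removed)
```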

Is often described as 576i50, where 576 indicates the total number of horizontal scan lines, i indicates interlacing, and 50 indicates 50 fields (half-frames) per second. When displaying a natively interlaced signal on a progressive scan device, the overall spatial resolution is degraded by simple line doubling; artifacts such as flickering or "comb" effects in moving parts of the image appear unless special signal processing eliminates them. A procedure known as deinterlacing can optimize

Is reduced by registering differences between parts of a single frame; this task is known as intraframe compression and is closely related to image compression. Likewise, temporal redundancy can be reduced by registering differences between frames; this task is known as interframe compression, including motion compensation and other techniques. The most common modern compression standards are MPEG-2, used for DVD, Blu-ray, and satellite television, and MPEG-4, used for AVCHD, mobile phones (3GP), and

Is shot at a slower frame rate of 24 frames per second, which slightly complicates the process of transferring a cinematic motion picture to video. The minimum frame rate to achieve a comfortable illusion of a moving image is about sixteen frames per second. Video can be interlaced or progressive. In progressive scan systems, each refresh period updates all scan lines in each frame in sequence. When displaying
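One common way that complication is handled for interlaced video at roughly 30 frames per second is 3:2 pulldown, in which alternate film frames are held for two and three video fields. The sketch below shows only the field-repetition pattern and ignores the exact 1000/1001 NTSC rate adjustment.

```python
# Rough sketch of 3:2 pulldown: 4 film frames (24/s) become 10 video fields,
# i.e. 5 interlaced video frames (~30/s).

def pulldown_3_2(film_frames):
    """Map film frames onto alternating runs of 2 and 3 video fields."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

print(pulldown_3_2(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']  -> 10 fields = 5 frames
```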

Is the series of interconnected units that form a communications channel between the receiving and the transmitting antennas. It is mainly used in satellite communication to transfer the received signals. A transponder is typically composed of: Most communication satellites are radio relay stations in orbit and carry dozens of transponders, each with a bandwidth of tens of megahertz. Most transponders operate on

Is used in NTSC television, YUV is used in PAL television, YDbDr is used by SECAM television, and YCbCr is used for digital video. The number of distinct colors a pixel can represent depends on the color depth expressed in the number of bits per pixel. A common way to reduce the amount of data required in digital video is by chroma subsampling (e.g., 4:4:4, 4:2:2, etc.). Because the human eye
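As a concrete example of one such representation, the sketch below converts 8-bit R'G'B' values to Y'CbCr using full-range BT.601-style coefficients; broadcast video normally uses scaled "studio range" values instead, so treat the exact constants as one common convention rather than the only one.

```python
def rgb_to_ycbcr(r: int, g: int, b: int) -> tuple[float, float, float]:
    """Full-range BT.601-style conversion from 8-bit R'G'B' to Y'CbCr."""
    y  =  0.299    * r + 0.587    * g + 0.114    * b          # luma
    cb = -0.168736 * r - 0.331264 * g + 0.5      * b + 128    # blue-difference chroma
    cr =  0.5      * r - 0.418688 * g - 0.081312 * b + 128    # red-difference chroma
    return y, cb, cr

print(rgb_to_ycbcr(128, 128, 128))  # mid grey -> (128.0, 128.0, 128.0)
```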

Is used in both consumer and professional television production applications. Digital video signal formats have been adopted, including serial digital interface (SDI), Digital Visual Interface (DVI), High-Definition Multimedia Interface (HDMI) and DisplayPort. Video can be transmitted or transported in a variety of ways including wireless terrestrial television as an analog or digital signal, coaxial cable in

The MPEG-2 and other video coding formats and include: Analog television broadcast standards include: An analog video format consists of more information than the visible content of the frame. Preceding and following the image are lines and pixels containing metadata and synchronization information. This surrounding margin is known as a blanking interval or blanking region; the horizontal and vertical front porch and back porch are

The consumer market. Digital video is capable of higher quality and, eventually, a much lower cost than earlier analog technology. After the commercial introduction of the DVD in 1997 and later the Blu-ray Disc in 2006, sales of videotape and recording equipment plummeted. Advances in computer technology allow even inexpensive personal computers and smartphones to capture, store, edit, and transmit digital video, further reducing

The Doppler shift and thus infer range and speed from a communication signal without allocating power to a separate ranging signal. A transponder equivalent (TPE) is a normalized way to refer to transponder bandwidth. It expresses how many transponders would be needed if the same total bandwidth were provided using only 36 MHz transponders. So, for example, the ARSAT-1 has 24 IEEE Ku-band transponders: 12 with
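Using the ARSAT-1 figures quoted in this article (12 transponders of 36 MHz, 8 of 54 MHz and 4 of 72 MHz), the TPE arithmetic is simply total bandwidth divided by 36 MHz, as in this small sketch:

```python
# Transponder equivalents (TPE): total bandwidth expressed in units of a
# 36 MHz reference transponder.

def transponder_equivalents(bandwidths_mhz, ref_mhz=36):
    return sum(bandwidths_mhz) / ref_mhz

# ARSAT-1 figures quoted in the article: 12 x 36 MHz + 8 x 54 MHz + 4 x 72 MHz
arsat1 = [36] * 12 + [54] * 8 + [72] * 4
print(sum(arsat1), "MHz =", transponder_equivalents(arsat1), "TPE")  # 1152 MHz = 32.0 TPE
```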



The Internet. Stereoscopic video for 3D film and other applications can be displayed using several different methods: Different layers of video transmission and storage each provide their own set of formats to choose from. For transmission, there is a physical connector and signal protocol (see List of video connectors). A given physical link can carry certain display standards that specify

The PAL and NTSC variants of the CCIR 601 digital video standard and the corresponding anamorphic widescreen formats. The 720 by 480 pixel raster uses thin pixels on a 4:3 aspect ratio display and fat pixels on a 16:9 display. The popularity of viewing video on mobile phones has led to the growth of vertical video. Mary Meeker, a partner at Silicon Valley venture capital firm Kleiner Perkins Caufield & Byers, highlighted
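The "thin" and "fat" pixels mentioned above follow from dividing the display aspect ratio by the raster's storage aspect ratio. The sketch below ignores the small horizontal-blanking allowance that the exact standardized pixel aspect ratios account for, so the fractions are approximate.

```python
from fractions import Fraction

def pixel_aspect_ratio(width: int, height: int, display_ar: Fraction) -> Fraction:
    """Pixel shape needed for a raster to fill a display of the given aspect ratio."""
    storage_ar = Fraction(width, height)
    return display_ar / storage_ar

# 720x480 raster (storage aspect ratio 3:2) shown on 4:3 and 16:9 displays:
print(pixel_aspect_ratio(720, 480, Fraction(4, 3)))   # 8/9   -> "thin" pixels
print(pixel_aspect_ratio(720, 480, Fraction(16, 9)))  # 32/27 -> "fat" pixels
```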

The additional mass due to the EHF payload module. The fifth satellite reached geosynchronous orbit, but had severely limited utility due to damage to the solar arrays and antennas. The failure was attributed to explosive delamination of the fiberglass honeycomb fairing during flight. The inside wall of the fairing extensively damaged one of the solar arrays, and bent the transmit antenna mast, which prevented

The antenna from deploying fully. Flight 7 was launched out of sequence after a launch failure of a Delta mission carrying the GOES-G weather satellite grounded the entire US launch fleet weeks prior to the scheduled launch of F-6. By the time the Delta mishap investigation concluded there was no risk to the Atlas-Centaur system, F-7 was ready to launch and the system managers elected to swap missions to avoid delaying EHF system testing. Flight 6

The building blocks of the blanking interval. Computer display standards specify a combination of aspect ratio, display size, display resolution, color depth, and refresh rate. A list of common resolutions is available. Early television was almost exclusively a live medium, with some programs recorded to film for historical purposes using Kinescope. The analog video tape recorder was commercially introduced in 1951. The following list

The context of video compression, codec is a portmanteau of encoder and decoder, while a device that only compresses is typically called an encoder, and one that only decompresses is a decoder. The compressed data format usually conforms to a standard video coding format. The compression is typically lossy, meaning that the compressed video lacks some information present in the original video. A consequence of this

The cost of video production and allowing programmers and broadcasters to move to tapeless production. The advent of digital broadcasting and the subsequent digital television transition are in the process of relegating analog video to the status of a legacy technology in most parts of the world. The development of high-resolution video cameras with improved dynamic range and color gamuts, along with

The display of an interlaced video signal from an analog, DVD, or satellite source on a progressive scan device such as an LCD television, digital video projector, or plasma panel. Deinterlacing cannot, however, produce video quality that is equivalent to true progressive scan source material. Aspect ratio describes the proportional relationship between the width and height of video screens and video picture elements. All popular video formats are rectangular, and this can be described by

The fields one at a time, rather than dividing up a complete frame after it is captured, the frame rate for motion is effectively doubled as well, resulting in smoother, more lifelike reproduction of rapidly moving parts of the image when viewed on an interlaced CRT display. NTSC, PAL, and SECAM are interlaced formats. Abbreviated video resolution specifications often include an i to indicate interlacing. For example, PAL video format

The first practical video tape recorders (VTR). In 1951, the first VTR captured live images from television cameras by writing the camera's electrical signal onto magnetic videotape. Video recorders were sold for US$50,000 in 1956, and videotapes cost US$300 per one-hour reel. However, prices gradually dropped over the years; in 1971, Sony began selling videocassette recorder (VCR) decks and tapes into


The growth of vertical video viewing in her 2015 Internet Trends Report – growing from 5% of video viewing in 2010 to 29% in 2015. Vertical video ads like Snapchat's are watched in their entirety nine times more frequently than landscape video ads. The color model uses the video color representation and maps encoded color values to visible colors reproduced by the system. There are several such representations in common use: typically, YIQ

The horizontal scan lines of each complete frame are treated as if numbered consecutively and captured as two fields: an odd field (upper field) consisting of the odd-numbered lines and an even field (lower field) consisting of the even-numbered lines. Analog display devices reproduce each frame, effectively doubling the frame rate as far as perceptible overall flicker is concerned. When the image capture device acquires
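The odd/even split described above is easy to express with array slicing; the sketch below uses NumPy and a toy six-line frame purely for illustration.

```python
import numpy as np

def split_fields(frame: np.ndarray):
    """Split a frame into its two interlaced fields (top row counted as line 1)."""
    odd_field = frame[0::2]    # rows 0, 2, 4... = lines 1, 3, 5...
    even_field = frame[1::2]   # rows 1, 3, 5... = lines 2, 4, 6...
    return odd_field, even_field

frame = np.arange(1, 7).reshape(6, 1)   # toy 6-line "frame"
odd, even = split_fields(frame)
print(odd.ravel(), even.ravel())        # [1 3 5] [2 4 6]
```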

The introduction of high-dynamic-range digital intermediate data formats with improved color depth, has caused digital video technology to converge with film technology. Since 2013, the use of digital cameras in Hollywood has surpassed the use of film cameras. Frame rate, the number of still pictures per unit of time of video, ranges from six or eight frames per second (frame/s) for old mechanical cameras to 120 or more frames per second for new professional cameras. PAL standards (Europe, Asia, Australia, etc.) and SECAM (France, Russia, parts of Africa, etc.) specify 25 frame/s, while NTSC standards (United States, Canada, Japan, etc.) specify 29.97 frame/s. Film

The number of distinct points at which the color changes. Video quality can be measured with formal metrics like peak signal-to-noise ratio (PSNR) or through subjective video quality assessment using expert observation. Many subjective video quality methods are described in the ITU-R recommendation BT.500. One of the standardized methods is the Double Stimulus Impairment Scale (DSIS). In DSIS, each expert views an unimpaired reference video, followed by an impaired version of
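PSNR itself is straightforward to compute from the mean squared error between a reference frame and an impaired frame; the sketch below uses synthetic random frames purely as stand-in data.

```python
import numpy as np

def psnr(reference: np.ndarray, impaired: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between a reference and an impaired frame."""
    mse = np.mean((reference.astype(np.float64) - impaired.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")                    # identical frames
    return 10 * np.log10(peak ** 2 / mse)

ref = np.random.randint(0, 256, (480, 720), dtype=np.uint8)
bad = np.clip(ref + np.random.normal(0, 5, ref.shape), 0, 255).astype(np.uint8)
print(round(psnr(ref, bad), 1), "dB")          # roughly 34 dB for sigma-5 noise
```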

The original weather guidelines developed after the lightning strike during the Apollo 12 launch for all future launches. In the late 1990s, FLTSATCOM satellites were gradually replaced by the UFO satellites. FLTSAT flights 7 and 8 continue to provide UHF communications. FLTSATCOM 7 and FLTSATCOM 8 have also been used to repeat UHF satcom transmissions by unauthorized radio users, particularly in Brazil, including criminals, illegal loggers, truckers and individuals located in remote areas. This has been met with limited law enforcement action.

Transponder (satellite communications)

A communications satellite's transponder

The same video. The expert then rates the impaired video using a scale ranging from "impairments are imperceptible" to "impairments are very annoying." Uncompressed video delivers maximum quality, but at a very high data rate. A variety of methods are used to compress video streams, with the most effective ones using a group of pictures (GOP) to reduce spatial and temporal redundancy. Broadly speaking, spatial redundancy
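The sketch below is a toy illustration of exploiting temporal redundancy within a GOP: one reference frame is stored in full and later frames are stored as (mostly zero) differences. Real codecs add motion compensation, transforms and entropy coding on top of this idea.

```python
import numpy as np

def encode_gop(frames):
    """Store the first frame in full, later frames as differences from it."""
    reference = frames[0]
    return [reference] + [f - reference for f in frames[1:]]

def decode_gop(encoded):
    reference = encoded[0]
    return [reference] + [reference + d for d in encoded[1:]]

frames = [np.full((4, 4), 100, dtype=np.int16) for _ in range(3)]
frames[2][0, 0] += 5                                  # only one pixel changes
encoded = encode_gop(frames)
print(np.count_nonzero(encoded[1]), np.count_nonzero(encoded[2]))   # 0 1
assert all((a == b).all() for a, b in zip(frames, decode_gop(encoded)))
```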

The satellite, rather than paying for a whole transponder or using landlines to send it to an Earth station for multiplexing with other stations. NASA distinguishes between a "transceiver" and a "transponder". A transceiver has an independent transmitter and receiver packaged in the same unit. In a transponder, the transmit carrier frequency is derived from the received signal. The frequency linkage allows an interrogating ground station to recover
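The recoverable quantity is the Doppler shift, for which the basic relation is simply shift ≈ carrier frequency × radial velocity / speed of light. The one-way sketch below ignores the transponder's turnaround ratio and the two-way signal path, so it is only an order-of-magnitude illustration.

```python
C = 299_792_458.0   # speed of light, m/s

def doppler_shift(freq_hz: float, radial_velocity_ms: float) -> float:
    """One-way Doppler shift for a carrier at freq_hz and the given closing speed."""
    return freq_hz * radial_velocity_ms / C

print(doppler_shift(300e6, 3000.0))   # ~3 kHz shift for 3 km/s at 300 MHz
```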

The signal-to-noise ratio as the signal is regenerated from the digital domain, and also permits selective processing of the data in the digital domain. With data compression and multiplexing, several video (including digital video) and audio channels may travel through a single transponder on a single wideband carrier. Original analog video only had one channel per transponder, with subcarriers for audio and automatic transmission identification service (ATIS). Non-multiplexed radio stations can also travel in single channel per carrier (SCPC) mode, with multiple carriers (analog or digital) per transponder. This allows each station to transmit directly to

Was a UHF transmit antenna reflector 4.9 m in diameter. Each satellite had 12 transponders, which worked in the UHF range from 240 to 400 megahertz. Additionally, FLTSATCOM 7 and 8 included an experimental EHF transponder built by Lincoln Laboratory intended to test the MILSTAR ground terminals. The first seven satellites each had a launch mass of 1884 kg and the remaining two were 2310 kg, with

Was destroyed in a launch failure. The Atlas-Centaur was launched on March 26, 1987, into a heavy overcast with light rain. 51 seconds into the flight, the Atlas was struck by lightning and the resulting EMP changed a value in the guidance computer's core memory, causing the rocket to yaw and break apart from aerodynamic stress. As a result of this mishap, NASA and the Air Force re-emphasised and clarified


Was originally exclusively live technology. Live video cameras used an electron beam, which would scan a photoconductive plate with the desired image and produce a voltage signal proportional to the brightness in each part of the image. The signal could then be sent to televisions, where another beam would receive and display the image. Charles Ginsburg led an Ampex research team to develop one of
