Star Bus is a satellite bus family of Orbital ATK. It was originally developed by Thomas van der Heyden, co-founder of CTAI, and later sold to and manufactured by Orbital Sciences Corporation.
The Star Bus satellite platform is designed for various applications, including communications, remote sensing, and scientific missions. The highly configurable platform allows customization to meet specific mission requirements. In addition, it can support a wide range of payloads, including high-resolution imaging systems, microwave sensors, and advanced communication systems. The Star Bus platform
A time-of-flight camera is used to collect information about both the 3-D location and intensity of the light incident on it in every frame. However, in scanning lidar, this camera contains only a point sensor, while in flash lidar, the camera contains either a 1-D or a 2-D sensor array, each pixel of which collects 3-D location and intensity information. In both cases, the depth information is collected using
A combination with a polygon mirror, and a dual-axis scanner. Optic choices affect the angular resolution and range that can be detected. A hole mirror or a beam splitter are options to collect a return signal. Two main photodetector technologies are used in lidar: solid-state photodetectors, such as silicon avalanche photodiodes, or photomultipliers. The sensitivity of the receiver
A degree or two with electronic compasses. Compasses can measure not just azimuth (i.e., degrees to magnetic north), but also altitude (degrees above the horizon), since the magnetic field curves into the Earth at different angles at different latitudes. More exact orientations require gyroscopic-aided orientation, periodically realigned by different methods including navigation from stars or known benchmarks. The quality of remote sensing data consists of its spatial, spectral, radiometric and temporal resolutions. In order to create sensor-based maps, most remote sensing systems expect to extrapolate sensor data in relation to
A different principle, described under flash lidar below. Microelectromechanical mirrors (MEMS) are not entirely solid-state. However, their tiny form factor provides many of the same cost benefits. A single laser is directed to a single mirror that can be reoriented to view any part of the target field. The mirror spins at a rapid rate. However, MEMS systems generally operate in a single plane (left to right). To add
A distance requires a powerful burst of light. The power is limited to levels that do not damage human retinas. Wavelengths must not affect human eyes. However, low-cost silicon imagers do not read light in the eye-safe spectrum. Instead, gallium-arsenide imagers are required, which can boost costs to $200,000. Gallium arsenide is the same compound used to produce high-cost, high-efficiency solar panels usually used in space applications. Lidar can be oriented to nadir, zenith, or laterally. For example, lidar altimeters look down, an atmospheric lidar looks up, and lidar-based collision avoidance systems are side-looking. Laser projections of lidars can be manipulated using various methods and mechanisms to produce
A few peak returns, while more recent systems acquire and digitize the entire reflected signal. Scientists analyse the waveform signal to extract peak returns using Gaussian decomposition. Zhuang et al. (2017) used this approach for estimating aboveground biomass. Handling the huge amounts of full-waveform data is difficult. Therefore, Gaussian decomposition of the waveforms is effective, since it reduces
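The idea behind peak extraction can be sketched in a few lines. A full Gaussian decomposition fits a sum of Gaussians to the digitized waveform by nonlinear least squares; the minimal sketch below only synthesizes a two-return waveform and locates its peaks by local maxima, then converts sample indices to range. The 1 ns sample period and the pulse parameters are illustrative assumptions, not values from any particular instrument.

```python
import math

def gaussian(t, amp, mu, sigma):
    """A single Gaussian return pulse (amplitude, center sample, width)."""
    return amp * math.exp(-((t - mu) ** 2) / (2 * sigma ** 2))

# Synthetic full waveform: two overlapping returns, e.g. canopy top and ground.
waveform = [gaussian(t, 1.0, 20.0, 2.0) + gaussian(t, 0.6, 35.0, 3.0)
            for t in range(60)]

def extract_peaks(samples, threshold=0.1):
    """Return indices of local maxima above a noise threshold."""
    return [i for i in range(1, len(samples) - 1)
            if samples[i] > threshold
            and samples[i] >= samples[i - 1]
            and samples[i] > samples[i + 1]]

peaks = extract_peaks(waveform)

# Convert sample index to range, assuming 1 ns per sample (hypothetical):
# range = c * t / 2 because the pulse travels out and back.
c = 299_792_458.0
ranges_m = [i * 1e-9 * c / 2 for i in peaks]
```

A production decomposition would instead fit amplitude, center, and width for every component (e.g. with a Levenberg-Marquardt solver), which is what makes the data reduction effective: a few fitted parameters replace thousands of raw samples.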
A great deal of data handling overhead. These data tend to be generally more useful for many applications. The regular spatial and temporal organization of Level 3 datasets makes it feasible to readily combine data from different sources. While these processing levels are particularly suitable for typical satellite data processing pipelines, other data level vocabularies have been defined and may be appropriate for more heterogeneous workflows. Satellite images provide very useful information to produce statistics on topics closely related to
A green spectrum (532 nm) laser beam. Two beams are projected onto a fast rotating mirror, which creates an array of points. One of the beams penetrates the water and also detects the bottom surface of the water under favorable conditions. Water depth measurable by lidar depends on the clarity of the water and the absorption of the wavelength used. Water is most transparent to green and blue light, so these will penetrate deepest in clean water. Blue-green light of 532 nm produced by frequency-doubled solid-state IR laser output
A large extent of geography. At the same time, the data is often complex to interpret, and bulky to store. Modern systems tend to store the data digitally, often with lossless compression. The difficulty with this approach is that the data is fragile, the format may be archaic, and the data may be easy to falsify. One of the best systems for archiving data series is as computer-generated machine-readable ultrafiche, usually in typefaces such as OCR-B, or as digitized half-tone images. Ultrafiches survive well in standard libraries, with lifetimes of several centuries. They can be created, copied, filed and retrieved by automated systems. They are about as compact as archival magnetic media, and yet can be read by human beings with minimal, standardized equipment. Generally speaking, remote sensing works on
A legend of mapped classes that suits our purpose, taking again the example of wheat. The straightforward approach is counting the number of pixels classified as wheat and multiplying by the area of each pixel. Many authors have noticed that this estimator is generally biased, because commission and omission errors in a confusion matrix do not compensate each other. The main strength of classified satellite images or other indicators computed on satellite images
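The straightforward pixel-counting estimator described above can be sketched as follows. The 10 m pixel size and the tiny classified map are illustrative assumptions; as the text notes, this estimator is generally biased when commission and omission errors do not cancel, so in practice it is corrected with a confusion matrix or combined with ground data.

```python
# Hypothetical 10 m x 10 m pixels, so each pixel covers 100 m^2.
PIXEL_AREA_M2 = 10 * 10

# Tiny classified map: 1 = wheat, 0 = other classes (illustrative values).
classified = [
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 0, 0],
]

# Count wheat pixels and convert to area.
wheat_pixels = sum(row.count(1) for row in classified)
wheat_area_ha = wheat_pixels * PIXEL_AREA_M2 / 10_000  # m^2 -> hectares
```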
A microscopic array of individual antennas. Controlling the timing (phase) of each antenna steers a cohesive signal in a specific direction. Phased arrays have been used in radar since the 1940s. On the order of a million optical antennas are used to see a radiation pattern of a certain size in a certain direction. To achieve this, the phase of each individual antenna (emitter) must be precisely controlled. It
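The phase control just described follows the classic phased-array relation: for a linear array with element spacing d steered to angle theta, each successive emitter is shifted by 2*pi*d*sin(theta)/lambda so the wavefronts add coherently in the steered direction. The sketch below uses illustrative numbers (1 micron wavelength, half-wavelength spacing), not parameters of any real optical array.

```python
import math

def steering_phases(n_emitters, spacing_m, wavelength_m, angle_rad):
    """Per-emitter phase (radians) for a linear array steered to angle_rad.

    Each successive emitter is offset by 2*pi * d * sin(theta) / lambda,
    the standard beam-steering relation for a uniform linear array.
    """
    dphi = 2 * math.pi * spacing_m * math.sin(angle_rad) / wavelength_m
    return [n * dphi for n in range(n_emitters)]

# Illustrative: 1 micron wavelength, half-wavelength spacing, 30 degrees
# off boresight. With d = lambda/2 and sin(30 deg) = 0.5, the step is pi/2.
phases = steering_phases(4, 0.5e-6, 1.0e-6, math.radians(30.0))
```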
A moving vehicle to collect data along a path. These scanners are almost always paired with other kinds of equipment, including GNSS receivers and IMUs. One example application is surveying streets, where power lines, exact bridge heights, bordering trees, etc. all need to be taken into account. Instead of collecting each of these measurements individually in the field with a tachymeter, a 3-D model from
A new imaging chip with more than 16,384 pixels, each able to image a single photon, enabling them to capture a wide area in a single image. An earlier generation of the technology, with one fourth as many pixels, was deployed by the U.S. military after the January 2010 Haiti earthquake. A single pass by a business jet at 3,000 m (10,000 ft) over Port-au-Prince was able to capture instantaneous snapshots of 600 m (2,000 ft) squares of
A point cloud can be created where all of the measurements needed can be made, depending on the quality of the data collected. This eliminates the problem of forgetting to take a measurement, so long as the model is available, reliable and has an appropriate level of accuracy. Terrestrial lidar mapping involves a process of occupancy grid map generation. The process involves an array of cells divided into grids which employ
A process to store the height values when lidar data falls into the respective grid cell. A binary map is then created by applying a particular threshold to the cell values for further processing. The next step is to process the radial distance and z-coordinates from each scan to identify which 3-D points correspond to each of the specified grid cells, leading to the process of data formation. There are
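The grid-and-threshold procedure above can be sketched as follows. Storing the maximum height per cell and the 1 m cell size are illustrative choices (real pipelines may store means, counts, or multiple statistics per cell).

```python
def occupancy_grid(points, cell_size, nx, ny):
    """Bin (x, y, z) points into cells, keeping the maximum height per cell."""
    grid = [[None] * nx for _ in range(ny)]
    for x, y, z in points:
        i, j = int(y // cell_size), int(x // cell_size)
        if 0 <= i < ny and 0 <= j < nx:
            if grid[i][j] is None or z > grid[i][j]:
                grid[i][j] = z
    return grid

def binary_map(grid, threshold):
    """Threshold cell heights into occupied (1) / free (0)."""
    return [[1 if h is not None and h >= threshold else 0 for h in row]
            for row in grid]

# Illustrative points (x, y, z) in metres, binned into 1 m cells (2 x 1 grid).
points = [(0.2, 0.3, 0.1), (1.5, 0.4, 2.0), (1.6, 0.5, 0.5)]
occ = binary_map(occupancy_grid(points, 1.0, 2, 1), threshold=1.0)
```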
A reference point, including distances between known points on the ground. This depends on the type of sensor used. For example, in conventional photographs, distances are accurate in the center of the image, with the distortion of measurements increasing the farther you get from the center. Another factor is the platen against which the film is pressed, which can cause severe errors when photographs are used to measure ground distances. The step in which this problem
A sample with less accurate, but exhaustive, data for a covariable or proxy that is cheaper to collect. For agricultural statistics, field surveys are usually required, while photo-interpretation may be better for land cover classes that can be reliably identified on aerial photographs or high-resolution satellite images. Additional uncertainty can appear because of imperfect reference data (ground truth or similar). Some options are: ratio estimators, regression estimators, calibration estimators and small area estimators. If we target other variables, such as crop yield or leaf area, we may need different indicators to be computed from images, such as
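Of the options listed, the ratio estimator is the simplest to sketch: it scales the exhaustive image-derived total by the ratio of accurate ground observations to the cheap proxy on the sampled units. The sample values and regional total below are invented for illustration.

```python
def ratio_estimate(y_sample, x_sample, x_total):
    """Classic ratio estimator: Y_hat = (y_bar / x_bar) * X_total.

    y_sample: accurate ground observations on a probabilistic sample
    x_sample: the cheap image-derived proxy on the same sample units
    x_total:  the proxy summed over the whole area of interest
    """
    y_bar = sum(y_sample) / len(y_sample)
    x_bar = sum(x_sample) / len(x_sample)
    return (y_bar / x_bar) * x_total

# Illustrative: ground-truth wheat area vs. classified-image area on four
# sample segments, plus the image-derived total for the whole region (ha).
y = [12.0, 8.0, 15.0, 5.0]   # field survey
x = [10.0, 9.0, 16.0, 5.0]   # satellite classification
estimate = ratio_estimate(y, x, x_total=4000.0)
```

The estimator is effective when the proxy is strongly correlated with the ground truth, which is exactly the situation the text describes.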
A scanning effect: the standard spindle-type, which spins to give a 360-degree view; solid-state lidar, which has a fixed field of view, but no moving parts, and can use either MEMS or optical phased arrays to steer the beams; and flash lidar, which spreads a flash of light over a large field of view before the signal bounces back to a detector. Lidar applications can be divided into airborne and terrestrial types. The two types require scanners with varying specifications based on
A second dimension generally requires a second mirror that moves up and down. Alternatively, another laser can hit the same mirror from another angle. MEMS systems can be disrupted by shock/vibration and may require repeated calibration. Image development speed is affected by the speed at which they are scanned. Options to scan the azimuth and elevation include dual oscillating plane mirrors,
A stand-alone word in 1963 suggests that it originated as a portmanteau of "light" and "radar": "Eventually the laser may provide an extremely sensitive detector of particular wavelengths from distant objects. Meanwhile, it is being used to study the Moon by 'lidar' (light radar) ..." The name "photonic radar" is sometimes used to mean visible-spectrum range finding like lidar. Lidar's first applications were in meteorology, for which
A wide range of materials, including non-metallic objects, rocks, rain, chemical compounds, aerosols, clouds and even single molecules. A narrow laser beam can map physical features with very high resolutions; for example, an aircraft can map terrain at 30-centimetre (12 in) resolution or better. The essential concept of lidar was originated by E. H. Synge in 1930, who envisaged the use of powerful searchlights to probe
A wide variety of lidar applications, in addition to the applications listed below, as it is often mentioned in national lidar dataset programs. These applications are largely determined by the range of effective object detection; resolution, which is how accurately the lidar identifies and classifies objects; and reflectance confusion, meaning how well the lidar can see something in the presence of bright objects, like reflective signs or bright sun. Companies are working to cut
Is a case study that used the voxelisation approach for detecting dead standing Eucalypt trees in Australia. Terrestrial applications of lidar (also terrestrial laser scanning) happen on the Earth's surface and can be either stationary or mobile. Stationary terrestrial scanning is most common as a survey method, for example in conventional topography, monitoring, cultural heritage documentation and forensics. The 3-D point clouds acquired from these types of scanners can be matched with digital images taken of
Is a method for determining ranges by targeting an object or a surface with a laser and measuring the time for the reflected light to return to the receiver. Lidar may operate in a fixed direction (e.g., vertical) or it may scan multiple directions, in which case it is known as lidar scanning or 3D laser scanning, a special combination of 3-D scanning and laser scanning. Lidar has terrestrial, airborne, and mobile applications. Lidar
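The ranging principle just stated reduces to one formula: range = c * t / 2, since the measured time covers the round trip to the target and back. A minimal sketch (ignoring the small correction for the refractive index of air):

```python
# Speed of light in vacuum (m/s); a practical system would also correct
# for the refractive index of air, which this sketch ignores.
C = 299_792_458.0

def range_from_time_of_flight(round_trip_s):
    """Range = c * t / 2: the pulse travels to the target and back."""
    return C * round_trip_s / 2

# A return detected about 667 ns after emission corresponds to roughly 100 m.
r = range_from_time_of_flight(667e-9)
```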
Is another parameter that has to be balanced in a lidar design. Lidar sensors mounted on mobile platforms such as airplanes or satellites require instrumentation to determine the absolute position and orientation of the sensor. Such devices generally include a Global Positioning System receiver and an inertial measurement unit (IMU). Lidar uses active sensors that supply their own illumination source. The energy source hits objects and
Is commonly used to make high-resolution maps, with applications in surveying, geodesy, geomatics, archaeology, geography, geology, geomorphology, seismology, forestry, atmospheric physics, laser guidance, airborne laser swath mapping (ALSM), and laser altimetry. It is used to make digital 3-D representations of areas on the Earth's surface and ocean bottom of the intertidal and near coastal zone by varying
Is designed with a modular architecture, allowing for easy integration of various subsystems and payloads. The bus provides power, communications, and data handling capabilities, while the payloads provide mission-specific capabilities. The platform is designed to be highly reliable and has flown on numerous missions, including commercial communications satellites and NASA's Dawn mission to the asteroid belt, which Orbital Sciences built on Star Bus heritage. Since its initial development,
Is for the green laser light to penetrate water about one and a half to two times the Secchi depth in Indonesian waters. Water temperature and salinity have an effect on the refractive index, which has a small effect on the depth calculation. The data obtained shows the full extent of the land surface exposed above the sea floor. This technique is extremely useful as it will play an important role in
Is impossible to directly measure temperatures in the upper atmosphere, it is possible to measure the spectral emissions from a known chemical species (such as carbon dioxide) in that region. The frequency of the emissions may then be related via thermodynamics to the temperature in that region. To facilitate the discussion of data processing in practice, several processing "levels" were first defined in 1986 by NASA as part of its Earth Observing System and steadily adopted since then, both internally at NASA and elsewhere; these definitions are: A Level 1 data record
Is not visible in night vision goggles, unlike the shorter 1,000 nm infrared laser. Airborne topographic mapping lidars generally use 1,064 nm diode-pumped YAG lasers, while bathymetric (underwater depth research) systems generally use 532 nm frequency-doubled diode-pumped YAG lasers, because 532 nm penetrates water with much less attenuation than 1,064 nm. Laser settings include
Is processed using a toolbox called Toolbox for Lidar Data Filtering and Forest Studies (TIFFS) for lidar data filtering and terrain study software. The data is interpolated to digital terrain models using the software. The laser is directed at the region to be mapped and each point's height above the ground is calculated by subtracting the corresponding digital terrain model elevation from the original z-coordinate. Based on this height above
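The normalization step described above is a per-point subtraction: height above ground = point elevation minus the terrain elevation beneath it. A toy one-dimensional sketch (a real pipeline interpolates the DTM from classified ground returns; the cells and elevations below are invented):

```python
# Digital terrain model (DTM): terrain elevation per cell, in metres.
dtm = {0: 100.0, 1: 101.0, 2: 102.5}

# Lidar returns as (cell, z): a canopy hit, a ground hit, another canopy hit.
points = [(0, 118.0), (1, 101.0), (2, 121.5)]

# Height above ground = point z minus DTM elevation at the same location.
heights = [z - dtm[cell] for cell, z in points]
```

Ground returns come out near zero while vegetation returns keep their canopy height, which is what later steps (e.g. biomass estimation) operate on.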
Is providing cheap information on the whole target area or most of it. This information usually has a good correlation with the target variable (ground truth), which is usually expensive to observe in an unbiased and accurate way. Therefore, it can be observed on a probabilistic sample selected on an area sampling frame. Traditional survey methodology provides different methods to combine accurate information on
Is relevant to highlight that probabilistic sampling is not critical for the selection of training pixels for image classification, but it is necessary for accuracy assessment of the classified images and area estimation. Additional care is recommended to ensure that training and validation datasets are not spatially correlated. We suppose now that we have classified images or a land cover map produced by visual photo-interpretation, with
Is resolved is called georeferencing and involves computer-aided matching of points in the image (typically 30 or more points per image), which is extrapolated with the use of an established benchmark, "warping" the image to produce accurate spatial data. As of the early 1990s, most satellite images are sold fully georeferenced. In addition, images may need to be radiometrically and atmospherically corrected. Interpretation
Is that of examined areas or objects that reflect or emit radiation that stands out from surrounding areas. For a summary of major remote sensing satellite systems, see the overview table. To coordinate a series of large-scale observations, most sensing systems depend on the following: platform location and the orientation of the sensor. High-end instruments now often use positional information from satellite navigation systems. The rotation and orientation are often provided within
Is that of increasingly smaller sensor pods, such as those used by law enforcement and the military, in both manned and unmanned platforms. The advantage of this approach is that it requires minimal modification to a given airframe. Later imaging technologies would include infrared, conventional, Doppler and synthetic aperture radar. The development of artificial satellites in the latter half of
Is the ability to filter out reflections from vegetation from the point cloud model to create a digital terrain model which represents ground surfaces such as rivers, paths, cultural heritage sites, etc., which are concealed by trees. Within the category of airborne lidar, there is sometimes a distinction made between high-altitude and low-altitude applications, but the main difference is a reduction in both accuracy and point density of data acquired at higher altitudes. Airborne lidar can also be used to create bathymetric models in shallow water. The main constituents of airborne lidar include digital elevation models (DEM) and digital surface models (DSM). The points and ground points are
Is the critical process of making sense of the data. The first application was aerial photographic collection, which used the following process: spatial measurement through the use of a light table in both conventional single or stereographic coverage; added skills such as the use of photogrammetry; the use of photomosaics and repeat coverage; and making use of objects' known dimensions in order to detect modifications. Image analysis
Is the most fundamental (i.e., highest reversible level) data record that has significant scientific utility, and is the foundation upon which all subsequent data sets are produced. Level 2 is the first level that is directly usable for most scientific applications; its value is much greater than the lower levels. Level 2 data sets tend to be less voluminous than Level 1 data because they have been reduced temporally, spatially, or spectrally. Level 3 data sets are generally smaller than lower-level data sets and thus can be dealt with without incurring
Is the recently developed automated computer-aided application that is in increasing use. Object-Based Image Analysis (OBIA) is a sub-discipline of GIScience devoted to partitioning remote sensing (RS) imagery into meaningful image-objects, and assessing their characteristics through spatial, spectral and temporal scale. Old data from remote sensing is often valuable because it may provide the only long-term data for
Is the standard for airborne bathymetry. This light can penetrate water, but pulse strength attenuates exponentially with distance traveled through the water. Lidar can measure depths from about 0.9 to 40 m (3 to 131 ft), with vertical accuracy in the order of 15 cm (6 in). The surface reflection makes water shallower than about 0.9 m (3 ft) difficult to resolve, and absorption limits
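The exponential attenuation mentioned above follows the Beer-Lambert law, with the pulse crossing the water column twice (down to the bottom and back). The attenuation coefficient below is an assumed value for fairly clear water, chosen only to illustrate the relation.

```python
import math

def returned_power(p0_w, depth_m, k_per_m):
    """Beer-Lambert attenuation of a bathymetric lidar pulse.

    The pulse crosses the water column twice (down and back), so the
    two-way path length is 2 * depth. k is the diffuse attenuation
    coefficient of the water (assumed value below).
    """
    return p0_w * math.exp(-k_per_m * 2 * depth_m)

# With k = 0.1 /m, a bottom return from 10 m depth keeps exp(-2),
# roughly 13.5% of the transmitted power.
fraction = returned_power(1.0, 10.0, 0.1)
```

This is why measurable depth scales with water clarity: doubling k halves the depth at which the bottom return stays above the detection threshold.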
Is used in numerous fields, including geophysics, geography, land surveying and most Earth science disciplines (e.g. exploration geophysics, hydrology, ecology, meteorology, oceanography, glaciology, geology). It also has military, intelligence, commercial, economic, planning, and humanitarian applications, among others. In current usage, the term remote sensing generally refers to
Is very difficult, if possible at all, to use the same technique in a lidar. The main problems are that all individual emitters must be coherent (technically, coming from the same "master" oscillator or laser source) and have dimensions about the wavelength of the emitted light (1 micron range) to act as a point source, with their phases being controlled with high accuracy. Several companies are working on developing commercial solid-state lidar units, but these units utilize
The Amazon Basin, glacial features in Arctic and Antarctic regions, and depth sounding of coastal and ocean depths. Military collection during the Cold War made use of stand-off collection of data about dangerous border areas. Remote sensing also replaces costly and slow data collection on the ground, ensuring in the process that areas or objects are not disturbed. Orbital platforms collect and transmit data from different parts of
The EGU or Digital Earth encourage the development of learning modules and learning portals. Examples include: FIS – Remote Sensing in School Lessons, Geospektiv, Ychange, or Spatial Discovery, to promote media and method qualifications as well as independent learning. Remote sensing data are processed and analyzed with computer software, known as a remote sensing application. A large number of proprietary and open source applications exist to process remote sensing data. There are applications of gamma rays to mineral exploration through remote sensing. In 1972, more than two million dollars were spent on remote sensing applications with gamma rays to mineral exploration. Gamma rays are used to search for deposits of uranium. By observing radioactivity from potassium, porphyry copper deposits can be located. A high ratio of uranium to thorium has been found to be related to
The European Commission. Forest area and deforestation estimation have also been a frequent target of remote sensing projects, as have land cover and land use. Ground truth or reference data to train and validate image classification require a field survey if we are targeting annual crops or individual forest species, but may be substituted by photo-interpretation if we look at wider classes that can be reliably identified on aerial photos or satellite images. It
The Hughes Aircraft Company introduced the first lidar-like system in 1961, shortly after the invention of the laser. Intended for satellite tracking, this system combined laser-focused imaging with the ability to calculate distances by measuring the time for a signal to return using appropriate sensors and data acquisition electronics. It was originally called "Colidar", an acronym for "coherent light detecting and ranging", derived from
The Magellan spacecraft provided detailed topographic maps of Venus, while instruments aboard SOHO allowed studies to be performed on the Sun and the solar wind, just to name a few examples. Recent developments include, beginning in the 1960s and 1970s, the development of image processing of satellite imagery. The use of the term "remote sensing" began in the early 1960s, when Evelyn Pruitt realized that advances in science meant that aerial photography
The MetOp spacecraft of EUMETSAT are all operated at altitudes of about 800 km (500 mi). The Proba-1, Proba-2 and SMOS spacecraft of the European Space Agency are observing the Earth from an altitude of about 700 km (430 mi). The Earth observation satellites of the UAE, DubaiSat-1 and DubaiSat-2, are also placed in low Earth orbit (LEO), providing satellite imagery of various parts of
The NDVI, a good proxy for chlorophyll activity. The modern discipline of remote sensing arose with the development of flight. The balloonist G. Tournachon (alias Nadar) made photographs of Paris from his balloon in 1858. Messenger pigeons, kites, rockets and unmanned balloons were also used for early images. With the exception of balloons, these first, individual images were not particularly useful for map making or for scientific purposes. Systematic aerial photography
The National Center for Atmospheric Research used it to measure clouds and pollution. The general public became aware of the accuracy and usefulness of lidar systems in 1971 during the Apollo 15 mission, when astronauts used a laser altimeter to map the surface of the Moon. Although the English language no longer treats "radar" as an acronym (i.e., it is uncapitalized), the word "lidar" was capitalized as "LIDAR" or "LiDAR" in some publications beginning in
The electromagnetic spectrum, which in conjunction with larger-scale aerial or ground-based sensing and analysis provides researchers with enough information to monitor trends such as El Niño and other natural long- and short-term phenomena. Other uses include different areas of the earth sciences such as natural resource management, agricultural fields such as land usage and conservation, greenhouse gas monitoring, oil spill detection and monitoring, and national security and overhead, ground-based and stand-off collection on border areas. The basis for multispectral collection and analysis
The time of flight of the laser pulse (i.e., the time it takes each laser pulse to hit the target and return to the sensor), which requires the pulsing of the laser and acquisition by the camera to be synchronized. The result is a camera that takes pictures of distance, instead of colors. Flash lidar is especially advantageous, when compared to scanning lidar, when the camera, scene, or both are moving, since
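A "camera that takes pictures of distance" amounts to applying the range formula to every pixel's measured time of flight. The tiny 2x3 frame below is invented to illustrate the conversion; a real sensor reports one round-trip time per pixel for the whole frame at once.

```python
# Speed of light in vacuum (m/s).
C = 299_792_458.0

# A flash-lidar frame as per-pixel round-trip times in seconds
# (illustrative values: ~10 m, ~20 m and ~30 m targets).
tof_frame = [
    [66.7e-9, 66.7e-9, 133.4e-9],
    [66.7e-9, 200.1e-9, 133.4e-9],
]

# Convert every pixel's time of flight into a distance "image" (metres).
depth_image = [[C * t / 2 for t in row] for row in tof_frame]
```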
The 1980s. No consensus exists on capitalization. Various publications refer to lidar as "LIDAR", "LiDAR", "LIDaR", or "Lidar". The USGS uses both "LIDAR" and "lidar", sometimes in the same document; the New York Times predominantly uses "lidar" for staff-written articles, although contributing news feeds such as Reuters may use "Lidar". Lidar uses ultraviolet, visible, or near-infrared light to image objects. It can target
The 20th century allowed remote sensing to progress to a global scale as of the end of the Cold War. Instrumentation aboard various Earth observing and weather satellites such as Landsat, the Nimbus and more recent missions such as RADARSAT and UARS provided global measurements of various data for civil, research, and military purposes. Space probes to other planets have also provided the opportunity to conduct remote sensing studies in extraterrestrial environments; synthetic aperture radar aboard
The Earth. To get global coverage with a low orbit, a polar orbit is used. A low orbit will have an orbital period of roughly 100 minutes and the Earth will rotate around its polar axis about 25° between successive orbits. The ground track moves towards the west 25° each orbit, allowing a different section of the globe to be scanned with each orbit. Most are in Sun-synchronous orbits. Lidar (/ˈlaɪdɑːr/, also LIDAR, LiDAR or LADAR, an acronym of "light detection and ranging" or "laser imaging, detection, and ranging")
The German students use the services of Google Earth; in 2006 alone, the software was downloaded 100 million times. But studies have shown that only a fraction of them know more about the data they are working with. There exists a huge knowledge gap between the application and the understanding of satellite images. Remote sensing plays only a tangential role in schools, regardless of the political claims to strengthen
The Star Bus platform has undergone continuous enhancements, expanding its capabilities for increasingly complex missions. Orbital ATK, part of Northrop Grumman since its acquisition in 2018, has further refined the platform to support geostationary communications satellites, low Earth orbit (LEO) missions, and interplanetary exploration. These advancements include improvements in power generation, thermal management, and propulsion systems, making it one of
Remote sensing is the acquisition of information about an object or phenomenon without making physical contact with the object, in contrast to in situ or on-site observation. The term is applied especially to acquiring information about Earth and other planets. Remote sensing
The atmosphere. Indeed, lidar has since been used extensively for atmospheric research and meteorology. Lidar instruments fitted to aircraft and satellites carry out surveying and mapping – a recent example being the U.S. Geological Survey Experimental Advanced Airborne Research Lidar. NASA has identified lidar as a key technology for enabling autonomous precision safe landing of future robotic and crewed lunar-landing vehicles. Wavelengths vary to suit
The captured frames do not need to be stitched together, and the system is not sensitive to platform motion. This results in less distortion. 3-D imaging can be achieved using both scanning and non-scanning systems. "3-D gated viewing laser radar" is a non-scanning laser ranging system that applies a pulsed laser and a fast gated camera. Research has begun on virtual beam steering using Digital Light Processing (DLP) technology. Imaging lidar can also be performed using arrays of high-speed detectors and modulation-sensitive detector arrays, typically built on single chips using complementary metal–oxide–semiconductor (CMOS) and hybrid CMOS/charge-coupled device (CCD) fabrication techniques. In these devices, each pixel performs some local processing such as demodulation or gating at high speed, downconverting
The city at a resolution of 30 cm (1 ft), displaying the precise height of rubble strewn in city streets. The new system is ten times better and could produce much larger maps more quickly. The chip uses indium gallium arsenide (InGaAs), which operates in the infrared spectrum at a relatively long wavelength that allows for higher power and longer ranges. In many applications, such as self-driving cars,
The data and is supported by existing workflows that support interpretation of 3-D point clouds. Recent studies investigated voxelisation. The intensities of the waveform samples are inserted into a voxelised space (3-D grayscale image), building up a 3-D representation of the scanned area. Related metrics and information can then be extracted from that voxelised space. Structural information can be extracted using 3-D metrics from local areas, and there
The data's purpose, the size of the area to be captured, the range of measurement desired, the cost of equipment, and more. Spaceborne platforms are also possible; see satellite laser altimetry. Airborne lidar (also airborne laser scanning) is when a laser scanner, while attached to an aircraft during flight, creates a 3-D point cloud model of the landscape. This is currently the most detailed and accurate method of creating digital elevation models, replacing photogrammetry. One major advantage in comparison with photogrammetry
The discovery of the Earth's Van Allen radiation belts. The TIROS-1 spacecraft, launched on April 1, 1960, as part of NASA's Television Infrared Observation Satellite (TIROS) program, sent back the first television footage of weather patterns to be taken from space. In 2008, more than 150 Earth observation satellites were in orbit, recording data with both passive and active sensors and acquiring more than 10 terabits of data daily. By 2021, that total had grown to over 950, with
The entire field of view is illuminated with a wide diverging laser beam in a single pulse. This is in contrast to conventional scanning lidar, which uses a collimated laser beam that illuminates a single point at a time and is raster scanned to illuminate the field of view point-by-point. This illumination method requires a different detection scheme as well. In both scanning and flash lidar,
The entire scene is illuminated at the same time. With scanning lidar, motion can cause "jitter" from the lapse in time as the laser rasters over the scene. As with all forms of lidar, the onboard source of illumination makes flash lidar an active sensor. The signal that is returned is processed by embedded algorithms to produce a nearly instantaneous 3-D rendering of objects and terrain features within
The farmer who plants his fields in a remote corner of the country knows its value." The development of remote sensing technology reached a climax during the Cold War with the use of modified combat aircraft such as the P-51, P-38, RB-66 and the F-4C, or specifically designed collection platforms such as the U2/TR-1, SR-71, A-5 and the OV-1 series, in both overhead and stand-off collection. A more recent development
The field of view of the sensor. The laser pulse repetition frequency is sufficient for generating 3-D videos with high resolution and accuracy. The high frame rate of the sensor makes it a useful tool for a variety of applications that benefit from real-time visualization, such as highly precise remote landing operations. By immediately returning a 3-D elevation mesh of target landscapes, a flash sensor can be used to identify optimal landing zones in autonomous spacecraft landing scenarios. Seeing at
7242-417: The fields of media and methods apart from the mere visual interpretation of satellite images. Many teachers have great interest in the subject "remote sensing", being motivated to integrate this topic into teaching, provided that the curriculum is considered. In many cases, this encouragement fails because of confusing information. In order to integrate remote sensing in a sustainable manner organizations like
7344-581: The first commercial satellite (IKONOS) collecting very high resolution imagery was launched. Remote Sensing has a growing relevance in the modern information society. It represents a key technology as part of the aerospace industry and bears increasing economic relevance – new sensors e.g. TerraSAR-X and RapidEye are developed constantly and the demand for skilled labour is increasing steadily. Furthermore, remote sensing exceedingly influences everyday life, ranging from weather forecasts to reports on climate change or natural disasters . As an example, 80% of
7446-471: The ground the non-vegetation data is obtained which may include objects such as buildings, electric power lines, flying birds, insects, etc. The rest of the points are treated as vegetation and used for modeling and mapping. Within each of these plots, lidar metrics are calculated by calculating statistics such as mean, standard deviation, skewness, percentiles, quadratic mean, etc. Multiple commercial lidar systems for unmanned aerial vehicles are currently on
7548-408: The intensity of the returned signal. The name "photonic radar" is sometimes used to mean visible-spectrum range finding like lidar, although photonic radar more strictly refers to radio-frequency range finding using photonics components. A lidar determines the distance of an object or a surface with the formula : where c is the speed of light , d is the distance between the detector and
7650-461: The largest number of satellites operated by US-based company Planet Labs . Most Earth observation satellites carry instruments that should be operated at a relatively low altitude. Most orbit at altitudes above 500 to 600 kilometers (310 to 370 mi). Lower orbits have significant air-drag , which makes frequent orbit reboost maneuvers necessary. The Earth observation satellites ERS-1, ERS-2 and Envisat of European Space Agency as well as
7752-540: The laser is limited, or an automatic shut-off system which turns the laser off at specific altitudes is used in order to make it eye-safe for the people on the ground. One common alternative, 1,550 nm lasers, are eye-safe at relatively high power levels since this wavelength is not strongly absorbed by the eye. A trade-off though is that current detector technology is less advanced, so these wavelengths are generally used at longer ranges with lower accuracies. They are also used for military applications because 1,550 nm
7854-441: The laser repetition rate (which controls the data collection speed). Pulse length is generally an attribute of the laser cavity length, the number of passes required through the gain material (YAG, YLF , etc.), and Q-switch (pulsing) speed. Better target resolution is achieved with shorter pulses, provided the lidar receiver detectors and electronics have sufficient bandwidth. A phased array can illuminate any direction by using
7956-659: The laser, typically on the order of one microjoule , and are often "eye-safe", meaning they can be used without safety precautions. High-power systems are common in atmospheric research, where they are widely used for measuring atmospheric parameters: the height, layering and densities of clouds, cloud particle properties ( extinction coefficient , backscatter coefficient, depolarization ), temperature, pressure, wind, humidity, and trace gas concentration (ozone, methane, nitrous oxide , etc.). Lidar systems consist of several major components. 600–1,000 nm lasers are most common for non-scientific applications. The maximum power of
8058-459: The launch of the first artificial satellite, Sputnik 1 , by the Soviet Union on October 4, 1957. Sputnik 1 sent back radio signals, which scientists used to study the ionosphere . The United States Army Ballistic Missile Agency launched the first American satellite, Explorer 1 , for NASA's Jet Propulsion Laboratory on January 31, 1958. The information sent back from its radiation detector led to
8160-415: The major sea floor mapping program. The mapping yields onshore topography as well as underwater elevations. Sea floor reflectance imaging is another solution product from this system which can benefit mapping of underwater habitats. This technique has been used for three-dimensional image mapping of California's waters using a hydrographic lidar. Airborne lidar systems were traditionally able to acquire only
8262-464: The market. These platforms can systematically scan large areas, or provide a cheaper alternative to manned aircraft for smaller scanning operations. The airborne lidar bathymetric technological system involves the measurement of time of flight of a signal from a source to its return to the sensor. The data acquisition technique involves a sea floor mapping component and a ground truth component that includes video transects and sampling. It works using
8364-620: The maximum depth. Turbidity causes scattering and has a significant role in determining the maximum depth that can be resolved in most situations, and dissolved pigments can increase absorption depending on wavelength. Other reports indicate that water penetration tends to be between two and three times Secchi depth. Bathymetric lidar is most useful in the 0–10 m (0–33 ft) depth range in coastal mapping. On average in fairly clear coastal seawater lidar can penetrate to about 7 m (23 ft), and in turbid water up to about 3 m (10 ft). An average value found by Saputra et al, 2021,
8466-625: The most versatile satellite platforms in the industry. The Star Bus family has been employed in both commercial and governmental projects, including the SES and Intelsat communications constellations. The first satellite program based on the Star Bus platform, developed by Thomas van der Heyden for the Indonesian Direct Broadcast program IndoVision, was IndoStar-1 , which was launched in November 1997. This article about one or more spacecraft of
8568-400: The new system will lower costs by not requiring a mechanical component to aim the chip. InGaAs uses less hazardous wavelengths than conventional silicon detectors, which operate at visual wavelengths. New technologies for infrared single-photon counting LIDAR are advancing rapidly, including arrays and cameras in a variety of semiconductor and superconducting platforms. In flash lidar,
8670-402: The object or surface being detected, and t is the time spent for the laser light to travel to the object or surface being detected, then travel back to the detector. The two kinds of lidar detection schemes are "incoherent" or direct energy detection (which principally measures amplitude changes of the reflected light) and coherent detection (best for measuring Doppler shifts, or changes in
8772-522: The other hand, emits energy in order to scan objects and areas whereupon a sensor then detects and measures the radiation that is reflected or backscattered from the target. RADAR and LiDAR are examples of active remote sensing where the time delay between emission and return is measured, establishing the location, speed and direction of an object. Remote sensing makes it possible to collect data of dangerous or inaccessible areas. Remote sensing applications include monitoring deforestation in areas such as
8874-504: The phase of the reflected light). Coherent systems generally use optical heterodyne detection . This is more sensitive than direct detection and allows them to operate at much lower power, but requires more complex transceivers. Both types employ pulse models: either micropulse or high energy . Micropulse systems utilize intermittent bursts of energy. They developed as a result of ever-increasing computer power, combined with advances in laser technology. They use considerably less energy in
8976-848: The presence of hydrothermal copper deposits. Radiation patterns have also been known to occur above oil and gas fields, but some of these patterns were thought to be due to surface soils instead of oil and gas. An Earth observation satellite or Earth remote sensing satellite is a satellite used or designed for Earth observation (EO) from orbit , including spy satellites and similar ones intended for non-military uses such as environmental monitoring , meteorology , cartography and others. The most common type are Earth imaging satellites, that take satellite images , analogous to aerial photographs ; some EO satellites may perform remote sensing without forming pictures, such as in GNSS radio occultation . The first occurrence of satellite remote sensing can be dated to
9078-412: The principle of the inverse problem : while the object or phenomenon of interest (the state ) may not be directly measured, there exists some other variable that can be detected and measured (the observation ) which may be related to the object of interest through a calculation. The common analogy given to describe this is trying to determine the type of animal from its footprints. For example, while it
9180-463: The reflected energy is detected and measured by sensors. Distance to the object is determined by recording the time between transmitted and backscattered pulses and by using the speed of light to calculate the distance traveled. Flash lidar allows for 3-D imaging because of the camera's ability to emit a larger flash and sense the spatial relationships and dimensions of area of interest with the returned energy. This allows for more accurate imaging because
9282-500: The reflection of sunlight is detected by the sensor). Remote sensing can be divided into two types of methods: Passive remote sensing and Active remote sensing. Passive sensors gather radiation that is emitted or reflected by the object or surrounding areas. Reflected sunlight is the most common source of radiation measured by passive sensors. Examples of passive remote sensors include film photography , infrared , charge-coupled devices , and radiometers . Active collection, on
9384-401: The scanned area from the scanner's location to create realistic looking 3-D models in a relatively short time when compared to other technologies. Each point in the point cloud is given the colour of the pixel from the image taken at the same location and direction as the laser beam that created the point. Mobile lidar (also mobile laser scanning ) is when two or more scanners are attached to
9486-463: The signals to video rate so that the array can be read like a camera. Using this technique many thousands of pixels / channels may be acquired simultaneously. High resolution 3-D lidar cameras use homodyne detection with an electronic CCD or CMOS shutter . A coherent imaging lidar uses synthetic array heterodyne detection to enable a staring single element receiver to act as though it were an imaging array. In 2014, Lincoln Laboratory announced
9588-423: The support for teaching on the subject. A lot of the computer software explicitly developed for school lessons has not yet been implemented due to its complexity. Thereby, the subject is either not at all integrated into the curriculum or does not pass the step of an interpretation of analogue images. In fact, the subject of remote sensing requires a consolidation of physics and mathematics as well as competences in
9690-527: The target: from about 10 micrometers ( infrared ) to approximately 250 nanometers ( ultraviolet ). Typically, light is reflected via backscattering , as opposed to pure reflection one might find with a mirror. Different types of scattering are used for different lidar applications: most commonly Rayleigh scattering , Mie scattering , Raman scattering , and fluorescence . Suitable combinations of wavelengths can allow remote mapping of atmospheric contents by identifying wavelength-dependent changes in
9792-449: The term " radar ", itself an acronym for "radio detection and ranging". All laser rangefinders , laser altimeters and lidar units are derived from the early colidar systems. The first practical terrestrial application of a colidar system was the "Colidar Mark II", a large rifle-like laser rangefinder produced in 1963, which had a range of 11 km and an accuracy of 4.5 m, to be used for military targeting. The first mention of lidar as
9894-615: The territory, such as agriculture, forestry or land cover in general. The first large project to apply Landsata 1 images for statistics was LACIE (Large Area Crop Inventory Experiment), run by NASA, NOAA and the USDA in 1974–77. Many other application projects on crop area estimation have followed, including the Italian AGRIT project and the MARS project of the Joint Research Centre (JRC) of
9996-422: The use of satellite - or aircraft-based sensor technologies to detect and classify objects on Earth. It includes the surface and the atmosphere and oceans , based on propagated signals (e.g. electromagnetic radiation ). It may be split into "active" remote sensing (when a signal is emitted by a satellite or aircraft to the object and its reflection is detected by the sensor) and "passive" remote sensing (when
10098-495: The vectors of discrete points while DEM and DSM are interpolated raster grids of discrete points. The process also involves capturing of digital aerial photographs. To interpret deep-seated landslides for example, under the cover of vegetation, scarps, tension cracks or tipped trees airborne lidar is used. Airborne lidar digital elevation models can see through the canopy of forest cover, perform detailed measurements of scarps, erosion and tilting of electric poles. Airborne lidar data
10200-429: The wavelength of light. It has also been increasingly used in control and navigation for autonomous cars and for the helicopter Ingenuity on its record-setting flights over the terrain of Mars . The evolution of quantum technology has given rise to the emergence of Quantum Lidar, demonstrating higher efficiency and sensitivity when compared to conventional lidar systems. Under the direction of Malcolm Stitch,
10302-579: Was developed for military surveillance and reconnaissance purposes beginning in World War I . After WWI, remote sensing technology was quickly adapted to civilian applications. This is demonstrated by the first line of a 1941 textbook titled "Aerophotography and Aerosurverying," which stated the following: "There is no longer any need to preach for aerial photography-not in the United States- for so widespread has become its use and so great its value that even
10404-548: Was no longer an adequate term to describe the data streams being generated by new technologies. With assistance from her fellow staff member at the Office of Naval Research, Walter Bailey, she coined the term "remote sensing". Several research groups in Silicon Valley including NASA Ames Research Center , GTE , and ESL Inc. developed Fourier transform techniques leading to the first notable enhancement of imagery data. In 1999