Exploring the World of Optical Satellites
Optical satellites are a type of satellite used for Earth observation that capture images of the Earth’s surface using visible and near-infrared light.
These images can be used to monitor changes in the Earth’s surface over time, such as changes in vegetation, land use, and urbanization. Optical satellites can also be used for disaster monitoring and response, as well as for monitoring climate change and environmental degradation.
One of the most widely used sources of optical satellite imagery is the Landsat program, which began in 1972 and is jointly managed by NASA and the United States Geological Survey (USGS).
The Landsat program has collected a vast archive of images of the Earth’s surface, which are freely available to the public. These images have been used for a wide range of applications, including land use mapping, crop monitoring, and wildfire detection.
Other notable optical satellites include the Sentinel-2 mission, part of the European Space Agency’s (ESA) Copernicus program. Sentinel-2 captures high-resolution images of the Earth’s surface in 13 spectral bands, including several visible and near-infrared bands.
The satellite’s images are used for a wide range of applications, including land use mapping, agriculture monitoring, and disaster response.
How do Earth observation satellites capture images of the Earth’s surface using visible and near-infrared light?
Earth observation satellites capture images of the Earth’s surface using sensors that detect different parts of the electromagnetic spectrum. Visible and near-infrared light are two parts of the spectrum that are commonly used for Earth observation.
Optical sensors on Earth observation satellites work by detecting light reflected from the Earth’s surface. When sunlight strikes the Earth’s surface, some of the light is absorbed by the surface, while some is reflected back toward space.
Optical sensors detect the light that is reflected back toward space and use it to create images of the Earth’s surface.
Optical sensors typically detect light in different bands of the electromagnetic spectrum, ranging from visible light (400-700 nanometers) to near-infrared light (700-1100 nanometers).
Each band of light provides different information about the Earth’s surface. For example, visible light is useful for distinguishing between different types of vegetation, while near-infrared light can be used to detect moisture content in vegetation.
To create images of the Earth’s surface, optical sensors capture light in multiple spectral bands and combine them into a single image. These images can be used to monitor changes in the Earth’s surface over time, such as changes in vegetation, land use, and urbanization.
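As an illustration of how bands are combined, here is a minimal sketch (using NumPy, with made-up reflectance values) of the Normalized Difference Vegetation Index (NDVI), a widely used combination of the red and near-infrared bands for assessing vegetation:

```python
import numpy as np

# Hypothetical surface reflectance values (0-1) for a tiny 2x2 scene.
# Healthy vegetation reflects strongly in NIR and weakly in red.
red = np.array([[0.05, 0.40],
                [0.06, 0.35]])
nir = np.array([[0.50, 0.45],
                [0.55, 0.30]])

# NDVI = (NIR - red) / (NIR + red), which ranges from -1 to 1.
# Dense vegetation is typically well above 0.5; bare soil is near 0.
ndvi = (nir - red) / (nir + red)
print(ndvi.round(2))
```

The same pattern — an arithmetic combination of two or more bands — underlies many other spectral indices used to monitor water, snow, or built-up areas.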
Optical satellite images are also used for disaster monitoring and response, as well as for monitoring climate change and environmental degradation.
Wait, what is the electromagnetic spectrum?
The electromagnetic spectrum is a range of all the different types of electromagnetic radiation. It includes all the different forms of energy that travel through space at the speed of light. Electromagnetic radiation consists of oscillating electric and magnetic fields, and it can be thought of as waves of energy.
The electromagnetic spectrum includes a wide range of wavelengths, frequencies, and energies. At one end of the spectrum are the longer-wavelength, lower-frequency, and lower-energy forms of radiation such as radio waves and microwaves.
In the middle of the spectrum are the visible light waves that we can see with our eyes. At the other end of the spectrum are the shorter-wavelength, higher-frequency, and higher-energy forms of radiation such as ultraviolet, X-rays, and gamma rays.
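The relationship between wavelength, frequency, and energy described above follows from two simple formulas: frequency = c / wavelength, and photon energy = h × frequency. A short sketch (with illustrative wavelengths) makes the pattern concrete:

```python
# Shorter wavelength => higher frequency => higher photon energy.
C = 2.998e8    # speed of light, m/s
H = 6.626e-34  # Planck's constant, J*s

def describe(name, wavelength_m):
    freq = C / wavelength_m       # frequency = c / wavelength
    energy = H * freq             # photon energy = h * frequency
    print(f"{name}: {freq:.3e} Hz, {energy:.3e} J per photon")

describe("radio wave (1 m)", 1.0)
describe("green light (550 nm)", 550e-9)
describe("X-ray (1 nm)", 1e-9)
```

Running this shows green light carrying roughly a million times more energy per photon than a 1 m radio wave, and X-rays several hundred times more again.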
The different types of electromagnetic radiation have different properties and interact with matter in different ways. For example, radio waves can pass through walls and are used for communication, while X-rays can penetrate through soft tissue in the body and are used for medical imaging.
Different parts of the electromagnetic spectrum are also used for different types of remote sensing, such as visible and near-infrared light for optical remote sensing and microwaves for radar remote sensing.
The visible spectrum is typically captured using passive sensors, which detect natural light emitted or reflected from the Earth’s surface.
Active sensors, on the other hand, emit their own energy source (such as radar) and measure the response. So, visible light sensors are considered a form of passive remote sensing technology.
Some of the advantages of using the visible spectrum include:
- High spatial resolution: Visible light sensors can provide high-resolution images of the Earth’s surface, allowing for detailed mapping and analysis of land cover, vegetation, and other features.
- Natural color: Visible light sensors capture natural color images, which can be easily interpreted by humans and provide an intuitive way to visualize the Earth’s surface. Computer vision models are still benchmarked against humans’ ability to interpret such images!
- Cost-effective: Visible light is captured passively during the day, when the Sun is already illuminating the Earth’s surface. Because the satellite does not have the burden of illuminating the Earth’s surface itself, it needs less power and is less complex.
Some of the disadvantages of using the visible spectrum
One of the main disadvantages of using the visible spectrum for Earth observation is that visible light can be easily scattered, absorbed, or reflected by the atmosphere and clouds before it reaches the sensor. This can result in incomplete or distorted images, especially in cloudy or hazy conditions.
So you might be thinking, can’t we just automatically identify the clouds? And the answer is: yes, we can … sometimes!
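Operational cloud masks (such as the scene classification shipped with Sentinel-2 products) combine many spectral bands and sophisticated tests. To give a feel for the idea, here is a deliberately naive sketch with NumPy and made-up reflectance values: clouds are bright across the visible bands, so we simply threshold the mean visible brightness:

```python
import numpy as np

# Toy cloud screening: clouds are bright in the visible bands, so one
# crude (and error-prone) approach is to threshold the mean visible
# reflectance. Bright sand, snow, and haze will fool this test, which
# is why real cloud-masking algorithms are far more elaborate.
def naive_cloud_mask(blue, green, red, threshold=0.35):
    brightness = (blue + green + red) / 3.0
    return brightness > threshold  # True where we suspect cloud

blue  = np.array([0.08, 0.45, 0.10])
green = np.array([0.10, 0.48, 0.12])
red   = np.array([0.09, 0.50, 0.30])
print(naive_cloud_mask(blue, green, red))  # [False  True False]
```

Only the second (bright) pixel is flagged; the dark vegetation and moderately bright soil pixels pass through.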
Another disadvantage is that visible light only provides information about the surface features that can be seen with the naked eye, such as the color, texture, and shape of objects.
It does not provide information about other physical properties, such as temperature, moisture content, or chemical composition, that may be important for certain applications.
For more context on what is possible with RGB imagery — images captured in the visible spectrum — listen to these two episodes focused on capturing natural color aerial images.
Visible light can only be captured during the daytime when the Sun is illuminating the Earth’s surface. This limits the availability of data and can make it difficult to monitor certain phenomena, such as nocturnal animal behavior or nighttime weather patterns.
Further reading: satellite orbits
What is atmospheric correction?
Atmospheric correction is the process of removing atmospheric effects from satellite imagery to obtain a true representation of the Earth’s surface.
When light travels from the sun to the Earth, it passes through the atmosphere, which can cause the light to scatter, be reflected, or be absorbed. This can result in the imagery appearing hazy or distorted, which can make it difficult to accurately interpret.
Atmospheric correction algorithms use information about the state of the atmosphere, such as aerosol content, water vapor, and the viewing and illumination geometry, to estimate the effects of the atmosphere on satellite imagery. By removing these effects, the resulting imagery is clearer and more accurate.
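One of the simplest atmospheric correction techniques is dark-object subtraction (DOS). It assumes the darkest pixels in a band (deep water, deep shadow) should have near-zero reflectance, so any residual signal there is haze added by atmospheric scattering, which can be subtracted from the whole band. A minimal sketch with hypothetical top-of-atmosphere values:

```python
import numpy as np

# Dark-object subtraction: treat the darkest observed value in a band
# as the additive haze contribution from atmospheric scattering, and
# subtract it everywhere. A crude first-order correction, not a
# replacement for full radiative-transfer methods.
def dark_object_subtraction(band):
    haze = band.min()                  # darkest pixel = haze estimate
    return np.clip(band - haze, 0, None)

band = np.array([0.12, 0.30, 0.55, 0.14])  # hypothetical TOA values
print(dark_object_subtraction(band))       # [0.   0.18 0.43 0.02]
```

After correction the darkest pixel sits at zero, and the uniform haze offset is removed from every other pixel.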
In conclusion, optical satellites have revolutionized the way we monitor and understand the Earth. These remarkable pieces of technology have opened up a wealth of possibilities for scientific research, resource management, and disaster response, providing us with detailed and accurate images of our planet from space.
As we continue to develop new technologies and applications for optical satellites, the potential for these remarkable tools only continues to grow.
From tracking changes in land use to monitoring natural disasters and assessing the impacts of climate change, optical satellites have become an essential tool for understanding our planet and shaping our future.
Optical Satellites FAQs
What is the resolution of the images captured by optical satellites?
The resolution of these images refers to the smallest discernible feature that can be distinguished by the sensor. The resolution of optical satellite images can vary depending on the satellite and sensor used, but typically ranges from several meters to less than a meter. For example, the Sentinel-2 satellite captures images with a resolution of 10 to 60 meters per pixel, while the WorldView-4 satellite can capture images with a resolution as fine as 30 centimeters per pixel. Higher-resolution images generally allow for more detailed analysis and mapping of the Earth’s surface.
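A quick way to build intuition for these figures is the relationship between resolution (ground sample distance) and image size: at R meters per pixel, an area W kilometers wide spans W × 1000 / R pixels. A small sketch comparing a 10 m band with a hypothetical 0.3 m sensor:

```python
# Ground sample distance (GSD) intuition: at gsd_m meters per pixel,
# an area width_km kilometers across spans width_km * 1000 / gsd_m pixels.
def pixels_across(width_km, gsd_m):
    return int(width_km * 1000 / gsd_m)

# A 10 km wide scene at Sentinel-2's 10 m bands vs. a ~0.3 m
# very-high-resolution sensor:
print(pixels_across(10, 10))   # 1000 pixels across
print(pixels_across(10, 0.3))  # 33333 pixels across
```

Halving the GSD quadruples the pixel count of a scene, which is why very-high-resolution imagery is far more expensive to acquire, store, and process.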
What is the coverage area of optical satellites and how often do they capture images?
The coverage area and frequency of image capture for optical satellites can vary depending on the satellite and its mission. Some satellites are designed to provide global coverage of the Earth’s surface, while others focus on specific regions or applications. The frequency of image capture can also vary, with some satellites capturing images on a daily or weekly basis, while others may capture images less frequently. For example, the Landsat program captures images of the Earth’s surface every 16 days, while the Sentinel-2 program captures images every 5 days.