A newly discovered error in the way electronic systems record light casts doubt on data collected by many of the world’s telescopes–and even those off-world, like the Hubble Space Telescope.
WATCH: Miles revisits the time the Hubble had a different technical issue, right after it launched.
An international team of astrophysicists discovered the glitch while looking over old data from the SuperNova Integral Field Spectrograph (SNIFS) instrument on the University of Hawaii's 2.2-meter telescope atop Mauna Kea.
The issue was strange jumps in reported light intensity: faint signals sometimes seemed to be trailed by splotches, and standard background-noise correction software could not suss out the cause.
After some careful sleuthing, the team figured out that there was a hidden feedback loop in the way analog signals were converted into digital code for storage, an issue that could plague most telescopes. The team dubbed the glitch "the binary offset effect," which Boone et al. describe in a paper uploaded to arXiv.
At its core, the problem is with how two electronic components process data–components you may know, if you are a photography enthusiast. A high-end research telescope is much like your off-the-shelf digital camera, just with much more expensive and powerful hardware.
The crucial device for capturing the light the lens funnels in is called a charge-coupled device (CCD). The CCD is a semiconductor chip that behaves in an interesting way when light hits it: if a photon has enough energy, it is absorbed by the silicon and knocks one of the silicon's electrons loose.
This is called the photoelectric effect, the discovery of which earned Albert Einstein his Nobel Prize in Physics in 1921, and it is crucial to many of our current-day semiconductor technologies. Solar panels, for example, rely on the closely related photovoltaic effect to generate electricity, turning a downpour of photons into a steady stream of displaced electrons.
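For the curious, "enough energy" can be checked with a back-of-the-envelope calculation: a photon's energy follows from its wavelength (E = hc/λ), and visible light comfortably clears silicon's band gap of roughly 1.1 electron-volts. A minimal sketch, using standard physical constants (the function name is just for illustration):

```python
# Back-of-the-envelope check: is a visible-light photon energetic enough
# for silicon to absorb it?
PLANCK = 6.626e-34      # Planck's constant, joule-seconds
LIGHT_SPEED = 2.998e8   # speed of light, meters per second
EV = 1.602e-19          # one electron-volt, in joules
SILICON_GAP_EV = 1.12   # silicon's band gap, roughly, in electron-volts

def photon_energy_ev(wavelength_nm):
    """Energy of a photon in electron-volts, from E = hc / wavelength."""
    return PLANCK * LIGHT_SPEED / (wavelength_nm * 1e-9) / EV

for nm in (400, 550, 700):  # violet, green, red
    e = photon_energy_ev(nm)
    print(f"{nm} nm -> {e:.2f} eV, absorbed by silicon: {e > SILICON_GAP_EV}")
```

Even red light at 700 nanometers carries about 1.8 electron-volts, well above silicon's band gap, which is why CCDs work across the whole visible spectrum.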
In a camera or a telescope, the CCD is made up of a grid of these tiny light-sensitive squares, each of which corresponds to a pixel in the final image. You might think that since the semiconductor turns light into a stream of electrons, the recording is already digital, but it isn't: it's still a physical signal and has not been digitized yet.
That's where something called an analog-to-digital converter (ADC) comes in. When light falls on one of those squares, an electric current of proportional intensity is shuttled to the ADC, which measures the intensity of each pixel's electron stream and assigns it a single number that is then stored in binary code.
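To make that concrete, here is a minimal sketch of the digitization step, assuming a simple idealized 16-bit ADC (the function name and parameters are illustrative, not any telescope's actual electronics):

```python
# Illustrative sketch of what an ADC does: map a continuous voltage onto
# one of 2**bits integer levels. A real ADC does this in hardware.
def digitize(voltage, full_scale=1.0, bits=16):
    """Clamp a voltage into [0, full_scale] and quantize it to an integer."""
    levels = 2 ** bits
    voltage = min(max(voltage, 0.0), full_scale)
    return min(int(voltage / full_scale * levels), levels - 1)

value = digitize(0.5)       # a pixel at half of full brightness
print(value)                # 32768
print(bin(value))           # what actually gets stored: 0b1000000000000000
```

Note that the half-brightness pixel comes out as a single one followed by fifteen zeros; as the article explains below, it is exactly these lopsided patterns of ones and zeros that turn out to matter.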
Binary data is sent as a series of electric pulses, sort of like Morse code: a blip means 1, no blip means 0. The binary offset effect shows up when an abnormally high or low number of blips is sent out, that is, with numbers that have either a lot of ones or a lot of zeros when written in binary.
When too much or too little digital signal leaves the ADC, it can disturb the voltage used as a reference, meaning measurements a few pixels down the line will be recorded as slightly brighter or darker than they actually are. That drift is what the scientists dubbed the binary offset effect.
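The mechanism can be illustrated with a deliberately exaggerated toy model, not the actual characterization in Boone et al.: suppose each conversion nudges the shared reference voltage in proportion to how many ones just left the ADC, biasing the pixels read out after it.

```python
# Toy model of the binary offset effect (illustrative only): each readout's
# count of binary ones nudges a shared reference voltage, slightly skewing
# the pixels that follow. The coupling here is wildly exaggerated so the
# effect is visible by eye; in real instruments it is tiny.
COUPLING = 0.5  # hypothetical feedback strength, in counts per extra bit

def read_out(true_counts):
    """Digitize a row of pixels with a reference that drifts per readout."""
    recorded, offset = [], 0.0
    for counts in true_counts:
        value = round(counts + offset)          # measurement, biased by drift
        ones = bin(value).count("1")            # 1-blips that just left the ADC
        offset = COUPLING * (ones - 8)          # drift vs. a 'typical' bit count
        recorded.append(value)
    return recorded

row = [4095, 12, 12, 12]  # one bright pixel (all ones in binary), then faint ones
print(read_out(row))      # [4095, 14, 10, 9] -- the faint pixels wander
```

The bright pixel's long run of binary ones pushes the reference up, the faint pixels that follow read back wrong, and their own skewed bit patterns keep the drift going, which is why the team describes it as a feedback loop.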
Scientists studying bright or big objects, like gamma ray bursts and plumes of icy moons, shouldn’t worry–the effect is dulled by the sheer quantity of light. But astrophysicists dealing with faint light could see a more pronounced impact. Exoplanet hunters, beware!
The way these ADCs work is fairly standardized across telescopes, so the team looked at 22 instruments on renowned telescopes, including the Hubble, the twin Kecks, and the Subaru.
“We find evidence of the binary offset effect in 16 of 22 different instruments that were investigated, indicating that the effect is present in a significant amount of existing astronomical data,” wrote the team.
Though not every finding is under scrutiny, over 15,000 papers have been published using Hubble data alone, so there are bound to be hundreds if not thousands of low-light observations that may have been skewed.
Thankfully, the team also wrote, tested, and released some software that helps correct the issue, which can be applied to the raw data of any affected telescope. In the future, however, ADCs need to be redesigned to make sure newer telescopes, like the giant Thirty Meter Telescope planned for Mauna Kea, do not suffer the same glitch.
WATCH: Miles visits the controversial site of the planned Thirty Meter Telescope.
Banner image credit: NASA/ESA, manipulated by author.