New Algorithm Removes Underwater Distortions for Clear Colorful Pictures
Have you ever taken an underwater picture only to have it come out in hues of green and blue? That is because light behaves differently in water.
Although it may just be a bummer when taking personal pics, the phenomenon is a bigger deal in the scientific community, where the inability to capture accurate images of sea life hinders essential research. Now oceanographer and engineer Derya Akkaynak and engineer Tali Treibitz, of the University of Haifa, have created a new algorithm that may just solve all that.
How does the #ocean look without water? Check out the Sea-thru algorithm that removes water from your underwater images here: https://t.co/bHJq73gWVF #CVPR pic.twitter.com/VAWubfSPsQ
— Derya Akkaynak (@dakkaynak) June 14, 2019
Removing water
The best way to describe what the algorithm does is to say it removes the water: it corrects underwater photographs so they look as if they had been taken on dry land.
The researchers have called their new system Sea-thru, and it is nothing short of extraordinary. Unlike Photoshop, which artificially touches up pictures, Sea-thru applies a physically accurate correction.
This image, shared in the researchers' work, truly shows what the algorithm can achieve:

Obtaining true colors
The algorithm has gotten the marine life scientific community excited. “What I like about this approach is that it's really about obtaining true colors,” Pim Bongaerts, a coral biologist at the California Academy of Sciences, told Scientific American. “Getting true color could really help us get a lot more worth out of our current data sets.”

IE had the opportunity to interview Akkaynak on her work.
IE: How did you go about inventing the algorithm?
Akkaynak: I developed the Sea-thru algorithm during my post-doctoral fellowship at the University of Haifa, Marine Imaging Lab, as a result of three years of theoretical and experimental work. Digital cameras (along with underwater housings) have been commercially available really only since the early 1990s, and since then consistently correcting colors in underwater images has been a challenging, open problem in our field. That was the problem I started working on in 2015 when I first joined the University of Haifa.

In time, the reasons for the lack of a robust and consistent color correction algorithm became clear -- researchers were correcting colors in underwater photographs using an equation that describes how light moves in the atmosphere to produce an image on the camera sensor. What happens to light underwater is very different from what happens to it in the air. Once I discovered that, I formulated a (more) physically accurate equation specifically for the ocean, and that equation is the real breakthrough that led to the Sea-thru algorithm. That equation is why the Sea-thru algorithm works better than existing algorithms and has been able to produce the stunning corrections (algorithmically) that you have seen.
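To make that distinction concrete, here is a rough Python sketch contrasting an atmospheric haze model with an underwater-style model in which the direct signal and the backscatter fade at different, per-channel rates. The function names and every coefficient value are illustrative assumptions; the actual Sea-thru equation is given in the researchers' paper.

```python
import numpy as np

def atmospheric_model(J, z, beta, B_inf):
    # classic haze model used in older correction work: one transmission
    # term governs both the direct signal and the veiling light
    t = np.exp(-beta * z[..., None])
    return J * t + B_inf * (1.0 - t)

def underwater_model(J, z, beta_D, beta_B, B_inf):
    # sketch in the spirit of Akkaynak's point: underwater, the direct
    # signal and the backscatter decay with different, per-channel
    # (wavelength-dependent) coefficients, so one term cannot stand in for both
    t_direct = np.exp(-beta_D * z[..., None])      # direct-signal transmission
    back = 1.0 - np.exp(-beta_B * z[..., None])    # backscatter build-up
    return J * t_direct + B_inf * back

# toy example: a 2x2 gray scene seen through 3 m of water, with made-up
# coefficients (red fades fastest underwater)
J = np.full((2, 2, 3), 0.5)
z = np.full((2, 2), 3.0)
I = underwater_model(J, z,
                     beta_D=np.array([0.40, 0.12, 0.10]),
                     beta_B=np.array([0.30, 0.15, 0.12]),
                     B_inf=np.array([0.05, 0.20, 0.30]))
```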

IE: How does the algorithm work?
Akkaynak: The only non-standard piece of information it requires is a ‘distance map’, which tells us the distance of each object in the scene from the camera. Other than that, it works on raw RGB images taken under natural light. It does not need a color chart in the images. It is not an AI algorithm, so there are no neural networks or training involved, either.

There are different ways of obtaining a distance map. We estimate it using multiple images of the scene. You can also use a stereo camera setup and obtain distance from a single image pair, without needing multiple images. Once the algorithm has distance, it estimates all the necessary parameters for removing the “fog” and restoring colors based on the equation I mentioned above.
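To give a flavor of those two steps, here is a hedged Python sketch of such a correction: it subtracts an estimated backscatter ("fog") term and then undoes the range-dependent attenuation of the direct signal. The function, its parameters, and the numbers in the example are illustrative assumptions; Sea-thru itself estimates these quantities from the image and the distance map rather than taking them as inputs.

```python
import numpy as np

def remove_water(I, z, beta_D, beta_B, B_inf):
    # hypothetical inversion sketch: strip the estimated backscatter,
    # then undo the per-channel, range-dependent attenuation; Sea-thru
    # estimates these coefficients itself -- here they are passed in
    z = z[..., None]                               # broadcast range over RGB
    backscatter = B_inf * (1.0 - np.exp(-beta_B * z))
    direct = np.clip(I - backscatter, 1e-6, None)  # de-fogged direct signal
    J = direct * np.exp(beta_D * z)                # undo attenuation with range
    return np.clip(J, 0.0, 1.0)

# toy example: a random 4x4 "scene" at a flat 2 m range, with made-up
# per-channel coefficients
I = np.random.rand(4, 4, 3)
z = np.full((4, 4), 2.0)
J = remove_water(I, z,
                 beta_D=np.array([0.40, 0.12, 0.10]),
                 beta_B=np.array([0.30, 0.15, 0.12]),
                 B_inf=np.array([0.05, 0.20, 0.30]))
```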

IE: What applications do you foresee for the algorithm?
Akkaynak: Sea-thru already works on video, which is very cool, because it takes away the need for multiple images, since video frames are inherently multiple images of the same scene. It also takes away the need to carry artificial lights, which means less expense and less gear to carry for many photographers.
But where it will add tremendous value is in automating the analysis of images and video taken by marine scientists. When these images (e.g., surveys of reefs, the seafloor, fish stocks, etc.) are preprocessed with Sea-thru, scientists will be able to use powerful computer vision and machine learning methods to count, identify, segment, and classify animals and other objects in them. Currently, we acquire vast amounts of images, but the majority of analyses are done manually, which is tedious, slow, and expensive.

At the moment Sea-thru works only on images taken under natural light, but we will extend it to the case of artificial light as that’s how the majority of the ocean is explored. And of course, I see it as a module in Photoshop, integrated into consumer cameras, and even diving masks. It’s only a matter of time!
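The survey-automation workflow Akkaynak describes could look roughly like the sketch below, where every frame is color-corrected before being handed to an analysis step. Both functions here are simple stand-ins invented for illustration; neither is part of Sea-thru or of any real survey pipeline.

```python
import numpy as np

def seathru_correct(frame, range_map):
    # placeholder for the color-correction step sketched earlier;
    # it returns the frame unchanged so the example stays runnable
    return frame

def count_bright_objects(frame, threshold=0.8):
    # toy stand-in for a real detector: counts bright pixels as a very
    # crude proxy for "objects of interest" in a survey frame
    gray = frame.mean(axis=-1)
    return int((gray > threshold).sum())

# hypothetical survey workflow: correct every frame, then run the analysis
frames = [np.random.rand(64, 64, 3) for _ in range(3)]  # stand-in survey frames
ranges = [np.full((64, 64), 2.0) for _ in range(3)]     # stand-in range maps
counts = [count_bright_objects(seathru_correct(f, z))
          for f, z in zip(frames, ranges)]
print(counts)
```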