This seems like it's just color fixing using stereographic distance modeling and a known reference palette. I'm not sure what is new or novel about this? Maybe it's just new to oceanographic photography?
They actually correct by using wavelength-dependent backscatter and attenuation models while accounting for camera spectral sensitivities and the illuminant spectrum. This is much more complex than "just color fixing".
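For context, the standard underwater image-formation model combines a distance-attenuated direct signal with backscatter (veiling light). A minimal sketch of that model and its inversion, with illustrative coefficients (not values from the paper, and ignoring camera spectral sensitivities):

```python
import numpy as np

# Simplified underwater image-formation model, one value per channel:
#   I_c = J_c * exp(-beta_c * z) + B_c * (1 - exp(-beta_c * z))
# beta_c: attenuation coefficient (1/m), B_c: backscatter color at
# infinity, z: distance in meters. All numbers below are illustrative.
beta = np.array([0.40, 0.08, 0.03])   # R, G, B: red decays fastest
B_inf = np.array([0.05, 0.35, 0.60])  # bluish veiling light

def forward(J, z):
    """Simulate the observed color of an object with true color J at distance z."""
    t = np.exp(-beta * z)             # per-channel transmission
    return J * t + B_inf * (1.0 - t)

def invert(I, z):
    """Recover the true color given the observation, distance, and coefficients."""
    t = np.exp(-beta * z)
    return (I - B_inf * (1.0 - t)) / t

J = np.array([0.8, 0.4, 0.2])         # true object color
I = forward(J, 5.0)
print(np.allclose(invert(I, 5.0), J))  # True
```

The key point: inversion needs the distance z and the per-channel coefficients, which is exactly what a global color adjustment doesn't have.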
It might not be "just color fixing", but to my eye at least, the results appear similar to what you'd get if you were to simply white balance in Lightroom (which doesn't always require a color chart either).
It's clearly much better than that. The problem is that, in a medium like water, light of different wavelengths is absorbed at different rates. So the farther the light travels through water, the stronger the effect, and the effect differs by wavelength. You can't just open Photoshop and shift the color levels of the whole picture at once. You actually have to know or estimate the distances to different objects and make adjustments locally based on that.
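A quick sketch of why a single global white balance can't work, using hypothetical per-channel attenuation coefficients (these numbers are illustrative only):

```python
import numpy as np

# Hypothetical attenuation coefficients (1/m) for R, G, B in clear
# ocean water; red is absorbed far faster than blue.
beta = np.array([0.40, 0.08, 0.03])

def attenuate(color, distance_m):
    """Beer-Lambert-style decay: each channel falls off exponentially
    with distance, at its own rate."""
    return color * np.exp(-beta * distance_m)

white = np.array([1.0, 1.0, 1.0])
near = attenuate(white, 2.0)   # white object 2 m away
far = attenuate(white, 8.0)    # white object 8 m away

# A single global gain that restores the near object...
gain = white / near
print(near * gain)  # [1. 1. 1.] -- back to white
# ...leaves the far object still strongly blue-shifted:
print(far * gain)   # red channel ~0.09 vs blue ~0.84
```

One correction per image can only be right for one distance; everything nearer or farther stays wrong, which is why the method has to estimate distance per pixel.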
The article explains that the colour chart was only used in the photos for training the model. Once the model is trained, the chart is no longer needed, so the point of the software is to let you take photos without a reference palette when doing reef surveys.
Inventions often seem obvious or trivial in hindsight. Are you aware of this technique being used anywhere else? Because this seems pretty novel to me.
Yeah, both are common in astrophotography. Color correction using a reference palette is also pretty common in any kind of scientific imaging, though in a lab setting the distance is usually well established and not relevant.
> Once trained, the color chart is no longer necessary. As Akkaynak helpfully explained on Reddit, “All you need is multiple images of the scene under natural light, no color chart necessary.”
On Reddit:
> Just a clarification: the method does NOT require the use of a color chart. That part was not clear in the video. All you need is multiple images of the scene under natural light, no color chart necessary.
What annoyed me about that answer is they didn't explain why they needed the color chart at all. I would assume it's for training some model, which would mean the method might not work out of the box without the chart in different conditions, for example in muddy water.
The chart is needed to validate that the algorithm works: you need known colors in the image as a reference. The chart is the ground truth.
Once the chart is restored to its exact colors, the image can be considered corrected (at least for that distance, illumination, etc.).
If the algorithm brings the chart back to its true colors at several distances and in various conditions, then it can be applied confidently on images without a chart.
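That validation step can be sketched as comparing the corrected chart patches against their known values; the patch colors and tolerance below are made up for illustration:

```python
import numpy as np

# Hypothetical linear-RGB ground-truth values for three chart patches,
# and the algorithm's corrected output for the same patches.
reference = np.array([
    [0.90, 0.90, 0.90],  # white patch
    [0.55, 0.12, 0.10],  # red patch
    [0.10, 0.45, 0.15],  # green patch
])
corrected = np.array([
    [0.88, 0.91, 0.89],
    [0.53, 0.13, 0.11],
    [0.12, 0.44, 0.16],
])

# Per-patch root-mean-square error; validation passes if every patch
# lands within some tolerance (the 0.05 threshold is arbitrary here).
rmse = np.sqrt(((corrected - reference) ** 2).mean(axis=1))
print(rmse < 0.05)  # [ True  True  True ]
```

Repeat that check across distances and water conditions, and you have the evidence needed to trust the algorithm on chart-free images.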