• Removed user
    Aug. 18, 2023, 7:10 a.m.

    I can understand white-balancing in normal photography well enough but, particularly in the case of true IR or true UV, there is no "white" to balance. That is to say that all the radiation incident upon the sensor after passing through say an 850nm filter on the lens is not actually light and therefore can not be called "white". Whereas, for normal photography, there is a point for any shot on the CIE Chromaticity diagram that is both visible and can be a White Reference, e.g. Daylight D55.

    And yet UV or IR shooters talk about "white-balancing" their work as if it were somehow a necessary part of the work-flow ...

    ... the plot thickens for Full Spectrum because it could and usually does include some visible radiation.

    I suspect that the term "white-balancing" is used incorrectly whenever the spectrum incident upon the sensor is abnormal by choice or design.

  • Members 317 posts
    Aug. 18, 2023, 11:57 a.m.

    I'm not sure it is used any more incorrectly in "full spectrum photography" than in photography in the visible spectrum. In both hyperspectral/multi-spectral imaging and imaging in a given spectral band (e.g., visible, IR, etc.), the spectrum needs to be "balanced" so that meaningful information can be derived about the spatial image produced by reflective and/or emissive radiation in the band(s) of interest. That is, in a visible photograph one wants a red shirt to appear red in the image and a yellow shirt to appear blue.

    The term white balance is also used loosely in radar imaging to mean remapping the values of the reflective image to compensate for atmospheric effects on the transmitted and reflected radiation. In the case of a bistatic Synthetic Aperture Radar imager, the atmospheric effects on the transmitted path and the reflected path can be different.

    en.wikipedia.org/wiki/Hyperspectral_imaging
    opg.optica.org/josaa/fulltext.cfm?uri=josaa-34-7-1085&id=367333

  • Members 534 posts
    Aug. 18, 2023, 12:58 p.m.

    One always has a choice of weighting the capture color channels in the output. Even if the math is not done to normalize an expected "white", there is still a choice of weighting the channels. The weighting is the real phenomenon; "white balance" is simply the most common use for it.

    With purely visible light, one can "white balance" for one white object, and most objects that look equally white in direct human vision should be very close, if there is minimal metamerism. Once you start recording enough invisible IR or UV light, then multiple perceptually "white" objects can greatly diverge from each other.
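    The channel-weighting idea above can be sketched in a few lines. This is an illustrative sketch, not any camera's actual pipeline; the gains and pixel values are made up, and NumPy is assumed:

```python
import numpy as np

def apply_channel_gains(raw_rgb, gains):
    """Scale each color channel by its gain and clip to the valid range.

    White balance is just one use of this weighting: the gains are then
    chosen so a reference patch comes out neutral, but any per-channel
    weighting is the same operation.
    """
    raw_rgb = np.asarray(raw_rgb, dtype=float)
    return np.clip(raw_rgb * np.asarray(gains, dtype=float), 0.0, 1.0)

# A pixel that recorded a warm cast: red-heavy, blue-weak (made-up values).
pixel = np.array([[0.8, 0.5, 0.25]])

# Gains that would neutralize that particular cast (illustrative numbers).
balanced = apply_channel_gains(pixel, [0.625, 1.0, 2.0])
print(balanced)  # each channel now equals 0.5
```

    The same function applies whether the channels recorded visible light, IR, or a mix; only the choice of gains changes.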

  • Removed user
    Aug. 18, 2023, 4:54 p.m.

    Hmmm ...

  • Members 534 posts
    Aug. 18, 2023, 5:40 p.m.

    I bought a UV flashlight recently, and found an interesting illusion. When you point the UV light at some white objects while there is also ambient visible light, they seem to get dark and brown! Obviously, adding any kind of light can't make them darker, but the way the brain adapts makes it look like the flashlight is subtracting blue light from materials like porcelain!

  • Members 114 posts
    Aug. 18, 2023, 6:05 p.m.

    I'm using "white balance" tools either in camera or in post-production so I tend to use that language in my IR/full spectrum photography. It may not be the most technically correct terminology but it communicates the core function well enough.

  • Removed user
    Aug. 18, 2023, 6:28 p.m.

    Athena, do I understand correctly that the use of "white balance tools" is a necessary part of your work-flow?

  • Members 317 posts
    Aug. 18, 2023, 8:06 p.m.

    Somehow a word was omitted. That should read - That is in a visible photograph one wants a red shirt to appear red in the image and a yellow shirt not to appear blue.

    The point is that the process we call white balance in photography in the visible spectrum is a process required in all sorts of imaging. The world appears very different depending on the frequency of the radiation that is illuminating it. One can throw camouflage netting over a tank, and no one can see it with a camera in the visible spectrum. But it will stand out in an X-band radar image, and it will stand out in IR and UV images. Cameras are used by large agricultural operations to determine when crops need to be irrigated, where there are holes in the cover crop, and the ground temperature. Thermal imaging (long-wavelength IR) is a critical tool. In fact, when Fairchild merged with Loral, we kicked off a program with Loral Aerospace (formerly Ford Aerospace) in about 1992 to fly a satellite program that provided support to the Ag industry through a subscription service. Of course the raw collected imagery data has to be "balanced" so that the signatures can be detected and characterized, and then translated into the physical and biological characteristics sought.

    NASA launched the first Landsat in 1972, which used multi-spectral imaging to monitor the health of the earth. There too, the illumination has to be estimated to balance the image so the heat maps and false-color maps have a physical meaning - like the red shirt red and the yellow shirt yellow.

    Today drones are becoming a tool for much of this, as they are a lot less expensive than subscriptions to satellite services.

  • Removed user
    Aug. 18, 2023, 9:30 p.m.

    The meaning of "IR" in the title of this thread was intended to have a max of about 1150nm as limited by the silicon sensors found in our cameras. So, although the talk of longer wavelengths is interesting, it is less relevant to the subject at hand. Similarly, full-spectrum for the purposes of this discussion should be taken to mean about 350-1150nm, so not "all sorts of imaging".

  • Members 114 posts
    Aug. 19, 2023, 3:36 p.m.

    Yes, without white balance tools (or substituting in other methods to balance colour channels) I wouldn't be able to produce the style of work I like. For example, an established technique for true IR (720nm and longer) captures is to white balance off of foliage, either using a post production tool like a white balance eyedropper or by creating a custom white balance in camera.

  • Removed user
    Aug. 19, 2023, 9:37 p.m.

    So, if the said foliage is reflecting lots of true IR and with, say, the much-loved Hoya 720nm on the lens, then we would expect a highly-captured-but-not-blown red channel compared to the much lower capture in the other two. I imagine that attempting to "balance" foliage to look white may not always be easy.
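    The eyedropper-style balancing described above can be sketched as follows. This is a simplified illustration of how a patch-based tool might derive gains, with made-up patch values; real raw converters work in more elaborate color spaces:

```python
import numpy as np

def gains_from_patch(patch_rgb):
    """Derive white-balance gains that map a sampled patch to neutral gray.

    Each channel is scaled so the patch's channel means become equal.
    With a 720 nm IR capture the red channel dominates, so its gain ends
    up well below 1 while the green/blue gains become extreme -- often
    beyond what a temperature+tint slider can reach.
    """
    patch = np.asarray(patch_rgb, dtype=float).reshape(-1, 3)
    means = patch.mean(axis=0)
    target = means.mean()  # balance toward the overall average level
    return target / means

# Illustrative foliage patch from an IR raw: strong red, weak green/blue.
foliage = [[0.90, 0.12, 0.08], [0.88, 0.10, 0.06]]
g = gains_from_patch(foliage)
print(g)  # red gain < 1, green and blue gains >> 1
```

    Applying these gains to the patch itself yields equal channel means, i.e. the foliage renders as neutral gray/white.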

  • Members 114 posts
    Aug. 22, 2023, 7:40 p.m.

    Pretty much. Here's a processed image and a RawDigger histogram from the original raw file:

    230625-ReferenceIR-026.jpg

    230625-ReferenceIR-026-Full-5240x3912.png

    And for reference, straight-out-of-camera processing with Daylight white balance:

    230625-ReferenceIR-026-2.jpg


  • Members 204 posts
    Aug. 23, 2023, 5:43 a.m.

    For UV and IR photography the colors outside of our visible perception are really just imaginary, since we can't actually see them. Thus, using WB is just a way to render the data into visible frequencies. Also, since many of us often simply go B&W with near-infrared (or with fully visible-spectrum sensors, for that matter), even then altering the WB is similar to putting a color filter on your lens (something old-school B&W photographers often did).
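    The color-filter analogy for B&W conversion can be made concrete: weighting the channels before collapsing to one changes the tonal rendering, much as a red filter on the lens would have. A sketch with made-up pixel values and illustrative weights (not any standard conversion):

```python
import numpy as np

def to_mono(rgb, weights):
    """Collapse RGB values to a single channel with a chosen weighting.

    Emphasizing the red channel mimics shooting B&W through a red filter
    (darker skies, brighter IR-reflective foliage); equal weights give a
    plain average.
    """
    rgb = np.asarray(rgb, dtype=float)
    w = np.asarray(weights, dtype=float)
    return rgb @ (w / w.sum())

sky = [0.2, 0.4, 0.9]   # blue-heavy pixel (made-up values)
leaf = [0.9, 0.3, 0.1]  # red/IR-heavy pixel from an IR capture

# "Red filter" weighting versus a plain average:
print(to_mono([sky, leaf], [3, 1, 0]))  # sky darkens, leaf brightens
print(to_mono([sky, leaf], [1, 1, 1]))
```

    Shifting the WB before a mono conversion shifts these effective weights, which is why the analogy holds.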

  • Members 317 posts
    Aug. 23, 2023, 12:11 p.m.

    We can not see IR, or UV for that matter. We can only see a tiny slice of the spectrum. However, there is information at most frequencies. The wonderful color images generated by the Space Telescope Science Institute from the Webb, Hubble, and several other on-orbit cameras are generated by mapping the response in a given filter to a color. It is both an art and a science. The instruments present multiple raw files in different frequency bands, and the information is transformed into an image with the coloring representing the band. That is no different from the "white balance" process, although with non-visible spectroscopy the colors have no absolute meaning. In visible photography one would like white to actually appear as white.

    What is interesting to do is download actual Webb or Hubble data from the STScI and produce your own images.

    outerspace.stsci.edu/display/MASTDOCS/Demos+and+Tutorials
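    The band-to-color mapping described above can be sketched in a few lines. This is a toy illustration with made-up 2x2 arrays standing in for downloaded filter frames, and a simple min-max stretch standing in for the much more careful calibration STScI actually performs:

```python
import numpy as np

def false_color(bands):
    """Map three spectral bands to R, G, B, normalizing each band.

    The per-band normalization plays the role that "white balance" plays
    in visible photography: without it the composite's colors carry no
    consistent meaning. Band order is a choice (e.g. longest wavelength
    mapped to red, as in many Webb/Hubble composites).
    """
    out = []
    for band in bands:
        band = np.asarray(band, dtype=float)
        lo, hi = band.min(), band.max()
        out.append((band - lo) / (hi - lo))  # stretch each band to 0..1
    return np.stack(out, axis=-1)

# Three made-up 2x2 "filter" frames standing in for real FITS data.
long_wave = [[10, 40], [20, 80]]
mid_wave = [[5, 5], [9, 1]]
short_wave = [[0.1, 0.3], [0.2, 0.4]]
img = false_color([long_wave, mid_wave, short_wave])
print(img.shape)  # (2, 2, 3)
```

    Real data from the MAST archive arrives as one FITS file per filter; the principle of stacking normalized bands into color channels is the same.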

  • Members 208 posts
    Aug. 29, 2023, 7:48 p.m.

    White balance is about how the visible output relates to the input in each of the three channels. This applies irrespective of the wavelengths recorded by the channels.
    The white being balanced is output not input so the term is being applied quite correctly :)

    As some or all of the recorded wavelengths are invisible, the combination that gives white is usually up to the photographer to produce the most appealing result, though in some scientific shooting it might be required to show equal energies or some such.

  • Removed user
    Aug. 29, 2023, 8:37 p.m.

    Thanks for the clarification, which makes a lot of sense; I can see now why 'white balancing' for stuff like foliage can require quite drastic means, often far beyond a normal temperature+tint method. It could also explain the popularity of the Kodak AeroChrome look, with red foliage rather than white.

  • Members 208 posts
    Aug. 30, 2023, 2:01 p.m.

    Aerochrome was a colour film with enhanced NIR response. I've never used the film but have seen numerous images from it in my various IR photography books.
    I believe it records IR as red but otherwise records much as normal.
    There are quite a few other colour IR forms that give foliage in colours other than white, the Goldie look is very popular.

    I've heard of PTFE, paper, skin, foliage, clouds & concrete all being used for white balance, with some being balanced without the filter... IMO anything that looks good is valid :)

  • Removed user
    Aug. 30, 2023, 6:08 p.m.

    Agreed. On my modified (sorta full spectrum but CFA left in place) Lumix DMC-G1, I mostly have the WB cranked down to the minimum degs K, irrespective of what's on the lens (I don't use a "visible" lens filter on that camera).

  • Members 861 posts
    Sept. 3, 2023, 1:41 a.m.

    For what it's worth, given that IR/FS is kinda subjective, one of the things I really like about it is that there is no correct white balance. It's open to how you want to craft your image. Sometimes it makes sense to push colors one way or another for whatever reason.

    Speaking as someone who only recently became aware of it, then a fan, then a bit obsessed... We all want the things we can not have. AeroChrome is pretty much something none of us can have anymore. And it looks amazing, interesting, even stunning. Wouldn't you like to try shooting with it?

  • Removed user
    Jan. 18, 2024, 8:01 p.m.

    Not really, I've never processed any kind of film and that look can be closely simulated these days with a combination of lens filters, modified sensor and channel manipulations. See:

    kronometric.org/phot/ir/SD1M%20Kodak%20IR%20film%20simulation.pdf

    Applies to a Sigma DSLR ...
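    The "channel manipulations" part of that simulation boils down to remixing channels with a matrix. The recipe in the linked PDF isn't reproduced here; this is a generic sketch of that kind of manipulation, with the classic IR red/blue swap as the example matrix:

```python
import numpy as np

def channel_mix(img, matrix):
    """Remap color channels with a 3x3 mixing matrix.

    A full-spectrum capture can be pushed toward an AeroChrome-like
    palette by routing the IR-heavy red channel into other output
    channels; the swap matrix below is one simple, common case, not the
    recipe from the linked paper.
    """
    img = np.asarray(img, dtype=float)
    return np.clip(img @ np.asarray(matrix, dtype=float).T, 0.0, 1.0)

# The classic IR "red/blue swap" is just one such matrix:
swap = [[0, 0, 1],
        [0, 1, 0],
        [1, 0, 0]]
pixel = np.array([[0.9, 0.4, 0.1]])
print(channel_mix(pixel, swap))  # -> [[0.1, 0.4, 0.9]]
```

    Off-diagonal values other than 0 and 1 give partial mixes, which is where the various AeroChrome-style looks come from.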

  • Members 208 posts
    Jan. 20, 2024, 12:39 p.m.

    Nice link, that's one I've not seen before. It may have been the paper that inspired the photographer I took the technique from; I didn't get my Foveon camera till 2019, having been after one for some years. I've found the same variation in IR transmission of X1 filters: having got three in different sizes, I find only one works to my liking (similar to the middle IR transmission in the paper), and the others are high-IR and low-IR versions.

    FWIW foveon sensors are VERY different to normal Bayer/CFA based ones. No amount of modifying a standard camera will replicate the results of an IR Foveon.

    Here's one of my first efforts with the SD14, pretty much SOOC:
    live.staticflickr.com/7822/47405347692_011149736c_c.jpg
    wrabness aerochrome small by Mike Kanssen, on Flickr

    User removable dust filter taken out, X1 filter added, camera put on Fluorescent WB & shot JPG.
    Reprocessing was in this case I think restricted to resizing for the web. I have ended up tweaking the hue slightly on other shots with this set-up.