• Members 30 posts
    April 16, 2023, 2:31 p.m.

    The 48MP capability puts the main back camera of the iPhone 14 Pro and Pro Max into the class of high resolution cameras. Since the 48MP captures come with RAW data in DNG format, there is hope that one can look into the details of the underlying data quality and the micro-contrast of the raw image from the sensor.
    In my opinion, micro-contrast is just a synonym for the information contained in a full MTF or Spatial Frequency Response (SFR) data set.

    Essential tools for this venture are:
    mtfmapper, open source software by Frans van den Bergh.
    dcraw, open source software by Dave Coffin.
    (not used in the end; continuing dcraw development: libraw, open source software by Alex Tutubalin and Iliah Borg + ...)
    exiftool, open source software by Phil Harvey.
    Many thanks to the authors of these essential tools!

    I use my own charts and programs to complete the work based on the above essential tools.
    The background and detailed considerations can be found in my article, written together with Frans van den Bergh:
    "Fast full-field modulation transfer function analysis for photographic lens quality assessment" published in Applied Optics 2021.
    The article contents are freely accessible through this link: www.dora.lib4ri.ch/psi/islandora/object/psi:37151

    One of my standard MTF data slices for a full frame camera is the map of meridional and sagittal MTF across the image field at 40 cyc/mm.
    As the 14 Pro's main sensor has a crop factor of ~3.5, this translates to 140 cyc/mm for the same detail resolution across the image frame.
    iPhone_14Pro_SFR140_48Mpix.png

    Let's see if that worked. The 14 Pro sample here is somewhat decentered, as is seen frequently with consumer optics. This leads to a banana-shaped region of reduced micro-contrast, mainly in the chart for meridional resolution. The partial loss of sharpness can be spotted by eye when looking at the region of the image with lower micro-contrast. It is not easily spotted in typical everyday images, as there are frequently no image details available that allow eyeball sharpness comparisons.


    PNG, 56.6 KB, uploaded by BernardDelley on April 16, 2023.

  • Members 30 posts
    April 16, 2023, 3:14 p.m.

    Observations made while getting to this result:
    dcraw -D -4 -c IMG_2319.DNG > rpix.ppm
    The -D option should output the raw pixel data as grayscale values, as-is. So I expected to see a type P5 (binary PGM) file here. But no!
    pnmtoplainpnm < rpix.ppm > jpix.ppm
    produces a "human readable" ASCII version of the file. I expected file type P2, which would show the pixel values filtered according to the Bayer pattern: 48 Mpix of 10-bit values.
    But for this DNG file I got P6 and P3 type RGB values, respectively. It turned out to be 48 Mpix of 16-bit RGB spanning the range up to near 2^16 - 1, with gaps of 74 and 75 between the occurring numbers. So the 10-bit values are scaled up by 74.xxx, and the de-Bayering stores 3 numbers per pixel instead of one. This explains the huge DNG file sizes. The only advantage is that this DNG file can be viewed with many software programs. Apart from the disadvantage of file size, we have also lost a handle on the Bayer pattern, and we do not know the relation of these interpolated tricolor pixel values to the underlying single-color Bayer pixel values. The contour plot in the opening post suggests that the extracted values are close to believable raw values.
    It is not clear what mtfmapper --bayer green does with a P6 or P3 file. Its output suggests that it emulates "white". But if --bayer red or blue is asked for, a slightly different result comes out. I hope that Frans will chime in and explain.
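
    As an illustration, here is a minimal sketch of how those value gaps can be checked in the extracted 16-bit file. This is my own rough code, assuming the file name rpix.ppm from the dcraw command above and a dcraw-style P6 header; it is not part of any of the tools listed above.

        import numpy as np

        def read_ppm16(path):
            """Read a binary P6 PPM with 16-bit samples into an (H, W, 3) uint16 array."""
            with open(path, "rb") as f:
                assert f.readline().split()[0] == b"P6"
                line = f.readline()
                while line.startswith(b"#"):          # skip possible comment lines
                    line = f.readline()
                width, height = map(int, line.split())
                maxval = int(f.readline())
                assert maxval == 65535                # 16-bit samples expected
                pix = np.frombuffer(f.read(width * height * 3 * 2), dtype=">u2")
            return pix.reshape(height, width, 3)

        img = read_ppm16("rpix.ppm")                  # hypothetical file name from above
        values = np.unique(img)                       # distinct 16-bit values that occur
        gaps = np.diff(values.astype(np.int64))       # spacing between neighbouring values
        sizes, counts = np.unique(gaps, return_counts=True)
        print("distinct values:", values.size, " max value:", int(values.max()))
        print("gap sizes and counts:", dict(zip(sizes.tolist(), counts.tolist())))
        # Gaps of mostly 74 and 75 across the whole range would indicate 10-bit data
        # scaled up by a factor of roughly 74.x into the 16-bit range.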

    A note on libraw: extracting the 4 channels may provide useful results; notably, the G2 channel seems to contain no image (as there is no second green in tricolor data).
    libraw's dcraw_emu did not do the extraction like the original dcraw, and libraw's unprocessed_raw also did not like this DNG file.

    The quad Bayer layout of this camera has been reported to consist of groups of 4 pixels of the same color. To some reporters, and to me, this makes little sense. Of course, alternating Bayer patterns RGGB GRBG could still be extracted when going to 12 Mpix at twice the resolution.
    However, a normal Bayer pattern could just as well be binned 2x2 according to color when all of the 48 Mpix of information is available. The advantage of quad pixels with a normal Bayer pattern would be that the displacement between the centers of gravity of sensitivity of, say, R and B would be half as large as with a 12 Mpix sensor of the same physical size as the 48 Mpix quad sensor.
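
    For illustration, here is a minimal sketch of the reported 2x2 same-color binning. The quad layout and the toy values are assumptions for the example only, not a statement about Apple's actual pipeline; the alternative discussed above, a normal Bayer pattern binned per color, would work analogously with the color sites interleaved at full resolution.

        import numpy as np

        def bin_quad_bayer(mosaic):
            """Average each 2x2 same-color block: a (2H, 2W) quad Bayer mosaic
            becomes an (H, W) standard Bayer mosaic with a quarter of the pixels."""
            h, w = mosaic.shape
            blocks = mosaic.reshape(h // 2, 2, w // 2, 2).astype(np.float64)
            return blocks.mean(axis=(1, 3))

        # toy quad Bayer tile: R R G G / R R G G / G G B B / G G B B, repeated 2x2
        tile = np.array([[10, 10, 20, 20],
                         [10, 10, 20, 20],
                         [20, 20, 30, 30],
                         [20, 20, 30, 30]], dtype=np.uint16)
        mosaic = np.tile(tile, (2, 2))                # 8x8 stand-in for the 48 Mpix mosaic
        print(bin_quad_bayer(mosaic))                 # 4x4 RGGB mosaic of the block averages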

  • Members 78 posts
    April 16, 2023, 3:42 p.m.

    Hi Bernard, interesting. Yes, there are a few incarnations of 48MP quad Bayer; many of the new Sony sensors aimed at video seem to go that way (for instance, I believe, the A7SIII and the Pi V3 Cam). They are effectively 12MP cameras with HDR capabilities: the four contiguous same-color pixels can be read with different timings/gains to extend the captured intensity range by a couple of bits. Depending on the microlens layout, each pixel quartet could be used as a sort of cross-type phase detector for autofocus. Also, some sensors do a 'remosaic', generating an estimated standard Bayer CFA file at 48MP (though losing resolution, so probably unsuitable for photography, which may be why the A7SIII is sold as a 12MP camera). It is very sensor specific.
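
    To illustrate the idea, here is a rough sketch of how four same-color readings taken at different exposure times could be merged into one HDR value. The exposure ratios and the 12-bit clipping level are made up for the example; this is not the actual sensor logic.

        import numpy as np

        FULL_SCALE = 4095.0                           # assumed 12-bit readout clipping level

        def merge_quad_hdr(readings, exposure_ratios):
            """Average the non-clipped readings after normalizing to the longest exposure."""
            readings = np.asarray(readings, dtype=np.float64)
            ratios = np.asarray(exposure_ratios, dtype=np.float64)
            normalized = readings / ratios            # scale each reading to the long exposure
            valid = readings < FULL_SCALE             # drop clipped (saturated) readings
            return normalized[valid].mean() if valid.any() else float(normalized.max())

        # toy quartet: the two long exposures are clipped, the 1/2 and 1/4 exposures still
        # carry information, so the merged value can exceed the single-exposure range
        print(merge_quad_hdr([4095, 4095, 2600, 1300], [1.0, 1.0, 0.5, 0.25]))   # -> 5200.0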

    If the raw data is uncompressed I could try to load the file directly into Matlab to see what we get that way. It's a digital thing: it either works or it doesn't. Should you be interested shoot me a PM with the link.

    Jack

  • Members 30 posts
    April 16, 2023, 8:04 p.m.

    Here is a WeTransfer download link for the 48 Mpix iPhone 14 Pro file (60 MB, IMG_2318.DNG), showing the image underlying the MTF analysis.
    we.tl/t-juG6SsEZCW
    It can be viewed directly in (Mac) Adobe Bridge and opens in ACR, albeit somewhat dark. The binary P6 extract file opens directly in Adobe PS (CS6) and displays as it does on Linux. Ubuntu Linux's Ristretto shows it the way I think it should look, and exactly the same for both dcraw-extracted files, the binary P6 and the ASCII P3.
    The WeTransfer link expires after 7 days, as usual.

    By the way, with the binned 12 Mpix picture one could have 12 Mpix tricolor from the quad pixels. But that is definitely not what it looks like. The 12 Mpix DNG produces SFRs which clearly look oversharpened. The decentering banana shows up too.

  • Members 83 posts
    April 16, 2023, 8:18 p.m.

    This is absolutely unacceptable - I would never buy any phone unless it produced a perfectly centered banana...

  • April 16, 2023, 8:29 p.m.

    I wish I understood all the above - it sounds fascinating.

    So, for the end result - if one took a picture with an iPhone 14 at 48mp, would:

    a) the colour be comparable with (say) a 40mp Fuji
    b) the sharpness from the lens be comparable to a decent camera and lens?
    c) if printing on a large decent printer, would it be acceptable to an art gallery?

    If not, then just saying "I have a 48mp camera" doesn't really help when trying to determine whether a picture from an iPhone is as good as a full frame Canon/Nikon etc. And I am just talking technical merits, not artistic merits (you can get art from a pinhole camera).

    Alan


  • Members 621 posts
    April 16, 2023, 8:34 p.m.

    Wow, this is more depth than I normally bother with. For me, comparing my iPhone 12 Pro Max and the 14 on 8x10 crops from a full size 16”x20” and 24”x30” print test, the iPhone 14 looked a fair bit more detailed. Pretty amazing really.

  • Members 30 posts
    April 16, 2023, 8:52 p.m.

    These modern high-quality phones make good pictures, good for calendars with prints 420 mm (~16") wide. A 24 Mpix or higher crop-1.5 or full frame camera can make still better images, noticeable to the astute observer. The underlying raw image quality can be better; in particular, tone resolution at the pixel level can be fundamentally better with a larger sensor, which becomes more noticeable in lower light. But a good phone has pretty good software that knows a lot about decent post-processing. And the phone does it right away.

    For comparison, I have an MTF map ready from a modest AF-S 24-85mm f/3.5-4.5 G zoom lens on my 45 Mpix D850. It is better centered and provides a bit more micro-contrast in a larger central region. I think this lens is about as it was designed to be. But the micro-contrast falloff toward the far edge is faster than I want to tolerate in an all-purpose mid-range zoom on the D850. So I sold the lens.
    NikonD850+AFS24-85G_24f35_SFR40.png
    24mm F/3.5 at imaging ratio 1:77


    PNG, 58.1 KB, uploaded by BernardDelley on April 17, 2023.

  • Members 244 posts
    April 16, 2023, 10:11 p.m.

    I am truly not smart enough to understand this technical stuff. But, specific to the iPhone 14 Pro Max, how does this technical stuff jibe with real-world prints? My experience is similar to this:

    youtu.be/uo1C3oH3M4g

    And this:

    youtu.be/SAze0NoeuOI

    Does this technical stuff align with that, or does it refute it?

    To me, the only important thing is the (large) print. What does this technical stuff say about a large print?

  • Members 30 posts
    April 17, 2023, 7:40 a.m.

    You certainly can do pretty good large prints with such a phone. In bright light you get the sharpness. Neither video mentioned 48 Mpixel captures, but most probably they used them. The first one said little about the mirrorless counterpart; let's assume at least 24 Mpix. The second one said nothing about the deep shadow detail in the black rocks. The 3.5x crop sensor surely has less to show there than a reasonably current 1.5x crop or full frame camera.
    Does it matter to you? Maybe not; maybe that detail does not add significantly to the impact of the above image. An image may have poor technical quality by the most recently achieved standards and still be a great picture with a lot of visual impact.
    The second video talks about tripods and extra attachment lenses: he wants to maximize the use of the 48 Mpix main camera. Digital zooming and switching to the lesser built-in wide and tele cameras takes a hit in image quality as a consequence. But these lesser cameras are with you whenever you have the phone. And did I get this right, the iPhone 14 is a waterproof camera?

    The camera measured here has some decentering, with a consequent loss of sharpness in part of the image. This is a consequence of fabrication tolerances. It may not matter. How many samples would you need to buy to find a really good one? The loss of sharpness for my sample is easily seen in the test image presented for download above.

  • Members 2 posts
    April 17, 2023, 9:49 a.m.

    If you use the --bayer green option with an RGB input image, MTF Mapper will first convert the RGB image to luminance, and then proceed to process the resulting grayscale image using only the pixels from the specified Bayer subset (green in my example here). Not sure if this makes much sense, but for simple demosaicing algorithms (bilinear and such) this means that --bayer green will end up using only pixels where the green channel was not interpolated (although you do end up adding the interpolated red and blue values, albeit with smaller weights).

    I don't necessarily think that this is the best way to handle this case; perhaps I should just refuse to accept the --bayer option on RGB images. At least that will produce well-defined behaviour :)

    -F

  • Members 30 posts
    April 17, 2023, 11:33 a.m.

    Thanks for the explanation, Frans. So I will actually not use this, but instead fabricate grayscale images of P2 type by extracting from the tricolor data exactly what I want.
    There is also the issue of the quad pixel pattern, which is completely obscured in the 48 Mpix tricolor data. Maybe some information can be fished out by controlled filtering.
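
    For illustration, a minimal sketch of writing such a P2 grayscale file. This is my own rough code; the toy array stands in for whichever channel or Bayer-style subset one extracts from the 16-bit tricolor data.

        import numpy as np

        def write_pgm_ascii(path, gray, maxval=65535):
            """Write a 2-D integer array as a plain ASCII (P2) PGM file."""
            height, width = gray.shape
            with open(path, "w") as f:
                f.write(f"P2\n{width} {height}\n{maxval}\n")
                for row in gray:
                    f.write(" ".join(str(int(v)) for v in row) + "\n")

        # toy stand-in for one extracted channel of the 16-bit tricolor image
        green = (np.arange(12, dtype=np.uint16) * 5000).reshape(3, 4)
        write_pgm_ascii("green_plane.pgm", green)     # hypothetical output for MTF Mapper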

  • Members 244 posts
    April 17, 2023, 11:48 a.m.

    The person in the first video shoots a Canon R5, and those were his comparison shots. Here is the first video in his two-part series, where he takes the images and works on the files:

    youtu.be/vWmlQBw6UU4

  • Members 976 posts
    April 17, 2023, 1:07 p.m.

    Sorry, but it is "Alex Tutubalin and Iliah Borg plus numerous contributions from many people, directly or indirectly".

  • Members 78 posts
    April 17, 2023, 4:16 p.m.

    Matlab does not open it properly as-is. It is in fact a 48MP lossy compressed file with PhotometricInterpretation 'LinearRaw' (instead of the typical 'CFA') and BitDepth of 30 [10,10,10] instead of plain 14, so apparently an RGB image. Here is the LinearizationTable:

    i.imgur.com/mGIwhgD.png

  • Members 78 posts
    April 17, 2023, 4:44 p.m.

    Impressive. However, I only watched the first part of the video and it was totally obvious which is which. There is only so much you can make up with computational photography.

  • Members 30 posts
    April 22, 2023, 9:51 a.m.

    Now I finally understand what you said. Did you read the Linearization table as such from the DNG file, or did you extract it from the data analysis as I did?
    I analyzed the histograms of the previously communicated test file: 75% of the area is "white", 35% of the area is "black", with a contrast ratio of ~20. This contrast ratio is seen in the 16-bit tricolor values extracted from the DNG file by dcraw -D -4 -c IMG_2319.DNG | pnmtoplainpnm
    I find only 868 distinct numerical pixel values in the file, which is consistent with 10-bit data. However, the 10 bits encode the pixel values in a nonlinear fashion, while the 16-bit values in the DNG are a linear representation of the tricolor values. The graphic shows the steps, or gaps, in the tricolor values. It also shows the log-normalized histograms. It is unlikely that my illumination turns out to be very precisely white balanced with respect to the sensor filtering. But that is what the raw histograms suggest. So the iPhone 14 Pro does a white-balancing scaling before presenting its raw values.
    ghisto_iPhone14Pro.png


    PNG, 21.8 KB, uploaded by BernardDelley on April 22, 2023.

  • Members 78 posts
    April 23, 2023, 7:46 a.m.

    Hi Bernard, there is a tag in the original DNG called 'LinearizationTable'; the plot is of that tag's data. Matlab opens raw files as if they were TIFFs, which the vast majority of them are; the difficult part is dealing with non-standard encoding. I assume one could also get a dump of the table above via Exiftool.

    Interesting about the white balancing. If the approach is similar to that shown in my links above, there is a ton of processing that goes into generating such a DNG RGB file: read 48MP in quad Bayer format, possibly reading the four pixels of each same-color quartet at different exposure times for more dynamic range, subtract black levels, tone map the different exposures into a single quad Bayer image of higher bit depth, white balance, guess the missing two channels, and lossy-encode to 10 bits per the table.

    Have you tried taking a couple of black frames and producing a spectrogram? I think it would show all sorts of filtering. Same with MTF off a slanted edge.
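
    Something along these lines, for example; a rough sketch only, where the toy white noise stands in for a real dark-frame extract (which should give an essentially featureless spectrum unless some spatial filtering is present):

        import numpy as np

        def amplitude_spectrum(frame):
            """Return the centered log-amplitude spectrum of a 2-D noise frame."""
            frame = frame.astype(np.float64)
            frame -= frame.mean()                     # remove the DC pedestal
            spec = np.fft.fftshift(np.fft.fft2(frame))
            return np.log1p(np.abs(spec))

        # toy example: replace the white noise with a dark-frame extract to look for
        # ridges or notches that would hint at filtering in the "raw" data
        rng = np.random.default_rng(0)
        noise = rng.normal(size=(256, 256))
        spec = amplitude_spectrum(noise)
        print(spec.shape, float(spec.mean()), float(spec.std()))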

  • Members 30 posts
    April 23, 2023, 12:11 p.m.

    There is obviously lots of processing going into this DNG "RAW". This, and the lack of detailed control, discourage me from sinking more time into black frame analysis. What I found interesting is that the RGB channels have exactly the same sets of 16-bit numbers. And they can be coded as 10-bit numbers with the coding table shown in pink in my figure. The jitteriness comes from rounding errors of +-1 around the monotonic, stepped underlying(?) linearization table. To get there with a linear ADC, one appears to need a 16-bit ADC before the lossy compression to 10 bits. 16 bits(??) is a bit strange for a 1.22 micrometer pixel, but a coarser ADC would show up as more jitteriness. The MTF analysis is not sensitive to such detail, as long as the RAW is presented as linear data. That it is. For the 48 Mpix data the derived MTF looks reasonable. The one shown is a greenish "white", as explained by Frans. If I decompose the tricolor values assuming a straight Bayer pattern for the non-interpolated values (which makes more sense to me), I can derive MTF maps for R, G, B which are in a convincing relation to each other, similar to other lenses I have analyzed. If it really is the quad Bayer arrangement, then I would have picked up some interpolated values where I thought they would be non-interpolated. It would not make much of a difference; not enough that one could decide.

    There is an option for a 12 Mpix RAW from the main sensor. It has no simple relation to the 48 Mpix raw, as a quad-pixel-binned version would have. MTF analysis of this shows the same lens defect, but the 12 Mpix "RAW" shows massive sharpening effects in the MTF.

    Your page about the spectrogram of the Nikon Z7 is interesting. If I try to think like a development engineer on the Nikon team, I come to this hypothesis about how to fill in substitute data for AF sensor pixels: for blue, derive a value from the 4 nearest blue pixels, for example using the average of median+ and median- to fill in for the missing data at the AF site; same procedure at the Gb AF pixel. In the case of blue, you get something similar to median smoothing in the same row or column. In the case of green, the four nearest pixels, Ga, are neither in the same row nor in the same column. Thus for green there is no correlation in row or column values showing up in the spatial noise spectrum. Such a local hypothesis can be tested by analyzing the (AF) pixel neighborhoods, to see if the values are consistent with the hypothesis or if there are sites which break the rule. (I used such an approach to find the hot pixel suppression algorithm, see at dpreview.)
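
    As a rough sketch of such a neighborhood test (my own illustration with hypothetical AF sites and neighbor offsets, not Nikon's actual layout):

        import numpy as np

        def rule_breaking_sites(mosaic, sites, offsets, tol=1):
            """List sites where mosaic[y, x] deviates by more than tol from the average
            of the two middle values (median+ and median-) of its 4 same-color neighbors."""
            breaks = []
            for y, x in sites:
                neigh = sorted(int(mosaic[y + dy, x + dx]) for dy, dx in offsets)
                predicted = (neigh[1] + neigh[2]) / 2.0
                if abs(int(mosaic[y, x]) - predicted) > tol:
                    breaks.append((y, x))
            return breaks

        # toy mosaic with two hypothetical AF sites planted to obey the fill-in rule
        rng = np.random.default_rng(1)
        mosaic = rng.integers(100, 900, size=(64, 64)).astype(np.int64)
        offsets = [(-2, 0), (2, 0), (0, -2), (0, 2)]  # 4 nearest same-color pixels
        sites = [(10, 10), (20, 30)]
        for y, x in sites:
            neigh = sorted(int(mosaic[y + dy, x + dx]) for dy, dx in offsets)
            mosaic[y, x] = round((neigh[1] + neigh[2]) / 2.0)
        print("rule-breaking sites:", rule_breaking_sites(mosaic, sites, offsets))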

  • Members 78 posts
    April 23, 2023, 1:04 p.m.

    Right Bernard.

    Looking at similarities in the sensor world today, the Quad Bayer Sony sensor in the Pi V3 camera introduced in January of this year gives us a few hints about what might be happening under the hood in the 14 Pro. As mentioned earlier, I believe the 48MP file is not meant for day-to-day use. Quad sensors are meant to be mainly used at 1/4 the resolution, 12MP here, in part because shuttling around 4x as much data runs into all sorts of bandwidth, memory and power limitations - think for instance of video. The Sony A7SIII has a 48MP Quad Bayer sensor but produces 12MP raw files and is marketed as a 12MP camera.

    In the Pi V3, the four pixels of the same color in each quad are not binned directly in some modes: they are read at different times, like in exposure bracketing. Stacking is performed on-sensor, producing a single HDR pixel out of the four. That is the way a Pi engineer described it for the V3.

    The V3 gains about 2 bits of dynamic range this way, so if we generously assume that the Pro 14's sensor produces 12 real bits out of the ADC, we could get 14 bits at 12MP out of the sensor after tone mapping. Normalizing this to 16 bits provides additional precision in order to perform the other operations discussed earlier to produce the RGB DNG file (wb, demosaicing etc.). It is easy to surmise that slow-framerate 48MP images could also be produced by additional processing, which I would think one should be able to pick up in dark frames. The V3 for instance produces a heavily processed full-resolution re-mosaiced Bayer 'raw' file out of the Quad data to allow for compatibility with existing raw converters.

    Just thinking aloud though.

  • Members 39 posts
    April 23, 2023, 9:02 p.m.

    Google already claims to use a short exposure taken on their wide-angle lens to remove motion blur (blurred faces get replaced with an AI-upscaled one taken from a crop of the wide lens). If you're getting shutter-speed-bracketed shots from one sensor in "one" exposure, then I'm surprised they didn't use the short-exposure pixels for this.

  • Members 30 posts
    April 24, 2023, 2:12 p.m.

    I repeated my test with the huge test chart. Below, I show the SFR analysis for the 2x zoom mode of the 14 Pro, which is achieved by a 2x crop on the 48 Mpix sensor resulting in a 12 Mpix DNG file. As the sensor area is cropped, I have upped the cut for SFR to 200 cycles/mm, which corresponds to ~30 cycles/mm on a full frame sensor. The lower right corner is still affected by the reduced micro-contrast due to the decentering, which was seen clearly in the map of the full sensor. It is obvious that the sensor data get a different raw preprocessing in the zoomed case. Now the micro-contrast of the raw data in the image center at 200 c/mm exceeds the one at 140 c/mm in the full 48 Mpix case. The system MTF (lens + sensor) actually comes out a tad beyond the diffraction limit of the f/1.78 aperture alone. So clearly, a sharpening method has been invoked. While the MTF measurement is designed to be insensitive to noise, practical image rendering of minuscule detail is bounded by the contrast transfer (as measured) and the total noise muddying the detail.
    My feeling is that quad Bayering with adjacent pixels of the same color would noticeably limit resolution and would not be compatible with the result found below: a realistic pixel-aperture MTF for the combined 2.44 micrometer pixel would lead to more contrast loss than that caused by the lens.
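
    As a quick back-of-the-envelope check of that claim (my own sketch, assuming 550 nm light, the f/1.78 aperture, and an ideal 2.44 micrometer square pixel aperture):

        import numpy as np

        def mtf_diffraction(freq, f_number, wavelength_mm=550e-6):
            """Diffraction-limited MTF of an ideal circular aperture at freq [cyc/mm]."""
            cutoff = 1.0 / (wavelength_mm * f_number)     # ~1020 cyc/mm for f/1.78
            s = np.clip(freq / cutoff, 0.0, 1.0)
            return (2.0 / np.pi) * (np.arccos(s) - s * np.sqrt(1.0 - s ** 2))

        def mtf_pixel_aperture(freq, pitch_mm):
            """|sinc| MTF of a square pixel aperture of width pitch_mm at freq [cyc/mm]."""
            return np.abs(np.sinc(freq * pitch_mm))       # np.sinc(x) = sin(pi x) / (pi x)

        f = 200.0                                         # cyc/mm on the sensor
        print("diffraction MTF, f/1.78 at 200 c/mm:", round(float(mtf_diffraction(f, 1.78)), 3))
        print("2.44 um pixel MTF at 200 c/mm:", round(float(mtf_pixel_aperture(f, 2.44e-3)), 3))
        # roughly 0.75 vs 0.65: the binned 2.44 um pixel alone would already cost more
        # contrast at 200 c/mm than the diffraction of the f/1.78 lens does
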
    iPhone_14Pro_main2x_SFR200_12Mpix.png


    PNG, 50.4 KB, uploaded by BernardDelley on April 24, 2023.