Subject: Re: OpenRAW article: "DNG is not the answer"!


"if an in-camera DNG is responsible for colo(u)r, why couldn't it also have algorithims embedded for sufficently unfolding it against a known standard? ... why couldn't a camera profile be designed to supply transformation matricies that also bring unknown cameras into a relatively known state?"

I'm puzzled - your statement is almost a quotation from the DNG specification! Am I missing the point you are making? (XYZ = CIE XYZ coordinates):

Page 27: ColorMatrix1 defines a transformation matrix that converts XYZ values to reference camera native color space values, under the first calibration illuminant.
Page 47: The transform from XYZ coordinates to camera color space coordinates is defined by a sequence of matrix operations. This transform is inverted to create a transform from camera color space coordinates to XYZ coordinates.

(There is typically also a ColorMatrix2 for the second calibration illuminant, of course. The calibration illuminants are typically "Standard light A" and "D65".)

Here are a couple of in-camera DNG ColorMatrix1 values:

Leica DMR back:
1.7430 -0.9220 -0.0540
-0.4550 1.3330 0.1290
-0.0930 0.2260 0.7820

Ricoh GR Digital:
0.7724 -0.1822 0.0027
-0.5179 1.1992 0.3639
-0.1083 0.1894 0.7509
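
To make the page 47 statement concrete, here is a minimal sketch in Python (using numpy) that inverts the Leica DMR ColorMatrix1 above to map camera-native values to XYZ. It deliberately ignores AnalogBalance, CameraCalibration and white-balance adaptation, which the full DNG transform also involves, and the camera_rgb triple is an illustrative value, not real data.

# A minimal sketch, not the full DNG pipeline: invert ColorMatrix1
# (XYZ -> camera native) to get the camera native -> XYZ direction.
import numpy as np

# ColorMatrix1 for the Leica DMR back, as quoted above
color_matrix_1 = np.array([
    [ 1.7430, -0.9220, -0.0540],
    [-0.4550,  1.3330,  0.1290],
    [-0.0930,  0.2260,  0.7820],
])

# Page 47: the XYZ -> camera transform "is inverted to create a transform
# from camera color space coordinates to XYZ coordinates"
camera_to_xyz = np.linalg.inv(color_matrix_1)

# Apply it to an illustrative linearised camera triple
camera_rgb = np.array([0.40, 0.35, 0.30])
print(camera_to_xyz @ camera_rgb)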

"don't we already know which attributes are most relevant, and couldn't this be appropriately handeled inside the private maker tags?"

Adobe have a good idea which attributes are most relevant, but they handle them via tags using TIFF encoding. Here are some more examples of such tags: BlackLevelDeltaV, WhiteLevel, BestQualityScale, AnalogBalance, BaselineExposure, BaselineNoise, BaselineSharpness, BayerGreenSplit, LinearResponseLimit, AntiAliasStrength, PhotometricInterpretation, CFALayout, DefaultScale, DefaultCropOrigin. Some examples of their values (and several others) are on this page:
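
As a hedged sketch of how a converter might use such tags, here is a linearisation step driven by WhiteLevel (from the list above) plus the related DNG BlackLevel tag. The numbers are illustrative, not taken from any real file, and a real DNG can also carry a LinearizationTable and per-area black level delta corrections that this ignores.

# Sketch of linearisation using DNG-style BlackLevel / WhiteLevel values.
# Illustrative numbers only; a real converter would also honour the
# LinearizationTable and black level delta tags.
import numpy as np

def linearise(raw, black_level=128, white_level=4095):
    """Map raw sensor counts onto the 0..1 range used downstream."""
    raw = raw.astype(np.float64)
    scaled = (raw - black_level) / (white_level - black_level)
    return np.clip(scaled, 0.0, 1.0)

raw_samples = np.array([[120, 900], [2048, 4095]], dtype=np.uint16)
print(linearise(raw_samples))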

"processor side dependencies are one way of dealing with raw data. however they work best with raw processors that are sufficiently advantaged to recognize the sensor technology--and the sensor data".

It is better to think of it as "sensor configuration" or "sensor characteristics" rather than "sensor technology". After all, the technology details (CMOS, CCD, something else) are not important - what matters are the calculations you have to do on the raw numbers to turn them into RGB values in some defined colour space in a rectilinear array.

The DNG parameters identify the sensor configuration. For example, is it Bayer, and if so, which of the 4 possible Bayer origins does it have? Has it got extra colours, and if so, in what pattern? Are the sensor pixels square or rectangular; are they laid out in a rectilinear pattern or in one of a number of possible offset patterns?
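
To show what "which of the 4 possible Bayer origins" means in practice, here is a small sketch using CFARepeatPatternDim and CFAPattern (both DNG and TIFF/EP tags). The RGGB tile below is just one of the four possibilities and is not tied to any particular camera.

# Sketch: how a 2x2 CFA tile tells the converter which colour filter
# covers each photosite. Other cameras use GRBG, GBRG or BGGR tiles.
CFA_REPEAT_PATTERN_DIM = (2, 2)
CFA_PATTERN = ["R", "G",
               "G", "B"]   # row-major 2x2 tile (RGGB origin, illustrative)

def colour_at(row, col):
    """Return the colour filter covering the photosite at (row, col)."""
    tile_rows, tile_cols = CFA_REPEAT_PATTERN_DIM
    return CFA_PATTERN[(row % tile_rows) * tile_cols + (col % tile_cols)]

# The top-left corner of the sensor reveals the Bayer origin
for r in range(2):
    print([colour_at(r, c) for c in range(2)])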

"is this any different from what Nikon Capture does? are we simply displacing one format for another, or are we advocating a truly portable file format with self contained matricies that can bring the raw camera data into a relatively known state for all raw processors?"

As far as I know, NEFs do not contain the same amount of detail of the sensor characteristics as DNGs. But when it comes to what Nikon Capture does, that is a rather different question. Nikon Capture needs to know information like the above. Assuming it doesn't get it from the NEFs, it needs to get it from its own data. (It is possible that Nikon have started to hold some of the above in NEFs. For example, the D200 uses CFAPatternExif, which is similar to DNG's, indeed TIFF/EP's, CFARepeatPatternDim and CFAPattern).

"i'm now curious if this is because ACR 2.4 was resourced to handle all currently known sensor technologies at the time -- or because it can transform unknown sensor technologies to a relatively known state?"

In effect, the former. It appears to have a fairly general implementation for Bayer, multi-colour, and offset-sensor configurations, which were the main configurations at the time, and then uses the DNG parameters to drive its algorithms.
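
By "uses the DNG parameters to drive its algorithms" I mean something like the hypothetical dispatch below. The tag names (CFALayout, CFARepeatPatternDim) are from the DNG specification, but the structure is purely illustrative; it is not a claim about how ACR itself is written.

# Hypothetical sketch: pick a processing path from layout tags in the file,
# rather than hard-coding behaviour per camera model.
def choose_demosaic(tags):
    layout = tags.get("CFALayout", 1)              # 1 = rectangular (the common case)
    pattern_dim = tags.get("CFARepeatPatternDim", (2, 2))

    if layout != 1:
        return "offset-sensor path (staggered layouts such as Super CCD)"
    if pattern_dim == (2, 2):
        return "standard Bayer demosaic"
    return "generic multi-colour CFA demosaic"

print(choose_demosaic({"CFALayout": 1, "CFARepeatPatternDim": (2, 2)}))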

I can't see how it can transform unknown sensor configurations (I've changed that from "technologies") to a relatively known state. I doubt whether anyone knows how to implement a totally general conversion algorithm, and DNG doesn't have the parameters to drive one. For example, if a sensor manufacturer put its pixels into concentric circles, DNG couldn't describe them, and I'm sure ACR 2.4 couldn't handle an upgrade to the DNG specification that COULD describe them. At some point in the future, ACR 2.4 will not be able to cater for such an innovation. But innovations of that sort are so rare that this could be years away.

"does the ACR 2.4 engine contain the code to anti-alias Modul-R files?"

Since ACR 2.4 can convert Modul-R files, it must be able to handle the AA issue. But I don't know what it does, or how well it does it in all cases. (I've seen some noticeable moiré in an A3 print taken with a DMR back, probably after using ACR, and ditto with a D70. But these have been small details in otherwise normal prints, and could perhaps be handled in Photoshop.)

"i sometimes find myself in the untenable position of advocating by way of blind faith. hence my question: is it possible for DNGs to be totally self contained?"

As an engineer, I too would find that position untenable. So I've subjected DNG to as much scrutiny as I can over the last year and a half or more, and written up what I've learned for others to use. The details I've provided in this thread came from probing DNGs from raw files in my collection, not simply from copying what anyone else has said. Ditto the quotes from the DNG specification. (I don't just take an Adobe party-line). The 12 or more pages at the following link are available for anyone else to use:

