by ronzie on Thu Feb 27, 2014 8:42 pm
ronzie
Forum Contributor
Posts: 459
Joined: 26 May 2011
Location: 40 miles North of Minneapolis, MN, US
I have the P221W with Spectravue. The corrections are placed as stated into the monitor LUT.

The question I have is, should the .icc profile it creates also be used in the editing app software? Does it double up the corrections already in effect in the LUT?
 

by E.J. Peiker on Thu Feb 27, 2014 8:46 pm
E.J. Peiker
Senior Technical Editor
Posts: 86776
Joined: 16 Aug 2003
Location: Arizona
Member #:00002
You should not make your photo editor's color space the monitor color space.  You should select something like Adobe RGB or ProPhotoRGB in your photo editor.
 

by Royce Howland on Thu Feb 27, 2014 9:19 pm
Royce Howland
Forum Contributor
Posts: 11719
Joined: 12 Jan 2005
Location: Calgary, Alberta
Member #:00460
It depends on what you mean by "used in the editing app". If you mean use it as the working colour space, then E.J. is right -- that's a bad move. Always. Whether you have a high-end monitor or a normal one, or multiple monitors of any kind.

But if you mean use it as the output profile for rendering accurate colours to the display, then absolutely you should be using it. Photoshop and Lightroom will do this automatically and not give you any choice, anyway.

The reason it's not a problem is one of the things about monitor ICC profiles that confuses a lot of people. We say "calibration" and "profiling" as if they're one and the same, but they're not. Calibration is about putting the output device into a standard baseline state. This sets things like black point, white point, colour temperature, luminance, a few things like that. A monitor ICC profile can carry calibration data (in the vcgt tag), but only because Apple (ironically) realized an ICC profile was a handy place to stash it. Normally, calibration data is not stored in an ICC profile at all. For example, printer profiles do not contain printer calibration data (often called "linearization" info in the case of printers).
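To make the calibration-vs-profiling split concrete, here's a minimal Python sketch of how a tool can tell whether a profile carries calibration data: it walks the ICC tag table (128-byte header, then a big-endian tag count and 12-byte tag entries) and looks for the vcgt tag. The fake profile bytes at the bottom are made up purely for demonstration, not a valid profile.

```python
import struct

def list_icc_tags(icc_bytes):
    """Return the 4-byte tag signatures in an ICC profile's tag table."""
    # The ICC header is 128 bytes; the tag count follows as a big-endian uint32.
    (count,) = struct.unpack_from(">I", icc_bytes, 128)
    tags = []
    for i in range(count):
        sig, offset, size = struct.unpack_from(">4sII", icc_bytes, 132 + 12 * i)
        tags.append(sig.decode("ascii"))
    return tags

def has_calibration_curves(icc_bytes):
    """True if the profile carries a 'vcgt' (video card gamma table) tag --
    calibration data for a LUT loader, as opposed to the colorimetric
    profiling tags that editing apps use for colour conversion."""
    return "vcgt" in list_icc_tags(icc_bytes)

# Build a minimal fake tag table for demonstration (NOT a valid profile):
header = bytes(128)
entries = [(b"desc", 0, 0), (b"vcgt", 0, 0), (b"rXYZ", 0, 0)]
body = struct.pack(">I", len(entries))
for sig, off, size in entries:
    body += struct.pack(">4sII", sig, off, size)
fake_icc = header + body

print(has_calibration_curves(fake_icc))  # True for this fake profile
```

A profile without vcgt still works fine for colour conversion; it just gives a LUT loader nothing to load.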

The thing that makes use of the calibration data stored in the profile is not actually the editing app like Photoshop. Rather it's a LUT loader utility. The calibration data is read out of the profile, and then loaded into the LUT -- either the one in a high-end monitor (in which case the LUT loader is a proprietary piece of software provided by the monitor manufacturer), or the one in the video card for all other cases (in which case the LUT loader is ColorSync on Mac OS, a calibration tool LUT loader on most Windows systems, or Windows itself if you've configured this new feature on Win7 or Win8).

The main purpose of the ICC profile is to contain profiling data, which can be thought of as characterizing the colour space of the output device: how big the colour gamut is and what shape it has, translation tables for converting image colour data into and out of the device colour space, and rules for how out-of-gamut colours will be treated (the 4 rendering intents, including Perceptual and Relative Colorimetric). This profiling data is what gets used by your editing software -- Photoshop, Lightroom and everything else. The editing app converts image data from the current working colour space (e.g. Adobe RGB 1998) into the monitor colour space, prior to sending the image data down to the video card.
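That working-space-to-monitor-space conversion can be sketched in a few lines. This toy example converts an Adobe RGB (1998) colour to the monitor space via XYZ, using the published sRGB matrices as a stand-in monitor profile (a real monitor profile would supply its own tables); the gamut clipping stands in loosely for Relative Colorimetric behaviour at the gamut edge.

```python
# Published matrices: Adobe RGB (1998) linear RGB -> XYZ (D65),
# and the inverse, XYZ -> linear sRGB (our stand-in "monitor" space).
ADOBE_TO_XYZ = [
    [0.5767, 0.1856, 0.1882],
    [0.2974, 0.6273, 0.0753],
    [0.0270, 0.0707, 0.9911],
]
XYZ_TO_SRGB = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def adobe_to_monitor(rgb):
    """Convert an Adobe RGB (1998) colour to the (stand-in) monitor space.
    Out-of-gamut values are simply clipped here."""
    linear = [c ** 2.2 for c in rgb]          # Adobe RGB decode (gamma ~2.2)
    xyz = mat_vec(ADOBE_TO_XYZ, linear)       # working space -> XYZ
    mon_linear = mat_vec(XYZ_TO_SRGB, xyz)    # XYZ -> monitor linear RGB
    clipped = [min(1.0, max(0.0, c)) for c in mon_linear]
    # sRGB piecewise encode
    return [12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055
            for c in clipped]

# Pure Adobe RGB red lands outside the sRGB gamut and clips to sRGB red:
print([round(c, 3) for c in adobe_to_monitor([1.0, 0.0, 0.0])])
```

The point of the exercise: the editing app does all of this with the profiling tables, before the pixels ever reach the video card LUT.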

Once the image data hits the video card, it is run through the correction curves that are in the video card LUT, if there are any. The image data is then piped out to the monitor. Again, if there are correction curves found there in a monitor LUT, the image data is run through them. Then pixels appear on the screen. All of this is high level, and somebody probably can point out somewhere I'm on a bit of thin ice in specifics; but it should be generally accurate to illustrate the point. :)
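The LUT step itself is nothing mysterious: a per-channel lookup table, 256 entries on an 8-bit card, remapping each input level to a corrected output level. A rough sketch (the 1.1 gamma correction is a made-up example curve, not real calibration data):

```python
# A video card LUT is just a per-channel lookup table: 256 entries for an
# 8-bit card, mapping each input level to a corrected output level.
def build_gamma_lut(gamma, entries=256):
    """An example correction curve -- the kind of data a calibration tool
    loads into the card (or monitor) LUT."""
    return [round(255 * (i / (entries - 1)) ** (1.0 / gamma))
            for i in range(entries)]

def apply_lut(pixels, lut):
    """What the hardware does to every pixel on its way to the monitor."""
    return [lut[p] for p in pixels]

identity = list(range(256))           # a freshly reset card: no correction
correction = build_gamma_lut(1.1)     # hypothetical calibration curve

pixels = [0, 64, 128, 255]
print(apply_lut(pixels, identity))    # unchanged: [0, 64, 128, 255]
print(apply_lut(pixels, correction))  # midtones remapped; endpoints pinned
```

On a high-end monitor the same idea runs inside the display, just with more bits of precision per entry.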

So that's the long answer. The short answer is, there is no "double profiling" when using a high-end monitor like a NEC with its own built-in LUT.

Unless! Yes, there is an "unless". :) If you try to mix 2 monitors on one single video card, and one monitor has a built-in LUT but the other doesn't, then in most cases you will not be able to calibrate both monitors. That's because the only way to calibrate the LUT-less monitor is to load its calibration data into the video card's 8-bit LUT. If the video card only has one LUT (which is the case for most video cards), the second you load any correction curves into the card's LUT you just screwed up the high-end monitor. Even though a NEC has its own built-in LUT, it still receives image data from the video card. When you change the video card LUT, if there's only one LUT in the card, it affects the output to all connected monitors. That's normally a bad thing, and it defeats the purpose of the high-bit LUT in the monitor.
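This "Unless!" case can be modelled in a few lines. The sketch below is a toy, with a made-up gamma tweak standing in for the Viewsonic's correction curves, but it shows why loading curves for one monitor into a single shared LUT changes what every connected monitor receives:

```python
# Toy model: one consumer card, one shared 8-bit LUT, two monitors on it.
class VideoCard:
    def __init__(self):
        self.lut = list(range(256))    # identity: card passes data through

    def output(self, pixel):
        # The same LUT is applied to data heading out EVERY port.
        return self.lut[pixel]

card = VideoCard()
nec_before = card.output(128)          # what the NEC receives initially

# Load correction curves meant for the other monitor into the only LUT:
card.lut = [min(255, round(255 * (v / 255) ** (1 / 1.1))) for v in range(256)]
nec_after = card.output(128)           # the NEC's incoming data changed too

print(nec_before, nec_after)           # the data feeding the NEC is now pre-cooked
```

The NEC's internal high-bit LUT then operates on already-altered data, which is exactly the double-correction problem.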

This is the big reason why you can't calibrate 2 (or more) normal monitors on Windows systems using a single garden variety video card. The card has only one LUT, and there's no place to store the separate correction curves needed for more than one monitor. There are only 3 ways to calibrate multiple displays on Windows that I know of: 1) Use multiple video cards, each connected only to a single monitor. 2) Use a high-end video card that actually has multiple LUT's onboard. (Like an nVidia Quadro card.) 3) Use all high-end monitors each with their own built-in LUT.

Mac OS is different. Since Apple invented the idea of using the ICC profile to stash the calibration data, and had the ColorSync utility to handle loading that data into video card LUT's, I believe Apple since that time spec'ed out video cards that had multiple onboard LUT's. That may have changed in the era of Intel-based Mac's using more open components, I'm not entirely sure...
Royce Howland
 

by ronzie on Fri Feb 28, 2014 12:23 am
ronzie
Forum Contributor
Posts: 459
Joined: 26 May 2011
Location: 40 miles North of Minneapolis, MN, US
Yes, Royce. I am referring to the monitor .icc profile, not the color space profile.

I use PSE which does not have an option for a monitor profile, just a working space profile. Now comes the effect of Windows XP loading the monitor profile. I just have a GeForce card with just one LUT. The NEC is on port II, but I notice in the GeForce software I can specify a default profile for each port in Dual View configuration. I have my ColorMunki Photo created monitor profile set up on Port 1 (Viewsonic VP2365-LED) and the NEC profile from Spectravue set on Port 2 for the NEC P221W.

Since the CMP and NEC loaders can interfere with each other, I always send the calibration from the Spectravue software to the NEC LUTs before editing on the NEC monitor in PSE. I also have a plug-in for PSE that lets me load the monitor profile as well as the printer profile for soft proof mode, which I usually edit in, since my output target is a printer.

Matching seems to go very well.

Paint Shop Pro X3, in its Color Management preferences, lets you pick the monitor profile and, when soft proofing, the printer profile. I have found X3 unstable and run X4 when necessary. I started with X2, which was worse.

In both cases the soft proof image matches a soft proof from QImage on the NEC monitor. QImage recognizes when I am viewing from a different monitor port and prompts.

I am not aware of this NVidia GeForce 6800GT card having separate port LUTs, but it acts like it does as far as the video quality on both monitors goes.

Maybe I'm lucky :)
 

by Royce Howland on Fri Feb 28, 2014 1:32 am
Royce Howland
Forum Contributor
Posts: 11719
Joined: 12 Jan 2005
Location: Calgary, Alberta
Member #:00460
So you are in my "Unless!" situation described above. :)

I'd say there is zero chance that a GeForce 6800GT has more than one LUT. :) Just to clarify again, I believe you meant to say "working colour space" above when you said "color space profile". The latter term doesn't refer to anything. There are only ICC profiles involved here, they are just used for different purposes. Your monitor has an ICC profile created during the calibration & profiling process. Your working colour space in Photoshop Elements is also an ICC profile, such as Adobe RGB 1998.

PSE, like full Photoshop, Lightroom and most other apps, doesn't give an option for setting the monitor profile. It just picks up the one set as the default in the operating system. A few apps like Qimage or Canon DPP give you the option to set a monitor profile but this can be tricky if the wrong one is picked manually.

Back to your video card. Running 2 monitors on that card with Windows XP, I believe there is absolutely no chance that both can be correctly calibrated simultaneously. Even with one monitor being a NEC display, the NEC is still hanging off of the single LUT in the video card. As you say, trying to run both NEC and ColorMunki LUT loaders will conflict. That's because as soon as the Munki calibration data is loaded into the video card LUT it will compromise the display of the NEC monitor by pre-cooking the NEC-bound image data with correction curves designed for the Viewsonic calibration.

In this situation I would strongly recommend disabling the ColorMunki LUT loader. Use your NEC SpectraView monitor for colour critical work, and simply live with the fact that the Viewsonic monitor can't be calibrated at the same time. With the Munki LUT loader disabled you can still leave the Viewsonic profile in place, and Photoshop (or Qimage) can still do the colour space translation work I described in my previous post. But the calibration part can't be done simultaneously for both monitors. Only one or the other can be calibrated at a time. It doesn't seem to make sense to me to be in a state where your Viewsonic is calibrated but the NEC is not... or worse, if the NEC has its calibration sabotaged by changing the video card LUT underneath it.
Royce Howland
 

by bradmangas on Fri Feb 28, 2014 7:57 pm
bradmangas
Forum Contributor
Posts: 278
Joined: 15 Feb 2013
Just remember what E.J. said: you should not make your photo editor's color space the monitor color space. You should select something like Adobe RGB or ProPhotoRGB in your photo editor.
 

by ronzie on Sat Mar 01, 2014 12:28 am
ronzie
Forum Contributor
Posts: 459
Joined: 26 May 2011
Location: 40 miles North of Minneapolis, MN, US
Just to be clear:

Before making the monitor calibration for either one I set the video card to default settings.

I do not have the CMP service running by default. I do not have the Spectravue service running by default. I assume the LUTs are sticky in the NEC monitor.

I have never set the color space standards (Adobe RGB or sRGB) in a monitor profile parameter field in any editor. I know the difference. Apologies if I stated it that way in the wee hours when I was composing.

I have each monitor profile compared against sRGB and Adobe RGB on this site:
http://www.iccview.de/content/view/3/7/lang,en/
by putting the color space in the wire frame and the monitor in the gamut view or reverse. Just a check on quality of the calibration and the monitor quality.

Thanks for all responses.
 

by Royce Howland on Sat Mar 01, 2014 12:44 am
Royce Howland
Forum Contributor
Posts: 11719
Joined: 12 Jan 2005
Location: Calgary, Alberta
Member #:00460
There is a difference between the ColorMunki service (technically a version of the X-Rite Device Service), the ColorMunki quick "photo tray" app, and the ColorMunki LUT loader. The only one to really worry about is the latter; it's called ColorMunki Gamma, and it runs from your Windows Startup folder. If it's in Startup, then it's running when you boot / login, and it's setting the video card LUT from the Munki monitor ICC profile. I suggest that you don't want to do this.

Yes, the NEC LUT is sticky in the monitor. Technically you don't need to run the SpectraView LUT loader utility. I leave mine running. In case the monitor's LUT ever got screwed up, the SV loader will reset it. But so far I've never had the monitor LUT get messed up except when I was fiddling around with things. :)

Monitor calibration software will wipe out any correction curves loaded into the video card LUT, prior to starting any monitor calibration run. So you're safe as far as that goes.

If you're not running the ColorMunki Gamma loader in Startup, then you're about as good as you can get on your config -- the NEC is fully calibrated and profiled, and the ViewSonic is profiled but not calibrated. Though you presumably will have manually adjusted the Viewsonic's brightness during the ColorMunki calibration run, so that's at least something. What likely will be off about the ViewSonic will include color temperature and greyscale neutrality, because there are no correction curves loaded into the video card LUT to adjust these characteristics.
Royce Howland
 
