
Moderators: Greg Downing, E.J. Peiker

by DChan on Mon Apr 17, 2017 4:36 pm
DChan
Forum Contributor
Posts: 1521
Joined: 09 Jan 2009
I just got the BenQ to replace my dying Asus. I've read the menus, but I think I still have some questions - probably stupid ones - because I'm not sure what's the proper way to use this monitor. Here are my questions:

1. There are several color modes I can choose from, but which one should I choose? Say I'm web browsing - should I use Standard, Adobe RGB, sRGB or something else? What mode should I use when I'm processing photos or doing something else? Are the color modes just there for my convenience to move from one color space to the next?

2. When calibrating using its own software, does it matter what color mode I'm in (I don't think the menu says anything about that)? Will the calibrated result affect all color modes?

3. When I calibrated the Asus with my Spyder, adjusting the brightness level was part of the process. Do I have to adjust the brightness level of the BenQ before or during the calibration?

4. Should I be choosing luminance 120, gamma 2.2, etc. for the calibration, or will the program recommend settings based on the lighting of my room?

5. How do I use third-party calibration software to calibrate this monitor (in what color mode)? If I do, will the calibrated result affect all other modes?

6. Is this a 10-bit monitor? How do I know whether I have a 10-bit monitor or not?

TIA !!
 

by E.J. Peiker on Mon Apr 17, 2017 8:08 pm
User avatar
E.J. Peiker
Senior Technical Editor
Posts: 79130
Joined: 16 Aug 2003
Location: Arizona
Member #:00002
1. I would use Adobe RGB - sRGB is essentially contained within that space so you should not have any problems
2. Calibrate with the monitor in Adobe RGB
3. Either way - the software likely overrides the physical setting on the monitor to give you the desired luminance
4. If you do a lot of printing, I would probably recommend something more like 100/2.2. If not, or if you are in a well-lit room, then 120/2.2 is fine.
5. You wouldn't do that, as you would be double profiling the monitor. Use what comes with the monitor.
6. You'll have to check the spec sheet. To use 10-bit you would need to use the DisplayPort connection and drive it with a graphics card that supports 10-bit out - most non-workstation cards do not, but a few do, like the nVidia 10XX series.
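As a quick back-of-the-envelope illustration of what those extra two bits buy, here's a plain-Python sketch (nothing monitor-specific; the 2560-pixel-wide gradient is just an arbitrary example): quantizing the same smooth ramp at 8 and 10 bits shows how many more distinct grey steps a 10-bit pipeline can actually display.

```python
# Illustration of why extra bits matter: quantize the same smooth
# 0..1 ramp to 8-bit and 10-bit precision and count the distinct
# levels the display could actually show.

def quantize(ramp, bits):
    """Round each value to the nearest n-bit code, mapped back to 0..1."""
    levels = 2 ** bits - 1
    return [round(v * levels) / levels for v in ramp]

# One sample per pixel column of a 2560-pixel-wide gradient.
ramp = [x / 2559 for x in range(2560)]

for bits in (8, 10):
    distinct = len(set(quantize(ramp, bits)))
    print(f"{bits}-bit: {distinct} distinct grey levels")
```

With four times as many levels across the same brightness range, each step is a quarter the size, which is why banding that is visible at 8 bits disappears at 10.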
 

by Royce Howland on Mon Apr 17, 2017 9:15 pm
User avatar
Royce Howland
Forum Contributor
Posts: 11671
Joined: 12 Jan 2005
Location: Calgary, Alberta
Member #:00460
Here is some other input:

1. Don't use any of those colour mode presets. Instead, when you calibrate the display with the BenQ Palette Master Element software, you can associate your calibration settings with a preset called "Calibration 1" or "Calibration 2". These correspond with programmable LUT entries stored within the monitor itself. Normally you'd only have a single one of these defined, typically "Calibration 1", and just use it for everything. Having a second calibration setting is for specialized stuff that most people never need to worry about. All colour managed applications will display colours appropriately on your display when it has been profiled. For web browsing, you want to be using a browser that is colour managed. This has been discussed extensively in other threads.

2. When calibrated using Palette Master Element, your calibration settings will be associated with the presets called "Calibration 1" or "Calibration 2". The other colour modes are baked-in presets that side-step your calibrated settings. They simulate various operating modes like Adobe RGB, sRGB and others. I personally see no reason to use such presets. I calibrate my display and look at all images using the main profile through colour managed applications. I want to see how the image looks, fully, on my display. I don't really care how it looks on my monitor's idea of some simulated versions of some other arbitrary devices. That's what soft proofing is for, and that's a whole other topic.

3. As E.J. said, when using the Palette Master Element software, the display panel brightness is automatically adjusted to hit the desired brightness target you have entered. You don't have to manually adjust anything using this software; it controls all the critical settings.

4. Gamma 2.2 is the standard choice. As for the brightness target, it depends. The software will not recommend a brightness setting based on your room's ambient lighting; the application is too basic for that. But I don't use such functions in other software anyway; I prefer to be in control of the settings. What target to choose depends on what you're doing (e.g. just digital editing, or editing for print matching) and on your ambient lighting conditions. 120 cd/m2 is a standard rule of thumb, but you may need it brighter or darker. I run with 90 cd/m2, myself, because my workroom lighting is intentionally dim and I'm mostly concerned about print matching under typical (dim-ish) print viewing lighting.

5. Agree with E.J. -- I wouldn't use a 3rd party calibration software with this monitor. No other software will be able to drive the proprietary onboard hardware, which defeats the purpose of paying extra for such a monitor. If for some reason you absolutely had to calibrate with a 3rd party application, you'd have to pick one of the preset colour modes like Adobe RGB. The results will be less accurate than with Palette Master Element, and the information can't be stored directly into the monitor's onboard LUT ("Calibration 1" or "Calibration 2" settings). If you change any of the monitor settings including the colour mode, your calibration will be thrown off because the profile is created based on the mode you had the monitor set to when you made the profile.

6. The SW2700PT is a 10-bit display. However, unless you also have a 10-bit capable video card (like the Nvidia Quadro series or certain high-end GeForce cards) driving the monitor over a recent version of DisplayPort (I believe 1.2 or higher), and have all the proper video drivers, AND also have software that supports 10-bit video output (most don't, Photoshop itself on Windows is one of the few that does), then you won't actually be seeing 10-bit imagery.
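As a footnote to points 3 and 4, a "120 cd/m2, gamma 2.2" target simply pins down the display's transfer curve: light output follows (input signal)^2.2, scaled to the chosen peak luminance. A minimal sketch (the function name and sample signal values are mine, purely illustrative; swap in 90 or 100 for the other targets mentioned):

```python
# What a gamma-2.2 calibration target means in luminance terms:
# emitted light = peak * signal ** 2.2 for a normalized 0..1 signal.
PEAK = 120.0   # peak white target in cd/m2 (90 or 100 are common alternatives)
GAMMA = 2.2

def emitted_luminance(signal, peak=PEAK, gamma=GAMMA):
    """Luminance emitted for a normalized 0..1 input signal."""
    return peak * signal ** gamma

for s in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {s:4.2f} -> {emitted_luminance(s):6.1f} cd/m2")
```

Note how a mid-grey signal of 0.5 emits only about a fifth of the peak luminance, not half - that perceptually even spacing is the point of the gamma curve.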
Royce Howland
Visit my web site for photo galleries, my blog and photo tours & workshops
 

by DChan on Tue Apr 18, 2017 12:44 am
DChan
Forum Contributor
Posts: 1521
Joined: 09 Jan 2009
Thank you for the responses, EJ and Royce!

I am using the monitor's Adobe RGB mode and have not changed a thing so far. As far as calibration goes, it sounds like it does not matter if I'm on its "Standard" or "Adobe RGB" - I just need to run Palette Master Element and it will create a profile that suits my needs? This is actually how I used the Asus PA246, which also comes with color presets, but I calibrated it (it has a custom option, so I just selected that when calibrating) and never selected any of the presets.

I also just hooked up the BenQ to my PC with a DisplayPort cable that came with the Asus (now I wonder why I never thought of using that cable; the one that comes with the BenQ does not work with the GTX 1070 which I am using). The only thing is I don't know if it's 1.2 or higher :lol:
 

by Royce Howland on Tue Apr 18, 2017 8:55 am
User avatar
Royce Howland
Forum Contributor
Posts: 11671
Joined: 12 Jan 2005
Location: Calgary, Alberta
Member #:00460
Correct, just run Palette Master Element to calibrate and it will take care of all the details. It will set up the monitor profile to use the maximum colour gamut of the display, not arbitrarily limited to Adobe RGB, sRGB or whatever else. The monitor actually exceeds the gamut of Adobe RGB pretty well everywhere except in the saturated darker blues, where it falls very slightly short in a thin slice. This thin slice is why the monitor is advertised as having 99% coverage of Adobe RGB. In fact it's much more than 100% based on the full volume of the colour space which pushes well out in the greens and reds; but since it falls slightly short in the blues they say 99%.

The GTX 1070 specs state "DisplayPort 1.2 Certified, DisplayPort 1.3/1.4 Ready". The card also claims 10-bit support -- one of the first GeForce series cards to have it. So with current drivers and up-to-date apps, you should be 10-bit video capable. There are some manual settings you'll probably need to make to enable 10-bit. In Photoshop for example, you need to go into Edit > Preferences > Performance > Advanced Settings, and enable the 30-bit checkbox. You may need to do something in the Nvidia driver as well to enable 10-bit since I suspect by default it's switched off.

Since 10-bit mode affects the video display pipeline, you'll want to run calibration after enabling or disabling this mode in the video card drivers.

Once you think you have everything enabled, there are some test images you can look at in Photoshop that should clearly show whether 10-bit is active or not. Here's a web site with one such test image. If 10-bit is active, the image gradient should appear smooth. If 8-bit is still active, you'll see clear banding.
https://imagescience.com.au/knowledge/10-bit-output-support
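If you'd rather generate your own test ramp than download one, a 16-bit gradient can be written with nothing but the Python standard library. PGM is used here only because it needs no imaging libraries; whether your viewer opens 16-bit PGM files is an assumption on my part, so convert to TIFF or PNG in an editor if it doesn't. The filename and dimensions are arbitrary.

```python
import struct

WIDTH, HEIGHT = 2048, 256  # arbitrary; wide enough to make banding visible

def write_gradient_pgm(path):
    """Write a horizontal 16-bit greyscale ramp as a binary PGM (P5) file."""
    # With maxval 65535, P5 stores each pixel as a big-endian 16-bit value.
    header = f"P5\n{WIDTH} {HEIGHT}\n65535\n".encode("ascii")
    # One 16-bit value per column, the same row repeated for every line.
    row = b"".join(struct.pack(">H", x * 65535 // (WIDTH - 1))
                   for x in range(WIDTH))
    with open(path, "wb") as f:
        f.write(header)
        for _ in range(HEIGHT):
            f.write(row)

write_gradient_pgm("gradient16.pgm")
```

Viewed in a colour-managed, 10-bit-capable application, the ramp should look perfectly smooth; on an 8-bit pipeline you'll see distinct vertical bands.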
Royce Howland
Visit my web site for photo galleries, my blog and photo tours & workshops
 

by E.J. Peiker on Tue Apr 18, 2017 2:35 pm
User avatar
E.J. Peiker
Senior Technical Editor
Posts: 79130
Joined: 16 Aug 2003
Location: Arizona
Member #:00002
I use the nVidia GeForce 1080 in 10-bit mode, but sometimes when they issue a driver update it resets to 8-bit and you have to go into the software and set it back to 10-bit - FYI
 

by DChan on Wed Apr 19, 2017 6:02 pm
DChan
Forum Contributor
Posts: 1521
Joined: 09 Jan 2009
So I did the adjustments in Photoshop and Nvidia control panel, downloaded that test image and looked at it in Photoshop...I saw banding. No 10-bit mode.

I also did some more googling, and what I found is that only Nvidia Quadro cards support full 10-bit display. GTX cards support 10-bit in Direct3D but not OpenGL. Here's one of the answers I found (https://www.reddit.com/r/nvidia/comments/55oq07/tech_support_how_do_i_go_about_enabling_10_bit/):

Nvidia consumer class cards (Geforce GTX) can only output 10 bit color in a Direct X11 exclusive fullscreen mode. To get 10 bit color output on the Desktop in a way professional applications use it you need a Quadro card and drivers. Nvidia is blocking this for Geforce cards, regardless of the control panel setting a Geforce card will not process/output proper 10 bit color output.

Anandtech asked Nvidia tech support and got a similar reply.

Some folks on the net said there is a way to trick a GTX card into acting like the Quadro cards, but there seem to be consequences.


Another thing: before I did the adjustments in Photoshop CS6 (non-Cloud), the box "Use OpenCL" was checked. Now the box "Use 30-bit display" is checked but "Use OpenCL" is not. I have tried but I cannot revert the settings to what they were, even though I reversed what I did in the Nvidia control panel. Strange.
 


Powered by phpBB® Forum Software © phpBB Group