55" LG OLED 55C6P 4K 3D HDR for $1399 only
  • 13 Replies
  • These are absolutely incredible TVs.

    I bought the non-curved B6 model for $1499 last week. Totally in love with it.

  • I bought the E6 model for $1800+tax in a Super Bowl sale at a local shop. I was wary of online sellers with bad ratings and of freight shipping for such an expensive item. In any case, the TV is amazing! Note that the E6 has a better processor than the B6 and does 3D. There's also an EDID hack so that the E6 and other 3D LG TVs look like a 3D monitor to an Nvidia card, so you can get 4K 3D support in a ton of games. There's more info about this on the Nvidia 3D Vision forums.

  • The OLED55C6P is a 10-bit display, so it should make a great grading monitor. Just remember that only AMD GPUs give you 10-bit colour on consumer cards; Nvidia only does 10-bit on Quadro workstation cards.

    I think DX11 does 10-bit in fullscreen on the 1000 series (rough sketch below), but I believe most grading software uses OpenGL. :-(
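
    As a rough, untested sketch (assuming an existing window handle hwnd, which is not from this thread), asking D3D11 for a 10-bit back buffer mostly comes down to requesting DXGI_FORMAT_R10G10B10A2_UNORM on a fullscreen swap chain:

        // Sketch only: request a 10-bit-per-channel fullscreen back buffer with D3D11.
        // Per Nvidia, GeForce cards only output 10-bit to fullscreen DirectX surfaces.
        #include <d3d11.h>
        #pragma comment(lib, "d3d11.lib")

        bool CreateTenBitSwapChain(HWND hwnd, ID3D11Device** device,
                                   ID3D11DeviceContext** context, IDXGISwapChain** swapChain)
        {
            DXGI_SWAP_CHAIN_DESC desc = {};
            desc.BufferCount = 2;
            desc.BufferDesc.Width = 3840;
            desc.BufferDesc.Height = 2160;
            desc.BufferDesc.Format = DXGI_FORMAT_R10G10B10A2_UNORM; // 10 bits per colour channel
            desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
            desc.OutputWindow = hwnd;
            desc.SampleDesc.Count = 1;
            desc.Windowed = FALSE; // exclusive fullscreen, where GeForce allows 10-bit out

            HRESULT hr = D3D11CreateDeviceAndSwapChain(
                nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                nullptr, 0, D3D11_SDK_VERSION,
                &desc, swapChain, device, nullptr, context);
            return SUCCEEDED(hr);
        }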

  • You can switch DaVinci Resolve between OpenGL and CUDA if you have an Nvidia card.

  • I don't think that would make a difference, but I would LOVE to be proven wrong. DX11 is really great tech, but as it's MS-only, I think vendors of cross-platform software try to keep it cross-platform.

    It's one of the very few reasons, if not the only one, to use a Quadro card (for video work, anyway).

  • @alcomposer

    Are you sure that something besides Resolve uses OpenGL?

    DirectX is specifically tailored for GPU-assisted playback, and it is what Windows software must use.

  • The only software I can find that specifically states DX11 is Editmax:

    http://www.sobey.com/en/html/products/Production/

    Here is direct info from Nvidia:

    https://devtalk.nvidia.com/default/topic/985731/gtx-1060-375-26-not-output-10-bit-color/

    "NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. A small number of monitors support 10-bit per color with Quadro graphics cards over DVI. For more information on NVIDIA professional line of Quadro GPUs, please visit: http://www.nvidia.com/page/workstation.html"

  • Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector.

    It is all very fishy, as it is not hard to use DirectX for the UI. Actually, it is hard, and wrong, to use OpenGL buffers because of the way Windows drivers work with video. Maybe it is done only to make things more portable to Mac.

    The thing here is that Nvidia has agreements with leading manufacturers concerning Quadro cards, as well as internal driver switches, since all modern Quadro cards (except sometimes the top model for a while) are no more than rebranded consumer ones.

  • I would love to be proven wrong, but I have never been able to use Nvidia GeForce cards in 10-bit. Now, AMD, that is another story. I just hope VEGA competes well with the 1080 Ti for rendering etc. (not games of course)

    Now that 10-bit OLED panels are becoming more accessible, this may become more important for getting the most out of the technology for colour grading. I know a professional colourist who uses a Sony OLED TV, so it's not that far-fetched.

  • I would love to be proven wrong, but I have never been able to use Nvidia GeForce cards in 10-bit. Now, AMD, that is another story. I just hope VEGA competes well with the 1080 Ti for rendering etc. (not games of course)

    As I understand it, if you turn off the driver signing check and hack the drivers, it can work. But I am not sure.

    Now that 10-bit OLED panels are becoming more accessible, this may become more important for getting the most out of the technology for colour grading. I know a professional colourist who uses a Sony OLED TV, so it's not that far-fetched.

    I do not see any big point in 10-bit monitors for anyone but colorists (even more fun is that most such monitors are wide-gamut OLEDs, so effectively they are only around 8-bit for Rec 709 primaries). Your final delivery is usually 8-bit, except maybe 10-bit HEVC. You do not need a 10-bit monitor to take advantage of a 10-bit source (see the toy example below).
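
    To make that concrete with a toy (and admittedly simplistic) example, where a plain 3x gain stands in for a grade: process a 10-bit source and quantize to 8-bit once at the end, versus processing an already-8-bit copy, and count how many distinct output levels survive:

        // Toy illustration: why a 10-bit source still helps even for 8-bit delivery.
        // A simple 3x shadow boost stands in for a grade; real grading is far more complex.
        #include <cmath>
        #include <cstdio>
        #include <set>

        int main()
        {
            const double gain = 3.0; // aggressive shadow boost
            std::set<int> from10bit, from8bit;

            // Darkest ~10% of the signal, graded from 10-bit code values 0..102.
            for (int v = 0; v <= 102; ++v)
                from10bit.insert((int)std::lround(std::fmin(v / 1023.0 * gain, 1.0) * 255.0));

            // Same range graded from an 8-bit copy, code values 0..25.
            for (int v = 0; v <= 25; ++v)
                from8bit.insert((int)std::lround(std::fmin(v / 255.0 * gain, 1.0) * 255.0));

            std::printf("distinct output levels from 10-bit source: %zu\n", from10bit.size());
            std::printf("distinct output levels from 8-bit source:  %zu\n", from8bit.size());
            return 0; // the 8-bit source yields far fewer levels, i.e. visible banding
        }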

  • I just hope VEGA competes well with the 1080 Ti for rendering etc. (not games of course)

    Most probably these are also false hopes, as in most real tasks (outside of 2-3 mining workloads) AMD chips are inferior to Pascal. VEGA could match the 1070-1080, but it will be in global shortage due to HBM memory issues.

  • Good, that makes me feel better about the 1080 Ti. :-)