Asus ProArt 32UCX VS Apple Pro Display XDR
  • While awaiting delivery of my Asus PA32UCX-K this afternoon, I thought I’d compare a few of the features and specs of the two displays, bearing in mind that the XDR has not yet been tested.

    The XDR boasts 6K, while the Asus is a 4K monitor; however, only the 2019 Mac Pro and 16-inch MacBook Pro can run the XDR at full 6K resolution. Both the Asus and the Apple are true 10-bit displays.

    Apple claims 500 nits for their display in SDR mode and 1,000 nits sustained and 1,600 nits peak brightness in HDR mode. The Asus can deliver anywhere from 620-656 nits in SDR and from 1,400-1,534 nits maximum brightness in HDR, depending on which reviewer you read.

    The Pro Display XDR has 576 local dimming zones, whereas the Asus has 1,152.

    On his YouTube channel, Marques Brownlee says his Pro Display XDR is virtually silent, while the Asus is supposed to get quite noisy.

    According to Asus, the maximum viewing angle of the PA32UCX-K is 178 degrees both horizontally and vertically (Apple claims a 180-degree viewing angle for the XDR), and according to tests, hue, saturation, brightness, contrast and color temperature remain virtually unchanged even at extreme viewing angles. I expect Apple’s display will be no slouch in this department either.

    As far as connectivity goes, Apple’s monitor has one TB3 port and three USB-C ports. The Asus comes with dual TB3 ports, one DisplayPort and three HDMI 2.0 ports.

    The Apple Pro Display XDR runs $6,500 in the USA when purchased with stand and AppleCare+. Throw in the nano-texture glass option and that figure skyrockets to $7,500.00. In Vietnam, the ProArt 32UCX-K costs half as much ($3,300.00) and comes with stand, a three-year warranty and an X-Rite i1 Display Pro colorimeter (value $250.00 in USA, $325 in VN). With the Asus, you will however have to factor in another grand for Blackmagic’s UltraStudio 4K Mini. Ouch!

  • and according to tests, hue, saturation, brightness, contrast and color temperature remain virtually unchanged even at extreme viewing angles.

    That's total bullshit, the tests I mean. Such things can't remain unchanged on either VA or IPS panels.

    Also note that, for now, HDR and Rec. 2020 displays are totally unsettled territory, with each manufacturer doing its own thing.

    In the end we'll see either dual panels (a la Flanders Scientific) with a good backlight, or micro LED backlights (around 0.5-2 million zones), with peak brightness reaching 5,000 nits.

    Also, the back of the monitor must be a proper heatsink, e.g. aluminium in direct contact with the backlight LEDs.

    Btw, neither the monitor connection nor the content will have any special HDR handling, just a 14-16 bit signal.

    It'll still be very interesting to see your review, very.

  • 1) Color accuracy of the Asus monitor is supposed to be superb, with something like 89% coverage of Rec. 2020.

    No display can currently reproduce the entire Rec. 2020 gamut. Like many HDR specs, it was established with future breakthroughs in technology in mind.

    Covering 100% of the Rec. 2020 color space is a desired goal, not a requirement, just as no consumer television will reach 10,000 nits peak brightness in the next decade - or ever, for all we know. (A rough sketch of how such coverage figures are typically computed appears at the end of this post.)

    2) The Asus absolutely displays HDR content when connected to a Mac via the Blackmagic UltraStudio 4K Mini. Or, at least, it better! hehe

    3) The industry standard to date, the Sony BVM-X300, has a maximum brightness of 1,000 nits, and its SDI ports carry a 10/12-bit signal. LG OLED televisions have been used in finishing Sense8 and other large productions, and they are not particularly bright either; mine peaks at something like 700 nits.

    4) It will be years before affordable micro LED HDR reference monitors come to market.

    5) Flanders Scientific 4K HDR reference monitors start at something like USD 35,000 for their 1,000 nit model. Hardly affordable for your average or even wealthy consumer.

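    As promised above, here is a rough sketch of how such gamut coverage figures are typically computed: take the display's measured red, green and blue primaries and the Rec. 2020 primaries in the CIE 1931 xy chromaticity diagram, and estimate what fraction of the Rec. 2020 triangle falls inside the display's triangle. The DCI-P3 primaries below are only a stand-in for a wide-gamut display (not the Asus's measured values), and the Monte Carlo estimate is just an illustration, not how any reviewer actually measured the 89% figure.

        import random

        # CIE 1931 xy chromaticity coordinates of the primaries (R, G, B).
        REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
        DCI_P3  = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # stand-in "display" gamut

        def point_in_triangle(p, tri):
            """Sign test: p is inside tri if it lies on the same side of all three edges."""
            (ax, ay), (bx, by), (cx, cy) = tri
            px, py = p
            d1 = (bx - ax) * (py - ay) - (by - ay) * (px - ax)
            d2 = (cx - bx) * (py - by) - (cy - by) * (px - bx)
            d3 = (ax - cx) * (py - cy) - (ay - cy) * (px - cx)
            has_neg = d1 < 0 or d2 < 0 or d3 < 0
            has_pos = d1 > 0 or d2 > 0 or d3 > 0
            return not (has_neg and has_pos)

        def sample_in_triangle(tri):
            """Uniform random point inside a triangle."""
            (ax, ay), (bx, by), (cx, cy) = tri
            s = random.random() ** 0.5
            r = random.random()
            u, v, w = 1 - s, s * (1 - r), s * r
            return (u * ax + v * bx + w * cx, u * ay + v * by + w * cy)

        def coverage(display, reference, samples=200_000):
            """Fraction of the reference gamut's xy area covered by the display gamut."""
            hits = sum(point_in_triangle(sample_in_triangle(reference), display)
                       for _ in range(samples))
            return hits / samples

        print(f"P3 coverage of Rec. 2020 (xy area): {coverage(DCI_P3, REC2020):.1%}")
        # Prints roughly 72% - the approximate xy-area coverage of Rec. 2020 by DCI-P3.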
  • @jonpais

    Look at the price of good, big LED panels: such a simplified panel (with an extruded aluminium back) can be made for $70-100 in a robotic factory. It is really high power and very bright, with around 1,200-4,000 LEDs (for 32"). Add a proper controller and you will have a really good HDR monitor using any good IPS panel.

    Micro LED can actually be made affordable very quickly as well. The main issue is that the top firms want to milk the first business niche (they themselves will be the main consumers of micro LEDs at first). Plus the patents need to be exploited fully :-)

    Panel manufacturers are not idiots and want to keep everything to themselves (as it is now), so as not to allow the splitting-the-panel-into-two-parts approach.

    By unsettled territory I mean that it is all changing very fast, and the final standard will be VERY different from present HDR. All of it will just be a 14-16 bit signal (to the monitor), with no special HDR modes.

    OLED is an example of a technology where managers and marketing made all the mess, even though the engineers told them about the issues and said they were unsolvable.

  • Micro LED shipments in 2026 (that's right - six years from now!) will amount to just 0.4 percent of the global flat-panel display market. Smart watches, tablets, cellphones and television sets will comprise the majority of devices incorporating micro LEDs. HDR reference monitors will be the very last segment to utilize the technology. And just like the best of current true HDR reference displays, prices will be measured in the tens of thousands of dollars, not $100 or $200. Keep in mind: worldwide market share of OLED televisions is just 1.6% in 2019. I don't personally know anyone who owns an OLED television! And this in spite of the fact that prices have dropped dramatically.

  • Asus PA32UCX: First Impressions

    Apart from uploading a handful of Sony HLG HDR videos to YouTube last year – mostly out of morbid curiosity – I’d all but given up on HDR, for the simple reason that there were no remotely affordable monitors. That all changed last month when I learned about the Asus ProArt 32UCX at a Blackmagic event here in Ho Chi Minh City. Five weeks later, the Asus was at my doorstep.

    It takes only a few minutes to get the monitor up and ready to go. The display can be adjusted vertically or horizontally. Cable management is well thought-out and eliminates clutter.

    Being able to view a display at extreme angles without changes in hue, saturation, brightness and contrast is important to me for a television set, but less so when it comes to editing and grading, since I’m usually sitting directly in front of the display and don’t have clients looking over my shoulder. Nevertheless, viewing angle neutrality is quite good. The image is not spoilt by reflections the way it is on my glossy iMac. And contrary to what I read on one website, the Asus runs exceptionally quietly.

    The image on the Asus looks a helluva lot crisper and more detailed than the iMac; its black levels are .0026 nits, compared to the iMac’s grayish .5 nits; peak brightness on the PA32UCX exceeds 1,400 nits, whereas the iMac is a paltry 500 nits; and the Asus has 1,152 local dimming zones, giving it exceptional contrast. I’ll be replacing the iMac with a 16″ MacBook Pro in the next couple of weeks.

    [Attached image: Asus vs Apple FCP 2.jpg]
  • The reason production houses spend as much as $40,000 on reference monitors is quite simple: they need to have unwavering confidence that their work is going to be displayed exactly as intended. However, all displays change with time, and the only way to ensure accuracy is to calibrate them on a regular basis. Yet literature on this vital step in the production process is all but nonexistent. As you’ll soon learn, there’s much more to it than simply plugging a calibrator into the USB port of your monitor!

    Enter the PA32UCX-K, “the world’s first 32-inch 4K HDR monitor with peak brightness of 1,200 nits and mini-LED backlighting,” which I picked up solely in order to create HDR content for YouTube. The monitor is already pre-calibrated at the factory to ensure Delta-E (∆E) <1 color accuracy; each monitor sold comes with a detailed calibration report; an X-Rite i1Display Pro colorimeter is included in the box; and calibration software can be downloaded from Asus’ website. So we’re all good to go, right? Well, not exactly. You see, external monitors must be connected to a Mac via an I/O capture device – which in my case is the UltraStudio 4K Mini – in order to bypass the computer’s own color management. The typical chain would work in any post facility as follows:

    Workstation/Mac -> UltraStudio via TB3 -> Calibrated Monitor via HDMI or SDI

    The display in the chain is usually an industry-accepted broadcast monitor like an FSI or TVLogic. In this case, we’re replacing it with an Asus ProArt monitor, which costs less and does more. Typical broadcast monitors only accept video signals and cannot be used as an extension of your computer desktop/workspace – they’re for video signal preview only. The Asus supports multiple inputs and can be used as a desktop extension monitor.

    However, the UltraStudio 4K does not have any calibration capabilities. It is a professional I/O box which allows output of SMPTE- and broadcast-compliant video signals to external monitoring devices. It is important not to allow the PC/Mac OS to manage the color on the video output to the Asus if you want an accurate preview of your colors. Hence the UltraStudio, serving as an I/O bridge between the iMac and the monitor.

    Regarding calibrating with an external calibrator like the X-Rite: this is software-managed, and really only suited to getting your desktop displays accurate; even then, there are some variances due in part to ambient light sources. There is a bit of good news though: Blackmagic has come out with a new product, the Teranex Mini SDI to HDMI 8K HDR (the bad news is, it costs $1,295.00). This is essentially a broadcast converter; however, it features an onboard calibration engine which works with the X-Rite calibrator via USB.

    Remember that, natively, the X-Rite calibrates the computer desktop environment via its own software, and this calibrates the signal going from the computer’s graphics card out to the computer’s displays. You won’t be able to use it to calibrate any monitor not connected to the computer via a traditional graphics output like DisplayPort, DVI or HDMI, simply because it relies on the computer managing the display colors. Once you use an UltraStudio or any broadcast-standard video I/O device, you won’t be able to calibrate the preview monitor connected to these I/O devices, because the computer no longer manages the colors being displayed.

    However, with the new Teranex Mini SDI to HDMI 8K HDR, there is a built-in calibrator and a USB port on the device – all you need to do is connect the X-Rite to the Teranex Mini, run the calibration engine via the device menus, follow the onscreen instructions, and allow the Teranex Mini to read the color profile of your connected monitor. When it’s done, the Teranex Mini will generate a broadcast-accurate 3D LUT for that device, store it onboard, then apply it to the HDMI output to the connected display. It’s important to remember that the i1Display Pro included with the PA32UCX-K is not an HDR calibrator – for that, you’ll need to purchase the i1Display Pro Plus. The chain would look like this:

    iMac -> UltraStudio 4K via TB3 -> Teranex Mini SDI to HDMI 8K HDR via SDI -> Any HDMI Device via HDMI.
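
    To make the 3D LUT step concrete: the correction is stored as an N×N×N grid of output colors, and each incoming pixel is looked up with trilinear interpolation between the eight nearest grid entries. The Python below is only a sketch of the math involved (the Teranex does this in hardware on its HDMI output); the 33-point identity LUT is a hypothetical stand-in for a real calibration LUT.

        import numpy as np

        def apply_3d_lut(rgb, lut):
            """Look up one normalized RGB triple (values 0..1) in a 3D LUT of shape
            (N, N, N, 3), indexed [r, g, b], using trilinear interpolation."""
            n = lut.shape[0]
            pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
            lo = np.floor(pos).astype(int)      # lower grid corner
            hi = np.minimum(lo + 1, n - 1)      # upper grid corner (clamped at the edge)
            f = pos - lo                        # fractional position inside the cell
            out = np.zeros(3)
            # Blend the 8 surrounding grid entries, weighted by distance to each corner.
            for cr in (0, 1):
                for cg in (0, 1):
                    for cb in (0, 1):
                        corner = lut[hi[0] if cr else lo[0],
                                     hi[1] if cg else lo[1],
                                     hi[2] if cb else lo[2]]
                        weight = ((f[0] if cr else 1 - f[0]) *
                                  (f[1] if cg else 1 - f[1]) *
                                  (f[2] if cb else 1 - f[2]))
                        out += weight * corner
            return out

        # A 33-point identity LUT (no correction) leaves colors unchanged; a real
        # calibration LUT would hold the measured corrections instead.
        grid = np.linspace(0.0, 1.0, 33)
        identity_lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
        print(apply_3d_lut([0.25, 0.5, 0.75], identity_lut))  # ~[0.25, 0.5, 0.75]

    Common LUT sizes are 17, 33 or 65 points per axis; the principle is the same regardless of size.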

    Why is it so complicated? Well, if the post facility can afford broadcast-standard SDI monitors costing more than a Tesla, then it’s all fine; they just need the UltraStudio 4K. However, if the post facility is a new start-up with limited funds, or a freelancer setting up his own studio, it would be impossible to justify spending $6,000 on an FSI SDI monitor that does nothing more than display a video preview, however broadcast-accurate it may be. So the next best solution is to add a Teranex Mini SDI to HDMI 8K HDR into the chain and then connect any decent-quality HDMI monitor or even a large TV to be used as an output preview device.

    “This all sounds so complex,” you might say, “wouldn’t I be better off going with the Pro Display XDR?” Well, even with Apple’s gorgeous 6K monitor, you aren’t guaranteed broadcast-accurate colors. This is because the MacOS is still managing output display colors, which are not SMPTE compliant. Although it boasts very high specifications, there is no mention anywhere of it being calibrated to SMPTE specifications, or of the MacOS being able to output SMPTE-compliant video signals.

    There are of course production facilities planning to purchase the XDR display, but they still have the UltraStudio in their device chain connected to a Sony X300 Broadcast Master Monitor, purely for HDR preview and mastering. This SDI monitor alone runs in the neighborhood of USD $35,000. HDR really is the wild west, so unless a post facility is generating revenue from high-end work, it’s advisable to keep things simple. Once you start wanting to get everything HDR ready properly, the investment is going to be insane.

    My takeaways? HDR is a bottomless pit. It’s also endlessly fascinating. Being able to view my Final Cut Pro timeline in HDR on the Asus is a trip. Once you’ve seen shows like the brilliant documentary Chef’s Table or the superb television drama Sense8 on an OLED display, you’ll be clamoring for more HDR content. Would I recommend picking up a $4,000 HDR monitor though? My advice for those who, like me, just want to produce content for YouTube, would be to wait until next year, when there should be several affordable mini-LED HDR1000 compliant laptops on the market.

  • Remember that, natively, the X-Rite calibrates the computer desktop environment via its own software, and this calibrates the signal going from the computer’s graphics card out to the computer’s displays. You won’t be able to use it to calibrate any monitor not connected to the computer via a traditional graphics output like DisplayPort, DVI or HDMI, simply because it relies on the computer managing the display colors. Once you use an UltraStudio or any broadcast-standard video I/O device, you won’t be able to calibrate the preview monitor connected to these I/O devices, because the computer no longer manages the colors being displayed.

    This sounds a little nuts. X-Rite (and Windows) will normally use the GPU's capabilities to apply the LUT. Also note that since your monitor is already calibrated you don't need all this shite; you can just use the X-Rite to periodically check that everything is OK (in reality, modern LED-based monitors have almost no drift over a few years).
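
    For what "using GPU capabilities to apply the LUT" means in practice: desktop calibration usually ends with per-channel 1D correction curves being loaded into the graphics card (the "gamma ramps", stored in the vcgt tag of the ICC profile), and the GPU applies them to everything it outputs. A minimal sketch of that operation, with made-up identity ramps standing in for real measured curves:

        import numpy as np

        def apply_video_lut(frame, ramps):
            """Apply per-channel 1D LUTs (the kind a GPU loads as its 'gamma ramp'
            after desktop calibration) to an 8-bit RGB frame.

            frame : uint8 array of shape (H, W, 3)
            ramps : uint8 array of shape (3, 256), one 256-entry table per channel
            """
            out = np.empty_like(frame)
            for c in range(3):                      # index each channel through its ramp
                out[..., c] = ramps[c][frame[..., c]]
            return out

        # Identity ramps leave the image untouched; calibration software replaces
        # them with correction curves derived from colorimeter measurements.
        identity = np.tile(np.arange(256, dtype=np.uint8), (3, 1))
        frame = np.random.randint(0, 256, (2160, 3840, 3), dtype=np.uint8)
        assert np.array_equal(apply_video_lut(frame, identity), frame)

    This also illustrates why output through a dedicated video I/O box is unaffected by this kind of calibration: that signal does not pass through the GPU's ramps.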

    It’s important to remember that the i1Display Pro included with the PA32UCX-K is not an HDR calibrator – for that, you’ll need to purchase the i1Display Pro Plus

    Are you sure? I am seeing contradictory info:

    https://www.avsforum.com/forum/139-display-calibration/2907966-testing-i1-display-pro-retail-oem-meters-hdr-compliance.html

    From my experience their higher model differs only in firmware that has some delay loops disabled.

    This is because the MacOS is still managing output display colors, which are not SMPTE compliant. Although it boasts very high specifications, there is no mention anywhere of it being calibrated to SMPTE specifications, or of the MacOS being able to output SMPTE-compliant video signals.

    I think that is more your NLE’s task, and Final Cut and Resolve have such settings as far as I know.

  • 1) When using an external monitor connected through an I/O device, proper calibration cannot be achieved by connecting the monitor to the iMac by TB3. I already tried it, and the results were horrible. You must bypass the Mac's color management to calibrate.

    2) LCD values change over time. Eizo is a fairly respectable name in reference monitors, but I could find similar information from other manufacturers if you like. According to Eizo:

    "Under continued use, a monitor's white color temperature and brightness change gradually over time.

    Assume a monitor's color temperature was set to 5,000 K and its brightness to 100 cd/m2 when purchased. Since these values will change with time under continued use, a few months later the white will have become warmer and the brightness will have dimmed, to 4,500 K and 90 cd/m2, respectively.

    For this reason, calibration is important to return the monitor to its original state—i.e., to a color temperature of 5,000 K and a brightness of 100 cd/m2.

    When using a monitor for graphics, stable color can be maintained at all times by conducting calibration at a frequency of once every 200–300 hours.

    An LCD monitor used for graphics should be calibrated at least once every 200–300 hours (in ordinary use, once per month)"

    So it isn't possible to periodically check my monitor. It isn't possible to even check it once. LOL

    3) As far as X-Rite is concerned, I am still waiting for them to respond to my query. At this point in time, it doesn't matter to me, since I'm not about to rush out and spend $1,600.00 for the Teranex (cost in Vietnam). If they say the i1Display Pro is fine for HDR, then that's great - for everyone but me! hehe

  • Proper calibration cannot be achieved by connecting an external monitor to the iMac by TB3. I already tried it, and the results were horrible. You must bypass the Mac's color management to calibrate.

    I am sure there is a way to do it properly without BM hardware.

    Under continued use, a monitor's white color temperature and brightness change gradually over time. Assume a monitor's color temperature was set to 5,000 K and its brightness to 100 cd/m2 when purchased. Since these values will change with time under continued use, a few months later the white will have become warmer and the brightness will have dimmed, to 4,500 K and 90 cd/m2, respectively.

    It is hard to say when that was written.

    A properly constructed LED monitor changes very little, very. There is ONLY one thing that can change - the LEDs and the phosphor on them (and if it is properly done and kept cool, the speed of such change is extremely slow).

    Even very hot COB LED lights change only something like 20-50 K and 1-3% in brightness in one year.

    An LCD monitor used for graphics should be calibrated at least once every 200–300 hours (in ordinary use, once per month)"

    If you are working in a pro grading house - yes, as it is just a way to be safe and to have paper in case of issues. But if you are a small firm or an individual shooting weddings and such - forget this shit.

    As far as X-Rite is concerned, I am still waiting for them to respond to my query. At this point in time, it doesn't matter to me, since I'm not about to rush out and spend $1,600.00 for the Teranex (cost in Vietnam). If they say the i1Display Pro is fine for HDR, then that's great - for everyone but me! hehe

    Sorry? Did you read my link?

    X-Rite uses the 2,000-nit limit and the HDR moniker just to create more differences in the comparison table (even though their two colorimeters are exactly the same; only deliberately crippled firmware makes the difference in measurement speed).

  • Whether you are sure or not, it is pretty much universally agreed that you absolutely do require a dedicated IO device for accurate color reproduction on an external monitor.

    Common sense would tell you that if you are connecting your external monitor via an IO device to avoid Mac's color management, it makes zero sense whatsoever to then go ahead and run a TB3 directly to the Mac to perform calibration.

    "I'm sure you don't need a BM device". Vitaliy, 2019

    Cool! If there is a less expensive alternative to using the Teranex for doing routine calibration, I'm all in. I'm not a Blackmagic shill. LOL

    "Properly constructed led monitor change very little, very." Vitaliy, 2019

    Do you have evidence to back that up? How about a monitor that puts out a sustained 700 nits and peak brightness 1,600 nits that is used every day? No change? Really? Are you sure about that? There are lots of different technologies and they all change luminance and hue over time. Some of the cheapest monitors actually change the least, according to the CEO of FSI.

    As long as authorities like Alexis Van Hurkman recommend calibration, I'll continue to faithfully do it (on my computer, anyhow!) I already own a calibrator and it takes just a few minutes to do each month - hardly longer than it takes me to shower and shave. hehe

    Bram Desmet, CEO of FSI, talks in depth about calibration in this podcast. https://taoofcolor.com/2349/podcast-flanders-scientific-update-part-2/

    Not sure where all the disdain for calibration is coming from. I'm guessing you don't use scopes, either? haha

  • Whether you are sure or not, it is pretty much universally agreed that you absolutely do require a dedicated IO device for accurate color reproduction on an external monitor.

    This comes from dark VGA times :-) And old myths.

    Yet the BM box can save time for an inexperienced user and/or work better with 12-bit and higher monitors.

    Cool! If there is a less expensive alternative to using the Teranex for doing routine calibration, I'm all in. I'm not a Blackmagic shill.

    As I told you - your monitor is already calibrated; check the calibration every 3-6 months if you are paranoid.

    Do you have evidence to back that up? How about a monitor that puts out a sustained 700 nits and peak brightness 1,600 nits that is used every day? No change? Really? Are you sure about that? There are lots of different technologies and they all change luminance and hue over time. Some of the cheapest monitors actually change the least, according to the CEO of FSI.

    Yes, but not for good HDR monitors yet.

    Let me explain to you carefully - the ONLY thing that can degrade quite fast is badly cooled backlight LEDs. But the big issue is that such degradation not only changes the spectrum, it also changes the brightness of that specific LED element (and no calibration will save you from this).

    Not sure where all the disdain for calibration is coming from. I'm guessing you don't use scopes, either? haha

    You are funny.

    I am talking about science and logic in calibration. And experience comes from measurements.

    Most forum calibration gurus come from the projector and CRT worlds. Their scary stories are also business: the more calibrations a rich guy orders, the better.

  • Well, I’m trying to sell the Asus now! Bad haloing; poorly conceived design (ports are inaccessible, controls are all behind the display, the hood is attached with ten rubber plugs instead of just sliding on, no opening at the top of the hood for calibrating); the need to spend $1,600 on a Teranex to calibrate; aggressive sharpening that can’t be disabled; colors not accurate; a one-hour warm-up required for colors to stabilize; poor after-sales support...

    So I started grading with my LG C7 OLED and it’s pretty incredible, but a 48” screen would be very welcome!

    Here’s a long rambling video about why I made the switch, along with how to set up an OLED for use as a grading monitor (skip ahead to last few minutes!)

  • the need to spend $1,600 on a Teranex to calibrate; aggressive sharpening that can’t be disabled; colors not accurate; a one-hour warm-up required for colors to stabilize...

    Can you detail each of these?

    As of now, I have never seen an LED monitor with a one-hour color stabilization time. Do you have actual colorimeter measurements from this process?

    So I started grading with my LG C7 OLED and it’s pretty incredible

    OLED looks very nice, but it is the last thing you want to use for any grading.

    First, it is WRGB (the sharpness issues you can't properly correct on such screens are the smallest problem): as you change brightness, your colors suddenly shift (since the R+G+B spectrum differs from the W spectrum).

    Second, it is a consumer OLED, so the brightness drop is staggering in both HDR and SDR. The thing can't hold more than 135 cd/m2 of full-screen white.

    Third, again being OLED, it is prone to image retention.

    The blue elements in OLEDs degrade (the others do too), and they degrade non-uniformly, making it impossible to do any real grading.

  • @Vitaliy_Kiselev LG OLED TVs were used in the production of Sense8. Good enough for me and much better than any miniLED or dual LCD display. A static picture will darken gradually but restores to normal brightness when I click in the timeline. In any case, it's not a dealbreaker.

    https://www.flatpanelshd.com/news.php?subaction=showfull&id=1508155180

  • @Vitaliy_Kiselev Brightness does NOT drop to 135 nits when grading HDR content. I know, because I have edited HDR on the LG. And 120 nits is what I grade SDR content at, similar to all professional workflows.

  • LG OLED TVs were used in the production of Sense8. Good enough for me and much better than any miniLED or dual LCD display. A static picture will darken gradually but restores to normal brightness when I click in the timeline. In any case, it's not a dealbreaker.

    You'll soon realize the whole nightmare :-)

    Your link about Technicolor is a marketing thingy, a la THX in sound.

    Dual LCD displays are the future, including both reference ones and midrange to top consumer models.

    The only thing holding back their mass spread is fear. I talked to some people: properly made, such TVs and monitors will close the question for 8-10 years. It is easy to make even 5,000-nit models. They are also very repairable, and it is easy to give them a properly cooled backlight.

    Brightness does NOT drop to 135 nits when grading HDR content

    Read carefully. It drops in HDR for a full white screen.

    In new HDR standard revisions it'll be required to keep constant brightness from a 1% white window up to at least 50% of the screen. For LCD, it is currently idiotic energy standards that hold this up. Only OLED really can't keep up, no matter what you do.

  • @Vitaliy_Kiselev You see boogeymen everywhere. LOL

    I'm not waiting eight years for an affordable dual LCD monitor with poorer viewing angles and poorer blacks than OLED, which is the best display quality available TODAY.

    5,000 nits is totally unnecessary and should be the very last consideration when choosing a monitor or television. 99% of HDR content shown on television is capped at 1,000 nits maximum! Movie theaters still average 50 nits, no one's complaining. hahah

    Not sure what you're talking about with an all-white screen. :) I live in Vietnam, no snow scenes. hehe Even in HDR, 90% of the time the picture will be no brighter than an SDR one, 100 nits brightness and below. Only subjects that emit their own light, and reflections, should exceed 100 nits.

    Netflix, Dolby and Technicolor all use OLED TVs as do a number of production houses; they may not be their primary monitor, but they are the next best thing to a reference monitor and the most accurate display for showing clients work in progress.

    I've owned my OLED for a few years now and haven't experienced burn-in issues.

  • I'm not waiting eight years for an affordable dual LCD monitor with poorer viewing angles and poorer blacks than OLED, which is the best display quality available TODAY.

    Actually, this year the Chinese will have a 32" pro monitor :-)

    5,000 nits is totally unnecessary and should be the very last consideration when choosing a monitor or television. 99% of HDR content shown on television is capped at 1,000 nits maximum! Movie theaters still average 50 nits, no one's complaining. hahah

    This will change fast, as very soon we won't have any HDR as such (only the marketing will remain), just algorithms aimed at 14-16 bit signals and special compression.

    Not sure what you're talking about with an all-white screen. :) I live in Vietnam, no snow scenes. hehe Even in HDR, 90% of the time the picture will be no brighter than an SDR one, 100 nits brightness and below. Only subjects that emit their own light, and reflections, should exceed 100 nits.

    When measuring screen brightness, you do it with white windows covering from 1% of the area up to 100% (the whole screen) filled with white. OLEDs can't hold their brightness as the window grows, due to their nature.
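
    For anyone unfamiliar with the procedure: brightness is measured off white "window" patterns of increasing size, from a small 1-2% patch up to a full white field, and OLEDs dim noticeably as the window grows (the automatic brightness limiter kicking in). A quick sketch of how such test patterns could be generated with Pillow - the resolution, levels and filenames here are arbitrary:

        from PIL import Image

        def window_pattern(width, height, window_pct, level=255):
            """Black field with a centered white rectangle covering window_pct percent
            of the screen area - the pattern used for brightness-vs-window-size tests."""
            img = Image.new("L", (width, height), 0)
            # Scale both sides by sqrt(fraction) so the rectangle's area is the
            # requested share of the screen while keeping the screen's aspect ratio.
            scale = (window_pct / 100.0) ** 0.5
            w, h = round(width * scale), round(height * scale)
            x0, y0 = (width - w) // 2, (height - h) // 2
            img.paste(level, (x0, y0, x0 + w, y0 + h))
            return img

        # Typical measurement series: 1%, 10%, 25%, 50% and full-screen white.
        for pct in (1, 10, 25, 50, 100):
            window_pattern(3840, 2160, pct).save(f"window_{pct:03d}.png")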

    Netflix, Dolby and Technicolor all use OLED TVs as do a number of production houses

    They are used for checking on high-contrast output devices, but NEVER for any important grading, due to huge issues with non-uniform degradation and color shifts.

    I've owned my OLED for a few years now and haven't experienced burn-in issues.

    I have also worked with OLEDs, and if you use one as a monitor for real, it is instantly visible.

  • @Vitaliy_Kiselev “The Chinese will have so-and-so (insert here: this week, this month, this year, this decade...)” ????

    “Very soon we won’t have HDR...” Wrong again. HDR is here to stay.

    We’re talking about what is available today, not next year, not a decade from now. And today, OLED remains the gold standard.

    “Easy to make 5,000 nit display.” You’re killing me with your foolish talk. Heat dissipation, panel degradation, cost and power consumption are all enormously difficult problems to overcome. All HDR movies on TV are capped at 1,000 nits regardless of the display you’re watching on. The number of nits beyond 1,000 should be the very last consideration when purchasing a monitor.

    “Soon we’ll have nothing but 14-16 bit compression”. That’d be the $800 Chinese monitor coming out this year, right? LOL

    Re: bypassing Mac OS color management,

    “I think that is more your NLE’s task, and Final Cut and Resolve have such settings as far as I know.” - Well, as usual, you are dead wrong here. Every colorist and expert on color management knows you need an I/O device for broadcast-accurate color.

  • We had you talking about how superb the Asus is, and then selling it in short order; the same will happen with the OLED once you use it for a longer time for real grading and not amateur talk.

    We’re talking about what is available today, not next year, not a decade from now. And today, OLED remains the gold standard.

    I told you it is not. OLED is very good for watching TV at home, superb even. It is not suitable for grading as a main monitor, but it is useful as a third check monitor.

    “Very soon we won’t have HDR...” Wrong again. HDR is here to stay.

    As a marketing term, yes. But internally it is now a horrible pile of additional shite, jumping from one standard to the next, with patent holders competing over who will get the money. Present HDR is being MADE this way so that it is patentable on one side and requires minimal to no changes in LSI design on the other. It'll all go away.

    “Easy to make 5,000 nit display.” You’re killing me with your foolish talk. Heat dissipation, panel degradation, cost and power consumption are all enormously difficult problems to overcome.

    You are wrong. With a simple, proper design there are no such problems. I have talked to manufacturers' engineers, contrary to you, and I know in detail the thermal performance of top LED panel lights (which are very similar to a very strong backlight for monitors). A few companies are ready to make 5,000-nit displays in no time, if not for some EU and US energy-saving standards. One manufacturer even talked about a "sun mode" able to reach 15,000 nits for up to 3-4 seconds over up to 15% of the area.

    “Soon we’ll have nothing but 14-16 bit compression”. That’d be the $800 Chinese monitor coming out this year, right? LOL

    Wrong. It is new compression standards that are coming: from simple ones on the delivery side, already used in HDMI 2.1 to fit more data into the same bandwidth, to advanced consumer and editing ones.

    Every colorist and expert on color management knows you need an I/O device for broadcast-accurate color.

    And here we go again. :-) What exactly is "broadcast accurate color" btw?

  • @Vitaliy_Kiselev Right - HDR can be a marketing ploy, just like HD, 4K, and 8K. Soon it will be emblazoned on every display, whether iPhone, tablet, television or monitor. hehe Who cares? Nobody! Even in rec. 709, the OLED absolutely destroys my iMac in terms of color, contrast, true blacks, brightness, and viewing angle.

    A tabloid political agenda can also masquerade as a blog for filmmakers!

    Who gives a sh*** if manufacturers battle it out over codecs and whatnot? That doesn't concern me. And it's got nothing whatsoever to do with either my video or using a ridiculously inexpensive display for grading HDR or SDR.

    I've been enjoying HDR content for several years now and LG is still the leader when it comes to OLED. LG consistently ranks number 1 in picture quality. My HDR TV will not be obsolete any time soon! As a matter of fact, OLED has just about plateaued in terms of quality.

    It took me a few weeks to get to know the Asus; I'm very familiar with the LG OLED and won't be selling it anytime soon! LG will be releasing a 48" OLED in the spring (made in China. LOL), which is going to catch fire among independent filmmakers. It has filmmaker mode, which eliminates most processing. And like all current LG OLED televisions, it can be calibrated using CalMAN software.

    You are going way far afield, talking about Chinese-this and Chinese-that years from now.

    Technicolor, Dolby and others continue to use OLED displays to grade shows or to check the grade. Prove me wrong!

    And you need look no further than liftgammagain (or Google elsewhere) to see that professionals are starting to integrate OLED televisions into their grading setups, particularly for SDR if not yet HDR.

    OLEDs are also the preferred display for use as client monitors in production houses, because they are the closest thing to a reference monitor costing tens of thousands of dollars.

    Again, you are living in fantasy land with your 15,000 nit display! Totally useless! Can you share a link to this display at Amazon?

    Impossible to take seriously someone who keeps veering off topic into fantasy land, who doesn't know what SMPTE is or the importance of color management (including using I/O devices and calibration), who thinks you need a 15,000-nit display to do color correction, or that 14 or 16 bits is anything anyone here gives a damn about.

  • I believe that living in the world of the ordinary consumer, a world ruled by leading prostituted media and bloggers, where all you need to do is "choose the proper product", is a shitty life.

    PV will always be about being able to change things, about looking into the future and affecting manufacturers directly, even if for now they are capitalist manufacturers who will all need to be destroyed. Hence, btw, the political agenda, as politics is a concentrated form of economic relations.

    And sorry if all the innovative, affordable monitor and panel manufacturers are from China; it is not me who is to blame for this, but your local capitalists.

    Also sorry that your beloved current HDR standards will all go into oblivion quite soon; it is sad.

    Impossible to take seriously someone who doesn't know what SMPTE is or the importance of color management (including using I/O devices and calibration), who thinks you need a 15,000-nit display to do color correction, or that 14 or 16 bits is anything anyone here gives a damn about.

    Going personal is a sign of a weak position, as is trying to put into a person's mouth something he never said.

    The things I am talking about I know directly from manufacturers, from personal work, or from working with people.

  • @Vitaliy_Kiselev In fact, there are next to NO YouTubers talking about using an LG OLED TV for grading video in Final Cut Pro - there is only one as far as I know. And that one was sponsored content, for sure!

    Not a single YouTuber to the best of my knowledge talks about precisely what gear you'll need and how exactly to set it up for color correction. That fact alone is pretty incredible in my opinion. If there was one, I would not have felt it necessary to make another video.

    My point was not to get people to squander more money on the latest and greatest tech, but how to save money. Many already own an OLED television. Even if you're grading on a lowly iMac or laptop, it is extremely useful to be able to check how your grade will look on the best display currently available. And I'm 100% positive you'll want to go back and make a few changes to the grade if you do!

    And if you're a wedding videographer making tons of money, it doesn't hurt to know how to set up an OLED as a client display, even if you don't do HDR. All SDR content looks radically better on an HDR display.

    At the same time, I do know that when I grade using an OLED with picture mode Technicolor Expert and upload it to YouTube, it will look identical to anyone else in the world watching on an LG OLED in Technicolor Expert - the mode with the least processing aside from filmmaker mode - and it should look okay on a laptop as well.

    And I think I can confidently assert that an OLED is better for grading than the XDR, with its awful off-axis viewing angles and color shifts, its haloing, its exorbitant price tag, uniformity issues, the fact that it only works within the Mac ecosystem, along with the inability to even calibrate it - many of the same criticisms I leveled at the Asus. LG OLED suffers from none of the above.

    I also believe I'm the first YouTuber and blogger to talk in depth about all the real shortcomings of the Asus as a video grading monitor, though prad.de detailed many of them on their website already.

    As far as image retention goes, I haven't found it to be a problem, and I've owned my television for a couple of years already. Local laws concerning energy consumption force television manufacturers to reduce brightness, but I haven't found that to be a real issue yet and the image quality destroys the PA32UCX, which has no such constraints! If it does end up being a problem, I'll be sure to mention it in another video.