Dynamic HDR?
Probably. And 4000 nits. 2017 feels like it'll be a nit race (whether that is relevant or not, it feels like something marketing would sink their hooks into).
I really hope HFR doesn't become a thing. I hate how it looks. Saw one movie too many for my lifetime (The Hobbit).
But if OLEDs from LG and Sony can hit 1000 nits, that will definitely count as a successful year, I'd say.
I need some help with my B6.
I tried First Light in HDR today, but it's just sooo dark.
I think I tried everything, from black levels on the PS4 to dynamic contrast, etc., but no fix.
I noticed that HDR has a 2.2 gamma and it's greyed out, so there's no way to change it. Anyone know how?
Or can someone with a B6 (and good-looking First Light HDR) give me their HDR + PS4 settings?
Thanks!
Oh man
OH MAN!
Dooo it!
Mine was like that until I changed the black level to Limited from Automatic, but as you've tried that, I have no idea.
I'll be pleasantly surprised if displays can hit 4000 nits. I'm expecting it to be more like 2000 nits this year, maybe 3000.
And it is important.
We don't see light linearly.
Meaningful brightness improvements with HDR are measured in "stops" rather than nits.
Each stop is a doubling of brightness.
  100 nits = SDR
  200 nits = 1 stop brighter
  400 nits = 2 stops brighter
  800 nits = 3 stops brighter
 1600 nits = 4 stops brighter
 3200 nits = 5 stops brighter
 6400 nits = 6 stops brighter
12800 nits = 7 stops brighter
So the nits values start to look ridiculous, but when you look at it in terms of "stops", you can see that going from 800 nits to 1000 nits is actually a small change: about a third of a stop (log2(1000/800) ≈ 0.32).
You wouldn't see much difference at all between a 3200 nit display and a 3500 nit display because the next stop is at 6400 nits.
But you will see a big difference between a 100 nit display and a 400 nit display - still a 300 nit difference - because that is two stops brighter.
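To make the arithmetic concrete, here's a small Python sketch of the stops calculation above. The 100-nit SDR reference matches the table, but the function name is just illustrative:

```python
import math

def stops_above_sdr(nits, sdr_reference=100.0):
    """Brightness gain over an SDR reference level, in photographic stops.

    Each stop is a doubling of luminance, so this is just
    log2(nits / reference).
    """
    return math.log2(nits / sdr_reference)

# The jumps discussed above:
print(round(stops_above_sdr(1000) - stops_above_sdr(800), 2))   # 800 -> 1000 nits: 0.32
print(round(stops_above_sdr(3500) - stops_above_sdr(3200), 2))  # 3200 -> 3500 nits: 0.13
print(round(stops_above_sdr(400) - stops_above_sdr(100), 2))    # 100 -> 400 nits: 2.0
```

The same 300-nit increment is two full stops at the bottom of the range but barely a tenth of a stop near the top, which is the whole point of measuring in stops.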
You saw a single movie and wrote off the whole thing for life.
Have you considered that The Hobbit was just a mediocre movie with bad effects, and HFR had little to do with it?
Nothing will force you to buy/watch the HFR versions of movies/TV shows, I'm sure there will still be 24p versions available just like you can still buy DVDs today.
I can't wait for HFR to get here.
My main interest is gaming, and 120Hz support is a big deal for that - though we really need variable refresh rate support to take full advantage of it.
It also means that I won't need to use interpolation to dejudder movies and TV shows any more. I love how smooth interpolation makes things, but don't like the glitches that often appear.
I'll be more impressed if they can improve the full-screen brightness from 150 nits.
Peak brightness is far less important.
If the movie is shot at a higher frame rate, how would they still release a 24p version?
Throw away 4/5 of the frames. Maybe add a lot of motion blur.
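That works because 120/24 = 5, so a 24p master just keeps one frame in five. A toy Python sketch of the idea (frames are modeled as plain numbers here for illustration; a real pipeline would operate on image buffers):

```python
def decimate_to_24p(frames_120, blend=False):
    """Reduce a 120 fps frame sequence to 24 fps.

    120/24 = 5, so keep one frame out of every five. If blend is True,
    average each group of five instead, which approximates the motion
    blur a native 24p shutter would have captured.
    """
    if not blend:
        return frames_120[::5]
    groups = [frames_120[i:i + 5] for i in range(0, len(frames_120) - 4, 5)]
    return [sum(g) / len(g) for g in groups]

frames = list(range(10))                     # ten toy "frames"
print(decimate_to_24p(frames))               # [0, 5]
print(decimate_to_24p(frames, blend=True))   # [2.0, 7.0]
```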
HDR is supposed to support up to 400 nits full-screen brightness.
Peak brightness is absolutely more important than full-screen. Nobody needs a commercial to pop up on the screen in the middle of the night with an all-white background at 1800 nits.
So I'm wondering if I just have some settings wrong or there's something else up. I currently have a KS9000 and was blown away after seeing those YouTube HDR videos from that 4K channel. I decided to get The Revenant yesterday to see how 4K HDR Blu-rays looked. Unfortunately, I couldn't get my player (K8500) to display the movie properly. I got lots of white dots, flickering, and random "No Display" messages. I figured it was the HDMI cable, so I tried a second cable that I had been using with my PS4 (non-Pro). This one was worse, with no display at all. A third cable that I had lying around worked, though, which is interesting since this third cable would not work at all with my PS4 at 1080p (any other resolution below that works fine). This is probably the first time I've run into issues with HDMI cables. All three have "high speed HDMI" written on them too.
Anyway, after testing the 4K Revenant out and seeing how good it looked, I tried the 1080p version on my PS4. What exactly are the HDR differences I'm supposed to be seeing? Because to me, other than the resolution bump, they looked nearly identical after swapping back and forth between the two. As far as I can tell, the TV is outputting HDR just fine with the 4K version. Both sources have pretty much the same settings too (max backlight, high Smart LED, native RGB, no Motion Plus, no dynamic contrast, Warm 2, 0 sharpness, UHD Color).
Part of it depends on whether your display is properly calibrated.
Both games have been recently patched to address this issue. There should be a contrast option in the game menu to change the setting between low/high. When using HDR, set it to low. I would recommend kicking up the in-game brightness a notch or two as well if you're still not satisfied. This should net you a really excellent image if you've calibrated your set (make sure your HDMI black level is set to Limited as well).
Posted here without comment.
http://www.avsforum.com/forum/40-ol...eneral/2459978-burn-worries-2016-lg-oled.html
...what am I looking at?
Apparently there have been some burn in issues with OLED TVs. An issue that the poster was worried about earlier in the thread
This is not an issue on 2016 models unless you buy a used demo floor unit that has had the same demo on loop at max brightness. Also, don't use these as monitors for extensive periods of time unless gaming/watching a movie; that should be a given.
Looking like I'll be using Cleveland Plasma or someone similar. Just need to work everything out for cost. Hence the hope for a post-CES drop in price. Maybe late January or early February will be the time.
I totally agree, the Z9D prototype at CES last year was 4000 nits, but who knows how long it will be until that makes it into the consumer version. Word on AVS was that the Z9D would be an 18 month product, so possibly not replaced until CES 2018, but who knows.
Nice, fingers crossed for you man! We've been seeing some pretty crazy price drops over here since it launched in September ($6999 > $6500 > $5600 > $4800) so hopefully the same thing happens where you are.
The US price has been holding strong so far, but I'm sure something will be happening sooner or later. Like I said, I won't be making a purchase until over a month from now. 37 days. But I'll coordinate everything with the house closing and the rest. Exciting times, really.
75 for life.
I'll have my GoFundMe link up any day now.
Haha I'll kick in a few dollars to see someone else get up in here with the Z
Surely it has to come down soon, it sucks you guys don't have the Panasonic set to drive the price down quicker, but it'd have to come down in line with other markets soon enough.
Wow, 75"! Haha, that will be something else! Can't wait to hear your impressions. Am I remembering correctly that 3D is important to you? If so, it might be something else worth comparing in store. Not to make your decision harder or anything, but I've heard that 3D is one of OLED's other strong suits. Not to say it's bad on the Z, I've barely tried it, but it's active vs. passive, and I'm not sure if you have a preference.
I'm struggling to switch from my 100" PJ, even with 4k.
The day draws nigh.
I had a 70" in my condo, but included it in the sale. I don't want to go much smaller than that. The 65" screens look nice, but my heart is really set on a 75" for the living room/den and a 65" in the loft. We shall see.
I hear you. A 100" is 78% larger by area than a 75". Sadness.
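For anyone checking that figure: screen area scales with the square of the diagonal (for screens of the same aspect ratio, the aspect ratio cancels out), so a quick illustrative snippet confirms it:

```python
def area_ratio(diag_a, diag_b):
    """How much larger (by area) a diag_a-inch screen is than a
    diag_b-inch screen of the same aspect ratio.

    Area scales with the square of the diagonal, so the aspect
    ratio itself cancels out of the ratio.
    """
    return (diag_a / diag_b) ** 2

print(f"{(area_ratio(100, 75) - 1) * 100:.0f}% larger")  # prints "78% larger"
```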
As someone that would like to buy an OLED after the 2017s are announced, I'm much more interested in brightness and near-black uniformity than HFR... at least for this CES.
Black uniformity shouldn't be an issue. Gray uniformity, however... it's not as good as I want. Debating exchanging it.
Yes. It's called taste. As in, you try something, didn't like it, and don't want it again. I tried cilantro once and didn't like it. No need to try it again. I'm not a bloody idiot; when I don't like something, I don't try it multiple times just to be sure because someone on the Internet thinks I'm wrong.
Stranger Things - these pics are just used as an extreme example to show pure white next to pure black, and how the Z9D can absolutely go toe-to-toe with OLED for perfect blacks. I pulled the blinds down for this, but I kept the light on the bottom of the TV and the receiver on so you can see the blacks haven't been crushed in post-processing.
Had time to watch the show today, and I took a picture for comparison. Never mind the potato camera quality, but it looks like you are actually getting crushed blacks.
Great show, btw.
Just looks like camera exposure/display gamma differences to me.
So nothing has changed from plasmas then. Gotcha.
The point is that burn-in is still physically possible, because of how the pixels may wear unevenly on emissive displays. It's most evident on floor model TVs which show static logos and run the same demo loop over and over for a year but the fact that it can occur means that it still exists.
As long as this is the case, no one can claim it can't happen. Because it does and you can go to Best Buy and look at the evidence with your own eyes.
You missed one key fact: those TVs weren't running compensation cycles after being turned off, so the panel couldn't clear any image retention present at the time. This never happens in real-world day-to-day use, because you run compensation cycles once you're done for the day/night.
Nah, his pic looks fine to me (could maybe use a touch more brightness, but that could be the result of the camera exposure). Your TV is way too bright, which is clipping your whites.
"Can't happen" is not equal to "not an issue if you're reasonable" (still image for 16 straight hours, not running compensation cycles, etc.)
Brightness is fine on my TV; the shitty LG phone camera just makes it look like it's clipping.
My point is that his picture is missing some details on the darker side (around the ear and shoulder), which makes me think he's getting crushed blacks.
So now we're back to having to baby your TV and not being able to do certain things with it, like with plasma. Yeah, I'm not going to do that. If I'm using my media PC, looking at the Windows desktop and browsing the Internet, I'm going to be looking at the Windows taskbar on my TV for potentially hundreds and hundreds of hours over the course of a year. Is this considered "reasonable" usage? Who defines what "reasonable" usage is?