1080p Sony looks better than 4K Sony

My question is, was the rep just trying to sell me a higher-priced set? I should have gotten the remote and seen what the settings were.

I'm sure he was, and yes, settings can definitely make a set "pop" next to others. But neither of those means the TV wasn't just flat-out better regardless.
 
I would expect that at that price tier. But his saying the 690E was not true HDR would suggest Sony is selling a falsely advertised product.
 
Actually, HDR done right makes a noticeable difference, in a good way, depending on what movie you're watching and how it was mastered.

Audiofreak71
Actually, when reconfiguring some stuff of mine, I noticed the TV said I was only getting HDR10 and not Dolby Vision. I bypassed the offending gear to get my Dolby Vision back and replayed the disc to see if it looked any different. YES! Dolby Vision compared to HDR10 is like HDR10 compared to standard color gamut. It makes a huge difference.

As for the original poster's issues: you may have a bad upscaler that is softening your edges, the TV settings may need to be dialed in, or you may be used to looking at an AMOLED that wasn't color-corrected. Many of the first/cheaper AMOLED TVs basically show HDR-like colors even when the original content wasn't recorded in HDR; everything is super-saturated, like the Wizard of Oz. I've seen this touted as a feature on displays at Best Buy! Screens that say "look how red this strawberry is," but... it didn't look real. The comparison TV showing the "old" sRGB colors looked like a real strawberry. If you got used to that (you said the theater looked dull compared to your TV, and the theater should be the litmus test if it's a Dolby-approved theater), then you are going to have to let your brain adjust. Newer sets are calibrated for more accuracy.

If you don't like the Sony ... I'm loving the Vizio M-Series! It's cheap, but it's a really great buy.
 
At that level one really needs to do the fact-checking and understand the relevance of the details. It should be a good time, but I find that level of research quite tiresome... you just about have to, though, when laying down a nice chunk of change.

The last time I bought a TV, I resigned myself to not getting the best/perfect TV, because that doesn't really exist, and instead picked the one with the fewest flaws I could tolerate.
 
I don't think so. Sadly, the most informative info on HDR vs. non-HDR sets seems to come from Consumer Reports, which requires a subscription... sorry about that.
 
You are correct, Dolby Vision is absolutely stunning. Now, I don't think it's night-and-day better than HDR10, but you can definitely tell the difference in the color gamut, especially on my LG OLED.

Audiofreak71
 
It's my understanding that HDR10 is basically the DCI-P3 colorspace with 10 bits per sub-pixel and up to 4000 nits of brightness, while Dolby Vision is 12 bits with the Rec.2020 colorspace and up to 10000 nits. The baseline standard is the same as your PC: the ancient sRGB colorspace (for digital TVs; analog TV used NTSC) with 8 bits per sub-pixel (24-bit color)... not sure about its brightness.

No TVs actually support all of that. Instead, they are rated by how close they come. For reference, a "nit" is 1 candela (roughly one candle) per square meter; imagine 10,000 candles packed into a square meter and you can imagine how bright that TV would have to be! The colorspace is basically the range of colors the bits are digitizing, and different color spaces are used for different purposes.
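For concreteness, here's a quick Python sketch of what those bit depths work out to in raw numbers (pure arithmetic, nothing TV-specific):

```python
# Back-of-the-envelope numbers for the bit depths discussed above.
for bits in (8, 10, 12):
    levels = 2 ** bits        # distinct steps per sub-pixel (R, G, or B)
    colors = levels ** 3      # all R/G/B combinations
    print(f"{bits}-bit: {levels:>4} levels per channel, {colors:,} total colors")
```

That's 256 levels per channel (about 16.8 million colors) at 8-bit, 1024 levels at 10-bit, and 4096 at 12-bit.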

The colorspace or "gamut" is the overall range of colors. It defines whether pure red is blue-tinted or orangey, whether your white is brilliant white, soft white, or eggshell, etc. More about color space "gamuts" here ... http://www.tftcentral.co.uk/articles/pointers_gamut.htm
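If you want to put rough numbers on how much bigger those gamuts are, here's a small Python sketch using each standard's published CIE 1931 xy primaries. The triangle area in xy space is only a crude size proxy (serious coverage comparisons are usually done in CIE 1976 u'v'), so treat the ratios as illustrative:

```python
# Published CIE 1931 xy chromaticities of each standard's R, G, B primaries.
PRIMARIES = {
    "Rec.709/sRGB": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":       [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020":     [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    """Shoelace formula for the area of the gamut triangle in xy space."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

base = triangle_area(PRIMARIES["Rec.709/sRGB"])
for name, pts in PRIMARIES.items():
    area = triangle_area(pts)
    print(f"{name:<13} area = {area:.4f}  ({area / base:.2f}x Rec.709)")
```

By that crude measure, DCI-P3 is roughly 1.4x the size of Rec.709, and Rec.2020 is roughly 1.9x.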

See TVs compared on how well they support these standards here ... https://www.rtings.com/tv/tests/picture-quality/wide-color-gamut-rec-709-dci-p3-rec-2020

FYI, my Vizio is cheap and says it supports Dolby Vision, but the panel is only 10-bit. It just rounds or truncates the extra bits to the nearest 10-bit value. It can interpret the Dolby Vision stream, but that doesn't mean it can do everything the standard says it can do. Keep in mind that no TV will support everything it claims to support; it's a careful marketing strategy to make you think you are getting more than you are paying for. That said, Dolby Vision still looks better than HDR10 for the same movie, and the Vizio is 1/3 the price of the high-end Samsungs that cover a fuller range of the HDR color spectrum ... but ... an extra 7% for $1000? I'll keep my money. Watch the comparison charts, but also read all the reviews you can.
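To illustrate what "rounds or truncates the extra bits" means in code terms (this is just the general idea of 12-bit-to-10-bit quantization, not Vizio's actual processing pipeline):

```python
# A sketch of squeezing a 12-bit code value (0-4095) into a 10-bit panel (0-1023).
def truncate_12_to_10(v12: int) -> int:
    return v12 >> 2                       # just drop the two least-significant bits

def round_12_to_10(v12: int) -> int:
    return min((v12 + 2) >> 2, 1023)      # round to nearest, clamped to the 10-bit max

for v in (0, 1, 2, 3, 2047, 4095):
    print(f"12-bit {v:>4} -> truncated {truncate_12_to_10(v):>4}, "
          f"rounded {round_12_to_10(v):>4}")
```

Either way, four adjacent 12-bit values collapse into one 10-bit value, so the finest shading gradations in the stream are lost on the panel.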
 
Thanks for that wealth of information. I didn't understand a word of it. It might be a bit of information overload for my feeble mind, but it does give me some reading material for the next few days.

I based my observation on seeing the demos on the TVs at Sam's Club, where the expensive 4K sets they have on display make my two-year-old "D" series Vizio 4K look like a 1970s TV by comparison.

I'm pretty sure that by the time I go to replace it, that high-tech stuff will have trickled down to the more affordable sets.
 
True, but until you have an OLED in your living room, you may think differently. You're correct about what you stated above: the highest-rated panels only go to 10-bit, like my LG OLED. Supposedly in 2019 we'll start getting 12-bit panels.

Audiofreak71
 
The rep at BB said the 690E is only 8-bit.
 
That is why you aren't seeing 4K in all its glory: your TV does not support the expanded color gamut that comes with HDR, so you probably won't notice much difference between 4K and 1080p. Where 4K really shines is the detail in scenes that are farther away, and when you add HDR to that, everything comes together nicely (depending on the movie, how it was mastered, etc.).
I fell into the same trap when I first got into 4K; then HDR came out, and TVs had to be HDCP 2.2 compliant with 10-bit panels to take advantage of it. Maybe one day they'll get it right and future-proof a TV so we can take advantage of future changes.
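For what it's worth, the raw pixel arithmetic behind 4K vs. 1080p (just a quick sketch, nothing model-specific):

```python
# 4K UHD has exactly four times the pixels of 1080p, which mostly shows up
# as extra fine detail -- and matters less the farther you sit from the set.
resolutions = {"1080p": (1920, 1080), "4K UHD": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {w}x{h} = {px:,} pixels ({px / base:.0f}x 1080p)")
```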

Audiofreak71
 
I went looking for the specs. This is what the CNET review listed: "12-bit support, Channel Block, Dynamic Backlight Control, Mastered in 4K". So is this an actual 12-bit panel, or an 8-bit one that can accept a 12-bit signal?
 
This is from RTINGS about the HDR settings on your TV; check and make sure it's set to this while watching 4K movies.

HDR content is only shown properly in the HDR 'Scene Select' (or 'Auto'); it is the only mode that uses the HDR PQ curve instead of the SDR gamma curve. In this mode the TV's EOTF follows the PQ curve fairly well, though the slope of the line undershoots a bit. Adjusting the 'Gamma' setting to '+2' makes the EOTF follow the PQ curve more closely, as shown here. Even higher gamma would be better in bright rooms.
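For anyone wondering what the "PQ curve" vs. the "SDR gamma curve" actually means, here's a small Python sketch of the SMPTE ST 2084 (PQ) EOTF that HDR10 and Dolby Vision use, next to a plain 2.4 gamma scaled to an assumed 100-nit SDR peak (the 2.4 exponent and 100-nit figure are just common reference values, not your TV's measured behavior):

```python
# SMPTE ST 2084 (PQ) EOTF constants, straight from the standard.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf_nits(signal: float) -> float:
    """PQ EOTF: normalized 0-1 signal -> absolute luminance in nits (cd/m^2)."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def sdr_gamma_nits(signal: float, peak: float = 100.0) -> float:
    """Simple 2.4 gamma, relative to an assumed 100-nit SDR peak."""
    return peak * signal ** 2.4

for s in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {s:.2f}: PQ {pq_eotf_nits(s):8.1f} nits, "
          f"SDR gamma {sdr_gamma_nits(s):6.1f} nits")
```

The key difference: PQ maps the signal to an absolute brightness (a 0.5 signal is roughly 92 nits, and full signal is 10,000 nits), while SDR gamma is only relative to whatever peak the display happens to have, which is why a TV tracking the wrong curve looks too dim or too bright.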

Audiofreak71
 
When I selected HDMI Enhanced it did help a lot with 4K discs. It does show "HDR Video" when I am streaming off YouTube and Netflix. The streams just did not seem way better than my 1080p TV to me. Guess if you want the best, you have to pay the price.
 
Unfortunately, with TVs today you are correct. I learned the hard way, going through an agonizing journey of multiple TVs with issues that just did not meet my standard, only to end up buying an LG OLED, which I should have just bought in the first place.
The picture is stunning and better than the Pioneer Elite Kuro I had, so it definitely meets my standard.

Audiofreak71
 
I bought a 65-inch LG OLED a couple of months ago and I agree. The set it replaced was a 50-inch Pioneer Elite plasma, which was excellent... but this OLED is something else in terms of picture quality. I run audio through an AVR, so I don't use the TV's internal audio. It does a great job of upscaling and handles native 4K beautifully. The player is a Sony... don't have the model number with me... a new 4K unit with two HDMI outputs, one dedicated for audio to an AVR, which I don't use; my new AVR handles everything directly from the player's primary HDMI output. I set it to output 4.1, as that is my configuration. HDR from a 4K Blu-ray really makes a difference.
 
Solved a lot of my problems with the 960E. A local pawn shop had a 65-inch Sony XBR-65X930D for less than half the going prices I've seen, so I got it. Very pleased with it.
 
YouTube on the 930D was not showing HDR video, though it did on the 690E. I called Sony; the 930D's YouTube app will not pass HDR. They are supposedly working on it.
 