And yet you tried to paint the "mathematically correct setup" as the one and only truth when it's simply irrelevant in private households... Talk about misplaced sarcasm.
People's ignorance of how they arrive at their own decisions isn't proof that the "mathematically correct setup" is incorrect in any way.
You keep mentioning framerate, which is itself a perfect example of what I mean: people KNOW instinctively that 60fps feels and looks smoother, and given the choice they would buy something at 60fps over 30fps any day of the week. They often can't really put into words WHY, other than "it feels better", and they don't need a technical breakdown to back that up. But that doesn't stop us from scientifically exploring exactly why that is the case (e.g. lower input lag, better motion fluidity), and perhaps at what point we stop being able to feel a difference.
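To put rough numbers on the "it feels better" part, here's a back-of-the-envelope sketch. It's nothing but frame-time arithmetic, so treat it as an illustration rather than a full latency model (real input-to-photon delay depends on the whole pipeline):

```python
def frame_time_ms(fps: float) -> float:
    """How long each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

# 30 fps -> ~33.3 ms per frame; 60 fps -> ~16.7 ms per frame.
# A game also samples your input once per frame, so doubling the
# framerate roughly halves the per-frame share of input delay too.
print(f"30 fps: {frame_time_ms(30):.1f} ms/frame")
print(f"60 fps: {frame_time_ms(60):.1f} ms/frame")
```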
No, a THX chart specifically doesn't factor into most people's decision making. I described what actually does -
being able to see a difference. With their own two eyes. It just so happens that when you work backwards from that, the THX chart explains an awful lot about what point a person stops being able to tell.
THX has become a standard in home cinema certification for a reason: it has a lot of predictive power. There isn't any guesswork in how the seating is arranged, which screen size to pick, or the angle of the speakers, because it's already been worked out (they did the math™). If their research is telling me that most people won't be able to tell the difference, then I think there is a very good chance the consumer - whom their research was designed to aid - won't be able to tell either. We'll see.
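For the curious, the "math" part is mostly simple trigonometry. Here's a sketch using the commonly cited ~40° horizontal viewing-angle figure from THX - the exact number is my assumption here, not a quote from their spec:

```python
import math

def thx_seat_distance_in(diagonal_in: float, view_angle_deg: float = 40.0) -> float:
    """Seating distance (inches) at which a 16:9 screen fills the given
    horizontal viewing angle. The 40-degree default is the commonly cited
    THX recommendation -- treat the exact figure as an assumption."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 width from diagonal
    return (width_in / 2) / math.tan(math.radians(view_angle_deg / 2))

# e.g. a 65" screen puts the main seat roughly 6.5 ft away
print(f"{thx_seat_distance_in(65) / 12:.1f} ft")
```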
Which you absolutely can if you are either near enough or have a big enough diagonal... given you have good enough quality viewing material and not just some crappily encoded stream.
You seemed to just gloss over what I said re: monitors, so I'll just copy and paste my answer:
"Greater resolutions allow you to 'get away' with sitting closer to larger monitor sizes, so to speak. At some point, though, you are simply sitting too close to even take in everything on screen. The reason things like THX measurements are useful and aren't just "theoretical bullshit" is that they also take viewing angle into account. Yeah, you can make the argument that with 8K you can [sit right up against the screen.jpg]
But then you're missing out on tons of screen real estate around the periphery of your vision. And once you end up sitting far enough away from the monitor to correct that, you start losing the resolution benefits! Get far enough away and you may as well just get a TV at that point."
People know when they are sitting too close once they begin to lose important parts of the image, e.g. an ammo counter in the corner of the HUD or an enemy creeping up on you at the far edge of the screen. Again, viewing distance recommendations take that into consideration; they don't just start from one foot away.
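The flip side - "get far enough away and the resolution bump is wasted" - can also be put in numbers. A sketch assuming a 20/20 eye resolves roughly one arcminute; that figure is a standard rule of thumb, i.e. an assumption, not a hard biological constant:

```python
import math

def pixel_blend_distance_in(diagonal_in: float, horiz_px: int,
                            acuity_arcmin: float = 1.0) -> float:
    """Distance (inches) beyond which adjacent pixels on a 16:9 screen
    merge together for an eye resolving `acuity_arcmin` arcminutes.
    The 1-arcminute default for 20/20 vision is a rule of thumb."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # 16:9 width from diagonal
    pixel_in = width_in / horiz_px                    # size of one pixel
    return pixel_in / math.tan(math.radians(acuity_arcmin / 60))

# On a 65" panel, 4K pixels blend at roughly 4 ft, 8K at roughly 2 ft.
# Sit past the 4K figure and the extra 8K pixels are invisible anyway.
print(f"4K: {pixel_blend_distance_in(65, 3840) / 12:.1f} ft")
print(f"8K: {pixel_blend_distance_in(65, 7680) / 12:.1f} ft")
```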
As if price wasn't subject to change... 'Member how expensive the first 4K or OLED displays were?
Prices get lower, but there is always a floor, particularly at the largest screen sizes. A 98" Samsung, over a decade after 4K was introduced, is still going to retail at $8K this year.
Chinese-farmed caviar may be cheaper than the Russian stuff was decades ago, but it's still caviar, and it's still €3,000/kilo.
Probably the same people that couldn't tell the difference between 30 and 60+ fps, people like my mother. That's not a standard, that's anecdotal evidence...
It was a double-blind study, which is about as good as you can get in research. Literally the opposite of anecdote, which is all you've offered so far this whole thread.
You can read the full thing here:
https://www.techhive.com/article/578376/8k-vs-4k-tvs-most-consumers-cannot-tell-the-difference.html
I don't even know what you're trying to argue here either.
Me: most people can't really tell the difference
You: that's bullshit I can totally tell the difference on a 120" screen if I sit close enough
Me: here's research showing most people couldn't tell the difference
You: nah they probably don't know shit anyway that research doesn't count
Well... yeah. That was kind of the whole point. If people can't tell, they can't tell, and they shouldn't need any extra insight or added "theoretical bullshit" to do so.
(the theoretical bullshit is just the cherry on top, explaining why after the fact)
Switch out the 70"+ display diagonals we've been talking about here for 50-something and I swear this is exactly the same discussion I had back when 4K started to establish itself.
"You can't see a difference if you're 50m away bro"...
Just because a similar argument has been had before doesn't mean that you can't get a different conclusion later by tweaking some of the variables in the argument.
Well, that's because humans can't hear frequencies above 20 kHz, and according to the Nyquist theorem you need to sample the sound at more than twice that frequency to be able to reproduce it perfectly in digital form.
So for playback to humans there's no advantage to a higher sampling rate. A bit simplified, but pretty much. So it's not a very relevant comparison in this case.
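A quick sketch of why fs/2 is the hard ceiling: a tone above the Nyquist limit doesn't just go missing after sampling, it folds back down ("aliases") to a lower frequency. This is plain arithmetic on idealized pure tones, no audio library involved:

```python
def aliased_frequency_hz(f_hz: int, fs_hz: int) -> int:
    """Frequency a pure tone of f_hz appears at after being sampled
    at fs_hz. Anything above the Nyquist limit (fs/2) folds back."""
    f = f_hz % fs_hz
    return fs_hz - f if f > fs_hz // 2 else f

# 20 kHz survives 44.1 kHz sampling intact...
print(aliased_frequency_hz(20_000, 44_100))   # -> 20000
# ...but a 30 kHz "ultrasonic" tone folds down to an audible 14.1 kHz
print(aliased_frequency_hz(30_000, 44_100))   # -> 14100
```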
I've already mentioned that, and the reason I bring it up is that I see a parallel with 4K: in both cases we hit biological diminishing returns, which makes advancing beyond the standard kind of pointless. With audio frequency it's our hearing that stops noticing a difference; with 4K it's our vision.
Of course there are those who claim to be able to hear a difference up to 96 kHz (and even beyond, lol), but they are outliers. I think 8K will have the same fate for most use cases.