I don't know if you're being purposely disingenuous or have no idea how people play on PC.
Let's just pretend that the small selection of people who chose to participate in a survey were actually the entire PC gaming user base. Did it not cross your mind that those other 98% are perhaps favoring frame rate over resolution, you know, one major advantage of PC gaming? I would be one of those people, because I use my TV in 1080p/120Hz mode rather than 4K/60Hz mode. What about all the people with 1440p/144Hz monitors, for example? Have you ever heard of a PC gamer sacrificing their frame rate just to hit 4K?
Do you have any data to support that? In the majority of cases I see, the people playing at the highest frame rates are competitive FPS players, who don't care much about top visual quality, just fast response times to get more kills.
Where are all the examples in this thread from people running games at 120Hz and 120fps?
Maintaining a stable 120fps or 144fps in games means sacrificing a lot of quality; otherwise you're just reducing input lag without seeing any real improvement in smoothness.
Also, this argument about PCs not being included because they don't have generations ignores the fact that PC hardware has far more generations than consoles. An i7 4790K is not the same gen as a 9900K, a 780Ti is not the same gen as a 2080Ti - but I know the point you're trying to make.
Don't twist my words. What I said is that PC is not "next gen"; "next gen" and "last gen" are console terms. PCs are a total mix of different hardware components in constant evolution. The point I made is that this thread seems to have started as a comparison of games on similar hardware ("next gen" consoles), but as time went on and PCs kept evolving, instead of comparing games under fair conditions it became all about "mine is the best!" just because the latest and most expensive PC hardware can brute-force some settings.
Anybody can experience the game at 4K, 5K, even 8K with maxed settings if they want to, though. It doesn't matter what hardware you have: the same settings are there, available to everyone, and everyone can display exactly the same images. The only difference is how quickly their computer can render those images, that's it.
So it doesn't matter if the game runs at 5fps, as long as you can take a good screenshot to post on a forum?
Well, that's one point of view...
I'm more interested in real gameplay conditions. High-quality videos, especially 4K60 with 50-70Mbps bitrates, are much more representative of real gameplay than some static, cherry-picked screenshots.
Get better hardware; the majority of us here run 4K. This is a next-gen face-off, meaning the latest and greatest as it continually evolves, not a mid-2016 face-off.
Where is your data for "the majority of us"?
I do have better hardware: a Ryzen, 2080S, 32GB, NVMe build able to run Forza 7 and Forza Horizon 4 at Ultra settings at 4K60, even forcing supersampling on transparency:
and even with Reshade on top and 3D:
but I acknowledge that this is not the case for the majority of people out there, and I can appreciate what's great about games that run on much more limited hardware.
It's not about "me me me me" and "my PC is best! haha" or "pcmasterrace consolepeasants yadda yadda", but about what makes a fair comparison, what the merits of the games are, along with the engines they run on and the studios behind them, and what the merits of vastly more expensive hardware components are.