
Ultra High Resolution PC Textures

Celcius

°Temp. member
Many modern PC gaming graphics cards have loads of VRAM:

RTX 3090 = 24 GB
RTX 3090 Ti = 24 GB
RTX 4090 = 24 GB
RTX 4080 = 16 GB
RX 6800 = 16 GB
RX 6800 XT = 16 GB
RX 6900 XT = 16 GB
RX 6950 XT = 16 GB
RX 7900 XT = 20 GB
RX 7900 XTX = 24 GB
Intel Arc A770 = 16 GB

For reference, the PS5 and Xbox Series X have 16 GB of total memory shared between the CPU and GPU (meaning it's both RAM and VRAM).
Imagine a game using 8 GB of RAM for the CPU and 8 GB of VRAM for the GPU. In this example, an RX 6800 XT would have double the space available and an RTX 3090 would have triple the space available. Many PC games today don't utilize all of this extra VRAM, but will developers start adding ultra-high-resolution textures to PC games and then simply scaling them down for consoles or lower settings on PC?
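To put rough numbers on that comparison (a quick back-of-the-envelope sketch; the 8 GB console VRAM figure is just the hypothetical split from the example above, not how the consoles actually divide their memory):

```python
# Rough VRAM headroom vs a hypothetical 8 GB console VRAM budget.
# Purely illustrative; real consoles reserve memory for the OS and
# split the rest differently, as later posts point out.
console_vram_gb = 8

cards = {
    "RTX 4080 / RX 6800 XT (16 GB)": 16,
    "RX 7900 XT (20 GB)": 20,
    "RTX 3090 / RTX 4090 (24 GB)": 24,
}

for name, vram_gb in cards.items():
    print(f"{name}: {vram_gb / console_vram_gb:.1f}x the assumed console VRAM budget")
```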

Many current-gen games look nice, but then you put the camera up close to a rock or the ground and it doesn't look so great. I think taking advantage of all the extra VRAM in modern video cards for ultra-high-resolution textures could make a big difference.

"But Celcius, this would increase the size of game installs! My poor SSD!"
Not necessarily. Take Call of Duty: Black Ops Cold War - when you install the game it asks which game modes you want to install, and higher-resolution textures are an optional install. Final Fantasy XV takes yet another approach, which I think is best - the higher-resolution textures are a free DLC. If you download and install the DLC then you can enable the high-resolution textures, and if you don't then you just get the base game with the default textures. (Though what I'm asking for is textures that are much higher res than what either of these games currently provides.)

What do you think, GAF? Video cards with huge amounts of VRAM are here and I'm ready to say goodbye to some of the PS3-looking textures out there.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
You won't be seeing textures above 4K any time soon.
Not because devs couldn't make textures above that, but because of diminishing returns: if you are rendering your game at 4K, textures above 4K basically add nothing. Hell, a lot of games use a mix of 4K and lower, because sometimes an object takes up so little screen space at any given time that it isn't worth giving it a huge texture that eats VRAM while adding very little visually.

Let's not forget how virtual geometry and virtual textures are a paradigm shift.

Even in offline rendering, people don't really use textures above 4K.
Some hero assets might have multiple 4K UDIMs, but most GPU renderers have compression options that squeeze those textures down.
That, plus new de/compression algorithms, means we likely won't be seeing 16 GB cards actually getting filled any time soon (memory leaks notwithstanding).
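To put rough numbers on that (generic block-compression arithmetic, not tied to any particular renderer or to the compression options mentioned above): GPUs sample block-compressed formats like BC1 (0.5 bytes per texel) and BC7 (1 byte per texel) directly, so even large textures stay fairly small in VRAM.

```python
# Rough in-memory size of block-compressed textures.
# BC1 ~ 0.5 bytes/texel, BC7 ~ 1 byte/texel; a full mip chain adds ~1/3.
def compressed_size_mib(side, bytes_per_texel, with_mips=True):
    size = side * side * bytes_per_texel
    if with_mips:
        size *= 4 / 3
    return size / (1024 ** 2)

for side in (2048, 4096, 8192):
    print(f"{side}x{side}: BC1 ≈ {compressed_size_mib(side, 0.5):.0f} MiB, "
          f"BC7 ≈ {compressed_size_mib(side, 1.0):.0f} MiB (with mips)")
```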

For instance, a render like this won't even go out of core on a 12 GB card at 1440p.
[image: example GPU render]


And yes, this is a GPU render with amazing textures, amazing ray-traced global illumination and multi-bounce reflections.
You're not wrong, but this would be viewed as a "waste of time" due to the small user base it'd be for; even though there are plenty of such cards, devs wouldn't see the point of doing that extra work.
Almost all texture artists make/publish their textures at 4K or higher, then compress them down for the game.
Shipping a game with "max" textures would actually be less work for artists because they wouldn't have to worry about how the texture looks at lower quality.
That's also part of the reason people love virtual geometry: devs don't have to work extra hard making a high-poly model and a low-poly model, then baking normal maps.
At least in Unreal Engine, most of the work is done for you; just drop the model in and let the engine do the work.
The LOD0 will basically be the full-resolution asset as the artist made it.
 

nkarafo

Member
The first Unreal Tournament had this figured out. If you used the correct hardware/API, the textures would change up close, revealing a different detailed texture. The up-close detail was amazing, without needing a huge texture that looks good from all distances. And the transition was seamless.

This is the only game I remember having this effect. There are probably others that I don't remember now, but they are from the same era. I have no idea why it never became a thing.
 
The first Unreal Tournament had this figured out. If you used the correct hardware/API, the textures would change up close, revealing a different detailed texture. The up-close detail was amazing, without needing a huge texture that looks good from all distances. And the transition was seamless.

This is the only game I remember having this effect. There are probably others that I don't remember now, but they are from the same era. I have no idea why it never became a thing.
I thought (so I could be wrong) that this is how most engines work these days, in terms of both geometry and texture detail. For example, when you change the texture memory amount in an RE Engine game it'll render the higher-resolution textures further out.
 

yamaci17

Member
Games are designed for consoles first and foremost. Most devs usually prepare one set of textures and call it a day. Anything lower will be hilariously minced, bad-looking textures (low/medium presets in most PC games look dreadful; only maximum texture settings are usable in most cases. I said most cases - some games are outliers, such as Doom Eternal, which streams fewer textures instead).


Here's how AC Valhalla with high textures looks against the PS4. They look similar. Quite literally, all consoles and all maxed-out PCs use the one set of high textures, and that's it. End of story. Medium textures are hilarious garbage that no console uses.

To begin with, consoles will most likely use 9-10 GB for VRAM purposes and 3-5 GB of RAM for CPU purposes. They're not going to simply use 8 GB of RAM or something; 2-2.5 GB is allocated to the console OS itself. Also, consoles don't need RAM the way PCs do: on PC you copy lots of stuff to RAM and then to VRAM, which creates an additional, unnecessary RAM load that doesn't happen on consoles (so said Nixxes).

Then come the other factors. Consoles will start to omit ray tracing or use it sparingly, or not enable super-duper ultra effects. Maxed-out ray tracing with high-quality ray tracing effects can have a huge effect on VRAM usage. Rendering at native 4K, or 4K/DLSS Quality, could also take up more VRAM than a game running at the usual 4K/1200p upscaling on consoles.

Then you have various other, mostly useless settings.

Don't worry, people/devs will find ways and settings to fill up VRAM.

Here's an example: the PS4 has 8 GB in total (4-5 GB for GPU operations, 1.5-2.5 GB for CPU operations). Then you have 8 GB GPUs on the other side. Guess what - most games at 1440p/4K filled up that 8 GB of VRAM quite nicely. You add some extra settings or stuff on top of it and suddenly you're knee deep.

These are all relevant discussion points for 16 GB. As for 24 GB, it's clear that it's not meant for gaming at all. You can still find stuff, 8K and whatnot, to fill it up.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The first Unreal Tournament had this figured out. If you used the correct hardware/API, the textures would change up close, revealing a different detailed texture. The up-close detail was amazing, without needing a huge texture that looks good from all distances. And the transition was seamless.

This is the only game I remember having this effect. There are probably others that I don't remember now, but they are from the same era. I have no idea why it never became a thing.
The texture was already a "high quality" texture and the mipmaps are generated from that.
It's not making lower-detail textures higher quality; it's doing the exact opposite.
In fact, mipmaps increase the size of a texture because you have to store the lower-level mips in it as well.
If an object/texture is only ever rendered one way - for instance a texture that only ever appears in a cutscene close to the camera - you might not ever generate a mipmap, and the texture exists only at whatever its size/detail is.
But if a texture can be viewed from multiple distances, it is advised to generate mipmaps so that at a distance the larger mip levels don't have to be loaded into memory.




What OP is suggesting is that the highest MIP would be something beyond 4K.
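For a rough idea of the mip overhead mentioned above, here's a small sketch (uncompressed RGBA8 assumed; compression changes the absolute sizes but not the roughly one-third overhead of a full chain):

```python
# Memory of a full mip chain for an uncompressed RGBA8 texture (4 bytes/texel).
def mip_chain_mib(side, bytes_per_texel=4):
    total, level_side = 0, side
    while level_side >= 1:
        total += level_side * level_side * bytes_per_texel
        level_side //= 2
    return total / (1024 ** 2)

base_mib = 4096 * 4096 * 4 / (1024 ** 2)
full_mib = mip_chain_mib(4096)
print(f"4096x4096 base level: {base_mib:.0f} MiB, "
      f"with full mip chain: {full_mib:.0f} MiB (+{(full_mib / base_mib - 1) * 100:.0f}%)")
```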
 

PeteBull

Member
Many PC games today don't utilize all of this extra VRAM, but will developers start adding ultra-high-resolution textures to PC games and then simply scaling them down for consoles or lower settings on PC?
Wait for big-budget games that are current-gen only, from talented and well-established A-tier or S-tier dev studios; they will most likely take advantage of the extra VRAM in our high-end GPUs, and of CPUs that are already much faster than the downclocked R7 3700X (Zen 2 architecture) with cut-down cache that's in the PS5/XSS/XSX.

Once those systems are the baseline, and not the PS4/Xbox One - which even in 2013 weren't top-end products (the 8 GB of GDDR5 in the PS4's case being an exception) - we can think of games looking next-gen, i.e. roughly on the level of the Matrix demo (if you didn't see it, there are plenty of videos on YouTube, including in-depth explanations from tech channels like Digital Foundry).
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Wait for big-budget games that are current-gen only, from talented and well-established A-tier or S-tier dev studios; they will most likely take advantage of the extra VRAM in our high-end GPUs, and of CPUs that are already much faster than the downclocked R7 3700X (Zen 2 architecture) with cut-down cache that's in the PS5/XSS/XSX.

Once those systems are the baseline, and not the PS4/Xbox One - which even in 2013 weren't top-end products (the 8 GB of GDDR5 in the PS4's case being an exception) - we can think of games looking next-gen, i.e. roughly on the level of the Matrix demo (if you didn't see it, there are plenty of videos on YouTube, including in-depth explanations from tech channels like Digital Foundry).
"Talented" teams dont waste VRAM.
They are the teams least likely to force a game to need 12+GB of VRAM.
 

PeteBull

Member
"Talented" teams dont waste VRAM.
They are the teams least likely to force a game to need 12+GB of VRAM.
We're talking about an ultra-high-res texture option/texture pack, and of course no game needs that much VRAM yet - I've got 12 GB myself on my 3080 Ti - but we've got to be realistic here: games made from the ground up for current-gen consoles will keep pushing PC requirements big time versus games also made for last-gen consoles.

As proof I can give you the example of Battlefield 3 vs Battlefield 4 vs Battlefield 1 vs BF5 requirements.


BF1 and BF5 were not on PS3/360 anymore - a big difference vs Battlefield 3 in both minimum and especially recommended specs. That's where devs think the game looks/runs acceptably (not saying it does, it's a bit subjective, but it's a general benchmark).

Edit: added BF1 too so we get the full spectrum/progression of PC requirements, from BF3 (2011, fully made for the PS3/X360 gen) to BF5, which probably took the most advantage of PS4/Xbox One in 2018. BF1 was the first without a last-gen version, so no more Xbox 360/PS3.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
We're talking about an ultra-high-res texture option/texture pack, and of course no game needs that much VRAM yet - I've got 12 GB myself on my 3080 Ti - but we've got to be realistic here: games made from the ground up for current-gen consoles will keep pushing PC requirements big time versus games also made for last-gen consoles.

As proof I can give you the example of Battlefield 3 vs Battlefield 4 vs BF5 requirements.


BF5 was not on PS3/360 anymore - a big difference vs Battlefield 3 in both minimum and especially recommended specs. That's where devs think the game looks/runs acceptably (not saying it does, it's a bit subjective, but it's a general benchmark).
What's any of what you posted got to do with VRAM usage and texture sizes?

Going over 4K textures leads to heavy, heavy diminishing returns, especially if you are rendering a scene at or around 4K resolution.
When games are being rendered at 8K consistently, then maybe we will see 8K textures.
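The diminishing-returns point can be sanity-checked with simple texel-density arithmetic (a rough sketch assuming a flat, screen-facing surface; real scenes vary a lot):

```python
# How many texels map to each screen pixel when an object with an NxN texture
# covers a given fraction of a 3840-pixel-wide (4K) frame.
def texels_per_pixel(texture_side, screen_width_px, coverage):
    return texture_side / (screen_width_px * coverage)

for tex_side in (4096, 8192, 16384):
    ratio = texels_per_pixel(tex_side, 3840, coverage=0.5)
    print(f"{tex_side}px texture on an object filling half the 4K frame: "
          f"{ratio:.1f} texels per screen pixel")
```

Anything much past ~1 texel per pixel is detail the display can't resolve at that distance, which is the diminishing-returns argument in a nutshell.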
 

PeteBull

Member
What's any of what you posted got to do with VRAM usage and texture sizes?

Going over 4K textures leads to heavy, heavy diminishing returns, especially if you are rendering a scene at or around 4K resolution.
When games are being rendered at 8K consistently, then maybe we will see 8K textures.
Yup, but what I'm saying is - dunno if you remember those discussions from 10 years back - people claimed 2 GB of VRAM was more than enough for 1080p; with time it changed to 3, then 4 GB, and now it's 8 GB for 1080p. I see a trend here :)
 

Skifi28

Member
Yup, but what I'm saying is - dunno if you remember those discussions from 10 years back - people claimed 2 GB of VRAM was more than enough for 1080p; with time it changed to 3, then 4 GB, and now it's 8 GB for 1080p. I see a trend here :)
That's if you want to use the highest-quality textures, which aren't really meant for 1080p.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Yup, but what I'm saying is - dunno if you remember those discussions from 10 years back - people claimed 2 GB of VRAM was more than enough for 1080p; with time it changed to 3, then 4 GB, and now it's 8 GB for 1080p. I see a trend here :)
OP is talking about RTX 30-series and RX 6000-series cards.

Sure, devs will eventually actually eat 20 GB of VRAM, when games are rendering at well above 4K and textures and geometry are much denser.
At that point an RTX 30-series card, even with all that VRAM, wouldn't be able to render the game anyway.
Think of it like this.
If you gave a GTX 570 20GB of VRAM, how do you think it would handle modern games?

Note: using 4K+ textures at 1080p is stupid.
You don't have enough pixels to see any of the detail you are forcing into VRAM.
 

PeteBull

Member
OP is talking about RTX 30-series and RX 6000-series cards.

Sure, devs will eventually actually eat 20 GB of VRAM, when games are rendering at well above 4K and textures and geometry are much denser.
At that point an RTX 30-series card, even with all that VRAM, wouldn't be able to render the game anyway.
Think of it like this.
If you gave a GTX 570 20GB of VRAM, how do you think it would handle modern games?

Note: using 4K+ textures at 1080p is stupid.
You don't have enough pixels to see any of the detail you are forcing into VRAM.
Agree fully here
 

Dr.D00p

Member
This is what Nanite in UE5 is designed to overcome, at least when it comes to ground textures and other environmental objects - basically 'infinite detail'.

...maybe not so much on character models, though.
 

poppabk

Cheeks Spread for Digital Only Future
I just installed Rise of the Tomb Raider and when you select the highest-res textures it gives a big warning that they might use... over 4 GB of VRAM.
 

poppabk

Cheeks Spread for Digital Only Future
That's if you want to use the highest-quality textures, which aren't really meant for 1080p.
I can't think of a single game that, even when played at 1080p, didn't have textures that looked a little ropey right up close. If a tenth of your 4K texture is filling the entire screen then it will start to look blurry.
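Putting rough numbers on the "a tenth of your 4K texture fills the screen" case (illustrative arithmetic, assuming a 4096-texel-wide texture and a 1920-pixel-wide frame):

```python
# Magnification blur: a small slice of the texture stretched across the frame.
texture_width_texels = 4096
frame_width_px = 1920
visible_fraction = 0.1          # "a tenth of your 4K texture"

visible_texels = texture_width_texels * visible_fraction
pixels_per_texel = frame_width_px / visible_texels
print(f"{visible_texels:.0f} texels stretched over {frame_width_px} pixels "
      f"= {pixels_per_texel:.1f} screen pixels per texel (visibly blurry)")
```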
 

winjer

Member
High-resolution texture packs for PC versions have been common throughout all console generations.
In some games it's more noticeable than in others, but it's usually a good thing to have, if you have enough VRAM.
So we can expect to see this in the coming years.

We also have to consider that consoles don't have that much memory. At 16 GB, that's only double what the PS4 had - the smallest gen-over-gen increase ever.
 

Edgelord79

Gold Member
Many of the larger games on PC have ultra-high-resolution texture packs on Nexus Mods. This has been happening for years, and they are often better than the official versions that are or could be put out. Mileage varies depending on the GPU used, as has been mentioned here, as well as the resolution of your monitor.

In many cases, though, it provides diminishing returns and is not as impactful on its own as visual and lighting overhauls.
 

rofif

Can’t Git Gud
Waste of time. The textures are already good.
And some games already take as much as 13 GB of VRAM, like Resident Evil 2 with ray tracing.
But wasting it on textures is pointless.
 

64bitmodels

Reverse groomer.
I'd love that, but it doesn't seem very practical when most PC users (myself included) don't have a hard drive/SSD or a GPU with enough VRAM to take advantage of it. The high-end PC userbase (xx70/x700 and up) is very small compared to lowly budget builders and mid-rangers like me.
Plus, as mentioned above, games are made mostly with consoles as the baseline and are scaled up or down from there, so they likely don't have textures any higher res than what's currently available for PS4/PS5. Finally, people don't really consider textures that important these days: people want more realistic lighting and models, but a blurry texture every now and then isn't a concern. Which makes sense, seeing as textures have been blurry and low-res on consoles for 25 years - it's kind of been the norm for a quarter century.
 

poppabk

Cheeks Spread for Digital Only Future
Many of the larger games on PC have ultra-high-resolution texture packs on Nexus Mods. This has been happening for years, and they are often better than the official versions that are or could be put out. Mileage varies depending on the GPU used, as has been mentioned here, as well as the resolution of your monitor.

In many cases, though, it provides diminishing returns and is not as impactful on its own as visual and lighting overhauls.
Even though you could/can do it, it is kinda pointless.
 

Edgelord79

Gold Member
Even though you could/can do it, it is kinda pointless.
I agree that some of the time it is. But I got better performance and quality out of some unofficial texture packs than official ones. Like I said, the lighting and graphical overhaul mods often provide more value to the game than the textures themselves.
 

Hoddi

Member
Texture memory requirements are going down and not up with SSDs. There's much less reason to keep 8 GB of textures cached in memory when you can just stream the mips that are visible.

Modern games like Spider-Man and Cyberpunk will often use 8-10 GB of VRAM, but the textures are less than 2 GB of that.
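A rough sketch of why streaming shrinks the resident set: the mip level a surface needs follows from how big it is on screen, so most surfaces only need small mips at any given moment (simplified heuristic; real engines use screen-space derivatives and per-tile feedback):

```python
import math

# Which mip level of an NxN texture is roughly enough for an object that
# covers a given number of pixels on screen. A streaming system only needs
# to keep that level and smaller ones resident.
def needed_mip(texture_side, onscreen_px):
    if onscreen_px <= 0:
        return int(math.log2(texture_side))          # smallest mip only
    return max(0, math.ceil(math.log2(texture_side / onscreen_px)))

for onscreen_px in (2048, 512, 64):
    level = needed_mip(4096, onscreen_px)
    side = 4096 >> level
    print(f"object ~{onscreen_px}px on screen -> mip {level} ({side}x{side}) is enough")
```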
 

Celcius

°Temp. member
How many times do you put the camera right up to a rock? Lol
For example, Final Fantasy VII Remake got a PS4-to-PS5 upgrade. When playing the Yuffie DLC (exclusive to PS5), I remember looking at the rock textures in the opening area and thinking that they don't look like they take advantage of the PS5. I'm one of the people who looks for footsteps in the snow behind characters, etc... lol
 

Esppiral

Member
Game resolution is not tied to texture resolution, or vice versa - stop that bullshit. Some Dreamcast games use 2K-resolution textures while the max output resolution of the console is 480p, and the difference in texture quality is noticeable. Just stop.
 

nkarafo

Member
What about geometry density replacing textures then?

Remember Metroid Prime? That game had some very low-res textures and no bump mapping - things that make surfaces look ugly. Yet the game looked great because of its dense geometry. The smaller the flat surfaces are, the less need there is for a higher-resolution texture. Theoretically, you could have such density that the flat surfaces would be small enough to not even need a texture - just a color value or something. But from afar, there would be enough surfaces that their combined colors would depict something resembling a texture.

Textures take much more space than geometry, so this would make games smaller in size. The problem, though, is that even if the GPU were strong enough to produce those polygons, how much work would it take from the developer? The finer-detail geometry would probably need to be AI/randomly generated, while the developers form only the "macro" geometry by hand, not unlike how they do now - and use photos and high-res assets as reference to assign the colors to each tiny surface semi-automatically?
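A very rough storage comparison for that trade-off (illustrative only: uncompressed texture, 12-byte position plus 4-byte colour per vertex, ignoring indices, normals and compression on either side):

```python
# Texture storage vs per-vertex colour detail, very simplified.
def texture_mib(side, bytes_per_texel=4):
    return side * side * bytes_per_texel / (1024 ** 2)

def colored_mesh_mib(vertex_count, bytes_per_vertex=16):
    return vertex_count * bytes_per_vertex / (1024 ** 2)

print(f"4096x4096 RGBA8 texture: {texture_mib(4096):.0f} MiB")
for verts in (100_000, 1_000_000, 16_000_000):
    print(f"{verts:>10,} coloured vertices: {colored_mesh_mib(verts):.0f} MiB")
```

Coarse per-vertex colour is cheap; matching a 4K texture texel-for-texel with vertices is not, so how the numbers fall depends entirely on how dense the geometry really needs to be.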
 

xPikYx

Member
I've been modding on PC for years and yes, 8K textures make a difference - the difference shows when you get up close to them. Even 4K textures start getting blurry and pixelated when you're too close; 8K is the sweet spot. Rendering resolution and texture resolution are two separate things, by the way.
 

GustavoLT

Member
I just want devs to make games run as cleanly as possible - I mean at least 60fps, no aliasing, no bad frame pacing, no stutters (mostly on PC), no save/load shader freezes... just a clean, good-looking game with responsive gameplay!

Stop putting puddles everywhere just to justify your shitty ray tracing feature!!! Make it worth it - if it kills resources, throw it away!
 

lukilladog

Member
You won't be seeing textures above 4K any time soon.
Not because devs couldn't make textures above that, but because of diminishing returns: if you are rendering your game at 4K, textures above 4K basically add nothing. Hell, a lot of games use a mix of 4K and lower, because sometimes an object takes up so little screen space at any given time that it isn't worth giving it a huge texture that eats VRAM while adding very little visually.

Let's not forget how virtual geometry and virtual textures are a paradigm shift.

Even in offline rendering, people don't really use textures above 4K.
Some hero assets might have multiple 4K UDIMs, but most GPU renderers have compression options that squeeze those textures down.
That, plus new de/compression algorithms, means we likely won't be seeing 16 GB cards actually getting filled any time soon (memory leaks notwithstanding).

For instance, a render like this won't even go out of core on a 12 GB card at 1440p.
[image: example GPU render]


And yes, this is a GPU render with amazing textures, amazing ray-traced global illumination and multi-bounce reflections.

Almost all texture artists make/publish their textures at 4K or higher, then compress them down for the game.
Shipping a game with "max" textures would actually be less work for artists because they wouldn't have to worry about how the texture looks at lower quality.
That's also part of the reason people love virtual geometry: devs don't have to work extra hard making a high-poly model and a low-poly model, then baking normal maps.
At least in Unreal Engine, most of the work is done for you; just drop the model in and let the engine do the work.
The LOD0 will basically be the full-resolution asset as the artist made it.

Yeah well, make that a car with a one-texture body racing livery and start taking close-up shots... an 8K texture would be required at 4K.

BTW, I made a rug in Skyrim and 8K barely cuts it at 1080p because it is so big:

[image: the 8K rug in Skyrim at 1080p]
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Yeah well, make that a car with a one-texture body racing livery and start taking close-up shots... an 8K texture would be required at 4K.

BTW, I made a rug in Skyrim and 8K barely cuts it at 1080p because it is so big:

[image: the 8K rug in Skyrim at 1080p]
How is that rug UV'd?
Cuz no way that requires an 8K texture.
UV unwrap the rug over itself and use even a 2048x2048 texture; I'm betting the results look better than the 8K texture.
You are also losing detail in the texture because the screenshot/render literally doesn't have enough pixels to display the texture.
It's an 8K texture - take the screenshot at 8K and let the details shine.


P.S. Unless the car is made of a single piece of geometry it wouldn't realistically be using a single texture; it would at least be a bunch of UDIMs for the different components.
If it's a single piece of geometry, GTA3-style... I'll have to give it a go.
It actually sounds like a nice little challenge.
A relatively detailed racing car livery using only a single texture at or under 4K.
 

Alexios

Cores, shaders and BIOS oh my!
The first Unreal Tournament had this figured out. If you used the correct hardware/API, the textures would change up close, revealing a different detailed texture. The up-close detail was amazing, without needing a huge texture that looks good from all distances. And the transition was seamless.

This is the only game I remember having this effect. There are probably others that I don't remember now, but they are from the same era. I have no idea why it never became a thing.
I don't recall this being a thing, can you link to any documentation about it?

Words like "textures" are pretty vague, so I just get results about things like the "detail textures" setting that was common back then, or about the alternative high-quality textures included on UT GOTY's second disc for use with cards that supported the new S3TC compression technique.
 

intbal

Member
I don't recall this being a thing, can you link to any documentation about it?

Words like "textures" are pretty vague, so I just get results about things like the "detail textures" setting that was common back then, or about the alternative high-quality textures included on UT GOTY's second disc for use with cards that supported the new S3TC compression technique.

It was similar, but they weren't using normal maps back then. So it was just a high frequency tiling texture that would combine with the regular texture when up close.
 

Celcius

°Temp. member
You are also losing detail in the texture because the screenshot/render literally doesn't have enough pixels to display the texture.
It's an 8K texture - take the screenshot at 8K and let the details shine.

Game resolution is not tied to texture resolution, or vice versa - stop that bullshit. Some Dreamcast games use 2K-resolution textures while the max output resolution of the console is 480p, and the difference in texture quality is noticeable. Just stop.
 

Alexios

Cores, shaders and BIOS oh my!

It was similar, but they weren't using normal maps back then. So it was just a high frequency tiling texture that would combine with the regular texture when up close.
Like I mention in your quote, I know about detail textures. I'm just fairly sure it wasn't something that only showed when you got up close, but an always-on thing that was simply most visible up close at the time's resolutions etc. That link specifically doesn't mention it only being enabled per texture by distance or anything; the way it's worded, at least, it doesn't quite make that certain. And if it is what you say and only enabled at a certain distance then, again, it was pretty commonly used in games, not just UT, for a good amount of time (probably until folks could put that much detail in the actual textures).

Edit: OK, so it was further down the page under "distance based detail texturing", but it seems that was mostly used to avoid the obvious tiling/repetition of detail textures when further away, rather than as a way to stream in better textures only when looking at things up close. I guess it's close enough.

Edit again: well, I saw it mention mobile platforms at the bottom so I was like, what, no way they had those in mind back then - and indeed it seems that link is about Unreal Engine 4+. I was asking about UT 99+; or is that a feature introduced back then and still in effect today in the same way?

Edit after your edit: OK, then that shows it's still in effect - not something particularly special about UT (other than it being among the first games to use it) that we have since lost, as the post I was replying to implied. It's still used (not to mention the heavy use of actual texture streaming now).
 

intbal

Member
Like I mention in your quote, I know about detail textures. I'm just fairly sure it wasn't something that only showed when you got up close, but an always-on thing that was simply most visible up close at the time's resolutions etc. That link specifically doesn't mention it only being enabled per texture by distance or anything; the way it's worded, at least, it doesn't quite make that certain. And if it is what you say and only enabled at a certain distance then, again, it was pretty commonly used in games, not just UT, for a good amount of time (probably until folks could put that much detail in the actual textures).

Edit: OK, so it was further down the page under "distance based detail texturing", but it seems that was mostly used to avoid the obvious tiling/repetition of detail textures when further away, rather than as a way to stream in better textures only when looking at things up close. I guess it's close enough.

Edit again: well, I saw it mention mobile platforms at the bottom so I was like, what, no way they had those in mind back then - and indeed it seems that pertains to Unreal Engine 4+. I was asking about UT specifically; or is that a feature introduced back then and still in effect today in the same way?

I'm going off memory. I recall reading in PC magazines at the time about the feature that they intended to add to Unreal (1). It was as I described and not as that link described. I only offered that link up because that's what the feature eventually turned into many years later.

Edit: Here it is, from the wikipedia page on Unreal:

Unreal was one of the first games to utilize detail texturing. This type of multiple texturing enhances the surfaces of objects with a second texture that shows material detail. When the player stands within a small distance from most surfaces, the detail texture will fade in and make the surface appear much more complex (high-resolution) instead of becoming increasingly blurry.
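The idea in that description can be sketched as a simple blend - a tiling, high-frequency detail value modulating the base colour, faded out with distance so it only shows up close (a generic illustration of the technique, not Unreal's actual shader code):

```python
# Distance-faded detail texturing, generic sketch.
def shade(base_rgb, detail_value, distance, fade_start=1.0, fade_end=4.0):
    # t = 1.0 up close (full detail), 0.0 far away (no detail)
    t = min(max((fade_end - distance) / (fade_end - fade_start), 0.0), 1.0)
    # detail_value is centred on 0.5; scaling around 1.0 adds contrast up close
    scale = 1.0 + (detail_value - 0.5) * 2.0 * t
    return tuple(min(1.0, channel * scale) for channel in base_rgb)

print(shade((0.4, 0.35, 0.3), detail_value=0.8, distance=0.5))   # up close: detail visible
print(shade((0.4, 0.35, 0.3), detail_value=0.8, distance=10.0))  # far away: plain base colour
```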
 
Current-gen consoles are made with ultra-fast texture streaming in mind, so developers can use the highest-quality textures if they want. The UE5 tech demo running on PS5 was using up to 8K textures!

The Velocity Architecture tech demo shows it's possible to render a scene that would need 10 GB of VRAM in just 3.5 GB thanks to ultra-fast streaming and SFS technology, so the 13 GB available to developers on the XSX is plenty.
 

lukilladog

Member
How is that rug UV'd?
Cuz no way that requires an 8K texture.
UV unwrap the rug over itself and use even a 2048x2048 texture; I'm betting the results look better than the 8K texture.
You are also losing detail in the texture because the screenshot/render literally doesn't have enough pixels to display the texture.
It's an 8K texture - take the screenshot at 8K and let the details shine.


P.S. Unless the car is made of a single piece of geometry it wouldn't realistically be using a single texture; it would at least be a bunch of UDIMs for the different components.
If it's a single piece of geometry, GTA3-style... I'll have to give it a go.
It actually sounds like a nice little challenge.
A relatively detailed racing car livery using only a single texture at or under 4K.

I don't know how it was UV'd; the mesh is borrowed from another mod. Once you're walking over it, 1080p becomes plenty to display small portions of it:

[image: close-up of the rug at 1080p]



As for racing car liveries, most if not all racing games use one big texture for most of the car, which is then mapped onto several parts of the body. Like this diffuse map, for example... you want a 4K one; 2048 won't cut it for 1080p, as you can see below:



While a 4K one does much better:

 

april6e

Member
I have stated for a decade now that consoles should have upgradable parts like a PC, and I get laughed out of the room every time I say it online.

For multiple generations now, all three gaming companies have had multiple versions of the same console: PS4 > Pro, Wii to Wii U, and the three Xbox Series consoles. Provide a base console and then offer multiple purchasable graphics card/RAM configurations (like two different upgrades to the base console) that you can install into the base console. The consumer can then choose whether they just want the cheap base console ($400-500) or the extremely expensive upgrade parts ($800-1000) that allow for PC-level gaming. Nintendo wouldn't be able to do this, but Sony and Microsoft definitely could.
 