Baron_Calamity
Member
Do you guys think a PC with an i7 and a 1.5 gig 660 would run this game better than a PS4?
At the same settings, probably not but if you are willing to turn down some options, you can run it at higher frame rate.
Do you guys think a PC with an i7 and a 1.5 gig 660 would run this game better than a PS4?
Well, it is. You probably shouldn't buy new hardware just because a shitty Ubisoft PC port runs like crap (if that's the background for your question). I've noticed such reactions multiple times now.
I'm not such a big fan of that guide.
It tells people to increase to a specific voltage (1.33V). I find that problematic, as different chips require different voltages to be stable. A safer approach is to find out what the chip's default Vcore voltage is and work up from there.
It tells people to immediately start testing from the highest speed the PC can boot at, but to me this seems like the path to pain. A more sensible approach is to gain stability at a smaller overclock and then work up. Some chips won't be stable at very high speeds no matter what voltage you pump into them. This goes hand in hand with adjusting the voltage up from its default value.
Also, the Intel Burn Test produces insane load that no game will ever match. Temperatures peaking over 80C are not cause for panic while running that test.
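The bottom-up approach described above (start at the chip's default Vcore, raise the clock in small steps, bump voltage only when a step fails) can be sketched roughly like this. All the clocks, voltages, and the stability model below are made-up placeholders, not real tuning values:

```python
# Rough sketch of the bottom-up approach: start at the chip's default
# Vcore, raise the clock in small steps, and bump voltage only when a
# step proves unstable. All numbers and the stability model are
# illustrative placeholders, not real tuning values.

def is_stable(clock_mhz, vcore_mv):
    """Placeholder for a real stress test (e.g. an hour under load).
    Toy model: every extra 5MHz needs roughly 1mV more."""
    needed_mv = 1100 + (clock_mhz - 3500) // 5
    return vcore_mv >= needed_mv

def find_overclock(default_vcore_mv=1100, max_vcore_mv=1300,
                   start_clock=3900, max_clock=4600,
                   clock_step=100, vcore_step_mv=10):
    vcore = default_vcore_mv
    best = None
    clock = start_clock
    while clock <= max_clock:
        # Raise voltage until this clock holds, or give up at the ceiling.
        while not is_stable(clock, vcore):
            vcore += vcore_step_mv
            if vcore > max_vcore_mv:
                return best  # this chip won't clock higher safely
        best = (clock, vcore)
        clock += clock_step
    return best

print(find_overclock())  # (4500, 1300): 4.5GHz at 1.30V in this toy model
```

Real stability testing obviously takes hours per step; the point is only the control flow the posts describe.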
I want to overclock my 4770k but I am terrified of messing with voltages.
Anybody with a <2GB card would be better off with a current-gen (PS4/X1) version.
A PS4... maybe. An Xbone, no way in heck. He can easily do 1080p on that at medium settings, which is WAY better than an Xbone, and not too far behind a PS4. In fact, you can probably also turn up a few settings to high - just not textures, at least not until we get a fix for the asset streaming issues from Ubi.
1.5GB just isn't enough for "high" textures. The stuttering issues people have with the game are, by and large, not due to VRAM ceilings.
Edit: The matter of the X1 version more or less boils down to preference. At 1080p the PC obviously commands a resolution advantage, but medium settings wouldn't compare favourably to what the X1 offers in other areas.
borderless window borderless window borderless window borderless window
How do you do that? I am assuming you are talking about UPlay.
A PS4... maybe. An Xbone, no way in heck. He can easily do 1080p on that at medium settings, which is WAY better than an Xbone, and not too far behind a PS4. In fact, you can probably also turn up a few settings to high - just not textures, at least not until we get a fix for the asset streaming issues from Ubi.
It's a much faster way to OC imo, and it will quickly tell you what kind of 24/7 OC you'll be able to achieve: start with an average Vcore for an average OC speed, then, based on a quick stability test, determine whether you can push more or need to back off, and finally optimize the Vcore. Doing it in small increments just takes ages, although at the end you can end up with a nice set of profiles for particular speeds. I run offset, so I don't ever need to wind my overclocks back, which makes multiple OC profiles pointless for me.
So for a 2500K you might try 1.3V at 4.4GHz as a starting point, then go from there.
As for the bolded part - neither approach is safer than the other; running a chip at a non-optimal Vcore isn't unsafe.
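The "start in the middle" method described above can be sketched the same way: begin at an average Vcore and clock, push the clock up or back it off based on a quick stability check, then trim the Vcore down. Again, the numbers and the stability check are illustrative assumptions only:

```python
# Rough sketch of the start-in-the-middle approach: begin at an average
# Vcore and clock for the chip, push the clock up or back it off based
# on a quick stability check, then trim Vcore down to the minimum that
# still passes. Numbers and the stability model are illustrative only.

def is_stable(clock_mhz, vcore_mv):
    """Placeholder for a quick stress test (toy model)."""
    return vcore_mv >= 1100 + (clock_mhz - 3500) // 5

def quick_overclock(start_clock=4400, start_vcore_mv=1300,
                    clock_step=100, vcore_step_mv=10, min_vcore_mv=1050):
    clock = start_clock
    # Push the clock up while the starting voltage still holds it.
    while is_stable(clock + clock_step, start_vcore_mv):
        clock += clock_step
    # Back the clock off if the starting point itself wasn't stable.
    while not is_stable(clock, start_vcore_mv) and clock > 0:
        clock -= clock_step
    # Trim Vcore to the lowest value that still passes at this clock.
    vcore = start_vcore_mv
    while (vcore - vcore_step_mv >= min_vcore_mv
           and is_stable(clock, vcore - vcore_step_mv)):
        vcore -= vcore_step_mv
    return clock, vcore

print(quick_overclock())  # (4500, 1300) in this toy model
```

The trade-off the post describes is visible here: fewer long stress runs, at the cost of not naturally producing a ladder of per-speed profiles.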
I thought both the PS4 and XB1 versions were 900p at 30fps.
When I launch Watch_Dogs it defaults to borderless window even though I have changed it to full screen in settings, is this happening to anyone else?
They just added an FPS counter to ShadowPlay.
Did Geforce experience update anything? It updated today.
I tried TXAA x4 (or whatever the max AA is) and it's running well at 2560! Everything on max!
Maybe I just need to drive around more. I have it capped to 35.
I was always on the latest drivers.
i5 2500K overclocked to 4.5GHz
R9 270X Gigabyte 2GB
1920x1080
Textures high
AA none
Vsync off
HBAO+ HIGH
LOD ultra
Reflections high
Shaders high
Water high
Shadow high
50-60 fps smooth as glass with vsync off.
I tried crossfire but have better luck just turning it off and using 1 card.
I've got a GTX 780 Classified (non-overclocked) and can run the game on ultra with SMAA at 40-60fps (mostly around the 40 range) at 1440p with vSync. I'm fine with this, but there is stutter (specifically while driving) which is very annoying. I know this is a widespread issue and about the Disable File Check target that you can add, but I'm not sure how I can add it to the uPlay version of the game. I also get a lot of pop-in, but this isn't really all that surprising.
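For what it's worth, the workaround circulating at the time was a launch argument, commonly reported as `-disablepagefilecheck` (double-check the spelling against wherever you saw it). For a desktop shortcut it goes after the executable path in the Target field, and the Uplay client reportedly also accepts it under the game's Properties as an additional launch argument. The install path below is just an example:

```shell
# Example shortcut Target. The path is illustrative and the flag name is
# as reported by the community - verify both before relying on them.
"C:\Program Files (x86)\Ubisoft\Watch_Dogs\bin\Watch_Dogs.exe" -disablepagefilecheck
```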
1.5GB just isn't enough for "high" textures. The stuttering issues people have with the game are, by and large, not due to VRAM ceilings.
I'm running
16 gigs of ram,
GTX 770 4gig
i5 750 cpu
.....how boned am I?
Yea I would say in most cases it's not a vram limitation. I have 6gb of vram (2 gtx 780's) and still can't play on ultra without having unplayable stuttering, while driving. Even on medium it's noticeable. The sad part is I made the mistake of getting the collectors edition...
2x 780 3GB isn't 6GB of total VRAM. Crossfire and SLI can only use the VRAM on a single card.
Such as what? The only difference from medium PC settings on Xbone is textures, which seem to be somewhere between medium on PC and the PS4's quality.
I'd take the option of choosing what to sacrifice any day of the week over being stuck with whatever the devs thought best - which is 900p or 792p.
If he wants to he can go 720p and probably high textures (with temporal SMAA) and better shadowing than the PS4.
It'll be up to him.
Digital Foundry's analysis states that the PS4 and X1's textures are like-for-like (i.e. high) with just the effects such as AO being slightly less refined on X1.
They also said the Xbone version often drops < 30 fps and has a lot of screen tearing.
I'd wait for a few more games designed only for One/PS4/PC before reaching that conclusion. There are people on 6GB Titans and 780Tis who are getting stuttering on Medium.

Game looks decent, but nothing earth shattering. And I am a bit annoyed that I can't even max it out. I only recently bought this rig, but I probably wouldn't have if I'd known 2GB GPUs were becoming obsolete.
I'd wait for a few more games designed only for One/PS4/PC before reaching that conclusion. There are people on 6GB Titans and 780Tis who are getting stuttering on Medium.
wut...
Not sure about that tbh. Maybe at 4K?
Yea I would say in most cases it's not a vram limitation. I have 6gb of vram (2 gtx 780's) and still can't play on ultra without having unplayable stuttering, while driving. Even on medium it's noticeable. The sad part is I made the mistake of getting the collectors edition...
From the previous page:
Admittedly he doesn't say what res, but it's only one of many examples in this thread of people with >2GB cards getting stuttering at settings lower than High - some even saying they get stuttering on the very lowest settings.
Sorry I mentioned it in a earlier post. 1080p stutter with 2 gtx780 6gb cards. It stutters on ultra and high to the point of making the game unplayable to drive. At medium its much better, but still stutters.
Aha. Well there ya go. People should stop thinking their hardware is obsolete or trying to brute-force performance by throwing money at it. The game has problems.
I don't think it's that weird - the High and Ultra textures are designed to cover more geometry. Low and Medium textures were only created to cover the more simplified geometry, as it is expected that a system that can run High textures can also run the full geometry detail. They'd be giving themselves unnecessary work by having to create another four sets of textures (High and Ultra to cover low geometry, Low and Medium to cover high geometry - combinations that shouldn't be necessary to use).

It's pretty fucking weird for the developers to create a setting that changes world geometry and texture definition, then label it "Textures".
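The combinations argument above can be made concrete with a little counting. The tier-to-geometry mapping here is assumed for illustration, not taken from the game's files:

```python
# The "Textures" setting bundles geometry detail with texture detail,
# so only the matching combinations ever need to be authored. The tier
# names and mapping are assumed for illustration.

geometry_for = {
    "low": "simplified",
    "medium": "simplified",
    "high": "full",
    "ultra": "full",
}

# Combinations that actually ship: each texture tier with its geometry.
shipped = {(tex, geo) for tex, geo in geometry_for.items()}

# Authoring every texture tier for both geometry tiers would double it.
all_combos = {(tex, geo) for tex in geometry_for
              for geo in ("simplified", "full")}

print(len(shipped), len(all_combos) - len(shipped))  # 4 shipped, 4 avoided
```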
You're right, but both his cards are 6GB.

I don't believe SLI works like that. It isn't like effectively doubling the VRAM; it will only use the amount that a single card has.
He confirmed that the new update will fix the problem - the game uses nearly 3GB of VRAM for graphics on consoles - and will alter the PC version to use the same split-memory concept.
Has this been posted yet? Sorry if old.
http://thefusejoplin.com/2014/06/watch-dogs-upcoming-pc-patch-smooth-frame-rates/
I'm running
16 gigs of ram,
GTX 770 4gig
i5 750 cpu
.....how boned am I?
Good for High
So the whole game is running off VRAM? Does that mean the 8GB of DDR3 I have in my PC has no impact?
Wow, if that's the case it means it's a damn straight port from the console version without any optimization. PC lead development? My ass.