I picked up both the PS4 and Xbox One on their release days, and recently completed the trinity with an Alienware 14. With Watch Dogs nearing its release date, hopefully you guys can help an uninformed enthusiast out. The Alienware 14 laptop has a GTX 765M graphics card, an Intel Core i7-4700MQ 3.4 GHz processor, 16 GB of RAM, and a Samsung EVO SSD. The game's specs use desktop graphics cards as guidelines, so as a PC rookie I'm unsure how well this laptop's internals will fare. For a comparison of sorts, I can run Sleeping Dogs with everything maxed except anti-aliasing and get 50 FPS. Which platform do you believe 'dogs will run best on, the Alienware or the PS4? Thanks for any input!
-
While your 765M (assuming stock?) is a tad lower than the recommended 560 Ti, I'd imagine you'd get a better experience on PC, if only for a 30+ framerate. I'm sure someone else can elaborate much more.
-
Thanks for the opinion. I have the card overclocked via the Nvidia control panel to 750 MHz core and 2100 MHz memory, with no temps over 70°C. Would low/medium settings at 1080p and 45-60 FPS have the same detail as the PS4, which is 900p at 30 FPS? I know this is speculative as the game hasn't been released, so sorry to put you guys on the spot....
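For a rough sense of the raw numbers in that question, here's a quick back-of-the-envelope pixel-throughput comparison. It only counts pixels pushed per second and says nothing about detail settings, textures, or draw distance, so take it as illustration only:

```python
# Rough pixel-throughput comparison: laptop at 1080p/45-60 FPS vs. PS4 at 900p/30 FPS.
# This counts raw pixels rendered per second only; it ignores settings and detail.

def pixels_per_second(width, height, fps):
    """Total pixels rendered per second at a given resolution and frame rate."""
    return width * height * fps

pc_low = pixels_per_second(1920, 1080, 45)   # laptop, worst case of the quoted range
pc_high = pixels_per_second(1920, 1080, 60)  # laptop, best case
ps4 = pixels_per_second(1600, 900, 30)       # PS4: 900p locked to 30 FPS

print(f"PC  (1080p45): {pc_low:,} px/s")
print(f"PC  (1080p60): {pc_high:,} px/s")
print(f"PS4 (900p30):  {ps4:,} px/s")
print(f"PC worst case pushes {pc_low / ps4:.2f}x the pixels of the PS4")
```

By raw pixel count, even 1080p45 pushes over twice the pixels per second of 900p30; whether that translates into a better-looking game still depends entirely on the detail settings.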
-
I would like to say that Ubisoft does not know what the hell the word "optimization" means for the PC platform. Their minimum specs say you need at least a GTX 460, but the minimum vRAM is 1GB (and most 460s had 768MB of vRAM), and they even included the Intel Iris Pro 5200 as a video card supposedly capable of running Watch Dogs well.
Secondly, the recommended specs include a GTX 560 Ti, but list the vRAM required as 2GB. They also say you need an i7-3770, which is an outright LIE if the game only wants a 560 Ti. I can safely say to you, sir: wait. Wait until the game comes out, wait until people who buy it for reviewing or whatever other purposes have already purchased it and said their piece. If people with 560 Tis can run the game easily, then grab it for PC. If not, then grab it for your PS4. If you run your PS4 on a sub-1080p screen, the PS4's res won't matter. Guesswork at this point on the specs of a Ubisoft title is about as fruitful as me asking each of the Fragdolls for a date, one after the other, with all of them standing next to each other fully able to hear me asking the others for dates, and having them ALL say yes.
My analogies are getting better. I must continue working on them. -
Thanks, D2. There has been a game dry spell after the new console launch titles (bar Infamous), so waiting for later reviews will be too torturous for me. Maybe I'll just buy both copies and trade the PS4 back in if the PC trumps it (and put a review up here for others). Keep the opinions coming though..............
-
The normal GTX 460 was 1GB, only the gimped 192-bit version was 768MB and those were rare. And GTX 560 Ti did come in 2GB flavors.
OP, you should get Watch_Dogs on PS4. It will probably perform on par with, if not better than, the PC version given your specs, and apparently the PS4 has exclusive or timed-exclusive content or something of that nature. -
^ I stand corrected. However, the 2GB version of the 560 Ti was NOT the one most people used, I know that much at least. I've never come across a 2GB 560 Ti user. I also can't find that configuration on Nvidia's website, only 1GB and 1280MB versions, and the 1280MB version had extra cores, so it was more like a 570.
-
This article, linked in another gaming forum, might shed some more light on PC vs PS4:
quote:
Watch Dogs on PlayStation 4. He stated that it's pretty hard to tell the difference between the PC and PS4 versions of Watch Dogs.
Morin was asked: "You talked about resolution indeed, but I'm speaking of other GFXs. You said 'on those, no'." To this he replied: "And I stand by what I said. If you use the same resolution it's hard to tell the difference, but PCs are quite scalable on this..."
Morin Talks About Sexual Content & Nudity In Watch Dogs, PS4 Graphics, Play On PC With DualShock 4 -
"OP you should get Watch_Dogs on PS4. It will probably perform on-par if not better than on PC given your specs, and apparently PS4 has exclusive or timed exclusive content or something of that nature."
Even if I can possibly get 60 fps on low/medium vs the 30 on PS4? Are 3rd person games not as reliant on frame rate as FPS are? -
560 Ti 2GB was only on aftermarket cards so it doesn't show up on Nvidia's website. 560 Ti 448 core was a totally different card from the normal 560 Ti.
But he didn't mention what specs were in that PC, and if we assume the recommended specs (560 Ti or 7850), those are both better than 765M. -
If only the Alienware 14 had the ability to upgrade graphics cards, then this wouldn't be so perplexing...
-
No idea about performance at this time as game isn't even out yet.
Yeah, GTA-style games aren't twitch FPS, so a high frame rate isn't as much of a priority, especially if you're playing with a controller, but personally I still find them more enjoyable at 45 FPS or above. -
Complain to Alienware and tell them to bring back the awesome M15x. :thumbsup:
-
Some more on the nature of the graphics:
"On the negative side, truthfully, almost everything else has greatly disappointed graphically — not in comparison to the famous demo from E3 2012, but in certain details we were not expecting from an open-world game at this point in the new generation.
Things like cars and pedestrians suddenly popping in beside you, fog on the horizon hiding elements (especially in tunnels), ugly textures, vehicle physics that leave much to be desired, quite ordinary animations, unremarkable car models, last-generation facial expressions...
Lest you think we are asking too much: compared with a recent open-world title on the console, inFamous: Second Son, which beats it at a glance, the games seem almost from different generations. We also had the opportunity to play the PC version, and truthfully, things do not improve much; it is quite similar to what we saw on PS4, only a little better."
Watch Dogs on PS4 disappoint graphically, ordinary animations, last-gen facial expressions: Spanish Preview -
Dunno if you know about this, but Watch_Dogs is 900p on PS4 and 792p on Xbox One, both 30 FPS.
Confirmed: Watch Dogs PS4 900p, Xbox One 792p, both 30fps • Eurogamer.net -
Wouldn't be surprised if true as first-party studios will always be able to squeeze more juice out of the console hardware, and Ubisoft sux at optimization even on consoles.
-
Like I said, if there's ONE thing you can count on Ubisoft to produce, it's a game that looks good visually but WAY overdraws the system it's being run on and/or suffers repercussions from said graphical fidelity. Games like GTA San Andreas and GTA 4 had pedestrian/car/texture popping, yes... but you had to actually move faster than your system would allow the game to load for it to show up. For example: grab the fastest car in either game, outfit it with nitro, find the longest stretch of road you can, then SPEED as far and as fast as you can go. In GTA SA, pedestrians/cars will eventually vanish, and when you slow down/stop/crash, you'll notice they all just kind of teleport into place. In GTA 4, the whole world loses textures entirely and becomes a grey-ish mesh of... something. Might as well be playing an alpha. The difference between these and what's described in the two articles about Watch Dogs, however, is that the player in Watch Dogs likely didn't pass through the city so fast that it couldn't render more of itself in time. Ubisoft is just bad at coding for PC platforms. They have been with every game they've put out in the last three or so years.
Next, about the graphics/resolution. If you have a 720p screen for your console, you won't notice a thing in Watch Dogs graphics-wise. It'll be 30fps, sure, but that doesn't break a game. Look at Dark Souls and South Park: The Stick of Truth. 60fps is all well and good, but it doesn't make an amazing game. Resolution, however, is a different story. With a 1080p screen, running a game internally rendered at 900p or less and stretched to 1080p is... annoying. It makes things look blurry, and you can't tell why your game looks so bad, or whether something's wrong with your eyes. Worse still if it renders at 720p or thereabouts... that's just awful. Most people can actually tell the difference, but don't realize anything is wrong unless they have something to compare it to. They actually, legitimately, don't know something's wrong. I've seen people play Resident Evil 5 splitscreen on a PS3 with the game running at about 20 frames per second, and I'm pointing it out and they're telling me "no, it's not low fps, that's how the game runs," and I just think to myself "wow... I'm never playing these kinds of games on a console," but they're all fine with it, because that's just the way it is. That Morin guy said that you can hardly tell a difference between PC/PS4... at the same resolution. And likely the same framerate. Which is to say that at a non-mismatched resolution (i.e. the game rendered at or above your monitor's native resolution) AND a higher framerate, your game could easily look noticeably different from the PS4's game running on a 1080p screen (mismatched res, 30fps). So take what developers say with HUGE grains of salt, and read between the lines. If they give you a fact like "The PC version's maximum settings are the PS4's settings, the PS4 has an unlocked framerate but does not maintain 60fps all the time, stronger PCs can maintain a constant 60fps, and both run at 1080p natively," THEN you take it word for word.
When people don't answer the questions definitively, it's time for you to look for clues, because they can't really lie about the game. If they say "the game lets you fly jets" and the game comes out and there aren't any jets to fly, then you CAN take legal action against them. Remember: their objective is to make you want to buy their product without actually lying about it. There are many ways to sound like you're saying what somebody wants to hear without actually saying it.
So yeah. If you want Watch Dogs on PC, wait till it comes out, then see if you can get hold of a benchmark, find someone who would install it on your laptop under their account and let you run it to see how it performs, or simply look for reviews after a day or so. If you REALLY can't wait, two things you should do:
1 - Condition yourself so that you can wait for games, especially if you know a dev has a track record for screwing over your chosen platform like Ubisoft does to PC, or devs will keep doing the same crap to us all the time.
2 - Buy it for PS4 and be guaranteed good visuals at a target framerate/resolution. -
D2, great breakdown, which I enjoyed digesting. You are very skeptical of Ubisoft's efforts, so I'm curious about your take on Assassin's Creed 4: Black Flag. I thought it looked great on PS4, though you probably have a more detailed breakdown of its technical aspects.
-
D2 Ultima, you sure love to write novels, don't you?
As for Wolfenstein, that's not out either so nobody can tell you for sure what performance is gonna be like. But based on the recommended system requirements (i7, GTX 460/Radeon 6850), which Bethesda says will get you 1080p60, I'd bet your AW 14 will breeze through it. It's id Tech 5 after all. Last game using the engine was RAGE and it runs 60 FPS maxed out on a potato. Plus just looking at the relatively simple graphics (lack of depth and shader effects, static pre-baked lighting) in the Wolfenstein trailers, I can tell it's gonna be a relatively undemanding game. -
On PC it's the classic example of a shoddy Ubisoft port: runs like doggy doo-doo, overly taxing on the CPU, random drops in FPS and GPU usage, not very scalable, and stuff like the PhysX smoke effects cut FPS in half when they're on-screen.
On PS4 it's a decent, more consistent experience: Native 1080p, steady locked 30 FPS, equivalent of medium-high settings w/o PhysX, and nice-looking PPAA (way better than the TXAA garbage on PC). -
Well, octiceps and D2 have scared me off the PC version, so I guess the PS4 will get the dust brushed off of it for what will hopefully be an enjoyable experience. Thanks for all the aid. I'll buy Wolfenstein for my laptop and post my impressions for those that have equivalent PC builds.
-
Good man! *claps happily on decision well made*
People just gotta stop trusting devs. If they keep lying and giving us unoptimized poo and we keep buying it, the publishers are going to keep thinking it's ok.
Also, yes, octiceps, I love to write =3 -
Publishers/developers believe their games will sell on PC without them showing the PC platform the master-race respect it deserves, then blame poor sales for the lack of time/attention.
Aka Ubisoft, the "who cares about optimization on PC? If they want to run it better they can toss a stronger graphics card at it," people. -
The irony is that so many of Ubisoft's games are poorly coded for multi-core CPUs (remember the horror that was Assassin's Creed III?) that it doesn't matter how strong of a GPU you throw at them, they will still lag.
-
Well I didn't say that the quoted person knew what he was talking about... =3
-
Assuming that Watch Dogs runs at roughly 850p 30fps on consoles (the arithmetic mean of the PS4's 900p and the Xbox One's 792p is 846p), I would definitely go for the Alienware. If you can plug it into a nice HD TV, I am sure it will deliver a better experience than the consoles. And if you just want to run it above 30, you could drop the resolution to around 900p and still get the same (most probably better) performance on the Alienware. That's without mentioning the price difference (always welcome) in favour of the PC.
All of this is obviously my own opinion
-
Don't decide yet, and don't preorder. The day it comes out, people will be buying the PC version and reporting on how it performs, so just wait until the benchmarks come out to know with which platform you should go.
My prediction is that the PS4 will be 900p with an unstable 24~30 fps performance, and I'll be quite surprised if the i7+765M combo doesn't run the game better at 900p. -
Are you using the same display for both platforms? If not I'd pick the one I'd rather play it on. If so wait for reviews, it's all complete speculation until we see some head-to-head analysis.
-
I completely agree with this... it will play just fine on your Alienware, but get it for the platform you'd rather play it on. For me, that's a PC, but for many others that's a console/big screen TV.
-
Some sources indicate a desktop GTX 670 is sufficient for Ultra details:
Watch Dogs For PC Will Run On Ultra Settings Even On Mid-Range Hardware | NextPowerUp -
Nice! This game might actually be interesting. Not sure if $60 interesting, but at least if it drops below $40 I'll pick it up.
-
I wonder how the supposed seamless transition between single-player and multiplayer mode will turn out. The game seems very interesting; however, I have so many games which I haven't played yet... I don't know if I would pay $60, or $40, or whatever for yet another game which would essentially be added to the queue. In a couple of weeks I should have some more free time though, so anything could happen.
If, by any chance, I will purchase the game, I WILL HACK YOU Mr WingNut!
-
dumitrumitu24 Notebook Evangelist
GTX 670 for Ultra... I read that also. Does Ultra mean 1080p 60fps, or just a playable framerate of about 30fps and up?
-
They never mean that you should expect 60fps.
-
I think they mean 30+ fps
-
From all the info out there I have surmised (take with a pinch of salt, as it is only my own deduction) that there are many levels of multiplayer in this game. There is the usual stuff: enter the map, join a group, and carry out "missions," either scripted or against other multiplayer gamers. Then there is multiplayer within your single-player experience. Any player can randomly "hack" into your gameplay whilst you are pursuing your single-player campaign; I'm not sure whether they can while you are actually in the middle of a campaign chapter. In your game, you are always Aiden and the other players look like regular civvies; from another player's perspective, you are a plain civvie and they are Aiden. It's clever stuff. I think you can also turn off the ability for other players to hack into your single-player game if you just want to do your own thing.
-
Next-gen consoles = High settings on PC:
https://www.twitter.com/Design_Cave/status/469127341692973056
Remember that PS4 is 900p30 and XB1 is 792p30. -
I have the game on PC and have been playing it for a while now. I mostly agree with that TotalBiscuit playthrough of Watch Dogs that I recently watched ( http://www.youtube.com/watch?v=IBb2BIVrV7M ). The game does not appear to be all that well optimized for the PC, and the mouse controls are kind of wonky, especially in the menus. There are definitely weird mouse speed changes like TB described. I guess if you can use a gamepad on the PC, that might not be an issue.
I also read a thread about memory leak issues on the PC: [PC] Watch Dogs memory leak | Forums
I can attest to having experienced weird stuttering and a major slowdown after a really long play session about two days ago, and I immediately started thinking it might be a memory leak. I have 16 GB RAm (my "m" is lowercase because my "m" key is kind of broken right now...), though. I think one person in the thread said they had 16 GB RAm too and that they had a dual-card setup kind of like mine (I have HD 6990m cards running in Crossfire), so maybe Crossfire and/or SLI can cause weird issues.
Watch Dogs actually kept crashing at first until I temporarily switched from Crossfire to a single card. The game stopped crashing after that. At that point I enabled windowless border mode under the Display menu in the game, and it seemed to be willing to run with Crossfire enabled. Anyhow, I'm not sure that the performance is any better with Crossfire enabled now that I've played some more. It may even be worse, but I've just been happy(ish) that it's even running right now - heh. -
Borderless windowed + Crossfire shouldn't happen... unless AMD changed things to allow CrossfireX in windowed mode now? But yes, the game is awfully optimized for PC. It's Ubisoft. We knew this.
-
Yeah, don't ask me why, but it worked on my systems in at least two previous cases. I've had issues with both Crysis 2 and Splinter Cell: Blacklist, and it was back when Crysis 2 was spazzing out that I was looking for solutions to the graphical issues I was having when trying to run in Crossfire. I vaguely recall finding a thread where someone suggested changing to windowless border mode, and I tried it. I still have zero idea why it worked - but it did. I was also using a Crossfire setup when playing Crysis 2 (older system but also an Alienware - specifically an M17x R2/R3).
I can say that Splinter Cell: Blacklist exhibited almost exactly the same behavior as Watch Dogs (which I'm not surprised by - thanks again, Ubisoft). Both SC:B and Watch Dogs crashed shortly after I launched each game every single time when Crossfire was enabled. Only after I went through the steps I described previously (switching to one card, launching the game, changing to windowless border mode, quitting the game, and re-enabling Crossfire) would the games start running when I had Crossfire enabled.
Maybe it has something to do with the peculiarities of different PCs? I currently have an Alienware M18x R1 with two Radeon HD 6990M cards. All I know is that I definitely have Crossfire enabled when I launch the game, and that it's definitely set to windowless border under the Display setting in there atm.
And yes, my "m" key decided to start working normally this morning - thank God. Wasn't looking forward to working today without it being functional, heh. -
I know this is about watch_dogs.. But hold up a second.. How is your 'm' key being broken letting you hit 'm' but not shift+'m'? I'm so intrigued how this is happening.
-
Yeah, the memory leak is real. In-game monitoring shows I start out with about 5GB of RAM usage, which steadily climbs and eventually plateaus at about 7.5GB after a few hours of gaming, with peak usage topping out at almost 8GB.
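If anyone wants to measure this rather than eyeball Task Manager, a small script using the third-party psutil package can log the game's RAM usage over time. The process name below is a guess on my part, so check the actual executable name in Task Manager first:

```python
# Log a process's resident memory (RSS) over time to spot a leak.
# Requires the third-party psutil package (pip install psutil).
import time
import psutil

TARGET = "watch_dogs.exe"  # hypothetical process name; verify in Task Manager

def find_process(name):
    """Return the first running process whose name matches, or None."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == name.lower():
            return proc
    return None

def log_memory(name, interval_s=60):
    """Print the target process's RSS in MB once per interval until it exits."""
    proc = find_process(name)
    if proc is None:
        print(f"{name} is not running")
        return
    while proc.is_running():
        rss_mb = proc.memory_info().rss / (1024 ** 2)
        print(f"{time.strftime('%H:%M:%S')}  {rss_mb:.0f} MB")
        time.sleep(interval_s)

if __name__ == "__main__":
    log_memory(TARGET)
```

Run it in a second window while playing; a climb that never plateaus (or never comes back down after long sessions) is the classic leak signature, like the 5GB-to-8GB creep described above.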
-
Meaker@Sager Company Representative
https://www.youtube.com/watch?v=b43ZlqPvBDs
Was Watch Dogs intentionally gimped on PC? -
Old news, but the short answer is that it most definitely was.
Was there really a Watch Dogs graphics downgrade? • Eurogamer.net -
Meaker@Sager Company Representative
No, those effects are still in the game and have been re-enabled with simple configuration tweaks, without significant performance impact, so it appears like the demo from 2 years ago.
-
It's a definite downgrade on multiple fronts. A big giveaway is the theatre light bulbs hanging from the entrance ceiling: they are actual detailed bulbs in the 2012 demo, but in 2014 they have been stripped into flat nothingness.
help with watchdogs platform decision
Discussion in 'Gaming (Software and Graphics Cards)' started by zachary77, May 17, 2014.