Just gonna bump this thread, so everyone can see it!
-
-
usapatriot Notebook Nobel Laureate
Looking at the MW3 screens, it's got nothing on BF3. The graphics look so cheap and low-res. Frostbite 2 will kick its behind. Either way, it's gonna be interesting cause there's gonna be (already is) a massive marketing war over BF3 vs MW3!
-
Yeah! I still remember how Olin said that the Black Ops engine can top FB2... lol!
http://www.hiphopgamershow.com/2011/03/treyarch-says-frostbite-2-0-is-visually-stunning-but-shocked/
"Treyarch Says: ” Frostbite 2.0 Is Visually Stunning ” But Our Engine Can Be Updated To Those Things." LOLOLOL!
-
Don't forget that the CoD engines are id Tech 3 based. id Tech 4 may have been a rewrite, but so was the CoD4 engine; and looking at id Tech 5 and a future id Tech 6, I'd say that if they did it right, it may well be possible. Frostbite's damage model is going to be hard to bring in, though.
Also remember that Frostbite is very shader-bound; you'll need a pretty fast card to keep up. id Tech 3, OTOH, is very shader-efficient (without all the fluffy useless extra effects).
Are all the realtime demos that were said to be running on a single GTX 580 using the AA technique that DICE developed? If that's true, then Frostbite is more efficient at managing resources than I thought. -
What do you mean by "shader-bound"?
Also, what is your opinion on the recommended specs? -
It relies on large shader-based calculations, so cards with more shaders perform proportionally better (for example, the 5770 has 800/5 = 160 shader units and the 6850 has 960/5 = 192; AnandTech's tests have shown that Frostbite gets a nearly 25% increase from only 20% more shaders). Obviously there is more to take into account than just shader count (memory bandwidth), but it's a pretty good basic accounting of what's going on.
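If you want to sanity-check that scaling argument, here's a rough back-of-the-envelope in Python (just a sketch with the numbers above; the ~25% figure is the quoted AnandTech result, not something you can derive from the specs alone):

    # Back-of-the-envelope check of the shader-scaling claim.
    sp_5770, sp_6850 = 800, 960                  # stream processors (VLIW5)
    units_5770 = sp_5770 // 5                    # 160 shader units
    units_6850 = sp_6850 // 5                    # 192 shader units

    extra_shaders = units_6850 / units_5770 - 1  # 0.20 -> 20% more shader units
    observed_fps_gain = 0.25                     # ~25%, the AnandTech figure quoted above

    print(f"Extra shaders: {extra_shaders:.0%}, observed FPS gain: {observed_fps_gain:.0%}")
    # When the FPS gain tracks (or slightly beats) the extra shader count like this,
    # the engine is behaving shader-bound; memory bandwidth and clocks still matter,
    # so treat it as a rough indicator rather than a guarantee.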
My recommended specs would be around an E8400 and 4GB of DDR2 with an 8800 GTX; that should let you play at 1920x1080 with no AA, some physics disabled, and some small effects turned off.
I currently have an E6600 with a 5770; I'm planning to upgrade to Bulldozer (FX-8110) when boards come out with PCI-E 3.0. -
Ah, I see.
How do you think my GPU would perform? Mobility Radeon HD 5730 with a 720QM and 6GB RAM. I run BC2 @ 1080p with all settings high except effects on medium and shadows on low, no HBAO or VSYNC, no AA, and 4x AF, at 30-80 FPS. -
You should be able to run it close to those settings. BC2 was a console port, so it wasn't optimized very well for the PC; BF3 is being developed on the PC first and then ported to the consoles.
Although if you're playing on an LCD, I'd suggest turning VSYNC on, or you might run into screen tearing. VSYNC syncs your frame rate to the monitor's refresh rate (usually around 60Hz); if your FPS is higher than the refresh rate, frames get swapped mid-refresh and the image tears, since the monitor is being fed more frames than it can display.
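Conceptually, VSYNC is just frame pacing: the finished frame is held until the monitor's next refresh instead of being swapped mid-scan. A toy sketch of the idea in Python (not how the driver actually does it; the 60Hz value and the fake render time are only assumptions):

    import time

    REFRESH_HZ = 60                        # assumed monitor refresh rate
    FRAME_TIME = 1.0 / REFRESH_HZ          # one refresh interval, ~16.7 ms

    def render_frame():
        time.sleep(0.005)                  # stand-in for rendering work that finishes early

    def run_with_vsync(frames=10):
        next_refresh = time.perf_counter()
        for _ in range(frames):
            render_frame()
            # VSYNC: hold the frame until the next refresh so the monitor never
            # receives a swap mid-scan (that mid-scan swap is what shows up as tearing)
            next_refresh += FRAME_TIME
            delay = next_refresh - time.perf_counter()
            if delay > 0:
                time.sleep(delay)

    run_with_vsync()

The trade-off is what gets mentioned below: capping to the refresh rate can also drag down your minimum FPS. -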
Really? Well, we must also put on the table that BF3 is using deferred rendering, and that's pretty tough to handle: look at Metro 2033 and Crysis 2. They are pretty heavy, especially due to deferred shading. Add some terrain tessellation and you will have a pretty heavy game.
In my opinion, I think I will run it at around 25-30 FPS with these settings: 1080p, maxed textures and objects, no post-processing, DX10, shadows low, shaders medium, and effects medium. I will OC, though, by 50/60 MHz, from 650/800 to 700/860.
In my experience, yes, VSYNC reduced my screen tearing, but it also dropped my minimum FPS. -
usapatriot Notebook Nobel Laureate
I just preordered BF3 from Best Buy!
So what's the official release date? Best Buy is saying 12/31 but that doesn't sound right. -
It's not correct. It's around November 2nd.
All sites say 12/31, if you notice
-
Why is everyone so insistent on comparing BF3 and MW3? They're going to be two completely different games that will probably each be enjoyable in their own way. I intend to buy both and have fun with both.
-
usapatriot Notebook Nobel Laureate
Negative, Ghost Rider. Activision has been shafting gamers since MW2 with rehashed titles year after year; hence, they deserve all the bashing. -
I agree with you, but there are undoubtedly people who will enjoy MW3. I probably will overall, despite what flaws it will have. -
I enjoyed all the CoD titles that I've played (CoD 2, CoD 3, CoD 4, MW2, BlOps)
AND all the MoH games that I've played (MoH, MoH: Frontline, MoH 2010)
AND BF:BC2 (the only BF game I've played),
all in their own way. My favorite is BC2, but I like them all individually. They are different and each has its flaws, but all are decent in their own right. That's not to say I would pay full price for each... but yeah. -
^ Indeed. I enjoyed Black Ops "enough", but seeing as I haven't played it in months, I'd say it wasn't worth the $45 (hell, I even got a discount from the original $60), especially since BC2 cost me a big $6 and has more hours logged than CoD.
Time will tell whether BF3 shapes up to be worth a preorder/full price, or if it'll be a game you pick up during a Steam sale... -
Agreed on the deferred shading part, not so much on everything else.
-
Does anyone know how dual 485Ms will fare with this game?
-
What's the processor?
-
Easy, it'll play very well. Just look at your performance in Metro 2033, add a couple of FPS, and voilà!
Then how many FPS do you think I'll get? I get around 15-40 on Metro 2033 at DX11 with Tessellation at High settings at 1080p
-
Haven't decided on a processor yet, but it will be either a 990x or 2940x depending on which laptop I get. Yeah, it should run better than Metro, so I should be OK. I'm thinking of a desktop as well, with dual 580s. Still thinking about all options. If Nvidia drops a dual 570M or 580M option during the summer, I can gladly wait, because this game will be pushing limits.
-
You could get me an Alienware M18x with dual 6970Ms if you want...
Anyways, I don't think this game will be a through-the-roof heavyweight; DICE will probably optimize it for the widest audience possible while maintaining amazing quality. -
I had dual 5870s in a Sager X8100. Never again, or at least not in the foreseeable future, will I mess with AMD cards. People warned me not to get them, but I did not listen. I saw more blue screens than Zerg creep!
Anyways, that's why I haven't pulled the trigger on the Alienware. I need good Nvidia cards. 460s are a bit weak for me right now, so I will wait. The Sager X7200 is an option with the dual 485Ms, but my gut is telling me to wait a few months since I'm so indecisive right now.
Lol! The joke went right over my head! I'm still tired! -
That is REALLY odd...
My 5730 has never failed me. I've only had 1 BSOD, and the 5730 performs BETTER than I expected and better than what's listed on NBC.
I <3 AMD, they are very good! Especially with the new Mjolnir driver project! -
usapatriot Notebook Nobel Laureate
I'd expect something like this for the recommended specs:
OS: Windows Vista or Windows 7
CPU: Quad-core 2.0GHz+
MEM: 3GB+
HDD: 20GB+
GPU: GeForce GTX 280 1GB / Radeon HD 4890 1GB or higher (swap in your notebook equivalents of these cards). -
I would replace the GPU with a GTX 460 (560?) / ATI 5850 (6850?), so as to run it maxed.
-
Around 20 at a minimum. It'd average 30 at your settings.
-
This is another advantage of playing from a laptop on an external monitor (1080p). If I get low FPS at 1080p, I'll simply play on my laptop screen, which is 1366x768,
because I HATE non-native resolutions. They make me sick lol
-
This. I also play on an external 1080p 23" screen, and my laptop is native 1366x768. So if my laptop is hurting to run it at 1080p, I'll just revert back to using my native laptop screen.
-
Shadowfate I still can't think of anything.
Any news on upcoming trailers???
SP or MP, but preferably MP for now. -
Probably not until E3. There's the producer's commentary if you haven't watched it.
http://www.youtube.com/v/W__UkA1mOFM?fs=1&hl=en_US&rel=0 -
redrazor11 Formerly waterwizard11
What is everyone's opinion on avatar customization? Brink really got me thinking about this... at the very least I'd like to see the pins and stripes I've won appear during battle on my corpse. Maybe some notches in my gun based on dog tags, or something to set my gear apart from everyone else's... so when they pick up my junk they'll be like "Oh, this dude was a BAMF".
-
IMHO, that wouldn't be very BF-ish. I like your idea, but I can't see it fitting in the BF world. BF is about realism. I'm sure there will be weapon customization, but not like in BO (thank GOD!). That was childish... A pink heart RDS? Who goes to war like that, lol.
-
redrazor11 Formerly waterwizard11
How are pins, patches, and stripes not realistic? Soldiers wear these on their uniforms every day... they signify rank and honor.
How are notches in a gun not realistic? Plenty of guns have been found with notches in them based on kill count. It'd be more realistic than a number in the corner of your screen... seriously. -
iFail. I didn't quite know what you were talking about...
If it is that, then hell yeah, I'm with you!
Also, what are notches? -
redrazor11 Formerly waterwizard11
A notch is a line that is engraved with a knife or other tool. Sometimes people in prison carve notches into the walls to signify days of confinement. Assassins carve notches into their guns to signify successful target terminations.
Google "tally marks" -
DO WANT! I love your idea! It's a very nice feature, and actually very realistic
-
I wouldn't mind it too much if there were zero performance hit to the game. If it became even a second of distraction while I'm alive, I wouldn't like it, and I'd be more inclined not to pick up other people's gear. I think, though, that the way it's done in BC2 is adequate, where the number of stars on the killer's weapon is shown after the player's death.
-
Wasn't something like this talked about a few years back with the first leak? Along with more social integration.
-
Maybe last year, because I remember saying something similar. I didn't like the avatar customization in the F2P Battlefields, since they delayed respawning to load other players' customizations. Then there's CoD7, which I haven't played, but I suspect the customizations were partially responsible for the criticism the game engine was receiving.
-
I hope I can play this on my Clevo with an i7-820QM and ATI 5870...
What do you guys think? -
Hey, you are Portuguese too!
Though I like Russia more, MUCH more
We don't know
I guess pretty well; just take off 10 FPS and lower the settings a little from how you play BC2.
BTW, how did you get a Clevo in Portugal? For how much? -
Yeah, I hope so.
I got it from WSI for €1600: an i7-820QM, 8GB DDR3, ATI 5870, and a 15.6'' full-HD screen, back in September 2010... -
There is some info on Wake Island:
Battlefield Blog -
usapatriot Notebook Nobel Laureate
BF3 is gonna be so awesome. You can tell just from how the developers talk about the game and stuff. They know BF to the core, and they're using their knowledge of what BF has always been about to make BF3 even better! -
Not saying you're wrong (I haven't read the whole article), but you should try to separate marketing from facts. Many times companies put their best speaker on the front page and the person who does the real work in the background.
-
usapatriot Notebook Nobel Laureate
Then read the article. I've been following DICE since 2005 and they are not that kind of developer, even if they are owned by EA. -
DO WANT IN GAME.
-
Do we have any word on the min specs for BF3? I am hoping I will at least be able to play at high (if not ultra). I am considering selling my system for an M18x with 2x 6970Ms just for BF3. That should be able to max it out, right? Since twin 6970Ms are almost as powerful as a GTX 580, and that is what the trailer gameplay was run on.
-
usapatriot Notebook Nobel Laureate
Nothing yet. We're all just speculating at this point; we really have no idea what the min requirements for BF3 will be.