6970M CF will not max this game, quote me on that : p
-
Mechanized Menace Lost in the MYST
my guess is med-high
-
Med-High sounds about right. I think high will be possible, but only super high-end desktops will be able to play it on ULTRA. -
skygunner27 A Genuine Child of Zion
Here are the REAL recently released specs:
Minimum PC requirements for Battlefield 3:
•Hard Drive Space: 15 GB for disc version or 10 GB for digital version
•Operating System: Windows Vista or Windows 7
•Processor: Core 2 Duo @ 2.0GHz
•RAM: 2GB
•Video Card: DirectX 10 or 11 compatible Nvidia or AMD ATI card
Recommended PC requirements for Battlefield 3:
•Hard Drive Space: 15 GB for disc version or 10 GB for digital version
•Operating System: Windows 7 64-bit
•Processor: Quad-core Intel or AMD CPU
•RAM: 4GB
•Video Card: DirectX 11 Nvidia or AMD ATI card, GeForce GTX 460 or Radeon HD 6850
Link: Battlefield 3 PC specs revealed
I think with 6970M+ CF, we all have nothing to worry about. -
If you take a close look at many of the released screenshots of BF3, the textures and feel of the game look eerily like the Source engine for some odd reason. Everything seems too "smooth". Reminds me of Counter-Strike: Source.
So assuming the recommended spec for BF3 is a desktop 6850, the laptop equivalent would be the 6970M, meaning CF 6970M would maybe just barely be able to max BF3 out: the difference between high and ultra settings usually costs at least 50% in performance, and the gain from a single 6970M to CF 6970M is around 50% too.
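To make that back-of-envelope concrete (the baseline FPS here is a made-up illustration, and the ~50% figures are just the post's assumptions, not benchmarks):

```python
# Back-of-envelope for the claim above. All numbers are assumptions:
# a hypothetical baseline plus the ~50% figures from the post.
baseline_high_fps = 60.0   # assumed single 6970M on "high"
ultra_penalty = 0.50       # high -> ultra costs ~50% performance
cf_scaling = 1.50          # CF adds ~50% over a single card

ultra_single = baseline_high_fps * (1 - ultra_penalty)  # 30 fps
ultra_cf = ultra_single * cf_scaling                    # 45 fps

print(f"Single 6970M on ultra: ~{ultra_single:.0f} fps")
print(f"CF 6970M on ultra:     ~{ultra_cf:.0f} fps")
```
-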
Mechanized Menace Lost in the MYST
These are the fake specs released by GameStop, which they got from a BF3 forum.
http://gamingeverything.com/interstitial.php?url=http://gamingeverything.com/?p=6228 -
skygunner27 A Genuine Child of Zion
-
The thing is that the game will look good even at a lower resolution. Also, if you look at some very graphically intensive games, the difference between high and ultra is usually not HUGE compared to the jump from low to high.
A lot is personal preference. I personally like "high" at native resolution vs "ultra" at 1280 by 720; some see it the opposite way. 1080p is still a LOT of pixels, and if you lower the number of pixels, then you can turn the settings up. Or if you do play at 1080p, then you really don't have to worry about AA at that resolution.
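For scale, here is the pixel math behind that trade-off (plain arithmetic on the resolutions mentioned, nothing assumed):

```python
# Pixel counts for the resolutions discussed above.
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_720p = 1280 * 720     #   921,600 pixels

print(f"1080p: {res_1080p:,} pixels")
print(f"720p:  {res_720p:,} pixels")
# Dropping from 1080p to 720p cuts the pixel load ~2.25x,
# which is headroom you can spend on higher settings instead.
print(f"ratio: {res_1080p / res_720p:.2f}x")
```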
Hopefully we get some nice settings with the current offering we have in our systems, as I really don't plan any upgrades until the next-gen consoles have been out for a few years. I tend to be a die-hard PC gamer, then switch to console gaming for a few years, and then upgrade my PCs.
Either way, this game looks BEAST and will definitely be my go-to game for a while. I was really hoping the 6970 CF would handle this game well, but after looking closely at the footage... may be wishful thinking! -
iPhantomhives Click the image to change your avatar.
-
I remember when Crysis was being run on "two 7800 GTXs in SLI." When it debuted, people were struggling with desktop 8800 GTXs. I ran it on my old M15x on a combination of medium/high at 1440 by 900 and did not really enjoy the experience. Fast forward 4 years and I can finally enjoy it with 5870s in CF on my old R2. Now I am going to try it on my M18x.
So I am "hopeful" that BF3 can run well and look good, but I don't expect any current notebook GPUs to be maxing the game. Time will tell. Hopefully the 7 series AMD cards will fit the M18x, as that would be a pretty sweet upgrade path for current Gen 1 adopters of the M18x with 6970s/6990s. A 28nm fab process would be pretty sweet. -
They promised NO mod tools. It should run better than BFBC2 actually, because BFBC2 was a horrible port. I hope they have bots in the game, because I like playing with bots.
-
AMD has already noted that the 7 series cards will be MXM 3.0 ... so they should fit right in the M18x no problem.
M17x R2 owners are the luckiest sons of Bs ever ... considering they too will probably be able to use the 7 series ... crazy longevity in those laptops. -
Oh damn, I'm not asking about 6990M vs 580M no more... -
lol, R1, R2, R3 refer to revisions of the M17x. The M18x is ... just that ... I guess you could call it an R1 ...
Aliensmudge refers to damage on the exterior of my unit when it was shipped ...
I posted pics in another thread, it was pretty crazy -
Hi guys,
Just saw the required system specs for Battlefield 3:
Battlefield 3 system requirements announced | bit-gamer.net
It'll be interesting to see how our machines handle the game, eh, especially in DX11.
20GB install though -
If their recommended spec is an HD 6950, we should say a 6970 will deliver playable frame rates with all the eye candy on, bar MSAA settings. Maybe we can have 2X or 4X and still be OK; it remains to be seen what sort of demands its highest settings will place on a desktop 6970.
Two 6970Ms definitely eclipse the single desktop 6970, or in the worst case match it. So dual 6990M users are sitting pretty, and the same goes for dual 580M GTXs.
The 20GB install is good news, which means less-compressed textures. -
Fricken awesome, glad to see I will be gaming at the peak until the 7Ks come out, and maybe beyond. I was really only worried about this game, since this and D3 will be chewing up my time for the next year most likely.
-
Those smoke and flame effects are some of the most stonking visuals I have seen in a game.
-
*sigh* Aw man, hope my single 580M holds up as well as it can. I did manage to get a Vantage GPU score of 18k, beating a GTX 560 Ti, but I don't think that was at real gaming clocks. We'll see when the thing comes out.
-
Looks like at worst you could OV it and OC, widez
-
The 580M has 384 shaders, yes? How high do they clock stable? The desktop 560 Ti is around 880MHz GPU core, so if they say a single 560 Ti is enough, an overclocked 580M GTX should hold up pretty well.
-
I was wrong, the desktop 560 Ti is at an 820MHz core clock, not 880MHz; those would be the superclocked versions.
EDIT: Oh, from my old 8600M GT experiences, the biggest gains came from higher shader clocks more so than core clocks. I guess that still holds true today. -
My 8600M GT is still up and working. My brother is using the laptop these days; it has a nice T9300 CPU in it and 4GB of RAM. It still plays a lot of current games at mid settings.
-
Core clocks are different from shader clocks on Nvidia. AMD has the shader clock locked to the core clock. -
Yeah, with Nvidia's Fermi architecture the shader clock is always double the core clock. You can't change them independently. You can't unlink them. Not even with flashing.
-
I said "Fermi", which means the 4xx series and higher.
-
Just checked: the entire GeForce 8 series had independent clocks for their shaders and core clocks.
For example, the 8800 GTS has a core clock of 650 and shaders at 1625. Definitely not linked together.
I don't know, though, if it changed since then. -
The ones in the Fermi cards have separate clocks. My 580M has 625 core and 1250 shader. However, when tuning one, the other moves proportionally. Unlike the DX10 cards before them, the DX11 cards are permanently linked. Shame, I could probably crank out some more frames by OCing the shaders more than the core.
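A toy sketch of that locked 2:1 ratio (the 625/1250 figures are from the post above; the helper function is purely illustrative):

```python
# Toy model of Fermi's locked shader:core ratio described above.
FERMI_RATIO = 2  # on Fermi, shader clock is always 2x the core clock

def fermi_clocks(core_mhz: int) -> tuple[int, int]:
    """Return (core, shader) clocks; shader is forced to 2x core."""
    return core_mhz, core_mhz * FERMI_RATIO

print(fermi_clocks(625))  # (625, 1250) -- stock 580M from the post
print(fermi_clocks(700))  # (700, 1400) -- raise the core, shader follows
```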
-
Yeah, that's a bummer they went that route. I enjoyed my time with the GeForce 8/9 series for that reason.
-
What about us dual 460M owners? Will it be playable?
-
I'm trying to understand the GPU terminology a little better, so questions:
- What does the term Fermi mean?
- What are shaders and cores, and what do they actually do (both hardware-wise, and software/in-game)?
- With the new 7 series from AMD, I hear the terminology of "fab process". What is that? And it being a 28nm GPU, what does that mean as well? -
No prob brother, I have probably been saying that to you in another thread.
Fermi is the name of the old architecture of the Nvidia series of cards; I don't know what the current names are. Fermi was the "big thing", just a name used to define the new Nvidia cards at the time.
Shader clocks, I believe, affect the shaders in games. So up the clocks and it will help push the shaders for said application.
Fab process is just "fabrication" of the new CPUs/GPUs. Think of it like a car production line, so to speak. It's "retooling" so that the new GPUs go down to only 28nm, so they can fit a LOT more transistors and whatnot.
I hope that clears things up! -
Previously we had terms like pixel shaders and pipelines, the more the better. And there were the simple concepts like the core clock and the memory clocks.
That all changed with DirectX 10, when "unified shaders" came into the picture. This was conceived so GPU makers could design one basic unit and repeat it till they ran out of real estate on the silicon or hit the thermal barriers of their design rules.
Nvidia likes to call these simple units CUDA units; AMD/ATI calls them SPs (stream processors). It's the same concept: one unit that can be repeated over and over, like the cores in your CPU. We have dual, quad, hexa, and octa cores in CPUs. Think of it as many hundreds of such cores in the GPU.
What makes their units different is the stuff inside them. These units can have any number of shaders clubbed together, and not every shader inside does the same kind of workload, though all of them are collectively referred to as shaders.
AMD in their 5000 series had 5 shaders clubbed together with the dispatcher, memory controller request logic, cache, etc. to form 1 SP or stream processor. Nvidia, starting with their earliest DX10 card, had fewer shaders (around 2), but they had other units to accompany those shaders. Nvidia's single shader is not directly comparable with AMD's shader, as they do different workloads at different rates, which is why we can see a 384-shader 580M GTX giving strong competition to AMD's 1120-shader 6990M cards. Fermis today have about 32 cores in a single unit.
So 384 shaders represent 384/32 = 12 CUDA units, while for AMD 1120 shaders represent 1120/5 = 224 SPs (stream processors).
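Quick sanity check of that arithmetic (numbers exactly as quoted in the post):

```python
# Unit-count arithmetic from the post: shaders grouped into repeatable units.
nvidia_shaders, shaders_per_unit = 384, 32   # 580M figures as quoted above
amd_shaders, shaders_per_sp = 1120, 5        # 6990M figures as quoted above

print(f"Nvidia: {nvidia_shaders} / {shaders_per_unit} = "
      f"{nvidia_shaders // shaders_per_unit} CUDA units (SMs)")
print(f"AMD:    {amd_shaders} / {shaders_per_sp} = "
      f"{amd_shaders // shaders_per_sp} SPs")
```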
Nvidia likes to call the shaders "cores". It's all marketing spin. What we are interested in is the base unit that is repeatable. Sometimes Nvidia calls these units "clusters" as well; now they call them SMs (streaming multiprocessors).
Recently AMD changed their architecture from 5-shader SPs to 4-shader SPs. They did this to pack more SPs into the same real estate on silicon. Sure, they lost some compute power per SP, but they gained significantly on gaming performance. This they called VLIW4 in the 6000 series, compared to the VLIW5-based tech of the 5000 series desktops.
Just like we say all processors are not comparable clock for clock because they do different workloads per cycle, we can't compare these shaders pound for pound.
FAB is short for fabrication foundry: a plant or facility for manufacturing chips, from CPUs to RAM chips to GPUs.
Terms like 28nm, 45nm, 32nm, 22nm refer to the channel length of a transistor. The transistor is the basic element in a chip; millions, these days billions, of them form the various units and logic that make up the whole working chip.
The smaller the channel length, the faster the switching and the lower the power consumed. But there are other nasty things, like leakage current, that can hamper the on-paper gains.
So let's say we take our 6970Ms, the chip code-named "Barts", which is made now on the 40nm process, and shrink the design to 28nm. Now the chip is smaller for the same job, consumes less power, and can clock much higher. But AMD or Nvidia will not leave it at that: they will pack in more SPs and units to fill a certain predetermined thermal envelope and make a more powerful GPU.
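A rough idealized version of that shrink math (assumes perfect linear scaling, which real chips never achieve):

```python
# Idealized die-shrink arithmetic for the 40nm -> 28nm example above.
# Real-world scaling is worse; this is just the textbook upper bound.
old_node, new_node = 40.0, 28.0  # nm

area_scale = (new_node / old_node) ** 2   # area shrinks with the square
density_gain = 1 / area_scale             # transistors per mm^2 gained

print(f"Same design uses ~{area_scale:.2f}x the area at 28nm")
print(f"Or pack ~{density_gain:.2f}x the transistors in the same area")
```
-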
I made some corrections, as I got carried away mixing Fermi details with the first DX10 chips. My mind was thinking two things at the same time. Sorry about that.
-
Hmm, I figure a GPU upgrade may be in the mix soon barring a likely clampdown on free time.
-
Just remember, just because you have the recommended specs does not mean that you will be able to max all the settings, especially at resolutions higher than 1080p.
-
At 1080p, dual 6970Ms are more than enough, given how a single Mobility 5870 can handle the current BF3 alpha. This will be true for dual 6970Ms (and above) despite the extra settings the full retail version will have. I can bet on it. -
So it's still a valid point. -
I fail to see the point of trying to push any of the current laptop systems, even in CrossFire, at resolutions higher than 1080p, as there are games which get unplayable at 2560 x 1600. Crysis 2, Metro 2033, and The Witcher 2 are all going to be only barely playable at those resolutions even on dual 6870s, with Metro 2033 getting 17.5fps or something at 2560 by 1600 on dual 6850s.
You need 6950s/580s/570s/6970s in CrossFire/SLI to even begin to play them smoothly without the micro-stuttering you will face in the 27-35fps range. That being the desktop situation, expecting the same on a laptop is just absurd.
Are you suggesting 2560 x 1600 is the mainstream resolution these days? I would beg to differ. Anyone can deliberately try to make a system look weak. 1920 by 1080 is the most frequently used "high" resolution for gaming.
So I ask again, what's the point in trying to go beyond the norm when you know you can get all the eye candy at 1080p? I am fairly certain you are in the minority here when it comes to playing beyond 1080p. And at 1080p, BF3 will run just fine. If you keep sticking to 2560 by 1600, you will run out of steam very soon with newer titles, and it won't matter if you had the 7000Ms or Kepler GPUs. -
I'm running on my 28" at 1200 resolution and I have no problems with any of my games.
Not to mention I know a lot of other people on this forum use IPS monitors at higher resolutions.
So just because you only play things at 1080p does not mean that everybody limits themselves. -
Guys, once the beta comes out, if peeps can log what frame rates they are getting on the highest possible settings on both the 6990Ms and the 580Ms, that would do me a huge favor. Hopefully it can stay at 40+ with dual GPUs. Thanks.
-