The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Battlefield 3 on M18x - Discussion Thread

    Discussion in 'Alienware 18 and M18x' started by Ghost Warrior, Jun 12, 2011.

  1. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,835
    Likes Received:
    583
    Trophy Points:
    131
    6970M CF will not max this game, quote me on that : p
     
  2. Ajbeagles

    Ajbeagles Notebook Evangelist

    Reputations:
    40
    Messages:
    395
    Likes Received:
    0
    Trophy Points:
    30
    I seriously doubt any laptop will be able to max this thing; they didn't even get it maxed on their desktop with a GTX 580
     
  3. Mechanized Menace

    Mechanized Menace Lost in the MYST

    Reputations:
    1,370
    Messages:
    3,110
    Likes Received:
    63
    Trophy Points:
    116
    Couldn't this be just pre-beta code issues causing bad performance?
     
  4. Ajbeagles

    Ajbeagles Notebook Evangelist

    Reputations:
    40
    Messages:
    395
    Likes Received:
    0
    Trophy Points:
    30
    It could be, who knows, they may optimize the game to run better with a certain card. Most likely, if that happens, it will be a "runs great with Nvidia" situation, and the M18x doesn't have a fantastic Nvidia option as of now. But even with the game in a perfect state, I doubt a laptop from this year can run it at max with no lag
     
  5. Mechanized Menace

    Mechanized Menace Lost in the MYST

    Reputations:
    1,370
    Messages:
    3,110
    Likes Received:
    63
    Trophy Points:
    116
    I'm sure we can also edit some things like the renderaheadlimit and other settings in the Settings.cfg file. I don't think we will max it out, but we should be close. Medium just doesn't seem right to me.
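    Something like this, say (a guess at the syntax, since the file and exact variable names won't be known until the game ships; BC2's settings.ini had a similar RenderAheadLimit entry):

        RenderAheadLimit=1
        MotionBlurEnabled=0
        VSyncEnabled=0

    A lower render-ahead limit cuts input lag, and dropping motion blur and vsync usually buys a few frames.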
     
  6. Ajbeagles

    Ajbeagles Notebook Evangelist

    Reputations:
    40
    Messages:
    395
    Likes Received:
    0
    Trophy Points:
    30
    my guess is med-high
     
  7. hawk1410

    hawk1410 Bird of Prey

    Reputations:
    296
    Messages:
    2,171
    Likes Received:
    16
    Trophy Points:
    56
    The game was demoed on a desktop GTX 580, and it was not running at max.
    Med-high sounds about right. I think high will be possible, but only super-high-end desktops will be able to play it on ULTRA.
     
  8. skygunner27

    skygunner27 A Genuine Child of Zion

    Reputations:
    1,694
    Messages:
    2,679
    Likes Received:
    7
    Trophy Points:
    56
    Here are the REAL recently released specs:

    Minimum PC requirements for Battlefield 3:
    •Hard Drive Space: 15 GB for disc version or 10 GB for digital version
    •Operating System: Windows Vista or Windows 7
    •Processor: Core 2 Duo @ 2.0GHz
    •RAM: 2GB
    •Video Card: DirectX 10 or 11 compatible Nvidia or AMD ATI card

    Recommended PC requirements for Battlefield 3:
    •Hard Drive Space: 15 GB for disc version or 10 GB for digital version
    •Operating System: Windows 7 64-bit
    •Processor: Quad-core Intel or AMD CPU
    •RAM: 4GB
    •Video Card: DirectX 11 Nvidia or AMD ATI card, GeForce GTX 460 or Radeon HD 6850

    Link
    Battlefield 3 PC specs revealed

    I think with 6970M+ CF, we all have nothing to worry about. :)
     
  9. Juanderful

    Juanderful Notebook Consultant

    Reputations:
    91
    Messages:
    294
    Likes Received:
    2
    Trophy Points:
    31
    If you take a close look at many of the released screenshots of BF3, the textures and feel of the game look eerily like the Source engine for some odd reason. Everything seems too "smooth." Reminds me of Counter-Strike: Source.

    Recommended specs for games usually mean "You can run this on high settings at 30FPS, but not ultra/everything enabled."

    So assuming the recommended spec for BF3 is a desktop 6850, the laptop equivalent would be the 6970M, meaning CF 6970Ms would maybe just barely be able to max BF3 out: the difference between high and ultra settings usually costs at least 50% of your performance, and the gain from adding a second 6970M in CF is around 50% too.
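    Rough napkin math on that (the 30FPS baseline and both 50% figures are assumptions from the above, not benchmarks):

        # Napkin math for the claim above; all inputs are assumptions, not benchmarks.
        fps_high_single = 30.0   # recommended-spec target: high settings, one 6970M
        ultra_penalty = 0.50     # going high -> ultra costs ~50% of performance
        cf_gain = 0.50           # a second 6970M in CF adds ~50%

        fps_ultra_single = fps_high_single * (1 - ultra_penalty)  # 15.0
        fps_ultra_cf = fps_ultra_single * (1 + cf_gain)           # 22.5
        print(fps_ultra_cf)      # ~22.5 FPS -- "maybe just barely" indeed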
     
  10. Mechanized Menace

    Mechanized Menace Lost in the MYST

    Reputations:
    1,370
    Messages:
    3,110
    Likes Received:
    63
    Trophy Points:
    116

    These are the fake specs released by GameStop, which they got from a BF3 forum.

    http://gamingeverything.com/interstitial.php?url=http://gamingeverything.com/?p=6228
     
  11. skygunner27

    skygunner27 A Genuine Child of Zion

    Reputations:
    1,694
    Messages:
    2,679
    Likes Received:
    7
    Trophy Points:
    56
  12. daveh98

    daveh98 P4P King

    Reputations:
    1,075
    Messages:
    1,500
    Likes Received:
    145
    Trophy Points:
    81
    The thing is that the game will look good even at a lower resolution. Also, if you look at some very graphically intensive games, the differences between high and ultra are usually not HUGE compared to a jump from low to high.

    A lot of it is personal preference. I personally like "high" at native resolution vs "ultra" at 1280 by 720; some view it the opposite way. 1080p is still a LOT of pixels, and if you lower the pixel count you can turn up the more demanding settings. And if you do play at 1080p, you really don't have to worry about AA at that resolution.
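    The pixel counts back that up (simple arithmetic):

        # How many pixels each frame has to shade at the two resolutions.
        native = 1920 * 1080   # 2,073,600 pixels
        lower = 1280 * 720     #   921,600 pixels
        print(native / lower)  # 2.25 -- native 1080p pushes 2.25x the pixels of 720p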

    Hopefully we get some nice settings with the current offering we have in our systems, as I really don't plan any upgrades until the next-gen consoles have been out for a few years. I tend to be a die-hard PC gamer, then switch to console gaming for a few years, and then upgrade my PCs.

    Either way, this game looks BEAST and will definitely be my go-to game for a while. I was really hoping the 6970 CF would handle this game well, but after looking closely at the footage... may be wishful thinking!
     
  13. iPhantomhives

    iPhantomhives Click the image to change your avatar.

    Reputations:
    791
    Messages:
    1,665
    Likes Received:
    48
    Trophy Points:
    66
    How about 6990?
     
  14. daveh98

    daveh98 P4P King

    Reputations:
    1,075
    Messages:
    1,500
    Likes Received:
    145
    Trophy Points:
    81
    It's all speculation. The differences seen in game between the two won't be shocking. I mean, when CS:S came on the scene in 2004, my XPS lappy with a Pentium and MR 9800 (8 pixel pipelines) could run the game relatively well. Now, after all the patches and updates, that thing couldn't run the game nearly as well. I think the laptops will play the game well, but not maxed.

    I remember when Crysis was supposedly being run on "two 7800 GTXs in SLI." When it debuted, people were struggling with desktop 8800 GTXs. I ran it on my old M15x with a combination of medium/high at 1440 by 900 and did not really enjoy the experience. Fast forward 4 years and I can finally enjoy it with 5870s in CF on my old R2. Now I am going to try it on my M18x.

    So I am "hopeful" that BF3 can run well and look good, but I don't expect any current notebook GPUs to max the game. Time will tell. Hopefully the 7-series AMD cards will fit the M18x, as that would be a pretty sweet upgrade path for current Gen 1 adopters of the M18x with 6970s/6990s. A 28nm fab process would be pretty sweet.
     
  15. AlienTroll

    AlienTroll Notebook Evangelist

    Reputations:
    319
    Messages:
    598
    Likes Received:
    0
    Trophy Points:
    30
    They promised NO mod tools. It should actually run better than BFBC2, because BFBC2 was a horrible port. I hope they have bots in the game, because I like playing with bots.
     
  16. Shaden

    Shaden Notebook Deity

    Reputations:
    827
    Messages:
    1,337
    Likes Received:
    7
    Trophy Points:
    56
    AMD has already noted that the 7-series cards will be MXM 3.0 ... so they should fit right in the M18x, no problem.

    :)

    m17x R2 owners are the luckiest sons of Bs ever ... considering they too will probably be able to use the 7 series ... crazy longevity in those laptops.
     
  17. _Cheesy_

    _Cheesy_ Notebook Hoarder

    Reputations:
    9
    Messages:
    1,060
    Likes Received:
    0
    Trophy Points:
    55
    What are R1, R2, and R3? Are those just the different generations of Alienware notebooks? If so, I take it all current Alienwares are R3? Also, what is Aliensmudge?

    Oh damn, I'm not asking about 6990M vs 580M anymore... :D
     
  18. Shaden

    Shaden Notebook Deity

    Reputations:
    827
    Messages:
    1,337
    Likes Received:
    7
    Trophy Points:
    56
    lol, R1/R2/R3 refer to revisions of the M17x. The M18x is ... just that ... I guess you could call it an R1 ...

    Aliensmudge refers to damage on the exterior of my unit when it was shipped ...

    I posted pics in another thread, it was pretty crazy
     
  19. serialsid

    serialsid Notebook Enthusiast

    Reputations:
    0
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
  20. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    If their recommended spec is an HD 6950, we should say a 6970 will deliver playable frame rates with all the eye candy on, bar MSAA settings. Maybe we can have 2X or 4X and still be OK; it remains to be seen what sort of demands its highest settings will place on a desktop 6970.

    Two 6970Ms definitely eclipse the single desktop 6970, and in the worst case match it. So dual 6990M users are sitting pretty, and the same goes for dual 580M GTXs.

    The 20GB install is good news, as it means less-compressed textures.
     
  21. Shaden

    Shaden Notebook Deity

    Reputations:
    827
    Messages:
    1,337
    Likes Received:
    7
    Trophy Points:
    56
    Fricken awesome, glad to see I will be gaming at the top of the peak until the 7Ks come out, and maybe beyond. I was really only worried about this game since this and D3 will be chewing up my time for the next year most likely.
     
  22. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    :eek: Those smoke and flame effects are some of the most stonking visuals I have seen in a game.
     
  23. widezu69

    widezu69 Goodbye Alienware

    Reputations:
    3,079
    Messages:
    4,207
    Likes Received:
    168
    Trophy Points:
    131
    *sigh* Aw man, hope my single 580M holds up as well as it can. I did manage to get a Vantage GPU score of 18k, beating a GTX 560 Ti, but I don't think that was at real gaming clocks. We'll see when the thing comes out :(
     
  24. Shaden

    Shaden Notebook Deity

    Reputations:
    827
    Messages:
    1,337
    Likes Received:
    7
    Trophy Points:
    56
    Looks like at worst you could OV and OC it, widez
     
  25. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    The 580M has 384 shaders, yes? How high do they clock stable? The desktop card runs around an 880MHz GPU core, so if they say a single 560 Ti is enough, things should hold up pretty well for an overclocked 580M GTX.
     
  26. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    I was wrong, the desktop card is at an 820MHz core clock, not 880MHz; those would be the superclocked versions.

    EDIT: From my old 8600M GT experience, the biggest gains came from higher shader clocks rather than core clocks. I guess that still holds true today.
     
  27. Shaden

    Shaden Notebook Deity

    Reputations:
    827
    Messages:
    1,337
    Likes Received:
    7
    Trophy Points:
    56
    My 8600M GT is still up and working. My brother is using the laptop these days; it has a nice T9300 CPU in it and 4GB of RAM, and it still plays a lot of current games at mid settings
     
  28. ARGH

    ARGH Notebook Deity

    Reputations:
    391
    Messages:
    1,883
    Likes Received:
    24
    Trophy Points:
    56
    Shader and core clocks are the same thing.
     
  29. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    I believe you are mistaken?

    Core clocks are different from shader clocks on Nvidia. AMD has the shader clock locked to the core clock.
     
  30. widezu69

    widezu69 Goodbye Alienware

    Reputations:
    3,079
    Messages:
    4,207
    Likes Received:
    168
    Trophy Points:
    131
    Yeah, in Nvidia's Fermi architecture the shader clock is always double the core clock. You can't change them independently. You can't unlink them, not even with flashing.
     
  31. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    Then how was I able to clock them independently on the 8600M GT? Unless something changed recently? I remember using NiBiTor or something to custom-set the clocks in the vBIOS for my old 8600M GT.
     
  32. widezu69

    widezu69 Goodbye Alienware

    Reputations:
    3,079
    Messages:
    4,207
    Likes Received:
    168
    Trophy Points:
    131
    I said "Fermi" which means 4xx series and higher.
     
  33. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    Just checked: the entire GeForce 8 series had independent clocks for their shaders and core.

    For example, the 8800 GTS has a core clock of 650 and shaders at 1625. Definitely not linked together.

    I don't know, though, whether it changed since then.
     
  34. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    Ah, I see. OK, then that's different. Thanks for that info. +1 rep
     
  35. widezu69

    widezu69 Goodbye Alienware

    Reputations:
    3,079
    Messages:
    4,207
    Likes Received:
    168
    Trophy Points:
    131
    The Fermi cards do have separate core and shader clocks; my 580M runs 625 core and 1250 shader. However, when tuning one, the other moves proportionally. Unlike the DX10 cards before them, the DX11 cards are permanently linked. Shame, I could probably crank out some more frames by OCing the shaders more than the core.
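    If it helps to picture the linkage (hypothetical clock values around the 580M's 625/1250 stock):

        # Fermi locks the shader clock at exactly 2x core; tuning one drags the other.
        def fermi_clocks(core_mhz: float) -> tuple[float, float]:
            """Return (core, shader) clocks; the 2:1 ratio is fixed on Fermi."""
            return core_mhz, core_mhz * 2

        print(fermi_clocks(625))  # (625, 1250) -- 580M stock
        print(fermi_clocks(700))  # (700, 1400) -- +75MHz core drags shaders up 150MHz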
     
  36. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    Yeah, it's a bummer they went that route. I enjoyed my time with the GeForce 8/9 series for that reason.
     
  37. 24usedtorock

    24usedtorock Notebook Geek

    Reputations:
    0
    Messages:
    97
    Likes Received:
    1
    Trophy Points:
    16
    What about us dual 460M owners? Will it be playable?
     
  38. Shaden

    Shaden Notebook Deity

    Reputations:
    827
    Messages:
    1,337
    Likes Received:
    7
    Trophy Points:
    56
    That depends on your definition of playable. Medium settings, probably.
     
  39. 3demons

    3demons Battlefield 3 Ace

    Reputations:
    305
    Messages:
    596
    Likes Received:
    0
    Trophy Points:
    30
    This may not be the place to ask, but I'll bite anyway.

    I'm trying to understand the GPU terminology a little better, so some questions:

    - What does the term Fermi mean?

    - What are shaders and cores, and what do they actually do (both hardware-wise and software/in-game)?

    - With the new 7 series from AMD, I hear the term "fab process"; what is that? It's also said to be a 28nm GPU; what does that mean?
     
  40. daveh98

    daveh98 P4P King

    Reputations:
    1,075
    Messages:
    1,500
    Likes Received:
    145
    Trophy Points:
    81

    No prob brother, I have probably been saying that to you in another thread.

    Fermi is the code name of the architecture behind Nvidia's 400/500-series cards; I don't know what the upcoming names are. Fermi was the "big thing," just a name used to define the new Nvidia cards when they launched.

    Shader clocks, I believe, affect the shaders in games. So up the clocks and it will help push the shaders for a given application.

    Fab process is just the "fabrication" of the new CPUs/GPUs. Think of it like a car production line, so to speak. It's "retooling" so that the new GPUs go down to only 28nm, which lets them fit a LOT more transistors and whatnot.

    I hope that clears things up!
     
  41. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    Fermi is just a family name given to a class of chips Nvidia developed. AMD calls theirs things like Evergreen, Northern Islands, Southern Islands.

    Previously we had terms like pixel shaders and pipelines, where more was better, along with the simple concepts of the core clock and the memory clock.

    That all changed with DirectX 10, when "unified shaders" came into the picture. The idea was that GPU makers could design one basic unit and repeat it until they ran out of real estate on the silicon or hit the thermal limits of their design rules.

    Nvidia likes to call these simple units CUDA cores; AMD/ATI calls them SPs (stream processors). It's the same concept: one unit that can be repeated over and over, like the cores in a CPU. We have dual-, quad-, hexa- and octo-core CPUs; think of a GPU as having many hundreds of such cores.

    What makes the vendors' units different is the stuff inside them. These units can have any number of shaders clubbed together, and not every shader inside does the same kind of workload, though all of them are collectively referred to as shaders.

    AMD in their 5000 series clubbed 5 shaders together with the dispatcher, memory-request logic, cache, etc. to form one SP, or stream processor. Nvidia's earliest unified-shader cards had fewer shaders per unit, but the unit had other logic accompanying those shaders. An Nvidia shader is not directly comparable to an AMD shader, as they do different workloads at different rates, which is why we can see the 384-shader 580M GTX giving strong competition to AMD's 1120-shader 6990M. Today's Fermis have about 32 cores in a single unit.

    So 384 shaders represent 384/32 = 12 CUDA units, while for AMD 1120 shaders represent 1120/5 = 224 SPs (stream processors).
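    To make that arithmetic concrete (the per-unit widths are the figures above, not official specs):

        # Turning marketing "shader" counts back into repeatable base units,
        # using the per-unit widths quoted above (assumed figures, not specs).
        def base_units(total_shaders: int, shaders_per_unit: int) -> float:
            return total_shaders / shaders_per_unit

        print(base_units(384, 32))   # 12.0  -- 580M GTX: Fermi units
        print(base_units(1120, 5))   # 224.0 -- 6990M: 5-shader stream processors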

    Nvidia likes to call the shaders "cores"; it's all marketing spin. What we are interested in is the base unit that is repeatable. Sometimes Nvidia calls these units "clusters" as well; now they call them SMs (streaming multiprocessors).

    Recently AMD changed their architecture from 5-shader SPs to 4-shader SPs. They did this to pack more SPs into the same real estate on silicon. Sure, they lost some compute power per SP, but they gained significantly on gaming performance. This they called VLIW4 in the 6900-series desktops, compared to the VLIW5-based tech of the 5000 series.

    Just as we say processors are not comparable clock for clock because they do different amounts of work per cycle, we can't compare these shaders pound for pound.

    FAB is short for fabrication foundry: a plant or facility for manufacturing chips, from CPUs to RAM to GPUs.

    Terms like 28nm, 45nm, 32nm and 22nm refer to the channel width of a transistor. The transistor is the basic element in a chip; millions, these days billions, of them form the various units and logic that make up the whole working chip.

    The smaller the channel width, the faster the switching and the lower the power consumed. But there are other nasty effects, like leakage current, that can eat into the on-paper gains.

    So let's say we take our 6970Ms, the chip code-named "Barts," currently made on the 40nm process, and shrink the design to 28nm. Now the chip is smaller for the same job, needs less voltage, and can clock much higher. But AMD and Nvidia will not leave it at that: they will pack in more SPs and units up to a predetermined thermal envelope and make a more powerful GPU.
     
  42. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    I made some corrections, as I got carried away mixing Fermi details with the first unified-shader chips. My mind was thinking two things at the same time. Sorry about that. :eek:
     
  43. wraithrsw

    wraithrsw Notebook Consultant

    Reputations:
    59
    Messages:
    178
    Likes Received:
    0
    Trophy Points:
    30
    Hmm, I figure a GPU upgrade may be in the mix soon barring a likely clampdown on free time.
     
  44. oni222

    oni222 Notebook Deity

    Reputations:
    310
    Messages:
    733
    Likes Received:
    5
    Trophy Points:
    31
    Just remember: having the recommended specs does not mean you will be able to max all the settings, especially at resolutions higher than 1080p.
     
  45. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    The M18x tops out at 1080p, so why worry about resolutions higher than that? Unless you go onto an external monitor with a much higher resolution.

    At 1080p, dual 6970Ms are more than enough, given how a single Mobility 5870 can handle the current BF3 alpha. That will hold for dual 6970Ms (and above) despite the extra settings the full retail version will have. I can bet on it.
     
  46. oni222

    oni222 Notebook Deity

    Reputations:
    310
    Messages:
    733
    Likes Received:
    5
    Trophy Points:
    31
    Don't forget some of us have external monitors for when we are at home.

    So it's still a valid point.
     
  47. mharidas

    mharidas VLSI/FAB Engineer

    Reputations:
    340
    Messages:
    948
    Likes Received:
    0
    Trophy Points:
    30
    I fail to see the point of trying to push any of the current laptop systems, even in CrossFire, at resolutions higher than 1080p, as there are games which get unplayable at 2560 x 1600. Crysis 2, Metro 2033 and The Witcher 2 are all going to be only barely playable at that resolution even on dual 6870s; Metro 2033 gets 17.5fps or so at 2560 by 1600 on dual 6850s.

    You need 6950s/570s/580s/6970s in CrossFire/SLI to even begin to play them smoothly, without the micro-stuttering you will face in the 27-35fps range. That being the desktop situation, expecting the same from a laptop is just absurd.

    Are you suggesting 2560 x 1600 is the mainstream resolution these days? I would beg to differ. Anyone can deliberately try to make a system look weak. 1920 by 1080 is the most frequently used "high" resolution for gaming.

    So I ask again, what's the point in trying to go beyond the norm when you know you can get all the eye candy at 1080p? I am fairly certain you are in the minority here when it comes to playing beyond 1080p, and at 1080p BF3 will run just fine. If you keep sticking to 2560 by 1600, you will run out of steam very soon with newer titles, and it won't matter whether you have 7000M or Kepler GPUs.
     
  48. oni222

    oni222 Notebook Deity

    Reputations:
    310
    Messages:
    733
    Likes Received:
    5
    Trophy Points:
    31
    I'm running my 28" at 1200p and I have no problems with any of my games.

    Not to mention I know a lot of other people on this forum use IPS monitors at higher resolutions.

    So just because you only play things at 1080p does not mean that everybody limits themselves.
     
  49. 3demons

    3demons Battlefield 3 Ace

    Reputations:
    305
    Messages:
    596
    Likes Received:
    0
    Trophy Points:
    30
    Guys, once the beta comes out, if peeps can log what frame rates they are getting on the highest possible settings on both the 6990Ms and the 580Ms, that would do me a huge favor. Hopefully it can stay at 40+ with dual GPUs. Thanks
     
  50. oni222

    oni222 Notebook Deity

    Reputations:
    310
    Messages:
    733
    Likes Received:
    5
    Trophy Points:
    31
    Well, the closed beta has been out for a while, and many forum members, including Skygunner, have been involved with it. The only problem is they have signed an NDA, so they cannot post pics, but they are trustworthy in my opinion, so if they post their findings here I'll believe them.
     