After burning my fingers on a costly HP machine (dv9500t), I am now debating whether to buy another power-packed notebook (this time a gaming one), and an ultra-expensive M17X at that.
My primary question is: how good are the graphics cards (GTX 280M and GTX 260M) being offered? How long have they been on the market, and how old would the oldest notebooks using these cards be? I know they shouldn't have the heating defects of the 8400M/8600M series, but how hot do they still get? Also, does having an air conditioner on while running games help prolong the life of a GPU like the GTX 280M?
In "Dual 1GB GDDR3 NVIDIA® GeForce® GTX 280M", what does "dual" mean? Does it mean that the total graphics RAM is 2GB? That would be insane!
Sorry for the newbie questions, as I am really afraid of buying another costly lemon!
-
Charles P. Jefferies Lead Moderator Super Moderator
Our official M17x review will be up next week; keep checking the homepage ( www.notebookreview.com).
The GTX 260M and 280M are very powerful video cards; in fact, they are the fastest on the market. The best value is currently dual GTX 260Ms in SLI; the 280Ms are not a whole lot faster. The M17x has a great cooling system, so they stay rather cool; I have not seen them go above 80°C, even with the CPU overclocked.
Having an air conditioner on while playing games could help, but the M17x doesn't need it. You won't necessarily prolong the lifespan by having one on.
The M17x does not have 2GB of video memory; in SLI mode, the video memory between the cards is mirrored, so you actually have 1GB total. 1GB is more than enough for all modern games and should be enough for some time. -
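The mirroring point above can be put in a minimal sketch (a toy model with hypothetical numbers, not anything from Nvidia's drivers): because SLI keeps a full copy of the same textures and frame data on each card, per-card capacities don't add up.

```python
# Toy model: effective VRAM under SLI mirroring (illustrative only).
# Each card holds a duplicate copy of the working set, so capacity doesn't stack.

def effective_vram(per_card_mb, num_cards, mirrored=True):
    """Return usable video memory in MB."""
    if mirrored:
        # SLI mirrors the same data onto every card's memory.
        return per_card_mb
    # Hypothetical pooled-memory case (SLI does NOT work this way).
    return per_card_mb * num_cards

print(effective_vram(1024, 2))                  # dual 1GB cards in SLI -> 1024
print(effective_vram(1024, 2, mirrored=False))  # imaginary pooling -> 2048
```

So "Dual 1GB" in the marketing copy describes the hardware installed, while the first branch describes what games can actually address.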
That is not correct; the M17X has 2GB of video memory, 1GB for each card, as per the description on the Alienware website and also the information on my M17X when looking in the Nvidia properties.
For someone who writes reviews, you should do a better job of informing people.
Sorry to be so rude, but bad information is already bad enough from newbies; from officials of NotebookReview it is not acceptable.
"Dual NVIDIA® GeForce® GTX 280M, 2GB SLI® Enabled [Included in Price]" -
Chaz was correct in his statement.
I would suggest you go and read up on how SLI works! -
lol
the cards only use one card's RAM, mate, so only 1GB is used
PC's 101
... -
Soviet Sunrise Notebook Prophet
Joebarchuck looks in the mirror every morning thinking that there is a clone of him mimicking his every movement.
-
I think there is a problem here. First of all, I never said there was 2GB per card. There is 2GB total, and yes, it is mirrored with SLI, but nonetheless you still have 2GB of total video memory, though only 1GB is really useful.
The problem here is Chaz's statement: he says the M17X does not have 2GB of video memory. Well, technically that's not true; there is!
SLI works this way: one card renders the top of the screen and the other renders the bottom. They each have 1GB of memory to use for their share of the rendering, so yes, there is 2GB of video memory, but of course neither card can use 2GB. We all got that. The point is there is 2GB of video memory on the M17X. -
Soviet Sunrise Notebook Prophet
It doesn't matter. Only 1GB is usable despite technically having 2 x 1GB cards. The only time 2GB is used to describe SLI is in marketing by companies.
-
Well, I guess you could view it both ways, but again, what I wanted to say is that the way Chaz wrote it implies each card has 512MB. Remember that someone who knows nothing of SLI or CrossFire, like the OP, would automatically assume that. Therefore it is necessary to state the truth: each card has 1GB of memory, totalling 2GB of video memory, usable at a rate of 1GB per card.
-
ok
the facts instead of arguing
there are 2 gigs of VRAM
only 1 gig is used -
actually i lol'd @ that
-
Soviet Sunrise Notebook Prophet
-
I am lost also. So if you are not using RAM from the second card, what is the use of dual cards? So a 1GB card should be plenty for light gaming like GTA 4 or Sacred 2? I need to know this before I purchase a dual setup. I was going to order one 3 days ago but decided to wait a few weeks.
-
1GB for the top half of the screen + 1GB for the bottom half of the screen = 1GB for the entire screen
1 × 0.5 + 1 × 0.5 = 1!!
Works out mathematically. -
You use both cards, but not both cards' RAM
-
Is it possible to dedicate one card to PhysX exclusively, though, rather than SLI? If that were the case, you would be able to use up to 2GB of RAM.
-
No ...........
-
I'm sure you could hack PhysX to take over one card... it just requires driver linking. I'm just trying to push the argument that there has to be a way to push the RAM usage to 2GB.
-
Go ahead; if you're smarter than Nvidia, be my guest
-
"jerry jerry jerry"
-
Can you provide a LINK that supports your description of how SLI works? I was under the impression that it worked somewhat like what was alluded to in Chaz's and Soviet's posts: that SLI was similar to RAIDed hard drives, except only applications that are coded to utilize... whatever, nevermind -
-
-
Kade Storm The Devil's Advocate
Just wiki the information.
Dual cards rendering each half of a screen is known as 'split-frame rendering', and then there are dual cards that take turns rendering each frame, which is known as 'alternate-frame rendering'. The latter is actually much more efficient and better for the majority of games, especially those that utilise a shader-intensive engine.
My source: Nvidia Control Panel.
As for PhysX, well, you simply deactivate SLI, and the idle card will run your PhysX. I thought this was all understood? But how PhysX will ever use 1GB of GDDR3 RAM is beyond me at the moment.
I think the real bone of contention here was that whole 'technically, you have a total of 2 gigs of ram,' rhetoric. However, we all know that effectively the rendering performance cuts down to 1 GB of RAM. With SLi, you simply yield the benefits of two cards taking the pressure off each other by either taking turns at rendering a running scene, or rendering one-half of the scene, per card. It never meant that everything is literally doubled. End of story. Chaz was spot-on, and he even made a point to mention that both cards will have 1 GB of RAM, each. How and why anyone would be motivated to pursue such a bloody semantic argument against him, and then grasp at every little straw, escapes logic altogether. -
No, that's not quite how it works. Here is the wiki...
http://en.wikipedia.org/wiki/Scalable_Link_Interface
The split isn't 50% of the screen per card, but 50% of the measured workload (or an attempt at it). You may have one card actually rendering a lot more of the screen because other areas (water/transparency/complexity) require more processing power.
Turn on the SLI indicators in a game that properly supports split-frame rendering and watch the lines go all over the place. -
Kade Storm The Devil's Advocate
Of course you don't, but we're talking about dedicating a single card to PhysX. When you disable SLi, the idle GPU will most likely become the dedicated PPU for the relevant software using a PhysX engine.
-Fin- -
-
Kade Storm The Devil's Advocate
I have a harsh bias against Dell these days. However, Alienware is now under Dell's control, so you'll get the same support that other Dell owners get, I believe. And in theory, this is what should be happening.
I think another M17x owner would be better suited to answer this question. Here in the UK, they've still got different teams for XPS and Alienware. How about stateside? -
It really works like the paragraph below, but I used the top of the screen and the bottom of the screen to simplify the process:
Split Frame Rendering (SFR), the first rendering method. This analyzes the rendered image in order to split the workload 50/50 between the two GPUs. To do this, the frame is split horizontally in varying ratios depending on geometry. For example, in a scene where the top half of the frame is mostly empty sky, the dividing line will lower, balancing geometry workload between the two GPUs. This method does not scale geometry or work as well as AFR, however. -
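The SFR balancing described in that paragraph can be sketched in a few lines (a simplified toy model: real drivers estimate geometry cost, not a per-scanline list, and the function name here is made up for illustration):

```python
# Toy model of Split Frame Rendering's load balancing.
# The horizontal dividing line moves so each GPU gets ~50% of the work,
# not 50% of the pixels.

def sfr_split_row(row_costs):
    """Given per-scanline render costs, return the row index where the
    frame is split so both GPUs carry roughly half the total workload."""
    total = sum(row_costs)
    running = 0
    for i, cost in enumerate(row_costs):
        running += cost
        if running >= total / 2:
            return i
    return len(row_costs) - 1

# A frame whose top half is mostly empty sky (cheap rows first),
# with the geometry concentrated in the bottom half:
costs = [1] * 50 + [9] * 50   # 100 scanlines
print(sfr_split_row(costs))   # -> 72: the dividing line drops well below row 50
```

With a uniform scene (`[1] * 100`) the split lands at the middle; skewing the cost toward the bottom pushes the line down, which is exactly the "mostly empty sky" example in the quoted paragraph.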
ok, we're getting off-topic now, joebarchuk, and need to get back to where you provide a LINK that supports your argument that dual 1GB cards SLI'd = 2GB of VRAM.
-
These guys RAIDed 24 256GB Samsung SSDs, but the total storage is still only 256GB.
http://www.youtube.com/watch?v=aIZt...770975482&playnext=1&playnext_from=PL&index=1 -
-
Wasn't that a RAID 0?
Pretty sure it was; he combined 24 x 256GB for a total of 6TB of storage, hence RAID 0, not RAID 1... -
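The RAID analogy being argued over reduces to simple arithmetic (an illustrative sketch; only levels 0 and 1 are modeled): striping adds capacities, mirroring duplicates data the way SLI duplicates VRAM contents.

```python
# Sketch: why the RAID analogy cuts both ways (illustrative arithmetic).

def raid_capacity(drive_gb, n, level):
    """Usable capacity in GB for n identical drives at RAID level 0 or 1."""
    if level == 0:
        # Striping: data is spread across drives, capacities add.
        return drive_gb * n
    if level == 1:
        # Mirroring: every drive holds the same data, like SLI's VRAM.
        return drive_gb
    raise ValueError("only RAID 0/1 modeled here")

print(raid_capacity(256, 24, 0))  # 6144 GB (~6 TB) -- the Samsung demo
print(raid_capacity(256, 24, 1))  # 256 GB -- the mirrored case
```

So the video linked above (RAID 0) is the wrong analogy for SLI; RAID 1 is the closer one.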
I pasted just above your post how Wikipedia sees SLI working. This clearly shows that each graphics card processes different parts of the image, each using the 1GB of memory it has.
Therefore it's not like RAID 1, where two 256GB drives only give 256GB of available space.
What is true, though, is that each graphics card only has the possibility of using 1GB, but it's not shared memory. They each have their own 1GB to work from, so you can argue it's like having 2GB of video memory.
Chaz's reply to the OP clearly states there is a shared 1GB of memory, which is absolutely not the case. -
It is not shared per se, but both memory modules are filled with the same data. It is pointless to argue the whole "I have 2GB of VRAM" deal.
So we could say that it is redundant in how it works.
That OK, Joe? -
Charles P. Jefferies Lead Moderator Super Moderator
That said, let's stay on-topic here. Here is the original post for those interested:
-
This happens quite a bit, Chaz.
Someone posts something they think is right, but then gets proven wrong.
Then they'll try and defend it for another 100 posts... -
Either graphics card setup will be fine for all modern games at max settings (except Crysis, of course), and it should run them all very smoothly. My single 260 has never gotten over 60 degrees thanks to the M17x's amazing cooling system. I don't think having the AC on while gaming would have any noticeable impact on GPU life. A better thing to do would be to get a notebook cooler such as the NZXT Cryo LX.
If you haven't figured it out, dual 1GB cards means 2 cards with 1GB each for a total of 2GB, but only 1GB is usable (not sure why; someone mentioned that they mirror each other). Regardless, 1GB is and will be sufficient for a long time to come.
You will very much enjoy the M17x if you decide to order it, I can almost guarantee it. -
-
So in other words, a casual gamer who plays Prototype, GTA 4, and Sacred 2 would be better off with a single 280 or 260, or better, a dual 260? I was about to order dual 260s thinking I would have 2 gigs of video RAM and would not need to update my video card for at least 5 years.
I mostly deal with DVDs, but I do not want to be stuck like with my last notebook, where 6 months after I got it I could not play any games (GeForce Go 5200 32MB video card, woohoo).
Used that laptop for about 4 years without playing any games on that awesome 32 megs of video RAM.
Long story short: is a casual gamer better off with a single 280 or 260, or better, a dual 260? Gaming maybe 8 hours a week will not put a lot of strain on a single 1GB card. But video and surfing, 70 hours a week. -
VRAM does not matter.
Get SLI and forget about it. -
Soviet Sunrise Notebook Prophet
This thread receives the Soviet Seal of Approval. Joebarchuck, thanks for making us Californians look bad.
-
Don't make Californians look bad D:
-
SoundOf1HandClapping Was once a Forge
For real.
+1 char -
Hold on. First of all, I've been on this forum for many months and I have never argued with nor contradicted anyone. I was always, to the best of my knowledge, helpful or looking for help.
But some things have been said here that are not true, like "they both process the same data". NO, each card in SLI mode does not process the same data. They process different parts of the graphics and combine them to make one image, or one frame I should say; therefore each card's 1GB of video memory is used independently. This is how SLI works.
I totally understand that if SLI worked as shared memory, meaning both cards use the same video RAM, then yes, there would only be 1GB of video memory no matter how much is advertised, but that's not the case at all. -
Mmmm...
I thought I opened up a thread about how reliable the M17X is...
If you want to discuss something like that, fine, but stay on topic here.
Start a new thread or send some PMs.
Don't start over again... -
So how reliable is it.. ?
-
There do appear to be a fair few issues with the hybrid GPU: Stealth mode getting stuck, black screens on wake-up, etc., all of which appear to be software/driver related.
Some updates seem to have fixed some of these issues already, e.g. BIOS A02.
I say 8/10 in the reliability stakes.
M17X - How reliable it is?
Discussion in 'Alienware' started by Visu2k7, Aug 30, 2009.