Basically that... 5870 will be superior in the end IMO..
lol, most laptops beat the G73 with its 5870 in 3DMark06... I'm not surprised this does. What does it get in Vantage?
-
The problem for the GTX 480M is that it costs so much and there is no need for it...
The current issue with the Alienware M17x is that there is no reason to upgrade to the HD5870M. That machine still plays every game out there like a beast with the HD4870. And DX11 isn't that big of a deal considering the trend in game development.
But the real truth of the matter is, there isn't a single game aside from Metro 2033 that the HD5870M cannot play at high detail in 1080p...
And as everyone with a brain except Nvidia PR has noted, full tessellation in games is far away. The current trend is DX9 console-to-PC ports, and I don't see that changing until Microsoft and Sony introduce the next-gen consoles. So for at least another two years, I don't see any need for Nvidia's tessellation.
And game developers aren't stupid. They know the only way for people to play games with heavy tessellation is to either buy a $4,000 laptop or buy a $400-600 desktop beast with $200 of water cooling, $$$ for an upgraded PSU, and more $$$ on the electric bill. There aren't enough gamers with that kind of expendable cash for any developer to spend money taking advantage of it. If they do, it will only be as a tech showcase, but if a game company wants to make a game like BC2 with lots of players participating, they won't waste the time or money.
Just from casual talk online, the most recommended card I still hear about is a cheap 8800. For most gamers this is still more than enough.
Online FPS Games to Come, no need for expensive Nvidia
- The next big FPS multi-player will be Medal of Honor using the same engine as BC2
- BF3 is rumored to use the next Frostbite 2 engine, and that engine will reportedly only use tessellation on things like wheels to make them rounder; the rest is done with DirectCompute
- The next COD will likely use the same formula as the other recent CODs: get it to run at 60 FPS on as many machines as possible.
Online MMORPG Games to Come, no need for expensive Nvidia
- FF XIV will be a great-looking MMO, but they want as many people as possible to play. It will not be a tech showcase for Nvidia
- GW2 will be the same as above and probably not as good-looking as FF XIV; I'm thinking slightly better than Aion
- Tera Online will be using the UE3 engine, so yeah, again: make it look good and run on as many machines as possible
- And of course the giant WoW will remain DX9. Every guild I was in, from casual to hardcore, only had half the players on DX10.1... The attraction of WoW is that the MAJORITY of gamers are casual and they don't care about Nvidia's failings. WoW provides them with endless entertainment on DX9; that's the big draw of the game IMO.
Strategy Games
- Strategy games have always put more emphasis on the CPU anyway. RA3, DoW II, Civ IV, and Supreme Commander are a joke for the HD5870M.
- Starcraft 2, yeah... an 8800 is already overkill for this game
RPG Games
- ME3 will be using UE3 again, it seems, and that engine runs awesome on the HD5870M
So personally, for me, I don't see any need to upgrade my HD5870M for a long time.
- And I think a lot of laptop gamers are going to be thinking my way also. -
I think you're a great guy, ziddy123... you have very good knowledge and are willing to help people out... but you do come across as a really hardcore ATI fan and Nvidia hater.
I will just close with one thing: your analogy is great and truly well put, but it's like the Japanese car, which is efficient, gets good mileage, has all the latest tech, etc. But there will always be muscle car / high-performance lovers who will pay a premium for extra power even if they don't need it and have to spend so much on gas... it's just the way it is. Just don't hate the Nvidia family; even if they aren't as efficient as ATI, they will still have something to offer the market.
And if we all were to follow your advice... Nvidia would go bankrupt, then ATI would own the market... and then no more good innovation from them. It's healthy that we have both companies releasing new products and consumers buying them. -
It is basically the difference between effective and efficient. The 480M brings the big performance, along with high power consumption, heat, etc. The HD5870 brings adequate performance while maintaining more efficient performance per watt.
Both bring things to the table, and in the end both achieve the goal of giving people on the go a decent gaming chip. -
Wow, ziddy and sean, you guys seriously need to take a step back and ask yourselves why you care so much.
-
Well, haters gonna hate. Before any benches are out, it's "at most 10-20% performance gain"; when benches are out, it's "too hot"; and if next it's proven that the cards will actually run cooler than or on par with the 5870MR (which is very possible), I guess all that's left will be "the card is too expensive". There will always be excuses to hate.
Talking of practicality, some people just like to play all games on max settings at high frame rates. If you just want to play all games, hell, a MacBook Pro with an integrated 320M can play almost all games as well, just minus the eye candy, so why even get the 5870MR in the first place? A high-performance card is a high-performance card; saying it's so high performance that you don't need that much performance, in a thread about a performance graphics card, is really lame.
I don't know what's with the hype over heat issues either; the card is not even in our hands yet. When the 5870MR was about to come out, we were saying how much cooler it would run compared to the 285M, but in fact it runs even hotter despite the rated TDP.
And honestly, it's ironic: when the 5870MR was out and the 285M didn't even have DX11, the same haters were saying how great tessellation is; now that the 480M, which greatly utilizes DX11, is out, the same people start saying how meaningless DX11 is.
I myself use a 5870MR, but I know it's enough for me, at least for now, and when I see a higher-performance card I don't just go bash it because I can't get it; I respect it as superior hardware. -
16000=12000???
ziddy and sean, are you guys kidding me?
my GTX 260M must = the HD5870 by your reasoning.
GTX 480M >>>>>> Mobility HD5870 > GTX 280M > GTX 260M -
The only consumer card that exceeds the 5870MR in 06 on Notebookcheck is the 285M, and if you look at the test results, the 285M results only include 820QM and 920XM CPUs, while those for the 5870MR range from i5 to i7. The 06 test here uses the same CPU for both. -
I don't like Nvidia as a company. They have dirty business practices. They seem to do everything for themselves, not to advance the gaming/computer industry. I don't think they are supporters of gamers; if they were, they would do what's good for gamers. But it's always about them, and they refuse to cooperate with others to make the situation ideal for everyone. They are constantly trying to play dirty in order to force developers/customers to support them.
I'm a big believer, give people choices. So I like that Nvidia is a choice. But how you present and provide that choice is what I have issue with.
PhysX, forcing developers to use it if they want Nvidia support... A closed system that screws over all other competition.
Disabling Hybrid PhysX, even though these ATi users have spent good money to buy an Nvidia Card with PhysX.
CUDA, why not give it up and pursue OpenCL?
The fiasco with DX 10, and why we had DX 10.1
XFX decides to make ATi cards, now Nvidia has them on their blacklist. Awesome
They are always playing dirty with their benchmarks. Heck, even going as far as to buy Unigine 1.0, etc.
And here is the latest of their antics:
NVIDIA breaks up with Hardware Secrets | Hardware Secrets
Let the consumer decide! But Nvidia would rather play "we decide for you", or do everything they can to make it so.
Am I exaggerating? Probably. But that's how I see what's going on. I see progress, then I read about Nvidia doing some BS to hold back that progress and go down some proprietary side street.
TBH I don't see much difference between Nvidia and Apple. -
Oh wow. That is rather childish, IMHO.
-
Yeah, ok. -
-
I'll take the top performer 480m. Too bad it won't work with the w870cu!
-
I always thought Ziddy and Sean were the same guy.
Anyway, Ziddy man, stop it. I stroll through many ATI forums (my desktop has 5870s in Crossfire with so many driver problems that I have to look in forums for help, although I still love them more than anything Nvidia) and I have never seen ANYONE as big an ATI fanboy as you. You are a good part of this community, but just STOP THE HATING man; what has Nvidia done to you that is so evil that you must fight such a battle on a forum? ><
I know you love your Asus ATI laptop and all, but man, all this anger will make you the next Sith.
EDIT: My personal opinion is that I too dislike Nvidia as a company. I joined these forums 2 years ago looking for a laptop and ended up going with an Nvidia 9600M GT rebranded card; add on top of that my GF's 8600M frying because of... well, we all know why, and since then I have been disgusted by them. But competition is good, and if Nvidia releases a TOP PERFORMANCE card, be it 100W or whatever, if it works, it works, and that's it. And to be honest, seeing all the problems I have with ATI driver-wise, you can't say they are much better.
What I am saying is, I understand you, but ATI would do the same things Nvidia did if ATI had the problems they had. They are a company, and survival is key. Stop the fanboyism at the expense of objectivity. -
MahmoudDewy Gaming Laptops Master Race!
Eurocom will provide the GTX 480M in an SLI setup
"The Panther 2.0, as it is called, will combine two GeForce GTX 480M GPUs in SLI configuration"
Notebookcheck: Eurocom set to bring you the NVIDIA GeForce GTX 480M inside its D900F Panther -
SoundOf1HandClapping Was once a Forge
Now where are the 460m and 470m, I wonder. Those are more relevant to my interests.
-
-
I kinda think we won't see GTX 480s in Asus machines, as we didn't see GTX 285s.
-
SoundOf1HandClapping Was once a Forge
That's actually why I'm asking. I'm assuming Asus wouldn't be so insane as to put the highest-level Fermi into a 15-incher, so I'm wondering what the second- or third-best cards would be. I'm waiting for their new 15-inch gamer.
And, if Phinagle's schedule is correct, they might have the time to do something to keep the dang card from burning through my chassis. -
-
If a 480 were to ever make it to anything less than a 17 inch laptop, then that company would have to be incredibly creative.
-
Here's a comparison I did of memory bandwidth and computational power of the GTX 480M and HD 5870 vs their desktop versions:
Radeon HD 5870 - 2.72TFLOPS, 153.6GB/s
Mobility Radeon HD 5870 - 1.12TFLOPS (41.2%), 64GB/s (41.7%)
GTX 480 - 1.345TFLOPS, 177.4GB/s
GTX 480M - 0.598TFLOPS (44.4%), 76.8GB/s (43%)
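If anyone wants to double-check those percentages, here's a tiny Python sketch (purely my own back-of-the-envelope; it just divides the raw spec figures listed above, nothing more):

```python
# Mobile-vs-desktop retention, computed from the TFLOPS / GB/s figures above.
# This is just arithmetic on raw specs, not a benchmark result.
specs = {
    "Radeon HD 5870":          (2.720, 153.6),   # (TFLOPS, memory GB/s)
    "Mobility Radeon HD 5870": (1.120,  64.0),
    "GTX 480":                 (1.345, 177.4),
    "GTX 480M":                (0.598,  76.8),
}

pairs = [
    ("Mobility Radeon HD 5870", "Radeon HD 5870"),
    ("GTX 480M", "GTX 480"),
]

for mobile, desktop in pairs:
    (m_flops, m_bw), (d_flops, d_bw) = specs[mobile], specs[desktop]
    print(f"{mobile}: {m_flops / d_flops:.1%} of the compute, "
          f"{m_bw / d_bw:.1%} of the bandwidth of the {desktop}")
    # Matches the percentages quoted above, within rounding.
```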
Both pairs of cards have the same overall architecture, though with different SP counts and different clock speeds. Overall, I would expect the performance gap between these laptop cards to be similar to that between their desktop versions, with the gap being another 5% or so bigger in the mobile case. -
There is nothing coming that will make my machine melt, so I will save my cash for the 580Ms.
Edit -- Oh, and to all those who doubted me when I said 480Ms would be out by the end of June and called me out on it: bet ya wish you believed me now -
italian.madness Notebook Consultant
-
Put them into an X7200 with a 120Hz screen for 3D Vision, which requires double the rendering power, and it will be just right. I really don't see the point of all those 3D Vision notebooks running around with some weaksauce 360M; it can't even play all games at max settings on a 2D screen, not to mention 3D screens that require double the power.
-
italian.madness Notebook Consultant
-
Just use the desktop 5770 and GTX 465 for the more direct comparison. -
Based on the power usage, I would expect the 480M to be worse than the 465 by a larger margin than the MR5870 is worse than the 5770. -
Nonetheless, it's pretty easy to do the same comparison for the other cards, so here I go:
Radeon HD 5770 - 1.36TFLOPS, 76.8GB/s
Mobility Radeon HD 5870 - 1.12TFLOPS (82.4%), 64GB/s (83.3%)
GTX 465 - 0.855TFLOPS, 102.6GB/s
GTX 480M - 0.598TFLOPS (69%), 76.8GB/s (74.9%)
Basically, if you take GTX 465 vs HD 5770, and subtract roughly 15% off the GTX 465's lead, that's what you can expect from the 480M as compared to the 5870. -
You can find links to Hothardware, Techpowerup, Legitreviews, Tweaktown, and Hexus reviews through this Fud article. -
-
Judging by AnandTech's benchmarks, the GTX 465 has an average lead of ~35% over the HD 5770 in both DX10 and DX11 games (at 1920x1200), so the GTX 480M should be roughly 20% faster than the Mobility Radeon HD 5870.
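To make the arithmetic explicit, here's the same kind of back-of-the-envelope sketch (my own, assuming in-game performance scales with either shader throughput or memory bandwidth) that scales that ~35% desktop lead down to the mobile parts:

```python
# Scale the desktop GTX 465 vs HD 5770 lead by how much each mobile card
# gives up relative to its desktop twin. Raw specs are the figures quoted
# a few posts up; the ~35% desktop lead is AnandTech's average.
specs = {
    "HD 5770":          (1.360,  76.8),   # (TFLOPS, memory GB/s)
    "Mobility HD 5870": (1.120,  64.0),
    "GTX 465":          (0.855, 102.6),
    "GTX 480M":         (0.598,  76.8),
}

def retained(mobile, desktop):
    """Fraction of the desktop card's raw throughput the mobile card keeps."""
    (m_flops, m_bw), (d_flops, d_bw) = specs[mobile], specs[desktop]
    return m_flops / d_flops, m_bw / d_bw

amd_flops, amd_bw = retained("Mobility HD 5870", "HD 5770")
nv_flops, nv_bw = retained("GTX 480M", "GTX 465")

desktop_lead = 1.35  # GTX 465 roughly 35% ahead of the HD 5770 on the desktop

# Shrink the lead by the ratio of how much each mobile part retains.
lead_if_shader_bound = desktop_lead * (nv_flops / amd_flops)
lead_if_bandwidth_bound = desktop_lead * (nv_bw / amd_bw)

print(f"Shader-limited estimate:    480M about {lead_if_shader_bound - 1:.0%} ahead")
print(f"Bandwidth-limited estimate: 480M about {lead_if_bandwidth_bound - 1:.0%} ahead")
# Prints roughly 15% and 21%, bracketing the ~20% figure above.
```

It's crude, but it lands in the same ballpark as the benchmark-derived estimate.
-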
But considering that the VAST VAST VAST majority of games are based on and limited by console development, there is no reason to change anything.
Why buy a 480M now when you don't need it, only for it to struggle in a year to 18 months when DX11 uptake is in full swing and the new consoles are out or on the horizon? -
taken from the nbr gaming thread http://forum.notebookreview.com/gam...-between-nvidia-gtx-480m-mobility-hd5870.html
-
-
Because there are so many threads to do with the 480, it's better to look at them all in one thread.
As you're the expert, I'll take your word on that. -
I find this statement from Notebookcheck hard to believe
-
Seeing as the 5870M is faster than the 4770 (it's very close to the 5770), it's just more evidence that Notebookcheck is inaccurate.
-
-
Rumors for yas....
Source: 4Gamer.net -
^^^Dead Image ^^^
-
So there's an Asus G73W coming, with 120Hz LCD and GTX 480M.
Asus + 100W GPU? Yikes. -
Here's the image as an attachment.
-
-
ASUS debuts 15.6-inch ROG G53 3D gaming laptop at Computex -- Engadget
Video card for the G53 is unnamed. It just says "nvidia enthusiast graphics." 460/470, maybe? -
-
Why in the world would they go with that? :/ Have a source?
-
Very lame.
-
Megacharge Custom User Title
-
Actually, IMO, the core of the 480M should have around the same TDP as the 285M; it's just the 256-bit GDDR5 that's taking up the huge amount of power.