Hey, I'm rooting for it too; it has the looks and the specs, but the pricing, well, that's another story.
I don't see why they don't just put an HD5870 in it and say "Now it features the most powerful GPU to date".
-
-
In Nvidia's defence, it does hold the crown as the highest-performance GPU. I suppose phin has a very good point; they could always swap to the new core if it's released in time. Keep hoping.
And yeah, that gimmick's quite funny. When all my friends with Macs find out that I run no protection on my machine without contracting viruses or anything, they end up quite surprised.
The only unsafe things about Windows, in my opinion, are Java and Flash, plus the back doors to full control through ActiveX. They suck... -
Yeah, I am hoping they change to the latest GPU to justify the price, but honestly it doesn't matter to me; the game market doesn't even have many games that support DirectX 11 yet. I just need a good notebook with a graphics card that can play most games. I personally find Nvidia GPUs to be better at running most games than ATI. My bro's comp has an Nvidia while my sister's comp has ATI, and my bro's comp loads the fastest when playing BC2. Maybe it's just a better gfx card, I don't know.
-
NotEnoughMinerals Notebook Deity
well it's hard to go off of that unless you have the specifics of each card
-
EA and DICE have been working well with ATi, so expect the same from Medal of Honor. Now that Activision is on board, expect more games running well on ATi. AMD doesn't force developers to use its logos, so it's not as evident; Nvidia pretty much holds developers hostage, and that's the difference in their marketing.
For MMOs, Aion was another game well optimized for ATi, and since NCSoft is a massive ATi fan, we shall see more MMOs like it.
Also, if you look at recent titles, ATi is making a huge impact. Bioshock 2 was made with ATi's assistance. The best-selling game, MW2, has ATi to thank; EA executives pretty much said it was ATi working hard to do what their stubborn developers wanted that's why the game runs so well now. BC2 is another game ATi worked on extensively, and Dirt 2 obviously.
Ironic that you chose BC2, when on a price-per-card basis ATi swamps Nvidia in BC2. For less money you get more performance from AMD in Bad Company 2, hands down, no question.
I don't think it's gamers; it's game developers and manufacturers who are getting tired of Nvidia's tactics. PhysX is still not widely used, and forcing developers to say FU to ATi users is not a good way to make them happy. And look at XFX: the largest seller of Nvidia cards now sells AMD. Nvidia whined like a crybaby and limited XFX's access to the GTX 480, so what did XFX do? They released a monster 4 GB HD5970 in response, which still has less heat and power draw than the GTX 480 at a price not much higher, but ridiculous performance.
So I would not look at AMD disfavorably; I see good things coming. AMD is pushing more and more for open compatibility and giving gamers and developers more choices. It's going to make a lot of people happy to buy AMD and to work with AMD. -
Oh man, you guys should watch the FurMark GTX 480 video: 45 dB from the fan at 100°C...
-
I know the desktop cards run hot, hopefully the mobile version does not.
-
IMO, the one-fan design is going to have trouble keeping this machine cool. I reach temps (~90°C) on my G51, which uses cooler components and is also cooled by one fan. -
NotEnoughMinerals Notebook Deity
I don't mind the one fan so much, but I would have preferred a two-heatsink setup
-
Mobile Fermis are going to be 32nm, so hopefully the smaller process will help with the extra heat.
-
SemiAccurate :: Nvidia officially denies sub 20 percent Fermi yields
LOL. In response to when the Fermi GF100 will use less power and emit less heat, Drew Henry, a General Manager of Nvidia's MCP business unit, said that paying more on your electric bill for only 10% more performance is worth it. He said that for the possible upcoming 512-shader version, 15-20 watts more than the current GTX 480 is acceptable.
So from Nvidia, get ready for more power drawn, not less...
@IKAS V you are probably looking at something ridiculous like a 48-64 core unit there... pretty ridiculous if you ask me. The performance of the GF100 depends entirely on the number of shader cores; even 32 fewer reduces performance by something like 15-20% from the 480 cores the GTX 480 has.
Also, it won't be 32nm. TSMC dropped it; it's not an option. And currently their 28nm process is a complete mess. It's just going to be 40nm, same as ATi's mobile parts, with a puny number of shader cores.
The only other option is GlobalFoundries, and guess what, they dropped 32nm too. Plus GF is heavily funded by AMD, so we probably won't be seeing an Nvidia-GF deal for a while. -
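A crude way to sanity-check core-count scaling claims like the one above, assuming (naively) that GF100 performance scales linearly with shader count at fixed clocks; real performance also depends on clocks and memory bandwidth, so the 15-20% figure would imply lower clocks as well, not core count alone:

```python
# Naive linear-scaling model: performance proportional to shader count.
# FULL_CORES is the desktop GTX 480's shader count; everything else
# (clocks, bandwidth) is held constant, which is an oversimplification.

FULL_CORES = 480  # shader cores on the desktop GTX 480

def perf_drop_pct(cores_removed, base=FULL_CORES):
    """Percent performance lost if `cores_removed` shaders are disabled."""
    return 100 * cores_removed / base

print(round(perf_drop_pct(32), 1))  # ~6.7% under pure linear scaling
```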
Thanks for clearing that up.
-
I'd have to bet against the GT660 getting the 100W GTX 480M if the chassis only gets the one fan.
Not looking good for any of the other existing MSI gaming models either. -
Either way, it seems newer graphics cards are getting hotter and hotter; they should forget shrinking them and all that other junk.
And I'd have to agree with phin: 100W is a pain in the butt to contain with a single fan. -
Can you imagine the size of the power brick if they put in a video card that alone eats up to 100W? What about the other components? How about a 200W power brick?
Even the one on the GX640 seems very large and cumbersome to me. -
Well, the folks in the Clevo forum were predicting 300W+ power bricks. I mean, what kind of mobo would even be able to handle that heat/pressure/power? (The Clevo desktop-laptop thingy could.)
But jeesh. -
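The brick-size guessing above is easy to sanity-check with rough arithmetic. A minimal sketch, where every component figure is an illustrative assumption (not an official spec) and 20% headroom is added for battery charging and conversion losses:

```python
# Rough power-budget estimate for a gaming-notebook brick.
# All component draws below are assumed, illustrative values.

def brick_wattage(component_watts, headroom=0.20):
    """Sum component draws and add headroom for charging/losses."""
    total = sum(component_watts.values())
    return total * (1 + headroom)

parts = {
    "GPU (GTX 480M class)": 100,   # rumored ~100W TDP
    "CPU (i7-720QM class)": 45,    # assumed
    "board/RAM/drives/screen": 45, # assumed lump figure
}

print(round(brick_wattage(parts)))  # ~228W, so a 200W brick would be marginal
```

Under these guesses a 240W+ brick looks more plausible than 200W, which fits the direction of the thread.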
Dell is rumored to be going to a 400w brick for the M17x when it gets the GTX 480M.
-
What!? Are you serious?
The M17x had a lot of issues at high power draw though, didn't it? (Well, screen flickering.) -
Still, you'd be better off with a CrossFire config in the m17x for sheer power, unless they're planning on GTX 480 SLI...
As for GPU power efficiency, it looks to me like it's only Nvidia that's going the wrong way there. ATI's RV770 was, on the other hand, a brilliant move. Intel learned its lesson about power efficiency many years ago with the Pentium 4; hopefully Nvidia will learn theirs as well. -
bump
Does anyone have any news about this laptop? I can't find anything about it: no release dates, prices, etc. (Maybe they decided to hold off on the release date?) -
OK, we've got some news here (or at least a step forward in the MSI GT660 process).
Source: MSI announces the GT660 16-inch gaming laptop | TechConnect Magazine
Official specs
* LED-backlit 16" screen
* Intel Core i7-720QM
* up to 12 GB of RAM
* NVIDIA GeForce GTX 285M 1 GB
* up to 2 HDDs (1.2 TB in total)
* ECO Engine - to conserve battery life
* TDE (Turbo Drive Engine) technology for CPU overclocking
* DVD burner or Blu-ray combo
* Wi-Fi 802.11 b/g/n
* optional Bluetooth
* 4-in-1 card reader
* 1.3-megapixel camera
* USB 3.0
* HDMI output
* speakers made in cooperation with Dynaudio
* Windows 7 Home Premium or Ultimate (64-bit)
* screen lid resistant to scratches
* weight: 3.5 kg
Source: MSI GT660 - still a notebook, or already a Christmas tree? :: PCLab.pl
I'm kind of worried about the number of LED lights they've built into the laptop. I really, really hope you'll be able to turn them off, because IMO it will look bad/like a joke in some situations. -
Don't worry about it; excellent find, by the way.
But look at the under panel: the cooling airflow is by far different from the standard MSI have stuck with.
It looks thick, possibly stacked HDDs with passive airflow, then a single fan running the GPU-CPU (possibly where there are more vents).
Doesn't look to have a removable rear cover though, possibly due to? -
BenLeonheart walk in see this wat do?
Is that a dual-fan setup? :O
-
It's possible, but hard to say judging by the shape, which is why I think passive + active vents.
-
Looking at the photo showing the bottom, the subwoofer in the rear seems decent.
Now we need to wait for the first review to see how this runs. -
Hehe, I did actually put "possibly due to a render?" but for some reason it's been censored.
I think some of the vents will be dedicated to HDD cooling, really, and there will be a primary fan, but the vent coverage is much larger. (They should put a PS3 fan in it, haha.) -
Those lights look terrible.
The chassis is thicker, but I don't remember exactly by how much... I think I read somewhere it was 2.1". -
It's a one-fan setup with one exhaust... the other one seems to be blocked.
Those lights do look terrible -
NotEnoughMinerals Notebook Deity
Yeah, at first I thought the lights were alright, but now it just looks like they think you want to turn off the lights in your room and pretend you're in a sad, sad nightclub.
One fan with those components... doesn't sound like a good idea unless they've done A LOT of work on the cooling system -
I think it's OK; it shouldn't run as hot as the 58-series cards.
But there may be an alternative to our thinking.
The MSI fan is designed to pull in the heat and throw it out of the primary vent. If that second vent is passive, it'll not only influence temperatures a little, but it should also drag some fresh air over the components (following their one-fan design philosophy). It may work,
although I do foresee a bigger, more powerful fan!
(I think some of the lights are OK, but the others I'd just rip the machine apart to unplug, haha!) -
Those lights remind me terribly of the Asus G51 series. So sad you cannot switch them off on the Asus... I hope they will let users decide whether to make fools of themselves or keep a good image.
As for cooling, I think they will make a single-fan solution with separate heatsinks for the CPU and GPU, so that, for example, heat from the GPU goes out through the back vents and heat from the CPU goes out through the side vents. Pretty much the best you can do with one fan.
It also depends on how they provide air to the fan: will they redirect the airflow through the internal components, or simply let the fan suck air straight from below/through the keyboard? From what I can see now, they prefer to pull air across the internal components (HDDs and motherboard), which is good, but it increases the temperature of the "cool" air reaching the sinks.
We'll see how this works out. In any case, I believe it will be better than the current solution of putting both heatpipes on a single sink.
The weight and width somewhat frighten me, as this notebook falls out of the GX640's class regarding mobility and weight.
And WHY OH WHY do they have to put such wide bezels on the sides of the keyboard? They could easily widen the arrow keys or separate them from the right Ctrl with a small gap. -
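For a sense of what one fan actually has to do here, a back-of-envelope airflow calculation using the standard heat-transport relation (airflow = power / (air density × specific heat × temperature rise)); the 130W load and 15°C allowed rise are assumptions for illustration:

```python
# How much air must a single fan move to carry away a given heat load?
# Assumed figures: ~130 W combined CPU+GPU load, 15 C allowed air-temp rise.

AIR_DENSITY = 1.2     # kg/m^3 at room temperature
AIR_CP = 1005.0       # J/(kg*K), specific heat of air
M3S_TO_CFM = 2118.88  # cubic metres per second -> cubic feet per minute

def required_cfm(heat_watts, delta_t_c):
    """Airflow needed so exhaust air is delta_t_c warmer than intake."""
    m3_per_s = heat_watts / (AIR_DENSITY * AIR_CP * delta_t_c)
    return m3_per_s * M3S_TO_CFM

print(round(required_cfm(130, 15), 1))  # ~15.2 CFM for the whole system
```

Moving ~15 CFM is feasible for one blower, but only if the intake path actually delivers cool air to the heatsinks, which is exactly the pre-heated-intake concern raised above.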
However, they always seem to exaggerate the lights in photos like these; the G51 photos also showed the lights with strong luminosity, but it turned out they did not.
-
I was talking about the one on the far right (left in the pic), which is blocked...
-
Yeah, which is why I think it's passive HDD cooling.
-
-
Audio and 3D capability, supposedly, but unknown at this point.
-
Now I'm analyzing it, and it just doesn't really offer anything special or outstanding (judging by the specs/pics). I don't really see what justifies this price (audio, USB 3.0, i7?). -
If it weren't for those shortcomings, the 3 RAM slots, 2 HDD bays, and USB 3.0 would more than make up for the extra 0.1" to 0.2" of thickness. -
-
They copied from the best; it looks like an Asus G51.
I'd like that chassis with a 5850 inside -
Hardly the best; the G51 was not too impressive in my opinion. But yeah, scrapping the 285 for the 5850 would be awesome.
-
I guess that unveiling at CeBIT, some 4 months ago, wasn't enough for MSI, so the GT660 will get a second go at Computex... but you still won't be able to order just yet.
MSI Computex 2010: The GT660 gaming laptop | bit-tech.net
Not only is the Mobility HD5870 less expensive than the GTX 285M; if the prices listed by some sites for the Clevo X7200 are any indication, the Mobility HD5870 should also be a couple hundred cheaper than the GTX 480M. -
As for the battery, the GT660 has a nine-cell one. How long do you reckon it will run (wireless turned off and 20/30% screen brightness)? 2/3 hrs? -
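For a rough guess at runtime, divide pack capacity by average draw. A minimal sketch where the 87 Wh capacity (nine cells × ~2.6 Ah × 3.7 V) and both draw figures are assumptions, not measured values:

```python
# Crude battery-life estimate: hours = capacity (Wh) / average draw (W).
# All figures below are assumptions for illustration.

def runtime_hours(capacity_wh, avg_draw_w):
    """Ideal runtime, ignoring discharge inefficiency and reserve cutoff."""
    return capacity_wh / avg_draw_w

CAPACITY_WH = 87   # hypothetical 9-cell pack (9 x ~2.6 Ah x 3.7 V)
idle_draw = 30     # W: wireless off, low brightness (assumed)
gaming_draw = 75   # W: discrete GPU under load (assumed)

print(round(runtime_hours(CAPACITY_WH, idle_draw), 1))    # ~2.9 h light use
print(round(runtime_hours(CAPACITY_WH, gaming_draw), 1))  # ~1.2 h gaming
```

So under these assumptions the 2-3 hr guess for light use looks about right.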
Damn... I've been waiting over a month to see this notebook released. Deciding between this or an Alienware M15x with the HD 5850 that I can order now, but the total configuration will be just under $3000.
-
EDIT: BTW, what's the assumed wattage of the power brick with the Fermi? 400W? -
So they're saying some companies even want to SLI the stupid Fermi card. I hope MSI doesn't touch it! But I do hope MSI gets to grips with the SLI world at some point!
-
If the GT660 chassis gets a Fermi GPU, it probably won't be the GTX 480M but a lower model, perhaps based on the GF104 or GF108 GPU. Think something like a GTX 460M or GTS 450M.
-
Yeah, that'd be nice, to be honest. They need to get the damn book out already, lol.
-
Anyone think the GT660 weighs a tad too much? Even the GX640 weighs only 2.8 kg, and the GX740 weighs 3.2 kg. I know the GT660 is a 16" that fits as much as a 17", but that's 0.3 kg more than the GX740. Anyone know what makes it heavier? Design materials, components? Though it looks thinner, it might just weigh more than an Alienware M15x.
MSI GT660 with Nvidia GTX285M
Discussion in 'MSI' started by v_c, Feb 23, 2010.