I'm waiting to upgrade my ancient Dell laptop and trying to find out more about the upcoming graphics cards/CPUs, and I'm very interested in the GTS 250m for its incredible performance-per-watt potential. But all of the information is the same apart from the graphics core (the GTS 250m uses the GT215 core vs the G92 core for the others).
Are there any significant improvements brought about by this core change, or is it just rename upon rename and it'll end up being a shrunken old chip?
Any links to information on this GT215 core would be appreciated.
-
All you need to know. It's not a shrunken 8800M GTX, it's an entry level GT200 chip. -
Now I see: Nvidia does use newer tech in laptops, but it's slower.
So you have to choose between newer tech and better performance. The GTS 250/260M uses the famed GT200 core and even uses GDDR5, but it's much slower than the GTX 280M; the 280M uses the old G92 core with only GDDR3.
It's a toss-up. You either choose newer, more efficient tech that's slower, or older tech with more performance. The GTS 250 is more ideal for a laptop though, because it probably runs cooler while still giving decent performance. -
They're certainly not slower, these are the mid range replacements, and they blow the previous crap out of the water.
The GTS 260M is on the same level as a 9800M GT, which is pretty good. But on top of that, it consumes less energy and produces less heat, and being 40nm it has great overclocking potential. -
Well, the GDDR5 models have yet to be tested. The GT 240M seems to be on par with the HD4670, so it's nothing revolutionary. GDDR5 (whether from ATI or Nvidia) is basically what the mid-range field is anticipating.
-
The GTS 350M/360M will be replacing the GTX 260M/280M.
-
One has to wonder how the GT300's will turn out..
-
-
My main point wasn't about the overall specs of the new 200 series GPUs; it was more about what difference the core change would really bring. Here are the specs of the GTS 250m and the 8800GTX/9800GT:
Card                     8800GTX/9800GT   GTS 250m
Based on (Desktop Core)  G92              GT215
Pipelines                96               96
Core Speed               500 MHz*         500 MHz
Shader Speed             1250 MHz*        1250 MHz
Memory Speed             800 MHz*         1600 MHz
Memory Bus Width         256 Bit          128 Bit
Memory Type              GDDR3            GDDR5
Current Consumption      65w              28w
So specs-wise, the main differences are (8800GTX vs GTS 250m order):
DX10 vs DX10.1 // 512MB max memory vs 1GB max memory // G92 core vs GT215 core
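The "128-bit GDDR5 roughly equals 256-bit GDDR3" claim comes down to the clock-times-bus-width product being identical on both sides. A quick sketch using only the figures quoted in this thread (not real effective transfer rates, which differ between GDDR3 and GDDR5):

```python
# Rough comparison using the thread's simple model:
# relative bandwidth ~ listed memory clock (MHz) x bus width (bits).
# Real effective rates differ (GDDR5 moves more bits per clock than
# GDDR3); these are just the numbers quoted in the spec list.

def relative_bandwidth(clock_mhz: int, bus_bits: int) -> int:
    """Clock x bus-width product, a crude bandwidth comparison metric."""
    return clock_mhz * bus_bits

g92 = relative_bandwidth(800, 256)     # 8800GTX/9800GT: 800 MHz GDDR3, 256-bit
gt215 = relative_bandwidth(1600, 128)  # GTS 250m: 1600 MHz GDDR5, 128-bit

print(g92, gt215, g92 == gt215)  # identical by this simple metric
```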
Are there any significant differences arising from the core (or is it the DX10.1 compatibility alone), or is it just a 9800GT (same apart from the core; 128-bit + GDDR5 is roughly equal to 256-bit + GDDR3) at a mere 43% of the energy usage? -
The memory speed is way higher, and the core is said to be much more efficient clock for clock.
-
These are still pretty high-end cards; the doubled memory speed makes up for the 128-bit bus, essentially making it a 256-bit card.
As for the GDDR3 models, they are essentially the same as this gen, but with some more shader processors.
So to answer your question: yes, basically the same high-end card, just with less power consumption. -
-
The memory speed is higher because it is GDDR5 compared to GDDR3; when you take into account that the 8800/9800 is 800 MHz/256-bit/GDDR3 compared to the GTS 250m's 1600 MHz/128-bit/GDDR5, they are the same.
However, if the core is more efficient clock for clock, then I could expect something more powerful than the 8800/9800 rebrands. -
The GTS 250m should be a bit slower than the 8800M GTX (though this could be different if we're talking about games that support DX10.1). Although it uses GDDR5, it's also using a 128-bit bus, and because of GDDR5's relatively high latency, 256-bit + GDDR3 will be faster than 128-bit + GDDR5, provided that everything else (shader count, etc.) is the same.
-
Still, I don't think latencies will be an issue. -
This also explains why a desktop 4770 isn't significantly faster than a 4830, even though the latter has a lower core and effective memory clock. -
But the clocks are not the same. I see what you're saying and what that article shows, but latencies won't be an issue because GDDR5 is clocked so much higher; there would be no reason to use it if it were clocked the same as the GDDR3. The bus differences will make them technically the same in this case, but given the price and power consumption drop, and if you're correct about the latencies, maybe a 10% drop in performance?
The article just says 1600x1200, but on what exactly? It's not very informative; it doesn't say what game or application, what the other settings were, etc.
Can't really say until they come out though; this is mostly just a speculation thread, so take anything said with a grain of salt. -
I care more about the pricing. Nvidia's official site lists the GTS 250/260M as 'high performance' cards. I hope they can keep the price down.
-
As for the clocks, of course GDDR5 is clocked higher to make up for the bandwidth lost by using a 128-bit bus instead of a 256-bit interface. But that's just bandwidth, without taking into account the speed at which all that bandwidth is processed (the latency, or conversely, how long the delay is between operations). You're right that it's all speculation at this point, but from what has been tested, it should be slower than an 8800M GTX when running DX9 and DX10, and it could actually be faster in games that support DX10.1. -
I'm still jonesing for the 260GTS. That (theoretical) performance for 38 watts... WOW.
Two of the 260GTS's in SLI are only going to draw 11 watts more than a single 8800GTX and will give a pretty huge boost in performance.
With power draws this low, I wonder who's going to be the first laptop maker to offer a 3- or 4-way SLI configuration (j/k).
But it would be amazing to have SLI in a form that doesn't take insane cooling to make usable -
-
Wow... put a couple GTS 250/260M's in SLI, and you have a great gaming laptop with less power draw, less heat, newer tech, and more efficiency. Where can I get one?
Heck, I'll even settle for a single-card configuration. Can't wait to see this get into laptops. -
260M's are readily available now. Check out XoticPC.com
-
The GTS is not available anywhere.
-
Nvidia still has room left after the GTS 260m to add shaders and a larger memory bus to a G200-based 40nm die and produce cards with better performance than the GTX 260m and GTX 280m. What's more, even though the G214 and G212 GPUs couldn't be shrunk to 40nm, Nvidia still has plans to release a 40nm G215 GPU, which is a good indication that they're not ready to abandon the G200 tech completely.
Given Nvidia's history of milking their GPUs until they shrivel up to dust and blow away, I'd be knocked-to-the-floor stunned if the mobile 300 series cards were based on the new desktop 300 series architecture, rather than something like 112-to-128-shader, 256-bit-bus versions of the G200 series GPUs. -
I'm betting on the 300M being G200-based, with at least one being the desktop GTX 260 in 40nm form. I'm not sure if they can go higher than 256-bit, so maybe Nvidia will go 256-bit with GDDR5 memory. I'd be fine with that, but there had better be more than a measly 128 shaders on the highest model.
-
Man, I wouldn't mind seeing SLI GTS 250m's; that'd only be around 56W TDP, less than a single 160m and around 25% less than a GTX 260/280m, with potential for great scaling as more games add dual-GPU options... Still dreaming of a power-efficient gaming monster. I'm hoping the GTS 250m will cover my battery-life and performance wishes; the GTS 260m doesn't seem like it would be a significant performance increase for a more-than-35% increase in TDP.
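For anyone checking the math: the 28W per-card figure is from the spec list earlier in the thread, while the 75W GTX 280M figure below is my own assumption (the thread only gives the rough percentage). A quick sketch:

```python
# SLI power arithmetic from the post above.
# 28 W per GTS 250m is from the spec list earlier in the thread;
# 75 W for the GTX 280M is an assumed figure, not stated in the thread.
gts_250m_w = 28.0
gtx_280m_w = 75.0

sli_draw = 2 * gts_250m_w               # two GTS 250m's in SLI: 56 W
saving = 1 - sli_draw / gtx_280m_w      # fraction saved vs one GTX 280M

print(f"{sli_draw:.0f} W total, about {saving:.0%} less than a GTX 280M")
# -> 56 W total, about 25% less than a GTX 280M
```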
-
-
It should be cheaper due to the 40nm process. The GTX 280M came out cheaper than the 9800M GTX, after a die shrink...
-
-
-
But $400 cheaper than the 9800M GTX, nonetheless. Hopefully the 40nm chips can come in well under $300. That would be progress.
-
-
-
Yes, at launch it ran $800 to $900 for a single GTX, while the 9800M GT was going for $500.
-
Will the GTS 250m be just a shrunken 9800GT/8800GTX?
Discussion in 'Gaming (Software and Graphics Cards)' started by jk6959, Aug 27, 2009.