1.5GB might be more than enough in most cases, but the alternative 768MB could run up against memory limits.
40nm GDDR5 comes in 2Gb (256MB) chips that don't really cost more to produce and don't create more heat than the 1Gb 55nm GDDR5 chips cards were using. A memory die shrink works the same way a GPU die shrink does: just as a GPU shrink can double the shader count from one generation to the next without doubling the card's cost or heat, a memory shrink doubles the capacity per chip.
Bus width determines how many memory chips a card will use, and the amount of VRAM on each chip is determined by what the memory manufacturers are making at the time. You may end up with more GPU memory than you need, but just as 4GB was more system RAM than most people needed a year or so ago, once the price of 8GB dropped more and more people started justifying their need for more RAM.
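To put rough numbers on that, here's a quick sketch with illustrative assumptions (each GDDR5 chip presents a 32-bit interface, and the densities on offer are 1Gb = 128MB or 2Gb = 256MB per chip):

```python
# Rough sketch: how bus width and chip density set a card's VRAM options.
# Assumptions (illustrative, not official specs): each GDDR5 chip
# contributes 32 bits to the bus, and chips come in 128MB or 256MB.

GDDR5_CHIP_BUS_WIDTH = 32  # bits contributed by each memory chip

def vram_options(bus_width_bits, densities_mb=(128, 256)):
    """Total VRAM (MB) for each available chip density."""
    num_chips = bus_width_bits // GDDR5_CHIP_BUS_WIDTH
    return {f"{d}MB chips": num_chips * d for d in densities_mb}

if __name__ == "__main__":
    for bus in (128, 192, 256):
        print(f"{bus}-bit bus -> {vram_options(bus)}")
    # A 192-bit bus works out to 6 chips, i.e. 768MB or 1536MB --
    # exactly the two capacities being argued about here.
```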
Eyefinity, tessellation, GPGPU, and hopefully Hybrid Crossfire will all help you justify some of that extra VRAM.
-
I bet the GTX 460 with 768MB of memory will run games much better than a mid-range mobile card with 1.5GB of VRAM.
-
I haven't gone through the whole thread, just this last page, but I do notice there seems to be an argument about how much memory is enough.
Which reminds me, wasn't Bill Gates quoted at one time saying that 640k of memory should be enough for everyone?
Think about this... -
Did Gates Really Say 640K is Enough For Anyone?
Though as to your point, it once was reasonable for the all-out best card (8800GTX) to have only 768MB of memory, and before that, for the 9700 Pro to have 256MB!! Now the top-end cards have 2GB and 1.5GB.
EDIT: and despite consolization, memory usage is still growing... -
The desktop GTX 460 768MB? Step over 1920x1080 and that card shows its memory limits. Anyone considering running multiple displays, or a larger resolution (externally for notebooks), will need more memory... and a slower GPU doesn't automatically erase the need for more memory to run at higher resolutions.
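For a rough sense of why resolution alone eats into a framebuffer, here's a back-of-envelope sketch with simplified assumptions (it ignores textures, geometry and driver overhead, which add a lot on top):

```python
# Back-of-envelope estimate of raw framebuffer memory vs. resolution.
# Simplified assumptions: 4 bytes/pixel for color, 4 bytes/pixel for
# depth/stencil, MSAA multiplies both, and two color buffers are kept
# for double buffering. Textures and geometry come on top of this.

def framebuffer_mb(width, height, msaa=1, color_buffers=2):
    pixels = width * height
    color = pixels * 4 * msaa * color_buffers
    depth = pixels * 4 * msaa
    return (color + depth) / (1024 ** 2)

if __name__ == "__main__":
    for w, h in ((1366, 768), (1920, 1080), (2560, 1600), (5760, 1080)):
        print(f"{w}x{h}: ~{framebuffer_mb(w, h, msaa=4):.0f} MB with 4x MSAA")
    # The multi-monitor case (5760x1080) alone chews through a sizeable
    # slice of a 768MB card before a single texture is loaded.
```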
Depending on how "high-end/gamer enthusiast" the possible 192-bit Heathrow is, it could potentially choke on 768MB of memory before the GPU caps out on processing more pixels. -
I guess you are right from this perspective, but few people actually do this. In fact most laptops today have resolutions of 1366x768 or 1600x900 at best. It's only the high end that gets higher resolutions and consequently also better cards.
-
I just opted for a GTX 460 1GB over the 768MB because I run a 1920x1200 monitor. Didn't want the bottleneck.
-
AMDs Southern Islands tapes out | SemiAccurate
AMD will out the Southern Islands GPU architecture early | SemiAccurate
Though none of that really means a launch of the HD7000M before CES 2012. -
Yeah, doesn't mean squat until 28nm is ready. I haven't read anything that makes me think that will be possible until Q3 at the earliest.
-
Sounds like I'll be wanting a 7500-7700 series card. I just ordered a 5820TG with an AMD 6500 series card, but 28nm = drool.
Are CPUs going to be 28nm by then? -
Probably not, unless Ivy Bridge, which is 22nm, is released before 2012.
-
The only 28nm CPUs will be AMD's next-gen Bobcat-based netbook APUs: Krishna and Wichita.
22nm Ivy Bridge is due for launch around the same time we should expect to see the 7000M series... but personally I'm more eager to see these GPUs paired with the mobile Bulldozer-based Trinity APU (32nm).
2012 CPU options can be discussed in these threads:
http://forum.notebookreview.com/har...3-forget-huron-river-22nm-ivy-bridge-way.html
http://forum.notebookreview.com/har...t-upgrades/505702-amd-fusion-info-thread.html -
More waiting begins..... Oh well
M17x R4
-
AMD's 28nm graphics is only a die shrink
Here you go... enjoy the FUD. -
Standard fanboy reporting procedure: belittle the competition's die shrink as nothing special, then later harp on how your favored manufacturer used a die shrink to increase transistor count, lower die size, and improve power efficiency with the new and better HKMG silicon.
-
Heh, frankly I don't mind if it's just a die shrink. Even from a straight die shrink I would expect around a 30% performance boost.
Now, if AMD could get that Dynamic Switchable Graphics tech to work with any MXM port... I'm sold. Though I will miss PhysX.
-
"just a die shrink" I love die shrinks! I only ever buy tocks because of die shrinks. We're using laptops... die shrinks are important.
-
With Intel backing Lucid's version of Optimus (aka Virtu) for desktops, both AMD and Nvidia have got a fire lit under their bums to press forward with switchable graphics.
I was also thinking the other day how I wouldn't be surprised if LucidLogix took up Ageia's old role and started offering its chip as a third-party physics processor... though I believe Nvidia would fight it, and AMD wouldn't need it because they'll eventually be able to offload physics to the integrated stream processors on their APUs. -
Mmm, didn't know that about Intel. Well, hopefully all this will get implemented soon because I am thinking about getting a new laptop.
Personally I think PhysX is rather good and better than what AMD or LucidLogix has to offer, but Nvidia just had to make it proprietary software. -
I think I'll be waiting for the R5. At the pace things are going my laptop can handle 99% of games till then, and by then we will have some crazy 22nm, 1600-SP mobile cards with CFX and switchable graphics.
Of course there's no new architecture; they already got their 4-way arch out, they're just going to bring it to a die shrink as the SP count increases. -
This is all well and good, but what game will use this fully? I don't think there will be a new console generation until 2015, which effectively stalls anything groundbreaking on PCs until then. There may be a game or two that might give it a little push, but realistically even in 2012 we will be in the same situation we are in now: games designed for consoles and slightly altered for PC gamers.
-
I admit you have a point. I'm more eager to see the progress made on mid-range mobile graphics and iGP.
-
If this is going to be the "next gen" card that does more than DX11, then this may be what the next-gen consoles are based off of. But AMD sounds like they have other plans for dual CPU/GPU hybrid chips; if that's the case then we will see one of those land in consoles rather than a single GPU and a single CPU. The savings would benefit both the manufacturer and the consumer alike. The other thing to think about as well is that DX11 isn't all that amazing, not yet anyway. When the PS3 was coming out, what we had was a fresh coat of paint on DX9 and the new DX10. DX11 isn't boasting anything radically new and different; it doesn't look much different than what we have already. People will want a new system when they can see the difference.
-
Kinda thought 28nm was going to be late this year, but it seems they will start mass production in May????
Report: Radeon HD 7000 series about to hit mass production -
I'm not expecting any mobile 28nm AMD GPUs within a one-year timeframe, even in the best-case scenario. Roadmaps showed one or two models supposedly for Q4 iirc, which in realistic terms means you can add at least 3 or 4 months before you're able to select from a decent choice of notebooks packing the aforementioned chip. Then Ivy Bridge is expected for H1 2012 (which could even mean later than Q1), and if mobile 7000 is really ahead of Intel's schedule I wouldn't expect OEMs to put these 28nm GPUs in many Sandy Bridge laptops. I sure would wait for Ivy Bridge if I'm already waiting for the new 28nm GPUs, and I suspect a majority of potential buyers would as well.
-
How much better will the 7000 series of cards be? Will games look 10X better than their console alternatives? Probably not, and no developer will waste money developing unique code for special effects that only one platform can benefit from, and only for the select crowd who go out and drop the money on it, a fraction of the whole platform. Is this worth the buy......
-
I think the developers would be better off investing in OPTIMIZING console ports for PCs rather than some special effects you won't even notice.
Seriously though, how much of a difference can there be between games for PCs and those for consoles?
Apart from the PC being able to push slightly better visual settings at higher resolutions, not much.
There will be no discernible change in 3D meshes, only higher-res/more detailed textures and extra effects.
Although, are you REALLY up for having one measly special effect you won't even notice on a PC, one that will for example eat up 20FPS?
I know I'm not.
Right now numerous console ports are pathetically coded for the PC.
LucasArts' Force Unleashed II for example runs horribly slowly on my laptop, and the hardware I have is better than what's in the consoles (not that FU 2 was good... the story was way too short, and the graphics were not that much different compared to the first game).
GTA IV is another example of a horribly done port.
The PC version needs a quad core to run fine.
Seriously? -
There really aren't many "bad" console -> PC ports.
Just being able to run at 1080p res with AA and framerates above 30 is enough of an advantage for me.
The consoles are a technical mess. Their biggest AAA-budget games are sub-HD and average a borderline 30fps.
I guess my point is, if all the new chips offer me is more AA at higher resolutions, I will be content with that.
Anti-aliasing is overlooked far too often, imo. -
Meaker@Sager Company Representative
What's the point? What's the point!?
Are you F***ING kidding me?
Maybe we can actually see some higher-res displays in the smaller machines because the POS graphics cards they have to put up with will finally be able to cope.
The gulf between the desktop mainstream and notebooks has been widening again. -
There's no link between the available mobile GPUs and their current performance and low-res displays. 1080p panels are already quite common, and the same chassis is often designed to accept systems ranging from a basic CPU and no discrete GPU to a quad-core CPU and a decent GPU. It's a matter of price, not a matter of the available performance in mobile GPU offerings. Aside from gaming (would you remind me how many people game on their laptop, again?) there's no downside to a 1080p display.
-
Karamazovmm Overthinking? Always!
it is also a matter of portability. I don't want to have to lug around 3+ kg of equipment. That is a thing of the past; that is what was acceptable in 2006. -
Resolution??? Really??? Are you done with 1080p already? Because I know 1080p is damn well enough for my 52 inch; I'd rather have this standard for a while instead of already buying a 4K screen, with new formats and new bandwidth needs. How about better polygon meshes? How about several layers of texturing with ray tracing to really emulate the way light is absorbed through muscle and skin layers? How about infinite detail? I can wait until there is something that drastically sets a new graphics card apart from the one I have now. There was a huge leap in graphics technology and technique from the beginning of 2000 to 2005 that required a new generation of graphics card, but in the 6 years since there have only been small changes, not enough to amount to a game from 2011 looking worlds better than what we saw in 2005.
-
You and this "infinite detail" stuff...
-
Meaker@Sager Company Representative
1366x768 displays are horrid. If we had at least 1600x900 as the average, we would need faster GPUs to actually play games on them.
You can stick a 1366x768 display on there and, sure, the current crop of graphics chips is OK with that.
However, if the GPUs are more powerful, manufacturers are going to be tempted to put in higher-res displays that can actually use the power. -
Haha, what's that supposed to mean?
Regardless, it's something that will happen sometime: when you walk up to textures they won't fall apart into a glob of colours and bump-map effects. That's something I'd like to see, or more polys per model, stuff like that. 1080p is fine for me, and makes things nice and crisp.
-
Meaker@Sager Company Representative
Yeah but creating the art for said content would be hard work lol.
-
Possibly. This is where I want to base the bulk of my work when I get into the industry: modelling and texturing. But that aside, most games start with high-poly models now, then bake them down into a low-poly model with a normal map, which for the most part gets the job done. Textures are mostly a resolution thing, I think, and RAGE from id is using MegaTextures in its new engine. It's more work to build a game today than it most likely will be in the future, with all the work of reducing the model and making sure it still looks as good.
-
Dell's Vostro 3350 has a Radeon HD 7450M with 512MB GDDR5; I'm unsure if this is the 28nm GPU architecture. Dell Vostro Laptops | Dell Canada
If it indeed is a HD 7450M and 28nm, then this is an early sample geared towards business purchases, as cheap BETA testing by those who are willing to pay the premium for an early HD 7000M series GPU.
So it looks like AMD is going to try to release the HD 7000M series before 2012 to regain some laptop GPU market share; currently NVIDIA 500M series GPUs are on almost every Sandy Bridge laptop. -
Or it could be a typo.
Same core count as the HD6450M but GDDR5... though Dell has a history of listing the wrong type of memory that comes on their cards. -
Meaker@Sager Company Representative
Sounds like a rebranded 6490M.
-
Wouldn't make a whole lot of sense to start rebranding right before Llano comes out with HD6000 series IGP.
Dell's free to put GDDR5 on an HD6450M just as they're free to put 3GB of VRAM on their GT 555M so the specs might be right, but I'm going to hold to the notion that someone hit the 7 key instead of the 6 when they were typing out the model number. -
There's no way AMD would let the first announced 7000 series chip be a rebranded mobile GPU.
-
Karamazovmm Overthinking? Always!
indeed, however the first 6000 series was a rebranded 5600/5700, so there is a precedent -
I doubt it's an actual 7000 series GPU. TYPO!
-
Meaker@Sager Company Representative
Dell is not free to do that if ATI disables the GDDR5 parts of the memory controller, and I don't think the 6450 would support it.
Just like Dell can't pair the 144-shader 555M part with GDDR5.
AMD doesn't officially specify what kind of VRAM an HD6450M has to use... they just generalize that GDDR5/DDR3 is available on HD6400M cards.
AMD Radeon™ HD 6400M Series Graphics
That same kind of loose definition of what VRAM can be used on AMD's cards is why we saw HD4870M and HD5850M using GDDR3. -
I think there needs to be something more groundbreaking than DX11, frankly, and until the next real big bump in graphics technology and coding comes along, I think we as the ultimate consumers of their products should be laying on the pressure and pushing these companies to make bigger breakthroughs in graphics technology. I won't downplay the significance of tessellation, but I feel the last 5 years really haven't brought much new to the table. Some might blame consoles for that, but if games were based off of, let's say, the RV770 chip, would games be leaps and bounds different?
-
Hardware is fine, I think we need software improvements.
-
Meaker@Sager Company Representative
Check out the Unreal demo showing what they could do if everyone owned a dual GTX 580 config.
Also, for notebooks a shrink to 28nm means a default leap in performance even without any arch improvements, thanks to the power envelope. -
Meaker@Sager Company Representative
That's the general spec sheet for the entire series; I really doubt they would let you pair a 6450 with GDDR5.
Much like they would not let you pair a 6600 series chip with GDDR5.