I'm just pointing out the possibility. Like how the 5790s were supposed to have the sleep issue with the 8800M GTX.
-
Possibility, sure nothing is 100% in the world. Heck maybe sager will do a free upgrade for us, who knows.
-
Just wanted to point out that it is probable, not definite like you said in your earlier post. I don't want too many broken hearts in case it turns out to be incompatible. -
So, for all intents and purposes, 9800GT = 8800GTX. Fantastic.
Good to know my lil GPU is still holding its own.
Next up!
Let's see some OCing benchmarks!! How hard can you push a 9800M GT compared to an 8800M GTX?
~ducks~ -
Donald@Paladin44 Retired
^--^
No need to duck
Let's blow a few of them up to see how far they will go...then come back and tell the rest of us.
~ok, so now I'm ducking~ -
Sigh. I've been following this forum for the last couple of weeks, considering getting the NP8660. It's great that Donald got us some real results, because I was definitely worried that the 9800GT would be the loser. I think the take-home message is that while there is no guarantee that Donald and Justin are always correct, they have never said anything to intentionally mislead. I'd put my money on them over some fellow who waltzes into the forum randomly and starts bragging about how much he knows and how his last setup was an uber high-end setup (some kids drive a Lexus SC as their first car because Daddy bought it for them). I always enjoy lurking and reading the debates on this forum, but I was getting pretty disgusted by this last one.
On topic...the 9800M GT and 8800M GTX have pretty much identical characteristics. Has it been confirmed that they are the same component (but maybe with a name change like what nVidia did for the desktop 9 series)? I think there was an earlier post alluding to this.
The 3870 vs 9800M will be interesting to see; can't wait for the benchmarks. Hmmm, maybe I should create a new thread and we can repeat the flamewar.
EDIT: I just took a look at GPU-Z in Justin's benchmark of the 5796 ( http://forum.notebookreview.com/showthread.php?t=272942) and the specs match up with those listed for the 8800M GTX on nVidia's website (I'm assuming that these are correct). The only thing we don't know from GPU-Z is the process size and the memory bus width (I'd be guessing it's also 256-bit)....but it is looking like the 9800M GT is the "new" 8800M GTX. -
Last edited by a moderator: Feb 6, 2015 -
I think he meant <img src = "http://www.freefoto.com/images/01/08/01_08_52---Duck_web.jpg" height=300 width=200>
-
It would be great if the NP8660 was offered with the AMD Puma platform. It would have great battery life without sacrificing performance.
By the way, do you guys think that the HD3870 will run cooler than the nVidia GPUs? I want to be 100% sure before ordering... -
-
That's completely incorrect. How do more stream processors make a GPU hotter?
By your notion, my GTX 280s with 240 of them would be about twice as hot as a 9800 GTX, but they're not. The GTX 280 on load is roughly the same in my experience.
Please stop posting factually incorrect information. -
What in his statement merited this kind of response? -
armagideontime Notebook Consultant
being a bit harsh noel, dont you think?
-
Honestly, I'm shocked that you haven't been banned yet Noel.
-
He's just digging his own grave. We have a resident lawyer on the forum, maybe now we need an electrical engineer.
-
NoelGallagher, you might want to disprove him with evidence, instead of just calling people "liars" or saying they're "using incorrect info".
-
anything that runs on the circuit board makes heat, and considering we're talking about laptops and NOT DESKTOPS, this could make a huge difference because of the confined spaces. -
Since you brought up the gtx 280 lets take a look.
http://www.anandtech.com/video/showdoc.aspx?i=3354&p=8
Well, let's see here: total system load with a 9800 GTX is 228.1 watts, and with the GTX 280, 313.1 watts. Can't make it much clearer than that. I don't see any way two GPUs using the same cooling solution can run at the same temps when there is an 85 watt difference in power consumption. As narsnail pointed out, though, the GTX 280 and the 9800 GTX do not use the same cooling solution.
Going back to laptops: why does an 8800M GTX run hotter than the 7950 GTX? More processors and higher clocks. Yet the 8800M GTX is built on a 65nm fabrication process and the 7950 GTX on a 90nm process. So if the die shrink isn't enough, what else affects temps? The number of stream processors, the amount of memory, and how high or low they are clocked. So if the 3870 is built very close to its desktop counterpart, they are obviously going to try to get as close to those desktop clocks as possible, which could end up making it run at around the same temps as the 9800M series cards -
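The AnandTech numbers quoted above can be sanity-checked with some quick arithmetic. Note that total-system figures are measured at the wall, so the GPU-only gap is somewhat smaller than 85 W; the PSU efficiency used below is a made-up assumption for illustration, not a figure from the article.

```python
# Sanity check of the total-system load figures quoted above (AnandTech).
system_load_9800gtx = 228.1  # watts, measured at the wall
system_load_gtx280 = 313.1   # watts, measured at the wall

delta_at_wall = system_load_gtx280 - system_load_9800gtx
print(f"Difference at the wall: {delta_at_wall:.1f} W")  # 85.0 W

# Assuming a hypothetical ~80% efficient PSU, the DC-side difference
# the GPU cooler actually has to deal with would be closer to:
psu_efficiency = 0.80  # assumption, not from the article
delta_dc_side = delta_at_wall * psu_efficiency
print(f"Approx. difference after PSU losses: {delta_dc_side:.1f} W")  # 68.0 W
```

Either way, the gap is far too large for two cards on identical cooling to sit at the same temperature.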
You're using Ohm's law, with resistance multiplied by current to give a voltage.
What you're missing is that these laws assume constant current. The cooling solutions ARE the same. I've had both, and the cooling solution of one isn't much different from the other aside from the backplate.
First of all, it's not "stream processors" that are the physical unit on the silicon. They're called ALUs:
ALUs are responsible for stream processing; there is no real stream processor.
Stream processing is a paradigm used in Computational Physics, which I am specializing in at university, so I kindly ask you to stop posting information full of misnomers.
There is a lot of mathematics involved.
Your assumptions are unfounded. A smaller fabrication allows higher EFFICIENCY, and efficiency in computational physics is a vague term. In your world you're speaking of heat dissipation in correlation with fabrication and stream processors.
You've stated, and I quote
A smaller die shrink with the same tier of competition video cards will result in a lower thermal dissipation which is then offset by the increase in "stream processors" which results in about equal thermal output if that now-die-shrunk GPU gets the increase in stream processors.
This is absolutely incorrect. With a smaller fabrication, the actual architecture of the chip isn't going to change. If we see a 3850 in a 55nm package, it's not going to have more ALUs than, say, the previous 3850 on 65nm. A die shrink isn't always going to provide you with lower temperatures; it can, however, provide you with much higher efficiency.
The point of a die shrink is to decrease the amount of leakage by increasing the radius of insulation of the dielectric tunnels. If we decrease the fabrication (e.g. 65nm to 55nm), we allow less power draw due to higher efficiency and insulation of the tunnel. The reason to die shrink is to help us not only obey Moore's law, but in the long run save money by saving on research and development.
If you could have a more efficient card with the same technology, this allows us to have the same clocks and overhead but now with lower prices!
You're asking yourself why the 8800 GTX runs hotter than a 7950 GTX... How about you compare the transistor counts? You do know that the 8800 series from nVidia was the most revolutionary video card released in the 3D-gaming era since the GeForce 256? To compare raw statistical power, I quote Wikipedia on this:
Got it? -
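The efficiency argument a few posts up can be made concrete with the standard first-order relation for CMOS switching power, P = C * V^2 * f. All the numbers below are illustrative placeholders, not real GPU specs; the point is only that a shrink that lowers voltage and switched capacitance cuts power even at the same clock.

```python
# Illustrative only: first-order CMOS dynamic power, P = C * V^2 * f.
# All numbers are made-up examples, not actual 65nm/55nm GPU figures.

def dynamic_power(capacitance_f, voltage_v, frequency_hz):
    """First-order switching power of a CMOS chip, in watts."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

# Same 600 MHz clock, but the shrunk part runs at lower voltage and
# switches less capacitance:
p_65nm = dynamic_power(150e-9, 1.20, 600e6)  # hypothetical 65 nm part
p_55nm = dynamic_power(120e-9, 1.10, 600e6)  # hypothetical 55 nm part

print(f"65 nm: {p_65nm:.1f} W, 55 nm: {p_55nm:.1f} W")  # 129.6 W vs 87.1 W
```

Note the quadratic dependence on voltage: most of the saving comes from the voltage drop the new process allows, not the capacitance.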
nope, you're talking to yourself now. Have fun
-
NoelGallagher is correct about the ALU part, but I had never heard that they don't generate heat.
-
yeah, sorry Noel, we're not all going to university for Computational Physics, so don't talk down to us; just because we don't have every little detail correct doesn't mean we are "completely wrong".
someone is pretty high up on their horse.
this is like an NHL player coming down to house league and owning them all and telling them how much they suck. -
-
All that schoolwork is going to be really pointless if you can't hold a conversation with any superior or customer because you endlessly insult them. It would make this thread a lot more informational, and far less hot-headed, if you could just be a bit more kind and humble. Not all of us are educated to such a degree, so obviously some incorrect assertions are going to be made. Either way, there's no need to be so mean about correcting them.
On Topic: Are there any reasons to believe that the 9800M GT will be able to be overclocked higher than the 8800M GTX? -
I think Beyond3D can provide a lot of food for thought here on the subject of GPU architectures.
@NoelGallagher : You did try to make some points there, but they really got lost in the condescending tone you took with people; a person who is right doesn't need to be insulting. Also, drumming on about Computational Physics is not helping your argument either.
Compare the CO 65nm card with the 55nm card and see that there IS lower power consumption. Comparing the cooling solutions without comparing the fan profiles is also wrong; you should compare the TDP numbers instead. And the reduction in power consumption can be larger than what was demonstrated by Anand; look at the TDP numbers here: http://en.wikipedia.org/wiki/List_of_AMD_Athlon_64_microprocessors
Compare the 130nm and 90nm processors at the same clocks, and see that a process change, especially when it involves the use of more advanced libraries enabled by the new node, can make much more of a difference. For example, the 45nm CELL processor consumes as much as half the power of the 90nm CELL in the PlayStation 3 at the same clock.
I think the polite answer, instead of the bit above and the rest, would have been:
-
Nice try, mujtaba, but I think it's a waste of time and effort; Noel is just one of those people who are too superior to need references or a formal, informative tone.
Personally, Noel, your information might be right, but I never managed to read one of your posts completely because you piss me off -
I think people should stop these ego fights and move on...
Regarding Noel, I would just like to add that I agree with what people are saying... Even if you gather all the knowledge in the world, coming here and being arrogant and aggressive won't buy you anything.
The only thing I can get from Noel's posts is anger and arrogance. The message gets completely lost.
So a piece of good advice: try to be a bit more humble and nice. If you still don't want to do that, just stop posting on a forum and offending, insulting (and pissing off) so many people.
Cheers -
I appreciate it; it helps me learn. Points taken, and I thank you for the informative post -
hey, i think we should ease up on hating on noel now.
his attitude in his posts seems to have changed. either way, we all just need to chill easy =)
sorry, i dont have any super intellectual stuff to contribute =\ -
Spec, that was the most intellectual contribution anyone could expect now.
Cheers. -
All electronics generate heat. It has nothing to do with the type of architecture the component is. Basically, heat is a sign of a transistor's inefficiency, because that heat is energy that should have been doing electrical work. Heat is generated because the metals used in semiconductors are imperfect: electrons collide, and heat is generated. Electrons are also lost to leakage, which can generate more heat. (Indeed, today's processes can lose up to a quarter of their current to leakage.)
Nirvana, not saying you suggested this idea since you referenced him, but I wanted to clear that up for everyone.
I am a graduated EE major and know a fair amount about device physics, so you can trust me on this one -
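The leakage point above can be sketched numerically. Total chip power is switching (dynamic) power plus leakage power; the "up to a quarter" figure is the poster's claim, and the wattage below is an illustrative placeholder.

```python
# Sketch: total chip power when leakage makes up a given fraction of it.
# The 60 W dynamic figure is made-up; the 25% fraction is the poster's claim.

def total_power(dynamic_w, leakage_fraction):
    """Total power if leakage is `leakage_fraction` of the total draw."""
    # dynamic = (1 - f) * total  =>  total = dynamic / (1 - f)
    return dynamic_w / (1.0 - leakage_fraction)

dynamic = 60.0  # watts of useful switching power (hypothetical)
for frac in (0.10, 0.25):
    print(f"leakage {frac:.0%} of total -> total draw {total_power(dynamic, frac):.1f} W")
```

At a 25% leakage fraction, the same useful work costs 80 W instead of 66.7 W, all of the difference ending up as heat.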
Ahahaa... everyone is coming here to show their badges.
I'm a PhD in Genetics and I know nothing about computers or physics... but I'm great, no? -
anexanhume, let me rephrase my sentence: this is the first time I've heard someone (NoelGallagher) say that ALUs don't generate heat. I didn't say whether it is true or not.
-
Nirvana: I certainly hope I didn't offend. That's what I'm actually trying to get rid of, because of all the heated pontification in here -
auburncoast Notebook Deity NBR Reviewer
so, a little off topic, but I didn't want to post a new thread over it: is it true that the 1GB of memory on the 9800M GTX cannot be used? I heard that the maximum amount that can be used is 512MB. Would the 1GB give any headroom over 512MB? If not, there wouldn't be much of a performance gain from the 9800M GT to the 9800M GTX, would there?
-
A 256-bit bus means that yes, it probably can't use over 512MB of VRAM.
Edit: I don't know what the bus on the 9800M GTX is, but even if it is 256-bit like the 9800M GT, it has more shader cores and is probably clocked higher, so it will perform better than the 9800M GT. -
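Whatever the capacity question turns out to be, what bus width directly determines is peak memory bandwidth. As a sketch, for a 256-bit GDDR3 card: the 800 MHz memory clock below is the figure commonly listed for the 8800M GTX, so treat it as an assumption.

```python
# Peak theoretical memory bandwidth from bus width and memory clock.
# 800 MHz is the commonly listed 8800M GTX memory clock (assumption).

def bandwidth_gb_s(bus_width_bits, mem_clock_mhz, ddr_multiplier=2):
    """Peak theoretical bandwidth in GB/s for a DDR-type memory bus."""
    bytes_per_transfer = bus_width_bits / 8          # bus width in bytes
    transfers_per_sec = mem_clock_mhz * 1e6 * ddr_multiplier
    return bytes_per_transfer * transfers_per_sec / 1e9

print(f"{bandwidth_gb_s(256, 800):.1f} GB/s")  # 51.2 GB/s
```

That 51.2 GB/s matches the bandwidth nVidia lists for the 8800M GTX, which is consistent with a 256-bit bus.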
I didn't say ALUs don't generate heat; of course they do.
Stream processors don't. ALUs are responsible FOR stream processing, so of course they're going to :/ -
http://www.nvidia.com/object/geforce_9800m_gt.html
So it has 96 SPs (and the clock shown by GPU-Z was right). -
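The 360 GFLOPS figure mentioned later in the thread lines up with those 96 SPs, assuming the 1250 MHz shader clock commonly listed for these cards and nVidia's counting of 3 FLOPs per SP per clock (dual-issue MADD + MUL); both of those are assumptions on my part, not from the spec page linked above.

```python
# Reproducing the 360 GFLOPS figure from the 96-SP count.
stream_processors = 96
shader_clock_hz = 1250e6    # 1.25 GHz shader clock (assumed)
flops_per_sp_per_clock = 3  # MADD (2 FLOPs) + MUL (1), nVidia's count

gflops = stream_processors * shader_clock_hz * flops_per_sp_per_clock / 1e9
print(f"{gflops:.0f} GFLOPS")  # 360 GFLOPS
```

The 8800M GTX works out to the same number with the same inputs, which is why the two cards benchmark so close together.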
Not sure how official this is, but it seems like the 9800M GT is pretty similar to the 8800M GTX. Plus some info on the 9800M GTX:
http://translate.google.com/transla...usiast-gamers.html&sl=it&tl=en&hl=en&ie=UTF-8 -
sorry, my DSL was down for a few days.. so what is the result?
-
Donald@Paladin44 Retired
Start at Page 18 and you will see the result.
-
the result is Donald stands correct.
-
The only thing that is strange to me is that run 0 for the 8800M GTX is faster than runs 1, 2, and 3. On my lappy, run 0 is always around 0.8-1 fps slower. Since we know these cards are almost identical (both 360 GFLOPS), there must have been some background process that kicked in and slowed down the 8800M GTX. This has happened to me once. -
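A quick way to spot the kind of odd run described above is to flag any run whose fps deviates from the median of the set by more than some threshold. The fps values below are made-up placeholders shaped like the pattern described (run 0 fast, the rest consistent), not the thread's actual results.

```python
# Flag benchmark runs that deviate from the median by more than 0.5 fps.
from statistics import median

runs_fps = [34.9, 33.8, 33.7, 33.9]  # hypothetical runs 0..3
mid = median(runs_fps)
outliers = [(i, fps) for i, fps in enumerate(runs_fps) if abs(fps - mid) > 0.5]
print(outliers)  # → [(0, 34.9)]
```

The median is more robust than the mean here, since the outlier run itself would drag a mean toward it.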
There are some new 9800m "benchmarks" on the NotebookCheck GPU benchmark listing. I don't know whether these are reliable or not so interpret however you please.
-
Someone confirm that this data can't be right, please? -
armagideontime Notebook Consultant
yeah, i find that very hard to believe...
-
the 9800M GT is equivalent to the current 8800M GTX. what i noticed is that it is less expensive: i believe i purchased my 8800 SLI for $800 and now the 9800 SLI is $500.
but i think the 9800M GT vs 8800M GTX question is a non-issue right now. what i believe will be an issue is a required mobo upgrade to support the new 9800M GTX in SLI. that would tick me off, as i seriously thought we were done with mobo revisions!
Differences? 8800M GTX and 9800M GT
Discussion in 'Sager and Clevo' started by wr0ck, Jul 15, 2008.