According to various benchmarks and some games, it gets the same performance as my 240M. So Intel shaders are extremely efficient? 16 Intel shaders = 48 Nvidia shaders, so if it had as many shaders as a GTX 680 it would get around 14,000 in 3DMark11. And yes, I understand that it would be impossible to fit that many shaders on a CPU, but they could just make a dual-CPU-socket motherboard and make one CPU an insanely powerful GPU.
Intel HD Graphics 4000 - Notebookcheck.net Tech
NVIDIA GeForce GT 240M - Notebookcheck.net Tech
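For what it's worth, here is that extrapolation as a back-of-the-envelope sketch. Every input is an assumption: the baseline 3DMark11 score (mobile HD 4000 results land somewhere around P650), the 16-EUs-equals-48-shaders premise, and above all the idea that scores scale linearly with shader count, which ignores memory bandwidth, clocks, and drivers. For the record, the ~14,000 figure above implies a baseline closer to P440.

```python
# Naive linear-scaling sketch of the claim above. All inputs are
# assumptions; linear scaling itself is the shakiest one.
EQUIV_NV_SHADERS = 48    # the thread's premise: HD 4000's 16 EUs ~ 48 shaders
GTX680_SHADERS = 1536    # CUDA cores in a GTX 680
HD4000_3DMARK11 = 650    # assumed mobile HD 4000 score (~P650)

scale = GTX680_SHADERS / EQUIV_NV_SHADERS            # 32x the "shaders"
print(f"Naive extrapolation: P{HD4000_3DMARK11 * scale:,.0f}")  # ~P20,800
```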
-
That's Ivy Bridge for you.
That means it can run Skyrim at medium o.o, because my 9400M can run it at medium at 1280x800 -
-
It's probably as good as a 96-shader Nvidia card (525M). But clocks and memory are different, so you can't just say 16 Intel shaders = x Nvidia shaders.
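To put rough numbers on that, a common normalization is theoretical peak FLOPS: shaders x FLOPs per shader per clock x clock. The figures below are approximate spec-sheet values, and peak FLOPS still ignores memory bandwidth and drivers, so treat this as a sketch, not a benchmark.

```python
# Rough peak-FLOPS comparison: why raw shader counts don't translate
# across vendors. Clocks and per-shader widths are approximate.

def peak_gflops(shaders, flops_per_shader_per_clock, clock_ghz):
    """Theoretical peak = shaders x FLOPs/shader/clock x clock (GHz)."""
    return shaders * flops_per_shader_per_clock * clock_ghz

# Intel HD 4000: 16 EUs, each roughly 2x 4-wide SIMD with MAD (~16 FLOPs/clock)
hd4000 = peak_gflops(16, 16, 1.15)  # ~294 GFLOPS at ~1.15 GHz turbo
# Nvidia GT 525M: 96 CUDA cores, 2 FLOPs/clock (FMA), ~1.2 GHz shader clock
gt525m = peak_gflops(96, 2, 1.2)    # ~230 GFLOPS

print(f"HD 4000 ~{hd4000:.0f} GFLOPS vs GT 525M ~{gt525m:.0f} GFLOPS")
```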
-
What about driver support and image quality (2D and 3D)? How do they compare in this respect?
-
All you can really conclude based on shader count is how two cards in the same generation from the same company compare. -
^even that is pretty flaky depending on how the memory bus scales.
at the end of the day, figure out what games you ACTUALLY PLAY and pick the graphics solution that works best in those games. -
Greg is right. Even Nvidia's own shaders are tricky to compare between generations, like Fermi vs Kepler: Fermi's shaders ran on a separate "hot clock" at roughly twice the core clock, while Kepler's run at the core clock.
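A quick illustration with the desktop flagships (reference clocks, peak FLOPS only): the GTX 680 has three times the shaders of the GTX 580, but only about twice the theoretical throughput, precisely because Fermi's shaders ran on the hot clock.

```python
# Fermi vs Kepler: shader counts mislead because Fermi's shaders ran
# on a "hot clock" (~2x core clock) while Kepler's run at core clock.

def peak_gflops(cores, shader_clock_ghz):
    return cores * 2 * shader_clock_ghz  # 2 FLOPs/core/clock (FMA)

gtx580 = peak_gflops(512, 1.544)   # Fermi GTX 580: 512 cores on the hot clock
gtx680 = peak_gflops(1536, 1.006)  # Kepler GTX 680: 3x the cores at core clock

print(f"GTX 580 ~{gtx580:.0f} GFLOPS, GTX 680 ~{gtx680:.0f} GFLOPS")  # ~1581 vs ~3090
```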
-
Tsunade_Hime such bacon. wow
I think initial benchmarks of the HD 4000 were around 550M/5650 Mobility performance, which is drastically better than the HD 3000 (around the 8600M GT mark).
-
I will bench my i7-3610QM for HD 4000 performance when I get it. Granted, the GPU has a max speed of 1.1 GHz vs the 1.25 GHz of the i7-3720QM, but that shouldn't matter much, I think. I'm also hoping to boost RAM speeds up to at least 2000 if I can find the right RAM, and check IGP performance there.
-
Wow, that's a ton of power to be packed into an 11.6" frame. If the HD 4000 is about 5650 level in terms of power, then it looks like AMD once again has nothing going for it in the CPU department; that's better than the 6620G in their highest APU, if I recall correctly.
I would love to see benchmarks of you running, say, BF3 on the HD 4000 only. -
The AMD A6/A8 Llano CPU isn't that bad. It's relatively comparable to an i5-2520M in multi-threaded apps, though single-threaded apps suffer; in general, performance is on par with the i5-2520M. See here: http://forum.notebookreview.com/har.../652836-intel-beat-out-amd-4.html#post8401304 as well as my benchmarks of the 6620G. -
Yeah, I've been following your benchmarks, as I've been looking for something with better battery life and similar or better performance than my Y560 for a few months now. Finally settled on the Y470 with the i7 and 550M, which has not disappointed me yet. For $469, how could I go wrong?
-
I believe it scaled reasonably well. Don't have links atm, but IIRC for HD 3000, going from 1333 to 1600 was fairly significant, but 1600 to 1866 had diminishing returns.
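That pattern makes sense if the IGP is bandwidth-bound at first: dual-channel DDR3 bandwidth grows linearly with memory speed (2 channels x 8 bytes per transfer x transfer rate), and once the GPU stops being starved, extra bandwidth buys little. A quick sketch of the theoretical peaks:

```python
# Theoretical dual-channel DDR3 bandwidth at the speeds discussed above.
def dual_channel_gbps(rate_mt_s):
    """2 channels x 8 bytes/transfer x MT/s, in GB/s."""
    return 2 * 8 * rate_mt_s / 1000

for rate in (1333, 1600, 1866, 2000):
    print(f"DDR3-{rate}: ~{dual_channel_gbps(rate):.1f} GB/s")
# DDR3-1333 ~21.3, DDR3-1600 ~25.6, DDR3-1866 ~29.9, DDR3-2000 ~32.0
```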
-
Looks like the lower end of graphics is what is getting the big jump in this new generation.
-
Everything you guys said is true, but I can't bring myself to buy another notebook. I remember playing Burnout and Batman a few years back, but I'm sure once I get FiOS, my notebook and an updated cloud gaming site will be great.
After some more reading, I have concluded that for most people a gaming GPU is useless, because Gaikai and OnLive are going to dominate the market. I had been using OnLive to game when I was away from home, and it was good enough to replace a PC. Obviously it isn't as responsive, but that will improve later on.
In the future I hope to be able to use services like OnLive to play games on my new Galaxy Note. (The service already exists, but it's too expensive if you add up all the costs associated with mobile cloud gaming.)
Currently I play newer games on Gaikai since it's so simple and easy to use. -
6970M -> 7970M = about a 93% increase (7970M scored P5447 vs the 6970M's P2821; 5447/2821 ≈ 1.93) -
-
Sounds like Intel is really pushing to get a monopoly.
Can't believe the Intel HD 4000 is on par with an AMD 5000-series mobile card.
What will they do next? Stick RAM and an SSD into the CPU and just do away with all other hardware? -
Don't get too excited, it's not that powerful. It's ok for 720p gaming at low/medium detail, but not much more. Definitely a welcome improvement, but far from replacing dedicated GPUs or RAM or storage.
-
Meaker@Sager Company Representative
Source: AnandTech - The Intel Ivy Bridge (Core i7 3770K) Review
Still behind Llano, let alone Trinity.
Also nowhere near a proper 5000 series chip. -
-
Tinderbox (UK) BAKED BEAN KING
My last two notebooks had HD 3000, and Windows calibration never stayed set, even with all Intel start-ups disabled. I also used to get video corruption with YouTube videos on both notebooks: different brands, latest Flash player, and latest video drivers.
Everything works great with AMD graphics, but AMD's driver updates don't work with dual-AMD-graphics notebooks; they do not see the dedicated GPU, so it's not all roses with AMD either.
John. -
-
Yup, Althernai is correct.
-
Tsunade_Hime such bacon. wow
-
Meaker@Sager Company Representative
-
If Intel made a dedicated 100W GPU for laptops...
-
Most people don't buy gaming-level chips, so if Intel takes out the bottom of the market, it will take most of that share, and then the only way Nvidia will survive is if they jack up their prices.
-
The main question here is IF Intel does that.
Thus far, it would appear that AMD's Trinity is a VERY promising alternative, and seeing how AMD has a tendency to offer cheaper solutions compared to Intel, I would argue there's a real possibility that they could easily dominate the bottom of the market. -
Would I be right in thinking that for 720p gaming at low detail, an Ivy Bridge i5 with HD 4000 would run much cooler than an Ivy i5 with an Nvidia 525M (or 6-series equivalent)?
The reason I ask is that I had a laptop where the Nvidia 9800M GT SLI died from heat, and it put me off discrete graphics cards. I am hoping that the HD 4000 will be a cooler-running alternative. -
Here's some quick testing of the HD 4000 in Ivy Bridge i7-3610QM compared with AMD 6620G. Not looking so great.
http://forum.notebookreview.com/sag...10-clevo-w110er-first-look-review.html#hd4000 -
So it got 4822 in 3DMark06 in your review.
My 8510w scores 5700 points with a T9300 and Quadro FX 570. I haven't bothered to run any newer benchmarks with it, since it would only make me sad... -
-
Upgrading needs money.
I wouldn't take anything less than a new EliteBook to replace an old top-of-the-line workstation.
Besides, that laptop can still run several virtual machines fluently enough under 2k8r2 server, and that's all I need for now. -
You have the patience of Yoda. I would never have the patience to hold out for so long with a laptop. Although I understand the desire for a new EliteBook; I'll be jealous when you get one!
Then again, I have a few desktops that are over six years old and still get used, but like you said, if it works for what you need it for, then why change? They're used mainly as light HTPCs, one in my kids' room and one in my basement with a big-screen TV.
-
OK, I wouldn't call it "throttling." Probably the Turbo expires pretty damn quick and it goes back to base clocks. It's not just Cinebench; the iGPU results are way lower on 3DMark too.
Comparison with Notebookcheck's 3610QM:
3DMark06 CPU: 4407/6099
3DMark Vantage CPU: 16438/21508
x264 is too low, too. At the least, it shouldn't be slower than Notebookcheck's 3610QM. I could see that it may be intentional: 45W may be too much for an 11-inch form factor. That may mean the 35W 3612QM would fare a lot better. -
I guess I'm struggling to understand this then. 100% CPU use like Prime95 pegs all four cores at 3.1GHz for over an hour, no problem. Other programs as well. Only Cinebench shows this phenomenon. Temps are also only about 78-80C compared with Prime95 at 92C.
Actually, the 3612QM doesn't boost at all when the dedicated GPU is active, as was found in the Sager W110ER forums.
ThrottleStop does some funky things with this Ivy Bridge, although I'm not really familiar with ThrottleStop. -
As far as the HD 4000 / 3610QM is concerned, the issue is non-standard resolutions. The results from Notebookcheck are at 1280x768, not the 3DMark06 and Vantage standard of 1280x1024. They do this because it fits on the screen, but IMHO it only causes confusion. Here's my run at 1280x768, which matches up with Notebookcheck. Same thing goes for 3DMark Vantage.
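For scale, the standard resolution pushes about a third more pixels than Notebookcheck's, which is why the two runs aren't comparable:

```python
# Pixel counts: 3DMark06/Vantage default vs Notebookcheck's resolution.
standard_px = 1280 * 1024  # 1,310,720 pixels
nbc_px = 1280 * 768        #   983,040 pixels
print(f"Standard run draws {standard_px / nbc_px:.2f}x the pixels")  # ~1.33x
```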
-
Interesting stuff. I am going to wait for HD 5000, which will get 10k in 3DMark06, so that I can play games from 2010-2011, which is IMO the best graphics I am going to need for my 1024x768 plasma. I feel like that will be the limit for my graphical needs; after that it's on to the cloud...
-
3DMark06 is far from a decent benchmark to base anything on. It's over six years old and very CPU-dependent, meaning you can have a very fast CPU with a crappy GPU and still score high. Unless you plan on playing games primarily from 2008 and earlier, it's far from a good judge of gaming performance for anything new.
And I doubt HD 5000 will score 10,000 3DMarks, close I guess, but that won't be out for over a year. And then the 4K screens will be coming out, so you'll need a GPU that can drive a screen at 3840x2160 resolution. -
-
I don't see the point in playing the latest and greatest; I still enjoy playing GRID and Burnout Paradise. I understand the whole idea of never-ending improvements, but I don't think graphics are the way to go for gaming. Developers need to focus more on gameplay.
I assure you that even while playing sports games, it is useless to see the player's face; the number is enough.