DDR4 should come as soon as possible, because integrated GPUs could really benefit from it. Maybe with Broadwell...
There will be 32MB and 64MB variants too. It might be inaccurate to call these memories a cache; it's more a memory dedicated to the GPU only.
-
-
HopelesslyFaithful Notebook Virtuoso
-
Yes, it is on the motherboard, very similar to what AMD introduced in 2008 with the 790GX, called "sideport" memory. However, AMD discontinued it because it was not effective enough in real use and added power consumption. Maybe Intel has found the solution; we will see in later tests. I'm just afraid those AnandTech tests won't represent real gaming situations and settings...
-
HopelesslyFaithful Notebook Virtuoso
-
Karamazovmm Overthinking? Always!
We don't know the size, we don't know which models will feature it, we don't know anything. What we do know is that there was an added layer of functionality in the GT3; however, a real revamp of the iGPU architecture is only slated for Broadwell, not Haswell.
What we also don't know is how many EUs will be available, nor their clocks, nor their thermal budget. One thing we can't deny is how much progress there has been from the first 4500 HD to today's HD 4000, that is, since the introduction of the Nehalem arch: what was an afterthought became one of the main priorities for the engineering team. That is still no excuse for the comparatively poor performance against what AMD offers in their APUs; I hope they deliver performance similar to the 7660G or better.
According to Intel's timetables, DDR4 will be available for Broadwell and certainly for Skylake. The idea is that DDR4 is already being produced, so that there is enough stock for next year. -
Believe me, iGPUs are hungry for faster memory. AMD's Trinity even gains from tighter memory timings... It depends on how strong the iGPU is, because a more powerful one needs more feeding from the memory.
-
Karamazovmm Overthinking? Always!
It can be on the PCH; it can be on a special connection to the CPU that avoids the PCH completely so that the bandwidth is as large as possible (which, according to the leaks and semi-official info, it is); it can be on the die itself. The mobo is a very large area; basically, until we know how it works, there are several places the eDRAM could be.
-
-
Karamazovmm Overthinking? Always!
-
Karamazovmm Overthinking? Always!
-
-
LOL, I just purchased a top-of-the-line DV6 with Ivy Bridge!! Should have waited! Oh well, it's not like this thing is slow or anything!
-
-
Jayayess1190 Waiting on Intel Cannonlake
-
Jayayess1190 Waiting on Intel Cannonlake
-
Doesn't look like anything to get too excited about, so I can wait until June 2014, when Nvidia Maxwell and Intel Broadwell will collide. I just hope Broadwell isn't actually BGA-only. Intel i7-5920QM + GTX 890M in 18 months and counting.
Saving my pennies now.
I'll likely bump up to an i7-3740QM in the interim and overclock the hell out of my 680M if needed. -
I have a Penryn ThinkPad R61 (2.5GHz). Since I put in an SSD and replaced Vista with Ubuntu, its performance has been great, and I have replaced the battery twice. I will wait for Broadwell or Skylake and buy a convertible, either with Mac OS (I hope there will be an MBA with touch) or a Windows equivalent with Android in super-saver battery mode. I am hoping we will see SSDs of at least 512GB at reasonable prices, and AMOLED screens as well. -
Also, on the upgraded integrated graphics: the GT3 is going to be available only with BGA processors. I think this is a way for Intel to protect Nvidia and AMD on the graphics front, or at least to keep forcing more expensive "ultrabooks" on consumers. Intel reserves its powerful integrated graphics for the thin-and-light ultrabooks it is trying to push, while users with normal laptops will still be forced to purchase Nvidia or AMD graphics if they want something powerful enough to actually game on.
-
HopelesslyFaithful Notebook Virtuoso
-
People who need an IGP are mostly those who buy cheap notebooks and ultrabooks and don't upgrade the GPU/CPU anyway.
Not a big fan of soldered hardware anyway, grrr -
HopelesslyFaithful Notebook Virtuoso
which is what I said, right? Or did I just say it wrong in my sleep this morning ^^
-
superparamagnetic Notebook Consultant
There actually is a good engineering reason for using BGA. Solder conducts heat better than pins, so it might help with heat dissipation on what's likely to be a 47W to 57W chip.
Most likely it's targeted at space-constrained systems. 57W is about the TDP of a ULV chip plus midrange discrete graphics. While you might be able to dissipate that kind of heat with two chips, the space you save by going down to a single chip could be used for things like a bigger battery or better cooling. In this market it doesn't make sense to waste space on a socket anyway.
Then again it could just be Intel up to its old market segmentation schemes again. -
HopelesslyFaithful Notebook Virtuoso
BGA vs. pins doesn't matter for temps. The "heat" is still electricity at that point, so it will have no effect. Now, the point that a thinner package could let you fit a better heatsink definitely could be argued.
-
BGA reduces production costs, if only by a little bit, for both Intel and the laptop manufacturer. It also reduces the height of the chip on the PCB by a few millimeters, which can translate to a few millimeters shaved off the thickness of a computer if they design it correctly. On the other hand, it makes upgrades more of a headache for end users, and warranty service harder for laptop manufacturers: a motherboard failure also costs them a CPU unless they go through the annoying process of reclaiming it from the board. And if they offer multiple CPU options for a single laptop model, it's a headache to make and stock a handful of different motherboards for one PC.
Personally, I hate BGA CPUs. The laptop I'm typing on has one, and it is starting to get too slow for me with a lot of multitasking. -
HopelesslyFaithful Notebook Virtuoso
-
It depends on how much it costs to upgrade, and how much benefit there is from an upgrade. In lots of cases it makes no sense because you are almost throwing your money at something that will not get much better. But in others, a small amount of money can greatly boost system performance. -
HopelesslyFaithful Notebook Virtuoso
If I were you I'd wait for a die shrink. I will always upgrade on a die shrink if I can.
-
superparamagnetic Notebook Consultant
BGA is better than PGA for dissipating heat into the motherboard. In effect the motherboard acts as a secondary heatsink (with a sizeable surface area). For low power chips like Intel's PCH the motherboard is enough to act as the primary heatsink on a BGA package.
Intel has a nice spiel on BGA thermal characteristics: Ball Grid Array Packaging: Packaging Databook Ch 14
See 14.10.1 if you want to read about it. -
HopelesslyFaithful Notebook Virtuoso
-
So deep in entropy here.
-
Jayayess1190 Waiting on Intel Cannonlake
Haswell GT3 graphics to launch in Q3 2013 via Fudzilla here and here.
-
Jayayess1190 Waiting on Intel Cannonlake
Core i7 3940MX is Intel’s new mobile king
Intel is working on Haswell Core i7-4930MX processor
M-line Haswell, turbo to 3.9GHz, 57W TDP
-
HopelesslyFaithful Notebook Virtuoso
Isn't M for mobile? And isn't there an XM and an MX chip? The MX is soldered and the XM is socketed?
-
Kind of debating whether or not to wait for Haswell...
-
HopelesslyFaithful Notebook Virtuoso
otherwise waiting is relatively pointless. -
Heck, I ran the HD 4000 overclocked by 200MHz with DDR3 RAM at 2133MHz, and 3DMark11 went from ~600 to ~750. While that is a 25% gain, the end result is still laughable. The 7660G in the A10-4600M scores ~1350 stock with 1600MHz RAM. The point being, Intel has a lot of work to do to at least double the performance of their current design, which isn't likely. If integrated graphics are important to you, I'd say go for AMD. If you want CPU power and decent graphics performance, go Intel + dedicated GPU; even a low-end one will be better than Intel's IGP.
One issue I've found even with AMD's IGP is that CPU-hungry games like BF3 fight for system resources with the IGP, so while on paper it looks about equivalent to a 630M, in some cases it can't even manage 25 FPS where it should be able to pull 40 FPS. A dedicated card has its own memory and architecture to use independently of the CPU. -
-
The only thing disappointing about Haswell for me is that it still takes a top-of-the-line mobile CPU to get a non-Turbo base clock of 3.0GHz. That's really too bad for workflows where high frequencies matter.
-
HopelesslyFaithful Notebook Virtuoso
-
-
HopelesslyFaithful Notebook Virtuoso
No, a chip is limited to a set TDP. So if it has a 17W TDP, and the CPU can use 17W at full load while the GPU can use 16W at full load, then they have to share the 17W; the CPU and GPU might get 5W and 12W respectively, if that's what the program needs. I have an SB ULV notebook. The CPU can use 17W in Prime95, and in FurMark the iGPU can use 16-17W, so with a total TDP of 17W only one can run at full speed, or they each get throttled. I can only play DC Universe Online or Dungeon Siege 3 at 15 FPS because the package can only use 17 watts. When you look at it, the CPU is at 50% and the GPU at 30-50% (if I remember), the total package power is at 17 watts, and the CPU is using around 10 watts and the GPU around 7.
So take an Intel CPU, run FurMark and Prime95 separately and see how much power each can use, then run a game (or both benchmarks together) and compare.
You'll see that I am right.
EDIT: What I am wondering is whether a 45-watt 22nm Intel (IB) would do better than a 32nm 35-watt Trinity because of this issue. Even if Intel's GPU is slower, do the larger total TDP and smaller process give it better game performance than the AMD counterpart? I know AMD will win in synthetics when it's only stressing one part, but what about combined? What are the comparisons in 3DMark 11's combined tests, not the totals?
Also remember I did this with a ULV... it has a very limited TDP; a 45-watt processor may not have this issue, or not as severely. If my ULV could run at a 35W TDP then I could actually play DC Universe Online and DS3 at some level, but 15-25 FPS is a joke. Also, FPS varied in game from 10-30 overall, and while watching HWiNFO I noticed FPS sagged whenever the CPU needed more juice while the overall package power stayed the same. So iGPUs are great if the game is not CPU-intensive; if it is, GPU performance will tank -
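The budget-sharing behavior described in that post can be sketched as a toy model. This is only an illustration of the idea (the numbers come from the 17W ULV example above; real firmware uses more complex policies), not Intel's actual algorithm:

```python
# Toy model of a shared package power (TDP) budget between CPU and iGPU.
# If combined demand exceeds the package cap, both parts get scaled back,
# which is why a CPU-heavy game starves the iGPU on a tight TDP.

def share_tdp(cpu_demand_w, gpu_demand_w, tdp_w):
    """Scale both demands down proportionally when they exceed the package TDP."""
    total = cpu_demand_w + gpu_demand_w
    if total <= tdp_w:
        return cpu_demand_w, gpu_demand_w
    scale = tdp_w / total
    return cpu_demand_w * scale, gpu_demand_w * scale

# CPU wants 17 W (Prime95-like load), iGPU wants 16 W (FurMark-like load),
# but the package is capped at 17 W: neither component gets its full demand.
cpu_w, gpu_w = share_tdp(17.0, 16.0, 17.0)
print(round(cpu_w, 1), round(gpu_w, 1))
```

A proportional split is an assumption for illustration; the observed 10W/7W split above suggests the real allocation also depends on which side the workload stresses harder.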
Forget Haswell, Broadwell is on the way!
Forget Broadwell, Skylake is on the way!
Forget Skylake, Skymont is on the way! -
HopelesslyFaithful Notebook Virtuoso
-
Though in the case of BF3, IMO it's just more likely that the game is CPU-limited. (You could check with render.showperfoverlay 1 or something.) -
HopelesslyFaithful Notebook Virtuoso
EDIT: I typed this on my phone in class and it botched what I typed.
With BF3 I would bet it is TDP, not the CPU, that's limiting. I don't see how the CPU would be too slow. Also, if it slows down because the CPU can't keep up, it will hit the TDP limit before the CPU is actually too slow.
A 45W TDP CPU will use 100% of its TDP, i.e. 45 watts, at max load... so how can the iGPU run if the game requires that much CPU? Again, no matter what type of game it is, if it requires a lot of CPU or GPU it will throttle at the TDP limit. A desktop chip would not do this, since it has a nearly unlimited TDP (if a K chip), and even then a desktop chip's iGPU probably can't fill a 77W TDP, so I am sure TDP never gets in the way there. (Could be wrong, but I would be amazed if that happened. Well, I could see it if you're trying to play Total War, particularly Rome or Medieval 2; running 25k units on screen requires a ton of GPU and CPU, so I could see it being limited by CPU or TDP.)
If a game on a 45W chip can really be limited by RAM, I would be surprised; it would have to be using a ton of bandwidth at such low settings. If that is the case, then maybe DDR4 will fix the problem.
Another point: I can see a game being limited by a slow CPU rather than by the iGPU or RAM if it is single-threaded, like the games I mentioned above: CS: Source with bots, Total War, a crazy SimCity 4 map, or RollerCoaster Tycoon 3. In RCT3 I get 30-120 FPS on a max-sized map when I use the tools, since it is single-threaded.
EDIT: @Cakefish... beyond that we will switch to a different material than silicon, and then quantum CPUs!!!!
BTW: r3d... you do realize you repeated exactly what I said in your first statement, right? -
BF3 is a very CPU- and thread-hungry game in multiplayer. It can use 8 threads no problem and does best with a fast hyperthreaded quad-core Intel CPU. Go ahead and test it yourself with ThrottleStop on an Intel quad core: adjust the CPU speed from 2GHz to 3GHz and you will see a marked difference in BF3. The AMD A10 doesn't even come close to the performance of an Intel quad-core i7, so you can imagine the resulting performance drop. TDP would only really be a factor if temps were too high, but in the case of Trinity they are not; they are 70-72C max. Most other games I've tried so far, however, perform well with the Trinity IGP.
-
HopelesslyFaithful Notebook Virtuoso
TDP has nothing to do with temps; it has to do with power usage. If I run a 45W TDP chip at 0C I am not going to get more than 45 watts out of it, compared to running it at 70C.
Anyway, I am going from my experience with Intel chips; maybe an AMD chip does not reach max TDP at 100% CPU load. Maybe 100% on an AMD chip is only 20 watts out of 35. I don't know, I've never used one, but I know what I am telling you is 100% correct. Just go test what I told you.
The fact is, if the AMD CPU uses its full 35 or 45 watt TDP window at 100% CPU utilization, then playing a CPU-intensive game will kill GPU performance due to a lack of TDP headroom. The two WILL fight for TDP, which will kill your gaming performance.
If I remember correctly, DC Universe Online required more CPU than DS3. I would get 10 watts CPU and 7 watts GPU in DCUO, and the opposite in DS3, if I remember correctly. I did this about a year ago so I can't quite remember. I don't really feel like proving it to you since I know I am right, so do whatever you want: test it and see for yourself, or blindly believe what you wish. It is your own prerogative. -
superparamagnetic Notebook Consultant
I'm not sure where you get your info, HopelesslyFaithful, but everything I've read says TDP is both power and thermals.
The Sandy Bridge architecture specifically allows a CPU to turbo up above TDP if there's thermal headroom, so yes you can get more than 45W out of your 45W chip. Source: AnandTech - Intel's Sandy Bridge Architecture Exposed
Anand also explains how Intel calculates TDP: AnandTech - Intel Brings Core Down to 7W, Introduces a New Power Rating to Get There: Y-Series SKUs Demystified
Thermals play a big role. -
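The turbo-above-TDP behavior that AnandTech describes can be sketched as a simple power-budget model. This is a simplified illustration loosely based on the sustained/burst power-limit idea (PL1/PL2); the numbers and the refill rule are assumptions, not real SKU values or Intel's actual algorithm:

```python
# Sketch of a turbo power budget: the chip may draw above its sustained
# limit (pl1) for a while by spending stored thermal/energy headroom,
# as long as the long-term average settles back to pl1.

def simulate(demand_w, pl1=45.0, pl2=57.0, budget_j=180.0, dt=1.0):
    """Return the power actually granted each second for a list of demands."""
    granted = []
    budget = budget_j  # joules of headroom available above pl1
    for d in demand_w:
        p = min(d, pl2 if budget > 0 else pl1)
        budget += (pl1 - p) * dt        # drain when above pl1, refill when below
        budget = min(budget, budget_j)  # headroom can't exceed the cap
        granted.append(p)
    return granted

# A sustained 57 W demand: the chip turbos at 57 W until the headroom runs
# out, then falls back to the 45 W sustained limit.
print(simulate([57.0] * 20))
```

So both posters are partly right: the chip can exceed its rated TDP, but only briefly and only while thermal headroom lasts, which is why cooling matters.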
@HopelesslyFaithful - TDP is primarily about heat. Of course this is directly related to energy consumed, but TDP guidelines tell laptop manufacturers what cooling to provide for, not how much energy the CPU should draw. And I'm not sure how you can measure the power split between the CPU and IGP, if that's what you were saying in your last comment.
Even Wikipedia states: Thermal design power - Wikipedia, the free encyclopedia
" The thermal design power (TDP), sometimes called thermal design point, refers to the maximum amount of power the cooling system in a computer is required to dissipate."
" This ensures the computer will be able to handle essentially all applications without exceeding its thermal envelope..."
" For example, a laptop's CPU cooling system may be designed for a 20 watt TDP, which means that it can dissipate up to 20 watts of heat without exceeding the maximum junction temperature for the computer chip."
Forget Intel Ivy Bridge, Haswell on the way
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Jayayess1190, Jan 28, 2011.