As far as I know, DDR4 isn't available for the consumer market yet, although some companies have already announced desktop DDR4 modules. They already have it for servers, but what about laptop DDR4? What do you guys think?
-
-
Karamazovmm Overthinking? Always!
Skylake will have DDR4 support, so 1-2 years down the road.
Desktop DDR4 is already for sale, though it doesn't currently enjoy the same breadth of lineups as DDR3. -
The biggest draw will also be a fairly small improvement in power drain, of the kind that isn't really going to register on a properly configured laptop. In real terms, the biggest improvement for laptops or embedded devices is that you can get the same speed as before from a much smaller chip. Along with better-made RAM controllers, that could be interesting for slimming down motherboard designs.
Meanwhile, to achieve much better bandwidth than before, the latency will apparently rise fairly high. Unless, I'm assuming, you can turn off the internal handling of memory bank prefetches and exploit the better-quality RAM chips (lower voltage requirements, more consistent reads and writes, etc.) at a higher bus frequency, on hardware that allows it. In theory that should make it possible to get both lower response times and higher total bandwidth.
But that's not going to apply to laptops or embedded devices. So..
The interesting thing that's happening in mobile devices at the moment is that the miniaturization options - ddr3l, downclocked buses, better efficiency, smaller chip dies, better bandwidth at lower power consumption, etc. currently being hyped for intel and x86 platforms -- also applies to other solutions. In fact, ARM devices exist because of these options. And as they become even cheaper to make, and the lines between a general computing device and a "processing device" hub are blurred enough, we're going to see options for getting a reasonably high yield platform running on a fraction of the power a current laptop runs at now.
We kind of already have it, of course. But for these solutions to be adopted as anything but "app" devices, software needs to be developed a bit differently, and a little bit smarter. So the move towards extremely low-powered devices isn't really taking off in the way some of us predicted some ten years ago.
That's the same problem you're running into with DDR4 RAM. It allows a different kind of programming interface that can exploit parallel mapping on the memory bus. But unless you prepare RAM areas for it programmatically, the benefit is going to be unpredictable, averaging out to something that makes it seem useless.
But suppose it becomes more common to employ programmers to create solutions for specific programs with explicit... I don't know... "abstract assembly", to make something up: an abstract high-level language designed to exploit the parallelisation that is becoming, and already is, available in the architecture. If that happens, and if we don't stick with the "need" to let all the low-level nitty-gritty be handled automagically, then very interesting things could happen in computers very, very quickly.
Until then, though.. don't expect too much. -
Need a chipset to support DDR4.
Desktop 9 series does, mobile 9 series doesn't.
DDR4 has lower power consumption and lower heat output than DDR3, but apparently that doesn't apply to notebooks...
IGPs would greatly benefit from faster memory, and you find most IGP users on notebooks, but that doesn't apply either...
Good job Intel -
MTA16ATF1G64HZ-2G1 - SO-DIMM 8GB DDR4, 260pin, 1.2V, CL15, 2133MT/s.
-
-
-
DDR4 is going to be like DDR3 when it was initially launched.
I remember those raging debates of DDR2 vs DDR3, which finally settled when new CPUs stopped supporting DDR2.
DDR4 is going to be expensive for 1-3 years. You should be more concerned about the CPU/GPU and SSD developments.
I'm more interested in Broadwell and Carrizo, especially with rumors about Carrizo getting stacked DRAM cache (aka a potentially cheaper version of Intel's L4 cache design). -
-
If you noticed, with every new DDR generation the latency timing increases, but so does the clock rate. DDR3 has a higher timing latency than DDR2. -
First time I hear of this, but I can see the idea behind the comment. It has to do with the CAS latency, or in other words, how many clock cycles it takes before the memory can execute an instruction it was sent. Is DDR4's latency that horrific, though? In terms of the number of cycles, maybe, but what does it look like in actual time? The real time it takes for RAM to perform an operation depends on both the latency and the clock speed.
I haven't followed DDR4 at all, so I have no idea what the latencies and clocks actually look like.
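For anyone who wants to do the math themselves: CAS latency in nanoseconds can be computed from the CL value and the data rate in MT/s. A minimal sketch in Python, using the CL15 / 2133 MT/s figures from the Micron SO-DIMM part quoted earlier in the thread, plus a typical DDR3-1600 CL9 configuration for comparison (the CL9 figure is my assumption, not from this thread):

```python
def cas_latency_ns(data_rate_mts, cl):
    """CAS latency in nanoseconds.

    DDR transfers twice per I/O clock, so the clock runs at half the
    data rate and one cycle lasts 2000 / data_rate_mts nanoseconds.
    """
    return cl * 2000.0 / data_rate_mts

# DDR4-2133 CL15 (the SO-DIMM part quoted above)
ddr4 = cas_latency_ns(2133, 15)   # about 14.07 ns

# DDR3-1600 CL9, a typical laptop configuration (assumed for comparison)
ddr3 = cas_latency_ns(1600, 9)    # exactly 11.25 ns

print(f"DDR4-2133 CL15: {ddr4:.2f} ns")
print(f"DDR3-1600 CL9:  {ddr3:.2f} ns")
```

By that measure, first-generation DDR4 is slower per access than mature DDR3 despite the higher bandwidth, which matches the complaint about rising cycle counts.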
Primer on RAM timings and how they work: Understanding RAM Timings | Hardware Secrets -
There doesn't seem to be much official or completely reliable information out there about DDR4 setups yet. But from the concepts, it looks like the latency of individual instruction fetches could become fairly high in order to maintain the higher throughput, at least at clock speeds comparable to DDR3's. Because of the way it's supposed to use multiple pipes, and the preparation that requires, a single fetch might become fairly slow.
It's a completely sound concept, though. And, as was said, if you can run this stably at a high enough frequency, the latencies could be lowered. I'm also guessing it will be possible to turn the prefetching functions off and simply run the thing at extreme clocks.
But laptop manufacturers can't even be bothered to tweak DDR3 RAM: DRAM-to-"FSB" ratios, clock frequency thresholds, load balancing, and so on. So the probability of them figuring out a "baseline" ratio for web browsing and video viewing that runs at high latency and comparatively insanely high bandwidth, but still lets the CPU and bus clock up under load with an increased fetch frequency, seems pretty much slim to none.
In any case, it's still going to end up as a minuscule power-draw improvement over DDR3L. So unless it's coupled with incredibly slim motherboard designs and new bridge/CPU and GPU solutions, so that you can get a desktop and a word processor running at 1 W draw or less, something approaching what similar tech can do right now, this isn't worth anything in the laptop world. Note also that it would need to be coupled with software that exploits the parallelism options.
If anything, I'd guess many laptop makers would be very unhappy to sell something like this if it actually provided any of those things, because it would put their products side by side with ARM solutions that would beat them in every way that matters. -
-
Always. (BTW, thanks for zapping the double post. Wifi...)
Take the Panther Point setups we've had over the last three years. They're still being sold by a lot of manufacturers, with DDR3 1600 MHz RAM. It's possible to set these up with very low bus speeds: you'd decrease the base clock value and allow very high multipliers for a not incredibly high processor speed, and then set the threshold for when the processor enters higher frequency states so that normal workloads wouldn't trigger them. You would enter higher clock states only when you actually need more power, to avoid increased response times or visual latency of some kind, because the goal would be to let the computer run at as low a frequency and power draw as possible at all times.
Or you could have different profiles for different scenarios. On the Intel firmwares you can actually change all of this, with the combined settings between the extended and basic BIOS functions. There's also a different approach that would involve balancing against the temperature limit.
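You can approximate the same "stay low, clock up only when needed" policy at the OS level today. A minimal sketch for Linux, assuming the cpufreq sysfs interface is exposed and using an example frequency cap I picked for illustration (paths and values vary by kernel and driver, so treat this as a sketch, not as anything a vendor firmware actually does):

```shell
#!/bin/sh
# Illustrative only: keep sustained clocks low and rely on the
# governor to ramp up only under real load. Requires root.

for policy in /sys/devices/system/cpu/cpufreq/policy*; do
    # Prefer low clocks until load demands more
    echo powersave > "$policy/scaling_governor"

    # Keep the ceiling well below turbo; 1.2 GHz here is just an
    # example value, the right number is hardware-dependent
    echo 1200000 > "$policy/scaling_max_freq"
done
```

The point is that the knobs for this kind of tuning already exist in software, which makes it all the stranger that firmware-level defaults rarely use them.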
And on most laptops, apparently none of this is used. It's like watching a Buster Keaton movie with cowboys falling over themselves and shooting everywhere, except they all have laser pistols and all their limbs keep falling off. It's just a huge mess, and it's going nowhere. Laptop designs that rely on bidirectional throttling kicking in to avoid the boost processor states, but where the frequency of the high processor states is still enough to overheat the machine, so the entire system grinds to a forced halt at 92 degrees, for example. That's an actual example from a very, very expensive and fairly popular laptop that's still being sold.
So if there's a theoretical possibility of ending up with DDR4 designs that have extremely low idle draw, that can wake up and shut down without ceremony, and that last a week on battery if left alone; where you can have customisable notifications from lid-mounted lights for things such as incoming calls, e-mail, calendar events, etc., while still having the option to turn up the processor clocks for the heavy lifting; then that would be nice, and would allow a kind of flexibility that would make x86 laptops competitive against specialized ARM solutions.
But.. can't really see that happening with the way things are done currently..
I mean, I was happy with the way my current Haswell thing is running. But it's not deliberately set up to exploit low clocks with high processor states available in the way I described. And why not? Who knows? We keep seeing IGP/CPU packages with a minimum 20 W draw while doing anything other than showing a photograph, while it's actually possible to pull that down on existing hardware if you tuned it for simply "consistent and sufficient input and visual feedback latency" or whatever, instead of the usual "let's make the idle state a normal processing load, and then turn the entire thing up to 11 when watching YouTube!".
It would almost be easier if it just weren't possible to do, of course. Then you could say: well, it's just too technologically advanced for 2014. But it isn't. And it is a bit surprising that we're not seeing anything happen here. Even if no one actually gave negative feedback to manufacturers who don't care about any of this, that area is the only place laptops can currently improve. You would think a very inexpensive change like that would be chosen before physical redesigns, or new motherboards and processors, and so on. But apparently not. -
You're welcome for the double post. I always nuke any I see, since it makes the forum easier to read, especially since modern website design dictates that content density has to be as low as possible, because that obviously makes sense... Anyways, I could rant about content density on "modern" sites all day.
So, if I understand you correctly, manufacturers basically have the possibility of tweaking the RAM settings so that it could enter a low-power state, like CPUs currently do, to reduce power draw, and then kick back into high gear when the situation demands it. That's pretty interesting and would be nice. Most manufacturers can't seem to be bothered to design their boards with components that play well together, especially in terms of power draw, so there's definitely a lot of room for improvement. -
1st-gen DDR4 will have pretty bad latencies. Corsair and Crucial have already rolled some out, so you can go see what it's like, not to mention the high initial cost as well. There's an article saying DDR4 will have to run at 3200 MHz before catching up to DDR3's latency at 2133 MHz.
The upcoming X99 chipset with Haswell-E will support DDR4, but I think Intel made a bad decision going with DDR4 on X99, simply because initially you'll be paying more for equal or lesser performance, and it won't be until 2016 that X99 reaches its full potential (unless you fancy spending $600+ on RAM alone). -
Damn, looks like I might have to stay with DDR3 RAM when I get my gaming laptop in about 6-8 months.
-
Meaker@Sager Company Representative
Unless you plan on getting a system using only the IGP, it will make no difference.
-
AMD APUs are the only CPUs that would benefit from faster RAM, though the first-gen DDR4 can't compete against high-end DDR3.
-
-
It makes no sense to pair budget APUs with expensive RAM. AMD was correct to hold off on DDR4 support.
-
-
It's rare to find a budget i3 paired with a dedicated GPU. ExtremeTech had an article about the disappearance of mid-range GPUs from laptops, especially ultrabooks. It noted that with many laptop manufacturers (with the notable exception of HP and one other), you have to spend over $1000 to get a mid-range GPU or Iris Pro.
I've come across a few laptop models that lock a dedicated GPU behind an expensive i7, or that only offer something like a Fermi-based 820M. And many laptops only had an i7 with no dedicated GPU at all (such as the i7-4600U).
And gaming on Intel's regular IGPs? Um, no thanks. -
Any words on laptop DDR4 RAM?
Discussion in 'Hardware Components and Aftermarket Upgrades' started by ipwn3r456, Aug 21, 2014.