The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    "Dual Core" a big bore

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Mamba, Jan 2, 2007.

  1. Mamba

    Mamba Notebook Guru

    Reputations:
    6
    Messages:
    64
    Likes Received:
    0
    Trophy Points:
    15
  2. jpagel

    jpagel Notebook Evangelist

    Reputations:
    20
    Messages:
    357
    Likes Received:
    0
    Trophy Points:
    30
    Yea that is true, if you have very little RAM. Large amounts of RAM will help utilize that multicore processing, since the CPU can process more chunks per millisecond; it just needs the RAM to store them quickly and push them through instead of bottlenecking on the rest of the system. Nice article though.
     
  3. moon angel

    moon angel Notebook Virtuoso NBR Reviewer

    Reputations:
    2,011
    Messages:
    2,777
    Likes Received:
    15
    Trophy Points:
    56
    I knew it! That and the overheating means I'll take an ML-44 for my next machine please! :D
     
  4. jpagel

    jpagel Notebook Evangelist

    Reputations:
    20
    Messages:
    357
    Likes Received:
    0
    Trophy Points:
    30
    If some of the software takes advantage of the hardware, why not rewrite the parts that don't?
     
  5. Arla

    Arla Notebook Deity

    Reputations:
    35
    Messages:
    1,073
    Likes Received:
    0
    Trophy Points:
    55
    Eh, all I got from that article is that how useful multiple cores will be depends on what you are trying to do...

    Well DUH!!!!!

    Isn't it obvious both that some operations can never benefit from multiple cores, since they must be single threaded, and that some operations will run into other limiting factors (memory speed, graphics card speed, hard drive speed, you know, all those OTHER bottlenecks in PCs)?
     
  6. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    Are two cars better than one? Depends on whether you have two people driving them or not. If you only have one driver, there's very little benefit. People thinking that dual-core will double their processing speed are mistaken. People who think that it will allow their machine to multitask better, and perform calculations that are optimized for multiple threads/processes faster, are correct. It runs many graphical rendering programs a lot faster for me because it has multiple cores, or allows me to dedicate one core to processing images while I can still use the machine to surf, play non-memory-intensive games, etc. without slowdown.

    And Adobe needs to hire better mathematicians and/or software engineers if they can't parallelize multiple partial-differential equation solutions.
     
  7. jpagel

    jpagel Notebook Evangelist

    Reputations:
    20
    Messages:
    357
    Likes Received:
    0
    Trophy Points:
    30
    "And Adobe needs to hire better mathematicians and/or software engineers if they can't parallelize multiple partial-differential equation solutions."

    I was thinking the same exact thing -
     
  8. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    The amount of RAM has nothing to do with how many cores are utilized. Or with the speed at which RAM can be accessed.

    What the article is saying is twofold:
    1: Not all tasks on a CPU can be distributed to multiple cores. Any programmer can tell you that. Some tasks are just inherently serial, and there is nothing to be gained by adding more CPU cores.
    2: Even with tasks that can be parallelized, that may not make a difference if the bottleneck is memory access. RAM is terribly slow compared to CPUs. There are a lot of clever tricks employed to make RAM seem faster than it is, but ultimately we get what, 12 GB/s of bandwidth on even the fastest dual-channel DDR2 system? That's not much compared to the amount of data two or four CPU cores can churn through. Again, for some tasks this is a problem, for others it isn't.

    Because it's not always possible. Some things just have to be computed serially. If you have a series of operations, and each one depends on the result of the previous one, there's just nothing to do. You can only use one core at a time, because only one task is ready for execution at any time.
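    The serial-dependency point above can be sketched in a few lines of Python (my illustration, not from the thread): a chain where each step needs the previous result can only ever occupy one core, while a list of independent items could be split across cores.

    ```python
    # A dependency chain: step N cannot start until step N-1 has finished,
    # so no amount of extra cores helps here.
    def serial_chain(x, steps):
        for _ in range(steps):
            x = (x * 3 + 1) % 1000  # made-up operation; each step consumes the last result
        return x

    # Independent items: no entry depends on another, so this loop *could*
    # be split between two cores, each taking half the list.
    def independent_map(items):
        return [(x * 3 + 1) % 1000 for x in items]

    print(serial_chain(7, 4))          # 607: 7 -> 22 -> 67 -> 202 -> 607
    print(independent_map([7, 8, 9]))  # [22, 25, 28]
    ```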

    And of course, in the cases where it *is* possible, it still costs a lot of money to rewrite things. So in those cases, it's still very much a gradual process.

    True, *if* that is the problem.
    But I suspect that such equations are not the main bottleneck in Photoshop. As they say, it's an application that consumes a lot of memory bandwidth. If you have to iterate through a 50MB image file, then it takes a certain amount of time to just access all the data in RAM. And while the operations that have to be performed on each pixel may be able to be parallelized or distributed between cores, that may not help if each core just has to spend that much longer waiting for the data they requested.
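    The arithmetic behind that paragraph is easy to check; a back-of-envelope sketch using the figures from the post (50 MB image, ~12 GB/s of bandwidth):

    ```python
    # Time to stream a 50 MB image once through RAM at ~12 GB/s.
    image_bytes = 50 * 10**6
    bandwidth_bytes_per_s = 12 * 10**9

    ms_per_pass = image_bytes / bandwidth_bytes_per_s * 1000
    print(round(ms_per_pass, 2), "ms just to touch every byte once")  # ~4.17 ms

    # The catch: both cores share that one memory bus. If a filter is purely
    # memory-bound, splitting it across two cores doesn't add bandwidth, so
    # the pass still takes roughly the same wall-clock time.
    ```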

    It depends. But it's no secret that memory speeds are lagging horribly behind CPU speeds. For many applications, this can be hidden, but sometimes, if what you really need is lots of memory accesses, there's just not much to do, and adding more CPU cores won't improve the matter.
     
  9. Mamba

    Mamba Notebook Guru

    Reputations:
    6
    Messages:
    64
    Likes Received:
    0
    Trophy Points:
    15
    What I get from the article is what I suspected: that if you already have a Pentium M 1866 MHz CPU running under 30W (as I do) there's very little benefit to going to a slightly hungrier, hotter chip that does things just a wee bit faster in most instances. I'll put my wallet back in my pocket and wait a few years. My suspicion is that the doubling of cores from 2 to 4 (then 8, 16?) is not going to be the right road forward.
     
  10. RogueMonk

    RogueMonk Notebook Deity

    Reputations:
    369
    Messages:
    1,991
    Likes Received:
    0
    Trophy Points:
    55
    Give me a break. The article is bad journalism: a flashy title that doesn't have anything to do with the content.

    The guy is talking about 8 and 16 core machines, not dual core. In fact, except for the flashy title and intro paragraph, the article doesn't even mention dual core CPUs.

    duh.
     
  11. Zellio

    Zellio The Dark Knight

    Reputations:
    446
    Messages:
    1,464
    Likes Received:
    0
    Trophy Points:
    55
  12. Gator

    Gator Go Gators!

    Reputations:
    890
    Messages:
    1,889
    Likes Received:
    0
    Trophy Points:
    55
    Um...well I have to disagree with that. Making multiple core processors just makes sense, both hardware wise and software wise. Nowadays, many apps are multi-threaded and multi-core processors are designed to optimize performance on them. You really have to look into the multi-core architecture to appreciate the simplicity and elegance of the design, which for all intents and purposes blows the old single core Pentiums and AMD's away on the right software...as well as for multitasking.
     
  13. Mamba

    Mamba Notebook Guru

    Reputations:
    6
    Messages:
    64
    Likes Received:
    0
    Trophy Points:
    15
    You're changing the subject from dual core to 64-bit, but in any case, I disagree. Adobe are making a sound business decision, one that I'd make too. They'd be totally insane to build a 64-bit version of their software when, and I quote: "...the number of Photoshop customers running the 64-bit version of Vista will remain very tiny over the next couple of years."

    1) The number of people who actually multitask is very small. I work all day on mine and hardly ever do.

    2) "On the right software" - that's the key phrase. When a significant percentage take advantage of dual core, that'll be the time to move, and I bet the processors will be much more mature then: cooler, cheaper, etc.
     
  14. Zellio

    Zellio The Dark Knight

    Reputations:
    446
    Messages:
    1,464
    Likes Received:
    0
    Trophy Points:
    55
    I was mentioning 64 bit because they want to be lazy and bash everything.

    64-bit is slower because it's bigger, but it boots faster and is capable of more complicated stuff. It doesn't exactly use more memory. Adobe is telling half-truths.
     
  15. Mamba

    Mamba Notebook Guru

    Reputations:
    6
    Messages:
    64
    Likes Received:
    0
    Trophy Points:
    15
    Oookay then :rolleyes:
     
  16. Gator

    Gator Go Gators!

    Reputations:
    890
    Messages:
    1,889
    Likes Received:
    0
    Trophy Points:
    55
    Alright well if you only run single threaded apps one at a time, then a dual/quad core processor would not help you significantly. I also will not inject 64 bit processing into this conversation, although it is a very good idea.

    What Windows XP and Vista do to take advantage of dual core is place different processes on different cores, so that while the figurative left hand is reaching for a hot cup of Java, the right hand can be thumping away at a web browser process. Essential Windows processes are also allocated in this way. In effect, while software programmers can specifically code for multicore processors to take full advantage, they don't have to. If either 1) you run a program with multiple threads (any type of Internet chat program, for example) or 2) you run multiple programs at the same time (I'm sure you have more than just your browser open right now), you can take advantage of dual/quad/octo-core goodness. Hope that helps clear things up.
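    That scheduling idea can be sketched with Python's standard process pool; the prime-counting job below is a made-up stand-in for "a real program":

    ```python
    from concurrent.futures import ProcessPoolExecutor

    def count_primes(limit):
        # Deliberately CPU-bound busywork standing in for a real workload.
        count = 0
        for n in range(2, limit):
            if all(n % d for d in range(2, int(n ** 0.5) + 1)):
                count += 1
        return count

    if __name__ == "__main__":
        # Two independent jobs handed to two worker processes; on a dual
        # core, the OS scheduler is free to place one on each core.
        with ProcessPoolExecutor(max_workers=2) as pool:
            totals = list(pool.map(count_primes, [10_000, 20_000]))
        print(totals)  # [1229, 2262]
    ```

    Neither job was written "for dual core"; the parallelism comes from the OS placing two ordinary processes on two cores, which is exactly the point being made above.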
     
  17. Mamba

    Mamba Notebook Guru

    Reputations:
    6
    Messages:
    64
    Likes Received:
    0
    Trophy Points:
    15
    I agree with all you say, and if I had enough (any?) bottlenecks or slowdowns in my daily computing life, the way I used to have with old CPUs in the 1990s, I'd race out and buy a 2 core CPU machine immediately. But I don't, and nor do 96%+ of people.

    That's why I recommend sitting on your hands for a while, if your current machine is doing the job.
     
  18. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    I disagree.
    Multicore processors don't "just make sense", because many apps don't *need* the extra processing power, whether or not they're multithreaded. Very few of the apps that people regularly use actually require the amount of CPU power we have available, and so the *practical* real-world benefit of going multicore is dubious for most PC users. For servers, for scientific calculations, for databases, all of this is certainly a big help. For your average user who wants to play the occasional game of HL2, and who spends most of their time in MS Word with Winamp running in the background, it makes no difference, and it's foolish to pretend it does.

    Next, about the elegance... I disagree there too.
    Multicore is *not* an elegant solution. It's an admission of defeat. It's the fact that "Ok, we can't scale CPU's the proper way any longer. Now we have to use techniques that put the burden on the programmers to utilize them properly". For the last 20 years or so, all improvements in CPU design have been targeted at making things run faster while keeping it simple for the programmer. Going from, say, the 286 to the Athlon 64, performance has exploded, while things have become *simpler* for the programmer. Simpler and more reliable. We no longer have to worry about register allocation, about controlling the cache manually, about reordering instructions for best performance, or about running out of memory, or using a pagefile for virtual memory. All this is done for us, on the CPU.

    *That* is elegant. Being able to run such complex beasts as modern CPU's are, without forcing the programmer to navigate around every potential pitfall and carefully keep track of everything that happens in the CPU, that is an achievement CPU designers can be proud of.

    Giving up and saying "Oh well, we'll just let the programmer deal with this. We'll offer lots of raw performance, and tough luck if it can only actually be utilized 3% of the time. That's not our problem," is *not* elegant.

    Multicore is the second-best option only.
    It adds performance, yes, but only in some relatively rare and specialized cases, and only if the programmer is willing to spend countless hours making use of it, and debugging the much more complex software that is suddenly required.

    Compare that to CPU pipelining, which has been done on PC's since the 486. That was an improvement that benefited *everything*. No matter the program, no matter the programmer, no matter what you do with your PC, CPU pipelining enabled a vast performance leap.
    Dualcore? It enables a mere doubling in performance, and *only* in the rare case where the task being executed is *perfectly* suited for multithreading, when the programmer has been able to take the extra time to carefully construct the program to take advantage of this, avoiding bugs and anything that would make performance degrade to single-core levels again (which happens depressingly easily, and which is why I say multicore is a "second-rate" optimization. Sure, it's better than nothing, but it requires too much effort, and the rewards are too small for it to be truly celebrated as the savior of performance).
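    The "mere doubling at best" argument is Amdahl's law; a quick sketch of the formula (my illustration, not from the post):

    ```python
    def amdahl_speedup(p, n):
        # p: fraction of the work that can run in parallel, n: core count.
        return 1 / ((1 - p) + p / n)

    print(round(amdahl_speedup(1.0, 2), 2))   # 2.0  -> the perfectly parallel ideal
    print(round(amdahl_speedup(0.5, 2), 2))   # 1.33 -> half-serial work on 2 cores
    print(round(amdahl_speedup(0.5, 16), 2))  # 1.88 -> still under 2x on 16 cores
    ```

    The last line is the point: once part of the task is serial, piling on cores stops helping long before the core count runs out.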

    That is a half-truth if I ever heard one. It can easily use more memory. Pointers are bigger, because memory addresses are 64 bit big, rather than 32. Depending on the application, that may or may not make a big difference.
    64-bit isn't necessarily slower, though. The Athlon 64 runs faster in 64-bit mode than in 32. There's no reason why it'd boot faster. (On the contrary, it has to switch from 16 to 32 to 64 bit mode during boot. That doesn't take more than a millisecond or two, but it's still more work than has to be done in 32-bit mode.) And it's capable of exactly the same stuff. It just makes it a bit more convenient and efficient when doing some of them. (Because 64-bit data doesn't have to be "emulated", and because more memory can be addressed without any trickery.)
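    The pointer-size half of this is easy to see from a running interpreter (a sketch: it prints 8 on any 64-bit build, 4 on a 32-bit one):

    ```python
    import ctypes
    import struct

    pointer_bytes = ctypes.sizeof(ctypes.c_void_p)  # native pointer width
    print(pointer_bytes, "bytes per pointer")       # 8 on a 64-bit build
    print(struct.calcsize("P"))                     # same width via struct

    # Pointer-heavy structures (linked lists, trees, object graphs) grow
    # when every pointer field doubles from 4 to 8 bytes, which is the
    # "can easily use more memory" effect described above.
    ```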

    I'd like you to hit ctrl-alt-del. Open task manager, look at the performance tab, and tell me your CPU usage. What is it? 2%? 5%? Is that what you call "multitasking"?
    Now, open MS Word. Set Winamp to play an MP3. You've got a firewall and antivirus in the background, yes? Better load up MSN too. Got Firefox up too? Better open four new tabs. Download a PDF, and open it. And while we're at it, check your email too. Every gamer needs daemontools running in the background too. And XFire.
    Now tell me your CPU usage. Still 5%?
    Gosh, we really need multicore to keep up with all this multitasking, huh?

    Hope you get my point. ;)
    There is a big difference between needing (or gaining any benefit from) multiple cores, and merely running multiple programs at the same time.
    And it is a common misconception that "multitasking" is so much better on a multicore system. How many threads can a dualcore system run simultaneously? 2. How many processes are running currently? In my case, 33. That probably means a few hundred threads.

    So, we have a few hundred threads. They have to be distributed on two cores. That still means *a lot* of threads having to run on each core. In fact, there are so many that it doesn't really look all that different.
    If a CPU core can't handle 200 threads at an acceptable speed (which most dualcore fanatics seem to suggest), then why would we expect it to be able to handle 100? That's still an awful lot. It still gives a ton of context switching, of *pretending* to be able to run threads in parallel, when they're really executed serially.

    There just isn't such a big difference. Each core *always* has to shuffle dozens and dozens of threads. And it's capable of doing that very efficiently, without seeming slow or laggy or anything else. So if that is all you need it to do, one core is just as good as two, really.

    The idea that "if you run multiple processes, you need multicore" is just hopelessly flawed. We are *always* running multiple processes. We've done that ever since the OS was invented. It was the main *reason* the OS was invented. And we certainly don't need one core per process for that. And we certainly aren't *getting* one core per process either, not even with dual, quad or octocore.
    We're still stuck with countless threads and processes competing over the same few cores. That's not a problem that's unique to singlecore systems. And it's not even a problem at all.

    Only if you need it to shuffle multiple threads that actually require significant CPU time, does the picture change. And for most everyday users? That doesn't happen at all, unless they're playing games.

    Just like any half-decent OS will do. But when you have a *total* of 5% CPU utilization, how much does this matter? Do you think applications will run faster if each core is being utilized 2.5% of the time, than if one core is running 5%?
    It makes no difference whatsoever. The CPU spends the majority of the time idling in any case.

    Now, if multiple threads or processes actually use lots of CPU time, you're correct, multicore is the way to go. But the "problem" is, that doesn't happen often, as I think I've illustrated above.
    Tell me of an everyday tasks that actually exercises both cores. Something most computer users do regularly, which actually puts load on multiple cores. Playing one of the couple of games that take advantage of multithreading is one example, yes. It's actually the only common example I can think of.

    Do you know how much time the average CPU spends idling?
    Well over 90%. Even when you take gaming and other CPU-heavy activities into account. The CPU still has nothing to do the vast majority of the time.
     
  19. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    meh. I agree with part of what you said, but the COMPILER takes care of most of the heavy lifting now. Optimizing for x86 is still a pain, and it's completely different from optimizing for the Power architecture, yet both can be fed the same code through a specially designed compiler to get good-performing programs. It still depends entirely on the programmers to get good performance out of a machine, just as it always has. The more power a machine has, the more things a programmer can do. Multiple cores are a paradigm shift from the horrid loops most games and other programs are currently written in, and I think they will usher in a more granular and scalable program structure. Multiple cores FTW.
     
  20. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    The compiler does *a lot* of the heavy lifting, yes. But don't underestimate how much the CPU has to do. The CPU still performs out-of-order execution, it rearranges loads/stores, it pipelines execution for us, and it hides the awful complexities this introduces: all the control and data hazards, having to forward data or stall the pipeline, which the programmer *could* be forced to worry about. It holds buffers for the OS to keep track of virtual memory, it controls the cache. It does so much for us that we often forget about it and assume that "a CPU is a CPU; the only way to improve on it is to add a sister CPU so we have two cores".

    My point was that the way CPUs have progressed in the last 20 years has made it really hard to write *slow* code. There are still plenty of optimizations that have to be done by the programmer, but it is virtually impossible to write code that just makes the CPU crap out and run at a snail's pace.

    Maybe, maybe not.
    But keep in mind how many different models there are for exploiting parallelism. Scientists have spent decades working out much better paradigms, programming languages and compilers that could allow us to truly take advantage of multiple cores. Without having to program specifically with multiple threads.

    I guess you could say it's the concept of threads that I think is flawed. Or at least, it's flawed in a multicore world. As long as we're stuck in the idea that "parallelism equals threads", we'll never be able to really use multicore systems, other than the current patch job of occasionally outsourcing a task to a separate thread, then hoping it brings us a speed improvement and doesn't cause *too* many bugs.

    What I object to is not the idea of multiple cores as such, but rather the idea that multiple cores are the solution in themselves, and that we just have to "get used to them".

    Multicore is a solution (not *the* solution, but certainly *a* solution), but for it to become truly useful, and scale beyond 2-4 cores, we need to ditch the obsolete idea of monolithic threads containing long serial "programs".

    The Cell processor (which I've spent plenty of time criticising here and elsewhere) is on to something there. I don't think the Cell itself is a particularly well executed example (the dev tools are nowhere near mature enough to support this paradigm, and they had to cut far too many corners with the hardware), but that's where the future of multicore is. The dualcore CPUs we have are just an ugly transitioning phase we have to get over. It's that uncomfortable time after single-core chips ran out of steam, but before we've really taken the consequence and performed the *real* paradigm shift.
    (Going from one to two threads is not a paradigm shift. Ditching threads entirely is)

    Before we can perform this paradigm shift and really make use of multiple cores, really allow that paradigm to scale (to 64, 128 or 10,000 cores), we need to ditch a lot of baggage. The idea of threads has to be discarded. The idea of imperative programming, C, C++, C#, Java, most of the language most people know, have to be thrown out.
    Until we do that, multithreading is a hack, a quick fix to allow us to keep moving for another year or two before we drown in the complexity it causes.

    (And to everyone else, I apologize for the two far too long posts ;))
     
  21. Gator

    Gator Go Gators!

    Reputations:
    890
    Messages:
    1,889
    Likes Received:
    0
    Trophy Points:
    55
    Jalf, I think you are putting the cart before the horse. An underutilization of hardware resources does not imply that anything is inherently wrong or second-rate about the hardware. Just because software engineers have not gotten around to writing programs that take full advantage of an increasingly popular hardware architecture does not mean this is the fault of the hardware itself. The Core Duo architecture, for example, is a vast improvement over the older Pentium architectures in terms of workload/cycle and power consumption. Unless you are an ardent and immovable critic of Intel, even you must accept this fact, given all of the benchmark and real-life testing done on this particular generation of chips. For the first time in a long time, the performance gains observed were not the result of miniaturization by pushing silicon to the edges of molecular reality, or of pumping more power into the processor, but of a vast improvement in processor architectural design. That is why I call it an elegant solution.

    Now, while multi-processor systems have been around for a long time, multi-core systems are just beginning to become rapidly available to common consumers rather than just server/system admins. Which means that programmers suddenly face the daunting task of implementing support for this concept in their code. Yes, multithreaded software is more complicated to design and implement than single threaded software, but it is the logical next step in programming, and multicore processors give us the best chance to fully take advantage of that. If you intend to argue that we should stick to single threaded application programming, then I challenge you to write me any type of networking program that allows user input into a GUI interface while monitoring network traffic for incoming messages. Make no mistake about it, threading is the easier road taken in most modern applications of this type, not the harder. Ask yourself if the very smart people at Intel would have thought "Hmmmm, this multi-threading thing is really becoming popular and necessary in the new software prevalent on the market, maybe we should make a chip that makes it harder for the programmers to implement it and then ship them out in vast quantities".

    And let's face it, nowadays a good processor will not be running full tilt all the time when given a few programs to run. Back in the days before dedicated graphics cards, the CPU would be a little busier processing the graphics information. But have you ever tried to use a Pentium machine that showed >90% CPU usage? I have, and it's when I tried to install and run a program that was designed for processors a few generations down the line. Basic I/O became too much, and I soon lost user access to the OS after that. CPU time is precious, and it should only be used when it is needed. When it is not needed, it should be idle. The fact that we observe 5% usage of a dual/multicore processor most of the time is neither an indication of hardware incompetency nor of compromised design. It is more an indication that most programs just aren't written to take advantage of the hardware yet, in which case please see what I wrote before.

    Also, yes, we will still have to context switch since we can't just dedicate a single core to a single process or thread, but the point is that a multicore approach potentially allows us to bypass much of the blocking and idle time that a single core system forces upon us inherently. And no, individual software makers will most likely not have to do this core-management in their code: the responsibility is placed on the OS programmer. And there is a reason why we pay companies like Microsoft billions of dollars a year to do their jobs. It is by no means an impediment to software development as you seem to have suggested. And all things are hard before they are easy; incorporation of new PC hardware is no exception.
     
  22. Mamba

    Mamba Notebook Guru

    Reputations:
    6
    Messages:
    64
    Likes Received:
    0
    Trophy Points:
    15
    Thanks for admitting that anyone buying these white elephant chips is actually wasting their time.

    No. My single core Pentium M 1866 runs cooler than the dual cores from Intel and AMD -- I've checked.

    Elegance betokens beautiful simplicity. I think that introducing multiple CPUs looks more like a hack than a profound insight or breakthrough, and it's going to generate god-awfully complex and ugly programming code if properly utilized.

    "Daunting task"? "More complicated"? When you use "elegant" to describe something scientific, technical, or mathematical, elegant means "gracefully concise and simple; admirably succinct".

    SIMPLE != COMPLICATED

    Am I getting through?

    Well done, you may have found an application for multiple cores.

    The "geniuses" at Intel have made a lot of bad decisions in the last few years; this could be one more.

    I couldn't agree more, and that's why Joe Blow does not need to buy these machines, and won't need to for many years.

    You still haven't successfully addressed the argument that multicores are unnecessary for the vast majority of users. I don't know anyone who complains their PC is too slow or clogs up (okay maybe my son, an avid gamer with a 4yr old 1.6 Pentium Centrino, who likes to run games flat-out with every possible texture turned on). But almost nobody else using computers for business or personal computing needs them. So why buy them, other than to line the pockets of shareholders of AMD and Intel?
     
  23. Gator

    Gator Go Gators!

    Reputations:
    890
    Messages:
    1,889
    Likes Received:
    0
    Trophy Points:
    55
    I have no idea how to respond to you, simply because you lack the fundamental knowledge of computers, programming, and technology in general. I'm sure if we were back in the 1800s, you'd be the guy loudly screaming that the average Joe Blow doesn't need a box on four wheels when a horse does just fine.

    I am both pained and amused by this attempt to use my example to Jalf against me. If you had any idea how many times you've used an application of this type, you would not try to twist it into a seemingly obscure occurrence in computing. In fact, you are probably using a multitude of these programs at the very instant you are reading this message.

    I have addressed your concerns and whatever doubts you seem to have about multi-core processors by giving a relatively non-technical and truthful explanation. That you somehow try to twist it around to show that a budding---no scratch that, brand new---genre of consumer chips is somehow flawed because complete software support didn't magically appear overnight is beyond me. But it's a free country, and if you want to stop lining the pockets of AMD and Intel shareholders, then be my guest and break out an abacus and paper notebook for your computing needs. Meanwhile, if you care to follow Jalf's example and point out a particular concept of multicore processor design that you'd like to criticize, I'll be here.
     
  24. BaNZ

    BaNZ Notebook Consultant

    Reputations:
    13
    Messages:
    172
    Likes Received:
    0
    Trophy Points:
    30
    Thanks guys for sharing all this information. I have actually learnt quite a bit even though I have no knowledge of programming at all.

    But I do think I'm a heavy user. I've got lots of programs running at the same time, e.g. Skype, MSN, mIRC, an FTP server, Outlook, Firefox, IE, Winamp, VMware with Vista, OS X and Ubuntu, Word and Adobe Reader, BitComet, eMule, VNC, and basically loads of other misc programs. The bottleneck seems to be the I/O, especially when multiple OSes are running.

    I know it only uses less than 5% of the cpu most of the time unless I'm encoding a video or loading another OS in vmware.

    I have a question: if I'm playing a computer game and I go to Task Manager to set the CPU affinity to 1 CPU, I'll see that it uses that CPU at 100% at all times when I'm in the game. The other core will idle most of the time. Does this actually help, since I have an idle core which can be used for other programs when needed?
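    For what it's worth, pinning does work the way described here: the pinned game saturates one core and the OS schedules everything else on the other. What Task Manager's "Set Affinity" dialog does can also be scripted; this sketch uses os.sched_setaffinity, which Python's standard library only exposes on Linux (on Windows the equivalent is the Task Manager dialog or the SetProcessAffinityMask API).

    ```python
    import os

    def pin_to_core(core):
        """Restrict the current process to a single core, where supported."""
        if hasattr(os, "sched_setaffinity"):   # Linux only in the stdlib
            os.sched_setaffinity(0, {core})    # 0 means "this process"
            return os.sched_getaffinity(0)     # the new mask, e.g. {0}
        return None                            # Windows/macOS: not exposed here

    print(pin_to_core(0))
    ```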

    I'm using a Core Duo 1.66 laptop, and I hardly notice any difference from my single core AMD Barton desktop.
     
  25. Mamba

    Mamba Notebook Guru

    Reputations:
    6
    Messages:
    64
    Likes Received:
    0
    Trophy Points:
    15
    LOL! I programmed mainframe computers for 20 years. Haha.

    When you stoop to ad hominems like this, you've already lost the argument.
     
  26. Arla

    Arla Notebook Deity

    Reputations:
    35
    Messages:
    1,073
    Likes Received:
    0
    Trophy Points:
    55
    This is where I 100% agree with you. I will never bother upgrading my PC to the newest, hottest thing if my current machine is doing the job; why would I, unless I felt I had money burning a hole in my pocket?

    So you know all about multithreading then, since mainframes have been doing it for years; I'm not quite sure I understand why you are so against it.

    Again, for YOU personally it sounds like you don't need to bother upgrading (and again, I'd recommend that anyone who asks not upgrade UNLESS their current machine isn't fulfilling their requirements). However, for those who are looking to upgrade, I would 100% recommend a dual core, and probably a 64-bit dual core too, since that's what will be used more and more in the future.

    Yes, right now quite a lot of programs don't take full advantage of the new capabilities of multicore and 64-bit; however, that doesn't mean they will continue not to.

    Jalf, do you have any links to more information on this? I'm curious to read more and try to understand what you are talking about better.
     
  27. boon27

    boon27 Notebook Evangelist

    Reputations:
    4
    Messages:
    340
    Likes Received:
    0
    Trophy Points:
    30
    That's one reason not to buy a dual core: most of the programs people use would never require that much CPU. But buying a less expensive laptop with a Pentium M is getting more difficult now, so it's like customers are being forced into a dual core because that's what's available in the stores.
     
  28. Jumper

    Jumper Notebook Deity

    Reputations:
    65
    Messages:
    840
    Likes Received:
    0
    Trophy Points:
    30
    I bought an X2 4200 two months after they came out. Paid $585. Put 2GB of RAM in the box too, with a fresh Windows install.

    12 months later I was able to get ahold of another 512MB of RAM to put into my old Barton 2500+, bringing it up to 1GB. I also did a full Windows reinstall there to re-use the machine as a MediaPC.

    That is when I realized I had been had. The speed difference was minimal; the RAM had been the bottleneck. I could have just upgraded the RAM and added a new video card.

    Have any of you looked at the way XP load-balances dual cores lately??? If you run ONE instance of Prime95, it takes up 52% of both cores instead of 100% of one core. Windows bounces it back and forth between the two cores, moving the cache information at the same time. This actually makes the process run slightly slower than on an equivalent single-core system.

    Did you ever notice that there were only ever about 6 good single vs dual core CPU benchmarks with real-world tests? And no one bothers to even do them anymore?

    I love how all of you have been had. Until the Windows thread-balancing system is fixed and programs actually use multiple threads, dual core is pretty much useless for the home user, and mostly useless for gamers.

    AMD and Intel were worried people were going to *stop* buying new CPUs. That's why AMD discontinued the A64-3000, which was a popular low-end/overclocking chip: to force people to buy the more expensive parts. Well, isn't this a wonderful solution: convince people that they need two cores, and get all the enthusiasts to go out and buy new CPUs to keep up with the guy next door.
     
  29. Vasichko

    Vasichko Notebook Consultant

    Reputations:
    0
    Messages:
    109
    Likes Received:
    0
    Trophy Points:
    30
  30. Arla

    Arla Notebook Deity

    Reputations:
    35
    Messages:
    1,073
    Likes Received:
    0
    Trophy Points:
    55
    Not sure what Prime95 is, but running most CPU-intensive applications on my PC, I get 100% use on one core and very little on the other; maybe Prime95 is just incredibly badly coded?

    I think the problem is that most people are buying into the Intel/AMD hype that dual core means everything will run "twice as fast", which is a complete load of rubbish. Research, people: determine what you REALLY need, what you need to do, what you will be doing, and how quickly you NEED to do it.

    For me, I do quite a bit of video encoding, so dual core is great: most video encoders are written to take advantage of both cores and really do run nearly twice as fast. As for average users, hell, most of them don't need half the processing power on sale today, because all they use with any great frequency is maybe Word and the internet.
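Arla's observation, that encoders gain nearly 2x while ordinary programs gain little, is exactly what Amdahl's law predicts: speedup is limited by the fraction of work that can actually be split across cores. A quick illustrative sketch (the 95% and 10% parallel fractions are made-up numbers for illustration, not measurements of any real program):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Overall speedup when only part of a job can use extra cores."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# A video encoder whose work is ~95% parallelisable:
encoder = amdahl_speedup(0.95, 2)   # ~1.9x: close to "twice as fast"

# A typical desktop app whose work is ~10% parallelisable:
office = amdahl_speedup(0.10, 2)    # ~1.05x: barely noticeable
```

The same formula also explains the thread's broader point: no amount of extra cores helps the serial fraction, which is why "twice as fast" only ever applies to well-threaded workloads.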
     
  31. Jumper

    Jumper Notebook Deity

    Reputations:
    65
    Messages:
    840
    Likes Received:
    0
    Trophy Points:
    30
    The only time I have ever used my dual core to its full potential has been video encoding while gaming (I have to set affinity to one core on the encoder or it will use up both), running MATLAB while gaming, etc. Beyond that, it's only good for fast video encoding.

    However, for the amount of time I spend doing those things, it was not worth the additional cost. Prices have dropped now to where it may be, and you can't actually buy single-core CPUs anymore anyway, so I guess we don't really have a choice.

    At least some of the blame for multi-tasking issues needs to fall on Windows, since the Windows scheduler really, really sucks.
     
  32. Mamba

    Mamba Notebook Guru

    Reputations:
    6
    Messages:
    64
    Likes Received:
    0
    Trophy Points:
    15
    Bingo!

    If notebooks became cheaper and cheaper, we'd win, but there'd be a lot of unhappy shareholders.
     
  33. Arla

    Arla Notebook Deity

    Reputations:
    35
    Messages:
    1,073
    Likes Received:
    0
    Trophy Points:
    55
    I'm not sure this is an argument against it though.

    It's like complaining that you can't buy a 386 laptop anymore because no one makes one: "They are FORCING me to buy at least a Pentium M (I think that's about the slowest currently on the market), despite the fact that all I want to do is run DOS programs; I don't even want to run Windows."

    Edit: To make it slightly more realistic, I'm not sure most "normal" users need anything more than the ability to run Windows 98 (maybe XP) and an internet browser; that's what MOST people seem to use computers for nowadays, so anything more powerful than that is probably overkill and a "waste" for the average user.
     
  34. Mamba

    Mamba Notebook Guru

    Reputations:
    6
    Messages:
    64
    Likes Received:
    0
    Trophy Points:
    15
    Face facts: anything more powerful than a Pentium M in a notebook is completely unnecessary for the vast majority of users.

    Despite that, the enthusiasts and those who must have the very latest will always be prepared to pay big bucks for the newest processors. And major vendors like HP and Dell are forced to keep the latest processors in all their offerings because of the need to seem cutting edge, the greater profit margins, and secret deals with the chip makers.

    To escape this arms race, from which everyone profits but you, you essentially have to build your own computer from parts, saving lots of money and getting to specify it to your exact liking.
     
  35. Gator

    Gator Go Gators!

    Reputations:
    890
    Messages:
    1,889
    Likes Received:
    0
    Trophy Points:
    55
    Well you certainly managed to hide your technical expertise very well.

    Au contraire, I believe the analogy applies perfectly in this case.

    I have said it before and I'll say it again, if all you do is some light computing---such as word processing---then you will not notice a difference. And before you jump in here and attempt to cite this as evidence that the "average joe blow" does not need multicore processors, I'd ask you to curb your enthusiasm and cite me a source that says the average joe blow only uses word processors and maybe looks at a website or two.

    In the meantime, since you are such a fan of the Pentium M, go ahead and shell out $350 for a 760 Dothan 2.0GHz Socket 478 processor. I spent my $300 on a T7200 2.0GHz Socket 479 Core 2 Duo, the latest in dual core processor technology. Yes, I'm sure the average joe blow would be so much happier spending more on a previous-generation, single-core processor that may or may not run a few degrees cooler, and that has been shown to be much slower in benchmarks for specific applications. Don't believe me? Do a price search...I hope you've heard of Newegg...or ask some of the resellers on this forum. And maybe read an article or two from a reliable source, like this one from AnandTech.
    So much for your argument for the Pentium M, and for not lining the pockets of AMD and Intel.

    When you have something specific to discuss about the multicore processor concept, I'll be glad to spend more time debating the specifics of it. But seeing as how you have absolutely nothing technical to contribute to the discussion, and have not attempted to, nor any sources citing weaknesses of multicores in general, there is nothing really to debate. Right now all you've got is an opinion, and you do know what they say about opinions don't you? Please do a little research on the topic before attempting to lead some kind of forum revolution against AMD/Intel and their latest chip.
     
  36. moon angel

    moon angel Notebook Virtuoso NBR Reviewer

    Reputations:
    2,011
    Messages:
    2,777
    Likes Received:
    15
    Trophy Points:
    56
    Celeron-M, Sempron, Turion ML/MK, Core Solo... You can still buy Pentium-Ms too, and I believe there's no LV/ULV Core Duo yet either, so a lot of the hyper-portable machines (Portege R200 etc.) still use Pentium-Ms.

    Having said that, you are entirely right that you can't buy a powerful PC these days without getting a dual core, probably a Core Duo. The only one I can think of is the Alienware Aurora m9700, which still uses Turion MLs.
     
  37. lixuelai

    lixuelai Notebook Virtuoso NBR Reviewer

    Reputations:
    463
    Messages:
    2,326
    Likes Received:
    1
    Trophy Points:
    56
    This thread is pretty ridiculous. First of all, the "average joe" is not that much different from some of you very 1337 mainframe programmers. If people want to buy the latest dual core processor, then they should be able to. If you want to stay with the Pentium M, then go right ahead; no one is stopping you. Heck, why not buy 100 Pentium M laptops so they last you a lifetime? I can see the "average joe" benefitting from dual core: they can run a virus scan while working lag-free, and they can run multiple programs at once. Seriously, none of you are "power users". Do people really, really need dual core? Probably not. But then do we really, really need the Pentium M? Do we really need the Pentium 4? 3? 2? 1? 0? -1?? Heck, we should probably all go back to the Stone Age. There was no written language then. Perfect: no need for the internet or word processing!

    Btw, some genius mentioned Prime95, I see. There is a very good reason it's called Prime95. Might want to check out this program called Orthos :rolleyes:
     
  38. Mamba

    Mamba Notebook Guru

    Reputations:
    6
    Messages:
    64
    Likes Received:
    0
    Trophy Points:
    15
    This is not an entirely technical topic. The technical issues are well understood, and I thought the chap who debated you earlier on the specifics of threading etc got the better of you without breaking a sweat. I'm more interested in the economics of it, the psychology of the market, and the foisting of unnecessary products on unsuspecting consumers.

    But I can quite understand why you'd like to avoid discussing these issues.

    That's a straw man argument. I do much, much more than mere browsing and word processing on my mid-range Pentium M, including Java programming, watching videos, Photoshop editing, etc. I DO NOT face performance slowdowns, bottlenecks, choking or other issues that would make me think I need another, faster, multicore CPU.

    The high price of the Pentium M is an artifact of the market, probably dictated by chip availability issues. It is not necessarily a permanent or inevitable situation, and I would imagine these chips will become cheaper over time, if they are still made, as has happened with other CPUs.

    The article you linked is about top-end desktop CPUs, written by and for geeky fanboyz. Talk about irrelevant!

    PC World's comparison of dual core and Pentium M is more relevant:

    [image: PC World benchmark chart comparing dual core processors and the Pentium M]

    Lol. I seem to have gotten under your skin nicely.
     
  39. moon angel

    moon angel Notebook Virtuoso NBR Reviewer

    Reputations:
    2,011
    Messages:
    2,777
    Likes Received:
    15
    Trophy Points:
    56
    I'd prefer to see a Pentium-M and Core Solo comparison, to see how much comes from the dual cores and how much from the newer, more complex design, irrespective of the number of cores.

    Oh, and why did Intel have to call their chip 'Core'? It makes discussions like this so much harder...
     
  40. Bog

    Bog Losing it...

    Reputations:
    4,018
    Messages:
    6,046
    Likes Received:
    7
    Trophy Points:
    206
    Along with friends, I enjoy making fun of my P4 for its heat output, inefficiency, and other inferior characteristics when placed alongside the Pentium M. But I also enjoy reminding myself that I haven't pissed away my money on a more powerful computer than I need. My P4 runs everything perfectly, and ALL the lag I have experienced has been down to my HDD or (in the case of games) my GPU.

    My personal opinion is that dual core adds multi-tasking capability that is not much better than a P4 w/ HT or a top-of-the-line Pentium M. Certainly any of these processors is adequate for my needs. But for my wants (performance), I would take the money otherwise burned on a Core Duo rather than a Pentium M and spend it on upgrading the slowest part of the system.

    For instance, say I have the option of buying a new laptop (the only reason would be better battery life than my current P4), and I had to choose between the following two laptops at the same price:
    2GHz Core Duo, 1GB of generic RAM, 80GB HDD spinning at 4200RPM
    2GHz Pentium M, 1GB of generic RAM, 80GB HDD spinning at 7200RPM

    I would choose the second option. The second system would offer superior performance even if the Core Duo machine had a 5400RPM HDD. This is assuming that availability of the two processors is the same. Maybe I would even have 50 bucks left over to spend on a good mouse or a different GPU model. I would sacrifice CPU power (the most powerful component of any computer) to upgrade the slowest part of the computer.
     
  41. Gator

    Gator Go Gators!

    Reputations:
    890
    Messages:
    1,889
    Likes Received:
    0
    Trophy Points:
    55

    I applaud you for finally providing some relevant data to this debate, such as it is. And yes, you do get under my skin: not only because of your pretentious attitude and stereotyping of common computer users as "average joe blows", but also because you, my friend, are ignorant and stubbornly unwilling, or afraid, to face the fact that you are. And that is something I can "lol" at very easily, though I choose to feel sorry for you and your inability to comprehend the basics of computing, as well as the tech market trends you seem so determined to argue about.

    Now, about your research: thank you very much for proving my point.
    Look at the statistical categories on the chart you posted. Do you have any idea what the numbers even mean? Of all the categories and tests run, you've managed to find only one test where the Pentium M beat the Core Duo (not the Core 2 Duo, mind you): a WorldBench test, by two points, 99 to 97. Now look at the multitasking test, the media encoding test, and of course your much-vaunted Adobe Photoshop test, which was a single-threaded app, mind you. In all of those tests the dual core processor handily beat the Pentium M, which is, by the way, more expensive. Oh, and battery life? It varies, but that a dual core processor can keep up with or even exceed the battery life of a single core processor is an indication of architectural superiority.

    "Lol" indeed, you've managed to do my research and my work for me by showing everyone here just why the dual core and multicore systems are so much better for nearly everyone. It's both humorous ("lol") and pitiful that you probably lack the capacity to understand that.
     
  42. Mamba

    Mamba Notebook Guru

    Reputations:
    6
    Messages:
    64
    Likes Received:
    0
    Trophy Points:
    15
    You seem to miss the point entirely. I'm addressing myself to the people who currently own 1-3 year old Pentium M laptops, and I'm telling them to sit on their hands because upgrading at this stage does not make sense, as the slightly better scores for multicores in the above test clearly show. Pay attention, if you can.

    Prices? Hmm, I wonder why Intel are keeping the price of their old chips so high? Could it be they are trying to make their newer chips more attractive and price competitive? There's a problem when you have to manipulate the market like that.

    As for you, Gatordoody, you have such an odious habit of resorting to personal insult and ad hominem argument that I won't engage further with you, sonny. Cheers.
     
  43. Gator

    Gator Go Gators!

    Reputations:
    890
    Messages:
    1,889
    Likes Received:
    0
    Trophy Points:
    55

    No parting shots then, we shall leave it at that.
     
  44. lixuelai

    lixuelai Notebook Virtuoso NBR Reviewer

    Reputations:
    463
    Messages:
    2,326
    Likes Received:
    1
    Trophy Points:
    56
    The second system would not offer superior performance, lol. The difference between a 7200RPM and a 4200RPM drive is big, but it won't come close to compensating for the 2GHz Core Duo vs. the 2GHz Pentium M. A 1.6GHz Celeron M 420 is around as fast as the 2GHz Pentium M, and a 1.2GHz Core Duo with a 4200RPM HDD gets around the same PCMark05 score as the Celeron M 420 with a 7200RPM HDD. I benched my Dell D420 against my Lenovo N100 and Dell 710M; overall the D420 is the fastest of the three, even with the slowest hard drive.
     
  45. Keizafk

    Keizafk Notebook Geek

    Reputations:
    75
    Messages:
    97
    Likes Received:
    0
    Trophy Points:
    15
    Definitely. What about those people who have a system older than the Pentium M, perhaps one getting too out of date to run things smoothly, and who have decided to get a new system because of that? Would you recommend a Pentium M or a Core 2 Duo if the price is the same?

    If you blame Gatordude for not getting your point, you're definitely not getting his point either. The C2D is a better CPU than the Pentium-M, and for that matter a better CPU than whatever else the market offers for laptops. So, if you've decided that it's time to buy a new laptop, why not get a C2D?

    Obviously the average joe running a Pentium-M is probably blissfully unaware of this debate and happy that his Pentium-M runs everything smoothly. It's the people who are more interested (or who are about to purchase a new laptop) who take part in these debates. And, bottom line, what's bad about technology advancing, even if you don't need all the power a T7600 gives you to check your email and write some documents?
     
  46. lixuelai

    lixuelai Notebook Virtuoso NBR Reviewer

    Reputations:
    463
    Messages:
    2,326
    Likes Received:
    1
    Trophy Points:
    56
    His argument is basically that if we were all stuck in the Stone Age, the price of microprocessors would be very, very low. Which is of course true, since by then we wouldn't even have microprocessors.

    I can surf the net, do word processing, watch videos and code on a 200MHz Pentium running Linux. Now why don't we all go back to the Pentium I?
     
  47. Mamba

    Mamba Notebook Guru

    Reputations:
    6
    Messages:
    64
    Likes Received:
    0
    Trophy Points:
    15
    Obviously, get the newer chip. Duh.

    Obviously. I'm not addressing people who have clapped-out PCs and are looking for a totally new machine. But if your machine is recent and runs adequately, there's really not much to tempt you to "upgrade".

    [snip silly stuff]
     
  48. lixuelai

    lixuelai Notebook Virtuoso NBR Reviewer

    Reputations:
    463
    Messages:
    2,326
    Likes Received:
    1
    Trophy Points:
    56
    Lol, can't find a counter-argument, so you're resorting to personal attacks? Isn't this what you accused someone else of doing? ;) :rolleyes:

    Can't I argue the same for, say, the 200MHz Pentium I was talking about?
     
  49. Jumper

    Jumper Notebook Deity

    Reputations:
    65
    Messages:
    840
    Likes Received:
    0
    Trophy Points:
    30
    You know what the average user needs more than anything? Something to make sure Windows doesn't end up with 80 processes starting every time they start the computer, like I see on my family's computers every time I'm home.

    Dual Core is just a way to treat a symptom, it is not a solution to the problem.

    PC World said the #1 reason for home users to have dual core CPUs was to "run the anti-virus, anti-adware, anti-spyware, and firewall on their own core."

    I laughed my ass off at that.

    AMD and Intel decided that dual core was the way to go, and Thou Shalt Not Challenge Thy Manufacturers, I guess. Personally, I think it was mostly driven by self-preservation when they realized they couldn't make single cores faster at a high enough rate, with new ground-up designs taking 3 years. Easier to slap on more cores every 2 years and make everyone buy new CPUs.

    Maybe it's because you are all Intel fanboys and Pentium 4 sucked so bad for so long that you can't even remember having a viable single core system that didn't catch the house on fire.

    They have great marketing folks, although the 'dancing minorities' commercials don't exactly want to make me buy Intel CPUs...
     
  50. Bog

    Bog Losing it...

    Reputations:
    4,018
    Messages:
    6,046
    Likes Received:
    7
    Trophy Points:
    206
    To lixuelai:
    Quote: "A 1.6Ghz Celeron M 420 is around as fast as the 2ghz Pentium M"

    Two things. First, you can't be serious about that statement. The Celeron M is basically a stripped-down Pentium M, lacking SpeedStep for energy saving and having less cache. That makes even the lowest-end Pentium M capable of outperforming the Celeron M you listed with great ease; they are two different CPU families. One is a budget option, the other a mobile performance solution (in its day, that is). So not only does the Pentium M yield better battery life, it also offers superior performance thanks to its larger cache and clock speed.

    Quote: "2GHz Core Duo, 1GB of generic RAM, 80GB HDD spinning at 4200RPM
    2GHz Pentium M, 1GB of generic RAM, 80GB HDD spinning at 7200RPM"

    Secondly, you said that the first system would beat the second in BENCHMARK scores. Real-life situations are never the same. The CPU is by far the fastest and most responsive component in the computer, while the hard disk, being a mechanical device, is at the opposite end of the spectrum. In the case of the first system, the fancy processor would finish its tasks so quickly that it would soon be waiting for the slow HDD to fetch new data. The 4200RPM drive has slower access times and usually a smaller cache, making it a huge bottleneck. In contrast, the second system's hard drive has a shorter seek time, a larger cache for temporary data, and a much faster transfer rate, resulting in big responsiveness and performance gains. The Core Duo would spend most of its time waiting for the hard disk, while the Pentium M would be completing assigned tasks and getting new data from the faster, more robust HDD. Overall, that CPU would spend less time idling and more time working; now that's my kind of CPU. The result: the Pentium M rig would be the faster system. The maximum performance of the system (considering only the CPU, RAM and HDD) depends largely on the slowest component used: the HDD.
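Bog's disk argument can be put in rough numbers. On average the platter must spin half a revolution before the requested sector passes under the head, so spindle speed alone sets a floor on access time (seek time and cache effects come on top of this). A back-of-the-envelope sketch:

```python
def avg_rotational_latency_ms(rpm: int) -> float:
    # Average rotational latency: half a revolution, in milliseconds.
    ms_per_revolution = 60_000 / rpm
    return ms_per_revolution / 2

lat_4200 = avg_rotational_latency_ms(4200)   # ~7.1 ms
lat_5400 = avg_rotational_latency_ms(5400)   # ~5.6 ms
lat_7200 = avg_rotational_latency_ms(7200)   # ~4.2 ms
```

In those few milliseconds a 2GHz core can issue on the order of millions of instructions, which is the sense in which a fast CPU paired with a slow disk "spends its time idling".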

    Quote: "Maybe it's because you are all Intel fanboys and Pentium 4 sucked so bad for so long that you can't even remember having a viable single core system that didn't catch the house on fire. They have great marketing folks, although the 'dancing minorities' commercials don't exactly want to make me buy Intel CPUs...why isn't there a dancing white guy? I guess white guys just aren't ethnic enough to sell CPUs..."

    To Jumper: what are you talking about? I haven't seen many Intel/AMD commercials, I just chose Intel b/c, well, they make better mobile processors. There isn't anything to argue here. So what ARE you talking about?
     