The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    what are we going to do after "Haswell"?

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by 660hpv12, Aug 15, 2008.

  1. 660hpv12

    660hpv12 Notebook Deity

    Reputations:
    63
    Messages:
    1,031
    Likes Received:
    0
    Trophy Points:
    55
    For those of you who don't know, Haswell is a future CPU chip by Intel, due around 2012, built on a 22nm process with 8 cores by default. Ever since the beginning of computing, processing power has been doubling every year or two. That's only achieved by cramming in more and more transistors, and to fit those billions of transistors, the chips have to be manufactured on smaller and smaller processes.

    For those of you who don't know, the "XXnm" in a manufacturing process is the wavelength of the laser used to etch the chip. The laser used to be focused with lenses, but at such small wavelengths the light gets absorbed by the lenses themselves, so now we use mirrors. The problem now is that even mirrors made absolutely perfectly (imperfections bigger than the size of an atom get a mirror rejected) are going to absorb the light. In theory, due to the laws of physics, there cannot be a manufacturing process smaller than roughly 15nm: no room to cram in more transistors, which means no more performance improvements. We will hit the WALL some time next decade. Well, that's the background.
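    A rough way to put that doubling claim in numbers is the sketch below (Python). The 2008 starting figures and the ~0.7x shrink per node are illustrative rules of thumb, not official Intel numbers.

        # Toy projection of the trend described above: transistor count
        # doubles roughly every two years, and feature size shrinks by
        # ~0.7x per node (sqrt(1/2)). Starting values are illustrative.
        start_year = 2008
        start_transistors = 800e6   # rough order of magnitude for a 2008 quad-core
        start_node_nm = 45.0        # 45 nm was the leading node in 2008

        for i in range(6):
            year = start_year + 2 * i
            transistors = start_transistors * 2 ** i
            node = start_node_nm * (0.5 ** 0.5) ** i
            print(f"{year}: ~{transistors / 1e9:.1f}B transistors at ~{node:.0f} nm")

    Run it and the projected nodes land near 32, 22, 16, 11, and 8 nm, which is roughly the roadmap people were quoting at the time.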

    So are we doomed to have no more powerful CPUs and GPUs? With quantum computing and DNA computing not even in their infancy, are we doomed to "crappy 15nm processors" for a really long time? Are advancements in computing stopping? Are my dreams of seeing a "Universe Simulator" going to be crushed? That sux.

    Yeah, that was a rant; I hate being screwed. I always wanted to play a game where absolutely everything in the universe is rendered, but even if we made the game, no computer would be powerful enough to play it. Well, looks like we are going multi-core to the extreme at the very best.

    edit: sorry, I don't exactly know where in the forum this fits, but it certainly isn't Off Topic
     
  2. cjcerny

    cjcerny Notebook Consultant

    Reputations:
    7
    Messages:
    174
    Likes Received:
    0
    Trophy Points:
    30
    Okay, slow down... first, lasers are not used in the production of CPUs. Ultraviolet light is used along with a mask on the silicon wafer; then the unmasked material is removed by etching chemicals.

    Second, I understand that a CPU-making lesson wasn't really your question. Your real question is: what happens when we hit the wall with the current manufacturing paradigm? The answer to that is simple: chipmakers like Intel and AMD and IBM and everyone else stake their survival on huge amounts of research into ways to constantly make their chips faster. They know full well that there are limits to what we currently know about chipmaking. They'll either find ways past those limits or come up with new chipmaking technology. Sure, the laws of physics are immutable, but their resourcefulness in keeping their jobs and their companies' bottom lines healthy is not.
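    For anyone curious, the masking logic cjcerny describes can be shown with a toy sketch in Python. The "wafer" here is just a grid of cells; real lithography involves photoresist chemistry that this ignores entirely.

        # Toy analogy of photolithography: a mask protects parts of the
        # wafer, and the unmasked material is etched away. This is only
        # the masking logic, not real photoresist chemistry.
        wafer = [[1] * 8 for _ in range(4)]        # 1 = material present
        mask = [[c == "#" for c in row] for row in [
            "##..##..",
            "##..##..",
            "..##..##",
            "..##..##",
        ]]

        # Etch: remove material wherever the mask does NOT protect it.
        etched = [[w if m else 0 for w, m in zip(wrow, mrow)]
                  for wrow, mrow in zip(wafer, mask)]

        for row in etched:
            print("".join("#" if cell else "." for cell in row))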
     
  3. powerpack

    powerpack Notebook Prophet

    Reputations:
    7,101
    Messages:
    5,757
    Likes Received:
    0
    Trophy Points:
    0
    More bad news: someday we will all die? I just mean I think you are getting a little ahead of yourself and all. Maybe we will strap 2, 3, 4... chips in at the same time?
     
  4. 660hpv12

    660hpv12 Notebook Deity

    Reputations:
    63
    Messages:
    1,031
    Likes Received:
    0
    Trophy Points:
    55
    Thanks for the correction. The background, aka the "lesson", wasn't meant to be a lesson but rather info to get more people engaged in the discussion. Letting the big companies do their thing is really the best we can hope for. But in five years or so we may no longer be able to use silicon-based chips, while ideas like DNA computing are still floating far away from reality. I have to say, when one sees no hope in the near future, that's where the insecurity comes from.
     
  5. 660hpv12

    660hpv12 Notebook Deity

    Reputations:
    63
    Messages:
    1,031
    Likes Received:
    0
    Trophy Points:
    55
    lol, I guess so. Maybe those are just the thoughts of a man with too much time on his hands. Guess we should be more worried about the end of the world in 2012.
     
  6. chen

    chen Notebook Deity

    Reputations:
    224
    Messages:
    741
    Likes Received:
    1
    Trophy Points:
    30
    I never really thought about that... I always figure that humans have the brains to solve any problem when it needs solving... that's why we haven't been wiped out by viruses yet.
     
  7. derekrb

    derekrb Notebook Enthusiast

    Reputations:
    1
    Messages:
    49
    Likes Received:
    0
    Trophy Points:
    15
    I'm not particularly worried. Already, some people have claimed the ability to build a basic quantum computer....

    Spintronics, DNA computing... I'm not so much worried about what happens as I am excited to see what people come up with.
     
  8. 660hpv12

    660hpv12 Notebook Deity

    Reputations:
    63
    Messages:
    1,031
    Likes Received:
    0
    Trophy Points:
    55
    I wish I could be as laid back as you guys are. I know it's not even my problem, nor do I have the ability to solve it. But the only reason we haven't been wiped out by viruses yet is people like me (well, people who are worried like me but with a lot more know-how). That said, I don't want the computer I buy in the year 2013 to be my last one. Guess we can always shove 500 cores together till we get DNA computers up and running.
     
  9. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    Well, according to the Great Oracle Known As Wikipedia, 16 nm is seen as the node that will transition integrated circuits from CMOS (complementary metal–oxide–semiconductor, what everything is built on now) to nanoelectronics.
    Since Intel and the ITRS have mentioned an 11 nm node due sometime in 2015, I don't think this is something to worry about right away.
     
  10. ChristopherAKAO4

    ChristopherAKAO4 Notebook Nut

    Reputations:
    641
    Messages:
    1,700
    Likes Received:
    0
    Trophy Points:
    55
    Sounds like we should be more worried about our technology revolting against us than about making it more powerful. :p JK. But seriously, stop worrying about the fact that CPUs will never be powerful enough to create a fake world, and enjoy the real one!
     
  11. 660hpv12

    660hpv12 Notebook Deity

    Reputations:
    63
    Messages:
    1,031
    Likes Received:
    0
    Trophy Points:
    55
    Cool, pass me a link. A great pastime for a bored yet worried person.
     
  12. 660hpv12

    660hpv12 Notebook Deity

    Reputations:
    63
    Messages:
    1,031
    Likes Received:
    0
    Trophy Points:
    55
    lol, this leads to The Matrix. Let me be a crackpot: who told you we are living in a "real" universe?
     
  13. ChristopherAKAO4

    ChristopherAKAO4 Notebook Nut

    Reputations:
    641
    Messages:
    1,700
    Likes Received:
    0
    Trophy Points:
    55
    LOL! Maybe I'm not, but at least it beats worrying about creating another!
     
  14. FrankTabletuser

    FrankTabletuser Notebook Evangelist

    Reputations:
    274
    Messages:
    346
    Likes Received:
    0
    Trophy Points:
    30
    Reducing the wavelength does not just mean a faster CPU. They reduce the wavelength to cut costs and reduce power consumption, and even if you can't reduce the wavelength any more, you have tons of possibilities to still increase the speed.
    The Intel Core uses a different architecture than the Intel Pentium, and so, maybe, will the Intel 10, 11, 15 or whatever. They can improve everything. Maybe they will stop using silicon; maybe they will be able to use superconductors to build the CPU, and that will get funny then :D :eek: They can improve the cache and the instruction set, integrate the GPU and CPU on one die, improve the multi-core architecture, the peripherals, and so on.
    Don't be afraid: there will be faster computers available even after 2013, maybe more than ever, once they have to use their brains again and can't rely on just reducing the wavelength and adding some cores.
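    Since "just add some cores" keeps coming up in this thread, Amdahl's law is a quick way to see why cores alone are not a cure-all. A minimal sketch in Python; the 5% serial fraction is a made-up example value.

        # Amdahl's law: speedup = 1 / (serial + parallel / n_cores).
        # Even a small serial fraction caps what extra cores can deliver.
        def amdahl_speedup(serial_fraction: float, cores: int) -> float:
            parallel_fraction = 1.0 - serial_fraction
            return 1.0 / (serial_fraction + parallel_fraction / cores)

        for cores in (1, 2, 4, 8, 64, 500):
            print(f"{cores:>3} cores -> {amdahl_speedup(0.05, cores):5.2f}x speedup")

    With 5% serial work, even 500 cores top out below a 20x speedup, which is why nobody expects core count alone to carry performance forever.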
     
  15. 660hpv12

    660hpv12 Notebook Deity

    Reputations:
    63
    Messages:
    1,031
    Likes Received:
    0
    Trophy Points:
    55
    Correct me if I'm wrong: I think the real reason behind reducing the wavelength is to create "space" for more transistors, which has everything to do with processing power. The power savings of a smaller wavelength are really just a side effect.
    btw, I really want DNA computers. Every single DNA molecule could act as a transistor, hence a CPU made out of DNA. And of course everyone knows DNA is great at storing data. So really, no more need for cache, RAM, HD, processor, or GPU; just DNA, trillions of molecules of it.
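    Nobody has a practical DNA computer, but the storage half of the idea is simple to sketch: each base (A, C, G, T) can carry two bits. A minimal Python sketch of that encoding follows; it says nothing about how a real wet-lab protocol would work.

        # Sketch of the DNA-storage idea: each nucleotide encodes two
        # bits, so any byte string maps to a base sequence and back.
        BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
        BASE_TO_BITS = {v: k for k, v in BITS_TO_BASE.items()}

        def encode(data: bytes) -> str:
            bits = "".join(f"{byte:08b}" for byte in data)
            return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

        def decode(strand: str) -> bytes:
            bits = "".join(BASE_TO_BITS[base] for base in strand)
            return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

        strand = encode(b"hi")
        print(strand)                # "CGGACGGC": four bases per byte
        assert decode(strand) == b"hi"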
     
  16. FrankTabletuser

    FrankTabletuser Notebook Evangelist

    Reputations:
    274
    Messages:
    346
    Likes Received:
    0
    Trophy Points:
    30
    Yes, you are right: when they reduce the wavelength they are able to put more transistors in the same space.
    This means they can reduce costs. Otherwise the CPU would get more expensive with every added transistor, if they weren't able to shrink the transistors while keeping or even reducing the size of the whole die. That's one reason why the Atom is so cheap: it has a very small die.
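    That cost argument can be sketched with the standard dies-per-wafer approximation plus a simple Poisson yield model. The wafer cost and defect density below are illustrative assumptions, not real fab numbers.

        # Why smaller dies are cheaper: a fixed-cost wafer yields more
        # candidate dies as die area shrinks, and a small die is also
        # less likely to contain a killer defect.
        import math

        WAFER_DIAMETER_MM = 300.0
        WAFER_COST = 5000.0          # assumed cost per processed wafer, USD
        DEFECTS_PER_MM2 = 0.001      # assumed defect density

        def dies_per_wafer(die_area_mm2: float) -> float:
            r = WAFER_DIAMETER_MM / 2
            # Wafer area over die area, minus partial dies lost at the edge.
            return ((math.pi * r * r) / die_area_mm2
                    - (math.pi * WAFER_DIAMETER_MM) / math.sqrt(2 * die_area_mm2))

        def cost_per_good_die(die_area_mm2: float) -> float:
            yield_fraction = math.exp(-DEFECTS_PER_MM2 * die_area_mm2)  # Poisson model
            return WAFER_COST / (dies_per_wafer(die_area_mm2) * yield_fraction)

        for area in (25, 100, 300):   # Atom-sized vs. big desktop-class dies
            print(f"{area:>3} mm^2 die -> ${cost_per_good_die(area):6.2f} per good die")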
     
  17. Jayayess1190

    Jayayess1190 Waiting on Intel Cannonlake

    Reputations:
    4,009
    Messages:
    6,712
    Likes Received:
    54
    Trophy Points:
    216
  18. Gregory

    Gregory disassemble?

    Reputations:
    2,869
    Messages:
    1,831
    Likes Received:
    1
    Trophy Points:
    56
    Well I plan to be filthy rich by 2012. So much so that I will hit the wall in what I can cram into my wallet.

    Still though, I appreciate every dollar I earn along the way ;).

    Don't worry so much about it, man! Technology will never hit the wall. We will always be cranking out new ways to push the limit. With every new technological innovation we learn how little we truly know about technology. Who knows, maybe by the year 2060 we'll have computer chips embedded in our DNA? Then every skin cell could be a processor core. That'd be a lot of processors.
     
  19. 660hpv12

    660hpv12 Notebook Deity

    Reputations:
    63
    Messages:
    1,031
    Likes Received:
    0
    Trophy Points:
    55
    DNA computing is what I would like to see; we could have so many more DNA molecules than we can ever cram transistors onto silicon chips. It's just that I haven't heard anything solid on DNA computing, and that bothers me lol.
    btw: what was the reason for going multi-core in the first place (apart from multitasking)? Can processors be clocked no higher than 4 GHz? There may still be hope if we can have a 50 GHz oct-core lol
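    On the "why multi-core instead of higher clocks" question: the usual explanation is that dynamic power scales roughly as P = C * V^2 * f, and higher frequency tends to demand higher voltage, so power and heat grow far faster than performance. A back-of-the-envelope sketch with made-up but order-of-magnitude constants:

        # Dynamic power P = C * V^2 * f. Crude assumption below: voltage
        # must rise linearly with frequency, so power grows roughly as f^3.
        C = 1e-9       # assumed effective switched capacitance, farads
        V0 = 1.0       # volts at the 3 GHz baseline
        F0 = 3e9       # baseline frequency, Hz

        def dynamic_power(freq_hz: float) -> float:
            voltage = V0 * (freq_hz / F0)
            return C * voltage ** 2 * freq_hz

        for ghz in (3, 4, 6, 10, 50):
            print(f"{ghz:>2} GHz -> ~{dynamic_power(ghz * 1e9):7.1f} W")

    Under these assumptions, two 3 GHz cores burn about twice the baseline power for (ideally) twice the throughput, while one 6 GHz core burns about eight times the baseline power for the same doubling. That trade-off is why everyone went multi-core.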

    edit: I hope this is not considered bumping an old thread, I just had something to say lol
     
  20. John Kotches

    John Kotches Notebook Evangelist

    Reputations:
    133
    Messages:
    381
    Likes Received:
    0
    Trophy Points:
    30
    There is still quite a ways to go, into the X-ray and gamma-ray range. Here is a link to some information (10 years old) on X-ray lithography. With X-ray lithography they were projecting 4 nm trace sizes back then.

    Far ultraviolet, X-ray, and gamma radiation are high-energy ionizing radiation and come with some dangers. I don't know what special construction techniques the facilities will require.
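    The ionizing-radiation point is easy to check with E = h * c / wavelength; photon energies much above roughly 10 eV are ionizing. A quick sketch, using typical textbook wavelengths for each band:

        # Photon energy in eV for a few lithography-relevant wavelengths.
        H = 6.626e-34   # Planck constant, J*s
        C = 2.998e8     # speed of light, m/s
        EV = 1.602e-19  # joules per electronvolt

        def photon_energy_ev(wavelength_nm: float) -> float:
            return H * C / (wavelength_nm * 1e-9) / EV

        for name, nm in [("193 nm DUV", 193.0), ("13.5 nm EUV", 13.5),
                         ("1 nm X-ray", 1.0), ("0.001 nm gamma", 0.001)]:
            print(f"{name:>15}: ~{photon_energy_ev(nm):,.0f} eV")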

    Also, Moore's law speaks to a doubling of transistor count, which happens to equate to roughly a doubling of CPU power.

    Heck, with 4nm trace sizes, you might be able to put one GB of cache on die.
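    That 1 GB figure can be sanity-checked with the common rule of thumb that an SRAM bit cell occupies on the order of 100 to 150 F^2, where F is the feature size. The constants here are rough conventions, not a real process's design rules.

        # Rough area estimate for 1 GB of SRAM cells at F = 4 nm.
        F_NM = 4.0
        CELL_AREA_F2 = 150.0                        # assumed SRAM cell size in F^2
        cell_area_um2 = CELL_AREA_F2 * (F_NM / 1000) ** 2
        bits = 8 * 2 ** 30                          # 1 GB in bits
        cache_area_mm2 = bits * cell_area_um2 / 1e6

        print(f"cell: {cell_area_um2:.4f} um^2, 1 GB of cells: ~{cache_area_mm2:.0f} mm^2")

    That works out to roughly 21 mm^2 of raw cells, so 1 GB on a large die is at least conceivable, ignoring the tag, routing, and control overhead a real cache would add.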
     
  21. 660hpv12

    660hpv12 Notebook Deity

    Reputations:
    63
    Messages:
    1,031
    Likes Received:
    0
    Trophy Points:
    55
    omg, thanks. I overlooked the different ranges of electromagnetic waves; if we can use gamma-ray lithography, we might even see a "fm" (femtometer) chip