The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Possibly bye bye Intel CPUS if you want SLI...

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by eleron911, Jun 6, 2008.

  1. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
  2. Wishmaker

    Wishmaker BBQ Expert

    Reputations:
    379
    Messages:
    1,848
    Likes Received:
    35
    Trophy Points:
    66
    Intel would not do something like this without reason. There is more to this than meets the eye, and they will not abandon hardcore gamers. There is always a backup plan, and this seems like the first major move to give Larrabee a shot.
     
  3. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Marketing ploy, sure.
    But it may be time for AMD to step up :)
     
  4. X2P

    X2P COOLING | NBR Super Mod

    Reputations:
    3,179
    Messages:
    5,361
    Likes Received:
    4
    Trophy Points:
    206
    Maybe Intel is going into the graphics card business, seeing as that is the worst thing they can do.
     
  5. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    AMD and ATI are together, and Intel and Nvidia have been doing like a secret partnership.

    SLI is Nvidia tech, so why would it be bye bye Intel CPUs? That makes no sense.

    It would only hurt both companies and allow AMD to take advantage of the situation, and Intel is all about being the power king and cutting off AMD at every step.

    Let's release X process a few months faster than AMD, let's do this faster than AMD, oh, AMD caught up and they are releasing a CPU that's just as good as ours? Hmm, let's do a MASSIVE price cut then so they still can't sell anything.

    It used to be back and forth between the two; pretty much every month or every other month one company overtook the other with better products or prices, and this was good for the customer, because competition drives lower prices and faster evolution of products. Now, with Intel's big lead for so long, it's hurting us and prices have gone up. If they really do something like this, it may be just what we need to try to put things in balance, but something tells me Intel would just copy the SLI tech and release their own chipset to get even further ahead, or heck, just buy Nvidia lol.
     
  6. Xirurg

    Xirurg ORLY???

    Reputations:
    3,189
    Messages:
    7,375
    Likes Received:
    3
    Trophy Points:
    206
    Not possible, this means $1,000,000 losses for Intel....
     
  7. Bog

    Bog Losing it...

    Reputations:
    4,018
    Messages:
    6,046
    Likes Received:
    7
    Trophy Points:
    206
    I'll probably get flamed for this, but SLI sucks anyways. Paying for a second graphics card for a ~20% boost in performance is completely impractical, when instead you could use that money to buy a single, top-of-the-line card. Obviously this is off-topic, but I just want to say that I don't think Intel is missing out on much here.
     
  8. xrmx89x

    xrmx89x Notebook Guru

    Reputations:
    0
    Messages:
    60
    Likes Received:
    0
    Trophy Points:
    15
    You get way MORE than a "~20% boost in performance." Have you seen the charts lately on how well it scales? It can scale well above 75%.
     
  9. Wishmaker

    Wishmaker BBQ Expert

    Reputations:
    379
    Messages:
    1,848
    Likes Received:
    35
    Trophy Points:
    66
    Larabee will be ready to serve hardcore gamers :).
     
  10. TommyB0y

    TommyB0y Notebook Deity

    Reputations:
    127
    Messages:
    1,501
    Likes Received:
    2
    Trophy Points:
    56
    What do you mean gamers will be hurt? They're not talking about not using Nvidia graphics, just not SLI, unless Nvidia caves and licenses them the ability to make it themselves.

    Nvidia just won't be making motherboard chipsets; Intel will still use Nvidia graphics cards.

    Maybe Intel has their own way of performing the SLI function with any video card.

    But I hope that it is a big mistake for Intel and they fall on their face.

    Besides, what is it that Intel offers that AMD doesn't for a gamer?

    And people actually question the allegations that Intel is anti-competitive.
     
  11. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    SLI can sometimes scale as far as 100%, maybe even 105% in some PROVEN cases. But we are talking modern games here, where it matters the most.
    Maybe, but I fear that without SLI (mainly quad SLI), gaming at max details and 2560x1600 might never happen.
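The scaling figures being argued over in this thread (~20%, 75%, 100%+) are all just ratios of frame rates, so the claims are easy to sanity-check. A minimal sketch; the FPS numbers below are made up purely for illustration:

```python
def sli_scaling(single_fps, dual_fps):
    """Percentage gain of a two-card setup over a single card."""
    return (dual_fps / single_fps - 1.0) * 100.0

# Hypothetical benchmark numbers, for illustration only:
# a single card averaging 40 fps vs. an SLI pair averaging 70 fps.
print(f"SLI gain: {sli_scaling(40.0, 70.0):.0f}%")  # SLI gain: 75%
```

Per-game results vary widely with resolution, driver profiles, and CPU bottlenecks, which is how both the "~20%" and the "~100%" figures quoted in this thread can each be true for different titles.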
     
  12. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    59
    Trophy Points:
    466
    nVidia will shoot themselves in the foot. SLI would only be possible on AMD processors then, which seems like suicide to me, as AMD won't want them touching their ATI baby.

    I see lawsuits and monopoly claims in the very near future.
     
  13. Shyster1

    Shyster1 Notebook Nobel Laureate

    Reputations:
    6,926
    Messages:
    8,178
    Likes Received:
    0
    Trophy Points:
    205
    It's probably mostly jockeying behind the scenes over how to slice up the money pie. My guess is that someone at Intel feels that, since Nehalem has a new architecture, it doesn't technically fit within the current licensing agreement, and thus provided a wedge for Intel to try and shake more money out of the deal. It might also be partly due, as another poster pointed out, to Intel getting paranoid that NVidia is going to start moving in the direction of more general processors and not stay put in the GPU niche, which might have caused Intel to want more restrictive terms in the license that would have the effect of making sure NVidia limited its GPUs to the usual graphics support functions.
     
  14. Wishmaker

    Wishmaker BBQ Expert

    Reputations:
    379
    Messages:
    1,848
    Likes Received:
    35
    Trophy Points:
    66
    Have no fear, the EU will discipline Intel if they have to :p.
     
  15. Meemat

    Meemat Notebook Evangelist

    Reputations:
    127
    Messages:
    462
    Likes Received:
    0
    Trophy Points:
    30
    Dear Intel: wtf r u doin ????

    Seriously, if nVidia's only option is the largely inferior (so far) AMD, then what's the point of being SLI-capable if you won't end up with the fastest rig?

    Also, AMD won't let nVidia get close to ATi, like Greg said
     
  16. TommyB0y

    TommyB0y Notebook Deity

    Reputations:
    127
    Messages:
    1,501
    Likes Received:
    2
    Trophy Points:
    56
    I think you meant their only competition, or alternative, is AMD/ATI, but I would not consider them largely inferior. You will always have one or the other on top. And Nvidia, and you, have benefited from the technology ATI has developed and made mainstream, namely GDDR3, and then they will benefit from ATI's development of GDDR5.

    I'm not sure what anyone means by AMD won't let Nvidia get close to ATI.
     
  17. SideSwipe

    SideSwipe Notebook Virtuoso

    Reputations:
    756
    Messages:
    2,578
    Likes Received:
    0
    Trophy Points:
    55
    They mean AMD/ATI doesn't want anything that will take their advantage away from them. Nvidia has not had an AMD SLI board in a while, so ATI dominates that area and allows Intel to use its Crossfire tech in their chipsets.

    Of course if this does happen (intel and nvidia no longer offer SLI boards), then AMD might accept Nvidia SLI so it could potentially sell more AMD CPUs, meanwhile ensuring only ATI Crossfire will be available for Intel CPUs. It is in a way a win/win situation for AMD. Intel will lose if it does push Nvidia away.
     
  18. The_Observer

    The_Observer 9262 is the best:)

    Reputations:
    385
    Messages:
    2,662
    Likes Received:
    0
    Trophy Points:
    55
    There might be something as per their plan.
     
  19. jooooeee

    jooooeee Stealth in disguise

    Reputations:
    737
    Messages:
    1,311
    Likes Received:
    0
    Trophy Points:
    55
    Intel isn't going to stop making CPUs, simple as that.
     
  20. Pants!

    Pants! Notebook Geek

    Reputations:
    0
    Messages:
    89
    Likes Received:
    0
    Trophy Points:
    15
    I worked there for 7 years, and indeed there must be something to justify their decision.

    Most likely Nvidia chipsets were causing a restriction they were not happy with; additionally, SLI is kind of a yawn...

    But if Intel was so worried about gaming, then why are their GPUs just about the worst on the market?
     
  21. StefanHamminga

    StefanHamminga Notebook Consultant

    Reputations:
    55
    Messages:
    148
    Likes Received:
    1
    Trophy Points:
    31
    Actually makes sense to me; Intel's biggest rival in its main markets is Nvidia, no?

    Nvidia (and AMD/ATI) is pushing to make the GPU more general purpose, thereby decreasing the importance of the CPU in the PC.
    Intel obviously wants to put an end to this while they still can (by crippling Nvidia as much as possible).

    From a consumer point of view I'd like the EU to force both Intel and Nvidia to remove artificial SLI limitations and make SLI work on any chipset.
    Even better would be a pact between IBM, AMD, Nvidia & Microsoft to introduce a new (open) standard instruction set (Power 6 seems a sensible base), forcing a way out of the x86 stronghold Intel has on CPUs.
     
  22. pukemon

    pukemon are you unplugged?

    Reputations:
    461
    Messages:
    2,551
    Likes Received:
    245
    Trophy Points:
    81
    Perhaps Intel has something up their sleeve. Perhaps they haven't unveiled it yet, or maybe there is something that has been overlooked, and nobody's really taken notice of how important/influential it could be on the market. Maybe I don't know what I'm talking about, but honestly, for some reason this has my curiosity piqued.
     
  23. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Who had the plans to make the CPU and GPU into a single component?
    AMD?
     
  24. crazyanz

    crazyanz Notebook Consultant

    Reputations:
    7
    Messages:
    211
    Likes Received:
    0
    Trophy Points:
    30
    Dunno what Intel thinks they are, but they have to stop right now! They are on their way to manufacturing every piece of a computer and banning all other manufacturers from their system. I hope AMD catches up soon with Intel; Intel is really bugging me lately with this kind of news.
     
  25. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    Umm... just throwing it out there. SLI will not increase performance 100%... and uhh... definitely not 105% lol.

    Other than that, no real comment, except that Intel clearly is not a company that likes to fight fair, but they do have the best CPUs out at the moment.
     
  26. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Well, I beg to differ.
    A friend showed me Unreal Tournament 3 with an 8800 GTS 320 SLI, and whadda you know, in some cases he got twice the fps he was getting with a single card.
    Also, PM Brainer for more details...
     
  27. crazyanz

    crazyanz Notebook Consultant

    Reputations:
    7
    Messages:
    211
    Likes Received:
    0
    Trophy Points:
    30
    I'd rather have a good GPU with a medium CPU than a good CPU with a rubbish GPU. If Intel is really pushing Nvidia out of the market, I think they're shooting themselves in the foot, and Intel will sell a lot fewer CPUs to gamers.
     
  28. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Or Nvidia will start developing CPUs too :D
     
  29. deaffob

    deaffob Notebook Guru

    Reputations:
    0
    Messages:
    50
    Likes Received:
    0
    Trophy Points:
    15
    Not really; 99% of consumers are casual users. They don't really care about extreme vid cards; they only need integrated graphics for the most part. Intel won't lose anything, and if they do, it's only from the really small population of gamers. GL defending nVidia. Gamers are only a small portion of the buyers.
     
  30. SideSwipe

    SideSwipe Notebook Virtuoso

    Reputations:
    756
    Messages:
    2,578
    Likes Received:
    0
    Trophy Points:
    55
    Didn't Nvidia already bring out a CPU for small mobile devices?
     
  31. crazyanz

    crazyanz Notebook Consultant

    Reputations:
    7
    Messages:
    211
    Likes Received:
    0
    Trophy Points:
    30
    Yeah, an all-in-one chip with CPU, GPU, and north and south bridges in one chip, for small devices.
     
  32. crazyanz

    crazyanz Notebook Consultant

    Reputations:
    7
    Messages:
    211
    Likes Received:
    0
    Trophy Points:
    30
    I think you don't look far enough. First, gamers are actually a pretty big and lucrative market ;) Second, it's not only gamers using heavy video cards; it's also photographers and all kinds of other graphics power users.

    Then you have a thing called reputation: if Nvidia manages to produce CPUs, or does something together with AMD, and Intel only sells CPUs for simple desktops, they will lose even more customers, because instead of good ads about powerful Intels you will see everybody talking about AMD and Nvidia; nobody talks about a boring office computer ;)
     
  33. Wishmaker

    Wishmaker BBQ Expert

    Reputations:
    379
    Messages:
    1,848
    Likes Received:
    35
    Trophy Points:
    66
    I doubt photographers use SLI and Crossfire. What for? I am one of them and I am very happy with my 1900XTX. I have a mate with an AVID machine and he is not using SLI or Crossfire. This was invented for gamers. Gamers are a special niche in the market and they will always have access to high performance cards, be it Larrabee or something else. Just because NVIDIA won't be able to do SLI on Nehalem does not mean the Apocalypse is coming. Patience; Intel will not ignore gamers, they never have.
     
  34. SideSwipe

    SideSwipe Notebook Virtuoso

    Reputations:
    756
    Messages:
    2,578
    Likes Received:
    0
    Trophy Points:
    55
    I think you guys seem to think Intel will stop all development of SLI-type motherboards, but that isn't true. Intel will always (for now) have Crossfire support, so high end gamers will still be catered for if using ATI, and according to the HP Blackbird, you CAN run Crossfire on an SLI board, and vice versa should work too. Also, both ATI and Nvidia often release X2s as single cards, so SLI-enabled boards are not really needed.
     
  35. TommyB0y

    TommyB0y Notebook Deity

    Reputations:
    127
    Messages:
    1,501
    Likes Received:
    2
    Trophy Points:
    56
    Of course, that source was an Nvidia employee, who probably was biased and didn't tell the whole story, like if Intel just wanted a higher percentage of the profits or something and Nvidia said no. Nvidia is doing fine as a company, always outperforming revenue expectations, but Intel is trying to make it back to the status they held before AMD kicked them in the @$$: super filthy stinking rich off of gouging consumers.
     
  36. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    If Larrabee becomes a reality, Nvidia will slowly start to fade away.
    Separating the CPU and GPU seems to be a thing of the past, at least as Intel presents it.
     
  37. benx009

    benx009 Notebook Evangelist

    Reputations:
    184
    Messages:
    663
    Likes Received:
    0
    Trophy Points:
    30
    But if this comes true:
    Then we'll have Intel, NVIDIA, and AMD/ATi all competing in both the CPU and GPU markets??? I mean, I'm all for competition, but geez when it gets to the point where I'm deciding whether or not to pair my NVIDIA CPU w/ an ATi GPU or maybe instead my AMD CPU w/ an Intel GPU, I'm going back to my 2001 rig :mad:
     
  38. TommyB0y

    TommyB0y Notebook Deity

    Reputations:
    127
    Messages:
    1,501
    Likes Received:
    2
    Trophy Points:
    56
    Nvidia is already making some ASICs with ARM processors in them for the UMPC market, or netbooks. They have combined the ARM processor with graphics on one chip, like AMD's Geode.

    I used to design ASICs with ARM processors in them, some of them two processors on a small chip which had to be ultra power efficient. I think Nvidia will have a good product, better suited for the task than Intel's Atom.
     
  39. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    That might be pushing it a little. I don't think Nvidia has gotten that far just yet.
    But it's all converging towards a mixture, with Nvidia trying to develop CPUs and Intel developing Larrabee :D
    And AMD already bought ATI, so.. :D
     
  40. TommyB0y

    TommyB0y Notebook Deity

    Reputations:
    127
    Messages:
    1,501
    Likes Received:
    2
    Trophy Points:
    56
    Oh yes, Nvidia launched the new CPU/GPU chips at the Computex show this week.

    Here's a good article about it.
    http://arstechnica.com/news.ars/pos...as-play-for-intel-arm-and-the-mid-market.html

    Nvidia has also been looking to acquire VIA.

    I think there will always be discrete GPUs for high end gaming. Or maybe when they can get enough cores on there they will have 8 cores, 4 of them being gpus, for easy crossfire/SLI ability.

    It seems like there's going to be a lot of heat in there.
     
  41. NJoy

    NJoy Няшka

    Reputations:
    379
    Messages:
    857
    Likes Received:
    1
    Trophy Points:
    31
    mmmm, Tegra looks hott!
     
  42. TommyB0y

    TommyB0y Notebook Deity

    Reputations:
    127
    Messages:
    1,501
    Likes Received:
    2
    Trophy Points:
    56
    Yes, Tegra will be good.

    There is nothing breakthrough about it in my mind, since I was developing similar ASICs several years ago for very low power applications, except Nvidia added the High Definition video aspect.

    AMD's Geode is similar, and maybe they will have an update for it. There were UMPCs at the Computex show based on AMD/ATI, 9"ers.

    AMD will probably have something that blows it all away in a year. Like the idea that an 8 core CPU is actually 4 cpu cores and 4 gpu cores for some serious fast parallel GPU processing :) Or the ATI HD4000 PCI-E series may have GPUs that have multiple cores, instead of the HD3870 X2 nonsense. AMD has something up their sleeve and are about to throw a trump card.
     
  43. Bog

    Bog Losing it...

    Reputations:
    4,018
    Messages:
    6,046
    Likes Received:
    7
    Trophy Points:
    206
    One can only hope. Thus far, AMD has failed to deliver.
     
  44. moon angel

    moon angel Notebook Virtuoso NBR Reviewer

    Reputations:
    2,011
    Messages:
    2,777
    Likes Received:
    15
    Trophy Points:
    56
    I disagree. Their CPUs are not as powerful as Intel's, for sure, but their current chipsets and integrated solutions are far in advance of both Nvidia's and Intel's.
     
  45. Wishmaker

    Wishmaker BBQ Expert

    Reputations:
    379
    Messages:
    1,848
    Likes Received:
    35
    Trophy Points:
    66
    ...yet they fail to deliver performance. At the end of the day, numbers count and not how advanced some design is.
     
  46. benx009

    benx009 Notebook Evangelist

    Reputations:
    184
    Messages:
    663
    Likes Received:
    0
    Trophy Points:
    30
    Agreed. I'm no Intel fanboy or anything (far from it, especially w/ how they're threatening Nvidia w/ Larrabee), but AMD really has just failed enthusiasts on both the price and performance fronts. I mean, when your $235 quad-core 9850 Black Edition processor is being owned by Intel's $130 dual-core E7200..... well, it just doesn't leave much more to be said.

    AMD does have some pretty good, not to mention cheap, dual-core processors out there, but they will never satisfy enthusiasts.... let's just hope they really do have something up their sleeves in the future..
     
  47. wilsonywx

    wilsonywx Notebook Evangelist

    Reputations:
    20
    Messages:
    424
    Likes Received:
    0
    Trophy Points:
    30
    I think AMD has been focusing on the midrange and low-end markets, hence the unsatisfactory enthusiast hardware.
     
  48. TommyB0y

    TommyB0y Notebook Deity

    Reputations:
    127
    Messages:
    1,501
    Likes Received:
    2
    Trophy Points:
    56
    It depends what tasks you are doing. An enthusiast that needs really fast RAM will prefer AMD. And there is no game that Intel can play better than an AMD system.

    So I don't know which enthusiasts are disappointed. The only reason Intel is offering CPUs at those low prices is because of AMD. You may not remember the days of PII or PIII CPUs that cost $400 and so on, before AMD started bringing the competition. We bought a Pentium 1 system for like $2200 in '95 or something like that. You can just look at Intel's share price/profit margin history and understand they were gouging consumers before AMD came out with the Athlon and they had to price things competitively. Right now they are trying to take more market share.

    In multithreaded applications the 9850 is much better than the E7200. Adding cores doesn't add performance unless you use benchmarks that utilize many cores.

    Intel CPUs do many things better than the AMD ones. How could AMD have imagined that we would be manufacturing down to 45nm in such a short time after AMD switched to the direct connect architecture, making it possible to put so much cache on Intel's CPUs? That is the only reason they are performing better, and why Intel raced to get to 45nm. If Intel didn't have 3MB or 6MB of L2 cache they would stink, because their FSB can't keep up with RAM speeds like AMD's does, so it uses the L2 cache to run your programs or as a fast RAM buffer.

    And it's amusing, because the way AMD started performing better was to add more L2 cache to Athlon CPUs, and they started the multiple data bursts per clock cycle, and now Intel took off with it, with quad-pumped buses and huge caches.
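The point that adding cores doesn't help unless the software actually uses them is essentially Amdahl's law. A minimal sketch; the parallel fractions below are made-up illustration values, not measurements of any real game or CPU:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only part of a workload parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A mostly serial workload (50% parallel) barely benefits from 4 cores,
# while a well-threaded one (95% parallel) benefits much more.
print(round(amdahl_speedup(0.50, 4), 2))  # 1.6
print(round(amdahl_speedup(0.95, 4), 2))  # 3.48
```

A workload that is only half parallelizable tops out around 1.6x on four cores no matter how fast they are, which is one way a cheaper dual-core at higher clocks can beat a quad-core in lightly threaded applications.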
     
  49. pukemon

    pukemon are you unplugged?

    Reputations:
    461
    Messages:
    2,551
    Likes Received:
    245
    Trophy Points:
    81
    I'm still not clear on why AMD doesn't use bigger L2 caches. I get differing opinions on whether it would up their performance. I know their equivalent of Intel's FSB, HyperTransport, makes a big chunk of their performance. I think AMD can compete directly with Intel in most markets; I just wish they would put it on them. Intel needs all the competition it can get. I wish there was a 3rd serious competitor. VIA sucks. IBM needs to bring back their processors. I forgot what they were called. POWER processors? I've read their CPUs did more per cycle in some applications. They still make processors, right? The Wii and 360 use them? I'd like to see Intel pushed, and then Intel push back or fall flat on their butts.
     
  50. scooberdoober

    scooberdoober Penguins FTW!

    Reputations:
    1,718
    Messages:
    2,221
    Likes Received:
    3
    Trophy Points:
    56
    Just another reason why monopolies are a bad thing. I wish Intel had some better competition. If they did they wouldn't be so quick to stick it to other companies.
     