http://www.techspot.com/news/30330-intel-stonewalling-nvidias-nehalem-chipsets.html
-
Intel would not do something like this without a reason. There is more to this than meets the eye, and they will not abandon hardcore gamers. There is always a backup plan, and this seems like the first major move to give Larrabee a shot.
-
Marketing ploy, sure.
But it may be time for AMD to step up -
Maybe Intel is going into the graphics card business, seeing as that is the worst thing they could do.
-
ViciousXUSMC Master Viking NBR Reviewer
AMD and ATI are together, and Intel and Nvidia have had something like a secret partnership.
SLI is Nvidia tech, so why would it be bye-bye Intel CPUs? That makes no sense.
It would only hurt both companies and allow AMD to take advantage of the situation, and Intel is all about being the power king and cutting off AMD at every step.
Let's release X process a few months faster than AMD, let's do this faster than AMD. Oh, AMD caught up and they're releasing a CPU that's just as good as ours? Hmm, let's do a MASSIVE price cut then so they still can't sell anything.
It used to be back and forth between the two; pretty much every month or every other month one company overtook the other with better products or prices, and this was good for the customer because competition drives lower prices and faster evolution of products. Now, with Intel's big lead for so long, it's hurting us and prices have gone up. If they really do something like this it may be just what we need to put things back in balance, but something tells me Intel would just copy the SLI tech and release their own chipset to get even further ahead, or heck, just buy Nvidia lol. -
Not possible, this means $1,000,000 losses for Intel...
-
I'll probably get flamed for this, but SLI sucks anyway. Paying for a second graphics card for a ~20% boost in performance is completely impractical when you could instead use that money to buy a single, top-of-the-line card. Obviously this is off-topic, but I just want to say that I don't think Intel is missing out on much here.
-
You get way MORE than a "~20% boost in performance." Have you seen the charts lately on how well it scales? It can scale well above 75%.
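As a rough back-of-the-envelope illustration of what those scaling percentages mean for frame rates (the single-card fps and scaling factors below are made-up assumptions, not benchmark results):

```python
# Rough sketch of what SLI "scaling" percentages mean for frame rates.
# All numbers are illustrative assumptions, not measured benchmark results.

def sli_fps(single_card_fps: float, scaling: float) -> float:
    """Estimate two-card fps if the second card adds `scaling` of one card's performance."""
    return single_card_fps * (1.0 + scaling)

single = 60.0  # assumed single-card frame rate
for scaling in (0.20, 0.75, 1.00):
    print(f"{scaling:.0%} scaling: ~{sli_fps(single, scaling):.0f} fps")
```

With an assumed 60 fps baseline, 20% scaling gets you ~72 fps while 75% scaling gets you ~105 fps, which is why the two claims lead to such different conclusions about whether a second card is worth it.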
-
What do you mean gamers will be hurt? They're not talking about not using Nvidia graphics, just not SLI, unless Nvidia caves and licenses them the ability to make it themselves.
Nvidia just won't be making motherboard chipsets; Intel will still use Nvidia graphics cards.
Maybe Intel has their own way of performing the SLI function with any video card.
But I hope that it is a big mistake for Intel and they fall on their face.
Besides what is it that Intel offers that AMD doesn't for a gamer?
And people actually question the allegations that Intel is anti-competitive -
-
nVidia will shoot themselves in the foot. SLI would only be possible on AMD processors then, which seems like suicide to me, as AMD won't want them touching their ATI baby.
I see lawsuits and monopoly claims in the very near future. -
It's probably mostly jockeying behind the scenes over how to slice up the money pie. My guess is that someone at Intel feels that, since Nehalem has a new architecture, it doesn't technically fit within the current licensing agreement, and thus provided a wedge for Intel to try and shake more money out of the deal. It might also be partly due, as another poster pointed out, to Intel getting paranoid that NVidia is going to start moving in the direction of more general processors and not stay put in the GPU niche, which might have caused Intel to want more restrictive terms in the license that would have the effect of making sure NVidia limited its GPUs to the usual graphics support functions.
-
Have no fear, the EU will discipline Intel if they have to.
-
Dear Intel: wtf r u doin ????
Seriously, if nVidia's only option is the largely inferior (so far) AMD, then what's the point of being SLI-capable if you won't end up with the fastest rig?
Also, AMD won't let nVidia get close to ATi, like Greg said -
I think you meant their only competition, or alternative, is AMD/ATI, but I would not consider them largely inferior. You will always have one or the other on top. And Nvidia (and you) have benefited from the technology ATI developed and made mainstream, namely GDDR3, and they will benefit from ATI's development of GDDR5.
I'm not sure what anyone means by AMD won't let Nvidia get close to ATI. -
They mean AMD/ATI doesn't want anything that will take their advantage away from them. Nvidia has not had an AMD SLI board in a while, so ATI dominates that area and allows Intel to use ATI's Crossfire tech in Intel chipsets.
Of course, if this does happen (Intel and Nvidia no longer offer SLI boards), then AMD might accept Nvidia SLI so it could potentially sell more AMD CPUs, meanwhile ensuring only ATI Crossfire will be available for Intel CPUs. It is in a way a win/win situation for AMD. Intel will lose if it does push Nvidia away. -
The_Observer 9262 is the best:)
There might be something to their plan.
-
Intel isn't going to stop making CPUs, simple as that.
-
Most likely Nvidia chipsets were causing a restriction they were not happy with; additionally, SLI is kind of a yawn...
But if Intel was so worried about gaming, then why are their GPUs just about the worst on the market? -
StefanHamminga Notebook Consultant
Actually makes sense to me; Intel's biggest rival in its main markets is Nvidia, no?
Nvidia (and AMD/ATI) is pushing to make the GPU more general-purpose, thereby decreasing the importance of the CPU in the PC.
Intel obviously wants to put an end to this while they still can (by crippling Nvidia as much as possible).
From a consumer point of view I'd like the EU to force both Intel and Nvidia to remove artificial SLI limitations and make SLI work on any chipset.
Even better would be a pact between IBM, AMD, Nvidia & Microsoft to introduce a new (open) standard instruction set (Power 6 seems a sensible base), forcing a way out of the x86 stronghold Intel has on CPUs. -
Perhaps Intel has something up their sleeve. Perhaps they haven't unveiled it yet, or maybe there is something that has been overlooked and nobody's really taken notice of how important or influential it could be on the market. Maybe I don't know what I'm talking about, but honestly, for some reason this has my curiosity piqued.
-
Who had the plans to make the CPU and GPU into a single component?
AMD? -
Dunno what Intel thinks they are, but they have to stop right now! They are on their way to manufacturing every piece of a computer and shutting all other manufacturers out of their system. I hope AMD catches up with Intel soon; Intel is really bugging me lately with this kind of news.
-
masterchef341 The guy from The Notebook
Umm... just throwing it out there: SLI will not increase performance 100%... and, uhh, definitely not 105% lol.
Other than that, no real comment, except that Intel clearly is not a company that likes to fight fair, but they do have the best CPUs out at the moment. -
Well, I beg to differ.
A friend showed me Unreal Tournament 3 with an 8800 GTS 320 in SLI, and whaddaya know, in some cases he got twice the fps he was getting with a single card.
Also, PM Brainer for more details... -
I'd rather have a good GPU with a medium CPU than a good CPU with a rubbish GPU. If Intel is really pushing Nvidia out of the market, I think they are shooting themselves in the foot, and Intel will sell a lot fewer CPUs to gamers.
-
Or Nvidia will start developing CPUs too.
-
Not really; 99% of consumers are casual users. They don't really care about extreme video cards; they only need integrated graphics for the most part. Intel won't lose anything, and if they do, it's only from the really small population of gamers. GL defending nVidia. Gamers are only a small portion of the buyers.
-
Didn't Nvidia already bring out a CPU for small mobile devices?
-
Yeah, an all-in-one chip with CPU, GPU, northbridge, and southbridge in one chip, for small devices.
-
Second, it's not only gamers using heavy video cards; it's also photographers and all kinds of other graphics power users.
Then you have a thing called reputation: if Nvidia manages to produce CPUs or does something together with AMD, and Intel will only sell CPUs for simple desktops, they will lose even more customers, because instead of good ads about powerful Intels you will see everybody talking about AMD and Nvidia. Nobody talks about a boring office computer. -
I think you guys seem to think Intel will stop all development of SLI-type motherboards, but that isn't true. Intel will always (for now) have Crossfire support, so high-end gamers will still be catered for if using ATI, and according to the HP Blackbird you CAN run Crossfire on an SLI board, and vice versa should work too. Also, both ATI and Nvidia often release X2s as single cards, so SLI-enabled boards are not really needed.
-
Of course, that source was an Nvidia employee, who was probably biased and didn't tell the whole story; maybe Intel just wanted a higher percentage of the profits or something and Nvidia said no. Nvidia is doing fine as a company, always outperforming revenue expectations, but Intel is trying to make it back to the status they held before AMD kicked them in the @$$: super filthy stinking rich off of gouging consumers.
-
If Larrabee becomes a reality, Nvidia will slowly start to fade away.
Separating the CPU and GPU seems to be a thing of the past, at least as Intel presents it. -
But if this comes true:
-
Nvidia is already making some ASICs with ARM processors in them for the UMPC market, or netbooks. They have combined the ARM processor with graphics on one chip, like AMD's Geode.
I used to design ASICs with ARM processors in them, some of them with two processors on a small chip which had to be ultra power-efficient. I think Nvidia will have a good product, better suited for the task than Intel's Atom. -
That might be pushing it a little. I don't think Nvidia has gotten that far just yet.
But it's all converging towards a mixture: Nvidia trying to develop CPUs, and Intel developing Larrabee.
And AMD already bought ATI, so... -
Oh yes, Nvidia launched the new CPU/GPU chips at the Computex show this week.
Here's a good article about it:
http://arstechnica.com/news.ars/pos...as-play-for-intel-arm-and-the-mid-market.html
Nvidia has also been looking to acquire VIA.
I think there will always be discrete GPUs for high-end gaming. Or maybe, when they can get enough cores on there, they will have 8 cores, 4 of them being GPUs, for easy Crossfire/SLI ability.
It seems like there's going to be a lot of heat in there. -
mmmm, Tegra looks hott!
-
Yes, Tegra will be good.
There is nothing breakthrough about it in my mind, since I was developing similar ASICs several years ago for very low-power applications, except that Nvidia added the high-definition video aspect.
AMD's Geode is similar, and maybe they will have an update for it. There were UMPCs at the Computex show based on AMD/ATI, 9-inchers.
AMD will probably have something that blows it all away in a year, like the idea that an 8-core CPU is actually 4 CPU cores and 4 GPU cores for some seriously fast parallel GPU processing. Or the ATI HD4000 PCI-E series may have GPUs with multiple cores, instead of the HD3870 X2 nonsense. AMD has something up their sleeve and is about to throw a trump card.
-
moon angel Notebook Virtuoso NBR Reviewer
-
-
AMD does have some pretty good, not to mention cheap, dual-core processors out there, but they will never satisfy enthusiasts... Let's just hope they really do have something up their sleeves in the future. -
I think AMD has been focusing on the midrange and low-end markets, hence the unsatisfactory enthusiast hardware.
-
It depends what tasks you are doing. An enthusiast who needs really fast RAM will prefer AMD. And there is no game that an Intel system can play better than an AMD system.
So I don't know which enthusiasts are disappointed. The only reason Intel is offering CPUs at those low prices is because of AMD. You may not remember the days of Pentium II or III CPUs that cost $400 and so on, before AMD started bringing the competition. We bought a Pentium 1 system for like $2200 in '95 or something like that. You can just look at Intel's share price and profit margin history and understand they were gouging consumers before AMD came out with the Athlon and they had to price things competitively. Right now they are trying to take more market share.
In multithreaded applications the 9850 is much better than the E7200. Adding cores doesn't add performance unless you use benchmarks that utilize many cores.
Intel CPUs do many things better than the AMD ones. How could AMD have imagined, when it switched to the Direct Connect architecture, that manufacturing would get down to 45nm in such a short time, making it possible to put so much cache on Intel's CPUs? That is the only reason they are performing better, and why Intel raced to get to 45nm. If Intel didn't have 3MB or 6MB of L2 cache they would stink, because their FSB can't keep up with RAM speeds like AMD's does, so it uses the L2 cache to run your programs or as a fast RAM buffer.
And it's amusing, because the way AMD started performing better was to add more L2 cache to Athlon CPUs, and they started the multiple data bursts per clock cycle, and now Intel has taken off with it, with quad-pumped buses and huge caches. -
I'm still not clear on why AMD doesn't use bigger L2 caches; I get differing opinions on whether it would up their performance. I know their equivalent of Intel's FSB, HyperTransport, accounts for a big chunk of their performance. I think AMD can compete directly with Intel in most markets, I just wish they would put it to them. Intel needs all the competition it can get. I wish there was a third serious competitor. VIA sucks. IBM needs to bring back their processors. I forgot what they were called. POWER processors? I've read their CPUs did more per cycle in some applications. They still make processors, right? The Wii and 360 use them? I'd like to see Intel pushed, and then Intel push back or fall flat on their butts.
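A minimal way to reason about the cache-versus-FSB trade-off discussed above is average memory access time (AMAT). The hit rates and latencies below are illustrative assumptions, not measured figures for any specific Intel or AMD part:

```python
# Minimal average memory access time (AMAT) sketch:
# AMAT = L2 hit time + miss rate * miss penalty (time to go out over the bus to RAM).
# All numbers below are illustrative assumptions, not measurements of real chips.

def amat(l2_hit_ns: float, l2_hit_rate: float, mem_latency_ns: float) -> float:
    return l2_hit_ns + (1.0 - l2_hit_rate) * mem_latency_ns

# A big L2 cache (higher hit rate) can mask a slower path to RAM (FSB),
# while a faster path to RAM (on-die memory controller) partly offsets a smaller cache.
big_l2_slow_fsb = amat(l2_hit_ns=4.0, l2_hit_rate=0.97, mem_latency_ns=110.0)
small_l2_fast_imc = amat(l2_hit_ns=4.0, l2_hit_rate=0.90, mem_latency_ns=75.0)

print(f"Big L2 + slow FSB:   ~{big_l2_slow_fsb:.1f} ns average")
print(f"Small L2 + fast IMC: ~{small_l2_fast_imc:.1f} ns average")
```

Under these made-up numbers, the large cache more than compensates for the slower bus, which is the point being made about Intel's big L2 versus AMD's integrated memory controller; real results depend entirely on the workload's actual hit rates.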
-
Just another reason why monopolies are a bad thing. I wish Intel had some better competition. If they did, they wouldn't be so quick to stick it to other companies.
Possibly bye bye Intel CPUS if you want SLI...
Discussion in 'Hardware Components and Aftermarket Upgrades' started by eleron911, Jun 6, 2008.