What are your opinions? Would you be willing to switch GPU vendors?
-
mobius1aic Notebook Deity NBR Reviewer
I wouldn't switch.
-
Let's see...
Intel owns the mobile PC CPU market
Intel owns a HUGE majority of the desktop PC market
Intel owns ALL of the Apple market
Intel owns the netbook market
Intel owns the IGP market (more or less)
Intel will own the SSD market, considering how good their drives are
Intel has a solid presence in the Wi-Fi market
Intel has a very strong presence in the chipset market
No, I'd much rather support ATi or Nvidia. -
Hehe, fanboys.
Gamers will support whoever has the best hardware with the broadest third-party support, and at the moment that's Nvidia. The tide is changing back to ATI though, yay. -
mobius1aic Notebook Deity NBR Reviewer
I honestly don't see Intel gaining a foothold in the performance graphics market. The cost will be high, or Intel will have to deliberately lose a lot of money just to take a bite out of ATi or Nvidia. I definitely see the Larrabee architecture becoming Intel's standard IGP, but it comes down not only to capability but to power efficiency as well. Sure, Intel has small process nodes down to a science, but by the time Larrabee makes its real debut, ATi and Nvidia will have smaller-process GPUs that are much more powerful, cheaper, and more power-efficient for the performance than what is available now.
And I'll admit, I'm a bit of a fanboy for AMD CPU products; I don't want to see Intel become a monopoly either. It took enough hitting myself in the head just to break down and get this Asus because it was such a good deal. I really wanted an AMD system, but I couldn't beat this laptop for the price and the graphics performance. -
I couldn't care less who makes it; if it performs, it performs. I would buy it.
Benchmarks alone wouldn't win me over; I'd want to see that it has equal or lower power consumption, heat output, etc. as well. But yeah, I'd buy it.
Intel already tried this, though: they made a dedicated GPU (the i740) several years ago, and I'm sure some of you remember it. It was a huge flop. I don't have high expectations. -
Well... Nvidia GPUs had that heat issue... and I haven't used ATI gaming cards... and Intel does make nice CPUs and chipsets IMO... I would pick the first and go with Intel.
I like them -
usapatriot Notebook Nobel Laureate
Honestly, I don't see Intel's "Larrabee" offering the performance necessary to convert power users from either Nvidia or ATI GFX.
-
If it makes a good platform for switchable graphics, why not? But as a high-performance GPU? I think not.
-
Although there is very little information out yet, I suspect the initial iteration of Larrabee will not be that great -- either the TDP will be too high or the performance will be too low. However, Intel is great at refining designs, and they can price competitively and absorb the early losses while the dies are shrunk. And of course there is always a chance it might surprise everyone and be the best GPU on the market. We'll see. -
If Intel can swim with the sharks, then so be it. It will only force nVidia and ATI to get off their butts and beat Intel. Remember, competition drives technology.
-
But think about how poor Nvidia will crumble and die!
-
Poor nVidia? The company that has monopolized the GPU market for years? Please. nVidia will not crumble and die. Trust me: they've only offered weak product lines lately because ATI has had trouble keeping up. Watch them come out, in short order, with some pretty interesting stuff they've kept in their back pocket. It's called business...
-
theneighborrkid Notebook Evangelist
So what about the opposite of this and letting nVidia into the CPU world...
-
Well, let's face it, Nvidia is still a big company, and everyone likes an underdog.. But since AMD and Nvidia have been in the GPU market for a lengthier period of time, they should be able to adapt and continue to expand. The only thing Intel has is a hell of a lot more funds to throw around.. But remember, Intel will still want to keep expanding their CPUs and possibly expand into other areas to dominate the market (I would not be surprised if they turned around and attempted to make LCD screens... Overclockable LCD screens... Actually, is that possible? Imagine upping the refresh rate to such an extent that you notice no difference anyway.. pretty much as it is now)
Wasn't Intel going to combine the CPU and GPU into one? Haven't they started that on netbooks.. But then again, I think Nvidia has already done that with a customised HP...
As for AMD, the 4870 X2s crossfired are just insane.. What these *underdogs* need to do is keep pushing the envelope instead of just rehashing (I already asked you to stop that, Nvidia, you naughty little profiteering company.. Next time it happens, it's going to be me putting you in time out and going AMD or Intel, you have been warned), and whilst pushing the envelope, invest in ways of making the cards cheaper to produce, whilst making the gap between lower-end and higher-end models smaller..
But then again, I may just invest in an old Atari and enjoy the use of dots for entertainment.. that, or decide to attack random innocent really old games that no longer push the envelope and mod 'em to run better with newer ideas added in (e.g. the strategic view in Supreme Commander), thereby expanding that old universe in new ways.. but that takes effort..
And now, having forgotten what I started talking about: I would try Larrabee as long as they don't pull a Telstra (if you live in AUS you would know they are the reason the internet is so expensive and slow, that monopolising company)..
As previous posters have said, competition drives innovation, and that is what the masses want.. or is it??
I guess I should end the post at some point?? -
For mobile users it will sting; for gaming powerhouse usage, it will just stink.
No, Nvidia won't just die off; they are trying their best to acquire/work with VIA. -
So what we're seeing is an integrated GPU that can probably compete with the low-end dedicated GPUs currently on the market. That sounds great if you're a casual notebook user who plays games on the side.
Other users who prioritise GPU power over every other component in a notebook configuration will just continue to pick up the high-end GPU notebooks as they've always done. -
I'd like to see someone do something different.
If Larrabee works out, at least it's a next step rather than the "add more shaders, increase the clocks" type of thing.
Two GPU makers in the market is not always fun; with three, it's gonna be interesting. Look at the Xbox 360, Wii and PS3. -
davepermen Notebook Nobel Laureate
I'd like to program on it, doing raytracing and other fancy stuff, or sound-crunching (DSP style).
I don't like the programming models GPUs force upon me, and I especially dislike Nvidia's marketing, design choices, and the way they design their coding packages. -
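To make that appeal concrete: on a many-core x86 design like Larrabee is pitched to be, a toy ray tracer is just ordinary parallel CPU code, with no shader API in the way. Here is a minimal sketch in Python; the scene, camera, and worker-per-core mapping are illustrative assumptions only, not anything from Intel's actual toolchain:

```python
# Hypothetical sketch: raytracing as plain parallel CPU code,
# the kind of freedom a many-core x86 part would offer.
from multiprocessing import Pool

WIDTH, HEIGHT = 64, 32           # tiny ASCII framebuffer
SPHERE = (0.0, 0.0, -3.0, 1.0)   # center x, y, z and radius (assumed scene)

def trace(pixel):
    """Shade one pixel: '#' if the eye ray hits the sphere, '.' otherwise."""
    px, py = pixel
    # Pinhole camera at the origin; ray direction through this pixel.
    dx = (px / WIDTH) * 2.0 - 1.0
    dy = 1.0 - (py / HEIGHT) * 2.0
    dz = -1.0
    # Ray-sphere test: solve |t*d - c|^2 = r^2 and check the discriminant.
    cx, cy, cz, r = SPHERE
    a = dx * dx + dy * dy + dz * dz
    b = -2.0 * (dx * cx + dy * cy + dz * cz)
    c = cx * cx + cy * cy + cz * cz - r * r
    return '#' if b * b - 4.0 * a * c >= 0.0 else '.'

if __name__ == '__main__':
    pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    with Pool() as pool:                      # one worker per CPU core
        shaded = pool.map(trace, pixels)
    for row in range(HEIGHT):
        print(''.join(shaded[row * WIDTH:(row + 1) * WIDTH]))
```

The same map-a-function-over-data structure would cover the DSP-style sound-crunching mentioned above, which is the whole draw: no graphics API dictating the programming model.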
How about Larrabee + SLI GTX 300s?
-
Larrabee is pretty much a project to prove that the concept WORKS at this point. They've so far managed to get tons of P4 cores to run games, and they know that each added core gives a near-linear performance boost. But the performance of even 32 cores is terrible, and the method they're taking will be extremely costly and have HUGE power consumption. For Larrabee to become reality and compete with ATI/nVidia at this point, it would need several HUNDRED x86 CPU cores. They'd be far better off cost-wise just using the standard method of building video cards.
I'll be surprised if they actually make anything out of this. -
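The "several hundred cores" figure follows from simple linear extrapolation. A back-of-the-envelope sketch, where the frame rates are assumed placeholders rather than any published Larrabee numbers:

```python
# Rough scaling arithmetic for the claim above. All numbers are
# illustrative assumptions, not measured Larrabee or GPU figures.
CORES_TESTED = 32        # cores in the demo the post mentions
FPS_AT_TESTED = 10.0     # assumed frame rate at 32 cores ("terrible")
TARGET_FPS = 60.0        # what a competitive card would need

# "Near-linear performance boost" per core => fps scales proportionally.
fps_per_core = FPS_AT_TESTED / CORES_TESTED
cores_needed = TARGET_FPS / fps_per_core
print(f"~{cores_needed:.0f} cores to hit {TARGET_FPS} fps")  # ~192 cores
```

Under those assumed numbers the extrapolation lands near two hundred cores, which is the ballpark the post is gesturing at.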
MMMmmm. So who's Sony, and who's Nintendo? This is déjà vu.
-
What do you mean by that exactly? Do you mean what the GPU accelerates, or what it's possible to do?
You're not disliking APIs, are you? Because companies making their own APIs is the last thing the industry needs; it would put a bullet in its head. -
Reading back on a few things, I think Larrabee MIGHT actually work well. If they can design a 2 GHz Atom CPU with a TDP of 3 W, like they just did, I bet they can get a 100+ core GPU ... thing, within the watt usage of other mainstream high-end GPUs. -
I'd bet any firm would pay out the ... bum, for that kind of power savings. -
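The watt math roughly checks out on paper. A rough sketch, taking the 3 W Atom figure from the post; the per-core share and the GPU TDP range are guessed ballparks, not measured figures:

```python
# Rough power-budget check on the Atom comparison above.
ATOM_TDP_W = 3.0    # whole 2 GHz Atom chip, per the post
CORE_SHARE = 0.5    # assume ~half the chip's power is the core itself
N_CORES = 100

many_core_w = N_CORES * ATOM_TDP_W * CORE_SHARE
print(f"{N_CORES} Atom-class cores: ~{many_core_w:.0f} W")  # ~150 W
# Mainstream high-end GPUs of this era draw very roughly 150-250 W,
# so "within the watt usage" is at least plausible on paper.
```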
I found this pretty good article about it:
http://arstechnica.com/hardware/news/2007/04/clearing-up-the-confusion-over-intels-larrabee.ars
Larabee will sting?
Discussion in 'Gaming (Software and Graphics Cards)' started by MonkeyMhz, Apr 6, 2009.