My buddy and I were talking about graphics cards and how it all just boils down to red and green.
But we discussed whether Alienware (well, probably Dell) could make a GPU exclusive to Alienware products that could stand up to red and green's annual releases.
Well, maybe not exclusive, but a brand, if you will, that could compete with the top kings in GPUs.
Desktop and mobile versions?
Highly unlikely, but it was just a really random idea we got onto after talking about the red vs. green competition, LOL.
Thoughts?
-
-
They would pretty much have to buy one or the other. And coming from someone who paid half the price of a 680M and got something with the same or better performance: let them have Nvidia. Then the overpriced, bloated junker-company trifecta will be complete. If Nvidia went with Dell down the death slope to hell, well, fine by me... hahaha
-
That would be interesting, but unfortunately Dell would have to spend years on R&D to even produce a GPU that can compete.
Or, Dell could sign a contract with Nvidia/AMD for a custom GPU (like Apple usually does). But Nvidia/AMD would not want to do this, as they would lose profit from not being able to sell those GPU chips to other notebook manufacturers.
-
bigtonyman Desktop Powa!!!
As others have said, it would be an interesting concept, but I doubt it will ever happen.
-
failwheeldrive Notebook Deity
Not that we need another AMD vs Nvidia thread lol. -
I think if anything, Alienware could do a factory overclock and tune on the GPUs, but it would probably get messy. As much as I would like to see another competitor, I don't see it happening.
-
Don't they already do a factory overclock to a certain extent? My Nvidia runs at 718 MHz core, but it will boost to 758 if it's OK thermally?
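(A minimal sketch of how you could check that boost behaviour yourself, assuming an Nvidia card with drivers installed and the pynvml / nvidia-ml-py Python bindings; the 718/758 MHz figures above are just that one card.)
```python
# Minimal sketch: read the current and maximum rated graphics clocks via NVML.
# Assumes an Nvidia GPU with working drivers and the pynvml (nvidia-ml-py) package installed.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older pynvml versions return bytes
        name = name.decode()
    current_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    max_mhz = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    # On a factory-overclocked card the max rated clock sits above the reference spec,
    # and the current clock climbs toward it when thermals and power headroom allow.
    print(f"{name}: current {current_mhz} MHz, max rated {max_mhz} MHz")
finally:
    pynvml.nvmlShutdown()
```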
-
This would never happen. Nvidia and AMD have spent years and years developing their GPUs. Dell wouldn't be able to compete. If they did, I certainly wouldn't want or trust it in my system.
-
Would be cool, but will cost too much. It's cheaper to outsource to NVIDIA and AMD.
-
Alienware would have to begin from scratch, rendering this a preposterous proposition. At the very most I'd expect them to overclock and rebadge certain cards.
-
It's taken Intel, a company that has more money than they can spend, almost ten years to get something that can fit into the low end.
There's no point for any company to try to compete in x86 graphics besides Intel/nVidia/AMD. -
This was just an idea, LOL, not that it will happen, but I was just saying. What if... what if they had been working in the dark on a graphics card specifically for Alienware, testing it over a 10-year period, and decided to release it...
Would you look into it and buy it? Or wait to see what the rich people do first, the ones who buy it just because and post reviews on YouTube like the tech people, lol. Not bashing rich people or anything, lol.
I agree with some people that they would probably just brand their name on a card and overclock it or something, lol.
But it's a pretty cool idea IMO. -
Dell does not have the money to try to squeeze into another closed market.
-
-
Meaker@Sager Company Representative
See what happened to 3dfx.
Another GPU vendor would have to exist just as AMD/Nvidia do to compete, and be across all markets. -
-
Meaker@Sager Company Representative
It's hard, due to the amount of investment needed to design these chips, for there to be space for three large players.
Maybe PowerVR will move up from mobile devices, but that will take some time. -
Lol, watch someone create a revolutionary GPU for desktop and mobile that will blow everything out of the water. Like for mobile, for example, someone could create a chip that not only saves power but has max performance, performance just as good as, say, the newest desktop GPU, just in mobile form, lol.
-
IBM is probably working on stuff like that, but it's probably not going to see the light of day for years, if at all. And if it does, it's going to be expensive.
-
-
Meaker@Sager Company Representative
It's not just about chip development either; it's about the drivers (perhaps more so), and the development time for those is also expensive in time and resources.
-
Looking at this from an economics standpoint, it would be dumb to do this simply because a) they would have to "catch up" to two huge competitors, and b) getting the word out ASAP would be the hardest part, because they would lose money on the GPUs due to no one knowing about them.
And TBH, I wouldn't buy their GPUs at all, just because I trust EVGA and I swear by them.
Basically, "if it ain't broke, don't fix it." Let Alienware stick to what they are good at. -
Meaker@Sager Company Representative
Let's maybe put it this way: Intel tried to execute on an add-in graphics card to compete against AMD/Nvidia, and they simply could not get a competitive product out before Nvidia and AMD had iterated again and left their product behind six months before it would even launch.
-
Lol, but what if it had technology that wowed you and you got the urge to check it out, lol.
-
Meaker@Sager Company Representative
It comes down to speed, DX level support, and drivers, with price sorting out closely matched cards.
Screw up on one of these and you can go home. -
-
Meaker@Sager Company Representative
Yes, look at what a slip-up with Enduro did to AMD, and that was with games still able to run. Going back to the early days of 3D cards, you had REALLY bad drivers that could not even run a game, or where only certain driver sets would work at all with certain titles.
-
Back in the '90s it was often the case that when your card was out of date, it literally would not play new games. And this could happen within 6 months, even to the most powerful cards.
And people wonder why consoles were so huge back then... (the PS2 went on to sell over 150 million units) -
Lol, back in the '90s. Don't get me started, LOL.
-
Hmm, the '90s. Let's see, the main video cards I remember were ALi, Trident, SiS, and Matrox (they are still around). I think 3dfx was late '90s, before Nvidia took them over and screwed them up. I know there were a few others as well. Now we have basically Intel, AMD, and Nvidia. Head into certain types of CAD and medical imaging and Matrox is still the big boy, even after losing out in the 3D gaming market after the Matrox Millennium cards. (But they were still better video capture cards than even the new crap we have now for low-res video.)
Am I the only one that remembers a staggering 128 KB of video RAM? I hit the big leagues with 1 MB! -
I do remember a time when 128 MB of VRAM was the next huge thing, but not much before that as far as GPUs go.
Anyway, pertaining to the design of a new brand of GPU, as others have said, you'd need a ton of cash to sink into R&D, and years, before even putting out something competitive at the low end. Also, max performance and low power consumption isn't something that will be easily done, especially considering that we're getting closer and closer to the actual limit of what the laws of physics allow us to do with the current materials and designs we're using to make CPUs and GPUs. Heat dissipation is sort of the hot topic right now in research pertaining to components like CPUs.
Now, I won't pretend to be an electrical engineer or to understand these things at a very deep level, but a seminar I attended about three weeks ago was about the need to tackle the heat dissipation problem related to the thermodynamics of electronics, the minimum voltage necessary for the transistors, etc., and the prospects of using graphene and CNTs as new materials for making CPUs and GPUs.
The initial idea is a nice one, but it's not something I see happening any time soon. The possibility of an existing player in a different market being able to roll something out is there, though. But again, that would require pouring massive funding into R&D for something that may or may not work, and while R&D is extremely important in the field of electronics, not everyone is willing to dedicate tons of cash to it. You can be certain that Intel does it, IBM does it, AMD does it with what resources they have, and nVidia does it as well for sure. I'm sure the other chip manufacturers do it too, but I'd be surprised if most of them did it on the same level. -
LOL. I remember back when there were like half a dozen or more graphics card makers. The market was a total mess, drivers were insane and horrible, and the companies mostly competed with ads instead of product quality.
I am all for competition, but only if it's done right. Things are much better today with only 2 big companies in the mix (as long as AMD and Nvidia are at each other's throats we should be fine; if they buddy up like the cable companies and printer ink makers have, then we're screwed).
I remember an ad from the early '90s in some computer magazine that was one of the most insane things I have ever seen, lol. It was a full-page ad of a guy who looked like an over-the-top version of the villain from Spy Kids holding this ENORMOUS and horribly ugly-looking graphics card. Apparently it had 8-10 (can't remember which) of the most powerful graphics cards on the market, from each company, all on one board. It was called the "Omega 3000" or some crap like that, and the caption under it was "We took the 8/10 most powerful graphics cards in existence and put them all on one board! Don't ask us how we did it though!"
All around it there were bogus-looking reviews and idiotic captions like "It's the last graphics card you'll ever need!" and "It will still play games 10 years from now!" (utter BS).
The price was like 3-4 grand too. And you know at least one sap bought that crap and believed everything the ad said. They must have been real disappointed when it was left in the dust within 2 years (if it even worked at all. Seriously, how the hell would 10 graphics cards, with extremely differing levels of quality, all made by different companies, ever be made to work coherently together?!)
Today, innovation is going on at a somewhat slower pace. The distance between 8/16-bit 2D sprites and advanced 3D animation is a hell of a lot larger than the distance between advanced 3D animation and what graphics makers are trying to achieve today (near photorealism).
I am betting that if you somehow crammed 10 GTX 680s (or 5 GTX 690s) into a desktop and overclocked the hell out of them, you would probably still be able to play games in 10 years. Albeit at base settings, and there is a good chance they would all burn out on you before the 6-8 year mark.
An Alienware graphics card (just a discussion i had)
Discussion in 'Alienware' started by Gunnkeeper, Jan 19, 2013.