If true.
http://www.semiaccurate.com/2009/10...x275-gtx260-abandons-mid-and-high-end-market/
-
-
Kamin_Majere =][= Ordo Hereticus
If true, it's very, very bad for us consumers (and then ATI will single-handedly control the market and dictate what we get forever).
But somehow I doubt this is true (or at least not that bad) -
Wow, and just to think that ATI was almost out of the picture about a year or so ago, and now look at them. I believe this is a cycle that goes on between ATI and Nvidia, and between AMD and Intel: every few years one takes the lead, and then after a while the other leads again. Just my observation. God Bless
-
So when is either Intel or AMD slashing their CPUs so the other one has full control of the market? XD
Hmm, it'd indeed be bad for consumers if only one company becomes the source for a given product. I hope Nvidia stays competitive in some way, though. They could abandon the high-end market, but there's money to be made in the mid-range. In fact, that's where ATI's whole marketing success started. -
I have a hard time believing it, for a variety of reasons ranging from the fact that the website is called semiaccurate.com, to the writing style, to the fact that Nvidia would essentially be exiting the GPU business if they did that.
Today's not April 1st, is it? -
I am also guessing that the start of it was the soldering failures, and then nothing really new coming out of the pipeline, just redoing current products with smaller dies, like going from 65nm to 55nm. Whereas AMD just kept a low profile and kept working at it. Again, just my take on it. God Bless
-
-
Well, there was another thread created somewhere about Nvidia trying some new technology to compete in the CPU (or at least number-crunching) world rather than the gaming world.
I'm not entirely up to date on all this, but it seems as though Nvidia is shifting their priorities and company goals, if these "articles" are to be believed. For a long time, they've mostly wanted to hold onto that "most powerful graphics solution" crown, so maybe they've decided to make a change? -
Even though I am more of an AMD/ATI fan, I hate it when the competition closes up or gives up, as this hurts the consumers.
-
-
I call shenanigans. Nvidia is not going to leave the high end gaming GPU market entirely to ATi. If they do, I swear I will eat my 8800GT on camera and post it.
-
I'm fairly certain that all of the GT200-based GPUs were meant to be high end. Even though the article says otherwise, I wouldn't really classify the GTX 260 as "mid range", especially in comparison to the other lower-end GPUs that were supposed to be released.
-
The GTX 275 is more debatable, but it's definitely not a high end $400-$500 card either. -
I figured it was more like: GT200 = high end, G92 = midrange (due to the advent of the GT200), and the rest were pretty much low end. -
I wouldn't consider the HD 4870 high end either, unless it was the X2 version.
That'd be like considering the i7 920 a high end CPU, even knowing that the i7 975 exists. -
-
I don't see why people are so surprised; I'm not sure why this didn't happen sooner. Nvidia hasn't just been competing with ATI for GPUs, but with their own ridiculous rebranding system. Seriously, how stupid do they think people are? Why would anyone pay more for the same thing that uses just a few fewer watts? I remember when the first GTX cards came out, and then a few months later people had to decide if they wanted to buy a GTX 260 or a GTX 260....
More and more lately, companies seem to think slapping on generic stickers is enough to market products. ATI isn't perfect with their naming either, for example the 4830 and 4770. Unfortunately for Nvidia, ATI's products offer more than just a die shrink, and that's a major reason why people are buying them now. When people have money to burn, they don't go buy the same thing again; it's just common sense for most. -
-
Red_Dragon Notebook Nobel Laureate
I also think high end = 4870/90...at least for now....
-
The i7 920 is pretty high end performance-wise, since most high-end users would choose to overclock it to over 4GHz. Only a select few dumba55es would buy a Dell with an i7 975. -
At least ATI offers much lower prices for their desktop GPUs...
I hope this trend continues even after NVIDIA exits the high-end market -
"Nvidia can make chips and sell them at a loss, or retreat from those markets and lose less money."
God, I love it when what we learn at uni happens in real life. At the shutdown point, a firm's marginal revenue equals its average variable cost, and its economic loss equals its fixed cost. But since Nvidia's marginal revenue has dipped below average variable cost, they are now incurring the fixed-cost loss plus some variable-cost loss. Hence the best option is to pull out of the market in the long run, so that they only incur the fixed-cost loss: they cannot escape their fixed cost in the short run, but they can escape their variable cost.
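Roughly, the rule being described, as a minimal sketch with made-up numbers (purely illustrative, not Nvidia's actual costs):

```python
# Textbook short-run shutdown rule, with hypothetical numbers.
fixed_cost = 100.0        # sunk in the short run (e.g. fab contracts, R&D)
avg_variable_cost = 8.0   # per-unit cost that stops when production stops
price = 6.0               # per-unit price, standing in for marginal revenue
quantity = 50             # units sold if the firm keeps producing

loss_if_producing = fixed_cost + (avg_variable_cost - price) * quantity
loss_if_shut_down = fixed_cost  # shutting down leaves only the fixed cost

if price < avg_variable_cost:
    # Selling below AVC loses the fixed cost AND part of the variable cost,
    # so shutting down caps the loss at the fixed cost alone.
    print(f"Shut down: lose {loss_if_shut_down:.0f} instead of {loss_if_producing:.0f}")
else:
    print("Keep producing: each sale covers its variable cost")
```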
So we should expect Nvidia to lay off workers. -
I'm calling BS on this article. First, the article was extremely unprofessional, and I have never heard of that website. Also, this website is the only source of this information, making it really unreliable.
-
moral hazard Notebook Nobel Laureate
-
Makes perfect sense for them to kill these cards, as they're not going to be selling many of them with ATI's 5000 series desktop cards already on the market and GT300-based desktop cards coming for the holiday shopping season.
If they don't cease production on these cards now, all they will end up with is a large overstock of cards that they will have to sell at a loss later. -
While it is possible that this is true, I would wait to hear it from someone other than Charlie Demerjian before leaping to conclusions. Does anyone know what it is with him and Nvidia? His articles on the subject are always written as if Nvidia killed his dog or something.
It's hardly the catastrophe that he makes it out to be. AMD/ATI will release the 57XX series sometime in October and it is quite likely that compared to these, Nvidia's higher end chips will no longer make sense at the price points where it can make a profit selling them. They'll be back in the high end with the GT300 chips shortly. -
Demerjian is a moron, and obviously has some deep-rooted hatred of Nvidia stemming from being picked last at football one too many times, or being called dough-boy in grade school. In fact, there's a whole blog dedicated to him, aptly named 'Charlie Demerjian is a douchebag.'
http://tinyurl.com/ycc6e2h
Picture of bag in question.
http://photos1.blogger.com/photoInclude/blogger/5437/1673/1024/charlie_bunny_a.jpg -
-
Bah, I can't believe this thread. Do you people believe everything you read online? Read BrandonSi's articles; that's right, now there's another article for you all to read. I hope that somewhere between reading both you don't have a mental breakdown because the thin walls of your painful reality have just collapsed. Realize that not everyone on the Interwebs has to be telling the truth. Now let's all take a deep breath, wait, and the truth will come out eventually.
-
Regardless of whether any of this actually comes true, I think this article and its blogger are an absolutely unreliable source. I'd peg him as an ex-sales rep for the company, because he sounds disgruntled at the "management" for some reason. I highly doubt a company with Nvidia's track record would throw in the towel and disappear from the market over two bad quarters. ATI struggled for much longer than that and managed to stick it out through several years of "bad times".
-
-
-
Will nVidia be back in business with the GT300? Yes. Fortunately they will have a replacement for the GTX285 and GTX295 pretty soon, but it will take them longer to replace the GTX275 and 260.
If I'm an exclusive partner, I would definitely think twice about the exclusivity agreement. -
Didn't see that coming...
-
-
Actually, I've been hearing about stuff like this ever since I started looking at more powerful mobile phones over the summer...
-
The 200's are probably going to be EOL, but that would most likely be because the 300's will be coming soon. Didn't the same guy who wrote that story also claim the 295 was scrapped and would never come out? I could be wrong there, but I know he seems to really hate Nvidia.
Oh well, unless I read this from a more reliable source, I will just assume the 200's are EOL to make room for the 300's, and that is good news to me. -
Nvidia is making the right move. GT200 was a marketing failure from the beginning: too expensive to produce, and since ATI was able to sell a GPU nearly as powerful for much less, Nvidia had to drop prices and swallow losses from the get-go. Now that the 5850 and 5870 are out, the GTX 285 is completely outclassed and overpriced; Nvidia just can't sell these GPUs anymore. Then again, this is just a timely end of production. There are plenty of units still on shelves, and for the next few months you can probably nab these GPUs for less than $200.
If Fermi fails, Nvidia won't be making desktop GPUs like they have in the past. It's even in question whether Nvidia would survive if Fermi fails. -
-
What worries me even more is that the new NVIDIA GPUs might not be focused on gaming but on HPC... not good for us gamers...
-
Rather predictably, Nvidia has denied Charlie's claims.
-
Alexrose1uk Music, Media, Game
Nvidia have always denied claims, even when they've turned out to be true.
I don't like Charlie's style, and he is anti-Nvidia whenever possible, but on the other hand, out of his articles I've read, he usually does seem to have at least an aspect of the truth, and I strongly suspect he has some reasonably placed internal sources of information.
You have to remember this is the same Nvidia right now that A) showed off a FAKE Fermi card, B) denied it completely when asked, and C) was later forced to retract and admit it was NOT the real card.
I'd agree that, this being Charlie, we need to take the biased aspects with a pinch of salt, but I wouldn't be surprised if there's a large element of truth there as well.
GT200 is deeply cost-ineffective right now, and G92b is now pretty damned old. GT300 parts are needed damn quickly for Nvidia to retain their current market share, which has already begun to ebb towards ATI.
For those who DON'T know who Charlie is, I'm pretty sure he USED to work at some of the large IT news websites, like Fudzilla; he's not an unknown, and neither is the attitude he has developed over the last several years against Nvidia (and at least some of it is not entirely unfounded).
As even your article admits, Althernai.
-
The most disturbing part of this whole thread is not whether the article might or might not be accurate, but that a good half or so of the commenters believe it completely.
I still don't believe it, as cutting the GTX 260/275/285 would leave Nvidia with effectively nothing for the holiday season (it'd be a miracle if they could produce enough GT300 chips to meet demand the moment it launches), so I'm going to remain skeptical until other, more reliable sources (meaning someone who doesn't obviously hold a grudge against Nvidia) surface. -
Well, even if it is true, one thing I really don't want is for Nvidia to completely drop out; Nvidia held the crown for a long time. (And when did the problems with the new GPUs start, or with the GDDR5 controller? I haven't heard of any, and people seem to enjoy overclocking them like crazy.)
Nvidia have always been my favourite, even if I have moved to ATI -
Even if it's not true, we're still allowed to speculate as if it were.
-
Even if it's not true, it should be.
Nvidia would be better off. -
-
Really? I thought 8800 GTs tasted great with a side of smoked E8400 and snazzy IDE cable noodles.
Back on topic, if Nvidia kills the 200 series, why are some of the even older series still on the market? -