I feel the same way, except my game of choice is Spore. I wish LBP would be released on PC, because that would definitely steal some months of my life.
-
H.A.L. 9000 Occam's Chainsaw
-
We're not playing Zork or anything anymore - the games have GUIs, and to build on the complexity of the interface, the graphics need to improve too. Myriad games are released weekly, and the bulk of them never see the spotlight, because what sells is fun, easy-to-pick-up games. There are still games out there that provoke the imagination and 'pull you in'. -
The statement that you need good graphics to get sucked into a game is nonsense. The PlayStation 1 Spyro games created the best environment/setting to lose yourself in, thanks to creative use of colours. -
-
You might not notice you said it, but Spyro created a good environment for you because of the GRAPHICS. I remember Donkey Kong 64 was a phenomenal-looking game for the N64. Majora's Mask was so impressive with its environment and detail. These games were very fun, but they were also so new in what they did. When LoZ went 3D it was AMAZING. You could do so much in the game. It felt like a breath of fresh air. 12 years later, third-person action is completely normal for character control.
Would you play Spyro again had it just come out now, in 2010, with nothing but a texture/polygon update and the series never having existed before? I know I probably wouldn't get Mario 64 and many other games. I'm older. I like different things than when I was a kid. I can't really translate the entertainment value, because entertainment was so different back then. I was so fascinated with many games, especially because they were so new and groundbreaking on the PS1 and N64.
Gameplay is always pure win for everyone; however, how many Wii games do you own? Aren't they the most purist 'gameplay over graphics' games around? They're downright fun, but I'm not looking to get those games.
The best-selling 'gamer' titles do have GOOD graphics. Blizzard's games have excellent graphics. WoW is 5-6 years old or so, yet with each expansion they update the graphics and detail the environment much, much more.
I think most people are more disgruntled with the use of physics in games than with graphics. Graphics engines are now heavily physics-based, which taxes the hardware a great deal.
Think of it this way - there will be a point in time where graphics can no longer improve in terms of visuals. They will be at the utmost visual complexity they can reach; nothing can look more real than life itself. As the market progresses towards being able to run extreme graphical content easily, we will see a slow transition back to much more 'imaginative' gameplay. Graphics are already good enough for the majority of people; that's what brought forth the Wii. Games with great gameplay are out there. They're just not games that 'hardcore' gamers would play.
Compare the average game from 2000 to the average from 2010: there is a vast improvement. Look at 1990 compared to 2010. Imagine what visuals will be like in 2020 and 2030.
Especially if we get exascale computing for personal use, we'd be able to crunch out very, very impressive results for games in the future.
Back on topic - I think ATI will still hold the market share for new GPUs, since many people are waiting for Fermi so the 5000 series gets a price drop. -
I had a great time reading that article and would mostly agree with that guy.
The article aside, I have said it before: I hate being taken for a fool, and although I'm not a particular fanboy of Nvidia, I always tended to lean towards them. But after renaming upon renaming, and the 8600 fiasco (my GF's PC's 8600 burned out, and... whatever, long story), I just feel, well, marketed to ><.
And I am stubborn. My desktop's card is an ATI and I have had nothing but enjoyment from it, and because I am stubborn, there is no way I am giving my money to people who I feel see me as nothing more than a money bag. So I'll be an ATI fanboy - until ATI pulls that same crap, then I'll just give up.
Anyway, that article was a great read. -
I like your setup. I use vintage JBL monitors, the L series that JBL created: studio monitors for home use.
Like your AKGs - they look like the studio K240s? I've got the Sennheiser HD650 and Audio Technica AD900 headphones with a dedicated amp, like yourself. Audio is fun, right?
I've got the ExactMat also, but hate it. Thinking of ordering a cloth mat, the Goliathus or the SteelSeries QcK.
-
The AKGs are K271 MkII.
I don't have a dedicated amp, just an external sound card; thinking about the amp though.
Audio is the best.
I love the mat, but I'm not too sure about the armrest. I had a cloth one before - never again. It's all about personal taste, I think.
/OFF TOPIC I FEEL T3H BAN STIKZ ABOVE MY H3ADZ! -
From what little I understand about transistors and whatnot, it seems like Nvidia is trying to jump too many steps in the evolution of the technology, or is being lazy/stupid in how they're designing the card.
It also seems like whoever they're getting to make the cards has issues manufacturing them, or the tech to control manufacturing defects isn't there yet either.
I should study more. -
H.A.L. 9000 Occam's Chainsaw
IDK... it just seems to me that TSMC is taking the cheapest way out possible, even if that means sub-par transistors and substrate layers, and that someone else could step up to the plate and show TSMC how it's done. Granted, ATI did what NVIDIA can't, but they had to re-design to compensate for TSMC's shortcomings. -
The problem with Nvidia is that in the past year or so they have been trying to become a company that's NOT a GPU maker. They have dumped considerable resources into their Nvidia Ion and Tegra mobile platforms.
Unfortunately for them, they've been hit by major problems. Intel filed a legal case against them, since they are not registered with the SEC as CPU manufacturers - a legal battle that Nvidia will likely lose.
The only gadget to use the Tegra is the Zune HD, and, devastatingly for Nvidia, they are NOT a development partner for Windows Phone 7.
These are gigantic setbacks indeed, and Nvidia released a statement to investors last quarter that they are indeed refocusing on their core business: making GPUs.
The mobile refreshes, I believe, are a direct effect of this former shifting of priorities. Fermi was also definitely a 'casualty', as I believe it was set back in favor of Tegra. Now they seem to have no choice but to focus on GPU tech for both the professional and enthusiast markets.
That could bode well for everyone, as they will not only be forced to come out with newer tech, but also with cheaper products to fight off the ATI tide. -
Would I buy it in 2010? YES! My first console was a PS2 (which I bought 2-3 years after it was released), and I started with games like Final Fantasy X, TimeSplitters 2, and so on. So by the time I got to know Spyro, its graphics were already very outdated. Nevertheless, it's one of the best games I ever played, thanks to its gameplay and brilliant colours.
I own 0 Wii games, but that has nothing to do with their graphics. Nintendo consoles tend to bring out about 5 good games in the first 1-2 years after the console's release; after that your console is just a dust collector. Nonetheless, I bought a GameCube just for Zelda: The Wind Waker. Yes... just for that one game. Why? I love good gameplay with flashy colours ^^ (like a moth to a flame, rofl).
About the Fermi release, I'm with you. I'm also waiting for Nvidia to release a super strong GPU just so the ATI 5 series will drop in price. Nvidia's reputation for quality is even worse than Toyota's to me now.
I think I'm one of many who just want a GPU that will last a long time and be affordable. Having answered all your points about games and graphics, I hope you're convinced that hardcore gamers like myself care more about gameplay than graphics. Though I must admit that new graphical wonders like Assassin's Creed would not have had as many fans with PS1/PS2 graphics... -
But if Nvidia brings out a super strong GPU, it is going to be priced higher, because Nvidia really can't afford to drop prices. ATI, however, is in a position to drop prices to put the hurt on Nvidia. If we see a price drop at all, it's because ATI is trying to make Nvidia lose market share, not because Nvidia is trying to put the hurt on ATI.
-
Exactly. That was my thinking. Fermi and Nvidia's route just don't make much sense when the i7 quad is cheaper and more powerful; I don't need a GPU to do general processing. Some rail against ATI for their lack of 16 reference frames in DXVA support, but honestly the i7 barely uses 10% CPU to run the best 1080p encodes I've seen.
Nvidia should focus on the GPU. As it is, most users still view Nvidia's products as GPUs, but since they no longer really are, it throws off the gaming community. Nvidia didn't just ruin it for themselves, they ruined it for the rest of us.
If Nvidia had been on board making GPUs with full DX compliance, we'd have far more DirectX 10 games, and DirectX 10 would be what DX11 is now, since MS had to strip so much out of DX10 because of Nvidia's refusal to comply with the rest of the gaming community. That's why I don't like Nvidia: they didn't just screw themselves over, they screwed us all, the gamers and the developers too.
-
-
"A broken clock is correct twice a day"
Charlie's nVidia bashing is so constant that eventually he was bound to get it right... I think they really have dropped the ball on Fermi.
I just put together a desktop for the first time in a long while (not for me), and it was unbelievable how much more appealing ATI's products are at the moment, and will be for a while. (I'm betting Fermi will flop pretty hard; this article makes a very good case for that idea, much better than Charlie's usual stuff.)
Knowing the market's history, though, and with nVidia not implementing Tegra into anything substantial, I think we will see NVidia come back strong into its traditional market with a solid traditional GPU. Their new ideas have failed, and they will retreat back to their strength; their last quarter's statement said as much. (Keep in mind, though, it will be a year at the earliest before this happens.) -
Microsoft, game developers, ATI, and Nvidia have a committee, set up by Microsoft, to decide what gets put into each version of DirectX. Notice Nvidia hasn't been playing nice: they have never supported DX10.1 and still haven't come close to supporting DX11. That isn't an accident, and couldn't be, when all parties know what is going into each DirectX years in advance. You think the DirectX 12 specs aren't being worked on now? They're probably even finalized and in development. -
-
H.A.L. 9000 Occam's Chainsaw
-
-
Taken in context, "hasn't ever" is probably grammatically incorrect and should be "has never". I'm no literature professor, but I know how to read, write, and listen.
Nvidia has, in fact, released DX10.1 GPUs. All of their 40nm parts are DX10.1 as well, and they have been available for the past 6 months. -
masterchef341 The guy from The Notebook
-
Looks like someone jumped the gun on this:
http://www.engadget.com/2010/02/22/nvidia-geforce-gtx-480-and-gtx-470-fermi-cards-launching-march-2/ -
masterchef341 The guy from The Notebook
Surprise, surprise: the truth is not what the extremists say.
-
It's been confirmed: a March 27th release date for Fermi, in the US at least.
So what that guy said may not be the truth; I never thought it was. -
Lostinlaptopland Notebook Consultant
A launch does not necessarily mean availability.
-
-
My suspicion also. Launched, but not readily available.
I'm still curious whether it will release consuming the 250 watts that previews suggest, and whether people will need larger cases to house this thing. If so, then that's a fail right off the bat on launch date.
Anyone going to PAX East to find out? -
-
Fermi "naked photos" leaked.
Very large GPU die.
For comparison, the desktop HD5870:
-
The Fermi looks a hell of a lot like the 3dfx Voodoo5 cards, which came out just before Nvidia bought 3dfx... hmmm
-
I saw that on Engadget. I secretly hope it doesn't blow my 5870 away ><
EDIT: and if it doesn't, be sure I'll be here telling you all how bad Nvidia really is! -
-
Hell, it's massive!
Maybe they're getting all the 3dfx people to fix Nvidia, haha. -
-
-
Lostinlaptopland Notebook Consultant
Exactly. But it is still going to be huge. 50% bigger than the 5870, isn't it?
Or is that just the transistor count? -
-
-
Lostinlaptopland Notebook Consultant
And, bar the 8 series, they have always struggled to beat them or lost. That line of thinking does not bode well.
-
That special cooling system needs holes embedded in the PCB itself? Can't be good.
Does anyone make green PCBs anymore? -
-
Lostinlaptopland Notebook Consultant
I think blaze-senpai was referring to the quarter-circle cutouts on the PCB. I have heard these have been done before, though, and are nothing new.
-
There's no way in the world the GPU in that picture is ever fitting into a notebook. It's going to have to get cut down... a lot... and whatever is left will have to go up against a notebook card based on ATI's desktop HD5770. -
-
-
-
It doesn't seem this has been posted yet:
http://www.youtube.com/v/vpdPSZB8A8E
Last edited by a moderator: May 6, 2015 -
nice video
-
Nvidia's Fermi is Broken and Unfixable.
Discussion in 'Gaming (Software and Graphics Cards)' started by anexanhume, Feb 17, 2010.