OK, I know I overdid the title, but I wanted to get people's attention. I have a HUGE question and a little something for everyone to think about. Whenever you see a thread titled "8700 Screenshots" or "Crysis Screenshots with 8800", you probably look to see how nice and crisp the game looks, or maybe to compare GPUs for a notebook you're thinking of buying. But I've run into a problem with "screenshots". Say you have a GMA 950 and you're looking at 8600M screenshots of the new Call of Duty (2 or something, I forget). When you look at them, they can't look any better than a GMA 950 would make that game look (assuming all other specs are similar), yet people browse these threads admiring how nice it looks, when technically it can't look any better than what your own card can show you. So if you're comparing an 8600M and, say, a Go 7600, they shouldn't look much different, because both cards are higher performance than a GMA 950, which couldn't actually produce the crisp picture an 8600M user would see. The same goes for TVs: they show you ads for how HD TV looks, but on your regular non-HD TV it can't look much better than a DVD of that same scene, right? Does everyone see what I'm trying to get at? It doesn't make any sense to me. If someone can tell me I'm wrong, with some explanation of how your GPU can show these screenshots the way they looked on the card that made them, even if your card is twice as weak, please explain it, because I really don't understand. So, thanks for reading my long post, and tell me what you think...
-
Most screenshots are posted with an FPS counter like FRAPS, or at least a discussion of the FPS rate. A screenshot by itself doesn't mean anything.
-
Well, good question... I don't know, but I would think that your graphics card would see that you're trying to display two pictures beyond what it can produce, so possibly it would... see which one is higher quality and reduce both to an equivalent scale the GPU can show. So it's still distorted from the true picture, but an exact, to-scale equivalent. Well, that's my best guess; I tried my best.
-
I'm really confused...
All video cards can generally create the same exact image of a game - it's the frames per second that's the important variable. For example, someone with low-end integrated GeForce 6100 graphics could create a screenshot that looks exactly like this:
Exhibit A: Crysis on "hacked" Very High settings @ 1920x1200
But obviously, it wouldn't be playable (heck, it's not even playable on my Go 7950 GTX). Still, that screenshot looks just as pretty on my screen as it does on yours. What that screenshot says is, "A Go 7950 GTX can run Crysis at 1920x1200 with Very High settings and get 4 FPS".
It's not so much about how good it looks as how good it can look at a playable framerate. That's why it's important to show screenshots with FRAPS turned on, or at least say what kind of framerates you're getting with a certain card. Or maybe I've missed the point entirely? -
There is no difference between a screenshot taken on a computer that features an 8600M, vs one from a computer with a GMA 950, as long as they are both taken at the same resolution.
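For anyone who wants to check that claim for themselves, here's a quick sketch in Python (assuming Pillow is installed; the file names are just placeholders) that compares two same-resolution screenshots pixel by pixel, to see whether they really are identical:

```python
# Compare two screenshots pixel by pixel.
# File names are made up; assumes both images have the same resolution
# and that Pillow is installed (pip install Pillow).
from PIL import Image, ImageChops

a = Image.open("screenshot_8600m.png").convert("RGB")
b = Image.open("screenshot_gma950.png").convert("RGB")

# ImageChops.difference is zero everywhere when the images match exactly,
# so getbbox() returns None if there is no differing region at all.
diff = ImageChops.difference(a, b)
if diff.getbbox() is None:
    print("Pixel-for-pixel identical: the rendering GPU left no trace.")
else:
    print("Images differ inside region:", diff.getbbox())
```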
The difference is during play time: with the 8600M you will get decent fps, while with the GMA 950 you risk not being able to run the game at all.
So the reason threads are created with titles like '8600M screenshots' is what Lithus said: we want to show the fps on specific games. -
Not directly related, but how do you save screenshots taken in-game? I accidentally pressed Print Screen when playing the Supreme Commander demo but nothing came up. Could someone give me a pointer on how to do this, along with the FRAPS program?
-
The same goes for other graphical features. A screenshot can show you how a DX10 game may look different from the same game in DX9, so the usefulness of screenshots (even without a framerate) becomes apparent, IMO. -
The ads on TV for HDTV are nothing close to HDTV. It's just an ad. Go to your local store and check out the difference. You will clearly see it. If not, then you need glasses.
-
But I think the OP was saying that, like, a Go 7300 and a Go 7950 GTX (both SM3 cards) couldn't actually produce the same image quality. That isn't true; it's just the playability of a certain level of graphical quality that's at stake.
-
ShadowoftheSun Notebook Consultant
Actually, graphics cards WILL produce varying images from identical settings. For example, a GeForce 6-series is likely to produce shimmering across its textures (a known bug), whereas its competition, the Radeon X series, did not produce similar shimmering. The Radeon X1xxx series is known to have higher image quality than its competition, the GeForce 7xxx. Finally, the 8-series cards are generally considered to have superior IQ to the HD 2xxx series, although the difference is negligible.
What does this mean? Basically, even if all things are equal in terms of settings, resolution, SM, and DX level, cards from different architectures will produce noticeably different images. The level of texture filtering, and the way that filtering is applied, differ because each architecture implements filtering in its own way.
Here's an example: a comparison between a 2900 and an 8800 with equal levels of texture (anisotropic) filtering. Notice how, at the same number of AF samples (16x), the 8800 produces noticeably cleaner edges and is closer to the ideal (a perfect circle). In this instance, the 8800 has superior IQ at identical settings.
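For anyone curious what "identical settings" actually means at the code level, here's a small sketch (Python with pygame and PyOpenGL; the constant value comes from the GL_EXT_texture_filter_anisotropic spec, and the window is only there to get a GL context). The game only hands the driver a single number like 16x; the sample pattern and shortcuts used to get there are entirely up to the card and driver, which is why two cards can produce different filtering quality at the "same" setting:

```python
# Rough sketch of requesting 16x anisotropic filtering on a texture.
# Assumes pygame and PyOpenGL are installed; the constant below is taken
# from the GL_EXT_texture_filter_anisotropic extension spec.
import pygame
from OpenGL.GL import (glGenTextures, glBindTexture, glTexParameterf,
                       glGetString, GL_TEXTURE_2D, GL_EXTENSIONS)

GL_TEXTURE_MAX_ANISOTROPY_EXT = 0x84FE  # from the extension spec

# Open a tiny window just to get an OpenGL context to talk to the driver.
pygame.init()
pygame.display.set_mode((64, 64), pygame.OPENGL | pygame.DOUBLEBUF)

tex = glGenTextures(1)
glBindTexture(GL_TEXTURE_2D, tex)

extensions = glGetString(GL_EXTENSIONS).decode()
if "GL_EXT_texture_filter_anisotropic" in extensions:
    # The whole request is one number; the filtering algorithm itself
    # (sample pattern, angle dependence, optimizations) is the card's choice.
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 16.0)
    print("Asked the card for 16x anisotropic filtering on this texture.")
else:
    print("This driver does not expose anisotropic filtering.")

pygame.quit()
```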
However, this is not the main reason for the SS threads. The threads are basically to show a consumer what he/she should expect to see when playing a game at a playable frame rate. It really has very little to do with how pretty the image is itself, nor with any type of comparison. -
This is an interesting topic.
If what you are asking is whether a screenshot will look worse when viewed on a computer with a lower-quality graphics card than when someone views it on a computer with a higher-quality graphics card, then no: it will look the same.
The reason HD doesn't look good on a standard-definition TV is the resolution. When you are looking at a screenshot, you are looking at the encoded image that the graphics card on a particular computer produced.
What is more important when viewing a screenshot is your monitor. If you have an old monitor that can't even do 32-bit color, then yes, the screenshot will not look as good. Since the graphics card is not being asked to render a scene, just to display an encoded image, it should look identical on two different machines with two different graphics cards and the same basic monitor.
Of course, there can be slight differences in how two graphics cards decode the screenshot, but these days the difference would be hard for the human eye to pick up. The monitor is the biggest factor with a screenshot. -
OK, I don't really understand all of it, but thanks everyone.
*((Screenshots... **PLEASE READ**))*