I didn't say it had anything to do with the game's design; I was asking you to show me this being done to a real game.
-
If you set your GPU to output at the framebuffer resolution (usually the default), your screen's controller will scale the frame. If you set your GPU to do the scaling itself, your GPU will do it. Either way the scaling is already being done, which is why you can set a non-native resolution without seeing black bars around the picture in the first place.
I used nearest resampling in the first picture because I can't change the physical pixel density of the screen you're viewing this on. I have to fake the effect to make 4 physical pixels look like one. -
-
You said native is always better. With the same 540p output, a native screen would be showing 540p as well. Compare the two resampled screenshots. I agree that number two looks like crap, like you said. But is number one really much better?
Open this in a new tab and view it at 1:1 (fullscreen mode on a 1080p screen should do it). The left half is the 540p native you're asking for; the right half is 540p resampled onto 1080p. They both look like crap, and I don't find the left to be significantly better than the right. -
You're still using doctored images to try to explain whatever it is you're trying to explain. But my point still stands: I want proof of that quality of upscaling applied to actual games, which you have not provided, not theory, of which you've provided plenty. Meanwhile, in my personal experience, games running at 540p upscaled to 1080p look much, much worse than what your work in Photoshop with a couple of filters shows.
-
By the way, not everyone doctoring raster images is using Photoshop. -
Here's what I don't get.
You're telling me I should be seeing this:
1080p
540p upscaled to 1080p
But this is what I'm actually seeing:
1080p
540p upscaled to 1080p
Where's the disconnect? It doesn't take a genius to figure out that whatever resampling method you're using to create those images is completely different from, and of much higher quality than, whatever is implemented by my GPU/display in actual games. -
-
I do not have the PlanetSide game, so I can't play with the settings. Given that you've uploaded a 1080p image to show the effect of a 540p upscale, my guess is the in-game resolution setting is still at 1080p, but there's some kind of graphical quality setting to put rendering quality down to 50% (whatever that means), is that right? If yes, the upscaling is done by your game's code in its own way, not by the GPU's output scaling code or the screen controller.
Can you set the game's output resolution to 540p (you might need to add a custom resolution profile to your GPU's config), set rendering quality to 100%, and then let your screen or GPU output code do the job?
The scene you showed is an architectural scene with near-vertical and near-horizontal lines everywhere. Those lines are very sensitive to aliasing. More importantly, the 1080p screenshot you provided does not seem to have any effective anti-aliasing applied. The lower rendering resolution simply makes it worse. If this is the aliasing effect you talked about, you should not blame upscaling: the aliasing was already there when the low-resolution frame buffer was drawn. The upscaling that follows is not responsible.
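To illustrate the point (a toy sketch of my own, not anything from the game): rasterise a fine diagonal stripe pattern at 540p and at 1080p, and the jaggies show up in the low-resolution buffer before any scaler touches it.

```python
# Toy sketch (my own, not from the game): aliasing is created when the frame
# is sampled at low resolution; upscaling afterwards merely enlarges it.
import numpy as np

def render(width: int, height: int) -> np.ndarray:
    """Rasterise a fine diagonal stripe pattern at the given resolution."""
    y, x = np.mgrid[0:height, 0:width]
    # ~400 stripe cycles across the frame: close to the sampling limit at 540p.
    return ((((x / width) + (y / height)) * 400 % 1.0) > 0.5).astype(np.uint8) * 255

low = render(960, 540)                               # the jaggies/moire appear here
upscaled = low.repeat(2, axis=0).repeat(2, axis=1)   # upscaling only enlarges them
native = render(1920, 1080)                          # more samples resolve the stripes
```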
By saying "540p upscaled onto 1080p looks terrible", you seem to imply that a native 540p screen would be better. But you have never seen a 540p (or similar) screen showing the same game at native resolution for comparison, have you?
The 540p upscaled image of mine you referenced was not done with any special tricks. Each pixel was simply divided into four with the original colour value. You can zoom in to see the detail.
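In code, that "divide each pixel into four" is plain nearest-neighbour resampling at an integer 2x factor; a minimal sketch (the variable names are mine):

```python
# Minimal sketch of the upscale described above: nearest-neighbour resampling
# at an integer 2x factor, so every source pixel becomes a 2x2 block of the
# exact same colour and no new colour values are invented.
import numpy as np

def nearest_upscale_2x(frame: np.ndarray) -> np.ndarray:
    """Duplicate every pixel into a 2x2 block of the same colour."""
    return frame.repeat(2, axis=0).repeat(2, axis=1)

frame_540p = np.zeros((540, 960, 3), dtype=np.uint8)   # a 960x540 RGB frame
frame_1080p = nearest_upscale_2x(frame_540p)
assert frame_1080p.shape == (1080, 1920, 3)
```
-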
Omg dude, you have serious problems understanding the English language (alongside serious antisocial issues for swearing on a forum).
OK, I'm going to explain it to you in the simplest words possible.
Everyone here is saying that content DOWNSCALED to 1080p on a 4K screen is going to look just as good as content that is ALREADY 1080p on a 1080p screen of the same size (let's say 15" for argument's sake), because there is no approximation of the colors... every 4 pixels on the 4K screen become one larger pixel of the same color when you are playing the 1080p content... whereas every pixel on the same-size 1080p screen is ALREADY the size of the 4 smaller pixels added together.
It's like if you take a crayon and color a 4x4 square... then you take the same crayon and color 4 smaller 2x2 squares put together next to each other... In both cases you get the same final result... one big colored 4x4 square.
What YOU are saying is that 4K content downscaled to 1080p on a 4K screen looks nowhere near as good as 4K on a 4K screen. That's because 4K NATURALLY looks better than 1080p... at least that's the example you are giving with your 540p argument. If you could have a 540p monitor and play the same 540p content that you would play on your 1080p display of the same size, they would look IDENTICAL.
540p is just a crappy-looking image to begin with and it will not look better whether you are playing it on a 540p or a 1080p native-res screen.
Both affirmations are correct and NOT EXCLUSIVE of each other. It's just your brain that cannot compute the difference between what WE are saying and what YOU are saying.
Sent from my GT-N7000 using Tapatalk -
Yes, there is more to a display than just pixels, I understand that. But it seems that you are stuck on the idea that you can't have both more pixels and a great display.
The GT60 has a 2880x1620 display, plus it is IPS. That would give great colour accuracy and great viewing angles, as well as a super resolution.
I know MSI said they were gonna put a 3K display on their GT70 the moment they had one available to buy. So that tells me that no company that makes these displays has even made one yet. You are probably right that it's because more people buy these weaker 14/15" notebooks, so the display makers would reach more customers. But I think it's sad that a notebook OEM can't see the value in making these displays themselves and offering them on an SLI notebook that is actually capable of running games at 4K. Surely that would create more sales and they would steal customers from other OEMs?
-
If you took a 20x20 image and scaled it to fill a 1920x1080 screen or a 4K screen, it's going to look horrid regardless. But once you're scaling an already detailed image onto a screen whose pixels are imperceptible to the human eye, the image, especially moving images in video games, will appear pretty much the same.
-
Just sayin'
Sent from my GT-N7000 using Tapatalk -
-
inperfectdarkness Notebook Evangelist
I wouldn't call the GT60 20D a "weak notebook". That said, good gaming laptops don't have to hit 2 grand. My GX660R was $1350 after rebate, and it's no slouch on performance at all.
But since you brought up the point that "no one is making 17" 4K displays yet", why stick to 16:9? Let's start over with 16:10. -
masterchef341 The guy from The Notebook
Oh, they'll absolutely disappear - the question is when. It doesn't look like it's going to happen anytime soon, though. I'd say it's less than 50% likely that they're gone in 10 years.
-
It has nothing to do with stupidity.
Consumers are provided with 1080p (16:9) content (Blu-ray, internet, you name it), hence consumers search for 1080p screens.
Provide 16:10 content and consumers will search for 16:10 screens.
It's basic adaptability really... they don't really care as long as the content matches the screens.
Sent from my GT-N7000 using Tapatalk -
inperfectdarkness Notebook Evangelist
Content matches the screens on 16:10 just fine. It allows me to have the play menu bar on the display without having it overlap the content.
Consumers aren't THAT smart; you really give them too much credit. They swallowed 1080p hook, line and sinker because that's what was pushed to them. Never mind that many of us (myself included) lost 10% of our vertical pixels by being force-fed 1080p.
I don't have a problem with 16:9 TVs. I just want the OPTION of 16:10 displays on my PC... and the collective industry has given me and those like me the middle finger. -
I don't understand the big fuss about 16:10 vs 16:9 and I guess I never will.
I didn't lose anything. My display is just as big as the 16:10 displays, I just gained space in the width. Games and movies all go along great with 16:9. I don't need a taller display. I have a mouse to scroll with.
I have zero problems using this 16:9 display. In fact, I love this PLS display. -
inperfectdarkness Notebook Evangelist
Having used 4:3, 16:10 and 16:9, I can honestly say that I enjoyed the move to 16:10 and hated the move to 16:9. It just felt "off". After studying it for a bit, I discovered that this was probably due to the "golden rectangle". There are plenty of other reasons to hate 16:9 (pretty much everything other than watching movies), but the golden rectangle really carries weight when you start to get into aesthetics. -
Isn't a 17.3 inch a 17.3 inch?
-
inperfectdarkness Notebook Evangelist
15.6" - 1920x1080 = 2.0736MP
Smaller diagonal screen, Same class of resolution, bigger display (more MP). -
-
inperfectdarkness Notebook Evangelist
http://www.displaywars.com/23-inch-16x10-vs-23-inch-16x9
Try punching in your own numbers.
Unless my math is wrong, for any given diagonal size, a 16:10 display is about 5% larger in area than a 16:9 one. THAT'S why you should care.
P.S.
Even with 15.4" vs. 15.6", the 16:9 display still has a 2.44% smaller area.
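If you'd rather check the arithmetic than trust a website, here's a quick sketch (the helper function is mine, not something from displaywars.com):

```python
# Screen area from diagonal and aspect ratio: width = d*w/sqrt(w^2 + h^2),
# height = d*h/sqrt(w^2 + h^2), area = width * height.
import math

def screen_area(diagonal_in: float, aspect_w: int, aspect_h: int) -> float:
    """Area in square inches for a given diagonal and aspect ratio."""
    k = diagonal_in / math.hypot(aspect_w, aspect_h)   # inches per aspect unit
    return (aspect_w * k) * (aspect_h * k)

gain = screen_area(23, 16, 10) / screen_area(23, 16, 9) - 1
print(f"16:10 vs 16:9 at the same diagonal: {gain * 100:.1f}% more area")   # ~5.2%

loss = 1 - screen_area(15.6, 16, 9) / screen_area(15.4, 16, 10)
print(f'15.6" 16:9 vs 15.4" 16:10: {loss * 100:.2f}% less area')            # ~2.44%
```
-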
-
One side of entertainment that is affected by screen aspect ratio is movies.
Another side is gaming... but I guarantee you there are a lot more people watching movies than playing games on a laptop.
Everything else (browsing, music, etc.) is not dependent on the proportions of the screen, so it will not be affected considerably. -
-
Do you really think that the majority of people playing Candy Crush or Words with Friends on FB care about resolution?
Sent from my Nexus 5 using Tapatalk -
inperfectdarkness Notebook Evangelist
No, of course not. And PC sales keep declining. Which is why it should be more important than ever to cater to the high-end users who want 16:10, because your average, run-of-the-mill user is on a tablet or smartphone and doesn't give 2 s***s about it.
-
-
What makes 16:10 so much better than 16:9? Enlighten me. -
http://en.wikipedia.org/wiki/Golden_ratio
Although I'm pretty sure Plants vs. Zombies 2 is close to 16:9 as the ideal ratio to play at.
Those games in no way affect people's need (or want) for a certain aspect ratio... movies, YouTube and the various other things that use 16:9 as a standard do.
Sent from my GT-N7000 using Tapatalk -
Retina displays are 16:10. Many tablets and cell phones use 16:10 as well. And on a Windows laptop, with the taskbar at the bottom and toolbars at the top, it's nice to have more vertical real estate. However, at ~3K resolutions, 16:9 at 2880x1620 vs 16:10 at 2560x1600 actually gives you more vertical pixels and higher pixel density, so I guess it depends on the end resolution. The major conundrum initially, when 16:9 started to replace 16:10 years ago, was that 1920x1200 was replaced by 1920x1080, 1280x800 by 1366x768, and 1680x1050 and 1440x900 by 1600x900.
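To put numbers on the density point, a quick helper of my own (assuming the same 15.6" diagonal for both panels, purely for illustration):

```python
# Pixel density (PPI) from resolution and diagonal: pixels along the diagonal
# divided by the diagonal length in inches.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch along the diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'16:9  2880x1620 @ 15.6": {ppi(2880, 1620, 15.6):.0f} PPI')   # ~212
print(f'16:10 2560x1600 @ 15.6": {ppi(2560, 1600, 15.6):.0f} PPI')   # ~194
```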
-
16:10 at 3K resolution is 2880x1800, e.g. the Retina MacBook Pro, so the argument for 16:10 still stands.
-
https://www.apple.com/macbook-pro/specs-retina/
-
inperfectdarkness Notebook Evangelist
If only Apple weren't the only company making 16:10. It almost makes me like them, god bless it.
-
How about we move this discussion about aspect ratio to where it belongs?
-
inperfectdarkness Notebook Evangelist
Doesn't it belong here? Onboard graphics are just like 16:9: crap that the industry keeps trying to force down my throat.
-
Onboard graphics actually have a use... they keep your laptop running cooler and longer. Maybe you and I don't care that much about battery life, but MANY other people do.
Sent from my GT-N7000 using Tapatalk -
Yep, with Optimus and Enduro you can have the best of both worlds: the low power and heat of iGPUs with the power of dGPUs. I wouldn't buy a notebook if it didn't have both, and I think it's these technologies that will actually end up prolonging the life of dGPUs significantly, as most of the drawbacks are completely taken care of. All the performance is there when you need it, and they sit nice and quiet and draw < 1 W when you don't.
-
inperfectdarkness Notebook Evangelist
Sounds great in theory, but I've heard NOTHING but bad things about Optimus... at least with regard to the GT60s.
-
Don't get me wrong, technology isn't flawless and there are legit issues that pop up, but most of the time they are not universal issues, which points to something specific about somebody's setup. -
inperfectdarkness Notebook Evangelist
That could be. I've had pretty good luck with ATI drivers, and mediocre luck with Nvidia. I remember back in 2003/04 I had to uninstall an Nvidia driver and roll back to the old one because I could no longer run at a higher resolution. So maybe it's just my bad luck, but sometimes it seems like Nvidia cares more about "the crown" than it does about stability or hardware durability (ask me about my GTX 260).
-
Too much noise about a 5% reduction in display space, imo.
The scale tips back again when I'm watching movies on my 16:9 display. -
Guys, the suggestion of keeping the aspect-ratio discussion in its own thread was sound; if this thread veers too far off topic, the posts will either get moved or the thread will be closed because it has run its course.
As for dGPUs dying: not anytime soon. I can see them being relegated to even more of a niche market, but they aren't going anywhere. There's a whole segment of people depending on their Quadros and FirePros for good mobile performance, and the enthusiasts with Radeons and GeForces that are capable of gaming. I can definitely see the lower-tier dGPUs going the way of the dodo in a few CPU generations if Intel and AMD continue to improve their integrated graphics. -
Do you think discrete graphics cards will become obsolete... soon?
Discussion in 'Gaming (Software and Graphics Cards)' started by minerva330, Jan 8, 2014.