The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Do you think discrete graphics cards will become obsolete....soon

    Discussion in 'Gaming (Software and Graphics Cards)' started by minerva330, Jan 8, 2014.

  1. minerva330

    minerva330 Notebook Guru

    Reputations:
    0
    Messages:
    72
    Likes Received:
    3
    Trophy Points:
    16
    I was wondering what the community thinks about the possibility of discrete video cards becoming more and more a line of "pure enthusiast" hardware. According to Intel Iris Pro Graphics 5200 - NotebookCheck.net Tech, Intel's Iris Pro can run most games on medium settings at or around 30 fps.

    If Intel continues to push their APUs, and assuming they can continue exponential performance gains rather than hit some sort of bottleneck, will discrete video cards become less relevant? As of now the Iris Pro is only 10-15% behind the GT 650M and GT 750M and is faster than AMD's 8650G. What does this mean for Nvidia and AMD?

    I think with the advent of 4K, APUs may have a more difficult time keeping pace with dedicated GPUs, but who knows what Skylake or beyond might bring. Intel's roadmap is shooting for 5nm by 2020 - that is only 6 years away.
     
  2. EvoHavok

    EvoHavok Notebook Consultant

    Reputations:
    5
    Messages:
    139
    Likes Received:
    25
    Trophy Points:
    41
    The high-end ones won't become obsolete, but ever since I saw the Iris Pro 5200 I started wondering about mid-range and lower GPUs. "Dedicated graphics" won't be of much importance as the sole marketing angle anymore, as many ads don't actually name the video card model.
     
  3. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Haha... No, they will not become obsolete. A laptop with Iris Pro 5200 graphics costs more than a machine with a GTX 765M, which has over twice the performance. Iris Pro can run games at 720p; beyond that it still has too little bandwidth to handle 1080p well. Maybe the lowest-end GPUs will be replaced by IGPs eventually, but even today the standard HD 4600 can't compete with the lowest-end 720M or Radeon 8550M. And yes, at larger resolutions IGPs will fall WAY behind, again due to limited bandwidth.

    Iris Pro is just an experiment, nothing more. Perhaps they will take lessons learned from it and incorporate them into their future products, but I doubt they will continue down the path of Iris Pro as the norm.
     
    deniqueveritas likes this.
  4. minerva330

    minerva330 Notebook Guru

    Reputations:
    0
    Messages:
    72
    Likes Received:
    3
    Trophy Points:
    16
    Totally agree, but it's still pretty impressive nonetheless.

    Do you think it may be possible in the future to link an APU with a current video card in an SLI/CrossFire-type configuration? I assume it is mainly a driver/controller issue rather than a pure architectural issue, no? If that were possible you would be able to squeeze a lot more life out of your setup, and if Iris Pro turns out to be a successful experiment it could be worth it from a price-to-performance perspective.
     
  5. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    In general? Nope, I don't think so.

    But eating up the low-end space? Yep. IGPs from Intel, APUs from AMD, SoCs from Nvidia/ARM - they are all making low-end notebooks obsolete because the devices they are in are so much cheaper.
    High-end gaming - I mean gamers who want to play at high settings in at least 1080p - needs a dedicated GPU. The IGPs/APUs/SoCs can play at 1080p too, but mostly only older games.
    4K/3K is the big buzz right now, and even the greatest mobile GPUs (780M etc.) can only play a few games in 4K. You need SLI notebooks to utilize a 4K display to the fullest in games.

    imo
     
  6. EvoHavok

    EvoHavok Notebook Consultant

    Reputations:
    5
    Messages:
    139
    Likes Received:
    25
    Trophy Points:
    41
    It's not like those SLI notebooks can do that much better at 4K on Ultra settings anyway. If people are really that much into 4K, just get a desktop.
     
  7. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    AMD has been doing asymmetrical CrossFire for quite some time, and AMD APU + weak dGPU laptops are all over the place.
     
    HTWingNut likes this.
  8. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    GTX 780M SLI could play many games in 4K.
    Just check out GTX Titan results.
    Nvidia GTX780Ti Review (1600p, Ultra HD 4K) | KitGuru

    Of course it wouldn't be 60 FPS, but a playable 30-50 FPS in most of them.

    Maxwell top-end cards in SLI would be amazing with a 4K display. You hear that, Alienware?
     
  9. minerva330

    minerva330 Notebook Guru

    Reputations:
    0
    Messages:
    72
    Likes Received:
    3
    Trophy Points:
    16
    Cool. Wasn't aware. Thanks for the info.

    It would be interesting to see the "theoretical" benchmarks of the IP 5200 and 780M in an asymmetrical configuration. I know it's not going to happen, but I am just curious what it would look like.
     
  10. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    AMD's setup worked marginally at best. It was subject to micro stutter, and you were really better off overclocking the dedicated card to exceed the combined GPU performance without the stutter. Faster GPUs didn't/don't benefit from the IGP much at all; if anything, it will hinder them.

    SLI/CrossFire work best with identically performing cards.
     
  11. EvoHavok

    EvoHavok Notebook Consultant

    Reputations:
    5
    Messages:
    139
    Likes Received:
    25
    Trophy Points:
    41
    Now that would be something. A fortune, but still glorious.

    On-topic: The asymmetrical builds have results all over the place. AMD should stop creating them and focus on more evenly matched CrossFire setups.
     
    HTWingNut and Cloudfire like this.
  12. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    It would hardly work. If you ping-pong frames between two cards like typical SLI/CF setups do, the Intel GPU would eat up all the time and the NV (or AMD) card would not have enough work to do. If you instead let both cards do as much work as possible, it would result in dramatically high latency spikes: your output will get messed up and some frames would appear in the wrong order. Alternatively, you can artificially add latency to all the frames rendered on the dGPU, so you can boost the frame rate a little at the cost of very laggy gameplay. Pointless if you ask me.


    In theory, some rendering pipeline designs can be distributed across different GPUs, resulting in slightly higher total throughput with little lag. But I'm not aware of any implementation of this kind in any type of real-time graphics.
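    A rough way to see the pacing problem is to simulate alternate-frame rendering on two badly mismatched GPUs. The timings below are made up purely for illustration (a minimal sketch, not how any real driver schedules frames):

    Code:
    # Hypothetical per-frame render times for a fast dGPU and a slow IGP
    dgpu_ms = 10.0
    igp_ms = 30.0

    t_dgpu = 0.0     # time at which each GPU next becomes free
    t_igp = 0.0
    present = []     # times at which finished frames can actually be shown

    for frame in range(8):
        if frame % 2 == 0:          # even frames rendered on the dGPU
            t_dgpu += dgpu_ms
            done = t_dgpu
        else:                       # odd frames rendered on the IGP
            t_igp += igp_ms
            done = t_igp
        # frames must appear in order, so a fast frame waits for the slow one before it
        done = max(done, present[-1] if present else 0.0)
        present.append(done)

    intervals = [round(b - a, 1) for a, b in zip(present, present[1:])]
    print(intervals)   # [20.0, 0.0, 30.0, 0.0, ...] -> wildly uneven gaps between frames

    The average frame rate looks fine, but half the frames arrive back-to-back and the rest after a long gap, which is exactly the micro stutter HTWingNut mentioned above.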
     
  13. Saucycarpdog

    Saucycarpdog Notebook Guru

    Reputations:
    0
    Messages:
    69
    Likes Received:
    7
    Trophy Points:
    16
    No, discrete cards are never going to disappear. And if a really good iGPU does come out, it would, at best, replace the lowest-end cards. From what AMD showed at CES, I think it will be them who does it, though, and not Intel.
     
  14. Oemenia

    Oemenia Notebook Evangelist

    Reputations:
    0
    Messages:
    331
    Likes Received:
    5
    Trophy Points:
    31
    The HD 4600 is a decent chip, but even then it's simply a bigger version of the HD 4000 rather than an architectural improvement. Besides, the new consoles are out and requirements are going to shoot up, meaning the Intel IGPs will struggle to play on Low again.

    It's great news that they are improving so much, but it's still a while off. At least now you can play older games comfortably and the occasional new game with decent settings.
     
  15. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    If anything, dedicated desktop cards will die out first. Until laptops can hit parity with desktops, discrete cards will still be around. 1080p performance may be fairly easy (as it should be, since "2K" resolution (1600x1200 / 1080p) has been the standard for well over a decade). Until 4K displays are the standard, and games are programmed to be rendered at that level natively, we won't begin to see the end of discrete cards. There's just too much room for improvement. We'd probably already be there if consoles weren't destroying progress by watering down what is "acceptable".
     
  16. skoreanime

    skoreanime Notebook Consultant

    Reputations:
    40
    Messages:
    148
    Likes Received:
    1
    Trophy Points:
    31
    Well, no real surprise the Iris cost/performance ratio is pretty high - it's still a product in its infancy. Despite the enthusiasm for high-end GPUs, the low and mid range still make up the vast majority of sales. I can see Intel's Iris series becoming a pretty big threat to AMD and Nvidia.
     
  17. Oemenia

    Oemenia Notebook Evangelist

    Reputations:
    0
    Messages:
    331
    Likes Received:
    5
    Trophy Points:
    31
    Blame the consoles!
     
  18. minerva330

    minerva330 Notebook Guru

    Reputations:
    0
    Messages:
    72
    Likes Received:
    3
    Trophy Points:
    16
    I would agree. If you look at the latest Steam Hardware & Software Survey, a majority of the members (that took the survey) report using APUs. This has to impact AMD and Nvidia in some way, although they won't admit it... Nvidia sales and profits rise despite PC slump

    If their largest and most profitable segment is shrinking, what does that mean for the innovation of dGPUs? Gamers, enthusiasts and supercomputers can't fill the gap forever, or can they?
     
  19. lewdvig

    lewdvig Notebook Virtuoso

    Reputations:
    1,049
    Messages:
    2,319
    Likes Received:
    26
    Trophy Points:
    66
    Maybe not obsolete, but not a requirement either.

    Intel 5200 is close enough to a 750M that if they make a similar leap when Skylake comes out in 2015, I can see an IGP at roughly the 680M/7970M level.

    Considering the 880M and R9 290M (or whatever) are just rebadges of those chips, that is scary.

    Also, with the PS4 running a Pitcairn derivative, I think developers will target that level. So the only reason for more power will be the 4K resolutions that PCs can offer.
     
  20. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Where is the IGP gonna get that memory bandwidth from? Can we has 8-16GB unified GDDR5 on a 256-bit or greater bus in every gaming laptop by 2015? Can we has a lot of software, specifically games, written to take advantage of HSA while maintaining backward compatibility with the traditional memory architecture by 2015? LOL
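    For a sense of scale on the bandwidth question, peak theoretical bandwidth is just the effective transfer rate times the bus width. A back-of-the-envelope comparison (the speeds are typical examples, not any specific machine):

    Code:
    # Peak theoretical bandwidth = effective rate (MT/s) x bus width (bits) / 8, in GB/s
    def bandwidth_gb_s(mt_per_s, bus_bits):
        return mt_per_s * bus_bits / 8 / 1000

    print(bandwidth_gb_s(1600, 128))   # dual-channel DDR3-1600 feeding an IGP: ~25.6 GB/s
    print(bandwidth_gb_s(5000, 256))   # 5 GT/s GDDR5 on a 256-bit bus: ~160 GB/s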
     
    HTWingNut, Zero000 and deniqueveritas like this.
  21. Akimitsui

    Akimitsui Notebook Evangelist

    Reputations:
    256
    Messages:
    371
    Likes Received:
    100
    Trophy Points:
    56
    Without discrete graphics cards, how on earth are we supposed to fry eggs now? :eek:

    [​IMG]
     
    Cloudfire likes this.
  22. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    Have no fear, my friend. The huge APU on PS4 offers enough power for frying eggs. Try the new HSA flavour!
     
  23. Krane

    Krane Notebook Prophet

    Reputations:
    706
    Messages:
    4,653
    Likes Received:
    108
    Trophy Points:
    131
    Absolutely not. In fact, they will continue to get more powerful. The APU will have their division for playing Pokémon, and the content creators will use the real GPUs. :p
     
  24. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    [​IMG]
     


  25. qweryuiop

    qweryuiop Notebook Deity

    Reputations:
    373
    Messages:
    1,364
    Likes Received:
    89
    Trophy Points:
    66
    o god.... why.... why do you have to pick on pokemon.... right in my childhood

    btw those fighters look good with those balls, at least each of them have ball
    (sounds weird but it needs to be singular, am i right?)
     
  26. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Actually, there were rumors of AMD Kaveri using GDDR5 for system RAM, but it never materialized.
    I would imagine that GDDR5 could be implemented that way for PCs on a smaller manufacturing process, just like it was done for the consoles (hence the enormous bandwidth there).
    Carrizo (the successor to Kaveri) might feature this option, but it's more likely to come with DDR4 support (which should double the existing bandwidth, I think).

    HSA is there to SIMPLIFY things for developers and take advantage of the shared memory on the APU (to my understanding, it's not difficult to implement HSA support for existing software... actually, if I'm not mistaken, it's as simple as integrating it into a patch, and it was specifically designed so it can be implemented relatively quickly - all you essentially have to do is get developers behind it, and the HSA Foundation already does this - Windows 8 and all upcoming/existing Adobe products are HSA ready).

    Plus, Kaveri - and chips in general that support HSA - does not have to copy data from the CPU side and convert it into GPU-readable form; hence the shared memory on the APU, where data goes directly from the CPU to be processed by the GPU.
    Kaveri is the first implementation of HSA, hence why people need to write a patch for existing software to utilize this function... however, if I understood the premise behind the Excavator design, it should be able to do this at the hardware level without prior software support (so the CPU and GPU can really work in unison automatically).

    This is why Kaveri, for example, is touted as having 12 compute cores (4 Steamroller CPU cores and 8 GCN cores) - and the GPU can do things far more efficiently and faster than a CPU in numerous areas....

    As for maintaining backward compatibility... the APUs are still x86, are they not?
    As for the far future, if you want backward compatibility, then I would imagine you will have to use something akin to DOSBox or a virtual machine (besides, we have to get away from this ancient design as it is).
     
  27. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    HD5200 aka Iris Pro is faaaaaar from a GT750M.
    We are talking 20-50% difference.
    The higher the res, the bigger the difference.
     
    deniqueveritas likes this.
  28. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    That's the biggest thing there: resolution. Sure, it might come close to competing with a lower mid-range card at 720p or below, but with 4K making inroads and 20nm Maxwell and AMD Crystal right around the corner, the performance gap will widen considerably.

    Beamed from my G2 Tricorder
     
  29. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Yup, once you hit the IGPs with MSAA and such, or higher res, they can't keep up with dedicated graphics cards.
    AnandTech's HD 5200 review covered it pretty well.
    AnandTech | Intel Iris Pro 5200 Graphics Review: Core i7-4950HQ Tested

    I welcome 4K. Once they hit 17"/18" notebooks I will be one happy dude. Right now I'm really pissed about the fact that 13/14/15" notebooks with a weak midrange GPU get them, while 17 and 18 inch notebooks with the greatest hardware are still stuck with 1080p displays.
    Or the fact that mobile phones are starting to get 1440p displays, while the notebooks with the greatest hardware are still stuck with 1080p displays.

    Somewhere in the companies that make displays for notebooks, there is one gigantic troll of a CEO/engineer sitting in his office with a coffee cup, laughing at us. Every day I dream that the guy will drop some really hot coffee on his lap and get severely burned.
     
    columbosoftserve and Akimitsui like this.
  30. Zero000

    Zero000 Notebook Deity

    Reputations:
    26
    Messages:
    902
    Likes Received:
    69
    Trophy Points:
    41
    Low end, yes.

    High end, nope.
     
    hfm likes this.
  31. Krane

    Krane Notebook Prophet

    Reputations:
    706
    Messages:
    4,653
    Likes Received:
    108
    Trophy Points:
    131
    As an enthusiast and/or gamer you're missing three critical points here:

    1) these are "higher" resolutions, not 4K displays

    2) desktop replacements only account for 10% of the laptop market

    3) there's more to a display than just mere pixels

    I'd prefer a quality HD display over one that merely adds pixels. After the camera industry's megapixel race debacle, you'd think we would have realized that by now.
     
  32. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    I agree with #3. That said, MSI's 3K display is actually very, very good.
     
    Cloudfire likes this.
  33. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Yes, but when handheld devices have higher resolution than a 17" screen, it's a bit ridiculous. Plus, the higher the resolution, the less need there is to run at "native resolution", because the pixel density is so high. That's the one key thing we lost going from analog CRT to LCD: scalability. You could run 1600x1200 or 640x480 on the same screen and it would look crisp either way. 4K is double 1080p in both horizontal and vertical pixels, so you could still run at 1080p and it likely won't show any aliasing or "fuzzy" image. Ideally I'd like to see 6400x3600 (6K?), which is 4x 1600x900: you could run 1600x900 for gaming and get awesome performance with no degradation of the image, but have a super-fine LCD for desktop tasks.
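    The integer-scaling argument is easy to sanity-check: a render resolution only maps cleanly onto a panel when both dimensions divide evenly by the same factor. A minimal check (the resolutions are just the examples from this thread):

    Code:
    # Is one resolution a clean whole-number multiple of another?
    def integer_scale(panel, render):
        px, py = panel
        rx, ry = render
        if px % rx == 0 and py % ry == 0 and px // rx == py // ry:
            return px // rx
        return None

    print(integer_scale((3840, 2160), (1920, 1080)))   # 2 -> each pixel becomes a clean 2x2 block
    print(integer_scale((6400, 3600), (1600, 900)))    # 4 -> the "6K" idea above
    print(integer_scale((1920, 1080), (1600, 900)))    # None -> needs blurry interpolation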
     
    Cloudfire likes this.
  34. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Really? And you're comparing LCDs and CRTs now?

    Uh, what? Have you tried running 960x540 on a native 1920x1080? It looks like blurry, aliased crap, much worse than 1600x900 or 1280x720. Why should 1920x1080 on a native 4K be any different?
     
  35. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    I know. Let's go back to 16:10 and throw all of the 16:9 crap in the garbage.
     
  36. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    Well, if you ever bothered to look at the numbers more closely, standard 16:9 4K (3840*2160) is exactly four 1920*1080 matrices in a 2x2 layout, and has perfect 2:1 scaling for 1080p. It's called QFHD for a reason.

    :p [ Insert mandatory 4:3 super race comment here. ] :p
     
    Cloudfire likes this.
  37. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yes, I understand basic arithmetic, but in practice it doesn't scale the way you guys think it theoretically does. Doing 960x540 on a native 1080p is basically rendering a 540p image and stretching it to four times its original area to fit the screen; that's why it looks horrible and much worse than any other higher non-native resolution with interpolation. I urge you to try it out and see for yourself.
     
  38. -Jinx-

    -Jinx- Notebook Evangelist

    Reputations:
    226
    Messages:
    560
    Likes Received:
    144
    Trophy Points:
    56
    Yes, it looks horrible because the PPI (pixel density) is poor... 960x540 is just a poor resolution on any size screen. It's not blurry because of the scaling; each pixel is BIG because it's comprised of 4 smaller pixels. There is no magic... math is math. If a 1080p image looks good on a 1080p screen, it will look the same on a 4K screen of the same size. Likewise, if a 540p image looks bad on a 1080p screen, it will look just as bad on a native 540p screen.

    On a side note it seems that everywhere I go I find you posting wrong info Octiceps :))
     
    Cloudfire and deniqueveritas like this.
  39. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    It's called PIXEL DENSITY: Pixel density - Wikipedia, the free encyclopedia
    " Some observations have indicated that the unaided human eye can generally not differentiate detail beyond 300 PPI;"

    So sure, if you have 1920x1080 and run at 960x540 it will look blurry on a 15" display - that's only about 140 PPI. But crank the pixel density up high enough (over 300 PPI) and it won't really matter; it will look the same whether it's a native 960x540 screen or a 3840x2160 one.

    Too many Pixels? | SamMobile

    And nowhere do you see me comparing LCDs and CRTs. I just stated (again) that scalability is something that was lost going from CRT to LCD, which is a fact. Once we regularly achieve > 300 PPI, it won't matter anymore.
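    For anyone who wants to check the figures, PPI is just the diagonal pixel count divided by the diagonal size in inches (quick sketch; the 15.6" and 17.3" diagonals are assumed typical panel sizes, not any specific model):

    Code:
    import math

    # PPI = diagonal resolution in pixels / diagonal size in inches
    def ppi(width_px, height_px, diag_in):
        return math.hypot(width_px, height_px) / diag_in

    print(round(ppi(1920, 1080, 15.6)))   # ~141, the "about 140ppi" figure above
    print(round(ppi(3840, 2160, 15.6)))   # ~282, approaching the 300 PPI threshold
    print(round(ppi(3840, 2160, 17.3)))   # ~255 for a 17.3" 4K panel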
     
    Cloudfire and deniqueveritas like this.
  40. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    Oooops, you were talking about 960x540, which is 1080p halved in each dimension. It was me who didn't pay attention to the numbers closely. My bad. :(

    540p is marginally better than traditional TV. Ofc it will look blurry. 1080p on 4K will give you the same density as native 1080p, even though the pixel transitions might look smoother/blurrier depending on the resampling algo, which differs from implementation to implementation. If the algo is simple nearest-neighbour it should look almost identical to native. Which algo is best depends on what is being displayed, and on personal preference as well.

    But if you were seeing any kind of aliasing effect due to resampling from 540p to 1080p there was something wrong with your configuration.
     
  41. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    No, it is blurry because of the scaling with interpolation. The pixels on an LCD are a fixed size; they can't physically get bigger or smaller.

    PlanetSide 2 Examples

    1080p Rendering Quality 100% (1080p native):

    h0PbF8N.jpg

    1080p Rendering Quality 50% (540p native):

    MNgs6Gi.jpg

    Rendering Quality 100%, Low Preset:

    GG7sEPS.jpg

    Rendering Quality 50%, High Preset:

    FWSn96Y.jpg

    That should put to rest any misconceptions that running 1/4 of native resolution will somehow look just as sharp as running native resolution.
     
  42. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    When you try to compare native and supersampled displays with the same logical pixel density on a WEB PAGE, please upload PNG files using 2:1 nearest resampling and 2:1 smooth interpolation.

    No one says 540p is as good as 1080p. We're saying 1080p on native 1080p is about as good as 1080p on native 2160p resampled.
     
  43. -Jinx-

    -Jinx- Notebook Evangelist

    Reputations:
    226
    Messages:
    560
    Likes Received:
    144
    Trophy Points:
    56
    Dude, the idea was not that running 1/4 of native resolution will somehow look just as sharp as running native resolution.

    It was that if you pack 4x the number of pixels into a 15-inch display and then run it at 1x, it will look just the same as a 15-inch display with a 1x native resolution.

    That is, if I spread 1920x1080 pixels across a 15-inch 4K screen, it will look just the same as 1920x1080 on a 1080p screen.
    Sent from my GT-N7000 using Tapatalk
     
  44. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Well we're not talking about supersampling/downsampling, now are we?
     
  45. -Jinx-

    -Jinx- Notebook Evangelist

    Reputations:
    226
    Messages:
    560
    Likes Received:
    144
    Trophy Points:
    56
    No, we aren't... no one was... except you... Everyone here was talking about playing 1080p content on a 4K screen and it looking just as sharp as it would on the same size screen with 1080p native res.

    Sent from my GT-N7000 using Tapatalk
     
    Cloudfire and deniqueveritas like this.
  46. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    I completely agree about 1080p on 4K looking like 1080p on 1080p. I think one of the biggest problems we've had thus far, though, is a lot of bad scaling software - which is why it's taken forever to go past 1080p, simply because so many of the unwashed masses were scared about the text size on something with a higher resolution.

    As to the 4:3 comment, um, no. The GOLDEN RATIO is where it's at: 1.6180339... 16:10 is 1.6, while 16:9 is 1.777... repeating. So 16:10 is practically identical to the golden ratio, whereas 16:9 is quite a bit off (and 4:3 is even further off). To add insult to injury, the entire point of 1080p is to avoid letterboxing... which is still common with the new 22:9 standard that many movies are shot in.
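    For reference, the arithmetic behind that claim is trivial to check (plain calculation, nothing assumed beyond the ratios themselves):

    Code:
    # How far each common aspect ratio sits from the golden ratio
    golden = (1 + 5 ** 0.5) / 2          # 1.6180...
    for name, ratio in [("16:10", 16 / 10), ("16:9", 16 / 9), ("4:3", 4 / 3)]:
        print(name, round(ratio, 4), "off by", round(abs(ratio - golden), 4))

    # prints roughly: 16:10 off by 0.018, 16:9 off by 0.16, 4:3 off by 0.28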
     
  47. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    If you want to download the uncompressed PNGs, be my guest: 1 2 3 4

    You can download them and view to your heart's content, but I'm telling you, besides a nearly imperceptible increase in sharpness, there's absolutely no difference compared to the compressed images on the Web page.
     
  48. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    Took a PNG format screenshot from here.

    I scaled this image down to 540p as the GPU input, then scaled it back up to 1080p as the panel output. The first one is done with nearest resampling to simulate a native 540p screen, as 4 pixels with exactly the same colour value will appear as one bigger pixel. The second was done with cubic resampling, a smooth interpolation algo commonly seen. The third one shows the two images 50-50.

    [IMG: 540p upscaled with nearest resampling]
    [IMG: 540p upscaled with cubic resampling]
    [IMG: 50-50 comparison of the two]

    Open the four images in separate tabs and scale to 1:1 to view the effect.
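    If anyone wants to reproduce the comparison, something along these lines with Pillow should do it ("screenshot.png" is a stand-in for whatever native 1080p capture you use; the resampling filters are my choice, not necessarily what your panel scaler does):

    Code:
    from PIL import Image   # pip install Pillow

    src = Image.open("screenshot.png")                 # any native 1080p screenshot
    small = src.resize((960, 540), Image.BICUBIC)      # pretend the GPU rendered at 540p

    # 1) "native 540p" look: every source pixel becomes an identical 2x2 block
    nearest = small.resize((1920, 1080), Image.NEAREST)
    # 2) typical smooth scaler: cubic interpolation back up to panel resolution
    cubic = small.resize((1920, 1080), Image.BICUBIC)

    nearest.save("540p_nearest.png")
    cubic.save("540p_cubic.png")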

    Didn't know some people were aiming for the golden.
     
  49. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    OK great, so you know how to use Photoshop, but show me a game which does this?
     
  50. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    The 540p image is used to simulate the framebuffer output from the GPU, and anything that happens after this point has nothing to do with the game's design. Are we not on the same page or something?
     
    Cloudfire and -Jinx- like this.