The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Do you think discrete graphics cards will become obsolete....soon

    Discussion in 'Gaming (Software and Graphics Cards)' started by minerva330, Jan 8, 2014.

  1. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    You're talking about ten years ago and a GPU six generations removed. Optimus has been quite trouble-free for most people and laptops for the last few years at least.

    Beamed from my G2 Tricorder
     
  2. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    That was the first of my Nvidia troubles, with the 4600ti. I was still a loyal fan though. Then in '09 I got a Sager with a GTX 260M, and after Nvidia gave me the middle finger on replacing it, I paid $800 out of pocket to get it fixed. And the ONLY reason I did that was because that was the last time 16:10 was available (not including Apple). There have been more instances than just these two... but suffice it to say, my Nvidia allegiance is no more.

    I'm actually cautiously optimistic about integrated graphics making inroads. With ATI/AMD in the game now, I keep hoping that we'll soon get an integrated GPU solution that is in the upper tier of performance, not just lower mid-range. It would allow me to opt for AMD over Nvidia with a clear conscience, and with peace of mind to boot. Currently, however, the performance just isn't sufficient to make me jump ship.
     
  3. Tsunade_Hime

    Tsunade_Hime such bacon. wow

    Reputations:
    5,413
    Messages:
    10,711
    Likes Received:
    1,204
    Trophy Points:
    581
    Obsolete? Depends who you're talking to. For the normal Joe Schmoe, sure, but then a dGPU has never been required on an average Joe's computer anyway. Casual gamers can be taken care of by a powerful iGPU, yes, but the mid-range and higher cannot be satisfied with on-board graphics.
     
    deniqueveritas likes this.
  4. skoreanime

    skoreanime Notebook Consultant

    Reputations:
    40
    Messages:
    148
    Likes Received:
    1
    Trophy Points:
    31
    Only had a couple of problems with Optimus when I've dealt with it on a friend's laptop, but it runs pretty well for the most part. The entire premise of the platform is seamless switching between integrated and dedicated, but sometimes you've got to force it one way or the other. I still prefer the solution my 3820TG uses, with manual switching between integrated and dedicated. Having full control over when to switch is the best solution IMO.
     
  5. be77solo

    be77solo pc's and planes

    Reputations:
    1,460
    Messages:
    2,631
    Likes Received:
    306
    Trophy Points:
    101
    Optimus is great and has worked perfectly in my last two laptops... honestly, I can't see going back to a machine without it, and it was a required feature in my last purchase. Those who fuss about it the loudest always seem to be the ones who haven't tried it in years, if ever.

    Dedicated GPUs will stick around, as there's a huge gap between them and what's possible with iGPUs... plus, iGPUs have to share the TDP with the CPU, so at least for the foreseeable future that will be a huge limiting factor in high-end performance; look how hot a Haswell quad core runs by itself while playing BF4.

    Having said that, I do happily game on an iGPU, on a ULV CPU stuffed in a tablet case to boot... so it's definitely possible, and they've come a LONG way. I find myself playing anything from BF4 to Kerbal Space Program to AoE more often on my tablet than on my dedicated gaming laptop. That, to me, is a huge indication that the future is bright for iGPUs.
     
  6. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    But then non-Optimus gets ShadowPlay and 120 Hz among other things. There are pros and cons to each. It's 50-50 in my eyes.

    You play BF4 on the Surface Pro 2? You're braver than I thought.
     
  7. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I have already created a custom 960x540 resolution in the Nvidia Control Panel, and after selecting that in PlanetSide 2 along with Rendering Quality 100%, it still looks like a blurry mess. The only difference between this and 1080p/RQ 50% is that screenshots are captured at 540p instead of as a 1080p upscale of 540p. The end result in-game is visually indistinguishable to my eyes. I'm pretty sure that whatever is going on with my GPU/display scaling, it only allows bilinear interpolation of non-native resolutions. Even a perfect 1:4 nearest-neighbor approach would look worlds better than what I'm seeing, as your simulated image proved. It would be pixelated and blocky for sure, but sharp. Whatever is going on on my end, it is always blurring the final image.

    As for bicubic, I'm not sure that is offered outside of image editors like Paint and Photoshop, which is why I asked you several times to show me an example where you are able to preserve much of the detail in an actual game through nearest-neighbor or bicubic filtering while running at 1/4 of native resolution, without it devolving into a blurry mess. Scaling bitmaps in an image manipulation program isn't comparable to applying it to a real game. Theory vs. practice. There are many games where you can adjust resolution scale independently of resolution, for example in BF4 or in most UE3 games through the console and .ini files. Be my guest and conduct your own investigation into in-game upscaling vs. GPU/display upscaling. I'd be very interested in your findings. :)

    And BTW that scene I showed does have anti-aliasing if it isn't apparent to you. PlanetSide 2's built-in Edge AA can't be disabled, that's why there is a slight blur even at native resolution.

    And I have seen either 640x480 or 800x600 (can't remember which) native on some really old LCDs. Both of those resolutions have fewer pixels than 960x540. Everything was big and blocky for sure, but it was sharp. Think early Doom and Wolfenstein games without filtering. Obviously low-res, but not so hard on the eyes when they're not smeared with a blur filter. Way better-looking than taking a modern high-res monitor of the same dimensions and running the same low resolution on it with bilinear filtering.
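
    For anyone who wants to reproduce the comparison being described here, a minimal Pillow sketch (the input filename is a hypothetical placeholder; any 960x540 screenshot will do) that upscales the same 540p capture to 1080p with nearest-neighbor, bilinear, and bicubic filtering:

    from PIL import Image

    # "capture_540p.png" is a hypothetical placeholder for any 960x540 in-game screenshot
    src = Image.open("capture_540p.png")
    target = (1920, 1080)  # exactly 2x per axis, i.e. each source pixel maps to a 2x2 block (1:4)

    src.resize(target, Image.NEAREST).save("up_nearest.png")    # blocky but sharp: pixels become clean 2x2 blocks
    src.resize(target, Image.BILINEAR).save("up_bilinear.png")  # the smeared look attributed to GPU/display scaling above
    src.resize(target, Image.BICUBIC).save("up_bicubic.png")    # slightly sharper than bilinear, mild halos at hard edges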
     
  8. Saucycarpdog

    Saucycarpdog Notebook Guru

    Reputations:
    0
    Messages:
    69
    Likes Received:
    7
    Trophy Points:
    16
    If anything kills low-end dGPUs, it will probably be this.

    Edit: hmm, I'm surprised tablet reviews hasn't made an article about this yet.
     
  9. minerva330

    minerva330 Notebook Guru

    Reputations:
    0
    Messages:
    72
    Likes Received:
    3
    Trophy Points:
    16
    I know. It's interesting to speculate, since according to Nvidia their new K1 is just as powerful as an Xbox/PS3... ;)

    Sent from my Nexus 5 using Tapatalk
     
  10. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Yes, but an ARM core inside the GPU with access to system memory would be the best of both worlds.

    Take Maxwell, for instance: Optimus without the Intel iGPU ;)
     
  11. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Ditch the Intel iGPU altogether? Nice. I'm all for it!
     
    deniqueveritas likes this.
  12. Jobine

    Jobine Notebook Prophet

    Reputations:
    934
    Messages:
    6,582
    Likes Received:
    677
    Trophy Points:
    281
    Intel... stop wasting your time on making GPUs... and focus on the CPU!
     
    Cloudfire likes this.
  13. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    OK. Got it in the end. I believe we are talking about a personal preference issue, not a technical one. We just got caught up in the terminology.

    You prefer the image to be pixelated and feel more comfortable this way. I personally prefer the opposite. If rendering resolution is limited I feel more comfortable with a blurry image than a blocky one. But that's just me.

    In theory, bicubic resampling of a frame buffer is super easy for modern GPUs. But just like you said, theory is not reality. I'm not sure what kind of implementation we actually get. However, given your preference for blocky but sharp images, even if it is bicubic you won't like it.

    I zoomed in on that screenshot again. Yes, there is a bit of blur, but I can't really tell if it's some kind of post-processing AA or simply a compression artifact. :(

    Whatever AA it was, it's definitely not effective. I still see zigzagged lines all over the place.
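
    As a rough sanity check on the "super easy for modern GPUs" point: even a plain CPU-side bicubic resample of a 960x540 frame to 1080p is cheap, so a GPU doing the same 4x4-tap filter in a shader is trivial. A minimal sketch with synthetic frame data (Pillow/NumPy; exact timings obviously depend on the machine):

    import time
    import numpy as np
    from PIL import Image

    # synthetic 960x540 RGB frame standing in for one game frame buffer
    frame = Image.fromarray(np.random.randint(0, 256, (540, 960, 3), dtype=np.uint8))

    runs = 100
    start = time.perf_counter()
    for _ in range(runs):
        frame.resize((1920, 1080), Image.BICUBIC)  # bicubic 540p -> 1080p upscale
    per_frame_ms = (time.perf_counter() - start) / runs * 1000
    print(f"CPU bicubic 540p -> 1080p: ~{per_frame_ms:.1f} ms per frame")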
     
  14. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yeah that has always been my personal preference.

    [IMG] vs. [IMG]

    I'd take the first one every time.

    And like you said, bicubic is hardly any better than bilinear for this image. A tiny bit sharper but then there's haloing around the edges:

    [IMG]
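
    The haloing visible in that comparison is the overshoot from the bicubic kernel's negative lobes at hard edges. A small illustrative sketch (synthetic 64-to-192 gray step, Pillow/NumPy) showing that bicubic dips below and rises above the original values next to the edge, while nearest and bilinear always stay inside the original range:

    import numpy as np
    from PIL import Image

    # 1-pixel-tall strip with a hard step from gray 64 to gray 192
    edge = np.full((1, 16), 64, dtype=np.uint8)
    edge[:, 8:] = 192
    strip = Image.fromarray(edge)

    for name, flt in (("nearest", Image.NEAREST),
                      ("bilinear", Image.BILINEAR),
                      ("bicubic", Image.BICUBIC)):
        up = np.asarray(strip.resize((64, 1), flt)).astype(int)
        # nearest/bilinear stay within [64, 192]; bicubic under- and overshoots
        # right next to the edge -- that over/undershoot is the visible halo
        print(f"{name:8s} min={up.min():3d} max={up.max():3d} near edge: {up[0, 26:38]}")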

    If you want to see the anti-aliasing in PS2, here are some lossless, uncompressed BMPs I took:

    Simple File Sharing and Storage.

    Simple File Sharing and Storage.

    It's definitely there. Just look at the mountains and buildings against the sky, or the trees. But I agree with you: The geometry anti-aliasing isn't very good, and there is a noticeable blur in the overall image. I don't think it's a post-processing filter as that wouldn't show up in fullscreen captures such as these. I'm not forcing FXAA from somewhere else, not that it would show up either. This is all the game's built-in Edge AA.
     