You're talking about ten years ago and a GPU six generations removed. Optimus has been quite trouble-free for most people and laptops for the last few years at least.
Beamed from my G2 Tricorder
-
inperfectdarkness Notebook Evangelist
I'm actually cautiously optimistic about the inroads integrated graphics is making. With ATI being part of AMD now, I keep hoping that we'll soon get an integrated GPU solution that is in the upper tier of performance, not just the lower midrange. It would allow me to opt for AMD over Nvidia with a clear conscience--and with peace of mind to boot. Currently, however, the performance just isn't sufficient to make me jump ship. -
Tsunade_Hime such bacon. wow
Obsolete? Depends on who you're talking to. For the normal Joe Schmoe, sure, but then a dGPU has never been required on an average Joe's computer anyway. Casual gamers can be taken care of with a powerful iGPU, yes, but the mid-range and higher cannot be satisfied with on-board graphics.
-
I've only had a couple of problems with Optimus when dealing with it on a friend's laptop; it runs pretty well for the most part. The entire premise of the platform is seamless switching between integrated and dedicated graphics, but sometimes you have to force it one way or the other. I still like the solution my 3820TG uses better, with manual switching between integrated and dedicated. Having full control over when to switch is the best solution IMO.
-
Optimus is great and has worked perfectly in my last two laptops... honestly, I can't see going back to a machine without it, and it was a required feature in my last purchase. The people who fuss about it the loudest always seem to be the ones who haven't tried it in years, if ever.
Dedicated GPUs will stick around, as there's a huge gap between what iGPUs are capable of and what dGPUs can deliver... plus, iGPUs have to share their TDP with the CPU, so at least for the foreseeable future that will be a huge limiting factor in high-end performance; look how hot a Haswell quad core runs by itself while playing BF4.
Having said that, I do happily game on an iGPU, on a ULV CPU stuffed in a tablet case to boot... so it's definitely possible, and they've come a LONG way. I find myself playing anything from BF4 to Kerbal Space Program to AoE more often on my tablet than on my dedicated gaming laptop. That, to me, is a huge indication that the future is bright for iGPUs. -
But then non-Optimus gets ShadowPlay and 120 Hz among other things. There are pros and cons to each. It's 50-50 in my eyes.
You play BF4 on the Surface Pro 2? You're braver than I thought. -
As for bicubic, I'm not sure that is offered outside of programs like Paint and Photoshop, which is why I asked you several times to show me an example where you are able to preserve much of the detail in an actual game through nearest-neighbor or bicubic filtering while running at 1/4 of native resolution, without having it devolve into a blurry mess. Scaling bitmaps in an image manipulation program isn't comparable to applying it to a real game. Theory vs. practice. There are many games where you can adjust resolution scale independently of resolution, for example in BF4 or in most UE3 games through the console and .ini files. Be my guest and conduct your own investigation into in-game upscaling vs. GPU/display upscaling. I'd be very interested in your findings.
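If anyone wants to try that comparison outside of Paint or Photoshop, here's a rough Python/Pillow sketch (the screenshot filename is just a placeholder) that knocks a native-resolution capture down to a quarter of the pixels and scales it back up with nearest-neighbor, bilinear, and bicubic so you can compare the three side by side. It's only an approximation of GPU/display upscaling, and it says nothing about in-game resolution scaling, which really does have to be tested in the game itself:

from PIL import Image

# Placeholder filename; use any native-resolution screenshot you have.
native = Image.open("native_capture_1920x1080.png")
w, h = native.size

# Rough stand-in for a 1/4-resolution render target (half the width and
# half the height = a quarter of the pixels). A real game would render at
# this size directly, so this is only an approximation.
low = native.resize((w // 2, h // 2), Image.NEAREST)

# Scale back up to native size with the three filters under discussion.
for name, resample in (("nearest", Image.NEAREST),
                       ("bilinear", Image.BILINEAR),
                       ("bicubic", Image.BICUBIC)):
    low.resize((w, h), resample).save("upscaled_%s.png" % name)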
And BTW, the scene I showed does have anti-aliasing, even if it isn't apparent to you. PlanetSide 2's built-in Edge AA can't be disabled, which is why there is a slight blur even at native resolution.
And I have seen either 640x480 or 800x600 (can't remember which) native on some really old LCDs. Both of those resolutions have fewer pixels than 960x540. Everything was big and blocky for sure, but it was sharp. Think early Doom and Wolfenstein games without filtering. Obviously low-res, but not so hard on the eyes when they're not smeared with a blur filter. Way better-looking than taking a modern high-res monitor of the same physical size and running the same low resolution on it with bilinear filtering. -
If anything is going to kill low-end dGPUs, it will probably be this.
Edit: Hmm, I'm surprised the tablet review sites haven't made an article about this yet. -
I know. It's interesting to speculate, since according to Nvidia their new K1 is just as powerful as an Xbox/PS3...
Sent from my Nexus 5 using Tapatalk -
Meaker@Sager Company Representative
Take Maxwell, for instance: Optimus without the Intel iGPU. -
-
Intel... stop wasting your time on making GPUs... and focus on the CPU!
-
You prefer the image to be pixelated and feel more comfortable this way. I personally prefer the opposite. If rendering resolution is limited I feel more comfortable with a blurry image than a blocky one. But that's just me.
Whatever AA it is, it's definitely not effective. I still see jagged lines all over the place. -
Yeah that has always been my personal preference.
[image] vs. [image]
I'd take the first one every time.
And like you said, bicubic is hardly any better than bilinear for this image. A tiny bit sharper, but then there's haloing around the edges: [image]
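The haloing is easy to check with numbers rather than eyeballs, by the way. Here's a quick sketch along the same lines (Python with Pillow and NumPy, nothing game-specific) that upscales a plain gray step edge; bilinear stays inside the original 64-192 range, while bicubic overshoots past it on both sides of the edge, and that overshoot is exactly the bright/dark fringe you see as a halo:

import numpy as np
from PIL import Image

# A tiny grayscale test strip: left half at gray level 64, right half at 192,
# giving one hard edge down the middle.
step = np.full((16, 16), 64, dtype=np.uint8)
step[:, 8:] = 192
img = Image.fromarray(step)

for name, resample in (("bilinear", Image.BILINEAR),
                       ("bicubic", Image.BICUBIC)):
    up = np.asarray(img.resize((64, 64), resample))
    # Bilinear stays at 64/192; bicubic dips below 64 and rises above 192
    # near the edge, and that overshoot is the halo/ringing.
    print(name, int(up.min()), int(up.max()))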
If you want to see the anti-aliasing in PS2, here are some lossless and uncompressed BMP's I took:
Simple File Sharing and Storage.
Simple File Sharing and Storage.
It's definitely there. Just look at the mountains and buildings against the sky, or the trees. But I agree with you: The geometry anti-aliasing isn't very good, and there is a noticeable blur in the overall image. I don't think it's a post-processing filter as that wouldn't show up in fullscreen captures such as these. I'm not forcing FXAA from somewhere else, not that it would show up either. This is all the game's built-in Edge AA.