The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia Optimus: how to minimize the performance loss

    Discussion in 'Asus' started by yveschen, May 6, 2021.

  1. yveschen

    yveschen Notebook Enthusiast

    Reputations:
    3
    Messages:
    11
    Likes Received:
    5
    Trophy Points:
    6
    I was slightly ticked off that this year's Asus lineup did not have a MUX switch, so I tried my best to ignore the bottleneck. On my Scar 17 (5900HX/3080) I noticed Radeon Software has Anti-Lag; enable that and disable FreeSync.

    Another thing: does 1440p gaming actually lessen the bottleneck to a significant degree? For COD Warzone I ran tests at 1440p and 1080p, and although the lows were worse at 1440p, there was only about a 10 fps dip between the two resolutions on average, which makes me wonder how bad Optimus really is at higher-resolution gaming, considering everyone gets so upset about it (including me).
     
  2. undervolter0x0309

    undervolter0x0309 Notebook Evangelist

    Reputations:
    67
    Messages:
    442
    Likes Received:
    246
    Trophy Points:
    56
    I haven't found a setting to minimize the lag. I wonder if the AMD options only apply to games that run on the integrated AMD GPU...? (I have the Asus Duo 15 SE btw, 5900HX/3070.)
     
  3. ovidiup

    ovidiup Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    I had the same question on Reddit and could not find any answer...
    Basically my comment was:

    "My question is like this: If Optimus is the bottleneck, and optimus means that the internal graphic card is always processing the redirected display data of the internal display, forcing the internal graphic card to run at maximum speeds would improve the performance? I am asking as the AMD Vega Igpu when is not used actively in 3D, almost all the time stays at 400 MHz(and all the time is processing what is displayed on internal screen as well due to optimus), and I am not sure if it can process the display data at that speed faster enough. I could not find a way to make the IGPU core clock speed stay at 2000 MHz all the time to test the impact.

    Or the optimus performance impact is because the wattage is shared between CPU and IGPU the performance impact appears? as Igpu steals W from the CPU package... I am trying to understand what more exactly in the optimus technology is creating the bottleneck and if there is a way to decrease the performance impact. For example if it is IGPU processing speed maybe forsing the IGPU to run at maximum speeds helps, if is the W consumed by IGP to under-volt, limit it somehow... etc"
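
    A back-of-the-envelope way to look at the first part of this (whether the iGPU can move the display data fast enough) is to estimate how many bytes per second the iGPU has to shuttle just to show the dGPU's frames. A minimal sketch in Python, assuming an uncompressed 4-bytes-per-pixel frame copied once per displayed frame; the real driver path may batch or compress differently, so treat the numbers as illustrative only:

        # Rough bandwidth needed for the Optimus copy path: one full-frame
        # copy per displayed frame, 4 bytes per pixel (an assumption, not a
        # measured figure for the actual driver).

        def copy_bandwidth_gb_s(width, height, fps, bytes_per_pixel=4):
            # Bytes moved per second for one frame copy per displayed frame.
            return width * height * bytes_per_pixel * fps / 1e9

        for res, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
            for fps in (60, 144, 240):
                print(f"{res} @ {fps:3d} fps -> {copy_bandwidth_gb_s(w, h, fps):.2f} GB/s")

    Even the 240 fps cases work out to a few GB/s of raw copy traffic; whether the 400 MHz idle clock is the actual limiting factor is exactly what this kind of calculation cannot answer on its own.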

    Edit: I would add a third option: I have to live with Optimus... as I want a LAPtop, but I just don't like Optimus... :)
     
    Last edited: May 10, 2021
  4. yveschen

    yveschen Notebook Enthusiast

    Reputations:
    3
    Messages:
    11
    Likes Received:
    5
    Trophy Points:
    6
    I think Optimus is not an issue of sharing the TDP, but rather of the iGPU's frame buffering losing efficiency, especially when the dGPU is pushing more FPS. I've noticed for a while now, while gaming, that the penalty at high framerates is much larger than at low framerates. It seems that if your GPU is pumping out 200 fps, more is lost due to the sheer number of frames, whereas at 100 fps the loss is more minimal, since it's easier for the iGPU to present 100 frames before newer frame data gets pushed into the buffer.

    Jarrod recently made a video comparing the Scar and the Legion 7. The Scar with Optimus posted 112 fps in Shadow of the Tomb Raider at 1080p, versus 126 fps on an external display. I'd like to see the same comparison at higher resolutions, to see whether the percentage difference drops when fewer frames need to be passed through the iGPU.
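
    One way to sanity-check this with Jarrod's numbers: if Optimus adds a roughly fixed amount of time to every frame, that fixed cost eats a bigger share of the frame time the higher the native framerate is. A minimal sketch in Python, assuming a constant per-frame overhead derived from the 126 fps vs 112 fps Tomb Raider result above (the real penalty is not necessarily a fixed per-frame cost):

        # Fixed per-frame-overhead model: derive the Optimus cost per frame from
        # the Shadow of the Tomb Raider numbers, then project the loss at other
        # native framerates. The constant-overhead assumption is a simplification.

        fps_native, fps_optimus = 126.0, 112.0           # Jarrod's 1080p results
        overhead_s = 1 / fps_optimus - 1 / fps_native    # ~0.99 ms extra per frame

        for native in (60, 100, 126, 200):
            with_optimus = 1 / (1 / native + overhead_s)
            loss_pct = 100 * (1 - with_optimus / native)
            print(f"{native:3d} fps native -> {with_optimus:5.1f} fps with Optimus ({loss_pct:4.1f}% loss)")

    Under that assumption the percentage loss shrinks as the native framerate drops, which would line up with 1440p (fewer frames per second) hiding more of the Optimus penalty.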
     
    Lakshya likes this.