The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Why you can't disable Enduro.

    Discussion in 'Sager and Clevo' started by Meaker@Sager, Jun 27, 2012.

  1. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,200
    Likes Received:
    17,911
    Trophy Points:
    931
    Some people seem to think you can disable Enduro with a BIOS update.

    Let me show you why not:

    [image: iGPU/dGPU wiring diagrams]

    Take out the IGP in the Clevo machine and the chain is broken.
     
  2. ocr123

    ocr123 Notebook Consultant

    Reputations:
    25
    Messages:
    134
    Likes Received:
    44
    Trophy Points:
    41
    That makes sense :) Thanks!
     
  3. arcticjoe

    arcticjoe Notebook Deity

    Reputations:
    66
    Messages:
    877
    Likes Received:
    67
    Trophy Points:
    41
    Is it yet known whether you can disable Optimus if you have an Nvidia card in a Clevo EM series? I'm contemplating buying a 680M and selling my 7970M.
     
  4. clintre

    clintre Notebook Evangelist

    Reputations:
    99
    Messages:
    375
    Likes Received:
    6
    Trophy Points:
    31
    If the 7970M goes through the IGP as shown, the nVidia card will do the same.
     
  5. elingeniero

    elingeniero Notebook Geek

    Reputations:
    31
    Messages:
    99
    Likes Received:
    0
    Trophy Points:
    15
    Thanks

    My question: WHY? :D

    Is there a reason they would do it like this?
     
  6. clintre

    clintre Notebook Evangelist

    Reputations:
    99
    Messages:
    375
    Likes Received:
    6
    Trophy Points:
    31
    From my point of view it would be to cut cost. Would be nice to hear from Clevo why, but that will never happen.
     
  7. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,200
    Likes Received:
    17,911
    Trophy Points:
    931
    There is also a slight power penalty.

    Cost comes in on multiple factors: more chips, more traces, more complexity in the BIOS (more bug testing).
     
  8. fenryr423

    fenryr423 Notebook Evangelist

    Reputations:
    29
    Messages:
    542
    Likes Received:
    0
    Trophy Points:
    30
    So basically what we need is a solution (I'm assuming a driver) that shuts off the iGPU, leaving open only the channels the dGPU needs in order to operate?
     
  9. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,200
    Likes Received:
    17,911
    Trophy Points:
    931
    What you are describing is Enduro. Since the integrated chip is the only one wired to the display, it's the integrated chip that has to output the rendered frame.
     
  10. hizzaah

    hizzaah Notebook Virtuoso

    Reputations:
    1,672
    Messages:
    2,418
    Likes Received:
    289
    Trophy Points:
    101
    And the iGPU has no issue keeping up with what's being output by the dGPU?
     
  11. RogerCD

    RogerCD Notebook Guru

    Reputations:
    22
    Messages:
    59
    Likes Received:
    1
    Trophy Points:
    16
    It makes a bit of sense, but where did you get those diagrams/information?
     
  12. Alterac

    Alterac Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    I'd also like to know the source for those diagrams.

    I do know that Nvidia's Optimus works by attaching the dGPU to the iGP's Framebuffer for the video output.

    It would be odd for AMD/ATI to do anything different from that, and even more odd to have Dell design a different method that is not the standard one the device manufacturers support in their reference documents.
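    The framebuffer-attach mechanism described above can be pictured in a few lines. This is purely a conceptual sketch; the class and method names are invented for illustration, not any vendor's actual driver API:

```python
# Conceptual model of Optimus/Enduro-style output: the dGPU renders,
# the finished frame is copied into the iGPU's framebuffer, and the
# iGPU (the only chip wired to the panel) scans it out untouched.
# All names here are invented for illustration.

class IntegratedGPU:
    def __init__(self):
        self.framebuffer = None

    def scan_out(self):
        # No processing happens here: pure pass-through to the panel.
        return self.framebuffer


class DiscreteGPU:
    def render(self, scene):
        return f"frame({scene})"  # stand-in for a rendered frame


igp = IntegratedGPU()
dgpu = DiscreteGPU()

frame = dgpu.render("scene-1")
igp.framebuffer = frame  # the per-frame copy that costs bus bandwidth
print(igp.scan_out())    # frame(scene-1) -- identical to what the dGPU produced
```

    The point of the sketch: the iGPU never touches the pixels, so image quality is unaffected, but every displayed frame has to cross the bus into the iGPU's framebuffer first.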
     
  13. birdsonbat

    birdsonbat Notebook Consultant

    Reputations:
    1
    Messages:
    174
    Likes Received:
    0
    Trophy Points:
    30
    So we have to wait for better drivers then...?
     
  14. fenryr423

    fenryr423 Notebook Evangelist

    Reputations:
    29
    Messages:
    542
    Likes Received:
    0
    Trophy Points:
    30
    Meaker made them himself. We just have to wait for mature drivers. Optimus sucked when it first came out as well. Patience!
     
  15. DeutschPantherV

    DeutschPantherV Notebook Consultant

    Reputations:
    32
    Messages:
    152
    Likes Received:
    1
    Trophy Points:
    31
    Aren't the Ivy Bridge processors' iGPUs supposed to be significantly better than those in the Sandy Bridge processors anyway? I'm not sure how much of a difference there is, but I'd figure it's easier for the chip to pass information on than to process it on its own. So it should be simple for the iGPU to just move the information through; if it's powerful enough, it won't be the weak link in the system. Instead, the way the data is handled as a whole is the problem. That definitely = drivers.
     
  16. arcticjoe

    arcticjoe Notebook Deity

    Reputations:
    66
    Messages:
    877
    Likes Received:
    67
    Trophy Points:
    41
    It shouldn't, as it doesn't have a problem keeping up with Optimus.
     
  17. erikk

    erikk Notebook Consultant

    Reputations:
    16
    Messages:
    119
    Likes Received:
    1
    Trophy Points:
    31
    Meaker, you are correct that you cannot "disable" Enduro or completely bypass the iGPU. Having said that, you could design a "switch" that forces the dGPU to be used exclusively and always.

    Enduro does some sort of check, based on the program running, what is being requested, or some other criteria, to judge whether to use the iGPU or the dGPU. If we could force the dGPU to be used for everything, that check would never occur and could never choose wrong; presumably the check also incurs some sort of performance hit, even if it is extremely minimal and only at the very start of any new program.

    So calling it "disabling Enduro" is inaccurate, but I think what we're interested in is being able to force Enduro's choice.
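    The per-program check described here can be pictured as a simple profile lookup. This is only a conceptual model; the profile table, process names, and `force_dgpu` flag are hypothetical illustrations, not AMD's actual Enduro implementation:

```python
# Conceptual model of a per-application GPU selection check.
# The profile table and all names are hypothetical, not AMD's code.

PROFILES = {
    "game.exe": "dGPU",
    "browser.exe": "iGPU",
}

def pick_gpu(process_name, force_dgpu=False):
    """Return which GPU renders this process.

    force_dgpu models the 'always use the dGPU' switch the thread is
    asking for: it skips the per-program lookup entirely, so an
    unprofiled application can never be routed to the wrong GPU.
    """
    if force_dgpu:
        return "dGPU"
    # Unknown applications fall back to the iGPU -- the failure mode
    # people in this thread want to avoid.
    return PROFILES.get(process_name, "iGPU")

print(pick_gpu("game.exe"))                      # dGPU (profiled)
print(pick_gpu("newgame.exe"))                   # iGPU (no profile yet)
print(pick_gpu("newgame.exe", force_dgpu=True))  # dGPU (forced)
```

    A global force switch removes both the lookup cost and the chance of a wrong default, which is exactly the trade-off being argued for above.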
     
  18. birdsonbat

    birdsonbat Notebook Consultant

    Reputations:
    1
    Messages:
    174
    Likes Received:
    0
    Trophy Points:
    30
    Wouldn't it be great to have an interface where you check off which programs/processes use the iGPU/dGPU...
     
  19. clintre

    clintre Notebook Evangelist

    Reputations:
    99
    Messages:
    375
    Likes Received:
    6
    Trophy Points:
    31
    @erikk

    I think that would be the best case if there is no true way to disable the iGPU, which it sounds like there isn't. The iGPU (HD 4000) is on the processor die in any case, so I am guessing the AMD/nVidia card is going through a framebuffer or external component that is controlled by the iGPU.

    Not really an expert in that area anymore.
     
  20. erikk

    erikk Notebook Consultant

    Reputations:
    16
    Messages:
    119
    Likes Received:
    1
    Trophy Points:
    31
    Yes, I know you can do that in the control interface. But wouldn't it be great if there was an option in there to just ALWAYS use the dGPU? I mean, at the very least you have to go in and manually set every program to use the dGPU. And that means that every time you change programs, Enduro has to check the program against the list (alt-tab, check list; alt-tab, check list; etc.). Can you even set it to use the dGPU for plain Windows (Desktop, Windows Explorer, Internet Explorer, etc.) without any specific program? And when Enduro switches from a program that uses the iGPU to one that uses the dGPU, or vice versa, is there any lag, stutter, hiccup, etc.? If you're hopping between two programs with different settings, does it switch smoothly without a problem?

    Having said all this, I don't have my laptop yet so I'm not very familiar with all the ins and outs yet and I'm only speaking from second hand knowledge but I believe that my comments are accurate.
     
  21. birdsonbat

    birdsonbat Notebook Consultant

    Reputations:
    1
    Messages:
    174
    Likes Received:
    0
    Trophy Points:
    30
    *sigh*.......
     
  22. erikk

    erikk Notebook Consultant

    Reputations:
    16
    Messages:
    119
    Likes Received:
    1
    Trophy Points:
    31
    I understand that maybe you think we're being lazy and not setting the programs to use the dGPU in CCC, but are you of the opinion that being able to force the choice to always use the dGPU won't give any performance gains?

    I'll be the first to admit that I don't know if it will help, but having multiple programs configured differently, with a check for every running program, sure seems like a recipe for problems and performance issues. Personally, I don't see why giving the "forced" functionality to us end users is in any way hard to do, or something they should keep us from being able to do. Ideally, at some point in the future (sooner would be better) the AMD drivers will be mature enough to actually do what they're supposed to without any performance problems, but would having a force-dGPU option even then be a problem? At that point we'd all just recommend not bothering with it: Enduro is great and there's no reason not to use it.
     
  23. birdsonbat

    birdsonbat Notebook Consultant

    Reputations:
    1
    Messages:
    174
    Likes Received:
    0
    Trophy Points:
    30
    I'm only sighing because I think I need to take a break from stressing out about the 7970M driver issues all day.

    I'm just going to wait till my machine actually comes in, then I'll start worrying again.
     
  24. erikk

    erikk Notebook Consultant

    Reputations:
    16
    Messages:
    119
    Likes Received:
    1
    Trophy Points:
    31
    I know the feeling. When XoticPC offered to put "expedited" on my order with Sager I almost told them not to because I didn't want anyone doing anything with my pc to be rushing.
     
  25. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    It's just a pass-through. The iGPU isn't doing any processing of the image.

    All I know is, on my HP DV6z, when there was finally a way (with hacks) to set a fixed dGPU, performance went up a measurable amount and there were no hiccups. It was still muxed through the iGPU, but there was no software to fuss with over when the dGPU should engage or not. It was just ON.

    I know in that case it's an AMD GPU with an AMD CPU, but the concept is the same.
     
  26. TR2N

    TR2N Notebook Deity

    Reputations:
    301
    Messages:
    1,347
    Likes Received:
    255
    Trophy Points:
    101
    So can we assume the Clevo HM series is identical to the diagram on the right, i.e. the Dell/AW one?
     
  27. vuman619

    vuman619 Notebook Evangelist

    Reputations:
    381
    Messages:
    367
    Likes Received:
    2
    Trophy Points:
    31
    Yes, because optimus/enduro was not active on that generation's hardware.
     
  28. BenWah

    BenWah Notebook Consultant

    Reputations:
    119
    Messages:
    289
    Likes Received:
    0
    Trophy Points:
    30
    Hey sager! We want optimus! We want optimus!
    Oh !
    We don't want optimus! We don't want optimus!
     
  29. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Thus explaining why the hell Clevos can't be used without a GPU.
     
  30. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Personally I never wanted Optimus or any kind of software switching. I always wanted a hard switch.
     
  31. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Yes, ideally a hard switch would be best. Perhaps on the next gen they'll have a standard MXM slot that connects to a mux, and an iGPU that connects to the mux with an SPDT switch, haha.
     
  32. hizzaah

    hizzaah Notebook Virtuoso

    Reputations:
    1,672
    Messages:
    2,418
    Likes Received:
    289
    Trophy Points:
    101
    Oh, duh. Thanks

    100% agree. Or at least the option to not use the software. I wouldn't mind the software if it worked correctly, but it's simply not developed enough for our liking.
     
  33. vuman619

    vuman619 Notebook Evangelist

    Reputations:
    381
    Messages:
    367
    Likes Received:
    2
    Trophy Points:
    31
    It's not that it's not developed enough for our liking; it's just not developed enough to be released to the general public, tbh.
     
  34. V1R4G3

    V1R4G3 Notebook Geek

    Reputations:
    17
    Messages:
    92
    Likes Received:
    2
    Trophy Points:
    16
    Start a petition.
     
  35. Penguissimo

    Penguissimo Notebook Enthusiast

    Reputations:
    31
    Messages:
    47
    Likes Received:
    0
    Trophy Points:
    15
    Awesome illustration, thanks! One question—I read on another forum that the external video ports are wired directly to the discrete card, while the internal LCD follows the architecture you describe. Do you know if this is true, and if so, what the implications would be? If this is true (and I honestly have no idea), would it be possible to bypass the integrated card by running the system in clamshell mode?
     
  36. hizzaah

    hizzaah Notebook Virtuoso

    Reputations:
    1,672
    Messages:
    2,418
    Likes Received:
    289
    Trophy Points:
    101
    general public is one thing.. I consider most of us to be more power user-ish :D
     
  37. hackness

    hackness Notebook Virtuoso

    Reputations:
    1,237
    Messages:
    2,367
    Likes Received:
    430
    Trophy Points:
    101
    I think the iGPU may have been holding the dGPU back even on Optimus. The following are screenshots I took comparing the P150EM with i7-3610QM + GTX 675M and the G73JW with i7-940XM + GTX 460M. The program I used is the FFXIV benchmark; both pictures were taken during the loading screen (glossy screen = G73JW, matte screen = P150EM):

    G73JW, i7-940XM @ 2.4 GHz (to match the 3610QM) and GTX 460M @ stock clock, GPU usage 99%, 425.0 FPS:
    [image: benchmark screenshot]

    P150EM, i7-3610QM @ 2.3 GHz and GTX 675M @ stock clock, GPU usage 82%, 226.8 FPS:
    [image: benchmark screenshot]
    What the heck.
     
  38. TR2N

    TR2N Notebook Deity

    Reputations:
    301
    Messages:
    1,347
    Likes Received:
    255
    Trophy Points:
    101
    Why did Clevo go for this design?
    Was it forced on them by AMD, or did they choose this iGPU/dGPU piggyback path themselves?
     
  39. truekiller28

    truekiller28 Notebook Consultant

    Reputations:
    93
    Messages:
    217
    Likes Received:
    0
    Trophy Points:
    30
    There must be a way to get full usage out of our 7970M.
    I can't actually monitor my frequencies!!! :(
    I do the overclock using MSI Afterburner (I followed the tutorial for unlocking OC), but I can't monitor my real frequencies; GPU-Z only shows me the Intel HD 4000 integrated graphics... Help?
     
  40. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,200
    Likes Received:
    17,911
    Trophy Points:
    931
    Afterburner shows your frequency, lol.

    Also, yes, the design will hold back FPS... when you get over 200 FPS, the bus can't handle any more frames being sent over it.
     
  41. hackness

    hackness Notebook Virtuoso

    Reputations:
    1,237
    Messages:
    2,367
    Likes Received:
    430
    Trophy Points:
    101
    This design basically halved the bandwidth :D When I run at 720p I can get roughly 1000 FPS at the loading screen on the G73JW, and only 550 FPS on the P150EM.
     
  42. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Try firing up GPU-Z AFTER you're running a 3D app.

    But if it's limiting FPS, then that's crazy. What does this mean for high-bandwidth games, AA enabled, etc.? Will this cause issues?
     
  43. hackness

    hackness Notebook Virtuoso

    Reputations:
    1,237
    Messages:
    2,367
    Likes Received:
    430
    Trophy Points:
    101
    I assume what it affects is just the number of pixels that can be output through the iGPU; gaming performance under 60 FPS won't really be affected, since what goes through the iGPU is already-processed frames. The screenshots I uploaded are the FFXIV benchmark in 1080p mode, so the pixel output it's allowed should be around 226 x 1920 x 1080 at 32-bit colour. As for AA, it shouldn't be affected, since that is post-processing done on the GPU before the frame goes through the iGPU window.
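    That pixel-throughput estimate is easy to sanity-check with arithmetic. Assuming 4 bytes per pixel (32-bit colour), the cost of just passing finished frames through the iGPU at the frame rates quoted in this thread works out to:

```python
# Back-of-the-envelope: bandwidth needed just to pass finished
# frames through the iGPU, assuming 4 bytes/pixel (32-bit colour).

def frame_copy_bandwidth(width, height, fps, bytes_per_pixel=4):
    """Bytes per second needed to move rendered frames at this rate."""
    return width * height * bytes_per_pixel * fps

# The 1080p benchmark screenshot: 226.8 FPS
gb = frame_copy_bandwidth(1920, 1080, 226.8) / 1e9
print(f"1080p @ 226.8 FPS: {gb:.2f} GB/s")   # ~1.88 GB/s

# The 720p loading-screen numbers quoted in this thread
print(f"720p @ 1000 FPS: {frame_copy_bandwidth(1280, 720, 1000) / 1e9:.2f} GB/s")  # ~3.69 GB/s
print(f"720p @ 550 FPS:  {frame_copy_bandwidth(1280, 720, 550) / 1e9:.2f} GB/s")   # ~2.03 GB/s
```

    These are copy costs only; whether they are what actually caps the observed FPS is exactly what the thread is debating.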
     
  44. arcticjoe

    arcticjoe Notebook Deity

    Reputations:
    66
    Messages:
    877
    Likes Received:
    67
    Trophy Points:
    41
    If the iGPU does not limit FPS or bandwidth on the 680M / Optimus, there should be no reason it limits them with AMD. So I am really hoping the issue lies in the way the AMD card handles the throughput and can actually be resolved via a driver update. I also have another test that points toward a software cause/solution: if I run the Kombustor burn-in test in DX9 / OpenGL mode, I get around 80 FPS / 75% usage; in DX10 / DX11 mode I get 99% usage and 125 FPS. This kind of proves that the card is capable of 100% usage and performance, but certain software functions are bottlenecking the rendering process.
     
  45. truekiller28

    truekiller28 Notebook Consultant

    Reputations:
    93
    Messages:
    217
    Likes Received:
    0
    Trophy Points:
    30
    Ahah, already tried that trick, but unfortunately it didn't work.

    As soon as I alt-tab, I can see my running Afterburner showing that frequencies are dropping (going to the iGPU?). GPU-Z still shows me integrated frequencies in every situation...
     
  46. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Use the OSD and enable the core frequency readout. I have it on my 485M; not sure about AMD, though, but worth a look.
     
  47. AetasSerenus

    AetasSerenus Notebook Geek

    Reputations:
    9
    Messages:
    76
    Likes Received:
    0
    Trophy Points:
    15
    This might be why people are having problems with high resolution screens and why 3d Screens won't work with Optimus.
     
  48. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,200
    Likes Received:
    17,911
    Trophy Points:
    931
    Yes, initially PCI-E 2.0 was a bottleneck for 120 Hz 1080p.

    However, now that we are on PCI-E 3.0, it should be OK.
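    For scale, the display copy for 1080p at 120 Hz can be set against the usable per-direction bandwidth of a PCIe x16 link (5 GT/s with 8b/10b encoding for 2.0, 8 GT/s with 128b/130b for 3.0, which are the standard published figures); everything else below is arithmetic:

```python
# Rough comparison: 1080p @ 120 Hz display-copy traffic vs. PCIe x16
# per-direction bandwidth. Link rates and encoding overheads are the
# standard published PCIe figures; the rest is arithmetic.

def pcie_bandwidth_gbs(gt_per_s, lanes, encoding_efficiency):
    """Usable bandwidth in GB/s for one direction of a PCIe link."""
    return gt_per_s * lanes * encoding_efficiency / 8  # bits -> bytes

pcie2 = pcie_bandwidth_gbs(5.0, 16, 8 / 10)      # PCIe 2.0: 8b/10b encoding
pcie3 = pcie_bandwidth_gbs(8.0, 16, 128 / 130)   # PCIe 3.0: 128b/130b encoding

copy_1080p120 = 1920 * 1080 * 4 * 120 / 1e9      # 32-bit colour, ~1.0 GB/s

print(f"PCIe 2.0 x16: {pcie2:.1f} GB/s per direction")   # 8.0
print(f"PCIe 3.0 x16: {pcie3:.1f} GB/s per direction")   # ~15.8
print(f"1080p @ 120 Hz copy: {copy_1080p120:.2f} GB/s")  # ~1.00
```

    Note that the frame copy shares the link with the dGPU's normal rendering traffic, so the available headroom matters more than the raw totals suggest.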