The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Boost this clock, turbo that, benchmark this and that.......

    Discussion in 'Sager and Clevo' started by Calibre41, Apr 29, 2013.

  1. Calibre41

    Calibre41 Notebook Evangelist

    Reputations:
    547
    Messages:
    384
    Likes Received:
    4
    Trophy Points:
    31
    I want to talk about all the various "boost" technologies out there....... I want to share my opinion, and I'm eager to hear yours.....

    Boost this, turbo that, it's pretty much a standard now....

    Didn't CPUs use to "underclock" to allow power saving? I can see this is useful... nice feature.

    And now we turbo/boost to increase performance, which sounds good. I can see, and really understand, how useful it is for a multicore CPU to "overclock" itself when not all cores are active - boosting older titles or particular games, perhaps.....

    A graphics card boosting up and down - what the hell is the point??

    In my opinion a good system config will pound the GPU to within a fraction of its capacity; it will be hot and it will not be able to turbo. When things ease off, I don't want it to "turbo" - what benefit would that have? Things are clearly easing off, so why not take the time to recover: cool off and get ready for the next bomb/bullet, spark-flying, smoke, fog and ambient occlusion onslaught!!

    I have limited my W110's 650M to 59fps. I understand my screen, along with the majority of other people's, cannot handle much more than 60-120fps (60 in my case). My angle on this is that if we let the GPU gain some ground in terms of heat and power, we could be giving it room for a good "turbo" next round. Instead, we limit the GPU further by overclocking it and warming it up when it's clearly not necessary. Why not reserve that performance and only "boost" when fps drops below a certain level, say 30fps - bam, give it a boost if power/heat allows, then recover again when things are easy? (I'm sure we've all used a system at some stage that is playable, but occasionally bogs down in a particular momentary scenario.)

    - I guess part (and maybe the start??) of this running around in my head is me wondering if the manufacturers are blinding us with 120+ fps results and benchmarks that aren't actually relevant, while the real-world, noticeable performance below 120fps is not being properly represented.

    Just a thought.........
     
  2. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,200
    Likes Received:
    17,911
    Trophy Points:
    931
    Consider FurMark and CPU burn-in tests like Intel Burn Test.

    In order to 100% guarantee your CPU at one frequency, you would have to take these extremes as the basis for your maximum speed and therefore limit your clocks.

    Turbo lets the CPU clock down during these particularly power-hungry loads while maintaining a higher level of performance in more general applications.

    Same goes for different games: what shaders they use determines how much they load the core overall.

    My max temp-stable speed in Crysis 3 is 1033MHz, whereas in a title like BioShock Infinite it's around 1066MHz, but I can't set that permanently in case I want to run Crysis 3....

    Artificially limiting your frame rate also has a downside: it introduces input lag, which can cause inaccuracy and a feeling of sluggishness.

    Nvidia are working on an FPS target mechanism at the moment but they need to address the above problem a bit more.
     
  3. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    I'd rather have control over the boost, to be honest. I don't like the auto turbo. You shouldn't have to pay $1000 for a CPU to unlock that potential. It's ridiculous. Same with GPUs. Even if it takes special software or a key code that you have to buy from the hardware manufacturer, I'd be fine with that.
     
  4. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,200
    Likes Received:
    17,911
    Trophy Points:
    931
    Well, who knows - maybe Alienware will take a similar approach with their system BIOS and integrate further desktop-like graphics options, if Nvidia and AMD let them.
     
  5. Jaycob

    Jaycob Notebook Consultant

    Reputations:
    20
    Messages:
    195
    Likes Received:
    7
    Trophy Points:
    31
    Your logic is pretty solid. You do need to be aware of a few things :)

    - Although CPUs "Turbo Boost" when needed (usually for a few seconds, and only if there is thermal headroom for it), GPUs don't exactly turbo-boost. They usually have only two different settings: 2D clocks (usually very low clocks, AKA idle clocks - the clocks the GPU sits at when not executing a 3D context) and 3D clocks (the "full blown" frequencies, i.e. the GPU ready for some 3D goodness). Some tools allow you to set profiles (a keyboard shortcut to make the card run with specific clocks and the like). Overclocking means replacing the 3D clocks with higher clocks. Or you can tone them down.

    - Now, there is another thing you must take into account: workload. Even if I have my card on constant 3D clocks, overclocked, it is possible that its temperature is lower than yours. Why? If a card is under heavy constant load, it will heat up. If it is not being utilized, even while running at high clocks, the temperature impact is minimal. Limiting frame rates is a good idea (as you did! it really is a good idea), as it prevents heating up in many scenarios. So, imagine I have my card overclocked, but I'm using v-sync on an old title, and that means it is utilized just 30% of the time. If I ran the same game at stock clocks, or underclocked, but with v-sync off, my card could easily run hotter - even at lower frequencies!! Software frame limiters are usually sloppier than v-sync but cost less processing power (there's a bare-bones sketch of one at the end of this post). V-sync can easily shave off 5 to 10% of your FPS. So, if you are on a 60Hz monitor playing a game that averages 90fps, v-sync is a good idea. But if you are playing a 30FPS game, you are going to lose a couple of frames. Now, there are some adaptive v-syncs that "know" when to kick in or not, but I digress...

    - Ah, that is somewhat true. Synthetic benchmarks are just the tip of a card's true performance. Many benchmarks come in the form of "games", some playable, some not, in order to best simulate that :)
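
    Since I mentioned software frame limiters, here's a bare-bones sketch of how one works: measure the frame time and sleep off the rest of the 1/59 s budget. Purely illustrative, not how any particular limiter is actually implemented, and render_frame() is just a stand-in for the game's own draw call.

    Code:
    /* Minimal software frame limiter sketch (illustrative only). */
    #include <stdint.h>
    #include <time.h>

    #define TARGET_FPS      59
    #define FRAME_BUDGET_NS (1000000000LL / TARGET_FPS)

    static int64_t now_ns(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return (int64_t)ts.tv_sec * 1000000000LL + ts.tv_nsec;
    }

    void limited_render_loop(void (*render_frame)(void))
    {
        for (;;) {
            int64_t start = now_ns();
            render_frame();                           /* the actual work */
            int64_t spare = FRAME_BUDGET_NS - (now_ns() - start);
            if (spare > 0) {                          /* finished early: idle instead of rendering ahead */
                struct timespec pause = { 0, (long)spare };
                nanosleep(&pause, NULL);
            }
            /* if spare <= 0, the frame blew its budget; just carry on */
        }
    }

    The coarse sleep granularity is why software limiters are sloppier than v-sync, but the GPU genuinely idles during the sleep, which is where the heat and power saving comes from.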
     
  6. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,200
    Likes Received:
    17,911
    Trophy Points:
    931
    He is talking about the GPU Boost seen on the GTX Titan being brought to mobile.
     
  7. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    And to mention that turbo on CPUs is much more stable now and can be kept up for a good while; it's not just a few seconds anymore.
     
  8. Jaycob

    Jaycob Notebook Consultant

    Reputations:
    20
    Messages:
    195
    Likes Received:
    7
    Trophy Points:
    31
    True, it is. But unfortunately, most "stock" configurations, even on modern CPUs, have them boosting for no more than half a minute. ThrottleStop proves that it can run stable for quite a bit longer.
     
  9. Jaycob

    Jaycob Notebook Consultant

    Reputations:
    20
    Messages:
    195
    Likes Received:
    7
    Trophy Points:
    31
    My mistake then. I have yet to read anything about that. Sounds like dynamic overclocking of some sort. Possibly when FPS dips here and there, no?
     
  10. Support.1@XOTIC PC

    Support.1@XOTIC PC Company Representative

    Reputations:
    203
    Messages:
    4,355
    Likes Received:
    1,099
    Trophy Points:
    231
    This video sums it up pretty well. There was a really in-depth hands-on video made when it was first released, but I can't seem to find it.
     
  11. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    You know, reading this, AMD's marketing dept will just say that Enduro didn't have an underutilization issue - they were just seeing to the needs of "comfortable" gaming, as in limiting the fps and thus limiting the heat.
     
  12. Calibre41

    Calibre41 Notebook Evangelist

    Reputations:
    547
    Messages:
    384
    Likes Received:
    4
    Trophy Points:
    31
    I see your angle on this, and I don't think they'd be stupid enough to insult us with that - although it wouldn't surprise me.


    To others reading this - I'd love for this thread not to be plagued by another 7970M utilization argument; we have threads for venting the pros/cons of AMD's drivers, please use them :thumbsup:
     
  13. Support.1@XOTIC PC

    Support.1@XOTIC PC Company Representative

    Reputations:
    203
    Messages:
    4,355
    Likes Received:
    1,099
    Trophy Points:
    231
    Hear, hear!
     
  14. Calibre41

    Calibre41 Notebook Evangelist

    Reputations:
    547
    Messages:
    384
    Likes Received:
    4
    Trophy Points:
    31
    Great video, I like this.

    What I'm getting at is that my GT 650M (which features a "boost" that takes it up to 950MHz) boosts automatically - I don't have control over it as a typical end user.

    What I'd like to see, and it might answer my question (prayers), is not just a temperature and frequency target but an actual FPS target which I could also set as a priority in a similar fashion. So at driver level the GPU limits the frames to, say, 59fps in my case, saving vast amounts of power and reducing heat, but ALSO I want to set a second priority of, say, 110% core frequency, so that when the 59fps priority isn't met, the core can boost up to 110% and give it more power when required.
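
    To make the two-priority idea concrete, this is roughly the logic I'm imagining at driver level. Everything here is made up for illustration - no such driver interface exists today, which is the whole point; the fps/temperature readings and the clock-setting hook are supplied by the caller as stand-ins.

    Code:
    /* Hypothetical two-priority boost controller (sketch only).
     * Priority 1: cap the frame rate at FPS_CAP to save power/heat.
     * Priority 2: if the frame rate drops below FPS_FLOOR and there is
     *             thermal headroom, allow the core to boost to BOOST_PERCENT. */

    #define FPS_CAP        59.0
    #define FPS_FLOOR      30.0   /* below this, boosting is worth the extra heat */
    #define BOOST_PERCENT  110    /* +10% core clock */
    #define STOCK_PERCENT  100
    #define TEMP_LIMIT_C   85

    void boost_policy_tick(double fps, int temp_c,
                           void (*set_core_clock_percent)(int))
    {
        if (fps >= FPS_CAP) {
            /* Priority 1 satisfied: no reason to run hot, drop back to stock. */
            set_core_clock_percent(STOCK_PERCENT);
        } else if (fps < FPS_FLOOR && temp_c < TEMP_LIMIT_C) {
            /* Frame rate is suffering and there is thermal headroom: boost. */
            set_core_clock_percent(BOOST_PERCENT);
        }
        /* Between the floor and the cap, or when too hot, leave the clocks
           alone and let the card cool off and recover. */
    }

    The point is that the boost only fires when the fps floor is missed and there is headroom; the rest of the time the card sits at stock and recovers.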

    Thanks for the video link; I wasn't aware of that, it looks great.

    I know we all want that sort of fan profile utility brought to our laptops. I might start a petition thread asking people to sign up, and forward it to Clevo lol!!! "Clevo! GIVE US CUSTOM FAN CONTROL PROFILES!!!"
     
  15. Support.1@XOTIC PC

    Support.1@XOTIC PC Company Representative

    Reputations:
    203
    Messages:
    4,355
    Likes Received:
    1,099
    Trophy Points:
    231
    It'd be amazing to see this in laptops, but it's unlikely we'll see it.
     
  16. Jaycob

    Jaycob Notebook Consultant

    Reputations:
    20
    Messages:
    195
    Likes Received:
    7
    Trophy Points:
    31
    I'm surprised something like this isn't achieved via software already. It wouldn't weigh too much on the CPU either, and could easily be moved down to a lower (driver) level later.
     
  17. Support.1@XOTIC PC

    Support.1@XOTIC PC Company Representative

    Reputations:
    203
    Messages:
    4,355
    Likes Received:
    1,099
    Trophy Points:
    231
    If you feel like creating it I'm sure there would be a number of people willing to donate to the cause ;)
     
  18. Calibre41

    Calibre41 Notebook Evangelist

    Reputations:
    547
    Messages:
    384
    Likes Received:
    4
    Trophy Points:
    31
    "But why??" is the rhetorical question looping around in my mind that is aimed squarely at the manufacturers ??????????

    I can, thanks to Prema, set my fan to 100%, but I'd rather have full control; it's not like they couldn't implement a fail-safe feature to stop you from running the CPU/GPU too hot.

    Come on Clevo, give us control - we buy Clevos, clearly we know what we're doing already! :D
     
  19. Support.1@XOTIC PC

    Support.1@XOTIC PC Company Representative

    Reputations:
    203
    Messages:
    4,355
    Likes Received:
    1,099
    Trophy Points:
    231
    Haha, you'd think everyone would know what they're doing, some of you kids are just crazy though ;)
     
  20. Jaycob

    Jaycob Notebook Consultant

    Reputations:
    20
    Messages:
    195
    Likes Received:
    7
    Trophy Points:
    31
    Fair enough. I'm not too used to accessing those readings on a GPU, but if I find some guidance on Google on how to do so (or better, if there is some sort of system call or API for it), I'd be happy to give it a shot ^^. Although reading is one thing; writing (frequencies), on the other hand... But if I can nail that, I can probably create a service that at least keeps track of temperature and load and scales frequencies accordingly.

    Edit: Found this: AMD Display Library API. Will give the documentation a read... http://developer.amd.com/tools-and-sdks/graphics-development/display-library-adl-sdk/

    Edit 2: Sorry for the OT, really sorry. Just to let you guys know that I found the API functions to get it going - they're in the Overdrive 6 module. I think I should be capable of making something remotely similar (in terms of functionality) to MSI Afterburner (regarding clocks, temps, and the like). Sigh. Will have to try a few things with them, perhaps later this week or the next.
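
    Edit 3: For anyone curious, this is roughly the shape of the first thing I'll try - reading the GPU temperature through Overdrive 6. It's pieced together from my reading of the ADL SDK samples and I haven't run it yet, so double-check the function names and signatures against the SDK headers before trusting any of it.

    Code:
    /* Rough ADL / Overdrive6 temperature read (Windows, loads atiadlxx.dll).
     * Based on the ADL SDK samples; treat names and details as unverified. */
    #include <windows.h>
    #include <stdio.h>
    #include <stdlib.h>

    typedef void* (__stdcall *ADL_MAIN_MALLOC_CALLBACK)(int);
    typedef int (*ADL_MAIN_CONTROL_CREATE)(ADL_MAIN_MALLOC_CALLBACK, int);
    typedef int (*ADL_MAIN_CONTROL_DESTROY)(void);
    typedef int (*ADL_OVERDRIVE6_TEMPERATURE_GET)(int, int*);

    static void* __stdcall ADL_Main_Memory_Alloc(int iSize) { return malloc(iSize); }

    int main(void)
    {
        HINSTANCE dll = LoadLibrary(TEXT("atiadlxx.dll"));   /* 32-bit OS uses atiadlxy.dll */
        if (!dll) return 1;

        ADL_MAIN_CONTROL_CREATE create =
            (ADL_MAIN_CONTROL_CREATE)GetProcAddress(dll, "ADL_Main_Control_Create");
        ADL_MAIN_CONTROL_DESTROY destroy =
            (ADL_MAIN_CONTROL_DESTROY)GetProcAddress(dll, "ADL_Main_Control_Destroy");
        ADL_OVERDRIVE6_TEMPERATURE_GET temp_get =
            (ADL_OVERDRIVE6_TEMPERATURE_GET)GetProcAddress(dll, "ADL_Overdrive6_Temperature_Get");
        if (!create || !destroy || !temp_get) return 1;

        create(ADL_Main_Memory_Alloc, 1);   /* 1 = only adapters physically present */

        int milli_c = 0;
        if (temp_get(0, &milli_c) == 0)     /* adapter index 0; ADL_OK == 0 */
            printf("GPU temperature: %d C\n", milli_c / 1000);  /* reported in millidegrees */

        destroy();
        FreeLibrary(dll);
        return 0;
    }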
     
  21. Support.1@XOTIC PC

    Support.1@XOTIC PC Company Representative

    Reputations:
    203
    Messages:
    4,355
    Likes Received:
    1,099
    Trophy Points:
    231
    Haha, yeah, that's all beyond what I know how to do currently, and I don't have a dire need to learn - which is where I'd assume most people who'd even find a use for it would categorize themselves. It'd be handy in some situations though, so if something could be created and it's within your skill set, it'd be a good project.
     
  22. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Turbo Boost and GPU Boost are Nvidia's way of letting the system itself find the optimum performance based on temperature.
    Instead of tailoring 100 different versions of the GT 650M to the 100 different notebooks, which come in all different shapes (and cooling capabilities), Nvidia code the GPU to follow the system's temperature and apply different clocks accordingly. System builders are happy because their notebooks don't catch on fire, and Nvidia have a much easier job coding the GPUs.
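
    As a rough mental model (not Nvidia's actual algorithm, which they don't publish in detail): the firmware effectively maps thermal headroom to a clock offset, so the same GPU settles at different sustained clocks in differently cooled chassis. Something along these lines, with made-up example numbers:

    Code:
    /* Toy model of temperature-driven boost - NOT Nvidia's real algorithm,
     * just an illustration of "clocks follow thermal headroom".
     * The clock and temperature constants are example figures only. */
    int boosted_clock_mhz(int temp_c)
    {
        const int base_mhz      = 835;   /* guaranteed base clock */
        const int max_boost_mhz = 950;   /* ceiling when the chassis cools well */
        const int temp_target_c = 80;    /* at or above this, stay at base */
        const int temp_floor_c  = 60;    /* at or below this, full boost available */

        if (temp_c >= temp_target_c) return base_mhz;
        if (temp_c <= temp_floor_c)  return max_boost_mhz;

        /* Linearly trade clock for temperature in between. */
        int headroom = temp_target_c - temp_c;
        int range    = temp_target_c - temp_floor_c;
        return base_mhz + (max_boost_mhz - base_mhz) * headroom / range;
    }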
     
  23. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,200
    Likes Received:
    17,911
    Trophy Points:
    931
    They did once and people broke their machines....

    We will see the FPS target mature once Nvidia have had another go at it and it will likely come back.
     
  24. Jaycob

    Jaycob Notebook Consultant

    Reputations:
    20
    Messages:
    195
    Likes Received:
    7
    Trophy Points:
    31
    Since I found a few things (the API) I think I'll give it a whirl eventually.

    Also found a post (regarding this API) talking about AMD's dynamic boost: AMD May Introduce A True GPU Dynamic Boost Clock | VideoCardz.com
     
  25. Support.1@XOTIC PC

    Support.1@XOTIC PC Company Representative

    Reputations:
    203
    Messages:
    4,355
    Likes Received:
    1,099
    Trophy Points:
    231
  26. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Don't like it. I'd rather have control over it and have steady voltage and clock.
     
  27. Calibre41

    Calibre41 Notebook Evangelist

    Reputations:
    547
    Messages:
    384
    Likes Received:
    4
    Trophy Points:
    31
    Oh right...... Well, that doesn't help our cause......... :(
     
  28. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,200
    Likes Received:
    17,911
    Trophy Points:
    931
    I like the boost so long as it can be controlled. Like I said, I would rather run at 1066MHz in BioShock and 1033MHz in Crysis without having to faff with settings.