The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    XPS BIOS Optimus Linux CUDA

    Discussion in 'Dell XPS and Studio XPS' started by Sam2255, Jan 14, 2012.

  1. Sam2255

    Sam2255 Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    I'm thinking about buying an XPS 15 for CUDA development under Linux. The big deal is something called Optimus.

    I'm trying to find out if I can set the BIOS to hard-code which graphics chip to use.

    I'm trying to find someone with a relatively new XPS 15 machine to check the BIOS for me. If you have one, please answer a few Q's for me:

    About how old is the machine?
    Do you have an Nvidia graphics chip?
    Which one? 460m, 525m, 540m, 555m, etc.
    What version of BIOS are you running?
    Does the BIOS have an Optimus setting? (I'm not sure what they would call it.) It should be under video. It will have options like "discrete" (Nvidia), "integrated" (Intel), or both (Optimus).

    Thanks
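    (If anyone wants to answer the "which Nvidia chip" question without rebooting into the BIOS, the PCI device list shows it directly on Linux. A hedged sketch; `list_gpus` is a hypothetical helper I'm naming here, and the grep pattern assumes the usual lspci class names:)

```shell
# Sketch: identify which GPUs a laptop exposes by filtering lspci-style output.
# Keeps only VGA / 3D controller lines, so both the Intel IGP and the
# Nvidia chip (e.g. GT 540M) show up together.
list_gpus() {
    grep -Ei 'vga compatible controller|3d controller'
}

# On a real machine you would run:
#   lspci | list_gpus
```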
     
  2. nomygod

    nomygod Notebook Geek

    Reputations:
    6
    Messages:
    91
    Likes Received:
    0
    Trophy Points:
    15
    Optimus and Linux do not play well together at all right now. Here's my input:
    XPS 15 l502x
    about 6 months old
    BIOS version A07 (CapitanKasar's undervolted BIOS)
    other info in my sig.
    The BIOS has sparse settings and does not include a setting for video.
    Some say this is not possible because of the way Optimus works:
    For the laptop display, everything sent to the screen passes through the Intel card. Any GPU-intensive rendering is done by the Nvidia card and routed through the Intel card to the screen.
    All video to the HDMI/DisplayPort outputs is sent via the Nvidia card.

    So in answer to your question: no, it cannot. AMD switchable graphics can do this, but that is because control of all video output is actually handed off between the integrated and dedicated cards.
    Hope I've been helpful; granted, I don't know much about Linux.
     
  3. cri-cri

    cri-cri Notebook Consultant

    Reputations:
    43
    Messages:
    111
    Likes Received:
    0
    Trophy Points:
    30
  4. LLStarks

    LLStarks Notebook Evangelist

    Reputations:
    39
    Messages:
    390
    Likes Received:
    2
    Trophy Points:
    31
    You don't need Bumblebee for CUDA on Optimus. Just the CUDA PPA or .run files directly from Nvidia.

    Plus, the blog's instructions are pretty garbage since half of the packages are cut off by the margin:

    sudo apt-get install nvidia-cuda-gdb nvidia-cuda-toolkit nvidia-compute-profiler libnpp4 nvidia-cuda-doc libcudart4 libcublas4 libcufft4 libcusparse4 libcurand4 nvidia-current nvidia-opencl-dev nvidia-current-dev nvidia-cuda-dev opencl-headers

    For Arch, just install cuda-toolkit with pacman.
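    (Whichever route you take — PPA, .run file, or pacman — a quick sanity check afterwards is that the CUDA compiler driver actually landed on your PATH. A sketch only; `tool_present` is a hypothetical helper, not part of any of these packages:)

```shell
# Sketch: check whether a command-line tool is installed and visible on PATH.
tool_present() {
    command -v "$1" >/dev/null 2>&1
}

# After installing the CUDA toolkit you would expect:
#   tool_present nvcc && nvcc --version
# If nvcc is missing, the toolkit did not install where the shell can see it.
```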
     
  5. SuspiciousLurker

    SuspiciousLurker Notebook Geek

    Reputations:
    24
    Messages:
    82
    Likes Received:
    0
    Trophy Points:
    15
    This statement is only true for the HDMI port. Output through DisplayPort works with Optimus fully functional (i.e. it correctly switches between nVidia and Intel).

    Plugging into the HDMI connector will force the nVidia GPU on 100% of the time.
     
  6. Sam2255

    Sam2255 Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Thanks all for the info & comments.

    Could you elaborate on this a bit? If I plug a device into the HDMI port, will the Nvidia card be controlling both the laptop display and the HDMI output, or just the HDMI output? Any idea whether this switching is done by the hardware or by the software driver?

    I'm trying to force the Nvidia card to be ON all the time in a Linux environment. (CUDA won't select the Nvidia card; I need to force it somehow.) If the hardware does it just by plugging something into the HDMI port, I can handle that. If it is software, I'll need to figure out whether there is a Linux driver that will do it. There is no Linux support for Optimus. (I'm pretty sure there is regular Linux driver support for the chips themselves.)
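    (For what it's worth, on the Linux side you can at least check whether the Nvidia kernel driver is loaded before blaming CUDA. A hedged sketch; the module name `nvidia` and the `/dev/nvidia*` device nodes are assumptions about Nvidia's proprietary Linux driver, and `nvidia_loaded` is a hypothetical helper:)

```shell
# Sketch: decide from lsmod-style output whether the nvidia kernel module
# is loaded. CUDA cannot enumerate the GPU without the driver module.
nvidia_loaded() {
    grep -q '^nvidia '
}

# On a real system you would run:
#   lsmod | nvidia_loaded && echo "nvidia driver loaded"
#   ls /dev/nvidia* 2>/dev/null   # device nodes CUDA opens to reach the GPU
```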
     
  7. SuspiciousLurker

    SuspiciousLurker Notebook Geek

    Reputations:
    24
    Messages:
    82
    Likes Received:
    0
    Trophy Points:
    15
    From my own personal experience (Win 7 Ultimate): I utilize an external monitor quite a bit; when connecting via HDMI cable, the nVidia card is FORCED on (I can detect this by watching the nVidia GPU Activity icon). This is true whether the desktop is being Cloned or Extended to the external monitor. It stays this way even sitting at the desktop with no apps running. Additionally, I can hear the fan running much more often (presumably the nVidia chip is generating more heat than the Intel graphics would). It's as if the Optimus settings (which are software-based) are completely ignored when the HDMI is in use. This leads me to suspect that this forced action is done in hardware.

    Surely, when the nVidia GPU is enabled, it is controlling ALL displays (internal and external). I can't imagine Intel and nVidia working so well together that they cooperate and share the workload of multiple apps running on different displays simultaneously. But I'm definitely no expert on that.

    This is why I much prefer to use a DisplayPort cable instead, as it preserves the Optimus functionality in full. For your CUDA detection purposes, however, it might behoove you to connect with HDMI instead.
     
  8. funky monk

    funky monk Notebook Deity

    Reputations:
    233
    Messages:
    1,485
    Likes Received:
    1
    Trophy Points:
    55
    If you use the HDMI port then the Nvidia card is forced on. There's physically no way to use the Intel graphics with the HDMI port, as there's no electrical connection between them.

    As for the two working together, I'm not sure whether they share the workload, but the Intel frame buffer is shared with the Nvidia GPU so that it can display on the notebook screen. Using the same buffer doesn't mean the Intel graphics is helping with the processing in any way. Frankly, I don't see much point in trying, since the Intel GPU is insignificant by comparison and might actually drag performance down.