The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    (Future) Clevos with Razer Core (eGPU)?

    Discussion in 'Sager and Clevo' started by darkarn, Jan 7, 2016.

  1. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
Nothing that complicated. Use the laptop as a laptop, both free and docked.

    When at the desk you have access to the desktop resources through the TB3 <> eDock.

As in the traditional eGPU paradigm, the GPU would be available to one or the other, laptop or eDock computer. That GPU would display on the desktop monitors.

The laptop could use its internal display either way.

The eDock computer could have another output for its CPU's iGPU to display through another input on the monitor(s), so you could still use it while the laptop used the dGPU(s).

    You could have a simple mode for a less complex configuration, and an advanced mode to allow control over the resources accessible to the laptop and/or eDock computer.

    It's the kind of functionality MS should be "inventing" for Windows, instead of mucking around endlessly with the UI.

    To get complex...

We were headed for a much more elegant interchange of network resource sharing not all that long ago, but network speeds didn't mature quickly enough - too costly. Now that we have TB3 (+ TB3), multiple high-bandwidth paths, we can start up that intranet sharing again.

Thin client was always a losing proposition, because you couldn't function when the server was down. And network speeds wouldn't allow fast enough interconnects to support an independent fat client and still use the server's power fully - you needed to keep all the resources proximate to the IO bus.
     
    Last edited: Jan 13, 2016
  2. JADabandon

    JADabandon Notebook Geek

    Reputations:
    2
    Messages:
    90
    Likes Received:
    21
    Trophy Points:
    16
    Hello good folks, I noticed that the Razer Core is set to release in April, I'm tempted to jump on it for my Clevo P750DM-G, hoping that it will make it easier to upgrade to a Pascal GPU once those eventually release.
    I just don't really have the budget right now to drop $500 on a thunderbolt box that still needs a GPU added to it.
    Is anyone here planning on buying one so the rest of us can find out if it will actually work with our Clevos before buying it?
    I'm personally skeptical of it, as it seems to be made for ultrabooks with only Intel graphics.
     
  3. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    I suggest a wait and see approach, people will always test for you.
     
  4. JADabandon

    JADabandon Notebook Geek

    Reputations:
    2
    Messages:
    90
    Likes Received:
    21
    Trophy Points:
    16
    Thanks for the reply, I'd love to mess around with new tech, but I just don't have the budget to dive into uncharted waters.
    I'd really love to see just what the laptop will do with the internal GPU when you boot up with a new GPU on the thunderbolt port.
     
  5. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    Just the same thing as plugging a card into a system that already has one. Both would display and operate.
     
  6. JADabandon

    JADabandon Notebook Geek

    Reputations:
    2
    Messages:
    90
    Likes Received:
    21
    Trophy Points:
    16
    So it shouldn't operate like an Optimus system where it can switch between cards; instead, what should happen is both cards running simultaneously? I guess that should be interesting with two monitors.
    If it really is that simple and it doesn't create havoc for the bios or the drivers, I might just jump on it once they release it.
    Maybe plug in an old video card and see what happens.
    Hopefully someone here can get their hands on it to test it out with a Clevo laptop.
     
  7. XMG

    XMG Company Representative

    Reputations:
    749
    Messages:
    1,755
    Likes Received:
    2,200
    Trophy Points:
    181
    Sorry to spoil the party but I can tell you for free that it won't work with the current chassis that have TB3 support.

    I'm sure you guys can speculate on the reasons why this might be, but I can't really explain more - I don't want to reveal info from different parties or create misinterpretations.
     
    Last edited: Mar 22, 2016
  8. JADabandon

    JADabandon Notebook Geek

    Reputations:
    2
    Messages:
    90
    Likes Received:
    21
    Trophy Points:
    16
    So unless Razer works with Clevo, the Razer Core will not be compatible with Clevo laptops? I had a feeling drivers and bios updates would be needed...
     
  9. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,215
    Likes Received:
    740
    Trophy Points:
    131
    Over on the Razer forums, the owner of Razer was quoted saying that any TB3 laptop could use it, with one major "but": the need for specific drivers for the motherboard, graphics, etc. Looking at the Surface Book and its driver woes, it seems to me that these drivers would require a concentrated effort and be developed at some expense. You have to wonder whether an outfit like Clevo would be willing to develop those drivers for what amounts to a third-party product that will cut into their own sales. I also think that, like the Surface Book, the drivers will be a work in progress even for the Razer guys in the near term.
     
  10. XMG

    XMG Company Representative

    Reputations:
    749
    Messages:
    1,755
    Likes Received:
    2,200
    Trophy Points:
    181
    Well, there's a problem straight out of the box which, as you suggest, is because Razer wouldn't really want to help Clevo (in general terms, I don't mean that there's any bad blood between them!!!). But the problem isn't lack of collaboration, or really even whether Razer works with Clevo or not; the defining factor is whether Clevo will support eGPU through the TB3 port and associated firmware.

    @krabman also touches on the financial aspect and we could turn the conversation into something pretty boring here......but whilst some laptop companies seem to be embracing eGPU, others see it as watering down their market. Most people on here want to be able to upgrade their MXM dGPU rather than buy a new laptop, but a laptop manufacturer can't survive financially on selling half as many laptops but twice as many graphics cards.

    Don't get me wrong, I'm not supporting any company's policy or thinking here, just in part agreeing with financial side and business plans.
     
  11. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    We will have to wait and see just how the drivers work out, but as usual, graphics in notebooks is never a simple process ;)
     
  12. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,215
    Likes Received:
    740
    Trophy Points:
    131
    It's not going to work out on its own; in the near term, Clevo will have to actively pursue supporting the device, or other similar devices, by expending the resources needed to develop drivers. I'm not an economics major, but I am self-employed and working directly with the largest seafood company in the world on a day-to-day basis. In other words, no one important, but I do have a sense of how the corporate mind thinks. When I imagine myself running Clevo, I don't see any upside to expending resources on providing drivers for this product, but I do see a downside, making the question of supporting it an easy nix. If I'm that executive, the smart money resides with waiting to see if the concept gains traction, then standing on the shoulders of others and using their experience to make my own device down the road, once many unknowns have been sorted out at others' expense. Because the product is niche, you don't have to worry about missing the ground floor and market share by waiting, but you avoid cannibalizing your own product until such time as you have to, and then you can at least do so with your own goods when it becomes necessary.

    Keep in mind, gents, I'm stating my opinion here based simply on the ol' putting-myself-in-the-other-guy's-shoes methodology. Maybe I'm wrong.
     
  13. aarpcard

    aarpcard Notebook Deity

    Reputations:
    606
    Messages:
    1,129
    Likes Received:
    284
    Trophy Points:
    101
    Personally I want nothing to do with egpu. If I wanted a desk bound gpu, I would have bought a desktop. Keep the processing power inside the machine - otherwise there's no point in owning a gaming laptop. I'd build a desktop and buy a netbook before going down the egpu path.

    The real solution here, is for companies to invest in actually following the MXM standard. There's no reason why MXM cards can't be like desktop pci-e cards. Just buy a new card from the next generation from any manufacturer and plug it into your laptop. At least the last few generations of Clevo machines have enabled native MXM support, so bios updates aren't required for new MXM card compatibility - but EVERYONE needs to get on that bandwagon.

    And it would be nice too if companies like Clevo, Dell, MSI, etc. started selling their MXM modules on markets like Newegg.
     
  14. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,215
    Likes Received:
    740
    Trophy Points:
    131
    All true. On the other hand, there are real benefits to the idea. It moves the heat out of the laptop, allowing for better thermals with what remains; the interior space vacated can be utilized by other hardware, or a lesser laptop that is lighter and more easily portable can be used; and it plays well with people who are already using their laptops in a dock setup. There are also people who only game at home and not on the road, who could move to one easy-to-carry lappy like the Stealth along with the Core and the rest of the docking stuff, and come out at nearly the same cost as a gaming system plus cheap laptop, but no longer have to worry about keeping multiple devices synced, since they would only have to maintain one PC. Quite a few people mentioned this as their use for it, and while it's certainly throwing money at a problem, it's also effective for those with the money to throw. Of course, a lot of people would say buying an expensive gaming laptop is also throwing money at a problem...

    It is, however, for the reasons you mentioned, going to be a niche product. Outside of the money-ahead factor you bring up, I can't imagine packing that thing around being any better than packing a DTR for a road warrior. On the other hand, there are people like me who go someplace and stay there for extended time periods. I could have one of these on my boat along with a monitor and other peripherals, simply bring along my (now much more travel-friendly) laptop, and I'm living. I can't leave a PC there because there is zero connectivity, and nowadays it's almost impossible to bring along the data needed to maintain a gaming PC at a remote site. Someone like me could actually come out ahead of the game, but yeah, niche for sure.
     
  15. aarpcard

    aarpcard Notebook Deity

    Reputations:
    606
    Messages:
    1,129
    Likes Received:
    284
    Trophy Points:
    101
    The thing is, if I wanted something thin and light, I would not have bought a gaming laptop. Maybe I'm in the minority, but physical dimensions and weight don't factor that much into a laptop's portability for me. What makes a laptop portable for me is the fact that it has a battery and the monitor, keyboard, and mouse are built into the chassis. (Weight limit for me is probably around 20lbs) If I can have a 1" thick laptop that weighs less than 10lbs with dual MXM slots, and a real cpu socket, then great! But if such a laptop only exists with bga cpu/gpu options and/or an egpu, then no thank you. I don't see the point. What would honestly make more sense to me would be if the egpu was its own standalone desktop that you plug your laptop into, but then again, why not just have a desktop waiting for you at home?

    For me, I have a hard time understanding the desire for a thin and light gaming laptop, if it means removing the gpu from the chassis. How is that better than a desktop/netbook combo? Because now if you actually want to bring all that processing power with you on the go (which is probably one of the reasons why you bought a gaming laptop instead of a desktop), you'll also need to lug around the egpu enclosure and all that jazz - which essentially defeats the purpose of having a laptop (nevermind the fact that you'll also need to lug around a UPS because the egpu doesn't have a battery). Why not just lug around a desktop and have even more performance if you're going to go through all that trouble?

    EDIT: Also, egpu not supporting SLI or CFX (that I know of) is a HUGE negative for me as well
     
    Last edited: Mar 25, 2016
  16. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    Certainly, though over the x4 connection SLI/Xfire would not work well (unless it was a single card).
     
  17. krabman

    krabman Notebook Deity

    Reputations:
    352
    Messages:
    1,215
    Likes Received:
    740
    Trophy Points:
    131

    I've already agreed with both your points, as I said my situation is possibly too niche to be described as niche. Maybe superniche? :) I used to maintain a gaming rig up north since laptops couldn't compete on costs or performance compared to a desktop but it is no longer possible to maintain a gaming desktop without connectivity which precipitated my move to DTR gaming rigs.

    I understand in concept the folks that are throwing money at what isn't a problem to me, I'm old and every day my money becomes less valuable than my time. I quoted that portion of your post only to point out that the world has a lot of people in it. You need only wander over to the Razer section and read the ongoing thread there to see there are many who think much along the same lines as you have put forth and yet a fair number also think this is the best thing since the invention of breasts. Whatever stance a person takes the thing has sold out its first run which tells you that some people are voting yes with their wallets.
     
  18. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    Razer has nothing to do with the eGPU standard. It was created by Intel as part of the Thunderbolt 3 spec. The Razer Core is simply the first TB3/eGPU-certified device to be brought to market. MSI will be releasing their own in the near future (to supersede the somewhat clunky GS30/dock thing).

    All eGPU requires for switching within the OS is DXHybrid (i.e. hybrid switchable graphics/Optimus) support built into the BIOS. Clevo already has this, but for obvious reasons never baked it into the P700 machines. It would literally cost them nothing but a couple of hours to stick it in the BIOS.

    Standalone eGPU was demonstrated some 7 months ago by the Thunderbolt team at Intel:

    Note: In this video, the eGPU is acting in standalone mode. The video output to the large monitor is via HDMI or DP directly from the GPU (can't see the connector). This is exactly the same behaviour as existing TB2 or MXM/M.2 extension implementations at the moment.

    To re-iterate, the Core and any other enclosure should still work as a standalone GPU enclosure at the minimum. At a basic level, there are literally just 4 PCIE lanes going to the port, it'll simply be like having 2 GPUs plugged in simultaneously in a desktop.

    If we get BIOS support, that simply means you'll also be able to direct the eGPU output BACK to the internal GPU (970M/980M in the case of P700) and onto the internal screen. You'll also get hotplug capability (since the display output never physically changes) and in the case of existing Optimus/Hybrid systems, the ability to select between 3 GPUs (iGPU, dGPU and eGPU) to do the rendering.
     
  19. darkarn

    darkarn Notebook Evangelist

    Reputations:
    47
    Messages:
    655
    Likes Received:
    226
    Trophy Points:
    56
    Whoa, now that sounds really complicated; I used to think that the eGPU would automatically reroute the graphics back to the laptop screen directly, instead of having to go through the existing GPU on the laptop...

    Something tells me that there will be a performance hit, but how much I'm not sure
     
  20. Support.1@XOTIC PC

    Support.1@XOTIC PC Company Representative

    Reputations:
    203
    Messages:
    4,355
    Likes Received:
    1,099
    Trophy Points:
    231
    So many questions, but will have to just wait for answers when they launch. Still, can't wait to see some eGPUs and see what all they can do.
     
  21. XMG

    XMG Company Representative

    Reputations:
    749
    Messages:
    1,755
    Likes Received:
    2,200
    Trophy Points:
    181
    Yup fully aware of this - that's not the point I was making ;-)
     
  22. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    If the firmware did support it, the performance of an Optimus setup (for Nvidia external cards, this is how they behave if they can pipe to the internal display) is pretty much identical to regular output these days. The biggest thing is the lag it adds, which is fine for regular gaming but does not play well with VR.
     
  23. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    The "automatic rerouting of graphics" is done via PCIe frame-buffer transfers, which is exactly how Optimus/Hybrid graphics works. Hybrid graphics systems always have the screen connected to only one GPU (typically the iGPU).

    As far as performance goes, the bottleneck is DMI. The whole thing has to run through 4 PCIe 3.0 lanes to start, and then funnel through another 4-lane DMI channel which is shared with SATA/USB/Ethernet and other PCIe devices like M.2 slots. Fact is, for gaming loads it probably won't be a problem, but there are real situations which will create a bottleneck, such as if you have an M.2 NVMe SSD and fully stress it at the same time as the GPU. There's simply not enough bandwidth for that to happen.
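
    The shared-bus concern above can be roughed out with back-of-the-envelope numbers. These are nominal per-lane rates, not measurements, and the 3.5 GB/s SSD figure is an illustrative assumption for a fast NVMe drive under full load:

    ```python
    # PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding,
    # giving roughly 0.985 GB/s of usable bandwidth per lane.
    PCIE3_LANE_GBPS = 8.0 * 128 / 130 / 8  # ~0.985 GB/s per lane

    tb3_egpu = 4 * PCIE3_LANE_GBPS  # eGPU link: 4 PCIe 3.0 lanes
    dmi3 = 4 * PCIE3_LANE_GBPS      # DMI 3.0 is roughly a PCIe 3.0 x4 equivalent
    nvme_ssd = 3.5                  # assumed NVMe SSD demand under full load, GB/s

    print(f"eGPU link:  {tb3_egpu:.1f} GB/s")
    print(f"DMI ceiling: {dmi3:.1f} GB/s")
    print(f"eGPU + SSD demand: {tb3_egpu + nvme_ssd:.1f} GB/s (exceeds the DMI ceiling)")
    ```

    The eGPU alone fits, but the eGPU plus a fully stressed NVMe drive asks for nearly double what the 4-lane DMI link can carry, which is the bottleneck described above.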
     
  24. darkarn

    darkarn Notebook Evangelist

    Reputations:
    47
    Messages:
    655
    Likes Received:
    226
    Trophy Points:
    56
    Same here, I hope my Sager can survive the wait though! :D

    Hmm... Let's hope the lag is not too significant then. I wonder if the generation gap between the eGPU and the dGPU will play a part...

    Ah, I see, that clears things up a bit... Hmm, I hope this doesn't mean that other peripherals will need to be sacrificed for eGPUs
     
  25. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    The lag should be no different really unless you start using a huge cable.
     
  26. darkarn

    darkarn Notebook Evangelist

    Reputations:
    47
    Messages:
    655
    Likes Received:
    226
    Trophy Points:
    56
    Huge cable? First time I heard that the size (rather than the length) of the cable can affect performance...
     
  27. JADabandon

    JADabandon Notebook Geek

    Reputations:
    2
    Messages:
    90
    Likes Received:
    21
    Trophy Points:
    16
    I was also hoping for simplicity in design; I have an old D900F, and I was bummed out when I hit its upgrade limit.
    It's not every day that I can afford to dump well over a thousand dollars into a laptop built to accept desktop parts.
    But when I can afford it, I want to stretch it out as long as possible.
    I think everyone was hoping that this eGPU over Thunderbolt 3 was going to be just like having an external PCI Express connection for your GPUs.
    But if this Razer Core works as promised, I may just get one; it sounds like it has a lot of challenges to overcome, though.
    Why can't Clevo just come up with their own external PCI Express? I'm sure laptops like the P700 with desktop chipsets have the PCI Express lanes to spare!
     
  28. JADabandon

    JADabandon Notebook Geek

    Reputations:
    2
    Messages:
    90
    Likes Received:
    21
    Trophy Points:
    16
    I really do wish that all laptop manufacturers that utilize MXM in their laptops allowed graphics cards to be installed just like on a desktop. But there are too many size and weight issues that they take into account - "this gaming laptop is thinner" or "this gaming laptop weighs less" - which have an impact on the shape of the heat sink and require a different shape for the MXM card.
    Then there is also the crucial aspect of how much power the GPU needs to draw; doesn't the desktop 980 have its own power connector on it?
    Plus, even if the laptop was designed for the end user to easily upgrade their graphics card, you always have to buy the proprietary card from the manufacturer. And if you are still using your laptop by the time a new Nvidia GPU has come out that you want, they would rather you buy a new laptop anyway.
    It just seems like you can ride a desktop computer for more years by upgrading small things like memory and graphics, rather than completely replacing the whole system when there is a new generation of Nvidia cards coming out.
    Ugh, MXM graphics cards had so much potential for making all laptops easy to upgrade.
     
  29. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    A huge-length cable, to clarify; I thought that was obvious ;)
     
  30. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    That's exactly what it is.

    The only "challenge" is the added feature of hybrid graphics capability. If Clevo doesn't add Hybrid support to the P700 BIOS, then you simply plug your monitor directly into the eGPU instead of using the internal panel.

    They actually don't. All 16 CPU lanes go straight to the MXM slot/s. Beyond that there's only a 4-lane DMI connection to the PCH, which has 20 lanes of its own: 4 lanes for NVMe 1, 4 lanes for NVMe 2, 4 lanes for TB3, 1 lane for Wi-Fi, 1 lane for Ethernet. That's 14 of the 20 spoken for already, and you still have to feed SATA/USB3 etc.
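
    Tallying the lane counts listed above as a quick sketch (the device labels just mirror the post's examples; exact PCH allocation varies by board):

    ```python
    # Lane budget for a Skylake-era (Z170-class) PCH with 20 lanes,
    # using the allocations described in the post above.
    PCH_LANES = 20

    devices = {
        "NVMe slot 1": 4,
        "NVMe slot 2": 4,
        "Thunderbolt 3": 4,
        "Wi-Fi": 1,
        "Ethernet": 1,
    }

    used = sum(devices.values())
    print(f"Used: {used}/{PCH_LANES} lanes; "
          f"{PCH_LANES - used} left for SATA/USB3/etc.")
    ```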
     
  31. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,485
    Trophy Points:
    681
    I believe this is very possible after giving the Clevo systems some extra 'love' treatment...
     
    godfafa_kr likes this.
  32. darkarn

    darkarn Notebook Evangelist

    Reputations:
    47
    Messages:
    655
    Likes Received:
    226
    Trophy Points:
    56
    I have similar usage logic too; if it is just one part that needs changing, just change that instead of junking the entire system

    I was thinking about thicker cables lol
     
  33. JADabandon

    JADabandon Notebook Geek

    Reputations:
    2
    Messages:
    90
    Likes Received:
    21
    Trophy Points:
    16
    Well you clarified things for me. I am still considering buying that Razer Core box.
    I actually wore out the heat sink mounting screws on my old D900F by constantly changing out between my Nvidia and AMD GPUs, and I'm not feeling that anxious about changing GPUs inside my P750DM too often.
    Plus those MXM GPUs are expensive.
    But I absolutely love trying out different hardware combinations and benchmarking just for the sake of it.
    So I will be anxious to see just how well a good GPU can perform inside that Razer Core box - but with only 4 PCI Express lanes?
    Hmm, I'll just wait for benchmark results, I guess.
    But reading your description of the different PCI Express lanes, would it be safe to assume that on the SLI versions of these laptops, like the P870DM, those 16 lanes for MXM get divided into 8 lanes for each of the two GPUs?
     
  34. aarpcard

    aarpcard Notebook Deity

    Reputations:
    606
    Messages:
    1,129
    Likes Received:
    284
    Trophy Points:
    101
    The thing is, MXM is a standard. It has a standard connector, standardized board size, standardized height limitations, and standardized maximum power limits. The desktop 980 cards are actually not strictly following the MXM standard.

    Traditionally, manufacturers like Dell, Clevo, and MSI have followed the MXM standard, and as such their MXM cards have been more or less interchangeable between each other's laptops.

    If MXM was utilized as intended, then any laptop with an MXM slot would take any MXM card from any generation or manufacturer. I don't understand why we are so close to this, yet can't get there.

    Also, one of the reasons MXM modules are so expensive is that they are hard to obtain. If they were sold on Newegg, for example, their prices would definitely fall to around desktop PCIe card prices.
     
  35. PrimeTimeAction

    PrimeTimeAction Notebook Evangelist

    Reputations:
    250
    Messages:
    542
    Likes Received:
    1,138
    Trophy Points:
    156
    If my memory serves me right, it was discussed last year that not all TB3 ports can do eGPU. Intel needs the "Thunderbolt 3 eGFX extension" and AMD needs "XConnect" in order to have eGPU working.

    And it wasn't just Razer coming up with a TB3 eGPU. As far as I remember, MSI, ASUS and Acer also had similar plans for 2016.
     
  36. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    They seem to be the first to actually launch a product, though we have to see when it actually hits stores.
     
  37. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    Should work fine, at least for current-gen cards. For comparison, my desktop PC runs a 980 Ti in a PCIe 2.0 x8 slot. That's roughly the same effective bandwidth as x4 lanes of PCIe 3.0, give or take a few percent since 3.0's encoding is a bit more efficient. It benches the same as any other 980 Ti in stock or overclocked form (20K Fire Strike score).
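
    The x8 gen-2 vs x4 gen-3 comparison works out roughly like this, using nominal per-lane rates after encoding overhead:

    ```python
    # Per-lane usable throughput after line-encoding overhead:
    # PCIe 2.0: 5 GT/s with 8b/10b    -> 0.5 GB/s per lane
    # PCIe 3.0: 8 GT/s with 128b/130b -> ~0.985 GB/s per lane
    pcie2_lane = 5.0 * 8 / 10 / 8     # 0.5 GB/s
    pcie3_lane = 8.0 * 128 / 130 / 8  # ~0.985 GB/s

    x8_gen2 = 8 * pcie2_lane  # 4.0 GB/s
    x4_gen3 = 4 * pcie3_lane  # ~3.94 GB/s

    print(f"PCIe 2.0 x8: {x8_gen2:.2f} GB/s")
    print(f"PCIe 3.0 x4: {x4_gen3:.2f} GB/s")
    ```

    So the two links land within a couple percent of each other in usable bandwidth, which is why a card that benches fine in a 2.0 x8 slot is a reasonable proxy for TB3's x4 of PCIe 3.0.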

    Indeed. The real thing to see will be what happens when that bus is shared with other devices. I expect that typical gaming loads will work fine.

    Yes, SLI machines have the CPU lanes split to x8/x8. To be honest, I'm surprised the M.2 PCIe lanes aren't drawn from the CPU in the P750/770 using an x8/x4/x4 configuration, as that would be a superior way to do it. That being said, Intel specifically calls the 6700K's PCIe bus "PCI Express 3.0 Graphics" lanes, so there may be some limitations from Intel there.

    FYI, this is no longer the case. MXM has been frequently broken or overridden, mostly because ODMs usually manufacture their own cards, unlike the desktop space where card manufacturers must also build to external standards.

    For example, the MXM 3 standard still assumes LVDS is used for the internal LCD. The 100W limit is also frequently broken by the top-tier card from Nvidia (120W+ since the 680MX), which is also one of the ways they've managed to narrow the gap between desktop and mobile cards. If you go back to the older GTX 285M and prior, they only had 75W to work with while their desktop cousins were pushing 200W+. It's quite a feat that the mobile chips got over 50% of the performance of the equivalently named desktop card, and did it with 40% of the power budget at the time.
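
    The perf-per-watt point at the end can be checked with quick arithmetic; the 50%/75W/200W figures are the post's own approximations, not measured data:

    ```python
    # Approximate figures from the post: mobile chip hits ~50% of the
    # desktop card's performance on a 75 W budget vs the desktop's ~200 W.
    mobile_perf, mobile_power = 0.5, 75
    desktop_perf, desktop_power = 1.0, 200

    mobile_eff = mobile_perf / mobile_power      # perf per watt, mobile
    desktop_eff = desktop_perf / desktop_power   # perf per watt, desktop

    print(f"Mobile delivers {mobile_eff / desktop_eff:.2f}x "
          f"the perf-per-watt of the desktop part")
    ```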
     
  38. JADabandon

    JADabandon Notebook Geek

    Reputations:
    2
    Messages:
    90
    Likes Received:
    21
    Trophy Points:
    16
    Very interesting. I actually went and Googled the best I could to find pictures of the GTX 970M and 980M for the Dell Alienware, and compared them to the ones for Clevo laptops.
    They do look the same, I never really considered installing a Alienware GPU into my Clevo laptop, and probably still wouldn't, but it would be nice to be able to.
    Wouldn't the firmware be different in the cards though? Although you can still flash different firmwares.
    The laptop manufacturer does build their own MXM cards, right? The actual MXM cards themselves don't come from Nvidia, right?
    Now I seem to remember that Asus has laptops with MXM GPUs that are built differently, I tried looking up pictures of their laptops but couldn't find anything quickly, maybe I'll look some more later on after work...
     
  39. aarpcard

    aarpcard Notebook Deity

    Reputations:
    606
    Messages:
    1,129
    Likes Received:
    284
    Trophy Points:
    101
    Traditionally, Dell (alienware), Clevo, and MSI have followed the MXM standard. In most cases, vbios's can be flashed interchangeably between cards - sometimes you do have to do this to get cards to work like I needed to run Dell vbios's on my Clevo 6990m's in my Alienware to get them to function properly. I've used a mixture of Dell and Clevo cards in my Alienware M17x R2 for the past 4 generations. I also used Dell 7970m's in my current Clevo laptop for a brief period of time.

    I've sold Dell cards to be used in MSI laptops as well. Generally speaking cards from these three companies are more or less compatible because they closely follow the MXM standard.

    NVIDIA and AMD design a reference MXM module which adheres to the MXM standard (at least in terms of dimensions, and pinout - for the most part) which the manufacturers have the option of implementing. Obviously they don't have to, and thus you get companies like Asus or HP.
     
  40. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    The 285M was not even based on the same chip generation as the desktop 200 series.

    MXM is a standard on the software and physical front. It can be adapted of course, but with the right research it allows for at least some interchangeability.
     
  41. godfafa_kr

    godfafa_kr Notebook Evangelist

    Reputations:
    318
    Messages:
    485
    Likes Received:
    424
    Trophy Points:
    76
    I hope you can give the "love" to our Clevo systems!
     
    TomJGX likes this.