Read the thread...
-
TLDR version...please!
-
Someone brought up this weekend that we should have a CES road trip in 2017, then today posted a photo of the Wally World trip wagon - CES or Bust - then posted a photo of the NBR crew arriving as a rock band ready to break the laws. So I followed up with more options for arriving as cosplay characters - hence the NYCC videos of cosplay characters.
Still TL;DR?
-
Ah... SDCC is the dream, but passes for it are difficult to get.
Plus, I'm gonna stay away from the US if Trump becomes president.
Ionising_Radiation likes this. -
"You don't come to US... US Comes To You!!" ... sigh.
Last edited: Jan 11, 2016
-
Well obviously, if you already have those two machines, you likely won't dump them both to get a new MSI GS30. But the idea is that if you're the type of person who would have that kind of setup, an eGPU would be something that would work for you.
-
it's called amazon.com
-
Yes, Amazon.
-
If the eGPU worked standalone to utilize all the $ I paid for the eGPU/GPU and big screen(s) then yeah, that would be cool.
The eGPU will need to be a fully functional standalone computer...that sits on my.... d e s k t o p... -
anyhoo...
Gaming laptops will always fall behind desktops, and getting the performance of a current desktop GPU in a laptop form factor is not possible. eGPUs I find to be cumbersome, and they defeat the point of it being mobile. Are you going to be carrying a dock with you while travelling? -
Wheels, that's what the eGPUs need, and a leash, so you can pull them around behind you - so they can be mobile after all.
-
Of course not, an eGPU is not for mobile gaming. The eGPU is for people who want a thin, ultra-portable laptop (say, for university classes) and also want a powerful desktop gaming rig. The docking-station eGPU of the MSI GS30 is a perfect example of this. That way it costs less than having an ultrabook + desktop, ideally provides the same level of gaming power, and means you only have one machine that needs to be maintained.
Not everybody who wants a laptop needs/wants to game on the go.
D2 Ultima likes this. -
And call it Rover?
You gonna train it to heel? -
-
moviemarketing Milk Drinker
Exactly. Three or four years from now, when your 2015/2016 gaming laptop with TB3 is obsolete, it will be much cheaper to buy the latest desktop card + eGPU enclosure than it would be to buy a new high-end desktop or laptop.
Even if you have a non-BGA laptop that miraculously works with Volta/whatever cards, it would probably still be cheaper to buy the desktop card + enclosure than it would be to buy the latest MXM mobile flagship.
And even in 3-4 years, that 2015 laptop will likely still be fast enough to handle demanding work or school projects on the go. -
...and that's the whole point of a laptop like the GS30. It isn't a gaming laptop. It's a thin laptop that connects to an eGPU docking station. It's pretty much the equivalent of having a gaming desktop and a thin laptop for classes, except it's just one machine, and when you want the full desktop gaming experience, you connect it to the dock.
We're not talking about the Alienware GA, which is (for some reason) designed to be used with a gaming laptop. -
Well, the P775DM1 using a 980 almost like the P870DM's version is a second option, though of course no SLI. But yes, the amount of choice has dwindled. Considering single GPUs alone, there are only four machines with good enough CPU options: the P750DM, P770DM, P775DM1 and P870DM. Then, when all I do is recommend Clevo everywhere, people even accuse me of being paid by them or something. That'd be wonderful, being paid by them to speak the truth.
One of these days we'll advance. I hope. The voice I want is a voice like Linus or Tom's Hardware: thousands upon thousands of people getting the truth as it is, and realizing how they're being screwed and/or compromising.
As Ramzay said, though, the purpose of the eGPU solution is for people who want their laptop to do mostly laptop-y things until they're home and ready to game, and they get to keep one system. But of course, the only notebook with halfway-decent tech for that is the GS30, as he and I also said. ULV chips are great for battery life and basic tasks, but they're worse than desktop i3s. Non-ULV chips are fine and all, but then the machine needs to handle them. ASUS' model doesn't look capable of handling it. And then there's the whole deal about storage and I/O.
Mr. Fox likes this. -
I used to go to SDCC, starting back when it was new and open enough that you could decide to go on opening day; being a local helps.
The deluge of people that go now makes it a non-event to me; the last time I went, it was a joke how long it took to get across the buildings inside. Refuge in the suites was also tough; finally we had to hold our own parties, and those got flooded with people too.
At least now they limit the crowds with limited access, but then that ruins it for many - how can you plan to go a year in advance if you can't be sure you can get in?
NYCC looks busy now too. -
You would think that with all the creativity out there, and the influence of "Transformers", someone would make a laptop/dock that looks like something we would want to own, something cool, even if it looked like a desktop/keyboard that pulled apart and reformed as a laptop.
Anything but this thin laptop on top of a "printer box"
Last edited: Jan 11, 2016 -
Then the GS30 is clearly not for me as I prefer purpose built systems.
That looks... Ugly!
Sent from my SM-T560NU using Tapatalk
TomJGX likes this. -
moviemarketing Milk Drinker
Why only the GS30? There are many decent quad-core Skylake laptops with a TB3 port. -
Using Intel RealSense, with Amazon Alexa input, and the power train from a fiery self-balancing smart scooter...
So that's how it happens... first the eGPU MSE-6 droid...
Then finally, the C-3PO fully integrated self-driving PC, intelligent enough to stop and ask for directions
Last edited: Jan 11, 2016
killkenny1 and moviemarketing like this. -
I don't expect it to be for you. Or for me. I simply meant that for machines designed for eGPU solutions - for people who want a basic-purpose laptop on the go and to game at home - it's a good idea. I don't mean AT ALL that everybody should switch to such a solution, and I know I would not.
I was speaking about purposefully-built-for-eGPU machines. So far, we have the Crapzer Stealth, the ASUS mystery and the MSI GS30. I don't even count Alienwares, because they have a dGPU; their eGPU solution is basically a bonus. Just like a P770DM with a TB3 port could use an eGPU solution, if Clevo simply made one for their notebooks that use TB3.
When someone is making a notebook that doesn't have a dGPU and uses its available space to make it the best notebook it can be without considering a dGPU, but rather intends it to be used with an eGPU solution (even if not theirs), then it hits my consideration point. Everything else is simply "a bonus". -
If Trump becomes president, NOBODY is coming into our country.
-
I wonder if his wig has gotten to his ego?
Sent from my SM-T560NU using Tapatalk
hmscott likes this. -
-
The wig IS trump!
Mind blown!
Sent from my SM-T560NU using Tapatalk
hmscott likes this. -
And, veering back on topic...
Trump is the eGPU for the Wig mobile unit; neither is fully functional without the other
Last edited: Jan 11, 2016
TomJGX, Ionising_Radiation, HTWingNut and 1 other person like this. -
moviemarketing Milk Drinker
The AW Amplifier and MSI enclosures may be limited to a proprietary connector or a same-brand laptop, but if I understand correctly, the Razer eGPU is not exclusive to the 'Crapzer' - it can be used with any laptop with a TB3 port (maybe even desktops).
Hopefully, as APIs/drivers improve, it would be nice to exploit asymmetrical SLI, pairing the enclosures with laptops that already have a decent graphics card.
Last edited: Jan 11, 2016
hmscott likes this. -
That has not been defined/decided yet.
-
moviemarketing Milk Drinker
This was from a PC Gamer article - then again, it's PC Gamer, so take it with a grain of salt: http://www.pcgamer.com/the-razer-core-looks-like-the-graphics-card-enclosure-weve-been-waiting-for/ -
Even if the Razer eGPU isn't exclusive, the fact that it uses Optimus-like connections means it's junk. No DSR, no desktop ShadowPlay, no nVidia tech, etc. And it's going to have problems with some games too. Remember how Dragon Age: Inquisition launching with Optimus was a pain in the meowmix? I do, and I don't even have an Optimus notebook. Hence why I was so hard on it.
Asymmetrical SLI might as well be forgotten about right now. Regular SLI is garbage right now for most anything that's new, and they seem to have no intention of doing async multi-GPU like AMD does. So nope, don't even expect it. -
Honestly, eGPUs will truly be successful once they become hot-swappable (no need for a reboot). If Razer can deliver on their promises without major or frequent glitches, then good for eGPU setups.
-
That is an OS limitation, not a hardware limitation. Otherwise we would be able to use MUX switches without rebooting. It's literally the same thing.
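For what it's worth, here's a rough Python sketch - just wrapping the stock wmic tool, nothing eGPU-specific - of the adapter list Windows exposes. Both GPUs stay enumerated the whole time; changing which one is the primary adapter is the part that needs the reboot:

```python
import subprocess

# Rough sketch (Windows): list the video adapters the OS currently
# exposes. Optimus/Enduro render on the secondary adapter without ever
# changing the primary one - which is exactly the step that needs a
# reboot (or a MUX flip plus driver reload).
print(subprocess.check_output(
    ["wmic", "path", "Win32_VideoController", "get", "Name,Status"]
).decode())
```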
If it's not changing the primary graphics adapter, it's Optimus/Enduro-type stuff, which is stupid.
Mr. Fox likes this. -
killkenny1 Too weird to live, too rare to die.
It really looks like one of those 3-in-1 scanner-printer-copier things lol.
Wonder how much such a thing costs. It probably makes more sense just to get an ordinary desktop PC and a cheap laptop or a 2-in-1 à la T100.
And the size of that thing is just ridiculous.
Ionising_Radiation and hmscott like this. -
6th Gen 2.2-GHz Intel Core i7-6650U processor with Intel Iris graphics 540
It's already in a local best buy somewhere near you in a Surface Pro 4.
Not that this matters - we got what we got, and it's crap.
hmscott likes this. -
MSI has improved on the idea for the new GS40, and provided a combination dock + eGPU (PCIe) + cooling pad - without the cooling... how did they miss that?
Update: after looking at the photo of the dock without the laptop, there are 2 risers at the top of the dock that fit against the laptop exhaust, so maybe the dock is providing a vacuum to suck out the hot air - so the dock is acting like a cooler.
Last edited: Jan 12, 2016
moviemarketing likes this. -
Ionising_Radiation Δv = ve*ln(m0/m1)
Exactly. Windows has some weird behind-the-scenes mechanisms for GPUs and PCIe... Oddly enough, Apple (of all companies) has gotten the switching mechanism right ever since 2010 or so, when it first had switchable graphics in its MacBooks. Just one setting tweak in System Preferences, a log-out/log-in cycle, and boom, the primary GPU is changed. No Optimus rerouting-through-Intel nonsense. It's also another reason why Optimus laptops don't properly hackintosh with OS X without significant DSDT and SSDT tweaks.
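If you want to see which GPUs a MacBook exposes and which one is driving the display, the stock system report shows it. A rough illustration in Python, just wrapping the standard system_profiler tool (nothing Apple-internal assumed):

```python
import subprocess

# Rough sketch (OS X): dump the graphics/displays section of the system
# report. On a dual-GPU MacBook both GPUs are listed, and the internal
# display appears under whichever GPU is currently driving it.
print(subprocess.check_output(
    ["system_profiler", "SPDisplaysDataType"]).decode())
```
-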
Fortunately, newer laptops have started implementing MUX switching between the iGPU and dGPU - through a BIOS or Windows-level setting, via a physical switch or software.
This year I have seen far fewer Optimus complaints come through the forums than in previous years. -
Ionising_Radiation Δv = ve*ln(m0/m1)
Is this trend referring to Clevo only, or throughout the laptop industry?
Do you mean complaints... for hackintoshes? Or just complaints in general?
Just to clarify. -
Windows only, for laptops that also have an Intel iGPU enabled.
Many top laptops have removed the iGPU Optimus and Mux option, and only provide dGPU connectivity.
I was in the dGPU only camp, but have grown appreciative of the MUX option for battery only use.
With the MUX as with the dGPU only, the Intel iGPU is unpowered during normal AC performance usage - which gives you more temp / power headroom since the parasitic iGPU is dormant.
Besides the BS you have to deal with when trying to get a program to use the dGPU under Optimus, the iGPU is always powered on, and the models I have seen that added Optimus after being dGPU-only raised their CPU temps 10C compared to running with no iGPU powered.
Top / (any?) Clevos don't have a MUX, nor does the new MSI GT80S 980 SLI, and I'm not sure about the new Skylake MSI GT80S 980M/970M/965M models.
My G750JH had the dGPU only, no iGPU at all, and CPU temps were amazingly low, which meant that within the confines of BGA, I could OC the CPU much higher than the next generation with Optimus.
The GT80s with a MUX can run the iGPU on battery and squeeze out a little more run time; I appreciate that in my GT80 SLI-263.
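If you want to sanity-check what the dGPU itself is doing while you flip between modes, nvidia-smi can report it. A rough sketch - not every mobile part reports power.draw, so take the fields as illustrative:

```python
import subprocess

# Rough sketch: query the nVidia dGPU's state. A low P-state (e.g. P8)
# and low power draw at the desktop suggest it's idling; note this says
# nothing about the iGPU, which under Optimus stays powered regardless.
print(subprocess.check_output([
    "nvidia-smi",
    "--query-gpu=name,pstate,temperature.gpu,power.draw",
    "--format=csv",
]).decode())
```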
Last edited: Jan 12, 2016
Ionising_Radiation likes this. -
Ionising_Radiation Δv = ve*ln(m0/m1)
@hmscott very nice, but the thing is, I see other 'name-brand' laptops still coming with Optimus - they have this huge sticker plastered all over the chassis. Also, as you say, high-end laptops have almost never had Optimus, but even the P640RE still does, to a certain extent. All the same, the developments are very encouraging.
hmscott likes this. -
Good eye; I didn't see that. Unfortunately, it's still attached to a terrible CPU for performance purposes (though fine for day-to-day tasks and maybe some older games).
Log in/out cycle, huh... might I ask if switching users does it? If not, it's quite possible Apple simply reloads the drivers on a log in/out cycle, and that doesn't require a system reboot. But uhh, yes. Windows has focused all their time on preparing the OS to allow little user control, and thus they haven't fixed simple matters like this.
Interesting. I haven't seen many laptops use a MUX lately. Can you point out which ones? From what I had noticed, since the 900M series, all the higher-end ASUS and Clevo notebooks didn't use it, the GT72 does not have it now (but I believe it originally had it in its first revision?), and we know Alienware does not use it anymore. I was of the opinion that there were simply more dGPU-only notebooks.
On the contrary, high end laptops have always had Optimus since it came out, for the most part.
Clevo was (thankfully) late on the uptake; the HM series was spared while other notebooks used it. The EM, SM, SM-A and Sx series have it.
All Alienwares had it, but they could run dGPU-only, so they didn't count.
All ASUS had it.
All MSI had it (though their external display ports were wired directly to the dGPU, so closing the notebook and running an external display only would give full nVidia functionality).
Of course, SLI laptops cannot use Optimus (another reason why I said earlier to forget async SLI for things like the Razer Core) so the P370EM, P570WM, P37xSM and P37xSM-A didn't have it.
The top-end Clevos don't have it because desktop CPU chipsets can't support it, I think (I could be wrong - if someone knows, please clarify!)
Last edited: Jan 12, 2016
Ionising_Radiation Δv = ve*ln(m0/m1)
My bad, it doesn't even require a log in/out cycle. More info here.
EDIT: Even more info here. Key points to note:
"The first is that the switching is all handled automatically by Mac OS X without any user intervention (though there is actually a System Preference to deactivate it, if you choose). Apps that use advanced graphics frameworks such as OpenGL, Core Graphics, Quartz Composer or others will cause the OS to trigger the discrete GPU. So, when you are reading or writing Mail, or editing an Excel spreadsheet, Mac OS X will simply use the integrated Intel HD graphics. If you fire up Aperture or Photoshop, Mac OS X kicks on the NVIDIA GeForce GT 330M."
and this:
"The second way that it differs from Optimus is that the integrated graphics are powered down when the discrete GPU is active. This saves even more power than Optimus does, leading to a stated battery life as long as nine hours."
All in all, for all Apple's failings, they have created a superior graphics-switching solution compared to Nvidia's Optimus. Apple's APIs have distinct frameworks and classes for calling on the dGPU when needed, so it's a simple function that switches on the dGPU, and then all processing is done on that. Optimus, on the contrary, is frankly terrible.
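(As an aside, the per-app side of this is apparently just a single Info.plist key. A rough illustration - the key name is Apple's real NSSupportsAutomaticGraphicsSwitching flag, but the plistlib printing is purely for show:)

```python
import plistlib

# Rough sketch: an app declaring this Info.plist key tells OS X that it
# supports automatic graphics switching, so merely launching it doesn't
# force the discrete GPU on.
entry = {"NSSupportsAutomaticGraphicsSwitching": True}
print(plistlib.dumps(entry).decode())
```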
Ahem, the P6xxRx series (except the P640RE) all are G-Sync capable, so they are all MUXed. You can switch between iGPU only, dGPU only and Optimus in the BIOS, apparently.
Desktops apparently have their own version of Optimus... It's called Synergy. Whether it has actually been implemented, I have no idea.
Last edited: Jan 12, 2016
hmscott likes this. -
Interesting. But that seems to be false. The display output would have to hand off when powering down one GPU for another - i.e., we'd have a flash, at least. One display adapter needs to stop being used and another started. I don't fully believe what Apple is saying, unless the OS somehow has a way to go from iGPU --> Optimus --> dGPU smoothly and then disable the iGPU? Would that be possible? Using Optimus for a split second would have the dGPU doing the number crunching, and then it could simply hand off the work? Or maybe it starts the dGPU and both of them render, kind of like AFR, for a second, and then the iGPU powers down after the dGPU takes over... Maybe it might be possible? But I still think it shouldn't be THAT easy. A flash, I get. That? I don't think so.
Interesting - I didn't know the P6xxRx series ended up being MUXed. I really didn't keep up with any but the P640RE, and I thought they had just disregarded a MUX for some reason. Glad to know I was wrong; I feel much better about those machines now. Even then, I had thought its MUX only allowed iGPU-only or dGPU-only. But good to know it has Optimus for whoever happens to want it at whatever time.
Now, I know desktops have SOMETHING. But I don't think it works like Optimus or Enduro. I'll have to read up on it, but from my understanding, their chipsets simply do not support Optimus/Enduro-type functionality, though running different displays off different GPUs and keeping different GPUs active seems to work. I think one guy even managed to shove an AMD GPU and an nVidia GPU into one Windows 10 system (with MS' removal of the artificial block that kept GPU drivers from competing vendors from being installed at the same time) and use ShadowPlay to capture a game rendered on his AMD card. -
Ionising_Radiation Δv = ve*ln(m0/m1)
I really have no idea exactly how it works, but when I launched The Witcher 2 on my friend's Retina MBP with the GT 750M, there was no flash. The game ran at dGPU frame rates, so it was clearly not using the iGPU. What is interesting, though, is that Apple's GPU-switching implementation is brand-agnostic - AMD, Nvidia - they don't need to rewrite the code for the solution every time they change brands, which happens with practically every refresh of their MacBooks. They like to alternate back and forth between the two vendors. As you say, the dGPU is probably started up when it is called, some magical mixing goes on, and then the iGPU shuts off. It probably has some overlap between the two GPUs. I might need to delve into the Core Graphics classes to see how they work. As far as the user is concerned, it's 'seamless'.
hmscott likes this. -
The only thing going through my mind when I read the article
Mr. Fox and Ionising_Radiation like this. -
I see. See, this is what separates Apple's bullcow from Micro$haft's bullcow. Apple, while generally wanting to retain about as much control over their own devices as humanly possible, and liking forced obsolescence for their phones, does certain things that the likes of M$ and ASUS don't. They actually TRY to find ways to optimize the way the consumer experiences their system. It's clean, fluid, has few problems in general, etc. Their QA is usually fantastic, and they hold high standards for warranties and the materials used in their gear. I mean, their stuff sucks from a performance or basic-sense standpoint, but they'll make sure your battery life is long. They'll make sure that once that OS is installed, you won't encounter any issues unless you're the kind of user who doesn't mind using Linux.
Micro$haft steals the outward look and feel of what Apple brings to the table (or tries to) with Windows 8 and now 10, but the OSes have basic things wrong with them. CPU utilization limitations for some programs based randomly on turbo boost multipliers, making overclocking lose potential for some programs or games. Removing or hiding (useful) features under the pretense of "thinking about the consumer" or "making sure people can't accidentally screw something up". The only thing people usually screw up is downloading too much porn or visiting shady websites looking for free stuff. Windows viruses are more common than OS X ones, so Windows PC users notice the ill effects most. But no, they have to take away basic control over and over and make things harder for experienced users, to "protect" us. Some random idiot isn't going to go find the cmd prompt command to perma-disable Driver Signature Enforcement when something won't install because Windows blocked it for being unsigned and potentially unsafe. But no, they'll make it so that you MUST reboot the PC in a certain way.
Or they'll make it so you can't tap F8 while booting the OS to access system recovery/boot options. You need to do it from inside the OS, or the OS needs to be broken and fail to boot at least (but sometimes more than) once. To "protect" people from being stupid, apparently.
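(For reference, these are the kinds of commands I mean. A rough sketch only - run from an elevated prompt, and exact behaviour varies by Windows version, so treat it as illustrative rather than gospel:)

```python
import subprocess

# Rough sketch of the bcdedit tweaks alluded to above (Windows 8/10,
# elevated prompt required; both need a reboot to take effect).

# Bring back the legacy F8 boot menu:
subprocess.run(["bcdedit", "/set", "{default}", "bootmenupolicy", "legacy"])

# Allow unsigned drivers via test-signing mode:
subprocess.run(["bcdedit", "/set", "testsigning", "on"])
```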
Their support is poo compared to the number of innate Windows issues that simply have to do with Windows not behaving, and their WHQL assurance has dwindled considerably since Windows 8. That's why 8, 8.1 and 10 are so cheap compared to Vista/7: they cut down the support, and you used to pay for that. FOR THE CONSUMER. Right?
And now they're doing forced obsolescence too, so they can get people onto their spyware OS lacking basic features like proper update control, or not resetting options when updating, or even having spyware options REMAIN OFF when you reboot after turning them off. They've not solved registry issues, or "bit rot", or the simple fact that leaving the OS running for certain periods of time simply deteriorates performance. I have a 16GB paging file; I have 16GB of RAM. I have two folders open, my pictures and my screenshots folders. They're always open, as I usually upload pictures from them by dragging, frequently; sometimes multiple times a day. After a couple of days it suddenly becomes sluggish to scroll through them; things take a while to load. This is on an SSD, with 16GB of system RAM, with a 16GB paging file allocated TO AN SSD. I can't scroll through my pictures folder like that. But if I reboot the PC, open a bunch of stuff, and do as many different things as I can for about 5 hours, I can still scroll through pictures and screenshots without lag, without waiting for loading. It's appalling that such basic crap isn't fixed, but they're removing other features "for the consumer". They just don't care. Apple doesn't let that kind of crap happen on their OSes. I know enough people who've used them extensively, both for work and for home, to have enough stories to go by. And the scroll lag isn't new; it happened on Windows Vista, and on 7, and on 8, and on 8.1, and it still happens on 10, even though I haven't touched 10 myself yet (I've heard people talking a lot).
And they'll never even think of trying to fix these basic issues, or registry wear-and-tear issues, or anything so basic that's been happening since who knows when. Because people will use Windows anyway, so there's no need to. That's the sad truth about it right now. Truth be told, though, I've never cared much about the little nagging things. I keep my system mostly clean, somehow. Three days after a fresh reinstall, when I've installed everything and tuned it to my liking, my OS moves about the same as it does after a year of installation. I'm happy about that. But from my experience, and from the past systems I've owned, that's apparently extremely hard to do. And don't you DARE tell me you've made your "latest and greatest" OS when it fixes none of the root problems and has more things removed and broken than fixed compared to previous OSes. And then they're using as much incentive as possible to push people onto it.
Do you know what happened to my friend Fuzzy one day? Fuzzy was in TeamSpeak with us. Fuzzy has a P150HM; it's had no upgrades since its original purchase. You can imagine its HDD's state, I suppose, though it works properly. One day Fuzzy said "AHHH FUH-" and timed out. Three hours later, Fuzzy came back and said "hey, guess what, I'm on Windows 10. I accidentally hit 'update' on Windows Update and it installed 10." No joke. Like, what? You're pushing it so strongly, and you're touting a bunch of games for the gamers, like Killer Instinct and Gears of War and Halo this and blah blah... all for "Windows 10". Because Micro$haft decided they won't run on Windows 8.1, or 7. Not because those OSes are incapable, and I most definitely doubt it's because they're all being written in DX12. No, everything will just be artificially limited to Win 10 from now on, because that means people will go to it! And they can proudly push bloated numbers to the public! I've gotten calls from my father's workplace, from people there who've done absolutely nothing and found Windows 10 on their (work) laptops, and suddenly things don't work anymore, like the printer. And they don't know what to do. But no, WINDOWS 10, BECAUSE IT'S THE BEST THING TO BE ON, WHERE WE'VE ONLY MANAGED TO ADD DX12 AND BREAK OR REMOVE EVERYTHING ELSE. NOTHING for the consumer. NOTHING.
But no. Apple takes care of their consumers. When you buy Apple, you aren't buying a system for control. But you're buying a system you can reliably use without much fear, and if anything DOES happen, their support will likely fix just about any basic issue real quick. And guess whose OS is cheaper than even the cheapened regular Windows 8? Apple's.
Yeah, I devolved into a rant, but everything here is true. It's the difference between Micro$haft (and ASUS, for different reasons but to a similar extent) and Apple. The first two don't like consumer choice and tinkering and upgrades, just like the third, but the first two don't look after their customers and make things easy where it matters, UNLIKE the third.
@Mr. Fox grab a cup of tea and have a read, I bet you'll enjoy this.
moviemarketing, Mr. Fox and Ionising_Radiation like this. -
I'm enjoying an ice-cold Diet Coke and gritting down on a couple of nice juicy grilled Angus patties right now. But, yeah... they're all control-freak devils... all three of them. And Micro$lop was a lot better at everything they did before they started making up stories, pretending to care what their customers wanted, and playing god. I used to actually admire Billy-Bob's empire, but now I've got nothing nice to say about their digital brothel near Bothell.
HTWingNut and Ionising_Radiation like this.
"True gaming laptops are finally becoming a reality"...
Discussion in 'Gaming (Software and Graphics Cards)' started by booboo12, Jan 9, 2016.