The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
← Previous page | Next page →

    M17x R4 Wishlist

    Discussion in 'Alienware 17 and M17x' started by skydrome1, Jun 2, 2011.

  1. imglidinhere

    imglidinhere Notebook Deity

    Reputations:
    387
    Messages:
    1,077
    Likes Received:
    59
    Trophy Points:
    66
    This. The mostly aluminum body would be the only thing I'd like to see again, since it sealed the deal knowing your machine was made of actual metal. :D That and the ability to add multiple GPUs. :D
     
  2. Stealth55

    Stealth55 Notebook Consultant

    Reputations:
    177
    Messages:
    210
    Likes Received:
    0
    Trophy Points:
    30

    That's a docking station, and laptops can have that now, but it runs the card at x1 speed. My idea is full x16 speed.
     
  3. Tsunade_Hime

    Tsunade_Hime such bacon. wow

    Reputations:
    5,413
    Messages:
    10,711
    Likes Received:
    1,204
    Trophy Points:
    581
    1) Change ODM manufacturer back to Flextronics
    2) Bring back 16:10 WUXGA RGBLED, possibly IPS, or at least a higher color gamut than the WLED offered, and offer a matte screen
    3) Bring back multi GPU's (but that would mean the quick and swift death of M18x)
    4) Possibly a Thunderbolt connection
    5) Metal chassis instead of chocolate tray holder plastic chassis
     
  4. Generic User #2

    Generic User #2 Notebook Deity

    Reputations:
    179
    Messages:
    846
    Likes Received:
    0
    Trophy Points:
    30
    ...dude...just ask for a thunderbolt port and an included cable rather than asking for frankenstein lol

    EDIT: ps, you don't really lose performance with reduced pcie lanes unless you're connecting to dual gpus.
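To put rough numbers on the x1-vs-x16 point being argued here, a minimal sketch using the general PCIe 2.0 spec figures of the era (5 GT/s per lane, 8b/10b encoding, so about 500 MB/s of payload per lane per direction); these are spec values, not measurements from this thread:

```python
# Approximate per-direction payload bandwidth of a PCIe 2.0 link.
GT_PER_LANE = 5.0             # gigatransfers/s per lane (PCIe 2.0)
ENCODING_EFFICIENCY = 8 / 10  # 8b/10b line code: 8 data bits per 10 bits sent
BITS_PER_BYTE = 8

def lane_bandwidth_mb_s(lanes: int) -> float:
    """Usable bandwidth in MB/s for a PCIe 2.0 link with this many lanes."""
    gbit_per_lane = GT_PER_LANE * ENCODING_EFFICIENCY  # 4 Gbit/s per lane
    return gbit_per_lane * 1000 / BITS_PER_BYTE * lanes

print(lane_bandwidth_mb_s(1))   # x1 (typical ExpressCard link): ~500 MB/s
print(lane_bandwidth_mb_s(16))  # x16 (internal MXM/desktop slot): ~8000 MB/s
```

The 16x gap is why single-GPU setups often survive a narrow link (the card keeps most data in its own VRAM) while dual-GPU setups, which shuffle more data across the bus, suffer.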
     
  5. Stealth55

    Stealth55 Notebook Consultant

    Reputations:
    177
    Messages:
    210
    Likes Received:
    0
    Trophy Points:
    30
    Right... so I can use this thing: External Thunderbolt graphics card for Macs will not blow you away | ExtremeTech. xD No thanks, I'd rather have "frankenstein" if this Thunderbolt GPU adapter can't even perform better than current mobile Radeon/NVIDIA GPUs.
     
  6. Generic User #2

    Generic User #2 Notebook Deity

    Reputations:
    179
    Messages:
    846
    Likes Received:
    0
    Trophy Points:
    30
    are there supposed to be benchmarks in that article?
     
  7. ibraveheart

    ibraveheart Notebook Evangelist

    Reputations:
    35
    Messages:
    593
    Likes Received:
    2
    Trophy Points:
    31
  8. sfara

    sfara Notebook Guru

    Reputations:
    1
    Messages:
    69
    Likes Received:
    0
    Trophy Points:
    15
    I just wish they'd make the product feel more polished and properly tested before release ... tired of all those little (big?) imperfections!
     
  9. bigspin

    bigspin My Kind Of Place

    Reputations:
    632
    Messages:
    3,952
    Likes Received:
    566
    Trophy Points:
    181
    1. Assemble the screen in a dust-free environment (I have dust bits inside my brand-new system)
    2. Matte screen option

    3. iGPU disable option in the BIOS (tired of driver issues / I don't wanna use a hacked BIOS)
     
  10. bigtonyman

    bigtonyman Desktop Powa!!!

    Reputations:
    2,377
    Messages:
    5,040
    Likes Received:
    277
    Trophy Points:
    251
  11. Tsunade_Hime

    Tsunade_Hime such bacon. wow

    Reputations:
    5,413
    Messages:
    10,711
    Likes Received:
    1,204
    Trophy Points:
    581
    Double? 1 x GTX 580M already smokes a GT555M by a longshot...and SLI 580M utterly destroys a GT555M. Someone bring a medic!
     
  12. YodaGoneMad

    YodaGoneMad Notebook Deity

    Reputations:
    555
    Messages:
    1,382
    Likes Received:
    12
    Trophy Points:
    56
    Heh, an M17x with a 580M is double. An M18x with two 580Ms would be four times faster.

    The 555M gets a graphics score of around 1500, the 580M is 3400, and dual 580Ms would be 6500-ish.

    Of course, with OCing a single 580M will hit a 4500 graphics score, which makes it 3 times faster, and who knows what a dual rig will hit, probably in the 8000 range.
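The multipliers quoted above can be checked quickly; the scores are the approximate community figures from the post, not official benchmark results:

```python
# Approximate 3DMark-style graphics scores quoted in the post above.
scores = {"GT 555M": 1500, "GTX 580M": 3400, "GTX 580M SLI": 6500}

baseline = scores["GT 555M"]
for gpu, score in scores.items():
    # Ratio relative to the GT 555M baseline, rounded to one decimal.
    print(f"{gpu}: {score / baseline:.1f}x the GT 555M")
```

So a single 580M lands around 2.3x and SLI around 4.3x on these figures, in line with the "double" and "four times" claims.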
     
  13. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
    Just look at the benchmark thread for scores :p
     
  14. javilionaire

    javilionaire Notebook Consultant

    Reputations:
    2
    Messages:
    279
    Likes Received:
    0
    Trophy Points:
    30
    Bring back SLI/Xfire to the M17x R4! :D
     
  15. Serephucus

    Serephucus Notebook Deity

    Reputations:
    205
    Messages:
    1,002
    Likes Received:
    6
    Trophy Points:
    56
    This is exactly what I don't want. Say byebye to Optimus.
     
  16. widezu69

    widezu69 Goodbye Alienware

    Reputations:
    3,079
    Messages:
    4,207
    Likes Received:
    168
    Trophy Points:
    131
    ?? Wha? The M18x has dual 580m's and it still has optimus.
     
  17. Serephucus

    Serephucus Notebook Deity

    Reputations:
    205
    Messages:
    1,002
    Likes Received:
    6
    Trophy Points:
    56
    No it doesn't. It has old-school graphics switching the same as the M17x R1/R2. Half the time, it requires a reboot.
     
  18. widezu69

    widezu69 Goodbye Alienware

    Reputations:
    3,079
    Messages:
    4,207
    Likes Received:
    168
    Trophy Points:
    131
    Oh right, my mistake. I don't have Optimus anyway, as I'm using 3D. I'd love to see a Lightboost panel and switchable graphics, manual or auto. I know it isn't Dell's fault that switching isn't available for 3D, so I'd like to see Nvidia/Intel make it available :)
     
  19. Serephucus

    Serephucus Notebook Deity

    Reputations:
    205
    Messages:
    1,002
    Likes Received:
    6
    Trophy Points:
    56
    And I'll tack on that I'd like to see Intel/NVIDIA (not sure who's at fault here) allow Optimus on an SLI-capable chipset. I'd be all over an M18x R2 in that case, otherwise it's an M17x R4 that may/may not have some horrible onion (peeling) chassis coating.
     
  20. widezu69

    widezu69 Goodbye Alienware

    Reputations:
    3,079
    Messages:
    4,207
    Likes Received:
    168
    Trophy Points:
    131
    I've covered my chassis in lime green matte vinyl to protect it. I'm not a big fan of dual-GPU systems, but from what I'm seeing from the M18x guys, it isn't as much of a headache as before.

    Perfect wishlist would be:

    M17x R4 with Lightboost or IPS RGBLED. Optimus included for both (both 3D and IPS don't have optimus currently). And of course the newest GPUs CPUs yada yada.

    M18x R2 with Matte IPS RGBLED and maybe 3D. With Optimus. Newest GPUs and CPUs.
     
  21. inzelux

    inzelux Notebook Evangelist

    Reputations:
    244
    Messages:
    319
    Likes Received:
    0
    Trophy Points:
    30
    As this thread has probably already covered all my general thoughts about performance, I'm only going to mention one of my wishes...

    Fans that spin a lot faster, fans that can spin in the opposite direction (and which are programmed to do so at max speed for 5 seconds or so during boot-up), as well as a better airflow design.

    Now, why?

    Because reversing the fans for a while at high speed will probably knock off some of the dust layer that tends to build up on one side of the fan blades, and higher fan speeds combined with a better airflow design will probably cool the system better.

    - Scott.
     
  22. javilionaire

    javilionaire Notebook Consultant

    Reputations:
    2
    Messages:
    279
    Likes Received:
    0
    Trophy Points:
    30
    What's wrong with an optional xfire/sli in the M17x?

    You don't have to choose the xfire/sli if you don't want it! :)
     
  23. darkdomino

    darkdomino Notebook Deity

    Reputations:
    203
    Messages:
    833
    Likes Received:
    2
    Trophy Points:
    31
    I have only had my M17x R3 for a couple weeks now, but I'd suggest that the R4 have a more powerful fan and ventilation system. I think temperature control should be an utmost priority for systems like these. I'd also like to see them work on getting some quieter fan technology in there.
     
  24. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    The wrong thing is that xfire/SLI in the M17x would ruin the sales of the M18x, and they are mass producing this stuff...
     
  25. Serephucus

    Serephucus Notebook Deity

    Reputations:
    205
    Messages:
    1,002
    Likes Received:
    6
    Trophy Points:
    56
    SLI/CF isn't limited by the number of graphics cards, it's the chipset. Whether you have one or two cards on the wrong board won't make a difference: you still won't get Optimus.
     
  26. Tsunade_Hime

    Tsunade_Hime such bacon. wow

    Reputations:
    5,413
    Messages:
    10,711
    Likes Received:
    1,204
    Trophy Points:
    581
    Optimus should be banished from this Earth. So many issues with it. It should be either manual graphics switch or a BIOS option to disable XX GPU or no switchable graphics at all.

    SLI/Crossfire is limited by the chipset's bandwidth. And yes, they are not going to make an M17x with dual GPUs, as it would compete against the M18x.
     
  27. Gearsguy

    Gearsguy Notebook Deity

    Reputations:
    570
    Messages:
    1,592
    Likes Received:
    0
    Trophy Points:
    55
    I agree, bring back manual switching. Oh no, we have to go through one second of a black screen! I mean, even if they use the same type of technology where the GPU is always on but at a super low clock rate, at least allow us to do it manually.
     
  28. widezu69

    widezu69 Goodbye Alienware

    Reputations:
    3,079
    Messages:
    4,207
    Likes Received:
    168
    Trophy Points:
    131
    The best would be manual switching without the need for a restart.
     
  29. Serephucus

    Serephucus Notebook Deity

    Reputations:
    205
    Messages:
    1,002
    Likes Received:
    6
    Trophy Points:
    56
    This. It's not the manual switching I had a problem with, it's the reboots. I've used an Optimus laptop before, and I had virtually no issues with it. Occasionally, I'd have to whitelist a game, but that's it.
     
  30. javilionaire

    javilionaire Notebook Consultant

    Reputations:
    2
    Messages:
    279
    Likes Received:
    0
    Trophy Points:
    30
    What is Optimus?
     
  31. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    I guess I am the only person on the planet who likes Optimus :) I really do like the fact that the battery lasts 4 hours on max performance using the iGP, while at the same time I can run games by right-clicking and choosing "Run with high-performance graphics processor" instead of just double-clicking (I don't even bother whitelisting in the Nvidia control panel). I need my laptop on a mobile basis, carrying it every day to the department and back home, so I don't want to plug it in the second I arrive somewhere; I just take it out and use it for a while before I feel like plugging in, and Optimus provides the perfect platform for that. My thoughts.
     
  32. bigtonyman

    bigtonyman Desktop Powa!!!

    Reputations:
    2,377
    Messages:
    5,040
    Likes Received:
    277
    Trophy Points:
    251
    I agree, it's great when it works, but it's never going to be perfect for people who like to tweak and such. :p
     
  33. Serephucus

    Serephucus Notebook Deity

    Reputations:
    205
    Messages:
    1,002
    Likes Received:
    6
    Trophy Points:
    56
    If I had to pick, I'd take Optimus (with its few problems) over manual switching (with/without reboots) any day.
     
  34. widezu69

    widezu69 Goodbye Alienware

    Reputations:
    3,079
    Messages:
    4,207
    Likes Received:
    168
    Trophy Points:
    131
    Just throwing this out there for a laugh:

    Imagine having an M17x R4 the same size as the R3 now, but with dual graphics. How? An external proprietary MXM enclosure. Similar to eGPU docks, but using MXM slots instead. GPU one will be in the slot in the laptop. The SLi bridge will connect the GPU to the mobo and be passed through the ExpressCard slot to the second GPU.

    For example, an R4 with a GTX 675M and an SLi ribbon connected to the mobo. The external MXM enclosure will have a 90mm fan directly on top with a few heatpipes going here and there. When the GTX 680M comes out, you just need to get one for the laptop and one for the external MXM box.

    This would be the next step in removable SLi. Instead of solutions like eGPU where the external GPU takes over, this would be like a secondary GPU dock. SLi when gaming, powerful single GPU on the move :D

    PROBLEMS?
     
  35. katalin_2003

    katalin_2003 NBR Spectre Super Moderator

    Reputations:
    14,958
    Messages:
    5,671
    Likes Received:
    1,517
    Trophy Points:
    331
    That's a good idea; we've talked about the subject in a different thread, but the problem is, as always, bandwidth. For this to work the way you want it, the external MXM slot has to be connected to PCIe x16/x8 like our current SLi/CF motherboards, and not through USB, eSATA, Thunderbolt etc.

    This would definitely be a +1 for portability.
     
  36. widezu69

    widezu69 Goodbye Alienware

    Reputations:
    3,079
    Messages:
    4,207
    Likes Received:
    168
    Trophy Points:
    131
    Yeah as always :(

    What about a proprietary connector? What if the motherboard was made for dual cards, but the second slot was just the slot with no physical space for a card? Then it would only be a case of getting an extender. We've seen those around.

    Anyways, all of this is just dreaming. Wish I was a billionaire and could hire people to create this for me.
     
  37. inzelux

    inzelux Notebook Evangelist

    Reputations:
    244
    Messages:
    319
    Likes Received:
    0
    Trophy Points:
    30
    That, right there, would've been a +1 rep if it wasn't for the fact that I need to spread some love before I can rep you again.

    That would be awesome, but then again, as someone else mentioned earlier, giving SLI/CFX back to the M17x will automatically make it compete in the segment where the M18x is supposed to fight, which unluckily isn't happening..? :(

    - Scott.
     
  38. katalin_2003

    katalin_2003 NBR Spectre Super Moderator

    Reputations:
    14,958
    Messages:
    5,671
    Likes Received:
    1,517
    Trophy Points:
    331
    Now you're talking! It is very much doable, and revolutionary :)
     
  39. widezu69

    widezu69 Goodbye Alienware

    Reputations:
    3,079
    Messages:
    4,207
    Likes Received:
    168
    Trophy Points:
    131
    Yeah, unfortunately the M17x is now competition for the Asus G74 and the Clevo/Sager 8170 (as if they have a chance anyway :D)

    Still though, if I was some gazillionaire, I would like to see this being done. This could work on many levels. For example the M11x and M14x could just be running iGPU when being moved around, plug in the external MXM and you get dedicated GPU. The M17x can have 1+1 GPUs and the M18x can have 2+1 allowing three-way-SLi :eek:

    Best of all, the external MXM box can be moddable. Such as custom Alienware box design, AWCC lighting control. Or even custom homebrew liquid/phase change/liquid nos cooling.
     
  40. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    Widezu, just for being so creative, I will send you 100 if this ever comes true. I would LOVE to have a dock with a ridiculously powerful GPU cluster (name it, maybe quadfireX) and a laptop which I can just dock and use at ease. It is THE desktop replacement: the power of a desktop, and mobile for people like us. Oh man, that would be a dream come true. PLEASE ALIENWARE, OH PLEASE READ WHAT THIS GUY IS SAYING!!! +rep until infinity :)
     
  41. Rypac

    Rypac Notebook Evangelist

    Reputations:
    242
    Messages:
    563
    Likes Received:
    0
    Trophy Points:
    30
    Nooo Widezu... you're getting me too excited!!! ;)

    Seriously, this is one of the best ideas I have heard in a long, long time. You, my friend, need to be hired as Dell's creative director for Alienware. :)
    I'm sure the engineers would revel in pushing the boundaries like this!
     
  42. widezu69

    widezu69 Goodbye Alienware

    Reputations:
    3,079
    Messages:
    4,207
    Likes Received:
    168
    Trophy Points:
    131
    Moved to another thread.
     
  43. superdave643

    superdave643 Notebook Geek

    Reputations:
    55
    Messages:
    94
    Likes Received:
    0
    Trophy Points:
    15
    In addition to addressing the various issues (especially the apparent thermal ones) seen with Alienware laptops, the following is my little 10-item wishlist for a future new product. Feel free to add your own or argue with my list.
    1. Ditch the M15 and M17, and replace both with an M16 that fits within a 15.6" laptop's footprint. This will slot in perfectly between the M14 and M18. Kill two birds with one stone. Maintain the M17's keyboard size. Yes, I've pulled out a tape measure and it's feasible.
    2. Make it thin - <=0.75 inches (19 mm). Carbon fibre, magnesium/lithium alloys to make it light. Use the entire metal portion of the chassis as a heat sink and fin array. If you isolate properly, neither the chassis nor other components will get annoyingly hot. And the CPU/GPU stay much cooler with so much more thermal mass and cooling surface area.
    3. 16.4" 16:10 screen à la Sony Vaio (unorthodox, I know). Make it borderless (screen extends close to the edges of the lid), LED-backlit, IPS, 2560x1600, non-glare, optional 3D and touchscreen.
    4. Load up on USB ports: minimum 6, better 8. Lose the 2.0 ones and go with all native 3.0 ports that work right from the boot-up screen.
    5. Ready for future PCIe HSDL interface drives. Maybe add a minicard SSD bay like the M6600 has.
    6. 97Wh battery lifted directly from the M6600.
    7. 3D-HD webcam with 2 CCDs at eyes-width apart.
    8. Ivy Bridge CPU and Nvidia GK (Kepler) SLI / AMD GCN Crossfire GPU options
    9. Please, PLEASE swap the positions between the PAGE UP/PAGE DN and HOME/END keys!!! I keep hitting the DELETE key when I want to PAGE UP/DN. What do you think that does to my Outlook e-mail, DELL/Alienware?!?!?!?
    10. Are Ethernet, VGA port and an optical drive really needed anymore??
     
  44. mrm2x

    mrm2x Notebook Consultant

    Reputations:
    103
    Messages:
    151
    Likes Received:
    5
    Trophy Points:
    31
    15.6" with 2560x1600 and dual GPUs? I think you're trolling ;)

    But I'm all in for that 16:10 aspect ratio ^^
     
  45. Tsunade_Hime

    Tsunade_Hime such bacon. wow

    Reputations:
    5,413
    Messages:
    10,711
    Likes Received:
    1,204
    Trophy Points:
    581
    I doubt they are going to do a dual-GPU M17x; it would compete with the M18x flagship. And thinness... look, Alienwares were never about thin. To properly cool the internal components it needs beefier heatsinks, and that contributes to the thickness. I'd wager the Razer Blade has heat issues with an i7 dual core and a midrange GT 555M, given how thin it is.

    And USB 3.0 isn't native to the chipset yet; the ports are all running off third-party chips, and thus won't boot off USB 3.0. With Ivy Bridge there will be more USB 3.0 ports, and they'll be bootable.

    16:9 is unfortunately here to stay, and nobody is going to put 2560x1600 in a laptop: too much $$$ and too few people willing to pay such an insane price. I don't like edge-to-edge screens, as most you cannot take apart; you have to buy the entire LCD top portion if you crack it. I believe the R2's RGBLED top portion can be 400+ brand new, maybe 250-ish used.
     
  46. superdave643

    superdave643 Notebook Geek

    Reputations:
    55
    Messages:
    94
    Likes Received:
    0
    Trophy Points:
    15
    I'm not the one who started talking about resolutions higher than 1080p. Resolution is not the end goal, it's pixels per inch. This article (and many others) speculates that Apple will drive up screen pixel densities in the MBPs, approaching the retina displays in the iPhone and iPad. And I want a 16.x" screen, not 15. :)

    And yes, I am overboard with dual discrete GPUs. Ivy Bridge's HD 4000 should give some bump in graphics firepower (though I'm not optimistic Intel will make good drivers for it), and later Haswell will go even further. Meanwhile, I would rather see some means of adapting a single desktop Kepler/Southern Islands chip as the sole dedicated GPU option; Tom's Hardware has tested 580M SLI and 6990M Crossfire, and their respective performances lag behind even single-chip desktop versions with the same model numbers. Of course, this might be impossible to package within something as small as or smaller than the M17x, and then there are the power consumption and thermal issues as well...
    I'm a mechanical engineer and heat transfer is the subject of my PhD (not specifically for computers, though). You need to get out of your mind the idea that a heat sink is a metal block that sits over the CPU/GPU. Jony Ive and others understood that you can use elements of a device's chassis for functions other than just holding the pieces in (in the iPhone's case, an antenna). It's very jujitsu-like in philosophy. As I said originally, you use the entirety of the metal part of the chassis, with fins die-cast in appropriate places and a forced airflow stream, as the heat sink itself. The slugs of the CPU/GPU are mounted directly against the chassis, perhaps with a thin-sheet copper heat spreader. Then you can eliminate the traditional heat sinks and heat pipes, making the computer lighter and thinner. With proper design, no part of the chassis should exceed barely-above-ambient temperatures. Of course, proper electrical and thermal isolation must be designed in.
    That's what I'm counting on. :)
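A back-of-envelope check of the chassis-as-heatsink argument above, using Newton's law of cooling, Q = h·A·ΔT. All figures here (convection coefficient, usable surface area, temperature rise) are illustrative assumptions, not measurements from any Alienware machine:

```python
# Rough steady-state dissipation of a metal chassis used as the heat sink.
h = 25.0        # W/(m^2*K): modest forced-air convection coefficient
area = 0.10     # m^2: assumed usable metal surface of a ~16" chassis
delta_t = 20.0  # K: chassis held 20 K above ambient

# Newton's law of cooling: heat rejected to the air.
q_watts = h * area * delta_t
print(q_watts)  # ~50 W continuous under these assumptions
```

On these numbers the idea is plausible for a midrange CPU+GPU load but tight for high-end SLI-class hardware, which is consistent with the "proper isolation and fin design" caveats in the post.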
     
  47. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    if it has watercooling i would buy it the first day it comes out :p
     
  48. Tsunade_Hime

    Tsunade_Hime such bacon. wow

    Reputations:
    5,413
    Messages:
    10,711
    Likes Received:
    1,204
    Trophy Points:
    581
    Well, if you've ever seen the inside of the R1/R2 chassis, you'll see why it is that thick.

    Water cooling would be too complicated, too expensive, and given how ignorant some people are, simply put, dangerous. If you've ever seen the external watercooling people have done... not pretty, nor practical.
     
  49. superdave643

    superdave643 Notebook Geek

    Reputations:
    55
    Messages:
    94
    Likes Received:
    0
    Trophy Points:
    15
    Right, water cooling still requires a heat exchanger and circulation pump between the heated water and the environment, something that is lost on many people. This is impractical to package within a laptop, unless one can tolerate tubes going in and out connecting to an external device.

    One really needs to throw away out-of-date concepts of what makes any electronic device large and thick and think ahead. Otherwise we'd still be computing with Osborne portables ca. early 1980s and holding bricks for cell phones.
     
  50. Serephucus

    Serephucus Notebook Deity

    Reputations:
    205
    Messages:
    1,002
    Likes Received:
    6
    Trophy Points:
    56
    @superdave: You're kidding, right? No Ethernet? What do you want to use for network connections?
     
← Previous page | Next page →