The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD External GFX goes PCI-E 2.0

    Discussion in 'Gaming (Software and Graphics Cards)' started by StormEffect, Jun 4, 2008.

  1. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    If you're talking about an external graphics card powering your internal LCD, first of all, WHY? Second, that's going to take the adoption of things like DisplayPort to allow such a configuration. The current LVDS panel wouldn't allow direct external input, and re-routing it through the laptop would require significant internal adaptation (and you'd have to figure out how to feed the signal back into the PC without latency and without using resources).

    Yes, you need a laptop with an XGP connection.

    Well, as the only standard for now, it's pretty much settled until someone offers something better. But like so many things, it may take a while to be adopted on a wider basis (or it may never be).
     
  2. CruXii

    CruXii Notebook Guru

    Reputations:
    12
    Messages:
    70
    Likes Received:
    0
    Trophy Points:
    15
    Wow, this sounds kickass. I wonder what nvidia is gonna do about this...
     
  3. brainer

    brainer Notebook Virtuoso

    Reputations:
    334
    Messages:
    2,478
    Likes Received:
    4
    Trophy Points:
    56
    its gonna be expensive rofl
     
  4. Thaenatos

    Thaenatos Zero Cool

    Reputations:
    1,581
    Messages:
    5,346
    Likes Received:
    126
    Trophy Points:
    231
    Yeah, it would be nice to buy the new platform from AMD and have an ultralight laptop with great battery life. Everything a road warrior needs, and when back at the hotel, able to plug it in and play the latest games. I for one am not brand biased one bit; as of now I'm more Intel due to performance, but not stuck on the brand. The only worry, like stated by others, is it not working on the internal monitor. Why, you ask? Because I take my laptop with me everywhere. If it only worked on an external monitor, that would be great for home use, but what if I wanted to game on the road? Having to lug around an external LCD would not be cool.

    But hey, if this worked on the internal LCD, even at 1280x800 on a 12-inch LCD, I'd be all over this (granted, if it actually had the power to play without sacrificing detail levels and resolution). I'll be keeping my eye on this as it may be something to invest in.
     
  5. Dreidel

    Dreidel Notebook Evangelist

    Reputations:
    144
    Messages:
    315
    Likes Received:
    0
    Trophy Points:
    30

    Why not? Then you don't have to have an external monitor when you already have a laptop with you.

    The point of the external graphics card, besides multiple displays, is to reduce heat and increase graphics performance, because eventually they should adopt external desktop GPUs. Thus you'd have the portability of a laptop and the power of a desktop.
     
  6. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    That's what I'm most interested in as well.
     
  7. TommyB0y

    TommyB0y Notebook Deity

    Reputations:
    127
    Messages:
    1,501
    Likes Received:
    2
    Trophy Points:
    56
    It would seem that the easiest way to get it to display on the laptop LCD is for the notebook manufacturer to include a video input option and just reroute the signal back to the laptop with a DVI cable from the XGP :) Could be done today easily.

    Since it needs to be plugged into the wall, the XGP is not going to get you gaming at the park, but when flying for an extended hotel trip it would be convenient not to have to bring a monitor. But I suppose you could plug it into the TV somehow; find a hotel with an LCD TV :)
     
  8. dmacfour

    dmacfour Are you aware...

    Reputations:
    404
    Messages:
    1,155
    Likes Received:
    0
    Trophy Points:
    55
    I read that the XGP cable isn't just a PCI Express cable; I read that the USB routes through it too. Maybe it's possible that they will add something that supports the laptop monitor as primary. I'll see if I can find my source.
     
  9. someone777

    someone777 Notebook Evangelist

    Reputations:
    14
    Messages:
    634
    Likes Received:
    0
    Trophy Points:
    30
    Well, I wonder what computers will be like in the next 5 years. Desktops might get replaced by notebooks :p
     
  10. dmacfour

    dmacfour Are you aware...

    Reputations:
    404
    Messages:
    1,155
    Likes Received:
    0
    Trophy Points:
    55
    I hope they do. They'll get replaced by high performance docking stations. :D

    Here is a little about the cable from Gizmodo http://gizmodo.com/5013116/ati-mobi...ith-crossfirex-and-xgp-external-graphics-box:
    I bet they could add a way to use the internal monitor as well on the cable itself.
     
  11. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    For those times when the hassle of the external break-out box is OK, but the external monitor is too much?
    Laptop panels suck. I understand if you have limited funds you wouldn't want to buy an external monitor, but that's what these solutions are geared for.
    Your needs are better suited by hybrid graphics and a larger laptop, where you could enable/disable the powerful GPU and otherwise run off the IGP.

    Which is exactly what this is for and where it's always been leading: the docking-station power of a desktop, yet it can be as small, portable, and underpowered as a MacBook Air when going portable. The adoption of this in the desktop market too has been hinted at for a long time, pretty much culminating in Darren's mock-up at THG in 2006:
    http://www.tomshardware.com/reviews/graphics-state-union,1287-4.html

    Feeding it back to the laptop's own screen has limited utility, and IMO will likely not be supported until it's as easy to do as simply adding another jack. Right now there's low return on investment for that idea.

    Even just external graphics is niche, feeding it back into the tiny laptop LCD is a niche of that niche.
     
  12. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    the real answer needs to be a drop-in gpu like ram. i know its not practical right now, but thats how it needs to be.

    as easy as a cpu swap or a ram swap.

    i dont REALLY want a lumbering external gpu on my laptop. i want all my laptop's parts inside.
     
  13. Thaenatos

    Thaenatos Zero Cool

    Reputations:
    1,581
    Messages:
    5,346
    Likes Received:
    126
    Trophy Points:
    231
    I don't see how there is low return on it feeding back to the LCD. They make money on the units, and having it give you desktop graphics on the go with the built-in LCD is what makes this awesome. Otherwise, if it requires an external monitor, I could get the same performance or better with a desktop, making this a wasted option unless you only have said AMD laptop. I know having a nice external gaming monitor will make things visually better, but what's the sense in spending lots of money on mobile technology that's not truly mobile.

    You mean like MXM technology? If so that also sounds really nice.
     
  14. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    Really?

    I don't think you guys realise just how difficult that would be.

    Anything short of a pass-through is unusable and would require a ton of hardware and throughput to manage a high-resolution image at high bit depth. It's not like capturing your console output through a VIVO connection.

    The only way to get this to work well under DVI would be a shared input, which would require adding a splitter & switch.
    The best way to get it to work would be using the 2-way support built into DisplayPort, and just have a pass-through input to a DisplayPort-driven LCD.
     
  15. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    yes, exactly like mxm. the mxm that never took off. every laptop needs to incorporate the tech, and the standards should be... well... more standard.

    i can buy ddr2 ram and it will work with my notebook, period. it should be the same way with mxm.
     
  16. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    Which is kinda what we the consumer thought/hoped MXM and AXIOM would do.

    Then OEMs convinced the IHVs that people 'preferred' to buy a whole new laptop rather than upgrade, and that support would be problematic.

    For nV and ATi it is easier, because then it's the OEM that supports the hardware, whereas if AMD and nV sold you the card directly then they would have to support you, including drivers, etc.

    Edit: Dang, beaten to the punch.
     
  17. TommyB0y

    TommyB0y Notebook Deity

    Reputations:
    127
    Messages:
    1,501
    Likes Received:
    2
    Trophy Points:
    56
    I'm sure there are a thousand people in the computer industry who could build you an external video input in about a day. It doesn't need anything fancy, just tap straight into the LCD; it doesn't have to go through the PCI or any other bus on the laptop.

    It's just not done today because, as you say, why would I want to plug something into my small crummy laptop display? Now I have a reason.
     
  18. Thaenatos

    Thaenatos Zero Cool

    Reputations:
    1,581
    Messages:
    5,346
    Likes Received:
    126
    Trophy Points:
    231
    Again, technology possibilities aside: why would I spend money on something that turns a laptop into a pseudo-desktop graphics option that isn't actually portable? Granted, expecting gamer LCD performance out of a laptop LCD is crazy, but actually being able to take this technology on the road is key. For the price this thing will probably be, you could have a desktop connected at home and leave the laptop to mobile-only tasks, as you'll only be able to use it where an external monitor is.
     
  19. NJoy

    NJoy Няшka

    Reputations:
    379
    Messages:
    857
    Likes Received:
    1
    Trophy Points:
    31
    I see XGP as reasonable and actually exciting ONLY if it will support the internal screen. And I'm sure they thought of it when designing the cable.
     
  20. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
    I'm quite excited by this development as I will be in the market for a new gaming PC in a few months and I am still unsure as to which route I will take (notebook or desktop).
     
  21. zipx2k5

    zipx2k5 Notebook Consultant

    Reputations:
    12
    Messages:
    272
    Likes Received:
    0
    Trophy Points:
    30
    Or they could take advantage of the Hybrid CrossFireX technology they've already developed and have the internal IGP CrossFire with the external card, with the laptop LCD using the output from the IGP. Due to the performance difference between the discrete card and the IGP, they wouldn't be co-rendering; the data would just be fed through that way. They hinted at doing this when they first revealed CrossFireX, so I don't think it's unreasonable to assume that they have it working. No sense developing another interface when the technology is already there.
     
  22. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    Exactly that. ATI already has the technology in their 3000 series cards, and it will be ready for prime time by the 4000 series.
     
  23. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    Right now Hybrid CrossfireX, which is the type you're thinking about, is limited to low-end graphics cards because the IGP actually does co-render and do compositing, which is fine for an HD2400/3400's output but a little tougher when the other card is an HD3870. And if it did no rendering, it still needs to be the output device, which is possible, but the major limitation is that fully rendered 8/10-bit-per-channel frames then need to be sent back to the IGP's TMDS over those lanes. That alone would occupy about 4 lanes' worth of data constantly just to send the output from the ROPs (assuming no stop along the way back through the IGP itself), and it has to have strict timing control. All of which increases the complexity even further. The hybrid NB/IGP would likely be the best starting point, though. You think it's a hassle to get an audio card to send the audio back for output out of the internal speakers; that's just audio, not something as demanding as video. Still easier to simply add a DisplayPort input once supported, which would then also let you use 2 laptops simultaneously, thus doubling your niche cache. :rolleyes:

    It's doable (just like the external input is doable), just not very practical, and I still don't see adding an HD3870 to power a 10-12" LCD as a killer app versus other features that would garner more interest from the mobile and desktop crowd.
     
  24. NJoy

    NJoy Няшka

    Reputations:
    379
    Messages:
    857
    Likes Received:
    1
    Trophy Points:
    31
    You just have to say NO, don't you?
     
  25. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    Yeah, sorry for being a realist. I'll just leave you guys to talk about the technical aspects, like StormEffect's previous definitive comment on the subject in the first thread where we 'discussed' this with 'open' minds:

    "Also remember that with most external graphics cards, you need an external monitor. The card can't feed the video back into your laptop screen."

    http://forum.notebookreview.com/showpost.php?p=2915781&postcount=2

    I'm not saying it can't, I'm saying it's impractical without some changes to current aspects of the design. And the main issue is the motivation required to get those changes made by the IHVs and OEMs.

    Big difference between exposing the barriers to adoption and making you rethink why you're asking, and jumping to conclusions.

    But anywhoo, yes, NJoy, it's just about saying "NO". As you can see in that other thread (and the other thread with 2.0 discussing these developments), I've always been against this type of thing. :rolleyes:
     
  26. zipx2k5

    zipx2k5 Notebook Consultant

    Reputations:
    12
    Messages:
    272
    Likes Received:
    0
    Trophy Points:
    30
    Really, I think your negativity towards this whole idea is getting old. Just because you personally cannot find a use for an external graphics card with the built-in laptop LCD doesn't mean that nobody ever will. Obviously, AMD/ATI thinks that not only would people find it a useful feature, but that it is also worth their time and money to develop a solution. I said that when they debuted the CrossFireX technology they mentioned that they eventually wanted the output to go through the IGP. I realize that there are some immediate problems you could associate with that due to timings and the bandwidth needed to transfer the rendered frames, but I wasn't proposing a solution, I was saying that AMD/ATI was working on one. Regardless of your beliefs as to the practicality of such a system, some engineering team at AMD/ATI seems to have figured it out.

     
  27. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    I never said nobody ever will, I said it requires changes to make it possible; regardless of which method you use, right now it's not practical. Probably gets as old as people misquoting you.
    Eventually it will change, but right now the methods you guys are using to solve the question are like trying to hammer a square peg into a round hole instead of simply waiting for the square hole to come along, which it is. My reply was in line with the current design issues limiting sending it back, and also the comment 'the easiest option.. ..send the DVI back into the laptop', which remain issues for the reasons mentioned. This is also why I say this currently isn't the solution for some; they'd still be better suited by hybrid graphics on a larger notebook while waiting for the technology to develop further. BTW, using Fudo as your source made me smile; I'll mention that next time I chat with him.

    Anywhoo, eventually there will be all things for all people; however, it's still low on the list of priorities, with the first being the standard implementation as a docking-station-style solution, which is a big step nonetheless, and will still be very slow in gaining traction.
     
  28. TommyB0y

    TommyB0y Notebook Deity

    Reputations:
    127
    Messages:
    1,501
    Likes Received:
    2
    Trophy Points:
    56
    It's probably a lot easier to have a splitter/switch on the laptop LCD than you think. I have a very small one for my desktop.
     
  29. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    But your desktop doesn't use a direct LVDS link, unlike most laptop LCDs.
    It's doable, but once again, you need to add the input interface, then rework the splitter into the TMDS->LCD path.

    I'll just leave it at that, and you can make the argument for why it's a benefit for a manufacturer to put it on every laptop in a specific line.
     
  30. vshade

    vshade Notebook Evangelist

    Reputations:
    35
    Messages:
    340
    Likes Received:
    0
    Trophy Points:
    30
    I agree that it's a bit overkill to use an HD3870 to power a 12" screen, but about the part on the 4 lanes...

    A final rendered image will only have the pixels that need to be sent, so for a 1920x1200 screen you will end up with about 2.3x10^6 pixels, which is 2.3 megabytes; since a laptop screen is only updated 60 times a second, you will need 138 megabytes/s, which is about 2 lanes.

    And as far as I remember, PCI Express is full duplex, so sending data back will not impact performance that much, since the bandwidth limitation occurs only in the other direction.

    Also, Nvidia already does this on the desktop in a version of Hybrid SLI called HybridPower (http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3305&p=7) to save energy: use the IGP to render the desktop and the dedicated graphics to play games, sending data back to the IGP framebuffer and then to the monitor. So it will not be that difficult for ATI to do the same.
     
  31. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    What's your bit depth on those pixels? It looks like you've only accounted for one channel of the minimum 3 required, and that's also assuming that they will dumb it down to 8 bits per channel instead of their current 10, and without any check-bits or other information.

    Yes, but you still need bi-directional transfers when communicating with the CPU, the impact of which will depend more on the application.

    ATi has that as well; the laptop version is called PowerXpress, which is an adaptation of Hybrid CrossfireX. However, this is still a different design: it does not bypass the IGP, and it requires copying the entire frame buffer (seems heavy, but it might be difficult to separate from the output buffer; it's their statement) to system memory and then having the IGP output it as a 2D render, not just a dump to the TMDS. That requires a minimum of 2 extra stops, the use of system memory and NB traffic, and it also increases the amount of information required to be sent over the external PCIe if you really are copying all of the frame buffer to system memory. Now it's not int8 per RGB channel, it's much larger, requiring at minimum support for all 4 channels and FP16/FP32 frame buffer overhead (unless they cripple the features available). :eek:

    I don't doubt that eventually it will come, but right now there are a lot of obstacles in the way. I just don't see it as something easy to overcome, or as a priority.

    However, rather than dash people's dreams, I'll just leave it at "it's possible", and leave others to explain how it'll be done in their minds, and why they feel they need it.
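    The output-buffer vs. full-frame-buffer size gap being described here can be made concrete with a toy calculation. This is just an illustrative sketch; the resolution and formats are my own picks, not figures from ATI:

```python
# Rough size comparison: the finished 24-bit output buffer versus a single
# FP16 RGBA render target of the same resolution (illustrative numbers only).

def buffer_mb(width, height, channels, bits_per_channel):
    """Size of one buffer in MiB."""
    return width * height * channels * bits_per_channel / 8 / 2**20

out_24  = buffer_mb(1280, 800, 3, 8)    # plain 24-bit RGB output buffer
rt_fp16 = buffer_mb(1280, 800, 4, 16)   # one FP16 RGBA render target

print(f"24-bit output buffer: {out_24:.1f} MiB")
print(f"FP16 RGBA target:     {rt_fp16:.1f} MiB ({rt_fp16/out_24:.1f}x larger)")
```

    And a real frame buffer may hold several such render targets, so copying everything rather than filtering for just the output segment multiplies the traffic further.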
     
  32. cdnalsi

    cdnalsi Food for the funky people

    Reputations:
    433
    Messages:
    1,605
    Likes Received:
    0
    Trophy Points:
    55
    So what are you guys saying?

    Is there such thing as a 9800GX2 (or equivalent speed) external video card for any Laptop? :D
     
  33. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    Sure, put an HD4870 on a LASSO external board, and voila, you have your wish; you just need to get ATi to let you have both of them first. :cool:

     
  34. TommyB0y

    TommyB0y Notebook Deity

    Reputations:
    127
    Messages:
    1,501
    Likes Received:
    2
    Trophy Points:
    56
    Well, if the manufacturer decided to add another interface to use an external video card, then I imagine they might expect someone to use an external video card, so they may come to the conclusion that another port is needed to use an external video card on the laptop. So the decision was already made, you see :)
     
  35. RPGman

    RPGman Notebook Guru

    Reputations:
    2
    Messages:
    61
    Likes Received:
    0
    Trophy Points:
    15
  36. Thaenatos

    Thaenatos Zero Cool

    Reputations:
    1,581
    Messages:
    5,346
    Likes Received:
    126
    Trophy Points:
    231
    Either way, if this allows you to use the internal monitor, I'll be on it like white on rice. Otherwise, if it only has external support, I'll wait until a) they support internal LCDs or b) I buy an actual gaming laptop.
     
  37. RPGman

    RPGman Notebook Guru

    Reputations:
    2
    Messages:
    61
    Likes Received:
    0
    Trophy Points:
    15
    Either way, it still looks to be a very interesting solution: adding a high-end graphics card to a budget laptop, and then adding more performance with CrossFire between the HD3200 and the Mobility HD3870, which is about equal to an 8800M GTS. You'll have yourself a gaming rig that can compete with mid-to-high-end gaming PCs in less than half the size that a gaming rig takes up. Also, not to mention that there aren't any bandwidth limitations... :) The Puma platform may truly be revolutionary; only time will tell... :)
     
  38. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    Still, you'll need to plug it into an external LCD :(
     
  39. dmacfour

    dmacfour Are you aware...

    Reputations:
    404
    Messages:
    1,155
    Likes Received:
    0
    Trophy Points:
    55
    Gaming on a budget screen wouldn't be fun anyways. LCDs are nice and cheap to buy nowadays too.
     
  40. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    I'm a poor dude... I'll try to save every penny I can. A budget screen doesn't really annoy me as long as it handles XGA.

    But really, if they fix this problem, Nvidia is going to be in such a deep hole that they'll have to produce something similar. But in the end, I think ATI will be the winner, since:
    ATI = CrossFire between different series of cards (HD3800 x HD3200)
    SLI = same-series cards only (8400M GS x 8400M GS????)
     
  41. vshade

    vshade Notebook Evangelist

    Reputations:
    35
    Messages:
    340
    Likes Received:
    0
    Trophy Points:
    30
    Sorry for not accounting for all 3 color channels; that multiplies my calculations by 3, or a little bit more if you account for 10 bits per channel.

    The four channels and floating-point buffers only exist in the rendering stages; the final image is a plain 24-bit color image. DVI allows for 48-bit, but screens are usually 18- or 24-bit only, so there is no need for these higher-bpp modes.
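    As a quick sanity check of the numbers in this exchange, here is a back-of-the-envelope sketch. The lane rates are my assumptions (nominal ~250 MB/s per lane each way for PCIe 1.x and ~500 MB/s for PCIe 2.0), and protocol overhead is ignored:

```python
# Raw bandwidth to ship finished frames back for display, at the two bit
# depths discussed above (24 bpp final image vs. 10 bits per channel).

def display_bandwidth_mb_s(width, height, bits_per_pixel, refresh_hz=60):
    """Bytes per second needed for finished frames, in MB/s."""
    bytes_per_frame = width * height * bits_per_pixel / 8
    return bytes_per_frame * refresh_hz / 1e6

for bpp in (24, 30):
    bw = display_bandwidth_mb_s(1920, 1200, bpp)
    print(f"{bpp} bpp: {bw:.0f} MB/s "
          f"= {bw/250:.1f} PCIe 1.x lanes, {bw/500:.1f} PCIe 2.0 lanes")
```

    So a 24-bit 1920x1200 image at 60 Hz needs roughly 415 MB/s: about 2 PCIe 1.x lanes, or about 1 lane at PCIe 2.0 rates, before any overhead.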
     
  42. StormEffect

    StormEffect Lazer. *pew pew*

    Reputations:
    613
    Messages:
    2,278
    Likes Received:
    0
    Trophy Points:
    55
    I think I came up with ATI's solution to TheGreatGrapeApe's argument. They already admit that the second generation of this technology will support using the external graphics to power the internal screen, and I think I know why. The second generation should be released right around AMD's "Fusion" processors, CPUs that combine CPU cores and a GPU core on a single chip. If that's the case, and the chip is running integrated graphics right off the CPU, then the external GPU doesn't need some half-assed passthrough to display on the notebook LCD; instead it should be able to use the integrated GPU as a conduit.

    The GPU is already communicating with the CPU over PCI-E; if the integrated GPU is on the exact same chip, then latency shouldn't be an issue (nor should inter-chip bandwidth be a problem). The framebuffer from the external GPU can be copied to the integrated GPU while it is collecting data from the CPU, and theoretically it could all be done in one pass.

    I'm not sure I've got it all straight, but what do you think GreatApe?
     
  43. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    Once again, you're thinking about the output portion, aka the output buffer I mention above. The frame buffers, since around VS3.0 (now mandatory), have been used to render to vertex (R2VB) and texture, and now support much higher bit depths.
    DX10 exploits this even further.

    That's why I said they'd better only copy the output buffer, because now the frame buffer is huge and has gone further than just being the traditional output buffer that dumps to the RAMDACs or TMDS.

    If all it's doing is copying the buffer without filtering for just the output-buffer segment, then that's going to be a huge amount of data to copy back to the IGP and system memory.
     
  44. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    I think that would help, but really, with the NB/IGP combo they could do it now with a few tweaks. Fusion definitely makes it more flexible, but their current design has the PCIe communicating through the NB first anyway, so the nice thing would be reducing one stop over the best-case scenario for the current design. Now, if Fusion is 'system on a chip' with the CPU/NB/IGP all in one, then you reduce 2 stops. In the best current scenario, they could already cut 2 out of 5+ stops by giving the current NB/IGP its own dedicated memory, so it doesn't need to share from the system memory pool.

    I have a feeling that they may make it more practical with a revision of the 780 chipset, and likely give people the option before Fusion hits, but if Fusion is a 'system on a chip' part, it should help even more.
     
  45. TommyB0y

    TommyB0y Notebook Deity

    Reputations:
    127
    Messages:
    1,501
    Likes Received:
    2
    Trophy Points:
    56
    They already give the IGP its own discrete memory.
     
  46. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
  47. TommyB0y

    TommyB0y Notebook Deity

    Reputations:
    127
    Messages:
    1,501
    Likes Received:
    2
    Trophy Points:
    56
    I read somewhere that it was an optional feature for OEMs, although it may be seen as the Northbridge optional frame-buffer memory.

    I can't find the article now, but I read that the IGP could use discrete memory, which isn't anything new; ATI did that with the 1100/1150.
     
  48. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    Yes, it's an option that is the quickest and easiest for them to implement, which is why I mention it, but right now it's not implemented; everything else requires more significant reworks, compared to the small amount of work needed to put a 32-64MB VRAM buffer on-board.

    Just another thing to improve; not currently available but, like I said, something they could easily improve on the current solution.
     
  49. TheGreatGrapeApe

    TheGreatGrapeApe Notebook Evangelist

    Reputations:
    322
    Messages:
    668
    Likes Received:
    0
    Trophy Points:
    30
    Interesting thing popped up on their site overnight...

    ...according to their recently released whitepaper:
    http://ati.amd.com/technology/xgp/xgp_technology_brief.pdf

    There's a 16-lane version on the books, which would help any bandwidth issues.

    And they mention "CROSS DISPLAY" for driving the laptop monitor, and mention it as a "will be able". Whether that means they will be able to now, or only eventually, isn't clear from the wording, but it's obviously part of the plan.
     
  50. TommyB0y

    TommyB0y Notebook Deity

    Reputations:
    127
    Messages:
    1,501
    Likes Received:
    2
    Trophy Points:
    56
    It's just not implemented on the Fujitsu, or maybe it is and they don't know better in the advertising. One of the power-saving features is the Northbridge frame buffer, so the GPU doesn't have to keep the CPU awake to refresh the screen contents, since the RAM is run through the CPU's internal memory controller.

    That way, when just reading a webpage or document, or viewing/showing PPT files, the CPU can turn off between slides or pages.

    The optional Northbridge frame buffer ranges from 32MB-256MB according to the article I found here; it sounds a lot like discrete graphics memory unless it's exclusive to the desktop 780 chipset.
    http://www.pcper.com/article.php?aid=527&type=expert
     