The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    DirectX 11.2 improves gaming performance & textures

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cakefish, Jun 29, 2013.

  1. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    Last edited by a moderator: May 12, 2015
  2. imglidinhere

    imglidinhere Notebook Deity

    Reputations:
    387
    Messages:
    1,077
    Likes Received:
    59
    Trophy Points:
    66
    WHY!?

    What's the point of using your system RAM as video RAM!? Seriously... the 2GB of VRAM on my 6970M is several orders of magnitude faster than the fastest system RAM on the market TODAY.

    I fail to see how this makes any sense.

    EDIT:

    Watched the video and my point is proven. They limited the GPU to only 16MB of VRAM in the demo and opted to use system RAM instead of whatever was left on the GPU in use? I mean... really, people, this isn't a good example.

    If they can pull something off that shows how this can benefit a desktop GPU, I'll be impressed.
     
  3. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I believe they meant that 16MB of GPU RAM combined with the swapped system RAM gives such an increase. They definitely exaggerated its usefulness, I'm sure; it was most likely a video or something rehearsed for display. Swapping in a new level of graphics detail has not been that quick for a long time; usually things need a restart, or they flash while resetting to the new level of detail, etc. I don't believe it's as instant as pressing a button and having the new detail appear like that.

    Either way, just like with DirectX 11.1, I hope they change their minds about not including it on Windows 7 and such, instead of pushing gamers with no other interest in Windows 8 into buying it just for this. I'm very sorry, but even with 8.1 I don't want such a store-based, mobile-OS-like operating system.

    Oh well, we can only wait and see how this would work out in real time.
     
  4. itsjustme84

    itsjustme84 Notebook Geek

    Reputations:
    140
    Messages:
    76
    Likes Received:
    0
    Trophy Points:
    15
    facepalm

    what does your version of dx11 do?
     
  5. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I don't understand your question. I was referring to a while ago, when I remember Microsoft stating that DirectX 11.1 would only be available for Windows 8, but they later changed it so that Windows 7 would get the update too. I mean that I hope they change their minds about limiting it to Windows 8.1 and the Xbox One. Unlike Windows XP, which is at its core unable to use DirectX 10 or 11 (as far as I remember), Windows Vista/7/8 should all be inherently capable of using DirectX 10 and 11, and maybe even 12, which probably hasn't even been conceived yet.

    If I'm wrong, just point me to the right stuff XD
     
  6. Captmario

    Captmario Notebook Consultant

    Reputations:
    50
    Messages:
    200
    Likes Received:
    9
    Trophy Points:
    31
    will current GPUs support DX 11.2?
     
  7. maverick1989

    maverick1989 Notebook Deity

    Reputations:
    332
    Messages:
    1,562
    Likes Received:
    22
    Trophy Points:
    56
    I am not sure why you think this won't be useful. He says that currently, memory management compresses the texture data into a size that can fit in VRAM. The VRAM is used not only to store textures but also the results of the GPU's processing if they need to be stored for later use. That requires the texture data to be subsampled to a size that can fit in VRAM. If you use system RAM, which can potentially be upgraded to 32 GB on most machines running the Professional version of W8, you could store these textures there and send them to the GPU as and when needed. There shouldn't be any bottlenecking, because DDR3 memory provides enough bandwidth to send the data fast enough; the image processing on the GPU takes up a lot more time. This isn't about speed of access. It is about being able to use more data to render better quality images.

    I also do not understand how his statement that they limited VRAM use to 16MB proves your point. They are basically saying that to render images, most of the memory is used up storing what needs to be used rather than what is being processed. If they store what needs to be used in system RAM, why is that a problem?

    The ones that support DX11 will. Earlier ones obviously won't if they don't support DX11. He states that at the end of the lecture.
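
    For anyone who wants to check their own machine once 8.1 lands, here is a rough sketch (untested, and assuming the Windows 8.1 SDK headers are installed) of how an app would ask the runtime whether the GPU and driver expose the tiled-resources feature this demo relies on:

    // Minimal sketch: query the D3D11.2 tiled-resources tier.
    // Error handling trimmed for brevity.
    #include <d3d11_2.h>
    #include <cstdio>
    #pragma comment(lib, "d3d11.lib")

    int main()
    {
        ID3D11Device* device = nullptr;
        ID3D11DeviceContext* context = nullptr;

        // Create a default hardware device; with a null feature-level array the
        // runtime picks the highest level available (up to 11.0/11.1).
        HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                       nullptr, 0, D3D11_SDK_VERSION,
                                       &device, nullptr, &context);
        if (FAILED(hr)) return 1;

        // D3D11_FEATURE_D3D11_OPTIONS1 reports the tiled-resources tier.
        // On runtimes older than 11.2 this query simply fails.
        D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts = {};
        hr = device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1, &opts, sizeof(opts));
        if (SUCCEEDED(hr) && opts.TiledResourcesTier != D3D11_TILED_RESOURCES_NOT_SUPPORTED)
            std::printf("Tiled resources supported, tier %d\n", (int)opts.TiledResourcesTier);
        else
            std::printf("Tiled resources not supported\n");

        context->Release();
        device->Release();
        return 0;
    }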
     
  8. baii

    baii Sone

    Reputations:
    1,420
    Messages:
    3,925
    Likes Received:
    201
    Trophy Points:
    131
    Did previous DirectX revisions actually improve performance? I mean more fps.
     
  9. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    305
    Trophy Points:
    101
    Yes. You could get the same visual quality with a higher framerate.
     
  10. Woot

    Woot Notebook Enthusiast

    Reputations:
    1
    Messages:
    27
    Likes Received:
    1
    Trophy Points:
    5
    I believe the reason for this update is to synchronize PC DirectX with the Xbox 180. The new Xbox has unified system memory, correct? If so, then this update could help ease ports between PC and Xbox 180.

    BTW, fully agree on the BS requirement for Windows 8.1. No reason other than marketing why it can't be given to W7.
     
  11. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
    it will be quite a while until developers actually implement DirectX 11.2. They are just starting to use DirectX 11, for heaven's sake.
     
  12. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Actually, they've been using DirectX 11 since about 2010 with BF:BC2. It's quite mainstream now; a lot of games come out either in DX11 or with separate DX9/DX11 versions. DirectX 11.1, however... I am not sure about. It says 11.0 is the only thing installed on my system, but I don't have a card capable of using DX11 (yet), so I don't know if it avoids installing 11.1 because of this. Some games refuse to launch at all for me (like Crysis 3), while others use the compatibility mode of DX11 forced onto DX10 (like BF3, MOH: WF, BioShock Infinite, Homefront, etc.), so maybe the ones that flat out won't launch are the ones using DX11.1? If that's the case, then 11.1 really is only barely beginning to be used, as you say. But not regular 11, especially not now that the new consoles will be out in a couple of months.
     
  13. itsjustme84

    itsjustme84 Notebook Geek

    Reputations:
    140
    Messages:
    76
    Likes Received:
    0
    Trophy Points:
    15
    This update is epic, we are talking photorealistic.

    Surprise NBR, guess who's back?
     
  14. Undyingghost

    Undyingghost Notebook Evangelist

    Reputations:
    78
    Messages:
    437
    Likes Received:
    33
    Trophy Points:
    41
    You mean they still use DX9. :D
     
  15. maverick1989

    maverick1989 Notebook Deity

    Reputations:
    332
    Messages:
    1,562
    Likes Received:
    22
    Trophy Points:
    56
    Windows 8 comes with DX11.1. It shows up as DX11 if you look for the installed version, but if you have W8 and didn't uninstall the DX it came with (can you even do that?) to specifically install DX11, you have DX11.1. Also, 11.1 is an update, so any GPU that supports WDDM 1.2 will support DX11.1.

    The games that refuse to launch require DX11 - they don't have a compatibility mode or a DX9 version. Crysis 3, like you found out, is one example. It has a hack that you can use - don't know how well it works as I have not tested it myself. However, it should have given you an error message stating what the problem was.

    11.1 is simply an update to 11, so any GPU that supports 11 will support 11.1. I don't think Microsoft ever released a DX update that did not work on GPUs that supported that particular version. Same story for 11.2: if your GPU supports 11, it will support 11.2.
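
    If you want to see what your own card actually reports, the usual check is to hand D3D11CreateDevice a list of feature levels and see which one comes back. A rough sketch (untested here, and assuming the Windows 8 SDK so that D3D_FEATURE_LEVEL_11_1 is defined):

    #include <d3d11.h>
    #include <cstdio>
    #pragma comment(lib, "d3d11.lib")

    int main()
    {
        const D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
            D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0,
            D3D_FEATURE_LEVEL_9_3
        };
        D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_3;

        // Passing null for the device/context pointers just probes support.
        // Note: on a machine with only the 11.0 runtime, including 11_1 in the
        // list makes this call fail with E_INVALIDARG, so real code retries
        // without it.
        HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                       requested,
                                       (UINT)(sizeof(requested) / sizeof(requested[0])),
                                       D3D11_SDK_VERSION, nullptr, &got, nullptr);
        if (SUCCEEDED(hr))
            std::printf("Highest supported feature level: 0x%x\n", (unsigned)got);
        else
            std::printf("Device creation failed\n");
        return 0;
    }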
     
  16. James D

    James D Notebook Prophet

    Reputations:
    2,314
    Messages:
    4,901
    Likes Received:
    1,132
    Trophy Points:
    231
    They are not THAT STUPID as to really force GPUs to use only 16MB of VRAM (and I have recently lowered the bar on their stupidity). But this DX11.2 is just a joke that will work on noobs so that they start buying Windows 8. We have 2GB, 3GB, 4GB GPUs that barely use half of their VRAM, so why would we need anything more allocated in RAM? RAM that is much slower and has less bandwidth.
     
  17. maverick1989

    maverick1989 Notebook Deity

    Reputations:
    332
    Messages:
    1,562
    Likes Received:
    22
    Trophy Points:
    56
    The current GPUs barely use their VRAM because textures are compressed. You cannot compress them to "just" fit a certain space. Compression is quantized.

    Secondly, this could be in preparation for higher resolution displays. You don't want to store 4K-resolution display textures in VRAM.
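
    To put rough numbers on it (my own back-of-the-envelope figures, not from the talk): a single 4096x4096 texture is already sizeable even before mipmaps, which is exactly why everything gets block-compressed.

    // Footprint of one 4096x4096 texture, uncompressed vs. block-compressed.
    #include <cstdio>

    int main()
    {
        const double MiB = 1024.0 * 1024.0;
        const long long texels = 4096LL * 4096LL;

        double rgba8 = texels * 4 / MiB;   // RGBA8: 4 bytes per texel, no mips -> 64 MiB
        double bc3   = texels * 1 / MiB;   // BC3/DXT5: 16 bytes per 4x4 block = 1 byte/texel -> 16 MiB
        // A full mip chain adds roughly one third on top of either figure.
        std::printf("RGBA8: %.0f MiB, BC3: %.0f MiB (about x1.33 with mips)\n", rgba8, bc3);
        return 0;
    }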
     
  18. James D

    James D Notebook Prophet

    Reputations:
    2,314
    Messages:
    4,901
    Likes Received:
    1,132
    Trophy Points:
    231
    So we are talking here about what is faster: decompressing textures from fast VRAM, or sending already decompressed data from slower RAM? Well, at least the first way depends on processing power, while the second is already bottlenecked by RAM bandwidth and can't be sped up. DDR4 will come with next-generation processors, and we still haven't seen the second wave of Haswells.
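
    For rough context, the theoretical peak numbers work out something like this (generic example figures of mine, not measurements of any specific laptop):

    // Peak bandwidth = (bus width in bytes) x (effective transfer rate).
    #include <cstdio>

    int main()
    {
        // Dual-channel DDR3-1600: 2 x 64-bit bus at 1600 MT/s.
        double ddr3 = (2 * 64 / 8) * 1600e6 / 1e9;   // ~25.6 GB/s
        // A typical GDDR5 card: 256-bit bus at 4000 MT/s effective.
        double gddr5 = (256 / 8) * 4000e6 / 1e9;     // ~128 GB/s
        std::printf("DDR3-1600 dual channel: %.1f GB/s\nGDDR5 256-bit @ 4 GT/s: %.1f GB/s\n",
                    ddr3, gddr5);
        return 0;
    }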
     
  19. maverick1989

    maverick1989 Notebook Deity

    Reputations:
    332
    Messages:
    1,562
    Likes Received:
    22
    Trophy Points:
    56
    How can you decompress textures from VRAM? If you compress them because VRAM isn't enough, where would you store them after they are decompressed?
     
  20. James D

    James D Notebook Prophet

    Reputations:
    2,314
    Messages:
    4,901
    Likes Received:
    1,132
    Trophy Points:
    231
    I have no idea. You said that VRAM is barely utilized because textures are compressed. If they are compressed, then they must be decompressed at some point. Maybe not to hold them somewhere else, but for immediate processing or for something else. But I believe that whatever is happening, it is not worse (at least for the next year or so) than the RAM way.
    A year from now I believe we are going to see Windows Purple or Duper-Orange, but anyway, claiming this as an advantage of Windows Blue is speculative.
     
  21. EpicBlob

    EpicBlob Notebook Evangelist

    Reputations:
    49
    Messages:
    410
    Likes Received:
    16
    Trophy Points:
    31
    Wait, so currently I have Windows 7 with just DX 11. If I were to upgrade to Windows 8 and thus get DX 11.1, would I see an improvement in performance in Battlefield 3? Or are the games that use DX11.1 a small crowd?
     
  22. James D

    James D Notebook Prophet

    Reputations:
    2,314
    Messages:
    4,901
    Likes Received:
    1,132
    Trophy Points:
    231
    1% maybe. I recall people talking about a tiny gaming performance difference on Windows 8. It only shows up in benchmarks because it is hardly noticeable.
     
  23. maverick1989

    maverick1989 Notebook Deity

    Reputations:
    332
    Messages:
    1,562
    Likes Received:
    22
    Trophy Points:
    56
    The textures are compressed so that less detail needs to be processed. If you watch the video, you will understand what I am talking about. I don't think MS would spend so much money developing a new addition to DX just so "noobs" would purchase Windows 8. DX11 claimed huge benefits over DX9, and we didn't see people flocking to W7 because of it. Also, it is not just a claim; the rep showed you the difference in rendering. It is not only gamers who will benefit. CAD software will benefit from this much, much more than gamers will. What IS speculative is how much it will benefit games, something that can only be said definitively once they actually release it.
     
  24. James D

    James D Notebook Prophet

    Reputations:
    2,314
    Messages:
    4,901
    Likes Received:
    1,132
    Trophy Points:
    231
    I am talking about DX11.2 vs DX11.1, or even just 11. Most likely you are correct about professional use, as I can easily imagine jobs needing multiple gigs. But speaking about most people and gaming... MS has already spent much more on Windows 8, and even more after its release (free copies, sales, advertising, etc.), and I bet their main goal with DX11.2 is to sell more copies of Windows 8 to everyone, not to a small number of professionals. It is my point of view. You may have another one.
     
  25. felix3650

    felix3650 Notebook Evangelist

    Reputations:
    832
    Messages:
    631
    Likes Received:
    224
    Trophy Points:
    56
    DX 11.2 is exclusive to 8.1 AND the Xbox One. What a coincidence, eh? :p
    Anyway, DX11 is used by recent games, yes, but not its entire set of features. Why so? Because of tessellation. That is what makes the most significant visual difference between DX9, DX10 and DX11, and it's very taxing on the GPU. Crysis 3 makes heavy use of it for environment mapping, shadows and reflections. Most mainstream GPUs can't handle tessellation, let alone 4K textures.
    Regarding textures, they are compressed on the HDD and get decompressed once uploaded to VRAM. That's why high memory bandwidth helps with higher resolutions. DX 11.2 could help in-game performance by storing already processed textures in RAM (like those of distant objects, which don't get updated often) and putting more frequently accessed textures into the much faster VRAM (local objects, lights, reflective surfaces, etc.). That's my guess, I'm not sure how things will work out though.
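
    Something like the sketch below is what I imagine, assuming the mechanism is the tiled resources API from the 8.1 preview SDK (untested; names and sizes are purely illustrative): the texture gets no dedicated memory up front, it borrows 64 KB tiles from a small tile pool, and the app only maps and streams in the tiles it actually needs while everything else stays in system RAM or on disk.

    #include <d3d11_2.h>
    #pragma comment(lib, "d3d11.lib")

    // Assumes an already-created ID3D11Device2 / ID3D11DeviceContext2 pair on
    // hardware that reports a tiled-resources tier.
    HRESULT MapOneTile(ID3D11Device2* device, ID3D11DeviceContext2* context,
                       ID3D11Texture2D** outTexture, ID3D11Buffer** outPool)
    {
        // 1) A small tile pool: the only real GPU memory the big texture uses,
        //    carved into 64 KB tiles.
        D3D11_BUFFER_DESC poolDesc = {};
        poolDesc.ByteWidth = 16 * 64 * 1024;              // room for 16 tiles
        poolDesc.Usage = D3D11_USAGE_DEFAULT;
        poolDesc.MiscFlags = D3D11_RESOURCE_MISC_TILE_POOL;
        HRESULT hr = device->CreateBuffer(&poolDesc, nullptr, outPool);
        if (FAILED(hr)) return hr;

        // 2) A tiled texture: no backing memory of its own until tiles are mapped.
        D3D11_TEXTURE2D_DESC texDesc = {};
        texDesc.Width = 4096;
        texDesc.Height = 4096;
        texDesc.MipLevels = 1;
        texDesc.ArraySize = 1;
        texDesc.Format = DXGI_FORMAT_BC3_UNORM;
        texDesc.SampleDesc.Count = 1;
        texDesc.Usage = D3D11_USAGE_DEFAULT;
        texDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
        texDesc.MiscFlags = D3D11_RESOURCE_MISC_TILED;
        hr = device->CreateTexture2D(&texDesc, nullptr, outTexture);
        if (FAILED(hr)) return hr;

        // 3) Point one tile of the texture at the first 64 KB tile of the pool.
        //    Afterwards the app would copy texel data into it (e.g. with
        //    ID3D11DeviceContext2::UpdateTiles) as the camera gets close.
        D3D11_TILED_RESOURCE_COORDINATE coord = {};   // tile (0,0), subresource 0
        D3D11_TILE_REGION_SIZE region = {};
        region.NumTiles = 1;

        UINT rangeFlags = 0;        // a normal mapping (not NULL/SKIP/REUSE)
        UINT poolStartOffset = 0;   // first tile in the pool
        UINT rangeTileCount = 1;

        return context->UpdateTileMappings(*outTexture, 1, &coord, &region,
                                           *outPool, 1, &rangeFlags,
                                           &poolStartOffset, &rangeTileCount, 0);
    }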

    :D
     
  26. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    lol, funny: over the hundreds of different DirectX updates through the years, I don't think I have ever noticed any difference.
     
  27. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I definitely notice differences between 9 and 10, but not much between 10 and 11. 11's tessellation is a huge improvement, but the drain on cards is disproportionately large versus the visual candy gained. Also, good engines/designers have been able to make textures look very bumpy rather than flat without using DX11 or even 10, but I suppose it's quite difficult (or at least time-consuming) to do, as most games don't employ it lately.

    As for DirectX 10, people simply don't bother coding in it most of the time. I don't know why it was skipped over; maybe it was just ahead of its time. But for example, when I played the Homefront demo, it was locked to DX11 and high settings that wouldn't change (rendering AMD cards unable to use it, as PhysX was forced ON), and even though all I ever got to run it on was my 280M, it looked better than Battlefield BC2 maxed out in DX10; later I found out that it was forced DX11. I've never heard Homefront praised for groundbreaking graphics though, so either Frostbite 1.5 wasn't up to snuff, or DX11 really can do a lot if coded well.
     
  28. felix3650

    felix3650 Notebook Evangelist

    Reputations:
    832
    Messages:
    631
    Likes Received:
    224
    Trophy Points:
    56
    Real-world quality depends on the number of polygons in a 3D scene. The jump from DX9 to DX11 is huge if properly done.
    The problem is that we have games optimised mainly for consoles right now. Consoles render primarily on the CPU, and only features like light sources, shadows and anti-aliasing get done on the integrated GPU. PC ports get adapted to the resolution and (unfortunately) only some of the GPU features (depending on the console). If games were really coded in DX11 from scratch and then toned down for consoles, as D2 Ultima said, we could really reach that graphical beauty found in tech demos and benchmarks. I only hope that with the newer console generation running native x86 code, programming starts on the PC as the base and then gets toned down to console requirements. Just like streaming videos on YouTube: high-bandwidth connections get 720p or 1080p by default, while slower connections are served 360p or even 240p. Crysis 3 is an example of what a game engine coded primarily for the PC can do (The Witcher 2 and Far Cry 3 too) :D
     
  29. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Yeah well let's hope we get some good PC optimization going on soon. I can't wait =D.