The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
 Next page →

    NVIDIA GTX 680M Thread - M18x R2 and M18x R1 Upgrade Discussion

    Discussion in 'Alienware 18 and M18x' started by Bytales, Jul 4, 2012.

  1. Bytales

    Bytales Notebook Evangelist

    Reputations:
    56
    Messages:
    690
    Likes Received:
    4
    Trophy Points:
    31
    Hello there people. I just noticed dell.de has the 680M SLI as an option in the M18x. The price is double compared to AMD 7970 CF.
    Let's just not talk about that.

    First question.
    1) On dell.de it says 680M SLI 2GB GDDR5.
    Is it 2GB or 4GB? I heard the 680M should have 4GB per chip. If it's double the price, at least we should be getting double the memory.
    2) Does anyone own a 680M SLI rig so that I can see some benchmarks?
     
  2. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,835
    Likes Received:
    583
    Trophy Points:
    131
    the 4GB 680M will be added later on

    as far as I know everyone's 680m SLI rig is in production at the latest
     
  3. Arestavo

    Arestavo Notebook Evangelist

    Reputations:
    188
    Messages:
    559
    Likes Received:
    215
    Trophy Points:
    56
    Dell reps have come out and told us that they will ONLY carry the 2GB version of the 680M. Clevo has a 4GB version that MIGHT be compatible, should you buy it and install it yourself.

    Orders for the 680M SLI probably won't be in for several days more.
     
  4. Bytales

    Bytales Notebook Evangelist

    Reputations:
    56
    Messages:
    690
    Likes Received:
    4
    Trophy Points:
    31
    Well, that's a bummer.

    Perhaps one can find a 680M SLI 4GB kit that will fit the X7200. Then there would be no reason to upgrade to the M18x.
     
  5. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    Not really. The cards are identical apart from the amount of VRAM, and 2GB is more than enough for running games at 1080p - it's enough even for multi-screen gaming.

    Basically, there will be zero performance difference between the 2GB and 4GB cards unless you're running games with very high resolution textures across 3 screens - in which case, the cards won't perform quickly enough to get playable framerates even if they have 4GB of VRAM.
     
  6. oni222

    oni222 Notebook Deity

    Reputations:
    310
    Messages:
    733
    Likes Received:
    5
    Trophy Points:
    31
    That really depends on your setup, but if you're only doing 1080p then the 2GB model will be fine.

    As for me, I refuse to pay for the 2GB model if I can get the 4GB. The price is almost identical, so I want the top-end model. Otherwise, if we're talking best bang for your buck, then none of these cards hit that mark.
     
  7. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    What do you need 4GB for?
     
  8. DarkSkies

    DarkSkies Notebook Evangelist

    Reputations:
    282
    Messages:
    316
    Likes Received:
    0
    Trophy Points:
    30
    That's right, Dell confirmed 2GB is the only option that will be available. It's also right that more VRAM matters in the case of multi-display setups or very, very high single-display resolutions, and since the M18x is 1080p, the only real difference between the 2GB and 4GB versions would be the higher price of the latter.

    From what I've observed, the only game that actually approached 2GB of VRAM usage on my 580M-equipped R1 was Crysis 2 with the hi-res patch.
     
  9. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  10. oni222

    oni222 Notebook Deity

    Reputations:
    310
    Messages:
    733
    Likes Received:
    5
    Trophy Points:
    31
    RJTECH sells the 4GB cards for 800 dollars.

    Laptop Video Cards :: nVidia GTX 680M 4GB Mobile Video Card - R&J Technology, Clevo Barebone Notebook kits, Laptop and desktop system builder

    Alienware on their website has the SLI (2GB version) for an "upgrade price" of $1100, and the single upgrade price is $550. That means each card costs over $550, since the upgrade price already includes the 660M's price.

    So the price increase is slight, and let's face it: if you're paying top dollar for a 680M, why not get the best model out there?

    Otherwise, if you want the best bang for your buck, the 7970 is the card you want.


    As for your argument that 4GB isn't needed - sorry, but my 6990 CrossFire could not cut it with my three 30" monitors (2560 x 1600 per screen).
    I used Hydra on my 6990 CF setup to create one big screen from these monitors, but the graphics cards just could not cut it.

    So for me, why should I pay top dollar for a 2GB card when I can get the 4GB version instead?


    EDIT: Here is a link to my screens http://accessories.us.dell.com/sna/...=dellSearch&baynote_bnrank=0&baynote_irrank=2
     
  11. vince207gti

    vince207gti Notebook Guru

    Reputations:
    13
    Messages:
    51
    Likes Received:
    5
    Trophy Points:
    16
    Has anyone ordered an M18x with SLI? How long did it take to get your rig from "preparation" to "production"?
    Thanks :)
     
  12. Ironleaf

    Ironleaf Notebook Guru

    Reputations:
    58
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    The average time to go from production to shipping right now is misleading. Figure two weeks at the most right now. Once their stock of 680m improves that number should drop quite a bit.
     
  13. bumbo2

    bumbo2 Notebook Deity

    Reputations:
    324
    Messages:
    1,612
    Likes Received:
    104
    Trophy Points:
    81
    Do you have the 680m in sli? What is your 3dmarks11 score and how is it doing in games?
     
  14. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    You're running three monitors with a higher resolution than what's considered average. Of course 2GB of VRAM will choke that; the standard multi-screen setup typically uses 1920x1080 screens.

    But you've got a bigger problem: 6990 crossfire is not fast enough to run games at that sort of resolution, regardless of the amount of VRAM.

    The 680Ms will be the same. Even with 680M SLI, you're not going to be able to run games at 7680x1600 and high detail, even if you have 4GB of VRAM. The GPUs just aren't powerful enough for that.

    My point is that the GPU power and shader count will bottleneck the system before the amount of VRAM will. So what's the point of having more VRAM?
     
  15. ironminded

    ironminded Notebook Enthusiast

    Reputations:
    0
    Messages:
    31
    Likes Received:
    0
    Trophy Points:
    15
    Just to back EviLCorsaiR on this one: running games at something in the neighborhood of 7680x1600 is going to death-grip the hell out of almost any set of graphics cards, laptop or desktop, currently out there. If you were to set up a 4xSLI or 4xCF desktop rig with some of the high-end [670, 680, 7990] cards out there right now you might get away with it, but even then it could be dicey.

    Remember: even jumping from a 1920x1080 to a 2560x1600 monitor is DOUBLING your pixel count. So when you jump from a single 1920x1080 screen to a triple-screen setup, and then to a triple-screen setup with 2560x1600 monitors, you're dumping a huge amount of extra load on the graphics cards in a system.

    While laptop graphics have come a long way and boast some pretty nifty capabilities with this generation, they just don't have the compute power [regardless of the amount of VRAM you throw at them] to drive resolutions that high.
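    The pixel-count arithmetic above checks out; here is a quick sketch using the resolutions mentioned in the thread:

    ```python
    # Pixel counts for the resolutions discussed in the thread.
    res_1080p = 1920 * 1080        # 2,073,600 pixels (single standard screen)
    res_1600p = 2560 * 1600        # 4,096,000 pixels (single 30" screen)
    triple_1600p = 3 * res_1600p   # 12,288,000 pixels (three 30" screens)

    # A 2560x1600 screen is roughly double a 1080p screen...
    print(res_1600p / res_1080p)     # ~1.98x
    # ...and the triple 2560x1600 setup is almost six 1080p screens' worth.
    print(triple_1600p / res_1080p)  # ~5.93x
    ```

    That near-6x load factor is why the posters above expect even high-end SLI/CF setups to struggle at 7680x1600.
    
    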
     
  16. Bothi_G

    Bothi_G Newbie

    Reputations:
    0
    Messages:
    6
    Likes Received:
    0
    Trophy Points:
    5
    I ordered mine on Monday, July 2nd. It's already in production, but the delivery date isn't until the 23rd. Sucks because I am impatient as all get out. I am a US customer, and I ordered 680m sli rig.
     
  17. vince207gti

    vince207gti Notebook Guru

    Reputations:
    13
    Messages:
    51
    Likes Received:
    5
    Trophy Points:
    16
    @Bothi, hope mine will be in production soon; ordered it June 27th, still in preparation, delivery date is July 24th ... (French customer)

    I assume the building and testing time won't last more than two weeks, so you'll have it soon :D
     
  18. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    You guys must be so excited. A week seems like a lifetime when you are waiting for something this awesome. Congrats to both of you.
     
  19. MonnieRock

    MonnieRock Notebook Consultant

    Reputations:
    216
    Messages:
    129
    Likes Received:
    0
    Trophy Points:
    30
    I need 4GB for Digital Combat Simulator A-10C. Using high viewing distances and high textures for terrain/vehicles/objects uses over 2.6GB, and that is not maxing everything out :cool:

    So why would I take a step backwards on settings, and limit myself on current and future advances offered in the application, when purchasing a new system?

    Thank you,
    Monnie
     
  20. DarkSkies

    DarkSkies Notebook Evangelist

    Reputations:
    282
    Messages:
    316
    Likes Received:
    0
    Trophy Points:
    30
    Regarding the amount of VRAM vs. multi-display setups: let's start from the fact that 3 monitors cannot be connected to a single M18x for a proper NVIDIA Surround setup. You cannot mix digital with analogue output signals to go Surround, and there is only one mini-DP and one HDMI out on the M18x.

    In other words, the 680M won't run Surround regardless of whether it has 2GB of VRAM or 10. :eek:

    BTW, the card will be obsolete next year anyway, so what's the commotion about VRAM all of a sudden? Will this year's games run fine maxed out on the 2GB version? Yes they will. There, done.
     
  21. Bytales

    Bytales Notebook Evangelist

    Reputations:
    56
    Messages:
    690
    Likes Received:
    4
    Trophy Points:
    31
    I was just asking because I saw the 680M comes with 4GB, then I saw the 2GB model on Dell.

    I installed EVGA Precision X, and with SLI enabled in Diablo 3, all maxed out and adaptive vsync on, I get a stable 60 FPS and approx 430MB of each card's memory is occupied.

    I guess 2GB should be enough for most of the games out there, if not for all.

    I was just thinking: since one pays a load of money, one might as well receive the 4GB model.

    Also wanted to ask: the temperatures of my cards are 78-79 for the main and approx 70 for the second when I play Diablo 3, and usage is about 50-60%. I know it's hot in my room, but aren't these temps a bit too hot?

    Also, if one gets 680M SLI with 4GB each, totalling 8GB of video memory, one could keep up with the golden rule that always stands: 4 times more RAM than video memory - meaning the 32GB of RAM in the M18x R2.

    Right now I have 460M SLI with 1.5GB each (3GB video RAM) and 12GB of RAM in the X7200. Coincidence?

    I guess the 4GB model would be useful with a quad-full-HD, Retina-like display in a laptop. But until then, even quad HD would be an improvement. It seems we can't get away from 1080p resolutions. It seems we are stuck with them.
     
  22. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,835
    Likes Received:
    583
    Trophy Points:
    131
    That golden rule doesn't make any sense for games, maybe CAD or 3D work.
     
  23. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    What.

    Firstly, it doesn't 'total' 8GB of video memory in actual usage. Because of the way SLI (and Crossfire on AMD cards) works, both cards hold the same data in their VRAM at any one time, so the total usable VRAM is that of a single card - 4GB in this case - no matter how many GPUs you have linked up.

    Secondly, suggesting you need that much RAM is ridiculous. 8GB, if anything, is still overkill for today's games. Very few games will utilise even half of that; most graphically intensive games I play use about 1.5GB-2GB of RAM, possibly up to 2.5GB.

    For the vast majority of people, the only reason you'd ever want more than 8GB of RAM today (on non-server machines) is if you wanted to run a RAMdisk, or if you're running CAD applications or virtual machines. This applies no matter what GPU you're using.
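    The SLI point above can be boiled down to a trivial sketch (a simplification that ignores driver overhead): frame data is mirrored across the cards, so usable VRAM does not scale with GPU count.

    ```python
    def usable_vram_gb(per_card_gb: float, num_gpus: int) -> float:
        """In SLI/CrossFire, each GPU holds a mirrored copy of the frame
        data, so usable VRAM equals a single card's pool regardless of
        how many GPUs are linked up."""
        assert num_gpus >= 1
        return per_card_gb

    # Two 4GB 680Ms: 4.0 GB usable, not 8.0.
    print(usable_vram_gb(4.0, 2))
    ```
    
    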
     
  24. highfly

    highfly Notebook Consultant

    Reputations:
    2
    Messages:
    284
    Likes Received:
    1
    Trophy Points:
    31
    Apparently the 4GB vs 2GB of VRAM on the desktop 680 doesn't make any difference even with high-res monitors, as it's choked by the 256-bit bus.
     
  25. oni222

    oni222 Notebook Deity

    Reputations:
    310
    Messages:
    733
    Likes Received:
    5
    Trophy Points:
    31
    I am with you on this one. Why pay so much money and not get the best version of the card?
     
  26. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    Because outside of a few specific cases (as in the one you quoted) which only apply to very few people, there's absolutely no benefit in having the 'best version'.

    It's like on the old M14x R1, the 555M came in 1.5GB and 3GB versions. The 3GB was only marginally more expensive...but what's the point of paying that little bit extra when there's no performance benefit?
     
  27. oni222

    oni222 Notebook Deity

    Reputations:
    310
    Messages:
    733
    Likes Received:
    5
    Trophy Points:
    31
    So you're OK with paying 2x the price of a 7970, but 2.1x the price for the higher-end version of the same card is out of the question?

    Bottom line: we don't "need" the 680M or the 7970 to run most games at 1080p, but we choose to upgrade! So since I am choosing to upgrade, I am also choosing to upgrade to the best version out there. If I wanted the best bang for my buck, it would be the 7970, but since I want to go NVIDIA this time round and pay the inflated price, I might as well get their top-tier version of the card.
     
  28. killaz05

    killaz05 Notebook Evangelist

    Reputations:
    12
    Messages:
    373
    Likes Received:
    2
    Trophy Points:
    31
    I think you are misusing the term "more bang for your buck." Just because the 7970 has more vRAM does not mean it is more powerful than the 680M; in fact, the 680M is the more powerful card even with less vRAM, so it is worth the money over the 7970. I also understand what you mean about getting the best card that NVIDIA offers. I too want the 4GB card, because why not? I am already going to pay a ton of money, why not go all the way, and I am sure I will find a use for the extra vRAM at some point. It is also a bragging thing to an extent. I can say "I have dual 4GB GTX 680M SLI, what do you have?" Haha, that sounds nice. Are they going to perform differently from the 2GB cards? No, so I would recommend, if you are getting a new system, to stick with the 2GB from Dell when having one built. For someone like me with the R1, either is a viable option really.
     
  29. Arestavo

    Arestavo Notebook Evangelist

    Reputations:
    188
    Messages:
    559
    Likes Received:
    215
    Trophy Points:
    56
    In all actuality, the Dell 680M and 7970M have the exact same amount of GDDR5. Also, with the benchmarks posted so far, they both appear to be rather close. Or have I missed a major site's benchmark review?
     
  30. killaz05

    killaz05 Notebook Evangelist

    Reputations:
    12
    Messages:
    373
    Likes Received:
    2
    Trophy Points:
    31
    I thought I read somewhere that the 7970s were 3GB cards. Oh well, but I would rather have the Nvidia due to driver support and PhysX.
     
  31. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,835
    Likes Received:
    583
    Trophy Points:
    131
    The desktop cards are 3GB!
     
  32. oni222

    oni222 Notebook Deity

    Reputations:
    310
    Messages:
    733
    Likes Received:
    5
    Trophy Points:
    31
    I agree with everything you're saying except the misuse of "bang for your buck." What I meant by it is that the 7970 is almost as good as the 680M but at half the price. So isn't that the "best bang for your buck"?

    Rep+ for you!
     
  33. ironminded

    ironminded Notebook Enthusiast

    Reputations:
    0
    Messages:
    31
    Likes Received:
    0
    Trophy Points:
    15
    Something to keep in mind is that any person's use of the term "best" presupposes a specific context or frame of reference. The context one person uses may not perfectly equate to the context of another.

    So while a 680M with 4GB of GDDR5 may be the "best" for some because of the specific context they are operating within [specific software/use requirements], it may not be the "best" for others. While the performance hit may be minuscule, going from 2GB to 4GB of memory will require an ever-so-slight power increase and will create more heat when overclocked [more memory at a given overclock creates more heat]. While this may not affect the person who requires 4GB for their specific application, it may be a consideration for others.

    All I'm trying to point out is that in order to determine which card, or variant of a given card, is the "best," we should be clear about the context we are operating in. Additionally, if someone else is operating in a different context, we should be respectful and understanding of the fact that their context may dictate another card/variant as the "best" for that application.

    We see this in things like the variation between cards in given games. For some games an ATI/AMD card may perform optimally, while in others an Nvidia card may give a higher frame rate. We can't say that either card is un-qualifiedly [is that a word?] the best. We can only say that in the context of a specific game, one card gives better performance than the other.
     
  34. Arestavo

    Arestavo Notebook Evangelist

    Reputations:
    188
    Messages:
    559
    Likes Received:
    215
    Trophy Points:
    56
    Unequivocally is the word you are looking for, and it is very much a matter of perspective.
     
  35. Bytales

    Bytales Notebook Evangelist

    Reputations:
    56
    Messages:
    690
    Likes Received:
    4
    Trophy Points:
    31
    Let me ask another interesting question.

    I play Diablo III, 1080p, all maxed out, with adaptive vsync ON. I get a constant 60 FPS.

    What I learned is that if, without vsync, the game would reach say 150 FPS and you limit the game to 60 FPS, then the GPU will not pull so much current, because it won't need to reach those higher frame rates. That makes power consumption lower and perhaps temperatures lower.

    Anyway, with vsync ON I get both GPUs at about 50-55% load.

    The question now is: since the 680M is much more powerful than the 460M, it would probably reach far higher FPS unimpeded by vsync.
    If vsync is ON, will the load then be lower than in the case of the 460M?

    Will that result in even lower temperatures and even lower power consumption?

    If so, that alone is a reason to upgrade.
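    As a rough model of the question above: with vsync capping output at 60 FPS, a GPU that could otherwise render much faster spends part of each frame idle, so approximate load falls as uncapped headroom grows. The FPS figures below are hypothetical, and this ignores power states and boost clocks:

    ```python
    def approx_gpu_load(uncapped_fps: float, vsync_cap: float = 60.0) -> float:
        """Rough fraction of GPU time spent rendering when the frame rate
        is capped: cap / uncapped, clamped at 100%."""
        return min(1.0, vsync_cap / uncapped_fps)

    # A setup managing ~110 FPS uncapped sits near ~55% load at a 60 FPS cap,
    # close to the 50-55% observed on the 460M SLI above.
    print(round(approx_gpu_load(110.0) * 100))  # 55
    # A faster card hitting ~200 FPS uncapped would drop to ~30% load.
    print(round(approx_gpu_load(200.0) * 100))  # 30
    ```

    So yes, under this simple model a faster GPU held to the same 60 FPS cap runs at lower load, which is consistent with the replies that follow.
    
    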
     
  36. Arestavo

    Arestavo Notebook Evangelist

    Reputations:
    188
    Messages:
    559
    Likes Received:
    215
    Trophy Points:
    56
    VSYNC is a frames-per-second limit. 60 is the max, unless you have a 120Hz monitor - in which case it will be 120 FPS.

    Without VSYNC, the 680M will get more FPS than the 460M at the exact same game settings - if the settings are higher when using the 680M, that may not be the case.

    The load would be lower, and temps may be lower depending on the fan profile.
     
  37. DarkSiren

    DarkSiren Notebook Guru

    Reputations:
    51
    Messages:
    66
    Likes Received:
    0
    Trophy Points:
    15
    I'm glad someone pointed this out.
     
  38. sirana

    sirana Notebook Deity

    Reputations:
    267
    Messages:
    748
    Likes Received:
    329
    Trophy Points:
    76
    The 680M, slightly overclocked, is comparable in power to the desktop 580, so the load will be significantly lower than with the 460M :)
    BTW, we still need someone to volunteer for installing the 680M in the X7200 :D
     
  39. daveh98

    daveh98 P4P King

    Reputations:
    1,075
    Messages:
    1,500
    Likes Received:
    145
    Trophy Points:
    81
    Yes, I think there is a huge difference being discussed for "best bang for your buck." It's great to do that, but it is also good to buy only what you need for what you want. If there is no benefit from increased vRAM due to it being choked by the 256-bit bus, then why spend money (even if it's marginal in the total checkout price) when there is ZERO benefit to be had? That, to me, is just buying into marketing BS. Buy what gives you the performance and features you desire. However, if having more vRAM (even if it's useless) makes one "feel" better, then I guess that is money well spent, and why there is no "right" or "wrong" answer (given my logic). However, there is a "smart" way to spend money to achieve what is needed for one's intended uses and outcomes. Peace.
     
  40. momosan

    momosan Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    Hello,

    I tried searching for this but came up empty-handed. Anyway, I have an M18x R1 with 6990 CrossFireX. I would like to retrofit the system with a single 680M. Has anyone successfully done this, and does anyone have the list of parts that are required?

    Also, do you have to modify the .inf file for driver install even if you order a 680M from Dell?

    Thanks in advance!
     
  41. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    Yes, we are already starting to see this upgrade. It is compatible with the R1 and works very well. You may need to mod the driver for now, as the hardware device ID changes when installed in the R1 (much like it was initially with the 7970M video cards). It will also show up in Futuremark bench results as a Generic VGA until they add the device ID variation to their database. They may have already done so... haven't checked.

    5150Joker has a single 680M in his R1 and it's blistering fast (3DMark11 score of 7563 with only one GPU, which is similar to, if not slightly higher than, a single 7970M).

    The parts are not available from Dell at this time. You will need to buy the GPU from Clevo for now, and you can expect to pay nearly $800 for one GPU from them, but it is a more robust GPU with twice the amount of vRAM. The heat sinks for the 580M will work with the 680M.
     
  42. momosan

    momosan Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    Thanks for your reply, Mr. Fox.

    So basically, I would need to get the 680M card from Clevo, a 580/680 heat sink from Dell, and some thermal paste to do this retrofit?

    Do you suggest that I wait to order the part from Dell, or just order it from Clevo? Do you think there will be any differences between the cards?
     
  43. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,835
    Likes Received:
    583
    Trophy Points:
    131
    The Clevo 680M is 4GB, while the Dell 680M is 2GB.
     
  44. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931
    If you are only planning to install one 680M, yes... a heat sink for the left side, and support plate (aka back plate or heat plate). The 580M heat sink comes with pads and thermal interface material applied already if purchased new from Dell, so buying paste would be optional. If you plan to install 680M SLI you will need an SLI bridge cable as well as a right heat sink and second support plate. As I mentioned, the Clevo cards appear to be more robust with twice the vRAM. There are some differences in the vBIOS, they are not compatible between the GPUs, and the information I have seen reported suggests the Clevo cards offer slightly more aggressive performance. I don't have a recommendation either way. There would be pluses to ordering them from Dell, but they are not available for purchase at this time.
     
  45. supermi

    supermi Notebook Guru

    Reputations:
    0
    Messages:
    74
    Likes Received:
    2
    Trophy Points:
    16
    Got my 680M installed in my M18x today and it worked like a charm. HOWEVER:

    The switchable graphics feature, which lets me switch between the discrete GPU and the integrated GPU at the press of a button to save power on battery, did not work: neither Windows nor the BIOS would detect an integrated GPU (the Intel HD 3000 on-die with the CPU) AT ALL, so there was NO way to switch... I had to put in an SSD anyway, so I reinstalled Windows, and now not only does that still not work, but upon reinstalling Windows, drivers and Alienware programs, some of the quick keys do not work either: the quick key for mute does not work, volume up lowers the volume and volume down raises the volume now, and a few more things like that!!!

    Is GPU switching working for others with the 4GB 680M in the M18x R1? Any ideas?
     
  46. bumbo2

    bumbo2 Notebook Deity

    Reputations:
    324
    Messages:
    1,612
    Likes Received:
    104
    Trophy Points:
    81
    Have you tested it in games? I don't think Optimus is available on Alienware laptops!
     
  47. supermi

    supermi Notebook Guru

    Reputations:
    0
    Messages:
    74
    Likes Received:
    2
    Trophy Points:
    16
    It is very much available... other than for me right now.
     
  48. DVSman

    DVSman Notebook Consultant

    Reputations:
    9
    Messages:
    121
    Likes Received:
    3
    Trophy Points:
    31
    Optimus (GPU switching on the fly) is available on my M14x, but it is not available on my M18x since I have 2x GPU / SLI. It may, however, be available on those M18x units with just one GPU (I'm not sure on that one).

    In my case (an SLI machine), Dell/AW does provide for manual GPU switching, which requires a keystroke + reboot and is 'technically' not "Optimus."
     
  49. supermi

    supermi Notebook Guru

    Reputations:
    0
    Messages:
    74
    Likes Received:
    2
    Trophy Points:
    16
    OK, yes, you are right: my M14x does the Optimus automatic switching, and my M18x did need the manual change and reboot as well... however, that does not work if no integrated GPU is detected, even in the BIOS...

    That is my issue here.

    I just have the discrete GPU to choose.
     
  50. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,626
    Trophy Points:
    931