How do I explain this:
YouTube - MSI GX640 Play Dirt2 DirectX11
-
BenLeonheart: walk in, see this, wat do?
-
Ehh, you don't. It runs on max... end of discussion.
-
BenLeonheart:
Yeah, I meant "visual evidence" for mr himura kenshin 10,000
-
Then prove it by actually testing it, rather than relying on a YouTube video that was made on January 18, 2010, way before this laptop was even released.
-
My 5870MR runs Dirt 2 on max with 4x AA smoothly; a 5850MR is not much weaker, so running on max should be no problem.
-
BenLeonheart:
He wants evidence...
-
He doesn't need evidence; he's just ignoring visual fact...
And the 5870M is based around the 5750/5770, just like the 5850M is.
That's an accurate video; kudos to you, Ben, for posting it. -
The problem I have with the video is that it doesn't show frame rates. For all you know it could be running at 35 fps (which is what I think it runs at on ultra), and in a video you won't really be able to tell the difference. Until you can actually prove me wrong with a proper screenshot, I won't believe that video.
-
BenLeonheart:
A video recorded with a camcorder will surely show different framerates than a video recorded with, say, FRAPS.
If you record with a cam, you can NOTICE the lag... here, I'll show you:
Both running on a MacBook with an 8600M GT @ ~2.2-2.4 GHz,
details on high and shadows + shaders = medium...
YouTube - Crysis on a Macbook Pro 2.4 Ghz 15"
Recorded with an external cam. FRAPS shows you the framerate, and it's accurate; in that case FRAPS will NOT eat your framerate, because it is only displaying the counter, not being used to RECORD (which uses up PC resources).
YouTube - Crysis on a Macbook Pro 3
Recorded in-game with FRAPS
Difference = quite obvious.
It's kind of a long shot, but well, that's UNDER 30 fps, definitely.
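If anyone wants actual numbers instead of eyeballing a video: all a counter like FRAPS really does is time each frame and average. A throwaway Python sketch of the idea (my own toy code, nothing to do with FRAPS's actual internals; render_frame is a made-up stand-in for the game's frame loop):

Code:
import time

def measure_fps(render_frame, sample_frames=60):
    # time a batch of frames, then divide to get the average rate
    start = time.perf_counter()
    for _ in range(sample_frames):
        render_frame()  # one frame of "game" work
    elapsed = time.perf_counter() - start
    return sample_frames / elapsed

# toy stand-in: pretend each frame takes ~33 ms, i.e. roughly 30 fps
print(round(measure_fps(lambda: time.sleep(0.033))))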
Over 30 = playable. -
Over 30 is definitely playable, but once you see how it runs above 50 fps, you'd never wanna go back to playing at 30 fps. In any case, you guys can run it however you want and believe whatever you want. The people who actually have the game will truly know how it runs.
-
Yeah, great, let's watch a 50 fps vid with eyes that are viewing at 24; that makes perfect sense.
Tell you what: I play Crysis at 24+ fps, and that's fluent gameplay, just like it is at 60 fps. It's not down to the frames; in some cases it can be down to the coding itself. But it's been benchmarked, and Dirt 2 plays on high settings.
Try to find the Thai review if you want more evidence.
Not to mention consoles being locked at 30... -
If you have never experienced high-fps gameplay, 30 fps will be fine for you. I would never play an FPS on a console, even if I could use a keyboard/mouse.
Once you've played games on a high-end desktop, you know the difference.
If you want to game on a laptop, the mobile 5850 is the 4th most powerful GPU (and the 1st isn't even that much faster), so why bother asking about fps? And if you are a picky hardcore gamer, gaming notebooks are not for you. -
BenLeonheart:
Consoles are not locked at 30.
You must be REALLY special to not notice the difference.
Pop Halo 3 into an Xbox 360 = 30 fps.
Pop TimeSplitters 2 into a GameCube = 60 fps.
It is visible to the naked eye.
Movies = 24 fps + motion blur.
-
Not to mention console GPU hardware is absolutely nothing in today's comparison.
At the end of the day the game can be run on ultra, and from that video, motion blur isn't really going to affect Dirt 2 much at all.
And keyboard and mouse combos are out for the consoles now; people are saying it's making a difference, so regardless of frames, people are playing it.
(In fact, how would motion blur even affect an FPS?) -
BenLeonheart:
Oh yes I did
Game developers (not consoles) usually lock games at lower framerates because that way they can fit in more visual detail... Halo 3 at 60 fps would be impossible (unless Halo 3 gave you the option to tinker with the visual settings), because of the hardware limitation.
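To put rough numbers on the "fit in more visual detail" part (just my own arithmetic, nothing from any dev): the fps cap sets how many milliseconds each frame gets, and halving the cap doubles that budget.

Code:
# per-frame time budget at a given fps cap: budget_ms = 1000 / fps
for fps in (24, 30, 60):
    print(f"{fps} fps cap -> {1000 / fps:.1f} ms to render each frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms: locking at 30 gives devs
# twice the time per frame to spend on visual detail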
There is little I can do to prove to you what I'm trying to prove...
To me, 30 frames is MORE than playable, but 60 FPS is noticeable with the naked eye...
Give it a try: phone some friends and just look at how smooth a game looks @ 60 fps versus one @ 30 fps.
Either way, they're playable.
Motion blur: Motion blur - Wikipedia, the free encyclopedia
-
I'd give that motion blur article a 0; just run EVE in its 60 fps and 30 fps online modes: no motion-blur difference, only interface delay (mouse/KB).
Your screen will have the biggest impact, I believe; some screens even have engines, like sub-field Hz tech, which add their own frames.
I'm fairly certain the difference between framerates will just impact the control interface response (I don't know why), but not performance or graphics quality, and as you say, consoles are using 30 fps to squeeze performance out of the hardware. -
BenLeonheart:
Well, it's noticeable.
I never implied the use of motion blur in videogames to aid this; I said "movies",
which are in a whole different league.
As I was surfing to provide Mr. Frijj with something fun, I came across this:
guu2.mkv ... at uploaded.to
It's a small video showcasing the Spore "opening" scene,
one at 30 fps, one at 60.
Like I said...
It's much, MUCH harder to get Crysis running at 60 fps at 1920 x 1200 on Enthusiast...
You'd need a tri or quad SLI or a CrossFire setup.
And that's the sole reason why most console games are locked by devs @ 30.
Because it's much easier to run Crysis at 30 fps on Enthusiast (which is what single cards like the 5870 get) than to expect it to reach 60 fps... you'll only do that by lowering the res, and perhaps switching to High details.
The more demanding the game, obviously, the more power it requires, hardware-wise...
Console hardware you cannot upgrade, so devs often choose to lock their games at 30.
Now, some console games, like Smash Brothers, are free to go from 30 up to 60 fps, because they're not that graphically demanding, and the ATI card in the Wii can handle it just fine...
I remember when UT99 came out; I was blown the hell away, but at the time my Pentium II could barely handle it... because of the software rendering... it ran at like 25 frames...
When I bought my first Voodoo3 16 MB card... my life changed. I understood and lived it... I saw that it was MUCH smoother... seamless, made-me-dizzy...
My Final point still stands.
You can SEE the difference..
'Nother video:
60vs24.avi
Download it and open it with VLC...
If you cannot see the difference there, then I'll give up trying to explain.
So, with newer games coming out, if you expect them to be 60 fps "out of the box", they'd have to be heavily toned down...
That's why we need z0mg 5870, 5850, 285 GTX SLI, etc. to run them at HIGH framerates with very pretty visuals...
For the record: I don't give a rat's rear end about FPS that much (only in multiplayer).
I enjoy my single-player experiences with 30 as a minimum... If I love the game, I'd go with 25 and I wouldn't give an enchilada.
Now, for precision/pwnage in UT3 and BF2, I need 60 (not that I can't on 30, it's just... I dunno).
Peace -
Oh, I get you about the films, man, but those alone are recorded at their respective rates, whereas gaming yields completely different results.
I can easily see the motion blur, so don't worry;
it's just that in games I don't find it noticeable at all.
And I think Crysis has blur shaders itself anyway, doesn't it? For things like fast turning and depth of field. -
Updated with some pics of the inside, along with two of my cat, while I attempt to find the time (to not game) to take some pictures/screenies for the review.
-
BenLeonheart:
PC pr0n O_O
BTW, I feel like such a noob, but I'd better ask, instead of being a donkey without asking...
The last time I popped open laptops, it was my dv2000, m1330, m1530, and a Medion PC...
My question is as follows:
What's with all the copper surrounding the GPU? Better cooling comes to mind, but... oh well, I guess it is for better heat dissipation, so my question invalidates itself as a question and upgrades into an assumption. Can anyone clarify that for me?
and @ Dspr_02...
Is the fan REALLY loud?
-
The heat pipes look kinda really buff and nice, and the CPU heatsink looks really well placed too, but the GPU's just looks flimsy, don't you think?
I'll check inside my machine later, but I'm sure the 4850's is thicker than that. -
You kinda answered it yourself.
I looked around, because it's a good question.
Copper is known to be a good thermal conductor. From my understanding, more copper around the CPU/GPU means the copper absorbs the heat generated by the source and transfers that heat energy away more efficiently. That might just be why this one-fan system is doing so well so far.
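If you want a rough feel for why copper specifically, here's a back-of-the-envelope using Fourier's law with textbook conductivities (the cross-section, length, and temperature values are just guesses for illustration):

Code:
# 1-D conduction: Q = k * A * dT / L (watts moved through a solid bar)
k = {"copper": 401, "aluminum": 237, "silver": 429}  # W/(m*K), textbook values

A = 1e-4   # cross-section, 1 cm^2 (guessed heatsink-plate scale)
L = 0.05   # 5 cm from die to fin stack (guess)
dT = 40    # 40 C between GPU die and the fins (guess)

for metal, cond in k.items():
    print(f"{metal}: ~{cond * A * dT / L:.0f} W conducted")
# copper moves ~70% more heat than aluminum through the same bar,
# which is why the plates and pipes around the GPU are copper

(Silver only edges out copper by a few percent, for what it's worth.)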
If you guys wanna read on the subject.
Thermal conductivity - Wikipedia, the free encyclopedia
Heat sink - Wikipedia, the free encyclopedia
Benchtest.Com - Aluminum and Copper Heat Sink
-
Hey Dspr, can you run an SC2 demo and post a screenshot? I just wanna see what Ultra looks like on this computer; I've never really seen Ultra settings before. The ones on YouTube are too blurry for me to see, and the 1v1 matches are most likely run on High for playability.
Thanks man -
And yes, copper is a very good thermal conductor. -
Expensive, too: when I had the copper heatsink on my Clevo's GPU replaced, I was charged 100 bucks for it.
-
The fan isn't really that loud. I had just spent 6 or so hours in the emergency room at the local hospital, so I was in a semi-foul mood, the kind where everything hits a nerve and sets you off. (I tore something in my shoulder, and the doctor just flung my arm around, causing even more pain.) -
Ouch. Mind you, have you checked your display pic recently?
It's hard to say; today I'll rip open my machine and see how different the changes are, or what the copper looks like, for starters. -
If you want better cooling you can make a silver heatpipe ^^
-
That's true, but making it, that'll be tough.
Besides, copper's good when made properly, because it has that whole process and chemical change thing going on! -
Some guy built a water-cooled mini-ITX PC... I think it's called "Project MiniMe" or something... -
Lol?? You'll have to pull that up; I'm interested to see how he did that, haha.
-
Anyone made one of those yet too??
-
Graphene, baby!
-
I'll just stick to liquid nitrogen
-
Wasn't it Sager or something that showed a prototype water-cooled notebook? (Could be wrong about the manufacturer.)
EDIT: it's Fujitsu -
Fujitsu? Got any links, mr dead?
-
I don't see the point, though; water cooling is mostly about conducting heat quickly over long distances, so that a better cooling solution can sit further away from the heat source. There's not much space in a notebook anyway, and it's such a short distance from the CPU/GPU to the outside that a copper heatpipe is a lot better.
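Rough numbers on the distance point (my own back-of-the-envelope again, with made-up bar dimensions): the thermal resistance of a plain solid conductor grows linearly with length, which is why solid copper alone stops being enough once heat has to travel far.

Code:
# thermal resistance of a solid bar: R = L / (k * A), in kelvin per watt
k_copper = 401   # W/(m*K)
A = 1e-4         # 1 cm^2 cross-section (guess)

for length in (0.05, 0.20):  # a 5 cm run vs a 20 cm run
    R = length / (k_copper * A)
    print(f"{length * 100:.0f} cm of solid copper: {R:.1f} K/W "
          f"-> {R * 50:.0f} C rise at 50 W")
# heatpipes (and pumped water loops) have effective conductivities orders
# of magnitude above solid copper, so they only pay off over distance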
And as they said, their water cooling is not really meant to increase cooling, but to reduce fan noise. None of their laptops use water cooling either; it was most probably a failed product. -
I personally think that's awesome; coupled with MSI's design of pulling air over the components, that water cooling would be quite effective. I'd love one of them to test in my machine.
-
I disagree, there's plenty of space...
People complain about the Sager/Clevo tank because of the fan noise; that said, it could easily fit one of those, as they are kinda massive -
Even ours has a good amount of space, and the new books have made the bottom protrude out slightly more (I think), with some minor shape alteration.
-
Thanks. -
Yes, it's definitely around 7 lbs with the battery. Coming from a 17" Gateway FX, though, this thing feels like a netbook to me.
-
BTW, ekoez, most gaming laptops are heavy. Even the Alienware M11x weighs close to 5 lbs, and that's an 11.6" laptop.
-
Thanks for the help kosti =D
-
Oh fz, it may not look it, but believe me, they're huge, haha.
As for the 9-cell, it adds the most weight of anything; I remove it from mine and my laptop feels unrealistically light for its size.