Hello. I have a 6920G which somehow got a hair behind the screen some time ago.
My warranty ended some time ago, so I guess I need to remove it myself if I want it removed (and I do).
The hair is in the middle, not too far from the bottom, so it sure is annoying. It looks like a very, very thin strand of hair, most likely from one of our cats.
I have googled for many hours for ways to disassemble my laptop and even tried it myself without a guide. I came pretty far until I gave up, fearing I'd break the screen.
Does anyone have a guide or something on how I can remove the LCD to clean it off? As far as I understand, service manuals are not something you can ask for here (for legal reasons?), so a guide or detailed instructions are what I desperately need.
Anyone kind enough to help me? :3
-
Check your private messages for a better explanation.
Best of luck! -
Thanks for your help, TehSuigi. I managed to get as far as having the LCD completely removed; however, I never managed to remove the backlight, just "tilt" the LCD a little so I could blow some air in between, but it didn't help unfortunately.
However, I now had the opportunity to change the thermal paste on both the CPU and GPU and that dropped the temperature of the GPU down to 67C (from 75C) which obviously means more OVERCLOCKING!
I have managed to increase the shaders by 54MHz (up to 1836MHz now) and the memory by 15MHz (515MHz now), and I am now working on the GPU core (715MHz at the moment). I don't dare increase the memory too much; however, I have glued a few pennies onto the heatsink (glue and AS5) just to be safe.
I also managed to undervolt a little more (from 1.05V to 1.0375V in RMClock; 1.11V according to Everest), AND I found out about ThrottleStop yesterday, so something good came out of all this at least -
Jeeezuss, your overclock is insane. Is that stable?
(and would you mind contributing to the NBR team on HWbot then?) -
I dropped back to what I have in my signature because I downloaded ATITool and the artifact scanner was going cuckoo. I also think I saw 2 yellow "lines" in Crysis and maybe 1 flicker in 3DMark06 after I had done the artifact scan, but I am not sure, so to be safe I reverted to the clocks in my signature.
I ran the Crysis benchmark tool for many, many runs too, to check if the PC would freeze. The shaders have reached their max stable clock at 1836MHz, but I am not sure about the RAM and the GPU. The GPU might be giving artifacts at 730MHz.
How does HWbot work? Never used it before.
I'll post my 3Dmark06 in a few minutes, just need to run it again because it seems like it didn't get saved.
One odd thing, however: earlier when I ran 3DMark06, 2 out of 3 results were EXACTLY the same.
I don't remember the exact results of the runs, but I think the first run was about 5047 points (at the clocks in my siggy), while the 2nd was LOWER but at 730/1836/515, and the 3rd run was exactly the same as the first, but at 730/1836/515. I even saved the last test results on my hard drive, extracted the .xml file, opened it in Internet Explorer and scrolled down to the results, so it wasn't something wrong with futuremark.com. Very odd.
EDIT: Here is the 3DMark result. For some reason it shows different clock speeds than RivaTuner. Which should I trust?
http://service.futuremark.com/resultAnalyzer.action?resultId=13524678&resultType=14
Could the 9500M card be affected by the same crystal oscillator issue as the 9600 GT?
http://www.techpowerup.com/reviews/NVIDIA/Shady_9600_GT/
Considering RivaTuner reads from the clock generator, I think I'd rather trust RivaTuner over GPU-Z or Everest.
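If the 9500M GS were affected, the arithmetic would work the same way as in that article: monitoring tools compute the core clock from an assumed reference frequency, so if the board's real reference differs, the actual clock is off by the same ratio. Here's a quick sketch of that math in Python; the 25/27MHz reference values are illustrative assumptions only, not anything read from my card:

# Rough sketch of the clock-generator theory from the TechPowerUp article:
# tools derive the core clock from an assumed reference frequency, so a
# different real reference scales the actual clock by the same ratio.
# The reference values used below are illustrative assumptions only.

def actual_clock(reported_mhz, assumed_ref_mhz, real_ref_mhz):
    """Rescale a reported clock by the real/assumed reference ratio."""
    return reported_mhz * real_ref_mhz / assumed_ref_mhz

# e.g. a tool assuming a 25MHz reference on a board that really runs 27MHz:
print(actual_clock(660, 25, 27))  # 712.8 - in the ballpark of RivaTuner's 715MHz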
EDIT2: Lol it seems that I am nr1 on HWbot when it comes to the 9500M GS @ 3Dmark06. http://hwbot.org/hardware/videocard/geforce_9500m_gs?tab=rankings#3dmark_2006
Wasn't expecting that. Now to become nr1 on the other Futuremark benchmarks
EDIT3: I have been reading more into why RivaTuner and GPU-Z show different clock speeds. The question is: which one is right?
In RivaTuner, when I OC, I have set the GPU to, for example, 660MHz; however, RivaTuner's hardware monitor says 715MHz. So we have GPU-Z saying the GPU is running at 660MHz and RivaTuner saying 715MHz. How do we find out which one is right? I don't know; I am not that great when it comes to computers and hardware. What I think, though, is that RivaTuner is the one showing the correct clocks. Why? Take a look at my HWbot link and compare the clocks on my card with the person below me.
Let's say RivaTuner is wrong, and my GPU is actually running at 660MHz. How come, then, that I am getting 500 points more than CoFoKrypton, who has both a higher clock on the GPU AND a much more powerful CPU?
All I can guess is that RivaTuner is showing the right clocks. Remember that I am not that great when it comes to hardware and that this is just speculation from an amateur -
Careful there - HWBot demands that you run 3DMark06 at its default resolution of 1280x1024, not 1280x768.
I had to find an external monitor to make my run.
HWBot is an overclocking competition/database. Run your benchmarks, take screenshots, post your score, and enjoy.
You'll probably be able to sneak ahead of me in a few benchmarks - try 3DMark03 or 3DMark05, since their native resolutions are just 1024x768. -
Hmm didn't know it ran at a different resolution. Need to find another monitor then. How do I remove a submission?
EDIT: I found a CRT that can run @ 1280x1024, so we will have new results soon. Embarrassing that I didn't notice the resolution! But in my defense, I haven't bothered with benchmarks for like 1 year or more.
I tried to OC a bit more on the core and the RAM. The RAM is running at 532MHz now, and the GPU at either 680 or 738MHz (I don't know which to trust: GPU-Z or RivaTuner's hardware monitor). The shaders are still at 1700/1836 (whichever is the right clock), and one more step will make the PC freeze during 3DMark.
New results: http://service.futuremark.com/resultAnalyzer.action?projectType=14&XLID=0&UID=26814425
Still nr1 lol, but by only like 40 points. Gonna be running a lot of Crysis benchmark loops to ensure stability. Got any suggestions on any other method I can use to ensure stability?
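In case anyone wants to script the looping instead of clicking through a benchmark tool by hand, here is roughly what I mean; the executable path and argument below are placeholders, not the real Crysis benchmark tool's interface, so substitute whatever command-line benchmark you actually use:

import subprocess
import time

# Stability-test sketch: run a benchmark over and over and stop on a crash.
# BENCH_CMD is a placeholder, not a real tool's path or flags.
BENCH_CMD = [r"C:\Benchmarks\bench.exe", "/run"]  # hypothetical
RUNS = 20

for i in range(RUNS):
    start = time.time()
    result = subprocess.run(BENCH_CMD)
    print("run %d: exit code %d, %.0fs" % (i + 1, result.returncode, time.time() - start))
    if result.returncode != 0:
        print("benchmark crashed or errored - clocks probably aren't stable")
        break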
EDIT2: Managed to squeeze out 8 more points lol: http://service.futuremark.com/compare?3dm06=13528590 -
Your results are phenomenal. Way to go!
I'd go with GPU-Z to be honest. RivaTuner read my speeds wrong as well, but then again, that article with the clock crystal dealt with a desktop 9600 GT. Our mobile 9500M GSs actually use the core from the previous generation 8600 series, so they probably aren't affected. -
Pretty cool being the one ranked nr1 on HWbot with a 9500M GS. Never thought I'd be in first place at anything when it comes to hardware
One odd thing though: I have my shaders at 1660 at the moment, and RivaTuner's monitor shows them as 1782MHz (even at 1626MHz it shows 1782). If I bump them up to 1674MHz, RivaTuner STILL shows the clocks @ 1782MHz. If I set it to 1675MHz, RivaTuner will show it as 1836MHz (so a 1MHz change in the slider can mean a 54MHz jump in the monitor).
The biggest mystery, however, is that I get NO more points in 3DMark at all when I have it anywhere between 1626 and 1674MHz. If I clock it up by 1MHz to 1675, so RivaTuner shows 1836MHz, I get around 20-40 more points...
A better explanation:
I set shaders @ 1626MHz; RivaTuner shows them as 1782MHz. GPU-Z: 1626MHz.
I set shaders @ 1674MHz; RivaTuner shows them as 1782MHz. GPU-Z: 1674MHz.
No difference in 3DMark06 score.
I set shaders @ 1675MHz; RivaTuner shows them as 1836MHz. GPU-Z: 1675MHz.
Around 20-40 more points in 3DMark06.
A 1MHz clock difference in the slider = 20-40 points???
The exact clock numbers are probably not 100% accurate, but the gain in score is about right.
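Both monitor readings are exact multiples of 54MHz (1782 = 33 x 54, 1836 = 34 x 54), which would fit the shader clock only running at discrete 54MHz steps: every requested value inside a step ends up at the same effective clock, so only crossing into the next step changes the score. A quick sanity check on the numbers above (the 54MHz step size is inferred from these observations, not from any spec):

STEP_MHZ = 54  # step size inferred from the observations above, not from a spec

observed = {  # requested shader clock -> clock shown by RivaTuner's monitor
    1626: 1782,
    1674: 1782,
    1675: 1836,
}

for requested, shown in observed.items():
    print("requested %dMHz -> shown %dMHz = %g x %dMHz"
          % (requested, shown, shown / STEP_MHZ, STEP_MHZ))
# 1782 = 33 x 54 and 1836 = 34 x 54, so only the 54MHz step that the
# requested value lands in affects the effective clock (and the score).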