
    Y500 GT650M Overclock

    Discussion in 'Lenovo' started by n1smo, Dec 13, 2012.

  1. Milo01

    Milo01 Newbie

    Reputations:
    0
    Messages:
    4
    Likes Received:
    0
    Trophy Points:
    5
    Yeah, I knew that, but above 310.90 the core is locked at 925 MHz, and my video memory fails at 2300 :(
     
  2. dronelebeau

    dronelebeau Notebook Geek

    Reputations:
    0
    Messages:
    94
    Likes Received:
    1
    Trophy Points:
    16
    Thanks guys! I finally got the balls to OC my GPUs. I was really hesitant at first because of how hot it is in here, but yeah, it had to happen at some point. :D

    I started off with 980/2300 and temps are already getting hotter, and I even forgot to turn Turbo Boost back on, hence the lower Physics score. I really envy your temps; yours run way cooler. Gonna be increasing clocks next time. In the meantime, I'll be testing this clock in real-world gaming.


    [Attached screenshot: 3dmark11_OC_980.png]
     
  3. ChrisNee1988

    ChrisNee1988 Notebook Guru

    Reputations:
    1
    Messages:
    53
    Likes Received:
    0
    Trophy Points:
    15
    How do you turn Turbo Boost on?
     
  4. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    It's already on by default.
     
  5. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Your GPU temps are fine. It's the CPU you should be worried about on this machine. It runs like 20 C hotter than the GPU at full load.
     
  6. dronelebeau

    dronelebeau Notebook Geek

    Reputations:
    0
    Messages:
    94
    Likes Received:
    1
    Trophy Points:
    16
    Yeah, it's the CPU temps that I'm worried about. With Turbo Boost, it reached almost 90 once when I was playing Far Cry 3. It could be badly applied stock paste. I'm not up to repasting it myself, so I guess the only way to get around this is to turn Turbo Boost off. I guess OC-ing the GPUs more will have little effect, if any, on CPU temps.

    Would the temps be cooler if I had opted for an i5? Or a 3632QM?
     
  7. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Trust me, it's not badly applied thermal paste. Others have repasted and seen no improvement or even higher temperatures. The problem is the design of the cooling system in this thing. The main GPU and CPU share the same copper heatpipe and fan. Most gaming notebooks with good cooling systems, like my former ASUS G73Jh, have a separate heatsink, heatpipe, and fan for each component.

    From what I have seen, overclocking the GPU increases the temperature of the CPU somewhat as well. I would not recommend disabling Turbo Boost, because then you're not getting the performance you paid for. Far Cry 3 is a relatively demanding game CPU-wise and benefits from Turbo Boost. I would recommend using a good cooling stand if needed to keep the CPU below 95 C. Anything past that is a bit too close to thermal shutdown (105 C) for my comfort.

    Given the same cooling system, an i5 will run cooler, and an i7-3632QM probably will as well, but then you're sacrificing performance. Even though the i7-3632QM is still a quad-core and not much slower in terms of MHz, its thermal behavior might be different due to it being a lower-TDP chip, and that might decrease performance as well.
     
  8. Kukri

    Kukri Notebook Consultant

    Reputations:
    2
    Messages:
    261
    Likes Received:
    7
    Trophy Points:
    31
    Here's my highest score, at the highest stability I could get with 3DMark. Haven't tested it with FurMark.

    System: Lenovo Y400, i7-3630QM, 1x GT 650M

    Settings: 720p, 1120 MHz GPU core, 2500 MHz GPU memory, CPU in High Performance mode

    [Attachment 94660]

    3151 baby!
     
  9. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    The real test is whether it is 100% stable in all games. I've had cards that could bench all day on a certain overclock but crash as soon as I played any games.

    What is your ASIC quality by the way?
     
  10. Kukri

    Kukri Notebook Consultant

    Reputations:
    2
    Messages:
    261
    Likes Received:
    7
    Trophy Points:
    31
    It was fully stable in all the games I've been playing for a week (like BioShock at max settings, 720p) and FurMark at 1105 MHz core, 2555 MHz memory. With that I did get ooone little green artifact during the game's ending, which I attributed to the memory, so I brought that down to 2500. I'll do some FurMark testing when I get home.

    What's ASIC quality?
     
  11. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    It's a number assigned at the factory based on the quality of your GPU silicon. Higher is better obviously.

    Use GPU-Z and right-click on the title bar:

    [Attached screenshots: Screenshot (6).png, ASIC Quality.PNG]
     
  12. Kukri

    Kukri Notebook Consultant

    Reputations:
    2
    Messages:
    261
    Likes Received:
    7
    Trophy Points:
    31
    Mine shows 88.7%. Is that typical?
     
  13. RebellionElite

    RebellionElite Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
    Highest stable overclock is 1125 core and 2750 memory with the 310.90 drivers. Stable in Crysis 1 and Crysis 2, 3DMark 11, and FurMark (2800 memory gave me artifacts).

    When running in SLI, I have both cards overclocked to 1100 core and 2700 memory.

    No issues with 3DMark benchmarks or games (Crysis 1 and Crysis 2).

    I also have an i5 Y500, so all temperatures stay in check.
     
  14. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Wow those are high. What is your ASIC quality for each card?
     
  15. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I have no idea. I don't even know if it's an accurate measure of how well a GPU will overclock or overvolt, because the method by which the software obtains the reading could be wrong.

    ASIC is basically a number assigned at the fab indicating the quality of the section of wafer that the GPU was cut from. The center of the wafer is always best, and quality goes down as you move toward the edges. Binning saves the fab a lot of money because they can simply take the lesser chips and sell them as lower models with reduced performance instead of throwing them out. For example, the best cuts of GK104 go into the GTX 690 and GTX 680 while the lesser ones go in the 670 and 660 Ti.

    In our case, I'm thinking they put the highest-ASIC GK107 chips in the GTX 650 and GTX 660M, and then our GT 650M and the GT 640M get lower ones.
     
  16. Kukri

    Kukri Notebook Consultant

    Reputations:
    2
    Messages:
    261
    Likes Received:
    7
    Trophy Points:
    31
    What 3DMark scores have you gotten with that in a single-GPU configuration (I'm curious how the i5 might affect it)? I'm also wondering why your memory can clock higher (mine goes nutty at 2600), but it could also be my apparently lower-than-average ASIC :p. The core speeds are the same, though.

    Octiceps, I read a thread on here where most people were showing 100% ASIC for their GT 650M cards.
     
  17. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Read above. I don't think ASIC quality has much, if anything, to do with how high your memory overclocks. That depends on the GDDR5 memory chips, which are different.

    Anyway, yours seems to be a pretty good number considering this is a laptop. It's already 10% higher than mine. My Ultrabay GPU's ASIC quality is in the 60s lol.
     
  18. Kukri

    Kukri Notebook Consultant

    Reputations:
    2
    Messages:
    261
    Likes Received:
    7
    Trophy Points:
    31
    Lol, I was joking. But yeah, it shouldn't make a difference there. BTW, FurMark runs fine, no anomalies or crashes. Looks like 1120 is about the highest stable overclock for me without being able to increase the GPU voltage...
     
  19. AcoustiK

    AcoustiK Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    5
    Recently got my Y400 with SLI GT 650Ms. I tried to use Nvidia Inspector and the .bat files provided in this thread, but no luck; it seems to be locked to my default clocks. Running 3DMark 11 in SLI got me 3770, which sounds a bit low...
     
  20. Untamed

    Untamed Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    3
    Trophy Points:
    16
    The new drivers don't allow overclocking and require a BIOS modification.

    See here: Lenovo Y400 / Y500 - unlocked BIOS / wlan whitelist mod
     
  21. voozers

    voozers Notebook Consultant

    Reputations:
    22
    Messages:
    226
    Likes Received:
    0
    Trophy Points:
    30
    I downloaded the files, but I couldn't figure out how to point them to my NvidiaInspector. Can someone tell me the exact command line to edit in the batch file so it finds the program? I'm not good with batch files, and I even moved the program to the C: drive because that's where it sounds like these batch files were meant to run from, but they didn't work.
     
  22. dronelebeau

    dronelebeau Notebook Geek

    Reputations:
    0
    Messages:
    94
    Likes Received:
    1
    Trophy Points:
    16
    Batch files are just the same as typing into the command prompt; they simply execute the lines automatically so you won't have to type them all over again. Also, if there's any error message, you won't see it when using the batch file because the window closes immediately. You can open a command prompt (Start > Accessories > Command Prompt) and then enter what's in the batch file line by line.
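
    For example, a minimal troubleshooting sketch (the offsets here are placeholders, and C:\nvidiaInspector is the install path assumed by the batch files posted earlier in this thread); the pause at the end keeps the window open so you can read any error message:

    @echo off
    rem apply placeholder core/memory offsets via Nvidia Inspector
    C:\nvidiaInspector\nvidiaInspector.exe -setBaseClockOffset:0,0,135 -setMemoryClockOffset:0,0,250
    rem keep the window open so any error message stays visible
    pause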
     
  23. Estbarul

    Estbarul Newbie

    Reputations:
    0
    Messages:
    8
    Likes Received:
    0
    Trophy Points:
    5
    As a reference to post 87, these are my batch files with the new drivers.
    OC:
    @echo off
    rem apply +135 core / +250 memory offsets, set 925/2250 clocks, and force the maximum performance state (P0)
    C:\nvidiaInspector\nvidiaInspector.exe -setBaseClockOffset:0,0,135 -setMemoryClockOffset:0,0,250 -setGpuClock:0,2,925 -setMemoryClock:0,2,2250 -forcepstate:0,0
    rem (alternatively, -forcepstate:0,5 can be used in place of -forcepstate:0,0)
    Default clocks:
    @echo off
    rem zero the offsets, restore the 790/2000 stock clocks, and stop forcing a performance state
    C:\nvidiaInspector\nvidiaInspector.exe -setGpuClock:0,2,790 -setMemoryClock:0,2,2000 -setBaseClockOffset:0,0,0 -setMemoryClockOffset:0,0,0 -forcepstate:0,16
     
  24. voozers

    voozers Notebook Consultant

    Reputations:
    22
    Messages:
    226
    Likes Received:
    0
    Trophy Points:
    30
    Never mind, I got it. I had to re-browse the thread, but to clarify, it was because I hadn't made an nvidiaInspector FOLDER for the batch files to run the command from. It works now.
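
    In other words, the batch files in this thread assume Nvidia Inspector sits in a folder named C:\nvidiaInspector, so roughly:

    mkdir C:\nvidiaInspector
    rem then extract nvidiaInspector.exe and its support files into that folder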
     
  25. klebs89

    klebs89 Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    3
    Trophy Points:
    6
    Y500 650M SLI
    Turbo Boost Off, both cards at 1100/2500
    3DMark11 score of P4502. I can bench it at higher clocks, but it isn't 100% stable. I have a suspicion that it's my Ultrabay card holding me back. Forcing Turbo Boost on gives minimal gains and raises temperatures significantly.

    I'm using the latest drivers and a modified vbios.
     
  26. dronelebeau

    dronelebeau Notebook Geek

    Reputations:
    0
    Messages:
    94
    Likes Received:
    1
    Trophy Points:
    16
    Turbo Boost only increases the Physics score, which doesn't matter in some games. What were your temps? I feel you; having Turbo Boost on really heats up the laptop more.
     
  27. klebs89

    klebs89 Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    3
    Trophy Points:
    6
    If I stress test with the OCCT PSU test (which simultaneously runs a Linpack calculation and an SLI-enabled FurMark-type test), I'll get ~95 C on the processor (this seems to be the actual peak temperature @ 2.4 GHz; there is no throttling), 70 C on the built-in card, and 72 C on the Ultrabay GPU. Running 3DMark11 gives a max of 82 C on the CPU and the same 70/72 on the GPUs.
     
  28. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Well I get 7600 on my Physics score, so yeah, Turbo Boost makes a big difference. I'm using ThrottleStop to force the maximum Turbo multiplier and my CPU never drops below 3.2 GHz. It goes as high as 90 C in 3DMark 11 but for the purpose of benchmarking that's fine.
     
  29. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Actually, if your CPU is running at 2.4 GHz, it technically is throttling itself by disabling Turbo Boost. Before I used ThrottleStop, I got the same kind of behavior when running a GPU and CPU burn test simultaneously.

    I avoid tests like OCCT because my CPU passes 100 C in less than a minute. :p There's a reason Lenovo programmed the CPU to throttle by default under a simultaneous GPU and CPU workload: the CPU runs way too hot.
     
  30. klebs89

    klebs89 Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    3
    Trophy Points:
    6
    I guess I should have clarified throttling to mean throttling below its constant (non-Turbo) clock, as it'll do above a certain temperature. I like OCCT as a stress test since it provides headroom; thermal performance will degrade over time due to dust buildup, etc.

    Turbo Boost certainly makes a difference in Physics score, but very little difference in overall 3DMark score/real-world FPS. I have tested with ThrottleStop enabled, and the temperature tradeoff is not worth it in my opinion. Real-life gaming performance seems to be GPU-limited, even in the overclocked SLI configuration.
     
  31. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I like to squeeze every last ounce of performance out of this thing, especially in games where I don't want a CPU bottleneck to slow down my SLI setup. The OCCT PSU test and FurMark+Prime95 are unrealistic test scenarios, and the CPU never gets that hot in real gameplay or benchmarks, so I let it run as fast as possible.
     
  32. klebs89

    klebs89 Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    3
    Trophy Points:
    6
    If you don't mind testing, what temps do you get with your OC and Throttlestop enabled during a 3DMark 11 run?
     
  33. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I won't have time for the next few days because I'm in the middle of a clean install and haven't had a chance to OC the GPU yet. But the last time I ran 3DMark 11 with the GPUs at stock and ThrottleStop on, the CPU went as high as 90 C on one core and both GPUs were below 70 C. This was using a notebook cooler, but I don't think it makes a difference either way because the cooler barely seems to help.
     
  34. klebs89

    klebs89 Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    3
    Trophy Points:
    6
    The reason I ask is that I get unacceptably high temperatures with ThrottleStop enabled (>100 C max) and the cards OC'd, but I think that might occur as a result of the CPU test, which, like Prime95 or OCCT, isn't a realistic usage scenario. I might look into that tonight.
     
  35. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    You get "unacceptably high temperatures" in 3DMark 11 or in CPU burn tests? What's your interpretation of "unacceptably high temperatures?" I get as high as 90 C in 3DMark 11 with ThrottleStop forcing the Turbo multiplier.

    Yeah, Prime95 and Linpack are unrealistic CPU usage scenarios, and I can easily push my CPU to thermal shutdown with those tests.
     
  36. klebs89

    klebs89 Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    3
    Trophy Points:
    6
    Unacceptably high with either 3DMark or Prime95 (hottest core maxed out at 103). I was hypothesizing that the 3DMark 11 CPU test is responsible, since it is unrealistic for the same reason as Prime95/OCCT. I only have the Basic Edition, but I might just quit before the CPU test to see what my temps are on the graphics tests only. ThrottleStop temperatures are acceptable when I'm just gaming, but the tiny performance increase isn't worth the significantly higher temperatures.
     
  37. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    That's weird, sounds like something isn't right at your end. For me, the 3DMark 11 CPU test runs way cooler than the CPU burn tests, regardless of ThrottleStop. The 3DMark 11 Physics test is definitely not as demanding as Prime95 and Linpack. Those last two cause my CPU to run at 50 W for a while before the CPU throttles itself to ~3.0 GHz to maintain 45 W since Ivy Bridge can only exceed TDP for a certain amount of time. In 3DMark 11 the CPU is always at 3.2 GHz in the Physics portion and power consumption is never over 45 W.
     
  38. klebs89

    klebs89 Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    3
    Trophy Points:
    6
    I'm not in front of my computer right now, so I may be misremembering the max temperatures. Otherwise, what may be occurring is a combination of two factors: your temperatures were taken with a cooler (which would get progressively more effective as temperatures rose above ambient, meaning you could see very little effect below 90 C and a noticeable effect above it, keeping your temps at around 90), and your run was at stock clocks without forcing P-states, which means you had more thermal overhead since your cards weren't working at peak clocks/volts during the CPU test.

    I might run it at stock clocks/performance states with ThrottleStop on and see what happens.
     
  39. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Nah, it wasn't the cooler. Like I said, it didn't lower temperatures for me much at all. Maybe 1-2 C, but that could have been a placebo effect. My CPU stayed around 90 C regardless.

    It might've been caused by your GPU overclock, especially a high OC on the main GPU, since that shares the same fan and copper heatpipe as the CPU. You should test with everything at stock. I haven't OC'ed my GPU yet, so that might be why I haven't experienced it.
     
  40. klebs89

    klebs89 Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    3
    Trophy Points:
    6
    It's the higher GPU clocks.

    With stock clocks/performance states and ThrottleStop enabled, I get a score of P3620 and a Physics score of 7563. Max CPU temp is 96 (which I expect to be higher than yours, since my current airflow environment around the laptop isn't the greatest), and not in the 100s as when both ThrottleStop and the GPU OC are enabled.

    In any case, I have no intention of using ThrottleStop, considering I get >4500 with the GPU OC and CPU temps staying below 80. Any real-life usage that is CPU-limited will likely have the processor Turbo-ing anyway, since the GPU won't be near full load.

    Interestingly enough, once OC'd past a certain point, the main GPU stays cooler than the Ultrabay one, despite having to share its thermal dissipation potential with the CPU.
     
  41. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I'd double-check that with an in-game OSD if I were you. If you fire up Crysis 3, you'll probably find that the CPU will stay at 2.4 GHz without ThrottleStop, since your GPUs are already being pushed to the edge. Crysis 3 burns up my GPU more than any synthetic test (FurMark/Kombustor, OCCT, etc.) does, and it's very CPU-hungry to boot. You might find you'll get 5-10 more FPS going from 2.4 GHz to 3.2 GHz.

    Anyway, I'd find some way to get that max CPU temp in 3DMark 11 down to around 90 if I were you. That should give you enough overhead to OC your GPU without worrying about thermal shutdown. I find that the maximum CPU temp in 3DMark is a pretty accurate indicator of the max in a game.
     
  42. klebs89

    klebs89 Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    3
    Trophy Points:
    6
    I can't comment on Crysis 3, but the BioShock: Infinite benchmark gives identical results whether or not ThrottleStop is turned on (ThrottleStop off was faster by 0.1 fps), indicating that throttling is not occurring in that usage case. I did order an adjustable cooler anyway; I'll see what effect that has on temps in general. Considering it's not noticeably throttling while gaming anyway, I question the wisdom of running ThrottleStop, at least without simultaneously monitoring temperatures.
     
  43. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    BioShock Infinite is a very GPU-bound game unlike Crysis 3. The fact that there's no difference with ThrottleStop could just mean that the system is not disabling Turbo Boost under this particular workload, or it could mean that the 800 MHz CPU increase makes no difference. Logging the data during the benchmark run should tell you which is the case.

    I hate to throw free performance away and as long as my CPU stays around 90 C when gaming I'm perfectly fine with letting it run as fast as it can.
     
  44. klebs89

    klebs89 Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    3
    Trophy Points:
    6
    That's a perfectly valid position, but you may be throwing away GPU OC headroom considering chips are generally stabler when cooler (in other words, you may get higher stable clocks when your GPU is at 70C as opposed to 75C). On the other hand, this may not apply if you're limited by the stability of the Ultrabay GPU (as I seem to be).

    The usefulness of ThrottleStop obviously depends on the specific program. I was able to get a 4 fps (56->60) increase in the World in Conflict benchmark, but I chose that game because it is particularly CPU-intensive. I'll start using ThrottleStop if I can get temps down with a cooler, but with my current airflow setup it's not worth the 0-7% boost it provides at a cost of 15 C.
     
  45. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Yeah, if you match clocks when overclocking, the Ultrabay GPU does seem to be the limiter. According to GPU-Z, the ASIC quality of my Ultrabay GPU is a bit lower than the internal one's, so it likely won't go as high. But you could always decouple the clock speeds using Nvidia Inspector and clock GPU0 higher than GPU1 (see the sketch below). MSI Afterburner can't seem to overclock the two cards individually.

    That statement about GPU OC headroom would probably be true if the cards were closer to their max operating temperature, but they're so far away I don't know if it matters at all. I think the only thing that can make this GPU go higher is more voltage, which, not to sound redundant, wouldn't be the best thing for an already too-hot CPU. ;)
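
    A minimal sketch of that decoupling, reusing the Nvidia Inspector flag syntax from the batch files earlier in the thread (the offset values are placeholders, not recommendations):

    @echo off
    rem GPU 0 (internal card, higher ASIC): larger core offset
    C:\nvidiaInspector\nvidiaInspector.exe -setBaseClockOffset:0,0,160
    rem GPU 1 (Ultrabay card, lower ASIC): smaller core offset
    C:\nvidiaInspector\nvidiaInspector.exe -setBaseClockOffset:1,0,135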
     
  46. klebs89

    klebs89 Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    3
    Trophy Points:
    6
    I did a quick Google on this topic, and the consensus seemed to be that there was no point to running differently clocked cards in SLI, since they would both operate at the lower frequency. Do you know if there is any point to OC'ing the two cards asynchronously?

    I'm asking essentially out of pure interest; I like to leave a little stability headroom on my OCs anyway.
     
  47. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Actually as of Kepler the two cards can operate just fine at different frequencies since GPU Boost is designed to scale clock speed according to workload. I was able to run my two cards at different clocks before without problems. Not sure if there is any point though.
     
  48. klebs89

    klebs89 Notebook Enthusiast

    Reputations:
    0
    Messages:
    26
    Likes Received:
    3
    Trophy Points:
    6
    Makes sense, I think all the stuff I saw referred to previous architectures.
     
  49. bamselinen

    bamselinen Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5

    Can you tell me how to push the OC to both cards if I am using SLI? Is there a line I need to add to the bat file?
     
  50. Character Zero

    Character Zero Notebook Evangelist

    Reputations:
    41
    Messages:
    327
    Likes Received:
    26
    Trophy Points:
    41
    Yeah, look at this one posted a couple of pages back:
    Notice you have basically repeating lines, with the difference being a "1" or a "0" (the first number after each colon). The "0" is the internal card and the "1" is the Ultrabay card.
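
    For example, a minimal sketch based on Estbarul's OC batch file from earlier in the thread (the offsets and clock values are placeholders):

    @echo off
    rem GPU 0 = internal card
    C:\nvidiaInspector\nvidiaInspector.exe -setBaseClockOffset:0,0,135 -setMemoryClockOffset:0,0,250 -forcepstate:0,0
    rem GPU 1 = Ultrabay card: identical line, only the leading GPU index changes
    C:\nvidiaInspector\nvidiaInspector.exe -setBaseClockOffset:1,0,135 -setMemoryClockOffset:1,0,250 -forcepstate:1,0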
     