The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    OC the 2090 to match the G1s

    Discussion in 'Sager and Clevo' started by rideexileex, Jul 6, 2007.

  1. rideexileex

    rideexileex Notebook Geek

    Reputations:
    20
    Messages:
    96
    Likes Received:
    0
    Trophy Points:
    15
    I've been reading around and there hasn't been a blunt/direct statement about this... Yes, I realize the 2090's clock speeds have been crippled compared to the Asus G1S's card, but what's to prevent turning up the clocks to match the Asus system?
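
    Nothing stops a software overclock in principle; the question is how far the clocks would have to move. Here is a quick Python sketch of sizing the gap, using placeholder clock figures that are illustrative only, not confirmed 2090 or G1S specs, so substitute your own readings:

        # Placeholder clocks for illustration only -- NOT confirmed specs.
        np2090 = {"core_mhz": 475, "mem_mhz": 400}   # hypothetical Sager NP2090 clocks
        g1s    = {"core_mhz": 475, "mem_mhz": 700}   # hypothetical Asus G1S clocks

        for domain in ("core_mhz", "mem_mhz"):
            delta = 100 * (g1s[domain] - np2090[domain]) / np2090[domain]
            print(f"{domain}: {np2090[domain]} -> {g1s[domain]} MHz ({delta:+.0f}%)")

    With numbers like these the core would need no change at all, but the memory would need a very large bump, which is exactly where a slower memory type would hit a wall.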
     
  2. wave

    wave Notebook Virtuoso

    Reputations:
    813
    Messages:
    2,563
    Likes Received:
    0
    Trophy Points:
    55
    I know RivaTuner says that the IFL90 uses GDDR3, but I don't think that is true. RivaTuner gets the core speed and the shader speed wrong, so why wouldn't it get the GDDR3 vs. GDDR2 question wrong too?

    If I am right, then the IFL90 uses GDDR2, and it just can't be clocked as high as the GDDR3 in the G1S.
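
    If wave is right, what actually matters is memory bandwidth, which scales linearly with the memory clock. A minimal Python sketch of the arithmetic, assuming the 8600M GT's commonly cited 128-bit bus and illustrative clocks rather than confirmed IFL90 or G1S figures:

        # Theoretical memory bandwidth: effective transfers per second times bus width.
        # Bus width and clocks are assumptions for illustration, not confirmed specs.

        def bandwidth_gbs(mem_clock_mhz, bus_bits=128, pumps=2):
            """GB/s = clock (Hz) * transfers per clock * bytes per transfer."""
            return mem_clock_mhz * 1e6 * pumps * (bus_bits / 8) / 1e9

        print(f"400 MHz memory: {bandwidth_gbs(400):.1f} GB/s")  # ~12.8 GB/s
        print(f"700 MHz memory: {bandwidth_gbs(700):.1f} GB/s")  # ~22.4 GB/s

    Note that the memory type only matters through the clock it can sustain: GDDR3 at 400 MHz has exactly the same theoretical bandwidth as GDDR2 at 400 MHz.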
     
  3. Donsell

    Donsell Notebook Evangelist NBR Reviewer

    Reputations:
    163
    Messages:
    546
    Likes Received:
    0
    Trophy Points:
    30
    Wave - it seems like you want it to be GDDR2 rather than GDDR3. Donald has said it's GDDR3, and he has had them in his shop. Is it that hard to believe?
     
  4. laptophunting

    laptophunting Newbie

    Reputations:
    1
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    I haven't seen that statement made by Donald on these forums. Where did you see that?

    I agree with Wave; the reported clocks seem to indicate that GDDR2 is used.

    In any case, whether it is GDDR2 or GDDR3 is irrelevant if the clock speed can't be raised to G1S levels. So far I haven't heard any reports of overclocking the memory to anywhere near G1S levels.
     
  5. Donsell

    Donsell Notebook Evangelist NBR Reviewer

    Reputations:
    163
    Messages:
    546
    Likes Received:
    0
    Trophy Points:
    30
  6. sco_fri

    sco_fri Notebook Evangelist

    Reputations:
    75
    Messages:
    451
    Likes Received:
    0
    Trophy Points:
    30
    Wasn't the HEL80 underclocked as well?

    Update - Here is the review of the HEL80

    "The 3D performance of the HEL80 is not as fast as other notebooks using the same chip, mainly because the HEL80's GPU is underclocked."
     
  7. sco_fri

    sco_fri Notebook Evangelist

    Reputations:
    75
    Messages:
    451
    Likes Received:
    0
    Trophy Points:
    30
    Chrisyano did a HEL80 review as well, and it seems as if he may be a very good source on the "why" and the "what to do about it."

    Personally, I don't care whether the IFL90 has the highest benchmark scores or not; the GPU is going to be more than enough for me either way. I would expect that anyone who wants the highest benchmarks in a DX10 system has already purchased their G1S.
     
  8. Donald@Paladin44

    Donald@Paladin44 Retired

    Reputations:
    13,989
    Messages:
    9,257
    Likes Received:
    5,843
    Trophy Points:
    681
    Donsell, I have corrected that post.

    No one knows for sure yet whether the memory is GDDR2 or GDDR3, even though one owner has posted a RivaTuner screenshot showing GDDR3. The problem is that the software tool is not an “official” (read “reliable” ;) tool recognized by either Compal or nVIDIA. The Compal specs say GDDR2, so until we actually get the production models into the US we won’t know for sure. The bottom line, however, is that it really doesn’t matter to the performance of the video card. It will kick butt just the way it is, and you wouldn’t be able to tell the difference in your gaming experience.

    I totally agree with this. All this talk about synthetic benchmarks and clock speeds, and even the HUGE difference between GDDR2 and GDDR3 performance (which does not exist), is for the most part splitting very fine hairs. It makes for great conversation, but doesn't mean much.

    Once you get your laptop you will forget all of these comparisons and just game...none of this will improve your reaction time. If you can get a significant increase in FPS it will make a difference, but a few hundred benchmark points, or a few frames per second, will not be noticeable in actual play.
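
    Donald's last point is easy to sanity-check with arithmetic: converting FPS into per-frame time shows how little a small bump changes each frame. A short Python sketch, where the FPS pairs are arbitrary examples:

        # Convert FPS deltas into per-frame time differences (milliseconds).
        def frame_time_ms(fps):
            return 1000.0 / fps

        for base, bumped in [(30, 33), (60, 63)]:
            saved = frame_time_ms(base) - frame_time_ms(bumped)
            print(f"{base} -> {bumped} FPS: each frame is {saved:.2f} ms shorter")

    A 3 FPS gain at 60 FPS shortens each frame by well under a millisecond, which supports the point that only a significant FPS increase is noticeable in actual play.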
     
  9. ffkol

    ffkol Notebook Consultant NBR Reviewer

    Reputations:
    70
    Messages:
    271
    Likes Received:
    0
    Trophy Points:
    30
    I think it's more of a psychological issue, like a placebo. After all, people will always go for the latest/biggest numbers, right?

    EDIT: As in GDDR3 instead of GDDR2.
     
  10. rideexileex

    rideexileex Notebook Geek

    Reputations:
    20
    Messages:
    96
    Likes Received:
    0
    Trophy Points:
    15
    Yeah, I ended up doing that (going for the best) with the eSATA and HDMI ports of the G1S... honestly, I'm not going to use those. The closest use I can think of is hooking up a projector at college, and as an engineering student I think the 2090 is going to look much more professional anyway.
     
  11. Scavar

    Scavar Notebook Evangelist

    Reputations:
    50
    Messages:
    498
    Likes Received:
    0
    Trophy Points:
    30
    Sounds like a good plan, rideexileex; it'll save you some money too.