The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    My Nvidia GTX 880M Test Run Review

    Discussion in 'Alienware' started by Johnksss, Feb 26, 2014.

  1. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    You're going to have to mod the INF file if you install any GPU whose hardware ID Alienware has not reported to NVIDIA or AMD. By default, the INF file only supports a standard configuration that has been reported by the manufacturer. Even downgrading to an older GPU that was never a stock option would require the INF to be modded (e.g. a 460M or 6970M in an M18xR2).
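    For anyone wondering what the mod actually involves: the idea is just to add the new card's hardware ID to the driver INF's device list and its display name to the [Strings] section. Here's a rough Python sketch against a simplified, made-up INF fragment (the section name, subsystem ID, and layout are placeholders, not the real nvdmi.inf contents):

```python
# Rough sketch of the INF mod described above: append the new card's
# hardware ID to the device-list section and its name to [Strings].
# Section names and SUBSYS values here are illustrative placeholders.

SAMPLE_INF = """\
[NVIDIA_Devices.NTamd64.6.1]
%NVIDIA_DEV.11A0% = Section050, PCI\\VEN_10DE&DEV_11A0&SUBSYS_05AA1028

[Strings]
NVIDIA_DEV.11A0 = "NVIDIA GeForce GTX 680M"
"""

def add_hardware_id(inf_text, section, device_line, strings_line):
    """Insert a device entry under `section` and its name under [Strings]."""
    out = []
    for line in inf_text.splitlines():
        out.append(line)
        if line.strip() == section:
            out.append(device_line)
        elif line.strip() == "[Strings]":
            out.append(strings_line)
    return "\n".join(out) + "\n"

modded = add_hardware_id(
    SAMPLE_INF,
    "[NVIDIA_Devices.NTamd64.6.1]",
    "%NVIDIA_DEV.1198% = Section050, PCI\\VEN_10DE&DEV_1198&SUBSYS_05AA1028",
    'NVIDIA_DEV.1198 = "NVIDIA GeForce GTX 880M"',
)
print(modded)
```

    In practice you'd edit the INF that ships with the NVIDIA driver package and reinstall, but this is the whole trick: the driver works fine, it just refuses to install for an ID it doesn't recognize.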
     
  2. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    laptopvideo2go tends to have modified INFs to use.
     
  3. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    Yes, they sure do. Once you learn how to mod the INF yourself, it's a lot easier and faster to just do it yourself, so it's worth learning; it's very simple.
     
  4. Arotished

    Arotished Notebook Evangelist

    Reputations:
    0
    Messages:
    481
    Likes Received:
    104
    Trophy Points:
    56
    Where can I find information on how to mod the .inf file?

    Does this apply for 780M also for the R2?
     
  5. jeepinchris

    jeepinchris Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Long time reader, first time poster. I finally pulled the trigger to upgrade my old 8170 to a new 8278s with an 880M. Sager shipped the machine with what I assume is a modified version of the NVIDIA installer (332.66) to detect the device ID of the 880M; the most current official and beta builds available from NVIDIA don't even detect the card. I can go about modding the INF for the latest available build, but I'm not sure it's going to change anything. As of right now, even when I set up a global or app-specific rule in NVIDIA's control panel to use the 880M, everything seems to default to the Intel 4600. I can't install the drivers provided by Sager without installing the Intel drivers as well, and disabling the Intel 4600 kills the 880M too... any suggestions on this? Not sure if that would hold true if I modified the latest drivers myself, or if Optimus requires both to be on there no matter what. When you did your initial review on techinferno I know you mentioned you used a modded vBIOS and the 331 driver, so I ran 3DMark basic and compared with those.

    Results are here: Generic VGA video card benchmark result - Intel Core i7 4900MQ, Notebook P17SM-A

    On the home screen it shows both the 880M and the Intel 4600; in the results it shows "Generic VGA" and an Intel 4900 (it's really a 4810).
    As you can see, those scores seem on par with your "tinkered" Fire Strike score at the top of your write-up, but still far from your final score. Also, looking at the results at the bottom, it lists my graphics card as generic and shows it using GTX 880M driver 9.18.13.3266 with 8GB of RAM and a 954 core clock. I currently have the NVIDIA profile set to "GTX 880M" as opposed to "auto select" as a global profile.

    Numbers are one thing, in-game performance is another. I know it's nowhere near the BEST benchmark game in any way, but the only thing I've tossed on this machine at this point is a fresh install of WoW. My old 780M on my Alienware, non-OC'd, ran it at ultra with no problem; my old 6990, OC'd in the 8170, could run most settings at ultra and be just fine in MOST cases. Using the default "high" profile is pushing it on this machine: lots of stutter and slowdown, although the FPS monitor still shows everything smooth at 60 FPS (vsync/triple buffering on). The recommended settings are mostly "good". Each of those machines had less CPU power than this one as well, and I know WoW can also be heavily CPU-reliant.

    Am I pretty much stuck until an official driver is released? Do you think apps just aren't detecting it properly? I'd hate to just sit on my hands and wait for NVIDIA, so any suggestions on how I might be able to better use this card now would be greatly appreciated.


    Thanks!
     
  6. LostInaMaze

    LostInaMaze Notebook Guru

    Reputations:
    0
    Messages:
    66
    Likes Received:
    3
    Trophy Points:
    16
    This is troubling to hear. The card is released and sold, yet there aren't any working drivers? Ridiculous. And that not-so-great performance... this might put off my plans to buy a new 880M laptop for a while...
     
  7. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    Don't be discouraged. All of this is absolutely normal for a new GPU release, and it's unreasonable to expect a different outcome with the 880M... a few thoughts...

    • If Clevo allows you to disable Optimus, do so immediately... only re-enable it if and when you need to run on battery. If you want good results, Optimus will be an impediment. You have to install the Intel graphics driver with Optimus, so that's normal. Find out if you can disable it... on some machines you cannot.
    • The installer is no different. They probably had to manually edit the INF file to add the hardware ID if NVIDIA hasn't added it to the INF. This is not a big deal, and it's common to have to do it on machines that have been upgraded as well. This has no impact on performance.
    • You will never achieve great results with a stock NVIDIA vBIOS - go get an 880M mod from svl7 if you want great performance (applies to all Kepler and most Fermi GPUs).
    • You need to post your GPU hardware ID in the Futuremark forums... they need to add the new laptop/GPU combo to their database. This has nothing to do with drivers. The card is not listed, and this is why it shows as "Generic VGA" in your benchmark. The same thing happens on every new GPU release. It has to be in their database to avoid being identified as "Generic VGA".
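    For reference, the hardware ID being discussed is the string Windows Device Manager shows under Details > Hardware Ids, of the general form PCI\VEN_xxxx&DEV_xxxx&SUBSYS_xxxx. A small sketch for pulling the vendor/device codes out of that string (the DEV_1198 value below is used as an illustrative example of the 880M-era format):

```python
import re

def parse_hardware_id(hwid):
    """Pull the PCI vendor and device codes out of a hardware ID string."""
    m = re.search(r"VEN_([0-9A-Fa-f]{4})&DEV_([0-9A-Fa-f]{4})", hwid)
    if not m:
        raise ValueError(f"not a PCI hardware ID: {hwid!r}")
    return m.group(1).upper(), m.group(2).upper()

ven, dev = parse_hardware_id(r"PCI\VEN_10DE&DEV_1198&SUBSYS_05AA1028&REV_A1")
print(ven, dev)  # 10DE is NVIDIA's PCI vendor ID
```

    The VEN/DEV pair is what both the INF device list and Futuremark's database key on, which is why a card they haven't seen yet comes back as "Generic VGA".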
     
    unityole likes this.
  8. jeepinchris

    jeepinchris Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5



    Thanks fox, appreciate the response.

    Doesn't appear Clevo lets you disable Optimus. The best I can do is force only the 880M to be used in the NVIDIA control panel, not flat-out disable the Intel.

    I will get a modded vBIOS soon, just gotta go make some more posts over there ;). Didn't want to make 5 junk posts just to be able to download; perhaps if someone has a modded vBIOS for the card they can get in touch another way and hook me up.

    I figured that since the card had already seen results posted from johnksss and others, and it was specifically identified as what it was, it was already in their DB.

    Clevo provided me with a different driver, V9.18.13.3235, as opposed to the "newer" build that shipped with the laptop, .3266. This older build performs substantially better but still has its share of weirdness and performance degradation, especially playing with vsync. It doesn't matter if I toggle it on in the NVIDIA CP and then off/on in games; as soon as it's on, everything goes to sh!t, and when it's off there's a decent amount of tearing. At least I can live with it for now.
     
  9. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    You would need to correct the vBIOS issue first before going with a different driver....
     
  10. MolocM18x

    MolocM18x Notebook Geek

    Reputations:
    0
    Messages:
    94
    Likes Received:
    3
    Trophy Points:
    16
    Thanks for the great write-up!! Looks like I'm keeping my 780 in my Alienware 17 then, lol


    Sent from my iPhone using Tapatalk
     
    johnksss likes this.
  11. Prasad

    Prasad NBR Reviewer 1337 NBR Reviewer

    Reputations:
    1,804
    Messages:
    4,956
    Likes Received:
    10
    Trophy Points:
    106
    Seems like the 870M (at its price point) should be sufficient for me... thanks! :)
     
  12. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    How much is the 870M going for?

    Edit:
    640 dollars or so.
     
  13. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    ^that's a complete ripoff, considering you can get a 780M + heatsink for just $695, with +10% performance at stock to boot.
     
  14. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    I can't really say anything till I see that 870M pushed....
     
  15. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Fair enough, but that 192-bit bus is going to put it at a disadvantage.
     
  16. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    I'm thinking this as well, but you never know... :)
    Soon, we shall have some verified answers...
     
    Mr. Fox likes this.
  17. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    Lack of ROPs will hold it back too. The 780M is worth getting over it at that price.
     
  18. mgalena

    mgalena Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    johnksss --

    I'm a little confused, please help me out... what vbios or other means are you using to overclock the 880m?

    I have one of these, so I'm quite interested in removing the throttling issues, even if I don't OC it too much beyond that. I've read a number of your posts, here and other sites, but it's still unclear to me what I should be using to do this... it seems like the "780m OC edition" vbios on techinferno is what I should use, but I'm unclear if the additional 4GB in memory plays well with that, even though otherwise it's essentially the same card.

    Could you point me in the right direction? Thank you!
     
  19. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    Well, we haven't officially released anything at the moment. That was a collaboration between svl7 and myself. We are still waiting on a few other factors to present themselves first. Also not sure if NVIDIA is coming clean with this next driver, or going to make things even worse than they already are for any 800 series card and up. (Speculation at this point.)
     
    Mr. Fox likes this.
  20. mgalena

    mgalena Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    Ah... thank you for the response... understood!

    Are the recommended drivers still the 327.23 version at this point?
     
  21. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    That driver seems to work well enough, but it was crashing during OSD and gaming. That could be because I was recording at the time, though.

    Edit:
    So, time to shut up a bunch of people who talk about what video memory you can't use. :D

    This is the 780M at 1920x1080 and 2560x1440
    [IMG]
    [IMG]
    2560x1440
    [IMG]
     
    Mr. Fox likes this.
  22. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    Even an older game like Skyrim with ENB mods and maxed out settings (thanks Brother TBoneSan for tips on those) can gobble up the 2GB of VRAM that many GPUs are limited by.

    Skyrim2.24.jpg
     
    johnksss and TBoneSan like this.
  23. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Hey hey hey looking good Brother Fox :D

    You leave no stone unturned.. Love it ! :thumbsup:
     
    Mr. Fox likes this.
  24. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    Take a look at the one right above it, brother Bone.
     
    TBoneSan likes this.
  25. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    I still think 8GB is way overkill. Maybe if I were using those 8GB as system RAM for the OS, then I'd have a ton of use for it. And yay, my 32GB will be here on Monday or Tuesday.
     
    Mr. Fox likes this.
  26. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    Well... just add this to the list of things that fall under overkill. :D
    And that list is pretty darn big as well.


    Although it's not overkill for BF4. First person ever to use 3.8GB of VRAM! And that game has been out for quite some time now.
     
  27. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    My eyes can't tell the difference between anything above 1080p on an 18-inch or smaller screen, quality-wise. Frames per second would change a lot more. With a 20-inch I'm using 1650 x 1200, and I think it's just a bit overkill lol
     
  28. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    You know I'm only posting that stuff to say "I told you so!", but not to you though. :D

    Now I need to load up BF4 on my desktop to see if it will in fact go higher than 4GB of VRAM, since the card in there is a 6GB Titan.

    Heck, brother unityole, I game at 1920x1080 on normal and no overclock. As long as it plays and I'm not stuttering or slow-dragging when it comes to aiming, I'm golden! :)
     
    unityole likes this.
  29. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    Do you go tri-screen at 1080p each? That will probably go over 4GB. Well, tbh I wouldn't mind more memory, and I might eventually find my way into some situation where I'll need it. Always dreamed about a multi-screen laptop :D
     
  30. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    I have multiple screens, but I just used my 2560x1440 monitor. It would take some time to set that up for my laptop though.
     
  31. Canious

    Canious Notebook Consultant

    Reputations:
    134
    Messages:
    101
    Likes Received:
    3
    Trophy Points:
    31
    I agree with 8GB being overkill; I thought it was a typo when I first saw it lol
     
  32. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    Really? And how so? Since 4GB can be eaten up pretty quickly now.
    Edit:
    I see why now.
    i7 2720QM | HyperX 8GB 1866 MHz | 320GB HDD | Crucial M4 128GB SSD | Radeon™ HD Dual 6990M CrossFireX |

    :D
     
    reborn2003 and Mr. Fox like this.
  33. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    8GB is overkill for 1080p though. Maybe that's what he meant?
     
  34. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    This is not the "8GB is too much" review.
     
    DumbDumb, reborn2003 and Mr. Fox like this.
  35. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    I still stand by my belief that too much of a good thing is always the nicest option. If 8GB is more than you need, that's a good spot to be in when you stop and think about it. That's kind of the point of owning an expensive gaming or benching beast, right? If you wanted average, you'd be better off tinkering with a cheap plastic MSI or ASUS, or a mickey-mouse Razer gaming "Ultrabook" (oxymoron) with a low-end i7 CPU and a totally average mid-range 2GB GPU. When you buy up and go with a real beast, you want more horsepower than necessary to get the job done.

    But all that aside, here's Call of Duty: Ghosts maxed out settings running 2560x1440 on my M18xR2, just gobbling up my 4GB 780M vRAM like there's no tomorrow...

    Now, with 780M SLI I am still getting a consistent 60-70 FPS, but 3.9GB of the 4.0GB of vRAM is just... well... gone, LOL.

    4K display anyone?

    Anyone?

    Bueller?

    Bueller?


    If you're buying a new Alien beast, the three options are: A, a 2GB 765M; B, a 2GB 860M (both of which get spanked by the 680M and 780M); or C, an 8GB 880M. Looking at the other options, what's not to love about Option C?
     
    reborn2003, johnksss and TBoneSan like this.
  36. Optimistic Prime

    Optimistic Prime Notebook Evangelist

    Reputations:
    572
    Messages:
    521
    Likes Received:
    95
    Trophy Points:
    41
    I agree with you, but I wouldn't say Call of Duty: Ghosts is a very good example. :p

    Sent from my SM-N900V using Tapatalk
     
  37. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    Ok, I know I shouldn't, but I'm going to ask anyway... now why is CoD: Ghosts not a good example?
    No need to talk about ports this and that either. :D

    We are only looking at what can max out this 4GB barrier that no one thought could be surpassed...
    Edit:
    Make that 2GB
     
    Mr. Fox likes this.
  38. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    I really like Call of Duty: Ghosts. I think the offline campaign is pretty decent. I understand that there are people that like online multi-player games, and I have heard that Ghosts has issues that don't affect me in offline campaign, but none of that really matters. As John said... these are examples of games that exceed the functional capacity of 2GB GPUs. This one maxes out 4GB when everything is cranked up.

    These examples are merely an answer to the chatter about how much vRAM is adequate versus "overkill" rather than a game title popularity contest. ;)

    I have to admit I'm puzzled by the fact that some people almost seem upset that the 880M has 8GB of vRAM. That seems so silly. So what, even if it were true that 8GB is overkill by today's standards? Since when is aiming for "adequate" or "good enough" the goal for an Alienware owner? It would make a lot more sense to see the same folks complaining that the 765M and 860M have only 2GB of vRAM, which is clearly not "more than enough" by today's standards and will probably soon become inadequate for gaming.
     
  39. Optimistic Prime

    Optimistic Prime Notebook Evangelist

    Reputations:
    572
    Messages:
    521
    Likes Received:
    95
    Trophy Points:
    41
    Just my opinion, that's all. When I had my hands on it, the code was pretty dodgy. It could very well have been fixed by now. They have also been known to artificially inflate their minimum requirements. My primary point was that just because it uses up 5GB of vRAM (say, in a system with a Titan) doesn't mean it needs to. The textures look messy, in my opinion. Given their track record, I would say it's more likely a memory leak. There is no way a game that looks that old really needs that much.

    They could have fixed it by now, that's entirely possible. I don't have an actual license to go back and have a look. At the time, the game didn't have an FoV toggle, and was locked at around 65. This is actually a problem, as I get headaches from simulation sickness.

    As I said before, I agree with you on everything else. In fact, I was one of those concerned that 2GB of vRAM on the 680Ms may not be enough in a few years, lol. I casually mentioned it not because of game popularity, but that the game has known problems (or did at launch). I didn't think it would become a discussion point. :)
     
  40. kenny27

    kenny27 Notebook Deity

    Reputations:
    294
    Messages:
    919
    Likes Received:
    167
    Trophy Points:
    56
    To be fair, 8GB does sound like a lot when the Titan has 6GB, the GTX 780/780 Ti has 3GB, and the R9 290X 4GB.
    I was a bit skeptical, figuring 8GB was excessive, but it's obviously not for "future-proofing" 1080p @ max settings. Games are only going to use more memory, not less!
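    Rough math on why render targets alone start to bite at higher resolutions (a back-of-envelope sketch only; real VRAM use is dominated by textures and driver overhead, so treat these as lower bounds):

```python
# Back-of-envelope color-buffer math: width x height x bytes per pixel,
# times the number of buffered frames, times the MSAA sample count.
# This deliberately ignores textures, which are the real VRAM hogs.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3, msaa=1):
    """Approximate MB for the color buffers alone (triple buffering)."""
    return width * height * bytes_per_pixel * buffers * msaa / (1024 ** 2)

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: ~{framebuffer_mb(w, h, msaa=4):.0f} MB with 4x MSAA")
```

    The buffers roughly quadruple going from 1080p to 4K, and once high-resolution texture packs and AA are stacked on top, the 2GB-vs-8GB argument in this thread writes itself.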
     
  41. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    Every single game known to man has problems. :D

    Also, my favorite quote fits here:
    Quit jumping on examples as if they are topics! (a favorite forum mistake)

    Adding:
    Me personally, I don't really care if the game is good or bad, only that we have a way of using a great deal more of that video memory. Which also means other ways to better utilize it later. :D
    If we get out in front of this now, then when they start messing things up like they almost always do... we will already be on the road to fixing it. :)
     
    Mr. Fox likes this.
  42. DumbDumb

    DumbDumb Alienware !Wish money wasn't the problem.

    Reputations:
    1,583
    Messages:
    1,649
    Likes Received:
    259
    Trophy Points:
    101
    so who wants to ship me a set of 880s huh huh?
     
  43. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    I would, but I think I might be eyeing a Benz S500 instead. :D
     
    Optimistic Prime and TBoneSan like this.
  44. Arotished

    Arotished Notebook Evangelist

    Reputations:
    0
    Messages:
    481
    Likes Received:
    104
    Trophy Points:
    56
    Just got myself two 880Ms along with a 3940XM to replace the 680Ms and 3840QM in my R2. Looking forward to seeing the results and comparing them with my desktop 4770K/780 Ti SLI setup, which is running 4.5GHz and 1250/7800MHz.

    Has anyone else installed the 880M in their R2, and does anyone have tips for the INF file modification?
     
    TBoneSan likes this.
  45. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    I'm looking forward to seeing some benchmarks from you, fella. Congrats on the upgrade. I'm not too sure how to mod the INF for the 880, but I bet brother Johnksss and/or Mr. Fox know.

    Dare I ask what a pair of 880Ms set you back?
     
  46. Arotished

    Arotished Notebook Evangelist

    Reputations:
    0
    Messages:
    481
    Likes Received:
    104
    Trophy Points:
    56
    Thanks, I need to get the dual 330W setup up and running also.

    I paid 2,247 dollars for the twins and the XM, including express shipping.
     
  47. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    880M (2x) SLI is nothing in comparison to 780 Ti (2x) SLI. :D

    20nm Maxwell is supposed to deliver about 780/780 Ti performance on the mobile front, but we don't know that for certain.
     
  48. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    I really don't get how the numbers add up. It seems like mobile GPUs, 880M SLI in particular, are hitting 16k in 3DMark 11, with each card getting roughly 8k on its own. A single 780 Ti, which I have, hits 14k, but add another 780 Ti into the mix and you get 19.5k. It's not double the numbers as you would expect. Is it a problem with the benchmark? What is really going on here?
     
  49. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Adding another card (2x SLI) should never double performance. If a single 880M gets 8k in 3DMark, 880M SLI should not be getting 16k; it's more likely to be around 12k or 13k.

    Having two cards splits the workload, allowing one card to focus on part of the screen (the top?) and the second card on the other part (the bottom?). It does not assign both cards the same work. It's assigning different parts of the work to each card, and that is where the 'extra' performance boost comes from: the speed at which this work is completed. Because they're doing different things that lead to the same outcome, performance varies.

    Maybe I explained that wrong. It's not easy to explain. I'm clearly no expert.
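    Putting numbers on the scaling being discussed (a quick sanity check only, taking the quoted scores at face value as representative runs):

```python
# Dual-card speedup for the scores quoted in this thread:
# mobile 880M SLI (8k single -> 16k dual) vs desktop 780 Ti SLI
# (14k single -> 19.5k dual).

def sli_scaling(single, dual):
    """Return the dual-card speedup over a single card."""
    return dual / single

mobile = sli_scaling(8000, 16000)    # claimed 880M SLI figures
desktop = sli_scaling(14000, 19500)  # Splintah's 780 Ti figures

print(f"880M SLI: {mobile:.2f}x, 780 Ti SLI: {desktop:.2f}x")
```

    One likely factor in the desktop gap: the 3DMark 11 overall score folds in a CPU physics test that a second GPU doesn't accelerate, so the combined number can't double even with perfect GPU scaling, and the faster the cards are relative to the CPU, the more that drag shows.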
     
  50. woodzstack

    woodzstack Alezka Computers , Official Clevo reseller.

    Reputations:
    1,201
    Messages:
    3,495
    Likes Received:
    2,593
    Trophy Points:
    231
    In that example, if I had to guess, it would be the CPU being the limiter. On the mobile front the CPU is fast enough relative to the speed of the GPUs to allow nearly the maximum performance boost, while on the desktop front the ability of the CPU is the limit: either adding a 2nd card is too much for it to keep up with fully, or the GPUs are faster than the buses can keep up with, or simply just faster than the CPU, period.

    But who knows.

    We all know the CPU plays a heavy role in any SLI setup. Ideally, if you want two fast cards going nuts, you want 4.5GHz+... (for optimal results, at least)
     