The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Alienware 13 Pre-Release Speculation Thread

    Discussion in '2015+ Alienware 13 / 15 / 17' started by tinker_xp, Aug 8, 2014.

  1. BriS2k

    BriS2k Notebook Consultant

    Reputations:
    0
    Messages:
    135
    Likes Received:
    2
    Trophy Points:
    31
    man I must be going blind - didn't see the dropdown for each component -- thought they still employed that horrible slideshow-type configuration process
     
    reborn2003 and bumbo2 like this.
  2. bumbo2

    bumbo2 Notebook Deity

    Reputations:
    324
    Messages:
    1,612
    Likes Received:
    104
    Trophy Points:
    81
    Mine is still in preproduction. :) ETA: delivers by 12/16/2014. :hi2:
     
    reborn2003 likes this.
  3. bumbo2

    bumbo2 Notebook Deity

    Reputations:
    324
    Messages:
    1,612
    Likes Received:
    104
    Trophy Points:
    81
    Yes man. I hate this type of configuration page! :mad:
     
  4. FoHMike

    FoHMike Notebook Geek

    Reputations:
    16
    Messages:
    89
    Likes Received:
    25
    Trophy Points:
    16
    Yeah, their site definitely sucks.
     
  5. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    You spent $3500 on two 13" notebooks? Creating a weak botnet with laptops? :confused:
     
  6. FoHMike

    FoHMike Notebook Geek

    Reputations:
    16
    Messages:
    89
    Likes Received:
    25
    Trophy Points:
    16
    Since when do botnetters actually own the machines? ;)
     
  7. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    It was a joke.
     
  8. garache

    garache Notebook Enthusiast

    Reputations:
    31
    Messages:
    20
    Likes Received:
    5
    Trophy Points:
    6
    The i7 5600U is a Broadwell CPU that is not available yet.
     
  9. Poildur

    Poildur Newbie

    Reputations:
    0
    Messages:
    6
    Likes Received:
    0
    Trophy Points:
    5
    Will I get the performance of a 980 if I use it with the AMPLIFIER or not?

    What performance will I get?
     
  10. bumbo2

    bumbo2 Notebook Deity

    Reputations:
    324
    Messages:
    1,612
    Likes Received:
    104
    Trophy Points:
    81
    Desktop performance, probably. For games, the graphics power is what matters!
     
  11. Poildur

    Poildur Newbie

    Reputations:
    0
    Messages:
    6
    Likes Received:
    0
    Trophy Points:
    5
    Yes, but I heard the CPU will maybe bottleneck the GPU. Is that true?
     
  12. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    The graphics power matters more for GPU-intensive games, and CPU-bound games need proportionately more CPU, but both benefit from a good CPU and GPU. More games now and in the future will benefit from a quad-core design. If the CPU didn't matter we'd all still be using Pentium 4's lol.
     
    bumbo2 likes this.
  13. Poildur

    Poildur Newbie

    Reputations:
    0
    Messages:
    6
    Likes Received:
    0
    Trophy Points:
    5
    So, will the GPU perform at 100 percent or not?
     
  14. bumbo2

    bumbo2 Notebook Deity

    Reputations:
    324
    Messages:
    1,612
    Likes Received:
    104
    Trophy Points:
    81
    We don't know yet.
     
  15. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    There is no way in hell a GTX 980 desktop card will perform at 100% with an i5 dual-core mobile processor, LOL. Keep dreaming, boys.

    The 4710HQ bottlenecks the 980M, and that's barely reaching GTX 780 performance. Déjà vu... I said this before? :confused:
     
  16. FoHMike

    FoHMike Notebook Geek

    Reputations:
    16
    Messages:
    89
    Likes Received:
    25
    Trophy Points:
    16
    FYI though, at PAX, the 13 was playing Arkham Origins at 3K with the 980M perfectly smoothly.
     
  17. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I'm not suggesting it won't benefit from it. It just won't perform at its full capacity due to the bottleneck.
     
  18. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    Not sure why this is so hard to grasp either... you could use a killer desktop card with an anemic i5 ULV processor, but it's like trying to drive a giant water sprinkler with a tiny hose... yes, the giant sprinkler could potentially do more but it's underused with the tiny hose. There is an optimal sizing, where the hose and the sprinkler work together with no lost potential; beyond that it's just a waste of money. Same for the coupling of the i5 ULV with a desktop card. The 860m is not a bad card; makes me wonder what desktop card available today is the maximum the i5 ULV can feed, pushing the GPU to its limit so no processing is lost on either side?
     
  19. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,175
    Likes Received:
    17,888
    Trophy Points:
    931
    It would also require an entirely new motherboard.
     
  20. FoHMike

    FoHMike Notebook Geek

    Reputations:
    16
    Messages:
    89
    Likes Received:
    25
    Trophy Points:
    16
    I understand bottlenecking, I was just pointing out that it's not bad enough to prevent one from running AAA titles @ 3k.
     
  21. bumbo2

    bumbo2 Notebook Deity

    Reputations:
    324
    Messages:
    1,612
    Likes Received:
    104
    Trophy Points:
    81
    Guys, the Alienware 13 was not made or built for overclocking, to my understanding!!! So bottlenecking is not a big deal to me! I just want the portability.
     
  22. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    Point taken - the real test here is going to be when someone starts posting numbers of a fairly current game that truly recommends and uses a quad-core. To date, unless I have missed it, all the reviews I have seen show numbers from benchmarks quite obviously for GPU-bound games.

    It might very well support OC'ing, heck the M11x's did in some cases and that was with a far weaker CPU/GPU. Using ThrottleStop on an R2 with an i7 actually made it faster than an equivalent M11x-R3 with an i7 in terms of CPU, though the weaker video card evened things back out in favor of the R3.

    I have been reading that the initial Broadwell impressions on the current Lenovo Yoga are far from what everyone hoped, but it's still unclear how much is due to the software setup and throttling Lenovo put into place. If there are similar issues on the i7-Broadwell used in the AW 13, I might just opt to get the Black Friday sale price on even a current i5 AW 13 since it's not my workhorse laptop but my around-house quiet one and avoid any Broadwell throttling if that turns out to be an issue. The problem is Dellienware is likely to introduce the Broadwell alongside the 900-series GPU's.... though I have a strong suspicion we may just still get an 860m :mask:
     
  23. bumbo2

    bumbo2 Notebook Deity

    Reputations:
    324
    Messages:
    1,612
    Likes Received:
    104
    Trophy Points:
    81
    @Docsteel You got a point here! Totally agree on this: the real test here is going to be when someone starts posting numbers of a fairly current game that truly recommends and uses a quad-core. But in this case I'll let my M18 take care of that.
     
  24. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    HTWingNut already proved the 4710HQ is not ideal for the 980M (http://forum.notebookreview.com/gaming-software-graphics-cards/763237-gtx-980m-limited-cpu.html). The 980M is almost a GTX 780.

    It's impossible for any dual-core mobile CPU to adequately support a desktop GTX 980. It will be better than the 860M, but a bottleneck will exist.
     
    bumbo2 likes this.
  25. QUICKSORT

    QUICKSORT Notebook Evangelist

    Reputations:
    212
    Messages:
    686
    Likes Received:
    536
    Trophy Points:
    106
    Guys, you need to understand the concept of what the CPU will bottleneck.
    In most circumstances, the graphical effects that require CPU power are different from the graphical aspects that require GPU performance.
    Now what you have to do is turn down the effects that require CPU performance and turn up the effects that require GPU performance.

    Now what uses the CPU? Effects such as shadows and dynamic lighting, and physics (unless it's Nvidia PhysX, in which case it's done by the GPU instead of the CPU).
    As for the GPU, it is responsible for resolution, textures, anti-aliasing, filtering, ambient occlusion, tessellation, and TressFX (in games such as Tomb Raider; that's also the reason they used Tomb Raider as an example benchmark for the Alienware 13, because it's VERY demanding on the GPU).

    HOWEVER! There are games that, even if you put them on the lowest settings, still require CPU performance due to the game mechanics. A good example is managing all the players online in an MMO.

    So there are games that will run VERY well if the video settings are optimized correctly, but then again there will be games that run VERY badly no matter what GPU you put in the Graphics Amplifier.
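    To make that split concrete, a rough grouping along these lines might look like the following sketch (the setting names and which resource they load are assumed here purely for illustration; real games vary):

    # Hypothetical grouping of the settings named above by the resource they
    # mostly load. Treat it as a starting point, not a rule.
    CPU_HEAVY = ["shadows", "dynamic_lighting", "physics"]   # non-PhysX physics
    GPU_HEAVY = ["resolution", "textures", "anti_aliasing", "filtering",
                 "ambient_occlusion", "tessellation", "tressfx"]

    def tune_for_weak_cpu(settings):
        """Lower the CPU-heavy settings and leave the GPU-heavy ones alone,
        which is the tuning strategy described in the post above."""
        tuned = dict(settings)
        for name in CPU_HEAVY:
            if name in tuned:
                tuned[name] = "low"
        return tuned

    # Example: shadows and physics drop to "low"; textures stay at "ultra".
    print(tune_for_weak_cpu({"shadows": "high", "physics": "high", "textures": "ultra"}))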
     
  26. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    Yep - that's why it's important to look at GPU-bound game results alongside CPU-bound results when evaluating a system. Each game is a point along a spectrum between CPU-bound-ness and GPU-bound-ness... I suspect that even for current and upcoming games that would take full advantage of a quad-core, the ULV dual will in the majority of cases be sufficient, but it will at times bottleneck the GPU, since all the lighting, physics, etc. have to take place in coordination with GPU rendering. If the CPU is slow... then the GPU will often be sitting idle a percentage of the time.
     
  27. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Ugh, not this again. While I agree a dual core ULV is going to bottleneck the crap out of a desktop 980, the part in bold is a significant exaggeration and unwarranted at best.

     
    Robbo99999 likes this.
  28. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    You realize there will always be a bottleneck in computers, yes?
    I think it's more about looking at future games than at present games. In current games, the bottleneck is not important (obviously). However, in the future, it may be problematic.
     
    bumbo2, Docsteel and xxpmrong like this.
  29. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I do not believe saying something is "not ideal" warrants the title of exaggeration. It's not like I said, "The 4710HQ horrendously bottlenecked the GTX 980M, so much so, you only get 10 FPS in BF4." That would be an exaggeration of the bottleneck. :D

    Yeah, exactly. That's my concern. The games coming out in fall of 2015 are going to be hugely CPU intensive. Don't even get me started on the 2GB of VRAM the 860M offers... That's yet another bottleneck (or soon will be). :rolleyes: There's an entire thread dedicated to VRAM discussion in the Gaming and Graphics board.

    In that regard, the Alienware 13 is obsolete before it's even sold, especially if you account for the Amplifier.
     
  30. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I am simply responding to the part about the 4710HQ being "not ideal" for a 980M. When future games do become demanding, you will run out of GPU power long before you run out of CPU power. If you're referring to CPU-bound games, then the 4710HQ is "not ideal" under all circumstances, not just when paired with a 980M.

    I simply disagree with the assessment that the 4710HQ will present any issues at all for a single 980M, is all.

    Look at these test results for desktop chips, and see how little difference going from a quad core i5 without HT to a hex core i7 with HT makes in most games.
     
  31. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    Thanks for posting the link, n=1. Maybe I am wrong, but aren't all those games in the tests fairly well known to be GPU-bound games? If so, your point about there not being _as big_ a spread is true based on those results, but I wish we could get a similar list of quad vs dual core for current games that claim a benefit from going quad.... anyone know of a good one?

    I came across a somewhat old discussion on dual vs quad, with hyperthreading... it confirms something I have been suspecting, that dual cores supporting four threads might give a good bit of the benefit that a quad (granted, with 8 threads) would provide for a lot of games that are probably coded to look for four threads (on a dual core, or 4 out of 8 on a quad). Some games coming up will be optimal for, say, eight threads (quad-core), but very few will utilize hexa-cores with 12 threads or, heaven forbid, octa-cores with 16 threads. While not optimal, the dual-core with four threads probably will be sufficient for a couple more years... about the life of the AW 13 for this revision.

    The Tech Buyer's Guru - Dual-Core, Quad-Core, and Hyperthreading Benchmark Analysis

    I'm still in favor of waiting for an i7 though, dual core or not, simply because the extra cache has been shown to definitely have an impact in games, not to mention the speed increase. Not a huge increase, mind you, but enough of one to be worth the difference imho.

    This is a link from the article above: http://techbuyersguru.com/haswellgaming.php

    Note this statement near the end of page 3: "With a balanced gaming rig, the GPU usage should never fall below CPU usage, and ideally should always be above 90%."
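    As a quick illustration, that rule of thumb boils down to a check like this sketch (the 90% figure is the article's; the function name is just for illustration):

    def looks_balanced(cpu_util, gpu_util):
        """Rule of thumb quoted above: in a balanced gaming rig the GPU usage
        should never fall below the CPU usage, and ideally stays above 90%."""
        return gpu_util >= cpu_util and gpu_util >= 90.0

    print(looks_balanced(70, 95))  # True: GPU is the busier part and near its limit
    print(looks_balanced(95, 60))  # False: CPU is pegged while the GPU waits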
     
  32. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Lost Planet 2 is known to be very CPU intensive and the results do reflect that. Crysis 3 should've been another poster child for more cores + more speed = better performance, but it doesn't really show up in that chart for some reason.

    All the desktop i3 CPUs are dual core with HT, and the Pentiums are dual core without HT. Luckily that review has representative chips from each segment, so you can do a direct comparison of how a dual core without HT (Pentium) compares to a dual core with HT (i3), to a quad core without HT (i5) and with HT (i7), and all the way to a hex core i7 with HT.

    I was in the "4710HQ is not enough for 980M" camp previously, and spread a lot of misinformation that way. So I'm simply doing what I can to help clean up the mess that I had a part in, which is why you may see me getting worked up about this particular issue. :)

    And just so nobody misunderstands me, I definitely agree even a dual core desktop chip with HT simply won't be enough going forward, let alone a dual core ULV.
     
  33. xxpmrong

    xxpmrong Notebook Consultant

    Reputations:
    0
    Messages:
    164
    Likes Received:
    30
    Trophy Points:
    41
    How do I know if the games that I play are CPU or GPU bound?

    I usually find myself wasting time on simulation / strategy games like Civilization / UFO / Total War / Sim City / War Games / NBA 2k15 / World of Tanks

    I am not too much of an FPS guy.
     
  34. nimbot3

    nimbot3 Notebook Enthusiast

    Reputations:
    0
    Messages:
    49
    Likes Received:
    4
    Trophy Points:
    16
    So it's obvious that the ULV will bottleneck any higher-end GPU, e.g. the GTX 980 and so on, but what about the GTX 860M that comes with the laptop? Any possible bottlenecking there?

    Also, just to confirm, if I understand this right, the CPU bottleneck can be somewhat averted if you set graphics settings higher and put more processing strain on the GPU instead?

    I'm currently looking for a laptop for on the go before this January and this caught my attention. Graphics amplifier is something that I would get way later so I'm curious to see whether the GTX 860M will also be bottlenecked by the CPU as well. Especially in games like Battlefield 4 and so on. It's either this or I'm going for that MSI GS60 which is more expensive so I'm banking on this.
     
  35. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    RTS and simulation games tend to be very CPU bound. One way to check whether you're CPU or GPU bound is to monitor the utilization of each while playing. If the CPU is constantly being pegged at 95% while the GPU is strolling along at 60% then it's a very good sign you're CPU bound.
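    A minimal way to watch both utilizations while playing, as a sketch, assuming Python with the third-party psutil package installed and an NVIDIA GPU visible to nvidia-smi:

    import subprocess

    import psutil  # third-party: pip install psutil

    def gpu_utilization():
        """Current GPU load in percent, read from nvidia-smi."""
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"]
        )
        return float(out.decode().strip().splitlines()[0])

    # Sample both once per second while the game runs. A CPU pegged near 100%
    # while the GPU sits well below that is the CPU-bound pattern described above.
    while True:
        cpu = psutil.cpu_percent(interval=1.0)  # averaged over the last second
        gpu = gpu_utilization()
        print(f"CPU {cpu:5.1f}%  GPU {gpu:5.1f}%")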

     
    xxpmrong likes this.
  36. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    You are only lying to yourself if you believe the i5-4210U processor will not bottleneck the GTX 980. I'd be willing to put money on the fact that, if a good reviewer (like HTWingNut) were to get his/her hands on an AW 13 and Amplifier with the GTX 980, it would be quite obvious that there is a bottleneck. How dramatic of one? I don't know. But it will exist.

    • Fact: The 4710HQ is more powerful than any dual-core ULV processor offered in the AW 13.
    • Fact: The 4710HQ bottlenecks the GTX 980M (equal to a GTX 770~780) in some games.
    • Fact: The GTX 980 is more powerful than a GTX 980M or GTX 770/780 by a good margin.
    If a better processor cannot fully support a weaker GPU, how can one expect a weaker CPU to support a better GPU?
     
  37. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I don't believe I ever said the 4210U wouldn't bottleneck a 980. In fact I said the exact opposite, that it would bottleneck the crap out of a 980.

    And for the love of god please stop saying the 4710HQ "bottlenecks" the 980M. In my book something that runs over 100 FPS is not a bottleneck. Hell for any sufficiently non-demanding game, the CPU will always "bottleneck" the GPU. Let's say you go from 300 to 400 FPS when overclocking from 3.5 to 4.5GHz. Would you call that a "bottleneck"? Do you see what I'm getting at?
     
    bumbo2 likes this.
  38. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    I stand corrected, based on the other games in the list I assumed it too was more GPU-bound, thanks for pointing that out.

    I'd add too that MMO's also fall into the CPU-bound camp a good deal... this is why I am torn on the AW 13... or just going for an AW 14. If Dellienware would just state that the AW 14 is "dead" I'd consider moving on one, but without a word I am left hanging to see what they do in January. For all we know it could be back with a Broadwell chip and 900-series GPU... but I doubt it :(
     
  39. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    My posts are all saying that the 4210U will bottleneck a GTX 980. So, what's the point of this discussion? The 4710HQ isn't even in the Alienware 13. :confused:

    I used it as an example to prove if a better CPU has trouble with a weaker GPU, the weaker CPU will definitely have trouble with a better GPU. I'm not debating the 4710HQ here. I could have chosen any other CPU for my example, but I just happened to choose the 4710HQ because it was recently reviewed.
     
  40. FoHMike

    FoHMike Notebook Geek

    Reputations:
    16
    Messages:
    89
    Likes Received:
    25
    Trophy Points:
    16
    AW14 isn't even on their website anymore?
     
  41. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    Nope - it disappeared the minute the AW 13 went on sale unfortunately.

    On the question of bottle-necking - both points in the thread are true - you could get 100 fps or so, and be bottle-necking but for the majority of people it would be a moot point. I fall into the camp that is more than happy with 60 fps, but I could see people getting used to higher frame rates and the question of bottle-necking then being an issue.

    SO aggravated with Dellienware for not putting out official word on whether the AW 14 is ended or not... I am half-tempted to buy one as I need a more general machine on the road, but the issues with the AW 17/18's these days give me pause about even considering one of them, not to mention the greater size (this is an issue for some of us; the thought of trying to use an AW 17 in coach on a plane just doesn't work. I guess AW 17/18 owners must all fly first class :))
     
  42. FoHMike

    FoHMike Notebook Geek

    Reputations:
    16
    Messages:
    89
    Likes Received:
    25
    Trophy Points:
    16
    Dell FUBAR'ed my second order as well, and again, canceled it rather than trying to fix it. The '3rd time's the charm' order now has a ship date of Jan 14th, and they "can't" change that despite my previous 2 orders showing a ship date of Dec 16th (both canceled by Dell due to mistakes by Dell). I told them never mind, that delay can eat it. Gonna wait for Broadwell and see what Aorus does.

    I wish Dell had a way to order that didn't involve the internet or the phone lol. Both ways resulted in my order being completely wrong.
     
  43. Docsteel

    Docsteel Vast Alien Conspiracy

    Reputations:
    776
    Messages:
    2,147
    Likes Received:
    911
    Trophy Points:
    131
    Precisely my plan... by then we should know where everything sits for the lineup. Man, I so wanted a laptop under the tree this year and was picturing a quad-core AW 13.... sigh :(

    I was interested in the Aorus too after Dell dropped the ball.. but so many reports of heat, noise and build issues... ugh....

    I've been a Dellienware loyalist since 2009, but these moves now are really making me start to think Clevo/Sager if Dellienware can't pull out a good rabbit in January..
     
  44. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    You've also said on more than one occasion the 4710HQ will bottleneck the 980M. I'm saying you should stop using that example because it's a bad one, and the 4710HQ does not bottleneck the 980M, not in the normal sense of the word anyway.
     
  45. CapitalPrince

    CapitalPrince Notebook Guru

    Reputations:
    16
    Messages:
    51
    Likes Received:
    8
    Trophy Points:
    16
    I was going to buy the Alienware 13, but then, checking on notebookcheck, realized that the i5-4210U even bottlenecks the 840M. The AW 13 will probably give less than desirable results.
     
  46. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Just relaying HTWingNut's findings.

    Even if I stopped using that example, it wouldn't matter to this discussion. The 4210U will always bottleneck the 980, lol. So, you're arguing a moot point.
     
  47. FoHMike

    FoHMike Notebook Geek

    Reputations:
    16
    Messages:
    89
    Likes Received:
    25
    Trophy Points:
    16
    Aorus - Linus says it runs cool? But I would wait for the 9xx for sure. And, imo, there's no getting around a certain heat load with a lot of power in a small space. As long as it's cool when doing desktop work, that's what matters to me.

    Clevo/sager - not slim enough for me, and holy jebus they're ugly.

    Dell - dual core ULV whaaaat?! /wrists

    I agree Doc, gonna have to keep waiting.
     
  48. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    When have I ever said the 4210U wouldn't bottleneck the 980? :confused: All I've ever been saying repeatedly is to stop using the 4710HQ as a bottlenecking example, because it's a terrible one. A much better example would be some of the older MSI models where they paired a 7970M with an AMD A10. Now that was a real bottleneck in every sense of the word.
     
  49. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Talk to HT.
     
  50. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    *sigh* Multiple people have chimed in later in that thread to point out it's not a real bottleneck, and I even did some testing at the very end to see how CPU speed would affect FPS in a simulated 980M SLI machine. I'm just gonna put this quote here and leave it at that.

     