The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    New Z model with Intel Core i5 CPU

    Discussion in 'VAIO / Sony' started by exetlaios, Jan 2, 2010.

  1. lpx

    lpx Notebook Consultant

    Reputations:
    3
    Messages:
    110
    Likes Received:
    0
    Trophy Points:
    30
    I presume that the 128 GB is 2 x 64 GB rather than the 256 GB's 4 x 64 GB, so yes, it'd be slower.
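    For intuition on why the drive count matters: in RAID 0, consecutive stripes alternate across the member drives, so a sequential transfer fans out over all of them and peak streaming speed scales roughly with the number of drives. A minimal sketch of that address mapping in Python (the 128 KB stripe size is a made-up illustrative value, not Sony's actual configuration):

        def stripe_drive(byte_offset, stripe_size, n_drives):
            """Which member drive serves a given byte in a simplified RAID 0 model."""
            return (byte_offset // stripe_size) % n_drives

        # A 1 MB sequential read with 128 KB stripes touches every member drive:
        KB = 1024
        for n in (2, 4):  # 2 x 64 GB vs 4 x 64 GB arrays
            drives = {stripe_drive(off, 128 * KB, n) for off in range(0, 1024 * KB, 128 * KB)}
            print(f"{n}-drive array: read spread over drives {sorted(drives)}")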
     
  2. jon_lui

    jon_lui Notebook Enthusiast

    Reputations:
    0
    Messages:
    48
    Likes Received:
    0
    Trophy Points:
    15
    That's not true; voltage is the most important factor in determining the power of a processor. They are related by the equation P = C·F·V², where P is power, C is capacitance, F is frequency, and V is voltage. While power scales linearly with frequency, it scales with the square of the voltage. So to increase battery life or lower temperature, it's best to choose a processor with the lowest voltage. That is also the reason behind the long battery life of CULV (consumer ultra-low voltage) processors.

    Reference
    http://www.tomshardware.com/forum/240001-29-howto-overclock-quads-duals-guide
    Scroll down or search for: "The title of the document is 'Intel® Core™2 Extreme Quad-Core Processor QX6700Δ and Intel® Core™2 Quad Processor Q6000Δ Sequence Thermal and Mechanical Design Guidelines.' It's dated Jan 2007 and has an official Intel Document Number of 315594-002. I took a screenshot of section 4.1 on page 31 (where the above quote came from)."
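    For a rough sense of how the two variables trade off, here is a small numeric sketch of P = C·F·V² in Python (the capacitance, frequencies, and voltages below are made-up illustrative values, not real chip specs):

        def dyn_power(c, f_ghz, v):
            """Dynamic power P = C * F * V**2 (arbitrary units)."""
            return c * f_ghz * v ** 2

        base = dyn_power(1.0, 2.4, 1.10)  # hypothetical standard-voltage part
        culv = dyn_power(1.0, 1.3, 0.95)  # hypothetical CULV part: lower F and lower V

        # Halving F roughly halves P; dropping V from 1.10 to 0.95 scales P
        # by (0.95 / 1.10) ** 2, i.e. to about 75% -- together, ~40% of baseline.
        print(f"CULV part draws ~{culv / base:.0%} of the standard part's dynamic power")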
     
  3. bluehaze013

    bluehaze013 Notebook Evangelist

    Reputations:
    12
    Messages:
    371
    Likes Received:
    0
    Trophy Points:
    30
    And that couldn't possibly have anything to do with the CULV running at close to half the operating frequency of its standard-voltage counterpart, now could it? ;) They go hand in hand, but the operating frequency has a far bigger impact. I know from firsthand experience with overclocking that you can leave your processor at stock frequency and increase the voltage with very minimal increases in temperature. However, raise the operating frequency of the processor and things quickly get out of hand. The higher the heat, the higher the wattage, the higher the current draw.

    Give it a try yourself and see :)
     
  4. abhiku

    abhiku Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    0
    Trophy Points:
    15
    How much did it cost for this config?
     
  5. shodanjr_gr

    shodanjr_gr Notebook Enthusiast

    Reputations:
    0
    Messages:
    40
    Likes Received:
    0
    Trophy Points:
    15
    I just ordered one of the preconfigured Z-series (model number VPCZ112GX/S) with the Core i5 CPU, 4 GB of RAM, and the 128 GB SSD, along with a refurb sleeve case. Total cost was $2017, including expedited shipping, with a $134 discount through Sony's educational program.

    Estimated shipping date is March 1st on my order page!
     
  6. arth1

    arth1 a҉r҉t҉h

    Reputations:
    418
    Messages:
    1,910
    Likes Received:
    0
    Trophy Points:
    55
    No, you didn't ask for the difference between two identified chips. You only mentioned the processor lines, i5 and i7. From that, there's no way for anyone to deduce which two chips you wanted us to compare, and any meaningful answer has to compare the lines, for which the answer is valid.

    If you wanted people to do the entire job for you, and find out just which chips to compare, and then compare the specs and digest the answer for you, you are in for a surprise. Few will do that. Some work from the person who asks is generally expected.

    Anyhow, do you really think the tone of your last few posts will make it more or less likely that you'll get good answers?
     
  7. mercer2

    mercer2 Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    I just called Sony, and they told me that my ship date was March 19th; I placed my order for a Signature Edition two weeks ago.

    The wait is killing me.
     
  8. roweraay

    roweraay Notebook Deity

    Reputations:
    59
    Messages:
    837
    Likes Received:
    0
    Trophy Points:
    30
    To be quite candid, your tone is pretty belligerent for a new guy asking questions and seeking answers. If I were you, I would adopt a friendlier attitude, which in turn would make people more amenable to spending their time answering your questions. ;)
     
  9. Will17869

    Will17869 Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5

    All else being equal, a higher-resolution display should always be 'better' than a lower-resolution screen.
    By 'better' I mean a clearer, higher-quality picture while having the appropriate text size when using the correct DPI level. Any application that regularly uses 'zoom-to-fit' (photos, PDFs, PowerPoints, ...) will give a better picture on a high-res display, simply because there are more dots available to draw the screen, with one possible exception being low-resolution video zoomed to full screen. A full-screen image is the same physical size whether the LCD is 1600x900 or 1920x1080, but the higher resolution means a finer dot pitch, allowing for smoother fonts etc.

    A 7-point font is reasonably legible when printed on paper, because printers can resolve several hundred DPI. However, on a typical non-Z screen of ~96 DPI, a 7-point font will be made up of so few dots that the characters are not clear.

    But all else is not equal, so there are compromises. Specifically:
    - Scaling of bitmapped graphics (icons etc.) to the appropriate DPI level may degrade quality.
    - Some applications might not use Windows DPI settings, requiring manual adjustment to get text large enough to read.
    - The brighter backlight on the 1920x1080 screen appears to impact battery life.

    A fully DPI-aware operating system, applications, and web browser would make it a much easier decision. But as it stands, both have pros and cons.
    - The 1600x900 display has a better chance of being readable without DPI scaling, but won't give as clear a full-screen picture.
    There are other factors too, for example screen quality and bit depth, which are hard to gauge without actually seeing the display.
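    To put numbers on the dot-pitch point, screen DPI follows from the pixel dimensions and the diagonal size. A quick check in Python, assuming the Z's 13.1-inch diagonal:

        import math

        def dpi(width_px, height_px, diagonal_in):
            """Pixels per inch: diagonal pixel count divided by diagonal inches."""
            return math.hypot(width_px, height_px) / diagonal_in

        for w, h in [(1600, 900), (1920, 1080)]:
            print(f'{w}x{h} at 13.1": {dpi(w, h, 13.1):.0f} DPI')
        # -> roughly 140 and 168 DPI, versus ~96 DPI on a typical panel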
     
  10. bluehaze013

    bluehaze013 Notebook Evangelist

    Reputations:
    12
    Messages:
    371
    Likes Received:
    0
    Trophy Points:
    30
    Show me one of those and I'll gladly take a 1080p screen on even a 10" laptop, LOL. Problem is, it doesn't exist currently other than in cellphones with pinch-to-zoom features etc. :( For practical everyday use, DPI scaling in Windows is rather poor. Not all webpages adhere to the standards required for DPI scaling to work, nor do all applications, etc. It becomes quite frustrating after a while, having to constantly adjust DPI settings, change fonts, etc. whenever you want to visit different websites or run different applications, and they still don't look right in the end.

    It is much more ideal to just go with the lower-resolution screen, when what it really boils down to is being able to say you have a 1920x1080 screen on your 13". No one can really tell the difference between a 1600x900 and a 1920x1080 by looking at the screen, other than the real estate and the size of text; the dots are already so close together you'd need a magnifying glass to see any separation. The same can be said for even lower-resolution quality screens, such as in the MacBook Pros.

    The way I see it, there are essentially two user bases as far as resolution is concerned: the business users who need more screen real estate for running photo/video editing, spreadsheets in dual windows, etc., and the average user who needs the larger text to make web pages readable.

    You either get one or the other; you can't have both at the present time, unfortunately, so buyers would be wise to choose accordingly rather than be wowed by the numbers. IMO
     
  11. Sunfox

    Sunfox Notebook Deity

    Reputations:
    29
    Messages:
    738
    Likes Received:
    16
    Trophy Points:
    31
    If Microsoft had done a proper job "fixing" DPI-related issues in Windows 7, we wouldn't be having this conversation. :)

    I read a long article from the W7 dev team explaining WHY they didn't change things to be ideal. Kind of an annoying read because they know exactly what has to be done, but haven't done it due to some misplaced interest in backwards compatibility and concern for sloppy programmers that wouldn't know how to use the new settings.

    In my opinion, the future will hold:

    * Monitors will all report their DPI value to the system

    * System will automatically adjust settings so that a 1 inch square in the system equals 1 inch as shown on the screen - invisible from the user

    * Separate preference settings for font and element size will allow users to customize items to their preference, so folks that like tiny fonts and icons can have it, while those who require large fonts can have that too

    * Graphics will switch to a combination of high-resolution bitmaps that resize smoothly (so that you always shrink something down, never try to enlarge it) and vector artwork that can be shown at any size

    * Programs will be developed with flexible interfaces that don't break when a special needs person requires large menus. Microsoft is kind of there with Office 2007 - it's one of the few apps that scales very nicely - although I hate it for other reasons... :)
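    The DPI-reporting idea in the list above boils down to simple arithmetic: if the system knows the true DPI, it can keep physical sizes constant. A minimal sketch in Python (a typographic point is 1/72 inch; 140 and 168 DPI are the approximate values for the two Z panels):

        def pt_to_px(points, reported_dpi):
            """Convert a point size to pixels so physical size stays constant."""
            return round(points * reported_dpi / 72)

        for dpi in (96, 140, 168):
            print(f"10 pt font at {dpi} DPI -> {pt_to_px(10, dpi)} px")
        # 13, 19, and 23 px -- the same ~0.14-inch-tall glyphs on every screen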
     
  12. Chirality

    Chirality Notebook Consultant

    Reputations:
    62
    Messages:
    245
    Likes Received:
    0
    Trophy Points:
    30
    I don't see how you can ever do DPI scaling perfectly. DPI scaling only works for vector graphics. For anything bitmapped, DPI scaling is just upsampling/downsampling, and the outcome is the same as if you had used a non-native resolution. And it's not realistic to expect an all-vector operating system. So much will remain bitmapped.
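    A concrete way to see the point about bitmaps: scaling one is just resampling, as in this sketch using Pillow (the library and the 32x32 icon file are assumptions for illustration):

        from PIL import Image  # assumes Pillow is installed: pip install pillow

        icon = Image.open("icon_32.png")               # hypothetical 32x32 bitmap icon
        scaled = icon.resize((44, 44), Image.BICUBIC)  # 137.5% DPI scaling = resampling
        scaled.save("icon_scaled.png")                 # softer than a native 44x44 asset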
     
  13. SurferJon

    SurferJon Notebook Evangelist

    Reputations:
    6
    Messages:
    620
    Likes Received:
    18
    Trophy Points:
    31
    I'm a little confused with what HD is and screen resolutions.

    My old, stolen laptop was a Toshiba U205 with a screen resolution of 1280x800. The laptop I'm borrowing right now is a Sony NW with a 1366x768 resolution. This laptop has a Blu-ray drive, obviously meaning the screen is HD. So does that mean my old Toshiba was almost HD too, since the resolutions were almost the same? Or am I missing something here? Everything I've read in this thread seems to suggest HD is a high screen resolution.

    Secondly, is there any way I could somehow simulate the 1920x1080 resolution? I have a 31.5" HD TV and this Sony NW. Is there a certain way of setting the resolution on the TV, and sitting a certain distance from it, to sort of see what the larger Z resolution would be like? I called the Sony Style store in Costa Mesa and they said they won't get the Zs in until the middle of March (doy), but if I can figure out which resolution I want earlier, it'll mean I can get my laptop earlier and return this one to my brother, who also needs it (though I need it more, since I'm in college).
     
  14. bluehaze013

    bluehaze013 Notebook Evangelist

    Reputations:
    12
    Messages:
    371
    Likes Received:
    0
    Trophy Points:
    30
    HD is just marketing speak for higher resolution, designed to sell TVs. There were "HD"-class monitors for computers long before they existed for TVs. 720p HD is 1280x720; 1080p HD is 1920x1080. In a TV, higher resolution is always better, because TVs have proper scalers in them.
     
  15. SurferJon

    SurferJon Notebook Evangelist

    Reputations:
    6
    Messages:
    620
    Likes Received:
    18
    Trophy Points:
    31
    So my old laptop with a 1280x800 resolution WAS a 720 HD screen? o_O I'm confused...
     
  16. irwinman

    irwinman Newbie

    Reputations:
    7
    Messages:
    9
    Likes Received:
    0
    Trophy Points:
    5
    Any display with at least 720 lines of vertical resolution can technically be considered "HD." So even your old 15" 1024 x 768 LCD monitor from a decade ago can be tagged with today's "HD" label :eek2:. Full HD is any display with at least 1080p (1920 x 1080) resolution [see: wiki].
     
    Last edited by a moderator: Jan 29, 2015
  17. SurferJon

    SurferJon Notebook Evangelist

    Reputations:
    6
    Messages:
    620
    Likes Received:
    18
    Trophy Points:
    31
    My HD TV right now is showing Dr. Phil. It is super-super clear. Why do my other 13-inch TVs show a less detailed image? (Rhetorical.) Are they 640x480 or something?
     
  18. bluehaze013

    bluehaze013 Notebook Evangelist

    Reputations:
    12
    Messages:
    371
    Likes Received:
    0
    Trophy Points:
    30
    Yes. Anything capable of displaying 1280x720 would be considered "720p HD"; anything capable of displaying 1920x1080 would be considered "1080p HD." The label exists for the exact purpose you are experiencing, LOL: to confuse potential buyers into thinking it's some new technology, aka the next greatest thing, so people will buy it, even though things had been "HD" long before the label came into existence. :D
     
  19. yellowfrizbee

    yellowfrizbee Notebook Consultant

    Reputations:
    0
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    30
    It's true. I've by FAR beaten everyone in this thread at asking dumb novice questions, but everyone on here has answered them very politely and patiently. :)

    If you are looking for help, you have to remember nobody on here is obligated to answer anything. It would be nice of you to at least show appreciation and be grateful that they are even taking the time to answer you. Which reminds me: thanks to everyone that's been patient with me. I've learned A LOT :cool:
     
  20. arth1

    arth1 a҉r҉t҉h

    Reputations:
    418
    Messages:
    1,910
    Likes Received:
    0
    Trophy Points:
    55
    (more sensible stuff snipped for brevity)

    It will have to. Because as resolution goes towards infinity, the concept of a "pixel" will no longer be relevant to end users. Bitmaps just won't work. If you have a "gigaquad" (to borrow terminology from Star Trek) of memory to store a picture in, it doesn't make sense to store it as a bitmap, but as an algorithm that re-creates the desired image no matter what your resolution is.
    Fonts already do that -- the outline algorithm is stored rather than the bitmap, as in the old days. This can and will happen for pictures too. And indeed, JPEG and MPEG compression is the first feeble step on the way, even though still fettered by the "pixel" resolution.

    While on the subject of prediction, the RGB tyranny will also go away, eventually. RGB is too limited. At first, I expect that we'll be able to adjust the colour of each subpixel. That will blow regular RGB out of the water, and even xvYCC will no longer be the limit. Then, who knows -- as pixels go smaller, there won't be a need for subpixels -- all pixels can be colour-changing subpixels, if you like, and not a fixed size, because what a "pixel" is will vary between devices.
     
  21. bluehaze013

    bluehaze013 Notebook Evangelist

    Reputations:
    12
    Messages:
    371
    Likes Received:
    0
    Trophy Points:
    30
    That is 480p, LOL: 640x480 resolution, or 720x480 if widescreen, and it can be interlaced, which is really only half the resolution. Most likely, if it's a cheap 13", it's more similar to a 320x240 image scaled up to 640x480.
     
  22. bluehaze013

    bluehaze013 Notebook Evangelist

    Reputations:
    12
    Messages:
    371
    Likes Received:
    0
    Trophy Points:
    30
    My prediction would be that hardware scalers will be implemented in the monitors themselves, independent of the OS, much the same as TVs do. I think it is the only way of doing it properly, because it is 100% independent of the OS, so there is no chance for anything to not scale properly. There will always be glitches with OS-dependent scaling, just due to human error.
     
  23. arth1

    arth1 a҉r҉t҉h

    Reputations:
    418
    Messages:
    1,910
    Likes Received:
    0
    Trophy Points:
    55
    Why? Fonts have already gone vector. So have widgets. SGI used vectors for icons already in the early 90s. Images and video? Bitmaps are on the way out there too -- the "lossy" formats already play with this, using float percentages for defining areas instead of pixels.

    So yes, I expect that pixels go the way of the dodo for end-user applications. Above the driver level, they will no longer be interesting once the resolutions are high enough. Yes, it will take years, and probably decades. But the conversion has already started.
     
  24. yellowfrizbee

    yellowfrizbee Notebook Consultant

    Reputations:
    0
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    30
    @sanfranguy06: Answering your question to the best of my abilities. I hope this helps in your decision; I'm sorry I can't give you much more info than this.

    Nobody has the computers yet, so we can only speculate. We will surely run tests of our own once we get the computers, but right now all we have to go on is what limited information we have on the internet about the i7-620M and the i5-540M.

    The only thing I can say is the i7 will be faster (by how much, we don't know), but the i5 will most likely run cooler and give more battery life (once again, by how much, we can only speculate). The i7-620M can Turbo Boost to 3.33 GHz, the i5-540M to 3.06 GHz. Another difference that could help in your decision might be the extra 1 MB of cache that the i7-620M has over the i5-540M. Hope that helped in any way :)
     
  25. sanfranguy06

    sanfranguy06 Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    THANK YOU FRIZBEE!!

    I actually went on to preorder with the i5 after talking to a buddy who's a hardware engineering architect at Intel! I told him that in the most taxing situations I'd need to run Latin hypercube simulations in MATLAB and Excel, and he told me that the difference would be imperceptible for the most part.

    Thanks for the help and hope you're happy with whatever you end up choosing.
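    For anyone curious, Latin hypercube sampling just stratifies each input dimension so that n samples cover all n strata in every dimension, and the CPU then grinds through the model once per sample. A minimal sketch in Python/NumPy (the poster used MATLAB and Excel; this is only meant to show the shape of the workload):

        import numpy as np

        def latin_hypercube(n_samples, n_dims, rng=None):
            """Basic Latin hypercube sample on the unit hypercube [0, 1)."""
            rng = np.random.default_rng() if rng is None else rng
            # one random point inside each of n_samples equal-width strata, per dimension
            u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
            # shuffle the strata independently in each dimension
            for d in range(n_dims):
                rng.shuffle(u[:, d])
            return u

        samples = latin_hypercube(1000, 5)  # 1000 runs over a 5-input model
        print(samples.shape, samples.min(), samples.max())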
     
  26. yellowfrizbee

    yellowfrizbee Notebook Consultant

    Reputations:
    0
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    30
    I'm glad I could help! Yeah, for your case and what you're doing, going for the i5-540M will probably save you cash; it seems like the i5 will give you plenty of performance. Hope you like your decision.

    Next time frustration hits you, though, it's better if you walk away from the screen for a little bit and then come back. It will allow you to calm down, and thus we can avoid the unneeded exchange of rude words. I'm sure the mods would appreciate it! ;)
     
  27. arth1

    arth1 a҉r҉t҉h

    Reputations:
    418
    Messages:
    1,910
    Likes Received:
    0
    Trophy Points:
    55
    Don't forget that Z is also available with an i5-520M.
     
  28. SurferJon

    SurferJon Notebook Evangelist

    Reputations:
    6
    Messages:
    620
    Likes Received:
    18
    Trophy Points:
    31
    *is the stupid consumer*

    Isn't 3.33 GHz way faster than 3.06? Say I've got 100 Pokemon card scans and run them through a batch action in Photoshop (reduce the size, correct the colors, save a copy here, reduce the size again, save it as a GIF there, etc.). Wouldn't the Z's i7 finish those way faster than the i5? Or if I'm editing an HD video?
     
  29. yellowfrizbee

    yellowfrizbee Notebook Consultant

    Reputations:
    0
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    30
    ^@SurferJon: High five! I'm the stupid consumer too :p! I've been thinking the same thing: "Geez, 3.33 GHz seems much better than 3.06 GHz!" I've learned a lot and am still learning, though. Then again, when it comes to processors, I usually think in gaming terms.

    Ah, very valid point. I don't know what made me assume he was only deciding between the i5-540M and i7-620M. Tbh, I've been quite negligent toward the i5-520M in all my posts. It was subconscious, I assure you.

    Forgot all about that little contender! Alas, if I'm honest, I have a feeling I will personally regret getting the i7-620M as opposed to the i5-540M. Although I do game a lot, so maybe it will prove worth it.
     
  30. bluehaze013

    bluehaze013 Notebook Evangelist

    Reputations:
    12
    Messages:
    371
    Likes Received:
    0
    Trophy Points:
    30
    Not necessarily. The thing is, every component in the computer is a potential bottleneck, so just because the processor is capable of running at a higher clock speed does not necessarily mean it will finish tasks faster.

    In a straight-up benchmark/synthetic test, the higher-clocked CPU will always finish faster. In this case, "way faster"? No. Marginally faster? Yes, because 3.33 is only a marginal increase over 3.06. But in real-world use, the difference between the i7-620M at 3.33 and the i5-540M at 3.06 will be virtually nil, as not many applications or tasks will ever tax the CPU to 100%, where the CPU becomes the bottleneck. In most situations the CPU will be waiting on other components, so the extra speed will not make a difference.

    The same applies to the bigger cache: unless it is constantly full, you will not notice a performance increase. It is much easier to fill the cache than to max the processor out at 100%, so the cache would likely show the majority of the gain, but in real-world use it will again be virtually indiscernible, because unless the CPU is struggling it will always be able to move data in and out of the cache faster than the component feeding it that information. You would most likely need a benchmark to discern the gain.
     
  31. SurferJon

    SurferJon Notebook Evangelist

    Reputations:
    6
    Messages:
    620
    Likes Received:
    18
    Trophy Points:
    31
    I don't play any video games; I only do editing in Photoshop. I'm a film major, so I also do HD film editing. Those are about the heaviest tasks I do (though I often have like 30 Firefox tabs open and dozens of little programs). But I use my laptop for note-taking and watching movies (saved on my HD), so battery life means a lot to me. So for you experts, i5 or i7? :p

    EDIT: Thanks, bluehaze. But then why are people picking the i7 if, in real-world situations, the difference isn't noticeable and the i7 runs hotter and consumes more power?
     
  32. arth1

    arth1 a҉r҉t҉h

    Reputations:
    418
    Messages:
    1,910
    Likes Received:
    0
    Trophy Points:
    55
    Most likely, but not certainly.
    If a faster CPU generates so much heat over time that the cooling arrangements have problems getting rid of it fast enough, it will throttle itself down. Once this happens, the faster CPU will be slower than the otherwise slower CPU.
    "Normal" operations will seldom cause throttling, but even on the older Zs, there are certain types of code that can trigger throttling if run for any length of time on the fastest versions.
     
  33. SurferJon

    SurferJon Notebook Evangelist

    Reputations:
    6
    Messages:
    620
    Likes Received:
    18
    Trophy Points:
    31
    Thanks for the explanation, but what's throttling? Is that when the CPU speeds up to finish the task? Also, what's this whole speed-boost thing about (going from 2.6 to 3.33 or whatever)? And another thing: if the i7 seems to have so much against it, why are people buying it rather than the i5? I is confused.
     
  34. bluehaze013

    bluehaze013 Notebook Evangelist

    Reputations:
    12
    Messages:
    371
    Likes Received:
    0
    Trophy Points:
    30
    Because they want the biggest and the best. Much like with the "HD" marketing, people get sucked in and have to have the 1920x1080 screen because it must be better! It will be for some people, but for the average user it won't; they will just be able to say they have the best, even though it is painful to use.

    Granted, it's not the same for the processor: you won't suffer at all from having an i7 performance-wise, and you will be able to tell everyone you have the best. You just won't notice much, if any, difference in everyday use, other than possibly shorter battery life.

    You also have to understand that most of these manufacturers offering configure-to-order systems plan on making their money on the upgrades. They suck you in with a really low base price, then overcharge for all the upgrades to make a bigger profit. They count on consumers having the mentality that "this is more expensive, so it must be better, so I will pay for it"; they count on consumers wanting to be able to say they have the best; they count on any number of things, all intended to suck the money out of your pockets.
     
  35. gammaknife

    gammaknife Notebook Consultant

    Reputations:
    0
    Messages:
    191
    Likes Received:
    0
    Trophy Points:
    30
    I can answer your q with a q :D The 128 GB SSD is 2 x 64, i.e., 128 GB in RAID 0, whereas the 256 GB is a quad RAID. Both configs say they are RAID 0, irrespective of their size. But I am not sure whether the 256 GB, if it is not a true quad RAID here, is faster than the 128 GB in RAID 0. More questions: do they have GC too?
     
  36. gammaknife

    gammaknife Notebook Consultant

    Reputations:
    0
    Messages:
    191
    Likes Received:
    0
    Trophy Points:
    30
    @sturmnacht: I am not sure whether more people here are going for the 128 or the 256 GB SSD based on speed. The upgrade costs about $400, but I'm not sure whether Sony's customized controller can be upgraded to take a new SSD later.
     
  37. arth1

    arth1 a҉r҉t҉h

    Reputations:
    418
    Messages:
    1,910
    Likes Received:
    0
    Trophy Points:
    55
    No, rather the opposite. The CPU detects that it's running hot, and reduces its frequency until it cools down.

    As an example, doing heavy Fourier transforms for any length of time on my P9500 will, after a few seconds, cause it to throttle back from 2.53 GHz to as low as 1.1 GHz until the temperature goes down. So it becomes "fast-slow-fast-slow-fast-slow." If I manually reduce it to run at 2.13 GHz, it actually does better overall for this particular job. But for most things, leaving it at full speed will be faster.

    The speed boost is the flip side of the same coin -- simplified, if you only use one core out of two (or four) because you run a single-threaded app, the CPU won't get too hot, so the one core that is running can select a higher clock speed than "normal" without the whole package overheating. Several of the Core 2 CPUs had a similar single-core boost too; my P9500, for example, can go from 2.53 GHz to 2.66 GHz when only one core is in use.

    (In addition, the i5/i7 CPUs can also get very brief speed bursts when the CPU is cold enough, even if running all cores. This, however, isn't sustainable for any length of time, so it won't make a lot of difference.)

    As for the i7 having so much against it: it doesn't. It's almost certainly going to be faster, most of the time, or for most users. Yes, it will use more power, and yes, it has a higher risk of overheating and throttling, but for "normal" use it will usually be faster (at least when the CPU is the bottleneck, and not, say, the graphics card or I/O).
    Whether it's worth the price premium is something only the buyer can decide.
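    If you want to actually watch throttling and boost happen, on Linux the kernel exposes the current core frequency in sysfs. A quick polling sketch in Python (the path is the common cpufreq location, but it can vary by kernel and driver):

        import time

        FREQ = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"  # value in kHz

        while True:
            with open(FREQ) as f:
                khz = int(f.read())
            print(f"core 0: {khz / 1e6:.2f} GHz")  # e.g. 2.53 under load, lower when throttled
            time.sleep(1)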
     
  38. arth1

    arth1 a҉r҉t҉h

    Reputations:
    418
    Messages:
    1,910
    Likes Received:
    0
    Trophy Points:
    55
    Why do people buy sports cars when the speed limit is 80?

    Some of it has to do with keeping up with the Joneses, and some of it has to do with the thrill of owning something that's really over the top for their use, but some also take advantage of the faster acceleration, and it's worth it for them, even if it doesn't really get them from A to B noticeably faster.
     
  39. roweraay

    roweraay Notebook Deity

    Reputations:
    59
    Messages:
    837
    Likes Received:
    0
    Trophy Points:
    30
    Wow, seems like some of the bad vibes have gotten cleaned up a bit... a good thing! :)
     
  40. nutman

    nutman Notebook Consultant

    Reputations:
    24
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    It's funny watching the 1080p display argument...
    I have a two-year-old, second-hand IDTech MD22292 with a 3840x2400 resolution in a 22" panel, so I laugh at the whole DPI issue, but I think I am more the exception than the rule.
     
  41. freedom16

    freedom16 Notebook Deity

    Reputations:
    137
    Messages:
    1,824
    Likes Received:
    0
    Trophy Points:
    0
    @SurferJon: The i7 dual-core 620M may run hotter, but it's a 32nm part, so it will have longer battery life because the processor is small, even though the dual core is 2.66. If you're really concerned about battery life, then go for a low-end Core i5-450 or something like that. I am very, very intrigued by this machine; I will probably just wait until it's at the outlet. The Z series has the most amazing screen I have seen from a Sony!
     
  42. gammaknife

    gammaknife Notebook Consultant

    Reputations:
    0
    Messages:
    191
    Likes Received:
    0
    Trophy Points:
    30
    Any previous Z owners :D do you know which brand of RAM Sony uses?
     
  43. freedom16

    freedom16 Notebook Deity

    Reputations:
    137
    Messages:
    1,824
    Likes Received:
    0
    Trophy Points:
    0
    Hynix; I am not sure if I am spelling the name right.
     
  44. arth1

    arth1 a҉r҉t҉h

    Reputations:
    418
    Messages:
    1,910
    Likes Received:
    0
    Trophy Points:
    55
    Since you say previous owners, I take it you mean the older series? If so, it varies. Mine came with Elpida RAM, but others have gotten Hynix.

    (I switched mine for lower latency RAM from Kingston. It's probably still either Elpida or Hynix, but binned to higher specs.)
     
  45. gammaknife

    gammaknife Notebook Consultant

    Reputations:
    0
    Messages:
    191
    Likes Received:
    0
    Trophy Points:
    30
    Thx. I am finding more reviews for Crucial than Kingston. Any thoughts on whether one is more reliable than the other?

    Here is a link:
    http://www.amazon.com/gp/product/B0...&pf_rd_t=101&pf_rd_p=470938631&pf_rd_i=507846
     
  46. Sunfox

    Sunfox Notebook Deity

    Reputations:
    29
    Messages:
    738
    Likes Received:
    16
    Trophy Points:
    31
    Has it been confirmed yet whether the CPU is soldered or socketed?

    If soldered, then you won't be able to upgrade it in the future, so some figure that getting the i7 will give you that extra couple of months of life out of the system. Personally I'd opt for the i7 since I spend most of the time plugged in, and getting that extra few minutes of battery life isn't that important to me, and I always want more performance out of my notebooks.

    Also, all CPUs offered on the new Z are 32nm. The i5-540M and i7-620M are essentially identical. They could have EASILY called the i7-620M, say, the i5-550M, and no one would have batted an eye. The only reason they bumped it up to i7 is that its cache is slightly larger than on the rest of the i5s.

    @gammaknife: I've always had excellent success with Kingston. Nothing wrong with Crucial. Nothing particularly wrong with any of the big memory vendors these days...
     
  47. roweraay

    roweraay Notebook Deity

    Reputations:
    59
    Messages:
    837
    Likes Received:
    0
    Trophy Points:
    30
    In the case of the current F-series, Sony uses Hynix. However, note that the F-series has faster DDR3-1333 RAM than the DDR3-1066 that the Z uses, and hence the brand employed in the new Z could very well be different from that of the F.
     
  48. roweraay

    roweraay Notebook Deity

    Reputations:
    59
    Messages:
    837
    Likes Received:
    0
    Trophy Points:
    30
    That is a debatable point, since the quad-core i7 chips employed in the F-series, the i7-720QM and the i7-820QM, have two different cache sizes.

    The i7-720QM has 6MB of cache, while the i7-820QM has 8MB... both employing 4 cores, 8 threads.

    Having said that, if I were buying the new-Z, I personally would pay the slightly higher price and go with the i7-620M than the i5-540M, regardless of the added battery life I could eke out of the i5 version.
     
  49. Sunfox

    Sunfox Notebook Deity

    Reputations:
    29
    Messages:
    738
    Likes Received:
    16
    Trophy Points:
    31
    4MB is still more than 3MB, but otherwise, yeah, Intel's branding of these chips doesn't make much logical sense. Not like the good old days, when an "SX" meant no math co-processor and you wanted the "DX" version. :)

    I probably would have kept dual-core without Hyper-Threading as i3, dual-core with Hyper-Threading as i5, and quad-core with Hyper-Threading as i7, and then used the number series to differentiate other major features such as low voltage, cache size, and so forth.
     
  50. yellowfrizbee

    yellowfrizbee Notebook Consultant

    Reputations:
    0
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    30
    "Hey honey! I know you said i5-540m, but the i7-620m was only a hundred bucks more, so I thought id go ahead and upgrade that for you"

    "Oh..uh, well.. ok. Thanks, dad."

    In short? Because I let my dad order it. Thats why in my case :(..bless his heart.
     