Are there any leaks of the 860M's performance, or are you just guessing that the 860M will be about the same as the 770M?
I also wonder if a 4K display makes sense. Both GPUs, the 860M and the R9 M270X, don't offer enough performance to deliver acceptable framerates at that resolution!
Even an R9 290X has problems with this resolution (BF4, Crysis 3, ...).
Unless they offer a display with good interpolation, gaming will suck if you have the 4K display.
Also, why would I need a 4K display at 15.6 inches?
-
I hope they offer a less expensive non-touch configuration. I'm just aiming for non-touch, non-4K, as long as it can play my games at least on high settings.
-
@Alex555 That's the 860M's benchmark on Notebookcheck's website ( NVIDIA GeForce GTX 860M - NotebookCheck.net Tech ).
But I believe the values are just predictions, because there's currently no laptop out there with the 860M. It is also rumored that the 860M has 4GB of dedicated memory, but hey, only time will tell.
Let's be patient until an actual test is run on the real hardware. -
But like someone said earlier, it wouldn't be bad to have the 4K panel for the future, so I'll stretch my budget if possible. -
If the 860M can keep pace with the 765M at stock clocks, it would be good enough for me.
-
If so, this would be an awesome laptop. There may be more powerful laptops at the same or lower price, but almost none are as slim as this.
-
Isn't it possible to set the resolution to Full HD when gaming, and use 4K when watching movies, viewing pictures, etc.?
-
Yup, think of it this way if you want the 4K panel: use the 4K setting for video/images, and 1080p for gaming. 4K gaming (while possible on some desktops with high-end cards) is still years away from becoming common.
You should be able to scale from 4K to 1080p the same way you would scale from 1080p to 720p on current monitor setups. -
It's MUCH better than scaling from 1080p to 720p.
When scaling from 4K to 1080p, each 1080p pixel is formed by EXACTLY 4 smaller physical pixels, making a perfect (not blurred) image.
When scaling down from 1080p to 720p, the display needs to approximate (interpolate) the color of some pixels, thus showing you a blurry image.
That's the beauty of 4K displays... they scale perfectly down to 1080p, the same way that 3200x1800 on the new Dell XPS scales perfectly down to 1600x900.
There is no interpolation needed; a quick sketch of the idea is below.
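To make the pixel-doubling idea concrete, here's a minimal Python sketch (the function name and the random test frame are mine, purely for illustration): each 1080p pixel just gets replicated into a 2x2 block, so no colors ever have to be guessed.

```python
import numpy as np

def integer_upscale(frame, factor):
    # Replicate every source pixel into a factor x factor block.
    # No new color values are invented, so nothing is blurred.
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# Hypothetical 1080p frame (height x width x RGB) with random contents.
frame_1080p = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)

frame_4k = integer_upscale(frame_1080p, 2)
assert frame_4k.shape == (2160, 3840, 3)  # exactly the 4K pixel grid
```
-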
Huh, learned something new, thank you for the info.
-
Go run 1280x800 on a 2560x1600 monitor or 1280x720 on a 2560x1440 monitor and then tell me to my face that it's not blurry. -
I'm not interested in 4K displays unless YouTube supports 4K and Wi-Fi networks or my ISP can handle such videos.
That's probably going to happen by the time I have to replace the 4K laptop anyway. -
Now please go troll another topic... basic logic proves you wrong, not to mention my own personal experience on the matter. -
-
I'm surprised more people didn't already know this, since the Retina (2880x1800) MacBook Pro runs crisp at 1440x900 without any complaints about blurriness and such.
Maybe octiceps will read this post and realise he is wrong... if the XPS example wasn't enough. -
What I want to see in the Y50: Intel Iris Pro 5200 graphics with a Core i7-4950HQ, plus the GTX 860M and a non-glossy 1080p screen. That would be the perfect machine.
-
Would scaling down ever hurt performance in games? Say I purchased a QHD display for the Y50. Obviously gaming at QHD is not viable. If I scaled down to 1080p, would there be an additional performance hit? What about non-native resolution scaling?
-
Didn't know 1200x800 is a standard resolution. Go figure.
-
How would it be blurry? One scaled 1080p pixel would be represented by exactly four pixels on a 4K/2160p screen. The reason it works is that the pixels on a 4K/2160p screen are a quarter of the size, and four of them make a square, in other words one bigger "mega" pixel.
When a screen has an inexact conversion, like QHD/1440p to 1080p, where one scaled 1080p pixel maps to 4/3 QHD/1440p pixels, the screen has to interpolate (guess) some pixels because they don't line up perfectly. Downscaling QHD/1440p to 720p would be perfect, though, because QHD/1440p has exactly four times as many pixels. A screen with a perfect multiple of pixels like this could thus be considered "native." It doesn't necessarily have to be four pixels, either: 9, 16, 25, or any perfect square of pixels would work. An example is watching a 720p video on a 4K/2160p screen: one scaled 720p pixel would be converted to exactly 9 pixels, with no guessing done by the computer.
To sum it up, there is absolutely no reason for a screen to interpolate if the downscaling conversion is exact (see the quick check below). If you compare this in real life, you MUST have two displays of identical size and build quality. If it looks blurrier, you most likely have a crappy screen or the manufacturer coded the scaling for the screen really badly. I hope this clears everything up.
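If anyone wants to check which resolutions divide evenly into a panel, here's a tiny Python sketch (the helper name is mine, purely illustrative):

```python
def scales_cleanly(source, panel):
    # True when the panel is a whole-number multiple of the source
    # resolution in both dimensions (the "native" downscale case).
    sw, sh = source
    pw, ph = panel
    return pw % sw == 0 and ph % sh == 0 and pw // sw == ph // sh

panel_4k = (3840, 2160)
for res in [(1920, 1080), (1280, 720), (2560, 1440)]:
    if scales_cleanly(res, panel_4k):
        print(res, "-> clean", panel_4k[0] // res[0], "x mapping")
    else:
        print(res, "-> needs interpolation (guessed pixels)")
```

It reports a clean 2x for 1080p and 3x for 720p, while 1440p (a 1.5x ratio) comes back as needing interpolation, matching the cases above.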
-
Don't bother randomguy.... 10 other people tried to explain it to him in another thread.... it's useless.... he just says the same thing.
He doesn't understand that the blurriness only takes place when the ratio of resolutions isn't an integer, e.g. 1080/720 = 1.5 (3/2) => blurriness, because that's when the interpolation algorithm has to guess the colors of some pixels. -
Oh, okay. I guess some people just can't understand.
-
On a positive note, a 4k screen would support not only 1080p but also 720p "natively."
-
Yea...actually I hadn't realised that until I read your post.
That's pretty cool. If you don't mind playing in 720p, then the 860M will be viable for the latest games for a very long time. -
Better than the XPS 15?
-
Great, so people with apparently no actual experience in the matter break out some kindergarten arithmetic and start believing in the infinite detail myth of raster images. What else is new?
-
My profile picture is new.
-
Here's a short explanation of bilinear interpolation:
"Bilinear interpolation can be used where perfect image transformation with pixel matching is impossible, so that one can calculate and assign appropriate intensity values to pixels."
Notice the pixel matching part... since 1 pixel at 1080p resolution perfectly matches 4 pixels at 4K resolution, there is no resulting blurriness from the interpolation... essentially it wouldn't be interpolation, it would be pixel doubling.
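A minimal Python sketch of that definition for a single grayscale sample (the function name and the values are mine, just for illustration):

```python
def bilerp(c00, c10, c01, c11, x, y):
    # Blend the four surrounding pixel values by the fractional
    # offsets x, y in [0, 1].
    return (c00 * (1 - x) * (1 - y) + c10 * x * (1 - y)
            + c01 * (1 - x) * y + c11 * x * y)

# A sample landing exactly on a source pixel (x == y == 0) comes
# back unchanged -- that's the "pixel matching" case:
assert bilerp(200, 50, 50, 50, 0.0, 0.0) == 200

# Halfway between pixels, the result is a blend -- the guessing
# that produces blur at non-integer scale ratios:
print(bilerp(200, 50, 50, 50, 0.5, 0.5))  # 87.5
```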
Windows 8.1 already has this with 200% scaling.
Even with simple bilinear interpolation, a 1080p image stretched on a 4K 15" screen will not look worse than a 1080p image displayed on a 1080p 15" screen.
There have been a lot of articles on this lately, and it's one of the main selling points for the new 4K TVs: they advertise that 1080p content looks better on a 4K TV due to the advanced interpolation algorithms they have implemented.
Whilst I haven't seen any real improvement using the XPS screen at 1600x900 downscaled vs my own laptop at 1600x900 native, it definitely didn't look worse.
That being said, this contradiction between you and... well... most other people on this forum has gone on long enough, and at least from my side it's stopping here, whatever you are going to say next.
Cheers!!
P.S. I'm fully aware that you cannot infinitely enhance the detail of an image, even with the best interpolation techniques.
Buuuut... while for low-res images a bilinear interpolation (even when the ratio is an integer) will give you a blurry mess, a high-res 1080p image upscaled to 4K will still look crisp because of the high pixel density.
The problem you have is that you are limiting yourself to low-res images in your tests. -
So I went to Best Buy today. I am not sure how accurate this information is, but I was able to speak to someone in a "green" shirt working at the Best Buy (not sure who they are compared to the people wearing blue shirts) regarding the release date of the Y50. He told me that the Y50 would be available in stores February 16, which is basically a week from now. He says he saw it in the inventory shipment listing. By the way, this was a Best Buy in Michigan.
Once again, I am not sure how accurate this is, but it sure seemed like he knew what he was talking about. I was getting ready to buy the Y510 when the Y50 came up, so I decided to hold off.
Similarly, when I spoke to one of those Lenovo online chat agents, the gentleman also stated that they would be released sometime at the end of this month. -
I asked two online agents. It's kind of weird: one agent told me that the Y50 is coming in February like he knew what he was talking about, but the second wasn't sure and said it would probably release around May. If it really comes in February, that would be great, because I'm looking for a new laptop, and then we can finally see the price for each configuration.
-
This is getting spicy.
-
Trying to figure out the release dates is important as far as technology goes. There is always going to be something new and improved, and it would suck to miss out on a new version of the laptop. If it's a matter of waiting just a month, you might as well wait.
-
This is very good news. If it comes this February, then 2014's gonna be awesome.
-
Yea, I just chatted with the agent.
He said, "Yes, we are expecting it by Feb month end." -
What if the sales agents are pulling a GT 755M availability card on us? :sly:
-
I assume retailers will offer 3-4 configurations of the laptop. I was reading somewhere that the 14-inch model was going to be limited to ATI, while the 15-inch will have the NVIDIA card. Hopefully this laptop stays reasonable as far as pricing goes. I have a feeling it will run kind of expensive if the base price is going to be somewhere around $999. I also hope the graphics card, SSD vs. HDD, and RAM will be customizable.
-
I work at Best Buy; I'll check out the inventory system on my next shift and let y'all know.
-
Just saw this on TechPowerUp: NVIDIA GeForce GTX 860M | techPowerUp GPU Database
Now compare it to this one: NVIDIA GeForce GTX 770M | techPowerUp GPU Database
They seem to be exactly the same! Well, except for the 860M having 1 GB more memory (4GB vs 3GB on the 770M).
So the 860M is a 770M with beefed-up memory; that's not bad, right? Especially considering the expectation was some overclocked 765M with a 128-bit bus. Now we get the full-blown 192-bit!
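For a rough sense of why the bus width matters, here's a back-of-the-envelope Python calc (the 4000 MHz effective GDDR5 clock is the figure usually quoted for the 770M; the 860M's actual clocks are still unconfirmed):

```python
def bandwidth_gb_per_s(bus_width_bits, effective_mem_clock_mhz):
    # bytes moved per second = (bus width in bytes) * (transfers per second)
    return bus_width_bits / 8 * effective_mem_clock_mhz * 1e6 / 1e9

# Illustrative clocks only: 4000 MHz effective GDDR5.
print(bandwidth_gb_per_s(128, 4000))  # 64.0 GB/s on a 128-bit bus
print(bandwidth_gb_per_s(192, 4000))  # 96.0 GB/s on the 192-bit bus
```
-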
TechPowerUp, NoteBookCheck.net, etc. will always copy-paste the specs for unreleased chips from the previous gen +10, with a caveat sticker that says "Unreleased/Grain of Salt/A friend of a friend whose cousin knows a guy who heard from a dude whose sister said a guy who works near a guy at NVIDIA said," etc.
560M = 470M
660M = 570M
760M = 670M
860M = 770M
Once in the wild, the specs change to reflect the actual information. -
Thanks iLiftCars, that would be awesome.
-
Come on, where did he go?
-
This seems to line up with the Y50 release date...
http://videocardz.com/49557/exclusive-nvidia-maxwell-gm107-architecture-unveiled -
If all goes well, we could be getting a cut-down GM107. That should beat the 765M. -
Seems consistent with all the different speculation we've had here (or should I say I've had). The primary reason I thought the Y50 would be thinner was that they took out the optical drive, but this new "Maxwell" graphics architecture could also make for a much smaller card, thus making the laptop slimmer.
February 18th is what it says in the link, so yea, that could very well mean the Y50s and Y40s are on their way, assuming they have these newer graphics cards.