CDPR has just reiterated that there will be no delay in launch; evidently they read my post above and didn't want some unhinged fan getting medieval on their buttocks. At least that's how I'm telling the story.
saturnotaku Notebook Nobel Laureate
I was fascinated by the news a while back that it will run on Win7 in DX12, and various articles say that DX12 support was brought to Win7 a while back? How does that work? Is it that only certain games get it, like WoW, etc.? Am I missing some awesome update?
If this was already discussed, sorry; tl;dr, 35 pages of chat. -
ratchetnclank Notebook Deity
If CD Projekt limits microtransactions to skins and cosmetics in the multiplayer, then I don't see an issue with it. They've shown themselves to be very consumer-friendly so far, and I don't think it's an image they would gladly sabotage. -
saturnotaku Notebook Nobel Laureate
https://gamerant.com/cyberpunk-2077-removed-features-wall-running/ -
ratchetnclank Notebook Deity
game is looking good but the driving controls look spotty -
because they are always all over the road or driving really slowly in them
- big city
- lake
- desert
- cars
- guns
Isn't this the standard GTA recipe?
Not saying it doesn't look amazing, but I do hope there's more to it. I'm going to be disappointed if there won't be player controlled flying cars. -
BrightSmith Notebook Evangelist
Fallout meets GTA meets Deus Ex?
JRE84, killkenny1 and cucubits like this. -
ratchetnclank Notebook Deity
Using this logic, GTA is using the Final Fantasy VII recipe of big city, lake, desert, vehicles, guns. JRE84 and killkenny1 like this. -
killkenny1 Too weird to live, too rare to die.
cucubits likes this. -
dude I was just about to post that video....weird.
also I retract my statement: the driving looks good, I think they were just showcasing it more so than actually playing -
can you teach me your ways master.....that was unreal literally updated page to post and you posted it
krabman likes this. -
I saw it there when I opened YouTube; it said it was posted 3 minutes ago. You can't miss when blind luck hands you a "3 minutes ago"!
JRE84 likes this. -
thegreatsquare Notebook Deity
Prototime likes this.
JRE84 likes this.
yeah I know even linus said 4k was now look...but 8k this time really is a bad idea for progress
Prototime likes this. -
JRE84 likes this.
it's the push I don't like....it doesn't bring us closer to photorealism, it just delays it
Prototime likes this. -
Other than VR, where the screen is smashed against your eyes, 8K is mostly imperceptible for gaming purposes. Panel makers will hype it up to try to sell new TVs and new monitors. But unless you're sitting 2 feet away from a 65" screen, or 2.3 feet from a 70" screen - which no reasonable person would ever do - you won't get the benefits of 8k. There might be some limited benefit in reducing aliasing and making text sharper due to higher DPI, but at normal viewing distances, you won't see a huge general image quality improvement like jumping from 720p to 1080p, or 1080p to 1440p. Best to save your GPU horsepower for other uses.
8k vs 4k vs 1080p viewing distance calculations: https://www.forbes.com/sites/kevinm...waste-of-money-for-most-viewers/#4d7939bd3036 Last edited: Sep 19, 2020 JRE84 likes this. -
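For anyone who wants to see the viewing-distance math behind those numbers, here's a rough back-of-the-envelope sketch (Python; it assumes the commonly cited ~60 pixels per degree / 1 arcminute of acuity for 20/20 vision, so treat the exact figures as illustrative rather than gospel):

```python
import math

def max_benefit_distance_ft(diag_in, horiz_px, vert_px, px_per_degree=60):
    """Rough distance (feet) beyond which ~20/20 vision can no longer resolve
    individual pixels (one pixel per arcminute). Illustrative only."""
    diag_px = math.hypot(horiz_px, vert_px)
    pixel_in = diag_in / diag_px  # physical size of one pixel, in inches
    # distance at which one pixel subtends 1/px_per_degree of a degree
    dist_in = pixel_in / math.tan(math.radians(1.0 / px_per_degree))
    return dist_in / 12.0

for name, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160), ("8K", 7680, 4320)]:
    d = max_benefit_distance_ft(65, w, h)
    print(f'65" {name}: extra detail only visible closer than ~{d:.1f} ft')
```

For a 65" panel this works out to roughly 8 ft for 1080p, 4 ft for 4K, and 2 ft for 8K, which lines up with the "2 feet from a 65-inch screen" figure above.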
"If you have a dedicated theater in your home (not a “home theater” as the term is commonly used) that can accommodate a huge screen, 8K may make sense. For everyone else, 8K is a waste of money."
my thoughts exactly -
I'm not so sure I'd be able to perceive a huge difference between 4K @ 15.6" vs like what I have now which is 17" 2560x1600 (essentially 1440p except 16:10). I'd say 1440p would be good enough for me to remove the aliasing mess from 1080p desktop work or intricate UI elements of games. Of course I'm speaking from aging eyes, if I was still 25 with better than 20/20 vision I might have a different answer, lol.
Some people don't care, but I've been using nothing but HiDPI for 6 years now and I'm not going back. Things like font rendering and sharpness of UI elements just look too good to go back. Prototime likes this. -
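For a rough sense of the pixel densities being compared here, a quick sketch (Python; the panel sizes and resolutions are just the ones mentioned in this thread):

```python
import math

def ppi(diag_in, w_px, h_px):
    """Pixels per inch for a panel with the given diagonal and resolution."""
    return math.hypot(w_px, h_px) / diag_in

print(f'17.0" 2560x1600: {ppi(17.0, 2560, 1600):.0f} PPI')  # ~178 PPI
print(f'15.6" 2560x1440: {ppi(15.6, 2560, 1440):.0f} PPI')  # ~188 PPI
print(f'15.6" 3840x2160: {ppi(15.6, 3840, 2160):.0f} PPI')  # ~282 PPI
```

So the 17" 2560x1600 panel really does sit right next to a 15.6" 1440p panel in density, and both are well short of 15.6" 4K.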
I'm a home theater guy, as in I've had several, including a dedicated room, in the last couple of houses. That doesn't make me an expert, but you do learn a few things when you're into something. There is only so much detail that can be resolved by the human eye at a given distance, even with what amounts to perfect vision. Nothing can get any of us around that limitation. IMO, if you want to see the value of various resolutions at a set distance, look it up; the science is in on this one.
Personally, this is what I feel is a good sweet spot for screen resolutions. Anything above is getting into diminishing returns.
15.6 inch - 2560x1440p (2.5k)
27 inch - 5120x2880p (5k)
55 inch+ - 8k
VR - 16k+ -
EDIT: I remember now that I blind switched back and forth and she easily was able to pick out the 4K one consistently. I'm guessing if we had an 8K 49" panel at that distance we probably wouldn't easily pick out 8K vs 4K.. but I don't have one to test that theory. Just a hunch. -
I say 4K for laptops,
8K for projectors and TVs,
and 16K for VR.
Anything more is waste...but what will they do in 50 years? 32K no doubt -
Last edited: Sep 20, 2020
I did have an Eurocom Tornado F5 (MSI 16L13) w/ i7-7700K and GTX 1080 with a 15.6" 4K panel. I played quite a few games at 4K on that and it was nice. Not every game could be played fluently at 4K though, so I'd have to scale down quite a few. It was far better than 1080p; most of them I could play at 1440, which was a little better, but there was still interpolation. I sold it before Nvidia implemented integer scaling; 1080p might have been a little better at that point, since below-native was an interpolated mess.
Overall I wouldn't have changed anything about it except maybe trying to source a 1440p panel, but there was no option for that from HIDevolution or Eurocom at the time. Only 1080 or 4K, and I wasn't going back to 1080.
EDIT: I wanted to sell it anyway.. it was super light for what it was.. but I'm done with 6.5 lb notebooks.. this setup I have now is MILES better even though it costs a little more. I don't game on the go, only when I'm home. Actually.. that Eurocom was $3K.. this LG Gram 17 + the eGPU was around $2500-2600... lol .. and now I can easily upgrade to a 3080 without changing anything.. Last edited: Sep 20, 2020 -
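As an aside on the integer scaling point: 3840x2160 is exactly twice 1920x1080 in each direction, so integer scaling can simply repeat every 1080p pixel 2x2 instead of interpolating. A minimal NumPy sketch of the idea (the array stands in for a rendered frame; this is just an illustration, not what the driver literally does):

```python
import numpy as np

frame_1080p = np.random.rand(1080, 1920, 3)           # stand-in for a rendered 1080p frame
frame_4k = np.kron(frame_1080p, np.ones((2, 2, 1)))   # repeat each pixel 2x2 -> no interpolation blur
print(frame_4k.shape)                                  # (2160, 3840, 3)
```

That exact 2x factor is why 1080p on a 4K panel looks much cleaner with integer scaling than with the old bilinear upscale.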
I'm thinking of getting a 3080 eGPU for my GS63 8RE, any recommendations or advice....I heard an eGPU 2070 equates to a native 1060, so I'm wondering if it's worth it to go for the 3080
anyways back on topic...this game is really shaping up and coming out soon, the only letdown is it won't look truly next-gen....thoughts/opinions on this?
I'm thinking 3070 performance and well worth it, as a 3080 Max-Q shouldn't even be close to a full 3070
http://forum.notebookreview.com/threads/rtx-3080-in-an-egpu-discussion.834171/ Last edited: Sep 20, 2020 -
A good compromise would be to run the game at 1800p high-ultra on a 4K monitor, let that upscale on its own, and then resharpen the image using ReShade's CAS.fx (Radeon Image Sharpening port) plugin. Prototime likes this. -
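For a rough sense of what that 1800p-on-a-4K-panel compromise buys, a quick pixel-count comparison (Python; GPU cost doesn't scale perfectly linearly with pixel count, so this is only an indication):

```python
native = 3840 * 2160   # native 4K render target
scaled = 3200 * 1800   # 1800p render target, same 16:9 aspect
print(f"1800p shades {scaled / native:.0%} of the pixels of native 4K")
print(f"roughly {1 - scaled / native:.0%} fewer pixels per frame to spend on higher settings")
```

About 30% fewer pixels per frame, which is the headroom the CAS sharpening pass then tries to claw back visually.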
yeah this is going to be big. At first I was not super excited, but as more gameplay videos come out it gets better every time I see it; this is going to be bigger than The Witcher 3
yeah, this hype has me going a bit. I hope it's good and people swear by it; the videos look great, but in the end I just hope it's half as good as GTA V
I AM READY