Game Developers Conference | GDC | Home
www.gdconf.com/
GDC is the world's largest professional game industry event, with 5 days of learning, inspiration, and networking March 19-23, 2018 in San Francisco.
GDC is the world's largest professional game industry event. Join game designers, programmers, artists, producers, and business professionals for 5 days of unparalleled education, inspiration and networking for the global game development community.
http://www.gdconf.com/
Still time to go if you are local!
GDC 2018 - Hours & Location
http://www.gdconf.com/attend/hourslocation.html
GDC 2018 - Passes and Prices
http://www.gdconf.com/attend/passes.html
GDC 2018 Expo (Wednesday-Friday, March 21-23, 2018)
See more than 550 companies showcasing the latest game development tools and services in the GDC Expo, Wednesday-Friday, March 21-23, 2018. With leading technology companies like Amazon Lumberyard, Google, Intel, Nvidia, Oculus, Sony, and Unreal Engine, cutting-edge game and VR/AR tools providers, and much more, the GDC Expo brings you the latest innovations and facilitates new partnerships and business opportunities.
http://www.gdconf.com/expo/index.html
GDC Youtube Channel
https://www.youtube.com/channel/UC0JB7TSe49lg56u6qH8y_MQ
GDC 2018 Flash Forwards Playlist - Currently 31 Videos
-
-
Decommissioned GDC Trailer 2018
Published on Mar 18, 2018
In Decommissioned, you play as the last astronaut to visit the International Space Station, charged with successfully ending the station's 26 year mission by deorbiting the space station into the Indian Ocean. After powering down the station's aging systems and prepping to leave, a catastrophic failure disables your only means of escape. Now - with your oxygen failing and your orbit plunging deeper into the Earth's atmosphere, you must find a way to repair and bring back online the systems that once kept you alive, and finally make it back home.
Features:
• Immersive movement system - Players navigate the space station using their hands, just like real astronauts.
• Complex simulated systems - All space station systems are simulated and interoperate with each other in a realistic manner. Players will have to manage power, water, oxygen, and other vital resources to keep themselves and the space station alive.
• Survival as a puzzle - All systems aboard the station can fail. Players must use the limited resources on board the space station to keep the station running. As supplies run low, players will have to dismantle other systems to keep vital systems working, leading to tough choices about what stays and what goes. Removing the wrong system could have dire consequences!
Published on Mar 18, 2018
Warm Lamp Games and Alawar Premium, a division of Alawar Entertainment, are proud to reveal Beholder 2, set to release on Steam in the second half of 2018. The game, rendered in 3D this time, takes place in the same grim and hopeless dystopian Beholder universe, where privacy is dead and a totalitarian State, ruled by the same "Wise Leader," controls every aspect of life. The game will feature an expansive world filled with new scenes, rooms, and floors inside the Ministry itself.
“You are a newly employed department officer within the Ministry of a totalitarian State. While you are poised to have an illustrious career and possibly become Prime Minister someday, the way up won't be easy. So, how high up the career ladder will you climb? The choice is still yours to make!” - Warm Lamp Games
Published on Mar 17, 2018
Gameplay Video of the Game Alkimya (in production)
Put the game in your Steam wishlist:
http://store.steampowered.com/app/521...
Follow us on Facebook: https://www.facebook.com/BadMinions/
Published on Mar 15, 2018
Playdius will show its line-up behind closed doors at GDC 2018.
Here's an overview of our GDC 2018 Line-Up, featuring Away: Journey to the Unexpected, Dead In Vinland, Old School Musical, Hover, Edge of Eternity & BAFL.
To this list, we also added Jengo, an upcoming point & click game whose Fig campaign starts today! Be one of the early backers here! http://bit.ly/Jengo-fig
Published on Mar 15, 2018
Sugar Creative GDC 2018 AR/VR Showreel
Published on Mar 15, 2018
Sugar Creative was lucky enough to attend GDC 2018. We created a showreel showcasing our Augmented & Virtual reality work.
Published on Mar 9, 2018
GDC is only two weeks away, and this is probably one of the events of the year for indie games. I'm looking forward to trying out new indie games as well as revisiting titles that I already know I'm going to love!
Below are all the details on the games featured in this video as well as the sources for the trailers I used. Thank you!
---------------------------------------------------------------------
The Media Indie Exchange
http://www.mediaindieexchange.com/
https://twitter.com/indieexchange
---------------------------------------------------------------------
Intro Games
Mineko's Night Market
Developer: Meowza Games
Publisher: Humble Bundle
http://store.steampowered.com/app/762...
http://minekosnightmarket.com/
https://www.youtube.com/watch?v=8CYyY...
Joggernauts
Developer: Space Mace
Publisher: Graffiti Games
http://store.steampowered.com/app/747...
https://spacemacegames.com/
https://www.youtube.com/watch?v=5mDns...
❄️Frostpunk
Developer: 11bit Studios
Publisher: 11bit Studios
http://store.steampowered.com/search/...
http://www.frostpunkgame.com/
https://www.youtube.com/watch?v=wAJfT...
Dead Cells
Developer: Motion Twin
Publisher: Motion Twin
http://store.steampowered.com/search/...
https://dead-cells.com/
https://www.youtube.com/watch?v=F72iu...
---------------------------------------------------------------------
Featured Games
Wytchwood
Developer: Alientrap
Publisher: Alientrap
https://www.youtube.com/watch?v=neMTQ...
http://www.alientrap.com/games/wytchw...
http://store.steampowered.com/app/729...
Platforms: PC
Once Upon a Coma
Developer: Atmos Games
Publisher: Serenity Forge
https://www.kickstarter.com/projects/...
https://www.youtube.com/watch?v=TQJqO...
http://store.steampowered.com/app/733...
Platforms: PC, Mac, Linux, (possibly Switch)
Knuckle Sandwich
Developer: Andrew Brophy
Publisher: Andrew Brophy
https://www.youtube.com/watch?v=xqd28...
https://twitter.com/kncklsndwch
http://knucklesandwich.biz/
Platforms: PC and Mac
Yuppie Psycho
Developer: Baroque Decay
Publisher: Another Indie
http://www.yuppiepsycho.com/
https://www.youtube.com/watch?v=h92g7...
http://store.steampowered.com/app/597...
Platforms: PC, Mac, Linux
⚔️Sinner: Sacrifice for Redemption
Developer: DARK STAR
Publisher: Another Indie
http://anotherindie.com/game/sinner/
https://www.youtube.com/watch?v=1btUV...
http://store.steampowered.com/app/691...
Platforms: PC, PS4, Xbox One
---------------------------------------------------------------------
Outro Games
Punch Planet
Developer: Sector-K Games
Publisher: Sector-K Games
http://store.steampowered.com/app/577...
http://www.punchplanet.com/
https://www.youtube.com/watch?v=N5NyG...
Carrion
Developer: Phobia Games
Publisher: Phobia Games
https://carrion.games/
https://www.youtube.com/watch?v=OM_IO...
Children of Morta
Developer: Dead Mage
Publisher: 11bit Studios
http://store.steampowered.com/app/330...
http://childrenofmorta.com/
https://www.youtube.com/watch?v=OWl3N...
---------------------------------------------------------------------
***Want More Kimchica?***
✅ Instagram: @jennywindom
✅ Twitter: @kimchica25
✅ Twitch: /kimchica
--------------------------------------------------------------
Published on Feb 22, 2018
Trailer music - "Fans Forever" by Doseone
Without further ado, here's a list of some of the games and developers that will take part in Day of the Devs at GDC 2018:
A Way Out from Hazelight Studios
Dead Static Drive from Team Fanclub
Dreams from Media Molecule
Exo One from Exbleative
Harold Halibut from Slow Bros.
Kids from Playables (first time it's been playable by the public)
Knights and Bikes from Foam Sword
Mosaic from Krilbite Studio (first time it's been playable by the public)
Minit from Kitty Calis, Jan Willem Nijman, Jukio Kallio, and Dominik Johann
Noita from Nolla Games (first time it's been playable by the public)
Untitled Goose Game from House House
Set aside some time to swing by and check them out at Day of the Devs: GDC Edition, as well as all the other great interactive spaces at the show, as you plan out your conference week using the GDC 2018 Session Scheduler-
http://schedule.gdconf.com/list?_mc=B...
GDC 2018 itself will take place March 19th through the 23rd at the Moscone Center in San Francisco.
For more information on GDC 2018, visit the show's official website: http://www.gdconf.com/
GameSpot - The Mix 2018 Livestream
Live March 19, 11:00 AM
Scheduled for Mar 19, 2018
GameSpot gets a special visit from multiple indie game developers behind games such as Dead Cells, Black Future 88, Milanoir, Carrion, Frostpunk, and Noita during GDC 2018.
Live March 19, 10:00 AM
Scheduled for Mar 19, 2018
Google Developer Day will be full of announcements and best practices to help developers build incredible games. Starting with a keynote, we'll then dive into three main themed sessions covering innovation & new platforms, pre-launch best practices, and ways to optimize your games post-launch. Each session will include several mini-talks from different Google teams and developer partners sharing new tools, learnings, and more.
Subscribe to the Google Developers Channel: http://goo.gl/mQyv5L
Live March 19, 6:30 PM
Scheduled for Mar 19, 2018
Tune in to see new features and learn about the future of Unity in 2018!
The Unity GDC keynote will take place on March 19th at 6:30PM (Pacific Time) and we've gotten permission to stream it right here on the Brackeys channel!
Unity will be sharing their latest innovations covering graphics, performance, artist tools and more!
Building on the foundation of the 2017 cycle, which introduced powerful artist tools that give teams of artists and engineers the power to do more together, the keynote will be the first deep dive into Unity 2018.
The keynote will be hosted by Lucas Meijer (Technical Director) and will feature John Riccitiello (CEO), Brett Bibby (VP of Engineering), Natasha Tatarchuk (Director of Global Graphics), Joachim Ante (CTO) and Yibing Jiang (Technical Art Supervisor).
Live March 21, 9:00 AM
Scheduled for Mar 21, 2018
Experience Unity Live at GDC. Tune in to our interview style talk show hosted by Will Goldstone (Product Manager) and Liz Mercuri (Technical Evangelist). Catch a glimpse of all the latest announcements and news in the Unity ecosystem, featuring special guests and industry leaders. https://unity3d.com/gdc-2018
The GDC Channel already has lots of videos posted about GDC 2018, including (but not limited to) "Flash Forward" videos about upcoming GDC events, topics, forums, etc. - 31 videos in a playlist
Here are some examples:
GDC 2018 Flash Forward: HoloLens and Beyond: AR Game Design Challenges and Open Problems
GDC18: Applying AAA Techniques on Mobile Games: Understanding Flow Maps and Its Applications
GDC 2018 Flash Forward: A Matter of Music Design: Driving Gameplay with Music
GDC 2018 Flash Forward: 'Move or Die': Why It's Worth It to Chase Your Tail
GDC 2018 Flash Forwards Playlist - 31 videos -
10 TOP TIPS for GDC 2018 - Advice for Indie Game Devs
Published on Mar 5, 2018
Hey guys! Here are my 10 top tips for GDC, mainly aimed at first-timers to the conference, giving an insight on how to prepare for the event and what it's like to be there. Enjoy!
Like, comment and subscribe for more videos like this as well as tips and tricks on getting into the games industry, starting your own indie studio and awesome vlogs!
----
GDC Party Links
The Fellowship of GDC parties: https://www.facebook.com/groups/TheFe...
GDC Party List https://docs.google.com/spreadsheets/...
Twitter List https://twitter.com/gdcpartylist
----
FOLLOW ME:
Twitter: http://www.twitter.com/dandarocha
Instagram: http://www.instagram.com/dan_darocha
Website: http://www.dandarocha.com/
----
Dan Da Rocha is a game designer and co-creator behind Q.U.B.E. and Hue. He's currently working on Q.U.B.E. 2 amongst other projects and regularly produces content teaching aspiring game developers how to get into the industry and successfully make their own games. -
Expanding DirectX 12: Microsoft Announces DirectX Raytracing
by Ryan Smith on March 19, 2018 1:02 PM EST
"To many out there it may seem like DirectX 12 is still a brand-new technology – and in some ways it still is – but in fact we’ve now been talking about the graphics API for the better part of half a decade.
Microsoft first announced the then-next generation graphics API to much fanfare back at GDC 2014, with the initial iteration shipping as part of Windows 10 a year later.
For a multitude of reasons DirectX 12 adoption is still in its early days – software dev cycles are long and OS adoption cycles are longer still – but with their low-level graphics API firmly in place, Microsoft’s DirectX teams are already hard at work on the next generation of graphics technology. And now, as we can finally reveal, the future of DirectX is going to include a significant focus on raytracing.
This morning at GDC 2018 as part of a coordinated release with some of their hardware and software partners, Microsoft is announcing a major new feature addition to the DirectX 12 graphics API: DirectX Raytracing. Exactly what the name says on the tin, DirectX Raytracing will provide a standard API for hardware and software accelerated ray tracing under DirectX, allowing developers to tap into the rendering model for newer and more accurate graphics and effects.
Going hand-in-hand with both new and existing hardware, the DXR command set is meant to provide a standardized means for developers to implement ray tracing in a GPU-friendly manner. Furthermore as an extension of the existing DirectX 12 feature set, DXR is meant to be tightly integrated with traditional rasterization, allowing developers to mix the two rendering techniques to suit their needs and to use the rendering technique that delivers the best effects/best performance as necessary.
Why Ray Tracing Lights the Future
Historically, ray tracing and its close colleague path tracing have in most respects been the superior rendering techniques. By rendering a scene more like the human eye works – by focusing on where rays of light come from, what they interact with, and how they interact with those objects – it can produce a far more accurate image overall, especially when it comes to lighting in all of its forms. Specifically, ray tracing works like human vision in reverse (in a manner of speaking), casting rays out from the viewer to objects and then bouncing from those objects to the rest of the world, ultimately determining the interactions between light sources and objects in a realistic manner. As a result, ray tracing has been the go-to method for high quality rendering, particularly static images, movies, and even pre-baked game assets.
Ray Tracing Diagram (Henrik / CC BY-SA 4.0)
However the computational costs of photorealistic ray tracing are incredible due to all of the work required to not only trace individual rays, but also the sheer number of them. This is a ray for every screen pixel (or more) cast, reflected, refracted, and ultimately recursively generated many times over. Bouncing from object to object, refracting through objects, diffusing along other objects, all to determine all of the light and color values that ultimately influence a single pixel.
An illustration of ray recursion in a scene
As a consequence of this, ray tracing has not been suitable for real-time rendering, limiting its use to “offline” use cases where systems can take as much time as they need. Instead, real-time graphics has been built around rasterization, a beautiful, crass hack that fundamentally projects 3D space on to a 2D plane. By reducing much of the rendering process to a 2D image, this greatly simplifies the total workload, making real-time rendering practical. The downside to this method is, as one might expect, that it’s not as high quality; instead of accurate light simulations, pixel & compute shaders provide approximations of varying quality. And ultimately shaders can’t entirely make up for the lack of end-to-end 3D processing and simulations.
While practical considerations mean that rasterization has – and will continue to be – the dominant real-time rendering technique for many years to come, the holy grail of real-time graphics is still ray tracing, or at least the quality it can provide. As a result, there’s been an increasing amount of focus on merging ray tracing with rasterization in order to combine the strengths of both rendering techniques. This means pairing rasterization’s efficiency and existing development pipeline with the accuracy of ray tracing.
While just how to best do that is going to be up to developers on a game-by-game basis, the most straightforward method is to rasterize a scene and then use ray tracing to light it, following that up with another round of pixel shaders to better integrate the two and add any final effects. This leverages ray tracing’s greatest strengths with lighting and shadowing, allowing for very accurate lighting solutions that properly simulate light reflections, diffusion, scattering, ambient occlusion, and shadows. Or to put this another way: faking realistic lighting in rasterization is getting to be so expensive that it may just as well be easier to do it the right way to begin with.
Enter DirectX Raytracing
DirectX Raytracing then is Microsoft laying the groundwork to make this practical by creating an API for ray tracing that works with the company’s existing rasterization APIs. Technically speaking GPUs are already generic enough that today developers could implement a form of ray tracing just through shaders, however doing so would miss out on the opportunity to tap into specialized GPU hardware units to help with the task, not to mention the entire process being non-standard. So both to expose new hardware capabilities and abstract some of the optimization work around this process to GPU vendors, instead this functionality is being implemented through new API commands for DirectX 12.
But like Microsoft’s other DirectX APIs it’s important to note that the company isn’t defining how the hardware should work, only that the hardware needs to support certain features. Past that, it’s up to the individual hardware vendors to create their own backends for executing DXR commands. As a result – and especially as this is so early – everyone from Microsoft to hardware vendors are being intentionally vague about how hardware acceleration is going to work.
At the base level, DXR will have a full fallback layer for working on existing DirectX 12 hardware. As Microsoft’s announcement is aimed at software developers, they’re pitching the fallback layer as a way for developers to get started today on using DXR. It’s not the fastest option, but it lets developers immediately try out the API and begin writing software to take advantage of it while everyone waits for newer hardware to become more prevalent. However the fallback layer is not limited to just developers – it’s also a catch-all to ensure that all DirectX 12 hardware can support ray tracing – and talking with hardware developers it sounds like some game studios may try to include DXR-driven effects as soon as late this year, if only as an early technical showcase to demonstrate what DXR can do.
In the case of hitting the fallback layer, DXR will be executed via DirectCompute compute shaders, which are already supported on all DX12 GPUs. On the whole GPUs are not great at ray tracing, but they’re not half-bad either. As GPUs have become more flexible they’ve become easier to map to ray tracing, and there are already a number of professional solutions that can use GPU farms for ray tracing. Faster still, of course, is mixing that with optimized hardware paths, and this is where hardware acceleration comes in.
Microsoft isn’t saying just what hardware acceleration of DXR will involve, and the high-level nature of the API means that it’s rather easy for hardware vendors to mix hardware and software stages as necessary. This means that it’s up to GPU vendors to provide the execution backends for DXR and to make DXR run as efficiently as possible on their various microarchitectures. When it comes to implementing those backends in turn, there are some parts of the ray tracing process that can be done in fixed-function hardware more efficiently than can be done shaders, and as a result Microsoft is giving GPU vendors the means to accelerate DXR with this hardware in order to further close the performance gap between ray tracing and rasterization.
DirectX Raytracing Planned Support
Vendor | Support
AMD | Indeterminate (Driver Due Soon)
NVIDIA Volta | Hardware + Software (RTX)
NVIDIA Pre-Volta | Software
For today’s reveal, NVIDIA is simultaneously announcing that they will support hardware acceleration of DXR through their new RTX Technology. RTX in turn combines previously-unannounced Volta architecture ray tracing features with optimized software routines to provide a complete DXR backend, while pre-Volta cards will use the DXR shader-based fallback option. Meanwhile AMD has also announced that they’re collaborating with Microsoft and that they’ll be releasing a driver in the near future that supports DXR acceleration. The tone of AMD’s announcement makes me think that they will have very limited hardware acceleration relative to NVIDIA, but we’ll have to wait and see just what AMD unveils once their drivers are available.
Though ultimately, the idea of hardware acceleration may be a (relatively) short-lived one. Since the introduction of DirectX 12, Microsoft’s long-term vision – and indeed the GPU industry’s overall vision – has been for GPUs to become increasingly general-purpose, with successive generations of GPUs moving farther and farther in this direction. As a result there is talk of GPUs doing away with fixed-function units entirely, and while this kind of thinking has admittedly burnt vendors before (Intel Larrabee), it’s not unfounded. Greater programmability will make it even easier to mix rasterization and ray tracing, and farther in the future still it could lay the groundwork for pure ray tracing in games.
Unsurprisingly then, the actual DXR commands for DX12 are very much designed for a highly programmable GPU. While I won’t get into programming minutiae better served by Microsoft’s dev blog, Microsoft’s eye is solidly on the future. DXR will not introduce any new execution engines in the DX12 model – so the primary two engines remain the graphics (3D) and compute engines – and indeed Microsoft is treating DXR as a compute task, meaning it can be run on top of either engine. Meanwhile DXR will introduce multiple new shader types to handle ray processing, including ray-generation, closest-hit, any-hit, and miss shaders. Finally, the 3D world itself will be described using what Microsoft is terming the acceleration structure, which is a full 3D environment that has been optimized for GPU traversal.
Eyes on the Future
Like the announcement of DirectX 12 itself back in 2014, today’s announcement of DirectX Raytracing is meant to set the stage for the future for Microsoft and its hardware and software partners. Interested developers can get started with DXR today by enabling Win10 FCU’s experimental mode. Meanwhile top-tier software developers like Epic Games, Futuremark, DICE, Unity, and Electronic Arts’ SEED group are already announcing that they plan to integrate DXR support into their engines. And, as Microsoft promises, there are more groups yet to come.
Project PICA PICA from SEED, Electronic Arts
Though even with the roughly one year head start that Microsoft’s closest developers have received, my impression from all of this is that DXR is still a very long-term project. Perhaps even more so than DirectX 12. While DX12 was a new API for existing hardware functions, DXR is closer to a traditional DirectX release in that it’s a new API (or rather new DX12 commands) that goes best with new hardware. And as there’s essentially zero consumer hardware on the market right now that offers hardware DXR acceleration, that means DXR really is starting from the beginning.
The big question I suppose is just how useful the pure software fallback mode will be; if there’s anything that can meaningfully be done on even today’s high-end video cards without fixed-function hardware for ray tracing. I have no doubt that developers will include some DXR-powered features to show off their wares early on, but making the best use of DXR feels like it will require hardware support as a baseline feature. And as we’ve seen with past feature launches like DX11, having a new API become the baseline will likely take quite a bit of time.
The other interesting aspect of this is that Microsoft isn’t announcing DXR for the Xbox One at this time. Windows and the Xbox One are practically tied at the hip when it comes to DX12, which makes me wonder what role the consoles will have to play. After all, what finally killed DX9 in most respects was the release of the 8th gen consoles where DX11/12 functionality was a baseline. So we may see something similar happen with DXR, in which case we’re talking about a transition that’s 2+ years out.
However we should hopefully get some more answers here later this week at GDC. Microsoft is presenting a couple of different DXR-related sessions, the most important of which is DirectX: Evolving Microsoft's Graphics Platform, presented by Microsoft, DICE, and SEED. So stay tuned for more news from GDC."
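The article's point about per-pixel ray counts and recursion is easier to feel with a toy example. Below is a minimal, self-contained C++ sketch of the classic recursive trace loop it describes, plus the back-of-the-envelope arithmetic for a 1080p frame. Everything in it (Ray, Hit, intersect, shade, and so on) is a placeholder invented for illustration; it is not the DXR API, though the comments note which DXR shader stage each piece loosely corresponds to.
Code:
// Toy recursive ray tracer sketch (illustrative only; these types and helper
// functions are hypothetical placeholders, not the DXR API).
#include <cstdio>
#include <cstdint>

struct Vec3 { float x = 0, y = 0, z = 0; };
struct Ray  { Vec3 origin, dir; };
struct Hit  { bool found = false; Vec3 point, normal; };

Vec3 add(Vec3 a, Vec3 b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }

// Stub intersection test: roughly the question the DXR acceleration structure answers.
Hit intersect(const Ray&) { return {}; }
// Stub shading: a real closest-hit shader would sample materials and lights here.
Vec3 shade(const Hit&, const Vec3&) { return { 0.5f, 0.5f, 0.5f }; }
// Analogue of a DXR miss shader: what a ray returns when it hits nothing.
Vec3 skyColor(const Ray&) { return { 0.2f, 0.3f, 0.8f }; }
// Stub secondary-ray generation (reflection, refraction, etc.).
Ray reflectRay(const Ray& r, const Hit&) { return r; }

// The recursion the article describes: each hit can spawn further rays.
Vec3 trace(const Ray& ray, int depth)
{
    if (depth <= 0) return {};                               // cut off the recursion
    Hit hit = intersect(ray);
    if (!hit.found) return skyColor(ray);                    // "miss" path
    Vec3 direct   = shade(hit, ray.dir);                     // "closest hit" path
    Vec3 indirect = trace(reflectRay(ray, hit), depth - 1);  // secondary ray
    return add(direct, indirect);
}

int main()
{
    // Back-of-the-envelope ray count: one primary ray per 1080p pixel,
    // an assumed depth of 3 rays per pixel, at 60 frames per second.
    const std::uint64_t pixels = 1920ull * 1080ull;          // 2,073,600 pixels
    const std::uint64_t rays   = pixels * 3 * 60;            // ~373 million rays/second, as a floor
    std::printf("rough rays per second: %llu\n", static_cast<unsigned long long>(rays));

    Vec3 c = trace(Ray{}, 3);                                // exercise the sketch once
    std::printf("sample color: %.2f %.2f %.2f\n", c.x, c.y, c.z);
    return 0;
}
Even with only one primary ray per pixel and three bounces, that works out to hundreds of millions of rays every second at 60 fps, which is why the DirectCompute fallback path and, eventually, fixed-function hardware matter so much.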
AMD Announces Real-time Ray Tracing Support for ProRender and Radeon GPU Profiler 1.2
by Nate Oh on March 19, 2018 8:15 PM EST
https://www.anandtech.com/show/1255...cing-for-prorender-and-radeon-gpu-profiler-12
"First disclosed this evening with teaser videos related to a GDC presentation on Unity, today AMD is announcing two developer-oriented features: real-time ray tracing support for the company's ProRender rendering engine, and Radeon GPU Profiler 1.2.
Though Microsoft’s DirectX Raytracing (DXR) API and NVIDIA’s DXR backend “RTX Technology” were announced today as well, the new ProRender functionality appears to be largely focused on game and graphical development as opposed to an initiative angled for real-time ray tracing in shipping games. Similarly, while Radeon GPU Profiler (RGP) has not received a major update since December 2017, as it is AMD’s low-level hardware-based debugging/tracing tool for Radeon GPUs this is likewise purely for developers.
In any case, for Radeon ProRender AMD is bringing support for mixing real time ray-tracing with traditional rasterization for greater computational speed. As with today's other real-time ray tracing announcements, AMD's focus is on capturing many of the photorealism benefits of ray tracing without the high computational costs. At a basic level this is achieved by limiting the use of ray tracing to where it's necessary, enough so that it can be done in real-time alongside a rasterizer. Unfortunately beyond a high-level overview, this is all AMD has revealed at this time. We're told a proper press release will be coming out tomorrow morning with further details.
As for the new version of RGP, 1.2 introduces interoperability with RenderDoc, a popular frame-capture based graphics debugging tool, as well as improved frame overview. The update also brings detailed barrier codes, relating to granular regulation of graphical work among DX12 units.
Regardless, AMD has yet more to say on the ray-tracing topic. Along with tomorrow's press release, AMD has a GDC talk scheduled for Wednesday on "Real-time ray-tracing techniques for integration into existing renderers," presumably discussing ProRender in greater detail."
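To picture what "limiting the use of ray tracing to where it's necessary" looks like in practice, here's a very rough, self-contained C++ sketch of a hybrid frame loop: rasterize everything first, then spend rays only on the pixels that look like they need them. Every type and function below is a hypothetical stub invented for illustration; none of this is ProRender, DXR, or RTX API code.
Code:
// Hybrid raster + ray-traced frame, sketched at the highest level (illustrative only).
#include <cstdio>
#include <cstddef>
#include <vector>

// One entry per pixel from a (stubbed) raster pass; only what this sketch needs.
struct Pixel { float reflectivity = 0.0f; };
using GBuffer = std::vector<Pixel>;

// Stub raster pass: a real renderer would fill depth/normal/material buffers here.
GBuffer rasterizeScene(int w, int h)
{
    GBuffer g(static_cast<std::size_t>(w) * h);
    for (std::size_t i = 0; i < g.size(); ++i)
        g[i].reflectivity = (i % 4 == 0) ? 0.9f : 0.1f;   // pretend 1 in 4 pixels is mirror-like
    return g;
}

bool  needsRays(const Pixel& p)   { return p.reflectivity > 0.5f; } // heuristic: trace only shiny pixels
float traceLighting(const Pixel&) { return 1.0f; }  // stub for the expensive ray-traced path
float shadeApprox(const Pixel&)   { return 0.5f; }  // stub for the cheap rasterized approximation

int main()
{
    const int w = 8, h = 8;                               // tiny frame, just to run the loop
    GBuffer g = rasterizeScene(w, h);

    int traced = 0;
    for (const Pixel& p : g) {
        if (needsRays(p)) { traceLighting(p); ++traced; } // rays only where they pay off
        else              { shadeApprox(p); }             // approximation everywhere else
    }
    std::printf("ray-traced %d of %d pixels\n", traced, w * h);
    return 0;
}
In a real renderer the "needs rays" test would come from material and screen-space information in the G-buffer and the expensive path would dispatch actual rays, but the shape of the loop is the same.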
NVIDIA Announces RTX Technology: Real Time Ray Tracing Acceleration for Volta GPUs and Later
by Nate Oh on March 19, 2018 1:01 PM EST
https://www.anandtech.com/show/1254...tracing-acceleration-for-volta-gpus-and-later
"In conjunction with Microsoft’s new DirectX Raytracing (DXR) API announcement, today NVIDIA is unveiling their RTX technology, providing ray tracing acceleration for Volta and later GPUs. Intended to enable real-time ray tracing for games and other applications, RTX is essentially NVIDIA's DXR backend implementation. For this NVIDIA is utilizing a mix of software and hardware – including new microarchitectural features – though the company is not disclosing further details. Alongside RTX, NVIDIA is also announcing their new GameWorks ray tracing tools, currently in early access to select development partners.
With NVIDIA working with Microsoft, RTX is fully supported by DXR, meaning that all RTX functionality is exposed through the API. And while only Volta and newer architectures have the specific hardware features required for hardware acceleration of DXR/RTX, DXR's compatibility mode means that a DirectCompute path will be available for non-Volta hardware. Beyond Microsoft, a number of developers and game engines are supporting RTX, with DXR and RTX tech demos at GDC 2018."
AMD Presentations at GDC 2018
https://www.reddit.com/r/Amd/comments/85euhj/amd_presentations_at_gdc_2018/
riktothepast 1 day ago
The talks are recorded and then uploaded to the GDC Vault (https://www.gdcvault.com); some of them are available for free on the YouTube channel https://www.youtube.com/channel/UC0JB7TSe49lg56u6qH8y_MQ/videos
dune297 14 hours ago
Hey! I'm working the conference this week. Normally sessions can't be fully recorded beyond the first 5 minutes aside from the union that does the video recording for GDC themselves. The ones presented by AMD (sponsored) give AMD permission to record the session themselves (but I don't think the Vulkan one is AMD sponsored?), so there is some possibility that AMD may put something out on their own, but it's not a huge possibility. Usually videos start going up a week or so after the conference and the GDC Vault gets updated 2 weeks after, so you'll probably have to wait till then.
Presented by AMD:
Real-Time Ray-Tracing Techniques for Integration into Existing Renderers - Wednesday, 21st March, 9:30am-10:30am
The Art of Profiling: Radeon GPU Profiler & RenderDoc - Wednesday, 21st March, 11:00am - 12:00pm
Simulating and Rendering Physically-Realistic Curly Hair - Wednesday, 21st March, 12:45pm - 1:45pm
Taking the Red Pill: Using Radeon GPU Profiler to Look inside Your Game - Wednesday, 21st March, 2:00pm - 3:00pm
Engine Optimization Hot Lap - Wednesday, 21st March, 3:30pm - 4:30pm
Optimizing for the AMD Ryzen Family of CPU and APU Processors - Wednesday, 21st March, 5:00pm - 6:00pm
With AMD speakers:
Advanced Graphics Techniques Tutorial: "New Techniques for Accurate Real-Time Reflections" & "Memory Management in Vulkan and DX12" - Monday, 19th March, 10:00am - 11:00am (Adam Sawicki, Developer Technology Engineer, AMD)
WebGL and glTF - Monday, 19th March, 10:00am - 11:00am (David Wilkinson, SMTS Graphics Engineer, AMD)
HLSL in Vulkan: There and Back Again - Monday, 19th March, 2:40pm - 3:40pm (Matthaus G. Chajdas, Developer Technology Engineer, AMD)
Advanced Graphics Techniques Tutorial: Water Rendering in 'Far Cry 5' - Monday, 19th March, 4:00pm - 5:00pm (Cristian Cutocheras, Member of Technical Staff, AMD)
Getting Explicit: How Hard is Vulkan Really? - Monday, 19th March, 5:30pm - 6:30pm (Matthaus G. Chajdas, Developer Technology Engineer, AMD)
Intel Presentations at GDC 2018
https://www.reddit.com/r/intel/comments/85f198/intel_presentations_at_gdc_2018/
Presented by Intel:
The Blade for All Conquerors: Making the Most of Intel Core for the Best Gaming Experience - Wednesday, March 21, 9:30am - 10:30am
Optimizing Total War: 'WARHAMMER II' - Wednesday, March 21, 11:00am - 12:00pm
Maximize Your Audience: Getting Space Pirate Trainer to Perform on Intel Integrated Graphics - Wednesday, March 21, 2:00pm - 3:00pm
Masked Software Occlusion Culling (MOC): A Guide to CPU Culling - Wednesday, March 21, 3:30pm - 4:30pm
World of Tanks: Enriching Gamer's Experience with Multi-Core Optimized Physics and Graphics - Thursday, March 22, 10:00am - 11:00am
Scaling CPU Experiences: Maximizing the Unity Job System on All Levels of Hardware - Thursday, March 22, 11:30am - 12:30pm
Accelerating Game Development and Enhancing Game Experiences with Intel Optane Technology - Thursday, March 22, 4:00pm - 5:00pm
Forts and Fights from Fun-Size to Full-Size: Scaling Fortnite and Unreal Tournament with Unreal Engine - Friday, March 23, 10:00am - 11:00am
GDC18 show guide from NVIDIA
https://www.reddit.com/r/nvidia/comments/85eywy/gdc18_show_guide_from_nvidia/
GDC18 show guide from NVIDIA
This is a big year at GDC for NVIDIA. We will be making some exciting announcements about new technologies we are introducing as well as major updates to some of our most popular features. We will be reviewing the latest developments in deep learning for game developers, and announcing our new graphics debugger and the public release of NVIDIA Highlights and ANSEL Photo Mode. Stay tuned for the latest news!
https://developer.nvidia.com/gdc18-show-guide-nvidia
Sponsored Sessions
Wednesday, March 21st | Room: 3022
- Improving Real-Time Rendering of Reflections, Shadows and Ambient Occlusion | Time: 12:45 - 1:45 PM | NVIDIA
Session Description: Edward Liu and Ignacio Llamas from NVIDIA will present ground-breaking advances to significantly improve the quality of real-time rendering of reflections, shadows and ambient occlusion
Speakers: Ignacio Llamas, Senior Manager, Real Time Rendering Software & Edward Liu, Senior Real Time Rendering Engineer
- Real-Time Rendering Advances from NVIDIA Research | Time: 2:00 - 3:00 PM | NVIDIA
Session Description: Morgan McGuire and Petrik Clarberg will discuss hot-off-the-press innovations in real-time illumination and coming disruptive changes to renderer designs. Nir Benty will discuss how these sea-changes affect the next version of Falcor, NVIDIA's open source R&D rendering framework.
Speakers: Morgan McGuire, Distinguished Research Scientist | Petrik Clarberg, Senior Research Scientist | Nir Benty, Senior Graphics Software Engineer
- Temporal Super-Resolution | Time: 3:30 - 4:00 PM | NVIDIA
Session Description: Marco Salvi will present a simple but powerful extension to temporal anti-aliasing that combines anti-aliasing and super-resolution. The resulting algorithm, TSRAA, adds only a small runtime cost, is easier to integrate into a rendering engine than complex schemes like checkerboarding, and allows for flexible amounts of super-resolution.
Speaker: Marco Salvi, Principal Research Scientist
- Advances in Real-Time Voxel-Based GI | Time: 4:00 - 4:30 PM | NVIDIA
Session Description: Voxel-based lighting has been around for a few years, but very few games have actually used it so far. In this session, we're going to discuss VXGI, NVIDIA's voxel-based lighting solution, and uncover some recent improvements that aim to make it more practical and useful even for VR games on present-day graphics hardware. We will also discuss how simple, planar area lights can be efficiently implemented with a combination of voxel-based occlusion and analytic irradiance calculations.
Speakers: Alexey Panteleev, Senior Developer Technology Engineer & Rahul Sathe, Senior Developer Technology Engineer
- Interactive Global Illumination in Frostbite | Time: 5:00 - 5:30 PM | Electronic Arts
Session Description: In this talk, Frostbite Rendering will present the results of ongoing R&D being conducted on our high quality static light baking solution (Flux) to reach the goal of an interactive GI workflow for artists. We will describe the techniques we applied to migrate our lightmap renderer to GPU and talk about future avenues we see opening up from this effort. Flux is used in FIFA, Madden and Star Wars Battlefront II.
Speakers: Sébastien Hillaire, Senior Software Engineer, Frostbite
- Shiny Pixels and Beyond: Rendering Research at SEED | Time: 5:30 - 6:00 PM | Electronic Arts
Session Description: In this talk, we will present results from the latest research done at SEED, a cross-disciplinary team working on cutting-edge, future graphics technologies and creative experiences at Electronic Arts. We will explain in detail the techniques from our research, so that attendees can understand how these results were achieved. We hope to inspire developers and provide a glimpse of the future with novel rendering techniques that could power the creative experiences of tomorrow.
Speakers: Colin Barré-Brisebois, Senior Software Engineer & Johan Andersson, Technical Fellow
- Nvidia GameWorks: New Simulation Features | Time: 10:00 - 11:00 AM | NVIDIA
Session Description: In this talk we will discuss new aspects and features developed for the NVIDIA GameWorks simulation libraries. We will present a new PhysX SDK rigid body solver and new joint features that offer better overall accuracy and robustness. We will give an overview on the NVIDIA Blast destruction library and the corresponding Unreal plugin. Further we will present new Unity and Unreal plugins for the Flex particle-based simulation library.
Speakers: Michelle Lu, Senior Software Engineer | Kier Storey, Devtech Manager | Simon Schirm, PhysX and FLEX Project Lead
- Using artificial intelligence to enhance your game 2 of 2 | Time: 11:30 - 12:30 | NVIDIA & Microsoft
Session Description: Machine learning has revolutionized many important fields, ranging from computer vision and natural language processing to healthcare and robotics. In two sessions, Microsoft and NVIDIA will discuss how developers can embrace machine learning methods for graphics and gaming. We’ll cover both gaming use cases and applications for machine learning as well as how to best leverage recent GPU hardware for machine learning workloads.
Speakers: Yury Uralsky, Distinguished Engineer, GPU Architecture | Stuart Schaefer, Partner Software Architect, Microsoft | Daniel Kennett, Microsoft
- Deep Learning for Game Developers | Time: 12:45 - 1:45 PM | Room 3014 | NVIDIA
Session Description: Deep learning continues to find new applications in many domains relevant to game developers. In this talk, we will discuss research on some of the most interesting new ways to apply deep learning to problems such as content creation, graphics, text, and speech. This talk will give you a sense of the current capabilities of deep learning, as well as where the field is headed.
Speaker: Bryan Catanzaro, VP, Applied Deep Learning Research
- Deep Learning for Animation and Content Creation | Time: 2:00 - 3:00 PM | NVIDIA
Session Description: In this talk, the speakers examine tools and technologies that NVIDIA’s GameWorks team is building to leverage the power of Deep Learning for content creation. They demonstrate recent research into ways that Deep Learning networks can be used to generate realistic looking human animation, and talk about how to use GPUs for high performance runtime inferencing of networks in modern games. The talk also covers the latest work in applying deep learning to texture synthesis and super resolution (including video) using both cloud and client based approaches.
Speakers: Gavriel State, Senior Director, System Software & Andrew Edelsten, Director, Developer Technologies (Deep Learning)
- NVIDIA GameWorks Technologies in Final Fantasy XV, behind the scenes | Time: 3:00 - 3:30 PM | Room 2035 (South Expo Floor) | NVIDIA
Session Description: Improving visual fidelity in modern games is a challenging task and Final Fantasy XV was not an exception. Doing this requires detailed analysis of the game engine to come up with a list of technologies which would naturally fit into existing content. The existing game was already very high fidelity. With all the resources and excellent cooperation from Square Enix Business Division 2 we were ready to take risks and tried to utilize every single piece from GameWorks to make the fantasy become a bit more real.
Speakers: Masaya Takeshige, Senior Development Technology Engineer & Evgeny Makarov, Senior Developer Technology Engineer
- Beyond performance; Introducing NVIDIA's new Graphics Debugger | Time: 4:00 - 5:00 PM | NVIDIA
Session Description: In this session, the Developer Tools team at NVIDIA will present a powerful new tool for graphics debugging and profiling. We'll demonstrate features like Pixel History, Events Viewer and Resource Viewer, as well as advanced profiling functionality like the Range Profiler. In addition, we'll have some exciting new features to show that we hope will become an essential part of every programmer's arsenal for debugging graphics applications. The key takeaway will be a strong understanding of our tools and how to make them an everyday part of your development process.
Speakers: Aurelio Reis, Director of Graphics Developer Tools & Dan Price, Engineering Manager, Graphics Developer Tools
- Fixing the Hyperdrive - Maximizing Rendering Performance on NVIDIA GPUs | Time: 5:30 - 6:30 PM | NVIDIA
Session Description: With the release of Nsight 5.5, a subset of the key GPU hardware metrics we have been using at NVIDIA to come up with driver-side and application-side optimizations are finally available to all DX11 and DX12 developers.
This talk will start with a general theoretical overview of our performance triage method. It will then show examples of how our method can be applied to triage and speedup various GPU workloads that suffer from non-obvious pathologies, such as suboptimal resource binding on DX12 impacting the latencies of texture fetch instructions.
Speaker: Louis Bavoil, Principal engineer
- NVIDIA Vulkan Update | Time: 10:00 - 11:00 AM | NVIDIA
Session Description: Two years after release, Vulkan is a mature and full-featured low-level graphics API, with significant adoption in the developer community.
NVIDIA will present a status update on our Vulkan software stack. We will cover latest Vulkan developments, including extensions, software libraries and tools. We will also cover best practices and lessons learned from our own work with the Vulkan API in the past year.
Speaker: Nuno Subtil, Senior Developer Technology Engineer
- Aftermath – Advances in GPU Crash Debugging | Time: 11:30 AM - 12:00 PM | Room 3001/3003 | NVIDIA
Session Description: An update on what's been happening in the world of TDR debugging and NVIDIA's 'Aftermath' technology. With the increasing complexity of rendering APIs and GPU technology, we need to find a better way of debugging issues that present on the GPU. Since the release of Aftermath last year there have been many changes and improvements in this pivotal area; in this session we'll recap where we are and review the latest and greatest!
Speaker: Alex Dunn, Senior Developer Technology Engineer
- Capture Amazing Content with NVIDIA Ansel Photo Mode and Highlights Video Capture Tool | Time: 12:15 - 1:15 PM | NVIDIA
Session Description: In this talk, we will show how to easily integrate GeForce Experience platform features like NVIDIA Ansel photo mode and NVIDIA Highlights video recording.
NVIDIA Ansel allows gamers to compose and style photos as well as capture 360 photospheres and HDR images. NVIDIA Highlights allows gamers to automatically capture their best moments through developer-defined game events.
Speaker: Bryan Dudash, Senior Manager Developer Technology
- Advances in the HDR Eco-System | Time: 1:30 - 2:30 PM | NVIDIA
Session Description: HDR display devices have received much attention recently. However, most of the focus has been on TVs, making it difficult for production pipelines and PC gamers. With desktop HDR such as G-SYNC HDR becoming mass market, now is a great time to take a fresh look at our pipelines. This talk will cover the standards and technologies behind HDR from a PC perspective with a dive into some color science to help understand the complexity of it all. Ultimately, it will boil this down to practical advice for game developers.
Speaker: Evan Hart, Principal Engineer
- Accelerating your VR Games with VRWorks | Time: 3:00 - 4:00 PM | NVIDIA
Session Description: Across graphics, audio, video, and physics, the NVIDIA VRWorks suite of technologies helps developers maximize performance and immersion for VR games. We'll explore the latest features of VRWorks, explain the VR-specific challenges they address, and provide application-level tips and tricks to take full advantage of these features, whether you use your own home-grown engine or one like Unreal Engine or Unity. Special focus will be given to the newest technologies in VRWorks, with an emphasis on the most relevant bits to game developers.
Speaker: Cem Cebenoyan, Director of Engineering
-
cj_miranda23 Notebook Evangelist
A very wise marketing ploy to hype up the upcoming GPUs! This might be even better news:
ASRock to join the graphics card market with ‘Phantom Gaming’ GPU
https://www.kitguru.net/components/...graphics-card-market-with-phantom-gaming-gpu/
https://www.anandtech.com/show/12544/asrock-teases-phantom-gaming-graphics-cards
-
GDC 2018 - Five things to watch out for
-
But thanks for the comment, it's nice to know my efforts to share what I enjoy don't go unappreciated.
Real-time ray-tracing demo this time around. Real-time on what hardware, though? Nvidia RTX is Volta or higher, which leaves exactly one $3000 prosumer card, not counting Quadros or unannounced consumer cards. Alternatively, there's Radeon Rays, which in AMD spirit is open-source, and of course the hardware-agnostic DirectX Raytracing (DXR for short).
Futuremark also has an upcoming demo, but only comparison screenshots have been released thus far AFAIK.
https://www.techpowerup.com/242556/...racing-demo-teases-upcoming-3d-benchmark-test
The ray tracing demos look cool, though everything has that slightly uncanny, fake ray-traced look to it. It reminds me of the Amiga days: everything is just a little too reflective and perfect.
What I am concerned about is Nvidia playing dirty pool by sticking another proprietary hardware/software box around games, one that locks out good performance on other GPUs, like those from AMD and soon from Intel.
Not to mention spiking GPU prices again with "Tensor" hardware we "must have" to game.
Nvidia should not force the inclusion of ray tracing in games; it should leave the expensive Tensor core add-ons out of gaming GPUs and find a way to reduce GPU costs for consumer gaming products, not increase them!
Now, if all GPUs can support these extensions in an open development environment, GPU costs don't go up at all, and it doesn't slow down FPS or look like crap in games, then I'm all for ray tracing.
http://forum.notebookreview.com/threads/nvidia-thread.806608/page-24#post-10699344
GDC 2018 Sizzle Reel | Unreal Engine
State of Unreal | GDC 2018 | Unreal Engine
-
-
There is a lot of mid-range hardware in the top 60% of video cards on Steam.
Not sure if all were shown / connected to GDC 2018, but all were released around the same day.
Siren Behind The Scenes | Project Spotlight | Unreal Engine
Siren Real-Time Performance | Project Spotlight | Unreal Engine
Fortnite Replay System | Project Spotlight | Unreal Engine
Ali-A Takes On Unreal Engine Replay | Project Spotlight | Unreal Engine
3Lateral’s Osiris Black Performed by Andy Serkis | Project Spotlight | Unreal Engine
Next-Gen Digital Human Performance by Andy Serkis | Project Spotlight | Unreal Engine
GDC 2018 Live Stream
GDC
Started streaming 37 minutes ago... right now jump to -27:30 for start...
-
-
-
Epic Games has now confirmed what hardware they used to render their "Reflections" demo in real time: a whopping quad Tesla V100 system, in the form of the $60,000 Nvidia DGX Station. I expected multiple very expensive cards in some form, but that's just insane when the term "real-time rendering" is used to bait consumers into drumming up hype for a new technology.
https://www.tweaktown.com/news/6129...wer-unreal-engine-ray-tracing-demo/index.html
In other news, 4A Games has confirmed Metro Exodus will feature Nvidia RTX, making them the first developer to announce the inclusion.
GDC 2018 Tech Demo - NVIDIA RTX Real-Time Ray Tracing in Metro Exodus
Published on Mar 22, 2018
We are happy to announce our collaboration with NVIDIA using RTX technology to include real-time ray traced Global Illumination in our upcoming game, Metro Exodus.
This demonstration shows RTX implemented and running in Metro Exodus, on our proprietary 4A Engine, using actual game content - you might recognise this environment from our 2017 E3 trailer! We have utilized true raytracing to render both Ambient Occlusion and Indirect Lighting in full realtime, in a practical in-game scenario.
“Previously, we had utilized a mix of several custom-made systems to satisfy our hungry demand for dynamic content of varying scale. Now we are able to replace it with one single system that covers all our needs and outputs the quality of offline renderers.” - 4A Games’ Chief Technical Officer, Oleksandr Shyshkovtsov.
Check out the full blog post on our website for more information:
NVIDIA Reveals Collaboration with 4A Games on RTX Technology at GDC 2018
http://www.4a-games.com.mt/4a-dna/2...n-with-4a-games-on-rtx-technology-at-gdc-2018
Metro Exodus is an epic, story-driven first person shooter from 4A Games that blends deadly combat and stealth with exploration and survival horror in one of the most immersive game worlds ever created.
Flee the shattered ruins of dead Moscow and embark on an epic, continent-spanning journey across post-apocalyptic Russia in the greatest Metro adventure yet.
Explore the Russian wilderness in vast, non-linear levels and follow a thrilling story-line inspired by the novels of Dmitry Glukhovsky that spans an entire year through spring, summer and autumn to the depths of nuclear winter.
Metro Exodus will be departing Autumn 2018 on Xbox One, PlayStation 4, and PC!
Metro:
Web: http://www.MetroTheGame.com
Twitter: http://twitter.com/MetroVideoGame
Facebook: http://facebook.com/MetroVideoGame
Instagram: http://instagram.com/MetroVideoGame
4A Games:
Web: http://4a-games.com/
Twitter: https://twitter.com/4agames
Facebook: https://facebook.com/4aGames/ -
This is definitely a better showing, a little more realistic instead of being overly reflective for demo's sake. I'm sure it'll get better.
PS: Also note the reflective water right at the very end, in the upper right corner.
GDC 2018 Wrap-up
Juicy Realm hands-on at GDC 2018
Tobii EyeCore hands-on at GDC 2018
Evasion VR GDC 2018 Update
-
Hands-On with Oculus Go VR Headset!
Oculus Go hands-on at GDC 2018
Oculus Go: First hands-on impressions of Facebook's $199 VR headset!
Oculus Go... Who Cares?
In The Crags - GDC 2018 Trailer
428 Shibuya Scramble GDC 2018 - Preview
Fire Pro Wrestling World GDC 2018 - Preview
Steins;Gate Elite GDC 2018 - Preview
-
GDC 2018 - 18 minutes of EverReach: Project Eden Gameplay with Dieter Schoeller
Zanki Zero GDC 2018 - Preview
PixelJunk™ Monsters 2 GDC 2018 - Preview
Flipping Death is Funny and a Whole Lot of Fun - GDC 2018
Wavedash Games CEO talks about Icons: Combat Arena at GDC 2018
GDC 2018, Game Analytics
GDC 2018, Mintegral
GDC 2018, Kabam
-
Tens of thousands of gamers flock to San Francisco for GDC 2018
Nintendo Switch GDC Awards 2018 (Game Developers Choice Awards)
San Francisco GDC 2018: Chatterview with Limited Run Games!
San Francisco GDC 2018: The Neohabitat Project Chatterview
Clunker Junker [alt.ctrl.gdc 2018]
GDC 2018 - Logitech G
Harold Halibut hands-on at GDC 2018
The Tech Behind Harold Halibut's Art Style Is Insane - GDC 2018
-
Black Future 88 hands-on at GDC 2018
Where Cards Fall hands-on at GDC 2018
Exclusive GDC 2018: Home Sweet Home Interview
GDC 2018 Developer Talks Conan Exiles
-
Catan VR GDC 2018 Gameplay Interview
2018 GDC Awards: Cuphead (Best Visual Art Winner)
Viveport Developer Awards at GDC 2018
GDC 2018 - Vulkan on Android
The Khronos Group - GDC 2018 - 6 video Playlist -
GGTV - GDC 2018: Doctor Who Infinity with Susan Cummings
The Technology Behind NVIDIA RTX - GDC 2018
NVIDIA RTX and GameWorks Ray Tracing Technology Demonstration
NVIDIA RTX Real-Time Ray Tracing Tech Demo From Remedy Entertainment
GDC 2018, Games controller innovations
GDC 2018 - March 19th - 23rd Moscone Center, San Francisco, CA