The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Nvidia Thread

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Dr. AMK, Jul 4, 2017.

  1. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    Nvidia is crowned the smartest company in the world right now (NVIDIA)
    Nvidia was crowned the smartest company in the world by MIT.
    The graphics processing unit (GPU) manufacturer was announced as the smartest among all public and private companies by MIT because of its business savvy and innovative technologies.

    MIT's list placed other companies, like SpaceX and Amazon, near the top, but neither could match the prowess of Nvidia.

    "The list is our best guess as to which firms will be the dominant companies of the future. Amazon and Facebook and Google are on it, but so are plenty of newcomers," David Rotman wrote about the list.

    The main advantage Nvidia has over its competition comes from the explosive growth in artificial intelligence.

    As giant companies like Google and Apple try to develop their AI technologies, they often need huge data centers to speed along their research. Nvidia profits from this because these companies use its data-center GPU chips when building and upgrading these centers. MIT says the company spent $3 billion to develop its new data-center chip, a bet that has paid off for the company. The company's chips are used by every major player in the AI game, according to Nvidia.

    Self-driving cars have been dominating the news recently. Many of the same names working on AI technology are also working on self-driving cars. Nvidia has partnered with a slate of car manufacturers to use the company's autonomous-driving chipset.

    Though not mentioned by MIT, the exciting field of cryptocurrencies could be another big business for Nvidia. The company is rumored to be producing a cryptocurrency mining-specific chip that could help it capitalize on the hype around currencies like Bitcoin and Ethereum.

    All these areas have no doubt contributed to the company's impressive rise. Nvidia shares are up 37.51% this year. The company is also no stranger to MIT's smartest companies list, as it made an appearance in 2015 and 2016 as well.

    Markets Insider


     
    Last edited: Mar 11, 2018
    Vasudev and chezzzz like this.
  2. CaerCadarn

    CaerCadarn Notebook Deity

    Reputations:
    320
    Messages:
    1,169
    Likes Received:
    1,124
    Trophy Points:
    181
    A great 'Huzzah' from my side!!!

    Sent from my ONEPLUS A3003 via Tapatalk
     
    Dr. AMK likes this.
  3. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Smartest but least commercially ethical.
     
    Maleko48, oSChakal, triturbo and 5 others like this.
  4. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    That mining chip has me interested, if it takes off that will be a lot of cash for them.
     
    jaug1337, Vasudev, Starlight5 and 2 others like this.
  5. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Have you met Intel?
     
    Gursimran82956, chezzzz and Dr. AMK like this.
  6. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    I really feel like in both cases what they are doing is more "We're on top" laziness. You'll see that stuff going away when they've got AMD breathing down both their necks.
     
  7. ChanceJackson

    ChanceJackson Notebook Evangelist

    Reputations:
    39
    Messages:
    562
    Likes Received:
    231
    Trophy Points:
    56
    Not too surprising, sadly. Like Intel, they have the best tech in their segments, and the runner-up has to compete with both across all segments.
     
    Dr. AMK likes this.
  8. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    I think AMD's new GPUs will solidify that into a good system where they dominate the lower end and offer a good price compromise on the high end. Last RX run didn't really accomplish that for me.
     
    Dr. AMK likes this.
  9. James D

    James D Notebook Prophet

    Reputations:
    2,314
    Messages:
    4,901
    Likes Received:
    1,132
    Trophy Points:
    231
    If nvidia is so smart then why can't it create 2 non-problematic graphics driver sets in a row?
     
    Maleko48 and Dr. AMK like this.
  10. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    They seem to be putting a lot more resources towards dev than support. Another place AMD can move into a good position in the market if they play their cards right.
     
    Dr. AMK likes this.
  11. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    I think GPUs and graphics cards are a small segment of their business. They are not the smartest company only for selling cards; correct me if I'm wrong.
     
  12. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331

    More accurately, consumer GPUs aren't all of their business. GPUs for supercomputing are huge right now, and expensive.
     
    ChanceJackson, hmscott and Dr. AMK like this.
  13. StormJumper

    StormJumper Notebook Virtuoso

    Reputations:
    579
    Messages:
    3,537
    Likes Received:
    488
    Trophy Points:
    151
    That's called being blindsided by Intel.
    I'll believe that when I see it, but so far their CPU is a different story.
    AMD should be the one crying foul, but we got their surrogate to do that instead; not surprised here.
    Unfortunately AMD fell on their sword and let Nvidia go straight over them.
     
    Dr. AMK likes this.
  14. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    I don't think we've seen Ryzen/TR have their full impact yet.
     
    Dr. AMK likes this.
  15. ChanceJackson

    ChanceJackson Notebook Evangelist

    Reputations:
    39
    Messages:
    562
    Likes Received:
    231
    Trophy Points:
    56
    The reviews on the Vega FE aren't that impressive: basically 1070 level, IIRC, and the future doesn't bode well considering Volta is around the corner.

    Having to split major R&D bucks between CPU and GPU is clearly not a sustainable model for competing with market leaders who only need to invest in one of those areas. That's why Nintendo had to ditch its three-generation-long partner and go with nVidia: AMD didn't have anything that could compete on TDP-to-performance with the 920MX, which is about 550 GFLOPS FP32 for only a 16 W TDP, and that is last-gen hardware!
     
    Last edited: Jul 12, 2017
    Dr. AMK likes this.
  16. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331

    Thought that had more to do with a different direction in form factor, but point taken. I'd be interested to see if XBox and PS products make a similar switch from AMD APUs in the future.
     
    ChanceJackson and Dr. AMK like this.
  17. ChanceJackson

    ChanceJackson Notebook Evangelist

    Reputations:
    39
    Messages:
    562
    Likes Received:
    231
    Trophy Points:
    56
    Game consoles aren't as worried about Watt to Perf ratios since they have more cooling real estate and don't have to worry about battery life, IMO the focus there is more like cost to perf with an emphasis on total cost which I doubt nVidia will be willing to race to the bottom with AMD on
     
    Dr. AMK likes this.
  18. James D

    James D Notebook Prophet

    Reputations:
    2,314
    Messages:
    4,901
    Likes Received:
    1,132
    Trophy Points:
    231
    They know what happens after they choose Nvidia (monopoly). But if one becomes greedy enough to bite on Nvidia's offer, then the second will jump too, and oh boy.
     
    Dr. AMK likes this.
  19. ChanceJackson

    ChanceJackson Notebook Evangelist

    Reputations:
    39
    Messages:
    562
    Likes Received:
    231
    Trophy Points:
    56
    Also iirc both xbox and playstation divisions got burned dealing with nVidia reaping all the benefits of die shrinks and not passing cost savings down the line
     
    Dr. AMK likes this.
  20. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    That makes sense. The good news is consoles are still big sellers so AMD's got a decent income stream from that to play with.
     
    Dr. AMK and ChanceJackson like this.
  21. ChanceJackson

    ChanceJackson Notebook Evangelist

    Reputations:
    39
    Messages:
    562
    Likes Received:
    231
    Trophy Points:
    56
    Let's just hope AMD can be competitive enough to keep getting contracts next gen as that and OpenCL users are AMD's saving grace right now
     
    Dr. AMK likes this.
  22. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    You think they'll cash in on the mining craze? There's already dedicated NVidia GPUs for it, but AMD seems like the more popular choice, if they made a dedicated card I think they'd clean up.
     
    ChanceJackson and Dr. AMK like this.
  23. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
  24. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Dr. AMK likes this.
  25. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
  26. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
  27. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
  28. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
  29. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Dr. AMK likes this.
  30. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    Dr. AMK likes this.
  31. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331

    Still cool. But vertically oriented displays, even when they're useful, are like visual nails on a chalkboard to me.
     
    Dr. AMK and tilleroftheearth like this.
  32. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    Agreed! :D :D :D

    While they may fit a standard 'page' - they don't fit with our naturally landscape viewpoint by a long shot.

     
    Dr. AMK likes this.
  33. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Half the time they're just used for lists anyway. What a waste.
     
    Dr. AMK and tilleroftheearth like this.
  34. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    NVIDIA Inventions Promise to Make Augmented Reality More Comfortable
    https://blogs.nvidia.com/blog/2017/07/27/nvidia-research-augmented-reality-vr/

    Few moments are more magical than slipping on a headset and being instantly transported to an immersive virtual world.

    To help bring such experiences to more people, we’re showing some of NVIDIA Research’s latest work to heighten the magic of VR and AR at next week’s SIGGRAPH computer graphics conference in Los Angeles.

    We’ll present work in two areas: what researchers call “varifocal displays,” which give users the ability to focus more naturally while enjoying VR and AR experiences; and haptics, which enhances VR and AR with touch and feel. This represents the latest in a growing body of research we’ve shared over the past decade at industry events such as SIGGRAPH, as well as academic venues.
    Enhancing Focus in AR and VR
    We’re demonstrating a pair of techniques that address vergence-accommodation conflict. That conflict arises when our eyes, accustomed to focusing on objects in 3D space, are presented with stereo images that contain parallax depth cues but are displayed on a flat screen at a constant optical distance. Both techniques aim to solve this, in different ways, by varying the focus of virtual images in front of a user depending on where they’re looking.

    The first, Varifocal Virtuality, is a new optical layout for near-eye display. It uses a new transparent holographic back-projection screen to display virtual images that blend seamlessly with the real world. This use of holograms could lead to VR and AR displays that are radically thinner and lighter than today’s headsets.

    This demonstration makes use of new research from UC Berkeley’s Banks lab, led by Martin Banks, which offers evidence that our brains use what a photographer would call chromatic aberration (colored fringes appearing on the edges of an object) to help understand where an image is in space.

    Our demonstration shows how to take advantage of this effect to better orient a user. Virtual objects at different distances, which should not be in focus, are rendered with a sophisticated simulated defocus blur that accounts for the internal optics of the eye.

    So when a user is looking at a distant object it will be in focus. A nearby object they are not looking at will be more blurry just as it is in the real world. When the user looks at the nearby object, the situation is reversed.

    The second demonstration, Membrane VR, a collaboration between University of North Carolina, NVIDIA, Saarland University, and the Max-Planck Institutes, uses a deformable membrane mirror for each eye that, in a commercial system, could be adjusted based on where a gaze tracker detects a user is looking.

    The effort, led by David Dunn, a doctoral student at UNC, who is also an NVIDIA intern, allows a user to focus on real-world objects that are nearby, or far away, while also being able to see virtual objects clearly.

    For example, a label displaying a person’s name above a person’s head might appear to a user to actually be on top of their head, creating an experience that blends the virtual and real worlds more seamlessly. (To learn more, read the award-winning paper Dunn co-authored on this technique.)

    New Ideas in Haptics
    We’re also showing off two new techniques for using fluid elastomer actuators — small air chambers — to provide haptic feedback that enhances VR and AR by connecting what you see on your display to what you feel in your hand. Both are created by Cornell University in collaboration with NVIDIA.

    One is a prototype VR controller that lets VR users experience tactile feedback while they play, relaying a sense of texture and changing geometry. Its soft skin can safely provide force feedback, as well as simulate different textures and materials.

    The second is a controller that changes its shape and feel as you use it. So, a foam sword — the kind you might wave around at a sporting event — feels soft and squishy, yet can transform, in a moment, into a katana that feels longer and firmer in your grip.

    We’ve integrated these novel input devices with our VR Funhouse experience. You’ll feel a knock when you whack-a-mole with a mallet in the game, or a kick when you fire at plates in a shooting gallery with antique revolvers.

    Learn More
    There is a lot of work left to be done to take ideas like these to market and make VR and AR more comfortable for users. But, from optics to haptics, NVIDIA is committed to solving the industry’s hardest technology problems in order to drive mass adoption of VR and AR.

    Come see our latest ideas on display at SIGGRAPH’s Emerging Technologies exhibit. And don’t forget to stop by our booth for demonstrations of how you can put technologies such as AI and VR to work.
     
    Last edited: Mar 11, 2018
  35. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    That's pretty sweet, especially the haptic feedback. Having gloves that could approximate whatever controls you want would be awesome, like if they could restrict their shape to simulate a gun or provide feedback to a virtual interface.
     
    Dr. AMK likes this.
  36. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    If they did that I'll be the 1st buyer :)
     
  37. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    I'll be right behind you. Having a completely customizable interface within an environment would be probably the biggest non-visual interface leap that isn't a full body suit.
     
    Dr. AMK likes this.
  38. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    DeepSat: Monitoring the Earth’s Vitals with AI
    https://insidehpc.com/2017/08/deepsat-monitoring-earths-vitals-ai/

    This sponsored post from NVIDIA is the first of five in a series of case studies that illustrate how AI is driving innovation across businesses of every size and scale.

    The Earth’s climate has changed throughout history. In the last 650,000 years, there have been seven cycles of glacial advance and retreat, with the abrupt end of the last ice age (about 7,000 years ago) marking the beginning of modern climate and the era of human civilization.

    NASA: Earth’s 2016 surface temperatures were warmest on record. (Photo: climate.nasa.gov)

    Most of these climate changes are attributed to very small variations in Earth’s orbit that change the amount of solar energy our planet receives. Many of the more recent changes, however, are directly related to increased amounts of carbon in the atmosphere. September 2016 was the warmest September in 136 years of modern record-keeping, according to a monthly analysis of global temperatures by scientists at the NASA Goddard Institute for Space Studies (GISS).

    Given the impact of this change, the need to monitor land surface from satellite images and understand the impact of warming on crop yield changes, vegetation, and other landscapes is vital. However, automating satellite image classification is a challenge due to the high variability inherent in satellite data and the lack of sufficient training data. Most methods rely on commercial software that is difficult to scale given the region of study (the entire globe) and frequently encounter issues around compute and memory-intensive processing, cost, massively parallel architecture, and machine learning automation.
    Deep Learning for DeepSat
    In order to better keep a finger on the pulse of the Earth’s health, NASA developed DeepSat, a deep learning AI framework for satellite image classification and segmentation. An ensemble of deep neural networks within NASA’s Earth Science and Carbon Monitoring System, DeepSat provides vital signs of changing landscapes at the highest possible resolution, enabling scientists to use the data for independent modeling efforts. This is just one way innovation in deep learning and AI has led to a deeper understanding of our planet.

    Zoom of San Francisco, CA, with individual trees segmented in this highly heterogeneous landscape. (Photo: NVIDIA)

    DeepSat is used in many ways, including accurately quantifying the amount of carbon sequestered by vegetated landscapes to offset emissions; downscaling climate projection variables at high resolution; and providing critical layers that allow assessment of effects like the urban heat island and rooftop solar efficiency. DeepSat provides a robust library of models that are trained using millions of tunable parameters and can be scaled across very large and noisy data sets.


    With the compute power of NVIDIA GPUs, NASA was able to train the networks from a survey of 330,000 image scenes across the continental U.S. Average image tiles were 6000 x 7000 pixels, each weighing about 200 MB. The entire dataset for this sample was close to 65 TB for a single time epoch, with a ground sample distance of one meter. NASA also built a large training database (SATnet) of hand-generated, labeled polygons representing different land cover types for model training. CNN models were trained on an NVIDIA DIGITS DevBox, and the trained model was run on all image scenes using the NASA Ames Pleiades supercomputer GPU cluster, equipped with NVIDIA Tesla GPUs with 217,088 NVIDIA CUDA cores.
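    As a quick sanity check, the dataset figures quoted above are internally consistent; here is a back-of-the-envelope calculation (assuming decimal MB/TB units, i.e. 1 TB = 1,000,000 MB):

```python
# Back-of-the-envelope check of the DeepSat dataset sizes quoted above
# (decimal units assumed: 1 TB = 1,000,000 MB).
scenes = 330_000            # image scenes across the continental U.S.
mb_per_scene = 200          # average tile size in MB
total_tb = scenes * mb_per_scene / 1_000_000
print(f"total dataset: ~{total_tb:.0f} TB")   # ~66 TB, close to the ~65 TB quoted

# Implied storage per pixel for a 6000 x 7000 tile at 200 MB:
bytes_per_pixel = 200e6 / (6000 * 7000)
print(f"~{bytes_per_pixel:.1f} bytes/pixel")  # ~4.8, consistent with multi-band imagery
```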


    “Our best network dataset produced a classification accuracy of 97.95% and outperformed three state-of-the-art object recognition algorithms by 11%.” – Sangram Ganguly, Senior Research Scientist, NASA Ames Research Center.

    A Firmer Grasp on Carbon Impact

    Powered by NVIDIA GPUs, testing and training performance saw marked improvement across the board. Increasing the input size allowed for gradient descent with less noise, bigger images that provided more context for classification, and improved classification/segmentation accuracy. Training time was reduced, which in turn enables more experimentation and faster innovation.


    Such insight into Earth’s behavior has potential impact across industries, countries, and humanity itself. Imagery showing the changes to our planet’s vital signs can better prepare governments to plan for natural disasters—showing areas at risk of forest fires, flooding, and avalanches. It can also assist farmers with crop production on a hotter, drier planet, and provide deeper insight into sea levels, temperatures, and levels of acidity. It would not be an overstatement to say that the future of our planet depends upon it.

    This case study on innovation within the AI industry from NVIDIA first ran as part of the company’s Deep Learning Success Stories.
     
    Last edited: Mar 11, 2018
  39. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    That's really cool. I guess I'm still mentally living in an aerial/sat layered era where they just overlay GIS type data on photos, this is light years beyond that.
     
    Dr. AMK likes this.
  40. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    NVIDIA Builds On AI Success At GTC Beijing
    Jensen Huang, the CEO of NVIDIA, will not willingly cede the leadership position in deep neural net acceleration to anyone, not even Google and Microsoft. NVIDIA GPUs have been on the forefront of accelerated neural network processing and are the de facto standard for accelerated neural network research and development (R&D) plus deep learning training. At the NVIDIA GPU Technology Conference (GTC) in Beijing China earlier this week, the company maneuvered to also become the de facto standard for accelerated neural network inference deployment.

    At GTC Beijing, NVIDIA lined up the major Chinese cloud companies for AI computing: Alibaba Cloud, Baidu Cloud, and Tencent Cloud. And for those companies that want AI on premises, NVIDIA has support from Chinese system providers Huawei, Inspur, and Lenovo based on the HGX chassis design.
    NVIDIA HGX server chassis design for GPUs

    In the US, NVIDIA has been under some pressure as Google invested in building its own custom neural network (NN) processor, the Tensor Processing Unit or TPU. Microsoft has also invested in using field programmable gate arrays (FPGAs) from Intel’s Altera division to accelerate some NN functions in Azure cloud, mostly for inference. There are also a number of companies focused on NN processing, ranging from Wave Computing (which is building rack servers for NN processing), to chips from Intel (from the Nervana acquisition), to newly released intellectual property cores from Imagination Technologies (PowerVR NX2). But all the newcomers are comparing themselves to NVIDIA GPUs as the baseline, not CPUs.

    NVIDIA has been a clear leader in the more computational and time-consuming part of NN creation – training. Training is where vast sets of training data are processed to develop a set of weights that represent the NN learning. Those weights are then used by a second set of neural network hardware to implement pattern recognition – this is referred to as the inference stage. Using GPUs, researchers took the training part from weeks (on CPUs) to days, making modern NN-based Artificial Intelligence (AI) practical.

    NVIDIA’s CEO recognized the opportunity to take GPU compute beyond high-performance computing (HPC) and supercomputers, and embraced NN processing. NVIDIA’s investment in general-purpose GPU (GPGPU) compute set the stage for the company to be the logical choice for NN R&D. Key to the early support was software, followed by changes in the GPU architecture. With NVIDIA’s newest Volta GPU, the company has added Tensor Cores for even greater NN processing. As a result, Volta is a highly desired chip for AI processing in the data center.
    NVIDIA Volta with HBM2 on board (Photo: NVIDIA)

    NVIDIA is also working hard to create a lead in cloud inference. At GTC-Beijing, it announced inference designs with Alibaba Cloud, Tencent, Baidu Cloud, JD.com, and iFlytek.

    The data center is the best place for NN training, and it is still fine for cloud-services inference. But when the action needs to be real time (as with autonomous cars, drones, or robots), private (on your smartphone), or needs to reduce the volume of information transferred (video surveillance), inference will likely run on edge devices. Edge inference is where AI really begins to scale, as more IoT devices become intelligent. NVIDIA is also looking at that market as the next big wave. As such, NVIDIA has built a high-performance system for automotive (NVIDIA Drive) using a system on chip (SoC) called Xavier. The chip is also being used for autonomous machines like drones and robots. The NVIDIA Drive platform already has many design wins, with the company claiming 145 autonomous vehicle startups using the platform, not to mention big companies like Toyota.

    The Xavier SoC includes a new processing element called the deep-learning accelerator (DLA) to accelerate inference functions; it can reach 20-30 tera operations per second (TOPS) for only a few tens of watts. For autonomous drones and robots, real-time inference is also critical, as multiple sensors need to be processed and the system cannot rely on cloud services due to variable latency. Xavier will start sampling in Q1 of 2018 for early customers.



    One interesting cross-over for NVIDIA’s AI program and its graphics business is the idea that you can train a robot in a virtual reality (VR) world before it is deployed in the real world. The NVIDIA program is called Project Isaac and it creates a virtual robot to inhabit a virtual world that’s visually accurate and has realistic physics modeling. By training the robot in VR, it can be trained faster than in the real world and with greater safety.



    We expect the company will build more differentiated GPUs for NN processing in the future; Volta and Xavier are just the start of this trend. NVIDIA is pushing hard to stay ahead of the competition as large companies and startups target NN inference acceleration as the next processing gold rush. So far, NVIDIA has staked a credible claim and is busy investing in the business for rapid growth.

    Kevin Krewell
     
    Last edited: Mar 11, 2018
  41. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331

    VR simulators for robots...crazy.
     
    Dr. AMK likes this.
  42. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
  43. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
  44. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
  45. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331

    That's pretty neat, is that what handles the lighting in tilt brush?
     
    Dr. AMK likes this.
  46. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
  47. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331

    Looks like a Bond villain's headquarters. But awesome that they're using VR to model it.
     
    Dr. AMK likes this.
  48. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    Best use of VR I've seen. :)

     
    Dr. AMK likes this.
  49. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    So far I've enjoyed interactive VR experiences more than just environments.
     
    Dr. AMK likes this.
  50. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281