The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Nvidia Thread

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Dr. AMK, Jul 4, 2017.

  1. wyvernV2

    wyvernV2 Notebook Evangelist

    Reputations:
    177
    Messages:
    567
    Likes Received:
    386
    Trophy Points:
    76
    Well, in this segment, I would definitely award AMD for sacrificing Vega for Navi. Remember the dinner intro with Raja Koduri? After talking a bit about Infinity Fabric, he said that people are getting overhyped about Vega when they should be showing that enthusiasm for Navi!

    As per the original plans, AMD was to release Vega on a 7nm FinFET process, but even they saw that bandwidth was causing more than enough trouble. So what is the solution? They released a lower-powered Vega now, and have plans to release a better Vega Frontier for PCIe Gen 4. But that's for workstations; what about gamers? Well, they just announced Navi straight away, though controversies are still inbound.
     
    Dr. AMK and Vasudev like this.
  2. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    BITNAND Offers Up A Mining Optimized GTX 1060 6GB Card For Just $389
    https://hothardware.com/news/bitnand-offers-up-a-mining-optimized-gtx-1060-6gb-card-for-just-389
    The impact that cryptocurrency mining has had on the graphics card market has been so extreme that very few people could have predicted it. GPUs that were as easy to purchase as a gallon of milk became as hard to find as the hottest deal at Best Buy on Black Friday.

    In addition to the rarity, inflated pricing has been a real issue as well, with many GPUs costing twice their intended MSRP or more. A $649 GeForce GTX 1080 Ti, for example, can currently be had for as "cheap" as $900, and that's a relative bargain compared to some of the other pricing we've seen. It's not hard to find the same GPU going for $1,300+ at virtually any e-tailer. One of the saving graces for gamers has been companies selling pre-built systems, like MAINGEAR, where adding a high-end GPU to a build doesn't inflate the price by the same degree, but not everyone is in the market for an entirely new rig.


    It's been rumored that AMD and NVIDIA are considering releasing mining-specific GPUs, although some third-parties have already gotten started. That includes BITNAND, which is now taking orders for a GeForce GTX 1060 optimized for mining. There are no video outputs on the card, and for that matter there's no PCIe bracket either, though like their gamer-centric counterparts a 6-pin PCIe power connector is required.

    The card is also passively cooled. That's ideal for those wanting to run a fleet of them, especially in a climate controlled room, because there's no need to worry about fans failing. BITNAND is charging $389 for this card, and says it will begin shipping at the end of the month.


    Compared to standard GeForce GTX 1060s, $389 isn't the best deal. Amazon lists a handful at $349 or under, so miners plucking these passive GPUs up would be doing so for reduced noise, possible optimized performance, and power savings. As with any GTX 1060, this one can be overclocked to further tune performance, although your mileage will vary -- especially with passive cooling.

    BITNAND says that this card hits about 22MH/s for ETH mining and 500H/s for XMR mining. And the company claims a power draw of 70W, which contrasts sharply with a reference GeForce GTX 1060's 120W specification. A 50W difference could help justify the card's premium price, because it would ultimately cost less to run over time.
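
    As a rough back-of-the-envelope sketch of that running-cost argument (a hypothetical calculation: the card prices and wattages are from the article above, but the electricity price and 24/7 duty cycle are assumptions for illustration only):

    # Rough break-even sketch: passive mining card vs. a standard GTX 1060 6GB.
    # Card prices and wattages come from the article; electricity price and
    # round-the-clock operation are assumptions.
    PRICE_MINING_CARD = 389.0    # BITNAND's asking price (USD)
    PRICE_STANDARD_1060 = 349.0  # cheapest standard GTX 1060 6GB mentioned above (USD)
    POWER_SAVED_W = 120 - 70     # reference 120 W spec vs. claimed 70 W draw
    KWH_PRICE = 0.12             # assumed electricity cost (USD per kWh)

    premium = PRICE_MINING_CARD - PRICE_STANDARD_1060        # $40 up front
    savings_per_day = POWER_SAVED_W / 1000 * 24 * KWH_PRICE  # kWh saved per day * price
    days_to_break_even = premium / savings_per_day

    print(f"Upfront premium:    ${premium:.0f}")
    print(f"Savings per day:    ${savings_per_day:.3f}")
    print(f"Days to break even: {days_to_break_even:.0f}")   # ~278 days at these rates

    At those assumed rates the power savings alone take most of a year to pay back the premium, so the noise and reliability arguments probably matter as much as the electricity bill.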

    If rumors prove true and AMD and NVIDIA do have mining cards en route, then maybe there will be enough gamer cards left for the rest us and pricing will fall back down to expected levels. It seems like a pipe dream given the current crypto craze, but we are optimistic something will be done to help the situation moving forward.
     
    Vasudev and hmscott like this.
  3. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    This was right on time :D Most likely inside information :p Micro$haft has just killed the 3GB version for Ethereum mining: NVIDIA’s GeForce GTX 1060 3GBs Rendered Less Than Optimal For Ethereum Mining After Windows Update
    Earlier today, our US & YouTube Editor, Keith May, discovered something pretty darn interesting. On a very lean Windows installation, the memory footprint of Ethereum mining is now over 3 GB. This might sound like random gibberish to the average person, but to the crypto-enthusiast this has far-reaching implications. In layman’s terms, it means that the graphics memory required to mine ETH is now over 3GB in size, making most of the economical variants of the GTX 1060 useless in a mining rig.

    This is going to have very interesting consequences for the market dynamics. Since GeForce GTX 1060 3GBs are suddenly less than optimal for mining purposes, you will see an increase in 1060s dumped onto the second-hand market, flooding it with cheap GPUs that anyone can snag. While we would recommend exercising extreme caution in selecting a second-hand GPU from any bulk seller, this also means that new GTX 1060 3GB prices should start to return to normality, and gamers will finally be able to claim one class of card as their own (that is to say, the entry-level market).
     
    Vasudev, hmscott and Dr. AMK like this.
  4. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    I think this "physically useless for gaming" lockout design that's sprouting up is the worst solution that could be implemented. It locks the wrong group out of using that GPU!!

    The problem with this approach, making a mining-only GPU, is that it still uses up one Nvidia GPU die, which will now never be used as a gaming GPU; it's forever locked into a card that is useless for gaming.

    And, GPU miners will still need more GPU's and keep buying up the gaming GPU's anyway.

    If Nvidia would lock down the firmware of the gaming GPU boards, a percentage of them for miners and a percentage for gamers - with lockdown code on the gaming GPU's that would perform crappy at mining - then gamers would be assured of at least a percentage of the GPU's.

    The gaming-only GPU's (or bad-at-mining GPU's, set in firmware) could be set at the demand for gaming GPU's plus 20% for growth, and whatever is left can be good at gaming or mining; those could be priced at 2x-10x MSRP to gouge mining use.
     
    Last edited: Feb 17, 2018
    Mr. Fox, Vasudev and Dr. AMK like this.
  5. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Ah, the DAG isn't over 3GB, it's Microsoft's Windows Update that increased the OS's footprint in VRAM...

    " Update: We did some more digging around and found out the root cause of the 1060 3GB not mining Ethereum: a windows 10 update that has increased the OS footprint on the vRAM."

    That's gonna affect all mining, not just Ethereum...

    It's also weird, as wth is MS loading into VRAM that's so large it's going to take up a significant % of VRAM??
     
    Vasudev and Dr. AMK like this.
  6. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    I think it's still one step forward toward fixing the mining GPU shortage problem. By moving miners from buying gaming GPUs to this dedicated mining GPU, they are listening to what the market is saying and starting to provide practical solutions.
     
    Vasudev and hmscott like this.
  7. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    Said this before... Nvidia doesn't give a damn who buys their cards. They scream up. What else can they do to calm down the gamers? It's a rotten game, Hmscott!!
     
    Vasudev, hmscott and Dr. AMK like this.
  8. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Why are you guys posting this all in 3x threads??

    http://forum.notebookreview.com/thr...scussion-thread.794965/page-437#post-10682803

    http://forum.notebookreview.com/thr...right-now-nvidia.806608/page-16#post-10682791

    http://forum.notebookreview.com/thr...l-transformation.812591/page-24#post-10682789
     
    Last edited: Feb 17, 2018
    alexhawker and Dr. AMK like this.
  9. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    hmscott and Dr. AMK like this.
  10. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Not gonna happen now, is it? :)
     
    Papusan and Dr. AMK like this.
  11. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    hmscott likes this.
  12. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yeah, you aren't supposed to post the same thing in multiple threads...

    The way around this is to pick and post in the best thread, and if there is a strong reason for it to appear in another thread, someone will read your post and post their own version in the other thread(s).

    You can pick one now, and Report the others to have them deleted... or maybe find another article reporting something similar, and replace / edit the text in the other posts to make them different - completely different.
     
    Dr. AMK likes this.
  13. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    It fits damn good in the Windows 10 thread :D
     
    Dr. AMK and hmscott like this.
  14. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    Will do that, and I will take care next time. Thanks
     
    hmscott likes this.
  15. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    See @Dr. AMK that's how it gets spread, someone else will re-formulate your original post and re-post your post in another thread :)
     
    Papusan and Dr. AMK like this.
  16. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Just in case of emergency, there is one exception. If you have info in dire need of distribution, you can make one original post, and then reply to others with that problem in other threads. It's not a re-post if it's solving the same problem in multiple threads. It is best if you don't copy / paste answers; make them unique to the response you make - tailor it to their request.
     
    Papusan and Dr. AMK like this.
  17. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    The scam work fits well in both threads (Windoze X and Nvidia, both are involved), like hand in glove :D The threads that are most read are the best.
     
    hmscott and Dr. AMK like this.
  18. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Did you just call yourself a thread-spammer scammer?? ;)
     
    Last edited: Feb 17, 2018
    Papusan and Dr. AMK like this.
  19. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    If I am, I'm not the only one :D
     
    Dr. AMK and hmscott like this.
  20. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    The problem for me would be that a 1060 is too wimpy to care about and is not attractive at any price more than "free" (free is always a good price).

    Well, OK... maybe that is a bit of an exaggeration. I would buy a 1060 for $100 or less, overclock the crap out of it to gain HWBOT points in its model class, then sell it for $150.
     
  21. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Yes, cross-posting is against forum rules. Apart from that, if the information is extra important or something that many people will be interested in reading I think calling it spam might be too harsh. Unless the posted content is objectionable or violates other forum rules, I don't really consider it terrible behavior or a disservice to the community. Some people never explore more than a few threads. A quote/link to a main thread is a good idea though. It can help coax those folks into getting out and exploring a little more often. If nothing else, it's nice to see people passionate about spreading good and helpful information.
     
    Dr. AMK and Papusan like this.
  22. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    My main point with the post... Micro$haft is out again with trash patches. This won't be the last time we will see the Morons kill performance. I wonder why they had to do this. Aren't there other ways?

    "You can still however mine it on Windows Server or Linux which do not take up as much vRAM as Windows 10. The current DAG file size is 2.33 GB which essentially means that windows is now taking up more than 660 mb of vRAM under windows 10 conditions. We are running a very learn installation of windows 10 and were unable to mine on it. As mentioned previously, Windows Server and Linux based rigs should still work fine."
     
    Dr. AMK likes this.
  23. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yeah, I was only trying to help @Dr. AMK from getting modded, and helping him learn how to get the word out without going against the rules. :)

    @Papusan was the self-admitted multi-thread-spammer scammer :D :confused: :p :eek: :rolleyes: o_O ;)
     
    Dr. AMK and Mr. Fox like this.
  24. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,213
    Messages:
    39,333
    Likes Received:
    70,629
    Trophy Points:
    931
    Yup. Nothing wrong with that.

    And, you and I, and many others have done it, too. Sometimes we all have to just "get the word out" to the community.

    Brother @Papusan is absolutely not a scammer. He posts good and accurate information, and has the best intentions at heart for those he seeks to warn about the horrors and woes of BGA filth. And, I love Spam. Especially grilled in a skillet so it is crispy on the edges. :vbwink:
     
    Last edited: Feb 17, 2018
  25. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    Should NVIDIA Be Concerned About Amazon's Custom AI Chip?
    https://www.fool.com/investing/2018/02/17/should-nvidia-be-concerned-about-amazons-custom-ai.aspx
    The e-commerce giant is the latest technology company looking to create a custom chip to foster its AI ambitions.

    The dawn of artificial intelligence (AI) has set off an arms race of sorts, with the largest companies in technology seeking to benefit from the nascent tech. NVIDIA Corporation(NASDAQ:NVDA) has been one of the biggest beneficiaries of this trend, as its graphics processing units (GPUs) were the early choice for training AI systems. The ability of GPUs to perform a significant number of complex mathematical calculations simultaneously made them the perfect choice for AI applications.

    NVIDIA has seen its data center business explode, producing triple-digit year-over-year growth for seven consecutive quarters, and its stock price has risen over 1,000% since early 2015. A number of high-profile companies that once embraced NVIDIA's GPUs are now looking to develop other solutions to augment or replace the illustrious GPU. Amazon.com (NASDAQ:AMZN) may be the latest tech giant to enter the fray.

    [Image: The tech industry is rushing to develop custom AI chips. Image source: Getty Images.]

    Alexa's connection to the cloud
    Amazon was an early adopter of AI, and is working on custom chips that would have the ability to do on-device processing -- or at-the-edge processing -- rather than relying solely on a device's connection to the cloud, according to a report in The Information. Currently, when a user makes a request to Amazon's digital assistant Alexa, the information is transmitted to the cloud, which processes the request and submits a response back to the device. This process results in a slight delay. The ability to handle speech recognition locally would improve the response times for any device powered by the digital assistant, including the Echo family of smart speakers.

    Amazon bolstered its processor expertise with the $350 million acquisition of Annapurna Labs in early 2015. The company developed networking chips for data centers capable of transmitting greater quantities of data while consuming less power. Amazon now has a stable of more than 450 employees possessing some level of chip experience, which could lead the company to develop other specialized chips. The report also implies that Amazon may be developing an AI processor for Amazon Web Services, its cloud computing segment.

    To each their own
    In early 2016, AI pioneer Google, a division of Alphabet (NASDAQ:GOOGL) (NASDAQ:GOOG), revealed that it had created a custom AI chip dubbed the tensor processing unit (TPU). The application-specific integrated circuit (ASIC) was designed to provide more energy efficient performance for the company's deep learning AI applications, which are capable of learning by processing massive quantities of data. The chip formed the foundation for TensorFlow, the framework used to train the company's AI systems.

    The latest version of the TPU can handle both the training and inference phases of AI. As the name implies, an AI system "learns" during the training phase, while the inference phase is when the algorithms do the job for which they have been trained. Google recently announced that access to these processors will now be available to its cloud customers.

    Apple (NASDAQ:AAPL) has long been a proponent of user privacy and has taken a different path than its tech brethren. The company's mobile devices add electronic noise to any data transmitted to the cloud while stripping away any personally identifiable information, which provides a greater degree of user privacy and security. With the release of the iPhone 8 and iPhone X models, the company developed a neural engine as part of its new A11 Bionic chip -- a cutting-edge processor that could handle many AI functions locally. This vastly reduces the amount of user information that is transmitted to the cloud, which helps to secure the data.

    Microsoft Corporation (NASDAQ:MSFT) decided early on to stake a claim in a customizable processor known as a field programmable gate array (FPGA) -- a specialty chip that can be configured for a specific purpose by the customer after it has been manufactured. These have become the foundation for Microsoft's Azure cloud computing system and offer flexible architecture and lower power consumption than traditional offerings like GPUs.


    Growth continues unabated
    While each of these companies has employed a different processor strategy, they still use a significant number of NVIDIA's GPUs in their operations.

    NVIDIA's growth has continued to shine. In its most recent quarter, NVIDIA reported a record revenue of $2.91 billion, which grew 34% over the prior-year quarter. The company's data center segment -- which houses AI sales -- grew 105% year over year to $606 million and now accounts for 21% of NVIDIA's total revenue.

    The competition was inevitable, but so far no solution can completely replace the GPU. NVIDIA can rest easy -- for now.
     
  26. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    I believe we should all be very concerned about both of them. :D
     
    Dr. AMK likes this.
  27. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    NVIDIA’s Earnings Report Reveals a Worrying Problem
    http://www.foxbusiness.com/markets/2018/02/18/nvidia-s-earnings-report-reveals-worrying-problem.html

    Published February 18, 2018 | Markets | Motley Fool


    The NVIDIA (NASDAQ: NVDA) juggernaut continues to outperform as the graphics card specialist topped Wall Street expectations once again with a terrific showing in the fourth quarter of fiscal 2018. The company reported a steep rise in its revenue and earnings, thanks mainly to a massive year-over-year jump in its data center and gaming businesses.

    What's more, NVIDIA expects stronger growth in the current quarter, forecasting a 50% year-over-year increase in revenue, while net income could jump close to 82%. So the company seems on target to keep up its outstanding growth rate. But a closer look at the latest report indicates that there's one area proving to be a concern.

    Automotive growth has struggled
    The hype around NVIDIA's self-driving car venture has failed to translate into revenue growth. The company reported $132 million in automotive revenue last quarter, an increase of just 3% from the year-ago quarter. In fact, the automotive business is now just a small part of NVIDIA's huge empire, supplying just 4.5% of the top line.

    This raises red flags at a time when the race to develop self-driving cars is intensifying, indicating that NVIDIA is probably losing the early advantage it established by landing marquee customers such as Tesla. What's even more surprising is that NVIDIA claims to have a solid ecosystem of 225 partners using its AI-enabled platforms for self-driving cars.

    But all these partnerships haven't translated into any material gains for NVIDIA so far, bringing the company's automotive business to a point of stagnation. As it turns out, the company blames a change in business strategy as the reason behind its automotive woes.

    NVIDIA CFO Colette Kress pointed out over the recent conference call that the company is pivoting to a new sales model in the automotive space, moving away from vehicle infotainment systems, which are becoming commoditized. Instead, it's focusing on "next-generation AI cockpits systems and complete top-to-bottom self-driving vehicle platforms built on NVIDIA hardware and software."

    But will this help revive NVIDIA's automotive business?

    The road ahead
    NVIDIA remains confident about its automotive growth in the long run despite the short-term hiccups. As already mentioned, the company has an extensive ecosystem of automotive partners using its solutions for their driverless-car ambitions, and they should start contributing to NVIDIA's growth in the next few years.

    CEO Jensen Huang believes that we will start seeing cars with self-driving capability on roads from late 2019, with the market hitting critical mass in 2020. By 2022, Huang believes, almost every car being made will have such capabilities.

    This could be a big deal for NVIDIA, as a car with autonomous capabilities could bring anywhere between $500 and $1,000 in revenue for the company. On the other hand, a robotaxi that doesn't require a human driver could bring thousands of dollars in revenue, according to Huang.

    The good news is that NVIDIA is aggressively pursuing this market, as evident from its recent product development moves and partnerships. Late last year, the company revealed a new self-driving platform -- the DRIVE PX Pegasus -- that's 10 times more powerful than its predecessor.

    NVIDIA had clearly stated that it's going after the robotaxi market with this new chip, targeting the massive ridesharing space. The chipmaker has struck deals with key players to deploy self-driving taxis based on its Pegasus platform, including NuTonomy and Optimus Ride.

    In all, there are at least 25 companies developing driverless cars based on Pegasus, though it is NVIDIA's latest deal with Uber that could give it a big boost in the long run. In January, Uber announced that it will be using NVIDIA's AI-enabled self-driving car technology in its fleet of driverless cars and trucks.

    Uber is among the leading providers of ridesharing services across the globe, and it's aggressively working to automate its fleet. For instance, in November last year, Uber announced plans to buy 24,000 cars from Volvo from 2019 to 2021 that will be outfitted with autonomous technology. This is in line with NVIDIA's timeline for deployment of self-driving cars as mentioned earlier.

    So NVIDIA investors shouldn't lose hope just yet, as its automotive business could recover next year when deployments begin. But at the same time, the graphics specialist needs to keep a close watch on rival chipmaker Intel's moves in autonomous cars. The latter looks well placed to begin monetizing its automotive technology thanks to its Waymo-Fiat-BMW partnership, so NVIDIA will have to work hard to stay ahead and keep hold of its clients to ward off competition.

     
  28. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The real problem is that practically nobody who doesn't benefit financially from self-driving vehicles wants them at all.

    I don't want one, I can't imagine in my hopefully long lifetime ever being comfortable putting my life in the hands of automation driven by rule based AI.

    My experience shows that the huge number of transactions per trip, across the billions (trillions) of trips worldwide, is going to turn out to be a bloodbath, and an insurance nightmare.

    Maybe in 100 years, well beyond when I expect to be here, there will be enough success to allow this to move forward, but from what I've seen, it's just a bunch of marketing BS to raise money.

    The problem is that so many non-technical people have bought into the BS that real automotive companies are forced to waste billions on this pipe dream, which will end badly, or go out with a whimper when enough of them figure out it's not doable and that nobody really wants it at all.
     
    Dr. AMK likes this.
  29. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    3 Common Inaccuracies About NVIDIA's Cryptocurrency and Artificial Intelligence Businesses
    https://www.fool.com/investing/2018/02/18/3-common-inaccuracies-about-nvidias-cryptocurrency.aspx
    The astronomical rise of the GPU specialist's stock has attracted some financial writers who would do well to either sit it out or do more homework.

    Beth McKenna
    (TMFMcKenna)
    Feb 18, 2018 at 1:32PM

    NVIDIA (NASDAQ:NVDA) stock has been a huge winner in recent years. Since 2016, shares have returned 640%, versus the S&P 500's nearly 40% return. This exceptional performance has left investors hungry for articles about the fast-growing graphics processing unit (GPU) specialist and artificial intelligence (AI) player, which in turn means that financial writers are coming out of the woodwork to meet this demand.

    If said woodwork was filled only with folks who were both capable of grasping meaty tech topics and willing to do adequate homework before gracing the digital world with their words and analysis, this en masse emergence wouldn't be an issue. Not surprisingly, however, this isn't the case. Certainly, we all make occasional errors, but there's no excuse for getting key things wrong -- at least, repeatedly wrong -- because of doing nil, or considerably inadequate, research.

    Here are three things that I've read -- including on some well-regarded sites -- on multiple occasions that are wrong, with at least some of them a sign that you should probably be skeptical about the rest of the content of a particular article.

    [Image source: Getty Images.]

    1. Calling NVIDIA a "bitcoin play"
    This is a very common and timely error. I think what happens here is that some writers see mention that "NVIDIA is a cryptocurrency play" (which is true), do no digging, and make the inaccurate leap to "NVIDIA is a bitcoin play," because bitcoin is the best-known digital currency as a result of being the oldest and largest crypto, by market cap.

    The accurate information: NVIDIA's GPUs are generally considered ideal for "mining," or generating, emerging cryptocurrencies that require mining. (This still includes Ethereum, the second largest crypto, as well as some other smaller digital currencies.) Once such a digital currency gets large enough, however, someone will build an application-specific integrated circuit (ASIC) to mine it, and others will increasingly follow suit, making a GPU a second-rate tool for mining that particular cryptocurrency. This scenario happened to bitcoin, and it will eventually happen to other cryptocurrencies if they continue to increase in value and maintain the need to be mined. NVIDIA CEO Jensen Huang covered this information in detail on at least one of the company's quarterly earnings calls in fiscal 2018, so it's not hard to find. A quick internet search also does the trick.

    2. Providing an exact quantification of NVIDIA's cryptocurrency business
    I've seen some articles that provide an exact quantification of NVIDIA's cryptocurrency business in terms of the dollar amount of revenue generated in a particular quarter or other time period, and/or the percentage of total revenue that crypto sales comprised.

    The accurate information: The company itself can't provide an exact quantification because it has no way of knowing how many consumers are buying its GeForce gaming cards to use for mining digital currencies. The revenue numbers that CFO Colette Kress provided on the Q2 and Q3 calls (she didn't provide a number on the Q4 call) are only for sales of the company's application-specific boards. In Q2 and Q3, sales of these application-specific products accounted for 6.7% and 2.7%, respectively, of NVIDIA's total revenue. So any writer who is using these numbers to provide an exact quantification is understating the size of NVIDIA's cryptocurrency business.

    3. Equating NVIDIA's entire data-center business with an artificial intelligence business
    I've read multiple articles that equate NVIDIA's entire data-center business to an artificial intelligence business. This error matters because it leads investors to believe that NVIDIA's AI business is bigger -- probably considerably bigger -- than it actually is.

    The accurate information: AI is a huge growth driver for NVIDIA's fast-growing data-center platform, which is the company's largest platform by revenue behind gaming. However, AI is not the only growth driver for this platform. Accurate information is easily obtained: At the beginning of every quarterly earnings call, Kress outlines the growth drivers for each of the company's four target market platforms, and oftentimes Huang expounds on this information in the question-and-answer session that follows management's opening remarks. Along with AI, the data-center growth drivers have typically been high-performance computing (HPC) and virtualized computing. (The latter refers to creating and using virtual versions of a computing device or resource.) Kress did say on the Q4 earnings call that NVIDIA is "starting to see the convergence of HPC and AI," but these are still largely two different things.
     
  30. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    Yes, you are completely right; we should all be very, very concerned about both of them.
     
  31. Lynx2017

    Lynx2017 Notebook Evangelist

    Reputations:
    238
    Messages:
    420
    Likes Received:
    396
    Trophy Points:
    76
    Nvidia has a smart business model as well, just in general. Like charging $200 for G-Sync: I mean, they paid the engineers to invent it and to make it work in windowed mode, so they had every right to charge extra for it. Their quality control is absolutely outstanding as well, and while I have been an AMD fanboy most of my life, just because I've always been on a tight budget, I do respect Nvidia's level of quality. Their stock price reflects this.
     
    Papusan and Dr. AMK like this.
  32. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    Practically yes, but you have to know that everything we know about these technologies is just what they want us to know. There are so many very advanced technologies they already own but hide from us, to show them at the right time. Automation is coming, digital transformation is coming, whether we like it or not.
     
  33. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    They'll be happy selling the sizzle for as long as they can get away with it, and then slowly slink off and tout something else as being the missing piece they were looking for, and we wait for that to arrive - having wasted $B's on follies.
     
    Dr. AMK likes this.
  34. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The 2,578 Problems With Self-Driving Cars
    By Mark Harris, 2 Feb 2017 | 13:00 GMT
    https://spectrum.ieee.org/cars-that...ving/the-2578-problems-with-self-driving-cars

    "Reports from companies testing autonomous vehicles in California show they are improving but still far from perfect

    Last year, a self-driving car failed about every 3 hours in California, according to figures filed with the state’s Department of Motor Vehicles.

    Every January, carmakers testing self-driving cars in California have to detail how many times their vehicles malfunctioned during the preceding year. These so-called disengagement reports detail every time a human safety driver had to quickly take control of their car, either due to hardware or software failure or because the driver suspected a problem.

    The reports—detailing 2,578 failures among the nine companies that carried out road-testing in 2016—give a unique glimpse into how much testing the different companies are doing, where they are doing it, and what is going wrong. None of this year’s disengagements resulted in an accident.

    [Chart of 2016 disengagement reports by company. Image: Mark Harris; source: California DMV]
    Alphabet’s spin-out company Waymo [Google in chart above] still has by far the biggest testing program—its 635,868 miles of testing accounted for over 95 percent of all miles driven by self-driving cars in California in 2016. Waymo’s fleet of 60 self-driving cars reported a total of 124 disengagements, 51 of them due to software problems. That represents a sharp reduction in disengagements from the previous year, from 0.8 disengagements for every 1,000 miles of autonomous driving down to just 0.2.

    Bosch, by contrast, reported over 1,400 disengagements while covering just 983 miles in three vehicles—equivalent to a hefty 1,467 disengagements for every 1,000 miles of driving. But that doesn’t mean Waymo’s cars are 8,000 times safer than Bosch’s, as every company has its own way of counting disengagements.
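
    The per-1,000-mile figures above are simple division over each company's reported totals; a quick sketch (note that Bosch's exact count of 1,442 is inferred here from "over 1,400" and the quoted rate, not taken from the report):

    # Disengagements per 1,000 autonomous miles, using the 2016 figures quoted above.
    def rate_per_1000_miles(disengagements, miles):
        return disengagements / miles * 1000

    waymo_2016 = rate_per_1000_miles(124, 635_868)  # ~0.2 per 1,000 miles
    bosch_2016 = rate_per_1000_miles(1_442, 983)    # ~1,467 per 1,000 miles; 1,442 is
                                                    # an inferred count consistent with
                                                    # "over 1,400" and the quoted rate

    print(f"Waymo 2016: {waymo_2016:.2f} disengagements per 1,000 miles")
    print(f"Bosch 2016: {bosch_2016:.0f} disengagements per 1,000 miles")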

    For instance, Waymo does not count every single time the driver grabs the wheel to take over from the robotic chauffeur, which it admits happens many thousands of times annually. Instead, the company later simulates what would have happened if the human had not jumped in—and only reports disengagements where the car would have done something unsafe. It calculates that if its drivers had taken no action at all, nine disengagements in 2016 would have led to the car hitting an obstacle or another road user. That is down from 13 the previous year, despite covering 50 percent more miles.

    “Waymo’s report would seem to suggest substantial improvement,” says Bryant Walker-Smith, a professor at the University of South Carolina. “But I’d want to know whether Waymo’s system could handle any of the system-initiated disengagements by achieving a minimal risk condition, say by pulling off to the side of the road, rather than immediately disengaging.”

    The other problem with comparing disengagement rates is that different companies are using California’s testing permits for different things. Only Waymo and Cruise Automation, now owned by General Motors, have large, general-purpose testing programs. In its first year on the state’s roads, Cruise’s two dozen cars went from covering less than 5 miles in June 2015 to over 2,000 miles in September 2016. Its disengagement rate also plummeted over the same period, from over 500 to under 3 per 1,000 miles.

    No other company drove more than 5,000 miles in 2016, and some of the world’s biggest carmakers, including BMW, Ford, and Mercedes-Benz, covered less than 1,000. “The low number of miles, combined with high number of disengagements, suggests that R&D engineers are occasionally using a local vehicle to get real-world performance data useful for a specific project,” says Walker-Smith.

    Despite holding testing permits, Honda and Volkswagen drove no autonomous miles at all last year on public roads in California, preferring to test on private courses or out of state.

    Once more, the most mysterious disengagement report is from Tesla. In 2015, the company reported no disengagements at all, suggesting that it either carried out no public testing in California or that its cars were flawless. This year, its report admits 182 disengagements in 550 miles of autonomous driving during 2016.

    However, all but a handful of those disengagements happened in just four cars over the course of a single long weekend in October, possibly during the filming of a promotional video. Tesla does much of its testing out of state and on test tracks, although it also benefits from receiving millions of miles of road data from thousands of AutoPilot-equipped vehicles owned by its customers.

    Companies that began autonomous vehicle testing in California in 2016, including startups Zoox, Drive.ai, Faraday Future, and NextEV, will not have to report their disengagement data until this time next year. Uber, which abandoned its pilot program of self-driving Volvos in San Francisco rather than apply for a testing permit, is currently testing in Arizona and Pennsylvania, states that do not require companies to report disengagements or failures."

    I am sure you noticed just how few miles most car makers are logging in open-road tests; it's clear that at least 5-10 more years of on-road testing is needed, with constant improvements in sensors, methodology for decision making, compute power applied effectively rather than speculatively, and user interface improvements.

    With these results being logged by trained drivers and engineers, knowing what to expect, with years of simulation and private track training (I hope!!), imagine how Mama and Papa, and Grandparents and Kids - the highest need groups - will react to "disengagements".

    I'm pretty sure the skew toward accidents will be great and sustained over the weaning onto autonomous driving for passenger cars - when that time arrives - if ever.

    What a great time to be alive, before things get well and truly messed up by AI and automation.
     
    Last edited: Feb 18, 2018
    Dr. AMK likes this.
  35. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Wow, there it is already, the thing we are missing that we must have, and therefore must wait for to arrive:

    "Driverless vehicles will need standardized software to talk to one another and avoid crashes."

    That didn't take long. Just in time to save companies from themselves and their promises to deliver autonomous drive this year and next.

    They can't deliver (it's impossible), so now they need a good "official" excuse, voila! :D

    Apple's Business Model Will Backfire in Self-Driving Cars
    Driverless vehicles will need standardized software to talk to one another and avoid crashes.
    By Noah Smith, Updated on February 16, 2018, 6:51 AM PST
    https://www.bloomberg.com/view/arti...s-need-standardized-software-to-avoid-crashes

    "Fans of self-driving cars will have breathed a sigh of relief at the news that Uber and Google’s Waymo, two giants in the industry, have settled their intellectual-property lawsuit. This removes a huge distraction for companies, and freeing them up to focus on their own research.

    So that’s good. Driverless cars will be an incredible boon to society. They could save tens of thousands of American lives every year, and even more in countries such as India and China, and prevent millions of injuries. They could save the average American hundreds of hours each year -- hours that she or he would then have to do work or rest while on the road. And they could eliminate the stress of driving, improving health and quality of life for billions.

    The positive, transformative potential of self-driving cars is hard to overstate. Thus it’s great to see the U.S. government thinking about how to regulate autonomous vehicles well in advance of the technology’s wide adoption. Bipartisan legislation has been introduced in both the House and Senate to harmonize rules at the federal level, while the National Highway Traffic Safety Administration is trying to remove unnecessary regulatory barriers. Meanwhile, Senators Bill Nelson, Gary Peters and John Thune have put forward a set of principles they plan to use when regulating self-driving cars.

    Most of the principles are sensible -- promoting safety, cybersecurity and education, while propelling innovation forward as fast as possible. But one of the principles -- the principle of government remaining neutral toward alternative autonomous vehicle technologies -- deserves some closer examination.

    Unlike the normal, human-guided variety, self-driving cars work better as a network. When the vehicles talk to each other, they can avoid crashes much more effectively. But if Waymo cars don’t talk to General Motors cars, that will reduce safety for everyone on the road.

    That means companies have an incentive to wire their cars to talk to each other, right? Maybe not. Lack of interbrand communication would mean that each car buyer would have an incentive to purchase a vehicle from the most dominant or popular brand, because that car would have the largest number of other cars it could talk to (and avoid crashing into). In other words, there’s a real danger that car companies will try to be like Apple, limiting interconnectivity in order to try to turn the auto market into a closed, monopolized ecosystem.

    The potential profit from monopolizing the auto market -- or even just the software component of that market -- is vast. Suppose that people are willing to pay $10,000 for the software that controlled a self-driving car. There are about 17 million new cars sold in the U.S. each year. Just multiplying those two numbers would translate to revenue of $170 billion a year, or almost as much as Amazon.com's sales last year.
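
    The $170 billion figure above is just the straight multiplication of the article's two stated assumptions, e.g.:

    # The article's back-of-the-envelope estimate of the monopoly revenue pool.
    software_price_usd = 10_000     # assumed willingness to pay per car (USD)
    new_cars_per_year = 17_000_000  # approximate annual U.S. new-car sales
    revenue = software_price_usd * new_cars_per_year
    print(f"${revenue / 1e9:.0f} billion per year")   # -> $170 billion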

    But if the company was a true monopoly -- if you had to buy your self-driving software from one company or suffer a much higher risk of injury and death -- it would act to maximize its profit. That is what monopolies do. It would raise prices, restrict output and make it harder for a middle-class person to own a car.

    A monopoly in self-driving technology would be bad for consumers, but recent research shows that it also would be bad for workers in the industry, since dominant employers are able to use their power to pay lower wages. For any number of reasons, government should act to prevent one company from dominating this industry.

    The traditional way to deal with a monopoly is to break it up, as AT&T was ordered to break up in 1982. But in the case of self-driving cars, doing so would leave the road less safe overall, since we’d be left with a bunch of smaller brands whose cars couldn’t talk to each other.

    A better idea is for the government to mandate that all autonomous cars be able to talk to each other. The easiest way to do that is to mandate that self-driving software companies make public all of the software and protocols that they use for communications between cars. That might not solve the whole problem, since those protocols and that code may work much better with one company’s system than another’s. But if regulators enact this rule early on, it would allow companies to develop their systems around a universal, shared intervehicle communications system from the ground up. That would be true tech neutrality, with no one system having a market advantage.

    This sort of rule would prevent safety-based monopolization of the auto fleet. But in spirit, it’s really more similar to something the government has been doing for centuries -- harmonizing weights and measures. That’s such an essential, useful government function that it’s actually written into the U.S. Constitution."
     
    Dr. AMK likes this.
  36. Firefox@yami

    Firefox@yami Notebook Geek

    Reputations:
    31
    Messages:
    87
    Likes Received:
    32
    Trophy Points:
    26

    Something about automated cars that you may be interested in.
     
    hmscott and Dr. AMK like this.
  37. Firefox@yami

    Firefox@yami Notebook Geek

    Reputations:
    31
    Messages:
    87
    Likes Received:
    32
    Trophy Points:
    26
    Well, the car is actually a part of that video. Though this channel has only 9 videos, I found them interesting and wanted to share them here.
     
    hmscott and Dr. AMK like this.
  38. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    hmscott likes this.
  39. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    The Coming Intersection Of HPC And The Enterprise Data Center
    https://www.forbes.com/sites/johnwe...-and-the-enterprise-data-center/#6f7d20071fd2
    [Image: Close-up of cables and LED lights in a server center on January 12, 2018, in Berlin, Germany. Photo illustration by Thomas Koehler/Photothek via Getty Images.]

    High Performance Computing (HPC) traditionally exists as a separate and distinct discipline from enterprise data center computing. Both use the same basic components—servers, networks, storage arrays—but are optimized for different types of applications. Those within the data center are largely transaction-oriented, while HPC applications crunch numbers and high volumes of data. However, an intersection is emerging, driven more recently by business-oriented analytics that now fall under the general category of artificial intelligence (AI).

    Data-driven, customer-facing online services are advancing rapidly in many industries, including financial services (online trading, online banking), healthcare (patient portals, electronic health records), and travel (booking services, travel recommendations). The explosive, global growth of SaaS and online services is leading to major changes in enterprise infrastructure, with new application development methodologies, new database solutions, new infrastructure hardware and software technologies, and new datacenter management paradigms. This growth will only accelerate as emerging Internet of Things (IoT)-enabled technologies like connected health, smart industry, and smart city solutions come online in the form of as-a-service businesses.

    Business is now about digital transformation. In the minds of many IT executives, this typically means delivering cloud-like business agility to its user groups—transform, digitize, become more agile. And it is often the case that separate, distinctly new cloud computing environments are stood-up alongside traditional IT to accomplish this. Transformational IT can now benefit from a shot of HPC.

    HPC paradigms were born from the need to apply sophisticated analytics to large volumes of data gathered from multiple sources. Sound familiar? The Big Data way to say the same thing was “Volume, Variety, Velocity.” With the advent of cloud technologies, HPC applications have leveraged storage and processing delivered from shared, multi-tenant infrastructure. Many of the same challenges addressed by HPC practitioners are now faced by modern enterprise application developers.

    As enterprise cloud infrastructures continue to grow in scale while delivering increasingly sophisticated analytics, we will see a move toward new architectures that closely resemble those employed by modern HPC applications. Characteristics of new cloud computing architectures include independent scaling compute and storage resources, continued advancement of commodity hardware platforms, and software-defined datacenter technologies—all of which can benefit from an infusion of HPC technologies. These are now coming from the traditional HPC vendors— HPE, IBM and Intel with its 3D-XPoint for example—as well as some new names like NVIDIA, the current leader in GPU cards for the AI market.

    To extract better economic value from their data, enterprises can now more fully enable machine learning and deep neural networks by integrating HPC technologies. They can merge the performance advantages of HPC with AI applications running on commodity hardware platforms. Instead of reinventing the wheel, the HPC and Big Data compute-intensive paradigms are now coming together to provide organizations with the best of both worlds. HPC is advancing into the enterprise data center and it’s been a long time coming.
     
  40. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
  41. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
     
    Vasudev likes this.
  42. Lynx2017

    Lynx2017 Notebook Evangelist

    Reputations:
    238
    Messages:
    420
    Likes Received:
    396
    Trophy Points:
    76
    Congrats to Nvidia for caring about AI, I guess. Meanwhile the rest of us can't find an affordable GPU to play games on. Shame there is no 3rd company in this competition; AMD can't keep up performance-wise, and Nvidia just doesn't care anymore, they already announced dedicated mining cards are taking priority in production lines soon... ugh, gg life.
     
    hmscott, Vasudev and Dr. AMK like this.
  43. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    NVIDIA GPU Pricing Won’t Stabilize Until Q3 Claims Retailer But What About Ampere And Volta?
    Read more at https://hothardware.com/news/nvidia...-continue-through-q3-2018#U7Y6bQSaC0szmVTW.99

    It's a terrible time right now for gamers and DIY PC enthusiasts who are looking to build up a new gaming PC. The problem isn’t that the hardware available now is sub-par; we have some fantastic CPUs and GPUs on the market. Unfortunately, as it has been widely reported, the problem is that cryptocurrency miners are gobbling up all the graphics cards, leaving many gamers unable to find a card for an even remotely reasonable price. When you can find a graphics card, the prices are usually multiple times higher than MSRP. The problem is that supply right now isn't meeting the demand and there are several reasons why, according to a retailer that spoke directly with NVIDIA on the matter.

    A buying manager at Massdrop called B.Hutch has offered up some details on what is going on with retailers and NVIDIA behind the scenes. B.Hutch wrote, "NVIDIA was here at Massdrop HQ a couple of weeks ago re-confirming what we already know. There is a couple of reason why there is a shortage and also such a high demand of cards."

    The first reason relates to a shortage of memory for video cards right now. Apple and Samsung, along with other smartphone makers, are consuming the same type of memory for smartphones and tablets as NVIDIA needs for its GPUs. The catch is that Apple, Samsung, and these other smartphone makers are willing to pay more for the memory, leaving less for graphics card manufacturing. That shortage means fewer video cards can be made by all NVIDIA partners, including MSI, Gigabyte, Asus, EVGA, and others.

    The other reason cited for the shortage is one we've already talked about at length here -- cryptocurrency mining. Ethereum, Bitcoin and other miners are gobbling up all the cards, thus driving up demand and prices overall. Massdrop also says that NVIDIA shared some detail on how long it thinks this shortage could last.

    B.Hutch wrote, "While NVIDIA was here they also let us know that the pricing in the market will continue to go up through Q3 of this year most likely before we start seeing any type of relief. So, unfortunately the end to this is not right around the corner and we have not seen the worst of it yet."

    This calls into question speculation on when exactly NVIDIA might launch its next generation consumer GPU, code-named Ampere. It was previously rumored that Ampere would be arriving around the April time frame, which doesn't coincide with the "Q3" commentary here that allegedly came right from the horse's mouth. Perhaps Q3 is when we might see a Volta consumer variant based on HBM2? It's anyone's guess but the commentary that Q3 will bring some pricing relief is interesting to ponder.

    Regardless, Massdrop also promises to continue to list all the video cards that it can sell to the community. The company also warns that pricing will likely be above MSRP, but the merchant is only marking them up enough to "barely cover back end expenses." B.Hutch notes that these markups are because there are still some gamers out there who are willing to pay for cards because they don’t want to wait until the end of the year for prices to begin to stabilize.

    B.Hutch wrote, "Why should Massdrop offer these cards above MSRP? Well, there are still customers who want to build computers now and not wait 6, 9, 12 months for the market to stabilize. So as long as I can bring them a better deal than what is currently available I will continue to do so. That being said MSRP is kind of out the window at this point so keep your eye on market price VS Massdrop as you will not see any cards out there anywhere close to MSRP for a while to come. "

    While the high demand and short supply are bad for gamers and might frustrate NVIDIA, the demand has been a good thing for Team Green's bottom line. NVIDIA absolutely destroyed its earnings goals in the most recent quarter thanks in part to high demand for its consumer GPUs.
     
    Papusan, Vasudev and hmscott like this.
  44. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    GPU Technology Conference
    http://www.kurzweilai.net/gpu-technology-conference-3

    Dates: March 26 – 29, 2018
    Location: San Jose, California

    [​IMG]

    NVIDIA's GPU Technology Conference (GTC) is the premier AI and deep learning event, providing you with training, insights, and direct access to experts from NVIDIA and other leading organizations.

    See the latest breakthroughs in self-driving cars, smart cities, healthcare, big data, high performance computing, virtual reality and more.

    —Event Producer

    Event Site: GPU Technology Conference

    Topics: AI/Robotics | VR/Augmented Reality/Computer Graphics
     
    Papusan, Vasudev and hmscott like this.
  45. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    Dr. AMK likes this.
  46. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    Almost 1 month from now, any leaked information about the specs?
     
    Vasudev and Papusan like this.
  47. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,701
    Messages:
    29,840
    Likes Received:
    59,615
    Trophy Points:
    931
    Only that it will use GDDR6. HBM2 is too expensive. The shareholders prefer the most possible $$$ :D
     
    Vasudev and Dr. AMK like this.
  48. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931

    Godzilla - 1
    Geforce - 0
     
    Last edited: Feb 26, 2018
    Vasudev and Dr. AMK like this.
  49. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    NVIDIA Turing GTX 2080/70 GPUs Allegedly Launching in July for Gamers, Ampere to Succeed Volta in HPC Market at GTC
    https://wccftech.com/rumor-nvidia-t...eforce-lineup-ampere-to-succeed-volta-in-hpc/
    [​IMG]
    Fresh off the rumor mill we have another wild one. According to Igor Wallossek of Tom’s Hardware Germany, NVIDIA is allegedly launching its brand new Turing graphics architecture in July for the gaming segment. Igor, who’s the same individual that leaked NVIDIA’s Ampere codename last year, claims that there’s been a switcheroo of sorts.

    NVIDIA is now said to be planning to replace its existing Pascal powered GeForce gaming lineup with a brand new graphics architecture code named “Turing” around July. “Ampere” on the other hand is now said to be a Volta successor in the compute, HPC & AI / machine learning markets.
    NVIDIA Turing Architecture is Designed For Gaming, Allegedly Launching in July – Ampere to Succeed Volta in Compute Markets by End of March
    This switch is somewhat unintuitive, as Alan Turing is famous for his work in the fields of computer science and cryptanalysis, which would make his name a better fit for a graphics architecture targeted at professional applications rather than gaming. That said, this latest update comes straight from the same source that leaked Ampere to begin with, so we're inclined to believe it has some credibility.

    [​IMG]
    English computer scientist, mathematician, logician and cryptanalyst Alan Turing

    Ampere, which has reportedly been in production since late January, is expected to be revealed at GTC later this month, between the 26th and 29th. Turing, on the other hand, will only enter production in June, with an announcement expected that month at Computex in Taipei, Taiwan. A hard launch is said to follow a month later in July.

    This suggests that NVIDIA's alleged GeForce GTX 20 series lineup, including the GTX 2080 and 2070, will not actually be released later this month, but will rather debut in June and hit shelves in July. A successor to the Volta V100 accelerator based on the Ampere architecture, meanwhile, is what we'll probably see announced later this month at GTC.

    Regardless of codenames, the facts around the hardware specifications of the new GeForce series are still the same. We're still looking at a GP104/GTX 1080 class chip replacement in the summer, built on TSMC's 12nm process technology and featuring 16 Gbps GDDR6 memory from Samsung.
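    For a rough sense of what 16 Gbps GDDR6 could mean in practice, here is a minimal back-of-the-envelope sketch in Python. It assumes the rumored GP104-class replacement keeps a 256-bit memory bus like the GTX 1080 does; that bus width has not been confirmed anywhere, so treat the last number as illustrative only.

        # Theoretical peak memory bandwidth (GB/s) =
        #   per-pin data rate (Gbit/s) * bus width (bits) / 8 bits per byte
        def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
            return data_rate_gbps * bus_width_bits / 8

        print(bandwidth_gb_s(10, 256))  # GTX 1080:    10 Gbps GDDR5X, 256-bit bus -> 320.0 GB/s
        print(bandwidth_gb_s(11, 352))  # GTX 1080 Ti: 11 Gbps GDDR5X, 352-bit bus -> 484.0 GB/s
        print(bandwidth_gb_s(16, 256))  # rumored chip: 16 Gbps GDDR6, 256-bit bus ASSUMED -> 512.0 GB/s

    On that assumption, the faster memory alone would push the new GP104-class chip past the GTX 1080 Ti's peak bandwidth, which at least lines up with the rumored positioning of GTX 1080 Ti-class performance from a GTX 1080-class part.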

    As always, remember to take this rumor, as with any other, with the necessary dose of NaCl. We should get a much better idea of what NVIDIA’s GPU roadmap looks like at GTC later this month. It seems the rumors are pouring in and won’t stop anytime soon. I’d urge extra caution in the meantime especially when dealing with conflicting information like what we’ve been seeing lately.
     
    Vasudev likes this.
  50. Dr. AMK

    Dr. AMK Living with Hope

    Reputations:
    3,961
    Messages:
    2,182
    Likes Received:
    4,654
    Trophy Points:
    281
    Rumor : NVIDIA GTX 2080, 2070 Ampere Cards Launching March 26-29 At GTC 2018
    https://wccftech.com/rumor-nvidia-gtx-2080-2070-ampere-cards-launch-march-gtc-2018/

    [​IMG]
    NVIDIA is reportedly readying a brand new lineup of GeForce GTX graphics cards based on its upcoming 12nm Ampere graphics architecture for an official debut at this year’s GTC in late March, a mere month away from today.

    Ampere has been a frequent subject of leaks and rumors in the past several months. The code name hasn’t been publicly disclosed by NVIDIA in any of its previous roadmaps, however several reports have emerged alleging that Ampere will in fact power the next generation lineup of GeForce GTX graphics cards this year. In essence NVIDIA is said to be skipping a generation in the gaming market, Volta, in favor of Ampere. Whether Ampere is a stand-alone leapfrog architecture or a tweaked version of Volta that’s tuned specifically for gaming is still unclear.

    NVIDIA “Ampere” for Gaming, “Turing” for AI & Compute
    Two key features of Ampere include TSMC's new 12nm manufacturing process and Samsung's new 16 Gbps GDDR6 memory. The first Ampere GPU that NVIDIA is expected to announce next month is GA104, a replacement for GP104 (GTX 1080/70) in terms of chip size and market positioning and for GP102 (GTX 1080 Ti) in terms of performance.

    So, one could reasonably expect a pair of products with pricing similar to the GTX 1080/70 and performance similar to the GTX 1080 Ti/ Titan Xp. These will be the purported GTX 2080/1180 and GTX 2070/1170 cards. According to previous reports, GP102 has already entered end-of-life and is no longer being manufactured, while production of GA104 has been well underway since January.

    I would also be remiss if I didn't mention that the naming scheme of the GeForce GTX Ampere series has yet to be confirmed, and that "2080" and "2070" are simply placeholders at the moment. Whether the company will follow a 20 series or 11 series naming scheme is yet to be determined. Although, with the series coming out in mere weeks, we shouldn't have to wait too long before we know for sure.


    English computer scientist, mathematician, logician and cryptanalyst Alan Turing

    On the compute front the company is expected to debut an entirely new graphics architecture called Turing that’s specifically designed and optimized for AI and machine learning applications.

    Previously we’ve seen the company use the same architecture with chip specific tweaks and optimizations for different markets e.g. FP64, HBM and other component configurations that would differentiate the gaming products from the compute products.

    If what we’ve been hearing is accurate, then 2018 will be the first year where NVIDIA is going to debut two distinctly different graphics architectures for gaming and compute rather than just tweaked chips based on the same architecture.

    GTC 2018 will officially kick off on March 26th and wrap up on March 29th; the announcement will likely come during Jensen's keynote speech. Until then, remember to take this rumor, like any other, with a grain of salt.

     
    Vasudev likes this.
← Previous pageNext page →