Subject: Graphics Cards | March 4, 2018 - 02:02 PM | Scott Michaud
Tagged: nvidia, hotfix, graphics drivers
NVIDIA has published a hotfix driver, 391.05, for a few issues that didn’t make it into the recently released 391.01 WHQL version. Specifically, if you are experiencing any of the following issues, then you can go to the NVIDIA forums and follow the link to their associated CustHelp page:
- NVIDIA Freestyle stopped working
- Display corruption on Titan V
- Support for Microsoft Surface Book notebooks
While improved support for the Titan V and the Microsoft Surface Book is very important for anyone who owns those devices, NVIDIA Freestyle is the interesting one for the masses. The feature lets users hook into the post-processing stage of various supported games and inject their own effects. It launched in January and is still in beta, but early adopters still want it to work, of course. If you were playing around with Freestyle and it stopped working on 390-based drivers, then check out this hotfix.
For the rest of us? Probably a good idea to stay on the official drivers. Hotfixes go through reduced QA, so it’s possible that other bugs were introduced in the process.
Subject: Graphics Cards | February 28, 2018 - 09:04 PM | Ryan Shrout
Tagged: bitmain, bitcoin, qualcomm, nvidia, amd
This article originally appeared in MarketWatch.
Research firm Bernstein recently published a report on the profitability of Bitmain Technologies, a secretive Chinese company with a huge impact on the bitcoin and cryptocurrency markets.
With estimated 2017 profits ranging from $3 billion to $4 billion, the size and scope of Beijing-based Bitmain is undeniable, with annual net income higher than some major tech players, including Nvidia and AMD. The privately held company, founded five years ago, has expanded its reach into many bitcoin-based markets, but most of its income stems from the development and sale of dedicated cryptocurrency mining hardware.
There is a concern that the sudden introduction of additional companies in the chip-production landscape could alter how other players operate. This includes the ability for Nvidia, AMD, Qualcomm and others to order chip production from popular semiconductor vendors at the necessary prices to remain competitive in their respective markets.
Bitmain makes most of its income through the development of dedicated chips used to mine bitcoin. These ASICs (application-specific integrated circuits) offer better performance and power efficiency than other products such as graphics chips from Nvidia and AMD. The Bitmain chips are then combined into systems called “miners” that can include as many as 250 chips in a single unit. Those are sold to large mining companies or individuals hoping to turn a profit from the speculative cryptocurrency markets for prices ranging from a few hundred to a few thousand dollars apiece.
Bitcoin mining giant
Bernstein estimates that as much as 70%-80% of the dedicated market for bitcoin mining is being addressed by Bitmain and its ASIC sales.
Bitmain has secondary income sources, including running mining pools (where groups of bitcoin miners share the workload of computing in order to turn a profit sooner) and cloud-based mining services where customers can simply rent mining hardware that exists in a dedicated server location. This enables people to attempt to profit from mining without the expense of buying hardware directly.
A Bitmain Antminer
The chip developer and mining hardware giant has key advantages for revenue growth and stability, despite the volatility of the cryptocurrency market. When Bitmain designs a new ASIC that can address a new currency or algorithm, or run a current coin algorithm faster than was previously possible, it can choose to build its Antminers (the brand for these units) and operate them at its own server farms, squeezing the profitability and advantage the faster chips offer on the bitcoin market before anyone else in the ecosystem has access to them.
As the difficulty of mining increases (which occurs as higher-performance mining options are released, lowering the profitability of older hardware), Bitmain can then start selling the new chips and associated Antminers to customers, moving revenue from mining directly to sales of mining hardware.
This pattern can be repeated for as long as chip development continues, giving Bitmain a tremendous amount of flexibility to balance revenue from different streams.
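The arithmetic behind that flexibility is worth spelling out. A miner's expected revenue is proportional to its share of the total network hashrate, and the network hashrate scales with the difficulty, so each difficulty increase directly erodes the revenue of existing hardware. The sketch below illustrates this; it is not from the Bernstein report, and the hashrate and difficulty figures are made-up numbers chosen only to show the relationship.

```python
# Illustrative sketch: how rising network difficulty erodes per-miner revenue.
# All numbers here are assumptions for illustration, not real market data.

def daily_btc_revenue(hashrate_ths, difficulty, block_reward=12.5):
    """Expected BTC mined per day by a miner running at `hashrate_ths` TH/s.

    Bitcoin targets one block every ~600 seconds, and the network hashrate
    in hashes/sec is approximately difficulty * 2**32 / 600.
    """
    network_hashrate = difficulty * 2**32 / 600   # hashes per second
    own_hashrate = hashrate_ths * 1e12            # TH/s -> hashes per second
    blocks_per_day = 24 * 3600 / 600              # ~144 blocks mined per day
    share = own_hashrate / network_hashrate       # miner's slice of the network
    return blocks_per_day * block_reward * share

# A hypothetical 14 TH/s ASIC miner before and after a difficulty doubling:
rev_before = daily_btc_revenue(14, difficulty=2.6e12)
rev_after = daily_btc_revenue(14, difficulty=5.2e12)
# Doubling the difficulty roughly halves the daily revenue of the same unit,
# which is why Bitmain mines with new chips first, then sells them once
# difficulty has caught up.
```

Because revenue per unit is inversely proportional to difficulty, the newest, fastest chips are most profitable in the window before the rest of the network catches up, which is exactly the window Bitmain keeps for itself.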
Imagine how much more valuable those resources would be if one of the major graphics chip vendors used its latest graphics chips exclusively for its own services, such as cloud compute, crypto-mining and server-based rendering. That is the power Bitmain holds over the bitcoin market.
Competing for foundry business
Clearly Bitmain is big business, and its impact goes well beyond just the bitcoin space. Because its dominance for miners depends on new hardware designs and chip production, where performance and power efficiency are critical to building profitable hardware, it competes for the same foundry business as other fabless semiconductor giants. That includes Apple, Nvidia, Qualcomm, AMD and others.
Foundries that fabricate ASICs for outside customers, including Samsung, TSMC, GlobalFoundries and even Intel to a small degree, look for customers willing to bid the most for their limited production capacity. Bitmain is not restricted to a customer base that is cost-sensitive; instead, its customers are profit-sensitive. As long as the crypto market remains profitable, Bitmain can absorb the added cost of chip production.
Advantages over Nvidia, AMD and Qualcomm
Nvidia, AMD and Qualcomm are not as flexible. Despite the fact that Nvidia can charge thousands for some of its most powerful graphics chips when targeting the enterprise and machine-learning markets, the wider gaming market is more sensitive to price changes. You can see that in the unrest in the gaming space as graphics card prices rise due to inventory going to miners rather than gamers. Neither AMD nor Nvidia will get away with selling graphics cards to partners at higher prices and, as a result, there is potential for negative market growth in PC gaming.
If Bitmain uses the same foundries as others, and is willing to pay more to have its chips built at a higher priority than other fabless semiconductor companies, then it could directly affect the availability and pricing of graphics chips, mobile phone processors and anything else built at those facilities. As a result, not only does the cryptocurrency market affect the current graphics chip market for gamers by causing shortages, but it could also impact future chip availability if Bitmain (and its competitors) are willing to spend more for the advanced process technologies coming in 2018 and beyond.
Still, nothing is certain in the world of bitcoin and cryptocurrency. The fickle and volatile market means the profitability of Bitmain’s Antminers could be reduced, lessening the drive to pay more for chips and production. There is clearly an impact from sudden bitcoin value drops (from $20,000 to $6,000, as we saw this month) on mining hardware sales, both graphics chip-based and ASIC-based, but measuring and predicting that impact is a difficult venture.
Subject: General Tech | February 19, 2018 - 12:59 PM | Jeremy Hellstrom
Tagged: Nintendo Switch, nvidia, Tegra X1
Sometimes a flaw in a chip's design can be used for good; for instance, a flaw in Nvidia's Tegra X1 allows a successful install of Linux on the Nintendo Switch. The flaw is in the firmware, so Nintendo will not be pushing out a fix that disables this feature on current Switches. For now, those who have managed this trick are not sharing the details, so you will have to wait before you can try to fry your own Switch. As The Inquirer points out, this is not a terrible issue, as the Linux-based Switch still needs work before you can play anything on it, be it Switch games, legacy Nintendo titles or Steam.
"NOT CONTENT with simply getting Linux to boot on the Nintendo Switch, the hacker folks over at fail0verflow have managed to get the hybrid console to behave like a full-fat Linux PC."
Here is some more Tech News from around the web:
- Chrome Extension Brings 'View Image' Button Back @ Slashdot
- Guidemaster: Smartwatches worthy of replacing your favorite timepiece @ Ars Technica
- Microsoft fixes limitations of Windows 10 on ARM by deleting any mention of them @ The Inquirer
- If you don't like what IBM is pitching, blame Watson: It's generating sales 'solutions' now @ The Register
- Oh sh-itcoin! Crypto-dosh swap-shop Coinbase empties punters' bank accounts @ The Register
- Google Exposes How Malicious Sites Can Exploit Microsoft Edge @ Slashdot
- When it absolutely, positively needs to be leaked overnight: 120k FedEx customer files spill from AWS S3 silo @ The Register
- Zhiyun Crane 2 Gimbal @ TechPowerUp
Subject: Graphics Cards | February 18, 2018 - 02:54 PM | Scott Michaud
Tagged: opengl, nvidia, metal, macos, apple
Just two days ago, NVIDIA published a job posting for a software engineer to “implement and extend 3D graphics and Metal”. Given that they specify the Metal API, and that they want applicants who are “Experienced with OSX and/or Linux operating systems”, it seems clear that this job would involve macOS and/or iOS.
First, if this appeals to any of our readers, the job posting is here.
Second, and this is where it gets potentially news-worthy, is that NVIDIA hasn’t really done a whole lot on Apple platforms for a while. The most recent NVIDIA GPU to see macOS is the GeForce GTX 680. It’s entirely possible that NVIDIA needs someone to fill in and maintain those old components. If that’s the case? Business as usual. Nothing to see here.
The other possibility is that NVIDIA might be expecting a design win with Apple. What? Who knows. It could be something as simple as Apple’s external GPU architecture allowing the user to select their own add-in board. Alternatively, Apple could have selected an NVIDIA GPU for one or more product lines, which they have not done since 2013 (as far as I can tell).
Apple typically makes big announcements at WWDC, which is expected in early June, or around the back-to-school season in September. I’m guessing we’ll know by then at the latest if something is in the works.
Subject: General Tech | February 15, 2018 - 11:32 AM | Ken Addison
Tagged: podcast, Intel, amd, nvidia, raven ridge, r5 2400g, r3 2200g, arm, project trillium, qualcomm, snapdragon 845, x24, LTE, 5G
PC Perspective Podcast #487 - 02/15/18
Join us this week for a recap of news and reviews including new AMD Desktop APUs, Snapdragon 845 Performance Preview, ARM Machine Learning, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, Allyn Malventano
Peanut Gallery: Alex Lustenberg, Ken Addison
Program length: 1:18:46
Podcast topics of discussion:
Week in Review:
News items of interest:
Picks of the Week:
Subject: Systems | February 14, 2018 - 01:49 PM | Sebastian Peak
Tagged: small form factor, silent, SFF, nvidia, mini PC, Intel, Inferno, GTX 1080, gaming, fanless, core i7 7700k, compulab, Airtop2
Compulab, maker of mini systems such as the fitlet and Airtop, is bringing the compact, fanless concept to a powerful gaming system - with no less than an Intel Core i7-7700K and an NVIDIA GeForce GTX 1080. The catch? It is not yet available, pending an upcoming Kickstarter campaign beginning February 24.
The teaser image of the upcoming Airtop2 Inferno fanless gaming system
The Airtop2 is already available for purchase in a fanless workstation version, built-to-order with up to an Intel Xeon E3-1275 v6 and NVIDIA Quadro P4000 (starting at $2575 for that configuration before adding memory/storage), and this new "Inferno" version of the Airtop2 promises to be very interesting to silent computing enthusiasts.
Front and rear views of the Inferno system
A fanless gaming system with high-end components is only going to be as effective as its cooling system, and here Compulab has a lot of experience on the industrial/embedded side of things.
Exploded view of the standard Airtop2 design (no images of the Airtop2 Inferno interior available yet)
Compulab lists these specs for the Airtop2 Inferno (along with the teaser, "and a little more..."):
- Unlocked Intel Core i7-7700K
- NVIDIA GeForce GTX 1080
- Up to 64 GB DDR4 2400 RAM
- 2x NVMe + 4x 2.5″ SSD / HDD
- 2x USB 3.1 + 7x USB 3.0 | dual LAN | front (and back) audio
Compulab has also provided some benchmark results to demonstrate how effective their fanless implementation of these components is, with results using 3DMark and Unigine Heaven available on the Inferno product page.
The company has set up a Q&A page for the Airtop2 Inferno, but pricing/availability info will probably have to wait until February 24th when the Kickstarter campaign is active.
Subject: General Tech | February 9, 2018 - 01:06 PM | Jeremy Hellstrom
Tagged: nvidia, jen-hsun huang, billions
NVIDIA's odd take on the end of quarters means that we are only now seeing their Q4 earnings report along with their full fiscal 2018 results. The news is good for the green team, with Q4 revenue of $2.91 billion, up 34% from the same quarter a year earlier. Total fiscal 2018 revenue is $9.71 billion, an increase of 41% over fiscal 2017. The success of the Nintendo Switch certainly helped, and the $133 million tax cut they received didn't hurt, nor did the continued popularity of graphics cards, which NVIDIA states they would prefer to sell to gamers; though it is unclear how they could enforce this on the open market.
It is too early to see an impact from their removal of consumer-class GPUs from data centres; assuming they do not reverse that particular decision, we will see a jump in that revenue in the coming quarters. The Inquirer offers their take on the finer points of NVIDIA's earnings statement here.
"The firm revealed in its Q4 earnings report 2017 that its total revenues for the quarter swelled by a massive 34 per cent from the previous year, with its revenue for the full fiscal year hitting $9.71bn, up by 41 per cent."
Here is some more Tech News from around the web:
- Intel adopts Orwellian irony with call for fast Meltdown-Spectre action after slow patch delivery @ The Register
- Microsoft ditches passwords in latest build of Windows 10 S @ The Inquirer
- Wish you could log into someone's Netgear box without a password? Summon a &genie=1 @ The Register
- The 2018 Ars Technica Valentine’s Day gift guide
For the first time in several years, the notebook market has gotten very interesting from a performance standpoint. First, we had Intel’s launch of its Kaby-Lake Refresh 8th Generation processors which packed a true quad-core CPU into a 15W package. Then, we heard about AMD’s Raven Ridge which aimed to combine a quad-core mobile CPU with Radeon Vega graphics into that same 15W power target.
Even though the excitement over Raven Ridge may have subsided a bit after Intel and AMD’s joint announcement of Vega graphics combined with Intel CPUs in the Kaby Lake-G platform, that product has yet to be released and will reside in a significantly higher class of power usage.
So today we are taking a look at AMD’s Raven Ridge, what may be AMD’s first worthy entry into the thin-and-light notebook market.
For our Raven Ridge testing, we are taking a look at the HP Envy x360, which at the time of writing is the only machine to be shipping with these Ryzen Mobile processors (although more machines have been announced and are coming soon). Additionally, we also wanted to wait a while for the software ecosystem on this new platform to stabilize (more on that later).
Subject: General Tech | January 24, 2018 - 02:36 PM | Jeremy Hellstrom
Tagged: gaming, linux, nvidia, amd
The current mining insanity has driven GPU prices so high that it makes more financial sense to buy a gaming laptop or boutique system than to purchase a GPU on its own. The alternative is to continue on with your current GPU, even if it is a bit long in the tooth. Phoronix recently tested a battery of AMD and NVIDIA cards, focusing on older or less powerful models, to see what kind of gaming performance they are capable of. The switch to Linux makes sense, as Microsoft is beginning to refuse to recognize older GPUs and is blocking the installation of the older drivers they require. You will have to turn down your graphics settings to reach playable framerates, but there are titles out there you can still enjoy at 1080p.
"A request came in this week to look at how low-end and older graphics cards are performing with current generation Linux games on OpenGL and Vulkan. With ten older/lower-end NVIDIA GeForce and AMD Radeon graphics cards, here is a look at their performance with a variety of native Linux games atop Ubuntu using the latest Radeon and NVIDIA drivers."
Here is some more Tech News from around the web:
- Unlocked PS4 consoles can now run copies of PS2 games @ Ars Technica
- Nvidia’s GeForce Now PC beta is much better at cloud gaming than you think @ Rock, Paper, SHOTGUN
- Humble Paradox Interactive Bundle
- Subnautica devs on terror and why there are no guns @ Rock, Paper, SHOTGUN
- Age of Empires: Definitive Edition launches 20th Feb @ HEXUS
- Ubi announce bear necessities for Far Cry 5 on PC, inc 4K specs @ Rock, Paper, SHOTGUN
- Sequels for the sequel throne! Battlefleet Gothic: Armada 2 bringing more WH40K spaceship RTS action @ Rock, Paper, SHOTGUN
Subject: General Tech | January 10, 2018 - 02:13 PM | Jeremy Hellstrom
Tagged: amd, nvidia, relive, ShadowPlay, gaming
[H]ard|OCP are comparing AMD and NVIDIA's exhibitionist software to see which offers streamers the best experience. The two applications are superficially similar, but they offer different features and performance, not to mention each supports only its own vendor's hardware. From a performance standpoint, NVIDIA's ShadowPlay is slightly ahead in efficiency, but not in any meaningful way; you would not be able to tell the two apart in a blind test. When you look at features, AMD's ReLive is the clear winner. You can set your bitrate anywhere from 1 to 100 Mbps at every resolution from 360p to 2160p, while NVIDIA maxes out at 50 Mbps at any resolution and only supports up to 1440p. There are several other ReLive features that surpass NVIDIA's offerings; read about them all here.
"We take AMD ReLive in the AMD Radeon Software Adrenalin Edition and NVIDIA ShadowPlay as part of GeForce Experience and find out which one is more FPS and CPU-efficient for recording gameplay. We will compare features, specifications, and find out which better suits content creators for recording gameplay."
Here is some more Tech News from around the web:
- “The least-worst idea we had”—The creation of the Age of Empires empire @ Ars Technica
- Warhammer II’s Tomb Kings are a defensive juggernaut @ Rock, Paper, SHOTGUN
- Humble Hope For Orphans Bundle
- Bridge Constructor Portal isn’t a rollercoaster of laughs, but it’s still good @ Rock, Paper, SHOTGUN
- 44 GPU Fortnite Benchmark: The Best Graphics Cards for Playing Battle Royale @ TechSpot
- Total War: Three Kingdoms tackles the turbulence of 3rd century China @ Rock, Paper, SHOTGUN
- Cyberpunk 2077 beeps back to life, may yet boop, whirr @ Rock, Paper, SHOTGUN