Subject: Graphics Cards | November 25, 2016 - 01:21 PM | Jeremy Hellstrom
Tagged: msi, gtx 1070, GTX 1070 Quick Silver, factory overclocked
MSI's new Quick Silver design looks very different from most of their other cards: black and silver with a shiny metal backplate, as opposed to the red and black we are used to. The GTX 1070 which TechPowerUp reviewed carries a modest factory overclock, with a base core clock of 1582MHz, 76MHz higher than the default, though the VRAM is left at its stock frequency. There is headroom left in the card: TechPowerUp hit a stable 2101MHz core and 2290MHz VRAM, not the best results they have seen but certainly a decent increase. Drop by for a look at its performance in over a dozen games.
"MSI's GTX 1070 Quick Silver does away with the red-and-black color theme and uses stylish silver instead. Thanks to the powerful cooler from the GTX 1070 Gaming Z, the card is the coolest and quietest GTX 1070 we ever tested. It also comes at a rather affordable $425."
Here are some more Graphics Card articles from around the web:
- ASUS ROG STRIX GTX 1080 Gaming A8G @ Kitguru
- ZOTAC GeForce GTX 1060 Mini 3 GB @ techPowerUp
- Zotac GeForce GTX 1050 Ti @ Hardware Secrets
- XFX Radeon RX 470 @ Hardware Secrets
Subject: Graphics Cards | November 17, 2016 - 08:54 PM | Scott Michaud
Tagged: amd, graphics drivers
The fourth Radeon Software Crimson Edition graphics driver to be released this month, dated November 15th, was just published on AMD's website. Like the previous ones, it has not been WHQL certified, but that might actually be for the best. Rapid graphics driver releases, not throttled by Microsoft red tape, probably increase driver quality over this busy time of year. Also, I recently found out that WHQL certification is not a requirement even for clean-installed Windows 10 Anniversary Edition systems with Secure Boot enabled. Both AMD and NVIDIA sign their hotfix drivers in a way that satisfies this check, without going through the entire WHQL process.
That aside, Radeon Software Crimson Edition 16.11.4 rolls in additional fixes to Civilization VI. AMD isn’t saying what these fixes are, such as whether they are for general performance optimizations or stability issues that we haven’t heard about yet, but it’s out now so you should probably update if you are currently playing the game. The driver also fixes problems when attempting to watch web video and play a game simultaneously, which is actually something I do frequently. (Don’t knock listening to podcasts while playing StarCraft II Arcade until you try it...) Thirdly, 16.11.4 also fixes rendering issues in Titanfall 2 that occur while piloting a Titan.
Subject: Graphics Cards | November 17, 2016 - 07:24 PM | Scott Michaud
Tagged: nvidia, graphics drivers
Update, November 17th @ 7:21pm: NVIDIA has released 375.95 Hotfix to fix this issue. They are working on getting it WHQL certified for their website and GeForce Experience. You can download and install it directly, though.
Update, November 16th @ 12:56pm: NVIDIA has reproduced the low memory clocks issue, found its cause, and is working on a fix. They believe it only affects certain factory-overclocked cards. It is obviously a high priority, so a hotfix driver will likely be issued (unless they can get it WHQL certified quickly enough that a hotfix would be pointless).
Original post below:
NVIDIA has just released a new graphics driver. GeForce Game Ready 375.86 provides optimized support for Ubisoft's Steep, which is an open-world game with wingsuiters, skiers, snowboarders, and paragliders. It also rolls in extra optimizations for previous game ready games that are receiving patches: Battlefield 1, Civilization VI, and Tom Clancy's The Division.
Before you install, though, there is one particularly annoying issue being reported on the GeForce forums. NVIDIA is currently investigating reports that certain, but not all, Pascal GPUs are having their video memory stuck at 810 MHz, leading to (as you would expect) severe performance loss. It's possible that the affected users are all running a specific overclocking application or something similar. If you are in a bit of a rush and don't want to risk a potential rollback, then you might want to skip this version.
Thankfully, both discrete graphics vendors have been releasing multiple versions per month. The wait shouldn't be too long.
Subject: Graphics Cards | November 15, 2016 - 02:58 PM | Jeremy Hellstrom
Tagged: rx 480, nvidia, GTX1060, amd
On one side of the ring is the RX 480, with 2304 Stream Processors, 32 ROPs, and 144 Texture Units. In the opposite corner, with 1280 CUDA Cores, 48 ROPs, and 80 Texture Units, is the GTX 1060. The two cards retail for between $200 and $250, depending on the features present on the card as well as any sales. [H]ard|OCP tested the two cards head to head, looking not just at raw performance numbers but also at the stability of GPU frequencies, power draw, and temperatures. All games were tested at base clocks and at the highest stable overclock, and the results went back and forth; in some games AMD pulled ahead while in others NVIDIA was the clear winner. It is worth keeping in mind that these results do not include VR testing.
"We take GIGABYTE’s Radeon RX 480 G1 GAMING video card and pit it against a MSI GeForce GTX 1060 GAMING X video card in today’s evaluation. We will overclock both video cards as high as possible and compare performance and find out what both video cards have to offer in the upper $200 price range for gaming."
Here are some more Graphics Card articles from around the web:
- Galax GTX 1070 EXOC Sniper Review @ OCC
- Zotac GeForce GTX 1050 @ Hardware Secrets
- Gigabyte GTX 1050 Ti G1 Gaming 4 GB @ techPowerUp
- MSI GTX 1050 Ti 4GB Gaming X 4G @ Kitguru
Subject: Graphics Cards | November 14, 2016 - 11:22 AM | Ryan Shrout
Tagged: video, the last hope, serious sam vr, rx 480, radeon, Polaris, multi-gpu, liquidvr, amd, affinity
While VR excitement might have cooled slightly in the enthusiast community, there continues to be innovation and software releases on both the Oculus Rift and HTC Vive that keep bringing me back to what we believe to be part of the future of PC gaming. Serious Sam VR: The Last Hope was announced at E3 this year and is now available as an early access game on Steam. It is a dual-wielding shooter that combines the enemies of the previous games with the crazy weapons that made the series iconic.
And hey, there is something awesome about using a missile launcher that takes up half the screen.
One interesting technology addition to the game is the use of AMD's LiquidVR affinity multi-GPU. A Croteam developer recently posted a blog on the GPUOpen.com site discussing the implementation.
We wanted to add LiquidVR Affinity Multi-GPU rendering support to our engine because two GPUs can render the two eye views in almost half the time compared to a single GPU and this would greatly reduce our GPU bottlenecks. Affinity MGPU can either be done in one pass or with a separate pass for each eye, in which case we reap the GPU side benefits while the CPU workload stays the same.
We needed about a week to modify all shaders and to make sure that correct data is set for each eye. Single pass rendering with Affinity Multi-GPU gave us a huge speed improvement on both CPU and GPU from our original VR implementation. In the end, it took us less time to do single pass rendering correctly than it took us to fix all the problems caused by multi pass multi-GPU rendering.
After the interest in the Deus Ex multi-GPU scaling video I thought I would see if the Serious Sam implementation was actually beneficial to gamers.
- Test System
- Core i7-5960X
- X99 MB + 16GB DDR4
- AMD Radeon RX 480 8GB
- Driver: 16.10.2
The test was simple: I found that a single RX 480 could run the game at Medium settings perfectly well, but could it be playable on High with multi-GPU? By adding in a second Radeon RX 480 I was able to bring the performance up by 55% or so, making the VR experience nearly flawless.
It's not perfect scaling, but the benefits of multi-GPU for VR, when properly implemented, are obvious. As more games and experiences are released that require higher compute capability or have in-game settings that allow for better image quality, the ability to scale across GPUs will be a welcome addition to the ecosystem.
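The ~55% uplift quoted above can be put in perspective with a quick back-of-the-envelope calculation (the FPS figures below are hypothetical; only the 55% number comes from the article):

```python
# Rough multi-GPU scaling math: a second GPU at ideal linear scaling would
# double throughput, so the article's ~55% uplift is roughly 78% of ideal.

def scaling_efficiency(single_gpu_fps: float, multi_gpu_fps: float,
                       num_gpus: int = 2) -> float:
    """Fraction of ideal linear scaling achieved by the multi-GPU setup."""
    speedup = multi_gpu_fps / single_gpu_fps  # e.g. 1.55x for a 55% uplift
    return speedup / num_gpus                 # ideal speedup equals num_gpus

# Hypothetical numbers: 45 FPS on one RX 480, ~55% higher with two.
single = 45.0
dual = single * 1.55
print(f"speedup: {dual / single:.2f}x")
print(f"efficiency: {scaling_efficiency(single, dual):.0%}")  # roughly 78% of ideal
```

That sub-linear efficiency is typical; per-eye affinity rendering still leaves some shared CPU and synchronization work that a second GPU cannot absorb.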
Check out the video here if you haven't seen any Serious Sam VR gameplay yet!
Subject: Graphics Cards | November 10, 2016 - 08:27 PM | Scott Michaud
Tagged: quarterly earnings, nvidia
The most recent quarter for NVIDIA, which is the three months ending on October 30th, has just passed $2 Billion USD in revenue, an increase of 54% from last year. All said and done, this leads to $542 million in GAAP net income, which is also up 108% from last quarter (or up 120% from the same quarter last year).
NVIDIA doesn't attribute this increase to any specific line of products. Instead, CEO Jen-Hsun Huang takes the opportunity to promote the “years of work and billions of dollars” they spent on the Pascal architecture, applying it all over the place. While I'm guessing a lot of the sales are carried over from last quarter's parts, which are now able to keep up with demand, NVIDIA points to laptop SKUs of 10-series GPUs, the launch of Tesla P4 and P40 GPUs, and initial shipments of the DGX-1 as new and notable for this quarter.
NVIDIA expects to have an even better quarter with the holiday, aimed at $2.1 Billion USD, plus or minus a couple percent. A lot more details are available on NVIDIA's blog, including their Switch announcement with Nintendo, their Drive PX2 platform, and their next-generation Tegra processor, codenamed Xavier.
Subject: Graphics Cards, Systems | November 10, 2016 - 11:44 AM | Ryan Shrout
Tagged: VR, rift, Oculus, atw, asynchronous timewarp, asynchronous spacewarp, asw
Oculus has announced that as of today, support for Asynchronous Spacewarp is available and active for all users that install the 1.10 runtime. Announced at the Oculus Connect 3 event in October, ASW promises to complement existing Asynchronous Timewarp (ATW) technology to improve the experience of VR for lower performance systems that might otherwise result in stutter.
A quick refresher on Asynchronous Timewarp is probably helpful. ATW was introduced to help alleviate the impact of missed frames on VR headsets, and its development started back with the Oculus DK2 headset. By shifting the image on the VR headset, without input from the game engine, based on relative head motion that occurred AFTER the last VR pose was sent to the game, timewarp presents a more accurate image to the user. While this technology was first used as a band-aid for slow frame rates, Oculus felt confident enough in its advantages that the Rift enables it for all frames of all applications, regardless of frame rate.
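As a very rough illustration of the idea (real ATW performs a full 3-D reprojection on the GPU; this 1-D toy model and its function name are my own), shifting an already-rendered image to account for head rotation that happened after the pose was sampled looks something like this:

```python
# Crude 1-D illustration of timewarp: shift the rendered image's pixel
# columns by the fraction of the field of view the head turned after the
# frame's pose was sampled. Real ATW reprojects the image in full 3-D.

def timewarp_shift(frame: list, yaw_delta_deg: float,
                   fov_deg: float = 90.0) -> list:
    """Shift each row of a 2-D frame by the fraction of FOV the head turned."""
    width = len(frame[0])
    shift = round(yaw_delta_deg / fov_deg * width)  # pixels to shift
    # Turning the head right makes the world appear to move left in the image.
    return [row[shift:] + row[:shift] for row in frame]

frame = [[0, 1, 2, 3, 4, 5, 6, 7]]  # one row of pixel "columns"
warped = timewarp_shift(frame, yaw_delta_deg=22.5)  # 22.5 degrees of a 90-degree FOV
print(warped)  # columns rotated by 2 pixels: [[2, 3, 4, 5, 6, 7, 0, 1]]
```

The key property is that this correction uses only the head-tracking delta, so it can run without any help from the game engine.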
ATW moves the entire frame as a whole, shifting it only based on relative changes to the user’s head rotation. New Asynchronous Spacewarp attempts to shift objects and motion inside of the scene by generating new frames to insert in between “real” frames from the game engine when the game is running in a 45 FPS state. With a goal of maintaining a smooth, enjoyable and nausea-free experience, Oculus says that ASW “includes character movement, camera movement, Touch controller movement, and the player's own positional movement.”
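A simplified model of that half-rate behavior (not Oculus's actual runtime code; the function names and the linear-extrapolation assumption are mine) makes the frame pacing concrete: when the app delivers 45 FPS on a 90 Hz display, every other refresh shows a synthesized frame that continues the motion observed between the last two real frames.

```python
# Simplified model of ASW frame pacing: at 45 FPS on a 90 Hz display, every
# other refresh shows a synthetic frame extrapolated from in-scene motion.

REFRESH_HZ = 90
APP_FPS = 45

def build_frame_schedule(num_refreshes: int) -> list:
    """Label each display refresh as a real app frame or a synthesized one."""
    schedule = []
    per_real = REFRESH_HZ // APP_FPS  # refreshes per real frame (2 here)
    for refresh in range(num_refreshes):
        kind = "real" if refresh % per_real == 0 else "synthesized"
        schedule.append((kind, refresh // per_real))
    return schedule

def extrapolate_position(prev: float, curr: float) -> float:
    """Linear extrapolation: continue the motion seen between the last two real frames."""
    return curr + (curr - prev)

print(build_frame_schedule(4))
# An object moving right 2 units per real frame: the synthetic frame
# continues that motion from 12.0 to 14.0.
print(extrapolate_position(10.0, 12.0))
```

Extrapolating scene motion this way is what distinguishes ASW from ATW's whole-frame rotational shift, and it is also why the artifacts described below can appear when the motion model guesses wrong.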
To many of you that are familiar with the idea of timewarp, this might sound like black magic. Oculus presents this example on their website to help understand what is happening.
Seeing the hand with the gun in motion, ASW generates a frame that continues the animation of the gun to the left, tricking the user into seeing the continuation of the motion they are going through. When the next actual frame is presented just after, the gun will have likely moved slightly more than that, and then the pattern repeats.
You can notice a couple of things about ASW in this animation example, however. If you look just to the right of the gun barrel in the generated frame, there is an artificial stretching of the pixels; the wheel looks like something out of Dr. Strange. However, this is likely an effect that would not be noticeable in real time and should not dramatically impact the user experience. And, as Oculus would tell us, it is better than the alternative of simply missing frames and animation updates.
Some ASW interpolation cases will be easier than others thanks to secondary data being available. For example, with the Oculus Touch controller, the runtime knows how much the player's hand has moved, and thus how much the object being held has moved, and can better estimate the new object location. Positional movement has the same advantage. If a developer has properly implemented the different layers of abstraction for Oculus and its runtime, separating out backgrounds from cameras from characters, and so on, then the newly created frames are less likely to have significant distortions.
I am interested in how this new feature affects the current library of games on PCs that do in fact drop below that 90 FPS mark. In October, Oculus was on stage telling users that the minimum spec for VR systems was dropping from requiring a GTX 970 graphics card to a GTX 960. This clearly expands the potential install base for the Rift. Will the magic behind ASW live up to its stated potential without an abundance of visual artifacts?
In a blog post on the Oculus website, they mention some other specific examples of “imperfect extrapolation.” Rapid brightness changes, object disocclusion trails (one object moving out from behind another), repeated patterns, and head-locked elements (that aren’t designated as such in the runtime) can all cause distracting artifacts in the animation if not balanced and thought through. Oculus isn’t telling game developers to go back and modify their titles, but instead to "be mindful of their appearance."
Oculus does include a couple of recommendations for developers looking to optimize quality for ASW: use locked layers, use real time rather than frame count for animation steps, and provide easily adjustable image quality settings. It’s worth noting that this new technology is enabled by default as of runtime 1.10 and will only kick in once a game drops below the 90 FPS line. If your title stays over 90 FPS, then you get the advantages of Asynchronous Timewarp without the potential issues of Asynchronous Spacewarp.
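The animation-stepping recommendation is worth a quick sketch (the helper functions here are hypothetical, for illustration only): at half rate the engine produces half as many frames, so anything stepped per frame runs at half speed, while anything stepped by wall-clock time stays correct.

```python
# Why time-based animation matters under ASW: frame-count stepping slows
# animations to half speed when the engine drops to 45 FPS, while
# time-based stepping is unaffected by the frame rate.

def animate_by_frames(frames_elapsed: int,
                      units_per_frame_at_90fps: float = 1.0) -> float:
    """Fragile under ASW: position depends on how many frames were rendered."""
    return frames_elapsed * units_per_frame_at_90fps

def animate_by_time(seconds_elapsed: float,
                    units_per_second: float = 90.0) -> float:
    """Robust: position depends only on wall-clock time."""
    return seconds_elapsed * units_per_second

# One second of animation at full rate vs. ASW's half rate:
print(animate_by_frames(90))  # 90.0 units when running at 90 FPS
print(animate_by_frames(45))  # only 45.0 units when ASW halves the frame rate
print(animate_by_time(1.0))   # 90.0 units either way
```

The same principle applies outside of VR, which is why variable-timestep game loops multiply movement by elapsed time rather than counting ticks.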
The impact of ASW will be interesting to see. For as long as Oculus has been around, they have trumpeted the need for 90 FPS to ensure a smooth gaming experience free of headaches and nausea. With ASW, that requirement, in theory, drops to 45 FPS, though with the caveats mentioned above. Many believe, as do I, that this new technology was built to help Microsoft's partnership with Oculus bring VR to the Scorpio Xbox console coming next year. Because the power of that new hardware will still lag behind the recommended specification from both Oculus and Valve for VR PCs, something had to give. The result is a new “minimum” specification for Oculus Rift gaming PCs and a level of performance that makes console-based integrations of the Rift possible.
Subject: Graphics Cards | November 9, 2016 - 09:52 PM | Scott Michaud
Tagged: graphics drivers, dishonored 2, crimson, amd
Just a handful of days into this busy month for video game companies, AMD has released their third Radeon Software Crimson Edition driver for November. 16.11.3, like 16.11.2 and 16.11.1, is not WHQL certified. From a quality standpoint, Microsoft certification hasn't exactly made a difference over the last year or so. In fact, both graphics vendors rapidly releasing hotfixes between regular WHQL milestones seems to have made for a better user experience.
Unfortunately, this does mean that users of clean installed Windows 10 1607 with Secure Boot enabled will be missing out. Correction: The drivers are actually signed by Microsoft with the attestation process.
As for the driver itself, 16.11.3 rolls in AMD's optimizations for Dishonored 2. The game goes live in two days, so this should give users an opportunity to find a good time to install and reboot before launch. It also fixes an issue where Valve's Steam client and EA's Origin client would fail when an external GPU, using AMD's X-Connect Technology standard, is detached.
We have a lot of gaming notebooks
Back in April I did a video with MSI that looked at all of the gaming notebook lines it built around the GTX 900-series of GPUs. Today we have stepped it up a notch, and again are giving you an overview of MSI's gaming notebook lines that now feature the ultra-powerful GTX 10-series using NVIDIA's Pascal architecture. That includes the GTX 1060, GTX 1070 and GTX 1080.
What differentiates MSI's various notebook series? The GE series is for entry-level notebook gaming, the GS series offers slim options, while the GT series is the ultimate mobile PC gaming platform.
| | GE series | GS series | GT62/72 series | GT73/83 series |
|---|---|---|---|---|
| Screen | 15.6" and 17.3" | 14", 15.6" and 17.3"; 1080p and 4K | 15.6" and 17.3" | 17.3" and 18" |
| CPU | Core i7-6700HQ | Core i7-6700HQ | Core i7-6700HQ | Core i7-6820HK |
| GPU | GTX 1060 6GB | GTX 1060 6GB | GTX 1060 6GB; GTX 1070 8GB | GTX 1070 8GB (SLI option); GTX 1080 8GB (SLI option) |
| Storage | 128-512GB M.2 SATA | 128-512GB M.2 SATA | 128-512GB PCIe and SATA | Up to 1TB SSD (SATA, NVMe) |
| Optical | DVD Super-multi | None | Yes (GT72 only) | Blu-ray burner (GT83 only) |
| Features | Killer E2400 LAN; USB 3.1 Type-C; SteelSeries RGB keyboard | Killer E2400 LAN; Killer 1535 WiFi | Killer E2400 LAN; Killer 1535 WiFi; USB 3.1 Type-C; 3x USB 3.0 (GT62/GT72) | Killer E2400 LAN; Killer 1535 WiFi; 5x USB 3.0; SteelSeries RGB (GT73); mechanical keyboard (GT83) |
| Weight | 5.29-5.35 lbs | 3.75-5.35 lbs | 6.48-8.33 lbs | 8.59-11.59 lbs |
Our video below will break down the differences and help point you toward the right notebook for you based on the three key pillars of performance, price and form factor.
Thanks goes out to CUK, Computer Upgrade King, for supplying the 9 different MSI notebooks for our testing and evaluation!
Subject: Graphics Cards | November 7, 2016 - 06:00 PM | Scott Michaud
Tagged: nvidia, graphics drivers
Update, November 7th @ 5:25pm EST:
First, NVIDIA gave Ryan their official statement, which I have included below verbatim.
GeForce Experience collects data to improve the application experience; this includes crash and bug reports as well as system information needed to deliver the correct drivers and optimal settings. NVIDIA does not share any personally identifiable information collected by GeForce Experience outside the company. NVIDIA may share aggregate-level data with select partners, but does not share user-level data. The nature of the information collected has remained consistent since the introduction of GeForce Experience 1.0. The change with GeForce Experience 3.0 is that this error reporting and data collection is now being done in real-time.
They also pointed to their GeForce Experience FAQ.
It sounds like there's a general consensus, from both NVIDIA and even their harshest critics, that telemetry only affects GeForce Experience, and not the base driver. I still believe that there should be a more granular opt-out that still allows access to GeForce Experience, the way web browsers and Visual Studio prompt with a checkbox during install. Still, if this concerns you (and, like Windows 10, it might not, and that's okay), you can remove GeForce Experience.
Also, GamersNexus once again did a very technical breakdown of the situation. I think they made an error, though, since they claimed to have recorded traffic "for about an hour", which may not have included the once-per-day reporting time from Windows Task Scheduler. (My image below suggests, at least on my system, monitoring once per hour but reporting at 12:25pm and at user login.) I reached out to them on Twitter for clarification, but it looks like they may have just captured GeForce Experience's typical traffic.
Update, November 7th @ 7:15pm EST: Heard back from GamersNexus. They did check at the Windows Task Scheduler time as well, and they claim that they didn't see anything unusual. They aren't finished with their research, though.
Original news, posted November 6th @ 4:25pm EST, below.
Over the last day, users have found NVIDIA Telemetry Monitor added to Windows Task Scheduler. We currently don't know what it is or exactly when it was added, but we do know its schedule. When the user logs in, it runs an application that monitors... something... once every hour while the computer is active. Then, once per day (at just after noon on my PC) and once on login, it runs an application that reports that data, which I assume means sends it to NVIDIA.
Before we begin: NVIDIA (or anyone) should absolutely not be collecting data from personal devices without clearly explaining the bounds and giving a clear option to disable it. Lots of applications, from browsers to software development tools, include crash and error reporting, but they usually and rightfully ask you to opt in. Microsoft is receiving a lot of crap for this practice in Windows 10, even with their “Basic” option, and, while most of those points are nonsense, there is ground for some concern.
I've asked NVIDIA if they have a statement regarding what it is, what it collects, and what their policy will be for opt-in and opt-out. I haven't received a response yet, because I sent it less than an hour ago on a weekend, but we'll keep you updated.