Subject: Graphics Cards | February 28, 2017 - 10:55 PM | Tim Verry
Tagged: pascal, nvidia, GTX 1080, GDC
Update Feb 28 @ 10:03pm: It's official, NVIDIA launches the $699 GTX 1080 Ti.
NVIDIA hosted a "Gaming Celebration" live event during GDC 2017 to talk PC gaming and possibly launch new hardware (if rumors are true!). During the event, NVIDIA CEO Jen-Hsun Huang made a major announcement: the top-end GTX 1080 graphics card is getting a price drop to $499, effective immediately.
The NVIDIA GTX 1080 is a Pascal-based graphics card with 2560 CUDA cores paired with 8GB of GDDR5X memory. Graphics cards based on this GP104 GPU are currently selling for around $580 to $700 (most hover around $650), with the "Founders Edition" carrying an MSRP of $699. The $499 price teased during the live stream represents a significant drop from what the cards are going for now. NVIDIA did not specify whether the new $499 MSRP applies to the Founders Edition or is an average that includes partner cards as well, but even if it only applies to the reference cards, partners would have to adjust their prices downward to compete.
I suspect that NVIDIA is making such a bold move both to make room in its lineup for a new product (the long-rumored 1080 Ti, perhaps?) and as a pre-emptive strike against AMD and its Radeon RX Vega products. The move may also be good news for GTX 1070 pricing, as those cards may see price drops of their own to make room for cheaper GTX 1080 partner cards that come in below the $499 price point.
If you have been considering a new graphics card, NVIDIA has sweetened the pot a bit, especially if you had already been eyeing a GTX 1080. (Note that while the price drop is said to be effective immediately, at the time of writing Amazon was still showing typical prices for the cards. Enthusiasts may have to wait a few hours or days for retailers to catch up and update their listings.)
This makes me a bit more excited to see what AMD will have to offer with Vega as well as the likelihood of a GTX 1080 Ti launch happening sooner rather than later!
Subject: General Tech, Graphics Cards | February 27, 2017 - 03:39 PM | Jeremy Hellstrom
Tagged: MWC, GDC, VRMark, Servermark, OptoFidelity, cyan room, benchmark
Futuremark are showing off new benchmarks at GDC and MWC, two conferences that both happen to fall this week. We will have quite a bit of coverage as we try to keep up with the simultaneous news releases and presentations.
First up is a new benchmark for their recently released DX12 VRMark suite: the new Cyan Room, which sits between the two existing tests. The Orange Room tests whether your system can provide an acceptable VR experience or falls somewhat short of the minimum requirements, while the Blue Room shows off what a system that exceeds the recommended specs can manage. The Cyan Room is for those who know their system can handle most VR and want to tune their system's settings. If you don't have the test suite, Humble Bundle has a great deal on it and several other tools, if you act quickly.
Next up is a new suite to test the performance and capabilities of Google Daydream, Google Cardboard, and Samsung Gear VR. There is more to test than raw performance when you are using your phone to view VR content, such as avoiding setting your eyeholes on fire. The tests will help you determine how long your device can run VR content before overheating interferes with performance, as well as estimate your battery life.
VR latency testing is next on the list of announcements, and it is very important: high or unstable latency is the reason some users need to add a bucket to their list of VR essentials. Futuremark have partnered with OptoFidelity to produce the VR Multimeter, a hardware-based HMD testing solution. This allows you, and hopefully soon PCPer as well, to test motion-to-photon latency, display persistence, and frame jitter, as well as audio-to-video synchronization and motion-to-audio latency, all of which could lead to a bad time.
Last up is the brand new Servermark, which tests the performance you can expect from virtual servers, media servers, and other common tasks. The VDI test lets you determine whether a virtual machine has been provisioned at a level commensurate with its assigned task, so you can adjust it as required. The Media Transcode portion determines the maximum number of concurrent streams, and the maximum quality of those streams, that your server can handle; very nice for those hosting media for an audience.
Expect to hear more as we see the new benchmarks in action.
Subject: Graphics Cards | March 19, 2016 - 03:02 PM | Ryan Shrout
Tagged: VR, vive, valve, htc, gdc 2016, GDC
A story posted over at UploadVR has some interesting information that came out of the final days of GDC last week. We know that Valve, HTC and Oculus have recommended users have a Radeon R9 290 or GTX 970 GPU or higher to run virtual reality content on both the Vive and the Rift, and that comes with a high cost for users that weren't already invested in PC gaming. Valve’s Alex Vlachos has other plans that might enable graphics cards from as far back as 2012 to work in Valve's VR ecosystem.
Valve wants to lower the requirements for VR
Obviously there are some trade-offs to consider. The reason GPUs have such high requirements for the Rift and Vive is the need to run at 90 FPS / 90 Hz without dropping frames to create smooth and effective immersion. Deviation from that means the potential for motion sickness and poor VR experiences in general.
From UploadVR's story:
“As long as the GPU can hit 45 HZ we want for people to be able to run VR,” Vlachos told UploadVR after the talk. “We’ve said the recommended spec is a 970, same as Oculus, but we do want lesser GPUs to work. We’re trying to reduce the cost [of VR].”
It's interesting that Valve would be talking about a 45 FPS target now, implying there would be some kind of frame doubling or frame interpolation to get back to the 90 FPS mark that the company believes is required for a good VR experience.
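As a rough illustration of the frame-doubling idea, here is a minimal sketch in which every other 90 Hz refresh interval is filled by reprojecting the previous frame instead of rendering a new one. The function and constants are illustrative only; Valve has not detailed its implementation.

```python
# Hypothetical sketch: the GPU renders at 45 FPS, and the alternate 90 Hz
# vsync slots are filled by warping (reprojecting) the previous frame to
# the latest head pose rather than rendering from scratch.

REFRESH_HZ = 90
RENDER_HZ = 45

def schedule_frames(num_vsyncs):
    """Return, for each vsync slot, whether it shows a fresh or reprojected frame."""
    slots = []
    for vsync in range(num_vsyncs):
        if vsync % (REFRESH_HZ // RENDER_HZ) == 0:
            slots.append("fresh")        # a newly rendered frame finished in time
        else:
            slots.append("reprojected")  # warp the last frame to the new head pose
    return slots

print(schedule_frames(4))  # ['fresh', 'reprojected', 'fresh', 'reprojected']
```

The display still updates at 90 Hz either way; only the cost of generating half the frames is reduced, which is why head motion can stay smooth even when the GPU cannot keep up.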
Image source: UploadVR
Vlachos also mentioned some other avenues Valve could pursue to improve performance. One of them is "adaptive quality," a feature we first saw discussed with the release of the Valve SteamVR Performance Test. It would allow a game to lower image quality dynamically (texture detail, draw distance, etc.) based on hardware performance, but it might also include something called fixed foveated rendering. With FFR, only the center of the image is rendered at maximum detail while the surrounding image runs at lower quality; the theory being that you are focused on the center of the screen anyway, and human vision already blurs the periphery. This is similar to NVIDIA's multi-res shading technology that is already integrated into UE4, so I'm curious to see how this one shakes out.
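As a rough sketch of the fixed foveated rendering concept, the function below picks a shading-resolution scale for a pixel based on its distance from the center of the eye buffer. The radii and scale factors are made-up illustration values, not anything Valve or NVIDIA has published.

```python
# Hypothetical fixed foveated rendering: shade the center at full resolution
# and progressively drop the shading rate toward the periphery.

import math

def shading_scale(x, y, width, height):
    """Return the fraction of full shading resolution for pixel (x, y)."""
    # Normalized distance from the screen center (0 at center, 1 at a corner).
    cx, cy = width / 2, height / 2
    r = math.hypot(x - cx, y - cy) / math.hypot(cx, cy)
    if r < 0.35:
        return 1.0   # fovea: full detail
    elif r < 0.7:
        return 0.5   # mid-periphery: half resolution
    else:
        return 0.25  # outer periphery: quarter resolution

print(shading_scale(960, 540, 1920, 1080))  # center of a 1080p eye buffer -> 1.0
print(shading_scale(0, 0, 1920, 1080))      # corner -> 0.25
```

In a real renderer the "fixed" part means these zones never move with the eye; they are tied to the lens optics, which already blur the edges of the image.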
Another quote from UploadVR:
“I can run Aperture [a graphically rich Valve-built VR experience] on a 680 without dropping frames at a lower quality, and, for me, that’s enough of a proof of concept,” Vlachos said.
I have always said that neither Valve nor Oculus are going to lock out older hardware, but that they wouldn't directly support it. That a Valve developer can run its performance test (with adaptive quality) on a GTX 680 is a good sign.
The Valve SteamVR Performance Test
But the point Vlachos makes, that "most art we’re seeing in VR isn’t as dense" as in other PC titles, is a bit worrisome. We WANT VR games to reach the same image quality and realism levels that we see in modern PC titles, not depend solely on artistic angles to hit the performance levels necessary for high-quality virtual reality. Yes, the entry price for PC-based VR today is going to be steep, but I think "console-ifying" the platform would do it a disservice in the long run.
Subject: Shows and Expos | March 16, 2016 - 09:00 PM | Jeremy Hellstrom
Tagged: skulltrail, Skull Canyon, nuc, Intel, GDC
No, we are not talking about the motherboard from 2008 which was going to compete with AMD's QuadFX platform, and worked out just as well. We are talking about a brand new Skull Canyon NUC, powered by an i7-6770HQ with Iris Pro 580 graphics and up to 32GB of DDR4-2133. The NUC6i7KYK will also be the first system we have seen with a fully capable USB Type-C port, offering Thunderbolt 3, USB 3.1, and DisplayPort 1.2 connectivity; not simultaneously, but the flexibility is nothing less than impressive. It also sports a full-size HDMI 2.0 port and a Mini DisplayPort 1.2 output, so you can still send video while using the Type-C port for data transfer. The Type-C port will also support external graphics card enclosures if you plan on using this as a gaming machine.
The internal storage subsystem is equally impressive: dual M.2 slots will give you great performance; the SD card slot not so much, but it is still a handy feature. Connectivity is supplied by Intel Dual Band Wireless-AC 8260 (802.11ac) and Bluetooth 4.2, and an infrared sensor will let you use your favourite remote control if you set up the Skull Canyon NUC as a media server. All of these features fit in a device less than 0.7 litres in size, with your choice of two covers and support for your own if you desire to personalize your system. The price is not unreasonable: the MSRP for a barebones system is $650, and one with 16GB of memory, a 256GB SSD, and Windows 10 should retail for about $1000. You can expect to see these for sale on Newegg in April, shipping in May.
Shedding a little light on Monday's announcement
Most of our readers should have some familiarity with GameWorks, NVIDIA's branding for a suite of libraries and utilities that help game developers (and others) create software. Many hardware and platform vendors provide samples and frameworks of this sort, taking on the brunt of the work required to solve complex problems. NVIDIA's hope is that GameWorks pushes the industry forward, which in turn drives GPU sales as users see the benefits of upgrading.
This release, GameWorks SDK 3.1, contains three complete features and two "beta" ones. We will start with the first three, each of which targets a portion of the lighting and shadowing problem. The last two, which we will discuss at the end, are experimental and fall under the blanket of physics and visual effects.
The first technology is Volumetric Lighting, which simulates the way light scatters off dust in the atmosphere. Game developers have been approximating this effect for a long time. In fact, I remember a particular section of Resident Evil 4 where you walk down a dim hallway that has light rays spilling in from the windows. GameCube-era graphics could only do so much, though, and certain camera positions showed that the effect was just a translucent, one-sided, decorative plane. It was a cheat that was hand-placed by a clever artist.
GameWorks' Volumetric Lighting goes after the same effect, but with a much different implementation. It looks at the generated shadow maps and, using hardware tessellation, extrudes geometry from the unshadowed portions toward the light. These bits of geometry are summed according to how deep the light volume is at each pixel, which translates into the required highlight. Also, since it is hardware-tessellated, it probably has a smaller impact on performance, because the GPU only needs to store enough information to generate the geometry (and it needs to store the shadow maps anyway) rather than store and update the geometry data for every possible light shaft.
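The GPU implementation is tessellation-based, but the underlying accumulation can be sketched on the CPU: march along a view ray and sum scattered light only through the unshadowed steps. The step size, scattering coefficient, and shadow samples below are made-up values for illustration, not anything from the GameWorks SDK.

```python
# Hypothetical ray-march sketch of the light-shaft accumulation idea:
# the more unshadowed depth a view ray passes through, the brighter the
# in-scattered highlight at that pixel.

def scattered_light(in_shadow, step_size=0.5, scatter_per_unit=0.1):
    """Approximate in-scattered light along a ray from per-step shadow samples.

    in_shadow: one boolean per march step (True = shadow map says occluded).
    """
    unshadowed_depth = sum(step_size for s in in_shadow if not s)
    return unshadowed_depth * scatter_per_unit

# A ray that crosses a shadowed stretch in the middle (e.g. behind a pillar):
samples = [False, False, True, True, True, False, False, False]
print(scattered_light(samples))  # 5 unshadowed steps * 0.5 * 0.1 = 0.25
```

The tessellated-geometry trick effectively replaces this per-pixel march with additive rasterization of extruded volumes, but the quantity being computed, unshadowed depth along the ray, is the same.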
Even though it seemed like this effect is independent of render method, since it basically just adds geometry to the scene, I asked whether it was locked to deferred rendering methods. NVIDIA said that it should be unrelated, as I suspected, which is good news for VR. Forward rendering is easier to anti-alias, which makes the uneven pixel distribution (after lens distortion) appear smoother.
Subject: General Tech | March 15, 2016 - 05:32 PM | Sebastian Peak
Tagged: VRScore, VR, virtual reality, gdc 2016, GDC, crytek, CRYENGINE, benchmark, Basemark
Basemark has announced VRScore, a new benchmarking tool for VR produced in partnership with Crytek. The benchmark uses Crytek’s CRYENGINE along with the Basemark framework, and can be run with or without a head-mounted display (HMD).
"With VRScore, consumers and companies are able to reliably test their PC for VR readiness with various head mounted displays (HMDs). Unlike existing tools developed by hardware vendors themselves, VRScore has been developed independently to be an essential source of unbiased information for anyone interested in VR."
An independent solution is certainly welcome as we enter what promises to be the year of VR, and Basemark is well known for providing objective benchmark results with applications such as Basemark X and Basemark OS II, cross-platform benchmarks for mobile devices. The VRScore benchmark supports the Oculus Rift, HTC Vive, and Razer's OSVR headsets, and the corporate versions include VRTrek, a left/right-eye latency measurement device.
Here’s the list of features from Basemark:
- Supports HTC Vive, Oculus Rift and OSVR
- Uses CRYENGINE
- Supports both DirectX 12 and DirectX 11
- Features Codename: Sky Harbor, an original IP game scene by Crytek
- Includes tests for interactive VR (VR game), non-interactive VR (360 VR video) and VR spatial audio (360 sound)
- Can be used with or without an HMD
- Power Board, an integrated online service, gives personalized PC upgrading advice and features performance ranking lists for HMDs, CPUs and GPUs
- Corporate versions include VRTrek, a patent pending latency testing device with dual phototransistors for application to photon latency, display persistence, left and right eye latency, dropped frames and duplicated frames testing
VRTrek eye latency measurement device, included with the corporate version
VRScore is currently available only to corporate customers via the company’s early access program and Benchmark Development Program. The consumer versions (free and paid) will be released in June.
Subject: General Tech, Shows and Expos | February 4, 2016 - 07:47 PM | Scott Michaud
Tagged: GDC, gdc 2016, epic games, ue4, VR, vive vr
Epic Games released Unreal Engine 4 at GDC two years ago, and removed its subscription fee at the next year's show. This year, one of the things they will show is Unreal Editor in VR with the HTC Vive. Using the system's motion controllers, you will be able to move objects and access UI panels in the virtual environment. They open the video by declaring that this is not an experimental project.
Without using this technology, it's hard to comment on its usability. It definitely looks interesting, and might be useful for building VR experiences: you can see what your experience will look like as you create it, and you probably even save a bit of time in rapid iteration by not continuously donning and removing the equipment. I do wonder how precise it will be, though, since the laser pointers and objects seemed to snap and jitter a bit. That said, it might be just as precise as it appears; in the end, what really matters is how the scene looks and behaves, and nothing prevents minor tweaks after the fact anyway.
Epic Games expects to discuss the release plans at the show.
Subject: General Tech | January 20, 2016 - 07:06 PM | Scott Michaud
Tagged: vulkan, ue4, nvidia, Intel, gdc 2016, GDC, epic games, DirectX 12, Codemasters, arm, amd
The 30th Game Developers Conference (GDC) will take place March 14th through March 18th, with the expo itself starting on March 16th. The session list has now been published, with DX12 and Vulkan prominently featured. While the technologies have not been adopted as quickly as advertised, the direction is definitely forward. In fact, NVIDIA, Khronos Group, and Valve have just finished hosting a developer day for Vulkan. It is coming.
One interesting session, hosted by Codemasters and Intel, discusses bringing the F1 2015 engine to DirectX 12. It will highlight a few features they implemented, such as voxel-based ray tracing using conservative rasterization, which overestimates the coverage of individual triangles so you don't get edge artifacts on pixels that are only partially touched by a triangle edge cutting through a tiny, but not negligible, portion of them. Sites like Game Debate (Update: Whoops, forgot the link) wonder whether these features will be patched into older titles, like F1 2015, or whether they are just R&D for future games.
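As a rough sketch of what conservative rasterization does, the function below counts a pixel as covered if the triangle touches any part of the pixel square, not just its center, by relaxing each edge test by half the pixel's diagonal. This is an illustrative CPU-side approximation, not the hardware algorithm.

```python
# Hypothetical conservative coverage test. Standard rasterization only
# accepts pixels whose center is inside the triangle; the conservative
# version also accepts pixels the triangle merely clips, which matters
# for voxelization where every touched cell must be recorded.

import math

def conservative_covered(tri, px, py, pixel_size=1.0):
    """True if the triangle may overlap the pixel centered at (px, py).

    tri: three (x, y) vertices in counter-clockwise order.
    """
    half_diag = pixel_size * math.sqrt(2) / 2
    for i in range(3):
        ax, ay = tri[i]
        bx, by = tri[(i + 1) % 3]
        # Edge function: positive on the triangle's interior side of edge a->b.
        e = (bx - ax) * (py - ay) - (by - ay) * (px - ax)
        edge_len = math.hypot(bx - ax, by - ay)
        # Allow centers up to half a pixel diagonal outside the edge.
        if e < -half_diag * edge_len:
            return False  # the whole pixel square is outside this edge
    return True

tri = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
print(conservative_covered(tri, 4.4, 0.5))   # center just past the hypotenuse: True
print(conservative_covered(tri, 10.0, 10.0)) # far away: False
```

Note that this over-approximates slightly near triangle corners, which is exactly the trade-off conservative rasterization makes: never miss a touched pixel, at the cost of occasionally flagging an untouched one.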
Another session, hosted by ARM and Epic Games, will discuss bringing Vulkan to mobile through Unreal Engine 4. Mobile processors have quite a few cores, albeit ones that are slower at single-threaded tasks, and decent GPUs. Being able to keep those cores loaded will bring gaming performance closer to the GPU's theoretical limit, which has surpassed both the Xbox 360 and PlayStation 3, sometimes by a factor of 2 or more.
Many (most?) slide decks and video recordings are available for free after the fact, but we can't really know which ones ahead of time. It should be an interesting year, though.
Subject: General Tech, Shows and Expos | March 14, 2015 - 07:30 AM | Scott Michaud
Tagged: vive vr, vive, valve, re vive, Portal 2, Portal, mwc 15, MWC, htc, gdc 15, GDC
At the recent Game Developer Conference and Mobile World Congress events, Valve had a demo for HTC's Vive VR system that was based in the Portal universe. The headset is combined with two controllers, one for each hand, which sound like a cross between Valve's Steam Controller and the Razer Hydra.
When HTC briefed journalists about the technology, they brought a few examples for use with their prototype. C|Net described three: a little demo where you could paint with the controllers in a virtual space, an aquarium where you stand on a sunken pirate ship and can watch a gigantic blue whale float overhead, and the Portal-based demo that is embedded above. I also found “The Gallery” demo online, but I am not sure where it was presented (if anywhere).
Beyond VR, the Source 2 engine, which powers the Portal experience, looks good. The devices looked very intricate and full of detail. Granted, it is a lot easier to control performance when you are dealing with tight corridors or isolated rooms. The lighting also seems spot-on, although it is hard to tell whether it is dynamic or precomputed.
The HTC Vive developer kit is coming soon, before a consumer launch in the Autumn.
Subject: Graphics Cards, Mobile, Shows and Expos | March 7, 2015 - 07:00 AM | Scott Michaud
Tagged: vulkan, PowerVR, Khronos, Imagination Technologies, gdc 15, GDC
Possibly the most important feature of the upcoming graphics APIs, albeit the least interesting for enthusiasts, is how much easier driver development will become. Many decisions and tasks that once lay on the shoulders of AMD, Intel, NVIDIA, and the rest will now be given to game developers or made obsolete. You might think that game developers would oppose this burden, but (from what I understand) it is a weight they already bear, just while dealing with the symptoms instead of the root problem.
This also helps other hardware vendors become competitive. Imagination Technologies is definitely not new to the field: their graphics hardware powers the PlayStation Vita, many earlier Intel integrated graphics processors, and the last couple of iPhones. Despite how abruptly the API came about, they had a proof-of-concept driver present at GDC. The unfinished driver was running an OpenGL ES 3.0 demo that had been converted to the Vulkan API.
A screenshot of the CPU usage was also provided, although it is admittedly heavily cropped and hard to read. The graph on the left claims 1.2% CPU load with a fairly flat curve, while the one on the right claims 5% and seems to waggle more. Granted, the wobble could be partially explained by differences in the time windows they chose to profile.
According to Tom's Hardware, source code will be released “in the near future”.