Shedding a little light on Monday's announcement
Most of our readers should have some familiarity with GameWorks, a suite of libraries and utilities that help game developers (and others) create software. Many hardware and platform vendors provide samples and frameworks that shoulder much of the work required to solve complex problems; GameWorks is NVIDIA's branding for their suite of these technologies. Their hope is that it pushes the industry forward, which in turn drives GPU sales as users see the benefits of upgrading.
This release, GameWorks SDK 3.1, contains three complete features and two “beta” ones. We will start with the first three, each of which targets a portion of the lighting and shadowing problem. The last two, which we will discuss at the end, are experimental and fall under the blanket of physics and visual effects.
The first technology is Volumetric Lighting, which simulates the way light scatters off dust in the atmosphere. Game developers have been approximating this effect for a long time. In fact, I remember a particular section of Resident Evil 4 where you walk down a dim hallway that has light rays spilling in from the windows. Gamecube-era graphics could only do so much, though, and certain camera positions show that the effect was just a translucent, one-sided, decorative plane. It was a cheat that was hand-placed by a clever artist.
GameWorks' Volumetric Lighting goes after the same effect, but with a very different implementation. It looks at the generated shadow maps and, using hardware tessellation, extrudes geometry from the unshadowed portions toward the light. These bits of geometry accumulate, depending on how deep the volume is, which translates into the required highlight. Also, since it's hardware tessellated, it probably has a smaller impact on performance because the GPU only needs to store enough information to generate the geometry, not store (and update) the geometry data for all possible light shafts themselves -- and it needs to store those shadow maps anyway.
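As a rough illustration of that summing idea (this is a toy model, not NVIDIA's actual extruded-geometry implementation), the brightness of a shaft grows with the depth of lit volume a view ray passes through; the `scattering` constant here is an arbitrary placeholder:

```python
# Toy sketch: accumulating in-scattering along a view ray. The real
# GameWorks technique sums extruded, tessellated geometry instead;
# this only models the idea that deeper lit volume = brighter shaft.

def shaft_brightness(lit_segments, scattering=0.1):
    """lit_segments: list of (start, end) distances along the view ray
    that lie inside unshadowed (lit) volume."""
    depth = sum(end - start for start, end in lit_segments)
    return depth * scattering  # crude linear scattering approximation

# A ray crossing two light shafts, 2.0 and 3.0 units deep:
print(shaft_brightness([(1.0, 3.0), (5.0, 8.0)]))  # depth 5.0 * 0.1
```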
Even though it seemed like this effect was independent of render method, since it basically just adds geometry to the scene, I asked whether it was locked to deferred rendering methods. NVIDIA said that it should be unrelated, as I suspected, which is good for VR. Forward rendering is easier to anti-alias, which makes the uneven pixel distribution (after lens distortion) appear more smooth.
Subject: Graphics Cards, Mobile, Shows and Expos | February 23, 2016 - 08:46 PM | Scott Michaud
Tagged: raytracing, ray tracing, PowerVR, mwc 16, MWC, Imagination Technologies
For the last couple of years, Imagination Technologies has been pushing hardware-accelerated ray tracing. One of the major problems in computer graphics is knowing what geometry and material corresponds to a specific pixel on the screen. Several methods exist, although typical GPUs crush a 3D scene into the virtual camera's 2D space and do a point-in-triangle test on it. Once the GPU knows where in the triangle the pixel falls (if it falls inside at all), the pixel can be colored by a pixel shader.
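That point-in-triangle test can be sketched with edge functions, a common formulation in software rasterizers (the names and structure below are illustrative, not any vendor's actual code):

```python
# Minimal point-in-triangle test using edge functions, the kind of
# coverage test a rasterizer performs per pixel.

def edge(a, b, p):
    # Signed area of the parallelogram spanned by (a->b) and (a->p);
    # positive means p is to the left of edge a->b (CCW winding).
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def point_in_triangle(p, v0, v1, v2):
    w0, w1, w2 = edge(v1, v2, p), edge(v2, v0, p), edge(v0, v1, p)
    return w0 >= 0 and w1 >= 0 and w2 >= 0  # inside a CCW triangle

tri = ((0, 0), (4, 0), (0, 4))
print(point_in_triangle((1, 1), *tri))  # True: pixel is covered
print(point_in_triangle((3, 3), *tri))  # False: outside the hypotenuse
```

The three weights are also exactly the (unnormalized) barycentric coordinates used to interpolate vertex attributes across the triangle.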
Another method is casting light rays into the scene, and assigning a color based on the material that each ray lands on. This is ray tracing, and it has a few advantages. First, it is much easier to handle reflections, transparency, shadows, and other effects where information is required beyond what the affected geometry and its material provide. There are usually ways around this, without resorting to ray tracing, but they each have their own trade-offs. Second, it can be more efficient for certain data sets. Rasterization, since it's based around a “where in a triangle is this point” algorithm, needs geometry to be made up of polygons.
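For contrast, here is a minimal ray-cast against a sphere, a hypothetical sketch of the intersection primitive a ray tracer evaluates per ray. Note that the sphere needs no polygonal approximation at all, which is the data-set efficiency point above:

```python
import math

# Toy ray-sphere intersection: return the distance along the ray to
# the nearest hit, or None on a miss. A ray tracer evaluates tests
# like this instead of the rasterizer's point-in-triangle test.

def ray_sphere(origin, direction, center, radius):
    # direction is assumed to be normalized
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # discriminant of the quadratic
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # nearer of the two roots
    return t if t > 0 else None

# Ray down +z toward a unit sphere centered 5 units away:
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```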
It also has the appeal of being what the real world sort-of does (assuming we don't need to model Gaussian beams). That doesn't necessarily mean anything, though.
At Mobile World Congress, Imagination Technologies once again showed off their ray tracing hardware, embodied in the PowerVR GR6500 GPU. This graphics processor has dedicated circuitry to calculate rays, and they use it in a couple of different ways. They presented several demos that modified Unity 5 to take advantage of their ray tracing hardware. One particularly interesting demo was a quick, seven-second video that added ray traced reflections atop an otherwise rasterized scene.
It was a little too smooth, creating reflections that were too glossy, but that could probably be downplayed in the material. (Update, Feb 24th @ 5pm: Car paint is actually that glossy; it's a different issue.) Back when I was working on a GPU-accelerated software renderer, before Mantle, Vulkan, and DirectX 12, I was hoping to use OpenCL-based ray traced highlights on otherwise idle GPUs, if I didn't have any other purpose for them. Now, though, those GPUs can be exposed to graphics APIs directly, so they might not be so idle.
The downside of dedicated ray tracing hardware is that, well, the die area could have been used for something else. Extra shaders, for compute, vertex, and material effects, might be more useful in the real world... or maybe not. Add in the fact that fixed-function circuitry already exists for rasterization, and it forces you to weigh the gain against the cost.
It could be cool, but it has its trade-offs, like anything else.
Subject: Mobile, Shows and Expos | February 22, 2016 - 05:09 AM | Ryan Shrout
Tagged: video, snapdragon 820, snapdragon, qualcomm, MWC 2016, MWC, LG, G5
The new LG G5 flagship smartphone offers a unique combination of form factor, performance and modularity that no previous smartphone design has had. But will you want to buy in?
I had a feeling that the Snapdragon 820 SoC from Qualcomm would make an impression at Mobile World Congress this year, and it appears the company has improved on the previous flagship processor quite a bit. Both Samsung and LG have adopted it for their 2016 models, including the new G5, offering a combination of performance and power efficiency that is dramatically better than the Snapdragon 810, which was hindered by heat and process technology concerns.
Along with the new processor, the G5 includes 4GB of RAM, 32GB of on-board storage with micro SD expansion, a 2,800 mAh battery and Android 6.0 out of the box. The display is 5.3-in and uses LG IPS technology with a 2560x1440 resolution, resulting in an impressive 554 PPI. LG has updated the USB connection to Type-C, a move that Samsung brushed off as unnecessary at this time.
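For the curious, the quoted pixel density checks out from the listed specs: a 2560x1440 panel with a 5.3-inch diagonal works out to roughly 554 PPI.

```python
import math

# Sanity-checking LG's claimed pixel density from the spec sheet.

def ppi(width_px, height_px, diagonal_in):
    # Pixels along the diagonal divided by the diagonal in inches.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 5.3)))  # 554
```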
The phone's design is pretty standard and will look very familiar to anyone who has handled a G4 or similar flagship smartphone in recent months. It was bigger in the hand than the iPhone 6s, but considering the panel size differences, it was more compact than expected.
Modularity is the truly unique addition to the G5 though. The battery is replaceable by sliding out a bottom portion of the phone, released with a tab on the left side. This allows LG to maintain the metal body construction but still offer flexibility for power users that are used to having extra batteries in their bag. This mechanism also means LG can offer add-on modules for the phone.
The first two available will be the LG Cam Plus and the LG Hi-Fi Plus. The Cam Plus gives the phone a camera grip as well as dedicated buttons for the shutter, video recording and zoom. Including an extra 1,200 mAh of battery is a nice touch, too. The Hi-Fi Plus module has a DAC and headphone amplifier embedded in it, and it can also be connected to a PC through the USB Type-C connection.
I was overall pretty impressed with what LG had to offer with the G5. Whether or not the modular design gains any traction will have to be seen; I have concerns over the public's desire to carry around modules or affect the form factor of their phones so dramatically.
Subject: Displays, Shows and Expos | February 21, 2016 - 08:27 PM | Scott Michaud
Tagged: MWC, mwc 16, valve, htc, vive, Oculus
Valve and HTC announced that the Vive consumer edition will be available in April for $799 USD, with pre-orders beginning on February 29th. Leave it to Valve to start taking orders on a date that doesn't always exist. The system comes with the headset, two VR controllers, and two sensors. The unit will have “full commercial availability” when it launches in April, but that means little if it sells out instantly. There's no way to predict that.
The announcement blog post drops a subtle jab at Oculus. “Vive will be delivered as a complete kit” seems to refer to the Oculus Touch controllers being delayed (and thus not in the hands of every user). This also makes me think about the price. The HTC Vive costs $200 more than the Oculus Rift; that said, the Vive bundle includes its motion controllers, which could shrink that gap. It does not, however, come with a standard gamepad like the Rift does, although that's just wasted money if you already have one.
Unlike the Oculus, which has its own SDK, the Vive is powered by SteamVR. Most engines and middleware that support one seem to support both, so I'm not sure if this will matter. It could end up blocking content in an HD-DVD vs BluRay fashion. Hopefully Valve/HTC and Oculus/Facebook, or every software vendor on an individual basis, work through these interoperability concerns and create an open platform. Settling on a standard tends to commoditize industries, but that will happen to VR at some point anyway. Hopefully, if it doesn't happen sooner, cross-compatibility at least arrives then.
Subject: Mobile, Shows and Expos | February 21, 2016 - 05:14 PM | Scott Michaud
Tagged: Samsung, epic games, unreal engine 4, vulkan, galaxy s7, MWC, mwc 16
Mobile World Congress starts with a big bang... ... ... :3
Okay, not really; it starts with the formation of a star, which happens on a continual basis across the universe. I won't let facts get in the way of a pun, though.
As for the demo, it is powered by Unreal Engine 4 and runs on a Samsung Galaxy S7 with the Vulkan API. The setting seems to be some sort of futuristic laboratory that combines objects until it builds up into a star. It is bright and vibrant, with many particles, full-scene anti-aliasing, reflections, and other visual effects. The exact resolution when running on the phone was never stated, but the YouTube video was running at 1080p30, and the on-stage demo looked fairly high resolution, too.
Epic Games lists the features they added to mobile builds of Unreal Engine 4 for this demo:
- Dynamic planar reflections
- “Full” GPU particle support, which includes vector fields.
- Temporal Anti-Aliasing, which blends neighboring frames to smooth jaggies in motion.
- ASTC texture compression (created by ARM and AMD for OpenGL and OpenGL ES)
- Full scene dynamic cascaded shadows
- Chromatic aberration
- Dynamic light refraction
- Filmic tonemapping curve, which scales frames rendered in HDR to a presentable light range
- Improved static reflections
- High-quality depth of field
- Vulkan API for thousands of onscreen, independent objects.
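On the filmic tonemapping item in the list above: one well-known curve of this type is John Hable's “Uncharted 2” operator. The constants below are his published values, used here only as an illustration; Epic has not said which curve this demo ships.

```python
# Filmic tonemapping maps HDR radiance into a displayable 0-1 range
# while keeping a film-like toe and shoulder. This is John Hable's
# "Uncharted 2" curve, a common example of the technique.

def hable(x):
    A, B, C, D, E, F = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30
    return ((x * (A * x + C * B) + D * E) /
            (x * (A * x + B) + D * F)) - E / F

def tonemap(hdr, white_point=11.2):
    # Normalize so the chosen "white point" maps to exactly 1.0.
    return hable(hdr) / hable(white_point)

for v in (0.5, 2.0, 8.0):
    print(round(tonemap(v), 3))
```

The curve is monotonic and compresses bright HDR values smoothly toward 1.0, instead of clipping them the way a simple clamp would.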
The company has not stated which version of Unreal Engine 4 will receive these updates. I doubt that it will land in 4.11, which is planned for March, but they tend to release a full dot-version every one to three months. They also publish preview builds for those who wish to try new versions early; some are precompiled in the lead-up to a release, while others need to be built from GitHub.
Subject: General Tech, Shows and Expos | February 4, 2016 - 07:47 PM | Scott Michaud
Tagged: GDC, gdc 2016, epic games, ue4, VR, vive vr
Epic Games released Unreal Engine 4 at GDC two years ago, and removed its subscription fee at the next year's show. This year, one of the things that they will show is Unreal Editor in VR with the HTC Vive. Using the system's motion controllers, you will be able to move objects and access UI panels in the virtual environment. They open the video by declaring that this is not an experimental project.
Without using this technology, it's hard to comment on its usability. It definitely looks interesting, and might be useful for VR experiences. You can see what your experience will look like as you create it, and you probably even save a bit of time in rapid iteration by not continuously putting on and removing the equipment. I wonder how precise it will be, though, since the laser pointers and objects seemed to snap and jitter a bit. That said, it might be just as precise in practice; in the end, what really matters is how the result looks and behaves, and nothing prevents minor tweaks after the fact anyway.
Epic Games expects to discuss the release plans at the show.
Subject: Displays, Shows and Expos | January 9, 2016 - 02:59 AM | Scott Michaud
Tagged: CES, CES 2016, dell, ultrasharp, oled
For the longest time, display technology was stagnant. Professional monitors were 1440p IPS panels (or 2560x1600 for 16:10 models) with high-90% Adobe RGB coverage, which is useful for both video and print gamuts. Consumer monitors were based on TN technology that could maybe cover the smaller sRGB color space, which covers video. Mobile devices, due to their small size, relatively high viewing-angle requirements, and eventually high PPI, started introducing higher-end technologies to consumers. G-Sync, and later FreeSync, continued to differentiate high-end panels. Still, apart from the shift to 4K 60Hz, professional panels didn't go through an astonishing upgrade.
Image Credit: Engadget
OLED was always on the horizon, though, and it is now being integrated into consumer and professional monitors. The Dell UltraSharp U3017Q is one such display, with a 30-inch size and 4K resolution. It completely covers Adobe RGB and 97.8% of DCI-P3. DCI-P3 is not a superset of Adobe RGB; it's shifted a bit more into the reds, and it is designed for digital cinema projection. Because each OLED pixel emits its own light rather than filtering a white backlight, the panel can achieve deeper blacks and more saturated colors.
For accessories, it has a USB Type-C connector that can provide 100W of power, as well as high-speed data and apparently video.
Its pricing and availability are where we get to its downside. It will ship March 31st, which is great news for the new technology, but it will cost $4,999, which is not so amazing. That said, if companies get their hands on it, it might eventually trickle into the prosumer and consumer space, like the 4K IGZO panels did a couple of years ago.
What do our readers think?
Did it launch too early? Or does this make you interested when the price drops? Or, alternatively, are you planning on dropping a huge chunk of cash as soon as they'll take it?
Follow all of our coverage of the show at http://pcper.com/ces!
Subject: Cases and Cooling, Shows and Expos | January 8, 2016 - 06:09 PM | Ryan Shrout
Tagged: prototype, gaming case, evga, CES 2016, CES
While we have already posted our story and given you a video breakdown of the upcoming EVGA SC 17 gaming notebook, the company had another revolutionary product on display at its suite that I think our readers are going to enjoy seeing. I bring to you, the EVGA Gaming Case:
As we dive through this, keep in mind that EVGA claims this is an early prototype but that it should have product on the market by end of Q2 2016.
The case is HUGE; like, bigger than you might imagine just from looking at the pictures. The first feature that stands out is that EVGA intends for the orientation of the case to have the window facing the gamer, rotating it from a standard case direction by 90 degrees. That puts your components and lighting and all the work you put into it on full display, which is great, but it also means you'll need more space on your desk or on your floor in one dimension.
The current build is all plastic, but EVGA's CEO Andrew Han told me it would be all metal when it shipped, an impressive feat for sure. The internals are not rotated at all but EVGA was able to keep a clean look from the "new front" by including a swinging door on the back where the display connections and USB ports, etc. come out, hiding them from view.
Power supplies are hidden by the mirror finish bottom section that currently holds an easy to use, if overly simplified, intake/exhaust fan speed controller. The power button is also on the bottom right of the case, a result of the new orientation that EVGA plans for. The company will offer different versions of the case, ranging from $79 to $299 depending on features, the most expensive of which will upgrade that fan controller to a full LCD touch screen that can also display and interact with EVGA's Precision X software. Very cool!
A full window faces the user, bigger than any case I have had experience with. An illuminated EVGA logo sits on the right hiding water cooling gear you might have installed. The internals have more than enough room for basically any and all hardware, as you could guess from the size of the thing. There are zero optical drive bays - sorry Josh - but the back has space for eight 3.5-in drives and six 2.5-in drives.
It's too early for me to make a final statement, but the case is surely ambitious. If they can pull it off, EVGA might just create the ultimate enthusiast PC case without going completely insane on pricing. With targets of $79 on the low end and $299 on the high end, while keeping the same metal construction and layout throughout, this unnamed case will stay on my radar in 2016.
Subject: Cases and Cooling, Shows and Expos | January 8, 2016 - 05:38 PM | Scott Michaud
Tagged: CES, CES 2016, amd, cpu cooler, air cooler
AMD seems to be starting off 2016 right. This is the year that they intend to switch to the Zen microarchitecture, and hopefully reclaim profitable CPU market share. While that's later in the year, they showed off a new stock cooler that will be bundled with upcoming processors. We don't have a press release or announcement for it, but they did publish a video to their Red Team fan community and they discussed it with attendees of the show.
The new cooler, called the Wraith, is significantly larger than their previous stock heatsink. It is rated at 125W, up from the previous offering's 95W. This extra dissipation headroom might allow some overclocking room, depending on the chosen TDP at launch, while providing lower noise at stock voltage and frequency. The fan now runs at a constant speed, so it shouldn't whine under load, and the fixed RPM may also have let AMD tune the fan for that specific speed.
Speaking of lower noise, the aforementioned video shows a dramatic reduction in that area. We're forced to trust their recording and frequency-distribution graph, but if accurate, the noise appears to be much lower, with the energy spread out over many frequencies.
No clue when it will launch, though.
Subject: Mobile, Shows and Expos | January 8, 2016 - 04:44 PM | Scott Michaud
Tagged: CES, CES 2016, microsoft, windows 10
Microsoft is partnering with Transatel to provide cellular data services for Windows 10 PCs and tablets, but not phones. It will launch in France, the United Kingdom, and the United States, but could be rolled out to other regions over time. This will not be a contract service. Everything will be pre-paid, with short-term plans (think “XGB for the next 30 days for Y upfront”) available for a discount before a trip or something.
One downside is that compatible PCs will require a SIM card slot, which a Microsoft-branded SIM card will be inserted into. The write-up at Thurrott.com doesn't discuss external adapters, like the USB cellular modems that carriers offer and that were popular until tethering became mainstream. A few unlocked LTE USB modems can be found online, which you'd think would be compatible, but I'm not up on many of the details. I'm not a mobile enthusiast.
Despite the source being a Microsoft corporate VP, speaking on the record, it has not been officially announced by the company yet. Details, like when it will be available, have not been released.