Subject: Mobile | February 25, 2016 - 04:43 PM | Ryan Shrout
Tagged: MWC, MWC 2016, Samsung, galaxy, s7, s7 edge, qualcomm, snapdragon, snapdragon 820
I got to spend some time with the brand new Samsung Galaxy S7 and S7 Edge phones at MWC this week in Barcelona. Is this your next Android flagship phone?
Subject: Systems, Mobile | February 25, 2016 - 04:42 PM | Ryan Shrout
Tagged: MWC, MWC 2016, Huawei, matebook, Intel, core m, Skylake, 2-in-1
Huawei is getting into the PC business with the MateBook 2-in-1, built in the same vein as the Microsoft Surface Pro 4. Can they make a splash with impressive hardware and Intel Core m processors?
Subject: General Tech | February 24, 2016 - 09:41 PM | Scott Michaud
Tagged: microsoft, xamarin, Qt, .net, mono
Microsoft has purchased Xamarin, who currently maintain the Mono project.
This requires a little background. The .NET Framework was announced in 2000, and it quickly became one of the most popular frameworks for writing native Windows applications, especially simple ones. Apart from ASP.NET, which is designed for servers, support extended back to Windows 98, but it really defined applications throughout the Windows XP era. If you ever downloaded utilities that were mostly checkboxes and text elements, they were probably developed in .NET and programmed in C#.
Today, Qt and Web are very popular choices for new applications, but .NET is keeping up.
The Mono project brought the .NET framework, along with its managed languages such as C#, to Linux, Mac, and also Windows, because why not. Android and iOS versions exist from Xamarin, under the names Xamarin.iOS and Xamarin.Android, but those are proprietary. Now that Microsoft has purchased Xamarin, it would seem they now control the .NET-derived implementations on Android and iOS. The Mono project itself, as it exists for Linux, Mac, and Windows, is under open licenses, so (apart from Microsoft's patents, which have been around since day one) the framework could always be forked if the community dislikes the way it is developing. To visualize the scenario, think of when LibreOffice split from OpenOffice shortly after Oracle purchased Sun.
If they do split, however, it would likely be without iOS and Android components.
Subject: Graphics Cards, Mobile, Shows and Expos | February 24, 2016 - 01:46 AM | Scott Michaud
Tagged: raytracing, ray tracing, PowerVR, mwc 16, MWC, Imagination Technologies
For the last couple of years, Imagination Technologies has been pushing hardware-accelerated ray tracing. One of the major problems in computer graphics is knowing which geometry and material correspond to a specific pixel on the screen. Several methods exist, although typical GPUs crush a 3D scene into the virtual camera's 2D space and do a point-in-triangle test on it. Once they know where in the triangle the pixel is, if it is in the triangle at all, it can be colored by a pixel shader.
Another method is casting light rays into the scene and assigning a color based on the material each ray lands on. This is ray tracing, and it has a few advantages. First, it is much easier to handle reflections, transparency, shadows, and other effects where information is required beyond what the affected geometry and its material provide. There are usually ways around this without resorting to ray tracing, but they each have their own trade-offs. Second, it can be more efficient for certain data sets. Rasterization, since it's based on a “where in a triangle is this point” algorithm, needs geometry to be made up of polygons.
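To make the rasterization half of that comparison concrete, here is a minimal Python sketch (not any GPU's actual implementation) of the point-in-triangle test via barycentric coordinates, the same math that fixed-function rasterizers accelerate:

```python
# Minimal sketch of rasterization's core question: is this pixel inside
# this triangle? Barycentric weights answer it, and the same weights let
# a pixel shader interpolate vertex attributes across the triangle.

def barycentric(p, a, b, c):
    """Return barycentric weights (u, v, w) of 2D point p in triangle abc."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    u = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    v = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return u, v, 1.0 - u - v

def point_in_triangle(p, a, b, c):
    # Inside exactly when all three weights are non-negative.
    u, v, w = barycentric(p, a, b, c)
    return u >= 0 and v >= 0 and w >= 0

print(point_in_triangle((1, 1), (0, 0), (4, 0), (0, 4)))  # True (inside)
print(point_in_triangle((5, 5), (0, 0), (4, 0), (0, 4)))  # False (outside)
```

Ray tracing inverts this: instead of projecting triangles onto pixels, each pixel fires a ray and asks what it hits, which is why it doesn't strictly need polygons at all.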
It also has the appeal of being what the real world sort-of does (assuming we don't need to model Gaussian beams). That doesn't necessarily mean anything, though.
At Mobile World Congress, Imagination Technologies once again showed off their ray tracing hardware, embodied in the PowerVR GR6500 GPU. This graphics processor has dedicated circuitry to calculate rays, and they use it in a couple of different ways. They presented several demos that modified Unity 5 to take advantage of their ray tracing hardware. One particularly interesting demo was a quick, seven-second video that added ray-traced reflections atop an otherwise rasterized scene.
It was a little too smooth, creating reflections that were too glossy, but that could probably be downplayed in the material. (Update, Feb 24th @ 5pm: Car paint is actually that glossy. It's a different issue.) Back when I was working on a GPU-accelerated software renderer, before Mantle, Vulkan, and DirectX 12, I was hoping to use OpenCL-based ray traced highlights on idle GPUs, if I didn't have any other purposes for them. Now, though, those GPUs can be exposed to graphics APIs directly, so they might not be so idle.
The downside of dedicated ray tracing hardware is that, well, the die area could have been used for something else. Extra shaders, for compute, vertex, and material effects, might be more useful in the real world... or maybe not. Add in the fact that fixed-function circuitry already exists for rasterization, and it makes you balance gain for cost.
It could be cool, but it has its trade-offs, like anything else.
Subject: General Tech | February 23, 2016 - 05:00 PM | Sebastian Peak
Tagged: wireless headset, VOID Wireless, VOID Surround, RGB, gaming headset, gaming headphones, Corsair VOID, corsair, 7.1 headset, 7.1 channel
Corsair has released a pair of gaming headsets in their VOID lineup, with the new VOID Surround Hybrid and a white version of the VOID Wireless RGB.
The VOID Surround Hybrid Gaming Headset
"The VOID Surround Hybrid Stereo Gaming Headset brings Corsair’s most advanced gaming headset to the widest range of devices yet. VOID Surround’s mobile-compatible 3.5mm connector offers instant connectivity to virtually any audio source, as well as full headset capability with Sony’s PlayStation 4 and Microsoft’s Xbox One (requires Xbox One Wireless Controller with a 3.5mm port or Xbox One Stereo Headset Adapter)."
With the addition of a 3.5 mm analog input the Hybrid version of the VOID Surround can be used with virtually any device, though to experience surround effects the headset still needs to be connected via USB.
"For connection to a PC, VOID Surround includes a USB 7.1 Dolby headphone adapter, unlocking genuine Dolby Surround for deadly accurate positional audio, as well as a fully customizable EQ in the Corsair CUE (Corsair Utility Engine) software."
The new white version of the VOID Wireless RGB headset
The new VOID Wireless RGB headset released is simply a new white color, so specs and features remain constant from the previous options. As to pricing, MSRPs for these headsets are $79.99 for the VOID Surround Hybrid, and $129.99 for the VOID Wireless RGB, making them more affordable than some of the competition at the high end of the market.
Here are the features and specs for both headsets from Corsair:
VOID Surround Gaming Headset Specifications
- Genuine Dolby Headphone: Treat yourself to 7.1 channels of accurate and immersive surround
- Universal Compatibility: The mobile-compatible connector works with PlayStation 4, Xbox One and mobile devices. The included USB Dolby 7.1 sound card unlocks genuine Dolby Surround for PC.
- Embark on Marathon Gaming Sessions: Microfiber-wrapped memory foam ear pads enable extended play.
- Unlock Legendary Audio: Oversized 50mm neodymium drivers bring the action to life with brilliant range and precision.
- Crystal Clear Voice Communication: The noise-canceling microphone on the VOID headset puts your voice in the spotlight—and nothing else
- Microfiber/Memory Foam Earpads: Play in comfort for hours… and hours
VOID Wireless Dolby 7.1 RGB Gaming Headset – (White) Specifications
- Legendary Audio, Zero Hassle: 2.4GHz wireless freedom up to 40 ft. + 16 hours of uninterrupted gaming
- Epic Immersion and True Multi-Channel Audio: Genuine Dolby Headphone surround delivers lethally accurate 7.1 positional audio
- RGB Lighting: Sync with other Corsair RGB devices—or light your own path
- CUE Control: Instantly re-spec your gaming audio—EQ, Dolby and volume—with a single digital control.
- InfoMic: Everything you need to know about your audio status—instantly.
- Unlock Legendary Audio: Oversized 50mm neodymium drivers bring the action to life with brilliant range and precision
- Microfiber/Memory Foam Earpads: Play in comfort for hours… and hours
- Take Command: The advanced unidirectional noise-cancelling microphone makes you loud and clear
Corsair is also announcing a new feature for their Corsair Utility Engine software, called "VOID Visualizer":
"Combining a digital Corsair VOID headset (VOID Wireless, USB or Surround) with any RGB-enabled keyboard (such as the K70 RGB or Strafe RGB) enables gamers to unleash a stunning multi-color graphic equalizer on their keyboard, turning it into a real-time display of the active audio or microphone signal. Compatible with VOID Surround, VOID RGB Wireless and VOID RGB USB headsets, VOID Visualizer can be enabled with just a few clicks in the Corsair Utility Engine."
Subject: General Tech | February 23, 2016 - 04:00 PM | Sebastian Peak
Tagged: survey, mechanical keyboard, Go Mechanical Keyboard, gaming keyboard, Cherry MX
Keyboard enthusiast site Go Mechanical Keyboard recently conducted a reader survey to determine what their readers preferred in a mechanical keyboard, and the results (from 950 responses) provided some interesting data.
The data (which the site has made available in its raw format here) includes results from favorite key switch to preferred form-factor, as well as brand and model preferences. The site created an impressive infographic to display the results, which is partially reproduced here. I'd recommend a visit to Go Mechanical Keyboard to see the full version, as well as links to prior years' surveys.
Getting to a few of the results, we'll start with the all-important mechanical key switches:
Cherry MX Blue was the winner for favorite typing experience, with MX Brown switches actually winning both the gaming and all-purpose categories. Of course, key switches are a very personal choice and these results are limited to the readers of one particular site, though that does not invalidate the results. The position of the MX Brown surprised me, as my impression had been that it was less popular than a few of the other options out there. (I'm curious to see what our readers think!)
Next we'll look at the preferred form-factor (which is accompanied by a couple of other data points):
Tenkeyless (TKL) slightly edges out the next highest result, which was the "60%" form-factor. Admittedly, I had not heard of this size prior to reading these results, and here's what I found from a quick search (I retrieved the following from the Deskthority Wiki):
"60% keyboards omit the numeric keypad of a full-size keyboard, and the navigation cluster of a tenkeyless keyboard. The function key row is also removed; the escape key is consequently moved into the number row."
I'll skip ahead to the favorite overall keyboard results, which in no way could cause any disagreement or disparagement on the internet, right?
The Vortex Poker 3 was the winner, a 60% keyboard (there's that form-factor again!) offered with a variety of MX switches. These keyboards run from about $129 - $139, depending on version. A model with Cherry MX Blue switches and white backlighting is listed on Amazon for $139.99, and versions with other key switches are also listed. The CM QuickFire Rapid, a tenkeyless design that sells for under $80 was second, followed by the Corsair K70, a standard 104-key design that sells for $129.
There was quite a bit more info on the full version of the infographic, and the source article (and site) is definitely worth checking out if you're interested in mechanical keyboards. I'm curious to know what our readers prefer, too, so I'll be checking the comments!
Subject: Mobile | February 23, 2016 - 01:14 PM | Ryan Shrout
Tagged: snapdragon 820, Samsung, s7, qualcomm, MWC 2016, MWC, galaxy
No one is more excited to see the Snapdragon 820 processor in the Galaxy S7 (in some regions) than Qualcomm and Qualcomm's investors. Missing the S6 design win completely was a big blow for the SD 810, but it seems the move to FinFET technology and a new SoC design have put the SD 820 back in the driver's seat for flagship smartphones. While talking with Qualcomm's Peter Carson, Senior Director of Marketing and Modems, I learned quite a bit about the X12 LTE modem integration in the Galaxy S7 as well. As it turns out, the application processor itself isn't the only thing that has impressed OEMs or that will benefit consumers.
Modem marketers have a problem: quantifying the advantages of one LTE modem over another is troublesome and complex. It's not as simple as X% faster or X% longer battery life, though those aspects of performance do improve with better modem technology. And while the new Gigabit LTE announcement is of course getting all the media attention at Mobile World Congress this week, there is a lot of excitement internally about the shipping implementation of the S7's modem.
The Galaxy S7 encompasses the most advanced Qualcomm TruSignal antenna technology implementation to date, combining several features to add real-world benefits to the cellular performance of the device.
First, the S7 will feature the most advanced version of the antenna tuner including a closed loop feedback cycle that will tweak antenna properties in real time based on sensor data and current signal properties. If the proximity sensor is activated or you are rotating or moving the mobile device, the receiver can adjust antenna properties to improve signal reliability measurably.
The best examples fall at the cell edge, where dropped calls are common and voice quality is poor. Improving the gain of an antenna that is adversely affected simply by holding the device brings much better reliability and even data throughput. That means fewer dropped calls and network drops for users with moderate service reliability. Voice quality will improve as well, as the error rates that cause data loss in low-signal areas are reduced.
But even users with a good signal can benefit from the tech: gains of just 2-3 dB will allow the modem and receiver to drop into a lower power state, cutting modem power draw by 20%. That won't equate to a 20% total system battery life improvement, but users that depend on their phones for extended use will see benefits from this integration.
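For a sense of scale on those decibel figures: dB express a power ratio on a logarithmic scale, ratio = 10^(dB/10), so a 3 dB gain is very nearly a doubling of signal power. A quick illustration:

```python
# Decibels express a power ratio on a log scale: ratio = 10 ** (dB / 10).
# This is why a "small" 2-3 dB antenna gain is a big deal: 3 dB is
# nearly a doubling of signal power.

def db_to_power_ratio(db):
    return 10 ** (db / 10)

for gain in (2, 3, 10):
    print(f"{gain} dB -> {db_to_power_ratio(gain):.2f}x power")
# 2 dB -> 1.58x power
# 3 dB -> 2.00x power
# 10 dB -> 10.00x power
```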
Another TruSignal feature included in this modem implementation is smart transmit antenna switching. The simple explanation is that the modem can swap which antennas are in receive and transmit modes to improve transmit (upload) performance by as much as 10 dB! Based on the properties of the antenna signal, the position of the device, and whether you are in a heavy upload workload (posting photos to Facebook, a video to YouTube), TruSignal allows the modem to switch in real time.
These techniques are additive, so Galaxy S7 owners will find that both the antenna tuner and antenna switching move cellular performance forward, though Qualcomm isn't saying whether ALL implementations of Samsung's new flagship smartphone will include the features. I would have guessed that we'll see this on the Snapdragon 820 + X12 powered models only, and it turns out that is the case: the versions of the S7 that use the Samsung Exynos SoC pair it with a non-Qualcomm modem, so they will not support the features described here.
Subject: Storage | February 23, 2016 - 01:05 PM | Allyn Malventano
Tagged: DroboPro, drobo, B810i, B800i
The B810i comes with several improvements over preceding products in the line:
- 180 MB/s reads / 110 MB/s writes (across a pair of iSCSI-enabled Gigabit Ethernet ports)
- New 64TB max volume size
- Data Tiering
- SSDs installed as part of the array are automatically assigned to caching duties.
- Cache performance is claimed 5-10x faster than the 'cold' HDD tier.
- Cache Pre-heat
- Metadata describing the contents / duplicated data in the cache is also saved to the array, meaning the cache can survive a reboot of the device.
- Accelerated self-healing
- Drobo claims rebuilds are now 8x faster. This is due to increased parallelism taking place during that process.
- This is in addition to Drobo rebuilds that have only ever needed to re-duplicate the data present (and not all disks front to back as with traditional RAID).
- This is the same near-bulletproof system that has proven itself extremely resistant to failure (but remember, RAID is *not* a backup!).
Along with this launch, Drobo is running a promotion: B810i purchases made by 4/30/2016 ($1,699) will include two free 2TB HDDs.
The B810i replaces the B800i in the current Drobo lineup:
We're working on a round of NAS / SAN pieces here...
...along with an ioSafe 1515+, which would have collapsed the desk if I had tried to fit it into this picture. That 75 lb beast will have to stay on the floor :).
Subject: Graphics Cards | February 22, 2016 - 11:03 PM | Ryan Shrout
Tagged: vive, valve, steamvr, steam, rift, performance test, Oculus, htc
Though I am away from my stacks of hardware at the office attending Mobile World Congress in Barcelona, Valve dropped a bomb on us today in the form of a new hardware performance test that gamers can use to determine if they are ready for the SteamVR revolution. The aptly named "SteamVR Performance Test" is a free title available through Steam that any user can download and run to get a report card on their installed hardware. No VR headset required!
And unlike the Oculus Compatibility Checker, the application from Valve runs actual game content to measure your system. Oculus' app only looks at the hardware on your system for certification, not taking into account the performance of your system in any way. (Overclockers and users with Ivy Bridge Core i7 processors have been reporting failed results on the Oculus test for some time.)
The SteamVR Performance Test runs a set of scenes from the Aperture Science Robot Repair demo, an experience developed directly for the HTC Vive and one that I was able to run through during CES last month. Valve is using a very interesting new feature called "dynamic fidelity" that adjusts image quality of the game in a way to avoid dropped frames and frame rates under 90 FPS in order to maintain a smooth and comfortable experience for the VR user. Though it is the first time I have seen it used, it sounds similar to what John Carmack did with the id Tech 5 engine, attempting to balance performance on hardware while maintaining a targeted frame rate.
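Valve hasn't published how dynamic fidelity works internally, but the idea is a feedback loop that trades render quality for frame time. Here is a hypothetical Python sketch (all names and constants are invented for illustration, not Valve's algorithm):

```python
# Hypothetical sketch of a "dynamic fidelity" controller -- NOT Valve's
# actual algorithm. The idea: keep frame time under the 90 FPS budget
# (~11.1 ms) by stepping render quality (e.g. resolution scale) down
# on missed frames and back up when there is headroom.

TARGET_MS = 1000.0 / 90.0   # ~11.11 ms per frame at 90 FPS

def adjust_fidelity(fidelity, frame_ms, lo=0.5, hi=2.0):
    """Step quality down on a missed budget, up when there is headroom."""
    if frame_ms > TARGET_MS:            # dropped below 90 FPS: back off
        fidelity *= 0.9
    elif frame_ms < TARGET_MS * 0.8:    # comfortable headroom: push quality
        fidelity *= 1.05
    return max(lo, min(hi, fidelity))

# Simulate a GPU whose frame time scales linearly with fidelity.
fidelity, cost_per_unit = 2.0, 9.0     # 2.0 fidelity -> 18 ms: too slow
for frame in range(40):
    frame_ms = fidelity * cost_per_unit
    fidelity = adjust_fidelity(fidelity, frame_ms)
print(f"settled near fidelity {fidelity:.2f}")  # prints "settled near fidelity 1.18"
```

The controller converges on whatever quality level the simulated hardware can sustain within the 11.1 ms budget, which is essentially what the test's final "Average Fidelity" number expresses.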
The technology could be a perfect match for VR content where frame rates above or at the 90 FPS target are more important than visual fidelity (in nearly all cases). I am curious to see how Valve may or may not pursue and push this technology in its own games and for the Vive / Rift in general. I have some questions pending with them, so we'll see what they come back with.
A result for a Radeon R9 Fury provided by AMD
Valve's test offers a very simple three tiered breakdown for your system: Not Ready, Capable and Ready. For a more detailed explanation you can expand on the data to see metrics like the number of frames you are CPU bound on, frames below the very important 90 FPS mark and how many frames were tested in the run. The Average Fidelity metric is the number that we are reporting below and essentially tells us "how much quality" the test estimates you can run at while maintaining that 90 FPS mark. What else that fidelity result means is still unknown - but again we are trying to find out. The short answer is that the higher that number goes, the better off you are, and the more demanding game content you'll be able to run at acceptable performance levels. At least, according to Valve.
Because I am not at the office to run my own tests, I decided to write up this story using results from a third party. That third party is AMD - let the complaining begin. Obviously this does NOT count as independent testing but, in truth, it would be hard to cheat on these results unless you go WAY out of your way to change control panel settings, etc. The demo is self-run, and AMD detailed the hardware and drivers used in the results.
- Intel i7-6700K
- 2x4GB DDR4-2666 RAM
- Z170 motherboard
- Radeon Software 16.1.1
- NVIDIA driver 361.91
- Win10 64-bit
Average Fidelity results:

- 2x Radeon R9 Nano: 11.0
- GeForce GTX 980 Ti: 11.0
- Radeon R9 Fury X: 9.6
- Radeon R9 Fury: 9.2
- GeForce GTX 980: 8.1
- Radeon R9 Nano: 8.0
- Radeon R9 390X: 7.8
- Radeon R9 390: 7.0
- GeForce GTX 970: 6.5
These results were provided by AMD in an email to the media. Take that for what you will until we can run our own tests.
First, the GeForce GTX 980 Ti is the highest performing single GPU tested, with a score of 11 - because of course it goes to 11. The same score is reported on the multi-GPU configuration with two Radeon R9 Nanos, so clearly we are seeing a ceiling in this version of the SteamVR Performance Test. With a single R9 Nano score of 8.0, that is only a 37.5% scaling rate, but I think we are limited by the test in this case. Either way, it's great news to see that AMD has affinity multi-GPU up and running, utilizing one GPU for each eye's rendering. (AMD pointed out that users that want to test the multi-GPU implementation will need to add the -multigpu launch option.) I still need to confirm if GeForce cards scale accordingly. UPDATE: Ken at the office ran a quick check with a pair of GeForce GTX 970 cards with the same -multigpu option and saw no scaling improvements. It appears NVIDIA has work to do here.
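For reference, the scaling figure comes straight from the published scores; a quick sketch of the arithmetic:

```python
# Multi-GPU scaling from the scores AMD provided: how much a second GPU
# improved the result over a single card of the same model.

def scaling_percent(single_score, dual_score):
    """Percentage improvement of the dual-GPU score over the single-GPU score."""
    return (dual_score / single_score - 1.0) * 100.0

single_nano, dual_nano = 8.0, 11.0
print(f"2x R9 Nano scaling: +{scaling_percent(single_nano, dual_nano):.1f}%")
# prints "2x R9 Nano scaling: +37.5%" -- well short of an ideal +100%,
# consistent with the score ceiling this version of the test appears to have.
```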
Moving down the stack, it's clear why AMD was so excited to send out these early results. The R9 Fury X and R9 Fury both come out ahead of the GeForce GTX 980, while the R9 Nano, R9 390X and R9 390 score better than NVIDIA's GeForce GTX 970. This comes as no surprise - AMD's Radeon parts tend to offer better performance per dollar in benchmarks and many games.
There is obviously a lot more to consider than the results this SteamVR Performance Test provides when picking hardware for a VR system, but we are glad to see Valve out in front of the many, many questions that are flooding forums across the web. Is your system ready??
Subject: Processors, Mobile | February 22, 2016 - 04:11 PM | Sebastian Peak
Tagged: TSMC, SoC, octa-core, MWC 2016, MWC, mediatek, Mali-T880, LPDDR4X, Cortex-A53, big.little, arm
MediaTek might not be well-known in the United States, but the company has been working to expand from China, where it had a 40% market share as of June 2015, into the global market. While 2015 saw the introduction of the 8-core Helio P10 and the 10-core Helio X20 SoCs, the company continues to expand their lineup, today announcing the Helio P20 SoC.
There are a number of differences between the recent SoCs from MediaTek, beginning with the CPU core configuration. This new Helio P20 is a “True Octa-Core” design, but rather than a big.LITTLE configuration it’s using 8 identically-clocked ARM Cortex-A53 cores at 2.3 GHz. The previous Helio P10 used a similar CPU configuration, though clocks were limited to 2.0 GHz with that SoC. Conversely, the 10-core Helio X20 uses a tri-cluster configuration, with 2x ARM Cortex-A72 cores running at 2.5 GHz, along with a typical big.LITTLE arrangement (4x Cortex-A53 cores at 2.0 GHz and 4x Cortex-A53 cores at 1.4 GHz).
Another change affecting MediaTek’s new SoC and the industry at large is the move to smaller process nodes. The Helio P10 was built on 28 nm HPM, and this new P20 moves to 16 nm FinFET. Just as with the Helio P10 and the Helio X20 (a 20 nm part), this SoC is produced at TSMC, now using their 16FF+ (FinFET Plus) technology. This should provide up to “40% higher speed and 60% power saving” compared to TSMC's previous 20 nm process found in the Helio X20, though of course real-world results will have to wait until handsets are available to test.
The Helio P20 also takes advantage of LPDDR4X, and is “the world’s first SoC to support low power double data rate random access memory” according to MediaTek. The company says this new memory provides “70 percent more bandwidth than the LPDDR3 and 50 percent power savings by lowering supply voltage to 0.6v”. Graphics are powered by ARM’s high-end Mali-T880 GPU, clocked at an impressive 900 MHz. And all-important modem connectivity includes Cat 6 LTE with 2x carrier aggregation for speeds of up to 300 Mbps down and 50 Mbps up. The Helio P20 also supports up to 4K/30 video decode with H.264/265 support, and the 12-bit dual camera ISP supports up to 24 MP sensors.
Specs from MediaTek:
- Process: 16nm
- Apps CPU: 8x Cortex-A53, up to 2.3GHz
- Memory: Up to 2 x LPDDR4X 1600MHz (up to 6GB) + 1x LPDDR3 933MHz (up to 4GB) + eMMC 5.1
- Camera: Up to 24MP at 24FPS w/ZSD, 12bit Dual ISP, 3A HW engine, Bayer & Mono sensor support
- Video Decode: Up to 4Kx2K 30fps H.264/265
- Video Encode: Up to 4Kx2K 30fps H.264
- Graphics: Mali-T880 MP2 900MHz
- Display: FHD 1920x1080 60fps. 2x DSI for dual display
- Modem: LTE FDD TDD R.11 Cat.6 with 2x20 CA. C2K SRLTE. L+W DSDS support
- Connectivity: WiFiac/abgn (with MT6630). GPS/Glonass/Beidou/BT/FM.
- Audio: 110dB SNR & -95dB THD
It’s interesting to see SoC makers experiment with less complex CPU designs after a generation of multi-cluster (big.LITTLE) SoCs, as even the current flagship Qualcomm SoC, the Snapdragon 820, has reverted to a straight quad-core design. The P20 is expected to be in shipping devices by the second half of 2016, and we will see how this configuration performs once some devices using this new P20 SoC are in the wild.
Full press release after the break:
Subject: Mobile, Shows and Expos | February 22, 2016 - 10:09 AM | Ryan Shrout
Tagged: video, snapdragon 820, snapdragon, qualcomm, MWC 2016, MWC, LG, G5
The new LG G5 flagship smartphone offers a unique combination of form factor, performance and modularity that no previous smartphone design has had. But will you want to buy in?
I had a feeling that the Snapdragon 820 SoC from Qualcomm would make an impression at Mobile World Congress this year, and it appears the company has improved on its previous flagship processor quite a bit. Both Samsung and LG have implemented it in their 2016 models, including the new G5, offering a combination of performance and power efficiency that is dramatically better than the 810, which was hindered by heat and process technology concerns.
Along with the new processor, the G5 includes 4GB of RAM, 32GB of on-board storage with micro SD expansion, a 2,800 mAh battery and Android 6.0 out of the box. The display is 5.3-in and uses LG IPS technology with a 2560x1440 resolution, resulting in an impressive 554 PPI. LG has updated the USB connection to Type-C, a move that Samsung brushed off as unnecessary at this time.
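That 554 PPI figure checks out: pixel density is just the diagonal pixel count divided by the diagonal size in inches. A quick Python check:

```python
# Pixel density (PPI) = diagonal resolution in pixels / diagonal size in
# inches. Verifying LG's claimed 554 PPI for the G5's 5.3-inch 2560x1440 panel.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f"{ppi(2560, 1440, 5.3):.0f} PPI")  # prints "554 PPI"
```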
The phone's design is pretty standard and will look very familiar to anyone who has handled a G4 or similar flagship smartphone in recent months. It was bigger in the hand than the iPhone 6s, but considering the panel size difference, it was more compact than expected.
Modularity is the truly unique addition to the G5 though. The battery is replaceable by sliding out a bottom portion of the phone, released with a tab on the left side. This allows LG to maintain the metal body construction but still offer flexibility for power users that are used to having extra batteries in their bag. This mechanism also means LG can offer add-on modules for the phone.
The first two available will be the LG Cam Plus and the LG Hi-Fi Plus. The Cam Plus gives the phone a camera grip as well as dedicated buttons for the shutter, video recording and zoom; the extra 1,200 mAh of battery is a nice touch too. The Hi-Fi Plus module has a DAC and headphone amplifier embedded in it and can also be connected to a PC through the USB Type-C connection.
I was overall pretty impressed with what LG had to offer with the G5. Whether or not the modular design gains any traction will have to be seen; I have concerns over the public's desire to carry around modules or affect the form factor of their phones so dramatically.
Subject: Displays, Shows and Expos | February 22, 2016 - 01:27 AM | Scott Michaud
Tagged: MWC, mwc 16, valve, htc, vive, Oculus
Valve and HTC announced that the Vive consumer edition will be available in April for $799 USD, with pre-orders beginning on February 29th. Leave it to Valve to launch a product on a date that doesn't always exist. The system comes with the headset, two VR controllers, and two sensors. The unit will have “full commercial availability” when it launches in April, but that means little if it sells out instantly. There's no way to predict that.
The announcement blog post drops a subtle jab at Oculus. “Vive will be delivered as a complete kit” seems to refer to the Oculus Touch controllers being delayed (and thus not in the hands of every user). This also makes me think about the price. The HTC Vive costs $200 more than the Oculus Rift. That said, it also includes the motion controllers, which could shrink that gap. It does not, however, come with a standard gamepad like the Oculus does, although that's just wasted money if you already have one.
Unlike the Oculus, which has its own SDK, the Vive is powered by SteamVR. Most engines and middleware that support one seem to support both, so I'm not sure if this will matter. It could end up blocking content in an HD-DVD vs. Blu-ray fashion. Hopefully Valve/HTC and Oculus/Facebook, or every software vendor on an individual basis, work through these interoperability concerns and create an open platform. Settling on a standard tends to commoditize industries, and that will eventually happen to VR at some point anyway. Hopefully, if it doesn't happen sooner, cross-compatibility at least arrives then.
Subject: Mobile, Shows and Expos | February 21, 2016 - 10:14 PM | Scott Michaud
Tagged: Samsung, epic games, unreal engine 4, vulkan, galaxy s7, MWC, mwc 16
Mobile World Congress starts with a big bang... ... ... :3
Okay, not really; it starts with the formation of a star, which happens on a continual basis across the universe. I won't let facts get in the way of a pun, though.
As for the demo, it is powered by Unreal Engine 4 and runs on a Samsung Galaxy S7 with the Vulkan API. The setting seems to be some sort of futuristic laboratory that combines objects until it builds up into a star. It is bright and vibrant, with many particles, full-scene anti-aliasing, reflections, and other visual effects. The exact resolution when running on the phone was never stated, but the YouTube video was running at 1080p30, and the on-stage demo looked fairly high resolution, too.
Epic Games lists the features they added to mobile builds of Unreal Engine 4 for this demo:
- Dynamic planar reflections
- “Full” GPU particle support, which includes vector fields.
- Temporal Anti-Aliasing, which blends neighboring frames to smooth jaggies in motion.
- ASTC texture compression (created by ARM and AMD for OpenGL and OpenGL ES)
- Full scene dynamic cascaded shadows
- Chromatic aberration
- Dynamic light refraction
- Filmic tonemapping curve, which scales frames rendered in HDR to a presentable light range
- Improved static reflections
- High-quality depth of field
- Vulkan API for thousands of onscreen, independent objects.
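Epic hasn't published the exact tonemapping curve used in this demo, but the widely known ACES filmic approximation (Krzysztof Narkowicz's fit) illustrates what a "filmic tonemapping curve" does: compress an HDR radiance value into the displayable 0..1 range with a film-like toe and shoulder. This Python sketch is an illustration of the general technique, not Epic's implementation:

```python
# ACES filmic approximation (Krzysztof Narkowicz's fit) -- a common
# stand-in for "filmic tonemapping": an S-curve that maps unbounded HDR
# values to 0..1, rolling highlights off gently instead of clipping.

def aces_filmic(x):
    a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
    y = (x * (a * x + b)) / (x * (c * x + d) + e)
    return min(max(y, 0.0), 1.0)   # clamp to the displayable range

for hdr in (0.0, 0.18, 1.0, 4.0, 16.0):
    print(f"HDR {hdr:5.2f} -> LDR {aces_filmic(hdr):.3f}")
```

Note how mid-grey input (0.18) gets lifted while very bright values asymptotically approach 1.0 rather than blowing out; that highlight roll-off is the "filmic" look.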
The company has not stated which version of Unreal Engine 4 will receive these updates. I doubt that it will land in 4.11, which is planned for March, but they tend to release a full dot-version every one to three months. They also have early previews for those who wish to try it early, some compiled leading up to launch, and others that need to be built from GitHub.
Subject: Mobile | February 21, 2016 - 07:52 PM | Sebastian Peak
Tagged: snapdragon 820, smartphone, qualcomm, MWC 2016, MWC, modular phone, LG G5, LG, ips, G5, Android
LG has officially unveiled their newest flagship Android handset, and in addition to high-end specs the G5 features a unique modular construction.
The LG G5
The G5 is powered by the new Snapdragon 820 SoC, and offers a 5.3-inch, 2560x1440 IPS display (making it slightly smaller than the earlier G4, which was a 5.5-inch device with the same resolution). And while the G5 looks every bit a sleek Android flagship, there’s more going on here than the typical sealed handset. LG has implemented a modular design, where optional components can be added through a port on the bottom of the phone.
The LG Cam Plus (left) and Hi-Fi Plus (right)
The first of two announced modules is the LG Cam Plus, which is a camera grip that also adds 1200 mAh to the battery capacity (for a total of 4000 mAh). The second is the LG Hi-Fi Plus, which adds a high-resolution DAC and headphone amp to the phone. The headphone amp is “tuned by B&O”, and the DAC supports up to 32-bit / 384 kHz. The Hi-Fi Plus can also be used as a standalone USB device.
(Image via Android Police)
One of the features that had leaked ahead of the announcement was an always-on display, leading to speculation about the use of an OLED panel. But this is LG we are talking about, and they have implemented a high-DPI (554) IPS display instead. So how does this always-on display feature avoid aggressively draining your battery? The post from ComputerBase offers this analysis:
“Instead, the company opted to optimize the display drivers and power management in order to permanently show notifications, the time, date, and other information on the large main screen. These adjustments make it possible, for example, to limit the backlight to only part of the screen. According to LG, the activated always-on function consumes 0.8 percent of the battery charge per hour thanks to these optimizations.”
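Taken at face value, LG's 0.8-percent-per-hour figure is easy to put in context with some quick back-of-the-envelope arithmetic (assuming the G5's stock 2800 mAh battery and a full 24 hours of always-on operation):

```python
drain_rate = 0.008    # 0.8% of the battery charge per hour (LG's figure)
battery_mah = 2800    # the G5's stock battery capacity
hours = 24

daily_drain = drain_rate * hours              # fraction of charge used per day
print(f"Daily drain: {daily_drain:.1%}")      # 19.2% of the battery per day
print(f"That is roughly {daily_drain * battery_mah:.0f} mAh per day, "
      f"or about {drain_rate * battery_mah:.1f} mAh per hour")
```

Roughly a fifth of the battery per day is not nothing, but it is far less than an OLED-style always-on implementation would cost a fully backlit LCD without these driver tricks.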
Specs via Android Central:
- Display: 5.3-inch IPS quad-HD quantum display (2560x1440, 554 dpi)
- Processor: Snapdragon 820
- Storage: 32GB UFS ROM, microSD up to 2TB
- RAM: 4GB LPDDR4
- Rear camera: 16MP main, 8MP wide-angle (135 degrees)
- Front camera: 8MP
- Battery: 2800 mAh removable
- Modules: LG Cam Plus (camera grip with 1200 mAh), LG Hi-Fi Plus with B&O Play
- Dimensions: 149.4 x 73.9 x 7.7mm
- Weight: 159 grams
- Networks: LTE/3G/2G
- Connectivity: Wifi 802.11a/b/g/n/ac, USB Type C, NFC, Bluetooth 4.2
- Colors: Silver/Titan/Gold/Pink
- Operating system: Android 6.0.1
There were three additional accessories announced with the phone: the 360 VR (a VR headset), the 360 CAM (for creating 360-degree movies and photos), and something called the Rolling Bot (a Wi-Fi connected sphere equipped with a camera, mic, and speaker).
Ryan had hands-on time with the G5 from LG's booth at MWC 2016:
No specific pricing or release date has been announced yet, but we should know more next month, when LG is expected to provide more release details.
Subject: Mobile | February 21, 2016 - 06:00 PM | Sebastian Peak
Tagged: VIBE K5 Plus, VIBE K5, Snapdragon 616, Snapdragon 415, smartphone, qualcomm, MWC 2016, MWC, Lenovo, Android
Lenovo has announced a new pair of smartphones in their VIBE series, and these offer very impressive specs considering the asking price.
The VIBE K5 will retail for $129, with the K5 Plus slightly higher at $149. What does this get you? Both are 5-inch devices, with a modest 1280x720 resolution on the standard K5, or FHD 1920x1080 on the K5 Plus. The phones are both powered by Qualcomm SoCs, with a Snapdragon 415 in the K5 (octa-core, up to 1.4 GHz) and the faster Snapdragon 616 (octa-core, up to 1.7 GHz) in the K5 Plus.
Here’s a look at the specifications for these phones:
- Screen: 5.0” HD (1280x720) display (K5) or IPS Full HD (1920x1080) (K5 Plus)
- Processor: Qualcomm Snapdragon 415 octa-core (K5) or Snapdragon 616 octa-core (K5 Plus)
- Memory/Storage: 2GB LPDDR3 RAM | 16GB eMCP built-in storage | up to 32GB microSD expandable storage
- Graphics: Adreno 405: up to 550MHz 3D graphics accelerator
- Camera: Rear: 13MP with 5-piece lens and FHD video recording, Front: 5MP fixed-focus with 4-piece lens
- Connectivity: Dual SIM slots with 4G LTE connectivity + BT 4.1; WLAN: Wi-Fi 802.11 b/g/n, Wi-Fi hotspot
- Battery: 2750mAh interchangeable battery
- Audio: 2 x speakers, 2 x mics, 3.5 mm audio jack, Dolby Atmos
- Thickness: 8.2 mm (.32 in)
- Weight: 142 g (5 oz)
- OS: Android 5.1, Lollipop
On paper these smartphones present a compelling value reminiscent of the ASUS Zenfone 2, with the K5 Plus easily the better bargain, offering a 1920x1080 IPS display and octa-core processor for $149. We’ll have to wait to pass judgment until the UI performance and camera have been tested, but these new VIBE K5 phones certainly look like a promising option.
The VIBE K5 and K5 Plus will be available in March.
Subject: Systems, Mobile | February 21, 2016 - 06:00 PM | Sebastian Peak
Tagged: x5 Z8300, windows 10, tablet, MWC 2016, MWC, MIIX 310, Lenovo, ips, intel atom, convertible tablet, 2-in-1
The Lenovo ideapad MIIX 310 is a 2-in-1 that combines a 10.1-inch tablet with a detachable keyboard, and when you consider the specs Lenovo is pricing this very aggressively at $229 - including the keyboard.
“This 10-inch tablet is one of the most affordable devices that not only combines both tablet and PC in one, but unlike many of its rivals, comes with a detachable keyboard as standard. The ideapad MIIX 310 boasts an optional FHD display, making movie marathons that much more immersive.”
The $229 retail is a starting price, and the 1920x1080 IPS screen option will cost you more (just how much is not yet known). Beyond the display the MIIX 310 is powered by an Intel Atom x5-Z8300, a quad-core processor that operates at up to 1.84 GHz. Memory is limited to 2 GB, with up to 128 GB of eMMC storage available.
Here’s a look at the specifications:
- CPU: Intel Atom x5 Z8300 CPU
- Graphics: Integrated Intel
- Screen: 10.1” up to FHD (1920x1080) IPS, 300 nits
- Cameras: 2MP front & 5MP rear camera
- Battery: Up to 10 hours local video playback
- Memory: 2GB RAM
- Storage: Up to 128GB eMMC
- Audio: Stereo Speakers
- Connectivity: 802.11 b/g/n + BT 4.0
- LTE Support: Optional
- OS: Windows 10 Home
As mentioned above, the ideapad MIIX 310 will start at $229, with availability set for June.
Subject: Systems, Mobile | February 21, 2016 - 06:00 PM | Sebastian Peak
Tagged: YOGA 710, YOGA 510, yoga, windows 10, notebook, MWC 2016, MWC, Lenovo, laptop, ips, convertible tablet, 2-in-1
Lenovo has announced a pair of new convertible laptop options with the YOGA 710 and YOGA 510, and each of these new models are available in two sizes.
First we have the YOGA 710, which is available in both an 11-inch and a 14-inch version. The smaller 11-inch model is limited to an Intel Core m5 processor, while the 14-inch version offers a 6th-gen (Skylake) Intel Core i7 CPU. Here's a look at the available specs:
YOGA 710, 11-inch:
- Screen: 11.6” FHD 1920x1080 IPS Touch; 300 nits
- CPU: Up to Intel 6th Gen Core m5 CPU
- Memory: Up to 8GB LP-DDR3
- Storage: Up to 256GB SSD
- Graphics: Integrated Intel
- Audio: Stereo speakers with Dolby Audio certification
- Battery: 40Whr; up to 8 hours
- Webcam: 1MP Fixed Focus CMOS camera (720p)
- Connectivity: 1x1 or 2x2 A/C WiFi + Bluetooth 4.1
- Ports: 1x always-on USB 3.0, Micro-HDMI, audio combo jack
- OS: Windows 10 Home
YOGA 710, 14-inch:
- Screen: 14” FHD 1920x1080 IPS Touch; 300 nits
- CPU: Up to Intel 6th Gen Core i7 CPU
- Memory: Up to 8GB DDR4
- Storage: Up to 256GB SSD
- Graphics: Optional NVIDIA GeForce 940MX
- Audio: JBL Speakers with Dolby Audio certification
- Battery: Up to 52.5Whr; up to 8.5 hours local HD video playback @200nits
- Webcam: 1MP Fixed Focus CMOS camera (720p)
- Connectivity: 2x2 A/C WiFi + Bluetooth 4.1
- Ports: 1x always-on USB 3.0, 1x USB 3.0, Micro-HDMI, SDXC Reader, Display Port (combo with HDMI), audio combo jack
- OS: Windows 10 Home
Next we have the YOGA 510, which is available in both 14-inch and 15-inch versions, and promises up to 8.5 hours of battery life.
Specs on these models include:
- Screen: 14” & 15” FHD 1920x1080 IPS Touch; 250 nits
- CPU: Up to Intel 6th Gen Core i7 CPU or Pentium
- Memory: Up to 8GB DDR4
- Storage: Up to 1TB HDD or up to 256GB SSD
- Graphics: 14: Up to AMD Radeon R5 M430; 15: Up to AMD Radeon R7 M460 2GB
- Audio: Stereo Speakers with Audio by Harman Kardon
- Keyboard: Optional Backlit keyboard
- Battery: Up to 52.5 Whr; up to 8.5 hours local HD video playback @200nits
- Webcam: 1MP Fixed Focus CMOS camera (720p)
- Connectivity: 1x1 A/C WiFi + Bluetooth 4.1, GIGA LAN
- Ports: 1x always-on USB 2.0, 2x USB 3.0, HDMI, SDXC Card Reader, audio combo jack
- OS: Windows 10 Home
These new YOGA models will be available in July, and pricing was announced as follows:
- Yoga 710 11-inch $499; 14-inch $799
- Yoga 510 14-inch $599; 15-inch $699
Subject: Mobile | February 21, 2016 - 05:56 PM | Ryan Shrout
Tagged: MWC, MWC 2016, qualcomm, snapdragon, snapdragon wear
Earlier this month, Qualcomm announced the creation of the Snapdragon Wear platform and the Snapdragon Wear 2100 SoC, the very first in a new family of products built to address the consumer wearables market. Even though the Snapdragon 400 series of processors had already found its way into a large majority (65%, according to Qualcomm) of currently shipping Android Wear watches, Qualcomm hopes that the improvements in the Snapdragon Wear 2100 will grow the company's market share and improve the experiences that users have with wearable products.
Snapdragon Wear 2100 offers several advantages over the Snapdragon 400 series of SoCs:
Utilizing Qualcomm Technologies’ expertise in connectivity and compute, the Snapdragon Wear platform consists of a full suite of silicon, software, support tools, and reference designs to allow mobile, fashion, and sports customers to bring a diverse range of full-featured wearables to customers quickly. Available in both tethered (Bluetooth® and Wi-Fi®) and connected (4G/LTE and 3G) versions, Snapdragon Wear 2100 innovates along four wearables core vectors:
- Smaller Size – 30 percent smaller than the popular Snapdragon 400, Snapdragon Wear 2100 can help enable new, thinner, sleeker designs
- Lower Power – 25 percent lower power than the Snapdragon 400 across both tethered and connected use cases, allowing for longer day of use battery life
- Smarter Sensors – With an integrated, ultra-low power sensor hub, Snapdragon Wear 2100 enables richer algorithms with greater accuracy than the Snapdragon 400
- Always Connected – Next-generation LTE modem with integrated GNSS, along with low power Wi-Fi and Bluetooth delivers an always connected experience
There is no direct mention of comparative performance though, something I am looking to get answered this week.
This week's announcement from Qualcomm is the addition of three new partners for the Snapdragon Wear platform, on top of launch partner LG. The new names might not be household brands, but they represent a strong growth segment for Qualcomm as more vendors enter the wearables market through ODMs.
- Borqs – A global leader in software and products for IoT providing customizable, differentiated and scalable Android-based smart connected devices and cloud service solutions, Borqs is offering connected (3G/4G) and tethered (Wi-Fi®/Bluetooth®) smartwatch and kid watch reference designs based on Snapdragon Wear 2100.
- Compal – A global manufacturer of notebook PCs, smartphone, tablet and display products and smart wearable devices, Compal is delivering reference designs and device production based on Snapdragon Wear 2100 supporting both Android Wear and Android operating systems and targeting connected (3G/4G) and tethered (Wi-Fi/Bluetooth) use cases.
- Infomark – An early innovator in the emerging kid watch segment, where the company has previously launched two generations of products (JooN1, JooN2) based on Qualcomm Technologies, Infomark is offering a reference design based on Snapdragon Wear 2100 targeting kid and elderly watch segments.
I should be getting hands-on with hardware built on the Snapdragon Wear 2100 SoC from LG and these three new partners this week while at Mobile World Congress 2016, so stay tuned for more coverage!
Subject: Mobile | February 21, 2016 - 05:18 PM | Ryan Shrout
Tagged: MWC, MWC 2016, qualcomm, vulkan, snapdragon, snapdragon 820, adreno 530
As we prepare for the onslaught of new mobile devices and technologies being announced at Mobile World Congress in Barcelona, the low-level Vulkan API begins its campaign to take hold in the PC and mobile spaces, superseding today's OpenGL standard in hopes of making more efficient use of compute resources across the industry.
Qualcomm announced official support for the Vulkan API on its Adreno 530 GPU and the Snapdragon 820 processor. Vulkan support will also be coming to other, as-yet-unannounced Adreno 5xx series GPUs and to currently shipping Adreno 4xx GPUs, which leaves us wondering whether Vulkan support will find its way into handsets already on the market.
As Qualcomm points out in its press release on the news, the Vulkan API will bring some important and groundbreaking changes to the mobile space.
- Explicit control over GPU operation, with minimized driver overhead for improved performance;
- Multi-threading-friendly architecture to increase overall system performance;
- Optimal API design that can be used in a wide variety of devices including mobile, desktop, consoles, and embedded platforms;
- Use of Khronos’ new SPIR-V intermediate representation for shading language flexibility and more predictable implementation behavior;
- Extensible layered architecture that enables innovative tools without impacting production performance while validating, debugging, and profiling;
- Simple drivers for low-overhead efficiency and cross vendor portability.
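The "multi-threading-friendly architecture" point is the headline change for developers. Unlike OpenGL, where the driver effectively funnels all draw calls through one thread, Vulkan lets each CPU thread record work into its own command buffer, with only the final queue submission serialized. A rough conceptual sketch of that pattern in Python (real Vulkan code would use calls like vkAllocateCommandBuffers and vkQueueSubmit; the names and structure here are illustrative only):

```python
import threading

def record_commands(thread_id, draws, out_buffers):
    """Each worker records its share of draw calls into its OWN buffer,
    so no lock is needed, mirroring Vulkan's per-thread command buffers."""
    out_buffers[thread_id] = [f"draw(object_{d})" for d in draws]

def submit(buffers):
    """Only the final queue submission is serialized, akin to vkQueueSubmit."""
    return [cmd for buf in buffers for cmd in buf]

NUM_THREADS = 4
objects = list(range(1000))                          # many independent objects
chunks = [objects[i::NUM_THREADS] for i in range(NUM_THREADS)]
buffers = [None] * NUM_THREADS

threads = [threading.Thread(target=record_commands, args=(i, chunks[i], buffers))
           for i in range(NUM_THREADS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

queue = submit(buffers)
print(f"Recorded {len(queue)} commands across {NUM_THREADS} threads")
```

This is exactly the property that Epic's Vulkan demo leans on for its "thousands of onscreen, independent objects": recording cost spreads across cores instead of bottlenecking on a single driver thread.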
Vulkan API support is being added to Qualcomm's development tools suite this week as well.
“We are pleased to have contributed to the definition of Khronos’ new Vulkan API. Qualcomm Technologies will be among the first to ship conformant Vulkan drivers, starting with Snapdragon 820’s embedded Adreno 530 GPU, and subsequently with our Adreno 4xx series GPUs. Vulkan enables the next generation of graphics performance by adding multi-threaded command buffer generation and explicit control of advanced graphics capabilities within Adreno GPUs,” said Micah Knapp, director of product management, Qualcomm Technologies, Inc. “In the coming days, we anticipate supporting Vulkan in the Snapdragon developer tools including Snapdragon Profiler and the Adreno SDK, to help application developers take advantage of this outstanding new API when creating graphics and compute applications for smartphones, tablets, VR HMDs and a variety of other types of devices that use Snapdragon processors.”
A quick look at the Khronos page listing companies with Vulkan conformant drivers shows Qualcomm on the short list, meaning it has provided the standards body with a driver that has passed its first level of certification. With its emphasis on efficiency, the Vulkan API and Qualcomm's early integration could be the most important place that the API ends up. In a technology field where battery life and performance must balance unlike anywhere else, getting this new implementation of graphics and compute could push mobile devices forward quickly.
Subject: Graphics Cards | February 20, 2016 - 12:11 AM | Scott Michaud
Tagged: vulkan, linux
Update: Venn continued to benchmark and came across a few extra discoveries. For example, disabling VDPAU jumped performance to 89.6 FPS in OpenGL and 80.6 FPS in Vulkan. Basically, be sure to read the whole thread; it may be updated further. Original post below (unless otherwise stated).
On Windows, the Vulkan patch of The Talos Principle leads to a net loss in performance relative to DirectX 11. This is to be expected when a developer like Croteam optimizes their game for existing APIs and then tries to port all of that work to a new, very different standard with a single developer and three months of work. They explicitly state, multiple times, not to expect good performance.
Image Credit: Venn Stone of LinuxGameCast
On Linux, Venn Stone of LinuxGameCast found different results. With everything maxed out at 1080p, his OpenGL benchmark reports 38.2 FPS, while Vulkan raises this to an average of 66.5 FPS. Granted, this was with an eight-core AMD FX-8150, which launched with the Bulldozer architecture back in 2011. It did not have the fastest single-threaded performance, falling behind even AMD's own earlier Phenom II parts in that regard.
Still, this is a scenario that allowed the game to scale across Bulldozer's multiple cores and circumvent a lot of the driver overhead in OpenGL. It resulted in a roughly 75% increase in performance, at least for people who pair a GeForce GTX 980 (Update: the "Ti" originally mentioned here was a typo; Venn uses a standard GeForce GTX 980) with an eight-core Bulldozer CPU from 2011.
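That ~75% figure is easy to verify from the reported averages (a quick sanity check, nothing more):

```python
opengl_fps = 38.2   # Venn's OpenGL result at maxed 1080p settings
vulkan_fps = 66.5   # same scene under Vulkan

speedup = vulkan_fps / opengl_fps - 1.0
print(f"Vulkan is {speedup:.0%} faster than OpenGL in this run")

# The same gain expressed as frame times, which is what players actually feel
ms_gl, ms_vk = 1000 / opengl_fps, 1000 / vulkan_fps
print(f"{ms_gl:.1f} ms -> {ms_vk:.1f} ms per frame")
```

The exact ratio works out to about 74%, which rounds to the quoted figure; in frame-time terms, it shaves roughly 11 ms off every frame.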