Cooler Master's MasterAir Maker 8; big cooler, big price

Subject: Cases and Cooling | January 26, 2016 - 01:16 PM |
Tagged: cooler master, masterair maker 8, air cooler

At 758g and standing 135x145x172mm (5.3x5.7x6.8") with the fan installed, the MasterAir Maker 8 is not the largest heatsink on the market, but it is certainly a solid hunk of metal.  Cooler Master has included a black plastic x-brace with captive screws, similar to the mount shipped with the Hyper D92, which will help protect your CPU from cracking; a nice touch for those who choose to invest in this cooler.  The price is steep compared to the competition: at $130 it is priced more like an AIO watercooler than an air cooler, so the performance needs to be equally impressive.  The Tech Report tested it on an i5-6600K against the Nepton 240M and the cooling performance was similar; the acoustic performance, however, was not.  Read on to learn more about the noise this cooler produces and whether it is really worth the price tag.


"Cooler Master's MasterAir Maker 8 CPU cooler uses a unique base design to pack in more heat pipes than any other cooler we know of in its size class. We put this cooler through our testing gauntlet to see whether more is better."

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

New, from the company that brought you SuperFish ...

Subject: General Tech | January 26, 2016 - 12:13 PM |
Tagged: security, Lenovo, idiots

Lenovo chose the third most popular password of 2015 to secure its ShareIT for Windows application and, for bonus points, hard coded it, for which there is utterly no excuse in this day and age.  If you aren't familiar with the software, it is another Dropbox-type app which allows you to share files and folders, apparently with anyone now that this password ridiculousness has been exposed.  As you read on at The Inquirer the story gets even better: files are transferred in the clear without any encryption, and the app even creates an open WiFi hotspot for you, to make sharing your files even easier for all and sundry.  There are more than enough unintentional vulnerabilities in software and hardware; we really don't need companies programming them in on purpose.  If you have ShareIT, you should probably DumpIT.

***Update***

We received word that there is an updated version of ShareIT available for those who do use the app and would like to continue to do so.

They can also access the latest versions which are posted and available for download on the Lenovo site. The updated Android version of SHAREit is also available for download on the Google Play store. Please visit the Lenovo security advisory page for the latest information and updates: (https://support.lenovo.com/us/en/product_security/len_4058)


"HOLY COW! Lenovo may have lost its mind. The firm has created vulnerabilities in ShareIT that could be exploited by anyone who can guess that '12345678' could be a password."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

The ~$220 showdown, XFX R9 380 DD Black Edition OC 4GB versus the GTX 960

Subject: Graphics Cards | January 25, 2016 - 03:19 PM |
Tagged: XFX R9 380 Double Dissipation Black Edition OC 4GB, xfx, gtx 960

In one corner is the XFX R9 380 DD Black Edition OC 4GB, tested both at factory settings and overclocked to 1170MHz on the core with 6.4GHz memory; in the other corner is a GTX 960 with a 1178MHz Boost clock and 7GHz memory.  These two contenders will compete in a six-round 1080p match featuring Fallout 4, Project Cars, Witcher 3, GTAV, Dying Light, and BF4 to see which is worthy of your hard-earned buckaroos.  Your referee for today will be [H]ard|OCP; tune in to see the final results.


"Today we evaluate a custom R9 380 from XFX, the XFX R9 380 DD BLACK EDITION OC 4GB. Sporting a hefty factory overclock and the Ghost Thermal 3.0 custom cooling with Double Dissipation, we compare it to an equally priced reference GeForce GTX 960. Find out which video card provides the better bargain."

Here are some more Graphics Card articles from around the web:

Graphics Cards

 

Source: [H]ard|OCP

Have you tried the Steam Controller yet?

Subject: General Tech | January 25, 2016 - 01:27 PM |
Tagged: input, Steam Controller

The claims seem suspect; how exactly can a Steam Controller replace a mouse and keyboard when gaming?  That suspicion is being tested over at The Tech Report, who recently tried out Valve's new Steam Controller, comparing it not only to a standard PC input setup but also to an Xbox controller.  For the test they used Rocket League, Team Fortress 2, Just Cause 3, and Helldivers, with mixed results.  In the end the Steam Controller was just not as useful as the Logitech M570 trackball, wireless keyboard, and Xbox 360 controller the reviewer is used to.  That said, with a lot of practice and time spent tweaking your input profiles you could find the Steam Controller is for you ... if you want it.


"Valve's Steam Controller is supposed to obviate the mouse and keyboard for PC gaming in the living room. We put our thumbs on the Steam Controller's twin trackpads and took it for a spin to see whether it does the job."

Here is some more Tech News from around the web:

Tech Talk

Make yourself a WiFi camera remote

Subject: General Tech | January 25, 2016 - 12:40 PM |
Tagged: wifi, camera, DIY, iot

Hack a Day has posted a perfect example of how inexpensive and easy it is to build useful things yourself instead of shopping for expensive electronics.  If you have looked at the prices of cameras or adapters which allow you to wirelessly take a picture you have probably been disappointed, but you don't have to stay that way.  Instead, take an existing manual remote trigger, add in a WiFi-enabled SoC module like the ESP8266 suggested in the video, download and compile the code, and the next thing you know you will have a camera with wireless focus and shutter triggering.  Not too shabby for a ~$5 investment.
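To give a sense of how little code a trigger like this needs, here is a rough sketch of the same idea written in MicroPython rather than the compiled firmware the Hack a Day project actually uses. The pin numbers, WiFi credentials, and optocoupler wiring are illustrative assumptions, so check your own camera's remote-trigger pinout before copying anything.

```python
# Hypothetical MicroPython sketch for an ESP8266 camera trigger (not the
# firmware from the Hack a Day project). GPIO4/GPIO5 are assumed to drive
# optocouplers wired to the camera's focus and shutter contacts.
import time
import machine
import network
import socket

FOCUS = machine.Pin(4, machine.Pin.OUT, value=0)
SHUTTER = machine.Pin(5, machine.Pin.OUT, value=0)

# Join an existing network; replace the placeholder credentials.
wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect("your-ssid", "your-password")
while not wlan.isconnected():
    time.sleep(0.1)

def fire(pin, hold_ms=200):
    # Pull the line high long enough for the camera to register the press.
    pin.on()
    time.sleep_ms(hold_ms)
    pin.off()

server = socket.socket()
server.bind(("0.0.0.0", 80))
server.listen(1)

while True:
    client, _ = server.accept()
    request = client.recv(512)
    if b"GET /focus" in request:
        fire(FOCUS)
    elif b"GET /shutter" in request:
        fire(FOCUS)      # half-press to focus first...
        fire(SHUTTER)    # ...then trip the shutter
    client.send(b"HTTP/1.0 200 OK\r\n\r\nOK\n")
    client.close()
```

Point a phone browser at the board's IP and hit /focus or /shutter; the rest is just wiring.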


"It’s just ridiculous how cheap and easy it is to do some things today that were both costly and difficult just two or three years ago. Case in point: Hackaday.io user [gamaral] built a WiFi remote control for his Canon E3 camera out of just three parts"

Here is some more Tech News from around the web:

Tech Talk

 

Source: Hack a Day

AMD Shows Dual-Fiji Graphics Card in a Falcon Northwest PC at VRLA

Subject: Graphics Cards | January 25, 2016 - 11:51 AM |
Tagged: fury x2, Fiji, dual fiji, amd

Lo and behold! The dual-Fiji card that we have previously dubbed the AMD Radeon Fury X2 still lives! Based on a tweet from AMD PR dude Antal Tungler, a PC from Falcon Northwest at the VRLA convention was utilizing a dual-GPU Fiji graphics card to power some demos.

This prototype Falcon Northwest Tiki system was housing the GPU beast but no images were shown of the interior of the system. Still, it's good to see AMD at least recognize that this piece of hardware still exists at all, since it was initially promised to the enthusiast market by "fall of 2015."  Even in October we had hints that the card might be coming soon after seeing some shipping manifests leak out to the web. 


Better late than never, right? One theory floating around the offices here is that AMD is going to release the Fury X2 alongside the VR headsets coming out this spring, with hopes of making it THE VR graphics card of choice. The value of using multi-GPU for VR is interesting, with one GPU dedicated to each eye, though the pitfalls that could haunt both AMD and NVIDIA in this regard (latency, frame time consistency) leave the real-world benefit of that capability up for debate.

Source: Twitter

Report: Intel Tigerlake Revealed; Company's Third 10nm CPU

Subject: Processors | January 24, 2016 - 12:19 PM |
Tagged: Tigerlake, rumor, report, processor, process node, Intel, Icelake, cpu, Cannonlake, 10 nm

A report from financial website The Motley Fool discusses Intel's plan to introduce three architectures at the 10 nm node, rather than the expected two. This comes after news that Kaby Lake will remain at the present 14 nm, interrupting Intel's 2-year manufacturing tech pace.


(Image credit: wccftech)

"Management has told investors that they are pushing to try to get back to a two-year cadence post-10-nanometer (presumably they mean a two-year transition from 10-nanometer to 7-nanometer), however, from what I have just learned from a source familiar with Intel's plans, the company is working on three, not two, architectures for the 10-nanometer node."

Intel's first 10 nm processor architecture will be known as Cannonlake, with Icelake expected to follow about a year afterward. With Tigerlake expected to be the third architecture built on 10 nm, and not arriving until "the second half of 2019", we probably won't see 7 nm from Intel until the second half of 2020 at the earliest.

It appears that the days of two-year, two-product process node changes are numbered for Intel, as the report continues:

"If all goes well for the company, then 7-nanometer could be a two-product node, implying a transition to the 5-nanometer technology node by the second half of 2022. However, the source that I spoke to expressed significant doubts that Intel will be able to return to a two-years-per-technology cycle."


(Image credit: The Motley Fool)

It will be interesting to see how players like TSMC, themselves "planning to start mass production of 7-nanometer in the first half of 2018", will fare moving forward as Intel's process development (apparently) slows.

High-End Surface Pro 4 and Surface Book Are Available

Subject: Systems | January 23, 2016 - 02:26 AM |
Tagged: microsoft, surface, surface pro 4, surface book

The Microsoft Surface Book and Surface Pro 4 launched back in October, and Ryan published a review of them in December. He didn't really make reference to it, but the highest-end model of each was unavailable until a later date. As it turns out, that time is roughly now. I say “roughly” because, while Microsoft has launched the devices, Amazon's landing page doesn't list them, and searching for the product directly shows a price tag of just under $10,000. I assume Amazon hasn't pushed the appropriate buttons yet.


The only real improvement that you will see, versus the second-highest SKU, is a jump in SSD capacity from 512GB to 1TB. This extra storage will cost roughly $1/GB, but this is also a very fast NVMe SSD. If 512GB was too small, and you were holding out for availability of the 1TB model, then your wait should (basically) be over.

Although, since you waited this long, you might want to hold off a little longer. Microsoft is supposed to be correcting what some say are severe issues with upcoming firmware. You may want to see whether the problems are solved before dropping two-and-a-half to three grand.

Source: Microsoft

If you want great audio and don't care about the price; HiFiMAN HE-1000

Subject: General Tech | January 22, 2016 - 01:47 PM |
Tagged: hifiman, headphones, HE-1000, audio

HiFiMAN has been producing mid-level and high-end audio products for quite some time, straddling the line between affordable and audiophile quality.  The HE-1000 are of the aforementioned audiophile level; at $3000 you really have to have discerning ears to want to pick up these cans.  The headset is quite pretty, built with leather, wood, and aluminium, with soft cloth for the earcups and a window-blind design on the exterior which HiFiMAN claims has a positive effect on the audio quality.  techPowerUp tested these headphones out; you can read the description of their experience with the soundstage these headphones create in their review ... or not.


"HiFiMAN is constantly developing their planar technology, and today, we will take a look at their latest state-of-the-art headphone. It is dubbed the HE-1000 and features a nanometer thick diaphragm, leather headband, and milled aluminum. We take HiFiMAN's most audacious and pricey headphone for a ride!"

Here is some more Tech News from around the web:

Audio Corner

 

Source: techPowerUp

Cortana, where is my app?

Subject: General Tech | January 22, 2016 - 12:15 PM |
Tagged: microsoft, Windows Store, windows 10

If the complaints of the developers in this story over at The Register are accurate, then the problem with the Windows Store might not be that there are no good apps but instead that you simply can't find them.  If a developer can't find their own app in the store using keywords from its title or description, or even the keywords they submitted to the store, then how can you expect to?  If the only reliable way to find an app is to know its exact name, how many apps are there in the store that no one but the developer has ever seen?  It is still possible that an improved search function will not help the Windows Store, but at this point its reputation could not get all that much worse.


"Looking at the developer forums though, it seems that official guidance and assistance for this issue is not easy to find, which will not help Microsoft in its efforts to establish a strong Windows 10 app ecosystem."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Register

GDDR5X Memory Standard Gets Official with JEDEC

Subject: Graphics Cards, Memory | January 22, 2016 - 11:08 AM |
Tagged: Polaris, pascal, nvidia, jedec, gddr5x, GDDR5, amd

Though information about the technology has been making the rounds over the last several weeks, GDDR5X finally gets official with an announcement from JEDEC this morning. The JEDEC Solid State Technology Association is, as Wikipedia tells us, an "independent semiconductor engineering trade organization and standardization body" that is responsible for creating memory standards. Getting the official nod from the org means we are likely to see implementations of GDDR5X in the near future.

The press release is short and sweet. Take a look.

ARLINGTON, Va., USA – JANUARY 21, 2016 –JEDEC Solid State Technology Association, the global leader in the development of standards for the microelectronics industry, today announced the publication of JESD232 Graphics Double Data Rate (GDDR5X) SGRAM.  Available for free download from the JEDEC website, the new memory standard is designed to satisfy the increasing need for more memory bandwidth in graphics, gaming, compute, and networking applications.

Derived from the widely adopted GDDR5 SGRAM JEDEC standard, GDDR5X specifies key elements related to the design and operability of memory chips for applications requiring very high memory bandwidth.  With the intent to address the needs of high-performance applications demanding ever higher data rates, GDDR5X  is targeting data rates of 10 to 14 Gb/s, a 2X increase over GDDR5.  In order to allow a smooth transition from GDDR5, GDDR5X utilizes the same, proven pseudo open drain (POD) signaling as GDDR5.

“GDDR5X represents a significant leap forward for high end GPU design,” said Mian Quddus, JEDEC Board of Directors Chairman.  “Its performance improvements over the prior standard will help enable the next generation of graphics and other high-performance applications.”

JEDEC claims that by using the same signaling type as GDDR5, GDDR5X is able to double the per-pin data rate to 10-14 Gb/s. In fact, based on leaked slides about GDDR5X from October, JEDEC actually calls GDDR5X an extension to GDDR5, not a new standard. How does GDDR5X reach these new speeds? By doubling the prefetch from 32 bytes to 64 bytes. This will require a redesign of the memory controller for any processor that wants to integrate it. 


Image source: VR-Zone.com

As for usable bandwidth, though it isn't quoted directly, the increase would likely be much lower than the per-pin figures from the press release suggest. Because the memory bus width would remain unchanged, and GDDR5X just grabs chunks twice the size in prefetch, we should expect an incremental change. Power efficiency isn't mentioned either, and that was one of the driving factors in the development of HBM.
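For reference, here is what the raw per-pin figures translate to as peak (theoretical) bandwidth; the 256-bit and 384-bit bus widths below are my own illustrative assumptions, not figures from the JEDEC release.

```python
# Peak (theoretical) bandwidth = per-pin data rate x bus width / 8.
# Bus widths here are hypothetical examples, not part of the GDDR5X standard.
def peak_bandwidth_gb_s(per_pin_gbps: float, bus_width_bits: int) -> float:
    return per_pin_gbps * bus_width_bits / 8.0

for bus in (256, 384):
    gddr5 = peak_bandwidth_gb_s(7.0, bus)       # a common GDDR5 speed grade
    g5x_low = peak_bandwidth_gb_s(10.0, bus)    # low end of the GDDR5X target
    g5x_high = peak_bandwidth_gb_s(14.0, bus)   # high end of the target
    print(f"{bus}-bit bus: GDDR5 @ 7 Gb/s = {gddr5:.0f} GB/s, "
          f"GDDR5X @ 10-14 Gb/s = {g5x_low:.0f}-{g5x_high:.0f} GB/s")
```

On a 256-bit bus that works out to 224 GB/s for GDDR5 at 7 Gb/s versus 320-448 GB/s for GDDR5X, keeping in mind these are peak numbers, not the more incremental real-world gains discussed above.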


Performance efficiency graph from AMD's HBM presentation

I am excited about any improvement in memory technology that will increase GPU performance, but I can tell you that from my conversations with both AMD and NVIDIA, no one appears to be jumping at the chance to integrate GDDR5X into upcoming graphics cards. That doesn't mean it won't happen with some version of Polaris or Pascal, but it seems that there may be concerns other than bandwidth that keep it from taking hold. 

Source: JEDEC

Crystal Dynamics Reveals Minimum Specifications for Rise of the Tomb Raider on the PC

Subject: General Tech | January 22, 2016 - 02:06 AM |
Tagged: tomb raider, pc gaming, min specs, can it run

Crystal Dynamics has revealed the minimum system requirements for Rise of the Tomb Raider on PC. This latest Lara Croft adventure sees the ever-resilient tomb raider following in the footsteps of her father in search of an artifact said to grant immortality amidst the lost city of Kitezh. Fortunately for gamers, Rise of the Tomb Raider has quite a low bar for entry with modest minimum system requirements. You will need more powerful hardware than its 2013 predecessor (Tomb Raider), but it is still quite manageable.


PC gamers will need a 64-bit version of Windows, a dual-core Intel Core i3-2100 (2 cores, 4 threads at 3.1 GHz) or, for example, an AMD FX 4100 processor, 6 GB of system memory, 25 GB of storage space for all the game files, and, of course, a graphics card with 2 GB of video memory such as the NVIDIA GTX 650 or AMD Radeon HD 7770. Naturally, hardware with higher specifications will get you better performance and visuals, but the above is what you will need to play.

Minimum PC System Requirements:

  • Dual Core Processor (e.g. Core i3-2100 or FX 4100)
  • 6 GB RAM
  • 25 GB Available Storage Space
  • 2 GB Graphics Card (e.g. GTX 650 or Radeon HD7770)

For those curious, Tomb Raider (2013) required XP SP3 32-bit, a dual core Intel Core 2 Duo E6300 or Athlon 64 X2 4050+ CPU, 1 GB of RAM (2 GB for Vista), and an NVIDIA 8600 or AMD HD2600 XT GPU with 512MB of video memory.

Rise of the Tomb Raider will reportedly add new stealth and crafting components along with new weapons and options for close quarters combat. Further, the game will feature day and night cycles with realistic weather which should make for cold snow-filled nights in Siberia as well as opportunities to sneak up on unwitting guards freezing their buns off!

The game is set to release on January 28th for the PC, joining the Xbox One version that launched back in November 2015 as a timed exclusive (it will come to the PS4 later this year).

Personally, I am excited for this game. I picked up its predecessor during a Steam sale for super cheap only to let it sit in my inventory for about a year. It was one of those 'I'll play it eventually, but it's not really a priority' things where the price finally got me (heh). Little did I know how wrong I was, because once I finally got around to firing up the game, I played it near constantly until I beat it! It was a surprisingly fun reboot of the series, and I am hopeful that RofTR will be more of the same! 

Source: Bit-Tech

Gigabyte Teases New Power Supplies

Subject: Cases and Cooling | January 21, 2016 - 10:44 PM |
Tagged: modular psu, gigabyte, ATX PSU

Gigabyte made an announcement teasing two new power supplies last week. The G750H and B700H are 80 PLUS rated models topping out at 750W and 700W respectively. Coming from a company best known for its motherboards, it was somewhat surprising to see power supplies teased, and to discover that these PSUs are not even the first to be sold under Gigabyte's branding.

The G750H and B700H are ATX form factor and use a semi-modular design that leaves the 24-pin ATX and 8-pin CPU power cables permanently attached, with modular cables for all other connections (see below). One neat thing is that Gigabyte is using all-black, flat, individually sleeved cables, which may make it easier to hide and route them behind the motherboard tray (which on some cases can be an especially narrow channel). Both models are rated for SLI and CrossFire multi-GPU setups, use at least some Japanese capacitors (the G750H uses all Japanese capacitors), have an MTBF of 100,000 hours, and carry five-year warranties.


In addition to the motherboard and CPU power, users can install two eight pin PCI-E, five SATA power, three Molex, and one floppy power connector. The modular cable configuration is the same on both PSU models.

The G750H is up to 90% efficient (80+ Gold) and uses a 140mm temperature controlled fan to keep noise levels low and the internal components cool (and efficient). Gigabyte has opted for a single rail design that sees the 12V rail rated at up to 62 amps.

On the other hand, the B700H is up to 85% efficient (80+ Bronze) at typical loads. It has a smaller 120mm temperature controlled fan for cooling. This model also uses a single 12V rail, but it tops out at 54 amps.
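As a quick sanity check of those single-rail figures (my arithmetic, not Gigabyte's), nearly all of each unit's label rating is available on the 12 V rail:

```python
# 12 V rail capacity versus the label rating, using the amperages quoted above.
for name, amps, rating_w in (("G750H", 62, 750), ("B700H", 54, 700)):
    rail_w = amps * 12
    print(f"{name}: {amps} A x 12 V = {rail_w} W "
          f"({rail_w / rating_w:.0%} of the {rating_w} W label)")
```

That works out to 744 W (about 99%) on the G750H and 648 W (about 93%) on the B700H, which is what you want to see from a modern single-rail design.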

Several sites around the Internet (including Maximum PC) have indicated that Gigabyte has made the G750H and B700H available now, but they do not seem to be for sale yet in the US. I have tried to unearth pricing as well as the identity of the ODM Gigabyte is using for these new units, but no such luck so far. From my research, it appears that Gigabyte has used a number of different ODM/OEMs of varying quality for its past power supplies. It seems that we will have to wait for reviews to know for sure how these new PSUs will perform. I hope that Gigabyte has stepped up its power supply game as it has quite a bit of competition these days!

Source: Gigabyte

Windows 10 Build 11102 Released to Fast Ring Insiders

Subject: General Tech | January 21, 2016 - 06:39 PM |
Tagged: windows 10, microsoft

I wasn't planning on reporting every Windows 10 Insider build, but I actually have something to say about this one. The last couple of builds were examples of Microsoft switching to a faster release cycle for preview users, although their known issues lists were quite benign. The concern was that semi-frequent upgrade cycles would shake users away from the Insider program, or push them to the Slow ring.

Microsoft now seems ready to start dialing back the amount of QA applied to builds for Fast ring users.


Five issues are known about build 11102. First, internal changes to the Windows Graphics architecture cause some games to crash on launch, full screen, or resolution changes. The Witcher 3, Fallout 4, Tomb Raider, Assassin's Creed, and Metal Gear Solid V are known to be affected by this bug, but other games (and maybe even other software) could be affected too. Second, screen readers and other accessibility software may crash randomly. If you require those accommodations, then this build could make your device functionally unusable to you.

Beyond those two big issues, three others are present: an error message on login that has a workaround, a breaking change for old wireless drivers (which you should probably update beforehand if you rely upon wireless to download drivers), and “The Connect button does not show up in Action Center.”

Microsoft is currently updating the deep insides of the OS, which means that they will be poking around the source code in weird places. Once it's completed, this should make Windows more maintainable, especially across multiple types of hardware. But again, if you don't want to be a part of this, switch to the Slow ring or leave the Insider program.

Source: Microsoft

Not the Toshiba Satellite of yore, the Satellite Radius 12 has new tricks

Subject: Mobile | January 21, 2016 - 06:14 PM |
Tagged: toshiba, Satellite Radius 12

Ah, the old Toshiba Satellite; like a Volvo it was never the best nor the prettiest, but short of a major collision nothing could kill it.  Since those days Toshiba has had a rough go of it; The Inquirer notes the company has predicted a $4.5bn loss, just after being caught cooking the books.  That has not stopped it from improving the Satellite lineup, and the Satellite Radius 12 ultraportable is a great example of that.

The screen on this 300x209x15.4mm (11.8x8.2x0.6"), 1.32kg (2.9lb), 12.1" convertible laptop is an impressive 3840x2160 IPS display which can be flipped all the way around into a tablet-like form factor.  An i7-6500U, 8GB of RAM, and an unspecified 256GB SSD offer great performance, although battery life does suffer somewhat due to the screen and components.  Toshiba also included a dedicated Cortana button, a cellphone-like volume rocker, a 0.9MP webcam, and an infrared camera which works with Windows Hello but is not a RealSense camera.  The Inquirer found a lot to like about this laptop as well as some fairly serious shortcomings; read about them all in their review.


"This is the latest in Toshiba's rotating display convertible line, and the first of its kind to include a so-called 4K screen, making it an interesting proposition regardless of its creator's misfortunes."

Here are some more Mobile articles from around the web:

Mobile

 

Source: The Inquirer

Podcast #383 - Acer Predator X34, ASUS X99-M, AMD Q4 Earnings and more!

Subject: General Tech | January 21, 2016 - 02:34 PM |
Tagged: x99-m, X170, X150, video, Silent Base 800, Q4 2015, Predator X34, podcast, gigabyte, g-sync, freesync, earnings, be quiet, asus, amd, acer

PC Perspective Podcast #383 - 01/21/2016

Join us this week as we discuss the Acer Predator X34, ASUS X99-M, AMD Q4 Earnings and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

 

Know anyone who uses the Intel Driver Update Utility? Update the updater ASAP

Subject: General Tech | January 21, 2016 - 12:52 PM |
Tagged: Intel, intel driver update utility, security

The Intel Driver Update Utility is not the most commonly found application on PCs, but someone you know may have stumbled upon it or had it installed by Geek Squad or the local equivalent.  The tool has been available since Windows Vista; it checks your system for any Intel parts, from your APU to your NIC, and then looks for any applicable drivers.  Unfortunately, it was doing so over a non-SSL URL, which left the utility wide open to a man-in-the-middle attack, and you really do not want a compromised NIC driver.  The Inquirer reports today that Intel quietly updated the tool on January 19th to resolve the issue, ensuring all communication and downloads are over SSL.  If you know anyone using this tool, recommend they update it immediately.
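For anyone curious what closing this class of hole generally looks like, here is a minimal sketch of the mitigation, not Intel's actual code: fetch the package over HTTPS with certificate verification and check a published digest before trusting it. The URL and hash below are placeholders, not real Intel endpoints.

```python
# Illustrative only -- placeholder URL and digest, not Intel's real endpoints.
import hashlib
import requests

UPDATE_URL = "https://example.com/drivers/nic_driver.exe"  # placeholder
EXPECTED_SHA256 = "0" * 64                                 # placeholder digest

def fetch_and_verify(url: str, expected_sha256: str) -> bytes:
    # requests verifies TLS certificates by default, which is what defeats a
    # man-in-the-middle swapping out the download in transit.
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    digest = hashlib.sha256(response.content).hexdigest()
    if digest != expected_sha256:
        raise ValueError("downloaded package does not match the published digest")
    return response.content

# package = fetch_and_verify(UPDATE_URL, EXPECTED_SHA256)
```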


"Intel has issued a fix for a major security vulnerability in a driver utility tool that could have allowed a man-in-the-middle attack and a malware maelstrom on victims' computers."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Inquirer

Google Releases Chrome 48 with Interesting Features

Subject: General Tech | January 21, 2016 - 02:59 AM |
Tagged: google, chrome

Web browsers are typically on rapid release cycles so they can get features out frequently. The Web is changing on a constant basis to help it become an effective application platform, one that remains cross-compatible between competing implementations. A common complaint is that the rapid cycle exists to yield high version numbers for marketing, giving a false sense of maturity, but I'd expect that frequent, breaking changes are kind of necessary to synchronize features between implementations. If Google lands a feature a month after Mozilla publishes a new version, should they really wait two years for their next one? Granted, they probably knew about it pre-release, but you get the idea. Also, even if the theory is true, artificially high version numbers are one of the most benign things a company could do.


Some versions introduce fairly interesting features, though. This one, Google Chrome 48, deprecates RC4 encryption for HTTPS, which forces web servers to use newer ciphers or their pages will fail to load.

Another major one, and probably more interesting for our audience, is the introduction of VP9 to WebRTC. This video codec is Google's open competitor to H.265. At similar quality settings, VP9 will use about half the bandwidth (or storage) of VP8. WebRTC is mostly used for video conferencing, but it's really an open platform for webcam, microphone, audio, video, and raw, peer-to-peer data connections. There are even examples of it being used to synchronize objects in multiplayer video games, which has nothing to do with video or audio streaming. I'm not sure what will be possible with this support, but it might even lead to web applications that can edit video.

Google Chrome 48 is available today. Also, as a related note, Firefox 44 should release next week with its own features, like experimental offscreen, multi-threaded rendering of WebGL images. The full changelog for Google Chrome 48 from Git is about 42 MB in size and, ironically, tends to crash Firefox.

Source: VentureBeat

GDC 2016 Sessions Are Up and DirectX 12 / Vulkan Are There

Subject: General Tech | January 20, 2016 - 07:06 PM |
Tagged: vulkan, ue4, nvidia, Intel, gdc 2016, GDC, epic games, DirectX 12, Codemasters, arm, amd

The 30th Game Developers Conference (GDC) will take place from March 14th through March 18th, with the expo itself starting on March 16th. The session list has now been published, with DX12 and Vulkan prominently featured. While the technologies have not been adopted as quickly as advertised, the direction is definitely forward. In fact, NVIDIA, Khronos Group, and Valve have just finished hosting a developer day for Vulkan. It is coming.


One interesting session, hosted by Codemasters and Intel, discusses bringing the F1 2015 engine to DirectX 12. It will highlight a few features they implemented, such as voxel-based raytracing using conservative rasterization, which overestimates the size of individual triangles so you don't get edge effects on pixels that are only partially influenced by an edge cutting through a tiny, but not negligible, portion of them. Sites like Game Debate (Update: Whoops, forgot the link) wonder if these features will be patched into older titles, like F1 2015, or if they're just R&D for future games.
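For readers unfamiliar with the term, the sketch below shows the basic idea behind conservative rasterization in a few lines of Python. It is a simplified 2D illustration of the general technique (standard coverage tests only the pixel centre, conservative coverage pushes each triangle edge outward by the pixel's extent), not code from the Codemasters/Intel session.

```python
# Simplified 2D illustration of standard vs. conservative pixel coverage using
# edge functions; assumes counter-clockwise triangles and 1x1 pixels.
def edge(a, b, p):
    # Signed area test: >= 0 means p is on the interior side of edge a->b
    # for a counter-clockwise triangle.
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def covered_standard(tri, px, py):
    # Standard rasterization: only the pixel centre is tested.
    a, b, c = tri
    centre = (px + 0.5, py + 0.5)
    return all(edge(p, q, centre) >= 0 for p, q in ((a, b), (b, c), (c, a)))

def covered_conservative(tri, px, py):
    # Conservative rasterization: each edge is pushed outward by the furthest
    # the 1x1 pixel square can extend past it, so any pixel the triangle even
    # grazes still counts as covered.
    a, b, c = tri
    centre = (px + 0.5, py + 0.5)
    for p, q in ((a, b), (b, c), (c, a)):
        nx, ny = -(q[1] - p[1]), (q[0] - p[0])   # gradient of the edge function
        slack = 0.5 * (abs(nx) + abs(ny))        # half-pixel extent along it
        if edge(p, q, centre) + slack < 0:
            return False
    return True

tri = ((0.2, 0.2), (3.8, 0.6), (1.0, 3.5))       # counter-clockwise triangle
std = {(x, y) for y in range(4) for x in range(4) if covered_standard(tri, x, y)}
con = {(x, y) for y in range(4) for x in range(4) if covered_conservative(tri, x, y)}
print("standard coverage:      ", sorted(std))
print("extra conservative hits:", sorted(con - std))
```

The "extra" pixels are exactly the partially covered ones a voxelization pass cares about, since missing them is what causes the edge artifacts described above.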

Another keynote will discuss bringing Vulkan to mobile through Unreal Engine 4. This one will be hosted by ARM and Epic Games. Mobile processors have quite a few cores, albeit ones that are slower at single-threaded tasks, and decent GPUs. Being able to keep them loaded will bring their gaming potential up closer to the GPU's theoretical performance, which has surpassed both the Xbox 360 and PlayStation 3, sometimes by a factor of 2 or more.

Many (most?) slide decks and video recordings are available for free after the fact, but we can't really know which ones ahead of time. It should be an interesting year, though.

Source: GDC

Phoronix Tests Almost a Decade of GPUs

Subject: Graphics Cards | January 20, 2016 - 03:26 PM |
Tagged: nvidia, linux, tesla, fermi, kepler, maxwell

It's nice to see long-term roundups every once in a while. They do not really provide useful information for someone looking to make a purchase, but they show how our industry is changing (or not). In this case, Phoronix tested twenty-seven NVIDIA GeForce cards across four architectures: Tesla, Fermi, Kepler, and Maxwell. In other words, from the GeForce 8 series all the way up to the GTX 980 Ti.


Image Credit: Phoronix

Nine years of advancements in ASIC design, with a doubling time-step of 18 months, should yield a 64-fold improvement. The transistor count falls short, showing about a 12-fold improvement between the Titan X and the largest first-wave Tesla, although that is largely out of the hands of a fabless semiconductor designer. The main reason I include this figure is to show the actual Moore's Law trend over this time span, but it also highlights the slowdown in process technology.

Performance per watt does depend on NVIDIA though, and the ratio between the GTX 980 Ti and the 8500 GT is about 72:1. While this is slightly better than the target 64:1 ratio, these parts come from very different positions in their respective product stacks. Swap the 8500 GT for the following year's 9800 GTX, which makes it a comparison between top-of-the-line GPUs of their respective times, and you see a 6.2x improvement in performance per watt for the GTX 980 Ti. On the other hand, that part was outstanding for its era.
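The arithmetic behind those comparisons is simple enough to show directly; the measured ratios below are the ones quoted above from Phoronix's data.

```python
# Expected gain from a doubling every 18 months over roughly nine years,
# versus the ratios Phoronix measured.
span_years = 9.0
doubling_period_years = 1.5
expected = 2 ** (span_years / doubling_period_years)   # 2^6 = 64

print(f"Expected from an 18-month doubling over {span_years:.0f} years: {expected:.0f}x")
print("Measured transistor count, Titan X vs. first-wave Tesla: ~12x")
print("Measured perf/W, GTX 980 Ti vs. 8500 GT:  ~72x")
print("Measured perf/W, GTX 980 Ti vs. 9800 GTX: ~6.2x")
```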

I should note that each of these tests takes place on Linux. It might not perfectly reflect the landscape on Windows, but again, it's interesting in its own right.

Source: Phoronix