Subject: Editorial | May 21, 2015 - 03:34 PM | Ken Addison
Tagged: podcast, video, amd, hbm, Fiji, g-sync, ips, XB270HU, corsair, Oculus, supermicro, asus, gladius, jem davies, arm, mali
PC Perspective Podcast #350 - 05/21/2015
Join us this week as we discuss AMD's plan for HBM, IPS G-SYNC, GameWorks and The Witcher 3, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:24:12
Week in Review:
News item of interest:
Hardware/Software Picks of the Week:
Sebastian: Aukey Quick Charge 2.0 Portable Charger
Ryan wasn't the only one to test BenQ's XL2730Z 27-in 2560x1440 144 Hz FreeSync monitor; The Tech Report also had a chance to test one, as well as to talk to NVIDIA's Tom Petersen about the company's competing technology. They also discussed FreeSync in general with AMD's David Glen, one of the engineers behind FreeSync. Their benchmarks and overall impressions of the display's capabilities, and of FreeSync in general, make up a major portion of the review, but the discussions with the two company representatives make for even more interesting reading.
"AMD's FreeSync is here, personified in BenQ's XL2730Z monitor. We've gone deep into the display's performance and smoothness, with direct comparisons to G-Sync using 240-fps video. Here's what we found."
Here are some more Display articles from around the web:
- AMD's FreeSync; A Long-Term Review @ Hardware Canucks
- AOC Q2778VQE @ Kitguru
- Philips Brilliance BDM4065UC @ Kitguru
- BenQ BL3201PT @ Kitguru
Subject: General Tech | May 21, 2015 - 12:40 PM | Jeremy Hellstrom
Tagged: linux, EXT4, raid, bug
On Tuesday a bug was discovered to have been introduced into the Linux 4.0 kernel when a fix was added to deal with RAIDs whose chunk size is not a power of 2, a problem present since Linux 3.14-rc1. This fix has been causing corruption on such RAIDs and the file systems on them, making many an Arch Linux user unhappy. Only users of rolling-release flavours will be affected; distros with scheduled updates like RHEL or Ubuntu are not affected at this time. The good news is that as of today there is a fix available if you wish to apply it, along with an identification of the commit that caused the issue. Check out both at Phoronix.
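Since the regression only bites arrays whose chunk size is not a power of two, a quick sanity check of the chunk size reported by `mdadm --detail` (or `/proc/mdstat`) tells you whether your array falls into the affected class. A minimal sketch; the 384 KiB example value is hypothetical, not taken from the bug report:

```python
def is_power_of_two(n: int) -> bool:
    """True if n is a positive power of two (e.g. 64, 128, 512)."""
    return n > 0 and (n & (n - 1)) == 0

# Chunk size in KiB, as reported by `mdadm --detail /dev/md0`.
chunk_kib = 384  # hypothetical example value

if is_power_of_two(chunk_kib):
    print("power-of-two chunk size: not in the affected class")
else:
    print("non-power-of-two chunk size: potentially hit by the 4.0 regression")
```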
"A few days ago we reported on an EXT4 file-system corruption issue being discovered within the stable Linux 4.0 kernel series. The good news is the issue has been uncovered and a patch is available, but it could still be a few days before it starts getting sent out in stable updates."
Here is some more Tech News from around the web:
- Boffins have devised TERMINATOR style LIQUID METAL – for an antenna @ The Register
- Build A 1000W LED Flashlight @ Hack a Day
- Qualcomm to step out of TSMC top-5 client list in 2H15 @ DigiTimes
- Take Our Survey: Best Linux Hacker SBCs for Under $200 @ Linux.com
- 'Millions' of routers open to absurdly outdated NetUSB hijack @ The Register
- Google Fiber has taken on a piracy policing role @ The Inquirer
Subject: Storage | May 20, 2015 - 02:48 PM | Jeremy Hellstrom
Tagged: XP941, SSD 750, ssd, SM951, pcie, NVMe, MZVPV512HDGL, AHCI
For owners of Z97 or X99 boards with updated UEFIs, or of the rare SFF-8643 connector for the 2.5" version, booting from NVMe is possible; for everyone else the Intel SSD 750 will have to serve as a secondary storage drive. Al recently looked at this more than impressive PCIe SSD, and now [H]ard|OCP has had a bash at it. The review is certainly worth checking out, as some of their tests, especially the real-world ones, differ from the benchmarks that Al used. This will give you more information about how the new SSD will handle your workloads, research worth doing if you are thinking of spending $1055 on the 1.2TB model.
"Intel is set to be the catalyst for a long-awaited leap forward in storage technology with the new SSD 750 bringing NVMe storage to client PCs for the first time, and turning the high end SSD space upside-down. We are expecting blinding IOPs and we dig in to find out what it can mean to the hardware enthusiast."
Here are some more Storage reviews from around the web:
- Samsung SM951 M.2 NVME 256GB @ The SSD Review
- Samsung SM951 M.2 512GB @ The SSD Review
- OCZ ARC 100 240GB SSD Review @ Madshrimps
- Silicon Power S80 480GB @ Bjorn3d
- Kingston HyperX Savage 240 GB @ techPowerUp
- SanDisk CloudSpeed Eco SSD @ The SSD Review
- Synology DiskStation DS415+ NAS Review @ Madshrimps
- Inateck FD1005 Top-Loading HDD Docking Station
- Toshiba MQ02ABF075 2.5'' Mobile Thin HDD Review @ Madshrimps
- Silicon Power Armor A60: Rugged, Portable, and Affordable @ Bjorn3d
- Samsung Portable SSD T1 250GB USB 3.0 Drive Review @ NikKTech
- Toshiba TransMemory-EX II 64GB USB 3.0 Flash Drive Review @ NikKTech
Subject: General Tech | May 20, 2015 - 01:58 PM | Jeremy Hellstrom
Tagged: The Witcher 3, gaming, CD Projekt RED
The new, and not quite as pretty as advertised, Witcher is here from CD Projekt RED, available from GOG among other places. Rock, Paper, SHOTGUN have started another of their ongoing diaries to share their experiences, so far involving a bare bum and the amazing Tutorial Man. They also went straight for the dream sequence right off the bat; a smart move to get it over and done with in the early stages. There will be more, as this is a very large game. If you are looking for more detail on graphics settings than simply turning off the Vidal Sassoon option, there is a post here discussing the options they used, as well as the links below.
"I shall instead run a (mostly) in-character diary series covering my adventures in, presumably, just the earlier stages of CDP’s saucy roleplayer. But for the record, it runs OK if I turn Fancy Hair off but it has crashed twice so far."
Here is some more Tech News from around the web:
- Witcher 3: Wild Hunt – a true monster in the making @ The Register
- The Witcher 3: Performance Analysis @ techPowerUp
- Wot I Think: Galactic Civilizations III @ Rock, Paper, SHOTGUN
- Battlefield 4 Spring Patch Will Add Guns, Guns, Guns @ Rock, Paper, SHOTGUN
- “We’re experimenting with something that nobody’s done before” – Larian On Original Sin’s Enhanced Edition @ Rock, Paper, SHOTGUN
- Life Imitates Art: The Black Glove Shelved @ Rock, Paper, SHOTGUN
Subject: General Tech | May 20, 2015 - 01:02 PM | Jeremy Hellstrom
Tagged: twitter, google
Several years back Google thought it would be fun to include tweets in Google searches, and while they were smart to discontinue that, the reasoning behind ending it, Google+, may not have been as sound. According to Slashdot, your searches for information on Google will once again be accompanied by 140-character posts of scintillating wisdom, which will obviously impart far more knowledge than the citation you were looking for. This should also do wonders for those looking to limit the perspectives and opinions they are exposed to, as dissenting views can easily be drowned out by tweets that reinforce your beliefs with just minor alterations to the text of your search.
On the plus side, one comment on Slashdot shows how to add operators back into your searches, just paste &tbs=li:1 at the end of the URL once you have searched.
Add "&tbs=li:1" to your keyword search string. For example: https://www.google.com/search?q=%s&tbs=li:1
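The `tbs=li:1` parameter from the Slashdot comment can be appended programmatically rather than pasted by hand. A small sketch using Python's standard `urllib.parse`; the example query string is just an illustration:

```python
from urllib.parse import urlencode

def verbatim_search_url(query: str) -> str:
    """Build a Google search URL with the tbs=li:1 parameter appended.

    safe=":" keeps the colon in "li:1" unescaped, matching the
    hand-pasted form from the Slashdot comment.
    """
    return "https://www.google.com/search?" + urlencode(
        {"q": query, "tbs": "li:1"}, safe=":"
    )

print(verbatim_search_url("pcper podcast 350"))
# -> https://www.google.com/search?q=pcper+podcast+350&tbs=li:1
```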
"Google will now begin showing tweets alongside search results. Mobile users searching via the Android/iOS apps or through the browser will start seeing the tweets immediately, while the desktop version is "coming shortly." The tweets will only be available for the searches in English to start, but Twitter says they'll be adding more languages soon."
Here is some more Tech News from around the web:
- Tearing Down The Apple Watch @ Hack a Day
- Microsoft launches Office for Android preview with a touch more touch @ The Inquirer
- Ricoh rolls out electrifyingly exciting RUBBER! @ The Register
Subject: Graphics Cards | May 19, 2015 - 03:51 PM | Jeremy Hellstrom
Tagged: memory, high bandwidth memory, hbm, Fiji, amd
Ryan and the rest of the crew here at PC Perspective are excited about AMD's new memory architecture and the fact that they will be first to market with it. However, a second opinion on the topic is always worth finding, as any intelligent reader knows. Look no further than The Tech Report, who have also been briefed on AMD's new memory architecture. Read on to see what they learned from Joe Macri, along with their thoughts on this successor to GDDR5 and on HBM2, which is already in the works.
"HBM is the next generation of memory for high-bandwidth applications like graphics, and AMD has helped usher it to market. Read on to find out more about HBM and what we've learned about the memory subsystem in AMD's next high-end GPU, code-named Fiji."
Here are some more Graphics Card articles from around the web:
- AMD HBM High Bandwidth Memory Technology Unveiled @ [H]ard|OCP
- Diamond Wireless Video Stream HD 1080P HDMI @ eTeknix
- KFA2 GeForce GTX 980 ‘8Pack Edition’ 4096MB @ Kitguru
- Gigabyte GTX 960 OC 2 GB @ techPowerUp
- GeForce GTX TITAN X Video Card Review @ Hardware Secrets
Subject: Cases and Cooling | May 19, 2015 - 02:55 PM | Jeremy Hellstrom
Tagged: Silverstone, Tundra TD02-E, CPU Water Block
SilverStone's Tundra TD02-E is a 240mm radiator, 278x124x27mm in total, with a hefty weight of 1.5kg, which you should keep in mind when deciding where to place it. We have had quite a few reviews of SilverStone's Tundra series, from Ryan's look at the original model nine years ago to Morry's recent reviews of the TD02 and TD03. The TD02-E is an updated model with newer tubing and a slimmer radiator, which [H]ard|OCP compared to previous models and other competitors. Their testing showed performance equivalent to the initial model with reduced noise and, if you shop around, a reduced price as well.
"SilverStone is no stranger to us when it comes to All-In-One liquid CPU coolers. The new Tundra Series TD02-E AIO is SilverStone's updated version of its TD02 120mm dual fan cooler. SilverStone is not very clear on what exactly is better about this cooler, but we will put it through its paces to see if we have a better AIO."
Here are some more Cases & Cooling reviews from around the web:
- Thermaltake Water 3.0 Ultimate CPU Cooler Review @ Hardware Secrets
- Fractal Design Kelvin S36 360mm Liquid Cooler Review @ Madshrimps
- REEVEN Ouranos RC-1401 CPU Cooler Review @ NikKTech
- Be Quiet! Pure Rock CPU Cooler Review @ NikKTech
- SilverStone Tundra TD02-E @ Benchmark Reviews
- Cryorig R1 Ultimate CPU Cooler @ Benchmark Reviews
- Lian-Li PC-O6s @ HardwareHeaven
- In Win 707 Tower Case Review @ Hardware Asylum
- Fractal Design Define S Case with Kelvin S36 Cooler @ Kitguru
- Sleek, Stylish, If Slightly Strange – A Review Of The In Win 707 Full-Tower Chassis @ Techgage
- NZXT Noctis 450 with Kraken X61 @ Kitguru
- RAIDMAX Viper GX II @ Benchmark Reviews
- Antec ISK 110 90w Mini-ITX VESA Compatible Chassis @ eTeknix
- Origin Genesis PC Chassis Review – Part 2 @ Modders-Inc
- Corsair Carbide 100R Case Review @ Hardware Canucks
Subject: General Tech | May 19, 2015 - 01:09 PM | Jeremy Hellstrom
Tagged: dd-wrt, openwrt, linux, linksys, WRT1900AC
Regular listeners to the PCPer Podcast should be aware of the DD-WRT project to root and take control of your router, as we have mentioned it multiple times, along with a related project called OpenWrt. If you have not looked into how to flash a router with one or the other of these OS/firmware packages, then this article at Linux.com is something you should take a look at. They walk you through the steps of taking over a Linksys WRT1900AC router, from straight out of the box to final configuration. They also show you the advantages of running a router on OpenWrt and offer ideas for taking it further. Check it out right here.
"The Linksys WRT1900AC is a top-end modern router that gets even sweeter when you unleash Linux on it and install OpenWrt. OpenWrt includes the opkg package management system giving you easy access to a great deal of additional open source software to use on your router."
Here is some more Tech News from around the web:
- Doom is BOOM! BOOM! BACK! @ The Register
- Trojanized, Info-Stealing PuTTY Version Lurking Online @ Slashdot
- Google Google GOOGLE! Cloud cloud CLOUD! These prices are INNNSAAAANE! @ The Register
- Microsoft: Free Windows 10 for THIEVES and PIRATES? They can GET STUFFED @ The Register
- Open source power-up on the way for arcade game emulator MAME @ The Register
- TP-Link Archer C9 @ HardwareHeaven
- Tech ARP 2015 Mega Giveaway : Mi In-Ear Headphones
- Win great prizes with be quiet! and KitGuru!
Subject: Mobile | May 18, 2015 - 02:00 PM | Sebastian Peak
Tagged: ZenFone 2, smartphones, intel atom, atom z3580, asus
ZenFone 2 is the new flagship smartphone from ASUS, featuring a new design powered by an Intel Atom Z3580 (Moorefield) processor with a massive 4GB of RAM.
The phone has a 5.5-inch 1080p IPS display with 403 PPI for crisp scaling, and the “Ergonomic Arc” design includes a volume-control key on the rear of the phone “within easy reach of the user's index finger”, with a curved profile that tapers to 0.15 inches at the edges.
The phone also features a 13 MP PixelMaster rear camera with an f/2.0 aperture and claimed “zero shutter-lag”. The battery weighs in at 3000mAh and features “BoostMaster” fast-charge technology, which sounds similar to Qualcomm's Quick Charge 2.0 standard.
But one of the most attractive features will be price, as ASUS will be selling these online through their retail channels as affordable unlocked smartphones:
- 2GB RAM / 16GB storage / Atom Z3560 - $199
- 4GB RAM / 64GB storage / Atom Z3580 / quick charger - $299
Here's a look at the specifications:
- CPU: Intel Quad-Core 64-bit Atom Z3580 @ 2.3GHz (Min Clock 333MHz, Max Clock 2333MHz)
- GPU: PowerVR Series 6 G6430 with OpenGL 3.0 Support (Min Clock 457MHz, Max Clock 533MHz)
- Display: 5.5in IPS, 1920x1080 resolution (403 PPI), Corning Gorilla Glass 3 with Anti-Fingerprint Coating
- Memory: 4GB 800 MHz LPDDR3
- Storage: 64GB eMMC
- SIM: Dual active micro-SIM support
- Micro-SD slot: SDXC support up to 128GB
- Modem: Intel XMM7260 LTE-Advanced
- FDD LTE 1/2/3/4/5/7/8/9/17/18/19/20/28/29
- TDD LTE 38/39/40/41
- WCDMA 850/900/1900
- TD-SCDMA 1900/2100
- EDGE/GPRS/GSM 850/900/1800/1900
- Wireless: WiFi 802.11a/b/g/n/ac
- Rear Camera: 13MP, aperture f/2.0, sensor size 1/3.2 inch
- Front Camera: 5MP
- Maximum Video Resolution: 1080p/30
- Battery: 3000 mAh Lithium-Polymer (11.4 Wh), Boostmaster Fast-Charging
- Colors: Glacier Gray, Osmium Black, Glamour Red, Sheer Gold
- Dimensions: 152.5 mm x 77.2 mm x 10.9-3.9 mm (6 x 3.04 x 0.43-.15 inches)
- Weight: 170g
PR after the break.
Subject: Mobile | May 18, 2015 - 02:00 PM | Sebastian Peak
Tagged: zenbook pro, zenbook, UX501, UX305, QHD+, notebooks, ips, asus, 4k, 2560x1440
ASUS has announced a new QHD+ version of the affordable ZenBook UX305 notebook, as well as the new ZenBook Pro UX501.
The ZenBook UX305 was released as a disruptive notebook with specs far above its $699 price tag, and this new version goes far beyond the 1920x1080 screen resolution of the original. The new QHD+ (3200x1800) panel is IPS just like the original, but at this ultra-high resolution it boasts 276 PPI, for either incredibly sharp or incredibly tiny text depending on how well your applications scale.
The new ZenBook Pro UX501 takes resolution a step further with a 4K/UHD 3840x2160 IPS panel, backed by a powerful quad-core Intel Core i7-4720HQ processor with 16GB of RAM at its disposal. NVIDIA GeForce GTX 960M graphics power this 15.6-inch, 282 PPI UHD panel, and naturally PCIe x4 storage is available as well.
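The PPI figures quoted for these panels follow from dividing the diagonal pixel count by the diagonal screen size. A quick check (the UX501's 15.6-inch size is stated above; the UX305's 13.3-inch size is an assumption, as the article does not state it):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3200, 1800, 13.3)))  # ZenBook UX305 QHD+ -> 276
print(round(ppi(3840, 2160, 15.6)))  # ZenBook Pro UX501 UHD -> 282
```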
More information and specs are available in the full PR for both notebooks after the break.
Subject: Graphics Cards | May 17, 2015 - 12:04 PM | Ryan Shrout
Tagged: The Witcher 3, nvidia, hairworks, gameworks, amd
I feel like every few months I get to write more stories focusing on the exact same subject. It's almost as if nothing in the enthusiast market is happening and thus the cycle continues, taking all of us with it on a wild ride of arguments and valuable debates. Late last week I started hearing from some of my Twitter followers that there were concerns surrounding the upcoming release of The Witcher 3: Wild Hunt. Then I found a link to this news post over at Overclock3d.net that put some of the information in perspective.
Essentially, The Witcher 3 uses parts of NVIDIA's GameWorks development tools and APIs, software written by NVIDIA to help game developers take advantage of new technologies and to quickly and easily implement them into games. The problem of course is that GameWorks is written and developed by NVIDIA. That means that optimizations for AMD Radeon hardware are difficult or impossible, depending on who you want to believe. Clearly it doesn't benefit NVIDIA to optimize its software for AMD GPUs financially, though many in the community would like NVIDIA to give a better effort - for the good of said community.
Specifically, in regard to The Witcher 3, the game implements NVIDIA HairWorks technology to add realism to many of the creatures of the game world. (Actually, the game includes HairWorks, HBAO+, PhysX, Destruction and Clothing, but our current discussion focuses on HairWorks.) All of the marketing and video surrounding The Witcher 3 has been awesome, and the realistic animal fur simulation has definitely been a part of it. However, it appears that AMD Radeon GPU users are concerned that performance with HairWorks enabled will suffer.
An example of The Witcher 3: Wild Hunt with HairWorks
One of the game's developers has been quoted as such:
Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.
There are several interpretations of this statement floating around the web. The first, and most inflammatory, is that NVIDIA is preventing CD Projekt RED from optimizing the feature by withholding source code. Another is that CD Projekt is choosing not to optimize for AMD hardware due to time constraints. The last is that optimizing it simply isn't possible because of hardware limitations relative to HairWorks' demands.
I went to NVIDIA with these complaints about HairWorks and Brian Burke gave me this response:
We are not asking game developers [to] do anything unethical.
GameWorks improves the visual quality of games running on GeForce for our customers. It does not impair performance on competing hardware.
Demanding source code access to all our cool technology is an attempt to deflect their performance issues. Giving away your IP, your source code, is uncommon for anyone in the industry, including middleware providers and game developers. Most of the time we optimize games based on binary builds, not source code.
GameWorks licenses follow standard industry practice. GameWorks source code is provided to developers that request it under license, but they can’t redistribute our source code to anyone who does not have a license.
The bottom line is AMD’s tessellation performance is not very good and there is not a lot NVIDIA can/should do about it. Using DX11 tessellation has sound technical reasoning behind it, it helps to keep the GPU memory footprint small so multiple characters can use hair and fur at the same time.
I believe it is a resource issue. NVIDIA spent a lot of artist and engineering resources to help make Witcher 3 better. I would assume that AMD could have done the same thing because our agreements with developers don’t prevent them from working with other IHVs. (See also, Project Cars)
I think gamers want better hair, better fur, better lighting, better shadows and better effects in their games. GameWorks gives them that.
Interesting comments for sure. The essential takeaway is that HairWorks depends heavily on tessellation performance, and we have known since the GTX 680 was released that NVIDIA's architecture handles tessellation better than AMD's GCN, often by a significant amount. NVIDIA developed its middleware to utilize the strengths of its own GPU technology, not (though it's clear that some disagree) to negatively impact AMD. Did NVIDIA know that would be the case when it was developing the software? Of course it did. Should it have done something to help AMD GPUs fall back more gracefully? Maybe.
Next, I asked Burke directly whether claims that NVIDIA was preventing AMD or the game developer from optimizing HairWorks for other GPUs and platforms were true. I was told that both AMD and CD Projekt had the ability to tune the game, but in different ways. The developer could change the tessellation density based on the specific GPU detected (lower for a Radeon GPU with less tessellation capability, for example), but that would require dedicated engineering from either CD Projekt or AMD. AMD, without access to the source code, should be able to make changes in the driver at the binary level, similar to how most other driver optimizations are built. Burke states that in these instances NVIDIA often sends engineers to work with game developers, and that AMD "could have done the same had it chosen to." And again, NVIDIA reiterated that in no way do its agreements with game developers prohibit optimization for AMD GPUs.
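The per-GPU tessellation fallback Burke describes could look something like the sketch below. This is purely illustrative: the function name and factor values are hypothetical, not CD Projekt RED's or NVIDIA's code. The PCI vendor IDs, however, are the real ones (0x10DE for NVIDIA, 0x1002 for AMD):

```python
# Hypothetical sketch of the fallback Burke describes: scale back
# HairWorks tessellation density on GPUs with weaker tessellation
# throughput. Vendor IDs are real PCI IDs; the factor values are
# illustrative, not taken from any shipping game.
NVIDIA_VENDOR_ID = 0x10DE
AMD_VENDOR_ID = 0x1002

def hairworks_tessellation_factor(pci_vendor_id: int) -> int:
    if pci_vendor_id == NVIDIA_VENDOR_ID:
        return 64   # full density on hardware with strong tessellation
    if pci_vendor_id == AMD_VENDOR_ID:
        return 16   # reduced density as a graceful fallback
    return 8        # conservative default for unknown hardware

print(hairworks_tessellation_factor(0x1002))  # -> 16
```

The point of the sketch is that the switch lives in the game (or driver), which is why Burke frames it as engineering work that either CD Projekt or AMD could have done.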
It would also have been possible for AMD to push for the implementation of TressFX alongside HairWorks; a similar scenario played out in Grand Theft Auto V, where vendor-specific technologies from both NVIDIA and AMD were included and exposed through in-game settings.
NVIDIA has never been accused of being altruistic; it doesn't often create things and then share them with open arms with the rest of the hardware community. But it has to be understood that game developers know this as well - they are not oblivious. CD Projekt knew that HairWorks performance on AMD would be poor but decided to implement the technology in The Witcher 3 anyway. They were willing to accept performance penalties for some users to improve the experience of others. You can argue that this was not the best choice, but at the very least The Witcher 3 will let you disable the HairWorks feature completely, removing it from the performance debate altogether.
In a perfect world for consumers, NVIDIA and AMD would walk hand-in-hand through the fields, developing hardware and software in tandem and making sure all users get the best possible experience with all games. But that style of cooperation is only helpful (from a business perspective) to the organization attempting to gain market share, not the one with the lead. NVIDIA doesn't have to do it and chooses not to. If you don't want to support that approach, vote with your wallet.
Another similar controversy surrounded the recent release of Project Cars. AMD GPU performance was significantly lower than that of comparable NVIDIA GPUs, even though the game does not implement any GameWorks technologies. In that case, the game's developer directly blamed AMD's drivers, saying that a lack of outreach from AMD caused the issues. AMD has since recanted its stance that the performance delta was "deliberate" and says a pending driver update will address gamers' performance issues.
All arguing aside, this game looks amazing. Can we all agree on that?
The only conclusion I can come to from all of this is that if you don't like what NVIDIA is doing, that's your right - and you aren't necessarily wrong. There will be plenty of readers who see the comments made by NVIDIA above and continue to believe that the company is at best disingenuous and at worst straight-up lying. As I mentioned above, NVIDIA is still a for-profit company responsible to shareholders for profit and growth. And in today's world that sometimes means working against other companies rather than with them, resulting in impressive new technologies for its customers and pushback from competitors' customers. It's not fun, but that's how it works today.
Fans of AMD will point to G-Sync, GameWorks, CUDA, PhysX, FCAT and even SLI as indications of NVIDIA's negative impact on open PC gaming. I would argue that many more users would look at that list and see improvements to PC gaming, progress that helps make gaming on a computer so much better than gaming on a console. The truth likely rests somewhere in the middle; there will always be individuals who immediately side with one company or the other. But it is the much larger group in the middle, which shows no corporate allegiance and instead just wants to have as much fun as possible with gaming, that will impact NVIDIA and AMD the most.
So, since I know it will happen anyway, use the comments page below to vent your opinion. But, for the benefit of us all, try to keep it civil!
Subject: Mobile | May 16, 2015 - 01:00 PM | Ryan Shrout
Tagged: Tegra X1, tegra, shield pro, shield console, shield, nvidia
UPDATE: Whoops! It appears that Amazon took the listing down... No surprise there. I'm sure we'll be seeing them again VERY SOON. :)
Looks like the release of the new NVIDIA SHIELD console device, first revealed back at GDC in March, is nearly here. A listing for "NVIDIA SHIELD" as well as the new "NVIDIA SHIELD Pro" showed up on Amazon.com today.
Though we don't know officially what the difference between the SHIELD and SHIELD Pro is, according to Amazon at least it appears to be internal storage. The Pro model will ship with 500GB of internal storage, while the non-Pro model will have only 16GB. It seems you'll have to get an SD card for more storage on the base model if you plan on doing anything other than streaming games through NVIDIA GRID.
No pricing is listed yet and there is no release date on the Amazon pages either, but we have always been told this was to be a May or June on-sale date. Both models of the NVIDIA SHIELD will include an HDMI cable, a micro-USB cable and a SHIELD Controller. If you want the remote or stand, you're going to have to pay out a bit more.
For those of you that missed out on the original SHIELD announcement from March, here is a quick table detailing the specs, as we knew them at that time. NVIDIA's own Tegra X1 SoC featuring 256 Maxwell GPU cores powers this device using the Android TV operating system, promising 4K video playback, the best performing Android gaming experience and NVIDIA GRID streaming games.
NVIDIA SHIELD Specifications:
- Processor: NVIDIA Tegra X1 with 256-core Maxwell GPU, 3GB RAM
- Video Features: 4K Ultra-HD ready with 4K playback and capture up to 60 fps (VP9, H.265, H.264)
- Audio: 7.1 and 5.1 surround sound pass-through over HDMI; high-resolution audio playback up to 24-bit/192kHz over HDMI and USB; high-resolution audio upsampling to 24-bit/192kHz over USB
- Wireless and I/O: 802.11ac 2x2 MIMO 2.4 GHz and 5 GHz Wi-Fi; two USB 3.0 (Type A) ports; MicroSD slot (supports 128GB cards); IR receiver (compatible with Logitech Harmony)
- Gaming Features: NVIDIA GRID streaming service
- SW Updates: SHIELD software upgrades directly from NVIDIA
- Power: 40W power adapter
- Weight and Size: 23oz / 654g; height 5.1in / 130mm; width 8.3in / 210mm; depth 1.0in / 25mm
- OS: Android TV, Google Cast ready
- In the box: NVIDIA SHIELD, SHIELD controller, HDMI cable (High Speed), USB cable (Micro-USB to USB), power adapter (includes plugs for North America, Europe, UK)
- Requirements: TV with HDMI input, Internet access
- Options: SHIELD controller, SHIELD remote, SHIELD stand
Subject: General Tech | May 15, 2015 - 11:56 PM | Scott Michaud
Tagged: raptr, pc gaming
The PC gaming utility, Raptr, is normally used to optimize in-game settings, chat and socialize, and record game footage. It also keeps track of game-hours and aggregates them into a list once per month, which makes it one of the few sources for this type of data on the PC. We were late on it last month, which means that another was posted just a week later.
April marks the release of Grand Theft Auto V for the PC. It went live on the 14th and, despite only counting for half of the month, ended up in 4th place, just 0.17% of global play time behind CS:GO. Next month's survey, which will count Grand Theft Auto V for a full month, will tell us whether that doubled window is enough to counter the post-release drop-off. Despite an error on the graph, it knocked DOTA 2 down to fifth and Diablo III down to sixth. In fact, just about everything below Grand Theft Auto V dropped at least one rank.
Only three games actually gained places this month: ArcheAge, Warframe, and Spider Solitaire. Yes, that game is now the 19th most played, as tracked by Raptr. You could sort-of say that Hearthstone gained a rank by not losing one, but you would be wrong. Why would you say that?
League of Legends dropped less than a percent of total play time, settling in at about 21%. This is just about on target for the game, which proves that not even Rockstar can keep people from having a Riot.
Subject: Graphics Cards | May 15, 2015 - 11:36 PM | Scott Michaud
Tagged: windows 10, geforce, graphics drivers, nvidia, whql
The last time NVIDIA released a graphics driver for Windows 10, they added a download category to their website for the pre-release operating system. Since about January, graphics driver updates have been pushed through Windows Update, and before that you needed to use Windows 8.1 drivers. Receiving drivers from Windows Update also meant that add-ons, such as the PhysX runtimes and GeForce Experience, would not be bundled with them. I know that some have installed them separately, but I didn't.
The 352.84 release, which is their second Windows 10 driver to be released outside of Windows Update, is also certified by WHQL. NVIDIA has recently been touting Microsoft certification for many of their drivers. Historically, they released a large number of Beta drivers that were stable, but did not wait for Microsoft to vouch for them. For one reason or another, they have put a higher priority on that label, even for “Game Ready” drivers that launch alongside a popular title.
For some reason, the driver is only available via GeForce Experience and NVIDIA.com, but not GeForce.com. I assume NVIDIA will publish it there soon, too.
Subject: Graphics Cards, Processors, Displays, Systems | May 15, 2015 - 03:02 PM | Scott Michaud
Tagged: Oculus, oculus vr, nvidia, amd, geforce, radeon, Intel, core i5
Today, Oculus published a list of the hardware they believe should drive their VR headset. The Oculus Rift will obviously run on lesser hardware; the minimum specifications published last month, which focused on the Development Kit 2, did not even list a specific CPU or GPU -- just a DVI-D or HDMI output. They went on to say that you really should use a graphics card that can handle your game at 1080p with at least 75 fps.
The current list is a little different:
- NVIDIA GeForce GTX 970 / AMD Radeon R9 290 (or higher)
- Intel Core i5-4590 (or higher)
- 8GB RAM (or higher)
- A compatible HDMI 1.3 output
- 2x USB 3.0 ports
- Windows 7 SP1 (or newer).
I am guessing that, unlike with the previous list, Oculus has a clearer vision of a development target. They were a little unclear about whether this refers to the consumer version or to the current needs of developers. In either case, it will likely serve as a guide for what developers should target when the consumer version launches.
This post also coincides with the release of the Oculus PC SDK 0.6.0. This version pushes distortion rendering to the Oculus Server process, rather than the application. It also allows multiple canvases to be sent to the SDK, which means developers can render text and other noticeable content at full resolution, but scale back in places that the user is less likely to notice. They can also be updated at different frequencies, such as sleeping the HUD redraw unless a value changes.
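The per-layer idea described above can be sketched in a few lines. This is a minimal conceptual sketch, not the actual Oculus SDK API -- all class and function names here are hypothetical. It shows two canvases updated at different frequencies: the scene canvas is redrawn every frame (and can be rendered at reduced resolution), while the HUD canvas "sleeps" and only re-renders when its displayed value changes.

```python
# Hypothetical sketch of per-layer update frequencies, as described for
# Oculus PC SDK 0.6.0. None of these names come from the real SDK.

class Canvas:
    """A render target that tracks how often it is actually redrawn."""
    def __init__(self, name, scale=1.0):
        self.name = name
        self.scale = scale      # 1.0 = full resolution; <1.0 = scaled back
        self.redraws = 0

    def render(self):
        self.redraws += 1       # stand-in for real drawing work

class HudCanvas(Canvas):
    """Full-resolution text layer that only redraws when its value changes."""
    def __init__(self):
        super().__init__("hud", scale=1.0)
        self._last_value = None

    def update(self, value):
        if value != self._last_value:   # sleep the HUD redraw otherwise
            self._last_value = value
            self.render()

def run_frames(hud_values):
    scene = Canvas("scene", scale=0.8)  # 3D scene scaled back slightly
    hud = HudCanvas()
    for value in hud_values:
        scene.render()                  # scene redrawn every frame
        hud.update(value)               # HUD redrawn only on change
    return scene.redraws, hud.redraws

# Over 5 frames where the HUD value changes twice, the scene renders
# 5 times but the HUD only twice.
scene_redraws, hud_redraws = run_frames([100, 100, 99, 99, 99])
```

The payoff is the same one Oculus describes: readable content stays sharp at full resolution while the expensive work (scene rendering, HUD redraws) is scaled or skipped where the user is unlikely to notice.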
The Oculus PC SDK (0.6.0) is now available at the Oculus Developer Center.
Subject: General Tech | May 15, 2015 - 02:01 PM | Jeremy Hellstrom
Tagged: audio, ozone, Rage ST, gaming headset
With a price tag of $40, many may be a bit leery of purchasing the Ozone Rage ST headset, as it is significantly cheaper than most gaming headsets, which implies lower quality too. It does use the 40mm drivers common to most headsets, with a frequency response of 20Hz - 20kHz, but the microphone is omnidirectional as opposed to unidirectional, which means it will pick up background noise. Modders-Inc tried it out and were pleasantly surprised: while it has none of the extra features that $100+ headsets do, the overall quality was worth the price of admission. If you need a headset but are strapped for cash, these are a good choice.
"Despite the stereotype, gamers are social creatures too. Competitive games after all requires another person to play with, but as expressive as some gestures may be such as virtual teabagging, it is not nearly as effective in conveying what you really feel when you shout out expletives through a headset. It feels very natural in fact that one almost feels …"
Here is some more Tech News from around the web:
- AKG K553 Pro Studio Headphones Review @ Hardware Canucks
- Razer Seiren Review at HardwareHeaven
- Tesoro Kuven Pro @ HardwareHeaven
- Creative Sound Blaster ZxR @ HardwareHeaven
- Sound Blaster X7 @ HardwareHeaven
Subject: Mobile | May 15, 2015 - 01:56 PM | Ryan Shrout
Tagged: video, mali, jem davies, interview, arm
Have you ever wondered how a mobile GPU is born? Or how the architecture of a mobile GPU like ARM Mali differs from the technology in your discrete PC graphics card? Perhaps you just want to know if ideas like HBM (high bandwidth memory) are going to find their way into the mobile ecosystem any time soon?
Josh and I sat down (virtually) with ARM's VP of Technology and Fellow, Jem Davies, to answer these questions and quite a bit more. The resulting interview will shed light on the design process of a mobile GPU, how you get the most out of an SoC that measures power by the milliwatt, what the world of mobile benchmarking needs to do to clean up its act and quite a bit more.
You'd be hard-pressed to find a better way to spend the next hour of your day, as you will without a doubt walk away better informed about the world of smartphones, tablets, and GPUs.
Subject: General Tech | May 15, 2015 - 01:02 PM | Jeremy Hellstrom
Tagged: rumours, msi, Lenovo
When you think of Lenovo laptops you tend to think of suits and office suites, not Cheetos and Red Bull, but DigiTimes has heard tell that this could change. With Acer, Asustek's ROG, and Dell's Alienware lineups all seeing decent profits from the niche market of high-end gaming laptops, the rumour is that Lenovo would like in on some of that filthy lucre. DigiTimes' source posits that MSI's gaming laptop division would be the obvious target for Lenovo. It is possible that this is all hot air, but Lenovo is a huge company and could easily afford to buy a division of a competitor, if they were willing to sell.
"Micro-Star International (MSI) has been successful in selling gaming notebooks and Lenovo is interested in acquiring MSI's gaming notebook business unit, according to sources from supply chain makers. However, MSI has denied the reports."
Here is some more Tech News from around the web:
- Elementary OS Freya: Is This The Next Big Linux Distro? @ Linux.com
- The Internet of Things: a jumbled mess or a jumbled mess? @ The Register
- Candy Crush Saga preloaded on Windows 10 is the key to enterprise sales @ The Inquirer
- Hackaday Prize Entry: A $100 CT Scanner @ Hack a Day
- You cannot be cirrus: 51 percent of Americans think storms bork cloud computing @ The Inquirer
Subject: Memory | May 14, 2015 - 07:16 PM | Jeremy Hellstrom
Tagged: Vengeance LPX, Dominator Platinum, ddr4, corsair, 128Gb
Corsair has just released the three largest unbuffered DDR4 kits available, for enthusiasts who can afford the asking price. Two 128GB Dominator Platinum kits, one clocked at 2400MHz and one at 2666MHz, along with a 2400MHz Vengeance LPX kit, have just gone on sale. All three kits consist of eight 16GB modules, which means the number of motherboards that can support them is extremely limited; the EVGA X99 Classified, ASRock's X99 Extreme4, and the Asus X99-E WS are among the few. As you can see below, the investment is rather high, but if you want bragging rights, or an amazingly large RAM drive, then Corsair has a solution for you.
Corsair, a worldwide leader in high-performance PC components, today announced the availability of the world’s first available 128GB DDR4 unbuffered memory kits. Available in Corsair’s Vengeance LPX and Dominator Platinum Series lines, the new 128GB capacities give content creators an unprecedented amount of high-speed DDR4 SDRAM for memory-hungry applications.
The 128GB (8 x 16GB) DDR4 memory kits are designed for the latest Intel X99 series motherboards and support XMP 2.0 for the ultimate compatibility, reliability, and performance. The first available kits are rated at speeds of 2666MHz and 2400MHz and higher speeds will be announced soon. Like all Corsair memory, the new kits are backed by a lifetime warranty.
Dominator Platinum Series 128GB DDR4 Memory
The most advanced memory kits available, the Dominator Platinum series DDR4 modules feature a striking industrial design for good looks, patented DHX technology for cooler operation, and user-swappable colored “light pipes” for customizable LED lighting. Dominator Platinum memory is built with hand-screened ICs, undergoes rigorous performance testing, and incorporates patented DHX cooling technology for reliable performance in demanding environments.
Vengeance LPX Series 128GB DDR4 Memory
Vengeance LPX memory is designed for high-performance overclocking with aluminum heatspreaders for faster heat dissipation and eight-layer PCB for superior overclocking headroom. Each IC is individually screened for performance potential.
Pricing and Lifetime Warranty
Corsair Dominator Platinum and Vengeance LPX DDR4 memory kits are available from Corsair.com and Corsair’s worldwide network of authorized distributors and resellers. All Corsair memory is backed with a limited lifetime warranty and Corsair customer service and technical support.