Leaked AMD Fiji Card Images Show Small Form Factor, Water Cooler Integration

Subject: Graphics Cards | May 22, 2015 - 09:39 AM |
Tagged: wce, radeon, Fiji, amd, 390x

UPDATE (5/22/15): Johan Andersson tweeted out this photo this morning, with the line: "This new island is one seriously impressive and sweet GPU. wow & thanks @AMDRadeon ! They will be put to good use :)"  Looks like we can confirm that at least one of the parts AMD is releasing does have the design of the images we showed you before, though the water cooling implementation is missing or altered.

repi-amd.jpg

END UPDATE

File this under "rumor" for sure, but a cool one nonetheless...

After yesterday's official tidbit of information surrounding AMD's upcoming flagship graphics card for enthusiasts and its use of HBM (high bandwidth memory), it appears we have another leak on our hands. The guys over at Chiphell have apparently gotten hold of images of the new Fiji flagship card (whether or not it will be called the 390X remains to be seen) and it looks...awesome.

390x1.jpg

In that post from yesterday I noted that with an HBM design AMD could, in theory, build an add-in card with a form factor different from anything we have previously seen in a high-end part. Based on the image above, if this turns out to be the high-end Fiji offering, it appears the PCB will indeed be quite small, as it no longer requires memory surrounding the GPU itself. You can also see that it will in fact be water cooled, though in this image it looks like it has barbed inlets rather than a pre-attached cooler.

390x2.jpg

The second leaked image shows display outputs consisting of three full-size DisplayPort connections and a single HDMI port.

All of this could be faked of course, but if it is, the joker did a damn good job of compiling all the information into one design. If it's real, I think AMD might finally have a match for the look and styling of the high-end GeForce offerings.

What do you think: real or fake? Cool or meh? Let us know!

Source: Chiphell

Just Arrived: MSI 990FXA-Gaming Motherboard

Subject: Motherboards | May 21, 2015 - 11:34 PM |
Tagged: msi, amd, 990fx, FX-8370, FX-9590, sli, crossfire, SoundBlaster, killer nic, usb 3.1

Several weeks ago MSI officially announced the 990FXA-Gaming motherboard for the AM3+ market.  The board is based on the tried and true 990FX and SB950 combo, but it adds a new wrinkle to the game: USB 3.1 support.  MSI is also behind the only other AMD-based USB 3.1 board on the market, the 970 Krait.

msi_990fxa_g_01.jpg

Quite a few people were excited about this part, as the AM3+ market has been pretty stagnant as of late.  This is not necessarily surprising considering that AMD has not launched a new AM3+ chip since the Fall of 2014, when we got a couple of "efficiency" chips as well as the slightly faster FX-8370.

msi_990fxa_g_02.jpg

There was some speculation, based on early photographs, that the board could have a more robust power delivery system than previous AM3+ boards, but alas, that is not the case.  Upon closer inspection it appears that MSI has gone the 6+2 phase route.  With good quality components in there you can potentially run the 220 watt TDP FX-9000 series parts, but these puppies are not officially supported.  In fact, I received an email saying that I might want to be really careful in my choice of CPUs and extremely careful when overclocking.

The board still has real potential as a really nice home for the 125 watt TDP and below parts.  The audio portion looks very well designed and features SoundBlaster Cinema 2.  It supports both SLI and CrossFire at a native x16/x16 (three cards are highly doubtful given the way the slots are configured).  It has the Killer NIC Ethernet suite, which may or may not be a selling point, depending on who you ask.

msi_990fxa_g_03.jpg

Overall the board is an interesting addition to the club, but I really wouldn't trust it with the FX-9000 series chips.  I have a 970 Gaming, which has a similar power delivery system, that came bundled with the FX-9590, and it ran like a champ, so there is a possibility that this board will handle the combination.  It is going to be installed this weekend and I will start the benchmarking!  Stay tuned!

Source: MSI

Could it be a 980 Ti, or does the bill of lading lie?

Subject: Graphics Cards | May 21, 2015 - 07:33 PM |
Tagged: rumour, nvidia, 980 Ti

980ti.PNG

The source of leaks and rumours is often unexpected, such as this import data for a shipment headed from China into India.  Could this 6GB card be the GTX 980 Ti that so many have theorized would arrive sometime around AMD's release of their new cards?  Does the fact that 60,709 Indian Rupees equals about $954.45 US put a damper on your excitement, or could it be that these six lonely cards will sell at a higher price overseas than they might in the US?

We don't know but we do know there is a mysterious card out there somewhere.

Source: Zauba

Podcast #350 - AMD's plan for HBM, IPS G-SYNC, GameWorks and The Witcher 3, and more!

Subject: Editorial | May 21, 2015 - 03:34 PM |
Tagged: podcast, video, amd, hbm, Fiji, g-sync, ips, XB270HU, corsair, Oculus, supermicro, asus, gladius, jem davies, arm, mali

PC Perspective Podcast #350 - 05/21/2015

Join us this week as we discuss AMD's plan for HBM, IPS G-SYNC, GameWorks and The Witcher 3, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Stop by for the BenQ XL2730Z FreeSync display, stay for the conversations

Subject: Displays | May 21, 2015 - 01:44 PM |
Tagged: XL2730Z, freesync, benq, amd

Ryan wasn't the only one to test BenQ's XL2730Z 27-in 2560x1440 144 Hz FreeSync monitor; The Tech Report also had a chance to test one, as well as talk to NVIDIA's Tom Petersen about their competing technology.  They also discussed FreeSync in general with AMD's David Glen, one of the engineers behind FreeSync.  Their benchmarks and overall impressions of the display's capabilities and of FreeSync in general make up a major portion of the review, but the discussions with the two company representatives make for even more interesting reading.

img.x.jpg

"AMD's FreeSync is here, personified in BenQ's XL2730Z monitor. We've gone deep into the display's performance and smoothness, with direct comparisons to G-Sync using 240-fps video. Here's what we found."

Here are some more Display articles from around the web:

Displays

Running an EXT4 RAID on the Linux 4.0 kernel? Better spray for bugs!

Subject: General Tech | May 21, 2015 - 12:40 PM |
Tagged: linux, EXT4, raid, bug

On Tuesday a bug was discovered to have been introduced to the Linux 4.0 kernel when a fix was added to deal with RAIDs where the chunk size is not a power of 2, a problem present since Linux 3.14-rc1.  The fix has been causing corruption of those RAIDs and the file systems on them, making for many an unhappy Arch Linux user.  Only users of rolling release flavours are affected; distros with scheduled updates like RHEL or Ubuntu are not affected at this time.  The good news is that as of today there is a fix available if you wish to apply it, along with an explanation of the change which caused the issue.  Check out both at Phoronix.

linux-penguin-100055693-gallery.png

"A few days ago we reported on an EXT4 file-system corruption issue being discovered within the stable Linux 4.0 kernel series. The good news is the issue has been uncovered and a patch is available, but it could still be a few days before it starts getting sent out in stable updates."

Here is some more Tech News from around the web:

Tech Talk

Source: Phoronix

Introduction and First Impressions

Supermicro recently entered the consumer space with a new line of enthusiast motherboards and today we’re looking at a gaming enclosure from the well-known enterprise manufacturer.

supermicro_main.jpg

While many component manufacturers have diversified their product offerings to include everything from cooling fans to thumb drives, Supermicro is not a name that anyone familiar with the company would have suspected of following this trend. With recent Z97 and X99 motherboard offerings Supermicro has made an effort to enter the enthusiast market, with boards that don't exactly look like gaming products, but this is to be expected from a company that specializes in the enterprise market.

It was something of a surprise to hear that Supermicro had created a new enclosure for the consumer segment, and even more so to hear that it was to be a gaming enclosure. And while the term “gaming” gets thrown around quite a bit, the new enclosure does have the look we tend to associate with the moniker, with flashy red accents and a brushed aluminum front panel to go along with the all-black steel enclosure.

supermicro_cover.jpg

Continue reading our review of the Supermicro SuperChassis S5 enclosure!!

Love the NVMe, shame almost nobody can use it

Subject: Storage | May 20, 2015 - 02:48 PM |
Tagged: XP941, SSD 750, ssd, SM951, pcie, NVMe, MZVPV512HDGL, AHCI

For owners of Z97 or X99 boards with updated UEFIs, or of the rare SFF-8643 connector for the 2.5" version, booting from NVMe is possible; for the rest, the Intel SSD 750 will have to be a storage drive.  Al recently looked at this more than impressive PCIe SSD and now [H]ard|OCP has had a bash at it.  The review is certainly worth checking out as some of their tests, especially the real world ones, differ from the benchmarks that Al used.  This will give you more information about how the new SSD will handle your workloads, research worth doing if you are thinking of spending $1,055 on the 1.2TB model.

1428498730H4WON14xlV_1_1.jpg

"Intel is set to be the catalyst for a long-awaited leap forward in storage technology with the new SSD 750 bringing NVMe storage to client PCs for the first time, and turning the high end SSD space upside-down. We are expecting blinding IOPs and we dig in to find out what it can mean to the hardware enthusiast."

Here are some more Storage reviews from around the web:

Storage

Source: [H]ard|OCP

Wotcher new Witcher like?

Subject: General Tech | May 20, 2015 - 01:58 PM |
Tagged: The Witcher 3, gaming, CD Projekt RED

The new, and not quite as pretty as advertised, Witcher is here from CD Projekt RED, available from GOG among other places. Rock, Paper, Shotgun have started another of their ongoing diaries to share their experiences, so far involving a bare bum and the amazing Tutorial Man.  They also went straight for the dream sequence right off the bat; a smart move to get that over and done with in the early stages.  There will be more, as this is a very large game.  If you are looking for more detail on graphics settings than simply turning off the Vidal Sassoon option, there is a post here discussing the options they used, as well as the links below.

roadend.jpg

"I shall instead run a (mostly) in-character diary series covering my adventures in, presumably, just the earlier stages of CDP’s saucy roleplayer. But for the record, it runs OK if I turn Fancy Hair off but it has crashed twice so far."

Here is some more Tech News from around the web:

Gaming

Google just got noisier; tweets are returning to your search results

Subject: General Tech | May 20, 2015 - 01:02 PM |
Tagged: twitter, google

Several years back, Google thought it would be fun to include tweets in Google searches, and while they were smart to discontinue that, the reasoning behind ending it, Google+, may not have been as sound.  According to Slashdot, once again your searches for information on Google will be accompanied by 140-character posts of scintillating wisdom which will obviously impart far more knowledge than the citation you were looking for.  This should also do wonders for those looking to limit the perspectives and opinions they are exposed to, as dissenting views can easily be drowned out by tweets that reinforce your beliefs with only minor alterations to the text of your search.

On the plus side, one comment on Slashdot shows how to add operators back into your searches: just paste &tbs=li:1 onto the end of the URL once you have searched.

Add "&tbs=li:1" to your keyword search string. For example: https://www.google.com/search?q=%s&tbs=li:1

twitter-ios-logo.png

"Google will now begin showing tweets alongside search results. Mobile users searching via the Android/iOS apps or through the browser will start seeing the tweets immediately, while the desktop version is "coming shortly." The tweets will only be available for the searches in English to start, but Twitter says they'll be adding more languages soon."

Here is some more Tech News from around the web:

Tech Talk

Source: Slashdot

How about that High Bandwidth Memory

Subject: Graphics Cards | May 19, 2015 - 03:51 PM |
Tagged: memory, high bandwidth memory, hbm, Fiji, amd

Ryan and the rest of the crew here at PC Perspective are excited about AMD's new memory architecture and the fact that they will be first to market with it.  However, as any intelligent reader knows, a second opinion on the topic is worth finding.  Look no further than The Tech Report, who have also been briefed on AMD's new memory architecture.  Read on to see what they learned from Joe Macri, and their thoughts on this successor to GDDR5 as well as on HBM2, which is already in the works.

stack-diagram.jpg

"HBM is the next generation of memory for high-bandwidth applications like graphics, and AMD has helped usher it to market. Read on to find out more about HBM and what we've learned about the memory subsystem in AMD's next high-end GPU, code-named Fiji."

Here are some more Graphics Card articles from around the web:

Graphics Cards

A new and improved AiO watercooler from SilverStone, the Tundra TD02-E

Subject: Cases and Cooling | May 19, 2015 - 02:55 PM |
Tagged: Silverstone, Tundra TD02-E, CPU Water Block

SilverStone's Tundra TD02-E is a 240mm radiator, 278x124x27mm in total, with a hefty weight of 1.5kg which you should keep in mind when deciding where to place it.  We have had quite a few reviews of SilverStone's Tundra series, from Ryan's look at the original model 9 years ago to Morry's recent reviews of the TD02 and TD03.  The TD02-E is an updated model with newer tubing and a slimmer radiator, which [H]ard|OCP compared to previous models and other competitors.  Their testing showed performance equivalent to the initial model with reduced noise and, if you shop around, a reduced price as well.

1430051962VF0AJetlpn_1_1.jpg

"SilverStone is no stranger to us when it comes to All-In-One liquid CPU coolers. The new Tundra Series TD02-E AIO is SilverStone's updated version of its TD02 120mm dual fan cooler. SilverStone is not very clear on what exactly is better about this cooler, but we will put it through its paces to see if we have a better AIO."

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

Source: [H]ard|OCP

Time to give OpenWRT a shot?

Subject: General Tech | May 19, 2015 - 01:09 PM |
Tagged: dd-wrt, openwrt, linux, linksys, WRT1900AC

Regular listeners to the PCPer Podcast should be aware of the DD-WRT project to root and take control of your router, as we have mentioned it multiple times, along with a related project called OpenWrt.  If you have not looked into the process of flashing a router with one or the other of these OSes/firmware packages, then this article at Linux.com is something you should take a look at. They walk you through the steps of taking over a Linksys WRT1900AC router, from straight out of the box to final configuration.  They also give you a look at the advantages running a router on OpenWrt gives you, and ideas for taking it further.  Check it out right here.

372.jpg.png

"The Linksys WRT1900AC is a top-end modern router that gets even sweeter when you unleash Linux on it and install OpenWrt. OpenWrt includes the opkg package management system giving you easy access to a great deal of additional open source software to use on your router."

Here is some more Tech News from around the web:

Tech Talk

Source: Linux.com
Manufacturer: AMD

High Bandwidth Memory

UPDATE: I have embedded an excerpt from our PC Perspective Podcast that discusses the HBM technology that you might want to check out in addition to the story below.

The chances are good that if you have been reading PC Perspective or almost any other website that focuses on GPU technologies for the past year, you have read the acronym HBM. You might have even seen its full name: high bandwidth memory. HBM is a new technology that aims to turn the way a processor (GPU, CPU, APU, etc.) accesses memory upside down, almost literally. AMD has already publicly stated that its next generation flagship Radeon GPU will use HBM as part of its design, but it wasn't until today that we could talk about what HBM actually offers to a high performance processor like Fiji. At its core, HBM drastically changes how the memory interface works, how much power it requires, and what metrics we will use to compare competing memory architectures. AMD and its partners started working on HBM with the rest of the industry more than 7 years ago, and with the first retail product nearly ready to ship, it's time to learn about HBM.

We got some time with AMD’s Joe Macri, Corporate Vice President and Product CTO, to talk about AMD’s move to HBM and how it will shift the direction of AMD products going forward.

The first step in understanding HBM is to understand why it's needed in the first place. Current GPUs, including the AMD Radeon R9 290X and the NVIDIA GeForce GTX 980, utilize a memory technology known as GDDR5. This architecture has scaled well over the past several GPU generations, but we are starting to enter the world of diminishing returns. Balancing memory performance and power consumption is always a tough battle; just ask ARM about it. On the desktop component side we have much larger power envelopes to work inside, but the power curve that GDDR5 is on will hit a wall if you plot it far enough into the future. The result would be either drastically higher power consumption for graphics cards or stalled performance improvements in the graphics market – something we have not really seen in its history.

01-gddr5powergraph.jpg

While it’s clearly possible that current and maybe even next generation GPU designs could still have depended on GDDR5 as the memory interface, the move to a different solution is needed for the future; AMD is just making the jump earlier than the rest of the industry.

Continue reading our look at high bandwidth memory (HBM) architecture!!

ASUS Announces the ZenFone 2 Powered by Quad-Core Intel Atom

Subject: Mobile | May 18, 2015 - 02:00 PM |
Tagged: ZenFone 2, smartphones, intel atom, atom z3580, asus

The ZenFone 2 is the new flagship smartphone from ASUS, featuring a new design powered by an Intel Atom Z3580 (Moorefield) processor and a massive 4GB of RAM.

zenfone2_main.png

The phone has a 5.5-inch 1080p IPS display at 403 PPI for crisp scaling, and the “Ergonomic Arc” design includes a volume-control key on the rear of the phone “within easy reach of the user's index finger”, with a curved profile that tapers to 0.15 inches at the edges.

The phone also features a 13 MP PixelMaster camera with an f/2.0 aperture and claimed “zero shutter-lag”. The battery weighs in at 3000mAh and features “BoostMaster” fast-charge technology that sounds similar to Qualcomm’s Quick Charge 2.0 standard.

But one of the most attractive features will be price, as ASUS will be selling these online through their retail channels as affordable unlocked smartphones:

  • 2GB RAM / 16GB storage / Atom Z3560 - $199
  • 4GB RAM / 64GB storage / Atom Z3580 / QuickCharger - $299

Here's a look at the specifications:

  • CPU: Intel Quad-Core 64-bit Atom Z3580 @ 2.3GHz (Min Clock 333MHz, Max Clock 2333MHz)
  • GPU: PowerVR Series 6 G6430 with OpenGL 3.0 Support (Min Clock 457MHz, Max Clock 533MHz)
  • Display: 5.5in IPS, 1920x1080 resolution (403 PPI), Corning Gorilla Glass 3 with Anti-Fingerprint Coating
  • Memory: 4GB 800 MHz LPDDR3
  • Storage: 64GB eMMC
  • SIM: Support Dual active micro-SIM
  • Micro-SD slot: SDXC support up to 128GB
  • Modem: Intel XMM7260 LTE-Advanced
    • FDD LTE 1/2/3/4/5/7/8/9/17/18/19/20/28/29
    • TDD LTE 38/39/40/41
    • WCDMA 850/900/1900
    • TD-SCDMA 1900/2100
    • EDGE/GPRS/GSM 850/900/1800/1900
  • Wireless: WiFi 802.11a/b/g/n/ac
  • Rear Camera: 13MP, aperture f/2.0, sensor size 1/3.2 inch
  • Front Camera: 5MP
  • Maximum Video Resolution: 1080p/30
  • Battery: 3000 mAh Lithium-Polymer (11.4 Wh), Boostmaster Fast-Charging
  • Colors: Glacier Gray, Osmium Black, Glamour Red, Sheer Gold
  • Dimensions: 152.5 mm x 77.2 mm x 10.9-3.9 mm (6 x 3.04 x 0.43-0.15 inches)
  • Weight: 170g
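
As a quick sanity check on the battery spec above, the watt-hour figure is just amp-hours times nominal cell voltage. A minimal Python sketch, assuming a typical 3.8 V Li-polymer nominal voltage (our assumption, not an ASUS figure):

  # Watt-hours = amp-hours * nominal cell voltage (3.8 V assumed for Li-polymer)
  mah, nominal_v = 3000, 3.8
  print(mah / 1000 * nominal_v)  # 11.4 Wh, matching the listed capacity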

PR after the break.

Source: ASUS

ASUS Announces QHD ZenBook UX305, 4K UX501 Notebook

Subject: Mobile | May 18, 2015 - 02:00 PM |
Tagged: zenbook pro, zenbook, UX501, UX305, QHD+, notebooks, ips, asus, 4k, 2560x1440

ASUS has announced a new QHD+ version of the affordable ZenBook UX305 notebook, as well as the new ZenBook Pro UX501.

ux305_1.jpg

The ZenBook UX305 was released as a disruptive notebook with specs far above its $699 price tag, and this new version goes far beyond the 1920x1080 screen resolution of the original. This new QHD+ (3200x1800) panel is IPS just like the original, but with this ultra-high resolution it boasts 276 PPI, for either incredibly sharp or incredibly tiny text depending on how well your application scales.

The new ZenBook Pro UX501 takes resolution a step further with a 4K/UHD 3840x2160 IPS panel, and a powerful quad-core Intel Core i7-4720HQ processor with 16GB of RAM at its disposal. NVIDIA GeForce GTX 960M graphics power the 15.6-inch, 282 PPI UHD panel, and naturally PCIe x4 storage is available as well.
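
Both PPI figures fall out of simple diagonal math. A quick Python sketch (the UX305's 13.3-inch diagonal comes from ASUS's spec sheet, not the text above):

  from math import hypot

  def ppi(width_px, height_px, diagonal_inches):
      # Pixels per inch = diagonal resolution in pixels / diagonal size in inches
      return hypot(width_px, height_px) / diagonal_inches

  print(round(ppi(3200, 1800, 13.3)))  # 276 for the QHD+ ZenBook UX305
  print(round(ppi(3840, 2160, 15.6)))  # 282 for the UHD ZenBook Pro UX501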

ux501.png

More information and specs are available in the full PR for both notebooks after the break.

Source: ASUS
Subject: General Tech
Manufacturer: ASUS

Introduction and First Impressions

The ASUS ROG Gladius mouse features sleek styling and customizable lighting effects, but the biggest aspect is the underlying technology. With socketed Omron switches designed to be easily swapped, and an adjustable 6400 DPI optical sensor, this gaming mouse offers a lot on paper. So how does it feel? Let's find out.

gladius_box.jpg

There are a few aspects to the way a mouse feels, including the shape, surface material, and overall weight. Beyond the physical properties there is the speed and accuracy of the sensor (which also affects hand movement) and of course the mouse buttons and scroll wheel. Really, there's a lot going on with a modern gaming mouse - a far cry from the "X-Y position indicator" that the inventors had nicknamed "mouse" in the 1960s.

One of the hallmarks of the ASUS ROG (Republic of Gamers) lineup is the sheer amount of additional features the products tend to have. I use an ROG motherboard in my personal system, and even my micro-ATX board is stuffed with additional functionality (and the box is loaded with accessories). So it came as no surprise to me when I opened the Gladius mouse and began to look it over. Sure, the box contents aren't as numerous as one of the Maximus motherboards, but there's still quite a bit more than I've encountered with a mouse before.

gladius_top.jpg

Continue reading our review of the ASUS ROG Gladius Gaming Mouse!!

NVIDIA Under Attack Again for GameWorks in The Witcher 3: Wild Hunt

Subject: Graphics Cards | May 17, 2015 - 12:04 PM |
Tagged: The Witcher 3, nvidia, hairworks, gameworks, amd

I feel like every few months I get to write more stories focusing on the exact same subject. It's almost as if nothing else in the enthusiast market is happening, and thus the cycle continues, taking all of us with it on a wild ride of arguments and valuable debates. Late last week I started hearing from some of my Twitter followers that there were concerns surrounding the upcoming release of The Witcher 3: Wild Hunt. Then I found a link to this news post over at Overclock3D.net that put some of the information in perspective.

Essentially, The Witcher 3 uses parts of NVIDIA's GameWorks development tools and APIs, software written by NVIDIA to help game developers take advantage of new technologies and quickly and easily implement them into games. The problem, of course, is that GameWorks is written and developed by NVIDIA alone. That means optimizations for AMD Radeon hardware are difficult or impossible, depending on who you want to believe. Clearly it doesn't benefit NVIDIA financially to optimize its software for AMD GPUs, though many in the community would like NVIDIA to give a better effort - for the good of said community.

gwlogo.jpg

Specifically in regards to The Witcher 3, the game implements NVIDIA HairWorks technology to add realism to many of the creatures of the game world. (Actually, the game includes HairWorks, HBAO+, PhysX, Destruction and Clothing, but our current discussion focuses on HairWorks.) All of the marketing and video surrounding The Witcher 3 has been awesome, and the realistic animal fur simulation has definitely been a part of it. However, it appears that AMD Radeon GPU users are concerned that performance with HairWorks enabled will suffer.

An example of The Witcher 3: Wild Hunt with HairWorks

One of the game's developers has been quoted as saying:

Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.

There are at least a few interpretations of this statement floating around the web. The first, and most inflammatory, is that NVIDIA is not allowing CD Projekt RED to optimize HairWorks by withholding source code. Another is that CD Projekt RED is choosing not to optimize for AMD hardware due to time considerations. The last is that it simply isn't possible to optimize, because of hardware limitations exposed by HairWorks.

I went to NVIDIA with these complaints about HairWorks and Brian Burke gave me this response:

We are not asking game developers do anything unethical.
 
GameWorks improves the visual quality of games running on GeForce for our customers.  It does not impair performance on competing hardware.
 
Demanding source code access to all our cool technology is an attempt to deflect their performance issues. Giving away your IP, your source code, is uncommon for anyone in the industry, including middleware providers and game developers. Most of the time we optimize games based on binary builds, not source code.
 
GameWorks licenses follow standard industry practice.  GameWorks source code is provided to developers that request it under license, but they can’t redistribute our source code to anyone who does not have a license. 
 
The bottom line is AMD’s tessellation performance is not very good and there is not a lot NVIDIA can/should do about it. Using DX11 tessellation has sound technical reasoning behind it, it helps to keep the GPU memory footprint small so multiple characters can use hair and fur at the same time.
 
I believe it is a resource issue. NVIDIA spent a lot of artist and engineering resources to help make Witcher 3 better. I would assume that AMD could have done the same thing because our agreements with developers don’t prevent them from working with other IHVs. (See also, Project Cars)
 
I think gamers want better hair, better fur, better lighting, better shadows and better effects in their games. GameWorks gives them that.  

Interesting comments for sure. The essential takeaway is that HairWorks depends heavily on tessellation performance, and we have known since the GTX 680 was released that NVIDIA's architecture performs better than AMD's GCN at tessellation - often by a significant amount. NVIDIA developed its middleware to utilize the strengths of its own GPU technology, not (though it's clear that some disagree) to negatively impact AMD. Did NVIDIA know that would be the case when it was developing the software? Of course it did. Should it have done something to help AMD GPUs more gracefully fall back? Maybe.

Next, I asked Burke directly if claims that NVIDIA was preventing AMD or the game developer from optimizing HairWorks for other GPUs and platforms were true. I was told that both AMD and CD Projekt RED had the ability to tune the game, but in different ways. The developer could change the tessellation density based on the specific GPU detected (lower for a Radeon GPU with less tessellation capability, for example, as sketched below), but that would require dedicated engineering from either CD Projekt RED or AMD. AMD, without access to the source code, should be able to make changes in the driver at the binary level, similar to how most other driver optimizations are built. Burke states that in these instances NVIDIA often sends engineers to work with game developers, and that AMD "could have done the same had it chosen to."  And again, NVIDIA reiterated that in no way do its agreements with game developers prohibit optimization for AMD GPUs.
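
To make that fallback concrete, here is a purely illustrative Python sketch of a per-vendor tessellation cap; this is not CD Projekt RED's code or a real HairWorks API, and the specific numbers are ours:

  def hairworks_tess_factor(gpu_vendor, requested_factor):
      # Hypothetical fallback: cap tessellation density on GPUs that are weaker
      # at tessellation rather than disabling the effect outright. The 16x cap
      # mirrors the override AMD users can already apply in the driver.
      cap = 16 if gpu_vendor == "AMD" else 64
      return min(requested_factor, cap)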

It would also be possible for AMD to have pushed for the implementation of TressFX in addition to HairWorks; a similar scenario played out in Grand Theft Auto V where several vendor-specific technologies were included from both NVIDIA and AMD, customized through in-game settings. 

NVIDIA has never been accused of being altruistic; it doesn't often create things and then share them with open arms with the rest of the hardware community. But it has to be understood that game developers know this as well - they are not oblivious. CD Projekt RED knew that HairWorks performance on AMD would be poor but decided to implement the technology into The Witcher 3 anyway. They were willing to accept performance penalties for some users to improve the experience of others. You can argue that is not the best choice, but at the very least The Witcher 3 will let you disable the HairWorks feature completely, removing it from the performance debate altogether.

In a perfect world for consumers, NVIDIA and AMD would walk hand-in-hand through the fields and develop hardware and software in tandem, making sure all users get the best possible experience with all games. But that style of work is only helpful (from a business perspective) for the organization attempting to gain market share, not the one with the lead. NVIDIA doesn't have to do it and chooses not to. If you don't want to support that style, vote with your wallet.

Another similar controversy surrounded the recent release of Project Cars. AMD GPU performance was significantly lower than comparable NVIDIA GPUs, even though this game does not implement any GameWorks technologies. In that case, the game's developer directly blamed AMD's drivers, saying that it was a lack of reaching out from AMD that caused the issues. AMD has since recanted its stance that the performance delta was "deliberate" and says a pending driver update will address gamers' performance issues.

wicher3-1.jpg

All arguing aside, this game looks amazing. Can we all agree on that?

The only conclusion I can come to from all of this is that if you don't like what NVIDIA is doing, that's your right - and you aren't necessarily wrong. There will be plenty of readers who see the comments made by NVIDIA above and continue to believe that they are being at best disingenuous and at worst straight up lying. As I mentioned in my own comments above, NVIDIA is still a for-profit company that is responsible to shareholders for profit and growth. And in today's world that sometimes means working against other companies rather than with them, resulting in impressive new technologies for its customers and pushback from competitors' customers. It's not fun, but that's how it works today.

Fans of AMD will point to G-Sync, GameWorks, CUDA, PhysX, FCAT and even SLI as indications of NVIDIA's negative impact on open PC gaming. I would argue that more users would look at that list and see improvements to PC gaming, progress that helps make gaming on a computer so much better than gaming on a console. The truth likely rests somewhere in the middle; there will always be those individuals who immediately side with one company or the other. But it's the much larger group in the middle, which shows no corporate allegiance and instead just wants to have as much fun as possible gaming, that will impact NVIDIA and AMD the most.

So, since I know it will happen anyway, use the comments page below to vent your opinion. But, for the benefit of us all, try to keep it civil!

NVIDIA SHIELD and SHIELD Pro Show up on Amazon

Subject: Mobile | May 16, 2015 - 01:00 PM |
Tagged: Tegra X1, tegra, shield pro, shield console, shield, nvidia

UPDATE: Whoops! It appears that Amazon took the listing down... No surprise there. I'm sure we'll be seeing them again VERY SOON. :)

Looks like the release of the new NVIDIA SHIELD console device, first revealed back at GDC in March, is nearly here. A listing for "NVIDIA SHIELD" as well as the new "NVIDIA SHIELD Pro" showed up on Amazon.com today.

shield1.jpg

Though we don't officially know what the difference between the SHIELD and SHIELD Pro is, according to Amazon at least it appears to be internal storage. The Pro model will ship with 500GB of internal storage, while the non-Pro model will only have 16GB. You'll have to get an SD card for more storage on the base model if you plan on doing anything other than streaming games through NVIDIA GRID, it seems.

shield2.jpg

No pricing is listed yet and there is no release date on the Amazon pages either, but we have always been told this was to be a May or June on-sale date. Both models of the NVIDIA SHIELD will include an HDMI cable, a micro-USB cable and a SHIELD Controller. If you want the remote or stand, you're going to have to pay out a bit more.

shield3.jpg

For those of you that missed out on the original SHIELD announcement from March, here is a quick table detailing the specs, as we knew them at that time. NVIDIA's own Tegra X1 SoC featuring 256 Maxwell GPU cores powers this device using the Android TV operating system, promising 4K video playback, the best performing Android gaming experience and NVIDIA GRID streaming games.

NVIDIA SHIELD Specifications

Processor: NVIDIA® Tegra® X1 processor with 256-core Maxwell™ GPU and 3GB RAM
Video Features: 4K Ultra-HD Ready with 4K playback and capture up to 60 fps (VP9, H265, H264)
Audio: 7.1 and 5.1 surround sound pass-through over HDMI; high-resolution audio playback up to 24-bit/192kHz over HDMI and USB; high-resolution audio upsampling to 24-bit/192kHz over USB
Storage: 16 GB
Wireless: 802.11ac 2x2 MIMO 2.4 GHz and 5 GHz Wi-Fi; Bluetooth 4.1/BLE
Interfaces: Gigabit Ethernet; HDMI 2.0; two USB 3.0 (Type A); Micro-USB 2.0; microSD slot (supports 128GB cards); IR receiver (compatible with Logitech Harmony)
Gaming Features: NVIDIA GRID™ streaming service; NVIDIA GameStream™
SW Updates: SHIELD software upgrades directly from NVIDIA
Power: 40W power adapter
Weight and Size: 23 oz / 654 g; height 5.1 in / 130 mm, width 8.3 in / 210 mm, depth 1.0 in / 25 mm
OS: Android TV™, Google Cast™ Ready
Bundled Apps: PLEX
In the Box: NVIDIA SHIELD, SHIELD controller, HDMI cable (High Speed), USB cable (Micro-USB to USB), power adapter (includes plugs for North America, Europe, UK)
Requirements: TV with HDMI input, Internet access
Options: SHIELD controller, SHIELD remote, SHIELD stand

Source: Amazon.com

Raptr's Top PC Games of April 2015

Subject: General Tech | May 15, 2015 - 11:56 PM |
Tagged: raptr, pc gaming

The PC gaming utility, Raptr, is normally used to optimize in-game settings, chat and socialize, and record game footage. It also keeps track of game-hours and aggregates them into a list once per month, which makes it one of the few sources for this type of data on the PC. We were late on it last month, which means that another was posted just a week later.

caas-most_played_april-2015.jpg

April marks the release of Grand Theft Auto V for the PC. It went live on the 14th and, despite only counting for half of the month, ended up in 4th place, just 0.17% of global play time behind CS:GO. Next month's survey will tell us whether the post-release drop-off was countered by counting Grand Theft Auto V for a full month, double what it had this time. Despite an error on the graph, it knocked DOTA 2 down to fifth and Diablo III down to sixth. In fact, just about everything below Grand Theft Auto V dropped at least one rank.

Only three games actually gained places this month: ArcheAge, Warframe, and Spider Solitaire. Yes, that game is now the 19th most played, as tracked by Raptr. You could sort-of say that Hearthstone gained a rank by not losing one, but you would be wrong. Why would you say that?

League of Legends dropped less than a percent of total play time, settling in at about 21%. This is just about on target for the game, which proves that not even Rockstar can keep people from having a Riot.

Source: Raptr