GDC 14: OpenGL ES 3.1 Spec Released by Khronos Group

Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | March 17, 2014 - 09:01 AM |
Tagged: OpenGL ES, opengl, Khronos, gdc 14, GDC

Today, day one of Game Developers Conference 2014, the Khronos Group officially released the 3.1 specification for OpenGL ES. The main new feature, brought over from OpenGL 4.3, is the addition of compute shaders. This opens GPGPU functionality to mobile and embedded devices for applications developed in OpenGL ES, especially when the developer does not want to add an OpenCL dependency.

The update is backward-compatible with OpenGL ES 2.0 and 3.0 applications, allowing developers to add the new features, as available, to their existing apps. On the device side, the new functionality is expected to arrive as a driver update in the majority of cases.

opengl-es-logo.png

OpenGL ES (OpenGL for Embedded Systems, though it is rarely branded as such) delivers what the Khronos Group considers the most important features of the graphics library to the majority of devices. Khronos has been working to converge ES with the "full" graphics library over time. The last release, OpenGL ES 3.0, was designed to be a direct subset of OpenGL 4.3. This release expands the feature space it occupies.

OpenGL ES also forms the basis for WebGL. The current draft of WebGL 2.0 is based on OpenGL ES 3.0, although that was not discussed today. I have heard murmurs (not from Khronos) about some parties pushing for compute shaders in that specification, and this announcement puts us a step closer.

The new specification adds other features as well, such as the ability to issue a draw without CPU intervention. You could imagine a particle simulation, for instance, that wants to draw the results after its compute shader finishes. Shading is also less rigid: vertex and fragment shaders no longer need to be explicitly linked into a program object before they are used. I inquired about the possibility that compute devices could be targeted (on devices with two GPUs) and possibly load balanced, in a method similar to WebCL, but no confirmation or denial was provided (although my contact did mention that it would be interesting for apps that fall somewhere between OpenGL ES and OpenCL).
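The flow described above, a compute pass updating a buffer that a later "indirect" draw consumes without the CPU ever reading the results, can be sketched conceptually. The real thing is a C API with kernels written in GLSL; the Python below only emulates the dispatch and indirect-draw model, and every function and buffer name in it is illustrative, not part of any real API.

```python
# Conceptual sketch (NOT real OpenGL ES code): emulates an ES 3.1 compute
# dispatch updating a particle buffer, then a draw whose parameters come
# from a GPU-side buffer rather than the CPU.

LOCAL_SIZE = 64  # like layout(local_size_x = 64) in a GLSL compute shader

def dispatch_compute(num_groups, particles, dt):
    """Emulates glDispatchCompute(num_groups, 1, 1): each invocation
    integrates one particle, indexed by its global invocation ID."""
    for group in range(num_groups):
        for local_id in range(LOCAL_SIZE):
            gid = group * LOCAL_SIZE + local_id  # gl_GlobalInvocationID.x
            if gid < len(particles):
                pos, vel = particles[gid]
                particles[gid] = (pos + vel * dt, vel)

# (position, velocity) pairs standing in for a shader storage buffer.
particles = [(0.0, 1.0), (10.0, -2.0), (5.0, 0.5)]

# A GPU-side "indirect" buffer: the vertex count lives here, so the draw
# can read it without a CPU round trip.
indirect_buffer = {"count": len(particles), "instance_count": 1}

dispatch_compute(num_groups=1, particles=particles, dt=2.0)

# Analogous to an indirect draw: parameters are fetched from the buffer.
drawn = indirect_buffer["count"]
print(particles[0], drawn)  # → (2.0, 1.0) 3
```

The point of the indirect buffer is that a compute pass could just as easily have written `count` itself, which is exactly the "draw without CPU intervention" scenario the spec enables.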

The OpenGL ES 3.1 spec is available at the Khronos website.

Source: Khronos

AMD Radeon R9 Graphics Stock Friday Night Update

Subject: Graphics Cards | March 14, 2014 - 10:17 PM |
Tagged: radeon, R9 290X, r9 290, r9 280x, r9 280, amd

While sitting on the couch watching some college basketball I decided to start browsing Amazon.com and Newegg.com for some Radeon R9 graphics cards.  With all of the stock and availability issues AMD has had recently, this is a more frequent occurrence for me than I would like to admit.  Somewhat surprisingly, things appear to be improving for AMD at the high end of the product stack.  Take a look at what I found.

Card                             Amazon.com   Newegg.com
ASUS Radeon R9 290X DirectCU II  $599         -
Visiontek R9 290X                $599         -
XFX R9 290X Double D             $619         -
ASUS R9 290 DirectCU II          $499         -
XFX R9 290 Double D              $499         -
MSI R9 290 Gaming                $465         $469
PowerColor TurboDuo AXR9 280X    -            $329
Visiontek R9 280X                $370         $349
XFX R9 280 Double D              -            $289
Sapphire Dual-X R9 280           -            $299
Sapphire R7 265                  $184         $149

msir9290.jpg

It's not perfect, but it's better.  I was able to find two R9 290X cards at $599, just $50 over the expected selling price of $549.  The XFX Double D R9 290X at $619 is pretty close as well.  The least expensive R9 290 I found was $465, though others remain about $100 over the suggested price.  Honestly, having the R9 290 and R9 290X only $100 apart, rather than the $150 gap AMD's suggested pricing implies, is more realistic given how close the two SKUs are in performance.

Stepping a bit lower, the R9 280X (which is essentially the same as the HD 7970 GHz Edition) can be found for $329 and $349 on Newegg.  Those prices are just $30-50 more than the suggested pricing!  The brand new R9 280, similar in specs to the HD 7950, is starting to show up for $289 and $299, only $10-20 over what AMD told us to expect.

Finally, though not really a high end card, I did see that the R7 265 was showing up at both Amazon.com and Newegg.com for the second time since its announcement in February. For budget 1080p gamers, if you can find it, this could be the best card you can pick up.

What deals are you finding online?  If you have one worth adding here, let me know!  Are the availability problems and high prices on AMD GPUs finally behind us??

Author:
Manufacturer: NZXT

Installation

When the Radeon R9 290 and R9 290X first launched last year, they were plagued by issues of overheating and variable clock speeds.  We looked at the situation several times over the course of a couple of months, and AMD tried to address the problem with newer drivers.  These drivers did help stabilize clock speeds (and thus performance) of the reference-built R9 290 and R9 290X cards, but they caused noise levels to increase as well.

The real solution was the release of custom-cooled versions of the R9 290 and R9 290X from AMD partners like ASUS, MSI, and others.  The ASUS R9 290X DirectCU II model, for example, ran cooler, quieter, and more consistently than any of the numerous reference models we had our hands on.

But what about all those buyers that are still purchasing, or have already purchased, reference-style R9 290 and 290X cards?  Replacing the cooler on the card is the best choice, and thanks to our friends at NZXT we have a unique solution that combines a standard self-contained water cooler meant for CPUs with a custom-built GPU bracket.

IMG_9179_0.JPG

Our quick test will utilize one of the reference R9 290 cards AMD sent along at launch and two specific NZXT products.  The Kraken X40 is a standard self-contained CPU water cooler that sells for $100 on Amazon.com.  For our purposes, though, we are going to team it up with the Kraken G10, a $30 GPU-specific bracket that allows you to use the X40 (and other water coolers) on the Radeon R9 290.

IMG_9181_0.JPG

Inside the box of the G10 you'll find an 80mm fan, a back plate, the bracket that attaches the cooler to the GPU, and all necessary installation hardware.  The G10 supports a wide range of GPUs, though it is targeted at the reference design of each:

NVIDIA: GTX 780 Ti, 780, 770, 760, Titan, 680, 670, 660 Ti, 660, 580, 570, 560 Ti, 560, 560 SE
AMD: R9 290X, 290, 280X*, 280*, 270X, 270, HD 7970*, 7950*, 7870, 7850, 6970, 6950, 6870, 6850, 6790, 6770, 5870, 5850, 5830

That is a pretty impressive list, but NZXT cautions that custom-designed boards may interfere with the bracket.

IMG_9184_0.JPG

The installation process begins with removing the original cooler, which in this case just means a lot of small screws.  Be careful when removing the screws on the heatsink retention bracket itself, and alternate between screws so it comes off evenly.

Continue reading about how the NZXT Kraken G10 can improve the cooling of the Radeon R9 290 and R9 290X!!

AMD Teasing Dual GPU Graphics Card, Punks Me at Same Time

Subject: Graphics Cards | March 13, 2014 - 10:52 AM |
Tagged: radeon, amd

This morning I had an interesting delivery on my door step.

The only thing inside it was an envelope stamped TOP SECRET and this photo.  Coming from AMD's PR department, the hashtag #2betterthan1 adorned the back of the picture.  

twobetter.jpg

This original photo is from like....2004.  Nice, very nice AMD.

With all the rumors circling around the release of a new dual-GPU graphics card based on Hawaii, code-named 'Vesuvius', it seems that AMD is stepping up the viral marketing campaign a bit early.  The idea of a single card with two R9 290X GPUs seems crazy due to the high power consumption, but maybe AMD has been holding back its best, most power-efficient GPUs for such a release.

What do you think?  Can AMD make a dual-GPU Hawaii card happen?  How will this affect or be affected by the GPU shortages and price hikes still plaguing the R9 290 and R9 290X?  How much would you be willing to PAY for something like this?

Author:
Manufacturer: NVIDIA

Maxwell and Kepler and...Fermi?

Covering the landscape of mobile GPUs can be a harrowing experience.  Brands, specifications, performance, features and architectures can all vary from product to product, even inside the same family.  Rebranding is rampant from both AMD and NVIDIA and, in general, we are met with one of the most confusing segments of the PC hardware market.  

Today, with the release of the GeForce GTX 800M series from NVIDIA, we are getting all of the above in one form or another. We will also see performance improvements and the introduction of the new Maxwell architecture (in a few parts at least).  Along with the GeForce GTX 800M parts, you will also find the GeForce 840M, 830M and 820M offerings at lower performance, wattage and price levels.

slides01.jpg

With the new hardware comes a collection of new software for mobile users, including the innovative Battery Boost, which can increase unplugged gaming time by using frame rate limiting and other "magic" bits that NVIDIA isn't talking about yet.  ShadowPlay and GameStream also find their way to mobile GeForce users.
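NVIDIA hasn't published how Battery Boost actually works, but the frame-rate-limiting piece is a simple idea on its own: render a frame, then idle out the remainder of that frame's time slot instead of racing ahead. The sketch below is plain Python illustrating that generic idea only; it is not NVIDIA's implementation, and the clock-injection pattern is purely so the math is checkable.

```python
import time

def run_capped(render, target_fps=30, frames=5, clock=time):
    """Generic frame-rate limiter: after rendering each frame, sleep away
    whatever time remains in that frame's slot. The idle time is where
    power savings would come from. NOT NVIDIA's actual mechanism."""
    interval = 1.0 / target_fps
    for _ in range(frames):
        start = clock.monotonic()
        render()
        elapsed = clock.monotonic() - start
        if elapsed < interval:
            clock.sleep(interval - elapsed)  # hardware idles here

class FakeClock:
    """Deterministic stand-in for the time module, for demonstration."""
    def __init__(self):
        self.now = 0.0
        self.slept = 0.0
    def monotonic(self):
        return self.now
    def sleep(self, seconds):
        self.slept += seconds
        self.now += seconds

clock = FakeClock()
run_capped(lambda: None, target_fps=50, frames=3, clock=clock)
print(round(clock.slept, 6))  # → 0.06 (three 20 ms frame slots, fully idle)
```

With an instant no-op renderer, all three 20 ms slots are spent sleeping; a slower renderer would eat into each slot and sleep less, which is why the savings shrink as a game approaches the cap.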

Let's take a quick look at the new hardware specifications.

                  GTX 880M         GTX 780M      GTX 870M         GTX 770M
GPU Code name     Kepler           Kepler        Kepler           Kepler
GPU Cores         1536             1536          1344             960
Rated Clock       954 MHz          823 MHz       941 MHz          811 MHz
Memory            Up to 4GB        Up to 4GB     Up to 3GB        Up to 3GB
Memory Clock      5000 MHz         5000 MHz      5000 MHz         4000 MHz
Memory Interface  256-bit          256-bit       192-bit          192-bit
Features          Battery Boost,   GameStream,   Battery Boost,   GameStream,
                  GameStream,      ShadowPlay,   GameStream,      ShadowPlay,
                  ShadowPlay, GFE  GFE           ShadowPlay, GFE  GFE

Both the GTX 880M and the GTX 870M are based on Kepler, keeping the same basic feature set and hardware specifications of their brethren in the GTX 700M line.  However, while the GTX 880M has the same CUDA core count as the 780M, the same cannot be said of the GTX 870M.  Moving from the GTX 770M to the 870M sees a significant 40% increase in core count as well as a jump in clock speed from 811 MHz (plus Boost) to 941 MHz.  

Continue reading about the NVIDIA GeForce GTX 800M Launch and Battery Boost!!

Win a GeForce GTX 750 Ti by Showing Off Your Upgrade-Worthy Rig!

Subject: General Tech, Graphics Cards | March 11, 2014 - 09:06 PM |
Tagged: nvidia, gtx 750 ti, giveaway, geforce, contest

UPDATE:  We have our winners! Congrats to the following users, who submitted upgrade-worthy PCs and will be shipped a free GeForce GTX 750 Ti courtesy of NVIDIA!

  • D. Todorov
  • C. Fogg
  • K. Rowe
  • K. Froehlich
  • D. Aarssen

When NVIDIA launched the GeForce GTX 750 Ti last month, it convinced us to give this highly efficient graphics card a chance to upgrade some off-the-shelf, underpowered PCs.  In a story that we published just a week ago, we converted three pretty basic, pretty boring computers into impressive gaming PCs by adding the $150 Maxwell-based graphics card.

gateway-bioshock.png

If you missed the video we did on the upgrade process and results, check it out here.

Now we are going to give our readers the chance to do the same thing to their PCs.  Do you have a computer in your home that is just not up to the task of playing the latest PC games?  Then this contest is right up your alley.

IMG_9552.JPG

Prizes: 1 of 5 GeForce GTX 750 Ti Graphics Cards

Your Task: You are going to have to do a couple of things to win one of these cards in our "Upgrade Story Giveaway."  We want to make sure these cards go to those of you who can really use one, so here is what we are asking for (you can find the form to fill out right here):

  1. Show us your PC that is in need of an upgrade!  Take a picture of your machine with this contest page on the screen or something similar and share it with us.  You can use Imgur.com to upload your photo if you need some place to put it.  An inside shot would be good as well.  Place the URL for your image in the appropriate field in the form below.
  2. Show us your processor and integrated graphics that need some help!  That means you can use a program like CPU-Z to view the processor in your system and then GPU-Z to show us the graphics setup.  Take a screenshot of both of these programs so we can see what hardware you have that needs more power for PC gaming!  Place the URL for that image in the correct field below.
  3. Give us your name and email address so we can contact you for more information if you win!
  4. Leave us a comment below to let me know why you think you should win!!
  5. Subscribing to our PCPer Live! mailing list or even our PCPer YouTube channel wouldn't hurt either...

That's pretty much it!  We'll run this promotion for 2 weeks with a conclusion date of March 13th. That should give you plenty of time to get your entry in.

Good luck!!

Author:
Manufacturer: Various

1920x1080, 2560x1440, 3840x2160

Join us on March 11th at 9pm ET / 6pm PT for a LIVE Titanfall Game Stream!  You can find us at http://www.pcper.com/live.  You can subscribe to our mailing list to be alerted whenever we have a live event!!

We canceled the event due to the instability of Titanfall servers.  We'll reschedule soon!!

With the release of Respawn's Titanfall upon us, many potential PC gamers will be looking for suggestions on a list of parts targeted at the perfect Titanfall experience.  The good news is that, even with a fairly low investment in PC hardware, gamers will find the PC version of this title is definitely the premier way to play, as the compute power of the Xbox One just can't compete.

titanfallsystem.jpg

In this story we'll present three build suggestions, each addressing a different target resolution, and all at better image quality settings than the Xbox One can offer.  We have options for 1080p (the best the Xbox One can manage), 2560x1440, and even 3840x2160, better known as 4K.  In truth, the graphics horsepower required by Titanfall isn't overly extreme, so an entire PC build coming in under $800, including a full copy of Windows 8.1, is easy to accomplish.

Target 1: 1920x1080

First up is old reliable, the 1920x1080 resolution that most gamers still have on their primary gaming display.  That could be a home theater style PC hooked up to a TV or monitors in sizes up to 27-in.  Here is our build suggestion, followed by our explanations.

  Titanfall 1080p Build
Processor Intel Core i3-4330 - $137
Motherboard MSI H87-G43 - $96
Memory Corsair Vengeance LP 8GB 1600 MHz (2 x 4GB) - $89
Graphics Card EVGA GeForce GTX 750 Ti - $179
Storage Western Digital Blue 1TB - $59
Case Corsair 200R ATX Mid Tower Case - $72
Power Supply Corsair CX 500 watt - $49
OS Windows 8.1 OEM - $96
Total Price $781 - Amazon Full Cart

Our first build comes in at $781 and includes some incredibly competent gaming hardware for the price.  The Intel Core i3-4330 is a dual-core, Hyper-Threaded processor with more than enough capability to push Titanfall and every other major PC game on the market.  The MSI H87 motherboard lacks some of the advanced features of the Z87 platform but does the job at a lower cost.  8GB of Corsair memory, though not running at a high clock speed, provides more than enough capacity for all the applications you could want to run.

Continue reading our article on building a gaming PC for Titanfall!!

Author:
Manufacturer: EVGA

It's been a while...

EVGA has been around for quite some time now.  They have turned into NVIDIA’s closest North American partner after the collapse of the original VisionTek.  At nearly every trade show or gaming event, EVGA is closely associated with whatever NVIDIA presence is there.  In the past, EVGA focused primarily on NVIDIA reference designs for PCB and cooling, branching out now and then with custom or semi-custom watercooling solutions.

evga_780_01.jpg

A very svelte and minimalist design for the shroud.  I like it.

The last time I actually reviewed an EVGA product was way back in May of 2006, when I took a look at the passively cooled 7600 GS.  Oddly enough, that card is sitting right in front of me as I write this.  Unfortunately, it has a set of blown caps and no longer works.  Considering that the card was in constant use since 2006, I would say it held up very well for those eight years!

EVGA has been expanding their product lineup to handle the highs and lows of the PC market.  They have started manufacturing motherboards, cases, and power supplies to differentiate and broaden their product portfolio.  We know from past experience that companies relying on one type of product from a single manufacturer (GPUs, in this case) can run into real trouble if demand drops dramatically due to competitive disadvantages.  EVGA has also taken a much more aggressive approach to differentiating their products while keeping them within a certain budget.

The latest generation of GTX 700-based cards saw the introduction of EVGA's ACX cooling solution.  These dual-fan coolers are a big step up from the reference design and put EVGA on par with competitive products from ASUS and MSI.  EVGA does make some tradeoffs compared to those cards, but they are fairly minimal when considering the entire package.

Click here to read the entire review!

Microsoft, Along with AMD, Intel, NVIDIA, and Qualcomm, Will Announce DirectX 12 at GDC 2014

Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | March 5, 2014 - 08:28 PM |
Tagged: qualcomm, nvidia, microsoft, Intel, gdc 14, GDC, DirectX 12, amd

The announcement of DirectX 12 has been given a date and time via a blog post on the Microsoft Developer Network (MSDN) blogs. On March 20th at 10:00am (I assume PDT), a few days into the 2014 Game Developers Conference in San Francisco, California, the upcoming specification should be detailed for attendees. Apparently, four GPU manufacturers will also be involved with the announcement: AMD, Intel, NVIDIA, and Qualcomm.

microsoft-dx12-gdc-announce.jpg

As we reported last week, DirectX 12 is expected to target increased hardware control and decreased CPU overhead for added performance in "cutting-edge 3D graphics" applications. Really, this is the best time for it. Graphics processors have mostly settled into being highly efficient co-processors for parallel data, with some specialized logic for geometry and video tasks. A new specification can relax the demands on video drivers and thus keep the GPU (or GPUs, in Mantle's case) loaded and utilized.

But, to me, the most interesting part of this announcement is the nod to Qualcomm. Microsoft values DirectX as leverage over other x86 and ARM-based operating systems. With Qualcomm, clearly Microsoft believes that either Windows RT or Windows Phone will benefit from the API's next version. While it will probably make PC gamers nervous, mobile platforms will benefit most from reducing CPU overhead, especially if it can be spread out over multiple cores.

Honestly, that is fine by me. As long as Microsoft returns to treating the PC as a first-class citizen, I do not mind them helping mobile, too. We will definitely keep you up to date as we know more.

Source: MSDN Blogs

AMD Radeon R9 290X shows up for $549 on Newegg. Is the worst behind us?

Subject: Graphics Cards | March 4, 2014 - 03:38 PM |
Tagged: radeon, R9 290X, hawaii, amd, 290x

Yes, I know it is only one card.  And yes I know that this could sell out in the next 10 minutes and be nothing, but I was so interested, excited and curious about this that I wanted to put together a news post.  I just found a Radeon R9 290X card selling for $549 on Newegg.com.  That is the normal, regular, non-inflated, expected retail price.

WAT.

290xwat.jpg

You can get a Powercolor AXR9 290X with 4GB of memory for $549 right now, likely only if you hurry.  That same GPU on Amazon.com will cost you $676.  This same card at Newegg.com has been as high as $699:

290xwat2.jpg

Again, this is only one card on one site, but the implications are positive.  This is also a reference design card, rather than one of the superior offerings with a custom cooler.  After that single card, the next lowest price is $629, followed by a couple at $649 and then more at $699.  We are still waiting to hear from AMD on the issue: what its response is, and whether it can actually do anything to fix it.  It seems plausible, though maybe not likely, that the draw of coin mining has reached a peak (and who can blame the miners) and the pricing of AMD GPUs could stabilize.  Maybe.  It's classified.

But for now, if you want an R9 290X, Newegg.com has at least one option that makes sense.