
New AMD Polaris 10 and Polaris 11 GPU Details Emerge

Subject: Editorial, Graphics Cards | May 18, 2016 - 05:18 PM |
Tagged: rumor, Polaris, opinion, HDMI 2.0, gpu, gddr5x, GDDR5, GCN, amd, 4k

While Nvidia's Pascal has held the spotlight in the news recently, it is not the only new GPU architecture debuting this year. AMD will soon be bringing its Polaris-based graphics cards to market for notebooks and mainstream desktop users. While several different code names have been thrown around for these new chips, they are generally referred to as Polaris 10 and Polaris 11. AMD's Raja Koduri stated in an interview with PC Perspective that the numbers used in the naming scheme hold no special significance, but eventually Polaris will be used across the entire performance lineup (low end to high end graphics).

Naturally, there are going to be many rumors and leaks as the launch gets closer. In fact, Tech Power Up recently came across a number of interesting details about AMD's plans for Polaris-based graphics in 2016, including specifications and which areas of the market each chip is going to be aimed at.

AMD GPU Roadmap.jpg

Citing the usual "industry sources" familiar with the matter (take that for what it's worth, but the specifications do not seem out of the realm of possibility), Tech Power Up revealed that there are two lines of Polaris-based GPUs that will be made available this year. Polaris 10 will allegedly occupy the mid-range (mainstream) graphics option in desktops as well as being the basis for high end gaming notebook graphics chips. On the other hand, Polaris 11 will reportedly be a smaller chip aimed at thin-and-light notebooks and mainstream laptops.

Now, for the juicy bits of the leak: the rumored specifications!

AMD's "Polaris 10" GPU will feature 32 compute units (CUs) which TPU estimates – based on the assumption that each CU still contains 64 shaders on Polaris – works out to 2,048 shaders. The GPU further features a 256-bit memory interface along with a memory controller supporting GDDR5 and GDDR5X (though not at the same time heh). This would leave room for cheaper Polaris 10 derived products with less than 32 CUs and/or cheaper GDDR5 memory. Graphics cards would have as much as 8GB of memory initially clocked at 7 Gbps. Reportedly, the full 32 CU GPU is rated at 5.5 TFLOPS of single precision compute power and runs at a TDP of no more than 150 watts.

Compared to the existing Hawaii-based R9 390X, the upcoming R9 400 series Polaris 10 GPU has fewer shaders and less memory bandwidth. The memory is clocked 1 GHz higher, but the GDDR5X memory bus is half the width of the 390X's 512-bit GDDR5 bus, which results in 224 GB/s of memory bandwidth for Polaris 10 versus 384 GB/s on Hawaii. The R9 390X has a slight edge in compute performance at 5.9 TFLOPS versus Polaris 10's 5.5 TFLOPS; however, the Polaris 10 GPU uses much less power and easily wins at performance per watt! It almost reaches the same level of single precision compute performance at nearly half the power, which is impressive if it holds true!

                    R9 390X            R9 390             R9 380             R9 400-Series "Polaris 10"
GPU Code name       Grenada (Hawaii)   Grenada (Hawaii)   Antigua (Tonga)    Polaris 10
GPU Cores           2816               2560               1792               2048
Rated Clock         1050 MHz           1000 MHz           970 MHz            ~1343 MHz
Texture Units       176                160                112                ?
ROP Units           64                 64                 32                 ?
Memory              8GB                8GB                4GB                8GB
Memory Clock        6000 MHz           6000 MHz           5700 MHz           7000 MHz
Memory Interface    512-bit            512-bit            256-bit            256-bit
Memory Bandwidth    384 GB/s           384 GB/s           182.4 GB/s         224 GB/s
TDP                 275 watts          275 watts          190 watts          150 watts (or less)
Peak Compute        5.9 TFLOPS         5.1 TFLOPS         3.48 TFLOPS        5.5 TFLOPS
MSRP (current)      ~$400              ~$310              ~$199              unknown

Note: Polaris GPU clocks estimated using the assumption of 5.5 TFLOPS being peak compute and an accurate number of shaders. (Thanks Scott.)
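For those curious, here is a minimal Python sketch of that estimate, assuming the usual GCN figure of two FP32 operations per shader per clock; the inputs are simply the rumored numbers above:

```python
# Clock estimate: peak FP32 compute = shaders x 2 FLOPs per clock x core clock,
# so core clock = peak compute / (shaders x 2). Inputs are the rumored figures.

def estimated_clock_mhz(peak_tflops, shaders):
    """Estimate core clock in MHz from peak FP32 TFLOPS and shader count."""
    return peak_tflops * 1e12 / (shaders * 2) / 1e6

print(f"Polaris 10: ~{estimated_clock_mhz(5.5, 2048):.0f} MHz")  # ~1343 MHz
```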

Another comparison that can be made is to the Radeon R9 380 which is a Tonga-based GPU with similar TDP. In this matchup, the Polaris 10 based chip will – at a slightly lower TDP – pack in more shaders, twice the amount of faster clocked memory with 23% more bandwidth, and provide a 58% increase in single precision compute horsepower. Not too shabby!
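To show where those percentages (and the performance-per-watt claim) come from, here is a quick sketch that recomputes the relevant figures from the tables above; the inputs are only the leaked and published numbers, so treat the outputs as ballpark:

```python
# Memory bandwidth = data rate (Gbps) x bus width (bits) / 8, in GB/s.
# Performance per watt = peak TFLOPS / TDP, in GFLOPS per watt.
cards = {
    # name: (peak TFLOPS, TDP in watts, memory data rate in Gbps, bus width in bits)
    "R9 390X (Hawaii)":   (5.9, 275, 6.0, 512),
    "R9 380 (Tonga)":     (3.48, 190, 5.7, 256),
    "Polaris 10 (rumor)": (5.5, 150, 7.0, 256),
}

for name, (tflops, tdp, gbps, bus_bits) in cards.items():
    bandwidth = gbps * bus_bits / 8
    perf_per_watt = tflops * 1000 / tdp
    print(f"{name:20s} {bandwidth:6.1f} GB/s  {perf_per_watt:5.1f} GFLOPS/W")

# Polaris 10 vs. R9 380: ~23% more bandwidth (224 / 182.4) and ~58% more
# peak compute (5.5 / 3.48), matching the figures quoted above.
```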

Likely, a good portion of these increases are made possible by the move to a smaller process node and the use of FinFET "tri-gate" style transistors on the Samsung/GlobalFoundries 14LPP FinFET manufacturing process, though AMD has also made some architecture tweaks and hardware additions to the GCN 4.0 based processors. A brief high-level introduction was reportedly given today in a webinar for AMD's partners (though AMD said preemptively that no technical nitty-gritty details would be divulged yet). (Update: Tech Altar summarized the partner webinar. Unfortunately, there were no major reveals other than that AMD will not be limiting AIB partners from pushing for the highest factory overclocks they can get.)

Moving on from Polaris 10 for a bit, Polaris 11 is rumored to be a smaller GCN 4.0 chip that will top out at 14 CUs (estimated 896 shaders/stream processors) and 2.5 TFLOPS of single precision compute power. These chips aimed at mainstream and thin-and-light laptops will have 50W TDPs and will be paired with up to 4GB of GDDR5 memory. There is apparently no GDDR5X option for these, which makes sense at this price point and performance level. The 128-bit bus is a bit limiting, but this is a low end mobile chip we are talking about here...
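The same back-of-the-envelope math applies to Polaris 11. Here is a small sketch; note that the 64 shaders per CU figure is the usual GCN assumption, and the GDDR5 data rates are purely hypothetical since the leak gives no memory clock:

```python
# Polaris 11 sanity check from the rumored figures: shader count from CUs
# (assuming 64 shaders per GCN CU), estimated core clock from the 2.5 TFLOPS
# rating, and what a 128-bit GDDR5 bus would give at a few hypothetical speeds.

shaders = 14 * 64                              # 896 shaders
clock_mhz = 2.5e12 / (shaders * 2) / 1e6       # ~1395 MHz
print(f"{shaders} shaders, ~{clock_mhz:.0f} MHz estimated clock")

for gbps in (6.0, 7.0, 8.0):                   # hypothetical GDDR5 data rates
    print(f"{gbps} Gbps on a 128-bit bus -> {gbps * 128 / 8:.0f} GB/s")
```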

                    R7 370                             R7 400 Series "Polaris 11"
GPU Code name       Trinidad (Pitcairn)                Polaris 11
GPU Cores           1024                               896
Rated Clock         925 MHz base (975 MHz boost)       ~1395 MHz
Texture Units       64                                 ?
ROP Units           32                                 ?
Memory              2 or 4GB                           4GB
Memory Clock        5600 MHz                           ?
Memory Interface    256-bit                            128-bit
Memory Bandwidth    179.2 GB/s                         ?
TDP                 110 watts                          50 watts
Peak Compute        1.89 TFLOPS                        2.5 TFLOPS
MSRP (current)      ~$140 (less after rebates/sales)   unknown

Note: Polaris GPU clocks estimated using the assumption of 2.5 TFLOPS being peak compute and an accurate number of shaders. (Thanks Scott.)

Fewer details were unveiled concerning Polaris 11, as you can see from the chart above. From what we know so far, it should be a promising successor to the R7 370 series even with the memory bus limitation and lower shader count: the GPU should be clocked higher (and M series mobile variants might have more shaders than the 370 and lower mobile series), and it carries a much lower TDP for at least equivalent performance, if not a decent increase. The lower power usage in particular will be hugely welcomed in mobile devices, as it should ideally result in longer battery life under the same workloads. I picked the R7 370 as the comparison because it has 4 gigabytes of memory, not that many more shaders, and, being a desktop chip, readers may be more widely familiar with it. Polaris 11 actually appears to sit between the R7 360 and R7 370 in terms of shader count and other features, but it is allegedly going to be faster than both of them while using (at least on paper) less than half the power.

Of course these are still rumors until AMD makes Polaris officially, well, official with a product launch. The claimed specifications appear reasonable though, and based on that there are a few important takeaways and thoughts I have.

amd-2016-polaris-blocks.jpg

The first thing on my mind is that AMD is taking an interesting direction here. NVIDIA has chosen to start its new generation at the top, announcing "big Pascal" GP100 and actually launching the GP104-based GTX 1080 (one of its highest end consumer chips/cards) yesterday, then introducing lower end products over the course of the year. AMD has opted for the opposite approach. AMD will be starting closer to the lower end with a mainstream notebook chip and a high end notebook/mainstream desktop GPU (Polaris 11 and 10, respectively) and then fleshing out its product stack over the next year (remember, Raja Koduri stated Polaris and GCN 4 would be used across the entire product stack), building up with bigger and higher end GPUs over time and finally topping off with its highest end consumer (and professional) GPUs based on "Vega" in 2017.

This means that for some time after both architectures are launched, AMD's and NVIDIA's newest architectures and GPUs will not be directly competing with each other. I'm not sure if this was planned by either NVIDIA or AMD, or if it is just how it happened to work out with each following its own GPU philosophy (I'm thinking the latter). Eventually they should meet in the middle (maybe late this year?) with a mid-range desktop graphics card, and it will be interesting to see how they stack up at similar price points and hardware levels. Then, once "Vega" based GPUs hit (sadly, probably in time for NV's big Pascal to launch, heh; I'm not sure if Vega is only a Fury X replacement or goes beyond that to compete with a 1080 Ti or even GP100), we should see GCN 4 on the new smaller process node square up against NVIDIA and its 16nm Pascal products across the board (entire lineup). Which will have the better performance, and which will win out in power usage, performance/watt, and performance/$? All questions I wish I knew the answers to, but sadly do not!!

Speaking of price and performance/$... Polaris is actually looking pretty good so far at hitting much lower TDPs and power usage targets while delivering at least similar performance, if not a good bit more. Both AMD and NVIDIA appear to be bringing out GPUs better than I expected as far as technological improvements in performance and power usage go (these die shrinks have really helped, even though from here on out that trend isn't really going to continue...). I hope that AMD can at least match NV in these areas at the mid range, even if they do not have a high end GPU coming out soon (not until sometime after these cards launch, and not really until Vega, the high end GCN successor). At least on paper, based on the leaked information, the GPUs so far look good. My only worry is pricing, which I think is going to make or break these cards. AMD will need to price them competitively and aggressively to ensure their adoption and success.

I hope that doing the rollout this way (starting with lower end chips) helps AMD to iron out the new smaller process node and that they are able to get good yields so that they can be aggressive with pricing here and eventually at the high end!

I am looking forward to more information on AMD's Polaris architecture and the graphics cards based on it!


I will admit that I am not 100% up on all the rumors and I apologize for that. With that said, I would love to hear what your thoughts are on AMD's upcoming GPUs and what you think about these latest rumors!

NVIDIA Releases Full Specifications for GTX 1070

Subject: Graphics Cards | May 18, 2016 - 04:49 PM |
Tagged: nvidia, pascal, gtx 1070, 1070, gtx, GTX 1080, 16nm FF+, TSMC, Founder's Edition

Several weeks ago when NVIDIA announced the new GTX 1000 series of products, we were given a quick glimpse of the GTX 1070.  This upper-midrange card is to carry a $379 price tag in retail form while the "Founder's Edition" will hit the $449 mark.  Today NVIDIA released the full specifications of this card on their website.

Interest in the GTX 1070 is incredibly high because of the potential performance of this card vs. the previous generation.  Price is also a big consideration here, as it is far easier to raise $379 than it is to make the jump to the GTX 1080 and shell out $599 once non-Founder's Edition cards are released.  The GTX 1070 has all of the same features as the GTX 1080, but it takes a hit when it comes to clockspeed and shader units.

gtx_1070_launch.png

The GTX 1070 is a Pascal based part that is fabricated on TSMC's 16nm FF+ node.  It shares the same overall transistor count as the GTX 1080, but it is partially disabled.  The GTX 1070 contains 1920 CUDA cores as compared to the 2560 cores of the 1080.  Essentially one full GPC is disabled to reach that number.  The clockspeeds take a hit as well compared to the full GTX 1080.  The base clock for the 1070 is still an impressive 1506 MHz and boost reaches 1683 MHz.  This combination of shader count and clockspeed should make it a little bit faster than the older GTX 980 Ti.  The rated TDP for the card is 150 watts with a single 8 pin PCI-E power connector.  This means that there should be some decent headroom when it comes to overclocking this card.  Due to binning and yields, we may not see 2+ GHz overclocks with these cards, especially if NVIDIA cut down the power delivery system as compared to the GTX 1080.  Time will tell on that one.
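As a quick sanity check on the "one GPC disabled" statement, here is a small sketch assuming GP104's layout of four GPCs, each containing five SMs of 128 CUDA cores:

```python
# GP104 core counts, assuming 4 GPCs x 5 SMs x 128 CUDA cores per SM.
cores_per_sm, sms_per_gpc, gpc_count = 128, 5, 4

cores_per_gpc = cores_per_sm * sms_per_gpc   # 640 cores per GPC
full_gp104 = cores_per_gpc * gpc_count       # 2560 cores (GTX 1080)
gtx_1070 = full_gp104 - cores_per_gpc        # 1920 cores with one GPC disabled

print(full_gp104, gtx_1070)                  # 2560 1920
```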

The memory technology that NVIDIA is using for this card is not the cutting edge GDDR5X or HBM, but rather the tried and true GDDR5.  8 GB of this memory sits on a 256-bit bus, but it is running at a very, very fast 8 Gbps.  This gives overall bandwidth in the 256 GB/sec region.  When we combine this figure with the memory compression techniques implemented in the Pascal architecture, we can see that the GTX 1070 will not be bandwidth starved.  We have no information on whether this generation of products will mirror what we saw with the previous generation GTX 970 in terms of disabled memory controllers and the 3.5 GB/500 MB memory split due to that unique memory subsystem.
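For reference, that 256 GB/sec figure is just the raw data rate times the bus width, ignoring any gains from Pascal's memory compression; a one-line sketch:

```python
# Raw GDDR5 bandwidth for the GTX 1070: 8 Gbps per pin x 256-bit bus / 8 bits per byte.
data_rate_gbps, bus_width_bits = 8, 256
print(data_rate_gbps * bus_width_bits / 8, "GB/s")  # 256.0 GB/s
```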

gtx_1070_launch2.png

Beyond those things, the GTX 1070 is identical to the GTX 1080 in terms of DirectX features, display specifications, decoding support, double bandwidth SLI, etc.  There is an obvious amount of excitement for this card considering its potential performance and price point.  These supposedly will be available in the Founder's Edition release on June 10 for the $449 MSRP.  I know many people are considering using these cards in SLI to deliver performance for half the price of last year's GTX 980 Ti.  From all indications, these cards will be a significant upgrade for anyone using GTX 970s in SLI.  With the greater access to monitors that hit 4K as well as Surround Gaming, this could be a solid purchase for anyone looking to step up their game in these scenarios.

Source: NVIDIA

Simply FUD or a message from the Forced Upgrade Department?

Subject: General Tech | May 18, 2016 - 04:44 PM |
Tagged: Intel, microsoft, fud

DigiTimes has a doozy of a post title, stating that Intel plans to limit OS support on future processors starting with Kaby Lake and Apollo Lake CPUs.  Now, this sounds horrible, but you may be taking the word support out of context: it refers to the support that major customers require, which leads to the so-called errata (pdf example), not to the processors being incapable of running any OS but Windows 10.  This may not matter so much to the average consumer, but for industries and the scientific community this could result in huge costs, as they would no longer be able to get fixes from Intel unless they have upgraded to Windows 10.  That upgrade comes with its own costs: the monstrous amount of time it will take for compatibility testing, application updating, and implementation, not to mention licensing fees.

AMD should take note of this, focus on continued legacy support and most importantly advertising that fact.  The price difference between choosing AMD over Intel could become even more compelling for these large customers and help refill AMD's coffers.

Opportunity.jpg

"With Intel planning to have its next-generation processors support only Windows 10, industrial PC (IPC) players are concerned that the move will dramatically increase their costs and affect market demand, according to sources from IPC players."

Here is some more Tech News from around the web:

Tech Talk

Source: DigiTimes
Subject: Processors
Manufacturer: ARM

10nm Sooner Than Expected?

It seems only yesterday that we had the first major GPU released on 16nm FF+ and now we are talking about ARM about to receive their first 10nm FF test chips!  Well, in fact it was yesterday that NVIDIA formally released performance figures on the latest GeForce GTX 1080 which is based on TSMC’s 16nm FF+ process technology.  Currently TSMC is going full bore on their latest process node and producing the fastest current graphics chip around.  It has taken the foundry industry as a whole a lot longer to develop FinFET technology than expected, but now that they have that piece of the puzzle seemingly mastered they are moving to a new process node at an accelerated rate.

arm_td01.png

TSMC’s 10nm FF is not well understood by press and analysts yet, but we gather that it is more of a marketing term than a true drop to 10 nm features.  Intel has yet to get past 14nm and does not expect 10 nm production until well into next year.  TSMC is promising their version in the second half of 2016.  We cannot assume that TSMC’s version will match what Intel will be doing in terms of geometries and electrical characteristics, but we do know that it is a step past TSMC’s 16nm FF products.  Lithography will likely get a boost with triple patterning exposure.  My guess is that the back end will also move away from the “20nm metal” stages that we see with 16nm.  All in all, it should be an improved product from what we see with 16nm, but time will tell if it can match the performance and density of competing lines that bear the 10nm name from Intel, Samsung, and GLOBALFOUNDRIES.

ARM has a history of porting their architectures to new process nodes, but they are being a bit more aggressive here than we have seen in the past.  It used to be that ARM would announce a new core or technology, and it would take up to two years to be introduced into the market.  Now we are seeing technology announcements and actual products hitting the scenes about nine months later.  With the mobile market continuing to grow we expect to see products quicker to market still.

arm_td02.png

The company designed a simplified test chip to tape out and send to TSMC for test production on the aforementioned 10nm FF process.  The chip was taped out in December, 2015.  The design was shipped to TSMC for mask production and wafer starts.  ARM is expecting the finished wafers to arrive this month.

Click here to continue reading about ARM's test foray into 10nm!

The 1080 roundup, Pascal in all its glory

Subject: Graphics Cards | May 17, 2016 - 10:22 PM |
Tagged: nvidia, pascal, video, GTX 1080, gtx, GP104, geforce, founders edition

Yes that's right, if you felt Ryan and Al somehow missed something in our review of the new GTX 1080, or you felt the obvious pro-Matrox bias was showing, here are the other reviews you can pick and choose from.  Start off with [H]ard|OCP, who also tested Ashes of the Singularity and Doom as well as the old favourite Battlefield 4.  Doom really showed itself off as a next generation game, its Nightmare mode scoffing at any GPU with less than 5GB of VRAM available and pushing the single 1080 hard.  Read on to see how the competition stacked up ... or wait for the 1440 to come out some time in the future.

1463427458xepmrLV68z_1_8_l.jpg

"NVIDIA's next generation video card is here, the GeForce GTX 1080 Founders Edition video card based on the new Pascal architecture will be explored. We will compare it against the GeForce GTX 980 Ti and Radeon R9 Fury X in many games to find out what it is capable of."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Crucial Announces Ballistix Sport LT DDR4 SODIMMs

Subject: Memory | May 17, 2016 - 04:09 PM |
Tagged: sodimm, ddr4, crucial ballistix sport

Crucial is releasing some new high end memory for gaming laptops and for those mobile devices which work for a living.  The new Ballistix Sport LT DDR4 SODIMMs will start at speeds of 2400 MT/s and will be fully Intel XMP compatible, assuming your system believes in those DDR4 speeds; if not, look for an update from the manufacturer.  The SODIMMs will be available in sizes of up to 16GB per DIMM, so you should be able to install quite a large pool of memory.  They didn't offer up any pictures as this was being written, but instead a YouTube video of how Ballistix memory is made, which you can see below.

Boise, ID, and Glasgow, UK, -- May 17, 2016 – Crucial, a leading global brand of memory and storage upgrades, today announced the availability of Ballistix® Sport LT DDR4 SODIMMs. Ideal for gamers and performance enthusiasts, the new modules accelerate gaming laptops and small form factor systems by packing faster speeds into every memory slot, enabling users to run demanding games and applications with ease.

With speeds starting at 2400 MT/s, Ballistix Sport LT SODIMMs offer better latencies, reduced load times, and improved frame rates with integrated graphics. The new modules also feature a sleek black PCB and digital camo design and support Intel® XMP 2.0 profiles for easy installation.

“We’re constantly seeking ways to empower gamers with affordable, easy-to-use products that help them gain that competitive, performance edge,” explained Jeremy Mortenson, product marketing manager, Crucial. “With new platforms supporting faster DDR4 SODIMMS coming to the market, the newest Ballistix SODIMM module does just that.”

The Ballistix Sport LT DDR4 SODIMM modules will be available for purchase at www.crucial.com and through select global partners. All Crucial memory is backed by a limited lifetime warranty, valid everywhere except Germany, where the warranty is valid for 10 years from date of purchase.

For more information about Ballistix memory, visit crucial.com/ballistix.

Source: Crucial
Manufacturer: NVIDIA

A new architecture with GP104


The summer of change for GPUs has begun with today’s review of the GeForce GTX 1080. NVIDIA has endured leaks, speculation and criticism for months now, with enthusiasts calling out NVIDIA for not including HBM technology or for not having asynchronous compute capability. Last week NVIDIA’s CEO Jen-Hsun Huang went on stage and officially announced the GTX 1080 and GTX 1070 graphics cards with a healthy amount of information about their supposed performance and price points. Issues around cost and what exactly a Founders Edition is aside, the event was well received and clearly showed a performance and efficiency improvement that we were not expecting.

DSC00209.jpg

The question is, does the actual product live up to the hype? Can NVIDIA overcome some users' negative view of the Founders Edition and craft a product message that gives the wide range of PC gamers looking for an upgrade path an option they'll take?

I’ll let you know through the course of this review, but what I can tell you definitively is that the GeForce GTX 1080 clearly sits alone at the top of the GPU world.

Continue reading our review of the GeForce GTX 1080 Founders Edition!!

Manufacturer: NVIDIA

An Overview

 
TL;DR:
NVIDIA's Ansel Technology
 
Ansel is a utility that expands the concept of screenshots along the direction of photography. When fully enabled, it allows the user to capture still images with HDR exposures, gigapixel levels of resolution, 360-degree views for VR, 3D stereo projection, and post-processing filters, all from either the game's view, or from a free-roaming camera (if available). While it must be implemented by the game developer, mostly to prevent the user from either cheating or seeing hidden parts of the world, such as an inventory or minimap rendering room, NVIDIA claims that it is a tiny burden.
  • NVIDIA blog claims "GTX 600-series and up"
  • UI/UX is NVIDIA controlled
    • Allows NVIDIA to provide a consistent UI across all supported games
    • Game developers don't need to spend UX and QA effort on their own
  • Can signal the game to use its highest-quality assets during the shot
  • NVIDIA will provide an API for users to create their own post-process shader
    • Will allow access to Color, Normal, Depth, Geometry, (etc.) buffers
  • When asked about implementing Ansel with ShadowPlay: "Stay tuned."

 

“In-game photography” is an interesting concept. Not too long ago, it was difficult to just capture the user's direct experience with a title. Print screen could only hold a single screenshot at a time, which allowed Steam and FRAPS to provide a better user experience. FRAPS also made video more accessible to the end-user, but it output huge files and, while it wasn't too expensive, it needed to be purchased online, which was a big issue ten-or-so years ago.

shadowplay-vs.jpg

Seeing that their audience would enjoy video captures, NVIDIA introduced ShadowPlay a couple of years ago. The feature allowed users not only to record video, but also to capture the last few minutes of gameplay. It did this with hardware acceleration, and it did this for free (for compatible GPUs). While I don't use ShadowPlay, preferring the control of OBS, it's a good example of how NVIDIA wants to support their users. They see these features as a value-add, which draws people to their hardware.

Read on to learn more about NVIDIA Ansel

G.Skill Adds Splash of Color To Trident Z Memory Modules

Subject: Memory | May 17, 2016 - 07:07 AM |
Tagged: trident z, gskill, G.Skill Trident Z, ddr4

G.Skill recently updated its high end line of Trident Z DDR4 memory modules to add several new color options. While no new speed tiers are being introduced, the existing brushed-aluminum silver modules with red and black accents will shortly be joined by five new color schemes: silver modules with white or black top bar accents, and black modules with white, yellow, or silver accents.

Trident Z 5 colors.png

There is nothing groundbreaking here, but it will certainly make putting together a build based around a particular color or theme a bit easier, and that is their goal as these new DIMMs are aimed at modders and enthusiasts who are the most likely group to be running windowed or open air type systems that can show off the internal hardware.

black.yellow.png

For those interested, the new colors will be available at the end of May. The DDR4 3200 MHz memory kits (16GB to 128GB kits) of all timings will be available in the existing red and all the new color schemes. Users wanting the faster memory kits (e.g. DDR4 3400) will be limited to the red, white, and black accents (no orange or yellow top pieces on the heat spreader).

Source: G.Skill

Pairing up the R9 380X

Subject: Graphics Cards | May 16, 2016 - 07:52 PM |
Tagged: amd, r9 380x, crossfire

A pair of R9 380X's will cost you around $500, which is a bit more than $100 less than a single GTX 980 Ti and on par with or a little less expensive than a straight GTX 980.  You have likely seen these cards compared, but how often have you seen them pitted against a pair of GTX 960's, which cost a little bit less than two 380X cards?  [H]ard|OCP decided it was worth investigating, perhaps for those who currently have a single one of these cards and are considering a second if the price is right.  The results are very tight; overall the two setups performed very similarly, with some games favouring AMD and others NVIDIA. Check out the full review here.

1462161482JkHsFf1A5H_1_1.jpg

"We are evaluating two Radeon R9 380X video cards in CrossFire against two GeForce GTX 960 video cards in a SLI arrangement. We will overclock each setup to its highest, to experience the full gaming benefit each configuration has to offer. Additionally we will compare a Radeon R9 380 CrossFire setup to help determine the best value."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

PCPer Live! GeForce GTX 1080 Live Stream with Tom Petersen (Now with free cards!)

Subject: General Tech, Graphics Cards | May 16, 2016 - 07:19 PM |
Tagged: video, tom petersen, pascal, nvidia, live, GTX 1080, gtx, GP104, geforce

Our review of the GeForce GTX 1080 is LIVE NOW, so be sure you check that out before today's live stream!!

Get yourself ready, it’s time for another GeForce GTX live stream hosted by PC Perspective’s Ryan Shrout and NVIDIA’s Tom Petersen. The general details about consumer Pascal and the GeForce GTX 1080 graphics card are already official, and based on the traffic to our stories and the response on Twitter and YouTube, there is more than a little pent-up excitement.

gtx10801.jpg

On hand to talk about the new graphics card and answer questions about technologies in the GeForce family, including Pascal, SLI, VR, Simultaneous Multi-Projection and more, will be Tom Petersen, well known in our community. We have done quite a few awesome live streams with Tom in the past; check them out if you haven't already.

pcperlive.png

NVIDIA GeForce GTX 1080 Live Stream

10am PT / 1pm ET - May 17th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

The event will take place Tuesday, May 17th at 1pm ET / 10am PT at http://www.pcper.com/live. There you’ll be able to catch the live video stream as well as use our chat room to interact with the audience, asking questions for me and Tom to answer live. 

Tom has a history of being both informative and entertaining and these live streaming events are always full of fun and technical information that you can get literally nowhere else. Previous streams have produced news as well – including statements on support for Adaptive Sync, release dates for displays and first-ever demos of triple display G-Sync functionality. You never know what’s going to happen or what will be said!

UPDATE! UPDATE! UPDATE! This just in fellow gamers: Tom is going to be providing two GeForce GTX 1080 graphics cards to give away during the live stream! We won't be able to ship them until availability hits at the end of May, but two lucky viewers of the live stream will be able to get their paws on the fastest graphics card we have ever tested!! Make sure you are scheduled to be here on May 17th at 10am PT / 1pm ET!!

DSC00218.jpg

Don't you want to win me??!?

If you have questions, please leave them in the comments below and we'll look through them just before the start of the live stream. Of course you'll be able to tweet us questions @pcper and we'll be keeping an eye on the IRC chat as well for more inquiries. What do you want to know and hear from Tom or me?

So join us! Set your calendar for this coming Tuesday at 1pm ET / 10am PT and be here at PC Perspective to catch it. If you are a forgetful type of person, sign up for the PC Perspective Live mailing list that we use exclusively to notify users of upcoming live streaming events including these types of specials and our regular live podcast. I promise, no spam will be had!

Curious how Google handles its Cloud? Meet Chromium OS

Subject: General Tech | May 16, 2016 - 07:12 PM |
Tagged: google, Chromium OS

Google has released a beta version of their Container-VM Image to those interested in how they manage their Cloud.  It is built to handle Docker and Kubernetes instances on the Google cloud, not for home usage on a small scale.  If you are curious about the competition for Amazon, Microsoft, and other providers of Clouded services, you should follow the links from the post on The Register for a look.  Be aware this is a beta: not all features are available, and some of the ones which are may not be compatible with future updates, but it is a great way to familiarize yourself with the inner lining of the Google Cloud Platform.

google-cloud-platform.png

"Google's decided the Chromium OS is its preferred operating system for running containers in its own cloud. And why wouldn't it – the company says it uses it for its own services."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Mushkin's Reactor 1TB SSD, great performance and price

Subject: Storage | May 13, 2016 - 07:30 PM |
Tagged: Mushkin, Reactor, 1TB, jmicron, JMF612, mlc

While not quite within Ryan's Law, the Mushkin Reactor 1TB model usually sells for around $240.  As the price implies this drive uses MLC flash but the three year warranty should be enough to see you to your next upgrade.  The Tech Report decided to test out the drive to ensure users were getting performance as well as a great value. The results speak for themselves, with better performance than expected.

internals.jpg

"Mushkin's Reactor 1TB SSD is a frequent star of our weekly deals posts. We put it to the test to see whether it offers high performance along with its low price tag."

Here are some more Storage reviews from around the web:

Storage

MSI's DS502; one order of gaming headset, hold the extras

Subject: General Tech | May 12, 2016 - 09:56 PM |
Tagged: msi, ds502, gaming headset, virtual 7.1, audio

MSI's DS502 seems great for the gamer on a budget who needs to play quietly but would still like some style.  At $64 it does have some compromises, such as 40mm drivers and a self adjusting headband, but it also has the features you want, such as plush earcups and virtual 7.1 surround sound.  Kitguru found it effective during gameplay, and the mic, while not stellar, was good enough for in-game communications and did not need the input to be raised significantly to be audible.  If you are in need of a simple headset with controls that give you enough options to game, and want to leave the fancy stuff to the audiophiles, drop by for a look at the review.

26e.jpg

"Today we are taking a look at MSI’s brand new gaming headset, the DS502, boasting 7.1 surround sound, premium build and Hi-Fi level sound quality in a lightweight, ergonomically designed package. How does it hold up? "

Here is some more Tech News from around the web:

Audio Corner

Source: Kitguru

Podcast #399 - GTX 1080 Launch, UWP Updates, DOOM Vulkan Patch, Kaby Lake Leaks, ASUS ROG STRIX X99, and more!

Subject: General Tech | May 12, 2016 - 08:28 PM |
Tagged: podcast, video, GTX 1080, galax, founders, uwp, doom, vulkan, kaby lake, EKWB, rog strix x99

PC Perspective Podcast #399 - 05/11/2016

Join us this week as we discuss the GTX 1080 Launch, UWP Updates, DOOM Vulkan Patch, Kaby Lake Leaks, ASUS ROG STRIX X99, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

This episode of the PC Perspective Podcast is sponsored by Casper!

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Program length: 1:03:21

  1. Week in Review:
  2. AD BREAK
  3. News items of interest:
  4. Hardware/Software Picks of the Week
    1. Allyn: Old retro gaming history from PlayValue
  5. Closing/outro

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

G.Skill has a different take on Cherry MX RGB, check out the Ripjaws KM780R

Subject: General Tech | May 12, 2016 - 07:40 PM |
Tagged: input, G.Skill, Ripjaws KM780R, gaming keyboard, Cherry MX, cherry mx rgb

G.Skill have joined the ranks of those who have released a Cherry MX RGB keyboard; you can choose between Red, Brown and Blue switches to accompany the light show. They chose an interesting set of caps, which float above the keyboard allowing more backlighting to show through, but The Tech Report noticed that the caps feel like they are rubbing against something.  As the caps are replaceable this can be resolved if you do find it to be an issue, but you will lose some light and the keyboard will not be as easy to clean.  In addition to having audio jacks and a USB pass-through, the optional software allows an immense amount of control over your lighting.  Drop by and see if this keyboard meets your needs.

fullviewfeathered.png

"Established RAM manufacturer G.Skill is branching into gaming peripherals of late. We've already examined the company's Ripjaws MX780 gaming mouse, and now we're looking at the KM780R gaming keyboard. Join us as we see whether this keyboard has what it takes to be a contender in the crowded gaming peripherals market."

Here is some more Tech News from around the web:

Tech Talk

Say it ain't so AMD; Polaris may not shine at Computex?

Subject: General Tech | May 12, 2016 - 05:56 PM |
Tagged: rumour, Polaris, computex, amd

There is a rumour floating around this morning and it is not good.  Guru of 3D translated a story over at Nordic Hardware which suggests AMD has stated they will not have any new working cards to show at Computex.  The only Polaris hardware they will have ready will be equivalent to the current R9 390 and 390X, albeit at a lower price point.  The rumoured problem is that the new flagship cards simply won't hit 850 MHz reliably, which in turn means high end GPUs are right out. 

This had better not be true or AMD may find themselves shoeless and GPU enthusiasts will be as disappointed as White Sox fans back in 1919, albeit for different reasons.  Those in the NVIDIA camp would do well to remember this has an effect on them as well; why would NVIDIA lower the price on those shiny new 1080's or 1070's when there is nothing in the market to compete with them? 

This is a rumour from an anonymous source at an AMD partner, so be sure to take it with a grain of salt and hope that it is completely unsubstantiated; or that a silicon-based miracle happens in the coming months if there is some substance to this.

man_crying.jpg

"Okay before I start on this news-item, I really need to state that this is based on a vague rumor, nothing has been confirmed or denied otherwise. Here's the story, some reports say Polaris 10 can't hit 850 MHz reliably and that availability will be pushed back to October. I sincerely hope the rumor is not true."

Here is some more Tech News from around the web:

Tech Talk

Source: Guru of 3D

Final Fantasy X and X-2 Confirmed for PC... Launches Today

Subject: General Tech | May 12, 2016 - 11:01 AM |
Tagged: square enix, final fantasy

Okay, so we're starting to get just about every main-line Final Fantasy title on the PC platform. They haven't always arrived in good condition, with Final Fantasy XIII having major user interface issues, which were mostly (but not entirely) patched, and Final Fantasy IX requiring the installation of a few Android plug-ins... on Windows. Still, many of them are classics and it's therefore good to have them on platforms that (barring current UWP discussions) encourage perpetual compatibility.

square-2016-ffx.jpg

As for its system requirements? They don't specify integrated GPUs, i.e. Intel, but what they do list is what we would expect for a game that targets the Xbox 360 and PS3. If it comes from AMD or NVIDIA, and they are still releasing drivers for it, you should be fine. Square Enix declares a minimum of either an AMD Radeon HD 2600XT or an NVIDIA GeForce 9600GT.

Check out the Steam page if you're concerned, though.

Excluding the spin-offs, the PC is now missing just the original Final Fantasy, Final Fantasy II, Final Fantasy XII, and the upcoming Final Fantasy XV. The rest have arrived on the PC in some form, be it a port of the original or one of their remasters. There is still no word on any of those remaining four, though it seems like Square is just dropping games onto Steam with a few days of notice now.

Oddly enough, no price is listed (at least in Canada). No idea why.

AMD Releases Radeon Software Crimson Edition 16.5.2 Beta

Subject: Graphics Cards | May 12, 2016 - 03:53 AM |
Tagged: amd, crimson, graphics drivers

For the second time this month, hence the version number, AMD has released a driver to coincide with a major game release. This one is for DOOM, which will be available on Friday. Like the previous driver, which was aligned with Forza, it has not been WHQL-certified. That's okay, though. NVIDIA's Game Ready drivers didn't strive for WHQL certification until just recently, and, even then, WHQL certification doesn't mean what it used to.

amd-2015-crimson-logo.png

But yeah, apart from game-specific optimizations for DOOM, 16.5.2 has a few extra reasons to be used. If you play Battleborn, which launched on May 3rd, then AMD has added a new CrossFire profile for that game. They have also fixed at least eleven issues (plus however many undocumented ones). It comes with ten known issues, but none of them seem particularly troubling; most are CrossFire-related.

You can pick up the driver at AMD's website.

Source: AMD

NVIDIA Limits GTX 1080 SLI to Two Cards

Subject: Graphics Cards | May 12, 2016 - 02:57 AM |
Tagged: sli, nvidia, GTX 1080, GeForce GTX 1080

Update (May 12th, 1:45am): Okay, so the post, which was originally from Chris Bencivenga, Support Manager at EVGA, has been deleted. A screenshot of it is attached below. Note that Jacob Freeman later posted that "More info about SLI support will be coming soon, please stay tuned." I guess this means take the news with a grain of salt until an official word can be released.

evga-2016-gtx1080sli.png

Original Post Below

According to EVGA, NVIDIA will not support three- and four-way SLI on the GeForce GTX 1080. They state that, even if you use the old, multi-way connectors, it will still be limited to two-way. The new SLI connector (called SLI HB) will provide better performance “than 2-way SLI did in the past on previous series”. This suggests that the old SLI connectors can be used with the GTX 1080, although with less performance and only for two cards.

nvidia-2016-dreamhack-newsli.png

This is the only hard information that we have on this change, but I will elaborate a bit based on what I know about graphics APIs. Basically, SLI (and CrossFire) are simplifications of the multi-GPU load-balancing problem that make it easy to handle from within the driver, without the game's involvement. In DirectX 11 and earlier, the game cannot interface with the driver in that way at all. That does not apply to DirectX 12 and Vulkan, however. In those APIs, you will be able to explicitly load-balance by querying all graphics devices (including APUs) and split the commands yourself.

Even though a few DirectX 12 games exist, it's still unclear how SLI and CrossFire will be utilized in the context of DirectX 12 and Vulkan. DirectX 12 has the tier of multi-GPU called “implicit multi-adapter,” which allows the driver to load balance. How will this decision affect those APIs? Could inter-card bandwidth even be offloaded via SLI HB in DirectX 12 and Vulkan at all? Not sure yet (but you would think that they would at least add a Vulkan extension). You should be able to use three GTX 1080s in titles that manually load-balance to three or more mismatched GPUs, but only for those games.

If it relies upon SLI, which is everything DirectX 11, then you cannot. You definitely cannot.

Source: EVGA