EVGA Live Stream! Learn about Maxwell, X99 and Win Prizes!!

Subject: General Tech, Graphics Cards, Motherboards, Cases and Cooling | September 22, 2014 - 12:16 PM |
Tagged: X99, video, maxwell, live, GTX 980, GTX 970, evga

EVGA has been a busy company recently. It has continued to innovate with new coolers for the just-released GTX 980 and GTX 970 cards, newer power supplies that offer unique features along with improved quality and power output, and a new line of X99 chipset motherboards including a Micro ATX variant; hey, the company even released a line of high-performance mice this year! PC Perspective has covered basically all of these releases (and will continue to do so with pending GPU and motherboard reviews), but there is a lot that needs explaining.

To help out, an industry and community favorite from EVGA will be stopping by the PC Perspective offices: Jacob Freeman. You might know him as @EVGA_JacobF on Twitter or have seen him on countless forums, but he will be making an in-person appearance on Friday, September 26th on PC Perspective Live! We plan to discuss the brand new ACX 2.0 cooler on the Maxwell GPUs released last week, go over some of the highlights of the new X99 motherboards, and even touch on power supplies and the Torq mouse line as well.

pcperlive.png

EVGA GTX 980/970, X99, PSU and Torq Live Stream featuring Jacob Freeman

3pm ET / 12pm PT - September 26th

PC Perspective Live! Page

EVGA has been a supporter of PC Perspective for a long time and we asked them to give back to our community during this live stream - and they have stepped up! Look at this prize list:

How can you participate and win these awesome pieces of hardware? Just be here at 3pm ET / 12pm PT on http://www.pcper.com/live and we'll be announcing winners as we go for those that tune in. It really couldn't be simpler!

x99class.jpg

If you have questions you want to ask Jacob about EVGA, or any of its line of products, please leave them in the comments section below and we'll start compiling a list to address on the live stream Friday. Who knows, we may even save some prizes for some of our favorite questions!

To make sure you don't miss our live stream events, be sure you sign up for our spam-free PC Perspective Live! Mailing List. We email that group a couple hours before each event gets started.

NVIDIA GeForce 344.11 Driver and GeForce Experience 2.1.2 Released Alongside Maxwell-based GTX 980 and GTX 970

Subject: General Tech, Graphics Cards | September 20, 2014 - 09:56 AM |
Tagged: nvidia, maxwell, graphics drivers, geforce experience

Update: There is also a 344.16 driver for the GTX 970 and GTX 980, which resolves an issue specific to those cards.

When NVIDIA releases a new graphics card, especially one based on a new architecture, it has software ready to support it. First and most obviously, Maxwell arrives with the GeForce 344.11 drivers, the first release to support only Fermi and later GPUs. Mostly, the driver's purpose is to support the new graphics cards and to optimize for Borderlands: The Pre-Sequel, The Evil Within, F1 2014, and Alien: Isolation. It also supports multi-monitor G-Sync, which was previously impossible, even with three single-DisplayPort Kepler cards.

nvidia-geforce.png

At the same time, NVIDIA launched a new GeForce Experience with more exciting features. First, and to me least expected, it allows the SHIELD Wireless Controller to be connected to a PC, though only wired with its provided USB cable. It also means that you cannot use the controller on a PC without a GeForce graphics card.

If you have a GeForce GTX 900-series add-in board, you will be able to use Dynamic Super Resolution (DSR) and record 4K video with ShadowPlay. Recording performance on SLI systems has also been improved, apparently even for Kepler-based cards.

Both the drivers and GeForce Experience are available now.

Source: NVIDIA

Developer's View on DirectX 12 Alongside Maxwell Launch

Subject: General Tech, Graphics Cards | September 20, 2014 - 09:06 AM |
Tagged: unreal engine 4, nvidia, microsoft, maxwell, DirectX 12, DirectX

Microsoft and NVIDIA have decided to release some information about DirectX 12 (and DirectX 11.3) alongside the launch of the Maxwell-based GeForce GTX 980 and GeForce GTX 970 graphics cards. Mostly, they announced that Microsoft has teamed up with Epic Games to bring DirectX 12 to Unreal Engine 4. They currently have two demos, Elemental and Infiltrator, up and running with DirectX 12.

epic-ue4-infiltrator.jpg

Moreover, they have provided a form for developers who are interested in "early access" to apply for it. They continually discuss it in terms of Unreal Engine 4, but they do not explicitly say that other developers cannot apply. UE4 subscribers will get access to the Elemental demo in DX12, but it does not look like Infiltrator will be available.

DirectX 12 is expected to target games for Holiday 2015.

Source: Microsoft

ASUS Announces STRIX GTX 980 and STRIX GTX 970

Subject: Graphics Cards | September 19, 2014 - 01:57 PM |
Tagged: asus, strix, STRIX GTX 970, STRIX GTX 980, maxwell

The ASUS STRIX series comes with a custom DirectCU II cooler that is capable of running at 0dB when not under full load; in fact, you can choose the temperature at which the fans activate using the included GPU Tweak application. The factory overclock is modest, but thanks to that cooler and the 10-phase power design you will be able to push the card even further. The best news is the price: you get all of these extras for almost the same price as the reference cards are selling for!

unnamed.jpg

Fremont, CA (19th September, 2014) - ASUS today announced the STRIX GTX 980 and STRIX GTX 970, all-new gaming graphics cards packed with exclusive ASUS technologies, including DirectCU II and GPU Tweak for cooler, quieter and faster performance. The STRIX GTX 980 and STRIX GTX 970 are factory-overclocked at 1279MHz and 1253MHz respectively and are fitted with 4GB of high-speed GDDR5 video memory operating at speeds up to 7010MHz for the best gameplay experience.

Play League of Legends and StarCraft in silence!
The STRIX GTX 980 and STRIX GTX 970 both come with the ASUS-exclusive DirectCU II cooling technology. With a 10mm heatpipe to transport heat away from the GPU core, operating temperatures are 30% cooler and noise is 3X lower than reference designs. Efficient cooling and lower operating temperatures allow STRIX graphics cards to incorporate an intelligent fan-stop mode that can handle games such as League of Legends and StarCraft passively, making both cards ideal for gamers that prefer high-performance, low-noise PCs.

burger.jpg

Improved stability and reliability with Digi+ VRM technology
STRIX GTX 980 and STRIX GTX 970 graphics cards include Digi+ VRM technology. The 10-phase power design in the STRIX GTX 980 and 6-phase design in the STRIX GTX 970 use a digital voltage regulator to reduce power noise by 30% and enhance energy efficiency by 15%, increasing long-term stability and reliability. The STRIX GTX 970 is designed to use a single 8-pin power connector for clean and easy cable management.

Real-time monitoring and control with GPU Tweak software
The STRIX GTX 980 and STRIX GTX 970 come with GPU Tweak, an exclusive ASUS tool that enables users to squeeze the very best performance from their graphics card. GPU Tweak provides the ability to finely control GPU speeds, voltages and video memory clock speeds in real time, so overclocking is easy and can be carried out with high confidence.

GPU Tweak also includes a streaming tool that lets users share on-screen action over the internet in real time, meaning others can watch live as games are played. It is even possible to add a title to the streaming window along with scrolling text, pictures and webcam images.

AVAILABILITY & PRICING
ASUS STRIX GTX 980 and GTX 970 graphics cards will be available at ASUS authorized resellers and distributors starting on September 19, 2014. Suggested US MSRP pricing is $559 for the STRIX GTX 980 and $339 for the STRIX GTX 970.

Source: ASUS

NVIDIA's Maxwell offers smart performance

Subject: Graphics Cards | September 19, 2014 - 11:17 AM |
Tagged: vr direct, video, nvidia, mfaa, maxwell, GTX 980, GTX 970, GM204, geforce, dx12, dsr

The answers to the two most important questions are as follows: the GTX 980 will cost you around $560 compared to $500 for an R9 290X, while the GTX 970 comes in at an attractive $330 compared to $380 for an R9 290. Availability is hard to predict, but the cards will be shipping soon and you can pre-order your choice of card by following the links on the last page of Ryan's review. Among all the new features added to this GPU, one of the most impressive is the power draw; as you can see in [H]ard|OCP's review, this card pulls 100W less than the 290X at full load, although it did run warmer than the 290X Double Dissipation card that [H] compared it to, something that may change once 980s with custom coolers arrive. Follow those links to see the benchmarking results of this card, both synthetic and in-game.

14110637240cPED1snfp_5_15_l.jpg

"Today NVIDIA launches its newest Maxwell GPU. There will be two new GPUs, the GeForce GTX 980 and GeForce GTX 970. These next generation GPUs usher in new features and performance that move the gaming industry forward. We discuss new features, architecture, and evaluate the gameplay performance against the competition."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Three Display Component Vendors Support AMD FreeSync

Subject: Graphics Cards, Displays | September 18, 2014 - 03:52 PM |
Tagged: amd, freesync, DisplayPort, adaptive sync

MStar, Novatek, and Realtek, three vendors of scaler units for use in displays, have announced support for AMD's FreeSync. Specifically, for the Q1'15 line of monitors, these partners will provide scaler chips that use DisplayPort Adaptive-Sync and, when paired with a compatible AMD GPU, will support FreeSync.

amd-freesync1.jpg

The press release claims that these scaler chips will either support 1080p and 1440p monitors at up to 144Hz, or drive 4K displays at up to 60Hz. While this is promising, at least compared to the selection at G-Sync's launch a year earlier, it does not mean that this variety of monitors will be available -- just that the internal components will be available for interested display vendors. It also suggests that there are probably interested display vendors.

AMD and partners "intend to reveal" displays via a "media review program" in Q1. This is a little later than what we expected from Richard Huddy's "next month" statements, but it is possible that "Sampling" and "Media Review Program" are two different events. Even if it is "late", this is the sort of thing that is forgivable to me (missing a few months while relying on a standards body and several, independent companies).

Source: AMD

Can a heavily overclocked ASUS STRIX GTX 750 Ti OC beat an R9 270?

Subject: Graphics Cards | September 18, 2014 - 12:56 PM |
Tagged: DirectCU II, asus, STRIX GTX 750 Ti OC, factory overclocked

The ASUS STRIX GTX 750 Ti OC sports the custom DirectCU II cooling system, which not only improves the temperatures on the card but also reduces the noise produced by the fans. It comes out of the box with an overclocked GPU base clock of 1124MHz and a boost clock of 1202MHz, with the 2GB of VRAM set to the stock speed of 5.4GHz; [H]ard|OCP managed to increase that to an impressive 1219/1297MHz, and 6.0GHz for the VRAM, without increasing voltages. Unfortunately, even with that overclock it lagged behind the Sapphire Radeon R9 270 Dual-X, which happens to be about the same price at $170.

1410203265ODA6sXGQNR_1_12_l.jpg

"Rounding out our look at ASUS' new STRIX technology we have another STRIX capable video card on our test bench today, this time based on the GTX 750 Ti GPU. We will take the ASUS STRIX GTX 750 Ti OC Edition and test it against an AMD Radeon R9 270 and AMD Radeon R9 265 to see what reigns supreme under $200."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

More NVIDIA GeForce GTX 980 Leaked Details

Subject: General Tech, Graphics Cards | September 15, 2014 - 05:02 PM |
Tagged: nvidia, geforce, GTX 980

Details and photographs of the GeForce GTX 980 are leaking on various forums and websites. Based on the Maxwell architecture, it is supposed to be faster and more efficient than Kepler while being manufactured on the same 28nm fab process. While we were uncertain before, it now looks like GTX 980 will be its formal name, as seen in the leaked photographs below.

NVIDIA-GeForce-GTX-980-Front-Picture.jpg

Image Credit: Videocardz

As expected, the cooler is a continuation of NVIDIA's reference cooler, as seen on recent high-end graphics cards (such as the GeForce Titan). Again, this is not a surprise. The interesting part is that it is rated for about 250W whereas Maxwell is rumored to draw 180W. While the reference card has two six-pin PCIe power connectors, I am curious to see if the excess cooling will lead to interesting overclocks. That is not even mentioning what the AIB partners can do.

NVIDIA-GeForce-GTX-980-Back-Picture.jpg

Image Credit: Videocardz

Beyond its over-engineering for Maxwell's TDP, it also includes a back plate.

nvidia-gtx-980-display-connectors.jpg

Image Credit: Chiphell via Videocardz

Its display connectors have been hotly anticipated. As you can see above, the GTX 980 has five outputs: three DisplayPort, one HDMI, and one DVI. Which version of HDMI? Which version of DisplayPort? No clue at the moment. There has been some speculation regarding HDMI 2.0, and the DisplayPort 1.3 standard was just released to the public today, but I would be pleasantly surprised if even one of these made it in.

Check out Videocardz for a little bit more.

Source: Videocardz

AMD R9 390X Is Rumored to Be Liquid Cooled

Subject: General Tech, Graphics Cards, Cases and Cooling | September 15, 2014 - 02:50 PM |
Tagged: amd, R9, r9 390x, liquid cooler, liquid cooling, liquid cooling system, asetek

Less than a year after the launch of AMD's R9 290X, we are beginning to hear rumors of a follow-up. What is being called the R9 390X (because if it is called anything else, then that was a very short-lived branding scheme) might be liquid cooled. This would be the first single-processor reference graphics card to ship with an integrated water cooler. That said, the public evidence is not as firm as I would normally like.

amd-r9-390x-leak.jpg

Image Credit: Baidu Forums

According to Tom's Hardware, Asetek is working on a liquid-cooled design for "an undisclosed OEM". The product is expected to ship during the first half of 2015 and the press release claims that it will "continue Asetek's success in the growing liquid cooling market". Technically, this could be a collaboration with an AIB partner, not necessarily a GPU developer. That said, the leaked photograph looks like a reference card.

We don't really know anything more than this. I would expect that it will be a refresh based on Hawaii, but that is pure speculation. I have no evidence to support that.

Personally, I would hope that a standalone air-cooled model will also be available. While I have no experience with liquid cooling, it seems like a bit of an extra burden that not every purchaser of a top-of-the-line single-GPU add-in board would want to bear -- specifically, placing the radiator, assuming their case even supports one. That said, having a high-performing reference card will probably make the initial benchmarks look extra impressive, which could be a win in itself.

NCIX and LinusTech Do Four Single GPUs... Twice

Subject: General Tech, Graphics Cards | September 11, 2014 - 04:46 PM |
Tagged: quad sli, quad crossfire, nvidia, amd

Psst. AMD fans. Don't tell "Team Green" but Linus decided to take four R9 290X graphics cards and configure them in Quad CrossFire formation. They did not seem to have too much difficulty setting it up, although they did have trouble with throttling and with setting up CrossFire profiles. When they finally were able to test it, they got a 3DMark Fire Strike Extreme score of 14,979.

Psst. NVIDIA fans. Don't tell "Team Red" but Linus also decided to take four GeForce Titan Black graphics cards and configure them in Quad SLI formation. He had a bit of a difficult time setting up the machine at first, requiring a reshuffle of the cards (how would reordering PCIe slots for identical cards do anything?) and working through a few driver crashes, but it worked. Eventually, he got a 3DMark Fire Strike Extreme score of around 13,300 (give or take a couple hundred).

NVIDIA GM204 info is leaking

Subject: General Tech, Graphics Cards | September 8, 2014 - 02:03 PM |
Tagged: leak, nvidia, GM204, GTX 980, GTX 980M, GTX 970, GTX 970M

Please keep in mind that this information has been assembled via research done by WCCF Tech and Videocardz from 3DMark entries of unreleased GPUs; we won't get the official numbers until the middle of this month. That said, rumours and guesswork about new hardware are a favourite pastime of our readers, so here is the information we've seen so far about the upcoming GM204 chip from NVIDIA. On the desktop side are the GeForce GTX 980 and GeForce GTX 970, which should both have 4GB of GDDR5 on a 256-bit bus with GPU clock speeds ranging from 1127 to 1190 MHz. The performance shown on 3DMark has the GTX 980 beating the 780 Ti and R9 290X, while the GTX 970 performs similarly to the plain GTX 780 and falls behind the 290X. SLI scaling looks rather attractive, with a pair of GTX 980s coming within a hair of the performance of the R9 295X2.

NVIDIA-GeForce-GTX-980-and-GeForce-GTX-970-3DMark-Firestrike-Performance-635x599.png

On the mobile side, things look bleak for AMD: the GTX 980M and GTX 970M surpass the current GTX 880M, which in turn benchmarks far better than AMD's M290X chip. Again, the scaling in SLI systems will be impressive, assuming the leaks, which you can see in depth here, are accurate. It won't be too much longer before we know one way or the other, so you might want to keep your finger off of the Buy button for a short while.

Source: WCCF Tech

Intel Graphics Drivers Claim Significant Improvements

Subject: General Tech, Graphics Cards, Processors | September 6, 2014 - 02:25 PM |
Tagged: iris pro, iris, intel hd graphics, Intel

I was originally intending to test this with benchmarks but, after a little while, I realized that Ivy Bridge was not supported. This graphics driver starts and ends with Haswell. While I cannot verify their claims, Intel advertises up to 30% more performance in some OpenCL tasks and a 10% increase in games like Batman: Arkham City and Sleeping Dogs. They even claim double performance out of League of Legends at 1366x768.

inteltf2.jpg

Intel is giving gamers a "free lunch".

The driver also tunes Conservative Morphological Anti-Aliasing (CMAA). They claim it looks better than MLAA and FXAA, "without performance impact" (their whitepaper from March showed a ~1-to-1.5 millisecond cost on Intel HD 5000). Intel recommends disabling it after exiting games to prevent it from blurring other applications, and they automatically disable it in Windows, Internet Explorer, Chrome, Firefox, and Windows 8.1 Photo.

Adaptive Rendering Control was also added in this driver. It limits the redrawing of identical frames by comparing each frame it does draw against the previously drawn one, and adjusts the frame rate accordingly. This is most useful for games like Angry Birds, Minesweeper, and Bejeweled LIVE. It is disabled when not on battery power, or when the driver is set to "Maximum Performance".
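
To illustrate the general idea (this is purely a conceptual sketch, not Intel's actual driver logic, and the frame representation is assumed for the example), a redraw limiter of this sort only needs to hash each finished frame and skip presenting when the hash matches the previous frame:

```c
/* Conceptual sketch only -- not Intel's driver code. It treats a frame as a
 * plain pixel buffer and uses a simple FNV-1a hash to detect that nothing
 * changed since the last frame, in which case the redraw can be skipped and
 * the effective frame rate allowed to drop. */
#include <stdint.h>
#include <stddef.h>
#include <stdbool.h>

static uint64_t hash_frame(const uint8_t *pixels, size_t len)
{
    uint64_t h = 0xcbf29ce484222325ULL;   /* FNV-1a 64-bit offset basis */
    for (size_t i = 0; i < len; i++) {
        h ^= pixels[i];
        h *= 0x100000001b3ULL;            /* FNV-1a 64-bit prime */
    }
    return h;
}

/* Returns true if the new frame differs from the last one and should be drawn. */
bool should_present(const uint8_t *frame, size_t len, uint64_t *prev_hash)
{
    uint64_t h = hash_frame(frame, len);
    if (h == *prev_hash)
        return false;                     /* identical frame: skip the redraw */
    *prev_hash = h;
    return true;
}
```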

The Intel Iris and HD graphics driver is available from Intel, for both 32-bit and 64-bit Windows 7, 8, and 8.1, on many Haswell-based GPUs.

Source: Intel

AMD's Dropping the R9 295X2 Price to $999 USD

Subject: General Tech, Graphics Cards | September 3, 2014 - 06:28 PM |
Tagged: amd, R9, r9 295x2, price cut

amd-r9-295x2-gigabyte.jpg

While not fully in effect yet, AMD is cutting $500 off of the R9 295X2's price tag, bringing it to $999 USD. Currently, there are two models available on Newegg USA at the reduced price, and one at Amazon for $1200. We expect to see other SKUs drop soon as well. This puts the water-cooled R9 295X2 just below the cost of two air-cooled R9 290X graphics cards.

If you were interested in this card, now might be the time (if one of the reduced units is available).

Source: Newegg

Matrox Creates Professional Graphics Cards with AMD GPUs

Subject: General Tech, Graphics Cards | September 3, 2014 - 03:15 PM |
Tagged: Matrox, firepro, cape verde xt gl, cape verde xt, cape verde, amd

Matrox, along with S3, develops GPU ASICs for desktop add-in boards, alongside AMD and NVIDIA. Last year, Matrox sold fewer than 7,000 units in a quarter according to my math (a share that rounds to 0.0% implies less than 0.05% of the total market, which worked out to about 7,000 units that quarter). Today, Matrox Graphics Inc. announced that it will use an AMD GPU in its upcoming product line.
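
For the curious, here is that back-of-the-envelope math spelled out. The roughly 14-million-unit quarterly total is inferred from the article's own figures rather than taken from an official report:

```c
/* Back-of-the-envelope version of the parenthetical above. The ~14 million
 * unit quarterly total is inferred (0.05% of it is ~7,000 units), not an
 * official JPR number. */
#include <stdio.h>

int main(void)
{
    double total_units = 14e6;          /* approx. add-in boards shipped that quarter (assumed) */
    double rounding_threshold = 0.0005; /* 0.05%: anything below rounds to a 0.0% share */

    printf("Upper bound on Matrox shipments: %.0f units\n",
           total_units * rounding_threshold);   /* ~7,000 units */
    return 0;
}
```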

Matrox-AMD-Logos-Image.jpg

While they do not mention a specific processor, they note that "the selected AMD GPU" will be manufactured on a 28nm process with 1.5 billion transistors. It will support DirectX 11.2, OpenGL 4.4, and OpenCL 1.2, and it will have a 128-bit memory bus.

Basically, it kind of has to be Cape Verde XT (or XT GL), unless it is a new, unannounced GPU.

If it is Cape Verde XT, it would have about 1.0 to 1.2 TFLOPS of single-precision performance, depending on the chosen clock rate; whatever clock is chosen, the chip contains 640 shader processors. It was first released in February 2012 with the Radeon HD 7770 GHz Edition. Again, this is assuming that AMD will not release a GPU refresh for that category.
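
That TFLOPS range follows directly from the shader count: GCN parts do two floating-point operations (one fused multiply-add) per shader per clock, so a quick calculation like the sketch below reproduces the estimate (the specific clock values are only illustrative):

```c
/* Peak single precision = 2 FLOPs per shader per clock. With Cape Verde XT's
 * 640 shaders, clocks around 800-950 MHz land in the 1.0-1.2 TFLOPS range;
 * the HD 7770 GHz Edition's 1000 MHz works out to 1.28 TFLOPS. */
#include <stdio.h>

int main(void)
{
    const int shaders = 640;                     /* Cape Verde XT */
    const int clocks_mhz[] = { 800, 950, 1000 };

    for (int i = 0; i < 3; i++) {
        double tflops = 2.0 * shaders * (clocks_mhz[i] * 1e6) / 1e12;
        printf("%4d MHz -> %.2f TFLOPS\n", clocks_mhz[i], tflops);
    }
    return 0;
}
```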

Matrox will provide its PowerDesk software to configure multiple monitors, and it will work alongside AMD's professional graphics drivers. It is sad to see a GPU ASIC manufacturer throw in the towel, at least temporarily, but hopefully Matrox can use AMD's technology to remain in the business with competitive products. Who knows: maybe they will make a return when future graphics APIs reduce the burden of driver and product development?

Source: Matrox

PNY's Customized OC series, decent value for great performance

Subject: Graphics Cards | August 25, 2014 - 12:57 PM |
Tagged: pny, gtx 780, gtx 780 ti, Customized OC, factory overclocked

PNY is not as heavily marketed as some GPU resellers in North America, but that doesn't mean they are not hard at work designing custom cards. Hardware Canucks recently tried out the Customized OC GTX 780 and 780 Ti, both at their factory overclocks and pushed to their limits with manual overclocking. Using EVGA's Precision overclocking tools, they pushed the GTX 780 to 1120MHz core and 6684MHz RAM, and the Ti to an impressive 1162MHz core and 7800MHz RAM. Read on to see how effective the custom cooler proved to be, as it is also a major part of the Customized series.

PNY-CUSTOM-1.jpg

"PNY's latest Customized series will be rolling through their GTX 780 and GTX 780 Ti lineups, bringing high end cooling and increased performance."

Here are some more Graphics Card articles from around the web:

Graphics Cards

AMD Announces Radeon R9 285X and R9 285 Graphics Card

Subject: Graphics Cards | August 23, 2014 - 07:46 AM |
Tagged: radeon, r9 285, R9, amd, 285

Today during AMD's live stream event celebrating 30 years of graphics and gaming, the company spent a bit of time announcing and teasing new graphics cards, the Radeon R9 285X and R9 285. Likely based on the Tonga GPU die, the specifications haven't been confirmed, but most believe that the chip will feature 2048 stream processors, 128 texture units, 32 ROPs and a 256-bit memory bus.

r9285.jpg

In a move to raise money for the Child's Play charity, AMD currently has a Radeon R9 285 listed on eBay. The listing shows an ASUS-built, Strix-style cooled retail card, with 2GB of memory being the only specification visible on the box.

r92852.jpg

The R9 285X and R9 285 will more than likely replace the R9 280X and R9 280, and we should see them shipping and available in very early September.

UPDATE: AMD showed specifications of the Radeon R9 285 during the live stream.

r92853.jpg

For those of you with eyes as bad as mine, here are the finer points:

  • 1,792 Stream Processors
  • 918 MHz GPU Clock
  • 3.29 TFLOPS peak performance
  • 112 Texture units
  • 32 ROPs
  • 2GB GDDR5
  • 256-bit memory bus
  • 5.5 GHz memory clock
  • 2x 6-pin power connectors
  • 190 watt TDP
  • $249 MSRP
  • Release date: September 2nd

These Tonga GPU specifications are VERY similar to those of the R9 280: 1,792 stream processors, 112 texture units, etc. However, the R9 280 had a wider memory bus (384-bit) running at a 500 MHz lower effective frequency. GPU clock speeds on Tonga look like they are just slightly lower as well. Maybe most interesting is the frame buffer size drop from 3GB to 2GB.
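
As a quick sanity check on those numbers (a rough sketch; the R9 280's 5.0 GHz effective memory clock is implied by the "500 MHz lower" comparison above), the quoted 3.29 TFLOPS and the memory-bandwidth trade-off work out like this:

```c
/* Peak compute is 2 FLOPs per stream processor per clock; memory bandwidth
 * is (bus width in bytes) * (effective transfer rate). */
#include <stdio.h>

int main(void)
{
    /* Radeon R9 285 (Tonga), from the spec list above */
    double tflops_285 = 2.0 * 1792 * 918e6 / 1e12;   /* ~3.29 TFLOPS */
    double bw_285_gbs = (256 / 8) * 5.5;             /* 32 bytes * 5.5 GT/s = 176 GB/s */

    /* Radeon R9 280: wider 384-bit bus at 5.0 GHz effective */
    double bw_280_gbs = (384 / 8) * 5.0;             /* 48 bytes * 5.0 GT/s = 240 GB/s */

    printf("R9 285 peak compute:     %.2f TFLOPS\n", tflops_285);
    printf("R9 285 memory bandwidth: %.0f GB/s\n", bw_285_gbs);
    printf("R9 280 memory bandwidth: %.0f GB/s\n", bw_280_gbs);
    return 0;
}
```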

That's all we have for now, but I expect we'll have our samples in very soon and expect a full review shortly!

UPDATE 2: Apparently AMD hasn't said anything about the Radeon R9 285X, so for the time being, that still falls under the "rumor" category. I'm sure we'll know more soon though.

Source: AMD

PCPer Live! Recap - NVIDIA G-Sync Surround Demo and Q&A

Subject: Graphics Cards, Displays | August 22, 2014 - 05:05 PM |
Tagged: video, gsync, g-sync, tom petersen, nvidia, geforce

Earlier today we had NVIDIA's Tom Petersen in studio to discuss the retail availability of G-Sync monitors as well as to get hands-on with a set of three ASUS ROG Swift PG278Q monitors running in G-Sync Surround! It was truly an impressive sight, and if you missed any of it, you can catch the entire replay right here.

Even if seeing the ASUS PG278Q monitor again doesn't interest you (we have our full review of the monitor right here), you won't want to miss the very detailed Q&A that occurs, answering quite a few reader questions about the technology. Covered items include:

  • Potential added latency of G-Sync
  • Future needs for multiple DP connections on GeForce GPUs
  • Upcoming 4K and 1080p G-Sync panels
  • Can G-Sync Surround work through an MST Hub?
  • What happens to G-Sync when the frame rate exceeds the panel refresh rate? Or drops below minimum refresh rate?
  • What does that memory on the G-Sync module actually do??
  • A demo of the new NVIDIA SHIELD Tablet capabilities
  • A whole lot more!

Another big thank you to NVIDIA and Tom Petersen for coming out our way and for spending the time to discuss these topics with our readers. Stay tuned here at PC Perspective as we will have more thoughts and reactions to G-Sync Surround very soon!!

NVIDIA Live Stream: We Want Your Questions!

Subject: Graphics Cards, Displays, Mobile | August 21, 2014 - 02:23 PM |
Tagged: nvidia, video, live, shield, shield tablet, g-sync, gsync, tom petersen

Tomorrow at 12pm EDT / 9am PDT, NVIDIA's Tom Petersen will be stopping by the PC Perspective office to discuss some topics of interest. There has been no lack of topics floating around the world of graphics cards, displays, refresh rates and tablets recently, and I expect the show tomorrow to be incredibly interesting and educational.

On hand we'll be doing demonstrations of G-Sync Surround (3 panels!) with the ASUS ROG Swift PG278Q display (our review here) and also show off the SHIELD Tablet (we have a review of that too) with some multiplayer action. If you thought the experience with a single G-Sync monitor was impressive, you will want to hear what a set of three of them can be like.

pcperlive.png

NVIDIA Live Stream with Tom Petersen

9am PT / 12pm ET - August 22nd

PC Perspective Live! Page

The topic list is going to include (but is not limited to):

  • ASUS PG278Q G-Sync monitor
  • G-Sync availability and pricing
  • G-Sync Surround setup, use and requirements
  • Technical issues surrounding G-Sync: latency, buffers, etc.
  • Comparisons of G-Sync to Adaptive Sync
  • SHIELD Tablet game play
  • Altoids?

gsyncsurround.jpg

But we want your questions! Do you have burning issues that you think need to be addressed by Tom and the NVIDIA team about G-Sync, FreeSync, GameWorks, Tegra, tablets, GPUs and more? Nothing is off limits here, though obviously Tom may be cagey about future announcements. Please use the comments section on this news post below (registration not required) to ask your questions and we can organize them before the event tomorrow. We MIGHT even be able to come up with a couple of prizes to give away to live viewers as well...

See you tomorrow!!

An odd Q2 for tablets and PCs

Subject: General Tech, Graphics Cards | August 19, 2014 - 09:30 AM |
Tagged: jon peddie, gpu market share, q2 2014

Jon Peddie Research's latest Market Watch adds even more ironic humour to the media's continuing proclamations of the impending doom of the PC industry. This quarter saw tablet sales decline while overall PC sales were up, and that was without any major releases to drive purchasers to adopt new technology. While JPR does touch on the overall industry, this report is focused on the sale of GPUs and APUs and happens to contain some great news for AMD. They saw their share increase by 11% from last quarter, which works out to just over a percentage point of the entire market. Intel saw a small rise in share and still holds the majority of the market, as PCs with no discrete GPU are more likely to contain Intel's chips than AMD's. That leaves NVIDIA, who is still banking solely on discrete GPUs and saw more than an 8% decline from last quarter, almost two percentage points of the total market. Check out the other graphs in JPR's overview right here.

unnamed.jpg

"The big drop in graphics shipments in Q1 has been partially offset by a small rise this quarter. Shipments were up 3.2% quarter-to-quarter, and down 4.5% compared to the same quarter last year."

Here is some more Tech News from around the web:

Tech Talk

Khronos Announces "Next" OpenGL & Releases OpenGL 4.5

Subject: General Tech, Graphics Cards, Shows and Expos | August 15, 2014 - 05:33 PM |
Tagged: siggraph 2014, Siggraph, OpenGL Next, opengl 4.5, opengl, nvidia, Mantle, Khronos, Intel, DirectX 12, amd

Let's be clear: there are two stories here. The first is the release of OpenGL 4.5 and the second is the announcement of the "Next Generation OpenGL Initiative". They both appear in the same press release, but they are two different statements.

OpenGL 4.5 Released

OpenGL 4.5 expands the core specification with a few extensions. Compatible hardware, with OpenGL 4.5 drivers, will be guaranteed to support these. This includes features like direct_state_access, which allows modifying objects without binding them to the context, and support for OpenGL ES 3.1 features that are traditionally missing from OpenGL 4, which makes it easier to port OpenGL ES 3.1 applications to OpenGL.
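
As a rough illustration of what direct state access buys you (a minimal sketch assuming a GL 4.5 context and a loader such as GLEW are already initialized; the helper function is just for illustration), compare the old bind-to-edit pattern with the new named-object calls:

```c
#include <GL/glew.h>

GLuint make_vertex_buffer(const void *verts, GLsizeiptr size)
{
    GLuint buf;

    /* Pre-4.5 pattern: bind the object to a target, then operate on the target:
     *   glGenBuffers(1, &buf);
     *   glBindBuffer(GL_ARRAY_BUFFER, buf);
     *   glBufferData(GL_ARRAY_BUFFER, size, verts, GL_STATIC_DRAW);
     */

    /* OpenGL 4.5 direct state access: the object is created and filled by
     * name, without ever binding it to the context. */
    glCreateBuffers(1, &buf);
    glNamedBufferData(buf, size, verts, GL_STATIC_DRAW);
    return buf;
}
```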

opengl_logo.jpg

It also adds a few new extensions as an option:

ARB_pipeline_statistics_query lets a developer ask the GPU what it has been doing. This could be useful for "profiling" an application (listing completed work to identify optimization points).

ARB_sparse_buffer allows developers to perform calculations on pieces of generic buffers without loading them entirely into memory. This is similar to ARB_sparse_textures... except that those are for textures. Buffers are useful for things like vertex data (and so forth).

ARB_transform_feedback_overflow_query is apparently designed to let developers choose whether or not to draw objects based on whether the buffer is overflowed. I might be wrong, but it seems like this would be useful for deciding whether or not to draw objects generated by geometry shaders.

KHR_blend_equation_advanced allows new blending equations between objects. If you use Photoshop, this would be "multiply", "screen", "darken", "lighten", "difference", and so forth. On NVIDIA's side, this will be directly supported on Maxwell and Tegra K1 (and later). Fermi and Kepler will support the functionality, but the driver will perform the calculations with shaders. AMD has yet to comment, as far as I can tell.
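
For a sense of what those advanced blend equations look like from the application side (a hedged sketch based on the KHR_blend_equation_advanced extension; the wrapper function and its placement are my own illustration), the new modes are selected with the regular blend-equation call:

```c
/* Assumes the KHR_blend_equation_advanced extension is exposed and a GL
 * context is current. The fragment shader must also opt in with a layout
 * qualifier such as "layout(blend_support_multiply) out;". */
#include <GL/glew.h>

void draw_with_multiply_blend(void)
{
    glEnable(GL_BLEND);
    glBlendEquation(GL_MULTIPLY_KHR);   /* Photoshop-style "multiply" */

    /* ... issue the draw calls for the blended geometry here ... */

    /* The extension requires a barrier between overlapping draws when
     * coherent blending is not advertised. */
    glBlendBarrierKHR();
}
```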

nvidia-opengl-debugger.jpg

Image from NVIDIA GTC Presentation

For developers, NVIDIA has launched 340.65 (340.23.01 for Linux) beta drivers with OpenGL 4.5 support. If you are not looking to create OpenGL 4.5 applications, do not get this driver; you really should not have any use for it at all.

Next Generation OpenGL Initiative Announced

The Khronos Group has also announced "a call for participation" to outline a new specification for graphics and compute. They want it to allow developers explicit control over CPU and GPU tasks, be multithreaded, have minimal overhead, have a common shader language, and "rigorous conformance testing". This sounds a lot like the design goals of Mantle (and what we know of DirectX 12).

amd-mantle-queues.jpg

And really, from what I hear and understand, that is what OpenGL needs at this point. Graphics cards look nothing like they did a decade ago (or over two decades ago). They each have very similar interfaces and data structures, even if their fundamental architectures vary greatly. If we can draw a line in the sand, legacy APIs can be supported but not optimized heavily by the drivers. After a short time, available performance for legacy applications would be so high that it wouldn't matter, as long as they continue to run.

Add to that, next-generation drivers should be significantly easier to develop, considering the reduced error checking (and other responsibilities). As I said on Intel's DirectX 12 story, it is still unclear whether this will lead to enough of a performance increase to make most optimizations, such as those that increase workload or developer effort in exchange for queuing fewer GPU commands, unnecessary. We will need to wait for game developers to use it for a bit before we know.