PCPer Live! Recap - NVIDIA G-Sync Surround Demo and Q&A

Subject: Graphics Cards, Displays | August 22, 2014 - 08:05 PM |
Tagged: video, gsync, g-sync, tom petersen, nvidia, geforce

Earlier today we had NVIDIA's Tom Petersen in studio to discuss the retail availability of G-Sync monitors and to get hands-on with a set of three ASUS ROG Swift PG278Q monitors running in G-Sync Surround! It was truly an impressive sight, and if you missed any of it, you can catch the entire replay right here.

Even if seeing the ASUS PG278Q monitor again doesn't interest you (we have our full review of the monitor right here), you won't want to miss the very detailed Q&A, which answers quite a few reader questions about the technology. Covered items include:

  • Potential added latency of G-Sync
  • Future needs for multiple DP connections on GeForce GPUs
  • Upcoming 4K and 1080p G-Sync panels
  • Can G-Sync Surround work through an MST Hub?
  • What happens to G-Sync when the frame rate exceeds the panel refresh rate? Or drops below the minimum refresh rate?
  • What does that memory on the G-Sync module actually do?
  • A demo of the new NVIDIA SHIELD Tablet capabilities
  • A whole lot more!

Another big thank you to NVIDIA and Tom Petersen for coming out our way and for spending the time to discuss these topics with our readers. Stay tuned here at PC Perspective as we will have more thoughts and reactions to G-Sync Surround very soon!!

NVIDIA Live Stream: We Want Your Questions!

Subject: Graphics Cards, Displays, Mobile | August 21, 2014 - 05:23 PM |
Tagged: nvidia, video, live, shield, shield tablet, g-sync, gsync, tom petersen

Tomorrow at 12pm EDT / 9am PDT, NVIDIA's Tom Petersen will be stopping by the PC Perspective office to discuss some topics of interest. There has been no lack of topics floating around the world of graphics cards, displays, refresh rates, and tablets recently, and I expect the show tomorrow to be incredibly interesting and educational.

We'll have hardware on hand for demonstrations of G-Sync Surround (3 panels!) with the ASUS ROG Swift PG278Q display (our review here), and we'll also show off the SHIELD Tablet (we have a review of that too) with some multiplayer action. If you thought the experience with a single G-Sync monitor was impressive, you will want to hear what a set of three of them is like.

NVIDIA Live Stream with Tom Petersen

9am PT / 12pm ET - August 22nd

PC Perspective Live! Page

The topic list is going to include (but is not limited to):

  • ASUS PG278Q G-Sync monitor
  • G-Sync availability and pricing
  • G-Sync Surround setup, use and requirements
  • Technical issues surrounding G-Sync: latency, buffers, etc.
  • Comparisons of G-Sync to Adaptive Sync
  • SHIELD Tablet game play
  • Altoids?

But we want your questions! Do you have burning issues that you think need to be addressed by Tom and the NVIDIA team about G-Sync, FreeSync, GameWorks, Tegra, tablets, GPUs and more? Nothing is off limits here, though obviously Tom may be cagey about future announcements. Please use the comments section on this news post below (registration not required) to ask your questions, and we can organize them before the event tomorrow. We MIGHT even be able to come up with a couple of prizes to give away for live viewers as well...

See you tomorrow!!

Podcast #314 - Corsair Air 240 Case, Angelbird SSD wrk, DDR4 Pricing, and more!

Subject: General Tech | August 21, 2014 - 12:50 PM |
Tagged: podcast, corsair, angelbird, wrk, ddr4, freesync, gsync, nvidia, amd, Intel, titan-z, VIA, video

PC Perspective Podcast #314 - 08/21/2014

Join us this week as we discuss the Corsair Air 240 Case, Angelbird SSD wrk, DDR4 Pricing, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Program length: 1:24:13

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Khronos Announces "Next" OpenGL & Releases OpenGL 4.5

Subject: General Tech, Graphics Cards, Shows and Expos | August 15, 2014 - 08:33 PM |
Tagged: siggraph 2014, Siggraph, OpenGL Next, opengl 4.5, opengl, nvidia, Mantle, Khronos, Intel, DirectX 12, amd

Let's be clear: there are two stories here. The first is the release of OpenGL 4.5 and the second is the announcement of the "Next Generation OpenGL Initiative". Both arrived in the same press release, but they are two different statements.

OpenGL 4.5 Released

OpenGL 4.5 expands the core specification with a few extensions; compatible hardware with OpenGL 4.5 drivers is guaranteed to support them. These include ARB_direct_state_access, which allows objects to be created and modified directly rather than requiring them to be bound to the context first, and support for OpenGL ES 3.1 features traditionally missing from OpenGL 4, which makes it easier to port OpenGL ES 3.1 applications to desktop OpenGL.
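
As a rough illustration (assuming a current OpenGL 4.5 context and a loader such as GLEW), here is how direct state access removes the bind-to-edit dance; upload_vertices() is just a hypothetical helper:

```c
/* A minimal sketch contrasting classic bind-to-edit with OpenGL 4.5
 * direct state access. Assumes a current 4.5 context and a loader
 * such as GLEW; upload_vertices() is a hypothetical helper. */
#include <GL/glew.h>

void upload_vertices(const float *data, GLsizeiptr size)
{
    /* Pre-4.5: the buffer must be bound before it can be filled,
     * clobbering whatever was bound to GL_ARRAY_BUFFER. */
    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, size, data, GL_STATIC_DRAW);

    /* OpenGL 4.5: the object is created and filled directly,
     * leaving the context's binding state untouched. */
    GLuint vbo_dsa;
    glCreateBuffers(1, &vbo_dsa);
    glNamedBufferData(vbo_dsa, size, data, GL_STATIC_DRAW);
}
```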

It also adds a few new extensions as an option:

ARB_pipeline_statistics_query lets a developer ask the GPU what it has been doing: how many vertices were submitted, how many shader invocations ran, and so forth. This could be useful for "profiling" an application (listing completed work to identify optimization points).
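
A minimal sketch of what that looks like in practice, using the extension's fragment shader invocation counter; draw_scene() is a stand-in for the application's own rendering code:

```c
#include <GL/glew.h>

/* draw_scene() stands in for the application's own rendering. */
extern void draw_scene(void);

GLuint64 count_fragment_invocations(void)
{
    GLuint query;
    GLuint64 invocations = 0;

    glGenQueries(1, &query);
    glBeginQuery(GL_FRAGMENT_SHADER_INVOCATIONS_ARB, query);
    draw_scene();
    glEndQuery(GL_FRAGMENT_SHADER_INVOCATIONS_ARB);

    /* Blocks until the GPU result is ready; a real profiler would
     * poll GL_QUERY_RESULT_AVAILABLE to avoid stalling. */
    glGetQueryObjectui64v(query, GL_QUERY_RESULT, &invocations);
    glDeleteQueries(1, &query);
    return invocations;
}
```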

ARB_sparse_buffer allows developers to perform calculations on pieces of large, generic buffers without committing physical memory for the entire thing. This is similar to ARB_sparse_texture... except that extension is for textures. Buffers are useful for things like vertex data (and so forth).
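
A hedged sketch of the idea: reserve a large virtual buffer up front, then commit physical pages only where needed. The 256 MB size is arbitrary, and error checking is omitted:

```c
#include <GL/glew.h>

/* Reserve 256 MB of buffer address space, then commit physical
 * memory for only the first page-aligned region. */
GLuint create_sparse_vertex_buffer(void)
{
    GLint page_size = 0;
    glGetIntegerv(GL_SPARSE_BUFFER_PAGE_SIZE_ARB, &page_size);

    GLuint buf;
    glGenBuffers(1, &buf);
    glBindBuffer(GL_ARRAY_BUFFER, buf);

    /* Virtual reservation only; no physical backing yet. */
    glBufferStorage(GL_ARRAY_BUFFER, 256 * 1024 * 1024, NULL,
                    GL_SPARSE_STORAGE_BIT_ARB | GL_DYNAMIC_STORAGE_BIT);

    /* Commit the first page so it can actually be written and used. */
    glBufferPageCommitmentARB(GL_ARRAY_BUFFER, 0, page_size, GL_TRUE);
    return buf;
}
```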

ARB_transform_feedback_overflow_query is apparently designed to let developers choose whether or not to draw objects based on whether a transform feedback buffer overflowed. I might be wrong, but it seems like this would be useful for deciding whether or not to draw objects generated by geometry shaders.
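
If that reading is right, pairing the query with conditional rendering would look roughly like this; capture_pass() and draw_captured_geometry() are hypothetical stand-ins for the transform feedback and drawing passes:

```c
#include <GL/glew.h>

extern void capture_pass(void);            /* hypothetical TF capture pass */
extern void draw_captured_geometry(void);  /* hypothetical draw */

void draw_if_no_overflow(void)
{
    GLuint query;
    glGenQueries(1, &query);

    glBeginQuery(GL_TRANSFORM_FEEDBACK_OVERFLOW_ARB, query);
    capture_pass();
    glEndQuery(GL_TRANSFORM_FEEDBACK_OVERFLOW_ARB);

    /* GL_QUERY_WAIT_INVERTED (also promoted to core in 4.5) executes
     * the draw only when the query result is false, i.e. no overflow. */
    glBeginConditionalRender(query, GL_QUERY_WAIT_INVERTED);
    draw_captured_geometry();
    glEndConditionalRender();

    glDeleteQueries(1, &query);
}
```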

KHR_blend_equation_advanced adds new blending equations between objects. If you use Photoshop, these would be "multiply", "screen", "darken", "lighten", "difference", and so forth. On NVIDIA's side, this will be directly supported on Maxwell and Tegra K1 (and later). Fermi and Kepler will support the functionality, but the driver will perform the calculations with shaders. AMD has yet to comment, as far as I can tell.
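
A small sketch of how the new equations are used; draw_layer_a() and draw_layer_b() are hypothetical draw calls, and the barrier is only needed with the non-coherent form of the extension:

```c
#include <GL/glew.h>

extern void draw_layer_a(void);  /* hypothetical draws */
extern void draw_layer_b(void);

void composite_multiply(void)
{
    glEnable(GL_BLEND);
    glBlendEquation(GL_MULTIPLY_KHR);  /* Photoshop-style "multiply" */

    draw_layer_a();
    /* Without the _coherent variant of the extension, a barrier is
     * required between draws whose primitives overlap. */
    glBlendBarrierKHR();
    draw_layer_b();
}
```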

Image from NVIDIA GTC Presentation

For developers, NVIDIA has launched 340.65 (340.23.01 for Linux) beta drivers with OpenGL 4.5 support. If you are not looking to create OpenGL 4.5 applications, do not get this driver; you really should not have any use for it at all.

Next Generation OpenGL Initiative Announced

The Khronos Group has also announced "a call for participation" to outline a new specification for graphics and compute. They want it to give developers explicit control over CPU and GPU tasks, be multithreaded, have minimal overhead, use a common shader language, and undergo "rigorous conformance testing". This sounds a lot like the design goals of Mantle (and what we know of DirectX 12).

And really, from what I hear and understand, that is what OpenGL needs at this point. Graphics cards look nothing like they did a decade ago (or over two decades ago). Modern GPUs from different vendors present very similar interfaces and data structures, even if their fundamental architectures vary greatly. If we can draw a line in the sand, legacy APIs can be supported but not optimized heavily by the drivers. After a short time, available performance for legacy applications would be so high that it wouldn't matter, as long as they continue to run.

On top of that, next-generation drivers should be significantly easier to develop, considering the reduced error checking (and other responsibilities). As I said in Intel's DirectX 12 story, it is still unclear whether the performance gains will be large enough to make most optimizations unnecessary, particularly those that add workload or developer effort in exchange for queuing fewer GPU commands. We will need to wait for game developers to use it for a bit before we know.

Podcast #313 - New Kaveri APUs, ASUS ROG Swift G-Sync Monitor, Intel Core M Processors and more!

Subject: General Tech | August 14, 2014 - 03:30 PM |
Tagged: video, ssd, ROG Swift, ROG, podcast, ocz, nvidia, Kaveri, Intel, g-sync, FMS 2014, crossblade ranger, core m, Broadwell, asus, ARC 100, amd, A6-7400K, A10-7800, 14nm

PC Perspective Podcast #313 - 08/14/2014

Join us this week as we discuss new Kaveri APUs, ASUS ROG Swift G-Sync Monitor, Intel Core M Processors and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Program length: 1:41:24

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Select GeForce GTX GPUs Now Include Borderlands: The Pre-Sequel

Subject: General Tech | August 13, 2014 - 12:26 PM |
Tagged: borderlands, nvidia, geforce

Santa Clara, CA — August 12, 2014 — Get ready to shoot ‘n’ loot your way through Pandora’s moon. Starting today, gamers who purchase select NVIDIA GeForce GTX TITAN, 780 Ti, 780, and 770 desktop GPUs will receive a free copy of Borderlands: The Pre-Sequel, the hotly anticipated new chapter to the multi-award winning Borderlands franchise from 2K and Gearbox Software.

Discover the story behind Borderlands 2’s villain, Handsome Jack, and his rise to power. Taking place between the original Borderlands and Borderlands 2, Borderlands: The Pre-Sequel offers players a whole lotta new gameplay in low gravity.

“If you have a high-end NVIDIA GPU, Borderlands: The Pre-Sequel will offer higher fidelity and higher performance hardware-driven special effects including awesome weapon impacts, moon-shatteringly cool cryo explosions and ice particles, and cloth and fluid simulation that blows me away every time I see it," said Randy Pitchford, CEO and president of Gearbox Software.

With NVIDIA PhysX technology, you will feel deep space like never before. Get high in low gravity and use new ice and laser weapons to experience destructible levels of mayhem. Check out the latest trailer, which just went live this morning: http://youtu.be/c9a4wr4I1hk

Borderlands: The Pre-Sequel will also stream to your NVIDIA SHIELD tablet or portable. For the first time ever, you can play as Claptrap anywhere by using NVIDIA GameStream technology. You can even livestream and record every fist punch with GeForce ShadowPlay.

Borderlands: The Pre-Sequel will be available on October 14, 2014 in North America and on October 17, 2014 internationally. Borderlands: The Pre-Sequel is not yet rated by the ESRB.

The GeForce GTX and Borderlands: The Pre-Sequel bundle is available starting today from leading e-tailers including Amazon, NCIX, Newegg, and Tiger Direct and system builders including Canada Computers, Digital Storm, Falcon Northwest, Maingear, Memory Express, Origin PC, V3 Gaming, and Velocity Micro. For a full list of participating partners, please visit: www.GeForce.com/GetBorderlands.

Source: NVIDIA

G-SYNC is sweet but far from free

Subject: Displays | August 12, 2014 - 03:36 PM |
Tagged: asus, g-sync, geforce, gsync, nvidia, pg278q, Republic of Gamers, ROG, swift, video

Ryan was not the only one to test the ASUS ROG Swift PG278Q G-Sync monitor; Overclockers Club also received a model to test out. Their impressions of the 27" 2560 x 1440 TN panel were very similar: once they saw this monitor in action, going back to their 30-inch 60Hz IPS monitor was not as enjoyable as it once was. The only bad thing they could say about the display was the MSRP; $800 is steep for any monitor and makes it rather difficult to even consider getting two or more of them for a multiple display system.

”When you get down to it, the facts are that even with a TN panel being used for the high refresh rate, the ASUS ROG Swift PG278Q G-Sync monitor delivers great picture quality and truly impressive gaming. I could go on all day long about how smooth each of the games played while testing this monitor, but ultimately not be able to show you without having you sit at the desk with me. No stuttering, no tearing, no lag; it's like getting that new car and having all the sales hype end up being right on the money. When I flip back and forth between my 60Hz monitor and the PC278Q, its like a night and day experience.”

Here are some more Display articles from around the web:

NVIDIA Reveals 64-bit Denver CPU Core Details, Headed to New Tegra K1 Powered Devices Later This Year

Subject: Processors | August 12, 2014 - 01:06 AM |
Tagged: tegra k1, project denver, nvidia, Denver, ARMv8, arm, Android, 64-bit

During GTC 2014, NVIDIA launched the Tegra K1, a new mobile SoC that contains a powerful Kepler-based GPU. Initial processors (and the resultant design wins such as the Acer Chromebook 13 and Xiaomi Mi Pad) utilized four ARM Cortex-A15 cores for the CPU side of things, but later this year NVIDIA is deploying a variant of the Tegra K1 SoC that swaps out the four A15 cores for two custom (NVIDIA-developed) Denver CPU cores.

Today at the Hot Chips conference, NVIDIA revealed most of the juicy details on those new custom cores, announced back in January, which will be used in devices later this year.

The custom 64-bit Denver CPU cores use a 7-way superscalar design and run a custom instruction set. Denver is a wide but in-order architecture that allows up to seven operations per clock cycle. NVIDIA uses on-the-fly binary translation to convert ARMv8 instructions to its custom microcode before execution. A software layer and a 128MB cache power the Dynamic Code Optimization technology, which lets the processor examine and optimize ARM code, convert it to the custom instruction set, and store the translated microcode of frequently used routines for reuse (the optimization cache can be bypassed for infrequently executed code). Using the wider execution engine and Dynamic Code Optimization (which is transparent to ARM developers and does not require updated applications), NVIDIA touts the dual Denver core Tegra K1 as being at least as powerful as the quad- and octo-core packing competition.
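
To make the flow more concrete, here is a purely conceptual, toy sketch of a profile-translate-cache loop in C. The real Denver mechanism lives in NVIDIA's firmware and hardware and is not public; the threshold, cache layout, and helper functions below are all invented for illustration:

```c
/* Toy illustration of a profile-translate-cache loop, NOT NVIDIA's
 * actual Dynamic Code Optimization implementation. The threshold,
 * cache layout, and helpers are invented for illustration only. */
#include <inttypes.h>
#include <stdio.h>

#define HOT_THRESHOLD 50   /* invented: executions before a block is "hot" */
#define CACHE_SLOTS   4096

typedef struct {
    uint64_t arm_pc;       /* guest ARMv8 entry point */
    int      translated;   /* optimized microcode cached? */
    uint32_t exec_count;   /* profile counter */
} block_t;

static block_t cache[CACHE_SLOTS];

/* Stand-ins: run cached microcode vs. decode the ARM code directly. */
static void run_native(uint64_t pc)           { printf("microcode  %#" PRIx64 "\n", pc); }
static void execute_arm_directly(uint64_t pc) { printf("arm decode %#" PRIx64 "\n", pc); }

static void execute_block(uint64_t arm_pc)
{
    block_t *b = &cache[(arm_pc >> 2) % CACHE_SLOTS];

    if (b->arm_pc != arm_pc) {        /* direct-mapped eviction */
        b->arm_pc = arm_pc;
        b->translated = 0;
        b->exec_count = 0;
    }

    if (b->translated) {              /* hot path: reuse cached microcode */
        run_native(arm_pc);
        return;
    }

    execute_arm_directly(arm_pc);     /* cold path: run the code as-is */
    if (++b->exec_count >= HOT_THRESHOLD)
        b->translated = 1;            /* hot enough: translate and cache */
}

int main(void)
{
    for (int i = 0; i < 60; i++)      /* the same block becomes hot over time */
        execute_block(0x400000);
    return 0;
}
```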

Further, NVIDIA claims that at peak throughput (and in specific situations where application code and DCO can take full advantage of the 7-way execution engine), the Denver-based mobile SoC handily outpaces Intel’s Bay Trail, Apple’s A7 Cyclone, and Qualcomm’s Krait 400 CPU cores. In the results of a synthetic benchmark test provided to The Tech Report, the Denver cores were even challenging Intel’s Haswell-based Celeron 2955U processor. Keeping in mind that these are NVIDIA-provided numbers and likely the best results one can expect, Denver still looks quite a bit more capable than existing cores. (Note that the Haswell chips would likely pull much farther ahead when presented with applications that cannot be easily executed in-order with limited instruction parallelism.)

NVIDIA is ratcheting up mobile CPU performance with its Denver cores, but it is also aiming for an efficient chip and has implemented several power saving tweaks. Beyond the decision to go with an in-order execution engine (with DCO hopefully mostly making up for that), the beefy Denver cores reportedly feature low latency power state transitions (e.g. between active and idle states), power gating, dynamic voltage, and dynamic clock scaling. The company claims that “Denver's performance will rival some mainstream PC-class CPUs at significantly reduced power consumption.” In real terms this should mean that the two Denver cores in place of the quad core A15 design in the Tegra K1 should not result in significantly lower battery life. The two K1 variants are said to be pin compatible such that OEMs and developers can easily bring upgraded models to market with the faster Denver cores.

For those curious: in the Tegra K1, the two Denver cores (clocked at up to 2.5GHz) share a 16-way L2 cache, and each has 128KB instruction and 64KB data L1 caches to itself. The 128MB Dynamic Code Optimization cache is held in system memory.

Denver is the first (custom) 64-bit ARM processor for Android (Apple’s A7 was the first 64-bit smartphone chip), and NVIDIA is working on support for the next-generation Android OS known as Android L.

The dual Denver core Tegra K1 is coming later this year and I am excited to see how it performs. The current K1 chip already has a powerful fully CUDA compliant Kepler-based GPU which has enabled awesome projects such as computer vision and even prototype self-driving cars. With the new Kepler GPU and Denver CPU pairing, I’m looking forward to seeing how NVIDIA’s latest chip is put to work and the kinds of devices it enables.

Are you excited for the new Tegra K1 SoC with NVIDIA’s first fully custom cores?

Source: NVIDIA
Manufacturer: ASUS

The Waiting Game

NVIDIA G-Sync was announced at a media event held in Montreal way back in October of 2013, and it promised to revolutionize the way the display and graphics card work together to present images on the screen. It was designed to remove hitching, stutter, and tearing almost completely. Since that fateful day, we have been waiting. Patiently waiting. We were waiting for NVIDIA and its partners to actually release a monitor that utilizes the technology and that can, you know, be purchased.

In December of 2013 we took a look at the ASUS VG248QE monitor, the display for which NVIDIA released a mod kit allowing users who already owned it to upgrade to G-Sync compatibility. It worked, and I even came away impressed. I noted in my conclusion that, “there isn't a single doubt that I want a G-Sync monitor on my desk” and, “my short time with the NVIDIA G-Sync prototype display has been truly impressive…”. That was nearly 7 months ago, and I don’t think anyone at that time really believed it would be THIS LONG before the real monitors began to show up in the hands of gamers around the world.

Since NVIDIA’s October announcement, AMD has been on a marketing path with a technology they call “FreeSync” that claims to be a cheaper, standards-based alternative to NVIDIA G-Sync. They first previewed the idea of FreeSync on a notebook device during CES in January and then showed off a prototype monitor in June during Computex. Even more recently, AMD has posted a public FAQ that gives more details on the FreeSync technology and how it differs from NVIDIA’s creation; it has raised something of a stir with its claims on performance and cost advantages.

That doesn’t change the product that we are reviewing today of course. The ASUS ROG Swift PG278Q 27-in WQHD display with a 144 Hz refresh rate is truly an awesome monitor. What did change is the landscape, from NVIDIA's original announcement until now.

Continue reading our review of the ASUS ROG Swift PG278Q 2560x1440 G-Sync Monitor!!

Acer Unveils Chromebook 13 Powered By NVIDIA Tegra K1 SoC

Subject: General Tech, Mobile | August 11, 2014 - 08:00 AM |
Tagged: webgl, tegra k1, nvidia, geforce, Chromebook, Bay Trail, acer

Today Acer unveiled a new Chromebook powered by an NVIDIA Tegra K1 processor. The aptly-named Chromebook 13 is a 13-inch thin-and-light notebook running Google’s Chrome OS, with up to 13 hours of battery life and three times the graphical performance of existing Chromebooks using Intel Bay Trail and Samsung Exynos processors.

The Chromebook 13 is 18mm thick and comes in a white plastic fanless chassis that hosts a 13.3” display, full-size keyboard, trackpad, and HD webcam. The Chromebook 13 will be available with a 1366x768 or 1920x1080 resolution panel depending on the particular model (more on that below).

Beyond the usual laptop fixtures, external I/O includes two USB 3.0 ports, HDMI video output, an SD card reader, and a combo headphone/mic jack. Acer has placed one USB port on the left side along with the card reader, and one USB port next to the HDMI port on the rear of the laptop. Personally, I welcome the HDMI port placement, as it means connecting a second display will not result in a cable invading the mousing area should I wish to use a mouse (and it’s even southpaw-friendly, Scott!).

The Chromebook 13 looks decent from the outside, but it is the internals where the device gets really interesting. Instead of going with an Intel Bay Trail (or even Celeron/Core i3), Acer has opted to team up with NVIDIA to deliver the world’s first NVIDIA-powered Chromebook.

Specifically, the Chromebook 13 uses a NVIDIA Tegra K1 SoC, up to 4GB RAM, and up to 32GB of flash storage. The K1 offers up four A15 CPU cores clocked at 2.1GHz, and a graphics unit with 192 Kepler-based CUDA cores. Acer rates the Chromebook 13 at 11 hours with the 1080p panel or 13 hours when equipped with the 1366x768 resolution display. Even being conservative, the Chromebook 13 looks to be the new leader in Chromebook battery life (with the previous leader claiming 11 hours).

A graph comparing WebGL performance between the NVIDIA Tegra K1, Intel (Bay Trail) Celeron N2830, Samsung Exynos 5800, and Samsung Exynos 5250. Results courtesy NVIDIA.

The Tegra K1 is a powerful little chip, and it is nice to see NVIDIA get a design win here. NVIDIA claims that the Tegra K1, which is rated at 326 GFLOPS of compute performance, offers up to three times the graphics performance of the Bay Trail N2830 and Exynos 5800 SoCs. Additionally, the K1 reportedly uses slightly less power and delivers higher multi-tasking performance. I’m looking forward to seeing independent reviews of this laptop form factor and hoping that the chip lives up to its promises.

The Chromebook 13 is currently up for pre-order and will be available in September starting at $279. The Tegra K1-powered laptop will hit the United States and Europe first, with other countries to follow. Initially, the Europe roll-out will include “UK, Netherlands, Belgium, Denmark, Sweden, Finland, Norway, France, Germany, Russia, Italy, Spain, South Africa and Switzerland.”

Acer is offering three consumer SKUs and one education SKU that will be offered exclusively through a reseller. Please see the chart below for specifications and pricing.

Acer Chromebook 13 Models:

Model                             System Memory (RAM)   Storage (flash)   Display       MSRP
CB5-311-T9B0                      2GB                   16GB              1920 x 1080   $299.99
CB5-311-T1UU                      4GB                   32GB              1920 x 1080   $379.99
CB5-311-T7NN (Base Model)         2GB                   16GB              1366 x 768    $279.99
Educational SKU (Reseller Only)   4GB                   16GB              1366 x 768    $329.99

Intel made some waves in the Chromebook market earlier this year with the announcement of several new Intel-powered Chrome devices and the addition of conflict-free Haswell Core i3 options. It seems that it is now time for the ARM(ed) response. I’m interested to see how NVIDIA’s newest model chip stacks up to the current and upcoming Intel x86 competition in terms of graphics power and battery usage.

As far as Chromebooks go, if the performance is where Acer and NVIDIA claim, this one definitely looks like a decent option considering the price. I think a head-to-head between the ASUS C200 (Bay Trail N2830, 2GB RAM, 16GB eMMC, and 1366x768 display at $249.99 MSRP) and the Acer Chromebook 13 would be interesting, as the real differentiator (beyond aesthetics) is the underlying SoC. I do wish there was a 4GB/16GB/1080p option in the Chromebook 13 lineup at, say, $320 MSRP, though, considering the big price jump to 4GB of RAM (mostly a result of the doubling of flash) in the $379.99 model.

Read more about Chromebooks at PC Perspective!

Source: Acer