Subject: Mobile
Manufacturer: MSI

Introduction and Design

P4264361.jpg

It was only last year that we were singing the praises of the GT60, which was one of the fastest notebooks we’d seen to date. Its larger cousin, the GT70, features a 17.3” screen (versus the GT60’s 15.6”), faster CPUs and GPUs, and even better options for storage. Now, the latest iteration of this force to be reckoned with has arrived on our desks, and while its appearance hasn’t changed much, its performance is even better than ever.

While we’ll naturally be spending a good deal of time discussing performance and stability in our article here, we won’t be dedicating much to casing and general design, as, for the most part, it is very similar to that of the GT60. On the other hand, one area on which we’ll be focusing particularly heavily is battery life, thanks solely to the presence of NVIDIA’s new Battery Boost technology. As the name suggests, this new feature employs power conservation techniques to extend the notebook’s battery life while gaming unplugged. This is accomplished primarily via frame rate limiting, a feature that has actually been available since the introduction of Kepler but which until now has been buried within the advanced options for such products. Battery Boost brings it to the forefront and makes it both accessible and enabled by default.
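To illustrate the basic principle behind frame rate limiting, here is a minimal, purely hypothetical sketch in Python. Battery Boost itself lives in the driver and does considerably more (including managing GPU clocks and voltages), so this is only meant to show why capping the frame rate saves power; the function name and parameters are invented for this example.

```python
import time

def run_frame_limited(render_frame, fps_cap=30):
    """Cap the frame rate by sleeping away leftover frame time.

    Conceptual illustration only; this does not correspond to any
    NVIDIA API and is not how Battery Boost is implemented.
    """
    target_frame_time = 1.0 / fps_cap
    while True:
        start = time.perf_counter()
        render_frame()  # the game does its work for one frame
        elapsed = time.perf_counter() - start
        if elapsed < target_frame_time:
            # The GPU sits idle (and draws less power) during this sleep
            # instead of racing ahead to render frames the display
            # doesn't need while the notebook is on battery.
            time.sleep(target_frame_time - elapsed)
```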

Let’s take a look at what this bad boy is packing:

specs.png

Not much commentary needed here; this table reads like a who’s who of computer specifications. Of particular note are the 32 GB of RAM, the 880M (of course), and the 384 GB SSD RAID array (!!). Elsewhere, it’s mostly business as usual for the ultra-high-end MSI GT notebooks, with a slightly faster CPU than the previous model we reviewed (which carried an i7-4700MQ). One thing is guaranteed: it’s a fast machine.

P4264350.jpg

Continue reading our review of the MSI GT70 2PE Gaming Notebook!!

GeForce GTX Titan Z Overclocking Testing

Subject: Graphics Cards | June 12, 2014 - 06:17 PM |
Tagged: overclocking, nvidia, gtx titan z, geforce

Earlier this week I posted a review of the NVIDIA GeForce GTX Titan Z graphics card, a dual-GPU Kepler GK110 part that currently sells for $3000. If you missed that article, you should read it first to catch up, but the basic summary was that, for PC gamers, it's slower than, and twice the price of, AMD's Radeon R9 295X2.

In that article, though, I mentioned that the Titan Z had more variable clock speeds than any other GeForce card I had tested. At the time I didn't go any further than that, since the card's performance already pointed out the deficit it had going up against the R9 295X2. However, several readers asked me to dive into overclocking with the Titan Z, and with that came the need to show clock speed changes.

My overclocking was done through EVGA's PrecisionX software, and we measured clock speeds with GPU-Z. The first step in overclocking an NVIDIA GPU is to simply move up the Power Target sliders and see what happens. This tells the card that it is allowed to consume more power than it normally would, and then, thanks to GPU Boost technology, the clock speed should scale up naturally.
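For readers who want to reproduce this kind of clock logging without GPU-Z, here is a rough sketch using NVIDIA's NVML bindings for Python (pynvml). This is our own illustrative example, not the tooling used for the graphs in this article, and the function name and parameters are made up for the sketch.

```python
import time
import pynvml

def log_gpu_clocks(duration_s=1500, interval_s=1.0):
    """Sample the graphics clock of GPU 0 once per interval, GPU-Z style."""
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU (one of the Titan Z's two GK110s)
    samples = []
    start = time.time()
    while time.time() - start < duration_s:
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        samples.append(clock_mhz)
        time.sleep(interval_s)
    pynvml.nvmlShutdown()
    # Average clock over the run, comparable to the bar graph later in the article
    print(f"Average: {sum(samples) / len(samples):.0f} MHz over {len(samples)} samples")
    return samples
```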

titanzoc.jpg


And that is exactly what happened. I ran through 30 minutes of looped testing with Metro: Last Light at stock settings, with the Power Target at 112%, with the Power Target at 120% (the maximum setting) and then again with the Power Target at 120% and the GPU clock offset set to +75 MHz. 

That 75 MHz offset was the highest setting we could get to run stable on the Titan Z, which brings the base clock up to 781 MHz and the boost clock to 951 MHz. As you'll see in the frequency graphs below, though, the card was still reaching well above that.

clockspeedtitanz.png


This graph shows the clock rates of the GK110 GPUs on the Titan Z over the course of 25 minutes of looped Metro: Last Light gaming. The green line is the stock performance of the card without any changes to the power settings or clock speeds. While it starts out well enough, hitting clock rates of around 1000 MHz, it quickly dives, and by 300 seconds of gaming the card is often running at or under the 800 MHz mark. That pattern is consistent throughout the entire tested period, resulting in an average clock speed of 894 MHz.

Next up is the blue line, generated by simply moving the power target from 100% to 112%, giving the GPUs a little more power headroom to play with. The results are impressive, with a much more consistent clock speed. The yellow line, for the power target at 120%, is even better, with a tighter band of clock rates and a higher average clock.

Finally, the red line represents the 120% power target with a +75 MHz offset in PrecisionX. There we see a clock speed consistency matching the yellow line but offset up a bit, as we have been taught to expect with NVIDIA's recent GPUs. 

clockspeedtitan-avg.png


All of this data comes together in the bar graph here, which lists the average clock rates over the entire 25-minute test runs. At stock settings, the Titan Z averaged 894 MHz, just over NVIDIA's advertised "typical" boost clock of 876 MHz. That's good news for NVIDIA! Even though there is a lot more clock speed variance than I would like to see with the Titan Z, the clock speeds are within the expectations NVIDIA set out of the gate.

Bumping up the power target, though, will help out gamers who do invest in the Titan Z quite a bit. Just going to 112% results in an average clock speed of 993 MHz, a roughly 100 MHz jump worth about 11% overall. When we push the power target up even further and add the frequency offset, we get an average clock rate of 1074 MHz, 20% faster than the stock settings. This does mean that our Titan Z is pulling more power and generating more noise (quite a bit more, actually), with fan speeds going from around 2000 to 2700 RPM.
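For anyone checking the math, the percentage gains quoted above fall straight out of the average clocks:

```python
# Average clocks (MHz) over the 25-minute runs, from the bar graph above
stock = 894
pt_112 = 993
pt_120_offset = 1074

print(f"112% power target: +{pt_112 - stock} MHz ({(pt_112 - stock) / stock:.1%})")                    # ~11.1%
print(f"120% + 75 MHz offset: +{pt_120_offset - stock} MHz ({(pt_120_offset - stock) / stock:.1%})")  # ~20.1%
```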

MetroLL_2560x1440_OFPS.png

MetroLL_2560x1440_PER.png

MetroLL_3840x2160_OFPS.png

MetroLL_3840x2160_PER.png

At both 2560x1440 and 3840x2160, in the Metro: Last Light benchmark we ran, the added performance puts the Titan Z at the same level as the Radeon R9 295X2. Of course, it goes without saying that we could also overclock the 295X2 a bit further to improve ITS performance, but this is an exercise in education.

IMG_0270.JPG

Does it change my stance or recommendation for the Titan Z? Not really; I still think it is overpriced compared to the performance you get from AMD's offerings and from NVIDIA's own lower priced GTX cards. However, it does lead me to believe that the Titan Z could have offered performance at least on par with the R9 295X2 had NVIDIA been willing to break PCIe power specs and accept the increased noise.

UPDATE (6/13/14): Some of our readers seem to be pretty confused about things so I felt the need to post an update to the main story here. One commenter below mentioned that I was one of "many reviewers that pounded the R290X for the 'throttling issue' on reference coolers" and thinks I am going easy on NVIDIA with this story. However, there is one major difference that he seems to overlook: the NVIDIA results here are well within the rated specs. 

When I published one of our stories looking at clock speed variance of the Hawaii GPU in the form of the R9 290X and R9 290, our results showed that the clock speeds of these cards were dropping well below the rated clock speed of 1000 MHz. Instead, I saw clock speeds that reached as low as 747 MHz and stayed near the 800 MHz mark. The problem with that was in how AMD advertised and sold the cards, using only the phrase "up to 1.0 GHz" in its marketing. I recommended that AMD begin selling the cards with a rated base clock and a typical boost clock instead of only labeling them with the, at the time, totally incomplete "up to" rating. In fact, here is the exact quote from that story: "AMD needs to define a "base" clock and a "typical" clock that users can expect." Ta da.

The GeForce GTX Titan Z though, as we look at the results above, is rated and advertised with a base clock of 705 MHz and a boost clock of 876 MHz. The clock speed comparison graph at the top of the story shows the green line (the card at stock) never dropping below that 705 MHz base clock while averaging 894 MHz. That average is ABOVE the rated boost clock of the card. So even though the GPU is changing between frequencies more often than I would like, the clock speeds are within the bounds set by NVIDIA. That was clearly NOT THE CASE when AMD launched the R9 290X and R9 290. If NVIDIA had sold the Titan Z with only a specification of "up to 1006 MHz" or something like it, then the same complaint would be made. But it is not.

The card isn't "throttling" at all, in fact, as one commenter below points out; that term insinuates that the card is dropping below a rated performance level. It is acting in accordance with the GPU Boost technology that NVIDIA designed.

Some users seem concerned about temperature: the Titan Z will hit 80-83C in my testing, both stock and overclocked, and simply scales the fan speed to compensate accordingly. Yes, overclocked, the Titan Z gets quite a bit louder but I don't have sound level tests to show that. It's louder than the R9 295X2 for sure but definitely not as loud as the R9 290 in its original, reference state.

Finally, some of you seem concerned that I was restricted by NVIDIA on what we could test and talk about with the Titan Z. Surprise, surprise: NVIDIA didn't send us this card to test at all! In fact, they were kind of miffed when I did the whole review and didn't get into showing CUDA benchmarks. So, there's that.

Podcast #304 - GeForce GTX TITAN Z, Core i7-4790K, Gigabyte Z97X-SOC Force and more!

Subject: Editorial | June 12, 2014 - 02:28 PM |
Tagged: Z97X-SOC Force, video, titan z, radeon, project tango, podcast, plextor, nvidia, Lightning, gtx titan z, gigabyte, geforce, E3 14, amd, 4790k, 290x

PC Perspective Podcast #304 - 06/12/2014

We have lots of reviews to talk about this week including the GeForce GTX TITAN Z, Core i7-4790K, Gigabyte Z97X-SOC Force, E3 News and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Josh Walrath, Jeremy Hellstrom and Allyn Maleventano

Program length: 1:11:36

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Author:
Manufacturer: NVIDIA

A powerful architecture

In March of this year, NVIDIA announced the GeForce GTX Titan Z at its GPU Technology Conference. It was touted as the world's fastest graphics card with its pair of full GK110 GPUs but it came with an equally stunning price of $2999. NVIDIA claimed it would be available by the end of April for gamers and CUDA developers to purchase but it was pushed back slightly and released at the very end of May, going on sale for the promised price of $2999.

The specifications of GTX Titan Z are damned impressive - 5,760 CUDA cores, 12GB of total graphics memory, 8.1 TFLOPs of peak compute performance. But something happened between the announcement and product release that perhaps NVIDIA hadn't accounted for. AMD's Radeon R9 295X2, a dual-GPU card with full-speed Hawaii chips on-board, was released at $1499. I think it's fair to say that AMD took some chances that NVIDIA was surprised to see them take, including going the route of a self-contained water cooler and blowing past the PCI Express recommended power limits to offer a ~500 watt graphics card. The R9 295X2 was damned fast and I think it caught NVIDIA a bit off-guard.
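That 8.1 TFLOPs figure follows directly from the core count and the base clock using the standard peak-throughput formula (cores x 2 FLOPs per cycle, for one fused multiply-add, x clock). A quick sanity check:

```python
# Back-of-the-envelope check of the quoted 8.1 TFLOPs peak single-precision figure
cuda_cores = 5760        # both GK110 GPUs combined
base_clock_ghz = 0.705   # Titan Z base clock (705 MHz)

peak_tflops = cuda_cores * 2 * base_clock_ghz / 1000  # 2 FLOPs per core per cycle (FMA)
print(f"{peak_tflops:.2f} TFLOPs")  # ~8.12 TFLOPs at the base clock
```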

As a result, the GeForce GTX Titan Z release was a bit quieter than most of us expected. Yes, the Titan Black card was released without sampling the gaming media, but that card was nearly a mirror of the GeForce GTX 780 Ti, just with a larger frame buffer, and the performance of that GPU was well known. For NVIDIA to release a flagship dual-GPU graphics card, admittedly the most expensive one I have ever seen with the GeForce brand on it, and NOT send out samples was telling.

NVIDIA is adamant, though, that the primary target of the Titan Z is not just gamers but the CUDA developer that needs the most performance possible in as small a space as possible. For that specific user, one that doesn't quite have the income to invest in a lot of Tesla hardware but wants to be able to develop and use CUDA applications with a significant amount of horsepower, the Titan Z fits the bill perfectly.

Still, the company was touting the Titan Z as "offering supercomputer class performance to enthusiast gamers" and telling gamers in launch videos that the Titan Z is the "fastest graphics card ever built" and that it was "built for gamers." So, interest piqued, we decided to review the GeForce GTX Titan Z.

The GeForce GTX TITAN Z Graphics Card

Cost and performance notwithstanding, the GeForce GTX Titan Z is an absolutely stunning looking graphics card. The industrial design that started with the GeForce GTX 690 (the last dual-GPU card NVIDIA released) and continued with the GTX 780 and Titan family lives on with the Titan Z.

IMG_0270.JPG

The all-metal finish looks good and stands up to abuse, keeping the PCB straight even with the heft of the heatsink. There is only a single fan on the Titan Z, center mounted, with a large heatsink covering both GPUs, one on each side of the fan. The GeForce logo up top illuminates, as we have seen on all similar designs, which adds a nice touch.

Continue reading our review of the NVIDIA GeForce GTX Titan Z 12GB Graphics Card!!

E3 2014: NVIDIA SHIELD Tablet Will Exist?

Subject: General Tech, Mobile, Shows and Expos | June 9, 2014 - 02:10 PM |
Tagged: shield tablet, shield, nvidia, E3 14, E3

The Tech Report had their screenshot-fu tested today with the brief lifespan of NVIDIA's SHIELD Tablet product page. As you can see, it is fairly empty. We know that it will have at least one bullet point of "Features" and that its name will be "SHIELD Tablet".

nvidia-shield-tablet-il.jpg

Image Credit: The Tech Report

Of course, being the first day of E3, it is easy to expect that such a device will be announced in the next couple of days. This is expected to be based on the Tegra K1 with 2GB of RAM and have a 2048x1536 touch display.

It does raise the question of what exactly a "SHIELD" is, however. Apart from being a first-party device, how would it be any different from other TegraZone devices? We know that Half-Life 2 and Portal have been ported to the SHIELD product line exclusively and will not be available on other Tegra-powered devices. Now that the SHIELD line is extending to tablets, I wonder how NVIDIA will handle this seemingly two-tier class of products (SHIELD vs. Tegra OEM devices). It might even depend on how many design wins they achieve, along with their overall mobile market share.

Source: Tech Report

Google's Project Tango Announced, Uses NVIDIA Tegra K1

Subject: General Tech, Mobile | June 5, 2014 - 02:51 PM |
Tagged: tegra k1, tegra, project tango, nvidia, google, Android

Today, Google announced their "Project Tango" developer kit for tablets with spatial awareness. With a price tag of $1,024 USD, it is definitely aimed at developers. In fact, the form to be notified about the development kit has a required check box that is labeled, "I am a developer". Slightly above the form is another statement, "These development kits are not a consumer device and will be available in limited quantities".

So yes, you can only buy these if you are a developer.

The technology is the unique part. Project Tango is aimed at developers making apps that understand the 3D world around the tablet. Two example categories they have already experimented with are robotics and computer vision. Of course, this could also translate to alternate reality games and mapping.

While Google has not been too friendly with OpenCL in its Android platform, it makes sense that they would choose a flexible GPU with a wide (and deep) range of API support. While other SoCs are probably capable enough, the Kepler architecture in the Tegra K1 is about as feature-complete as you can get in a mobile chip, because it is basically a desktop chip.

google-project-tango.jpg

Google's Project Tango is available to developers, exclusively, for $1,024 and ships later this month.

Also, that price is clearly a pun.

Source: Google

NVIDIA Launches GeForce Experience 2.1

Subject: General Tech, Graphics Cards | June 2, 2014 - 05:52 PM |
Tagged: nvidia, geforce, geforce experience, ShadowPlay

NVIDIA has just launched another version of their GeForce Experience, incrementing the version to 2.1. This release allows video capture of up to "2500x1600", which I assume means 2560x1600, as well as better audio-video synchronization in Adobe Premiere. And, because why stop going after FRAPS once you start, it also adds an in-game framerate indicator along with push-to-talk for recording the microphone.

nvidia-geforce-experience.png

Another note: when GeForce Experience 2.0 launched, it introduced streaming of the user's desktop. This allowed recording of OpenGL and windowed-mode games by simply capturing an entire monitor. That mode was not capable of "Shadow Mode", which I believed was because NVIDIA assumed users didn't want a constant rolling recording of their desktop kept around just in case they wanted to save a few minutes of it at some point. Turns out I was wrong; the feature was coming, and it arrived with GeForce Experience 2.1.

GeForce Experience 2.1 is now available at NVIDIA's website, unless it already popped up a notification for you.

Source: NVIDIA

AMD and NVIDIA get into a hairy argument

Subject: General Tech | May 29, 2014 - 07:43 PM |
Tagged: nvidia, gameworks, dirty pool, business as usual, amd

The topic of NVIDIA GameWorks was discussed at great length on last night's PCPer Podcast, and judging from the live comments as well as the comments on Ryan's original story, this is obviously a topic which draws strong opinions.  As it is always best to limit yourself to debating topics on which you are familiar with the facts, The Tech Report's article on the aftereffects of the Forbes story is well worth a read.  Cyril had a chance to speak with a rep from NVIDIA's driver development team about Hallock's comments pertaining to NVIDIA's GameWorks and the legitimacy of AMD's complaints.  As you might expect, there is a lot of denial and finger pointing from both sides; what long-time enthusiasts might describe as 'business as usual'.  Both sides of this argument have vehemently denied ever attempting to undermine each other's business, yet both can point to specific instances in which the competition has used questionable methods to get a leg (or hair) up.

6a00d8341c795b53ef010535c10aa8970b.jpg

"Earlier today, I spoke with Cem Cebenoyan, Director of Engineering for Developer Technology at Nvidia, who offered a rebuttal to a Forbes story we covered yesterday. In that story, AMD's Robert Hallock alleged that Nvidia's GameWorks program prevents AMD from working with game developers on GPU optimizations."

Here is some more Tech News from around the web:

Tech Talk

Podcast #302 - ASUS PB287Q 4K Monitor, NVIDIA and AMD's fight over GameWorks, Haswell-E Leaks and more!

Subject: General Tech | May 29, 2014 - 02:51 PM |
Tagged: video, podcast, asus, 4k, pb287q, nvidia, amd, gameworks, ubisoft, watch dogs, crucial, mx100, tegra k1, gsync

PC Perspective Podcast #302 - 05/29/2014

Join us this week as we discuss the ASUS PB287Q 4K Monitor, NVIDIA and AMD's fight over GameWorks, Haswell-E Leaks and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Josh Walrath, Jeremy Hellstrom, and Allyn Maleventano

Program length: 1:29:01
  1. Week in Review:
  2. News items of interest:
  3. Hardware/Software Picks of the Week:
    1. Allyn: For Josh - the Wenger Giant Knife
  4. Closing/outro

 

Author:
Manufacturer: Various

The AMD Argument

Earlier this week, a story was posted on a Forbes.com blog that dove into the idea of NVIDIA GameWorks and how it does a disservice not just to the latest Ubisoft title, Watch_Dogs, but to PC gamers in general. Using quotes from AMD directly, the author claims that NVIDIA is actively engaging in methods to prevent game developers from optimizing games for AMD graphics hardware. This is an incredibly bold statement and one that I hope AMD is not making lightly. Here is a quote from the story:

Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favor of NVIDIA products. . . . Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization.

The example cited in the Forbes story is the recently released Watch_Dogs title, which appears to show favoritism towards NVIDIA GPUs, with the performance of the GTX 770 ($369) coming close to that of a Radeon R9 290X ($549).

It's evident that Watch Dogs is optimized for Nvidia hardware but it's staggering just how un-optimized it is on AMD hardware.

watch_dogs_ss9_99866.jpg

Watch_Dogs is the latest GameWorks title released this week.

I decided to get in touch with AMD directly to see exactly what stance the company was attempting to take with these kinds of claims. No surprise, AMD was just as forward with me as they appeared to be in the Forbes story originally.

The AMD Stance

Central to AMD’s latest annoyance with the competition is the NVIDIA GameWorks program. First unveiled last October during a press event in Montreal, GameWorks combines several NVIDIA-built engine functions into libraries that can be utilized and accessed by game developers to build advanced features into games. NVIDIA’s website claims that GameWorks is “easy to integrate into games” while also including tutorials and tools to help quickly generate content with the software set. Included in the GameWorks suite are tools like VisualFX, which offers rendering solutions such as HBAO+, TXAA, Depth of Field, FaceWorks, HairWorks and more. Physics tools include the obvious, like PhysX, while also adding clothing, destruction, particles and more.

Continue reading our editorial on the verbal battle between AMD and NVIDIA about the GameWorks program!!