Manufacturer: MSI

A New TriFrozr Cooler

Graphics cards are by far the most interesting topic we cover at PC Perspective.  Between the battles of NVIDIA and AMD and the competition among board partners like EVGA, ASUS, MSI and Galaxy, there is rarely a moment when we don't have a GPU product of some kind on an active test bed.  Both NVIDIA and AMD release reference cards (for the most part) with each new product launch, and it then takes some time for board partners to really put their own stamp on the designs.  Other than the figurative stamp that is the sticker on the fan.

IMG_9886.JPG

One of the companies that has recently become well known for very custom, non-reference graphics card designs is MSI, and the pinnacle of the company's engineering falls under the Lightning brand.  As far back as the MSI GTX 260 Lightning and as recently as the MSI HD 7970 Lightning, these cards have combined unique cooling, custom power design and a good amount of over-engineering to produce cards with few rivals.

Today we are looking at the brand new MSI GeForce GTX 780 Lightning, a complete revamp of the GTX 780 that was released in May.  Based on the same GK110 GPU as the GTX Titan, with two fewer SMX units, the GTX 780 is easily the second fastest single-GPU card on the market.  MSI is hoping to make enthusiasts even more excited about the card with the Lightning design, which brings a brand new TriFrozr cooler, an impressive power design and overclocking capabilities that both basic users and LN2 junkies can take advantage of.  Just what DO you get for $750 these days?

Continue reading our review of the MSI GeForce GTX 780 Lightning graphics card!!

New NVIDIA 326.41 Beta Graphics Drivers Add Shield PC Game Streaming Support

Subject: Graphics Cards | August 1, 2013 - 11:50 PM |
Tagged: graphics drivers, nvidia, shield, pc game streaming, gaming, geforce

NVIDIA recently released a new set of beta GeForce graphics card drivers targeted at the 400, 500, 600, and 700 series GPUs. The new version 326.41 beta drivers feature the same performance tweaks as the previous 326.19 drivers while baking in beta support for streaming PC games to NVIDIA’s Shield gaming portable from a compatible GeForce graphics card (GTX 650 or better). The new beta release is also the suggested version for those running the Windows 8.1 Preview.

NVIDIA has included the same performance tweaks as version 326.19. The tweaks offer up to 19% performance increases, depending on the particular GPU setup. For example, users running a GTX 770 will see as much as 15% better performance in Dirt: Showdown and 6% in Tomb Raider. Performance improvements are even higher for GTX 770 SLI setups, with boosts in Dirt: Showdown and F1 2012 of 19% and 11% respectively. NVIDIA has also added SLI profiles for Splinter Cell: Blacklist and Batman: Arkham Origins.

The NVIDIA Shield launched recently and reviews are making the rounds around the Internet. One of the exciting features of the Shield gaming handheld is the ability to stream PC games from a PC with an NVIDIA graphics card to the Shield over Wi-Fi.

The 326.41 drivers improve performance across several games on the GTX 770.

The other major change is improved support for tiled 4K displays: monitors that reach 4K resolution by essentially combining two separate panels, which show up to the OS as two displays even though they sit in a single physical monitor. Using DisplayPort MST with tiled displays allows monitor manufacturers to deliver 4K at higher refresh rates.

Interested GeForce users can grab the latest beta drivers from the NVIDIA website.

Source: Tech Spot
Manufacturer: Galaxy

Overclocked GTX 770 from Galaxy

When NVIDIA launched the GeForce GTX 770 at the very end of May, we started to get in some retail samples from companies like Galaxy.  While our initial review looked at the reference models, other add-in card vendors are putting their own unique touch on the latest GK104 offering and Galaxy was kind enough to send us their GeForce GTX 770 2GB GC model that uses a unique, more efficient cooler design and also runs at overclocked frequencies. 

If you haven't yet read up on the GTX 770 GPU, you should probably stop by my first review of the GTX 770 to see what information you are missing out on.  Essentially, the GTX 770 is a full-spec GK104 Kepler GPU running at higher clocks (both core and memory speeds) compared to the original GTX 680.  The new reference clocks for the GTX 770 were 1046 MHz base clock, 1085 MHz Boost clock and a nice increase to 7.0 GHz memory speeds.

gpuz.png

Galaxy GeForce GTX 770 2GB GC Specs

The Galaxy GC model is overclocked with a new base clock of 1111 MHz and a higher Boost clock of 1163 MHz; both are roughly 6-7% higher than the reference clocks.  Galaxy has left the memory speed alone though, keeping it running at an effective 7.0 GHz.
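For reference, here is a quick back-of-the-envelope check of those percentages, a minimal sketch in Python using the reference clocks quoted earlier in this review:

```python
# Rough check of the Galaxy GC overclock versus the GTX 770 reference clocks.
reference = {"base": 1046, "boost": 1085}   # MHz, reference GTX 770
galaxy_gc = {"base": 1111, "boost": 1163}   # MHz, Galaxy GTX 770 GC

for clock in ("base", "boost"):
    uplift = (galaxy_gc[clock] / reference[clock] - 1) * 100
    print(f"{clock} clock uplift: {uplift:.1f}%")

# Prints roughly 6.2% for the base clock and 7.2% for the Boost clock.
```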

IMG_9941.JPG

Continue reading our review of the Galaxy GeForce GTX 770 2GB GC graphics card!!

Manufacturer: NVIDIA

Another Wrench – GeForce GTX 760M Results

Just recently, I evaluated some of the current processor-integrated graphics options using our new Frame Rating performance metric. The results were very interesting, proving Intel has done some great work with its new HD 5000 graphics option for Ultrabooks. You might have noticed that the MSI GE40 didn’t just come with integrated HD 4600 graphics but also included a discrete NVIDIA GeForce GTX 760M on board.  While that previous article focused on the integrated graphics of Haswell, Trinity, and Richland, I did find some noteworthy results with the GTX 760M that I wanted to investigate and present.

IMG_0141.JPG

The MSI GE40 is a new Haswell-based notebook that includes the Core i7-4702MQ quad-core processor and Intel HD 4600 graphics.  Alongside it, MSI has included the Kepler-based GeForce GTX 760M discrete GPU.

760mspecs.png

This GPU offers 768 CUDA cores running at a 657 MHz base clock but can stretch higher with GPU Boost technology.  It is configured with 2GB of GDDR5 memory running at 2.0 GHz.

If you didn’t read the previous integrated graphics article linked above, some of the data presented here will spoil it, so you might want to get a baseline of information by reading through that first.  Also, remember that we are using our Frame Rating performance evaluation system for this testing – a key differentiator from most other mobile GPU testing.  In fact, it is that difference that allowed us to spot an interesting issue with the configuration we are showing you today. 

If you are not familiar with the Frame Rating methodology, and how we had to change some things for mobile GPU testing, I would really encourage you to read this page of the previous mobility Frame Rating article for the scoop.  The data presented below depends on that background knowledge!

Okay, you’ve been warned – on to the results.

Continue reading our story about GeForce GTX 760M Frame Rating results and Haswell Optimus issues!!

NVIDIA launches the GTX 760 @ $250; let the price wars begin again!

Subject: Graphics Cards | June 25, 2013 - 10:28 AM |
Tagged: geforce, GK104, gtx 760, nvidia, msi, MSI N760 TF 2GD5/OC

To start off with the good news, the GTX 760 is now available for $250 to $260 for the MSI model that [H]ard|OCP reviewed.  This is no paper launch, nor another $400+ card for you to dream about, but instead a solid performing card at a decent price.  Power is provided by an 8-pin and a 6-pin PCIe power connector, perhaps a little more than the card needs but perfect for overclockers who want the extra juice.  Performance-wise the card trumps the GTX 660 Ti and matches the GTX 670 and HD 7950 Boost in almost every test, for a good $50-75 less.  Even better news is that certain sites testing Frame Rating and SLI performance saw great scaling in real performance. 

Read on to get the whole picture from [H]ard|OCP.

H_gtx760.jpg

"Today NVIDIA is launching the GeForce GTX 760. The GeForce GTX 760 will be replacing a video card and offering what use to be high-end memory performance, at a mainstream price. We will evaluate a retail MSI N760 TF 2GD5/OC video card with comparisons to find out whether or not this is a true value."


Source: [H]ard|OCP
Manufacturer: NVIDIA

Getting even more life from GK104

Have you guys heard about this new GPU from NVIDIA?  It’s called GK104 and it turns out that the damn thing has found its way into yet another graphics card this year – the new GeForce GTX 760.  Yup, you read that right: what NVIDIA says is the last update to the GeForce lineup through Fall 2013 is going to be based on the same GK104 design that we have previously discussed in reviews of the GTX 680, GTX 670, GTX 660 Ti, GTX 690 and, more recently, the GTX 770. This isn’t a bad thing though!  GK104 has done a fantastic job in every field and market segment that NVIDIA has tossed it into, with solid performance and even better performance per watt than the competition.  It does mean, however, that talking up the architecture is kind of mind-numbing at this point…

block.jpg

If you are curious about the Kepler graphics architecture and GK104 in particular, I’m not going to stop you from going back and reading over my initial review of the GTX 680 from March of 2012.  The new GTX 760 takes the same GPU, adds a new and improved version of GPU Boost (the same one we saw in the GTX 770) and lowers the specifications a bit to enable NVIDIA to hit a new price point.  The GTX 760 will be replacing the GTX 660 Ti – that card will be falling into the ether, but the GTX 660 will remain, as will everything below it, including the GTX 650 Ti Boost, 650 Ti and plain old 650.  The GTX 670 went the way of the dodo with the release of the GTX 770.

01.jpg

Even though the GTX 690 isn't on this list, NVIDIA says it isn't EOL

As for the GeForce GTX 760, it will ship with 1152 CUDA cores running at a base clock of 980 MHz and a typical Boost clock of 1033 MHz.  The memory speed remains at 6.0 GHz on a 256-bit memory bus, and you can expect to find both 2GB and 4GB frame buffer options from retail partners at launch.  The 1152 CUDA cores are broken up over 6 SMX units, which means you’ll see some parts with 3 GPCs and others with 4 – NVIDIA claims any performance delta between them will be negligible. 
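To put those numbers in context, here is a quick sketch of the arithmetic behind the published specifications; the per-SMX core count and the peak memory bandwidth both fall out of the figures above, so treat this as an illustration rather than an official NVIDIA formula:

```python
# GeForce GTX 760: figures quoted in the text above.
cuda_cores = 1152
smx_units = 6
mem_data_rate_gbps = 6.0      # effective GDDR5 data rate, Gbps per pin
mem_bus_width_bits = 256

cores_per_smx = cuda_cores // smx_units                             # 192, the standard Kepler SMX width
peak_bandwidth_gbs = mem_data_rate_gbps * mem_bus_width_bits / 8    # bits -> bytes

print(f"CUDA cores per SMX: {cores_per_smx}")
print(f"Peak memory bandwidth: {peak_bandwidth_gbs:.0f} GB/s")      # ~192 GB/s
```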

Continue reading our review of the NVIDIA GeForce GTX 760 2GB Graphics Card!!

Frame Rating: AMD plans driver release to address frame pacing for July 31st

Subject: Graphics Cards | June 20, 2013 - 01:05 PM |
Tagged: radeon, nvidia, geforce, frame rating, fcat, crossfire, amd

Well, the date has been set.  AMD publicly stated on its @AMDRadeon Twitter account that a new version of the prototype driver we originally previewed with the release of the Radeon HD 7990 in April will be released to the public on July 31st.  For a problem that many in the industry didn't think existed.

 

 

Since that April release AMD has been very quiet about its driver changes and has actually refused to send me updated prototypes over the spring.  Either they have it figured out or they are worried they haven't - but it looks like we'll find out at the end of next month, and I feel pretty confident that the team will be able to address the issues we brought to light.

For those of you that might have missed the discussion, our series of Frame Rating stories will tell you all about the issues with frame pacing and stutter in regards to AMD's CrossFire multi-GPU technology. 

AMD gave the media a prototype driver in April to test with the Radeon HD 7990, a card that depends on CrossFire to work correctly, and the improvements were pretty drastic.

BF3_2560x1440_PLOT_0.png

So what can we expect on July 31st?  A driver that will give users the option to disable or enable the frame pacing technology they are developing - though I am still of the mindset that disabling is never advantageous.  More to come in the next 30 days!

Source: Twitter

Rumor: AMD Gets Exclusive Optimization for all Frostbite 3 Games

Subject: Graphics Cards | June 18, 2013 - 12:39 PM |
Tagged: radeon, nvidia, geforce, frostbite 3, ea, dice, amd

UPDATE #3

The original source article at IGN.com has been updated with some new information.  Now they are saying the agreement between AMD and EA is "non-exclusive and gamers using other components will be supported." 

The quote from an EA rep reads as follows:

"DICE has a partnership with AMD specifically for Battlefield 4 on PC to showcase and optimize the game for AMD hardware," an EA spokesperson said. "This does not exclude DICE from working with other partners to ensure players have a great experience across a wide set of PCs for all their titles."

END UPDATE #3

This could be a huge deal for NVIDIA and AMD in the coming months - according to a story at IGN.com, AMD has entered into an agreement with EA that will allow them exclusive rights to optimization for all games based around the Frostbite 3 engine.  That includes Battlefield 4, Mirror's Edge 2, Need for Speed Rivals and many more games due out this year and in 2014.  Here is the quote that is getting my attention:

Starting with the release of Battlefield 4, all current and future titles using the Frostbite 3 engine — Need for Speed Rivals, Mirror's Edge 2, etc. — will ship optimized exclusively for AMD GPUs and CPUs. While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.

bf4.jpg

Battlefield 4 will be exclusively optimized for AMD hardware.

This is huge news for AMD as the Frostbite 3 engine will be used for all EA published games going forward with the exception of sports titles.  The three mentioned above are huge but this also includes Star Wars Battlefront, Dragon Age and even the next Mass Effect so I can't really emphasize enough how big of a win this could be for AMD's marketing and developer relations teams. 

I am particularly interested in this line as well:

While Nvidia-based systems will be supported, the company won't be able to develop and distribute updated drivers until after each game is released.

The world of PC optimizations and partnerships has been around for a long time, so this isn't a huge surprise for anyone that follows PC gaming.  What is bothersome to me is that EA and AMD are rumored to have agreed that NVIDIA won't get access to the game as it is being developed - something that is CRUCIAL for day-of driver releases and performance tweaks for GeForce card owners.  In most cases, both AMD and NVIDIA developer relations teams get early access to game builds for PC titles in order to validate compatibility and to improve performance of these games for the public release.  Without these builds, NVIDIA would be at a big disadvantage.  This is exactly what happened with the recent Tomb Raider release.

UPDATE

AMD called me to reiterate their stance that competition does not automatically mean cutting out the other guy.  In the Tomb Raider story linked above, Neil Robison, AMD's Senior Director of Consumer and Graphics Alliances, states quite plainly: "The thing that angers me the most is when I see a request to debilitate a game. I understand winning, I get that, and I understand aggressive companies, I get that. Why would you ever want to introduce a feature on purpose that would make a game not good for half the gaming audience?"

So what do we take away from that statement, made in a story published in March, and today's rumor?  We have to take AMD at its word until we see solid evidence otherwise, or enough cases of this occurring that I feel like I am being duped.  AMD wants us all to know that it is playing the game the "right way," a stance that just happens to run counter to this rumor. 

END UPDATE

tombraider.jpg

NVIDIA had performance and compatibility issues with Tomb Raider upon release.

The irony in all of this is that AMD has been accusing NVIDIA of doing this exact thing for years - though without any public statements from developers, publishers or NVIDIA.  When Batman: Arkham Asylum launched, AMD basically said that NVIDIA had locked it out of supporting antialiasing.  In 2008, Assassin's Creed dropped DX 10.1 support, supposedly because NVIDIA, which didn't support the feature in its GeForce cards at the time, asked the developer to.  There was even the claim that NVIDIA was disabling cores for PhysX CPU support to help prop up GeForce sales.  At the time, AMD PR spun all of this as the worst possible thing a company could do in the name of gamers, that it was bad for the industry, etc.  But times change as opportunity changes.

The cold truth is that this is why AMD decided to take the chance NVIDIA was allegedly unwilling to take, grabbing the console design wins that are often described as "bad business."  If settling for razor-thin margins on the consoles is the risk, the reward AMD is hoping for is exactly this: benefits in other markets thanks to better relationships with game developers.

ps4controller.jpg

Will the advantage be with AMD thanks to PS4 and Xbox One hardware?

At E3 I spoke in depth with both NVIDIA and AMD executives about this debate, and as you might expect both have very different opinions about what is going to transpire in the next 12-24 months.  AMD views this advantage (being in the consoles) as the big bet that is going to pay off in the more profitable PC space.  NVIDIA thinks that AMD still doesn't have what it takes to truly support developers in the long run and doesn't have the engineers to innovate on the technology side.  In my view, having Radeon-based processors in the Xbox One and PlayStation 4 (as well as the Wii U, I guess) gives AMD a head start but won't win it the race for the hearts and minds of PC gamers. There is still a lot of work to be done for that.

Before this story broke I was planning on outlining another editorial on this subject and it looks like it just got promoted to a top priority.  There appear to be a lot of proverbial shoes left to drop in this battle, but it definitely needs more research and discussion. 

batmanaa.jpg

Remember the issues with Batman: Arkham Asylum?  I do.

I asked both NVIDIA and AMD for feedback on this story but only AMD has replied thus far.  Robert Hallock, PR manager for gaming and graphics, Graphics Business Unit at AMD sent me this:

It makes sense that game developers would focus on AMD hardware with AMD hardware being the backbone of the next console generation. At this time, though, our relationship with EA is exclusively focused on Battlefield 4 and its hardware optimizations for AMD CPUs, GPUs and APUs.

Not much there, but he is also not denying the original report from IGN.  It might just be too early for a more official statement.  I will update this story with information from NVIDIA if I hear anything else.

What do YOU think about this announcement?  Is this good news for AMD and bad news for NVIDIA?  Is it good or bad for the gamer and, in particular, the PC gamer?  Your input will help guide our upcoming continued talks with NVIDIA and AMD on the subject. 

UPDATE #2

Just so we all have some clarification on this and on the potential for validity of the rumor, this is where I sourced the story from this afternoon:

taylorquote.png

END UPDATE #2

Source: IGN

Razer Blade Haswell Gaming Notebook is Damn Sexy, Powerful

Subject: Mobile, Shows and Expos | June 12, 2013 - 05:47 PM |
Tagged: E3, razer, blade, haswell, gtx 765m, geforce

With the launch of Intel's Haswell processor, accessory maker-turned notebook vendor Razer announced a pretty slick machine, the Blade.  Based on a quad-core, 37 watt Core i7 Haswell CPU and a GeForce GTX 765M GPU, the Razer Blade packs a lot of punch.

razer1.jpg

It also includes 8GB of DDR3-1600 memory and an mSATA SSD, and integrates a 14-in 1600x900 display.  The design of the unit looks very similar to that of the MacBook Pro, but the black metal finish is really an attractive style change. 

razer2.jpg

The embedded battery is fairly large at 70 Whr, and Razer claims this will equate to 6 hours of battery life under non-gaming workloads.  With a weight just barely creeping past 4 lbs, the Razer Blade seems both portable and powerful.
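As a rough sanity check on that battery claim, the figures Razer quotes imply an average system draw of under 12 W; this is a back-of-the-envelope sketch only, since real drain depends heavily on workload, screen brightness and how aggressively Optimus keeps the GeForce GPU powered down:

```python
# Razer's figures: 70 Whr battery, claimed 6 hours of non-gaming use.
battery_whr = 70
claimed_hours = 6

average_draw_w = battery_whr / claimed_hours
print(f"Implied average system draw: {average_draw_w:.1f} W")   # ~11.7 W
```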

razer3.jpg

The price tag starts at $1799, so you won't be able to pick one of these up on the cheap, but for users like me who are willing to pay a bit more for performance and style in a slim chassis, the Blade seems like a very compelling option.  There are a lot of questions left to answer about this notebook, including the thermal concerns of packing that much high-frequency silicon into a thin and light form factor.  Does the unit get hot in bad places?  Can the screen quality match the performance of Haswell + Kepler? 

We are working with Razer to get a unit in very soon to put it to the test, and I am looking forward to finding out whether we have found the best gaming portable on the market.

Manufacturer: NVIDIA

A necessary gesture

NVIDIA views the gaming landscape as a constantly shifting medium that starts with the PC, but the company also sees mobile gaming, cloud gaming and even console gaming as part of the overall ecosystem.  All of that is tied together by an investment in content – the game developers and publishers that make the games we play on PCs, tablets, phones and consoles.

nv14.jpg

The slide above shows NVIDIA's targets for each segment – except for consoles, obviously.  NVIDIA GRID will address the cloud gaming infrastructure, GeForce and the GeForce Experience will continue with PC systems, and NVIDIA SHIELD and the Tegra SoC will get the focus for the mobile and tablet spaces.  I find it interesting that NVIDIA has specifically called out Steam under the PC – maybe a hint of the future for the upcoming Steam Box?

The primary point of focus for today’s press meeting was the commitment that NVIDIA has to the gaming world and to developers.  AMD has been talking up its 4-point attack on gaming, which really starts with its dominance in the console market.  But NVIDIA has been the leader in the PC world for many years and doesn’t see that changing.

nv02.jpg

With several global testing facilities, the most impressive of which exists in Russia, NVIDIA tests more games, more hardware and more settings combinations than you can possibly imagine.  They tune drivers and find optimal playing settings for more than 100 games that are now wrapped up into the GeForce Experience software.  They write tools for developers to find software bottlenecks and test for game streaming latency (with the upcoming SHIELD). They invest more in those areas than any other hardware vendor.

nv03.jpg

This is a list of technologies that NVIDIA claims they invented or developed – an impressive list that includes things like programmable shaders, GPU compute, Boost technology and more. 

nv04.jpg

Many of these turned out to be very important in the development and advancement of gaming – not just for PCs but for ALL gaming. 

Continue reading our editorial on NVIDIA's stance on its future in PC gaming!!