New GeForce Bundle: Witcher 3 and Batman: Arkham Knight

Subject: Graphics Cards | May 5, 2015 - 12:46 PM |
Tagged: The Witcher 3, nvidia, GTX 980, GTX 970, gtx, geforce, batman arkham knight

NVIDIA has announced a new game bundle for GeForce graphics cards starting now, and it’s a doozy: Purchase a qualifying GTX card and receive download codes for both Witcher 3: Wild Hunt and Batman: Arkham Knight!

batman-witcher-glp-header.png

Needless to say, both of these titles have been highly anticipated, and neither has been released just yet: Witcher 3: Wild Hunt is due out on May 19, and Batman: Arkham Knight arrives on June 23. So which cards qualify? Amazon has created a page specifically for this new offer here, and depending on which card you select you’ll be eligible for either both upcoming games or just Witcher 3. NVIDIA has this chart on their promotion page for reference:

eligible_cards.png

So GeForce GTX 980 and GTX 970 cards qualify for both games, while GTX 960 cards and the mobile GPUs qualify for Witcher 3 alone. Regardless of which game or card you choose, the PC versions of these new games will feature graphics that can’t be matched by consoles, and NVIDIA points out the advantages in their post:

"Both Batman and The Witcher raise the bar for graphical fidelity, rendering two very different open worlds with a level of detail we could only dream of last decade. And on PC, each is bolstered by NVIDIA GameWorks effects that increase fidelity, realism, and immersion. But to use those effects in conjunction with the many other available options, whilst retaining a high frame rate, it’s entirely possible that you’ll need an upgrade."

NVIDIA also posted this Witcher 3: Wild Hunt behind-the-scenes video:

In addition to the GameWorks effects present in Witcher 3, NVIDIA worked with developer Rocksteady on Batman: Arkham Knight “to create new visual effects exclusively for the game’s PC release.” The post elaborates:

"This is in addition to integrating the latest and greatest versions of our GPU-accelerated PhysX Destruction, PhysX Clothing and PhysX Turbulence technologies, which work in tandem to create immersive effects that react realistically to external forces. In previous Batman: Arkham games, these technologies created immersive fog and ice effects, realistic destructible scenery, and game-enhancing cloth effects that added to the atmosphere and spectacle. Expect to see similar effects in Arkham Knight, along with new features and effects that we'll be talking about in depth as we approach Arkham Knight’s June 23rd release."

batman-arkham-knight-screenshot-1.jpg

Of note, the mobile GPUs included in the promotion will only come with download codes if the notebook seller is participating in the "Two Times The Adventure" offer, so be sure to check for that when looking for a gaming laptop that qualifies!

The promotion is going on now and is available “for a limited time or while supplies last”.

Source: NVIDIA

PCPer Live! GeForce GTX TITAN X Live Stream and Giveaway!

Subject: Graphics Cards | March 16, 2015 - 07:13 PM |
Tagged: video, tom petersen, titan x, nvidia, maxwell, live, gtx titan x, gtx, gm200, geforce

UPDATE 2: If you missed the live stream, we now have the replay available below!

UPDATE: The winner has been announced: congrats to Ethan M. for being selected as the random winner of the GeForce GTX TITAN X graphics card!!

Get yourself ready, it’s time for another GeForce GTX live stream hosted by PC Perspective’s Ryan Shrout! This time the focus is going to be NVIDIA's brand-new GeForce GTX TITAN X graphics card, first teased a couple of weeks back at GDC. NVIDIA's Tom Petersen will be joining us live from the GPU Technology Conference show floor to discuss the GM200 GPU and its performance, and to show off some demos of the hardware in action.

GeForce_GTX_TITANX_3Qtr.jpg

And what's a live stream without a prize? One lucky live viewer will win a GeForce GTX TITAN X 12GB graphics card of their very own! That's right - all you have to do is tune in for the live stream tomorrow afternoon and you could win a Titan X!!

pcperlive.png

NVIDIA GeForce GTX TITAN X Live Stream and Giveaway

1pm PT / 4pm ET - March 17th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

The event will take place Tuesday, March 17th at 1pm PT / 4pm ET at http://www.pcper.com/live. There you’ll be able to catch the live video stream as well as use our chat room to interact with the audience. To win the prize you will have to be watching the live stream, with exact details of the methodology for handing out the goods coming at the time of the event.

Tom has a history of being both informative and entertaining and these live streaming events are always full of fun and technical information that you can get literally nowhere else.

If you have questions, please leave them in the comments below and we'll look through them just before the start of the live stream. Of course you'll be able to tweet us questions @pcper and we'll be keeping an eye on the IRC chat as well for more inquiries. What do you want to know and hear from Tom or me?

So join us! Set your calendar for this coming Tuesday at 1pm PT / 4pm ET and be here at PC Perspective to catch it. If you are a forgetful type of person, sign up for the PC Perspective Live mailing list that we use exclusively to notify users of upcoming live streaming events including these types of specials and our regular live podcast. I promise, no spam will be had!

Huge thanks to ASUS for supplying a new G751JY notebook, featuring an Intel Core i7-4710HQ and a GeForce GTX 980M 4GB GPU to power our live stream from GTC!!

GDC 15: NVIDIA Shows TITAN X at Epic Games Keynote

Subject: Graphics Cards | March 4, 2015 - 01:10 PM |
Tagged: titan x, nvidia, maxwell, gtx, geforce, gdc 15, GDC

For those of you worried that GDC would sneak by without any new information for NVIDIA's GeForce fans, Jen-Hsun Huang surprised everyone by showing up at Epic Games' keynote with Tim Sweeney to hijack it.

The result: the first showing of the upcoming GeForce GTX TITAN X, based on the Maxwell GM200 GPU.

titanx3.jpg

JHH stated that it would have a 12GB frame buffer and was built using 8 billion transistors! There wasn't much more information than that, but I was promised that the details would be revealed sooner rather than later.

titanx2.jpg

Any guesses on performance or price?

titanx1.jpg

titanx4.jpg

Jen-Hsun signs the world's first TITAN X for Tim Sweeney.

Kite Demo running on TITAN X

UPDATE: I ran into the TITAN X again at the NVIDIA booth and was able to confirm a couple more things. First, the card will only require a 6-pin and an 8-pin power connection, indicating that NVIDIA is still pushing power efficiency with GM200.

titanx5.jpg

Also, as you would expect, the TITAN X will support 3-way and 4-way SLI, or at the very least has the SLI bridges to support it.

titanx6.jpg

PCPer Live! GeForce GTX 960 Live Stream and Giveaway with Tom Petersen

Subject: General Tech, Graphics Cards | January 22, 2015 - 06:44 PM |
Tagged: video, tom petersen, nvidia, maxwell, live, gtx 960, gtx, GM206, geforce

UPDATE 2: If you missed the live stream you missed the prizes! But you can still watch the replay to get all the information and Q&A that went along with it as we discuss the GTX 960 and many more topics from the NVIDIA universe.

UPDATE (1/22): Well, the secret is out. Today's discussion will be about the new GeForce GTX 960, a $199 graphics card that takes power efficiency to a previously unseen level! If you haven't read my review of the card yet, you should do so first, but then be sure you are ready for today's live stream and giveaway - details below! And don't forget: if you have questions, please leave them in the comments!

Get yourself ready, it’s time for another GeForce GTX live stream hosted by PC Perspective’s Ryan Shrout and NVIDIA’s Tom Petersen. Though we can’t dive into the exact details of what topics are going to be covered, intelligent readers that keep an eye on the rumors on our site will likely be able to guess what is happening on January 22nd.

geforce-logo.jpg

On hand to talk about the products and answer questions about technologies in the GeForce family, including GPUs, G-Sync, GameWorks, GeForce Experience and more, will be Tom Petersen, well known on the LAN party and events circuit. To spice things up, Tom has worked with graphics card partners to bring along a sizeable swag pack to give away LIVE during the event, including new GTX graphics cards. LOTS of graphics cards.

pcperlive.png

NVIDIA GeForce GTX 960 Live Stream and Giveaway

10am PT / 1pm ET - January 22nd

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

Here are some of the prizes we have lined up for those of you that join us for the live stream:

  • 3 x MSI GeForce GTX 960 Graphics Cards
  • 4 x EVGA GeForce GTX 960 Graphics Cards
  • 3 x ASUS GeForce GTX 960 Graphics Cards

prizes.jpg

Thanks to ASUS, EVGA and MSI for supporting the stream!

The event will take place Thursday, January 22nd at 1pm ET / 10am PT at http://www.pcper.com/live. There you’ll be able to catch the live video stream as well as use our chat room to interact with the audience, asking questions for me and Tom to answer live. To win the prizes you will have to be watching the live stream, with exact details of the methodology for handing out the goods coming at the time of the event.

Tom has a history of being both informative and entertaining and these live streaming events are always full of fun and technical information that you can get literally nowhere else. Previous streams have produced news as well – including statements on support for Adaptive Sync, release dates for displays and first-ever demos of triple display G-Sync functionality. You never know what’s going to happen or what will be said!

If you have questions, please leave them in the comments below and we'll look through them just before the start of the live stream. Of course you'll be able to tweet us questions @pcper and we'll be keeping an eye on the IRC chat as well for more inquiries. What do you want to know and hear from Tom or me?

So join us! Set your calendar for this coming Thursday at 1pm ET / 10am PT and be here at PC Perspective to catch it. If you are a forgetful type of person, sign up for the PC Perspective Live mailing list that we use exclusively to notify users of upcoming live streaming events including these types of specials and our regular live podcast. I promise, no spam will be had!

GPU Rumors: AMD Plans 20nm but NVIDIA Waits for 16nm

Subject: General Tech, Graphics Cards | December 28, 2014 - 09:47 PM |
Tagged: radeon, nvidia, gtx, geforce, amd

According to an anonymous source cited by WCCFTech, AMD is preparing a 20nm-based graphics architecture that is expected to launch in April or May. Originally, the source predicted that these graphics devices, referred to as the R9 300 series, would be available in February or March. The reason for this “delay” is massive demand for 20nm production capacity.

nvidia-gtx-vs-amd-gaming-evolved.jpg

The source also claims that NVIDIA will skip 20nm entirely and instead opt for 16nm when it becomes available (which is said to be mid or late 2016). The expectation is that NVIDIA will answer AMD's new graphics devices with a higher-end Maxwell part that remains on 28nm. Earlier rumors, based on a leaked SiSoftware entry, claim 3072 CUDA cores clocked between 1.1 GHz and 1.39 GHz. If true, this would give it between 6.75 and 8.54 TeraFLOPs of performance, the higher of which is right around the advertised performance of a GeForce GTX Titan Z (only in a single compute device that does not require the distribution of work that SLI was created to automate).
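For context, those TeraFLOP figures line up with the usual back-of-the-envelope estimate of two single-precision operations (one fused multiply-add) per CUDA core per clock. A minimal sketch of that arithmetic, assuming the rumored core count and clocks, looks like this:

    # Rough peak FP32 estimate: 2 FLOPs (one fused multiply-add) per CUDA core per clock.
    def peak_tflops(cuda_cores, clock_ghz):
        return 2 * cuda_cores * clock_ghz / 1000.0

    for clock_ghz in (1.10, 1.39):
        print(f"3072 cores @ {clock_ghz:.2f} GHz ~ {peak_tflops(3072, clock_ghz):.2f} TFLOPs")
    # Prints roughly 6.76 and 8.54 TFLOPs, matching the rumored range above.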

Will this strategy work in NVIDIA's favor? I don't know. 28nm is a fairly stable process at this point, which will probably allow them to produce chips that can be bigger and more aggressively clocked. On the other hand, they pretty much need to rely upon chips that are bigger and more aggressively clocked to remain competitive with AMD's move to a more advanced process. Previous rumors also hint that AMD is looking at water cooling for their reference card, which might place yet another handicap against NVIDIA, although cooling is not an area that NVIDIA struggles in.

Source: WCCFTech
Author:
Manufacturer: NVIDIA

A powerful architecture

In March of this year, NVIDIA announced the GeForce GTX Titan Z at its GPU Technology Conference. It was touted as the world's fastest graphics card thanks to its pair of full GK110 GPUs, but it came with an equally stunning price of $2999. NVIDIA claimed it would be available by the end of April for gamers and CUDA developers to purchase, but it was pushed back slightly and released at the very end of May, going on sale at that promised price.

The specifications of GTX Titan Z are damned impressive - 5,760 CUDA cores, 12GB of total graphics memory, 8.1 TFLOPs of peak compute performance. But something happened between the announcement and product release that perhaps NVIDIA hadn't accounted for. AMD's Radeon R9 295X2, a dual-GPU card with full-speed Hawaii chips on-board, was released at $1499. I think it's fair to say that AMD took some chances that NVIDIA was surprised to see them take, including going the route of a self-contained water cooler and blowing past the PCI Express recommended power limits to offer a ~500 watt graphics card. The R9 295X2 was damned fast and I think it caught NVIDIA a bit off-guard.

As a result, the GeForce GTX Titan Z release was a bit quieter than most of us expected. Yes, the Titan Black card was released without sampling the gaming media, but that card was nearly a mirror of the GeForce GTX 780 Ti, just with a larger frame buffer, and the performance of that GPU was well known. For NVIDIA to release a flagship dual-GPU graphics card, admittedly the most expensive one I have ever seen with the GeForce brand on it, and NOT send out samples was telling.

NVIDIA is adamant, though, that the primary target of the Titan Z is not just gamers but the CUDA developer who needs the most performance possible in as small a space as possible. For that specific user, one who doesn't quite have the income to invest in a lot of Tesla hardware but wants to be able to develop and use CUDA applications with a significant amount of horsepower, the Titan Z fits the bill perfectly.

Still, the company was touting the Titan Z as "offering supercomputer class performance to enthusiast gamers" and telling gamers in launch videos that the Titan Z is the "fastest graphics card ever built" and that it was "built for gamers." So, interest piqued, we decided to review the GeForce GTX Titan Z.

The GeForce GTX TITAN Z Graphics Card

Cost and performance notwithstanding, the GeForce GTX Titan Z is an absolutely stunning looking graphics card. The industrial design that started with the GeForce GTX 690 (the last dual-GPU card NVIDIA released) and continued with the GTX 780 and Titan family lives on with the Titan Z.

IMG_0270.JPG

The all-metal finish looks good and stands up to abuse, keeping the PCB straight even with the heft of the heatsink. There is only a single fan on the Titan Z, center mounted, with a large heatsink covering both GPUs on opposite sides. The GeForce logo up top illuminates, as we have seen on all similar designs, which adds a nice touch.

Continue reading our review of the NVIDIA GeForce GTX Titan Z 12GB Graphics Card!!

NVIDIA Finally Launches GeForce GTX Titan Z Graphics Card

Subject: Graphics Cards | May 28, 2014 - 11:19 AM |
Tagged: titan z, nvidia, gtx, geforce

Though delayed by a month, today marks the official release of NVIDIA's Titan Z graphics card, the dual-GK110 beast with the $3000 price tag. The massive card was shown for the first time in March at NVIDIA's GPU Technology Conference, and our own Tim Verry was on the ground to get the information.

The details remain the same:

Specifically, the GTX TITAN Z is a triple-slot graphics card that marries two full GK110 (big Kepler) GPUs for a total of 5,760 CUDA cores, 480 TMUs, and 96 ROPs with 12GB of GDDR5 memory (6GB on a 384-bit bus per GPU). For the truly adventurous, it appears possible to SLI two GTX Titan Z cards using the single SLI connector. Display outputs include two DVI, one HDMI, and one DisplayPort connector.

The difference now, of course, is that all the clock speeds and pricing are official.

titanzspecs.png

A base clock speed of 705 MHz with a Boost rate of 876 MHz places it well behind the individual GPU performance of a GeForce GTX 780 Ti or GTX Titan Black (rated at 889/980 MHz). The memory clock speed remains the same at 7.0 Gbps and you are still getting a massive 6GB of memory per GPU.

Maybe most interesting with the release of the GeForce GTX Titan Z is that NVIDIA seems to have completely fixated on non-DIY consumers with the card. We did not receive a sample of the Titan Z (nor did we get one of the Titan Black) and when I inquired as to why, NVIDIA PR stated that they were "only going to CUDA developers and system builders."

geforce-gtx-titan-z-3qtr.png

I think it is more than likely that after the release of AMD's Radeon R9 295X2 dual GPU graphics card on April 8th, with a price tag of $1500 (half of the Titan Z), the target audience was redirected. NVIDIA already had its eye on the professional markets that weren't willing to dive into the Quadro/Tesla lines (CUDA developers will likely drop $3k at the drop of a hat to get this kind of performance density). But a side benefit of creating the best flagship gaming graphics card on the planet was probably part of the story - and promptly taken away by AMD.

geforce-gtx-titan-z-bracket.png

I still believe the Titan Z will be an impressive graphics card to behold, both in terms of look and style and in terms of performance. But it would take the BIGGEST of NVIDIA fans to pass up buying a pair of Radeon R9 295X2 cards in favor of a single GeForce GTX Titan Z. At least that is our assumption until we can test one for ourselves.

I'm still working to get my hands on one of these for some testing as I think the ultra high end graphics card coverage we offer is incomplete without it. 

Several of NVIDIA's partners are going to be offering the Titan Z, including EVGA, ASUS, MSI and Zotac. Maybe the most interesting, though, is EVGA's water-cooled option!

evgatitanz.jpg

So, what do you think? Anyone lining up for a Titan Z when they show up for sale?

Secret Page to Win an NVIDIA GeForce GTX TITAN X!

Subject: Graphics Cards | March 17, 2015 - 12:00 PM |
Tagged: video, tom petersen, titan x, nvidia, maxwell, live, gtx titan x, gtx, gm200, geforce

Hey, you found us!

IMG_1953.JPG

This is the secret page on PC Perspective for live viewers of our Titan X live stream so they can enter for a chance to win one! Just fill out the form below and you're entered. Good luck!

Note: we are making this an international giveaway, so good luck to all of our domestic and international friends!

NVIDIA Joins the Bundle Game: Up to $150 in credit on Free-to-Play games for all GTX buyers

Subject: Graphics Cards | February 11, 2013 - 12:33 PM |
Tagged: world of tanks, planetside 2, nvidia, Hawken, gtx, geforce, bundle

AMD has definitely been winning the "game" of game bundles and bonus content with graphics card purchases, as is evident from the recent Never Settle Reloaded campaign that includes titles like Crysis 3, Bioshock Infinite and Tomb Raider.  I commented that NVIDIA was falling behind and might even start to look like it had moved away from a focus on PC gamers, since it hadn't made any reply over the last year...

After losing a bidding war with AMD over Crysis 3, today NVIDIA is unveiling a bundle campaign that attacks from a different angle; rather than bundling in retail games, NVIDIA is working with free-to-play titles.  How do you give gamers bonuses with free-to-play games?  Credits!  Cold hard cash!

bundle1.png

Starting today, if you pick up any GeForce GTX graphics card you'll be eligible for free in-game credit to use in each of the three free-to-play titles partnering with NVIDIA.  A GTX 650 or GTX 650 Ti will net you $25 in each game for a total bonus of $75, while buying a GTX 660 or higher, all the way up to the GTX 690, results in $50 per game for a total of $150.

Also, after asking NVIDIA about it, I can confirm this is a PER CARD bundle, so if you get an SLI pair of anything, you'll get double the credit.  A pair of GeForce GTX 660s for an SLI rig results in $100 per game, $300 total!
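To make that math concrete, here is a minimal sketch of how the credit adds up, using the tier amounts from NVIDIA's published offer (the function and tier labels are just illustrative):

    # Illustrative sketch of the bundle math: per-game credit x three titles x number of qualifying cards.
    CREDIT_PER_GAME = {"GTX 650 / 650 Ti": 25, "GTX 660 or higher": 50}  # USD per free-to-play title
    NUM_TITLES = 3  # Hawken, PlanetSide 2, World of Tanks

    def total_credit(tier, num_cards=1):
        return CREDIT_PER_GAME[tier] * NUM_TITLES * num_cards

    print(total_credit("GTX 650 / 650 Ti"))                 # 75
    print(total_credit("GTX 660 or higher"))                # 150
    print(total_credit("GTX 660 or higher", num_cards=2))   # 300 for an SLI pair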

bundle3.png

This is a very interesting approach that NVIDIA has decided to take and I am eager to get feedback from our readers on the differences between AMD's and NVIDIA's bundles.  I have played quite a bit of Planetside 2 and definitely enjoyed it; it is a graphics showcase as well with huge and expansive levels and hundreds of people per server.  World of Tanks and Hawken I am less familiar with but they also are extremely popular.

bundle2.png

Leave us your comments below!  Do you think NVIDIA's new GeForce GTX gaming bundle for free-to-play game credits can be successful?

If you are looking for a new GeForce GTX card today and this bundle convinced you to buy, feel free to use the links below. 

Author:
Manufacturer: MSI

Spicing up the GTX 670

The Power Edition graphics card series from MSI is a relatively new addition to its lineup. The Power Edition often mimics the higher-end Lightning series, but at a far lower price (and perhaps with a smaller feature set). This allows MSI to split the difference between reference-class boards and the high-end Lightning GPUs.

msi_gtx670_pe_01.jpg

Doing this gives users a greater variety of products to choose from and lets them better tailor their purchases to their needs and financial means. Not everyone wants to pay $600 for a GTX 680 Lightning, but what if someone were able to get similar cooling, quality, and overclocking potential for a much lower price?  This is what MSI has done with one of its latest Power Edition cards.

The GTX 670 Power Edition

The NVIDIA GTX 670 has received accolades throughout the review press. It is a great combination of performance, power consumption, heat production, and price. It certainly caused AMD a great amount of alarm, and AMD hurriedly cut prices on the HD 7900 series of cards in response. The GTX 670 is a slightly cut-down version of the full GTX 680, and it runs very close to the clock speed of its bigger brother. In fact, other than texture and stream unit count, the cards are nearly identical.

Continue reading the entire review!