Subject: General Tech, Graphics Cards | October 23, 2013 - 12:21 AM | Scott Michaud
Tagged: nvidia, graphics drivers, geforce
Mid-June kicked up a storm of poop across the internet when IGN broke the AMD optimizations for Frostbite 3. It was reported that NVIDIA would not receive sample code for those games until after they launched. The article was later updated with a statement from AMD: "... the AMD Gaming Evolved program undertakes no efforts to prevent our competition from optimizing for games before their release."
The confusion, I now assume, was caused by the then-unannounced Mantle.
And, as it turns out, NVIDIA did receive the code for Battlefield 4 prior to launch. On Monday, the company launched its 331.58 WHQL-certified drivers, which are optimized for Batman: Arkham Origins and Battlefield 4. According to the release notes, you should even be able to use SLI out of the gate. If, on the other hand, you are a Civilization V player, HBAO+ should enhance your shadowing.
They also added a DX11 SLi profile for Watch Dogs... awkwarrrrrd.
Check out the blog at GeForce.com for a bit more information, read the release notes, or just head over to the drivers page. If you have GeForce Experience installed, it has probably already asked you to update.
The Really Good Times are Over
We really do not realize how good we had it. Sure, we could apply that to budget surpluses and the time before the rise of global terrorism, but in this case I am talking about the predictable advancement of graphics due to both design expertise and improvements in process technology. Moore's law has been exceptionally kind to graphics. Looking back and plotting the course of these graphics companies, they have actually outstripped Moore in terms of transistor density from generation to generation. Most of this is due to better tools and the expertise gained in what is still a fairly new endeavor compared to CPUs (the first true 3D accelerators were released in the 1993/94 timeframe).
The complexity of a modern 3D chip is truly mind-boggling. To get a good idea of where we came from, we must look back at the first generations of products that we could actually purchase. The original 3Dfx Voodoo Graphics comprised a raster chip and a texture chip; each contained approximately 1 million transistors (give or take) and was fabricated on a then-available .5 micron process (we shall call it 500 nm from here on out to give a sense of perspective against modern process technology). The chips were clocked between 47 and 50 MHz (though they could often be pushed to 57 MHz by adding "SET SST_GRXCLK=57" to the init file… by the way, SST stood for Sellers/Smith/Tarolli, the founders of 3Dfx). Revolutionary at the time, this card could push out 47 to 50 megapixels per second, carried 4 MB of VRAM, and was released in the beginning of 1996.
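As a quick sanity check on those numbers (a rough sketch; the one-pixel-per-clock figure is my assumption for illustration, not something from 3Dfx documentation):

```python
# Rough arithmetic for the original Voodoo Graphics, as described above.
clock_hz = 50_000_000            # 50 MHz raster clock
pixels_per_clock = 1             # assumed: one pixel per clock
fill_rate = clock_hz * pixels_per_clock
print(fill_rate / 1e6, "megapixels/s")   # in line with the quoted 47-50 figure

# A 640x480, 16-bit, double-buffered framebuffer fits comfortably in 4 MB,
# leaving the rest of the VRAM for textures and the depth buffer.
framebuffer_bytes = 640 * 480 * 2 * 2    # width x height x bytes/pixel x buffers
print(framebuffer_bytes / 2**20, "MB")
```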
My first 3D graphics card was the Orchid Righteous 3D. Voodoo Graphics was really the first successful consumer 3D graphics card. Yes, there were others before it, but Voodoo Graphics had the largest impact of them all.
In 1998, 3Dfx released the Voodoo 2, a significant jump in complexity from the original. These chips were fabricated on a 350 nm process. There were three chips on each card: one raster chip and two texture chips. At the top end of the product stack were the 12 MB cards; the raster chip had 4 MB of VRAM available to it while each texture chip had 4 MB of VRAM for texture storage. Not only did this product double the performance of the Voodoo Graphics, it was able to run single card configurations at 800x600 (compared to the maximum 640x480 of the Voodoo Graphics). This was around the time NVIDIA started to become a very aggressive competitor with the Riva TnT and ATI was about to ship the Rage 128.
Subject: Graphics Cards, Displays | October 20, 2013 - 02:50 PM | Ryan Shrout
Tagged: video, tom petersen, nvidia, livestream, live, g-sync
UPDATE: If you missed our live stream today that covered NVIDIA G-Sync technology, you can watch the replay embedded below. NVIDIA's Tom Petersen stops by to talk about G-Sync in both high level and granular detail while showing off some demonstrations of why G-Sync is so important. Enjoy!!
Last week NVIDIA hosted press and developers in Montreal to discuss a couple of new technologies, the most impressive of which was NVIDIA G-Sync, a new monitor solution that looks to solve the eternal debate of smoothness against latency. If you haven't read about G-Sync and how impressive it was when first tested on Friday, you should check out my initial write-up, NVIDIA G-Sync: Death of the Refresh Rate, which not only does that but dives into why the technology shift was necessary in the first place.
G-Sync essentially functions by altering and controlling the vBlank signal sent to the monitor. In a normal configuration, vBlank is a combination of the vertical front and back porch and the necessary sync time. That timing is set at a fixed stepping that determines the effective refresh rate of the monitor: 60 Hz, 120 Hz, etc. What NVIDIA will now do in the driver and firmware is lengthen or shorten the vBlank interval as desired, sending it when one of two criteria is met.
- A new frame has completed rendering and has been copied to the front buffer. Sending vBlank at this time tells the screen to grab the data from the card and display it immediately.
- A substantial amount of time has passed and the currently displayed image needs to be refreshed to avoid brightness variation.
In current display timing setups, the submission of the vBlank signal has been completely independent from the rendering pipeline. The result was varying frame latency and either horizontal tearing or fixed refresh frame rates. With NVIDIA G-Sync creating an intelligent connection between rendering and frame updating, the display of PC games is fundamentally changed.
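The two criteria above can be sketched as a simple decision function. This is a hypothetical illustration only; the thresholds are my assumptions for the sketch, not NVIDIA's actual driver or firmware logic:

```python
# Sketch of the two vBlank triggers described above (illustrative only).
MIN_INTERVAL_S = 1 / 144   # assumed: panel's maximum refresh rate (144 Hz)
MAX_HOLD_S = 1 / 30        # assumed: refresh anyway after ~33 ms so the
                           # panel's brightness does not vary

def should_send_vblank(frame_ready, seconds_since_last_vblank):
    """Return True when the driver should emit a vBlank to the monitor."""
    # Criterion 1: a new frame is in the front buffer and the panel can
    # accept another refresh (we can't exceed its maximum rate).
    if frame_ready and seconds_since_last_vblank >= MIN_INTERVAL_S:
        return True
    # Criterion 2: too long since the last refresh; redisplay the current
    # image to avoid brightness variation.
    if seconds_since_last_vblank >= MAX_HOLD_S:
        return True
    return False
```

In other words, the refresh is driven by the renderer when frames are arriving, and by a timeout when they are not.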
Every person that saw the technology, including other media members and even developers like John Carmack, Johan Andersson and Tim Sweeney, came away knowing that this was the future of PC gaming. (If you didn't see the panel that featured those three developers on stage, you are missing out.)
But it is definitely a complicated technology and I have already seen a lot of confusion about it in our comment threads on PC Perspective. To help the community get a better grasp and to offer them an opportunity to ask some questions, NVIDIA's Tom Petersen is stopping by our offices on Monday afternoon where he will run through some demonstrations and take questions from the live streaming audience.
Be sure to stop back at PC Perspective on Monday, October 21st at 2pm ET / 11am PT as we discuss G-Sync, how it was developed and the various ramifications the technology will have in PC gaming. You'll find it all on our PC Perspective Live! page on Monday but you can sign up for our "live stream mailing list" as well to get notified in advance!
NVIDIA G-Sync Live Stream
11am PT / 2pm ET - October 21st
We also want your questions!! The easiest way to get them answered is to leave them for us here in the comments of this post. That will give us time to filter through the questions and get the answers you need from Tom. We'll take questions via the live chat and via Twitter (follow me @ryanshrout) during the event, but often there is a lot of noise to deal with.
So be sure to join us on Monday afternoon!
Subject: Graphics Cards | October 18, 2013 - 07:55 PM | Ryan Shrout
Tagged: video, tim sweeney, nvidia, Mantle, john carmack, johan andersson, g-sync, amd
If you weren't on our live stream from the NVIDIA "The Way It's Meant to be Played" tech day this afternoon, you missed a hell of an event. After the announcement of NVIDIA G-Sync variable refresh rate monitor technology, NVIDIA's Tony Tomasi brought one of the most intriguing panels of developers on stage to talk.
John Carmack, Tim Sweeney and Johan Andersson talked for over an hour, taking questions from the audience and even getting into debates amongst themselves in some instances. Topics included NVIDIA G-Sync of course, AMD's Mantle low-level API, the hurdles facing PC gaming and what direction each luminary is currently headed for future development.
If you are a PC enthusiast or gamer you are definitely going to want to listen and watch the video below!
Subject: General Tech, Graphics Cards | October 18, 2013 - 01:21 PM | Scott Michaud
Tagged: nvidia, GeForce GTX 780 Ti
So the really interesting news today was G-Sync but that did not stop NVIDIA from sneaking in a new high-end graphics card. The GeForce GTX 780 Ti follows the company's old method of releasing successful products:
- Attach a seemingly arbitrary suffix to a number
In all seriousness, we know basically nothing about this card. It is entirely possible that its architecture is not even based on GK110. We do know it will be faster than a GeForce GTX 780, but we have no frame of reference relative to the GeForce GTX Titan. The two cards were already so close in performance that Ryan struggled to validate the 780's existence. Imagine how difficult it would be for NVIDIA to wedge yet another product into that gap.
And if it does outperform the Titan, what is its purpose? Sure, Titan is a GPGPU powerhouse if you want double-precision performance without purchasing a Tesla or a Quadro, but that is not really relevant for gamers yet.
We shall see, soon, when we get review samples in. You, on the other hand, will likely see more when the card launches mid-November. No word on pricing.
Our Legacy's Influence
We are often creatures of habit. Change is hard. And oftentimes legacy systems that have been in place for a very long time determine the angle at which we attack new problems. This happens in the world of computer technology but also outside the walls of silicon, and the result can be dangerous inefficiencies that threaten to limit our advancement in those areas. Often our need to adapt new technologies to existing infrastructure can be blamed for stagnant development.
Take the development of the phone as an example. The pulse based phone system and the rotary dial slowed the implementation of touch dial phones and forced manufacturers to include switches to select between pulse and tone based dialing options on phones for decades.
Perhaps a more substantial example is the railroad system, which based its track gauge (the width between the rails) on transportation methods that existed before the birth of Christ. Horse-drawn carriages pulled by two horses had an axle gap of 4 feet 8 inches in the 1800s, and thus the first railroads in the US were built with a track gauge of 4 feet 8 inches. Today, the standard rail track gauge remains 4 feet 8 inches despite the fact that a wider gauge would allow for more stability with larger cargo loads and for higher speed vehicles. But updating the existing infrastructure around the world would be so cost prohibitive that we will likely remain with that outdated standard.
What does this have to do with PC hardware and why am I giving you an abbreviated history lesson? There are clearly some examples of legacy infrastructure limiting our advancement in hardware development. Solid state drives are held back by the current SATA based storage interface though we are seeing movements to faster interconnects like PCI Express to alleviate this. Some compute tasks are limited by the “infrastructure” of standard x86 processor cores and the move to GPU compute has changed the direction of these workloads dramatically.
There is another area of technology that could be improved if we could just move past an existing way of doing things. Displays.
Subject: Graphics Cards | October 18, 2013 - 10:52 AM | Ryan Shrout
Tagged: variable refresh rate, refresh rate, nvidia, gsync, geforce, g-sync
UPDATE: I have posted a more in-depth analysis of the new NVIDIA G-Sync technology: NVIDIA G-Sync: Death of the Refresh Rate. Thanks for reading!!
UPDATE 2: ASUS has announced the G-Sync enabled version of the VG248QE will be priced at $399.
During a gaming event being held in Montreal, NVIDIA unveiled a new technology for GeForce gamers that the company hopes will revolutionize the PC and displays. Called NVIDIA G-Sync, this new feature combines changes to the graphics driver with changes to the monitor to alter the way refresh rates and Vsync have worked for decades.
With standard LCD monitors, gamers are forced to choose between a tear-free experience with Vsync enabled or living with substantial visual anomalies in order to get the best and most efficient frame rates. G-Sync changes that by allowing a monitor to display refresh rates other than 60 Hz, 120 Hz, 144 Hz, etc. without the horizontal tearing normally associated with disabling Vsync. Essentially, G-Sync allows a properly equipped monitor to run at a variable refresh rate, which will improve the experience of gaming in interesting ways.
This technology will be available soon on Kepler-based GeForce graphics cards but will require a monitor with support for G-Sync; not just any display will work. The first launch monitor is a variation on the very popular 144 Hz ASUS VG248QE 1920x1080 display and, as we saw with 3D Vision, supporting G-Sync will require licensing and hardware changes. In fact, NVIDIA claims that the new logic inside the panel's controller is NVIDIA's own design - so you can obviously expect this to only function with NVIDIA GPUs.
DisplayPort is the only input option currently supported.
It turns out NVIDIA will actually be offering retrofitting kits for current users of the VG248QE at some yet to be disclosed cost. The first retail sales of G-Sync will ship as a monitor + retrofit kit as production was just a bit behind.
Using a monitor with a variable refresh rate allows the game to display 55 FPS on the panel at 55 Hz without any horizontal tearing. It can also display 133 FPS at 133 Hz without tearing. Anything below the 144 Hz maximum refresh rate of this monitor will run at full speed without the tearing associated with the lack of vertical sync.
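The relationship is simple enough to sketch: with a variable refresh display, the effective refresh rate tracks the game's frame rate up to the panel's maximum. This is a rough illustration of the idea, not NVIDIA's implementation; the 144 Hz cap matches the ASUS panel mentioned above:

```python
def refresh_interval_ms(fps):
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

def effective_refresh_hz(fps, panel_max_hz=144):
    """With a variable-refresh panel, the display simply refreshes at the
    game's frame rate, capped at the panel's maximum refresh rate."""
    return min(fps, panel_max_hz)

print(effective_refresh_hz(55))    # a 55 FPS game refreshes the panel at 55 Hz
print(effective_refresh_hz(133))   # 133 FPS -> 133 Hz, no tearing
print(effective_refresh_hz(200))   # capped at the panel's 144 Hz maximum
```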
The technology NVIDIA is showing here is impressive when seen in person, and that is really the only way to understand the difference. High speed cameras and captures will help, but much like 3D Vision, this is a feature that needs to be seen to be appreciated. How users will react to that roadblock remains to be seen.
Features like G-Sync show the gaming world that, without the restrictions of consoles, there are quite a few revolutionary steps that can be taken to maintain the PC gaming advantage well into the future. 4K displays were a recent example and now NVIDIA G-Sync adds to the list.
Be sure to stop back at PC Perspective on Monday, October 21st at 2pm ET / 11am PT as we will be joined in-studio by NVIDIA's Tom Petersen to discuss G-Sync, how it was developed and the various ramifications the technology will have in PC gaming. You'll find it all on our PC Perspective Live! page on Monday but you can sign up for our "live stream mailing list" as well to get notified in advance!
NVIDIA G-Sync Live Stream
11am PT / 2pm ET - October 21st
Subject: General Tech, Graphics Cards | October 17, 2013 - 06:56 PM | Scott Michaud
Tagged: nvidia, GTX 760 OEM
A pair of new graphics cards has been announced during the first day of "The Way It's Meant To Be Played Montreal 2013," both of which are intended for system builders to integrate into their products. Both cards fall under the GeForce GTX 760 branding with the names "GeForce GTX 760 Ti (OEM)" and "GeForce GTX 760 192-bit (OEM)".
I will place the main specifications of both cards side-by-side-side with the default GeForce GTX 760 for a little bit of reference. Be sure to check out its benchmarks.
| | GTX 760 | GTX 760 192-bit (OEM) | GTX 760 Ti (OEM) |
|---|---|---|---|
| Base Clock | 980 MHz | 823 MHz | 915 MHz |
| Boost Clock | 1033 MHz | 888 MHz | 980 MHz |
| Memory Data Rate | 6.0 GT/s | 5.8 GT/s | 6.0 GT/s |
| VRAM (capacity) | 2 GB | 1.5 or 3 GB | 2 GB |
The GeForce GTX 760 line is no slouch and the GTX 760 Ti, especially, seems to be pretty close in performance to the retail product. I could see this being a respectable addition to a Steam Machine. I still cannot understand why, like the gaming bundle, these cards were not announced during the keynote speech.
Or, for that matter, why no-one seems to be reporting on them.
Subject: General Tech, Graphics Cards | October 17, 2013 - 05:36 PM | Scott Michaud
Tagged: shield, nvidia, bundle
The live stream from NVIDIA this morning was full of technologies focused on the PC gaming ecosystem, including mobile (but still PC-like) platforms. Today they also announced a holiday gaming bundle for their GeForce cards, although that missed the stream for some reason.
If you purchase a GeForce GTX 770, 780, or Titan from a participating retailer (including online), you will receive Splinter Cell: Black List, Batman: Arkham Origins, and Assassin's Creed IV: Black Flag along with a $100-off coupon for an NVIDIA SHIELD.
If, on the other hand, you purchase a GTX 760, 680, 670, 660 Ti, or 660 from a participating retailer (again, including online), you will receive Splinter Cell: Black List and Assassin's Creed IV: Black Flag along with a $50-off coupon for the NVIDIA SHIELD.
The current price at Newegg for an NVIDIA SHIELD is $299 USD; a $100 discount pushes that to $199. The $200 mark is a psychological barrier for videogame systems, below which customers tend to jump. Reaching the sub-$200 price point could be a big deal even for customers not on the fence, especially when you consider PC streaming. Could be.
Assume you were already planning on upgrading your GPU. Would you be interested in adding in an NVIDIA SHIELD for an extra $199?
Subject: General Tech, Graphics Cards | October 17, 2013 - 01:45 AM | Ryan Shrout
Tagged: video, nvidia, live blog, live
Last month it was AMD hosting the media out in sunny Hawaii for a #GPU14 press event. This week NVIDIA is hosting a group of media in Montreal for a two-day event built around "The Way It's Meant to be Played".
NVIDIA promises some very impressive software and technology demonstrations on hand and you can take it all in with our live blog and (hopefully) live stream on our PC Perspective Live! page!
It starts at 10am ET / 7am PT so join us bright and early!! And don't forget to stop by tomorrow for an even more exciting Day 2!!
Get notified when we go live!