Our Legacy's Influence
We are creatures of habit. Change is hard, and legacy systems that have been in place for a long time can determine the angle at which we attack new problems. This happens in the world of computer technology but also outside the walls of silicon, and the results can be dangerous inefficiencies that threaten to limit our advancement in those areas. Our need to adapt new technologies to existing infrastructure can often be blamed for stagnant development.
Take the development of the phone as an example. The pulse-based phone system and the rotary dial slowed the adoption of touch-tone phones and forced manufacturers to include switches to select between pulse and tone dialing on phones for decades.
Perhaps a more substantial example is that of the railroad system, which based its track gauge (the width between the rails) on transportation methods that existed before the birth of Christ. Horse-drawn carriages pulled by two horses had an axle gap of roughly 4 feet 8.5 inches, and thus the first railroads in the US were built with a track gauge of 4 feet 8.5 inches. Today, the standard rail track gauge remains 4 feet 8.5 inches despite the fact that a wider gauge would provide more stability for larger cargo loads and allow for higher-speed vehicles. But updating the existing infrastructure around the world would be so cost prohibitive that we will likely remain with that outdated standard.
What does this have to do with PC hardware, and why am I giving you an abbreviated history lesson? There are clearly examples of legacy infrastructure limiting our advancement in hardware development. Solid state drives are held back by the SATA storage interface, though we are seeing movement to faster interconnects like PCI Express to alleviate this. Some compute tasks are limited by the "infrastructure" of standard x86 processor cores, and the move to GPU compute has changed the direction of these workloads dramatically.
There is another area of technology that could be improved if we could just move past an existing way of doing things. Displays.
Subject: Graphics Cards | October 18, 2013 - 10:52 AM | Ryan Shrout
Tagged: variable refresh rate, refresh rate, nvidia, gsync, geforce, g-sync
UPDATE: I have posted a more in-depth analysis of the new NVIDIA G-Sync technology: NVIDIA G-Sync: Death of the Refresh Rate. Thanks for reading!!
UPDATE 2: ASUS has announced the G-Sync enabled version of the VG248QE will be priced at $399.
During a gaming event being held in Montreal, NVIDIA unveiled a new technology for GeForce gamers that the company is hoping will revolutionize the PC and displays. Called NVIDIA G-Sync, this new feature combines changes to the graphics driver with changes to the monitor to alter the way refresh rates and Vsync have worked for decades.
With standard LCD monitors, gamers are forced to choose between a tear-free experience by enabling Vsync or playing with substantial visual anomalies in order to get the best and most efficient frame rates. G-Sync changes that by allowing a monitor to display refresh rates other than 60 Hz, 120 Hz, 144 Hz, etc. without the horizontal tearing normally associated with turning off Vsync. Essentially, G-Sync allows a properly equipped monitor to run at a variable refresh rate, which will improve the experience of gaming in interesting ways.
This technology will be available soon on Kepler-based GeForce graphics cards but will require a monitor with support for G-Sync; not just any display will work. The first launch monitor is a variation on the very popular 144 Hz ASUS VG248QE 1920x1080 display, and as we saw with 3D Vision, supporting G-Sync will require licensing and hardware changes. In fact, NVIDIA claims that the new logic inside the panel's controller is NVIDIA's own design - so you can obviously expect this to only function with NVIDIA GPUs.
DisplayPort is the only input option currently supported.
It turns out NVIDIA will actually be offering retrofit kits for current owners of the VG248QE at a yet-to-be-disclosed cost. The first retail sales of G-Sync will ship as a monitor plus retrofit kit, as production was just a bit behind.
Using a monitor with a variable refresh rate allows the game to display 55 FPS on the panel at 55 Hz without any horizontal tearing. It can also display 133 FPS at 133 Hz without tearing. Anything below the 144 Hz maximum refresh rate of this monitor will run at full speed without the tearing associated with the lack of vertical sync.
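The effect above is easier to see with a little arithmetic. Here is a minimal sketch (my own illustration, not NVIDIA's implementation) of why a fixed-refresh panel with Vsync quantizes frame delivery to multiples of the refresh period, while a variable-refresh panel can show a frame as soon as it is rendered:

```python
import math

def fixed_refresh_display_ms(render_ms, refresh_hz=60):
    """With Vsync on a fixed-refresh panel, a finished frame must wait
    for the next refresh tick, so display intervals are rounded up to
    whole multiples of the refresh period."""
    period = 1000.0 / refresh_hz
    ticks = max(math.ceil(render_ms / period), 1)
    return ticks * period

def variable_refresh_display_ms(render_ms, max_hz=144):
    """With a variable-refresh panel (the G-Sync idea), the panel
    refreshes when the frame is ready, bounded only by the panel's
    maximum refresh rate."""
    min_period = 1000.0 / max_hz
    return max(render_ms, min_period)

# A frame that takes 18 ms to render (about 55 FPS):
# on a fixed 60 Hz panel it misses the 16.7 ms tick and waits for the
# next one, displaying every 33.3 ms (an effective 30 FPS), while a
# variable-refresh panel displays it after 18 ms (the full 55 FPS).
print(fixed_refresh_display_ms(18))     # ~33.3 ms
print(variable_refresh_display_ms(18))  # 18 ms
```

The quantization in the first function is exactly why games with Vsync enabled tend to snap between 60, 30, and 20 FPS on a 60 Hz panel, and why a variable refresh rate smooths out everything in between.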
The technology that NVIDIA is showing here is impressive when seen in person, and that is really the only way to understand the difference. High-speed cameras and captures will help, but much like 3D Vision, this is a feature that needs to be seen to be appreciated. How users will react to that roadblock remains to be seen.
Features like G-Sync show the gaming world that, without the restrictions of consoles, there are revolutionary steps that can be taken to maintain the PC gaming advantage well into the future. 4K displays were a recent example, and now NVIDIA G-Sync adds to the list.
Be sure to stop back at PC Perspective on Monday, October 21st at 2pm ET / 11am PT as we will be joined in-studio by NVIDIA's Tom Petersen to discuss G-Sync, how it was developed, and the various ramifications the technology will have in PC gaming. You'll find it all on our PC Perspective Live! page on Monday, but you can sign up for our "live stream mailing list" as well to get notified in advance!
NVIDIA G-Sync Live Stream
11am PT / 2pm ET - October 21st
Subject: General Tech, Graphics Cards | October 17, 2013 - 06:56 PM | Scott Michaud
Tagged: nvidia, GTX 760 OEM
A pair of new graphics cards have been announced during the first day of "The Way It's Meant To Be Played Montreal 2013", both of which are intended for system builders to integrate into their products. Both cards fall under the GeForce GTX 760 branding with the names "GeForce GTX 760 Ti (OEM)" and "GeForce GTX 760 192-bit (OEM)".
I will place the main specifications of both cards side-by-side with the default GeForce GTX 760 for a little bit of reference. Be sure to check out its benchmark.
| | GTX 760 | GTX 760 192-bit (OEM) | GTX 760 Ti (OEM) |
|---|---|---|---|
| Base Clock | 980 MHz | 823 MHz | 915 MHz |
| Boost Clock | 1033 MHz | 888 MHz | 980 MHz |
| Memory Data Rate | 6.0 GT/s | 5.8 GT/s | 6.0 GT/s |
| vRAM (capacity) | 2 GB | 1.5 or 3 GB | 2 GB |
The GeForce 760 is no slouch, and the GTX 760 Ti in particular seems to be pretty close in performance to the retail product. I could see this being a respectable addition to a Steam Machine. I still cannot understand why, like the gaming bundle, these cards were not announced during the keynote speech.
Or, for that matter, why no-one seems to be reporting on them.
Subject: General Tech, Graphics Cards | October 17, 2013 - 05:36 PM | Scott Michaud
Tagged: shield, nvidia, bundle
The live stream from NVIDIA, this morning, was full of technologies focused around the PC gaming ecosystem including mobile (but still PC-like) platforms. Today they also announced a holiday gaming bundle for their GeForce cards although that missed the stream for some reason.
If you purchase a GeForce GTX 770, 780, or Titan from a participating retailer (including online), you will receive Splinter Cell: Blacklist, Batman: Arkham Origins, and Assassin's Creed IV: Black Flag along with a $100-off coupon for an NVIDIA SHIELD.
If, on the other hand, you purchase a GTX 760, 680, 670, 660 Ti, or 660 from a participating retailer (again, including online), you will receive Splinter Cell: Blacklist and Assassin's Creed IV: Black Flag along with a $50-off coupon for the NVIDIA SHIELD.
The current price at Newegg for an NVIDIA SHIELD is $299 USD, so a $100 discount pushes the price down to $199. For videogame systems, $200 is a psychological barrier under which customers tend to jump. Reaching a sub-$200 price point could be a big deal even for customers not on the fence, especially when you consider PC streaming. Could be.
Assume you were already planning on upgrading your GPU. Would you be interested in adding in an NVIDIA SHIELD for an extra $199?
Subject: General Tech, Graphics Cards | October 17, 2013 - 01:45 AM | Ryan Shrout
Tagged: video, nvidia, live blog, live
Last month it was AMD hosting the media out in sunny Hawaii for a #GPU14 press event. This week NVIDIA is hosting a group of media in Montreal for a two-day event built around "The Way It's Meant to be Played".
NVIDIA promises to have some very impressive software and technology demonstrations on hand, and you can take it all in with our live blog and (hopefully) live stream on our PC Perspective Live! page!
It starts at 10am ET / 7am PT so join us bright and early!! And don't forget to stop by tomorrow for an even more exciting Day 2!!
Subject: General Tech, Graphics Cards, Systems | October 10, 2013 - 06:59 PM | Scott Michaud
Tagged: amd, nvidia, Intel, Steam Machine
This should be little-to-no surprise for viewers of our podcast, as this story was discussed there, but Valve has confirmed AMD and Intel graphics are compatible with Steam Machines. Doug Lombardi of Valve commented by email to what appears to be multiple outlets, including Forbes and Maximum PC.
Last week, we posted some technical specs of our first wave of Steam Machine prototypes. Although the graphics hardware that we've selected for the first wave of prototypes is a variety of NVIDIA cards, that is not an indication that Steam Machines are NVIDIA-only. In 2014, there will be Steam Machines commercially available with graphics hardware made by AMD, NVIDIA, and Intel. Valve has worked closely together with all three of these companies on optimizing their hardware for SteamOS, and will continue to do so into the foreseeable future.
Ryan and the rest of the podcast crew found the whole situation "odd." They could not understand why AMD referred the press to Doug Lombardi rather than circulate a canned statement from him. It was also strange that NVIDIA had an exclusive on the beta program when AMD hardware will be commercially available in 2014.
As I have said in the initial post: for what seems to be deliberate non-committal to a specific hardware spec, why limit to a single graphics provider?
Subject: Editorial, General Tech, Graphics Cards | October 10, 2013 - 03:28 PM | Ryan Shrout
Tagged: podcast, nvidia, contest, batman arkham origins
UPDATE: We picked our winner for week 1 but now you can enter for week 2!!! See the new podcast episode listed below!!
Back in August NVIDIA announced that they would be teaming up with Warner Bros. Interactive to include copies of the upcoming Batman: Arkham Origins game with select NVIDIA GeForce graphics cards. While that's great and all, wouldn't you rather get one for free next week from PC Perspective?
Great, you're in luck! We have a handful of keys to give out to listeners and viewers of the PC Perspective Podcast. Here's how you enter:
- Listen to or watch episode #272 of the PC Perspective Podcast and listen for the "secret phrase" as mentioned in the show!
- Subscribe to our RSS feed for the podcast or subscribe to our YouTube channel.
- Fill out the form at the bottom of this podcast page with the "secret phrase" and you're entered!
I'll draw a winner before the next podcast and announce it on the show! We'll give away one copy in each of the next two weeks! Our thanks goes to NVIDIA for supplying the Batman: Arkham Origins keys for this contest!!
No restrictions on winning, so good luck!!
Subject: General Tech | October 9, 2013 - 01:05 PM | Jeremy Hellstrom
Tagged: amd, nvidia, price cuts, gtx 760
DigiTimes has broken the news that NVIDIA will be cutting prices on many of their cards in reaction to AMD's new GPU family. Currently the lowest priced GTX 660 is $150 after MIR, and a GTX 650 Ti Boost can be had for $110. We don't have any information as to how they will be updating the GTX 760 - likely faster clocks, but we can hope for something a little more adventurous. The GTX 760 can be had for $250 right now, but you should hold off to see what the new model offers and what it does to the price of the current model.
"Nvidia has offered price cuts for several of its graphics cards including the GTX 660 and GTX 650Ti Boost and will soon release an upgraded GTX 760, targeting AMD's Radeon R9 280X, according to sources from graphics card makers."
Here is some more Tech News from around the web:
- Selling phones? Nonsense, BlackBerry is all about THE CLOUD @ The Register
- Pokewithastick, an Arduino programmable web-logger/server @ Hack a Day
- Android adware that MUST NOT BE NAMED threatens MILLIONS @ The Register
- Dangerous VBulletin Exploit In the Wild @ Slashdot
- Mad Catz is taking pre-orders for the Mojo games console @ The Inquirer
- Sony Alpha NEX-5T Review @ TechReviewSource
Subject: General Tech | October 7, 2013 - 01:46 PM | Jeremy Hellstrom
Tagged: nvidia, linux, microsoft, open source
If you haven't heard the accusations flying over the possible scenarios that led up to Origin PC dropping AMD cards from all their machines, you can catch up at The Tech Report. They keep any speculation to a minimum, unlike other sites, but the key point is the claims of overheating and stability issues, something that apparently only Origin has encountered. If they had stuck to mentioning the frame pacing in Crossfire and the 4K/multi-monitor issue, it would be understandable that they not sell AMD cards in systems designed for that usage, but dropping them altogether is enough to start rumours and conspiracy theories across the interwebs. Winning a place in the Steam Machine was great for NVIDIA, but at no time did they imply that AMD was unworthy; they merely didn't win the contract.
Today some oil was tossed on the fire with the revelation that NVIDIA is specifically limiting the functionality of its hardware on Linux. Just after we praised their release of documentation for Nouveau, the open-source driver, we find out from a post at The Inquirer that NVIDIA limits the number of monitors used in Linux to three so as not to outdo their functionality in Windows. For a brief moment it seemed that NVIDIA was willing to cooperate with the open source and Linux communities, but apparently that moment is all we will have, and once again NVIDIA proves that it is willing to bow to pressure from Microsoft.
"According to a forum poster at the Nvidia Developer Zone, the v310 version of the drivers for Basemosaic has reduced the number of monitors a user can connect simultaneously to three."
Here is some more Tech News from around the web:
- Cisco, Google and SAP may buy BlackBerry's bits: report @ The Register
- Toshiba unveils 'lightest and thinnest' workstation and a raft of business ultrabooks @ The Inquirer
- Microsoft claims its Surface 2 tablets are 'selling out' without spilling figures @ The Inquirer
- Down with Unicode! Why 16 bits per character is a right pain in the ASCII @ The Register
- NSA using Firefox flaw to snoop on Tor users @ The Register
- LED Costumes and Clothing @ Hack a Day
- Witnessing The League of Legends Season 3 World Championship Finals @ Legit Reviews
- OZONE Gaming Worldwide Joint Giveaway @ NikKTech
Introduction and Design
As we’re swimming through the veritable flood of Haswell refresh notebooks, we’ve stumbled across the latest in a line of very popular gaming models: the ASUS G750JX-DB71. This notebook is the successor to the well-known G75 series, which topped out at an Intel Core i7-3630QM with NVIDIA GeForce GTX 670MX dedicated graphics. Now, ASUS has jacked up the specs a little more, including the latest 4th-gen CPUs from Intel as well as 700-series NVIDIA GPUs.
Our ASUS G750JX-DB71 test unit features the following specs:
Of course, the closest comparison to this unit is the most recently-reviewed MSI GT60-2OD-026US, which featured nearly identical specifications, apart from a 15.6” screen, a better GPU (a GTX 780M with 4 GB GDDR5), and a slightly different CPU (the Intel Core i7-4700MQ). In case you’re wondering what the difference is between the ASUS G750JX’s Core i7-4700HQ and the GT60’s i7-4700MQ, it’s very minor: the HQ features a slightly faster integrated graphics Turbo frequency (1.2 GHz vs. 1.15 GHz) and supports Intel Virtualization Technology for Directed I/O (VT-d). Since the G750JX doesn’t support Optimus, we won’t ever be using the integrated graphics, and unless you’re doing a lot with virtual machines, VT-d isn’t likely to offer any benefits, either. So for all intents and purposes, the CPUs are equivalent—meaning the biggest overall performance difference (on the spec sheet, anyway) lies with the GPU and the storage devices (where the G750JX offers more solid-state storage than the GT60). It’s no secret that the MSI GT60 burned up our benchmarks—so the real question is, how close is the ASUS G750JX to its pedestal, and if the differences are considerable, are they justified?
At an MSRP of around $2,000 (though it can be found for around $100 less), the ASUS G750JX-DB71 competes directly with the equivalently-priced MSI GT60. The question, of course, is whether it truly competes. Let’s find out!