Subject: Displays | December 16, 2013 - 09:11 AM | Ryan Shrout
Tagged: video, vg248qe, nvidia, gsync, g-sync, asus
It looks like some G-Sync ready monitors are going to be on sale starting today, though perhaps not from the outlets you would have expected. NVIDIA let me know last night that they are working with partners, including ASUS obviously, to make a small number of pre-modified ASUS VG248QE G-Sync monitors available for purchase. These are the same monitors we used in our recent G-Sync preview story, so check that article out if you want our opinions on the display and the technology.
Those people selling the displays? Digital Storm, Falcon Northwest, Maingear, and Overlord Computer. This creates some unfortunate requirements on potential buyers. For example, Falcon Northwest is only selling the panels to users that either are buying a new Falcon PC or already own a Falcon custom system. Digital Storm on the other hand WILL sell the monitor on its own or allow you to send in your VG248QE monitor to have the upgrade service done for you. The monitor alone will sell for $499 while the upgrade price (with module included) is $299.
This distribution model for G-Sync technology likely isn't what users wanted or expected. After all, we were promised upgrade kits for users of that specific ASUS VG248QE display, and we still do not have details on how NVIDIA plans to sell or distribute them. Being able to purchase the display from the resellers above is at least SOMETHING before the holiday, but it really isn't the way we would like to see G-Sync showcased. NVIDIA needs to get these products into the hands of gamers sooner rather than later.
NVIDIA also prepared a new video to showcase G-Sync. Unlike other marketing videos, this one wasn't placed on YouTube because playback at a fixed 60 FPS is a strict requirement, something YouTube can't do, or at least can't do reliably. For the demonstration to work correctly, you need to set your display to a 60 Hz refresh rate and use a video player capable of maintaining steady 60 FPS decoding.
To grab a copy of this video, you can use the link right here to download the file directly from Mega.co.nz. It should help demonstrate the effects of using a G-Sync enabled display for users who don't have a chance to see one in person.
Oh, and I know that LOTS of you have been clamoring for information on how you can get your hands on one of those DIY G-Sync upgrade kits for yourself and I have some good news. Though I can't tell you where to buy one or how much it will cost, I can offer you one of 5 FREE G-Sync ASUS VG248QE upgrade kits through a giveaway we are hosting at PC Perspective! Check out this page for the details!!
Quality time with G-Sync
Readers of PC Perspective will already know quite a lot about NVIDIA's G-Sync technology. When it was first unveiled in October we were at the event and were able to listen to NVIDIA executives, product designers and engineers discuss and elaborate on what it is, how it works and why it benefits gamers. This revolutionary new take on how displays and graphics cards talk to each other enables a new class of variable refresh rate monitors that offer the smoothness advantages of having V-Sync off, while delivering the tear-free images normally reserved for gamers who enable V-Sync.
NVIDIA's Prototype G-Sync Monitor
We were lucky enough to be at NVIDIA's Montreal tech day while John Carmack, Tim Sweeney and Johan Andersson were on stage discussing NVIDIA G-Sync among other topics. All three developers were incredibly excited about G-Sync and what it meant for gaming going forward.
Also on that day, I published a somewhat detailed editorial that dug into the background of V-Sync technology, why the 60 Hz refresh rate exists and why the system in place today is flawed. This led up to an explanation of how G-Sync works, including its integration via extended VBLANK signals, and detailed how NVIDIA is enabling the graphics card to take control over the entire display pipeline.
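The core timing problem described above is simple to sketch: with V-Sync on, a finished frame must wait for the next fixed refresh boundary, while a variable refresh display can present it as soon as it is ready. A minimal Python sketch of that idea (the frame times here are made up for illustration, not measured data):

```python
import math

# At a fixed 60 Hz, the display refreshes every 1000/60 = ~16.67 ms.
REFRESH_INTERVAL_MS = 1000 / 60

def vsync_present_time(render_done_ms):
    """With V-Sync on, a frame is shown at the next fixed refresh boundary."""
    return math.ceil(render_done_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

def gsync_present_time(render_done_ms):
    """With variable refresh, the display scans out as soon as the frame is ready."""
    return render_done_ms

# A frame finishing at 17 ms just misses the first refresh and waits
# almost a full extra interval with V-Sync -- visible as stutter.
print(vsync_present_time(17.0))  # ~33.3 ms
print(gsync_present_time(17.0))  # 17.0 ms
```

The gap between those two numbers is exactly the stutter G-Sync eliminates: presentation tracks the render, not the clock.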
In reality, if you want the best explanation of G-Sync, how it works and why it is a stand-out technology for PC gaming, you should take the time to watch and listen to our interview with NVIDIA's Tom Petersen, one of the primary inventors of G-Sync. In this video we go through quite a bit of technical explanation of how displays work today, and how the G-Sync technology changes gaming for the better. It is a 1+ hour long video, but I selfishly believe that it is the most concise and well put together collection of information about G-Sync for our readers.
The story today is more about extensive hands-on testing with the G-Sync prototype monitors. The displays we received this week were modified versions of the 144 Hz ASUS VG248QE gaming panels, the same ones that should, in theory, be upgradeable by end users sometime in the future. These monitors are 1920x1080 TN panels, and though they offer incredibly high refresh rates, they aren't usually regarded as the highest image quality displays on the market. However, the story of what you get with G-Sync is really about stutter (or lack thereof), tearing (or lack thereof), and a better overall gaming experience for the user.
Subject: General Tech | December 12, 2013 - 01:35 AM | Ken Addison
Tagged: z87, xfire, video, shield, R9 290X, podcast, pcper, nvidia, litecoin, grid, frame rating, eyefinity, crossfire, amd
PC Perspective Podcast #280 - 12/12/2013
Join us this week as we discuss the NVIDIA GRID Beta, R9 290X Custom Coolers, 2TB SSDs and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Josh Walrath, Allyn Malventano and Scott Michaud
Streaming games straight from NVIDIA
Over the weekend NVIDIA released a December update for the SHIELD Android mobile gaming device that included a very interesting, and somewhat understated, new feature: Beta support for NVIDIA GRID.
You have likely heard of GRID before; NVIDIA has been pushing it as part of the company's vision of bringing GPU computing to every facet and market. GRID was aimed at creating GPU-based server farms to enable mobile, streaming gaming for users across the country and across the world. While initially NVIDIA only talked about working with partners to launch streaming services based on GRID, they have obviously changed their tune slightly with this limited release.
If you own a SHIELD, and install the most recent platform update, you'll find a new icon in your NVIDIA SHIELD menu called GRID Beta. The first time you start this new application, it will attempt to measure your bandwidth and latency to offer up an opinion on how good your experience should be. NVIDIA is asking for at least 10 Mbps of sustained bandwidth, and wants round trip latency under 60 ms from your location to their servers.
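NVIDIA's stated thresholds, at least 10 Mbps of sustained bandwidth and a round trip under 60 ms, are easy to check against your own connection numbers. A hypothetical helper along those lines (this is our sketch, not NVIDIA's actual test logic):

```python
# GRID Beta guidance: >= 10 Mbps sustained bandwidth and
# <= 60 ms round-trip latency to NVIDIA's servers.
MIN_BANDWIDTH_MBPS = 10.0
MAX_LATENCY_MS = 60.0

def grid_ready(bandwidth_mbps: float, latency_ms: float) -> str:
    """Return a rough verdict given measured bandwidth and latency."""
    if bandwidth_mbps >= MIN_BANDWIDTH_MBPS and latency_ms <= MAX_LATENCY_MS:
        return "good"
    if bandwidth_mbps >= MIN_BANDWIDTH_MBPS or latency_ms <= MAX_LATENCY_MS:
        return "marginal"
    return "poor"

print(grid_ready(25.0, 40.0))   # good
print(grid_ready(25.0, 110.0))  # marginal: plenty of bandwidth, too much latency
```

Note that latency is the harder constraint to fix: you can buy more bandwidth, but distance to Northern California is distance.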
Currently, servers are ONLY located in Northern California, so the further out you are, the more likely you are to run into problems. However, some testing in Kentucky and Ohio resulted in very playable gaming scenarios, though we did run into some connection problems that might be load-based or latency-based.
After the network setup portion, users are shown eight games they can try: Darksiders, Darksiders II, Street Fighter X Tekken, Street Fighter IV, Alan Wake, The Witcher 2, Red Faction: Armageddon and Trine 2. You are free to play them at no charge during the beta, though you can be sure they will be removed and erased at some point; just a reminder. Saves work well, and we were able to save and resume games of Darksiders II on GRID easily and quickly.
Starting up the game was fast, about on par with starting up a game on a local PC, though obviously the server is loading it in the background. Once the game is up and running, you are met with some button mapping information provided by NVIDIA for that particular game (great addition) and then you jump into the menus as if you were running it locally.
Subject: General Tech, Graphics Cards | December 2, 2013 - 03:16 PM | Scott Michaud
Tagged: nvidia, ShadowPlay
They grow up so fast these days...
GeForce Experience is NVIDIA's software package, often bundled with their driver updates, to optimize the experience of their customers. This could be adding interesting features, such as GPU-accelerated game video capture, or just recommending graphics settings for popular games.
Version 1.8 adds many desired features lacking from the previous version. I always found it weird that GeForce Experience would recommend a single good baseline of settings for a game, and set it for you, but force you to then go into the game and tweak from there. It would be nice to see multiple presets, but that is not what we get; instead, we are able to tweak the settings from within GeForce Experience. The baseline tries to provide a solid 40 FPS at the most computationally difficult moments. You can then tune the familiar performance-versus-quality slider from there.
You are also able to set resolutions up to 3840x2160 and select whether you would like to play in windowed (including "borderless") mode.
Also, with ShadowPlay, Windows 7 users will be able to "shadow" the last 20 minutes of gameplay like their Windows 8 neighbors. You will also be able to combine your microphone audio with the in-game audio should you select it. I can see the latter feature being very useful for shoutcasters; apparently it allows capturing VoIP communication and not just your microphone itself.
Still no streaming to Twitch.tv yet, but it is coming.
For now, you can download GeForce Experience from NVIDIA's GeForce website. If you want to read a little more detail about it, first, you can check out their (much longer) blog post.
Subject: General Tech, Graphics Cards | November 22, 2013 - 06:26 PM | Scott Michaud
Tagged: nvidia, jpr, amd
Jon Peddie Research (JPR) reports an 8% rise in quarter-to-quarter shipments of graphics add-in boards (AIBs) for NVIDIA and a decrease of 3% for AMD. This reverses the story from last quarter, when NVIDIA lost 8% and AMD gained. In all, NVIDIA holds well over half the market (64.5%).
JPR attributed AMD's gains last quarter to consumers who added a discrete graphics solution to systems that already contain an integrated product. SLI and CrossFire were noted, but pale in comparison. I expect Never Settle to have contributed heavily. This quarter, the free games initiative was reduced with the new GPU lineup; for a decent amount of time, nothing was offered.
At the same time, NVIDIA launched the GTX 780 Ti and their own game bundle. While I do not believe this promotion was as popular as AMD's Never Settle, it probably helped. That said, it is still probably too early to tell whether the Battlefield 4 promotion (or Thief's addition to Silver Tier) will help them regain some ground.
The other vendors, Matrox and S3, were "flat to declining". Their story is the same as last quarter: they shipped fewer than (maybe far fewer than) 7,000 units. On the whole, add-in board shipments are rising from last quarter; that quarter, however, was a 5.4% drop from the one before.
Subject: General Tech, Graphics Cards, Systems | November 21, 2013 - 09:47 PM | Scott Michaud
Tagged: nvidia, tesla, supercomputing
GPUs are very efficient in terms of operations per watt. Their architecture is best suited to gigantic bundles of similar calculations (such as a set of operations applied to each entry of a large blob of data). These are the tasks that take up the most computation time, especially, not surprisingly, in 3D graphics (where you need to do something to every pixel, fragment, vertex, etc.). The same pattern is also very relevant for scientific calculations, financial and other "big data" services, weather prediction, and so forth.
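The "same operation on every element" pattern is easy to illustrate. Here is a toy Python example that brightens every pixel of a tiny grayscale image; each pixel is independent of the others, which is exactly the kind of embarrassingly parallel loop a GPU maps onto thousands of cores:

```python
def brighten(pixels, amount):
    """Apply the same clamped add to every pixel. No element depends on
    any other, so a GPU could process all of them in parallel."""
    return [min(255, p + amount) for p in pixels]

image = [10, 120, 200, 250]   # a tiny 1-D grayscale "image"
print(brighten(image, 20))    # [30, 140, 220, 255]
```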
Tokyo Tech KFC achieves over 4 GFLOPs per watt of power draw from the 160 Tesla K20X GPUs in its cluster. That is about 25% more calculations per watt than the current leader of the Green500 (the CINECA Eurora System in Italy, at 3.208 GFLOPs/W).
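That "about 25%" figure checks out arithmetically:

```python
# Tokyo Tech KFC vs. the current Green500 leader (CINECA Eurora).
kfc_gflops_per_watt = 4.0      # "over 4 GFLOPs per watt"
eurora_gflops_per_watt = 3.208

improvement = kfc_gflops_per_watt / eurora_gflops_per_watt - 1
print(f"{improvement:.1%}")    # 24.7% -- roughly the 25% cited
```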
One interesting trait: this supercomputer is cooled by oil immersion. NVIDIA offers passively cooled Tesla cards which, as I understand it, are well suited to this kind of fluid system. I am fairly certain they remove all of the fans before dunking the servers (I had figured they would be left on).
By the way, was it intentional to name computers dunked in giant vats of heat-conducting oil, "KFC"?
Intel has done a similar test, which we reported on last September, submerging numerous servers for over a year. Another benefit of being green is that you are not nearly as concerned about air conditioning.
NVIDIA is actually taking it to the practical market with another nice supercomputer win.
Other NVIDIA Supercomputing News:
Subject: General Tech, Graphics Cards | November 18, 2013 - 03:33 PM | Scott Michaud
Tagged: tesla, nvidia, K40, GK110b
The Tesla K20X has ruled NVIDIA's headless GPU portfolio for quite some time. The part is based on the GK110 chip with 192 shader cores disabled, like the GeForce Titan, and achieves 3.9 TeraFLOPs of compute performance (1.31 TeraFLOPs in double precision). Also like the Titan, the K20X offers 6GB of memory.
The Tesla K40
So the layout was basically the following: GK104 ruled the gamer market except for the, in hindsight, oddly-positioned GeForce Titan which was basically a Tesla K20X without a few features like error correction (ECC). The Quadro K6000 was the only card to utilize all 2880 CUDA cores.
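The core counts above line up cleanly: GK110 carries 2880 CUDA cores in total, and the K20X disables 192 of them. Its single-precision rating then follows from cores × clock × 2 FLOPs per cycle (the 732 MHz core clock used here is the K20X's published base clock, included as an assumption, not stated in this article):

```python
FULL_GK110_CORES = 2880
DISABLED_CORES = 192
K20X_CLOCK_GHZ = 0.732    # assumed K20X core clock

active = FULL_GK110_CORES - DISABLED_CORES   # cores actually enabled
tflops = active * K20X_CLOCK_GHZ * 2 / 1000  # fused multiply-add = 2 FLOPs/cycle
print(active, round(tflops, 2))              # 2688 3.94
```

Which matches the ~3.9 TeraFLOPs figure quoted for the K20X, and explains why enabling the last 192 cores (plus clocks) buys the K40 its headroom.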
Then, at the recent G-Sync event, NVIDIA CEO Jen-Hsun Huang announced the GeForce GTX 780Ti. This card uses the GK110b processor and incorporates all 2880 CUDA cores albeit with reduced double-precision performance (for the 780 Ti, not for GK110b in general). So now we have Quadro and GeForce with the full power Kepler, your move Tesla.
And they did: the Tesla K40 launched this morning, and it brings more than just extra cores.
A brief overview
The GeForce launch was famous for its inclusion of GPU Boost, a feature absent from the Tesla line. It turns out that NVIDIA was paying attention to the feature but wanted to include it in a way that suited data centers. GeForce cards boost based on the status of the card, its temperature or its power draw. This is apparently unsuitable for data centers, which want every unit operating at very similar performance levels. The Tesla K40 has a base clock of 745 MHz but gives the data center two boost clocks that can be set manually: 810 MHz and 875 MHz.
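Because those boost states are fixed rather than opportunistic, the headroom over base is entirely predictable; a quick calculation using the clocks above:

```python
base_mhz = 745
boost_states_mhz = [810, 875]   # the two manually selectable Tesla K40 boost clocks

for boost in boost_states_mhz:
    print(f"{boost} MHz: +{boost / base_mhz - 1:.1%} over base")
# 810 MHz is roughly +8.7% over base; 875 MHz is roughly +17.4%
```

Every K40 set to the same boost state runs at the same clock, which is precisely the uniformity data centers are after.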
Relative performance benchmarks
The Tesla K40 also doubles the amount of RAM to 12GB. This allows the GPU to work on larger data sets without streaming data in from system memory, or worse.
There is currently no public information on pricing for the Tesla K40 but it is available starting today. What we do know are the launch OEM partners: ASUS, Bull, Cray, Dell, Eurotech, HP, IBM, Inspur, SGI, Sugon, Supermicro, and Tyan.
If you are interested in testing out a K40, NVIDIA has remotely hosted clusters that your company can sign up for at the GPU Test Drive website.
Subject: General Tech | November 15, 2013 - 05:54 PM | Jeremy Hellstrom
Tagged: shield, nvidia, gamestream, Android
Neoseeker traded in their Star Wars Limited Edition PSP for an NVIDIA SHIELD to see the evolution of portable gaming in action. It was love at first sight, from the design of the box it came in to the shape of the actual device. Getting the most out of it involved changing some habits; years of touchscreen usage were working against them when navigating with the D-pad, but that was quickly overcome as they became accustomed to the device. Once they got comfortable with SHIELD and tried out both GameStream and Console Mode, it was no longer possible to separate them from NVIDIA's new toy, and it became a permanent fixture, much like their cellphones. At launch this device was impressive, and as people continue to use it and develop new applications it will only get better.
"Through SHIELD, NVIDIA offers a new approach to Android gaming by providing an all-in-one platform combining beautiful display and console-grade controller."
Here are some more Mobile articles from around the web:
- Dell Alienware 14 @ The Inquirer
- HP Chromebook 11 @ The Inquirer
- Thermaltake Massive 14² Notebook Cooler @ Funky Kit
- Evercool AIOLUS Notebook Cooler Review @ OCC
- Thermaltake Massive SP Notebook Cooler @ Funky Kit
- Cooler Master CM Storm SF-17 Gaming Notebook Cooler Review @ Madshrimps
- iPad Air review @ Bjorn3D
- Sony Xperia Z1 @ Techspot
- iconBIT NetTAB MERCURY Q7 (NT-3602M) Smartphone Review @ Madshrimps
- Nexus 5 @ The Inquirer
- ASUS Nexus 7 Gen 2 Android Tablet Review @ TechwareLabs
- Hands-on with the Lenovo Yoga Tablet @ Hardware.info
- iPhone 5S vs iPhone 5 head to head @ The Inquirer
NVIDIA Tegra Note Program
Clearly, NVIDIA’s Tegra line has not been as successful as the company had hoped and expected. The move for the discrete GPU giant into the highly competitive world of the tablet and phone SoCs has been slower than expected, and littered with roadblocks that were either unexpected or that NVIDIA thought would be much easier to overcome.
The truth is that this was always a long play for the company; success was never going to come overnight, and anyone who thought otherwise was deluding themselves. Part of it has to do with the development cycle of the ARM ecosystem. NVIDIA is used to a rather quick development, production, marketing and sales pattern thanks to its time in high performance GPUs, but the SoC world is quite different. By the time a device based on a Tegra chip reaches the retail channel, it has gone through an OEM development cycle, an NVIDIA SoC development cycle and even an ARM Cortex CPU development cycle. The result is an extended time frame from initial product announcement to retail availability.
Partly due to this, and partly due to limited design wins in the mobile markets, NVIDIA has started to develop internally designed end-user devices that utilize its Tegra SoC processors. This has the benefit of being much faster to market; while most SoC vendors develop reference platforms during the normal course of business, NVIDIA is essentially going to perfect and productize them.