Subject: Graphics Cards | May 2, 2014 - 01:29 AM | Tim Verry
Tagged: titan z, nvidia, gpgpu, gk110, dual gpu, asus
NVIDIA unveiled the GeForce GTX TITAN Z at the GPU Technology Conference last month, and the cards will be for sale soon from various partners. ASUS will be one of the first AIB partners to offer a reference TITAN Z.
The ASUS GTX TITAN Z pairs two full GK110-based GPUs with 12GB of GDDR5 memory. The graphics card houses a total of 5,760 CUDA cores, 480 texture mapping units (TMUs), and 96 ROPs. Each GK110 GPU interfaces with 6GB of GDDR5 memory via a 384-bit bus. ASUS is using reference clock speeds with this card, which means 705 MHz base and up to 876 MHz GPU Boost for the GPUs and 7.0 GHz for the memory.
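Those memory figures imply the usual GDDR5 bandwidth math. As a quick sanity check (a back-of-envelope sketch using only the numbers above, not figures from NVIDIA's spec sheet):

```python
# Per-GPU memory bandwidth from the article's figures:
# 384-bit bus, 7.0 GHz effective (data-rate) memory clock.
bus_width_bits = 384
effective_clock_ghz = 7.0  # GT/s data rate

bytes_per_transfer = bus_width_bits / 8                    # 48 bytes per transfer
bandwidth_gbs = bytes_per_transfer * effective_clock_ghz   # GB/s per GPU

print(f"Per-GPU bandwidth: {bandwidth_gbs:.0f} GB/s")          # 336 GB/s
print(f"Card total (2 GPUs): {2 * bandwidth_gbs:.0f} GB/s")    # 672 GB/s
```

That 336 GB/s per GPU matches the GTX TITAN Black, which makes sense given the shared memory configuration.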
For comparison, the dual-GPU TITAN Z is effectively two GTX TITAN Black cards on a single PCB. However, the TITAN Black runs at 889 MHz base and up to 980 MHz GPU Boost. A hybrid water cooling solution might have allowed NVIDIA to maintain that clock speed advantage, but doing so would compromise the only advantage the TITAN Z has over using two (much cheaper) TITAN Blacks in a workstation or server: card density. A small hit in clock speed will be a manageable sacrifice for the target market, I believe.
The ASUS GTX TITAN Z has a 375W TDP and is powered by two 8-pin PCI-E power connectors. The new flagship dual GPU NVIDIA card has an MSRP of $3,000 and should be available in early May.
Subject: General Tech | May 1, 2014 - 02:47 PM | Ken Addison
Tagged: nvidia, shield, Portal, GTC, Cake, lie
Sometimes I feel like this job just keeps getting stranger and stranger. Today is no exception.
After receiving just a tracking number and no additional information from NVIDIA earlier this week, the mystery package finally arrived today. Upon initial inspection we had no idea what to expect.
When we opened the box, we were greeted by a polystyrene cooler with the logo of Bake Me a Wish, which only served to confuse us more.
As we opened the cooler, and the subsequent box inside of it, things started to make more sense.
Inside the box, we were greeted by a chocolate cake, accompanied by a card from NVIDIA.
As you may remember, at this year's GPU Technology Conference NVIDIA announced that it had ported Valve's Portal to Android and would be releasing it for SHIELD. Today we were greeted with a reminder of that, and the message that we should be able to try it for ourselves.
A teaser from this year's GTC Keynote
While we can't talk about our experiences with Portal just yet, stay tuned to PC Perspective for more coverage of the NVIDIA SHIELD and Portal very soon!
Subject: General Tech | May 1, 2014 - 02:27 PM | Jeremy Hellstrom
Tagged: nvidia, charity, falcon northwest, phil scholtz
In January of this year, an NVIDIA employee named Philip Scholz died while successfully saving the life of a man who was on the tracks at a Caltrain station in Santa Clara. His family started the Philip Scholz Memorial Foundation to provide scholarship opportunities to needs-based youth and promote physical activity, especially baseball. The Foundation has been taking donations directly, but now his friends at NVIDIA have raised the bar and are auctioning off an impressive system from Falcon Northwest with the proceeds going to the charity.
Check out the video below and consider putting your money towards a great cause!
We’re now auctioning this amazing system to raise funds for the Philip Scholz Memorial Foundation, honoring an NVIDIA brother who was tragically killed earlier this year when he heroically removed an individual from train tracks and was hit.
The Foundation bearing Phil’s name provides scholarships to needs-based youth and promotes outdoor physical activities involving his favorite pastime, baseball.
This amazing PC uses all the latest technology to deliver absolutely raging performance in a cool, quiet platform.
It starts with the award-winning GeForce GTX Titan Black – the world’s fastest single GPU.
And it includes Intel’s latest Core i7 CPU, the 4770K, running on an ASUS z87 Maximus motherboard loaded with 16GB of GSKILL DDR3 memory. The storage system is based on two 1TB M50 SSDs from Crucial. And it’s all wrapped up in the most impressive chassis available today – the Tiki -- custom designed in NVIDIA green, with a laser cutout window so everyone can see the beast that lives within.
This entire system is fully warranted by Falcon Northwest.
To learn more about Phil and the way he changed those around him, see this story from San Jose Mercury News.
The direct auction link is here: http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&item=151292393844
Podcast #298 - Next Generation Intel Motherboards, Crossfire R9 295x2s, Corsair AX1500i Power Supply, and more!
Subject: General Tech | May 1, 2014 - 12:35 PM | Ken Addison
Tagged: video, r9 295x2, podcast, nvidia, Next Generation, Intel, corsair, AX1500i, amd, 295x2
PC Perspective Podcast #298 - 05/01/2014
Join us this week as we discuss Next Generation Intel Motherboards, Crossfire R9 295x2s, Corsair AX1500i Power Supply, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Josh Walrath, Jeremy Hellstrom, Allyn Malventano, and Morry Teitelman
This episode is also available as a streamed video.
Week in Review:
0:40:00 Noctua NH-U14S CPU Cooler Review
News items of interest:
Hardware/Software Picks of the Week:
Subject: Graphics Cards | April 29, 2014 - 10:22 AM | Ryan Shrout
Tagged: nvidia, watch_dogs, watch dogs, bundle, geforce
A bit of a surprise email found its way to my inbox today that announced NVIDIA's partnership with Ubisoft to include copies of the upcoming Watch_Dogs game with GeForce GTX graphics cards.
Gamers that purchase a GeForce GTX 780 Ti, GTX 780, GTX 770 or GTX 760 from select retailers will qualify for a free copy of the game. You can find details on this bundle and the available GPUs to take advantage of it at Amazon.com!
The press release also confirms inclusion of NVIDIA exclusive features like TXAA and HBAO+ in the game itself, which is interesting. From what I am hearing, Watch_Dogs is going to be a beast of a game on GPU hardware and we are looking forward to using it as a test platform going forward.
Full press release is included below.
OWN THE TECH AND CONTROL THE CITY WITH NVIDIA® AND UBISOFT®
Select GeForce GTX GPUs Now Include the Hottest Game of the Year: Watch_Dogs™
Santa Clara, CA — April 29, 2014 — Destructoid calls it one of the “most wanted games of 2014.” CNET said it was “one of the most anticipated games in recent memory.” MTV said it’s one of the “Can’t-Miss Video Games of 2014.” This, all before anyone out there has even played it.
So, starting today(1), gamers who purchase select NVIDIA® GeForce® GTX® 780 Ti, 780, 770 and 760 desktop GPUs can get their chance to play Watch_Dogs™, the new PC game taking the world by storm and latest masterpiece from Ubisoft®.
Described as a “polished, refined and truly next generation experience,” in Watch_Dogs you play as Aiden Pearce, a brilliant hacker whose criminal past led to a violent family tragedy. While seeking justice, you will monitor and hack those around you, access omnipresent security cameras, download personal information to locate a target, control traffic lights and public transportation to stop the enemy and more.
Featuring NVIDIA TXAA and HBAO+ technology for an interactive, immersive experience, it’s clear that gamers can’t wait to play Watch_Dogs, especially considering the effusive praise that the official trailer received. Launched mere weeks ago, the trailer has already been viewed more than 650,000 times combined. For gamers, Watch_Dogs seamlessly blends a mixture of single-player and multiplayer action in a way never before seen, and Ubisoft has gone one step further in creating a unique ctOS mobile companion app for users of smartphone and tablet devices, allowing for even greater access to the fun. If you haven’t seen the trailer yet, check it out here: https://www.youtube.com/watch?
The GeForce GTX and Watch_Dogs bundle is available starting today from leading e-tailers including Newegg, Amazon.com, TigerDirect, NCIX; add-in card vendors such as EVGA; and nationwide system builders including AVADirect, CyberPower, Digital Storm, Falcon Northwest, iBUYPOWER, Maingear, Origin PC, Puget Systems, V3 Gaming PC and Velocity Micro. For a full list of participating partners, please visit: www.GeForce.com/GetWatchDogs.
Subject: General Tech, Graphics Cards | April 27, 2014 - 04:22 AM | Scott Michaud
Tagged: nvidia, linux, amd
GPU drivers have been a hot and sensitive topic at the site, especially recently, probably spurred on by the announcements of Mantle and DirectX 12. These two announcements admit and illuminate (like a Christmas tree) the limitations of APIs on gaming performance. Both AMD and NVIDIA have their recent successes and failures on their respective fronts. This post will not deal with that, though. This is a straight round-up of new GPUs running the latest drivers... in Linux.
In all, NVIDIA tends to have better performance with its 700-series parts than equivalently-priced R7 or R9 products from AMD, especially in less demanding Source Engine titles such as Team Fortress 2. Sure, even the R7 260X was almost at 120 FPS, but the R9 290 was neck-and-neck with the GeForce GTX 760. The GeForce GTX 770, about $50 cheaper than the R9 290, had a healthy 10% lead over it.
In Unigine Heaven, however, the AMD R9 290 passed the NVIDIA GTX 770 by a small margin, coming right in line with its aforementioned $50-higher price tag. In that situation, where performance became non-trivial, AMD caught up (but did not beat). Also, AMD embraces third-party driver support more than NVIDIA does. On the other hand, NVIDIA's proprietary drivers are demonstrably better, even if you would argue that the specific cases are trivial because of overkill.
And then there's Unvanquished, where AMD's R9 290 did not achieve triple-digit FPS scores despite the $250 GTX 760 getting 110 FPS.
Update: As pointed out in the comments, some games perform significantly better on the $130 R7 260X than the $175 GTX 750 Ti (HL2: Lost Coast, TF2, OpenArena, Unigine Sanctuary). Some other games are the opposite, with the 750 Ti holding a sizable lead over the R7 260X (Unigine Heaven and Unvanquished). Again, Linux performance is a grab bag between vendors.
There are a lot of things to consider, especially if you are getting into Linux gaming. I expect that it will be a hot topic, soon, as it picks up... Steam.
Subject: General Tech, Systems, Mobile | April 27, 2014 - 03:30 AM | Scott Michaud
Tagged: nvidia, shield, shield 2, AnTuTu
VR-Zone is claiming that this is the successor to NVIDIA's SHIELD portable gaming system. An AnTuTu benchmark was found for a device called, "NVIDIA test model(SHIELD)" with an "NVIDIA Gefroce(Kepler Graphics)" GPU, typos left as-is. My gut expects that it is valid, but I hesitate to vouch for the rumor. If it even came from NVIDIA, which the improper spelling and capitalization of "GeForce" calls into question, it could easily be an internal prototype and maybe even incorrectly given the "SHIELD" (which is properly spelled and capitalized) label.
Image Credit: AnTuTu.com
As for its camera listing, it would make sense for the SHIELD to get one at standard definition (0.3MP -- probably 640x480). The fact that the original SHIELD shipped without one at all still confuses me. The low resolution sensor still does not make sense, seeming like an almost pointless upgrade, but it could be used by NVIDIA for a specific application or built-in purpose.
Or, it could be an irrelevant benchmark listing.
Either way, there are rumors floating around about a SHIELD 2 being announced at E3 in June. It is unlikely that NVIDIA will give up on the handheld any time soon. Whether that means new hardware, versus more software updates, is anyone's guess. The Tegra K1 would be a good SoC to launch a new SHIELD on, however, with its full OpenGL 4.4 and compute support (the hardware supports up to OpenCL 1.2, although driver support will apparently be "based on customer needs"; see page 8 of the PDF).
Waiting. Seeing. You know the drill.
Subject: General Tech | April 25, 2014 - 01:43 PM | Jeremy Hellstrom
Tagged: nvidia, contest, jetson tk1, kepler
Attention enthusiasts, developers and creators. Are you working on a new embedded computing application?
Meet the Jetson TK1 Developer Kit. It’s the world’s first mobile supercomputer for embedded systems, putting unprecedented computing performance in a low-power, portable and fully programmable package.
Power, ports, and portability: the Jetson TK1 development kit.
It’s the ultimate platform for developing next-generation computer vision solutions for robotics, medical devices, and automotive applications.
And we’re giving away 50 of them as part of our Tegra K1 CUDA Vision Challenge.
In addition to the Tegra K1 processor, the Jetson TK1 DevKit is equipped with 2 GB of RAM, 16 GB of storage and a host of ports and connectivity options.
And, because it offers full support for CUDA, the most pervasive, easy-to-use parallel computing platform and programming model, it’s much easier to program than the FPGA, custom ASIC and DSP processors that are typically used in today’s embedded systems.
Jetson TK1 is based on the Kepler computing architecture, the same technology powering today’s supercomputers, professional workstations and high-end gaming rigs. It has 192 CUDA cores, delivering over 300 GFLOPs of performance, and also provides full support for OpenGL 4.4, and CUDA 6.0, as well as the GPU-accelerated OpenCV.
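The "over 300 GFLOPs" figure follows from the standard peak-throughput arithmetic for CUDA cores. A rough sketch, where the GPU clock is my assumption (NVIDIA's quoted figure implies a clock in roughly this range; it is not stated in the text above):

```python
# Peak single-precision throughput estimate for the Tegra K1's Kepler GPU.
cuda_cores = 192
flops_per_core_per_cycle = 2   # one fused multiply-add counts as 2 FLOPs
assumed_gpu_clock_ghz = 0.85   # assumption, not from the article

gflops = cuda_cores * flops_per_core_per_cycle * assumed_gpu_clock_ghz
print(f"Estimated peak: {gflops:.0f} GFLOPS")  # ~326 GFLOPS
```

At any clock above roughly 780 MHz, 192 Kepler cores clear the 300 GFLOPS mark, so the claim is plausible on its face.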
Our Tegra K1 system-on-a-chip offers unprecedented power and portability.
Entering the Tegra K1 CUDA Vision Challenge is easy. Just tell us about your embedded application idea. All proposals must be submitted by April 30, 2014. Entries will be judged for innovation, impact on research or industry, public availability, and quality of work.
By the end of May, the top 50 submissions will be awarded one of the first Jetson TK1 DevKits to roll off the production line, as well as access to technical support documents and assets.
The developers behind the five most noteworthy Jetson TK1 breakthroughs may get a chance to share their work at the NVIDIA GPU Technology Conference in 2015.
AM1 Walks New Ground
After Josh's initial review of the AMD AM1 Platform and the Athlon 5350, we received a few requests to look at gaming performance with a discrete GPU installed. Even though this platform isn't being aimed at gamers looking to play demanding titles, we started to investigate this setup anyway.
While Josh liked the ASUS AM1I-A Mini ITX motherboard he used in his review, with only a x1 PCI-E slot it would be less than ideal for this situation.
Luckily we had the Gigabyte AM1M-S2H Micro ATX motherboard, which features a full length PCI-E x16 slot, as well as 2 x1 slots.
Don't be misled by the shape of the slot, though: the AM1 chipset still only offers 4 lanes of PCI-Express 2.0. This, of course, means that the graphics card will not be running at full bandwidth. However, the physical x16 slot makes it a lot easier to connect a discrete GPU, without having to worry about those ribbon cables that miners use.
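To put that 4-lane limit in perspective, here is a quick sketch of PCIe 2.0 bandwidth per direction, using the spec's 5 GT/s line rate and 8b/10b encoding (standard PCIe figures, not numbers from Gigabyte or AMD):

```python
# PCIe 2.0 bandwidth per direction: 5 GT/s per lane, 8b/10b encoding.
line_rate_gt_s = 5.0
encoding_efficiency = 8 / 10                # 8b/10b: 8 data bits per 10 line bits
lane_gbs = line_rate_gt_s * encoding_efficiency / 8  # 0.5 GB/s per lane

for lanes in (4, 16):
    print(f"x{lanes}: {lanes * lane_gbs:.1f} GB/s per direction")
# x4 gives 2.0 GB/s vs. 8.0 GB/s for a full x16 link
```

So the electrical x4 link carries a quarter of the bandwidth the physical x16 slot suggests, which is why a high-end GPU may be throttled in transfer-heavy workloads even though it fits and runs fine.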
Competition is a Great Thing
While doing some testing with the AMD Athlon 5350 Kabini APU to determine its flexibility as a low cost gaming platform, we decided to run a handful of tests to measure something else that is getting a lot of attention right now: AMD Mantle and NVIDIA's 337.50 driver.
Earlier this week I posted a story that looked at performance scaling of NVIDIA's new 337.50 beta driver compared to the previous 335.23 WHQL. The goal was to assess the DX11 efficiency improvements that the company stated it had been working on and implemented into this latest beta driver offering. In the end, we found some instances where games scaled by as much as 35% and 26%, but other cases where there was little to no gain with the new driver. However, we looked at both single-GPU and multi-GPU scenarios mostly on high-end CPU hardware.
Earlier in April I posted an article looking at Mantle, AMD's answer to a lower level API that is unique to its ecosystem, and how it scaled on various pieces of hardware in Battlefield 4. This was the first major game to implement Mantle, and it remains the biggest name in the field. While we definitely saw some improvements in gaming experiences with Mantle, there was work to be done when it came to multi-GPU scaling and frame pacing.
Both parties in this debate were showing promise but obviously both were far from perfect.
While we were benchmarking the new AMD Athlon 5350 Kabini based APU, an incredibly low cost processor that Josh reviewed in April, it made sense to test out both Mantle and NVIDIA's 337.50 driver in an interesting side by side.