Subject: Displays | May 22, 2014 - 11:30 PM | Sebastian Peak
Tagged: nvidia, monitor, g-sync, acer, 4k
We've been talking about the benefits of 4K for a while, most recently with the Samsung U28D590D, which added single-stream 60Hz support to the mix, but there have certainly been some drawbacks with 4K monitors to date. Between typically low refresh rates and the general problem of getting smooth images on the screen (not to mention the high price of entry into 4K) there have been some legitimate questions about when to upgrade. Well, an interesting new product announcement from a surprising source might change things.
With a logo like that, who needs product photos?
Today, Acer is announcing an interesting alternative: the world’s first 4K monitor with integrated NVIDIA G-SYNC technology.
The XB280HK will be a 28" display, and (provided you have an NVIDIA graphics card and were looking to make the move to 4K) the benefits of G-SYNC - which include minimizing stutter and eliminating tearing - seem ideal for extremely high-res gaming.
We’ll be eagerly awaiting a look at the performance of this new monitor. (Or even a look at it, since Acer did not release a product photo!)
The details are scarce, but Acer says this will be a part of their “XB0” series of gaming monitors. Here are some specs for this 28” 3840x2160 display, which features three proprietary technologies from Acer:
- “Flicker-less” which Acer says is implemented at the power supply level to reduce screen flicker
- “Low-dimming” which sounds like an ambient light sensor to dim the monitor in low light
- “ComfyView” non-glare screen
Of interest, the Acer XB280HK is likely using a TN panel given the claimed "170/170 degree" viewing angle.
The hardware requirements for good 4K frame rates are definitely up there, and with G-SYNC onboard the XB280HK will probably not be at the low end of the 4K price range, but we shall see!
Subject: General Tech, Graphics Cards, Mobile | May 22, 2014 - 04:58 PM | Scott Michaud
Tagged: tegra k1, nvidia, iris pro, iris, Intel, hd 4000
The Chinese tech site, Evolife, acquired a few benchmarks for the Tegra K1. We do not know exactly where they got the system from, but we know that it has 4GB of RAM and 12 GB of storage. Of course, this is the version with four ARM Cortex-A15 cores (not the upcoming, 64-bit version based on Project Denver). On 3DMark Ice Storm Unlimited, it was capable of 25737 points, full system.
Image Credit: Evolife.cn
You might remember that our tests with an Intel Core i5-3317U (Ivy Bridge), back in September, achieved a score of 25630 on 3DMark Ice Storm. Of course, that was using the built-in Intel HD 4000 graphics, not a discrete solution, but it still kept up for gaming. This makes sense, though. Intel HD 4000 (GT2) graphics has a theoretical performance of 332.8 GFLOPs, while the Tegra K1 is rated at 364.8 GFLOPs. Earlier, we said that its theoretical performance is roughly on par with the GeForce 9600 GT, although the Tegra K1 supports newer APIs.
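Those theoretical throughput figures are easy to sanity-check: peak FP32 performance is just shader unit count × clock × FLOPs per cycle (2 for a fused multiply-add on a CUDA core; 16 per EU for Intel's GT2 part). A quick sketch, with the clock rates taken as assumptions from published specs:

```python
# Theoretical peak FP32 throughput: units x clock (MHz) x FLOPs-per-cycle.
def peak_gflops(units, clock_mhz, flops_per_cycle):
    return units * clock_mhz * flops_per_cycle / 1000.0

# Tegra K1: 192 Kepler CUDA cores at ~950 MHz, 2 FLOPs/cycle (FMA)
k1 = peak_gflops(192, 950, 2)       # -> 364.8 GFLOPs

# Intel HD 4000 (GT2): 16 EUs at ~1300 MHz max, 16 FLOPs/cycle per EU
hd4000 = peak_gflops(16, 1300, 16)  # -> 332.8 GFLOPs
```

Both results line up with the figures quoted above, which is why the near-identical Ice Storm scores are not surprising.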
Of course, Intel has released better solutions with Haswell. Benchmarks show that Iris Pro is able to play Battlefield 4 on High settings, at 720p, with about 30 FPS. The HD 4000 only gets about 12 FPS with the same configuration (and ~30 FPS on Low). This is not to compare Intel to NVIDIA's mobile part, but rather to compare Tegra K1 to modern, mainstream laptops and desktops. It is getting fairly close, especially with the first wave of K1 tablets entering at the mid-$200 USD MSRP in China.
As a final note...
There was a time when Tim Sweeney, CEO of Epic Games, said that the difference between high-end and low-end PCs "is something like 100x". Scaling a single game between the two performance tiers would be next to impossible. He noted that ten years earlier, that factor was more like "10x".
Now, an original GeForce Titan is about 12x faster than the Tegra K1, and they support the same feature set. In other words, it is easier to develop a game for both the PC and a high-end tablet today than it was to develop a PC game for both high-end and low-end machines back in 2008. PC gaming is, once again, getting healthier.
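That ~12x figure checks out against the theoretical numbers (clock rates here are assumptions based on the cards' published specs):

```python
# Rough spread between a high-end desktop GPU and Tegra K1,
# using theoretical FP32 throughput (numbers are approximate).
titan_gflops = 2688 * 837 * 2 / 1000.0  # original GTX Titan: ~4500 GFLOPs
k1_gflops = 192 * 950 * 2 / 1000.0      # Tegra K1: 364.8 GFLOPs
ratio = titan_gflops / k1_gflops        # roughly 12x
```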
Subject: General Tech, Graphics Cards, Processors, Mobile | May 15, 2014 - 05:02 PM | Scott Michaud
Tagged: nvidia, xiaomi, mipad, tegra k1
Tegra K1 is NVIDIA's new mobile processor and the first to implement the Kepler graphics architecture. In other words, it has all of the same graphics functionality as a desktop GPU, with 364 GigaFLOPs of performance (a little faster than a GeForce 9600 GT). This is quite fast for a mobile product. For instance, that amount of graphics performance could max out Unreal Tournament 3 at 2560x1600 and run Crysis at 720p. Being Kepler, it supports OpenGL 4.4, OpenGL ES 3.1, DirectX 11 and 12, and GPU compute languages.
Xiaomi is launching their MiPad in Beijing, today, with an 8-inch 2048x1536 screen and the Tegra K1. They will be available in June (for China) starting at $240 USD for the 16GB version and going up to $270 for the 64GB version. Each version has 2GB of RAM, an 8MP rear-facing camera, and a 5MP front camera.
Now, we wait and see if any Tegra K1 devices come to North America and Europe - especially at that price point.
Subject: General Tech | May 12, 2014 - 09:42 PM | Scott Michaud
Tagged: nvidia, shield, half-life 2, Portal
What would Gordon Freeman do? He would tell everyone to... ... oh right.
Well, apparently he is available on the NVIDIA SHIELD, now, along with Portal. I am not talking about GameStream. These two games have been ported to Android, but only through the SHIELD. From their screenshots, the mobile games look pretty good, especially Portal with its look-through mechanics.
As usual, whenever NVIDIA really wants something, they will often parachute engineers through your skylights to do it for you. The company revolves around delivering experiences to their customers, which is a good mindset for a company to have. A similar attitude was one of the main reasons for the success of Microsoft and PC gaming, especially in the late 90s with their DirectX efforts.
If you have an NVIDIA SHIELD, Half-Life 2 and Portal are available now for $9.99, through TegraZone.
Subject: General Tech, Graphics Cards | May 12, 2014 - 08:00 PM | Scott Michaud
Tagged: titan z, nvidia, gtx titan z, geforce
To a crowd of press and developers at their GTC summit, NVIDIA announced the GeForce GTX Titan Z add-in board (AIB). Each of the two fully unlocked GK110 GPUs has access to 6GB of GDDR5 memory (12GB total). The card was expected to be available on May 8th but has yet to surface. As NVIDIA has yet to comment on the situation, many question whether it ever will.
And then we get what we think are leaked benchmarks (note: two pictures).
One concern about the Titan Z was its rated 8 TeraFLOPs of compute performance. This is a fairly sizable reduction from the theoretical maximum of 10.24 TeraFLOPs of two Titan Black processors, and even less than two first-generation Titans (9 TeraFLOPs combined). We assumed this was due to reduced clock rates. What we did not expect is for benchmarks to show the GPUs boost well above those advertised levels, and even beyond the advertised boost clocks of the Titan Black and the 780 Ti. The card was seen pushing 1058 MHz in some sections, which leads to a theoretical compute performance of 12.2 TeraFLOPs (6.1 TeraFLOPs per GPU) in single precision. That is a lot.
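The math behind that 12.2 TeraFLOP figure is straightforward: each GK110 carries 2880 CUDA cores, and each core retires 2 FLOPs per cycle via fused multiply-add. A quick check at the observed clock:

```python
# Titan Z theoretical FP32 throughput at the observed ~1058 MHz boost clock.
cores_per_gpu = 2880
clock_ghz = 1.058
per_gpu_tflops = cores_per_gpu * clock_ghz * 2 / 1000.0  # ~6.1 TFLOPs
total_tflops = 2 * per_gpu_tflops                        # ~12.2 TFLOPs
```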
These benchmarks also show that NVIDIA has a slight lead over AMD's R9 295X2 in many games, except Battlefield 4 and Sleeping Dogs (plus 3DMark and Unigine). Of course, these benchmarks measure the software-reported frame rate and frame times, and those may or may not be indicative of actual performance. While the Titan Z appears to have a slight performance lead over the R9 295X2 (though a solid argument for an AMD performance win exists), it does so at double the cost, given its expected $3000 USD price point. That is not up for debate.
So, until NVIDIA says anything, the Titan Z is in limbo. I am sure there are CUDA developers who await its arrival. Personally, I would just get three Titan Blacks, since you are going to need to manually schedule your workloads across multiple processors anyway (or 780 Tis, if 32-bit arithmetic is enough precision). That is, of course, unless you cannot physically fit enough GeForce Titan Blacks on your motherboard and, as such, require two GK110 chips per AIB (but not enough to bother writing a cluster scheduling application).
Subject: General Tech | May 8, 2014 - 11:57 AM | Ken Addison
Tagged: podcast, video, asus, z97, Z97-Deluxe, ncase, m1, amd, seattle, arm, nvidia, Portal, shield
PC Perspective Podcast #299 - 05/08/2014
Join us this week as we discuss ASUS Z97-Deluxe, NCASE M1 Case, AMD's custom ARM Designs and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Josh Walrath, Jeremy Hellstrom, Allyn Malventano, and Morry Teitelman
Week in Review:
News items of interest:
Hardware/Software Picks of the Week:
Subject: Mobile | May 8, 2014 - 11:02 AM | Ryan Shrout
Tagged: nvidia, tegra, shield, half-life 2
Remember that cake we got last week? It was sent by NVIDIA to celebrate the release of Portal (May 12th) on SHIELD. They are at it again...
When you get a FedEx box sized for a poster or a tube of some kind, but you didn't order any such thing, you are likely to be confused. Imagine my surprise when I opened it up and found...a bright green crowbar. This might become a habit for them; we received a pry bar from NVIDIA in April of 2012 to tease the release of the GeForce GTX 690.
Based on the message on the crowbar, it seems that a Half-Life 2 release on SHIELD will follow soon. Sorry to disappoint anyone who was expecting Half-Life 3...
Subject: General Tech, Graphics Cards, Mobile | May 7, 2014 - 02:26 AM | Scott Michaud
Tagged: Thunderbolt 2, thunderbolt, nvidia, GeForce GTX 780 Ti
Externally-attached GPUs have been a topic for many years now. Numerous companies have tried, including AMD and Lucid, but no solution has ever been a widely known and available product. Even as interfaces increase in bandwidth and compatibility with internal buses, it has never been something that a laptop salesperson could suggest to users who want to dock into a high-performance station at home. At best, we are seeing it in weird "coin mining" racks to hang way more GPUs above a system than could physically mount on the motherboard.
Apparently that has not stopped the DIY community, according to chatter on Tech Inferno forums. While the above video does not really show the monitor, MacBook Pro, and GPU enclosure at the same time, let alone all wired together and on, it seems reasonable enough. The video claims to give the MacBook Pro (running Windows 8.1) access to a GeForce GTX 780 Ti with fairly high performance, despite the reduced bandwidth. Quite cool.
Subject: General Tech, Graphics Cards | May 5, 2014 - 05:03 PM | Scott Michaud
Tagged: nvidia, geforce experience, shield
NVIDIA has released version 2.0.1 of GeForce Experience. This update does not bring many new features, hence the third-level increment to the version number, but it is probably worth downloading regardless. Its headline feature is a set of OpenSSL security enhancements for remote GameStream on SHIELD. The update also claims to improve streaming quality and reduce audio latency.
While they do not seem to elaborate, I assume this is meant to fix Heartbleed, an exploit that allows an attacker to receive a small snapshot of active memory. If that is the case, it is unclear whether the SHIELD, the host PC during a game session, or both endpoints are affected.
The new GeForce Experience is available at the NVIDIA website. If it is running, it will also ask you to update it, of course.
Subject: Graphics Cards | May 2, 2014 - 01:29 AM | Tim Verry
Tagged: titan z, nvidia, gpgpu, gk110, dual gpu, asus
NVIDIA unveiled the GeForce GTX TITAN Z at the GPU Technology Conference last month, and the cards will be for sale soon from various partners. ASUS will be one of the first AIB partners to offer a reference TITAN Z.
The ASUS GTX TITAN Z pairs two full GK110-based GPUs with 12GB of GDDR5 memory. The graphics card houses a total of 5,760 CUDA cores, 480 texture mapping units (TMUs), and 96 ROPs. Each GK110 GPU interfaces with 6GB of GDDR5 memory via a 384-bit bus. ASUS is using reference clockspeeds with this card, which means 705 MHz base and up to 876 MHz GPU Boost for the GPUs and 7.0 GHz for the memory.
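Those reference clocks explain the card's rated 8 TeraFLOPs: 5,760 CUDA cores at 705 MHz, with 2 FLOPs per core per cycle (fused multiply-add), lands right at that figure. A quick sketch:

```python
# TITAN Z theoretical FP32 throughput at ASUS's reference clocks.
total_cores = 5760                      # 2880 per GK110 GPU x 2
base_tflops = total_cores * 705 * 2 / 1e6   # ~8.1 TFLOPs at 705 MHz base
boost_tflops = total_cores * 876 * 2 / 1e6  # ~10.1 TFLOPs at full GPU Boost
```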
For comparison, the dual-GPU TITAN Z is effectively two GTX TITAN Black cards on a single PCB. However, the TITAN Black runs at 889 MHz base and up to 980 MHz GPU Boost. A hybrid water cooling solution may have allowed NVIDIA to maintain the clockspeed advantage, but doing so would compromise the only advantage the TITAN Z has over using two (much cheaper) TITAN Blacks in a workstation or server: card density. A small hit in clockspeed will be a manageable sacrifice for the target market, I believe.
The ASUS GTX TITAN Z has a 375W TDP and is powered by two 8-pin PCI-E power connectors. The new flagship dual GPU NVIDIA card has an MSRP of $3,000 and should be available in early May.