Falcon Northwest Tiki-Z Special Edition Crams Titan Z And Liquid Cooled i7-4790K CPU Into A Stylish Micro Tower
Subject: General Tech, Systems | August 15, 2014 - 10:40 PM | Tim Verry
Tagged: titan z, tiki-z, gtx titan z, gk110, falcon northwest, dual gpu
The Tiki-Z Special Edition is the latest custom PC from boutique vendor Falcon Northwest. This high-end enthusiast system, which starts at $5,614, manages to pack a dual GPU graphics card, a liquid cooled CPU, a 600W power supply, and up to 6TB of storage into a stylish micro tower that measures a mere 4” wide and 13” tall.
Falcon Northwest has taken the original Tiki chassis and made several notable tweaks to accommodate NVIDIA’s latest dual GPU card: the GeForce GTX TITAN Z, which we reviewed here. The case has a custom (partial) side window that shows off the graphics card. This window can be green glass or smoke-tinted acrylic with customizable laser-cut venting. A ducted intake feeds cool air to the graphics card, while vents at the rear and front of the case exhaust hot air. The exterior of the case can be painted in any single color of automotive paint for free, or given a fully customized paint scheme with artwork at an additional cost.
In addition to the Titan Z with its 5,760 CUDA cores, 12GB of memory, and 8.1 TFLOPS of peak compute power, Falcon Northwest has packed in a modular small form factor 600W PSU from SilverStone, an ASUS Z97I-Plus motherboard, an Intel Core i7-4790K “Devil’s Canyon” CPU with a liquid cooler, up to 16GB of DDR3 1866MHz memory from G.Skill, and up to 6TB of storage (two 1TB SSDs and one 4TB Western Digital Green hard drive). The i7-4790K comes clocked at a stock 4GHz (4.4GHz max turbo), but can be overclocked by Falcon Northwest upon request.
Needless to say, that is a lot of hardware to cram into a PC that can easily sit next to your monitor at your desk or in your living room!
The engineering, artwork, and support of this high end system all come at a price, however. The new Titan Z powered boutique PC starts at $5,614 USD and is available now from Falcon Northwest. To sweeten the deal, for a limited time Falcon Northwest is including a free ASUS PB287Q 4K monitor (3840x2160, 60Hz, 1ms response time; see more specifications in our review) with each Tiki-Z purchase.
This system is an impressive feat of engineering, and it certainly looks sharp with the artwork, custom side panel, and compact form factor. My only concern from a usability standpoint would be noise from the cooling systems for the GPU, CPU radiator, and PSU. One also has to consider that the Titan Z graphics card by itself is priced at $3,000, which puts the Tiki-Z back into the somewhat sane world of boutique PC pricing (heh, about $2,600 for the system minus the GPU). No question, this is not going to be a system for everyone, and it will even be a niche product within the niche market of enthusiasts interested in pre-built gaming systems. Even so, if noise levels can be held in check it will make for one powerful little gaming box!
Subject: General Tech, Graphics Cards, Shows and Expos | August 15, 2014 - 05:33 PM | Scott Michaud
Tagged: siggraph 2014, Siggraph, OpenGL Next, opengl 4.5, opengl, nvidia, Mantle, Khronos, Intel, DirectX 12, amd
Let's be clear: there are two stories here. The first is the release of OpenGL 4.5 and the second is the announcement of the "Next Generation OpenGL Initiative". They both occur in the same press release, but they are two different statements.
OpenGL 4.5 Released
OpenGL 4.5 expands the core specification with a few extensions. Compatible hardware, with OpenGL 4.5 drivers, will be guaranteed to support these. This includes features like direct_state_access, which allows modifying objects without first binding them to the context, and support for OpenGL ES 3.1 features that are traditionally missing from OpenGL 4, which allows easier porting of OpenGL ES 3.1 applications to desktop OpenGL.
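To make the direct_state_access point concrete, here is a minimal sketch (my own example, not from the spec; it assumes a current GL 4.5 context and an already-initialized function loader such as glad):

```cpp
// Uploading vertex data: old bind-to-edit style versus GL 4.5 DSA style.
// Assumes a current GL 4.5 context and a loader (e.g. glad) is set up.
#include <glad/glad.h>

GLuint UploadVertices(const float* verts, GLsizeiptr bytes)
{
    GLuint buf = 0;

    // Pre-4.5 style: every edit goes through a binding point, clobbering
    // whatever was previously bound to GL_ARRAY_BUFFER:
    //   glGenBuffers(1, &buf);
    //   glBindBuffer(GL_ARRAY_BUFFER, buf);
    //   glBufferData(GL_ARRAY_BUFFER, bytes, verts, GL_STATIC_DRAW);

    // GL 4.5 DSA style: operate on the object directly; the context's
    // bindings are left untouched.
    glCreateBuffers(1, &buf);
    glNamedBufferData(buf, bytes, verts, GL_STATIC_DRAW);
    return buf;
}
```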
It also adds a few new extensions as an option:
ARB_pipeline_statistics_query lets a developer ask the GPU what it has been doing. This could be useful for "profiling" an application (list completed work to identify optimization points).
ARB_sparse_buffer allows developers to perform calculations on pieces of generic buffers without backing the whole buffer with physical memory. This is similar to ARB_sparse_textures... except that those are for textures. Buffers are useful for things like vertex data (and so forth).
ARB_transform_feedback_overflow_query is apparently designed to let developers choose whether or not to draw objects based on whether the buffer has overflowed. I might be wrong, but it seems like this would be useful for deciding whether or not to draw objects generated by geometry shaders.
KHR_blend_equation_advanced allows new blending equations between objects. If you use Photoshop, this would be "multiply", "screen", "darken", "lighten", "difference", and so forth. On NVIDIA's side, this will be directly supported on Maxwell and Tegra K1 (and later). Fermi and Kepler will support the functionality, but the driver will perform the calculations with shaders. AMD has yet to comment, as far as I can tell.
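To put that last extension in concrete terms, here is a minimal usage sketch (mine, not NVIDIA's or Khronos'; it assumes a context exposing KHR_blend_equation_advanced, and the draw helpers are hypothetical):

```cpp
// Photoshop-style "multiply" blending via KHR_blend_equation_advanced.
// DrawBaseLayer()/DrawTintLayer() are hypothetical stand-ins for real draws.
glEnable(GL_BLEND);
glBlendEquation(GL_MULTIPLY_KHR);  // result = source color * destination color
DrawBaseLayer();
glBlendBarrierKHR();               // the non-coherent extension requires a
                                   // barrier between overlapping draws
DrawTintLayer();
```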
Image from NVIDIA GTC Presentation
For developers, NVIDIA has launched 340.65 (340.23.01 for Linux) beta drivers with OpenGL 4.5 support. If you are not looking to create OpenGL 4.5 applications, do not get this driver. You really should not have any use for it, at all.
Next Generation OpenGL Initiative Announced
The Khronos Group has also announced "a call for participation" to outline a new specification for graphics and compute. They want it to allow developers explicit control over CPU and GPU tasks, be multithreaded, have minimal overhead, have a common shader language, and "rigorous conformance testing". This sounds a lot like the design goals of Mantle (and what we know of DirectX 12).
And really, from what I hear and understand, that is what OpenGL needs at this point. Graphics cards look nothing like they did a decade ago (or over two decades ago). They each have very similar interfaces and data structures, even if their fundamental architectures vary greatly. If we can draw a line in the sand, legacy APIs can be supported but not optimized heavily by the drivers. After a short time, available performance for legacy applications would be so high that it wouldn't matter, as long as they continue to run.
On top of that, next-generation drivers should be significantly easier to develop, considering the reduced error checking (and other responsibilities). As I said in Intel's DirectX 12 story, it is still unclear whether the performance increase will be enough to make most optimizations unnecessary, such as those which increase workload or developer effort in exchange for queuing fewer GPU commands. We will need to wait for game developers to use it for a bit before we know.
Subject: Mobile | August 15, 2014 - 11:22 AM | Jeremy Hellstrom
Tagged: Surface Pro 3, microsoft
With a 12" 2160x1440 resolution screen, a 4th generation Core i3, i5 or i7 and a full version of Win 8.1 the new Surface Pro 3 is the best tablet offered by Microsoft so far. Overall it is thinner but 1.5" larger than the Pro 3 with better resolution with a battery that should last about 8 hours while you are working, slightly longer when just browsing. The Surface Pen is a nice addition to the dock and stand we have become familiar with. Overall The Inquirer was fairly impressed with Microsoft's new offering, apart from the pricing which is rather prohibitive even before accessorizing.
"THE SURFACE PRO 3 tablet brings some of the biggest and most welcome changes seen in the Surface tablet line yet, with a bigger and better 12in HD screen, a much thinner case and an improved keyboard and kickstand, meaning its never lived more up to its motto of "the tablet that can replace your laptop."
Here are some more Mobile articles from around the web:
- Asus C200 Chromebook Review @ TechwareLabs
- ASUS X200MA: 11.6-inch Bay Trail Notebook @ SPCR
- Acer Goes Tegra K1 for Chromebook 13 @ Hardware Canucks
- Arctic Home Charger 4500 USB Adapter @ Funky Kit
- Corsair Voyager Air 2 @ Kitguru
- HUAWEI Ascend Mate2 Smart Phone Review @ Legit Reviews
Subject: General Tech | August 15, 2014 - 10:09 AM | Jeremy Hellstrom
Tagged: amd, Mantle, opengl, OpenGL Next
Along with his announcements about FreeSync, Richard Huddy also discussed OpenGL Next and its relationship with Mantle, as well as the role Mantle played in DirectX 12's development. AMD has given the Khronos Group, the developers of OpenGL, complete access to Mantle to help them integrate it into future versions of the API, starting with OpenGL Next. He also discussed the advantages of Mantle over DirectX, citing AMD's ability to update it much more frequently than Microsoft has done with DX. With over 75 developers working on titles that take advantage of Mantle, the interest is definitely there, but it is uncertain if devs will actually benefit from an API which updates at a pace faster than a game can be developed. Read on at The Tech Report.
"At Siggraph yesterday, AMD's Richard Huddy gave us an update on Mantle, and he also revealed some interesting details about AMD's role in the development of the next-gen OpenGL API."
Here is some more Tech News from around the web:
- The Man Responsible For Pop-Up Ads On Building a Better Web @ Slashdot
- Skype stops working on older Android phones leaving Linux users in the dark @ The Inquirer
- Intel teams with 50 Cent's audio firm to launch heart-rate monitoring headphones @ The Inquirer
- TSMC 4Q14 production capacity almost fully booked @ DigiTimes
- Lenovo posts an INCREASE in desktop PC and notebook sales @ The Register
- Boffins brew TCP tuned to perform on lossy links like Wi-Fi networks @ The Register
Subject: Memory | August 15, 2014 - 09:24 AM | Jeremy Hellstrom
Tagged: ddr4, corsair, Vengeance LPX, Dominator Platinum
FREMONT, California — August 14, 2014 — Corsair, a leader in high-performance PC hardware, today announced the availability of Corsair Vengeance LPX and Dominator Platinum lines of high-speed DDR4 computer memory. This new generation of memory ushers in a new age of ultrafast computing with optimizations such as increased DRAM bandwidth, higher bus frequencies, lower power usage, and higher reliability.
Corsair Vengeance LPX and Dominator Platinum DDR4 memory kits are validated with motherboard partners (ASUS, ASRock, EVGA, Gigabyte, and MSI) and use the new XMP 2.0 profile to deliver easy, reliable overclocking performance with the upcoming next-generation Intel® X99 platforms and Intel® Core™ i7 processors (codenamed Haswell-E). The Vengeance LPX and Dominator Platinum memory kits are supplied with a limited lifetime warranty.
Vengeance LPX memory is a new Corsair memory line designed for high-performance overclocking. The low-profile heatspreader is made of pure aluminum for faster heat dissipation, and the eight-layer PCB helps manage heat and provides superior overclocking headroom. The memory kits are available in black, red, white, or blue so that enthusiasts, gamers, and modders can add a touch of style to match the color scheme of their PC.
Like the DDR3 memory versions, the new Dominator Platinum DDR4 memory kits have a striking industrial design for good looks, patented DHX technology for cooler operation, user-swappable colored “light pipes” for customizable downwash lighting, and Corsair Link compatibility for real-time temperature monitoring. Dominator Platinum memory is built with hand-screened ICs, undergoes rigorous performance testing, and incorporates state-of-the-art cooling for reliable performance in demanding environments.
Vengeance LPX and Dominator Platinum DDR4 Specifications
- Unbuffered DDR4 SDRAM in 288-pin DIMM
- Capacities at launch: 8GB (2x4GB), 16GB (4x4GB), 32GB (4x8GB) and 64GB (8x8GB)
- Speeds at launch: 2666MHz, 2800MHz, and 3000MHz
- Intel XMP 2.0 (Extreme Memory Profile) support
DDR4 is faster. Even at its baseline speed of 2133 MT/s (million transfers per second), DDR4 delivers a third more bandwidth than base DDR3 at 1600 MT/s. With optimizations, games and applications have the potential to load faster and run more smoothly.
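For reference (our arithmetic, not Corsair's), peak theoretical bandwidth is simply the transfer rate multiplied by the 64-bit (8-byte) module width:

2133 MT/s x 8 bytes ≈ 17.1 GB/s per channel (baseline DDR4)
1600 MT/s x 8 bytes = 12.8 GB/s per channel (baseline DDR3)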
DDR4 uses a lot less power and runs cooler.
With each new generation of CPU and GPU architecture, system power consumption and heat generation become more and more important. DDR4 modules operate at an ultra-low standard 1.2 volts compared to the 1.5 and 1.65 volts of DDR3 memory, allowing DDR4 memory to consume significantly less power and generate less heat.
DDR4 memory modules can get bigger.
DDR3 is limited to 8GB modules, for a maximum of 32GB on standard four-socket motherboards. DDR4 will enable 16GB per module by 2015. A motherboard with eight memory slots will be upgradeable to an amazing 128GB of DDR4 memory.
Pricing, Availability, and Warranty
Corsair Vengeance LPX Series and Dominator Platinum DDR4 memory kits will be available at the end of August from Corsair's worldwide network of authorized distributors and resellers. The Vengeance LPX and Dominator Platinum memory kits are supplied with a limited lifetime warranty and are backed up by Corsair's customer service and technical support.
Subject: Networking | August 14, 2014 - 08:47 PM | Tim Verry
Tagged: wireless router, wave 2, rt-ac87u, rt-ac87r, qsr1000, mu-mimo, ASUS ROG, asus, 802.11ac
ASUS recently launched the RT-AC87U, the first "wave 2" 802.11ac wireless router to support multi-user MIMO (MU-MIMO) technology. Although the initial launch happened at the end of last month, the RT-AC87U and RT-AC87R (a variant exclusive to Best Buy) will finally be available for purchase starting August 26th for around $279.99.
The RT-AC87U is a monster matte black router with four large external antennas and sleek fighter jet angles. I/O is mostly clustered on the rear of the router and includes four Gigabit Ethernet LAN ports, one GbE WAN port, and one USB 2.0 port. In addition to the rear I/O, ASUS has positioned a USB 3.0 port on the front of the router (specifically the right corner of the front panel, hidden behind a removable rubber port cover).
On the wireless front, the RT-AC87U and RT-AC87R support the latest 802.11ac and newer 256QAM (600Mbps) 802.11n specifications as well as legacy 802.11g/b/a Wi-Fi networks. The router supports simultaneous dual band operation, which results in maximum throughput of 1.73 Gbps on the 5GHz 802.11ac band (4 x 433 Mbps streams) and 600 Mbps on the 2.4GHz 802.11n band.
The new and interesting bit about the RT-AC87 is the MU-MIMO support. MU-MIMO, which stands for Multi-User Multiple Input Multiple Output, is the evolution of the MIMO technology that debuted with wireless N routers. The ASUS router is able to use multiple antennas to communicate with a client device to increase bandwidth. Beamforming is used to focus the signal in the direction of the client to get better range and a stronger signal for that specific client. MU-MIMO builds on this technology by allowing the router to track, beamform, and employ multiple transmit and receive antennas to talk to multiple clients simultaneously. Previously, routers were limited to communicating with a single client at a time (see the diagram below for an example).
Multi-User MIMO will benefit those users who choose to connect the majority of their networked devices via Wi-Fi. However, the technology will be especially noticeable in areas flooded with various Wi-Fi networks, such as apartments. According to Matthew Gast of Aerohive Networks, MU-MIMO will allow all wireless clients to get an acceptable data rate in crowded wireless areas, at the expense of being able to deliver the highest data rate to a single client device. Especially when competing Wi-Fi networks are involved and fighting for channels, MU-MIMO will shine at keeping devices connected and talking to the access point.
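As a rough, idealized illustration (ours, ignoring protocol overhead): a router serving three single-stream clients one at a time gives each client the full 433 Mbps link rate but only about a third of the airtime, or roughly 144 Mbps effective each. An MU-MIMO router can beamform to all three simultaneously, letting each hold closer to the full 433 Mbps.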
ASUS has chosen the Quantenna QSR1000 chipset to handle the 802.11ac duties, while a Broadcom BCM4709 chipset handles the 256QAM wireless N bands. Additionally, the RT-AC87 routers have 128MB of flash memory and 256MB of DDR3 RAM. According to ASUS, the router draws slightly over 45W.
On the software side of things, ASUS has chosen its own ASUSWRT firmware which includes parental controls, Time Machine backup support, VPN support, security software from TrendMicro (AiProtection), and AiCloud 2.0. USB support includes storage sharing as well as 3G/4G cellular modem internet connectivity.
In all, the ASUS RT-AC87U looks to be the new home router champion, packing quite a bit of hardware and leading the charge of Wave 2 802.11ac wireless routers. This all comes at a cost, however. The RT-AC87U and RT-AC87R will be available on August 26 with an MSRP of $269.99 and e-tail prices currently around $279.99.
For all the nitty-gritty details, check out this ASUS PCDIY blog post!
Subject: Graphics Cards | August 14, 2014 - 04:20 PM | Jeremy Hellstrom
Tagged: catalyst 14.7 RC3, beta, amd
A new Catalyst Release Candidate has arrived, and as with the previous driver it no longer supports Windows 8.0 or the WDDM 1.2 driver, so please upgrade to Windows 7 or Windows 8.1 before installing. AMD will eventually release a driver which supports WDDM 1.1 under Windows 8.0 for those who do not upgrade.
Feature Highlights of the AMD Catalyst 14.7 RC3 Driver for Windows
- Includes all improvements found in the AMD Catalyst 14.7 RC driver
- Display interface enhancements to improve 4k monitor performance and reduce flickering.
- Improvements apply to the following products:
- AMD Radeon R9 290 Series
- AMD Radeon R9 270 Series
- AMD Radeon HD 7800 Series
- Even with these improvements, cable quality and other system variables can affect 4k performance. AMD recommends using DisplayPort 1.2 HBR2 certified cables with a length of 2m (~6 ft) or less when driving 4K monitors.
- Wildstar: AMD Crossfire profile support
- Lichdom: Single GPU and Multi-GPU performance enhancements
- Watch Dogs: Smoother gameplay on single GPU and Multi-GPU configurations
Feature Highlights of the AMD Catalyst 14.7 RC Driver for Windows
- Includes all improvements found in the AMD Catalyst 14.6 RC driver
- AMD CrossFire and AMD Radeon Dual Graphics profile update for Plants vs. Zombies
- Assassin's Creed IV - improved CrossFire scaling (3840x2160 High Settings) up to 93%
- Collaboration with AOC has identified non-standard display timings as the root cause of 60Hz SST flickering exhibited by the AOC U2868PQU panel on certain AMD Radeon graphics cards.
- A software workaround has been implemented in AMD Catalyst 14.7 RC driver to resolve the display timing issues with this display. Users are further encouraged to obtain newer display firmware from AOC that will resolve flickering at its origin.
- Users are additionally advised to utilize DisplayPort-certified cables to ensure the integrity of the DisplayPort data connection.
Feature Highlights of the AMD Catalyst 14.6 RC Driver for Windows
- Plants vs. Zombies (Direct3D performance improvements):
- AMD Radeon R9 290X - 1920x1080 Ultra – improves up to 11%
- AMD Radeon R9 290X - 2560x1600 Ultra – improves up to 15%
- AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra) - 92% scaling
- 3DMark Sky Diver improvements:
- AMD A4-6300 – improves up to 4%
- Enables AMD Dual Graphics/AMD CrossFire support
- GRID Autosport: AMD CrossFire profile
- Wildstar: Power Xpress profile
- Performance improvements for smoother gameplay
- Performance improves up to 24% at 2560x1600 on the AMD Radeon R9 and R7 Series of products for both single GPU and multi-GPU configurations.
- Watch Dogs: AMD CrossFire – Frame pacing improvements
- Battlefield Hardline Beta: AMD CrossFire profile
Known issues:
- Running Watch Dogs with an R9 280X CrossFire configuration may result in the application running in CrossFire software compositing mode
- Enabling Temporal SMAA in a CrossFire configuration when playing Watch Dogs will result in flickering
- AMD CrossFire configurations with AMD Eyefinity enabled will see instability with Battlefield 4 or Thief when running Mantle
- Catalyst Install Manager text is covered by Express/Custom radio button text
- Express Uninstall does not remove the C:\Program Files\(AMD or ATI) folder
Subject: General Tech, Displays | August 14, 2014 - 01:59 PM | Scott Michaud
Tagged: amd, freesync, g-sync, Siggraph, siggraph 2014
At SIGGRAPH, Richard Huddy of AMD announced the release windows of FreeSync, their adaptive refresh rate technology, to The Tech Report. Compatible monitors will begin sampling "as early as" September. Actual products are expected to ship to consumers in early 2015. Apparently, more than one display vendor is working on support, although names and vendor-specific release windows are unannounced.
As for cost of implementation, Richard Huddy believes that the added cost should be no more than $10-20 USD (to the manufacturer). Of course, the final price to end-users cannot be derived from this - that depends on how quickly the display vendor expects to sell product, profit margins, their willingness to push new technology, competition, and so forth.
If you want to take full advantage of FreeSync, you will need a compatible GPU (look for "gaming" support in AMD's official FreeSync compatibility list). All future AMD GPUs are expected to support the technology.
Subject: General Tech | August 14, 2014 - 12:30 PM | Ken Addison
Tagged: video, ssd, ROG Swift, ROG, podcast, ocz, nvidia, Kaveri, Intel, g-sync, FMS 2014, crossblade ranger, core m, Broadwell, asus, ARC 100, amd, A6-7400K, A10-7800, 14nm
PC Perspective Podcast #313 - 08/14/2014
Join us this week as we discuss new Kaveri APUs, ASUS ROG Swift G-Sync Monitor, Intel Core M Processors and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Subject: General Tech | August 14, 2014 - 12:00 PM | Jeremy Hellstrom
Tagged: audio, diamond multimedia, Xtreme Sound XS71HDU, usb sound card, DAC
The Diamond Xtreme Sound XS71HDU could be a versatile $60 solution for those with high-end audio equipment that would benefit from a proper DAC. With both optical in and out, it is capable of more than an onboard solution, not to mention the six 3.5-mm jacks covering stereo headphones and 7.1 surround (rear, sub, and side outputs) plus mic and line in. The design and features are impressive; however, the performance failed to please The Tech Report, who felt that similar solutions offered much higher quality sound reproduction.
"We love sound cards here at TR, but they don't fit in every kind of PC. Diamond's Xtreme Sound XS71HDU serves up the same kinds of features in a tiny USB package suitable for mini-PCs and ultrabooks. We took it for a spin to see if it's as good as it looks."
Here is some more Tech News from around the web:
- TDK A12 TREK Micro Wireless Speaker Review @ NikKTech
- Wavemaster Moody 2.1 Rev 2 Speaker @ eTeknix
- IK Multimedia iLoud Studio-Quality Portable Speaker Review @ NikKTech
- LUXA2 GroovyW bluetooth speaker @ Kitguru
- BitFenix Flo Premium PC Headset Review @ NikKTech
- Tt eSPORTS Sybaris Wired & Wireless Bluetooth NFC Enabled Headset @ eTeknix
- Tt eSports Level 10 M Gaming Headset @ TechwareLabs
- GAMDIAS Hephaestus GHS2000 Headset @ Benchmark Reviews
- Tt eSPORTS Level 10M Gaming Headset Review @ Techgage
- CM Storm Resonar Gaming Earphones @ eTeknix
Subject: General Tech | August 14, 2014 - 10:31 AM | Jeremy Hellstrom
For many, Linux is a mysterious thing that is either dead or about to die because no one uses it. Linux.com has put together an overview of what Linux is and where to find it being used. Much of what they describe in the beginning applies to all operating systems, as they share similar features; it is only in the details that they differ. If you have only thought about Linux as that OS you can't game on, it is worth taking a look through the descriptions of the distributions and why people choose to use Linux. You may never build a box which runs Linux, but if you are considering buying a Steambox when they arrive on the market, you will find yourself using a type of Linux, and a basic understanding of the parts of the OS will help with troubleshooting and optimization. If you already use Linux, then fire up Steam and take a break.
"For those in the know, you understand that Linux is actually everywhere. It's in your phones, in your cars, in your refrigerators, your Roku devices. It runs most of the Internet, the supercomputers making scientific breakthroughs, and the world's stock exchanges."
Here is some more Tech News from around the web:
- The internet just BROKE under its own weight – we explain how @ The Register
- Intel snaps up Axxia to bolster its wireless networking credentials @ The Inquirer
- The Biggest iPhone Security Risk Could Be Connecting One To a Computer @ Slashdot
- CHIL PowerShare Reactor 5.1 Amp Multi-Device Charger Review @ NikKTech
Subject: General Tech, Graphics Cards, Processors, Mobile, Shows and Expos | August 13, 2014 - 06:55 PM | Scott Michaud
Tagged: siggraph 2014, Siggraph, microsoft, Intel, DirectX 12, directx 11, DirectX
Along with GDC Europe and Gamescom, Siggraph 2014 is going on in Vancouver, BC. There, Intel had a DirectX 12 demo at its booth. The scene, containing 50,000 asteroids, each in its own draw call, was developed on both Direct3D 11 and Direct3D 12 code paths and could apparently be switched while the demo is running. Intel claims to have measured both power and frame rate.
Variable power to hit a desired frame rate, DX11 and DX12.
The test system is a Surface Pro 3 with an Intel HD 4400 GPU. Doing a bit of digging, this would make it the i5-based Surface Pro 3. Removing another shovel-load of mystery, this would be the Intel Core i5-4300U with two cores, four threads, a 1.9 GHz base clock, up to 2.9 GHz turbo clock, 3MB of cache, and (of course) the Haswell architecture.
While not top-of-the-line, it is also not bottom-of-the-barrel. It is a respectable CPU.
Intel's demo on this processor shows a significant power reduction in the CPU, and even a slight decrease in GPU power, for the same target frame rate. If power was not throttled, Intel's demo goes from 19 FPS all the way up to a playable 33 FPS.
Intel will discuss more during a video interview, tomorrow (Thursday) at 5pm EDT.
Maximum power in DirectX 11 mode.
For my contribution to the story, I would like to address the first comment on the MSDN article. It claims that this is just an "ideal scenario" of a scene that is bottlenecked by draw calls. The thing is: that is the point. Sure, a game developer could optimize the scene to (maybe) instance objects together, and so forth, but that is unnecessary work. Why should programmers, or worse, artists, need to spend so much of their time developing art so that it can be batched together into fewer, bigger commands? Would it not be much easier, and all-around better, if the content could be developed as it most naturally comes together?
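To illustrate the kind of busywork in question, here is a minimal sketch (my own, not Intel's demo code; it assumes the D3D11 device context, buffers, shaders, and input layout were created elsewhere). The naive path issues one draw per asteroid; the batched path collapses everything into a single instanced draw.

```cpp
// Per-object draw calls versus instancing under D3D11 (illustrative only).
#include <d3d11.h>
#include <DirectXMath.h>
#include <cstring>
#include <vector>

struct PerObject { DirectX::XMFLOAT4X4 world; };  // per-asteroid transform

// Naive path: one constant-buffer update and one draw per asteroid.
// 50,000 iterations means 50,000 trips through the driver per frame.
void DrawNaive(ID3D11DeviceContext* ctx, ID3D11Buffer* perObjectCB,
               const std::vector<PerObject>& asteroids, UINT indexCount)
{
    for (const PerObject& obj : asteroids) {
        ctx->UpdateSubresource(perObjectCB, 0, nullptr, &obj, 0, 0);
        ctx->DrawIndexed(indexCount, 0, 0);
    }
}

// Batched path: upload every transform once, then submit a single
// instanced draw; the vertex shader reads its own instance's transform.
void DrawBatched(ID3D11DeviceContext* ctx, ID3D11Buffer* instanceVB,
                 const std::vector<PerObject>& asteroids, UINT indexCount)
{
    D3D11_MAPPED_SUBRESOURCE mapped = {};
    if (SUCCEEDED(ctx->Map(instanceVB, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped))) {
        std::memcpy(mapped.pData, asteroids.data(),
                    asteroids.size() * sizeof(PerObject));
        ctx->Unmap(instanceVB, 0);
    }
    ctx->DrawIndexedInstanced(indexCount, static_cast<UINT>(asteroids.size()),
                              0, 0, 0);
}
```

The second path is exactly the sort of restructuring I mean: it works, but it dictates how content must be authored. If DirectX 12 makes the first loop cheap enough, much of that restructuring becomes optional.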
That, of course, depends on how much performance improvement we will see from DirectX 12, compared to theoretical max efficiency. If pushing two workloads through a DX12 GPU takes about the same time as pushing one double-sized workload, then it allows developers to, literally, perform whatever solution is most direct.
Maximum power when switching to DirectX 12 mode.
If, on the other hand, pushing two workloads is 1000x slower than pushing a single, double-sized one, but DirectX 11 was 10,000x slower, then it could be less relevant because developers will still need to do their tricks in those situations. The closer it gets, the fewer occasions that strict optimization is necessary.
If there are any DirectX 11 game developers, artists, and producers out there, we would like to hear from you. How much would a (let's say) 90% reduction in draw call latency (which is around what Mantle claims) give you, in terms of fewer required optimizations? Can you afford to solve problems "the naive way" now? Some of the time? Most of the time? Would it still be worth it to do things like object instancing and fewer, larger materials and shaders? How often?
Subject: Graphics Cards | August 13, 2014 - 03:11 PM | Jeremy Hellstrom
Tagged: factory overclocked, sapphire, R9 290X, Vapor-X R9 290X TRI-X OC
As far as factory overclocks go, the 1080MHz core and 5.64GHz RAM on the new Sapphire Vapor-X 290X are impressive, taking the prize for the highest factory overclock [H]ard|OCP has seen yet on this card. That didn't stop them from pushing it to 1180MHz and 5.9GHz after a little work, which is even more impressive. At both the factory and manual overclocks the card handily beat the reference model, and the manually overclocked benchmarks could meet or beat the overclocked MSI GTX 780 Ti GAMING 3G OC card. Speed is not the only good feature: Intelligent Fan Control keeps two of the three fans from spinning when the GPU is under 60C, which vastly reduces the noise produced by this card. It is currently selling for $646, lower than the $710 the GeForce currently commands.
"We take a look at the SAPPHIRE Vapor-X R9 290X TRI-X OC video card which has the highest factory overclock we've ever encountered on any AMD R9 290X video card. This video card is feature rich and very fast. We'll overclock it to the highest GPU clocks we've seen yet on R9 290X and compare it to the competition."
Here are some more Graphics Card articles from around the web:
- Sapphire Radeon R7 260X CrossFire Review @HiTech Legion
- ASUS Radeon R9 270X DirectCU II TOP @ [H]ard|OCP
- ASUS R9 270 Direct CU II OC 2 GB Video Card Review @ Madshrimps
- EKWB ASUS GTX 780 Ti DCII OC Full Cover Water Block Review @ Madshrimps
- Zotac GTX 750 Zone Edition @ Hardware Heaven
- Palit GTX 750 Ti KalmX 2 GB @ techPowerUp
- Palit GTX750 Ti KalmX @ Kitguru
- PNY GTX 780 Ti XLR8 OC Single & SLI Review @ Hardware Canucks
- Gigabyte GeForce GTX Titan Black GHz Edition @ X-bit Labs
- ASUS Republic of Gamers Striker Platinum GTX 760 4GB SLI @ eTeknix
- PNY GTX 750 Ti XLR8 OC @ [H]ard|OCP
Subject: General Tech | August 13, 2014 - 02:15 PM | Ryan Shrout
Tagged: supernova, podcast, giveaway, evga, contest
A big THANK YOU goes to our friends at EVGA for hooking us up with another item to give away to our podcast listeners and viewers this week. If you watch tonight's LIVE recording of Podcast #313 (10pm ET / 7pm PT at http://pcper.com/live) or download our podcast after the fact (at http://pcper.com/podcast), then you'll have the tools needed to win an EVGA SuperNOVA 1000 G2 power supply!! (Valued at $165 based on Amazon's current selling price.) See our review of the 750/850 G2 SuperNOVA units.
How do you enter? Well, on the live stream (or in the downloaded version) we'll give out a special keyword during our discussion of the contest for you to input in the form below. That's it!
We'll draw a random winner next week, anyone can enter from anywhere in the world - we'll cover the shipping. We'll draw a winner on August 20th and announce it on the next episode of the podcast! Good luck, and once again, thanks goes out to EVGA for supplying the prize!
Subject: Storage | August 13, 2014 - 11:38 AM | Jeremy Hellstrom
Tagged: toshiba, ssd, sata, ocz, barefoot 3, ARC
Before even looking at the performance, the real selling point of the new OCZ ARC 100 is the MSRP: the 240GB and 480GB models are slated to be released at $0.50/GB and will likely follow the usual trend of SSD prices and drop from there. The drives use the Barefoot 3 controller, this one clocked slightly lower than the Vertex 460's but still capable of accelerating encryption. Once The Tech Report set the drive up in their test bed, the performance was almost on par with the Vertex 460 and other mid- to high-end SSDs, especially in comparison to the Crucial MX100.
"OCZ's latest value SSD is priced at just $0.50 per gig, but it hangs with mid-range and even high-end drives in real-world and demanding workloads. It's also backed by an upgraded warranty and some impressive internal reliability data provided by OCZ. We take a closer look:"
Here are some more Storage reviews from around the web:
- OCZ ARC 100 240GB SSD @ Kitguru
- OCZ ARC 100 240GB SSD Review @ Legit Reviews
- OCZ ARC 100 240GB @ Legion Hardware
- OCZ ARC 100 SSD @ SSD Review
- OCZ ARC 100 240GB SSD Review @ Hardware Canucks
- Samsung 845DC EVO 3-bit Toggle MLC and 845DC PRO 3D V-NAND SSDs @ The Register
- Synology DS412+ - Network Attached Storage @ Funky Kit
Subject: General Tech | August 13, 2014 - 11:02 AM | Jeremy Hellstrom
Tagged: Unreal Tournament, gaming, Alpha
Feel like (Pre-Pre-)Alpha testing Unreal Tournament without forking money over for early access? No problem, thanks to Epic and Unreal Forums member ‘raxxy’, who is compiling and updating the (pre)Alpha version of the next Unreal Tournament. Sure, there may not be many textures, but there is a Flak Cannon, so what could you possibly have to complain about? There are frequent updates, and a major part of participating is giving feedback to the devs, so please be sure to check into the #beyondunreal IRC channel to get tips and offer feedback. Rock, Paper, SHOTGUN reports that the servers are massively packed right now, so you may not be able to immediately join in, but it is worth trying.
raxxy would like you to understand "These are PRE-ALPHA Prototype Builds. Seriously. Super early testing. So early it's technically not even pre alpha, it's debug code!"
You can be guaranteed that the Fragging Frogs will be taking advantage of this, as well as revisiting the much beloved UT2K4 so if you haven't joined up yet ... what are you waiting for?
Check out Fatal1ty playing if you can't get on
"Want to play the new Unreal Tournament for free, right this very second? Cor blimey and OMG you totes can! Hero of the people ‘raxxy’ on the Unreal Forums is compiling Epic’s builds and releasing them as small, playable packages that anyone can run, with multiple updates per week. The maps are untextured, the weapons unbalanced, and things change rapidly as everything’s still “pre-alpha” but it’s playable and – more importantly – fun."
Here is some more Tech News from around the web:
- Nvidia's Shield Tablet @ The Tech Report
- Hard West Kickstarter Offers Turn-Based Cowboy Tactics @ Rock, Paper, SHOTGUN
- Cyclonic! Space Hulk: Ascension Edition Announced @ Rock, Paper, SHOTGUN
- Killing Floor 2 Confidential Specimen Footage @ [H]ard|OCP
- Downloadable Cunning: AI War – Destroyer Of Worlds @ Rock, Paper, SHOTGUN
- It Rises: Sierra Returns With Geometry Wars & King’s Quest @ Rock, Paper, SHOTGUN
- Sacred 3 Review: It’s not Sacred Anymore @ Techgage
- Six New Witcher 3 Screenshots And A Trailer For You @ Rock, Paper, SHOTGUN
- HOMMage: Might & Magic Heroes VII Announced @ Rock, Paper, SHOTGUN
- Hands On: Alien Isolation @ Rock, Paper, SHOTGUN
- Splatummer Holidays: Dead Island 2 Trailer @ Rock, Paper, SHOTGUN
Subject: General Tech | August 13, 2014 - 09:58 AM | Jeremy Hellstrom
Tagged: tonga, radeon, FirePro W7100, amd
A little secret popped out with the release of AMD's FirePro W7100: a new GPU family that goes by the name of Tonga, which is very likely to replace the aging Tahiti chip that has been in use since the HD 7900 series. The stats The Tech Report saw show interesting changes from Tahiti, including a reduction of the memory interface to 256-bit, in line with NVIDIA's current offerings. The number of stream processors might be reduced to 1792 from 2048, but that figure is based on the W7100, and retail GPUs may be released with the full 32 GCN compute units. Many other features have seen increases: the number of Asynchronous Compute Engines goes from 2 to 8, the number of rasterized triangles per clock doubles to 4, and it adds support for the new TrueAudio DSP and CrossFire XDMA.
"The bottom line is that Tonga joins the Hawaii (Radeon R9 290X) and Bonaire (R7 260X) chips as the only members of AMD' s GCN 1.1 series of graphics processors. Tonga looks to be a mid-sized GPU and is expected to supplant the venerable Tahiti chip used in everything from the original Radeon HD 7970 to the current Radeon R9 280."
Here is some more Tech News from around the web:
- No more turning over a USB thing, then turning it over again to plug it in: Reversible socket ready for lift off @ The Register
- 12 Linux-Based Home Automation Systems for Under $300 @ Linux.com
- The IPv4 Internet Hiccups @ Slashdot
- Password manager LastPass goes titsup: Users LOCKED OUT @ The Register
- Seagate to splash MILLIONS on LAND, FACTORIES @ The Register
- Hardware Asylum Podcast - Computex 2014 Wrap Up and MSI MOA Americas Qualifier
- Netis Beacon N300 Wireless Gaming Router @ TechwareLabs
- Canadian ISP Shaw stumbles around internet with mystery 'routing' sickness @ The Register
Subject: General Tech | August 13, 2014 - 09:26 AM | Jeremy Hellstrom
Tagged: borderlands, nvidia, geforce
Santa Clara, CA — August 12, 2014 — Get ready to shoot ‘n’ loot your way through Pandora’s moon. Starting today, gamers who purchase select NVIDIA GeForce GTX TITAN, 780 Ti, 780, and 770 desktop GPUs will receive a free copy of Borderlands: The Pre-Sequel, the hotly anticipated new chapter to the multi-award winning Borderlands franchise from 2K and Gearbox Software.
Discover the story behind Borderlands 2’s villain, Handsome Jack, and his rise to power. Taking place between the original Borderlands and Borderlands 2, Borderlands: The Pre-Sequel offers players a whole lotta new gameplay in low gravity.
“If you have a high-end NVIDIA GPU, Borderlands: The Pre-Sequel will offer higher fidelity and higher performance hardware-driven special effects including awesome weapon impacts, moon-shatteringly cool cryo explosions and ice particles, and cloth and fluid simulation that blows me away every time I see it," said Randy Pitchford, CEO and president of Gearbox Software.
With NVIDIA PhysX technology, you will feel deep space like never before. Get high in low gravity and use new ice and laser weapons to experience destructible levels of mayhem. Check out the latest trailer, which just went live this morning: http://youtu.be/c9a4wr4I1hk
Borderlands: The Pre-Sequel will also stream to your NVIDIA SHIELD tablet or portable. For the first time ever, you can play Claptrap anywhere by using NVIDIA GameStream technologies. You can even livestream and record every fist punch with GeForce ShadowPlay.
Borderlands: The Pre-Sequel will be available on October 14, 2014 in North America and on October 17, 2014 internationally. Borderlands: The Pre-Sequel is not yet rated by the ESRB.
The GeForce GTX and Borderlands: The Pre-Sequel bundle is available starting today from leading e-tailers including Amazon, NCIX, Newegg, and Tiger Direct and system builders including Canada Computers, Digital Storm, Falcon Northwest, Maingear, Memory Express, Origin PC, V3 Gaming, and Velocity Micro. For a full list of participating partners, please visit: www.GeForce.com/GetBorderlands.
Subject: General Tech | August 12, 2014 - 06:00 PM | Scott Michaud
Tagged: valve, source engine, Source 2, DOTA 2
While it may not seem like it in North America, we are in a busy week for videogame development. GDC Europe, which stands for Game Developers Conference Europe, is just wrapping up to make room for Gamescom, which will take up the rest of the week. Valve will be there and people are reading tea leaves to find out why. SteamOS seems likely, but what about their next generation gaming engine, Source 2? Maybe it already happened?
Valve is the most secretive company with values of openness that I know. They are pretty good at preventing leaks from escaping their walls. Recently, Dota 2 was updated to receive new features and development tools for user-generated maps and gametypes. The tools currently require 64-bit Windows and a DirectX 11-compatible GPU.
Those don't sound like Source requirements...
And the editor doesn't look like Valve's old tools.
Video Credit: "Valve News Network".
Leaks also point to things like "tf_imported", "left4dead2_source2", and "left4dead2_imported". This is interesting. Valve is pushing Dota 2, their most popular, free-to-play game, into Source 2. Also, because Team Fortress 2 is listed as "tf" rather than "tf2" (just as Dota 2 appears as "dota" rather than "dota2", while "left4dead2" keeps its number), this might mean that the free-to-play Team Fortress 2 could be in a perpetual-development mode, like Dota 2. Eventually, it could be pushed to the new engine and given more content.
As for Left4Dead2? I am wondering if it is intended to be a product, rather than an internal (or external) Source 2 tech demo.
Was this what brought Valve to Gamescom, or will we be surprised by other announcements (or nothing at all)?
Subject: Displays | August 12, 2014 - 12:36 PM | Jeremy Hellstrom
Tagged: asus, g-sync, geforce, gsync, nvidia, pg278q, Republic of Gamers, ROG, swift, video
Ryan was not the only one to test the ASUS ROG Swift PG278Q G-Sync monitor; Overclockers Club also received a model to test out. Their impressions of the 27" 2560x1440 TN panel were very similar: once they saw this monitor in action, going back to their 30-inch 60Hz IPS monitor was not as enjoyable as it once was. The only bad thing they could say about the display was the MSRP; $800 is steep for any monitor and makes it rather difficult to even consider getting two or more of them for a multiple display setup.
”When you get down to it, the facts are that even with a TN panel being used for the high refresh rate, the ASUS ROG Swift PG278Q G-Sync monitor delivers great picture quality and truly impressive gaming. I could go on all day long about how smooth each of the games played while testing this monitor, but ultimately not be able to show you without having you sit at the desk with me. No stuttering, no tearing, no lag; it's like getting that new car and having all the sales hype end up being right on the money. When I flip back and forth between my 60Hz monitor and the PC278Q, its like a night and day experience.”
Here are some more Display articles from around the web:
- AOC G2460PG G-Sync 144Hz 1ms Gaming Monitor @ Kitguru
- Asus ROG Swift PG278Q 144hz G-Sync Monitor @ Kitguru
- 6400×1080: Testing Mixed-Resolution AMD Eyefinity @ eTeknix
- Demystifying NTSC Color And Progressive Scan @ Hack a Day