Introduction and Technical Specifications
Courtesy of GIGABYTE
The Z97X-SOC Force motherboard is the premier offering in GIGABYTE's Overclocking Series of boards. The Overclocking Series boards are designed with enhancements and features meant to appeal to enthusiasts and professional overclockers alike. The Z97X-SOC Force board design is based on the previous generation Z87X-OC Force, featuring the same black and orange coloration typical of the series. The board does contain several evolutionary changes making it easier to use and more appealing to its target users. At an MSRP of $209.99, the Z97X-SOC Force is competitively priced to appeal to all levels of enthusiasts.
GIGABYTE enhanced the power regulation system designed into the current generation boards, allowing for simplified cooling and fewer power phases. As a result, the Z97X-SOC Force is equipped with an 8-phase digital power circuit for the CPU, using International Rectifier (IR) based PowIRstage digital controllers and 10k-rated black solid capacitors to ensure system stability under any conditions. The Z97X-SOC Force board comes standard with the following integrated features: six SATA 3 ports; one SATA Express 10 Gb/s port; a Qualcomm® Atheros Killer E2201 NIC; four PCI-Express x16 slots; one PCI-Express x1 slot; two PCI slots; a 2-digit diagnostic LED display; on-board power, reset, CMOS clear, CMOS battery clear, OC Ignition, OC Tag, OC Turbo, OC Touch, Settings Lock, Direct to BIOS, and Memory Safe buttons; Dual-BIOS, active BIOS, and IC Trigger switches; OC PCIe and OC DIMM switch jumper blocks; integrated voltage measurement points; and USB 2.0 and 3.0 port support.
A powerful architecture
In March of this year, NVIDIA announced the GeForce GTX Titan Z at its GPU Technology Conference. It was touted as the world's fastest graphics card with its pair of full GK110 GPUs, but it came with an equally stunning price of $2999. NVIDIA claimed it would be available by the end of April for gamers and CUDA developers to purchase, but it was pushed back slightly and released at the very end of May, going on sale for the promised price of $2999.
The specifications of GTX Titan Z are damned impressive - 5,760 CUDA cores, 12GB of total graphics memory, 8.1 TFLOPs of peak compute performance. But something happened between the announcement and product release that perhaps NVIDIA hadn't accounted for. AMD's Radeon R9 295X2, a dual-GPU card with full-speed Hawaii chips on-board, was released at $1499. I think it's fair to say that AMD took some chances that NVIDIA was surprised to see them take, including going the route of a self-contained water cooler and blowing past the PCI Express recommended power limits to offer a ~500 watt graphics card. The R9 295X2 was damned fast and I think it caught NVIDIA a bit off-guard.
As a result, the GeForce GTX Titan Z release was a bit quieter than most of us expected. Yes, the Titan Black card was released without sampling the gaming media, but that card was nearly a mirror of the GeForce GTX 780 Ti, just with a larger frame buffer, and the performance of that GPU was well known. For NVIDIA to release a flagship dual-GPU graphics card, admittedly the most expensive one I have ever seen with the GeForce brand on it, and NOT send out samples, was telling.
NVIDIA is adamant though that the primary target of the Titan Z is not just gamers but the CUDA developer that needs the most performance possible in as small of a space as possible. For that specific user, one that doesn't quite have the income to invest in a lot of Tesla hardware but wants to be able to develop and use CUDA applications with a significant amount of horsepower, the Titan Z fits the bill perfectly.
Still, the company was touting the Titan Z as "offering supercomputer class performance to enthusiast gamers" and telling gamers in launch videos that the Titan Z is the "fastest graphics card ever built" and that it was "built for gamers." So, interest piqued, we decided to review the GeForce GTX Titan Z.
The GeForce GTX TITAN Z Graphics Card
Cost and performance notwithstanding, the GeForce GTX Titan Z is an absolutely stunning looking graphics card. The industrial design that started with the GeForce GTX 690 (the last dual-GPU card NVIDIA released) and continued with the GTX 780 and Titan family lives on with the Titan Z.
The all metal finish looks good and stands up to abuse, keeping that PCB straight even with the heft of the heatsink. There is only a single fan on the Titan Z, center mounted, with a large heatsink covering both GPUs on opposite sides. The GeForce logo up top illuminates, as we have seen on all similar designs, which adds a nice touch.
A refresh for Haswell
Intel has not been very good at keeping secrets recently. Rumors of a refreshed Haswell line of processors have been circulating for most of 2014. In March, Intel not only confirmed that release but promised an even more exciting part called Devil's Canyon. The DC parts are still quad-core Haswell processors built on Intel's 22nm process technology, but they change a few specific things.
Intel spent some time on the Devil's Canyon Haswell processors to improve the packaging and thermals for overclockers and enthusiasts. The thermal interface material (TIM) that lies in between the die and the heat spreader has been updated to a next-generation polymer TIM (NGPTIM). The change should improve cooling performance of all currently shipping cooling solutions (air or liquid), but it is still a question just HOW MUCH this change will actually matter.
You can also tell from the photo comparison above that Intel has added capacitors to the back of the processor to "smooth" power delivery. This, in combination with the NGPTIM, should enable a bit more headroom for clock speeds with the Core i7-4790K.
In fact, there are two Devil's Canyon processors being launched this month. The Core i7-4790K will sell for $339, the same price as the Core i7-4770K, while the Core i5-4690K will sell for $242. The lower end option is a 3.5 GHz base clock, 3.9 GHz Turbo clock quad-core CPU without HyperThreading. While a nice step over the Core i5-4670K, it's only 100 MHz faster. Clearly the Core i7-4790K is the part everyone is going to be scrambling to buy.
Another interesting change is that both the Core i7-4790K and the Core i5-4690K enable support for both Intel's VT-d virtualization IO technology and Intel's TSX-NI transactional memory instructions. This makes them the first enthusiast-grade unlocked processors from Intel to support them!
As Intel states it, the Core i7-4790K and the Core i5-4690K have been "designed to be used in conjunction with the Z97 chipset." That being said, at least one motherboard manufacturer, ASUS, has released limited firmware updates to support the Devil's Canyon parts on Z87 products. Not all motherboards are going to be capable, and not all vendors are going to spend the time to integrate support, so keep an eye on the support page for your specific motherboard.
The CPU itself looks no different on the top, save for the updated model numbering.
Core i7-4790K on the left, Core i7-4770K on the right
On the back you can see the added capacitors that help with stable overclocking.
The clock speed advantage that the Core i7-4790K provides over the Core i7-4770K should not be overlooked, even before overclocking is taken into consideration. A 500 MHz base clock boost works out to a 14% increase in this case, and in specific CPU-limited tasks you should see scaling close to that.
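As a quick back-of-the-envelope check, the uplift can be computed directly from the two base clocks (assuming the published 4.0 GHz base for the Core i7-4790K and 3.5 GHz for the Core i7-4770K). This is an idealized ceiling, not a benchmark result:

```python
# Hypothetical best-case scaling estimate from base clocks alone.
# Real-world gains in CPU-limited tasks will approach, not exceed, this.
def clock_uplift(new_ghz: float, old_ghz: float) -> float:
    """Percent frequency increase between two base clocks."""
    return (new_ghz - old_ghz) / old_ghz * 100

uplift = clock_uplift(4.0, 3.5)  # i7-4790K vs i7-4770K base clocks
print(f"{uplift:.1f}%")          # ~14.3% ideal uplift in purely CPU-bound work
```

Anything that is GPU-limited or memory-limited will, of course, see much less than that 14% ceiling.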
With the GPU landscape mostly settled for 2014, we have the ability to really dig in and evaluate the retail models that continue to pop up from NVIDIA and AMD board partners. One of our favorite series of graphics cards over the years comes from MSI in the form of the Lightning brand. These cards tend to take the engineering levels to a point other designers simply won't do - and we love it! Obviously the target of this capability is additional overclocking headroom and stability, but what if the GPU target has issues scaling already?
That is more or less the premise of the Radeon R9 290X Lightning from MSI. AMD's Radeon R9 290X Hawaii GPU is definitely a hot and power hungry part, and that caused quite a few issues at the initial release. Since then though, both AMD and its add-in card partners have worked to improve the coolers installed on these cards, improving performance and reliability and decreasing the LOUD NOISES produced by the stock reference cooler.
Let's dive into the latest to hit our test bench, the MSI Radeon R9 290X Lightning.
The MSI Radeon R9 290X Lightning
MSI continues to utilize the yellow and black color scheme that many of the company's high end parts integrate and I love the combination. I know that both NVIDIA and AMD disapprove of the distinct lack of "green" and "red" in the cooler and box designs, but good on MSI for sticking to its own thing.
The box for the Lightning card is equal to the prominence of the card itself and you even get a nifty drawer for all of the included accessories.
We originally spotted the MSI R9 290X Lightning at CES in January and the design remains the same. The cooler is quite large (and damn heavy) and uses a set of three fans. The yellow fan in the center is smaller and spins a bit faster, creating more noise than I would prefer. All fan speeds can be adjusted with MSI's included fan control software.
Kaveri Goes Mobile
The processor market is in an interesting place today. At the high end of the market Intel continues to stand pretty much unchallenged, ranging from the Ivy Bridge-E at $1000 to the $300 Haswell parts available for DIY users. The same could really be said for the mobile market - if you want a high performance part the default choice continues to rest with Intel. But AMD has some interesting options that Intel can't match when you start to enter the world of the mainstream notebook. The APU was slow to develop but it has placed AMD in a unique position, separated from the Intel processors with a more or less reversed compute focus. While Intel dominates in performance on the x86 side of things, the GPU in AMD's latest APUs continues to lead in gaming and compute performance.
The biggest problem for AMD is that the computing software ecosystem still has not caught up with the performance that a GPU can provide. With the exception of games, the GPU in a notebook or desktop remains underutilized. Certain software vendors are making strides - see the changes in video transcoding and image manipulation - but there is still some ground AMD needs to cover.
Today we are looking at the mobile version of Kaveri, AMD's latest entry into the world of APUs. This processor combines the latest AMD processor architecture with a GCN-based graphics design for a pretty advanced part. When the desktop version of this processor was released, we wrote quite a bit about the architecture and the technological advancements made to it, including becoming the first processor that is fully HSA compliant. I won't be diving into the architecture details here since we covered them so completely back in January just after CES.
The mobile version of Kaveri is basically identical in architecture with some changes for better power efficiency. The flagship part will ship with 12 Compute Cores (4 Steamroller x86 cores and 8 GCN cores) and will support all the same features of GCN graphics designs including the new Mantle API.
Early in the spring we heard rumors that the AMD FX brand was going to make a comeback! Immediately enthusiasts were thinking up ways AMD could compete against the desktop Core i7 parts from Intel; could it be with 12 cores? DDR4 integration?? As it turns out...not so much.
Introduction, Specifications and Packaging
Intel has a nasty habit of releasing disruptive technology, especially in the area of computer storage. Among the first of those releases was the X25-M, which was groundbreaking to say the least. At a time when most other SATA SSDs were just stopgap attempts to graft flash memory to a different interface, Intel's SATA SSD was really the first true performer.
With performance in the bag, Intel shifted their attention to reducing the cost of their products. The next few generations of the Intel line were coupled with leadership in die shrinks. This all came together in the form of SSD releases at increasingly reduced cost. Sure, the enterprise parts retained a premium, but the consumer parts generally remained competitive.
Now Intel appears to have once again shifted their attention to performance, and we know it has been in the works for a while now. With the SATA bottleneck becoming increasingly apparent, big changes needed to be made. SATA, while fine for relatively high latency HDDs, was just never meant for SSD speeds. As SSD performance increased, the latencies involved with the interface overhead (translating memory-based addresses into ATA style commands) became more and more of a burden.
The solution is to not only transition to PCIe, but to do so using a completely new software and driver interface, called NVM Express. NVMe has been in the works for a while, and offers some incredible benefits in that it essentially brings the flash memory closer to the CPU. The protocol was engineered for the purpose of accessing flash memory as storage, and doing so as fast and with the least latency as possible. We hadn't seen any true NVMe products hit the market, until today, that is:
Behold the Intel SSD DC P3700!
Introduction, Specifications and Packaging
Back in July of last year, Micron announced production of 16nm flash memory. These were the same 128Gbit dies as the previous generation parts, but 16nm means the dies are smaller, meaning more dies from a single wafer, ultimately translating to lower end user cost.
It takes a bit of time for those new flash die shrinks to trickle into mainstream products. Early yields from a given shrink tend not to have competitive endurance. As production continues, the process gets tweaked, resulting in better yields and greater endurance.
In many ways, the Google Nexus 7 has long been the standard of near perfection for an Android tablet. With a modest 7-inch screen, solid performance and low cost, the ASUS-built hardware has stood through one major revision as our top selection. Today though, a new contender in the field makes its way to the front of the pack in the form of the ASUS MeMO Pad 7 (ME176C). At $150, this new 7-inch tablet has almost all the hallmarks to really make an impact in the Android ecosystem. Finally.
The MeMO Pad 7 is not a new product family, though. It has existed with Mediatek processors for quite some time with essentially the same form factor. This new ME176C model makes some decisions that help it break into a new level of performance while maintaining the budget pricing required to really take on the likes of Google. By coupling the MeMO Pad brand with the Intel Bay Trail Atom processor, the two companies firmly believe they have a winner; but do they?
I have to admit that my time with the ASUS MeMO Pad 7 (ME176C) has been short; shorter than I would have liked to offer a truly definitive take on this mobile platform. I prefer to take the time to work the tablet into my daily work and home routines. Reading, browsing, email, etc. This allows me to filter through any software intricacies that might make or break a purchasing decision. Still, I think the ASUS design is going to live up to my expectations and is worth every penny of the $150 price tag.
The ASUS MeMO Pad 7 has a 1280x800 resolution IPS screen. This 7-inch device is powered by the new Intel Atom Z3745 quad-core SoC with 1GB of memory and 16GB of on-board storage. The front facing camera is of the 2MP variety while the rear facing camera is 5MP - but you will likely be as disappointed in the image quality of the photos as I was. Connectivity options include the microUSB port for charging and data transfer along with 802.11b/g/n 2.4 GHz WiFi (sorry, no 5.0 GHz option here). Bluetooth 4.0 allows for low power data sync with other devices you might have and our model shipped with Android 4.4.2 already pre-installed.
The rear of the ASUS MeMO Pad is a pseudo rubber/plastic type material that is easy to grip while not leaving fingerprints behind - a solid combination. The center mounted camera lens takes decent pictures - but I can't put any more praise on it than that. It was easy to find image quality issues with photos even in full daylight. It's hard to know how disappointed to be considering the price, but the Nexus 7 has better optical hardware.
Introduction and Technical Specifications
Courtesy of ASUS
The ASUS Sabertooth Z97 Mark 1 motherboard is the newest member of their TUF (The Ultimate Force) series. The board features the Intel Z97 chipset, but keeps the aesthetics and functionality you have come to expect from TUF series boards, and more specifically the Sabertooth line. That is not to say that ASUS did not improve the design of the Sabertooth Z97 Mark 1 board. The Sabertooth Z97 Mark 1 comes with a premium MSRP of $249.99, with features to justify that price point.
ASUS used the previous Sabertooth board as a template to design and improve upon the Sabertooth Z97 Mark 1. The board includes the line's signature TUF Thermal Armor board overlay and TUF Fortifier back plate. With the new revision of the Thermal Armor, ASUS integrated two optional fans into its design as well as flow ports above the CPU VRM heat sinks to adjust the airflow through the Thermal Armor. The board is designed with an 8-phase digital power system with alloy chokes redesigned to run cooler than previous generation chokes, 10k-rated Titanium capacitors, and military-grade TUF MOSFETs. Additionally, ASUS integrated on-board ESD (electro-static discharge) protection into the rear panel keyboard/mouse, USB 2.0, USB 3.0, LAN, and audio ports to protect the motherboard and integrated components from voltage fluctuations and power surges.
The N2310 is a budget dual-bay NAS from Thecus and an interesting product beyond the low cost for this category, boasting a number of features that help set it apart.
Apart from the primary role of a network attached storage (NAS) device - you know, storage - there are some interesting things a piece of hardware like the N2310 can do. This inexpensive NAS is actually a server, too, so beyond storing up to 8TB of data it’s powerful enough to replace a dedicated PC for certain tasks - the kind of tasks that some of us leave a PC running 24/7 to accomplish.
In this review we’ll take a look at some of the functionality that helps set the N2310 apart, as well as the kind of real-world performance you might expect to see.
It’s All About the Gigabytes
There are more reasons now than ever before for large storage options. Even though SSDs are at their lowest prices ever, most of us still need to supplement a fast boot drive with some traditional spinning disks. Just think about what accumulates in an average year on your PC… photos, music, videos, program backups and images, you name it. All those gigabytes have to go somewhere, and there are obviously internal and external hard drives to share the load. However, regardless of the local storage option you might choose, it's not always so convenient to actually access this stuff again. Clearly, the easier it is to access your files, the better - and not just from one device. So, having centralized storage is a great idea, right?
Between computers, tablets, and of course our phones, there are generally quite a few connected devices in the average technology-inclined home. And while every device mentioned can connect to the internet - and cloud storage has become very popular - there's still something to be said for local content management. Beyond the convenience of sharing sometimes massive amounts of data easily at home, another benefit of always-on storage is backup. Ideally, every computer in the home would be backed up locally as well as the cloud, and a great way to take care of the local side of backup is with a NAS. Setting one up is very easy these days, with a growing number of affordable options from various vendors.
Thecus makes an interesting case for a budget NAS with the N2310. For comparison, Allyn recently looked at Western Digital’s My Cloud EX2 network drive, a highly polished all-in-one solution now selling for about $199 (without drives). The Thecus N2310 is less expensive at $149, and both offer two 3.5” drive bays. (The My Cloud is also offered pre-populated with drives providing up to 8TB of storage.) These “diskless” enclosures present a good opportunity to save some money up front, and whether you choose to run on two drives you happened to have around the house or office, or if you want to go out and grab a couple of Western Digital 4TB Red drives, they can accommodate your situation.
Let’s take a look at the Thecus N2310.
The AMD Argument
Earlier this week, a story was posted in a Forbes.com blog that dove into the idea of NVIDIA GameWorks and how it was doing a disservice not just on the latest Ubisoft title Watch_Dogs but on PC gamers in general. Using quotes from AMD directly, the author claims that NVIDIA is actively engaging in methods to prevent game developers from optimizing games for AMD graphics hardware. This is an incredibly bold statement and one that I hope AMD is not making lightly. Here is a quote from the story:
Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favor of NVIDIA products. . . . Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code—the most desirable form of optimization.
The example cited in the Forbes story is the recently released Watch_Dogs title, which appears to show favoritism towards NVIDIA GPUs, with performance of the GTX 770 ($369) coming close to the performance of a Radeon R9 290X ($549).
It's evident that Watch Dogs is optimized for Nvidia hardware but it's staggering just how un-optimized it is on AMD hardware.
Watch_Dogs is the latest GameWorks title released this week.
I decided to get in touch with AMD directly to see exactly what stance the company was attempting to take with these kinds of claims. No surprise, AMD was just as forward with me as they appeared to be in the Forbes story originally.
The AMD Stance
Central to AMD’s latest annoyance with the competition is the NVIDIA GameWorks program. First unveiled last October during a press event in Montreal, GameWorks combines several NVIDIA built engine functions into libraries that can be utilized and accessed by game developers to build advanced features into games. NVIDIA’s website claims that GameWorks is “easy to integrate into games” while also including tutorials and tools to help quickly generate content with the software set. Included in the GameWorks suite are tools like VisualFX which offers rendering solutions like HBAO+, TXAA, Depth of Field, FaceWorks, HairWorks and more. Physics tools include the obvious like PhysX while also adding clothing, destruction, particles and more.
4K for $649
The growth and adoption of 4K resolution panels (most commonly 3840x2160) has really been the biggest story of the past year or so in the world of PC gaming. After a couple of TVs that ran at 3840x2160 over HDMI at 30 Hz found their way into our offices, the first real 60 Hz 4K monitor that I got some hands-on time with was the ASUS PQ321Q. This monitor was definitely targeted at the professional market with its IGZO display (near IPS quality) and somewhat high price tag of $3500. It has since dropped to $2400 or so, but it remains somewhat complicated by the use of MST (multi-stream transport) technology that was required to hit 60 Hz.
Earlier this month I took a look at the Samsung U28D590D 28-in 4K panel that was capable of 60 Hz refresh rates for just $699. This display used a single-stream transport DisplayPort connection to keep setup simple but used a TN panel rather than IPS/IGZO. This meant viewing angles were not as strong (though better than most TN screens you have seen before) but...that price!
Today we have our second low cost, SST 4K monitor to evaluate, the ASUS PB287Q. We saw it at CES back in January, and with a launch date of June 10th and an MSRP of $649, ASUS is setting itself up for an impressive release.
So what can you expect if you purchase the ASUS PB287Q 4K monitor? In short you get an adequate screen that won't live up to IPS standards but is just good enough for the PC gamer and productivity user in all of us. You'll also get a form factor that well exceeds that of the Samsung U28D590D with fully moveable stand and VESA mounting. And a price of $649 for a 3840x2160 screen doesn't hurt either.
Read on the next pages for more details on the user experience in Windows 8.1 as well as while gaming to see if this is the right monitor for you to buy this summer!
Introduction and Technical Specifications
Courtesy of MSI
MSI's Z97 Gaming 7 motherboard offers a bevy of updated features and aesthetics to usher in the Intel Z97 chipset-based product line. The initial updates you'll notice are the designs of the CPU VRM and chipset heat sinks. The VRM heat sinks have a unique banded design, reminiscent of a dragon's claws. This was done purposefully to cement the Gaming series moniker to the board in the user's mind. The chipset heat sink contains a glowing MSI corporate logo with the Dragon Gaming series shield logo centered above it. The black and red board color scheme is done tastefully, with the red coloration used to accent and highlight the heat sinks. With an MSRP of $189, the Z97 Gaming 7 board is competitively priced to take the market by storm.
MSI designed the Z97 Gaming 7 board with a 12-phase digital power delivery system combining Hi-C and Dark capacitors with super ferrite chokes for optimal power delivery with enhanced power efficiency characteristics. The board includes the following integrated components: eight SATA 3 ports; one M.2 10 Gb/s port; a Qualcomm® Atheros Killer E2205 NIC; three PCI-Express x16 slots; four PCI-Express x1 slots; a 2-digit diagnostic LED display; on-board power, reset, CMOS clear, and OC-Genie buttons; Slow Mode boot, Multi-BIOS, and Audio Power switches; left and right channel audio gain control switches; a Realtek audio solution with isolated audio PCB and Nichicon audio capacitors; dedicated per-channel headphone OP-AMPs; integrated V-Check voltage measurement points; and USB 2.0 and 3.0 port support.
Introduction: The Elements of (Life)Style
If this review began by describing this mini-ITX enclosure's all metal and glass construction, its rounded corners, and the premium price tag, it might easily start to sound like it came from that company in Cupertino. Come to think of it, this case would look right at home in a lifestyle magazine photo shoot...
Living the IN WIN 901 lifestyle?
The 901 is definitely stylish, and this is in keeping with the design philosophy of a company that promotes the aesthetics of products first and foremost. So where does this design merge with functionality? This question is a fundamental part of industrial design (ID as it's known in the industry), and in our look at this striking enclosure we'll see how much substance there is to go along with all of that IN WIN style.
One Size Does Not Fit All
Computer cases are a personal thing, which is why we hesitate to make recommendations in this area. Within a certain price point there might be dozens of options for just about any need. But whether or not you're a fan of the sleek styling of a product like the 901, it's different beyond that first impression. The case starts with an aluminum quasi-unibody construction with tempered glass panels on both sides. There is a rather complex structure within this simple exterior, but it is well organized with some thoughtful (and some really smart) design choices.
IN WIN says the 901 mini-ITX case is an example of “precision craftsmanship with no compromises”, and an initial inspection would leave one hard pressed to disagree. It's apparent that some serious engineering has gone into this enclosure, and there is a high level of quality befitting something with this price tag. At $179.99 this is geared toward the high-end enthusiast community, and even a smaller subset considering it is only compatible with mini-ITX motherboards. And while mini-ITX is the supported form-factor, this is definitely not a SFF case. In fact, it’s almost big enough to be a micro-ATX enclosure, but this isn't a complaint. The size of the 901 allows it a unique internal layout.
Another Boring Presentation...?
In my old age I am turning into a bit of a skeptic. It is hard to really blame a guy; we are surrounded by marketing and hype, both from inside companies and from their fans. When I first started to listen in on AMD’s Core Innovation Update presentation, I was not expecting much. I figured it would be a rehash of the past year, more talk about Mullins/Beema, and some nice words about some of the upcoming Kaveri mobile products.
I was wrong.
AMD decided to give us a pretty interesting look at what they are hoping to accomplish in the next three years. It was not all that long ago that AMD was essentially considered road kill, and there was a lot of pessimism that Rory Read and Co. could turn AMD around. Now, after a couple of solid years of growth, a laser-like focus on product development based on the IP strengths of the company, and a pretty significant cut of the workforce, we are seeing an AMD that is vastly different from the one that Dirk Meyer was in charge of (or Hector Ruiz for that matter). Their view of the future takes a pretty significant turn from where AMD was even 8 years ago. x86 certainly has a future at AMD, but the full-scale adoption of the ARM architecture looks to be what finally differentiates this company from Intel.
Look, I’m Amphibious!
AMD is not amphibious. They are working on being ambidextrous. Their goal is not only to develop and sell x86 based processors, but also be a prime moving force in the ARM market. AMD has survived against a very large, well funded, and aggressive organization for the past 35 years. They believe their experience here can help them break into, and thrive within, the ARM marketplace. Their goals are not necessarily to be in every smartphone out there, but they are leveraging the ARM architecture to address high growth markets that have a lot of potential.
There are really two dominant architectures in the world: ARM and x86. They power the vast majority of computing devices around the world. Sure, we still have some Power and MIPS implementations, but they are dwarfed by the combined presence of x86 and ARM in modern devices. The flexibility of x86 allows it to scale from the extreme mobile up to the highest performing clusters. ARM also has the ability to scale in performance from handhelds up to the server world, but so far its introduction into servers and HPC solutions has been minimal to non-existent. This is an area that AMD hopes to change, but it will not happen overnight. A lot of infrastructure is needed to get ARM into that particular area. Ask Intel how long it took for x86 to gain a foothold in the lucrative server and workstation markets.
Typical Flow Diagram for Single Block Loop
All-in-one liquid coolers seem to be all the rage, with several companies introducing expandable systems that allow a chipset or graphics cooling block to be integrated into the loop. We will be exploring the performance of two of our previously reviewed coolers to see just how well those liquid coolers can handle the addition of an in-line graphics card block. Both the Koolance EXT-440CU Liquid Cooling System and the Cooler Master Glacer 240L Liquid CPU Cooler were used with the ASUS Poseidon GTX 780 graphics card placed in-line for testing.
Typical Flow Diagram for Multi-Block Loop
Adding a second block to a liquid cooling loop affects several key factors, including:
- heat dissipation capacity of the radiator
- flow rate of the system
- resistance of the system components
Basically, additional liquid cooling blocks add more heat and longer tube runs to the system. This increases the amount of heat the system must dissipate and introduces additional flow resistance, both because of the larger loop size and because of the internal makeup of the added cooling blocks. The increased resistance and loop size directly affect the system flow rate and how hard the pump must work to keep the coolant flowing through the system.
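The extra heat from a second block translates directly into a higher steady-state coolant temperature. A rough back-of-the-envelope sketch (the 100 W CPU, 250 W GPU, and 1.0 L/min flow figures here are illustrative assumptions, not measurements from this testing):

```python
# Steady-state coolant temperature rise across a loop:
#   delta_T = P / (m_dot * c_p)
# For water: density ~1.0 kg/L, specific heat c_p ~4186 J/(kg*K).
# All wattage and flow values below are hypothetical examples.

def coolant_delta_t(heat_watts: float, flow_lpm: float) -> float:
    """Coolant temperature rise in deg C for a given heat load and flow rate."""
    m_dot = flow_lpm / 60.0   # kg/s (1 L of water is ~1 kg)
    c_p = 4186.0              # J/(kg*K)
    return heat_watts / (m_dot * c_p)

# CPU-only loop vs. CPU + GPU loop at the same assumed 1.0 L/min flow:
cpu_only = coolant_delta_t(100.0, 1.0)           # ~1.4 C rise
cpu_plus_gpu = coolant_delta_t(100.0 + 250.0, 1.0)  # ~5.0 C rise
```

This ignores the reduced flow rate the added block causes, so the real penalty is somewhat larger, but it shows why radiator capacity becomes the limiting factor in a multi-block loop.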
For the purpose of this testing, we did not measure the liquid flow of the system directly. Rather, we measured the temperature of both components (the CPU and GPU) which directly correlates to the flow and heat dissipation capacity of the system. The ASUS Poseidon block adds little resistance to the system, besides the added length of the liquid channel, because of its simple U-loop channel internal to the block.
For additional information about the components used for this article, please see our review of the Koolance EXT-440CU Cooling System here, the Cooler Master Glacer 240L Liquid CPU Cooler here, and the ASUS Poseidon GTX 780 graphics card here.
3840x2160 for Cheap!!
It has been just over a year since we first got our hands on a 4K display. At the time, we were using a 50-in Seiki 3840x2160 HDTV that ran at a 30 Hz refresh rate and was disappointing in terms of its gaming experience, but impressive in image quality and price ($1500 at the time). Of course, we had to benchmark graphics cards at 4K resolutions, and the results proved what we expected - you are going to need some impressive hardware to run at 4K with acceptable frame rates.
Since that story was published, we saw progress in the world of 4K displays with the ASUS PQ321Q, a 4K monitor (not a TV) that was built to handle 60 Hz refresh rates. The problem, of course, was the requirement for a multi-stream connection that essentially pushes two distinct streams over a single DisplayPort cable to the monitor, each at 1920x2160. While in theory that wasn't a problem, we saw a lot of configuration and installation headaches as we worked through the growing pains of drivers and firmware. Also, it was priced at $3200 when we first reviewed it, though that number has fallen to $2400 recently.
Today we are looking at the Samsung U28D590D, the first 4K panel we have seen that supports a 60 Hz refresh rate with a single-stream (single tile) implementation. That means that not only do you get the better experience associated with a 60 Hz refresh rate over 30 Hz, you also gain a much simpler and more compatible installation and setup. No tricky driver issues to be found here! If you have a DisplayPort 1.2-capable graphics card, it's just plug and play.
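A quick bandwidth calculation shows why a single DisplayPort 1.2 stream is enough for 4K at 60 Hz. This sketch counts only active pixel data and ignores blanking intervals, so the real requirement is somewhat higher, but it still fits within HBR2 capacity:

```python
# Rough check: does 3840x2160 @ 60 Hz fit in one DisplayPort 1.2 stream?

def video_rate_gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    """Raw pixel data rate in Gbit/s (active pixels only, no blanking)."""
    return width * height * hz * bpp / 1e9

def dp12_capacity_gbps(lanes: int = 4) -> float:
    """Usable DP 1.2 HBR2 bandwidth: 5.4 Gbit/s per lane, 8b/10b leaves 80%."""
    return lanes * 5.4 * 0.8

uhd_60 = video_rate_gbps(3840, 2160, 60)  # ~11.9 Gbit/s of pixel data
capacity = dp12_capacity_gbps()           # ~17.3 Gbit/s usable
assert uhd_60 < capacity
```

The same math explains the PQ321Q's MST workaround: its controller could only drive each half of the panel as a separate 1920x2160 stream, even though the cable itself had the bandwidth for the full image.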
The Samsung U28D590D uses a 28-in TN panel, which is obviously of a lower quality in terms of colors and viewing angles than the IGZO screen used on the ASUS PQ321Q, but it's not as bad as you might expect based on previous TN panel implementations. We'll talk a bit more about that below. The best part of course is the price - you can find the Samsung 4K panel for as low as $690!
Introduction and Technical Specifications
Courtesy of GIGABYTE
Like many other manufacturers, GIGABYTE is using the launch of the Intel Z97 Express chipset to update its product line aesthetics and makeup. The Z97X-Gaming G1-WIFI-BK motherboard is part of GIGABYTE's Black Edition product line, differentiated from other GIGABYTE offerings by the enhanced validation criteria each individual Black Edition product is put through prior to leaving the assembly plant. Each board is put through a 168-hour stress test, consisting of both CPU and GPU-based Litecoin mining, to prove its worth as part of the elite Black Edition product line. The Z97X-Gaming G1-WIFI-BK motherboard itself shares components and aesthetics with the Gaming line, featuring the line's black and red coloration and the GIGABYTE Gaming logo (the silver eye) on the Z97 chipset heat sink. With an MSRP of $349, the board is priced at a premium, more than justified by its enhanced feature set and Black Edition validation criteria.
Courtesy of GIGABYTE
GIGABYTE paired the Z97X-Gaming G1-WIFI-BK motherboard with an 8-phase digital power delivery system to ensure clean power to the CPU under all operating conditions. Additionally, the board's VRM heat sinks can be integrated into an existing water loop using the provided G 1/4" threaded ports. Note that the PLX and chipset heat pipe is physically separate from the VRM heat pipe and does not benefit from water cooling the board's VRM cooler. The Z97X-Gaming G1-WIFI-BK motherboard boasts the following integrated feature list: eight SATA 3 ports; one SATA Express 10 Gb/s port; dual Gigabit NICs - an Intel i217V NIC and a Qualcomm® Atheros Killer E2201 NIC; an Intel 802.11ac Wi-Fi and Bluetooth PCIe adapter; four PCI-Express x16 slots; three PCI-Express x1 slots; 2-digit diagnostic LED display; on-board power, reset, and CMOS clear buttons; Dual-BIOS and active BIOS switches; left and right channel audio gain control switches; Sound Blaster Core 3D audio solution; removable audio OP-AMP port; integrated voltage measurement points; hybrid VRM cooling solution; and USB 2.0 and 3.0 port support.
Courtesy of GIGABYTE
Upgrades from Anker
Last year we started to accumulate a large number of mobile devices around the office, including smartphones, tablets, and even convertibles like the ASUS T100, all of which were charged over USB connections. While not a hassle when you are charging one or two units at a time, having 6+ on our desks on any given day started to become a problem for our far less numerous wall outlets. Our solution last year was Anker's E150 25 watt wall charger, which we covered in a short video overview.
It was great but had limitations: different charging rates depending on which port a device was connected to, a total output limit of 5 amps across all five ports, and fixed output per port. Today we are taking a look at a pair of new Anker devices that implement smart ports, called PowerIQ, which enable the battery and wall charger to send as much power to the connected device as it requests, regardless of which physical port it is attached to.
We'll start with the updated Anker 40 watt 5-port wall charger and then move on to discuss the 3-port mobile battery charger, both of which share the PowerIQ feature.
Anker 40 watt 5-Port Wall Charger
The new Anker 5-port wall charger is actually smaller than the previous generation but offers superior specifications at all feature points. This unit can push out more than 40 watts total combined through all five USB ports: 5 volts at as much as 8 amps. We are told that all 8 amps can, in fact, flow through a single USB charging port if a device were to request that much, though nothing in our offices seems to draw more than 2.3 A.
Any USB port can be used for any device on this new model; it doesn't matter where it plugs in. This greatly simplifies things from a user-experience point of view, as you don't have to hold the unit up to your face to read the tiny text that existed on the E150. With 8 amps spread across all five ports, you should have more than enough power to charge all your devices at full speed. If you happen to have five iPads charging at the same time, that would exceed 8 A, and each device's charge rate would be a bit lower.
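That five-iPad scenario can be sketched as a simple shared current budget. This is a toy model only; Anker has not published how PowerIQ actually allocates current between ports, and the 2.4 A per-device request is an assumed iPad draw:

```python
# Toy model of a shared current budget across a multi-port charger:
# each device requests a current and the charger grants it until the
# total budget runs out. Purely illustrative, not PowerIQ's real logic.

def allocate(requests_a, budget_a=8.0):
    """Grant each requested current, first come first served, under a cap."""
    grants = []
    remaining = budget_a
    for req in requests_a:
        grant = min(req, remaining)
        grants.append(grant)
        remaining -= grant
    return grants

# Five tablets at an assumed 2.4 A each request 12 A total, which
# exceeds the 8 A budget, so the later ports come up short:
grants = allocate([2.4] * 5)  # roughly [2.4, 2.4, 2.4, 0.8, 0.0]
```

In practice a charger would more likely reduce every port's rate a little rather than starve the last one, but either way the total stays capped at the budget.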
Introduction, Specifications, and Packaging
Image credit: NCASE
The NCASE M1 Mini-ITX case has been lusted after for about a year now by those of us interested in small form-factor (SFF) computing, ever since it made the news last spring by reaching its initial goal on the crowd-funding site Indiegogo. The last campaign to raise funds ended in August of last year, and, not leaving anything up to chance, the creators of the M1 contracted none other than Lian Li to make their dream a reality. Today, we have the privilege of seeing the finished product!
Making things happen
We’ve all talked about changing some existing product to fix problems or just add features that we’d like to have. But most of us probably wouldn’t take our idea to a public funding site to actually make it happen, and that’s exactly why the story of NCASE and the M1 is unique. The creators were members on hardforums, and the original thread for the M1 is now well over 500 pages long.
The story began with a conversation about improving an existing mini-ITX design, with the SilverStone SG05 as the original topic. (It's fascinating to watch the design evolve on the thread!) Two forum members joined forces and started creating designs, and ended up with the blueprint for an incredibly small case that still supported large GPUs and 240mm radiators. Then, it was on to Indiegogo to see if the interest was high enough to get this case built.
Judging by the results, starting with that initial round of prototype funding, there has definitely been interest in this design! Lian Li's prototype case was a success, and the initial production run funding campaign again quickly raised more than double its goal. Fast forward to spring 2014: a black M1 case was delivered safely, and I for one can't wait to get started building up a system with it!
The M1 next to a BitFenix Prodigy: It's tiny!! (Image credit NCASE)