When NVIDIA launched the GeForce GTX Titan X back in March of this year, I knew immediately that the GTX 980 Ti would be close behind. The Titan X was so different from the GTX 980 when it came to pricing and memory capacity (12GB, really??) that NVIDIA had set up the perfect gap in which to place the newly minted GTX 980 Ti. Today we get to take the wraps off of that new graphics card, and I think you'll be impressed with what you find, especially when you compare its value to the Titan X.
Based on the same Maxwell architecture and GM200 GPU, with some minor changes to GPU core count, memory size, and boost speeds, the GTX 980 Ti finds itself in a unique spot in the GeForce lineup. Performance-wise it's basically identical to the GTX Titan X in real-world game testing, yet it is priced $350 less than that 12GB behemoth. Couple that with a modest $50 price drop on the GTX 980 and you have all the markers of an enthusiast graphics card that will sell as well as any we have seen in recent generations.
The devil is in all the other details, of course. AMD has its own plans for this summer, but the Radeon R9 290X is still sitting there at a measly $320, undercutting the GTX 980 Ti by more than half. NVIDIA seems to be pricing its own GPUs as if it isn't even concerned with what AMD and the Radeon brand are doing. That could be dangerous if it goes on too long, but for today, can the R9 290X and its aging Hawaii XT GPU put up enough of a fight to make the value case to gamers on the fence?
Will the GeForce GTX 980 Ti be the next high-end GPU to make a splash in the market, or will it make a thud at the bottom of the GPU gene pool? Let's dive into it, shall we?
Introduction and Features
Corsair continues to offer a huge selection of memory products, PC cases, SSDs, power supplies, coolers, gaming peripherals, and PC accessories. The 780T Full-Tower case is one of the new additions to Corsair’s Graphite Series of PC enclosures for 2015 and is available in either black or white. The 780T is a premium case loaded with features that enable quick, easy, and good-looking builds, along with plenty of room and numerous case cooling options. It comes with three 140mm Corsair fans pre-installed and offers mounting locations for additional fans. The 780T also provides excellent support for liquid cooling, with mounting locations for two 360mm radiators. The full-tower enclosure can mount E-ATX and XL-ATX motherboards with room for multiple high-end graphics cards up to 14” (355mm) in length. There are currently 16 different models in the Graphite Series, ranging in price from $69.99 up to $189.99 USD.
Graphite Series 780T Black ($179.99) Graphite Series 780T White ($189.99)
In this review we will be taking a detailed look at the Graphite Series 780T White Full-Tower case. Here is what Corsair has to say about their new 780T enclosure: “The stunning Graphite Series 780T Full-Tower PC case can satisfy the most hardcore gamer or overclocker with ample room for nine drives and nearly a dozen large cooling fans. Into water cooling? You’ll appreciate the generous space for dual 360mm radiators. And, you’ll get everything done faster: the 780T offers easy maintenance shortcuts like tool-free removal of side panels and hard drives. A three-speed fan control button and generous options for peripheral connections make the front-panel a true time saver.”
Graphite Series 780T Full-Tower Case Key Features:
• Large, Full-Tower PC case (available in black or white)
• Premium design with rounded corners and sleek, cohesive styling
• Latched side panels for easy tool-free access
• Large acrylic side window to show off internal components
• Dual 140mm LED intake fans and a 140mm exhaust fan included
• Locations for up to nine total case fans
• Supports 120mm, 240mm, and 360mm radiators for water-cooling
• Supports XL-ATX, E-ATX, ATX, MicroATX and Mini-ITX motherboards
• Six 3.5” / 2.5” tool-less HDD/SSD bays (can be removed if not needed)
• Three 2.5” tool-less SSD bays
• Three-speed fan control switch on top panel with LED gauge
• Two USB 3.0 and two USB 2.0 ports on top panel
• Two 5.25” front exposed drive bays
• Removable mesh dust filters (front, top, and bottom)
• Up to 355mm (14”) of space for long graphics cards
• Up to 200mm (7.8”) of space for CPU coolers
• Cable routing cutouts to keep cables out of the airflow path
The 780T White Full-Tower case features a beautiful white matte finish with black accents, and all internal surfaces are finished in black. The two 140mm intake fans behind the front grill incorporate white LEDs (the black version comes with red LEDs).
Announced at last year’s Google I/O event in June, Android TV is a platform developed by Google, running Android 5.0 and higher, that aims to create an interactive experience for the TV. This platform can be built into a TV directly as well as into set-top style boxes, like the NVIDIA SHIELD we are looking at today. The idea is to bring the breadth of apps and content to the TV through the Android operating system in a way that is both convenient and intuitive.
NVIDIA announced SHIELD back in March at GDC as the first product to use the company’s latest Tegra processor, the X1. This SoC combines an 8-core big.LITTLE ARM processor design with a 256-core implementation of the NVIDIA Maxwell GPU architecture, providing GPU performance previously unseen in an Android device. I have already spent some time with the NVIDIA SHIELD at various events and the promise was clearly there to make it a leading option for Android TV adoption, but obviously there were questions to be answered.
Today’s article will focus on my early impressions with the NVIDIA SHIELD, having used it both in the office and at home for a handful of days. As you’ll see during the discussion there are still some things to be ironed out, some functionality that needs to be added before SHIELD and Android TV can really be called a must-buy product. But I do think it will get there.
And though this review will focus on the NVIDIA SHIELD, it’s impossible not to tie the success of SHIELD to the success of Google’s Android TV. The dominant use case for SHIELD is as a media playback device, with the gaming functionality as a really cool side project for enthusiasts and gamers looking for another outlet. For SHIELD to succeed, Google needs to prove that Android TV can improve on other integrated smart TV platforms as well as other set-top box platforms like Boxee, Roku, and even the upcoming Apple TV refresh.
But first, let’s get an overview of the NVIDIA SHIELD device, pricing and specifications, before diving into my experiences with the platform as a whole.
Big Things, Small Packages
Sapphire isn’t a brand we have covered in a while, so it is nice to see a new and interesting product arrive at our door. Sapphire was a relative unknown until around the days of the Radeon 9700 Pro. That was when ATI decided it did not want to be so vertically integrated and allowed other companies to start buying its chips and making their own cards. This was done to provide a bit of stability for ATI pricing, as the company no longer had to worry about a volatile component market that could cause its margins to plummet. By selling just the chips to partners, ATI could more adequately control margins on its own product while allowing partners to make their own deals and component choices for the finished card.
ATI had very limited graphics card production of its own, so it often farmed out production to second sources. One of these sources ended up turning into Sapphire. When ATI finally allowed other partners to produce and brand their own ATI-based products, Sapphire already had a leg up on the competition by already being a large producer of ATI products. They soon controlled a good portion of the marketplace through their contacts, pricing, and close relationship with ATI.
Since then, ATI has been bought up by AMD, and AMD no longer produces any ATI-branded cards. Going vertical when it comes to producing both chips and video cards was obviously a bad idea; we can look back at 3dfx and its attempt at vertical integration and how that ended for the company. AMD obviously produces an initial reference version of its cards and coolers, but allows partners to sell the “sticker” version and then develop their own designs. This has worked very well for both NVIDIA and AMD, and it has allowed their partners to further differentiate their products from the competition.
Sapphire usually does a bang-up job packaging its graphics cards. Oh look, a mousepad!
Sapphire is not as big of a player as they used to be, but they are still one of the primary partners of AMD. It would not surprise me in the least if they still produced the reference designs for AMD and then distributed those products to other partners. Sapphire is known for building a very good quality card, and their cooling solutions have been well received as well. The company does have some stiff competition from the likes of Asus, MSI, and others in this particular market. Unlike those companies, however, Sapphire does not make any NVIDIA-based boards. This has been a blessing and a curse, depending on what the cycle looks like between AMD and NVIDIA and who has dominance in any particular marketplace.
Introduction: Improving Portable Sound
The Calyx PaT is a very small USB DAC and headphone amp that can be used with PCs and mobile devices, offering the possibility of better sound from just about any digital source. So how does it sound? Let’s find out!
The PaT is a very interesting little device, to be sure. It rather resembles a large domino and weighs less than 1 ounce thanks to an ultra-light aluminum construction. It requires no battery or power source other than its micro USB connection, yet it provides sufficient power (0.8 V output) for in-ear monitors and efficient headphones through its 3.5mm headphone jack. Inside is a proprietary mix of DAC and amplifier circuitry, and like other products produced by Calyx, a Korean company with little presence in the United States, there is the promise of a dedication to great sound. Did Calyx pull it off with the diminutive PaT?
Improving Portable Sound
Outboard DACs and headphone amplifiers for computers and mobile devices are nothing new, with recent products like AudioQuest’s Dragonfly a prime example in the portable USB DAC market (though it offers no mobile support). When I first heard about the PaT during CES it was still in the prototype stage, but I was interested because of the Calyx name if nothing else, as I already owned the Calyx M DAP and had been quite honestly blown away by the sound.
So what need might I have for the interestingly-named PaT (pronounced "paat", meaning "bean" in reference to its small size), which is itself a DAC that requires another device to play music files? It wasn’t until I had the opportunity to speak with Calyx president Seungmok Yi during CES (via video chat, as I couldn’t attend the show) that I started to realize that this could be a compelling product, not just because of the $99 price tag - a bargain for an audiophile product - but because of how versatile the PaT can be. You don't have to identify as an "audiophile" to appreciate the clearer and more detailed sound of a good DAC, especially when so many of us simply haven't heard one (especially on mobile devices).
Introduction and Technical Specifications
Courtesy of GIGABYTE
The X99-Gaming 5P is the latest board to be branded as part of GIGABYTE's Champion Series of motherboards, with the enhanced overclocking and memory support common to boards in that series. The board also features G1 Gaming branding, clearly targeting the gaming crowd with its red and black aesthetics. The board supports all Intel LGA2011-3 processors paired with DDR4 memory in up to a quad-channel configuration. GIGABYTE priced the X99-Gaming 5P competitively with an MSRP of $309.99.
Courtesy of GIGABYTE
Courtesy of GIGABYTE
Courtesy of GIGABYTE
The X99-Gaming 5P was designed to take the abuse that enthusiast gamers put their boards through. The board features an 8+4-phase digital power system using International Rectifier Gen 4 digital PWM controllers and Gen 3 PowIRstage controllers, Server Level chokes, and long-life Durable Black solid capacitors. It also offers a superior sound solution for gaming, pairing the Creative Sound Core3D™ quad-core audio processor with high-end audio capacitors and a removable OP-AMP.
Introduction and First Impressions
Supermicro recently entered the consumer space with a new line of enthusiast motherboards and today we’re looking at a gaming enclosure from the well-known enterprise manufacturer.
While many component manufacturers have diversified their product offerings to include everything from cooling fans to thumb drives, Supermicro is not a name anyone familiar with the company would have suspected of following this trend. With recent Z97 and X99 motherboard offerings, Supermicro has made an effort to enter the enthusiast market with boards that don’t exactly look like gaming products, but this is to be expected from a company that specializes in the enterprise market.
It was something of a surprise to hear that Supermicro had created a new enclosure for the consumer segment, and even more so to hear that it would be a gaming enclosure. And while the term “gaming” gets thrown around quite a bit, the new enclosure does have the look we tend to associate with the moniker, with flashy red accents and a brushed aluminum front panel to go along with the all-black steel enclosure.
High Bandwidth Memory
UPDATE: I have embedded an excerpt from our PC Perspective Podcast that discusses the HBM technology that you might want to check out in addition to the story below.
The chances are good that if you have been reading PC Perspective or almost any other website that focuses on GPU technologies for the past year, you have read the acronym HBM. You might have even seen its full name: high bandwidth memory. HBM is a new technology that aims to turn the way a processor (GPU, CPU, APU, etc.) accesses memory upside down, almost literally. AMD has already publicly stated that its next generation flagship Radeon GPU will use HBM as part of its design, but it wasn’t until today that we could talk about what HBM actually offers to a high performance processor like Fiji. At its core, HBM drastically changes how the memory interface works, how much power it requires, and what metrics we will use to compare competing memory architectures. AMD and its partners started working on HBM with the industry more than 7 years ago, and with the first retail product nearly ready to ship, it’s time to learn about HBM.
We got some time with AMD’s Joe Macri, Corporate Vice President and Product CTO, to talk about AMD’s move to HBM and how it will shift the direction of AMD products going forward.
The first step in understanding HBM is to understand why it’s needed in the first place. Current GPUs, including the AMD Radeon R9 290X and the NVIDIA GeForce GTX 980, utilize a memory technology known as GDDR5. This architecture has scaled well over the past several GPU generations but we are starting to enter the world of diminishing returns. Balancing memory performance and power consumption is always a tough battle; just ask ARM about it. On the desktop component side we have much larger power envelopes to work inside but the power curve that GDDR5 is on will soon hit a wall, if you plot it far enough into the future. The result will be either drastically higher power consuming graphics cards or stalling performance improvements of the graphics market – something we have not really seen in its history.
While it’s clearly possible that current and maybe even next generation GPU designs could still have depended on GDDR5 as the memory interface, the move to a different solution is needed for the future; AMD is just making the jump earlier than the rest of the industry.
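The bandwidth half of that argument is easy to put into rough numbers. First-generation HBM is commonly described as a 1024-bit interface per stack running around 1 Gbps per pin, while a typical high-end GDDR5 chip pairs a 32-bit interface with a 7 Gbps data rate. The figures below are illustrative round numbers for the technologies in general, not confirmed specs for Fiji:

```python
# Peak bandwidth per device: (interface width in bits / 8 bits-per-byte) * data rate per pin
def peak_bw_gb_s(bus_width_bits, gbps_per_pin):
    """Peak bandwidth in GB/s for a single memory device."""
    return bus_width_bits * gbps_per_pin / 8

# One 32-bit GDDR5 chip at 7 Gbps per pin (typical high-end config circa 2015)
gddr5_chip = peak_bw_gb_s(32, 7.0)    # 28.0 GB/s per chip
# One first-gen HBM stack: 1024-bit interface at ~1 Gbps per pin
hbm_stack = peak_bw_gb_s(1024, 1.0)   # 128.0 GB/s per stack

print(gddr5_chip, hbm_stack)
```

The takeaway is that HBM reaches much higher per-device bandwidth at a far lower clock, which is where the power savings come from: wide and slow beats narrow and fast on energy per bit.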
Introduction and First Impressions
The ASUS ROG Gladius mouse features sleek styling and customizable lighting effects, but the biggest story is the underlying technology. With socketed Omron switches designed to be easily swapped and an adjustable 6400 dpi optical sensor, this gaming mouse offers a lot on paper. So how does it feel? Let's find out.
There are a few aspects to the way a mouse feels, including the shape, surface material, and overall weight. Beyond the physical properties there is the speed and accuracy of the sensor (which also affects hand movement) and of course the mouse buttons and scroll wheel. Really, there's a lot going on with a modern gaming mouse - a far cry from the "X-Y position indicator" that the inventors had nicknamed "mouse" in the 1960s.
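To put a number like 6400 dpi in perspective, here's a quick back-of-the-envelope sketch (assuming a raw 1:1 count-to-pixel mapping with no OS acceleration; the display width is just an example):

```python
def travel_inches(screen_width_px, dpi):
    """Inches of physical mouse travel needed to cross the screen at a 1:1 count-to-pixel mapping."""
    return screen_width_px / dpi

# Crossing a 2560-pixel-wide display:
print(travel_inches(2560, 6400))  # 0.4 inches at the Gladius' maximum 6400 dpi
print(travel_inches(2560, 800))   # 3.2 inches at a more conservative 800 dpi
```

Higher dpi means less hand movement per pixel, which is why sensor accuracy matters more and more as resolution climbs.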
One of the hallmarks of the ASUS ROG (Republic of Gamers) lineup is the sheer amount of additional features the products tend to have. I use an ROG motherboard in my personal system, and even my micro-ATX board is stuffed with additional functionality (and the box is loaded with accessories). So it came as no surprise to me when I opened the Gladius mouse and began to look it over. Sure, the box contents aren't as numerous as one of the Maximus motherboards, but there's still quite a bit more than I've encountered with a mouse before.
Introduction and Specifications
Displays have been a hot item as of late here at PC Perspective. Today we are looking at the new Acer XB270HU. In short, this is an IPS version of the ASUS ROG Swift. For the long version, it is a 1440P, 144Hz, G-Sync enabled 27 inch display. This is the first G-Sync display released with an IPS panel, which is what makes this release such a big deal. Acer has been pushing hard on the display front, with recent releases of the following variable refresh capable displays:
- XB270H 27in 1080P 144Hz G-Sync
- XB280HK 28in 4K 60Hz G-Sync
- XG270HU 27in 1440P 40-144Hz FreeSync
- XB270HU 27in 1440P 144Hz G-Sync < you are here
The last entry in that list is the subject of today's review, and it should look familiar to those who have been tracking Acer's previous G-Sync display releases:
Here's our video overview of this new display. I encourage you to flip through the review as well, as there are more comparison pictures and information to go along with it.
Or: How to fall asleep at work.
I will be the first to admit that just a couple of weeks ago I had zero need for a gaming chair. But when 4GamerGear.com offered to send us one of the AKracing AK-6014 ergonomic executive units, I agreed. I went out of town for a week and returned to find the chair assembled and already in use by Ken, our go-to video editor and engineer. Of course I had to steal it from him to take part in the "review" process, and the results were great! As I sit here now in my Ikea office chair, writing up this post while Ken sits just feet away tapping away on some edits, I can't help but want to use my authority to take it back.
The price is steep, but the added comfort you get from a chair like the AKracing model we tested is substantial. Every part of the design, based on a racing seat for a car, is built to keep you in place. But instead of preventing lateral movement caused by taking corners, this one is more about keeping your butt in place and your back straight to encourage good posture. The armrests are height adjustable (as is the seat itself, of course) and the back reclines for different desk and resting positions. You can lay it PAST flat for naps, if you're into that kind of thing.
You can find these chairs for sale on Amazon in different color combinations with a current price of $349. It's expensive, I can't deny that. But it looks and feels way cooler than what you are sitting in right now. And aren't you worth it?
Introduction and First Impressions
The Define S from Fractal Design is a mid-tower enclosure based on the company’s excellent Define R5, and this version has a new interior for enhanced cooling support with an innovative approach to storage.
I've mentioned before that the PC enclosure market is crowded with options at every price point, but this can actually be a good thing because of the high level of individual preference this permits. Selecting a case is a multi-faceted thing, and while they all (well, mostly) keep components safely housed, once that need has been met there's a lot more to consider. Let's face it, aesthetics are important since the enclosure is the outward-facing representation of your build (and personal style). Support for your preferred type of cooling, storage, and future expandability are high on the list when selecting a finalist as well, and then there's the thermal/noise performance element to consider. It was Fractal Design's own Define R5 (review here) which offered a balanced approach to these needs, and while not looking especially flashy with understated style and a standard ATX layout, the R5 was an exceptionally well-done effort overall. Now, months later, enter the Define S.
With the Define R5 offering a solid combination of silence, expandability, and build quality, why would Fractal Design create another very similar case right on its heels? It’s all about giving people choice, and that’s something I can certainly stand behind - even when it means further segmenting a market that seems almost impossibly crowded now. And when we dive deeper into the Define S we see what is essentially a companion to the Define R5, and not a replacement. At first glance this might appear to be an identical case, but the interior layout clearly separates the two. In summary, the Define S drops the 5.25” bays found in the R5 and takes a novel approach to HDD support, though it cuts drive capacity from the R5's 8 standard 3.5" trays to just 3 in the process.
Some familiar scenery
If you thought that Intel was going to slow down on its iteration in SFF (small form factor) system design, you were sadly mistaken. It was February when Intel first sent us a NUC based on Broadwell, an iterative upgrade over a couple of generations for this very small 4" x 4" platform, one that proved to be interesting from a technology standpoint but didn't shift expectations of the puck-sized PC business.
Today we are looking at yet another NUC, also using a Broadwell processor, though this time the CPU is running quite a bit faster, with Intel Iris 6100 graphics and a noticeably higher TDP. The Core i7-5557U is still a dual-core / HyperThreaded processor but it increases base and Turbo clocks by wide margins, offering as much as 35% better CPU performance and mainstream gaming performance boosts in the same range. This doesn't mean the NUC 5i7RYH will overtake your custom built desktop but it does make it a lot more palatable for everyday PC users.
Oh, and we have an NVMe PCI Express SSD inside this beast as well. (Waaaaaa??)
Introduction and First Impressions
Corsair's venerable single and double-width liquid CPU coolers based on 120mm fans have been refreshed with added style and a new underlying design. How do they perform? We're about to find out!
There isn't much left to say about all-in-one (AIO) liquid CPU coolers these days, other than that they are often better performing and more expensive than traditional air coolers. A good design gives an AIO liquid cooler a distinct advantage in overclocking headroom, and often in noise output as well (at higher levels of performance). This is not to dismiss air cooling, as massive dual-tower coolers from Noctua and others certainly have the potential to out-perform all but the very best AIO liquid coolers, though not every enclosure will have room for such a cooler. Thus, a good AIO liquid CPU cooler can offer not only space savings but potentially outstanding cooling performance as well. But this is a road we have traveled many times, with cost often the deciding factor even in the face of compelling evidence in favor of what are often very expensive solutions. Has that changed in 2015?
Corsair has done as much as any manufacturer to make AIO liquid CPU cooling mainstream, and their original H100 cemented the AIO's place as an enthusiast-level cooler and not simply a shortcut to traditional water cooling. The 240mm H100 and corresponding 120mm H80 have been refined a couple of times since their introduction, and though other variants such as the H105 and H75 (at the same 240/120mm sizes) have been released and crowded the market further, the H100/H80 series still occupy an important position.
No Longer the Media Center of Attention
Gabe Aul, of Microsoft's Windows Insiders program, has confirmed on Twitter that Windows 10 will drop support for Windows Media Center due to a decline in usage. This is not surprising news, as Microsoft has been deprecating the Media Center application for a while now. In Windows 8.x, the application required the “Pro” SKU of the operating system, and then users needed to install an optional add-on above and beyond that. The Media Center Pack cost $10 over the price of Windows 8.x Pro unless you claimed a free license in the promotional period surrounding Windows 8's launch.
While Media Center has been officially abandoned, its influence on the industry (and vice versa) is an interesting story. For a time, it looked like Microsoft had bigger plans that were killed by outside factors, and other companies seem to be eyeing the money that Microsoft left on the table.
There will be some speculation here.
We could go back to the days of WebTV, but we won't. All you need to know is that Microsoft lusted over the living room for years. Windows owned the office, and PC gaming was taking off with strong titles (and technologies) from Blizzard, Epic, id, Valve, and others. DirectX was beloved by developers, which led to the original Xbox. The console did not get a lot of traction, but Microsoft treated it as a first-generation product that was trying to acquire a foothold late in a console generation. Financially, the first Xbox would cost Microsoft almost four billion dollars more than it made.
At the same time, Microsoft was preparing Windows to enter the living room. Windows was the company's powerhouse, and it acquired significant marketshare wherever it went, thanks to its ease of development and its never-ending supply of OEMs, even if the interface itself was subpar. The first attempt at bringing Windows to the living room was Windows XP Media Center Edition. This spin-off of Windows XP could only be acquired by OEMs to integrate into home theater PCs (HTPCs). The vision was interesting: using OEM competition to rapidly prototype what users actually want in a PC attached to a TV.
This leads us to Windows Vista, which is where Media Center came together while the OS fell apart.
Introduction and Technical Specifications
Courtesy of SUPERMICRO
SUPERMICRO is a vendor you really don't hear much about anymore unless you are dealing with datacenter and server builds. However, they are looking to make a comeback with the enthusiast crowd with their Intel X99 chipset-based offering, the C7X99-OCE motherboard. The board features a blue and black aesthetic, with stylish heat sinks covering all the important areas and support for all Intel LGA2011-3 processors paired with DDR4 memory in up to a quad-channel configuration. Offered at a $329.99 MSRP, the SUPERMICRO C7X99-OCE remains reasonably priced in comparison to Intel X99 offerings from other manufacturers.
Courtesy of SUPERMICRO
SUPERMICRO built the C7X99-OCE to appeal to the enthusiast crowd, powered by a more than adequate 8+4 phase digital power delivery system. The following features were integrated into the board: 10 SATA 3 ports; dual RJ-45 Intel i210-AT Gigabit NICs; four PCI-Express x16 slots; two PCI-Express x4 slots; 2-digit diagnostic LED display; on-board power, reset, CMOS clear, and overclocking buttons; and USB 2.0 and 3.0 port support.
Courtesy of SUPERMICRO
Introduction, Specifications and Packaging
Back in November of last year, we tested the Corsair Neutron XT, which was the first product to feature the Phison PS3110-S10 controller. First spotted at Flash Memory Summit, the S10 sports the following features:
- Quad-core controller - Quad-core CPU dedicates three cores just to managing flash and maintaining performance
- Maximum throughput and I/O - Offers speeds of up to 560 MB/s read and 540 MB/s write and 100K IOPS on read and 90K IOPS on write, saturating the SATA 6Gbps bus
- End-to-end Data Path Protection - Enterprise level CRC/ECC corrects internal soft errors as well as detecting and correcting any errors that may arise between the DRAM, controller, and flash
- SmartECC™ - Reconstructs defective/faulty pages when regular ECC fails
- SmartRefresh™ - Monitors block ECC health status and refreshes blocks periodically to improve data retention
- SmartFlush™ - Minimizes time data spends in cache to ensure data retention in the event of power loss
- Advanced wear-leveling and garbage collection
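The "saturating the SATA 6Gbps bus" claim above is easy to sanity-check: SATA uses 8b/10b encoding, so only 8 of every 10 bits on the wire carry data. A quick sketch of the arithmetic (the encoding overhead is per the SATA spec; protocol overhead above that is ignored here):

```python
def sata_usable_mb_s(line_rate_gbps=6.0):
    """Usable payload ceiling of a SATA link in MB/s, accounting for 8b/10b encoding."""
    data_bits_per_sec = line_rate_gbps * 1e9 * 8 / 10  # 8 data bits per 10 line bits
    return data_bits_per_sec / 8 / 1e6                 # bits -> bytes -> MB

print(sata_usable_mb_s())  # 600.0 MB/s ceiling, so 560 MB/s reads sit close to the limit
```

This is why every high-end SATA SSD tops out around the same 550-560 MB/s mark, and why the industry has been moving to PCIe-based interfaces.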
Corsair was Phison's launch partner, but as that was a while ago, we now have two additional SSD models launching with the S10 at their core:
To the left is the Kingston HyperX Savage. To the right is the Patriot Ignite. They differ in the flash memory types used and available capacities, and the stated performance specs vary slightly among them. Today we'll compare them against the Neutron XT as well as a selection of other SATA SSDs.
DirectX 12 Has No More Secrets
The DirectX 12 API is finalized and the last of its features are known. Before the BUILD conference, the list consisted of Conservative Rasterization, Rasterizer Ordered Views, Typed UAV Load, Volume Tiled Resources, and a new Tiled Resources revision for non-volumetric content. When the GeForce GTX 980 launched, NVIDIA claimed it would be compatible with DirectX 12 features. Enthusiasts were skeptical because Microsoft had not officially finalized the spec at the time.
Last week, Microsoft announced the last feature of the graphics API: Multiadapter.
We already knew that Multiadapter existed, at least to some extent. It is the part of the specification that allows developers to address multiple graphics adapters and split tasks between them. In DirectX 11 and earlier, secondary GPUs would remain idle unless the graphics driver sprinkled some magic fairy dust on them with SLI, CrossFire, or Hybrid CrossFire. The only other way to access this dormant hardware was by spinning up an OpenCL (or similar compute API) context on the side.
Don't be afraid of PCIe or NVMe
In very early April, Intel put a shot across the bow of the storage world with the release of the SSD 750 Series of storage devices. Using the PCI Express bus but taking advantage of the new NVMe (Non-Volatile Memory Express) protocol, it drastically upgrades the capabilities of storage within modern PC platforms. In Allyn's review, for example, we saw read data transfer rates cross into the 2.6 GB/s range in sequential workloads and write rates over 1.2 GB/s sequentially. Even more impressive is the random I/O performance where the SSD 750 is literally 2x the speed of previous PCIe SSD options.
A couple of weeks later we posted a story looking into the compatibility of the SSD 750 with different motherboards and chipsets. We found that booting from the SSD 750 Series products is indeed going to require specific motherboards and platforms simply due to the "new-ness" of the NVMe protocol. Officially, Intel is only going to support Z97 and X99 chipsets today but obviously you can expect all future chipsets to have proper NVMe integration. We did find a couple of outliers that allowed for bootability with the SSD 750, but I wouldn't count on it.
Assuming you have a Z97/X99 motherboard that properly supports NVMe drives (something ASUS, MSI, and Gigabyte seem to be on top of), what are the steps and processes necessary to get your system up and running on the Intel SSD 750? As it turns out, it's incredibly simple.
Make sure you are running the latest BIOS/UEFI and that NVMe is enabled. The screenshot below shows the ASUS X99-Deluxe motherboard we used during testing properly recognizing the drive. There was no specific option to ENABLE NVMe here, though we have seen instances where that is required.
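As an aside for Linux users (not covered by the UEFI-focused steps above): NVMe controllers identify themselves with PCI class code 0108, so you can confirm the drive is at least visible on the bus by checking `lspci -nn` output. A minimal sketch of that check follows; the sample lspci line is illustrative rather than captured from our test system:

```python
import re

# NVMe controllers report PCI class 0x0108 (mass storage, non-volatile memory).
NVME_CLASS = "0108"

def find_nvme_controllers(lspci_nn_output):
    """Return the lspci -nn lines whose bracketed class code is 0108 (NVMe)."""
    pattern = re.compile(r"\[%s\]" % NVME_CLASS)
    return [line for line in lspci_nn_output.splitlines() if pattern.search(line)]

# Illustrative sample of `lspci -nn` output, not taken from real hardware.
sample = """\
00:1f.2 SATA controller [0106]: Intel Corporation Device [8086:8c02]
02:00.0 Non-Volatile memory controller [0108]: Intel Corporation Device [8086:0953]"""

controllers = find_nvme_controllers(sample)
```

Seeing the controller on the bus only proves the hardware link is up; booting from it still depends on the NVMe-aware UEFI support discussed above.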
Some Fresh Hope for 2016
EDIT 2015-05-07: A day after the AMD analyst meeting we now know that the roadmaps delivered here are not legitimate. While some of the information is likely correct on the roadmaps, they were not leaked by AMD. There is no FM3 socket, rather AMD is going with AM4. AMD will be providing more information throughout this quarter about their roadmaps, but for now take all of this information as "not legit".
SH SOTN has some eagle eyes and spotted the latest leaked roadmap for AMD. These roadmaps cover both mobile and desktop, from 2015 through 2016. There are obviously quite a few interesting tidbits of information here.
On the mobility roadmap we see the upcoming release of Carrizo, which we have been talking about since before CES. This will be the very first HSA 1.0 compliant part to hit the market, and AMD has done some really interesting things with the design in terms of performance, power efficiency, and die size optimizations. Carrizo will span the market from 15 watts to 35 watts TDP. This is a mobile only part, but indications point to it being pretty competent overall. This is a true SOC that will support all traditional I/O functions of older standalone southbridges. Most believe that this part will be manufactured by GLOBALFOUNDRIES on their 28 nm HKMG process that is more tuned to AMD's APU needs.
Carrizo-L will be based on the Puma+ architecture and will go from 10 watts to 15 watts TDP. This will use the same FP4 BGA connection as the big Carrizo APU. This should make these parts more palatable for OEMs as they do not have to differentiate the motherboard infrastructure. Making things easier for OEMs will give more reasons for these folks to offer products based on Carrizo and Carrizo-L APUs. The other big reason will be the GCN graphics compute units. Puma+ is a very solid processor architecture for low power products, but these parts are still limited to the older 28 nm HKMG process from TSMC.
One interesting addition here is that AMD will be introducing their "Amur" APU for the low power and ultra-low power markets. It will comprise four Cortex-A57 CPU cores combined with AMD's GCN graphics units. This will be the first time we see this combination, and the first time AMD has integrated with ARM since ATI spun off their mobile graphics to Qualcomm under the "Adreno" branding (an anagram of "Radeon"). What is most interesting here is that this APU will be a 20 nm part most likely fabricated by TSMC. That is not to say Samsung or GLOBALFOUNDRIES could not produce it, but those companies are expending their energy on the 14 nm FinFET process that will be their bread and butter for years to come. This will be a welcome addition to the mobile market (tablets and handhelds) and could be a nice profit center for AMD if they are able to release this in a timely manner.
2016 is when things get very interesting. The Zen x86 design will dominate the upper 2/3 of the roadmap. I had talked about Zen when we had some new diagram leaks yesterday, but now we get to see the first potential products based off of this architecture. In mobile it will span from 5 watts to 35 watts TDP. The performance and mainstream offerings will be the "Bristol Ridge" APU which will feature 4 Zen cores (or one Zen module) combined with the next gen GCN architecture. This will be a 14nm part, and the assumption is that it will be GLOBALFOUNDRIES using 14nm FinFET LPP (Low Power Plus) that will be more tuned for larger APUs. This will also be a full SOC.
The next APU will be codenamed "Basilisk", which will span the 5 watt to 15 watt range. It will comprise 2 Zen cores (1/2 of a Zen module) and likely feature 2 to 4 MB of L3 cache, depending on power requirements. This looks to be the first Skybridge set of APUs that will share the same infrastructure as the ARM based Amur SOC. FT4 BGA is the basis for both the 2015 Amur and 2016 Basilisk SOCs.
Finally we have the first iteration of AMD's ground-up implementation of ARM's ARMv8-A ISA. The "Styx" APU features the new K12 CPU cores that AMD has designed from scratch. It too will feature the next generation GCN units as well as share the same FT4 BGA connection. Many are anxiously watching this space to see if AMD can build a better mousetrap when it comes to licensing the ARM ISA (as Qualcomm, NVIDIA, and others have done).
2015 shows no difference in the performance desktop space, as it is still serviced by the now venerable Piledriver based FX parts on AM3+. The only change we expect to see here is that there will be a handful of new motherboard offerings from the usual suspects that will include the new USB 3.1 functionality derived from a 3rd party controller.
Mainstream and Performance will utilize the upcoming Godavari APUs. These are power and speed optimized APUs that are still based on the current Kaveri design. They look to be a simple refresh/rebadge with a slight performance tweak. Not exciting, but it needs to happen for OEMs.
Low power will continue to be addressed by Beema based APUs. These are regular Puma based cores (not Puma+). AMD likely does not have the numbers to justify a new product in this rather small market.
2016 is when things get interesting again. We see the release of the FM3 socket (final proof that AM3+ is dead) that will house the latest Zen based APUs. At the top end we see "Summit Ridge", which will be composed of 8 Zen cores (or 2 Zen modules). This will have 4 MB of L2 cache and 16 MB of L3 cache if our other leaks are correct. These will be manufactured on 14nm FinFET LPE (the more appropriate process for larger, more performance oriented parts). These will not be SOCs. We can expect these to be the basis of new Opterons as well, but there is obviously no confirmation of that on these particular slides. This will be the first new product in some years from AMD that has the chance to compete with higher end desktop SKUs from Intel.
From there we have the lower power Bristol Ridge and Basilisk APUs that we already covered in the mobile discussion. These look to be significant upgrades from the current Kaveri (and upcoming Godavari) APUs. New graphics cores, new CPU cores, and new SOC implementations where necessary.
AMD will really be shaking up the game in 2016. At the very least they will have proven that they can still change course and release higher end (and hopefully competitive) products. AMD has enough revenue and cash on hand to survive through 2016 and 2017 at the rate they are going now. We can only hope that this wide-scale change will allow AMD to make some significant inroads with OEMs on all levels. Otherwise Intel is free to do what it wants, at whatever prices it wants, across multiple markets.