Subject: Editorial | May 29, 2015 - 12:37 PM | Ryan Shrout
Tagged: SSD 750, PCI Express, NVMe, Intel, giveaway, contest, 750 series
PC Perspective and Intel are partnering to offer up a giveaway with some pretty impressive swag. Surely by now you have read all about the new Intel SSD 750 Series of products, a new class of solid state drive that combines four lanes of PCI Express 3.0 and a new protocol called NVM Express (NVMe) for impressive bandwidth throughput. In Allyn's review of the SSD in April he called it "the obvious choice for consumers who demand the most from their storage" and gave it a PC Perspective Editor's Choice Award!
Thanks to our friends at Intel we are going to be handing out a pair of the 400GB add-in card models to loyal PC Perspective readers and viewers. How can you enter? The rules are dead simple:
- Fill out the contest entry form below to find multiple entry methods including reading our review, answering a question about Intel SSD 750 Series specs or following us on Twitter. You can fill out one or all of the methods - the more you do the better your chances!
- Leave a comment on the news post below thanking Intel for sponsoring PC Perspective and for supplying this hardware for us to give to you!
- This is a global contest - so feel free to enter from anywhere in the world!
- Contest will close on June 2nd, 2015.
Our most sincere thanks to Intel for bringing this contest to PC Perspective's readers and fans. Good luck to everyone (except Josh)!
Sponsored by Intel
|Capacity||Sequential 128KB Read (up to MB/s)||Sequential 128KB Write (up to MB/s)||Random 4KB Read (up to IOPS)||Random 4KB Write (up to IOPS)||Form Factor||Interface|
|400 GB||2,200||900||430,000||230,000||2.5-inch x 15mm||PCI Express Gen3 x4|
|1.2 TB||2,400||1,200||440,000||290,000||2.5-inch x 15mm||PCI Express Gen3 x4|
|400 GB||2,200||900||430,000||230,000||Half-height half-length (HHHL) Add-in Card||PCI Express Gen3 x4|
|1.2 TB||2,400||1,200||440,000||290,000||Half-height half-length (HHHL) Add-in Card||PCI Express Gen3 x4|
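As a sanity check on the table above, the random 4KB IOPS ratings can be converted into approximate raw throughput. This rough calculation ignores protocol overhead, so treat the results as upper bounds:

```python
# Convert random 4KB IOPS ratings into approximate throughput.
# Figures come from the spec table above; the conversion ignores
# protocol overhead, so the results are upper bounds.

KB = 1024  # bytes per kilobyte

def iops_to_mbps(iops, block_size_kb=4):
    """Return approximate throughput in MB/s for a given IOPS rating."""
    return iops * block_size_kb * KB / 1_000_000

# 400 GB model: 430,000 random 4KB read IOPS
print(round(iops_to_mbps(430_000)))  # ~1761 MB/s

# 1.2 TB model: 440,000 random 4KB read IOPS
print(round(iops_to_mbps(440_000)))  # ~1802 MB/s
```

Even at small random reads, the rated IOPS work out to well over 1.7 GB/s of data movement, which is why a SATA interface (roughly 550 MB/s) could never keep up and PCIe 3.0 x4 is the natural fit.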
Experience the future of storage performance for desktop client and workstation users with the Intel® SSD 750 Series. The Intel SSD 750 Series delivers uncompromised performance by utilizing NVM Express* over four lanes of PCIe* 3.0.
With both Add-in Card and 2.5-inch form factors, the Intel SSD 750 Series eases migration from SATA to PCIe 3.0 without power or thermal limitations on performance. The SSD can now deliver the ultimate in performance in a variety of system form factors and configurations.
When NVIDIA launched the GeForce GTX Titan X card only back in March of this year, I knew immediately that the GTX 980 Ti would be close behind. The Titan X was so different from the GTX 980 when it came to pricing and memory capacity (12GB, really??) that NVIDIA had set up the perfect gap with which to place the newly minted GTX 980 Ti. Today we get to take the wraps off of that new graphics card and I think you'll be impressed with what you find, especially when you compare its value to the Titan X.
Based on the same Maxwell architecture and GM200 GPU, with some minor changes to GPU core count, memory size and boost speeds, the GTX 980 Ti finds itself in a unique spot in the GeForce lineup. Performance-wise it's basically identical to the GTX Titan X in real-world game testing, yet it is priced $350 less than that 12GB behemoth. Couple that with a modest $50 price drop on the GTX 980 and you have all the markers of an enthusiast graphics card that will sell as well as any we have seen in recent generations.
The devil is in all the other details, of course. AMD has its own plans for this summer, but the Radeon R9 290X is still sitting there at a measly $320, undercutting the GTX 980 Ti by more than half. NVIDIA seems to be pricing its own GPUs as if it isn't even concerned with what AMD and the Radeon brand are doing. That could be dangerous if it goes on too long, but for today, can the R9 290X and its aging Hawaii XT GPU put up enough of a fight to make the value case to gamers on the fence?
Will the GeForce GTX 980 Ti be the next high-end GPU to make a splash in the market, or will it make a thud at the bottom of the GPU gene pool? Let's dive into it, shall we?
Subject: Displays | June 1, 2015 - 12:21 PM | Ryan Shrout
Tagged: nvidia, gsync, g-sync, asus, 3800R
At Computex this week ASUS is showing off a prototype of the new ROG 3800R monitor, a 34-in curved display with a 3440x1440 resolution and G-Sync variable refresh rate capability. ASUS claims on its PCDIY blog that the 21:9 aspect ratio was one of the "most requested" specifications for a new ROG monitor, followed by a curved design. The result is a gorgeous display:
Here's a list of specifications:
- 34” optimal dimension for QHD resolutions with 3440×1440 resolution
- 21:9 ultra-wide aspect ratio for increased immersion and improved horizontal workflow
- IPS based panel for superior color reproduction, black levels and reduction of color shifting
- NVIDIA G-SYNC equipped offering smooth, fluid and tear free gaming with improved motion clarity. Additionally equipped with ULMB operating mode for outstanding motion clarity.
- Frameless design for seamless surround gaming
- ASUS exclusive GamePlus feature and Turbo Key
- Ergonomic adjustment including tilt, swivel and height adjustment
Hot damn, we want one of these and we want it yesterday! There is no mention of the refresh rate of the display here, though we did see information from NVIDIA that ASUS was planning a 34x14 60 Hz screen - but we are not sure this is the same model being shown. And the inclusion of ULMB would normally indicate a refresh rate above 60-75 Hz...
Another interesting note: this monitor appears to include both DisplayPort and HDMI connectivity.
This 34-inch 3800R curved display features wide-viewing angles, a 3440 x 1440 native resolution, and 21:9 aspect ratio. It features NVIDIA® G-SYNC™ display technology to deliver smooth, lag-free visuals. G-SYNC synchronizes the display’s refresh rate to the GPU in any GeForce® GTX™-powered PC to eliminate screen tearing and minimizing display stutter and input lag. This results in sharper, more vibrant images; and more fluid and responsive gameplay. It has extensive connectivity options that include DisplayPort and HDMI.
The above information came from ASUS just a few short hours ago, so you can assume that it is accurate. Could this be the start of panels that integrate dual scalers (a G-Sync module plus something else) to offer more connectivity, or has the G-Sync module been updated to support more inputs? We'll find out!
Subject: Processors | May 28, 2015 - 03:44 PM | Scott Michaud
Tagged: Intel, Skylake, skylake-s, haswell, devil's canyon
For a while, it was unclear whether we would see Broadwell on the desktop. With the recently leaked benchmarks of the Intel Core i7-6700K, it seems all-but-certain that Intel will skip it and go straight to Skylake. Compared to Devil's Canyon, the Haswell-based Core i7-4790K, the Skylake-S Core i7-6700K has the same base clock (4.0 GHz) and same full-processor Turbo clock (4.2 GHz). Pretty much every improvement that you see is pure performance per clock (IPC).
Image Credit: CPU Monkey
In multi-threaded applications the Core i7-6700K tends to show about a 9% increase, while single-threaded loads see about a 4% increase. Part of this might be the slightly lower single-core Turbo clock, which is said to be 4.2 GHz instead of 4.4 GHz. There might also be some increased efficiency in HyperThreading or cache access -- I don't know -- but it would be interesting to see.
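Those two numbers are actually consistent with each other once you normalize for clock speed. A quick back-of-the-envelope check, using the rumored (not confirmed) clocks quoted above:

```python
# Back-of-the-envelope IPC estimate from the leaked benchmarks.
# Clocks are the rumored values quoted above, not confirmed specs.

turbo_6700k = 4.2  # GHz, rumored single-core Turbo for the i7-6700K
turbo_4790k = 4.4  # GHz, single-core Turbo for the i7-4790K

measured_speedup = 1.04  # ~4% faster in single-threaded tests

# Perf = IPC * clock, so implied IPC ratio = speedup / clock ratio
clock_ratio = turbo_6700k / turbo_4790k
implied_ipc_gain = measured_speedup / clock_ratio - 1

print(f"{implied_ipc_gain:.1%}")  # ~9.0%
```

That implied ~9% IPC uplift lines up nicely with the multi-threaded results, where both chips run the same 4.2 GHz all-core Turbo and the clock difference drops out.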
I should note that we know nothing about the GPU. In fact, CPU Monkey fails to list a GPU at all. Intel has expressed interest in bringing Iris Pro-class graphics to high-end mainstream desktop processors. For someone who is interested in GPU compute, especially with Explicit Unlinked MultiAdapter upcoming in DirectX 12, it would be nice to see GPUs be ubiquitous and always enabled. The chip is expected to have the new GT4e graphics with 72 execution units and either 64 or 128MB of eDRAM. If clocks are equivalent, this could translate to well over a teraflop (~1.2 TFLOPS) of compute performance in addition to discrete graphics. In discrete graphics terms, that would be nearly equivalent to an NVIDIA GTX 560 Ti.
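The ~1.2 TFLOP figure follows from a simple peak-throughput formula. The inputs below are assumptions for illustration: Intel's Gen graphics EUs are usually counted at 16 single-precision FLOPs per clock (two SIMD-4 FMA pipes, with FMA counted as 2 ops), and the ~1.05 GHz clock is a guess in line with current Iris Pro parts:

```python
# Peak single-precision compute estimate for a hypothetical 72-EU GT4e part.
# 16 FLOPs/clock/EU and the 1.05 GHz clock are assumptions for illustration,
# not confirmed specifications.

eus = 72
flops_per_clock_per_eu = 16   # 2 SIMD-4 FMA pipes * 4 lanes * 2 ops (FMA)
clock_hz = 1.05e9             # assumed, in line with current Iris Pro parts

peak_tflops = eus * flops_per_clock_per_eu * clock_hz / 1e12
print(f"{peak_tflops:.2f} TFLOPS")  # ~1.21 TFLOPS
```
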
We are expecting to see the Core i7-6700K launch in Q3 of this year. We'll see.
Announced at last year's Google I/O event in June 2014, Android TV is a platform developed by Google, running Android 5.0 and higher, that aims to create an interactive experience for the TV. This platform can be built into a TV directly as well as into set-top style boxes, like the NVIDIA SHIELD we are looking at today. The idea is to bring the breadth of apps and content to the TV through the Android operating system in a way that is both convenient and intuitive.
NVIDIA announced SHIELD back in March at GDC as the first product to use the company’s latest Tegra processor, the X1. This SoC combines an 8-core big.LITTLE ARM processor design with a 256-core implementation of the NVIDIA Maxwell GPU architecture, providing GPU performance previously unseen in an Android device. I have already spent some time with the NVIDIA SHIELD at various events and the promise was clearly there to make it a leading option for Android TV adoption, but obviously there were questions to be answered.
Today’s article will focus on my early impressions with the NVIDIA SHIELD, having used it both in the office and at home for a handful of days. As you’ll see during the discussion there are still some things to be ironed out, some functionality that needs to be added before SHIELD and Android TV can really be called a must-buy product. But I do think it will get there.
And though this review will focus on the NVIDIA SHIELD, it's impossible not to tie the success of SHIELD to the success of Google's Android TV. The dominant use case for SHIELD is as a media playback device, with the gaming functionality as a really cool side project for enthusiasts and gamers looking for another outlet. For SHIELD to succeed, Google needs to prove that Android TV can improve upon other integrated smart TV platforms as well as other set-top box platforms like Boxee, Roku and even the upcoming Apple TV refresh.
But first, let’s get an overview of the NVIDIA SHIELD device, pricing and specifications, before diving into my experiences with the platform as a whole.
Subject: Displays, Mobile | May 31, 2015 - 06:00 PM | Ryan Shrout
Tagged: nvidia, notebooks, msi, mobile, gsync, g-sync, asus
If you remember back to January of this year, Allyn and I posted an article that confirmed the existence of a mobile variant of G-Sync thanks to a leaked driver and an ASUS G751 notebook. Rumors and speculation floated around the Internet ether for a few days but we eventually got official word from NVIDIA that G-Sync for notebooks was a real thing and that it would launch "soon." Well, that day is finally here with the beginning of Computex.
G-Sync for notebooks has no clever branding, no "G-Sync Mobile" or anything like that, so discussing it will be a bit more difficult since the technologies are different. Going forward NVIDIA claims that any gaming notebook using NVIDIA GeForce GPUs will be a G-Sync notebook and will support all of the goodness that variable refresh rate gaming provides. This is fantastic news as notebook gaming is often at lower frame rates than you would find on a desktop PC because of lower powered hardware yet comparable (1080p, 1440p) resolution displays.
Of course, as we discovered in our first look at G-Sync for notebooks back in January, the much debated G-Sync module is not required and will not be present on notebooks featuring the variable refresh technology. So what gives? We went over some of this before, but it deserves to be detailed again.
NVIDIA uses the diagram above to demonstrate the headaches presented by the monitor and GPU communication path before G-Sync was released. You had three different components: the GPU, the monitor scaler and the monitor panel, all of which needed to work together if VRR was going to become a high quality addition to the gaming ecosystem.
NVIDIA's answer was to take over all aspects of the pathway for pixels from the GPU to the eyeball, creating the G-Sync module and helping OEMs to hand pick the best panels that would work with VRR technology. This helped NVIDIA make sure it could do things to improve the user experience such as implementing an algorithmic low-frame-rate, frame-doubling capability to maintain smooth and tear-free gaming at frame rates under the panel's physical limitations. It also allows them to tune the G-Sync module to the specific panel to help with ghosting and to implement variable overdrive logic.
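That low-frame-rate frame doubling is conceptually simple: when the game's frame rate drops below the panel's minimum refresh, each frame is scanned out multiple times so the effective refresh stays inside the panel's supported window. A rough sketch of the idea, using hypothetical panel limits (the real module's logic is surely more sophisticated):

```python
import math

# Sketch of low-frame-rate frame doubling for a VRR panel.
# Panel limits are hypothetical examples; the actual G-Sync module's
# behavior is more sophisticated than this.

PANEL_MIN_HZ = 30
PANEL_MAX_HZ = 144

def effective_refresh(fps):
    """Repeat each frame enough times to stay inside the panel's VRR window.

    Returns (effective refresh in Hz, times each frame is scanned out).
    """
    if fps >= PANEL_MIN_HZ:
        return fps, 1  # frame rate is in-window; show each frame once
    repeats = math.ceil(PANEL_MIN_HZ / fps)
    return fps * repeats, repeats

print(effective_refresh(45))  # (45, 1) - in window, no doubling needed
print(effective_refresh(24))  # (48, 2) - each frame shown twice
print(effective_refresh(13))  # (39, 3) - each frame shown three times
```
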
All of this is required because of the incredible amount of variability in the monitor and panel markets today.
But with notebooks, NVIDIA argues, there is no variability at all to deal with. The notebook OEM gets to handpick the panel, and the GPU directly interfaces with the screen instead of passing through a scaler chip. (Note that some desktop monitors like the ever popular Dell 3007WFP did this as well.) There is no other piece of logic in the way attempting to enforce a fixed refresh rate. Because of that direct connection, the GPU is able to control the data passing between it and the display without any other logic working in the middle. This makes implementing VRR technology much simpler and helps with quality control because NVIDIA can validate the panels with the OEMs.
As I mentioned above, going forward, all new notebooks using GTX graphics will be G-Sync notebooks and that should solidify NVIDIA's dominance in the mobile gaming market. NVIDIA will be picking the panels, and tuning the driver for them specifically, to implement anti-ghosting technology (like what exists on the G-Sync module today) and low frame rate doubling. NVIDIA also claims that the world's first 75 Hz notebook panels will ship with GeForce GTX and will be G-Sync enabled this summer - something I am definitely looking forward to trying out myself.
Though it wasn't mentioned, I am hopeful that NVIDIA will continue to allow users the ability to disable V-Sync at frame rates above the maximum refresh of these notebook panels. With most of them limited to 60 Hz (but this applies to 75 Hz as well) the most demanding gamers are going to want that same promise of minimal latency.
At Computex we'll see a handful of models announced with G-Sync up and running. It should be no surprise of course to see the ASUS G751 with the GeForce GTX 980M GPU on this list as it was the model we used in our leaked driver testing back in January. MSI will also launch the GT72 G with a 1080p G-Sync ready display and GTX 980M/970M GPU option. Gigabyte will have a pair of notebooks: the Aorus X7 Pro-SYNC with GTX 970M SLI and a 1080p screen as well as the Aorus X5 with a pair of GTX 965M in SLI and a 3K resolution (2560x1440) screen.
This move is great for gamers and I am eager to see what the resulting experience is for users that pick up these machines. I have long been known as a proponent of variable refresh displays and getting access to that technology on your notebook is a victory for NVIDIA's team.
LG Australia published a product page for the LG 27MU67 monitor, whose existence the rest of the company doesn't seem to acknowledge. It is still online, even after three days in which someone could have pulled the plug. This one is interesting for a variety of reasons: it's 4K, it's IPS, and it supports AMD FreeSync. It is also relatively cheap for that combination, being listed at $799 AUD RRP.
Some websites have converted that to ~$610 to $620 USD, but it might even be less than that. Australian prices are often listed with their federal tax rolled in, which would yield a price that is inflated about 10%. It is possible, though maybe wishful thinking, that this monitor could retail in the ~$500 to $550 price range for the United States (if it even comes to North America). Again, this is a 4K, IPS, FreeSync panel.
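The arithmetic behind that guess is straightforward; the exchange rate below is an assumption, roughly where AUD/USD sat in mid-2015:

```python
# Rough US price estimate from the Australian RRP.
# Australian RRPs include 10% GST; the exchange rate is an assumption
# (~0.77 USD per AUD, roughly where the pair sat in mid-2015).

rrp_aud = 799.0
GST = 0.10
aud_to_usd = 0.77  # assumed exchange rate

ex_tax_aud = rrp_aud / (1 + GST)       # strip the rolled-in federal tax
estimate_usd = ex_tax_aud * aud_to_usd

print(round(ex_tax_aud))    # ~726 AUD before tax
print(round(estimate_usd))  # ~559 USD
```

That lands right around the top of the hoped-for $500 to $550 range, before any region-specific pricing decisions LG might make.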
Very little is posted on LG's website, and thus it is hard to tell how good of an IPS panel this is. It is listed as 99% sRGB coverage, which is good for typical video but not the best if you are working on printed content, such as magazine illustrations. On the other hand, this is a gaming panel, not a professional one. Update (May 29, 2015): It also has 10-bit (per channel) color. It sounds like it is true 10-bit, not just a look-up table, but I should note that it doesn't explicitly say that.
Again, pricing and availability is up in the air, because this is not an official announcement. It is listed to launch in Australia for $799 AUD, though.
Subject: Displays | May 31, 2015 - 06:00 PM | Ryan Shrout
Tagged: nvidia, gsync, g-sync, computex 2015, computex
In conjunction with the release of the new GeForce GTX 980 Ti graphics card today, NVIDIA is making a handful of other announcements around the GeForce brand. The most dramatic of the announcements center around the company's variable refresh monitor technology called G-Sync. I assume that any of you reading this are already intimately familiar with what G-Sync is, but if not, check out this story that dives into how it compares with AMD's rival tech called FreeSync.
First, NVIDIA is announcing a set of seven new G-Sync ready monitors that will be available this summer and fall from ASUS and Acer.
Many of these displays offer configurations of panels we haven't yet seen in a G-Sync display. Take the Acer X34 for example: this 34-in monitor falls into the 21:9 aspect ratio form factor, with a curved screen and a 3440x1440 resolution. The refresh rate will peak at 75 Hz while also offering the color consistency and viewing angles of an IPS screen. This is the first 21:9, the first 34x14 and the first curved monitor to support G-Sync, and with a 75 Hz maximum refresh it should provide a solid gaming experience. ASUS has a similar model, the PG34Q, though it peaks at a refresh rate of 60 Hz.
ASUS will be updating the wildly popular ROG Swift PG278Q display with the PG279Q, another 27-in monitor with a 2560x1440 resolution. Only this time it will run at 144 Hz with an IPS screen rather than TN, again resulting in improved color clarity and viewing angles, and lower eye strain.
Those of you on the lookout for 4K panels with G-Sync support will be happy to find IPS iterations of that configuration, though they will still peak at a 60 Hz refresh rate: as much a limitation of DisplayPort as anything else.
Another technology addition for G-Sync with the 352-series (353-series, sorry!) driver released today is support for windowed mode variable refresh.
By working some magic with the DWM (Desktop Window Manager), NVIDIA was able to allow for VRR to operate without requiring a game to be in full screen mode. For gamers that like to play windowed or borderless windowed while using secondary or large displays for other side activities, this is going to be a great addition to the G-Sync portfolio.
Finally, after much harassment and public shaming, NVIDIA is finally going to allow users the choice to enable or disable V-Sync when your game render rate exceeds the maximum refresh rate of the G-Sync monitor it is attached to.
One of the complaints about G-Sync has been that it is restrictive on the high side of the VRR window for its monitors. While FreeSync allowed you to selectively enable or disable V-Sync when your frame rate goes above the maximum refresh rate, G-Sync was forcing users into a V-Sync enabled state. The reasoning from NVIDIA was that allowing for horizontal tearing of any kind with G-Sync enabled would ruin the experience and/or damage the technology's reputation. But now, while the default will still be to keep V-Sync on, gamers will be able to manually set the V-Sync mode to off with a G-Sync monitor.
Why is this useful? Many gamers believe that a drawback to V-Sync enabled gaming is the added latency of waiting for a monitor to refresh before drawing a frame that might be ready to be shown to the user immediately. G-Sync fixes this at frame rates from 1 FPS up to the maximum refresh of the G-Sync monitor (144 Hz, 75 Hz, 60 Hz), but now, rather than being stuck with tear-free yet latency-added V-Sync when gaming over the max refresh, you'll be able to play with tearing on the screen but lower input latency. This could be especially useful for gamers using 60 Hz G-Sync monitors with 4K resolutions.
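The latency cost is easy to quantify: with V-Sync on, a frame that finishes rendering just after a refresh begins can wait up to one full refresh interval before it is displayed. A quick worst-case calculation (ignoring render-queue depth and scan-out time):

```python
# Worst-case extra display latency added by V-Sync when the game's
# frame rate exceeds the monitor's maximum refresh. Ignores render
# queue depth and scan-out time, so these are upper-bound figures.

def max_vsync_wait_ms(refresh_hz):
    """A just-missed frame can wait up to one full refresh interval."""
    return 1000.0 / refresh_hz

for hz in (60, 75, 144):
    print(f"{hz} Hz: up to {max_vsync_wait_ms(hz):.1f} ms")
# 60 Hz: up to 16.7 ms
# 75 Hz: up to 13.3 ms
# 144 Hz: up to 6.9 ms
```

The penalty is largest on 60 Hz panels, which is exactly why the new V-Sync-off option matters most for 4K G-Sync monitors.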
Oh, actually one more thing: you'll now be able to enable ULMB (ultra low motion blur) mode in the driver as well without requiring entry into your display's OSD.
NVIDIA is also officially announcing G-Sync for notebooks at Computex. More on that in this story!
Big Things, Small Packages
Sapphire isn’t a brand we have covered in a while, so it is nice to see a new and interesting product drop on our doorstep. Sapphire was a relative unknown until around the time of the Radeon 9700 Pro's release. This was when ATI decided that it did not want to be so vertically integrated and allowed other companies to start buying its chips and making their own cards. This provided a bit of stability for ATI's pricing, as the company no longer had to worry about a volatile component market that could cause its margins to plummet. By selling just the chips to partners, ATI could more adequately control margins on its own product while allowing partners to make their own deals and component choices for the finished card.
ATI had very limited graphics card production of their own, so they often farmed out production to second sources. One of these sources eventually turned into Sapphire. When ATI finally allowed other partners to produce and brand their own ATI based products, Sapphire already had a leg up on the competition by being a large producer of ATI products. They soon controlled a good portion of the marketplace through their contacts, pricing, and close relationship with ATI.
Since then ATI has been bought up by AMD, and no ATI branded cards are produced any longer. Going vertical when it comes to producing both chips and video cards was obviously a bad idea; we can look back at 3dfx and its attempt at vertical integration and how that ended for the company. AMD still produces an initial reference version of its cards and coolers, but allows partners to sell the “sticker” version and then develop their own designs. This has worked very well for both NVIDIA and AMD, and it has allowed their partners to further differentiate their products from the competition.
Sapphire usually does a bang-up job packaging its graphics cards. Oh look, a mousepad!
Sapphire is not as big of a player as they used to be, but they are still one of the primary partners of AMD. It would not surprise me in the least if they still produced the reference designs for AMD and then distributed those products to other partners. Sapphire is known for building very good quality cards, and their cooling solutions have been well received as well. The company does have some stiff competition from the likes of Asus, MSI, and others in this particular market. Unlike those companies, Sapphire obviously does not make any NVIDIA based boards. This has been a blessing and a curse, depending on what the cycle looks like between AMD and NVIDIA and who has dominance in any particular marketplace.
Subject: General Tech | June 2, 2015 - 03:35 PM | Jeremy Hellstrom
Tagged: microsoft, windows 10, acer, Lenovo
If you are running Win7 or a flavour of Win8 you have probably seen the pop-up nagging you to reserve your copy of Windows 10 before the official launch on July 29th. That deadline is a little misleading: if it has you concerned, you actually have until July 29, 2016 to use your free upgrade. What the reminder does do is give Microsoft a chance at a large initial adoption rate for the brand new OS, which is rather necessary to restore confidence in the company as an investment opportunity after the lukewarm Windows 8 adoption numbers. One question about licensing remains, to which we are still awaiting an answer: what does the future after that date hold for those who like to reinstall OSes on a regular basis? If you only possess a Windows 7 serial number and take advantage of the free upgrade before the deadline, will you get a way to install a fresh copy of Windows 10?
If you do have to buy a new license for Windows 10, the prices will remain as they were for Windows 8: Windows 10 Home will retail for $119, Windows 10 Pro for $199, with an upgrade from Home to Pro costing you $99. If you want control over when your updates are installed you might want to get some friends together to invest in a volume licensing agreement, as patches are now pushed out and installed immediately. As we have mentioned, Windows Media Centre will disappear, as will any Windows 7 desktop gadgets you might have installed along the way, but one mildly surprising omission that The Inquirer spotted concerns DVD playback, which will either be an extra-cost feature or else be handled by superior open source players. If for some reason you still use floppy drives, the new Windows will not natively support them, but as with the previous version you should be able to locate drivers.
As for hardware, DigiTimes has heard word of very low priced Broadwell-based laptops being released by Lenovo and Acer. Acer will be releasing a pair of models, an 11.6" at $169 and a 14" at $199. Lenovo's will be more expensive at $249, but it will be a convertible Yoga machine, which explains at least part of the premium pricing. It will be interesting to see how these compete with existing products on the market, including Microsoft's own Surface.
"The Windows 10 notebooks are an 11.6-inch notebook (US$169) and a 14-inch clamshell-type notebook (US$199) from Acer and a 14-inch convertible Yoga notebook (US$249) from Lenovo. These devices will be manufactured by Inventec, and target mainly against Chromebooks."
Here is some more Tech News from around the web:
- Intel Compute Stick Performance Surprises Under Ubuntu Linux @ Phoronix
- Intel's Broadwell goes broad with new desktop, mobile, server variants @ The Tech Report
- Fedora 22: Don't be glum about the demise of Yum – this is a welcome update @ The Register
- Intel gobbles up chipmaker Altera in $16.7 BILLION splurge @ The Register
- Tossed all your snaps into the new Google Photos? You read the terms, right? ... RIGHT? @ The Register
- Blackberry Defeats Typo In Court, Typo To Discontinue Sales of Keyboard @ Slashdot
- NVIDIA SHIELD Android TV Review @ Hardware Canucks
Subject: Graphics Cards | May 29, 2015 - 11:05 AM | Sebastian Peak
Tagged: rumors, radeon, hbm, graphics, gpu, Fury, Fiji, amd
Another rumor has emerged about an upcoming GPU from AMD, and this time it's a possible name for the HBM-powered Fiji card a lot of us have been speculating about.
The rumor from VideoCardz via Expreview (have to love the multiple layers of reporting here) states that the new card will be named Radeon Fury:
"Radeon Fury would be AMD’s response to growing popularity of TITAN series. It is yet unclear how AMD is planning to adopt Fury naming schema. Are we going to see Fury XT or Fury PRO? Well, let’s just wait and see. This rumor also means that Radeon R9 390X will be a direct rebrand of R9 290X with 8GB memory."
Of course this is completely unsubstantiated, and Fury is a branding scheme from the ATI days, but who knows? I can only hope that if true, AMD will adopt all caps: TITAN! FURY! Feel the excitement. What do you think of this possible name for the upcoming AMD flagship GPU?
A substantial upgrade for Thunderbolt
Today at Computex, Intel took the wraps off of the latest iteration of Thunderbolt, a technology that I am guessing many of you thought was dead in the water. It turns out that's not the case, and the new set of features that Thunderbolt 3 offers may in fact push it over the crest and give it the momentum needed to become a usable and widespread standard.
First, Thunderbolt 3 starts with a new piece of silicon, code named Alpine Ridge. Not only does Alpine Ridge increase the available Thunderbolt bandwidth to 40 Gbps but it also adds a native USB 3.1 host controller on the chip itself. And, as mobile users will be glad to see, Intel is going to start utilizing the new USB Type-C (USB-C) connector as the standard port rather than mini DisplayPort.
This new connector type, already a favorite among PC Perspective staff because of its size and reversibility, will now carry this generation's connectivity and speed increases for Thunderbolt. This slide does a good job of summarizing the key takeaways from the TB3 announcement: 40 Gbps, support for two 4K 60 Hz displays, 100 watt (bi-directional) charging capability, 15 watt device power and support for four protocols: Thunderbolt, DisplayPort, USB and PCI Express.
Protocol support is important and Thunderbolt 3 over USB-C will be able to connect directly to a DisplayPort monitor, to an external USB 3.1 storage drive, an old thumb drive or a new Thunderbolt 3 docking station. This is truly unrivaled flexibility from a single connector. The USB 3.1 controller is backward compatible as well: feel free to connect any USB device to it that you can adapt to the Type-C connection.
From a raw performance perspective Thunderbolt 3 offers a total of 40 Gbps of bi-directional bandwidth, twice that of Thunderbolt 2 and 4x what we get with USB 3.1. That offers users the ability to combine many different devices, multiple displays and network connections and have plenty of headroom.
With Thunderbolt 3 you get twice as much raw video bandwidth (two DP 1.2 streams), allowing you to run not just a single 4K display at 60 Hz but two of them, all over a single TB3 cable. If you want to connect a 5K display, though, you will be limited to just one.
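Those display limits check out against a rough uncompressed-bandwidth estimate. This ignores blanking intervals and link-encoding overhead, so real DisplayPort streams need somewhat more than these figures:

```python
# Rough uncompressed video bandwidth for the display configs mentioned.
# Ignores blanking intervals and link-encoding overhead, so the real
# DisplayPort figures are somewhat higher than these.

def stream_gbps(width, height, hz, bits_per_pixel=24):
    """Raw pixel bandwidth in Gbps for one display stream."""
    return width * height * hz * bits_per_pixel / 1e9

uhd = stream_gbps(3840, 2160, 60)    # one 4K 60 Hz stream
fivek = stream_gbps(5120, 2880, 60)  # one 5K 60 Hz stream

print(f"4K60: {uhd:.1f} Gbps, two fit in 40 Gbps: {2 * uhd < 40}")
print(f"5K60: {fivek:.1f} Gbps, two fit in 40 Gbps: {2 * fivek < 40}")
```

Two 4K60 streams need roughly 24 Gbps even before overhead, comfortably inside 40 Gbps, while two 5K60 streams would need over 42 Gbps, which is why TB3 tops out at a single 5K display.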
For mobile users, which I think is the area where Thunderbolt 3 will be the most effective, the addition of USB 3.1 allows for charging capability up to 100 watts. This is in addition to the 15 watts of power that Thunderbolt provides to devices directly - think external storage, small hubs/docks, etc.
Subject: Processors | May 27, 2015 - 09:45 PM | Scott Michaud
Tagged: xeon, Skylake, Intel, Cannonlake, avx-512
AVX-512 is an instruction set that expands the CPU's vector registers from 256-bit to 512-bit. It comes with a core specification, AVX-512 Foundation, and several extensions that can be added where they make sense. For instance, AVX-512 Exponential and Reciprocal Instructions (ERI) help compute transcendental functions, which occur in geometry and are useful for GPU-style architectures. As such, ERI appears in Knights Landing but not anywhere else.
Image Credit: Bits and Chips
Today's rumor is that Skylake, the successor to Broadwell, will not include any AVX-512 support in its consumer parts. According to the lineup, Xeons based on Skylake will support AVX-512 Foundation, Conflict Detection Instructions, Vector Length Extensions, Byte and Word Instructions, and Double and Quadword Instructions. Fused Multiply and Add for 52-bit Integers and Vector Byte Manipulation Instructions will not arrive until Cannonlake shrinks everything down to 10nm.
The main advantage of larger registers is speed. When you can fit 512 bits of data in a single register and operate on all of it at once, you can perform several linked calculations together. AVX-512 can operate on sixteen 32-bit values at the same time, which is sixteen times the compute throughput of doing just one at a time... if all sixteen undergo the same operation. This is especially useful for games, media, and other vector-based workloads (like science).
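The lane-wise idea can be sketched in plain Python by treating a 512-bit register as sixteen 32-bit lanes. Real code would rely on compiler auto-vectorization or AVX-512 intrinsics; this just models the per-lane behavior:

```python
# Conceptual sketch of a 512-bit SIMD add: one "instruction" operates
# on sixteen 32-bit lanes at once. Real code would use compiler
# auto-vectorization or AVX-512 intrinsics; this only models the lanes.

LANES = 16           # 512 bits / 32 bits per lane
MASK32 = 0xFFFFFFFF  # 32-bit lanes wrap on overflow

def simd_add_epi32(a, b):
    """Lane-wise 32-bit add of two 16-lane 'registers'."""
    assert len(a) == len(b) == LANES
    return [(x + y) & MASK32 for x, y in zip(a, b)]

a = list(range(16))          # lanes 0..15
b = [10] * 16
print(simd_add_epi32(a, b))  # [10, 11, ..., 25] in one conceptual op
```

The catch the article mentions is visible here too: the speedup only materializes when all sixteen lanes want the same operation, which is exactly what vector-friendly workloads like media and scientific code provide.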
This also makes me question whether the entire Cannonlake product stack will support AVX-512. While vectorization is a cheap way to get performance for suitable workloads, it does take up a large number of transistors (wider data paths, extra instructions, etc.). Hopefully Intel will be able to afford the cost with the next die shrink.
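The sixteen-lane idea can be sketched as a toy model: a 512-bit register holds sixteen 32-bit floats, and one instruction applies the same operation to every lane simultaneously. This is purely conceptual Python, not actual AVX-512 intrinsics.

```python
# Conceptual model of one AVX-512 packed-single operation (not real intrinsics):
# a 512-bit register divided into sixteen 32-bit lanes, all updated by a
# single "instruction".
LANES = 512 // 32  # 16 lanes of 32-bit values

def vadd_ps(a, b):
    """Model of a lane-wise packed add: one operation, sixteen results."""
    assert len(a) == len(b) == LANES
    return [x + y for x, y in zip(a, b)]

a = [float(i) for i in range(LANES)]
b = [1.0] * LANES
print(vadd_ps(a, b))  # all sixteen sums produced together
```

The catch mentioned above applies here too: the speedup only materializes when all sixteen lanes want the same operation, which is why vector-friendly workloads benefit and branchy code does not.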
Subject: General Tech, Systems | June 1, 2015 - 07:30 AM | Tim Verry
Tagged: steam os, living room gaming, liquid cooling, gaming, DIY, corsair, computex 2015, computex, barebones, 4k
Today at Computex, Corsair unveiled a new barebones gaming PC aimed at the living room. The compact Bulldog PC is an upgradeable barebones DIY kit that offers gamers an interesting base from which to build a living room PC capable of 4K gaming. The chassis resembles an overbuilt console in that it is a short but wide design with many angular edges and aesthetic touches including stylized black case feet and red accents surrounding the vents. A hidden panel in the lower right corner reveals two USB 3.0 ports and two audio jacks. It looks ready to fight in the next season of Robot Wars should you add a flamethrower or hydraulic flipper (heh).
The Bulldog kit consists of the chassis, motherboard, small form factor power supply, and a customized Hydro H55F series closed loop liquid CPU cooler. From there, users need to bring their own processor, RAM, and storage devices. There is no operating system included with the kit, but, being a full PC, it supports Windows, Linux, SteamOS, and the like.
As far as graphics cards go, Corsair is offering several liquid cooled NVIDIA graphics cards (initially only from MSI, with other AIB partner cards to follow) that are ready to be installed in the Bulldog PC. Currently, users can choose from the GTX TITAN X, GTX 980, and GTX 970.
Alternatively, Corsair is offering a $99 (MSRP) upgrade kit for existing graphics cards with its Hydro H55 cooler and HG110 bracket.
The Bulldog case supports Mini ITX form factor motherboards and it appears that Corsair is including the Asus Z97I-Plus which is a socket 1150 board supporting Haswell-based Core processors, DDR3 memory, M.2 (though you have to take the board out of the case to install the drive since the slot is on the underside of the board), a single PCI-E 3.0 x16 slot, four SATA 6.0 Gbps ports, and the usual fare of I/O options including USB 3.0, 802.11ac Wi-Fi, and optical and analog audio outputs (among others).
A mini ITX motherboard paired with the small form factor Corsair H55F CPU cooler (left) and the internal layout of the Bulldog case with all components installed (right).
User purchased processors are cooled by the included liquid cooler which is a customized Hydro series cooler that mounts over the processor and exhausts air blower style out of the back of the case. The system is powered by the pre-installed 600W Corsair FS600 power supply. The PSU is mounted in the front of the system and the graphics card radiator and fan are mounted horizontally beside it. Along the left side of the case are mounts for a single 2.5" drive and a single 3.5" drive.
GPU manufacturers will be selling cards with liquid coolers pre-installed. Users can also upgrade existing air-cooled graphics cards with an optional upgrade kit.
The liquid cooling aspect of the Bulldog is neat and, according to Corsair, is what enables them to cram so much hardware into a relatively small case while still allowing thermal headroom for overclocking and quieter operation versus air coolers.
I am curious how well the CPU cooler performs, especially as far as noise levels go, with the compact and shrouded design. Also, while there is certainly plenty of ventilation along the sides of the case to draw in cool air, I'm interested in how well the GPU HSF will be able to exhaust the heat since there are no top grilles.
Corsair is marketing the Bulldog as the next step up from your typical Steam Machine and game console and the first 4K capable gaming PC designed for the living room. Further, it would be a nice stepping stone for console gamers to jump into PC gaming.
From the press release:
“Bulldog is designed to take the 4K gaming experience delivered by desktop gaming PCs, and bring it to the big 4K screens in the home,” said Andy Paul, CEO of Corsair Components. “We knew we needed to deliver a solution that was elegant, powerful, and compact. By leveraging our leading expertise in PC case design and liquid cooling, we met that goal with Bulldog. We can’t wait to unleash it on gamers this fall.”
The Bulldog DIY PC kit is slated for an early Q4 2015 launch with an MSRP of $399. After adding a processor, memory, storage, and graphics, Corsair estimates a completed build starts around $940 with liquid cooled graphics ($600 without a dedicated GPU) and tops out at $2,250.
Keep in mind that the lowest tier liquid cooled GPU at launch will be the MSI GTX 970 (~$340). Users could get these prices down a bit with some smart shopping and component selection, along with the optional $99 upgrade kit for other GPU options. It is also worth considering that the Bulldog is being positioned as a 4K gaming machine. If you were willing to start off with a 1080p setup, you could get by with a cheaper graphics card and upgrade later along with your TV once 4K televisions are cheaper and more widespread.
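For what it's worth, Corsair's launch-price estimates reconcile neatly. All figures below are the MSRPs quoted in this article, not street prices:

```python
# Rough reconciliation of Corsair's pricing estimates (article MSRPs only;
# actual street prices will vary).
kit = 399           # Bulldog barebones kit
build_no_gpu = 600  # Corsair's estimated build cost without a dedicated GPU
gpu_min = 340       # cheapest liquid cooled option at launch (MSI GTX 970)

cpu_ram_storage = build_no_gpu - kit  # implied budget for CPU/RAM/storage
print(cpu_ram_storage)        # 201
print(build_no_gpu + gpu_min) # 940, matching the ~$940 starting estimate
```

In other words, Corsair's $600 GPU-less figure implies only about $200 for the processor, memory, and storage, so budget builders should treat that as a bare minimum.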
At its core, $400 for the Bulldog kit (case, quality power supply, high end motherboard, and closed loop CPU cooler) is a decent value that just might entice some console gamers to explore the world of PC gaming (and never leave following their first Steam sale, heh)! It is a big commitment for sure at that price, but it looks like Corsair is using quality components, and while there is surely the usual small form factor price premium (especially on cases), it is far from obnoxious.
What do you think about the Bulldog? Is it more bark than bite, or is it a console killer?
Subject: General Tech | June 1, 2015 - 08:07 AM | Sebastian Peak
Tagged: STRAFE, mechanical keyboard, gaming keyboard, corsair, computex 2015, computex, Cherry MX
Corsair has announced the STRAFE mechanical gaming keyboard featuring Cherry MX switches, and the company is calling it the “most advanced mono-backlit mechanical gaming keyboard available”.
“The STRAFE mechanical gaming keyboard’s brilliant red backlighting can be customized to a virtually unlimited number of lighting configurations and effects. Each key can be programmed with automated macros using CUE (Corsair Utility Engine) software. Users can choose from six unique lighting effects or craft their own custom profiles and share them on www.corsairgaming.com.”
The STRAFE features:
- German-made Cherry MX red switches with gold contacts for fast, precise key presses
- Fully programmable brilliant red LED backlighting for unrivaled personalization
- USB pass-through port for easy connections
- Textured and contoured FPS/MOBA keycaps
- 100% anti-ghosting technology with 104-key rollover
- Enhanced, easy-access multimedia controls
The Corsair STRAFE has an MSRP of $109.99 and will be available in June.
Subject: Graphics Cards | June 1, 2015 - 10:58 AM | Ryan Shrout
Tagged: evga, precisionx, dx12, DirectX 12
Another interesting bit of news surrounding Computex and the new GTX 980 Ti comes from EVGA and its PrecisionX software. This is easily our favorite tool for overclocking and GPU monitoring, so it's great to see the company continuing to push forward with features and capability. EVGA is the first to add full support for DX12 with an overlay.
What does that mean? It means that as DX12 applications find their way out to consumers and media, we will have a tool that can help measure performance and monitor GPU speeds and feeds via the PrecisionX overlay. Before this release, we were running in the dark with DX12 demos, so this is great news!
You can download the latest version over on EVGA's website!
Introduction and First Impressions
Phanteks has expanded their Enthoo enclosure lineup with a new ATX version of the popular EVOLV case, and it offers a striking design and some unique features to help it stand out in the mid-tower market.
Phanteks first came to my attention with their large double tower cooler, the PH-TC14, which competes directly with the Noctua NH-D14 in the CPU air-cooling market. But like a lot of other cooling companies (Cooler Master, Corsair, etc.), Phanteks offers a full lineup of enclosures as well. Of these, the Enthoo EVOLV, which until today has only been available in micro-ATX and mini-ITX versions, has been well-received and has an angular, minimalist look that I like quite a bit. Enter the EVOLV ATX.
With the larger size of this new EVOLV ATX there is not only room for a full-size motherboard, but much more room for components and cooling as well. The internal layout is very similar to the recently reviewed Fractal Design Define S enclosure, with no storage (5.25” or 3.5”) inside the front of the case, which gives the EVOLV ATX a totally open layout. The front is solid metal (though well vented), so we’ll see how this affects cooling, and it will be interesting to see how Phanteks has approached internal storage with this design as well. Let’s get started!
Subject: Graphics Cards | June 1, 2015 - 06:00 AM | Ryan Shrout
Tagged: maxwell, hydro copper, GTX 980 Ti, gm200, evga, computex 2015, computex, classified, acx
With the release of the brand new GeForce GTX 980 Ti from NVIDIA stirring up the week just before Computex in Taipei, you can be sure that all of NVIDIA's partners are going to be out in force showing off their custom graphics card solutions.
EVGA has several lined up and was able to share some information with us. First up is the standard but custom-cooled GTX 980 Ti that uses the ACX 2.0+ cooler. According to EVGA, this new version of the ACX 2.0 cooler includes a "memory MOSFET Cooling Plate (MMCP) [that] reduces MOSFET temperatures up to 13%, and optimized Straight Heat Pipes (SHP) [that] additionally reduce GPU temperature by 5C. ACX 2.0+ coolers also feature optimized swept fan blades, double ball bearings and an extreme low power motor, delivering more air flow with less power, unlocking additional power for the GPU." We're looking forward to some hands-on testing with this card when it shows up on Monday morning.
Also due for an update is the EVGA Classified line, often considered one of the best choices for overclockers and extreme enthusiasts. Though the card also uses the ACX 2.0+ cooler, it will include additional power delivery improvements on the PCB that help stretch the available performance headroom.
Following in the footsteps of the recently released Titan X Hybrid comes the GTX 980 Ti version. This card will use a standard blower cooler for the memory and power delivery while attaching a self-contained water cooler for the GPU itself. This should keep the GPU temperature down quite a bit though the benefit to real-world overclocking is debatable with the voltage lock that NVIDIA has kept in place. If only they were to change that...
Finally, for the water cooling fans among us we have the GTX 980 Ti Hydro Copper, using a water block from EK.
Interested in clock speeds?
- EVGA 980 Ti ACX 2.0
- Base: 1000 MHz
- Boost: 1076 MHz
- Memory: 7010 MHz
- EVGA 980 Ti Classified
- Base: 1152 MHz
- Boost: 1241 MHz
- Memory: 7010 MHz
- EVGA 980 Ti Hybrid
- Base: 1140 MHz
- Boost: 1228 MHz
- Memory: 7010 MHz
I am still waiting for pricing and availability information which we will pass on as soon as we get it!
Subject: Processors, Shows and Expos | June 2, 2015 - 11:10 AM | Ryan Shrout
Tagged: Intel, computex 2015, computex, Broadwell
Earlier this morning you saw us post a story about MSI updating its line of 20 notebooks with new Broadwell processors. Though dual-core Broadwell has been available for Ultrabooks and 2-in-1s for some time, today marks the release of the quad-core variants we have been waiting on. Available for mobile designs, and marking the very first Iris Pro graphics implementation for desktop users, the Broadwell quad-core parts look to be pretty impressive.
Today Intel gives the world a total of 10 new processors for content creators and enthusiasts. Two of these parts are 65 watt SKUs in LGA packaging for enthusiasts and DIY builders. The rest are BGA designs for all-in-one PCs and high performance notebooks and include both 65 watt and 47 watt variants. Most use the new Iris Pro Graphics 6200 implementation.
For desktop users, we get the Core i7-5775C and the Core i5-5675C. The Core i7 model is a quad-core, HyperThreaded CPU with a base clock of 3.3 GHz and a max Turbo clock of 3.7 GHz. It's unlocked, so overclockers can mess around with it in the same way they do with Haswell. The Iris Pro Graphics 6200 can scale up to 1150 MHz and rated DDR3L memory speeds are up to 1600 MHz. 6MB of L3 cache, a 65 watt TDP and a tray price of $366 round out the information we have.
Click to Enlarge
The Core i5-5675C does not include HyperThreading, has clock speed ranges of 3.1 GHz to 3.6 GHz and only sees the Iris Pro scale to 1100 MHz. Also, it drops from 6MB of L3 cache to 4MB. Pricing on this model will start at $276.
These two processors mark the first time we have seen Iris Pro graphics in a socketed form factor, something we have been asking Intel to offer for at least a couple of generations. Intel focused on 65 watt TDPs rather than anything higher mostly because of the target audience for these chips: if you are interested in the performance of integrated graphics, you are likely building a small form factor design or HTPC of some kind. If you have a 9-series motherboard then you SHOULD be able to utilize one of these new processors, though you'll want a Z97 board if you are going to try to overclock.
From a performance standpoint, the Core i7-5775C will offer 2x the gaming performance, 35% faster video transcoding and 20% higher compute performance compared to the previous top-end 65 watt Haswell part, the Core i7-4790S. That 4th generation part uses Intel HD Graphics 4600, which does not include the massive eDRAM that makes Iris Pro implementations so unique.
For mobile and AIO buyers, Intel has a whole host of new processors to offer. You'll likely find most of the 65 watt parts in all-in-one designs but you may see some mobile designs that go crazy and opt for them too. For the rest of the gaming notebook designs there are CPUs like the Core i7-5950HQ, a quad-core HyperThreaded part with a base clock of 2.9 GHz and max Turbo clock of 3.8 GHz inside a TDP of 47 watts. The Iris Pro Graphics 6200 will scale from 300 to 1150 MHz so GPU performance should basically be on par with the desktop 65-watt equivalent. Pricing is pretty steep though: starting at $623.
Click to Enlarge
These new processors, especially the new 5950HQ, offer impressive compute and gaming performance.
Compared to the Core i7-5600U, already available and used in some SFF and mobile platforms, the Core i7-5950HQ is 2.5x faster in SPECint and nearly 2x faster in a video conversion benchmark. Clearly these machines are going to be potent desktop replacement options.
For mainstream gamers, the Iris Pro Graphics 6200 on 1920x1080 displays will see some impressive numbers. Players of League of Legends, Heroes of the Storm and WoW will see over 60 FPS at the settings listed in the slide above.
We are still waiting for our hardware to show up but we have both the LGA CPUs and notebooks using the BGA option en route. Expect testing from PC Perspective very soon!
Subject: Mobile, Shows and Expos | May 27, 2015 - 12:27 AM | Ken Addison
Tagged: z51, z41, tech world, r9 m375, r9 m360, Lenovo, ideapad 100, amd
Today at their Tech World event in Beijing, Lenovo is taking the opportunity to announce some new mainstream notebook options.
First off, we have the Lenovo Z41 and Z51. The 14-inch Z41 and 15.6-inch Z51 aim to refresh the previous Z40 and Z50 with Broadwell CPUs as well as new AMD discrete GPU options.
Lenovo is using the Broadwell-U class of CPUs here, the same as you would find in Ultrabooks, so don't expect a CPU powerhouse, but for productivity-style tasks these machines should hit the sweet spot of price vs. performance with a starting price of $549 for the base Z51.
Paired with the new AMD R9 M360 (Z41) or R9 M375 (Z51), these notebooks should also be able to play mainstream titles on the integrated 1080p display while coming in at just over $800.
Lenovo also announced a low-cost entry into the ideapad line utilizing Intel's Bay Trail-M processors. The ideapad 100 is available in both 14-inch and 15-inch variants and seems aimed at the low-cost Chromebook market.
Starting at $249, the ideapad 100 seems like it will be a good option for users looking for a secondary option for basic web browsing and office tasks.
Stay tuned for more from Lenovo's Tech World Event this week!