Subject: Editorial | May 29, 2015 - 12:37 PM | Ryan Shrout
Tagged: SSD 750, PCI Express, NVMe, Intel, giveaway, contest, 750 series
PC Perspective and Intel are partnering to offer up a giveaway with some pretty impressive swag. Surely by now you have read all about the new Intel SSD 750 Series, a new class of solid state drive that combines four lanes of PCI Express 3.0 with a new protocol called NVM Express (NVMe) for impressive throughput. In Allyn's review of the SSD in April he called it "the obvious choice for consumers who demand the most from their storage" and gave it a PC Perspective Editor's Choice Award!
Thanks to our friends at Intel we are going to be handing out a pair of the 400GB add-in card models to loyal PC Perspective readers and viewers. How can you enter? The rules are dead simple:
- Fill out the contest entry form below to find multiple entry methods including reading our review, answering a question about Intel SSD 750 Series specs or following us on Twitter. You can fill out one or all of the methods - the more you do the better your chances!
- Leave a comment on the news post below thanking Intel for sponsoring PC Perspective and for supplying this hardware for us to give to you!
- This is a global contest - so feel free to enter from anywhere in the world!
- Contest will close on June 2nd, 2015.
Our most sincere thanks to Intel for bringing this contest to PC Perspective's readers and fans. Good luck to everyone (except Josh)!
Sponsored by Intel
| Capacity | Sequential 128KB Read (up to MB/s) | Sequential 128KB Write (up to MB/s) | Random 4KB Read (up to IOPS) | Random 4KB Write (up to IOPS) | Form Factor | Interface |
| --- | --- | --- | --- | --- | --- | --- |
| 400 GB | 2,200 | 900 | 430,000 | 230,000 | 2.5-inch x 15mm | PCI Express Gen3 x4 |
| 1.2 TB | 2,400 | 1,200 | 440,000 | 290,000 | 2.5-inch x 15mm | PCI Express Gen3 x4 |
| 400 GB | 2,200 | 900 | 430,000 | 230,000 | Half-height half-length (HHHL) Add-in Card | PCI Express Gen3 x4 |
| 1.2 TB | 2,400 | 1,200 | 440,000 | 290,000 | Half-height half-length (HHHL) Add-in Card | PCI Express Gen3 x4 |
Experience the future of storage performance for desktop client and workstation users with the Intel® SSD 750 Series. The Intel SSD 750 Series delivers uncompromised performance by utilizing NVM Express* over four lanes of PCIe* 3.0.
With both Add-in Card and 2.5-inch form factors, the Intel SSD 750 Series eases migration from SATA to PCIe 3.0 without power or thermal limitations on performance. The SSD can now deliver the ultimate in performance in a variety of system form factors and configurations.
When NVIDIA launched the GeForce GTX Titan X back in March of this year, I knew immediately that the GTX 980 Ti would be close behind. The Titan X was so different from the GTX 980 when it came to pricing and memory capacity (12GB, really??) that NVIDIA had set up the perfect gap in which to place the newly minted GTX 980 Ti. Today we get to take the wraps off of that new graphics card and I think you'll be impressed with what you find, especially when you compare its value to the Titan X.
Based on the same Maxwell architecture and GM200 GPU, with some minor changes to GPU core count, memory size and boost speeds, the GTX 980 Ti finds itself in a unique spot in the GeForce lineup. Performance-wise it's basically identical in real-world game testing to the GTX Titan X, yet is priced $350 less than that 12GB behemoth. Couple that with a modest $50 price drop in the GTX 980 cards and you have all the markers of an enthusiast graphics card that will sell as well as any we have seen in recent generations.
The devil is in all the other details, of course. AMD has its own plans for this summer but the Radeon R9 290X is still sitting there at a measly $320, less than half the price of the GTX 980 Ti. NVIDIA seems to be pricing its own GPUs as if it isn't even concerned with what AMD and the Radeon brand are doing. That could be dangerous if it goes on too long, but for today, can the R9 290X put up enough fight with the aging Hawaii XT GPU to make its value case to gamers on the fence?
Will the GeForce GTX 980 Ti be the next high-end GPU to make a splash in the market, or will it make a thud at the bottom of the GPU gene pool? Let's dive into it, shall we?
Subject: Displays | June 1, 2015 - 12:21 PM | Ryan Shrout
Tagged: nvidia, gsync, g-sync, asus, 3800R
At Computex this week ASUS is showing off a prototype of the new ROG 3800R monitor, a 34-in curved display with a 3440x1440 resolution and G-Sync variable refresh rate capability. ASUS claims on its PCDIY blog that the 21:9 aspect ratio was one of the "most requested" specifications for a new ROG monitor, followed by a curved design. The result is a gorgeous display:
Here's a list of specifications:
- 34” optimal dimension for QHD resolutions with 3440×1440 resolution
- 21:9 ultra-wide aspect ratio for increased immersion and improved horizontal workflow
- IPS based panel for superior color reproduction, black levels and reduction of color shifting
- NVIDIA G-SYNC equipped offering smooth, fluid and tear free gaming with improved motion clarity. Additionally equipped with ULMB operating mode for outstanding motion clarity.
- Frameless design for seamless surround gaming
- ASUS exclusive GamePlus feature and Turbo Key
- Ergonomic adjustment including tilt, swivel and height adjustment
Hot damn, we want one of these and we want it yesterday! There is no mention of the refresh rate of the display here though we did see information from NVIDIA that ASUS was planning a 34x14 60 Hz screen - but we are not sure this is the same model being shown. And the inclusion of ULMB would normally indicate a refresh rate above 60-75 Hz...
Another interesting note: this monitor appears to include both DisplayPort and HDMI connectivity.
This 34-inch 3800R curved display features wide-viewing angles, a 3440 x 1440 native resolution, and 21:9 aspect ratio. It features NVIDIA® G-SYNC™ display technology to deliver smooth, lag-free visuals. G-SYNC synchronizes the display’s refresh rate to the GPU in any GeForce® GTX™-powered PC to eliminate screen tearing and minimize display stutter and input lag. This results in sharper, more vibrant images; and more fluid and responsive gameplay. It has extensive connectivity options that include DisplayPort and HDMI.
The above information came from ASUS just a few short hours ago, so you can assume that it is accurate. Could this be the start of panels that integrate dual scalers (G-Sync module plus something else) to offer more connectivity, or has the G-Sync module been updated to support more inputs? We'll find out!
Subject: Processors | May 28, 2015 - 03:44 PM | Scott Michaud
Tagged: Intel, Skylake, skylake-s, haswell, devil's canyon
For a while, it was unclear whether we would see Broadwell on the desktop. With the recently leaked benchmarks of the Intel Core i7-6700K, it seems all-but-certain that Intel will skip it and go straight to Skylake. Compared to Devil's Canyon, the Haswell-based Core i7-4790K, the Skylake-S Core i7-6700K has the same base clock (4.0 GHz) and same full-processor Turbo clock (4.2 GHz). Pretty much every improvement that you see is pure performance per clock (IPC).
Image Credit: CPU Monkey
In multi-threaded applications, the Core i7-6700K tends to get about a 9% increase while, when a single core is being loaded, it tends to get about a 4% increase. Part of this might be the slightly lower single-core Turbo clock, which is said to be 4.2 GHz instead of 4.4 GHz. There might also be some increased efficiency with HyperThreading or cache access -- I don't know -- but it would be interesting to see.
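The clock difference matters more than it looks: backing the rumored single-core Turbo deficit out of the observed gain implies a per-clock improvement close to the multi-threaded figure. A quick sketch, using the leaked/rumored numbers rather than confirmed specs:

```python
# Rough separation of clock vs IPC in the single-core result: if the
# 6700K really turbos to 4.2 GHz where the 4790K hits 4.4 GHz, a ~4%
# net gain implies a larger gain per clock. All figures here are the
# rumored/leaked numbers, not confirmed specifications.
gain_observed = 1.04       # ~4% single-core uplift from the leak
clock_ratio = 4.2 / 4.4    # rumored Skylake turbo vs Haswell turbo

ipc_gain = gain_observed / clock_ratio
print(f"implied per-clock gain: ~{(ipc_gain - 1) * 100:.0f}%")  # ~9%
```

That lines up neatly with the ~9% multi-threaded result, where both chips run the same 4.2 GHz all-core Turbo.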
I should note that we know nothing about the GPU. In fact, CPU Monkey fails to list a GPU at all. Intel has expressed interest in bringing Iris Pro-class graphics to high-end mainstream desktop processors. For someone who is interested in GPU compute, especially with Explicit Unlinked MultiAdapter upcoming in DirectX 12, it would be nice to see GPUs be ubiquitous and always enabled. The chip is expected to have the new GT4e graphics with 72 execution units and either 64 or 128MB of eDRAM. If clocks are equivalent, this could translate to well over a teraflop (~1.2 TFLOPs) of compute performance in addition to discrete graphics. In discrete terms, that would be nearly equivalent to an NVIDIA GTX 560 Ti.
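That ~1.2 TFLOPs figure can be sanity-checked with back-of-the-envelope math. The per-EU throughput and the graphics clock below are my assumptions about how such a part might be configured, not published specs:

```python
# Back-of-the-envelope FP32 throughput estimate for a hypothetical
# 72-EU GT4e part. Assumes each EU issues 2 x SIMD4 FMA per clock
# (16 FLOPs/clock, counting an FMA as 2 FLOPs) and an assumed
# ~1.05 GHz graphics clock -- illustrative figures, not confirmed.
eus = 72
flops_per_eu_per_clock = 16
clock_ghz = 1.05  # assumed

tflops = eus * flops_per_eu_per_clock * clock_ghz / 1000
print(f"~{tflops:.2f} TFLOPs")  # ~1.21 TFLOPs
```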
We are expecting to see the Core i7-6700K launch in Q3 of this year. We'll see.
Subject: General Tech | May 26, 2015 - 02:40 PM | Ryan Shrout
Tagged: video, pcper live, logitechg, logitech, live, giveaway, contest
UPDATE: Did you miss our live stream yesterday? Sorry to hear that! But the good news is that you can still watch the showcase and hear the questions (and answers) on the LogitechG product line. Enjoy!
If you haven't looked up what LogitechG has done lately, you are missing out on some of the most interesting and innovative gaming accessories in the market. My reconnection with the Logitech brand started in July of last year when they sent over the G402 Hyperion Fury gaming mouse, a device with a pair of sensors capable of tracking speeds as high as 28.4 mph - seriously!
Since then the team has been releasing new products and updated software that show a dedication and drive to the enthusiast and gaming markets. The company's Intelligent Illumination system allows game developers to change the colors and brightness of the G910 Orion Spark based on what is happening in the game. Is your health critical? Then maybe you want the keyboard lights to pulse bright red? Or maybe when you jump into a flying vehicle you would like to illuminate the appropriate control keys? All of this is possible.
Besides the cool technology they have to show off, the truth is that Chris, the LogitechG peripheral guru who will be joining me, knows almost everything about keyboards and mice and headsets (oh my!). If you have a question about a specific LogitechG product, or about mice and keyboard technologies in general, this is the guy to ask and the outlet to do it.
LogitechG Showcase, Q&A and Giveaway!!
12pm PT / 3pm ET - May 27th
Need a reminder? Join our live mailing list!
And of course, what would a live stream like this be without some giveaways? Logitech is going to supply hardware to give out during the live stream so you are going to want to be in attendance for sure! But to help build up some excitement, we are going to be giving away combinations of the G402 Hyperion Fury mouse and G440 mouse pad every weekday between now and the live stream! To sign up for the daily giveaways, use the form below. And for a chance to win a whole lot more gaming hardware, make sure you tune in on May 27th at 3pm ET / 12pm PT!!
If you have questions for Chris from Logitech, about mice, keyboards, switches, RGB illumination, tracking technologies - anything - let us know in the comments!
Announced at last year’s Google I/O event in June, Android TV is a platform developed by Google, running Android 5.0 and higher, that aims to create an interactive experience for the TV. This platform can be built into a TV directly as well as into set-top style boxes, like the NVIDIA SHIELD we are looking at today. The idea is to bring the breadth of apps and content to the TV through the Android operating system in a way that is both convenient and intuitive.
NVIDIA announced SHIELD back in March at GDC as the first product to use the company’s latest Tegra processor, the X1. This SoC combines an 8-core big.LITTLE ARM processor design with a 256-core implementation of the NVIDIA Maxwell GPU architecture, providing GPU performance previously unseen in an Android device. I have already spent some time with the NVIDIA SHIELD at various events and the promise was clearly there to make it a leading option for Android TV adoption, but obviously there were questions to be answered.
Today’s article will focus on my early impressions with the NVIDIA SHIELD, having used it both in the office and at home for a handful of days. As you’ll see during the discussion there are still some things to be ironed out, some functionality that needs to be added before SHIELD and Android TV can really be called a must-buy product. But I do think it will get there.
And though this review will focus on the NVIDIA SHIELD, it’s impossible not to marry the success of SHIELD with the success of Google’s Android TV. The dominant use case for SHIELD is as a media playback device, with the gaming functionality as a really cool side project for enthusiasts and gamers looking for another outlet. For SHIELD to succeed, Google needs to prove that Android TV can improve over other integrated smart TV platforms as well as other set-top box platforms like Boxee, Roku and even the upcoming Apple TV refresh.
But first, let’s get an overview of the NVIDIA SHIELD device, pricing and specifications, before diving into my experiences with the platform as a whole.
Subject: Graphics Cards | May 26, 2015 - 05:29 PM | Sebastian Peak
Tagged: rumors, leaks, Hawaii XT, Fiji, amd radeon
This sounds like disappointing news, but it might be a little misleading as well. The report, another from the folks at VideoCardz, details information about these upcoming GPUs gleaned from a leaked driver.
Here’s the list VideoCardz came up with (200-series name on the right):
- AMD6658.1 AMD Radeon(TM) R7 360 Graphics Bonaire XTX [Radeon R7 260X]
- AMD67B0.1 AMD Radeon R9 300 Series Hawaii XT [Radeon R9 290X]
- AMD67B1.1 AMD Radeon R9 300 Series Hawaii PRO [Radeon R9 290]
- AMD6810.1 AMD Radeon(TM) R7 370 Graphics Curacao XT [Radeon R9 270X]
- AMD6810.2 AMD Radeon (TM) R7 300 Series Curacao XT [Radeon R9 270X]
- AMD6811.1 AMD Radeon (TM) R7 300 Series Curacao PRO [Radeon R9 270/370]
- AMD6939.1 AMD Radeon R9 300 Series Tonga PRO [Radeon R9 285]
VideoCardz further comments on this list:
“FIJI was not included in this driver release. Radeon R9 Hawaii is neither shown as R9 380 nor as R9 390. Right now it’s just R9 300, just like R9 285-rebrand. Both Hawaii cards will get a small bump in clock speeds and most importantly 8GB memory.”
And this is where the news might be misleading, as the “300-series” naming may very well not include a new GPU due to AMD shifting to a new scheme for their upcoming flagship, just as NVIDIA has used the “TITAN” and “TITAN X” names for flagship cards. If the rumored AMD Fiji card (which was recently pictured) is the all-new GPU featuring HBM memory that we’ve all been waiting for, expect iterations of this new card to follow as the technology trickles down to the more affordable segment.
Subject: Displays, Mobile | May 31, 2015 - 06:00 PM | Ryan Shrout
Tagged: nvidia, notebooks, msi, mobile, gsync, g-sync, asus
If you remember back to January of this year, Allyn and I posted an article that confirmed the existence of a mobile variant of G-Sync thanks to a leaked driver and an ASUS G751 notebook. Rumors and speculation floated around the Internet ether for a few days but we eventually got official word from NVIDIA that G-Sync for notebooks was a real thing and that it would launch "soon." Well, that day is finally here with the beginning of Computex.
G-Sync for notebooks has no clever branding, no "G-Sync Mobile" or anything like that, so discussing it will be a bit more difficult since the technologies are different. Going forward NVIDIA claims that any gaming notebook using NVIDIA GeForce GPUs will be a G-Sync notebook and will support all of the goodness that variable refresh rate gaming provides. This is fantastic news, as notebook gaming often runs at lower frame rates than you would find on a desktop PC because of lower powered hardware, yet on displays of comparable (1080p, 1440p) resolution.
Of course, as we discovered in our first look at G-Sync for notebooks back in January, the much debated G-Sync module is not required and will not be present on notebooks featuring the variable refresh technology. So what gives? We went over some of this before, but it deserves to be detailed again.
NVIDIA uses the diagram above to demonstrate the headaches presented by the monitor and GPU communication path before G-Sync was released. You had three different components: the GPU, the monitor scaler and the monitor panel, all of which needed to work together if VRR was going to become a high quality addition to the game ecosystem.
NVIDIA's answer was to take over all aspects of the pathway for pixels from the GPU to the eyeball, creating the G-Sync module and helping OEMs hand pick the best panels that would work with VRR technology. This helped NVIDIA make sure it could do things to improve the user experience, such as implementing an algorithmic low-frame-rate, frame-doubling capability to maintain smooth and tear-free gaming at frame rates under the panel's physical limitations. It also allows them to tune the G-Sync module to the specific panel to help with ghosting and to implement variable overdrive logic.
All of this is required because of the incredible amount of variability in the monitor and panel markets today.
But with notebooks, NVIDIA argues, there is no variability at all to deal with. The notebook OEM gets to handpick the panel, and the GPU directly interfaces with the screen instead of passing through a scaler chip. (Note that some desktop monitors, like the ever popular Dell 3007WFP, did this as well.) There is no other piece of logic in the way attempting to enforce a fixed refresh rate. Because of that direct connection, the GPU is able to control the data passing between it and the display without any other logic working in the middle. This makes implementing VRR technology much simpler and helps with quality control, because NVIDIA can validate the panels with the OEMs.
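The low-frame-rate frame-doubling idea mentioned earlier boils down to repeating frames so the panel never falls below its supported range. Here is a hypothetical sketch; the panel limits and the multiplier logic are illustrative assumptions, not NVIDIA's actual algorithm:

```python
# Sketch of low-frame-rate frame doubling: if the GPU's frame interval
# is longer than the panel can hold an image, scan out the last frame
# an integer number of times so the effective refresh rate stays inside
# the panel's supported window. Panel limits here are illustrative.
def refresh_multiplier(fps, panel_min_hz=30, panel_max_hz=144):
    """Return how many times each frame should be scanned out."""
    if fps >= panel_min_hz:
        return 1                    # inside the native VRR window
    n = 1
    while fps * n < panel_min_hz:   # repeat until back inside the window
        n += 1
    return n

print(refresh_multiplier(45))   # 1 (within VRR window)
print(refresh_multiplier(24))   # 2 (each frame scanned out twice, 48 Hz)
print(refresh_multiplier(10))   # 3 (scanned out at 30 Hz)
```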
As I mentioned above, going forward, all new notebooks using GTX graphics will be G-Sync notebooks and that should solidify NVIDIA's dominance in the mobile gaming market. NVIDIA will be picking the panels, and tuning the driver for them specifically, to implement anti-ghosting technology (like what exists on the G-Sync module today) and low frame rate doubling. NVIDIA also claims that the world's first 75 Hz notebook panels will ship with GeForce GTX and will be G-Sync enabled this summer - something I am definitely looking forward to trying out myself.
Though it wasn't mentioned, I am hopeful that NVIDIA will continue to allow users the ability to disable V-Sync at frame rates above the maximum refresh of these notebook panels. With most of them limited to 60 Hz (but this applies to 75 Hz as well) the most demanding gamers are going to want that same promise of minimal latency.
At Computex we'll see a handful of models announced with G-Sync up and running. It should be no surprise of course to see the ASUS G751 with the GeForce GTX 980M GPU on this list as it was the model we used in our leaked driver testing back in January. MSI will also launch the GT72 G with a 1080p G-Sync ready display and GTX 980M/970M GPU option. Gigabyte will have a pair of notebooks: the Aorus X7 Pro-SYNC with GTX 970M SLI and a 1080p screen as well as the Aorus X5 with a pair of GTX 965M in SLI and a 3K resolution (2560x1440) screen.
This move is great for gamers and I am eager to see what the resulting experience is for users that pick up these machines. I have long been known as a proponent of variable refresh displays and getting access to that technology on your notebook is a victory for NVIDIA's team.
LG Australia published a product page for the LG 27MU67 monitor, a display the rest of the company doesn't yet seem to acknowledge. It is still online, even after three days in which someone could have pulled the plug. This one is interesting for a variety of reasons: it's 4K, it's IPS, and it supports AMD FreeSync. It is also relatively cheap for that combination, being listed at $799 AUD RRP.
Some websites have converted that to ~$610 to $620 USD, but it might even be less than that. Australian prices are often listed with their federal tax rolled in, which would yield a price that is inflated about 10%. It is possible, though maybe wishful thinking, that this monitor could retail in the ~$500 to $550 price range for the United States (if it even comes to North America). Again, this is a 4K, IPS, FreeSync panel.
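The conversion above works out like this (the exchange rate is my assumption; it moves daily):

```python
# Rough USD estimate from an Australian RRP, assuming the listed price
# includes Australia's 10% GST and an exchange rate of ~0.77 USD/AUD.
# Both the GST treatment and the rate are assumptions for illustration.
aud_rrp = 799.0
gst_rate = 0.10
usd_per_aud = 0.77  # assumed exchange rate

ex_gst_aud = aud_rrp / (1 + gst_rate)    # ~726 AUD before tax
usd_estimate = ex_gst_aud * usd_per_aud  # ~559 USD
print(f"ex-GST: {ex_gst_aud:.0f} AUD -> ~{usd_estimate:.0f} USD")
```

Hence the wishful thinking: a pre-tax price converts to roughly $560, putting a US retail price in the $500 to $550 range within reach.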
Very little is posted on LG's website and thus it is hard to tell how good of an IPS panel this is. It is listed as 99% sRGB coverage, which is good for typical video but not the best if you are working on printed content, such as magazine illustrations. On the other hand, this is a gaming panel, not a professional one. Update (May 29, 2015): It also has 10-bit (per channel) color. It sounds like it is true 10-bit, not just a look-up table, but I should note that it doesn't explicitly say that.
Again, pricing and availability is up in the air, because this is not an official announcement. It is listed to launch in Australia for $799 AUD, though.
Subject: Displays | May 31, 2015 - 06:00 PM | Ryan Shrout
Tagged: nvidia, gsync, g-sync, computex 2015, computex
In conjunction with the release of the new GeForce GTX 980 Ti graphics card today, NVIDIA is making a handful of other announcements around the GeForce brand. The most dramatic of these center on the company's variable refresh monitor technology, G-Sync. I assume that any of you reading this are already intimately familiar with what G-Sync is, but if not, check out this story that dives into how it compares with AMD's rival tech, FreeSync.
First, NVIDIA is announcing a set of seven new G-Sync ready monitors that will be available this summer and fall from ASUS and Acer.
Many of these displays offer configurations of panels we haven't yet seen in a G-Sync display. Take the Acer X34 for example: this 34-in monitor falls into the 21:9 aspect ratio form factor, with a curved screen and a 3440x1440 resolution. The refresh rate will peak at 75 Hz while also offering the color consistency and viewing angles of an IPS screen. This is the first 21:9, the first 34x14 and the first curved monitor to support G-Sync, and with a 75 Hz maximum refresh it should provide a solid gaming experience. ASUS has a similar model, the PG34Q, though it peaks at a refresh rate of 60 Hz.
ASUS will be updating the wildly popular ROG Swift PG278Q display with the PG279Q, another 27-in monitor with a 2560x1440 resolution. Only this time it will run at 144 Hz with an IPS screen rather than TN, again resulting in improved color clarity, viewing angles and lower eye strain.
Those of you on the lookout for 4K panels with G-Sync support will be happy to find IPS iterations of that configuration, though they will still peak at a 60 Hz refresh - as much a limitation of DisplayPort as anything else.
Another technology addition for G-Sync with the 352-series (353-series, sorry!) driver released today is support for windowed mode variable refresh.
By working some magic with the DWM (Desktop Window Manager), NVIDIA was able to allow VRR to operate without requiring a game to be in full screen mode. For gamers that like to play windowed or borderless windowed while using secondary or large displays for other side activities, this is going to be a great addition to the G-Sync portfolio.
Finally, after much harassment and public shaming, NVIDIA is finally going to allow users the choice to enable or disable V-Sync when your game render rate exceeds the maximum refresh rate of the G-Sync monitor it is attached to.
One of the complaints about G-Sync has been that it is restrictive on the high side of the VRR window for its monitors. While FreeSync allowed you to selectively enable or disable V-Sync when your frame rate goes above the maximum refresh rate, G-Sync was forcing users into a V-Sync enabled state. The reasoning from NVIDIA was that allowing for horizontal tearing of any kind with G-Sync enabled would ruin the experience and/or damage the technology's reputation. But now, while the default will still be to keep V-Sync on, gamers will be able to manually set the V-Sync mode to off with a G-Sync monitor.
Why is this useful? Many gamers believe that a drawback to V-Sync enabled gaming is the added latency of waiting for a monitor to refresh before drawing a frame that might be ready to be shown to the user immediately. G-Sync fixes this from frame rates of 1 FPS to the maximum refresh of the G-Sync monitor (144 FPS, 75 FPS, 60 FPS) but now rather than be stuck with tear-free, but latency-added V-Sync when gaming over the max refresh, you'll be able to play with tearing on the screen, but lower input latency. This could be especially useful for gamers using 60 Hz G-Sync monitors with 4K resolutions.
Oh, actually one more thing: you'll now be able to enable ULMB (ultra low motion blur) mode in the driver as well without requiring entry into your display's OSD.
NVIDIA is also officially announcing G-Sync for notebooks at Computex. More on that in this story!
Subject: Graphics Cards | May 26, 2015 - 05:03 PM | Sebastian Peak
Tagged: rumors, nvidia, leaks, GTX 980 Ti, gpu, gm200
Who doesn’t love rumor and speculation about unreleased products? (Other than the manufacturers of such products, of course.) Today VideoCardz is reporting, via HardwareBattle, a GPU-Z screenshot reportedly showing specs for an NVIDIA GeForce GTX 980 Ti.
Image credit: HardwareBattle via VideoCardz.com
First off, the HardwareBattle logo conveniently obscures the hardware ID (as well as ROP/TMU counts). What is visible is the 2816 shader count, which places it between the GTX 980 (2048) and TITAN X (3072). The 6 GB of GDDR5 memory has a 384-bit interface and 7 Gbps speed, so bandwidth should be the same 336 GB/s as the TITAN X. As far as core clocks on this GPU (which seems likely to be a cut-down GM200), they are identical to those of the TITAN X as well with 1000 MHz Base and 1076 MHz Boost clocks shown in the screenshot.
Image credit: VideoCardz.com
We await any official announcement, but from the frequency of the leaks it seems we won’t have to wait too long.
Big Things, Small Packages
Sapphire isn’t a brand we have covered in a while, so it is nice to see a new and interesting product land on our doorstep. Sapphire was a relative unknown until around the Radeon 9700 Pro days. This was around the time when ATI decided it did not want to be so vertically integrated and allowed other companies to start buying its chips and making their own cards. This was done to provide a bit of stability for ATI pricing, as the company no longer had to worry about a volatile component market that could cause its margins to plummet. By selling just the chips to partners, ATI could more adequately control margins on its own product while allowing partners to make their own deals and component choices for the finished card.
ATI had very limited graphics card production of its own, so it often farmed out production to second sources. One of these sources ended up turning into Sapphire. When ATI finally allowed other partners to produce and brand their own ATI based products, Sapphire already had a leg up on the competition as an established large-volume producer of ATI products. They soon controlled a good portion of the marketplace through their contacts, pricing, and close relationship with ATI.
Since then, ATI has been bought up by AMD and no longer produces any ATI branded cards. Going vertical when it comes to producing both chips and video cards was obviously a bad idea; we can look back at 3dfx and its attempt at vertical integration and how that ended for the company. AMD still produces an initial reference version of its cards and coolers, but allows partners to sell the "sticker" version and then develop their own designs. This has worked very well for both NVIDIA and AMD, and it has allowed their partners to further differentiate their products from the competition.
Sapphire usually does a bang up job on packaging the graphics card. Oh look, a mousepad!
Sapphire is not as big of a player as they used to be, but they are still one of the primary partners of AMD. It would not surprise me in the least if they still produced the reference designs for AMD and then distributed those products to other partners. Sapphire is known for building a very good quality card and their cooling solutions have been well received as well. The company does have some stiff competition from the likes of Asus, MSI, and others for this particular market. Unlike those two particular companies, Sapphire obviously does not make any NVIDIA based boards. This has been a blessing and a curse, depending on what the cycle is looking like between AMD and NVIDIA and who has dominance in any particular marketplace.
Subject: Graphics Cards | May 29, 2015 - 11:05 AM | Sebastian Peak
Tagged: rumors, radeon, hbm, graphics, gpu, Fury, Fiji, amd
Another rumor has emerged about an upcoming GPU from AMD, and this time it's a possible name for the HBM-powered Fiji card a lot of us have been speculating about.
The rumor from VideoCardz via Expreview (have to love the multiple layers of reporting here) states that the new card will be named Radeon Fury:
"Radeon Fury would be AMD’s response to growing popularity of TITAN series. It is yet unclear how AMD is planning to adopt Fury naming schema. Are we going to see Fury XT or Fury PRO? Well, let’s just wait and see. This rumor also means that Radeon R9 390X will be a direct rebrand of R9 290X with 8GB memory."
Of course this is completely unsubstantiated, and Fury is a branding scheme from the ATI days, but who knows? I can only hope that if true, AMD will adopt all caps: TITAN! FURY! Feel the excitement. What do you think of this possible name for the upcoming AMD flagship GPU?
Subject: Processors | May 27, 2015 - 09:45 PM | Scott Michaud
Tagged: xeon, Skylake, Intel, Cannonlake, avx-512
AVX-512 is an instruction set extension that widens the CPU's vector registers from 256-bit to 512-bit. It comes with a core specification, AVX-512 Foundation, and several extensions that can be added where it makes sense. For instance, AVX-512 Exponential and Reciprocal Instructions (ERI) help solve transcendental problems, which occur in geometry and are useful for GPU-style architectures. As such, it appears in Knights Landing but not anywhere else.
Image Credit: Bits and Chips
Today's rumor is that Skylake, the successor to Broadwell, will not include any AVX-512 support in its consumer parts. According to the lineup, Xeons based on Skylake will support AVX-512 Foundation, Conflict Detection Instructions, Vector Length Extensions, Byte and Word Instructions, and Double and Quadword Instructions. Fused Multiply and Add for 52-bit Integers and Vector Byte Manipulation Instructions will not arrive until Cannonlake shrinks everything down to 10nm.
The main advantage of larger registers is speed. When you can fit 512 bits of data in a single register and operate on it at once, you are able to do several linked calculations together. AVX-512 can operate on sixteen 32-bit values at the same time, which is obviously sixteen times the compute performance compared with doing just one at a time... if all sixteen undergo the same operation. This is especially useful for games, media, and other vector-based workloads (like science).
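A toy sketch of that sixteen-wide idea, with plain Python standing in for what would be a single AVX-512 instruction on real hardware (actual AVX-512 code would use compiler intrinsics or auto-vectorization, not a Python loop):

```python
# Toy illustration of the 16-wide idea behind AVX-512: a 512-bit
# register holds sixteen 32-bit floats, and one instruction applies
# the same operation to every lane at once. The loop here models the
# lanes of what would be a single fused multiply-add instruction.
LANES = 512 // 32   # sixteen 32-bit lanes per 512-bit register

def vfma(a, b, c):
    """Fused multiply-add across all lanes: a*b + c, element-wise."""
    assert len(a) == len(b) == len(c) == LANES
    return [a[i] * b[i] + c[i] for i in range(LANES)]

a = [float(i) for i in range(LANES)]
b = [2.0] * LANES
c = [1.0] * LANES
print(vfma(a, b, c))   # one "instruction", sixteen results: [1.0, 3.0, 5.0, ...]
```

The speedup only materializes when all sixteen lanes genuinely perform the same operation, which is exactly the caveat noted above.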
This also makes me question whether the entire Cannonlake product stack will support AVX-512. While vectorization is a cheap way to get performance for suitable workloads, it does take up a large amount of transistors (wider memory, extra instructions, etc.). Hopefully Intel will be able to afford the cost with the next die shrink.
Subject: Graphics Cards | June 1, 2015 - 06:00 AM | Ryan Shrout
Tagged: maxwell, hydro copper, GTX 980 Ti, gm200, evga, computex 2015, computex, classified, acx
With the release of the brand new GeForce GTX 980 Ti from NVIDIA stirring up the week just before Computex in Taipei, you can be sure that all of NVIDIA's partners are going to be out in force showing off their custom graphics card solutions.
EVGA has several lined up and they were able to share some information with us. First up is the standard but custom-cooled GTX 980 Ti that uses the ACX 2.0+ cooler. According to EVGA, the new cooler's "memory MOSFET Cooling Plate (MMCP) reduces MOSFET temperatures up to 13%, and optimized Straight Heat Pipes (SHP) additionally reduce GPU temperature by 5C. ACX 2.0+ coolers also feature optimized swept fan blades, double ball bearings and an extreme low power motor, delivering more air flow with less power, unlocking additional power for the GPU." We're looking forward to some hands-on testing with this card when it shows up on Monday morning.
Also due for an update is the EVGA Classified line, often considered one of the best choices for overclockers and extreme enthusiasts. Though this card also uses the ACX 2.0+ cooler, it will include additional power delivery improvements on the PCB that help stretch available performance headroom.
Following in the footsteps of the recently released Titan X Hybrid comes the GTX 980 Ti version. This card will use a standard blower cooler for the memory and power delivery while attaching a self-contained water cooler for the GPU itself. This should keep the GPU temperature down quite a bit though the benefit to real-world overclocking is debatable with the voltage lock that NVIDIA has kept in place. If only they were to change that...
Finally, for the water cooling fans among us we have the GTX 980 Ti Hydro Copper, using a water block from EK.
Interested in clock speeds?
- EVGA 980 Ti ACX 2.0
- Base: 1000 MHz
- Boost: 1076 MHz
- Memory: 7010 MHz
- EVGA 980 Ti Classified
- Base: 1152 MHz
- Boost: 1241 MHz
- Memory: 7010 MHz
- EVGA 980 Ti Hybrid
- Base: 1140 MHz
- Boost: 1228 MHz
- Memory: 7010 MHz
I am still waiting for pricing and availability information which we will pass on as soon as we get it!
Subject: General Tech, Systems | June 1, 2015 - 07:30 AM | Tim Verry
Tagged: steam os, living room gaming, liquid cooling, gaming, DIY, corsair, computex 2015, computex, barebones, 4k
Today at Computex, Corsair unveiled a new barebones gaming PC aimed at the living room. The compact Bulldog PC is an upgradeable barebones DIY kit that offers gamers an interesting base from which to build a living room PC capable of 4K gaming. The chassis resembles an overbuilt console in that it is a short but wide design with many angular edges and aesthetic touches including stylized black case feet and red accents surrounding the vents. A hidden panel in the lower right corner reveals two USB 3.0 ports and two audio jacks. It looks ready to fight in the next season of Robot Wars should you add a flamethrower or hydraulic flipper (heh).
The Bulldog kit consists of the chassis, motherboard, small form factor power supply, and a customized Hydro H55F series closed loop liquid CPU cooler. From there, users need to bring their own processor, RAM, and storage devices. There is no operating system included with the kit, but, being a full PC, it supports Windows, Linux, SteamOS, and others.
As far as graphics cards go, Corsair is offering several liquid cooled NVIDIA graphics cards (initially only from MSI, with other AIB partner cards to follow) that are ready to be installed in the Bulldog PC. Currently, users can choose from the GTX TITAN X, GTX 980, and GTX 970.
Alternatively, Corsair is offering a $99 (MSRP) upgrade kit for existing graphics cards with its Hydro H55 cooler and HG110 bracket.
The Bulldog case supports Mini ITX form factor motherboards and it appears that Corsair is including the Asus Z97I-Plus which is a socket 1150 board supporting Haswell-based Core processors, DDR3 memory, M.2 (though you have to take the board out of the case to install the drive since the slot is on the underside of the board), a single PCI-E 3.0 x16 slot, four SATA 6.0 Gbps ports, and the usual fare of I/O options including USB 3.0, 802.11ac Wi-Fi, and optical and analog audio outputs (among others).
A mini ITX motherboard paired with the small form factor Corsair H55F CPU cooler (left) and the internal layout of the Bulldog case with all components installed (right).
User purchased processors are cooled by the included liquid cooler which is a customized Hydro series cooler that mounts over the processor and exhausts air blower style out of the back of the case. The system is powered by the pre-installed 600W Corsair FS600 power supply. The PSU is mounted in the front of the system and the graphics card radiator and fan are mounted horizontally beside it. Along the left side of the case are mounts for a single 2.5" drive and a single 3.5" drive.
GPU manufacturers will be selling cards with liquid coolers pre-installed. Users can also upgrade existing air cooled graphics cards with an optional upgrade kit.
The liquid cooling aspect of the Bulldog is neat and, according to Corsair, is what is enabling them to cram so much hardware together into a relatively small case while enabling thermal headroom for overclocking and quieter operation versus air coolers.
I am curious how well the CPU cooler performs especially as far as noise levels go with the compacted and shrouded design. Also, while there is certainly plenty of ventilation along the sides of the case to draw in cool air, I'm interested in how well the GPU HSF will be able to exhaust the heat since there are no top grilles.
Corsair is marketing the Bulldog as the next step up from your typical Steam Machine and game console and the first 4K capable gaming PC designed for the living room. Further, it would be a nice stepping stone for console gamers to jump into PC gaming.
From the press release:
“Bulldog is designed to take the 4K gaming experience delivered by desktop gaming PCs, and bring it to the big 4K screens in the home,” said Andy Paul, CEO of Corsair Components. “We knew we needed to deliver a solution that was elegant, powerful, and compact. By leveraging our leading expertise in PC case design and liquid cooling, we met that goal with Bulldog. We can’t wait to unleash it on gamers this fall.”
The Bulldog DIY PC kit is slated for an early Q4 2015 launch with a MSRP of $399. After adding in a processor, memory, storage, and graphics, Corsair estimates a completed build to start around $940 with liquid cooled graphics ($600 without a dedicated GPU) and tops out at $2,250.
Keep in mind that the lowest tier liquid cooled GPU at launch will be the MSI GTX 970 (~$340). Users could get these prices down a bit with some smart shopping and component selection along with the optional $99 upgrade kit for other GPU options. It is also worth considering that the Bulldog is being positioned as a 4K gaming machine. If you were willing to start off with a 1080p setup, you could get by with a cheaper graphics card and upgrade later along with your TV when 4K televisions are cheaper and more widespread.
At its core, $400 for the Bulldog kit (case, quality power supply, high end motherboard, and closed loop CPU cooler) is a decent value that just might entice some console gamers to explore the world of PC gaming (and to never leave following their first Steam sale, heh)! It is a big commitment for sure at that price, but it looks like Corsair is using quality components, and while there is surely the usual small form factor price premium (especially for cases), it is far from obnoxious.
What do you think about the Bulldog? Is it more bark than bite, or is it a console killer?
Subject: General Tech | June 1, 2015 - 08:07 AM | Sebastian Peak
Tagged: STRAFE, mechanical keyboard, gaming keyboard, corsair, computex 2015, computex, Cherry MX
Corsair has announced the STRAFE mechanical gaming keyboard featuring Cherry MX switches, and the company is calling it the “most advanced mono-backlit mechanical gaming keyboard available”.
“The STRAFE mechanical gaming keyboard’s brilliant red backlighting can be customized to a virtually unlimited number of lighting configurations and effects. Each key can be programmed with automated macros using CUE (Corsair Utility Engine) software. Users can choose from six unique lighting effects or craft their own custom profiles and share them on www.corsairgaming.com.”
The STRAFE features:
- German-made Cherry MX red switches with gold contacts for fast, precise key presses
- Fully programmable brilliant red LED backlighting for unrivaled personalization
- USB pass-through port for easy connections
- Textured and contoured FPS/MOBA keycaps
- 100% anti-ghosting technology with 104-key rollover
- Enhanced, easy-access multimedia controls
The Corsair STRAFE has an MSRP of $109.99 and will be available in June.
Subject: Memory | May 26, 2015 - 06:22 PM | Jeremy Hellstrom
Tagged: ddr4-2666, G.Skill, Ripjaws 4
The price may still sting a bit, but honestly it is only a small premium over many 16GB DDR3 kits, so DDR4 pricing is getting much better. G.Skill's 16GB DDR4-2666 quad channel kit has timings of 15-15-15-35 and is fully XMP compliant, so getting the modules out of the clamshell packaging may be the hardest step in installing them. Of course many readers here, just like at Bjorn3D, are not going to be satisfied with the default settings, which brings us to the overclocking results: 3048MHz @ 16-16-16-37 was perfectly stable in their testing at 1.35V, and for those who don't mind the long term effects of upping the voltage to 1.4V there is more headroom left.
"G.Skill has been churning out enthusiast memory that overclocks like nothing else we’ve ever seen. Pop a set of Ripjaws 4 into your dream machine and settle into the BIOS for an overclocking experience like you’ve never had!"
Here are some more Memory articles from around the web:
- G.SKILL Ripjaws 4 3000MHz @ Bjorn3d
- Patriot Viper 4 16GB DDR4 3000MHz Memory Kit Review @ Neoseeker
- Crucial Ballistix Sport 2400MHz Quad Channel DDR4 16GB Memory Kit @ eTeknix
Subject: Mobile, Shows and Expos | May 29, 2015 - 03:42 PM | Scott Michaud
Tagged: Android, google, google io, google io 2015
I'll be honest with you: I did not see a whole lot that interested me out of the Google I/O keynote. The company released a developer preview of their upcoming Android OS “M”, which refers to the thirteenth alphabetical release (although only eleven were formally lettered because they started with “C”upcake). Version nomenclature aside, this release is supposed to tune the experience. While the platform could benefit from a tune-up, that focus is also synonymous with not introducing major features.
But some things are being added, including “Google Now on Tap”. The idea is that Google will understand what is happening on screen and allow the user to access more information about it. In a demo on Engadget, the user was looking at scores for the Golden State Warriors. She asked “When are they playing next”, actually using the pronoun “they”, and the phone brought up their next game (it was against the Cavaliers).
Fingerprint reading and Android Pay are also being added to this release.
Other than that, it is mostly performance and usability. One example is “Doze State”, which allows the OS to update less frequently when the device is inactive. It is supposed to play nice with alarms and notifications though, which is good. Normally, I would wait to see if it actually works before commenting on it, but this seems like something that would only be a problem if no-one thought of it. Someone clearly did, because they apparently mentioned it at the event.
Android M, whatever it will actually be called, is expected to ship to consumers in the Fall.
Subject: Graphics Cards | June 1, 2015 - 10:58 AM | Ryan Shrout
Tagged: evga, precisionx, dx12, DirectX 12
Another interesting bit of news surrounding Computex and the new GTX 980 Ti comes from EVGA and its PrecisionX software. This is easily our favorite tool for overclocking and GPU monitoring, so it's great to see the company continuing to push forward with features and capability. EVGA is the first to add full support for DX12 with an overlay.
What does that mean? It means that as DX12 applications find their way out to consumers and media, we will have a tool that can help measure performance and monitor GPU speeds and feeds via the PrecisionX overlay. Before this release, we were running in the dark with DX12 demos, so this is great news!
You can download the latest version over on EVGA's website!