
Contest: Win a 400GB Intel 750 Series SSD from Intel and PC Perspective!

Subject: Editorial | May 29, 2015 - 12:37 PM |
Tagged: SSD 750, PCI Express, NVMe, Intel, giveaway, contest, 750 series

PC Perspective and Intel are partnering to offer up a giveaway with some pretty impressive swag. Surely by now you have read all about the new Intel SSD 750 Series of products, a new class of solid state drive that combines four lanes of PCI Express 3.0 with a new protocol called NVM Express (NVMe) for impressive throughput. In Allyn's review of the SSD in April he called it "the obvious choice for consumers who demand the most from their storage" and gave it a PC Perspective Editor's Choice Award!

contest1.jpg

Thanks to our friends at Intel we are going to be handing out a pair of the 400GB add-in card models to loyal PC Perspective readers and viewers. How can you enter? The rules are dead simple:

  1. Fill out the contest entry form below to find multiple entry methods including reading our review, answering a question about Intel SSD 750 Series specs or following us on Twitter. You can fill out one or all of the methods - the more you do the better your chances!
     
  2. Leave a comment on the news post below thanking Intel for sponsoring PC Perspective and for supplying this hardware for us to give to you!
     
  3. This is a global contest - so feel free to enter from anywhere in the world!
     
  4. Contest will close on June 2nd, 2015.

Win an Intel SSD 750 Series From PC Perspective and Intel!

Our most sincere thanks to Intel for bringing this contest to PC Perspective's readers and fans. Good luck to everyone (except Josh)!

Sponsored by Intel

contest2.jpg

Product Specifications

Capacity | Sequential 128KB Read (up to) | Sequential 128KB Write (up to) | Random 4KB Read (up to) | Random 4KB Write (up to) | Form Factor | Interface
400 GB | 2,200 MB/s | 900 MB/s | 430,000 IOPS | 230,000 IOPS | 2.5-inch x 15mm | PCI Express Gen3 x4
1.2 TB | 2,400 MB/s | 1,200 MB/s | 440,000 IOPS | 290,000 IOPS | 2.5-inch x 15mm | PCI Express Gen3 x4
400 GB | 2,200 MB/s | 900 MB/s | 430,000 IOPS | 230,000 IOPS | Half-height half-length (HHHL) add-in card | PCI Express Gen3 x4
1.2 TB | 2,400 MB/s | 1,200 MB/s | 440,000 IOPS | 290,000 IOPS | Half-height half-length (HHHL) add-in card | PCI Express Gen3 x4

Experience the future of storage performance for desktop client and workstation users with the Intel® SSD 750 Series. The Intel SSD 750 Series delivers uncompromised performance by utilizing NVM Express* over four lanes of PCIe* 3.0.

With both Add-in Card and 2.5-inch form factors, the Intel SSD 750 Series eases migration from SATA to PCIe 3.0 without power or thermal limitations on performance. The SSD can now deliver the ultimate in performance in a variety of system form factors and configurations.

Source: Intel.com

Manufacturer: NVIDIA

Specifications

When NVIDIA launched the GeForce GTX Titan X card back in March of this year, I knew immediately that a GTX 980 Ti would be close behind. The Titan X was so different from the GTX 980 when it came to pricing and memory capacity (12GB, really??) that NVIDIA had set up the perfect gap in which to place the newly minted GTX 980 Ti. Today we get to take the wraps off of that new graphics card and I think you'll be impressed with what you find, especially when you compare its value to the Titan X.

Based on the same Maxwell architecture and GM200 GPU, with some minor changes to GPU core count, memory size and boost speeds, the GTX 980 Ti finds itself in a unique spot in the GeForce lineup. Performance-wise it's basically identical in real-world game testing to the GTX Titan X, yet it is priced $350 less than that 12GB behemoth. Couple that with a modest $50 price drop on the GTX 980 cards and you have all the markers of an enthusiast graphics card that will sell as well as any we have seen in recent generations.

P1020283.jpg

The devil is in all the other details, of course. AMD has its own plans for this summer but the Radeon R9 290X is still sitting there at a measly $320, undercutting the GTX 980 Ti by more than half. NVIDIA seems to be pricing its own GPUs as if it isn't even concerned with what AMD and the Radeon brand are doing. That could be dangerous if it goes on too long, but for today, can the R9 290X, with its aging Hawaii XT GPU, put up enough of a fight to make its value case to gamers on the fence?

Will the GeForce GTX 980 Ti be the next high-end GPU to make a splash in the market, or will it make a thud at the bottom of the GPU gene pool? Let's dive into it, shall we?

Continue reading our review of the new NVIDIA GeForce GTX 980 Ti 6GB Graphics Card!!

Computex 2015: ASUS 3800R 34″ Ultrawide 3440x1440 IPS 21:9 Curved G-SYNC Monitor

Subject: Displays | June 1, 2015 - 12:21 PM |
Tagged: nvidia, gsync, g-sync, asus, 3800R

At Computex this week ASUS is showing off a prototype of the new ROG 3800R monitor, a 34-in curved display with a 3440x1440 resolution and G-Sync variable refresh rate capability. ASUS claims on its PCDIY blog that the 21:9 aspect ratio was one of the "most requested" specifications for a new ROG monitor, followed by a curved design. The result is a gorgeous display:

ROG-34-inch-curved-gaming-monitor.jpg

Here's a list of specifications:

  • 34” optimal dimension for QHD resolutions with 3440×1440 resolution
  • 21:9 ultra-wide aspect ratio for increased immersion and improved horizontal workflow
  • IPS based panel for superior color reproduction, black levels and reduction of color shifting
  • NVIDIA G-SYNC equipped offering smooth, fluid and tear free gaming with improved motion clarity. Additionally equipped with ULMB operating mode for outstanding motion clarity.
  • Frameless design for seamless surround gaming
  • ASUS exclusive GamePlus feature and Turbo Key
  • Ergonomic adjustment including tilt, swivel and height adjustment

Hot damn, we want one of these and we want it yesterday! There is no mention of the refresh rate of the display here, though we did see information from NVIDIA that ASUS was planning a 34x14 60 Hz screen - but we are not sure this is the same model being shown. And the inclusion of ULMB would normally indicate a refresh rate above 60-75 Hz...

Another interesting note: this monitor appears to include both DisplayPort and HDMI connectivity.

This 34-inch 3800R curved display features wide-viewing angles, a 3440 x 1440 native resolution, and a 21:9 aspect ratio. It features NVIDIA® G-SYNC™ display technology to deliver smooth, lag-free visuals. G-SYNC synchronizes the display’s refresh rate to the GPU in any GeForce® GTX™-powered PC to eliminate screen tearing and minimize display stutter and input lag. This results in sharper, more vibrant images, and more fluid and responsive gameplay. It has extensive connectivity options that include DisplayPort and HDMI.

The above information came from ASUS just a few short hours ago, so you can assume that it is accurate. Could this be the start of panels that integrate dual scalers (G-Sync module plus something else) to offer more connectivity, or has the G-Sync module been updated to support more inputs? We'll find out!

Rumor: Intel Core i7-6700K (Skylake-S) Benchmarks Leaked

Subject: Processors | May 28, 2015 - 03:44 PM |
Tagged: Intel, Skylake, skylake-s, haswell, devil's canyon

For a while, it was unclear whether we would see Broadwell on the desktop. With the recently leaked benchmarks of the Intel Core i7-6700K, it seems all but certain that Intel will skip it and go straight to Skylake. Compared to Devil's Canyon, the Haswell-based Core i7-4790K, the Skylake-S Core i7-6700K has the same base clock (4.0 GHz) and the same full-processor Turbo clock (4.2 GHz). Pretty much every improvement that you see is pure instructions-per-clock (IPC) gain.

intel-2015-cpumonkey-skylakes-benchmark.png

Image Credit: CPU Monkey

In multi-threaded applications, the Core i7-6700K tends to get about a 9% increase while, when a single core is being loaded, it tends to get about a 4% increase. Part of this gap might be the slightly lower single-core Turbo clock, which is said to be 4.2 GHz instead of 4.4 GHz: that is roughly a 5% clock deficit, so a 4% gain in spite of it would imply an IPC uplift of around 9%, right in line with the multi-threaded result. There might also be some increased efficiency with HyperThreading or cache access -- I don't know -- but it would be interesting to see.

I should note that we know nothing about the GPU. In fact, CPU Monkey fails to list a GPU at all. Intel has expressed interest in bringing Iris Pro-class graphics to the high-end mainstream desktop processors. For someone who is interested in GPU compute, especially with explicit, unlinked multi-adapter upcoming in DirectX 12, it would be nice to see GPUs be ubiquitous and always enabled. The chip is expected to have the new GT4e graphics with 72 execution units and either 64 or 128MB of eDRAM. If clocks are equivalent, this could translate to well over a teraflop (~1.2 TFLOPs) of compute performance in addition to discrete graphics. In discrete graphics terms, that would be nearly equivalent to an NVIDIA GTX 560 Ti.

We are expecting to see the Core i7-6700K launch in Q3 of this year. We'll see.

Source: CPU Monkey
Manufacturer: NVIDIA

SHIELD Specifications

Announced at last year's Google I/O event in June 2014, Android TV is a platform developed by Google, running Android 5.0 and higher, that aims to create an interactive experience for the TV. This platform can be built into a TV directly as well as into set-top style boxes, like the NVIDIA SHIELD we are looking at today. The idea is to bring the breadth of apps and content to the TV through the Android operating system in a way that is both convenient and intuitive.

NVIDIA announced SHIELD back in March at GDC as the first product to use the company’s latest Tegra processor, the X1. This SoC combines an 8-core big.LITTLE ARM processor design with a 256-core implementation of the NVIDIA Maxwell GPU architecture, providing GPU performance previously unseen in an Android device. I have already spent some time with the NVIDIA SHIELD at various events and the promise was clearly there to make it a leading option for Android TV adoption, but obviously there were questions to be answered.

DSC01740.jpg

Today’s article will focus on my early impressions with the NVIDIA SHIELD, having used it both in the office and at home for a handful of days. As you’ll see during the discussion there are still some things to be ironed out, some functionality that needs to be added before SHIELD and Android TV can really be called a must-buy product. But I do think it will get there.

And though this review will focus on the NVIDIA SHIELD, it's impossible not to tie the success of SHIELD to the success of Google's Android TV. The dominant use case for SHIELD is as a media playback device, with the gaming functionality as a really cool side project for enthusiasts and gamers looking for another outlet. For SHIELD to succeed, Google needs to prove that Android TV can improve on other integrated smart TV platforms as well as other set-top box platforms like Boxee, Roku and even the upcoming Apple TV refresh.

But first, let’s get an overview of the NVIDIA SHIELD device, pricing and specifications, before diving into my experiences with the platform as a whole.

Continue reading our review of the new NVIDIA SHIELD with Android TV!!

NVIDIA G-Sync for Notebooks Announced, No Module Required

Subject: Displays, Mobile | May 31, 2015 - 06:00 PM |
Tagged: nvidia, notebooks, msi, mobile, gsync, g-sync, asus

If you remember back to January of this year, Allyn and I posted an article that confirmed the existence of a mobile variant of G-Sync thanks to a leaked driver and an ASUS G751 notebook. Rumors and speculation floated around the Internet ether for a few days but we eventually got official word from NVIDIA that G-Sync for notebooks was a real thing and that it would launch "soon." Well, that day is finally here with the beginning of Computex.

mobile1.jpg

G-Sync for notebooks has no clever branding, no "G-Sync Mobile" or anything like that, so discussing it will be a bit more difficult since the technologies are different. Going forward, NVIDIA claims that any gaming notebook using NVIDIA GeForce GPUs will be a G-Sync notebook and will support all of the goodness that variable refresh rate gaming provides. This is fantastic news, as notebook gaming often runs at lower frame rates than you would find on a desktop PC thanks to lower-powered hardware driving comparable resolution (1080p, 1440p) displays.

mobile2.jpg

Of course, as we discovered in our first look at G-Sync for notebooks back in January, the much debated G-Sync module is not required and will not be present on notebooks featuring the variable refresh technology. So what gives? We went over some of this before, but it deserves to be detailed again.

NVIDIA uses the diagram above to demonstrate the headaches presented by the monitor and GPU communication path before G-Sync was released. You had three different components - the GPU, the monitor scaler and the monitor panel - that all needed to work together if VRR was going to become a high quality addition to the gaming ecosystem.

mobile3.jpg

NVIDIA's answer was to take over all aspects of the pathway for pixels from the GPU to the eyeball, creating the G-Sync module and helping OEMs hand pick the best panels that would work with VRR technology. This helped NVIDIA make sure it could improve the user experience, such as by implementing an algorithmic low-frame-rate, frame-doubling capability to maintain smooth and tear-free gaming at frame rates under the panel's physical limitations. It also allowed NVIDIA to tune the G-Sync module to the specific panel to help with ghosting and to implement variable overdrive logic.

980ti-32.jpg

All of this is required because of the incredible amount of variability in the monitor and panel markets today.

mobile4.jpg

But with notebooks, NVIDIA argues, there is no variability at all to deal with. The notebook OEM gets to handpick the panel, and the GPU interfaces with the screen directly instead of passing through a scaler chip. (Note that some desktop monitors like the ever popular Dell 3007WFP did this as well.) There is no other piece of logic in the way attempting to enforce a fixed refresh rate. Because of that direct connection, the GPU is able to control the data passing between it and the display without any other logic working in the middle. This makes implementing VRR technology much simpler and helps with quality control, because NVIDIA can validate the panels with the OEMs.

mobile5.jpg

As I mentioned above, going forward, all new notebooks using GTX graphics will be G-Sync notebooks and that should solidify NVIDIA's dominance in the mobile gaming market. NVIDIA will be picking the panels, and tuning the driver for them specifically, to implement anti-ghosting technology (like what exists on the G-Sync module today) and low frame rate doubling. NVIDIA also claims that the world's first 75 Hz notebook panels will ship with GeForce GTX and will be G-Sync enabled this summer - something I am definitely looking forward to trying out myself.

Though it wasn't mentioned, I am hopeful that NVIDIA will continue to allow users the ability to disable V-Sync at frame rates above the maximum refresh of these notebook panels. With most of them limited to 60 Hz (but this applies to 75 Hz as well) the most demanding gamers are going to want that same promise of minimal latency.

mobile6.jpg

At Computex we'll see a handful of models announced with G-Sync up and running. It should be no surprise of course to see the ASUS G751 with the GeForce GTX 980M GPU on this list as it was the model we used in our leaked driver testing back in January. MSI will also launch the GT72 G with a 1080p G-Sync ready display and GTX 980M/970M GPU option. Gigabyte will have a pair of notebooks: the Aorus X7 Pro-SYNC with GTX 970M SLI and a 1080p screen as well as the Aorus X5 with a pair of GTX 965M in SLI and a 3K resolution (2560x1440) screen. 

This move is great for gamers and I am eager to see what the resulting experience is for users that pick up these machines. I have long been known as a proponent of variable refresh displays and getting access to that technology on your notebook is a victory for NVIDIA's team.

LG 27MU67: 27-inch, 4K, IPS, FreeSync Monitor

Subject: Displays | May 29, 2015 - 04:59 PM |
Tagged: LG, ips, freesync, 4k

LG Australia published a product page for the LG 27MU67 monitor, whose existence the rest of the company doesn't seem to acknowledge. It is still online, even after three days in which someone could have pulled the plug. This one is interesting for a variety of reasons: it's 4K, it's IPS, and it supports AMD FreeSync. It is also relatively cheap for that combination, being listed at $799 AUD RRP.

lg-2015-freesync4kips-27mu67.jpg

Some websites have converted that to ~$610 to $620 USD, but it might even be less than that. Australian prices are often listed with their federal tax rolled in, which would yield a price that is inflated about 10%; strip out the GST and you are left with roughly AUD $726, which converts to somewhere around $555 USD. It is possible, though maybe wishful thinking, that this monitor could retail in the ~$500 to $550 price range for the United States (if it even comes to North America). Again, this is a 4K, IPS, FreeSync panel.

Very little is posted on LG's website and thus it is hard to tell how good of an IPS panel this is. It is listed as 99% sRGB coverage, which is good for typical video but not the best if you are working on printed content, such as magazine illustrations. On the other hand, this is a gaming panel, not a professional one. Update (May 29, 2015): It also has 10-bit (per channel) color. It sounds like it is true 10-bit, not just a look-up table, but I should note that it doesn't explicitly say that.

Again, pricing and availability is up in the air, because this is not an official announcement. It is listed to launch in Australia for $799 AUD, though.

Source: LG Australia

NVIDIA G-Sync Update: New Monitors, Windowed Mode, V-Sync Options

Subject: Displays | May 31, 2015 - 06:00 PM |
Tagged: nvidia, gsync, g-sync, computex 2015, computex

In conjunction with the release of the new GeForce GTX 980 Ti graphics card today, NVIDIA is making a handful of other announcements around the GeForce brand. The most dramatic of these centers on the company's variable refresh monitor technology called G-Sync. I assume that any of you reading this are already intimately familiar with what G-Sync is, but if not, check out this story that dives into how it compares with AMD's rival tech called FreeSync.

First, NVIDIA is announcing a set of seven new G-Sync ready monitors that will be available this summer and fall from ASUS and Acer.

980ti-42.jpg

Many of these offer panel configurations we haven't yet seen in a G-Sync display. Take the Acer X34 for example: this 34-in monitor falls into the 21:9 aspect ratio form factor, with a curved screen and a 3440x1440 resolution. The refresh rate will peak at 75 Hz while the panel offers the color consistency and viewing angles of an IPS screen. This is the first 21:9, the first 34x14 and the first curved monitor to support G-Sync, and with a 75 Hz maximum refresh it should provide a solid gaming experience. ASUS has a similar model, the PG34Q, though it peaks at a refresh rate of 60 Hz.

ASUS will be updating the wildly popular ROG Swift PG278Q display with the PG279Q, another 27-in monitor with a 2560x1440 resolution. Only this time it will run at 144 Hz with an IPS screen rather than TN, again resulting in improved color clarity, better viewing angles and lower eye strain.

Those of you on the lookout for 4K panels with G-Sync support will be happy to find IPS iterations of that configuration, though they will still peak at a 60 Hz refresh rate - as much a limitation of DisplayPort as anything else.

Another technology addition for G-Sync with the 352-series (353-series, sorry!) driver released today is support for windowed mode variable refresh.

gsync-windows.jpg

By working some magic with the DWM (Desktop Window Manager), NVIDIA was able to allow VRR to operate without requiring a game to be in full screen mode. For gamers that like to play windowed or borderless windowed while using secondary or large displays for other side activities, this is going to be a great addition to the G-Sync portfolio.

Finally, after much harassment and public shaming, NVIDIA is going to give users the choice to enable or disable V-Sync when the game's render rate exceeds the maximum refresh rate of the attached G-Sync monitor.

gsyncsettings2.png

One of the complaints about G-Sync has been that it is restrictive on the high side of the VRR window for its monitors. While FreeSync lets you selectively enable or disable V-Sync when your frame rate goes above the maximum refresh rate, G-Sync forced users into a V-Sync enabled state. The reasoning from NVIDIA was that allowing horizontal tearing of any kind with G-Sync enabled would ruin the experience and/or damage the technology's reputation. But now, while the default will still be to keep V-Sync on, gamers will be able to manually set the V-Sync mode to off with a G-Sync monitor.

Why is this useful? Many gamers believe that a drawback to V-Sync enabled gaming is the added latency of waiting for a monitor to refresh before drawing a frame that might be ready to be shown to the user immediately. G-Sync fixes this from frame rates of 1 FPS up to the maximum refresh of the G-Sync monitor (144 FPS, 75 FPS, 60 FPS), but now, rather than being stuck with tear-free but latency-adding V-Sync when gaming over the max refresh, you'll be able to play with tearing on the screen but lower input latency. This could be especially useful for gamers using 60 Hz G-Sync monitors with 4K resolutions.
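To illustrate the tradeoff, here is a minimal C sketch of the three cases a finished frame can land in on a 60 Hz G-Sync panel. The function and constant names are our own hypothetical illustration of the decision described above, not NVIDIA's actual driver logic:

    #include <stdio.h>

    /* Hypothetical sketch (not NVIDIA's driver code): how a finished frame is
     * handled relative to a 60 Hz panel's ~16.7 ms minimum refresh interval. */
    static const double MIN_REFRESH_MS = 1000.0 / 60.0;

    static void present(double frame_time_ms, int vsync_on)
    {
        if (frame_time_ms >= MIN_REFRESH_MS) {
            /* Inside the VRR window: the panel refresh is slaved to frame
             * completion, so there is no tearing and no added latency. */
            printf("%5.1f ms frame: refresh panel immediately (VRR)\n", frame_time_ms);
        } else if (vsync_on) {
            /* Above max refresh with V-Sync on: hold the frame for the next
             * refresh - tear-free, but up to one refresh of added latency. */
            printf("%5.1f ms frame: wait %.1f ms for next refresh (V-Sync on)\n",
                   frame_time_ms, MIN_REFRESH_MS - frame_time_ms);
        } else {
            /* The new option - V-Sync off: scan out mid-refresh for minimal
             * input latency, at the cost of possible tearing. */
            printf("%5.1f ms frame: scan out now, may tear (V-Sync off)\n", frame_time_ms);
        }
    }

    int main(void)
    {
        present(25.0, 1); /* 40 FPS: inside the VRR window either way */
        present(8.0, 1);  /* 125 FPS with V-Sync on: latency added */
        present(8.0, 0);  /* 125 FPS with V-Sync off: lower latency, may tear */
        return 0;
    }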

Oh, actually one more thing: you'll now be able to enable ULMB (ultra low motion blur) mode in the driver as well without requiring entry into your display's OSD.

gsyncsettings1.png

NVIDIA is also officially announcing G-Sync for notebooks at Computex. More on that in this story!

Manufacturer: Sapphire

Big Things, Small Packages

Sapphire isn’t a brand we have covered in a while, so it is nice to see a new and interesting product drop on our doorstep.  Sapphire was a relative unknown until around the release of the Radeon 9700 Pro.  That was around the time ATI decided that they did not want to be so vertically integrated, and so allowed other companies to start buying their chips and making their own cards.  This was done to provide a bit of stability for ATI's pricing, as they didn’t have to worry about a volatile component market that could cause their margins to plummet.  By selling just the chips to partners, ATI could more adequately control margins on their own product while allowing their partners to make their own deals and component choices for the finished card.

r9_285_itx_01.jpg

ATI had very limited graphics card production of their own, so they often would farm out production to second sources.  One of these sources ended up turning into Sapphire.  When ATI finally allowed other partners to produce and brand their own ATI based products, Sapphire already had a leg up on the competition as a large, established producer of ATI products.  They soon controlled a good portion of the marketplace thanks to their contacts, pricing, and close relationship with ATI.

Since that time ATI has been bought up by AMD, and ATI branded cards are no longer produced.  Staying vertical when it comes to producing both their own chips and their own video cards would obviously have been a bad idea; we can look back at 3dfx's attempt at vertical integration and how that ended for the company.  AMD obviously produces an initial reference version of their cards and coolers, but allows their partners to sell the “sticker” version and then develop their own designs.  This has worked very well for both NVIDIA and AMD, and it has allowed their partners to further differentiate their products from the competition.

r9_285_itx_02.jpg

Sapphire usually does a bang up job on packaging the graphics card. Oh look, a mousepad!

Sapphire is not as big of a player as they used to be, but they are still one of the primary partners of AMD.  It would not surprise me in the least if they still produced the reference designs for AMD and then distributed those products to other partners.  Sapphire is known for building very good quality cards, and their cooling solutions have been well received as well.  The company does have some stiff competition from the likes of Asus and MSI, among others, in this particular market.  Unlike those two companies, Sapphire obviously does not make any NVIDIA based boards.  This has been a blessing and a curse, depending on what the cycle looks like between AMD and NVIDIA and who has dominance in any particular marketplace.

Click here to read the entire Sapphire R9 285 ITX OC Review!

Rumor: AMD To Name Upcoming Flagship Fiji GPU "Radeon Fury"

Subject: Graphics Cards | May 29, 2015 - 11:05 AM |
Tagged: rumors, radeon, hbm, graphics, gpu, Fury, Fiji, amd

Another rumor has emerged about an upcoming GPU from AMD, and this time it's a possible name for the HBM-powered Fiji card a lot of us have been speculating about.

RAGE-FURY-MAXX.jpg

New box art revealed! (Or a product from early 2000.) Image: VideoCardz

The rumor from VideoCardz via Expreview (have to love the multiple layers of reporting here) states that the new card will be named Radeon Fury:

"Radeon Fury would be AMD’s response to growing popularity of TITAN series. It is yet unclear how AMD is planning to adopt Fury naming schema. Are we going to see Fury XT or Fury PRO? Well, let’s just wait and see. This rumor also means that Radeon R9 390X will be a direct rebrand of R9 290X with 8GB memory."

Of course this is completely unsubstantiated, and Fury is a branding scheme from the ATI days, but who knows? I can only hope that if true, AMD will adopt all caps: TITAN! FURY! Feel the excitement. What do you think of this possible name for the upcoming AMD flagship GPU?

Source: VideoCardz

Rumor: Only Xeon-based Skylake CPUs Getting AVX-512

Subject: Processors | May 27, 2015 - 09:45 PM |
Tagged: xeon, Skylake, Intel, Cannonlake, avx-512

AVX-512 is an instruction set that expands the CPU's vector registers from 256-bit to 512-bit. It comes with a core specification, AVX-512 Foundation, and several extensions that can be added where it makes sense. For instance, AVX-512 Exponential and Reciprocal Instructions (ERI) help solve transcendental problems, which occur in geometry and are useful for GPU-style architectures. As such, it appears in Knights Landing but not anywhere else.

intel-2015-instruction-set-support.png

Image Credit: Bits and Chips

Today's rumor is that Skylake, the successor to Broadwell, will not include any AVX-512 support in its consumer parts. According to the lineup, Xeons based on Skylake will support AVX-512 Foundation, Conflict Detection Instructions, Vector Length Extensions, Byte and Word Instructions, and Double and Quadword Instructions. Fused Multiply and Add for 52-bit Integers and Vector Byte Manipulation Instructions will not arrive until Cannonlake shrinks everything down to 10nm.

The main advantage of larger registers is speed. When you can fit 512 bits of data in a single register and operate on it at once, you are able to do several linked calculations together. AVX-512 can operate on sixteen 32-bit values at the same time, which is obviously sixteen times the compute performance compared with doing just one at a time... if all sixteen undergo the same operation. This is especially useful for games, media, and other vector-based workloads (like science).
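To make the sixteen-lane point concrete, here is a minimal C sketch using the AVX-512F intrinsics. It assumes a compiler and CPU with AVX-512 Foundation support (built with, for example, gcc -mavx512f) and is simply an illustration of the register width:

    #include <immintrin.h>
    #include <stdio.h>

    int main(void)
    {
        float a[16], b[16], out[16];
        for (int i = 0; i < 16; i++) { a[i] = (float)i; b[i] = 100.0f; }

        /* One 512-bit zmm register holds sixteen 32-bit floats... */
        __m512 va = _mm512_loadu_ps(a);
        __m512 vb = _mm512_loadu_ps(b);

        /* ...so a single instruction performs all sixteen additions at once. */
        __m512 vc = _mm512_add_ps(va, vb);
        _mm512_storeu_ps(out, vc);

        for (int i = 0; i < 16; i++)
            printf("%.0f ", out[i]); /* prints 100 101 ... 115 */
        printf("\n");
        return 0;
    }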

This also makes me question whether the entire Cannonlake product stack will support AVX-512. While vectorization is a cheap way to get performance for suitable workloads, it does consume a large number of transistors (wider memory paths, extra instructions, etc.). Hopefully Intel will be able to afford the cost with the next die shrink.

Corsair Unleashes the Bulldog, an Upgradeable, Liquid Cooled 4K Gaming PC For The Living Room

Subject: General Tech, Systems | June 1, 2015 - 07:30 AM |
Tagged: steam os, living room gaming, liquid cooling, gaming, DIY, corsair, computex 2015, computex, barebones, 4k

Today at Computex, Corsair unveiled a new barebones gaming PC aimed at the living room. The compact Bulldog PC is an upgradeable barebones DIY kit that offers gamers an interesting base from which to build a living room PC capable of 4K gaming. The chassis resembles an overbuilt console in that it is a short but wide design with many angular edges and aesthetic touches including stylized black case feet and red accents surrounding the vents. A hidden panel in the lower right corner reveals two USB 3.0 ports and two audio jacks. It looks ready to fight in the next season of Robot Wars should you add a flamethrower or hydraulic flipper (heh).

Corsair Bulldog DIY PC 4K Gaming In The Living Room_Front.jpg

The Bulldog kit consists of the chassis, motherboard, small form factor power supply, and a customized Hydro H55F series closed loop liquid CPU cooler. From there, users need to bring their own processor, RAM, and storage devices. There is no operating system included with the kit, but as a full PC it supports Windows, Linux, SteamOS, et al.

As far as graphics cards go, Corsair is offering several liquid cooled NVIDIA graphics cards (initially only from MSI, with other AIB partner cards to follow) that are ready to be installed in the Bulldog PC. Currently, users can choose from the GTX TITAN X, GTX 980, and GTX 970.

Alternatively, Corsair is offering a $99 (MSRP) upgrade kit for existing graphics cards with its Hydro H55 cooler and HG110 bracket.

Corsair Bulldog DIY PC Kit.jpg

The Bulldog case supports Mini ITX form factor motherboards, and it appears that Corsair is including the Asus Z97I-Plus, a socket 1150 board supporting Haswell-based Core processors, DDR3 memory, M.2 (though you have to take the board out of the case to install the drive since the slot is on the underside of the board), a single PCI-E 3.0 x16 slot, four SATA 6.0 Gbps ports, and the usual fare of I/O options including USB 3.0, 802.11ac Wi-Fi, and optical and analog audio outputs (among others).

Corsair Bulldog DIY PC 4K Gaming In The Living Room H55F CPU Cooler.jpg

A mini ITX motherboard paired with the small form factor Corsair H55F CPU cooler (left) and the internal layout of the Bulldog case with all components installed (right).

User-purchased processors are cooled by the included liquid cooler, a customized Hydro series unit that mounts over the processor and exhausts air, blower style, out of the back of the case. The system is powered by the pre-installed 600W Corsair FS600 power supply. The PSU is mounted in the front of the system, and the graphics card radiator and fan are mounted horizontally beside it. Along the left side of the case are mounts for a single 2.5" drive and a single 3.5" drive.

Corsair Bulldog DIY PC 4K Gaming In The Living Room Liquid Cooled Graphics Cards.jpg

GPU manufacturers will be selling cards with liquid coolers pre-installed. Users can also upgrade existing air cooled graphics cards with an optional upgrade kit.

The liquid cooling aspect of the Bulldog is neat and, according to Corsair, is what enables them to cram so much hardware into a relatively small case while leaving thermal headroom for overclocking and quieter operation versus air coolers.

I am curious how well the CPU cooler performs, especially as far as noise levels go, with the compact and shrouded design. Also, while there is certainly plenty of ventilation along the sides of the case to draw in cool air, I'm interested in how well the GPU HSF will be able to exhaust the heat since there are no top grilles.

Corsair is marketing the Bulldog as the next step up from your typical Steam Machine and game console and the first 4K capable gaming PC designed for the living room. Further, it would be a nice stepping stone for console gamers to jump into PC gaming.

From the press release:

“Bulldog is designed to take the 4K gaming experience delivered by desktop gaming PCs, and bring it to the big 4K screens in the home,” said Andy Paul, CEO of Corsair Components. “We knew we needed to deliver a solution that was elegant, powerful, and compact. By leveraging our leading expertise in PC case design and liquid cooling, we met that goal with Bulldog. We can’t wait to unleash it on gamers this fall.”

The Bulldog DIY PC kit is slated for an early Q4 2015 launch with an MSRP of $399. After adding in a processor, memory, storage, and graphics, Corsair estimates a completed build starts around $940 with liquid cooled graphics ($600 without a dedicated GPU) and tops out at $2,250.

Corsair Bulldog DIY PC 4K Gaming In The Living Room.jpg

Keep in mind that the lowest tier liquid cooled GPU at launch will be the MSI GTX 970 (~$340). Users could get these prices down a bit with some smart shopping and component selection along with the optional $99 upgrade kit for other GPU options. It is also worth considering that the Bulldog is being positioned as a 4K gaming machine. If you were willing to start off with a 1080p setup, you could get by with a cheaper graphics card and upgrade later along with your TV when 4K televisions are cheaper and more widespread.

At its core, $400 for the Bulldog kit (case, quality power supply, high end motherboard, and closed loop CPU cooler) is a decent value that just might entice some console gamers to explore the world of PC gaming (and never leave following their first Steam sale, heh)! It is a big commitment for sure at that price, but it looks like Corsair is using quality components, and while there is surely the usual small form factor price premium (especially for cases), it is far from obnoxious.

What do you think about the Bulldog? Is it more bark than bite or is it a console killer?

Source: Corsair

Computex 2015: Corsair STRAFE Mechanical Keyboard

Subject: General Tech | June 1, 2015 - 08:07 AM |
Tagged: STRAFE, mechanical keyboard, gaming keyboard, corsair, computex 2015, computex, Cherry MX

Corsair has announced the STRAFE mechanical gaming keyboard featuring Cherry MX switches, and the company is calling it the “most advanced mono-backlit mechanical gaming keyboard available”.

STRAFE_NA_01.jpg

From Corsair:

“The STRAFE mechanical gaming keyboard’s brilliant red backlighting can be customized to a virtually unlimited number of lighting configurations and effects. Each key can be programmed with automated macros using CUE (Corsair Utility Engine) software. Users can choose from six unique lighting effects or craft their own custom profiles and share them on www.corsairgaming.com.”

The STRAFE features:

  • German-made Cherry MX red switches with gold contacts for fast, precise key presses
  • Fully programmable brilliant red LED backlighting for unrivaled personalization
  • USB pass-through port for easy connections
  • Textured and contoured FPS/MOBA keycaps
  • 100% anti-ghosting technology with 104-key rollover
  • Enhanced, easy-access multimedia controls

Strafe_02.jpg

The Corsair STRAFE has an MSRP of $109.99 and will be available in June.

Source: Corsair

$170 for 16GB of very overclockable DDR4-2666

Subject: Memory | May 26, 2015 - 06:22 PM |
Tagged: ddr4-2666, G.Skill, Ripjaws 4

The price may still sting a bit but honestly, it is only a small premium over many 16GB DDR3 kits, so the pricing on DDR4 is getting much better.  G.Skill's 16GB DDR4-2666 quad channel kit has timings of 15-15-15-35 and is fully XMP compliant, so getting the modules out of the clamshell packaging may be the hardest step in installing them.  Of course many readers here, just like at Bjorn3D, are not going to be satisfied with the default settings, which brings us to the overclocking results.  3048MHz @ 16-16-16-37 was perfectly stable in their testing at 1.35V, and for those who don't mind the long term effects of upping the voltage to 1.4V there is more headroom left.

G.Skill_Ripjaws_4_2666_41-700x650.jpg

"G.Skill has been churning out enthusiast memory that overclocks like nothing else we’ve ever seen. Pop a set of Ripjaws 4 into your dream machine and settle into the BIOS for an overclocking experience like you’ve never had!"


Source: Bjorn3D

Computex 2015: EVGA Shows GTX 980 Ti Variants - Classified, Hybrid and Hydro

Subject: Graphics Cards | June 1, 2015 - 06:00 AM |
Tagged: maxwell, hydro copper, GTX 980 Ti, gm200, evga, computex 2015, computex, classified, acx

With the release of the brand new GeForce GTX 980 Ti from NVIDIA stirring up the week just before Computex in Taipei, you can be sure that all of NVIDIA's partners are going to be out in force showing off their custom graphics card solutions.

980ti.jpg

EVGA has several lined up and they were able to share some information with us. First up is the standard but custom cooled GTX 980 Ti that uses the ACX 2.0+ cooler. In EVGA's words, this new version of the ACX 2.0 cooler includes a "Memory MOSFET Cooling Plate (MMCP) [that] reduces MOSFET temperatures up to 13%, and optimized Straight Heat Pipes (SHP) additionally reduce GPU temperature by 5C. ACX 2.0+ coolers also feature optimized swept fan blades, double ball bearings and an extreme low power motor, delivering more air flow with less power, unlocking additional power for the GPU." We're looking forward to some hands-on testing with this card when it shows up on Monday morning.

Classified.jpg

Also due for an update is the EVGA Classified line, often considered one of the best cards you can buy for overclockers and extreme enthusiasts. Though the card is also using the ACX 2.0+ cooler, it will include additional power delivery improvements on the PCB that help stretch the available performance headroom.

Hybrid.jpg

Following in the footsteps of the recently released Titan X Hybrid comes the GTX 980 Ti version. This card will use a standard blower cooler for the memory and power delivery while attaching a self-contained water cooler for the GPU itself. This should keep the GPU temperature down quite a bit though the benefit to real-world overclocking is debatable with the voltage lock that NVIDIA has kept in place. If only they were to change that...

Hydro.jpg

Finally, for the water cooling fans among us we have the GTX 980 Ti Hydro Copper, using a water block from EK.

Interested in clock speeds?

  • EVGA 980 Ti ACX 2.0
    • Base: 1000 MHz
    • Boost: 1076 MHz
    • Memory: 7010 MHz
  • EVGA 980 Ti Classified
    • Base: 1152 MHz
    • Boost: 1241 MHz
    • Memory: 7010 MHz
  • EVGA 980 Ti Hybrid
    • Base: 1140 MHz
    • Boost: 1228 MHz
    • Memory: 7010 MHz

I am still waiting for pricing and availability information which we will pass on as soon as we get it!

A substantial upgrade for Thunderbolt

Today at Computex, Intel took the wraps off of the latest iteration of Thunderbolt, a technology that I am guessing many of you thought was dead in the water. It turns out that's not the case, and the new set of features that Thunderbolt 3 offers may in fact push it over the crest and give it the momentum needed to become a usable and widespread standard.

First, Thunderbolt 3 starts with a new piece of silicon, code named Alpine Ridge. Not only does Alpine Ridge increase the available Thunderbolt bandwidth to 40 Gbps but it also adds a native USB 3.1 host controller on the chip itself. And, as mobile users will be glad to see, Intel is going to start utilizing the new USB Type-C (USB-C) connector as the standard port rather than mini DisplayPort.

tb3-1.jpg

This new connector type, which was already a favorite among PC Perspective staff because of its size and reversibility, will now be the vehicle for this generation's Thunderbolt connectivity and speed increases. This slide does a good job of summarizing the key takeaways from the TB3 announcement: 40 Gbps of bandwidth, support for two 4K 60 Hz displays, 100 watt (bi-directional) charging capability, 15 watts of device power and support for four protocols including Thunderbolt, DisplayPort, USB and PCI Express.

tb3-2.jpg

Protocol support is important and Thunderbolt 3 over USB-C will be able to connect directly to a DisplayPort monitor, to an external USB 3.1 storage drive, an old thumb drive or a new Thunderbolt 3 docking station. This is truly unrivaled flexibility from a single connector. The USB 3.1 controller is backward compatible as well: feel free to connect any USB device to it that you can adapt to the Type-C connection.

tb3-3.jpg

From a raw performance perspective Thunderbolt 3 offers a total of 40 Gbps of bi-directional bandwidth, twice that of Thunderbolt 2 and 4x what we get with USB 3.1. That offers users the ability to combine many different devices, multiple displays and network connections and have plenty of headroom.

tb3-4.jpg

With Thunderbolt 3 you get twice as much raw video bandwidth - two DP 1.2 streams - allowing you to run not just a single 4K display at 60 Hz but two of them, all over a single TB3 cable. If you want to connect a 5K display though, you will be limited to just one of them: a single 5K 60 Hz stream needs more bandwidth than one DP 1.2 link can carry, so it consumes both streams.

tb3-5.jpg

For mobile users - the area where I think Thunderbolt 3 will be the most effective - the addition of USB 3.1 allows for charging capability up to 100 watts. This is in addition to the 15 watts of power that Thunderbolt provides to devices directly - think external storage, small hubs/docks, etc.

Continue reading our preview of the new Thunderbolt 3 technology!!

Computex 2015: EVGA Builds PrecisionX 16 with DirectX 12 Support

Subject: Graphics Cards | June 1, 2015 - 10:58 AM |
Tagged: evga, precisionx, dx12, DirectX 12

Another interesting bit of news surrounding Computex and the new GTX 980 Ti comes from EVGA and its PrecisionX software. This is easily our favorite tool for overclocking and GPU monitoring, so it's great to see the company continuing to push forward with features and capability. EVGA is the first to add full support for DX12 with an overlay.

precisionx16.jpg

What does that mean? It means that as DX12 applications find their way out to consumers and media, we will have a tool that can help measure performance and monitor GPU speeds and feeds via the PrecisionX overlay. Before this release, we were running in the dark with DX12 demos, so this is great news!

You can download the latest version over on EVGA's website!

Manufacturer: Phanteks

Introduction and First Impressions

Phanteks has expanded their Enthoo enclosure lineup with a new ATX version of the popular EVOLV case, and it offers a striking design and some unique features to help it stand out in the mid-tower market.

DSC_0825.jpg

Introduction

Phanteks first came to my attention with their large double tower cooler, the PH-TC14, which competes directly with the Noctua NH-D14 in the CPU air-cooling market. But like a lot of other cooling companies (Cooler Master, Corsair, etc.) Phanteks offers a full lineup of enclosures as well. Of these the Enthoo EVOLV, which until today has only been available in micro-ATX and mini-ITX versions, has been well-received and has an angular, minimalist look that I like quite a bit. Enter the EVOLV ATX.

With the larger size of this new EVOLV ATX there is not only room for a full-size motherboard, but much more room for components and cooling as well. The internal layout is very similar to the recently reviewed Fractal Design Define S enclosure, with no storage (5.25” or 3.5”) inside the front of the case, which gives the EVOLV ATX a totally open layout. The front is solid metal (though well vented), so we’ll see how this affects cooling, and it will be interesting to see how Phanteks has approached internal storage with the design as well. Let’s get started!

DSC_0910.jpg

Continue reading our review of the Phanteks Enthoo EVOLV ATX enclosure!!

Lenovo Tech World: Z41/Z51 and ideapad 100 announced

Subject: Mobile, Shows and Expos | May 27, 2015 - 12:27 AM |
Tagged: z51, z41, tech world, r9 m375, r9 m360, Lenovo, ideapad 100, amd

Today at their Tech World event in Beijing, Lenovo is taking the opportunity to announce some new mainstream notebook options.

z41_images_product_photography_z41_white_standard_02_photomaster_high_res.jpg

First off, we have simply the Lenovo Z41 and Z51. The 14-inch Z41 and 15.6-inch Z51 aim to refresh the previous Z40 and Z50 with Broadwell CPUs as well as new AMD discrete GPU options.

z51_images_product_photography_z51_white_hero_03_win_metro_high_res. METRO.jpg

Lenovo is using the Broadwell-U class of CPUs here, as you would find in ultrabooks, so don't expect a CPU powerhouse; but for productivity-style tasks these machines should hit the sweet spot of price vs. performance, with a starting price of $549 for the base Z51.

z51_images_product_photography_z51_white_product_tour_03_high_res.jpg

Paired with the new AMD R9 M360 (Z41) or M375 (Z51), these notebooks should also be able to play mainstream titles on the integrated 1080p display while coming in at just over $800.

The Lenovo Z51 and Z41 are available on Lenovo's site now.

Ideapad_100_14_Black-7-Outlook.jpg

Lenovo also announced a low-cost entry into the ideapad line utilizing Intel's BayTrail-M processors. The ideapad 100 is available in both 14-inch and 15-inch variants and seems to be aimed at the low-cost Chromebook market. 

Ideapad_100_14_Black-10-Metro.jpg

Starting at $249, the ideapad 100 seems like it will be a good option for users looking for a secondary machine for basic web browsing and office tasks.

Stay tuned for more from Lenovo's Tech World event this week!

Source: Lenovo

Computex 2015: MSI Announces Notebooks with Quad-Core Broadwell CPUs

Subject: Mobile | June 2, 2015 - 09:00 AM |
Tagged: notebook, msi, Intel Core i7, gaming notebook, computex 2015, computex, Broadwell

MSI has unveiled a refreshed notebook lineup featuring the new quad-core Intel Broadwell mobile processors.

image1.jpeg

Broadwell launched as a dual-core only option, which resulted in some high-performance notebooks opting to stay with Haswell CPUs. With the introduction of quad-core versions of the new Broadwell chips for mobile, MSI has jumped on the bandwagon to offer a few different options. Of the 20 new notebooks offered by MSI, 18 of them are powered by Intel Core i7 chips.

Intel’s 5th Generation Core i7 processor powers 18 MSI laptop models, including the GT80 Titan SLI, GT72 Dominator, GS70 Stealth, GS60 Ghost, GE72 Apache, GE62 Apache, GP72 Leopard, GP62 Leopard, and the newly announced PX60 Prestige.  Available immediately, all gaming notebook models come with an array of superior technologies, including Killer DoubleShot Pro for lag-less gaming, SteelSeries Gaming Keyboard for exceptional customization and feel, and more.

The flagship GT80 Titan SLI has these impressive specs, including an Intel Core i7-5950HQ processor:

GT80 Titan SLI

  • Screen: 18.4” 1920x1080 WideView Non-Reflection
  • CPU: Intel Core i7-5950HQ, 2.9 - 3.7 GHz
  • Chipset: HM87
  • Graphics: Dual GTX 980M SLI, 8GB GDDR5 VRAM each
  • Memory: 24GB (8GB x3) DDR3L 1600MHz (4 SoDIMM slots, max 32GB)
  • Storage: 256GB Super RAID (128GB M.2 SATA x2, RAID 0) + 1TB 7200 RPM HDD
  • Optical: BD Burner
  • LAN: Killer Gaming Network
  • Wireless: Killer N1525 Combo (2x2 ac), BT 4.1
  • Card Reader: SDXC
  • Video Output: HDMI 1.4, mDP v1.2 x2
  • MSRP: $3799.99

image3.png

The GT80 Titan SLI gaming notebook

1920x1080 with this model seems low, especially considering the obscene amount of VRAM (8GB per card on a laptop? Really?). Still, this notebook has excellent external monitor support with dual mini-DisplayPort outputs, though HDMI is limited to version 1.4.

MSI has also introduced a refreshed GT72 Dominator with NVIDIA G-Sync (covered here), and this new version also features USB 3.1. And for the more business-minded there is the premium PX60 Prestige, now refreshed with Broadwell Core i7 as well.

These refreshed notebook models will be “available immediately” from MSI’s retail partners.

Source: MSI