The First Custom R9 290X
It has been a crazy launch for the AMD Radeon R9 series of graphics cards. When we first reviewed the R9 290X and the R9 290, we came away very impressed with the GPU and the performance it provided, and both products earned Gold awards. The 290X represented a new class of single GPU performance while the R9 290 nearly matched it at a crazy $399 price tag.
But there were issues. Big, glaring issues. Clock speeds had a huge amount of variance depending on the game and we saw a GPU that was rated as "up to 1000 MHz" running at 899 MHz in Skyrim and 821 MHz in Bioshock Infinite. Those are not insignificant deltas in clock rate that nearly perfectly match deltas in performance. These speeds also changed based on the "hot" or "cold" status of the graphics card - had it warmed up and been active for 10 minutes prior to testing? If so, the performance was measurably lower than with a "cold" GPU that was just started.
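Those deltas work out to roughly 10% and 18% below the rated clock; a quick sketch of the arithmetic, using the figures above:

```python
# Observed R9 290X clock speeds versus the rated "up to" 1000 MHz ceiling
# (figures from the testing described above).
rated_mhz = 1000
observed = {"Skyrim": 899, "Bioshock Infinite": 821}

for game, mhz in observed.items():
    drop_pct = (rated_mhz - mhz) / rated_mhz * 100
    print(f"{game}: {mhz} MHz, {drop_pct:.1f}% below the rated clock")
```

Since GPU performance scales nearly linearly with clock speed at a fixed core count, those percentages map almost directly onto frame rate losses.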
That issue was not necessarily a deal killer; rather, it just made us rethink how we test GPUs. The much bigger deal was that many people were seeing lower performance on retail purchased cards than on the reference cards sent to press for reviews. In our November testing, the retail card we purchased, which used the exact same cooler as the reference model, ran 6.5% slower than we expected.
The obvious hope was that retail cards with custom PCBs and coolers from AMD partners would be released and somehow fix this whole dilemma. Today we see if that hope was justified.
A slightly smaller MARS
The NVIDIA GeForce GTX 760 was released in June of 2013. Based on the same GK104 GPU as the GTX 680, GTX 670 and GTX 770, the GTX 760 disabled a couple more of the clusters of processor cores to offer up impressive performance levels for a lower cost than we had seen previously. My review of the GTX 760 was very positive as NVIDIA had priced it aggressively against the competing products from AMD.
As for ASUS, they have a storied history with the MARS brand. Typically an over-built custom PCB with two of the highest end NVIDIA GPUs stapled together, the ASUS MARS cards have been limited edition products with a lot of cachet around them. The first MARS card was a dual GTX 285 product and the first card to offer 4GB of memory (though 2GB per GPU, of course). The MARS II took a pair of GTX 580 GPUs, pasted them on a HUGE card, and sold just 1000 units worldwide. It was heavy, expensive and fast; blazing fast. But at a price of $1200+ it wasn't on the radar of most PC gamers.
Interestingly, a MARS iteration for the GTX 680 never occurred, and why that is the case is still a matter of debate. Some point the finger at poor sales, while others think that NVIDIA restricted ASUS' engineers from being as creative as they needed to be.
Today's release of the ASUS ROG MARS 760 is a bit different - this is still a high end graphics card but it doesn't utilize the fastest single-GPU option on the market. Instead ASUS has gone with a more reasonable design that combines a pair of GTX 760 GK104 GPUs on a single PCB with a PCI Express bridge chip between them. The MARS 760 is significantly smaller and less power hungry than previous MARS cards but it is still able to pack a punch in the performance department as you'll soon see.
Introduction and Features
Be Quiet! has been a market leader in PC power supplies in Germany for seven years straight, and in 2013 they are continuing to expand their PC power supply lineup into North American markets. Earlier this year, we reviewed Be Quiet!’s top-of-the-line Dark Power Pro 10 850W PSU and the value-minded Pure Power L8 Series with very good results. Now we are going to take a look at the new Power Zone Series, specifically the Power Zone 1000W PSU. The Power Zone Series features a 135mm Be Quiet! SilentWings fan, is certified for 80Plus Bronze efficiency, comes with all-modular cables, and is backed by a 5-year warranty.
Be Quiet! is targeting the Power Zone Series towards discerning gamers and PC enthusiasts seeking high power, top performance and great features.
Here is what Be Quiet! has to say about their Power Zone Series: “The Power Zone Series provides the winning combination of superior performance, rock-solid stability, and advanced cooling. Whether you are assembling a high power PC or multi-GPU gaming system, your build will benefit from the Power Zone features. The Power Zone 1000W hits the sweet spot with granite stability, advanced cooling features, low noise and great value.”
Be Quiet! Power Zone 1000W PSU Key Features:
• 1000W of continuous power output @ 50°C
• Massive +12V rail design is ideal for overclocking
• Full cable management supports maximum build flexibility
• Quiet operation: 135mm SilentWings fan with 6-pole motor
• COOL*OFF feature runs fans for 3 minutes after system shutdown
• Connect up to three case fans for optimized system cooling
• 80Plus Bronze certification (up to 90% power conversion efficiency)
• Meets Energy Star 5.2 Guidelines
• Fulfills ErP 2013 Guidelines
• Supports Intel’s Deep Power Down C6 mode
• Sleeved cables for improved cooling and more attractive looks
• NVIDIA SLI Ready and AMD CrossFireX certified
• Up to six PCI-E connectors for multi-GPU support
• 5-Year warranty
• German product conception, design and quality control
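To see what the efficiency figure means at the wall, divide DC output by efficiency. A rough sketch; note that the 82% floor used below is the 80Plus Bronze minimum at 20% and 100% load, not a Be Quiet! figure:

```python
# AC wall draw = DC output / conversion efficiency.
def wall_draw_watts(dc_output_w, efficiency):
    return dc_output_w / efficiency

# Fully loaded Power Zone 1000W at the claimed 90% peak efficiency
# versus the 82% minimum 80Plus Bronze requires at 20%/100% load.
print(round(wall_draw_watts(1000, 0.90)))  # ~1111 W
print(round(wall_draw_watts(1000, 0.82)))  # ~1220 W
```

In other words, the difference between peak and worst-case Bronze efficiency is over 100 W of extra heat and wall draw at full load.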
A not-so-simple set of instructions
Valve released to the world the first beta of SteamOS, a Linux-based operating system built specifically for PC gaming, on Friday evening. We have spent quite a lot of time discussing and debating the merits of SteamOS, but this weekend we wanted to do an installation of the new OS on a system and see how it all worked.
Our full video tutorial of installing and configuring SteamOS
First up was selecting the hardware for the build. As is usually the case, we had a nearly-complete system sitting around that needed some tweaks. Here is a quick list of the hardware we used, with a discussion about WHY just below.
- Processor: Intel Core i5-4670K - $222
- Motherboard: EVGA Z87 Stinger Mini ITX Motherboard - $257
- Memory: Corsair Vengeance LP 8GB 1866 MHz (2 x 4GB) - $109
- Graphics Card: NVIDIA GeForce GTX TITAN 6GB - $999 or EVGA GeForce GTX 770 2GB SuperClocked - $349
- Storage: Samsung 840 EVO Series 250GB SSD - $168
- Case: EVGA Hadron Mini ITX Case - $189
- Power Supply: Included with Case
- Optical Drive: Slot loading DVD Burner - $36
- Peak Compute: 4,494 GFLOPS (TITAN), 3,213 GFLOPS (GTX 770)
- Total Price: $1947 (GTX TITAN), $1297 (GTX 770)
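The peak compute figures listed above are just CUDA cores x clock x 2 FLOPs per cycle (one fused multiply-add). A quick sanity check, assuming the reference boost clocks of 836 MHz for the TITAN and 1046 MHz for the GTX 770:

```python
# Peak single-precision throughput: CUDA cores x clock (MHz) x 2 FLOPs
# (one fused multiply-add) per cycle, converted to GFLOPS.
def peak_gflops(cuda_cores, clock_mhz):
    return cuda_cores * clock_mhz * 2 / 1000

print(f"GTX TITAN: {peak_gflops(2688, 836):.0f} GFLOPS")   # 4494
print(f"GTX 770:   {peak_gflops(1536, 1046):.0f} GFLOPS")  # 3213
```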
We definitely weren't targeting a low cost build with this system, but I think we did create a very powerful system to test SteamOS on. First up was the case, the new EVGA Hadron Mini ITX chassis. It's small, which is great for integration into your living room, yet can still hold a full power, full-size graphics card.
The motherboard we used was the EVGA Z87 Stinger Mini ITX - an offering that Morry just recently reviewed and recommended. Supporting the latest Intel Haswell processors, the Stinger includes strong overclocking options and a feature set that won't leave enthusiasts longing for a larger motherboard.
Introduction and Technical Specifications
Courtesy of EVGA
The EVGA Z87 Stinger is EVGA's Z87-based answer for the small form-factor crowd. Sporting the mini-ITX form factor, the board is feature-packed and offers support for the latest generation of Intel LGA1150-based processors. While its MSRP of $229.99 may seem large for its small stature, the Z87 Stinger's feature list makes it well worth the outlay.
The EVGA Z87 Stinger board features a 6-phase power delivery system and an impressive 10-layer PCB. Additionally, EVGA designed the CPU socket with a higher gold content and used solid state capacitors throughout the board to ensure problem-free operation under all operating conditions. The following features are integrated into the Z87 Stinger: 4 SATA 6Gb/s ports; 1 mPCIe/mSATA 6Gb/s port; 1 eSATA 6Gb/s port; an Intel GigE NIC; 1 PCI-Express x16 slot; on-board power, reset, and BIOS reset buttons; a BIOS Select switch; a 2-digit diagnostic LED display; and USB 2.0 and 3.0 port support.
Technical Specifications (taken from the EVGA website)
- Chipset: Intel Z87
- Memory: 2 x 240-pin DIMM sockets, maximum of 16GB DDR3 (2666MHz+ in dual channel configuration)
- Storage: 4 x Serial ATA 600MB/sec (4 internal) with support for RAID 0 and RAID 1
- Audio: 6-channel Creative Sound Core3D with Line-in, Line-out, and MIC connectors
- Networking: 1 x 10/100/1000 (Intel i217)
- Form Factor: mITX; Length: 6.7in (170.18mm), Width: 6.7in (170.18mm)
- Operating System Support: Windows 8 32/64-bit, Windows 7 32/64-bit, Windows Vista 32/64-bit, Windows XP 32/64-bit
- Warranty: 3 years (registration recommended)
Quality time with G-Sync
Readers of PC Perspective will already know quite a lot about NVIDIA's G-Sync technology. When it was first unveiled in October, we were at the event and were able to listen to NVIDIA executives, product designers and engineers discuss and elaborate on what it is, how it works and why it benefits gamers. This revolutionary new take on how displays and graphics cards talk to each other enables a new class of variable refresh rate monitors that offer the smoothness advantages of having V-Sync off, along with the tear-free images normally reserved for gamers who enable V-Sync.
NVIDIA's Prototype G-Sync Monitor
We were lucky enough to be at NVIDIA's Montreal tech day while John Carmack, Tim Sweeney and Johan Andersson were on stage discussing NVIDIA G-Sync among other topics. All three developers were incredibly excited about G-Sync and what it meant for gaming going forward.
Also on that day, I published a somewhat detailed editorial that dug into the background of V-Sync technology, why the 60 Hz refresh rate exists and why the system in place today is flawed. This led up to an explanation of how G-Sync works, including its integration via extended VBLANK signals, and detailed how NVIDIA is enabling the graphics card to retake control over the entire display pipeline.
In reality, if you want the best explanation of G-Sync, how it works and why it is a stand-out technology for PC gaming, you should take the time to watch and listen to our interview with NVIDIA's Tom Petersen, one of the primary inventors of G-Sync. In this video we go through quite a bit of technical explanation of how displays work today, and how the G-Sync technology changes gaming for the better. It is a 1+ hour long video, but I selfishly believe that it is the most concise and well put together collection of information about G-Sync for our readers.
The story today is more about extensive hands-on testing with the G-Sync prototype monitors. The displays that we received this week were modified versions of the 144Hz ASUS VG248QE gaming panels, the same ones that will in theory be upgradeable by end users as well sometime in the future. These monitors are TN panels, 1920x1080 and though they have incredibly high refresh rates, aren't usually regarded as the highest image quality displays on the market. However, the story about what you get with G-Sync is really more about stutter (or lack thereof), tearing (or lack thereof), and a better overall gaming experience for the user.
Introduction and Design
Contortionist PCs are a big deal these days as convertible models take the stage to help bridge the gap between notebook and tablet. But not everyone wants to drop a grand on a convertible, and not everyone wants a 12-inch notebook, either. Meanwhile, these same people may not wish to blow their cash on an underpowered (and far less capable) Chromebook or tablet. It’s for these folks that Lenovo has introduced the IdeaPad Flex 14 Ultrabook, which occupies a valuable middle ground between the extremes.
The Flex 14 looks an awful lot like a Yoga at first glance, with the same sort of acrobatic design and a thoroughly IdeaPad styling (Lenovo calls it a “dual-mode notebook”). The specs are also similar to that of the x86 Yoga, though with the larger size (and later launch), the Flex also manages to assemble a slightly more powerful configuration:
The biggest internal differences here are the i5-4200U CPU, which is a 1.6 GHz Haswell model with a TDP of 15 W and the ability to Turbo Boost (versus the Yoga 11S’ i5-3339Y, which is Ivy Bridge with a marginally lower TDP of 13 W and no Turbo Boost), the integrated graphics improvements that follow with the newer CPU, and a few more ports made possible by the larger chassis. Well, and the regression to a TN panel from the Yoga 11S’ much-appreciated IPS display, which is a bummer. Externally, your wallet will also appreciate a $250 drop in price: our model, as configured here, retails for just $749 (versus the $999 Yoga 11S we reviewed a few months back).
You can actually score a Flex 14 for as low as $429 (as of this writing), by the way, but if you’re after any sort of respectable configuration, that price quickly climbs above the $500 mark. Ours is the least expensive option currently available with both a solid-state drive and an i5 CPU.
Streaming games straight from NVIDIA
Over the weekend NVIDIA released a December update for the SHIELD Android mobile gaming device that included a very interesting, and somewhat understated, new feature: Beta support for NVIDIA GRID.
You have likely heard of GRID before; NVIDIA has been pushing it as part of the company's vision of bringing GPU computing to every facet and market. GRID is aimed at creating GPU-based server farms to enable mobile, streaming gaming for users across the country and around the world. While initially NVIDIA only talked about working with partners to launch streaming services based on GRID, they have obviously changed their tune slightly with this limited release.
If you own a SHIELD, and install the most recent platform update, you'll find a new icon in your NVIDIA SHIELD menu called GRID Beta. The first time you start this new application, it will attempt to measure your bandwidth and latency to offer up an opinion on how good your experience should be. NVIDIA is asking for at least 10 Mbps of sustained bandwidth, and wants round trip latency under 60 ms from your location to their servers.
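NVIDIA's stated requirements boil down to a simple gate on two measurements. A minimal sketch, where `grid_ready` is our own hypothetical helper, not part of the SHIELD software:

```python
# Hypothetical pre-flight check mirroring NVIDIA's stated GRID beta
# requirements: at least 10 Mbps of sustained bandwidth and under
# 60 ms round-trip latency to the GRID servers.
def grid_ready(bandwidth_mbps: float, latency_ms: float) -> bool:
    return bandwidth_mbps >= 10 and latency_ms < 60

print(grid_ready(25.0, 42.0))  # comfortable connection -> True
print(grid_ready(25.0, 95.0))  # too far from the servers -> False
```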
Currently, servers are ONLY located in Northern California, so the further out you are, the more likely you are to run into problems. However, doing some testing in Kentucky and Ohio resulted in very playable gaming scenarios, though we did run into some connection problems that might be load-based or latency-based.
After the network setup portion, users are shown 8 different games that they can try: Darksiders, Darksiders II, Street Fighter X Tekken, Street Fighter IV, Alan Wake, The Witcher 2, Red Faction: Armageddon and Trine 2. You can play them free of charge during this beta, though I think you can be sure they will be removed and erased at some point; just a reminder. Saves work well, and we were able to save and resume games of Darksiders II on GRID easily and quickly.
Starting up the game was fast, about on par with starting up a game on a local PC, though obviously the server is loading it in the background. Once the game is up and running, you are met with some button mapping information provided by NVIDIA for that particular game (great addition) and then you jump into the menus as if you were running it locally.
Introduction and Technical Specifications
Courtesy of Noctua
Noctua is a well known name in the enthusiast world for its high-end CPU cooler products. Their flagship cooler, the NH-D14, features a nickel-plated copper base with dual radiator towers actively cooled by low noise 120mm and 140mm fans. The NH-D14 can be used with all current Intel and AMD CPU offerings. The cooler was put to the test against other similarly classed air and water-based cooling systems to see just how well Noctua's design would hold up. The Noctua NH-D14 does not come cheap with a retail price at $99.99, but its performance and utility should make up for that initial outlay.
The Noctua NH-D14 cooler is everything you would expect in a premium CPU cooler - nickel-plating for corrosion resistance, twin-tower radiators for massive heat dissipation potential, and copper / aluminum hybrid design for optimal heat transfer from the CPU. Noctua designed the NH-D14 with a total of six heat pipes, laid out in a U-shaped design which passes through the copper base plate and terminates in the radiator towers. The bottom of the copper base plate leaves the factory ground flat and polished to a mirror-like finish, ensuring optimal interfacing with the CPU surface.
Noctua included the following components in with the base cooler: SecureFirm2™ multi-socket mounting kit, NF-P14 140mm fan, NF-P12 120mm fan, four fan mounting brackets, a dual-ended fan power cable, two single-fan low power cables, a case badge, and NT-H1 thermal compound.
PC Component Selections
It's that time of year again! The time when those of us lucky enough to have the means get to share the best in technology with our friends and family. You are already the family IT manager, so why not help spread the holiday cheer by picking up some items for them, and hey... maybe for you. :)
This year we are going to break up the guide into categories. We'll have a page dedicated to PC components, one for mobile devices like notebooks and tablets and one for PC accessories. Then, after those specific categories, we'll have an open ended collection of pages where each PC Perspective team member can throw in some wildcards.
Our Amazon code is: pcper04-20
Intel Core i7-4770K Haswell Processor
The Intel Core i7-4770K is likely the best deal in computing performance today, able to power just about any configuration of PC you can think of without breaking much of a sweat. You want to game? This part has you covered. You want to encode some video? The four cores and included HyperThreading support provide just about as much power as you could need. Yes, there are faster processors in the form of the Ivy Bridge-E and even 10+ core Xeon processors, but those are significantly more expensive. For a modest price of $299 you can get what is generally considered the "best" processor on the market.
Corsair Carbide Series Air 540 Case
Cases are generally considered a PC component that is more about buyer preference, but there are still fundamentals that make a case good and solid. The new Corsair Carbide Air 540 is unique in a lot of ways. The square-ish shape allows for a division of your power supply, hard drives and SSDs from the other motherboard-attached components. Even though the case is a bit shorter than others on the market, there is plenty of working room inside thanks to the Corsair dual-chamber setup, and it even includes a pair of high-performance Corsair AF140L fans for intake and exhaust. The side panel window is HUGE, allowing you to show off your goods, and nice touches like the rubber grommeted cable routing cut outs and dust filters make this one of the best mid-range cases available.
Another retail card reveals the results
Since the release of the new AMD Radeon R9 290X and R9 290 graphics cards, we have been very curious about the latest implementation of AMD's PowerTune technology and its scaling of clock frequency as a result of the thermal levels of each graphics card. In the first article covering this topic, I addressed the questions from AMD's point of view - is this really a "configurable" GPU as AMD claims or are there issues that need to be addressed by the company?
The biggest problems I found were in the highly variable clock speeds from game to game and from a "cold" GPU to a "hot" GPU. This affects the way many people in the industry test and benchmark graphics cards, as running a game for just a couple of minutes could result in average and reported frame rates that are much higher than what you see 10-20 minutes into gameplay. This was rarely something that had to be dealt with before (especially on AMD graphics cards), so it caught many off-guard.
Because of the new PowerTune technology, as I have discussed several times before, clock speeds are starting off quite high on the R9 290X (at or near the 1000 MHz quoted speed) and then slowly drifting down over time.
Another wrinkle occurred when Tom's Hardware reported that retail graphics cards they had seen were showing markedly lower performance than the reference samples sent to reviewers. As a result, AMD quickly released a new driver that attempted to address the problem by normalizing to fan speeds (RPM) rather than fan voltage (percentage). The result was consistent fan speeds on different cards and thus much closer performance.
However, with all that being said, I was still testing retail AMD Radeon R9 290X and R9 290 cards that were PURCHASED rather than sampled, to keep tabs on the situation.
Introduction, Specifications and Packaging
If you're into the laptop storage upgrade scene, you hear the same sort of arguments all the time. "Do I go with a HDD for a large capacity and low cost/GB, but suffer on performance?" "I want an SSD, but can't afford the capacity I need!" The ideal for this scenario is to combine both - go with a small capacity SSD for your operating system and apps, while going with a larger HDD for bulk storage at a lower cost/GB. The catch here is that most mobile platforms only come with a single 2.5" 9.5mm storage bay, and you just can't physically fit a full SSD and a full HDD into that space, can you? Well, today Western Digital has answered that challenge with the Black2 Dual Drive:
Yup, we're not kidding. This is a 120GB SSD *and* a 1TB HDD in a single package. Not a hybrid. Two drives, and it's nothing short of a work of art.
The 7 Year Console Refresh
The consoles are coming! The consoles are coming! Ok, that is not necessarily true. One is already here and the second essentially is too. This of course brings up the great debate between PCs and consoles. The past has been interesting when it comes to console gaming, as often the consoles would be around a year ahead of PCs in terms of gaming power and prowess. This is no longer the case with this generation of consoles. Cutting edge is now considered mainstream when it comes to processing and graphics. The real incentive to buy this generation of consoles is a lot harder to pin down as compared to years past.
The PS4 retails for $399 US and the upcoming Xbox One is $499. The PS4’s price includes a single controller, while the Xbox’s package includes not just a controller, but also the next generation Kinect device. These prices would be comparable to some low end PCs which include keyboard, mouse, and a monitor that could be purchased from large brick and mortar stores like Walmart and Best Buy. Happily for most of us, we can build our machines to our own specifications and budgets.
As a directive from on high (the boss), we were given the task of building our own low-end gaming and productivity machines at a price as close to that of the consoles and explaining which solution would be superior at the price points given. The goal was to get as close to $500 as possible and still have a machine that would be able to play most recent games at reasonable resolutions and quality levels.
Does downloading make a difference?
I posted a story earlier this week that looked at the performance of the new PS4 when used with three different 2.5-in storage options: the stock 500GB hard drive, a 1TB hybrid SSHD and a 240GB SSD. The results were fairly interesting (and got a good bit of attention) but some readers wanted more data. In particular, many asked how things might change if you went the full digital route and purchased games straight from Sony's PlayStation Network. I will also compare boot times for each of the tested storage devices.
You should definitely check out the previous article if you missed it. It not only goes through the performance comparison but also details how to change the hard drive on the PS4 from the physical procedure to the software steps necessary. The article also details the options we selected for our benchmarking.
- HGST 500GB 5400 RPM HDD - $50 - $0.10/GB
- Seagate 1TB Hybrid SSHD - $122 - $0.12/GB
- Corsair 240GB Force GS SSD - $189 - $0.78/GB
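Those cost-per-GB figures are just price divided by capacity; a quick check:

```python
# Cost per gigabyte for each tested PS4 storage option (prices from above).
drives = [
    ("HGST 500GB 5400 RPM HDD", 50, 500),
    ("Seagate 1TB Hybrid SSHD", 122, 1000),
    ("Corsair 240GB Force GS SSD", 189, 240),
]

for name, price_usd, capacity_gb in drives:
    cost = price_usd / capacity_gb
    # note: the SSD works out to exactly $0.7875/GB, listed above as $0.78
    print(f"{name}: ~${cost:.2f}/GB")
```

The SSD still costs roughly eight times as much per gigabyte as the stock hard drive, which is the trade-off the rest of the article weighs against its load-time gains.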
Today I purchased a copy of Assassin's Creed IV from the PSN store (you're welcome Ubisoft) and got to testing. The process was the same: start the game then load the first save spot. Again, each test was run three times and the averages were reported. The PS4 was restarted between each run.
The top section of results is the same as was presented earlier - average load times for AC IV when the game is installed from the Blu-ray. The second set is new and includes average load times for AC IV after installation from the PlayStation Network; no disc was in the drive during testing.
Load time improvements
On Friday Sony released the PlayStation 4 onto the world. The first new console launch in 7 years, the PS4 has a lot to live up to, but our story today isn't going to attempt to weigh the value of the hardware or software ecosystem. Instead, after our PS4 teardown video from last week, we got quite a few requests for information on storage performance with the PS4 and what replacement hardware might offer gamers.
Hard Drive Replacement Process
Changing the hard drive in your PlayStation 4 is quite simple, a continuation of Sony's policy with the PS3.
Installation starts with the one semi-transparent panel on the top of the unit, to the left of the light bar. Obviously make sure your PS4 is completely turned off and unplugged.
Simply slide it to the outside of the chassis and wiggle it up to release. There are no screws or anything to deal with yet.
Once inside, you'll find a screw with the PS4 shape logos on it; that is the screw you need to remove to pull out the hard drive cage.
Introduction and Features
Today Western Digital launched an important addition to their Personal Cloud Storage NAS family - the My Cloud EX4:
The My Cloud EX4 is Western Digital's answer to the increased demand for larger personal storage devices. When folks look for places to consolidate all of their bulk files, media, system backups, etc, they tend to extend past what is possible with a single hard drive. Here is Western Digital's projection on where personal storage is headed:
Where the My Cloud was a single drive solution, the My Cloud EX4 extends that capability to span up to four 3.5" drives. When it comes to devices that span across several drives, the number 4 is a bit of a sweet spot, as it enables several RAID configurations:
Everything but online capacity expansion (where the user can swap drives one at a time to a larger capacity) is supported. While WD has stated that feature will be available in a future update, I find it a bit risky to intentionally and repeatedly fail an array by pulling drives and forcing rebuilds. It just makes more sense to back up the data and re-create a fresh array with the new larger drives installed.
Ok, so we've got the groundwork down with a 4-bay NAS device. What remains to be seen is how Western Digital has implemented the feature set. There is a lot to get through here, so let's get to it.
NVIDIA Tegra Note Program
Clearly, NVIDIA’s Tegra line has not been as successful as the company had hoped and expected. The move for the discrete GPU giant into the highly competitive world of the tablet and phone SoCs has been slower than expected, and littered with roadblocks that were either unexpected or that NVIDIA thought would be much easier to overcome.
The truth is that this was always a long play for the company; success was never going to be overnight and anyone that thought that was likely or possible was deluded. Part of it has to do with the development cycle of the ARM ecosystem. NVIDIA is used to a rather quick development, production, marketing and sales pattern thanks to its time in high performance GPUs, but the SoC world is quite different. By the time a device based on a Tegra chip is found in the retail channel it had to go through an OEM development cycle, NVIDIA SoC development cycle and even an ARM Cortex CPU development cycle. The result is an extended time frame from initial product announcement to retail availability.
Partly due to this, and partly due to limited design wins in the mobile markets, NVIDIA has started to develop internally designed end-user devices that utilize its Tegra SoC processors. This has the benefit of being much faster to market - while most SoC vendors develop reference platforms during the normal course of business, NVIDIA is essentially going to perfect and productize them.
More Details from Lisa Su
The executives at AMD like to break their own NDAs. Then again, they are the ones typically setting these NDA dates, so it isn’t a big deal. It is no secret that Kaveri has been in the pipeline for some time. We knew a lot of the basic details of the product, but there were certainly things that were missing. Lisa Su went up onstage and shared a few new details with us.
Kaveri will be made up of 4 “Steamroller” cores, which are enhanced versions of the previous Bulldozer/Trinity/Vishera families of products. Nearly everything in the processor is doubled. It now has dual decode, more cache, larger TLBs, and a host of other smaller features that all add up to greater single thread performance and better multi-threaded handling and performance. Integer performance will be improved, and the FPU/MMX/SSE unit now features 2 x 128 bit FMAC units which can “fuse” and support AVX 256.
However, there was no mention of the fabled 6 core Kaveri. At this time, it is unlikely that particular product will be launched anytime soon.
Introduction and Design
With few exceptions, it’s generally been taken for granted that gaming notebooks are going to be hefty devices. Portability is rarely the focus, with weight and battery life alike usually sacrificed in the interest of sheer power. But the MSI GE40 2OC—the lightest 14-inch gaming notebook currently available—seeks a compromise while retaining the gaming prowess. Trending instead toward the form factor of a large Ultrabook, the GE40 is both stylish and manageable (and perhaps affordable at around $1,300)—but can its muscle withstand the reduction in casing real estate?
While it can’t hang with the best of the 15-inch and 17-inch crowd, in context with its 14-inch peers, the GE40’s spec sheet hardly reads like it’s been the subject of any sort of game-changing handicap:
One of the most popular CPUs for Haswell gaming notebooks has been the 2.4 GHz (3.4 GHz Turbo) i7-4700MQ. But the i7-4702MQ in the GE40 2OC is nearly as powerful (managing 2.2 GHz and 3.2 GHz in those same areas respectively), and it features a TDP that’s 10 W lower at just 37 W. That’s ideal for notebooks such as the GE40, which seek to provide a thinner case in conjunction with uncompromising performance. Meanwhile, the NVIDIA GTX 760M is no slouch, even if it isn’t on the same level as the 770s and 780s that we’ve been seeing in some 15.6-inch and 17.3-inch gaming beasts.
Elsewhere, it’s business as usual, with 8 GB of RAM and a 120 GB SSD rounding out the major bullet points. Nearly everything here is on par with the best of rival 14-inch gaming models with the exception of the 900p screen resolution (which is bested by some notebooks, such as Dell’s Alienware 14 and its 1080p panel).
EVGA Brings Custom GTX 780 Ti Early
Reference cards for new graphics card releases are very important for a number of reasons. Most importantly, these are the cards presented to the media and reviewers that judge the value and performance of these cards out of the gate. These various articles are generally used by readers and enthusiasts to make purchasing decisions, and if first impressions are not good, it can spell trouble. Also, reference cards tend to be the first cards sold in the market (see the recent Radeon R9 290/290X launch) and early adopters get the same technology in their hands; again the impressions reference cards leave will live in forums for eternity.
All that being said, retail cards are where partners can differentiate and keep the various GPUs relevant for some time to come. EVGA is probably the most well known NVIDIA partner and is clearly their biggest outlet for sales. The ACX cooler is one we saw popularized with the first GTX 700-series cards, and the company has quickly adapted it to the GTX 780 Ti, released by NVIDIA just last week.
I would normally have a full review for you as soon as possible, but thanks to a couple of upcoming trips that will keep me away from the GPU test bed, it will take a little while longer. However, I thought a quick preview was in order to show off the specifications and performance of the EVGA GTX 780 Ti ACX.
As expected, the EVGA ACX design of the GTX 780 Ti is overclocked. While the reference card runs at a base clock of 875 MHz and a typical boost clock of 928 MHz, this retail model has a base clock of 1006 MHz and a boost clock of 1072 MHz. This means that all 2,880 CUDA cores are going to run somewhere around 15% faster on the EVGA ACX model than the reference GTX 780 Ti SKUs.
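The ~15% figure falls straight out of the clock ratios, using the speeds quoted above:

```python
# EVGA GTX 780 Ti ACX clocks versus the NVIDIA reference card (MHz).
reference = {"base": 875, "boost": 928}
evga_acx = {"base": 1006, "boost": 1072}

for kind in reference:
    gain_pct = (evga_acx[kind] / reference[kind] - 1) * 100
    print(f"{kind} clock: +{gain_pct:.1f}%")
```

Both the base and boost clocks come out roughly 15% ahead of reference, which is where the expected shader throughput gain comes from.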
We should note that though the cooler is custom built by EVGA, the PCB design of this GTX 780 Ti card remains the same as the reference models.