Very Minor Changes
November 14th, 2011 - that is the date Intel introduced the LGA 2011 socket and the Sandy Bridge-E processor. Intel continued their pattern of adapting their mainstream architecture, Sandy Bridge at the time, into a higher-performance (and higher-priced) enthusiast class. The new socket set these components apart in their own category for workstation users and others who demand top performance. Today Intel officially unveils the Ivy Bridge-E platform with essentially the same mindset.
The top end offering under the IVB-E name is the Core i7-4960X, a six-core, HyperThreaded processor with Turbo Boost technology and up to 15MB of L3 cache. Sound familiar? It should. There is really very little that is different about the new 4960X when compared to the Sandy Bridge-E Core i7-3960X released in 2011. In fact, the new processors use the exact same socket and will work on the same X79 motherboards already on the market. (Depending, of course, on whether your motherboard manufacturer has updated the UEFI firmware accordingly.)
The Ivy Bridge-E Platform
Even though the platform and features are nearly identical between Sandy Bridge-E and Ivy Bridge-E, some readers might need a refresher, or may never have really investigated Socket 2011 products before today. I'll step through the major building blocks of the new Core i7-4960X just in case.
A New TriFrozr Cooler
Graphics cards are by far the most interesting topic we cover at PC Perspective. Between the battles of NVIDIA and AMD, as well as the competition between board partners like EVGA, ASUS, MSI and Galaxy, there is very rarely a moment when we don't have a different GPU product of some kind on an active test bed. Both NVIDIA and AMD release reference cards (for the most part) with each and every new product launch, and it then takes some time for board partners to really put their own stamp on the designs. Other than the figurative stamp that is the sticker on the fan.
One of the companies that has recently become well known for very custom, non-reference graphics card designs is MSI, and the pinnacle of the company's engineering falls under the Lightning brand. As far back as the MSI GTX 260 Lightning and as recently as the MSI HD 7970 Lightning, these cards have combined unique cooling, a custom power design and a good amount of over-engineering to produce cards that have few rivals.
Today we are looking at the brand new MSI GeForce GTX 780 Lightning, a complete revamp of the GTX 780 that was released in May. Based on the same GK110 GPU as the GTX Titan card, with two fewer SMX units, the GTX 780 is easily the second fastest single-GPU card on the market. MSI is hoping to make enthusiasts even more excited about the card with the Lightning design, which brings a brand new TriFrozr cooler, an impressive power design and overclocking capabilities that basic users and LN2 junkies alike can take advantage of. Just what DO you get for $750 these days?
Introduction and externals
Razer maintains a distinct sense of style across their product line. Over the past decade and a half, Razer has carved out a spot in the peripherals market, catering to competitive gamers as well as developing wholly novel products for gaming. Razer's catalog includes standard peripherals and more arcane things such as mice with telephone-style keypads geared toward MMORPG players, as well as motion-sensing controllers that employ magnetic fields to detect controller position.
The Razer BlackWidow Ultimate Stealth 2013 Edition comes out of the box ready for use, with no additional software or assembly required. The keyboard uses a standard layout with five macro keys in a column on the left of the board. Rather than dedicated media buttons, media and keyboard-specific functions are accessed by pressing a combination of a function key, located to the right of the right Alt key, and the function keys on the top row.
Headphone and microphone jacks are present on the side of the keyboard.
A unique enthusiast chassis
The Corsair Carbide Air 540 is a unique case. It fits a full-size ATX motherboard and up to four dual-slot graphics cards, but it's shorter than you might expect thanks to a design choice that splits the active components from the mostly passive ones. The result is a case that is more square than rectangular, yet it still combines the charm of Corsair designs with the performance enthusiasts want.
For the best view of the case check out the video review below and then continue on for some additional photos and commentary.
Divided into two side-by-side compartments, the Air 540 has a unique front style, merging a mesh look on the left with matte black paint on the right.
The right hand side includes two 5.25-in optical drive bays turned 90 degrees to fit in the smaller compartment. Honestly, I am looking forward to the day a case vendor is gutsy enough to leave optical bays off an enthusiast design completely, as I just think they take away from the overall appeal and looks.
Introduction and Design
It seems like only yesterday (okay, last month) that we were testing the IdeaPad Yoga 11, which was certainly an interesting device. That’s primarily because of what it represents: namely, the slow merging of the tablet and notebook markets. You’ve probably heard people proclaiming the death of the PC as we know it. Not so fast—while it’s true that tablets have eaten into the sales of what were previously low-powered notebooks and now-extinct netbooks, there is still no way to replace the utility of a physical keyboard and the sensibility of a mouse cursor. Touch-centric devices are hard to beat when entertainment and education are the focus of a purchase, but as long as productivity matters, we aren’t likely to see traditional means of input and a range of connectivity options disappear anytime soon.
The IdeaPad Yoga 11 leaned so heavily in the direction of tablet design that it arguably was more tablet than notebook. That is, it featured a tablet-grade SoC (the NVIDIA Tegra 3) as opposed to a standard Intel or AMD CPU, an 11” display, and phenomenal battery life that can only be compared to the likes of other ARM-based tablets. But, of course, with those allegiances come necessary concessions, not the least of which is the inability to run x86 applications and the consequent half-baked experiment that is Windows RT.
Fortunately, there’s always room for compromise, and for those of us searching for something closer to a notebook than the original Yoga 11, we’re now afforded the option of the 11S. While nearly identical in form factor, the $999 (as configured) Yoga 11S adopts a standard x86 platform with Intel ULV CPUs, which allows it to run full-blown Windows 8. That positions it squarely in between the larger x86 Yoga 13 and the ARM-based Yoga 11, making it an ideal candidate for someone hoping for the best of both worlds. But can it survive the transition, or do its compromises outstrip its gains?
Our Yoga 11S came equipped with a fairly standard configuration:
Unless you’re comparing to the Yoga 11’s specs, not much about this stands out. The Core i5-3339Y is the first thing that jumps out at you; replacing the NVIDIA Tegra 3 ARM-based SoC of the original Yoga 11, it’s a much more powerful chip with a 13W TDP and (thanks to its x86 architecture) the ability to run Windows 8 and standard Windows applications. Next on the list is the included 8 GB of DDR3 RAM—versus just 2 GB on the Yoga 11. Finally, there’s USB 3.0 and a much larger SSD (256 GB vs. 64 GB)—all valuable additions. One thing that hasn’t changed, meanwhile, is the battery size. Surely you’re wondering how this will affect the longevity of the notebook under typical usage. Patience; we’ll get to that in a bit! First, let’s talk about the general design of the notebook.
Introduction and Features
Antec has one of the largest selections of PC power supplies on the market today, and their new High Current Pro Platinum series includes three models: the HCP-850 Platinum, HCP-1000 Platinum and the HCP-1300 Platinum. The series is part of a second wave of new power supplies from Antec that replaces three older lines: the TruePower Quattro, the original High Current Pro (80 Plus Gold), and Antec’s Signature series. The High Current Pro Platinum series will be the new top class for maximum efficiency and performance within Antec’s range of modular power supplies.
The High Current Pro Platinum series is based on a design co-developed with Antec’s partner Delta Electronics, and it combines several technological advancements and features to provide top performance and be the best power supply possible. All three High Current Pro Platinum power supplies are 80Plus Platinum certified, come with fully modular cables, and have been tested and certified by both NVIDIA for SLI and AMD for CrossFire systems. The HCP Platinum PSUs also feature Antec’s new OC Link technology, which allows two of them to work in tandem to power even the most demanding systems.
Here is what Antec has to say about their new High Current Pro Platinum PSUs:
“Antec's High Current Pro Platinum series is the pinnacle of power supplies. High Current Pro Platinum is fully modular with a revolutionary 20+8-pin MBU socket for the needs of tomorrow. By using a PSU that is 80 PLUS® PLATINUM & ErP Lot 6: 2013 certified, operating up to 94% efficient, you can reduce your electricity bill by up to 25% when compared to many other power supplies. HCP Platinum's innovative 16-pin sockets create a new level of flexibility by doubling the modular connectivity, supporting two different 8-pins connectors and even future connectors of 10, 12, 14 or 16-pins. Backed by a 7 year warranty and lifetime global 24/7 support, the High Current Pro Platinum series embodies everything a power supply can accomplish today.”
Antec High Current Pro Platinum Series PSU Key Features:
• 850W/1000W/1300W continuous power output at 50°C
• 80Plus Platinum Certified (up to 94% efficient)
• Four High Current +12V rails with high maximum load
• 100% +12V output for maximum CPU and GPU support
• OC Link allows two HCP Platinum PSUs to work in tandem
• 16-Pin Socket for increased modular connectivity and flexibility
• 28(20+8)-pin Motherboard socket for future MBU support
• Quiet 135mm double ball bearing fan
• Thermal Manager – advanced low voltage fan controller
• All Japanese brand, heavy duty capacitors
• PhaseWave Design server-class, full-bridge LLC topology
• NVIDIA SLI-Ready and AMD Crossfire ready
• Intel Haswell & C7 ready
• Active PFC with Universal AC line input
• ErP Lot 6:2013 Compliant
• Fully modular cables
• Protection: OCP, OVP, UVP, SCP, OPP, OTP, SIP, NLO and BOP
• Antec AQ7 7-year warranty and lifetime global 24/7 support
Plus one GTX 670...
Brand new GPU architectures typically arrive in reference designs for power, PCB layout, and cooling. Once manufacturers get a chance to put out their own designs, interesting things happen. The top end products are usually the ones that get the specialized treatment first, because they typically have larger margins to work with. Design choices made there eventually trickle down to lower end cards, typically at a price point $20 to $30 above a reference design. Companies such as MSI have made this their bread and butter, with the Lightning series on top, the Hawk line handling the midrange, and then the hopped-up reference designs with better cooling under the Twin Frozr moniker.
ASUS has been working on their own custom designs for years and years, but it honestly was not until the DirectCU series debuted that we had a well-defined product lineup pushing high end functionality across the entire range of products from top to bottom. Certainly they had custom and unique designs before, but things really seemed to crystallize with DirectCU. I guess that is the power of a good marketing tool as well. DirectCU is a well known brand owned by ASUS, and users typically know what to expect when looking at a DirectCU product.
Courtesy of XSPC
The Razor GTX680 water block was among the first in XSPC's full cover line of blocks. The previous generation of XSPC water blocks offered cooling for the GPU as well as the memory and on-board VRMs, but did not provide the protection that a full card-sized block gives the sensitive components integrated into the card's PCB. At an MSRP of $99.99, the Razor GTX680 water block is a sound investment.
Courtesy of XSPC
The Razor GTX680 block comes with a total of seven G1/4" ports - four on the inlet side (left) and three on the outlet side (right). XSPC included the following components with the block: XSPC thermal compound, dual blue LEDs, five steel port caps, paper washers and mounting screws, and TIM (thermal interface material) for use with the on-board memory and VRM chips.
Overview and Technical Specifications
What is the Second Look Review?
The Second Look Review is our way of taking a more in-depth look at boards here at PC Perspective. In the initial review, we try to give you an overview of the board, forming an impression of how it will perform in your system. The initial review details the board's features and layout, BIOS features, and stock performance. The Second Look review attempts to pull back the covers, exposing a more complete picture of the board's performance limits.
In this review, we cover subsystem testing including drive, networking, and audio functionality, as well as overclocking. Additionally, the board's components and heat sinks are stripped down to uncover lower level board functionality and design. By the end of the Second Look review, a complete picture of board performance coalesces, giving us an adequate basis for award determination. That determination takes into account the board's performance over the course of both the initial and Second Look reviews.
Be sure to take a look at our first review of this product for a better overall view of the layout and features as well.
Overview and Feature Recap
Courtesy of GIGABYTE
- Supports 4th Generation Intel® Core™ processors
- GIGABYTE Ultra Durable™ 5 Plus Technology
- All IR Digital Power design
- GIGABYTE UEFI DualBIOS™
- Exclusive GIGABYTE OC Features
- Unique OC Touch Feature
- Unique OC Ignition Feature
- Unique OC Brace Feature
- Gold plated DDR/PCIe Slots and power connectors
- 4-way Graphics Support
- Durable black solid capacitors
- GIGABYTE On/Off Charge™ 2 for USB devices
- Dual Intel® LAN with high ESD Protection
- Extreme Heat sink design with 9 system fan connectors
- Realtek ALC898 with High Quality 110dB SNR HD audio
- GIGABYTE Bluetooth 4.0 and Wi-Fi Card
Courtesy of GIGABYTE
Haswell and Kepler
With the release of Intel's Haswell core processors and the updated graphics card lineup from NVIDIA, Digital Storm has updated many of their custom PC lines to include both. A little while ago the company sent along a pre-built Ode system that includes some impressive hardware, like an overclocked Core i7-4770K and a GTX 780, along with a Corsair SSD and more. Even though the design uses fully off-the-shelf parts, the build quality is impressive and will interest many users who want the jump-start of a ready-made rig.
Our article today (and embedded video) will give you a quick overview of the hardware, the build and the performance that you can expect for this $2500 PC.
- Digital Storm Ode Custom
- Intel Core i7-4770K (OC to 4.4 GHz)
- ASUS Z87-C Motherboard
- Corsair H100 Water Cooler
- 16GB (2 x 8GB) Kingston HyperX DDR3-1866
- NVIDIA GeForce GTX 780 3GB Graphics Card
- 120GB Corsair Neutron SSD
- 1TB Western Digital 7200 RPM HDD
- Corsair HX1050 Power Supply
- Corsair Graphite 600T White Case
Current pricing on this build is $2577 from Digital Storm's website, and while that is definitely higher than buying the same components outright, the difference shouldn't be enough to scare you off. More on that later.
The Ode from Digital Storm is built around the Corsair 600T chassis, an older design that still stands up well in terms of looks and performance. The only drawback is that it does not have an internal USB 3.0 header, and thus still uses an external cable that plugs into the back of the motherboard. If you want to see the video we did of this case back in 2010, check the Wayback Machine!
A white color scheme really makes this system stand out, and the window on the side panel will let everyone gawk at the components included inside. With plenty of room for fans and radiators, and good intake filter support throughout, the 600T remains one of our favorite chassis at PC Perspective.
500GB on the go
Corsair seems to have its fingers in just about everything these days, so why not mobile storage, right? The Voyager Air is a multi-function device that Corsair describes as a "portable wireless drive, home network drive, USB drive, and wireless hub." This battery-powered device is meant to act as a mobile hard drive for users who need more storage on the go, serving PCs and Macs as well as iOS and Android devices.
The Voyager Air can also act as a basic home NAS device, with a Gigabit Ethernet connection on board for all the computers on your local network. And if you happen to have DLNA-ready Blu-ray players or TVs nearby, they can access the video and audio stored on the Voyager Air as well.
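DLNA discovery is what lets those players spot a drive like this on the network in the first place. As a rough illustration of that first step, here is a minimal SSDP search sketch in Python; the multicast address and M-SEARCH format are standard UPnP, but treat this as an assumption-laden sketch of the mechanism, not anything Corsair ships.

import socket

# Standard UPnP/SSDP discovery: multicast an M-SEARCH request and
# listen for unicast replies from media servers on the LAN.
MSEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    'MAN: "ssdp:discover"',
    "MX: 2",
    "ST: urn:schemas-upnp-org:device:MediaServer:1",
    "", "",
])

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3)
sock.sendto(MSEARCH.encode(), ("239.255.255.250", 1900))
try:
    while True:
        data, addr = sock.recvfrom(2048)
        # Print each responder's IP and the HTTP status line of its reply.
        print(addr[0], data.split(b"\r\n")[0].decode())
except socket.timeout:
    pass

Any DLNA media server on the network should answer, which is how a TV or Blu-ray player builds its list of available libraries.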
Available in either red or black, and in 500GB or 1TB capacities, the Voyager Air is slim and sleek, meant to be seen rather than hidden in a closet.
The front holds the power switch and WiFi on/off switch, as well as back-lit icons indicating power, battery life and connection status.
The Densest 2.5 Hours Imaginable
Introduction and Technical Specifications
Courtesy of Cooler Master
The HAF XB mid tower case is the newest member of the Cooler Master HAF line. Touted as a LAN box, this cube-shaped case has both the looks and the features to appeal to any enthusiast. We decided to put the HAF XB on our test bench to validate these claims. At a base price of $99.99, the HAF XB is a bargain for the features you are getting.
Courtesy of Cooler Master
Courtesy of Cooler Master
Cooler Master designed the HAF XB with a scratch-resistant, flat-black coating applied to all surfaces. Both side panels have integrated hand-holds for easy lifting and transport to your event, and the front and top panels contain non-impeding mesh grilles allowing for optimal airflow across your vital system components. Integrated into the case's front panel are power and reset buttons, power indicator LEDs, audio input and output ports, USB 3.0 device ports, two 5.25" device bays, and two hot-swappable hard drive bays.
Frame Pacing for CrossFire
When the Radeon HD 7990 launched in April of this year, we had some not-so-great things to say about it. The HD 7990 depends on CrossFire technology to function, and because we had found quite a few problems with AMD's CrossFire technology over the preceding months of testing with our Frame Rating methodology, the HD 7990 "had a hard time justifying its $1000 price tag." Right at launch, AMD gave us a taste of a new driver that they were hoping would fix the frame pacing and frame time variance issues seen in CrossFire, and it looked positive. The problem was that the driver wouldn't be available until summer.
As I said then: "But until that driver is perfected, is bug free and is presented to buyers as a made-for-primetime solution, I just cannot recommend an investment this large on the Radeon HD 7990."
Today could be a very big day for AMD - the release of the promised driver update that enables frame pacing on AMD 7000-series CrossFire configurations, including the Radeon HD 7990 graphics card with its pair of Tahiti GPUs.
It's not perfect yet, and there are some things to keep an eye on. For example, this fix does not address Eyefinity configurations, which include multi-panel setups and the new 4K 60 Hz displays that require a tiled display configuration. Also, we found some issues with CrossFire setups using more than two GPUs, which we'll address on a later page.
New Driver Details
Starting with 13.8 and moving forward, AMD plans to have the frame pacing fix integrated into all future drivers. The software team has implemented a software-based frame pacing algorithm that monitors the time it takes each GPU to render a frame and how long each frame is displayed on the screen, and inserts delays into the present calls when necessary to prevent very tightly timed frame renders. This balances or "paces" the frame output to the screen without lowering the overall frame rate. The driver monitors this constantly in real-time, and minor changes are made on a regular basis to keep the GPUs in check.
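AMD hasn't published the driver code, but the approach as described is straightforward to sketch. Below is a minimal, hypothetical Python illustration of the core idea: track a running average of the frame-to-frame present interval and delay any present call that would otherwise land much sooner than that cadence. The class name and smoothing constant are my assumptions for illustration, not AMD's implementation.

import time

class FramePacer:
    """Toy sketch of software frame pacing: delay presents that would
    arrive too soon after the previous frame, smoothing the cadence
    without lowering the average frame rate."""

    def __init__(self, smoothing=0.9):
        self.avg_interval = None   # running estimate of frame time (seconds)
        self.last_present = None   # timestamp of the previous present
        self.smoothing = smoothing

    def present(self):
        now = time.perf_counter()
        if self.last_present is not None:
            interval = now - self.last_present
            # Update the running average of frame-to-frame time.
            if self.avg_interval is None:
                self.avg_interval = interval
            else:
                self.avg_interval = (self.smoothing * self.avg_interval
                                     + (1 - self.smoothing) * interval)
            # If this frame arrived much earlier than the average cadence,
            # insert a delay so frames reach the screen evenly spaced.
            deficit = self.avg_interval - interval
            if deficit > 0:
                time.sleep(deficit)
                now = time.perf_counter()
        self.last_present = now
        # ...hand the frame off to the display here...

The key property is that delays are only ever added to frames that would have arrived abnormally early, which is why the average frame rate is preserved while frame-to-frame variance shrinks.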
As you would expect, this algorithm is completely game engine independent, and the games themselves should be oblivious to all that is going on (other than the feedback from present calls, etc.).
This fix is generic, meaning it is not tied to any specific game and doesn't require profiles the way CrossFire sometimes does. The current implementation works with DX10 and DX11 based titles only, with DX9 support coming in a later release. AMD claims this was simply a development time issue, and since most modern GPU-bound titles are DX10/11 based, they focused on that area first. In phase 2 of the frame pacing implementation AMD will add DX9 and OpenGL support. AMD wouldn't give me a timeline, though, so we'll have to see how much internal pressure AMD maintains to get the job done.
It has come to my attention that you are planning on producing and selling a device to be called “NVIDIA SHIELD.” It should be noted that even though it shares the same name, this device has no matching attributes of the super-hero comic-based security agency. Please adjust.
When SHIELD was previewed to the world at CES in January of this year, there were a hundred questions about the device. What would it cost? Would the build quality stand up to expectations? Would the Android operating system hold up as a dedicated gaming platform? After months of waiting, a SHIELD unit finally arrived in our offices in early July, giving us plenty of time (I thought) to really get a feel for the device and its strengths and weaknesses. As it turned out, though, it still seemed like an inadequate amount of time to really gauge this product. But I am going to take a stab at it, feature by feature.
NVIDIA SHIELD aims to be a mobile gaming platform based on Android, with a flip-out touchscreen interface, a high quality console-grade integrated controller, and added features like PC game streaming and Miracast support.
Initial Unboxing and Overview of Product Video
At the heart of NVIDIA SHIELD is the brand new Tegra 4 SoC, NVIDIA’s latest entry into the world of mobile processors. Tegra 4 is a quad-core, ARM Cortex-A15 based SoC that includes a fifth A15 core, built on a lower-power-optimized process technology, to run background and idle tasks using less power. This is very similar to what NVIDIA did with Tegra 3’s 4+1 technology, and to how ARM is tackling the problem with its big.LITTLE philosophy.
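To make the 4+1 idea concrete, here is a toy Python sketch of the kind of decision such a design implies: keep background work on the low-power companion core and wake the performance cluster only when demand rises. The core names and threshold are invented for illustration; the real logic lives in NVIDIA's firmware and kernel code.

# Toy model of a 4+1 core-selection decision (hypothetical values).
COMPANION = "companion A15 (low-power process)"
MAIN_CLUSTER = ["A15-0", "A15-1", "A15-2", "A15-3"]

def active_cores(load_percent, wake_threshold=20):
    """Route light background work to the companion core; bring the
    four performance cores online once load crosses a threshold."""
    if load_percent < wake_threshold:
        return [COMPANION]
    return MAIN_CLUSTER

print(active_cores(5))   # idle/background load -> companion core only
print(active_cores(75))  # gaming/interactive load -> main A15 cluster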
Introduction and Specifications
Last week, Samsung flew a select group of press out to Seoul, Korea. The event was the 2013 Samsung Global SSD Summit. Here we saw the launch of a new consumer SSD, the 840 EVO:
This new SSD aims to replace the older 840 (non-Pro) model with one that is considerably more competitive. Let's jump right into the specs:
NVIDIA Finally Gets Serious with Tegra
Tegra has had an interesting run of things. The original Tegra was utilized only by Microsoft with the Zune. Tegra 2 saw better adoption, but did not produce the design wins to propel NVIDIA to a leadership position in cell phones and tablets. Tegra 3 found a spot in Microsoft’s Surface, but that has turned out to be a far more bitter experience than expected. Tegra 4 has so far been integrated into a handful of products and is being featured in NVIDIA’s upcoming SHIELD product. It also hit some production snags that made it later to market than expected.
I think the primary issue with the first three generations of products is pretty simple: there was a distinct lack of differentiation from the other ARM based products around. Yes, NVIDIA brought their graphics prowess to the market, but never in a form that distanced itself adequately from the competition. Tegra 2 boasted GeForce based graphics, but we did not find out until later that it was composed of basically four pixel shaders and four vertex shaders that had more in common with the GeForce 7800/7900 series than with any of the modern unified architectures of the time. Tegra 3 boasted a big graphical boost, but it came in the form of doubling the pixel shader units and leaving the vertex units alone.
While NVIDIA had very strong developer relations and a leg up on the competition in terms of software support, it was never enough to propel Tegra beyond a handful of devices. NVIDIA is trying to rectify that with Tegra 4 and the 72 shader units it contains (still divided between pixel and vertex units). Tegra 4 is not perfect: it is late to market and the GPU is not OpenGL ES 3.0 compliant. ARM, Imagination Technologies, and Qualcomm are shipping new graphics processing units that are not only OpenGL ES 3.0 compliant but also offer OpenCL 1.1 support. Tegra 4 does not support OpenCL. In fact, it does not even support NVIDIA’s in-house CUDA. Ouch.
Jumping into a new market is not an easy thing, and invariably mistakes will be made. NVIDIA worked hard to build a solid foundation with their products, and certainly they had to learn to walk before they could run. Unfortunately, running effectively entails winning designs on the strength of outstanding features, performance, and power consumption, and NVIDIA was really only average in all of those areas. NVIDIA is hoping to change that. Their first salvo, a product whose features and support are a step above the competition, is what we are talking about today.
Specifications and Overview
Talk to most PC enthusiasts today, be they gamers or developers, and ask them what technology they are most interested in for the next year or so, and you will most likely hear about 4K somewhere in the discussion. While the world of consumer electronics and HDTV has been stuck in the rut of 1080p for quite some time now, computers, smartphones and tablets are racing in the direction of higher resolutions and higher pixel densities. 4K is a developing standard that pushes screen resolutions to 4K x 2K pixels, and if you set aside the debate over competing options (3840x2160 versus 4096x2160 being the most prominent), this move is all good news for the industry.
I first dove into the area of 4K displays when I purchased the SEIKI SE50UY04 50-in 4K TV in April for $1300 when it popped up online. The TV showed up days later, we did an unboxing and preview of the experience, and I was blown away by the quality difference of moving to a 3840x2160 screen, even with the caveats involved. It was a 30 Hz panel (half the refresh rate of a typical LCD computer display today), it had limited functionality, and it honestly wasn't the best quality TV I had ever used. But it was 4K, it was inexpensive and it was available.
It was hard to beat at the time, but the biggest drawback was the lack of 60 Hz support, the ability for the screen to truly show 60 frames per second. This caused some less than desirable results in Windows usage and even in gaming, where visual tearing was more prominent when Vsync was disabled. A strength of this design, though, was that it only required a single HDMI connection and would work with basically any current graphics system. I did some Frame Rating game performance testing at 4K and found that GPU horsepower was definitely a limiting factor.
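The 30 Hz ceiling comes down to link bandwidth, and the arithmetic is simple. A back-of-the-envelope Python calculation (assuming 24-bit color and ignoring blanking overhead, which only raises the real requirement) shows why single-link HDMI 1.4, with roughly 10.2 Gbps of raw TMDS throughput, can carry 3840x2160 at 30 Hz but not at 60 Hz:

# Rough uncompressed video bandwidth: width x height x refresh x bits-per-pixel.
# Blanking intervals and 8b/10b encoding add further overhead on a real link,
# so these are floor values; the conclusion holds either way.

def gbps(width, height, hz, bpp=24):
    return width * height * hz * bpp / 1e9

HDMI_1_4_LIMIT = 10.2  # approx. max raw TMDS data rate in Gbps

for hz in (30, 60):
    need = gbps(3840, 2160, hz)
    verdict = "fits" if need < HDMI_1_4_LIMIT else "exceeds"
    print(f"3840x2160 @ {hz} Hz needs ~{need:.1f} Gbps -> {verdict} HDMI 1.4")

# 3840x2160 @ 30 Hz needs ~6.0 Gbps -> fits HDMI 1.4
# 3840x2160 @ 60 Hz needs ~11.9 Gbps -> exceeds HDMI 1.4

This is also why early 4K 60 Hz monitors resort to the DisplayPort MST "tiled" configurations mentioned earlier, driving the panel as two halves.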
Today I follow up our initial unboxing and preview of the ASUS PQ321Q 4K monitor with a more thorough review and summary of our usage results. There is quite a bit that differs between our experience with the SEIKI and the ASUS panels and it is more than just the screen sizes.
Introduction and Design
With the release of Haswell upon us, we’re being treated to an impactful refresh of some already-impressive notebooks. Chief among the benefits are the much-championed battery life improvements—and while better power efficiency is obviously valuable where portability is a primary focus, beefier models can also benefit by way of increased versatility. Sure, gaming notebooks are normally tethered to an AC adapter, but when it’s time to unplug for some more menial tasks, it’s good to know that you won’t be out of juice in a couple of hours.
Of course, an abundance of gaming muscle never hurts, either. As the test platform for one of our recent mobile GPU analyses, MSI’s 15.6” GT60 gaming notebook is, for lack of a better description, one hell of a beast. Following up on Ryan’s extensive GPU testing, we’ll now take a more balanced and comprehensive look at the GT60 itself. Is it worth the daunting $1,999 MSRP? Does the jump to Haswell provide ample and economical benefits? And really, how much of a difference does it make in terms of battery life?
Our GT60 test machine featured the following configuration:
In case it wasn’t already apparent, this device makes no compromises. Sporting a desktop-grade GPU and a quad-core Haswell CPU, it looks poised to be the most powerful notebook we’ve tested to date. Other configurations exist as well, spanning various CPU, GPU, and storage options. However, all available GT60 configurations feature a 1080p anti-glare screen, discrete graphics (GTX 670M and up), Killer Gigabit LAN, and a case built from metal and heavy-duty plastic. They also come preconfigured with Windows 8, so the only way to get Windows 7 with your GT60 is to purchase it through a reseller that performs customizations.