Subject: Storage
Manufacturer: OCZ Technology

Introduction, Specifications and Packaging

Introduction:

It has been a while since OCZ introduced their Vector SSD, and it was in need of a refresh to bring its pricing more in line with the competition, which has been equipping products with physically smaller flash dies (thereby reducing cost). Today, OCZ launched a refresh of the Vector - now dubbed the Vector 150:

131107-072702-7.11.jpg

The OCZ strategy changed up a while back. They removed a lot of redundancy and confusing product lines, consolidating everything into a few simple solutions. Here's a snapshot of that strategy, showing the prior and newer iterations of those three solutions:

vector 150 - lineup.png

The Vector 150 we look at today falls right into the middle here. I just love the 'ENTHUSIAST' icon they went with:

ocz enthusiast.png

Read on for our full review of the new OCZ Vector 150!

Subject: Editorial
Manufacturer: Wyoming Whiskey

Bourbon? Really?

Why is there a bourbon review on a PC-centric website? 

We can’t live, eat, and breathe PC technology all the time.  All of us have outside interests that may not intersect with the PC and mobile market.  I think we would be pretty boring people if that were the case.  Yes, our professional careers are centered in this area, but our personal lives do diverge from the PC world.  You certainly can’t drink a GPU, though I’m sure somebody out there has tried.

bottle31.jpg

The bottle is unique to Wyoming Whiskey.  The bourbon has a warm, amber glow about it as well.  Picture courtesy of Wyoming Whiskey

Many years ago I became a beer enthusiast.  I loved to sample different concoctions, I would brew my own, and I settled on some personal favorites throughout the years.  Living in Wyoming is not necessarily conducive to sampling many different styles and types of beers, so I was in a bit of a rut.  A few years back, a friend of mine bought me a bottle of Tomatin 12-year single malt scotch, and I figured this would be an interesting avenue to move down since I had tapped out my selection of new and interesting beers (Wyoming has terrible beer distribution).

Click to read the entire review here!

Manufacturer: AMD

More of the same for a lot less cash

The week before Halloween, AMD unleashed a trick on the GPU world under the guise of the Radeon R9 290X, the fastest single-GPU graphics card we had tested to date.  With a surprising price point of $549, it was able to outperform the GeForce GTX 780 (and the GTX TITAN in most cases) while undercutting the competition's price by $100.  Not too bad!

amd1.jpg

Today's release might be more surprising (and somewhat confusing).  The AMD Radeon R9 290 4GB card is based on the same Hawaii GPU with a few fewer compute units (CUs) enabled and an even more aggressive price and performance placement.  Seriously, has AMD lost its mind?

Can a card with a $399 price tag cut into the same performance levels as the JUST DROPPED price of $499 for the GeForce GTX 780??  And, if so, what sacrifices are being made by users that adopt it?  Why do so many of our introduction sentences end in question marks?

The R9 290 GPU - Hawaii loses a small island

If you are new to the Hawaii GPU and you missed our first review of the Radeon R9 290X from last month, you should probably start back there.  The architecture is very similar to that of the HD 7000-series Tahiti GPUs with some modest changes to improve efficiency, the biggest being a jump in raw primitive rate from 2 per clock to 4 per clock.

diagram1.jpg

The R9 290 is based on Hawaii though it has four fewer compute units (CUs) than the R9 290X.  When I asked AMD if that meant there was one fewer CU per Shader Engine or if they were all removed from a single Engine, they refused to really answer.  Instead, several "I'm not allowed to comment on the specific configuration" lines were given.  This seems pretty odd as NVIDIA has been upfront about the dual options for its derivative GPU models.  Oh well.
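For a sense of what those four missing CUs mean in raw shader counts, here is a quick back-of-the-envelope sketch (assuming the standard GCN figure of 64 stream processors per CU). It also shows why the spec sheet alone cannot settle the question we put to AMD: either removal scheme yields the same totals.

```python
# Back-of-the-envelope shader math for the R9 290 vs R9 290X.
# Assumes the standard GCN figure of 64 stream processors per CU
# (4 SIMD arrays of 16 lanes each).

SP_PER_CU = 64
SHADER_ENGINES = 4
CUS_PER_ENGINE_290X = 11

cus_290x = SHADER_ENGINES * CUS_PER_ENGINE_290X   # 44 CUs
cus_290 = cus_290x - 4                            # 40 CUs, however they are removed

print(f"R9 290X: {cus_290x} CUs -> {cus_290x * SP_PER_CU} stream processors")  # 2816
print(f"R9 290:  {cus_290} CUs -> {cus_290 * SP_PER_CU} stream processors")    # 2560

# Whether AMD disables one CU per Shader Engine (10+10+10+10) or all four
# from a single engine (11+11+11+7), the totals above are identical.
```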

Continue reading our review of the AMD Radeon R9 290 4GB graphics card!!!

Manufacturer: AMD

Clock Variations

When AMD released the Radeon R9 290X last month, I came away from the review very impressed with the performance and price point of the new flagship graphics card.  My review showed that the 290X was clearly faster than the NVIDIA GeForce GTX 780 and (at that time) was considerably less expensive as well - a win-win for AMD without a doubt.

But there were concerns over a couple of aspects of the card's design.  First was the temperature and, specifically, how AMD was okay with this rather large piece of silicon hitting 95C sustained.  Another concern: AMD also included a switch at the top of the R9 290X to toggle between fan profiles.  This switch essentially creates two reference defaults and makes it impossible for us to set a single baseline of performance.  These different modes only change the maximum fan speed that the card is allowed to reach.  Still, performance changes because of this setting thanks to the newly revised (and updated) AMD PowerTune technology.

We also saw, in our initial review, a large variation in clock speeds both from one game to another as well as over time (after giving the card a chance to heat up).  This led me to create the following graph showing average clock speeds 5-7 minutes into a gaming session with the card set to the default, "quiet" state.  Each test covers a 60-second span.

clock-avg.png
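For the curious, the averages behind a graph like this come down to very simple math: sample the reported core clock at a fixed interval across the 60-second span and take the mean. A minimal sketch of that calculation follows, with hypothetical sample values standing in for what a logging tool would record:

```python
# Minimal sketch: average a GPU's reported core clock over a 60-second span.
# The (time_s, clock_mhz) pairs are hypothetical placeholders for what a
# logging tool would record once per second during a run.

samples = [
    (0.0, 1000), (1.0, 978), (2.0, 945), (3.0, 902),
    # ... one sample per second ...
    (59.0, 880),
]

window_start, window_end = 0.0, 60.0
clocks = [mhz for t, mhz in samples if window_start <= t < window_end]
average_mhz = sum(clocks) / len(clocks)

print(f"Average core clock over the span: {average_mhz:.0f} MHz")
```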

Clearly there is variance here, which led us to more questions about AMD's stance.  Remember when the Kepler GPUs launched?  AMD was very clear that variance from card to card, silicon to silicon, was bad for the consumer as it created random performance deltas between cards with otherwise identical specifications.

When it comes to the R9 290X, though, AMD claims both the GPU (and the card itself) are a customizable graphics solution.  The customization is based around the maximum fan speed, a setting the user can adjust inside the Catalyst Control Center.  This setting will allow you to lower the fan speed if you desire a quieter configuration while still having great gaming performance.  If you are comfortable with a louder fan, because headphones are magic, then you have the option to simply turn up the maximum fan speed and gain additional performance (a higher average clock rate) without any actual overclocking.

Continue reading our article on the AMD Radeon R9 290X - The Configurable GPU!!!

Subject: Motherboards
Manufacturer: GIGABYTE

Introduction and Technical Specifications

Introduction

02-8094_big.jpg

Courtesy of GIGABYTE

The GIGABYTE Z87X-UD5H sits in the upper tier of their Z87 desktop board line. The board supports the latest generation of Intel LGA1150-based processors and is crammed full of all the bells and whistles you would expect on more costly boards. With an MSRP of $229.99, the board is aggressively priced to compete with the other upper tier Z87 desktop boards like the MSI Z87 MPower.

03-8154.jpg

Courtesy of GIGABYTE

Besides an impressive 16-phase digital power delivery system dedicated to the processor, the Z87X-UD5H features GIGABYTE's Ultra Durable 5 Plus technology. Ultra Durable 5 Plus brings several high-end power components into the board's design: International Rectifier (IR) manufactured PowIRstage™ ICs and PWM controllers, Nippon Chemi-con manufactured Black Solid capacitors with a 10k hour operational rating at 105C, 15 micron gold plating on the CPU socket pins, and two 0.070mm copper layers embedded into the PCB for optimal heat dissipation. In addition to the Ultra Durable 5 Plus power features, GIGABYTE designed the board with the following: 10 SATA 6Gb/s ports; two Intel GigE NICs; three PCI-Express x16 slots for up to dual-card NVIDIA SLI or AMD CrossFire support; two PCI-Express x1 slots; a PCI slot; onboard power, reset, and BIOS reset buttons; BIOS selection and Dual-BIOS switches; a 2-digit diagnostic LED display; integrated voltage measurement points; and USB 2.0 and 3.0 port support.

04-8096_big.jpg

Courtesy of GIGABYTE

Continue reading our review of the GIGABYTE Z87X-UD5H motherboard!

Manufacturer: Various

ASUS R9 280X DirectCU II TOP

Earlier this month AMD took the wraps off of a revamped and restyled family of GPUs under the Radeon R9 and R7 brands.  When I reviewed the R9 280X, essentially a lower cost version of the Radeon HD 7970 GHz Edition, I came away impressed with the package AMD was able to put together.  Though there was no new hardware to really discuss with the R9 280X, the price drop placed the cards in a very aggressive position against the NVIDIA GeForce line-up (including the GeForce GTX 770 and the GTX 760).

As a result, I fully expect the R9 280X to be a great selling GPU for those gamers with a mid-range budget of $300. 

But another benefit of using an existing GPU architecture is that board partners can very quickly release custom-built versions of the R9 280X. Companies like ASUS, MSI, and Sapphire are able to offer overclocked and custom-cooled alternatives to the 3GB, $300 card almost immediately, simply by adapting the HD 7970 PCB.

all01.jpg

Today we are going to be reviewing a set of three different R9 280X cards: the ASUS DirectCU II TOP, the MSI Twin Frozr Gaming, and the Sapphire TOXIC.

Continue reading our roundup of the R9 280X cards from ASUS, MSI and Sapphire!!

Manufacturer: ARM

ARM is Serious About Graphics

Ask most computer users from 10 years ago who ARM is, and very few would give the correct answer.  Some well-informed people might mention “Intel” and “StrongARM” or “XScale”, but ARM remained a shadowy presence until we saw the rise of the smartphone.  Since then, ARM has built up their brand, much to the chagrin of companies like Intel and AMD.  Partners such as Samsung, Apple, Qualcomm, MediaTek, Rockchip, and NVIDIA have all worked with ARM to produce chips based on the ARMv7 architecture, with Apple being the first to release ARMv8 (64-bit) SOCs.  The multitude of ARM architectures are likely the most-shipped chips in the world, spanning everything from very basic processors to the very latest Apple A7 SOC.

t700_01.jpg

The ARMv7 and ARMv8 architectures are very power efficient, yet provide enough performance to handle the vast majority of tasks run on smartphones and tablets (as well as a handful of laptops).  With the growth of visual computing, ARM also dedicated itself to designing a competent graphics portion for their chips.  The Mali architecture is aimed at being an affordable option for those without access to their own graphics design groups (unlike NVIDIA and Qualcomm), but competitive with others willing to license out their IP (Imagination Technologies).

ARM was in fact one of the first to license out the very latest graphics technology to partners in the form of the Mali-T600 series of products.  These modules were among the first to support OpenGL ES 3.0 (compatible with 2.0 and 1.1) and DirectX 11.  The T600 architecture is very comparable to Imagination Technologies’ Series 6 and the Qualcomm Adreno 300 series of products.  Currently NVIDIA does not have a unified mobile architecture in production that supports OpenGL ES 3.0/DX11, but they are adapting the Kepler architecture to mobile and will be licensing it to interested parties.  Qualcomm does not license out Adreno after buying that group from AMD (Adreno is an anagram of Radeon).
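As an aside, the Adreno/Radeon anagram is easy to verify with a throwaway line of Python:

```python
# "Adreno" really is an anagram of "Radeon": same letters, rearranged.
print(sorted("adreno") == sorted("radeon"))  # True
```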

Click to read the entire article here!

Manufacturer: NVIDIA

It impresses.

ShadowPlay is NVIDIA's latest addition to their GeForce Experience platform. This feature allows their GPUs, starting with Kepler, to record game footage either locally or, in a later update, stream it online through Twitch.tv. It requires Kepler GPUs because it is accelerated by the dedicated H.264 encoder (NVENC) built into that hardware. The goal is to constantly record game footage without any noticeable impact on performance; that way, the player can keep it running forever and have the opportunity to save moments after they happen.

Also, it is free.
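Conceptually, "save moments after they happen" is just a rolling buffer: keep only the last several minutes of encoded frames, and dump the buffer to disk when the user presses the save hotkey. Here is a minimal sketch of that idea (illustrative only, not NVIDIA's implementation; ShadowPlay works with hardware-encoded H.264 frames, which we stand in for with placeholder bytes):

```python
from collections import deque

# DVR-style rolling capture buffer: a minimal sketch of the ShadowPlay idea.
# Illustrative only; not NVIDIA's implementation.

FPS = 60
BUFFER_SECONDS = 600  # keep roughly the last ten minutes of footage

class RollingCapture:
    def __init__(self, fps=FPS, seconds=BUFFER_SECONDS):
        # Old frames fall off the front automatically once maxlen is reached,
        # so memory use stays constant no matter how long you play.
        self.frames = deque(maxlen=fps * seconds)

    def on_frame_encoded(self, frame_bytes):
        # Called once per encoded frame while the game runs.
        self.frames.append(frame_bytes)

    def save_clip(self, path):
        # User pressed the hotkey: persist everything currently buffered.
        with open(path, "wb") as f:
            for frame in self.frames:
                f.write(frame)

capture = RollingCapture()
for i in range(1000):                           # stand-in for a gameplay session
    capture.on_frame_encoded(b"frame %d " % i)  # placeholder for encoded frame data
capture.save_clip("great_moment.bin")
```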

shadowplay-vs.jpg

I know that I have several gaming memories which come unannounced and leave undocumented. A solution like this is very exciting to me. Of course, a feature on paper is not the same as functional software in the real world. Thankfully, at least in my limited usage, ShadowPlay mostly lives up to its claims. I do not feel its impact on gaming performance. I am comfortable leaving it on at all times. There are issues, however, that I will get to soon.

This first impression is based on my main system running the 331.65 (Beta) GeForce drivers recommended for ShadowPlay.

  • Intel Core i7-3770, 3.4 GHz
  • NVIDIA GeForce GTX 670
  • 16 GB DDR3 RAM
  • Windows 7 Professional
  • 1920 x 1080 @ 120Hz
  • 3 TB USB3.0 HDD (~50MB/s file clone)

The two games tested are StarCraft II: Heart of the Swarm and Battlefield 3.

Read on to see my thoughts on ShadowPlay, the new Experience on the block.

Manufacturer: Thermalright

Introduction and Technical Specifications

Introduction

02-Silver-Arrow-EX_0.jpg

Courtesy of Thermalright

Thermalright is an established brand in the CPU cooling arena, with a track record of innovative creations designed to best remove the heat from your prized CPU. The latest incarnation of their cooler line for Intel and AMD-based CPUs takes the form of the Silver Arrow SB-E Extreme, a massive nickel-plated copper cooler sporting two 140mm fans to aid in heat dispersal. We tested this cooler in conjunction with other all-in-one and air coolers to see how well the Thermalright cooler stacks up. With a retail price of $99.99, the cooler carries a premium price for the premium performance it offers.

03-Silver-Arrow-EX-Top1.jpg

Courtesy of Thermalright

04-Silver-Arrow-side.jpg

Courtesy of Thermalright

05-Silver-Arrow-EX-Bottom_0.jpg

Courtesy of Thermalright

Thermalright took their cooler design to a whole new level with the Silver Arrow SB-E Extreme. The cooler features a nickel-plated copper base and heat pipes with two massive aluminum thin-finned tower radiators to help with heat dissipation. The Silver Arrow SB-E contains eight total 6mm diameter heat pipes that run through the copper base plate, terminating in the two aluminum tower radiators. The base plate itself is polished to a mirror-like finish, ensuring optimal mating between the base plate and CPU surfaces.

Continue reading our review of the Thermalright Silver Arrow SB-E CPU air cooler!

Manufacturer: AMD

A bit of a surprise

Okay, let's cut to the chase here: it's late, we are rushing to get our articles out, and I think you all would rather see our testing results NOW rather than LATER.  The first thing you should do is read my review of the AMD Radeon R9 290X 4GB Hawaii graphics card which goes over the new architecture, new feature set, and performance in single card configurations. 

Then, you should continue reading below to find out how the new XDMA, bridge-less CrossFire implementation actually works in both single panel and 4K (tiled) configurations.

IMG_1802.JPG


A New CrossFire For a New Generation

CrossFire has caused a lot of problems for AMD in recent months (and a lot of problems for me as well).  But AMD continues to make strides in correcting the frame pacing issues associated with CrossFire configurations, and the new R9 290X raises the bar.

Without the CrossFire bridge connector on the 290X, all of the CrossFire communication and data transfer occurs over the PCI Express bus that connects the cards to the rest of the system.  AMD claims that this new XDMA interface was designed for Eyefinity and UltraHD resolutions (which were the subject of our most recent article on the topic).  By accessing the memory of the GPU directly through PCIe, AMD claims it can alleviate the bandwidth and sync issues that were causing problems with Eyefinity and tiled 4K displays.
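Some quick math suggests why the bus is plausible for this job. Assuming 32-bit color and ignoring protocol overhead, a completed 4K frame is about 33 MB; even handing off 60 of them per second consumes only a fraction of a PCIe 3.0 x16 link's theoretical bandwidth:

```python
# Rough feasibility math for passing finished frames over PCI Express.
# Assumes 32-bit (4-byte) color and ignores protocol overhead.

width, height, bytes_per_pixel = 3840, 2160, 4
fps_transferred = 60  # frames per second handed from one GPU to the other

frame_mb = width * height * bytes_per_pixel / 1e6   # ~33.2 MB per frame
transfer_gbs = frame_mb * fps_transferred / 1e3     # ~1.99 GB/s

pcie3_x16_gbs = 15.75  # theoretical one-way bandwidth of a PCIe 3.0 x16 link

print(f"One 4K frame:          {frame_mb:.1f} MB")
print(f"Transfer at 60 fps:    {transfer_gbs:.2f} GB/s")
print(f"Share of PCIe 3.0 x16: {transfer_gbs / pcie3_x16_gbs:.1%}")  # ~12.6%
```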

Even better, this updated version of CrossFire is said to be compatible with the frame pacing updates to the Catalyst driver, improving the multi-GPU experience for end users.

IMG_1800.JPG

When an extra R9 290X accidentally fell into my lap, I decided to take it for a spin.  And if you have followed my graphics testing methodology over the past year, then you'll understand the importance of these tests.

Continue reading our article Frame Rating: AMD Radeon R9 290X CrossFire and 4K Preview Testing!!

Manufacturer: AMD

A slightly new architecture

Note: We also tested the new AMD Radeon R9 290X in CrossFire and at 4K resolutions; check out that full Frame Rating story right here!!

Last month AMD brought media, analysts, and customers out to Hawaii to talk about a new graphics chip coming out this year.  As you might have guessed based on the location, the code name for this GPU was, in fact, Hawaii.  It was targeted at the high end of the discrete graphics market to take on the likes of the GTX 780 and GTX TITAN from NVIDIA.

Earlier this month we reviewed the AMD Radeon R9 280X, R9 270X, and the R7 260X.  None of these were based on that new GPU.  Instead, these cards were all rebrands and repositionings of existing hardware in the market (albeit at reduced prices).  Those lower prices made the R9 280X one of our favorite GPUs of the moment, as it offers performance per dollar currently unmatched by NVIDIA.

But today is a little different; today we are talking about a much more expensive product that has to live up to some pretty lofty goals and ambitions set forward by the AMD PR and marketing machine.  At $549 MSRP, the new AMD Radeon R9 290X becomes the flagship of the Radeon brand.  The question is: to where does that ship sail?


The AMD Hawaii Architecture

To be quite upfront about it, the Hawaii design is very similar to that of the Tahiti GPU from the Radeon HD 7970 and R9 280X cards.  Based on the same GCN (Graphics Core Next) architecture AMD assured us would be its long-term vision, Hawaii ups the ante in a few key areas while maintaining the same core.

01.jpg

Hawaii is built around Shader Engines, of which the R9 290X has four.  Each of these includes 11 compute units (CUs), and each CU holds 4 SIMD arrays of 16 stream processors, or 64 per CU.  Doing the quick math (4 x 11 x 64) brings us to a total stream processor count of 2,816 on the R9 290X.
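That hierarchy walks straight down to the headline throughput figure as well. A quick sketch of the arithmetic, assuming the card's maximum 1.0 GHz "up to" clock and the usual 2 FLOPs per lane per clock for a fused multiply-add:

```python
# Walking the Hawaii shader hierarchy down to raw compute throughput.

shader_engines = 4
cus_per_engine = 11
simds_per_cu = 4
lanes_per_simd = 16  # each GCN SIMD array is 16 lanes wide

stream_processors = shader_engines * cus_per_engine * simds_per_cu * lanes_per_simd
print(stream_processors)  # 2816

# Each lane retires a fused multiply-add (2 FLOPs) per clock; assume the
# card's maximum "up to" clock of 1.0 GHz.
peak_tflops = stream_processors * 2 * 1.0e9 / 1e12
print(f"{peak_tflops:.2f} peak TFLOPS")  # ~5.63
```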

Continue reading our review of the AMD Radeon R9 290X 4GB Graphics Card!!

Subject: Editorial

The Really Good Times are Over

We really did not realize how good we had it.  Sure, we could apply that to budget surpluses and the time before the rise of global terrorism, but in this case I am talking about the predictable advancement of graphics due to both design expertise and improvements in process technology.  Moore’s law has been exceptionally kind to graphics.  Looking back and plotting the course of these graphics companies, they have actually outstripped Moore in terms of transistor density from generation to generation.  Most of this is due to better tools and the expertise gained in what is still a fairly new endeavor as compared to CPUs (the first true 3D accelerators were released in the 1993/94 timeframe).

The complexity of a modern 3D chip is truly mind-boggling.  To get a good idea of where we came from, we must look back at the first generations of products that we could actually purchase.  The original 3Dfx Voodoo Graphics comprised a raster chip and a texture chip, each containing approximately 1 million transistors (give or take) and made on the then-available 0.5 micron process (we shall call it 500 nm from here on out to give a sense of perspective against modern process technology).  The chips were clocked between 47 and 50 MHz (though they could often be clocked up to 57 MHz by going into the init file and putting in “SET SST_GRXCLK=57”… btw, SST stood for Sellers/Smith/Tarolli, the founders of 3Dfx).  This revolutionary graphics card could push out 47 to 50 megapixels per second, carried 4 MB of VRAM, and was released at the beginning of 1996.

righteous3d_01.JPG

My first 3D graphics card was the Orchid Righteous 3D.  Voodoo Graphics was really the first successful consumer 3D graphics card.  Yes, there were others before it, but Voodoo Graphics had the largest impact of them all.

In 1998 3Dfx released the Voodoo 2, and it was a significant jump in complexity from the original.  These chips were fabricated on a 350 nm process.  There were three chips on each card: one raster chip and two texture chips.  At the top end of the product stack were the 12 MB cards.  The raster chip had 4 MB of VRAM available to it while each texture chip had 4 MB of VRAM for texture storage.  Not only did this product double the performance of Voodoo Graphics, it was able to run in single card configurations at 800x600 (as compared to the max 640x480 of Voodoo Graphics).  This was around the same time that NVIDIA started to become a very aggressive competitor with the RIVA TNT and ATI was about to ship the Rage 128.

Read the entire editorial here!

Manufacturer: NVIDIA

Our Legacy's Influence

We are often creatures of habit.  Change is hard.  And oftentimes legacy systems that have been in place for a very long time can shift and determine the angle at which we attack new problems.  This happens in the world of computer technology, but also outside the walls of silicon, and the results can be dangerous inefficiencies that threaten to limit our advancement in those areas.  Often our need to adapt new technologies to existing infrastructure can be blamed for stagnant development.

Take the development of the phone as an example.  The pulse-based phone system and the rotary dial slowed the adoption of touch-tone phones and forced manufacturers to include switches to select between pulse and tone dialing options on phones for decades.

Perhaps a more substantial example is that of the railroad system, which has based its track gauge (the width between the rails) on transportation methods that existed before the birth of Christ.  Horse-drawn carriages pulled by two horses had an axle gap of roughly 4 feet, 8.5 inches in the 1800s, and thus the first railroads in the US were built with a track gauge of 4 feet, 8.5 inches.  Today, the standard rail track gauge remains 4 feet, 8.5 inches despite the fact that a wider gauge would allow for more stability with larger cargo loads and would permit higher speed vehicles.  But the cost of updating the existing infrastructure around the world would be so prohibitive that it is likely we will remain with that outdated standard.

railroad.jpg

What does this have to do with PC hardware, and why am I giving you an abbreviated history lesson?  There are clearly some examples of legacy infrastructure limiting our advancement in hardware development.  Solid state drives are held back by the current SATA-based storage interface, though we are seeing movement to faster interconnects like PCI Express to alleviate this.  Some compute tasks are limited by the “infrastructure” of standard x86 processor cores, and the move to GPU compute has changed the direction of these workloads dramatically.

There is another area of technology that could be improved if we could just move past an existing way of doing things.  Displays.

Continue reading our story on NVIDIA G-Sync Variable Refresh Rate Technology!!

Manufacturer: XSPC

Introduction and Technical Specifications

Introduction

02-ex240-750-kit_0.jpg

XSPC Raystorm 750 EX240 Watercooling Kit
Courtesy of XSPC

XSPC has a well-known presence in the enthusiast water cooling community, with a track record for high performance and affordable water cooling products. Recently, XSPC released new versions of their DIY cooling kits, integrating their EX series radiators. They were kind enough to provide us with a sample of their Raystorm 750 EX240 Watercooling Kit with the included EX240 radiator. We tested this kit in conjunction with other all-in-one and air coolers to see how well the XSPC kit stacks up. With a retail price of $149.99, this kit offers an affordable alternative to the all-in-one coolers.

03-1-5-2-1-1-1_0.jpg

X2O 750 Dual Bay Reservoir/Pump V4
Courtesy of XSPC

04-ex240-1-1_0.jpg

EX240 Dual Radiator
Courtesy of XSPC

05-kit1-4-1-1-1_0.jpg

Raystorm CPU Waterblock
Courtesy of XSPC

Continue reading our review of the XSPC Raystorm 750 EX240 Watercooling Kit!!

Manufacturer: Corsair

Introduction and Features

The 750D is Corsair’s latest addition to their top-of-the-line Obsidian Series and is the third new Obsidian case for 2013. The new 750D is a full-tower enclosure that offers a little more room, enhanced cooling, and expanded drive mounting options compared to Corsair’s ever-popular 650D mid-tower enclosure. The 750D is being introduced with an MSRP of $159.99 USD, which also makes it a little less expensive than the 650D. In addition to PC enclosures, Corsair continues to offer one of the largest selections of memory products, SSDs, power supplies, coolers, gaming peripherals, and PC accessories currently on the market.

2-Obsidian-Banner.jpg

The 750D full-tower case is positioned midway between Corsair’s huge 900D Super-Tower and 350D Micro-ATX enclosures and shares many of the same styling and design features as the 900D and 350D. Corsair is calling the 750D a successor to the 650D, but we hope the 650D mid-tower enclosure doesn’t go away any time soon, as the two enclosures are still different enough to appeal to different users.

3-750D-diag.jpg

(Courtesy of Corsair)

Continue reading our review of the Corsair Obsidian 750D case!!

Subject: Systems
Manufacturer: ORIGIN PC

Get your wallet ready

While I was preparing for the release of Intel's Core i7 Ivy Bridge-E processors last month, ORIGIN PC approached me about a system review based on the new platform.  Of course, I rarely pass up the opportunity to spend some time with unreasonably fast PC hardware, so I told them to send something over that would impress me.

This system did.

The ORIGIN PC Millennium custom configuration is one of the flagship offerings from the boutique builder and it will hit your wallet nearly as hard as it will your games and applications.  What kind of hardware do you get for $4200 these days?

  • ORIGIN PC Millennium
  • Intel Core i7-4930K (OC to 4.5 GHz)
  • ASUS Rampage IV Gene mATX motherboard
  • Custom Corsair H100i 240mm water cooler
  • 16GB (4 x 4GB) Corsair Vengeance DDR3-1866 memory
  • 2 x NVIDIA GeForce GTX 780 3GB SLI
  • 2 x Samsung 840 Pro 128GB SSD (RAID 0)
  • 1TB Western Digital Black HDD
  • Corsair AX1200i power supply
  • Corsair Obsidian 350D case
  • Windows 8

site3.jpg

Our custom build was designed to pack as much processing power into as small a case as possible, and I think you'll find that ORIGIN did a bang-up job here.  Starting with the Corsair 350D micro ATX chassis while still fitting in dual graphics cards and an overclocked IVB-E processor, ORIGIN produced results that are going to impress.

IMG_1458.JPG

Continue reading our overview of the ORIGIN PC Millennium custom gaming PC!!

Manufacturer: AMD

The AMD Radeon R9 280X

Today marks the first step in a revamp of AMD's entire Radeon discrete graphics product stack. Between now and the end of 2013, AMD will completely cycle out Radeon HD 7000 cards and replace them with a new branding scheme. The "HD" branding is on its way out, and it makes sense. Consumers have moved on to UHD and WQXGA display standards; HD is no longer extraordinary.

But I want to be very clear and upfront with you: today is not the day that you’ll learn about the new Hawaii GPU that AMD promised would dominate the performance per dollar metrics for enthusiasts.  The Radeon R9 290X will be a little bit down the road.  Instead, today’s review will look at three other Radeon products: the R9 280X, the R9 270X and the R7 260X.  None of these products are really “new”, though, and instead must be considered rebrands or repositionings. 

There are some changes to discuss with each of these products, including clock speeds and more importantly, pricing.  Some are specific to a certain model, others are more universal (such as updated Eyefinity display support). 

Let’s start with the R9 280X.


AMD Radeon R9 280X – Tahiti aging gracefully

The AMD Radeon R9 280X is built from the exact same ASIC (chip) that powers the previous Radeon HD 7970 GHz Edition, with a few modest changes.  The core clock speed of the R9 280X is actually a little bit lower at reference rates than the Radeon HD 7970 GHz Edition, by about 50 MHz.  The R9 280X GPU will hit a 1.0 GHz rate while the previous model was reaching 1.05 GHz; not much of a change, but an interesting decision to be sure.

Because of that speed difference, the R9 280X has a lower peak compute capability of 4.1 TFLOPS compared to the 4.3 TFLOPS of the 7970 GHz.  The memory clock speed is the same (6.0 Gbps) and the board power is the same, with a typical peak of 250 watts.

280x-1.jpg

Everything else remains the same as you know it from the HD 7970 cards.  There are 2048 stream processors in the Tahiti version of AMD’s GCN (Graphics Core Next) architecture, 128 texture units, and 32 ROPs, all fed by a 384-bit GDDR5 memory bus running at an effective 6.0 Gbps.  Yep, still with a 3GB frame buffer.
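Both the TFLOPS figures and the memory bandwidth fall straight out of those specs. A quick sketch of the arithmetic, assuming the usual 2 FLOPs per stream processor per clock:

```python
# Deriving the R9 280X's headline numbers from its published specs.

stream_processors = 2048
boost_clock_ghz = 1.0        # 1.05 GHz on the HD 7970 GHz Edition
bus_width_bits = 384
gddr5_gbps_per_pin = 6.0     # effective data rate

# 2 FLOPs (one fused multiply-add) per stream processor per clock:
tflops = stream_processors * 2 * boost_clock_ghz / 1000
print(f"Peak compute:     {tflops:.1f} TFLOPS")  # 4.1 (4.3 at 1.05 GHz)

bandwidth_gbs = bus_width_bits / 8 * gddr5_gbps_per_pin
print(f"Memory bandwidth: {bandwidth_gbs:.0f} GB/s")  # 288
```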

Continue reading our review of the AMD Radeon R9 280X, R9 270X and R7 260X!!!

Subject: Mobile
Manufacturer: ASUS

Introduction and Design

P9033132.jpg

As we’re swimming through the veritable flood of Haswell refresh notebooks, we’ve stumbled across the latest in a line of very popular gaming models: the ASUS G750JX-DB71.  This notebook is the successor to the well-known G75 series, which topped out at an Intel Core i7-3630QM with NVIDIA GeForce GTX 670MX dedicated graphics.  Now, ASUS has jacked up the specs a little more, including the latest 4th-gen CPUs from Intel as well as 700-series NVIDIA GPUs.

Our ASUS G750JX-DB71 test unit features the following specs:

specs.png

Of course, the closest comparison to this unit is the recently-reviewed MSI GT60-2OD-026US, which featured nearly identical specifications apart from a 15.6” screen, a better GPU (a GTX 780M with 4 GB GDDR5), and a slightly different CPU (the Intel Core i7-4700HQ).  In case you’re wondering what the difference is between the ASUS G750JX’s Core i7-4700MQ and the GT60’s i7-4700HQ, it’s very minor: the HQ features a slightly faster integrated graphics Turbo frequency (1.2 GHz vs. 1.15 GHz) and supports Intel Virtualization Technology for Directed I/O (VT-d).  Since the G750JX doesn’t support Optimus, we won’t ever be using the integrated graphics, and unless you’re doing a lot with virtual machines, VT-d isn’t likely to offer any benefits, either.  So for all intents and purposes, the CPUs are equivalent, meaning the biggest overall performance difference (on the spec sheet, anyway) lies with the GPU and the storage devices (where the G750JX offers more solid-state storage than the GT60).  It’s no secret that the MSI GT60 burned up our benchmarks, so the real question is: how close is the ASUS G750JX to its pedestal, and if the differences are considerable, are they justified?

At an MSRP of around $2,000 (though it can be found for around $100 less), the ASUS G750JX-DB71 competes directly with the likes of the MSI GT60, which is priced equivalently.  The question, of course, is whether it truly competes.  Let’s find out!

P9033137.jpg

Continue reading our review of the ASUS G750JX-DB71 Gaming Notebook!!!

Manufacturer: Scott Michaud

A new generation of Software Rendering Engines.

We have been busy with side projects here at PC Perspective over the last year. Ryan has nearly broken his back rating the frames. Ken, along with running the video equipment and "getting an education", developed a hardware switching device for Wirecast and XSplit.

My project, "Perpetual Motion Engine", has been researching and developing a GPU-accelerated software rendering engine. Now, to be clear, this is in very early development for the moment. The point is not to draw beautiful scenes. Not yet. The point is to show what OpenGL and DirectX do, and what limits are removed when you do the math directly.

Errata: BioShock uses a modified Unreal Engine 2.5, not 3.

In the above video:

  • I show the problems with graphics APIs such as DirectX and OpenGL.
  • I talk about the problem those APIs attempt to solve: finding color values for your monitor.
  • I discuss the advantages of boiling graphics problems down to general mathematics.
  • Finally, I prove the advantages of boiling graphics problems down to general mathematics.

I would recommend watching the video first, before moving on to the rest of the editorial. A few parts need to be seen for better understanding.
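To make "do the math directly" concrete, here is a toy example of the kind of per-pixel arithmetic a renderer ultimately performs: a plain function from pixel coordinates to a color, written to an image file with no graphics API in between. (A sketch for illustration only; this is not code from the Perpetual Motion Engine.)

```python
# Toy software renderer: shade every pixel with plain math and write a PPM.
# The whole idea is a function from (x, y) to a color; no API in between.
# Illustration only; not code from the Perpetual Motion Engine.

WIDTH, HEIGHT = 256, 256

def shade(x, y):
    """Return an (r, g, b) color for one pixel: a simple two-axis gradient."""
    r = int(255 * x / (WIDTH - 1))
    g = int(255 * y / (HEIGHT - 1))
    b = 64
    return r, g, b

with open("out.ppm", "w") as f:
    f.write(f"P3\n{WIDTH} {HEIGHT}\n255\n")  # ASCII PPM header
    for y in range(HEIGHT):
        for x in range(WIDTH):
            f.write("%d %d %d\n" % shade(x, y))
```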

Click here, after you watch the video, to read more about GPU-accelerated Software Rendering.

Subject: Motherboards
Manufacturer: MSI

Introduction and Technical Specifications

Introduction

02-five_pictures2_2852_20130529171649_g_0.jpg

Courtesy of MSI

The Z87 XPower board is the flagship motherboard in MSI's MPower product line. The board supports the latest generation of Intel LGA1150-based processors with all the over-engineered goodness you've come to expect from an MSI flagship board. The Z87 XPower sports the black and yellow theme of the product line, along with integrated LEDs to really make the board stand out. While its $439.99 retail price may seem a bit high, the stability, features, and raw power of the board make it a wise investment for those who only want the best.

03-mb_0.jpg

Courtesy of MSI

04-pcb-expl_0.jpg

Courtesy of MSI

Designed with a 32-phase digital power system and an eight-layer PCB, the MSI Z87 XPower is armed to take any amount of punishment you can throw at it. MSI incorporated a plethora of features into its massive XL-ATX form factor board: 10 SATA 6Gb/s ports; a mSATA 6Gb/s port; a Killer E2205 GigE NIC; Intel 802.11n WiFi and Bluetooth adapter; five PCI-Express x16 slots for up to quad-card NVIDIA SLI or AMD CrossFire support; two PCI-Express x1 slots; Lucidlogix Virtu® MVP 2.0 support; onboard power, reset, BIOS reset, CPU ratio control, base clock control, OC Genie, power discharge, and Go2BIOS buttons; multi-BIOS and PCIe control switches; 2-digit diagnostic LED display; 14 voltage check points; independent audio subsystem PCB design; and USB 2.0 and 3.0 port support.

05-rear-panel-expl_0.jpg

Courtesy of MSI

Continue reading our review of the MSI Z87 XPower motherboard!