New AMD Polaris 10 and Polaris 11 GPU Details Emerge

Subject: Editorial, Graphics Cards | May 18, 2016 - 01:18 PM |
Tagged: rumor, Polaris, opinion, HDMI 2.0, gpu, gddr5x, GDDR5, GCN, amd, 4k

While Nvidia's Pascal has held the spotlight in the news recently, it is not the only new GPU architecture debuting this year. AMD will soon be bringing its Polaris-based graphics cards to market for notebooks and mainstream desktop users. While several different code names have been thrown around for these new chips, they are consistently referred to in general terms as Polaris 10 and Polaris 11. AMD's Raja Koduri stated in an interview with PC Perspective that the numbers used in the naming scheme hold no special significance, but eventually Polaris will be used across the entire performance lineup (low end to high end graphics).

Naturally, there are going to be many rumors and leaks as the launch gets closer. In fact, Tech Power Up recently came into a number of interesting details about AMD's plans for Polaris-based graphics in 2016, including specifications and which areas of the market each chip is aimed at.

[Image: AMD GPU roadmap]

Citing the usual "industry sources" familiar with the matter (take that for what it's worth, but the specifications do not seem out of the realm of possibility), Tech Power Up revealed that there are two lines of Polaris-based GPUs that will be made available this year. Polaris 10 will allegedly serve as the mid-range (mainstream) graphics option for desktops as well as the basis for high end gaming notebook graphics chips. On the other hand, Polaris 11 will reportedly be a smaller chip aimed at thin-and-light notebooks and mainstream laptops.

Now, for the juicy bits of the leak: the rumored specifications!

AMD's "Polaris 10" GPU will feature 32 compute units (CUs) which TPU estimates – based on the assumption that each CU still contains 64 shaders on Polaris – works out to 2,048 shaders. The GPU further features a 256-bit memory interface along with a memory controller supporting GDDR5 and GDDR5X (though not at the same time heh). This would leave room for cheaper Polaris 10 derived products with less than 32 CUs and/or cheaper GDDR5 memory. Graphics cards would have as much as 8GB of memory initially clocked at 7 Gbps. Reportedly, the full 32 CU GPU is rated at 5.5 TFLOPS of single precision compute power and runs at a TDP of no more than 150 watts.

Compared to the existing Hawaii-based R9 390X, the upcoming R9 400-series Polaris 10 GPU has fewer shaders and less memory bandwidth. The memory is clocked 1 GHz higher, but the GDDR5X memory bus is half the width of the 390X's 512-bit GDDR5 bus, which results in 224 GB/s of memory bandwidth for Polaris 10 versus 384 GB/s on Hawaii. The R9 390X has a slight edge in compute performance at 5.9 TFLOPS versus Polaris 10's 5.5 TFLOPS; however, the Polaris 10 GPU is using much less power and easily wins at performance per watt. It almost reaches the same level of single precision compute performance at nearly half the power, which is impressive if it holds true!
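
For reference, those bandwidth figures follow directly from bus width and effective memory data rate: bandwidth = (bus width in bits ÷ 8) × per-pin rate. A minimal sketch of that arithmetic, using the rumored Polaris 10 figures and the 390X's known specs (the function name is mine, purely for illustration):

```python
def memory_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# Rumored "Polaris 10": 256-bit bus with memory at 7 Gbps effective
print(memory_bandwidth_gb_s(256, 7.0))   # 224.0 GB/s
# R9 390X (Grenada/Hawaii): 512-bit bus with GDDR5 at 6 Gbps effective
print(memory_bandwidth_gb_s(512, 6.0))   # 384.0 GB/s
```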

|  | R9 390X | R9 390 | R9 380 | R9 400-Series "Polaris 10" |
|---|---|---|---|---|
| GPU Code name | Grenada (Hawaii) | Grenada (Hawaii) | Antigua (Tonga) | Polaris 10 |
| GPU Cores | 2816 | 2560 | 1792 | 2048 |
| Rated Clock | 1050 MHz | 1000 MHz | 970 MHz | ~1343 MHz |
| Texture Units | 176 | 160 | 112 | ? |
| ROP Units | 64 | 64 | 32 | ? |
| Memory | 8GB | 8GB | 4GB | 8GB |
| Memory Clock | 6000 MHz | 6000 MHz | 5700 MHz | 7000 MHz |
| Memory Interface | 512-bit | 512-bit | 256-bit | 256-bit |
| Memory Bandwidth | 384 GB/s | 384 GB/s | 182.4 GB/s | 224 GB/s |
| TDP | 275 watts | 275 watts | 190 watts | 150 watts (or less) |
| Peak Compute | 5.9 TFLOPS | 5.1 TFLOPS | 3.48 TFLOPS | 5.5 TFLOPS |
| MSRP (current) | ~$400 | ~$310 | ~$199 | unknown |

Note: the Polaris GPU clock is estimated using the assumption that 5.5 TFLOPS is the peak compute figure and that the shader count is accurate. (Thanks Scott.)
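
The estimate simply works backwards from peak compute: GCN shaders retire one fused multiply-add (two FLOPs) per clock, so clock ≈ peak FLOPS ÷ (shader count × 2). A rough sketch of that calculation using the rumored numbers (the helper function is mine, not something from AMD):

```python
def estimated_clock_mhz(peak_tflops: float, shaders: int, flops_per_clock: int = 2) -> float:
    """Back out the GPU clock (MHz) from peak single-precision compute.

    Assumes each shader retires one FMA (2 FLOPs) per clock, as on GCN.
    """
    return peak_tflops * 1e12 / (shaders * flops_per_clock) / 1e6

print(round(estimated_clock_mhz(5.5, 2048)))  # ~1343 MHz for the rumored "Polaris 10"
print(round(estimated_clock_mhz(2.5, 896)))   # ~1395 MHz for the rumored "Polaris 11" further below
```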

Another comparison that can be made is to the Radeon R9 380, a Tonga-based GPU with a similar TDP. In this matchup, the Polaris 10 based chip will, at a slightly lower TDP, pack in more shaders and twice the amount of faster-clocked memory with 23% more bandwidth, and provide a 58% increase in single precision compute horsepower. Not too shabby!
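
Those percentages follow from the table above; a quick sanity check against the rumored numbers (again, the helper is purely illustrative):

```python
def pct_gain(new: float, old: float) -> float:
    """Percentage increase of `new` over `old`."""
    return (new / old - 1) * 100

# Rumored "Polaris 10" vs. R9 380 (Antigua/Tonga)
print(round(pct_gain(224, 182.4)))  # ~23 (% more memory bandwidth)
print(round(pct_gain(5.5, 3.48)))   # ~58 (% more single-precision compute)
```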

Likely, a good portion of these gains is made possible by the move to a smaller process node and FinFET "tri-gate" style transistors on the Samsung/GlobalFoundries 14LPP manufacturing process, though AMD has also made some architecture tweaks and hardware additions to the GCN 4.0 based processors. A brief high level introduction is said to be made today in a webinar for AMD's partners (though AMD has preemptively said that no technical nitty-gritty details will be divulged yet). (Update: Tech Altar summarized the partner webinar. Unfortunately there were no major reveals other than that AMD will not stop AIB partners from pushing for the highest factory overclocks they can get.)

Moving on from Polaris 10 for a bit, Polaris 11 is rumored to be a smaller GCN 4.0 chip that will top out at 14 CUs (an estimated 896 shaders/stream processors) and 2.5 TFLOPS of single precision compute power. These chips, aimed at mainstream and thin-and-light laptops, will have 50W TDPs and will be paired with up to 4GB of GDDR5 memory. There is apparently no GDDR5X option for these, which makes sense at this price point and performance level. The 128-bit bus is a bit limiting, but this is a low end mobile chip we are talking about here...

|  | R7 370 | R7 400 Series "Polaris 11" |
|---|---|---|
| GPU Code name | Trinidad (Pitcairn) | Polaris 11 |
| GPU Cores | 1024 | 896 |
| Rated Clock | 925 MHz base (975 MHz boost) | ~1395 MHz |
| Texture Units | 64 | ? |
| ROP Units | 32 | ? |
| Memory | 2 or 4GB | 4GB |
| Memory Clock | 5600 MHz | ? MHz |
| Memory Interface | 256-bit | 128-bit |
| Memory Bandwidth | 179.2 GB/s | ? GB/s |
| TDP | 110 watts | 50 watts |
| Peak Compute | 1.89 TFLOPS | 2.5 TFLOPS |
| MSRP (current) | ~$140 (less after rebates and sales) | ? |

Note: the Polaris GPU clock is estimated using the assumption that 2.5 TFLOPS is the peak compute figure and that the shader count is accurate. (Thanks Scott.)

Fewer details were unveiled concerning Polaris 11, as you can see from the chart above. From what we know so far, it should be a promising successor to the R7 370 series: even with the narrower memory bus and lower shader count, the GPU should be clocked higher (and it may actually have more shaders than the mobile M-series variants of the 370 and lower), all at a much lower TDP, for at least equivalent if not noticeably better performance. The lower power usage in particular will be hugely welcome in mobile devices, as it should ideally translate into longer battery life under the same workloads. I picked the R7 370 as the comparison because it has 4 gigabytes of memory, does not have that many more shaders, and, being a desktop chip, is probably the part readers are most familiar with. Polaris 11 also appears to sit between the R7 360 and R7 370 in terms of shader count and other features, but it is allegedly going to be faster than both of them while using (at least on paper) less than half the power.

Of course, these are still rumors until AMD makes Polaris officially, well, official with a product launch. The claimed specifications appear reasonable though, and based on them I have a few important takeaways and thoughts.


The first thing on my mind is that AMD is taking an interesting direction here. NVIDIA has chosen to start its new generation at the top, announcing "big Pascal" GP100 and actually launching the GP104-based GTX 1080 (one of its highest end consumer cards) yesterday, with lower end products to follow over the course of the year. AMD has opted for the opposite approach. It will start closer to the lower end with a mainstream notebook chip and a high end notebook/mainstream desktop GPU (Polaris 11 and 10, respectively) and then flesh out its product stack over the following year (remember, Raja Koduri stated Polaris and GCN 4 would be used across the entire product stack), building up with bigger and higher end GPUs and finally topping off with its highest end consumer (and professional) GPUs based on "Vega" in 2017.

This means that for some time after both architectures are launched, AMD's and NVIDIA's newest GPUs will not be directly competing with each other. I'm not sure if this was planned by either company or is just how it happened to work out with each following its own GPU philosophy (I'm thinking the latter). Eventually they should meet in the middle (maybe late this year?) with a mid-range desktop graphics card, and it will be interesting to see how they stack up at similar price points and hardware levels. Then, once "Vega" based GPUs hit (sadly, probably in time for NVIDIA's big Pascal to launch; I'm not sure if Vega is only a Fury X replacement or goes beyond that to compete with a 1080 Ti or even GP100), we should see GCN 4 on the new smaller process node square up against NVIDIA and its 16nm Pascal products across the entire lineup. Which will have the better performance, and which will win out in power usage, performance per watt, and performance per dollar? All questions I wish I knew the answers to, but sadly do not!

Speaking of price and performance per dollar... Polaris is actually looking pretty good so far at hitting much lower TDPs and power usage targets while delivering at least similar performance, if not a good bit more. Both AMD and NVIDIA appear to be bringing out GPUs better than I expected as far as improvements in performance and power usage go (these die shrinks have really helped, even though that trend isn't really going to continue from here on out...). I hope that AMD can at least match NVIDIA in these areas at the mid range, even if it does not have a high end GPU coming out soon (not until sometime after these cards launch, and not really until Vega, the high end GCN successor). At least on paper, based on the leaked information, the GPUs look good so far. My only worry is pricing, which I think is going to make or break these cards. AMD will need to price them competitively and aggressively to ensure their adoption and success.

I hope that doing the rollout this way (starting with lower end chips) helps AMD to iron out the new smaller process node and achieve good yields, so that it can be aggressive with pricing here and, eventually, at the high end!

I am looking forward to more information on AMD's Polaris architecture and the graphics cards based on it!

I will admit that I am not 100% up on all the rumors and I apologize for that. With that said, I would love to hear what your thoughts are on AMD's upcoming GPUs and what you think about these latest rumors!

AT&T Will Start Enforcing U-Verse Data Caps, Charging Extra For Unlimited Data

Subject: Editorial, General Tech | March 30, 2016 - 08:00 AM |
Tagged: U-Verse, opinion, isp, Internet, FTTN, FTTH, editorial, data cap, AT&T

AT&T U-Verse internet users will soon feel the pain of the company's old school DSL users in the form of enforced data caps and overage charges for exceeding new caps. In a blog post yesterday, AT&T announced plans to roll out new data usage caps for U-Verse users as well as a ('Comcastic') $30 per month option for unlimited data use.

Starting on May 23, 2016, AT&T U-Verse (VDSL2 and Gigapower/Fiber) customers will see an increase to their usage allowance based on their speed tier. Currently, U-Verse FTTN customers have a 250 GB cap regardless of speed tier, while FTTH customers in Gigapower markets have a higher 500 GB cap. These were soft caps and not enforced, meaning that customers were not charged anything for going over them. That will soon change, and all U-Verse customers will be charged for going over their cap at a rate of $10 for every 50 GB (or part thereof) over the cap. Even if you use only 1 GB over the cap, you will still be charged the full $10 fee.


The new U-Verse caps (also listed in the chart below) range from 300 GB for speeds up to 6 Mbps, through 600 GB for everything up to the bonded-pair 75 Mbps tier. At the top end, customers lucky enough to get fiber to the home and speed plans up to 1 Gbps will have a 1 TB cap.

| Internet Tier | New Data Cap | Overage Charges |
|---|---|---|
| AT&T DSL (all speeds) | 150 GB | $10 per 50 GB |
| AT&T U-Verse (768 Kbps – 6 Mbps) | 300 GB | $10 per 50 GB |
| AT&T U-Verse (12 Mbps – 75 Mbps) | 600 GB | $10 per 50 GB |
| AT&T U-Verse FTTH (100 Mbps – 1 Gbps) | 1 TB | $10 per 50 GB |

U-Verse customers who expect to use more than 500 GB over their data cap ($100 is the maximum overage charge) or who simply prefer not to worry about tracking their data usage can opt to pay an additional $30 monthly fee to be exempt from their data cap.
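
To put the overage rules in concrete terms, here is a rough sketch of how a monthly overage bill would shake out under the policy described above: each started 50 GB block over the cap costs $10, and the charge tops out at $100 (the function and the example numbers are mine, not AT&T's):

```python
import math

def overage_charge(usage_gb: float, cap_gb: float) -> float:
    """Hypothetical helper: $10 per started 50 GB block over the cap, capped at $100."""
    over_gb = max(0.0, usage_gb - cap_gb)
    if over_gb == 0:
        return 0.0
    return min(math.ceil(over_gb / 50) * 10.0, 100.0)

print(overage_charge(601, 600))    # 10.0  -- even 1 GB over costs the full $10
print(overage_charge(750, 600))    # 30.0  -- roughly where the $30 unlimited add-on breaks even
print(overage_charge(1200, 600))   # 100.0 -- overage charges top out at $100
```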

It's not all bad news though. General wisdom has always been that U-Verse customers subscribed to both internet and TV would be exempt from the caps even if AT&T started to enforce them, and that is not changing. U-Verse customers subscribed to U-Verse TV (IPTV) or DirecTV on a double play package with U-Verse internet will officially be exempt from the cap and will get the $30/month unlimited data option for free.

AT&T DSL users continue to be left behind here as they will not receive an increase in their 150 GB data allowance, and from the wording of the blog post it appears that they will further be left out of the $30 per month unlimited data option (which would have actually been a very welcome change for them).

Karl Bode over at DSLReports adds a bit of interesting history, noting that AT&T originally stated U-Verse users would not be subject to a hard data cap because of the improved network architecture and its "greater capacity" versus the old school CO-fed DSL lines. With the acquisition of DirecTV, and the way AT&T has been heavily pushing DirecTV while steering customers away from its IPTV U-Verse TV service, it actually seems like a perfect time not to enforce data caps, since customers going with satellite TV would free up a great deal of bandwidth on the VDSL2 wireline network for internet!

This recent move is very reminiscent of Comcast, which "trials" data caps and overages in certain markets and has its own extra monthly charge for unlimited data use. Considering the relatively minuscule cost of delivering this data versus the monthly service charges, these new unlimited options really seem more about seeking profit than about any increased costs, especially since customers have effectively had unlimited data this whole time and will soon be charged extra for the same service they have possibly been using for years. I will give AT&T some credit for implementing more realistic data caps and bumping everyone up based on speed tier (something Comcast should adopt if it is set on having caps). Also, letting Internet+TV customers keep unlimited data is a good thing, even if it is only there to encourage people not to cut the cord.

The final bit of good news is that existing U-Verse customers will have approximately four months before they are actually charged for going over their data caps. AT&T claims that it will only begin charging for overages on the third billing cycle, giving customers at least two 'free' months of overages after the new caps take effect. Users can opt to switch between unlimited and capped options at will, even in the middle of a billing cycle, and in the first two months the company will send as many as seven email reminders at various data usage points as customers approach the cap, as a warning before the overage charges kick in.

This is a lot to take in, but there is still plenty of time to figure out how the changes will affect you. 

Are you a U-Verse or AT&T DSL user? What do you think about the new data caps for U-Verse users and the $30/month unlimited data option?

Source: AT&T

Computex 2015: Synaptics Touches Up The Space Bar With SmartBar Technology

Subject: General Tech | June 3, 2015 - 08:00 AM |
Tagged: touch, synaptics, smartbar, opinion, gaming, computex 2015, computex

Synaptics revealed more details on its SmartBar technology today at Computex. The human interface company, best known for its trackpads, is looking to expand its reach into keyboards. Specifically, SmartBar is technology that adds touch input functionality to the keyboard spacebar. Using the technology, OEMs can integrate capacitive touch sensors into the spacebar, allowing for several unique and productivity-boosting gestures.

[Image: Synaptics SmartBar capacitive spacebar]

The SmartBar spacebar can be broken up into five logical buttons (touch sensitive areas), each of which can be associated with user-created macros using a bundled macro editing utility. Alternatively, users can enable touch gestures. Synaptics is touting the ability to use quick left and right swipe motions to edit text by moving the cursor back and forth word by word through a document, as well as the ability to use a two-thumb pinch gesture to zoom in and out on an image or document. The touch input would also be useful to gamers who want to further increase their actions per minute in RTS games, or for something as simple as shifting gears in racing games or switching weapons in first person shooters.

Along the lines of gaming, it turns out that Thermaltake, under its Tt eSports line, will be the first adopter of SmartBar technology, and while Synaptics did not reveal any exact products, I am looking forward to seeing what Thermaltake does with the technology in its future gaming keyboards. This could be a gimmick, or it could really take off and become a must-have feature depending on how well it is implemented in both hardware and software. It does make sense though; the spacebar is the natural resting place for your thumbs, so it should not take too much effort to incorporate touch gestures (literally at your fingertips...) to improve your game or work efficiency. A simple but promising idea for sure.

From the press release:

“Desktop PCs still represent a sizeable portion of the PC market, especially in the commercial segment, but most desktop users have been left behind in terms of next-generation interfaces such as touch,” said Tom Mainelli, VP of Devices & Displays at International Data Corporation (IDC). “Companies are always looking for ways to help drive employee efficiency, and feature-rich, touch-enabled keyboards represent a straightforward, affordable way to help increase worker productivity.”

The SmartBar technology is available now to OEMs, but we might have to wait until CES to see actual products offering touch sensitive spacebars.

What do you think of the technology, and would you use it for gaming?

Source: Synaptics

New Intel Xeon D Broadwell Processors Aimed at Low Power, High Density Servers

Subject: Editorial, Processors | March 12, 2015 - 08:29 PM |
Tagged: Xeon D, xeon, servers, opinion, microserver, Intel

Intel dealt a blow to AMD and ARM this week with the introduction of the Xeon Processor D Product Family of low power server SoCs. The new Xeon D chips use Intel’s latest 14nm process and top out at 45W. The chips are aimed at low power high density servers for general web hosting, storage clusters, web caches, and networking hardware.

[Image: Intel Xeon D processor]

Currently, Intel has announced two Xeon D chips, the Xeon D-1540 and Xeon D-1520. Both chips consist of two dies inside a single package. The main die uses a 14nm process and holds the CPU cores, L3 cache, DDR3 and DDR4 memory controllers, networking controller, PCI-E 3.0, and USB 3.0, while a secondary die built on a larger (but easier to implement) manufacturing process hosts the higher latency I/O that would traditionally sit on the southbridge, including SATA, PCI-E 2.0, and USB 2.0.

In all, a fairly typical SoC setup from Intel. The specifics are where things get interesting, however. At the top end, Xeon D offers eight Broadwell-based CPU cores (with Hyper-Threading for 16 total threads) clocked at 2.0 GHz base and 2.5 GHz max all-core Turbo (2.6 GHz on a single core). The cores are slightly more efficient than Haswell's, especially in this low power setup. The eight cores can tap into 12MB of L3 cache as well as up to 128GB of registered ECC memory (or 64GB unbuffered and/or SODIMMs) in DDR3 1600 MHz or DDR4 2133 MHz flavors. Xeon D also features 24 PCI-E 3.0 lanes (which can be split into links as narrow as x4, up to six of them, or configured as x16+x8, among other options), eight PCI-E 2.0 lanes, two 10GbE connections, six SATA III 6.0 Gbps ports, four USB 3.0 ports, and four USB 2.0 ports.

[Image: Intel Xeon D server processor performance]

All of this hardware is rolled into a part with a 45W TDP. Needless to say, this is a new level of efficiency for Xeons! Intel chose to compare the new chips to its Atom C2000 "Avoton" (Silvermont-based) SoCs, which were also aimed at low power servers and related devices. According to the company, Xeon D offers up to 3.4 times the performance and 1.7 times the performance-per-watt of the top end Atom C2750 processor. Keeping in mind that Xeon D uses approximately twice the power of the Atom C2000, that still looks good for Intel, since you are getting more than twice the performance and a more power efficient part.
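
Intel's two headline numbers are consistent with that rough 2x power figure: dividing the claimed performance ratio by the claimed performance-per-watt ratio gives the implied power ratio. A quick sketch of that arithmetic (the 3.4x and 1.7x values are Intel's claims versus the Atom C2750, not independent measurements):

```python
# performance-per-watt ratio = performance ratio / power ratio
# => power ratio = performance ratio / performance-per-watt ratio
perf_ratio = 3.4           # Xeon D vs. Atom C2750, per Intel
perf_per_watt_ratio = 1.7  # Xeon D vs. Atom C2750, per Intel

print(perf_ratio / perf_per_watt_ratio)  # 2.0 -- roughly twice the power for 3.4x the performance
```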

Further, while the TDPs are much higher than Atom's, Intel has packed Xeon D with a slew of power management technology, including Integrated Voltage Regulation (IVR), an energy efficient turbo mode that analyzes whether increased frequencies actually help get work done faster (and, if not, reduces turbo to allow extra power to be used elsewhere on the chip or to simply reduce wasted energy), and optional "hardware power management" that allows the processor itself to determine the appropriate power and sleep states independently of the OS.

Being a server part, Xeon D supports ECC, PCI-E Non-Transparent Bridging, memory and PCI-E checksums, and corrected (errata-free) TSX instructions.

Ars Technica notes that Xeon D is strictly single socket and that Intel has reserved multi-socket servers for its higher end and more expensive Xeons (Haswell-EP). Where does the "high density" I mentioned come from, then? By cramming as many Xeon D SoCs, each with its own RAM and I/O on a small motherboard, into rack mounted cases as possible, of course! It is hard to say just how many Xeon Ds will fit in a 1U, 2U, or even 4U system without seeing the associated motherboards and networking hardware, but Xeon D should fare better than Avoton here thanks to its higher bandwidth networking links and additional PCI-E lanes. That said, AMD, with SeaMicro's Freedom Fabric and a head start on low power x86 and ARM-based Opteron research, as well as other ARM-based companies like AppliedMicro (X-Gene), will retain a slight density advantage (though the Intel chips will be faster per chip).

[Image: Intel Xeon Processor D product family server SoC]

Which brings me to my final point: Xeon D truly looks like a shot across both ARM's and AMD's bows. It seems Intel is not content with its dominant position in the overall server market and is putting its weight behind a move to take over the low power server market as well, a niche that ARM and AMD in particular have been actively pursuing. Intel has not quite reached the low power levels that AMD and the various ARM-based companies have, but by bringing Xeon down to 45W (with Atom-based solutions moving upward in performance), the Intel juggernaut is closing in, and I'm interested to see how it all plays out.

Right now, ARM still has the TDP and customization advantage (customers can create custom chips and cores to suit their exact needs), and AMD will be able to leverage its GPU expertise by including processor graphics for a leg up on highly multi-threaded GPGPU workloads. On the other hand, Intel has the better manufacturing process and a bigger engineering budget. Xeon D seems to be the first step towards going after a market that Intel has not really focused on in the past.

With Intel pushing its weight around, where will that leave the little guys that I have been rooting for in this low power high density server space?

Source: Intel

CES 2014: Valve And 13 Launch Partners Unveil Slew of Steam Machines

Subject: Editorial, General Tech | January 7, 2014 - 02:25 AM |
Tagged: valve, SteamOS, steambox, opinion, Gabe Newell, CES 2014, CES

Valve Co-Founder Gabe Newell took the stage at a press conference in Las Vegas last night to introduce SteamOS powered Steam Machines and the company's hardware partners for the initial 2014 launch. And it has been quite the launch thus far, with as many as 13 companies launching at least one Steambox PC.

The majority of Steam Machines are living room friendly Mini-ITX (or smaller) form factors, but that has not stopped other vendors from going all out with full-tower builds. The 13 hardware partners have all put their own spin on a SteamOS-powered PC, and by the second half of 2014 users will be able to choose from $500 SFF cubes, ~$1000 Mini-ITX builds with dedicated graphics, and powerhouse desktop PCs with multiple GPUs and MSRPs up to $6,000. In fact, aside from SteamOS and support for the Steam Controller, the systems do not share much else, offering up unique options, which is a great thing.

For the curious, the 13 Steam Machine hardware vendors are listed below.

  1. Alienware
  2. Alternate
  3. CyberPowerPC
  4. Digital Storm
  5. Falcon Northwest
  6. Gigabyte
  7. iBuyPower
  8. Materiel.net
  9. Next
  10. Origin PC
  11. Scan Computers
  12. Webhallen
  13. Zotac

As luck would have it for those eager to compare all of the available options, the crew over at Ars Technica has put together a handy table of the currently-known specifications and pricing of each company's Steam Machines! Some interesting takeaways from the chart include the almost even division between AMD and NVIDIA dedicated graphics, while Intel has a single graphics win with its Iris Pro 5200 (Gigabyte BRIX Pro). On the CPU side of things, Intel has the most design wins, with AMD taking as many as 3 versus Intel's 10 (in the best case scenario). The pricing is also interesting. While there are outliers offering very expensive and very affordable models, the majority of Steam Machines sit closer to the $1000 mark than to either the $500 or $2000+ price points. In other words, about the same amount of money as a mid-range DIY PC. This is not necessarily a bad thing, as users are getting decent hardware for their money, a free OS, and OEM warranty/support (and there is nothing stopping DIYers from making their own Steamboxes).

An SFF Steambox (left) from Zotac and a full-tower SteamOS gaming desktop from Falcon Northwest (right).

So far, I have to say that I'm more impressed than not with the Steam Machine launch, which has gone off better than I had expected. Here's hoping the hardware vendors are able to come through at the announced price points and that Valve is able to continue wrangling developer support (and to improve the planned game streaming functionality from a Windows box). If so, I think Valve and its partners will have a hit on their hands that will help bring PC gaming into the living room and, hopefully, put it at least on par in the mainstream perspective with the ever-popular game consoles (which are now using x86 PC architectures).

What do you think about the upcoming crop of Steam Machines? Does SteamOS have a future? Let us know your thoughts and predictions in the comments below!

PC Perspective's CES 2014 coverage is sponsored by AMD.

Source: Ars Technica

PC Per(sonal) Game of the Year 2011

Subject: Editorial, General Tech | January 8, 2012 - 02:36 AM |
Tagged: opinion, gaming, game of the year, fps, deus ex: human revolution, deus ex

I have to say that 2011 was a pretty good year for PC gaming. Sure, it wasn't without some drab moments; however, the diamonds in the rough more than made up for them. Those gems are the PC games that especially stood out for their quality and, most of all, for being awesomely fun.

Skyrim and Battlefield 3 have received a good deal of attention and praise among the PC Per staff, but I'm going to toss the guys a curve ball and name Deus Ex: Human Revolution as my personal favorite PC Game of the Year 2011. Here's why!


Well, first a bit of history. Growing up, I was always a big PC gamer (the earliest game I remember is Digger, on an actually-floppy floppy disk), but I somehow missed the first Deus Ex. I read about it year after year on various technology and gaming sites' "Greatest PC Games of All Time" lists, but somehow never picked it up. A couple of years ago, I saw it pop up in a Steam sale, so (naturally) I bought it and gave it a shot. There was a lot of hype behind it, and as such it had a lot to live up to.

[Image: Deus Ex screenshot]

Try taking away my equipment now!

Unfortunately, it never really came close to living up to all of the hype and praise that people gave it. That is not to say that it's a bad game, just that compared to the newer games I was used to, it wasn't my personal favorite. Judged against the games of its own time, it certainly is an interesting and relatively good release. Compared to the much improved graphics, controls, budgets, and hardware of years later (when I finally played it), however, it simply did not measure up. One of my main gripes was that the AI was not all that great and at some points had super vision that could spot me from a football field away in the shadows. Had I played it when it came out, with games of that era to compare it to (and the nostalgic love), I'm sure I would be among those singing its praises, but as a latecomer to the game I just wasn't that interested.

While PC gamers grew up with Deus Ex, I grew up with the Metal Gear Solid series, and I absolutely loved sneaking around and being Solid Snake. Sure, the story was a crazy one, but the same could be said for Deus Ex. That's not to say I didn't play PC games like Doom 3 (I went from Wolfenstein 3D straight to Doom 3, having grown up without constant internet access out in the boonies and without the money for a powerful GPU), but the games that stick in my mind as favorites and fond memories were things like MGS, whereas for others that place may be held by Deus Ex. I did find Deus Ex's story interesting, though, even if I wasn't too impressed with the graphics or gameplay, so when I read that a new Deus Ex was coming out and getting the proper PC attention it deserved (thanks, Nixxes!), I jumped at the chance to play it. Now that I have built a few PCs and have a decent video card, playing the new Deus Ex: Human Revolution was a no-brainer.


Enough vents to make even Solid Snake jealous!

To make what could be a long and spoiler-ific story about Deus Ex: HR short, the game is my personal game of the year because it made me feel like Solid Snake again, vents and all. The conversations, hacking, story, augments, and graphics were all really fun and memorable. From walking around the city and listening to the people, to punching through a wall to take out an enemy, to sneaking through the vents and reading all the emails, the game was immersive, and I found myself staying up all night not wanting to go to sleep in favor of breaking into my co-workers' offices and reading their email... umm, all in the name of keeping the office secure (some Head of Security I was!). Some of the comments I blogged just after completing the game include that, despite the long loading times, there are more vents than you can shake a stick at and being able to stealth around was very fun. More specifically, I noted:

"The emails and conversations that you overhear walking around the city are nice touches that help immersion. Some people have complained about the graphics quality; however, for what it is I find them to be very good. For a multi-platform game, it certainly runs well on the PC, which is an exceptional feat in this day and age!. . . . I have to say that it was an awesome experience!"

I'm sure that Ryan will disagree that Deus Ex: Human Revolution is the Game Of The Year, but for me it was very memorable, fun, and far exceeded my expectations. Now that I have found out about some cool sounding mods for the original Deus Ex, I may have to give it a second chance ;)

I invite the rest of the PC Per crew to share their personal Game of the Year as well as you, the readers! What was your favorite game this year, and why?

Runners-up included Saints Row: The Third, which is also crazy fun, and Portal 2, whose story was awesome but which came out so early in the year that it slipped my mind as GOTY.