Gigabyte Wants All Your Money for a 3-Way SLI Watercooled GTX 980 Setup

Subject: Graphics Cards | November 14, 2014 - 11:46 AM |
Tagged: sli, nvidia, N980X3WA-4GD, maxwell, GTX 980, gigabyte, geforce, 3-way

Earlier this week, a new product showed up on Gigabyte's website that has garnered quite a bit of attention. The GA-N980X3WA-4GD WaterForce Tri-SLI is a 3-Way SLI system with integrated water cooling powered by a set of three GeForce GTX 980 GPUs.

waterforce1.jpg

That. Looks. Amazing.

What you are looking at is a 3-Way closed loop water cooling system with an external enclosure to hold the radiators while providing a display full of information, including temperatures, fan speeds and more. Specifications on the Gigabyte site are limited for now, but we can infer a lot from them:

  • WATERFORCE: 3-WAY SLI Water Cooling System
  • Real-Time Display and Control
  • Flex Display Technology
  • Powered by NVIDIA GeForce GTX 980 GPU
  • Integrated with 4GB GDDR5 memory on a 256-bit memory interface (single card)
  • Features Dual-link DVI-I / DVI-D / HDMI / DisplayPort x3 (single card)
  • BASE: 1228 MHz / BOOST: 1329 MHz
  • System power supply requirement: 1200W (with six 8-pin external power connectors)

waterforce2.jpg

The GPUs on each card are your standard GeForce GTX 980 with 4GB of memory (we reviewed it here), though they are running at overclocked base and boost clock speeds, as you would hope with all that water cooling power behind them. You will need a 1200+ watt power supply for this setup, which makes sense considering the GPU horsepower you'll have access to.
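
For context, a quick back-of-the-envelope comparison against NVIDIA's reference GTX 980 clocks (1126 MHz base / 1216 MHz boost) shows how much headroom Gigabyte is claiming out of the box:

    $1228 / 1126 \approx 1.09 \qquad 1329 / 1216 \approx 1.09$

In other words, roughly a 9% factory overclock on both base and boost clocks before you touch any sliders yourself.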

Another interesting feature Gigabyte is listing is called GPU Gauntlet Sorting.

With GPU Gauntlet™ Sorting, the Gigabyte SOC graphics card guarantees the higher overclocking capability in terms of excellent power switching.

Essentially, Gigabyte is going to make sure that the GPUs on the WaterForce Tri-SLI are the best they can get their hands on, with the best chance for overclocking higher than stock.

waterforce3.jpg

The setup looks interesting: the radiators and fans will be in the external enclosure, with tubing passing into the system through a 5.25-in bay. It will need quick connect/disconnect points at either the GPU or radiator end to make that installation method possible.

waterforce4.jpg

Pricing and availability are still unknown, but don't expect to get it cheap. With the GTX 980 still selling for at least $550, you should expect something in the $2000 range or above with all the custom hardware and fittings involved.

Can I get two please?

Source: Gigabyte

NVIDIA GeForce GTX 960 Specifications Potentially Leaked

Subject: Graphics Cards | November 13, 2014 - 12:46 PM |
Tagged: nvidia, geforce, gtx 960, maxwell

It is possible that a shipping invoice fragment was leaked for the NVIDIA GeForce GTX 960. Of course, an image of text on a plain, white background is one of the easiest things to fake and/or manipulate, so take it with a grain of salt.

nvidia-gtx-960-shipping.jpg

The GTX 960 is said to have 4GB of RAM on a 256-bit bus, the same width as the GTX 970 and GTX 980. Its video outputs are listed as two DVI, one HDMI, and one DisplayPort, making this graphics card useful for just one G-Sync monitor per card. If I'm reading it correctly, it also seems to have a 993 MHz base clock (boost clock unlisted) and an effective 6008 MHz (1500 MHz actual) RAM clock. This is slightly below the 7 GHz (1750 MHz actual) of the GTX 970 and GTX 980 parts, but the card should also be significantly cheaper.
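
As a quick sanity check on those memory numbers: GDDR5 transfers data four times per command clock, so the 6008 MHz effective figure implies an actual clock of about 1502 MHz, and the rumored 256-bit bus would give the following peak bandwidth (versus 224 GB/s on the GTX 970/980):

    $1502~\mathrm{MHz} \times 4 \approx 6008~\mathrm{MT/s} \qquad 6008~\mathrm{MT/s} \times 256~\mathrm{bit} / 8 \approx 192~\mathrm{GB/s}$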

The GeForce GTX 960 is expected to retail in the low-$200 price point... some day.

Source: Reader Tip

Ubisoft Responds to Low Frame Rates in Assassin's Creed Unity

Subject: Graphics Cards | November 12, 2014 - 09:03 PM |
Tagged: Unity, ubisoft, assassin's creed

Over the last couple of days there have been a lot of discussions about the performance of the new Assassin's Creed Unity from Ubisoft on current generation PC hardware. Some readers have expressed annoyance that the game is running poorly, at lower than expected frame rates, across a wide range of image quality settings. I haven't published my results yet - we are working on a story comparing NVIDIA and AMD GPUs in Unity - but the truth is that this is occurring on GPUs from both sides.

For example, using a Core i7-3960X and a single GeForce GTX 980 4GB reference card, I see anywhere from 37 FPS to 48 FPS while navigating the crowded city of Paris at 1920x1080 and on the Ultra High preset. Using the Low preset, that frame rate increases to 65-85 FPS or so.

unity3.jpg

Clearly, those are lower frame rates at 1920x1080 than you'll find in basically any other PC game on the market. The accusation from some in the community is that Ubisoft is either doing this on purpose or doing it out of neglect, with inefficient code. I put some questions to the development team at Ubisoft and, though I only had a short time with them, the answers tell their side of the story.

Ryan Shrout: What in the Unity game engine is putting the most demand on the GPU and its compute resources? Are there specific effects or were there specific design goals for the artists that require as much GPU horsepower as the game does today with high image quality settings?

Ubisoft: Assassin’s Creed Unity is one of the most detailed games on the market and [contains] a giant, open world city built to the scale that we’ve recreated. Paris requires significant details. Some points to note about Paris in Assassin’s Creed Unity:

  • There are tens of thousands of objects visible on-screen, casting and receiving shadows.
  • Paris is incredibly detailed. For example, Notre-Dame itself is millions of triangles.
  • The entire game world has global illumination and local reflections.
  • There is realistic, high-dynamic range lighting.
  • We temporally stabilized anti-aliasing.

RS: Was there any debate internally about downscaling on effects/image quality to allow for lower end system requirements?

Ubisoft: We talked about this a lot, but our position always came back to us ensuring that Assassin’s Creed Unity is a next-gen only game with breakthrough graphics. With this vision, we did not degrade the visual quality of the game. On PC, we have several options for low-scaling, like disabling AA, decreasing resolution, and we have low options for Texture Quality, Environment Quality and Shadows.

RS: Were you looking forward or planning for future GPUs (or multi-GPU) that will run the game at peak IQ settings at higher frame rates than we have today?

Ubisoft: We targeted existing PC hardware.

RS: Do you envision updates to the game or to future GPU drivers that would noticeably improve performance on current generations of hardware?

Ubisoft: The development team is continuing to work on optimization post-launch through software updates. You’ll hear more details shortly.

Some of the features listed by the developer in the first answer - global illumination methods, high triangle counts, HDR lighting - can be pretty taxing on GPU hardware. I know there are people out there pointing out games that have similar feature sets and that run at higher frame rates, but the truth is that no two game engines are truly equal. If you have seen Assassin's Creed Unity in action you'll be able to tell immediately that the game is beautiful, stunningly so. Is it worth that level of detail for the performance levels achieved on current high-end hardware? Clearly that's the debate.

unity2.jpg

When I asked if Ubisoft had considered scaling back the game to improve performance, they clearly decided against it. The developer had a vision for the look and style of the game and they were dedicated to it; maybe to a fault from some gamers' viewpoint.

Also worth noting is that Ubisoft is continuing to work on optimization post-release; how much of an improvement game patches or driver updates actually deliver remains to be seen as we move forward. Some developers have a habit of releasing a game and simply abandoning it as it shipped - hopefully we will see more dedication from the Unity team.

So, if the game runs at low frame rates on modern hardware... what exactly is the complaint? I do believe that Ubisoft would have benefited from better performance at lower image quality settings. You can swap the settings for yourself in game: the quality difference between Low and Ultra High is noticeable, but not dramatically so. Again, this likely harkens back to Ubisoft's desire to maintain an artistic vision.

Remember that when Crysis 3 launched early last year, running at 1920x1200 at 50 FPS required a GTX 680, the top GPU at the time; and that was at the High settings. The Very High preset only hit 37 FPS on the same card.

PC gamers seem to be creating a double standard. On one hand, none of us want PC ports or games developed with consoles in mind that don't take advantage of the power of the PC platform. Games in the Call of Duty series are immensely popular but, until the release of Advanced Warfare, would routinely run at 150-200 FPS at 1080p on a modern PC. Crysis 3 and Assassin's Creed Unity are the opposite of that - games that really tax current CPU and GPU hardware, paving a way forward for future GPUs to be developed and NEEDED.

If you're NVIDIA or AMD, you should applaud this kind of work. Now I am more interested than ever in a GTX 980 Ti, or an R9 390X, to see how Unity will play, what Far Cry 4 will run at, or if Dragon Age Inquisition looks even better.

Of course, if we can get more performance from a better optimized or tweaked game, we want that too. Developers need to be able to cater to as wide a PC gaming audience as possible, but sometimes creating a game that can scale between running on a GTX 650 Ti and a GTX 980 is a huge pain. And with limited time frames and budgets, don't we want at least some developers to focus on visual quality rather than "dumbing down" the product?

Let me know what you all think - I know this is a hot-button issue!

UPDATE: Many readers in the comments are bringing up the bugs and artifacts within Unity, pointing to YouTube videos and whatnot. Those are totally valid complaints about the game, but don't necessarily reflect on the game's performance - which is what we were trying to target with this story. Having crashes and bugs in the game is disappointing, but again, Ubisoft and Assassin's Creed Unity aren't alone here. Have you seen the bugs in Skyrim or Tomb Raider? Hopefully Ubisoft will be more aggressive in addressing them in the near future. 

UPDATE 2: I also wanted to comment that even though I seem to be defending Ubisoft around the performance of Unity, my direct feedback to them was that they should enable modes in the game that allow it to play at higher frame rates and even lower image quality settings, even if they were unable to find ways to "optimize" the game's efficiency. So far the developer seems aware of all the complaints around performance, bugs, physics, etc. and is going to try to address them.

UPDATE 3: In the last day or so, a couple of other media outlets have posted anonymous information indicating that the draw call count in Assassin's Creed Unity is at fault for the poor performance of the game on PCs. According to this "anonymous" source, while the consoles have low-level API access to hardware that can accept and process several times the draw calls, DirectX 11 can only handle "7,000 - 10,000 peak draw calls." Unity apparently is "pushing in excess of 50,000 draw calls per frame" and thus is putting more pressure on the PC than it can handle, even with high end CPU and GPU hardware. The fact that these comments are "anonymous" is pretty frustrating as it means that even if they are accurate, they can't be taken as the truth without confirmation from Ubisoft. If this turns out to be true, then it would be a confirmation that Ubisoft didn't take the time to implement a DX11 port correctly. If it's not true, or only partially to blame, we are left with more meaningless finger-pointing.
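
To put those draw call figures in perspective, here is a tiny, self-contained C++ sketch that works out the average CPU-side budget per draw call at a given frame rate, using the (still unconfirmed) numbers quoted above. The actual cost of a DX11 draw call varies wildly with driver, state changes, and threading, so treat this purely as an illustration of why 50,000 calls per frame is a very different proposition than 10,000:

    #include <cstdio>

    // Rough per-draw-call CPU budget: if the submission thread must issue a
    // given number of draw calls every frame, how much time can each call
    // take, on average, before the frame rate target is blown?
    static double budgetMicroseconds(int drawCallsPerFrame, double targetFps) {
        const double frameTimeUs = 1'000'000.0 / targetFps;
        return frameTimeUs / drawCallsPerFrame;
    }

    int main() {
        const double targetFps = 30.0;                    // console frame rate target
        const int drawCallCounts[] = { 10'000, 50'000 };  // quoted "DX11 peak" vs. claimed Unity load

        for (int calls : drawCallCounts) {
            std::printf("%6d draw calls at %.0f FPS -> %.2f us of CPU time per call\n",
                        calls, targetFps, budgetMicroseconds(calls, targetFps));
        }
        return 0;
    }

At 10,000 calls the submission thread has about 3.3 microseconds per call; at 50,000 it has well under a microsecond to validate state and hand work to the driver, which is exactly the territory where low-overhead APIs like Mantle or the consoles' native interfaces start to matter.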

PCPer Live! Assassin's Creed Unity Game Stream Powered by NVIDIA!

Subject: General Tech, Graphics Cards | November 10, 2014 - 10:07 PM |
Tagged: video, Unity, pcper, nvidia, live, GTX 980, geforce, game stream, assassins creed

UPDATE: If you missed the live stream event: good news! We have it archived up on YouTube now and embedded below for your viewing pleasure!

Assassin's Creed Unity is shaping up to be one of the defining games of the holiday season, with visuals and game play additions that are incredible to see in person. Scott already wrote up a post that details some of the new technologies found in the game along with a video of the impressive detail the engine provides. Check it out!

To celebrate the release, PC Perspective has partnered with NVIDIA to host a couple of live game streams that will feature some multi-player gaming fun as well as some prizes to give away to the community. I will be joined by some new NVIDIA faces to take on the campaign in a cooperative style while taking a couple of stops to give away some hardware.

livelogo-unity.jpg

Assassin's Creed Unity Game Stream Powered by NVIDIA

5pm PT / 8pm ET - November 11th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

Here are some of the prizes we have lined up for those of you that join us for the live stream:

Another awesome prize haul!! How do you win? It's really simple: just tune in and watch the Assassin's Creed Unity Game Stream Powered by NVIDIA! We'll explain the methods to enter live on the air and anyone can enter from anywhere in the world - no issues at all!

So stop by Tuesday night for some fun, some gaming and the chance to win some goods!

unity1.jpg

unity2.jpg

unity3.jpg

Meet the new Maxwell STRIX

Subject: Graphics Cards | November 10, 2014 - 03:45 PM |
Tagged: asus, strix, GTX 970 STRIX DirectCU II OC, GTX 970, nvidia, maxwell

When ASUS originally kicked off their new STRIX line they gained popularity not only due to the decent overclock and efficient custom cooler but also because there was only a small price premium over the base model.  At a price of $400 on Amazon the card is in line with other overclocked models, though some base models can be up to $50 less.  [H]ard|OCP investigated this card to see what benefits you could expect from the model in this review, comparing it to the R9 290 and 290X.  Out of the box the card runs at a core clock of 1253-1266MHz with memory at 7GHz; with a bit of overvolting they saw a stable core of 1473-1492MHz and memory at 7.832GHz.

With the new price of the 290X dipping as low as $330, it makes for an interesting choice for GPU shoppers.  The NVIDIA card is far more power efficient and the fans operate at 0dB until the GPU hits 65C, which [H] did not see until after running at full load for a time; even then the highest their manually overclocked card hit was 70C.  On the other hand the AMD card costs $70 less and offers very similar performance.  It is always nice to see competition in the market.

1414970377FWgAjDtieB_1_9_l.jpg

"Today we examine ASUS' take on the GeForce GTX 970 video card. We have the ASUS GTX 970 STRIX DirectCU II OC video card today, and will break down its next-gen performance against an AMD Radeon R9 290 and R9 290X. This video card features 0dB fans, and many factors that improve its chance of extreme overclocking."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

AMD Gaming Evolved Goes Beyond Earth with Hawaii

Subject: General Tech, Graphics Cards | November 6, 2014 - 12:00 AM |
Tagged: radeon, r9 295x2, R9 290X, r9 290, R9, hawaii, civilization, beyond earth, amd

Why settle for space, when you can go Beyond Earth too (but only if you go to Hawaii)!

firaxis-civ-beyond-earth.jpg

The Never Settle promotion launched itself into space a couple of months ago, but AMD isn't settling for that. If you purchase a Hawaii-based graphics card (R9 290, R9 290X, or R9 295X2) then you will get a free copy of Civilization: Beyond Earth on top of the choice of three games (or game packs) from the Never Settle Space Gold Reward tier. Beyond Earth makes a lot of sense of course, because it is a new game that is also one of the most comprehensive implementations of Mantle yet.

AMD_GPU_web_iframe_Gold.png

To be eligible, the purchase would need to be made starting November 6th (which is today). Make sure that what you're buying is a "qualifying purchase" from "participating retailers", because that is a lot of value to miss in a moment of carelessness.

AMD has not specified an end date for this promotion.

Source: AMD

8 GB Variants of the R9 290X Coming This Month

Subject: General Tech, Graphics Cards | November 5, 2014 - 12:56 PM |
Tagged: radeon, R9 290X, R9, amd, 8gb

With the current range of AMD’s R9 290X cards sitting at 4 GB of memory, listings for an 8 GB version have appeared at an online retailer. As far back as March, Sapphire was rumored to be building an 8 GB variant. Those rumors were supposedly quashed last month by AMD and Sapphire. However, AMD has since confirmed the existence of the new additions to the series. Pre-orders have appeared online and are said to be shipping out this month.

amd-r9-290x-8gb-GX-353-SP_88860_600.jpg

Image Credit: Overclockers UK

With 8 GB of GDDR5 memory and price tags between $480 and $520, these new additions, as expected, do not come cheap. Compared to the 4 GB versions of the R9 290X line, which run about $160 less according to the online retailer, is it worth upgrading at this stage? For people using a single 1080p monitor, the answer is likely no. For those with multi-screen setups, or those with deep enough pockets to own a 4K display, however, the benefits may begin to justify the premium. Even at 4K, a single 8 GB R9 290X may not provide the best experience; a CrossFire setup would benefit more from the 8 GB bump, since a single GPU tends to run out of shading power before it runs out of memory.

AMD’s 8 GB R9 290Xs are currently available for preorder: a reference version for £299.99 + VAT (~$480) and a Vapor-X version for £324.99 + VAT (~$520). They are slated to ship later this month.
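
For what it's worth, those dollar figures appear to be straight conversions of the ex-VAT prices at roughly $1.60 to the pound (the approximate exchange rate at the time of writing); UK buyers will pay 20% VAT on top:

    $\pounds 299.99 \times 1.60 \approx \$480 \qquad \pounds 324.99 \times 1.60 \approx \$520$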

What, me jealous? Four weeks with SLI'd GTX 980s

Subject: Graphics Cards | October 31, 2014 - 03:45 PM |
Tagged: sli, nvidia, GTX 980

Just in case you need a reason to be insanely jealous of someone, [H]ard|OCP has just published an article covering what it is like to live with two GTX 980s in SLI.  The cards are driving three Dell U2410 24" 1920x1200 displays for a relatively odd resolution of 3600x1920, but apart from an issue with the GeForce Experience software suite the cards have no trouble displaying to all three monitors.  In their testing of Borderlands games they definitely noticed when PhysX was turned on, though like others [H] wishes that PhysX would abandon its proprietary roots.  When compared to the Radeon R9 290X CrossFire system the performance is very similar, but when you look at the heat, power and noise produced, the 980s are the clear winner.  Keep in mind a good 290X is just over $300 while the least expensive GTX 980 will run you over $550.

1414677298HAmmSaoZGr_1_1.jpg

"What do you get when you take two NVIDIA GeForce GTX 980 video cards, configure those for SLI, and set those at your feet for four weeks? We give our thoughts and opinions about actually using these GPUs in our own system for four weeks with focus on performance, sound profile, and heat generated by these cards."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Assassin's Creed Unity Has NVIDIA-exclusive Effects via GameWorks

Subject: General Tech, Graphics Cards | October 29, 2014 - 06:12 PM |
Tagged: ubisoft, assassin's creed

Ubisoft has integrated GameWorks into Assassin's Creed Unity, or at least parts of it. The main feature to be included is NVIDIA's Horizon Based Ambient Occlusion Plus (HBAO+), which is their implementation of ambient occlusion. This effect darkens areas that would otherwise be lit incorrectly, given the limitations of current global illumination techniques. Basically, it analyzes the scene's geometry to subtract some of the influence of "ambient light" in places where it is an unrealistic approximation (particularly in small crevices). This is especially useful for overcast scenes, where direct sunlight does not overwhelm the contribution of scatters and bounces.
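
As a rough illustration of the idea (this is a generic screen-space ambient occlusion formulation, not NVIDIA's exact HBAO+ math), the renderer estimates, per pixel, how much of the hemisphere above a surface point is blocked by nearby geometry and scales the ambient term down accordingly:

    $L_{ambient}(p) = k_a \, c_{ambient} \, (1 - AO(p)) \qquad AO(p) \approx \frac{1}{N} \sum_{i=1}^{N} V(p, \omega_i)$

Here V(p, ω_i) is 1 if a short search in sample direction ω_i (a horizon search in the depth buffer, in HBAO+'s case) finds an occluder and 0 otherwise. Crevices end up with a high occlusion factor and therefore receive less ambient light, which is exactly the darkening described above.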

The other features to be included are Temporal Anti-aliasing (TXAA), Percentage-Closer Soft Shadows (PCSS), and GeometryWorks Advanced Tessellation. TXAA and PCSS were both included in Assassin's Creed IV: Black Flag, alongside the previously mentioned HBAO+, so it makes sense that Ubisoft continues to use what worked for them. GeometryWorks is a different story. NVIDIA seems to claim that it is like DirectX 11 tessellation, but better suited for use alongside HBAO+ and PCSS.

unity2.jpg

Assassin's Creed Unity will be available on November 11th.

Source: NVIDIA

GeForce GTX 970 Coil Whine Concerns

Subject: Graphics Cards | October 28, 2014 - 12:09 PM |
Tagged: maxwell, GTX 970, geforce, coil whine

Coil whine is the undesirable effect of electrical components creating audible noise when operating. Let's look to our friends at Wikipedia for a concise and accurate description of the phenomenon:

Coil noise is, as its name suggests, caused by electromagnetic coils. These coils, which may act as inductors or transformers, have a certain resonant frequency when coupled with the rest of the electric circuit, as well as a resonance at which it will tend to physically vibrate.

As the wire that makes up the coil passes a variable current, a small amount of electrical oscillation occurs, creating a small magnetic field. Normally this magnetic field simply works to establish the inductance of the coil. However, this magnetic field can also cause the coil itself to physically vibrate. As the coil vibrates physically, it moves through a variable magnetic field, and feeds its resonance back into the system. This can produce signal interference in the circuit and an audible hum as the coil vibrates.

Coil noise can happen, for example, when the coil is poorly secured to the circuit board, is poorly damped, or if the resonant frequency of the coil is close to the resonant frequency of the electric circuit. The effect becomes more pronounced as the signal passing through the coil increases in strength, and as it nears the resonant frequency of the coil, or as it nears the resonant frequency of the circuit. Coil noise is also noticed most often when it is in the humanly audible frequency.

Coil noise is also affected by the irregularities of the magnetic material within the coil. The flux density of the inductor is affected by these irregularities, causing small currents in the coil, contaminating the original signal. This particular subset of coil noise is sometimes referred to as magnetic fluctuation noise or the Barkhausen effect. Coil noise can also occur in conjunction with the noise produced by magnetostriction.

Gamers that frequently upgrade their graphics cards may have witnessed this problem with a particular install, or you might have been one of the lucky ones who never had to deal with the issue. If your computer sits under your desk, in a loud room, or you only game with headphones, it's also possible that you just never noticed.

inductor.jpg

Possibly offending inductors?

The reason this comes up today is that reports are surfacing of GeForce GTX 970 cards from various graphics card vendors exhibiting excessive coil whine or coil noise. These reports are coming in from multiple forum threads around the internet, a collection of YouTube videos of users attempting to capture the issue and even official statements from some of NVIDIA's partners. Now, just because the internet is talking about it doesn't necessarily mean it's a "big deal" relative to the number of products being sold. However, after several Twitter comments and emails requesting we look into the issue, I thought it was pertinent to start asking questions.

As far as I can tell today, GTX 970 cards from multiple vendors including EVGA, MSI and Gigabyte all have users reporting issues and claims of excessive coil noise. For my part here, I have two EVGA GTX 970 cards and an MSI GTX 970, none of which are producing sound at what I would call "excessive" levels. Everyone's opinion of excessive noise is going to vary, but as someone who sits next to a desk-high test bed and hears hundreds of cards a year, I am confident I have a good idea of what to listen for.

We are still gathering data on this potential issue, but a few of the companies mentioned above have issued official or semi-official statements on the problem.

From MSI:  

The coil whine issue is not specific to 900 series, but can happen with any high end GPU and that MSI is looking in to ways to minimize the issue. If you still have concern regarding this issue, then please contact our RMA department.

From EVGA:

We have been watching the early feedback on GTX 970 and inductor noise very closely, and have actively taken steps to improve this. We urge anyone who has this type of concern to contact our support so we can address it directly.

From NVIDIA: 

We’re aware of a small percentage of users reporting excessive “coil whine” noises and are actively looking into the issue.

We are waiting for feedback from other partners to see how they plan to respond.

Since all of the GTX 970 cards currently shipping are non-reference, custom built PCB designs, NVIDIA's input on the problem is mostly one of recommendations. NVIDIA knows that it is their name and brand being associated with any noisy GeForce cards, so I would expect a lot of discussions and calls behind closed doors to make sure partners are addressing user concerns.

IMG_9794.JPG

Interestingly, the GeForce GTX 970 was the one card of this Maxwell release where all of NVIDIA's partners chose to go the route of custom designs rather than adopting the NVIDIA reference design. On the GTX 980, however, you'll find a mix of both and I would wager that NVIDIA's reference boards do not exhibit any above average noise levels from coils. (I have actually tested four reference GTX 980s without coil whine coming into play.) Sometimes offering all of these companies the option to be creative and to differentiate can back-fire if the utmost care isn't taken in component selection.

Ironically, the fix is simple: a little glue on those vibrating inductor coils and the problem goes away. But most of the components are sealed, making the simple fix a non-starter for the end user (and I wouldn't recommend trying it anyway). It does point to a lack of leadership from board manufacturers willing to skimp on hardware in a way that has made this a big enough issue that I am sitting here writing about it today.

As an aside, if you hear coil whine when running a game at 500-5000 FPS, I don't think that counts as being a major problem for your gaming. I have seen a video or two running a DX9 render test at over 4500 FPS - pretty much any card built today will make noises you don't expect when hitting that kind of performance level.
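
There's a rough way to see why uncapped menu screens and ancient render tests are the classic trigger: the GPU's load, and therefore the current through the VRM inductors, is modulated once per frame, and a few thousand frames per second puts that modulation squarely in the audible band (roughly 20 Hz to 20 kHz):

    $4500~\mathrm{FPS} \rightarrow 4.5~\mathrm{kHz} \qquad 500~\mathrm{FPS} \rightarrow 500~\mathrm{Hz}$

Cap the frame rate with V-Sync and the dominant per-frame modulation drops to 60 or 144 Hz territory, which is part of why whine that screams in a menu often fades once a real game is running.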

As for my non-official discussions on the topic with various parties, everyone continues to reiterate that the problem is not as widespread as some of the forum threads would have you believe. It's definitely higher than normal, and the public acknowledgements from EVGA and MSI basically confirm that, but one person told me the complaint and RMA levels are where they were expected to be considering the "massively fast sell out rates" the GTX 970 is experiencing.

Of course, AMD isn't immune to coil whine issues either. If you remember back to the initial launch of the Radeon R9 290X and R9 290, we had similar coil whine issues and experienced those first hand on reference card designs. (You can see a video I recorded of an XFX unit back in November of 2013 here.) You can still find threads on popular forums from that time period discussing the issue and YouTube never seems to forget anything, so there's that. Of course, the fact that previous card launches might have seen issues along the same line doesn't forgive the issue in current or later card releases, but it does put things into context.

So, let's get some user feedback; I want to hear from GTX 970 owners about their experiences to help guide our direction of research going forward.

Click here to take our short poll for GTX 970 owners!

Source: Various

Sony PS4 and Microsoft Xbox One Already Hitting a Performance Wall

Subject: General Tech, Graphics Cards | October 27, 2014 - 04:50 PM |
Tagged: xbox one, sony, ps4, playstation 4, microsoft, amd

A couple of weeks back a developer on Ubisoft's Assassin's Creed Unity was quoted as saying the team had decided to run both the Xbox One and the PlayStation 4 variants of the game at 1600x900 resolution "to avoid all the debates and stuff." Of course, the Internet exploded in a collection of theories about why that would be the case: were they paid off by Microsoft?

For those of us that focus more on the world of PC gaming, however, the following week an email into the Giantbomb.com weekly podcast from an anonymous (but seemingly reliable) developer on the Unity team raised even more interesting material. In this email, in addition to addressing other issues such as the value of pixel counts and the stunning visuals of the game, the developer asserted that we may have already peaked on the graphical compute capability of these two new gaming consoles. Here is a portion of the information:

The PS4 couldn’t do 1080p 30fps for our game, whatever people, or Sony and Microsoft say. ...With all the concessions from Microsoft, backing out of CPU reservations not once, but twice, you’re looking at about a 1-2 FPS difference between the two consoles.

What's hard is not getting the game to render but getting everything else in the game at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed a lot of optimizations for games from Ubisoft in the past, this is crazily optimized for such a young generation of consoles. This is really about to define the next-generation unlike any other game beforehand.

We are bound from the CPU because of AI. Around 50% of the CPU is used for the pre-packaged rendering parts..

unity1.jpg

So, if we take this anonymous developer's information as true - and this whole story is based on that assumption - then we have learned some interesting things.

  1. The PS4, the more graphically powerful of the two very similarly designed consoles, was not able to maintain a 30 FPS target when rendering at 1920x1080 resolution with Assassin's Creed Unity.
     
  2. The Xbox One (after giving developers access to more compute cycles previously reserved to Kinect) is within a 1-2 FPS mark of the PS4.
     
  3. The Ubisoft team see Unity as being "crazily optimized" for the architecture and consoles even as we just now approach the 1 year anniversary of their release.
     
  4. Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation, and thus the game is limited by the remaining 50% that is left to power the AI and everything else.

It would appear that just as many in the media declared when the specifications for the new consoles were announced, the hardware inside the Playstation 4 and Xbox One undershoots the needs of game developers to truly build "next-generation" games. If, as this developer states, we are less than a year into the life cycle of hardware that was planned for an 8-10 year window and we have reached performance limits, that's a bad sign for game developers that really want to create exciting gaming worlds. Keep in mind that this time around the hardware isn't custom built cores or using a Cell architecture - we are talking about very basic x86 cores and traditional GPU hardware that ALL software developers are intimately familiar with. It does not surprise me one bit that we have seen more advanced development teams hit peak performance.

unity2.jpg

If the PS4, the slightly more powerful console of the pair, is unable to render reliably at 1080p with a 30 FPS target, then unless the Ubisoft team is completely off their rocker in terms of development capability, the advancement of gaming on consoles would appear to be somewhat limited. Remember the specifications for these two consoles:

                   PlayStation 4           Xbox One
  Processor        8-core Jaguar APU       8-core Jaguar APU
  Motherboard      Custom                  Custom
  Memory           8GB GDDR5               8GB DDR3
  Graphics Card    1152 Stream Unit APU    768 Stream Unit APU
  Peak Compute     1,840 GFLOPS            1,310 GFLOPS
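
Those peak compute figures fall straight out of the shader counts and the GPU clocks (roughly 800 MHz for the PS4 and 853 MHz for the Xbox One, per commonly reported figures rather than the table above), counting a fused multiply-add as two floating point operations per stream processor per clock:

    $1152 \times 2 \times 0.800~\mathrm{GHz} \approx 1843~\mathrm{GFLOPS} \qquad 768 \times 2 \times 0.853~\mathrm{GHz} \approx 1310~\mathrm{GFLOPS}$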

The custom built parts from AMD both feature an 8-core Jaguar x86 architecture and either 768 or 1152 stream processors. The Jaguar CPU cores aren't high performance parts: single-threaded performance of Jaguar is less than the Intel Silvermont/Bay Trail designs by as much as 25%. Bay Trail is powering lots of super low cost tablets today and even the $179 ECS LIVA palm-sized mini-PC we reviewed this week. And the 1152/768 stream processors in the GPU portion of the AMD APU provide some punch, but a Radeon HD 7790 (now called the R7 260X), released in March of 2013, provides more performance than the PS4, and the Radeon R7 250X is faster than what resides in the Xbox One.

xboxonegpu.jpg

If you were to ask me today what kind of performance would be required from AMD's current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200).

Also note that if the developer is using 50% of the CPU resources for rendering computation and the remaining 50% isn't able to hold up its duties on AI, etc., we likely have hit performance walls on the x86 cores as well.

Even if this developer quote is 100% correct that doesn't mean that the current generation of consoles is completely doomed. Microsoft has already stated that DirectX 12, focused on performance efficiency of current generation hardware, will be coming to the Xbox One and that could mean additional performance gains for developers. The PS4 will likely have access to OpenGL Next that is due in the future. And of course, it's also possible that this developer is just wrong and there is plenty of headroom left in the hardware for games to take advantage of.

unity3.jpg

But honestly, based on my experience with these GPU and CPU cores, I don't think that's the case. If you look at screenshots of Assassin's Creed Unity and then look at the minimum and recommended specifications for the game on the PC, there is a huge discrepancy. Are the developers just writing lazy code and not truly optimizing for the hardware? It seems unlikely that a company the size of Ubisoft would choose this route on purpose, creating a console game that runs in a less-than-ideal state while also struggling on the PC version. Remember, there is almost no "porting" going on here: the Xbox One and PlayStation 4 share the same architecture as the PC now.

Of course, we might just be treading through known waters. I know we are a bit biased, and so is our reader base, but I am curious: do you think MS and Sony have put themselves in a hole with their shortsighted hardware selections?

UPDATE: It would appear that a lot of readers and commenters take our editorial on the state of the PS4 and XB1 as a direct attack on AMD and its APU design. That isn't really the case - regardless of which vendor's hardware is inside the consoles, had Microsoft and Sony still targeted the same performance levels, we would be in the exact same situation. An Intel + NVIDIA hardware combination could just as easily have been built to the same peak theoretical compute levels and would have hit the same performance wall just as quickly. MS and Sony could have prevented this by using higher performance hardware, selling the consoles at a loss out of the gate and preparing each platform for the next 7-10 years properly. And again, the console manufacturers could have done that with higher end AMD hardware, Intel hardware or NVIDIA hardware. The state of the console performance war is truly hardware agnostic.

AMD Catalyst 14.9.2 Beta for Civilization: Beyond Earth

Subject: Graphics Cards | October 26, 2014 - 02:44 AM |
Tagged: amd, driver, catalyst

So Ryan has been playing many games lately as a comparison between the latest GPUs from AMD and NVIDIA. Civilization: Beyond Earth is not the most demanding game on a GPU, though it is not trivial either, and from the processor's perspective it is a contender for the most CPU-demanding game around. It also has some of the most thought-out Mantle support of any title using the API, when used with the AMD Catalyst 14.9.2 Beta driver.

firaxis-civilization-beyond-earth.jpg

And now you can!

The Catalyst 14.9.2 Beta drivers support just about anything using the GCN architecture, from APUs (starting with Kaveri) to discrete GPUs (starting with the HD 7000 and HD 7000M series). Beyond enabling Mantle support in Civilization, the release also fixes some issues with Metro, Shadow of Mordor, Total War: Rome 2, Watch_Dogs, and other games.

Also, both AMD and Firaxis are aware of a bug in Civilization: Beyond Earth where the mouse cursor does not click exactly where it is supposed to, if the user enables font scaling in Windows. They are working on it, but suggest setting it to the default (100%) if users experience this issue. This could be problematic for customers with high-DPI screens, but could keep you playing until an official patch is released.

You can get 14.9.2 Beta for Windows 7 and Windows 8.1 at AMD's website.

Source: AMD

AMD Radeon R9 290X Now Selling at $299

Subject: Graphics Cards | October 24, 2014 - 03:44 PM |
Tagged: radeon, R9 290X, leaderboard, hwlb, hawaii, amd, 290x

When NVIDIA launched the GTX 980 and GTX 970 last month, it shocked the discrete graphics world. The GTX 970 in particular was an amazing performer and undercut the price of the Radeon R9 290 at the time. That is something that NVIDIA rarely does and we were excited to see some competition in the market.

AMD responded with some price cuts on both the R9 290X and the R9 290 shortly thereafter (though they refuse to call them that) and it seems that AMD and its partners are at it again.

r9290x1.jpg

Looking on Amazon.com today we found several R9 290X and R9 290 cards at extremely low prices. For example:

The R9 290X's primary competition in terms of raw performance is the GeForce GTX 980, currently selling for $549 and up, if you can find one in stock. That means NVIDIA has a hill of $250 to climb when going against the lowest priced R9 290X.

r92901.jpg

The R9 290 looks interesting as well:

Several other R9 290 cards are selling for upwards of $300-320, making them bone-headed decisions if you can get the R9 290X for the same or lower price. Consider that the GeForce GTX 970 is selling for at least $329 today (if you can find it) and you can see why consumers are paying close attention.

Will NVIDIA make any adjustments of its own? It's hard to say right now since stock of both the GTX 980 and GTX 970 is so hard to come by, and it's hard to imagine NVIDIA lowering prices as long as parts continue to sell out. NVIDIA believes that its branding and technologies like G-Sync make GeForce cards more valuable, and until they begin to see a shift in the market, I imagine they will stay the course.

For those of you that utilize our Hardware Leaderboard, you'll find that Jeremy has taken these prices into account and updated a couple of the system build configurations.

Source: Amazon.com

Who rules the ~$250 market? XFX R9 285 Black Edition versus the GTX 760

Subject: Graphics Cards | October 23, 2014 - 04:06 PM |
Tagged: xfx, R9 285 Black Edition, factory overclocked, amd

Currently sitting at $260, the XFX R9 285 Black Edition is a little less expensive than the ASUS ROG STRIKER GTX 760 and significantly more expensive than the ASUS GTX 760 DirectCU II card.  Those prices led [H]ard|OCP to set up a showdown to see which card provides the best bang for the buck, especially once they overclocked the AMD card to 1125MHz core and 6GHz RAM.  In the end it was a very close race: the performance crown did go to the R9 285 BE, but that performance comes at a premium, as you can get performance almost as good for $50 less.  Of course, both the XFX card and the STRIKER sell at a premium compared to cards with fewer features and a stock setup; you should expect the lower priced R9 285s to be closer in performance to the DirectCU II card.

1413885880S78ZQ7Hqqp_1_13_l.jpg

"Today we are reviewing the new XFX Radeon R9 285 Black Edition video card. We will compare it to a pair of GeForce GTX 760 based GPUs to determine the best at the sub-$250 price point. XFX states that it is faster than the GTX 760, but that is based on a single synthetic benchmark, let's see how it holds up in real world gaming."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

GeForce Game Ready Driver 344.48 WHQL

Subject: Graphics Cards | October 22, 2014 - 12:52 PM |
Tagged: whql, nvidia, GeForce 344.48

"Game Ready" for Lords of the Fallen, Civilization: Beyond Earth, and Elite: Dangerous. 

Grab it straight from NVIDIA or GeForce.com.

dsr-auto-enabled-in-geforce-experience-640px.jpg

What’s New in Version 344.48

Game Ready

Best gaming experience for Lords of the Fallen, Civilization: Beyond Earth, and Elite: Dangerous.

Gaming Technology

  • Supports Dynamic Super Resolution (DSR) on Kepler and Fermi-based desktop GPUs.

Software Modules

  • NVIDIA PhysX System Software - version 9.14.0702
  • NVIDIA GPU PhysX acceleration is available only on systems with GeForce 8-series and later GPUs with a minimum of 256 MB dedicated graphics memory.
  • NVIDIA GPU PhysX acceleration is not available if there is a non-NVIDIA graphics processor in the system, even if it is not used for rendering.
  • HD Audio Driver - version 1.3.32.1
  • CUDA - version 6.5
  • GeForce Experience - 16.13.56.0

Application Profiles

Added or updated the following profiles:

  • Assassin's Creed Unity – control panel FXAA disabled
  • Dead Rising 3 – SLI-Single profile added
  • Elite Dangerous – SLI profile added, control panel FXAA disabled
  • Escape Dead Island – SLI profile added
  • FIFA 15 – SLI-Single profile added
  • Lichdom: Battlemage– SLI profile added
  • Lords of the Fallen – SLI profile added
  • MechWarrior Online – DX11 SLI profile added
  • Monster Hunter Online Benchmark – SLI profile added
  • Ryse: Son of Rome – SLI profile added, stereo blocked
  • Sid Meier's Civilization: Beyond Earth – ambient occlusion (AO) profile added
  • Sleeping Dogs Definitive Edition – SLI profile added
  • The Crew – control panel FXAA disabled
  • The Vanishing of Ethan Carter – SLI profile added

3D Vision Profiles

Added or updated the following profiles:

  • Dead Rising 3 – Not Recommended
  • Strife – rated as Fair

3D Compatibility Mode Support

Support for 3D Compatibility Mode has been added for the following games:

  • Dead Rising 3 – rated as Excellent
  • Strife – rated as Excellent

Windows Vista/Windows 7/Windows 8/Windows 8.1 Fixed Issues

  • Make control panel option for MFAA visible in NVIDIA Control Panel only for non-SLI configurations.
  • Implement MFAA along with porting TSF filter to driver side shim.
  • Add SLI profile for Sleeping Dogs: Definitive Edition.
  • GeForce GTX 980, Windows 8.1: Occasionally, the first line in a displayed frame mistakenly has content from a prior rendered frame.
  • Need SLI profile for FIFA 15.
  • Having G-SYNC enabled with Oculus Rift drivers installed causes applications to crash while launching and sometimes causes the system to reboot.
  • Green screen when certain videos are played back in Media Player Classic Home Cinema.
  • Backport to r304_00 all missing changes to the FreeBSD installer.
  • Device does not start (error code 49) in certain OEM motherboards.
  • Assassin's Creed Unity, Windows 8: TDR crash after loading a level and playing a little on NVIDIA 7-series GPUs.
  • Windows 8.1: Significant drop off in performance with 3D Vision enabled in SLI in Tomb Raider, no repro with Windows 7.

Source: NVIDIA

PCPer Live! Borderlands: The Pre-Sequel Game Stream Powered by NVIDIA Part 2!

Subject: Editorial, Graphics Cards | October 21, 2014 - 07:45 PM |
Tagged: video, pcper, nvidia, live, GTX 980, geforce, game stream, borderlands: the pre-sequel, borderlands

UPDATE: It's time for ROUND 2!

UPDATE 2: You missed the fun for the second time? That's unfortunate, but you can relive the fun with the replay right here!

I'm sure like the staff at PC Perspective, many of our readers have been obsessively playing the Borderlands games since the first release in 2009. Borderlands 2 arrived in 2012 and once again took hold of the PC gaming mindset. This week marks the release of Borderlands: The Pre-Sequel, which as the name suggests, takes place before the events of Borderlands 2. The Pre-Sequel has playable characters that were previously only known to the gamer as NPCs and that, coupled with the new low-gravity game play style, should entice nearly everyone that loves the first-person, loot-driven series to come back.

To celebrate the release, PC Perspective has partnered with NVIDIA to host a couple of live game streams that will feature some multi-player gaming fun as well as some prizes to give away to the community. I will be joined once again by NVIDIA's Andrew Coonrad and Kris Rey to tackle the campaign in a cooperative style while taking a couple of stops to give away some hardware.

livelogo.jpg

Borderlands: The Pre-Sequel Game Stream Powered by NVIDIA Part 2

5pm PT / 8pm ET - October 21st

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

Here are some of the prizes we have lined up for those of you that join us for the live stream:

Holy crap, that's a hell of a list!! How do you win? It's really simple: just tune in and watch the Borderlands: The Pre-Sequel Game Stream Powered by NVIDIA! We'll explain the methods to enter live on the air and anyone can enter from anywhere in the world - no issues at all!

So stop by Tuesday night for some fun, some gaming and the chance to win some hardware!

global-landing-page-borderlands-presequal.jpg

2K_Borderlands_Pre-Sequel_AthenaToss_1stPerson.jpg

2K_Borderlands_Pre-Sequel_moonBandits.jpg

Gigabyte Packs Factory Overclocked GTX 970 GPU Into Mini ITX Card

Subject: Graphics Cards | October 21, 2014 - 06:42 PM |
Tagged: maxwell, nvidia, gaming, mini ITX, small form factor, GTX 970, GM204, gigabyte

Gigabyte has announced a new miniature graphics card based around NVIDIA's GeForce GTX 970 GPU. The upcoming card is a dual slot, single fan design that is even shorter than the existing GTX 970 graphics cards (which are fairly short themselves). Officially known as the GV-N970IXOC-4GD, the miniaturized GTX 970 will be available for your small form factor (Mini ITX) systems in November for around $330.

The new Mini ITX compatible graphics card packs a factory overclocked GeForce GTX 970 processor, 4GB of video memory, a custom PCB, and a custom WindForce-inspired cooler into a card that is smaller than any of the existing GTX 970 models. Gigabyte is using a custom design with a single 8-pin PCI-E power connector instead of the two 6-pin connectors from the reference design or the 6-pin plus 8-pin from manufacturers like EVGA. The single power connector means less cabling to route (and fewer cables to attempt to hide, heh) and better small form factor PSU compatibility. The cooler is an aluminum fin array with three copper heatpipes paired with a single shrouded fan.

Gigabyte GTX 970 Factory Overclocked Mini ITX Graphics Card.png

The tiny card comes factory overclocked at 1076 MHz base and 1216 MHz boost, a respectable bump over the reference specifications. For reference, the GeForce GTX 970 processor is a 28nm chip using NVIDIA's GM204 "Maxwell" architecture with 1664 CUDA cores clocked at 1051 MHz base and 1178 MHz boost. It appears that Gigabyte has left the 4GB of GDDR5 untouched at 7.0 GT/s.

                     Gigabyte GTX 970 Mini ITX    Reference GTX 970
  CUDA Cores         1664                         1664
  Core (MHz)         1076                         1051
  Core (MHz) Boost   1216                         1178
  Memory             4GB                          4GB
  Memory Rate        7.0 GT/s                     7.0 GT/s
  Memory Width       256-bit                      256-bit
  Architecture       Maxwell                      Maxwell
  Process Node       28nm                         28nm
  PCI-E Power        1x 8-pin                     2x 6-pin
  DirectX Version    12.0                         12.0
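
Working from the table, the factory overclock is modest but real, and the untouched memory keeps bandwidth identical to the reference card:

    $1076 / 1051 \approx 1.024~(+2.4\%~\mathrm{base}) \qquad 1216 / 1178 \approx 1.032~(+3.2\%~\mathrm{boost}) \qquad 7.0~\mathrm{GT/s} \times 256~\mathrm{bit} / 8 = 224~\mathrm{GB/s}$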

The display output on the miniature Gigabyte card differs slightly from the reference design with the addition of a DVI-D connection.

  • 3 x DisplayPort
  • 1 x HDMI
  • 1 x DVI-I
  • 1 x DVI-D

According to Gigabyte, its custom cooler resulted in lower temperatures versus the reference design. The company claims that when running Metro: Last Light, the Mini ITX Gigabyte GTX 970 GPU ran at 62°C versus a reference design hitting 76°C running the same game. If true, the Gigabyte cooler is capable of keeping the card significantly cooler while taking up less space (though fan speeds and sound levels were not mentioned, nor compared to other custom coolers).

The small form factor friendly GTX 970 is coming next month with an MSRP of $329.99. Are you excited?

Source: Videocardz

The GTX 980 can reach very impressive frequencies

Subject: Graphics Cards | October 14, 2014 - 06:49 PM |
Tagged: GTX 980, nvidia, overclocking

[H]ard|OCP has had more time to spend with their reference GTX 980 and has reached the best stable overclock they could on this board without moving to third party coolers or serious voltage mods: 1516MHz core and 8GHz VRAM.  Retail models will of course offer different results; regardless, it is not too shabby a result.  This overclock was not easy to reach, and how they managed it and the lessons they learned along the way make for interesting reading.  The performance increases were noticeable; in most cases the overclocked card was beating the stock card by 25%, and since this was a reference card, retail cards with enhanced coolers and the possibility of a custom BIOS that disables NVIDIA's TDP/Power Limit settings could go even faster.  You can bet [H] and PCPer will both be revisiting the overclocking potential of GTX 980s.
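
The ~25% real-world gains [H] measured line up reasonably well with the raw clock increase over the reference GTX 980 (1126 MHz base / 1216 MHz boost, 7 GT/s memory), though clock scaling is never perfectly linear:

    $1516 / 1216 \approx 1.25 \qquad 8~\mathrm{GT/s} \times 256~\mathrm{bit} / 8 = 256~\mathrm{GB/s}~\mathrm{vs.}~224~\mathrm{GB/s}~\mathrm{stock}~(+14\%)$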

1412746309oHQVINIuLi_1_1.gif

"The new NVIDIA GeForce GTX 980 makes overclocking GPUs a ton of fun again. Its extremely high clock rates achieved when you turn the right dials and sliders result in real world gaming advantages. We will compare it to a GeForce GTX 780 Ti and Radeon R9 290X; all overclocked head-to-head."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

PCPer Live! Borderlands: The Pre-Sequel Game Stream Powered by NVIDIA

Subject: Editorial, Graphics Cards | October 13, 2014 - 10:28 PM |
Tagged: video, pcper, nvidia, live, GTX 980, geforce, game stream, borderlands: the pre-sequel, borderlands

UPDATE: You missed this week's live stream, but you can watch the game play via this YouTube embed!!

I'm sure like the staff at PC Perspective, many of our readers have been obsessively playing the Borderlands games since the first release in 2009. Borderlands 2 arrived in 2012 and once again took hold of the PC gaming mindset. This week marks the release of Borderlands: The Pre-Sequel, which as the name suggests, takes place before the events of Borderlands 2. The Pre-Sequel has playable characters that were previously only known to the gamer as NPCs and that, coupled with the new low-gravity game play style, should entice nearly everyone that loves the first-person, loot-driven series to come back.

To celebrate the release, PC Perspective has partnered with NVIDIA to host a couple of live game streams that will feature some multi-player gaming fun as well as some prizes to give away to the community. I will be joined by NVIDIA's Andrew Coonrad and Kris Rey to tackle the campaign in a cooperative style while taking a couple of stops to give away some hardware.

livelogo.jpg

Borderlands: The Pre-Sequel Game Stream Powered by NVIDIA

5pm PT / 8pm ET - October 14th

PC Perspective Live! Page

Need a reminder? Join our live mailing list!

Here are some of the prizes we have lined up for those of you that join us for the live stream:

Holy crap, that's a hell of a list!! How do you win? It's really simple: just tune in and watch the Borderlands: The Pre-Sequel Game Stream Powered by NVIDIA! We'll explain the methods to enter live on the air and anyone can enter from anywhere in the world - no issues at all!

So stop by Tuesday night for some fun, some gaming and the chance to win some hardware!

global-landing-page-borderlands-presequal.jpg

2K_Borderlands_Pre-Sequel_AthenaToss_1stPerson.jpg

2K_Borderlands_Pre-Sequel_moonBandits.jpg

Dr. Lisa Su Is AMD's New President and CEO

Subject: Graphics Cards, Processors | October 8, 2014 - 05:54 PM |
Tagged: amd

In an abrupt announcement, Rory Read has stepped down from his positions at AMD, leaving them to Dr. Lisa Su. Until today, Mr. Read served as president and Chief Executive Officer (CEO) of the x86 chip designer and Dr. Su as Chief Operating Officer (COO). Today however, Dr. Su has become president and CEO, and Mr. Read will stay on for a couple of months as an adviser during the transition.

amd-lisa-su.jpg

Josh Walrath, editor here at PC Perspective, tweeted that he was "Curious as to why Rory didn't stay on longer? He did some good things there [at AMD], but [it's] very much an unfinished job." I would have to agree. It feels like an odd time, hence the earlier use of the word "abrupt", to have a change in management. AMD restructured just four months ago, which was the occasion for Dr. Su to be promoted to COO. In fact, at least as far as I know, no one is slated to fill her former position as COO.

These points suggest that her taking over the company had been planned for at least several months.

 

Josh's Thoughts

I have been told that timing is everything.  I guess this rings true, but only if you truly know the circumstances around any action.  Today’s announcement by AMD was odd in its timing, but it was not exactly unexpected.  As Scott mentioned above, I was confused by this happening now.  I had expected Rory to be in charge for at least another year, if not two.  Rory had hinted that he was not planning on being at AMD forever, but was aiming to create a solid foundation for the company, help shore up its finances, and instill a new culture.  While the culture is turning due to pressure from up top as well as some pretty significant personnel cuts, AMD is not quite as nimble yet as they want to be.

Rory’s term has seen the return of seasoned veterans like Jim Keller and Raja Koduri.  These guys are helping to turn the ship around after some fairly mediocre architectures on the CPU and GPU sides.  While Raja had little to do with GCN, we are seeing some aggressive moves there in terms of features that are making their products much more competitive with NVIDIA.  Keller has made some very significant changes to the overall roadmap on the CPU side and I think we will see some very solid improvements in design and execution over the next two years.

Lisa Su was brought in by Rory shortly after he was named CEO.  Lisa has a pretty significant background in semiconductors and has made a name for herself in her work with IBM and Freescale.  Lisa attained all three of her degrees from MIT.  This is not unheard of, but it is uncommon to stay in one academic setting when gaining advanced degrees.  Having said that, MIT certainly is the top engineering and science school in the nation (if not the world).  I’m sure people from RPI, GT, and CalTech might argue that, but it certainly is an impressive school to have on your resume.

Dr. Su has seemingly been groomed for this transition for quite some time now.  She went from a VP to COO rather quickly, and is now shouldering the burden of being CEO.  Lisa has been on quite a few of the quarterly conference calls and taking questions.  She also serves on the Board of Directors at Analog Devices.

I think that Lisa will continue along the same path that Rory set out, but she will likely bring a few new wrinkles due to her experience with semiconductor design and R&D at IBM.  We can only hope that this won’t become a Dirk Meyer 2.0 type situation where a successful engineer and CPU architect could not change the course of the company after the disastrous reign of Hector Ruiz.  I do not think that this will be the case, as Rory did not leave the mess that Hector did.  I also believe that Lisa has more business sense and acumen than Dirk did.

This change, at this time, has created some instability in the market where AMD is concerned.  Some weeks ago AMD was at a near high for the year at around $4.66 per share.  Right now it is hovering at $3.28.  I was questioning why the stock price was going down, and it seems that my question has been answered.  One way or the other, rumors of Rory taking off reached investors’ ears and we saw a rapid decline in share price.  We have yet to see what Q3 earnings look like now that Rory has rather abruptly left his position, but people are pessimistic as to what will be announced with such a sudden departure.