Subject: General Tech, Graphics Cards | April 19, 2013 - 02:51 PM | Ryan Shrout
Tagged: raja koduri, apple, amd
Interesting information has surfaced today about the addition of a new executive at AMD. Raja Koduri, who previously worked for ATI and AMD as Chief Technology Officer, departed the company in 2009 for a four-year stint at Apple, helping to turn that company into an SoC powerhouse. Developing its own processors has enabled Apple to stand apart from the competition in many mobile spaces, and Koduri is partly responsible for the technological shift at Apple.
Starting on Monday though, Raja Koduri is officially back at AMD, taking over as the CVP (Corporate Vice President) of Visual Computing. This position will give him more complete control over the entirety of the hardware and software platforms AMD is developing, including desktop discrete, mobile, and APU/SoC designs. This marks the second major visionary executive to return to AMD in recent memory, the first being Jim Keller in August of 2012 (also returning from a period with Apple).
It will take some time for Koduri to have an effect on AMD's current roadmap
Having known Raja Koduri for quite a long time, I have always seen the man as an incredibly intelligent engineer who was able to find strengths in designs that others could not. Much of the success of the ATI/AMD GPU divisions during the 2000s was due to Koduri's leadership (among others of course), and I think having him back at AMD in an even more senior role is great news for both discrete graphics fans and APU users.
In a discussion with Koduri recently, AnandTech got some positive feedback for PC gamers:
Raja believes there’s likely another 15 years ahead of us for good work in high-end discrete graphics, so we’ll continue to see AMD focus on that part of the market.
Koduri sees 15 years more GPU evolution
So even though this hiring isn't going to change AMD's position on the APU and SoC strategy, it is good to have someone at the CVP level that sees the importance and value of discrete, high power GPU technology.
In many talks with AMD over the last 6 months we kept hearing about the healthy influx of quality personnel though much of it was still under wraps. Keller was definitely one of them and Koduri is another and both of the hires give a lot of hope for AMD as a company going forward. Some in the industry have already written AMD off but I find it hard to believe that this caliber of executive would return to a sinking ship.
Subject: General Tech, Graphics Cards, Displays | April 18, 2013 - 08:52 PM | Ryan Shrout
Tagged: video, seiki, se50UY04, hdtv, hdmi 1.4, displays, 4k, 3840x2160
This just in! We have a 4K TV in the PC Perspective Offices!
While we are still working on the ability to test graphics card performance at this resolution with our Frame Rating capture system, we decided to do a live stream earlier today as we unboxed, almost dropped and then eventually configured our new 4K TV.
The TV in question? A brand new SEIKI SE50UY04, a 50-in 3840x2160-capable display. Haven't heard of it? Neither have we. I picked it up over the weekend from TigerDirect for $1299, though the price is actually a bit higher now at $1499.
The TV itself is pretty unassuming and other than looking for the 4K label on the box you'd be hard pressed to discern it from other displays. It DID come with a blue, braided UHD-ready HDMI cable, so there's that.
One point worth noting is that the stand on the TV is pretty flimsy; there was definitely wobble after installation and setup.
Connecting the TV to our test system was pretty easy - only a single HDMI cable was required, and the GeForce GTX 680s in SLI we happened to have on our test bed recognized it as a 3840x2160-capable display. Keep in mind that you are limited to a refresh rate of 30 Hz though, due to current limitations of HDMI 1.4. The desktop was clear and sharp, and if you like screen real estate...this has it.
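As a rough illustration of why HDMI 1.4 caps a 3840x2160 signal at 30 Hz, here is a back-of-the-envelope bandwidth check (our own arithmetic, not from SEIKI or the HDMI spec text; it ignores blanking intervals, which only make the 60 Hz case worse):

```python
# HDMI 1.4 tops out at a 340 MHz TMDS clock across 3 data channels,
# with 8 usable data bits per 10-bit TMDS symbol -> ~8.16 Gbps of video data.
HDMI_1_4_DATA_RATE = 340e6 * 3 * 8

def raw_video_bitrate(width, height, refresh_hz, bits_per_pixel=24):
    """Active-pixel bitrate in bits/s, ignoring blanking overhead."""
    return width * height * refresh_hz * bits_per_pixel

for hz in (30, 60):
    needed = raw_video_bitrate(3840, 2160, hz)
    verdict = "fits" if needed <= HDMI_1_4_DATA_RATE else "exceeds"
    print(f"{hz} Hz: {needed / 1e9:.2f} Gbps needed -> {verdict} HDMI 1.4")
```

At 30 Hz the signal needs roughly 6 Gbps and fits; at 60 Hz it needs roughly 12 Gbps, which is why a higher-bandwidth link (what would later ship as HDMI 2.0) is required for 4K60.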
The first thing we wanted to try was some 4K video playback and we tried YouTube videos, some downloaded clips we found scattered across the Internet and a couple of specific examples I had been saving. Isn't that puppy cute? It was by far the best picture I had seen on a TV that close up - no other way to say it.
We did have issues with video playback in some cases due to high bit rates. In one case we had a YUV uncompressed file that was hitting our SSD so hard on read speeds that we saw choppiness. H.265, save us!
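To see why uncompressed 4K video can choke even an SSD, a quick estimate of the sustained read bandwidth such a file demands (our own numbers, not measurements from our test system):

```python
def yuv_bandwidth_mb_s(width, height, fps, bytes_per_pixel):
    """Sustained read rate in MB/s for uncompressed YUV playback."""
    return width * height * fps * bytes_per_pixel / 1e6

# YUV 4:2:0 stores 1.5 bytes per pixel; 4:2:2 stores 2 bytes per pixel.
print(yuv_bandwidth_mb_s(3840, 2160, 30, 1.5))  # ~373 MB/s
print(yuv_bandwidth_mb_s(3840, 2160, 30, 2.0))  # ~498 MB/s
```

Even at 30 FPS and 4:2:0 subsampling, that is in the neighborhood of a SATA SSD's sequential read ceiling, so any other disk activity during playback shows up as dropped or choppy frames.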
And of course we demoed some games as well - Battlefield 3, Crysis 3, Skyrim and Tomb Raider. Each was able to run at 3840x2160 without any complaints or INI hacks. They all looked BEAUTIFUL in still scenes, but we did notice some flickering on the TV that might be the result of the 120 Hz interpolation and possibly the "dynamic luminance control" feature that SEIKI has.
We'll definitely do more testing in the coming days to see if we can find a solution, as I know many PC gamers are going to be excited about the possibility of using this as a gaming display! We are working on a collection of benchmarks on some of the higher end graphics solutions like the GeForce TITAN, GTX 680s, HD 7990 and HD 7970s!
If you want the full experience of our unboxing and first testing, check out the live stream archived below!
Subject: Graphics Cards | April 18, 2013 - 04:15 PM | Jeremy Hellstrom
Tagged: asus, HD 7850 DirectCU II
With a custom cooler, one DisplayPort, two DVI-I connectors, and one HDMI connector, and requiring only a single 6-pin PCIe power connector, the ASUS HD 7850 DirectCU II is a great blend of efficiency and flexibility for those looking for a card that costs around $200. On the other hand, if you have no plans to overclock the card, the GTX 660 that [H]ard|OCP compared this card to is slightly more powerful, costs the same, and is a better choice for those planning on running dual GPUs. Check out the overclocked performance of this HD 7850 in the full review.
"ASUS has refreshed its AMD Radeon HD 7850 DirectCU II video card with DirectCU and DIGI+ VRM with Super Alloy Power, poised to give you a robust video card with an improved overclocking experience. We will see whether this new revision brings new value to the Radeon HD 7850 GPU and we will compare it to the NVIDIA GeForce GTX 660."
Here are some more Graphics Card articles from around the web:
- PowerColor Radeon HD 7850 PCS+ Review @ OCC
- Asus Radeon HD 7790 DirectCU II OC 1GB @ eTeknix
- XFX Radeon HD 7790 Black Edition OC 1GB @ eTeknix
- XFX R7790 Black Edition OC @ LanOC Reviews
- Club3D Radeon HD 7790 13Series 1GB @ eTeknix
- GIGABYTE Radeon HD 7790 1GB & NVIDIA GeForce GTX 650 Ti BOOST 2GB Review @ Techgage
- ASUS Radeon HD 7790 DirectCU II OC Overclocked @ Tweaktown
- 23 AMD Radeon HD 7870 / 7950 and Nvidia GeForce GTX 660 / 660 Ti graphics card round-up @ Hardware.info
- AMD Radeon Gallium3D More Competitive With Catalyst On Linux @ Phoronix
- Diamond BizView 750 @ LanOC Reviews
- History of the GPU, Part 3: The Nvidia vs. ATI era begins @ Techspot
- History of the Modern Graphics Processor, Part 4: The GPGPU era arrives @ TechSpot
- iXBT Labs Review: i3DSpeed, March 2013
- ASUS GTX670 DirectCU Mini OC @ Hardware.info
- NVIDIA GeForce GTX 650 Ti Boost Roundup @ Hardware Canucks
- GTX 650 Ti Boost SLI @ LanOC Reviews
- Palit GeForce GTX 650 Ti Boost 2GB OC @ Tweaktown
- ZOTAC GeForce GTX 650 TI Boost Edition @ Modders-Inc
- EVGA GTX 650 Ti Boost SC 2 GB @ techPowerUp
- MSI GeForce GTX 650 Ti BOOST 2GB Twin Frozr Edition Review @ Hi Tech Legion
- MSI GTX 650Ti Boost Video Card Review @ Ninjalane
- Gainward GeForce GTX 650 Ti Boost GS Dual & GTX 660 GS Dual @ Legion Hardware
Subject: Graphics Cards | April 16, 2013 - 10:24 PM | Ryan Shrout
Tagged: nvidia, metro last light, Metro
Late this evening we got word from NVIDIA about an update to its game bundle program for GeForce GTX 600 series cards. Replacing the previously running Free to Play bundle, which included $50 in credit each for World of Tanks, Hawken, and Planetside 2, NVIDIA is moving back to AAA games with Metro: Last Light.
Metro: Last Light is the sequel to the 2010 surprise hit Metro 2033, and I am personally really looking forward to the game and to seeing how it can stress PC hardware like the first one did.
This bundle is only good for GTX 660 cards and above with the GTX 650 Ti sticking with the Free to Play $75 credit offer.
NVIDIA today announced that gamers who purchase an NVIDIA GeForce GTX 660 or above would also receive a copy of the highly anticipated Metro: Last Light, published by Deep Silver and the sequel to the multi-award-winning Metro 2033. Metro: Last Light will be available May 14, 2013 within the US and May 17, 2013 across Europe.
The deal is already up and running on Newegg.com but with the release date of Metro: Last Light set at May 14th, you'll have just about a month to wait before you can get your hands on it.
How do you think this compares to AMD's currently running bundle with Bioshock Infinite and more? Did NVIDIA step up its game this time around?
Subject: Graphics Cards | April 16, 2013 - 03:01 PM | Ryan Shrout
Tagged: vsync, stutter, smoothness, microstutter, frame rating, animation
We are running a poll in conjunction with our Frame Rating: Visual Effects of Vsync on Gaming Animation story that compares animation smoothness between fixed 30 FPS and 60 FPS captures and Vsync enabled versions.
If you haven't read the story linked above, these questions won't make any sense to you so please go read it and then stop back here to answer the polls!
Not a simple answer
After publishing the Frame Rating Part 3 story, I started to see quite a bit of feedback from readers and other enthusiasts, with many requests for information about Vsync and how it might affect the results we are seeing here. Vertical Sync is the fix for screen tearing, a common artifact seen in gaming (and other media) when the frame rendering rate doesn't match the display's refresh rate. Enabling Vsync forces the rendering engine to only display and switch frames in the buffer to match the vertical refresh rate of the monitor, or a divisor of it. So a 60 Hz monitor could only display frames at 16 ms (60 FPS), 33 ms (30 FPS), 50 ms (20 FPS), and so on.
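The quantization described above can be sketched in a few lines (a simplified model of our own, assuming a double-buffered 60 Hz display): with Vsync on, a frame's on-screen time is effectively rounded up to a whole number of refresh intervals.

```python
import math

REFRESH_MS = 1000 / 60  # one refresh interval on a 60 Hz panel, ~16.7 ms

def vsync_display_time(render_ms):
    """With Vsync enabled, a finished frame waits for the next refresh
    boundary, so its display time snaps up to a multiple of the interval."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for t in (10, 20, 40):
    print(f"rendered in {t} ms -> displayed for {vsync_display_time(t):.1f} ms")
```

A frame rendered in 20 ms is held for two refreshes (~33 ms), which is how a game averaging between 30 and 60 FPS ends up bouncing between those two display rates.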
Many early readers hypothesized that simply enabling Vsync would fix the stutter and runt issues that Frame Rating was bringing to light. In fact, AMD was a proponent of this fix, as many conversations we have had with the GPU giant trailed into the direction of Vsync as the answer to their multi-GPU issues.
In our continuing research on graphics performance, part of our Frame Rating story line, I recently spent many hours playing games on different hardware configurations and different levels of Vertical Sync. After this time testing, I am comfortable in saying that I do not think that simply enabling Vsync on platforms that exhibit a large number of runt frames fixes the issue. It may prevent runts, but it does not actually produce a completely smooth animation.
To be 100% clear - the issues with Vsync and animation smoothness are not limited to AMD graphics cards or even multi-GPU configurations. The situations we are demonstrating here present themselves equally on AMD and NVIDIA platforms and with single or dual card configurations, as long as all other parameters are met. Our goal today is only to compare a typical Vsync situation from either vendor to a reference result at 60 FPS and at 30 FPS; not to compare AMD against NVIDIA!
In our initial research with Frame Rating, I presented this graph on the page discussing Vsync. At the time, I left this note with the image:
The single card and SLI configurations with Vsync disabled look just like they did on previous pages, but the graph for GTX 680 SLI with Vsync on is very different. Frame times are only switching back and forth between 16 ms and 33 ms (60 and 30 instantaneous FPS) due to the restrictions of Vsync. What might not be obvious at first is that the constant shifting back and forth between these two rates (two refresh cycles with one frame, one refresh cycle with one frame) can actually cause more stuttering and animation inconsistencies than would otherwise appear.
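Why the alternating cadence feels worse than a steady rate can be shown with a toy calculation (our own sketch, with an arbitrary object speed): at a constant in-game velocity, alternating 16.7 ms and 33.3 ms frames produce uneven on-screen jumps even though the average frame rate looks healthy.

```python
# An object moving at constant speed, displayed with the back-and-forth
# frame times seen under Vsync: the per-frame movement alternates too.
SPEED = 600  # pixels per second (arbitrary illustration value)
frame_times_ms = [16.7, 33.3] * 4  # alternating 60/30 FPS cadence

steps = [SPEED * t / 1000 for t in frame_times_ms]
print(steps)  # on-screen movement alternates between ~10 px and ~20 px jumps
```

The eye tracks the object's expected constant motion, so the alternating short and long jumps register as judder even though no frames are dropped.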
Even though I had tested this out and could literally SEE that animation inconsistency, I didn't yet have a way to demonstrate it to our readers, but today I think we do.
The plan for today's article is going to be simple. I am going to present a set of three videos to you that show side by side runs from different configuration options and tell you what I think we are seeing in each result. Then on another page, I'm going to show you three more videos and see if you can pinpoint the problems on your own.
Subject: Graphics Cards | April 15, 2013 - 03:34 PM | Tim Verry
Tagged: rumor, nvidia, kepler, gtx 700, geforce 700, computex
Recent rumors seem to suggest that NVIDIA will release its desktop-class GeForce 700 series of graphics cards later this year. The new card will reportedly be faster than the currently-available GTX 600 series, but will likely remain based on the company's Kepler architecture.
According to the information presented during NVIDIA's GTC keynote, its Kepler architecture will dominate 2012 and 2013. It will then follow up with Maxwell-based cards in 2014. Notably absent from the slides are product names, meaning the publicly-available information at least leaves the possibility of a refreshed Kepler GTX 700 lineup in 2013 open.
Fudzilla further reports that NVIDIA will release the cards as soon as May 2013, with an official launch as soon as Computex. Having actual cards available for sale by Computex is a bit unlikely, but a summer launch could be possible if the new 700 series is merely a tweaked Kepler-based design with higher clocks and/or lower power usage. The company is rumored to be accelerating the launch of the GTX 700 series in the desktop space in response to AMD's heavy game-bundle marketing, which seems to be working well at persuading gamers to choose the red team.
What do you make of this rumor? Do you think a refreshed Kepler is coming this year?
Subject: Graphics Cards | April 14, 2013 - 07:59 PM | Tim Verry
Tagged: windforce, nvidia, gtx titan, gtx 680, gpu cooler, gigabyte
Earlier this week, PC component manufacturer Gigabyte showed off its new graphics card cooler at its New Idea Tech Tour event in Berlin, Germany. The new triple slot cooler is built for this generation's highest-end graphics cards. It is capable of cooling cards with up to 450W TDPs while keeping the cards cooler and quieter than reference heatsinks.
The Gigabyte WindForce 450W cooler is a triple slot design that combines a large heatsink with three 80mm fans. The heatsink features two aluminum fin arrays connected to the GPU block by three 10mm copper heatpipes. Gigabyte stated during the cooler's reveal that it keeps an NVIDIA GTX 680 graphics card 2°C cooler and 23.3 dB quieter during a Furmark benchmark run. Further, the cooler will allow high-end cards, like the GTX Titan, to achieve higher (stable) boost clocks.
ComputerBase.de was on hand at Gigabyte's event in Berlin to snap shots of the upcoming GPU cooler.
The company has not announced which graphics cards will use the new cooler or when it will be available, but a Gigabyte GTX 680 and a custom-cooled Titan seem to be likely candidates considering these cards were mentioned in the examples given in the presentation. Note that NVIDIA has prohibited AIB partners from putting custom coolers on the Titan thus far, but other rumored Titan graphics cards with custom coolers seem to suggest that the company will allow custom-cooled Titans to be sold at retail at some point. In addition to using it for the top-end NVIDIA cards, I think a GTX 670 or GTX 660 Ti GPU using this cooler would also be great, as it would likely be one of the quieter running options available (because you could spin the three 80mm fans much slower than the single reference fan and still get the same temps).
What do you think about Gigabyte's new 450W GPU cooler? You can find more photos over at Computer Base (computerbase.de).
Subject: Editorial, General Tech, Graphics Cards | April 14, 2013 - 02:22 AM | Scott Michaud
Tagged: never settle, never settle reloaded, amd, far cry 3
So when AMD reloaded their Never Settle bundles, they left an extra round in the barrel.
Some of my favorite games were given to me in a bundle with some piece of computer hardware. You might remember from the PC Perspective game night that I am a major fan of the Unreal Tournament franchise. My first Unreal Tournament game was an unexpected surprise when I purchased my first standalone GPU. My 166MHz Pentium computer also came bundled with Mechwarrior 2 and Wipeout.
As we discussed, AMD considers bundle offers a way to keep the software industry rolling forward. The quantity and quality of games participating in the recent Never Settle bundles certainly deserve credit where it is due. Bioshock: Infinite is a game that just about every PC gamer needs to experience, and there are about a half-dozen other great titles as a part of the promotion depending upon which card or cards you purchase.
As it turns out, AMD negotiated with Ubisoft and added Far Cry 3: Blood Dragon to their Never Settle bundle. The coolest part is that AMD will retroactively email codes for this new title to anyone who has redeemed a Never Settle: Reloaded code.
So if you have ever Reloaded your Never Settle in the past, check your email as apparently you can Never Settle your reloads again.
Subject: Graphics Cards | April 13, 2013 - 10:07 PM | Tim Verry
Tagged: radeon hd7790, powercolor, GCN, amd, 7790
PowerColor launched a new factory overclocked graphics card recently that is a revision of a previous model. The PowerColor HD7790 OC V2 is based on AMD’s Graphics Core Next (GCN) architecture and measures a mere 180 x 150 x 38mm.
The AMD Radeon HD 7790 GPU features 896 stream processors, 56 texture units, and 16 ROP units. The GPU is clocked at 1000 MHz base and 1030 MHz boost, while the 1GB of GDDR5 memory is clocked at the 6Gbps reference speed. PowerColor has fitted the overclocked card with an aluminum heatsink cooled by a single 8mm copper heatpipe and a 70mm fan.
The new card features two DL-DVI, one HDMI, and one DisplayPort video outputs. Its model number is AX7790-1GBD5-DHV2/OC. According to Guru3D, the new/revised card is priced at 120 pounds sterling. However, considering the currently available OC (non-V2) card is $150, the revised card is likely to come in around that price when it hits US retailers.
Also: If you have not already, read our latest Frame Rating article to see how the Radeon HD 7790 graphics card stacks up against the competition!