Subject: Graphics Cards, Shows and Expos | January 9, 2014 - 01:40 PM | Ryan Shrout
Tagged: CES, CES 2014, msi, 290x, radeon, amd, Lightning, R9 290X
The MSI Lightning series of graphics cards continues to be one of the best high-end enthusiast lines available, as we have seen with our reviews of the MSI GeForce GTX 780 Lightning and the R7970 Lightning. At CES this week in Las Vegas, the company was showcasing the next card in the series, based on AMD's latest Hawaii GPU.
The MSI R9 290X Lightning features an updated triple-fan, heat-pipe cooler design that appears to be truly impressive. If the weight of the card is any indication, this GPU should run considerably cooler than most of the competition.
MSI has included a dual BIOS option and updated Military Class 4 components, but be prepared to sacrifice three slots of your motherboard to this monster. Power requirements are interesting: a pair of 8-pin power connectors and a single 6-pin connector, though the 6-pin is going to be optional.
The power of the card still comes from AMD's latest R9 290X Hawaii GPU, so you can be sure you'll have enough gaming power for just about any situation. We implored MSI to make sure that the overclocks of this card, probably in the 1050-1100 MHz range, are maintained consistently through extended game play to avoid any awkward variance discussions.
Follow all of our coverage of the show at http://pcper.com/ces!
Subject: Graphics Cards | January 8, 2014 - 08:25 PM | Josh Walrath
Tagged: triple fans, R9 290X, r9 290, powercolor, liquid cooling, cooling, CES 2014, amd
The nice folks at PowerColor were foolish enough to invite us into their suite full of video cards. Unhappily, we were unable to abscond with any of them, so we will have to settle for listing a few items here. PowerColor has a smaller US presence than other manufacturers, but they are not afraid to experiment with unique cooling solutions for their cards.
A sharp looking card that is remarkably heavy.
Cooling is provided by EKWB.
In their suite they were showing off two new products based on the AMD R9 290X chip. The first was actually released back in December 2013: the liquid cooling version of the AMD R9 290X. This little number comes in at a hefty $799, though when we think about it, that price really is not out of line. It features a very high-end liquid cooling block, and the card is extremely heavy and well built. The PCB looks like it mimics the reference design, but the cooling is certainly the unique aspect of this card.
Three fans are too much!
The display outputs are the same as the reference design, which is not a bad thing.
The second card is probably much more interesting to most users. This is a new cooling solution from PowerColor that attaches to the AMD R9 290X. The PCS+ cooler, which stands for Professional Cooling Systems, features three fans and is over two slots wide (we can joke about it being 2.5 slots wide, but I doubt anyone can use the half slot that is left over). The board again looks like it is based on the reference PCB, but the cooler is really where the magic lies. It should be able to compete with the other third-party coolers we have seen applied to this chip from AMD. As such, it should not only keep the clock speed at a steady state throughout testing and gaming, but also allow a measure of overclocking.
The back is protected/supported by a large and stiff plate. Cooling holes help maximize performance.
This card will be offered at $679 US and will be available on January 15. The number of units shipped will likely be fairly small, so keep an eye out. AMD is ultimately in charge of providing partners with chips to integrate into their respective products, and so far I think those numbers have been a little more limited than hoped. It also doesn't help that the market price has been inflated by all the coin miners who have been buying up the latest GCN-based AMD cards for the past several months.
There is no denying that this is a large cooler. Hopefully cooling performance will match or exceed that of the products Ryan has already reviewed.
We also expect to see the R9 290 version of this card in the same timeframe as the bigger, more expensive R9 290X. There should be more PowerColor content at PCPer over the next few months, so please stay tuned!
Follow all of our coverage of the show at http://pcper.com/ces!
Subject: Graphics Cards, Displays | January 8, 2014 - 04:01 AM | Ryan Shrout
Tagged: pq321q, PQ321, nvidia, gsync, g-sync, CES 2014, CES, asus, 4k
Just before CES Allyn showed you the process of modifying the ASUS VG248QE to support NVIDIA G-Sync variable refresh rate technology. It wasn't the easiest mod we have ever done, but even users without a lot of skill should be able to accomplish it.
But at the NVIDIA booth at CES this year the company was truly showing off G-Sync technology to its fullest capability. By taking the 3840x2160 ASUS PQ321Q monitor and modifying it with the same G-Sync module technology we were able to see variable refresh rate support in 4K glory.
Obviously you can't see much from the photo above about the smoothness of the animation, but I can assure you that in person this looks incredible. In fact, 4K might be the perfect resolution for G-Sync to shine: running games at that high a resolution will definitely bring your system to its knees, dipping below that magical 60 Hz / FPS rate. But when it does, this modified panel will still give you smooth gameplay and a tear-free visual experience.
The mod is actually using the same DIY kit that Allyn used in his story though it likely has a firmware update for compatibility. Even with the interesting debate from AMD about the support for VRR in the upcoming DisplayPort 1.3 standard, it's impossible to not see the ASUS PQ321Q in 4K with G-Sync and instantly fall in love with PCs again.
Sorry - there are no plans to offer this upgrade kit for ASUS PQ321Q owners!
Follow all of our coverage of the show at http://pcper.com/ces!
Subject: Graphics Cards, Cases and Cooling | January 6, 2014 - 04:00 PM | Morry Teitelman
Tagged: water cooling, ROG, nvidia, gtx 780, gtx 770, CES 2014, CES, asus
In keeping with their tradition of pushing innovation and performance boundaries through their ROG product line, ASUS today released NVIDIA-based 700-series video cards as part of their Poseidon line of liquid cooled products. ASUS released both a GTX 780 and a GTX 770-based product with the hybrid Poseidon cooling solution.
Courtesy of ASUS
All Poseidon series graphics cards come with a hybrid cooling solution, using a combination of fan-based and water-based cooling to propel these cards to new performance heights. The card's GPU is cooled by the DirectCU H2O cooler, with water routed through the integrated barbs to the copper-based block. The water inlets are threaded, accepting G1/4" male fittings. The memory and VRM components are cooled by a massive dual-fan heat-pipe cooler, exhausting air through the rear panel port. Both cards feature the red and black ROG coloration, with the Poseidon series name displayed prominently along the right front edge of the card.
Courtesy of ASUS
Both Poseidon series graphics cards, the ROG Poseidon GTX 770 and ROG Poseidon GTX 780, include ASUS' DIGI+ VRM and Super Alloy Power circuitry to ensure stability and component life under the most grueling conditions. When paired with the Poseidon cooling, ASUS claims the GPU runs 20% cooler (operating temperatures 16 C lower than the same reference solution) and three times quieter than a comparable reference card.
Follow all of our coverage of the show at http://pcper.com/ces!
Subject: General Tech, Graphics Cards | December 31, 2013 - 05:46 PM | Scott Michaud
Tagged: linux, iris pro, iris, Intel, haswell
'Tis the season to be sharing and giving (unless you are Disney).
According to Phoronix, Intel has shipped (heh heh heh, "boatload") over 5,000 pages of documentation and technical specs for their Haswell iGPUs, including the HD, Iris, and Iris Pro product lines. The intricacies of the 3D engine, GPGPU computation, and video acceleration are laid bare for the open source community. Video acceleration is something that is often omitted from the manuals of other companies.
Phoronix believes that Intel is, and has been, the best GPU vendor for open-source drivers. AMD obviously supports their open-source community but Intel edges them out on speed. The Radeon HD 7000 series is just beginning to mature, according to their metric, and Hawaii is far behind that. For NVIDIA, the Nouveau driver is still developed primarily by reverse-engineering, although NVIDIA did release some documentation a few months ago.
Of course all of these comparisons are only considering the open-source drivers.
NVIDIA prides itself on its proprietary driver offering, and AMD pretty much offers both for the user to choose between. Phoronix claims that Intel employs over two dozen open-source Linux graphics developers but, of course, that open-source driver is Intel's only graphics driver for Linux. That is not a bad thing: a launch open-source GPU driver can be identical to what would otherwise ship as a proprietary driver, just without slapping the wrists of anyone who tries to tweak it. It also makes sense for Intel, because community support will certainly do nothing but help their adoption.
If you would like to check out the documentation, it is available at Intel's 01.org.
Subject: Graphics Cards | December 31, 2013 - 03:13 PM | Jeremy Hellstrom
Tagged: amd, asus, ASUS ROG, MATRIX PLATINUM R9 280X
There is a lot to love about the ASUS ROG Matrix Platinum R9 280X, from its looks to its faster and quieter performance compared to the reference model. Still, [H]ard|OCP doesn't recommend running the fan at 100% except for brief moments when you need to dump heat quickly, as it is quite loud; 50-60% will keep you below 90C and is not very noticeable. It overclocks very well: they hit 1270MHz core and 6.6GHz VRAM easily and saw performance improvements when they did so, occasionally pushing past the GTX 770. There is one major disappointment with this card: the price is currently over 50% higher than the base 280X MSRP, so you might want to hold off for a while before thinking of purchasing this card.
"Today we take ASUS' elite R9 280X product, the ASUS ROG MATRIX PLATINUM R9 280X video card, and put its advanced power phasing and board components to the test as we harshly overclock it. We compare it to an ASUS GeForce GTX 770 DirectCU II. Then we will put it head to head against the overclocked SAPPHIRE TOXIC R9 280X."
Here are some more Graphics Card articles from around the web:
- Powercolor Radeon R9-290X Review @ Bjorn3D
- Gigabyte R9 270X Windforce OC @ Kitguru
- Sapphire R9 290X Tri-X OC @ Kitguru
- The $109 Console-killer GPU: AMD's Radeon R7 260 Graphics Card Reviewed @ Techgage
- AMD Catalyst 2013 Linux Graphics Driver Year-In-Review @ Phoronix
- Asus GTX 780 Ti DirectCU II OC @ Kitguru
- Zotac GeForce GTX 780 Ti AMP! Review @ Bjorn3D
Subject: Graphics Cards | December 30, 2013 - 02:32 PM | Ryan Shrout
Tagged: amd, Mantle, hawaii, BF4, battlefield 4
If you have been following the mess that has been Battlefield 4 since its release, what with the crashing on both PCs and consoles, you know that EA and DICE have decided that fixing the broken game is the number 1 priority. Gee, thanks.
While they work on that though, there is another casualty of development other than the pending DLC packs: AMD's Mantle version of the game. If you remember way back in September of 2013, along with the announcement of AMD's Hawaii GPUs, AMD and DICE promised a version of the BF4 game running on Mantle as a free update in December. If you are counting, that is just 1 more day away from being late.
Today we got this official statement from AMD:
After much consideration, the decision was made to delay the Mantle patch for Battlefield 4. AMD continues to support DICE on the public introduction of Mantle, and we are tremendously excited about the coming release for Battlefield 4! We are now targeting a January release and will have more information to share in the New Year.
Well, it's not a surprise but it sure is a bummer. One of the killer new features for AMD's GPUs was supposed to be the ability to use this new low-level API to enhance performance for PC games. As Josh stated in our initial article on the subject, "It bypasses DirectX (and possibly the hardware abstraction layer) and developers can program very close to the metal with very little overhead from software. This lowers memory and CPU usage, it decreases latency, and because there are fewer “moving parts” AMD claims that they can do 9x the draw calls with Mantle as compared to DirectX. This is a significant boost in overall efficiency."
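That "9x the draw calls" claim is really a statement about per-call CPU overhead. A rough back-of-the-envelope sketch makes the relationship concrete; the 0.01 ms cost per DirectX draw call below is an invented number for illustration only, not anything AMD or DICE has published:

```python
# Hypothetical illustration of AMD's "9x the draw calls" claim for Mantle.
# The 0.01 ms per-call cost is an assumed figure, used only for the arithmetic.
FRAME_BUDGET_MS = 1000.0 / 60            # ~16.67 ms per frame at 60 FPS

dx_overhead_ms = 0.01                    # assumed CPU cost per DirectX draw call
mantle_overhead_ms = dx_overhead_ms / 9  # AMD claims ~9x lower overhead

# How many draw calls fit in one frame's CPU budget?
dx_calls = FRAME_BUDGET_MS / dx_overhead_ms
mantle_calls = FRAME_BUDGET_MS / mantle_overhead_ms

print(f"DirectX: ~{dx_calls:.0f} draw calls per frame")
print(f"Mantle:  ~{mantle_calls:.0f} draw calls per frame "
      f"({mantle_calls / dx_calls:.0f}x)")
```

Whatever the real per-call cost turns out to be, cutting it by a constant factor multiplies the draw-call budget by the same factor, which is the whole pitch.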
It seems that buyers of AMD's R9 series graphics cards will need to wait at least another month to see what the promise of Mantle is really all about. Will the wait be worth it?
Subject: General Tech, Graphics Cards | December 24, 2013 - 04:15 AM | Scott Michaud
Tagged: R9 290X, r9 290, msi
MSI just announced their two customized Hawaii GPUs. One of the two new boards will be based on the R9 290 and the other based on the R9 290X. The design is based around the Twin Frozr IV Advanced two-fan model found on previous cards.
The specifications of the 290X version include three different modes: OC, Gaming, and Silent. Silent mode runs at 1000 MHz, the same clock speed as the reference model. In Gaming mode the card will run at 1030 MHz, and in OC mode it will clock at 1040 MHz. Obviously there will be some slight noise level variances between them, but I am pretty sure the difference between Gaming and OC mode will be negligible.
The R9 290 version has the same three settings, but the clock speeds are 947 MHz, 977 MHz and 1007 MHz respectively.
As a final note: MSI's press release claims "Available Now". It does not appear to be available on either Amazon or Newegg, but NCIX claims it is estimated to arrive February 26th, 2014 for $700. I seriously hope there are a few typos... maybe they meant December 26? Maybe they did not mean a more-than-$100 premium?
It is, unfortunately, still a wait and see game with these custom AIBs.
Subject: General Tech, Graphics Cards | December 19, 2013 - 07:23 PM | Scott Michaud
Tagged: amd, firepro, SPECviewperf
SPECviewperf 12 is a benchmark for workstation components that attempts to measure performance expected for professional applications. It is basically synthetic but is designed to quantify how your system can handle Maya, for instance. AMD provided us with a press deck of some benchmarks they ran leading to many strong FirePro results in the entry to mid-range levels.
They did not include high-end results, which they justify with the quote, "[The] Vast majority of CAD and CAE users purchase entry and mid-range Professional graphics boards". That slide, itself, was titled "Focusing Where It Matters Most". I will accept that, but I assume they ran those benchmarks too, and wonder whether it would have been better to simply include them.
The cards AMD compared are:
- Quadro 410 ($105) vs FirePro V3900 ($105)
- Quadro K600 ($160) vs FirePro V4900 ($150)
- Quadro K2000 ($425) vs FirePro W5000 ($425)
- Quadro K4000 ($763) vs FirePro W7000 ($750)
In each of the pairings, about as equally priced as possible, AMD held a decent lead throughout the eight tests included in SPECviewperf 12. You could see the performance gap leveling off as prices began to rise, however.
Obviously a single benchmark suite should be just one data-point when comparing two products. Still, these are pretty healthy performance numbers.
Subject: General Tech, Graphics Cards, Processors | December 19, 2013 - 04:05 AM | Scott Michaud
Tagged: Intel, haswell
In another review from around the net, Carl Nelson over at Hardcoreware tested the dual-core (4 threads) Intel Core i3-4340 based on the Haswell architecture. This processor slides into the $157 retail price point with a maximum frequency of 3.6GHz and an Intel HD 4600 iGPU clocked at 1150MHz. Obviously this is not intended as top-end performance but, of course, not everyone wants that.
Image Credit: Hardcoreware
One page which I found particularly interesting was the one which benchmarked Battlefield 4 rendering on the iGPU. The AMD A10 6790K (~$130) had a slightly lower 99th percentile frame time (characteristic of higher performance) but a slightly lower average frames per second (characteristic of lower performance). The graph of frame times shows that AMD is much more consistent than Intel. Perhaps big blue needs a little Frame Rating? I would be curious to see what is causing the pretty noticeable (in the graph, at least) stutter. AMD's frame pacing seems to be very consistent, albeit this is obviously not a Crossfire scenario.
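That split result, where one chip wins on 99th percentile frame time while losing on average FPS, is easy to demonstrate with toy numbers (the frame times below are invented for illustration, not Hardcoreware's data):

```python
# Toy frame-time sequences in milliseconds; invented for illustration.
steady = [20.0] * 99 + [22.0]        # consistent pacing, ~50 FPS average
spiky  = [16.0] * 94 + [50.0] * 6    # faster on average, but stutters

def avg_fps(frame_times_ms):
    """Average FPS = frames rendered / total seconds elapsed."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def percentile_99(frame_times_ms):
    """99th percentile frame time: 99% of frames finish at least this fast."""
    ordered = sorted(frame_times_ms)
    idx = int(0.99 * len(ordered)) - 1    # simple nearest-rank estimate
    return ordered[idx]

for name, data in (("steady", steady), ("spiky", spiky)):
    print(f"{name}: {avg_fps(data):.1f} avg FPS, "
          f"99th percentile {percentile_99(data):.1f} ms")
```

The spiky run posts the higher average FPS yet the much worse 99th percentile, which is exactly why a single average number can hide stutter that a frame-time graph makes obvious.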
If you are in the low-to-mid $100 price point be sure to check out his review. Also, of course, Kaveri should be coming next month so that is something to look out for.
Subject: General Tech, Graphics Cards | December 18, 2013 - 04:25 AM | Scott Michaud
Tagged: GeForce GTX 780 Ti, DirectCU II, asus
There have not been many custom coolers for top-end NVIDIA graphics cards as of late. Starting with the GeForce GTX 690, NVIDIA allegedly demands that AIB partners stick to the reference designs for certain models. Obviously, this is a problem as it limits the innovation realized by partners when they are forced to compete on fewer metrics (although the reference designs were pretty good regardless). This is especially unfortunate because the affected models are the upper high end, where pricing is more flexible if the product is worth it.
This is apparently not the case for the top-end GTX 780 Ti. ASUS has just announced the GeForce GTX 780 Ti DirectCU II graphics card. ASUS claims this will lead to 30% cooler operation with 3x less noise. A 6% bump in performance (as measured in Battlefield 4) will accompany that cooler and quieter operation, as the full GK110 GPU will boost to 1020MHz.
ASUS makes custom GPUs for both AMD and NVIDIA. Be sure to check out our review of another high-end DirectCU II card, with 100% less NVIDIA, very soon. It will definitely be a great read and maybe even an excellent podcast topic.
Subject: General Tech, Graphics Cards | December 17, 2013 - 05:02 PM | Scott Michaud
Tagged: nvidia, ShadowPlay, geforce experience
Another update to GeForce Experience brings another anticipated ShadowPlay feature. The ability to stream live gameplay to Twitch, hardware accelerated by Kepler, was demoed at the NVIDIA event in Montreal in late October. They showed Batman: Arkham Origins streaming at 1080p 60FPS without capping or affecting the in-game output settings.
GeForce Experience 1.8.1 finally brings that feature, in beta of course, to the general public. When set up, Alt + F8 will launch the Twitch stream and Alt + F6 will activate your webcam. Oh, by the way, one feature they kept from us (or at least me) is the ability to overlay your webcam atop your gameplay.
Nice touch NVIDIA.
Of course the upstream bandwidth requirements of video are quite high: 3.5Mbps on the top end, a more common 2Mbps happy medium, and a 0.75Mbps minimum. NVIDIA has been trying to ensure that your machine will not lag but there's nothing a GPU can do about your internet connection.
GeForce Experience 1.8.1 is available now at the GeForce website.
Subject: General Tech, Graphics Cards | December 13, 2013 - 05:43 PM | Scott Michaud
Tagged: webgl, ue4, UE3, asm.js
Asm.js's shortcoming is the difficulty and annoyance of hand-coding it (rather than compiling it from another language). The browser gains ground by encouraging the adoption of web standards and discouraging the usage of proprietary plugins. You can see where the politics can enter.
Still, it makes for great demos such as the cloth physics applet from James Long of Mozilla or, more amazingly, Unreal Engine 3. The upcoming UE4 is expected to be officially supported by Epic Games on asm.js (and obviously WebGL will be necessary too) but, of course, Epic will not prevent UE3 licensees from doing their own leg-work.
NomNom Games, a group within Trendy Entertainment (Trendy is known for Dungeon Defenders), became the first company to release a commercial 3D title on these standards. Monster Madness, powered by Unreal Engine 3, runs in web browsers like Mozilla Firefox and Google Chrome without plugins (although it will fall back to Flash 11.6 if your browser is unsupported for the web-native version). Monster Madness is a top-down, cel-shaded shoot-'em-up.
You can play, for free, with an anonymous token here. You can also visit their website to learn more about the closed beta for registered accounts. It is natively supported on Firefox, Chrome, and Opera. I am not entirely sure why IE11 is not supported, now that Microsoft supports WebGL, but there is probably a customer support or performance reason for it.
Subject: Graphics Cards | December 12, 2013 - 05:20 PM | Ryan Shrout
Tagged: video, amd, radeon, hawaii, r9 290, R9 290X, bitcoin, litecoin, mining
If you already listened to this week's PC Perspective Podcast, then feel free to disregard this post. For the rest of you - subscribe to our damned weekly podcast already, would you?!?
In any event, I thought it might be interesting to extract this 6-minute discussion from last night's live streamed podcast about how the emergence of Litecoin mining operations is driving up prices of GPUs, particularly the compute-capable R9 290 and R9 290X Hawaii-based cards from AMD.
Check out these prices currently on Amazon!
- Radeon R9 290X - $725+
- Radeon R9 290 - $499+
- Radeon R9 280X - $429+
- GeForce GTX 770 - $409+
- GeForce GTX 780 - $509+
- GeForce GTX 780 Ti - $699+
The price of the GTX 770 is a bit higher than it should be, while the GTX 780 and GTX 780 Ti are priced in the same range they have been for the last month or so. The same cannot be said for the AMD cards listed here: the R9 280X is selling for at least $130 more than its expected MSRP, and you'll see quite a few going for much higher on Amazon, eBay (thanks TR), and others. The Radeon R9 290 has an MSRP of $399 from AMD, but the lowest price we found on Amazon was $499, and everything on Newegg.com is showing at the same price, but sold out. The R9 290X is even more obnoxiously priced, when you can find one.
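Putting the markup in percentage terms makes the inflation easier to compare. The R9 290 MSRP is stated above and the R9 280X MSRP follows from the "$130 more" figure; the $549 R9 290X MSRP is recalled from launch coverage, so treat it as an assumption:

```python
# Street prices from the Amazon listings above vs. AMD suggested prices.
# The R9 290X MSRP ($549) is recalled from launch coverage, not this post.
cards = {
    "R9 290X": {"msrp": 549, "street": 725},
    "R9 290":  {"msrp": 399, "street": 499},   # MSRP stated in the post
    "R9 280X": {"msrp": 299, "street": 429},   # post: "$130 more than MSRP"
}

for name, p in cards.items():
    markup = 100.0 * (p["street"] - p["msrp"]) / p["msrp"]
    print(f"{name}: ${p['msrp']} MSRP -> ${p['street']}+ street (+{markup:.0f}%)")
```

On these numbers the 280X is the worst offender at over 40% above MSRP, with the 290 and 290X in the 25-32% range before you even hit the higher eBay listings.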
Do you have any thoughts on this? Do you think Litecoin mining is really causing these price inflations and what does that mean for AMD, NVIDIA and the gamer?
Subject: General Tech, Graphics Cards | December 11, 2013 - 05:58 PM | Scott Michaud
Tagged: frame pacing, frame rating, amd, southern islands, 4k, eyefinity, crossfire, microstutter
The frame pacing issue has been covered at our website for almost a year now. It stems from the original "microstutter" problem which dates back over a year before we could quantify it. We like to use the term "Frame Rating" to denote the testing methodology we now use for our GPU tests.
AMD fared worse in these tests than NVIDIA (although even NVIDIA had some problems in certain configurations). They have dedicated a lot of man-hours to the problem, resulting in driver updates for certain scenarios. Crossfire while utilizing Eyefinity or 4K MST was one area they did not focus on. The issue has been addressed in Hawaii, and AMD asserted that previous cards will get a software fix soon.
The good news is that we have just received word from AMD that they plan on releasing a beta driver for Southern Islands and earlier GPUs (AMD believes it should work for anything that's not "legacy"). As usual, until it ships anything could change, but it looks good for now.
The beta "frame pacing" driver addressing Crossfire with 4K and Eyefinity, for supported HD-series and Southern Islands-based Rx cards, is expected to be public sometime in January.
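To see what "frame pacing" actually measures, consider a crude sketch (this is a simplified illustration, not PC Perspective's capture-based Frame Rating pipeline): two runs can share the same average frame rate while one alternates short and long frames, the classic multi-GPU microstutter signature, which shows up as a large frame-to-frame delta.

```python
# Simplified microstutter illustration: both sequences average 25 ms/frame
# (40 FPS), but one alternates fast and slow frames. Numbers are invented.
paced    = [25.0, 25.0, 25.0, 25.0, 25.0, 25.0]
stuttery = [10.0, 40.0, 10.0, 40.0, 10.0, 40.0]

def mean(xs):
    return sum(xs) / len(xs)

def pacing_delta(frame_times_ms):
    """Average absolute difference between consecutive frame times."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return mean(deltas)

for name, data in (("paced", paced), ("stuttery", stuttery)):
    print(f"{name}: {mean(data):.0f} ms average frame time, "
          f"{pacing_delta(data):.0f} ms frame-to-frame delta")
```

An FPS counter reports both runs identically; only a pacing metric (or a frame-time graph) separates them, which is why the driver fix targets pacing rather than raw frame rate.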
Subject: General Tech, Graphics Cards | December 9, 2013 - 03:03 AM | Scott Michaud
Tagged: R9 290X, DirectCU II, asus
The AMD Radeon R9 290X is a very good graphics processor whose reference design is marred by a few famous design choices. AMD specs the GPU to run at a maximum of 95C, perpetually, and will push its frequency up to 1 GHz if it can stay at or under that temperature. Its cooler, in the typical "Quiet" default setting, is generally unable to keep this frequency for more than a handful of minutes. This led to countless discussions about what it means to be a default and what a component's actual specifications are.
All along we note that custom designs from add-in board (AIB) partners could change everything.
ASUS seems to be the first to tease their custom solution. This card, based on their DirectCU II design, uses two fans and multiple 10mm nickel-plated heatpipes directly atop the processor. The two fans should be able to move more air at a slower rate of rotation and thus be more efficient per decibel. The heatsink itself might also be able to pull heat away more quickly. I am hoping that ASUS provisioned the part to remain at a stable 1GHz under default settings, or perhaps even more!
The real test for Hawaii will be when the wave of custom editions washes ashore. We know the processor is capable of some pretty amazing performance figures when it can really open up. This, and other partner boards, would make for possibly the most interesting AIB round-up we have ever had.
No word, yet, on pricing or availability.
Subject: General Tech, Graphics Cards | December 5, 2013 - 03:17 AM | Scott Michaud
Tagged: linux, nvidia surround, eyefinity
Could four 1080p monitors be 4K on the cheap? Probably not... but keep reading.
Image Credit: Phoronix
Phoronix published an article for users interested in quad monitor gaming on Linux. Sure, you might think this is a bit excessive especially considering the bezel at the center of your screen. On the other hand, imagine you are playing a four player split-screen game. That would definitely get some attention. Each player would be able to tilt their screen out of the view of their opponents while only using a single computer.
In his 8-page editorial, Michael Larabel tests the official and popular open-source drivers for both AMD and NVIDIA. The winner was NVIDIA's proprietary driver, while the open-source solution, Nouveau, seemed to fare the worst of the batch. This is the typical trade-off with NVIDIA. It was only just recently that The Green Giant opened up documentation for the other chefs in the kitchen... so these results may change soon.
If you are interested in gaming on Linux, give the article a read.
Subject: General Tech, Graphics Cards, Motherboards | December 4, 2013 - 12:02 AM | Scott Michaud
Tagged: uppercase, msi, mini-itx
MSI is calling these products, "Mini, but Mighty". These components are designed for the mini-ITX form factor which is smaller than 7 inches in length and width. Its size makes it very useful for home theater PCs (HTPCs) and other places where discretion is valuable. You also want these machines to be quiet, which MSI claims this product series is.
The name is also written in full uppercase so you imagine yourself yelling every time you read it.
The MSI Z87I GAMING AC Motherboard comes with an Intel 802.11ac (hence, "GAMING AC", I assume) wireless adapter. If you are using a wired connection, it comes with a Killer E2205 Ethernet adapter from Qualcomm's BigFoot Networks (even small PCs can be BigFoot). Also included is an HDMI 1.4 output capable of 4K video (HDMI 1.4 is limited to 30Hz output at 2160p).
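The 30Hz ceiling at 2160p is straight bandwidth arithmetic: HDMI 1.4 tops out at a 340 MHz TMDS clock across three channels, which after 8b/10b encoding leaves roughly 8.16 Gbps for video. A quick check (ignoring blanking intervals, so real headroom is even tighter) shows why 4K at 60 Hz does not fit:

```python
# Rough bandwidth check for 4K over HDMI 1.4 (blanking intervals ignored).
# 340 MHz TMDS clock x 3 channels x 8 data bits per 10-bit symbol = 8.16 Gbps.
HDMI14_PAYLOAD_GBPS = 340e6 * 3 * 8 / 1e9

def video_gbps(width, height, hz, bpp=24):
    """Raw pixel bandwidth for a given mode at 24-bit color."""
    return width * height * hz * bpp / 1e9

for hz in (30, 60):
    needed = video_gbps(3840, 2160, hz)
    verdict = "fits" if needed <= HDMI14_PAYLOAD_GBPS else "does not fit"
    print(f"2160p{hz}: {needed:.1f} Gbps needed -> {verdict} in "
          f"{HDMI14_PAYLOAD_GBPS:.2f} Gbps")
```

4K30 needs about 6 Gbps and squeaks under the limit; 4K60 needs roughly twice that, which is why 2160p60 had to wait for HDMI 2.0.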
Good features to have, especially for an HTPC build.
The other launch is the GTX 760 GAMING ITX video card. This card is a miniature GeForce GTX 760 designed to fit in mini-ITX cases. If your box is a home theater PC, expect it to run just about any game at 1080p.
No information on pricing and availability yet. Check out the press release after the break.
Subject: General Tech, Graphics Cards, Processors | December 3, 2013 - 04:12 AM | Scott Michaud
Tagged: Kaveri, APU, amd
The launch and subsequent availability of Kaveri is scheduled for the CES time frame. The APU unites Steamroller x86 cores with several Graphics Core Next (GCN) cores. The high-end offering, the A10-7850K, is capable of 856 GFLOPs of compute power (most of which is of course from the GPU).
Image/Leak Credit: Prohardver.hu
We now know about two SKUs: the A10-7850K and the A10-7700K. Both parts are quite similar, except that the higher model is given a 200 MHz CPU bump, from 3.8 GHz to 4.0 GHz, and 33% more GPU units, from 6 to 8.
But how does this compare? The original source (prohardver.hu) claims that Kaveri will achieve an average 28 FPS in Crysis 3 on low at 1680x1050; this is a 12% increase over Richland. It also achieved an average 53 FPS with Sleeping Dogs on Medium which is 26% more than Richland.
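The quoted gains let us back out the implied Richland numbers, assuming the percentages are exact (a simple inversion of the claims above, not figures from the source):

```python
# Back out the implied Richland results from the quoted Kaveri gains.
kaveri_results = {
    "Crysis 3 (low, 1680x1050)": (28, 0.12),   # (avg FPS, gain vs. Richland)
    "Sleeping Dogs (medium)":    (53, 0.26),
}

for game, (kaveri_fps, gain) in kaveri_results.items():
    richland_fps = kaveri_fps / (1 + gain)     # invert the percentage gain
    print(f"{game}: Kaveri {kaveri_fps} FPS -> Richland ~{richland_fps:.0f} FPS")
```

That works out to roughly 25 and 42 FPS on Richland, so the new APU turns a borderline Crysis 3 result into something close to playable.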
These are healthy increases over the previous generation but do not even account for HSA advantages. I am really curious what will happen if integrated graphics become accessible enough that game developers decide to target it for general compute applications. The reduction in latency (semi-wasted time bouncing memory between compute devices) might open this architecture to where it can really shine.
We will do our best to keep you up to date on this part especially when it launches at CES.
Subject: General Tech, Graphics Cards | December 2, 2013 - 03:16 PM | Scott Michaud
Tagged: nvidia, ShadowPlay
They grow up so fast these days...
GeForce Experience is NVIDIA's software package, often bundled with their driver updates, to optimize the experience of their customers. This could be adding interesting features, such as GPU-accelerated game video capture, or just recommending graphics settings for popular games.
Version 1.8 adds many desired features lacking from the previous version. I always found it weird that GeForce Experience would recommend one good baseline setting for games, and set it for you, but force you to then go into the game and tweak from there. It would be nice to see multiple presets, but that is not what we get; instead, we are able to tweak the settings from within GeForce Experience. The baseline tries to provide a solid 40 FPS at the most computationally difficult moments. You can then tune the familiar performance and quality slider from there.
You are also able to set resolutions up to 3840x2160 and select whether you would like to play in windowed (including "borderless") mode.
Also, with ShadowPlay, Windows 7 users will be able to "shadow" the last 20 minutes like their Windows 8 neighbors. You will also be able to combine your microphone audio with the in-game audio should you select it. I can see the latter feature being very useful for shoutcasters. Apparently it allows capturing VoIP communication and not just your microphone itself.
Still no streaming to Twitch.tv yet, but it is coming.
For now, you can download GeForce Experience from NVIDIA's GeForce website. If you want to read a little more detail about it, first, you can check out their (much longer) blog post.