Subject: Graphics Cards, Processors | January 29, 2014 - 08:44 PM | Ryan Shrout
Tagged: video, nvidia, Intel, gt 630, APU, amd, A10-7850K, 7850k
The most interesting aspect of the new Kaveri-based APUs from AMD, in particular the A10-7850K part, is how they improve mainstream gaming performance. AMD has always claimed that these APUs eliminate the need for low-cost discrete graphics, and when we got the new APU in the office we ran a couple of quick tests to see how much validity there is to that claim.
In this short video we compare the A10-7850K APU against a combination of the Intel Core i3-4330 and GeForce GT 630 discrete graphics card in five of 2013's top PC releases. I think you'll find the results pretty interesting.
UPDATE: I've had some questions about WHICH of the GT 630 SKUs were used in this testing. Our GT 630 was this EVGA model that is based on 96 CUDA cores and a 128-bit DDR3 memory interface. You can see a comparison of the three current GT 630 options on NVIDIA's website here.
If you are looking for more information on AMD's Kaveri APUs you should check out my review of the A8-7600 part as well our testing of Dual Graphics with the A8-7600 and a Radeon R7 250 card.
Subject: General Tech | January 23, 2014 - 07:58 PM | Jeremy Hellstrom
Tagged: opengl, linux, amd, nvidia
If you are a Linux user who prefers OpenGL graphics, there is still a huge benefit to choosing NVIDIA over AMD. The tests Phoronix just completed show that the GTX 680, 770, and 780 all perform significantly faster than the R9 290, with even the older GTX 550 Ti and 650 GPUs outperforming AMD's best in some benchmarks. That said, AMD is making important improvements to its open source drivers, which is where it lags behind NVIDIA. The new RadeonSI Gallium3D driver for the HD 7000 series shows significant performance improvements when paired with the new 3.13 kernel; though it still falls a bit behind the Catalyst driver, it is now much closer to the proprietary driver's performance. For older cards the performance increase is nowhere near as impressive, but certain benchmarks do show this Gallium3D driver providing at least some improvement. It's a pity the Source engine isn't behaving properly during benchmarks, which is why no tests were run on Valve's games, but that should be solved in the near future.
"In new tests conducted last week with the latest AMD and NVIDIA binary graphics drivers, the high-end AMD GPUs still really aren't proving much competition to NVIDIA's Kepler graphics cards. Here's a new 12 graphics card comparison on Ubuntu."
Here is some more Tech News from around the web:
- Testing Out The Configurable TDP On AMD's Kaveri @ Phoronix
- Lenovo shares in trading halt ahead of 'disclosable transaction' @ The Register
- BT's breakneck broadband test hits unimaginable speeds over plain ol' fiber @ Engadget
- NETGEAR CES 2014 New Products Showcase @ Benchmark Reviews
- Symantec uncovers malware that uses Windows to infect Android devices @ The Inquirer
- Windows 8.1 update 'screenshots' leak: Metro apps popped into classic desktop taskbar @ The Register
- AMD starts year, checks watch, hurries out Warsaw Opterons @ The Register
- Luxa2 H5 Premium Car Phone Mount @ eTeknix
- Nvidia Grid – Is It The Future Of High Performance Computing? @ eTeknix
Subject: General Tech, Graphics Cards | January 23, 2014 - 08:29 AM | Scott Michaud
Tagged: ShadowPlay, nvidia, geforce experience
NVIDIA has been upgrading their GeForce Experience about once per month, on average. Most of their attention has been focused on ShadowPlay, their video capture and streaming service for DirectX-based games. GeForce Experience 1.8.1 brought streaming to Twitch and the ability to overlay the user's webcam.
Until this version, users could choose between "Low", "Medium", and "High" quality settings. GeForce Experience 1.8.2 adds "Custom", which allows manual control over resolution, frame rate, and bit rate. NVIDIA wants to make it clear: frame rate controls the number of images per second and bit rate controls the file size per second. Reducing the frame rate without adjusting the bit rate will result in a file of the same size (just with better quality per frame).
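A quick back-of-the-envelope sketch makes that relationship concrete. This is illustrative arithmetic only (the numbers are assumptions, not ShadowPlay encoder internals): the bit rate alone fixes the file size per second, while the frame rate just divides those bits across more or fewer images.

```python
def capture_stats(bitrate_mbps, fps, seconds):
    """Return (file size in MB, average bits available per frame)
    for a constant-bit-rate capture."""
    total_bits = bitrate_mbps * 1_000_000 * seconds
    file_size_mb = total_bits / 8 / 1_000_000
    bits_per_frame = bitrate_mbps * 1_000_000 / fps
    return file_size_mb, bits_per_frame

# One minute of capture at the same 10 Mbps bit rate, two frame rates:
size_60, per_frame_60 = capture_stats(10, 60, 60)
size_30, per_frame_30 = capture_stats(10, 30, 60)

print(size_60, size_30)              # 75.0 75.0 -> identical file size
print(per_frame_30 / per_frame_60)   # 2.0 -> each 30 fps frame gets twice the bits
```

Halving the frame rate leaves the minute-long file at the same 75 MB; the encoder simply has twice the bit budget for each frame, which is where the "better quality per frame" comes from.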
Also with this update, NVIDIA allows users to set a push-to-talk key. I expect this will be most useful for Twitch streaming in a crowded dorm or household. Only transmitting audio when you have something to say prevents someone else in the room from accidentally having their voice broadcast globally and instantaneously.
GeForce Experience 1.8.2 is available for download at the GeForce website. Users with a Fermi-based GPU will no longer be pushed GeForce Experience (because it really does not do anything for those graphics cards). The latest version can always be manually downloaded, however.
Subject: Graphics Cards | January 21, 2014 - 08:49 PM | Ryan Shrout
Tagged: rumor, nvidia, kepler, gtx titan black, gtx titan, gtx 790
How about some fresh graphics card rumors for your Tuesday afternoon? The folks at VideoCardz.com have collected information about two potential NVIDIA GeForce cards that are going to hit your pocketbook hard. If you thought the mid-range GPU market was crowded, wait until you see the changes NVIDIA might have for you soon on the high-end.
First up is the NVIDIA GeForce GTX TITAN Black Edition, a card that will reportedly have the same specifications as the GTX 780 Ti but with full-performance double precision floating point and a move from 3GB to 6GB of memory. The all-black version of the GeForce GTX 700-series cooler is particularly awesome looking.
Image from VideoCardz.com
The new TITAN would sport the same GPU as GTX 780 Ti, only TITAN BLACK would have higher double precision computing performance, thus more FP64 CUDA cores. The GTX TITAN Black Edition is also said to feature a 6GB memory buffer. This is twice as much as GTX 780 Ti, and it pretty much confirms we won't be seeing any 6GB Ti's.
The rest is pretty much well known, TITAN BLACK has 2880 CUDA cores, 240 TMUs and 48 ROPs.
VideoCardz.com says this will come in at $999. If true, this is a pure HPC play as the GTX 780 Ti would still offer the same gaming performance for enthusiasts.
Second, there looks to be an upcoming dual-GPU graphics card using a pair of GK110 GPUs that will be called the GeForce GTX 790. The specifications that VideoCardz.com says it has indicate that each GPU will have 2496 enabled CUDA cores and a smaller 320-bit memory interface, with 5GB designated for each GPU. Cutting back on the memory interface, shader counts, and even clock speeds would allow NVIDIA to manage power consumption at the targeted 300 watt level.
Image from VideoCardz.com
Head over to VideoCardz.com for more information about these rumors but if all goes as they expect, you'll hear about these products quite a bit more in February and March.
What do you think? Are these new $1000 graphics cards something you are looking forward to?
Subject: General Tech, Graphics Cards | January 20, 2014 - 09:19 AM | Scott Michaud
Tagged: maxwell, nvidia
Well, this is somewhat unexpected (and possibly wrong). Maxwell, NVIDIA's new architecture to replace Kepler, is said to appear in February in the form of a GeForce GTX 750 Ti. The rumors, which sound iffy to me, claim that this core will be produced at TSMC on a 28nm fabrication process and later transition to its 20nm lines.
As if the 700-series family tree was not diverse enough.
2013 may have been much closer than expected.
The Swedish site Sweclockers has been contacted by "sources" who claim that NVIDIA has already alerted partners to prepare for a graphics card launch. Very little information is given beyond that. They do not even have access to a suggested GM1## architecture code; they just claim that partners should expect a new video card on the 18th of February (what type of launch that will be is also unclear).
This also raises questions about why the mid-range card will come before the high-end. If the 28nm rumor is true, it could just be that NVIDIA did not want to wait until TSMC could fabricate its high-end part when it already had an architecture version that could be produced now. It could be as simple as that.
The GeForce GTX 750 Ti is rumored to arrive in February to replace the GTX 650 Ti Boost.
Subject: Displays | January 17, 2014 - 11:35 PM | Tim Verry
Tagged: vg248qe, nvidia, gaming, g-sync, DIY, asus
NVIDIA's new G-Sync variable refresh rate technology is slowly being rolled out to consumers in the form of new monitors and DIY upgrade kits that can be used to add G-Sync functionality to existing displays. The first G-Sync capable monitor to support the DIY upgrade kit path is the ASUS VG248QE, a 24" 1080p 144Hz TN panel. The monitor itself costs around $270, and you can now purchase a G-Sync DIY upgrade kit from NVIDIA for $199.
The upgrade kit comes with a replacement controller board, power supply, HDMI cable, plastic spudger, IO shields, and installation instructions. Users will need to take apart the VG248QE monitor, remove the old PCBs, and install the G-Sync board in their place. According to NVIDIA the entire process takes about 30 minutes, though if this is your first time digging into monitor internals it will likely take closer to an hour.
The NVIDIA G-Sync DIY kit below the ASUS VG248QE monitor.
For help with installation, NVIDIA has posted a video of the installation process on YouTube. If you find text and photos easier, you can follow the installation guides written up for PC Perspective by Allyn Malventano and reader Levi Kendall. Both DIY kit reviews stated that the process, while a bit involved, was possible for most gamers to perform with a bit of guidance.
You can order the DIY upgrade kit yourself from this NVIDIA page.
Alternatively, ASUS is also releasing an updated version of the VG248QE monitor with the G-Sync board pre-installed in the first half of this year. This updated G-Sync monitor will have an MSRP of $399.
With the G-Sync kit at $199, will you be going the DIY path or waiting for a new monitor with the technology pre-installed?
Read more about NVIDIA's G-Sync display technology at PC Perspective including first impressions, installation, and more!
Subject: General Tech | January 8, 2014 - 06:20 PM | Jeremy Hellstrom
Tagged: tom petersen, nvidia, g-sync, free sync, CES 2014, amd
AMD's "free sync" has been getting a lot of well-deserved attention at this year's CES; Ryan had a chance to see it in action, so if you haven't already, check out his look at AMD's under-reported and under-utilized feature. AMD missed an opportunity with this technology, one which NVIDIA picked up on with G-Sync. NVIDIA has responded to The Tech Report's comments from yesterday: Tom Petersen stated that while free sync may be an alternative on laptops, desktop displays are a different beast. They use different connections, and there is generally a scaler chip between the GPU and the display. Read his full comments here.
"AMD demoed its "free sync" alternative to G-Sync on laptops. Desktop displays are different, Nvidia says, and they may not support the variable refresh rate tech behind AMD's solution—at least not yet."
Here is some more Tech News from around the web:
- Globalfoundries names ex-Motorola CEO as new chief @ DigiTimes
- Micron renews DRAM pricing at Inotera @ DigiTimes
- Intel ditches McAfee brand: 'THANK GOD' shouts McAfee the man @ The Register
- CES 2014 EVGA Visit @ Hardware Asylum
- Shows HyperX PCIe SSD, Fury Memory, microDuo Flash Drive and More @ Legit Reviews
Subject: Graphics Cards, Displays | January 8, 2014 - 09:01 AM | Ryan Shrout
Tagged: pq321q, PQ321, nvidia, gsync, g-sync, CES 2014, CES, asus, 4k
Just before CES, Allyn showed you the process of modifying the ASUS VG248QE to support NVIDIA's G-Sync variable refresh rate technology. It wasn't the easiest mod we have ever done, but even users without a lot of skill will be able to accomplish it.
But at the NVIDIA booth at CES this year, the company was truly showing off G-Sync technology to its fullest. By taking the 3840x2160 ASUS PQ321Q monitor and modifying it with the same G-Sync module, we were able to see variable refresh rate support in 4K glory.
Obviously you can't see much from the photo above about the smoothness of the animation, but I can assure you that in person this looks incredible. In fact, 4K might be the perfect resolution for G-Sync to shine: running games at that high a resolution will definitely bring your system to its knees, dipping below that magical 60 Hz / 60 FPS rate. But when it does with this modified panel, you'll still get smooth game play and a tear-free visual experience.
The mod is actually using the same DIY kit that Allyn used in his story, though it likely has a firmware update for compatibility. Even with the interesting debate from AMD about support for VRR in the upcoming DisplayPort 1.3 standard, it's impossible not to see the ASUS PQ321Q in 4K with G-Sync and not instantly fall in love with PCs again.
Sorry - there are no plans to offer this upgrade kit for ASUS PQ321Q owners!
Follow all of our coverage of the show at http://pcper.com/ces!
DisplayPort to Save the Day?
During an impromptu meeting with AMD this week, the company's Corporate Vice President for Visual Computing, Raja Koduri, presented me with an interesting demonstration of a technology that allowed the refresh rate of a display on a Toshiba notebook to perfectly match with the render rate of the game demo being shown. The result was an image that was smooth and with no tearing effects. If that sounds familiar, it should. NVIDIA's G-Sync was announced in November of last year and does just that for desktop systems and PC gamers.
Since that November unveiling, I knew that AMD would need to respond in some way. The company had basically been silent since learning of NVIDIA's release but that changed for me today and the information discussed is quite extraordinary. AMD is jokingly calling the technology demonstration "FreeSync".
Variable refresh rates as discussed by NVIDIA.
During the demonstration, AMD's Koduri had two identical systems side by side, each based on a Kabini APU. Both were running a basic graphics demo of a rotating windmill. One was a standard software configuration while the other had a modified driver that communicated with the panel to enable variable refresh rates. As you likely know from our various discussions about variable refresh rates and NVIDIA's G-Sync technology, this setup results in a much better gaming experience, producing smoother animation on screen without the horizontal tearing associated with v-sync disabled.
Obviously AMD wasn't using the same controller module that NVIDIA is using on its current G-Sync displays, several of which were announced this week at CES. Instead, the internal connection on the Toshiba notebook was the key factor: Embedded DisplayPort (eDP) apparently has a feature to support variable refresh rates on LCD panels. This feature was included for power savings on mobile and integrated devices, as refreshing the screen without new content can be a waste of valuable battery resources. But, for performance and gaming considerations, this same feature can be used to drive a variable refresh rate that smooths out game play, as AMD's Koduri explained.
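To see why matching the refresh rate to the render rate helps, consider a toy timeline (the numbers below are assumptions for illustration, not a model of any actual driver or panel): on a fixed 60 Hz panel with v-sync, a finished frame must wait for the next ~16.7 ms scanout boundary, while a variable refresh panel presents each frame the moment it is done.

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # fixed 60 Hz panel: one scanout every ~16.7 ms

def vsync_present_times(render_times_ms):
    """With v-sync on a fixed-refresh panel, each finished frame waits
    for the next scanout boundary, and the GPU stalls until then."""
    presents, t = [], 0.0
    for render in render_times_ms:
        t += render
        # round up to the next refresh boundary
        t = math.ceil(t / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
        presents.append(t)
    return presents

def vrr_present_times(render_times_ms):
    """With a variable-refresh panel (G-Sync / 'free sync'), the panel
    scans out as soon as each frame finishes rendering."""
    presents, t = [], 0.0
    for render in render_times_ms:
        t += render
        presents.append(t)
    return presents

frames = [20.0, 20.0, 20.0]  # a steady 50 FPS render rate
print(vsync_present_times(frames))  # frames land at ~33.3, ~66.7, ~100 ms
print(vrr_present_times(frames))    # frames land at 20, 40, 60 ms
```

With v-sync, the steady 50 FPS render rate collapses to an effective 30 FPS on screen because every 20 ms frame misses one boundary and waits for the next; the variable refresh panel simply displays at the full 50 FPS.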
Subject: General Tech | January 8, 2014 - 05:30 AM | Ken Addison
Tagged: video, podcast, corsair, coolermaster, nvidia, Samsung, exynos, Allwinner, AX1500i
CES 2014 Podcast Day 3 - 01/07/14
It's time for podcast fun at CES! Join us as we talk about the third day of the show including exciting announcements from Corsair, Coolermaster, NVIDIA, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Josh Walrath, Allyn Malventano and Ken Addison
Program length: 49:24