Subject: General Tech | October 7, 2013 - 01:46 PM | Jeremy Hellstrom
Tagged: nvidia, linux, microsoft, open source
If you haven't heard the accusations flying over the possible scenarios that led up to Origin PC dropping AMD cards from all of its machines, you can catch up at The Tech Report. They keep speculation to a minimum, unlike some other sites, but the key point is the claim of overheating and stability issues, something that apparently only Origin has encountered. Had Origin stuck to citing the frame pacing problems in CrossFire and at 4K/multimonitor resolutions, it would be understandable for them not to sell AMD cards in systems designed for that usage; dropping AMD altogether, however, is enough to start rumours and conspiracy theories across the interwebs. Winning a place in the Steam Machine was great for NVIDIA, but at no time did they imply that AMD was unworthy; AMD simply didn't win the contract.
Today some oil was tossed on the fire with the revelation that NVIDIA is specifically limiting the functionality of its hardware on Linux. Just after we praised their release of documentation for Nouveau, the open source driver, we find out from a post at The Inquirer that NVIDIA limits the number of monitors usable in Linux to three so as not to outdo the functionality available in Windows. For a brief moment it seemed that NVIDIA was willing to cooperate with the open source and Linux communities, but apparently that moment is all we will have, and once again NVIDIA proves that it is willing to bow to pressure from Microsoft.
"According to a forum poster at the Nvidia Developer Zone, the v310 version of the drivers for Basemosaic has reduced the number of monitors a user can connect simultaneously to three."
Here is some more Tech News from around the web:
- Cisco, Google and SAP may buy BlackBerry's bits: report @ The Register
- Toshiba unveils 'lightest and thinnest' workstation and a raft of business ultrabooks @ The Inquirer
- Microsoft claims its Surface 2 tablets are 'selling out' without spilling figures @ The Inquirer
- Down with Unicode! Why 16 bits per character is a right pain in the ASCII @ The Register
- NSA using Firefox flaw to snoop on Tor users @ The Register
- LED Costumes and Clothing @ Hack a Day
- Witnessing The League of Legends Season 3 World Championship Finals @ Legit Reviews
- OZONE Gaming Worldwide Joint Giveaway @ NikKTech
Introduction and Design
As we’re swimming through the veritable flood of Haswell refresh notebooks, we’ve stumbled across the latest in a line of very popular gaming models: the ASUS G750JX-DB71. This notebook is the successor to the well-known G75 series, which topped out at an Intel Core i7-3630QM with NVIDIA GeForce GTX 670MX dedicated graphics. Now, ASUS has jacked up the specs a little more, including the latest 4th-gen CPUs from Intel as well as 700-series NVIDIA GPUs.
Our ASUS G750JX-DB71 test unit features the following specs:
Of course, the closest comparison to this unit is the recently-reviewed MSI GT60-2OD-026US, which featured nearly identical specifications, apart from a 15.6” screen, a better GPU (a GTX 780M with 4 GB GDDR5), and a slightly different CPU (the Intel Core i7-4700HQ). In case you’re wondering what the difference is between the ASUS G750JX’s Core i7-4700MQ and the GT60’s i7-4700HQ, it’s very minor: the HQ features a slightly faster integrated graphics Turbo frequency (1.2 GHz vs. 1.15 GHz) and supports Intel Virtualization Technology for Directed I/O (VT-d). Since the G750JX doesn’t support Optimus, we won’t ever be using the integrated graphics, and unless you’re doing a lot with virtual machines, VT-d isn’t likely to offer any benefits, either. So for all intents and purposes, the CPUs are equivalent—meaning the biggest overall performance difference (on the spec sheet, anyway) lies with the GPU and the storage devices (where the G750JX offers more solid-state storage than the GT60). It’s no secret that the MSI GT60 burned up our benchmarks—so the real question is, how close is the ASUS G750JX to its pedestal, and if the differences are considerable, are they justified?
At an MSRP of around $2,000 (though it can be found for around $100 less), the ASUS G750JX-DB71 competes directly with the likes of the MSI GT60, too (which is priced equivalently). The question, of course, is whether it truly competes. Let’s find out!
Subject: General Tech, Graphics Cards | September 25, 2013 - 02:59 AM | Scott Michaud
Tagged: nvidia, Nouveau, linux
AMD committed numerous updates to the open source driver community three months ago and has otherwise assisted the Linux community in the past. The same has not been true of NVIDIA. Despite a respectable (albeit lacking compared to Windows) proprietary Linux driver, the GPU vendor has not been adored by the community. They have not been accused of malice; the motivation would just seem to be control over both the end-user experience and, of course, their secret sauce.
I, obviously, do not have a crystal ball (the journalist house of auction ran out and the gift shop is just too expensive), so the future extent of NVIDIA's involvement is anyone's guess. For now, their assistance includes 42 pages of Device Control Block documentation and proprietary-driver developers answering questions on the Nouveau mailing list.
Many, from Ars Technica to our staff discussions at PC Perspective, note how the change of heart aligns with the SteamOS announcement. I do not really believe these events are related if only because I doubt NVIDIA would wait to contact developers until Valve spoke up. I would have to expect that SteamOS would not be a surprise to NVIDIA especially after Gabe Newell discussed Maxwell virtualization all the way back at CES.
You would think the documentation would have come about while Valve was working with NVIDIA on game streaming technology, which, you know, allows a single desktop to drive multiple games across multiple devices. Even then, you would think NVIDIA would simply put more effort into its proprietary driver rather than help Nouveau.
Either way, we will keep an ear out for NVIDIA involvement with the open source community.
Subject: Graphics Cards | September 19, 2013 - 05:55 PM | Jeremy Hellstrom
Tagged: nvidia, msi, 650ti boost, Twin Frozr
To give you the full name, the MSI N650Ti Twin Frozr 2GD5/OC Boost Edition is $170 after MIR, whereas the HD 7850 that [H]ard|OCP chose as a comparison can be picked up for a mere $130 after rebate. That price difference means the NVIDIA card really has to perform quite a bit better than the AMD card to beat it from a price-to-performance perspective. From the numbers in the review you can clearly see that the 650 Ti Boost is the better-performing card, especially with the respectable overclock that [H] managed, which does make it the best card under $200. On the other hand, if your budget is tight, the performance gap is not as large as the price gap, which might make the HD 7850 the better choice.
By the way, that NVIDIA card has a Boost clock which means that it might steal some of your megahertz away when it gets too hot, which is apparently a horrible experience and if you somehow disable that feature and cook your GPU ... obviously that is not your fault.
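To make the price-to-performance trade-off concrete, here is a minimal sketch (ours, not [H]ard|OCP's methodology; the function and variable names are illustrative) of the break-even math implied by those street prices:

```python
# Rough price-to-performance comparison using the after-rebate street
# prices quoted above. No review frame rates are assumed here; we only
# compute how much faster the pricier card must be to break even.
def perf_per_dollar(avg_fps: float, price: float) -> float:
    """Frames per second per dollar spent."""
    return avg_fps / price

price_650ti_boost = 170.0  # MSI N650Ti TF 2GD5/OC BE, after MIR
price_hd7850 = 130.0       # AMD Radeon HD 7850, after rebate

# For equal value, the 650 Ti Boost must outperform the HD 7850 by the
# same ratio as the price premium:
required_lead = price_650ti_boost / price_hd7850 - 1.0
print(f"Required performance lead to break even: {required_lead:.1%}")
```

In other words, at these prices the 650 Ti Boost needs to be roughly 31% faster just to match the HD 7850 on value, which is why a tight budget can still favor the cheaper card.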
"Today we evaluate MSI's high-end GeForce GTX 650 Ti BOOST line with the flagship overclocked Gaming Edition MSI N650Ti TF 2GD5/OC BE. With falling prices on AMD Radeon video cards we will compare it to the AMD Radeon HD 7850 to see which will emerge as the victor in the sub-$200 price price range."
Here are some more Graphics Card articles from around the web:
- MSI GTX 660 Gaming Video Card Review @ Ninjalane
- MSI GTX 660 N660 Gaming 2GD5/OC Video Card Review @HiTech Legion
- MSI GTX780 Lightning 3GB @ Kitguru
- Budget video cards: AMD Radeon HD 7730 vs. Nvidia GeForce GT 640 GK208 @ Hardware.info
- ASUS GTX 760 DirectCU Mini 2 GB @ techPowerUp
- MSI GTX 780 Lightning Review @ Hardware Canucks
- ASUS GTX 670 DirectCU II Mini @ Bjorn3D
- Palit GTX760, GTX770 and GTX780 Super JetStream @ Kitguru
- MSI GeForce GTX 760 Twin Frozr Gaming OC Edition 2GB @ eTeknix
- ASUS GTX 780 DirectCU II OC @ Bjorn3D
- Palit GTX 780 Super JetStream 3 GB @ techPowerUp
- Gainward GTX 760 Phantom 2GB @ eTeknix
- EVGA GTX 770 4GB Dual Classified w/ ACX Cooler Review @Hi Tech Legion
- XFX FX7850 Double Dissipation HD 7850 2GB @ eTeknix
- PowerColor Radeon HD 7730 1GB @ eTeknix
- Gigabyte Radeon HD 7870 2GB GHz Edition Video Card Review @ Legit Reviews
Subject: General Tech | September 19, 2013 - 02:26 PM | Ken Addison
Tagged: video, surround, podcast, nvidia, Intel, idf, haswell, frame rating, eyefinity, baytrail, amd, 4250U
PC Perspective Podcast #269 - 09/19/2013
Join us this week as we discuss Frame Rating on Eyefinity, News from IDF, and rumors about new AMD GPUs
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Jeremy Hellstrom, Josh Walrath, Allyn Malventano, and Morry Teitelman
Week in Review:
0:03:30 Corsair Carbide 330R Case
News items of interest:
0:54:45 Mushkin Scorpion Delux PCIe SSD
Hardware/Software Picks of the Week:
1-888-38-PCPER or email@example.com
Subject: Mobile | September 18, 2013 - 12:04 PM | Ryan Shrout
Tagged: tegra note, tegra 4, tegra, tablet, pny, nvidia, evga
Over the past couple of months there have been several leaks about a potential NVIDIA-branded tablet based on the Tegra 4 SoC. Most speculated that NVIDIA had decided to enter the hardware market directly with a "Tegra Tab," in a similar vein to the release of NVIDIA SHIELD. As it turns out, though, NVIDIA has created a platform that other companies can rebrand and resell as an Android tablet.
According to NVIDIA, the Tegra Note platform will enable partners to bring 7-in tablets to market packed with the feature set NVIDIA has been promising since the launch of the Tegra 4 SoC. Those include stylus support, high quality audio, HDR camera capabilities and 100% native Android operating systems.
Maybe more interesting are the partners that NVIDIA is teaming with for this launch. While companies like ASUS have already done the development work to prepare various size tablets based on Tegra chips in the past, NVIDIA is going to introduce a couple of its graphics cards partners to the mobility ecosystem: EVGA and PNY in North America.
While we have questions about the ability of either of these companies to truly support a tablet in today's market, the truth is likely that NVIDIA is handling most if not all of the logistics on this project. What is not in question is the potential for high value: these tablets will start at a suggested retail price of $199.
We already know most of the technical details about the Tegra 4 SoC including the 4+1 Cortex A15 CPU cores and the 72-core GPU. NVIDIA claims they will get 10 hours of video playback with this platform but I would like to get data on the weight and battery size before calling that a win. The display resolution is a bit lower than other competing high-end options in the market today but the sub-$200 price point does mean there had to be some corners cut.
UPDATE: I asked NVIDIA for more information on the size, weight and battery capacity and got a quick answer. The battery capacity is 4100 mAh and the entire device weighs 320g. Compared to the Google Nexus 7, the current strongest 7-in tablet in my opinion, that is a 4% larger battery (vs 3950 mAh) and 10% heavier device (vs 290g). The Tegra Note reference is also a bit thicker at 9.6mm compared to the 8.65mm of the Nexus 7.
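For anyone who wants to check the update's math, here is a quick back-of-envelope script (the dictionary layout is our own; the figures are the ones quoted above):

```python
# Verify the Tegra Note vs. Google Nexus 7 deltas from the update above.
tegra_note = {"battery_mAh": 4100, "weight_g": 320, "thickness_mm": 9.6}
nexus_7    = {"battery_mAh": 3950, "weight_g": 290, "thickness_mm": 8.65}

# Relative differences, Tegra Note over Nexus 7:
battery_delta = tegra_note["battery_mAh"] / nexus_7["battery_mAh"] - 1.0
weight_delta  = tegra_note["weight_g"] / nexus_7["weight_g"] - 1.0

print(f"Battery: +{battery_delta:.0%}, weight: +{weight_delta:.0%}")
# Battery: +4%, weight: +10%
```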
There are more details on the official NVIDIA blog post making the announcement this morning including direct OTA Android updates so check that out if you think you might be interested in one of these tablets in the coming months!
Summary of Events
In January of 2013 I revealed a new testing methodology for graphics cards that I dubbed Frame Rating. At the time I was only able to talk about the process, using capture hardware to record the output directly from the DVI connections on graphics cards, but over the course of a few months started to release data and information using this technology. I followed up the story in January with a collection of videos that displayed some of the capture video and what kind of performance issues and anomalies we were able to easily find.
My first full test results were published in February to quite a stir, and then finally in late March we released Frame Rating Dissected: Full Details on Capture-based Graphics Performance Testing, which dramatically changed the way graphics cards and gaming performance are discussed and evaluated.
Our testing proved that AMD CrossFire was not improving gaming experiences in the same way that NVIDIA SLI was. We also showed that other testing tools, like FRAPS, were inadequate for showcasing this problem. If you are at all unfamiliar with this testing process or the results it produced, please check out the Frame Rating Dissected story above.
At the time, we tested 5760x1080 resolution using AMD Eyefinity and NVIDIA Surround but found there were too many issues and problems with our scripts and the results they were presenting to give reasonably assured performance metrics. Running AMD + Eyefinity was obviously causing some problems but I wasn’t quite able to pinpoint what they were and how severe it might have been. Instead I posted graphs like this:
We were able to show NVIDIA GTX 680 performance and scaling in SLI at 5760x1080 but we only were giving results for the Radeon HD 7970 GHz Edition in a single GPU configuration.
Since those stories were released, AMD has been very active. At first they were hesitant to believe our results and called into question our processes and the ability for gamers to really see the frame rate issues we were describing. However, after months of work and pressure from quite a few press outlets, AMD released a 13.8 beta driver that offered a Frame Pacing option in the 3D controls that enables the ability to evenly space out frames in multi-GPU configurations producing a smoother gaming experience.
The results were great! The new AMD driver produced very consistent frame times and put CrossFire on a similar playing field to NVIDIA’s SLI technology. There were limitations, though: the driver only fixed DX10/DX11 games and only addressed resolutions of 2560x1440 and below.
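To illustrate what frame pacing changes, here is a minimal sketch (ours, not AMD's driver logic or our actual capture tooling; the timestamp data is synthetic) of how frame-time consistency can be quantified from per-frame presentation timestamps:

```python
from statistics import mean, stdev

def frame_times_ms(timestamps_ms):
    """Convert per-frame presentation timestamps into frame-to-frame intervals."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def pacing_report(timestamps_ms):
    """Average frame time, its variability, and the resulting average FPS."""
    ft = frame_times_ms(timestamps_ms)
    return {"avg_ms": mean(ft), "stdev_ms": stdev(ft),
            "avg_fps": 1000.0 / mean(ft)}

# Unpaced multi-GPU: frames arrive in tight pairs, so the average FPS
# looks fine while the delivery cadence is badly uneven.
unpaced = [0, 2, 16, 18, 32, 34, 48, 50, 64]
# Paced: the same average rate, but evenly spaced frames.
paced = [0, 8, 16, 24, 32, 40, 48, 56, 64]

print(pacing_report(unpaced))  # identical avg_fps, large stdev_ms
print(pacing_report(paced))    # identical avg_fps, stdev_ms of 0
```

Both traces report the same average frame rate, which is exactly why FPS-only tools miss the problem; the frame-time variability is where the unevenness shows up.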
But the story doesn't end there. CrossFire and Eyefinity are still very important in a lot of gamers' minds, and with the constant price drops in 1920x1080 panels, more and more gamers are taking (or thinking of taking) the plunge into the world of Eyefinity and Surround. As it turns out, though, there are some more problems and complications with Eyefinity and high-resolution gaming (multi-head 4K) that are cropping up and deserve discussion.
Subject: General Tech | August 30, 2013 - 04:07 PM | Jeremy Hellstrom
Tagged: nvidia, batman arkham origins, free
If you weren't a fan of NVIDIA's last offer of in-game currency for pay-to-win free-to-play online games, then how about Batman: Arkham Origins for free? If you pick up a 600- or 700-series GPU before the end of the year, you will be picking up a copy for free. The TITAN and GTX 690 are not named specifically, nor is the rumoured GTX 790, but it is unlikely you would be singled out. NVIDIA will also be showing off a sneak peek of the game at PAX Prime in September.
SANTA CLARA, Calif.—Aug. 30, 2013—NVIDIA today announced it is working with Warner Bros. Interactive Entertainment and WB Games Montréal to make Batman™: Arkham Origins, the next installment in the blockbuster Batman: Arkham videogame franchise, a technically advanced and intensely realistic chapter in the award-winning saga for PC players.
Gamers who purchase a qualifying GPU from a participating partner will receive a free PC edition of Batman: Arkham Origins, which will be released worldwide on Oct. 25, 2013.
Developed by WB Games Montréal, Batman: Arkham Origins features an expanded Gotham City and introduces an original prequel storyline set several years before the events of Batman: Arkham Asylum and Batman: Arkham City. Taking place before the rise of Gotham City’s most dangerous criminals, the game showcases a young Batman as he faces a defining moment of his early career and sets his path to becoming the Dark Knight.
Batman has immense power, strength and speed—the same attributes that make a GeForce GTX GPU the ultimate weapon to take on Gotham’s dark underworld. The NVIDIA Developer Technology Team has been working closely with WB Games Montréal to incorporate an array of cutting-edge NVIDIA gaming technologies including DirectX tessellation, NVIDIA TXAA™ antialiasing, soft shadows and various NVIDIA PhysX® engine environmental effects, such as cloth, steam and snow. Combined, these technologies bring the intricately detailed worlds of Gotham to life.
“The Batman: Arkham games are visually stunning and it’s great that we are able to continue building upon the amazing graphics with Batman: Arkham Origins,” said Samantha Ryan, Senior Vice President, Production and Development, Warner Bros. Interactive Entertainment. “With NVIDIA’s continued support, we are able to deliver an incredibly immersive gameplay experience."
NVIDIA will be unveiling a sneak peek of Batman: Arkham Origins at PAX Prime in Seattle, during the NVIDIA stage presentation at the Paramount Theater on Monday, Sept. 2 at 10 a.m. PT. Entry is free.
Additionally, any PAX attendees that purchase a qualified bundle from the special kiosk at the NVIDIA booth on the show floor will receive for free a limited edition Batman lithograph — one of only 1,000 being produced.
For a full list of participating bundle partners, visit: www.geforce.com/freebatman. This offer is good only until Jan. 31, 2014.
Subject: General Tech | August 30, 2013 - 03:54 PM | Jeremy Hellstrom
Tagged: shield, nvidia, nifty, microsoft, grid vca, byod
Remember NVIDIA's SHIELD, the game streaming device Ryan was playing with at QuakeCon? It doesn't seem to fit the role of just a gaming device, since it can harness the power of other nearby NVIDIA GPUs. The Register is proposing a rather interesting usage scenario for the SHIELD using GRID VCA technology, which is the basis of communications with NVIDIA's servers and virtualized GPUs and which also happens to function well with many of the virtualization programs currently in use.
When they saw Windows games being played on a SHIELD at VMworld, they realized there would be nothing impossible about providing Office 365 as a service if you were running Server 2012 with RemoteFX installed. With HDMI out you can have the monitor of your choice, and the Bluetooth capability means you can use a keyboard and mouse; suddenly you have the coolest thin client on the block. In fact, you might even be able to sit near a server with several Tesla cards installed and run CAD programs, if someone could figure out how to stream a CAD program to the SHIELD.
Or you could just game at work.
"Some grumble that the Bring Your Own Device (BYOD) concept deserves to be called Spend Your Own Money in recognition of the cost of providing a computer hitting workers' hip pockets instead of employers'.
Such grumbles may be less sustainable now that NVIDIA's $US299 SHIELD portable gaming console can run Windows applications."
Here is some more Tech News from around the web:
- Intel to announce new Haswell processors in September @ DigiTimes
- Samsung has begun bashing out DDR4 20nm memory modules @ The Inquirer
- Intel Plans 'Overclocking' Capability On SSDs @ Slashdot
- Apple set to make 63 million iWatches in 2014 priced at $199 @ The Inquirer
- Do not adjust your eyes: This Kobo ten-incher has a 2560 x 1600 resolution @ The Register
- Enter to win an MSI GeForce GTX 760 graphics card and Corsair HX1050 PSU @ The Tech Report
A New TriFrozr Cooler
Graphics cards are by far the most interesting topic we cover at PC Perspective. Between the battles of NVIDIA and AMD as well as the competition between board partners like EVGA, ASUS, MSI and Galaxy, there is very rarely a moment in time when we don't have a different GPU product of some kind on an active test bed. Both NVIDIA and AMD release reference cards (for the most part) with each and every new product launch and it then takes some time for board partners to really put their own stamp on the designs. Other than the figurative stamp that is the sticker on the fan.
One of the companies that has recently become well known for very custom, non-reference graphics card designs is MSI, and the pinnacle of the company's engineering falls into the Lightning brand. As far back as the MSI GTX 260 Lightning and as recently as the MSI HD 7970 Lightning, these cards have combined unique cooling, custom power design, and a good amount of over-engineering to produce cards that have few rivals.
Today we are looking at the brand new MSI GeForce GTX 780 Lightning, a complete revamp of the GTX 780 that was released in May. Based on the same GK110 GPU as the GTX TITAN, with two fewer SMX units, the GTX 780 is easily the second fastest single-GPU card on the market. MSI is hoping to make enthusiasts even more excited about the card with the Lightning design, which brings a brand new TriFrozr cooler, an impressive power design, and overclocking capabilities that both basic users and LN2 junkies can take advantage of. Just what DO you get for $750 these days?