Subject: General Tech | February 16, 2015 - 06:09 PM | Jeremy Hellstrom
Tagged: nvidia, gtx 900m, overclocking, responsibility
It seems that the recent ability to overclock the GTX 900M in laptops was a bug and not a feature, according to the response an NVIDIA representative gave on this thread to the many reasonable and well-thought-out posts on their forums. The lockout started in the 347.29 release and continues in the current 347.52 release, which supports the newly released Evolve as well as overclocking on desktop components.
It would be very nice to see the ability to overclock mobile NVIDIA chips restored so that users can decide for themselves, but perhaps it is worth reminding those who want to overclock that they do so at their own risk. That does not just mean voiding the warranty, which will happen; it refers to the very real risk of damage to the GPU and the laptop it lives in. By exceeding the thermal design of the laptop you risk destroying the expensive machine you just bought. Laptops have nowhere near the thermal flexibility or compartmentalization of a desktop: not only can you not pop the side off or slap in a new fan, the heat from the GPU bleeds directly into other components since there is no significant air gap between them.
Restoring the ability to overclock, either natively or through third-party applications, would be very much appreciated; however, a strong warning should be presented to users if they do choose to. If you are running GPU-enabled BOINC or Folding@Home on an overclocked laptop which you then leave unattended, it is your fault if the damn thing catches fire, not NVIDIA's, so do not go suing.
"Nvidia has removed the ability of users to overclock their GeForce GTX 900M series GPU equipped laptops in a recent driver update. The driver in question is the GeForce R347 driver (version 347.29). Before the update users of the laptops in question had no problems overclocking or even underclocking their GPUs."
Here is some more Tech News from around the web:
- Intel reportedly to delay launch of 14nm Skylake desktop CPUs @ DigiTimes
- Google, Mattel team up to offer View-Master VR in kid-friendly package @ ExtremeTech
- Get Your Data Back with Linux-Based Data Recovery Tools @ Linux.com
- Think you’re hard? Check out the frozen Panasonic CF-54 Toughbook @ The Register
- iOS 8 causes more developer headaches than Android 5.0 Lollipop @ The Inquirer
- Microsoft's patchwork falls apart … AGAIN! @ The Register
- The TR Podcast 170 video: What the kids put in their PCIe slots these days
Subject: Graphics Cards | February 16, 2015 - 04:04 PM | Sebastian Peak
Tagged: SFF, nvidia, mini-ITX GPU, mini-itx, gtx 960, graphics, gpu, geforce, asus
ASUS returns to the mini-ITX friendly form-factor with the GTX 960 Mini (officially named GTX960-MOC-2GD5 for maximum convenience), their newest NVIDIA GeForce GTX 960 graphics card.
Other than the smaller size to allow compatibility with a wider array of small enclosures, the GTX 960 Mini also features an overclocked core and promises "20% cooler and vastly quieter" performance from its custom heatsink and CoolTech fan. Here's a quick rundown of key specs:
- 1190 MHz Base Clock / 1253 MHz Boost Clock
- 1024 CUDA cores
- 2GB 128-bit GDDR5 @ 7010 MHz
- 3x DisplayPort, 1x HDMI 2.0, 1x DVI output
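For context, the memory bandwidth implied by those specs follows directly from the bus width and effective data rate; a quick back-of-the-envelope sketch:

```python
# Memory bandwidth = (bus width in bytes) x (effective data rate)
bus_width_bits = 128        # GTX 960 memory bus
effective_rate_mhz = 7010   # GDDR5 effective data rate, MHz

bandwidth_gbps = (bus_width_bits / 8) * effective_rate_mhz / 1000
print(f"{bandwidth_gbps:.2f} GB/s")  # 112.16 GB/s
```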
No word on pricing or availability of the card just yet. The other mini-ITX version of the GTX 960 on the market, from Gigabyte, has been selling for $199.99, so expect this one to run somewhere between $200 and $220 at launch.
ASUS has reused this image from the GTX 970 Mini launch, and so have I
The product page is up on the ASUS website so availability seems imminent.
Subject: General Tech | February 12, 2015 - 06:46 PM | Ken Addison
Tagged: podcast, video, gtx 960, plextor, m6e black edition, M6e, r9 390, amd, radeon, nvidia, Silverstone, tegra, tx1, Tegra X1, corsair, H100i GTX, H80i GT
PC Perspective Podcast #336 - 02/12/2015
Join us this week as we discuss GTX 960 overclocking, the Plextor M6e Black Edition, AMD R9 3xx rumors and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:11:53
Week in Review:
News item of interest:
0:46:10 NVIDIA Event on March 3rd. Why?
Hardware/Software Picks of the Week:
Subject: Graphics Cards | February 12, 2015 - 06:41 PM | Jeremy Hellstrom
Tagged: overclocking, nvidia, msi, gtx 960, GM206, maxwell
While Ryan was slaving over a baker's dozen of NVIDIA's GTX 960s, [H]ard|OCP focused on overclocking the MSI GeForce GTX 960 GAMING 2G that they recently reviewed. Out of the box this GPU will hit 1366MHz in game, with memory frequency unchanged at 7GHz effective. As users have discovered, overclocking cards with thermal protection that automatically downclocks the GPU once a certain TDP threshold is reached is a little trickier, as simply upping the power provided to the card can raise the temperature enough that you end up with a lower frequency than before you overvolted. After quite a bit of experimentation, [H] managed to boost the memory to a full 8GHz with the in-game GPU clock hitting 1557MHz, which is at the higher end of what Ryan saw. The trick was to increase the Power Limit and turn the clock speed up but leave the voltage alone.
"We push the new MSI GeForce GTX 960 GAMING video card to its limits of performance by overclocking to its limits. This NVIDIA GeForce GTX 960 GPU based video card has a lot of potential for hardware enthusiasts and gamers wanting more performance. We compare it with other overclocked cards to see if the GTX 960 can keep up."
Here are some more Graphics Card articles from around the web:
- Five GeForce GTX 960 cards overclocked @ The Tech Report
- NVIDIA GTX 960 Reference Review @ Hardware Canucks
- MSI GTX 960 Gaming GPU, The Sweet Spot @ Bjorn3d
- MSI GTX 960 Gaming 2G @ Modders-Inc
- 6-way GeForce GTX 960 Shootout @ Legion Hardware
- Testing Nvidia’s GeForce GTX 960 2GB Graphics Cards In SLI @ eTeknix
- NVIDIA GeForce GTX 960 Video Card Review - EVGA SSC Edition @ NitroWare
- Gigabyte GTX 980 G1 Gaming 4GB @ Modders-Inc
- Sapphire R9 290X Tri-X OC 8GB @ Kitguru
- Corsair H105 on 4770K and R9 290 @ HardwareOverclock
Subject: Graphics Cards, Mobile, Shows and Expos | February 11, 2015 - 08:25 PM | Scott Michaud
Tagged: Tegra X1, nvidia, mwc 15, MWC, gdc 2015, GDC, DirectX 12
On March 3rd, NVIDIA will host an event called “Made to Game”. Invitations have gone out to numerous outlets, including Android Police, who published a censored screenshot of it. This suggests that it will have something to do with the Tegra X1, especially since the date is the day after Mobile World Congress starts. Despite all of this, I think it is for something else entirely.
Image Credit: Android Police
Allow me to highlight two points. First, Tech Report claims that the event is taking place in San Francisco, which is about as far away from Barcelona, Spain as you can get. It is close to GDC, however, which also starts on March 2nd. If this was meant to align with Mobile World Congress, you ideally would not want attendees to take a 14-hour flight for a day trip.
Second, the invitation specifically says: “More than 5 years in the making, what I want to share with you will redefine the future of gaming.” Compare that to the DirectX 12 announcement blog post on March 20th of last year (2014): “Our work with Microsoft on DirectX 12 began more than four years ago with discussions about reducing resource overhead. For the past year, NVIDIA has been working closely with the DirectX team to deliver a working design and implementation of DX12 at GDC (2014).”
So yeah, while it might involve the Tegra X1 processor for Windows 10 on mobile devices, which is the only reason I can think of that they would want Android Police there apart from "We're inviting everyone everywhere", I expect that this event is for DirectX 12. I assume that Microsoft would host their own event that involves many partners, but I could see NVIDIA having a desire to save a bit for something of their own. What would that be? No idea.
Subject: General Tech, Graphics Cards, Mobile | February 11, 2015 - 01:00 PM | Scott Michaud
Tagged: shieldtuesday, shield, nvidia, gridtuesday, grid, graphics drivers, geforce, drivers
Update: Whoops! The title originally said "374.52", when it should be "347.52". My mistake. Thanks "Suddenly" in the comments!
Two things from NVIDIA this week: a new driver and a new game for the NVIDIA GRID. The new driver aligns with the release of Evolve, which came out on Tuesday from the original creators of Left 4 Dead. The graphics vendor also claims that it will help Assassin's Creed: Unity, Battlefield 4, Dragon Age: Inquisition, The Crew, and War Thunder. Several SLI profiles were also added for Alone in the Dark: Illumination, Black Desert, Dying Light, H1Z1, Heroes of the Storm, Saint's Row: Gat out of Hell, Total War: Attila, and Triad Wars.
On the same day, NVIDIA released Brothers: A Tale of Two Sons on GRID, bringing the number of available games up to 37. This game came out in August 2013 and received a lot of critical praise. Its control style is unique, using dual-thumbstick gamepads to simultaneously control both characters. More importantly, despite being short, the game is said to have an excellent story, even achieving Game of the Year (2013) for TotalBiscuit based on its narrative, which is not something that he praises often.
I'd comment on the game, but I've yet to get the time to play it. Apparently it is only a couple hours long, so maybe I can fit it in somewhere.
Also, they apparently are now calling this “#SHIELDTuesday” rather than “#GRIDTuesday”. I assume this rebranding is because people may not know that GRID exists, but they would certainly know if they purchased an Android-based gaming device for a couple hundred dollars or more. We could read into this and make some assumptions about GRID adoption rates versus SHIELD purchases, or even purchases of the hardware itself versus their projections, but it would be pure speculation.
Both announcements were made available on Tuesday, for their respective products.
A baker's dozen of GTX 960
Back on the launch day of the GeForce GTX 960, we hosted NVIDIA's Tom Petersen for a live stream. During the event, NVIDIA and its partners provided ten GTX 960 cards for our live viewers to win, which we handed out over the course of about an hour and a half. An interesting idea was proposed during the event: what would happen if we tried to overclock all of the product NVIDIA had brought along to see what the distribution of results looked like? After notifying all the winners of their prizes and asking for permission from each, we started the arduous process of testing and overclocking a total of 13 (10 prizes plus our 3 retail units already in the office) different GTX 960 cards.
Hopefully we will be able to provide a solid base of knowledge for buyers of the GTX 960 that we don't normally have the opportunity to offer: what range of overclocking can you expect, and what is the average or median result? I think you will find the data interesting.
The 13 Contenders
Our collection of thirteen GTX 960 cards includes a handful from ASUS, EVGA and MSI. The ASUS models are all STRIX models, the EVGA cards are of the SSC variety, and the MSI cards include a single Gaming model and three 100ME. (The only difference between the Gaming and 100ME MSI cards is the color of the cooler.)
To be fair to the prize winners, I actually assigned each of them a specific graphics card before opening them up and testing them. I didn't want to be accused of favoritism by giving the best overclockers to the best readers!
ARM Releases Top Cortex Design to Partners
ARM has an interesting history of releasing products. The company was once in the shadowy background of the CPU world, but with the explosion of mobile devices and its relevance in that market, ARM has had to adjust how it approaches the public with its technologies. For years ARM has announced products and technology, only to see them ship one to two years down the line. It seems that with the increased competition in the marketplace from Apple, Intel, NVIDIA, and Qualcomm, ARM is now pushing to license out its new IP in a way that will enable its partners to achieve a faster time to market.
The big news this time is the introduction of the Cortex-A72. This is a brand new design based on the ARMv8-A instruction set. It is a 64-bit capable processor that is also backwards compatible with 32-bit applications programmed for ARMv7-based processors. ARM does not go into great detail about the product other than to say it is significantly faster than the previous Cortex-A15 and Cortex-A57.
The previous Cortex-A15 processors were announced several years back and made their first appearance in late 2013/early 2014. These were still 32-bit processors and, while they had good performance for the time, they did not stack up well against the latest A8 SoCs from Apple. The A53 and A57 designs were also announced around two years ago. These are the first 64-bit designs from ARM and were meant to compete with the latest custom designs from Apple and Qualcomm's upcoming 64-bit part. We are only now seeing these parts make it into production, and even Qualcomm has licensed the A53 and A57 designs to ensure a faster time to market for this latest batch of next-generation mobile devices.
We can look back over the past five years and see that ARM is moving forward in announcing their parts and then having their partners ship them within a much shorter timespan than we were used to seeing. ARM is hoping to accelerate the introduction of its new parts within the next year.
Subject: General Tech, Systems, Mobile | February 3, 2015 - 10:35 PM | Scott Michaud
Tagged: razer blade, razer, nvidia, Intel, GTX 970M
When the Razer Blade launched, it took a classy design and filled it with high-end gaming components. Its competitors in the gaming space were often desktop replacements, which were powerful but not comfortable, every-day laptops. The Blade also came with a $2800 (at the time) price-tag, and that stunted a lot of reviews. It has been refreshed a few times since then, including today.
The New Razer Blade QHD+ has a 14-inch 3200x1800 display, with multi-touch and an LED backlight. The panel is IGZO, which is a competitor to IPS for screens with a high number of pixels per inch (such as the 4K PQ321Q from ASUS). This is housed in a milled aluminum chassis that is about 7/10th of an inch thick.
Its power brick is rated at 150W, which is surprisingly high for a laptop. I am wondering how much of that electricity is headroom for fast-charging (versus higher performance when not on battery). Most power adapters for common laptops that I've seen are between 60W and 95W. In a small, yet meticulously designed chassis, I would have to assume that thermal headroom of either the heatsinks or the components themselves would be the limiting factor.
On the topic of specifications, they are expectedly high-end.
The GPU was upgraded to the GeForce GTX 970M with 3GB of VRAM (up from a 3GB 870M) and the CPU is now a Core i7-4720HQ (up from a Core i7-4702HQ). The system memory also got doubled, to 16GB (up from 8GB). It also has 3 USB 3.0 ports, HDMI 1.4a out, 802.11a/b/g/n/ac, Bluetooth 4.0, and (of course) a high-end, backlit keyboard. Razer offers a choice in M.2 SSD capacity: 128GB for $2199.99, 256GB for $2399.99, or 512GB for $2699.99. This is kind-of expensive for solid state memory, $1.56/GB for the jump to 256GB and $1.17/GB to go from there to 512GB.
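Those per-gigabyte figures fall straight out of Razer's price tiers; here is the quick math:

```python
# Marginal $/GB between Razer Blade M.2 SSD capacity tiers
tiers = {128: 2199.99, 256: 2399.99, 512: 2699.99}  # GB -> USD

# 128GB -> 256GB: $200 more for 128GB of extra capacity
step1 = (tiers[256] - tiers[128]) / (256 - 128)
# 256GB -> 512GB: $300 more for 256GB of extra capacity
step2 = (tiers[512] - tiers[256]) / (512 - 256)

print(f"${step1:.2f}/GB, ${step2:.2f}/GB")  # $1.56/GB, $1.17/GB
```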
The New Razer Blade Gaming Laptop is available now at Razerzone.com in the US, Canada, Singapore, and Hong Kong. It will arrive at Microsoft Stores in the USA on February 16th. China, Australia, New Zealand, Malaysia, UAE, Japan, Korea, Taiwan, and Russia can purchase it on Razerzone.com in March. Prices start (as stated above) at $2199.99.
At the end of my first Frame Rating evaluation of the GTX 970 after the discovery of the memory architecture issue, I proposed the idea that SLI testing would need to be done to come to a more concrete conclusion on the entire debate. It seems that our readers and the community at large agreed with us in this instance, repeatedly asking for those results in the comments of the story. After spending the better part of a full day running and re-running SLI results on a pair of GeForce GTX 970 and GTX 980 cards, we have the answers you're looking for.
Today's story is going to be short on details and long on data, so if you want the full back story on what is going on and why we are taking a specific look at the GTX 970 in this capacity, read here:
- Part 1: NVIDIA issues initial statement
- Part 2: Full GTX 970 memory architecture disclosed
- Part 3: Frame Rating: GTX 970 vs GTX 980
- Part 4: Frame Rating: GTX 970 SLI vs GTX 980 SLI (what you are reading now)
Okay, are we good now? Let's dive into the first set of results in Battlefield 4.
Battlefield 4 Results
Just as I did with the first GTX 970 performance testing article, I tested Battlefield 4 at 3840x2160 (4K) and utilized the game's ability to linearly scale resolution to help me increase GPU memory allocation. In the game settings you can change that scaling option by a percentage: I went from 110% to 150% in 10% increments, increasing the load on the GPU with each step.
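Assuming the scale percentage applies to each axis (so the rendered pixel count grows with its square), the GPU load at each step ramps up quickly; a small sketch of the resulting resolutions:

```python
# Rendered resolution at each BF4 resolution-scale step, base 3840x2160
base_w, base_h = 3840, 2160

for scale in (110, 120, 130, 140, 150):
    w, h = base_w * scale // 100, base_h * scale // 100
    mpix = w * h / 1e6
    print(f"{scale}%: {w}x{h} ({mpix:.1f} MPix)")
```

At 150% the card is pushing 5760x3240, roughly 2.25x the pixels of native 4K, which is exactly why it leans so hard on GPU memory allocation.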
Memory allocation between the two SLI configurations was similar, but not as perfectly aligned with each other as we saw with our single GPU testing.
In a couple of cases, at 120% and 130% scaling, the GTX 970 cards in SLI are actually each using more memory than the GTX 980 cards. That difference is only ~100MB but that delta was not present at all in the single GPU testing.