Subject: General Tech | February 16, 2015 - 01:09 PM | Jeremy Hellstrom
Tagged: nvidia, gtx 900m, overclocking, responsibility
It seems that the recent ability to overclock the GTX 900M in laptops was a bug and not a feature, according to an NVIDIA representative's response to the many reasonable and well thought out posts on this thread on their forums. The removal started with the 347.29 release and continues in the current 347.52 release, which supports the newly released Evolve as well as overclocking on desktop components.
It would be very nice to see the ability to overclock mobile NVIDIA chips restored so that users can decide for themselves, but perhaps it is worth reminding those who want to overclock that they do so at their own risk. That does not just mean voiding the warranty, which will happen, but the very real risk of damage to the GPU and the laptop it lives in; by exceeding the thermal design of the laptop you risk destroying the expensive machine you just bought. Laptops have nowhere near the thermal flexibility or compartmentalization of a desktop: not only can you not pop the side off or slap in a new fan, the heat from the GPU bleeds directly into other components in the laptop as there is no significant air gap between them.
Restoring the ability to overclock, either natively or through third-party applications, would be very much appreciated; however, there should be a strong warning presented to users if they do choose to. If you are running GPU-enabled BOINC or Folding@Home on an overclocked laptop which you then leave unattended, it is your fault if the damn thing catches fire, not NVIDIA's, so do not go suing.
"Nvidia has removed the ability of users to overclock their GeForce GTX 900M series GPU equipped laptops in a recent driver update. The driver in question is the GeForce R347 driver (version 347.29). Before the update users of the laptops in question had no problems overclocking or even underclocking their GPUs."
Here is some more Tech News from around the web:
- Intel reportedly to delay launch of 14nm Skylake desktop CPUs @ DigiTimes
- Google, Mattel team up to offer View-Master VR in kid-friendly package @ ExtremeTech
- Get Your Data Back with Linux-Based Data Recovery Tools @ Linux.com
- Think you’re hard? Check out the frozen Panasonic CF-54 Toughbook @ The Register
- iOS 8 causes more developer headaches than Android 5.0 Lollipop @ The Inquirer
- Microsoft's patchwork falls apart … AGAIN! @ The Register
- The TR Podcast 170 video: What the kids put in their PCIe slots these days
Subject: General Tech | February 16, 2015 - 11:00 AM | Scott Michaud
Tagged: raptr, pc gaming
If you are interested in the top five most played PC games, according to Raptr, then the rank order has not changed much. Each of them bled a lot of mind share though. In January, the top twenty games accounted for 61.93% (give or take rounding error) of total time, with 44.05% of total time dominated by the top five. In December (2014), the top twenty games had 78.41% of total play time, or 57% for just the top five. This means that PC gamers, at least those using Raptr, were spending a lot more time playing a diverse spread of less-popular games last month.
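To make that shift concrete, here is a quick bit of arithmetic using only the shares quoted above (a sketch; the month labels and rounding are mine):

```python
# Shares of total Raptr play time, as quoted in the report.
top20 = {"December 2014": 78.41, "January 2015": 61.93}
top5 = {"December 2014": 57.00, "January 2015": 44.05}

for month in top20:
    # Whatever the top twenty do not claim goes to the long tail of games.
    outside20 = 100.0 - top20[month]
    outside5 = 100.0 - top5[month]
    print(f"{month}: {outside20:.2f}% of play time outside the top twenty, "
          f"{outside5:.2f}% outside the top five")
```

The long tail outside the top twenty nearly doubled its share of play time, from roughly 21.6% in December to about 38.1% in January.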
The biggest change (by rank) was Warframe, which lost six ranks and 43.2% of its play time, even though that was only 0.6% of Raptr's total. The second-largest change in the bottom fifteen games was Diablo III, which climbed five ranks thanks to a major update released halfway through the month. The third-largest change was Dragon Age: Inquisition, which lost almost half (43.3%) of its play time, resulting in a drop of three ranks.
Even though the ranking had a few big movements internally, all twenty were also on last month's list.
Subject: General Tech, Graphics Cards | February 15, 2015 - 07:30 AM | Scott Michaud
Tagged: ubisoft, DirectX 12, directx 11, assassins creed, assassin's creed, assasins creed unity
During a conference call with investors, analysts, and press, Yves Guillemot, CEO of Ubisoft, highlighted the issues with Assassin's Creed: Unity with an emphasis on the positive outcomes going forward. Their quarter itself was good, beating expectations and allowing them to raise full-year projections. As expected, they announced that a new Assassin's Creed game would be released at the end of the year based on the technology they created for Unity, with “lessons learned”.
Before optimization, every material on every object is at least one draw call.
Of course, there are many ways to optimize... but that effort works against future titles.
After their speech, the question period revisited the topic of Assassin's Creed: Unity and how it affected current sales, how it would affect the franchise going forward, and how should they respond to that foresight (Audio Recording - The question starts at 25:20). Yves responded that they redid “100% of the engine”, which was a tremendous undertaking. “When you do that, it's painful for all the group, and everything has to be recalibrated.” He continues: “[...] but the engine has been created, and it is going to help that brand to shine in the future. It's steps that we need to take regularly so that we can constantly innovated. Those steps are sometimes painful, but they allow us to improve the overall quality of the brand, so we think this will help the brand in the long term.”
This makes a lot of sense to me. When the issues first arose, it was speculated that the engine was pushing way too many draw calls, especially for DirectX 11 PCs. At the time, I figured that Ubisoft chose Assassin's Creed: Unity to be the first title to use their new development pipeline, focused on many simple assets rather than batching things together to minimize host-to-GPU and GPU-to-host interactions. Tens of thousands of individual tasks being sent to the GPU will choke a PC, and getting it to run at all on DirectX 11 might have diverted resources from, or even caused, many of the glitches. Currently, a few thousand is ideal although “amazing developers” can raise the ceiling to about ten thousand.
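As a rough illustration of why that matters, here is a minimal sketch of per-material batching. This is not Ubisoft's engine code; the Mesh class and the scene contents are made up for the example:

```python
# Illustrative only: a toy model of why batching matters.
from collections import defaultdict

class Mesh:
    def __init__(self, name, material):
        self.name = name
        self.material = material

def naive_draws(scene):
    """One draw call per object/material pair -- tens of thousands of
    submissions on a dense scene, which chokes a DirectX 11 driver."""
    return [(m.material, [m.name]) for m in scene]

def batched_draws(scene):
    """Group objects sharing a material into a single submission, the
    classic DX11-era optimization that works against asset variety."""
    buckets = defaultdict(list)
    for m in scene:
        buckets[m.material].append(m.name)
    return list(buckets.items())

scene = [Mesh(f"npc_{i}", f"cloth_{i % 8}") for i in range(40000)]
print(len(naive_draws(scene)))    # 40000 draw calls
print(len(batched_draws(scene)))  # 8 draw calls
```

The point is simply that grouping by shared state collapses the submission count, which is exactly the kind of hand-optimization a draw-call-heavy engine has to lean on under DirectX 11.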
This also means that I expect the next Assassin's Creed title to support DirectX 12, possibly even in the graphics API's launch window. If I am correct, Ubisoft has been preparing for it for a long time. Of course, it is possible that I am simply wrong, but it would align with Microsoft's Holiday 2015 expectation for the first, big-budget titles to use the new interface and it would be silly to have done their big overhaul without planning on switching to DX12 ASAP.
Then there is the last concern: If I am correct, what should Ubisoft have done? Is it right for them to charge full price for a title that they know will have necessary birth pains? Do they delay it and risk (or even accept) that it will be non-profitable, and upset fans that way? There does not seem to be a clear answer, with all outcomes being some flavor of damage control.
Subject: General Tech | February 13, 2015 - 05:09 PM | Jeremy Hellstrom
Tagged: mouse, gaming mat, input, XTracPads, Carbonic, Ripper, Ripper XXL
They are not the most glamorous of peripherals, but they do save your desk and can help with your accuracy, so pop over to Overclockers Club to take a look at XTracPads. They offer three different sized gaming mats: the paper-sized Carbonic at 8.5" x 11" x 1/8", the larger Ripper at 11" x 17" x 1/8", and the immense Ripper XXL at 36" x 18" x 1/8", which is going to cover a goodly piece of your desk. They are priced at roughly $15, $22, and $35, so it is not a major investment to pick up, and well worth it if you are looking to replace an old mat which has seen better days.
"From a casual gamer perspective, I am sure someone who can game competitively will likely notice a greater improvement than I. Personally, I have had trouble with mouse pads that were too hard, not stiff, but solid cutouts of plastic (I don't even know if they are made anymore really). I have also had issues with mouse pads that accumulate a bunch of gross after a bit of use. I can live with poor or cheap mouse pads, but now that I have had a taste of the other side I really don't want to anymore."
Here is some more Tech News from around the web:
- Thermaltake Tt eSPORTS Talon Gaming Mouse Review @ OCC
- Cougar 600M Gaming Mouse Review @ HiTech Legion
- Cougar 600K @ HardwareHeaven
- ROCCAT Ryos TKL Pro Mechanical Keyboard Review @ Techgage
- Corsair Gaming K70 RGB Mechanical Keyboard @ techPowerUp
Subject: General Tech | February 12, 2015 - 03:26 PM | Scott Michaud
Tagged: warner bros, pc gaming, GOG
Another publisher signed a deal with GOG to sell and distribute games, DRM-free. To launch their partnership with Warner Bros. Interactive Entertainment, six games have been added and five of them are on sale. LEGO Batman (50%-off), the two LEGO Harry Potter games (each 60%-off), F.E.A.R. Platinum (50%-off), and Bastion (60%-off) will be at their reduced prices all week.
The sixth title comes from their acquisition of Midway Games and is actually a three-game combo: Mortal Kombat 1+2+3. One person in the comments said that they are DOS-based versions and controller support might be a problem (although JoyToKey should solve that nicely, especially for a fighting game without analog controls). The first two games only support local multiplayer on a single PC, although Mortal Kombat 3 allows LAN play. Of course, LAN support should be easily extended to online multiplayer with people you know via VPN software, but I have not tried it myself and lag could be a problem.
All six titles are DRM-free, because it's GOG and that's how they roll.
Subject: General Tech | February 12, 2015 - 01:46 PM | Ken Addison
Tagged: podcast, video, gtx 960, plextor, m6e black edition, M6e, r9 390, amd, radeon, nvidia, Silverstone, tegra, tx1, Tegra X1, corsair, H100i GTX, H80i GT
PC Perspective Podcast #336 - 02/12/2015
Join us this week as we discuss GTX 960 Overclocking, Plextor M6e Black Edition, AMD R9 3xx Rumors and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:11:53
Week in Review:
News item of interest:
0:46:10 NVIDIA Event on March 3rd. Why?
Hardware/Software Picks of the Week:
Subject: General Tech | February 12, 2015 - 12:51 PM | Jeremy Hellstrom
Tagged: security, fud
In networking, an air gap refers to a security measure that separates a network from the public infrastructure, either physically or through the use of extremely secure tunnelling. This prevents access to that network over the internet or less secure LANs, and it is used in high-security locations as it is generally considered one of the best ways of securing a network. As with all things silicon, it is not perfect, and this article at The Register should not be read by the faint of heart. It describes several methods which have been developed to overcome air gaps; thankfully, most require that the attacker first gain physical access to the air-gapped systems to infect them from within, and as you have heard many times, once an attacker has physical access to your systems all bets are off.
What is interesting is the way the infected systems transmit the stolen data without the need for physical contact, in ways that are incredibly difficult to detect. Some are able to use the FM frequencies generated by GPUs to send data to cellphones up to 7m away, while another uses the screen's pixels to transmit hidden data in a way that is invisible to the user of the machine. Other attacks involve spreading infection via microphones and speakers, or a thumbdrive attached to an air-gapped machine which could transmit data over a radio frequency up to 13 kilometres away. It is a wild world out there, and even though many of the attacks described have only been demonstrated in research labs, don't let strangers fondle your equipment without consent!
"The custom code had jumped an air gap at a defence client and infected what should have been a highly-secure computer. Sikorski's colleagues from an unnamed company plucked the malware and sent it off to FireEye's FLARE team for analysis."
Here is some more Tech News from around the web:
- Storage firms drop 'A bombs' on the backup biz @ The Register
- BitTorrent Announces Exclusive TV Shows @ Slashdot
- Chip giant TSMC, flush with record sales, plans $16bn fab build-out @ The Register
- Ofcom paves way for IoT network with white space approval @ The Inquirer
Subject: General Tech | February 12, 2015 - 08:30 AM | Scott Michaud
Tagged: reverse-consolitis, PC, Nintendo, emulator, dolphin
Update: Fixed the title of "Pikmin". Accidentally wrote "Pikman".
Considering the recent Nintendo license requirements, I expect that their demonstrative YouTube videos will have a difficult time staying online. Regardless, if you are in a jurisdiction where this is legal, it is now possible to play some GameCube-era games at 60 FPS (as well as 1080p) in an emulator on the PC.
The blog post at the Dolphin Emulator's website goes into the “hack” in detail. The main problem is that these games are tied to specific framerates, which will cause problems with sound processing and other, time-sensitive bits of code. I have actually been told that one of the most difficult aspects of bringing a console game to the PC (or restoring an old PC game) is touching the timing code. It will break things all over. For Super Mario Sunshine, this also involves patching the game such that certain parts are still ticked at 30 FPS, despite the render occurring at twice that rate.
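For readers unfamiliar with the pattern, here is a minimal, generic fixed-timestep sketch of logic ticking at 30 Hz while rendering runs at 60 Hz. It is not Dolphin's code, just the standard structure the patches approximate:

```python
# Illustrative only: decoupling a fixed 30 Hz logic rate from a 60 Hz render rate.
import time

LOGIC_HZ = 30          # game systems kept at the original rate
RENDER_HZ = 60         # the new presentation rate
LOGIC_DT = 1.0 / LOGIC_HZ

def run(seconds=1.0):
    accumulator = 0.0
    previous = start = time.perf_counter()
    logic_ticks = frames = 0
    while time.perf_counter() - start < seconds:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        # Step timing-sensitive code (physics, audio cues) at a fixed 30 Hz...
        while accumulator >= LOGIC_DT:
            logic_ticks += 1
            accumulator -= LOGIC_DT
        # ...but draw as often as the render budget allows.
        frames += 1
        time.sleep(1.0 / RENDER_HZ)
    print(f"{logic_ticks} logic ticks, {frames} frames")

run()
```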
Also interesting is that some games, like Super Smash Bros. Melee, did not require a game-side patch. Why? Because they apparently include a slow-motion setting by default, which was enabled and then promptly sped up to real time, resulting in a higher frame rate at normal speed. The PC is nothing if not interesting.
Subject: General Tech, Systems | February 12, 2015 - 08:00 AM | Scott Michaud
Tagged: raspberry pi 2, Raspberry Pi
It did not take long to find a problem with the Raspberry Pi 2. As it turns out, the Pi 2 contains a power regulator chip that is susceptible to bright sources of light. Light forces electrons to move when a material is struck by photons carrying enough per-photon energy, which corresponds to the light's frequency (its color), and that movement is perceived as a voltage (because it actually does cause one).
In the Raspberry Pi 2, this manifests as a voltage drop and the device immediately powers down. This was first discovered by Peter Onion on the Raspberry Pi forums while he was taking photographs of his Raspberry Pi 2. He noticed that each time he snapped a photo, the Pi would shut down. Liz Upton of the Raspberry Pi Foundation promptly confirmed the issue and wrote a long blog post explaining what actually happens. She borrows Peter's joke from the forum thread, that the Pi 2 is camera shy, and explains that “everyday light sources” will not cause this to happen. She then explains the photoelectric effect, the role of the above pictured U16 chip, and the issue itself.
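As a back-of-the-envelope illustration of the threshold involved (my own numbers, not from the Raspberry Pi post; the ~1240 eV·nm constant and the ~1.1 eV silicon bandgap are standard physics values):

```python
# Photon energy in eV for a given wavelength, E = hc / lambda.
HC_EV_NM = 1239.84  # Planck's constant times c, in eV*nm
SILICON_BANDGAP_EV = 1.12

def photon_energy_ev(wavelength_nm):
    return HC_EV_NM / wavelength_nm

for label, nm in [("xenon flash (green)", 550), ("red laser pointer", 650),
                  ("near infrared", 1000)]:
    e = photon_energy_ev(nm)
    print(f"{label:22s} {nm:5d} nm -> {e:.2f} eV "
          f"({'above' if e > SILICON_BANDGAP_EV else 'below'} Si bandgap)")
```

Visible light from a camera flash is comfortably above that threshold, which is why an exposed, unshielded die can behave like an accidental photodiode.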
I definitely appreciate Liz Upton and the Raspberry Pi Foundation, founded on the premise of education, taking the time to explain their bugs from an educational standpoint. That said, it is easy to lose sight of your goal when you have a product to defend, and I am glad that it did not get in the way.
A final note: this will not damage the Pi 2, just cause it to crash and power down. The only real problem is that shutting down your device mid-task will crash your task. If that is a write to the SD card, that will likely corrupt that write.
Subject: General Tech, Motherboards, Storage | February 11, 2015 - 09:59 PM | Scott Michaud
Tagged: usb 3.1, usb, msi, asmedia
USB 3.0, for storage, is fast. If you are using an external, spindle-based hard drive, it will perform basically as fast as an internal sibling would. Apart from my two SSDs, I do not even have an internal drive anymore. You can safely install games to external hard drives now.
But with USB 3.1, the spec doubled to 10 Gbps, which matches the first-generation Thunderbolt connector. A couple of weeks ago, Tom's Hardware put it to the test with an ASMedia USB 3.1 to SATA 6 Gbps developer board. Sure enough, when you RAID a pair of Intel 730 SSDs, you can achieve over 700 MB/s read/write in CrystalDiskMark.
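For context, here is a rough sketch of the theoretical payload ceilings involved (the encoding figures come from the general USB and SATA specs, not from the Tom's Hardware article):

```python
# Line rate times encoding efficiency gives the theoretical payload ceiling.
def ceiling_mb_s(line_rate_gbps, encoding_efficiency):
    return line_rate_gbps * 1e9 * encoding_efficiency / 8 / 1e6

print(f"USB 3.0 (5 Gb/s, 8b/10b):     ~{ceiling_mb_s(5, 8/10):.0f} MB/s")
print(f"USB 3.1 (10 Gb/s, 128b/132b): ~{ceiling_mb_s(10, 128/132):.0f} MB/s")
print(f"SATA 6 Gb/s (8b/10b):         ~{ceiling_mb_s(6, 8/10):.0f} MB/s")
# A single SATA SSD tops out around 600 MB/s, so it takes two in RAID to
# push past what USB 3.0 could carry -- hence the >700 MB/s result.
```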
Perhaps the most interesting part of Tom's Hardware's testing is their CPU usage benchmark. While USB 3.0 on Intel's controller choked a CPU thread, USB 3.1 on ASMedia's controller did not even reach half of a thread's maximum (the CPU in question is a Core i7-5930K Haswell-E at 3.5 GHz).
So until we get flash drives that are actually constrained by USB 3.0's fairly high ceiling, reduced CPU usage might be the main benefit.