Subject: General Tech, Graphics Cards | March 22, 2014 - 12:43 AM | Scott Michaud
NVIDIA's Release 340.xx GPU drivers for Windows will be the last to contain "enhancements and optimizations" for users with video cards based on architectures before Fermi. While NVIDIA will provide some extended support for 340.xx (and earlier) drivers until April 1st, 2016, those older cards will not be able to install Release 343.xx (or later) drivers. Release 343 will only support Fermi, Kepler, and Maxwell-based GPUs.
The company has a large table on their CustHelp website filled with product models that are pining for the fjords. In short, if the model is 400-series or higher (except the GeForce 405) then it is still fully supported. If you do have the GeForce 405, or anything 300-series and prior, then GeForce Release 340.xx drivers will be the end of the line for you.
As for speculation, Fermi was the first modern GPU architecture for NVIDIA. It transitioned to standards-based (IEEE 754, etc.) data structures, introduced L1 and L2 cache, and so forth. From our DirectX 12 live blog, we also noticed that the new graphics API will, likewise, begin support at Fermi. It feels to me that NVIDIA, like Microsoft, wants to shed the transition period and work on developing a platform built around that baseline.
Subject: General Tech | March 21, 2014 - 01:32 PM | Jeremy Hellstrom
Tagged: internet of things
Call it ubiquitous computing, the internet of things, or whichever moniker you prefer, but what you are describing remains the same: the network of internet-enabled sensors, identifiers, and broadcasters that now surrounds us. For many people this is not something they think about, but for the technologically inclined it is a topic of much interest, with many concerns to address, chief among them privacy. The Inquirer has been hosting discussions on this topic and on the possibilities created by the massive growth of the internet of things. Check out the debate on how this will impact privacy here.
"THE INTERNET OF THINGS is a term that has been around for about 15 years, with its origins in barcodes and radio frequency identity (RFID) tags, and evolving via near-field communications (NFC) and QR codes. But it's the rise of smart devices and wearable technology, which has only started to take off in the past few years, that will see the Internet of Things come into its own."
Here is some more Tech News from around the web:
- A sysadmin always comes prepared: Grasp those essential tools @ The Register
- Don’t Just Go Sticking That Anywhere: Protect the Precious With a USB Wrapper @ Hack a Day
- Linux May Succeed Windows XP As OS of Choice For ATMs @ Slashdot
- Netgear Nighthawk R7000 AC1900 Wireless Router Review @ Legit Reviews
- Intel ready to launch 4 new desktop CPUs in Mid 2014 @ Madshrimps
- EA games web server was hosting PHISHING SITE – securobod @ The Register
- 2014 Game Developers Conference @ Benchmark Reviews
Subject: General Tech | March 20, 2014 - 04:05 PM | Ken Addison
Tagged: podcast, video, gdc14, haswell, Haswell-E, Broadwell, devil's canyon, Intel, amd, Mantle, dx12, nvidia, gtx 750ti, evga, pny, galaxy
PC Perspective Podcast #292 - 03/20/2014
Join us this week as we discuss Haswell-E, Iris Pro in Broadwell, our 750 Ti Roundup and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath and Allyn Malventano
We have our winners! Win a GeForce GTX 750 Ti by Showing Off Your Upgrade-Worthy Rig!
Week in Review:
0:34:44 This podcast is brought to you by Coolermaster, and the CM Storm Pulse-R Gaming Headset
News items of interest:
0:40:25 An SSD Supercomputer?
0:57:00 Busy week to be a GPU-accelerated software developer
Hardware/Software Picks of the Week:
Jeremy: Windows Mobile SSLChainSaver
Subject: General Tech | March 20, 2014 - 03:01 PM | Jeremy Hellstrom
Tagged: bitcoin, dogecoin, internet of things, cryptocurrency
Not content to ruin the hopes of gamers wanting to upgrade to a Hawaii-based AMD GPU, cryptocurrency mining is now press-ganging your smart devices into service. Everything from TVs and fridges to printers, routers, and security cameras can be infected with the Linux.Darlloz worm, which then begins mining on behalf of its author. The worm will block other infections, and has even been observed patching certain holes in routers to prevent anything else from infecting the device and slowing down the mining computations. The Inquirer does have some humorous news about this worm: across the 31,716 separate IP addresses infected, it has managed to raise a mere $196.00 so far for the author.
"A WORM that leverages the Internet of Things to mine cryptocurrencies has been found to have infected around 31,000 devices."
Here is some more Tech News from around the web:
- MSI Shows Off New Gaming Notebooks @ Kitguru
- Exclusive Interview with Richard Huddy from Intel at GDC @ Kitguru
- Top 10 Google Chromecast apps you should install @ The Inquirer
- Azure promises to guard virtual machines against migration dangers @ The Register
- 'Software amplifier' boosts quantum signals @ The Register
- Intel Desktop Roadmap Update - Devil's Canyon, Broadwell @ Legit Reviews
- Intel Talks Haswell-E, Broadwell, Devil's Canyon & More @ Hardware Canucks
- Intel to renew commitment to desktop PCs with a slew of new CPUs
- We're being royalty screwed! Pandora blames price rise on musos wanting money @ The Register
Subject: General Tech, Graphics Cards | March 19, 2014 - 08:26 PM | Ryan Shrout
Tagged: live blog, gdc 14, dx12, DirectX 12, DirectX
UPDATE: If you are looking for the live blog information including commentary and photos, we have placed it in archive format right over here. Thanks!!
It is nearly time for Microsoft to reveal the secrets behind DirectX 12 and what it will offer PC gaming going forward. I will be in San Francisco for the session and will be live blogging from it as networking allows. We'll have a couple of other PC Perspective staffers chiming in as well, so it should be an interesting event for sure! We don't know how much detail Microsoft is going to get into, but we will all know soon.
Microsoft DirectX 12 Session Live Blog
Thursday, March 20th, 10am PDT
You can sign up for a reminder using the CoverItLive interface below or you can sign up for the PC Perspective Live mailing list. See you Thursday!
Subject: General Tech, Shows and Expos | March 19, 2014 - 08:15 PM | Scott Michaud
Tagged: unreal engine 4, gdc 14, GDC, epic games
Game developers, from indie to gigantic, can now access Unreal Engine 4 with a $19/month subscription (plus 5% of revenue from resulting sales). This is a much different model from UDK, which was free for developing games with its precompiled builds until commercial release, at which point an upfront fee and a 25% royalty applied. For Unreal Engine 4, however, this $19 monthly fee also grants full C++ source code access (something I have wondered about since the announcement that UnrealScript no longer exists).
Of course, the Unreal Engine 3-based UDK is still available (and just recently updated).
This is definitely interesting and, I believe, a response to publishers doubling down on developing their own engines. EA has basically sworn off engines outside of its own Frostbite and Ignite technologies. Ubisoft has announced or released only three games based on Unreal Engine since 2011; Activision has announced or released seven in that time, three of which came in that first year. Epic Games has always been very friendly to smaller developers and, with the rise of the internet, it is becoming much easier for indie developers to release content through Steam or even their own websites. These developers now have a "AAA" engine, which I think almost anyone would agree Unreal Engine 4 is, with an affordable license (and full source access).
Speaking of full source access, licensees can access the engine at Epic's GitHub. While a top-five publisher might hesitate to share fixes and patches, the army of smaller developers might share and share alike. This could lead to Unreal Engine 4 acquiring its own features rapidly. Epic highlights their Oculus VR, Linux and Steam OS, and native HTML5 initiatives but, given community support, there could be pushes into unofficial support for Mantle, TrueAudio, or other technologies. Who knows?
A sister announcement, albeit a much smaller one, is that Unreal Engine 4 is now part of NVIDIA's GameWorks initiative. This integrates various NVIDIA SDKs, such as PhysX, into the engine. The press release quote from Tim Sweeney is as follows:
Epic developed Unreal Engine 4 on NVIDIA hardware, and it looks and runs best on GeForce.
Another brief mention is that Unreal Engine 4 will have expanded support for Android.
Subject: General Tech, Shows and Expos | March 19, 2014 - 05:00 PM | Scott Michaud
Tagged: Mantle, gdc 14, GDC, crytek, CRYENGINE
While I do not have too many details otherwise, Crytek and AMD have announced that mainline CRYENGINE will support the Mantle graphics API. CRYENGINE, by Crytek, now sits alongside Frostbite, by DICE, and Nitrous, by Oxide Games, as engines that support this alternative to DirectX and OpenGL. This comes little more than a week after Crytek's announcement of native Linux support for its popular engine.
The tape has separate draw calls!
Crytek has been "evaluating" the API for quite some time now, showing interest back at the AMD Developer Summit. Since then, they have apparently made a clear decision on it. It is also not the first time that CRYENGINE has been publicly introduced to Mantle, with Chris Robert's Star Citizen, also powered by the 4th Generation CRYENGINE, having announced support for the graphics API. Of course, there is a large gap between having a licensee do legwork to include an API and having the engine developer provide you supported builds (that would be like saying UnrealEngine 3 supports the original Wii).
Hopefully we will learn more as GDC continues.
Editor's (Ryan) Take:
As the week at GDC has gone on, AMD continues to push forward with Mantle, calling Crytek's implementation of the low-level API "a huge endorsement" of the company's direction and vision for the future. Many, including myself, have considered that the pending announcement of DX12 would be a major setback for Mantle, but AMD claims that view is "short-sighted" and that, as more developers come into the Mantle ecosystem, it is proof AMD is doing the "right thing."
Here at GDC, AMD told us they have expanded the number of beta Mantle members dramatically with plenty more applications (dozens) in waiting. Obviously this could put a lot of strain on AMD for Mantle support and maintenance but representatives assure us that the major work of building out documentation and development tools is nearly 100% behind them.
If stories like this one over at SemiAccurate are true and Microsoft's DirectX 12 really is nearly identical to AMD's Mantle, then it makes sense that developers serious about new gaming engines can get a leg up on projects by learning Mantle today. Applying that knowledge to the DX12 API upon its release could speed up development and improve implementation efficiency. From what I am hearing from the few developers willing to even mention DX12, Mantle is much further along in its release (late beta) than DX12 is (early alpha).
AMD was indeed talking with Microsoft and sharing the development of Mantle "every step of the way," and AMD has stated on several occasions that there were two possible outcomes for Mantle: it either becomes, or inspires, a new industry standard in game development. Even if DX12 is more or less a carbon copy of Mantle, forcing NVIDIA to implement that API style with DX12's release, AMD could potentially hold an advantage in gaming performance and support between now and Microsoft's DirectX release. That could be as much as a full calendar year away, from reports we are getting at GDC.
Subject: General Tech | March 19, 2014 - 04:15 PM | Jeremy Hellstrom
Tagged: Homeworld, Homeworld 2, kick ass, Homeworld Remastered, gearbox
If you never had the chance to play Homeworld and Homeworld 2, then there is some good news you should hear; if you did play them, then the news is even better! Gearbox is not just re-releasing the two games: they have redone the cinematics, models, textures, and effects, and added support for up to 4K resolution. The news at Rock, Paper, SHOTGUN makes it sound like some content from mods might be included, as well as revamped multiplayer. Maybe they will up the quality to the point where some later levels chug just as hard as they did when you first sat down in front of your mothership.
"We learned last year that Gearbox were planning to re-release the enormously loved Homeworld games. Having plucked the license from the THQ jumble sale, apparently because their Chief Creative Officer Brian Martel has maximum love for the series, they made clear their intentions to release an HD version of the first two games. They’re upping their terminology now, from “HD” to “Remastered“."
Here is some more Tech News from around the web:
- Nostalgiablivion: Morrowind's First Quests In Skywind @ Rock, Paper, SHOTGUN
- Frog Fractions 2′s Kickstarter Exists, Is Utterly Insane @ Rock, Paper, SHOTGUN
- Boooooo! - The Witcher 3 Delayed To 2015 @ Rock, Paper, SHOTGUN
- Titanfall @ The Inquirer
- AMD Mantle & TrueAudio Patch for THIEF Tested @ Legit Reviews
- Sony announces Project Morpheus virtual reality headset @ Hexus
Subject: General Tech, Shows and Expos | March 19, 2014 - 01:20 PM | Jeremy Hellstrom
Tagged: Imagination Technologies, gdc 14, wizard, ray tracing
The Tech Report visited Imagination Technologies' booth at GDC, where they were showing off a new processor, the Wizard GPU. It is based on the PowerVR Series6XT Rogue graphics processor and is specifically designed to accelerate ray tracing, a topic we haven't heard much about lately. Imagination describes the performance as 300 million rays and 100 million dynamic triangles per second, which translates to 7 to 10 rays per pixel at 720p and 30Hz, or 3 to 5 rays per pixel at 1080p and 30Hz. That is not bad, though Imagination Technologies estimates that movies render at a rate of 16 to 32 rays per pixel, so it may be a while before we see a Ray Tracing slider under Advanced Graphics Options.
"When we visited Imagination Technologies at CES, they were showing off some intriguing hardware that augments their GPUs in order to accelerate ray-traced rendering. Ray tracing is a well-known and high-quality form of rendering that relies on the physical simulation of light rays bouncing around in a scene. Although it's been used in movies and in static scene creation, ray tracing has generally been too computationally intensive to be practical for real-time graphics and gaming. However, Imagination Tech is looking to bring ray-tracing to real-time graphics—in the mobile GPU space, no less—with its new family of Wizard GPUs."
Here is some more Tech News from around the web:
- MoOx contacts make p-type transistor @ Nanotechweb
- Surrender your crypto keys or you're off to chokey, says Australia @ The Register
- Win XP holdouts storm eBay and licence brokers, hiss: Give us all your Windows 7 @ The Register
- Ubuntu Now Runs Well On The MacBook Air, Beats OS X In Graphics @ Phoronix
- Hidden 'Windigo' UNIX ZOMBIES are EVERYWHERE @ The Register
- Xbox boss Marc Whitten leaves Microsoft for Sonos as PS4 leads console sales @ The Inquirer
- Big Brother China Censors WeChat... Again @ TechARP
- Ergotech Freedom Quad 1-over-3 Desk Stand Review @ Techgage
- 10 Old Sprint Phones Can Now Get Totally Free Voice, Texts, and Data @ Gizmodo
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | March 19, 2014 - 09:03 AM | Scott Michaud
Tagged: WebCL, gdc 14, GDC
The Khronos Group has just ratified the WebCL 1.0 standard. The API is expected to provide a massive performance boost to web applications dominated by expensive functions that can be offloaded to parallel processors, such as GPUs and multi-core CPUs. The specification also allows WebCL to share buffers with WebGL via an extension.
Frequent readers of the site might remember that I have a particular interest in WebCL. Based on OpenCL, it allows web apps to obtain a list of every available compute device and target it for workloads. I have personally executed tasks on an NVIDIA GeForce 670 discrete GPU and other jobs on my Intel HD 4000 iGPU, at the same time, using the WebCL prototype from Tomi Aarnio of Nokia Research. The same is true for users with multiple discrete GPUs installed in their system (even if they are not compatible with CrossFire or SLI, or are from different vendors altogether). This could be very useful for physics, AI, lighting, and other game middleware packages.
Still, browser adoption might be rocky for quite some time. Google, Mozilla, and Opera Software were each involved in the working draft, which leaves both Apple and Microsoft notably absent. Even then, I am not sure how much interest exists within Google, Mozilla, and Opera to take it from a specification to a working feature in their browsers. Some individuals have expressed more faith in WebGL compute shaders than in WebCL.
Of course, that can change with just a single "killer app", library, or middleware.
I do expect some resistance from the platform holders, however. Even Google has been pushing back on OpenCL support in Android, in favor of its "Renderscript" abstraction. The performance of a graphics processor is also significant leverage for a native app; there is little, otherwise, that cannot be accomplished with web standards, except a web browser itself (and there are even some non-serious projects for that). If Microsoft can support WebGL, however, there is always hope.
The specification is available at the Khronos website.