The Rift between Oculus, Kickstarter and you

Subject: General Tech | March 26, 2014 - 02:48 PM |
Tagged: oculus rift, Kickstarter, john carmack, facebook

You've heard by now that Facebook has purchased Oculus, and you likely have an opinion on the matter.  The sale raises quite a few issues for the technologically inclined.  Kickstarter backers may question the propriety of vulture capitalists profiting from a project which began in part because of their donations, even if those donations did net them a device.  Others had hoped Oculus would remain a project designed and led by Palmer Luckey, with John Carmack involved, and with little oversight or pressure from a company that wants an immediate return on its investment.  For some, the simple involvement of Facebook is enough to sour the entire deal regardless of any other factors.

KitGuru offers some possible benefits that could come of this deal: Facebook cannot afford to slow development as competitors such as castAR will soon arrive, nor can it really push Carmack around without risking his involvement.  Before you start screaming, take a moment to think about everything this deal involves and then express your opinion ... after all, you don't get a reality much more virtual than Facebook.

oculus.jpg

"I know guys. I know. I’m mad too. I’m sad, disappointed, even betrayed, but these are all things I’m feeling and I bet you are too. We’re having an emotional reaction to two companies worth multiple billions of dollars doing a business deal and though I can’t help but wish it hadn’t happened, I know that if I look at it logically, it makes sense for everyone."

Here is some more Tech News from around the web:

Tech Talk

Source: KitGuru

NVIDIA Launches Jetson TK1 Mobile CUDA Development Platform

Subject: General Tech, Mobile | March 25, 2014 - 09:34 PM |
Tagged: GTC 2014, tegra k1, nvidia, CUDA, kepler, jetson tk1, development

NVIDIA recently unified its desktop and mobile GPU lineups by moving to a Kepler-based GPU in its latest Tegra K1 mobile SoC. The move to the Kepler architecture has simplified development and enabled the CUDA programming model to run on mobile devices. One of the main points of the opening keynote earlier today was ‘CUDA everywhere,’ and NVIDIA has officially accomplished that goal by having CUDA compatible hardware from servers to desktops to tablets and embedded devices.

Speaking of embedded devices, NVIDIA showed off a new development board called the Jetson TK1. This tiny new board features an NVIDIA Tegra K1 SoC at its heart along with 2GB of RAM and 16GB of eMMC storage. The Jetson TK1 supports a plethora of I/O options including an internal expansion port (GPIO compatible), SATA, one half-mini PCI-e slot, serial, USB 3.0, micro USB, Gigabit Ethernet, analog audio, and HDMI video output.

NVIDIA Jetson TK1 Mobile CUDA Development Board.jpg

Of course, the Tegra K1 pairs a quad-core (4+1) ARM CPU with a Kepler-based GPU containing 192 CUDA cores. The SoC is rated at 326 GFLOPS, which enables some interesting compute workloads, including machine vision.
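As a sanity check on that rating: assuming one fused multiply-add (two floating-point operations) per CUDA core per clock, and a GPU clock of roughly 850 MHz (an inferred figure, not an official NVIDIA spec), the numbers line up:

```python
# Back-of-the-envelope single-precision throughput estimate.
# Assumes 2 FLOPs (one fused multiply-add) per CUDA core per clock;
# the ~0.85 GHz GPU clock is inferred from the quoted rating, not an
# official NVIDIA figure.
def peak_gflops(cuda_cores, clock_ghz, flops_per_core_per_clock=2):
    return cuda_cores * clock_ghz * flops_per_core_per_clock

print(round(peak_gflops(192, 0.85), 1))  # 326.4, matching the quoted 326 GFLOPS
```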

Computer Vision On NVIDIA CUDA.jpg

In fact, Audi has been utilizing the Jetson TK1 development board to power its self-driving prototype car (more on that soon). Other intended uses for the new development board include robotics, medical devices, security systems, and perhaps low-power compute clusters (such as an improved Pedraforca system). It can also be used as a simple desktop platform for testing and developing mobile applications for other Tegra K1 powered devices, of course.

NVIDIA VisionWorks GTC 2014.jpg

Beyond the hardware, the Jetson TK1 comes with the CUDA toolkit, OpenGL 4.4 driver, and NVIDIA VisionWorks SDK which includes programming libraries and sample code for getting machine vision applications running on the Tegra K1 SoC.

The Jetson TK1 is available for pre-order now at $192 and is slated to begin shipping in April. Interested developers can find more information on the NVIDIA developer website.

 

GTC 2014: NVIDIA Shows Off New Dual GK110 GPU GTX TITAN Z Graphics Card

Subject: General Tech | March 25, 2014 - 05:46 PM |
Tagged: gtx titan z, gtx titan, GTC 2014, CUDA

During the opening keynote, NVIDIA showed off several pieces of hardware that will be available soon. On the desktop and workstation side of things, researchers (and consumers chasing the ultra high end) have the new GTX Titan Z to look forward to. This new graphics card is a dual GK110 GPU monster that offers up 8 TeraFLOPS of number crunching performance for an equally impressive $2,999 price tag.

DSC01411.JPG

Specifically, the GTX TITAN Z is a triple-slot graphics card that marries two full GK110 (big Kepler) GPUs for a total of 5,760 CUDA cores, 480 TMUs, and 96 ROPs with 12GB of GDDR5 memory (6GB on a 384-bit bus per GPU). NVIDIA has yet to release clockspeeds, but the two GPUs will run at the same clocks with a dynamic power balancing feature. For the truly adventurous, it appears possible to SLI two GTX TITAN Z cards using the single SLI connector. Display outputs include two DVI, one HDMI, and one DisplayPort connector.
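Since the clocks are unannounced, the quoted 8 TFLOPS can be turned around to estimate them. Assuming the usual two FLOPs (one FMA) per CUDA core per clock, a hedged back-of-the-envelope sketch:

```python
# NVIDIA has not released clocks, so work backwards from the quoted
# 8 TFLOPS figure, assuming 2 FLOPs (one FMA) per CUDA core per clock.
# The result is an implied average clock, not an announced spec.
def implied_clock_mhz(tflops, cuda_cores, flops_per_core_per_clock=2):
    return tflops * 1e6 / (cuda_cores * flops_per_core_per_clock)

print(round(implied_clock_mhz(8.0, 5760)))  # 694 MHz, averaged across both GPUs
```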

NVIDIA is cooling the card using a single fan and two vapor chambers. Air is drawn inwards and exhausted out of the front exhaust vents.

DSC01415.JPG

In short, the GTX Titan Z is NVIDIA's new number crunching king and should find its way into servers and workstations running big data analytics and simulations. Personally, I'm looking forward to seeing someone slap two of them into a gaming PC and watching the screen catch on fire (not really).

What do you think about the newest dual GPU flagship?

Stay tuned to PC Perspective for further GTC 2014 coverage!

NVIDIA SHIELD: New Features and Promotional Price Cut

Subject: General Tech, Graphics Cards, Mobile | March 25, 2014 - 03:01 PM |
Tagged: shield, nvidia

The SHIELD from NVIDIA is getting a software update which advances GameStream and TegraZone, and brings the Android OS itself to KitKat. Personally, the GameStream enhancements seem most notable, as users can now access their home PC's gaming content outside of the home, as if it were a cloud server (but some other parts were interesting, too). Also, from now until the end of April, NVIDIA has temporarily cut the price to $199.

nvidia-shield-gamestream-01.jpg

Going into more detail: GameStream, now out of beta, will stream games which are rendered on your gaming PC to your SHIELD. Typically, we have seen this through "cloud" services, such as OnLive and Gaikai, which allow access to a set of games that run on their servers (with varying license models). The fear with these services is the lack of ownership, but the advantage is that the slave device just needs enough power to decode an HD video stream.

nvidia-shield-gamestream-02.jpg

In NVIDIA's case, the user owns both server (their standard NVIDIA-powered gaming PC, which can now be a laptop) and target device (the SHIELD). This technology was once limited to your own network (which definitely has its uses, especially for the SHIELD as a home theater device) but now can also be exposed over the internet. For this technology, NVIDIA recommends 5 megabit upload and download speeds - which is still a lot of upload bandwidth, even for 2014. In terms of performance, NVIDIA believes that it should live up to expectations set by their GRID. I do not have any experience with this, but others on the conference call took it as good news.

As for content, NVIDIA has expanded the number of supported titles to over a hundred, including new entries: Assassin's Creed IV, Batman: Arkham Origins, Battlefield 4, Call of Duty: Ghosts, Daylight, Titanfall, and Dark Souls II. They also claim that users can add other, unofficially supported apps for streaming; Halo 2: Vista was mentioned as an example. Frame rate and bitrate can now be set by the user, and a Bluetooth mouse and keyboard can be paired to SHIELD for that input type through GameStream.

nvidia-shield-checkbox.jpg

Yeah, I don't like checkbox comparisons either. It's just a summary.

A new TegraZone was also briefly mentioned. Its main upgrade is apparently its library interface. There have also been a number of PC titles ported to Android recently, such as Mount and Blade: Warband.

The update is available now and the $199 promotion will last until the end of April.

Source: NVIDIA

Valve Ports Portal To NVIDIA Shield Gaming Handheld

Subject: General Tech | March 25, 2014 - 02:33 PM |

During the opening keynote of NVIDIA's GTC 2014 conference, company CEO Jen-Hsun Huang announced that Valve had ported the ever-popular "Portal" game to the NVIDIA SHIELD handheld gaming platform.

The game appeared to run smoothly on the portable device, and is a worthy addition to the catalog of local games that can be run on the SHIELD.

DSC01456.JPG

Additionally, while the cake may still be a lie, portable gaming systems apparently are not: Jen-Hsun Huang revealed that all GTC attendees will be getting a free SHIELD.

Stay tuned to PC Perspective for more information on all the opening keynote announcements and their implications for the future of computing!

GPU Technology Conference 2014 resources:

Keep up with GTC 2014 throughout the week by following the NVIDIA blog (blogs.nvidia.com) and the GTC tag on PC Perspective!

Ruinous Text Format; watch those attachments!

Subject: General Tech | March 25, 2014 - 12:59 PM |
Tagged: rtf, microsoft, outlook, word, fud

Users of Microsoft Word, from the 2003 release to the current version on PC, or the 2011 version for Mac (which includes any version of Outlook or other Microsoft application in which Word is the default text editor), may want to avoid RTF attachments for the next while.  There is an exploit in the wild which could allow a nefariously modified RTF file to give an attacker access to the machine on which it was opened, at the same privilege level as the user.  This means that those who follow the advice of most Windows admins and do not log in to an administrator-level account for day-to-day work need not worry overly, but those who ignore that advice may find themselves compromised.  As The Register points out, just previewing the attachment in Outlook is enough to trigger a possible infection.

computer-virus_thumb.jpg

"Microsoft has warned its Word software is vulnerable to a newly discovered dangerous bug – which is being exploited right now in "limited, targeted attacks" in the wild. There is no patch available at this time."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

For when you need a keyboard that stands out

Subject: General Tech | March 24, 2014 - 01:22 PM |
Tagged: input, ducky, Cherry MX

If a plain keyboard that doesn't stand out in a crowd won't satisfy you, and you also care about the quality of the board, then the Ducky Shine 3 is a keyboard you should be aware of.  Your choice of Cherry MX switches ensures a proper mechanical feel to your key presses, and an array of LED lights will make this keyboard stand out from across the room.  As you can see from the picture, this isn't just backlit keys; a glowing snake on the space bar and lights on every key make this board rather unique.  If flashy keyboards are your thing, check out Benchmark Reviews' article here.

unnamed.png

"The Ducky Shine Series, arguably one of the best mechanical keyboards on the market, has released the Ducky Shine 3 DK9008S3. Often referred to as the YOTS or “Year of the Snake”, the 2013 Shine 3 is the offshoot descendant of the 2012 Year of the Dragon Shine 2 DK9087 (a tenkeyless version in the shine series). This model, like it’s predecessor, comes with a wide array of switch options including Cherry MX Black, Blue, Brown, and Red, and a wide array of LED color options including: Blue, Red, Green, White, Magenta, and Orange."

Here is some more Tech News from around the web:

Tech Talk

GDC wasn't just about DirectX; OpenGL was also a hot topic

Subject: General Tech | March 24, 2014 - 12:26 PM |
Tagged: opengl, nvidia, gdc 14, GDC, amd, Intel

DX12 and its Mantle-like qualities garnered the most interest from gamers at GDC, but an odd trio of companies was also pushing a different API.  OpenGL has been around for over 20 years and has waged a long war against Direct3D, a war which may be intensifying again.  Representatives from Intel, AMD, and NVIDIA all took to the stage to praise the OpenGL standard, suggesting that with a tweaked implementation of OpenGL developers could expect performance increases of 7 to 15 times.  The Inquirer has embedded an hour-long video in its story; check it out to learn more.

slide-1-638.jpg

"CHIP DESIGNERS AMD, Intel and Nvidia teamed up to tout the advantages of the OpenGL multi-platform application programming interface (API) at this year's Game Developers Conference (GDC)."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

GDC 14: NVIDIA, AMD, and Intel Discuss OpenGL Speed-ups

Subject: General Tech, Shows and Expos | March 22, 2014 - 01:41 AM |
Tagged: opengl, nvidia, Intel, gdc 14, GDC, amd

So, for all the discussion about DirectX 12, the three main desktop GPU vendors, NVIDIA, AMD, and Intel, want to tell OpenGL developers how to tune their applications. Using OpenGL 4.2 and a few cross-vendor extensions, because OpenGL is all about its extensions, a handful of known tricks can reduce driver overhead up to ten-fold and increase performance up to fifteen-fold. The talk is very graphics developer-centric, but it basically describes a series of tricks known to accomplish feats similar to what Mantle and DirectX 12 suggest.

opengl_logo.jpg

The 130-slide presentation is broken into a few sections, each GPU vendor getting a decent chunk of time. On occasion, they would mention which implementation fares better with a given function call. The main point that they wanted to drive home (since they clearly repeated the slide three times with three different fonts) is that none of this requires a new API: everything exists and can be implemented right now. The real trick is knowing how not to poke the graphics library in the wrong way.

The page also hosts a keynote from the recent Steam Dev Days.

That said, an advantage that I expect from DirectX 12 and Mantle is reduced driver complexity. Since the processors have settled into standards, I expect that drivers will not need to do as much unless the library demands it for legacy reasons. I am not sure how extending OpenGL will affect that benefit, as opposed to just isolating the legacy and building on a solid foundation, but I wonder if these extensions could be just as easy to maintain and optimize. Maybe they can be.

Either way, the performance figures do not lie.

Source: NVIDIA

Oculus Rift Development Kit 2 (DK2) Is $350, Expected July

Subject: General Tech, Displays, Shows and Expos | March 22, 2014 - 01:04 AM |
Tagged: oculus rift, Oculus, gdc 14, GDC

Last month, we published a news piece stating that Oculus Rift production has been suspended as "certain components" were unavailable. At the time, the company said they are looking for alternate suppliers but do not know how long that will take. The speculation was that the company was simply readying a new version and did not want to cannibalize their sales.

This week, they announced a new version which is available for pre-order and expected to ship in July.

DK2, as it is called, integrates a pair of 960x1080 OLED displays (correction, March 22nd @ 3:15pm: it is technically a single 1080p display that is divided per eye) for higher resolution and lower persistence. Citing Valve's VR research, they claim that the low persistence will reduce motion blur as your eye blends neighboring frames together. In this design, the panel flashes the image for a short period before going black, and does so at a high enough rate to keep your eye fed with light. The higher resolution also reduces the "screen door effect" complained about in the first release. Like their "Crystal Cove" prototype, it also uses an external camera to reduce latency in detecting your movement. All of these should combine to reduce motion sickness.
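To see why persistence matters for motion blur, a rough model is that the perceived smear equals the object's on-screen speed multiplied by how long each frame stays lit. The speed and persistence values in this sketch are illustrative assumptions, not Oculus specifications:

```python
# Perceived smear when your eye tracks a moving object is roughly the
# on-screen speed multiplied by the time the panel stays lit per frame.
# The speed and persistence values below are illustrative assumptions,
# not Oculus specifications.
def smear_px(speed_px_per_s, persistence_ms):
    return speed_px_per_s * persistence_ms / 1000.0

full_persistence = smear_px(1000.0, 13.3)  # panel lit for an entire ~75 Hz frame
low_persistence = smear_px(1000.0, 2.0)    # brief flash, then black
print(full_persistence, low_persistence)   # the short flash smears far less
```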

I would expect that VR has a long road ahead of it before it becomes a commercial product for the general population, though. There are many legitimate concerns about leaving your users trapped in a sensory deprivation apparatus when Kinect could not even go a couple of days without someone pretending to play volleyball and wrecking their TV with ceiling fan fragments. Still, this company seems to be doing it intelligently: keep afloat on developers and lead users as you work through your prototypes. It is cool, even if it will get significantly better, and people will support its research while getting the best at the time.

DK2 is available for pre-order for $350 and is expected to ship in July.

Source: Oculus

NVIDIA GPUs pre-Fermi Are End of Life After 340.xx Drivers

Subject: General Tech, Graphics Cards | March 22, 2014 - 12:43 AM |
Tagged: nvidia

NVIDIA's Release 340.xx GPU drivers for Windows will be the last to contain "enhancements and optimizations" for video cards based on architectures before Fermi. While NVIDIA will provide some extended support for 340.xx (and earlier) drivers until April 1st, 2016, those cards will not be able to install Release 343.xx (or later) drivers. Release 343 will only support Fermi, Kepler, and Maxwell-based GPUs.

nvidia-geforce.png

The company has a large table on their CustHelp website filled with product models that are pining for the fjords. In short, if the model is 400-series or higher (except the GeForce 405) then it is still fully supported. If you do have the GeForce 405, or anything 300-series and prior, then GeForce Release 340.xx drivers will be the end of the line for you.

As for speculation, Fermi was the first modern GPU architecture for NVIDIA. It transitioned to standards-based (IEEE 754, etc.) data structures, introduced L1 and L2 cache, and so forth. From our DirectX 12 live blog, we also noticed that the new graphics API will, likewise, begin support at Fermi. It feels to me that NVIDIA, like Microsoft, wants to shed the transition period and work on developing a platform built around that baseline.

Source: NVIDIA

How scary is the internet of things?

Subject: General Tech | March 21, 2014 - 01:32 PM |
Tagged: internet of things

Call it ubiquitous computing, the internet of things, or whichever moniker you prefer, but what you are describing remains the same: the network of internet-enabled sensors, identifiers, and broadcasters which now surrounds us.  For many this is not something they think about, but for the technologically inclined it is a topic of much interest, with many concerns to address, chief among them privacy.  The Inquirer has been hosting discussions on this topic and the possibilities created by the massive growth of the internet of things.  Check out the debate on how this will impact privacy here.

Android_Wear_lead.jpg

"THE INTERNET OF THINGS is a term that has been around for about 15 years, with its origins in barcodes and radio frequency identity (RFID) tags, and evolving via near-field communications (NFC) and QR codes. But it's the rise of smart devices and wearable technology, which has only started to take off in the past few years, that will see the Internet of Things come into its own."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

Podcast #292 - Haswell-E, Iris Pro in Broadwell, our 750 Ti Roundup and more!

Subject: General Tech | March 20, 2014 - 04:05 PM |
Tagged: podcast, video, gdc14, haswell, Haswell-E, Broadwell, devil's canyon, Intel, amd, Mantle, dx12, nvidia, gtx 750ti, evga, pny, galaxy

PC Perspective Podcast #292 - 03/20/2014

Join us this week as we discuss Haswell-E, Iris Pro in Broadwell, our 750 Ti Roundup and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath and Allyn Malventano

 
This podcast is brought to you by Coolermaster, and the CM Storm Pulse-R Gaming Headset!
 
Program length: 1:32:09
 
  1. Week in Review:
  2. 0:34:44 This podcast is brought to you by Coolermaster, and the CM Storm Pulse-R Gaming Headset
  3. News items of interest:
    1. 0:57:00 Busy week to be a GPU-accelerated software developer
  4. Hardware/Software Picks of the Week:
  5. Closing/outro

Be sure to subscribe to the PC Perspective YouTube channel!!

 

Wonder why you can hear the sounds of pickaxes when you make breakfast?

Subject: General Tech | March 20, 2014 - 03:01 PM |
Tagged: bitcoin, dogecoin, internet of things, cryptocurrency

Not content with ruining the hopes of gamers wanting to upgrade to a Hawaii-based AMD GPU, crypto-currency miners are now press-ganging your smart devices into service.  Everything from TVs and fridges through printers, routers, and security cameras can be infected with the Linux.Darlloz worm, which will then begin mining for the worm's author.  The worm will even block other infections, and has been observed patching certain holes in routers to prevent anything else from infecting the device and slowing down the mining computations.  The Inquirer does have some humorous news about this worm: although there are 31,716 separate IP addresses infected, this has managed to raise a mere $196.00 so far for the author.

dogecoin_dce1cafbbbc0db017f839f11970703b8.jpg

"A WORM that leverages the Internet of Things to mine cryptocurrencies has been found to have infected around 31,000 devices."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Set your calendar! PC Perspective GDC 14 DirectX 12 Live Blog is Coming!

Subject: General Tech, Graphics Cards | March 19, 2014 - 08:26 PM |
Tagged: live blog, gdc 14, dx12, DirectX 12, DirectX

UPDATE: If you are looking for the live blog information including commentary and photos, we have placed it in archive format right over here.  Thanks!!

It is nearly time for Microsoft to reveal the secrets behind DirectX 12 and what it will offer PC gaming going forward.  I will be in San Francisco for the session and will be live blogging from it as networking allows.  We'll have a couple of other PC Perspective staffers chiming in as well, so it should be an interesting event for sure!  We don't know how much detail Microsoft is going to get into, but we will all know soon.

dx12.jpg

Microsoft DirectX 12 Session Live Blog

Thursday, March 20th, 10am PDT

http://www.pcper.com/live

You can sign up for a reminder using the CoverItLive interface below or you can sign up for the PC Perspective Live mailing list. See you Thursday!

GDC 14: Unreal Engine 4 Launches with Radical Changes

Subject: General Tech, Shows and Expos | March 19, 2014 - 08:15 PM |
Tagged: unreal engine 4, gdc 14, GDC, epic games

Game developers, from indie to gigantic, can now access Unreal Engine 4 with a $19/month subscription (plus 5% of revenue from resulting sales). This is a much different model from UDK, which was free for developing games with precompiled builds until commercial release, at which point an upfront fee and a 25% royalty applied. For Unreal Engine 4, however, this $19 monthly fee also gives you full C++ source code access (which I have wondered about since the announcement that UnrealScript no longer exists).
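To put the two models side by side, here is a rough cost comparison for a hypothetical title. The UE4 terms ($19/month plus 5% of gross) come from the announcement; the UDK figures ($99 up front, 25% royalty on revenue past $50,000) are the commonly cited older terms and should be treated as assumptions:

```python
# Rough licensing-cost comparison for a hypothetical title. The UE4
# terms ($19/month plus 5% of gross revenue) come from the announcement;
# the UDK figures ($99 up front, 25% royalty on revenue past $50,000)
# are commonly cited older terms and are assumptions here.
def ue4_cost(revenue, months_subscribed):
    return 19 * months_subscribed + 0.05 * revenue

def udk_cost(revenue):
    return 99 + 0.25 * max(0.0, revenue - 50_000)

for revenue in (40_000, 100_000, 500_000):
    print(revenue, ue4_cost(revenue, 12), udk_cost(revenue))
```

Under these assumed terms, UDK is cheaper for tiny projects, while UE4's 5% cut overtakes UDK's 25% royalty only well below the $50,000 threshold; past it, the subscription model pulls far ahead.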

Of course, the Unreal Engine 3-based UDK is still available (and just recently updated).

This is definitely interesting and, I believe, a response to publishers doubling down on developing their own engines. EA has basically sworn off engines outside of its own Frostbite and Ignite technologies. Ubisoft has announced or released only three games based on Unreal Engine since 2011; Activision has announced or released seven in that time, three of which were in that first year. Epic Games has always been very friendly to smaller developers and, with the rise of the internet, it is becoming much easier for indie developers to release content through Steam or even their own websites. These developers now have a "AAA" engine, which I think almost anyone would agree Unreal Engine 4 is, with an affordable license (and full source access).

Speaking of full source access, licensees can access the engine at Epic's GitHub. While a top-five publisher might hesitate to share fixes and patches, the army of smaller developers might share and share-alike. This could lead to Unreal Engine 4 acquiring its own features rapidly. Epic highlights their Oculus VR, Linux and Steam OS, and native HTML5 initiatives but, given community support, there could be pushes into unofficial support for Mantle, TrueAudio, or other technologies. Who knows?

A sister announcement, albeit a much smaller one, is that Unreal Engine 4 is now part of NVIDIA's GameWorks initiative. This integrates various NVIDIA SDKs, such as PhysX, into the engine. The press release quote from Tim Sweeney is as follows:

Epic developed Unreal Engine 4 on NVIDIA hardware, and it looks and runs best on GeForce.

Another brief mention is that Unreal Engine 4 will have expanded support for Android.

So, if you are a game developer, check out the official Epic Games blog post at their website. You can also check their Youtube page for various videos, many of which were released today.

Source: Epic Games

GDC 14: CRYENGINE To Support Mantle, AMD Gets Another Endorsement

Subject: General Tech, Shows and Expos | March 19, 2014 - 05:00 PM |
Tagged: Mantle, gdc 14, GDC, crytek, CRYENGINE

While I do not have too many details otherwise, Crytek and AMD have announced that mainline CRYENGINE will support the Mantle graphics API. CRYENGINE, by Crytek, now sits alongside Frostbite, by Dice, and Nitrous, by Oxide Games, as engines which support that alternative to DirectX and OpenGL. This comes little more than a week after their announcement of native Linux support with their popular engine.

Crysis2.jpg

The tape has separate draw calls!

Crytek has been "evaluating" the API for quite some time now, showing interest back at the AMD Developer Summit. Since then, they have apparently made a clear decision on it. It is also not the first time CRYENGINE has been publicly introduced to Mantle: Chris Roberts' Star Citizen, also powered by the 4th generation CRYENGINE, has already announced support for the graphics API. Of course, there is a large gap between a licensee doing the legwork to include an API and the engine developer providing supported builds (otherwise it would be like saying Unreal Engine 3 supports the original Wii).

Hopefully we will learn more as GDC continues.

Editor's (Ryan) Take:

As the week at GDC has gone on, AMD continues to push forward with Mantle, calling Crytek's implementation of the low-level API "a huge endorsement" of the company's direction and vision for the future. Many, including myself, have considered that the pending announcement of DX12 would be a major setback for Mantle, but AMD claims that view is "short-sighted" and that each developer coming into the Mantle ecosystem is proof AMD is doing the "right thing."

Here at GDC, AMD told us they have expanded the number of beta Mantle members dramatically with plenty more applications (dozens) in waiting.  Obviously this could put a lot of strain on AMD for Mantle support and maintenance but representatives assure us that the major work of building out documentation and development tools is nearly 100% behind them.

mantle.jpg

If stories like this one over at SemiAccurate are true, and Microsoft's DirectX 12 really will be nearly identical to AMD's Mantle, then it makes sense that developers serious about new gaming engines can get a leg up on projects by learning Mantle today. Applying that knowledge to the DX12 API upon its release could speed up development and improve implementation efficiency. From what I am hearing from the few developers willing to even mention DX12, Mantle is much further along in its release (late beta) than DX12 is (early alpha).

AMD indeed was talking with and sharing the development of Mantle with Microsoft "every step of the way" and AMD has stated on several occasions that there were two outcomes with Mantle; it either becomes or inspires a new industry standard in game development. Even if DX12 is more or less a carbon copy of Mantle, forcing NVIDIA to implement that API style with DX12's release, AMD could potentially have the advantage of gaming performance and support between now and Microsoft's DirectX release.  That could be as much as a full calendar year from reports we are getting at GDC.  

You can go Homeworld again; Gearbox has some great news

Subject: General Tech | March 19, 2014 - 04:15 PM |
Tagged: Homeworld, Homeworld 2, kick ass, Homeworld Remastered, gearbox

If you never had the chance to play Homeworld and Homeworld 2, then there is some good news you should hear; if you did play them, then the news is even better!  Gearbox is not just re-releasing the two games; it has redone the cinematics, models, textures, and effects, and added support for up to 4K resolution.  The news at Rock, Paper, Shotgun makes it sound like some content from mods might be included, as well as revamped multiplayer.  Maybe they will up the quality to the point where some later levels chug just as hard as they did when you first sat down in front of your mothership.

hwrm.jpg

"We learned last year that Gearbox were planning to re-release the enormously loved Homeworld games. Having plucked the license from the THQ jumble sale, apparently because their Chief Creative Officer Brian Martel has maximum love for the series, they made clear their intentions to release an HD version of the first two games. They’re upping their terminology now, from “HD” to “Remastered“."

Here is some more Tech News from around the web:

Gaming

 

Ray Tracing is back? That's Wizard!

Subject: General Tech, Shows and Expos | March 19, 2014 - 01:20 PM |
Tagged: Imagination Technologies, gdc 14, wizard, ray tracing

The Tech Report visited Imagination Technologies' booth at GDC, where the company was showing off a new processor, the Wizard GPU.  It is based on the PowerVR Series6XT Rogue graphics processor and is specifically designed to accelerate ray tracing, a topic we haven't heard much about lately.  Imagination describes the performance as 300 million rays and 100 million dynamic triangles per second, which translates to 7 to 10 rays per pixel at 720p and 30Hz, or 3 to 5 rays per pixel at 1080p and 30Hz.  That is not bad, though Imagination Technologies estimates that films are rendered at 16 to 32 rays per pixel, so it may be a while before we see a Ray Tracing slider under Advanced Graphics Options.
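Those per-pixel figures fall straight out of the quoted ray rate; a quick arithmetic check:

```python
# Divide the quoted ray throughput by the number of pixels drawn per
# second to get the per-pixel ray budget at a given resolution and
# refresh rate.
def rays_per_pixel(rays_per_second, width, height, hz):
    return rays_per_second / (width * height * hz)

print(round(rays_per_pixel(300e6, 1280, 720, 30), 1))   # 10.9 rays/px at 720p30
print(round(rays_per_pixel(300e6, 1920, 1080, 30), 1))  # 4.8 rays/px at 1080p30
```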

4_PowerVR Ray Tracing - hybrid rendering (4).jpg

"When we visited Imagination Technologies at CES, they were showing off some intriguing hardware that augments their GPUs in order to accelerate ray-traced rendering. Ray tracing is a well-known and high-quality form of rendering that relies on the physical simulation of light rays bouncing around in a scene. Although it's been used in movies and in static scene creation, ray tracing has generally been too computationally intensive to be practical for real-time graphics and gaming. However, Imagination Tech is looking to bring ray-tracing to real-time graphics—in the mobile GPU space, no less—with its new family of Wizard GPUs."

Here is some more Tech News from around the web:

Tech Talk

GDC 14: WebCL 1.0 Specification is Released by Khronos

Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | March 19, 2014 - 09:03 AM |
Tagged: WebCL, gdc 14, GDC

The Khronos Group has just ratified the WebCL 1.0 specification. The API is expected to provide a massive performance boost to web applications dominated by expensive functions which can be offloaded to parallel processors, such as GPUs and multi-core CPUs. The specification also allows WebCL to share buffers with WebGL via an extension.

WebCL_300_167_75.png

Frequent readers of the site might remember that I have a particular interest in WebCL. Based on OpenCL, it allows web apps to obtain a list of every available compute device and target it for workloads. I have personally executed tasks on an NVIDIA GeForce 670 discrete GPU and other jobs on my Intel HD 4000 iGPU, at the same time, using the WebCL prototype from Tomi Aarnio of Nokia Research. The same is true for users with multiple discrete GPUs installed in their system (even if they are not compatible with Crossfire, SLi, or are from different vendors altogether). This could be very useful for physics, AI, lighting, and other game middleware packages.

Still, browser adoption might be rocky for quite some time. Google, Mozilla, and Opera Software were each involved in the working draft. This leaves both Apple and Microsoft notably absent. Even then, I am not sure how much interest exists within Google, Mozilla, and Opera to take it from a specification to a working feature in their browsers. Some individuals have expressed more faith in WebGL compute shaders than WebCL.

Of course, that can change with just a single "killer app", library, or middleware.

I do expect some resistance from the platform holders, however. Even Google has been pushing back on OpenCL support in Android, in favor of its own RenderScript abstraction. The performance of a graphics processor is also significant leverage for a native app; there is little, otherwise, that cannot be accomplished with Web standards, except a web browser itself (and there are even some non-serious projects for that). If Microsoft can support WebGL, however, there is always hope.

The specification is available at the Khronos website.

Source: Khronos