GDC 15: Intel shows 3DMark API Overhead Test at Work

Subject: Graphics Cards, Processors | March 4, 2015 - 08:46 PM |
Tagged: GDC, gdc 15, API, dx12, DirectX 12, dx11, Mantle, 3dmark, Futuremark

It's probably not a surprise to most that Futuremark is working on a new version of 3DMark around the release of DirectX 12. What might be new for you is that this version will include an API overhead test, used to evaluate how much API overhead affects performance on a given hardware configuration under the Mantle, DX11 and DX12 APIs.

3dmark1.jpg

While we don't have any results quite yet (those are pending and should arrive very soon), Intel was showing the feature test running at an event at GDC tonight. In what looks like a simple cityscape being rendered over and over, the goal is to see how many draw calls the CPU, API and hardware can push through each second, which is really a measure of how quickly the CPU can respond to a game engine's requests.
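
To make the idea concrete, here is a minimal, hypothetical sketch of what a draw call overhead test boils down to; SubmitDrawCall() is a stand-in for an API-specific submission and this is not Futuremark's actual code.

```cpp
// Minimal sketch of a draw call overhead test (illustrative only).
// SubmitDrawCall() is a hypothetical stand-in; a real test would bind a tiny
// amount of state and issue a draw through DX11, DX12 or Mantle here, so the
// measured limit is the CPU-side cost of the API and driver, not the GPU.
#include <chrono>
#include <cstdio>

static void SubmitDrawCall() { /* placeholder for API-specific submission */ }

int main()
{
    using clock = std::chrono::steady_clock;
    const auto start = clock::now();
    long long drawCalls = 0;

    // Submit as many trivial draws as possible in one second and report the rate.
    while (clock::now() - start < std::chrono::seconds(1)) {
        SubmitDrawCall();
        ++drawCalls;
    }

    std::printf("Sustained rate: %lld draw calls per second\n", drawCalls);
    return 0;
}
```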

The test was being showcased on an Intel-powered notebook using a 5th Generation Core processor, code named Broadwell. This points to the upcoming support for DX12 (though, of course, not Mantle) that Intel's integrated GPUs will provide.

It should be very interesting to see how much of an advantage DX12 offers over DX11, even on Intel's wide range of consumer and enthusiast processors.

GDC 15: PhysX Is Now Shared Source to UE4 Developers

Subject: General Tech, Graphics Cards, Shows and Expos | March 4, 2015 - 05:52 PM |
Tagged: GDC, gdc 15, nvidia, epic games, ue4, unreal engine 4, PhysX, apex

NVIDIA and Epic Games have just announced that Unreal Engine 4 developers can view and modify the source of PhysX. This also includes the source for APEX, which is NVIDIA's cloth and destruction library. It does not include any of the other libraries that are under the GameWorks banner, but Unreal Engine 4 does not use them anyway.

epic-ue4-nvidia-physx.jpg

This might even mean that capable developers can write their own support for third-party platforms, like OpenCL. That would probably be a painful process, but it should be possible now. Of course, that support would only extend to their own titles and to anyone they share their branch with.

If you are having trouble finding it, you will need to switch to a branch that has been updated to PhysX 3.3.3 with source, which is currently just “Master”. “Promoted” and earlier seem to be back at PhysX 3.3.2, which is still binary-only. It will probably take a few months to trickle down to an official release. If you are still unable to find it, even though you are on the “Master” branch, the path to NVIDIA's source code is: “Unreal Engine/Engine/Source/ThirdParty/PhysX/”. From there you can check out the various subdirectories for PhysX and APEX.

NVIDIA will be monitoring pull requests sent to that area of Unreal Engine. Enhancements might make it back upstream to PhysX proper, which would then be included in future versions of Unreal Engine and anywhere else that PhysX is used.

In other news, Unreal Engine 4 is now free of its subscription. The only time Epic will ask for money is when you ship a game and royalties are due. This is currently 5% of gross revenue, with the first $3000 (per product, per calendar quarter) exempt. This means you can now make a legitimately free (no price, no ads, no subscription, no microtransactions, no Skylander figurines, etc.) game in UE4 at no cost!
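
For a rough sense of how those terms work out, here is a back-of-the-envelope sketch; the numbers and the helper function are illustrative, not Epic's official calculator.

```cpp
// Back-of-the-envelope sketch of the quoted terms: 5% of gross revenue,
// with the first $3,000 per product per calendar quarter exempt.
// Illustrative only; consult Epic's actual license for the real rules.
#include <algorithm>
#include <cstdio>

double ue4QuarterlyRoyalty(double grossRevenueForQuarter)
{
    const double exemption = 3000.0; // per product, per calendar quarter
    const double rate      = 0.05;   // 5% of gross revenue
    return std::max(0.0, grossRevenueForQuarter - exemption) * rate;
}

int main()
{
    // $10,000 gross in a quarter: 5% of the $7,000 above the exemption = $350.
    std::printf("Royalty owed: $%.2f\n", ue4QuarterlyRoyalty(10000.0));
    // A free game earns nothing, so nothing is owed.
    std::printf("Royalty owed: $%.2f\n", ue4QuarterlyRoyalty(0.0));
    return 0;
}
```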

Source: Epic Games

GDC 15: Valve Shows Off $50 Steam Link Game Streaming Box

Subject: General Tech | March 4, 2015 - 04:31 PM |
Tagged: GDC, valve, streaming box, Steam Box, steam, pc game streaming, gaming, gdc 2015

Valve has slowly but surely been working on its living room gaming initiative. Despite the slow progress (read: Valve time), Steam Machines are still a thing and a new bit of hardware called the “Steam Link” will allow you to stream all of your Steam content from your computers and Steam Machines to your TV over a local network. Slated for a November launch, the Steam Link is a $49.99 box that can be paired with a Steam Controller for another $49.99.

Steam Link Angled.jpg

Valve has revealed little about the internals or specific features of the Steam Link. We do know that it can tap into Valve’s Steam In-Home Streaming technology to stream your PC games to your TV at 1080p and 60Hz (no word on specific latency numbers, but the wired connection is promising). The box is tiny, looking to be less than half the footprint of a NUC (and much shorter), with sharp angles and one rounded corner hosting the Steam logo. Two USB ports, a Gigabit Ethernet port, an HDMI output, and an AC power jack sit on the rear of the device, with a third USB port located on the left side of the Steam Link.

Steam Link Budget Streaming Box.jpg

In all, the Steam Link looks like a promising device so long as Valve can get it out the door in time, especially with so many competing streaming technologies hitting the market. I’m looking forward to more details and to getting my hands on one later this year.

Pushing the GTX 980 to the limits; the ASUS ROG Poseidon

Subject: Graphics Cards | March 4, 2015 - 03:02 PM |
Tagged: asus, ROG Poseidon GTX 980, GTX 980, factory overclocked

On the box, the ASUS ROG Poseidon GTX 980 Platinum lists a base clock of 1178MHz and a boost clock of 1279MHz, but in testing [H]ard|OCP saw the card sitting at 1328MHz in game on air cooling alone.  They then hooked up a Koolance Exos-2 V2 and pushed the card to 1580MHz in game, though the memory would only gain 1.1GHz, reaching 8.1GHz effective.  As you would expect, this had a noticeable impact on performance. The card may not compete with the just-announced Titan X, but at $640 it is also far less expensive, though still $200 more than the Sapphire Vapor-X 290X it was tested against and $90 more than the 8GB version of that card.  If you have the budget, this GTX 980 is the fastest single-GPU card on the planet right now.

1425298901Z7r8UqWalm_2_15_l.jpg

"The highest overclocked GeForce GTX 980 based video card just landed. If the ASUS ROG Poseidon GTX 980 Platinum video card with a hybrid air and liquid cooling system doesn't impress you, we are not sure what will when it comes to GPU. We push the Poseidon clocks and pit it against the AMD Radeon R9 290X for an ultimate showdown."

Here are some more Graphics Card articles from around the web:

Graphics Cards


Source: [H]ard|OCP

The old is new is old again, Wolfenstein: The Old Blood

Subject: General Tech | March 4, 2015 - 01:46 PM |
Tagged: gaming, wolfenstein, the old blood

B.J. Blazkowicz is back ... in time?  The venerable Wolfenstein series recently got a reboot, The New Order, which proved far more popular than the mediocre 2009 effort; it was set in more modern times, albeit in a parallel history in which the Nazis won.  The Old Blood has just been announced, presumably set in the same timeline as The New Order; it takes place in 1946 and follows our hero's assault on Castle Wolfenstein in an attempt to prevent the Nazis from winning the war.  If it is indeed a prequel then we already know the ending, with B.J. suffering a massive head wound and going into a coma; perhaps while bending over to pick up the Spear of Destiny?  Check out the trailer at Rock, Paper, SHOTGUN.

wolf1.jpg

"Bethesda have just announced a Wolfenstein: The New Order stand-alone prequel, which is wonderful news. Going by the subtitle The Old Blood, it’s set in 1946 as the Nazis are on the brink of winning World War II."

Here is some more Tech News from around the web:

Gaming

GDC 15: NVIDIA Shows TITAN X at Epic Games Keynote

Subject: Graphics Cards | March 4, 2015 - 01:10 PM |
Tagged: titan x, nvidia, maxwell, gtx, geforce, gdc 15, GDC

For those of you worried that GDC would sneak by without any new information for NVIDIA's GeForce fans, Jen-Hsun Huang surprised everyone by showing up at the Epic Games keynote with Tim Sweeney to hijack it.

The result: the first showing of the upcoming GeForce TITAN X, based on the new Maxwell GM200 GPU.

titanx3.jpg

JHH stated that it would have a 12GB frame buffer and was built using 8 billion transistors! There wasn't much more information than that, but I was promised that the details would be revealed sooner rather than later.

titanx2.jpg

Any guesses on performance or price?

titanx1.jpg

titanx4.jpg

Jen-Hsun signs the world's first TITAN X for Tim Sweeney.

Kite Demo running on TITAN X

UPDATE: I ran into the TITAN X again at the NVIDIA booth and was able to confirm a couple more things. First, the card will only require 6-pin and 8-pin power connections, indicating that NVIDIA is still pushing power efficiency with GM200.

titanx5.jpg

Also, as you would expect, the TITAN X will support 3-way and 4-way SLI, or at the very least has the SLI connectors to support it.

titanx6.jpg

Nano, Nano, Nano, Nano, Nano, Nano, Nano, Nano ... Server!

Subject: General Tech | March 4, 2015 - 12:46 PM |
Tagged: nano server, microsoft, server 2016, rumour

A recent leak from Microsoft, which The Inquirer is reporting on, describes Windows Server 2016 as offering "a new headless deployment option for Windows Server".  Your next generation of servers may live in containers inside CloudOS infrastructure, and you will use Windows Server Core to access PowerShell and remotely interface with your server.  There are some downsides to this model: data which is required to be stored in a specific geographical location will not be able to take advantage of it, and you will lose the ability to run a fax server.  Governments and other organizations may find themselves forking over money to Microsoft to support older versions of Windows Server, now or in the future, if the idea of a server that you can actually sit in front of is being discouraged.  As with all leaks you should take this with a grain of salt, but it is certainly in line with what Microsoft's new business model seems to be.

index.png

"MICROSOFT IS PLANNING a 'Nano Server', according to the latest leaks from notorious Microsoft mole WZor."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

BitTorrent Sync 2.0 Available Now For PC, NAS, and Mobile With Pro Version For $39.95/year

Subject: General Tech | March 4, 2015 - 02:29 AM |
Tagged: sync 2.0, folder sync, file sharing, bittorrent sync, bittorrent, backup

BitTorrent Sync has officially taken the beta tag off and launched Sync 2.0. Sync 2.0 is the latest iteration of the company’s file and folder synchronization application. It uses certificate-based security and the torrent protocol to securely share files and folders with no file size or transfer limits. Sync 2.0 is available for PCs as well as NAS and mobile devices, and it can be used to roll your own cloud storage.

Sync 2.0 Main.png

Sync 2.0 contains numerous bug fixes and three major new features over Sync 1.4 (which I detailed here): selective sync, ownership and permission controls, and private identities. Additionally, the question of how BitTorrent will monetize Sync has been answered with the introduction of a paid Sync Pro subscription service that grants access to all the new Sync features.

BitTorrent continues to offer a free version that Sync 1.4.3 users can upgrade to in order to take advantage of the bug fixes, with one big caveat. The free version of Sync 2.0 is limited to synchronizing 10 folders (there are still no file/folder size or transfer limits). This is an irksome step backwards from the previous version that, in my opinion, is unwarranted (Sync Pro already unlocks a slew of useful features), but apparently BitTorrent believes it needs to do this to encourage enough people to ante up for the paid version and support the project.

Users can download Sync 2.0 for Windows, Mac, Linux, and FreeBSD from GetSync.com, while mobile users can pick the Sync app up from their app store of choice (it should be live today). BitTorrent now supports Sync on Network Attached Storage devices from Asustor, Drobo, Netgear, Overland SnapServer, QNAP, Seagate, and Synology. You can grab the appropriate NAS build from this page.

Downloads of Sync 2.0 include a 30-day trial of Sync Pro. Sync Pro will cost $39.95 per user per year (on unlimited devices) with volume licensing available for large organizations and teams.

I have been using Sync since the original alpha and have found it to be invaluable in keeping all my files in sync and my smartphone pictures backed up (especially with the number of times my S5 has needed replacing, heh). I am still deciding whether or not I will purchase the yearly Pro subscription (the 10 folder limit does not affect me, yet anyway), but the new features are compelling; linked devices and selective sync would be welcome. The ownership and permissions features are great for collaboration and sharing with others, but that’s not something I’m using it for right now.

What are your thoughts on Sync 2.0 and the new subscription model? Now that I am allowed to talk about it, do you have any questions?

Source: BitTorrent

GDC 15: ZOTAC Announces the SN970 Steam Machine - Powered by a GTX 970M and Intel Skylake CPU

Subject: Systems | March 4, 2015 - 12:11 AM |
Tagged: Skylake, zotac, valve, SteamOS, Steam Machine, steam, gdc 2015, gdc 15, GDC, GTX 970M

Favor a steamier TV gaming experience? ZOTAC has announced a new Steam Machine on the eve of Valve’s presentation at GDC on Wednesday.

SN970-03.jpg

The SN970 presumably gets its name from the GTX 970M mobile GPU within, which does the heavy lifting along with an unspecified 6th-generation Intel (Skylake) CPU. The sheer number of HDMI outputs (there are 4 HDMI 2.0 ports!) is pretty impressive for a small device like this, and dual Gigabit Ethernet ports are a premium feature as well.

SN970-05.jpg

There's a lot going on back here - the rear I/O of the ZOTAC SN970

Here's the rundown of features and specs from ZOTAC:

Key Features

  • SteamOS preloaded
  • NVIDIA GeForce® GTX 970M MXM graphics
  • 4 x HDMI 2.0, supports 4K UHD @ 60Hz

Specifications

  • 6th Gen Intel Processor
  • NVIDIA GeForce® GTX 970M 3GB GDDR5
  • 8GB DDR3 SODIMM
  • 64GB M.2 SSD
  • 1 x HDMI in
  • 2D/3D NVIDIA Surround
  • Dual Gigabit Ethernet
  • 4 x USB 3.0, 2 x USB 2.0
  • 1 x 2.5” 1TB HDD
  • 802.11ac WiFi, Bluetooth 4.0
  • Mic-In, Stereo Out
  • SD/SDHC/SDXC Card Reader

The release date for this new Steam Box isn't specified, but we will doubtless be hearing more from Valve and their partners tomorrow, so stay tuned!

Source: ZOTAC

GDC 15: Native versions of Doom 3, Crysis 3 running on Android, Tegra X1

Subject: Graphics Cards, Mobile | March 3, 2015 - 10:43 PM |
Tagged: Tegra X1, tegra, nvidia, gdc 15, GDC, Doom 3, Crysis 3

Impressively, NVIDIA just showed the new SHIELD, powered by Tegra X1, running versions of both Doom 3 and Crysis 3 natively on Android! The games ran at impressive quality and performance levels.

I have included some videos of these games being played on the SHIELD, but don't judge the visual quality of the games by these videos. They were recorded with a Panasonic GH2 off a 4K TV in a dimly lit room.

Doom 3 is quoted to run at full 1920x1080 and 60 FPS while Crysis 3 is much earlier in its development. Both games looked amazing considering we are talking about a system that has a total power draw of only 15 watts!

While these are just examples of the power that Tegra X1 can offer, it's important to note that this type of application is the exception, not the rule, for Android gaming. Just as we saw with Half-Life 2 and Portal, NVIDIA did most of the leg work to get this version of Doom 3 up and running. Crysis 3 is more of an effort from Crytek itself; hopefully the finished port looks as gorgeous as this first look did.

GDC 15: NVIDIA Announces SHIELD, Tegra X1 Powered Set-top with Android TV

Subject: General Tech, Mobile | March 3, 2015 - 10:21 PM |
Tagged: Tegra X1, tegra, shield, gdc 15, GDC, android tv

NVIDIA just announced a new member of its family of hardware devices: SHIELD. Just SHIELD. Powered by NVIDIA's latest Tegra X1 SoC, with its eight CPU cores and Maxwell-based GPU, SHIELD will run Android TV and act as a game-playing, multimedia-watching, GRID-streaming set-top box.

04.jpg

Odd naming scheme aside, the SHIELD looks to be an impressive little device, sitting on your home theater cabinet or desk and bringing a ton of connectivity and performance to your TV. Running Android TV means the SHIELD will have access to the entire library of Google Play media, including music, movies and apps. SHIELD supports 4K video playback at 60 Hz thanks to an HDMI 2.0 connection, and fully supports H.265/HEVC decode thanks to the Tegra X1 processor.

01.jpg

Speaking of the Tegra X1, the SHIELD will include the power of 256 Maxwell-architecture CUDA cores and will easily provide the best Android gaming performance of any tablet or set-top box on the market. This means gaming, and lots of it, will be possible on SHIELD. Remember our many discussions about Tegra-specific gaming ports from the past? That trend will continue as more developers realize the power that NVIDIA is putting into this tiny chip.

02.jpg

In the box you'll get the SHIELD set-top unit and a SHIELD Controller, the same controller released with the SHIELD Tablet last year. A smaller remote, which looks similar to the one used with the Amazon Fire TV, will cost a little extra, as will the stand that sets the SHIELD upright.

Pricing on the new SHIELD set-top will be $199, shipping in May.

Meet Silicon Motion's new flash agnostic controller

Subject: Storage | March 3, 2015 - 06:16 PM |
Tagged: tlc, ssd, SM2256, slc, silicon motion

You may remember the Silicon Motion SM2256 SSD controller that Al reported on during CES this year; even if you do not, you should be interested in a controller which can work with 1x/1y/1z nm TLC NAND from any manufacturer on the market.  The SSD Review managed to get a prototype which uses the new SM2256 controller, Samsung’s 19nm planar TLC NAND flash and a Hynix 440MHz 256MB DDR3 DRAM chip.  In benchmarking they saw 548MB/s sequential reads and 484MB/s writes, with 4K performance dropping to 38MB/s for reads and 110MB/s for writes.  Check out the rest of the review here, and keep your eyes peeled for our first review of the new controller.

Silicon-Motion-SM2256-Main.jpg

"Controllers are the heart and soul of every SSD. Without one, an SSD would be a useless PCB with some components slapped on it. It is responsible for everything from garbage collection and wear leveling to error correction and hardware encryption. In simple terms, all these operations can be quite complicated to implement as well as expensive to develop."

Here are some more Storage reviews from around the web:

Storage


GDC 15: Khronos Acknowledges Mantle's Start of Vulkan

Subject: General Tech, Graphics Cards, Shows and Expos | March 3, 2015 - 03:37 PM |
Tagged: vulkan, Mantle, Khronos, glnext, gdc 15, GDC, amd

khronos-group-logo.png

Neil Trevett, the current president of Khronos Group and a vice president at NVIDIA, made an on-the-record statement to acknowledge the start of the Vulkan API. The quote came to me via Ryan, but I think it is a copy-paste of an email, so it should be verbatim.

Many companies have made great contributions to Vulkan, including AMD who contributed Mantle. Being able to start with the Mantle design definitely helped us get rolling quickly – but there has been a lot of design iteration, not the least making sure that Vulkan can run across many different GPU architectures. Vulkan is definitely a working group design now.

So in short, the Vulkan API definitely started with Mantle and grew from there as more stakeholders added their input. Vulkan is obviously different from Mantle in significant ways now, such as its use of SPIR-V as its shading language (rather than HLSL). For a bit more information, check out our article on the announcement.
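
As a point of reference for what consuming SPIR-V means in practice, here is a minimal sketch of loading a precompiled SPIR-V binary into a shader module; it uses the Vulkan C API as it eventually shipped rather than anything shown at this announcement, error handling is omitted, and "shader.spv" is a hypothetical file.

```cpp
// Minimal sketch: Vulkan consumes shaders as precompiled SPIR-V binaries
// rather than HLSL/GLSL source text. Error handling and device setup omitted.
#include <vulkan/vulkan.h>
#include <cstdint>
#include <fstream>
#include <vector>

VkShaderModule loadSpirvModule(VkDevice device, const char* path)
{
    // Read the SPIR-V blob from disk; "shader.spv" would be produced offline
    // by a shader compiler rather than compiled from source at runtime.
    std::ifstream file(path, std::ios::binary | std::ios::ate);
    std::vector<char> code(static_cast<size_t>(file.tellg()));
    file.seekg(0);
    file.read(code.data(), static_cast<std::streamsize>(code.size()));

    VkShaderModuleCreateInfo info = {};
    info.sType    = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO;
    info.codeSize = code.size();
    info.pCode    = reinterpret_cast<const uint32_t*>(code.data());

    VkShaderModule module = VK_NULL_HANDLE;
    vkCreateShaderModule(device, &info, nullptr, &module);
    return module;
}
```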

Update: AMD has independently released a statement related to Mantle's role in Vulkan.

A taste of the new Pi

Subject: General Tech | March 3, 2015 - 03:01 PM |
Tagged: Raspberry Pi 2 Model B

Linux.com have just released benchmarks of the new Raspberry Pi 2 Model B, with its improved processor and RAM.  Benchmarking a Pi is always interesting, as you must find applications which make sense for this class of device; web server software is a decent choice for comparisons against the ODroid-U2, Radxa and BeagleBone Black.  OpenSSL 1.0.1e was also benchmarked, covering DES and AES CBC-mode ciphers as well as Blowfish, with the Pi performing slowly but improving on the previous generation, and certainly decent for a $35 piece of hardware.  In addition, both a full KDE desktop and KDE/Openbox were successfully installed, with Openbox the recommended choice.  Get all the results right here.

rasp2.jpg

"Released in February, the Raspberry Pi Model 2 B is an update to the original board that brings quad cores for six times the performance of the original, 1 gigabyte of RAM for twice the memory, and still maintains backwards compatibility. The many CPU cores are brought about by moving from the BCM2835 SoC to the BCM2836 SoC in the Raspberry Pi 2."

Here is some more Tech News from around the web:

Tech Talk

Source: Linux.com

EVGA and Inno3D Announce the First 4GB NVIDIA GeForce GTX 960 Cards

Subject: Graphics Cards | March 3, 2015 - 02:44 PM |
Tagged: video cards, nvidia, gtx 960, geforce, 4GB

They said it couldn't be done, but where there are higher density chips there's always a way. Today EVGA and Inno3D have both announced new versions of GTX 960 graphics cards with 4GB of GDDR5 memory, placing the cards in a more favorable mid-range position depending on the launch pricing.

960_evga.PNG

EVGA's new 4GB NVIDIA GTX 960 SuperSC

Along with the expanded memory capacity, EVGA's card features their ACX 2.0+ cooler, which promises low noise and better cooling. The SuperSC is joined by a standard ACX and the higher-clocked FTW variant, which pushes base/boost clocks to 1304/1367MHz out of the box.

960_evga_2.PNG

Inno3D's press release provides fewer details, and the company appears to be launching a single new model featuring 4GB of memory which looks like a variant of their existing GTX 960 OC card.

inno3d_960.jpg

The existing Inno3D GTX 960 OC card

The current 2GB version of the GTX 960 can be found starting at $199, so expect these expanded versions to carry a price bump. The GTX 960, with only 1024 CUDA cores (half the count of a GTX 980) and a 128-bit memory interface, has nonetheless been a very good performer, with much better numbers than last year's GTX 760, and is very competitive with AMD's R9 280/285. (It's a great overclocker, too.) The AMD/NVIDIA debate rages on, and NVIDIA's partners adding another 4GB offering to the mix will certainly add to the conversation, particularly as a 4GB version of the GTX 960 was originally said to be unlikely.

Source: EVGA

ARM and Geomerics Show Enlighten 3 Lighting, Integrate with Unity 5

Subject: Graphics Cards, Mobile | March 3, 2015 - 12:00 PM |
Tagged: Unity, lighting, global illumination, geomerics, GDC, arm

Back in 2013 ARM picked up a company called Geomerics, responsible for one of the industry’s most advanced dynamic lighting engines, used in games ranging from mobile to console to PC. Called Enlighten, it is the lighting engine behind many major games in a variety of markets: Battlefield 3 uses it, as does Need for Speed: The Run, while The Bureau: XCOM Declassified and Quantum Conundrum are another pair of major games that depend on Geomerics technology.

geo-3.jpg

Great, but what does that have to do with ARM, and why would the company be interested in investing in software that works across such a wide array of markets, most of which are not dominated by ARM processors? There are two answers, the first of which is directional: ARM is using the minds and creative talent behind Geomerics to help point the Cortex and Mali teams in the correct direction for CPU and GPU architecture development. By designing hardware to better address the advanced software and lighting systems Geomerics builds, Cortex and Mali will have some semblance of an advantage in specific gaming titles as well as a potential “general purpose” advantage. NVIDIA employs hundreds of gaming and software developers for this exact reason: what better way to make sure you are always at the forefront of the gaming ecosystem than getting high-level gaming programmers to point you to that edge? Qualcomm also recently (back in 2012) started employing game and engine developers in-house with the same goals.

ARM also believes it will be beneficial to bring publishers, developers and middleware partners to the ARM ecosystem through deployment of the Enlighten engine. It would be feasible to think console vendors like Microsoft and Sony would be more willing to integrate ARM SoCs (rather than the x86 used in the PS4 and Xbox One) when shown the technical capabilities brought forward by technologies like Geomerics Enlighten.

geomerics-1.jpg

It’s best to think of the Geomerics acquisition as a kind of insurance program for ARM, making sure both its hardware and software roadmaps are in line with industry goals and directives.

At GDC 2015 Geomerics is announcing the release of the Enlighten 3 engine, a new version that brings cinematic-quality real-time global illumination to market. Some of the biggest new features include additional accuracy in indirect lighting, color-separated directional output (which enables individual RGB calculations), better light map baking for higher-quality output, and richer material properties to support transparency and occlusion.
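
To illustrate what color-separated directional output buys you, here is a minimal sketch of the underlying idea; the types and function here are made up for illustration and are not Geomerics' Enlighten API.

```cpp
// Minimal sketch of color-separated directional indirect lighting
// (illustrative only, not Geomerics' Enlighten API): each RGB channel of the
// indirect light carries its own dominant incoming direction, so shading can
// respond per channel instead of using one averaged direction.
#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

static Vec3 shadeIndirect(const Vec3& n,
                          const Vec3 dir[3],       // dominant indirect direction for R, G, B
                          const Vec3& irradiance,  // indirect irradiance per channel
                          const Vec3& albedo)
{
    return {
        albedo.x * irradiance.x * std::max(0.0f, dot(n, dir[0])),
        albedo.y * irradiance.y * std::max(0.0f, dot(n, dir[1])),
        albedo.z * irradiance.z * std::max(0.0f, dot(n, dir[2])),
    };
}

int main()
{
    const Vec3 n = {0.0f, 1.0f, 0.0f};
    const Vec3 dirs[3] = {{0.0f, 1.0f, 0.0f}, {0.7f, 0.7f, 0.0f}, {0.0f, 0.7f, 0.7f}};
    const Vec3 lit = shadeIndirect(n, dirs, {0.8f, 0.6f, 0.4f}, {1.0f, 1.0f, 1.0f});
    std::printf("Indirect contribution: %.2f %.2f %.2f\n", lit.x, lit.y, lit.z);
    return 0;
}
```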

All of this technology will be showcased in a new Subway demo that includes real-time global illumination simulation, dynamic transparency and destructible environments.

Geomerics Enlighten 3 Subway Demo

Enlighten 3 will also ship with Forge, a new lighting editor and pipeline tool for content creators looking to streamline the building process. Forge will allow importing from Autodesk 3ds Max and Maya, making interoperability easier. Forge uses a technology called YEBIS 3 to show estimated final quality without the time-consuming final-build processing.

geo-1.jpg

Finally, maybe the biggest news for ARM and Geomerics is that the Unity 5 game engine will be using Enlighten as its default lighting engine, giving ARM/Mali a potential advantage for gaming experiences in the near term. Of course, Enlighten is also available as an option for Unreal Engine 3 and 4 for developers using those engines in mobile, console and desktop projects, as well as in SDK form for custom integrations.

Gigabyte's Force H3X is great for gaming but perhaps not podcasting

Subject: General Tech | March 2, 2015 - 03:56 PM |
Tagged: gigabyte, audio, Force H3X, gaming headset, analog

Gigabyte's Force H3X gaming headset sports the 50mm neodymium drivers we have become used to, with a decent frequency response range of 20Hz to 20kHz.  The microphone is a bit different, using two 2mm pickups on each side for a total of four, but in the testing Modders Inc performed this did not help the quality of recorded audio.  That does not matter much for a gaming headset, but it is perhaps not the best choice for a budding YouTube star.  For in-game audio Modders Inc does give the headset good marks, and they also found it to be very comfortable over long periods of time; definitely worth checking out if you are in the market for a new headset to game with.

IMG_7297.jpg

"Don't you hate that when you are camping with a sniper rifle and all of the sudden some one sneaks up behind you and puts a knife through your head? Of course! We have all been there. Don't you wish you heard that guy who was sneaking up on you? Maybe then you could have switched to a Desert Eagle …"

Here is some more Tech News from around the web:

Audio Corner

Source: Modders Inc

GDC 15: AMD Mantle Might Be Dead as We Know It: No Public SDK Planned

Subject: Graphics Cards | March 2, 2015 - 02:31 PM |
Tagged: sdk, Mantle, dx12, API, amd

The Game Developers Conference in San Francisco starts today, and you can expect to see more information about DirectX 12 than you could ever possibly want, so be prepared. But what about the original low-level API, AMD's Mantle? Announced alongside the release of the Radeon R9 290X/290, utilized in Battlefield 4 and Thief, and integrated into the Crytek engine (as announced last year), Mantle was truly the instigator that pushed Microsoft into moving DX12's development along at a faster pace.

Since DX12's announcement, AMD has claimed that Mantle would live on, bringing performance advantages to AMD GPUs and acting as the sounding board for new API features for AMD and its game development partners. And, as has been trumpeted since the very beginning of Mantle, it would become an open API, available to all once it outgrew the beta phase that it (still) resides in.

mantle1.jpg

Something might have changed there.

A post over on the AMD Gaming blog from Robert Hallock has some news about Mantle to share as GDC begins. First, the good news:

AMD is a company that fundamentally believes in technologies unfettered by restrictive contracts, licensing fees, vendor lock-ins or other arbitrary hurdles to solving the big challenges in graphics and computing. Mantle was destined to follow suit, and it does so today as we proudly announce that the 450-page programming guide and API reference for Mantle will be available this month (March, 2015) at www.amd.com/mantle.
 
This documentation will provide developers with a detailed look at the capabilities we’ve implemented and the design decisions we made, and we hope it will stimulate more discussion that leads to even better graphics API standards in the months and years ahead.

That's great! We will finally be able to read about the API and how it functions, getting access to the detailed information we have wanted from the beginning. But then there is this portion:

AMD’s game development partners have similarly started to shift their focus, so it follows that 2015 will be a transitional year for Mantle. Our loyal customers are naturally curious what this transition might entail, and we wanted to share some thoughts with you on where we will be taking Mantle next:

AMD will continue to support our trusted partners that have committed to Mantle in future projects, like Battlefield™ Hardline, with all the resources at our disposal.

  1. Mantle’s definition of “open” must widen. It already has, in fact. This vital effort has replaced our intention to release a public Mantle SDK, and you will learn the facts on Thursday, March 5 at GDC 2015.
     
  2. Mantle must take on new capabilities and evolve beyond mastery of the draw call. It will continue to serve AMD as a graphics innovation platform available to select partners with custom needs.
     
  3. The Mantle SDK also remains available to partners who register in this co-development and evaluation program. However, if you are a developer interested in Mantle "1.0" functionality, we suggest that you focus your attention on DirectX® 12 or GLnext.

Essentially, AMD's Mantle API in its "1.0" form is at the end of its life: it is only supported for current partners, and the publicly available SDK will never be posted. Honestly, at this point, this isn't so much a letdown as it is a necessity. DX12 and GLnext have already superseded Mantle in terms of market share and mind share with developers, and any more work AMD put into getting devs on board with Mantle would be wasted effort.

mantle-2.jpg

Battlefield 4 is likely to be the only major title to use AMD Mantle

AMD claims to have future plans for Mantle, though it will continue to be available only to select partners with "custom needs." I would imagine this could expand outside the world of games, but it could also mean game consoles are the target, where developers are only concerned with AMD GPU hardware.

So, from our perspective, Mantle as we know it is pretty much gone. It served its purpose, making NVIDIA and Microsoft pay attention to the CPU bottlenecks in DX11, but it appears the dream was a bit bigger than the product could become. AMD shouldn't be chastised for this shift, nor for lofty goals that we kind of always knew were too steep a hill to climb. Just revel in the news about DX12 that pours out of GDC this week.

Source: AMD

Read more about Intel's new Cherry Trail

Subject: General Tech | March 2, 2015 - 01:49 PM |
Tagged: SoFIA, silvermont, modem, LTE, Intel, Cherry Trail, atom x7, atom x5, atom x3, 7260

With MWC in full swing, Intel showed off their mobile silicon to Ryan and to The Tech Report, who compiled complete specifications of the Cherry Trail-based Atom x5-8300 and x5-8500 as well as the x7-8700.  All three of these chips will come with an Intel-designed XMM 7260 LTE modem as well as WiFi and NFC connectivity, with the x7 also featuring Intel WiGig. You can also expect RealSense, True Key facial recognition, and Pro Wireless Display for sending secure wireless video to compatible displays for meetings.  Check out the full list of stats here.

gsmarena_001.jpg

"Intel says the dual-core Atom x3-C3130 is shipping now, while the quad-core Atom x3-C3230RK is coming later in the first half of the year. The LTE-infused Atom x3-C3440 will follow in the second half. In all, the chipmaker names 19 partners on board with the Atom x3 rollout, including Asus, Compal, Foxconn, Pegatron, Weibu, and Wistron."

Here is some more Tech News from around the web:

Tech Talk

MWC 15: HP Spectre x360 Has Broadwell Core i5 and i7

Subject: General Tech, Systems, Shows and Expos | March 1, 2015 - 11:07 PM |
Tagged: spectre x360, spectre, mwc 15, MWC, hp, Broadwell

HP announced their updated Spectre x360 at Mobile World Congress. Like the Lenovo Yoga, it has a hinge that flips all the way around, allowing the laptop to function as a 13.3-inch tablet with a 1080p IPS display. There are two stages between “tablet” and “laptop”, which are “stand” and “tent”. They are basically ways to prop up the touch screen while hiding the keyboard behind (or under) the unit. The stand mode is better for hands-free operation because it has a flat contact surface to rest upon, while the tent mode is probably more sturdy for touch (albeit it rests on two edges). The chassis is entirely milled aluminum, except the screen and things like that, of course.

The real story is the introduction of Core i5 and i7 Broadwell parts. The 12.5-hour battery life rating in a relatively thin form factor can be attributed to the low power requirements of the CPU and GPU, as well as the SSD (128GB, 256GB, or 512GB). RAM comes in two sizes, 4GB or 8GB, which will depend slightly on the chosen processor SKU.

hp-spectre-x360-broadwell.png

Which pun would be more annoying?
"Case closed" or "I rest my case"...?

Prices start at $899 and most variants are available now at HP's website.

Source: HP