Thoughts on the unintended consequences of Mantle and SteamOS

Subject: General Tech | October 11, 2013 - 12:19 PM |
Tagged: amd, Mantle, gaming, valve

The Tech Report has been thinking about the upcoming release of SteamOS and AMD's Mantle, and they see some problems that could come about because of them.  Fragmentation has always been a problem for PCs, whether it is hardware that never quite matches between systems or the wide variety of APIs and game engines on the software side.  It can be daunting to begin developing a game and decide whether optimizing for AMD, NVIDIA, or Intel is worth considering, as well as to choose between Direct3D and OpenGL or try to make both work.  Mantle is now another choice; a native Mantle version of BF4 will actually be released shortly after the first version of the game launches.  Valve has also hinted that several AAA titles will be released on SteamOS, not necessarily on Windows or Linux.  What effect could this have on PC gaming as these new choices arrive at the same time the next generation consoles are released?  Read on and see.

too-many-choices.png

"Valve's SteamOS and AMD's Mantle API have the potential to do great things for PC gaming. However, they also threaten to fragment the platform at a critical time, when next-gen consoles are about to reduce the PC's performance and image quality lead by a long shot."

Here is some more Tech News from around the web:

Tech Talk

Subject: Mobile
Manufacturer: ASUS

Introduction and Design

P9033132.jpg

As we’re swimming through the veritable flood of Haswell refresh notebooks, we’ve stumbled across the latest in a line of very popular gaming models: the ASUS G750JX-DB71.  This notebook is the successor to the well-known G75 series, which topped out at an Intel Core i7-3630QM with NVIDIA GeForce GTX 670MX dedicated graphics.  Now, ASUS has jacked up the specs a little more, including the latest 4th-gen CPUs from Intel as well as 700-series NVIDIA GPUs.

Our ASUS G750JX-DB71 test unit features the following specs:

specs.png

Of course, the closest comparison to this unit is the recently-reviewed MSI GT60-2OD-026US, which featured nearly identical specifications apart from a 15.6” screen, a better GPU (a GTX 780M with 4 GB GDDR5), and a slightly different CPU (the Intel Core i7-4700HQ).  In case you’re wondering what the difference is between the ASUS G750JX’s Core i7-4700MQ and the GT60’s i7-4700HQ, it’s very minor: the HQ features a slightly faster integrated graphics Turbo frequency (1.2 GHz vs. 1.15 GHz) and supports Intel Virtualization Technology for Directed I/O (VT-d).  Since the G750JX doesn’t support Optimus, we won’t ever be using the integrated graphics, and unless you’re doing a lot with virtual machines, VT-d isn’t likely to offer any benefits, either.  So for all intents and purposes, the CPUs are equivalent, meaning the biggest overall performance difference (on the spec sheet, anyway) lies with the GPU and the storage devices (where the G750JX offers more solid-state storage than the GT60).  It’s no secret that the MSI GT60 burned up our benchmarks, so the real question is: how close does the ASUS G750JX come to its pedestal, and if the differences are considerable, are they justified?

At an MSRP of around $2,000 (though it can be found for around $100 less), the ASUS G750JX-DB71 competes directly with the likes of the MSI GT60, too (which is priced equivalently).  The question, of course, is whether it truly competes.  Let’s find out!

P9033137.jpg

Continue reading our review of the ASUS G750JX-DB71 Gaming Notebook!!!

ARMA III is designed solely for the PC and it shows

Subject: General Tech, Graphics Cards | October 2, 2013 - 11:20 AM |
Tagged: gaming, ARMA III

Forget Crysis; if you want to hammer your PC, pick up ARMA III and try turning up the settings!  Even an i7-3770K @ 4.8GHz and GTX 780s in SLI struggle to render this game with all the graphical bells and whistles turned on.  The close-up landscapes and objects are gorgeous with high quality textures, but to truly get into the feel of the game you need to be able to turn up the view distance and the number of displayed objects, as you can see from [H]ard|OCP's screenshots below.  [H] spent a bit of time breaking down the best playable settings for numerous GPUs from NVIDIA and AMD, as well as showing you the impact that MSAA and PPAA have on visual quality and your PC's performance.  If you want to show off the superiority of a high end gaming machine, this is the game for you.

13805544468YzNbjlGpz_7_2_l.png

"ARMA III is our focus point for today. It features a large open world environment designed on a massive continent measuring 270 square kilometers. To go along side this massive continent is a max visibility range of 20km. Combine this with ARMA III's impressive looking graphics and we have a game that demands performance."

Here is some more Tech News from around the web:

Gaming

Source: [H]ard|OCP

Does Saints Row IV have what it takes to stress modern hardware?

Subject: General Tech | September 18, 2013 - 10:49 AM |
Tagged: gaming, Saints Row IV

[H]ard|OCP lined up five GPUs from the two competitors to see if Saints Row IV can benefit from serious GPU power.  They set the performance bar at an average of 40 fps and raised the graphics options as high as they could while staying above that target frame rate.  For the high end GPUs, 2560x1600 was playable at the highest settings, though the mid-range cards needed to drop to 1920x1080 to remain playable, except for the HD 7870 GHz Edition, which retained the higher resolution.  As you can guess from the fact that even a 650 Ti or 7790 can max out the graphics options, there is not much new in this game from a quality perspective, and it really does not stress modern GPUs.  You can have fun playing it, but don't expect jaw-dropping scenery.
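
The methodology described above (set a 40 fps floor, then keep the most demanding settings that stay above it) can be sketched in a few lines; the preset names and frame rates below are hypothetical, not [H]'s actual numbers or tooling:

```python
# Illustrative sketch of the "highest playable settings" method:
# walk the presets from most to least demanding and keep the first
# one whose average frame rate stays at or above the 40 fps target.

TARGET_FPS = 40

# Hypothetical benchmark results at one resolution, ordered most
# demanding first.
runs = [
    ("Ultra", 34.2),
    ("Very High", 41.7),
    ("High", 48.3),
]

def highest_playable(runs, target=TARGET_FPS):
    """Return the most demanding preset that averages >= target fps."""
    for preset, avg_fps in runs:
        if avg_fps >= target:
            return preset
    return None

print(highest_playable(runs))  # picks "Very High" for the data above
```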

H_quality.png

"Deep Silver's next game is out in the Saints Row saga. Today we examine Saints Row IV focusing on the games performance with the latest hardware on the market. We dissect image quality in great detail and find out if this is a game we expect to see in 2013, or if it falls flat on its face in the innovation department."

Here is some more Tech News from around the web:

Gaming


Source: [H]ard|OCP

Sony PS Vita TV Supports Vita Games, Online Services, and PS4 Remote Play for $100

Subject: General Tech | September 9, 2013 - 02:07 AM |
Tagged: sony, remote play, ps vita, playstation 4, gaming

Today, Sony announced a new Vita-branded product called the PlayStation Vita TV. The small 60mm x 100mm box connects to televisions over HDMI and is able to play Vita games using a PS3-style controller or a touchpad-equipped PS4 controller.

PS Vita TV.jpg

The PS Vita TV also connects to your home network over Ethernet and is able to pull down content from various Sony online services including Music Unlimited, Video Unlimited, and Karaoke according to Engadget.

Those features alone make it an interesting product, but the PS Vita TV will also be able to connect to the PlayStation 4 over your home network and remote play PS4 games. Users will be able to play PS4 games on a second TV using a PS4 controller and network-connected PS Vita TV.

PS Vita TV_Logo.jpg

The PS Vita TV will be available in Japan in November for 9,954 Yen ($100 USD). Alternatively, a bundle that includes the PS Vita TV, controller, and memory card can be purchased for 14,995 Yen ($150 USD).

If it works as advertised, the PS Vita TV looks to be an excellent companion product to the PS4 which will allow users to play their PS4 and PS Vita library and access streaming content in multiple rooms without needing to pony up for multiple PlayStation 4 consoles.

I hope that the PS Vita TV comes to the US, as it should shake up the decision between the Xbox One and PS4 in favor of the latter: the $100 Vita TV brings the two consoles to the same total price, but with the PS4 side offering remote play and more powerful hardware. In short, I believe the PS Vita TV would be a much more desirable add-on than Microsoft's bundled Kinect.

Does the announcement of the PS Vita TV affect your pre-order decisions at all?

Source: Engadget

A Roman of a different colour

Subject: General Tech | September 4, 2013 - 02:21 PM |
Tagged: total war, rome, gaming, creative assembly

The Total War series has come a long way, from campaign maps that played like a game of Risk and cloned troopers on the battlefield to gorgeous landscapes, much more realistic movements, and incredibly detailed units in battle.  On the other hand, the long awaited next installment, Total War: Rome II, might have gone a bit too far.  It is not necessarily the obscene amount of time it takes to process the AI's turns, nor the inevitable bugs that crept through the QA process; the ability to easily distribute 100MB patches has degraded every publisher's QA process to a joke compared to the days of dial-up.  Instead, it is the realization that the niggling feeling, as you push the End Turn button, that you have left something undone is caused by the fact that you did nothing that turn at all.  The campaign map in Total War has never been fast paced, nor is it meant to be; instead there have always been a million micromanagement tasks to complete every turn, whereas in this new Rome you often have nothing to do but bash the End Turn button for a few seasons.

It is as Rock, Paper, SHOTGUN comments: "I feel that Total War should be a coiled armadillo rather than Rome II's jellyfish."

Medieval_Total_War_Campaign.jpg

"So then. I am usually in the Total War apologist camp, but not this time. I am not sure if it’s because I had a better experience with Shogun 2, or whether there’s some kind of allergy due to over-exposure going on, but Rome II rubbed me up the wrong druid."

Here is some more Tech News from around the web:

Gaming

Microsoft Shows Off Xbox One SoC At Hot Chips 25

Subject: General Tech | August 29, 2013 - 08:22 PM |
Tagged: xbox one, SoC, microsoft, gaming, console, amd

At the Hot Chips conference earlier this week, Microsoft showed off several slides detailing the SoC used in its upcoming Xbox One gaming console.

The Xbox One uses a System on a Chip (SoC) designed by AMD’s Semi-Custom Business Unit. The processor features eight “Jaguar” AMD CPU cores, an AMD GCN (Graphics Core Next) based GPU with 768 shader cores, an audio co-processor, and 32MB of on-chip eSRAM.

The SoC, measuring 363 mm², is manufactured on TSMC’s 28nm HPM process. The chip can interface with 8GB of DDR3 main memory with bandwidth of 68.3 GB/s, or utilize the on-chip eSRAM, which has bandwidth of 102 GB/s. The embedded SRAM is in addition to the smaller L1 and L2 caches. The slides indicate that the GPU and CPU can at least access the SRAM, though it remains frustratingly unknown whether the SoC supports anything like AMD’s hUMA technology, which would allow the CPU and GPU to both read and write the same memory address spaces without having to copy data between CPU- and GPU-accessible memory. It may be that the CPU and GPU can both use the SRAM but cannot share the same memory spaces, though that may be the pessimist in me talking. On the other hand, there could be something more, but it’s impossible to say from the block diagram spotted by Semi-Accurate at the Microsoft presentation.
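
Those bandwidth figures are easy to sanity-check. Assuming a 256-bit DDR3 interface at 2133 MT/s and a 1024-bit eSRAM path at an ~800 MHz clock (the bus widths and clocks are my assumptions, not stated in the slides), the peak numbers fall out of simple arithmetic:

```python
# Back-of-the-envelope check of the quoted memory bandwidth figures.
# Assumed (not from the slides): 256-bit DDR3 at 2133 MT/s, and a
# 1024-bit eSRAM path at an ~800 MHz clock.

def bandwidth_gbs(bus_bits, transfers_per_s):
    """Peak bandwidth in GB/s: bus width in bytes times transfer rate."""
    return (bus_bits / 8) * transfers_per_s / 1e9

ddr3 = bandwidth_gbs(256, 2.133e9)   # ~68.3 GB/s, matching the slide
esram = bandwidth_gbs(1024, 0.8e9)   # ~102.4 GB/s, matching the slide

print(f"DDR3: {ddr3:.1f} GB/s, eSRAM: {esram:.1f} GB/s")
```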

Microsoft Xbox One Gaming Console.png

With that said, the slides do reveal a few interesting figures about the SoC that were not previously known. The Xbox One SoC has 47MB of on-chip memory, including the 32MB of eSRAM used by the CPU and GPU and 64KB of SRAM used by the audio co-processor. The chip’s GPU supports Microsoft’s DirectX 11.1 (and above) graphics API. Further, Microsoft rates the GPU at 1.31 TFLOPS, 41 Gigatexels-per-second, and 13.6 Gigapixels-per-second. Additionally, the GCN-based GPU can hardware-encode multi-stream H.264 AVC MVC video and hardware-decode multiple formats, including H.264 MVC. The hardware encoder is likely being used for the console’s game capture functionality.
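
For the curious, those throughput ratings are consistent with a 853 MHz GPU clock, 48 texture units, and 16 ROPs; those three values are assumptions on my part rather than figures from the slides. A quick check:

```python
# Sanity check on the quoted GPU throughput figures, assuming an
# 853 MHz GPU clock, 48 texture units, and 16 ROPs (assumed values,
# not taken from the Hot Chips slides).

clock_hz = 853e6
shaders, tmus, rops = 768, 48, 16

tflops = shaders * 2 * clock_hz / 1e12   # 2 FLOPs per shader per clock (FMA)
gtexels = tmus * clock_hz / 1e9          # one texel per TMU per clock
gpixels = rops * clock_hz / 1e9          # one pixel per ROP per clock

print(f"{tflops:.2f} TFLOPS, {gtexels:.0f} GT/s, {gpixels:.1f} GP/s")
# lines up with Microsoft's 1.31 TFLOPS, 41 GT/s, and 13.6 GP/s ratings
```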

The audio processors in the Xbox One SoC use two 128-bit SIMD floating point vector cores rated at 15.4 GFLOPS, along with “specialized hardware engines” and “signal processing optimized vector and scalar cores.”

The final interesting specification I got from the slides is that the SoC can enter a low power state drawing as little as 2.5% of the chip’s full power, using power islands and clock gating techniques.

You can find all of the geeky details in these slides over at SemiAccurate.

Source: SemiAccurate

Roccat Launches New Kave XTD 5.1 Digital Gaming Headset

Subject: General Tech | August 26, 2013 - 09:50 PM |
Tagged: headset, gaming, kave xtd, roccat, audio, surround sound, 5.1 headset

German manufacturer Roccat took the wraps off the Kave XTD 5.1 Digital gaming headset at Gamescom last week. The new headset is a 25% lighter update to the original Kave 5.1, with a tweaked headband and mic. The Kave XTD 5.1 Digital includes 5.1 over-the-ear style headphones, a removable microphone, and a desktop control pod with various audio controls.

Roccat Kave XTD 5.1 Digital Gaming Headset.jpg

The Kave XTD 5.1 Digital has three drivers per ear that are placed at a 12-degree angle and reportedly provide realistic surround sound. The removable microphone has been reworked to provide better sound quality, according to the Roccat press release.

Roccat Kave XTD 5.1 Desktop Remote.jpg

In addition to the headset itself, the Kave XTD 5.1 Digital comes with a control pod with a built-in sound card. The desktop remote has a volume dial, mic mute, phone call answer, speaker, and movie mode buttons. Four 3.5mm audio ports on the control pod allow users to connect and control surround sound speakers. Users can then switch between sending audio to the speakers or the headset by hitting a button on the desktop remote. Further, it has a technology called Smart Link that allows users to pair the pod with a smartphone over Bluetooth in order to answer phone calls without removing the headset.

In all, it looks like an interesting product, though I would wait for reviews before putting down cash for a 5.1 headset. The Roccat Kave XTD 5.1 Digital will be available in November for $169.99.

Source: Roccat

GCN-Based AMD 7000 Series GPUs Will Fully Support DirectX 11.2 After Driver Update

Subject: Graphics Cards | August 25, 2013 - 10:24 PM |
Tagged: amd, Windows 8.1, microsoft, directx 11.2, graphics cards, gaming, GCN

Earlier this month, several websites reported that AMD’s latest Graphics Core Next (GCN) based graphics cards (the 7000 series and 8000 series OEM lines) would not be compatible with the Windows 8.1-only DirectX 11.2 API. This was inferred from a statement made by AMD engineer Layla Mah in an interview with c't Magazin.

AMD Radeon 7970 GHz Edition.jpg

An AMD Radeon HD 7970 GHz Edition.

Fortunately, the GCN-based cards will fully support DirectX 11.2 once an updated driver has been released. As it turns out, Microsoft’s final DirectX 11.2 specification ended up being slightly different than what AMD expected. As a result, the graphics cards do not currently fully support the API. The issue is not one of hardware, however, and an updated driver can allow the GCN-based 7000 series hardware to fully support the latest DirectX 11.2 API and major new features such as tiled resources.

The updated driver will reportedly be released sometime in October to coincide with Microsoft’s release of Windows 8.1. Specifically, Maximum PC quoted AMD in stating the following:

"The Radeon HD 7000 series hardware architecture is fully DirectX 11.2-capable when used with a driver that enables this feature. AMD is planning to enable DirectX 11.2 with a driver update in the Windows 8.1 launch timeframe in October, when DirectX 11.2 ships. Today, AMD is the only GPU manufacturer to offer fully-compatible DirectX 11.1 support, and the only manufacturer to support Tiled Resources Tier-2 within a shipping product stack.”

So fret not, Radeon 7000-series owners: you will be able to fully utilize DX 11.2 and all of its features once games start implementing them, assuming you upgrade to Windows 8.1.

Source: Maximum PC

Just in case you forgot, console gamers are getting new toys

Subject: General Tech | August 21, 2013 - 11:47 AM |
Tagged: xbox one, xbone, ps4, gaming

Today we found out that the PlayStation 4 will be available in the US on November 15th and in the UK on the 29th.  In the US you can expect to pay $400, and across the pond it will run you £349.  Microsoft immediately followed, not by announcing their special day but by revealing a number of the games you will be able to play, with hints of very similar release dates.  The Xbone will be more expensive, $500 US or £429 in the UK, with pricing on additional controllers also available at The Inquirer.  In case you've forgotten the tech specs you can get a quick refresher here; I will likely still be addicted to Rome II.

xbox-one-vs-ps4.jpg

"THE DUST IS SETTLING on the E3 games trade show keynotes and we are left picking through the facts given out about the Sony PS4 and Microsoft Xbox One consoles.

The good news is that both consoles cost a lot less than the £600 that Amazon had estimated."

Here is some more Tech News from around the web:

Gaming

Source: The Inquirer