Watch_Dogs revisited

Subject: General Tech | August 20, 2014 - 10:52 AM |
Tagged: gaming, watch_dogs, 4k

After three months, two patches, driver updates and many a flamewar, [H]ard|OCP has posted their complete performance review of Watch_Dogs.  From the mighty Titan to the much more reasonably priced R9 270, the performance of almost a dozen cards is tested on this much-hyped game.  The high-end cards were paired and tested at 4K resolution, with the R9 290X CrossFire setup coming out on top and holding that lead when tested in single-GPU configurations at 2560x1600.  Indeed, even at 1080p AMD was able to provide higher quality settings at an acceptable cost in performance.  Read the full review to see the visual effects of the various graphics settings as well as the preferred cards at each resolution.

After the podcast tonight, or indeed just about any night, you can find some of the Fragging Frogs online playing a variety of games.  If you haven't checked them out yet, you can learn all you need to know about joining up with one of the most fun groups of gamers online right here.

14083370007cx2pU3ZI8_3_2_l.jpg

"We published a preview of Watch Dogs performance when it was released back in May this year. We have given this game time to mature. Now that a couple of game patches have been released, along with newer drivers from NVIDIA and AMD, it is time for our full performance and image quality comparison review."

Here is some more Tech News from around the web:

Gaming

Source: [H]ard|OCP

Run Windows on Intel's Galileo

Subject: General Tech | August 20, 2014 - 09:35 AM |
Tagged: galileo, Intel, windows, SoC

Intel's first-generation low-powered development board, which goes by the name of Galileo and is powered by a 400MHz Quark X1000 SoC, is now capable of running Windows with the help of the latest firmware update.  If you are familiar enough with Intel's tweaked Arduino IDE, you should be able to build a testbed for low-powered machines running Windows.  You will want to have some time on hand: loading Windows onto the microSD card can take up to two hours, and those used to SSDs will be less than impressed with the boot times.  For developers this is not an issue and well worth the wait, as it gives them a brand new tool to work with.  Pop by The Register for the full details of the firmware upgrade and installation process.
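If you want a quick sanity check that the board still responds once the new firmware is flashed, the classic blink sketch from the Arduino IDE will do; here is a minimal version, assuming the stock IDE and the on-board LED wired to pin 13 (as on the Gen 1 Galileo):

```cpp
// Minimal blink sketch for a Galileo smoke test (assumes the on-board
// LED is on pin 13, as on the Gen 1 board).
void setup() {
  pinMode(13, OUTPUT);      // drive the LED pin
}

void loop() {
  digitalWrite(13, HIGH);   // LED on
  delay(1000);              // wait one second
  digitalWrite(13, LOW);    // LED off
  delay(1000);
}
```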

galileo_1.jpg

"Windows fans can run their OS of choice on Intel’s counter to Raspberry Pi, courtesy of an Intel firmware update."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

An odd Q2 for tablets and PCs

Subject: General Tech, Graphics Cards | August 19, 2014 - 09:30 AM |
Tagged: jon peddie, gpu market share, q2 2014

Jon Peddie Research's latest Market Watch adds even more ironic humour to the media's continuing proclamations of the impending doom of the PC industry.  This quarter saw tablet sales decline while overall PC sales were up, and that without any major releases to drive purchasers to adopt new technology.  While JPR does touch on the overall industry, this report focuses on sales of GPUs and APUs and happens to contain some great news for AMD.  They saw their shipments increase by 11% from last quarter, a gain of just over one percentage point of the entire market.  Intel saw a small rise in share and still holds the majority of the market, as PCs with no discrete GPU are more likely to contain Intel's chips than AMD's.  That leaves NVIDIA, which is still banking solely on discrete GPUs and saw an over-8% decline from last quarter, a loss of almost two percentage points of the total market.  Check out the other graphs in JPR's overview right here.
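To keep JPR's two figures straight: the 11% gain is relative to AMD's own prior share, while the "percent of the entire market" figure is an absolute change. A toy calculation with made-up numbers (not JPR's actual data) shows the difference:

```cpp
// Relative vs. absolute market share changes; the 10% starting
// share is hypothetical, not a JPR figure.
#include <cstdio>

int main() {
    double last_quarter = 0.10;                // assume 10% of all GPU shipments
    double this_quarter = last_quarter * 1.11; // an 11% gain relative to itself
    std::printf("new share: %.1f%% (absolute gain: %.1f points)\n",
                this_quarter * 100.0,
                (this_quarter - last_quarter) * 100.0);
    return 0;
}
```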

unnamed.jpg

"The big drop in graphics shipments in Q1 has been partially offset by a small rise this quarter. Shipments were up 3.2% quarter-to-quarter, and down 4.5% compared to the same quarter last year."

Here is some more Tech News from around the web:

Tech Talk

Also, Corsair's Cherry MX RGB Launch Date Changed

Subject: General Tech, Cases and Cooling | August 18, 2014 - 07:02 PM |
Tagged: corsair, mechanical keyboard, cherry mx rgb

So I actually did not see this until after I published the Razer story. Just a few hours ago, Corsair posted an announcement to their Facebook page that claimed a "change" in launch date for their Cherry MX RGB-based keyboards. I had actually forgotten that the K70 RGB Red was supposed to be out already, with availability listed as "late July" (the rest were scheduled to arrive in "late August"). Corsair does not yet have a new date, but will comment "in a few weeks".

corsair-cherry-mx-rgb-front.jpg

Got to say, that does look nice.

While, again, no further details are given, it sounds like a technical hurdle is holding back the launch. Corsair claims that they want the product to live up to expectations. This delay, of course, chips further at the company's exclusivity window and puts them in direct competition with Razer's custom design; they may even reach market second, almost in spite of the exclusivity arrangement.

Razer BlackWidow Ultimate Chroma Announced

Subject: General Tech, Cases and Cooling | August 18, 2014 - 06:17 PM |
Tagged: razer, mechanical keyboard

Earlier in the year, we reported on Corsair's exclusivity over Cherry MX RGB-based mechanical keyboards. The thing is, Razer develops their own switches and is not reliant on ZF Electronics (Cherry Corporation). The Razer BlackWidow Ultimate Chroma mechanical keyboard uses their own switches, not Cherry's, and is not subject to Corsair's exclusivity. The keyboard can be ordered now for $179.99 USD and will be available in September.

razer-blackwidow-ultimate-chroma-front.png

I contacted Razer and asked them about their technology. They could not provide any direct comparison between their design and the Cherry MX RGB, but they were able to add a few details about their offering. The BlackWidow Ultimate Chroma was designed with its LEDs positioned away from moving parts and lined up with the keycap imprint, pointed upward for brightness.

Razer will be providing developers with Chroma SDK, allowing games and applications to control the Chroma-enabled device lighting to assist or immerse their users. I say "Chroma-enabled device" rather than "Chroma keyboards", because they already have plans for mice and headsets with the same technology. At the very least, they expect that users will appreciate coordinated colors across their gaming peripherals.

The Razer BlackWidow Ultimate Chroma is available to order, for $179.99 USD ($199.99 CDN), and ships in September. A Chroma-enabled mouse, based on the DeathAdder design, and a Chroma-enabled headset, based on the Kraken model, are announced but do not yet have pricing or availability information.

Source: Razer

Can you really have a wireless gaming mouse?

Subject: General Tech | August 18, 2014 - 02:05 PM |
Tagged: input, mouse, wireless gaming mouse, SteelSeries Sensei

Gaming mice have wires because it reduces the input lag that would otherwise be the death of you while gaming.  Unfortunately for some, this means they cannot sit on the couch streaming YouTube to their TVs since the wire on their mouse just isn't long enough.  SteelSeries claims to have overcome the technical problems of gaming wirelessly with their Sensei Wireless.  The software is definitely aimed at gamers, with an impressive array of settings to tweak and a capable macro editor, but software alone cannot solve latency issues.  Believe it or not, when Techgage compared it to a wired mouse they could not detect any difference whatsoever.  I would still recommend wearing pants while frying bacon, regardless of your final mouse choice.

steelseries_sensei_wireless_mouse_01_thumb.jpg

"Want a high-performance wireless gaming mouse that doesn’t have its battery-life measured in seconds? Well, SteelSeries has released its renowned Sensei into the wild, free to run and frolic in grassy meadows, without the need of being tethered to unsightly cables. Does the result live up to our high expectations? There’s only one way to find out."

Here is some more Tech News from around the web:

Tech Talk

Source: Techgage

A good old Gigabyte Overclocking Competition

Subject: General Tech | August 18, 2014 - 10:33 AM |
Tagged: gigabyte, Extreme Overclocking Competition, overclocking

The weapons this year at Gigabyte's EOC were a Core i5-4690K and Core i7-4790K, a Gigabyte Z97X-SOC FORCE LN2 motherboard, a Gigabyte HD 7790, G.Skill TridentX F3-2933C12D-8GTXDG memory, and a Seasonic SSX-1200 Platinum PSU.  Team Awardfabrik hit 6578MHz on the i7-4790K with a mix of luck and skill, while Team Switzerland took the top spot for memory at 2106.3MHz.  Raw speed of one component is not enough to win this competition, and when the nitrogen fog lifted it was Team HardwareLuxx with the overall win.  Check out which benchmarks were run, along with pictures and video from the event, at MadShrimps.

EOC.JPG

"Each year Gigabyte Germany organizes the Extreme Overclocking Competition. At the EOC the best overclocking teams of Germany have a chance to prove who is still king. The main organizer behind each event is Germany’s finest Roman Hartung also known as der8auer at HWBot.org. This year besides Gigabyte also G.Skill, Intel, Seasonic and Gelid solutions provided the required hardware and funds to allow this clash of the titans to take place at the Know Cube at the Heilbronn Tech University."

Here is some more Tech News from around the web:

Tech Talk

Source: MadShrimps

Falcon Northwest Tiki-Z Special Edition Crams Titan Z And Liquid Cooled i7-4790K CPU Into A Stylish Micro Tower

Subject: General Tech, Systems | August 15, 2014 - 10:40 PM |
Tagged: titan z, tiki-z, gtx titan z, gk110, falcon northwest, dual gpu

The Tiki-Z Special Edition is the latest custom PC from boutique vendor Falcon Northwest. This high-end enthusiast system, which starts at $5,614, manages to pack a dual-GPU graphics card, liquid-cooled CPU, 600W power supply, and up to 6TB of storage into a stylish micro tower that measures a mere 4” wide and 13” tall.

Falcon Northwest has taken the original Tiki chassis and made several notable tweaks to accommodate NVIDIA’s latest dual-GPU card: the GeForce GTX TITAN Z, which we reviewed here. The case has a custom (partial) side window that shows off the graphics card. This window can be green glass or smoke-tinted acrylic with customizable laser-cut venting. A ducted intake feeds cool air to the graphics card, while vents at the rear and front of the case exhaust hot air. The exterior of the case can be painted in any single color of automobile paint for free, or given a fully customized paint scheme with artwork at additional cost.

Falcon Northwest Tiki-Z Micro Tower.jpg

In addition to the Titan Z with its 5,760 CUDA cores, 12GB of memory, and 8.1 TFLOPS of peak compute power, Falcon Northwest has packed in a modular small form factor 600W PSU from SilverStone, an ASUS Z97I-Plus motherboard, an Intel Core i7-4790K “Devil’s Canyon” CPU with liquid cooler, up to 16GB of DDR3-1866 memory from G.Skill, and up to 6TB of storage (two 1TB SSDs and one 4TB Western Digital Green hard drive). The i7-4790K comes stock clocked at 4GHz (4.4GHz max turbo), but can be overclocked by Falcon Northwest upon request.

Needless to say, that is a lot of hardware to cram into a PC that can easily sit next to your monitor at your desk or in your living room!

The engineering, artwork, and support of this high-end system all come at a price, however. The new Titan Z-powered boutique PC starts at $5,614 USD and is available now from Falcon Northwest. To sweeten the deal, for a limited time Falcon Northwest is including a free ASUS PB287Q 4K monitor (3840x2160, 60Hz, 1ms response time; see more specifications in our review) with each Tiki-Z purchase.

This system is an impressive feat of engineering and it certainly looks sharp with the artwork, custom side panel, and compact form factor. My only concern from a usability standpoint would be noise from the cooling systems for the GPU, CPU radiator, and PSU. One also has to consider that the Titan Z graphics card by itself is priced at $3,000, which puts the Tiki-Z back into the somewhat sane world of boutique PC pricing (heh, at about $2,600 for the system minus the GPU). No question, this is not going to be a system for everyone, and it will even be a niche product within the niche market of enthusiasts interested in pre-built gaming systems. Even so, if noise levels can be held in check it will make for one powerful little gaming box!

Khronos Announces "Next" OpenGL & Releases OpenGL 4.5

Subject: General Tech, Graphics Cards, Shows and Expos | August 15, 2014 - 05:33 PM |
Tagged: siggraph 2014, Siggraph, OpenGL Next, opengl 4.5, opengl, nvidia, Mantle, Khronos, Intel, DirectX 12, amd

Let's be clear: there are two stories here. The first is the release of OpenGL 4.5 and the second is the announcement of the "Next Generation OpenGL Initiative". Both arrive in the same press release, but they are two different statements.

OpenGL 4.5 Released

OpenGL 4.5 expands the core specification with a few extensions. Compatible hardware with OpenGL 4.5 drivers is guaranteed to support these. The additions include features like direct_state_access, which allows objects to be created and modified without first binding them to the context, and support for OpenGL ES 3.1 features traditionally missing from OpenGL 4, which makes porting OpenGL ES 3.1 applications to OpenGL easier.
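To make the direct_state_access change concrete, here is a minimal sketch, assuming an OpenGL 4.5 context is already current and a loader such as GLAD has resolved the entry points (window and context creation omitted):

```cpp
// Creating and filling a vertex buffer, old style vs. OpenGL 4.5 DSA.
#include <glad/glad.h>

GLuint make_vertex_buffer(const float* vertices, GLsizeiptr bytes) {
    GLuint buf;

    // Pre-4.5 style: the object must be bound to a context target
    // before it can be edited, disturbing whatever was bound there:
    //   glGenBuffers(1, &buf);
    //   glBindBuffer(GL_ARRAY_BUFFER, buf);
    //   glBufferData(GL_ARRAY_BUFFER, bytes, vertices, GL_STATIC_DRAW);

    // DSA style: the buffer is created fully initialized and edited
    // by name, with no bind-to-edit step at all.
    glCreateBuffers(1, &buf);
    glNamedBufferData(buf, bytes, vertices, GL_STATIC_DRAW);
    return buf;
}
```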

opengl_logo.jpg

It also adds a few new optional extensions:

ARB_pipeline_statistics_query lets a developer ask the GPU what it has been doing. This could be useful for "profiling" an application (listing completed work to identify optimization points).

ARB_sparse_buffer allows developers to perform calculations on pieces of generic buffers without loading the whole buffer into memory. This is similar to ARB_sparse_texture... except that extension is for textures. Buffers are useful for things like vertex data (and so forth).

ARB_transform_feedback_overflow_query is apparently designed to let developers choose whether or not to draw objects based on whether a transform feedback buffer has overflowed. I might be wrong, but it seems like this would be useful for deciding whether or not to draw objects generated by geometry shaders.

KHR_blend_equation_advanced allows new blending equations between objects. If you use Photoshop, this would be "multiply", "screen", "darken", "lighten", "difference", and so forth. On NVIDIA's side, this will be directly supported on Maxwell and Tegra K1 (and later). Fermi and Kepler will support the functionality, but the driver will perform the calculations with shaders. AMD has yet to comment, as far as I can tell.
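As a taste of how one of the optional extensions above is used, here is a hedged sketch of ARB_pipeline_statistics_query counting submitted vertices, assuming a context that actually advertises the extension:

```cpp
// Profiling with a pipeline statistics query (ARB extension tokens).
#include <glad/glad.h>
#include <cstdio>

void count_submitted_vertices() {
    GLuint query;
    glGenQueries(1, &query);

    glBeginQuery(GL_VERTICES_SUBMITTED_ARB, query);
    // ... issue the draw calls you want to measure here ...
    glEndQuery(GL_VERTICES_SUBMITTED_ARB);

    GLuint64 vertices = 0;
    // GL_QUERY_RESULT blocks until the GPU has finished counting.
    glGetQueryObjectui64v(query, GL_QUERY_RESULT, &vertices);
    std::printf("vertices submitted: %llu\n", (unsigned long long)vertices);

    glDeleteQueries(1, &query);
}
```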

nvidia-opengl-debugger.jpg

Image from NVIDIA GTC Presentation

For developers, NVIDIA has launched 340.65 (340.23.01 for Linux) beta drivers. If you are not looking to create OpenGL 4.5 applications, do not get this driver; you really should not have any use for it at all.

Next Generation OpenGL Initiative Announced

The Khronos Group has also announced "a call for participation" to outline a new specification for graphics and compute. They want it to give developers explicit control over CPU and GPU tasks, be multithreaded, have minimal overhead, have a common shader language, and offer "rigorous conformance testing". This sounds a lot like the design goals of Mantle (and what we know of DirectX 12).

amd-mantle-queues.jpg

And really, from what I hear and understand, that is what OpenGL needs at this point. Graphics cards look nothing like they did a decade ago (or over two decades ago). They each have very similar interfaces and data structures, even if their fundamental architectures vary greatly. If we can draw a line in the sand, legacy APIs can be supported but not optimized heavily by the drivers. After a short time, available performance for legacy applications would be so high that it wouldn't matter, as long as they continue to run.

On top of that, next-generation drivers should be significantly easier to develop, considering their reduced error checking (and other responsibilities). As I said in Intel's DirectX 12 story, it is still unclear whether the performance increase will be enough to make most optimizations unnecessary, particularly those which trade extra workload or developer effort for queuing fewer GPU commands. We will need to wait for game developers to use it for a bit before we know.

Prying OpenGL to slip a little Mantle inside

Subject: General Tech | August 15, 2014 - 10:09 AM |
Tagged: amd, Mantle, opengl, OpenGL Next

Along with his announcements about FreeSync, Richard Huddy also discussed OpenGL Next, its relationship with Mantle, and the role Mantle played in DirectX 12's development.  AMD has given the Khronos Group, the developers of OpenGL, complete access to Mantle to help them integrate it into future versions of the API, starting with OpenGL Next.  He also discussed the advantages of Mantle over DirectX, citing AMD's ability to update it much more frequently than Microsoft has updated DX.  With over 75 developers working on titles that take advantage of Mantle, the interest is definitely there, but it is uncertain whether devs will actually benefit from an API which updates faster than a game can be developed.  Read on at The Tech Report.

Richard Huddy-578-80.jpg

"At Siggraph yesterday, AMD's Richard Huddy gave us an update on Mantle, and he also revealed some interesting details about AMD's role in the development of the next-gen OpenGL API."

Here is some more Tech News from around the web:

Tech Talk

Richard Huddy Discusses FreeSync Availability Timeframes

Subject: General Tech, Displays | August 14, 2014 - 01:59 PM |
Tagged: amd, freesync, g-sync, Siggraph, siggraph 2014

At SIGGRAPH, Richard Huddy of AMD announced the release windows of FreeSync, their adaptive refresh rate technology, to The Tech Report. Compatible monitors will begin sampling "as early as" September. Actual products are expected to ship to consumers in early 2015. Apparently, more than one display vendor is working on support, although names and vendor-specific release windows are unannounced.

amd-freesync1.jpg

As for cost of implementation, Richard Huddy believes that the added cost should be no more than $10-20 USD (to the manufacturer). Of course, the final price to end-users cannot be derived from this - that depends on how quickly the display vendor expects to sell product, profit margins, their willingness to push new technology, competition, and so forth.

If you want to take full advantage of FreeSync, you will need a compatible GPU (look for "gaming" support in AMD's official FreeSync compatibility list). All future AMD GPUs are expected to support the technology.

Source: Tech Report

Podcast #313 - New Kaveri APUs, ASUS ROG Swift G-Sync Monitor, Intel Core M Processors and more!

Subject: General Tech | August 14, 2014 - 12:30 PM |
Tagged: video, ssd, ROG Swift, ROG, podcast, ocz, nvidia, Kaveri, Intel, g-sync, FMS 2014, crossblade ranger, core m, Broadwell, asus, ARC 100, amd, A6-7400K, A10-7800, 14nm

PC Perspective Podcast #313 - 08/14/2014

Join us this week as we discuss new Kaveri APUs, ASUS ROG Swift G-Sync Monitor, Intel Core M Processors and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Program length: 1:41:24
 

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

 

Diamond's Xtreme Sound XS71HDU looks good but how does it sound?

Subject: General Tech | August 14, 2014 - 12:00 PM |
Tagged: audio, diamond multimedia, Xtreme Sound XS71HDU, usb sound card, DAC

The Diamond Xtreme Sound XS71HDU could be a versatile $60 solution for those with high-end audio equipment that would benefit from a proper DAC.  With both optical in and out, it is capable of more than an onboard solution, not to mention its six 3.5mm jacks covering stereo headphones, the rear, sub, and side channels of 7.1 surround, mic, and line in.  The design and features are impressive; however, the performance failed to please The Tech Report, who felt that there were similar solutions with much higher quality sound reproduction.

back.jpg

"We love sound cards here at TR, but they don't fit in every kind of PC. Diamond's Xtreme Sound XS71HDU serves up the same kinds of features in a tiny USB package suitable for mini-PCs and ultrabooks. We took it for a spin to see if it's as good as it looks."

Here is some more Tech News from around the web:

Audio Corner

That Linux thing that nobody uses

Subject: General Tech | August 14, 2014 - 10:31 AM |
Tagged: linux

For many, Linux is a mysterious thing that is either dead or about to die because no one uses it.  Linux.com has put together an overview of what Linux is and where it can be found in use.  Much of what they describe in the beginning applies to all operating systems, as they share similar features; it is only in the details that they differ.  If you have only thought of Linux as that OS you can't game on, it is worth taking a look through the descriptions of the distributions and the reasons people choose to use Linux.  You may never build a box that runs Linux, but if you are considering buying a Steambox when they arrive on the market you will find yourself using a type of Linux, and a basic understanding of the parts of the OS will help with troubleshooting and optimization.   If you already use Linux then fire up Steam and take a break.

software-center.png

"For those in the know, you understand that Linux is actually everywhere. It's in your phones, in your cars, in your refrigerators, your Roku devices. It runs most of the Internet, the supercomputers making scientific breakthroughs, and the world's stock exchanges."

Here is some more Tech News from around the web:

Tech Talk

Source: Linux.com

Intel and Microsoft Show DirectX 12 Demo and Benchmark

Subject: General Tech, Graphics Cards, Processors, Mobile, Shows and Expos | August 13, 2014 - 06:55 PM |
Tagged: siggraph 2014, Siggraph, microsoft, Intel, DirectX 12, directx 11, DirectX

Along with GDC Europe and Gamescom, Siggraph 2014 is going on in Vancouver, BC, where Intel had a DirectX 12 demo at their booth. The scene, containing 50,000 asteroids, each in its own draw call, was developed on both Direct3D 11 and Direct3D 12 code paths, and the demo could apparently switch between them while running. Intel claims to have measured both power draw and frame rate.

intel-dx12-LockedFPS.png

Variable power to hit a desired frame rate, DX11 and DX12.

The test system is a Surface Pro 3 with an Intel HD 4400 GPU. Doing a bit of digging, this would make it the i5-based Surface Pro 3. Removing another shovel-load of mystery, that would be the Intel Core i5-4300U: two cores, four threads, 1.9 GHz base clock, up to 2.9 GHz turbo clock, 3MB of cache, and (of course) the Haswell architecture.

While not top-of-the-line, it is also not bottom-of-the-barrel. It is a respectable CPU.

Intel's demo on this processor shows a significant power reduction in the CPU, and even a slight decrease in GPU power, for the same target frame rate. If power was not throttled, Intel's demo goes from 19 FPS all the way up to a playable 33 FPS.

Intel will discuss more during a video interview, tomorrow (Thursday) at 5pm EDT.

intel-dx12-unlockedFPS-1.jpg

Maximum power in DirectX 11 mode.

For my contribution to the story, I would like to address the first comment on the MSDN article. It claims that this is just an "ideal scenario" of a scene that is bottlenecked by draw calls. The thing is: that is the point. Sure, a game developer could optimize the scene to (maybe) instance objects together, and so forth, but that is unnecessary work. Why should programmers, or worse, artists, need to spend so much of their time developing art so that it can be batched together into fewer, bigger commands? Would it not be much easier, and all-around better, if the content could be developed as it most naturally comes together?

That, of course, depends on how much performance improvement we will see from DirectX 12 compared to theoretical maximum efficiency. If pushing two workloads through a DX12 GPU takes about the same time as pushing one double-sized workload, then developers can, quite literally, use whatever solution is most direct.

intel-dx12-unlockedFPS-2.jpg

Maximum power when switching to DirectX 12 mode.

If, on the other hand, pushing two workloads is 1000x slower than pushing a single double-sized one, but DirectX 11 was 10,000x slower, then it could be less relevant, because developers will still need to do their tricks in those situations. The closer the two get, the fewer occasions where strict optimization is necessary.
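For readers who have not written rendering code, this is the sort of "trick" in question, sketched in OpenGL for brevity rather than the demo's actual Direct3D code: batching identical objects into a single instanced draw rather than issuing one call per asteroid.

```cpp
// One draw call per object vs. one instanced call for all of them.
#include <glad/glad.h>

// Naive path: 50,000 asteroids means 50,000 draw calls per frame.
// Cheap per call under DX12/Mantle-style APIs, costly under older ones.
void draw_asteroids_naive(int count, int verts_per_asteroid) {
    for (int i = 0; i < count; ++i) {
        // ... update this asteroid's transform uniform here ...
        glDrawArrays(GL_TRIANGLES, 0, verts_per_asteroid);
    }
}

// Optimized path: per-asteroid transforms live in a buffer indexed by
// gl_InstanceID in the vertex shader, so one call covers everything.
void draw_asteroids_instanced(int count, int verts_per_asteroid) {
    glDrawArraysInstanced(GL_TRIANGLES, 0, verts_per_asteroid, count);
}
```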

If there are any DirectX 11 game developers, artists, and producers out there, we would like to hear from you. How much would a (let's say) 90% reduction in draw call latency (which is around what Mantle claims) give you, in terms of fewer required optimizations? Can you afford to solve problems "the naive way" now? Some of the time? Most of the time? Would it still be worth it to do things like object instancing and fewer, larger materials and shaders? How often?

Podcast Listeners and Viewers: Win an EVGA SuperNOVA 1000 G2 Power Supply

Subject: General Tech | August 13, 2014 - 02:15 PM |
Tagged: supernova, podcast, giveaway, evga, contest

A big THANK YOU goes to our friends at EVGA for hooking us up with another item to give away to our podcast listeners and viewers this week. If you watch tonight's LIVE recording of Podcast #313 (10pm ET / 7pm PT at http://pcper.com/live) or download our podcast after the fact (at http://pcper.com/podcast) then you'll have the tools needed to win an EVGA SuperNOVA 1000 G2 Power Supply!! (Valued at $165, based on Amazon's current selling price.) See our review of the 750/850 G2 SuperNOVA units.

120-G2-1000-XR_XL_1.jpg

How do you enter? Well, on the live stream (or in the downloaded version) we'll give out a special keyword during our discussion of the contest for you to input in the form below. That's it! 

Anyone can enter from anywhere in the world - we'll cover the shipping. We'll draw a random winner on August 20th and announce it on the next episode of the podcast! Good luck, and once again, thanks go out to EVGA for supplying the prize!

Unreal Tournament's Training Day, get in on the Pre-Pre-Alpha right now!

Subject: General Tech | August 13, 2014 - 11:02 AM |
Tagged: Unreal Tournament, gaming, Alpha

Feel like (Pre-Pre-)Alpha testing Unreal Tournament without forking over money for early access?  No problem, thanks to Epic and Unreal Forums member ‘raxxy’, who is compiling and updating the (pre)Alpha version of the next Unreal Tournament.  Sure, there may not be many textures, but there is a Flak Cannon, so what could you possibly have to complain about?  There are frequent updates, and a major part of participating is giving feedback to the devs, so be sure to check into the #beyondunreal IRC channel to get tips and share your impressions.  Rock, Paper, SHOTGUN reports that the servers are massively packed right now, so you may not be able to join immediately, but it is worth trying.

raxxy would like you to understand "These are PRE-ALPHA Prototype Builds. Seriously. Super early testing. So early it's technically not even pre alpha, it's debug code!"

You can be guaranteed that the Fragging Frogs will be taking advantage of this, as well as revisiting the much-beloved UT2K4, so if you haven't joined up yet ... what are you waiting for?

Check out Fatal1ty playing if you can't get on

"Want to play the new Unreal Tournament for free, right this very second? Cor blimey and OMG you totes can! Hero of the people ‘raxxy’ on the Unreal Forums is compiling Epic’s builds and releasing them as small, playable packages that anyone can run, with multiple updates per week. The maps are untextured, the weapons unbalanced, and things change rapidly as everything’s still “pre-alpha” but it’s playable and – more importantly – fun."

Here is some more Tech News from around the web:

Gaming

Meet Tonga, soon to be your new Radeon

Subject: General Tech | August 13, 2014 - 09:58 AM |
Tagged: tonga, radeon, FirePro W7100, amd

A little secret popped out with the release of AMD's FirePro W7100: a new GPU family that goes by the name of Tonga, which is very likely to replace the aging Tahiti chip in use since the HD 7900 series.  The stats The Tech Report saw show interesting changes from Tahiti, including a reduction of the memory interface to 256-bit, in line with NVIDIA's current offerings.  The number of stream processors might be reduced to 1792 from 2048, but that figure is based on the W7100; consumer GPUs may yet be released with the full 32 GCN compute units.  Many other features have seen increases: the number of Asynchronous Compute Engines goes from 2 to 8, the number of rasterized triangles per clock doubles to 4, and Tonga adds support for the new TrueAudio DSP and CrossFire XDMA.

IMG0045305.jpg

"The bottom line is that Tonga joins the Hawaii (Radeon R9 290X) and Bonaire (R7 260X) chips as the only members of AMD' s GCN 1.1 series of graphics processors. Tonga looks to be a mid-sized GPU and is expected to supplant the venerable Tahiti chip used in everything from the original Radeon HD 7970 to the current Radeon R9 280."

Here is some more Tech News from around the web:

Tech Talk

Select GeForce GTX GPUs Now Include Borderlands: The Pre-Sequel

Subject: General Tech | August 13, 2014 - 09:26 AM |
Tagged: borderlands, nvidia, geforce

 

Santa Clara, CA — August 12, 2014 — Get ready to shoot ‘n’ loot your way through Pandora’s moon. Starting today, gamers who purchase select NVIDIA GeForce GTX TITAN, 780 Ti, 780, and 770 desktop GPUs will receive a free copy of Borderlands: The Pre-Sequel, the hotly anticipated new chapter to the multi-award winning Borderlands franchise from 2K and Gearbox Software.

Discover the story behind Borderlands 2’s villain, Handsome Jack, and his rise to power. Taking place between the original Borderlands and Borderlands 2, Borderlands: The Pre-Sequel offers players a whole lotta new gameplay in low gravity.

“If you have a high-end NVIDIA GPU, Borderlands: The Pre-Sequel will offer higher fidelity and higher performance hardware-driven special effects including awesome weapon impacts, moon-shatteringly cool cryo explosions and ice particles, and cloth and fluid simulation that blows me away every time I see it," said Randy Pitchford, CEO and president of Gearbox Software.

With NVIDIA PhysX technology, you will feel deep space like never before. Get high in low gravity and use new ice and laser weapons to experience destructible levels of mayhem. Check out the latest trailer, which just went live this morning: http://youtu.be/c9a4wr4I1hk

Borderlands: The Pre-Sequel will also stream to your NVIDIA SHIELD tablet or portable. For the first time ever, you can play Claptrap anywhere by using NVIDIA GameStream technologies. You can even livestream and record every fist punch with GeForce ShadowPlay.

Borderlands: The Pre-Sequel will be available on October 14, 2014 in North America and on October 17, 2014 internationally. Borderlands: The Pre-Sequel is not yet rated by the ESRB.

The GeForce GTX and Borderlands: The Pre-Sequel bundle is available starting today from leading e-tailers including Amazon, NCIX, Newegg, and Tiger Direct and system builders including Canada Computers, Digital Storm, Falcon Northwest, Maingear, Memory Express, Origin PC, V3 Gaming, and Velocity Micro. For a full list of participating partners, please visit: www.GeForce.com/GetBorderlands.

Source: NVIDIA

DOTA 2 May Be Running Source Engine 2. Now.

Subject: General Tech | August 12, 2014 - 06:00 PM |
Tagged: valve, source engine, Source 2, DOTA 2

While it may not seem like it in North America, we are in a busy week for videogame development. GDC Europe, which stands for Game Developers Conference Europe, is just wrapping up to make room for Gamescom, which will take up the rest of the week. Valve will be there and people are reading tea leaves to find out why. SteamOS seems likely, but what about their next generation gaming engine, Source 2? Maybe it already happened?

valve-dota2-workshop.jpg

Valve is the most secretive company with values of openness that I know of. They are pretty good at preventing leaks from escaping their walls. Recently, Dota 2 was updated to receive new features and development tools for user-generated maps and gametypes. The tools currently require 64-bit Windows and a DirectX 11-compatible GPU.

Those don't sound like Source requirements...

And the editor doesn't look like Valve's old tools.

Video Credit: "Valve News Network".

Leaks also point to things like "tf_imported", "left4dead2_source2", and "left4dead2_imported". This is interesting. Valve is pushing Dota 2, their most popular free-to-play game, into Source 2. Also, because Team Fortress 2 is listed as "tf" rather than "tf2" (just as Dota 2 is registered as "dota" rather than "dota2") while "left4dead2" keeps its number, the free-to-play Team Fortress 2 could be headed for a perpetual-development mode like Dota 2's. Eventually, it could be pushed to the new engine and given more content.

As for Left4Dead2? I am wondering if it is intended to be a product, rather than an internal (or external) Source 2 tech demo.

Was this what brought Valve to Gamescom, or will we be surprised by other announcements (or nothing at all)?

Source: Polygon