Good effort goes a long way
The wait has been long and anxious for Heart of the Swarm, the expansion to 2010's StarCraft 2: Wings of Liberty. Blizzard originally hinted at a very rapid release schedule, which did not exactly come to fruition. The nearly three years of development time for Heart of the Swarm is longer than a single studio spends on a full Call of Duty title, although one could make a very credible argument that a Blizzard expansion requires more effort to create than a complete Call of Duty title.
But as Duke Nukem Forever demonstrated, a long time in development does not guarantee a fully baked product coming out the other end.
Blizzard games have always been highly entertaining albeit without deep artistic substance; their games are not first on the list for a university literature syllabus. But there is a lot of room in life for engaging entertainment. Blizzard has long been one of the leading developers on the PC, and they know how to deliver an exceptional experience for the platform when they choose to.
Subject: General Tech | April 24, 2013 - 02:32 PM | Jeremy Hellstrom
Tagged: gaming, Dark Souls II, consolitis, masochism
Dark Souls made a name for itself as one of the toughest and most unforgiving games around, and built a huge following because of that. The sequel is coming to PC as well, but the one major complaint many gamers had about the original will no longer apply: according to the developers, this version will not suffer from consolitis. Rock, Paper, SHOTGUN has about as much information as is available on the game, but to truly understand what it will be like you should check out the YouTube preview below.
"A word of warning: I have never played Dark Souls, and this information is coming from French website GameKult’s interview with a Yui Tanimura, the Japanese game director of Dark Souls II. I am merely an information conduit. A nexus from them to you, with news that the complaints of the horrible, nasty port job of the previous game was noticed and taken into account. Dark Souls II is being developed as a PC game. Hooray!"
Here is some more Tech News from around the web:
- Jagged Alliance: Flashback @ KickStarter
- Homeworld space RTS rights bought by Gearbox for $1.35m @ Hexus
- That Much-Delayed 2K Marin Shooter Is Not An XCOM @ Rock, Paper, SHOTGUN
- Post-Modern – ‘Call Of Duty: Ghosts’ Busted @ Rock, Paper, SHOTGUN
- Totally Teutoburgic: Tons Of Rome II In-Game Footage @ Rock, Paper, SHOTGUN
- Injustice: Gods Among Us @ The Inquirer
- Gears of War: Judgement Xbox 360 @ Tweaktown
Subject: General Tech | April 19, 2013 - 04:15 AM | Tim Verry
Tagged: metro: last light, gaming, deep silver
Metro: Last Light is nearing completion, with an expected release date of May 17th for the PC, PS3, and Xbox 360. Publisher Deep Silver has taken over the project from THQ; development continues at 4A Games, the studio founded by veterans of the STALKER series.
According to Bit-Tech, publisher Deep Silver has announced the game’s system requirements. It seems that Metro: Last Light will continue the system-punishing trend that its predecessor, Metro 2033, started. In order to play the game with all the eye candy, gamers will reportedly need at least an NVIDIA GTX 690 or GTX Titan video card. Notably absent from the requirements list is an AMD equivalent, but the AMD Radeon HD 7990 would be the closest match.
The Optimum system requirements represent a PC that will be able to crank up all the details: at least a quad core CPU clocked at 3.4GHz, 8GB of RAM, a GTX 690 (or GTX Titan), and Windows 7 or higher.
The Recommended system requirements describe the hardware needed to play the game with most details turned on at a resolution of at least 1920 x 1080. Deep Silver recommends at least a 2.6GHz quad core processor, 4GB of RAM, and a DirectX 11 compatible GPU equivalent to at least an NVIDIA GTX 580, GTX 660 Ti, or AMD HD 7870.
Interestingly, even the minimum system requirements are pretty steep compared to other modern titles. A computer running the 32-bit version of Windows XP or higher is needed along with at least a 2.2GHz dual core CPU, 2GB of system RAM, and a DirectX 9 Shader Model 3 compatible video card such as the NVIDIA GTS 250 or AMD HD 4000-series.
The suggested system requirements (especially the optimum level) are impressive, and do suggest that Metro: Last Light is a game that will take full advantage of PC hardware. (I am curious to see whether the system requirements are mostly due to graphical prowess or to code optimization issues, though. In other words, I hope that the game is more stable than the STALKER series.)
One thing is for sure: my unlocked AMD 6950 is looking rather dated in light of the new Metro: Last Light specifications!
Subject: General Tech | April 17, 2013 - 06:13 PM | Jeremy Hellstrom
Tagged: gaming, bioshock infinite
As is their wont, [H]ard|OCP focuses on performance when reviewing the game, leaving examination of the game itself to sites dedicated to that type of content. The half dozen contestants represent the top three single-GPU cards from NVIDIA and AMD, and the drivers used were released this March. The game is DX11 through and through, but tessellation is conspicuously absent, as is MSAA; only FXAA is available, and even when MSAA was enabled at the driver level they saw no difference. Both companies' cards could play the game at 1080p with all settings maxed out, but at higher resolutions NVIDIA's performance pulled ahead somewhat. Check out the image quality of BioShock Infinite in the full review.
"BioShock Infinite is here, delivering a colorful and dynamic world with the help of a customized Unreal Engine 3. BioShock Infinite has an improved PC gaming experience, we will test this game's performance among 8 video card configurations, and look at image quality of this immersive and colorful new game."
Here is some more Tech News from around the web:
- Bioshock Infinite Review @ OCC
- Squad Chat: Jagged Alliance – Flashback Interview @ Rock, Paper, SHOTGUN
- You’re The Boat Boss: Leviathan’s Smoooooooth Moves @ Rock, Paper, SHOTGUN
- Please Watch This Dumb Blood Dragon Live-Action Short @ Rock, Paper, SHOTGUN
- Antichamber just blew my mind @ The Tech Report
- Space Hulk Studio To Make Turn-Based Jagged Alliance @ Rock, Paper, SHOTGUN
- Defiance @ LanOC Reviews
- Roundup of the 6 New Gaming Platforms Launching in 2013 @ eTeknix
- Smashing Dolphins: Planet Punch Redefines Self-Loathing @ Rock, Paper, SHOTGUN
- Resident Evil 6 - Too Much Action for Horror? @ Techgage
Subject: General Tech | April 10, 2013 - 01:12 PM | Jeremy Hellstrom
Tagged: gaming, project eternity, obsidian
Obsidian's new RPG Project Eternity has some gorgeous backgrounds, as you can see from the YouTube trailer below. Its Kickstarter garnered three times the money required to get the project going, so we will be seeing this game sooner or later. You can still toss some coins their way by visiting the main page, which is accessible from Rock, Paper, SHOTGUN's preview. The design team describes the game as a mix of the dungeon crawling of Icewind Dale, the depth of NPC personality of Baldur's Gate, and a story as gripping as Planescape: Torment's. Currently the Slacker Backer tier is $29 and will get you a copy of the game on Steam or GOG.
"My old-school RPG gland’s been engorged with excitement for many reasons lately, but the past few weeks have seen Torment race to the front of the pack – and not just because it’s chock full of twisted sights and sounds not of this world. In something of a revolution, it also moves. Like, its pictures just sort of do things, without the assistance of a flip book, finger puppets, or any of the other traditional methods. So imagine my elation when I discovered that Project Eternity will, in fact, employ similar motion gremlins to sow the glorious song of movement into its lush mountains, valleys, and plains."
Here is some more Tech News from around the web:
- Today's mid-range graphics cards in BioShock Infinite @ The Tech Report
- See Space Hulk See Space Hulk See Space Hulk See @ Rock, Paper, SHOTGUN
- That Looks Painful: A Thief Teaser "Trailer" @ Rock, Paper, SHOTGUN
- Bioshock Infinite Tested, Benchmarked @ Techspot
- Gaming's favourite platters get another stir of the pot @ The Register
- Tomb Raider @ LanOC Reviews
- Hands On: Divinity – Original Sin @ Rock, Paper, SHOTGUN
- BioShock Infinite Benchmarked with AMD EyeFinity at 5760x1080 @ Tweaktown
- Tomb Raider PC @ eTeknix
- First Impressions of Trion Worlds’ Defiance @ Techgage
- Modern shooters and the atrophy of fun @ The Tech Report
- Valve Publishes Packages For Their Linux Distribution @ Phoronix
- Sleeping Dogs Benchmarked with AMD EyeFinity at 5760x1080 @ Tweaktown
- Disney shuts down Lucasarts games company @ The Inquirer
- BioShock Infinite Review – Leaving the World Awestruck @ Techgage
- Activision, Raven Release 2 Star Wars Games Under GPL @ Slashdot
- BioShock Infinite @ Kitguru
- Strike Vector Has Transforming Cowboy Space Planes @ Rock, Paper, SHOTGUN
- Decor Never Changes: Metro – Last Light’s World @ Rock, Paper, SHOTGUN
- LEGO City Undercover Wii U @ Tweaktown
Subject: General Tech | April 2, 2013 - 07:25 AM | Tim Verry
Tagged: x86 emulator, rpix86, Raspberry Pi, gaming, dos
The Raspberry Pi is proving to be a popular destination for all manner of interesting software projects and open source operating systems. The most recent Pi project I've come across is a DOS PC emulator by Patrick Aalto called rpix86. A port of DSx86, which ran on the Nintendo DS handheld console, rpix86 is now up to version 0.04 and emulates a 1990s x86 computer with enough hardware oomph to run classic PC games!
Rpix86 is an emulator that runs from the console (not within the X GUI desktop environment) on the Raspberry Pi. It emulates the following x86 PC specs:
- Processor: 80486 @ ~20 MHz (including protected mode; no virtual memory support)
- Memory: 640 KB low memory, 4 MB EMS memory, 16 MB XMS memory
- Graphics: Super VGA @ 640 x 480 with 256 colors
- Audio: Sound Blaster 2.0 (plus AdLib-compatible FM sounds)
- Input devices: US keyboard, analog joystick, 2-button mouse
- Misc: Roland MPU-401 MIDI support via USB MIDI dongle
Patrick Aalto added support for analog USB joysticks and foot pedals (4 buttons, 4 analog channels) as well as 80 x 50 text mode (required by some MIDI software and Little Big Adventure's setup program) to the recent 0.04 update. He also stripped out debug code, which cut the program size approximately in half.
The developer has stated on his blog that he is working on allowing rpix86 to be used from the terminal within X and adding support for intelligent MPU MIDI mode. A port to the Android operating system called ax86 is also in the works. You can grab the current version of the Raspberry Pi X86 emulator on the developer's website.
With this emulator, you can run most of the DOS games you grew up with (Wolf3D and Digger anyone?), which is definitely a worthy use for the $25 or $35 Raspberry Pi hardware! At the very least, it is an interesting alternative to running DOSBox, and much smaller and more power efficient than dedicating an old x86 PC to classic games. Getting those floppies to work with the Pi might be a bit of an issue though, assuming they are still readable (heh).
Read more about the Raspberry Pi computer at PC Perspective.
Subject: General Tech | April 2, 2013 - 02:54 AM | Tim Verry
Tagged: next generation character rendering, GDC 13, gaming, Activision, 3D rendering
Activision recently showed off its Next-Generation Character Rendering technology, which is a new method for rendering realistic and high-quality 3D faces. The technology has been in the works for some time now, and is now at a point where faces are extremely detailed down to pores, freckles, wrinkles, and eye lashes.
In addition to Lauren, Activision also showed off its own take on the face used in NVIDIA's Ira FaceWorks tech demo, except rendered using Activision's own Next-Generation Character Rendering technology, a method that is allegedly more efficient than, and "completely different" from, the one used for Ira. In the video showing off the technology (embedded below), the Activision method produces some impressive 3D renders in real time, though the faces look a bit creepy and unnatural when talking. Perhaps Activision and NVIDIA should find a way to combine the emotional expressiveness of Ira with the graphical prowess of NGCR (and while we are making a wish list, I might as well add TressFX support... heh).
The high resolution faces are not quite ready for the next Call of Duty, but the research team has managed to get models to render at 180 FPS on a PC running a single GTX 680 graphics card. That is not enough to implement the technology in a game, where there are multiple models, the environment, physics, AI, and all manner of other calculations to deal with and present at acceptable frame rates, but it is nice to see this kind of future-looking work being done now. Perhaps in a few graphics card generations the hardware will catch up to the face rendering technology that Activision (and others) are working on, which will be rather satisfying to see. It is amazing how far the graphics world has come since I got into PC gaming with Wolfenstein 3D, to say the least!
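For a sense of what that 180 FPS figure means in practice, here is a quick back-of-the-envelope sketch; note that the 60 FPS gameplay target is my own assumption for illustration, not a figure from Activision:

```python
# Frame budget math: the demo renders one head at 180 FPS on a GTX 680,
# while a game shipping at an (assumed) 60 FPS target has a fixed time
# budget per frame that must also cover the world, physics, AI, and so on.
def frame_time_ms(fps):
    """Milliseconds spent (or available) per frame at a given frame rate."""
    return 1000.0 / fps

face_cost = frame_time_ms(180)  # ~5.56 ms to render a single head
budget = frame_time_ms(60)      # ~16.67 ms total per frame at 60 FPS
share = face_cost / budget      # fraction of the frame one head consumes

print(f"one head: {face_cost:.2f} ms of a {budget:.2f} ms budget ({share:.0%})")
```

By this rough math, a single head would eat about a third of a 60 FPS frame all by itself, which is why the technology is not quite game-ready yet.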
The team behind Activision's Next-Generation Character Rendering technology includes:
- Javier Von Der Pahlen: Director of Research and Development
- Etienne Donvoye: Technical Director
- Bernardo Antoniazzi: Technical Art Director
- Zbyněk Kysela: Modeler and Texture Artist
- Mike Eheler: Programming and Support
- Jorge Jimenez: Real-Time Graphics Research and Development
Jorge Jimenez has posted several more screenshots of the GDC tech demo on his blog that are worth checking out if you are interested in the new rendering tech.
Subject: General Tech | April 1, 2013 - 05:37 AM | Tim Verry
Tagged: virtual reality, oculus vr, oculus rift, GTC 2013, gaming
Update: Yesterday, an Oculus representative reached out to us to clarify that the information presented was forward-looking and subject to change (naturally). Later in the day, the company also posted the following statement to its forum:
"The information from that presentation (dates, concepts, projections, etc...) represent our vision, ideas, and on-going research/exploration. None of it should be considered fact, though we'd love to have that projected revenue!"
You can find the full statement in this thread.
The original article text is below:
Oculus VR, the company behind the Oculus Rift virtual reality headset, took the stage at NVIDIA's GPU Technology Conference to talk about the state of its technology and where it is headed.
Oculus VR is a relatively new company founded by Palmer Luckey and managed by CEO Brendan Iribe, who is the former CPO of cloud gaming company Gaikai. Currently, Oculus VR is developing a wearable, 3D, virtual reality headset called the Oculus Rift. Initially launched via Kickstarter, the Oculus Rift hardware is now shipping to developers as the rev 1 developer kit. Oculus VR will manufacture 10,000 developer kits and managed to raise $2.55 million in 2012.
The developer kit has a resolution of 1280x800 and weighs 320g. It takes a digital video input via a DVI or HDMI cable (HDMI with an adapter). The goggles hold the display and internals, and a control box connects via a wire to provide power. It uses several pieces of hardware found in smartphones, and CEO Brendan Iribe even hinted that an ongoing theme at Oculus VR was that "if it's not in a cell phone, it's not in Oculus." It delivers a 3D experience with head tracking, but Iribe indicated that full motion VR is coming in the future. For now, head tracking allows you to look around the game world, but "in five to seven or eight years" virtual reality setups that combine an Oculus Rift-like headset with an omni-directional treadmill would allow you to walk and run around the world in addition to simply looking around.
Beyond the immersion factor, a full motion VR setup would reduce (and possibly eliminate) the phenomenon of VR sickness, where users wearing VR headsets for extended periods experience discomfort due to the disconnect between their perceived in-game movement and their (lack of) physical movement and inner-ear balance.
After the first developer kit, Oculus is planning to release a revised version, and ultimately a consumer version. This consumer version is slated for a Q3 2014 launch. It will weigh significantly less (Oculus VR is aiming for around 200g) and will support 1080p 3D resolutions. The sales projections estimate 50,000 revision 2 developer kits in 2013 and at least 500,000 consumer versions of the Oculus Rift in 2014. Ambitious numbers, for sure, but if Oculus can nail down next-generation console support, reduce the weight of the headset, and increase the resolution, it is not out of the question.
With the consumer version, Oculus is hoping to offer both a base wired version and a higher-priced wireless Rift VR headset. Further, the company is working with game and professional 3D creation software developers to get software support for the VR headset. Team Fortress 2 support has been announced, for example (and there will even be an Oculus Rift hat, for gamers that are into hats). Additionally, Oculus is working to get support into the following software titles (among others):
- AutoDesk 3D
- DOTA 2
During the presentation Iribe stated that graphics cards (specifically he mentioned the GTX 680) are finally in a place to deliver 3D with smooth frame rates at high-enough resolutions for immersive virtual reality.
Left: potential games with Oculus VR support. Right: Oculus VR CEO Brendan Iribe at ECS during GTC 2013.
Pricing on the consumer version of the VR headset is still unknown, but developers can currently pre-order an Oculus Rift developer kit on the Oculus VR site. In the past, the company has stated that consumers should hold off on buying a developer kit and wait for the consumer version of the Rift in 2014. If the company is able to deliver on its claims of a lighter headset with a higher resolution screen and adjustable 3D effects (like the 3DS, the level of stereo 3D can be adjusted and even turned off), I think it will be worth the wait. The deciding factor will then be software support. Hopefully developers will take to the VR technology and offer support for it in upcoming titles.
Are you excited for the Oculus Rift?
Subject: General Tech | March 31, 2013 - 02:21 AM | Tim Verry
Tagged: sony, ps4, playstation eye, playstation 4, gaming, dualshock 4, APU, amd
Sony teased a few more details about its upcoming PlayStation 4 console at the Game Developers Conference earlier this week. While the basic specifications have not changed since the original announcement, we now know more about the x86 console hardware.
The PS4 itself is powered by an AMD Jaguar CPU with eight physical cores and eight threads. Each core gets 32 KB of L1 instruction cache and 32 KB of L1 data cache, and each group of four physical cores shares 2 MB of L2 cache, for 4 MB of L2 total. The processor is capable of out-of-order execution, as are AMD's other processor offerings. The console also reportedly features 8GB of GDDR5 memory that is shared by the CPU and GPU and offers 176 GB/s of bandwidth, a step above the PS3, which did not use a unified memory design. The system will also sport a GPU rated at 1.843 TFLOPS and clocked at 800MHz. The PS4 will have a high-capacity hard drive and a new Blu-ray drive that is up to 3-times faster. Interestingly, the console also has a co-processor that handles the video streaming features and allows Remote Play game streaming to the PlayStation Vita at the handheld's native resolution of 960x544.
The PlayStation Eye has also been upgraded for the PS4 to include two cameras, four microphones, and a 3-axis accelerometer. The Eye cameras have an 85-degree field of view and can record video at 1280x800 at 60Hz with 12 bits per pixel, or at 640x480 at 120Hz. The new PS4 Eye is a noteworthy upgrade over the current generation model, which is limited to either 640x480 at 60Hz or 320x240 at 120Hz. The extra resolution should allow developers to track objects more accurately. The DualShock 4 controllers sport a light-bar that can be tracked by the new Eye camera, for example. The light-bar uses an RGB LED that changes to blue, red, pink, or green for players 1-4 respectively.
Speaking of the new DualShock 4, Sony has reportedly ditched the analog face buttons and D-pad in favor of digital buttons. With the DS3 and the PS3, the analog face buttons and D-pad came in handy in racing games, but otherwise they are not likely to be missed. The controllers will now charge even when the console is in standby mode, and the L2 and R2 triggers are more resistant to accidental presses. The analog sticks have been slightly modified and feature a reduced dead zone. The touchpad, which is a completely new feature for the DualShock lineup, is capable of tracking two points at a resolution of 1920x900, which is pretty good.
While Sony has still not revealed what the actual PS4 console will look like, most of the internals are now officially known. It will be interesting to see just where Sony prices the new console, and where game developers are able to take it. Using a DX11.1+ feature set, developers are able to use many of the same tools used to program PC titles, but they also get additional debugging tools and low-level access to the hardware: a new low-level API sitting below DirectX but above the driver gives developers deeper access to the shader pipeline. I'm curious to see how PC ports will turn out. With the consoles now running x86 hardware, I'm hoping that the usual fare of bugs common to console-to-PC ports will decrease; a gamer can dream, right?
Subject: General Tech, Mobile | March 28, 2013 - 01:10 PM | Jeremy Hellstrom
Tagged: razer blade, gaming
You may remember Nokia's failed N-Gage, the phone that thought it was a console; it seems that Razer is going to market with a similar product, the Edge. This time we have a tablet with aspirations to console-hood, as you can tell from the gamepad-style controls surrounding the 1366x768 10.1" screen. Inside you will find an Intel Core i7 processor, a 256GB SSD, and 8GB of RAM, all of which adds up to a heavyweight mobile device without much in the way of battery life. Gizmodo tried it out at GDC and played BioShock Infinite on Ultra with no problems whatsoever, so the performance is there. On the other hand, can a $1500 gaming tablet compete with full ultrabooks or streaming devices like Project SHIELD?
"A gaming laptop in a tablet. It's a thought experiment that raises a whole host of questions: Is that even possible? Can it possibly be good? Would anyone even want it if it were? And finally: How much does it cost? The Razer Edge's answers translate roughly to "Yes!", "Sort of.", "Maybe?", and "Erm, you better sit down.""
Here is some more Tech News from around the web:
- Oculus Rift developer kit hands-on video at GDC 2013, shown playing Hawken @ Tweaktown
- Intel to separate 3rd-generation ultrabooks into 3 price groups @ DigiTimes
- Microsoft's 'Gemini' project will be the Windows Blue of Office @ The Register
- ARM says GPGPUs could lower overall chip costs @ The Inquirer
- Blackberry has sold one million Blackberry Z10 smartphones @ The Inquirer
- IBM unfurls SDN network manager @ The Register
- BIGGEST DDoS ATTACK IN HISTORY hammers Spamhaus @ The Register
- How to Improve Your Wi-Fi Signal Using a Soda Can in 6 Steps @ MAKE:Blog
- Interview with Richard Huddy about Intel moving beyond DX @ Kitguru