Subject: Mobile | June 7, 2012 - 01:54 AM | Tim Verry
Tagged: wireless, gaming laptop, gaming, computex, ASUS ROG, asus, 802.11ac, 5GHz wifi
Earlier today we posted a couple of teaser photos showing off some of ASUS’ upcoming products. One of the devices was a gaming laptop called the ASUS G75. Engadget has managed to dig up some more information on a variant of the G75 – the G75VW. According to the site, the gaming laptop is rocking an Intel Ivy Bridge processor, a GeForce GTX 670M, and DDR3 memory (inferred from the CPU used). That hardware powers a 1080p display, which the GTX 670M should have no problem driving but which is a bit disappointing to see on a high end laptop of this size (approximately 17”). The real kicker, though, is the wireless card it is allegedly packing: an 802.11ac NIC.
The ASUS G75 gaming laptop
Engadget states that although the information sheet next to the laptop at ASUS’ Computex booth did not list any 802.11ac compatibility, wireless chip maker Broadcom (manufacturer of chips that are used in many wireless routers and NICs) has stated that it does in fact have an 802.11ac NIC in it. Senior Vice President Michael Hurlston told members of the press at Computex 2012 that the ASUS G75VW is the “World’s first 5G Wi-Fi laptop.” He further stated that the computer would be arriving in the hands of consumers “very shortly.”
Interesting stuff. Although “5G Wi-Fi” – so called because it is the fifth generation of consumer grade Wi-Fi (though not the fifth generation if you count every iteration of the 802.11 wireless standards) – is not yet official and set in stone, it is very close, and I would not be surprised to see the technology in a laptop like this particular ASUS at this point in the game.
And to think that I just got done upgrading my network to Gigabit Ethernet and 802.11n about two months ago! Even so, I’m excited for the upcoming standard because I want to test its usefulness in getting live TV from my CableCARD tuner to the living room and Katy’s wireless laptop without stuttering – something even wireless N with MIMO can’t do with devices in the same room. So far, the only thing stable enough has been wired Cat5e Ethernet (both 100Mbps and 1000Mbps hardware seem to work without issues). And because it’s proving difficult to get a wired connection from the router to the TV (Xbox 360 used as Windows Media Extender), I’m ready to try out some 802.11ac stuff to see if it can really deliver on the increased bandwidth!
Subject: General Tech, Mobile | June 6, 2012 - 05:33 PM | Tim Verry
Tagged: motherboard, laptop, headsets, gaming, ASUS ROG, asus
Today we received a number of photos from ASUS that show off some upcoming hardware from their upcoming Republic of Gamers line. Except for the Xonar Phoebus (which has launched), the hardware in these photos is not yet released and ASUS has not revealed when it will be available for sale – or how much it will cost. Still, I can’t think of a better way to start the day than getting a glimpse of some shiny unreleased hardware – especially when I get to share it with you!
First up is a new Republic of Gamers motherboard called the Maximus V Extreme. This board is similar to the mATX Maximus V GENE board that was announced recently, but the Extreme motherboard is full ATX.
While full specifications are unknown, from the photo you can see that the board has an LGA 1155 socket, making it compatible with the latest Intel Ivy Bridge processors. Further, it is sporting four DDR3 DIMM slots, five PCI-E 3.0 x16 slots, and one PCI-E 3.0 x4 slot. Other features of the board include ASUS’ Extreme Engine Digi+ II digital power control technology, power and reset buttons on the board itself, voltage check points, Lucid Virtu MVP GPU virtualization technology, and AMD CrossFireX and NVIDIA SLI support. The VRM and southbridge areas of the board are covered by large black and red heatsinks.
Rear IO includes five USB 2.0 ports (one to be used with ROG Connect), two USB 3.0 ports, an Intel-powered Gigabit LAN port, HDMI, DisplayPort, optical audio output, a PS/2 port, five analog audio outputs, and a TOSLink connector. Additionally, the board features CMOS clear and reset buttons, a mini-PCIe + mSATA combo card, and a Republic of Gamers OC Key accessory. The OC Key plugs into the DVI port of the graphics card and provides an on-screen display for overclocking information and voltage tweaking.
In addition to the ASUS Maximus V Extreme, the company is producing the Maximus V Formula motherboard, which is available with or without the ThunderFX audio accessory. The Formula board is another socket 1155 board with a red and black color scheme that is ready for Ivy Bridge processors and multi-GPU setups (SLI or CrossFireX). The heatsinks on the Formula are a little less beefy than those on the Maximus V Extreme, but the VRM heatsinks are ready to be integrated into a water cooling loop. Further features include four DDR3 DIMM slots, eight SATA connectors, three PCI-E 3.0 x16 slots, three PCI-E 3.0 x1 slots, and a single PCI-E 3.0 x4 slot.
The board also features the SupremeFX integrated sound card (which has been isolated from the rest of the board by routing the wiring through its own PCB layer) and a mini-PCIe + mSATA combo card. One version of the motherboard also comes with the ThunderFX accessory, which you can see in the photo below.
Rear IO of the Maximus V Formula motherboard includes four USB 2.0 ports (one for ROG connect), four USB 3.0 ports, an eSATA port, DisplayPort, HDMI, Intel-powered Gigabit LAN, five analog audio jacks, two optical audio outputs, and CMOS clear and reset buttons.
The Maximus V Formula comes with a device called the ThunderFX, a high end headphone amp and DAC offering 120dB SNR and noise cancellation technology. The included GamEQ comes with three preset profiles but also offers a wide range of options to tweak the sound to your own desires. There is also onboard audio in the form of the SupremeFX IV audio chipset, which will keep those who prefer speakers more than happy with their audio quality.
You can also see that the large anodized aluminium heatsinks have barbs for including them in a watercooling loop, so that all components on your motherboard can be cooled without resorting to fans to move air. GameFirst II is the name ASUS has given its networking software, which is designed to examine and prioritize packets to reduce lag and ping times. It comes with an EZ Mode as well as advanced options for those who know what they are doing. As we have seen on other boards, the Maximus V Formula comes with an mPCIe Combo card with dual-band Wi-Fi and Bluetooth 4.0. This board is also shattering records in Super PI 32M, 3DMark05, and Heaven, to name a few.
Subject: General Tech | June 6, 2012 - 03:20 PM | Jeremy Hellstrom
Tagged: max payne 3, crossfire, sli, gtx680, HD 7970, gaming
For the tests they ran, [H]ard|OCP used the latest Catalyst beta, 12.6, and ForceWare 301.42 WHQL, as both drivers proved able to provide proper multi-GPU performance in Max Payne 3. In AMD's case, the driver improved single card performance as well. The game's graphics options include a nice tool that displays how much VRAM your configuration will require, so you can get an idea of whether your card can handle the settings before you even play the game. SLI did scale better than CrossFire, but even so, both multi-GPU rigs could handle max settings at 2560x1600, and the cards used singly could still sit around the 60fps mark. Check out the full review here.
"HardOCP is on top of Max Payne 3 to find out what graphics options it supports and how a GTX 680 and a Radeon HD 7970 perform. We also wanted to know if SLI and CrossFireX worked, and how performance scales. In this preview of performance and image quality we take a look at all of this in the first chapter of this game."
Here is some more Tech News from around the web:
- Steam For Linux Will Launch In 2012 @ Slashdot
- Max Payne 3 Hi-Resolution Screenshot Gallery @ NGOHQ
- Max Payne 3 Review @ Techgage
- Max Payne 3 Performance Tested, Benchmarked @ Techspot
Subject: General Tech | June 3, 2012 - 03:20 AM | Tim Verry
Tagged: streaming, Hawken, gaming, gaikai
Mech shooter Hawken will launch on December 12th, 2012, but streaming gaming service Gaikai has made a deal with Meteor Entertainment to let gamers play the game before launch, demonstrating the playability of its streaming service, which uses NVIDIA’s GRID cloud gaming technology.
According to gaming website Joystiq, Gaikai has signed a deal with publisher Meteor Entertainment to allow gamers to test out the mech shooter PC game running on Gaikai's streaming service ahead of the game’s official release on 12/12/12. First demonstrated at GTC 2012, the free-to-play game uses NVIDIA’s GRID technology to reduce latency on the server and client sides.
A video of the NVIDIA demonstration.
Mark Long, CEO of Meteor Entertainment, stated that "HAWKEN wants to be free and it wants to be everywhere - and with Gaikai, it will be.” The game has proved quite popular, with hundreds of thousands of gamers signing up for the closed beta. The free-to-play title returns to a PC gaming classic – mech combat – and if Gaikai is able to deliver, it will be a game accessible to all kinds of devices, from tablets to high powered gaming PCs.
That last bit is the real question, though, and one that many gamers have on their minds. Gaikai is offering up the game pre-release to prove itself as a viable platform, and that is going to be a make-or-break situation. Here’s hoping that the NVIDIA GRID technology delivers a playable game with real world performance benefits. While they have not set an exact date for when the stream will go live, gamers will be able to access it via the playhawken.com website. Will you be checking out Hawken streaming for yourself?
Subject: General Tech | June 3, 2012 - 02:40 AM | Tim Verry
Tagged: gaming, frostbite, ea, bf3, 64-bit
Last month, Johan Andersson tweeted that future Frostbite engine based games in 2013 would require a 64-bit operating system. The full tweet is shown in the image below. He suggested that it would be a good idea to upgrade to Windows 8, though it is difficult to judge sarcasm in text (hehe). That bit led to a big explosion of tweets as the Internet revolted against what it thought would be required: an x64 version of Windows 8. Mr. Andersson later clarified that any recent x64 version of Windows would be fine.
You can see the tweet on Twitter here.
The Windows 8 suggestion aside, I was very excited about the news that 64-bit Windows would be required. Currently, games are developed with both x64 and x86 versions in mind, which means they are shackled by the limitations of a 32-bit (x86) operating system. As an example, Sins of a Solar Empire generally runs great from the beginning to the mid-game on large maps, but as players build up fleets of ships and the game has a lot of data to keep track of, it runs out of memory and starts to chug – even when running on a 64-bit operating system. The CPU and GPU are not fully utilized; it is a RAM limitation, as reported by a number of users and a situation I have found myself in numerous times as well.
32-bit operating systems (and I’m being general here) have a hard limit of about 4GB of address space, from which the GPU, other expansion devices, and overhead steal a chunk that the OS cannot use even if there are physically 4GB of RAM DIMMs in the system. With 2GB GPUs being common, that can leave a system running a 32-bit OS with around 2GB of addressable system memory. From that 2GB of total available RAM, the OS must allocate memory for programs, caching, and other system tasks. Modern games can easily hit 2GB or more of RAM usage, but on 32-bit systems they are severely restricted in how much they can use.
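The back-of-the-envelope math works out like this (the reservation sizes below are illustrative assumptions for a worst-case system, not measurements; actual MMIO reservations vary by hardware):

```python
# A 32-bit OS can address at most 2**32 bytes of physical memory.
total_address_space = 2**32  # 4 GiB

# Devices claim chunks of that space for memory-mapped I/O.
# A 2 GiB video card is the big offender here; the 256 MiB for
# everything else is a rough illustrative figure.
gpu_aperture = 2 * 2**30      # 2 GiB graphics card
other_mmio = 256 * 2**20      # chipset, firmware, other devices

usable_ram = total_address_space - gpu_aperture - other_mmio
print(f"RAM visible to the OS: {usable_ram / 2**30:.2f} GiB")
```

Under those assumptions, the OS is left with well under half of the installed 4GB to share between the game and everything else running on the system.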
By requiring a 64-bit operating system, developers can focus on producing games that can make full use of RAM on modern systems. RTS and other strategy games are going to benefit the most, but even shooters like Battlefield (4?) will run smoother by being able to store as much data in RAM as possible without those pesky restrictions of 32-bit systems. Unfortunately, the upcoming Sins of a Solar Empire: Rebellion game will still suffer from RAM issues (though it is said to be managed better than previous releases) as it is being developed around the possibility of running on 32 or 64-bit OSes. Here’s hoping that the next SoaSE game will require 64-bit OSes just like Frostbite engine games will.
The best part, aside from performance benefits of course, is that the majority of gamers will not have to do anything when these games come out as they are already running a 64-bit version of Windows. Even OEMs have started loading x64 versions on pre-built systems in the last couple years (since Windows 7 and RAM became so cheap). Most gamers will be able to jump right in and enjoy the benefits immediately because gamers are inherently required to have at least somewhat recent hardware to play the latest games.
In the end, requiring 64-bit operating systems is a good thing, and hopefully more developers will follow in DICE’s footsteps. By freeing themselves from the limitations of 32-bit systems, they can focus on using gamers’ hardware to the fullest–at least until games start using more than 8TB of RAM (which would require a new version of Windows anyway as Win 7 x64 (Ultimate/Pro) can only address 192GB).
Subject: General Tech | May 23, 2012 - 02:50 PM | Jeremy Hellstrom
Tagged: gaming, diablo iii, blizzard
TechSpot wanted to see what effect your graphics card has on your experience while slaughtering mobs of baddies in Diablo III. First they removed any chance of a CPU bottleneck by building a test bed around an i7-3960X, then they gathered over two dozen GPUs to test with, ranging from a Radeon HD 6450 to a GTX 680 and almost everything in between. At lower resolutions, all but the slowest seven cards and Intel's HD 4000 were able to deliver 60fps or more, but at 2560x1600 only half of the cards they tested could manage 60fps or better. It is interesting to see that the GTX 680 and HD 7970 offer the same performance at the upper end of the resolutions tested, but you should expect that to change as drivers mature.
"While we disagree with making single player components online-only, there isn't much mere mortals like us can do about it. What we can do, however, is beat the hell out of Diablo III with today's finest hardware. Blizzard has somewhat of a reputation for making highly scalable titles that run on virtually any gaming rigs, so that's largely what we expect from the developer's latest offering..."
Here is some more Tech News from around the web:
- Hands On: XCOM – Enemy Unknown Part 2 @ Rock, Paper, SHOTGUN
- The Witcher 2: Enhanced Edition PC Review @ eTeknix
- Sniper Elite V2 @ Tweaktown
- Warlock: Master of the Arcane Review @ Techgage
- Dragon’s Dogma review @ The Inquirer
- From Dust Now Playable From Browsers @ Rock, Paper, SHOTGUN
- Starhawk (PS3) Game Review @ HardwareHeaven.
- Mortal Kombat (PS Vita) Review @ HardwareHeaven
- Dragon's Dogma Review (XBOX 360) @ HardwareHeaven
- Tom Clancy's Ghost Recon Future Soldier (XBOX 360) Game Review @ HardwareHeaven
Subject: General Tech | May 16, 2012 - 10:45 PM | Tim Verry
Tagged: vengeance 2000, vengeance, headset, gaming, corsair
Popular computer case and power supply maker Corsair recently launched a sweepstakes to get the word out about their new Vengeance 2000 wireless gaming headsets. They will be giving away five of the new virtual surround sound headsets to winners.
The contest is open to new entrants until Monday (5/21/12) and is very simple to enter. To enter, head over to their Facebook contest page and hit the “Like” button, then click the green “Enter Sweepstakes” button. After that, they invite you to tell your friends about the contest. They have a couple thousand entries so far, so get in while you can! The Official Rules are linked at the bottom of the contest page, but it looks like anyone over the age of 18 not affiliated with the company is eligible to win.
The Vengeance 2000 is essentially the wireless version of the company's Vengeance 1500 USB gaming headset with a noticeable makeover. The headset uses 50mm drivers and 2.4GHz wireless technology to deliver virtual surround sound without a wired connection to the PC, at ranges up to about 40 feet. It also features a rechargeable battery in the headset and an adjustable noise canceling boom microphone. The headsets have an MSRP of $149 USD.
Best of luck in the contest, and if you win be sure to let us know what you think of them!
Subject: General Tech | May 16, 2012 - 12:04 PM | Jeremy Hellstrom
Tagged: gaming, diablo iii
Rock, Paper, SHOTGUN was given the chance to sit down with the senior world designer and the lead technical artist of Diablo 3. One of the topics of discussion is near and dear to those who played the previous games in the series: co-op multiplayer, which really defined the game for those who tried it. Somehow, button mashing in tandem was much more enjoyable than the already great single player experience, and the development team spent a good deal of effort bringing that experience to Diablo 3. They also talk about the difficulty of including enough lore to satisfy players who want some depth to the story while ensuring that those who don't care for a back story don't feel it is getting in the way of their game. At no time were rainbows or unicorns discussed.
"Diablo III is now a thing that you’re capable of owning and (hopefully) playing. Just before the launch, when those network problems were yet to freeze Hell over, I sat down with senior world designer Leonard Boyarsky and lead technical artist Julian Love to keep them company as queues formed in the streets outside. Along the way, I discovered that having an ex-Troika chap on your game means that ‘lore’ is a very important word indeed, that the distant roguelike heritage hasn’t been forgotten and that technological progression doesn’t necessarily alter design principles."
Here is some more Tech News from around the web:
- Eyefinity/Surround Analysis of Rayman Origins @ Widescreen Gaming Forum
- Botanicula PC Review @ eTeknix
- Diablo III Midnight Launch and Signing Gallery @ HardwareHeaven
- Sins of a Solar Empire Rebellion Beta @ Benchmark Reviews
- Inside the Atari 2600 @ Hardware Secrets
- Lucid PC Review @ eTeknix
- A Chat With Rocket, Creator Of Day Z @ Rock, Paper, SHOTGUN!
Subject: General Tech | May 15, 2012 - 06:22 PM | Tim Verry
Tagged: LAN party, lan, gigabyte, gaming, case mod contest
Popular motherboard manufacturer Gigabyte Technology Co. recently announced the Gigabyte eSports LAN (GESL), which is its first eSports event in North America. The event includes a BYOC (bring your own computer) LANFest, tournament competitions in Starcraft II and League of Legends, a case mod competition, presentations, and an event raffle. The competitions each feature various prizes for winning including Gigabyte G1.Sniper 3 motherboards, graphics cards, RAM, and other computer hardware. Starcraft II and League of Legends further offer $11,000 and $10,000 prize pools respectively.
The event will be held at the California State Polytechnic University in Pomona, California from June 15th to June 17th, 2012. In addition to Gigabyte, the eSports event is co-sponsored by Kingston and Cooler Master, among others. The LAN competitions will be broadcast in HD for free during the event for those who can’t attend in person. Alternatively, users can purchase spectator badges for $15 USD. There will also be an event raffle during the GESL that will give away various pieces of computer hardware and company swag to attendees.
Further, the case mod contest will showcase systems from participants of the BYOC LANFest or spectators; five winners will be chosen to receive computer hardware and coverage in CPU Magazine.
More information on the event can be found at the Gigabyte eSports LAN website (thegesl.com).
Subject: General Tech | May 9, 2012 - 03:09 PM | Jeremy Hellstrom
Tagged: gaming, witcher 2
[H]ard|OCP overheard that The Witcher 2 underwent a large update, to the tune of 10GB or so, which gave them an opportunity to pit the newest GPUs from NVIDIA and AMD against a game with a reputation for being hard on graphics cards. The Radeon HD 7970 and GeForce GTX 680 both struggled equally at 2560x1600, which is quite impressive for a year-old game. The most demanding feature of the game is UberSampling, which incorporates antialiasing and anisotropic filtering and can bring even these powerful graphics cards to their knees. When it was enabled, [H] saw frame rates reduced to 20fps or lower, something that the old standby Crysis just cannot do. You can argue that a DX9 game lacks many of the DX10 and DX11 optimizations that help these cards' performance, but it would seem that, as far as testing the raw graphics power of a card goes, this game is hard to beat.
"The Witcher 2: Assassins of Kings just underwent a major update. The Enhanced Edition offers a new zone, more quests, more cinematics, and several bug fixes. This DX9 game has the reputation of being one of the most demanding and stressful games on modern video cards. We put this theory to the test."
Here is some more Tech News from around the web:
- Lone Survivor PC Review @ eTeknix
- Saturday Morning Satan: Diablo III Gets A Cartoon! @ Rock, Paper, SHOTGUN
- Why Elder Scrolls Online Needs To Be A Sandbox @ Rock, Paper, SHOTGUN
- No BioShock Infinite This Year, Not For You @ Rock, Paper, SHOTGUN
- Kid Icarus: Uprising Nintendo 3DS @ Tweaktown
- Dragon's Dogma Preview (PlayStation 3) @ HardwareHeaven
- Running Sega Dreamcast Games on Your PC with nullDC @ Techgage