All | Editorial | General Tech | Graphics Cards | Networking | Motherboards | Cases and Cooling | Processors | Chipsets | Memory | Displays | Systems | Storage | Mobile | Shows and Expos
Subject: General Tech | June 21, 2011 - 03:33 AM | Scott Michaud
Tagged: vsis, sudo, PlanetLab, linux
Unix-based computers have always been ahead of the curve when it comes to enforcing permission levels upon their users. Back in the infancy of operating systems the idea of permissions did not compute with many platform developers; today it is next to impossible to imagine a modern operating system without some sort of hierarchy of trust. Historically there have been various methods of controlling access: first there was “su”, which temporarily logged you in as another user; then there was “sudo”, which let you execute individual commands as another user rather than log in as them; now PlanetLab claims to have a better offering, called “Vsys”.
Vsys was created to allow finer control over which user is allowed which action. One feature is the ability to create extensions, executable scripts that define exactly what is and is not permissible. It is apparently possible to permit certain combinations of commands while denying other, similar combinations. PlanetLab, the creator of Vsys, is a research network headquartered at Princeton University. From the wording of the article it appears that Vsys began as an internal tool, developed to give their researchers access to exactly what was necessary, which has now been released publicly. While it has been stated that Vsys will not replace sudo for the common user, it should be useful for administrators of larger groups of users.
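The article does not document Vsys's actual interface, but the general idea of an extension-based permission scheme can be sketched in a few lines. This is a toy Python sketch; the names, the whitelist, and the wrapped command are all invented for illustration and are not Vsys's real API.

```python
# Toy sketch of the idea behind a Vsys-style extension. Instead of
# granting a user a raw privileged command (as a sudoers entry does),
# the administrator writes a small script that exposes exactly one
# narrow, validated operation. Names and policy here are invented.

ALLOWED_INTERFACES = {"eth0", "eth1"}

def bring_up_interface(requested: str) -> str:
    """Permit bringing up a whitelisted interface and nothing else."""
    if requested not in ALLOWED_INTERFACES:
        raise PermissionError(f"interface {requested!r} is not permitted")
    # A real extension would now run the privileged command, e.g.:
    # subprocess.run(["ifconfig", requested, "up"], check=True)
    return f"ifconfig {requested} up"

print(bring_up_interface("eth0"))  # one specific combination is allowed
```

The point is that the unprivileged user never gets the privileged command itself, only the one combination the administrator has vetted.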
Subject: General Tech, Storage | June 20, 2011 - 09:06 PM | Ryan Shrout
Tagged: storage, raid, network attached storage, NAS, drobo
Just Delivered is a new section of PC Perspective where we share some of the goodies that pass through our labs that may or may not see a review, but are pretty cool nonetheless.
When the time is right for dedicated network storage and you don't want to go through the hassle or complication of building your own FreeNAS or similar device, one of the best options on the market, according to our own Allyn Malventano, is a Drobo.
For an upcoming review we just received a new Drobo FS, the network attached version of the Drobo lineup. Available in both a standard and a "Pro" model, the former with 5 bays and the latter with 8, they are about as idiot-proof and easy to set up as a NAS can be.
The Drobo FS has only a single connectivity option: a Gigabit Ethernet port for connection to your primed-and-ready router. Adding hard drives, or swapping them for larger models, is super easy, and the "BeyondRAID" technology makes it reliable as well as simple to use.
We are looking forward to putting the Drobo FS to the test in the coming days and reporting back to you on the performance, features and reliability of it.
Subject: General Tech, Processors | June 20, 2011 - 04:46 PM | Scott Michaud
Tagged: VIA, sysmark, nvidia, bapco, amd
People like benchmarks. Benchmarks tell you which component to purchase while your mouse flutters between browser tabs of various Newegg or Amazon pages. Benchmarks let you see how awesome your PC is when videogames often will not for a couple of years. One benchmark you probably have not seen here in a very long time is SYSmark from the Business Applications Performance Corporation, known as BAPCo to its friends and well-wishers. There has long been dispute over the political design of BAPCo, and it eventually boiled over, with AMD, NVIDIA, and VIA rolling off the sides of the pot.
Fixed that for you
The disputes centered mostly on the release of SYSmark 2012. For years various members have complained about aspects of the product which they allege Intel strikes down or ignores while designing each version. One major complaint is the lack of reporting on the computer’s GPU performance, which is quickly becoming central to an actual system’s overall performance. With NVIDIA, AMD, and VIA gone from the consortium, Intel is pretty much left alone in the company: now officially.
Subject: General Tech | June 20, 2011 - 12:11 PM | Jeremy Hellstrom
Tagged: Intel, mic, larrabee, knights corner, 50 GPGPU
Knights Corner is not exactly Larrabee, but the idea behind both is very similar: a large number of GPGPU cores integrated with a CPU, with Intel now using a Xeon core as opposed to a Pentium, and the GPGPU cores hooked up in a manner similar to Larrabee's ring of Pentium cores. The design is proven, as they have sold units of the previous generation Knights Ferry, and it offers a feature that a lot of programmers are going to appreciate: instead of needing to learn a new language like CUDA or OpenCL, standard x86 scalar code is used to program these chips. This architecture is also expected to scale very well, for as ARM recently pointed out, only specific multithreaded applications continue to scale well as more cores are added. Drop by The Inquirer for more information.
They will likely be sold as PCIe cards like the Knights Ferry card pictured above.
"CHIPMAKER Intel has announced its second generation hybrid core technology codenamed 'Knights Corner'.
Knights Corner is Intel's second chip in its Many Integrated Core (MIC) chip line and will feature Xeon X86 cores and more than 50 GPGPU cores loosely based on what was previously known as Larrabee. Knights Corner will be fabricated using Intel's 22nm tri-gate process node beginning in 2012, though the firm would not be drawn on the exact core count at this time."
Here is some more Tech News from around the web:
- Japan's 8-petaflop K Computer Is Fastest On Earth @ Slashdot
- Intel admits that Moore's Law is not enough @ The Inquirer
- When WiFi doesn't work: a guide to home networking alternatives @ Ars Technica
- Western Digital Livewire PowerLine AV Kit @ TechwareLabs
- US reveals Stuxnet-style vuln in Chinese SCADA 'ware @ The Register
- Adobe offloads unwanted Linux AIR onto OEMs @ The Register
- Designcord 5 Metre Autorewind Cable Reel Extension Lead Review @ eTeknix
- x264 HD Benchmark 4.0 @ TechARP
- ArtRage: quality digital painting on the cheap @ Ars Technica
- Ultra Simple 360-degree Photo Hack @ Make
- AMD Developer Summit lacks Bulldozer details @ The Inquirer
- Nokia Connections 2011 - Our Expectations @ t-break
- DreamHack Summer Festival kicks off! Day One! @ eTeknix
- Interview with Ziad Matar of Qualcomm @ t-break
- Patriot Xporter XT Rage 32GB Flash Drive Giveaway! @ ThinkComputers
- Weekly Giveaway #2: Foxconn Flaming Blade GTI, Innergie mCube Lite, SteelSeries Spectrum AudioMixer, StarTech ExpressCard eSATA Controller Adapter Card @ eTeknix
Subject: General Tech, Displays | June 20, 2011 - 04:03 AM | Scott Michaud
Tagged: widi, D-Link
There are a lot of benefits to having a home theatre PC but still one major drawback: having the PC by the TV. Intel has worked hard to find a solution and released the specification under the name “WiDi”, a wireless display specification that lets you share your monitor with an HDTV attached to a wireless receiver box. D-Link has just launched their WiDi receiver in the US, with Canada coming next month; will WiDi start picking up market share with more capable devices?
Why do network appliances these days look like pillows?
(Image from D-Link)
The D-Link MainStage (known as the DHD-131 to its friends) has only a power cable to its name, apart from your choice of video and audio connections to your TV or sound system. For connections you have two video options and three audio options: on the video side, HDMI for your high-definition viewing and standard RCA for your standard definition devices; on the audio side, optical audio or HDMI for surround and white and red RCA for stereo. Apart from a power button and a reset button, that is the whole of this unit.
Plastic case with on/off button. Baby got back shots.
(Image from D-Link)
One thing that typically holds back other implementations of WiDi that I have seen, and I assume this is no exception, is latency. The slight lag when controlling a media program or browsing a website is acceptable; however, it would really hold back the use of a PC as a console replacement unless the video card is directly connected to the TV, which is a definite shame but to be expected given the WiFi bandwidths we are talking about. If you happen to be interested in this solution, however, it retails for just under $130.
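To put rough numbers on that bandwidth problem (these are generic back-of-the-envelope figures, not D-Link's specifications): uncompressed 1080p video is roughly ten times what an optimistic 802.11n link can carry, so the stream must be encoded on the PC and decoded in the receiver, and that codec pipeline is where the latency comes from.

```python
# Back-of-the-envelope: why a wireless display needs compression,
# and therefore incurs latency. All figures are generic assumptions.

width, height, bits_per_pixel, fps = 1920, 1080, 24, 60
raw_bps = width * height * bits_per_pixel * fps
print(f"Uncompressed 1080p60: {raw_bps / 1e9:.2f} Gbit/s")  # ~2.99 Gbit/s

wifi_n_bps = 300e6  # optimistic 802.11n link rate, before overhead
print(f"Minimum compression ratio: {raw_bps / wifi_n_bps:.0f}x")
```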
Subject: Editorial, General Tech | June 20, 2011 - 03:24 AM | Tim Verry
Tagged: simulator, networking, Internet, cyber warfare
Our world is host to numerous physical acts of aggression every day, and until a few years ago those acts remained in the (relatively) easily comprehensible physical world. However, the millions of connected servers and clients that overlay the nations of the world have rapidly become host to what is known as “cyber warfare”: subversion and attacks against another people or nation through electronic means, targeting its people or its electronic and Internet-based infrastructure.
While physical acts of aggression are easier to examine (and gather evidence on) and attribute to the responsible parties, attacks on the Internet are generally the exact opposite. Thanks to the anonymity of the Internet, it is much more difficult to determine the originator of an attack. Further, there is an ethical debate over whether physical action in the form of a military response is appropriate to an online attack.
It seems as though the Pentagon is seeking the answers to the issues of attack attribution and appropriate retaliation methods through the usage of an Internet simulator dubbed the National Cyber Range. According to Computer World, two designs for the simulator are being constructed by Lockheed Martin with a $30.8 million USD grant and Johns Hopkins University Applied Physics Laboratory with a $24.7 million USD grant provided by DARPA.
The National Cyber Range is designed to mimic human behavior in response to various DefCon and InfoCon (Information Operations Condition) levels. It will allow the Pentagon and authorized parties to study the effectiveness of war plan execution as it simulates offensive and defensive actions on the scale of nation-backed cyber warfare. Once DARPA has chosen the final National Cyber Range design from the two competing projects (by Johns Hopkins and Lockheed Martin), the government would be able to construct a toolkit that would allow it to easily transfer and conduct cyber warfare testing from any facility.
Image courtesy Kurtis Scaletta via Flickr Creative Commons.
Subject: General Tech, Mobile | June 19, 2011 - 12:22 AM | Scott Michaud
Tagged: tablet, sony, S2, S1
We are going to see quite a few Android-based tablets come out in the next few months as the floodgates open for tablet creators. We have been reporting on strong rumors pointing to Amazon stepping into the tablet space this fall to extend their Kindle portfolio. Amazon is generally very successful when they decide to step into a market, yet that did not deter Sony from preparing to dive into the tablet space as well. Sony is preparing to launch a 9-inch tablet and a dual-screen 5.5-inch tablet in the autumn, and they have released a video ad campaign to build hype for that event.
This “Two Will” Pass
As you can tell from watching the video, it says little about the product except that they slide really quickly, absolutely love someone, cast ominous shadows, and can kill action figures with lightbulb mind bullets. Sony did mention that this is just the first episode of five, so it is possible that the later videos may be more informative. However, if you just want to see what an Echochrome 2-esque city has to do with Android tablets then be sure to watch the next four commercials.
We apologize for the lack of a podcast, but you don't often get a chance to see your city humiliate itself
Subject: General Tech | June 17, 2011 - 06:36 PM | Jeremy Hellstrom
Tagged: friday, PC Perspective Forums
Before your weekly tour through the PC Perspective Forums, it would behoove you to check out the bottom of some of our front page stories. If you click on the comments link, or just scroll down after clicking on an article, you will notice it is possible to have a discussion about that article right there on the front page with other readers and with the creator of the review or news post. It is easy and you don't even need to sign up, though we would prefer that you do, as membership at PC Perspective does have its privileges. If you want to remain anonymous or unverified then certainly do, though we do require you to fill something in to the email address box and read a captcha, as there are still spammers out there on the internets.
To get really in-depth advice and opinions you should head off to the PC Perspective Forums; we can't give you in-depth instructions on using Microsoft's debugging tools in the comments section, but it is a piece of cake (slightly old cake, but still) to provide step-by-step instructions with pictures in the Forum itself. Some things just shouldn't be on the front page, but in the forums you will find kind souls ready to help your wetware as well as your hardware. Sometimes you will even find independent reviews in the Forums.
In the Cases'n'Cooling Forum, a new case mod has appeared; the Blood Ice HAF 922 is worth a look whether you are into case modding or just want to see an impressive tool set and workbench. In the Storage Forum a member is having an unpleasant time with a recent SSD upgrade as is someone in the Linux Forum.
The live watchers are already upset with us and the rest will just be receiving the bad news, but for the first time in quite a while we failed to provide our dedicated viewers a fresh PC Perspective Podcast on Wednesday. With Ryan in Seattle schmoozing with AMD and a pending riot of Vancouver residents, we decided to call it off. Perhaps next week we shall torture you with a double length episode?
Subject: General Tech | June 17, 2011 - 02:37 PM | Scott Michaud
We have long been battling online menaces that are looking to generate money off of the grief of others. It used to be simple for an attack to be successful: release virus; ???; profit. Now that worms are much less common, the focus has shifted from invading a person’s computer to tricking the person into letting you into their computer, or attacking the service they are accessing. Now, what was once a far-fetched joke by a popular comic strip is true: people are being contacted at home and told to infect their own computers.
Your call is VERY important to us.
The story for security has always been the same: be careful what you do, keep your attack surface as small as possible, and limit the damage in the event of a breach. You need to be aware, regardless of what platform you use, that you are only as safe as your level of complacency allows. If someone is attempting to get you to do something quickly, they are likely trying to play on that complacency by distracting you with a sense of urgency. The disappointing part is that in the heat of the moment even someone aware of these attacks could still be susceptible to them, because social engineering is simply very effective.
All of the above said, the silver lining to this whole problem is that the attackers are getting substantially more desperate, which means it is only a matter of time before the pool of attackers shrinks due to lack of profitability. The problem will never go away, but as the difficulty steadily increases for the attackers (which it is; otherwise they would not be so inventive) the draw of money will seem much less alluring.
Subject: General Tech | June 17, 2011 - 02:24 PM | Jeremy Hellstrom
Tagged: TSMC, southern islands, northern islands, llano, global foundries, arm, amd, 40nm, 32nm, 28nm
Back in April there was a kerfuffle in the news about a deal penned between AMD, Global Foundries and TSMC. It is not worth repeating completely, as you can follow the story through the previous link; suffice it to say that it did not indicate problems with the relationship between AMD and Global Foundries.
The previous post was specifically about 40nm and 32nm process chips; however, today we hear from DigiTimes that TSMC has scored a deal with AMD for the 28nm Southern Islands GPUs, of which we have seen much recently, as well as the 28nm Krishna and Wichita APUs. The 40nm Northern Islands GPUs are already produced by TSMC. That leaves a lot of production capacity free at Global Foundries to work on ARM processors.
"AMD reportedly has completed the tape-out of its next-generation GPU, codenamed Southern Islands, on Taiwan Semiconductor Manufacturing Company's (TSMC) 28nm process with High-k Metal Gate (HKMG) technology, according to a Chinese-language Commercial Times report. The chip is expected to enter mass production at the end of 2011.
TSMC will also be AMD's major foundry partner for the 28nm Krishna and Wichita accelerated processing units (APUs), with volume production set to begin in the first half of 2012, the report said.
TSMC reportedly contract manufactures the Ontario, Zacate and Desna APUs for AMD as well as the Northern Island family of GPUs. All of these use the foundry's 40nm process technology.
TSMC was quoted as saying in previous reports that it had begun equipment move-in for the phase one facility of a new 12-inch fab (Fab 15) with volume production of 28nm technology products slated for the fourth quarter of 2011. The foundry previously said it would begin moving equipment into the facility in June, with volume production expected to kick off in the first quarter of 2012."
Here is some more Tech News from around the web:
- ARM acquires Obsidian Software @ The Inquirer
- Mozilla pushes out final Firefox 5 test build @ The Register
- Sega Hacked @ XSReviews
- Tablets of 2011: What to Look For - June Update @ TechSpot
- A few thoughts on Ultrabooks @ The Tech Report
- Disabling Windows Pagefile & Hibernation to Reclaim SSD Space @ Techgage
- Overclockers Benchmarking Party II: Where the Bell Tolls!
- Post Computex 2011 - Part 2 @ Bjorn3D
Subject: General Tech, Graphics Cards, Mobile | June 17, 2011 - 04:35 AM | Scott Michaud
Tagged: webgl, microsoft
WebGL: Heaven or Hell?
(Image from MrDoob WebGL demo; contains Lucy model from Stanford 3D repository)
WebGL is an API very similar to OpenGL ES 2.0, the API used for OpenGL features in embedded systems, particularly smartphones. The goal of WebGL is to provide a lightweight, CSS-obeying 3D and shader system for websites that require advanced 3D graphics, or even general-purpose calculations performed on the shader units of the client’s GPU. Mozilla and Google currently support it in their public browsers, with Opera and Apple shipping support in the near future. Microsoft has stated that allowing third-party websites that level of access to the hardware is dangerous, as security vulnerabilities that formerly needed to be exploited locally can now be exploited from the web browser. This is an area of expertise that Microsoft knows all too well from its past attempts at active(x)ly adding scripting functionality to the web browser, which evolved into a decade-long game of whack-a-mole for security holes.
But skeptics of Microsoft’s position could easily point out that it has singled out the one standard based on OpenGL, a competitor to its still-cherished DirectX. Regardless of Microsoft’s motives, this seems to put to rest the question of whether Microsoft will work towards implementing WebGL in any release of Internet Explorer currently in development.
Do you think Microsoft is warning its competitors about its past ActiveX woes, or is this more politically motivated? Comment below (registration not required.)
Subject: General Tech, Storage | June 16, 2011 - 03:02 PM | Scott Michaud
Tagged: ssd, Intel, enterprise
Intel is currently in the process of releasing its 2011 lineup of solid state drives. A lot of news and products came out regarding the consumer 300-series and enthusiast 500-series lines; however, it has been pretty quiet regarding the enterprise 700-series products. That changed recently with the release of specifications, brought to light by AnandTech’s coverage of a report on the German hardware website ComputerBase.de.
And how does it compare to OCZ?
Intel will be releasing two enterprise SSDs: the SATA 3 Gbps based 710 SSD, codenamed Lyndonville, and the PCI Express 2.0 based 720 SSD, codenamed Ramsdale. The SATA-based 710 will feature 25nm MLC-HET flash at capacities of 100, 200, and 300 GB. The 710 will have read and write speeds of 270/210 MB/s with 35,000/3,300 read and write IOPS at 4KB and a 64MB cache. The PCIe-based 720 will feature 34nm SLC flash at capacities of 200 and 400 GB. The 720 will be substantially faster than the 710, with read and write speeds of 2200/1800 MB/s, 180,000/56,000 read and write IOPS at 4KB, and a 512MB cache. On the security front, the 710 will use 128-bit AES encryption while the 720 will use 256-bit AES.
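One way to make sense of those IOPS figures is to convert them into throughput at the 4KB block size and set them beside the sequential ratings. The conversion below assumes 4KB means 4096 bytes and uses decimal megabytes, so treat the results as rough:

```python
# Convert rated 4KB random IOPS into MB/s for comparison with the
# sequential ratings quoted above. Assumes 4KB = 4096 bytes.

def iops_to_mb_per_s(iops: int, block_bytes: int = 4096) -> float:
    return iops * block_bytes / 1_000_000

ratings = {
    "710 read":  35_000,   # sequential rating: 270 MB/s
    "710 write":  3_300,   # sequential rating: 210 MB/s
    "720 read": 180_000,   # sequential rating: 2200 MB/s
    "720 write": 56_000,   # sequential rating: 1800 MB/s
}

for name, iops in ratings.items():
    print(f"{name}: {iops_to_mb_per_s(iops):.1f} MB/s at 4KB random")
```

The gap between, say, the 710's 210 MB/s sequential writes and roughly 13.5 MB/s of 4KB random writes is a reminder that sequential and random workloads stress flash very differently.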
While there has been no hint toward pricing of these drives, Intel is still expected to make a second quarter release date for the SATA-based 710 SSD. If you are looking for a PCI Express SSD you will need to be a bit more patient, as the 720 is still expected in the fourth quarter. It will be interesting to see how the Intel vs OCZ fight for dominance in the PCIe-based SSD space plays out in 2012.
Subject: General Tech | June 16, 2011 - 12:57 PM | Jeremy Hellstrom
Tagged: amd, Intel, nvidia
In some sort of bizarre voyeuristic hardware love/hate triangle, AMD, Intel and NVIDIA are all semi-intertwined and being observed by Microsoft. Speaking with The Inquirer, the VP of product and platform marketing at AMD, Leslie Sobon, stated that there was no chance Intel would attempt to purchase NVIDIA as AMD did with ATI. AMD's purchase was less about the rights to the Radeon series than about taking possession of the intellectual property ATI had built up over a decade of creating GPUs, which led directly to the APUs that AMD has recently released and which will likely become its main product. Intel already has a working architecture that combines GPU and CPU and doesn't need to purchase another company's IP in order to develop that type of product.
There is another reason for purchasing NVIDIA though, one which has very little to do with their discrete graphics card IP and everything to do with Tegra and Fermi, two specialized products for which Intel so far has no answer. A vastly improved and shrunken Atom might be able to push Tegra off of mobile platforms, and perhaps specialized Sandy Bridge CPUs could accelerate computation like the Fermi products do, but so far there are no solid leads, only speculation.
If you learn more from your failures than your successes then Intel knows a lot about graphics.
"CHIP DESIGNER AMD believes that it is on a divergent path from Intel thanks to its accelerated processor unit (APU) and that Intel buying Nvidia "would never happen"."
Here is some more Tech News from around the web:
- Find Out if Your Passwords Were Leaked by LulzSec Right Here @ Gizmodo
- Adobe patches critical bugs in Flash and Reader @ The Register
- Umi, we hardly knew ye: contemplating the fate of the videophone in 2011 @ Ars Technica
- 'A SHARK attacked my ROBOT', gasps ex-Sun exec @ The Register
- We’ve got a real bone to pick with this mouse @ Hack a Day
- Fun Quotes from the AFDS Media Roundtable @ SemiAccurate
Subject: General Tech, Shows and Expos | June 15, 2011 - 09:14 PM | Scott Michaud
Tagged: opencl, amd, AFDS
If you are a developer of applications which require more performance than a CPU alone can provide, then you are probably having a gleeful week. Today Microsoft announced its competitor to OpenCL, and we have a large write-up about that aspect of the keynote address. If you are currently an OpenCL developer you are not left out, however, as AMD has announced new tools designed to make your life easier too.
General Purpose GPU utilities: Because BINK won't satisfy this crowd.
(Logo trademark Apple Inc.)
AMD’s spectrum of enhanced tools includes:
- gDEBugger: An OpenCL and OpenGL debugger, profiler, and memory analyzer released as a plugin for Visual Studio.
- Parallel Path Analyzer (PPA): A tool designed to profile data transfers and kernel execution across your system.
- Global Memory for Accelerators (GMAC) API: Lets developers use multiple devices without needing to manage multiple data buffers in both the CPU and the GPU.
- Task Manager API: A framework to manage scheduling kernels across devices.
These tools and utilities should make the development of software easier and allow more developers to take a risk on the new technology. The GPU has already proven itself worthy of more and more important tasks, and it is only a matter of time before it is ubiquitous enough to be a default component as important as the CPU itself. As an ironic aside, that should spur the adoption of PC gaming, given how many people would then have sufficient hardware.
Subject: Editorial, General Tech, Shows and Expos | June 15, 2011 - 05:58 PM | Ryan Shrout
Tagged: programming, microsoft, fusion, c++, amp, AFDS
During this morning's keynote at the AMD Fusion Developer Summit, Microsoft's Herb Sutter went on stage to discuss the problems and solutions involved in programming for multi-processor systems, and heterogeneous computing systems in particular. While the problems are definitely something we have discussed before at PC Perspective, the new solution that was showcased is significant.
C++ AMP (Accelerated Massive Parallelism) was announced as a new extension to Visual Studio and the C++ programming language to help developers take advantage of the highly parallel and heterogeneous computing environments of today and the future. The new programming model uses C++ syntax and will be available in the next version of Visual Studio, with "bits of it coming later this year." Sorry, no hard release date was given when we probed.
Perhaps just as significant is the fact that Microsoft announced the C++ AMP standard will be an open specification, and that other compilers will be allowed to integrate support for it. Unlike C#, then, C++ AMP has a chance to become a dominant standard in the programming world as the need for parallel computing expands. While OpenCL was previously the only option that promised developers easy utilization of ALL the computing power in a device, C++ AMP gives them another option, with the full weight of Microsoft behind it.
To demonstrate the capability of C++ AMP, Microsoft showed a rigid body simulation program that ran on multiple computers and devices from a single executable file and was able to scale in performance from 3 GFLOPS on the x86 cores of Llano, to 650 GFLOPS on the combined power of the APU, and to 830 GFLOPS with a pair of discrete Radeon HD 5800 GPUs. The same executable file ran on an AMD E-series APU powered tablet at 16 GFLOPS with 16,000 particles. This is the promise of heterogeneous programming languages and is the gateway necessary for consumers and businesses to truly take advantage of the processors that AMD (and other companies) are building today.
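For a sense of scale, it is worth normalizing the keynote's throughput numbers against the x86-only baseline. The figures are as reported above; the arithmetic is ours.

```python
# Normalize the demo's reported throughput against the x86-only baseline.
gflops = {
    "Llano x86 cores only":      3,
    "Llano CPU + GPU":         650,
    "APU + 2x Radeon HD 5800": 830,
    "E-series APU tablet":      16,
}
baseline = gflops["Llano x86 cores only"]
for config, rate in gflops.items():
    print(f"{config}: {rate} GFLOPS ({rate / baseline:.0f}x baseline)")
```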
If you want programs other than video transcoding apps to really push the promise of heterogeneous computing, then the announcement of C++ AMP is very, very big news.
If you happened to open up the store page in the Steam client or glance at their website, you may have noticed that Steam has made a moderately big announcement. Valve's digital download service now supports Free-to-Play games, which are free to download and play at the basic level; aesthetic and other upgrades can be purchased via so-called "microtransactions". F2P games on Steam will still be free to download and will not require a credit card to do so.
Steam seems excited about the new F2P games.
At launch, the service is featuring five new Free-to-Play games: Champions Online: Free For All, Spiral Knights, Global Agenda: Free Agent, Forsaken World, and Alliance of Valiant Arms. According to the Steam F2P FAQ, purchases of in-game content will be made through your Steam Wallet. Further, any Steam account that does not have at least one purchased (non Free-to-Play) game or a funded Steam Wallet will be considered a "Limited User" and will be restricted in the community features it is able to access. Specifically, limited users can create community groups, be added as friends, and chat with other users; however, they are not able to send out friend invitations or start chat sessions (a non-limited user must initiate the chat).
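The FAQ's account rule reduces to a simple predicate. The sketch below is our paraphrase of the policy described above, not Steam's actual code; function names and structure are illustrative only.

```python
# A minimal model of Valve's "Limited User" rule as described in the
# F2P FAQ. Names and structure here are illustrative only.

def is_limited(purchased_games: int, wallet_funded: bool) -> bool:
    """No purchased (non-F2P) game and no funded wallet => limited."""
    return purchased_games == 0 and not wallet_funded

def can_send_friend_invite(purchased_games: int, wallet_funded: bool) -> bool:
    # Limited users can be added as friends but cannot send invitations.
    return not is_limited(purchased_games, wallet_funded)

print(can_send_friend_invite(0, False))  # F2P-only account: False
print(can_send_friend_invite(0, True))   # a funded wallet lifts the limit: True
```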
In adding the new genre to its repertoire, Steam will greatly increase its digital games library and add more options for PC gamers. One game that I have not played in some time, but would love to see make its way onto the new Free-to-Play Steam selection, is an FPS called Crossfire. That game was a good example of Free-to-Play done right, as even accounts that did not spend a dime were able to stay competitive. Is there a Free-to-Play game that you would like to see Steam feature, and do you think F2P will add value to the service? Let us know in the comments.
Subject: General Tech | June 15, 2011 - 12:46 PM | Jeremy Hellstrom
Tagged: gaming, duke nukem, 10 commandments
We need a new joke: the poster boy of vapourware has actually arrived, and no one remembers the Phantom console. You can catch up on all of the reviews of Duke Nukem Forever below the fold, but make sure you don't say anything mean about the game or the PR firm will get you. There are also a lot of previews from E3 to drool over, as many new games offered teases of their unreleased products.
Before you take a look at the games, The Tech Report has recently crafted 10 commandments that all PC games should follow. Read through them and see which of the new games look to be following the reasonable requirements that they have listed.
It's beside the Any key, right?
"Picture this for a second: you just unpacked the latest PlayBox 720-X blockbuster game, Gran Gears of Duty Fantasy XVIII. It's a game so juicy and dreamy that it'll send you flying into all the colors of the rainbow, twitching and jerking with pleasure-induced spasms just from looking at the loading screen. Let's assume for the sake of argument that said game is a first-person shooter, like, oh, about 135% of recent releases. You insert the Megaray disc, go about the installation process, and merrily start to play.
All of a sudden, you notice the left stick is used for switching weapons. The right stick moves the character, and shooting is only accomplished by pressing it. The camera is moved with the directional buttons, and the triangle, square, A, and B buttons are used for your character's smartass quips. You enter the menu to change the controls, but you can only navigate them using the motion sensors. After five minutes of furniture-dusting motions, you finally enter the options menu and find out there are barely any options, and none that matter. Frustrated, you throw the TenAxis controller at your 4D TV screen and take the shiny disc out of the console to find out whether it will blend."
Here is some more Tech News from around the web:
- Duke Nukem Forever Performance Test @ TechSpot
- Wot I Think: Duke Nukem Forever @ Rock, Paper, SHOTGUN
- Duke Nukem Forever Game Review (PC) @ HardwareHeaven
- Duke Nukem Forever: barely playable, not funny, rampantly offensive @ Ars Technica
- Duke Nukem Forever Review @ Techgage
- Skyrim's dragon battles top off impressive demo @ Ars Technica
- Asura's Wrath is effortlessly weird and weirdly awesome @ Ars Technica
- Gaming Friday – Bulletstorm @ ThinkComputers
- Serious Sam 3 adds no modern gameplay, thank heavens @ Ars Technica
- Star Wars: The Old Republic's lead writer on good Sith, evil Jedi @ Ars Technica
- Section 8: Prejudice Review (PC) @ HardwareHeaven
- DiRT 3 Gameplay Performance @ The Register
- Assassin's Creed Revelations: The Escape From Constantinople Q&A @ HEXUS
- CANVAS Teaser Is Sinister, Awesome @ Rock, Paper, SHOTGUN
- Wot I Think: Red Faction Armageddon @ Rock, Paper, SHOTGUN
- Steam Now Offering Free-To-Play Games @ Slashdot
- A character, not a voice: tuning the narration in Bastion @ Ars Technica
- Silent Hill: Downpour - Xbox 360, PS3 @ HEXUS
- Wii? Maybe U... But I'm Not Sold Yet @ Techgage
- Super Mario 3DS hands-on: the Tanooki suit is back! @ Ars Technica
- Dead or Alive Dimensions 3DS @ Tweaktown
- Duke Nukem Forever Xbox 360 @ Tweaktown
- Icebreaker Hockey for iOS: hockey stripped down to $1 perfection @ Ars Technica
Subject: General Tech | June 15, 2011 - 12:16 PM | Jeremy Hellstrom
Tagged: servers, calxeda, arm
ARM has assembled its own Super Best Friends in a team led by Calxeda and composed of Autonomic Resources, Canonical, Caringo, Couchbase, Datastax, Eucalyptus Systems, Gluster, Momentum SI, Opscode, and Pervasive. This places Ubuntu as the ARM OS of choice for the server room, and since the group includes companies developing applications for running cloud services, Microsoft is not the only one who should be paying attention; services like Amazon's EC2 could face new competition as well.
Calxeda's current reference machines pack 120 server nodes with 480 cores into a 2U chassis, a density that even a 1W Atom is going to find hard to match, and those 1W Atoms are still a ways away. Calxeda plans to get the machines out to clients for testing by the end of the year; Intel's timetable is nowhere near that tight. Read more about the low-powered battle for dominance at The Register.
"With Intel's top brass bad-mouthing ARM-based servers, upstart server chip maker Calxeda can't let Intel do all the talking. It has to put together an ecosystem of hardware and software partners who believe there's a place for a low-power, 32-bit ARM-based server platform in the data center."
Here is some more Tech News from around the web:
- ARM Fellow takes the mic at AMD event @ The Tech Report
- AMD demos Trinity laptop @ SemiAccurate
- Epson WorkForce 60 Review @ TechReviewSource
- Chameleon WiGig gains momentum @ The Register
- Having fun with Microsoft Windows @ t-break
- Microsoft squeaks on Google Nortel sale @ The Register
- A tour of Zotac's Dongguan factory @ The Tech Report
Subject: General Tech | June 14, 2011 - 05:29 PM | Jeremy Hellstrom
Tagged: audio, razer, ferox, 2.1
The Razer Ferox speakers are designed to be portable: a pair of satellites measuring 70x70x64mm (not even 3") that come in a handy carrying case. They sport batteries that should last about 11 hours and recharge over a USB connection, but the speakers still require a 3.5mm jack to carry the audio, something that did not impress t-break in the least. The sound quality was good for this type of speaker, which equates to unnoticeable bass and decent mids and highs. If you usually use headphones and simply need a way to share your audio, as opposed to needing new speakers, then check out the Ferox; otherwise Razer has better choices, as do Corsair and other manufacturers.
"Razer is no stranger to high quality audio equipment, what with the number of high-end stereo and surround headsets over the past years. Their breakthrough hit, the Razer Mako 2.1 THX speakers were one of the best desktop audio speakers at the time, and are still hard pressed to beat till this day. And now with the new Ferox speakers, Razer has entered the world of mobile speakers with a big bang."
Here is some more Tech News from around the web:
- Radiopaq Duo Headphones Review @ Tech-Reviews
- Thermaltake Shock Headset @ Bjorn3D
- Apacer Audio Steno AU825 MP4 Player Review @ Real World Labs
- SteelSeries 7H Headset for iPod, iPhone and iPad Review @ HardwareHeaven
- Speedlink Xbox 360 Headset Adapter @ XSReviews
Subject: General Tech, Systems | June 14, 2011 - 03:24 PM | Scott Michaud
Tagged: llano, hp
Level up! Llano life increased by 11 HP.
So, AMD is currently having a little shindig, as you might be aware from recent news posts, and news is just leaking from the rafters. HP recently contacted us to announce that they have expanded both their consumer and business product lines with 11 new models using “AMD’s latest Vision Technology”. What this means is that we can expect a large array of products from HP utilizing the latest generation of AMD CPUs and GPUs from the new Llano-based AMD A-Series line. Expect a helping of Llano on your HP in the near future.