Subject: General Tech | July 21, 2011 - 01:02 PM | Tim Verry
Tagged: PSU, modular, corsair
Corsair has recently introduced a new line of modular power supplies based on the popular TX V2 Enthusiast Series. The new modular PSUs have an attached ATX 12V cable and a full complement of flat, detachable cables. Being 80 Plus Bronze certified, the new PSUs deliver at least 85% efficiency at 50% load. Available from authorized retailers in July, the models range in wattage from 550 watts to 850 watts.
The Corsair TX550M is based on the TX550 V2 and is able to support the following connectors in addition to the non-modular ATX 12V cable.
Wattage: 550W @ 50°C
As for power delivery, the PSU can supply 45A on the +12V rail and 25A on the +5V rail.
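As a quick sanity check on those rail figures (taken from the post, not a datasheet), the per-rail wattages work out as follows:

```python
# Per-rail wattage check for the TX550M (P = V * I).
# Amperage figures are the ones quoted above, not from a datasheet.
rail_12v_watts = 12 * 45   # +12V rail rated at 45 A
rail_5v_watts = 5 * 25     # +5V rail rated at 25 A

print(rail_12v_watts)  # 540
print(rail_5v_watts)   # 125
```

The +12V rail alone can carry nearly the unit's full 550W rating, which is typical of modern designs where most of the load sits on +12V.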
Read more about the new power supplies.
Subject: General Tech, Shows and Expos | July 21, 2011 - 11:35 AM | Jeremy Hellstrom
Tagged: Texas GamExperience, amd
You still have a while to wait before the start of Quakecon in Dallas, and even longer until the PC Perspective Hardware Workshop starts off and you can act like the maddest fool around in order to win prizes. That doesn't mean that Dallas is quiet in the meantime; indeed, [H]ard|OCP and AMD just hosted the Texas GamExperience there this past weekend. They posted pictures of the event, including an awe-inspiring Eyefinity experience. Click through to see the rest of the pictures.
"AMD and HardOCP got together this past weekend in Dallas to put together an event that was focused on giving back to the enthusiast computer hardware community that has given so much to us."
Here is some more Tech News from around the web:
- New tablet from Lenovo for both fun and business @ SemiAccurate
- Hands on pictures of ASRock's new Gen3 motherboards @ VR-Zone
- Apple OS X Lion Review @ TechReviewSource
- Intel reports (mostly) solid revenue growth @ The Register
- A pound of flesh: how Cisco's "unmitigated gall" derailed one man's life @ Ars Technica
Subject: Editorial, General Tech | July 20, 2011 - 02:00 PM | Scott Michaud
Tagged: bumpday, 3d
This week LG unveiled their glasses-free 3D LCD display, with only a minimal number of LG employees trying to pet a poorly Photoshopped Formula One race car. 3D is quite heavily promoted lately, the hype machine apparently fueled by anthropomorphic blue cats and box office records. 3D on the PC has been around for much longer, however. NVIDIA and ELSA offered support for 3D glasses over a decade ago, for 3D effects in the games of the time. There has not really been much said about 3D between then and the current rush of publicity, so I guess it is time to bump it up in our memory.
This week’s intermission… in the third dimension
In August 2002 the epitome of threads on ATI’s lack of stereoscopic 3D support was born with a simple message: give your greens to the green. Of course, whenever you mention one brand over another it immediately becomes a three-way comparison between the market leaders: ATI, nVidia, and Matrox (wha-what!?! Actually, another article will be posted soon; an old Matrox technology has a spiritual successor… because the body’s long since dead). Even back then, however, we had people bashing 3D technology long before it was cool to dislike 3D technology. Some people like it a lot though, enough to drop $1600 on a pair of 3D VR glasses, but no money on an ATI card.
Subject: General Tech | July 20, 2011 - 01:52 PM | Steve Grever
Tagged: osx, macbook, mac, lion, imac, apple
Mission Control (Courtesy of Apple)
Apple released their latest operating system, dubbed OS X Lion, today; it includes more than 250 new features the company states will make dramatic improvements to how users interact with Apple's entire line of computer systems. The $29.99 upgrade includes several new features like multi-touch gestures, full-screen apps, a new Mission Control section, and a new home for Mac apps called Launchpad.
LaunchPad (Courtesy of Apple)
Apple expanded OS X's ability to view installed applications through a new program called Launchpad. Launchpad shows all of the apps on your Mac on one screen and gives you instant access to them. Previously, installed apps were viewed in a smaller window; Launchpad uses the screen real estate more efficiently to show users all of their apps at one time.
Apple Mail (Courtesy of Apple)
OS X Lion also showcases a redesigned Mail program that uses a widescreen view to show message lists in modular sections that are more intuitive to read and use. Another feature, Conversations, gives users a basic timeline showing threads of messages from specific people. The revamped program also includes search suggestions and search tokens to make finding archived or buried e-mails a lot simpler than clicking around for them.
Apple Server (Courtesy of Apple)
Another interesting addition is OS X Lion Server, which provides more control over user and administrator permissions than the previous Server app. It can turn almost any Mac into a basic server, with secure options for remotely managing computers running Lion as well as iOS devices like iPhones and iPad 2s. Server admins can also send updates to all of their users wirelessly through push notifications. Apple also made many improvements to the OS's file sharing options and to other programs like Wiki Server, iCal Server, and Mail Server.
The OS X Lion upgrade can be purchased from the Mac App Store or online at Apple.com for $29.99. The entire download weighs in at around 3.49GB, a pretty significant update that should give many users more flexibility in how they use and interact with their Apple systems.
Subject: General Tech | July 20, 2011 - 12:54 PM | Jeremy Hellstrom
Tagged: john carmack, gaming
In a recent interview John Carmack seemed quite annoyed by a question implying that the current generation of first-person shooters are all carbon copies of each other. He picked one of the favourite targets of game critics, Call of Duty, pointing out that if people were sick of the game and its general gameplay, it would not sell the way that it does. He also touched on the cinematic opportunities offered by third-person gameplay; when you watch a movie it is very rare for the director to choose a first-person view, as that limits their creative choices for effects and environments. Follow the link from Slashdot for more information.
"id Software co-founder John Carmack defended the creativity of first-person shooter games in a recent interview. The legendary programmer, who was a pioneer in the shooter genre with Doom and Quake, said he doesn't like hearing from developers that shooters aren't good because they're not reinventing the wheel. 'I am pretty down on people who take the sort of creative auteurs' perspective. It's like "Oh, we're not being creative." But we're creating value for people — that's our job! It's not to do something that nobody's ever seen before. It's to do something that people love so much they're willing to give us money for... you see some of the indie developers that really take a snooty attitude about this,' he lamented."
Here is some more Tech News from around the web:
- Contemporary Graphics Accelerators in F.E.A.R. 3 @ X-bit Labs
- Duke Nukem Forever - 3D and Gameplay PC Review @ eTeknix
- DiRT 3 PC Review @ eTeknix
- Free-to-play MMO Realm of the Mad God a maddening, retro take on Diablo @ Ars Technica
- F.E.A.R. 3 (PC) Review @ HardwareHeaven
- L.A. Noire - PC, Xbox 360, PS3 @ HEXUS
- Wot I Think: New Vegas: Old World Blues @ Rock, Paper, SHOTGUN
- Take Dictation: Tropico 4 Screenshot Gallery @ Rock, Paper, SHOTGUN
- DA2 Expand-O-Pack: Better, Tougher, Morer @ Rock, Paper, SHOTGUN
- StarFox 64 3D Nintendo 3DS Hands On Preview @ Tweaktown
Subject: General Tech | July 20, 2011 - 12:25 PM | Jeremy Hellstrom
Tagged: southern islands, nvidia, gpu, amd, 28nm
Thanks to some information garnered by SemiAccurate, we have a very good idea of AMD's release plans for their new GPU family, which we have been referring to as Southern Islands. The confusion we felt over AMD's announcement that Southern Islands parts would be ready sooner than expected arose from the reported difficulties TSMC was having with their 28nm HKMG process. Thankfully, someone had a chance to take apart some 28nm TSMC field programmable gate arrays and found inside an HKMG design modified for lower power than the original specs. That doesn't mean cellphone-level graphics performance, but it certainly means that the first GPUs we see from Southern Islands will not be the high end cards. AMD did the same thing with previous generations of GPUs, so the release schedule is becoming a habit, even if it is not what enthusiasts would prefer.
There are other side effects to this choice by AMD and TSMC, which will probably hurt NVIDIA, who is hoping to get full-power Kepler-based GPUs out at the beginning of next year. Since NVIDIA tends towards more aggressive clocks, the experience TSMC gains with what is called the 28nm HPL process will not necessarily help NVIDIA's 28nm HKMG process. SemiAccurate has more.
"The final piece of the TSMC 28nm HKMG process puzzle was put in place at SemiCon last week, it now makes sense. Chipworks got ahold of a Xilinx Kintex-7 FPGA, and it revealed a few secrets on the operating table.
If you recall, AMD is on track to put out Southern Islands chips much earlier than most people, SemiAccurate included, expected, possibly even this quarter. The real question is what process they are going to make it on, the TSMC 40nm SiON or 28nm HKMG? 40nm would be big, hot, and limited, think volcanic island more than Southern, while the 28nm SHP HKMG process wasn’t supposed to be ready until late Q1, best case. The short story is that Southern Islands is very likely not on either one."
Here is some more Tech News from around the web:
- Intel buys a networking chip design firm @ The Inquirer
- Back to the Mac: OS X 10.7 Lion @ AnandTech
- Mac OS X 10.7 Lion: the Ars Technica review
- Intel to launch new Celeron processors in September @ DigiTimes
- Google+ makes stalking easier @ t-break
- VidaBox iPad Wall Mount Review @ Madshrimps
Subject: Editorial, General Tech | July 19, 2011 - 11:59 PM | Scott Michaud
Tagged: mozilla, firefox
One side-effect of splitting a program up into multiple processes is that instructions do not inherently have a specific order. One of the most evident places for that to occur is during a videogame. I am sure most gamers have played a game where the controls just felt sluggish and muddy for some inexplicable reason. While there could be a few problems, one likely cause is that your input is not evaluated for a perceivably large amount of time. Chris Blizzard of Mozilla took on this and other issues with multithreaded applications and wrapped it around the concept of Firefox past, present, and future.
Firefox is getting Beta all the time.
One common misconception is that your input is recognized between each frame, which is untrue: many frames can go by before input affects the events on screen. In a recent E3 interview, John Carmack discussed id measuring up to 100ms worth of frames occurring before a frame recognized the user’s command. This is often more forgivable in games with slower-paced design where agility is less relevant; if your character would lose a foot race to a yak, turns about as quickly as one, and takes a hundred bullets to die, you will not notice that you started to dodge a few milliseconds earlier, as you would expect to die in either case. In a web browser it is much less dramatic, though the same principle holds: the browser is busy with its many tasks and cannot waste too much time checking whether the user has requested something yet. This aspect of performance, along with random hanging, is considered “responsiveness”. Mozilla targets 50 milliseconds (one-twentieth of a second) as the maximum time before Firefox rechecks its state for changes.
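The idea of a responsiveness budget can be sketched as a work loop that never runs longer than 50 milliseconds without polling for input. This is a minimal illustration of the concept only, not Firefox's actual event loop; the names (`run_loop`, `poll_input`, `handle_input`) are hypothetical:

```python
import time
from collections import deque

BUDGET = 0.050  # Mozilla's 50 ms responsiveness target

def run_loop(work_items, poll_input, handle_input, budget=BUDGET):
    """Drain a queue of small work units, but never go longer than
    `budget` seconds without checking for pending user input."""
    work = deque(work_items)
    last_poll = time.monotonic()
    while work:
        work.popleft()()  # perform one small unit of work
        if time.monotonic() - last_poll >= budget:
            for event in poll_input():
                handle_input(event)
            last_poll = time.monotonic()
    for event in poll_input():  # drain anything left after the work ends
        handle_input(event)
```

The catch, as Carmack's 100ms figure shows, is that a single work unit that runs long blows the budget anyway, which is why splitting work into small units (or separate threads and processes) matters in the first place.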
Chris Blizzard goes on to discuss how hardware is mostly advancing through increased parallelism rather than clock speed and other per-thread improvements. GPGPU was not a topic in the blog post, leaving the question for the distant future centered on what a multithreaded DOM would look like, valuing the classical multicore over the still-budding many-core architectures. Memory usage and crashing were also addressed, though this was likely more to dispel Firefox's memory-hog stereotype, which took hold late in the Firefox 2 era.
The GPGPU trail is not Mozilla's roadmap.
The last topic discussed was sandboxing for security. One advantage of branching your multiple threads into multiple discrete processes is that you can request that the operating system assign limited rights to individual processes. The point of limited rights is to prevent one process from gaining enough permissions to force your computer to do something undesirable. If you are accepting external data, such as a random website on the internet, you need to make sure that if it can exploit a vulnerability in your web browser, it gains as little permission as possible. While there is no guarantee that external data will never be executed with dangerous permission levels, the harder you can make it, the better.
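On Unix-like systems, one way a parent can hand a child process limited rights is to apply resource limits between fork and exec. This is a simplified sketch of the general idea, not how Firefox sandboxes anything; `run_restricted` and the particular limit chosen are illustrative:

```python
import os
import resource
import subprocess
import sys
import tempfile

def run_restricted(cmd, max_file_bytes=1024):
    """Spawn a child whose maximum file size is capped, so even a
    compromised child cannot write large files (Unix only)."""
    def drop_rights():
        # Runs in the child after fork, before exec: the limit
        # applies only to the child, never to the parent.
        limit = (max_file_bytes, max_file_bytes)
        resource.setrlimit(resource.RLIMIT_FSIZE, limit)
    return subprocess.run(cmd, preexec_fn=drop_rights)

# A child that tries to write 100 KB is killed by SIGXFSZ before it can.
target = os.path.join(tempfile.gettempdir(), "oversized.tmp")
evil = f"open({target!r}, 'wb').write(b'x' * 100_000)"
result = run_restricted([sys.executable, "-c", evil])
print(result.returncode != 0)  # True: the child was terminated
```

Real browser sandboxes go much further (seccomp filters, dropped capabilities, restricted tokens on Windows), but the principle is the same: the parent decides what rights the child gets before any untrusted data is processed.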
What do our readers think? (Registration is not required to comment.)
Subject: General Tech | July 19, 2011 - 02:05 PM | Jeremy Hellstrom
Tagged: CoolerMaster Storm Sirus, audio, 5.1 headset
Naming a product Sirus right now might attract an odd crowd; then again, maybe it is best that they are using headphones to watch or listen to their favourite series. CoolerMaster's newest member of the Storm lineup is not a case, mouse, or fan; it is a 5.1 surround headset. One of the more interesting features is that there is only one wire coming from the headset, connecting to a small round controller. From there you connect to the PC using USB or, preferably, four of the analog jacks on the back of your PC. The controller allows you to adjust the levels of each channel separately, which is a very nice touch. Unfortunately, no matter how Neoseeker adjusted it, they could not bring it up to audiophile standards, but they have no reservations recommending it for gamers.
"Not one to be left out, Cooler Master enters the PC audio market with a 5.1 surround sound headset of its own that can connect to your audio source via analog jacks or USB port. See how well the Sirius stacks against more specialized headsets in our latest audio review."
Here is some more Tech News from around the web:
- CM Storm Sirus True 5.1 Surround Sound Gaming Headset Review @Hi Tech Legion
- CM Storm Sirus 5.1 Gaming Headset @ Benchmark Reviews
- Cooler Master Storm Sirus 5.1 Headset Review @ Hardware Canucks
- Cooler Master Storm Sirus 5.1 Gaming Headset @ Modders-Inc
- Cooler Master Storm Sirus 5.1 Headset Review @ Legit Reviews
- Cooler Master Storm Sirus 5.1 Gaming Headset Review @ Ninjalane
- Cooler Master CM Storm Sirius 5.1 Gaming Headset Review @ Tweaknews
- Roccat Kulo – Virtual 7.1 USB Gaming Headset @ Rbmods
- Steelseries 7H Gaming Headset @ Funky Kit
- Everything You Need to Know About the SPDIF Connection @ Hardware Secrets
- How to Make a Superlens From Soda Cans @ Make:Blog
- Asus' Xonar U3 USB audio device @ The Tech Report
Subject: General Tech | July 19, 2011 - 01:02 PM | Jeremy Hellstrom
Tagged: Intel, amd, arm, mali, low power
Those who ignored Microsoft's announcement that Windows 8 will support ARM processors will perhaps take note of IHS Isuppli's claim that ARM could grab one in five of the laptops sold by 2015. The extremely low-power system-on-a-chip designs ARM has been licensing were at the opposite end of the market from AMD's and Intel's X86 chips, but with the rise of the APU the market has undergone a fundamental change. While the X86 makers are trying to lower the power requirements of their APUs, ARM is busy ramping up the power of its chips. Several vendors have already established a relationship with ARM, up to and including Apple.
ARM's Cortex A9 and Mali are impressive, but ARM is already talking about console-level graphics quality from their next generation of chips, which we will see in roughly 18 months. That generation will also incorporate their next round of power efficiency research, which should keep power consumption and heat well below what Intel and AMD will be trying to reach. It might also provide an interesting opportunity for NVIDIA, as the lack of a license to integrate chips with the new X86 based architecture will not stop them from developing graphics enhancements for ARM based laptops. Drop by The Inquirer for more on this topic.
"CHIP DESIGNER ARM could power over 20 per cent of all laptops shipped in 2015, according to analyst outfit IHS Isuppli.
IHS Isuppli has forecast that the domination of X86 chips in the laptop market will start to diminish as Microsoft releases its Windows 8 operating system. Windows 8 will be the first desktop operating system from Microsoft that will support the ARM architecture that is found in just about every smartphone in existence."
Here is some more Tech News from around the web:
- Foxconn reportedly considering ECS acquisition @ DigiTimes
- ReRAM gets closer to reality @ SemiAccurate
- Samsung SH100 Review @ TechReviewSource
- Ninjalane Podcast - Duke Nukem Forever Favorite Asus Product Listener Mailbag
- Sandberg Hard Disk Cloner Review @ Real World Labs
- Test Driving GNU Hurd, With Benchmarks Against Linux @ Phoronix
- S2TC: A Possible Workaround For The S3TC Patent Situation @ Phoronix
- Cyborg Gaming Lights (amBX) Review @ HardwareHeaven
- Panasonic Lumix GH2 Review @ t-break
- Real World Labs And Thermalright Joint Contest
- Win a Blackberry Bold 9900 @ t-break
Subject: Editorial, General Tech, Graphics Cards | July 17, 2011 - 01:07 PM | Scott Michaud
Tagged: stanford, nvidia, CUDA
NVIDIA has been pushing their CUDA platform for years now as a method to access your GPU for purposes far beyond the scope of flags and frags. We have seen what a good amount of heterogeneous hardware can do for a process with a hefty portion of parallelizable code, from encryption to generating bitcoins, from media processing to blurring the line between real-time and non-real-time 3D rendering. NVIDIA also recognizes the role academia plays in training future programmers and thus offers strong support when an institution teaches how to use GPU hardware effectively, especially when it teaches how to use NVIDIA GPU hardware effectively. Recently, NVIDIA knighted Stanford as the latest member of its CUDA Center of Excellence round table.
It will be $150 if you want it framed.
The list of CUDA Centers of Excellence now includes: Georgia Institute of Technology, Harvard School of Engineering, Institute of Process Engineering at Chinese Academy of Sciences, National Taiwan University, Stanford Engineering, TokyoTech, Tsinghua University, University of Cambridge, University of Illinois at Urbana-Champaign, University of Maryland, University of Tennessee, and the University of Utah. If you are interested in learning about programming for GPUs, NVIDIA has just blessed one further choice. Whether that will affect many prospective students and faculty remains to be seen, but it makes for many amusing puns nonetheless.