Subject: General Tech | April 3, 2013 - 06:37 PM | Jeremy Hellstrom
Tagged: input, mechanical keyboard, gigabyte, Aivia Osmium, cherry mx red
Gigabyte has added another mechanical keyboard to its family, the Aivia Osmium, which uses the quiet Cherry MX Red switches preferred by gamers who don't want a click to slow down their button mashing. It is definitely aimed at gamers, with backlighting, audio in and out, a USB 3.0 port on the side, and volume and brightness wheels at the top. The Tech Report was very impressed with the macro capability of this keyboard, which is not bound to a set of dedicated keys but instead handled by a full program that allows up to 25 programmed macros, each of which can include both mouse and keyboard input. Head on over and check out the full review.
"Most high-end keyboards combine mechanical switches with LED backlighting and programmable macro keys. Gigabyte's Aivia Osmium adds a new twist: USB 3.0 connectivity. We take a closer look at this unique keyboard to see what's what."
Here is some more Tech News from around the web:
- Logitech G710+ Mechanical Gaming Keyboard @ Tweaktown
- Cooler Master CM Storm Quick Fire Rapid Mechanical Gaming Keyboard Review @ Madshrimps
- SteelSeries APEX Gaming Keyboard @ Tweaktown
- Cooler Master Storm Trigger w/Green Keyswitches @ LanOC Reviews
- AZiO Large Print Tri-Color Backlight Keyboard Review @ OCC
- ROCCAT Isku FX Illuminated Gaming Keyboard Review @ NikKTech
- Logitech G710+ Mechanical Gaming Keyboard @ LanOC Reviews
- Ducky Zero DK2108 Mechanical Keyboard @ eTeknix
- Satechi 10-Port USB 3.0 Hub UH3-10P Review @ Legit Reviews
- Func Surface 1030 XL mousepad @ Rbmods
- Func MS-3 Mouse & 1030XL Mouse Mat @ techPowerUp
- G600 MMO Gaming Mouse @ LanOC Reviews
- Razer Ouroboros Elite Gaming Mouse @ Benchmark Reviews
- Tt eSPORTS Level 10 M Gaming Mouse @ techPowerUp
- Corsair Vengeance M65 FPS Laser Gaming Mouse Review @ Madshrimps
- A4TECH V3 Bloody Gun3 Gaming Mouse @ Benchmark Reviews
- Corsair Vengeance M65 FPS Laser Gaming Mouse @ eTeknix
- SteelSeries Guild Wars 2 Gaming Mouse Review @ Madshrimps
- Logitech G600 MMO Gaming Mouse Review @ NikKTech
- AZIO GM-2000 Gaming Mouse Review @ Hardware Canucks
- Tesoro SHRIKE HL2 Laser Gaming Mouse Review @ NikKTech
- Func MS-3 Mouse Review @ Hardware Secrets
- Genius Gila GX Series Gaming Mouse Review @ Legit Reviews
Subject: General Tech | April 3, 2013 - 04:24 PM | Jeremy Hellstrom
One of the best SSDs from a dollar-per-gigabyte perspective is the Samsung 840 series; you can see it in action here in Allyn's review of the 250GB model. It uses Triple Level Cell flash, which helps keep the cost down but shouldn't affect performance for most users.
Samsung 840 Series 120GB SATA 6Gb/s 7mm 2.5" SSD (MZ-7TD120BW) @ $100
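The dollar-per-gigabyte figure is easy to verify with quick arithmetic. Here is a minimal Python sketch (the function name is my own, purely for illustration):

```python
# Quick dollar-per-gigabyte check for the Samsung 840 deal above.
def price_per_gb(price_usd, capacity_gb):
    """Return cost per gigabyte, rounded to the cent."""
    return round(price_usd / capacity_gb, 2)

# $100 for the 120GB MZ-7TD120BW works out to roughly $0.83/GB.
print(price_per_gb(100, 120))  # 0.83
```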
Subject: General Tech | April 3, 2013 - 01:43 PM | Tim Verry
Tagged: wireless display, Raspberry Pi, paperwhite, mobile, kindle, e-ink
The Raspberry Pi makes for a cheap and low-power media PC, file server, or desktop, but the lack of a display means that it is not very portable. Recently, Max Ogden posted a hack online that enables the Raspberry Pi to be used on the go by pairing it with an Amazon Kindle and its e-ink display. His wireless display setup is actually based on a previous hack that paired the Pi with the 3rd-generation Kindle. Ogden's hack takes things a step further by supporting the latest Paperwhite models and no longer requiring a wired connection between the display and the Raspberry Pi.
By loading the Raspberry Pi with Raspbian Linux and adding a terminal emulator to the Kindle, the Kindle connects to the Pi over an SSH session, where the Pi console and any keyboard input can be seen on the Kindle's e-ink display. The hardware needed to make the setup work includes a Wi-Fi hotspot, a Wi-Fi USB NIC, the Raspberry Pi, a supported Kindle, and a battery pack with enough juice to power everything. A wired or wireless keyboard and Wi-Fi dongle can be plugged into the Raspberry Pi Model B, but Model A users will need to add a USB hub as the $25 model has only a single USB port on the device itself.
Max Ogden shows off his new portable battery-powered Raspberry Pi with wireless e-ink display.
There are some limitations to this setup. One is a bit of latency between typing and seeing the characters appear on the screen, due to the low refresh rate inherent in e-ink displays and the wireless connection. Ogden estimates that this delay is around 200ms, which is noticeable but bearable while typing. The other major limitation is that the display can currently only show the Pi console, not the Raspbian GUI. For writing code or articles, you could get by with a command-line text editor like nano or vi; at the very least it would be a distraction-free writing environment, as you could not procrastinate and browse Reddit or watch videos even if you wanted to (heh).
If you are interested in setting up your own wireless Raspberry Pi display, you should check out Ogden's blog for a list of recommended hardware as well as Rod Vagg's tutorial on configuring the Kindle Paperwhite with the correct software.
This is one of the more useful Raspberry Pi hacks I've seen so far. Hopefully, a future hack will come along that allows one of these e-ink devices to display the GUI desktop environment and not just the terminal.
Subject: General Tech | April 3, 2013 - 01:21 PM | Jeremy Hellstrom
Tagged: gpu, DRAM, ddr3, price increase
It has taken a while, but the climbing price of memory is about to have an effect on the price you pay for your next GPU. DigiTimes specifically mentions only DDR3, but as both GDDR4 and GDDR5 are based on DDR3, they will suffer the same price increases. You can expect the new prices to stick around, as part of the reason for the increase in the price of RAM is a decrease in sales volume. AMD may be hit harder overall than NVIDIA, as they tend to put more memory on their cards, and buyers of value cards might see the biggest percentage increase as even those cards still sport 1GB or more of memory.
"Since DDR3 memory prices have recently risen by more than 10%, the sources believe the graphics cards are unlikely to see their prices return to previous levels within the next six months unless GPU makers decide to offer promotions for specific models or launch next-generation products."
Here is some more Tech News from around the web:
- Memory vendors pile on '3D' stacking standard @ The Register
- History of the GPU, Part 2: 3Dfx Voodoo, the game-changer @ Techspot
- Intel releases OpenCL SDK for upcoming Haswell chips @ The Inquirer
- Linux Foundation Training Prepares the International Space Station for Linux Migration @ Linux.com
- Microsoft releases Exchange 2013 update @ The Register
- Canon PowerShot A2600 Review @ TechReviewSource
- AMD Releases Open-Source UVD Video Support @ Phoronix
- Win An Amazing PC Specialist Gaming System @ eTeknix
Subject: General Tech | April 3, 2013 - 06:53 AM | Tim Verry
Tagged: ice storm extreme, ice storm, Futuremark, benchmarking, Android, 3dmark
Futuremark recently unveiled its latest 3DMark benchmarking suite for Android devices. Compatible with over 1,000 devices, the new 3DMark is a free benchmark that incorporates both the Ice Storm and Ice Storm Extreme tests. The benchmark was developed by Futuremark in cooperation with a number of industry companies including Broadcom, Imagination Technologies, Intel, NVIDIA, and Qualcomm. The Ice Storm Extreme test is also coming to the Windows version of 3DMark, and the tests can be used to compare benchmark scores across platforms.
Both benchmarking tests are based on OpenGL ES 2.0. Ice Storm runs through two graphical tests to stress the GPU and one physics test to measure CPU performance. The Ice Storm Extreme benchmark takes things further by bumping the resolution up to 1080p and swapping in higher-quality textures and post-processing effects.
The benchmark is compatible with a number of mobile smartphones and tablets running Android 3.1 or higher. It is a free download from the Google Play store.
The iOS and Windows RT versions of 3DMark are still in development. More information can be found in the press release.
Read more about Futuremark's 3DMark benchmarking suite at PC Perspective.
AMD has announced that it will be hosting an event for fans in San Francisco this weekend. The AMD Fan Day is free with registration (register here), and it will give enthusiasts a chance to go hands-on with the company's 2013 hardware lineup, play several newly released (and some not-yet-released) games, talk with industry experts, check out modded PCs, and have a chance to win free hardware and swag from AMD, Corsair, and Gigabyte.
Gamers will get a chance to speak with the developers for Bioshock Infinite, Far Cry 3, Crysis 3, Devil May Cry (DMC), and Tomb Raider as well as AMD representatives. VIZIO, IGN, Ubisoft, Sapphire, and Logitech will also be attending the AMD fan day to show off their latest products.
The event will be held at City View at Metreon (address below) at 5:30pm on Saturday, April 6th. Best of all, the first 1,000 registered attendees in the door will get a free AMD A8 5600K APU. The first 120 attendees will win both an A8 5600K APU and an A85X motherboard.
One of the modded PCs that will be on the event floor.
If you're going to be in the area this weekend and are interested in going, be sure to head over to the AMD site and register. It sounds like it should be a fun time, and the free hardware doesn't hurt!
The AMD Fan Day will be held at the following address:
City View at Metreon
135 4th Street
San Francisco, CA 94013
Will you be checking out the AMD fan day to enjoy some gaming and PC hardware?
Subject: General Tech | April 2, 2013 - 05:57 PM | Jeremy Hellstrom
Tagged: arm, FinFET, 16nm, TSMC, Cortex-A57
While what DigiTimes is reporting on is only the first tape-out, it is still very interesting to see TSMC hitting 16nm process testing, and doing it with the 3D transistor technology we have come to know as FinFET. The chip in question was a 64-bit ARM Cortex-A57; unfortunately, we did not get much information about what comprised the chip apart from the slide you can see below.
As can be inferred from the mention that it can run alongside big.LITTLE chips, it will not be of the same architecture, nor will it be confined to cellphones. This helps reinforce TSMC's position in the market for keeping up with the latest fabrication trends, and another solid ARM contract will also keep the beancounters occupied. You can't expect to see these chips immediately, but this is a solid step towards a new process being mastered by TSMC.
"The achievement is the first milestone in the collaboration between ARM and TSMC to jointly optimize the 64-bit ARMv8 processor series on TSMC FinFET process technologies, the companies said. The pair has teamed up to produce Cortex-A57 processors and libraries to support early customer implementations on 16nm FinFET for ARM-based SoCs."
Here is some more Tech News from around the web:
- Wiping a Smartphone Still Leaves Data Behind @ Slashdot
- ARM processor competition to fire up @ DigiTimes
- Physicists bang the drum for quantum memory @ The Register
- Intel Haswell Socket H Heatsink Requirements and Overclocking Thoughts @ Tweaktown
- Killing Your Internet with Killer Ethernet @ Techgage
- Backdoors Found In Bitlocker, FileVault and TrueCrypt? @ TechARP
- Win ASRock FM2A85X Extreme 6 & Seasonic M12II-850 @ Kitguru
- Win Enermax Goodies From Insomnia i48 @ eTeknix
- NikKTech & Synology Joint Giveaway - One DiskStation DS213+ Up For Grabs
- The TR Podcast 131: News from GDC and FCAT attacks
- Dispatches from the Nexus @ The Tech Report
- AMD touts unified gaming strategy @ The Tech Report
- Intel gets serious about graphics for gaming @ The Tech Report
Subject: General Tech | April 2, 2013 - 10:59 AM | Tim Verry
Tagged: lga 2011, Ivy Bridge-E, Intel, 22nm
Many enthusiasts have been eagerly awaiting the next generation of Intel processors to use LGA 2011, which is supposed to be Ivy Bridge-E. Especially after seeing rumors of a 10 core Xeon E5-2600 V2 Ivy Bridge-EP CPU, I think many users expected at least an eight core Ivy Bridge-E part.
Unfortunately, if a slide posted by VR-Zone China is any indication, LGA 2011 users will not be getting an eight core processor any time soon. The slide suggests that Intel will release three new Ivy Bridge-E CPUs in the third quarter of this year (Q3'13). However, the top-end part is merely a six core CPU with slight improvements over the existing Sandy Bridge-E 3960X chip.
Specifically, the slide alleges that the initial Intel release will include the Core i7 4820, Core i7 4930K, and the Core i7 4960X. An Ivy Bridge-E equivalent to the SB-E 3970X is noticeably absent from the lineup along with several of the other rumored (higher core count) chips.
Rumored Ivy Bridge-E chips:
|Model||Clockspeed||Core Count||L3 Cache||Manufacturing Process||TDP|
|Core i7 4960X||3.6GHz (4GHz Turbo)||6||15MB||22nm||130W|
|Core i7 4930K||3.4GHz (3.9GHz Turbo)||6||12MB||22nm||130W|
|Core i7 4820K||3.7GHz (3.9GHz Turbo)||4||10MB||22nm||130W|
Existing Sandy Bridge-E equivalents:
|Model||Clockspeed||Core Count||L3 Cache||Manufacturing Process||TDP|
|Core i7 3960X||3.3GHz (3.9GHz Turbo)||6||15MB||32nm||130W|
|Core i7 3930K||3.2GHz (3.8GHz Turbo)||6||12MB||32nm||130W|
|Core i7 3820||3.6GHz (3.8GHz Turbo)||4||10MB||32nm||130W|
All of the chips allegedly have 130W TDPs, 40 PCI-E 3.0 lanes, support for quad-channel DDR3-1866 memory, and are built on Intel's 22nm manufacturing process. The low-end i7 4820 is a quad-core chip clocked at 3.7 GHz base and 3.9 GHz turbo with 10MB of L3 cache. The i7 4930K is an unlocked six-core part with 12MB of L3 cache and clockspeeds of 3.4 GHz base and 3.9 GHz turbo. Finally, the Core i7 4960X is rumored to be the highest-end chip Intel will release (at least initially); it is also a six-core part, clocked at 3.6 GHz base and 4 GHz turbo, with 15MB of L3 cache. These chips are the Ivy Bridge-E equivalents of the 3820, 3930K, and 3960X respectively. The new processors feature higher clockspeeds and are built on 22nm 3D transistor technology instead of SB-E's 32nm process.

It seems that Intel has extended unlocking to the lower-tier LGA 2011 chip, as it is listed as the Core i7 4820K. Having an unlocked multiplier is nice to see at the low end (the low end of the enthusiast platform, anyway). Curiously, the TDP ratings are unchanged, which suggests that the move to 22nm did not net Intel much TDP headroom and the higher clocks are bringing the chips back up to similar TDP numbers. At least the TDP ratings are not higher than SB-E's, so your motherboard and HSF should have no problem accepting an IVB-E CPU upgrade (with a BIOS update, of course).
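For a quick sense of how modest the rumored generational clock bumps are, the base-clock uplifts can be computed directly from the two tables above. A small illustrative Python sketch, using only the rumored and existing clocks listed there:

```python
# Base-clock uplift of the rumored Ivy Bridge-E parts over their
# Sandy Bridge-E counterparts, using the figures from the tables above
# (new base clock in GHz, old base clock in GHz).
pairs = {
    "4960X vs 3960X": (3.6, 3.3),
    "4930K vs 3930K": (3.4, 3.2),
    "4820K vs 3820":  (3.7, 3.6),
}

for name, (ivb, snb) in pairs.items():
    uplift = (ivb - snb) / snb * 100
    print(f"{name}: +{uplift:.1f}% base clock")
```

The uplifts come out to roughly 3-9 percent, which fits the "slight improvements" framing of the leaked slide.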
It will be interesting to see how the new Ivy Bridge-E chips stack up, especially considering Intel may also be unveiling the consumer-grade Haswell processor this year. On one hand, Ivy Bridge-E offers a CPU upgrade path for existing systems; on the other, pricing, the performance of Haswell, and the lack of the higher core count Ivy Bridge-E chips previous rumors suggested may see enthusiasts opt for a motherboard+CPU overhaul rather than simply recycling their LGA 2011/X79 motherboard. If this new slide holds true, it appears that Ivy Bridge-E/LGA 2011 will become even more of a niche, mainly for workstations that need the extra PCI-E lanes and quad-channel memory. I say this as someone running a Lynnfield system who is itching for an upgrade and torn between going for the enthusiast platform and waiting for Haswell.
What do you think about the rumored Ivy Bridge-E chips? Are they what you expected? Do you think they will be worth a CPU upgrade for your LGA 2011-based system, or are you leaning towards Haswell?
Read more about Ivy Bridge-E at PC Perspective, including: Ivy Bridge-E after Haswell: I think I've gone cross-eyed.
Subject: General Tech | April 2, 2013 - 07:25 AM | Tim Verry
Tagged: x86 emulator, rpix86, Raspberry Pi, gaming, dos
The Raspberry Pi is proving to be a popular destination for all manner of interesting software projects and open source operating systems. The most recent Pi project I've come across is a DOS PC emulator by Patrick Aalto called rpix86. A port of DSx86, which ran on the Nintendo DS handheld console, rpix86 is now up to version 0.04 and emulates a 90s X86 computer with enough hardware oomph to run classic PC games!
Rpix86 is an emulator that runs from the console (not within the X GUI desktop environment) on the Raspberry Pi. It emulates the following X86 PC specs:
|Processor||80486 @ ~20 MHz (incl. protected mode; no virtual memory support)|
|Memory||640 KB low memory, 4 MB EMS memory, 16 MB XMS memory|
|Graphics||Super VGA @ 640x480 with 256 colors|
|Audio||Sound Blaster 2.0 (+ AdLib-compatible FM sound)|
|Input Devices||US keyboard, analog joystick, 2-button mouse|
|Misc||Roland MPU-401 MIDI support via USB MIDI dongle|
Patrick Aalto added support for analog USB joysticks and foot pedals (4 buttons, 4 analog channels) as well as an 80x50 text mode (required by some MIDI software and Little Big Adventure's setup program) in the recent 0.04 update. He also stripped out debug code, which cut the program size approximately in half.
The developer has stated on his blog that he is working on allowing rpix86 to be used from the terminal within X and adding support for intelligent MPU MIDI mode. A port to the Android operating system called ax86 is also in the works. You can grab the current version of the Raspberry Pi X86 emulator on the developer's website.
With this emulator, you can run most of the DOS games you grew up with (Wolf3D and Digger anyone?), which is definitely a worthy use for the $25 or $35 Raspberry Pi hardware! At the very least, it is an interesting alternative to running DOSBox, and it is much smaller and more power efficient than keeping an old X86 PC around just for classic games. Getting those floppies to work with the Pi might be a bit of an issue though, assuming they are still readable (heh).
Read more about the Raspberry Pi computer at PC Perspective.
Subject: General Tech | April 2, 2013 - 06:41 AM | Tim Verry
Tagged: file sync, cloud storage, cloud drive, amazon
Amazon has announced two new Java-based applications for Windows and Mac PCs that will sync files between multiple computers and the company's Cloud Drive online storage service.
Amazon Cloud Drive is a companion service that was spun off of its Cloud Player music locker service. Users get 5GB for free, with additional tiers of storage available for purchase. (Any music from Amazon side-loaded to Cloud Drive and Cloud Player before July 31st does not count towards your storage quota). Until now, Cloud Drive has been merely a web storage locker, but with the new desktop apps Amazon is adding file syncing capabilities that will keep your files updated across multiple PCs. The desktop apps will create a folder which will then contain a locally-stored copy of your Amazon Cloud Drive files. If you choose to install the desktop app onto a second PC, it will also sync with Cloud Drive and store a copy of the files locally. The most recently modified version will sync to all the other computers' local store and the cloud drive. There is no word on versioning support, so note that this should not be a replacement for a true file backup. With that said, the multiple-PC file sync is a welcome addition that makes Cloud Drive much more useful than ever before.
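In other words, the sync behavior described is a classic last-writer-wins scheme: the most recently modified copy of each file propagates everywhere. Here is a minimal Python sketch of that idea; all names and data structures are my own for illustration and have nothing to do with Amazon's actual implementation or API:

```python
# Minimal last-writer-wins sync sketch: the most recently modified copy
# of each file wins, mirroring the behavior described above.
def sync(replicas):
    """replicas: list of dicts mapping filename -> (mtime, content).
    Returns the merged state every replica converges to."""
    merged = {}
    for replica in replicas:
        for name, (mtime, content) in replica.items():
            # Keep whichever copy has the newest modification time.
            if name not in merged or mtime > merged[name][0]:
                merged[name] = (mtime, content)
    return merged

pc1 = {"notes.txt": (100, "draft"), "todo.txt": (50, "buy milk")}
pc2 = {"notes.txt": (120, "final")}
print(sync([pc1, pc2]))
# {'notes.txt': (120, 'final'), 'todo.txt': (50, 'buy milk')}
```

Note how this also illustrates why the lack of versioning matters: the older "draft" copy of notes.txt is simply gone after the merge, which is why file sync is no substitute for a real backup.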
The new desktop apps will run on Windows XP, Vista, 7, and 8, and on Mac OS X 10.6, 10.7, and 10.8.
When Amazon was asked about mobile apps and file sync, the company told Ars Technica that it had "nothing specific to share." That could mean that Cloud Drive will bring file synchronization to iOS, Android, and WP8, or it could be a literal statement. It is difficult to say, but I think if Amazon wants its Cloud Drive storage service to be taken seriously the company will need to enter the mobile space (as it has done with Cloud Player).
Subject: General Tech | April 2, 2013 - 02:54 AM | Tim Verry
Tagged: next generation character rendering, GDC 13, gaming, Activision, 3D rendering
Activision recently showed off its Next-Generation Character Rendering technology, a new method for rendering realistic, high-quality 3D faces. The technology has been in the works for some time and is now at a point where faces are extremely detailed, down to pores, freckles, wrinkles, and eyelashes.
In addition to its "Lauren" demo, Activision also showed off its own take on the face used in NVIDIA's Ira FaceWorks tech demo, except this time rendered using Activision's own Next-Generation Character Rendering technology, a method that is allegedly more efficient and "completely different" from the one used for Ira. In a video showing off the technology (embedded below), the Activision method produces some impressive 3D renders in real time, though the faces appear a bit creepy and unnatural when talking. Perhaps Activision and NVIDIA should find a way to combine the emotional improvements of Ira with the graphical prowess of NGCR (and while we are making a wish list, I might as well add TressFX support... heh).
The high-resolution faces are not quite ready for the next Call of Duty, but the research team has managed to get models rendering at 180 FPS on a PC with a single GTX 680 graphics card. That is not enough to implement the technology in a game, where multiple models, the environment, physics, AI, and all manner of other calculations must be handled at acceptable frame rates, but it is nice to see this kind of forward-looking work being done now. Perhaps in a few graphics card generations the hardware will catch up to the face rendering technology that Activision (and others) are working on, which will be rather satisfying to see. It is amazing how far the graphics world has come since I got into PC gaming with Wolfenstein 3D, to say the least!
The team behind Activision's Next-Generation Character Rendering technology includes:
|Javier Von Der Pahlen||Director of Research and Development|
|Etienne Donvoye||Technical Director|
|Bernardo Antoniazzi||Technical Art Director|
|Zbyněk Kysela||Modeler and Texture Artist|
|Mike Eheler||Programming and Support|
|Jorge Jimenez||Real-Time Graphics Research and Development|
Jorge Jimenez has posted several more screenshots of the GDC tech demo on his blog that are worth checking out if you are interested in the new rendering tech.
Subject: General Tech | April 1, 2013 - 05:37 AM | Tim Verry
Tagged: virtual reality, oculus vr, oculus rift, GTC 2013, gaming
Update: Yesterday, an Oculus representative reached out to us to clarify that the information presented was forward-looking and subject to change (naturally). Later in the day, the company also posted the following statement to its forum:
"The information from that presentation (dates, concepts, projections, etc...) represent our vision, ideas, and on-going research/exploration. None of it should be considered fact, though we'd love to have that projected revenue!"
You can find the full statement in this thread.
The original article text is below:
Oculus VR, the company behind the Oculus Rift virtual reality headset, took the stage at NVIDIA's GPU Technology Conference to talk about the state of its technology and where it is headed.
Oculus VR is a relatively new company founded by Palmer Luckey and managed by CEO Brendan Iribe, who is the former CPO of cloud gaming company Gaikai. Currently, Oculus VR is developing a wearable, 3D, virtual reality headset called the Oculus Rift. Initially launched via Kickstarter, the Oculus Rift hardware is now shipping to developers as the rev 1 developer kit. Oculus VR will manufacture 10,000 developer kits and managed to raise $2.55 million in 2012.
The developer kit has a resolution of 1280x800 and weighs 320g. It takes a digital video input via a DVI or HDMI cable (HDMI with an adapter). The goggles hold the display and internals, and a control box connects via a wire to provide power. It uses several pieces of hardware found in smartphones, and CEO Brendan Iribe even hinted that an ongoing theme at Oculus VR was that "if it's not in a cell phone, it's not in Oculus." It delivers a 3D experience with head tracking, but Iribe indicated that full motion VR is coming in the future. For now, it is head tracking that allows you to look around the game world, but "in five to seven or eight years" virtual reality setups that combine an Oculus Rift-like headset with an omni-directional treadmill would allow you to walk and run around the world in addition to simply looking around.
Beyond the immersion factor, a full motion VR setup would reduce (and possibly eliminate) the phenomenon of VR sickness, where users wearing VR headsets for extended periods experience discomfort due to the disconnect between their perceived in-game movement and their (lack of) physical movement and inner-ear balance.
After the first developer kit, Oculus is planning to release a revised version, and ultimately a consumer version. This consumer version is slated for a Q3 2014 launch. It will weigh significantly less (Oculus VR is aiming for around 200g), and will support 1080p 3D resolutions. The sales projections estimate 50,000 revision 2 developer kits in 2013 and at least 500,000 consumer versions of the Oculus Rift in 2014. Ambitious numbers, for sure, but if Oculus can nail down next-generation console support, reduce the weight of the headset, and increase the resolution it is not out of the question.
With the consumer version, Oculus is hoping to offer both a base wired version and a higher-priced wireless Rift VR headset. Further, the company is working with game and professional 3D creation software developers to get software support for the VR headset. Team Fortress 2 support has been announced, for example (and there will even be an Oculus Rift hat, for gamers that are into hats). Additionally, Oculus is working to get support into the following software titles (among others):
- AutoDesk 3D
- DOTA 2
During the presentation, Iribe stated that graphics cards (specifically, he mentioned the GTX 680) are finally in a place to deliver 3D with smooth frame rates at high enough resolutions for immersive virtual reality.
Left: potential games with Oculus VR support. Right: Oculus VR CEO Brendan Iribe at ECS during GTC 2013.
Pricing on the consumer version of the VR headset is still unknown, but developers can currently pre-order an Oculus Rift developer kit on the Oculus VR site. In the past, the company has stated that consumers should hold off on buying a developer kit and wait for the consumer version of the Rift in 2014. If the company is able to deliver on its claims of a lighter headset with a higher-resolution screen and adjustable 3D effects (like the 3DS, the level of stereo 3D can be adjusted and even turned off), I think it will be worth the wait. The deciding factor will then be software support. Hopefully developers will take to the VR technology and offer up support for it in upcoming titles.
Are you excited for the Oculus Rift?
Subject: General Tech | March 31, 2013 - 08:43 PM | Tim Verry
Tagged: nvidia, lenovo yoga, GTC 2013, GTC, gesture control, eyesight, ECS
During the Emerging Companies Summit at NVIDIA's GPU Technology Conference, Israeli company EyeSight Mobile Technologies' CEO Gideon Shmuel took the stage to discuss the future of its gesture recognition software. He also provided insight into how EyeSight plans to use graphics cards to improve and accelerate the process of identifying, and responding to, finger and hand movements along with face detection.
EyeSight is a five-year-old company that has developed gesture recognition software that can be installed on existing machines (though it appears to be aimed more at OEMs than directly at consumers). It uses standard cameras, such as webcams, for its 2D input data and then derives a relative Z-axis using proprietary algorithms. This gives EyeSight essentially 2.5D of input data and, camera resolution and frame rate permitting, allows the software to identify and track finger and hand movements. EyeSight CEO Gideon Shmuel stated at the ECS presentation that the software is currently capable of "finger-level accuracy" at 5 meters from a TV.
Gestures include using your fingers as a mouse to point at on-screen objects, waving your hand to turn pages, scrolling, and giving hand-signal cues.
The software is not open source, and there are no plans to move in that direction. The company has 15 patents pending on its technology, several of which it managed to file before the US Patent Office changed from First to Invent to First Inventor to File (heh, which is another article...). The software will support up to 20 million hardware devices in 2013, and EyeSight expects the number of compatible camera-packing devices to increase to as many as 3.5 billion in 2015. Other features include the ability to transparently map EyeSight input to Android apps without users needing to muck with settings, and the ability to detect faces and "emotional signals" even in low light. According to the website, SDKs are available for Windows, Linux, and Android. The software maps the gestures it recognizes to Windows keyboard shortcuts to increase compatibility with many existing applications (so long as they support keyboard shortcuts).
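To picture how a gesture-to-shortcut layer like this might work, here is a small illustrative Python sketch. The gesture names and key bindings are assumptions for demonstration, not EyeSight's actual mapping tables:

```python
# Illustrative sketch of a gesture-to-shortcut mapping layer like the
# one EyeSight describes. The gesture names and key bindings here are
# assumptions for demonstration, not EyeSight's real tables.
GESTURE_TO_KEYS = {
    "swipe_left":  "Right",     # wave to turn the page forward
    "swipe_right": "Left",      # wave to turn the page back
    "palm_push":   "Space",     # e.g. play/pause
    "scroll_up":   "PageUp",
    "scroll_down": "PageDown",
}

def keys_for(gesture):
    """Translate a recognized gesture into the shortcut to inject,
    or None when the gesture has no binding."""
    return GESTURE_TO_KEYS.get(gesture)

print(keys_for("swipe_left"))  # Right
print(keys_for("fist"))        # None
```

The appeal of this design is that any application that already responds to keyboard shortcuts gets gesture support for free, without the app knowing EyeSight exists.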
Currently, the EyeSight software runs mostly on the CPU, but the company is investing heavily in GPU support. Moving the processing to GPUs will allow the software to run faster and more power efficiently, especially on mobile devices (NVIDIA's Tegra platform was specifically mentioned). EyeSight's future roadmap includes using GPU acceleration to bolster the number of supported gestures, move image processing to the GPU, add velocity and vector control inputs, incorporate a better low-light filter (which will run on the GPU), and offload processing from the CPU to optimize power management and save CPU resources for the OS and other applications, which is especially important on mobile devices. Gideon Shmuel also stated that he wants to see the technology used on "anything with a display," from your smartphone to your air conditioner.
A basic version of the EyeSight input technology reportedly comes installed on the Lenovo Yoga convertible tablet. I think this software has potential and could provide the Minority Report-like interaction that many enthusiasts wish for. Hopefully, EyeSight can deliver on its claimed accuracy figures and OEMs will embrace the technology by integrating it into future devices.
EyeSight has posted additional video demos and information about its touch-free technology on its website.
Do you think this "touch-free" gesture technology has merit, or will this type of input remain limited to awkward integration in console games?
Subject: General Tech | March 31, 2013 - 02:21 AM | Tim Verry
Tagged: sony, ps4, playstation eye, playstation 4, gaming, dualshock 4, APU, amd
Sony teased a few more details about its upcoming PlayStation 4 console at the Game Developers Conference earlier this week. While the basic specifications have not changed since the original announcement, we now know more about the X86 console hardware.
The PS4 itself is powered by an AMD Jaguar CPU with eight physical cores and eight threads. Each core gets 32 KB of L1 I-cache and D-cache, and each group of four physical cores shares 2 MB of L2 cache, for 4 MB of L2 total. The processor is capable of out-of-order execution, as are AMD's other processor offerings. The console also reportedly features 8GB of GDDR5 memory that is shared by the CPU and GPU. It offers 176 GB/s of bandwidth, a step up from the PS3, which did not use a unified memory design. The system will also sport a faster GPU rated at 1.843 TFLOPS and clocked at 800MHz. The PS4 will have a high-capacity hard drive and a new Blu-ray drive that is up to 3-times faster. Interestingly, the console also has a co-processor that handles the video streaming features and allows Remote Play game streaming to the PlayStation Vita at its native resolution of 960x544.
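As a sanity check, the quoted 176 GB/s is exactly what GDDR5 at 5.5 Gb/s per pin on a 256-bit bus works out to. Note that the bus width and per-pin rate below are my assumptions consistent with the total, not figures from Sony's announcement:

```python
# Sanity-checking the quoted 176 GB/s memory bandwidth figure.
# A 256-bit GDDR5 bus at 5.5 Gb/s per pin gives exactly that number;
# the bus width and per-pin rate are assumptions consistent with the
# total, not specs Sony has confirmed.
def memory_bandwidth_gb_s(bus_width_bits, data_rate_gbps_per_pin):
    """Peak bandwidth in GB/s: total bits per transfer times rate,
    divided by 8 to convert bits to bytes."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

print(memory_bandwidth_gb_s(256, 5.5))  # 176.0
```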
The PlayStation Eye has also been upgraded for the PS4 to include two cameras, four microphones, and a 3-axis accelerometer. The Eye cameras have an 85-degree field of view and can record video at 1280x800 and 60 Hz with 12 bits per pixel, or at 640x480 and 120 Hz. The new PS4 Eye is a noteworthy upgrade over the current-generation model, which is limited to either 640x480 pixels at 60 Hz or 320x240 pixels at 120 Hz. The extra resolution should allow developers to track players more accurately. The DualShock 4 controllers sport a light-bar that can be tracked by the new Eye camera, for example. The light-bar uses an RGB LED that changes to blue, red, pink, or green for players 1-4, respectively.
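For a sense of scale, the raw (uncompressed) data rate of a single Eye camera at its top mode follows directly from the figures above:

```python
# Raw data rate of one PS4 Eye camera at its highest-resolution mode:
# 1280x800 pixels, 12 bits per pixel, 60 frames per second.
width, height, bpp, fps = 1280, 800, 12, 60

bits_per_second = width * height * bpp * fps
print(f"{bits_per_second / 1e6:.1f} Mbps")  # 737.3 Mbps per camera
# The PS4 Eye has two cameras, so the combined raw feed is double that.
```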
Speaking of the new DualShock 4, Sony has reportedly ditched the analog face buttons and D-pad for digital buttons. With the DS3 and the PS3, the analog face buttons and D-pad came in handy with racing games, but otherwise they are not likely to be missed. The controllers will now charge even when the console is in standby mode, and the L2 and R2 triggers are more resistant to accidental presses. The analog sticks have been slightly modified and feature a reduced dead zone. The touchpad, a completely new feature for the DualShock lineup, is capable of tracking two points at a resolution of 1920x900, which is pretty good.
While Sony has still not revealed what the actual PS4 console will look like, most of the internals are now officially known. It will be interesting to see just where Sony prices the new console, and where game developers are able to take it. With a DX11.1+ feature set, developers are able to use many of the same tools used to program PC titles, but they also get additional debugging tools and low-level access to the hardware. A new low-level API that sits below DirectX but above the driver gives developers deeper access to the shader pipeline. With the consoles now running x86 hardware, I'm curious to see how PC ports will turn out; I'm hoping that the usual fare of bugs common to titles ported from consoles to PCs will decrease. A gamer can dream, right?
Subject: General Tech | March 28, 2013 - 03:47 PM | Ken Addison
Tagged: sli, podcast, pcper, nvidia, kepler, HD7790, GTX 650 Ti BOOST, GCN, frame rating, crossfire, amd
PC Perspective Podcast #244 - 03/28/2013
Join us this week as we discuss the launch of Frame Rating, HD 7790 vs. GTX 650Ti BOOST, and news from GDC
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:19:22
Week in Review:
News items of interest:
1:00:00 Is there a Flash flood coming?
1:12:00 Hardware/Software Picks of the Week:
Allyn: Samsung 840 250GB for $147 on Buy.com (checkout price) (ends 3/31)
1-888-38-PCPER or email@example.com
Subject: General Tech | March 28, 2013 - 01:59 PM | Jeremy Hellstrom
Alienware Aurora r4 Core i7 Gaming Desktop (Liquid-cooled) w/ GeForce GTX 660 for $1,299.00 with free shipping (normally $1,400.00 - use coupon code: LDLK0B2PNL0$32).
Featured $1,299 configuration includes Core i7-3820 Quad-core CPU (up to 4.1GHz clock speed, 10MB cache), 8GB RAM, 1TB HDD, 1.5GB GeForce GTX 660, 24X DVD Burner, and Windows 7 Home Premium 64-bit OS.
(Optional 4GB GDDR5 NVIDIA GeForce GTX 690 Graphics is now available on Alienware Aurora models.)
Subject: General Tech, Mobile | March 28, 2013 - 01:10 PM | Jeremy Hellstrom
Tagged: razer blade, gaming
You may remember Nokia's failed N-Gage, the phone that thought it was a console; it seems that Razer is going to market with a similar product called the Edge. This time we have a tablet with aspirations to console-hood, as you can tell from the gamepad-style controls surrounding the 1366x768 10.1" screen. Inside you will find an Intel Core i7 processor, a 256GB SSD, and 8GB of RAM, all of which adds up to a heavyweight mobile device without much in the way of battery life. Gizmodo tried it out at GDC and played BioShock Infinite on Ultra with no problems whatsoever, so the performance is there. On the other hand, can a $1,500 gaming tablet compete with full Ultrabooks or streaming devices like Project SHIELD?
"A gaming laptop in a tablet. It's a thought experiment that raises a whole host of questions: Is that even possible? Can it possibly be good? Would anyone even want it if it were? And finally: How much does it cost? The Razer Edge's answers translate roughly to "Yes!", "Sort of.", "Maybe?", and "Erm, you better sit down.""
Here is some more Tech News from around the web:
- Oculus Rift developer kit hands-on video at GDC 2013, shown playing Hawken @ Tweaktown
- Intel to separate 3rd-generation ultrabooks into 3 price groups @ DigiTimes
- Microsoft's 'Gemini' project will be the Windows Blue of Office @ The Register
- ARM says GPGPUs could lower overall chip costs @ The Inquirer
- Blackberry has sold one million Blackberry Z10 smartphones @ The Inquirer
- IBM unfurls SDN network manager @ The Register
- BIGGEST DDoS ATTACK IN HISTORY hammers Spamhaus @ The Register
- How to Improve Your Wi-Fi Signal Using a Soda Can in 6 Steps @ MAKE:Blog
- Interview with Richard Huddy about Intel moving beyond DX @ Kitguru
Subject: General Tech | March 28, 2013 - 12:54 AM | Tim Verry
Tagged: webgl 1.0.2, webgl, web browser, tegra, programming
The Khronos Group recently announced that conformant WebGL 1.0.1 implementations are now available across mobile and desktop systems on a number of platforms. Chrome 25 and Firefox 19 support WebGL 1.0.1 on Windows, Mac, and Linux operating systems. Further, mobile devices with Tegra SoCs can tap into WebGL using a WebGL-enhanced Android browser.
Additionally, the WebGL 1.0.2 specification and its extensions have been submitted for ratification, and the new spec is expected to be formally released in April. According to the press release, the following features are being rolled into the WebGL 1.0.2 specification:
- "adds many clarifications for specification behavioral precision
- mandates support for certain combinations of framebuffer formats, to ease developer adoption;
- clarifies interactions with the encompassing HTML5 platform, including the browser compositor and high-DPI displays;
- dramatically increases the number of conformance tests to roughly 21,000 to improve both the breadth and depth of test coverage - thanks principally to work by Gregg Tavares at Google and the OpenGL ES working group."
Khronos President and NVIDIA Vice President of Mobile Content Neil Trevett stated that "The close cooperation between browser and silicon vendors to ensure the GPU is being reliably and securely exposed to the Web is ongoing proof that Khronos is a highly productive forum to evolve this vital Web standard." Meanwhile, WebGL Working group chair Ken Russell indicated that WebGL 1.0.2 is "a major milestone in bringing the power of the GPU to the Web."
Although there are security concerns to consider, WebGL does open up some interesting opportunities for new web services. The full press release can be found here.
Subject: General Tech | March 27, 2013 - 11:01 PM | Tim Verry
Tagged: Internet, hc-pbgf, fiber, data transmission
Transmitting data over optical fiber is one of the fastest methods available, and researchers at the University of Southampton have managed to dial up the speed even further.
Optical fiber transmits data using pulses of light. The speed of light through a vacuum is 299,792,458 meters per second, but traditional fiber is not nearly that fast because light travels approximately 31% slower (206,856,796 m/s) through silica glass than through a vacuum.
The new fiber employs a hollow design that allows light to travel through air rather than glass while still allowing the cable to bend and twist around corners. The new fiber has been dubbed Hollow Core Photonic Bandgap Fiber, or HC-PBGF, and allows light to travel up to about 298,893,080.63 m/s (~99.7% the speed of light). Currently, the HC-PBGF fiber is still in the experimental phase, but it could have big implications for data centers and HPC server clusters that depend on high bandwidth, low latency connections between individual nodes.
Just how fast is the new HC-PBGF? According to ExtremeTech, a researcher told the site that the new fiber has a total cable throughput of 73.7 Tbps. It transmits three modes of 96 channels at 256 Gbps each, using a combination of wavelength division multiplexing and mode division multiplexing across a 160 nm band. Additionally, the HC-PBGF has a signal loss of 3.5 dB/km, which makes it a useful candidate for short runs between nodes or rows of racks, but not yet suitable for longer runs. HC-PBGF will not be blanketing your neighborhood anytime soon, but the research may lead to new optical networking technologies used in the next supercomputer or cloud service, for example.
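The headline numbers above are easy to verify from the underlying figures: the per-channel rates multiply out to the quoted total throughput, and the speed figures are simple fractions of c:

```python
# Verify the reported HC-PBGF figures from the article.
C = 299_792_458  # speed of light in a vacuum, m/s

# Light in silica glass travels ~31% slower than in a vacuum.
silica_speed = C * (1 - 0.31)   # ~206,856,796 m/s
# HC-PBGF reportedly reaches ~99.7% of c through its hollow (air) core.
hcpbgf_speed = C * 0.997        # ~298,893,081 m/s

# Aggregate throughput: 3 modes x 96 channels x 256 Gbps per channel.
total_gbps = 3 * 96 * 256
print(f"{total_gbps / 1000:.1f} Tbps")  # 73.7 Tbps
```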
The full paper can be found here, along with more details over at Ars Technica. Unfortunately, the full paper is behind a paywall, but it may be worth seeing if your school or workplace can give you access should you be interested in drilling into the details of the experimental hollow fiber.
Subject: General Tech, Shows and Expos | March 27, 2013 - 08:51 PM | Scott Michaud
Tagged: Intel, GDC 13, GDC
GDC 2013 is where the industry comes together to talk about the technology itself. Intel was there, and of course the blue team just had to unveil developments to help it in the PC gaming space. Two major new rendering extensions and updated developer tools made their debut. And if you are not a developer, you can now encode your movies with Handbrake more quickly!
First up is PixelSync, a DirectX extension for Intel HD Graphics. PixelSync is designed to be used with smoke, hair, and other effects that require blending translucent geometry. Normally, translucent objects must be depth-sorted so they blend in the correct order; with this extension, objects do not need to be sorted before compositing.
Next up is InstantAccess. This DirectX extension allows the CPU and integrated GPU to access the same physical memory. What interests me most about InstantAccess is the ability for developers to write GPU-based applications that can quickly access the same memory as their CPU counterparts. Should the integrated GPU be visible alongside discrete GPUs, this could allow the integrated graphics to help offload GPGPU tasks such as physics while the CPU and discrete GPU handle the more important tasks.
Also updated is their Graphics Performance Analyzers toolset. If you are interested in optimizing the performance of your software, be sure to check those out.
And for the more general public... Handbrake is now set up to take advantage of Quick Sync Video. Given the popularity of Handbrake, this is quite a big deal for anyone wishing to transcode video using popular and free encoders.