Subject: General Tech | April 2, 2013 - 02:54 AM | Tim Verry
Tagged: next generation character rendering, GDC 13, gaming, Activision, 3D rendering
Activision recently showed off its Next-Generation Character Rendering technology, a new method for rendering realistic, high-quality 3D faces. The technology has been in the works for some time and has now reached the point where faces are extremely detailed, down to the pores, freckles, wrinkles, and eyelashes.
In addition to Lauren, Activision also showed off its own take on the face used in NVIDIA's Ira FaceWorks tech demo, except this time the face was rendered using Activision's own Next-Generation Character Rendering technology, a method that is allegedly more efficient than, and "completely different" from, the one used for Ira. In a video showing off the technology (embedded below), the Activision method produces some impressive 3D renders in real time, though the faces look a bit creepy and unnatural when talking. Perhaps Activision and NVIDIA should find a way to combine the emotional improvements of Ira with the graphical prowess of NGCR (and while we are making a wish list, I might as well add TressFX support... heh).
The high resolution faces are not quite ready for the next Call of Duty, but the research team has managed to get models to render at 180 FPS on a PC running a single GTX 680 graphics card. That is not enough to implement the technology in a game, where there are multiple models, the environment, physics, AI, and all manner of other calculations to deal with and present at acceptable frame rates, but it is nice to see this kind of future-looking work being done now. Perhaps in a few graphics card generations the hardware will catch up to the face rendering technology that Activision (and others) are working on, which will be rather satisfying to see. It is amazing how far the graphics world has come since I got into PC gaming with Wolfenstein 3D, to say the least!
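The gap between that 180 FPS demo figure and a shippable game is easy to quantify. A very rough sketch (illustrative arithmetic only; real GPU load does not scale this linearly):

```python
# Back-of-envelope frame-time budget (illustrative arithmetic only).
def frame_time_ms(fps: float) -> float:
    """Milliseconds available to render one frame at a given frame rate."""
    return 1000.0 / fps

face_only = frame_time_ms(180)    # one face and nothing else: ~5.6 ms
game_budget = frame_time_ms(60)   # a full game frame at 60 FPS: ~16.7 ms

# A single such face would eat roughly a third of a 60 FPS frame
# before the rest of the scene, physics, and AI get a turn.
fraction = face_only / game_budget
```

That is why the technique needs another hardware generation or two before it shows up alongside a full game world.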
The team behind Activision's Next-Generation Character Rendering technology includes:
| Name | Role |
| --- | --- |
| Javier Von Der Pahlen | Director of Research and Development |
| Etienne Donvoye | Technical Director |
| Bernardo Antoniazzi | Technical Art Director |
| Zbyněk Kysela | Modeler and Texture Artist |
| Mike Eheler | Programming and Support |
| Jorge Jimenez | Real-Time Graphics Research and Development |
Jorge Jimenez has posted several more screenshots of the GDC tech demo on his blog that are worth checking out if you are interested in the new rendering tech.
Subject: General Tech | April 1, 2013 - 05:37 AM | Tim Verry
Tagged: virtual reality, oculus vr, oculus rift, GTC 2013, gaming
Update: Yesterday, an Oculus representative reached out to us to clarify that the information presented was forward-looking and subject to change (naturally). Later in the day, the company also posted the following statement to its forum:
"The information from that presentation (dates, concepts, projections, etc...) represent our vision, ideas, and on-going research/exploration. None of it should be considered fact, though we'd love to have that projected revenue!"
You can find the full statement in this thread.
The original article text is below:
Oculus VR, the company behind the Oculus Rift virtual reality headset, took the stage at NVIDIA's GPU Technology Conference to talk about the state of its technology and where it is headed.
Oculus VR is a relatively new company founded by Palmer Luckey and managed by CEO Brendan Iribe, who is the former CPO of cloud gaming company Gaikai. Currently, Oculus VR is developing a wearable, 3D, virtual reality headset called the Oculus Rift. Initially launched via Kickstarter, the Oculus Rift hardware is now shipping to developers as the rev 1 developer kit. Oculus VR will manufacture 10,000 developer kits and managed to raise $2.55 million in 2012.
The developer kit has a resolution of 1280x800 and weighs 320g. It takes a digital video input via a DVI or HDMI cable (HDMI with an adapter). The goggles hold the display and internals, and a control box connects via a wire to provide power. It uses several pieces of hardware found in smartphones, and CEO Brendan Iribe even hinted that an ongoing theme at Oculus VR was that "if it's not in a cell phone, it's not in Oculus." It delivers a 3D experience with head tracking, but Iribe indicated that full-motion VR is coming in the future. For now, head tracking allows you to look around the game world, but "in five to seven or eight years" virtual reality setups that combine an Oculus Rift-like headset with an omni-directional treadmill will allow you to walk and run around the world in addition to simply looking around.
Beyond the immersion factor, a full-motion VR setup would reduce (and possibly eliminate) the phenomenon of VR sickness, where users wearing VR headsets for extended periods of time experience discomfort due to the disconnect between their perceived in-game movement and their (lack of) physical movement and inner-ear balance.
After the first developer kit, Oculus is planning to release a revised version, and ultimately a consumer version. This consumer version is slated for a Q3 2014 launch. It will weigh significantly less (Oculus VR is aiming for around 200g), and will support 1080p 3D resolutions. The sales projections estimate 50,000 revision 2 developer kits in 2013 and at least 500,000 consumer versions of the Oculus Rift in 2014. Ambitious numbers, for sure, but if Oculus can nail down next-generation console support, reduce the weight of the headset, and increase the resolution it is not out of the question.
With the consumer version, Oculus is hoping to offer both a base wired version and a higher-priced wireless Rift VR headset. Further, the company is working with game and professional 3D creation software developers to get software support for the VR headset. Team Fortress 2 support has been announced, for example (and there will even be an Oculus Rift hat, for gamers that are into hats). Additionally, Oculus is working to get support into the following software titles (among others):
- AutoDesk 3D
- DOTA 2
During the presentation Iribe stated that graphics cards (specifically he mentioned the GTX 680) are finally in a place to deliver 3D with smooth frame rates at high-enough resolutions for immersive virtual reality.
Left: potential games with Oculus VR support. Right: Oculus VR CEO Brendan Iribe at ECS during GTC 2013.
Pricing on the consumer version of the VR headset is still unknown, but developers can currently pre-order an Oculus Rift developer kit on the Oculus VR site. In the past, the company has stated that consumers should hold off on buying a developer kit and wait for the consumer version of the Rift in 2014. If the company is able to deliver on its claims of a lighter headset with a higher-resolution screen and adjustable 3D effects (like the 3DS, the level of stereo 3D can be adjusted and even turned off), I think it will be worth the wait. The deciding factor will then be software support. Hopefully developers will take to the VR technology and support it in upcoming titles.
Are you excited for the Oculus Rift?
Subject: General Tech | March 31, 2013 - 08:43 PM | Tim Verry
Tagged: nvidia, lenovo yoga, GTC 2013, GTC, gesture control, eyesight, ECS
During the Emerging Companies Summit at NVIDIA's GPU Technology Conference, Gideon Shmuel, CEO of Israeli company EyeSight Mobile Technologies, took the stage to discuss the future of its gesture recognition software. He also provided insight into how EyeSight plans to use graphics cards to improve and accelerate the process of identifying and responding to finger and hand movements, along with face detection.
EyeSight is a five-year-old company that has developed gesture recognition software that can be installed on existing machines (though it appears to be aimed more at OEMs than directly at consumers). It can use standard cameras, such as webcams, to capture its 2D input data, and it derives a relative Z-axis using proprietary algorithms. This gives EyeSight essentially 2.5D of input data and, camera resolution and frame rate permitting, allows the software to identify and track finger and hand movements. EyeSight CEO Gideon Shmuel stated at the ECS presentation that the software is currently capable of "finger-level accuracy" at 5 meters from a TV.
Gestures include using your fingers as a mouse to point at on-screen objects, waving your hand to turn pages, scrolling, and even giving hand-signal cues.
The software is not open source, and there are no plans to move in that direction. The company has 15 patents pending on its technology, several of which it managed to file before the US Patent Office changed from First to Invent to First Inventor to File (heh, which is another article...). The software will support up to 20 million hardware devices in 2013, and EyeSight expects the number of compatible camera-packing devices to increase to as many as 3.5 billion in 2015. Other features include the ability to transparently map EyeSight input to Android apps without users needing to muck with settings, and the ability to detect faces and "emotional signals" even in low light. According to the website, SDKs are available for Windows, Linux, and Android. The software maps the gestures it recognizes to Windows keyboard shortcuts, increasing compatibility with many existing applications (so long as they support keyboard shortcuts).
Currently, the EyeSight software runs mostly on the CPU, but the company is investing heavily in GPU support. Moving the processing to GPUs will allow the software to run faster and more power-efficiently, especially on mobile devices (NVIDIA's Tegra platform was specifically mentioned). EyeSight's future road map includes using GPU acceleration to bolster the number of supported gestures, move image processing to the GPU, add velocity and vector control inputs, and incorporate a better low-light filter (which will run on the GPU). Offloading processing from the CPU also optimizes power management and frees CPU resources for the OS and other applications, which is especially important for mobile devices. Gideon Shmuel also stated that he wants to see the technology used on "anything with a display," from your smartphone to your air conditioner.
A basic version of the EyeSight input technology reportedly comes installed on the Lenovo Yoga convertible tablet. I think this software has potential, and would provide that Minority Report-like interaction that many enthusiasts wish for. Hopefully, EyeSight can deliver on its claimed accuracy figures and OEMs will embrace the technology by integrating it into future devices.
EyeSight has posted additional video demos and information about its touch-free technology on its website.
Do you think this "touch-free" gesture technology has merit, or will this type of input remain limited to awkward integration in console games?
Subject: Cases and Cooling | March 31, 2013 - 04:17 AM | Tim Verry
Tagged: nuc, Intel, impactics, europe, d1nu
Impactics is the latest company to launch its own small form factor case for Intel's Next Unit of Computing (NUC) platform. More heatsink than chassis, the new D1NU chassis sandwiches an Intel NUC motherboard and other internals between two aluminum fin heatsinks. The D1NU measures 170 x 114 x 67mm and weighs 1380g.
The D1NU supports Intel's D33217GKE and DCP847SKE motherboards. The motherboard and other components are attached to a solid piece of precision milled 99.99% electrolytic copper (220g), and then to an aluminum heatsink.
The case seals the components between a top and bottom heatsink and then a 4mm aluminum front bezel and a rear chromium steel bezel with EM shield. The D1NU case/heatsink supports a 25W TDP, and has an MSRP of 99 euros. The front bezel hosts a power button with blue LED and space for a single USB port. The rear of the case can support the outputs of either Intel's Golden Lake or Ski Lake boards. A VESA mount is also in the works. The D1NU comes in silver or black.
According to Fanless Tech, the passive NUC case is now available in Europe for €100 from Case King or £87 from Systo.co.uk. No word yet on whether it will show up on this side of the pond, but (although it is a bit pricey) it is certainly a cool NUC heatsink/case (heh)!
Subject: Graphics Cards | March 31, 2013 - 03:06 AM | Tim Verry
Tagged: GDC 13, sky 900, sky 700, sky 500, RapidFire, radeon sky, GCN, cloud gaming, amd
Earlier this week, AMD announced a new series of Radeon-branded cards, called Radeon Sky, aimed at the cloud gaming market. At the time, details on the cards were scarce apart from the fact that they would use latency-reduction "secret sauce" tech called RapidFire, and that the highest-end model would be the Radeon Sky 900. Thankfully, gamers will not have to wait until AFDS after all, as AMD has posted additional information and specifications to its website. At this point, pricing and the underlying details of RapidFire are the only aspects still unknown.
According to the AMD site, the company will release three Radeon Sky cards later this year: the Sky 500, Sky 700, and Sky 900. All three cards are passively cooled with aluminum fin heatsinks and are based on AMD's Graphics Core Next (GCN) architecture. At the high end is the Sky 900, a dual-Tahiti graphics card clocked at 825 MHz. The Sky 900 features 1,792 stream processors per GPU for a total of 3,584. The card further features 3GB of GDDR5 RAM per GPU, each on a 384-bit interface, for a total memory bandwidth of 480GB/s. AMD claims this dual-slot card draws up to 300W while under load. In many respects the Sky 900 is the Radeon equivalent of the company's professional FirePro S10000 graphics card. It has similar hardware specifications (including the 5.91 TFLOPS of single precision performance potential) but a higher TDP. The S10000 sells for $3,599, though whether AMD will price the gaming-oriented Sky 900 similarly is unknown.
The Sky 700 steps down to a single-GPU graphics card, featuring a single Tahiti GPU clocked at 900 MHz with 1792 stream processors and 6GB of GDDR5. The memory uses a 384-bit interface for a total bandwidth of 264GB/s. Although it is also a dual-slot card like the Sky 900, the cooler is smaller and it draws only 225W under load.
Finally, the Sky 500 represents the low end of the company's cloud gaming hardware lineup; it is the Radeon Sky equivalent of the company's consumer-grade Radeon HD 7870. The Sky 500 features a single Pitcairn GPU clocked at 950 MHz with 1280 stream processors, 4GB of GDDR5 on a 256-bit memory bus (154GB/s of memory bandwidth), and a rated 150W power draw under load, all in a single-slot card.
|   | Sky 900 | Sky 700 | Sky 500 |
| --- | --- | --- | --- |
| GPU(s) | Dual Tahiti | Single Tahiti | Single Pitcairn |
| GPU Clockspeed | 825 MHz | 900 MHz | 950 MHz |
| Stream Processors | 3584 (1792 per GPU) | 1792 | 1280 |
| Memory | 6GB GDDR5 (3GB per GPU) | 6GB GDDR5 | 4GB GDDR5 |
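The quoted bandwidth figures are consistent with standard GDDR5 configurations. A quick sanity check (the per-pin data rates below are inferred from the quoted bandwidths, not published by AMD):

```python
# Sanity check of the quoted Radeon Sky memory-bandwidth figures.
# bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (Gbps per pin)
# The per-pin rates here are inferred from the quoted bandwidths; AMD has
# not published the memory clocks.

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s for a GDDR5 interface."""
    return bus_bits / 8 * gbps_per_pin

sky_900_per_gpu = bandwidth_gbs(384, 5.0)   # 240.0 GB/s, or 480 GB/s for two GPUs
sky_700 = bandwidth_gbs(384, 5.5)           # 264.0 GB/s
sky_500 = bandwidth_gbs(256, 4.8125)        # 154.0 GB/s
```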
Additionally, the Radeon Sky cards all employ a technology called RapidFire that allegedly reduces latency immensely. As Ryan mentioned on the latest PC Perspective Podcast, the Radeon Sky cards are able to stream up to six games. RapidFire is still a mystery, but the company has indicated that one aspect of it is the use of AMD's Video Codec Engine (VCE) to encode the video stream on the GPU itself to reduce game latency. The Sky cards will output at 720p resolutions, and the Sky 700 can support either three games at 60 FPS or six games at 30 FPS.
In addition to working with cloud gaming companies Ubitus, G-Cluster, CiiNow, and Otoy, AMD has announced a partnership with VMware and Citrix. AMD is reportedly working to allow VMware ESX/ESXi and Citrix XenServer virtual machines to access the GPU hardware directly, which opens up the possibility of using Sky cards to run workstation applications or remote desktops with 3D support, much like NVIDIA's VCA and GRID technology (which the company showed off at GTC last week). Personally, I think the Sky cards may be late to the party, but they are a step in the right direction. Even if cloud gaming doesn't take off, the cards could still be used to great success by enterprise customers if AMD is able to allow direct access to the full graphics card hardware from within virtual machines!
More information on the Radeon Sky cards can be found on the AMD website.
Subject: General Tech | March 31, 2013 - 02:21 AM | Tim Verry
Tagged: sony, ps4, playstation eye, playstation 4, gaming, dualshock 4, APU, amd
Sony teased a few more details about its upcoming PlayStation 4 console at the Game Developers Conference earlier this week. While the basic specifications have not changed since the original announcement, we now know more about the x86 console hardware.
The PS4 itself is powered by an AMD Jaguar CPU with eight physical cores and eight threads. Each core gets 32 KB of L1 I-cache and D-cache. Further, each group of four physical cores shares 2 MB of L2 cache, for 4MB of L2 total. The processor is capable of Out of Order Execution, as are AMD's other processor offerings. The console also reportedly features 8GB of GDDR5 memory that is shared by the CPU and GPU. It offers 176 GB/s of bandwidth, a step up from the PS3, which did not use a unified memory design. The system will also sport a faster GPU rated at 1.843 TFLOPS and clocked at 800MHz. The PS4 will have a high-capacity hard drive and a new Blu-ray drive that is up to three times faster. Interestingly, the console also has a co-processor that handles the video streaming features, allowing Remote Play game streaming to the PlayStation Vita at its native resolution of 960x544.
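Sony's quoted figures hang together arithmetically. A quick cross-check, noting that the 1,152-shader count and the 256-bit / 5.5 Gbps GDDR5 configuration are widely reported figures rather than part of Sony's own disclosure, so treat them as assumptions:

```python
# Cross-checking Sony's quoted PS4 GPU throughput and memory bandwidth.
# ASSUMPTIONS (not in Sony's disclosure): 1,152 shader ALUs, and a 256-bit
# GDDR5 bus at 5.5 Gbps effective per pin.

GPU_CLOCK_GHZ = 0.8          # 800 MHz, from Sony's disclosure
SHADERS = 1152               # assumed: 18 GCN compute units x 64 ALUs each

# Each ALU can issue one fused multiply-add (2 FLOPs) per clock.
tflops = SHADERS * 2 * GPU_CLOCK_GHZ / 1000   # ~1.8432, matching 1.843 TFLOPS

BUS_BITS = 256               # assumed memory bus width
RATE_GBPS = 5.5              # assumed effective GDDR5 data rate per pin
bandwidth = BUS_BITS / 8 * RATE_GBPS          # 176.0, matching 176 GB/s
```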
The PlayStation Eye has also been upgraded for the PS4 to include two cameras, four microphones, and a 3-axis accelerometer. The Eye cameras have an 85-degree field of view and can record video at 1280x800 at 60 Hz and 12 bits per pixel, or at 640x480 at 120Hz. The new PS4 Eye is a noteworthy upgrade over the current-generation model, which is limited to either 640x480 at 60Hz or 320x240 at 120Hz. The extra resolution should allow developers to track players more accurately. The DualShock 4 controllers sport a light bar that can be tracked by the new Eye camera, for example. The light bar uses an RGB LED that changes to blue, red, pink, or green for players 1-4 respectively.
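Those camera modes imply a substantial amount of raw sensor data. A rough calculation from the quoted numbers (the bit depth of the 640x480 mode is an assumption; Sony only quoted 12 bits per pixel for the high-resolution mode):

```python
# Raw (uncompressed) data rate implied by the quoted PS4 Eye camera modes.
# ASSUMPTION: the 640x480 mode also uses 12 bits per pixel; Sony only
# quoted bit depth for the 1280x800 mode.

def raw_mbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    """Raw video data rate in megabits per second."""
    return width * height * hz * bits_per_pixel / 1e6

high_res = raw_mbps(1280, 800, 60, 12)   # ~737 Mbps per camera
high_fps = raw_mbps(640, 480, 120, 12)   # ~442 Mbps per camera (assumed bpp)
```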
Speaking of the new DualShock 4, Sony has reportedly ditched the analog face buttons and D-pad in favor of digital buttons. With the DS3 and the PS3, the analog face buttons and D-pad came in handy in racing games, but otherwise they are not likely to be missed. The controllers will now charge even when the console is in standby mode, and the L2 and R2 triggers are more resistant to accidental presses. The analog sticks have been slightly modified and feature a reduced dead zone. The touchpad, a completely new feature for the DualShock lineup, is capable of tracking two points at a resolution of 1920x900, which is pretty good.
While Sony has still not revealed what the actual PS4 console will look like, most of the internals are now officially known. It will be interesting to see just where Sony prices the new console, and where game developers are able to take it. Using a DX11.1+ feature set, developers can use many of the same tools used to program PC titles, but they also get additional debugging tools and low-level access to the hardware. A new low-level API that sits below DirectX but above the driver gives developers deeper access to the shader pipeline. I'm curious to see how PC ports will turn out; with the consoles now running x86 hardware, I'm hoping that the usual fare of bugs common to console-to-PC ports will decrease. A gamer can dream, right?
Subject: Storage | March 28, 2013 - 04:15 PM | Jeremy Hellstrom
Tagged: SuperSSpeed, S301 Hyper Gold, ssd, slc, SandForce SF-2281
SuperSSpeed is mixing the performance and endurance of SLC flash with the lower cost of the SandForce SF-2281 controller in an attempt to bring the price of its SLC drive down to an affordable level for consumers. The mix seems like a good idea, as the reduced write latency of SLC flash may help overcome SandForce's weakness when writing incompressible data. [H]ard|OCP's testing bears this out, as the drive kept up with a larger Samsung 840 Pro, one of the current performance kings. You will pay for the privilege, however: the 128GB drive currently retails for $250, as SLC flash is not cheap. Consider that in almost any casual usage scenario you are never going to push this drive to its limits ... unless you are going to start your own Frame Rating machine.
"The SuperSSpeed S301 128GB SLC SSD brings SLC flash into the consumer market. The extreme endurance and excellent write performance makes for an interesting SSD powered by the SandForce SF-2281 controller. The Intel 25nm SLC NAND removes much of the Achilles heel of the SandForce processors, delivering consistent performance."
Here are some more Storage reviews from around the web:
- OCZ Vertex 3.20 – Vertex 3 updated to 20nm @ Bjorn3D
- OCZ Vertex 3.20 120GB Solid State Drive Review @ Pro-Clockers
- OCZ Vertex 3 .20 120GB SSD @ Tweaktown
- OCZ Vertex 3 .20 240GB SSD @ Tweaktown
- KingFast Ultra-Cache K13 & K25 SATA2 SSD Review @ ModSynergy
- Kingston V300 120GB SSD Review @ HCW
- Kingston V300 120GB SSD @ Bjorn3D
- Intel 335 Series 180GB SSD Review @ Hardware Canucks
- The SSD Review SSD Database Is Live
- ADATA DashDrive Air AE400 Wireless Storage Device @ Tweaktown
- Kingston DataTraveller Ultimate 3.0 G3 64GB USB3.0 Flash Drive @ Tweaktown
- Kingston HyperX Predator 512GB USB 3.0 Flash Drive @ Tweaktown
- QNAP TS-469L @ Legion Hardware
- G-Technology G-Drive Mobile USB 1TB USB 3.0 Portable Hard Drive Review @ NikKTech
- OWC Mercury On-The-Go Pro USB 3.0 Portable Enclosure Kit Review @ Madshrimps
- ADATA DashDrive HV610 External Hard Drive @ Tweaktown
- ADATA DashDrive Durable HD710 External Hard Drive @ Tweaktown
Subject: General Tech | March 28, 2013 - 03:47 PM | Ken Addison
Tagged: sli, podcast, pcper, nvidia, kepler, HD7790, GTX 650Ti BOOST, GCN, frame rating, crossfire, amd
PC Perspective Podcast #244 - 03/28/2013
Join us this week as we discuss the launch of Frame Rating, HD 7790 vs. GTX 650Ti BOOST, and news from GDC
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:19:22
Week in Review:
News items of interest:
1:00:00 Is there a Flash flood coming?
1:12:00 Hardware/Software Picks of the Week:
Allyn: Samsung 840 250GB for $147 on Buy.com (checkout price) (ends 3/31)
1-888-38-PCPER or firstname.lastname@example.org
Subject: General Tech | March 28, 2013 - 01:59 PM | Jeremy Hellstrom
Alienware Aurora r4 Core i7 Gaming Desktop (Liquid-cooled) w/ GeForce GTX 660 for $1,299.00 with free shipping (normally $1,400.00; use coupon code: LDLK0B2PNL0$32).
Featured $1,299 configuration includes Core i7-3820 Quad-core CPU (up to 4.1GHz clock speed, 10MB cache), 8GB RAM, 1TB HDD, 1.5GB GeForce GTX 660, 24X DVD Burner, and Windows 7 Home Premium 64-bit OS.
(Optional 4GB GDDR5 NVIDIA GeForce GTX 690 Graphics is now available on Alienware Aurora models.)
Subject: General Tech, Mobile | March 28, 2013 - 01:10 PM | Jeremy Hellstrom
Tagged: razer blade, gaming
You may remember Nokia's N-Gage, the phone that thought it was a console and turned out to be a failure; it seems that Razer is going to market with a similar product, the Edge. This time we have a tablet with aspirations to console-hood, as you can tell from the gamepad-type controls surrounding the 1366x768 10.1" screen. Inside you will find an Intel Core i7 processor, a 256GB SSD, and 8GB of RAM, all of which adds up to a heavyweight mobile device without much in the way of battery life. Gizmodo tried it out at GDC and played BioShock Infinite on Ultra with no problems whatsoever, so the performance is there. On the other hand, can a $1500 gaming tablet compete with full Ultrabooks or streaming devices like Project SHIELD?
"A gaming laptop in a tablet. It's a thought experiment that raises a whole host of questions: Is that even possible? Can it possibly be good? Would anyone even want it if it were? And finally: How much does it cost? The Razer Edge's answers translate roughly to "Yes!", "Sort of.", "Maybe?", and "Erm, you better sit down.""
Here is some more Tech News from around the web:
- Oculus Rift developer kit hands-on video at GDC 2013, shown playing Hawken @ Tweaktown
- Intel to separate 3rd-generation ultrabooks into 3 price groups @ DigiTimes
- Microsoft's 'Gemini' project will be the Windows Blue of Office @ The Register
- ARM says GPGPUs could lower overall chip costs @ The Inquirer
- Blackberry has sold one million Blackberry Z10 smartphones @ The Inquirer
- IBM unfurls SDN network manager @ The Register
- BIGGEST DDoS ATTACK IN HISTORY hammers Spamhaus @ The Register
- How to Improve Your Wi-Fi Signal Using a Soda Can in 6 Steps @ MAKE:Blog
- Interview with Richard Huddy about Intel moving beyond DX @ Kitguru
Subject: General Tech | March 28, 2013 - 12:54 AM | Tim Verry
Tagged: webgl 1.0.2, webgl, web browser, tegra, programming
The Khronos Group recently announced that conformant WebGL 1.0.1 implementations are now available across mobile and desktop systems on a number of platforms. Chrome 25 and Firefox 19 support WebGL 1.0.1 on Windows, Mac, and Linux operating systems. Further, mobile devices with Tegra SoCs can tap into WebGL using a WebGL-enhanced Android browser.
Additionally, the WebGL 1.0.2 specification and its extensions have been submitted for ratification and are expected to be formally released in April. According to the press release, the following features are being rolled into the WebGL 1.0.2 specification:
- "adds many clarifications for specification behavioral precision
- mandates support for certain combinations of framebuffer formats, to ease developer adoption;
- clarifies interactions with the encompassing HTML5 platform, including the browser compositor and high-DPI displays;
- dramatically increases the number of conformance tests to roughly 21,000 to improve both the breadth and depth of test coverage - thanks principally to work by Gregg Tavares at Google and the OpenGL ES working group."
Khronos President and NVIDIA Vice President of Mobile Content Neil Trevett stated that "The close cooperation between browser and silicon vendors to ensure the GPU is being reliably and securely exposed to the Web is ongoing proof that Khronos is a highly productive forum to evolve this vital Web standard." Meanwhile, WebGL Working group chair Ken Russell indicated that WebGL 1.0.2 is "a major milestone in bringing the power of the GPU to the Web.”
Although there are security concerns to consider, WebGL does open up some interesting opportunities for new web services. The full press release can be found here.
Subject: General Tech | March 27, 2013 - 11:01 PM | Tim Verry
Tagged: Internet, hc-pbgf, fiber, data transmission
Transmitting data over optical fiber is one of the fastest methods available, and researchers at the University of Southampton have managed to dial up the speed even further.
Optical fiber transmits data as pulses of light. The speed of light through a vacuum is 299,792,458 meters per second, but traditional fiber is not nearly that fast because light travels approximately 31% slower (206,856,796.02 m/s) through silica glass than through a vacuum.
The new fiber employs a hollow design that allows light to travel through air rather than glass, while still allowing the cable to bend and twist around corners. Dubbed Hollow Core Photonic Bandgap Fiber, or HC-PBGF, the new fiber allows light to travel at up to about 298,893,080.63 m/s (~99.7% of the speed of light). Currently, HC-PBGF is still in the experimental phase, but it could have big implications for data centers and HPC server clusters that depend on high-bandwidth, low-latency connections between individual nodes.
Just how fast is the new HC-PBGF? According to ExtremeTech, a researcher told the site that the new fiber has a total cable throughput of 73.7 Tbps. It transmits 3 modes of 96 channels at 256 Gbps each using a combination of wavelength division multiplexing and mode division multiplexing. The fiber is 160nm and is noticeably faster than traditional fiber. Additionally, the HC-PBGF has a signal loss of 3.5 dB/km, which makes it a useful candidate for short runs between nodes or rows of racks, but not yet suitable for longer runs. HC-PBGF will not be blanketing your neighborhood anytime soon, but the research may lead to new optical networking technologies used in the next supercomputer or cloud service, for example.
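The quoted figures check out arithmetically; the speed numbers follow directly from the vacuum speed of light, and the 73.7 Tbps total is just the product of the modes, channels, and per-channel rate:

```python
# Checking the quoted speed and throughput figures for HC-PBGF.
C = 299_792_458  # speed of light in a vacuum, m/s

silica = C * 0.69    # ~31% slower through silica glass: ~206,856,796 m/s
hollow = C * 0.997   # ~99.7% of c through the hollow core: ~298,893,081 m/s

# Throughput: 3 spatial modes x 96 wavelength channels x 256 Gbps per channel
total_gbps = 3 * 96 * 256       # 73,728 Gbps
total_tbps = total_gbps / 1000  # ~73.7 Tbps, as quoted
```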
The full paper can be found here, along with more details over at Ars Technica. Unfortunately, the full paper is behind a paywall, but it may be worth seeing if your school or workplace can give you access should you be interested in drilling into the details of the experimental hollow fiber.
Subject: General Tech, Shows and Expos | March 27, 2013 - 08:51 PM | Scott Michaud
Tagged: Intel, GDC 13, GDC
GDC 2013 is where the industry comes together to talk about the technology itself. Intel was there, and of course the blue team just had to unveil developments to help it in the PC gaming space. Two major new rendering extensions and updated developer tools made their debut. And if you are not a developer, you can now encode your movies with Handbrake quicker!
First up is PixelSync, a DirectX extension for Intel HD Graphics. PixelSync is designed to be used with smoke, hair, and other effects which require blending translucent geometry. With this extension, objects do not need to be sorted before compositing.
Next up is InstantAccess. This DirectX extension allows CPU and integrated GPUs to access the same physical memory. What interests me most about InstantAccess is the ability for developers to write GPU-based applications which can quickly access the same memory as its CPU counterpart. Should the integrated GPU be visible alongside discrete GPUs, this could allow the integrated graphics to help offload GPGPU tasks such as physics while the CPU and discrete GPU handle the more important tasks.
Also updated is Intel's Graphics Performance Analyzers toolset. If you are interested in optimizing the performance of your software, be sure to check it out.
And for the more general public... Handbrake is now set up to take advantage of Quick Sync Video. Given the popularity of Handbrake, this is quite a big deal for anyone wishing to transcode video using popular and free encoders.
Subject: General Tech, Graphics Cards | March 27, 2013 - 08:16 PM | Tim Verry
Tagged: sky graphics, sky 900, RapidFire, radeon sky, pc gaming, GDC, cloud gaming, ciinow, amd
AMD is making a new push into cloud gaming with a new series of Radeon graphics cards called Sky. The new cards feature a (mysterious) technology called "RapidFire" that allegedly provides "highly efficient and responsive game streaming" from servers to your various computing devices (tablets, PCs, Smart TVs) over the Internet. At this year's Game Developers Conference (GDC), the company announced that it is working with a number of existing cloud gaming companies to provide hardware and drivers to reduce latency.
AMD is working with Otoy, G-Cluster, Ubitus, and CiiNow. CiiNow in particular was heavily discussed by AMD, and can reportedly provide lower latency than cloud gaming competitor Gaikai. AMD Sky is, in many ways, similar in scope to NVIDIA's GRID technology which was announced last year and shown off at GTC last week. Obviously, that has given NVIDIA a head start, but it is difficult to say how AMD's technology will stack up as the company is not yet providing any specifics. Joystiq was able to obtain information on the high-end Radeon Sky graphics card, however (that's something at least...). The Sky 900 reportedly features 3,584 stream processors, 6GB of GDDR5 RAM, and 480 GB/s of bandwidth. Further, AMD has indicated that the new Radeon Sky cards will be based on the company's Graphics Core Next architecture.
| |Sky 900|Radeon 7970|
|Stream processors|3,584|2,048|
|Memory|6GB GDDR5|3GB GDDR5|
|Memory bandwidth|480 GB/s|264 GB/s|
I think it is safe to assume that the Sky cards will be sold to other cloud gaming companies. They will not be consumer cards, and AMD is not going to get into the cloud gaming business itself. Beyond that, AMD's Sky cloud gaming initiative is still a mystery. Hopefully more details will filter out between now and the AMD Fusion Developer Summit this summer.
Subject: Cases and Cooling | March 27, 2013 - 03:17 PM | Jeremy Hellstrom
Tagged: xfx, PSU, ProSeries 650W
The XFX ProSeries 650W PSU is mostly modular, with only the ATX connector permanently attached; it has a 135mm cooling fan and can deliver 98% of its total wattage over a single 53A 12V rail. With four 6+2 pin PCIe power connectors you will be able to handle multiple GPUs, and the eight SATA connectors should accommodate as many storage devices as you need. Unfortunately, [H]ard|OCP discovered something about the 5 year warranty which greatly displeased them: unless you register your PSU within 30 days of purchase, you only receive a 2 year warranty. If you are strictly concerned with the quality of the power this PSU delivers and are ambivalent about the warranty, it passed [H]'s torture tests handily, which is something not every PSU can claim.
"XFX has long and actually very solid history of producing high quality enthusiast power supplies. We have consistently found XFX PSUs worthy of [H] Editor's Choice Awards. Today the XFX ProSeries 650W promises "One Rail, One Setup" in a PSU that is different. Let's see if that is good or bad."
Here are some more Cases & Cooling reviews from around the web:
- Cooler Master Silent Pro M2 1000W PSU Review @ Hi Tech Legion
- NZXT Hale 90 V2 1000 Watt Power Supply @ Modders-Inc
- Zalman ZM500-GS / ZM500-GT PSU @ Hardware.info
- Cougar PowerX 550 W @ techPowerUp
- Cougar PowerX 550W Power Supply Unit Review @ NikKTech
- Fractal Design Newton R3 800 W Power Supply Review @ Hardware Secrets
- SilverStone Strider Plus 600 W (ST60F-PS) Power Supply Review @ Hardware Secrets
- Enermax Triathlor FC 550 W @ techPowerUp
- Thermaltake Toughpower Grand Platinum 700-Watt 80 PLUS Platinum @ Tweaktown
- AZZA Platinum 1000-Watt 80 PLUS Platinum @ Tweaktown
- Silverstone Zeus 1350-Watt 80 PLUS @ Tweaktown
- NZXT HALE90 V2 850 W Power Supply Review @ Hardware Secrets
- Antec High Current Pro Platinum 1000W Power Supply Unit Review @ NikKTech
- Thermaltake Dr.Power II ATX Power Supply Tester @ Modders-Inc
- Cooler Master 120W NA 120 Universal AC Adapter Review @ Hi Tech Legion
- eXtreme Power Supply Calculator Update
Subject: General Tech | March 27, 2013 - 02:42 PM | Jeremy Hellstrom
Tagged: gaming, bioshock infinite
After the long wait, BioShock: Infinite is here, not as a sequel but carrying some of the atmosphere of the first two games out of the water and into the clouds. The reviews are positive, in terms of both gameplay and story, as you can see in Rock, Paper, SHOTGUN's review of the game here. That particular article is spoiler free, not revealing any secrets beyond what those following the release already know, but you can also venture into spoiler territory if you wish, as they have already published some feedback about subplots they would have liked to see fleshed out more. For those just about to start playing, they offer some tweaks to improve your experience, including the all-important FOV hack, as well as a Konami-style code to unlock sadistic mode immediately.
"BioShock: Infinite is a new first-person shooter from Irrational, creators of BioShock, System Shock 2 and SWAT 4. It’s set on a flying city in 1912, where racism and religious fundamentalism dictate society. You’re up there, wielding guns and magic, to bring someone the girl and wipe away the debt. Here’s what I thought, spoiler-free."
Here is some more Tech News from around the web:
- BioShock Infinite review: In the sky, Lord, in the sky @ Ars Technica
- A Look at Duke Nukem 3D Megaton Edition @ Techgage
- WARFACE Puts Its War In Your Face @ Rock, Paper, SHOTGUN
- Tripwire’s Rising Storm Whips Up A Trailer @ Rock, Paper, SHOTGUN
- Age Of Wonders III First Footage @ Rock, Paper, SHOTGUN
- First Look: Space Hulk @ Rock, Paper, SHOTGUN
Subject: General Tech | March 27, 2013 - 01:21 PM | Jeremy Hellstrom
Tagged: gpu, history, get off my lawn
TechSpot has just published an article looking at the history of the GPU over the past decades, from the first NTSC-capable cards, through the golden 3Dfx years, straight through to the modern GPGPU. There have been a lot of standards over the years, such as MDA, CGA and EGA, as well as different interfaces, from ISA and the graphics-card-specific AGP to the current PCIe standard. The first article in this four-part series takes us from 1976 through to 1995 and the birth of the Voodoo series of accelerators. Read on to bring back memories, or perhaps to encounter some of this history for the first time.
"The evolution of the modern graphics processor begins with the introduction of the first 3D add-in cards in 1995, followed by the widespread adoption of the 32-bit operating systems and the affordable personal computer. While 3D graphics turned a fairly dull PC industry into a light and magic show, they owe their existence to generations of innovative endeavour. Over the next few weeks we'll be taking an extensive look at the history of the GPU, going from the early days of 3D consumer graphics, to the 3Dfx Voodoo game-changer, the industry's consolidation at the turn of the century, and today's modern GPGPU."
Here is some more Tech News from around the web:
- The Do-It-Yourself All-In-One Computer Standard from Intel @ Hardware Secrets
- What is Windows Blue? @ TechReviewSource
- Mozilla Firefox OS on Dreamfone video demo @ The Inquirer
- Google Keep hands-on @ The Inquirer
- Whoops! Tiny bug in NetBSD 6.0 code ruins SSH crypto keys @ The Register
Subject: General Tech | March 27, 2013 - 12:47 PM | Jeremy Hellstrom
Tagged: vengeance 2000, corsair, gaming headset, dolby, audio
Corsair is offering a way to get even more out of its Vengeance 2000 wireless gaming headset. The headset has a brushed aluminum headband, 50mm drivers and microfiber earcups, plus the benefit of being completely wireless, and now it also supports two new Dolby features thanks to a driver update. Get more from your headset for free from Corsair, or pick up a pair for $100 if you are in the market for new headphones.
Fremont, California — March 27, 2013 — Corsair, a worldwide designer and supplier of high-performance components to the PC gaming hardware market, today announced the release of a free software driver update which adds Dolby Headphone 2.0 and Dolby Pro Logic IIx to the Vengeance 2000 wireless gaming headset. The addition of Dolby Headphone to the Vengeance 2000 improves both surround sound quality and game compatibility. The software driver is immediately available for download at www.corsair.com.
Vengeance 2000 Wireless 7.1 Gaming Headset: Untethered gaming with amazing audio fidelity
The Vengeance 2000 7.1 Wireless gaming headset uses rock-solid 2.4GHz wireless that frees and untangles gamers with battery life up to ten hours and a range up to 40 feet. Whether it’s a 5.1 or 7.1 audio source, the headset’s optimized HRTF positional audio technology gives gamers an edge with precise information about the location of opponents. The Vengeance 2000 also features custom-engineered 50mm drivers and careful acoustic tuning for audiophile-grade sound with superior clarity and tight bass response. For immersive audio and hours-long comfort the Vengeance 2000 employs circumaural, micro-fiber memory foam earpads and a padded headband. The high-sensitivity, noise-cancelling microphone increases effective team play. The headset comes with a Micro-USB charge cable that can also provide power if the battery runs low.
Dolby Headphone 2.0 and Dolby Pro Logic IIx
Dolby Headphone is a revolutionary signal processing technology that delivers up to 7.1-channel surround sound over headphones for richer, more spacious headphone audio. Dolby Pro Logic IIx processes native stereo- and 5.1-channel material to produce 6.1 or 7.1 output channels.
The new Vengeance 2000 software driver is available for download on the Vengeance 2000 product page on Corsair.com.
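Dolby's actual processing is proprietary, but the general idea behind Pro Logic-style upmixing is matrix decoding: deriving additional channels from sums and differences of the stereo pair. A toy sketch of that idea only (not Dolby's algorithm; the function name and channel labels are ours):

```python
def matrix_upmix(left, right):
    """Toy 2-channel -> 4-channel matrix decode, Pro Logic-style idea only."""
    # Content common to both channels folds into the center;
    # content that differs between channels folds into the surround.
    center = [(l + r) / 2 for l, r in zip(left, right)]
    surround = [(l - r) / 2 for l, r in zip(left, right)]
    return {"L": left, "R": right, "C": center, "S": surround}

out = matrix_upmix([1.0, 0.5], [1.0, -0.5])
print(out["C"])  # [1.0, 0.0] - the identical first sample lands in the center
print(out["S"])  # [0.0, 0.5] - the differing second sample lands in the surround
```

A real decoder adds steering logic, phase shifts, and many more derived channels, but the sum/difference core is the same family of trick.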
Subject: Editorial, General Tech, Shows and Expos | March 27, 2013 - 03:25 AM | Scott Michaud
Tagged: battlefield, battlefield 4, GDC, GDC 13
Battlefield 4 is coming; that much has been known since Medal of Honor: Warfighter's release and its promise of beta access, but the gameplay trailer is already here. Clocking in at just over 17 minutes, "Fishing in Baku" looks amazing from a technical standpoint.
The video is embedded below. It is a little not safe for work due to language and amputation.
Now that you have finished gawking, we have gameplay to discuss. I cannot help but be disappointed with the direction of the campaign. Surely the story was planned prior to the release of Battlefield 3; still, it seems to suffer from the same generic-character problem which struck the last campaign.
In Battlefield 3, I really could not recognize many characters apart from the lead, which made their deaths more confusing than upsetting. Normally when we claim a character is identifiable, we mean that we can relate to them. In this case, when I say the characters were not identifiable, I seriously mean that I probably could not pick them out of a police lineup.
Then again, the leaked promotional image for Battlefield 4 seems to show Blackburn at the helm, so I guess there is some hope. Slim hope, which the trailer does not add to; even the end narration underscored how pointless the character interactions were. All this in spite of EA's YouTube description proclaiming it to be human, dramatic, and believable.
Oh well, it went boom good.
Subject: General Tech | March 27, 2013 - 12:06 AM | Tim Verry
Tagged: Xi3, valve, Steam Box, piston, pc gaming, gaming
It may or may not be Valve's Steam Box, but the Xi3 PISTON is the closest thing on the radar so far to a small form factor PC gaming console running Steam. It is now up for pre-order with an intended holiday 2013 launch.
The PISTON starts at $899 and increases in price from there depending on the amount of internal storage included. Basic specifications of the Piston include an AMD APU (likely the A10-4600M) clocked at 2.3GHz (3.2GHz turbo), Radeon 7660G processor graphics (384 shaders), 8GB of DDR3 RAM, and a 128GB solid state drive. For an extra $340, Xi3 will swap in a 256GB SSD, and for $750 the company will include a 512GB option. Of course, that would bring the price of the living room machine up to $1,649, which is far from cheap.
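The reported pricing tiers reduce to a simple base-plus-upgrade lookup (the function name is ours, for illustration):

```python
# Xi3 PISTON pricing as reported: $899 base, with SSD upgrade surcharges.
BASE_PRICE = 899
SSD_UPGRADES = {"128GB": 0, "256GB": 340, "512GB": 750}

def piston_price(ssd: str) -> int:
    """Total price in USD for a PISTON with the given SSD tier."""
    return BASE_PRICE + SSD_UPGRADES[ssd]

print(piston_price("512GB"))  # 1649
```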
For that kind of money you could build a much more powerful mid-tower that could actually run Steam games at 1080p with all the details cranked up; the Xi3 box will be lucky to average 30 FPS at 1080p in the latest games. With that said, it is a start, and I hope to see continued development of these "Steam Box"-esque PCs. Hopefully once mass production, competing options, and economies of scale kick in, consumers will be able to get their hands on cheaper Steam Boxes!
If you can't wait for the official Steam Box, however, you can head over to the Xi3 website to reserve your own PISTON.