Subject: General Tech | April 2, 2013 - 05:57 PM | Jeremy Hellstrom
Tagged: arm, FinFET, 16nm, TSMC, Cortex-A57
While what DigiTimes is reporting on is only the first tape out, it is still very interesting to see TSMC hitting 16nm process testing and doing it with the 3D transistor technology we have come to know as FinFET. The chip in question was a 64-bit ARM Cortex-A57 created on this process; unfortunately, we did not get much information about what comprised the chip apart from the slide you can see below.
As can be inferred from the mention that it can run alongside big.LITTLE chips, it will not be of the same architecture, nor will it be confined to cellphones. This helps reinforce TSMC's position in the market as a fab keeping up with the latest fabrication trends, and another solid ARM contract will also keep the beancounters occupied. You can't expect to see these chips immediately, but this is a solid step towards a new process being mastered by TSMC.
"The achievement is the first milestone in the collaboration between ARM and TSMC to jointly optimize the 64-bit ARMv8 processor series on TSMC FinFET process technologies, the companies said. The pair has teamed up to produce Cortex-A57 processors and libraries to support early customer implementations on 16nm FinFET for ARM-based SoCs."
Here is some more Tech News from around the web:
- Wiping a Smartphone Still Leaves Data Behind @ Slashdot
- ARM processor competition to fire up @ DigiTimes
- Physicists bang the drum for quantum memory @ The Register
- Intel Haswell Socket H Heatsink Requirements and Overclocking Thoughts @ Tweaktown
- Killing Your Internet with Killer Ethernet @ Techgage
- Backdoors Found In Bitlocker, FileVault and TrueCrypt? @ TechARP
- Win ASRock FM2A85X Extreme 6 & Seasonic M12II-850 @ Kitguru
- Win Enermax Goodies From Insomnia i48 @ eTeknix
- NikKTech & Synology Joint Giveaway - One DiskStation DS213+ Up For Grabs
- The TR Podcast 131: News from GDC and FCAT attacks
- Dispatches from the Nexus @ The Tech Report
- AMD touts unified gaming strategy @ The Tech Report
- Intel gets serious about graphics for gaming @ The Tech Report
Subject: General Tech | April 2, 2013 - 10:59 AM | Tim Verry
Tagged: lga 2011, Ivy Bridge-E, Intel, 22nm
Many enthusiasts have been eagerly awaiting the next generation of Intel processors to use LGA 2011, which is supposed to be Ivy Bridge-E. Especially after seeing rumors of a 10 core Xeon E5-2600 V2 Ivy Bridge-EP CPU, I think many users expected at least an eight core Ivy Bridge-E part.
Unfortunately, if a slide posted by VR-Zone China is any indication, LGA 2011 users will not be getting an eight core processor any time soon. The slide suggests that Intel will release three new Ivy Bridge-E CPUs in the third quarter of this year (Q3'13). However, the top-end part is merely a six core CPU with slight improvements over the existing Sandy Bridge-E 3960X chip.
Specifically, the slide alleges that the initial Intel release will include the Core i7 4820K, Core i7 4930K, and Core i7 4960X. An Ivy Bridge-E equivalent to the SB-E 3970X is noticeably absent from the lineup, along with several of the other rumored (higher core count) chips.
Rumored Ivy Bridge-E chips:
| Model | Clockspeed | Core Count | L3 Cache | Manufacturing Process | TDP |
|---|---|---|---|---|---|
| Core i7 4960X | 3.6GHz (4GHz Turbo) | 6 | 15MB | 22nm | 130W |
| Core i7 4930K | 3.4GHz (3.9GHz Turbo) | 6 | 12MB | 22nm | 130W |
| Core i7 4820K | 3.7GHz (3.9GHz Turbo) | 4 | 10MB | 22nm | 130W |
Existing Sandy Bridge-E equivalents:
| Model | Clockspeed | Core Count | L3 Cache | Manufacturing Process | TDP |
|---|---|---|---|---|---|
| Core i7 3960X | 3.3GHz (3.9GHz Turbo) | 6 | 15MB | 32nm | 130W |
| Core i7 3930K | 3.2GHz (3.8GHz Turbo) | 6 | 12MB | 32nm | 130W |
| Core i7 3820 | 3.6GHz (3.8GHz Turbo) | 4 | 10MB | 32nm | 130W |
All of the chips allegedly have 130W TDPs, 40 PCI-E 3.0 lanes, support for quad-channel DDR3-1866 memory, and are built on Intel's 22nm manufacturing process. The low-end i7 4820K is a quad core chip clocked at 3.7 GHz base and 3.9 GHz turbo with 10MB of L3 cache. The i7 4930K is an unlocked six core part with 12MB of L3 cache and clockspeeds of 3.4 GHz base and 3.9 GHz turbo. Finally, the Core i7 4960X is rumored to be the highest-end chip Intel will release (at least, initially). It is also a six core part, clocked at 3.6 GHz base and 4 GHz turbo, with 15MB of L3 cache. These chips are the Ivy Bridge-E equivalents to the 3820, 3930K, and 3960X respectively. The new processors feature higher clockspeeds and are based on 22nm 3D transistor technology instead of SB-E's 32nm manufacturing process. It also seems that Intel has extended unlocking to the lower-tier LGA 2011 chip, as it is listed as the Core i7 4820K. Having an unlocked multiplier is nice to see at the low end (the low end of the enthusiast platform, anyway).

Curiously, the TDP ratings are unchanged. That suggests the move to 22nm did not net Intel much TDP headroom, with the higher clocks bringing the chips up to similar TDP numbers. At least the TDP ratings are not higher than SB-E, so your motherboard and HSF should have no problems accepting an IVB-E CPU upgrade (with a BIOS update, of course).
It will be interesting to see how the new Ivy Bridge-E chips stack up, especially considering Intel may also be unveiling the consumer-grade Haswell processor this year. On one hand, Ivy Bridge-E offers a CPU upgrade path for existing systems; on the other hand, Haswell's pricing and performance (and the lack of the higher core count Ivy Bridge-E chips that previous rumors suggested) may see enthusiasts opt for a motherboard-plus-CPU overhaul instead of simply recycling the LGA 2011/X79 motherboard. If this new slide holds true, it appears that Ivy Bridge-E/LGA 2011 will become even more of a niche, solely for workstations that need the extra PCI-E lanes and quad-channel memory. I say this as someone running a Lynnfield system who is itching for an upgrade and torn between going for the enthusiast platform and waiting for Haswell.
What do you think about the rumored Ivy Bridge-E chips? Are they what you expected? Do you think they will be worth a CPU upgrade for your LGA 2011-based system, or are you leaning towards Haswell?
Read more about Ivy Bridge-E at PC Perspective, including: Ivy Bridge-E after Haswell: I think I've gone cross-eyed.
Subject: General Tech | April 2, 2013 - 07:25 AM | Tim Verry
Tagged: x86 emulator, rpix86, Raspberry Pi, gaming, dos
The Raspberry Pi is proving to be a popular destination for all manner of interesting software projects and open source operating systems. The most recent Pi project I've come across is a DOS PC emulator by Patrick Aalto called rpix86. A port of DSx86, which ran on the Nintendo DS handheld console, rpix86 is now up to version 0.04 and emulates a 90's X86 computer with enough hardware oomph to run classic PC games!
Rpix86 is an emulator that runs from the console (not within the X GUI desktop environment) on the Raspberry Pi. It emulates the following X86 PC specs:
| Component | Emulated Specification |
|---|---|
| Processor | 80486 @ ~20 MHz (incl. protected mode; no virtual memory support) |
| Memory | 640 KB low memory, 4 MB EMS memory, 16 MB XMS memory |
| Graphics | Super VGA @ 640 x 480 w/ 256 colors |
| Audio | Sound Blaster 2.0 (+ AdLib-compatible FM sounds) |
| Input Devices | US keyboard, analog joystick, 2-button mouse |
| Misc | Roland MPU-401 MIDI support via USB MIDI dongle |
Patrick Aalto added support for analog USB joysticks and foot pedals (4 buttons, 4 analog channels) as well as 80 x 50 text mode (required by some MIDI software and Little Big Adventure's setup program) to the recent 0.04 update. He also stripped out debug code, which cut the program size approximately in half.
The developer has stated on his blog that he is working on allowing rpix86 to be used from the terminal within X and adding support for intelligent MPU MIDI mode. A port to the Android operating system called ax86 is also in the works. You can grab the current version of the Raspberry Pi X86 emulator on the developer's website.
With this emulator, you can run most of the DOS games you grew up with (Wolf3D and Digger, anyone?), which is definitely a worthy use for the $25 or $35 Raspberry Pi hardware! At the very least, it is an interesting alternative to DOSBox, and much smaller and more power efficient than keeping an old X86 PC around just to run classic games. Getting those floppies to work with the Pi might be a bit of an issue though, assuming they are still readable (heh).
Read more about the Raspberry Pi computer at PC Perspective.
Subject: General Tech | April 2, 2013 - 06:41 AM | Tim Verry
Tagged: file sync, cloud storage, cloud drive, amazon
Amazon has announced two new Java-based applications for Windows and Mac PCs that will sync files between multiple computers and the company's Cloud Drive online storage service.
Amazon Cloud Drive is a companion service that was spun off of its Cloud Player music locker service. Users get 5GB for free, with additional tiers of storage available for purchase. (Any music from Amazon side-loaded to Cloud Drive and Cloud Player before July 31st does not count towards your storage quota.) Until now, Cloud Drive has been merely a web storage locker, but with the new desktop apps Amazon is adding file syncing capabilities that will keep your files updated across multiple PCs. The desktop apps create a folder that contains a locally-stored copy of your Amazon Cloud Drive files. If you install the desktop app on a second PC, it will also sync with Cloud Drive and store a copy of the files locally. The most recently modified version of a file syncs to every other computer's local store and to Cloud Drive. There is no word on versioning support, so note that this should not be treated as a replacement for a true file backup. With that said, multiple-PC file sync is a welcome addition that makes Cloud Drive more useful than ever before.
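Amazon has not detailed how the sync engine resolves conflicting edits, but the behavior described above boils down to a last-modified-wins rule. The snippet below is a minimal illustration of that idea only; the file names, timestamps, and host labels are hypothetical, and this is not Amazon's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class FileCopy:
    path: str        # path relative to the Cloud Drive folder
    modified: float  # last-modified timestamp (Unix time)
    host: str        # "cloud", "pc-1", "pc-2", ...

def pick_winner(copies):
    """Last-modified-wins: the newest copy is pushed to every other replica."""
    return max(copies, key=lambda c: c.modified)

# Hypothetical example: the same document edited on two PCs plus the cloud copy
replicas = [
    FileCopy("docs/report.txt", 1364860800.0, "cloud"),
    FileCopy("docs/report.txt", 1364864400.0, "pc-1"),
    FileCopy("docs/report.txt", 1364862600.0, "pc-2"),
]
winner = pick_winner(replicas)
print(f"Sync {winner.path} from {winner.host} to the cloud and all other PCs")
```

Note that a simple last-modified-wins rule silently discards the losing edits, which is exactly why the lack of versioning support matters here.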
The new desktop apps will run on Windows XP, Vista, 7, and 8, and on Mac OS X 10.6, 10.7, and 10.8.
When Amazon was asked about mobile apps and file sync, the company told Ars Technica that it had "nothing specific to share." That could mean that Cloud Drive will bring file synchronization to iOS, Android, and WP8, or it could be a literal statement. It is difficult to say, but I think if Amazon wants its Cloud Drive storage service to be taken seriously the company will need to enter the mobile space (as it has done with Cloud Player).
Subject: General Tech | April 2, 2013 - 02:54 AM | Tim Verry
Tagged: next generation character rendering, GDC 13, gaming, Activision, 3D rendering
Activision recently showed off its Next-Generation Character Rendering technology, which is a new method for rendering realistic and high-quality 3D faces. The technology has been in the works for some time and is now at a point where faces are extremely detailed, down to pores, freckles, wrinkles, and eyelashes.
In addition to Lauren, Activision also showed off its own take on the face used in NVIDIA's Ira FaceWorks tech demo, except that instead of NVIDIA's rendering, the face was produced using Activision's own Next-Generation Character Rendering technology, a method that is allegedly more efficient and "completely different" from the one used for Ira. In a video showing off the technology (embedded below), the Activision method produces some impressive 3D renders in real time, though the faces look a bit creepy and unnatural when talking. Perhaps Activision and NVIDIA should find a way to combine the emotional improvements of Ira with the graphical prowess of NGCR (and while we are making a wish list, I might as well add TressFX support... heh).
The high resolution faces are not quite ready for the next Call of Duty, but the research team has managed to get models to render at 180 FPS on a PC running a single GTX 680 graphics card. That is not enough to implement the technology in a game, where there are multiple models, the environment, physics, AI, and all manner of other calculations to deal with and present at acceptable frame rates, but it is nice to see this kind of future-looking work being done now. Perhaps in a few graphics card generations the hardware will catch up to the face rendering technology that Activision (and others) are working on, which will be rather satisfying to see. It is amazing how far the graphics world has come since I got into PC gaming with Wolfenstein 3D, to say the least!
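To put that 180 FPS figure in perspective, the quick arithmetic below shows why a single face at that speed still leaves little headroom in a real game (the 60 FPS target is my assumption for illustration, not Activision's):

```python
# Back-of-the-envelope frame-budget math behind the "not enough for a game" point.
# Assumption (mine, not Activision's): the game targets 60 FPS.
face_ms = 1000.0 / 180         # ~5.6 ms to render a single face at 180 FPS
frame_budget_ms = 1000.0 / 60  # ~16.7 ms available per frame at 60 FPS
share = face_ms / frame_budget_ms
print(f"One face costs {face_ms:.1f} ms, about {share:.0%} of a 60 FPS frame, "
      "before any environment, physics, AI, or additional characters.")
```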
The team behind Activision's Next-Generation Character Rendering technology includes:
| Name | Role |
|---|---|
| Javier Von Der Pahlen | Director of Research and Development |
| Etienne Donvoye | Technical Director |
| Bernardo Antoniazzi | Technical Art Director |
| Zbyněk Kysela | Modeler and Texture Artist |
| Mike Eheler | Programming and Support |
| Jorge Jimenez | Real-Time Graphics Research and Development |
Jorge Jimenez has posted several more screenshots of the GDC tech demo on his blog that are worth checking out if you are interested in the new rendering tech.
Subject: General Tech | April 1, 2013 - 05:37 AM | Tim Verry
Tagged: virtual reality, oculus vr, oculus rift, GTC 2013, gaming
Update: Yesterday, an Oculus representative reached out to us to clarify that the information presented was forward-looking and subject to change (naturally). Later in the day, the company also posted the following statement to its forum:
"The information from that presentation (dates, concepts, projections, etc...) represent our vision, ideas, and on-going research/exploration. None of it should be considered fact, though we'd love to have that projected revenue!"
You can find the full statement in this thread.
The original article text is below:
Oculus VR, the company behind the Oculus Rift virtual reality headset, took the stage at NVIDIA's GPU Technology Conference to talk about the state of its technology and where it is headed.
Oculus VR is a relatively new company founded by Palmer Luckey and managed by CEO Brendan Iribe, who is the former CPO of cloud gaming company Gaikai. Currently, Oculus VR is developing a wearable, 3D, virtual reality headset called the Oculus Rift. Initially launched via Kickstarter, the Oculus Rift hardware is now shipping to developers as the rev 1 developer kit. Oculus VR will manufacture 10,000 developer kits and managed to raise $2.55 million in 2012.
The developer kit has a resolution of 1280x800 and weighs 320g. It takes a digital video input via a DVI or HDMI cable (HDMI with an adapter). The goggles hold the display and internals, and a control box connects via a wire to provide power. It uses several pieces of hardware found in smartphones, and CEO Brendan Iribe even hinted that an ongoing theme at Oculus VR was that "if it's not in a cell phone, it's not in Oculus." It delivers a 3D experience with head tracking, but Iribe indicated that full motion VR is coming in the future. For now, head tracking allows you to look around the game world, but "in five to seven or eight years" virtual reality setups that combine an Oculus Rift-like headset with an omni-directional treadmill could allow you to walk and run around the world in addition to simply looking around.
Beyond the immersion factor, a full motion VR setup would reduce (and possibly eliminate) the phenomenon of VR sickness, where users wearing VR headsets for extended periods of time experience discomfort due to the disconnect between their perceived in-game movement and their (lack of) physical movement and inner-ear balance.
After the first developer kit, Oculus is planning to release a revised version, and ultimately a consumer version. This consumer version is slated for a Q3 2014 launch. It will weigh significantly less (Oculus VR is aiming for around 200g), and will support 1080p 3D resolutions. The sales projections estimate 50,000 revision 2 developer kits in 2013 and at least 500,000 consumer versions of the Oculus Rift in 2014. Ambitious numbers, for sure, but if Oculus can nail down next-generation console support, reduce the weight of the headset, and increase the resolution, it is not out of the question.
With the consumer version, Oculus is hoping to offer both a base wired version and a higher-priced wireless Rift VR headset. Further, the company is working with game and professional 3D creation software developers to get software support for the VR headset. Team Fortress 2 support has been announced, for example (and there will even be an Oculus Rift hat, for gamers that are into hats). Additionally, Oculus is working to get support into the following software titles (among others):
- AutoDesk 3D
- DOTA 2
During the presentation, Iribe stated that graphics cards (specifically he mentioned the GTX 680) are finally in a place to deliver 3D with smooth frame rates at resolutions high enough for immersive virtual reality.
Left: potential games with Oculus VR support. Right: Oculus VR CEO Brendan Iribe at ECS during GTC 2013.
Pricing on the consumer version of the VR headset is still unknown, but developers can currently pre-order an Oculus Rift developer kit on the Oculus VR site. In the past, the company has stated that consumers should hold off on buying a developer kit and wait for the consumer version of the Rift in 2014. If the company is able to deliver on its claims of a lighter headset with a higher-resolution screen and adjustable 3D effects (like the 3DS, the level of stereo 3D can be adjusted and even turned off), I think it will be worth the wait. The deciding factor will then be software support; hopefully developers will take to the VR technology and offer support for it in upcoming titles.
Are you excited for the Oculus Rift?
Subject: General Tech | March 31, 2013 - 08:43 PM | Tim Verry
Tagged: nvidia, lenovo yoga, GTC 2013, GTC, gesture control, eyesight, ECS
During the Emerging Companies Summit at NVIDIA's GPU Technology Conference, Israeli company EyeSight Mobile Technologies' CEO Gideon Shmuel took the stage to discuss the future of its gesture recognition software. He also provided insight into how EyeSight plans to use graphics cards to improve and accelerate the process of identifying, and responding to, finger and hand movements along with face detection.
EyeSight is a five-year-old company that has developed gesture recognition software that can be installed on existing machines (though it appears to be aimed more at OEMs than directly at consumers). It uses standard cameras, such as webcams, to get its 2D input data and then derives a relative Z-axis from proprietary algorithms. This gives EyeSight essentially 2.5D of input data and, camera resolution and frame rate permitting, allows the software to identify and track finger and hand movements. EyeSight CEO Gideon Shmuel stated at the ECS presentation that the software is currently capable of "finger-level accuracy" at 5 meters from a TV.
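EyeSight's depth algorithms are proprietary, so the following is purely an illustration of the general idea rather than the company's method: with a single 2D camera, a relative Z-axis can be inferred from how the apparent size of a tracked hand changes between frames (simple pinhole-camera scaling).

```python
# Illustration only: infer *relative* depth from apparent size in a 2D image.
# Under a pinhole-camera model, apparent size is inversely proportional to distance.
def relative_depth(current_size_px: float, reference_size_px: float) -> float:
    """Distance relative to when the hand was first detected:
    1.0 = unchanged, 2.0 = twice as far away, 0.5 = twice as close."""
    return reference_size_px / current_size_px

# Hypothetical example: a tracked hand shrinks from 120 px wide to 80 px wide
print(relative_depth(80.0, 120.0))  # -> 1.5, i.e. the hand moved away from the camera
```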
Gestures include using your fingers as a mouse to point at on-screen objects, waving your hand to turn pages, scrolling, and even giving hand-signal cues.
The software is not open source, and there are no plans to move in that direction. The company has 15 patents pending on its technology, several of which it managed to file before the US Patent Office changed from First to Invent to First Inventor to File (heh, which is another article...). The software will support up to 20 million hardware devices in 2013, and EyeSight expects the number of compatible camera-packing devices to increase to as many as 3.5 billion in 2015. Other features include the ability to transparently map EyeSight input to Android apps without users needing to muck with settings, and the ability to detect faces and "emotional signals" even in low light. According to the website, SDKs are available for Windows, Linux, and Android. On Windows, the software maps the gestures it recognizes to keyboard shortcuts, which increases compatibility with many existing applications (so long as they support keyboard shortcuts).
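EyeSight has not published the details of that mapping, but conceptually it is a lookup from recognized gestures to existing keyboard shortcuts. The sketch below illustrates the idea only; the gesture names and shortcut choices are hypothetical, and a real implementation would inject keystrokes through an OS input API rather than printing them.

```python
# Illustration of mapping recognized gestures to keyboard shortcuts.
# Gesture names and shortcuts are hypothetical examples, not EyeSight's.
GESTURE_TO_SHORTCUT = {
    "swipe_left":  ("alt", "left"),    # e.g. "back" in a browser
    "swipe_right": ("alt", "right"),   # e.g. "forward"
    "wave_down":   ("pagedown",),      # scroll / turn the page
    "wave_up":     ("pageup",),
}

def dispatch(gesture: str) -> None:
    keys = GESTURE_TO_SHORTCUT.get(gesture)
    if keys is None:
        return  # unrecognized gesture: ignore it
    # A real implementation would send these keys via an OS input API
    # (e.g. SendInput on Windows); here we only log the mapping.
    print(f"gesture {gesture!r} -> shortcut {'+'.join(keys)}")

dispatch("swipe_left")   # gesture 'swipe_left' -> shortcut alt+left
```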
Currently, the EyeSight software mostly runs on the CPU, but the company is investing heavily in GPU support. Moving the processing to GPUs will allow the software to run faster and more power efficiently, especially on mobile devices (NVIDIA's Tegra platform was specifically mentioned). EyeSight's future roadmap includes using GPU acceleration to bolster the number of supported gestures, move image processing to the GPU, add velocity and vector control inputs, incorporate a better low-light filter (which will run on the GPU), and offload processing from the CPU to optimize power management and save CPU resources for the OS and other applications, which is especially important for mobile devices. Gideon Shmuel also stated that he wants to see the technology used on "anything with a display," from your smartphone to your air conditioner.
A basic version of the EyeSight input technology reportedly comes installed on the Lenovo Yoga convertible tablet. I think this software has potential, and would provide that Minority Report-like interaction that many enthusiasts wish for. Hopefully, EyeSight can deliver on its claimed accuracy figures and OEMs will embrace the technology by integrating it into future devices.
EyeSight has posted additional video demos and information about its touch-free technology on its website.
Do you think this "touch-free" gesture technology has merit, or will this type of input remain limited to awkward integration in console games?
Subject: General Tech | March 31, 2013 - 02:21 AM | Tim Verry
Tagged: sony, ps4, playstation eye, playstation 4, gaming, dualshock 4, APU, amd
Sony teased a few more details about its upcoming PlayStation 4 console at the Game Developers Conference earlier this week. While the basic specifications have not changed since the original announcement, we now know more about the X86 console hardware.
The PS4 itself is powered by an AMD Jaguar CPU with eight physical cores and eight threads. Each core gets 32 KB of L1 instruction cache and 32 KB of L1 data cache, and each group of four physical cores shares 2 MB of L2 cache, for 4 MB of L2 total. The processor is capable of out-of-order execution, as are AMD's other processor offerings. The console also reportedly features 8GB of GDDR5 memory that is shared by the CPU and GPU. It offers 176 GB/s of bandwidth, a step above the PS3, which did not use a unified memory design. The system will also sport a faster GPU rated at 1.843 TFLOPS and clocked at 800MHz. The PS4 will have a high-capacity hard drive and a new Blu-ray drive that is up to 3-times faster. Interestingly, the console also has a co-processor that handles the video streaming features and allows Remote Play game streaming to the PlayStation Vita at its native resolution of 960x544.
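Those two headline numbers are easy to sanity-check. Sony only quotes the totals, but 176 GB/s is consistent with the widely reported 256-bit GDDR5 interface at 5.5 Gbps per pin, and 1.843 TFLOPS matches 1,152 shader ALUs at 800 MHz. The bus width and ALU count below are assumptions from those reports, not figures from Sony's slide.

```python
# Sanity-check of Sony's quoted totals using widely reported (unofficial) details.
bus_width_bits = 256      # assumed GDDR5 interface width
data_rate_gbps = 5.5      # assumed effective transfer rate per pin
bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(f"Memory bandwidth: {bandwidth_gb_s:.0f} GB/s")    # -> 176 GB/s

shader_alus = 1152        # assumed (18 compute units x 64 ALUs each)
clock_ghz = 0.8           # 800 MHz, from Sony's figure
tflops = shader_alus * 2 * clock_ghz / 1000              # 2 FLOPs per ALU per clock (FMA)
print(f"GPU compute: {tflops:.3f} TFLOPS")               # -> 1.843 TFLOPS
```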
The PlayStation Eye has also been upgraded for the PS4 to include two cameras, four microphones, and a 3-axis accelerometer. The Eye cameras have an 85-degree field of view and can record video at 1280x800 and 60Hz with 12 bits per pixel, or at 640x480 and 120Hz. The new PS4 Eye is a noteworthy upgrade over the current generation model, which is limited to either 640x480 at 60Hz or 320x240 at 120Hz. The extra resolution should allow developers to track objects more accurately. The DualShock 4 controllers sport a light-bar that can be tracked by the new Eye camera, for example. The light-bar on the controllers uses an RGB LED that changes to blue, red, pink, or green for players 1-4 respectively.
Speaking of the new DualShock 4, Sony has reportedly ditched the analog face buttons and D-pad in favor of digital buttons. With the DualShock 3 and the PS3, the analog face buttons and D-pad came in handy for racing games, but otherwise they are not likely to be missed. The controllers will now charge even when the console is in standby mode, and the L2 and R2 triggers are more resistant to accidental presses. The analog sticks have been slightly modified and feature a reduced dead zone. The touchpad, which is a completely new feature for the DualShock lineup, is capable of tracking two points at a resolution of 1920x900, which is pretty good.
While Sony has still not revealed what the actual PS4 console will look like, most of the internals are now officially known. It will be interesting to see just where Sony prices the new console and where game developers are able to take it. With a DX11.1+ feature set, developers are able to use many of the same tools used to program PC titles, but they also have additional debugging tools and low-level access to the hardware. A new low-level API, below DirectX but above the driver level, gives developers deeper access to the shader pipeline. I'm curious to see how PC ports will turn out; with the consoles now running X86 hardware, I'm hoping that the usual fare of bugs common to titles ported from consoles to PCs will decrease. A gamer can dream, right?
Subject: General Tech | March 28, 2013 - 03:47 PM | Ken Addison
Tagged: sli, podcast, pcper, nvidia, kepler, HD7790, GTX 650Ti BOOST, GCN, frame rating, crossfire, amd
PC Perspective Podcast #244 - 03/28/2013
Join us this week as we discuss the launch of Frame Rating, HD 7790 vs. GTX 650Ti BOOST, and news from GDC
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:19:22
Week in Review:
News items of interest:
1:00:00 Is there a Flash flood coming?
1:12:00 Hardware/Software Picks of the Week:
Allyn: Samsung 840 250GB for $147 on Buy.com (checkout price) (ends 3/31)
1-888-38-PCPER or firstname.lastname@example.org
Subject: General Tech | March 28, 2013 - 01:59 PM | Jeremy Hellstrom
Alienware Aurora r4 Core i7 Gaming Desktop (Liquid-cooled) w/ GeForce GTX 660 for $1,299.00 with free shipping (normally $1,400.00 - use coupon code: LDLK0B2PNL0$32).
Featured $1,299 configuration includes Core i7-3820 Quad-core CPU (up to 4.1GHz clock speed, 10MB cache), 8GB RAM, 1TB HDD, 1.5GB GeForce GTX 660, 24X DVD Burner, and Windows 7 Home Premium 64-bit OS.
(Optional 4GB GDDR5 NVIDIA GeForce GTX 690 Graphics is now available on Alienware Aurora models.)