Subject: General Tech | August 22, 2014 - 01:30 PM | Jeremy Hellstrom
Tagged: byod, security, Android
In the corporate BYOD crapshoot, Android devices frequently connect to secure resources, which raises security concerns for many IT workers. The OS is not as secure as many would like: good enough for home use, but not for those who truly want to keep their data safe. The majority of exploits come from insecure apps rather than an inherent problem with the OS, which has led a group of researchers to propose an Android Security Module framework. Root the phone once to add the framework to Android and you gain the ability to restrict what apps can share without preventing them from running. The example offered to The Register was stopping WhatsApp from uploading contact information while leaving the app otherwise functional. This could also let you configure a phone in a way similar to BlackBerry's Balance feature, segregating work data from personal.
"An international group of researchers believes Android needs more extensible security, and is offering up a framework they hope either Google or mobe-makers will take for a spin."
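The mediation model described above boils down to policy hooks that sit between an app and a sensitive operation. A minimal sketch of the idea follows; every name here (the policy table, the operation strings, the functions) is invented for illustration and is not the framework's real API.

```python
# Hypothetical sketch of hook-based mediation: a policy can deny a
# single operation (here, reading contacts) without killing the app.

POLICY = {
    ("com.whatsapp", "READ_CONTACTS"): "deny",
}

def check_operation(package, operation):
    """Return True if the operation is allowed, False if silently blocked."""
    return POLICY.get((package, operation), "allow") != "deny"

def read_contacts(package):
    if not check_operation(package, "READ_CONTACTS"):
        return []            # app keeps running, it just sees an empty list
    return ["Alice", "Bob"]  # stand-in for the real contact store
```

The key design point is that a denied app receives an empty result rather than a crash or a permission error, which is what lets WhatsApp keep working while its contact upload quietly comes back empty.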
Here is some more Tech News from around the web:
- Acer unveils 8-core 4G LTE smartphone in Taiwan @ DigiTimes
- Cyber security experts find 92 percent successful Gmail hack @ The Inquirer
- TELEPORTABLE storage? Atlantis Computing's PR bods jump the shark @ The Register
- Microsoft ropes in Opera Mini as default Nokia dumbphone browser @ The Register
- NETGEAR EX6200 @ Hardwareheaven
NVIDIA Reveals 64-bit Denver CPU Core Details, Headed to New Tegra K1 Powered Devices Later This Year
Subject: Processors | August 12, 2014 - 01:06 AM | Tim Verry
Tagged: tegra k1, project denver, nvidia, Denver, ARMv8, arm, Android, 64-bit
During GTC 2014 NVIDIA launched the Tegra K1, a new mobile SoC that contains a powerful Kepler-based GPU. Initial processors (and the resultant design wins such as the Acer Chromebook 13 and Xiaomi Mi Pad) utilized four ARM Cortex-A15 cores for the CPU side of things, but later this year NVIDIA is deploying a variant of the Tegra K1 SoC that switches out the four A15 cores for two custom (NVIDIA developed) Denver CPU cores.
The custom 64-bit Denver CPU cores use a 7-way superscalar design and run a custom instruction set. Denver is a wide but in-order architecture that can issue up to seven operations per clock cycle. NVIDIA uses on-the-fly binary translation to convert ARMv8 instructions into its native microcode before execution. A software layer and a 128MB cache power the Dynamic Code Optimization technology, which examines and optimizes ARM code, converts it to the custom instruction set, and stores the translated microcode of frequently used routines for reuse (the cache can be bypassed for infrequently executed code). Using the wider execution engine and Dynamic Code Optimization (which is transparent to ARM developers and does not require updated applications), NVIDIA touts the dual-core Denver Tegra K1 as being at least as powerful as the quad- and octo-core packing competition.
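The translate-and-cache flow can be modeled in a few lines. This is a toy illustration of the general technique, not NVIDIA's implementation: the hotness threshold, the "translation" string, and the class name are all made up.

```python
# Toy model of a Dynamic Code Optimization flow: code blocks that run
# often enough get translated once and served from a cache afterward;
# cold code is translated on the fly and not retained.

HOT_THRESHOLD = 3

class DynamicOptimizer:
    def __init__(self):
        self.counts = {}
        self.cache = {}   # stands in for the 128MB optimization cache

    def execute(self, block):
        if block in self.cache:
            return self.cache[block]          # fast path: cached microcode
        self.counts[block] = self.counts.get(block, 0) + 1
        translated = "uops(" + block + ")"    # pretend binary translation
        if self.counts[block] >= HOT_THRESHOLD:
            self.cache[block] = translated    # hot code: keep the result
        return translated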
Further, NVIDIA has claimed that at peak throughput (and in specific situations where application code and DCO can take full advantage of the 7-way execution engine) the Denver-based mobile SoC handily outpaces Intel’s Bay Trail, Apple’s A7 Cyclone, and Qualcomm’s Krait 400 CPU cores. In the results of a synthetic benchmark test provided to The Tech Report, the Denver cores were even challenging Intel’s Haswell-based Celeron 2955U processor. Keeping in mind that these are NVIDIA-provided numbers and likely the best results one can expect, Denver still looks quite a bit more capable than existing mobile cores. (Note that the Haswell chip would likely pull much farther ahead when presented with applications that cannot be easily executed in-order with limited instruction parallelism.)
NVIDIA is ratcheting up mobile CPU performance with its Denver cores, but it is also aiming for an efficient chip and has implemented several power saving tweaks. Beyond the decision to go with an in-order execution engine (with DCO hopefully making up for most of that), the beefy Denver cores reportedly feature low latency power state transitions (e.g. between active and idle states), power gating, and dynamic voltage and clock scaling. The company claims that “Denver's performance will rival some mainstream PC-class CPUs at significantly reduced power consumption.” In real terms, swapping the Tegra K1's quad-core A15 design for two Denver cores should not result in significantly lower battery life. The two K1 variants are said to be pin compatible, so OEMs and developers can easily bring upgraded models with the faster Denver cores to market.
For those curious, in the Tegra K1 the two Denver cores (clocked at up to 2.5GHz) share a 16-way L2 cache, and each has a 128KB instruction and a 64KB data L1 cache to itself. The 128MB Dynamic Code Optimization cache is held in system memory.
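For a sense of what 16-way associativity implies, the set count falls straight out of the cache geometry. The 16-way figure is from NVIDIA's disclosure; the 2MB capacity and 64-byte line size in this back-of-the-envelope check are assumptions for the sake of the arithmetic.

```python
# Cache geometry arithmetic: sets = capacity / (ways * line size).

def cache_sets(capacity_bytes, ways, line_bytes):
    """Number of sets in a set-associative cache."""
    return capacity_bytes // (ways * line_bytes)

# Assumed 2MB capacity and 64B lines; 16-way is the disclosed figure.
sets = cache_sets(2 * 1024 * 1024, 16, 64)   # -> 2048 sets
```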
Denver is the first custom 64-bit ARM processor for Android (Apple’s A7 was the first 64-bit smartphone chip overall), and NVIDIA is working on supporting the next generation of the Android OS, known as Android L.
The dual Denver core Tegra K1 is coming later this year and I am excited to see how it performs. The current K1 chip already has a powerful fully CUDA compliant Kepler-based GPU which has enabled awesome projects such as computer vision and even prototype self-driving cars. With the new Kepler GPU and Denver CPU pairing, I’m looking forward to seeing how NVIDIA’s latest chip is put to work and the kinds of devices it enables.
Are you excited for the new Tegra K1 SoC with NVIDIA’s first fully custom cores?
A Tablet and Controller Worth Using
An interesting thing happened a couple of weeks back while I was standing on stage at our annual PC Perspective Hardware Workshop during Quakecon in Dallas, TX. When NVIDIA offered up a SHIELD (now called the SHIELD Portable) for raffle, the audience cheered. And not just a little bit, but more than they did for nearly any other hardware offered up during the show, including motherboards, graphics cards, monitors, even complete systems. It took me aback: NVIDIA SHIELD was a popular brand, a name that was recognized, and apparently a product that people wanted to own. You might not have guessed that based on the sales numbers SHIELD has put forward, though. Even though it appeared to have significant mind share, market share was lacking.
Today though, NVIDIA prepares the second product in the SHIELD lineup, the SHIELD Tablet, a device the company hopes will improve on the SHIELD concept and encourage more users to sign on. It's a tablet first (not a tablet with a controller attached), it has a more powerful SoC that can utilize different APIs for unique games, it can more easily be used in a 10-ft console mode, and SHIELD-specific features like Game Stream are included and enhanced.
The question of course though is easy to put forward: should you buy one? Let's explore.
The NVIDIA SHIELD Tablet
At first glance, the NVIDIA SHIELD Tablet looks like a tablet. That actually isn't a negative selling point though, as the SHIELD Tablet can and does act like a high end tablet in nearly every way: performance, function, looks. We originally went over the entirety of the tablet's specifications in our first preview last week but much of it bears repeating for this review.
The SHIELD Tablet is built around the NVIDIA Tegra K1 SoC, the first mobile silicon to implement the Kepler graphics architecture. That feature alone makes this tablet impressive, as it offers graphics performance not seen in a form factor like this before. CPU performance is also improved over the Tegra 4 processor, but the graphics portion of the die easily sees the largest performance jump.
An 8-in 1920x1200 IPS screen faces the user, bringing the option of full 1080p content that the first SHIELD Portable lacked. The screen is bright and crisp, easily viewable for gaming in bright lighting and in plenty of other environments. Though the Xiaomi Mi Pad 7.9 has a higher 2048x1536 resolution screen, the form factor of the SHIELD Tablet is much more in line with what NVIDIA built with the Tegra Note 7.
The First with the Tegra K1 Processor
Back in May a Chinese company announced what was then the first and only product based on NVIDIA’s Tegra K1 SoC, the Xiaomi Mi Pad 7.9. Since then we have had a couple of other products hit our news wire including Google’s own Project Tango development tablet. But the Xiaomi is the first to actually be released, selling through 50,000 units in four minutes according to some reports. I happened to find one on Aliexpress.com, a Chinese sell-through website, and after a few short days the DHL deliveryman dropped the Tegra K1 powered machine off at my door.
If you are like me, the Xiaomi name was a new one. Xiaomi is a privately owned company from Beijing that has become one of China’s largest electronics companies since jumping into the smartphone market in 2011. The Mi Pad marks the company’s first attempt at a tablet, and the partnership with NVIDIA to be an early adopter of the Tegra K1 seems to be making waves.
The Tegra K1 Processor
The Tegra K1 SoC was first revealed at CES in January of 2014, and with it came a heavy burden of expectation from NVIDIA directly, as well as from investors and the media. The first SoC from the Tegra family to have a GPU built from the ground up by NVIDIA engineers, the Tegra K1 gets its name from the Kepler family of GPUs. It also happens to get the base of its architecture there as well.
The CPU side of the Tegra K1 looks very familiar: four ARM Cortex-A15 “r3” cores and 2MB of L2 cache, with a fifth A15 core used for low-power situations. This 4+1 design is the same one introduced with the Tegra 4 processor last year and allows NVIDIA to implement its own unique take on “big.LITTLE”. Tegra K1 includes some slight modifications to the cores that improve performance and efficiency, but not by much; the main CPU is very similar to the Tegra 4's.
The focus of the Tegra K1 is the GPU, now powered by NVIDIA’s Kepler architecture. The K1 features 192 CUDA cores in a design very similar to a single SMX on today’s GeForce GTX 700-series graphics cards. That brings OpenGL ES 3.0 support but, much more importantly, OpenGL 4.4 and DirectX 11 as well. The ambition of bringing modern, quality PC gaming to mobile devices is closer than you ever thought possible with this product, and the demos I have seen running on reference designs are enough to leave your jaw on the floor.
By far the most impressive part of Tegra K1 is the implementation of a full Kepler SMX in a chip that will run at well under 2 watts. While NVIDIA had long planned to merge its primary GPU architectures between mobile and discrete, the choice did not come without risk. When the company was building the first Tegra part it essentially had to bet on where the world of mobile technology would be years down the road. NVIDIA could have continued to evolve the initial GPU IP used in the first Tegra, adding feature support and increasing die area to improve overall GPU performance, but instead it positioned a “merge point” with Kepler in 2014. The team at NVIDIA saw that the discontinuity we are seeing today with Tegra K1 was within reach, but in truth they had to suffer through the first iterations of Tegra GPU designs knowing they were inferior to the design coming with Kepler.
You can read much more on the technical detail of the Tegra K1 SoC by heading over to our launch article that goes into the updated CPU design, as well as giving you all the gore behind the Kepler integration.
By far the most interesting aspect of the Xiaomi Mi Pad 7.9 tablet is the decision to integrate the Tegra K1 processor. Performance and battery life comparisons with other 7 to 8-in tablets will likely not impact how it sells in China, but the results may mean the world to NVIDIA as it works to convince other vendors to integrate the SoC.
Subject: General Tech, Mobile | July 16, 2014 - 04:11 AM | Scott Michaud
Tagged: google, google play, Android, android l
If you have looked at Google's recent design ideologies, first announced at Google I/O 2014, you will see them revolve around skeuomorphism in its most basic sense. By that, I do not mean that they want to make it look like a folder, a metal slab, or a radio button. Their concept is that objects should look like physical objects which behave with physical accuracy, even though they are just simulations of light.
Image Credit: Android Police (and their source)
Basically, rather than having a panel with a toolbar, buttons, and columns, have a background with a page on it. Interface elements which are affected by that panel are on it, while more global actions are off of it. According to Android Police, who make clear that they do not have leaked builds and readers should not believe anything until/unless it ships, the Google Play Store will be redesigned with this consistent, albeit broad, design language.
Basically, if you are a navigation bar, pack your desk and get out.
If true, when will these land? Anyone's guess. One speculation is that it will be timed with the release of Android "L" in Autumn. Their expectation, however, is that it will be one of many updates Google will make across their products in a rolling pattern. Either way, I think it looks good... albeit similar to many modern websites.
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | July 7, 2014 - 04:06 AM | Scott Michaud
Tagged: tegra k1, OpenGL ES, opengl, Khronos, google io, google, android extension pack, Android
Sure, this is a little late. Honestly, when I first heard the announcement, I did not see much news in it. The slide from the keynote (below) showed four points: Tessellation, Geometry Shaders, Computer [sic] Shaders, and ASTC Texture Compression. At the time, I thought tessellation and geometry shaders were part of the OpenGL ES 3.1 spec, like compute shaders. This led to my immediate reaction: "Oh cool. They implemented OpenGL ES 3.1. Nice. Not worth a news post."
Image Credit: Blogogist
Apparently, they were not part of the ES 3.1 spec (although compute shaders are). My mistake. It turns out that Google is cooking up their own vendor-specific extensions. This is quite interesting, as it adds functionality to the API without the developer needing to target a specific GPU vendor (INTEL, NV, ATI, AMD), wait for approval from the Architecture Review Board (ARB), or use multi-vendor extensions (EXT). In other words, it sounds like developers can target Google as the vendor without knowing the actual hardware.
Hiding the GPU vendor from the developer is not the only reason for Google to host their own vendor extension. The added features are mostly from full OpenGL. This makes sense, because it was announced with NVIDIA and their Tegra K1, Kepler-based SoC. Full OpenGL compatibility was NVIDIA's selling point for the K1, due to its heritage as a desktop GPU. But, instead of requiring apps to be programmed with full OpenGL in mind, Google's extension pushes it to OpenGL ES 3.1. If the developer wants to dip their toe into OpenGL, then they could add a few Android Extension Pack features to their existing ES engine.
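In practice, "dipping a toe in" means a runtime capability check: look for the extension pack in the GL extension string and choose a render path accordingly. The extension name below is the one Google published for the pack; the path labels are invented for this sketch.

```python
# Sketch of the capability check a game engine would do at startup:
# enable the desktop-style features if the Android Extension Pack is
# advertised, otherwise fall back to the plain ES 3.1 path.

AEP = "GL_ANDROID_extension_pack_es31a"

def pick_render_path(gl_extensions):
    """gl_extensions: the space-separated string from glGetString(GL_EXTENSIONS)."""
    if AEP in gl_extensions.split():
        return "es31+aep"   # tessellation, geometry shaders, ASTC, etc.
    return "es31"           # baseline OpenGL ES 3.1 feature set
```

Because the single extension string gates the whole feature set, an engine only needs one branch rather than per-vendor checks.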
Epic Games' Unreal Engine 4 "Rivalry" Demo from Google I/O 2014.
The last feature, ASTC Texture Compression, was an interesting one. Apparently the Khronos Group, owners of OpenGL, were looking for a new generation of texture compression technologies. NVIDIA suggested their ZIL technology. ARM and AMD also proposed "Adaptive Scalable Texture Compression". ARM and AMD won, although the Khronos Group stated that the collaboration between ARM and NVIDIA made both proposals better than either in isolation.
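The "adaptive scalable" part of ASTC is that every compressed block occupies 128 bits regardless of how many texels it covers, so artists trade quality for size by picking the block footprint. The block dimensions below are from the format's 2D profile; the arithmetic is just the definition.

```python
# ASTC bits-per-texel: fixed 128-bit blocks divided by block footprint.

BLOCK_BITS = 128

def bits_per_texel(w, h):
    return BLOCK_BITS / (w * h)

# 4x4 blocks -> 8.0 bpp, 8x8 -> 2.0 bpp, 12x12 -> ~0.89 bpp
rates = {(w, h): bits_per_texel(w, h) for (w, h) in [(4, 4), (8, 8), (12, 12)]}
```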
Android Extension Pack is set to launch with "Android L". The next release of Android is not currently associated with a snack food. If I was their marketer, I would block out the next three versions as 5.x, and name them (L)emon, then (M)eringue, and finally (P)ie.
Would I do anything with the two skipped letters before pie? (N)(O).
Subject: General Tech | July 3, 2014 - 12:39 PM | Jeremy Hellstrom
Tagged: linux, linaro, juno, google, armv8-a, ARMv8, arm, Android
By now you should have read Ryan's post or listened to Josh talk about Juno on the PCPer Podcast, but if you find yourself hungry for more information you can visit The Tech Report. They discuss how 64-bit Linaro is already able to take advantage of one of big.LITTLE's power efficiency optimizations, called Global Task Scheduling. As Linaro releases monthly updates, you can expect to see more features and better implementations as their take on the Android Open Source Project evolves. Expect to see more of Juno and ARMv8 on review sites as we work out just how to benchmark these devices.
"ARM has created its own custom SoC and platform for 64-bit development. The folks at Linaro have used this Juno dev platform to port an early version of Android L to the ARMv8 instruction set. Here's a first look at the Juno hardware and the 64-bit software it enables."
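The core idea of Global Task Scheduling is that, unlike earlier cluster-migration schemes, the scheduler sees every big and LITTLE core at once and places each task on the cheapest core that can handle its load. The sketch below is a toy illustration of that placement decision; the threshold and load values are made up.

```python
# Toy Global Task Scheduling: heavy tasks go to big cores, light tasks
# to LITTLE cores, with all cores visible to the scheduler at once.

BIG_THRESHOLD = 0.6   # loads above this get a big core (invented value)

def assign(tasks, big_cores, little_cores):
    """tasks: dict of name -> load in [0, 1]. Returns name -> core."""
    placement, big_i, little_i = {}, 0, 0
    for name, load in sorted(tasks.items()):
        if load > BIG_THRESHOLD:
            placement[name] = big_cores[big_i % len(big_cores)]
            big_i += 1
        else:
            placement[name] = little_cores[little_i % len(little_cores)]
            little_i += 1
    return placement
```

A game thread at 90% load lands on a big core while a background mail sync idles on a LITTLE core, which is where the power savings come from.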
Here is some more Tech News from around the web:
- Running Cisco's VoIP manager? Four words you don't want to hear: 'Backdoor SSH root key' @ The Register
- Latest Nexus 9 leak outs tablet's 5GB RAM, 2560x1600 screen @ The Inquirer
- Twitter takes on GOOGLE, swallows wannabe YouTube firm @ The Register
- Samsung will halt plasma TV production before the end of the year @ The Inquirer
- Previously male-only Hearthstone competition now open to all genders @ Polygon
Subject: Processors, Mobile | June 23, 2014 - 01:08 PM | Ryan Shrout
Tagged: snapdragon, qualcomm, gaming, Android, adreno
Today Qualcomm published a 22-page white paper that keys in on the company's focus on Android gaming and the benefits Qualcomm SoCs offer. As the dominant SoC vendor in the Android ecosystem of smartphones, tablets, and handhelds (with more than a 32% share of shipments in Q2 2013), QC is able to offer a unique combination of solutions to both developers and gamers that pushes Android gaming toward higher fidelity and more robust gameplay.
According to the white paper, Android gaming is the fastest growing segment of the gaming market with a 30% compound annual growth rate from 2013 to 2015, as projected by Gartner. Experiences for mobile games have drastically improved since Android was released in 2008 with developers like Epic Games and the Unreal Engine pushing visuals to near-console and near-PC qualities.
Qualcomm is taking a heterogeneous approach to address the requirements of gaming that include AI execution, physics simulation, animation, low latency input and high speed network connectivity in addition to high quality graphics and 3D rendering. Though not directly a part of the HSA standards still in development, the many specialized engines that Qualcomm has developed for its Snapdragon SoC processors including traditional CPUs, GPUs, DSPs, security and connectivity allow the company to create a solution that is built for Android gaming dominance.
In the white paper Qualcomm dives into the advantages that the Krait CPU architecture offers for CPU-based tasks as well as the power of the Adreno 4x series of GPUs that offer both raw performance and the flexibility to support current and future gaming APIs. All of this is done with single-digit wattage draw and a passive, fanless design and points to the huge undertaking that mobile gaming requires from an engineering and implementation perspective.
For developers, the ability to target Snapdragon architectures with a single code path that can address a scalable product stack allows for the least amount of development time and the most return on investment possible. Qualcomm continues to support the development community with tools and assistance to bring out the peak performance of Krait and Adreno to get games running on lower power parts as well as the latest and upcoming generations of SoCs in flagship devices.
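A single code path that scales across a product stack usually means one renderer with quality presets chosen by detected hardware tier at startup. The sketch below illustrates that pattern in general terms; the tier names and settings are invented, not Qualcomm's.

```python
# One code path, scalable settings: the game ships one renderer and
# picks a quality preset for the detected SoC tier at launch.

QUALITY = {
    "flagship":   {"resolution": (1920, 1080), "shadows": "high"},
    "mainstream": {"resolution": (1280, 720),  "shadows": "medium"},
    "entry":      {"resolution": (960, 540),   "shadows": "off"},
}

def settings_for(tier):
    # Unknown hardware gets the safe low-end defaults.
    return QUALITY.get(tier, QUALITY["entry"])
```

The developer writes and tests one engine; only the preset table grows as new SoC tiers ship, which is the "least development time" claim in practice.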
It is great to see Qualcomm focus on this aspect of the mobile market; the challenges it presents demand strong dedication from these engineering teams. Creating compelling gaming experiences with high quality imagery while maintaining the required power envelope is a task many other companies have struggled with.
Check out the new landing page over at Qualcomm if you are interested in more technical information, as well as direct access to the white paper detailing the work Qualcomm is putting into its Snapdragon line of SoCs for gamers.
Subject: General Tech, Mobile | June 5, 2014 - 02:51 PM | Scott Michaud
Tagged: tegra k1, tegra, project tango, nvidia, google, Android
Today, Google announced their "Project Tango" developer kit for tablets with spatial awareness. With a price tag of $1,024 USD, it is definitely aimed at developers. In fact, the form to be notified about the development kit has a required check box that is labeled, "I am a developer". Slightly above the form is another statement, "These development kits are not a consumer device and will be available in limited quantities".
So yes, you can only buy these if you are a developer.
The technology is the unique part. Project Tango is aimed at developers making apps that understand the 3D world around the tablet. Two example categories they have already experimented with are robotics and computer vision. Of course, this could also translate to augmented reality games and mapping.
While Google has not been too friendly with OpenCL in its Android platform, it makes sense that they would choose a flexible GPU with a wide (and deep) range of API support. While other SoCs are probably capable enough, the Kepler architecture in the Tegra K1 is about as feature-complete as you can get in a mobile chip, because it is basically a desktop chip.
Google's Project Tango is available to developers, exclusively, for $1,024 and ships later this month.
Also, that price is clearly a pun.
Subject: Storage | June 2, 2014 - 07:00 AM | Sebastian Peak
Tagged: wireless storage, ios, Hard Drive, computex 2014, Android, airplay
Today Corsair announces the Voyager Air 2, a wireless hard drive with 1TB of storage that can connect to iOS and Android devices, as well as PCs and Macs.
The Voyager Air 2 is battery-powered and rechargeable (Corsair estimates 7-hour battery life from the high-capacity rechargeable lithium-ion battery), and the included software syncs with Dropbox and Google Drive and supports AirPlay streaming to an Apple TV. It supports 802.11b/g/n Wi-Fi connections for multiple users within a 90 foot range, and can stream 720p high-definition video to up to five devices at once.
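The five-stream claim passes a quick sanity check. Assuming roughly 4 Mbps per 720p H.264 stream (a typical rate, not Corsair's published figure), the aggregate stays well within what a single 802.11n radio can sustain in practice.

```python
# Back-of-the-envelope bandwidth check for five simultaneous 720p
# streams. The per-stream bitrate is an assumed typical H.264 rate.

MBPS_PER_720P_STREAM = 4   # assumption, not Corsair's spec
STREAMS = 5

total_mbps = MBPS_PER_720P_STREAM * STREAMS   # -> 20 Mbps aggregate
```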
And the Voyager Air 2 has quite a bit more functionality than just streaming content over Wi-Fi. It can serve as a wireless hub to share internet access via wireless passthrough, and it also functions as a USB 3.0 drive for fast data transfers when connected to a computer.
The Voyager Air 2 will be available this month with a suggested price of $179.99.