Subject: Mobile | May 2, 2014 - 10:34 PM | Tim Verry
Tagged: Windows 8.1, thinkpad tablet, thinkpad 10, Lenovo, Intel, Bay Trail
Details on a new 10-inch tablet from Lenovo emerged after a product page was briefly posted on the Lenovo Australia site ahead of an official announcement. The page was quickly taken down, but not before German technology site TabTech snagged all of the details and photos of the new ThinkPad-branded tablet.
The leaked ThinkPad 10 joins the existing ThinkPad 8 tablet, which was first shown off at CES 2014 earlier this year. The business-focused device runs x86 hardware and the full version of Microsoft's Windows 8.1 operating system. The ThinkPad 10 sports rounded edges, a hefty bezel, and (if it follows the ThinkPad 8) a machine-cut aluminum back panel with ThinkPad branding. The front of the device hosts a 10-inch display with a resolution of 1920x1200, a 2MP webcam, and a Windows button. The top corner of the tablet hosts an 8MP rear camera with LED flash. Exact dimensions and weight are still unknown.
Internally, Lenovo is using a quad core Bay Trail SoC clocked at 1.6 GHz, up to 4GB of RAM, and up to 128GB of internal storage. If the ThinkPad 8 is any indication, the base models should start with 2GB of RAM, 64GB of storage, and a Wi-Fi chip. From there, users will be able to choose versions of the ThinkPad 10 with more memory, more storage, LTE cellular data connections, and stylus options.
Additionally, the ThinkPad 10 will support basic covers, basic docks that allow it to be used in tent mode, keyboard docks, and keyboard cases. Unfortunately, the keyboard dock does not appear to latch onto the tablet, and once docked the screen cannot be rotated further, unlike devices such as the Transformer Book T100 and the upcoming Aspire Switch 10. With that said, from the information available so far, I am interested in the ThinkPad 10 from a mobile productivity standpoint (I have been on the fence about getting a T100 for months now, heh). If Lenovo can maintain ThinkPad quality in this tablet and its keyboard options, I will definitely be considering it.
With the ThinkPad 8 starting at $399 for the Wi-Fi-only model with 2GB RAM and 64GB storage, users can expect the ThinkPad 10 to start at $499 or higher. Unfortunately, as with most product launches and leaks, official pricing and availability are still unknown.
Stay tuned to PC Perspective for more details on the ThinkPad 10. In the meantime, check out our video of the ThinkPad 8 to get an idea of the aesthetics and performance of the upcoming Windows 8.1 tablet!
Subject: Mobile | May 2, 2014 - 12:06 AM | Tim Verry
Tagged: tablet, Intel, Clover Trail+, atom z2560, Android
Acer is introducing a new 7-inch tablet due for release in June. The upcoming Iconia One 7 is an Intel-powered tablet running Google's Android 4.2 operating system. It is a budget device that cuts corners on the operating system and hardware so that it can reach a starting price of $129.99.
The Iconia One 7 tablet will be available in black, blue, red, pink, and white, and features a 7-inch IPS display with a 16:10 resolution of 1280x800, a 5 megapixel rear camera, and a 0.3 megapixel webcam. The tablet has rounded corners and edges (especially on the back panel).
Internally, Acer has chosen to use a dual core SoC based on Intel's previous generation Clover Trail+ architecture (2-wide, in-order cores that support Hyper-Threading). The chip features two CPU cores clocked at 1.6 GHz, 1 MB of cache, and a PowerVR SGX544 GPU. The chip is paired with 1GB of system RAM and either 8GB or 16GB of internal flash storage. The internal storage can be expanded with up to a 32GB microSD card. The tablet is powered by a 3,700 mAh battery.
The tablet hardware is reportedly compatible with Android 4.4, but Acer has yet to outline an upgrade path.
Acer has obviously cut corners here, both on the hardware and software. However, these sacrifices have allowed the company to offer up a tablet at a base price of $129.99. It will not be the fastest device, but it should be a good-enough web browsing and reading tablet for those that prefer the portable 7-inch form factor. (Personally, I would have liked to see a Bay Trail-powered variant at a slightly higher price point.) The Iconia One 7 will be available in Africa, Europe, and the Middle East by the middle of this month and will hit US shores in June.
Subject: Processors, Mobile | April 30, 2014 - 07:06 PM | Ryan Shrout
Tagged: Intel, clover trail, Bay Trail, arm, Android
While we are still waiting for those mysterious Intel Bay Trail based Android tablets to find their way into our hands, we met with ARM today to discuss quite a few topics. One of them centered around the cost of binary translation - the process of converting application code compiled for one architecture so that it can run on a different one. In this case, that means running native ARMv7 Android applications on an x86 platform like Bay Trail from Intel.
Based on results presented by ARM, so take everything here in that light, more than 50% of the top 250 applications in the Android Play Store require binary translation to run. 23-30% have been compiled to x86 natively, 20-21% run through Dalvik and the rest have more severe compatibility concerns. That paints a picture of the current state of Android apps and the environment in which Intel is working while attempting to release Android tablets this spring.
Performance of these binary translated applications will be lower than it would be natively, as you would expect, but to what degree? These results, again gathered by ARM, show a 20-40% performance drop in games like Riptide GP2 and Minecraft, along with increased "jank" - a measure of smoothness and stutter tied to variances in frame rates. These are applications that exist in a native mode but were forced to run through binary conversion as well. The insinuation is that we can now forecast the performance penalty for applications that lack a natively compiled version and must run in translation mode.
The result of this is lower battery life as it requires the CPU to draw more power to keep the experience close to nominal. While gaming on battery, which most people do with items like the Galaxy Tab 3 used for testing, a 20-35% decrease in game time will hurt Intel's ability to stand up to the best ARM designs on the market.
Other downsides to this binary translation include longer load times for applications, lower frame rates and longer execution time. Of course, the Galaxy Tab 3 10.1 is based on Intel's Atom Z2560 SoC, a somewhat older Clover Trail+ design. That is the most modern currently available Android platform from Intel as we are still awaiting Bay Trail units. This also explains why ARM did not do any direct performance comparisons to any devices from its partners. All of these results were comparing Intel in its two execution modes: native and translated.
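Since "jank" comes up a few times above, here is a minimal sketch of how frame-rate variance can be turned into a stutter count. The 1.5x-median threshold and the function name are my own illustrative choices, not ARM's actual methodology:

```python
# Hedged sketch: quantify "jank" (stutter) from a list of frame timestamps.
# The 1.5x-median threshold is an illustrative assumption, not ARM's metric.
from statistics import median

def jank_stats(timestamps_ms):
    """Return (average_fps, jank_count) for a list of frame timestamps in ms."""
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg_fps = 1000.0 / (sum(deltas) / len(deltas))
    threshold = 1.5 * median(deltas)      # frames much slower than typical
    jank_count = sum(1 for d in deltas if d > threshold)
    return avg_fps, jank_count

# Example: mostly 16.7 ms frames with two 50 ms stutters mixed in.
frames = [0, 16.7, 33.4, 83.4, 100.1, 116.8, 166.8, 183.5]
fps, janks = jank_stats(frames)
```

The point is that two applications can report the same average frame rate while one of them stutters noticeably, which is exactly the gap ARM's jank numbers are meant to expose.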
Without a platform based on Bay Trail to look at and test, we of course have to treat the results that ARM presented as a placeholder at best. It is possible that Intel's performance with Silvermont is high enough to make up for these binary translation headaches for as long as it takes x86 to become more ubiquitous. And in fairness, we have seen many demonstrations from Intel directly that show the advantage of performance and power efficiency going in the other direction - in Intel's favor. This kind of debate requires more in-person analysis, with hardware in our hands soon and with a larger collection of popular applications.
More from our visit with ARM soon!
Subject: General Tech | April 29, 2014 - 04:14 PM | Jeremy Hellstrom
Tagged: TrustZone, security, Puma+, Mullins, mobile, Kabini, Jaguar, boost, beema, amd, AM1
Beema and Mullins have arrived, and by now you must have read Josh's coverage, but you might be aching for more. The Tech Report were present at the unveiling and came prepared, with a USB 3.0 solid-state drive containing their own preferred testing applications and games. Not only do you get a look at how the Mullins tablet handled the testing, but you can also see how it compares to Kabini and Bay Trail. Check out the performance results as well as their take on the power consumption and new security features of the new pair of chips from AMD, which come bearing more gifts than we had thought they would.
"A couple weeks ago, AMD flew us down to its Austin, Texas campus for a first look at Mullins and Beema, two low-power APUs aimed at the next wave of Windows tablets and low-cost laptops. Today, we're able to share what we learned from that expedition—as well as benchmarks from the first Mullins tablet."
Here is some more Tech News from around the web:
- AMD launches third generation Mullins and Beema APUs @ The Inquirer
- AMD Beema and Mullins APU Performance – 3rd Generation APUs @ Legit Reviews
- AMD Mullins & Beema Mobile APUs Preview @ Hardware Canucks
- Drink me: Adobe pours Flash Player bug squash @ The Register
- Über-secure Blackphone crypto-mobe spills its silicon guts @ The Register
- Linksys PLEK500 500Mbps Powerline Homeplug AV2 Kit @ NikKTech
- Testing NVIDIA Optimus / DRI PRIME On Ubuntu 14.04 @ Phoronix
Subject: Mobile | April 28, 2014 - 04:52 PM | Jeremy Hellstrom
Tagged: msi, gs70, stealth
Back during CES we saw several teasers showing off the MSI GS70 STEALTH gaming laptop, and now we finally have a review thanks to Hardware Secrets. The insane specs of this machine mean that the power adapter is rated at 120W, although at a hair under 6 lbs the laptop itself is surprisingly light. If you can afford the price tag, this is a serious contender for the most powerful gaming laptop currently available, barring some that have opted for SLI graphics.
"The MSI GS70 STEALTH is a gaming laptop computer with a fourth-generation Core i7-4700HQ processor, 16 GiB of RAM, 17" Full HD screen, two SSD units in RAID 0, a GeForce GTX 765M video card, high-end Gigabit Ethernet and Wi-Fi controllers, and much more. Let's analyze this powerful machine."
Here are some more Mobile articles from around the web:
- Dell XPS 12 2-in-1 Ultrabook @ Kitguru
- Dell XPS 15 @ The Inquirer
- MSI GE60 2PE Apache Pro 15.6in gaming notebook @ Kitguru
- Acer Chromebook C720 SSD Upgrade To MyDigitalSSD M.2 (NGFF) 128GB SSD – Worlds Easiest Upgrade @ The SSD Review
- Enermax TwisterOdio 16 CP-008 Notebook Cooling Pad Review @ HiTech Legion
- Sony Xperia Z2 Tablet @ The Inquirer
- Gigabyte Tegra Note 7 @ Legion Hardware
- S5 vs Nexus 5: Touchwiz vs vanilla Android 4.4 Kitkat @ The Inquirer
- HTC One M8 Smartphone Performance Review @ Legit Reviews
- HTC One M8 @ The Inquirer
Subject: General Tech, Systems, Mobile | April 27, 2014 - 03:30 AM | Scott Michaud
Tagged: nvidia, shield, shield 2, AnTuTu
VR-Zone is claiming that this is the successor to NVIDIA's SHIELD portable gaming system. An AnTuTu benchmark was found for a device called, "NVIDIA test model(SHIELD)" with an "NVIDIA Gefroce(Kepler Graphics)" GPU, typos left as-is. My gut expects that it is valid, but I hesitate to vouch for the rumor. If it even came from NVIDIA, which the improper spelling and capitalization of "GeForce" calls into question, it could easily be an internal prototype, and the "SHIELD" label (which is properly spelled and capitalized) may even have been applied incorrectly.
Image Credit: AnTuTu.com
As for the camera listing, it would make sense for the SHIELD to get one at standard definition (0.3MP -- probably 640x480). The fact that the original SHIELD shipped without one at all still confuses me. The low resolution sensor still does not make much sense, seeming like an almost pointless upgrade, but it could be used by NVIDIA for a specific application or built-in purpose.
Or, it could be an irrelevant benchmark listing.
Either way, there are rumors floating around about a SHIELD 2 being announced at E3 in June. It is unlikely that NVIDIA will give up on the handheld any time soon. Whether that means new hardware, versus more software updates, is anyone's guess. The Tegra K1 would make a good SoC to launch it on, however, with its full OpenGL 4.4 and compute support (the hardware supports up to OpenCL 1.2, although driver support will apparently be "based on customer needs". PDF - page 8).
Waiting. Seeing. You know the drill.
Subject: General Tech, Mobile | April 27, 2014 - 01:50 AM | Scott Michaud
Tagged: unreal engine 4, ue4, epic games
Epic Games has just incremented the minor version number of their popular engine by releasing Unreal Engine 4.1 to all subscribers. While the dot-zero was available privately for quite some time, it was made public barely a month ago. The headlining feature is a handful of extra platforms: Linux, SteamOS, Xbox One, and PlayStation 4. Each of these is included in the $19 per month, 5-percent royalty agreement -- excluding outside fees, such as those required to become a registered developer with Sony and/or Microsoft, obviously.
You will also need a capable Windows PC to deploy a game to PlayStation 4, Xbox One, Linux, or SteamOS... "for now". This implies that development on other platforms is being considered. Development from OS X seems likely, as does Linux, but creating games on an Xbox One or PlayStation 4 seems a bit far-fetched. Who knows, though? If any company has good enough relationships with Sony and Microsoft to make it happen, it would be Epic.
— Ray Davis (@EpicRayD) April 25, 2014
I am guessing... Dreamcast support is a "no". It was not that ahead-of-the-curve.
The actual update notes are just shy of 7,000 words and about 20 pages long, so platforms are not everything. Epic has been adding a lot of content and templates to the engine and their marketplace, including the Elemental demo first seen at E3 2012. The editor was also updated with numerous improvements, such as better FBX importing (FBX is a cross-application 3D file format).
Also, it is available now.
Subject: General Tech, Systems, Mobile | April 18, 2014 - 02:39 AM | Scott Michaud
Tagged: canonical, ubuntu, ubuntu 14.04
Ubuntu, the popular Linux distribution, has been on a steady six-month release schedule for eight years. Every four versions, that is, once every two years, one is marked as Long Term Support (LTS). While typical (non-LTS) releases are supported for around 9 months, LTS versions are provided with five years of updates. Of course, each version, LTS or not, is free. The choice to stay on a specific branch is something else entirely.
For most home users, it will probably make sense to pick up the latest version available on your update manager. Of course, each new release will change things and that can be a problem for some users. That said, given that releases come in six-month intervals, it does make sense to keep up with the changes as they happen, rather than fall behind and have a real shock in five years. Enterprise customers, on the other hand, would love to adopt an operating system which never changes, outside of security updates. Windows XP is a recent example of where enterprise customers will actually pay to not upgrade. These customers will benefit most from LTS.
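As a side note, the YY.MM version scheme makes the LTS cadence easy to express in code. This is a simplified sketch that treats every even-year April release (plus the late-shipping 6.06) as LTS; it ignores edge cases like delayed releases:

```python
# Sketch of Ubuntu's YY.MM versioning: releases land in April and October,
# and the April release of each even year is LTS (6.06 was the lone exception,
# having shipped two months late).
def is_lts(version: str) -> bool:
    year, month = (int(part) for part in version.split("."))
    if (year, month) == (6, 6):       # the first LTS, Dapper Drake
        return True
    return month == 4 and year % 2 == 0 and year >= 8

# Only 14.04 in this stretch of releases qualifies:
recent_lts = [v for v in ("12.10", "13.04", "13.10", "14.04") if is_lts(v)]
```

So a user on 12.04 LTS could skip three interim releases and land directly on 14.04 LTS, which is exactly the upgrade path enterprise deployments tend to take.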
First and foremost, Canonical, the company behind Ubuntu, wants to catch the wave of PC users who are looking to upgrade from Windows XP and Windows 7. It is free, it has a web browser and an office suite, it is stable and secure, and they suggest that it will be easy to deploy and manage for governments and other institutions.
The interface is Unity7, although users will have the option to try Unity8. The latter version is Canonical's attempt to cover all form factors: phones, tablets, TVs, and desktops.
They probably could have chosen a different number, if only for the jokes.
Ubuntu 14.04 LTS is available now at their website. It is free. If you want it, go get it unless you already have it.
Subject: General Tech, Processors, Mobile | April 16, 2014 - 08:40 PM | Scott Michaud
Tagged: Intel, silvermont, arm, quarterly earnings, quarterly results
Sean Hollister at The Verge reported on Intel's recent quarterly report. Their chosen headline focuses on the significant losses incurred by the Mobile and Communications Group, the division responsible for tablet SoCs and 3G/4G modems. Its revenue dropped 52% since last quarter, and its losses increased about 6%. Intel is still making plenty of money, with $12.291 billion USD in profits for 2013, but that is in spite of Mobile and Communications losing $3.148 billion over the same period.
Intel did have some wins, however. The Internet of Things Group is quite profitable, with $123 million USD of income from $482 million of revenue. They also had a better March quarter than the prior year, up a few hundred million in both revenue and profits. Also, Mobile and Communications should have a positive impact on the rest of the company. The Silvermont architecture, for instance, will eventually form the basis for 2015's Xeon Phi processors and co-processors.
It is concerning that Internet of Things has over twice the sales of Mobile, but I hesitate to make any judgments. From my position, it is very difficult to see whether or not this trend follows Intel's projections. We simply do not know whether the division, time and time again, fails to meet expectations, or whether Intel is intentionally being very aggressive to position itself better in the future. I would shrug off the latter but, obviously, the former would be a serious concern.
The best thing for us to do is to keep an eye on their upcoming roadmaps and compare them to early projections.
Subject: Mobile | April 9, 2014 - 05:23 PM | Jeremy Hellstrom
Tagged: linux, asus, zenbook, UX301LA-DH71T, ubuntu 14.04, ubuntu, haswell
There is a lot to like about this particular 13.3" ASUS Zenbook, perhaps most noticeably the IPS display with a 2560 x 1440 resolution and a capacitive touchscreen capable of tracking 10 contact points. There is another reason to fall in love with this notebook: it can run Ubuntu with all of its features enabled, without any extra work required. The specifications under the hood are rather impressive as well: a Core i7-4558U with Intel Iris Graphics 5100, 8GB of DDR3-1600, and two 128GB SSDs capable of running in RAID. Those of you looking for a powerful notebook which does not require Windows to run properly would be wise to read this review at Phoronix.
"As I wrote about at the beginning of March, I bought the ASUS Zenbook UX301LA-DH71T Haswell-based ultrabook to replace an Apple Retina MacBook Pro as my main system. I've been using this latest Zenbook with Intel Iris Graphics and dual SSDs for several weeks now as my main system and have taken it on four business trips so far and it's been running great. Paired with Ubuntu 14.04 LTS, the ASUS Zenbook UX301LA makes a rather nice lightweight yet powerful Linux system."
Here are some more Mobile articles from around the web:
- Dell XPS 15 9530 @ Kitguru
- Enermax DreamBass AeroOdio CP006 Cooling Pad Review @ HiTech Legion
- Silverstone NB04 Notebook Cooler @ eTeknix
- Acer Iconia B1 Tablet Review @ Hardware Secrets
- Pivos MANA 2200 mAh Battery Pack Review @ Bjorn3D
- Pivos Mana 5200mAh Battery Pack @ Bjorn3D
- Silverstone SST-PB03 AA Emergency Battery Pack @ eTeknix
- Gumstick Smartphone Stand Review @ Bjorn3d
- Samsung Galaxy S5 @ The Inquirer
- HTC One M8 vs iPhone 5S specs comparison @ The Inquirer
- Acer Liquid S2 @ The Inquirer
- iOCEAN X7S 8-core Smartphone Review @ Madshrimps
Subject: Mobile | April 8, 2014 - 07:47 PM | Tim Verry
Tagged: SoC, snapdragon, qualcomm, LTE, ARMv8, adreno, 64-bit
Qualcomm has announced two new flagship 64-bit SoCs with the Snapdragon 808 and Snapdragon 810. The new chips will begin sampling later this year and should start showing up in high end smartphones towards the second half of 2015. The new 800-series parts join the previously announced mid-range Snapdragon 610 and 615 which are also 64-bit ARMv8 parts.
The Snapdragon 810 is Qualcomm's new flagship processor. The chip features four ARM Cortex A57 cores and four Cortex A53 cores in a big.LITTLE configuration, an Adreno 430 GPU, and support for Category 6 LTE (up to 300 Mbps downloads) and LPDDR4 memory. This flagship part uses the 64-bit ARMv8 ISA. The new Adreno 430 GPU integrated in the SoC is reportedly 30% faster than the Adreno 420 GPU in the Snapdragon 805 processor.
In addition to the flagship part, Qualcomm is also releasing the Snapdragon 808 which pairs two Cortex A57 CPU cores and four Cortex A53 CPU cores in a big.LITTLE configuration with an Adreno 418 (approximately 20% faster than the popular Adreno 320) GPU. This chip supports LPDDR3 memory and Qualcomm's new Category 6 LTE modem.
Both the 808 and 810 have Adreno GPUs which support OpenGL ES 3.1. The new chips support a slew of wireless I/O including Category 6 LTE, 802.11ac Wi-Fi, Bluetooth 4.1, and NFC.
Qualcomm is reportedly planning to produce these SoCs on a 20nm process. For reference, the mid-range 64-bit Snapdragon 610 and 615 use a 28nm LP manufacturing process. The new 20nm process (presumably from TSMC) should enable improved battery life and clockspeed headroom on the flagship parts. Exactly how large those gains are will depend on the specific manufacturing process: a straightforward bulk/planar shrink brings smaller gains, while more advanced methods such as FD-SOI could deliver greater improvements, assuming the new 20nm chips carry a transistor count similar to the existing 28nm parts.
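For a rough sense of scale, an ideal planar shrink reduces die area with the square of the feature-size ratio. Real-world processes fall well short of this ideal, so treat the number below as an upper bound on what a 28nm-to-20nm move can buy:

```python
# Back-of-the-envelope: ideal area scaling for a 28 nm -> 20 nm shrink.
# Real processes never achieve the full ideal; this is illustration only.
def ideal_area_scale(old_nm: float, new_nm: float) -> float:
    """Fraction of the old die area the same design would occupy, ideally."""
    return (new_nm / old_nm) ** 2

scale = ideal_area_scale(28, 20)   # roughly half the area in the ideal case
```

Smaller area at the same transistor count generally translates into lower power or more room for extra logic, which is where the battery life and clockspeed headroom claims come from.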
The 808 and 810 are the new high-end 64-bit chips which will effectively supplant the 32-bit Snapdragon 805, itself a marginal update over the Snapdragon 800. The naming conventions and product lineups are getting a bit crazy here, but suffice it to say that the 808 and 810 are the effective successors to the 800, while the 805 is a stop-gap upgrade as Qualcomm moves to 64-bit ARMv8 and secures manufacturing for the new chips. The new parts should be slightly faster on the CPU side, notably faster on the GPU side, and more capable thanks to the faster cellular modem and 64-bit ISA support.
For those wondering, the press release also states that the company is still working on development of its custom 64-bit Krait CPU architecture. However, it does not appear that 64-bit Krait will be ready by the first half of 2015, which is why Qualcomm has opted to use ARM's Cortex A57 and A53 cores in its upcoming flagship 808 and 810 SoCs.
Subject: Mobile | April 8, 2014 - 07:01 PM | Tim Verry
Tagged: tablet, tab a8, tab a7-50, tab a10, mtk 8121, mediatek, Lenovo, android 4.2
Today, Lenovo announced a refreshed lineup of its A-series tablets including the A7-50, A8, and A10. The new tablets take a common hardware platform and scale it from a 7-inch tablet to a 10-inch tablet with optional keyboard. All three tablets run the Android 4.2 operating system and will be available in May.
The Lenovo TAB A7-50 Android 4.2 tablet.
The new Lenovo TAB A-series is powered by a quad core MediaTek 8121 SoC clocked at 1.3 GHz paired with 1GB of LP-DDR2 memory and 16GB of internal flash storage. Users can add an additional 32GB of storage with a micro SD card. Networking is handled by an 802.11 b/g/n Wi-Fi and Bluetooth 4.0 radio along with an optional SIM card slot on certain models (cellular functionality not available in the North American market). The tablets come with IPS touchscreens with a resolution of 1280 x 800. Lenovo includes a 2MP webcam and a 5MP rear facing camera on all three A-series tablets. The A10 further adds stereo speakers and compatibility with a keyboard dock.
Lenovo rates all three tablets at eight hours of battery life.
The Lenovo TAB A8 tablet.
Beyond the Lenovo TAB A10 being available with a Bluetooth keyboard dock, the only major differences between the new three A-series tablets are physical dimensions, screen size, and weight. The Lenovo TAB A7-50 measures 198x121.2x9.9mm and weighs 0.70 lbs. The TAB A8 meanwhile measures 217x136x8.9mm and weighs slightly more at 0.79 lbs. Finally, the TAB A10 measures 264x176.5x8.9mm and weighs 1.2 lbs.
The Lenovo TAB A10 with its Bluetooth keyboard dock.
The 7-inch Lenovo Tab A7-50 has an MSRP of $129 while the 8-Inch Tab A8 has an MSRP of $179. The 10.1-inch Tab A10 has a base price of $249 and is also available as a tablet and keyboard bundle for $299.
What do you think about Lenovo's new A-series lineup? On one hand, you have three size options at competitive prices, but on the other you only have a single option as far as internal specifications and screen resolution no matter the screen size. If you can live with the MTK 8121 and 1GB of RAM, they could be a viable option.
Read more about Lenovo tablets such as the Yoga 8 and Yoga 10 at PC Perspective.
Subject: General Tech, Mobile | March 25, 2014 - 09:34 PM | Tim Verry
Tagged: GTC 2014, tegra k1, nvidia, CUDA, kepler, jetson tk1, development
NVIDIA recently unified its desktop and mobile GPU lineups by moving to a Kepler-based GPU in its latest Tegra K1 mobile SoC. The move to the Kepler architecture has simplified development and enabled the CUDA programming model to run on mobile devices. One of the main points of the opening keynote earlier today was ‘CUDA everywhere,’ and NVIDIA has officially accomplished that goal by having CUDA compatible hardware from servers to desktops to tablets and embedded devices.
Speaking of embedded devices, NVIDIA showed off a new development board called the Jetson TK1. This tiny new board features an NVIDIA Tegra K1 SoC at its heart along with 2GB RAM and 16GB eMMC storage. The Jetson TK1 supports a plethora of I/O options including an internal expansion port (GPIO compatible), SATA, one half-mini PCI-e slot, serial, USB 3.0, micro USB, Gigabit Ethernet, analog audio, and HDMI video output.
Of course the Tegra K1 part is a quad core (4+1) ARM CPU and a Kepler-based GPU with 192 CUDA cores. The SoC is rated at 326 GFLOPS which enables some interesting compute workloads including machine vision.
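As a quick sanity check on that rating, 326 GFLOPS lines up with each of the 192 CUDA cores retiring one fused multiply-add (conventionally counted as two FLOPs) per clock. The FMA convention is the standard way these peak numbers are quoted, but the GPU clock below is my inference from the math, not an NVIDIA-confirmed figure:

```python
# Working backwards from NVIDIA's 326 GFLOPS rating for the Tegra K1 GPU,
# assuming each of the 192 CUDA cores retires one FMA (2 FLOPs) per clock.
cores = 192
flops_per_core_per_clock = 2      # fused multiply-add counted as two FLOPs
rated_gflops = 326

implied_clock_ghz = rated_gflops / (cores * flops_per_core_per_clock)
# implied_clock_ghz comes out near 0.85, i.e. a GPU clock around 850 MHz
```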
In fact, Audi has been utilizing the Jetson TK1 development board to power its self-driving prototype car (more on that soon). Other intended uses for the new development board include robotics, medical devices, security systems, and perhaps low power compute clusters (such as an improved Pedraforca system). It can also be used as a simple desktop platform for testing and developing mobile applications for other Tegra K1 powered devices, of course.
Beyond the hardware, the Jetson TK1 comes with the CUDA toolkit, OpenGL 4.4 driver, and NVIDIA VisionWorks SDK which includes programming libraries and sample code for getting machine vision applications running on the Tegra K1 SoC.
The Jetson TK1 is available for pre-order now at $192 and is slated to begin shipping in April. Interested developers can find more information on the NVIDIA developer website.
Subject: General Tech, Graphics Cards, Mobile | March 25, 2014 - 03:01 PM | Scott Michaud
Tagged: shield, nvidia
The SHIELD from NVIDIA is getting a software update which advances GameStream and TegraZone, and brings the Android OS itself up to KitKat. Personally, the GameStream enhancements seem most notable, as users can now access their home PC's gaming content outside of the home, as if it were a cloud server (but some other parts were interesting, too). Also, from now until the end of April, NVIDIA has temporarily cut the price down to $199.
Going into more detail: GameStream, now out of Beta, will stream games which are rendered on your gaming PC to your SHIELD. Typically, we have seen this through "cloud" services, such as OnLive and GaiKai, which allow access to a set of games that run on their servers (with varying license models). The fear with these services is the lack of ownership, but the advantage is that the slave device just needs enough power to decode an HD video stream.
In NVIDIA's case, the user owns both server (their standard NVIDIA-powered gaming PC, which can now be a laptop) and target device (the SHIELD). This technology was once limited to your own network (which definitely has its uses, especially for the SHIELD as a home theater device) but now can also be exposed over the internet. For this technology, NVIDIA recommends 5 megabit upload and download speeds - which is still a lot of upload bandwidth, even for 2014. In terms of performance, NVIDIA believes that it should live up to expectations set by their GRID. I do not have any experience with this, but others on the conference call took it as good news.
As for content, NVIDIA has expanded the number of supported titles to over a hundred, including new entries: Assassin's Creed IV, Batman: Arkham Origins, Battlefield 4, Call of Duty: Ghosts, Daylight, Titanfall, and Dark Souls II. They also claim that users can add other apps which are not officially supported (Halo 2: Vista was mentioned as an example) for streaming. Frame rate and bitrate can now be set by the user. A Bluetooth mouse and keyboard can also be paired to SHIELD for that input type through GameStream.
Yeah, I don't like checkbox comparisons either. It's just a summary.
A new TegraZone was also briefly mentioned. Its main upgrade was apparently its library interface. There has also been a number of PC titles ported to Android recently, such as Mount and Blade: Warband.
The update is available now and the $199 promotion will last until the end of April.
Subject: Mobile | March 19, 2014 - 06:27 PM | Jeremy Hellstrom
Tagged: nexus 4, Ubuntu Mobile
We have yet to see the launch of purpose-built Ubuntu smartphones, but that didn't stop The Inquirer from getting a preview of the new Ubuntu Mobile OS. By installing the current version of the OS on a Nexus 4, they got a chance to see and use the new mobile OS. Similar in design to the version we have seen previously on tablets, it will likely feel a bit odd to those used to a multi-window OS like Android, though the interface will allow customization. There will indeed be a Canonical app store, but the open nature of Ubuntu will allow third-party stores to be set up, over and above supporting third-party apps. Check out the hands-on review here.
"CANONICAL ANNOUNCED earlier this year that the first Ubuntu smartphones will be made by BQ and Meizu. That created a wave of interest in how the open source Linux operating system (OS) distribution will look and work on a smartphone or tablet."
Here are some more Mobile articles from around the web:
- Qualcomm Snapdragon 805 4K tablet hands-on @ The Inquirer
- MWC: Samsung Gear 2 hands-on @ The Inquirer
- ZTE Grand Memo II hands-on @ The Inquirer
- Asus Transformer Book T100T @ Kitguru
- HP Pavilion x360 hands-on @ The Inquirer
- DinoPC Pegasus 17.3inch GTX 765M @ Kitguru
- Asus G750JX with GTX 770M @ Hardwareoverclock
- A Look at NVIDIA's GeForce 800M Mobile GPU Series @ Techgage
- GeForce 800M series combines Maxwell, Kepler @ The Tech Report
- GTX 800M; NVIDIA's Maxwell Goes Mobile @ Hardware Canucks
- LifeProof Realtree Edition Waterproof Iphone Case @ TechwareLabs
- ADATA DashDrive Air AE800 500GB Wireless HDD and Power Bank @ eTeknix
- Sandberg Solar PowerBank 6000 mAh @ NikKTech
- Hacking Dell Laptop Charger Identification @ Hack a Day
- Anker 2nd Gen Astro 6000mAh Portable Battery Review @ Legit Reviews
- ADATA Elite CE700 Qi Wireless Charging Station @ eTeknix
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | March 19, 2014 - 09:03 AM | Scott Michaud
Tagged: WebCL, gdc 14, GDC
The Khronos Group has just ratified the standard for WebCL 1.0. The API is expected to provide a massive performance boost to web applications which are dominated by expensive functions which can be offloaded to parallel processors, such as GPUs and multi-core CPUs. Its definition also allows WebCL to communicate and share buffers between it and WebGL with an extension.
Frequent readers of the site might remember that I have a particular interest in WebCL. Based on OpenCL, it allows web apps to obtain a list of every available compute device and target each one with workloads. I have personally executed tasks on an NVIDIA GeForce 670 discrete GPU and other jobs on my Intel HD 4000 iGPU, at the same time, using the WebCL prototype from Tomi Aarnio of Nokia Research. The same is true for users with multiple discrete GPUs installed in their system (even if they are not compatible with CrossFire or SLI, or are from different vendors altogether). This could be very useful for physics, AI, lighting, and other game middleware packages.
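For a sense of what that looks like in code, here is a sketch of a vector-add kernel written against the WebCL 1.0 API. The host-side WebCL calls (names taken from the spec) only run in a WebCL-capable browser or prototype, so they are shown as comments; the plain-JavaScript reference function simply shows what the kernel computes and could be used to validate results read back from a device:

```javascript
// OpenCL C kernel source, passed to WebCL as a string.
// Each work-item adds one pair of elements.
const vaddSource = `
  __kernel void vadd(__global const float* a,
                     __global const float* b,
                     __global float* out) {
    size_t i = get_global_id(0);
    out[i] = a[i] + b[i];
  }`;

// Plain-JavaScript reference of the same computation, handy for
// validating results read back from the device.
function vaddReference(a, b) {
  return a.map((x, i) => x + b[i]);
}

// Host-side WebCL usage (API names per the WebCL 1.0 spec; requires a
// WebCL-capable browser or prototype, so it is left as comments):
//   const platforms = webcl.getPlatforms();
//   const devices = platforms[0].getDevices(WebCL.DEVICE_TYPE_ALL);
//   const ctx = webcl.createContext();
//   const program = ctx.createProgram(vaddSource);
//   program.build(devices);
//   const kernel = program.createKernel("vadd");
//   // ...createBuffer, setArg, enqueueNDRangeKernel, enqueueReadBuffer...
```

The interesting part is the device list: nothing stops an app from building one context per device and dispatching different kernels to the discrete GPU and the iGPU simultaneously, which is exactly the multi-device scenario described above.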
Still, browser adoption might be rocky for quite some time. Google, Mozilla, and Opera Software were each involved in the working draft. This leaves both Apple and Microsoft notably absent. Even then, I am not sure how much interest exists within Google, Mozilla, and Opera to take it from a specification to a working feature in their browsers. Some individuals have expressed more faith in WebGL compute shaders than WebCL.
Of course, that can change with just a single "killer app", library, or middleware.
I do expect some resistance from the platform holders, however. Even Google has been pushing back on OpenCL support in Android, in favor of their "Renderscript" abstraction. The performance of a graphics processor is also significant leverage for a native app. Otherwise, there is little that cannot be accomplished with web standards except a web browser itself (and there are even some non-serious projects for that). If Microsoft can support WebGL, however, there is always hope.
The specification is available at the Khronos website.
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | March 19, 2014 - 09:02 AM | Scott Michaud
Tagged: OpenGL ES, opengl, opencl, gdc 14, GDC, EGL
The Khronos Group has also released their ratified specification for EGL 1.5. This API is at the center of data and event management between other Khronos APIs. This version increases security, interoperability between APIs, and support for many operating systems, including Android and 64-bit Linux.
The headline change is the promotion of EGLImage objects from extension to core functionality, giving developers a reliable method of transferring textures and renderbuffers between graphics contexts and APIs. Second on the list is increased security around creating a graphics context, primarily for WebGL applications, where any arbitrary website can create one. Further down the list is the EGLSync object, which allows closer cooperation between OpenGL (and OpenGL ES) and OpenCL; the GPU may not need CPU involvement when scheduling between tasks on both APIs.
During the call, the representative also mentioned that developers have asked Khronos to bring EGL back to Windows. While it has not happened yet, it is now a stated target.
The EGL 1.5 spec is available at the Khronos website.
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | March 19, 2014 - 09:01 AM | Scott Michaud
Tagged: SYCL, opencl, gdc 14, GDC
To gather community feedback, The Khronos Group has released the provisional specification for SYCL 1.2. SYCL layers the C++11 standard on top of OpenCL. The technology is built on another Khronos platform, SPIR, which allows the OpenCL C programming language to be mapped onto LLVM, with its hundreds of compatible languages (and Khronos is careful to note that they intend for anyone to make their own compatible alternative language).
In short, SPIR allows the many languages that compile to LLVM to take advantage of OpenCL. SYCL is the specification for creating C++11 libraries and compilers on top of SPIR.
As stated earlier, Khronos wants anyone to make their own compatible language:
While SYCL is one possible solution for developers, the OpenCL group encourages innovation in programming models for heterogeneous systems, either by building on top of the SPIR™ low-level intermediate representation, leveraging C++ programming techniques through SYCL, using the open source CLU libraries for prototyping, or by developing their own techniques.
SYCL 1.2 supports OpenCL 1.2, and Khronos intends to develop it alongside OpenCL. Future releases are expected to support the latest OpenCL 2.0 specification and keep pace with future developments.
The SYCL 1.2 provisional spec is available at the Khronos website.
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | March 17, 2014 - 09:01 AM | Scott Michaud
Tagged: OpenGL ES, opengl, Khronos, gdc 14, GDC
Today, day one of Game Developers Conference 2014, the Khronos Group has officially released the 3.1 specification for OpenGL ES. The main new feature, brought over from OpenGL 4, is the addition of compute shaders. This opens GPGPU functionality to mobile and embedded devices for applications developed in OpenGL ES, especially if the developer does not want to add OpenCL.
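To illustrate what the new stage looks like, here is a minimal GLSL ES 3.10 compute shader of the kind OpenGL ES 3.1 enables, stored as the string a host program would compile (the buffer name and layout are our own invention, not from the spec). The accompanying plain-JavaScript function is just a reference for what one dispatch over the buffer computes:

```javascript
// A minimal OpenGL ES 3.1 compute shader: each invocation doubles one
// element of a shader storage buffer. A native host would compile this
// with glCreateShader(GL_COMPUTE_SHADER) and run it via glDispatchCompute.
const doubleShader = `#version 310 es
layout(local_size_x = 64) in;
layout(std430, binding = 0) buffer Data {
  float values[];
};
void main() {
  uint i = gl_GlobalInvocationID.x;
  values[i] *= 2.0;
}`;

// Plain-JavaScript reference of what one dispatch over the buffer does.
function doubleBuffer(values) {
  return values.map((v) => v * 2.0);
}
```

Because the shader reads and writes an arbitrary storage buffer rather than vertices or fragments, this is general-purpose compute expressed entirely within OpenGL ES, with no OpenCL dependency.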
The update is backward-compatible with OpenGL ES 2.0 and 3.0 applications, allowing developers to add features to their existing apps as hardware allows. On the device side, most functionality is expected to arrive as a driver update.
OpenGL ES (OpenGL for Embedded Systems, though it is rarely branded as such) delivers what Khronos considers the most important features of the graphics library to the majority of devices. The Khronos Group has been working toward merging ES with the "full" graphics library over time. The last release, OpenGL ES 3.0, focused on becoming a direct subset of OpenGL 4.3; this release expands the feature space it occupies.
OpenGL ES also forms the basis for WebGL. The current draft of WebGL 2.0 uses OpenGL ES 3.0 although that was not discussed today. I have heard murmurs (not from Khronos) about some parties pushing for compute shaders in that specification, which this announcement puts us closer to.
The new specification also adds other features, such as the ability to issue a draw without CPU intervention. Imagine a particle simulation, for instance, that wants to draw the result after its compute shader terminates. Shading is also less rigid: vertex and fragment shaders no longer need to be explicitly linked into a program before they are used. I inquired about the possibility that compute devices could be targeted (on devices with two GPUs) and possibly load balanced, in a similar manner to WebCL, but no confirmation or denial was provided (although the representative did mention that it would be interesting for apps that fall somewhere in the middle of OpenGL ES and OpenCL).
The OpenGL ES 3.1 spec is available at the Khronos website.
Subject: General Tech, Graphics Cards, Mobile, Shows and Expos | March 5, 2014 - 08:28 PM | Scott Michaud
Tagged: qualcomm, nvidia, microsoft, Intel, gdc 14, GDC, DirectX 12, amd
The announcement of DirectX 12 has been given a date and time via a blog post on the Microsoft Developer Network (MSDN) blogs. On March 20th at 10:00am (I assume PDT), a few days into the 2014 Game Developers Conference in San Francisco, California, the upcoming specification should be detailed for attendees. Apparently, four GPU manufacturers will also be involved with the announcement: AMD, Intel, NVIDIA, and Qualcomm.
As we reported last week, DirectX 12 is expected to target increased hardware control and decreased CPU overhead for added performance in "cutting-edge 3D graphics" applications. Really, this is the best time for it: graphics processors have mostly settled into being highly efficient co-processors for parallel data, with some specialized logic for geometry and video tasks. A new specification can relax the demands on video drivers and thus keep the GPU (or GPUs, in Mantle's case) loaded and utilized.
But, to me, the most interesting part of this announcement is the nod to Qualcomm. Microsoft values DirectX as leverage over other x86 and ARM-based operating systems. With Qualcomm, clearly Microsoft believes that either Windows RT or Windows Phone will benefit from the API's next version. While it will probably make PC gamers nervous, mobile platforms will benefit most from reducing CPU overhead, especially if it can be spread out over multiple cores.
Honestly, that is fine by me. As long as Microsoft returns to treating the PC as a first-class citizen, I do not mind them helping mobile, too. We will definitely keep you up to date as we know more.