Subject: Mobile | July 2, 2015 - 04:54 PM | Jeremy Hellstrom
Tagged: amazon, kindle paperwhite
The insides of the third-generation Kindle Paperwhite match the Voyage, with a Freescale i.MX6 SoloLite 1GHz chip, as do the outsides with a new 300ppi screen. Connectivity options include Wi-Fi as well as an available 3G model, and there is also a brand new font called Bookerly. If you are in need of an eReader and are not in Canada, where you could pick up the Tegra 4 powered Kobo Arc 7 instead, you should head over to Techgage and see if the new improved Paperwhite is the solution you should choose.
"Amazon has just revealed its third-gen Kindle Paperwhite e-reader, and while it doesn’t offer a substantial upgrade over the previous model, it does iterate on what was already a fantastic device. With a 300 ppi screen and brand-new Bookerly font at-the-ready, there’s not much to dislike with this e-reader."
Here are some more Mobile articles from around the web:
- Blackview Zeta Smartphone Review @ Madshrimps
- Acer Liquid Jade S Smartphone @ Kitguru
- Mlais M7 Smartphone Review @ Madshrimps
- Camera Shootout : ASUS ZenFone 2 Vs. Samsung Galaxy S6 & Apple iPhone 6 @ TechARP
- ASUS ROG G751JY-DB72 w/G-Sync Gaming Notebook Review @ HiTech Legion
Subject: General Tech, Mobile | June 26, 2015 - 04:53 PM | Scott Michaud
Tagged: Samsung, battery
When I was in my Physics program, there was a running joke that the word “Nano” should be a red flag when reading research papers. This one has graphene and nanoparticles, but it lacks quantum dots and it looks to be privately funded by a company, so we might be good. Kidding aside, while I have little experience with battery technology, they claim to have surrounded silicon anodes for lithium batteries with a layer of graphene.
Image Credit: Samsung via Nature
This addition of graphene is said to counteract an issue where silicon expands as it is used and recharged. The paper, which again is the first source that I have seen discuss this issue, says that other attempts at using silicon add vacant space around the anode for future growth. If you can keep the material at the same volume over its lifespan, you will be able to store more electricity in smaller devices. I wonder why Samsung would want something like that...
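To put rough numbers on that volume argument, here is a deliberately simplistic sketch. The capacity and density figures are commonly cited ballpark values from the literature, not numbers from the Samsung paper, and the "headroom" model is my own illustration:

```python
# Back-of-the-envelope sketch: reserving empty volume for silicon's
# expansion dilutes volumetric capacity. All figures are ballpark
# literature values, not from the Samsung/Nature paper.

GRAPHITE_MAH_PER_CM3 = 372 * 2.2     # ~372 mAh/g x ~2.2 g/cm^3 density
SILICON_MAH_PER_CM3 = 3579 * 2.33    # ~3579 mAh/g x ~2.33 g/cm^3 density

def effective_volumetric(capacity_mah_per_cm3, expansion_factor):
    """If the design must leave headroom for the fully expanded anode,
    usable capacity per unit of device volume is divided by that factor."""
    return capacity_mah_per_cm3 / expansion_factor

# Silicon can swell to roughly 3-4x its volume when lithiated; constraining
# that expansion (e.g. with a graphene layer) reclaims the headroom.
print(f"graphite:             {GRAPHITE_MAH_PER_CM3:7.0f} mAh/cm^3")
print(f"silicon, 4x headroom: {effective_volumetric(SILICON_MAH_PER_CM3, 4.0):7.0f} mAh/cm^3")
print(f"silicon, no headroom: {SILICON_MAH_PER_CM3:7.0f} mAh/cm^3")
```

Even with worst-case headroom reserved, silicon comes out ahead of graphite per unit volume in this toy model; eliminating the headroom is what would make it a dramatic win.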
Business Model Based on Partnerships
Alexandru Voica works for Imagination Technologies. His background includes research in computer graphics at the School of Advanced Studies Sant'Anna in Pisa and a brief stint as a CPU engineer, working on several high-profile 32-bit processors used in many mobile and embedded devices today. You can follow Alex on Twitter @alexvoica.
Some months ago my colleague Rys Sommefeldt wrote an article offering his (deeply) technical perspective on how a chip gets made, from R&D to manufacturing. While his bildungsroman production covers a lot of the engineering details behind silicon production, it is light on the business side of things; and that is a good thing because it gives me an opportunity to steal some of his spotlight!
This article will give you a breakdown of the IP licensing model, describing the major players and the relationships between them. It is not designed to be a complete guide by any means and some parts might already sound familiar, but I hope it is a comprehensive overview that can be used by anyone who is new to product manufacturing in general.
The diagram below offers an analysis of the main categories of companies involved in the semiconductor food chain. Although I’m going to attempt to paint a broad picture, I will mainly offer examples based on the ecosystem formed around Imagination (since that is what I know best).
A simplified view of the manufacturing chain
Let’s work our way from left to right.
Traditionally, these are the companies that design and sell silicon IP. ARM and Imagination Technologies are perhaps the most renowned for their sub-brands: Cortex CPU + Mali GPU and MIPS CPU + PowerVR GPU, respectively.
Given the rapid evolution of the semiconductor market, such companies continue to evolve their business models beyond point solutions to become one-stop shops that offer a wide variety of IP cores and platforms, comprising CPUs, graphics, video, connectivity, cloud software and more.
Subject: Mobile | June 22, 2015 - 11:43 PM | Sebastian Peak
Tagged: snapdragon 410, smartphone, rumor, Moto G, LTE, lollipop, Android
9to5google is reporting specs of the upcoming Moto G refresh, and it looks like the phone will carry over the internals of the current Moto E with a Snapdragon 410 SoC, and add an improved 13MP camera.
The current Moto G has been a favorite for many as a low-cost unlocked option (and one that runs mostly stock Android), and the adoption of the faster SoC with integrated (Cat 4) LTE baseband is a necessary move to update a device that in its current iteration is limited to 3G data speeds. It is interesting that the SoC would only match that of the $149 2015 Moto E (reviewed here), but it makes sense from a financial standpoint if the rumored Moto G is to be sold at or below its current $179 price point.
There is certainly stiff competition in the midrange smartphone market, bolstered considerably by the recently released ASUS Zenfone 2 (reviewed here as well), which starts at $199 unlocked. With devices like the new Zenfone offering full 1080p screens, the rumored return of the Moto G's existing 5-inch 720p screen in 2015 might be another indication that this new phone will feature a very aggressive price.
The alleged 2015 Moto G photo (image credit: 9to5google)
The phone is also rumored to ship with Android 5.1.1, which would carry on the recent tradition of Motorola phones running the latest versions of Android. All of this is unconfirmed information based on leaks, of course, but regardless of its final form more options are always welcome in the $200-and-under unlocked phone space - and this year is shaping up to be a good one for consumers.
Qualcomm’s GPU History
Despite its market dominance, Qualcomm may be one of the least known contenders in the battle for the mobile space. While players like Apple, Samsung, and even NVIDIA are often cited as the most exciting and most revolutionary, none come close to the sheer sales, breadth of technology, and market share that Qualcomm occupies. Brands like Krait and Snapdragon have helped push the company into the top 3 semiconductor companies in the world, following only Intel and Samsung.
In July 1985, seven industry veterans came together in the den of Dr. Irwin Jacobs’ San Diego home to discuss an idea. They wanted to build “Quality Communications” (thus the name Qualcomm) and outlined a plan that evolved into one of the telecommunications industry’s great start-up success stories.
Though Qualcomm sold its own handset business to Kyocera in 1999, many of today’s most popular mobile devices are powered by Qualcomm’s Snapdragon mobile chipsets with integrated CPU, GPU, DSP, multimedia CODECs, power management, baseband logic and more. In fact, the typical “chipset” from Qualcomm encompasses up to 20 different chips of different functions besides just the main application processor. If you are an owner of a Galaxy Note 4, Motorola Droid Turbo, Nexus 6, or Samsung Galaxy S5, then you are most likely a user of one of Qualcomm’s Snapdragon chipsets.
Qualcomm’s GPU History
Before 2006, the mobile GPU as we know it today was largely unnecessary. Feature phones and “dumb” phones were still the large majority of the market with smartphones and mobile tablets still in the early stages of development. At this point all the visual data being presented on the screen, whether on a small monochrome screen or with the color of a PDA, was being drawn through a software renderer running on traditional CPU cores.
But by 2007, the first fixed-function, OpenGL ES 1.0 class of GPUs started shipping in mobile devices. These dedicated graphics processors were originally focused on drawing and updating the user interface on smartphones and personal data devices. Eventually these graphics units were used for what would be considered the most basic gaming tasks.
Subject: Mobile | June 12, 2015 - 03:28 PM | Jeremy Hellstrom
Tagged: smartphone, sony, xperia z3+, snapdragon 810
The new Sony Xperia Z3+ is a tiny bit thinner than the non-plus model at 146x72x6.9mm and 144g, compared to 146x72x7.3mm and 152g. The display is unchanged, a 5.2" IPS screen with a 1080x1920 resolution, but the processor received a significant upgrade: it is now a 64-bit octa-core Qualcomm Snapdragon 810. The phone ships with Android 5.0 and The Inquirer got a chance to try it out. The new processor handles 4K video perfectly and the phone feels snappier overall compared to the previous model; check out their full experience here.
"SONY UNVEILED its latest top-end smartphone, the Sony Xperia Z3+ this week, with an updated, slimmer design, which has a lighter and sleeker frame compared with its predecessor, the Xperia Z3."
Here are some more Mobile articles from around the web:
- SISWOO C50 Longbow Smartphone Review @ Madshrimps
- Lenovo ThinkPad Helix 2nd Gen Convertible Review @ Madshrimps
- Vodafone Smart Tab 4G Tablet @ Kitguru
Introduction and Specifications
The ASUS Zenfone 2 is a 5.5-inch smartphone with a premium look and the specs to match. But the real story here is that it sells for just $199 or $299 unlocked, making it a tempting alternative to contract phones without the concessions often made with budget devices, at least on paper. Let's take a closer look to see how the new Zenfone 2 stacks up! (Note: a second sample unit was provided by Gearbest.com.)
When I first heard about the Zenfone 2 from ASUS I was eager to check it out given its mix of solid specs, nice appearance, and a startlingly low price. ASUS has created something that has the potential to transcend the disruptive nature of a phone like the Moto E, itself a $149 alternative to contract phones that we reviewed recently. With its premium specs to go along with very low unlocked pricing, the Zenfone 2 could be more than just a bargain device, and if it performs well it could make some serious waves in the smartphone industry.
The Zenfone 2 also features a 5.5-inch IPS LCD screen with 1920x1080 resolution (in line with an iPhone 6 Plus or the OnePlus One), and beyond the internal hardware ASUS has created a phone that looks every bit the part of a premium device that one would expect to cost hundreds more. In fact, without spoiling anything up front, I will say that the context of price won't be necessary to judge the merit of the Zenfone 2; it stands on its own as a smartphone, and not simply a budget phone.
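As a quick sanity check on those screen specs, pixel density follows directly from resolution and diagonal size; the helper below is just illustrative arithmetic:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal pixel count divided by diagonal length."""
    return math.hypot(width_px, height_px) / diagonal_inches

# The Zenfone 2 and iPhone 6 Plus both pair 1920x1080 with a 5.5" panel.
print(f"{ppi(1920, 1080, 5.5):.0f} ppi")
```

That works out to roughly 401 ppi for both phones, comfortably in "Retina" territory.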
The big question is going to be how the Zenfone 2 compares to existing phones, and with its quad-core Intel Atom SoC this is something of an unknown. Intel has been making a push to enter the U.S. smartphone market (with earlier products more widely available in Europe) and the Zenfone 2 marks an important milestone for both Intel and ASUS in this regard. The Z3580 SoC powering my review unit certainly sounds fast on paper with its 4 cores clocked up to 2.33 GHz, and no less than 4 GB of RAM on board (and a solid 64 GB of onboard storage as well).
Introduction and Design
With the introduction of the ASUS G751JT-CH71, we’ve now got our first look at the newest ROG notebook design revision. The celebrated design language remains the same, and the machine’s lineage is immediately discernible. However, unlike the $2,000 G750JX-DB71 unit we reviewed a year and a half ago, this particular G751JT configuration is 25% less expensive at just $1,500. So first off, what’s changed on the inside?
(Editor's Note: This is NOT the recent G-Sync version of the ASUS G751 notebook that was announced at Computex. This is the previously released version, one that I am told will continue to sell for the foreseeable future and one that will come at a lower overall price than the G-Sync enabled model. Expect a review on the G-Sync derivative very soon!)
Quite a lot, as it turns out. For starters, we’ve moved all the way from the 700M series to the 900M series—a leap which clearly ought to pay off in spades in terms of GPU performance. The CPU and RAM remain virtually equivalent, while the battery has migrated from external to internal and enjoyed a 100 mAh bump in the process (from 5900 to 6000 mAh). So what’s with the lower price then? Well, apart from the age difference, it’s the storage: the G750JX featured both a 1 TB storage drive and a 256 GB SSD, while the G751JT-CH71 drops the SSD. That’s a small sacrifice in our book, especially when an SSD is so easily added thereafter. By the way, if you’d rather simply have ASUS handle that part of the equation for you, you can score a virtually equivalent configuration (chipset and design evolutions notwithstanding) in the G751JT-DH72 for $1750—still $250 less than the G750JX we reviewed.
Subject: Graphics Cards, Mobile | June 6, 2015 - 04:05 PM | Scott Michaud
Tagged: VR, nvidia, gameworks vr
So I'm not quite sure what this hypothetical patent device is. According to its application, it is a head-mounted display that contains six cameras (??) and two displays, one for each eye. The usage of these cameras is not defined, but two will point forward, two will point down, and the last two will point left and right. The only clue that we have is in the second patent application photo, where unlabeled hands are gesturing in front of a node labeled “input cameras”.
Image Credit: Declassified
The block diagram declares that the VR headset will have its own CPU, memory, network adapter, and “parallel processing subsystem” (GPU). VRFocus believes that this will be based on the Tegra X1, and that it was supposed to be revealed three months ago at GDC 2015. In its place, NVIDIA announced the Titan X at the Unreal Engine 4 keynote, hosted by Epic Games. GameWorks VR was also announced with the GeForce GTX 980 Ti launch, which was mostly described as a way to reduce rendering cost by dropping resolution in areas that will be warped into a lower final, displayed resolution anyway.
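As a rough illustration of that rendering-cost idea (my own toy numbers, not NVIDIA's actual grid or scale factors), multi-resolution rendering can be modeled as a 3x3 viewport grid where only the center cell is shaded at full resolution:

```python
# Toy model of multi-res shading: shade a full-resolution center region
# and render the 8 peripheral cells at a reduced scale. The grid layout
# and factors here are hypothetical, not NVIDIA's actual implementation.

def shaded_pixels(width, height, center_frac=0.6, edge_scale=0.5):
    """Pixels actually shaded when the center_frac x center_frac region
    is full resolution and the periphery is scaled down per axis."""
    center = (width * center_frac) * (height * center_frac)
    periphery = width * height - center
    return center + periphery * edge_scale ** 2

full = 1512 * 1680                  # hypothetical per-eye render target
reduced = shaded_pixels(1512, 1680)
print(f"shading work saved: {1 - reduced / full:.0%}")
```

With these made-up factors the periphery shades only a quarter of its pixels, cutting total shading work by nearly half before the lens-warp pass throws that detail away anyway.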
Image Credit: Declassified
VRFocus suggests that the reveal could happen at E3 this year. The problem with that theory is that NVIDIA has neither a keynote at E3 this year nor even a place at someone else's keynote as far as we know, just a booth and meeting rooms. Of course, they could still announce it through other channels, but that seems less likely. Maybe they will avoid the E3 hype and announce it later (unless something changes behind the scenes of course)?
Subject: Graphics Cards, Processors, Mobile | June 4, 2015 - 04:58 PM | Scott Michaud
Tagged: amd, carrizo
My discussion of the Carrizo architecture went up a couple of days ago. The post did not include specific SKUs because we did not have those at the time. Now we do, and there will be products: one A8-branded, one A10-branded, and one FX-branded.
All three will be quad-core parts that can range between 12W and 35W designs, although the A8 processor does not have a 35W mode listed in the AMD Dual Graphics table. The FX-8800P is an APU that has all eight GPU cores while the A-series APUs have six. The A10-8700P and the A8-8600P are separated by a couple hundred megahertz in base and boost CPU clocks, and an 80 MHz difference in GPU clock.
Also, we have been given a table of AMD Radeon R5 and R7 M-series GPUs that can be paired with Carrizo in an AMD Dual Graphics setup. These GPUs are the R7 M365, R7 M360, R7 M350, R7 M340, R5 M335, and R5 M330. They cannot be paired with every Carrizo APU, and some pairings only work in certain power envelopes. Thankfully, this table should only be relevant to OEMs, because end-users are receiving pre-configured systems.
Pricing and availability will depend on OEMs, of course.