Qualcomm’s GPU History
Despite its market dominance, Qualcomm may be one of the least known contenders in the battle for the mobile space. While players like Apple, Samsung, and even NVIDIA are often cited as the most exciting and most revolutionary, none come close to the sheer sales, breadth of technology, and market share that Qualcomm occupies. Brands like Krait and Snapdragon have helped push the company into the top 3 semiconductor companies in the world, following only Intel and Samsung.
Qualcomm was founded in July 1985, when seven industry veterans came together in the den of Dr. Irwin Jacobs’ San Diego home to discuss an idea. They wanted to build “Quality Communications” (hence the name Qualcomm) and outlined a plan that evolved into one of the telecommunications industry’s great start-up success stories.
Though Qualcomm sold its own handset business to Kyocera in 1999, many of today’s most popular mobile devices are powered by Qualcomm’s Snapdragon mobile chipsets with integrated CPU, GPU, DSP, multimedia CODECs, power management, baseband logic and more. In fact, the typical “chipset” from Qualcomm encompasses up to 20 different chips of different functions beyond just the main application processor. If you are an owner of a Galaxy Note 4, Motorola Droid Turbo, Nexus 6, or Samsung Galaxy S5, then you are most likely a user of one of Qualcomm’s Snapdragon chipsets.
Before 2006, the mobile GPU as we know it today was largely unnecessary. Feature phones and “dumb” phones were still the large majority of the market with smartphones and mobile tablets still in the early stages of development. At this point all the visual data being presented on the screen, whether on a small monochrome screen or with the color of a PDA, was being drawn through a software renderer running on traditional CPU cores.
But by 2007, the first fixed-function, OpenGL ES 1.0 class of GPUs started shipping in mobile devices. These dedicated graphics processors were originally focused on drawing and updating the user interface on smartphones and personal data devices. Eventually these graphics units were used for what would be considered the most basic gaming tasks.
Subject: Mobile | June 12, 2015 - 03:28 PM | Jeremy Hellstrom
Tagged: smartphone, sony, xperia z3+, snapdragon 810
The new Sony Xperia Z3+ is a tiny bit thinner than the non-plus model at 146x72x6.9mm and 144g compared to 146x72x7.3mm and 152g. The display is unchanged, a 5.2" IPS screen with a 1080x1920 resolution, but the processor received a significant upgrade: it is now a 64-bit octa-core Qualcomm Snapdragon 810. The phone ships with Android 5.0, and The Inquirer got a chance to try it out. The new processor handles 4K video perfectly, and the phone feels snappier overall compared to the previous model; check out their full experience here.
"SONY UNVEILED its latest top-end smartphone, the Sony Xperia Z3+ this week, with an updated, slimmer design, which has a lighter and sleeker frame compared with its predecessor, the Xperia Z3."
Here are some more Mobile articles from around the web:
- SISWOO C50 Longbow Smartphone Review @ Madshrimps
- Lenovo ThinkPad Helix 2nd Gen Convertible Review @ Madshrimps
- Vodafone Smart Tab 4G Tablet @ Kitguru
Introduction and Specifications
The ASUS Zenfone 2 is a 5.5-inch smartphone with a premium look and the specs to match. But the real story here is that it sells for just $199 or $299 unlocked, making it a tempting alternative to contract phones without the concessions often made with budget devices, at least on paper. Let's take a closer look to see how the new Zenfone 2 stacks up! (Note: a second sample unit was provided by Gearbest.com.)
When I first heard about the Zenfone 2 from ASUS I was eager to check it out given its mix of solid specs, nice appearance, and a startlingly low price. ASUS has created something that has the potential to transcend the disruptive nature of a phone like the Moto E, itself a $149 alternative to contract phones that we reviewed recently. With its premium specs to go along with very low unlocked pricing, the Zenfone 2 could be more than just a bargain device, and if it performs well it could make some serious waves in the smartphone industry.
The Zenfone 2 also features a 5.5-inch IPS LCD screen with 1920x1080 resolution (in line with an iPhone 6 Plus or the OnePlus One), and beyond the internal hardware ASUS has created a phone that looks every bit the part of a premium device that one would expect to cost hundreds more. In fact, without spoiling anything up front, I will say that the context of price won't be necessary to judge the merit of the Zenfone 2; it stands on its own as a smartphone, and not simply a budget phone.
The big question is going to be how the Zenfone 2 compares to existing phones, and with its quad-core Intel Atom SoC this is something of an unknown. Intel has been making a push to enter the U.S. smartphone market (with earlier products more widely available in Europe) and the Zenfone 2 marks an important milestone for both Intel and ASUS in this regard. The Z3580 SoC powering my review unit certainly sounds fast on paper with its 4 cores clocked up to 2.33 GHz, and no less than 4 GB of RAM on board (and a solid 64 GB onboard storage as well).
Introduction and Design
With the introduction of the ASUS G751JT-CH71, we’ve now got our first look at the newest ROG notebook design revision. The celebrated design language remains the same, and the machine’s lineage is immediately discernible. However, unlike the $2,000 G750JX-DB71 unit we reviewed a year and a half ago, this particular G751JT configuration is 25% less expensive at just $1,500. So first off, what’s changed on the inside?
(Editor's Note: This is NOT the recent G-Sync version of the ASUS G751 notebook that was announced at Computex. This is the previously released version, one that I am told will continue to sell for the foreseeable future and one that will come at a lower overall price than the G-Sync enabled model. Expect a review on the G-Sync derivative very soon!)
Quite a lot, as it turns out. For starters, we’ve moved all the way from the 700M series to the 900M series—a leap which clearly ought to pay off in spades in terms of GPU performance. The CPU and RAM remain virtually equivalent, while the battery has migrated from external to internal and enjoyed a 100 mAh bump in the process (from 5900 to 6000 mAh). So what’s with the lower price then? Well, apart from the age difference, it’s the storage: the G750JX featured both a 1 TB storage drive and a 256 GB SSD, while the G751JT-CH71 drops the SSD. That’s a small sacrifice in our book, especially when an SSD is so easily added thereafter. By the way, if you’d rather simply have ASUS handle that part of the equation for you, you can score a virtually equivalent configuration (chipset and design evolutions notwithstanding) in the G751JT-DH72 for $1750—still $250 less than the G750JX we reviewed.
Subject: Graphics Cards, Mobile | June 6, 2015 - 04:05 PM | Scott Michaud
Tagged: VR, nvidia, gameworks vr
So I'm not quite sure what this hypothetical patent device is. According to its application, it is a head-mounted display that contains six cameras (??) and two displays, one for each eye. The usage of these cameras is not defined, but two will point forward, two will point down, and the last two will point left and right. The only clue that we have is in the second patent application photo, where unlabeled hands are gesturing in front of a node labeled “input cameras”.
Image Credit: Declassified
The block diagram declares that the VR headset will have its own CPU, memory, network adapter, and “parallel processing subsystem” (GPU). VRFocus believes that this will be based on the Tegra X1, and that it was supposed to be revealed three months ago at GDC 2015. In its place, NVIDIA announced the Titan X at the Unreal Engine 4 keynote, hosted by Epic Games. GameWorks VR was also announced with the GeForce GTX 980 Ti launch, which was mostly described as a way to reduce rendering cost by dropping resolution in areas that will be warped into a lower final, displayed resolution anyway.
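To get a feel for the "drop resolution where it will be warped down anyway" idea behind GameWorks VR, here is a rough back-of-the-envelope sketch. All of the numbers (render-target size, the 60% full-resolution center, the half-resolution periphery) are illustrative assumptions for this sketch, not NVIDIA's actual figures:

```python
# Rough estimate of pixel savings from multi-resolution rendering:
# the central region is shaded at full resolution, while the peripheral
# bands (which the lens warp compresses anyway) are shaded at a reduced
# scale. Every number below is illustrative, not from NVIDIA.

def shaded_pixels(width, height, center_frac, outer_scale):
    """Pixels shaded when the center keeps full resolution and the
    periphery is rendered at outer_scale in each dimension."""
    center = (width * center_frac) * (height * center_frac)
    periphery = width * height - center
    return center + periphery * outer_scale ** 2

full = 1512 * 1680  # hypothetical per-eye render target
reduced = shaded_pixels(1512, 1680, center_frac=0.6, outer_scale=0.5)
savings = 1 - reduced / full
print(f"shaded pixels: {reduced:.0f} of {full} ({savings:.0%} saved)")
```

With these assumed region sizes, a bit under half the shading work disappears, which is the kind of headroom that matters when a VR headset has to hit its display's refresh rate every single frame.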
Image Credit: Declassified
VRFocus suggests that the reveal could happen at E3 this year. The problem with that theory is that NVIDIA has neither a keynote at E3 this year nor even a place at someone else's keynote as far as we know, just a booth and meeting rooms. Of course, they could still announce it through other channels, but that seems less likely. Maybe they will avoid the E3 hype and announce it later (unless something changes behind the scenes of course)?
Subject: Graphics Cards, Processors, Mobile | June 4, 2015 - 04:58 PM | Scott Michaud
Tagged: amd, carrizo
My discussion of the Carrizo architecture went up a couple of days ago. The post did not include specific SKUs because we did not have those at the time. Now we do, and there will be products: one A8-branded, one A10-branded, and one FX-branded.
All three will be quad-core parts that can range between 12W and 35W designs, although the A8 processor does not have a 35W mode listed in the AMD Dual Graphics table. The FX-8800P is an APU that has all eight GPU cores while the A-series APUs have six. The A10-8700P and the A8-8600P are separated by a couple hundred megahertz in base and boost CPU clocks, and by 80 MHz in GPU clock.
Also, we have been given a table of AMD Radeon R5 and R7 M-series GPUs that can be paired with Carrizo in an AMD Dual Graphics setup. These GPUs are the R7 M365, R7 M360, R7 M350, R7 M340, R5 M335, and R5 M330. They cannot be paired with every Carrizo APU, and some pairings only work in certain power envelopes. Thankfully, this table should only be relevant to OEMs, because end-users are receiving pre-configured systems.
Pricing and availability will depend on OEMs, of course.
Digging into a specific market
A little while ago, I decided to think about processor design as a game. You are given a budget of complexity, which is determined by your process node, power, heat, die size, and so forth, and the objective is to lay out features in the way that suits your goal and workload best. While not the topic of today's post, GPUs are a great example of what I mean. They make the assumption that in a batch of work, nearby tasks are very similar, such as the math behind two neighboring pixels on the screen. This assumption allows GPU manufacturers to save complexity by chaining dozens of cores together into not-quite-independent work groups. The circuit fits the work better, and thus it lets more get done in the same complexity budget.
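The GPU assumption described above, that neighboring tasks run the same math on different data, can be sketched in a few lines. This is a toy illustration of the SIMD/SIMT idea, not a model of any particular GPU; the `shade` function and the 4-wide group size are invented for the example:

```python
# Toy illustration of the data-parallel assumption: neighboring pixels
# run the *same* instruction sequence on different data, so a group of
# "cores" can share one instruction stream instead of each carrying its
# own fetch/decode logic. Purely illustrative.

def shade(x, y):
    """One pixel's math; identical for every pixel in the batch."""
    return (x * 31 + y * 17) % 256

def simd_group(pixels):
    """Run the same shade() across a whole work group in lockstep.
    On real hardware this is one instruction decoder feeding many ALUs."""
    return [shade(x, y) for (x, y) in pixels]

# A 4-wide "work group" of neighboring pixels sharing one program:
group = [(10, 20), (11, 20), (12, 20), (13, 20)]
print(simd_group(group))
```

The complexity saved by sharing one instruction stream across the group is exactly the kind of budget win the paragraph above describes: the circuit fits the workload, so more gets done for the same transistor spend.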
Carrizo is aiming at a 63 million unit per year market segment.
This article is about Carrizo, though. This is AMD's sixth-generation APU, counting from Llano's release in June 2011. For this launch, Carrizo is targeting the 15W and 35W power envelopes for $400-$700 USD notebook devices. AMD needed to increase efficiency on the same 28nm process that we have seen in their product stack since Kabini and Temash were released in May of 2013. They tasked their engineers to optimize their APU's design for these constraints, which led to dense architectures and clever features on the same budget of complexity, rather than smaller transistors or a bigger die.
15W was their primary target, and they claim to have exceeded their own expectations.
Backing up for a second. Beep. Beep. Beep. Beep.
When I met with AMD last month, I brought up the Bulldozer architecture with many individuals. I suspected that it was a quite clever design that didn't reach its potential because of external factors. As I said at the start of this editorial, processor design is a game and, if you can save complexity by knowing your workload, you can do more with less.
Bulldozer looked like it wanted to take a shortcut by cutting elements that its designers believed would be redundant going forward. First and foremost, two cores share a single floating point unit (FPU). While you need some floating point capacity, upcoming workloads could use the GPU for a massive increase in performance, which is right there on the same die. As such, the complexity that is dedicated to every second FPU can be cut and used for something else. You can see this trend throughout various elements of the architecture.
Subject: Mobile | June 2, 2015 - 09:00 AM | Sebastian Peak
Tagged: notebook, msi, Intel Core i7, gaming notebook, computex 2015, computex, Broadwell
MSI has unveiled a refreshed notebook lineup featuring the new quad-core Intel Broadwell mobile processors.
Broadwell launched as a dual-core only option, which resulted in some high-performance notebooks opting to stay with Haswell CPUs. With the introduction of quad-core versions of the new Broadwell chips for mobile, MSI has jumped on the bandwagon to offer a few different options. Of the 20 new notebooks offered by MSI, 18 of them are powered by Intel Core i7 chips.
Intel’s 5th Generation Core i7 processor powers 18 MSI laptop models, including the GT80 Titan SLI, GT72 Dominator, GS70 Stealth, GS60 Ghost, GE72 Apache, GE62 Apache, GP72 Leopard, GP62 Leopard, and the newly announced PX60 Prestige. Available immediately, all gaming notebook models come with an array of superior technologies, including Killer DoubleShot Pro for lag-less gaming, SteelSeries Gaming Keyboard for exceptional customization and feel, and more.
The flagship GT80 Titan SLI has these impressive specs, including an Intel Core i7-5950HQ processor:
GT80 Titan SLI
- Screen: 18.4” 1920x1080 WideView Non-Reflection
- CPU: Intel Core i7-5950HQ, 2.9 - 3.7 GHz
- Chipset: HM87
- Graphics: Dual GTX 980M SLI, 8GB GDDR5 VRAM each
- Memory: 24GB (8GB x3) DDR3L 1600MHz (4 SoDIMM slots, max 32GB)
- Storage: 256GB Super RAID (128GB M.2 SATA x2, RAID 0) + 1TB 7200 RPM HDD
- Optical: BD Burner
- LAN: Killer Gaming Network
- Wireless: Killer N1525 Combo (2x2 ac), BT 4.1
- Card Reader: SDXC
- Video Output: HDMI 1.4, mDP v1.2 x2
- MSRP: $3799.99
The GT80 Titan SLI gaming notebook
1920x1080 with this model seems low, especially considering the obscene amount of VRAM (8GB per card on a laptop? Really?). Still, this notebook has excellent external monitor support with dual mini-DisplayPort outputs, though HDMI is limited to version 1.4.
MSI has also introduced a refreshed GT72 Dominator with NVIDIA G-Sync (covered here), and this new version also features USB 3.1. And for the more business-minded there is the premium PX60 Prestige, now refreshed with Broadwell Core i7 as well.
These refreshed notebook models will be “available immediately” from MSI’s retail partners.
Subject: Mobile | June 2, 2015 - 09:00 AM | Sebastian Peak
Tagged: Tobii Technology, msi, GTX 980M, gt72, gaming notebook, g-sync, eye-tracking, computex 2015, computex
MSI has announced a new version of the GT72 gaming notebook featuring NVIDIA G-SYNC technology.
Like the current GT72 Dominator Pro G, this features NVIDIA GeForce GTX 980M graphics, though this announced version has 8GB of GDDR5 (vs. the previous 4GB) powering its 17.3” display. The G-SYNC implementation with this notebook will allow for variable refresh between 30 and 75 Hz. As the existing GT72 is a 1920x1080 notebook also featuring a GTX 980M, G-SYNC might seem unnecessary, though it would ensure a smoother experience with the newest games at very high detail settings.
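To see why a variable refresh window helps, consider how a G-SYNC-style panel responds to an arbitrary frame time. The 30 - 75 Hz window comes from the announcement; the clamping behavior outside that window is a simplification of this sketch, since real implementations handle the below-window case in different ways:

```python
# Simplified model of variable refresh: within the panel's supported
# window, the display refreshes exactly when a frame is ready, so frame
# time maps directly to refresh interval (no tearing, no repeated-frame
# stutter). Outside the window, this sketch simply clamps to the limits.

MIN_HZ, MAX_HZ = 30, 75          # GT72 G-SYNC window from the article

def refresh_interval_ms(frame_time_ms):
    """Interval the panel actually refreshes at for a given frame time."""
    fastest = 1000 / MAX_HZ      # ~13.3 ms
    slowest = 1000 / MIN_HZ      # ~33.3 ms
    return min(max(frame_time_ms, fastest), slowest)

for ft in (10.0, 16.7, 22.0, 40.0):
    hz = 1000 / refresh_interval_ms(ft)
    print(f"frame time {ft:5.1f} ms -> panel refresh {hz:5.1f} Hz")
```

The middle cases are the interesting ones: a game oscillating between, say, 45 and 60 fps gets a matching refresh every frame instead of being forced to a fixed 60 Hz grid, which is exactly the "smoother experience at very high detail settings" scenario described above.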
Based on the current GT72 Dominator Pro G we can also expect an Intel Broadwell Core i7 mobile processor (the i7-5700HQ in the current model), and these notebooks support up to 32GB of DDR3L 1600MHz memory, as well as up to 4 M.2 SSDs in RAID 0.
MSI is also announcing development, in partnership with eye-tracking company Tobii Technology, of a “fully integrated eye-tracking notebook” for gamers, and MSI will have prototype notebooks at Computex to demonstrate the technology.
We’ll post additional details when available. Right now full specs, as well as pricing and availability, have not been revealed.
Subject: Mobile | June 1, 2015 - 07:47 AM | Sebastian Peak
Tagged: ZenPad, tablets, moorefield, computex 2015, computex, intel atom, atom x3, asus, ZenPad S
ASUS has announced their ZenPad 8.0 Series of tablets, and these feature some proprietary visual enhancements to give their screens more contrast and vivid color:
- ASUS VisualMaster HDR video technology
- ASUS TruVivid (direct bonded glass)
- Bluelight filter (reduces blue light by 30%)
- ASUS Tru2Life (contrast and sharpness enhancement) technology
- Intelligent contrast (up to 150% wider contrast levels)
The ASUS ZenPad S 8.0
What about specifications? Here’s a quick rundown of the tablets, both of which run Android 5.0 Lollipop with the ASUS ZenUI:
ZenPad S 8.0 Z580CA
- Intel Atom Z3580 Moorefield
- Quad-core up to 2.3 GHz
- PowerVR G6430 Rogue graphics
- 4GB of RAM
- 16GB, 32GB or 64GB eMMC storage
- 8-inch IPS panel
- 2048x1536 display (4:3)
- TruVivid technology (direct bonded glass)
- 73.75% screen-to-body ratio
- Dual front speakers
- 5MP front camera, 8MP rear camera
- Micro SDXC (up to 128GB)
- 15.2Wh battery
- GPS & GLONASS
- 802.11b/g/n + Bluetooth 4.0 LE
- USB Type-C connector
ZenPad 8.0 Z380C series
- Intel Atom X3 (SoFIA) C3200 series
- Mali 450MP4 GPU
- 8-inch IPS display
- 76.5% screen-to-body ratio
- 1280 x 800
- 16:10 ratio
- 10-finger multi-touch
- 1GB/2GB RAM
- 8GB or 16GB eMMC
- 802.11b/g/n + Bluetooth 4.0 LE
- 2MP front camera, 5MP rear camera
- 1 x micro SDXC
- Sensors: G-Sensor, E-compass, GPS, Hall sensor, Light sensor
- 4000 mAh battery (non-removable)
No specifics on pricing or availability just yet.