Subject: Mobile | April 23, 2015 - 02:24 PM | Jeremy Hellstrom
Tagged: Samsung, Galaxy S6 Edge, lollipop
The physical differences between the Galaxy S6 and the Galaxy S6 Edge are quite visible, but does the different body justify the price difference? The curved screen adds a bit of screen real estate and provides improved viewing angles compared to the base model, but, as with the previous Galaxy Note Edge, there are not many apps designed to take advantage of the curve. The phone is 7mm thick and weighs slightly less than the base S6 at 132g, with a similar battery and the same TouchWiz overlay on top of Android Lollipop. You can check out what The Inquirer thought of Samsung's new premium phone here if you are considering purchasing the S6 Edge.
"THE GALAXY S6 EDGE will be seen by many as an expensive gimmick given that it's over £100 more expensive than the regular Galaxy S6, while others will see it as Samsung pushing the boundaries of design, and trumping its rivals by bringing something new to the smartphone market."
Here are some more Mobile articles from around the web:
- Camera Shootout : The Samsung Galaxy S6 & Galaxy S6 Edge Vs. The Apple iPhone 6 @ TechARP
- Xiaomi Redmi 2 @ Kitguru
- Blackview Breeze Smartphone Review @ Madshrimps
- OPPO R5 @ Kitguru
- Arion Bluetooth Mini Keyboard with Speakerphone @ eTeknix
- ASUS Republic of Gamers G751JY 17-inch Gaming Laptop Review @ Techgage
Subject: Mobile | April 14, 2015 - 03:33 PM | Jeremy Hellstrom
Tagged: Samsung, galaxy s6, Android 5.0
Samsung's new Galaxy S6 is unique in that it has metal sides and Gorilla Glass on both the back and front of the phone. The body is 143x71x6.8mm and it weighs a total of 138g, compared to the iPhone 6 at 138x67x6.9mm and 129g. The screen is 2560x1440, a density of 577 PPI, which compares favourably to the iPhone's 1334x750 at 326 PPI. The Inquirer was impressed by the quality of the screen as well as the colour calibration, which they felt was significantly better than on the S5. As for performance, the phone was tested by playing three hours of XCOM, which it managed without stuttering or becoming uncomfortably warm. They tested the non-removable battery by looping a video, which the phone could manage for just over eight hours, slightly better than the competition, though owners lose the benefit of battery swapping thanks to the new sealed design. Check out the images taken with the new camera and answers to other specific questions in their full review.
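The quoted pixel densities fall out of the usual diagonal-pixels-per-inch formula. A quick sketch, assuming the commonly quoted 5.1-inch (S6) and 4.7-inch (iPhone 6) diagonals, which are not stated in the review itself:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 5.1)))  # Galaxy S6: ~576, in line with the quoted 577
print(round(ppi(1334, 750, 4.7)))   # iPhone 6: 326
```

The small gap between ~576 and Samsung's quoted 577 comes down to rounding of the diagonal size.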
"Aware of customers' and reviewers' complaints, Samsung made a sweep of reforms in its smartphone division and "went back to the drawing board" with the 2015 Galaxy S6."
Here are some more Mobile articles from around the web:
- Asus ZenFone 5 LTE @ Kitguru
- Blackview Omega Smartphone Review @ Madshrimps
- Adam Elements Bella Power 6000mAh Portable Power Bank Review @ NikKTech
- XMG A505 Gaming Laptop @ HardwareHeaven
- Razer Blade Pro @ Kitguru
Subject: Mobile | April 3, 2015 - 08:00 AM | Tim Verry
Tagged: snapdragon 801, smartphone, quad hd, LG, Android 5.0
Just Delivered is a section of PC Perspective where we share some of the goodies that pass through our labs that may or may not see a review, but are pretty cool nonetheless.
Find the LG G3 on Amazon!
- LG G3 32GB Unlocked US Version (T-Mobile) - $540
- LG G3 16GB Unlocked International Version - $380
- LG G3 32GB Unlocked International Version - $435
Last week I stopped by the T-Mobile store in the mall, handed over two old phones, and ported over two lines from Verizon. I walked out with a cheaper contract with unlimited data (versus 4GB on Verizon) and a shiny new (to me, it's been out for a while) LG G3. Which brings me to this post.
First off, the LG G3 is huge. This is the largest smartphone (smallest tablet, heh) I have ever owned. Measuring 146.3 x 74.6 x 8.9 mm, the 149g smartphone is slightly smaller than the Apple iPhone 6 Plus and a bit chunkier at its thickest point. It is, however, easier to hold and operate (especially one handed) than the iDevice. The front is dominated by a large 5.5-inch Quad HD IPS display (2560 x 1440 resolution) and features round edges and a curved back. I chose the white version, but it also comes in black, blue, gold, red, and purple (the international versions). Except for the top bezel that holds the webcam, light sensor, and speaker, and that bit of empty space below the display with the LG logo, the G3 has super thin bezels. In fact, the phone is not much larger than the display (certainly width wise).
The LG G3's display looks amazing with sharp text and extremely detailed videos (the included 4k content is great). It is highly reflective and I had to crank the brightness all the way up to be able to read it under direct sunlight (my S4 was similar in this respect). In other lighting situations, it worked really well.
An infrared transmitter, microphone, micro USB port, and 3.5mm audio jack are placed along the top and bottom edges of the phone. Like its predecessor (the G2), LG has placed the power and volume buttons on the back of the device rather than the sides (Update: I am generally liking this setup now). The recessed buttons sit beneath the camera lens and are easier to find and use than I expected them to be. Now that I am getting used to them, I think LG is onto something (good) with this button placement. There is also a 1-watt speaker in the lower left corner of the back cover for media playback and speakerphone calls. For a smartphone speaker it can get fairly loud and does what it is supposed to. It is not spectacular but it is also not bad. I mostly use headphones but it's nice to know that I have a decent speaker should I want to share my music.
The curved back cover makes it easy to hold in one hand (even if I can't hit all the on-screen buttons without a longer thumb heh) and I feel like it will be dropped less frequently than my previous phone (the Galaxy S4) as a result of the form factor. One big change with the G3, for me, is the lack of buttons below the display (capacitive or physical), but I am slowly getting used to the on-screen navigation on Android (especially once I figured out I could long press the recent apps button to regain the menu button I miss from my S4).
Aside from the display, the G3 features a 2.1MP front facing camera and a 13MP rear camera. The rear camera is where things get interesting because it is paired with a dual LED flash, laser focus, and optical image stabilization (OIS) technology. Outdoor shots were excellent and indoor shots with enough lighting were great. In low light situations, the camera left something to be desired, and I was kind of disappointed. Using the flash does help and it is quite bright. However, I tend to not like using the flash unless I have to as photos always look less natural. For as small as the camera is though (the lens and sensor are tiny), it does pretty well. In good lighting conditions it trounces my S4, but the upgrade is much less noticeable with less light (though the G3 does have a much brighter flash).
The laser focus is a really cool feature that works as advertised. The camera focuses extremely quickly (even in low light) allowing me a much better chance to capture the moment. It also refocuses (tap to focus) quickly.
The camera software is not as full featured as other smartphones I have used, however. I was put off by this at first as someone that likes to tinker with these things but at the end of the day it does what it is supposed to and it does it well (which is to take photos). You can swipe to switch between the front and back cameras, choose from a couple preset modes, and adjust basic settings like resolution, voice controls, HDR, and shutter timer. For "selfie" fans, LG has a feature where you can make a fist in the air and it will start a countdown timer. While I have not tried the voice commands, I did try the gesture and it does work well.
Anyways, before this turns into a full review (heh), it might help to know what's under the hood as well. The G3 is powered by a Qualcomm Snapdragon 801 SoC which pairs four Krait 400 CPU cores clocked at 2.5 GHz with an Adreno 330 GPU. The phone comes with either 16GB internal storage and 2GB of RAM or 32GB internal storage and 3GB RAM. I chose the higher end model to get the extra RAM just in case as I plan to have this phone for a long time. It supports 4G LTE, 802.11ac Wi-Fi, Bluetooth 4.0, GPS, and NFC (Near Field Communication). You can also use it with Qi-enabled wireless chargers if you purchase a supporting back cover. The G3 is running Android 4.4.2 on T-Mobile but it does support Android 5.0 and some carriers have already pushed out updates.
The G3 comes with a 3,000 mAh battery and a 1.8 amp USB charger. It does take awhile to charge this thing (my 2.1 amp Samsung charger is a bit faster), but once it is fully charged it will easily last all day including listening to streaming music and audiobooks, text messaging, and web browsing. (Update: I don't have specific battery life numbers yet, but I generally only need to charge it once a day so long as I keep the display brightness around half. If I crank the brightness all the way up I can almost feel the battery draining by the second heh.)
Like Samsung, LG has a battery saving feature that will kick in at 30% to conserve battery by turning down the screen brightness, turning off radios that are not active, and disabling a few other configurable battery drainers (haptic feedback, notification lights, and account syncing). I do like their battery settings page as it will estimate the time needed to charge and the time remaining as it discharges, along with a nice graph of battery percentage over time. Other Android phones have something similar but LG has fleshed it out a bit more.
Just for fun, I installed 3DMark and ran the Ice Storm benchmark. The LG G3 maxed out the Ice Storm test and scored 10,033 points in Ice Storm Extreme. Further, it scored 16,151 in Ice Storm Unlimited. In comparison, the (apparently extremely popular judging by the feedback) Samsung Galaxy Centura scored 536 in Ice Storm and 281 in Ice Storm Extreme (hehe). My Galaxy S4 is no longer available for me to test, but TweakTown was able to get 6,723 in the Ice Storm Extreme test.
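To put those Ice Storm Extreme numbers in perspective, a quick sketch normalizing each score against the TweakTown S4 result (the scores are the ones quoted above; the dictionary and labels are just my own framing):

```python
# Ice Storm Extreme scores as quoted in the post
scores = {
    "LG G3": 10033,
    "Galaxy S4 (TweakTown)": 6723,
    "Galaxy Centura": 281,
}

baseline = scores["Galaxy S4 (TweakTown)"]
for name, score in scores.items():
    # Express each device's score relative to the S4 baseline
    print(f"{name}: {score} ({score / baseline:.2f}x the S4)")
```

That works out to roughly a 1.49x lead for the G3 over the S4, with the Centura at a small fraction of either.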
LG packs light with only the smartphone, USB cable, USB charger, and a quick start guide included in the box. No headphones or extra accessories here.
In all, so far so good with the LG G3. I am very happy with my purchase and would recommend checking it out if you are in the market for a large display-packing smartphone that's not an iPhone 6 Plus or Galaxy Note 4 (which Ryan recently reviewed). If you want the latest and greatest Android phone and can afford the premium (about $300 more in my case when I compared them), grab the Note 4. On the other hand, if you are looking for an Android smartphone with a large display, good battery life, and decent hardware specifications, the LG G3 is a respectable choice that delivers and doesn't break the bank.
Have you tried out the G3? What do you think about the trend for larger and thinner smartphones? This is hardly an exhaustive review and there are things I didn't get into here. After all, I'm still checking out my G3. With that said, from first impressions and about a week of usage it seems like a really solid device. I've since fitted it with a screen protector and a case so as to not break it – especially that hi-res display!
Subject: Mobile | April 2, 2015 - 05:26 PM | Jeremy Hellstrom
Tagged: razer, blade 14, gaming laptop
Razer has refreshed their Blade series gaming laptop for 2015, thankfully keeping the M.2 SSD and the 3200×1800 resolution, but unfortunately they stuck with the glossy panel. The i7-4720HQ stays, but the GPU has been replaced with a 3GB GTX 970M and the RAM has been doubled to 16GB, at least in the model which Kitguru tested. The 14" size helps keep the weight down to 4.5lbs but also ensures the price is high; Amazon is currently selling the 512GB model for $2700. If you have the money and require a gaming laptop for some reason this is a great choice; otherwise, spend less on a more powerful desktop machine.
"Gaming laptops have a huge audience, but not everyone wants to lug around a 17 inch behemoth weighing more than 5KG. Razer have enjoyed success in recent years with their Blade range of laptops … even if the price has been prohibitive for many."
Here are some more Mobile articles from around the web:
- Gigabyte P37X Laptop @ HardwareHeaven
- HIS Multi-View X2 USB Docking Station Review @ Madshrimps
- Samsung Galaxy A5 Smartphone Review @ Hardware Secrets
- Acer Liquid Jade Smartphone @ Kitguru
Subject: Mobile | March 30, 2015 - 03:43 PM | Ryan Shrout
Tagged: Tegra X1, tegra, shield portable, shield, portable, nvidia
UPDATE (3/31/15): Thanks to another tip we can confirm that the new SHIELD P2523 will have the Tegra X1 SoC in it. From this manifest document you'll see the Tegra T210 listed (the same part marketed as X1) as well as the code name "Loki." Remember that the first SHIELD Portable device was code named Thor. Oh, so clever, NVIDIA.
Based on a rumor posted by Brad over at Lilliputing, it appears we can expect an updated NVIDIA SHIELD Portable device sometime later in 2015. According to both the Bluetooth and Wi-Fi certification websites, a device going by the name "NVIDIA Shield Portable P2523" has been submitted. There isn't a lot of detail though:
- 802.11a/b/g/n/ac dual-band 2.4 GHz and 5 GHz WiFi
- Bluetooth 4.1
- Android 5.0
- Firmware version 3.10.61
We definitely have a new device here, as the initial SHIELD Portable did not include 802.11ac support at all. And though no data is there to support it, you have to assume that NVIDIA would be using the new Tegra X1 processor in any new SHIELD devices coming out this year. I already previewed the new SHIELD console from GDC that utilizes that same SoC, but it might require a lower clocked, lower power version of the processor to help with heat and battery life on a portable unit.
There’s no information about the processor, screen, or other hardware. But if the new Shield portable is anything like the original, it’ll probably consist of what looks like an Xbox-style game controller with an attached 5 inch display which you can fold up to play games on the go.
And if it’s anything like the new NVIDIA Shield console, it could have a shiny new NVIDIA Tegra X1 processor to replace the aging Tegra 4 chip found in the original Shield Portable.
I wouldn’t be surprised if it also had a higher-resolution display, more memory, or other improvements.
Keep an eye out - NVIDIA may be making a push for even more SHIELD hardware this summer.
Subject: Processors, Mobile | March 25, 2015 - 09:51 PM | Scott Michaud
Tagged: Intel, core m, atom, surface, Surface 2, Windows 8.1, windows 10
The stack of Microsoft tablet devices had high-end Intel Core processors hovering over ARM SoCs, the two separated by a simple “Pro” label (and Windows 8.x versus Windows RT). While the Pro line has been kept reasonably up to date, the lower tier has been stagnant for a while. That is apparently going to change. WinBeta believes that a new, non-Pro Surface will be announced soon, at or before BUILD 2015. Unlike previous Surface models, it will be powered by an x86 processor from Intel, either an Atom or a Core M.
This also means it will run Windows 8.1.
The article claims, somewhat tongue-in-cheek, that Windows RT is dead. No. But still, the device should be eligible for a Windows 10 upgrade when it launches, unlike the RT-based Surfaces. Whether that is a surprise depends on the direction you view it from. I would find it silly for Microsoft to release a new Surface device, months before an OS update, but design it to be incompatible with it. On the other hand, it would be the first non-Pro Surface to do so. Either way, it was reported.
The “Surface 3”, whatever it will be called, is expected to be a fanless design. VR-Zone expects that it will be similar to the 10.6-inch, 1080p form factor of the Surface 2, but that seems to be their speculation. That is about all that we know thus far.
Subject: Mobile | March 24, 2015 - 07:02 PM | Jeremy Hellstrom
Tagged: Android, linux, smartwatch
Linux.com offers you a shopping list of smartwatches which are all less expensive than the fruit flavoured models and run Android or Linux. From familiar models like the Pebble and the older and less impressive Neptune Pine and Omate TrueSmart to leaked models like the Tizen-based Samsung Orbis you have quite a few choices to look through. There is even Monohm's large Runcible that is more of a pocket watch than a wrist watch to consider. In many cases the details are a bit lacking but the model names are known so you can get a leg up on your research for when they are finally revealed with full specifications.
"Much to the delight of Apple fanbots everywhere, Apple has now fully unveiled the Apple Watch. The watch, which was previewed in September, will go on sale April 10 and ship on the 24th. Based on its brand name, styling, accessories, and battery life claims, it will likely be a big hit -- at least as far as smartwatches go."
Here are some more Mobile articles from around the web:
- ASUS ZenFone 6 Mobile @ Kitguru
- Kingston Technologies Mobile Lite G4 Media Reader @ Bjorn3d.com
- Kingston Technologies Data Traveler microDuo 3 @ Bjorn3d
- Seagate Wireless 500GB mobile storage drive @ Kitguru
- Startech Universal USB 3.0 Laptop Docking Station @ Bjorn3d
- MSI GE62 2QD Apache @ HardwareHeaven
Subject: Mobile | March 13, 2015 - 09:33 AM | Sebastian Peak
Tagged: Razer Blade Pro, razer, notebook, laptop, i7-4720HQ, GTX 960M, gaming notebook
Razer has updated their massive Blade Pro notebook with new dual storage options and NVIDIA’s newly announced GeForce GTX 960M graphics.
Razer targets the Blade Pro at both gamers and professionals, placing emphasis on the usefulness of the device beyond gaming. However, being limited to 1920x1080 on a 17.3-inch display will eliminate this from consideration by most creative professionals (though the display does feature an anti-glare matte finish). Aiding the performance/gaming side of the notebook is Razer’s localized heating system which the company claims “focuses on directing heat away from the main touch surfaces of the notebook, to areas that can dissipate heat quickly and are not commonly touched by the user. This allows the laptop to pack in the highest performance available with NVIDIA’s critically acclaimed GTX graphics”.
The Blade Pro is constructed from aluminum and, while reasonably thin at 0.88 inches, the notebook weighs in at a hefty 6.76 pounds (though the likely battery life of such a high-powered system precludes it from a lot of portable use anyway).
One of the most interesting aspects of this design is Razer’s Switchblade User Interface (SBUI), which the company says “is designed for a more efficient and intuitive experience for professionals and gamers.” It combines 10 customizable tactile keys and a unique LCD trackpad (which I would assume features a glass surface). Meanwhile the keyboard is backlit and features anti-ghosting technology as well.
- Intel Core i7-4720HQ Quad Core Processor (2.6GHz / 3.6GHz)
- NVIDIA GeForce GTX 960M (4GB GDDR5 VRAM), Optimus Technology
- 16GB System Memory (DDR3L-1600 MHz)
- Windows 8.1 64-Bit
- 128GB SSD + 500GB HDD / 256GB SSD + 500GB HDD / 512GB SSD + 1TB HDD
- 17.3" Full HD 16:9 Ratio, 1920 x 1080 LED backlit
- Intel Wireless-AC 7260HMW (802.11a/b/g/n/ac + Bluetooth 4.0)
- Gigabit Ethernet port
- 3x USB 3.0 ports
- HDMI 1.4a audio and video output
- Dolby Digital Plus Home Theater Edition
- Built-in stereo speakers
- 3.5 mm microphone/headphone combo jack
- 7.1 Codec support (via HDMI)
- Built-in full-HD webcam (2.0 MP)
- Compact 150 W Power Adapter
- Built-in 74 Wh rechargeable lithium ion polymer battery
- Razer Switchblade User Interface (SBUI)
- Razer Anti-Ghosting Keyboard (with adjustable backlight)
- Razer Synapse Enabled
- Kensington Lock interface
- 16.8 in. (427 mm) Width x 0.88 in. (22.4 mm) Height x 10.9 in. (277 mm) Depth
- 6.76 lbs. / 3.07 kg
The Razer Blade Pro starts at $2299.99 and is available now from the Razer online store.
Subject: General Tech, Mobile | March 13, 2015 - 09:00 AM | Tim Verry
Tagged: haswell, GTX 960M, gaming laptop, g501, ASUS ROG, asus
Today Asus unveiled the Republic of Gamers (ROG) G501 gaming laptop. The G501 is a 4.54 pound 15.6” laptop that packs high end hardware into a thin aluminum shell.
The ROG G501 features a dark gray 0.81” thick aluminum chassis with a brushed metal finish and red bezel accents. A 15.6” matte IPS display dominates the top half of the PC with a resolution of 3840x2160 (UHD). The lower half includes a red backlit keyboard (1.6mm key travel) with colored WASD keys and a number pad as well as a large trackpad.
External I/O on this gaming machine is extensive and includes:
- 1 x Thunderbolt
- 3 x USB 3.0
- 1 x HDMI
- 1 x Audio combo jack
- 1 x SD
- 1 x 1.2MP webcam
- Wi-Fi 802.11ac + Bluetooth 4.0
Asus is using the latest mobile technology with the G501 including a 47W Intel Haswell Core i7-4720HQ (4c/8t) processor, NVIDIA GTX 960M (4GB) graphics card, up to 16GB of DDR3 memory, and an impressive 512GB PCI-E x4 solid state drive (rated at 1,400MB/s reads). The laptop also supports 802.11ac Wi-Fi and Bluetooth 4.0. Asus claims that its Hyper Cool technology will keep the system running cool by using copper heatpipes and giving the CPU and GPU their own heatsink and fan which can be independently controlled to maintain a balance of heat and noise. The laptop is powered by a 96Wh Lithium Polymer battery.
This beastly gaming laptop will be available next month with an MSRP of $1,999 (with the configuration listed above). More information can be found at gseries.asus.com
In addition to the ROG G501, Asus’ GL551 and G751 series are also being refreshed to include NVIDIA’s new GTX 900 series graphics. The GL551JW will get the GTX 960M while the G751JL will use the GTX 965M.
Subject: Mobile | March 12, 2015 - 02:56 PM | Sebastian Peak
Until yesterday virtually all Chromebooks had two things in common: low-end specs and equally low prices. Most sell for around $200 and are available from virtually every manufacturer, and the relative success of these Google Chrome OS laptops in the post-netbook portable space has relied on price. Now Google has announced a new concept for a Chromebook: give it high-end specs and charge $999.
Is it reasonable to assume in 2015 that a user could be perfectly content using cloud storage and web-based apps to accomplish daily tasks? In many cases, yes. But asking $1k on the strength of better hardware is going to be a difficult sell for a Chromebook. The specs are impressive, beginning with a very high resolution 2560x1700 touchscreen, and like the new MacBook this is also sporting USB Type-C connectivity (with the same 5Gbps speed as the Apple implementation).
The pricing for this device continues a disturbing trend, coming just days after Apple's announcement of a Core M MacBook for $1299. In appearance the Pixel seems to borrow rather heavily from the MacBook Air design with a silver finish, glass trackpad, and backlit island-style black keyboard. If the build quality and screen are top notch then Google may have some justification for the price, but with the limitation of just 32GB of local storage (an additional 1TB cloud storage is offered at no cost for 3 years) and an OS that can only run applications from Google's Chrome store, the price does seem high.
Specs from Google below:
- 12.85" multi touch display, 2560 x 1700 (239 ppi), 400 nit brightness, 178° viewing angle
- Intel® Core™ i5 processor @ 2.2GHz, 8GB memory or Intel® Core™ i7 processor @ 2.4GHz, 16GB memory
- Intel® HD Graphics 5500, supports 4K video output over DisplayPort or HDMI with optional Type-C video adapter cable
- 32GB or 64GB of flash storage
- Backlit keyboard, fully clickable etched-glass trackpad
- 720P HD wide angle camera with blue glass
- 2x USB Type-C (up to 5Gbps data, 4K display out with optional HDMI or DisplayPort™ adapter, 60W charging)
- 2x USB 3.0
- SD card reader
- Intel Dual Band Wireless-AC 7260 2x2, Bluetooth 4.0
- High power stereo speakers, built-in microphone, headphone/mic combo jack
- Universal Type-C USB Charger, 60W
- Up to 12 hours of battery life
- Dimensions: 11.7” x 8.8” x 0.6”, 3.3lbs
If you're ready for the $999 Chromebook experience the Pixel is available now from Google's online store.
Subject: Mobile | March 12, 2015 - 02:46 PM | Jeremy Hellstrom
Tagged: msi, gs30, gamingdock
The MSI GS30 Shadow is a high powered laptop with the first external GPU that you can actually buy. The GamingDock is admittedly rather unattractive and hefty on the outside, but it is what is on the inside that counts: a full GTX 980 with its own dedicated PSU. The external connection is a rear mounted PCIe slot, which allows the 980 to run at the speeds you would expect if it were inside a desktop PC. The laptop itself has a Haswell i7-4870HQ, 16GB of DDR3-1600, and a pair of Kingston 256GB M.2 SSDs in RAID 0, with the only internal graphics being the Iris Pro 5200 on the CPU. Kitguru has posted a review here, though it would be interesting to see another review featuring a head to head competition with the GTX 980M.
"When we previewed the MSI GS30 Shadow and GamingDock at the end of 2014 we were blown away by the combination of Core i7-4870HQ CPU in the laptop and the desktop GTX 980 graphics card in the GamingDock. The concept of using an external dock to add proper gaming graphics to a thin and light laptop worked superbly well and we could hardly wait for the official release of the final package of hardware."
Here are some more Mobile articles from around the web:
- HP Spectre x360 @ The Inquirer
- Club3D SenseVision Adapters @ Kitguru
- FSP PB Runner 10400mAh Power Bank Review @ NikKTech
- Apple Watch vs Pebble Time Steel @ The Inquirer
- S6 vs S6 Edge @ The Inquirer
- KingSing T8 Smartphone Review @ Madshrimps
Subject: Graphics Cards, Mobile, Shows and Expos | March 7, 2015 - 07:00 AM | Scott Michaud
Tagged: vulkan, PowerVR, Khronos, Imagination Technologies, gdc 15, GDC
Possibly the most important feature of upcoming graphics APIs, albeit the least interesting for enthusiasts, is how much easier driver development will become. So many decisions and tasks that once lay on the shoulders of AMD, Intel, NVIDIA, and the rest will now be given to game developers or made obsolete. Of course, you might think that game developers would oppose this burden, but (from what I understand) it is a weight they already bear, just when dealing with the symptoms instead of the root problem.
This also helps other hardware vendors become competitive. Imagination Technologies is definitely not new to the field. Their graphics hardware powers the PlayStation Vita, many earlier Intel graphics processors, and the last couple of iPhones. Despite how abruptly the API came about, they have a proof of concept driver that was present at GDC. The unfinished driver was running an OpenGL ES 3.0 demo that was converted to the Vulkan API.
A screenshot of the CPU usage was also provided, which is admittedly heavily cropped and hard to read. The one on the left claims 1.2% CPU load, with a fairly flat curve, while the one on the right claims 5% and seems to waggle more. Granted, the wobble could be partially explained by differences in the time they chose to profile.
According to Tom's Hardware, source code will be released “in the near future”.
Subject: Graphics Cards, Mobile | March 3, 2015 - 10:43 PM | Ryan Shrout
Tagged: Tegra X1, tegra, nvidia, gdc 15, GDC, Doom 3, Crysis 3
Impressively, NVIDIA just showed the new SHIELD powered by Tegra X1 running versions of both Doom 3 and Crysis 3 natively on Android! The games ran at remarkable quality and performance levels.
I have included some videos of these games being played on the SHIELD, but don't judge the visual quality of the game with these videos. They were recorded with a Panasonic GH2 off a 4K TV in a dimly lit room.
Doom 3 is quoted to run at full 1920x1080 and 60 FPS while Crysis 3 is much earlier in its development. Both games looked amazing considering we are talking about a system that has a total power draw of only 15 watts!
While these are just examples of the power that Tegra X1 can offer, it's important to note that this type of application is the exception, not the rule, for Android gaming. Just as we saw with Half-Life 2 and Portal, NVIDIA did most of the leg work to get this version of Doom 3 up and running. Crysis 3 is more of an effort from Crytek explicitly; hopefully the finished port is as gorgeous as this first look.
Subject: General Tech, Mobile | March 3, 2015 - 10:21 PM | Ryan Shrout
Tagged: Tegra X1, tegra, shield, gdc 15, GDC, android tv
NVIDIA just announced a new member of its family of hardware devices: SHIELD. Just SHIELD. Powered by NVIDIA's latest Tegra X1 SoC, with its 8-core CPU and Maxwell GPU, SHIELD will run Android TV and act as a game playing, multimedia watching, GRID streaming set-top box.
Odd naming scheme aside, the SHIELD looks to be an impressive little device, sitting on your home theater or desk, bringing a ton of connectivity and performance to your TV. Running Android TV means the SHIELD will have access to the entire library of Google Play media including music, movies, and apps. SHIELD supports 4K video playback at 60 Hz thanks to an HDMI 2.0 connection and fully supports H.265/HEVC decode thanks to the Tegra X1 processor.
Speaking of the Tegra X1, the SHIELD will include the power of 256 Maxwell architecture CUDA cores and will easily provide the best Android gaming performance of any tablet or set-top box on the market. This means gaming, and lots of it, will be possible on SHIELD. Remember our many discussions about Tegra-specific gaming ports from the past? That trend will continue and more developers are realizing the power that NVIDIA is putting into this tiny chip.
In the box you'll get the SHIELD set-top unit and a SHIELD Controller, the same one released with the SHIELD Tablet last year. A smaller remote control that looks similar to the one used with the Kindle Fire TV will cost a little extra, as will the stand that sets the SHIELD upright.
Pricing on the new SHIELD set-top will be $199, shipping in May.
Subject: Graphics Cards, Mobile | March 3, 2015 - 12:00 PM | Ryan Shrout
Tagged: Unity, lighting, global illumination, geomerics, GDC, arm
Back in 2013 ARM picked up a company called Geomerics, responsible for one of the industry's most advanced dynamic lighting engines, used in games ranging from mobile to console to PC. Called Enlighten, it is the lighting engine in many major games in a variety of markets. Battlefield 3 uses it, Need for Speed: The Run does as well, and The Bureau: XCOM Declassified and Quantum Conundrum mark another pair of major games that depend on Geomerics technology.
Great, but what does that have to do with ARM, and why would the company be interested in investing in software that works with such a wide array of markets, most of which are not dominated by ARM processors? There are two answers, the first of which is directional: ARM is using the minds and creative talent behind Geomerics to help point the Cortex and Mali teams in the correct direction for CPU and GPU architecture development. By designing hardware to better address the advanced software and lighting systems Geomerics builds, Cortex and Mali will have some semblance of an advantage in specific gaming titles as well as a potential "general purpose" advantage. NVIDIA employs hundreds of gaming and software developers for this exact reason: what better way to make sure you are always at the forefront of the gaming ecosystem than getting high-level gaming programmers to point you to that edge? Qualcomm also recently (back in 2012) started employing game and engine developers in-house with the same goals.
ARM also believes it will be beneficial to bring publishers, developers and middleware partners to the ARM ecosystem through deployment of the Enlighten engine. It would be feasible to think console vendors like Microsoft and Sony would be more willing to integrate ARM SoCs (rather than the x86 used in the PS4 and Xbox One) when shown the technical capabilities brought forward by technologies like Geomerics Enlighten.
It’s best to think of the Geomerics acquisition as a kind of insurance policy for ARM, making sure both its hardware and software roadmaps are in line with industry goals and directives.
At GDC 2015 Geomerics is announcing the release of the Enlighten 3 engine, a new version that brings cinematic-quality real-time global illumination to market. Some of the biggest new features include improved accuracy for indirect lighting, color-separated directional output (enabling individual RGB calculations), better light map baking for higher-quality output, and richer material properties to support transparency and occlusion.
All of this technology will be showcased in a new Subway demo that includes real-time global illumination simulation, dynamic transparency and destructible environments.
Geomerics Enlighten 3 Subway Demo
Enlighten 3 will also ship with Forge, a new lighting editor and pipeline tool for content creators looking to streamline the building process. Forge can import from Autodesk 3ds Max and Maya, making interoperability easier, and uses a technology called YEBIS 3 to preview estimated final quality without the time-consuming final-build processing.
Finally, maybe the biggest news for ARM and Geomerics is that the Unity 5 game engine will be using Enlighten as its default lighting engine, giving ARM/Mali a potential advantage for gaming experiences in the near term. Of course, Enlighten is also available as an option for Unreal Engine 3 and 4 for developers using that engine in mobile, console and desktop projects, as well as in SDK form for custom integrations.
Subject: General Tech, Mobile, Shows and Expos | March 1, 2015 - 09:46 PM | Scott Michaud
Tagged: webOS, smartwatch, mwc 15, MWC, LG
A while ago, LG licensed webOS from HP for use in their smart TVs and, as we found out during CES, smart watches.
The LG Urbane LTE is one such device, and we can finally see it in action. It is built around (literally) a circular P-OLED display (320 x 320, 1.3 inches, 245 ppi). Swirling your finger around the face scrolls through elements like a wheel, which should make searching a large list of applications significantly more comfortable than a linear list of elements -- a lot like an iPod click wheel (excluding the Touch and the Shuffle). That said, I have only seen other people use it.
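The wheel-style interaction is easy to picture in code. Here is a minimal, purely illustrative sketch (not LG's implementation; the center point and items-per-revolution value are assumptions) that converts touch coordinates on the round face into an angle and maps angular change onto list position:

```python
import math

def touch_angle(x, y, cx=160, cy=160):
    """Angle of a touch point around the center of a 320x320 panel."""
    return math.atan2(y - cy, x - cx)

def scroll_delta(prev_angle, new_angle, items_per_revolution=12):
    """Map angular change to a list-index change, handling wrap-around at +/-pi."""
    d = new_angle - prev_angle
    # Unwrap so a swirl crossing the -pi/+pi boundary reads as a small step
    if d > math.pi:
        d -= 2 * math.pi
    elif d < -math.pi:
        d += 2 * math.pi
    return d / (2 * math.pi) * items_per_revolution

# A quarter-turn swirl moves a quarter revolution through the list
print(scroll_delta(0.0, math.pi / 2))  # 3.0 items
```

The unwrapping step is what makes continuous swirling work: without it, crossing the angle boundary at the back of the dial would read as a huge jump in the opposite direction.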
The SoC is a Qualcomm Snapdragon 400 clocked at 1.2 GHz. It supports LTE, Wireless-N, Bluetooth 4.0 LE, and NFC. It has 1 GB of RAM, which is quite a bit, and 4 GB of permanent storage, which is not. It also has a bunch of sensors, from accelerometers and gyros to a heart rate monitor and a barometer. It has a speaker and a microphone, but no camera. LG flaunts a 700 mAh battery, which it claims is “the category's largest”, but does not tie that to an actual amount of usage time (only that it “go[es] for days in standby mode”).
Video credit: The Verge
Pricing has not yet been announced, but it should hit the US and Europe before May arrives.
Subject: General Tech, Mobile, Shows and Expos | March 1, 2015 - 05:16 PM | Scott Michaud
Tagged: MWC, mwc 15, GDC, gdc 15, htc, valve, vive, vive vr, Oculus
Mobile World Congress (MWC) and Game Developers Conference (GDC) severely overlap this year, and not just in dates apparently. HTC just announced the Vive VR headset at MWC, which was developed alongside Valve. The developer edition will contain two 1200x1080 displays with a 90Hz refresh rate, and it will launch this spring. The consumer edition will launch this holiday. They made sure to underline 2015, so you know they're serious. Want more information? Well that will be for Valve to discuss at GDC.
The confusing part: why is this not partnered with Oculus? When Michael Abrash left Valve to go there, I assumed that it was Valve shedding its research to Facebook's subsidiary and letting them take the hit. Now, honestly, it seems like Facebook just poached Abrash, Valve said “oh well”, and the two companies kept to their respective research. Who knows? Maybe that is not the case. We might find out more at GDC, but you would expect that Oculus would be mentioned if they had any involvement at all.
Valve will host an event on the second official day of GDC, March 3rd at 3pm. In other words, Valve will make an announcement on 3/3 @ 3. Could it involve Left 4 Dead 3? Portal 3? Will they pull a Crytek and name their engine Source 3? Are they just trolling absolutely everyone? Will it have something to do with NVIDIA's March 3rd announcement? Do you honestly think I have any non-speculative information about this? No. No I don't. There, I answered one of those questions.
Subject: Mobile | March 1, 2015 - 02:01 PM | Sebastian Peak
Tagged: SoC, smartphones, Samsung, MWC 2015, MWC, Galaxy S6 Edge, galaxy s6, Exynos 7420, 14nm
Samsung has announced the new Galaxy S phones at MWC, and the new S6 and S6 Edge should be in line with what you were expecting if you’ve followed recent rumors.
The new Samsung Galaxy S6 and S6 Edge (Image credit: Android Central)
As expected, we no longer see a Qualcomm SoC powering the new phones; as the rumors had indicated, Samsung opted instead for its own Exynos 7 Octa mobile AP. Exynos SoCs have previously appeared in international versions of Samsung’s mobile devices, but the company has apparently ramped up production to meet the demands of the US market as well. There is an interesting twist here, however.
The Exynos 7420 powering both the Galaxy S6 and S6 Edge is an 8-core SoC using ARM’s big.LITTLE design, combining four ARM Cortex-A57 cores with four Cortex-A53 cores. After Samsung announced 14nm FinFET mobile AP production earlier in February, the possibility of the S6 launching with the new part was intriguing, as the current process tech for the Exynos 7 is 20nm HKMG. A switch to the new process so soon before the official announcement seemed unlikely, though, since large-scale 14nm FinFET production was only unveiled on February 16. Regardless, AnandTech is reporting that the new part will indeed be produced on the 14nm process, giving Samsung an industry first for a mobile SoC with the launch of the S6/S6 Edge.
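The big.LITTLE arrangement boils down to placing work on the right cluster: light tasks on the efficient Cortex-A53 cores, heavy tasks on the fast Cortex-A57 cores. A toy sketch of that decision (not Samsung's or Linux's actual scheduler; the 60% threshold is a made-up value for illustration):

```python
# Toy illustration of big.LITTLE cluster selection -- not the real
# HMP/EAS scheduler, just the core idea: light work goes to the
# efficient little cores, heavy work migrates to the big cores.
BIG_CORES = ["A57-0", "A57-1", "A57-2", "A57-3"]      # faster, power-hungry cluster
LITTLE_CORES = ["A53-0", "A53-1", "A53-2", "A53-3"]   # slower, efficient cluster

def place_task(load_pct, up_threshold=60):
    """Pick a cluster based on a task's recent CPU load (hypothetical threshold)."""
    return BIG_CORES if load_pct >= up_threshold else LITTLE_CORES

print(place_task(10))  # background sync -> little cluster
print(place_task(90))  # game render thread -> big cluster
```

Real implementations track load over time and add hysteresis so tasks don't ping-pong between clusters, but the load-threshold idea is the heart of it.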
GSM Arena has specs of the Galaxy S6 posted, and here’s a brief overview:
- Display: 5.1” Super AMOLED, QHD resolution (1440 x 2560, ~577 ppi), Gorilla Glass 4
- OS: Android OS, v5.0 (Lollipop) - TouchWiz UI
- Chipset: Exynos 7420
- CPU: Quad-core 1.5 GHz Cortex-A53 & Quad-core 2.1 GHz Cortex-A57
- GPU: Mali-T760
- Storage/RAM: 32/64/128 GB, 3 GB RAM
- Camera: (Primary) 16 MP, 3456 x 4608, optical image stabilization, autofocus, LED flash
- Battery: 2550 mAh (non-removable)
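The ~577 ppi figure above is simple arithmetic: pixel density is the diagonal pixel count divided by the diagonal size in inches. A quick check lands within a pixel per inch of the quoted number (the exact 577 implies a diagonal fractionally under 5.1 inches):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal pixel count over the diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Galaxy S6: 1440 x 2560 on a 5.1" panel
print(round(ppi(1440, 2560, 5.1)))  # 576, in line with the quoted ~577
```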
The new phones both feature attractive styling, with a metal frame sandwiched between Gorilla Glass 4 on the front and back, giving each phone a glass back.
The back of the new Galaxy S6 (Image credit: Android Central)
The guys at Android Central (source) had some pre-release time with the phones and have a full preview and hands-on video up on their site. The new phones will be released worldwide on April 10, and no specifics on pricing have been announced.
Subject: Mobile | February 28, 2015 - 04:42 PM | Sebastian Peak
Tagged: smartphones, MWC 2015, MWC, Moto E, LG Magna, ios, Android 5.0
Last year my favorite smartphone became the 2014 version of the Moto G. This was (and still is) a $179 unlocked Android phone that shipped with 4.4.4 KitKat, but recently received an OTA update to 5.0 Lollipop (and subsequently 5.0.2 via a second OTA update). Motorola’s aggressive pricing made the phone compelling on paper, but using the device was even more impressive. It looked good, with a 5-inch 720p IPS display and the same design language as the Moto X and later Nexus 6, and ran a virtually untouched stock Android OS. It was never going to win any awards for raw speed, but the quad-core Snapdragon 400 SoC was plenty fast for daily use. The main drawback was a glaring one, however: the Moto G was not LTE capable. Enter the new Moto E.
Here are some quick specs from Motorola:
Moto E 2nd Edition (LTE capable)
4.5” 540x960 display
Quad-core 1.2GHz Cortex-A53/Adreno 306
1GB RAM/8GB storage
2390 mAh battery
We are already off to a solid start in 2015 with a great option from Motorola in the new 2nd edition Moto E. This LTE-capable smartphone might look a little chunky, but the specs make it more than just a compelling option at $149 (unlocked); it could have the disruptive impact on pricing that Microsoft just couldn’t manage last year with its inexpensive Lumia phones. With 2015’s Mobile World Congress (MWC) fast approaching, the Moto E is already making noise in the affordable phone space that last year’s Moto G helped define, and this time the message is clear: in 2015 a smartphone needs to have LTE, regardless of price.
To be fair, Microsoft has already addressed the need for LTE with low-cost Windows Phone devices like the Lumia 635 (actually selling for just $49 on Amazon now), but the app ecosystem for the platform is too restrictive to make it a viable alternative to Android and iOS. Honestly, I love the Windows Phone OS, but there are too many missing apps for it to be a daily driver. So, since Windows clearly isn’t the answer and Apple won’t be selling a sub-$200 unlocked smartphone anytime soon (the cheapest unlocked iPhone is the 8GB 5c at $450), that leaves Android (of course).
Another possibility comes from LG: ahead of MWC the company issued a press release showcasing its new “mid-range” smartphone lineup for 2015. Among the models listed is another phone that matches the specs of a $200-ish unlocked phone, but pricing has not been announced yet.
LG Magna (LTE capable) - Unreleased
5.0” 720x1280 display
1GB RAM, 8GB storage
2540 mAh battery
We await the announcements from MWC, and there are sure to be many other low-cost LTE devices, but it’s already looking like it won’t take more than $200 and a SIM card to escape the endless device upgrade cycle in 2015.
Subject: Graphics Cards, Mobile | February 26, 2015 - 02:15 PM | Ryan Shrout
Tagged: super-gpu, PowerVR, Imagination Technologies, gt7900
As a preview to announcements and releases being made at both Mobile World Congress (MWC) and the Game Developers Conference (GDC) next week, Imagination Technologies took the wraps off a new graphics product it is calling a "super-GPU". The PowerVR GT7900 is the new flagship GPU in the Series7XT family, targeting a growing category called "affordable game consoles." Think of Android-powered set-top devices like the Ouya or Amazon's Fire TV.
PowerVR breaks up its GPU designs into unified shading clusters (USCs), and the GT7900 has 16 of them for a total of 512 ALU cores. Imagination has previously posted a great overview of its USC architecture design and how its designs compare to other GPUs on the market. Imagination claims the GT7900 will offer "PC-class gaming experiences", though that is as ambiguous as the workload of a "console-level game." But with rated peak performance of over 800 GFLOPS in FP32 and 1.6 TFLOPS in FP16 (half precision), this GPU does have significant theoretical capability.
| | PowerVR GT7900 | Tegra X1 |
|---|---|---|
| GPU Clock | 800 MHz | 1000 MHz |
| Process Tech | 16nm FinFET+ | 20nm TSMC |
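The quoted peak figures fall out of simple arithmetic, assuming each ALU retires one fused multiply-add (two FLOPs) per clock and FP16 runs at double rate:

```python
ALUS = 512          # 16 USCs x 32 ALU cores each
CLOCK_GHZ = 0.8     # 800 MHz
FLOPS_PER_FMA = 2   # count a fused multiply-add as two floating-point ops

fp32_gflops = ALUS * FLOPS_PER_FMA * CLOCK_GHZ
fp16_gflops = fp32_gflops * 2  # half precision at double rate

print(fp32_gflops)  # 819.2 -- the "over 800 GFLOPS" claim
print(fp16_gflops)  # 1638.4, i.e. ~1.6 TFLOPS
```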
Imagination also believes that PowerVR offers a larger portion of its peak performance for a longer period of time than the competition thanks to the tile-based deferred rendering (TBDR) approach that has been "refined over the years to deliver unmatched efficiency."
The FP16 number listed above matters as an extreme power-saving option, since half-precision compute operates much more efficiently. A fair concern is how many applications, GPGPU or gaming, actually use the FP16 data type, but supporting it in the GT7900 lets developers target it.
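To see what half precision gives up, Python's standard library can round-trip a value through IEEE 754 FP16 via struct's 'e' format code (Python 3.6+); roughly three significant decimal digits survive, which is fine for color and lighting math but marginal for many GPGPU workloads:

```python
import struct

def to_fp16(x):
    """Round a Python float to IEEE 754 half precision and back ('e' format)."""
    return struct.unpack('e', struct.pack('e', x))[0]

third = 1.0 / 3.0
print(third)           # 0.3333333333333333 (double precision)
print(to_fp16(third))  # 0.333251953125 -- error shows up in the 4th decimal

# 65504 is FP16's largest finite value, and it round-trips exactly
print(to_fp16(65504.0))  # 65504.0
```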
Other key features of the GT7900 include support for OpenGL ES 3.1 + AEP (Android Extension Pack), hardware tessellation, and the ASTC LDR and HDR texture compression standards. The GPU can also run in a multi-domain virtualization mode that allows multiple operating systems to run in parallel on a single platform.
Imagination believes that this generation of PowerVR will "usher in a new era of console-like gaming experiences" and will showcase a new demo at GDC called Dwarf Hall.
I'll be at GDC next week and have already set up a meeting with Imagination to talk about the GT7900, so I should have some hands-on experience to report back soon. I am continually curious about the market for these types of high-end "mobile" GPUs, given the limited audience that Android consoles currently address. Imagination does claim that the GT7900 beats products with performance as high as the GeForce GT 730M discrete GPU - no small feat.