Subject: Mobile | March 24, 2015 - 07:02 PM | Jeremy Hellstrom
Tagged: Android, linux, smartwatch
Linux.com offers you a shopping list of smartwatches which are all less expensive than the fruit flavoured models and run Android or Linux. From familiar models like the Pebble and the older, less impressive Neptune Pine and Omate TrueSmart to leaked models like the Tizen-based Samsung Orbis, you have quite a few choices to look through. There is even Monohm's large Runcible, which is more of a pocket watch than a wrist watch, to consider. In many cases the details are a bit lacking, but the model names are known, so you can get a leg up on your research for when they are finally revealed with full specifications.
"Much to the delight of Apple fanbots everywhere, Apple has now fully unveiled the Apple Watch. The watch, which was previewed in September, will go on sale April 10 and ship on the 24th. Based on its brand name, styling, accessories, and battery life claims, it will likely be a big hit -- at least as far as smartwatches go."
Here are some more Mobile articles from around the web:
- ASUS ZenFone 6 Mobile @ Kitguru
- Kingston Technologies Mobile Lite G4 Media Reader @ Bjorn3d.com
- Kingston Technologies Data Traveler microDuo 3 @ Bjorn3d
- Seagate Wireless 500GB mobile storage drive @ Kitguru
- Startech Universal USB 3.0 Laptop Docking Station @ Bjorn3d
- MSI GE62 2QD Apache @ HardwareHeaven
Subject: Mobile | March 13, 2015 - 09:33 AM | Sebastian Peak
Tagged: Razer Blade Pro, razer, notebook, laptop, i7-4720HQ, GTX 960M, gaming notebook
Razer has updated their massive Blade Pro notebook with new dual storage options and NVIDIA’s newly announced GeForce GTX 960M graphics.
Razer targets the Blade Pro at both gamers and professionals, placing emphasis on the usefulness of the device beyond gaming. However, being limited to 1920x1080 on a 17.3-inch display will eliminate this from consideration by most creative professionals (though the display does feature an anti-glare matte finish). Aiding the performance/gaming side of the notebook is Razer’s localized heating system which the company claims “focuses on directing heat away from the main touch surfaces of the notebook, to areas that can dissipate heat quickly and are not commonly touched by the user. This allows the laptop to pack in the highest performance available with NVIDIA’s critically acclaimed GTX graphics”.
The Blade Pro is constructed from aluminum and, while reasonably thin at 0.88 inches, the notebook weighs in at a hefty 6.76 pounds (though the likely battery life of such a high-powered system precludes much portable use anyway).
One of the most interesting aspects of this design is Razer’s Switchblade User Interface (SBUI), which the company says “is designed for a more efficient and intuitive experience for professionals and gamers.” It combines 10 customizable tactile keys and a unique LCD trackpad (which I would assume features a glass surface). Meanwhile the keyboard is backlit and features anti-ghosting technology as well.
- Intel Core i7-4720HQ Quad Core Processor (2.6GHz / 3.6GHz)
- NVIDIA GeForce GTX 960M (4GB GDDR5 VRAM), Optimus Technology
- 16GB System Memory (DDR3L-1600 MHz)
- Windows 8.1 64-Bit
- 128GB SSD + 500GB HDD / 256GB SSD + 500GB HDD / 512GB SSD + 1TB HDD
- 17.3" Full HD 16:9 Ratio, 1920 x 1080 LED backlit
- Intel Wireless-AC 7260HMW (802.11a/b/g/n/ac + Bluetooth 4.0)
- Gigabit Ethernet port
- 3x USB 3.0 ports
- HDMI 1.4a audio and video output
- Dolby Digital Plus Home Theater Edition
- Built-in stereo speakers
- 3.5 mm microphone/headphone combo jack
- 7.1 Codec support (via HDMI)
- Built-in full-HD webcam (2.0 MP)
- Compact 150 W Power Adapter
- Built-in 74 Wh Rechargeable lithium ion polymer battery
- Razer Switchblade User Interface (SBUI)
- Razer Anti-Ghosting Keyboard (with adjustable backlight)
- Razer Synapse Enabled
- Kensington Lock interface
- 16.8 in. (427 mm) Width x 0.88 in. (22.4 mm) Height x 10.9 in. (277 mm) Depth
- 6.76 lbs. / 3.07 kg
The Razer Blade Pro starts at $2299.99 and is available now from the Razer online store.
Subject: General Tech, Mobile | March 13, 2015 - 09:00 AM | Tim Verry
Tagged: haswell, GTX 960M, gaming laptop, g501, ASUS ROG, asus
Today Asus unveiled the Republic of Gamers (ROG) G501 gaming laptop. The G501 is a 4.54 pound 15.6” laptop that packs high end hardware into a thin aluminum shell.
The ROG G501 features a dark gray 0.81” thick aluminum chassis with a brushed metal finish and red bezel accents. A 15.6” matte IPS display dominates the top half of the PC with a resolution of 3840x2160 (UHD). The lower half includes a red backlit keyboard (1.6mm key travel) with colored WASD keys and a number pad as well as a large trackpad.
External I/O on this gaming machine is extensive and includes:
- 1 x Thunderbolt
- 3 x USB 3.0
- 1 x HDMI
- 1 x Audio combo jack
- 1 x SD
- 1 x 1.2MP webcam
- Wi-Fi 802.11ac + Bluetooth 4.0
Asus is using the latest mobile technology with the G501 including a 47W Intel Haswell Core i7-4720HQ (4c/8t) processor, NVIDIA GTX 960M (4GB) graphics card, up to 16GB of DDR3 memory, and an impressive 512GB PCI-E x4 solid state drive (rated at 1,400MB/s reads). The laptop also supports 802.11ac Wi-Fi and Bluetooth 4.0. Asus claims that its Hyper Cool technology will keep the system running cool by using copper heatpipes and giving the CPU and GPU their own heatsink and fan which can be independently controlled to maintain a balance of heat and noise. The laptop is powered by a 96Wh Lithium Polymer battery.
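That 1,400MB/s read rating is plausible for a PCI-E x4 link. Here is a quick sanity check using the commonly cited per-lane payload rates of roughly 500MB/s for PCIe 2.0 and 985MB/s for PCIe 3.0 (Asus has not specified the link generation, so these rates are assumptions on my part):

```python
# Sanity-check the SSD's 1,400 MB/s read rating against PCIe x4 link
# bandwidth. Per-lane payload rates (after encoding overhead) are
# assumed: ~500 MB/s for PCIe 2.0 and ~985 MB/s for PCIe 3.0.
def link_bandwidth_mb(lanes, per_lane_mb):
    return lanes * per_lane_mb

print(link_bandwidth_mb(4, 500))  # 2000 MB/s -- even PCIe 2.0 x4 covers the rating
print(link_bandwidth_mb(4, 985))  # 3940 MB/s -- PCIe 3.0 x4 leaves ample headroom
```

Either way, the x4 link explains why this drive can run well past the ~550MB/s ceiling of SATA 6Gbps notebooks.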
This beastly gaming laptop will be available next month with an MSRP of $1,999 (with the configuration listed above). More information can be found at gseries.asus.com.
In addition to the ROG G501, Asus’ GL551 and G751 series are also being refreshed to include NVIDIA’s new GTX 900 series graphics. The GL551JW will get the GTX 960M while the G751JL will use the GTX 965M.
Subject: Mobile | March 12, 2015 - 02:56 PM | Sebastian Peak
Until yesterday virtually all Chromebooks had two things in common: low-end specs and equally low prices. Most sell for around $200 and are available from virtually every manufacturer, and the relative success of these Google Chrome OS laptops in the post-netbook portable space has relied on price. Now Google has announced a new concept for a Chromebook: give it high-end specs and charge $999.
Is it reasonable to assume in 2015 that a user could be perfectly content using cloud storage and web-based apps to accomplish daily tasks? In many cases, yes. But asking $1k on the strength of better hardware is going to be a difficult sell for a Chromebook. The specs are impressive, beginning with a very high resolution 2560x1700 touchscreen, and like the new MacBook this is also sporting USB Type-C connectivity (with the same 5Gbps speed as the Apple implementation).
The pricing for this device continues a disturbing trend, coming just days after Apple's announcement of a Core M MacBook for $1299. In appearance the Pixel seems to borrow rather heavily from the MacBook Air design with a silver finish, glass trackpad, and backlit island-style black keyboard. If the build quality and screen are top notch then Google may have some justification for the price, but with the limitation of just 32GB of local storage (an additional 1TB cloud storage is offered at no cost for 3 years) and an OS that can only run applications from Google's Chrome store, the price does seem high.
Specs from Google below:
- 12.85" multi touch display, 2560 x 1700 (239 ppi), 400 nit brightness, 178° viewing angle
- Intel® Core™ i5 processor @ 2.2GHz, 8GB memory or Intel® Core™ i7 processor @ 2.4GHz, 16GB memory
- Intel® HD Graphics 5500, supports 4K video output over DisplayPort or HDMI with optional Type-C video adapter cable
- 32GB or 64GB of flash storage
- Backlit keyboard, fully clickable etched-glass trackpad
- 720P HD wide angle camera with blue glass
- 2x USB Type-C (up to 5Gbps data, 4K display out with optional HDMI or DisplayPort™ adapter, 60W charging)
- 2x USB 3.0
- SD card reader
- Intel Dual Band Wireless-AC 7260 2x2, Bluetooth 4.0
- High power stereo speakers, built-in microphone, headphone/mic combo jack
- Universal Type-C USB Charger, 60W
- Up to 12 hours of battery life
- Dimensions: 11.7” x 8.8” x 0.6”, 3.3lbs
If you're ready for the $999 Chromebook experience the Pixel is available now from Google's online store.
Subject: Mobile | March 12, 2015 - 02:46 PM | Jeremy Hellstrom
Tagged: msi, gs30, gamingdock
The MSI GS30 Shadow is a high-powered laptop with the first external GPU that you can actually buy. The GamingDock is indeed rather unattractive and hefty on the outside, but it is what is on the inside that counts: a full GTX 980 with its own dedicated PSU. The external connection is a rear-mounted PCIe slot which allows the 980 to run at the speeds you would expect if it were inside a desktop PC. The laptop itself has a Haswell i7-4870HQ, 16GB of DDR3-1600, and a pair of Kingston 256GB M.2 SSDs in RAID 0, with the only internal graphics being the Iris Pro 5200 on the CPU. Kitguru has posted a review here, though it would be interesting to see another review featuring a head-to-head competition with the GTX 980M.
"When we previewed the MSI GS30 Shadow and GamingDock at the end of 2014 we were blown away by the combination of Core i7-4870HQ CPU in the laptop and the desktop GTX 980 graphics card in the GamingDock. The concept of using an external dock to add proper gaming graphics to a thin and light laptop worked superbly well and we could hardly wait for the official release of the final package of hardware."
Here are some more Mobile articles from around the web:
- HP Spectre x360 @ The Inquirer
- Club3D SenseVision Adapters @ Kitguru
- FSP PB Runner 10400mAh Power Bank Review @ NikKTech
- Apple Watch vs Pebble Time Steel @ The Inquirer
- S6 vs S6 Edge @ The Inquirer
- KingSing T8 Smartphone Review @ Madshrimps
Subject: Graphics Cards, Mobile, Shows and Expos | March 7, 2015 - 07:00 AM | Scott Michaud
Tagged: vulkan, PowerVR, Khronos, Imagination Technologies, gdc 15, GDC
Possibly the most important feature of upcoming graphics APIs, albeit the least interesting for enthusiasts, is how much easier driver development will become. So many decisions and tasks that once lay on the shoulders of AMD, Intel, NVIDIA, and the rest will now be given to game developers or made obsolete. Of course, you might think that game developers would oppose this burden, but (from what I understand) it is a weight they already bear, just when dealing with the symptoms instead of the root problem.
This also helps other hardware vendors become competitive. Imagination Technologies is definitely not new to the field. Their graphics hardware powers the PlayStation Vita, many earlier Intel graphics processors, and the last couple of iPhones. Despite how abruptly the API came about, they have a proof-of-concept driver that was present at GDC. The unfinished driver was running an OpenGL ES 3.0 demo that was converted to the Vulkan API.
A screenshot of the CPU usage was also provided, which is admittedly heavily cropped and hard to read. The one on the left claims 1.2% CPU load, with a fairly flat curve, while the one on the right claims 5% and seems to waggle more. Granted, the wobble could be partially explained by differences in the time they chose to profile.
According to Tom's Hardware, source code will be released “in the near future”.
Subject: Graphics Cards, Mobile | March 3, 2015 - 10:43 PM | Ryan Shrout
Tagged: Tegra X1, tegra, nvidia, gdc 15, GDC, Doom 3, Crysis 3
NVIDIA just showed the new SHIELD powered by Tegra X1 running versions of both Doom 3 and Crysis 3 natively on Android! The games were running at impressive quality and performance levels.
I have included some videos of these games being played on the SHIELD, but don't judge the visual quality of the games from these videos. They were recorded with a Panasonic GH2 off a 4K TV in a dimly lit room.
Doom 3 is quoted to run at full 1920x1080 and 60 FPS while Crysis 3 is much earlier in its development. Both games looked amazing considering we are talking about a system that has a total power draw of only 15 watts!
While these are just examples of the power that Tegra X1 can offer, it's important to note that this type of application is the exception, not the rule, for Android gaming. Just as we saw with Half-Life 2 and Portal, NVIDIA did most of the legwork to get this version of Doom 3 up and running. Crysis 3 is more of an effort from Crytek explicitly - hopefully the finished port is as gorgeous as this first look.
Subject: General Tech, Mobile | March 3, 2015 - 10:21 PM | Ryan Shrout
Tagged: Tegra X1, tegra, shield, gdc 15, GDC, android tv
NVIDIA just announced a new member of its family of hardware devices: SHIELD. Just SHIELD. Powered by NVIDIA's latest Tegra X1 SoC, with its 8-core CPU and Maxwell GPU, SHIELD will run Android TV and act as a game playing, multimedia watching, GRID streaming set-top box.
Odd naming scheme aside, the SHIELD looks to be an impressive little device, sitting on your home theater or desk, bringing a ton of connectivity and performance to your TV. Running Android TV means the SHIELD will have access to the entire library of Google Play media including music, movies, and apps. SHIELD supports 4K video playback at 60 Hz thanks to an HDMI 2.0 connection and fully supports H.265/HEVC decode thanks to the Tegra X1 processor.
Speaking of the Tegra X1, the SHIELD will include the power of 256 Maxwell architecture CUDA cores and will easily provide the best Android gaming performance of any tablet or set-top box on the market. This means gaming, and lots of it, will be possible on SHIELD. Remember our many discussions about Tegra-specific gaming ports from the past? That trend will continue and more developers are realizing the power that NVIDIA is putting into this tiny chip.
In the box you'll get the SHIELD set-top unit and a SHIELD Controller, the same one released with the SHIELD Tablet last year. A smaller remote that looks similar to the one used with the Amazon Fire TV will cost a little extra, as will the stand that sets the SHIELD upright.
Pricing on the new SHIELD set-top will be $199, shipping in May.
Subject: Graphics Cards, Mobile | March 3, 2015 - 12:00 PM | Ryan Shrout
Tagged: Unity, lighting, global illumination, geomerics, GDC, arm
Back in 2013 ARM picked up a company called Geomerics, responsible for one of the industry’s most advanced dynamic lighting engines, used in games ranging from mobile to console to PC. Called Enlighten, it is the lighting engine in many major games across a variety of markets. Battlefield 3 uses it, Need for Speed: The Run does as well, and The Bureau: XCOM Declassified and Quantum Conundrum mark another pair of major games that depend on Geomerics technology.
Great, but what does that have to do with ARM, and why would the company be interested in investing in software that works with such a wide array of markets, most of which are not dominated by ARM processors? There are two answers, the first of which is directional: ARM is using the minds and creative talent behind Geomerics to help point the Cortex and Mali teams in the correct direction for CPU and GPU architecture development. By designing hardware to better address the advanced software and lighting systems Geomerics builds, Cortex and Mali will have some semblance of an advantage in specific gaming titles as well as a potential “general purpose” advantage. NVIDIA employs hundreds of gaming and software developers for this exact reason: what better way to make sure you are always at the forefront of the gaming ecosystem than getting high-level gaming programmers to point you to that edge? Qualcomm also recently (back in 2012) started employing game and engine developers in-house with the same goals.
ARM also believes it will be beneficial to bring publishers, developers and middleware partners to the ARM ecosystem through deployment of the Enlighten engine. It would be feasible to think console vendors like Microsoft and Sony would be more willing to integrate ARM SoCs (rather than the x86 used in the PS4 and Xbox One) when shown the technical capabilities brought forward by technologies like Geomerics Enlighten.
It’s best to think of the Geomerics acquisition as a kind of insurance program for ARM, making sure both its hardware and software roadmaps are in line with industry goals and directives.
At GDC 2015 Geomerics is announcing the release of the Enlighten 3 engine, a new version that brings cinematic-quality real-time global illumination to market. Some of the biggest new features include additional accuracy on indirect lighting, color separated directional output (enables individual RGB calculations), better light map baking for higher quality output, and richer material properties to support transparency and occlusion.
All of this technology will be showcased in a new Subway demo that includes real-time global illumination simulation, dynamic transparency and destructible environments.
Geomerics Enlighten 3 Subway Demo
Enlighten 3 will also ship with Forge, a new lighting editor and pipeline tool for content creators looking to streamline the building process. Forge will allow imports from Autodesk 3ds Max and Maya, making interoperability easier. Forge uses a technology called YEBIS 3 to show estimated final quality without the time-consuming final-build processing.
Finally, maybe the biggest news for ARM and Geomerics is that the Unity 5 game engine will be using Enlighten as its default lighting engine, giving ARM/Mali a potential advantage for gaming experiences in the near term. Of course Enlighten is available as an option for Unreal Engine 3 and 4 for developers using that engine in mobile, console and desktop projects as well as in an SDK form for custom integrations.
Subject: General Tech, Mobile, Shows and Expos | March 1, 2015 - 09:46 PM | Scott Michaud
Tagged: webOS, smartwatch, mwc 15, MWC, LG
A while ago, LG licensed WebOS from HP for use in their smart TVs and, as we found out during CES, smart watches.
The LG Urbane LTE is one such device, and we can finally see it in action. It is based around (literally) a circular P-OLED display (320 x 320, 1.3 inches, 245 ppi). Swirling your finger around the face scrolls through the elements like a wheel, which should make searching through a large list of applications significantly more comfortable than a linear list of elements -- a lot like an iPod (excluding the Touch and the Shuffle). That said, I have only seen other people use it.
The SoC is a Qualcomm Snapdragon 400, clocked at 1.2 GHz. It supports LTE, Wireless-N, Bluetooth 4.0LE, and NFC. It has 1 GB of RAM, which is quite a bit, and 4GB of permanent storage, which is not. It also has a bunch of sensors, from accelerometers and gyros to heart rate monitors and a barometer. It has a speaker and a microphone, but no camera. LG flaunts a 700 mAh battery, which they claim is “the category's largest”, but they do not link that to an actual amount of usage time (only that it “go[es] for days in standby mode”).
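Since LG gives a capacity but no usage time, a rough conversion shows why mAh alone doesn't tell the whole story. This sketch assumes a typical Li-ion nominal voltage of 3.8 V and a guessed standby draw; neither number comes from LG:

```python
# mAh alone says little about runtime; energy (Wh) and average draw do.
# Assumptions for illustration only: 3.8 V is a typical Li-ion nominal
# voltage, and the 25 mW standby draw is a guess, not an LG figure.
def runtime_hours(capacity_mah, nominal_v, avg_draw_mw):
    energy_mwh = capacity_mah * nominal_v  # capacity converted to energy
    return energy_mwh / avg_draw_mw

print(round(700 * 3.8 / 1000, 2))   # ~2.66 Wh of stored energy
print(runtime_hours(700, 3.8, 25))  # ~106 hours at a 25 mW standby draw
```

At a hypothetical draw like that, "days in standby" is believable; an active LTE radio and lit display would cut it dramatically.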
Video credit: The Verge
Pricing has not yet been announced, but it should hit the US and Europe before May arrives.
Subject: General Tech, Mobile, Shows and Expos | March 1, 2015 - 05:16 PM | Scott Michaud
Tagged: MWC, mwc 15, GDC, gdc 15, htc, valve, vive, vive vr, Oculus
Mobile World Congress (MWC) and Game Developers Conference (GDC) severely overlap this year, and not just in dates apparently. HTC just announced the Vive VR headset at MWC, which was developed alongside Valve. The developer edition will contain two 1200x1080 displays with a 90Hz refresh rate, and it will launch this spring. The consumer edition will launch this holiday. They made sure to underline 2015, so you know they're serious. Want more information? Well that will be for Valve to discuss at GDC.
The confusing part: why is this not partnered with Oculus? When Michael Abrash left Valve to go there, I assumed that it was Valve shedding its research to Facebook's subsidiary and letting them take the hit. Now, honestly, it seems like Facebook just poached Abrash, Valve said “oh well”, and the two companies kept to their respective research. Who knows? Maybe that is not the case. We might find out more at GDC, but you would expect that Oculus would be mentioned if they had any involvement at all.
Valve will host an event on the second official day of GDC, March 3rd at 3pm. In other words, Valve will make an announcement on 3/3 @ 3. Could it involve Left 4 Dead 3? Portal 3? Will they pull a Crytek and name their engine Source 3? Are they just trolling absolutely everyone? Will it have something to do with NVIDIA's March 3rd announcement? Do you honestly think I have any non-speculative information about this? No. No I don't. There, I answered one of those questions.
Subject: Mobile | March 1, 2015 - 02:01 PM | Sebastian Peak
Tagged: SoC, smartphones, Samsung, MWC 2015, MWC, Galaxy S6 Edge, galaxy s6, Exynos 7420, 14nm
Samsung has announced the new Galaxy S phones at MWC, and the new S6 and S6 Edge should be in line with what you were expecting if you’ve followed recent rumors.
The new Samsung Galaxy S6 and S6 Edge (Image credit: Android Central)
As expected, we no longer see a Qualcomm SoC powering the new phones; as the rumors had indicated, Samsung opted instead for their own Exynos 7 Octa mobile AP. Exynos SoCs have previously been found in international versions of Samsung’s mobile devices, but Samsung has apparently ramped up production to meet the demands of the US market as well. There is an interesting twist here, however.
The Exynos 7420 powering both the Galaxy S6 and S6 Edge is an 8-core SoC with ARM’s big.LITTLE design, combining four ARM Cortex-A57 cores and four Cortex-A53 cores. Samsung announced 14nm FinFET mobile AP production earlier in February, raising the possibility of the S6 launching on the new process rather than the 20nm HKMG used by current Exynos 7 parts. A switch so soon before the official announcement seemed unlikely, however, as large-scale 14nm FinFET production was only unveiled on February 16. Regardless, AnandTech is reporting that the new part will indeed be produced on the 14nm process, giving Samsung an industry first for a mobile SoC with the launch of the S6/S6 Edge.
GSM Arena has specs of the Galaxy S6 posted, and here’s a brief overview:
- Display: 5.1” Super AMOLED, QHD resolution (1440 x 2560, ~577 ppi), Gorilla Glass 4
- OS: Android OS, v5.0 (Lollipop) - TouchWiz UI
- Chipset: Exynos 7420
- CPU: Quad-core 1.5 GHz Cortex-A53 & Quad-core 2.1 GHz Cortex-A57
- GPU: Mali-T760
- Storage/RAM: 32/64/128 GB, 3 GB RAM
- Camera: (Primary) 16 MP, 3456 x 4608, optical image stabilization, autofocus, LED flash
- Battery: 2550 mAh (non-removable)
The new phones both feature attractive styling with metal and glass construction and Gorilla Glass 4 sandwiching the frame, giving each phone a glass back.
The back of the new Galaxy S6 (Image credit: Android Central)
The guys at Android Central (source) had some pre-release time with the phones and have a full preview and hands-on video up on their site. The new phones will be released worldwide on April 10, and no specifics on pricing have been announced.
Subject: Mobile | February 28, 2015 - 04:42 PM | Sebastian Peak
Tagged: smartphones, MWC 2015, MWC, Moto E, LG Magna, ios, Android 5.0
Last year my favorite smartphone became the 2014 version of the Moto G. This was (and still is) a $179 unlocked Android phone that shipped with 4.4.4 KitKat, but recently received an OTA update to 5.0 Lollipop (and subsequently 5.0.2 via a second OTA update). Motorola’s aggressive pricing made the phone compelling on paper, but using the device was even more impressive. It looked good, with a 5-inch 720p IPS display and the same design language as the Moto X and later Nexus 6, and ran a virtually untouched stock Android OS. It was never going to win any awards for raw speed, but the quad-core Snapdragon 400 SoC was plenty fast for daily use. The main drawback was a glaring one, however: the Moto G was not LTE capable. Enter the new Moto E.
Here are some quick specs from Motorola:
Moto E 2nd Edition (LTE capable)
- 4.5” 540x960 display
- Quad-core 1.2GHz Cortex-A53/Adreno 306
- 1GB RAM/8GB storage
- 2390 mAh battery
We are already off to a solid start in 2015 with a great option from Motorola in the new 2nd edition Moto E. This LTE capable smartphone might look a little chunky, but the specs make it more than just a compelling option at $149 (unlocked), as it could have the disruptive impact on price that Microsoft just couldn’t manage last year with their inexpensive Lumia phones. With 2015’s Mobile World Congress (MWC) fast approaching, the Moto E has already been making some noise in the affordable phone space that last year’s Moto G helped define, and this time the message is clear: in 2015 a smartphone needs to have LTE, regardless of price.
To be fair, Microsoft has already addressed the need for LTE with their low-cost Windows Phone devices like the Lumia 635 (which is actually selling for just $49 on Amazon now), but the app ecosystem for the platform is just too restrictive to make it a viable solution compared to Android and iOS. Honestly, I love the Windows Phone OS but there are too many missing apps to make it a daily driver. So, since Windows clearly isn’t the answer and Apple won’t be selling a sub-$200 unlocked smartphone anytime soon (the cheapest unlocked iPhone is the 8GB 5c at $450), that leaves Android (of course).
Another possibility comes from LG, as ahead of MWC there was a press release from the company showcasing their new “mid-range” smartphone lineup for 2015. Among the models listed is another phone that matches the specs associated with a $200-ish unlocked phone, but pricing has not been announced yet.
LG Magna (LTE capable) - Unreleased
- 5.0” 720x1280 display
- 1GB RAM, 8GB storage
- 2540 mAh battery
We await the announcements from MWC and there are sure to be many other examples of low-cost LTE devices, but already it’s looking like it won’t take more than $200 and a SIM card to avoid the endless device upgrade cycle in 2015.
Subject: Graphics Cards, Mobile | February 26, 2015 - 02:15 PM | Ryan Shrout
Tagged: super-gpu, PowerVR, Imagination Technologies, gt7900
As a preview to announcements and releases being made at both Mobile World Congress (MWC) and the Game Developers Conference (GDC) next week, Imagination Technologies took the wraps off of a new graphics product they are calling a "super-GPU". The PowerVR GT7900 is the new flagship GPU in its Series7XT family, targeting a growing category called "affordable game consoles." Think of Android-powered set-top devices like the Ouya or Amazon's Fire TV.
PowerVR breaks up its GPU designs into unified shading clusters (USCs), and the GT7900 has 16 of them for a total of 512 ALU cores. Imagination has previously posted a great overview of its USC architecture design and how you can compare its designs to other GPUs on the market. Imagination wants to claim that the GT7900 will offer "PC-class gaming experiences", though that is as ambiguous a notion as the workload of a "console-level game." But with rated peak performance levels hitting over 800 GFLOPS in FP32 and 1.6 TFLOPS in FP16 (half-precision), this GPU does have significant theoretical capability.
|              | PowerVR GT7900 | Tegra X1  |
|--------------|----------------|-----------|
| GPU Clock    | 800 MHz        | 1000 MHz  |
| Process Tech | 16nm FinFET+   | 20nm TSMC |
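Those peak figures line up with simple back-of-the-envelope arithmetic. A minimal sketch, assuming each of the 512 ALU cores retires one fused multiply-add (2 FLOPs) per clock and that FP16 runs at double rate -- both assumptions on my part, not vendor-confirmed details:

```python
# Back-of-the-envelope check of the quoted GT7900 peak numbers,
# assuming one FMA (2 FLOPs) per ALU core per clock and double-rate
# FP16 -- assumptions, not vendor data.
def peak_gflops(alu_cores, clock_ghz, flops_per_clock=2):
    return alu_cores * clock_ghz * flops_per_clock

fp32 = peak_gflops(512, 0.8)  # 16 USCs x 32 ALUs at 800 MHz
fp16 = 2 * fp32               # half-precision at double rate

print(fp32)  # 819.2 GFLOPS -- matches "over 800 GFLOPS" in FP32
print(fp16)  # 1638.4 GFLOPS -- matches ~1.6 TFLOPS in FP16
```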
Imagination also believes that PowerVR offers a larger portion of its peak performance for a longer period of time than the competition thanks to the tile-based deferred rendering (TBDR) approach that has been "refined over the years to deliver unmatched efficiency."
The FP16 performance number listed above is useful as an extreme power-savings option, as half-precision compute operates in a much more efficient manner. A fair concern is how many applications, GPGPU or gaming, actually utilize the FP16 data type, but having support for it in the GT7900 allows developers to target it.
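To illustrate the precision side of that trade-off, here is a small sketch using Python's struct 'e' format, which round-trips a value through IEEE 754 half precision (the same binary16 format FP16 hardware uses):

```python
import struct

def to_fp16(x):
    """Round-trip a Python float through IEEE 754 half precision (binary16)."""
    return struct.unpack('e', struct.pack('e', x))[0]

# With only a 10-bit mantissa, values near 1000 are spaced 0.5 apart,
# so small increments simply vanish -- fine for color and lighting
# math, risky for accumulating long sums in GPGPU work.
print(to_fp16(1000.1))   # 1000.0 -- the 0.1 is below the local precision
print(to_fp16(0.1))      # ~0.0999756, not exactly 0.1
```

That roughly 0.1% relative error is invisible in shaded pixels, which is why developers can bank the power savings there while keeping FP32 for anything that accumulates.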
Other key features of the GT7900 include support for OpenGL ES 3.1 + AEP (Android Extension Pack), hardware tessellation and ASTC LDR and HDR texture compression standards. The GPU also can run in a multi-domain virtualization mode that would allow multiple operating systems to run in parallel on a single platform.
Imagination believes that this generation of PowerVR will "usher a new era of console-like gaming experiences" and will showcase a new demo at GDC called Dwarf Hall.
I'll be at GDC next week and have already set up a meeting with Imagination to talk about the GT7900, so I can have some hands-on experiences to report back with soon. I am continually curious about the demand for these types of high-end "mobile" GPUs given the limited audience that the Android console market currently addresses. Imagination does claim that the GT7900 beats products with performance levels as high as the GeForce GT 730M discrete GPU - no small feat.
Subject: Mobile | February 25, 2015 - 04:46 PM | Jeremy Hellstrom
Tagged: z3580, venue 8 7000, venue, tablet, silvermont, moorefield, Intel, dell, atom z3580, Android
Dell's Venue 8 7000 tablet sports an 8.4" 2560x1600 OLED display and is powered by the Moorefield-based Atom Z3580 SoC, with 2GB of LPDDR3-1600, 16GB of internal storage, and support for up to a 512GB microSD card. Even more impressive is that The Tech Report had no issues installing apps or moving files to the SD card with ES File Explorer, unlike many Android devices that need certain programs to reside on the internal storage media. Like Ryan, they had a lot of fun with the RealSense camera and are looking forward to the Lollipop upgrade. Check out The Tech Report's opinion of this impressive Android tablet right here.
"Dell's Venue 8 7000 is the thinnest tablet around, and that's not even the most exciting thing about it. This premium Android slate packs a Moorefield-based Atom processor with quad x86 cores, a RealSense camera that embeds 3D depth data into still images, and a staggeringly beautiful OLED display that steals the show. Read on for our take on a truly compelling tablet."
Here are some more Mobile articles from around the web:
- Lenovo ThinkPad X1 Carbon Works Great As A Linux Ultrabook @ Phoronix
- Cooler Master NotePal ERGOSTAND III Review @ Techgage
- Portable Smartphone Battery Pack Roundup @ eTeknix
- Sandberg Outdoor Powerbank 10400 mAh Review @ NikKTech
- Xiaomi Mi4 64GB Smartphone Review @ Madshrimps
Subject: General Tech, Mobile | February 20, 2015 - 07:00 AM | Scott Michaud
Tagged: shieldtuesday, shield, Saints Row IV, nvidia, metro last light, gridtuesday, grid, alan wake
Once again, NVIDIA brings some really good games to their GRID service, which is currently free for all SHIELD owners. The concept is that NVIDIA will compute the graphics at their server farms, accept your input, and return an audio/video stream of the result. This is a very convenient way to access content, but it cannot replace actual ownership for guaranteed access to specific art that you find intrinsically valuable. It can help you discover new content, though.
This week, Saints Row IV is available to be played on the GRID gaming service. Its predecessor, Saints Row: The Third, was published on GRID earlier this month. It would be good to play them in order, and they are both worth your time. I did find that the campaign of Saints Row IV was a bit less unique because the majority of missions were a handful of side-missions strung together, while Saints Row: The Third had more scenario-based objectives, with the side-missions as an option to build up stats (or just have fun) between them. On the other hand, the movement mechanics in IV were genius. Play them both.
Looking ahead, next Tuesday will be Alan Wake. This is a survival-horror title from Remedy that makes you appreciate just how long your batteries last in real life. Basically, electricity is light and light is a vulnerability for the monsters that want to destroy you. The week after, the third of March, is Metro: Last Light Redux. This is one of the most visually demanding games available, and it is still used as a GPU benchmark at this site.
Saints Row IV went live last Tuesday, while Alan Wake arrives on the 24th and Metro: Last Light comes in last, on March 3rd.
Subject: Graphics Cards, Mobile | February 19, 2015 - 03:58 PM | Ryan Shrout
Tagged: nvidia, notebooks, mobile, gpu
After a week or so of debate circling NVIDIA's decision to disable overclocking on mobility GPUs, we have word that the company has reconsidered and will be re-enabling the feature in next month's driver release:
As you know, we are constantly tuning and optimizing the performance of your GeForce PC.
We obsess over every possible optimization so that you can enjoy a perfectly stable machine that balances game, thermal, power, and acoustic performance.
Still, many of you enjoy pushing the system even further with overclocking.
Our recent driver update disabled overclocking on some GTX notebooks. We heard from many of you that you would like this feature enabled again. So, we will again be enabling overclocking in our upcoming driver release next month for those affected notebooks.
If you are eager to regain this capability right away, you can also revert back to 344.75.
Now, I don't want to brag here, but we did just rail on NVIDIA for this decision on last night's podcast... and then the reversal was posted on NVIDIA's forums just four hours ago... I'm not saying, I'm just saying!
All kidding aside, this is great news! And NVIDIA desperately needs to be paying attention to what consumers are asking for in order to make up for some poor decisions made in the last several months. Now (or at least soon), you will be able to return to your mobile GPU overclocking!
Subject: Mobile | February 16, 2015 - 03:54 AM | Sebastian Peak
Tagged: zenbook, UX305, ultraportable, ips display, core m, asus, 5Y10
ASUS has announced the availability and pricing for the ZenBook UX305, and the specifications are quite exceptional for the price. Not content to compete on hardware specs alone, ASUS has made the notebook a minuscule 0.48” thick, making the UX305 the world’s thinnest ultraportable notebook according to the company.
As impressive as the slim profile of the aluminum design might be, it is more impressive to look over the main specifications of the $699 UX305:
- Intel Core M 5Y10 processor
- 8GB of LPDDR3 memory
- 256GB SSD
- 13.3-inch 1920x1080 IPS display (matte finish)
I'll let that sink in for a moment. Quite an impressive list given the MSRP for these specifications is, again, only $699. At this price it's going to be very difficult to beat the UX305 considering what’s under the hood, as this configuration contains double the memory and storage space of many ultraportables in this price class. And 1080p IPS on top of everything is just icing on the cake. Battery life should be very good considering the processor at the heart of this machine is Intel's newest low-power Broadwell-based Core M (the 5Y10), which features HD 5300 graphics and a TDP of just 4.5W. Moreover, the processor is passively cooled and the notebook features a completely fanless design for silent operation.
Since there are no fans to expel heat, ASUS has made it a point to promise that the palm rest will always stay cool thanks to their “IceCool technology” (whatever that is - but I really hope it’s an ice cube cooling system). The UX305 is powered by a 45Wh lithium-polymer battery with a claimed 10-hour battery life, and the notebook features 802.11ac wireless, three USB 3.0 ports, and includes a USB Ethernet adapter (a nice touch). ASUS is also touting a premium sound system with this notebook, employing a B&O ICEpower amplifier and enhanced with their proprietary “SonicMaster audio”. Rounding out the feature list is an SD card reader and 720p webcam.
The notebook weighs in at 2.6 lbs, and this configuration of the UX305 is available immediately (listed on ASUS' official store). With the surprisingly low MSRP it sounds like this ZenBook will be a solid choice for anyone looking for the latest notebook tech on a budget, and depending on performance and real-world battery life it could just be that mythical MacBook Air "killer" (if you're OK with Windows 8 over OS X, of course).
Subject: Graphics Cards, Mobile, Shows and Expos | February 11, 2015 - 03:25 PM | Scott Michaud
Tagged: Tegra X1, nvidia, mwc 15, MWC, gdc 2015, GDC, DirectX 12
On March 3rd, NVIDIA will host an event called “Made to Game”. Invitations have gone out to numerous outlets, including Android Police, who published a censored screenshot of it. This suggests that it will have something to do with the Tegra X1, especially since the date is the day after Mobile World Congress starts. Despite all of this, I think it is for something else entirely.
Image Credit: Android Police
Allow me to highlight two points. First, The Tech Report claims that the event is taking place in San Francisco, which is about as far away from Barcelona, Spain as you can get. It is close to GDC, however, which also starts on March 2nd. If this were meant to align with Mobile World Congress, you ideally would not want attendees to take a 14-hour flight for a day trip.
Second, the invitation specifically says: “More than 5 years in the making, what I want to share with you will redefine the future of gaming.” Compare that to the DirectX 12 announcement blog post on March 20th of last year (2014): “Our work with Microsoft on DirectX 12 began more than four years ago with discussions about reducing resource overhead. For the past year, NVIDIA has been working closely with the DirectX team to deliver a working design and implementation of DX12 at GDC (2014).”
So yeah, while it might involve the Tegra X1 processor for Windows 10 on mobile devices, which is the only reason I can think of that they would want Android Police there apart from "We're inviting everyone everywhere", I expect that this event is for DirectX 12. I assume that Microsoft would host their own event that involves many partners, but I could see NVIDIA having a desire to save a bit for something of their own. What would that be? No idea.
Subject: Mobile | February 11, 2015 - 02:36 PM | Jeremy Hellstrom
Tagged: Samsung, Note 4, Exynos 5433, snapdragon 805, phablet
At 5.7" and 176g, the Samsung Note 4 is a large device, and it has a resolution to match at 2560x1440. That resolution does slow it down somewhat: in graphics tests it falls behind the iPhone 6 Plus except in Basemark X and 3DMark's Ice Storm test, but it shows up the competition when it comes to graphical quality, with only NVIDIA's Shield beating it in the GFXBench quality tests. In the CPU tests it scored moderately well in single-threaded applications but wipes the floor with the competition in multi-threaded performance, which you should keep in mind when choosing your purchase. To see more benchmarks and details, The Tech Report's full review can be found right here.
"Most of the world gets a variant of Samsung's Galaxy Note 4 based on Qualcomm's familiar Snapdragon 805 system-on-a-chip (SoC). In Samsung's home country of Korea, though, the firm ships a different variant of the Note 4 based on Samsung LSI's Exynos 5433 SoC. With eight 64-bit CPU cores and a 64-bit Mali-T760 GPU, the Exynos 5433 could make this version the fastest and most capable Note 4--and it gives us some quality time with the Cortex-A53 and A57 CPU cores that will likely dominate the Android market in 2015."
Here are some more Mobile articles from around the web:
- 9-Way Linux Laptop Performance Comparison From Intel Nehalem To Broadwell @ Phoronix
- The 2015 Alienware 15 & Alienware 17 Launch Event @ Tech ARP
- LUXA2 P1-PRO 7000mAh Outdoor Power Bank Review @ NikKTech
- Luxa2 PL3 10,400 mAh Leather Power Bank Review @ HiTech Legion
- Luxa2 EnerG Slim 10,000mAh Power Bank Review @ OCC
- Noreve iPad Air 2 Protective Cases Review @ Madshrimps
- Sony Smartwatch 3 @ The Inquirer