Subject: Mobile | March 12, 2015 - 06:56 PM | Sebastian Peak
Until yesterday, virtually all Chromebooks had two things in common: low-end specs and equally low prices. Most sell for around $200 and are available from nearly every manufacturer, and the relative success of these Google Chrome OS laptops in the post-netbook portable space has hinged on price. Now Google has announced a new concept for a Chromebook: give it high-end specs and charge $999.
Is it reasonable to assume in 2015 that a user could be perfectly content using cloud storage and web-based apps to accomplish daily tasks? In many cases, yes. But asking $1k on the strength of better hardware is going to be a difficult sell for a Chromebook. The specs are impressive, beginning with a very high resolution 2560x1700 touchscreen, and like the new MacBook it also sports USB Type-C connectivity (with the same 5Gbps speed as Apple's implementation).
The pricing for this device continues a disturbing trend, coming just days after Apple's announcement of a Core M MacBook for $1299. In appearance the Pixel seems to borrow rather heavily from the MacBook Air design with a silver finish, glass trackpad, and backlit island-style black keyboard. If the build quality and screen are top notch then Google may have some justification for the price, but with the limitation of just 32GB of local storage (an additional 1TB cloud storage is offered at no cost for 3 years) and an OS that can only run applications from Google's Chrome store, the price does seem high.
Specs from Google below:
- 12.85" multi touch display, 2560 x 1700 (239 ppi), 400 nit brightness, 178° viewing angle
- Intel® Core™ i5 processor @ 2.2GHz, 8GB memory or Intel® Core™ i7 processor @ 2.4GHz, 16GB memory
- Intel® HD Graphics 5500, supports 4K video output over DisplayPort or HDMI with optional Type-C video adapter cable
- 32GB or 64GB of flash storage
- Backlit keyboard, fully clickable etched-glass trackpad
- 720p HD wide angle camera with blue glass
- 2x USB Type-C (up to 5Gbps data, 4K display out with optional HDMI or DisplayPort™ adapter, 60W charging)
- 2x USB 3.0
- SD card reader
- Intel Dual Band Wireless-AC 7260 2x2, Bluetooth 4.0
- High power stereo speakers, built-in microphone, headphone/mic combo jack
- Universal Type-C USB Charger, 60W
- Up to 12 hours of battery life
- Dimensions: 11.7” x 8.8” x 0.6”, 3.3lbs
If you're ready for the $999 Chromebook experience, the Pixel is available now from Google's online store.
Subject: Mobile | March 12, 2015 - 06:46 PM | Jeremy Hellstrom
Tagged: msi, gs30, gamingdock
The MSI GS30 Shadow is a high powered laptop with the first external GPU that you can actually buy. The GamingDock is indeed rather unattractive and hefty on the outside, but it is what is on the inside that counts: a full GTX 980 with its own dedicated PSU. The external connection is a rear-mounted PCIe slot, which allows the 980 to run at the speeds you would expect if it were inside a desktop PC. The laptop itself has a Haswell i7-4870HQ, 16GB of DDR3-1600, and a pair of Kingston 256GB M.2 SSDs in RAID 0, with the only internal graphics being the Iris Pro 5200 on the CPU. Kitguru has posted a review here, though it would be interesting to see another review featuring a head-to-head competition with the GTX 980M.
"When we previewed the MSI GS30 Shadow and GamingDock at the end of 2014 we were blown away by the combination of Core i7-4870HQ CPU in the laptop and the desktop GTX 980 graphics card in the GamingDock. The concept of using an external dock to add proper gaming graphics to a thin and light laptop worked superbly well and we could hardly wait for the official release of the final package of hardware."
Here are some more Mobile articles from around the web:
- HP Spectre x360 @ The Inquirer
- Club3D SenseVision Adapters @ Kitguru
- FSP PB Runner 10400mAh Power Bank Review @ NikKTech
- Apple Watch vs Pebble Time Steel @ The Inquirer
- S6 vs S6 Edge @ The Inquirer
- KingSing T8 Smartphone Review @ Madshrimps
Subject: Graphics Cards, Mobile, Shows and Expos | March 7, 2015 - 12:00 PM | Scott Michaud
Tagged: vulkan, PowerVR, Khronos, Imagination Technologies, gdc 15, GDC
Possibly the most important feature of upcoming graphics APIs, albeit the least interesting for enthusiasts, is how much easier driver development will become. Many decisions and tasks that once lay on the shoulders of AMD, Intel, NVIDIA, and the rest will now be given to game developers or made obsolete. Of course, you might think that game developers would oppose this burden, but (from what I understand) it is a weight they already bear, just while dealing with the symptoms instead of the root problem.
This also helps other hardware vendors become competitive. Imagination Technologies is definitely not new to the field. Their graphics hardware powers the PlayStation Vita, many earlier Intel graphics processors, and the last couple of iPhones. Despite how abruptly the API came about, they had a proof-of-concept driver present at GDC. The unfinished driver was running an OpenGL ES 3.0 demo that was converted to the Vulkan API.
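To see why shifting work from the driver to the developer cuts CPU overhead, consider a toy model (plain Python, not real Vulkan or OpenGL bindings; all class and method names here are illustrative): an implicit, OpenGL-style driver must re-validate mutable state on every draw call, while an explicit, Vulkan-style API bakes state into an immutable pipeline object that is validated once, up front.

```python
# Toy sketch of implicit vs explicit graphics APIs. Names are hypothetical;
# this only models where state validation happens, not real rendering.

class ImplicitDriver:
    """OpenGL-style: the driver re-checks state at every draw call."""
    def __init__(self):
        self.state = {}
        self.validations = 0

    def set_state(self, key, value):
        self.state[key] = value

    def draw(self):
        self.validations += 1              # per-draw validation cost
        assert "shader" in self.state, "incomplete state"

class ExplicitDriver:
    """Vulkan-style: state is baked and validated once at creation."""
    def __init__(self):
        self.validations = 0

    def create_pipeline(self, state):
        self.validations += 1              # validated once, up front
        assert "shader" in state, "incomplete state"
        return dict(state)                 # immutable from the driver's view

    def draw(self, pipeline):
        pass                               # nothing left for the driver to check

gl, vk = ImplicitDriver(), ExplicitDriver()
gl.set_state("shader", "basic")
pipe = vk.create_pipeline({"shader": "basic"})
for _ in range(1000):
    gl.draw()
    vk.draw(pipe)
print(gl.validations, vk.validations)      # 1000 1
```

The work does not disappear; it moves to pipeline-creation time, which is exactly the burden the article says developers already carry in other forms.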
A screenshot of the CPU usage was also provided, which is admittedly heavily cropped and hard to read. The one on the left claims 1.2% CPU load, with a fairly flat curve, while the one on the right claims 5% and seems to waggle more. Granted, the wobble could be partially explained by differences in the time they chose to profile.
According to Tom's Hardware, source code will be released “in the near future”.
Subject: Graphics Cards, Mobile | March 4, 2015 - 03:43 AM | Ryan Shrout
Tagged: Tegra X1, tegra, nvidia, gdc 15, GDC, Doom 3, Crysis 3
Impressively, NVIDIA just showed the new SHIELD powered by Tegra X1 running versions of both Doom 3 and Crysis 3 natively on Android! The games were running at impressive quality and performance levels.
I have included some videos of these games being played on the SHIELD, but don't judge the visual quality of the games by these videos. They were recorded with a Panasonic GH2 off a 4K TV in a dimly lit room.
Doom 3 is quoted to run at full 1920x1080 and 60 FPS while Crysis 3 is much earlier in its development. Both games looked amazing considering we are talking about a system that has a total power draw of only 15 watts!
While these are just examples of the power that Tegra X1 can offer, it's important to note that this type of application is the exception, not the rule, for Android gaming. Just as we saw with Half-Life 2 and Portal, NVIDIA did most of the leg work to get this version of Doom 3 up and running. Crysis 3 is more of an effort from Crytek itself - hopefully the finished port is as gorgeous as this first look.
Subject: General Tech, Mobile | March 4, 2015 - 03:21 AM | Ryan Shrout
Tagged: Tegra X1, tegra, shield, gdc 15, GDC, android tv
NVIDIA just announced a new member of its family of hardware devices: SHIELD. Just SHIELD. Powered by NVIDIA's latest Tegra X1 SoC, with its 8-core CPU and Maxwell GPU, SHIELD will run Android TV and act as a game playing, multimedia watching, GRID streaming set-top box.
Odd naming scheme aside, the SHIELD looks to be an impressive little device, sitting on your home theater or desk and bringing a ton of connectivity and performance to your TV. Running Android TV means the SHIELD will have access to the entire library of Google Play media including music, movies, and apps. SHIELD supports 4K video playback at 60 Hz thanks to an HDMI 2.0 connection and fully supports H.265/HEVC decode thanks to the Tegra X1 processor.
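HDMI 2.0 matters for 4K60 because of simple bandwidth arithmetic. A rough check (the bandwidth constants below are my figures for effective video throughput after coding overhead, not from the article, and blanking intervals push the real requirement even higher):

```python
# Back-of-envelope: why 4K at 60 Hz needs HDMI 2.0. Raw pixel data alone
# for 3840x2160 @ 60 Hz, 24-bit color, exceeds HDMI 1.4's video bandwidth.

width, height, fps, bits_per_pixel = 3840, 2160, 60, 24

raw_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"Raw 4K60 pixel data: {raw_gbps:.2f} Gbps")   # 11.94 Gbps

HDMI_1_4_GBPS = 8.16    # approx. effective video bandwidth, HDMI 1.4
HDMI_2_0_GBPS = 14.4    # approx. effective video bandwidth, HDMI 2.0
print("Fits HDMI 1.4:", raw_gbps <= HDMI_1_4_GBPS)   # False
print("Fits HDMI 2.0:", raw_gbps <= HDMI_2_0_GBPS)   # True
```

The same arithmetic explains the HEVC hardware decode: streaming 4K content at those raw rates is impossible, so an efficient codec plus a fixed-function decoder on the SoC is what actually makes 4K playback practical on a 15 W-class device.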
Speaking of the Tegra X1, the SHIELD will include the power of 256 Maxwell architecture CUDA cores and will easily provide the best Android gaming performance of any tablet or set-top box on the market. This means gaming, and lots of it, will be possible on SHIELD. Remember our many discussions about Tegra-specific gaming ports from the past? That trend will continue and more developers are realizing the power that NVIDIA is putting into this tiny chip.
In the box you'll get the SHIELD set-top unit and a SHIELD Controller, the same one released with the SHIELD Tablet last year. A smaller remote that looks similar to the one used with the Amazon Fire TV will cost a little extra, as will the stand that sets the SHIELD upright.
Pricing on the new SHIELD set-top will be $199, shipping in May.
Subject: Graphics Cards, Mobile | March 3, 2015 - 05:00 PM | Ryan Shrout
Tagged: Unity, lighting, global illumination, geomerics, GDC, arm
Back in 2013 ARM picked up a company called Geomerics, responsible for one of the industry’s most advanced dynamic lighting engines, used in games ranging from mobile to console to PC. Called Enlighten, it is the lighting engine in many major games across a variety of markets. Battlefield 3 uses it, Need for Speed: The Run does as well, and The Bureau: XCOM Declassified and Quantum Conundrum are another pair of major games that depend on Geomerics technology.
Great, but what does that have to do with ARM, and why would the company invest in software that works with such a wide array of markets, most of which are not dominated by ARM processors? There are two answers, the first of which is directional: ARM is using the minds and creative talent behind Geomerics to help point the Cortex and Mali teams in the correct direction for CPU and GPU architecture development. By designing hardware to better address the advanced software and lighting systems Geomerics builds, Cortex and Mali will have some semblance of an advantage in specific gaming titles as well as a potential “general purpose” advantage. NVIDIA employs hundreds of gaming and software developers for this exact reason: what better way to make sure you are always at the forefront of the gaming ecosystem than getting high-level gaming programmers to point you to that edge? Qualcomm also started employing game and engine developers in-house (back in 2012) with the same goals.
ARM also believes it will be beneficial to bring publishers, developers and middleware partners to the ARM ecosystem through deployment of the Enlighten engine. It would be feasible to think console vendors like Microsoft and Sony would be more willing to integrate ARM SoCs (rather than the x86 used in the PS4 and Xbox One) when shown the technical capabilities brought forward by technologies like Geomerics Enlighten.
It’s best to think of the Geomerics acquisition as a kind of insurance policy for ARM, making sure both its hardware and software roadmaps are in line with industry goals and directives.
At GDC 2015 Geomerics is announcing the release of the Enlighten 3 engine, a new version that brings cinematic-quality real-time global illumination to market. Some of the biggest new features include additional accuracy on indirect lighting, color separated directional output (enables individual RGB calculations), better light map baking for higher quality output, and richer material properties to support transparency and occlusion.
All of this technology will be showcased in a new Subway demo that includes real-time global illumination simulation, dynamic transparency and destructible environments.
Geomerics Enlighten 3 Subway Demo
Enlighten 3 will also ship with Forge, a new lighting editor and pipeline tool for content creators looking to streamline the building process. Forge will allow imports from Autodesk 3ds Max and Maya, making interoperability easier. Forge uses a technology called YEBIS 3 to show estimated final quality without the time-consuming final-build processing.
Finally, maybe the biggest news for ARM and Geomerics is that the Unity 5 game engine will be using Enlighten as its default lighting engine, giving ARM/Mali a potential advantage for gaming experiences in the near term. Of course Enlighten is available as an option for Unreal Engine 3 and 4 for developers using that engine in mobile, console and desktop projects as well as in an SDK form for custom integrations.
Subject: General Tech, Mobile, Shows and Expos | March 2, 2015 - 02:46 AM | Scott Michaud
Tagged: webOS, smartwatch, mwc 15, MWC, LG
A while ago, LG licensed WebOS from HP for use in their smart TVs and, as we found out during CES, smart watches.
The LG Urbane LTE is one such device, and we can finally see it in action. It is (literally) based around a circular P-OLED display (320 x 320, 1.3-inches, 245 ppi). Swirling your finger around the face scrolls through elements like a wheel, which should make searching through a large list of applications significantly more comfortable than a linear list -- a lot like an iPod (excluding the Touch and the Shuffle). That said, I have only seen other people use it.
The SoC is a Qualcomm Snapdragon 400, clocked at 1.2 GHz. It supports LTE, Wireless-N, Bluetooth 4.0LE, and NFC. It has 1 GB of RAM, which is quite a bit, and 4GB of permanent storage, which is not. It also has a bunch of sensors, from accelerometers and gyros to heart rate monitors and a barometer. It has a speaker and a microphone, but no camera. LG flaunts a 700 mAh battery, which they claim is “the category's largest”, but they do not link that to an actual amount of usage time (only that it “go[es] for days in standby mode”).
Video credit: The Verge
Pricing has not yet been announced, but it should hit the US and Europe before May arrives.
Subject: General Tech, Mobile, Shows and Expos | March 1, 2015 - 10:16 PM | Scott Michaud
Tagged: MWC, mwc 15, GDC, gdc 15, htc, valve, vive, vive vr, Oculus
Mobile World Congress (MWC) and Game Developers Conference (GDC) severely overlap this year, and not just in dates apparently. HTC just announced the Vive VR headset at MWC, which was developed alongside Valve. The developer edition will contain two 1200x1080 displays with a 90Hz refresh rate, and it will launch this spring. The consumer edition will launch this holiday. They made sure to underline 2015, so you know they're serious. Want more information? Well that will be for Valve to discuss at GDC.
The confusing part: why is this not partnered with Oculus? When Michael Abrash left Valve to go there, I assumed that it was Valve shedding its research to Facebook's subsidiary and letting them take the hit. Now, honestly, it seems like Facebook just poached Abrash, Valve said “oh well”, and the two companies kept to their respective research. Who knows? Maybe that is not the case. We might find out more at GDC, but you would expect that Oculus would be mentioned if they had any involvement at all.
Valve will host an event on the second official day of GDC, March 3rd at 3pm. In other words, Valve will make an announcement on 3/3 @ 3. Could it involve Left 4 Dead 3? Portal 3? Will they pull a Crytek and name their engine Source 3? Are they just trolling absolutely everyone? Will it have something to do with NVIDIA's March 3rd announcement? Do you honestly think I have any non-speculative information about this? No. No I don't. There, I answered one of those questions.
Subject: Mobile | March 1, 2015 - 07:01 PM | Sebastian Peak
Tagged: SoC, smartphones, Samsung, MWC 2015, MWC, Galaxy S6 Edge, galaxy s6, Exynos 7420, 14nm
Samsung has announced the new Galaxy S phones at MWC, and the new S6 and S6 Edge should be in line with what you were expecting if you’ve followed recent rumors.
The new Samsung Galaxy S6 and S6 Edge (Image credit: Android Central)
As expected, we no longer see a Qualcomm SoC powering the new phones; as the rumors had indicated, Samsung opted instead for their own Exynos 7 Octa mobile AP. Exynos SoCs have previously appeared in international versions of Samsung’s mobile devices, but the company has apparently ramped up production to meet the demands of the US market as well. There is an interesting twist here, however.
The Exynos 7420 powering both the Galaxy S6 and S6 Edge is an 8-core SoC with ARM’s big.LITTLE design, combining four ARM Cortex-A57 cores and four Cortex-A53 cores. Samsung announced 14nm FinFET mobile AP production earlier in February, raising the intriguing possibility of the S6 launching on the new process, as the current process tech for the Exynos 7 is 20nm HKMG. A switch so soon before the official announcement seemed unlikely, however, as large-scale 14nm FinFET production was only unveiled on February 16. Regardless, AnandTech is reporting that the new part will indeed be produced on the 14nm process, giving Samsung an industry first for a mobile SoC with the launch of the S6/S6 Edge.
GSM Arena has specs of the Galaxy S6 posted, and here’s a brief overview:
- Display: 5.1” Super AMOLED, QHD resolution (1440 x 2560, ~577 ppi), Gorilla Glass 4
- OS: Android OS, v5.0 (Lollipop) - TouchWiz UI
- Chipset: Exynos 7420
- CPU: Quad-core 1.5 GHz Cortex-A53 & Quad-core 2.1 GHz Cortex-A57
- GPU: Mali-T760
- Storage/RAM: 32/64/128 GB, 3 GB RAM
- Camera: (Primary) 16 MP, 3456 x 4608, optical image stabilization, autofocus, LED flash
- Battery: 2550 mAh (non-removable)
The new phones both feature attractive styling with metal and glass construction and Gorilla Glass 4 sandwiching the frame, giving each phone a glass back.
The back of the new Galaxy S6 (Image credit: Android Central)
The guys at Android Central (source) had some pre-release time with the phones and have a full preview and hands-on video up on their site. The new phones will be released worldwide on April 10, and no specifics on pricing have been announced.
Subject: Mobile | February 28, 2015 - 09:42 PM | Sebastian Peak
Tagged: smartphones, MWC 2015, MWC, Moto E, LG Magna, ios, Android 5.0
Last year my favorite smartphone became the 2014 version of the Moto G. This was (and still is) a $179 unlocked Android phone that shipped with 4.4.4 KitKat, but recently received an OTA update to 5.0 Lollipop (and subsequently 5.0.2 via a second OTA update). Motorola’s aggressive pricing made the phone compelling on paper, but using the device was even more impressive. It looked good, with a 5-inch 720p IPS display and the same design language as the Moto X and later Nexus 6, and ran a virtually untouched stock Android OS. It was never going to win any awards for raw speed, but the quad-core Snapdragon 400 SoC was plenty fast for daily use. The main drawback was a glaring one, however: the Moto G was not LTE capable. Enter the new Moto E.
Here are some quick specs from Motorola:
Moto E 2nd Edition (LTE capable)
- 4.5” 540x960 display
- Quad-core 1.2GHz Cortex-A53/Adreno 306
- 1GB RAM/8GB storage
- 2390 mAh battery
We are already off to a solid start in 2015 with a great option from Motorola in the new 2nd edition Moto E. This LTE capable smartphone might look a little chunky, but the specs make it more than just a compelling option at $149 (unlocked); it could have the disruptive impact on price that Microsoft just couldn’t achieve last year with their inexpensive Lumia phones. With 2015’s Mobile World Congress (MWC) fast approaching, the Moto E is already making noise in the affordable phone space that last year’s Moto G helped define, and this time the message is clear: in 2015 a smartphone needs to have LTE, regardless of price.
To be fair, Microsoft has already addressed the need for LTE with low-cost Windows Phone devices like the Lumia 635 (which is actually selling for just $49 on Amazon now), but the app ecosystem for the platform is just too restrictive to make it a viable alternative to Android and iOS. Honestly, I love the Windows Phone OS, but there are too many missing apps to make it a daily driver. So, since Windows clearly isn’t the answer and Apple won’t be selling a sub-$200 unlocked smartphone anytime soon (the cheapest unlocked iPhone is the 8GB 5c at $450), that leaves Android (of course).
Another possibility comes from LG, as ahead of MWC there was a press release from the company showcasing their new “mid-range” smartphone lineup for 2015. Among the models listed is another phone that matches the specs associated with a $200-ish unlocked phone, but pricing has not been announced yet.
LG Magna (LTE capable) - Unreleased
- 5.0” 720x1280 display
- 1GB RAM, 8GB storage
- 2540 mAh battery
We await the announcements from MWC and there are sure to be many other examples of low-cost LTE devices, but already it’s looking like it won’t take more than $200 and a SIM card to avoid the endless device upgrade cycle in 2015.
Subject: Graphics Cards, Mobile | February 26, 2015 - 07:15 PM | Ryan Shrout
Tagged: super-gpu, PowerVR, Imagination Technologies, gt7900
As a preview to announcements and releases being made at both Mobile World Congress (MWC) and the Game Developers Conference (GDC) next week, Imagination Technologies took the wraps off of a new graphics product it is calling a "super-GPU". The PowerVR GT7900 is the new flagship GPU in the Series7XT family, targeting a growing category called "affordable game consoles" - think Android-powered set-top devices like the Ouya or Amazon's Fire TV.
PowerVR breaks up its GPU designs into unified shading clusters (USCs), and the GT7900 has 16 of them for a total of 512 ALU cores. Imagination has previously posted a great overview of its USC architecture design and how its designs compare to other GPUs on the market. Imagination wants to claim that the GT7900 will offer "PC-class gaming experiences", though that is as ambiguous as the idea of the workload of a "console-level game." But with rated peak performance hitting over 800 GFLOPS in FP32 and 1.6 TFLOPS in FP16 (half-precision), this GPU does have significant theoretical capability.
|              | PowerVR GT7900 | Tegra X1   |
| GPU Clock    | 800 MHz        | 1000 MHz   |
| Process Tech | 16nm FinFET+   | 20nm TSMC  |
Imagination also believes that PowerVR offers a larger portion of its peak performance for a longer period of time than the competition thanks to the tile-based deferred rendering (TBDR) approach that has been "refined over the years to deliver unmatched efficiency."
The FP16 performance number listed above is useful as an extreme power savings option where the half-precision compute operates in a much more efficient manner. A fair concern is how many applications, GPGPU or gaming, actually utilize the FP16 data type but having support for it in the GT7900 allows developers to target it.
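Those peak figures fall straight out of the core count and clock. A quick sanity check (assuming one fused multiply-add per ALU per clock and double-rate FP16, which is typical for GPUs of this class but my assumption, not Imagination's stated math):

```python
# Back-of-envelope check of the PowerVR GT7900 peak throughput figures.

ALUS = 512            # 16 USCs x 32 ALU cores each
CLOCK_GHZ = 0.8       # 800 MHz, per the table above
FLOPS_PER_FMA = 2     # a fused multiply-add counts as two operations

fp32_gflops = ALUS * CLOCK_GHZ * FLOPS_PER_FMA   # 819.2
fp16_tflops = fp32_gflops * 2 / 1000             # 1.6384, double-rate FP16

print(f"FP32 peak: {fp32_gflops:.1f} GFLOPS")    # 819.2 GFLOPS ("over 800")
print(f"FP16 peak: {fp16_tflops:.2f} TFLOPS")    # 1.64 TFLOPS ("1.6 TFLOPS")
```

Both results line up with the quoted "over 800 GFLOPS" FP32 and "1.6 TFLOPS" FP16 numbers, which suggests the marketing figures are straightforward theoretical peaks rather than measured throughput.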
Other key features of the GT7900 include support for OpenGL ES 3.1 + AEP (Android Extension Pack), hardware tessellation and ASTC LDR and HDR texture compression standards. The GPU also can run in a multi-domain virtualization mode that would allow multiple operating systems to run in parallel on a single platform.
Imagination believes that this generation of PowerVR will "usher a new era of console-like gaming experiences" and will showcase a new demo at GDC called Dwarf Hall.
I'll be at GDC next week and have already set up a meeting with Imagination to talk about the GT7900, so I should have some hands-on experience to report back with soon. I remain curious about the demand for these types of high-end "mobile" GPUs given the limited audience the Android console space currently addresses. Imagination does claim that the GT7900 beats products with performance levels as high as the GeForce GT 730M discrete GPU - no small feat.
Subject: Mobile | February 25, 2015 - 09:46 PM | Jeremy Hellstrom
Tagged: z3580, venue 8 7000, venue, tablet, silvermont, moorefield, Intel, dell, atom z3580, Android
Dell's Venue 8 7000 tablet sports an 8.4" 2560x1600 OLED display and is powered by the Moorefield-based Atom Z3580 SoC with 2GB of LPDDR3-1600 and 16GB of internal storage, with support for up to a 512GB microSD card. Even more impressive, The Tech Report had no issues installing apps or moving files to the SD card with ES File Explorer, unlike many Android devices that need certain programs to reside on internal storage. Like Ryan, they had a lot of fun with the RealSense camera and are looking forward to the Lollipop update. Check out The Tech Report's opinion of this impressive Android tablet right here.
"Dell's Venue 8 7000 is the thinnest tablet around, and that's not even the most exciting thing about it. This premium Android slate packs a Moorefield-based Atom processor with quad x86 cores, a RealSense camera that embeds 3D depth data into still images, and a staggeringly beautiful OLED display that steals the show. Read on for our take on a truly compelling tablet."
Here are some more Mobile articles from around the web:
- Lenovo ThinkPad X1 Carbon Works Great As A Linux Ultrabook @ Phoronix
- Cooler Master NotePal ERGOSTAND III Review @ Techgage
- Portable Smartphone Battery Pack Roundup @ eTeknix
- Sandberg Outdoor Powerbank 10400 mAh Review @ NikKTech
- Xiaomi Mi4 64GB Smartphone Review @ Madshrimps
Subject: General Tech, Mobile | February 20, 2015 - 12:00 PM | Scott Michaud
Tagged: shieldtuesday, shield, Saints Row IV, nvidia, metro last light, gridtuesday, grid, alan wake
Once again, NVIDIA brings some really good games to its GRID service, which is currently free for all SHIELD owners. The concept is that NVIDIA computes the graphics at its server farms, accepts your input, and returns an audio/video stream of the result. This is a very convenient way to access content, but it cannot replace actual ownership for guaranteed access to specific art that you find intrinsically valuable. It can help you discover new content, though.
This week, Saints Row IV is available to be played on the GRID gaming service. Its predecessor, Saints Row: The Third, was published on GRID earlier this month. It would be good to play them in order, and they are both worth your time. I did find the campaign of Saints Row IV a bit less unique because the majority of its missions were a handful of side-missions strung together, while Saints Row: The Third had more scenario-based objectives, with the side-missions as an option to build up stats (or just for fun) between them. On the other hand, the movement mechanics in IV were genius. Play them both.
Looking ahead, next Tuesday will be Alan Wake. This is a survival-horror title from Remedy that makes you appreciate just how long your batteries last in real life. Basically, electricity is light and light is a vulnerability for the monsters that want to destroy you. The week after, the third of March, is Metro: Last Light Redux. This is one of the most visually demanding games available, and it is still used as a GPU benchmark at this site.
Saints Row IV went live last Tuesday, while Alan Wake arrives on the 24th and Metro: Last Light comes in last, on March 3rd.
Subject: Graphics Cards, Mobile | February 19, 2015 - 08:58 PM | Ryan Shrout
Tagged: nvidia, notebooks, mobile, gpu
After a week or so of debate circling NVIDIA's decision to disable overclocking on mobile GPUs, we have word that the company has reconsidered and will be re-enabling the feature in next month's driver release:
As you know, we are constantly tuning and optimizing the performance of your GeForce PC.
We obsess over every possible optimization so that you can enjoy a perfectly stable machine that balances game, thermal, power, and acoustic performance.
Still, many of you enjoy pushing the system even further with overclocking.
Our recent driver update disabled overclocking on some GTX notebooks. We heard from many of you that you would like this feature enabled again. So, we will again be enabling overclocking in our upcoming driver release next month for those affected notebooks.
If you are eager to regain this capability right away, you can also revert back to 344.75.
Now, I don't want to brag here, but we did just rail on NVIDIA for this decision on last night's podcast...and then this reversal was posted on NVIDIA's forums just four hours ago... I'm not saying, but I'm just saying!
All kidding aside, this is great news! And NVIDIA desperately needs to be paying attention to what consumers are asking for in order to make up for some poor decisions made in the last several months. Now (or at least soon), you will be able to return to your mobile GPU overclocking!
Subject: Mobile | February 16, 2015 - 08:54 AM | Sebastian Peak
Tagged: zenbook, UX305, ultraportable, ips display, core m, asus, 5Y10
ASUS has announced availability and pricing for the ZenBook UX305, and the specifications are quite exceptional for the price. Not content to compete on hardware specs alone, ASUS made the notebook a minuscule 0.48” thick, which the company says makes the UX305 the world’s thinnest ultraportable notebook.
As impressive as the slim profile of the aluminum design might be, it is more impressive to look over the main specifications of the $699 UX305:
- Intel Core M 5Y10 processor
- 8GB of LPDDR3 memory
- 256GB SSD
- 13.3-inch 1920x1080 IPS display (matte finish)
I'll let that sink in for a moment. Quite an impressive list given that the MSRP for these specifications is, again, only $699. At this price it's going to be very difficult to beat the UX305 considering what’s under the hood, as this configuration contains double the memory and storage of many ultraportables in this price class. And 1080p IPS on top of everything is just icing on the cake. Battery life should be very good considering the processor at the heart of this machine is Intel's newest low-power Broadwell-based Core M (the 5Y10), which features HD 5300 graphics and a TDP of just 4.5W. Moreover, the processor is passively cooled and the notebook features a completely fanless design for silent operation.
Since there are no fans to expel heat, ASUS has made it a point to promise that the palm rest will always stay cool thanks to their “IceCool technology” (whatever that is - but I really hope it’s an ice cube cooling system). The UX305 is powered by a 45Wh Lithium Polymer battery that has a claimed 10-hour battery life, and the notebook features 802.11ac wireless, three USB 3.0 ports, and includes a USB Ethernet adapter (a nice touch). ASUS is also touting a premium sound system with this notebook, employing a B&O ICEpower amplifier and enhanced with their proprietary “SonicMaster audio”. Rounding out the feature list is an SD card reader and 720p webcam.
The notebook weighs in at 2.6 lbs, and this configuration of the UX305 is available immediately (listed on their official store). With the surprisingly low MSRP it sounds like this ZenBook will be a solid choice for anyone looking for the latest notebook tech on a budget, and depending on performance and real-world battery life it could just be that mythical MacBook Air "killer" (if you're OK with Windows 8 over OS X, of course).
Subject: Graphics Cards, Mobile, Shows and Expos | February 11, 2015 - 08:25 PM | Scott Michaud
Tagged: Tegra X1, nvidia, mwc 15, MWC, gdc 2015, GDC, DirectX 12
On March 3rd, NVIDIA will host an event called “Made to Game”. Invitations have gone out to numerous outlets, including Android Police, who published a censored screenshot of it. This suggests that it will have something to do with the Tegra X1, especially since the date is the day after Mobile World Congress starts. Despite all of this, I think it is for something else entirely.
Image Credit: Android Police
Allow me to highlight two points. First, Tech Report claims that the event is taking place in San Francisco, which is about as far from Barcelona, Spain as you can get. It is close to GDC, however, which also starts on March 2nd. If this were meant to align with Mobile World Congress, you would not want attendees taking a 14-hour flight for a day trip.
Second, the invitation specifically says: “More than 5 years in the making, what I want to share with you will redefine the future of gaming.” Compare that to the DirectX 12 announcement blog post from March 20th of last year (2014): “Our work with Microsoft on DirectX 12 began more than four years ago with discussions about reducing resource overhead. For the past year, NVIDIA has been working closely with the DirectX team to deliver a working design and implementation of DX12 at GDC [2014].”
So yeah, while it might involve the Tegra X1 processor for Windows 10 on mobile devices (the only reason I can think of for inviting Android Police, apart from "We're inviting everyone everywhere"), I expect that this event is for DirectX 12. I would assume Microsoft would host its own event involving many partners, but I could see NVIDIA wanting to save a bit of the announcement for something of their own. What would that be? No idea.
Subject: Mobile | February 11, 2015 - 07:36 PM | Jeremy Hellstrom
Tagged: Samsung, Note 4, Exynos 5433, snapdragon 805, phablet
At 5.7" and 176g, the Samsung Note 4 is a large device, and it has a resolution to match at 2560x1440. That resolution does slow it down somewhat: in graphics tests it falls behind the iPhone 6 Plus except in Basemark X and 3DMark's Ice Storm, but it shows up the competition when it comes to graphical quality, with only NVIDIA's SHIELD beating it in the GFXBench quality tests. In the CPU tests it scored moderately well in single-threaded workloads but wipes the floor with the competition in multi-threaded performance, which you should keep in mind when weighing a purchase. For more benchmarks and details, The Tech Report's full review can be found right here.
"Most of the world gets a variant of Samsung's Galaxy Note 4 based on Qualcomm's familiar Snapdragon 805 system-on-a-chip (SoC). In Samsung's home country of Korea, though, the firm ships a different variant of the Note 4 based on Samsung LSI's Exynos 5433 SoC. With eight 64-bit CPU cores and a 64-bit Mali-T760 GPU, the Exynos 5433 could make this version the fastest and most capable Note 4--and it gives us some quality time with the Cortex-A53 and A57 CPU cores that will likely dominate the Android market in 2015."
Here are some more Mobile articles from around the web:
- 9-Way Linux Laptop Performance Comparison From Intel Nehalem To Broadwell @ Phoronix
- The 2015 Alienware 15 & Alienware 17 Launch Event @ Tech ARP
- LUXA2 P1-PRO 7000mAh Outdoor Power Bank Review @ NikKTech
- Luxa2 PL3 10,400 mAh Leather Power Bank Review @ HiTech Legion
- Luxa2 EnerG Slim 10,000mAh Power Bank Review @ OCC
- Noreve iPad Air 2 Protective Cases Review @ Madshrimps
- Sony Smartwatch 3 @ The Inquirer
Subject: General Tech, Graphics Cards, Mobile | February 11, 2015 - 01:00 PM | Scott Michaud
Tagged: shieldtuesday, shield, nvidia, gridtuesday, grid, graphics drivers, geforce, drivers
Update: Whoops! The title originally said "374.52", when it should be "347.52". My mistake. Thanks "Suddenly" in the comments!
Two things from NVIDIA this week: a new driver and a new game for NVIDIA GRID. The new driver aligns with the release of Evolve, which came out on Tuesday from the original creators of Left 4 Dead. The graphics vendor also claims that it will help Assassin's Creed: Unity, Battlefield 4, Dragon Age: Inquisition, The Crew, and War Thunder. Several SLI profiles were also added, for Alone in the Dark: Illumination, Black Desert, Dying Light, H1Z1, Heroes of the Storm, Saints Row: Gat out of Hell, Total War: Attila, and Triad Wars.
On the same day, NVIDIA released Brothers: A Tale of Two Sons on GRID, bringing the number of available games up to 37. This game came out in August 2013 and received a lot of critical praise. Its control style is unique, using a dual-thumbstick gamepad to simultaneously control both characters. More importantly, despite being short, the game is said to have an excellent story, even earning TotalBiscuit's Game of the Year (2013) on the strength of its narrative, which is not something he praises often.
I'd comment on the game, but I've yet to get the time to play it. Apparently it is only a couple hours long, so maybe I can fit it in somewhere.
Also, they are apparently now calling this “#SHIELDTuesday” rather than “#GRIDTuesday”. I assume the rebranding is because people may not know that GRID exists, but they would certainly know if they purchased an Android-based gaming device for a couple hundred dollars or more. We could read into this and make assumptions about GRID adoption rates versus SHIELD purchases, or even purchases of the hardware itself versus their projections, but it would be pure speculation.
Both announcements were made available on Tuesday, for their respective products.
Subject: General Tech, Systems, Mobile | February 3, 2015 - 10:35 PM | Scott Michaud
Tagged: razer blade, razer, nvidia, Intel, GTX 970M
When the Razer Blade launched, it took a classy design and filled it with high-end gaming components. Its competitors in the gaming space were often desktop replacements, which were powerful but not comfortable, every-day laptops. The Blade also came with a $2800 (at the time) price-tag, and that stunted a lot of reviews. It has been refreshed a few times since then, including today.
The New Razer Blade QHD+ has a 14-inch 3200x1800 display with multi-touch and an LED backlight. The panel is IGZO, a competitor to IPS for screens with a high pixel density (such as the 4K PQ321Q from ASUS). This is housed in a milled aluminum chassis that is about 0.7 inches thick.
Its power brick is rated at 150W, which is surprisingly high for a laptop. I am wondering how much of that electricity is headroom for fast-charging (versus higher performance when not on battery). Most power adapters for common laptops that I've seen are between 60W and 95W. In a small, yet meticulously designed chassis, I would have to assume that thermal headroom of either the heatsinks or the components themselves would be the limiting factor.
On the topic of specifications, they are expectedly high-end.
The GPU was upgraded to the GeForce GTX 970M with 3GB of VRAM (up from a 3GB 870M), and the CPU is now a Core i7-4720HQ (up from a Core i7-4702HQ). The system memory was also doubled, to 16GB (up from 8GB). It also has three USB 3.0 ports, HDMI 1.4a out, 802.11a/b/g/n/ac, Bluetooth 4.0, and (of course) a high-end, backlit keyboard. Razer offers a choice in M.2 SSD capacity: 128GB for $2199.99, 256GB for $2399.99, or 512GB for $2699.99. That is kind-of expensive for solid state memory: $1.56/GB for the jump to 256GB, and $1.17/GB to go from there to 512GB.
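For readers who want to double-check the per-gigabyte figures above, here's a quick sketch (the prices are the US MSRPs Razer lists; the incremental per-GB math is just arithmetic over the tier differences):

```python
# Razer's listed US MSRPs: SSD capacity (GB) -> laptop price (USD)
prices_usd = {128: 2199.99, 256: 2399.99, 512: 2699.99}

tiers = sorted(prices_usd)
for small, large in zip(tiers, tiers[1:]):
    extra_gb = large - small                          # added capacity
    extra_usd = prices_usd[large] - prices_usd[small] # added cost
    print(f"{small}GB -> {large}GB: +${extra_usd:.2f} "
          f"for {extra_gb}GB (${extra_usd / extra_gb:.2f}/GB)")
```

Running it reproduces the numbers quoted above: $200 buys the extra 128GB ($1.56/GB), while $300 buys the extra 256GB ($1.17/GB).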
The New Razer Blade Gaming Laptop is available now at Razerzone.com in the US, Canada, Singapore, and Hong Kong. It will arrive at Microsoft Stores in the USA on February 16th. China, Australia, New Zealand, Malaysia, UAE, Japan, Korea, Taiwan, and Russia can purchase it on Razerzone.com in March. Prices start (as stated above) at $2199.99.
Subject: General Tech, Processors, Mobile | February 1, 2015 - 08:17 PM | Scott Michaud
Tagged: mt6753, mediatek
We do not talk about MediaTek's higher-end products too often. Part of that is because they use stock architectures (ARM's Cortex CPUs paired with ARM's Mali or Imagination Technologies' PowerVR GPUs) rather than designing their own CPU and/or GPU. Likewise, their design wins, such as the new Amazon Fire HD tablets, are also not covered much on this site, for their own reasons. They still make some interesting chips, though.
Image Credit: A Weibo user via GSM-Arena
The MediaTek MT6753 is a true eight-core, 64-bit ARM SoC. Its press release makes the rest of its details... confusing. The release claims that it is clocked at 1.5 GHz and contains an ARM Mali-T720 GPU that is capable of OpenGL ES 3.0 and OpenCL 1.2. The ARM Mali-T720 is actually capable of OpenGL ES 3.1 and OpenCL 1.1. This leads some sites to report that the MT6753 actually contains a Mali-T760, which is newer and can utilize OpenGL ES 3.1 and OpenCL 1.2 (it is also used in the MT6752 that was released several months ago). Other sites report what MediaTek claims.
GSM-Arena, one site that claims the (more-sensible) Mali-T760, also claims that the Cortex CPU cores can be clocked up to 1.7 GHz. This might not be inaccurate either, because the chip could be intended to run at ~1.3 to 1.5 GHz with a 1.7 GHz peak for vendors that want to turn it up to eleven. Alternatively, GSM-Arena could be wrong and it could peak at 1.5 GHz. We don't know, and MediaTek should be clearer about these important details.
Everyone seems to agree on the chip's networking capability, though. It will directly support LTE protocols for both China and western markets. This is expected to make them more competitive against Qualcomm, which might lead to more interesting designs.
Devices containing the MT6753 are expected to ship next quarter.