Subject: Storage | March 3, 2015 - 06:16 PM | Jeremy Hellstrom
Tagged: tlc, ssd, SM2256, slc, silicon motion
You may remember the Silicon Motion SM2256 SSD controller that Al reported on during CES this year; even if you don't, you should be interested in a controller that can work with 1x/1y/1z nm TLC NAND from any manufacturer on the market. The SSD Review managed to get a prototype that pairs the new SM2256 controller with Samsung's 19nm planar TLC NAND flash and a Hynix 440MHz 256MB DDR3 DRAM chip. In benchmarking they saw 548MB/s sequential reads and 484MB/s sequential writes, with 4K performance slowing to 38MB/s for reads and 110MB/s for writes. Check out the rest of the review here, and keep your eyes peeled for our first review of the new controller.
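For context on those 4K numbers, throughput converts to IOPS with simple division. A back-of-the-envelope sketch (the 4,096-byte transfer size and decimal megabytes are my assumptions, not stated in the review):

```python
# Back-of-the-envelope: convert 4K throughput (MB/s) to IOPS.
# Assumes 4,096-byte transfers and decimal megabytes (1 MB = 1,000,000 bytes).

def mbps_to_4k_iops(mb_per_s: float) -> int:
    """Approximate IOPS for a given 4K throughput in MB/s."""
    return round(mb_per_s * 1_000_000 / 4096)

print(mbps_to_4k_iops(38))   # 4K read:  9277 IOPS
print(mbps_to_4k_iops(110))  # 4K write: 26855 IOPS
```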
"Controllers are the heart and soul of every SSD. Without one, an SSD would be a useless PCB with some components slapped on it. It is responsible for everything from garbage collection and wear leveling to error correction and hardware encryption. In simple terms, all these operations can be quite complicated to implement as well as expensive to develop."
Here are some more Storage reviews from around the web:
- Crucial's BX100 and MX200 @ The Tech Report
- Crucial MX200 250 GB @ techPowerUp
- Crucial BX100 SSD @ HardwareHeaven
- OCZ ARC 100 480GB SSD Review @ NikKTech
- Thecus W4000 WSS NAS @ Kitguru
- WD My Cloud DL4100 Business NAS Review @ Techgage
- ASUSTOR AS7010T NAS Server Review @ NikKTech
- SilverStone TS431S 4-Bay miniSAS DAS Storage Tower @ eTeknix
Subject: General Tech, Graphics Cards, Shows and Expos | March 3, 2015 - 03:37 PM | Scott Michaud
Tagged: vulkan, Mantle, Khronos, glnext, gdc 15, GDC, amd
Neil Trevett, the current president of Khronos Group and a vice president at NVIDIA, made an on-the-record statement to acknowledge the start of the Vulkan API. The quote came to me via Ryan, but I think it is a copy-paste of an email, so it should be verbatim.
Many companies have made great contributions to Vulkan, including AMD who contributed Mantle. Being able to start with the Mantle design definitely helped us get rolling quickly – but there has been a lot of design iteration, not the least making sure that Vulkan can run across many different GPU architectures. Vulkan is definitely a working group design now.
So, in short, the Vulkan API definitely started with Mantle and grew from there as more stakeholders added their input. Vulkan is now obviously different from Mantle in significant ways, such as its use of SPIR-V as its shading language (rather than HLSL). For a bit more information, check out our article on the announcement.
Update: AMD has released a statement independently, but related to Mantle's role in Vulkan
Subject: General Tech | March 3, 2015 - 03:01 PM | Jeremy Hellstrom
Tagged: Raspberry Pi 2 Model B
Linux.com has just released benchmarks of the new Raspberry Pi 2 Model B with its improved processor and RAM. Benchmarking a Pi is always interesting, as you must find applications that are reasonable for the device to run, with webserver software being a decent choice for comparisons against the ODroid-U2, Radxa and the BeagleBone Black. OpenSSL 1.0.1e DES and AES CBC-mode ciphering and Blowfish were all tested, with the Pi performing slowly but improving on the previous generation, and certainly decent for a $35 piece of hardware. In addition, both a full KDE desktop and KDE/Openbox were successfully installed, with Openbox the recommended choice. Get all the results right here.
"Released in February, the Raspberry Pi Model 2 B is an update to the original board that brings quad cores for six times the performance of the original, 1 gigabyte of RAM for twice the memory, and still maintains backwards compatibility. The many CPU cores are brought about by moving from the BCM2835 SoC to the BCM2836 SoC in the Raspberry Pi 2."
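Those cipher numbers come from OpenSSL's built-in `speed` benchmark, which hammers a fixed-size buffer for a set time and reports throughput. For readers curious about the general idea, here is a minimal Python sketch of the same style of test (it substitutes stdlib SHA-256 for the DES/AES/Blowfish workloads, since Python's standard library ships no block cipher):

```python
# A minimal throughput benchmark in the style of `openssl speed`:
# repeatedly process a fixed-size buffer for a set time, report MB/s.
# SHA-256 stands in for the cipher workloads here, as Python's
# standard library does not include DES/AES/Blowfish.
import hashlib
import time

def throughput_mb_s(block_size: int = 8192, seconds: float = 1.0) -> float:
    """Hash `block_size`-byte buffers for `seconds`, return MB/s processed."""
    buf = b"\x00" * block_size
    processed = 0
    deadline = time.perf_counter() + seconds
    while time.perf_counter() < deadline:
        hashlib.sha256(buf).digest()
        processed += block_size
    return processed / seconds / 1_000_000

print(f"sha256 (8192-byte blocks): {throughput_mb_s():.1f} MB/s")
```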
Here is some more Tech News from around the web:
- Google Backs Off Default Encryption on New Android Lollipop Devices @ Slashdot
- Ericsson, Telstra and Qualcomm up the ante with 600Mbps demo @ The Register
- 50 shades of grey can turn Adobe Reader into a hot mess @ The Register
- New Seagate Shingled Hard Drive Teardown @ Slashdot
- Microsoft spills some beans on the Windows 10 Universal Apps platform @ The Inquirer
- NVIDIA Fixes Old Compiz Bug @ Slashdot
- IBM and Apple cosy up further with more joint cloud apps @ The Inquirer
- MWC: Jolla pitches Sailfish Secure OS as Europe's only safe mobile option @ The Inquirer
Subject: Graphics Cards | March 3, 2015 - 02:44 PM | Sebastian Peak
Tagged: video cards, nvidia, gtx 960, geforce, 4GB
They said it couldn't be done, but where there are higher density chips there's always a way. Today EVGA and Inno3D have both announced new versions of GTX 960 graphics cards with 4GB of GDDR5 memory, placing the cards in a more favorable mid-range position depending on the launch pricing.
EVGA's new 4GB NVIDIA GTX 960 SuperSC
Along with the expanded memory capacity, EVGA's card features their ACX 2.0+ cooler, which promises low noise and better cooling. The SuperSC is joined by a standard ACX and the higher-clocked FTW variant, which pushes Base/Boost clocks to 1304/1367MHz out of the box.
Inno3D's press release provides fewer details, and the company appears to be launching a single new model featuring 4GB of memory which looks like a variant of their existing GTX 960 OC card.
The existing Inno3D GTX 960 OC card
The current 2GB version of the GTX 960 can be found starting at $199, so expect these expanded versions to include a price bump. The GTX 960, with only 1024 CUDA cores (half the count of a GTX 980) and a 128-bit memory interface, has been a very good performer nonetheless with much better numbers than last year's GTX 760, and is very competitive with AMD's R9 280/285. (It's a great overclocker, too.) The AMD/NVIDIA debate rages on, and NVIDIA's partners adding another 4GB offering to the mix will certainly add to the conversation, particularly as an upcoming 4GB version of the GTX 960 was originally said to be unlikely.
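To put that 128-bit interface in perspective, peak GDDR5 bandwidth is just bus width times effective data rate. The 7.0 Gbps rate below is the GTX 960's reference memory speed, assumed here rather than quoted from EVGA or Inno3D:

```python
# Peak memory bandwidth for a GDDR5 card:
# bus width (bits) / 8 bits-per-byte * effective data rate (Gbps) = GB/s.
# 7.0 Gbps is the assumed reference GDDR5 data rate for both cards.

def gddr5_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(gddr5_bandwidth_gb_s(128, 7.0))  # GTX 960 (128-bit): 112.0 GB/s
print(gddr5_bandwidth_gb_s(256, 7.0))  # GTX 980 (256-bit): 224.0 GB/s
```

The extra 2GB of capacity does nothing to widen that pipe, which is worth remembering when weighing a 4GB GTX 960 against cards with wider buses.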
Subject: Graphics Cards, Mobile | March 3, 2015 - 12:00 PM | Ryan Shrout
Tagged: Unity, lighting, global illumination, geomerics, GDC, arm
Back in 2013 ARM picked up a company called Geomerics, responsible for one of the industry’s most advanced dynamic lighting engines, used in games ranging from mobile to console to PC. That engine, Enlighten, powers many major games across a variety of markets: Battlefield 3 uses it, Need for Speed: The Run does as well, and The Bureau: XCOM Declassified and Quantum Conundrum are another pair of major titles that depend on Geomerics technology.
Great, but what does that have to do with ARM, and why would the company be interested in investing in software that works with such a wide array of markets, most of which are not dominated by ARM processors? There are two answers, the first of which is directional: ARM is using the minds and creative talent behind Geomerics to help point the Cortex and Mali teams in the correct direction for CPU and GPU architecture development. By designing hardware to better address the advanced software and lighting systems Geomerics builds, Cortex and Mali will have some semblance of an advantage in specific gaming titles as well as a potential “general purpose” advantage. NVIDIA employs hundreds of gaming and software developers for this exact reason: what better way to make sure you are always at the forefront of the gaming ecosystem than getting high-level gaming programmers to point you to that edge? Qualcomm also started employing game and engine developers in-house back in 2012 with the same goals.
ARM also believes it will be beneficial to bring publishers, developers and middleware partners to the ARM ecosystem through deployment of the Enlighten engine. It would be feasible to think console vendors like Microsoft and Sony would be more willing to integrate ARM SoCs (rather than the x86 used in the PS4 and Xbox One) when shown the technical capabilities brought forward by technologies like Geomerics Enlighten.
It’s best to think of the Geomerics acquisition as a kind of insurance policy for ARM, making sure both its hardware and software roadmaps are in line with industry goals and directives.
At GDC 2015 Geomerics is announcing the release of the Enlighten 3 engine, a new version that brings cinematic-quality real-time global illumination to market. Some of the biggest new features include additional accuracy on indirect lighting, color separated directional output (enables individual RGB calculations), better light map baking for higher quality output, and richer material properties to support transparency and occlusion.
All of this technology will be showcased in a new Subway demo that includes real-time global illumination simulation, dynamic transparency and destructible environments.
Geomerics Enlighten 3 Subway Demo
Enlighten 3 will also ship with Forge, a new lighting editor and pipeline tool for content creators looking to streamline the building process. Forge will allow imports from Autodesk 3ds Max and Maya, making interoperability easier. Forge uses a technology called YEBIS 3 to show estimated final quality without the time-consuming final-build processing.
Finally, maybe the biggest news for ARM and Geomerics is that the Unity 5 game engine will be using Enlighten as its default lighting engine, giving ARM/Mali a potential advantage for gaming experiences in the near term. Of course Enlighten is available as an option for Unreal Engine 3 and 4 for developers using that engine in mobile, console and desktop projects as well as in an SDK form for custom integrations.
Subject: General Tech | March 2, 2015 - 03:56 PM | Jeremy Hellstrom
Tagged: gigabyte, audio, Force H3X, gaming headset, analog
Gigabyte's Force H3X gaming headset sports the 50mm neodymium drivers we have become used to, with a decent frequency response range of 20Hz to 20KHz. The microphone is a bit different, using two 2mm pickup drivers on each side for a total of four, but in the testing Modders Inc performed this did not help the quality of recorded audio. That matters less on a gaming headset, though it makes this perhaps not the best choice for a budding YouTube star. For audio in gaming, Modders Inc does give the headset good marks, and they also found it to be very comfortable over long periods of time; it is definitely worth checking out if you are in the market for a new headset to game with.
"Don't you hate that when you are camping with a sniper rifle and all of the sudden some one sneaks up behind you and puts a knife through your head? Of course! We have all been there. Don't you wish you heard that guy who was sneaking up on you? Maybe then you could have switched to a Desert Eagle …"
Here is some more Tech News from around the web:
- Turtle Beach Earforce Z60 DTS Headphone X @ eTeknix
- Turtle Beach Elite 800 PlayStation & Mobile Wireless Headset @ eTeknix
- Kingston HyperX Cloud II Gaming Headset Review @ Neoseeker
- Audio-Technica Sonic Sport ATH-Sport1 @ Kitguru
- Turtle Beach Recon 320 PC & Mobile Gaming Headset @ eTeknix
- Kingston HyperX Cloud II @ Kitguru
Subject: Graphics Cards | March 2, 2015 - 02:31 PM | Ryan Shrout
Tagged: sdk, Mantle, dx12, API, amd
The Game Developers Conference in San Francisco starts today and you can expect to see more information about DirectX 12 than you could ever possibly want, so be prepared. But what about the original low-level API, AMD Mantle? Announced alongside the release of the Radeon R9 290X/290, and utilized in Battlefield 4, Thief, and (as announced last year) the Crytek engine, Mantle was truly the instigator that pushed Microsoft into moving DX12's development along at a faster pace.
Since DX12's announcement, AMD has claimed that Mantle would live on, bringing performance advantages to AMD GPUs and would act as the sounding board for new API features for AMD and game development partners. And, as was always trumpeted since the very beginning of Mantle, it would become an open API, available for all once it outgrew the beta phase that it (still) resides in.
Something might have changed there.
A post over on the AMD Gaming blog from Robert Hallock has some news about Mantle to share as GDC begins. First, the good news:
AMD is a company that fundamentally believes in technologies unfettered by restrictive contracts, licensing fees, vendor lock-ins or other arbitrary hurdles to solving the big challenges in graphics and computing. Mantle was destined to follow suit, and it does so today as we proudly announce that the 450-page programming guide and API reference for Mantle will be available this month (March, 2015) at www.amd.com/mantle.
This documentation will provide developers with a detailed look at the capabilities we’ve implemented and the design decisions we made, and we hope it will stimulate more discussion that leads to even better graphics API standards in the months and years ahead.
That's great! We will finally be able to read about the API and how it functions, getting access to the detailed information we have wanted from the beginning. But then there is this portion:
AMD’s game development partners have similarly started to shift their focus, so it follows that 2015 will be a transitional year for Mantle. Our loyal customers are naturally curious what this transition might entail, and we wanted to share some thoughts with you on where we will be taking Mantle next:
- AMD will continue to support our trusted partners that have committed to Mantle in future projects, like Battlefield™ Hardline, with all the resources at our disposal.
- Mantle’s definition of “open” must widen. It already has, in fact. This vital effort has replaced our intention to release a public Mantle SDK, and you will learn the facts on Thursday, March 5 at GDC 2015.
- Mantle must take on new capabilities and evolve beyond mastery of the draw call. It will continue to serve AMD as a graphics innovation platform available to select partners with custom needs.
- The Mantle SDK also remains available to partners who register in this co-development and evaluation program. However, if you are a developer interested in Mantle "1.0" functionality, we suggest that you focus your attention on DirectX® 12 or GLnext.
Essentially, AMD's Mantle API in its "1.0" form is at the end of its life: it will only be supported for current partners, and the public SDK will never be posted. Honestly, at this point, this isn't so much a letdown as it is a necessity. DX12 and GLnext have already superseded Mantle in terms of market share and mind share with developers, and any more work AMD put into getting devs on board with Mantle would be wasted effort.
Battlefield 4 is likely to be the only major title to use AMD Mantle
AMD claims to have future plans for Mantle, though it will continue to be available only to select partners with "custom needs." I would imagine this could expand outside the world of games, but it could also mean game consoles are the target, where developers are only concerned with AMD GPU hardware.
So, from our perspective, Mantle as we know it is pretty much gone. It served its purpose, making NVIDIA and Microsoft pay attention to the CPU bottlenecks in DX11, but it appears the dream was a bit bigger than the product could become. AMD shouldn't be chastised for this shift, nor for lofty goals that we kind-of-always knew were too steep a hill to climb. Just revel in the news that pours from GDC this week about DX12.
Subject: General Tech | March 2, 2015 - 01:49 PM | Jeremy Hellstrom
Tagged: SoFIA, silvermont, modem, LTE, Intel, Cherry Trail, atom x7, atom x5, atom x3, 7260
With MWC in full swing Intel showed off their mobile silicon to Ryan and to The Tech Report, who compiled complete specifications of the Cherry Trail-based Atom x5-Z8300 and x5-Z8500 as well as the x7-Z8700. All three of these chips will have an Intel-designed XMM 7260 LTE modem as well as WiFi and NFC connectivity, with the x7 also featuring Intel WiGig. You can also expect RealSense, True Key facial recognition, and Pro Wireless Display to send secure wireless video to compatible displays for meetings. Check out the full list of specs here.
"Intel says the dual-core Atom x3-C3130 is shipping now, while the quad-core Atom x3-C3230RK is coming later in the first half of the year. The LTE-infused Atom x3-C3440 will follow in the second half. In all, the chipmaker names 19 partners on board with the Atom x3 rollout, including Asus, Compal, Foxconn, Pegatron, Weibu, and Wistron."
Here is some more Tech News from around the web:
- News: The TR Podcast 171: Nvidia takes heat, Carrizo runs cool, and Fractal stays quiet
- Seagate NAS owners: hide it behind a firewall. Fast. @ The Register
- The Samsung Galaxy S6 & Galaxy S6 Edge Unpacked Event @ Tech ARP
- MWC: Galaxy S6 and Galaxy S6 Edge arrive with metal redesign and QHD screens @ The Inquirer
- Acer enters Windows Phone fray with cheap Liquid M220 mobe @ The Register
- Microsoft Swarms all over Docker Machines @ The Register
Subject: General Tech, Systems, Shows and Expos | March 1, 2015 - 11:07 PM | Scott Michaud
Tagged: spectre x360, spectre, mwc 15, MWC, hp, Broadwell
HP announced their updated Spectre x360 at Mobile World Congress. Like the Lenovo Yoga, it has a hinge that flips the entire way around, allowing the laptop to function as a 13.3-inch tablet with a 1080p, IPS display. There are two stages between “tablet” and “laptop”, which are “stand” and “tent”. They are basically ways to prop up the touch screen while hiding the keyboard behind (or under) the unit. The stand mode is better for hands-free operation because it has a flat contact surface to rest upon, while the tent mode is probably more sturdy for touch (albeit resting on two rims). The chassis is entirely milled aluminum, except the screen and things like that of course.
The real story is the introduction of Core i-level Broadwell. The 12.5-hour battery listing in a relatively thin form-factor can be attributed to the low power requirements of the CPU and GPU, as well as its SSD (128GB, 256GB, or 512GB). RAM comes in two sizes, 4GB or 8GB, which will depend slightly on the chosen processor SKU.
Prices start at $899 and most variants are available now at HP's website.
Subject: General Tech, Mobile, Shows and Expos | March 1, 2015 - 09:46 PM | Scott Michaud
Tagged: webOS, smartwatch, mwc 15, MWC, LG
A while ago, LG licensed WebOS from HP for use in their smart TVs and, as we found out during CES, smart watches.
The LG Urbane LTE is one such device, and we can finally see it in action. It is based around (literally) a circular P-OLED display (320 x 320, 1.3-inches, 245 ppi). Swirling your finger around the face scrolls through the elements like a wheel, which should make searching through a large list of applications significantly more comfortable than a linear list of elements -- a lot like an iPod (excluding the Touch and the Shuffle). That said, I have only seen other people use it.
The SoC is a Qualcomm Snapdragon 400, clocked at 1.2 GHz. It supports LTE, Wireless-N, Bluetooth 4.0 LE, and NFC. It has 1 GB of RAM, which is quite a bit, and 4GB of permanent storage, which is not. It also has a bunch of sensors, from accelerometers and gyros to heart rate monitors and a barometer. It has a speaker and a microphone, but no camera. LG flaunts a 700 mAh battery, which they claim is “the category's largest”, but they do not link that to an actual amount of usage time (only that it “go[es] for days in standby mode”).
Video credit: The Verge
Pricing has not yet been announced, but it should hit the US and Europe before May arrives.
Subject: General Tech | March 1, 2015 - 09:11 PM | Scott Michaud
Tagged: quixel, GDC, gdc 15, ue4, unreal engine 4, gdc 2015
You know that a week will be busy when companies start announcing a day or two early to beat the flood. While Game Developers Conference starts tomorrow, Quixel published their Jungle demo to YouTube today in promotion of their MEGASCANS material library. The video was rendered in Unreal Engine 4.
Their other material samples look quite convincing. The vines on a wall (a column in this case) are particularly interesting because they even look like two distinct layers, despite being a single mesh with displacement as far as I can tell. I don't know, maybe it is two or three layers. It would certainly make sense if it was, but the top and bottom suggest that it is a single mesh, and that is impressive. It even looks self-occluding.
Pricing and availability for the library is not yet disclosed, but it sounds like it will be a subscription service. The software ranges from $25 to $500, depending on what you get and what sort of license you need (Academic vs Commercial and so forth).
Subject: General Tech, Mobile, Shows and Expos | March 1, 2015 - 05:16 PM | Scott Michaud
Tagged: MWC, mwc 15, GDC, gdc 15, htc, valve, vive, vive vr, Oculus
Mobile World Congress (MWC) and Game Developers Conference (GDC) severely overlap this year, and not just in dates apparently. HTC just announced the Vive VR headset at MWC, which was developed alongside Valve. The developer edition will contain two 1200x1080 displays with a 90Hz refresh rate, and it will launch this spring. The consumer edition will launch this holiday. They made sure to underline 2015, so you know they're serious. Want more information? Well that will be for Valve to discuss at GDC.
The confusing part: why is this not partnered with Oculus? When Michael Abrash left Valve to go there, I assumed that it was Valve shedding its research to Facebook's subsidiary and letting them take the hit. Now, honestly, it seems like Facebook just poached Abrash, Valve said “oh well”, and the two companies kept to their respective research. Who knows? Maybe that is not the case. We might find out more at GDC, but you would expect that Oculus would be mentioned if they had any involvement at all.
Valve will host an event on the second official day of GDC, March 3rd at 3pm. In other words, Valve will make an announcement on 3/3 @ 3. Could it involve Left 4 Dead 3? Portal 3? Will they pull a Crytek and name their engine Source 3? Are they just trolling absolutely everyone? Will it have something to do with NVIDIA's March 3rd announcement? Do you honestly think I have any non-speculative information about this? No. No I don't. There, I answered one of those questions.
Subject: Mobile | March 1, 2015 - 02:01 PM | Sebastian Peak
Tagged: SoC, smartphones, Samsung, MWC 2015, MWC, Galaxy S6 Edge, galaxy s6, Exynos 7420, 14nm
Samsung has announced the new Galaxy S phones at MWC, and the new S6 and S6 Edge should be in line with what you were expecting if you’ve followed recent rumors.
The new Samsung Galaxy S6 and S6 Edge (Image credit: Android Central)
As expected, we no longer see a Qualcomm SoC powering the new phones; as the rumors had indicated, Samsung opted instead for their own Exynos 7 Octa mobile AP. Exynos SoCs have previously appeared in international versions of Samsung’s mobile devices, but the company has apparently ramped up production to meet the demands of the US market as well. There is an interesting twist here, however.
The Exynos 7420 powering both the Galaxy S6 and S6 Edge is an 8-core SoC with ARM’s big.LITTLE design, combining four ARM Cortex-A57 cores and four Cortex-A53 cores. With Samsung having announced 14nm FinFET mobile AP production earlier in February, the possibility of the S6 launching with a part on the new process was interesting, as the current process tech for the Exynos 7 is 20nm HKMG. However, a switch to the new process so soon before the official announcement seemed unlikely, as large-scale 14nm FinFET production was only unveiled on February 16. Regardless, AnandTech is reporting that the new part will indeed be produced using the 14nm process technology, giving Samsung an industry first for a mobile SoC with the launch of the S6/S6 Edge.
GSM Arena has specs of the Galaxy S6 posted, and here’s a brief overview:
- Display: 5.1” Super AMOLED, QHD resolution (1440 x 2560, ~577 ppi), Gorilla Glass 4
- OS: Android OS, v5.0 (Lollipop) - TouchWiz UI
- Chipset: Exynos 7420
- CPU: Quad-core 1.5 GHz Cortex-A53 & Quad-core 2.1 GHz Cortex-A57
- GPU: Mali-T760
- Storage/RAM: 32/64/128 GB, 3 GB RAM
- Camera: (Primary) 16 MP, 3456 x 4608, optical image stabilization, autofocus, LED flash
- Battery: 2550 mAh (non-removable)
The new phones both feature attractive styling with metal and glass construction and Gorilla Glass 4 sandwiching the frame, giving each phone a glass back.
The back of the new Galaxy S6 (Image credit: Android Central)
The guys at Android Central (source) had some pre-release time with the phones and have a full preview and hands-on video up on their site. The new phones will be released worldwide on April 10, and no specifics on pricing have been announced.
Subject: Graphics Cards | March 1, 2015 - 07:30 AM | Scott Michaud
Tagged: superfish, Lenovo, bloatware, adware
Obviously, this announcement does not erase the controversy that Lenovo got themselves into, but it is certainly the correct response (if they act how they imply). Adware and bloatware are common finds on consumer PCs, making the slowest of devices even more sluggish as demos and sometimes straight-up advertisements claim their share of your resources. That does not even begin to touch the security issues that some of these hitchhikers drag in. Again, I refer you to the aforementioned controversy.
In response, albeit a delayed one, Lenovo has announced that, by the launch of Windows 10, they will only pre-install the OS and “related software”. Lenovo classifies this related software as drivers, security software, Lenovo applications, and applications for “unique hardware” (ex: software for an embedded 3D camera).
It looks to be a great step, but I need to call out “security software”. Windows 10 should ship with Microsoft's security applications in many regions, which really raises the question of why a laptop provider would include an alternative. If the problem is that people expect McAfee or Symantec, then advertise pre-loaded Microsoft anti-malware and keep it clean. Otherwise, it feels like keeping a single finger in the adware take-a-penny dish.
At least it is not as bad as trying to install McAfee every time you update Flash Player. I consider Adobe's tactic the greater of two evils on that one. I mean, unless Adobe just thinks that Flash Player is so insecure that you would be crazy to install it without a metaphorical guard watching over your shoulder.
And then of course we reach the divide between “saying” and “doing”. We will need to see Lenovo's actual Windows 10 devices to find out if they kept their word, and followed its implications to a tee.
Subject: Mobile | February 28, 2015 - 04:42 PM | Sebastian Peak
Tagged: smartphones, MWC 2015, MWC, Moto E, LG Magna, ios, Android 5.0
Last year my favorite smartphone became the 2014 version of the Moto G. This was (and still is) a $179 unlocked Android phone that shipped with 4.4.4 KitKat, but recently received an OTA update to 5.0 Lollipop (and subsequently 5.0.2 via a second OTA update). Motorola’s aggressive pricing made the phone compelling on paper, but using the device was even more impressive. It looked good, with a 5-inch 720p IPS display and the same design language as the Moto X and later Nexus 6, and ran a virtually untouched stock Android OS. It was never going to win any awards for raw speed, but the quad-core Snapdragon 400 SoC was plenty fast for daily use. The main drawback was a glaring one, however: the Moto G was not LTE capable. Enter the new Moto E.
Here are some quick specs from Motorola:
Moto E 2nd Edition (LTE capable)
4.5” 540x960 display
Quad-core 1.2GHz Cortex-A53/Adreno 306
1GB RAM/8GB storage
2390 mAh battery
We are already off to a solid start in 2015 with a great option from Motorola in the new 2nd edition Moto E. This LTE-capable smartphone might look a little chunky, but the specs make it more than just a compelling option at $149 (unlocked), as it could have the disruptive impact on pricing that Microsoft just couldn’t manage last year with their inexpensive Lumia phones. With 2015’s Mobile World Congress (MWC) fast approaching, the Moto E has already been making some noise in the affordable phone space that last year’s Moto G played a big part in, and this time the message is clear: in 2015 a smartphone needs to have LTE, regardless of price.
To be fair, Microsoft has already addressed the need for LTE with their low-cost Windows Phone devices like the Lumia 635 (which is actually selling for just $49 on Amazon now), but the app ecosystem for the platform is just too restrictive to make it a viable alternative to Android and iOS. Honestly, I love the Windows Phone OS, but there are too many missing apps to make it a daily driver. So, since Windows clearly isn’t the answer and Apple won’t be selling a sub-$200 unlocked smartphone anytime soon (the cheapest unlocked iPhone is the 8GB 5c at $450), that leaves Android (of course).
Another possibility comes from LG, as ahead of MWC there was a press release from the company showcasing their new “mid-range” smartphone lineup for 2015. Among the models listed is another phone that matches the specs associated with a $200-ish unlocked phone, but pricing has not been announced yet.
LG Magna (LTE capable) - Unreleased
5.0” 720x1280 display
1GB RAM, 8GB storage
2540 mAh battery
We await the announcements from MWC and there are sure to be many other examples of low-cost LTE devices, but already it’s looking like it won’t take more than $200 and a SIM card to avoid the endless device upgrade cycle in 2015.
Subject: General Tech | February 27, 2015 - 04:41 PM | Tim Verry
Tagged: SFF, nuc5i7ryh, nuc, Intel, broadwell-u, Broadwell
We recently reviewed a new small form factor NUC PC from Intel powered by Broadwell. That i5-powered NUC5i5RYK will soon be joined by an even higher end Broadwell NUC (NUC5i7RYH) equipped with an i7-5557U CPU and Iris 6100 graphics.
According to FanlessTech, this slightly thicker NUC will come as a barebones system with a processor, motherboard, and wireless card pre-installed in a case with customizable lids (to add NFC, wireless charging, or other features). Note that, unlike the Broadwell i5 version we reviewed, this model supports 2.5” SSDs.
External I/O includes:
- 2 x USB 3.0 ports (one charging capable)
- 1 x Audio jack
- 1 x IR sensor
- 2 x USB 3.0 ports
- 1 x Gigabit Ethernet RJ45
- 1 x Mini HDMI 1.4a
- 1 x Mini DisplayPort 1.2
Internally, the NUC5i7RYH is powered by a dual-core (with Hyper-Threading) i7-5557U processor clocked at 3.1 GHz base and 3.4 GHz turbo, with 4MB of cache and a 28W TDP. The processor also features Intel’s Iris 6100 GPU, which our own Scott Michaud estimates at 48 execution units and 845 GFLOPS of performance. He further speculates that it reaches a similar level of theoretical performance as the Intel Iris 5100 graphics (used in Haswell CPUs) by using more (but lower clocked, at up to 1050 MHz) shaders.
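That GFLOPS estimate falls out of the usual Gen8 EU arithmetic. A quick sketch, with the caveat that the 16 FLOPs-per-EU-per-clock figure and the clock speeds are my assumptions drawn from Intel's Gen8 architecture disclosures, not official numbers for this SKU (an 845 GFLOPS estimate implies a 1.1 GHz peak clock):

```python
# Theoretical FP32 throughput for an Intel Gen8 GPU.
# Each EU has 2 SIMD-4 FPUs, and each lane can issue a fused
# multiply-add (2 FLOPs) per clock: 2 x 4 x 2 = 16 FLOPs/EU/clock.
FLOPS_PER_EU_PER_CLOCK = 2 * 4 * 2

def gen8_gflops(eus: int, clock_ghz: float) -> float:
    """Peak single-precision GFLOPS for a Gen8 GPU."""
    return eus * FLOPS_PER_EU_PER_CLOCK * clock_ghz

print(f"{gen8_gflops(48, 1.1):.1f}")   # 48 EUs at 1.1 GHz:  844.8 GFLOPS
print(f"{gen8_gflops(48, 1.05):.1f}")  # 48 EUs at 1.05 GHz: 806.4 GFLOPS
```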
The Iris 6100 GPU is likely to be the highest processor graphics we will see with Broadwell-U. It supports 4K resolutions at 24Hz as well as video decode (though apparently not hardware accelerated) of VP8, VP9, and H.265 (HEVC) via wired displays or over Intel’s WiDi wireless display technology. Further, the GPU supports DirectX 12 in its current iteration as well as OpenGL 4.3 and OpenCL 2.0.
Internal connectivity includes support for two DDR3L SODIMMs (up to 16GB), a single 2.5” solid state drive, one M.2 SSD, an Intel Wireless-AC 7265 card (802.11ac + BT), an NFC header, and a header for two USB 2.0 ports.
Intel has not released pricing, but expect it to hit at least $500 since the i5 version without Iris graphics has an MSRP of $399. It is slated to arrive soon with a launch window of Q2 2015.
Subject: General Tech | February 27, 2015 - 01:14 PM | Jeremy Hellstrom
Tagged: Huawei, EE, qualcomm, 4g lte
If 4G speeds of 400Mbps become common there are going to be some very happy media streamers, at least until the bill comes in. In a proof-of-concept test, Huawei, EE, and Qualcomm demonstrated a carrier-aggregated 4G LTE connection at Wembley Stadium which hit peak speeds of 400Mbps and should provide most attendees of events at Wembley with speeds of up to 150Mbps. The carrier will use the existing 4G LTE network; only tweaking, rather than a new standard, was needed to increase the speeds, so any phone capable of connecting to LTE should be able to take advantage of the boost. Check out The Inquirer for more information.
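The intuition behind carrier aggregation is simple: each component carrier contributes its own peak rate, and the handset sums them. The carrier counts and per-carrier rates below are illustrative assumptions, not details of the Wembley demo:

```python
# Illustrative LTE carrier aggregation math (assumed numbers, not
# details from the EE/Huawei/Qualcomm test): aggregated peak is the
# sum of each component carrier's peak rate.

def aggregated_peak_mbps(carriers):
    """carriers: list of (bandwidth_mhz, peak_mbps_per_20mhz) tuples."""
    return sum(bw / 20 * peak for bw, peak in carriers)

# Three hypothetical 20 MHz carriers at ~150 Mbps each:
print(aggregated_peak_mbps([(20, 150)] * 3))  # 450.0 Mbps theoretical peak
```

Real-world throughput lands below the theoretical sum, which is consistent with a 400Mbps peak from aggregated spectrum that would total somewhat more on paper.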
"HUAWEI, EE AND QUALCOMM have demonstrated a blink-and-you-missed-it 4G network at Wembley Stadium that achieved speeds of 400Mbps."
Here is some more Tech News from around the web:
- Intel unveils upcoming Atom x3 x5 and x7 processors ahead of MWC @ The Inquirer
- How to Use KDE Plasma Desktop Like a Pro @ Linux.com
- Check out our HOT AIR INTERFACE for 5G – Huawei @ The Register
- Microsoft man: Internet Explorer had to go because it's garbage @ The Register
- NO ONE is making money from YouTube, even Google – report @ The Register
Subject: General Tech | February 26, 2015 - 11:29 PM | Scott Michaud
Tagged: nvidia, hearthstone, esports
Professional and amateur players of Hearthstone: Heroes of Warcraft can compete for a share of a $25,000 prize pool and other perks in a tournament hosted by NVIDIA. Once the pool of players is whittled down to the sixteen invited pros and the top sixteen non-professionals, they will compete in a playoff format. The 32 players at that stage will each receive an NVIDIA Shield Tablet, the top 16 will receive money, and the top eight will get Blizzard World Championship qualifier points, which may either start their career or bring them even closer to an invitation to the autumn finals.
Breaking down the above into a little more detail:
| Place | Prize Money | Qualification Points | Shield Tablet |
| --- | --- | --- | --- |
| 3rd & 4th Place | $1,500 | Some | ✔ |
| 5th - 8th Place | $750 | Some | ✔ |
| 9th - 16th Place | $500 | - | ✔ |
| 17th - 32nd Place | - | - | ✔ |
NVIDIA will stream a four-hour broadcast each week consisting of group-stage highlights. Registration closes on March 19th at noon (EST). The actual playoffs will take place on May 30th and 31st, also streamed on NVIDIA's Twitch channel.
Subject: Cases and Cooling | February 26, 2015 - 05:39 PM | Jeremy Hellstrom
Tagged: corsair, Carbide Series, Air 240 High Airflow, MicroATX, mini-itx, SFF
Corsair designed the Carbide Series Air 240 High Airflow for small motherboards but left enough room to fit fair-sized add-in cards and coolers. The case measures 397 x 260 x 320mm (15.6 x 10 x 12.6") and will hold GPUs up to 290mm in length and a CPU cooler up to 120mm tall, as well as a full-sized ATX PSU. [H]ard|OCP installed two GTX 280s with no issues and had no problems fitting several popular AiO watercoolers either. Even with just air cooling, Corsair's Direct Airflow Path proved to be much more than a marketing gimmick, keeping the components at reasonable temperatures even under heavy loads. It certainly earned the Gold Award it received, and at less than $100 it deserves a spot on your short list of tiny cases to consider purchasing.
"Are you in the market for a case for that new Mini-ITX or MicroATX PC build? Corsair today shows off its Carbide Series Air 240 High Airflow MicroATX and Mini-ITX PC Case. It's big, it's black, and it will remind you of the Borg. OK, maybe it is not that big, but big enough to allow mATX fans plenty of room for cooling and hot dual GPUs."
Here are some more Cases & Cooling reviews from around the web:
- Fractal Design's Define R5 case @ The Tech Report
- Raijintek Metis Classic Computer Case @ Benchmark Reviews
- Fractal Design Core 2300 @ techPowerUp
- In Win 703 @ Legion Hardware
- Thermaltake Core V51 Case Review @ Hardware Asylum
- Thermalright Silver Arrow ITX @ techPowerUp
- Phanteks PH-TC14S Dual-Tower Review: Conflict-free CPU Cooling? @ Modders-Inc
- Phanteks PH-TC12LS CPU Cooler @ Modders-Inc
- Reeven Justice (RC-1204) @ eTeknix
Subject: Graphics Cards, Mobile | February 26, 2015 - 02:15 PM | Ryan Shrout
Tagged: super-gpu, PowerVR, Imagination Technologies, gt7900
As a preview to announcements and releases being made at both Mobile World Congress (MWC) and the Game Developers Conference (GDC) next week, Imagination Technologies took the wraps off a new graphics product it is calling a "super-GPU". The PowerVR GT7900 is the new flagship GPU in the Series7XT family, targeting a growing category called "affordable game consoles." Think of Android-powered set-top devices like the Ouya or Amazon's Fire TV.
PowerVR breaks its GPU designs up into unified shading clusters (USCs), and the GT7900 has 16 of them for a total of 512 ALU cores. Imagination has previously posted a great overview of its USC architecture design and how its designs compare to other GPUs on the market. Imagination claims that the GT7900 will offer "PC-class gaming experiences," though that is as ambiguous as the idea of the workload of a "console-level game." But with rated peak performance over 800 GFLOPS in FP32 and 1.6 TFLOPS in FP16 (half precision), this GPU does have significant theoretical capability.
| | PowerVR GT7900 | Tegra X1 |
| --- | --- | --- |
| GPU Clock | 800 MHz | 1000 MHz |
| Process Tech | 16nm FinFET+ | 20nm TSMC |
Imagination also believes that PowerVR offers a larger portion of its peak performance for a longer period of time than the competition thanks to the tile-based deferred rendering (TBDR) approach that has been "refined over the years to deliver unmatched efficiency."
The FP16 performance number listed above is useful as an extreme power-savings option, since half-precision compute operates in a much more efficient manner. A fair concern is how many applications, GPGPU or gaming, actually utilize the FP16 data type, but having support for it in the GT7900 at least allows developers to target it.
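The quoted peaks line up with straightforward cores-times-clock arithmetic. The throughput assumptions below (one FMA, i.e. 2 FLOPs, per ALU core per clock for FP32, with FP16 at double rate) are mine for illustration, not figures Imagination published:

```python
# Sanity-checking the GT7900's quoted peaks under assumed throughput:
# each of the 512 ALU cores does one FMA (2 FP32 FLOPs) per clock at
# 800 MHz, and FP16 runs at twice the FP32 rate.

def gt7900_peak(cores=512, clock_ghz=0.8):
    fp32 = cores * 2 * clock_ghz  # GFLOPS at single precision
    fp16 = fp32 * 2               # half precision at assumed 2x rate
    return fp32, fp16

fp32, fp16 = gt7900_peak()
print(f"FP32: {fp32:.1f} GFLOPS, FP16: {fp16 / 1000:.2f} TFLOPS")
# 819.2 GFLOPS and ~1.64 TFLOPS, matching the "over 800 GFLOPS" and
# "1.6 TFLOPS" claims above
```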
Other key features of the GT7900 include support for OpenGL ES 3.1 + AEP (Android Extension Pack), hardware tessellation and ASTC LDR and HDR texture compression standards. The GPU also can run in a multi-domain virtualization mode that would allow multiple operating systems to run in parallel on a single platform.
Imagination believes that this generation of PowerVR will "usher a new era of console-like gaming experiences" and will showcase a new demo at GDC called Dwarf Hall.
I'll be at GDC next week and have already set up a meeting with Imagination to talk about the GT7900, so I should have some hands-on experience to report back with soon. I am continually curious about the market for these types of high-end "mobile" GPUs given how limited the Android console market currently is. Imagination does claim that the GT7900 beats products with performance levels as high as the GeForce GT 730M discrete GPU - no small feat.