Subject: General Tech, Mobile | October 1, 2014 - 03:17 PM | Scott Michaud
Tagged: Kickstarter, Firefox OS, web, chromecast
When Google released the Chromecast, it was a surprisingly clean solution for streaming video (my apologies if solutions existed before it). Just plug it into HDMI, connect to it with a PC or a mobile device, and you can use the TV as a monitor for content. And it is cheap. I figured that the open source community would like one of their own, but I did not think it would actually happen. Now there is a Kickstarter up for one, and it runs Firefox OS.
I constantly struggle with whether to discuss crowdfunding because, on the one hand, you never know if something will tank. On the other hand, is it really any less sketchy than pre-release information for computer hardware or video games (especially pre-release news for video games)?
In this case, I found out that it was promoted by Mozilla on their Hacks blog. It is based on a Rockchip 3066 SoC with 1GB of RAM, 4GB of storage, and 2.4 GHz Wireless-N. As stated earlier, it runs Firefox OS, which means that apps are websites. The SoC has a Mali-400 GPU that is capable of OpenGL ES 2.0, so it might even be able to support WebGL if the software and drivers allow it. Don't expect jaw-dropping 3D graphics, though: the GPU is rated at about 9 GFLOPS. For comparison, the Tegra K1 has a peak compute throughput of about 365 GFLOPS; alternatively, the Mali-400 is fairly close to later-model Intel GMA graphics (not Intel HD Graphics... GMA). Still, it might allow for some interesting 2D (or simplistic 3D) games.
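To put those throughput numbers in perspective, a quick back-of-the-envelope comparison (using the approximate peak figures quoted above; real-world game performance depends on much more than peak FLOPS) looks like this:

```python
# Rough peak-throughput comparison using the approximate figures above.
# These are theoretical peaks, not measured game performance.
mali_400_gflops = 9      # Mali-400 in the Rockchip 3066 (approximate rating)
tegra_k1_gflops = 365    # NVIDIA Tegra K1 (approximate peak)

ratio = tegra_k1_gflops / mali_400_gflops
print(f"Tegra K1 has roughly {ratio:.0f}x the peak GPU throughput")
```

A roughly 40x gap is why simplistic 3D is the realistic ceiling here, even if WebGL does end up working.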
Just a day-or-so in, it is already at over 150% funding.
Subject: General Tech | October 1, 2014 - 02:26 PM | Jeremy Hellstrom
Tagged: gaming, ancient space, letdown
It seems that Ancient Space is not quite living up to the hype surrounding its cast of sci-fi stars and Homeworld-like appearance. From what Rock, Paper, SHOTGUN found, the story was lacklustre even with recognizable voices, and while the gameplay was enjoyable it lacked the brilliance that made Homeworld so memorable. It is a beautiful game, and it does have some new features, like Captains who can be swapped to give different buffs to your ships, but overall they came away a bit let down. You can grab it on Steam, but you might want to consider some of the Homeworld and Homeworld 2 mods to tide you over until the remastered versions are released.
If you do find a mod you like, you might be able to talk one or more of the Fragging Frogs into playing a game with you; otherwise, keep an eye on their forum for the games they will be playing this week.
"That’s not to say Ancient Space is a terrible game: it’s actually not ever bad in any dramatic sense, it just doesn’t do anything particularly exciting. It’s disappointing. Beautiful, but disappointing. There’s your three word summary."
Here is some more Tech News from around the web:
- Survival Strategy: Total War – Attila Announced @ Rock, Paper, SHOTGUN
- Metro Redux Review @ OCC
- Interview: Gearbox On Borderlands: The Pre-Sequel @ Rock, Paper, SHOTGUN
- The Plague Spreads: Pathologic Remake Hits Funding Goal @ Rock, Paper, SHOTGUN
- Wot I Think (So Far): Middle-Earth: Shadow Of Mordor @ Rock, Paper, SHOTGUN
- Civilization Beyond Earth: 200 Turns On The Final Frontier @ Rock, Paper, SHOTGUN
Subject: General Tech | October 1, 2014 - 01:09 PM | Jeremy Hellstrom
Tagged: nvidia, maxwell, GTX 980, GTX 970, GM204, geforce, dx12, dsr
Move over Super Best Friends, the Dynamic Super Resolution Duo is here to slay the evil Jaggies! Ryan covered NVIDIA's new DSR in his review of the new Maxwell cards: for monitors with a native resolution of 2560x1440 or lower, it renders the game at a much higher resolution and then scales the image back down, a process similar to supersampling except that the downscaling uses a 13-tap Gaussian filter. That distinction is important because plain supersampling would face some interesting challenges displaying a 2560x1440 render on a 1080p monitor. DSR gives you a wide choice of resolutions, as you can see in the Guild Wars screenshot below, letting you apply a variety of multipliers to your display's native resolution for a much smoother look. The Tech Report has assembled a variety of screenshots from games with different DSR and AA settings, which you can examine with your own eyeballs to see what you think.
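NVIDIA has not published the exact coefficients of its filter, but the general shape of a 13-tap Gaussian downscaling kernel can be sketched in a few lines (the sigma value here is an arbitrary assumption for illustration, not NVIDIA's actual tuning):

```python
import math

def gaussian_kernel(taps=13, sigma=2.0):
    """Build a normalized 1D Gaussian kernel. Conceptually, a filter like
    this blends many rendered pixels into each displayed pixel when a
    high-resolution render target is scaled down to native resolution."""
    center = taps // 2
    weights = [math.exp(-((i - center) ** 2) / (2 * sigma ** 2))
               for i in range(taps)]
    total = sum(weights)
    return [w / total for w in weights]  # normalize so brightness is preserved

kernel = gaussian_kernel()
print([round(w, 4) for w in kernel])
```

Because the weights fall off smoothly from the center instead of averaging a box of samples equally, the result avoids some of the ringing and shimmer a naive downscale would show.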
"One of the more intriguing capabilities Nvidia introduced with the GeForce GTX 970 and 980 is a feature called Dynamic Super Resolution, or DSR, for short. Nvidia bills it as a means of getting 4K quality on a 2K display. How good is it? We take a look."
Here is some more Tech News from around the web:
- There's more to Windows 10 than miscounting @ The Inquirer
- Microsoft WINDOWS 10: Seven ATE Nine. Or Eight did really @ The Register
- AMD demonstrates NFV tool using 64-bit ARM-based SoC codenamed 'Hierofalcon' @ The Inquirer
- Hong Kong Protesters Use Mesh Networks To Organize @ Slashdot
- Mozilla might add Tor encryption to its Firefox web browser @ The Inquirer
- Lenovo becomes the biggest x86 server provider in China as acquisition of IBM x86 server business completes, says IDC @ DigiTimes
- Supercomputers: The Next Generation – Cray puts burst buffer tech, Intel Haswell inside @ The Register
- Competition: Win One of Three BioStar Motherboards @ eTeknix
Subject: General Tech | September 30, 2014 - 11:46 PM | Scott Michaud
Tagged: windows 9, Windows 8.1, Windows 7, windows 10, windows, threshold, microsoft
The Windows event for the enterprise, which took place today in San Francisco, revealed the name of the upcoming OS. It is not Windows 9, or One Windows, or just Windows. It will be Windows 10. Other than the name, there is not really any new information from a feature or announcement standpoint (except the Command Prompt refresh, which I will give a brief mention later). My interest comes from their mindset with this new OS -- what they are changing and what they seem to be sticking with.
If you would like Microsoft's commentary before reading mine, the keynote is embedded above.
Okay, so one thing that was shown is "Continuum". If you have not seen its prototype at the end of the above video, it is currently a small notification that appears when a keyboard and mouse are attached (or detached). If the user accepts, it flips the user interface between tablet and desktop experiences. Joe Belfiore was clear that the video clip was not yet real code, but represents their vision. In practice, it will have options for whether to ask the user or to perform some chosen behavior automatically.
In a way, you could argue that it was necessary to go through Windows 8.x to get to this point. From the demonstrations, the interface looks sensible, and like a reasonable landing point for users coming from both the Windows 7 and Windows 8 paths. That said, I was fine with the original Windows 8 interface, barring a few glitches, like disappearing icons and snapping sidebars on PCs with multiple monitors. I always considered the "modern" Windows interface to be... acceptable.
It was the Windows Store certification that kept me from upgrading, and Microsoft's current stance is confusing at the very least. Today's announcement included the quote, "Organizations will also be able to create a customized store, curating store experiences that can include their choice of Store apps alongside company-owned apps into a separate employee store experience." Similar discussion was brought up and immediately glossed over during the keynote.
Who does that even apply to? Would a hobbyist developer be able to set up a repository for friends and family? Or is this relegated to businesses, leaving consumers to accept nothing more than what Microsoft allows? The concern is that I do not want Microsoft (or anyone) telling me what I can and cannot create and install on my devices. Once you build censorship, the crazies will come. They usually do.
But on to more important things: Command Prompt had a major UX overhaul. Joe Belfiore admitted that it came up mostly because the most important changes had already leaked and been reported on, and they wanted to surprise us with something. They sure did. You can now use typical keyboard shortcuts: Shift to select, Ctrl+C and Ctrl+V to copy and paste, and so forth. They even allow a transparency option, which is common in other OSes and makes its presence less jarring. Rather than covering what you are doing, the window feels like it overlays on top of it, especially for quick commands. At least, that is my opinion.
Tomorrow, October 1st, Microsoft will launch their "Windows Insider Program". This will give a very early glimpse at the OS to the "most enthusiastic Windows fans" who are "comfortable running pre-release software that will be of variable quality". They "plan to share all the features (they) are experimenting with". They seem to actually want user feedback, a sharp contrast to the Windows 8 technical preview. My eye will be on relaxed certification requirements, obviously.
Subject: General Tech | September 30, 2014 - 02:00 PM | Jeremy Hellstrom
Tagged: K70 RGB, input, corsair, Cherry MX RGB red
There is a new type of Cherry MX switch on the market, and it is what allows the Corsair K70 RGB to stand out in a light-filled room; Cherry MX RGB switches feel like the original switches but have clear plastic housings to let the LED light shine through. Thanks to the Corsair Utility Engine software that comes with the keyboard, you can choose from 16.8 million colours to enhance the look of your keyboard, or create macros that change colours as you are using it. The Tech Report had great success programming the keyboard, which is impressive considering that the manual is 142 pages long, so expect a bit of a steep learning curve when you first start playing with it. You can find their review, as well as a video showing off some of their colour schemes, right here.
"Corsair Gaming's K70 RGB keyboard has been hotly anticipated since its debut at CES earlier this year. Does it live up to the hype? We put the keyboard and its accompanying software to the test to find out"
Here is some more Tech News from around the web:
- Corsair Gaming K70 RGB Mechanical Keyboard @ eTeknix
- Ozone Strike Pro Cherry MX Red USB keyboard @ Kitguru
- CM Storm NovaTouch TKL Keyboard Review @ Legit Reviews
- Cooler Master NovaTouch TKL Keyboard @ Benchmark Reviews
- XTracGear Mouse Surfaces Review @ Neoseeker
- Attitude One Rapira Elite Gaming Mouse @ techPowerUp
- Attitude One Rapira Elite Laser Gaming Mouse @ NikKTech
- Gamdias HADES Extension laser gaming mouse @ Kitguru
- Attitude One Rapira One Gaming Mouse @ eTeknix
Subject: General Tech | September 30, 2014 - 01:11 PM | Jeremy Hellstrom
Tagged: arm, internet of things, Si106x, 108x, Silicon Labs, Intel, quark
While the Internet of Things is growing at an incredible pace, the chip manufacturers competing for this new market segment are running into problems when designing chips to add to appliances. There is a balance to be found between processing power and energy savings: the goal is to design very inexpensive chips that can run on microwatts of power but still incorporate networked communication and sensors. The new Cortex-M7 is a 32-bit processor that competes directly with 8- and 16-bit microcontrollers, which provide far fewer features but also consume far less power. Does a smart light bulb really need a 32-bit chip in it, or will a lower cost MCU provide everything the light needs to function? Intel's Quark is in a similar position; the processing power it offers could be huge overkill compared to what an IoT product actually needs. The Register makes a good observation in this article: perhaps pairing a Cortex-M0 with an M4 or M7, for when the application requires the extra horsepower, is a good way for ARM to go. Meanwhile, Qualcomm's Snapdragon 600 has been adopted to run an OS that controls robots, so don't expect this market to get any less confusing in the near future.
"The Internet of Things (IoT) is growing an estimated five times more quickly than the overall embedded processing market, so it's no wonder chip suppliers are flocking to fit out connected cars, home gateways, wearables and streetlights as quickly as they can."
Here is some more Tech News from around the web:
- Microsoft's Asimov System To Monitor Users' Machines In Real Time @ Slashdot
- ARM teams with 1248 to launch Hyperweave gateway to 'IoT-enable' enterprises @ The Inquirer
- ARMs head Moonshot bodies: HP pops Applied Micro, TI chips into carts @ The Register
- Third patch brings more admin Shellshock for the battered and Bashed @ The Register
- Mining Bitcoins with Pencil and Paper @ Hack a Day
- How to Organize Your Linux File System for Clutter-Free Folders @ Linux.com
- Alien Isolation Community Preview event @ Kitguru
Subject: General Tech, Systems | September 29, 2014 - 06:41 PM | Scott Michaud
Tagged: fanless, nuc, haswell
The Akasa Newton X is a fanless case for the NUC form factor that was announced in May and released a couple of months ago. Now, we are beginning to see system builders (albeit in Europe) integrate it in some higher-end devices. This one, from Atlast! Solutions, is built around the Intel Core i5-4250U, up to 1.5TB of SSD storage (512GB Crucial M550 mSATA + 1TB 840 EVO SATA), and up to 16GB of RAM. It can also be configured with up to two-antenna Wireless AC.
The Core i5-4250U is a dual-core (four-thread) processor rated for a 15W TDP. Its on-chip GPU is Intel HD Graphics 5000, with a peak theoretical compute throughput of 704 GFLOPS. That makes it a little under three times the graphics performance of an Xbox 360. In terms of PC games, you are looking at Battlefield 4 or Titanfall on low at 1024x768 (or basically whatever your home server can do, if the NUC is used as a stream-to target).
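The 704 GFLOPS figure is consistent with the usual peak-throughput arithmetic. A sketch, assuming Intel's published 40 execution units for HD Graphics 5000, 16 FLOPs per EU per cycle (two SIMD-4 fused multiply-add units per EU), and the 1.1 GHz maximum dynamic GPU frequency implied by that figure:

```python
eus = 40                 # execution units in Intel HD Graphics 5000 (GT3)
flops_per_eu_cycle = 16  # 2 x SIMD-4 FPUs per EU, each FMA counting as 2 FLOPs
max_clock_ghz = 1.1      # assumed maximum dynamic frequency behind the 704 GFLOPS figure

# Peak throughput = units x FLOPs-per-cycle x clock (GHz gives GFLOPS directly)
peak_gflops = eus * flops_per_eu_cycle * max_clock_ghz
print(round(peak_gflops))  # 704
```

Remember that this is a theoretical peak at the GPU's maximum turbo clock; sustained throughput inside a 15W fanless envelope will be noticeably lower.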
Prices currently start at £449.00 for 4GB of RAM and 60GB of mSATA SSD, including VAT.
Thanks to FanlessTech for covering this story.
Subject: General Tech | September 29, 2014 - 02:17 PM | Jeremy Hellstrom
Tagged: gigabyte, Z97X G1 Gaming GT, z97
Calling the GIGABYTE G1 Gaming GT Z97 motherboard trimmed down is a bit of an exaggeration; all that was removed was Bluetooth, WiFi, and Creative's Sound Core3D codec. It still features AMP-UP audio with swappable OP-AMPs, an E2200 Killer NIC, high quality caps, and four PCIe 3.0 x16 slots thanks to a PLX chip, as well as an impressive array of SATA and USB ports. At $270 it will cost you somewhat less than a new Haswell-E system, and the performance in most cases will be very comparable, especially if you desire high quality audio. However, not all was good once [H]ard|OCP started testing the board; while there were no insurmountable issues, their overall experience setting up the board makes this particular model difficult to recommend. Read the reasons why in their full review.
"GIGABYTE’s G1 Gaming GT looks to be a stripped version of the Z97X Gaming G1 WiFi-BK. Like other offerings in the G1 family the G1 Gaming GT is a premium part representing the pinnacle of what GIGABYTE design and innovation can and should offer. We have high expectations for the G1 Gaming GT."
Here are some more Motherboard articles from around the web:
- ASUS Z97-Deluxe @ X-bit Labs
- Gigabyte GA-Z97X-UD5H Motherboard Review @ Madshrimps
- ASUS Rampage V Extreme @ HardwareHeaven
- MSI Z97 XPOWER AC Motherboard Review @ Madshrimps
- MSI Z97S SLI PLUS @ eTeknix
- MSI X99S Gaming 7 @ Kitguru
- ASUS X99 Deluxe Motherboard Review @ Hardware Canucks
- BIOSTAR A68N-5000 Motherboard Review @ Madshrimps
- MSI 970 Gaming @ Kitguru
Subject: General Tech, Systems | September 29, 2014 - 01:21 PM | Jeremy Hellstrom
Tagged: survey, components
The Tech Report has compiled the data from its survey of readers' machines, and the results are now posted in this article. You can see how your build compares to the major trends they observed, from the number and type of monitors you use to the amount of RAM you have installed. The most interesting page covers the odd facts that were revealed, such as the overwhelming predominance of ATX boards and cases despite 75% of respondents having only a single expansion card installed in their systems. It is also interesting to note that a mere 10% of those responding use more than one GPU. Check out the findings here.
"Typical PC enthusiasts may spend more on their PCs than you might think—and by the looks of it, their taste for high-end hardware isn't just limited to core components. Those are two of the main takeaways from the TR Hardware Survey 2014, in which we invited readers to answer 26 questions about their PCs. Around 4,000 of you participated over a period of about a week and a half, and the results paint an enlightening picture of current trends in the hobbyist PC realm. "
Here is some more Tech News from around the web:
- Cloudflare rolls out free universal SSL encryption to all users @ The Inquirer
- Ineffective Shellshock fix means hackers are still exploiting vulnerability @ The Inquirer
- Microsoft to release new inexpensive notebook solution; unlikely to receive full support from vendors @ DigiTimes
- Android smartphone vendors speeding up migration to 64-bit architecture @ DigiTimes
- BlackBerry slowly pulls out of power dive into toilet @ The Register
- HP offers 64-bit ARM processors for Moonshot servers @ The Inquirer
- Nvidia: We will focus on G-Sync, not Adaptive-Sync @ Kitguru
- Vendors stop developing touchscreen notebooks @ DigiTimes
- Unchanging Unicorn: Don't be disappointed with Ubuntu 14.10, be happy @ The Register
One Small Step
While most articles about the iPhone 6 and iPhone 6 Plus thus far have focused on user experience and the larger screen sizes, our main questions about these new phones concern performance, and in particular the effect of Apple's transition to the 20nm process node for the A8 SoC. Naturally, I decided to put my personal iPhone 6 through our usual round of benchmarks.
First, let's start with 3DMark.
Comparing the 3DMark scores of the new Apple A8 to even the last generation A7 shows a smaller improvement than we are used to seeing generation-to-generation from Apple's custom ARM implementations. When you compare the A8 to something like the NVIDIA Tegra K1, which uses desktop-class GPU cores, the K1's overall score blows Apple out of the water. Even looking at the CPU-bound physics score, the K1 still wins.
A 78% performance advantage in overall score compared to the A8 shows just how much of a powerhouse NVIDIA has with the K1. (Though clearly power envelopes are another matter entirely.)
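For clarity, the percentage figures throughout these comparisons follow the standard relative-difference calculation against the slower part. A minimal sketch, using made-up scores purely for illustration (not the actual benchmark results):

```python
def percent_advantage(faster, slower):
    """Relative advantage of one score over another, in percent."""
    return (faster - slower) / slower * 100

# Hypothetical scores, chosen only to demonstrate the formula:
print(f"{percent_advantage(1780, 1000):.0f}% advantage")  # 78% advantage
```

Note the asymmetry: a part that is 78% faster than its rival is not matched by the rival being "78% slower", which is why the baseline matters when reading these numbers.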
If we look at more CPU benchmarks, like the browser-based Google Octane and SunSpider tests, the A8 starts to shine more.
While the A8 edges out the A7 to become the best performing device in SunSpider, 54% faster than the K1, the A8 and K1 are neck and neck in the Google Octane benchmark.
Moving back to a graphics-heavy benchmark, GFXBench's Manhattan test, the Tegra K1 has a 75% performance advantage over the A8, though the A8 is still 36% faster than the previous A7 silicon.
These early results are certainly a disappointment compared to the usual generation-to-generation performance increase we see with Apple SoCs.
However, the other aspect to look at is power efficiency. With normal use I have noticed a substantial increase in battery life on my iPhone 6 over the last generation iPhone 5S. While this may be partly due to a small (about 1 Wh) increase in battery capacity, I think more credit goes to this being an overall more efficient device. Choices like sticking with a highly optimized dual-core CPU design and a quad-core GPU, as well as the shrink to the 20nm process node, all contribute to increased battery life while still surpassing the performance of the last generation Apple A7.
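A quick sanity check on the capacity argument, using the commonly reported ballpark battery specs (roughly 5.9 Wh for the iPhone 5S and 6.9 Wh for the iPhone 6; treat both as approximations): capacity alone only buys a modest runtime increase at equal power draw.

```python
iphone_5s_wh = 5.9   # approximate reported battery capacity (assumption)
iphone_6_wh = 6.9    # approximate reported battery capacity (assumption)

# At identical average power draw, runtime scales linearly with capacity.
capacity_gain = (iphone_6_wh - iphone_5s_wh) / iphone_5s_wh * 100
print(f"Capacity alone buys roughly {capacity_gain:.0f}% more runtime")
```

Anything beyond that roughly 17% has to come from lower average power draw, which supports the efficiency explanation.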
In that way, the A8 moves the bar forward for Apple and is a solid first attempt at TSMC's 20nm silicon technology. There is strong potential that, with further refined parts (like the expected A8X for the iPad revisions), Apple will be able to surpass 28nm silicon even further in both performance and efficiency.