Ancient Space doesn't uncover much that is new

Subject: General Tech | October 1, 2014 - 02:26 PM |
Tagged: gaming, ancient space, letdown

It seems that Ancient Space is not quite living up to the hype surrounding its cast of sci-fi stars and Homeworld-like appearance. From what Rock, Paper, SHOTGUN found, the story was lacklustre even with the recognizable voices, and while the gameplay was enjoyable it lacked the brilliance that made Homeworld so memorable. It is a beautiful game and it does have some genuinely new features, such as Captains who can be swapped in to give different buffs to your ships, but overall the reviewers came away a bit let down. You can grab it on Steam, but you might want to consider some of the Homeworld and Homeworld 2 mods to tide you over until the remastered versions are released.

If you do find a mod you like, you might be able to talk one or more of the Fragging Frogs into playing a game with you; otherwise, keep an eye on their forum for the games they will be playing this week.

aspace3.jpg

"That’s not to say Ancient Space is a terrible game: it’s actually not ever bad in any dramatic sense, it just doesn’t do anything particularly exciting. It’s disappointing. Beautiful, but disappointing. There’s your three word summary."

Here is some more Tech News from around the web:

Gaming

Introducing the all new Dynamic Super Resolution Duo!

Subject: General Tech | October 1, 2014 - 01:09 PM |
Tagged: nvidia, maxwell, GTX 980, GTX 970, GM204, geforce, dx12, dsr

Move over Super Best Friends, the Dynamic Super Resolution Duo is here to slay the evil Jaggies!  Ryan covered NVIDIA's new DSR in his review of the new Maxwell cards: the GPU renders a game at a resolution well above that of your 2560x1440 or lower monitor, then scales the frame back down to the display's native resolution using a 13-tap gaussian filter rather than the simple box filter of traditional supersampling. That matters because plain supersampling runs into some interesting challenges when shrinking a 2560x1440 render onto a 1080p monitor.  DSR gives you a much wider choice of resolutions, as you can see in the Guild Wars 2 screenshot below, letting you apply a variety of multipliers to your display's native resolution for a much smoother-looking game.  The Tech Report has assembled a variety of screenshots from games with different DSR and AA settings which you can examine with your own eyeballs to see what you think.
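If you are curious what that render-then-filter pipeline looks like in the abstract, here is a minimal sketch of the idea; the 4.00x factor, the Gaussian sigma, and the SciPy calls are assumptions for illustration only, since NVIDIA's exact 13-tap weights and implementation are not public.

```python
# Conceptual sketch of DSR-style downsampling (not NVIDIA's actual code):
# render at a multiple of the native pixel count, blur with a Gaussian so fine
# detail doesn't alias, then resample the frame down to the native grid.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

native_w, native_h = 1920, 1080            # the monitor's real resolution
factor = 4.00                              # DSR factor applied to total pixel count
render_w = int(native_w * factor ** 0.5)   # 4.00x pixels => 2x per axis => 3840
render_h = int(native_h * factor ** 0.5)   # => 2160

frame = np.random.rand(render_h, render_w, 3)   # stand-in for the rendered frame

blurred = gaussian_filter(frame, sigma=(1.0, 1.0, 0))   # smooth before resampling
downscaled = zoom(blurred, (native_h / render_h, native_w / render_w, 1), order=1)
print(downscaled.shape)   # (1080, 1920, 3)
```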

gw2-resolutions.jpg

"One of the more intriguing capabilities Nvidia introduced with the GeForce GTX 970 and 980 is a feature called Dynamic Super Resolution, or DSR, for short. Nvidia bills it as a means of getting 4K quality on a 2K display. How good is it? We take a look."

Here is some more Tech News from around the web:

Tech Talk

Microsoft Introduces Windows 10 to the Enterprise

Subject: General Tech | September 30, 2014 - 11:46 PM |
Tagged: windows 9, Windows 8.1, Windows 7, windows 10, windows, threshold, microsoft

The Windows event for the enterprise, which took place today in San Francisco, revealed the name of the upcoming OS. It is not Windows 9, or One Windows, or just Windows. It will be Windows 10. Other than the name, there is not really any new information from a feature or announcement standpoint (except the Command Prompt refresh, which I will briefly mention later). My interest comes from their mindset with this new OS -- what they are changing and what they seem to be sticking with.

If you would like Microsoft's commentary before reading mine, the keynote is embedded above.

Okay, so one thing that was shown is "Continuum". If you have not seen its prototype at the end of the above video, it is currently a small notification that appears when a keyboard and mouse are attached (or detached). If the user accepts, this will flip the user interface between the tablet and desktop experiences. Joe Belfiore was clear that the video clip was not yet in code, but represents their vision. In practice, it will have options for whether to ask the user or to automatically perform some chosen behavior.

windows-10-continuum.jpg

In a way, you could argue that it was necessary to go through Windows 8.x to get to this point. From the demonstrations, the interface looks sensible and like a reasonable landing point for users coming from both the Windows 7 and Windows 8 paths. That said, I was fine with the original Windows 8 interface, barring a few glitches, like disappearing icons and snapping sidebars on PCs with multiple monitors. I always considered the "modern" Windows interface to be... acceptable.

It was the Windows Store certification that kept me from upgrading, and Microsoft's current stance is confusing at the very least. Today's announcement included the quote, "Organizations will also be able to create a customized store, curating store experiences that can include their choice of Store apps alongside company-owned apps into a separate employee store experience." Similar discussion was brought up and immediately glossed over during the keynote.

Who does that even apply to? Would a hobbyist developer be able to set up a repository for friends and family? Or is this relegated to businesses, leaving consumers to accept nothing more than what Microsoft allows? The concern is that I do not want Microsoft (or anyone) telling me what I can and cannot create and install on my devices. Once you build censorship, the crazies will come. They usually do.

windows-10.png

But onto more important things: Command Prompt had a major UX overhaul. Joe Belfiore admitted that this was mostly because the most important changes had already leaked and been reported on, and they wanted to surprise us with something. They sure did. You can now use typical keyboard shortcuts: shift to select, Ctrl+C and Ctrl+V to copy and paste, and so forth. They even allow a transparency option, which is common in other OSes to make the window's presence less jarring. Rather than covering up what you are doing, transparency makes the prompt feel like it overlays on top of your work, especially for quick commands. At least, that is my opinion.

Tomorrow, October 1st, Microsoft will launch their "Windows Insider Program". This will give a very early glimpse at the OS to the "most enthusiastic Windows fans" who are "comfortable running pre-release software that will be of variable quality". They "plan to share all the features (they) are experimenting with". They seem to actually want user feedback, a sharp contrast to their Windows 8 technical preview. My eye will be on relaxing certification requirements, obviously.

Source: Microsoft

Look at all the pretty lights! The Corsair K70 RGB

Subject: General Tech | September 30, 2014 - 02:00 PM |
Tagged: K70 RGB, input, corsair, Cherry MX RGB red

There is a new type of Cherry MX switch on the market, and it is what allows the Corsair K70 RGB to stand out in a light-filled room; Cherry MX RGB switches feel like the original switches but swap the opaque housings for clear plastic ones so the LED lighting can shine through.  Thanks to the Corsair Utility Engine software which comes with the keyboard, you can choose from 16.8 million colours to enhance the look of your keyboard, or create macros that change the colours as you are using it.  The Tech Report had great success programming the keyboard, though considering the manual is 142 pages long you should expect a bit of a learning curve when you first start playing with it.  You can find their review, as well as a video showing off some of their colour schemes, right here.

front342-lores.jpg

"Corsair Gaming's K70 RGB keyboard has been hotly anticipated since its debut at CES earlier this year. Does it live up to the hype? We put the keyboard and its accompanying software to the test to find out"

Here is some more Tech News from around the web:

Tech Talk

The Internet of Things is a confusing place for manufacturers right now

Subject: General Tech | September 30, 2014 - 01:11 PM |
Tagged: arm, internet of things, Si106x, 108x, Silicon Labs, Intel, quark

While the Internet of Things is growing at an incredible pace, the chip manufacturers competing for this new market segment are running into problems when trying to design chips to add to appliances.  There is a balance which needs to be found between processing power and energy savings: the goal is to design very inexpensive chips which can run on microwatts of power but still incorporate networked communication and sensors.  The new Cortex-M7 is a 32-bit processor which is directly competing with 8- and 16-bit microcontrollers that provide far fewer features but also consume far less power.  Does a smart light bulb really need a 32-bit chip in it, or will a lower cost MCU provide everything that is needed for the light to function?  Intel's Quark is in a similar position; the processing power it is capable of could be huge overkill compared to what the IoT product actually needs.  The Register makes a good observation in this article: perhaps a Cortex-M0 paired with an M4 or M7, for when the application requires the extra horsepower, is a good way for ARM to go.  Meanwhile, Qualcomm's Snapdragon 600 has been adopted to run an OS that controls robots, so don't think this market is going to get any less confusing in the near future.

cortex-mo.png

"The Internet of Things (IoT) is growing an estimated five times more quickly than the overall embedded processing market, so it's no wonder chip suppliers are flocking to fit out connected cars, home gateways, wearables and streetlights as quickly as they can."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Europeans: Atlast! Has Haswell-based Fanless NUCs

Subject: General Tech, Systems | September 29, 2014 - 06:41 PM |
Tagged: fanless, nuc, haswell

The Akasa Newton X is a fanless case for the NUC form factor that was announced in May and released a couple of months ago. Now, we are beginning to see system builders (albeit in Europe) integrate it into some higher-end devices. This one, from Atlast! Solutions, is built around the Intel Core i5-4250U with up to 1.5TB of SSD storage (512GB Crucial M550 mSATA + 1TB 840 EVO SATA) and up to 16GB of RAM. It can also be configured with up to two-antenna Wireless AC.

akasa-NUC09-A1B_f00.jpg

The Core i5-4250U is a dual-core (four-thread) processor that is rated for a 15W TDP. Its on-chip GPU is the Intel HD Graphics 5000, with a peak, theoretical compute throughput of 704 GFLOPS. That puts it at a little under three times the graphics performance of an Xbox 360. In terms of PC games, you are looking at Battlefield 4 or Titanfall on low at 1024x768 (or basically whatever your home server can do if used as a stream-to target).
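For anyone curious where that roughly 3x figure comes from, here is the back-of-the-envelope math using commonly cited peak numbers (the Xbox 360 Xenos figure in particular is the oft-quoted 240 GFLOPS):

```python
# Rough peak FP32 throughput, back-of-the-envelope only.
hd5000_gflops = 40 * 16 * 1.1   # 40 EUs x 16 FP32 ops/clock x 1.1 GHz boost = 704
xenos_gflops = 48 * 10 * 0.5    # 48 ALUs x (vec4 + scalar) MADD x 0.5 GHz = 240
print(hd5000_gflops / xenos_gflops)   # ~2.93, a little under three times
```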

Prices currently start at £449.00 for 4GB of RAM and 60GB of mSATA SSD, including VAT.

Thanks to FanlessTech for covering this story.

Gigabyte's Z97X G1 Gaming GT is a bit of a step backwards

Subject: General Tech | September 29, 2014 - 02:17 PM |
Tagged: gigabyte, Z97X G1 Gaming GT, z97

Calling the GIGABYTE G1 Gaming GT Z97 motherboard trimmed down is a bit of an exaggeration; all that was removed was Bluetooth, WiFi and Creative's Sound Core3D codec.  It still features AMP-UP audio with swappable OP-AMPs, an E2200 Killer NIC, high quality caps, and four PCIe 3.0 x16 slots thanks to a PLX chip, as well as an impressive array of SATA and USB ports.  At $270 it will cost you somewhat less than choosing a new Haswell-E system, and the performance in most cases will be very comparable, especially if you desire high quality audio.  However, not all was good once [H]ard|OCP started testing the board; while there were no insurmountable issues, their overall experience setting up the board makes this particular model difficult to recommend.  Read the reasons why in their full review.

1409143032CuWCYe4Ehc_1_11_l.jpg

"GIGABYTE’s G1 Gaming GT looks to be a stripped version of the Z97X Gaming G1 WiFi-BK. Like other offerings in the G1 family the G1 Gaming GT is a premium part representing the pinnacle of what GIGABYTE design and innovation can and should offer. We have high expectations for the G1 Gaming GT."

Here are some more Motherboard articles from around the web:

Motherboards

Source: [H]ard|OCP

Check out results of The Tech Report's hardware survey

Subject: General Tech, Systems | September 29, 2014 - 01:21 PM |
Tagged: survey, components

The Tech Report have compiled the data from their survey of readers' machines and the results are now posted in this article.  You can see how your build compares to the major trends that they observed, from the number and type of monitors that you use to the amount of RAM you have installed.  The most interesting page covers the odd facts which were revealed, such as the overwhelming predominance of ATX boards and cases despite 75% of respondents having only a single card installed in their systems.  It is also interesting to note that a mere 10% of those responding use more than one GPU.  Check out the findings here.

images.jpg

"Typical PC enthusiasts may spend more on their PCs than you might think—and by the looks of it, their taste for high-end hardware isn't just limited to core components. Those are two of the main takeaways from the TR Hardware Survey 2014, in which we invited readers to answer 26 questions about their PCs. Around 4,000 of you participated over a period of about a week and a half, and the results paint an enlightening picture of current trends in the hobbyist PC realm. "

Here is some more Tech News from around the web:

Tech Talk

Author:
Manufacturer: Apple

One Small Step

While most articles surrounding the iPhone 6 and iPhone 6 Plus thus far have focused on user experience and the larger screen sizes, performance, and in particular the effect of Apple's transition to the 20nm process node for the A8 SoC, has been our main question regarding these new phones. Naturally, I decided to put my personal iPhone 6 through our usual round of benchmarks.

applea83.jpg

First, let's start with 3DMark.

3dmark-iceunlimited.png

Comparing the 3DMark scores of the new Apple A8 to even the last-generation A7 shows a smaller improvement than we are used to seeing generation-to-generation with Apple's custom ARM implementations. When you compare the A8 to something like the NVIDIA Tegra K1, which utilizes desktop-class GPU cores, the K1's overall score blows Apple out of the water. Even taking a look at the CPU-bound physics score, the K1 is still the winner.

A 78% performance advantage in overall score when compared to the A8 shows just how much of a powerhouse NVIDIA has with the K1. (Though clearly power envelopes are another matter entirely.)
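For reference, the percentage figures quoted throughout this article are simple relative differences between two overall scores; here is a minimal sketch with made-up numbers, not the actual benchmark results:

```python
# Hypothetical scores, used only to show how a "78% advantage" is derived.
k1_score, a8_score = 30000, 16850
advantage = (k1_score - a8_score) / a8_score * 100
print(f"{advantage:.0f}% advantage for the K1")   # 78% with these made-up numbers
```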

octane.png

If we look at more CPU benchmarks, like the browser-based Google Octane and SunSpider tests, the A8 starts to shine more.

sunspider.png

While the A8 edges out the A7 to be the best-performing device in SunSpider, coming in 54% faster than the K1 there, the A8 and K1 are neck and neck in the Google Octane benchmark.

gfxbench-manhattan.png

Moving back to a graphics-heavy benchmark, GFXBench's Manhattan test, the Tegra K1 has a 75% performance advantage over the A8, though the A8 is still 36% faster than the previous A7 silicon.

These early results are certainly a disappointment compared to the usual generation-to-generation performance increase we see with Apple SoCs.

However, the other aspect to look at is power efficiency. With normal use I have noticed a substantial increase in battery life on my iPhone 6 over the last-generation iPhone 5S. While this may be due in part to a small (about 1 Wh) increase in battery capacity, I think more can be credited to this being an overall more efficient device. Choices like sticking with a highly optimized dual-core CPU design and quad-core GPU, as well as the move to the 20nm process node, all contribute to increased battery life while still surpassing the performance of the last-generation Apple A7.

apple-a8-dieshot-chipworks.png

In that way, the A8 moves the bar forward for Apple and is a solid first attempt at using the 20nm silicon technology at TSMC. There is strong potential that with further refined parts (like the expected A8X for the iPad revisions), Apple will be able to further surpass 28nm silicon in performance and efficiency.

Intel RealSense SDK Beta Available, Camera Pre-Order

Subject: General Tech | September 29, 2014 - 03:41 AM |
Tagged: Realsense 3D, realsense, kinect, Intel

RealSense is Intel's 3D camera initiative for bringing face recognition, gesture control, speech input, and augmented reality to the PC. Its closest analogy would be Microsoft's Kinect for Windows. The technology has been presented at Intel keynotes for a while now, embodied in the "Intel Perceptual Computing SDK 2013" under its "Perceptual Computing" initiative.

intel-creative-realsense.png

Since August 31st, that has been removed from their site and replaced with the Intel RealSense SDK. While the software is free, you will probably need compatible hardware to do anything useful. None is available yet, but the "Intel RealSense Developer Kit" hardware (not to be confused with the "Intel RealSense SDK", which is software) is available for reservation at Intel's website. The camera is manufactured by Creative Labs and will cost $99. Intel is also very clear that this is a developer tool, and forbids it from being used in "mission critical applications". Basically, don't trust it with your life, or the lives and health of anyone or anything else.

The developer kit will be available for many regions: the US, Canada, much of Europe, Brazil, India, China, Taiwan, Japan, Malaysia, South Korea, New Zealand, Australia, Russia, Israel, and Singapore.

Source: Intel