Skyrim on Frostbite ... no, not the engine

Subject: General Tech | October 29, 2014 - 01:45 PM |
Tagged: gaming, skyrim, frostfall

Last week RPS engaged in a bit of a theme, reviewing various survival games, a genre which has really taken off this year. Perhaps the most interesting was this article describing life with mods that turn Skyrim into a much colder place to live, where frostbite becomes a serious concern and weather effects are far more than just eye candy. They also chose a mod which disables fast travel and removes the dragons and the Dragonborn, playing instead as a random outlaw out for an adventure. All told this makes for a very different game than vanilla, and for those really looking for a new experience there is a comprehensive list of survival mods in this post; check out the comments below as well if you want to start counting your calories.

If you prefer survival of the fittest in a multiplayer game, then drop the single player mods and check out what the Fragging Frogs are up to this week.

6_youaredrunk.jpg

Drinking also has an effect.

"But more importantly, Meeko kept me warm in Skyrim’s deadly mountain passes. One of the mods I have installed is Frostfall, which gives the player a few extra things to worry about. Exposure can leave you freezing to death, while being wet means you succumb to the cold even faster. You have to keep yourself warm at fires and fill up on hot soups to keep your ‘exposure meter’ from dropping too low. Once, I tried to swim across a small, icy river and before I could get a fire going on the opposite shore I passed out from hypothermia. I woke up in a familiar inn, penniless, frostbitten and with this note in my pocket."

Here is some more Tech News from around the web:

Gaming

Building a new PC for the holidays?

Subject: Systems | October 29, 2014 - 01:19 PM |
Tagged: system build

The Tech Report has updated its system build recommendations for the latter part of 2014, with changes to the system components as well as a reluctant recommendation for Win 8.1, as Win7 is scheduled to exit mainstream support in the New Year. The Core i7-5960X did not make the cut, as the i7-5930K reaches similar performance for just over half the price, which also means that DDR4 appears for the first time, specifically the Crucial 16GB and 32GB DDR4-2133 kits. There is a lot of choice right now when it comes to GPUs: four under $150, five under $250 and four ranging from ~$300 to $630, ensuring that you can find one in your price range. Check out the full array of choices in their update.

Make sure to check out the recent updates on our Hardware Leaderboard as well.

haswell-oc.jpg

"Join us for another System Guide update, this time with just about all the tools you need to build a holiday PC early. We've got Nvidia's new GeForce GTX 900-series graphics cards, one of AMD's recently discounted A-series APUs, and much more."

Here are some more Systems articles from around the web:

Systems

Surface Server, WinARM, WARM? What shall we call it?

Subject: General Tech | October 29, 2014 - 12:22 PM |
Tagged: arm, microsoft, windows server

The Register does not specify which version was shown, likely a recent but highly modified one, but Microsoft has demonstrated its Server OS running on ARM hardware. This gives them another inroad into low cost server builds which don't necessarily have Intel or AMD inside, as well as hedging their bets against Linux. Linux already runs happily on just about any hardware you could want, or will soon, and Microsoft is likely worried about losing share to the open source OS. It will be interesting to see what Microsoft can offer the price conscious shopper to convince them to spend money on an OS license when Linux is free. The older generation of techs, who grew up with large UNIX servers and lived through Microsoft replacing them, have always been one of the obstacles to the growth of upstart young Linux, and their days are numbered. The Register also points to the possibility of this being an in-house solution to keep down the costs of maintaining Microsoft's cloud applications.

7217.Windows-Azure-logo-v_6556EF52.png

"That's not a stunning feat: having developed Windows RT – a version of Windows 8 running on ARM chippery – Microsoft clearly has the know-how to get the job done. And it's not an indication that Microsoft intends to make Windows Server on ARM a product. It's just a test."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

ARM Announces Mali-T800 Series of Mobile GPUs

Subject: Processors, Mobile | October 29, 2014 - 04:30 AM |
Tagged: arm, mali-T800, mali

While some mobile SoC manufacturers have created their own graphics architectures, others license from ARM (and some even have a mixture of both within their product stack). There does not seem to be a single focus with this generation, rather just improvements in the areas that make the most sense: some announcements tout increased energy efficiency, others higher performance, and even API support got a boost to OpenGL ES 3.1, which brings compute shaders to mobile graphics applications (without invoking OpenCL, etc.).

arm-mali-t860-chip-diagram-LG.png

Three models make up the Mali-T800 series: the T820, the T830, and the T860. As you climb the list, the products go from entry-level to high-performance mobile. GPUs are often designed in modularized segments, which ARM calls cores. You see this frequently in desktop, discrete graphics cards, where an entire product stack contains a handful of actual designs but individual products are made by disabling whole modules. The T820 and T830 can scale from one to four "core" modules, each containing four actual "shader cores", while the T860 can scale from one to sixteen "core" modules, each with sixteen "shader cores". Again, "core" modules are groups that contain the actual shader processors (plus L2 cache, etc.). Cores within cores.

This sort of ambiguity is probably why NVIDIA calls them "Streaming Multiprocessors" that contain "CUDA Cores".

arm-mali-t830-chip-diagram-LG.png

ARM does not (yet) provide an actual GFLOPS rating for these processors, which is to some extent up to the manufacturers anyway. Normally it is a matter of multiplying the clock frequency by the number of operations per cycle and by the number of shader units available. I tried, but the numbers I was getting did not match known values from previous generations, so my assumption about instructions per clock was presumably off. Also, ARM considers its performance figures conservative; manufacturers should have no problem exceeding them.
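
To make the back-of-the-envelope math concrete, here is a minimal sketch of that estimate in Python. The clock speed and the FLOPs-per-shader-core-per-clock figure are placeholder assumptions for illustration, not numbers ARM has published, which is exactly the sort of unknown that threw off my own attempt.

```python
# Hypothetical peak-throughput estimate for a Mali-T800 configuration.
# The FLOPs-per-shader-core-per-clock and clock values are assumptions
# for illustration only; ARM has not published these figures.

def peak_gflops(core_modules, shader_cores_per_module,
                flops_per_core_per_clock, clock_mhz):
    """Peak GFLOPS = total shader cores x FLOPs/core/clock x clock in GHz."""
    shader_cores = core_modules * shader_cores_per_module
    return shader_cores * flops_per_core_per_clock * (clock_mhz / 1000.0)

# A fully scaled T860: 16 core modules with 16 shader cores each,
# assuming 2 FLOPs per shader core per clock at an assumed 650 MHz.
print(f"{peak_gflops(16, 16, 2, 650):.1f} GFLOPS")  # 332.8 under these assumptions
```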

As for a release timeline? Because these architectures are designed for manufacturers to implement, you should start seeing them in devices hitting retail in late 2015 or early 2016.

Source: ARM

GoG Releases LucasArts Classics. More on the Way?

Subject: General Tech | October 28, 2014 - 06:10 PM |
Tagged: pc gaming, disney, lucasfilm

Lucasfilm games (think LucasArts) and Disney Interactive have recently been re-introducing their back catalog to the PC. Earlier this month, Disney unleashed its wrath upon Steam, including Epic Mickey 2, which had not been available on the PC outside of a limited Eastern European release. Today, they licensed (different) titles to GOG.com: three Star Wars titles and three point-and-click adventures.

lucasarts-tiefighter.jpg

As for the Star Wars titles? Two of them are X-Wing Special Edition and TIE Fighter Special Edition. Both include their 1994 and 1998 releases, as well as any applicable expansions. They, along with Sam & Max Hit the Road, had never been sold through digital distribution platforms prior to today.

Honestly, I never had a chance to play X-Wing and TIE Fighter. I liked space combat games, but I pretty much just played Privateer 1 and 2, along with some console games like Star Fox and Rogue Squadron. I was a kid; I played a handful of games to death. I keep hearing that X-Wing and TIE Fighter were supposedly the best of the genre, but I have no experience with them.

These titles are currently the top six best sellers on the service, pulling ahead of The Witcher 3 pre-orders as I wrote this post. The press release claims that more titles are on the way "in the coming months".

Source: GOG

LiteOn announces EP1 Series Enterprise M.2 PCIe SSDs

Subject: Storage | October 28, 2014 - 04:49 PM |
Tagged: ssd, pcie, M.2, LiteOn

In conjunction with Dell World, LiteOn has announced their new EP1 M.2 PCIe SSD:

EP1 pic.png

Designed primarily for enterprise workloads, the EP1 sports impressive specs for such a small device. Capacities are 480 and 960GB, random 4K IO is rated at 150K/44K IOPS (read/write), sequential transfers reach as high as 1.5GB/sec, and maximum latencies are in the 30-40 µs range (a spec that is particularly important for enterprise OLTP / transactional database workloads). Given the enterprise positioning, power loss protection is a given (you can see the capacitors in the upper right of the above photo). Here are the full specs:

EP1 specs.png

It should be noted that larger PCIe-based SSDs are rated for more than the EP1's one drive write per day, but they are also considerably larger physically than the M.2 EP1. As an additional aside, the 960GB capacity is a bit longer than what you might have seen so far in the M.2 form factor: while the 480GB model is a familiar 2280 (80mm long), the 960GB model follows the 22110 form factor (110mm long). The idle power consumption seems a bit high, but enterprise devices are typically tuned for instantaneous response over idle wattage.
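
If you want to turn that one-drive-write-per-day rating into total bytes written, the arithmetic is simple. Here is a quick sketch; the five-year rating period is an assumption for illustration, not LiteOn's published warranty term.

```python
# Convert a drive-writes-per-day (DWPD) rating into total terabytes written.
# The 5-year rating period is an assumption for illustration; LiteOn's
# actual warranty term may differ.

def endurance_tb(capacity_gb, dwpd, years):
    """Endurance in TB = capacity x DWPD x days in the rating period."""
    return capacity_gb * dwpd * 365 * years / 1000.0

for capacity_gb in (480, 960):
    print(f"{capacity_gb}GB @ 1 DWPD over 5 years: "
          f"{endurance_tb(capacity_gb, 1, 5):.0f} TB written")
```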

Full press blast after the break.

Source: LiteOn

The Alienware 13 comes with an optional Graphics Amplifier

Subject: General Tech | October 28, 2014 - 03:42 PM |
Tagged: alienware, Alienware 13, graphics amplifier, gaming laptop

The Alienware 13 is a gaming laptop which comes with a very interesting optional product, the so-called Graphics Amplifier, an external enclosure for a desktop GPU. Finally, the product we have been waiting for has arrived, though only for one specific system. The box will cost you $300 and allows you to connect a GPU to your laptop with a single cord. It does not ship with a GPU, but there is a 460W PSU inside. The GPU can be at most a double-slot card, as larger ones will not fit, and it can have a maximum power draw of 375W, which is not really an issue as that limit comes from the PCIe interface. The single cord you can see coming out of the back of the enclosure in this picture from Gizmodo provides a combined PCIe and USB connection to the laptop; when connected, it disables the laptop's internal GPU and lets the external desktop GPU power the system.

gntiejiarspy2nch7uxb.jpg

You cannot hot-swap your GPU; you will need to reboot your system to switch between the external GPU and your internal GPU, and SLI is not an option. You do get to choose between your integrated display or an external one connected via HDMI or Mini DisplayPort; the most expensive model of the Alienware 13 does ship with a 2560x1440 touchscreen, but it is still only 13" in size.

a131.JPG

The internals are quite nice, with a Haswell Core i5-4210U, a choice of either 8 or 16GB of DDR3-1600, a GTX 860M, and either a large HDD or a 256GB M.2 SSD. That is enough power to keep this laptop from lagging behind in performance for the next few years, and with the external GPU you could feasibly upgrade your graphics for a few generations, which will keep you in the game without needing a whole new system.

a132.JPG

From the tests that Gizmodo performed, the external GPU functions perfectly when enabled, which is great news for those of us who have been hoping PCIe would eventually bring us a product such as this. The proprietary nature should not be too much of a concern: if Dell has managed to pull it off, there is no reason why other companies could not make a version which works with other laptops that have the proper ports. This addresses the biggest issue gaming laptops have faced; now you can upgrade a laptop through several GPU generations instead of purchasing a completely new system every other generation or so.

Source: Dell

Get your Win7 machines while you still can

Subject: General Tech | October 28, 2014 - 01:46 PM |
Tagged: microsoft, win7, inevitable

It is official: at the end of this month, consumers will no longer be able to get their hands on a machine with Windows 7 installed, unless they luck into one which has been sitting on the shelves for a while. If you buy through a corporate account you will still be able to order a machine with Win7, but that will be the only way to get your hands on the OS, which is already almost impossible to find at retail. That puts shoppers in a bit of a bind, as Win10 will not arrive for a while yet, leaving Win 8.1 as your only Microsoft-based option. Of course, there is always Linux; now that many games and distribution platforms such as Steam support the free OS, it is a viable choice for both productivity and entertainment. You can get more details at Slashdot or vent your spleen in the comments section.

images.jpg

"This Friday is Halloween, but if you try to buy a PC with Windows 7 pre-loaded after that, you're going to get a rock instead of a treat. Microsoft will stop selling Windows 7 licenses to OEMs after this Friday and you will only be able to buy a machine with Windows 8.1. The good news is that business/enterprise customers will still be able to order PCs 'downgraded' to Windows 7 Professional."

Here is some more Tech News from around the web:

Tech Talk

Source: Slashdot

Samsung 850 EVO SKUs leaked, leads to initial pricing, specs

Subject: Storage | October 28, 2014 - 01:30 PM |
Tagged: ssd, sata, Samsung, 850 EVO

Thanks to an updated SKU list and some searching, we've come across some initial photos, specs, and pricing for the upcoming Samsung 850 EVO.

8310217.01.prod_.jpg

You may have heard of an 850 EVO 1TB listing over at Frys, but there's actually more information out there. Here's a quick digest:

Specs:

  • Memory: 3D V-NAND
  • Read: 550MB/sec
  • Write: 520MB/sec
  • Weight: 0.29 lbs

Pricing (via Antares Pro listings at time of writing):

  • 120GB (MZ-75E120B/AM): $100 ($0.83 / GB)
  • 250GB (MZ-75E250B/AM): $146 ($0.58 / GB)
  • 500GB (MZ-75E500B/AM): $258 ($0.52 / GB)
  • 1TB (MZ-75E1T0B/AM): $477 ($0.48 / GB)

In addition to the above, we saw the 1TB model listed for $500 at Frys, and also found the 500GB for $264 at ProVantage. The shipping date on the Frys listing was initially November 3rd, but that has since shifted to November 24th, presumably due to an influx of orders.
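
For the curious, the per-gigabyte figures quoted above are easy to sanity-check; here is a quick sketch using the Antares Pro prices (treating 1TB as 1000GB):

```python
# Price-per-GB check for the leaked 850 EVO listings above.
listings = {
    "120GB": (120, 100),   # capacity in GB, price in USD
    "250GB": (250, 146),
    "500GB": (500, 258),
    "1TB":   (1000, 477),
}

for name, (gb, usd) in listings.items():
    print(f"{name}: ${usd / gb:.2f} / GB")
```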

We'll be publishing a full capacity roundup of the 850 Pro in anticipation of the 850 EVO launch, which, based on these leaks, is imminent.

GeForce GTX 970 Coil Whine Concerns

Subject: Graphics Cards | October 28, 2014 - 12:09 PM |
Tagged: maxwell, GTX 970, geforce, coil whine

Coil whine is the undesirable effect of electrical components creating audible noise when operating. Let's look to our friends at Wikipedia for a concise and accurate description of the phenomenon:

Coil noise is, as its name suggests, caused by electromagnetic coils. These coils, which may act as inductors or transformers, have a certain resonant frequency when coupled with the rest of the electric circuit, as well as a resonance at which it will tend to physically vibrate.

As the wire that makes up the coil passes a variable current, a small amount of electrical oscillation occurs, creating a small magnetic field. Normally this magnetic field simply works to establish the inductance of the coil. However, this magnetic field can also cause the coil itself to physically vibrate. As the coil vibrates physically, it moves through a variable magnetic field, and feeds its resonance back into the system. This can produce signal interference in the circuit and an audible hum as the coil vibrates.

Coil noise can happen, for example, when the coil is poorly secured to the circuit board, is poorly damped, or if the resonant frequency of the coil is close to the resonant frequency of the electric circuit. The effect becomes more pronounced as the signal passing through the coil increases in strength, and as it nears the resonant frequency of the coil, or as it nears the resonant frequency of the circuit. Coil noise is also noticed most often when it is in the humanly audible frequency.

Coil noise is also affected by the irregularities of the magnetic material within the coil. The flux density of the inductor is affected by these irregularities, causing small currents in the coil, contaminating the original signal. This particular subset of coil noise is sometimes referred to as magnetic fluctuation noise or the Barkhausen effect. Coil noise can also occur in conjunction with the noise produced by magnetostriction.
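
The resonance the excerpt describes follows the standard LC relationship, f = 1 / (2π√(LC)). As a rough illustration, here is that formula in Python; the inductance and capacitance values are arbitrary examples, not measurements from any GTX 970.

```python
import math

# Resonant frequency of an LC circuit: f = 1 / (2 * pi * sqrt(L * C)).
# The component values below are arbitrary examples, not measurements
# taken from any graphics card.

def resonant_frequency_hz(inductance_h, capacitance_f):
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# A 10 uH inductor coupled with 25 uF of capacitance resonates at
# roughly 10 kHz, squarely within the range of human hearing:
f = resonant_frequency_hz(10e-6, 25e-6)
print(f"{f / 1000:.1f} kHz")  # ~10.1 kHz
```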

Gamers that frequently upgrade their graphics cards may have been witness to this problem with a particular install, or you might have been one of the lucky ones to never deal with the issue. If your computer sits under your desk, in a loud room or you only game with headphones, it's also possible that you just never noticed.

inductor.jpg

Possibly offending inductors?

The reason this comes up today is that reports are surfacing of GeForce GTX 970 cards from various vendors exhibiting excessive coil whine. These reports come from multiple forum threads around the internet, a collection of YouTube videos of users attempting to capture the issue, and even official statements from some of NVIDIA's partners. Now, just because the internet is talking about it doesn't necessarily mean it's a "big deal" relative to the number of products being sold. However, after several Twitter comments and emails requesting we look into the issue, I thought it was pertinent to start asking questions.

As far as I can tell today, GTX 970 cards from multiple vendors including EVGA, MSI and Gigabyte all have users reporting issues and claims of excessive coil noise. For my part here, I have two EVGA GTX 970 cards and an MSI GTX 970, none of which are producing sound at what I would call "excessive" levels. Everyone's opinion of excessive noise is going to vary, but as someone who sits next to a desk-high test bed and hears hundreds of cards a year, I am confident I have a good idea of what to listen for.

We are still gathering data on this potential issue, but a few of the companies mentioned above have issued official or semi-official statements on the problem.

From MSI:  

The coil whine issue is not specific to the 900 series, but can happen with any high-end GPU, and MSI is looking into ways to minimize the issue. If you still have concerns regarding this issue, please contact our RMA department.

From EVGA:

We have been watching the early feedback on GTX 970 and inductor noise very closely, and have actively taken steps to improve this. We urge anyone who has this type of concern to contact our support so we can address it directly.

From NVIDIA: 

We’re aware of a small percentage of users reporting excessive “coil whine” noises and are actively looking into the issue.

We are waiting for feedback from other partners to see how they plan to respond.

Since all of the GTX 970 cards currently shipping are non-reference, custom-built PCB designs, NVIDIA's input on the problem is mostly one of recommendations. NVIDIA knows that it is their name and brand being associated with any noisy GeForce cards, so I would expect a lot of discussions and calls behind closed doors to make sure partners are addressing user concerns.

IMG_9794.JPG

Interestingly, the GeForce GTX 970 was the one card of this Maxwell release where all of NVIDIA's partners chose to go the route of custom designs rather than adopting the NVIDIA reference design. On the GTX 980, however, you'll find a mix of both, and I would wager that NVIDIA's reference boards do not exhibit any above-average noise levels from coils. (I have actually tested four reference GTX 980s without coil whine coming into play.) Sometimes offering all of these companies the option to be creative and differentiate can backfire if the utmost care isn't taken in component selection.

Ironically, the fix is simple: a little glue on those vibrating inductor coils and the problem goes away. But most of the components are sealed, making the simple fix a non-starter for the end user (and I wouldn't recommend attempting it anyway). It does point to a lack of leadership from board manufacturers willing to skimp on hardware in a way that makes this a big enough issue that I am sitting here writing about it today.

As an aside, if you hear coil whine when running a game at 500-5000 FPS, I don't think that counts as a major problem for your gaming. I have seen a video or two running a DX9 render test at over 4500 FPS; pretty much any card built today will make noises you don't expect when hitting that kind of performance level.

As for my unofficial discussions on the topic with various parties, everyone continues to reiterate that the problem is not as widespread as some of the forum threads would have you believe. It is definitely higher than normal, and the public acknowledgements from EVGA and MSI basically confirm that, but one person told me that complaint and RMA levels are where they were expected to be, considering the "massively fast sell out rates" the GTX 970 is experiencing.

Of course, AMD isn't immune to coil whine issues either. If you remember back to the initial launch of the Radeon R9 290X and R9 290, we had similar coil whine issues and experienced them first hand on reference card designs. (You can see a video I recorded of an XFX unit back in November of 2013 here.) You can still find threads on popular forums from that period discussing the issue, and YouTube never seems to forget anything, so there's that. Of course, the fact that previous card launches saw issues along the same lines doesn't excuse the issue in current or later releases, but it does put things into context.

So, let's get some user feedback; I want to hear from GTX 970 owners about their experiences to help guide our direction of research going forward.

Click here to take our short poll for GTX 970 owners!

Source: Various

Sony PS4 and Microsoft Xbox One Already Hitting a Performance Wall

Subject: General Tech, Graphics Cards | October 27, 2014 - 04:50 PM |
Tagged: xbox one, sony, ps4, playstation 4, microsoft, amd

A couple of weeks back, a developer on Ubisoft's Assassin's Creed Unity was quoted as saying the team had decided to run both the Xbox One and PlayStation 4 versions of the game at 1600x900 resolution "to avoid all the debates and stuff." Of course, the Internet exploded in a collection of theories about why that would be the case: were they paid off by Microsoft?

For those of us who focus more on the world of PC gaming, however, the following week an email to the Giantbomb.com weekly podcast from an anonymous (but seemingly reliable) developer on the Unity team raised even more interesting material. In this email, besides addressing other issues such as the value of pixel count and the stunning visuals of the game, the developer asserted that we may have already peaked on the graphical compute capability of these two new gaming consoles. Here is a portion of the information:

The PS4 couldn’t do 1080p 30fps for our game, whatever people, or Sony and Microsoft say. ...With all the concessions from Microsoft, backing out of CPU reservations not once, but twice, you’re looking at about a 1-2 FPS difference between the two consoles.

What's hard is not getting the game to render but getting everything else in the game at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed a lot of optimizations for games from Ubisoft in the past, this is crazily optimized for such a young generation of consoles. This is really about to define the next-generation unlike any other game beforehand.

We are bound from the CPU because of AI. Around 50% of the CPU is used for the pre-packaged rendering parts..

unity1.jpg

So, if we take this anonymous developer's information as true, and this whole story is based on that assumption, then we have learned some interesting things:

  1. The PS4, the more graphically powerful of the two very similarly designed consoles, was not able to maintain a 30 FPS target when rendering at 1920x1080 resolution with Assassin's Creed Unity.
     
  2. The Xbox One (after giving developers access to more compute cycles previously reserved to Kinect) is within a 1-2 FPS mark of the PS4.
     
  3. The Ubisoft team sees Unity as being "crazily optimized" for the architecture and consoles, even as we only now approach the one-year anniversary of their release.
     
  4. Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation, leaving only the remaining 50% for the AI and everything else (a rough frame-time breakdown is sketched below).
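
To put that 50% figure in frame-time terms, the arithmetic is simple; the even split comes from the developer's claim and the 30 FPS target from the article, everything else is basic math.

```python
# Rough frame-time budget implied by the developer's numbers: a 30 FPS
# target allows ~33.3 ms per frame, and if half the CPU time is spent
# unpacking pre-baked lighting, AI and game logic get what remains.

TARGET_FPS = 30
frame_budget_ms = 1000.0 / TARGET_FPS               # ~33.3 ms per frame
render_prep_ms = 0.50 * frame_budget_ms             # ~16.7 ms for rendering work
ai_and_logic_ms = frame_budget_ms - render_prep_ms  # ~16.7 ms for everything else

print(f"{frame_budget_ms:.1f} ms/frame total, "
      f"{ai_and_logic_ms:.1f} ms left for AI and game logic")
```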

It would appear that, just as many in the media declared when the specifications for the new consoles were announced, the hardware inside the PlayStation 4 and Xbox One undershoots what game developers need to truly build "next-generation" games. If, as this developer states, we are less than a year into the life cycle of hardware that was planned for an 8-10 year window and have already reached its performance limits, that is a bad sign for game developers who really want to create exciting gaming worlds. Keep in mind that this time around the hardware isn't custom-built cores or a Cell architecture; we are talking about very basic x86 cores and traditional GPU hardware that ALL software developers are intimately familiar with. It does not surprise me one bit that the more advanced development teams have already hit peak performance.

unity2.jpg

If the PS4, the slightly more powerful console of the pair, is unable to render reliably at 1080p with a 30 FPS target, then unless the Ubisoft team is completely off its rocker in terms of development capability, the advancement of gaming on consoles would appear to be somewhat limited. Remember the specifications for these two consoles:

                 PlayStation 4          Xbox One
  Processor      8-core Jaguar APU      8-core Jaguar APU
  Motherboard    Custom                 Custom
  Memory         8GB GDDR5              8GB DDR3
  Graphics Card  1152 Stream Unit APU   768 Stream Unit APU
  Peak Compute   1,840 GFLOPS           1,310 GFLOPS
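
Those peak-compute numbers fall straight out of the shader counts: peak GFLOPS = stream processors x 2 FLOPs per clock (a fused multiply-add) x clock speed. A quick check, using the widely reported GPU clocks of 800 MHz for the PS4 and 853 MHz for the Xbox One:

```python
# Peak compute = stream processors x 2 FLOPs/clock (one FMA) x clock in GHz.
# GPU clocks are the widely reported figures, not line items from the table.

def peak_gflops(stream_processors, clock_mhz):
    return stream_processors * 2 * clock_mhz / 1000.0

print(f"PS4:      {peak_gflops(1152, 800):.0f} GFLOPS")  # ~1843
print(f"Xbox One: {peak_gflops(768, 853):.0f} GFLOPS")   # ~1310
```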

The custom-built parts from AMD both feature an 8-core Jaguar x86 architecture and either 1152 or 768 stream processors. The Jaguar CPU cores aren't high-performance parts: single-threaded performance of Jaguar trails Intel's Silvermont/Bay Trail designs by as much as 25%, and Bay Trail is powering lots of super-low-cost tablets today, including the $179 ECS LIVA palm-sized mini-PC we reviewed this week. The 1152/768 stream processors in the GPU portion of each AMD APU provide some punch, but a Radeon HD 7790 (now called the R7 260X), released in March of 2013, provides more performance than the PS4, and the Radeon R7 250X is faster than what resides in the Xbox One.

xboxonegpu.jpg

If you were to ask me today what kind of performance would be required from AMD's current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200).

Also note that if the developer is using 50% of the CPU resources for rendering computation and the remaining 50% isn't able to keep up with AI and other duties, we have likely hit a performance wall on the x86 cores as well.

Even if this developer's quote is 100% correct, that doesn't mean the current generation of consoles is completely doomed. Microsoft has already stated that DirectX 12, focused on the performance efficiency of current-generation hardware, is coming to the Xbox One, and that could mean additional performance gains for developers. The PS4 will likely have access to the upcoming OpenGL Next. And of course, it is also possible that this developer is simply wrong and there is plenty of headroom left in the hardware for games to take advantage of.

unity3.jpg

But honestly, based on my experience with these GPU and CPU cores, I don't think that's the case. If you look at screenshots of Assassin's Creed Unity and then look at the minimum and recommended specifications for the game on the PC, there is an enormous discrepancy. Are the developers just writing lazy code and not truly optimizing for the hardware? It seems unlikely that a company the size of Ubisoft would choose this route on purpose, creating a console game that runs in a less-than-ideal state while also struggling on the PC version. Remember, there is almost no "porting" going on here: the Xbox One and PlayStation 4 now share the same basic architecture as the PC.

Of course, we might just be treading through known waters. I know we are a bit biased, and so is our reader base, but I am curious: do you think MS and Sony have put themselves in a hole with their shortsighted hardware selections?

UPDATE: It would appear that a lot of readers and commenters take our editorial on the state of the PS4 and XB1 as a direct attack on AMD and its APU design. That isn't really the case: regardless of whose hardware is inside the consoles, had Microsoft and Sony targeted the same performance levels, we would be in the exact same situation. An Intel + NVIDIA hardware combination could just as easily have been built to the same peak theoretical compute levels and would have hit the same performance wall just as quickly. MS and Sony could have prevented this by using higher-performance hardware, selling the consoles at a loss out of the gate and properly preparing each platform for the next 7-10 years. And again, the console manufacturers could have done that with higher-end AMD hardware, Intel hardware or NVIDIA hardware. The state of the console performance war is truly hardware agnostic.

Connected Data announces Transporter Genesis Private Cloud Appliance

Subject: Storage | October 27, 2014 - 04:17 PM |
Tagged: Transporter Genesis, transporter, connected data

Connected Data (which has merged with Drobo) has really been pushing its new Transporter line. When we saw them this past CES, there was only a small desktop appliance meant to connect and sync files between homes or small offices. Now they are stepping up their Transporter game by scaling all the way up to 24TB rack-mount devices!

Transporter Genesis_AG_L.jpg

For those unaware, Transporter is a personal cloud solution with software and mobile app support akin to that of Dropbox. Their desktop software tool has seen a rapid addition of features, and the company has even rolled out version history support. Features are nice, but what will now set Transporter apart from competing options is scalability:

Transporter.png

The base-level Transporter (right) is a relatively simple device with a single 2.5" HDD installed. These devices scale up through the '5' and '15' models, which appear to be built on Drobo hardware. The 'Genesis' models (left) are not simply Drobo 1200i's with blue stickers on them; they are full-blown Xeon systems with redundant power supplies, an 80GB SSD, up to 32GB of RAM, and 24TB of raw storage capacity. Here is what a typical business rollout of Transporter might look like with these new additions in play:

Transporter2.png

Features currently supported across the line:

  • 256-bit AES communication
  • Transporter Desktop software solution (Windows and Mac)
  • Transporter mobile app (iOS and Android)
  • Redundancy within each node ('5' and above)
  • Redundancy across nodes (via sync)
  • Active Directory support
  • No recurring fees

The 12TB Genesis 75 comes in at $9,999, but the '15' and '5' should prove to be lower-cost options. The base model single-bay Transporter can be found for just over $100 (BYO HDD). Full press blast after the break.

Transcend's new M.2 SSD, the MTS800

Subject: Storage | October 27, 2014 - 04:00 PM |
Tagged: M.2, ssd, transcend, MTS800

M.2 is quickly gaining popularity thanks to its small size and low power requirements, as well as possible speed increases and other features. Transcend's 128GB MTS800 drive features full AES encryption, wear levelling and garbage collection, plus something new: StaticDataRefresh Technology. That is their name for a process which automatically restores the charge levels in the NAND cells, preventing both the accumulation of errors and the reduction in performance over time. M.2 drives do come with a price premium, with the 128GB model available for $76 on Amazon, but the performance is impressive; the lowest transfer speed The SSD Review saw during their testing was 265.61MB/s.

629x419xTranscend-MTS800.jpg

"We have been seeing more M.2 SSDs lately, a lot of which are companies’ first steps into the market since the form factor is so new. They have been designed to meet strict size requirements and allow for greater flexibility in product development. They are the perfect fit for mobile devices with their compact size and light weight."

Here are some more Storage reviews from around the web:

Storage

Samsung updates 840 EVO Performance Restoration Tool

Subject: Storage | October 27, 2014 - 02:59 PM |
Tagged: Samsung, firmware, 840 evo

Over the weekend, Samsung silently updated their 840 EVO Performance Restoration Tool. The incremental update improves support for some system configurations that previously did not recognize an installed 840 EVO. Samsung also improved how the GUI progress bar responds during the update process, presumably to correct the nearly silent failure that occurred when the tool was unable to update the drive's firmware. Previously, the tool would halt at 15% without any clear indication that the firmware could not be updated (this would occur if the tool was unable to issue the necessary commands to the SSD, usually because the motherboard was in the wrong storage controller mode or using an incompatible storage driver).

DSC05837.JPG

Still no word on relief for owners of the original 840 (non-EVO or Pro). We've also heard from some users with Samsung OEM TLC-based SSDs showing the same type of slowdown (some variants of the PM851 apparently used TLC flash). More to follow there.

We evaluated the Samsung 840 EVO Performance Restoration Tool here. If you've already successfully run the 1.0 version of the tool, there is no need to re-run the 1.1 version, as it will not do anything additional to an EVO that has been updated and restored.

Source: Samsung

Earphones without the flashy colours and branding

Subject: General Tech | October 27, 2014 - 02:33 PM |
Tagged: audio, Takstar, HD5500

Some people still prefer headsets with a simple design and understated branding, as opposed to models with colours bright enough to pass for emergency beacons and a logo large enough to be spotted from orbit. Takstar understands this and even offers its product for less money than its ostentatious competitors, but that is only half the story: the headphones still need to sound good. The HD5500 has a variety of connection options, a 1/8" adapter designed for mobile devices as well as a larger 1/4" connection for use with stereos. On a mobile device the bass is lacking, which comes down mostly to a lack of power, as the headphones sounded much better through the 1/4" plug on a more powerful source. Do not expect a miracle from $75 circumaural headphones, but if you are value conscious you should take a look at TechPowerUp's review.

hd5500.jpg

"Takstar is well-known for their bang-for-the-buck headphones, and today, we take a look at their HD5500s. Priced at $74.50, these headphones are for mobile users who want a solid and well-sounding pair of headphones. We take the HD5500s for a spin to see if they can live up to such expectation."

Here is some more Tech News from around the web:

Audio Corner

Source: techPowerUp

No new Intel for you this year

Subject: General Tech | October 27, 2014 - 12:35 PM |
Tagged: Haswell-EX, Haswell-EP4S, Intel, server, xeon, Broadwell-DE, Skylake

Intel's release schedules have been slowing down, unfortunately in large part because the only competition they face in certain market segments is themselves. For high-end servers, it looks like we won't see Haswell-EX or -EP4S until the second half of next year, and Skylake chips for entry-level servers until after the third quarter. Intel does have to fight for its share of the SoC and low-powered chip markets; DigiTimes reports the Broadwell-DE family and the C2750 and C2350 should be here in the second quarter, which gives AMD and ARM a chance to gain market share against Intel's current offerings. Along with the arrival of the new chips, we will also see older Itanium, Xeon, Xeon Phi and Atom models discontinued, some perhaps before the end of the year. You have already heard the bad news about Broadwell-E.

index.jpg

"Intel's next-generation server processors for 2015 including new Haswell-EX (Xeon E7 v3 series) and -EP4S (Xeon E5-4600 v3 series), are scheduled to be released in the second quarter of 2015, giving clients more time to transition to the new platform, according to industry sources."

Here is some more Tech News from around the web:

Tech Talk

Source: DigiTimes

(Oldish News) Kingdom Hearts 3 on Unreal Engine 4

Subject: General Tech | October 26, 2014 - 11:15 PM |
Tagged: square enix, kingdom hearts 3, unreal engine 4, ue4

I did not report on this the first time around because it did not seem like a credible rumor. As it turns out, the reports were citing an interview with the game's director in Famitsu, the Japanese video game magazine. Basically, while Square Enix likes to build its own engines for its RPG projects, the Luminous Engine did not satisfy the team's needs, so they decided to shift production to Unreal Engine 4. While Kingdom Hearts 3 is still not scheduled to come to the PC, we know that the engine feels at home on our platform.

square-KingdomHearts3-logo.jpg

Image Credit: Wikipedia

As an aside, Famitsu is a surprisingly hard website to machine translate for any content past the first page. I will make a mental note not to feed written content through JavaScript on any website that I make, for the sake of international readers. I eventually had to copy and paste the text directly into Microsoft Translate. It was a pretty terrible experience, but I digress. If you wish to read the interview, do not expect your browser's built-in tools to help; it is Ctrl-C and Ctrl-V all the way.

It seems pretty clear that Kingdom Hearts was not moved to Unreal Engine 4 for PC support. That would just be silly. More likely, their internal engine needed a little extra development work and, especially given the vastly different art styles of Kingdom Hearts and Final Fantasy, they moved the two release dates further apart. Maybe they will even release Kingdom Hearts 3 earlier than intended?

But if it does come to the PC, it seems somewhat more likely that it will function better than Final Fantasy XIII does. That title was locked to 720p and had a few odd quirks, like Esc quitting instantly (the equivalent of "/qq") even though Alt+F4 gives a warning prompt, and seemingly requiring a keyboard to close (I could not find a way to exit the game with the gamepad or mouse alone). That said, and this is a tangent to a tangent, I did like the option to keep the original Japanese dub. Yet again, I digress.

This was not the first time that Square has developed an RPG on Unreal Engine. The Last Remnant, for the Xbox 360 and PC, was developed on Unreal Engine 3. Kingdom Hearts 3 does not have a release date, but it might be sooner than we expect (and probably much earlier than Final Fantasy XV).

Source: Famitsu

The Billion Dollar Businesses of Free to Play

Subject: General Tech | October 26, 2014 - 08:28 PM |
Tagged: pc gaming, free to play

Year to date, League of Legends, Crossfire, and Dungeon Fighter Online are each closing in on one billion dollars in revenue. Yes, three free-to-play MMO titles are closing in on $1 billion USD apiece in a single year. All three exceed World of Warcraft, which is still the most lucrative subscription MMO; that might change once expansion pack revenue from the upcoming Warlords of Draenor is accounted for, however. The total MMO industry, free-to-play and subscription combined, is estimated at almost $8 billion USD from January through September.

riot-lol-logo.jpg

This is all according to Gamesbeat and their dissection of a SuperData Research (how is that a real name?!) report on the MMO industry. Of course, there is always the possibility that these products will fall short of that milestone by the time January rolls around, but they are pretty close for nine months in and three to go.

The interesting part is why. The article discusses how easily these games transition between markets because the barrier to entry is so low. This is especially true in markets that embrace internet cafes, where the game is already installed. The barrier to entry is simply creating an account; the customer does not even need to think about payment until the free content has generated interest.

The second reason, which is not mentioned in the article, is the shape of the revenue curve by customer type. A flat fee is a fixed value multiplied by the number of legitimate users: you will get at most "X" from a customer, maybe a little less during sales, and zero from pirated copies or customers who simply ignore your content. Subscription games turn this into recurring income, the number of legitimate users in a month summed over every month; while this gets more money from the most dedicated players, because they pay for longer, it still has a per-player ceiling. Free-to-play and other microtransaction-based models have no ceiling except the total of all the content you have ever made, and for consumable content there is effectively no ceiling at all (a toy model follows below).
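
As a toy model of those three curves (every price, player count, and spending figure below is invented purely for illustration):

```python
# Toy comparison of the three revenue models described above.
# All prices, player counts, and spending figures are invented.

def flat_fee(players, price):
    return players * price                 # hard ceiling: price x players

def subscription(monthly_subscribers, fee):
    return sum(monthly_subscribers) * fee  # ceiling per player-month

def free_to_play(spend_per_player):
    return sum(spend_per_player)           # no per-player ceiling at all

print(flat_fee(1_000_000, 60))                        # $60M, once
print(subscription([1_000_000] * 12, 15))             # $180M over a year
print(free_to_play([0] * 900_000 + [500] * 100_000))  # $50M from the paying 10%
```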

This can be good for the consumer or it can be bad, of course; where a game falls on that spectrum really depends on how it is designed. Also, money is not everything. A game can even be released entirely for free if the developer has other motivations, whether it is a hobby, a tech demo, or an art piece. It is up to the player (or their gift giver) to decide what is worth their time or money, and that is okay.

Blizzard Is Installing World of Warcraft Servers in Australia

Subject: General Tech | October 26, 2014 - 03:33 AM |
Tagged: wow, blizzard

With the new expansion for World of Warcraft, Blizzard is expanding its infrastructure to better serve customers in Oceania. The company will not require users who are currently on North American realms to switch, but will be reimbursing server swaps, for as many characters as desired, during the two weeks leading up to Warlords of Draenor's November 13th launch. This will not affect the time of release, which will be 7:00 PM AEDT / midnight PST (PDT ends on November 2nd).
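
As a sanity check of those launch times, the conversion is easy to verify with standard timezone data; a minimal sketch:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+; may need the tzdata package

# Warlords of Draenor launch: 7:00 PM AEDT on November 13th, 2014.
launch_aedt = datetime(2014, 11, 13, 19, 0, tzinfo=ZoneInfo("Australia/Sydney"))
launch_pacific = launch_aedt.astimezone(ZoneInfo("America/Los_Angeles"))

print(launch_pacific)  # 2014-11-13 00:00:00-08:00, midnight PST as advertised
```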

blizzard-wow-warlords-of-draenor.jpg

The expression "better late than never" definitely applies in this situation. The game has had "Oceanic" realms for quite some time now, but they were still physically located on the west coast of America. Sure, the ideal latency of a packet from Australia to California is around 60ms (Update: an earlier version of this post said 30ms; assuming light moves through fiber at about 2/3rds of its speed in a vacuum, the one-way ideal is around 60ms, or 120ms round trip. When Googling the distance between Australia and California, Google thought I meant Sydney, Nova Scotia, Canada, 4000 miles away, rather than Sydney, Australia, 7500 miles away. Pixy Misa in the comments, who pointed out my error, says they experience about 170ms of latency in practice.) The actual latency is significantly higher in the real world, so getting the servers about 7500 miles closer should be welcome.
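
The corrected round-trip estimate is just distance divided by propagation speed; a minimal sketch of that arithmetic:

```python
# Ideal round-trip time over fiber, where light travels at roughly
# two-thirds of its vacuum speed.

C_VACUUM_KM_PER_S = 299_792   # speed of light in a vacuum
FIBER_FRACTION = 2 / 3        # typical propagation speed in fiber

def ideal_rtt_ms(distance_km):
    one_way_s = distance_km / (C_VACUUM_KM_PER_S * FIBER_FRACTION)
    return 2 * one_way_s * 1000.0

# Sydney to California is roughly 7500 miles, about 12,000 km:
print(f"{ideal_rtt_ms(12_000):.0f} ms")  # ~120 ms round trip, ~60 ms one way
```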

The transfer does not yet have a date, but refunds will be offered for character migrations made between 6:01 PM AEDT on October 29th, 2014, and 6:59 PM AEDT on November 13th, 2014. Just make sure to do realm swaps as a separate transaction from anything else you might buy; apparently Blizzard's storefront will not be able to pick out the Character Transfer and Guild Master Realm Transfer from among other services. While they could have spent a little more time making this promotion robust, I cannot really blame them. This is a one-shot; it is probably not worth the man-hours.

Source: Blizzard

AMD Catalyst 14.9.2 Beta for Civilization: Beyond Earth

Subject: Graphics Cards | October 26, 2014 - 02:44 AM |
Tagged: amd, driver, catalyst

So Ryan has been playing many games lately as part of a comparison between the latest GPUs from AMD and NVIDIA. While Civilization: Beyond Earth is not the most demanding game in existence from a video card's perspective, it is not trivial either, and it is a contender for the most demanding game on your main processor (CPU). It also has some of the most thought-out Mantle support of any title using the API, when paired with the AMD Catalyst 14.9.2 Beta driver.

firaxis-civilization-beyond-earth.jpg

And now you can!

The Catalyst 14.9.2 Beta drivers support just about everything using the GCN architecture, from APUs (starting with Kaveri) to discrete GPUs (starting with the HD 7000 and HD 7000M series). Beyond enabling Mantle support in Civilization, the release also fixes issues with Metro, Shadow of Mordor, Total War: Rome 2, Watch_Dogs, and other games.

Also, both AMD and Firaxis are aware of a bug in Civilization: Beyond Earth where the mouse cursor does not click exactly where it should if the user enables font scaling in Windows. They are working on it, but suggest setting scaling to the default (100%) if you experience the issue. This could be problematic for customers with high-DPI screens, but it should keep you playing until an official patch is released.

You can get 14.9.2 Beta for Windows 7 and Windows 8.1 at AMD's website.

Source: AMD