Sony PS4 and Microsoft Xbox One Already Hitting a Performance Wall

Subject: General Tech, Graphics Cards | October 27, 2014 - 04:50 PM |
Tagged: xbox one, sony, ps4, playstation 4, microsoft, amd

A couple of weeks back, a developer on Ubisoft's Assassin's Creed Unity was quoted as saying that the team had decided to run both the Xbox One and the PlayStation 4 versions of the game at 1600x900 resolution "to avoid all the debates and stuff." Of course, the Internet exploded with theories about why that would be the case: were they paid off by Microsoft?

For those of us that focus more on the world of PC gaming, however, an email sent the following week to the Giantbomb.com weekly podcast from an anonymous (but seemingly reliable) developer on the Unity team provided even more interesting material. In this email, in addition to addressing other issues like the value of pixel count and the stunning visuals of the game, the developer asserted that we may have already peaked on the graphical compute capability of these two new gaming consoles. Here is a portion of the information:

The PS4 couldn’t do 1080p 30fps for our game, whatever people, or Sony and Microsoft say. ...With all the concessions from Microsoft, backing out of CPU reservations not once, but twice, you’re looking at about a 1-2 FPS difference between the two consoles.

What's hard is not getting the game to render but getting everything else in the game at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed a lot of optimizations for games from Ubisoft in the past, this is crazily optimized for such a young generation of consoles. This is really about to define the next-generation unlike any other game beforehand.

We are bound from the CPU because of AI. Around 50% of the CPU is used for the pre-packaged rendering parts.

unity1.jpg

So, if we take this anonymous developer's information as true, and this whole story is based on that assumption, then we have learned some interesting things.

  1. The PS4, the more graphically powerful of the two very similarly designed consoles, was not able to maintain a 30 FPS target when rendering at 1920x1080 resolution with Assassin's Creed Unity.
     
  2. The Xbox One (after giving developers access to more compute cycles previously reserved to Kinect) is within a 1-2 FPS mark of the PS4.
     
  3. The Ubisoft team sees Unity as being "crazily optimized" for the architecture and consoles even as we just now approach the one-year anniversary of their release.
     
  4. Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation, and thus the game is limited by the 50% that remains to power the AI, etc.

It would appear that just as many in the media declared when the specifications for the new consoles were announced, the hardware inside the Playstation 4 and Xbox One undershoots the needs of game developers to truly build "next-generation" games. If, as this developer states, we are less than a year into the life cycle of hardware that was planned for an 8-10 year window and we have reached performance limits, that's a bad sign for game developers that really want to create exciting gaming worlds. Keep in mind that this time around the hardware isn't custom-built cores or a Cell architecture - we are talking about very basic x86 cores and traditional GPU hardware that ALL software developers are intimately familiar with. It does not surprise me one bit that we have seen more advanced development teams hit peak performance.

unity2.jpg

If the PS4, the slightly more powerful console of the pair, is unable to render reliably at 1080p with a 30 FPS target, then unless the Ubisoft team is completely off its rocker in terms of development capability, the advancement of gaming on consoles would appear to be somewhat limited. Remember the specifications for these two consoles:

                 PlayStation 4         Xbox One
  Processor      8-core Jaguar APU     8-core Jaguar APU
  Motherboard    Custom                Custom
  Memory         8GB GDDR5             8GB DDR3
  Graphics Card  1152 Stream Unit APU  768 Stream Unit APU
  Peak Compute   1,840 GFLOPS          1,310 GFLOPS
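
Those peak compute figures fall straight out of the shader counts: peak single-precision throughput is simply stream processors x 2 FLOPs per clock (one fused multiply-add) x clock speed. As a quick sanity check, here is the arithmetic in Python; note that the GPU clocks (800 MHz for the PS4, 853 MHz for the Xbox One) are the commonly reported figures, not something stated in the table above:

    # Peak compute = stream processors x 2 FLOPs/clock (FMA) x clock in GHz.
    # The clock speeds are commonly reported figures, not from the spec table.
    ps4_gflops = 1152 * 2 * 0.800   # ~1843 GFLOPS, rounded to 1,840
    xb1_gflops = 768 * 2 * 0.853    # ~1310 GFLOPS
    print(ps4_gflops, xb1_gflops)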

The custom-built parts from AMD both feature an 8-core Jaguar x86 architecture and either 768 or 1152 stream processors. The Jaguar CPU cores aren't high-performance parts: single-threaded performance of Jaguar trails the Intel Silvermont/Bay Trail designs by as much as 25%. Bay Trail is powering lots of super low-cost tablets today and even the $179 ECS LIVA palm-sized mini-PC we reviewed this week. And the 1152/768 stream processors in the GPU portion of the AMD APU provide some punch, but a Radeon HD 7790 (now called the R7 260X), released in March of 2013, provides more performance than the PS4, and the Radeon R7 250X is faster than what resides in the Xbox One.

xboxonegpu.jpg

If you were to ask me today what kind of performance would be required from AMD's current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200).

Also note that if the developer is using 50% of the CPU resources for rendering computation and the remaining 50% isn't able to hold up its duties on AI, etc., we likely have hit performance walls on the x86 cores as well.

Even if this developer quote is 100% correct, that doesn't mean that the current generation of consoles is completely doomed. Microsoft has already stated that DirectX 12, focused on performance efficiency of current generation hardware, will be coming to the Xbox One, and that could mean additional performance gains for developers. The PS4 will likely have access to OpenGL Next, which is due in the future. And of course, it's also possible that this developer is just wrong and there is plenty of headroom left in the hardware for games to take advantage of.

unity3.jpg

But honestly, based on my experience with these GPU and CPU cores, I don't think that's the case. If you look at screenshots of Assassin's Creed Unity and then look at the minimum and recommended specifications for the game on the PC, there is a huge discrepancy. Are the developers just writing lazy code and not truly optimizing for the hardware? It seems unlikely that a company the size of Ubisoft would choose this route on purpose, creating a console game that runs in a less-than-ideal state while also struggling on the PC version. Remember, there is almost no "porting" going on here: the Xbox One and PlayStation 4 now share the same architecture as the PC.

Of course, we might just be treading through known waters. I know we are a bit biased, and so is our reader base, but I am curious: do you think MS and Sony have put themselves in a hole with their shortsighted hardware selections?

UPDATE: It would appear that a lot of readers and commenters take our editorial on the state of the PS4 and XB1 as a direct attack on AMD and its APU design. That isn't really the case - regardless of which vendor's hardware is inside the consoles, had Microsoft and Sony still targeted the same performance levels, we would be in the exact same situation. An Intel + NVIDIA hardware combination could just as easily have been built to the same peak theoretical compute levels and would have hit the same performance wall just as quickly. MS and Sony could have prevented this by using higher performance hardware, selling the consoles at a loss out of the gate and preparing each platform for the next 7-10 years properly. And again, the console manufacturers could have done that with higher end AMD hardware, Intel hardware or NVIDIA hardware. The state of the console performance war is truly hardware agnostic.

Author:
Manufacturer: Firaxis

A Civ for a New Generation

Turn-based strategy games have long been defined by the Civilization series. Civ 5 consumed hours and hours of the PC Perspective team's non-working time (and likely the working hours too), and it looks like the new Civilization: Beyond Earth has the chance to do the same. Early reviews of the game from Gamespot, IGN, and Polygon are quite positive, and that's great news for a PC-only release; those can sometimes get overlooked in the gaming media.

For us, the game offers an interesting opportunity to discuss performance. Beyond Earth is definitely going to be more CPU-bound than the other games that we tend to use in our benchmark suite, but the fact that this game is new, shiny, and even has a Mantle implementation (AMD's custom API) makes it interesting for at least a look at the current state of performance. Both NVIDIA and AMD have released drivers with specific optimizations for Beyond Earth as well. This game is likely to be popular and it deserves the attention it gets.

Testing Process

Civilization: Beyond Earth, a turn-based strategy game that can take a very long time to complete, ships with an integrated benchmark mode to help users and the industry test performance under different settings and hardware configurations. To enable it, you simply add "-benchmark results.csv" to the Steam game launch options and then start up the game normally. Rather than taking you to the main menu, you'll be transported into a view of a map that represents a somewhat typical game state for a long-term session. The benchmark measures your system's performance using whatever settings you last ran the game at (launched without the modified launch options), so be sure to configure those before you prepare to benchmark.

The output of this is the "results.csv" file, saved to your Steam game install root folder. In there, you'll find a list of numbers, separated by commas, representing the frame times for each frame rendered during the run. You don't get averages, a minimum, or a maximum without doing a little work. Fire up Excel or Google Docs and remember the formula:

Avg FPS = 1000 / Average(All Frame Times in ms)

It's a crude measurement that doesn't take into account any errors, spikes, or other interesting statistical data, but at least you'll have something to compare with your friends.
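
If you would rather script it than open a spreadsheet, a few lines of Python do the same job. This is a minimal sketch assuming the file contains nothing but comma-separated frame times in milliseconds, as described above:

    # Minimal sketch: average/worst/best FPS from the benchmark's frame-time CSV.
    import csv

    with open("results.csv", newline="") as f:
        frame_times = [float(v) for row in csv.reader(f) for v in row if v.strip()]

    avg_ms = sum(frame_times) / len(frame_times)
    print(f"Average FPS: {1000 / avg_ms:.1f}")
    print(f"Worst frame: {max(frame_times):.1f} ms ({1000 / max(frame_times):.1f} FPS)")
    print(f"Best frame:  {min(frame_times):.1f} ms ({1000 / min(frame_times):.1f} FPS)")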

settings.jpg

Our testing settings

Just as I have done in recent weeks with Shadow of Mordor and Sniper Elite 3, I ran some graphics cards through the testing process with Civilization: Beyond Earth. Testing was limited to the GeForce GTX 980 and Radeon R9 290X, along with SLI and CrossFire configurations. The R9 290X was run in both DX11 and Mantle.

  • Core i7-3960X
  • ASUS Rampage IV Extreme X79
  • 16GB DDR3-1600
  • GeForce GTX 980 Reference (344.48)
  • ASUS R9 290X DirectCU II (14.9.2 Beta)

Mantle Additions and Improvements

AMD is proud of this release as it introduces a few interesting things alongside the inclusion of the Mantle API.

  1. Enhanced-quality Anti-Aliasing (EQAA): Improves anti-aliasing quality by doubling the coverage samples (vs. MSAA) at each AA level. This is automatically enabled for AMD users when AA is enabled in the game.
     
  2. Multi-threaded command buffering: Utilizing Mantle allows a game developer to queue a much wider flow of information between the graphics card and the CPU. This communication channel is especially good for multi-core CPUs, which have historically gone underutilized in higher-level APIs. You'll see in your testing that Mantle makes a notable difference in smoothness and performance in high-draw-call, late-game testing.
     
  3. Split-frame rendering: Mantle empowers a game developer with total control of multi-GPU systems. That “total control” allows them to design an mGPU renderer that best matches the design of their game. In the case of Civilization: Beyond Earth, Firaxis has selected a split-frame rendering (SFR) subsystem. SFR eliminates the latency penalties typically encountered by AFR configurations.

EQAA is an interesting feature as it improves on the quality of MSAA (somewhat) by doubling the coverage sample count while maintaining the same color sample count as MSAA. So 4xEQAA will have 4 color samples and 8 coverage samples, while 4xMSAA would have 4 of each. Interestingly, Firaxis has decided that EQAA will be enabled in Beyond Earth anytime a Radeon card is detected (running in Mantle or DX11) and AA is enabled at all. So even though the menus might show 4xMSAA enabled, you are actually running 4xEQAA. For NVIDIA users, 4xMSAA means 4xMSAA. Performance differences should be negligible though, according to AMD (who would actually be "hurt" by this decision if it brought down FPS).

Continue reading our article on Civilization: Beyond Earth performance!!

AMD Radeon R9 290X Now Selling at $299

Subject: Graphics Cards | October 24, 2014 - 03:44 PM |
Tagged: radeon, R9 290X, leaderboard, hwlb, hawaii, amd, 290x

When NVIDIA launched the GTX 980 and GTX 970 last month, it shocked the discrete graphics world. The GTX 970 in particular was an amazing performer and undercut the price of the Radeon R9 290 at the time. That is something that NVIDIA rarely does and we were excited to see some competition in the market.

AMD responded with some price cuts on both the R9 290X and the R9 290 shortly thereafter (though they refuse to call them that) and it seems that AMD and its partners are at it again.

r9290x1.jpg

Looking on Amazon.com today we found several R9 290X and R9 290 cards at extremely low prices. For example:

The R9 290X's primary competition in terms of raw performance is the GeForce GTX 980, currently selling for $549 and up, if you can find one in stock. That means NVIDIA has a hill of $250 to climb when going against the lowest-priced R9 290X.

r92901.jpg

The R9 290 looks interesting as well:

Several other R9 290 cards are selling for upwards of $300-320, making them bone-headed decisions if you can get the R9 290X for the same or lower price. But with the GeForce GTX 970 selling for at least $329 today (if you can find it), you can see why consumers are paying close attention.

Will NVIDIA make any adjustments of its own? It's hard to say right now since stock of both the GTX 980 and GTX 970 is so hard to come by, but it's hard to imagine NVIDIA lowering prices as long as parts continue to sell out. NVIDIA believes that its branding and technologies like G-Sync make GeForce cards more valuable, and until they begin to see a shift in the market, I imagine they will stay the course.

For those of you that utilize our Hardware Leaderboard, you'll find that Jeremy has taken these prices into account and updated a couple of the system build configurations.

Source: Amazon.com

Assassin's Creed Unity Has NVIDIA-exclusive Effects via GameWorks

Subject: General Tech, Graphics Cards | October 29, 2014 - 06:12 PM |
Tagged: ubisoft, assassin's creed

Ubisoft has integrated GameWorks into Assassin's Creed Unity, or at least parts of it. The main feature to be included is NVIDIA's Horizon Based Ambient Occlusion Plus (HBAO+), which is their implementation of ambient occlusion. This effect darkens areas that would otherwise be lit incorrectly given the current limitations of global illumination. Basically, it analyzes the scene's geometry to subtract some of the influence of "ambient light" in places where it is an unrealistic approximation (particularly in small crevices). This is especially useful for overcast scenes, where direct sunlight does not overwhelm the contribution of scatters and bounces.
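
To make the idea concrete, here is a toy sketch of the screen-space ambient occlusion concept in Python. To be clear, this is not NVIDIA's HBAO+ algorithm, just the basic principle it builds on: darken each pixel in proportion to how much nearby geometry sits in front of it.

    import numpy as np

    def toy_ambient_occlusion(depth, radius=3, strength=1.0):
        """Darken pixels whose neighbours are closer to the camera,
        approximating geometry that would block ambient light."""
        h, w = depth.shape
        ao = np.ones((h, w))
        for y in range(h):
            for x in range(w):
                y0, y1 = max(0, y - radius), min(h, y + radius + 1)
                x0, x1 = max(0, x - radius), min(w, x + radius + 1)
                # fraction of nearby depth samples in front of this pixel
                occluded = np.mean(depth[y0:y1, x0:x1] < depth[y, x])
                ao[y, x] = max(0.0, 1.0 - strength * occluded)
        return ao  # multiply into the ambient term of the lighting equation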

The other features to be included are Temporal Anti-aliasing (TXAA), Percentage-Closer Soft Shadows (PCSS), and GeometryWorks Advanced Tessellation. TXAA and PCSS were both included in Assassin's Creed IV: Black Flag, alongside the previously mentioned HBAO+, so it makes sense that Ubisoft continues to use what has worked for them. GeometryWorks is a different story. NVIDIA seems to claim that it is like DirectX 11 tessellation, but better suited for use alongside HBAO+ and PCSS.

unity2.jpg

Assassin's Creed Unity will be available on November 11th.

Source: NVIDIA

GeForce GTX 970 Coil Whine Concerns

Subject: Graphics Cards | October 28, 2014 - 12:09 PM |
Tagged: maxwell, GTX 970, geforce, coil whine

Coil whine is the undesirable effect of electrical components creating audible noise when operating. Let's look to our friends at Wikipedia for a concise and accurate description of the phenomenon:

Coil noise is, as its name suggests, caused by electromagnetic coils. These coils, which may act as inductors or transformers, have a certain resonant frequency when coupled with the rest of the electric circuit, as well as a resonance at which it will tend to physically vibrate.

As the wire that makes up the coil passes a variable current, a small amount of electrical oscillation occurs, creating a small magnetic field. Normally this magnetic field simply works to establish the inductance of the coil. However, this magnetic field can also cause the coil itself to physically vibrate. As the coil vibrates physically, it moves through a variable magnetic field, and feeds its resonance back into the system. This can produce signal interference in the circuit and an audible hum as the coil vibrates.

Coil noise can happen, for example, when the coil is poorly secured to the circuit board, is poorly damped, or if the resonant frequency of the coil is close to the resonant frequency of the electric circuit. The effect becomes more pronounced as the signal passing through the coil increases in strength, and as it nears the resonant frequency of the coil, or as it nears the resonant frequency of the circuit. Coil noise is also noticed most often when it is in the humanly audible frequency range.

Coil noise is also affected by the irregularities of the magnetic material within the coil. The flux density of the inductor is affected by these irregularities, causing small currents in the coil, contaminating the original signal. This particular subset of coil noise is sometimes referred to as magnetic fluctuation noise or the Barkhausen effect. Coil noise can also occur in conjunction with the noise produced by magnetostriction.

Gamers that frequently upgrade their graphics cards may have witnessed this problem with a particular install, or you might have been one of the lucky ones to never deal with the issue. If your computer sits under your desk, in a loud room, or you only game with headphones, it's also possible that you just never noticed.

inductor.jpg

Possibly offending inductors?

The reason this comes up today is that reports are surfacing of GeForce GTX 970 cards from various graphics card vendors exhibiting excessive coil whine or coil noise. These reports are coming in from multiple forum threads around the internet, a collection of YouTube videos of users attempting to capture the issue, and even official statements from some of NVIDIA's partners. Now, just because the internet is talking about it doesn't necessarily mean it's a "big deal" relative to the number of products being sold. However, after several Twitter comments and emails requesting we look into the issue, I thought it was pertinent to start asking questions.

As far as I can tell today, GTX 970 cards from multiple vendors including EVGA, MSI and Gigabyte all have users reporting issues and claims of excessive coil noise. For my part here, I have two EVGA GTX 970 cards and an MSI GTX 970, none of which are producing sound at what I would call "excessive" levels. Everyone's opinion of excessive noise is going to vary, but as someone who sits next to a desk-high test bed and hears hundreds of cards a year, I am confident I have a good idea of what to listen for.

We are still gathering data on this potential issue, but a few of the companies mentioned above have issued official or semi-official statements on the problem.

From MSI:  

The coil whine issue is not specific to the 900 series, but can happen with any high end GPU, and MSI is looking into ways to minimize the issue. If you still have concern regarding this issue, then please contact our RMA department.

From EVGA:

We have been watching the early feedback on GTX 970 and inductor noise very closely, and have actively taken steps to improve this. We urge anyone who has this type of concern to contact our support so we can address it directly.

From NVIDIA: 

We’re aware of a small percentage of users reporting excessive “coil whine” noises and are actively looking into the issue.

We are waiting for feedback from other partners to see how they plan to respond.

Since all of the GTX 970 cards currently shipping are non-reference, custom-built PCB designs, NVIDIA's input on the problem is mostly one of recommendations. NVIDIA knows that it is their name and brand being associated with any noisy GeForce cards, so I would expect a lot of discussions and calls behind closed doors to make sure partners are addressing user concerns.

IMG_9794.JPG

Interestingly, the GeForce GTX 970 was the one card of this Maxwell release where all of NVIDIA's partners chose to go the route of custom designs rather than adopting the NVIDIA reference design. On the GTX 980, however, you'll find a mix of both, and I would wager that NVIDIA's reference boards do not exhibit any above average noise levels from coils. (I have actually tested four reference GTX 980s without coil whine coming into play.) Sometimes offering all of these companies the option to be creative and to differentiate can backfire if the utmost care isn't taken in component selection.

Ironically, the fix is simple: a little glue on those vibrating inductor coils and the problem goes away. But most of the components are sealed, making the simple fix a non-starter for the end user (and I wouldn't recommend doing that anyway). It does point to a lack of leadership from board manufacturers willing to skimp on hardware in such a way as to make this a big enough issue that I am sitting here writing about it today.

As an aside, if you hear coil whine when running a game at 500-5000 FPS, I don't think that counts as a major problem for your gaming. At those frame rates, the per-frame load pulses on the power circuitry land squarely in the audible band (4500 FPS works out to roughly 4.5 kHz). I have seen a video or two running a DX9 render test at over 4500 FPS - pretty much any card built today will make noises you don't expect when hitting that kind of performance level.

As for my non-official discussions on the topic with various parties, everyone continues to reiterate that the problem is not as widespread as some of the forum threads would have you believe. It's definitely higher than normal, and getting public acknowledgements from EVGA and MSI basically confirms that, but one person told me the complaint and RMA levels are where they were expected to be considering the "massively fast sell out rates" the GTX 970 is experiencing.

Of course, AMD isn't immune to coil whine issues either. If you remember back to the initial launch of the Radeon R9 290X and R9 290, we had similar coil whine issues and experienced those first hand on reference card designs. (You can see a video I recorded of an XFX unit back in November of 2013 here.) You can still find threads on popular forums from that time period discussing the issue and YouTube never seems to forget anything, so there's that. Of course, the fact that previous card launches might have seen issues along the same line doesn't forgive the issue in current or later card releases, but it does put things into context.

So, let's get some user feedback; I want to hear from GTX 970 owners about their experiences to help guide our direction of research going forward.

Click here to take our short poll for GTX 970 owners!

Source: Various
Author:
Subject: Systems
Manufacturer: Steiger Dynamics

Overview

Oftentimes, one of the suggestions for what to do with older PC components is to dedicate them to a Home Theater PC. While this might seem like a great idea in concept (you can do a lot of things with full control over the box hooked up to your TV), I think it's a flawed one.

With an HTPC, some of the most desired traits are low power consumption and quiet operation, all while maintaining a high enough performance level to do things like transcode video quickly. Older components that you have outgrown don't tend to be nearly as efficient as newer ones. To have a good HTPC experience, you really want to pick components from the ground up, which is why I was excited to take a look at the Steiger Dynamics Maven Core HTPC.

As it was shipped to us, our Maven Core is equipped with an Intel Core i5-4690K and an NVIDIA GTX 980. By utilizing two of the most power efficient architectures available, Intel's Haswell and NVIDIA's Maxwell, the Maven should be able to sip power while maintaining low temperature and noise. While a GTX 980 might be overkill for just HTPC applications, it opens up a lot of possibilities for couch-style PC gaming with things like Steam Big Picture mode.

IMG_9996.JPG

From the outside, the hand-brushed aluminum Steiger Dynamics system takes the form of traditional high-end home theater gear. At 6.85 inches tall, or almost 4U if you are comfortable with that measurement system, the Maven Core is a large device, but it does not stand out in a collection of AV equipment. Additionally, when you consider that the standard Blu-ray drive and available Ceton InfiniTV Quad PCIe CableCARD tuner give this system the capability of replacing both a cable set-top box and a dedicated Blu-ray player altogether, the size becomes easier to deal with.

Digging deeper into the hardware specs of the Maven Core we find some familiar components. The Intel Core i5-4690K sits in an ASUS Z97-A motherboard along with 8GB of Corsair DDR3-1866 memory. For storage we have a 250GB Samsung 840 EVO SSD paired with a Western Digital 3TB Hard Drive for mass storage of your media.

IMG_0061.JPG

Cooling for the CPU is provided by a Corsair H90 with a single Phanteks fan to help keep the noise down. Steiger Dynamics shipped our system with a Seasonic Platinum-series 650W power supply, including their custom cabling option. For $100, they will ship your system with custom, individually sleeved power supply and SATA drive cables. The sleeving and cable management are impressive, but $100 would be a difficult upsell for a PC that you are likely never going to see the inside of.

As we mentioned earlier, this machine also shipped with a Ceton InfiniTV 4 PCIe CableCARD tuner. While CableCARD is a much maligned technology that never really took off, when you get it working it can be impressive. Our impressions of the InfiniTV can be found later in this review.

Continue reading our review of the Steiger Dynamics Maven Core HTPC!

The Alienware 13 comes with an optional Graphics Amplifier

Subject: General Tech | October 28, 2014 - 03:42 PM |
Tagged: alienware, Alienware 13, graphics amplifier, gaming laptop

The Alienware 13 is a gaming laptop that comes with a very interesting optional product, the so-called Graphics Amplifier, which is an external enclosure for a desktop GPU.  Finally, the product we have been waiting for has arrived, though only for a specific system.  The box will cost you $300 but will allow you to connect a GPU to your laptop with a single cord.  It does not ship with a GPU, but there is a 460W PSU inside.  The GPU can be at most a double-slot card (larger ones will not fit) and can have a maximum power draw of 375W, which is not really an issue as that limit comes from the PCIe interface.  The single cord you can see coming out of the back of the enclosure in this picture from Gizmodo provides a combined PCIe and USB connection to the laptop; when connected, it will disable the laptop's internal GPU and allow the external desktop GPU to power the system.

gntiejiarspy2nch7uxb.jpg

You cannot hot-swap your GPU; you will need to reboot your system to switch between the external GPU and your internal GPU, and SLI is not an option.  You do get to choose between your internal display or an external one connected via HDMI or Mini DisplayPort; the most expensive model of the Alienware 13 does ship with a 2560x1440 touchscreen, but it is still only 13" in size.

a131.JPG

The internals are quite nice, with a Haswell Core i5-4210U, a choice of either 8 or 16GB of DDR3-1600, a GTX 860M, and either a large HDD or a 256GB M.2 SSD.  That is enough power to keep this laptop from lagging behind in performance for the next few years, and with the external GPU you could feasibly upgrade your graphics for a few generations, keeping you in the game without needing a whole new system.

a132.JPG

From the tests that Gizmodo performed, the external GPU functions perfectly when enabled, which is great news for those of us who have been hoping that PCIe would eventually bring us a product such as this one.  The proprietary nature should not be too much of a concern; if Dell has managed to pull it off, there is no reason why other companies could not make a version that works with other laptops that have the proper ports.  This certainly addresses the biggest issue that gaming laptops have faced; now you can upgrade the laptop through several generations instead of needing to purchase a completely new system every other generation or so.

Source: Dell

Samsung updates 840 EVO Performance Restoration Tool

Subject: Storage | October 27, 2014 - 02:59 PM |
Tagged: Samsung, firmware, 840 evo

Over the weekend, Samsung silently updated their 840 EVO Performance Restoration Tool. The incremental update improved support for some system configurations that were previously not recognizing an installed 840 EVO. Samsung also improved how the GUI progress bar responds during the update process, presumably to correct the near-silent failure that occurred when the tool was unable to update the drive's firmware. Previously, the tool would halt at 15% without any clear indication that the firmware could not be updated (this would occur if the tool was unable to issue the necessary commands to the SSD, mainly due to the motherboard being in the wrong storage controller mode or using an incompatible storage driver).

DSC05837.JPG

Still no word on relief for owners of the original 840 (non-EVO or Pro). We've also heard from some users with Samsung OEM TLC-based SSDs that showed the same type of slowdown (some variants of the PM851 apparently used TLC flash). More to follow there.

We evaluated the Samsung 840 EVO Performance Restoration Tool here. If you've already successfully run the 1.0 version of the tool, there is no need to re-run the 1.1 version, as it will not do anything additional to an EVO that has been updated and restored.

Source: Samsung
Subject: Systems
Manufacturer: ECS

Introduction

DSC_0484 (Large).JPG

When Intel revealed their miniature PC platform in 2012, the new “Next Unit of Computing” (NUC) was a tiny motherboard with a custom case, and admittedly very little compute power. Well, maybe not so much with the admittedly: “The Intel NUC is an ultra-compact form factor PC measuring 4-inch by 4-inch. Anything your tower PC can do, the Intel NUC can do and in 4 inches of real estate.” That was taken from Intel’s NUC introduction, and though their assertion was perhaps a bit premature, technology does continue its rapid advance in the small form-factor space. We aren’t there yet by any means, but the fact that a mini-ITX computer can be built with the power of an ATX rig (limited to single-GPU, of course) suggests that it could happen for a mini-PC in the not so distant future.

With NUC the focus was clearly on efficiency over performance, and with very low power and noise there were practical applications for such a device to offset the marginal "desktop" performance. The viability of a NUC would definitely depend on the user and their particular needs, of course. If you could find a place for such a device (such as a living room) it may have been worth the cost, as the first of the NUC kits were fairly expensive (around $300 and up) and did not include storage or memory. These days a mini PC can be found starting as low as $100 or so, but most still do not include any memory or storage. They are tiny barebones PC kits after all, so adding components is to be expected...right?

DSC_0809 (Large).JPG

It’s been a couple of years now, and the platform continues to evolve - and shrink to some startlingly small sizes. Of the Intel-powered micro PC kits on today’s market the LIVA from ECS manages to push the boundaries of this category in both directions. In addition to boasting a ridiculously small size - actually the smallest in the world according to ECS - the LIVA is also very affordable. It carries a list price of just $179 (though it can be found for less), and that includes onboard memory and storage. And this is truly a Windows PC platform, with full Windows 8.1 driver support from ECS (previous versions are not supported).

Continue reading our look at the ECS LIVA Mini PC!!

Get your Win7 machines while you still can

Subject: General Tech | October 28, 2014 - 01:46 PM |
Tagged: microsoft, win7, inevitable

It is official: at the end of this month, consumers will no longer be able to get their hands on a machine with Windows 7 installed, unless they luck into one that has been sitting on the shelves for a while.  If you buy through a corporate account you will still be able to order a machine with Win7, but that will be the only way to get your hands on the OS, which is already almost impossible to find.  That puts shoppers in a bit of a bind, as Win10 will not arrive for a while yet, which leaves Win 8.1 as your only Microsoft-based OS.  Of course there is always Linux; now that many games and distribution platforms such as Steam support the free OS, it is a viable choice for both productivity and entertainment.  You can get more details at Slashdot or vent your spleen in the comments section.

images.jpg

"This Friday is Halloween, but if you try to buy a PC with Windows 7 pre-loaded after that, you're going to get a rock instead of a treat. Microsoft will stop selling Windows 7 licenses to OEMs after this Friday and you will only be able to buy a machine with Windows 8.1. The good news is that business/enterprise customers will still be able to order PCs 'downgraded' to Windows 7 Professional."

Here is some more Tech News from around the web:

Tech Talk

Source: Slashdot

New 13-Inch Windows 8.1 Tablet Rounds Out Lenovo's Yoga Tablet 2 Lineup

Subject: Mobile | October 29, 2014 - 09:08 PM |
Tagged: yoga tablet 2, Windows 8.1, Lenovo, Bay Trail, atom z3745, atom

Lenovo made a new 13-inch Windows 8.1 tablet official today, rounding out the company's Yoga Tablet 2 family. The aptly named Yoga Tablet 2 With Windows (13") combines the design and hardware features of the Yoga Tablet 2 Pro with those of its smaller 10-inch Yoga Tablet 2 (Android or Windows) siblings. This tablet lacks the Pico projector of the Pro model, but keeps the JBL audio hardware, QHD IPS display, and kickstand. It further adds a larger version of the Bluetooth AccuType keyboard seen on the 10-inch Yoga Tablet 2 Windows model. Aimed at productivity tasks, the Bay Trail-powered PC is equipped with additional memory and storage along with an ample 12,800 mAh battery rated at up to 15 hours of general usage (including video/audio playback and web browsing). It will be available for purchase next month for $699.

Convertible Tablet_Yoga Tablet 2 Pro_13_W_Bluetooth keyboard_01.jpg

The Yoga Tablet 2 with Windows 13-inch is a 2.27-pound (tablet only) PC featuring a 2560x1440 IPS display, JBL audio with a Wolfson Master Hi-Fi codec (two front-facing 1.5W stereo speakers with a rear-firing 5W subwoofer), a 1.6MP webcam for video conferencing, and a bundled AccuType keyboard cover. External IO includes one micro HDMI video output, one micro USB port, a micro SD card slot, and an analog audio jack. The tablet and keyboard are all ebony black, which sets it apart from the other, mostly silver-clad Yoga Tablet 2s.

Internally, Lenovo has chosen the quad-core Intel Atom (Bay Trail) Z3745 clocked at 1.86GHz, 4GB of LPDDR3 memory, and 64GB of internal storage that can be expanded with a micro SD card of up to 64GB. There is no cellular data support, but the tablet does include dual-band 802.11n Wi-Fi and Bluetooth 4.0 radios. A large 12,800 mAh lithium-polymer battery powers the tablet for up to 15 hours, according to Lenovo.
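
That 15-hour rating passes a quick sanity check. Assuming a typical ~3.7V lithium-polymer cell voltage (Lenovo does not publish a watt-hour figure here), the implied average draw is right in Bay Trail tablet territory:

    # Rough sanity check; 3.7V is an assumed typical Li-polymer voltage,
    # not a Lenovo-published spec.
    capacity_wh = 12.8 * 3.7        # 12,800 mAh ~= 47.4 Wh
    avg_draw_w = capacity_wh / 15   # ~3.2 W average over 15 hours
    print(f"{capacity_wh:.1f} Wh, {avg_draw_w:.1f} W average draw")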

Convertible Tablet_Yoga Tablet 2 Pro_13_W_Bluetooth keyboard_02.jpg

The tablet runs the full version of Windows 8.1 and comes with a one-month trial of Office 365 (which recently started offering 'unlimited' cloud storage).

It will be available for purchase in November on Lenovo.com for $699.

I like the black design, and the inclusion of a keyboard along with the use of Windows 8.1 makes this a better choice for business users than the Android-running Yoga Tablet 2 Pro model. The specifications look pretty good for what it is, though I question how many Lenovo will sell at that price point. You can find older-generation convertible tablets, even from Lenovo, running faster Intel Core (Ivy Bridge and similar) chips in that price range, not to mention regular laptops should you not need the hybrid/tablet nature. It sits in an odd middle ground between budget Bay Trail devices and starter ultrabooks, though the high-resolution IPS display and audio do not hurt.

Do you think it has a place in the market and will you be picking one up?

*For reference, the 13" Yoga Tablet 2 Pro has an MSRP of $499 while the 10-inch Yoga Tablet 2 (Windows, with keyboard) has an MSRP of $399. The $200 or $300 premium (depending on the comparison) gets you (at least) a device with more memory and storage and potentially an added keyboard or a larger device.

Source: Lenovo

Samsung 850 EVO SKUs leaked, leading to initial pricing and specs

Subject: Storage | October 28, 2014 - 01:30 PM |
Tagged: ssd, sata, Samsung, 850 EVO

Thanks to an updated SKU list and some searching, we've come across some initial photos, specs, and pricing for the upcoming Samsung 850 EVO.

8310217.01.prod_.jpg

You may have heard of an 850 EVO 1TB listing over at Frys, but there's actually more information out there. Here's a quick digest:

Specs:

  • Memory: 3D V-NAND
  • Read: 550MB/sec
  • Write: 520MB/sec
  • Weight: 0.29 lbs

Pricing (via Antares Pro listings at time of writing):

  • 120GB (MZ-75E120B/AM): $100 ($0.83 / GB)
  • 250GB (MZ-75E250B/AM): $146 ($0.58 / GB)
  • 500GB (MZ-75E500B/AM): $258 ($0.52 / GB)
  • 1TB (MZ-75E1T0B/AM): $477 ($0.48 / GB)
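
The per-gigabyte figures check out if you run the division yourself (prices and capacities taken from the list above):

    # Quick check of the leaked per-GB pricing.
    for gb, price in [(120, 100), (250, 146), (500, 258), (1000, 477)]:
        print(f"{gb}GB: ${price / gb:.2f} / GB")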

In addition to the above, we saw the 1TB model listed for $500 at Frys, and also found the 500GB for $264 at ProVantage. The shipping date on the Frys listing was initially November 3rd, but that has since shifted to November 24th, presumably due to an influx of orders.

We'll be publishing a full capacity roundup of the 850 Pro in anticipation of the 850 EVO launch, which, based on these leaks, is imminent.

LiteOn announces EP1 Series Enterprise M.2 PCIe SSDs

Subject: Storage | October 28, 2014 - 04:49 PM |
Tagged: ssd, pcie, M.2, LiteOn

In conjunction with Dell World, LiteOn has announced their new EP1 M.2 PCIe SSD:

EP1 pic.png

Designed primarily for enterprise workloads and usage, the EP1 sports impressive specs for such a small device. Capacities are 480 and 960GB, random 4K IO is rated at 150k/44k IOPS (read/write), sequentials are as high as 1.5GB/sec, and max latencies are in the 30-40 µs range (a spec that is particularly important for enterprise OLTP / transactional database workloads). Given the enterprise positioning, power loss protection is a given (you can see the capacitors in the upper right of the above photo). Here are the full specs:

EP1 specs.png

It should be noted that larger PCIe-based SSDs are rated for more than the EP1's one drive write per day, but they are also considerably larger (physically) than the M.2 EP1. As an additional aside, the 960GB model is a bit longer than what you might have seen so far in the M.2 form factor. While the 480GB model is a familiar 2280 (80mm long), the 960GB model follows the 22110 form factor (110mm long). The idle power consumption seems a bit high, but enterprise devices are typically tuned for instantaneous response over idle wattage.

Full press blast after the break.

Source: LiteOn

Samsung Germany acknowledges '840 Basic' performance slow down, promises fix

Subject: Storage | October 29, 2014 - 03:10 PM |
Tagged: tlc, Samsung, firmware, 840

If you own a Samsung 840 SSD, it appears that, after much repeated and vocal pressure, Samsung has acknowledged the slow down also affects your drive. We're not talking about the EVO or the Pro; this is the original pure-TLC model that launched (the EVO is a TLC+SLC cache hybrid, while the Pro is all MLC). Here's the quote from Samsung, via Computer Base:

Uns ist durch das Feedback, das uns erreicht hat, bekannt, dass es auch beim Zugriff auf bestimmte Daten bei Modellen der SSD 840 zu niedrigeren Leseleistungen als angegeben kommen kann.

Im Moment untersuchen unsere Produktexperten systematisch die betreffenden SSD-Modelle innerhalb verschiedener Systemumgebungen und arbeiten an einer schnellstmöglichen Lösung.

Aufgrund der unterschiedlichen Technologien sind die Modelle der PRO-Serie (840 PRO und 850 PRO) nicht betroffen.

Samsung

What? You can't read German? Neither can we, but paraphrasing from the poor quality translation from several online tools, we deduce that Samsung has acknowledged the issue on the 840, and is working on a solution as quickly as possible. This is similar verbiage to the statement issued for the 840 EVO acknowledgement.

** Update **

Thanks to Zyhmet, who commented shortly after posting, here's a human translation:

Because of the feedback we got, we realized that accessing specific data with units of the SSD 840 could lead to lower reading performance.

For the moment our experts are systematically examining the SSD-units with different system environments and we are working on a solution as fast as possible.

Due to different technologies the PRO-series (840 PRO and 850 PRO) are not affected.

Samsung

** End update **

Side note - of those who have used the 840 EVO Performance Restoration Tool, a few have reported an issue cropping up. The error manifests as a SMART data misreporting error:

temp (cooling).png

What's odd about this error is that it was present on some of our pre-production test samples (firmware EXT0AB0Q), and was corrected once we updated those samples to the first retail build (EXT0BB0Q). The image above was an actual screen shot taken during our temperature-dependency testing of the slow down issue. While none of our samples had the issue return when updating all the way to the performance restored firmware, one of those updates did corrupt the Master File Table, rendering the majority of the SSD inaccessible. While we have seen no other reports of corrupted partitions, several users noticed the SMART reporting issue after updating. It's odd to see this sort of a regression with firmware updates, in that a bug fixed in the initial shipping firmware has returned (for some) in a subsequent update. If you've updated your 840 EVO with their Performance Restoration Tool, it may be a good time to check your SMART attributes. If you see the error above, please leave us a note in the comments.
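
If you would rather script that check than dig through a GUI, here is a minimal sketch that shells out to smartctl from the open-source smartmontools package; the device path is a placeholder, so point it at your 840 EVO:

    # Minimal sketch: dump the SMART attribute table via smartctl (smartmontools).
    # "/dev/sda" is a placeholder device path, not specific to the 840 EVO.
    import subprocess

    result = subprocess.run(
        ["smartctl", "-A", "/dev/sda"],
        capture_output=True, text=True, check=True
    )
    print(result.stdout)  # look for misreported values, e.g. the temperature attribute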

Circling back to the slow down issue - given that it is present in two TLC-based SSDs from Samsung, one has to wonder if this issue exists in other Samsung TLC SSDs as well. Here's the list of potentials (thanks to an anonymous comment on a prior story):

  • 840 EVO - 19nm TLC
  • 840 - 21nm TLC
  • PM841 - 21nm TLC
  • PM851 - 21nm TLC (some SKUs)
  • 845DC EVO - 19nm TLC
  • PM843 - 21nm TLC
  • PM853T - 21nm TLC

We have several questions out to Samsung on these issues, but to date they have not been answered. More to follow as we wait for an official (English) response.

Eighteen-core Xeon E7 v3 Based on Haswell-EX in Q2'15

Subject: Processors | October 29, 2014 - 05:44 PM |
Tagged: Intel, Haswell-E, Haswell-EX, Ivy Bridge-EX

Last February, Intel launched the Xeon E7 v2 line of CPUs. Based on the Ivy Bridge architecture, they replaced the original Xeon E7s, based on Westmere, that were released in April 2011. Intel is now planning to release Haswell-EX in the second quarter of 2015. No specific SKUs are listed; this information describes the product family as a whole.

intel-xeon-e7v2.jpg

This is Ivy Bridge-EX. Haswell-EX will have 3 extra cores (and look a bit different).

To set the tone: these are not small chips. Using the previous generation as an example, Ivy Bridge-EX was over twice the size (surface area) of Ivy Bridge-E, and it contained over twice the number of transistors. While Ivy Bridge-EX was available with up to 15 physical cores per processor (30 threads with HyperThreading), Haswell-EX increases that to 18 cores, or 36 simultaneous threads. If that is not enough cores, you can pick up an eight-socket motherboard and load it up with several of these.

Other than their gigantic size, these chips are fairly similar to the Xeon E5 processors that are based on Haswell-E. If you need eighteen cores per package and can spare several thousand dollars per processor, you should be able to give someone your money in just a handful of months.

Source: KitGuru

Don't tell your iObsessed iBuddies but the iPad Air 2 is a bit of an iBore

Subject: General Tech | October 30, 2014 - 01:01 PM |
Tagged: iPad Air 2, apple

There were long lineups of people desperate to get their hands on the new iPad Air 2, regardless of the fact that the internals cost a mere $1 more than the initial model.  To be fair, that is not the best way to judge the quality of the upgrade; that should rely more on the screen quality ... which is exactly the same in all respects except for a new anti-reflective coating.  Apple is also reducing their markup, from 45-61% down to a paltry 45-57% for this generation, so at least that $1.00 extra in materials will not raise your purchase price overly.  The internals, such as the TSMC-made A8X and camera, match the iPhone 6 to a large extent, making it a more powerful phablet than the original, so don't disparage it too much.  You can read more on The Register if you are into fruit.

ipadair2_exploded.jpg

"New iPad Air 2 components cost Apple just one dollar more than the previous model, according to the teardown bods at IHS."

Here is some more Tech News from around the web:

Tech Talk

No new Intel for you this year

Subject: General Tech | October 27, 2014 - 12:35 PM |
Tagged: Haswell-EX, Haswell-EP4S, Intel, server, xeon, Broadwell-DE, Skylake

Intel's release schedules have been slowing down; unfortunately, that is in large part because the only competition they face in certain market segments is themselves.  For high-end servers it looks like we won't see Haswell-EX or EP4S until the second half of next year, and Skylake chips for entry-level servers until after the third quarter.  Intel does have to fight for their share of the SoC and low-powered chip market; DigiTimes reports the Broadwell-DE family and the C2750 and C2350 should be here in the second quarter, which gives AMD and ARM a chance to gain market share against Intel's current offerings.  Along with the arrival of the new chips, we will also see older Itanium, Xeon, Xeon Phi and Atom models discontinued; some may be gone before the end of the year.  You have already heard the bad news about Broadwell-E.

index.jpg

"Intel's next-generation server processors for 2015 including new Haswell-EX (Xeon E7 v3 series) and -EP4S (Xeon E5-4600 v3 series), are scheduled to be released in the second quarter of 2015, giving clients more time to transition to the new platform, according to industry sources."

Here is some more Tech News from around the web:

Tech Talk

Source: DigiTimes

Cooler Master's new Nepton 240M in action

Subject: General Tech | October 24, 2014 - 01:22 PM |
Tagged: watercooler, Nepton 240M, Nepton, cooler master, all in one

As with the previous generation, the new Nepton 240M is designed with "ultra-fine microchannel" technology, which quadruples the surface area of the radiator but does provide more resistance to air travelling through the rad.  Installation was a breeze, with only one small issue with the gasket that was easily solved.  The Tech Report were more than happy with the new Silencio fans, which stayed under 40 dB under load; in fact, the noise barely changed compared to when the computer was idle.  The pump was also reasonably quiet and powerful enough to keep the CPU nice and cool, though at a cost: the new Nepton 240M has an MSRP of $130.

overview.jpg

"The Nepton 240M is a big liquid cooler with a price to match. We strapped it to TR's Casewarmer to see whether it could take the heat."

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

Building a new PC for the holidays?

Subject: Systems | October 29, 2014 - 01:19 PM |
Tagged: system build

The Tech Report have updated their system build recommendations for the latter part of 2014, with changes to their system components as well as a reluctant recommendation for Win 8.1, as Win7 is scheduled for EOL in the New Year.  The Core i7-5960X did not make the cut, as the i7-5930K reaches similar performance for just over half the price; the guide also marks the first appearance of DDR4, specifically the Crucial 16GB and 32GB DDR4-2133 kits.  There is a lot of choice right now when it comes to GPUs: four under $150, five under $250, and four ranging from ~$300 to $630, ensuring that you can find one in your price range.  Check out the full array of choices in their update.

Make sure to check out the recent updates on our Hardware Leaderboard as well.

haswell-oc.jpg

"Join us for another System Guide update, this time with just about all the tools you need to build a holiday PC early. We've got Nvidia's new GeForce GTX 900-series graphics cards, one of AMD's recently discounted A-series APUs, and much more."

Here are some more Systems articles from around the web:

Systems

Workstation class X99 from ASRock

Subject: General Tech | October 30, 2014 - 02:59 PM |
Tagged: x99 ws, Intel X99, Haswell-E, asrock

ASRock has a workstation-class board for Haswell-E with five PCIe 3.0 slots and support for up to 128GB of RAM, which can be ECC if you install an appropriate processor; on the back are four USB 2.0 and four USB 3.0 ports, one eSATA port, audio, and a pair of LAN ports.  They also included A-Tuning overclocking software, which seems odd for a workstation board but proved to be very important, as [H]ard|OCP could not get the system they built with this board to POST at default settings and had to change UEFI settings to get it to boot.  Once it did start up, the performance was solid and it was one of the better ASRock boards that [H] has reviewed, though with a street price over $300 it is hard to recommend.

1414356118PQyIkyQD6F_1_8_l.jpg

"ASRock comes to us with its "Work Station" version Haswell-E motherboard. This time our out-of-box experience with its X99 WS was as rock solid as it could be and did leave us with feelings of getting to work with a quality component. As you all know, we are much more interested in how it performs at high clocks while under stress."

Here are some more Motherboard articles from around the web:

Motherboards

Source: [H]ard|OCP