Subject: General Tech, Graphics Cards | October 27, 2014 - 04:50 PM | Ryan Shrout
Tagged: xbox one, sony, ps4, playstation 4, microsoft, amd
A couple of weeks back, a developer on Ubisoft's Assassin's Creed Unity was quoted as saying the team had decided to run both the Xbox One and PlayStation 4 variants of the game at 1600x900 resolution "to avoid all the debates and stuff." Of course, the Internet exploded in a collection of theories about why that would be the case: were they paid off by Microsoft?
For those of us who focus more on the world of PC gaming, however, an email sent the following week to the Giantbomb.com weekly podcast from an anonymous (but seemingly reliable) developer on the Unity team raised even more interesting material. In this email, beyond addressing the value of pixel counts and the stunning visuals of the game, the developer asserted that we may have already hit the peak graphical compute capability of these two new gaming consoles. Here is a portion of the information:
The PS4 couldn’t do 1080p 30fps for our game, whatever people, or Sony and Microsoft say. ...With all the concessions from Microsoft, backing out of CPU reservations not once, but twice, you’re looking at about a 1-2 FPS difference between the two consoles.
What's hard is not getting the game to render but getting everything else in the game at the same level of performance we designed from the start for the graphics. By the amount of content and NPCs in the game, from someone who witnessed a lot of optimizations for games from Ubisoft in the past, this is crazily optimized for such a young generation of consoles. This is really about to define the next-generation unlike any other game beforehand.
We are bound from the CPU because of AI. Around 50% of the CPU is used for the pre-packaged rendering parts..
So, if we take this anonymous developer's information as true (and this whole story rests on that assumption), then we have learned some interesting things.
- The PS4, the more graphically powerful of the two very similarly designed consoles, was not able to maintain a 30 FPS target when rendering at 1920x1080 resolution with Assassin's Creed Unity.
- The Xbox One (after giving developers access to more compute cycles previously reserved to Kinect) is within a 1-2 FPS mark of the PS4.
- The Ubisoft team sees Unity as being "crazily optimized" for the architecture even as we just now approach the one year anniversary of the consoles' release.
- Half of the CPU compute time is being used to help the rendering engine by unpacking pre-baked lighting models for the global illumination implementation, and thus the game is limited by the remaining 50% that is left to power the AI, etc.
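To put that 50% figure in perspective, here is a back-of-the-envelope frame-budget sketch (the 30 FPS target comes from the article; the even CPU split is the developer's claim, and the sketch assumes rendering work scales with the whole frame budget):

```python
# Rough frame-time budget at a 30 FPS target (illustrative numbers).
TARGET_FPS = 30
frame_budget_ms = 1000.0 / TARGET_FPS               # ~33.3 ms per frame

render_share = 0.50                                  # CPU share spent unpacking pre-baked lighting
cpu_render_ms = frame_budget_ms * render_share       # ~16.7 ms feeding the renderer
cpu_remaining_ms = frame_budget_ms - cpu_render_ms   # what's left for AI, physics, audio, etc.

print(f"Frame budget:     {frame_budget_ms:.1f} ms")
print(f"CPU on rendering: {cpu_render_ms:.1f} ms")
print(f"CPU left for AI:  {cpu_remaining_ms:.1f} ms")
```

With only ~16.7 ms per frame left for everything that isn't rendering, it is easier to see why a crowd-heavy game like Unity would hit a CPU wall before a GPU one.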
It would appear that, just as many in the media declared when the specifications for the new consoles were announced, the hardware inside the PlayStation 4 and Xbox One undershoots what game developers need to truly build "next-generation" games. If, as this developer states, we are less than a year into the life cycle of hardware that was planned for an 8-10 year window and we have already reached its performance limits, that's a bad sign for game developers who really want to create exciting gaming worlds. Keep in mind that this time around the hardware isn't custom-built cores or a Cell architecture; we are talking about very standard x86 cores and traditional GPU hardware that ALL software developers are intimately familiar with. It does not surprise me one bit that the more advanced development teams have already hit peak performance.
If the PS4, the slightly more powerful console of the pair, is unable to render reliably at 1080p with a 30 FPS target, then unless the Ubisoft team is completely off its rocker in terms of development capability, the advancement of gaming on consoles would appear to be somewhat limited. Remember the specifications for these two consoles:
| | PlayStation 4 | Xbox One |
| --- | --- | --- |
| Processor | 8-core Jaguar APU | 8-core Jaguar APU |
| Memory | 8GB GDDR5 | 8GB DDR3 |
| Graphics Card | 1152 Stream Unit APU | 768 Stream Unit APU |
| Peak Compute | 1,840 GFLOPS | 1,310 GFLOPS |
The custom-built parts from AMD both feature an 8-core Jaguar x86 architecture and either 768 or 1152 stream processors. The Jaguar CPU cores aren't high performance parts: single-threaded performance of Jaguar trails Intel's Silvermont/Bay Trail designs by as much as 25%, and Bay Trail is powering lots of super low cost tablets today, including the $179 ECS LIVA palm-sized mini-PC we reviewed this week. The 1152/768 stream processors in the GPU portion of the AMD APU provide some punch, but a Radeon HD 7790 (now called the R7 260X), released in March of 2013, provides more performance than the PS4's GPU, and the Radeon R7 250X is faster than what resides in the Xbox One.
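The peak compute figures in the table above fall out of the standard formula: stream processors × 2 FLOPs per clock (one fused multiply-add) × clock speed. Using the commonly reported GPU clocks of 800 MHz for the PS4 and 853 MHz for the Xbox One:

```python
# Peak single-precision compute = shaders * 2 FLOPs (FMA) * clock in GHz.
def peak_gflops(stream_processors, clock_ghz):
    return stream_processors * 2 * clock_ghz

ps4 = peak_gflops(1152, 0.800)   # matches the ~1,840 GFLOPS spec
xb1 = peak_gflops(768, 0.853)    # matches the ~1,310 GFLOPS spec
print(f"PS4: {ps4:.0f} GFLOPS, Xbox One: {xb1:.0f} GFLOPS")
```

Run the same formula against a desktop GPU's shader count and clock and you can see exactly how far the console APUs sit below contemporary discrete cards.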
If you were to ask me today what kind of performance would be required from AMD's current GPU lineup for a steady 1080p gaming experience on the PC, I would probably tell you the R9 280, a card you can buy today for around $180. From NVIDIA, I would likely pick a GTX 760 (around $200).
Also note that if the developer is using 50% of the CPU resources for rendering computation and the remaining 50% isn't able to hold up its duties on AI, etc., we likely have hit performance walls on the x86 cores as well.
Even if this developer quote is 100% correct that doesn't mean that the current generation of consoles is completely doomed. Microsoft has already stated that DirectX 12, focused on performance efficiency of current generation hardware, will be coming to the Xbox One and that could mean additional performance gains for developers. The PS4 will likely have access to OpenGL Next that is due in the future. And of course, it's also possible that this developer is just wrong and there is plenty of headroom left in the hardware for games to take advantage of.
But honestly, based on my experience with these GPU and CPU cores, I don't think that's the case. If you look at screenshots of Assassin's Creed Unity and then look at the minimum and recommended specifications for the game on the PC, there is a huge discrepancy. Are the developers just writing lazy code and not truly optimizing for the hardware? It seems unlikely that a company the size of Ubisoft would choose this route on purpose, creating a console game that runs in a less-than-ideal state while also struggling on the PC version. Remember, there is almost no "porting" going on here: the Xbox One and PlayStation 4 now share the same basic architecture as the PC.
Of course, we might just be treading through known waters. I know we are a bit biased, and so is our reader base, but I am curious: do you think MS and Sony have put themselves in a hole with their shortsighted hardware selections?
UPDATE: It would appear that a lot of readers and commenters take our editorial on the state of the PS4 and XB1 as a direct attack on AMD and its APU design. That isn't really the case - regardless of what vendor's hardware is inside the consoles, had Microsoft and Sony still targeted the same performance levels, we would be in the exact same situation. An Intel + NVIDIA hardware combination could just as easily have been built to the same peak theoretical compute levels and would have hit the same performance wall just as quickly. MS and Sony could have prevented this by using higher performance hardware, selling the consoles at a loss out of the gate and preparing each platform properly for the next 7-10 years. And again, the console manufacturers could have done that with higher end AMD hardware, Intel hardware or NVIDIA hardware. The state of the console performance war is truly hardware agnostic.
Subject: General Tech, Graphics Cards | October 29, 2014 - 06:12 PM | Scott Michaud
Tagged: ubisoft, assassin's creed
Ubisoft has integrated GameWorks into Assassin's Creed Unity, or at least parts of it. The main feature to be included is NVIDIA's Horizon Based Ambient Occlusion Plus (HBAO+), which is their implementation of Ambient Occlusion. This effect darkens areas that would otherwise be incorrectly lit with our current limitations of Global Illumination. Basically, it analyzes the scene's geometry to subtract some of the influence of "ambient light" in places where it is an unrealistic approximation (particularly in small crevices). This is especially useful for overcast scenes, where direct sunlight does not overwhelm the contribution of scatters and bounces.
The other features to be included are Temporal Anti-aliasing (TXAA), Percentage-Closer Soft Shadows (PCSS), and GeometryWorks Advanced Tessellation. TXAA and PCSS were both included in Assassin's Creed IV: Black Flag, alongside the previously mentioned HBAO+, so it makes sense that Ubisoft continues to use what worked for them. GeometryWorks is a different story. NVIDIA seems to claim that it is like DirectX 11 tessellation, but is better suited for use alongside HBAO+ and PCSS.
Assassin's Creed Unity will be available on November 11th.
Subject: Graphics Cards | October 24, 2014 - 03:44 PM | Ryan Shrout
Tagged: radeon, R9 290X, leaderboard, hwlb, hawaii, amd, 290x
When NVIDIA launched the GTX 980 and GTX 970 last month, it shocked the discrete graphics world. The GTX 970 in particular was an amazing performer and undercut the price of the Radeon R9 290 at the time. That is something that NVIDIA rarely does and we were excited to see some competition in the market.
AMD responded with some price cuts on both the R9 290X and the R9 290 shortly thereafter (though they refuse to call them that) and it seems that AMD and its partners are at it again.
Looking on Amazon.com today we found several R9 290X and R9 290 cards at extremely low prices. For example:
- XFX Radeon R9 290X Double D - $299 (after MIR)
- Gigabyte R9 290X WindForce - $360
- MSI R9 290X Gaming - $366
The R9 290X's primary competition in terms of raw performance is the GeForce GTX 980, currently selling for $549 and up (if you can find one in stock). That means NVIDIA has a $250 hill to climb when going against the lowest priced R9 290X.
The R9 290 looks interesting as well:
Several other R9 290 cards are selling for upwards of $300-320, making them bone-headed decisions if you can get the R9 290X for the same or lower price. But consider that the GeForce GTX 970 is selling for at least $329 today (if you can find it), and you can see why consumers are paying close attention.
Will NVIDIA make any adjustments of its own? It's hard to say right now since stock of both the GTX 980 and GTX 970 is so hard to come by, and it's hard to imagine NVIDIA lowering prices as long as parts continue to sell out. NVIDIA believes that its branding and technologies like G-Sync make GeForce cards more valuable, and until they begin to see a shift in the market, I imagine they will stay the course.
For those of you who utilize our Hardware Leaderboard, you'll find that Jeremy has taken these prices into account and updated a couple of the system build configurations.
Subject: Graphics Cards | October 28, 2014 - 12:09 PM | Ryan Shrout
Tagged: maxwell, GTX 970, geforce, coil whine
Coil whine is the undesirable effect of electrical components creating audible noise when operating. Let's look to our friends at Wikipedia for a concise and accurate description of the phenomenon:
Coil noise is, as its name suggests, caused by electromagnetic coils. These coils, which may act as inductors or transformers, have a certain resonant frequency when coupled with the rest of the electric circuit, as well as a resonance at which it will tend to physically vibrate.
As the wire that makes up the coil passes a variable current, a small amount of electrical oscillation occurs, creating a small magnetic field. Normally this magnetic field simply works to establish the inductance of the coil. However, this magnetic field can also cause the coil itself to physically vibrate. As the coil vibrates physically, it moves through a variable magnetic field, and feeds its resonance back into the system. This can produce signal interference in the circuit and an audible hum as the coil vibrates.
Coil noise can happen, for example, when the coil is poorly secured to the circuit board, is poorly damped, or if the resonant frequency of the coil is close to the resonant frequency of the electric circuit. The effect becomes more pronounced as the signal passing through the coil increases in strength, and as it nears the resonant frequency of the coil, or as it nears the resonant frequency of the circuit. Coil noise is also noticed most often when it is in the humanly audible frequency.
Coil noise is also affected by the irregularities of the magnetic material within the coil. The flux density of the inductor is affected by these irregularities, causing small currents in the coil, contaminating the original signal. This particular subset of coil noise is sometimes referred to as magnetic fluctuation noise or the Barkhausen effect. Coil noise can also occur in conjunction with the noise produced by magnetostriction.
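To make the Wikipedia description concrete, here is a toy calculation of where an LC resonance can land. The component values are purely illustrative (not measured from any card), but they show how easily a VRM-scale inductor and capacitor pair resonates inside the audible band:

```python
import math

# Resonant frequency of an LC pair: f = 1 / (2 * pi * sqrt(L * C)).
def resonant_hz(inductance_h, capacitance_f):
    return 1.0 / (2 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Hypothetical values: a 10 uH choke with 100 uF of output capacitance.
f = resonant_hz(10e-6, 100e-6)
print(f"{f:.0f} Hz")   # ~5 kHz: squarely inside the 20 Hz - 20 kHz audible range
```

Whether you actually hear anything then depends on how well the coil is secured and damped, which is exactly the component-selection point raised below.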
Gamers that frequently upgrade their graphics cards may have been witness to this problem with a particular install, or you might have been one of the lucky ones to never deal with the issue. If your computer sits under your desk, in a loud room or you only game with headphones, it's also possible that you just never noticed.
Possibly offending inductors?
The reason this comes up today is that reports are surfacing of GeForce GTX 970 cards from various graphics card vendors exhibiting excessive coil whine or coil noise. These reports are coming in from multiple forum threads around the internet, a collection of YouTube videos of users attempting to capture the issue and even official statements from some of NVIDIA's partners. Now, just because the internet is talking about it doesn't necessarily mean it's a "big deal" relative to the number of products being sold. However, after several Twitter comments and emails requesting we look into the issue, I thought it was pertinent to start asking questions.
As far as I can tell today, GTX 970 cards from multiple vendors including EVGA, MSI and Gigabyte all have users reporting issues and claims of excessive coil noise. For my part here, I have two EVGA GTX 970 cards and an MSI GTX 970, none of which are producing sound at what I would call "excessive" levels. Everyone's opinion of excessive noise is going to vary, but as someone who sits next to a desk-high test bed and hears hundreds of cards a year, I am confident I have a good idea of what to listen for.
We are still gathering data on this potential issue, but a few of the companies mentioned above have issued official or semi-official statements on the problem.
The coil whine issue is not specific to the 900 series but can happen with any high end GPU, and MSI is looking into ways to minimize the issue. If you still have concern regarding this issue, then please contact our RMA department.
We have been watching the early feedback on GTX 970 and inductor noise very closely, and have actively taken steps to improve this. We urge anyone who has this type of concern to contact our support so we can address it directly.
We’re aware of a small percentage of users reporting excessive “coil whine” noises and are actively looking into the issue.
We are waiting for feedback from other partners to see how they plan to respond.
Since all of the GTX 970 cards currently shipping are non-reference, custom-built PCB designs, NVIDIA's input on the problem is mostly one of recommendations. NVIDIA knows that it is their name and brand being associated with any noisy GeForce cards, so I would expect a lot of discussions and calls behind closed doors to make sure partners are addressing user concerns.
Interestingly, the GeForce GTX 970 was the one card of this Maxwell release where all of NVIDIA's partners chose to go the route of custom designs rather than adopting the NVIDIA reference design. On the GTX 980, however, you'll find a mix of both, and I would wager that NVIDIA's reference boards do not exhibit any above average noise levels from coils. (I have actually tested four reference GTX 980s without coil whine coming into play.) Sometimes offering all of these companies the option to be creative and to differentiate can backfire if the utmost care isn't taken in component selection.
Ironically, the fix is simple: a little glue on those vibrating inductor coils and the problem goes away. But most of the components are sealed, making the simple fix a non-starter for the end user (and I wouldn't recommend doing it anyway). It does point to a lack of oversight from board manufacturers willing to skimp on components in such a way as to make this a big enough issue that I am sitting here writing about it today.
As an aside, if you hear coil whine when running a game at 500-5000 FPS, I don't think that counts as being a major problem for your gaming. I have seen a video or two running a DX9 render test at over 4500 FPS - pretty much any card built today will make noises you don't expect when hitting that kind of performance level.
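A simplified way to see why extreme frame rates provoke whine: the GPU's load, and thus the current pulsing through its inductors, swings once per frame, so the fundamental of that cycling sits at the frame rate itself. The sketch below is a rough model (real spectra are messier), using the textbook figure that human hearing is most sensitive roughly in the 1-5 kHz region:

```python
# Per-frame load cycling: frame time in ms, and whether the cycling
# frequency (= the frame rate in Hz) lands where hearing is most sensitive.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (60, 500, 4500):
    most_sensitive = 1000 <= fps <= 5000   # rough 1-5 kHz sensitivity band
    print(f"{fps:>5} FPS -> {frame_time_ms(fps):7.3f} ms/frame, "
          f"cycling in peak-sensitivity band: {most_sensitive}")
```

At 4500 FPS the per-frame current pulsing sits right where our ears are most sensitive, which is why an uncapped DX9 render test is such an effective whine generator.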
As for my non-official discussions on the topic with various parties, everyone continues to reiterate that the problem is not as widespread as some of the forum threads would have you believe. It's definitely higher than normal, and the public acknowledgements from EVGA and MSI basically confirm that, but one person told me the complaint and RMA levels are where they were expected to be considering the "massively fast sell out rates" the GTX 970 is experiencing.
Of course, AMD isn't immune to coil whine issues either. If you remember back to the initial launch of the Radeon R9 290X and R9 290, we had similar coil whine issues and experienced those first hand on reference card designs. (You can see a video I recorded of an XFX unit back in November of 2013 here.) You can still find threads on popular forums from that time period discussing the issue and YouTube never seems to forget anything, so there's that. Of course, the fact that previous card launches might have seen issues along the same line doesn't forgive the issue in current or later card releases, but it does put things into context.
So, let's get some user feedback; I want to hear from GTX 970 owners about their experiences to help guide our direction of research going forward.
Subject: General Tech | October 28, 2014 - 03:42 PM | Jeremy Hellstrom
Tagged: alienware, Alienware 13, graphics amplifier, gaming laptop
The Alienware 13 is a gaming laptop that comes with a very interesting optional product: the so-called Graphics Amplifier, an external enclosure for a desktop GPU. Finally, the product we have been waiting for has arrived, though only for a specific system. The box will cost you $300 and allows you to connect a GPU to your laptop with a single cord. It does not ship with a GPU, but there is a 460W PSU inside. The GPU can be at most a double slot card (larger ones will not fit) with a maximum power draw of 375W, which is not really an issue as that limit comes from the PCIe interface. The single cord you can see coming out of the back of the enclosure in this picture from Gizmodo provides a combined PCIe and USB connection to the laptop; when connected, it disables the laptop's internal GPU and lets the external desktop GPU drive the system.
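That 375W ceiling matches the standard PCIe power budget: 75W from the slot plus 150W per 8-pin PEG connector. Assuming the enclosure wires up two 8-pin connectors (our assumption; Dell doesn't spell out the internal cabling), the arithmetic is:

```python
# Standard PCIe power budgets (watts).
slot_w = 75           # PCIe x16 slot delivers up to 75 W
eight_pin_w = 150     # each 8-pin PEG connector delivers up to 150 W

max_draw = slot_w + 2 * eight_pin_w
print(max_draw)       # 375
```

That covers every single-GPU card on the market today, so the limit is effectively academic for this enclosure.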
You cannot hotswap your GPU: you will need to reboot your system to switch between the external GPU and your internal GPU, nor is SLI an option. You do get to choose between the internal display or an external one connected via HDMI or Mini DisplayPort; the most expensive model of the Alienware 13 does ship with a 2560x1440 touchscreen, but it is still only 13" in size.
The internals are quite nice with a Haswell Core i5 4210U, a choice of either 8 or 16GB of DDR3-1600, a GTX 860M and either a large HDD or a 256GB M.2 SSD. That is enough power to keep this laptop from lagging behind in performance for the next few years and with the external GPU you could feasibly upgrade your graphics for a few generations which will keep you in the game without needing a whole new system.
From the tests that Gizmodo performed, the external GPU functions perfectly when enabled, which is great news for those of us who have been hoping that PCIe would eventually bring us a product such as this one. The proprietary nature should not be too much of a concern; if Dell has managed to pull it off, there is no reason other companies could not make a version that works with other laptops that have the proper ports. This certainly changes the biggest issue that gaming laptops have faced; now you can upgrade the laptop through several generations instead of needing to purchase a completely new system every other generation or so.
Oftentimes, one of the suggestions for what to do with older PC components is to dedicate them to a Home Theater PC. While in concept this might seem like a great idea (you can do a lot of things with full control over the box hooked up to your TV), I think it's a flawed concept.
With an HTPC, some of the most desired traits include low power consumption and quiet operation, all while maintaining a high enough performance level to do things like transcode video quickly. Older components that you have outgrown don't tend to be nearly as efficient as newer components. To have a good HTPC experience, you really want to pick components from the ground up, which is why I was excited to take a look at the Steiger Dynamics Maven Core HTPC.
As it was shipped to us, our Maven Core is equipped with an Intel Core i5-4690K and an NVIDIA GTX 980. By utilizing two of the most power efficient architectures available, Intel's Haswell and NVIDIA's Maxwell, the Maven should be able to sip power while maintaining low temperature and noise. While a GTX 980 might be overkill for just HTPC applications, it opens up a lot of possibilities for couch-style PC gaming with things like Steam Big Picture mode.
From the outside, the hand-brushed aluminum Steiger Dynamics system takes the form of traditional high-end home theater gear. At 6.85-in tall, or almost 4U if you are comfortable with that measurement system, the Maven Core is a large device, but it does not stand out in a collection of AV equipment. Additionally, when you consider the standard Blu-ray drive and the available Ceton InfiniTV Quad PCIe CableCARD tuner, giving this system the capability of replacing both a cable set top box and a dedicated Blu-ray player altogether, the size becomes easier to deal with.
Digging deeper into the hardware specs of the Maven Core we find some familiar components. The Intel Core i5-4690K sits in an ASUS Z97-A motherboard along with 8GB of Corsair DDR3-1866 memory. For storage we have a 250GB Samsung 840 EVO SSD paired with a Western Digital 3TB Hard Drive for mass storage of your media.
Cooling for the CPU is provided by a Corsair H90 with a single Phanteks fan to help keep the noise down. Steiger Dynamics shipped our system with a Seasonic Platinum-series 650W power supply, including their custom cabling option. For $100, they will ship your system with custom, individually sleeved Power Supply and SATA drive cables. The sleeving and cable management are impressive, but $100 would be a difficult upsell of a PC that you are likely never going to see the inside of.
As we mentioned earlier, this machine also shipped with a Ceton InfiniTV 4 PCIe CableCARD tuner. While CableCARD is a much maligned technology that never really took off, when you get it working it can be impressive. Our impressions of the InfiniTV can be found later in this review.
Subject: Storage | October 27, 2014 - 02:59 PM | Allyn Malventano
Tagged: Samsung, firmware, 840 evo
Over the weekend Samsung silently updated their 840 EVO Performance Restoration Tool. The incremental update improved support for some system configurations that were previously not recognizing an installed 840 EVO. Samsung also improved how the GUI progress bar responds during the update process, presumably to correct the near silent failure that occurred when the tool was unable to update the drive's firmware. Previously, the tool would halt at 15% without any clear indication that the firmware could not be updated (this would occur if the tool was unable to issue the necessary commands to the SSD, mainly due to the motherboard being in the wrong storage controller mode or using an incompatible storage driver).
Still no word on relief for those owners of the original 840 (non EVO or Pro). We've also heard from some users with Samsung OEM TLC-based SSDs that showed the same type of slow down (some variants of the PM851 apparently used TLC flash). More to follow there.
We evaluated the Samsung 840 EVO Performance Restoration Tool here. If you've already successfully run the 1.0 version of the tool, there is no need to re-run the 1.1 version, as it will not do anything additional to an EVO that has been updated and restored.
When Intel revealed their miniature PC platform in 2012, the new “Next Unit of Computing” (NUC) was a tiny motherboard with a custom case, and admittedly very little compute power. Well, maybe not so much with the admittedly: “The Intel NUC is an ultra-compact form factor PC measuring 4-inch by 4-inch. Anything your tower PC can do, the Intel NUC can do and in 4 inches of real estate.” That was taken from Intel’s NUC introduction, and though their assertion was perhaps a bit premature, technology does continue its rapid advance in the small form-factor space. We aren’t there yet by any means, but the fact that a mini-ITX computer can be built with the power of an ATX rig (limited to single-GPU, of course) suggests that it could happen for a mini-PC in the not so distant future.
With NUC the focus was clearly on efficiency over performance, and with very low power and noise there were practical applications for such a device to offset the marginal "desktop" performance. The viability of a NUC would definitely depend on the user and their particular needs, of course. If you could find a place for such a device (such as a living room) it may have been worth the cost, as the first of the NUC kits were fairly expensive (around $300 and up) and did not include storage or memory. These days a mini PC can be found starting as low as $100 or so, but most still do not include any memory or storage. They are tiny barebones PC kits after all, so adding components is to be expected...right?
It’s been a couple of years now, and the platform continues to evolve - and shrink to some startlingly small sizes. Of the Intel-powered micro PC kits on today’s market the LIVA from ECS manages to push the boundaries of this category in both directions. In addition to boasting a ridiculously small size - actually the smallest in the world according to ECS - the LIVA is also very affordable. It carries a list price of just $179 (though it can be found for less), and that includes onboard memory and storage. And this is truly a Windows PC platform, with full Windows 8.1 driver support from ECS (previous versions are not supported).
Mini-ITX Sized Package with a Full Sized GPU
PC components seem to be getting smaller. Micro-ATX used to not be very popular with mainstream enthusiasts, but that has changed as of late, and Mini-ITX is now the hot form factor, with plenty of integrated features on motherboards and interesting case designs to house them. Enthusiast graphics cards tend to be big, and that is a problem for some of these small cases. Manufacturers are responding by squeezing every ounce of cooling performance into smaller cards that better fit these small chassis.
MSI is currently offering their midrange cards in this mini-ITX form factor. The card we have today is the GTX 760 Mini-ITX Gaming. The GTX 760 is a fairly popular card, being fairly quick but not too expensive. It is still based on GK104, though heavily cut down from a fully functional die: the GTX 760 features 1152 CUDA cores divided into 6 SMXes, while a fully functional GK104 has 1536 CUDA cores and 8 SMXes. The stock clock on the GTX 760 is 980 MHz with a boost up to 1033 MHz.
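From the numbers above, a quick sanity check of how cut down the GTX 760's GK104 is, plus its peak throughput using the standard shaders × 2 FLOPs × clock formula:

```python
# GK104 configuration figures from the text above.
full_cores, full_smx = 1536, 8
gtx760_cores, gtx760_smx = 1152, 6

print(f"SMXes enabled: {gtx760_smx}/{full_smx}")
print(f"CUDA cores:    {gtx760_cores}/{full_cores} = {gtx760_cores / full_cores:.0%}")

# Peak single-precision compute at the 1033 MHz boost clock:
# cores * 2 FLOPs per clock (FMA) * clock in GHz.
boost_gflops = gtx760_cores * 2 * 1.033
print(f"Peak: {boost_gflops:.0f} GFLOPS at boost")
```

So the GTX 760 keeps 75% of the shader array, which is why it lands close enough to full GK104 cards in games while costing considerably less.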
The pricing for the GTX 760 cards is actually fairly high as compared to similarly performing products from AMD. NVIDIA feels that they offer a very solid product at that price and do not need to compete directly with AMD on a performance per dollar basis. Considering that NVIDIA has stayed very steady in terms of marketshare, they probably have a valid point. Overall the GTX 760 performs in the same general area as a R9 270X and R9 280, but again the AMD parts have a significant advantage in terms of price.
The challenges for making a high performing, small form factor card are focused on power delivery and thermal dissipation. Can the smaller PCB still have enough space for all of the VRMs required with such a design? Can the manufacturer develop a cooling solution that will keep the GPU in the designed thermal envelope? MSI has taken a shot at these issues with their GTX 760 Mini-ITX OC edition card.
Subject: General Tech | October 28, 2014 - 01:46 PM | Jeremy Hellstrom
Tagged: microsoft, win7, inevitable
It is official: at the end of this month, consumers will no longer be able to get their hands on a machine with Windows 7 installed, unless they luck into one that has been sitting on the shelves for a while. If you buy through a corporate account you will still be able to order a machine with Win7, but that will be the only way to get your hands on the OS, which is already almost impossible to find. That puts shoppers in a bit of a bind, as Win10 will not arrive for a while yet, leaving Win 8.1 as your only Microsoft-based option. Of course, there is always Linux; now that many games and distribution platforms such as Steam support the free OS, it is a viable choice for both productivity and entertainment. You can get more details at Slashdot or vent your spleen in the comments section.
"This Friday is Halloween, but if you try to buy a PC with Windows 7 pre-loaded after that, you're going to get a rock instead of a treat. Microsoft will stop selling Windows 7 licenses to OEMs after this Friday and you will only be able to buy a machine with Windows 8.1. The good news is that business/enterprise customers will still be able to order PCs 'downgraded' to Windows 7 Professional."
Here is some more Tech News from around the web:
- SUSE Linux Enterprise 12 Debuts With 'Rock-Solid' Cloud Support @ Linux.com
- Microsoft brings the CLOUD that GOES ON FOREVER @ The Register
- QuarkXpress 2015 to launch early next year with 64-bit speed boost @ The Inquirer
- Lumia 830: Microsoft hopes to seduce with slim 'affordable' model @ The Register
Subject: Mobile | October 30, 2014 - 03:38 PM | Jeremy Hellstrom
Tagged: msi, GT80 Titan, mechanical keyboard, cherry mx brown, gaming laptop
The full details are still a little sparse, but we do know one thing for sure: the MSI GT80 Titan will be the first gaming laptop with an integral mechanical keyboard, which also happens to be backlit. The laptop is an 18" model, and though it may look large in the pictures, MSI reports it will be 17% thinner and 22% lighter than similar machines. They have also incorporated the SteelSeries Engine with CloudSync to allow you to save and synchronize settings via SteelSeries cloud storage. Check out the full PR below.
City of Industry, Calif. – October 30, 2014 – MSI Computer Corp, a leading manufacturer of computer hardware products and solutions, unveils the GT80 Titan, the world’s first gaming laptop with a mechanical keyboard.
First of its kind, MSI’s GT80 Titan ushers in the future of gaming by integrating a SteelSeries gaming keyboard with Cherry MX Brown switches into the 18-inch gaming beast. Mechanical keyboards provide superior tactile feedback, increased durability, and an enhanced overall gaming experience by eliminating key jamming even during the most heated battle sessions.
“Performance is key for gamers and the GT80 Titan will forever change the mobile gaming experience,” says Andy Tung, president of MSI Pan America. “We are proud to be at the forefront of the gaming evolution and will continue to provide solutions that deliver the most outstanding gaming experience in the world.”
MSI’s newest gaming laptop uses standard Cherry switches and standard keycaps with 27mm of thickness, nearly 5 times that of traditional laptop keyboards. It is also the world’s slimmest and lightest 18-inch gaming laptop, measuring 17% thinner and 22% lighter than its closest competitor. To fully optimize the keyboard, the GT80 Titan features an enhanced SteelSeries Engine with CloudSync, allowing users to save and synchronize settings via SteelSeries cloud storage.
Subject: Mobile | October 30, 2014 - 11:40 PM | Tim Verry
Tagged: motorola, Lenovo, finance, Android
Lenovo officially acquired Motorola Mobility from Google in a deal worth $2.91 billion (both cash and stock) today. Following the acquisition, Motorola will exist as a wholly owned subsidiary of Lenovo. Motorola will retain its headquarters in Chicago's Merchandise Mart along with satellite offices (including Silicon Valley) and approximately 3,500 employees. Note that Google will retain the majority of Motorola's patent portfolio along with the Advanced Technology and Projects research division.
Lenovo now owns the Motorola brand as well as the Moto and DROID trademarks. Lenovo expects to sell 100 million smartphones within the first year following the acquisition. These smartphones will allegedly continue to feature a stock Android experience with a focus on quick OS updates. Specifically, this Motorola blog post states:
"We will continue to focus on pure Android and fast upgrades, and remain committed to developing technology to solve real consumer problems. And we will continue to develop mobile devices that bring people unprecedented choice, value and quality."
Lenovo has indicated that it plans to aggressively pursue selling Motorola devices in China, emerging markets, and even stateside. That last bit is perhaps the most interesting aspect of the buyout. Lenovo has been producing smartphones for a couple of years now, and while the mobile devices have held promise, they have yet to be made available in the US market. Now that Lenovo owns Motorola, the company has the branding power, experience, and carrier relationships to bring their devices stateside in a big way.
Google was not necessarily bad for Motorola, but the potential conflicts of interest with other Android phone manufacturers, I think, led Google to be much more reserved with Motorola when it came to producing new Android hardware. Now that Lenovo holds Motorola's future, I think the company will be free to compete with new hardware running any manner of OS, but especially Android. I'm interested to see where Motorola goes from here and the kinds of devices we'll see from the now Lenovo-owned company.
Subject: Mobile | October 29, 2014 - 09:08 PM | Tim Verry
Tagged: yoga tablet 2, Windows 8.1, Lenovo, Bay Trail, atom z3745, atom
Lenovo made a new 13-inch Windows 8.1 tablet official today rounding out the company's Yoga Tablet 2 family. The aptly named Yoga Tablet 2 With Windows (13") combines the design and hardware features of the Yoga Tablet 2 Pro with the smaller 10-inch Yoga Tablet 2 (Android or Windows) siblings. This tablet lacks the Pico projector of the Pro model, but keeps the JBL audio hardware, QHD IPS display, and kickstand. It further adds a larger version of the Bluetooth AccuType keyboard seen on the 10-inch Yoga Tablet 2 Windows model. Aimed at productivity tasks, the Bay Trail-powered PC is equipped with additional memory and storage along with an ample 12,800 mAh battery rated at up to 15 hours of general usage (including video/audio playback and web browsing). It will be available for purchase next month for $699.
The Yoga Tablet 2 with Windows 13-Inch is a 2.27 pound (tablet only) PC featuring a 2560x1440 IPS display, JBL audio with a Wolfson Master Hi-Fi codec (two front facing 1.5W stereo speakers with a rear firing 5W subwoofer), a 1.6MP webcam for video conferencing, and a bundled AccuType keyboard cover. External IO includes one micro HDMI video output, one micro USB port, a micro SD card slot, and an analog audio jack. The tablet and keyboard are all ebony black, which sets it apart from the other mostly silver-clad Yoga Tablet 2s.
Internally, Lenovo has chosen the quad core Intel Atom (Bay Trail) Z3745 clocked at 1.86GHz, 4GB of LPDDR3 memory, and 64GB of internal storage that can be expanded upon by adding a micro SD card up to 64GB. There is no cellular data support, but the tablet does include dual band 802.11n Wi-Fi and Bluetooth 4.0 radios. A large 12,800 mAh Lithium Polymer battery powers the tablet for up to 15 hours, according to Lenovo.
The tablet runs the full version of Windows 8.1 and comes with a one month trial of Office 365 (which recently started offering 'unlimited' cloud storage).
It will be available for purchase in November on Lenovo.com for $699.
I like the black design, and the inclusion of a keyboard along with Windows 8.1 makes this a better choice for business users than the Android-running Yoga Tablet 2 Pro model. The specifications look pretty good for what it is, though I question how many Lenovo will sell at that price point. You can find older-generation convertible tablets, even from Lenovo, running the faster Intel Core (Ivy Bridge and similar) chips in that price range, not to mention regular laptops should you not need the hybrid/tablet form factor. It sits in an odd middle ground between budget Bay Trail devices and starter ultrabooks, though the high resolution IPS display and audio do not hurt.
Do you think it has a place in the market and will you be picking one up?
*For reference, the 13" Yoga Tablet 2 Pro has an MSRP of $499 while the 10-inch Yoga Tablet 2 (Windows, with keyboard) has an MSRP of $399. The $200 or $300 premium (depending on the comparison) gets you (at least) a device with more memory and storage and potentially an added keyboard or a larger device.
Subject: Storage | October 28, 2014 - 01:30 PM | Allyn Malventano
Tagged: ssd, sata, Samsung, 850 EVO
Thanks to an updated SKU list and some searching, we've come across some initial photos, specs, and pricing for the upcoming Samsung 850 EVO.
You may have heard of an 850 EVO 1TB listing over at Frys, but there's actually more information out there. Here's a quick digest:
- Memory: 3D VNAND
- Read: 550MB/sec
- Write: 520MB/sec
- Weight: 0.29 lbs
Pricing (via Antares Pro listings at time of writing):
- 120GB (MZ-75E120B/AM): $100 ($0.83 / GB)
- 250GB (MZ-75E250B/AM): $146 ($0.58 / GB)
- 500GB (MZ-75E500B/AM): $258 ($0.52 / GB)
- 1TB (MZ-75E1T0B/AM): $477 ($0.48 / GB)
In addition to the above, we saw the 1TB model listed for $500 at Frys, and also found the 500GB for $264 at ProVantage. The shipping date on the Frys listing was initially November 3rd, but that has since shifted to November 24th, presumably due to an influx of orders.
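For reference, those dollars-per-gigabyte figures fall straight out of dividing the listed price by the capacity. A quick sketch (prices from the listings above; rounding to two decimals matches the figures quoted):

```python
# Price-per-gigabyte check for the listed 850 EVO SKUs.
listings = {
    "120GB (MZ-75E120B/AM)": (100, 120),
    "250GB (MZ-75E250B/AM)": (146, 250),
    "500GB (MZ-75E500B/AM)": (258, 500),
    "1TB (MZ-75E1T0B/AM)": (477, 1000),
}
for sku, (price_usd, capacity_gb) in listings.items():
    # e.g. $477 / 1000GB = $0.48 per GB
    print(f"{sku}: ${price_usd / capacity_gb:.2f} / GB")
```

Note how the per-gigabyte cost falls as capacity climbs, with the 1TB model the best value of the four.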
We'll be publishing a full capacity roundup on the 850 Pro in anticipation of the 850 EVO launch, which based on these leaks is imminent.
In conjunction with Dell World, LiteOn has announced their new EP1 M.2 PCIe SSD:
Designed primarily for enterprise workloads and usage, the EP1 sports impressive specs for such a small device. Capacities are 480 and 960GB, random 4K IO is rated at 150k/44k (R/W), sequentials are as high as 1.5GB/sec, and max latencies are in the 30-40 µs range (this spec is particularly important for enterprise OLTP / transactional database workloads). Given the enterprise specs, power loss protection is a given (and you can see the capacitors in the upper right of the above photo). Here are the full specs:
It should be noted that larger PCIe-based SSDs are rated for greater than the 1 drive write per day of the EP1, but they are also considerably larger (physically) when compared to the M.2 EP1. As an additional aside, the 960GB capacity is a bit longer than you might have seen so far in the M.2 form factor. While the 480GB model is a familiar 2280 (80mm long), the 960GB model follows the 22110 form factor (110mm long). The idle power consumption seems a bit high, but enterprise devices are typically tuned for instantaneous response over idle wattage.
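To put those random IO ratings in context, the standard conversion is IOPS × transfer size. Assuming 4 KiB transfers (our assumption, not a figure from LiteOn), a quick sketch:

```python
# Convert the EP1's rated random 4K IOPS into rough bandwidth.
# Rated figures are from the spec above; 4 KiB transfer size is assumed.
def iops_to_mb_per_sec(iops, block_bytes=4096):
    """Bandwidth in MB/s for a given IOPS rate and transfer size."""
    return iops * block_bytes / 1e6

print(f"4K random read:  {iops_to_mb_per_sec(150_000):.1f} MB/s")
print(f"4K random write: {iops_to_mb_per_sec(44_000):.1f} MB/s")
```

At rated speed, 4K random reads work out to roughly 614 MB/s, comfortably under the 1.5GB/sec sequential ceiling, which is the expected relationship for flash.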
Subject: Storage | October 29, 2014 - 03:10 PM | Allyn Malventano
Tagged: tlc, Samsung, firmware, 840
If you own a Samsung 840 SSD, it appears that after much repeated and vocal pressure, Samsung has acknowledged that the slow down also affects your drive. We're not talking about the EVO or the Pro; this is the original pure TLC model that launched (the EVO is a TLC+SLC cache hybrid, while the Pro is all MLC). Here's the quote from Samsung, via Computer Base:
Uns ist durch das Feedback, das uns erreicht hat, bekannt, dass es auch beim Zugriff auf bestimmte Daten bei Modellen der SSD 840 zu niedrigeren Leseleistungen als angegeben kommen kann.
Im Moment untersuchen unsere Produktexperten systematisch die betreffenden SSD-Modelle innerhalb verschiedener Systemumgebungen und arbeiten an einer schnellstmöglichen Lösung.
Aufgrund der unterschiedlichen Technologien sind die Modelle der PRO-Serie (840 PRO und 850 PRO) nicht betroffen.
What? You can't read German? Neither can we, but paraphrasing the poor-quality translations from several online tools, we deduce that Samsung has acknowledged the issue on the 840 and is working on a solution as quickly as possible. This is similar verbiage to the statement issued for the 840 EVO acknowledgement.
** Update **
Thanks to Zyhmet, who commented shortly after posting, here's a human translation:
Because of the feedback we received, we realized that accessing specific data on SSD 840 units can lead to lower read performance than specified.
For the moment our experts are systematically examining the affected SSD models in different system environments, and we are working on a solution as quickly as possible.
Due to different technologies, the PRO-series models (840 PRO and 850 PRO) are not affected.
** End update **
Side note - of those who have used the 840 EVO Performance Restoration Tool, a few have reported an issue cropping up. The error manifests as a SMART data misreporting error:
What's odd about this error is that it was present on some of our pre-production test samples (firmware EXT0AB0Q) and was corrected once we updated those samples to the first retail build (EXT0BB0Q). The image above was an actual screenshot taken during our temperature-dependency testing of the slow down issue. While none of our samples had the issue return when updating all the way to the performance restored firmware, one of those updates did corrupt the Master File Table, rendering the majority of the SSD inaccessible. While we have seen no other reports of corrupted partitions, several users noticed the SMART reporting issue after updating. It's odd to see this sort of regression with firmware updates, in that a bug fixed in the initial shipping firmware has returned (for some) in a subsequent update. If you've updated your 840 EVO with the Performance Restoration Tool, it may be a good time to check your SMART attributes. If you see the error above, please leave us a note in the comments.
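If you'd rather eyeball the raw attribute table than trust a GUI, a minimal sketch for sanity-checking smartmontools-style `smartctl -A` output is below. The sample lines are hypothetical (not captured from an actual 840 EVO), and this only flags the generic VALUE-at-or-below-THRESH condition, not the specific misreporting error pictured above:

```python
# Minimal sketch: scan smartctl -A style output for attributes whose
# normalized VALUE has dropped to or below the failure THRESH.
# Sample data below is hypothetical, not from a real 840 EVO.
def failing_attributes(smartctl_output):
    """Return names of attributes at or below their failure threshold."""
    flagged = []
    for line in smartctl_output.splitlines():
        fields = line.split()
        # Attribute rows start with a numeric ID and have >= 10 columns.
        if len(fields) >= 10 and fields[0].isdigit():
            value, thresh = int(fields[3]), int(fields[5])
            if thresh > 0 and value <= thresh:
                flagged.append(fields[1])
    return flagged

sample = """\
ID# ATTRIBUTE_NAME          FLAG     VALUE WORST THRESH TYPE      UPDATED  WHEN_FAILED RAW_VALUE
  5 Reallocated_Sector_Ct   0x0033   100   100   010    Pre-fail  Always       -       0
  9 Power_On_Hours          0x0032   099   099   000    Old_age   Always       -       1234
"""
print(failing_attributes(sample))  # → [] for this healthy sample
```

On a real drive you would feed it the output of `smartctl -A /dev/sdX`; anything it flags is worth a closer look before your next firmware update.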
Circling back to the slow down issue - given that it is present in two TLC-based SSDs from Samsung, one has to wonder if this issue exists in other Samsung TLC SSDs as well. Here's the list of potentials (thanks to an anonymous comment on a prior story):
- 840 EVO - 19nm TLC
- 840 - 21nm TLC
- PM841 - 21nm TLC
- PM851 - 21nm TLC (some SKUs)
- 845DC EVO - 19nm TLC
- PM843 - 21nm TLC
- PM853T - 21nm TLC
We have several questions out to Samsung on these issues, but to date they have not been answered. More to follow as we wait for an official (English) response.
Subject: Processors | October 29, 2014 - 05:44 PM | Scott Michaud
Tagged: Intel, Haswell-E, Haswell-EX, Ivy Bridge-EX
Last February, Intel launched the Xeon E7 v2 line of CPUs. Based on the Ivy Bridge architecture, they replaced the original Xeon E7s, developed from Sandy Bridge, that were released in April 2011. Intel is now planning to release Haswell-EX in the second quarter of 2015. No specific SKUs are listed; this information describes the product family as a whole.
This is Ivy Bridge-EX. Haswell-EX will have 3 extra cores (and look a bit different).
To set the tone, these are not small chips. Using the previous generation as an example, Ivy Bridge-EX was over twice the size (surface area) of Ivy Bridge-E, and it contained over twice the number of transistors. While Ivy Bridge-EX was available with up to 15 physical cores per processor (double that thread count with HyperThreading), Haswell-EX increases that to 18 cores, or 36 simultaneous threads with HyperThreading. If that is not enough cores, you can pick up an eight-socket motherboard and load it up with several of these chips.
Other than their gigantic size, these chips are fairly similar to the Xeon E5 processors that are based on Haswell-E. If you need eighteen cores per package, and can spare several thousand dollars per processor, you should be able to give someone your money in just a handful of months.
Subject: General Tech | October 27, 2014 - 12:35 PM | Jeremy Hellstrom
Tagged: Haswell-EX, Haswell-EP4S, Intel, server, xeon, Broadwell-DE, Skylake
Intel's release schedules have been slowing down; unfortunately, that is due in large part to the fact that the only competition they face in certain market segments is themselves. For high end servers it looks like we won't see Haswell-EX or EP4S until the second half of next year, and Skylake chips for entry level servers until after the third quarter. Intel does have to fight for its share of the SoC and low-powered chip markets; DigiTimes reports the Broadwell-DE family and the C2750 and C2350 should be here in the second quarter, which gives AMD and ARM a chance to gain market share against Intel's current offerings. Along with the arrival of the new chips we will also see older Itanium, Xeon, Xeon Phi, and Atom models discontinued; some may be gone before the end of the year. You have already heard the bad news about Broadwell-E.
"Intel's next-generation server processors for 2015 including new Haswell-EX (Xeon E7 v3 series) and -EP4S (Xeon E5-4600 v3 series), are scheduled to be released in the second quarter of 2015, giving clients more time to transition to the new platform, according to industry sources."
Here is some more Tech News from around the web:
- iOS 8.1 @ The Inquirer
- How to Get Open Source Android @ Linux.com
- Mozilla to make Firefox OS a tasty filling for a Raspberry Pi @ The Inquirer
- Pesky POS poison won't Backoff @ The Register
- Cisco patches three-year-old remote code-execution hole @ The Register
- Netgear Nighthawk R7000 AC1900 @ Kitguru
- Tech ARP 2014 Mega Giveaway Contest
- WIN a 1TB monster Samsung EVO 840 SSD @ The Register
Subject: General Tech | October 30, 2014 - 01:01 PM | Jeremy Hellstrom
Tagged: iPad Air 2, apple
There were long lineups of people desperate to get their hands on the new iPad Air 2, despite the fact that the internals cost a mere $1 more than those of the initial model. To be fair, that is not the best way to judge the quality of the upgrade; that should rely more on the screen quality ... which is exactly the same in all respects except for a new anti-reflective coating. Apple is also reducing its markup, from 45-61% down to a paltry 45-57% for this generation, so at least that $1.00 extra in materials will not raise your purchase price overly. The internals, such as the TSMC-made A8X and camera, match the iPhone 6 to a large extent, making it a more powerful tablet than the original, so don't disparage it too much. You can read more at The Register if you are into fruit.
"New iPad Air 2 components cost Apple just one dollar more than the previous model, according to the teardown bods at IHS."
Here is some more Tech News from around the web:
- The TR Podcast 164: We get twitchy over Apples, Nexuses, and beefy games
- Hey - who wants 4.8 TERABYTES almost AS FAST AS MEMORY? @ The Register
- Drupal Warns Users of Mass, Automated Attacks On Critical Flaw @ Slashdot
- More Microsoft staffers shown the door in Round 3 of job cuts @ The Register
Subject: General Tech | October 30, 2014 - 02:59 PM | Jeremy Hellstrom
Tagged: x99 ws, Intel X99, Haswell-E, asrock
ASRock has a Work Station class board for Haswell-E with five PCIe 3.0 slots and support for up to 128GB of RAM, which can be ECC if you install an appropriate processor; on the back are four USB 2.0 and four USB 3.0 ports, one eSATA port, audio jacks, and a pair of LAN ports. They also included A-Tuning overclocking software, which seems odd for a Work Station board but proved to be very important, as [H]ard|OCP could not get the system they built with this board to POST at default settings and had to change UEFI settings to get it to boot. Once it did start up the performance was solid, and it was one of the better ASRock boards that [H] has reviewed, though with a street price over $300 it is hard to recommend.
"ASRock comes to us with its "Work Station" version Haswell-E motherboard. This time our out-of-box experience with its X99 WS was as rock solid as it could be and did leave us with feelings of getting to work with a quality component. As you all know, we are much more interested in how it performs at high clocks while under stress."
Here are some more Motherboard articles from around the web:
- ASRock X99 Extreme11 @ The SSD Review
- ASUS X99-A Motherboard @ Hardware Secrets
- MSI Z97 Gaming 9 AC Motherboard Review @ Modders-Inc
- Asus Maximus VII Hero Motherboard Review @ TechwareLabs
- MSI X99S Gaming 9 AC @ HardwareHeaven
- MSI X99S SLI PLUS On Linux @ Phoronix
- Gigabyte Z97X-Gaming 5 @ eTeknix