Subject: Graphics Cards | November 12, 2014 - 09:03 PM | Ryan Shrout
Tagged: Unity, ubisoft, assassin's creed
Over the last couple of days there have been a lot of discussions about the performance of the new Assassin's Creed Unity from Ubisoft on current generation PC hardware. Some readers have expressed annoyance that the game is running poorly, at lower than expected frame rates, at a wide range of image quality settings. Though I haven't published my results yet, we are working on a story comparing NVIDIA and AMD GPUs in Unity, but the truth is that this is occurring on GPUs from both sides.
For example, using a Core i7-3960X and a single GeForce GTX 980 4GB reference card, I see anywhere from 37 FPS to 48 FPS while navigating the crowded city of Paris at 1920x1080 and on the Ultra High preset. Using the Low preset, that frame rate increases to 65-85 FPS or so.
Clearly, those are lower frame rates at 1920x1080 than you'll find in basically any other PC game on the market. The accusation from some in the community is that Ubisoft is either limiting performance on purpose or neglecting to ship efficient code. I put some questions to the development team at Ubisoft and, though I only had a short time with them, the answers tell their side of the story.
Ryan Shrout: What in the Unity game engine is putting the most demand on the GPU and its compute resources? Are there specific effects or were there specific design goals for the artists that require as much GPU horsepower as the game does today with high image quality settings?
Ubisoft: Assassin’s Creed Unity is one of the most detailed games on the market and [contains] a giant, open world city built to the scale that we’ve recreated. Paris requires significant details. Some points to note about Paris in Assassin’s Creed Unity:
- Tens of thousands of objects are visible on-screen, casting and receiving shadows.
- Paris is incredibly detailed. For example, Notre-Dame itself is millions of triangles.
- The entire game world has global illumination and local reflections.
- There is realistic, high-dynamic range lighting.
- Anti-aliasing is temporally stabilized.
RS: Was there any debate internally about downscaling on effects/image quality to allow for lower end system requirements?
Ubisoft: We talked about this a lot, but our position always came back to ensuring that Assassin’s Creed Unity is a next-gen only game with breakthrough graphics. With this vision, we did not degrade the visual quality of the game. On PC, we have several options for scaling down, like disabling AA and decreasing resolution, and we have low options for Texture Quality, Environment Quality, and Shadows.
RS: Were you looking forward or planning for future GPUs (or multi-GPU) that will run the game at peak IQ settings at higher frame rates than we have today?
Ubisoft: We targeted existing PC hardware.
RS: Do you envision updates to the game or to future GPU drivers that would noticeably improve performance on current generations of hardware?
Ubisoft: The development team is continuing to work on optimization post-launch through software updates. You’ll hear more details shortly.
Some of the features listed by the developer in the first answer - global illumination methods, high triangle counts, HDR lighting - can be pretty taxing on GPU hardware. I know there are people out there pointing to games that have similar feature sets and run at higher frame rates, but the truth is that no two game engines are truly equal. If you have seen Assassin's Creed Unity in action, you'll be able to tell immediately that the game is beautiful, stunningly so. Is it worth that level of detail for the performance levels achieved on current high-end hardware? Clearly that's the debate.
When I asked whether Ubisoft had considered scaling back the game to improve performance, the answer made clear they decided against it. The developer had a vision for the look and style of the game and was dedicated to it; maybe to a fault, from some gamers' viewpoint.
Also worth noting is that Ubisoft is continuing to work on optimization post-release; how much of an improvement we'll actually see from game patches or driver updates remains to be seen. Some developers have a habit of releasing a game and simply abandoning it as shipped - hopefully we will see more dedication from the Unity team.
So, if the game runs at low frame rates on modern hardware...what is the complaint exactly? I do believe that Ubisoft would have benefited from better performance at lower image quality settings. Swap the settings for yourself in game and you'll find that the quality difference between Low and Ultra High is noticeable, but not dramatically so. Again, this likely harkens back to Ubisoft's desire to maintain an artistic vision.
Remember that when Crysis 3 launched early last year, running at 1920x1200 at 50 FPS required a GTX 680, the top GPU at the time; and that was at the High settings. The Very High preset only hit 37 FPS on the same card.
PC gamers seem to be creating a double standard. On one hand, none of us want PC ports or games that are developed with consoles in mind and don't take advantage of the power of the PC platform. Games in the Call of Duty series are immensely popular but, until the release of Advanced Warfare, would routinely run at 150-200 FPS at 1080p on a modern PC. Crysis 3 and Assassin's Creed Unity are the opposite of that - games that really tax current CPU and GPU hardware, paving a way forward for future GPUs to be developed and NEEDED.
If you're NVIDIA or AMD, you should applaud this kind of work. Now I am more interested than ever in a GTX 980 Ti or an R9 390X, to see how Unity will play, what Far Cry 4 will run at, or if Dragon Age Inquisition looks even better.
Of course, if we can get more performance from a better optimized or tweaked game, we want that too. Developers need to be able to cater to as wide a PC gaming audience as possible, but sometimes creating a game that can scale between running on a GTX 650 Ti and a GTX 980 is a huge pain. And with limited time frames and budgets, don't we want at least some developers to focus on visual quality rather than "dumbing down" the product?
Let me know what you all think - I know this is a hot-button issue!
UPDATE: Many readers in the comments are bringing up the bugs and artifacts within Unity, pointing to YouTube videos and whatnot. Those are totally valid complaints about the game, but don't necessarily reflect on the game's performance - which is what we were trying to target with this story. Having crashes and bugs in the game is disappointing, but again, Ubisoft and Assassin's Creed Unity aren't alone here. Have you seen the bugs in Skyrim or Tomb Raider? Hopefully Ubisoft will be more aggressive in addressing them in the near future.
UPDATE 2: I also wanted to comment that even though I seem to be defending Ubisoft around the performance of Unity, my direct feedback to them was that they should enable modes in the game that allow it to play at higher frame rates and even lower image quality settings, even if they were unable to find ways to "optimize" the game's efficiency. So far the developer seems aware of all the complaints around performance, bugs, physics, etc. and is going to try to address them.
UPDATE 3: In the last day or so, a couple of other media outlets have posted anonymous information indicating that the draw call count of Assassin's Creed Unity is at fault for the game's poor performance on PCs. According to this "anonymous" source, while the consoles have low-level API access to the hardware and can accept and process several times the draw calls, DirectX 11 can only handle "7,000 - 10,000 peak draw calls." Unity apparently is "pushing in excess of 50,000 draw calls per frame" and thus is putting more pressure on the PC than it can handle, even with high end CPU and GPU hardware. The fact that these comments are "anonymous" is pretty frustrating, as it means that even if they are accurate, they can't be taken as the truth without confirmation from Ubisoft. If this turns out to be true, then it would be confirmation that Ubisoft didn't take the time to implement a DX11 port correctly. If it's not true, or only partially to blame, we are left with more meaningless finger-pointing.
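To put those draw call figures in perspective, here is a back-of-envelope sketch of how per-frame submission counts can cap frame rate. The per-call CPU overhead number is my own assumption for illustration, not a measured value from Ubisoft, Microsoft, or the anonymous source:

```python
# Back-of-envelope estimate: how per-frame draw call counts translate
# into CPU submission time and therefore a frame rate ceiling.

def frame_rate_ceiling(draw_calls_per_frame, us_per_call):
    """Upper bound on FPS if the render thread spends us_per_call
    microseconds of CPU time submitting each draw call."""
    frame_time_s = draw_calls_per_frame * us_per_call / 1_000_000
    return 1.0 / frame_time_s

# Assume ~2 microseconds of driver/runtime overhead per DX11 draw call
# (an illustrative ballpark, not a measured figure).
for calls in (10_000, 50_000):
    print(f"{calls} calls/frame -> at most {frame_rate_ceiling(calls, 2.0):.0f} FPS")
```

Under that assumed overhead, 10,000 calls per frame costs 20ms of CPU time (a 50 FPS ceiling) while 50,000 calls costs 100ms (10 FPS), which is why a low-level API that cuts per-call cost, or batching calls together, matters so much.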
Subject: General Tech | November 12, 2014 - 07:38 PM | Scott Michaud
Tagged: visual studio, microsoft
While this is significantly different from what we usually write about, I have a feeling that there is some overlap with our audience.
Update: If you use Visual Studio Express 2013, you may wish to uninstall it before installing Community. My experience seems to be that it thinks that both are installed to the same directory, and so uninstalling Express after installing Community will break both. I am currently repairing Community, which should fix it, but there's no sense for you to install twice if you know better.
Visual Studio Express has been the free, cut-down option for small and independent software developers. It can be used for commercial applications, but it was severely limited in many areas, such as its lack of plug-in support. Today, Microsoft announced Visual Studio Community 2013, which is a free version of Visual Studio that is equivalent to Visual Studio Professional 2013 for certain users (explained below). According to TechCrunch, while Visual Studio Express will still be available for download, Community is expected to be the version going forward.
Image Credit: Wikimedia (modified)
There are four use cases for Visual Studio Community 2013:
- To contribute to open-source projects (unlimited users)
- To use in a classroom environment for learning (unlimited users)
- To use as a tool for Academic research (unlimited users)
- To create free or commercial, closed-source applications (up to 5 users)
- You must be an individual or small studio with fewer than 250 PCs
- You must have no more than $1 million USD in yearly revenue
Honestly, this is a give-and-take scenario, but it seems generally positive. I can see this being problematic for small studios with 6+ developers, but they can (probably) still use Visual Studio Express 2013 Update 3 until it gets too old. For basically everyone else, this means that you do not need to worry about technical restrictions when developing software. This opens the avenue for companies like NVIDIA (Nsight Visual Studio Edition) and Epic Games (Unreal Engine 4) to deliver their plug-ins to the independent developer community. When I get a chance, and after it finishes installing, I will probably check to see if those examples already work.
Visual Studio Community 2013 Update 4 is available now at Microsoft's website.
Subject: Editorial | November 12, 2014 - 06:58 PM | Josh Walrath
Tagged: Wyoming Whiskey, Whiskey, Kirby, Bourbon
Last year around this time I reviewed my first bottle of Wyoming Whiskey. Overall, I was quite pleased with how this particular spirit has come along. You can read my entire review here. It also includes a little interview with one of the co-founders of Wyoming Whiskey, David Defazio. The landscape has changed a little throughout the past year, and the distillery has recently released a second product in limited quantities to the Wyoming market. The Single Barrel Bourbon selections come from carefully selected barrels and are not blended with others. I had the chance to chat with David again recently and received some interesting information from him about the latest product and where the company is headed.
Picture courtesy of Wyoming Whiskey
Noticed that you have a new single barrel product on the shelves. How would you characterize this as compared to the standard bottle you sell?
These very few barrels are selected from many and only make the cut if they meet very high standards. We have only bottled 4 so far. And, the State has sold out. All of our product has matured meaningfully since last year and these barrels have benefitted the most as evidenced by their balance and depth of character. The finish is wickedly smooth. I have not heard one negative remark about the Single Barrel Product.
Have you been able to slowly lengthen out the time that the bourbon matures until it is bottled, or is it around the same age as what I sampled last year?
Yes, these barrels are five years old, as is the majority of our small batch product.
How has the transition been from Steve to Elizabeth as the master distiller?
Elizabeth is no longer with us. She had intended to train under Steve for the year, but when his family drew him back to Kentucky in February, this plan disintegrated. So, our crew is making bourbon under the direction of Sam Mead, my partners' son, who is our production manager. He has already applied his engineering degree in ways that help increase quality and production. And he's just getting started.
What other new products may be showing up in the next year?
You may see a barrel-strength bourbon from us. There are a couple of honey barrels that we are setting aside for this purpose.
Wyoming Whiskey had originally hired Steve Nally of Maker’s Mark fame, somehow pulling him out of retirement. He was the master distiller for quite a few years and moved on from the company this past year. He is now heading up a group that is opening a new distillery in Kentucky, hoping to break into the bourbon market. They expect their first products to be aged around 7 years. As we all know, it is hard for a company to stay afloat if it is not selling product. In the meantime, it looks like this group will do what so many other “craft” distillers have been caught doing: selling bourbon produced by mega-factories and labeling it as their own.
Bourbon has had quite the renaissance in the past few years with the popularity of the spirit soaring. People go crazy trying to find limited edition products like Pappy Van Winkle and many estimate that overall bourbon production in the United States will not catch up to demand anytime soon. This of course leads to higher prices and tighter supply for the most popular of brands.
It is good to see that Wyoming Whiskey is lengthening out the age of the barrels that they are bottling, as it can only lead to smoother and more refined bourbon. From most of my tasting, it seems that 6 to 7 years is about optimal for most bourbon. There are other processes that can speed up these results, and I have tasted batches that are only 18 months old and rival that of much older products. I look forward to hearing more about what Wyo Whiskey is doing to improve their product.
Subject: General Tech | November 12, 2014 - 06:09 PM | Jeremy Hellstrom
Tagged: audio, roccat, Kave XTD 5.1, gaming headset
The name might suggest virtual surround sound, but the Roccat Kave XTD 5.1 Digital headset actually has three 40mm driver units in each earcup, giving you front, rear, and centre channels, though you can use the provided software to switch to stereo sound if you prefer. The earcups are leather over foam, which makes them quite comfortable, although they can get warm after extended periods, and the microphone boom is removable for when it would be in your way. The headset also offers noise cancellation, the ability to pair with a phone over Bluetooth, and an integrated sound card, all part of the reason it costs $150. Modders-Inc were impressed by that sound card's four speaker plugs on the rear, which let you switch between sending a 5.1 signal to the Kave XTD or to external speakers. Audio reviews are always very subjective, as it is difficult to rate perceived sound quality for anyone but yourself, but you should still check out Modders-Inc's take on the software and hardware in their full review.
"Overall I thought the Roccat Kave XTD 5.1 Digital headset is a solid performer. The audio quality from the headset is excellent. At just slightly under full volume the headset is LOUD!"
Here is some more Tech News from around the web:
- MP4Nation Brainwavz S5 @ techPowerUp
- ROCCAT Kave XTD Stereo Gaming Headset @ Benchmark Reviews
- Corsair H1500 @ HardwareHeaven
- Luxa² E-One Headset Holder Review @ TechwareLabs
- TDK A34 TREK MAX Wireless Weather Resistant Speaker Review @ NikKTech
Subject: General Tech | November 12, 2014 - 05:10 PM | Jeremy Hellstrom
Tagged: linux, amd, radeon, CS:GO, tf2
With the new driver from AMD and a long list of cards to test, from an R9 290 all the way back to an HD 4650, Phoronix has put together a rather definitive list of the current performance you can expect from CS:GO and TF2. CS:GO was tested at 2560x1600 and showed many performance changes from the previous driver, including some great news for 290 owners. TF2 was tested at the same resolution and many of the GPUs were capable of providing 60 FPS or higher, again with the 290 taking the lead. Phoronix also tested the efficiency of these cards, detailing the number of frames per second per watt used; this may not be pertinent to many users but does offer an interesting look at the efficiency of the GPUs. If you are gaming on a Radeon on Linux, now is a good time to upgrade your drivers and associated programs.
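The FPS-per-watt metric Phoronix uses is straightforward to reproduce if you log frame rate and power draw during a benchmark run; the numbers below are placeholders for illustration, not Phoronix's measurements:

```python
# Sketch of the frames-per-second-per-watt efficiency metric: average
# frame rate divided by average power draw over the benchmark run.

def fps_per_watt(avg_fps, avg_watts):
    return avg_fps / avg_watts

# Hypothetical results: (average FPS, average power draw in watts)
results = {
    "Fast GPU": (90.0, 250.0),
    "Modest GPU": (60.0, 120.0),
}
for gpu, (fps, watts) in results.items():
    print(f"{gpu}: {fps_per_watt(fps, watts):.2f} FPS/W")
```

Note how the slower card can still win on efficiency: in this made-up example the modest GPU delivers 0.50 FPS/W against the fast card's 0.36 FPS/W.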
"The latest massive set of Linux test data we have to share with Linux gamers and enthusiasts is a look at Counter-Strike: Global Offensive and Team Fortress 2 when using the very newest open-source Radeon graphics driver code. The very latest open-source Radeon driver code tested with these popular Valve Linux games were the Linux 3.18 Git kernel, Mesa 10.4-devel, LLVM 3.6 SVN, and xf86-video-ati 7.5.99."
Here is some more Tech News from around the web:
- A Spaceship For Christmas – Elite: Dangerous Dated @ Rock, Paper, SHOTGUN
- Wot I Think – Call Of Duty: Advanced Warfare Singleplayer @ Rock, Paper, SHOTGUN
- Ryse: Son of Rome PC: It’s Boring but Here’s Why You Should Still Buy It @ eTeknix
- Free Beards And Horse Armour: The Witcher 3 DLC Plans @ Rock, Paper, SHOTGUN
- Borderlands: The Pre-Sequel Review @ OCC
- Assassin's Creed: Unity widely found to be slow and buggy @ HEXUS
- Avalanche confirms Just Cause 3 for PC and next-gen consoles @ HEXUS
- The Witcher 2 And Mount & Blade Free In GOG Sale @ Rock, Paper, SHOTGUN
- Skaven Time: Warhammer’s XCOMish Mordheim Out Soon @ Rock, Paper, SHOTGUN
- Microsoft to bring back beloved 1990s super-hit BATTLETOADS!? @ The Register
Subject: General Tech | November 12, 2014 - 04:54 PM | Jeremy Hellstrom
Tagged: mozilla, oculus rift, MozVR
You have been able to browse the web on your Oculus Rift since the first dev kit, but not with a UI designed specifically for the VR device. MozVR is in development alongside specific builds of Firefox and Chromium to allow Oculus users to browse the web in a new way. It will work with both Mac and Windows, though as of yet there is no mention of Linux support, which should change in the near future. You need to get your hands on an Oculus to try out the new browser; it simply is not going to translate to the desktop. The software is open source and available on GitHub, so you can contribute to the overall design of this new way to surf the web as well as optimize your own site for VR. Check out more on MozVR and Oculus over at The Inquirer.
"MOZILLA IS CONTINUING its 10th birthday celebrations with the launch of a virtual reality (VR) website."
Here is some more Tech News from around the web:
- Elon Musk and ex-Google man mull flinging 700 internet satellites into orbit @ The Register
- Samsung slams door on OLED TVs, makes QUANTUM dot LEAP @ The Register
- Intro to Systemd Runlevels and Service Management Commands @ Linux.com
- TSMC 16FinFET Plus process achieves risk production milestone @ DigiTimes
- Iranian contractor named as Stuxnet 'patient zero' @ The Register
- Hardware Asylum Podcast - MOA 2014 Final and Surprise Lightning
Subject: Storage | November 12, 2014 - 04:44 PM | Allyn Malventano
Tagged: ssd, pcie, NVMe, Intel, DC P3500
Since we reviewed the Intel SSD DC P3700, many of you have been drooling over the idea of an 18-channel NVMe PCIe SSD, even more so given that the P3500 variant was to launch at a $1.50/GB target price. It appears we are getting closer to that release, as the P3500 has been appearing on some web sites in pre-order or out of stock status.
ShopBLT lists the 400GB part at $629 ($1.57/GB), while Antares Pro has an out of stock listing at $611 ($1.53/GB). The other two capacities are available at a similar cost/GB. We were hoping to see an 800GB variant, but it appears Intel has stuck to their initial plan. Here are the part numbers we’ve gathered, for your Googling pleasure:
PCIe add-in card:
- 400GB: SSDPEDMX400G401
- 1.2TB: SSDPEDMX012T401
- 2TB: SSDPEDMX020T401
2.5” SFF-8639 (*not SATA*):
- 400GB: SSDPE2MX400G401
- 1.2TB: SSDPE2MX012T401
- 2TB: SSDPE2MX020T401
We did spot a date of December 12th in an Amazon listing, but I wouldn't count that as a solid date, as many of the listings there had errors (like 10 packs for the price of one).
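The cost-per-gigabyte figures quoted above are easy to check for yourself:

```python
# Verify the quoted $/GB figures from the two retail listings.

def price_per_gb(price_usd, capacity_gb):
    return price_usd / capacity_gb

print(f"ShopBLT:     ${price_per_gb(629, 400):.2f}/GB")
print(f"Antares Pro: ${price_per_gb(611, 400):.2f}/GB")
```

Both work out just above Intel's original $1.50/GB target, which is about as close as street pricing ever lands at launch.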
November 12, 2014 - 08:29 AM | Sebastian Peak
Noctua has announced three new 92mm CPU coolers today, with two different replacements for the existing NH-U9B SE2 and a new cooler for Intel Xeon LGA2011 processors for workstations and servers. Each model will now use a PWM fan, the recently announced NF-A9.
Image credit: Noctua
In Noctua’s official press release, CEO Roland Mossig is quoted as saying: "The NH-U9B SE2 is still one of our most popular models. The NH-U9S and NH-D9L stay true to this proven formula but now offer even better performance, better compatibility and PWM support for automatic fan speed control."
The first of the two NH-U9B replacements is the NH-U9S, which features an asymmetrical design with 5 heatpipes. The other model will be the NH-D9L, a 4 heatpipe design that is “15mm lower than classic 9cm coolers such as the NH-U9 series (110mm vs. 125mm)”. Noctua states that this will “guarantee full 3U compliance” and also “makes the NH-D9L ideal for compact HTPC and Small Form Factor cases”. Noctua states that the 95x95mm footprint of these new coolers will clear “RAM and PCIe slots on all Intel and most AMD based mainboards, including µATX and ITX.”
The last addition to the 92mm lineup announced today is the server-specific NH-D9DX i4 3U, the replacement for the 4U model NH-U9DX i4. Noctua states that this new cooler uses “the same heatsink as the NH-D9L but comes with LGA2011 mounting for both Square ILM and Narrow ILM Xeon platforms as well as support for LGA13x6.”
The fan powering these new coolers is the NF-A9 PWM. Each cooler uses Noctua’s SecuFirm2 mounting system and comes with a 6 year warranty. Noctua states that all three models are currently shipping and will be available shortly. MSRPs are as follows: NH-U9S, $59.90 USD; NH-D9L, $56.90 USD; NH-D9DX i4 3U, $59.90 USD.
Subject: General Tech | November 12, 2014 - 04:07 AM | Tim Verry
Tagged: system requirements, pc gaming, kyrat, fps, far cry 4
In case you missed it earlier this week, Ubisoft revealed the PC system requirements needed to run Far Cry 4. Developed by Ubisoft Montreal and set to release on November 18th, Far Cry 4 is the latest action adventure FPS in the Far Cry series. The game uses Ubisoft's Dunia Engine II, a heavily modified engine created by Kirmaan Aboobaker and originally based on Crytek's CryEngine 1. The player is a Nepalese native who returns to Kyrat, a fictional location in the Himalayas, following the death of their mother, only to become embroiled in a civil war taking place in an open world filled with enemies, weapons, animals, and did I mention weapons?
This bow is a far cry from the only weapon you'll have access to...
According to the developer, Far Cry 4 continues the tradition of an open world environment, but the game world has been tweaked from the Far Cry 3 experience to be a tighter and more story focused experience where the single player story will take precedence over exploration and romps across the mountainous landscape.
While I cannot comment on how the game plays, it certainly looks quite nice and will need a beefy modern PC to run at its maximum settings. Interestingly, the game seems to scale down decently as well, with the entry level computer needed to run Far Cry 4 being rather modest.
No matter the hardware level, only 64-bit operating systems need apply: Far Cry 4 requires the 64-bit version of Windows 7 or later to run. At a minimum, Ubisoft recommends a quad core processor (Intel Core i5-750 or AMD Phenom II X4 955), 4GB of memory, a Radeon HD 5850 or GTX 460, and 30GB of storage.
To get optimal settings, users will need twice the system memory (at least 8GB) and video memory (at least 2GB), a newer quad core CPU such as the Intel i5-2400S or AMD FX-8350, and a modern NVIDIA GTX 680 or AMD Radeon R9 290X graphics card.
Anything beyond that is gravy that will allow gamers to crank up the AA and AF as well as the resolution.
Far Cry 4 will be available in North America on November 18, 2014 for the PC, PS4, Xbox One, PS3, and Xbox 360. Following the North America release, the game is scheduled to launch in Europe and Australia on November 20th, and in Japan on January 22 of next year.
Subject: General Tech | November 12, 2014 - 03:23 AM | Scott Michaud
Tagged: pc gaming, final fantasy xiii-2, final fantasy xiii, final fantasy
It seems like Square Enix has paid attention to the criticism about Final Fantasy XIII.
While it would have been nice for them to go back and fix the problems for the original game (Update Nov 12 @ 5:35pm EST: They are, in early December - Thanks TimeKeeper in the comments), it looks like the sequel, XIII-2, will behave more like a PC title. First and foremost, it will not be locked to 720p and it is said to offer other graphics options. The sequel is scheduled to launch on December 11th for $20, or $18 USD on pre-order (a few dollars above the launch price for Final Fantasy 13).
Of course, it is somewhat disappointing that screen resolution, a 60FPS cap, and graphics options are considered features, but the platform is unfamiliar to certain parts of the company. Acknowledging their error and building a better, but probably still below expectations, product is a good direction. Hopefully they will continue to progress, and eventually make PC games with the best of them. Either that, or they have a talk with their Eidos arm about borrowing Nixxes, a company that specializes in enhancing games on the PC.
Final Fantasy XIII-2 is coming to Steam in a month for $20 USD. The third installment, Lightning Returns, will arrive sometime in 2015.
Subject: General Tech, Systems | November 11, 2014 - 11:11 PM | Scott Michaud
Tagged: haswell-t, haswell, fanless
This one is more for our European readers, because this company operates out of Germany, but the Cirrus7 Nimbus is an interestingly designed, fanless system. Its finned shape is said to be assembled out of laser-cut layers of aluminum that sandwich in the I/O plate at the rear. FanlessTech has noted that the systems are now available with Haswell processors, up to a Core i7 based on Haswell-T. Their storage options now also include the Samsung 850 Pro, up to 1TB.
Image Credit: Cirrus7 via FanlessTech
The customization options are actually pretty decent. I find a lack of meaningful upgrade options to be a problem with modern PC builders, but that does not apply here. Eight CPUs are offered, ranging from a Celeron up to a 45W Haswell-T Core i7; RAM comes in 4GB, 8GB, or 16GB; up to three drives can be installed (one mSATA and two SATA); Intel Wireless N or AC is available; external DVD or Blu-ray burners are an option; and one of seven OSes can be installed, including two versions of Linux (Ubuntu 14.04 or Ubuntu 14.10). If you get all of the bells and whistles, you are probably up to about 3,000 USD, but you cannot expect two terabytes of Samsung 850 Pro SSDs to be cheap. It seems reasonable enough, especially for the EU. The big limiter is the lack of a discrete GPU, unless you are using this device for something like audio recording, which an Intel HD 4600 can easily handle.
The Cirrus7 Nimbus is available now at their website.
Subject: Storage | November 11, 2014 - 05:32 PM | Allyn Malventano
Tagged: Intel, ssd, dc s3500, M.2
Today Intel refreshed their Datacenter Series of SSDs, specifically their DC S3500. We have reviewed this model in the past. It uses the same controller that is present in the S3700, as well as the SSD 730 Series (though it is overclocked in that series).
The full line of Intel Datacenter SSDs (minus the P3700). DC S3500 is just right of center.
Today's refresh brings higher capacities to the S3500, which now includes 1.2TB and 1.6TB models at the high end. This suggests that Intel is stacking 20nm dies as many as 8 to a package. IOPS performance sees a slight penalty at these new higher capacities, while maximum sequentials are a bit higher due to the increased die count.
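As a rough sanity check on that die-stacking claim, here is the arithmetic, assuming Intel's 20nm MLC NAND die is 64Gbit (8GB); the die size and package layout are my assumptions, not figures Intel has confirmed for this drive:

```python
# Rough capacity arithmetic for the 1.6TB S3500 under assumed figures.
DIE_GB = 8           # assumed 64Gbit (8GB) 20nm MLC NAND die
DIES_PER_PACKAGE = 8 # 8-high die stacking, as suggested above

package_gb = DIE_GB * DIES_PER_PACKAGE
packages_for_1_6tb = 1600 / package_gb  # user capacity only, ignores spare area
print(f"{package_gb}GB/package -> {packages_for_1_6tb:.0f} packages for 1.6TB")
```

That lands at a package count (25, before overprovisioning) plausible for a 2.5" drive's PCB, which is why 8-die stacking is the natural explanation for the new capacity points.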
Also announced was an M.2 version of the S3500. This packaging is limited to only a few capacity points (80GB, 120GB, 340GB) and is primarily meant for applications where data integrity is critical (e.g. ATMs, server boot partitions).
A standard press blast was unavailable, but full specs are listed after the break.
Subject: General Tech | November 11, 2014 - 03:10 PM | Scott Michaud
Tagged: pc gaming, gaming, eff, DRM, consolitis
This is something that I have been saying for quite some time now: games are struggling as an art form. Now I don't mean that games are not art; games, like all content that expresses feelings, thoughts, and ideas, are art. No, I'm talking about their ability to be preserved for future society and scholarly review. The business models for entertainment are based in either services or consumables. In the entertainment industries, few (but some) producers are concerned about the long tail – the extreme back-catalog of titles. Success is often determined by two weeks of sales, and the focus is on maximizing those revenues before refreshing with newer, similar content that scratches the same itch.
DRM is often justified as maximizing the initial rush by degrading your launch competitors: free versions of yourself. Now I'm not going to go into the endless reasons about where this fails to help (or actively harms) sales and your customers; that is the topic of other rants. For this news post, I will only discuss the problems that DRM (and other proprietary technologies) have on the future.
When you tie content to a platform, be it an operating system, API, or DRM service, you are trusting it for sustainability. This is necessary and perfectly reasonable. The problems arise with the permissions given to society from that platform owner, and how easily society can circumvent restrictions, as necessary. For instance, content written for a specific processor can be fed through an emulator, and the instruction sets can be emulated (or entirely knocked off) when allowed by patent law, if patents even interfere.
Copyright is different, though. Thanks to the DMCA, it is illegal, a federal crime at that, to circumvent copyright protection even for the betterment of society. You know, society, the actual owner of all original works, but who grants limited exclusivity to the creators for “the progress of Science and useful Arts”. Beyond the obvious and direct DRM implementations, this can also include encryption that is imposed by console manufacturers, for instance.
The DMCA is designed to have holes poked into it, however, by the Librarian of Congress. Yes, that is a job title. I did not misspell “Library of Congress”. The position has been held by James H. Billington for over 25 years. Every three years, he considers petitions to limit the DMCA and adds exceptions in places that he sees fit. In 2012, he decided that jailbreaking a phone should not be illegal under the DMCA, although tablets were not covered under that exemption. This is around the time that proposals will be submitted for his next batch of exemptions in late 2015.
This time, the EFF is proposing that circumventing DRM in abandoned video games should be deemed legal, for society to preserve these works of art when the copyright holders will not bother. Simply put, if society intended to grant a limited exclusive license to a content creator who has no intention of making their work available to society, then society demands the legal ability to pry off the lock to preserve the content.
Of course, even if it is deemed legal, stronger DRM implementations could make it technologically unfeasible to preserve certain works. We are still a long way from encountering a lock that society cannot crack, but it is theoretically possible. This proposal does not address that root problem, but at least it could prevent society's greatest advocates from being slapped with a pointless felony for trying to do the right thing.
Subject: General Tech, Systems, Mobile | November 11, 2014 - 03:27 AM | Scott Michaud
Tagged: usb computer, Raspberry Pi B+, Raspberry Pi, Education
The Raspberry Pi was intended as a learning device. David Braben, best known for Elite (and, through his studio Frontier Developments, RollerCoaster Tycoon 3), noticed that computer science education was lacking and wanted to contribute to its advancement with a cheap, portable, and highly-programmable PC. Yesterday, the organization announced a new model, the Raspberry Pi A+, which is (theoretically) cheaper, smaller, and comes with a few improved components. This announcement follows the release of the Raspberry Pi B+ last July.
I say “theoretically cheaper” because, although the organization is touting a price reduction from $25 to $20 USD, that always depends on the reseller. MCM Electronics, one of the foundation's US-based distributors, is selling the A+ for its list price of $20 (plus an extra ~$10 in shipping, before tax). In the UK, however, the currency conversion works out to about $25 before VAT. That said, the UK is known to be expensive for electronics.
Whatever the price, the device is slightly improved. While it keeps the same Broadcom BCM2835 SoC and RAM, the SD card slot has been upgraded to a locking microSD slot, the audio's power delivery has been improved to reduce noise, and the number of GPIO pins has been increased from 26 to 40. The latter enhancement will allow the Pi to interface with more, and different, sensors and motors for robotics and other embedded applications.
The Raspberry Pi A+ and B+ are both currently on backorder for $20 and $35, respectively, before a $10 shipping fee and any applicable taxes.
Subject: General Tech, Graphics Cards | November 10, 2014 - 10:07 PM | Ryan Shrout
Tagged: video, Unity, pcper, nvidia, live, GTX 980, geforce, game stream, assassins creed
UPDATE: If you missed the live stream event: good news! We have it archived on YouTube now and embedded below for your viewing pleasure!
Assassin's Creed Unity is shaping up to be one of the defining games of the holiday season, with visuals and gameplay additions that are incredible to see in person. Scott already wrote up a post that details some of the new technologies found in the game, along with a video of the impressive detail the engine provides. Check it out!
To celebrate the release, PC Perspective has partnered with NVIDIA to host a couple of live game streams that will feature some multi-player gaming fun as well as some prizes to give away to the community. I will be joined by some new NVIDIA faces to take on the campaign in a cooperative style while taking a couple of stops to give away some hardware.
Assassin's Creed Unity Game Stream Powered by NVIDIA
5pm PT / 8pm ET - November 11th
Need a reminder? Join our live mailing list!
Here are some of the prizes we have lined up for those of you that join us for the live stream:
- 5 x Assassin's Creed Unity Steam Keys
- 10 x NVIDIA SLI Bridges - From NVIDIA Direct
- 1 x ASUS ROG Swift PG278Q G-Sync Monitor - PC Perspective Review
- 1 x Acer XB280HK 28-in 4K G-Sync Monitor - PC Perspective Review
Another awesome prize haul!! How do you win? It's really simple: just tune in and watch the Assassin's Creed Unity Game Stream Powered by NVIDIA! We'll explain the methods to enter live on the air and anyone can enter from anywhere in the world - no issues at all!
So stop by Tuesday night for some fun, some gaming and the chance to win some goods!
Subject: Graphics Cards | November 10, 2014 - 03:45 PM | Jeremy Hellstrom
Tagged: asus, strix, GTX 970 STRIX DirectCU II OC, GTX 970, nvidia, maxwell
When ASUS originally kicked off its new STRIX line, the cards gained popularity not only for their decent overclocks and efficient custom coolers but also because there was only a small price premium over the base models. At $400 on Amazon, this card is priced in line with other overclocked models, though some base models can be had for up to $50 less. [H]ard|OCP investigated the card to see what benefits you could expect from it in this review, comparing it to the R9 290 and 290X. Out of the box the card runs at a core clock of 1253-1266MHz with 7GHz memory; with a bit of overvolting they saw a stable core of 1473-1492MHz and memory at 7.832GHz.
With the price of the 290X dipping as low as $330, it makes for an interesting choice for GPU shoppers. The NVIDIA card is far more power efficient, and its fans remain at 0dB until the GPU hits 65C, which [H] did not see until after running at full load for some time; even then, the highest their manually overclocked card hit was 70C. On the other hand, the AMD card costs $70 less and offers very similar performance. It is always nice to see competition in the market.
"Today we examine ASUS' take on the GeForce GTX 970 video card. We have the ASUS GTX 970 STRIX DirectCU II OC video card today, and will break down its next-gen performance against an AMD Radeon R9 290 and R9 290X. This video card features 0dB fans, and many factors that improve its chance of extreme overclocking."
Here are some more Graphics Card articles from around the web:
- Alien: Isolation - Video Card Performance @ [H]ard|OCP
- GTX 970 Roundup (EVGA, GALAX, Gigabyte) @ Hardware Canucks
- MSI GTX 980 Gaming 4G Review @ OCC
- Nvidia GeForce GTX 980 @ X-bit Labs
- MSI GeForce GTX 970 Gaming 4G @ X-bit Labs
- Gainward GTX 980 & GTX 970 Phantom @ Legion Hardware
- NZXT Kraken G10 Graphics Adapter: Cooler, and also Quieter @ Silent PC Review
- AMD's Windows Catalyst Driver Remains Largely Faster Than Linux Drivers @ Phoronix
- HIS R9 285 IceQ X2 OC 2GB GDDR5 Video Card Review @ Madshrimps
- Sapphire R9 290X Vapor X 8GB CF @ Kitguru
- Sapphire Radeon R9 290X Vapor-X OC 8GB @ eTeknix
Subject: General Tech | November 8, 2014 - 09:22 PM | Scott Michaud
Tagged: Starcraft II, starcraft, lotv, legacy of the void, blizzcon 2014, blizzcon, blizzard
Blizzard has been reconsidering what constitutes "a game sale" with StarCraft for quite some time now. They have been slowly carving out its mod platform, StarCraft Arcade, into a standalone, free product. They allow playing multiplayer with limitations, such as forcing free players to choose Terran (except for certain promotions). A few years into StarCraft II's release, they even added "Spawning" to allow Starter and Wings of Liberty users to play locked content as long as a party member has purchased it, although Starter users are still locked to Terran.
Today's announcement is a little more conventional -- Legacy of the Void will be a standalone expansion. You can purchase it without owning any earlier content. If you do own Wings of Liberty and/or Heart of the Swarm, then it will behave like an expansion, however.
The game itself will change significantly, too. At the competitive level, you often have a bit of a boring early game, unless one player decides to be a bit cheesy with their tactics. A lot of this is due to how long it takes to get from your initial six workers to being supply blocked. In Legacy of the Void, you start with 12 workers, twice as many as before. Also, each mineral patch has 33% fewer minerals, requiring bases to be taken more frequently and discouraging a maxed-out army from sitting on a handful of expansions to build up a bank.
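For a rough sense of what that economy change means, here is a back-of-the-envelope sketch in Python. The 1,500-minerals-per-patch and eight-patches-per-base figures are the standard Heart of the Swarm baseline values (assumptions on my part); the 33% reduction comes straight from the announcement, so treat the result as illustrative rather than exact.

```python
# Back-of-the-envelope math for the Legacy of the Void economy change.
# Assumed baseline (standard Heart of the Swarm values): 1500 minerals
# per patch, 8 patches at a typical base.
HOTS_PATCH = 1500
PATCHES_PER_BASE = 8

# "33% fewer minerals" per patch; integer math keeps the figure exact.
lotv_patch = HOTS_PATCH * 2 // 3

hots_base = HOTS_PATCH * PATCHES_PER_BASE   # minerals per base before
lotv_base = lotv_patch * PATCHES_PER_BASE   # minerals per base after

print(lotv_patch)            # 1000
print(hots_base, lotv_base)  # 12000 8000
```

In other words, each base mines out a third faster, which is exactly the pressure toward taking more expansions that Blizzard describes.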
Many units were added and changed as well. Terran and Protoss are being pushed toward dropping units. The Warp Prism's pickup range has been increased, allowing it to grab and reposition units from anywhere within a relatively large army ball without putting the transport in danger. Terrans, on the other hand, are able to pick up Siege Tanks while they are in Siege Mode. This allows a Terran player, who is paying close attention, to drop a tank for a quick, high-damage, splashing shot, and then pick it up before it can be attacked. Siege Tanks have long range, a slow rate of fire, and relatively low health. If they can avoid being shot at while reloading their main cannon, though, that nullifies their weakness, as long as you can keep the Medivac alive, too.
One thing that Blizzard disliked, however, seems to be Swarm Hosts. In Heart of the Swarm, matches went on for hours, literally hours, as one player turtled in a corner of the map (or pinned an opponent into a corner of the map) with free units. This was particularly problematic for Protoss, which has a highly efficient, ball-based army, and Zerg, which could counter with their own Swarm Hosts. Battles were commonly wave after wave of free units doing zero (or minimal) damage, ad infinitum.
In Legacy of the Void, Swarm Hosts do not spawn Locusts (free units) fast enough to pin someone down or keep someone out, and these Locusts must be spawned manually. Instead, the unit is intended more as a sieging tool, capable of dropping free units into a base and walking away. Swarm Hosts also cannot burrow unless that upgrade is acquired, which will make them easier to attack. On the other hand, the Locusts can now fly to their target, where they must land to attack as normal, so the Swarm Host does not need to be in a dangerous location, just within a potentially dangerous range. Whether Swarm Hosts upgraded with Burrow can release Locusts while hidden is unclear; it is not something I have seen yet. That said, the burrowed, space-control unit is now the Lurker, a Brood War alumnus.
Many other changes have been announced, but it always comes down to user testing.
As usual for a Blizzard title, no official release date has been given. A private beta will be "coming soon" to selected participants. It was also available to play at Blizzcon.
Subject: General Tech | November 8, 2014 - 07:59 PM | Scott Michaud
Tagged: overwatch, blizzcon 2014, blizzcon, blizzard
Blizzard has announced Overwatch, a new franchise to expand their portfolio. It was unveiled at the Blizzcon keynote with a cinematic trailer followed immediately by gameplay footage. The first video looks significantly different from other Blizzard cinematics. It follows a Walt Disney Animation Studios art style, including exaggerated facial features and animations, versus the game company's normal dulled realism. It would look at home alongside "Bolt", for instance.
The gameplay itself is compared to Team Fortress 2. It is a class-based first-person shooter with an assortment of game types. The first two announced modes will probably sound very familiar to most of our fans: Point Capture and Payload (yes, that Payload). The classes are described more like MOBA heroes, however, though multiple players are (said to be) able to use the same class. Apart from the character design, they seem to be functionally TF2 classes. Maybe the difference is just that their names do not define what they do?
There are several similarities and differences between the two games. The classes seem to borrow from Team Fortress, with a comfortable embrace of magic and abilities. There are at least two engineer-style characters that can build turrets, and at least one of them can build a teleporter. One difference is that there seems to be a focus on parkour and movement abilities, grappling hooks in particular.
There are also a couple of guesses about where this game came from. The funny, albeit likely incorrect, theory is that, after Valve took the reins of DOTA, Blizzard decided to take on Team Fortress 2 and push into Valve's turf (although Gabe Newell has described the relationship between the two companies as "friends"). More likely, as Paul Tassi reported on Forbes, Overwatch is a remnant of Titan, possibly one of its intended PvP modes. If this is a spin-off of Titan, it makes me wonder exactly what kind of engine they were developing for an MMO that could also be comfortable as a first-person shooter. That said, it is not uncommon to see versatile engines in recent years, such as Source and Unreal Engine 4.
Overwatch will be going into a multiplayer beta in 2015, seemingly early in the year. It is interesting to see Blizzard go into a vastly different genre than their usual, especially from a technology standpoint.
Subject: Cases and Cooling | November 7, 2014 - 01:50 PM | Jeremy Hellstrom
Tagged: corsair, Graphite 780T
The Graphite 780T stands 689 x 332 x 670 mm (27 x 13 x 26"), which gives you a lot of space to install your system. The cooling options are similarly impressive: you can install up to six 140mm fans or nine 120mm fans, and watercoolers can fit up to a 360mm radiator on the top or front, a 240mm on the bottom, or a 140mm on the back. In addition to the drive cages with tool-less installation at the front of the case, you can also install three 2.5" drives on the back side of the case. If you want to build a system with an XL-ATX motherboard, the biggest CPU cooler you can get your hands on, and several of the largest GPUs on the market, this case will take them all and still leave you with plenty of space. Check out the full review at Overclockers Club.
"To follow up, the Graphite 780T has many positive things making it well worth the asking price. I don't have time to write out each in detail or this would go on forever, so I'm just going to cover the things that make it stand out. First up, having support for every aftermarket CPU cooler is a major advantage. When I say every single one, it's because nothing has topped 200mm yet and that would just be purely insane."
Here are some more Cases & Cooling reviews from around the web:
- NZXT S340 Mid-Tower Case @ Benchmark Reviews
- Fractal Design Core 1100 @ Benchmark Reviews
- Fractal Design Node 804 Micro-ATX @ eTeknix
- DimasTech Bench/Test Easy V3.0 Review @ Modders-Inc
- Aerocool Strike-X Cube White Edition @ Kitguru
- Phobya 360LT Pure Performance Watercooling Kit Review @ NikKTech
- Swiftech H240-X Open Loop 280mm CPU Cooler Review @ HiTech Legion
- Reeven Okeanos @ techPowerUp
Subject: General Tech | November 7, 2014 - 12:54 PM | Jeremy Hellstrom
Tagged: google, barges, mysterious
Not even Google is able to defeat the enforcement powers of local fire marshals, which is why the mysterious barges are no longer anchored off the coast of San Francisco. It seems they may not have met the fire safety rules required by law and so have departed for places unknown. The various theories that attempted to explain the barges, from floating data centres to a project to secede from the USA, were far more entertaining than the truth, but perhaps we can enjoy a resurgence of entertaining internet hypothesizing now that the barges have disappeared. The Inquirer did get a chance to speak with Google about the barges, and it turns out that they were simply a very unique way to set up a display room to show off Google's newest projects.
"TWO MYSTERIOUS BARGES moored by Google off the coast of the US last year were apparently moved because coastguards feared they did not conform to fire regulations."
Here is some more Tech News from around the web:
- Website Peeps Into 73,000 Unsecured Security Cameras Via Default Passwords @ Slashdot
- Microsoft's November Patch Tuesday is a whopper @ The Inquirer
- Microsoft releases free anti-malware for Azure VMs @ The Register
- Microsoft improves Azure SQL Server cloud service, simultaneously makes it worse @ The Register
- Inside the OC Lab at MSI HQ in Taipei: KitGuru TV