Subject: Graphics Cards | November 14, 2014 - 11:46 AM | Ryan Shrout
Tagged: sli, nvidia, N980X3WA-4GD, maxwell, GTX 980, gigabyte, geforce, 3-way
Earlier this week, a new product showed up on Gigabyte's website that has garnered quite a bit of attention. The GA-N980X3WA-4GD WaterForce Tri-SLI is a 3-Way SLI system with integrated water cooling powered by a set of three GeForce GTX 980 GPUs.
That. Looks. Amazing.
What you are looking at is a 3-Way closed loop water cooling system with an external enclosure that holds the radiators while providing a display full of information, including temperatures, fan speeds and more. Specifications on the Gigabyte site are limited for now, but we can infer a lot from them:
- WATERFORCE: 3-Way SLI Water Cooling System
- Real-Time Display and Control
- Flex Display Technology
- Powered by NVIDIA GeForce GTX 980 GPU
- Integrated with 4GB GDDR5 memory and a 256-bit memory interface (Single Card)
- Features Dual-link DVI-I / DVI-D / HDMI / DisplayPort x3 (Single Card)
- BASE: 1228 MHz / BOOST: 1329 MHz
- System power supply requirement: 1200W (with six 8-pin external power connectors)
The GPUs on each card are your standard GeForce GTX 980 with 4GB of memory (we reviewed it here), though they are running at overclocked base and boost clock speeds, as you would hope with all that water cooling behind them. You will need a 1200+ watt power supply for this setup, which makes sense considering the GPU horsepower you'll have access to.
Another interesting feature Gigabyte is listing is called GPU Gauntlet Sorting.
With GPU Gauntlet™ Sorting, the Gigabyte SOC graphics card guarantees the higher overclocking capability in terms of excellent power switching.
Essentially, Gigabyte is going to make sure that the GPUs on the WaterForce Tri-SLI are the best they can get their hands on, with the best chance for overclocking higher than stock.
The setup looks interesting: the radiators and fans sit in the external enclosure, with tubing passing into the system through a 5.25-in bay. It will need quick connect/disconnect points at either the GPU or radiator end to make that installation method possible.
Pricing and availability are still unknown, but don't expect to get it cheap. With the GTX 980 still selling for at least $550, you should expect something in the $2000 range or above with all the custom hardware and fittings involved.
Can I get two please?
Subject: General Tech | November 13, 2014 - 06:09 PM | Jeremy Hellstrom
Tagged: Cherry MX clear, WASD Keyboards, CODE, input, mechanical keyboard, tenkeyless
Scott posted about the WASD Keyboards CODE with Cherry MX Clear switches, but until now we had not found a review of this keyboard. The Tech Report has changed that with this review, which takes a look at the new type of switch that sits between the Brown and the Green: Clear switches need more force to bottom out than a Brown, but not as much as the clicky style Green switches. That is not all this tenkeyless board offers; there are LEDs that can be activated by the DIP switches in the recess found on the back of the keyboard. In fact, those DIP switches can do more than just enable a nice glow: you can disable the Windows key or even immediately switch to different layouts such as Mac, Dvorak, and Colemak, though sadly they left Sinclair ZX off of the list. If this type of switch interests your fingers and you are willing to spend $150 on a keyboard, check out the full review here.
"We've been meaning to try out Cherry MX's clear key switches for a while, and now, we've finally gotten our wish. Join us for a look at WASD Keyboards' Cherry MX clear-infused Code keyboard, a tenkeyless offering with more than a few tricks up its sleeve."
Here is some more Tech News from around the web:
- COUGAR 700K Mechanical Gaming Keyboard Review @ Techgage
- Cougar 700K Mechanical Keyboard @ Hardware Heaven
- AORUS Thunder K7 Gaming Keyboard @ Benchmark Reviews
- Thrustmaster Ferrari GT Cockpit 458 Steering Wheel @ eTeknix
- Corsair Gaming RGB M65 @ Kitguru
- GAMDIAS ZEUS Laser Gaming Mouse @ Tech ARP
- Tesoro Gandiva H1L Laser Gaming Mouse Review @ NikKTech
Subject: Systems | November 13, 2014 - 04:03 PM | Jeremy Hellstrom
Tagged: zotac, zbox ci540 nano, fanless, haswell, i5-4210Y
The Zotac ZBOX CI540 Nano is a bit more powerful than your average Bay Trail based mini-PC: it sports a Haswell-based dual-core i5-4210Y, which runs between 1.5 and 1.9GHz and has Intel's HD 4200 onboard. This won't play AC: Unity, but it comes close to matching a NUC containing a Core i5-4250U; you give up a bit of horsepower for completely silent operation, and for media it has enough power to watch your favourite videos. As you look at Silent PC Review's article you can see the honeycomb-patterned knockouts on the casing, which allow heat to dissipate and will also let in liquid if you don't put some thought into where you place the ZBOX. It does have Bluetooth, and there is an unofficial optional IR receiver that makes it easy to place this tiny computer somewhere safe.
"The Zotac ZBOX CI540 Nano gives up a little CPU/GPU horsepower to deliver a completely fanless, silent and full-featured mini-PC experience."
Here are some more Systems articles from around the web:
- Zotac ZBox PI320 Mini-PC Review and Teardown @ The SSD Review
- Zotac Zbox Pico @ HardwareHeaven
- Lenovo Horizon 2 AIO Desktop Computer @ Benchmark Reviews
- Logic Supply ML400G-50 Fanless m-ITX PC @ Silent PC Review
- Habey MITX-6771 Bay Trail Embedded Motherboard @ Silent PC Review
Subject: General Tech | November 13, 2014 - 03:19 PM | Ken Addison
Tagged: podcast, video, Intel, core m, core m 5y70, Broadwell, broadwell-y, Lenovo, yoga 2 pro, yoga 3 pro, assasins creed unity, ubisoft, farcry 4, p3500, gskill blade
PC Perspective Podcast #326 - 11/13/2014
Join us this week as we discuss Intel's Core M 5Y70, Assassin's Creed Unity, Intel P3500 and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:09:49
Subject: Systems, Mobile | November 13, 2014 - 02:48 PM | Scott Michaud
Tagged: shield tablet, shield, nvidia, grid, geforce grid
Today, NVIDIA has announced the November update for their SHIELD Tablet, which is really about three announcements that are rolled up together.
As expected, the SHIELD Tablet is getting updated to Android 5.0 Lollipop and its new, “Material Design” style guide. NVIDIA took the opportunity to refresh the SHIELD HUB (my shift key must think that this is an MSI announcement by now...) in the same design specification. While interesting, the two other announcements probably beat it out, especially the GRID streaming service (and how it relates to the Xbox One and the PlayStation 4).
But before we get to GRID, let's talk “The Green Box”. In May, NVIDIA sent us a green crowbar to mark the availability of Half-Life 2 and Portal on the NVIDIA SHIELD. These were full, native ports of the PC titles to ARM and Android that are exclusive to the NVIDIA SHIELD. With the November update, Half-Life 2: Episode One has also been ported to the platform. The three games, Portal, Half-Life 2, and Episode One, are also packaged in “The Green Box” bundle, which will be included free-of-charge with the SHIELD Tablet 32GB. Note that, while the games are included with the tablet, they require a controller to play, which is not included.
Now we talk about GRID.
Netflix is a popular service where people can watch a variety of movies from their rolling catalog. It will not replace ownership of certain, intrinsically valuable titles, but there are probably options for anyone who wants to consume some form of entertainment. GRID is a similar service for video games, and it is not the first. We took a look at a preview of OnLive in 2010, connecting to a server about 2400 miles away, which is over twice the maximum intended range, and found the experience somewhat positive for games other than Unreal Tournament 3 at that relatively extreme latency. Another company, Gaikai, was purchased by Sony and rebranded as PlayStation Now; it will serve up a selection of games from the PS3 catalog. Again, content on these services can be pulled at any time, but if you are just looking for the entertainment value, something else will probably be there to scratch your itch.
The interesting part that I have been teasing throughout this entire post is the performance of NVIDIA GRID. PlayStation Now is rated at 192 GFLOPs, which is the theoretical GPU compute throughput of the PS3's RSX chip. GRID, on the other hand, is rated for 2448 GFLOPs (~2.5 TFLOPs). This is higher than the PlayStation 4, and almost twice the GPU performance of the Xbox One. On the PC side, it is roughly equivalent to the GeForce GTX 760 Ti.
This compute rating has a hidden story, too. Back in 2011, Epic Games demoed “Samaritan” in Unreal Engine 3. This was the bar that Epic Games set for Microsoft, Sony, and Nintendo to mark a new console generation. When Unreal Engine 4 was unveiled at the end of E3 2012, it was embodied in the Elemental Demo, which also ran at (you guessed it) 2.5 TFLOPs. At the PlayStation 4 (1.9 TFLOPs) announcement, the demo was scaled back with reduced particles and lighting complexity. It was not shown at either Xbox One (1.3 TFLOPs) announcement at all.
What all of that means is simple: NVIDIA GRID is the only fixed hardware platform (that I am aware of) to meet Epic's vision of a next-gen gaming system. I say fixed, of course, because the PC can more than double it per card, with some games scaling to four discrete GPUs. This also says nothing about the CPU performance, system memory, or video memory, but it has the GPU in the right place for a next-gen platform.
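As a quick sanity check, the rated figures quoted above reduce to simple ratios; this sketch uses only the article's numbers:

```python
# Rated theoretical GPU throughput in GFLOPs, as quoted in the article
ps3_rsx  = 192     # PlayStation 3 (RSX)
xbox_one = 1300    # ~1.3 TFLOPs
ps4      = 1900    # ~1.9 TFLOPs
grid     = 2448    # NVIDIA GRID, ~2.45 TFLOPs

print(f"GRID vs. PS3:      {grid / ps3_rsx:.2f}x")
print(f"GRID vs. Xbox One: {grid / xbox_one:.2f}x")  # ~1.88x, "almost twice"
print(f"GRID vs. PS4:      {grid / ps4:.2f}x")       # ~1.29x, higher than the PS4
```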
The NVIDIA GRID preview will launch in November for North America, with East Coast and West Coast servers. It will expand in December for Western Europe, and in “Q2” for Asia Pacific. The service will be free for SHIELD users until June 30th, 2015. The Android 5.0 Update for the SHIELD Tablet will be available on November 18th.
Subject: General Tech | November 13, 2014 - 01:39 PM | Jeremy Hellstrom
Tagged: cycle computing, supercomputer, gojira, bargain
While the new Gojira supercomputer is not more powerful than the University of Tokyo's Oakleaf-FX at 1 petaflop of performance, from a price-to-performance perspective the $5,500 Gojira is more than impressive. It has a peak theoretical performance of 729 teraflops, using over 71,000 Ivy Bridge cores across several Amazon Web Services regions and providing the equivalent of 70.75 years of compute time. The cluster was built in an incredibly short time, going from zero to 50,000 cores in 23 minutes and hitting the peak after 60 minutes. You won't be playing AC Unity on it any time soon, but if you want to rapidly test virtual prototypes these guys can do it for an insanely low price. Catch more at The Register and ZDNet; the Cycle Computing page seems to be down for the moment.
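The price-to-performance claim is easy to check with rough arithmetic; note that the eight-hour run length is an assumption drawn from the quote about compressing a month's worth of simulations into eight hours:

```python
# Back-of-the-envelope on the article's figures (the 8-hour run length
# is an assumption, not a stated specification of the cluster)
cost_usd    = 5_500
peak_tflops = 729
cores       = 71_000
run_hours   = 8

dollars_per_tflop = cost_usd / peak_tflops
core_years = cores * run_hours / (24 * 365)

print(f"${dollars_per_tflop:.2f} per peak teraflop")  # ~$7.54
print(f"~{core_years:.0f} core-years of compute, near the quoted 70.75")
```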
"Cycle Computing has helped hard drive giant Western Digital shove a month's worth of simulations into eight hours on Amazon cores."
Here is some more Tech News from around the web:
- Mac OS X Yosemite disables third-party SSD driver support @ The Inquirer
- Digitimes Research: Lenovo, Asustek to launch US$149 Chromebook
- DAY ZERO, and COUNTING: EVIL 'UNICORN' all-Windows vuln - are YOU patched? @ The Register
- FLASH better than DISK for archiving, say academics. Are they stark, raving mad? @ The Register
Subject: Graphics Cards | November 13, 2014 - 12:46 PM | Scott Michaud
Tagged: nvidia, geforce, gtx 960, maxwell
It is possible that a shipping invoice fragment was leaked for the NVIDIA GeForce GTX 960. Of course, an image of text on a plain, white background is one of the easiest things to fake and/or manipulate, so take it with a grain of salt.
The GTX 960 is said to have 4GB of RAM on the same 256-bit bus. Its video outputs are listed as two DVI, one HDMI, and one DisplayPort, making this graphics card useful for just one G-Sync monitor per card. If I'm reading it correctly, it also seems to have a 993 MHz base clock (boost clock unlisted) and an effective 6008 MHz (1502 MHz actual) RAM clock. This is slightly below the 7 GHz (1750 MHz actual) of the GTX 970 and GTX 980 parts, but it should also be significantly cheaper.
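If the listed memory figures hold, peak bandwidth follows from the usual formula (effective data rate times bus width in bytes); this is just arithmetic on the rumored specs, not a confirmed figure:

```python
# Peak memory bandwidth = effective data rate x bus width (in bytes)
gtx960_mts = 6008   # rumored effective rate, MT/s
gtx980_mts = 7000   # shipping GTX 970/980 effective rate, MT/s
bus_bytes  = 256 // 8

bw_960 = gtx960_mts * 1e6 * bus_bytes / 1e9
bw_980 = gtx980_mts * 1e6 * bus_bytes / 1e9
print(f"GTX 960 (rumored): {bw_960:.1f} GB/s")  # ~192.3 GB/s
print(f"GTX 970/980:       {bw_980:.1f} GB/s")  # 224.0 GB/s
```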
The GeForce GTX 960 is expected to retail in the low-$200 price point... some day.
Subject: Graphics Cards | November 12, 2014 - 09:03 PM | Ryan Shrout
Tagged: Unity, ubisoft, assassin's creed
Over the last couple of days there have been a lot of discussions about the performance of the new Assassin's Creed Unity from Ubisoft on current generation PC hardware. Some readers have expressed annoyance that the game is running poorly, at lower than expected frame rates, at a wide range of image quality settings. Though I haven't published my results yet, we are working on a story comparing NVIDIA and AMD GPUs in Unity, but the truth is that this is occurring on GPUs from both sides.
For example, using a Core i7-3960X and a single GeForce GTX 980 4GB reference card, I see anywhere from 37 FPS to 48 FPS while navigating the crowded city of Paris at 1920x1080 and on the Ultra High preset. Using the Low preset, that frame rate increases to 65-85 FPS or so.
Clearly, those are lower frame rates at 1920x1080 than you'll find in basically any other PC game on the market. The accusation from some in the community is that Ubisoft is either doing this on purpose or doing it out of neglect, with inefficient code. I put some questions to the development team at Ubisoft, and though I only had a short time with them, the answers tell their side of the story.
Ryan Shrout: What in the Unity game engine is putting the most demand on the GPU and its compute resources? Are there specific effects or were there specific design goals for the artists that require as much GPU horsepower as the game does today with high image quality settings?
Ubisoft: Assassin’s Creed Unity is one of the most detailed games on the market and [contains] a giant, open world city built to the scale that we’ve recreated. Paris requires significant details. Some points to note about Paris in Assassin’s Creed Unity:
- Tens of thousands of objects are visible on-screen, casting and receiving shadows.
- Paris is incredibly detailed. For example, Notre-Dame itself is millions of triangles.
- The entire game world has global illumination and local reflections.
- There is realistic, high-dynamic range lighting.
- We temporally stabilized anti-aliasing.
RS: Was there any debate internally about downscaling on effects/image quality to allow for lower end system requirements?
Ubisoft: We talked about this a lot, but our position always came back to us ensuring that Assassin’s Creed Unity is a next-gen only game with breakthrough graphics. With this vision, we did not degrade the visual quality of the game. On PC, we have several option for low-scaling, like disabling AA, decreasing resolution, and we have low option for Texture Quality, Environment Quality and Shadows.
RS: Were you looking forward or planning for future GPUs (or multi-GPU) that will run the game at peak IQ settings at higher frame rates than we have today?
Ubisoft: We targeted existing PC hardware.
RS: Do you envision updates to the game or to future GPU drivers that would noticeably improve performance on current generations of hardware?
Ubisoft: The development team is continuing to work on optimization post-launch through software updates. You’ll hear more details shortly.
Some of the features listed by the developer in the first answer - global illumination methods, high triangle counts, HDR lighting - can be pretty taxing on GPU hardware. I know there are people out there pointing out games that have similar feature sets and that run at higher frame rates, but the truth is that no two game engines are truly equal. If you have seen Assassin's Creed Unity in action you'll be able to tell immediately the game is beautiful, stunningly so. Is it worth that level of detail for the performance levels achieved from current high-end hardware? Clearly that's the debate.
When I asked if Ubisoft had considered scaling back the game to improve performance, they clearly decided against it. The developer had a vision for the look and style of the game and they were dedicated to it; maybe to a fault from some gamers' viewpoint.
Also worth noting is that Ubisoft is continuing to work on optimization post-release; how much of an increase we'll actually see from game patches or driver updates remains to be seen as we move forward. Some developers have a habit of releasing a game and simply abandoning it as it shipped; hopefully we will see more dedication from the Unity team.
So, if the game runs at low frame rates on modern hardware... what is the complaint exactly? I do believe that Ubisoft would have benefited from better performance at lower image quality settings. Swap the settings for yourself in game and you'll see that the quality difference between Low and Ultra High is noticeable, but not dramatically so. Again, this likely harkens back to Ubisoft's desire to maintain an artistic vision.
Remember that when Crysis 3 launched early last year, running at 1920x1200 at 50 FPS required a GTX 680, the top GPU at the time; and that was at the High settings. The Very High preset only hit 37 FPS on the same card.
PC gamers seem to be creating a double standard. On one hand, none of us want PC ports or games developed with consoles in mind that don't take advantage of the power of the PC platform. Games in the Call of Duty series are immensely popular but, until the release of Advanced Warfare, would routinely run at 150-200 FPS at 1080p on a modern PC. Crysis 3 and Assassin's Creed Unity are the opposite of that: games that really tax current CPU and GPU hardware, paving a way forward for future GPUs to be developed and NEEDED.
If you're NVIDIA or AMD, you should applaud this kind of work. Now I am more interested than ever in a GTX 980 Ti or an R9 390X, to see what Unity will play like, what Far Cry 4 will run at, or whether Dragon Age Inquisition looks even better.
Of course, if we can get more performance from a better optimized or tweaked game, we want that too. Developers need to be able to cater to as wide a PC gaming audience as possible, but sometimes creating a game that can scale between running on a GTX 650 Ti and a GTX 980 is a huge pain. And with limited time frames and budgets, don't we want at least some developers to focus on visual quality rather than "dumbing down" the product?
Let me know what you all think - I know this is a hot-button issue!
UPDATE: Many readers in the comments are bringing up the bugs and artifacts within Unity, pointing to YouTube videos and whatnot. Those are totally valid complaints about the game, but don't necessarily reflect on the game's performance - which is what we were trying to target with this story. Having crashes and bugs in the game is disappointing, but again, Ubisoft and Assassin's Creed Unity aren't alone here. Have you seen the bugs in Skyrim or Tomb Raider? Hopefully Ubisoft will be more aggressive in addressing them in the near future.
UPDATE 2: I also wanted to comment that even though I seem to be defending Ubisoft around the performance of Unity, my direct feedback to them was that they should enable modes in the game that allow it to play at higher frame rates and even lower image quality settings, even if they were unable to find ways to "optimize" the game's efficiency. So far the developer seems aware of all the complaints around performance, bugs, physics, etc. and is going to try to address them.
UPDATE 3: In the last day or so, a couple of other media outlets have posted anonymous information that indicates that the draw call count for Assassin's Creed Unity is at fault for the poor performance of the game on PCs. According to this "anonymous" source, while the consoles have low-level API access to hardware to accept and process several times the draw calls, DirectX 11 can only handle "7,000 - 10,000 peak draw calls." Unity apparently is "pushing in excess of 50,000 draw calls per frame" and thus is putting more pressure on the PC than it can handle, even with high end CPU and GPU hardware. The fact that these comments are "anonymous" is pretty frustrating as it means that even if they are accurate, they can't be taken as the truth without confirmation from Ubisoft. If this turns out to be true, then it would be a confirmation that Ubisoft didn't take the time to implement a DX11 port correctly. If it's not true, or only partially to blame, we are left with more meaningless finger-pointing.
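To put those draw call counts in perspective, here is a rough frame-budget sketch; the per-call CPU overhead is an illustrative assumption, not a measured figure for any particular driver:

```python
# Rough frame-budget math for the quoted draw call figures.
# The per-call overhead is an illustrative assumption, not a measurement.
calls_per_frame = 50_000   # claimed for Assassin's Creed Unity
overhead_us     = 0.5      # assumed CPU cost per DX11 draw call, microseconds

submit_ms = calls_per_frame * overhead_us / 1000
budget_ms = 1000 / 60      # ~16.7 ms per frame at 60 FPS

print(f"Draw call submission alone: {submit_ms:.1f} ms "
      f"against a {budget_ms:.1f} ms frame budget")  # 25.0 ms vs ~16.7 ms
```

Even with generous assumptions, submission cost alone can blow past the frame budget, which is why low-level console APIs (and later, DX12/Vulkan-style batching) matter at these call counts.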
Subject: General Tech | November 12, 2014 - 07:38 PM | Scott Michaud
Tagged: visual studio, microsoft
While this is significantly different from what we usually write about, I have a feeling that there is some overlap with our audience.
Update: If you use Visual Studio Express 2013, you may wish to uninstall it before installing Community. In my experience, the installer seems to think that both are installed to the same directory, so uninstalling Express after installing Community will break both. I am currently repairing Community, which should fix it, but there's no sense in installing twice if you know better.
Visual Studio Express has been the free, cut-down option for small and independent software developers. It can be used for commercial applications, but it was severely limited in many areas, such as its lack of plug-in support. Today, Microsoft announced Visual Studio Community 2013, which is a free version of Visual Studio that is equivalent to Visual Studio Professional 2013 for certain users (explained below). According to TechCrunch, while Visual Studio Express will still be available for download, Community is expected to be the version going forward.
Image Credit: Wikimedia (modified)
There are four use cases for Visual Studio Community 2013:
- To contribute to open-source projects (unlimited users)
- To use in a classroom environment for learning (unlimited users)
- To use as a tool for Academic research (unlimited users)
- To create free or commercial, closed-source applications (up to 5 users)
  - You must be an individual or small studio with fewer than 250 PCs
  - You must have no more than $1 million USD in yearly revenue
Honestly, this is a give-and-take scenario, but it seems generally positive. I can see this being problematic for small studios with 6+ developers, but they can (probably) still use Visual Studio Express 2013 Update 3 until it gets too old. For basically everyone else, this means that you do not need to worry about technical restrictions when developing software. This opens the avenue for companies like NVIDIA (Nsight Visual Studio Edition) and Epic Games (Unreal Engine 4) to deliver their plug-ins to the independent developer community. When I get a chance, and after it finishes installing, I will probably check to see if those examples already work.
Visual Studio Community 2013 Update 4 is available now at Microsoft's website.
Subject: Editorial | November 12, 2014 - 06:58 PM | Josh Walrath
Tagged: Wyoming Whiskey, Whiskey, Kirby, Bourbon
Last year around this time I reviewed my first bottle of Wyoming Whiskey. Overall, I was quite pleased with how this particular spirit has come along. You can read my entire review here. It also includes a little interview with one of the co-founders of Wyoming Whiskey, David Defazio. The landscape has changed a little throughout the past year, and the distillery has recently released a second product in limited quantities to the Wyoming market. The Single Barrel Bourbon selections come from carefully selected barrels and are not blended with others. I had the chance to chat with David again recently and received some interesting information from him about the latest product and where the company is headed.
Picture courtesy of Wyoming Whiskey
I noticed that you have a new single barrel product on the shelves. How would you characterize this as compared to the standard bottle you sell?
These very few barrels are selected from many and only make the cut if they meet very high standards. We have only bottled 4 so far. And, the State has sold out. All of our product has matured meaningfully since last year and these barrels have benefitted the most as evidenced by their balance and depth of character. The finish is wickedly smooth. I have not heard one negative remark about the Single Barrel Product.
Have you been able to slowly lengthen the time that the bourbon matures until it is bottled, or is it around the same age as what I sampled last year?
Yes, these barrels are five years old, as is the majority of our small batch product.
How has the transition from Steve to Elizabeth as the master distiller been?
Elizabeth is no longer with us. She had intended to train under Steve for the year, but when his family drew him back to Kentucky in February, this plan disintegrated. So, our crew is making bourbon under the direction of Sam Mead, my partners' son, who is our production manager. He has already applied his engineering degree in ways that help increase quality and production. And he's just getting started.
What other new products may be showing up in the next year?
You may see a barrel-strength bourbon from us. There are a couple of honey barrels that we are setting aside for this purpose.
Wyoming Whiskey had originally hired Steve Nally of Maker’s Mark fame, somehow pulling him out of retirement. He was the master distiller for quite a few years, and he moved on from the company this past year. He is now heading up a group that is opening a new distillery in Kentucky, hoping to break into the bourbon market. They expect their first products to be aged around 7 years. As we all know, it is hard for a company to stay afloat if it is not selling product. In the meantime, it looks like this group will do what so many other “craft” distillers have been caught doing: selling bourbon produced in mega-factories and labeling it as their own.
Bourbon has had quite the renaissance in the past few years with the popularity of the spirit soaring. People go crazy trying to find limited edition products like Pappy Van Winkle and many estimate that overall bourbon production in the United States will not catch up to demand anytime soon. This of course leads to higher prices and tighter supply for the most popular of brands.
It is good to see that Wyoming Whiskey is lengthening out the age of the barrels that they are bottling, as it can only lead to smoother and more refined bourbon. From most of my tasting, it seems that 6 to 7 years is about optimal for most bourbon. There are other processes that can speed up these results, and I have tasted batches that are only 18 months old and rival that of much older products. I look forward to hearing more about what Wyo Whiskey is doing to improve their product.
Subject: General Tech | November 12, 2014 - 06:09 PM | Jeremy Hellstrom
Tagged: audio, roccat, Kave XTD 5.1, gaming headset
The name implies that the Roccat Kave XTD 5.1 Digital headset provides virtual surround sound, but in fact it has three 40mm driver units in each earcup, giving you front, rear and centre channels, though you can use the provided software to switch to stereo sound if you prefer. The earcups are leather over foam, which makes them quite comfortable, although they could get warm after extended periods of time, and the microphone boom is removable for when it would be in your way. The headset also offers noise cancellation, the ability to pair with a phone over Bluetooth, and an integrated sound card, all part of the reason it costs $150. Modders-Inc were impressed by that sound card's four speaker plugs on the rear, allowing you to switch between sending a 5.1 signal to the Kave XTD or to external speakers. Audio reviews are always very subjective, as it is difficult to rate perceived sound quality for anyone but yourself, but you should still check out Modders-Inc's take on the software and hardware in their full review.
"Overall I thought the Roccat Kave XTD 5.1 Digital headset is a solid performer. The audio quality from the headset is excellent. At just slightly under full volume the headset is LOUD!"
Here is some more Tech News from around the web:
- MP4Nation Brainwavz S5 @ techPowerUp
- ROCCAT Kave XTD Stereo Gaming Headset @ Benchmark Reviews
- Corsair H1500 @ HardwareHeaven
- Luxa² E-One Headset Holder Review @ TechwareLabs
- TDK A34 TREK MAX Wireless Weather Resistant Speaker Review @ NikKTech
Subject: General Tech | November 12, 2014 - 05:10 PM | Jeremy Hellstrom
Tagged: linux, amd, radeon, CS:GO, tf2
With the new driver from AMD and a long list of cards to test, from an R9 290 all the way back to an HD 4650, Phoronix has put together a rather definitive list of the current performance you can expect from CS:GO and TF2. CS:GO was tested at 2560x1600 and showed many performance changes from the previous driver, including some great news for 290 owners. TF2 was tested at the same resolution, and many of the GPUs were capable of providing 60 FPS or higher, again with the 290 taking the lead. Phoronix also tested the efficiency of these cards, detailing the number of frames per second per watt used; this may not be pertinent to many users, but it does offer an interesting look at the efficiency of the GPUs. If you are gaming on a Radeon on Linux, now is a good time to upgrade your drivers and associated programs.
"The latest massive set of Linux test data we have to share with Linux gamers and enthusiasts is a look at Counter-Strike: Global Offensive and Team Fortress 2 when using the very newest open-source Radeon graphics driver code. The very latest open-source Radeon driver code tested with these popular Valve Linux games were the Linux 3.18 Git kernel, Mesa 10.4-devel, LLVM 3.6 SVN, and xf86-video-ati 7.5.99."
Here is some more Tech News from around the web:
- A Spaceship For Christmas – Elite: Dangerous Dated @ Rock, Paper, SHOTGUN
- Wot I Think – Call Of Duty: Advanced Warfare Singleplayer @ Rock, Paper, SHOTGUN
- Ryse: Son of Rome PC: It’s Boring but Here’s Why You Should Still Buy It @ eTeknix
- Free Beards And Horse Armour: The Witcher 3 DLC Plans @ Rock, Paper, SHOTGUN
- Borderlands: The Pre-Sequel Review @ OCC
- Assassin's Creed: Unity widely found to be slow and buggy @ HEXUS
- Avalanche confirms Just Cause 3 for PC and next-gen consoles @ HEXUS
- The Witcher 2 And Mount & Blade Free In GOG Sale @ Rock, Paper, SHOTGUN
- Skaven Time: Warhammer’s XCOMish Mordheim Out Soon @ Rock, Paper, SHOTGUN
- Microsoft to bring back beloved 1990s super-hit BATTLETOADS!? @ The Register
Subject: General Tech | November 12, 2014 - 04:54 PM | Jeremy Hellstrom
Tagged: mozilla, oculus rift, MozVR
You have been able to browse the web on your Oculus Rift since the first dev kit, but not with a UI designed specifically for the VR device. MozVR is in development, along with a specific version of Firefox or Chromium, to allow Oculus users to browse the web in a new way. It will work with both Mac and Windows, though as of yet there is no mention of Linux support, which should change in the near future. You need to get your hands on an Oculus to try out the new browser; it simply is not going to translate to the desktop. The software is open source and available on GitHub, so you can contribute to the overall design of this new way to surf the web, as well as optimize your own site for VR. Check out more on MozVR and Oculus over at The Inquirer.
"MOZILLA IS CONTINUING its 10th birthday celebrations with the launch of a virtual reality (VR) website."
Here is some more Tech News from around the web:
- Elon Musk and ex-Google man mull flinging 700 internet satellites into orbit @ The Register
- Samsung slams door on OLED TVs, makes QUANTUM dot LEAP @ The Register
- Intro to Systemd Runlevels and Service Management Commands @ Linux.com
- TSMC 16FinFET Plus process achieves risk production milestone @ DigiTimes
- Iranian contractor named as Stuxnet 'patient zero' @ The Register
- Hardware Asylum Podcast - MOA 2014 Final and Surprise Lightning
Subject: Storage | November 12, 2014 - 04:44 PM | Allyn Malventano
Tagged: ssd, pcie, NVMe, Intel, DC P3500
Since we reviewed the Intel SSD DC P3700, many of you have been drooling over the idea of an 18-channel NVMe PCIe SSD, even more so given that the P3500 variant was to launch at a $1.50/GB target price. It appears we are getting closer to that release, as the P3500 has been appearing on some websites in pre-order or out-of-stock status.
ShopBLT lists the 400GB part at $629 ($1.57/GB), while Antares Pro has an out-of-stock listing at $611 ($1.53/GB). The other two capacities are available at a similar cost/GB. We were hoping to see an 800GB variant, but it appears Intel has stuck to their initial plan. Here are the part numbers we’ve gathered, for your Googling pleasure:
PCIe add-in card:
- 400GB: SSDPEDMX400G401
- 1.2TB: SSDPEDMX012T401
- 2TB: SSDPEDMX020T401
2.5” SFF-8639 (*not SATA*):
- 400GB: SSDPE2MX400G401
- 1.2TB: SSDPE2MX012T401
- 2TB: SSDPE2MX020T401
We did spot a date of December 12th in an Amazon listing, but I wouldn't count that as a solid date, as many of the listings there had errors (like 10 packs for the price of one).
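The cost-per-GB figures quoted above are easy to double-check; here is a quick sketch using only the listing prices and capacities reported in this post:

```python
# Cost-per-GB check for the DC P3500 listings quoted above.
listings = {
    "ShopBLT 400GB": (629.00, 400),      # (price in USD, capacity in GB)
    "Antares Pro 400GB": (611.00, 400),
}

for name, (price, gb) in listings.items():
    # Round to cents, matching the $/GB figures in the article.
    print(f"{name}: ${price / gb:.2f}/GB")
```

Both listings land within a few cents of Intel's original $1.50/GB target.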
November 12, 2014 - 08:29 AM | Sebastian Peak
Noctua has announced three new 92mm CPU coolers today, with two different replacements for the existing NH-U9B SE2 and a new cooler for Intel Xeon LGA2011 processors for workstations and servers. Each model will now use a PWM fan, the recently announced NF-A9.
Image credit: Noctua
In Noctua’s official press release, CEO Roland Mossig is quoted as saying, "The NH-U9B SE2 is still one of our most popular models. The NH-U9S and NH-D9L stay true to this proven formula but now offer even better performance, better compatibility and PWM support for automatic fan speed control."
The first of the two NH-U9B replacements is the NH-U9S, which features an asymmetrical design with 5 heatpipes. The other model will be the NH-D9L, a 4 heatpipe design that is “15mm lower than classic 9cm coolers such as the NH-U9 series (110mm vs. 125mm)”. Noctua states that this will “guarantee full 3U compliance” and also “makes the NH-D9L ideal for compact HTPC and Small Form Factor cases”. Noctua states that the 95x95mm footprint of these new coolers will clear “RAM and PCIe slots on all Intel and most AMD based mainboards, including µATX and ITX.”
The last addition to the 92mm lineup announced today is the server-specific NH-D9DX i4 3U, the replacement for the 4U model NH-U9DX i4. Noctua states that this new cooler uses “the same heatsink as the NH-D9L but comes with LGA2011 mounting for both Square ILM and Narrow ILM Xeon platforms as well as support for LGA13x6.”
The fan powering these new coolers is the NF-A9 PWM, and each cooler uses Noctua’s SecuFirm2 mounting system and comes with a six-year warranty. Noctua states that all three models are currently shipping and will be available shortly. MSRPs are as follows: NH-U9S, $59.90 USD; NH-D9L, $56.90 USD; NH-D9DX i4 3U, $59.90 USD.
Subject: General Tech | November 12, 2014 - 04:07 AM | Tim Verry
Tagged: system requirements, pc gaming, kyrat, fps, far cry 4
In case you missed it earlier this week, Ubisoft revealed the PC system requirements needed to run Far Cry 4. Developed by Ubisoft Montreal and set to release on November 18th, Far Cry 4 is the latest action adventure FPS in the Far Cry series. The game uses Ubisoft's Dunia Engine II, a heavily modified game engine originally based on Crytek's CryEngine 1 and developed by Kirmaan Aboobaker. The player is a Nepalese native who returns to Kyrat, a fictional location in the Himalayas, following the death of their mother, only to become embroiled in a civil war taking place in an open world filled with enemies, weapons, animals, and did I mention weapons?
This bow is a far cry from the only weapon you'll have access to...
According to the developer, Far Cry 4 continues the tradition of an open world environment, but the game world has been tweaked from the Far Cry 3 experience to be a tighter and more story focused experience where the single player story will take precedence over exploration and romps across the mountainous landscape.
While I cannot comment on how the game plays, it certainly looks quite nice and will need a beefy modern PC to run at its maximum settings. Interestingly, the game seems to scale down decently as well, with the entry-level computer needed to run Far Cry 4 being rather modest.
No matter the hardware level, only 64-bit operating systems need apply: Far Cry 4 requires the 64-bit version of Windows 7 or later to run. At a minimum, Ubisoft recommends a quad-core processor (Intel Core i5-750 or AMD Phenom II X4 955), 4GB of memory, a Radeon HD 5850 or GTX 460, and 30GB of storage.
For optimal settings, users will need twice the system memory (at least 8GB) and video memory (at least 2GB), a newer quad-core CPU such as the Intel Core i5-2400S or AMD FX-8350, and a modern NVIDIA GTX 680 or AMD Radeon R9 290X graphics card.
Anything beyond that is gravy that will allow gamers to crank up the AA and AF as well as the resolution.
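The two tiers above can be summed up as a simple lookup table; here is an illustrative sketch (spec values taken from Ubisoft's published requirements, the `meets_ram` helper is purely hypothetical):

```python
# Far Cry 4 system requirement tiers, per Ubisoft's published specs.
requirements = {
    "minimum": {
        "cpu": "Intel Core i5-750 or AMD Phenom II X4 955",
        "ram_gb": 4,
        "gpu": "Radeon HD 5850 or GeForce GTX 460",
        "storage_gb": 30,
        "os": "64-bit Windows 7 or later",
    },
    "recommended": {
        "cpu": "Intel Core i5-2400S or AMD FX-8350",
        "ram_gb": 8,
        "gpu": "NVIDIA GTX 680 or AMD Radeon R9 290X (2GB+ VRAM)",
        "storage_gb": 30,
        "os": "64-bit Windows 7 or later",
    },
}

def meets_ram(tier: str, installed_gb: int) -> bool:
    """Trivial example check: does installed RAM meet a given tier?"""
    return installed_gb >= requirements[tier]["ram_gb"]

print(meets_ram("minimum", 4))       # a 4GB machine clears the minimum bar
print(meets_ram("recommended", 4))   # but not the recommended one
```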
Far Cry 4 will be available in North America on November 18, 2014 for the PC, PS4, Xbox One, PS3, and Xbox 360. Following the North America release, the game is scheduled to launch in Europe and Australia on November 20th, and in Japan on January 22 of next year.
Subject: General Tech | November 12, 2014 - 03:23 AM | Scott Michaud
Tagged: pc gaming, final fantasy xiii-2, final fantasy xiii, final fantasy
It seems like Square Enix has paid attention to the criticism about Final Fantasy XIII.
While it would have been nice for them to go back and fix the problems for the original game (Update Nov 12 @ 5:35pm EST: They are, in early December - Thanks TimeKeeper in the comments), it looks like the sequel, XIII-2, will behave more like a PC title. First and foremost, it will not be locked to 720p and it is said to offer other graphics options. The sequel is scheduled to launch on December 11th for $20, or $18 USD on pre-order (a few dollars above the launch price for Final Fantasy 13).
Of course, it is somewhat disappointing that screen resolution, a 60FPS cap, and graphics options are considered features, but the platform is unfamiliar to certain parts of the company. Acknowledging their error and building a better, but probably still below expectations, product is a good direction. Hopefully they will continue to progress, and eventually make PC games with the best of them. Either that, or they could have a talk with their Eidos arm about borrowing Nixxes, a studio that specializes in enhancing games on the PC.
Final Fantasy XIII-2 is coming to Steam in a month for $20 USD. The third installment, Lightning Returns, will arrive sometime in 2015.
Subject: General Tech, Systems | November 11, 2014 - 11:11 PM | Scott Michaud
Tagged: haswell-t, haswell, fanless
This one is more for our European readers, because this company operates out of Germany, but the Cirrus7 Nimbus is an interestingly designed, fanless system. Its fin shape is said to be assembled out of laser-cut layers of aluminum that sandwich the I/O plate at the rear. FanlessTech has noted that the systems are now available with Haswell processors, up to a Core i7 based on Haswell-T. Their storage options now also include the Samsung 850 Pro, up to 1TB.
Image Credit: Cirrus7 via FanlessTech
The customization options are actually pretty decent. I find a lack of meaningful upgrade options to be a problem with modern PC builders, but that does not apply here. Eight CPUs are offered, ranging from a Celeron up to a 45W Haswell-T Core i7; RAM comes in 4GB, 8GB, or 16GB; up to three drives can be installed (one mSATA and up to two SATA); Intel Wireless-N or -AC is available; external DVD or Blu-ray burners are an option; and one of seven OSes can be installed, including two versions of Linux (Ubuntu 14.04 or 14.10). If you get all of the bells and whistles, you are probably up to about 3,000 USD, but you cannot expect two terabytes of Samsung 850 Pro SSDs to be cheap. It seems reasonable enough, especially for the EU. The big limiter is the lack of a discrete GPU, though that matters less if you are using this device for something like audio recording, which an Intel HD 4600 can easily handle.
The Cirrus7 Nimbus is available now at their website.
Subject: Storage | November 11, 2014 - 05:32 PM | Allyn Malventano
Tagged: Intel, ssd, dc s3500, M.2
Today Intel refreshed their Datacenter Series of SSDs, specifically their DC S3500. We have reviewed this model in the past. It uses the same controller that is present in the S3700, as well as the SSD 730 Series (though it is overclocked in that series).
The full line of Intel Datacenter SSDs (minus the P3700). DC S3500 is just right of center.
Today's refresh adds higher capacities to the S3500, which now includes 1.2TB and 1.6TB models at the high end. This suggests that Intel is stacking as many as eight 20nm dies per package. IOPS performance sees a slight penalty at these new higher capacities, while maximum sequential speeds are a bit higher due to the increased die count.
Also announced was an M.2 version of the S3500. This packaging is limited to only a few capacity points (80GB, 120GB, 340GB) and is primarily meant for applications where data integrity is critical (e.g. ATMs, server boot partitions, etc.).
A standard press blast was unavailable, but full specs are listed after the break.
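The die-stacking inference above can be sketched with some rough arithmetic. Assuming Intel's 128Gbit (16GB) 20nm MLC dies and a typical 16-package 2.5" PCB (both assumptions on my part, not confirmed by Intel), eight dies per package is about what a 1.6TB part would require:

```python
# Rough die-count arithmetic behind a 1.6TB-class S3500.
# Assumed figures: 128Gbit (16GB) 20nm MLC dies, 16 NAND packages per PCB.
die_gb = 16
packages = 16
dies_per_package = 8

raw_gb = die_gb * packages * dies_per_package
print(raw_gb)  # 2048 GB raw, leaving ~1.6TB usable after overprovisioning
```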
Subject: General Tech | November 11, 2014 - 03:10 PM | Scott Michaud
Tagged: pc gaming, gaming, eff, DRM, consolitis
This is something that I have been saying for quite some time now: games are struggling as an art form. Now I don't mean that games are not art; games, like all content that expresses feelings, thoughts, and ideas, are art. No, I'm talking about their ability to be preserved for future society and scholarly review. The business models for entertainment are based in either services or consumables. In the entertainment industries, few (but some) producers are concerned about the long tail – the extreme back-catalog of titles. Success is often determined by two weeks of sales, and the focus is on maximizing those revenues before refreshing with newer, similar content that scratches the same itch.
DRM is often justified as maximizing the initial rush by degrading your launch competitors: free versions of yourself. Now I'm not going to go into the endless reasons about where this fails to help (or actively harms) sales and your customers; that is the topic of other rants. For this news post, I will only discuss the problems that DRM (and other proprietary technologies) have on the future.
When you tie content to a platform, be it an operating system, API, or DRM service, you are trusting it for sustainability. This is necessary and perfectly reasonable. The problems arise with the permissions given to society from that platform owner, and how easily society can circumvent restrictions, as necessary. For instance, content written for a specific processor can be fed through an emulator, and the instruction sets can be emulated (or entirely knocked off) when allowed by patent law, if patents even interfere.
Copyright is different, though. Thanks to the DMCA, it is illegal, a federal crime at that, to circumvent copyright protection even for the betterment of society. You know, society, the actual owner of all original works, but who grants limited exclusivity to the creators for “the progress of Science and useful Arts”. Beyond the obvious and direct DRM implementations, this can also include encryption that is imposed by console manufacturers, for instance.
The DMCA is designed to have holes poked into it, however, by the Librarian of Congress. Yes, that is a job title. I did not misspell “Library of Congress”. The position was held by James H. Billington for over 25 years. Every three years, he considers petitions to limit the DMCA and adds exceptions in places that he sees fit. In 2012, he decided that jailbreaking a phone should not be illegal under the DMCA, although tablets were not covered under that exemption. Now is around the time that proposals will be submitted for his next batch of exemptions in late 2015.
This time, the EFF is proposing that circumventing DRM in abandoned video games should be deemed legal, for society to preserve these works of art when the copyright holders will not bother. Simply put, if society intended to grant a limited exclusive license to a content creator who has no intention of making their work available to society, then society demands the legal ability to pry off the lock to preserve the content.
Of course, even if it is deemed legal, stronger DRM implementations could make it technologically unfeasible to preserve certain works. It will be a long while before we encounter a lock that society cannot crack, but it is theoretically possible. This proposal does not address that root problem, but at least it could prevent society's greatest advocates from being slapped with a pointless felony for trying to do the right thing.