Subject: General Tech | November 17, 2014 - 02:06 PM | Jeremy Hellstrom
Tagged: blackberry, BES12
Somewhere the highly successful company RIM released BES 11, which runs on Windows 9, but not in this universe. In this brane we have the struggling, but still alive, Blackberry announcing the upcoming release of BES 12, a mobile device management server which will manage Android, iOS, Windows, and Blackberry devices for the enterprise. The most interesting feature mentioned to The Register is virtual phone numbers: a company-owned phone number can be added to an employee's personal device, allowing them to keep their personal number and still receive business calls without needing a second device. Calls and data used by the business number are billed to the business, while voice and data used under the user's own account are billed to their plan. There are some obvious challenges to this service; you would need the same provider, and the device would need to support Blackberry Balance, Samsung Knox, or similar partitioning software, but it is something which could interest small businesses. You can catch up on the other features in BES 12 and news about some new devices right here.
"Turnaround artiste John Chen marked one year as BlackBerry boss with an avalanche of enterprise software news related to the firm's new BES12 server, which can manage enterprise mobe devices running Android, iOS, Windows Phone – and of course, BB's own mobile OS."
Here is some more Tech News from around the web:
- How to Easily Install Ubuntu on Chromebook with Crouton @ Linux.com
- Bridging Networks With The Flip Of A Switch @ Hack a Day
- Mastercard and Visa to ERADICATE password authentication @ The Register
- Tor users can be deduced with 81 percent accuracy, says report @ The Inquirer
- NikKTech & be quiet! Worldwide Giveaway
- Tech ARP 2014 Mega Giveaway Contest
Subject: General Tech | November 16, 2014 - 11:30 PM | Scott Michaud
Tagged: raptr, pc gaming
The PC gaming utility Raptr tracks the time its users spend playing titles and aggregates it into a monthly press release. Because its purpose is recording game footage, adjusting quality settings, and so forth, it is not limited to any specific catalog of games. That allows a comparison between developers, publishers, and distribution platforms, as long as the average Raptr user is representative of that market.
It should be noted that, because game-hours are the recorded metric, it is not necessarily a good experiment to judge sales figures from. It is weighted by the average session length per user and how frequently the average user plays it, and not just how many people use it. As such, it will probably over-represent MMOs, MOBAs, and other multiplayer games... unless you are looking for aggregate game time, which is exactly what this survey provides (and sales figures are bad at determining).
From last month, League of Legends lost a bit of share, down from 22.54% of total to 22.25%. The second place contender, World of Warcraft, jumped from 7.63% of total game time to 8.53%. This means that League of Legends dropped from being 195% more popular than WoW to being 160% more popular. World of Warcraft is expected to jump further due to its Warlords of Draenor expansion that released in early November. The October bump, reported today, was likely due to the pre-expansion patch and promotional events.
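For those curious how the relative-popularity figures fall out of the share numbers, here is a quick sketch using the percentages quoted above (the October result works out to roughly 161%, which the chart rounds down to 160%):

```python
def pct_more_popular(share_a, share_b):
    """How much more popular (in percent) a game with share_a of total
    game time is than a game with share_b."""
    return (share_a / share_b - 1) * 100

# September: LoL 22.54% vs WoW 7.63%; October: LoL 22.25% vs WoW 8.53%
print(round(pct_more_popular(22.54, 7.63)))  # -> 195
print(round(pct_more_popular(22.25, 8.53)))  # -> 161
```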
Diablo III, the other Blizzard title on this chart, lost three places (and almost half of its play time) this month. It now rests just above Minecraft, having dropped below Smite and Counter-Strike: GO while ArcheAge moved up past it. PAYDAY 2 and FIFA 15 represented the capital letters by jumping onto the list (back onto, in PAYDAY 2's case) in 14th and 15th spot, respectively.
Borderlands: The Pre-Sequel and Middle-Earth: Shadow of Mordor are new games for October, appearing on the list in 18th and 19th place. Both titles bumped Team Fortress 2 down to 20th place, almost in spite of its Halloween promotion, although its play time did increase from September.
Subject: Storage | November 14, 2014 - 02:00 PM | Jeremy Hellstrom
Tagged: btrfs, EXT4, XFS, F2FS, 530 series, Intel, raid, unix humour, linux
When you use Linux you have a choice as to which file system you wish to use, a choice that never occurs to most Windows users but can spark an argument every bit as vicious as the eternal debate over EMACS versus VIM versus whichever text editor you prefer. There has not been much SSD benchmarking done on alternate file systems until now: Phoronix has benchmarked the Intel 530 series SSD in numerous configurations on Btrfs, EXT4, XFS, and F2FS. With four of the 120GB model available, they were able to test the speed of the drives in RAID 0, 1, 5, 6, and 1+0. There are obviously still some compatibility issues, as some tests failed to run in certain configurations, but overall these drives performed as expected. While the results did not vary widely, it is worth reading through their article if you plan on building a high-speed storage machine which will run Linux.
"Following the recent Btrfs RAID: Native vs. Mdadm comparison, the dual-HDD Btrfs RAID benchmarks, and four-SSD RAID 0/1/5/6/10 Btrfs benchmarks are RAID Linux benchmarks on these four Intel SATA 3.0 solid state drives using other file-systems -- including EXT4, XFS, and Btrfs with Linux 3.18."
Here are some more Storage reviews from around the web:
- Samsung XS1715 (1.6TB) @ The SSD Review
- Angelbird SSD2go Pocket USB 3.0 External Solid State Drive @ eTeknix
- Silicon Power Thunder T11 120 GB @ techPowerUp
- Silicon Power Armor A80 2TB USB 3.0 Portable Hard Drive Review @ NikKTech
- Patriot Memory Stellar Boost XT 64GB USB 3.0 OTG Drive Review @ Madshrimps
- Synology DiskStation DS1815+ @ Legion Hardware
- Western Digital Red Pro (WD4001FFSX) 4 TB @ Tech ARP
Subject: General Tech | November 14, 2014 - 01:27 PM | Jeremy Hellstrom
Tagged: asus, gigabyte, sales, motherboards
If you prefer to talk about the sheer number of sales then ASUS is on track to take top spot, with roughly 22 million units sold over 2014, a jump of just over 1 million from last year and 2 million more than Gigabyte's predicted sales of 20 million units. ASUS will also hold on to the most profit this year; Gigabyte is expected to match last year's profit of about 97 million USD, which falls short of ASUS' expected 130 million USD, but that is not the whole story. Last year ASUS closed out with over 160 million USD in profit, which shows a significant decline in their profitability during the same period that Gigabyte's profitability remained the same. DigiTimes attributes this to increased spending by ASUS on marketing and price cuts on their motherboards. Is it possible that ASUS' once insurmountable lead in the motherboard market could be a thing of the past?
"Asustek Computer's motherboard shipments returned to six million units in the third quarter thanks to its aggressive price-cutting strategy, which helped the vendor slightly widen the gap with its major competitors Gigabyte Technology, according to sources from the motherboard industry. However, despite the fact that Asustek is estimated to ship more motherboards than Gigabyte in 2014, its profit growth may perform weaker than Gigabyte's."
Here is some more Tech News from around the web:
- Heartbleed and Windows bugs don't make for an insecure cloud @ The Inquirer
- Graphics Card Coil Whine; An Investigation @ Hardware Canucks
- Pay-by-bonk chip lets hackers pop all your favourite phones @ The Register
- The TR Podcast 165: Game reqs get inflated, benchmarks get weird, and Asus nails the X99-A
Subject: Graphics Cards | November 14, 2014 - 11:46 AM | Ryan Shrout
Tagged: sli, nvidia, N980X3WA-4GD, maxwell, GTX 980, gigabyte, geforce, 3-way
Earlier this week, a new product showed up on Gigabyte's website that has garnered quite a bit of attention. The GA-N980X3WA-4GD WaterForce Tri-SLI is a 3-Way SLI system with integrated water cooling powered by a set of three GeForce GTX 980 GPUs.
That. Looks. Amazing.
What you are looking at is a 3-Way closed loop water cooling system with an external enclosure that holds the radiators while providing a display full of information, including temperatures, fan speeds and more. Specifications on the Gigabyte site are limited for now, but we can infer a lot from them:
- WATERFORCE: 3-Way SLI Water Cooling System
- Real-Time Display and Control
- Flex Display Technology
- Powered by NVIDIA GeForce GTX 980 GPU
- Integrated with 4GB GDDR5 memory on a 256-bit memory interface (single card)
- Features Dual-link DVI-I / DVI-D / HDMI / DisplayPort x3 (single card)
- BASE: 1228 MHz / BOOST: 1329 MHz
- System power supply requirement: 1200W (with six 8-pin external power connectors)
The GPUs on each card are your standard GeForce GTX 980 with 4GB of memory (we reviewed it here) though they are running at overclocked base and boost clock speeds, as you would hope with all that water cooling power behind it. You will need a 1200+ watt power supply for this setup, which makes sense considering the GPU horsepower you'll have access to.
Another interesting feature Gigabyte is listing is called GPU Gauntlet Sorting.
With GPU Gauntlet™ Sorting, the Gigabyte SOC graphics card guarantees the higher overclocking capability in terms of excellent power switching.
Essentially, Gigabyte is going to make sure that the GPUs on the WaterForce Tri-SLI are the best they can get their hands on, with the best chance for overclocking higher than stock.
Setup looks interesting - the radiators and fans will be in the external enclosure with tubing passing into the system through a 5.25-in bay. It will need to have quick connect/disconnect points at either the GPU or radiator to make that installation method possible.
Pricing and availability are still unknown, but don't expect to get it cheap. With the GTX 980 still selling for at least $550, you should expect something in the $2000 range or above with all the custom hardware and fittings involved.
Can I get two please?
Subject: General Tech | November 13, 2014 - 06:09 PM | Jeremy Hellstrom
Tagged: Cherry MX clear, WASD Keyboards, CODE, input, mechanical keyboard, tenkeyless
Scott posted about the WASD Keyboards CODE with Cherry MX Clear switches, but until now we had not found a review of this keyboard. The Tech Report has changed that with this review, which takes a look at the new type of switch that sits between the Brown and the Green: Clear switches need more force to bottom out than a Brown, but not as much as the clicky style Green switches. That is not all this tenkeyless board offers; there are LEDs that can be activated by the DIP switches in the recess found on the back of the keyboard. In fact those DIP switches can do more than just enable a nice glow: you can disable the Windows key or even immediately switch to different layouts such as Mac, Dvorak, and Colemak, though sadly they left Sinclair ZX off of the list. If this type of switch interests your fingers and you are willing to spend $150 on a keyboard, check out the full review here.
"We've been meaning to try out Cherry MX's clear key switches for a while, and now, we've finally gotten our wish. Join us for a look at WASD Keyboards' Cherry MX clear-infused Code keyboard, a tenkeyless offering with more than a few tricks up its sleeve."
Here is some more Tech News from around the web:
- COUGAR 700K Mechanical Gaming Keyboard Review @ Techgage
- Cougar 700K Mechanical Keyboard @ Hardware Heaven
- AORUS Thunder K7 Gaming Keyboard @ Benchmark Reviews
- Thrustmaster Ferrari GT Cockpit 458 Steering Wheel @ eTeknix
- Corsair Gaming RGB M65 @ Kitguru
- GAMDIAS ZEUS Laser Gaming Mouse @ Tech ARP
- Tesoro Gandiva H1L Laser Gaming Mouse Review @ NikKTech
Subject: Systems | November 13, 2014 - 04:03 PM | Jeremy Hellstrom
Tagged: zotac, zbox ci540 nano, fanless, haswell, i5-4210Y
The Zotac ZBOX CI540 Nano is a bit more powerful than your average Bay Trail based mini-PC; it sports a Haswell based dual core i5-4210Y which runs between 1.5-1.9GHz and has Intel's HD 4200 graphics onboard. This won't play AC: Unity, but it comes close to matching a NUC containing a Core i5-4250U; you give up a bit of horsepower for completely silent operation, and for media it has enough power to watch your favourite videos. As you look at Silent PC Review's article you can see the honeycomb patterned knockouts on the casing, there to let heat dissipate and to let in liquid if you don't put some thought into where you are going to place the ZBOX. It does have Bluetooth, and there is an optional IR receiver that can be used to make it easy to tuck this tiny computer away in a safe place.
"The Zotac ZBOX CI540 Nano gives up a little CPU/GPU horsepower to deliver a completely fanless, silent and full-featured mini-PC experience."
Here are some more Systems articles from around the web:
- Zotac ZBox PI320 Mini-PC Review and Teardown @ The SSD Review
- Zotac Zbox Pico @ HardwareHeaven
- Lenovo Horizon 2 AIO Desktop Computer @ Benchmark Reviews
- Logic Supply ML400G-50 Fanless m-ITX PC @ Silent PC Review
- Habey MITX-6771 Bay Trail Embedded Motherboard @ Silent PC Review
Subject: General Tech | November 13, 2014 - 03:19 PM | Ken Addison
Tagged: podcast, video, Intel, core m, core m 5y70, Broadwell, broadwell-y, Lenovo, yoga 2 pro, yoga 3 pro, assasins creed unity, ubisoft, farcry 4, p3500, gskill blade
PC Perspective Podcast #326 - 11/13/2014
Join us this week as we discuss Intel's Core M 5Y70, Assassin's Creed Unity, Intel P3500 and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the Store
- RSS - Subscribe through your regular RSS reader
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano
Program length: 1:09:49
Subject: Systems, Mobile | November 13, 2014 - 02:48 PM | Scott Michaud
Tagged: shield tablet, shield, nvidia, grid, geforce grid
Today, NVIDIA has announced the November update for their SHIELD Tablet, which is really about three announcements that are rolled up together.
As expected, the SHIELD Tablet is getting a roll-up to Android 5.0 Lollipop and its new, “Material Design” style guide. NVIDIA took the opportunity to refresh the SHIELD HUB (my shift key must think that this is an MSI announcement by now...) in the same design specification. While interesting, the two other announcements probably beat it out, especially the GRID streaming service (and how it relates to the Xbox One and the PlayStation 4).
But before we get to GRID, let's talk “The Green Box”. In May, NVIDIA sent us a green crowbar to mark the availability of Half-Life 2 and Portal on the NVIDIA SHIELD. These were full, native ports of the PC titles to ARM and Android that are exclusive to the NVIDIA SHIELD. With the November update, Half-Life 2: Episode One has also been ported to the platform. The three games, Portal, Half-Life 2, and Episode One, are also packaged in “The Green Box” bundle, which will be included free-of-charge with the SHIELD Tablet 32GB. Note that, while the games are included with the tablet, they require a controller to play, which is not included.
Now we talk about GRID.
Netflix is a popular service where people can watch a variety of movies from its rolling catalog. It will not replace ownership of certain, intrinsically valuable titles, but there are probably options for anyone who wants to consume some form of entertainment. GRID is a similar service for video games, and it is not the first. We took a look at a preview of OnLive in 2010, connecting to a server about 2400 miles away (over twice the maximum intended range), and found the experience somewhat positive for every game except Unreal Tournament 3 at that relatively extreme latency. Another company, GaiKai, was purchased by Sony and rebranded as PlayStation Now, which serves up a selection of games from the PS3 catalog. Again, content on these services can be pulled at any time, but if you are just looking for the entertainment value, something else will probably be there to scratch your itch.
The interesting part that I have been teasing throughout this entire post is the performance of NVIDIA GRID. PlayStation Now is rated at 192 GFLOPs, which is the theoretical GPU compute throughput of the PS3's RSX chip. GRID, on the other hand, is rated for 2448 GFLOPs (~2.5 TFLOPs). This is higher than the PlayStation 4, and almost twice the GPU performance of the Xbox One. On the PC side, it is roughly equivalent to the GeForce GTX 760 Ti.
This compute rating has a hidden story, too. Back in 2011, Epic Games demoed “Samaritan” in Unreal Engine 3. This was the bar that Epic Games set for Microsoft, Sony, and Nintendo to mark a new console generation. When Unreal Engine 4 was unveiled at the end of E3 2012, it was embodied in the Elemental Demo, which also ran at (you guessed it) 2.5 TFLOPs. At the PlayStation 4 (1.9 TFLOPs) announcement, the demo was scaled back with reduced particles and lighting complexity. It was not shown at either Xbox One (1.3 TFLOPs) announcement at all.
What all of that means is simple: NVIDIA GRID is the only fixed hardware platform (that I am aware of) to meet Epic's vision of a next-gen gaming system. I say fixed, of course, because the PC can more than double that figure per card, with some games scaling to four discrete GPUs. This also says nothing about the CPU performance, system memory, or video memory, but it has the GPU in the right place for a next-gen platform.
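The comparison above can be laid out numerically. These are the compute ratings quoted in this post, measured against the 2.5 TFLOPS bar that Epic's Elemental demo set:

```python
# GPU compute ratings quoted above, in GFLOPS
platforms = {
    "NVIDIA GRID": 2448,
    "PlayStation 4": 1900,
    "Xbox One": 1300,
    "PlayStation Now (PS3 RSX)": 192,
}

epic_bar_gflops = 2500  # Epic's Elemental demo target

for name, gflops in platforms.items():
    ratio = gflops / epic_bar_gflops
    print(f"{name}: {gflops / 1000:.2f} TFLOPS ({ratio:.0%} of Epic's bar)")
# GRID lands at ~98% of the bar; the Xbox One at ~52%
```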
The NVIDIA GRID preview will launch in November for North America, with East Coast and West Coast servers. It will expand in December for Western Europe, and in “Q2” for Asia Pacific. The service will be free for SHIELD users until June 30th, 2015. The Android 5.0 Update for the SHIELD Tablet will be available on November 18th.
Subject: General Tech | November 13, 2014 - 01:39 PM | Jeremy Hellstrom
Tagged: cycle computing, supercomputer, gojira, bargain
While the new Gojira supercomputer is not more powerful than the University of Tokyo's 1-petaflop Oakleaf-FX, if you look at it from a price-to-performance ratio the $5,500 Gojira is more than impressive. It has a peak theoretical performance of 729 teraflops, achieved by using over 71,000 Ivy Bridge cores across several Amazon Web Services regions, and it provided the equivalent of 70.75 years of compute time. The cluster was built in an incredibly short time, going from zero to 50,000 cores in 23 minutes and hitting its peak after 60 minutes. You won't be playing AC: Unity on it any time soon, but if you want to rapidly test virtual prototypes these guys can do it for an insanely low price. Catch more at The Register and ZDNet; the Cycle Computing page seems to be down for the moment.
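The price-to-performance claim is easy to sanity-check from the figures above (a rough sketch; note the $5,500 is the cost of the run, not of owning any hardware):

```python
cost_usd = 5_500
peak_tflops = 729
compute_years = 70.75

# Cost per peak teraflop of the cluster: about $7.54
print(f"${cost_usd / peak_tflops:.2f} per peak teraflop")

# 70.75 years of compute expressed as core-hours, and the cost of each
core_hours = compute_years * 365.25 * 24
print(f"{core_hours:,.0f} core-hours at about "
      f"{cost_usd / core_hours * 100:.1f} cents each")
```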
"Cycle Computing has helped hard drive giant Western Digital shove a month's worth of simulations into eight hours on Amazon cores."
Here is some more Tech News from around the web:
- Mac OS X Yosemite disables third-party SSD driver support @ The Inquirer
- Digitimes Research: Lenovo, Asustek to launch US$149 Chromebook
- DAY ZERO, and COUNTING: EVIL 'UNICORN' all-Windows vuln - are YOU patched? @ The Register
- FLASH better than DISK for archiving, say academics. Are they stark, raving mad? @ The Register
Subject: Graphics Cards | November 13, 2014 - 12:46 PM | Scott Michaud
Tagged: nvidia, geforce, gtx 960, maxwell
It is possible that a shipping invoice fragment was leaked for the NVIDIA GeForce GTX 960. Of course, an image of text on a plain, white background is one of the easiest things to fake and/or manipulate, so take it with a grain of salt.
The GTX 960 is said to have 4GB of RAM on the same 256-bit bus. Its video outputs are listed as two DVI, one HDMI, and one DisplayPort, making this graphics card useful for just one G-Sync monitor per card. If I'm reading it correctly, it also seems to have a 993 MHz base clock (boost clock unlisted) and an effective 6008 MHz (1500 MHz actual) RAM clock. This is slightly below the 7 GHz (1750 MHz actual) of the GTX 970 and GTX 980 parts, but it should also be significantly cheaper.
The GeForce GTX 960 is expected to retail in the low-$200 price point... some day.
Subject: Graphics Cards | November 12, 2014 - 09:03 PM | Ryan Shrout
Tagged: Unity, ubisoft, assassin's creed
Over the last couple of days there have been a lot of discussions about the performance of the new Assassin's Creed Unity from Ubisoft on current generation PC hardware. Some readers have expressed annoyance that the game is running poorly, at lower than expected frame rates, at a wide range of image quality settings. Though I haven't published my results yet, we are working on a story comparing NVIDIA and AMD GPUs in Unity, but the truth is that this is occurring on GPUs from both sides.
For example, using a Core i7-3960X and a single GeForce GTX 980 4GB reference card, I see anywhere from 37 FPS to 48 FPS while navigating the crowded city of Paris at 1920x1080 and on the Ultra High preset. Using the Low preset, that frame rate increases to 65-85 FPS or so.
Clearly, those are lower frame rates at 1920x1080 than you'll find in basically any other PC game on the market. The accusation from some in the community is that Ubisoft is either doing this on purpose or out of neglect, with inefficient code. I put some questions to the development team at Ubisoft, and though I only had a short time with them, the answers tell their side of the story.
Ryan Shrout: What in the Unity game engine is putting the most demand on the GPU and its compute resources? Are there specific effects or were there specific design goals for the artists that require as much GPU horsepower as the game does today with high image quality settings?
Ubisoft: Assassin’s Creed Unity is one of the most detailed games on the market and [contains] a giant, open world city built to the scale that we’ve recreated. Paris requires significant details. Some points to note about Paris in Assassin’s Creed Unity:
- Tens of thousands of objects are visible on-screen, casting and receiving shadows.
- Paris is incredibly detailed. For example, Notre-Dame itself is millions of triangles.
- The entire game world has global illumination and local reflections.
- There is realistic, high-dynamic range lighting.
- We temporally stabilized anti-aliasing.
RS: Was there any debate internally about downscaling on effects/image quality to allow for lower end system requirements?
Ubisoft: We talked about this a lot, but our position always came back to us ensuring that Assassin’s Creed Unity is a next-gen only game with breakthrough graphics. With this vision, we did not degrade the visual quality of the game. On PC, we have several option for low-scaling, like disabling AA, decreasing resolution, and we have low option for Texture Quality, Environment Quality and Shadows.
RS: Were you looking forward or planning for future GPUs (or multi-GPU) that will run the game at peak IQ settings at higher frame rates than we have today?
Ubisoft: We targeted existing PC hardware.
RS: Do you envision updates to the game or to future GPU drivers that would noticeably improve performance on current generations of hardware?
Ubisoft: The development team is continuing to work on optimization post-launch through software updates. You’ll hear more details shortly.
Some of the features listed by the developer in the first answer - global illumination methods, high triangle counts, HDR lighting - can be pretty taxing on GPU hardware. I know there are people out there pointing out games that have similar feature sets and that run at higher frame rates, but the truth is that no two game engines are truly equal. If you have seen Assassin's Creed Unity in action you'll be able to tell immediately the game is beautiful, stunningly so. Is it worth that level of detail for the performance levels achieved from current high-end hardware? Clearly that's the debate.
When I asked if Ubisoft had considered scaling back the game to improve performance, they clearly decided against it. The developer had a vision for the look and style of the game and they were dedicated to it; maybe to a fault from some gamers' viewpoint.
Also worth noting is that Ubisoft is continuing to work on optimization post-release; how much of an increase we'll actually see with game patches or driver updates will have to be seen as we move forward. Some developers have a habit of releasing a game and simply abandoning it as it shipped; hopefully we will see more dedication from the Unity team.
So, if the game runs at low frame rates on modern hardware...what is the complaint exactly? I do believe that Ubisoft would have benefited from better performance on lower image quality settings. Swap the settings for yourself in game and you'll see that the quality difference between Low and Ultra High is noticeable, but not dramatically so. Again, this likely harkens back to the desire of Ubisoft to maintain an artistic vision.
Remember that when Crysis 3 launched early last year, running at 1920x1200 at 50 FPS required a GTX 680, the top GPU at the time; and that was at the High settings. The Very High preset only hit 37 FPS on the same card.
PC gamers seem to be creating a double standard. On one hand, none of us want PC ports or games that are developed with consoles in mind and don't take advantage of the power of the PC platform. Games in the Call of Duty series are immensely popular but, until the release of Advanced Warfare, would routinely run at 150-200 FPS at 1080p on a modern PC. Crysis 3 and Assassin's Creed Unity are the opposite of that - games that really tax current CPU and GPU hardware, paving a way forward for future GPUs to be developed and NEEDED.
If you're NVIDIA or AMD, you should applaud this kind of work. Now I am more interested than ever in a GTX 980 Ti, or a R9 390X, to see what Unity will play like, or what Far Cry 4 will run at, or if Dragon Age Inquisition looks even better.
Of course, if we can get more performance from a better optimized or tweaked game, we want that too. Developers need to be able to cater to as wide a PC gaming audience as possible, but sometimes creating a game that can scale between running on a GTX 650 Ti and a GTX 980 is a huge pain. And with limited time frames and budgets, don't we want at least some developers to focus on visual quality rather than "dumbing down" the product?
Let me know what you all think - I know this is a hot-button issue!
UPDATE: Many readers in the comments are bringing up the bugs and artifacts within Unity, pointing to YouTube videos and whatnot. Those are totally valid complaints about the game, but don't necessarily reflect on the game's performance - which is what we were trying to target with this story. Having crashes and bugs in the game is disappointing, but again, Ubisoft and Assassin's Creed Unity aren't alone here. Have you seen the bugs in Skyrim or Tomb Raider? Hopefully Ubisoft will be more aggressive in addressing them in the near future.
UPDATE 2: I also wanted to comment that even though I seem to be defending Ubisoft around the performance of Unity, my direct feedback to them was that they should enable modes in the game that allow it to play at higher frame rates and even lower image quality settings, even if they were unable to find ways to "optimize" the game's efficiency. So far the developer seems aware of all the complaints around performance, bugs, physics, etc. and is going to try to address them.
UPDATE 3: In the last day or so, a couple of other media outlets have posted anonymous information indicating that the draw call count of Assassin's Creed Unity is at fault for the poor performance of the game on PCs. According to this "anonymous" source, while the consoles have low-level API access to hardware and can accept and process several times the draw calls, DirectX 11 can only handle "7,000 - 10,000 peak draw calls." Unity apparently is "pushing in excess of 50,000 draw calls per frame" and thus is putting more pressure on the PC than it can handle, even with high end CPU and GPU hardware. The fact that these comments are "anonymous" is pretty frustrating, as it means that even if they are accurate, they can't be taken as the truth without confirmation from Ubisoft. If this turns out to be true, then it would be a confirmation that Ubisoft didn't take the time to implement a DX11 port correctly. If it's not true, or only partially to blame, we are left with more meaningless finger-pointing.
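To see why a draw call count like that matters, consider the per-call CPU budget it implies; a back-of-the-envelope sketch using the figure attributed to the anonymous source. If the DX11 driver's per-call overhead on the submission thread exceeds this budget, frame rate suffers no matter how fast the GPU is:

```python
draw_calls_per_frame = 50_000  # figure attributed to the anonymous source
fps_target = 60
frame_budget_us = 1_000_000 / fps_target  # microseconds available per frame

budget_per_call_us = frame_budget_us / draw_calls_per_frame
print(f"{budget_per_call_us:.2f} microseconds per draw call "
      f"to hold {fps_target} FPS")
# Even at 30 FPS the budget only doubles, to about 0.67 microseconds per call
```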
Subject: General Tech | November 12, 2014 - 07:38 PM | Scott Michaud
Tagged: visual studio, microsoft
While this is significantly different from what we usually write about, I have a feeling that there is some overlap with our audience.
Update: If you use Visual Studio Express 2013, you may wish to uninstall it before installing Community. My experience seems to be that it thinks that both are installed to the same directory, and so uninstalling Express after installing Community will break both. I am currently repairing Community, which should fix it, but there's no sense for you to install twice if you know better.
Visual Studio Express has been the free, cut-down option for small and independent software developers. It can be used for commercial applications, but it was severely limited in many areas, such as its lack of plug-in support. Today, Microsoft announced Visual Studio Community 2013, which is a free version of Visual Studio that is equivalent to Visual Studio Professional 2013 for certain users (explained below). According to TechCrunch, while Visual Studio Express will still be available for download, Community is expected to be the version going forward.
Image Credit: Wikimedia (modified)
There are four use cases for Visual Studio Community 2013:
- To contribute to open-source projects (unlimited users)
- To use in a classroom environment for learning (unlimited users)
- To use as a tool for Academic research (unlimited users)
- To create free or commercial, closed-source applications (up to 5 users)
- You must be an individual or small studio with less than 250 PCs
- You must have no more than $1 million USD in yearly revenue
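The use cases above can be summarized as a quick sketch; the function and parameter names are my own, and this is a rough reading of the terms as listed here, not legal advice:

```python
def community_2013_eligible(open_source: bool, classroom: bool, research: bool,
                            team_size: int, org_pcs: int,
                            annual_revenue_usd: float) -> bool:
    """Hypothetical helper summarizing the Visual Studio Community 2013
    eligibility rules described in this post -- not legal advice."""
    # Open-source contribution, classroom learning, and academic research:
    # unlimited users
    if open_source or classroom or research:
        return True
    # Otherwise: free or commercial closed-source work is limited to
    # individuals or teams of up to 5, under the PC-count and revenue caps
    return (team_size <= 5 and org_pcs < 250
            and annual_revenue_usd < 1_000_000)

# A 6-developer studio writing closed-source software would not qualify
print(community_2013_eligible(False, False, False, 6, 20, 500_000))  # -> False
```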
Honestly, this is a give-and-take scenario, but it seems generally positive. I can see this being problematic for small studios with 6+ developers, but they can (probably) still use Visual Studio Express 2013 Update 3 until it gets too old. For basically everyone else, this means that you do not need to worry about technical restrictions when developing software. This opens the avenue for companies like NVIDIA (Nsight Visual Studio Edition) and Epic Games (Unreal Engine 4) to deliver their plug-ins to the independent developer community. When I get a chance, and after it finishes installing, I will probably check to see if those examples already work.
Visual Studio Community 2013 Update 4 is available now at Microsoft's website.
Subject: Editorial | November 12, 2014 - 06:58 PM | Josh Walrath
Tagged: Wyoming Whiskey, Whiskey, Kirby, Bourbon
Last year around this time I reviewed my first bottle of Wyoming Whiskey. Overall, I was quite pleased with how this particular spirit has come along. You can read my entire review here. It also includes a little interview with one of the co-founders of Wyoming Whiskey, David Defazio. The landscape has changed a little throughout the past year, and the distillery has recently released a second product in limited quantities to the Wyoming market. The Single Barrel Bourbon selections come from carefully selected barrels and are not blended with others. I had the chance to chat with David again recently and received some interesting information from him about the latest product and where the company is headed.
Picture courtesy of Wyoming Whiskey
I noticed that you have a new single barrel product on the shelves. How would you characterize it compared to the standard bottle you sell?
These very few barrels are selected from many and only make the cut if they meet very high standards. We have only bottled 4 so far. And, the State has sold out. All of our product has matured meaningfully since last year and these barrels have benefitted the most as evidenced by their balance and depth of character. The finish is wickedly smooth. I have not heard one negative remark about the Single Barrel Product.
Have you been able to slowly lengthen the time that the bourbon matures until it is bottled, or is it around the same age as what I sampled last year?
Yes, these barrels are five years old, as is the majority of our small batch product.
How has the transition from Steve to Elizabeth as master distiller gone?
Elizabeth is no longer with us. She had intended to train under Steve for the year, but when his family drew him back to Kentucky in February, this plan disintegrated. So, our crew is making bourbon under the direction of Sam Mead, my partners' son, who is our production manager. He has already applied his engineering degree in ways that help increase quality and production. And he's just getting started.
What other new products may be showing up in the next year?
You may see a barrel-strength bourbon from us. There are a couple of honey barrels that we are setting aside for this purpose.
Wyoming Whiskey had originally hired Steve Nally of Maker’s Mark fame, somehow pulling him out of retirement. He was the master distiller for quite a few years, but moved on from the company this past year. He is now heading up a group that is opening a new distillery in Kentucky and hoping to break into the bourbon market; they expect their first products to be aged around seven years. As we all know, it is hard for a company to stay afloat if it is not selling product. In the meantime, it looks like this group will do what so many other “craft” distillers have been caught doing: selling bourbon produced by mega-factories and labeling it as their own.
Bourbon has had quite the renaissance in the past few years with the popularity of the spirit soaring. People go crazy trying to find limited edition products like Pappy Van Winkle and many estimate that overall bourbon production in the United States will not catch up to demand anytime soon. This of course leads to higher prices and tighter supply for the most popular of brands.
It is good to see that Wyoming Whiskey is extending the age of the barrels they bottle, as it can only lead to smoother, more refined bourbon. From most of my tasting, six to seven years seems about optimal for most bourbon. There are other processes that can speed up these results; I have tasted batches only 18 months old that rival much older products. I look forward to hearing more about what Wyoming Whiskey is doing to improve their product.
Subject: General Tech | November 12, 2014 - 06:09 PM | Jeremy Hellstrom
Tagged: audio, roccat, Kave XTD 5.1, gaming headset
The name implies that the Roccat Kave XTD 5.1 Digital headset provides virtual surround sound, but in fact it has three 40mm driver units in each earcup, giving you front, rear, and centre channels, though you can use the provided software to switch to stereo if you prefer. The earcups are leather over foam, which makes them quite comfortable, although they can get warm over extended sessions, and the microphone boom is removable for when it would be in your way. The headset also offers noise cancellation, the ability to pair with a phone over Bluetooth, and an integrated sound card, all part of the reason it costs $150. Modders-Inc were impressed by the sound card's four speaker plugs on the rear, which let you switch between sending the 5.1 signal to the Kave XTD or to external speakers. Audio reviews are always very subjective, as it is difficult to rate perceived sound quality for anyone but yourself, but you should still check out Modders-Inc's take on the software and hardware in their full review.
"Overall I thought the Roccat Kave XTD 5.1 Digital headset is a solid performer. The audio quality from the headset is excellent. At just slightly under full volume the headset is LOUD!"
Here is some more Tech News from around the web:
- MP4Nation Brainwavz S5 @ techPowerUp
- ROCCAT Kave XTD Stereo Gaming Headset @ Benchmark Reviews
- Corsair H1500 @ HardwareHeaven
- Luxa² E-One Headset Holder Review @ TechwareLabs
- TDK A34 TREK MAX Wireless Weather Resistant Speaker Review @ NikKTech
Subject: General Tech | November 12, 2014 - 05:10 PM | Jeremy Hellstrom
Tagged: linux, amd, radeon, CS:GO, tf2
With the new driver from AMD and a long list of cards to test, from an R9 290 all the way back to an HD 4650, Phoronix has put together a rather definitive look at the current performance you can expect from CS:GO and TF2. CS:GO was tested at 2560x1600 and showed many performance changes from the previous driver, including some great news for 290 owners. TF2 was tested at the same resolution, and many of the GPUs were capable of providing 60 FPS or higher, again with the 290 taking the lead. Phoronix also tested the efficiency of these cards, detailing the number of frames per second per watt used; this may not be pertinent to many users, but it does offer an interesting look at the efficiency of the GPUs. If you are gaming on a Radeon on Linux, now is a good time to upgrade your drivers and associated programs.
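The efficiency metric Phoronix reports is simply average frame rate divided by average power draw. As a quick sketch (the card names and figures below are invented for illustration, not Phoronix's results):

```python
def fps_per_watt(avg_fps: float, avg_watts: float) -> float:
    """Rendering efficiency: frames per second delivered per watt consumed."""
    return avg_fps / avg_watts

# Hypothetical numbers purely for illustration.
cards = {
    "Fast, power-hungry card": (120.0, 250.0),
    "Slower, frugal card": (60.0, 80.0),
}

for name, (fps, watts) in cards.items():
    print(f"{name}: {fps_per_watt(fps, watts):.2f} FPS/W")
```

Note that a slower card can still top the efficiency chart, which is why this figure is interesting to report separately from raw frame rates.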
"The latest massive set of Linux test data we have to share with Linux gamers and enthusiasts is a look at Counter-Strike: Global Offensive and Team Fortress 2 when using the very newest open-source Radeon graphics driver code. The very latest open-source Radeon driver code tested with these popular Valve Linux games were the Linux 3.18 Git kernel, Mesa 10.4-devel, LLVM 3.6 SVN, and xf86-video-ati 7.5.99."
Here is some more Tech News from around the web:
- A Spaceship For Christmas – Elite: Dangerous Dated @ Rock, Paper, SHOTGUN
- Wot I Think – Call Of Duty: Advanced Warfare Singleplayer @ Rock, Paper, SHOTGUN
- Ryse: Son of Rome PC: It’s Boring but Here’s Why You Should Still Buy It @ eTeknix
- Free Beards And Horse Armour: The Witcher 3 DLC Plans @ Rock, Paper, SHOTGUN
- Borderlands: The Pre-Sequel Review @ OCC
- Assassin's Creed: Unity widely found to be slow and buggy @ HEXUS
- Avalanche confirms Just Cause 3 for PC and next-gen consoles @ HEXUS
- The Witcher 2 And Mount & Blade Free In GOG Sale @ Rock, Paper, SHOTGUN
- Skaven Time: Warhammer’s XCOMish Mordheim Out Soon @ Rock, Paper, SHOTGUN
- Microsoft to bring back beloved 1990s super-hit BATTLETOADS!? @ The Register
Subject: General Tech | November 12, 2014 - 04:54 PM | Jeremy Hellstrom
Tagged: mozilla, oculus rift, MozVR
You have been able to browse the web on your Oculus Rift since the first dev kit, but not with a UI designed specifically for the VR device. MozVR is in development, along with specific builds of Firefox and Chromium, to allow Oculus users to browse the web in a new way. It will work on both Mac and Windows, though as of yet there is no mention of Linux support, which should change in the near future. You will need to get your hands on an Oculus to try out the new browser, as the experience simply does not translate to the desktop. The software is open source and available on GitHub, so you can contribute to the overall design of this new way to surf the web, as well as optimize your own site for VR. Check out more on MozVR and Oculus over at The Inquirer.
"MOZILLA IS CONTINUING its 10th birthday celebrations with the launch of a virtual reality (VR) website."
Here is some more Tech News from around the web:
- Elon Musk and ex-Google man mull flinging 700 internet satellites into orbit @ The Register
- Samsung slams door on OLED TVs, makes QUANTUM dot LEAP @ The Register
- Intro to Systemd Runlevels and Service Management Commands @ Linux.com
- TSMC 16FinFET Plus process achieves risk production milestone @ DigiTimes
- Iranian contractor named as Stuxnet 'patient zero' @ The Register
- Hardware Asylum Podcast - MOA 2014 Final and Surprise Lightning
Subject: Storage | November 12, 2014 - 04:44 PM | Allyn Malventano
Tagged: ssd, pcie, NVMe, Intel, DC P3500
Since we reviewed the Intel SSD DC P3700, many of you have been drooling over the idea of an 18-channel NVMe PCIe SSD, even more so given that the P3500 variant was to launch at a $1.50/GB target price. It appears we are getting closer to that release, as the P3500 has been appearing on some web sites in pre-order or out of stock status.
ShopBLT lists the 400GB part at $629 ($1.57/GB), while Antares Pro has an out of stock listing at $611 ($1.53/GB). The other two capacities are available at a similar cost/GB. We were hoping to see an 800GB variant, but it appears Intel has stuck to their initial plan. Here are the part numbers we’ve gathered, for your Googling pleasure:
PCIe add-in card:
- 400GB: SSDPEDMX400G401
- 1.2TB: SSDPEDMX012T401
- 2TB: SSDPEDMX020T401
2.5” SFF-8639 (*not SATA*):
- 400GB: SSDPE2MX400G401
- 1.2TB: SSDPE2MX012T401
- 2TB: SSDPE2MX020T401
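The dollars-per-gigabyte figures quoted above are straight division of listing price by capacity, as a quick sanity check shows:

```python
# Cost-per-GB math for the two 400GB listings mentioned above.
listings = [
    ("ShopBLT, 400GB", 629.00, 400),
    ("Antares Pro, 400GB", 611.00, 400),
]

for vendor, price_usd, capacity_gb in listings:
    # Prints $1.57/GB and $1.53/GB, matching the figures in the text.
    print(f"{vendor}: ${price_usd / capacity_gb:.2f}/GB")
```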
We did spot a date of December 12th in an Amazon listing, but I wouldn't count that as a solid date, as many of the listings there had errors (like 10 packs for the price of one).
November 12, 2014 - 08:29 AM | Sebastian Peak
Noctua has announced three new 92mm CPU coolers today, with two different replacements for the existing NH-U9B SE2 and a new cooler for Intel Xeon LGA2011 processors for workstations and servers. Each model will now use a PWM fan, the recently announced NF-A9.
Image credit: Noctua
In Noctua’s official press release, CEO Roland Mossig is quoted: "The NH-U9B SE2 is still one of our most popular models. The NH-U9S and NH-D9L stay true to this proven formula but now offer even better performance, better compatibility and PWM support for automatic fan speed control."
The first of the two NH-U9B replacements is the NH-U9S, which features an asymmetrical design with 5 heatpipes. The other model will be the NH-D9L, a 4-heatpipe design that is “15mm lower than classic 9cm coolers such as the NH-U9 series (110mm vs. 125mm)”. Noctua states that this will “guarantee full 3U compliance” and also “makes the NH-D9L ideal for compact HTPC and Small Form Factor cases”. Noctua states that the 95x95mm footprint of these new coolers will clear “RAM and PCIe slots on all Intel and most AMD based mainboards, including µATX and ITX.”
The last addition to the 92mm lineup announced today is the server-specific NH-D9DX i4 3U, the replacement for the 4U model NH-U9DX i4. Noctua states that this new cooler uses “the same heatsink as the NH-D9L but comes with LGA2011 mounting for both Square ILM and Narrow ILM Xeon platforms as well as support for LGA13x6.”
The fan powering these new coolers is the NF-A9 PWM. Each cooler uses Noctua’s SecuFirm2 mounting system and comes with a 6-year warranty. Noctua states that all three models are currently shipping and will be available shortly. MSRPs are as follows: NH-U9S, $59.90 USD; NH-D9L, $56.90 USD; NH-D9DX i4 3U, $59.90 USD.
Subject: General Tech | November 12, 2014 - 04:07 AM | Tim Verry
Tagged: system requirements, pc gaming, kyrat, fps, far cry 4
In case you missed it earlier this week, Ubisoft revealed the PC system requirements needed to run Far Cry 4. Developed by Ubisoft Montreal and set to release on November 18th, Far Cry 4 is the latest action-adventure FPS in the Far Cry series. The game uses Ubisoft's Dunia Engine II, a heavily modified engine developed by Kirmaan Aboobaker and originally based on Crytek's CryEngine 1. The player is a Nepalese native who returns to Kyrat, a fictional location in the Himalayas, following the death of their mother, only to become embroiled in a civil war taking place in an open world filled with enemies, weapons, animals, and did I mention weapons?
This bow is a far cry from the only weapon you'll have access to...
According to the developer, Far Cry 4 continues the tradition of an open world environment, but the game world has been tweaked from the Far Cry 3 experience to be a tighter and more story focused experience where the single player story will take precedence over exploration and romps across the mountainous landscape.
While I cannot comment on how the game plays, it certainly looks quite nice, and it will need a beefy modern PC to run at maximum settings. Interestingly, the game seems to scale down decently as well, with the entry-level computer needed to run Far Cry 4 being rather modest.
No matter the hardware level, only 64-bit operating systems need apply: Far Cry 4 requires the 64-bit version of Windows 7 or later to run. At a minimum, Ubisoft recommends a quad-core processor (Intel Core i5-750 or AMD Phenom II X4 955), 4GB of memory, a Radeon HD 5850 or GTX 460, and 30GB of storage.
To get optimal settings, users will need twice the system memory (at least 8GB) and video memory (at least 2GB), a newer quad core CPU such as the Intel i5-2400S or AMD FX-8350, and a modern NVIDIA GTX 680 or AMD Radeon R9 290X graphics card.
Anything beyond that is gravy that will allow gamers to crank up the AA and AF as well as the resolution.
Far Cry 4 will be available in North America on November 18, 2014 for the PC, PS4, Xbox One, PS3, and Xbox 360. Following the North America release, the game is scheduled to launch in Europe and Australia on November 20th, and in Japan on January 22 of next year.