
When your Brown bottoms out too easily, WASD Keyboards CODE with Cherry MX Clear

Subject: General Tech | November 13, 2014 - 06:09 PM |
Tagged: Cherry MX clear, WASD Keyboards, CODE, input, mechanical keyboard, tenkeyless

Scott posted about the WASD Keyboards CODE with Cherry MX Clear switches, but until now we had not found a review of this keyboard.  The Tech Report has changed that with a review that takes a look at the new type of switch, which sits between the Brown and the Green: Clear switches need more force to bottom out than a Brown, but not as much as the clicky-style Green switches.  That is not all this tenkeyless board offers; there are LEDs that can be activated by the DIP switches in the recess found on the back of the keyboard.  In fact, those DIP switches can do more than just enable a nice glow: you can disable the Windows key or even immediately switch to different layouts such as Mac, Dvorak, and Colemak, though sadly they left the Sinclair ZX off of the list.  If this type of switch interests your fingers and you are willing to spend $150 on a keyboard, check out the full review here.

bottom-accessories.jpg

"We've been meaning to try out Cherry MX's clear key switches for a while, and now, we've finally gotten our wish. Join us for a look at WASD Keyboards' Cherry MX clear-infused Code keyboard, a tenkeyless offering with more than a few tricks up its sleeve."

Here is some more Tech News from around the web:

Tech Talk

Zotac ZBOX CI540 Nano; are you a fan of fanless mini PCs?

Subject: Systems | November 13, 2014 - 04:03 PM |
Tagged: zotac, zbox ci540 nano, fanless, haswell, i5-4210Y

The Zotac ZBOX CI540 Nano is a bit more powerful than your average Bay Trail-based mini-PC: it sports a Haswell-based dual-core i5-4210Y, which runs between 1.5 and 1.9GHz and has Intel's HD 4200 graphics onboard.  This won't play AC: Unity, but it comes close to matching a NUC containing a Core i5-4250U; you give up a bit of horsepower for completely silent operation, and for media it has enough power to play your favourite videos.  As you look at Silent PC Review's article you can see the honeycomb-patterned knockouts in the casing, which allow heat to dissipate and will let in liquid if you don't put some thought into where you are going to place the ZBOX.  It does have Bluetooth, and there is an unofficial optional IR receiver that makes it easy to tuck this tiny computer away in a safe place.

01.jpg

"The Zotac ZBOX CI540 Nano gives up a little CPU/GPU horsepower to deliver a completely fanless, silent and full-featured mini-PC experience."

Here are some more Systems articles from around the web:

Systems

Podcast #326 - Intel's Core M 5Y70, Assassin's Creed Unity, Intel P3500 and more!

Subject: General Tech | November 13, 2014 - 03:19 PM |
Tagged: podcast, video, Intel, core m, core m 5y70, Broadwell, broadwell-y, Lenovo, yoga 2 pro, yoga 3 pro, assasins creed unity, ubisoft, farcry 4, p3500, gskill blade

PC Perspective Podcast #326 - 11/13/2014

Join us this week as we discuss Intel's Core M 5Y70, Assassin's Creed Unity, Intel P3500 and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

 

Author:
Subject: Motherboards
Manufacturer: MSI

MSI Redefines AM3+ Value

It is no secret that AMD’s AM3+ motherboard ecosystem has languished for the past year or so, with very few examples of new products hitting the scene.  This is understandable since AMD has not updated the chipset options for AM3+, and only recently did they release updated processors in the form of the FX-8370 and FX-8370e.  It has been two years since the release of the original FX-8350, and a year since the high-TDP FX-9000 series of parts.  For better or for worse, AMD is pushing their APUs far harder to consumers than the aging AM3+ platform.

970_G_01.jpg

MSI has refined their "Gaming" series of products with a distinctive look that catches the eye.

This does not mean that the AM3+ ecosystem is non-viable to either AMD or consumers.  While Intel has stayed ahead of AMD in terms of IPC, TDP, and process technology, the overall competitiveness of the latest AM3+ parts is still quite good when considering price.  Yes, these CPUs will run hotter and pull more power than the Intel parts they directly compete against, but when we look at the prices of comparable motherboards and the CPUs themselves, AMD still holds a price/performance advantage.  The AM3+ processors that feature six and eight cores (3 and 4 modules) are solid performers in a wide variety of applications.  The top-end eight-core products compete well against the latest Intel parts in many gaming scenarios, as well as in productivity applications which leverage multiple threads.

When the Vishera based FX processors were initially introduced we saw an influx of new AM3+ designs that would support these new processors, as well as the planned 220 watt TDP variants that would emerge later.  From that point on we have only seen a smattering of new products based on AM3+.  From all the available roadmaps from AMD that we have seen, we do not expect there to be new products based on Steamroller or Excavator architectures on the AM3+ platform.  AMD is relying on their HSA enabled APUs to retain marketshare and hopefully drive new software technologies that will leverage these products.  The Future really is Fusion…

MSI is bucking this trend.  The company still sees value in the AM3+ market, and they are introducing a new product that looks to more adequately fit the financial realities of that marketplace.  We already have high end boards from MSI, ASRock, Asus, and Gigabyte that are feature packed and go for a relatively low price for enthusiast motherboards.  On the other end of the spectrum we have barebone motherboards based on even older chipsets (SB710/750 based).  In between we often see AMD 970 based boards that offer a tolerable mix of features attached to a low price.

970_G_02.jpg

The bundle is fair, but not exciting.  It offers the basics to get a user up and running quickly.

The MSI 970 Gaming motherboard is a different beast compared to the rest of the market.  It is a Gaming-branded board which offers a host of features that can be considered high end, yet at the same time it is offered for a price less than $100 US.  MSI looks to explore this sweet spot with a motherboard that punches far above its weight class.  This board is a classic balance of price vs. features, but it addresses that balance in a rather unique way.  Part of it might be marketing, but a good chunk of it is smart and solid engineering.

Click to read the entire MSI 970 Gaming Review!

NVIDIA SHIELD Tablet Update for November: Android 5.0, "Green Box", GRID Gaming Service

Subject: Systems, Mobile | November 13, 2014 - 02:48 PM |
Tagged: shield tablet, shield, nvidia, grid, geforce grid

Today, NVIDIA has announced the November update for their SHIELD Tablet, which is really about three announcements that are rolled up together.

nvidia-shield-november-14-01_0.jpg

As expected, the SHIELD Tablet is getting an update to Android 5.0 Lollipop and its new “Material Design” style guide. NVIDIA took the opportunity to refresh the SHIELD HUB (my shift key must think that this is an MSI announcement by now...) in the same design language. While interesting, the two other announcements probably beat it out, especially the GRID streaming service (and how it relates to the Xbox One and the PlayStation 4).

nvidia-shield-november-14-02_0.jpg

But before we get to GRID, let's talk about “The Green Box”. In May, NVIDIA sent us a green crowbar to mark the availability of Half-Life 2 and Portal on the NVIDIA SHIELD. These were full, native ports of the PC titles to ARM and Android, exclusive to the NVIDIA SHIELD. With the November update, Half-Life 2: Episode One has also been ported to the platform. The three games, Portal, Half-Life 2, and Episode One, are also packaged in “The Green Box” bundle, which will be included free of charge with the SHIELD Tablet 32GB. Note that, while the games are included with the tablet, they require a controller to play, which is not included.

Now we talk about GRID.

nvidia-shield-november-14-03_0.jpg

Netflix is a popular service where people can watch a variety of movies from their rolling catalog. It will not replace ownership of certain, intrinsically valuable titles, but there are probably options for anyone who wants to consume some form of entertainment. GRID is a similar service for video games, and it is not the first. We took a look at a preview of OnLive in 2010, connecting to a server about 2400 miles away (over twice the maximum intended range), and found the experience somewhat positive for every game except Unreal Tournament 3 at that relatively extreme latency. Another company, Gaikai, was purchased by Sony and rebranded as PlayStation Now, which serves up a selection of games from the PS3 catalog. Again, content on these services can be pulled at any time, but if you are just looking for the entertainment value, something else will probably be there to scratch your itch.

nvidia-shield-november-14-04_0.jpg

The interesting part that I have been teasing throughout this entire post is the performance of NVIDIA GRID. PlayStation Now is rated at 192 GFLOPs, which is the theoretical GPU compute throughput of the PS3's RSX chip. GRID, on the other hand, is rated for 2448 GFLOPs (~2.5 TFLOPs). This is higher than the PlayStation 4, and almost twice the GPU performance of the Xbox One. On the PC side, it is roughly equivalent to the GeForce GTX 760 Ti.
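Those ratios are easy to sanity-check. A quick calculation using only the GFLOPs figures quoted in this post:

```python
# GPU compute throughput figures as quoted in this post, in GFLOPs.
GRID = 2448
PS4 = 1900
XBOX_ONE = 1300
PS_NOW = 192  # PS3 RSX rating used by PlayStation Now

# How far ahead GRID sits, per the quoted numbers.
print(f"GRID vs PS4:  {GRID / PS4:.2f}x")       # ~1.29x higher
print(f"GRID vs XB1:  {GRID / XBOX_ONE:.2f}x")  # ~1.88x, i.e. "almost twice"
print(f"GRID vs PS Now: {GRID / PS_NOW:.1f}x")
```

The "almost twice the GPU performance of the Xbox One" claim checks out at about 1.9x; whether theoretical GFLOPs translate to real frame rates is, of course, another matter.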

epic-samaritan-ue3.png

This compute rating has a hidden story, too. Back in 2011, Epic Games demoed “Samaritan” in Unreal Engine 3. This was the bar that Epic Games set for Microsoft, Sony, and Nintendo to mark a new console generation. When Unreal Engine 4 was unveiled at the end of E3 2012, it was embodied in the Elemental demo, which ran on hardware rated at (you guessed it) 2.5 TFLOPs. At the PlayStation 4 (1.9 TFLOPs) announcement, the demo was scaled back with reduced particle counts and lighting complexity. It was not shown at either Xbox One (1.3 TFLOPs) announcement at all.

epic-elemental-ue4-pcvps4.jpg

What all of that means is simple: NVIDIA GRID is the only fixed hardware platform (that I am aware of) to meet Epic's vision of a next-gen gaming system. I say fixed, of course, because the PC can more than double that figure with a single card, and some games scale to four discrete GPUs. This also says nothing about CPU performance, system memory, or video memory, but GRID has the GPU in the right place for a next-gen platform.

nvidia-shield-november-14-01b.jpg

The NVIDIA GRID preview will launch in November for North America, with East Coast and West Coast servers. It will expand in December for Western Europe, and in “Q2” for Asia Pacific. The service will be free for SHIELD users until June 30th, 2015. The Android 5.0 Update for the SHIELD Tablet will be available on November 18th.

Source: NVIDIA

Oh no, there goes the University of Tokyo! Go, go Gojira!

Subject: General Tech | November 13, 2014 - 01:39 PM |
Tagged: cycle computing, supercomputer, gojira, bargain

While the new Gojira supercomputer is not more powerful than the University of Tokyo's Oakleaf-FX, which offers 1 petaflop of performance, from a price-to-performance perspective the $5,500 Gojira run is more than impressive.  It has a peak theoretical performance of 729 teraflops, achieved by spinning up over 71,000 Ivy Bridge cores across several Amazon Web Services regions, and it provided the equivalent of 70.75 years of compute time.  The cluster was built in an incredibly short time, going from zero to 50,000 cores in 23 minutes and hitting its peak after 60 minutes.  You won't be playing AC: Unity on it any time soon, but if you want to rapidly test virtual prototypes these guys can do it for an insanely low price.  Catch more at The Register and ZDNet; the Cycle Computing page seems to be down for the moment.
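Those figures hang together nicely. A quick sanity check, assuming "70.75 years of compute time" means serial core-time across the roughly 71,000 cores:

```python
# Back-of-envelope check on the Gojira figures quoted above.
CORES = 71_000            # Ivy Bridge cores in the cluster (approximate)
COMPUTE_YEARS = 70.75     # equivalent serial compute time
HOURS_PER_YEAR = 8_760

core_hours = COMPUTE_YEARS * HOURS_PER_YEAR
wall_clock_hours = core_hours / CORES
print(f"{core_hours:,.0f} core-hours ~ {wall_clock_hours:.1f} h wall clock")
```

That works out to roughly 8.7 hours of wall-clock time, which lines up with the "month's worth of simulations into eight hours" claim in the quote below.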

cycle-gojira-cyclecloud-tflops-620x208.png

"Cycle Computing has helped hard drive giant Western Digital shove a month's worth of simulations into eight hours on Amazon cores."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

NVIDIA GeForce GTX 960 Specifications Potentially Leaked

Subject: Graphics Cards | November 13, 2014 - 12:46 PM |
Tagged: nvidia, geforce, gtx 960, maxwell

It is possible that a shipping invoice fragment was leaked for the NVIDIA GeForce GTX 960. Of course, an image of text on a plain, white background is one of the easiest things to fake and/or manipulate, so take it with a grain of salt.

nvidia-gtx-960-shipping.jpg

The GTX 960 is said to have 4GB of RAM on the same 256-bit bus. Its video outputs are listed as two DVI, one HDMI, and one DisplayPort, making this graphics card useful for just one G-Sync monitor per card. If I'm reading it correctly, it also seems to have a 993 MHz base clock (boost clock unlisted) and an effective 6008 MHz (1502 MHz actual) RAM clock. This is slightly below the 7 GHz (1750 MHz actual) of the GTX 970 and GTX 980 parts, but it should also be significantly cheaper.
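If the leaked numbers are accurate, the implied memory bandwidth is straightforward to compute (GDDR5 transfers data at four times its actual clock, hence the "effective" figure):

```python
# Memory bandwidth implied by the leaked GTX 960 specs.
bus_width_bits = 256
effective_mhz = 6008      # effective data rate; 4x the actual GDDR5 clock

# bits/s across the bus, converted to GB/s.
bandwidth_gbs = effective_mhz * 1e6 * bus_width_bits / 8 / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")   # ~192.3 GB/s
```

For comparison, the same math on the GTX 980's 7 GHz effective clock gives about 224 GB/s, so the leaked part would trail it by roughly 14% on bandwidth alone.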

The GeForce GTX 960 is expected to retail in the low-$200 price point... some day.

Source: Reader Tip

Ubisoft Responds to Low Frame Rates in Assassin's Creed Unity

Subject: Graphics Cards | November 12, 2014 - 09:03 PM |
Tagged: Unity, ubisoft, assassin's creed

Over the last couple of days there have been a lot of discussions about the performance of the new Assassin's Creed Unity from Ubisoft on current generation PC hardware. Some readers have expressed annoyance that the game is running poorly, at lower than expected frame rates, at a wide range of image quality settings. Though I haven't published my results yet, we are working on a story comparing NVIDIA and AMD GPUs in Unity, but the truth is that this is occurring on GPUs from both sides.

For example, using a Core i7-3960X and a single GeForce GTX 980 4GB reference card, I see anywhere from 37 FPS to 48 FPS while navigating the crowded city of Paris at 1920x1080 and on the Ultra High preset. Using the Low preset, that frame rate increases to 65-85 FPS or so.
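Converting those FPS figures to per-frame time budgets puts them in perspective; it's a trivial calculation, but a useful one when comparing presets:

```python
# Frame rate to frame time: how many milliseconds the GPU spends per frame.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# The Ultra High and Low figures observed above.
for fps in (37, 48, 65, 85):
    print(f"{fps} FPS = {frame_time_ms(fps):.1f} ms per frame")
```

At the low end of the Ultra High range, each frame takes about 27 ms, well over the 16.7 ms budget needed for a steady 60 FPS.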

unity3.jpg

Clearly, those are lower frame rates at 1920x1080 than you'll find in basically any other PC game on the market. The accusation from some in the community is that Ubisoft is either doing this on purpose or doing it out of neglect, shipping inefficient code. I put some questions to the development team at Ubisoft, and though I only had a short time with them, the answers tell their side of the story.

Ryan Shrout: What in the Unity game engine is putting the most demand on the GPU and its compute resources? Are there specific effects or were there specific design goals for the artists that require as much GPU horsepower as the game does today with high image quality settings?

Ubisoft: Assassin’s Creed Unity is one of the most detailed games on the market and [contains] a giant, open world city built to the scale that we’ve recreated. Paris requires significant details. Some points to note about Paris in Assassin’s Creed Unity:

  • There are tens of thousands of objects visible on-screen, casting and receiving shadows.
  • Paris is incredibly detailed. For example, Notre-Dame itself is millions of triangles.
  • The entire game world has global illumination and local reflections.
  • There is realistic, high-dynamic range lighting.
  • We temporally stabilized anti-aliasing.

RS: Was there any debate internally about downscaling on effects/image quality to allow for lower end system requirements?

Ubisoft: We talked about this a lot, but our position always came back to us ensuring that Assassin’s Creed Unity is a next-gen only game with breakthrough graphics. With this vision, we did not degrade the visual quality of the game. On PC, we have several option for low-scaling, like disabling AA, decreasing resolution, and we have low option for Texture Quality, Environment Quality and Shadows.

RS: Were you looking forward or planning for future GPUs (or multi-GPU) that will run the game at peak IQ settings at higher frame rates than we have today?

Ubisoft: We targeted existing PC hardware.

RS: Do you envision updates to the game or to future GPU drivers that would noticeably improve performance on current generations of hardware?

Ubisoft: The development team is continuing to work on optimization post-launch through software updates. You’ll hear more details shortly.

Some of the features listed by the developer in the first answer - global illumination methods, high triangle counts, HDR lighting - can be pretty taxing on GPU hardware. I know there are people out there pointing out games that have similar feature sets and that run at higher frame rates, but the truth is that no two game engines are truly equal. If you have seen Assassin's Creed Unity in action you'll be able to tell immediately the game is beautiful, stunningly so. Is it worth that level of detail for the performance levels achieved from current high-end hardware? Clearly that's the debate.

unity2.jpg

When I asked if Ubisoft had considered scaling back the game to improve performance, they clearly decided against it. The developer had a vision for the look and style of the game and they were dedicated to it; maybe to a fault from some gamers' viewpoint.

Also worth noting is that Ubisoft is continuing to work on optimization post-release; how much of an increase we'll actually see from game patches or driver updates remains to be seen as we move forward. Some developers have a habit of releasing a game and simply abandoning it as it shipped; hopefully we will see more dedication from the Unity team.

So, if the game runs at low frame rates on modern hardware...what is the complaint exactly? I do believe that Ubisoft would have benefited from better performance at lower image quality settings. You can tell by swapping the settings for yourself in game: the quality difference between Low and Ultra High is noticeable, but not dramatically so. Again, this likely harkens back to Ubisoft's desire to maintain an artistic vision.

Remember that when Crysis 3 launched early last year, running at 1920x1200 at 50 FPS required a GTX 680, the top GPU at the time; and that was at the High settings. The Very High preset only hit 37 FPS on the same card.

PC gamers seem to be creating a double standard. On one hand, none of us want PC ports or games developed with consoles in mind that don't take advantage of the power of the PC platform. Games in the Call of Duty series are immensely popular but, until the release of Advanced Warfare, would routinely run at 150-200 FPS at 1080p on a modern PC. Crysis 3 and Assassin's Creed Unity are the opposite of that: games that really tax current CPU and GPU hardware, paving a way forward for future GPUs to be developed and NEEDED.

If you're NVIDIA or AMD, you should applaud this kind of work. Now I am more interested than ever in a GTX 980 Ti, or a R9 390X, to see what Unity will play like, or what Far Cry 4 will run at, or if Dragon Age Inquisition looks even better.

Of course, if we can get more performance from a better optimized or tweaked game, we want that too. Developers need to be able to cater to as wide a PC gaming audience as possible, but sometimes creating a game that can scale between running on a GTX 650 Ti and a GTX 980 is a huge pain. And with limited time frames and budgets, don't we want at least some developers to focus on visual quality rather than "dumbing down" the product?

Let me know what you all think - I know this is a hot-button issue!

UPDATE: Many readers in the comments are bringing up the bugs and artifacts within Unity, pointing to YouTube videos and whatnot. Those are totally valid complaints about the game, but don't necessarily reflect on the game's performance - which is what we were trying to target with this story. Having crashes and bugs in the game is disappointing, but again, Ubisoft and Assassin's Creed Unity aren't alone here. Have you seen the bugs in Skyrim or Tomb Raider? Hopefully Ubisoft will be more aggressive in addressing them in the near future. 

UPDATE 2: I also wanted to comment that even though I seem to be defending Ubisoft around the performance of Unity, my direct feedback to them was that they should enable modes in the game that allow it to play at higher frame rates and even lower image quality settings, even if they were unable to find ways to "optimize" the game's efficiency. So far the developer seems aware of all the complaints around performance, bugs, physics, etc. and is going to try to address them.

UPDATE 3: In the last day or so, a couple of other media outlets have posted anonymous information indicating that the draw call count of Assassin's Creed Unity is at fault for the game's poor performance on PCs. According to this "anonymous" source, while the consoles have low-level API access that lets them accept and process several times as many draw calls, DirectX 11 can only handle "7,000 - 10,000 peak draw calls." Unity apparently is "pushing in excess of 50,000 draw calls per frame" and thus is putting more pressure on the PC than it can handle, even with high-end CPU and GPU hardware. The fact that these comments are "anonymous" is pretty frustrating, as it means that even if they are accurate, they can't be taken as the truth without confirmation from Ubisoft. If this turns out to be true, then it would be a confirmation that Ubisoft didn't take the time to implement a DX11 port correctly. If it's not true, or only partially to blame, we are left with more meaningless finger-pointing.
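Taking the anonymous figures at face value (and they are unconfirmed, so treat this as illustration only), the scale of the claimed overshoot is easy to quantify:

```python
# All figures are the unconfirmed numbers quoted in the update above.
DX11_PEAK_PER_FRAME = 10_000   # claimed upper bound for DX11 draw calls/frame
UNITY_CALLS_PER_FRAME = 50_000 # claimed Unity draw calls/frame

overshoot = UNITY_CALLS_PER_FRAME / DX11_PEAK_PER_FRAME
print(f"{overshoot:.0f}x over the claimed DX11 ceiling")

# Even at a modest 30 FPS target, the driver would have to submit:
print(f"{UNITY_CALLS_PER_FRAME * 30:,} draw calls per second")
```

A 5x overshoot per frame, multiplied across every frame, would indeed put enormous pressure on the DX11 driver thread, which is exactly the bottleneck that lower-level console APIs (and, later, APIs like DX12) are designed to avoid.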

Visual Studio Community 2013 Announced

Subject: General Tech | November 12, 2014 - 07:38 PM |
Tagged: visual studio, microsoft

While this is significantly different from what we usually write about, I have a feeling that there is some overlap with our audience.

Update: If you use Visual Studio Express 2013, you may wish to uninstall it before installing Community. My experience suggests that both are installed to the same directory, so uninstalling Express after installing Community will break both. I am currently repairing Community, which should fix it, but there's no sense in installing twice if you know better.

Visual Studio Express has been the free, cut-down option for small and independent software developers. It can be used for commercial applications, but it was severely limited in many areas, such as its lack of plug-in support. Today, Microsoft announced Visual Studio Community 2013, which is a free version of Visual Studio that is equivalent to Visual Studio Professional 2013 for certain users (explained below). According to TechCrunch, while Visual Studio Express will still be available for download, Community is expected to be the version going forward.

microsoft-vs-logo.png

Image Credit: Wikimedia (modified)

There are four use cases for Visual Studio Community 2013:

  • To contribute to open-source projects (unlimited users)
  • To use in a classroom environment for learning (unlimited users)
  • To use as a tool for Academic research (unlimited users)
  • To create free or commercial, closed-source applications (up to 5 users)
    • You must be an individual or a small studio with fewer than 250 PCs
    • You must have no more than $1 million USD in yearly revenue
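The license terms above amount to a simple decision procedure. Here is a hypothetical helper encoding them; the function name and parameters are my own shorthand, not anything from Microsoft's actual license text:

```python
# Hypothetical sketch of the Community 2013 eligibility rules listed above.
def community_eligible(open_source: bool, classroom: bool, research: bool,
                       dev_count: int, pc_count: int, revenue_usd: float) -> bool:
    # Open-source, classroom, and academic research use: unlimited users.
    if open_source or classroom or research:
        return True
    # Free or commercial closed-source work: small teams only.
    return dev_count <= 5 and pc_count < 250 and revenue_usd <= 1_000_000

# A six-developer closed-source studio would fall outside the terms:
print(community_eligible(False, False, False, 6, 50, 100_000))   # False
# A five-developer indie shop under the revenue cap qualifies:
print(community_eligible(False, False, False, 5, 100, 500_000))  # True
```

Consult the actual license for edge cases; this sketch only captures the four bullet points as summarized in this post.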

Honestly, this is a give-and-take scenario, but it seems generally positive. I can see this being problematic for small studios with 6+ developers, but they can (probably) still use Visual Studio Express 2013 Update 3 until it gets too old. For basically everyone else, this means that you do not need to worry about technical restrictions when developing software. This opens the avenue for companies like NVIDIA (Nsight Visual Studio Edition) and Epic Games (Unreal Engine 4) to deliver their plug-ins to the independent developer community. When I get a chance, and after it finishes installing, I will probably check to see if those examples already work.

Visual Studio Community 2013 Update 4 is available now at Microsoft's website.

Source: Microsoft

Following Up with Wyoming Whiskey

Subject: Editorial | November 12, 2014 - 06:58 PM |
Tagged: Wyoming Whiskey, Whiskey, Kirby, Bourbon

Last year around this time I reviewed my first bottle of Wyoming Whiskey.  Overall, I was quite pleased with how this particular spirit has come along.  You can read my entire review here.  It also includes a little interview with one of the co-founders of Wyoming Whiskey, David Defazio.  The landscape has changed a little throughout the past year, and the distillery has recently released a second product in limited quantities to the Wyoming market.  The Single Barrel Bourbon selections come from carefully selected barrels and are not blended with others.  I had the chance to chat with David again recently and received some interesting information from him about the latest product and where the company is headed.

dingle_barrel.jpg

Picture courtesy of Wyoming Whiskey

I noticed that you have a new single barrel product on the shelves.  How would you characterize this as compared to the standard bottle you sell?

These very few barrels are selected from many and only make the cut if they meet very high standards.  We have only bottled 4 so far.  And, the State has sold out.  All of our product has matured meaningfully since last year and these barrels have benefitted the most as evidenced by their balance and depth of character.  The finish is wickedly smooth.  I have not heard one negative remark about the Single Barrel Product.

Have you been able to slowly lengthen the time that the bourbon matures until it is bottled, or is it around the same age as what I sampled last year?

Yes, these barrels are five years old, as is the majority of our small batch product.

How has the transition been from Steve to Elizabeth as master distiller?

Elizabeth is no longer with us.  She had intended to train under Steve for the year, but when his family drew him back to Kentucky in February, this plan disintegrated.  So, our crew is making bourbon under the direction of Sam Mead, my partners' son, who is our production manager.  He has already applied his engineering degree in ways that help increase quality and production.  And he's just getting started.

What other new products may be showing up in the next year?

You may see a barrel-strength bourbon from us.  There are a couple of honey barrels that we are setting aside for this purpose.

Wyoming Whiskey had originally hired Steve Nally of Maker’s Mark fame, somehow pulling him out of retirement.  He was the master distiller for quite a few years before moving on from the company this past year.  He is now heading up a group that is opening a new distillery in Kentucky, hoping to break into the bourbon market; they expect their first products to be aged around 7 years.  As we all know, it is hard for a company to stay afloat if it is not selling product.  In the meantime, it looks like this group will do what so many other “craft” distillers have been caught doing: selling bourbon produced in mega-factories and labeling it as their own.

Bourbon has had quite the renaissance in the past few years with the popularity of the spirit soaring.  People go crazy trying to find limited edition products like Pappy Van Winkle and many estimate that overall bourbon production in the United States will not catch up to demand anytime soon.  This of course leads to higher prices and tighter supply for the most popular of brands.

It is good to see that Wyoming Whiskey is lengthening out the age of the barrels that they are bottling, as it can only lead to smoother and more refined bourbon.  From most of my tasting, it seems that 6 to 7 years is about optimal for most bourbon.  There are other processes that can speed up these results, and I have tasted batches that are only 18 months old and rival that of much older products.  I look forward to hearing more about what Wyo Whiskey is doing to improve their product.

Roccat Kave XTD 5.1 Digital, seriously loud surround sound

Subject: General Tech | November 12, 2014 - 06:09 PM |
Tagged: audio, roccat, Kave XTD 5.1, gaming headset

The name implies that the Roccat Kave XTD 5.1 Digital headset provides virtual surround sound, but in fact it has three 40mm drivers in each earcup, giving you discrete front, rear, and centre channels, though you can use the provided software to switch to stereo if you prefer.  The earcups are leather over foam, which makes them quite comfortable, although they can get warm after extended periods, and the microphone boom is removable for when it would be in your way.  The headset also offers noise cancellation, the ability to pair with a phone over Bluetooth, and an integrated sound card, all part of the reason it costs $150.  Modders-Inc were impressed by that sound card's four speaker plugs on the rear, which allow you to switch between sending a 5.1 signal to the Kave XTD or to external speakers.  Audio reviews are always very subjective, as it is difficult to rate perceived sound quality for anyone but yourself, but you should still check out Modders-Inc's take on the software and hardware in their full review.

DSC_2855.jpg

"Overall I thought the Roccat Kave XTD 5.1 Digital headset is a solid performer. The audio quality from the headset is excellent. At just slightly under full volume the headset is LOUD!"

Here is some more Tech News from around the web:

Audio Corner

Source: Modders Inc

CS:GO and TF2 on Linux and Radeon

Subject: General Tech | November 12, 2014 - 05:10 PM |
Tagged: linux, amd, radeon, CS:GO, tf2

With the new driver from AMD and a long list of cards to test, from an R9 290 all the way back to an HD 4650, Phoronix has put together a rather definitive look at the current performance you can expect from CS:GO and TF2.  CS:GO was tested at 2560x1600 and showed many performance changes from the previous driver, including some great news for 290 owners.  TF2 was tested at the same resolution, and many of the GPUs were capable of providing 60 FPS or higher, again with the 290 taking the lead.  Phoronix also tested the efficiency of these cards, detailing the number of frames per second per watt used; this may not be pertinent to many users, but it does offer an interesting look at the efficiency of the GPUs.  If you are gaming on a Radeon on Linux, now is a good time to upgrade your drivers and associated programs.
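The frames-per-watt metric Phoronix uses is just a ratio, but it inverts the usual performance ranking when a slow card sips power. The sample numbers below are made up purely for illustration, not taken from their results:

```python
# Efficiency metric used in the Phoronix testing: frames per second per watt.
def fps_per_watt(fps: float, watts: float) -> float:
    return fps / watts

# Hypothetical cards: a fast, power-hungry GPU vs. a slow, frugal one.
print(f"{fps_per_watt(90, 250):.2f} FPS/W")  # 0.36 FPS/W for the fast card
print(f"{fps_per_watt(45, 60):.2f} FPS/W")   # 0.75 FPS/W for the frugal card
```

In this made-up example the slower card is more than twice as efficient, which is why the metric is interesting even when raw frame rates tell a different story.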

image.php_.jpg

"The latest massive set of Linux test data we have to share with Linux gamers and enthusiasts is a look at Counter-Strike: Global Offensive and Team Fortress 2 when using the very newest open-source Radeon graphics driver code. The very latest open-source Radeon driver code tested with these popular Valve Linux games were the Linux 3.18 Git kernel, Mesa 10.4-devel, LLVM 3.6 SVN, and xf86-video-ati 7.5.99."

Here is some more Tech News from around the web:

Gaming

Source: Phoronix

Browse the web with your Oculus and MozVR

Subject: General Tech | November 12, 2014 - 04:54 PM |
Tagged: mozilla, oculus rift, MozVR

You have been able to browse the web on your Oculus Rift since the first dev kit, but not with a UI designed specifically for the VR device.  MozVR is in development, along with special builds of Firefox and Chromium, to let Oculus users browse the web in a new way.  It works with both Mac and Windows, though as of yet there is no mention of Linux support, which will hopefully change in the near future.  You need to get your hands on an Oculus to try out the new browser; it simply is not going to translate to the desktop.  The software is open source and available on GitHub, so you can contribute to the overall design of this new way to surf the web, as well as optimize your own site for VR.  Check out more on MozVR and Oculus over at The Inquirer.

moxvr.PNG

"MOZILLA IS CONTINUING its 10th birthday celebrations with the launch of a virtual reality (VR) website."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

The Intel SSD DC P3500 is coming sooner than we thought

Subject: Storage | November 12, 2014 - 04:44 PM |
Tagged: ssd, pcie, NVMe, Intel, DC P3500

Since we reviewed the Intel SSD DC P3700, many of you have been drooling over the idea of an 18-channel NVMe PCIe SSD, even more so given that the P3500 variant was to launch at a $1.50/GB target price. It appears we are getting closer to that release, as the P3500 has been appearing on some web sites in pre-order or out of stock status.

P3500.jpg

ShopBLT lists the 400GB part at $629 ($1.57/GB), while Antares Pro has an out of stock listing at $611 ($1.53/GB).  The other two capacities are available at a similar cost/GB. We were hoping to see an 800GB variant, but it appears Intel has stuck to their initial plan. Here are the part numbers we’ve gathered, for your Googling pleasure:

Half-height PCIe:

  • 400GB: SSDPEDMX400G401
  • 1.2TB: SSDPEDMX012T401
  • 2TB: SSDPEDMX020T401

2.5” SFF-8639 (*not SATA*):

  • 400GB: SSDPE2MX400G401
  • 1.2TB: SSDPE2MX012T401
  • 2TB: SSDPE2MX020T401

We did spot a date of December 12th in an Amazon listing, but I wouldn't count that as a solid date, as many of the listings there had errors (like 10 packs for the price of one).
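The cost-per-gigabyte figures quoted above are simple division of listing price by marketed capacity, which is how Intel's $1.50/GB target is stated. A quick check of the two listings mentioned:

```python
# Retailer listings for the 400GB P3500, as quoted in the text.
listings = {
    "ShopBLT": (629.00, 400),
    "Antares Pro": (611.00, 400),
}

for retailer, (price_usd, capacity_gb) in listings.items():
    print(f"{retailer}: ${price_usd / capacity_gb:.2f}/GB")
```

Both work out a few cents above the $1.50/GB target ($1.57 and $1.53 respectively), which is typical slippage between a manufacturer's target and early retail pre-order pricing.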

Manufacturer: Fractal Design

Introduction: The Core Series Shrinks Down

core_1100_main.jpg

Image credit: Fractal Design

The Core 1100 from Fractal Design is a small micro-ATX case, essentially a miniature version of the previously reviewed Core 3300. With its small dimensions, the Core 1100 targets micro-ATX and mini-ITX builders, and it provides another option not only in Fractal Design's budget lineup, but in the crowded budget enclosure market.

core_1100_angle.jpg

The price level for the Core 1100 has fluctuated a bit on Amazon since I began this review, with prices ranging from a high of $50 down to a low of just $39. It is currently $39.99 at Newegg, so the price should soon stabilize at Amazon and other retailers. At the ~$40 level this could easily be a compelling option for a smaller build, though admittedly the design of these Core series cases is purely functional. Ultimately any enclosure recommendation will depend on ease of use and thermal performance/noise, which is exactly what we will look at in this review.

Continue reading our review of the Fractal Design Core 1100 case!!

Noctua Announces Three New 92mm CPU Coolers with PWM Fans

November 12, 2014 - 08:29 AM |
Tagged:

Noctua has announced three new 92mm CPU coolers today, with two different replacements for the existing NH-U9B SE2 and a new cooler for Intel Xeon LGA2011 processors for workstations and servers. Each model will now use a PWM fan, the recently announced NF-A9.

noctua_new_9cm_coolers.jpg

Image credit: Noctua

In Noctua’s official press release, CEO Roland Mossig is quoted as saying, "The NH-U9B SE2 is still one of our most popular models. The NH-U9S and NH-D9L stay true to this proven formula but now offer even better performance, better compatibility and PWM support for automatic fan speed control."

The first of the two NH-U9B replacements is the NH-U9S, which features an asymmetrical design with five heatpipes. The other model will be the NH-D9L, a four-heatpipe design that is “15mm lower than classic 9cm coolers such as the NH-U9 series (110mm vs. 125mm)”.  Noctua states that this will “guarantee full 3U compliance” and also “makes the NH-D9L ideal for compact HTPC and Small Form Factor cases”. Noctua adds that the 95x95mm footprint of these new coolers will clear “RAM and PCIe slots on all Intel and most AMD based mainboards, including µATX and ITX.”

The last addition to the 92mm lineup announced today is the server-specific NH-D9DX i4 3U, the replacement for the 4U model NH-U9DX i4. Noctua states that this new cooler uses “the same heatsink as the NH-D9L but comes with LGA2011 mounting for both Square ILM and Narrow ILM Xeon platforms as well as support for LGA13x6.”

noctua_nf_a9_pwm_2.jpg

The fan powering these new coolers is the NF-A9 PWM. Each cooler uses Noctua’s SecuFirm2 mounting system and comes with a six-year warranty. Noctua states that all three models are currently shipping and will be available shortly. MSRPs are as follows: NH-U9S, $59.90 USD; NH-D9L, $56.90 USD; NH-D9DX i4 3U, $59.90 USD.

Source: Noctua

Far Cry 4 Will Require Modest PC Hardware and Newer Operating Systems

Subject: General Tech | November 12, 2014 - 04:07 AM |
Tagged: system requirements, pc gaming, kyrat, fps, far cry 4

In case you missed it earlier this week, Ubisoft revealed the PC system requirements needed to run Far Cry 4. Developed by Ubisoft Montreal and set to release on November 18th, Far Cry 4 is the latest action-adventure FPS in the Far Cry series. The game uses Ubisoft's Dunia Engine II, a heavily modified engine developed by Kirmaan Aboobaker and originally based on Crytek's CryEngine. The player is a Nepalese native who returns to Kyrat, a fictional region in the Himalayas, following the death of their mother, only to become embroiled in a civil war taking place in an open world filled with enemies, weapons, animals, and did I mention weapons?

Far Cry 4 Kyrat Bow and Arrow.jpg

This bow is a far cry from the only weapon you'll have access to...

According to the developer, Far Cry 4 continues the tradition of an open world environment, but the game world has been tweaked from the Far Cry 3 experience to be a tighter and more story focused experience where the single player story will take precedence over exploration and romps across the mountainous landscape.

While I cannot comment on how the game plays, it certainly looks quite nice and will need a beefy modern PC to run at its maximum settings. Interestingly, the game seems to scale down decently as well, with the entry-level computer needed to run Far Cry 4 being rather modest.

No matter the hardware level, only 64-bit operating systems need apply: Far Cry 4 requires the 64-bit version of Windows 7 or later to run. At a minimum, Ubisoft recommends a quad-core processor (Intel Core i5 750 or AMD Phenom II X4 955), 4GB of memory, a Radeon HD 5850 or GTX 460, and 30GB of storage.

To run at optimal settings, users will need twice the system memory (at least 8GB) and video memory (at least 2GB), a newer quad-core CPU such as the Intel Core i5-2400S or AMD FX-8350, and a modern NVIDIA GTX 680 or AMD Radeon R9 290X graphics card.

Far Cy 4 Mortar.jpg

Anything beyond that is gravy that will allow gamers to crank up the AA and AF as well as the resolution.

Far Cry 4 will be available in North America on November 18, 2014 for the PC, PS4, Xbox One, PS3, and Xbox 360. Following the North America release, the game is scheduled to launch in Europe and Australia on November 20th, and in Japan on January 22 of next year.

Source: Maximum PC

Final Fantasy XIII-2 Dated for December 11th

Subject: General Tech | November 12, 2014 - 03:23 AM |
Tagged: pc gaming, final fantasy xiii-2, final fantasy xiii, final fantasy

It seems like Square Enix has paid attention to the criticism of Final Fantasy XIII. While it would have been nice for them to go back and fix the problems in the original game (Update Nov 12 @ 5:35pm EST: They are, in early December - Thanks TimeKeeper in the comments), it looks like the sequel, XIII-2, will behave more like a PC title. First and foremost, it will not be locked to 720p, and it is said to offer other graphics options. The sequel is scheduled to launch on December 11th for $20 USD, or $18 USD on pre-order (a few dollars above the launch price of Final Fantasy XIII).

square-ffxii2-logo.jpg

Of course, it is somewhat disappointing that screen resolution, a 60 FPS cap, and graphics options are considered features, but the platform is unfamiliar to certain parts of the company. Acknowledging their error and building a better, if probably still below expectations, product is a step in the right direction. Hopefully they will continue to progress and eventually make PC games with the best of them. Either that, or they could have a talk with their Eidos arm about borrowing Nixxes, a studio that specializes in enhancing games for the PC.

Final Fantasy XIII-2 is coming to Steam in a month for $20 USD. The third installment, Lightning Returns, will arrive sometime in 2015.

Source: Steam

Newly Refreshed Cirrus7 Nimbus Looks Cool (Pun Intended)

Subject: General Tech, Systems | November 11, 2014 - 11:11 PM |
Tagged: haswell-t, haswell, fanless

This one is more for our European readers, since the company operates out of Germany, but the Cirrus7 Nimbus is an interestingly designed fanless system. Its finned shell is said to be assembled from laser-cut layers of aluminum that sandwich in the I/O plate at the rear. FanlessTech notes that the systems are now available with Haswell processors, up to a Core i7 based on Haswell-T. The storage options now also include the Samsung 850 Pro, up to 1TB.

cirrus-nimbus_2.jpg

Image Credit: Cirrus7 via FanlessTech

The customization options are actually pretty decent. I find a lack of meaningful upgrade options to be a problem with modern PC builders, but that does not apply here. Eight CPUs are offered, ranging from a Celeron up to a 45W Haswell-T; RAM comes in 4GB, 8GB, or 16GB; up to three drives can be installed (one mSATA and up to two SATA); Intel Wireless-N or -AC is available; external DVD or Blu-ray burners are an option; and one of seven OSes can be installed, including two versions of Linux (Ubuntu 14.04 or 14.10). If you get all of the bells and whistles, you are looking at roughly 3,000 USD, but you cannot expect two terabytes of Samsung 850 Pro SSDs to be cheap. It seems reasonable enough, especially for the EU. The big limiter is the lack of a discrete GPU, though for workloads like audio recording an Intel HD 4600 can easily handle the job.

The Cirrus7 Nimbus is available now at their website.

Source: Cirrus7

Intel refreshes SSD DC S3500 Series to include larger capacities, M.2 form factor

Subject: Storage | November 11, 2014 - 05:32 PM |
Tagged: Intel, ssd, dc s3500, M.2

Today Intel refreshed their Datacenter Series of SSDs, specifically the DC S3500, a model we have reviewed in the past. It uses the same controller found in the S3700 as well as the SSD 730 Series (though it is overclocked in that series).

130611-221736-7.31.jpg

The full line of Intel Datacenter SSDs (minus the P3700). DC S3500 is just right of center.

Today's refresh adds higher capacities to the S3500 line, which now includes 1.2TB and 1.6TB models at the high end. This suggests that Intel is stacking 20nm dies as many as eight to a package. IOPS performance sees a slight penalty at these new higher capacities, while maximum sequential speeds are a bit higher due to the increased die count.

Intel SSD DC S3500 Series - M.2.png

Also announced was an M.2 version of the S3500. This form factor is limited to a few capacity points (80GB, 120GB, 340GB) and is primarily meant for applications where data integrity is critical (e.g. ATMs, server boot partitions, etc.).

A standard press blast was unavailable, but full specs are listed after the break.

Source: Intel