Subject: Graphics Cards | October 6, 2015 - 02:40 PM | Jeremy Hellstrom
Tagged: 4k, gtx titan x, fury x, GTX 980 Ti, crossfire, sli
[H]ard|OCP shows off just what you can achieve when you spend over $1000 on graphics cards and have a 4K monitor in their latest review. In Project Cars you can expect never to see less than 40fps with everything cranked to maximum, and if you invested in Titan Xs you can even enable DS2X anti-aliasing to render at double the resolution before downsampling. The Witcher 3 is a bit more challenging, and no card is up for HairWorks without a noticeable hit to performance. Far Cry 4 still refuses to believe in CrossFire, and as far as NVIDIA performance goes, if you want to see soft shadows you are going to have to invest in a pair of Titan Xs. Check out the full review to see what the best of the current market is capable of.
"The ultimate 4K battle is about to begin, AMD Radeon R9 Fury X CrossFire, NVIDIA GeForce GTX 980 Ti SLI, and NVIDIA GeForce GTX TITAN X SLI will compete for the best gameplay experience at 4K resolution. Find out what $1300 to $2000 worth of GPU backbone will buy you. And find out if Fiji really can 4K."
Here are some more Graphics Card articles from around the web:
- Sapphire R7 370 Nitro Review @ OCC
- PNY GTX 950 2GB @ Kitguru
- Gigabyte GTX 950 Xtreme Gaming 2GB @ Kitguru
- Nvidia's GeForce GTX 950 @ The Tech Report
Subject: Mobile | October 6, 2015 - 02:38 PM | Ryan Shrout
Tagged: video, surface book, surface, Skylake, nvidia, microsoft, Intel, geforce
Along with the announcement of the new Surface Pro 4, Microsoft surprised many with the release of the new Surface Book 2-in-1 convertible laptop. Sharing much of the same DNA as the Surface tablet line, the Surface Book adopts a more traditional notebook design while still adding enough to the formula to produce a unique product.
The pivotal part of the design (no pun intended) is the new hinge, a "dynamic fulcrum" design that looks great and also (supposedly) will be incredibly strong. The screen / tablet attachment mechanism is called Muscle Wire and promises secure attachment as well as ease of release with a single button.
An interesting aspect of the fulcrum design is that, when closed, the Surface Book screen and keyboard do not actually touch near the hinge. Instead you have a small gap in this area. I'm curious how this will play out in real-world usage - it creates a natural angle for using the screen in its tablet form but also may find itself "catching" coins, pens and other things between the two sections.
The 13.5-in screen has a 3000 x 2000 resolution (3:2 aspect ratio obviously) with a 267 PPI pixel density. Just like the Surface Pro 4, it has a 10-point touch capability and uses the exclusive PixelSense display technology for improved image quality.
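The quoted 267 PPI figure checks out against the listed resolution and diagonal. A minimal sketch of the arithmetic (plain Python; only the specs quoted above are used):

```python
import math

def pixel_density(width_px, height_px, diagonal_in):
    """Pixels per inch along the diagonal of a display."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_in

# Surface Book: 3000 x 2000 on a 13.5-inch panel
ppi = pixel_density(3000, 2000, 13.5)
print(round(ppi))  # 267
```

The same formula gives 267 PPI for the Surface Pro 4's 2736 x 1824, 12.3-inch panel as well, so the two screens share an effective pixel density.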
While most of the hardware is included in the tablet portion of the device, the keyboard dock includes some surprises of its own. You get a set of two USB 3.0 ports, a full-size SD card slot and a proprietary SurfaceConnect port for an add-on dock. But most interestingly you'll find an optional discrete GPU from NVIDIA, an as-yet-unidentified GeForce GPU with 1GB (??) of memory. I have sent inquiries to Microsoft and NVIDIA for details on the GPU, but haven't heard back yet. We think it is a 30 watt GeForce GPU of some kind (judging by the power adapter differences), but I'm more interested in how the GPU changes both battery life and performance.
Keyboard and touchpad performance looks to be impressive as well, with a full glass trackpad integration, backlit keyboard design and "class leading" key switch throw distance.
The Surface Book is powered by Intel Skylake processors, available in both Core i5 and Core i7 options, but does not offer Core m-based or Iris graphics options. Instead, integrated graphics duties fall solely to the Intel HD 520.
Microsoft promises "up to" 12 hours of battery life on the Surface Book, though that claim was made with the Core i5 / 256GB / 8GB configuration option; no discrete GPU included.
Pricing on the Surface Book starts at $1499 but can reach as high as $2699 with the maximum performance and storage capacity options.
Subject: Mobile | October 6, 2015 - 01:40 PM | Ryan Shrout
Tagged: video, surface pro 4, surface, Skylake, microsoft, iris, Intel, edram
Microsoft has finally revealed the next product in the Surface Pro tablet lineup, skipping the Broadwell processor generation and jumping straight to the latest Intel Skylake processors. The design is very similar to previous Surface Pro tablets but the upgrades and changes made for the Surface Pro 4 are impressive.
The kickstand design that has made the Surface recognizable remains the same but there is a solid collection of new features including a fingerprint reader and Windows Hello support for security and login. The new Pro 4 model is only 8.4mm thick (coming in just about 1mm thinner than the Pro 3) and is also lighter at 1.73 lbs.
The screen size is 12.3-inches with a 2736 x 1824 3:2 resolution for a pixel density of 267 PPI. It has a 10-point touch interface with drastically improved latency, palm detection and pressure sensitivity for the included Surface Pen. Even better, that improved Surface Pen will have a full year of battery life along with magnetic attachment to the tablet rather than relying on an elastic loop!
The Surface keyboard sees improvements as well, including better spacing on the keys and quieter, more reliable typing, and it also becomes the thinnest type cover MS has yet built for the Surface line. A 5-point touch glass trackpad is now part of the deal, 40% larger than the one found on the Pro 3 - a welcome modification for anyone who has used the type cover in the past.
In terms of computing horsepower, the Surface Pro 4 will be available with a Core m3, Core i5 or even a Core i7 processor. It will ship with 4GB, 8GB or 16GB of system memory and internal storage capacities as high as 1TB. Microsoft hasn't posted any more details about the clock speeds of these CPUs but if you look at the awesome hype video MS made for the Pro 4 launch, you'll notice an interesting thing in the exploded view: an Intel processor with three dies on a single package.
What you are seeing is the Skylake CPU, chipset and an eDRAM package. This tells us that at least one of the available options for the Surface Pro 4 will ship with Iris graphics and 64MB or 128MB of L4 cache / eDRAM - a first for this form factor! This should help improve performance for graphics as well as other specific CPU compute workloads.
Other highlights for the Surface Pro 4 include front facing stereo speakers, 8MP rear-facing camera and a fancy-ass Windows 10 logo.
Pricing will start at $899 but can spike to as high as $2699 if you max out the processor and storage options.
We are working on getting a unit in for testing as the devices are going up for presale today and should arrive by October 26th.
Subject: General Tech | October 6, 2015 - 01:11 PM | Jeremy Hellstrom
Tagged: nuclear, security
Stuxnet hit the news five years ago when it was discovered infecting the industrial Supervisory Control And Data Acquisition systems of factories all across the world, up to and including nuclear plants. The breadth of the attack was a bit more than what Israeli intelligence and the NSA originally intended, but they did succeed in severely damaging their actual target, an Iranian uranium enrichment plant. Unfortunately it seems the development of Stuxnet might have been somewhat of a waste of resources, as they could probably have achieved the same results with a simple man-in-the-middle attack.
Chatham House recently released a report on the state of security in nuclear power plants and facilities across the globe and the results are horrifying to say the least. From the overview that The Register provides, the level of security present in many of these facilities is commensurate with your average high school. The idea that these plants are air-gapped is a fallacy, and the code for the control systems can be easily altered remotely without the need to design a complex virus to infect them. Thankfully it is very difficult to cause a nuclear plant to go critical, but these vulnerabilities can still cause damage to machinery and interfere with a plant's ability to provide power to customers. You may not want to read the whole story if you want to sleep well tonight.
"The report adds that search engines can "readily identify critical infrastructure components with" VPNs, some of which are power plants. It also adds that facility operators are "sometimes unaware of" them."
Here is some more Tech News from around the web:
- AMD partners with Oculus and Dell to power Oculus ready PCs @ DigiTimes
- iOS malware YiSpecter: iPhones menaced by software nasty @ The Register
- Atom-thin transistor defies fundamental limits @ Nanotechweb
- Microsoft's big Tuesday reveal: New mobiles and slabs? Win 10 shock? @ The Register
- Chocolate Factory plops Marshmallow on Android slabs @ The Register
- Surface Book: MacBook Pro rival packs a Skylake chip and Nvidia GPU @ The Inquirer
- ASUS RT-AC87U & RT-AC3200 Routers Review @ Hardware Canucks
- NikKTech & Mionix Enjoy Gaming Worldwide Giveaway
- Win 1 of 3 be quiet! Silent Base 600 PC cases @ KitGuru
Subject: Mobile | October 6, 2015 - 07:01 AM | Scott Michaud
Tagged: google, android 6.0, Android
Android 6.0 was launched yesterday, and Ars Technica has, so far, been the only outlet to give it a formal review. That said, it is a twelve-page review with a table of contents -- so that totally counts for five or so.
The main complaint that the reviewer has is the operating system's inability to be directly updated. There is a large chain of rubber stamps between Google's engineers and the world at large. Carriers and phone manufacturers can delay (or not even attempt to certify) patches for their many handsets. It is not like Windows, where Microsoft controls the centralized update service. In the beginning, this wasn't too big of an issue as updates were typically for features. Sucker, buy a new phone if you want WebGL.
Now it's about security. Granted, it has always been about security, even on the iPhone; we just care more now. If you think about it, every time a phone gets jailbroken, a method exists to steal admin privileges away from Apple and give them to... the user. Some were fairly sophisticated processes involving USB tethering to PCs, while others involved browsing to a malicious website with a payload that the user (but not Apple) wanted to install. Hence why no-one cared: the security was being exploited by the user for the user. It was only a matter of time before either the companies sufficiently crushed the bugs, or the exploits started to look tasty to the wolves.
And Google is getting bit.
Otherwise, Ars Technica mostly praised the OS. Be sure to read their review to get a full sense of their opinion. As far as I can tell, they only tested it on the Nexus 5.
Subject: Systems | October 5, 2015 - 07:39 PM | Scott Michaud
Tagged: valve, steam os, steam machines, steam, pc gaming
According to SteamDB, Valve has struck deals with GameStop, GAME UK, and EB Canada to create “store within a store” areas in North American and UK locations. The article does not clarify how many stores will receive this treatment. It does note that Steam Controller, Steam Link, and even Steam Machines will be sold from these outlets, which will give physical presence to Valve's console platform alongside the existing ones.
The thing about Valve is that, when they go silent, you can't tell whether they reconsidered their position, or they are just waiting for the right time to announce. They have been fairly vocal about Steam accessories, but the machines themselves have been met with pretty much radio silence for the better part of a year. There was basically nothing at CES 2015 after a big push in the prior year. The talk shifted to Steam Link, which was obviously part of their original intention but, due to the simultaneous lack of Steam Machine promotion, feels more like a replacement than an addition.
But, as said, that's tricky logic to use with Valve.
As a final note, I am curious about what the transaction entailed. From what I hear, purchasing retail space is pricey and difficult, but some retailers donate space for certain products and initiatives that they find intrinsic value in. Valve probably has a lot of money, but they don't have Microsoft levels of cash. Whether Valve paid for the space, or the retailers donated it, is a question that leads to two very different, but both very interesting in their own way, follow-ups. Hopefully we'll learn more, but we probably won't.
Subject: General Tech | October 5, 2015 - 06:24 PM | Scott Michaud
Tagged: pc gaming, ea, battlefront
So I'm reading PC Gamer and I see an article that says, “Star Wars Battlefront Will Not Use Microtransactions”. Given the previous few Battlefield games, this surprised me. Granted, these titles weren't particularly egregious in their use of payments. Everything (apart from expansion packs of course) could be achieved through a reasonable amount of play. That said, it takes a lot of restraint for a developer to not just ratchet the requirements further and further to widen their net, so I can see the problem.
Regardless, by the third paragraph I notice that the representative never actually said that they won't (according to the snippets that PC Gamer quoted). The phrase is simply, “not part of the core design of how it works”. Granted, I would expect that EA would poke PC Gamer to correct them if they did intend to release a game in about six weeks, so I feel like their interpretation is correct.
That doesn't change that, according to the quotes, the only thing they promised is for the currency system to be fully accessible without payments. I'm not fully convinced that it will only be accessible without payments, though.
Subject: Systems | October 5, 2015 - 05:16 PM | Scott Michaud
Tagged: microsoft, windows 10, surface, Surface Pro, surface pro 4, hp, Lenovo, dell, asus, acer, toshiba
Tomorrow at 10 am ET, Microsoft will host a live stream to announce “new Windows 10 devices from Microsoft”. It's pretty obvious that we'll get at least one new Surface device announced, which rumors suggest will be the Surface Pro 4 with a low-bezel, 13-inch display. W4pHub, via VR-Zone, goes a bit further to claim that the display can shrink to 12 inches when in tablet mode, giving a frame for the user to hold. If true, I wonder how applications will handle the shift in resolution. Perhaps the only problem is a little flicker, which will be hidden by the rest of Continuum's transition?
Image Credit: VR-Zone
The Microsoft Blog post also lists the announcement dates of their partners. Here's the rundown:
- October 7th -- HP
- October 8th -- Dell
- October 9th -- ASUS
- October 12th -- Acer
- October 13th -- Toshiba
- October 19th -- Lenovo
While the rush of Windows 10 devices has missed the Back to School season, despite Microsoft's attempts to rush development with a July release, it looks like we might get a good amount of them for the holiday season. I was a bit worried, seeing how slowly Threshold 2 seems to be advancing, but they seem to have convinced OEMs to make a big deal out of it.
Then again, it could be holiday fever.
Subject: Processors | October 5, 2015 - 04:48 PM | Jeremy Hellstrom
Tagged: amd, PRO A12-8800B, Excavator, carrizo pro, Godavari Pro
AMD recently announced a Pro lineup of Excavator-based chips which match their current Carrizo and Godavari lineups as far as the specifications go. This was somewhat confusing, as at first glance there were no real features in the press material from AMD or HP that separated the Pro chips from their non-Pro cousins. Tech ARP posted the slides from the reveal, and they note one key feature that separates the two chip families and why businesses should be interested in them. These are hand-picked dies taken from hand-picked wafers, chosen because they represent the best of the chips AMD has fabbed. You should expect performance free from any defects that might have made it past quality control, and if you are unlucky enough to end up with a less-than-perfect chip, they come with a 36-month extended OEM warranty.
In addition to being hand-picked, machines with an AMD Pro chip will also come with an ARM TrustZone Technology based AMD Secure Processor onboard. If you use a mobile device which has TPM and a crypto-processor onboard you will be familiar with the technology; AMD is the first to bring this open-sourced security platform to Windows-based machines. Small business owners may also be interested in the AMD PRO Control Center, an inventory management client which will not cost as much as ones designed for enterprise and in theory should be easier to use as well.
This news is of lesser interest to the gamer, but you never know; if you can secure one of these hand-picked chips you may find it gives you a bit more headroom for tweaking than your average run-of-the-mill Godavari or Carrizo would.
"We will not only show you the presentation slides, we also recorded the entire conference call and created a special video presentation based on the conference call for you. We hope you enjoy our work."
Here are some more Processor articles from around the web:
- The Skylake Core i3-6320 is the gamer's new best friend @ The Tech Report
- Core i7-5775C CPU Review @ Hardware Secrets
- Pentium N3700 CPU Review @ Hardware Secrets
New Components, New Approach
After 20 or so enclosure reviews over the past year and a half and some pretty inconsistent test hardware along the way, I decided to adopt a standardized test bench for all reviews going forward. Makes sense, right? Turns out choosing the best components for a cases and cooling test system was a lot more difficult than I expected going in, as special consideration had to be made for everything from form-factor to noise and heat levels.
Along with the new components I will also be changing the approach to future reviews by expanding the scope of CPU cooler testing. After some debate as to the type of CPU cooler to employ I decided that a better test of an enclosure would be to use both closed-loop liquid and air cooling for every review, and provide thermal and noise results for each. For CPU cooler reviews themselves I'll be adding a "real-world" load result to the charts to offer a more realistic scenario, running a standard desktop application (in this case a video encoder) in addition to the torture-test result using Prime95.
But what about this new build? It isn't completely done but here's a quick look at the components I ended up with so far along with the rationale for each selection.
CPU – Intel Core i5-6600K ($249, Amazon.com)
The introduction of Intel's 6th generation Skylake processors provided the perfect excuse (er, opportunity) for an upgrade after using an AMD FX-6300 system for the last couple of enclosure reviews. After toying with the idea of the new i7-6700K, and immediately realizing this was likely overkill and (more importantly) completely unavailable for purchase at the time, I went with the more "reasonable" option in the i5. There has long been a debate as to the need for hyper-threading in gaming (though this may be changing with the introduction of DX12), but in any case this is still a very powerful processor, and when stressed it should produce a challenging enough thermal load to adequately test both CPU coolers and enclosures going forward.
GPU – XFX Double Dissipation Radeon R9 290X ($347, Amazon.com)
This was by far the most difficult selection. I don’t think of my own use when choosing a card for a test system like this, as it must meet a set of criteria to be a good fit for enclosure benchmarks. If I choose a card that runs very cool and with minimal noise, GPU benchmarks will be far less significant as the card won’t adequately challenge the design and thermal characteristics of the enclosure. There are certainly options that run at greater temperatures and higher noise (a reference R9 290X for example), but I didn’t want a blower-style cooler with the GPU. Why? More and more GPUs are released with some sort of large multi-fan design rather than a blower, and for enclosure testing I want to know how the case handles the extra warm air.
Noise was an important consideration, as levels from an enclosure of course vary based on the installed components. With noise measurements a GPU cooler that has very low output at idle (or zero, as some recent cooler designs permit) will allow system idle levels to fall more on case fans and airflow than a GPU that might drown them out. (This would also allow a better benchmark of CPU cooler noise - particularly with self-contained liquid coolers and audible pump noise.) And while I wanted very quiet performance at idle, at load there must be sufficient noise to measure the performance of the enclosure in this regard, though of course nothing will truly tax a design quite like a loud blower. I hope I've found a good balance here.
Subject: Cases and Cooling | October 5, 2015 - 02:29 PM | Jeremy Hellstrom
Tagged: air cooling, enermax, ETS-T40F-W
Have you recently bought a white motherboard and a matching case to go with it? Are you now bemoaning the fact that your cooler just isn't matching the Storm Trooper vibe that you have going on in your case? Enermax has a solution, the ETS-T40F-W cooler, 610 grams of glowing white cooling standing 139x93x160mm (5.5x3.7x6.3"). The Thermal Conductive Coating does seem to work effectively, though the cooler is not among the best that [H]ard|OCP has reviewed. They also recommend running the fan at low speed, as high speed does not increase the cooling as noticeably as it increases the fan noise. Then again, at $50 and as the only coloured cooler on the market, it occupies an interesting niche.
"Enermax comes to us with its new compact size model, the ETS-T40F-W CPU air cooler also referred to as the "Fit" series cooler. This model is decked out in its best Storm Trooper white garb which is actually what Enermax calls its "Thermal Conductive Coating." Do the Fit's dual 12cm fans have what it takes to make a good CPU air cooler?"
Here are some more Cases & Cooling reviews from around the web:
- Building your first Custom Designed Watercooled PC: KitGuru TV
- EKWB EK-XLC Predator 240 Pre-filled CPU Xpandable Liquid Cooler Review @ NikKTech
- Phanteks Enthoo Evolv ITX @ Modders-Inc
- Silverstone SG12 Micro-ATX @ eTeknix
- Antec GX505 Window SC Mid-Tower @ Benchmark Reviews
- Streacom ST-F12CS Aluminium ATX HTPC Chassis @ eTeknix
- StarTech 25U Open-Frame Server Rack Cabinet @ Phoronix
Subject: General Tech | October 5, 2015 - 01:14 PM | Jeremy Hellstrom
Tagged: LinuxCon Europe, linux, open source
LinuxCon Europe has just kicked off and there are some interesting projects being discussed at the event. ARM, Cisco, NexB, Qualcomm, SanDisk and Wind River have formed the OpenChain workgroup to bring some standardization to Linux software development, such as exists in Debian, to ensure that multiple companies are not attempting to design their own wheels simultaneously. The Real-Time Linux Collaborative Project is developing software for application in robotics, telecom, and aviation and includes members such as Google, Texas Instruments, Intel, ARM and Altera. They will be working towards developing Linux applications for those industries where shaving a few milliseconds off of transaction times can be worth millions of dollars. The last major project announced at the convention will be FOSSology 3.0, which will enable you to quickly and easily run licence and copyright scans, something near and dear to the heart of the Free and Open Source Software community. Check out more at The Inquirer.
"Jim Zemlin, chief executive of the Foundation, said in his opening remarks that this year's opening day falls on the 24th anniversary of Linux itself and the 30th of the Free Software Foundation, giving credit to delegates for their part in the success of both."
Here is some more Tech News from around the web:
- Apple's A9 impresses and the Nexus strikes back: The TR Podcast 188
- Shutdowngate: iPhone 6S handsets are randomly turning off @ The Inquirer
- Google spews out Alphabet. Alphabet gobbles Google @ The Register
- Mega Giveaway #7 : LEAGOO Elite 4 Smartphone @ Tech ARP
Subject: Cases and Cooling | October 5, 2015 - 09:01 AM | Scott Michaud
Tagged: wall mount, thermaltake
Personally, I would like to see at least an option for plexiglass on the perimeter. I feel like some might want a bit of protection from things like sneezes, or rogue squirt-gun blasts. The “case” is basically a plate with a clear acrylic pane in front of it. It can stand upright, be rotated horizontally, or even screwed into a wall if you want to show off a custom liquid cooling loop or something.
Interestingly, Thermaltake is providing “3D Printing Accessory Files”. I somehow doubt that this will be the CAD files required to lasercut your own Core P5 case, but it's designed to allow makers to create their own accessories for it. As such, this sounds more like guides and schematics, but I cannot say for sure because I haven't tried it... and they're not available yet.
The Thermaltake Core P5 will be available soon for an MSRP of $169.99, although it's already at a sale price of $149.99. This could be just a pre-order discount, or a sign of its typical price point. We don't know.
Subject: General Tech | October 5, 2015 - 08:31 AM | Scott Michaud
Tagged: pc gaming, humble bundle
Humble Bundle is an organization that sells games for charity. It started with a service that let users pay pretty much whatever they want for DRM-free titles, and let them choose how much went to the developers, the organization, and the selected charities of the moment. They have branched out since then, sometimes with praise, sometimes with concerned murmurs.
Humble Bundle mumble, if you will.
Now they have created a subscription service. Basically, on the first Friday of every month, subscribers will receive the game that is promoted. In other words, it is a service that acts similar to what we're used to, except that you don't know what you're getting ahead of time, you cannot select how much you pay for it, and you cannot choose the distribution of proceeds. Unless it leads to a unique palette of games that are decidedly better than the typical bundles, I cannot see how this is anything more than a restrictive subset for the sake of it.
Still, that doesn't mean said subset isn't worth your money (be careful of the double-negative). If it is, then you can subscribe now and pick up Legend of Grimrock 2. The title is apparently available on Steam for $24, so this would be a half-price deal if it was something that you were interested in buying.
I guess that's a decent first impression.
Subject: General Tech, Mobile | October 5, 2015 - 08:01 AM | Scott Michaud
Tagged: windows 10, microsoft, iot
Microsoft has released the Windows 10 IoT Core for the Raspberry Pi 2. It retails for $75 without the Raspberry Pi 2 Model B, or $115 with it. Apart from the optional Pi, it is basically a pack of electronic components and an SD card that's pre-loaded with Windows 10 IoT. It is available at the Adafruit store, although both packs are currently out of stock... because of course they are.
Beyond jumper wires, a case, breadboards, resistors, LEDs, switches, and sensors, the pack also comes with a WiFi module. Interestingly, Adafruit claims that this will be the only WiFi adapter for the Raspberry Pi 2 that's supported by Windows 10 IoT. This is weird, of course, because Windows is kind-of the go-to when it comes to driver support. It makes me wonder whether Microsoft changed anything under the hood that affects hardware compatibility and, if it did, whether Windows 10 IoT loses its major advantage over Linux and other OSes in this form factor.
Subject: General Tech | October 5, 2015 - 07:32 AM | Scott Michaud
Tagged: starcraft 2, starcraft, pc gaming, esports
I'm not really seeing anyone pick up this news in English outside of StarCraft II forums, so I'm not sure whether this news will be fresh, or completely irrelevant to anyone's interests. Either way, GOM eXP was one of the leading broadcasters of StarCraft tournaments in South Korea. They operated GSL, which was one of the three Blizzard-endorsed leagues for StarCraft II.
Image Credit: Wolf Shröder via Twitter
They have just shut down, but their GSL tournament will not.
afreecaTV, a video streaming service, has bought out the tournament. For viewers, this means that high quality, 1080p streams will be available for free. Previously, GOM was a bit strict about forcing Twitch subscriptions for anything other than Low quality. The quality was bad enough that you often couldn't even read the on-screen text, such as how many units or resources each player has.
Beyond hosting the 2016 GSL tournament, they will also have a couple of StarCraft II show matches and even a StarCraft: Brood War league. I wonder how the original StarCraft holds up for viewers after we have gotten used to the sequel's updated graphics. Hmm.
Subject: Graphics Cards | October 5, 2015 - 07:13 AM | Scott Michaud
Tagged: graphics drivers, amd
Apparently users of AMD's Catalyst 15.9 drivers have been experiencing issues. Specifically, “major memory leaks” could be caused by adjusting windows, such as resizing them or snapping them to edges of the desktop. According to PC Gamer, AMD immediately told users to roll back when they found out about the bug.
They have since fixed it with Catalyst 15.9.1 Beta. This subversion driver also fixes crashes and potential “signal loss” problems with a BenQ FreeSync monitor. As such, if you were interested in playing around with the Catalyst 15.9 beta driver, then it should be safe to do so now. I wish I could offer more input, but I just found out about it and it seems pretty cut-and-dry: if you had problems, they should be fixed. The update is available here.
Subject: General Tech | October 5, 2015 - 07:01 AM | Scott Michaud
Tagged: physics, microsoft, Intel, Havok
Microsoft has just purchased Havok from Intel for an undisclosed price. This group develops one of the leading physics engines for video games and other software. It was used in every Halo title since Halo 2, including Halo Wars, and a fork of it drives the physics for Valve's Source Engine. It has been around since 2000, but didn't really take off until Max Payne 2 in 2003.
And the natural follow-up question for just about everything is “why?”
Hopefully this isn't bad taste...
Photo Credit: Havok via Game Developer Magazine (June 2013)
There are good reasons, though. First, Microsoft has been in the video game middleware and API business for decades. DirectX is the obvious example, but they have also created software like Games for Windows Live and Microsoft Gaming Zone. Better software drives sales for platforms, and developers can always use help accomplishing that.
Another reason could be Azure. Microsoft wants to bring cloud services to online titles; offloading some latency-insensitive tasks allows developers to lower system requirements or do more with what they have (which is especially true when consoles flatten huge install bases into a handful of specifications). If they plan to go forward with services that run on Azure or Xbox Live, then it would make sense to have middleware that's as drop-in as possible. Creating a physics engine from scratch is a bit of a hassle, but so is encouraging existing engines to use it.
It would be better to just buy something that everyone is using. Currently, the main options are Havok, Bullet (an open-source solution that is rarely used outside of other open-source projects), and PhysX (which is owned by NVIDIA, and probably won't leave their grip until their fingers are frigid and lifeless).
That's about all we know, though. The deal doesn't have a close date, value, or official purpose. Intel hasn't commented on the deal, only Microsoft has.
Subject: Graphics Cards | October 5, 2015 - 02:33 AM | Sebastian Peak
Tagged: rumor, report, radeon, graphics cards, Gemini, fury x, fiji xt, dual-GPU, amd
The AMD R9 Fury X, Fury, and Nano have all been released, but a dual-GPU Fiji XT card could be on the way soon according to a new report.
Back in June at AMD's E3 event we were shown Project Quantum, AMD's concept for a powerful dual-GPU system in a very small form-factor. It was speculated that the system was actually housing an unreleased dual-GPU graphics card, which would have made sense given the very small size of the system (and mini-ITX motherboard therein). Now a report from WCCFtech is pointing to a manifest that just might be a shipment of this new dual-GPU card, and the code-name is Gemini.
"Gemini is the code-name AMD has previously used in the past for dual GPU variants and surprisingly, the manifest also contains another phrase: ‘Tobermory’. Now this could simply be a reference to the port that the card shipped from...or it could be the actual codename of the card, with Gemini just being the class itself."
The manifest also indicates a cooler from Cooler Master, the maker of the liquid cooling solution for the Fury X. As the Fury X has had its share of criticism for pump whine issues, it would be interesting to see how a dual-GPU cooling solution would fare in that department, though we could also be seeing an entirely new generation of the pump. Of course, speculation on an unreleased product like this could be incorrect, and verifiable hard details aren't available yet. Still, if the dual-GPU card is based on a pair of full Fiji XT cores, the specs could be very impressive, to say the least:
- Core: Fiji XT x2
- Stream Processors: 8192
- GCN Compute Units: 128
- ROPs: 128
- TMUs: 512
- Memory: 8 GB (4GB per GPU)
- Memory Interface: 4096-bit x2
- Memory Bandwidth: 1024 GB/s
In addition to the specifics above, the report also discussed the possibility of 17.2 TFLOPS of performance based on 2x the performance of Fury X, which would make the Gemini product one of the most powerful single-card GPU solutions in the world. The card seems close enough to the final stage that we should expect to hear something official soon, but for now it's fun to speculate - unless of course the speculation concerns a high initial retail price, and unfortunately something at or above $1000 is quite likely. We shall see.
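That 17.2 TFLOPS figure falls out of the standard single-precision throughput formula (shader count x clock x 2 operations per cycle for a fused multiply-add). A minimal sketch, assuming the dual card keeps the Fury X's 1050 MHz clock, which is not confirmed:

```python
# Rough single-precision throughput estimate for the rumored dual-Fiji card.
# FLOPS = shader cores * clock * 2 (one fused multiply-add per core per cycle).
# The 1050 MHz clock is borrowed from the Fury X; Gemini's real clock is unknown.

def sp_tflops(shaders: int, clock_mhz: float) -> float:
    return shaders * clock_mhz * 1e6 * 2 / 1e12

fury_x = sp_tflops(4096, 1050)   # single Fiji XT
gemini = sp_tflops(8192, 1050)   # two Fiji XTs at the same clock

print(f"Fury X: {fury_x:.1f} TFLOPS")   # ~8.6
print(f"Gemini: {gemini:.1f} TFLOPS")   # ~17.2
```

If the dual card clocks lower to stay inside a sane power envelope (as past dual-GPU cards have), the real number would land below 17.2.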
GPU Enthusiasts Are Throwing a FET
NVIDIA is rumored to launch Pascal in early (~April-ish) 2016, although some are skeptical that it will even appear before the summer. The design was finalized months ago, and unconfirmed shipping information claims that chips are being stockpiled, which is typical when preparing to launch a product. It is expected to compete against AMD's rumored Arctic Islands architecture, which will, according to its also rumored numbers, be very similar to Pascal.
This architecture is a big one for several reasons.
Image Credit: WCCFTech
First, it will jump two full process nodes. Current desktop GPUs are manufactured at 28nm, a node that arrived on the desktop in early 2012 with AMD's Radeon HD 7970 and NVIDIA's GeForce GTX 680, but Pascal will be manufactured on TSMC's 16nm FinFET+ technology. Smaller features have several advantages, but a huge one for GPUs is the ability to fit more complex circuitry in the same die area. This means that you can include more copies of elements, such as shader cores, and do more in fixed-function hardware, like video encode and decode.
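For rough intuition on "more circuitry in the same die area," the nominal density gain from a 28nm-to-16nm shrink is a simple area scaling. Treat this as an optimistic upper bound: marketing node names stopped mapping cleanly to physical feature sizes around the FinFET era, so real density gains are smaller.

```python
# Nominal transistor-density scaling between two process nodes.
# Node names are marketing labels, so this overstates the real-world gain.
old_nm, new_nm = 28, 16
scaling = (old_nm / new_nm) ** 2
print(f"{scaling:.2f}x")   # 3.06x more transistors per unit area, nominally
```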
That said, we got a lot more life out of 28nm than we really should have. Chips like GM200 and Fiji are huge, relatively power-hungry, and complex, the kind of designs that would be a terrible idea to produce when yields are low. I asked Josh Walrath, our go-to for analysis of fab processes, and he believes that FinFET+ today is probably even more complicated than 28nm was in the 2012 timeframe, when it launched for GPUs.
It's two full steps forward from where we started, but we've been tiptoeing since then.
Image Credit: WCCFTech
Second, Pascal will introduce HBM 2.0 to NVIDIA hardware. HBM 1.0 debuted with AMD's Radeon R9 Fury X, and it helped in numerous ways, from smaller card size to a triple-digit percentage increase in memory bandwidth. The 980 Ti can talk to its memory at about 336 GB/s, while Pascal is rumored to push that to 1 TB/s. Capacity won't be sacrificed, either. The top-end card is expected to contain 16GB of global memory, which is twice what any console has. This means less streaming, higher resolution textures, and probably even left-over scratch space for the GPU to generate content in with compute shaders. Also, according to AMD, HBM is an easier memory architecture to communicate with than GDDR, which should mean a savings in die space that could be used for other things.
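All of these bandwidth figures come from the same back-of-the-envelope formula: bus width times per-pin data rate. A quick sketch using the publicly known GDDR5 and HBM configurations (the HBM2 rate shown is the rumored doubling, not a confirmed spec):

```python
# Peak memory bandwidth in GB/s = bus width (bits) / 8 * per-pin rate (Gbps).
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(384, 7.0))    # GTX 980 Ti, GDDR5 at 7 Gbps: 336 GB/s
print(bandwidth_gbs(4096, 1.0))   # Fury X, HBM1 at 1 Gbps: 512 GB/s
print(bandwidth_gbs(4096, 2.0))   # rumored Pascal, HBM2 at 2 Gbps: 1024 GB/s
```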
Third, the architecture includes native support for three levels of floating point precision. Maxwell, due to how limited 28nm was, saved on complexity by reducing 64-bit IEEE 754 floating point performance to 1/32nd the rate of 32-bit operations, because FP64 values are rarely used in video games. This saved transistors, but was a huge, order-of-magnitude step back from the 1/3rd ratio found in the Kepler-based GK110. While it probably won't return to the 1/2 ratio found in Fermi, Pascal should be much better suited for GPU compute.
Image Credit: WCCFTech
Mixed precision could help video games too, though. Remember how I said it supports three levels? The third is 16-bit, half the width of the 32-bit format commonly used in video games. Sometimes that is sufficient. If so, Pascal is said to perform these calculations at twice the rate of 32-bit. We'll need to see whether enough games (and other applications) are willing to drop down in precision to justify the die space that these dedicated circuits require, but it should double the performance of anything that does.
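To see why dropping to 16-bit is a trade-off, here's a small illustration using Python's standard library, which can round-trip a value through IEEE 754 half precision via struct's 'e' format. FP16 keeps only about three decimal digits, and the gap between representable values grows with magnitude:

```python
import struct

def to_fp16(x: float) -> float:
    """Round a double to the nearest IEEE 754 half-precision value."""
    return struct.unpack('e', struct.pack('e', x))[0]

print(to_fp16(0.375))    # 0.375 -- exactly representable, no loss
print(to_fp16(0.1))      # ~0.09998 -- only ~3 decimal digits survive
print(to_fp16(2049.0))   # 2048.0 -- representable values are 2 apart here
```

For normalized quantities like colors or lighting terms the error is usually invisible, which is why half precision is attractive; for large world-space coordinates it is not.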
So basically, this generation should provide a massive jump in performance that enthusiasts have been waiting for. Increases in GPU memory bandwidth and the amount of features that can be printed into the die are two major bottlenecks for most modern games and GPU-accelerated software. We'll need to wait for benchmarks to see how the theoretical maps to practical, but it's a good sign.