Subject: Systems | October 5, 2015 - 07:39 PM | Scott Michaud
Tagged: valve, steam os, steam machines, steam, pc gaming
According to SteamDB, Valve has struck deals with GameStop, GAME UK, and EB Canada to create “store within a store” areas in North American and UK locations. The article does not clarify how many stores will receive this treatment. It does note that the Steam Controller, Steam Link, and even Steam Machines will be sold from these outlets, which will give Valve's console platform a physical presence alongside the existing ones.
The thing about Valve is that, when they go silent, you can't tell whether they have reconsidered their position or are just waiting for the right time to announce. They have been fairly vocal about Steam accessories, but the machines themselves have been met with pretty much radio silence for the better part of a year. There was basically nothing at CES 2015 after a big push the prior year. The talk shifted to Steam Link, which was obviously part of their original intention but, due to the simultaneous lack of Steam Machine promotion, feels more like a replacement than an addition.
But, as said, that's tricky logic to use with Valve.
As a final note, I am curious about what the transaction entailed. From what I hear, purchasing retail space is pricey and difficult, but some retailers donate space for certain products and initiatives that they find intrinsic value in. Valve probably has a lot of money, but they don't have Microsoft levels of cash. Whether Valve paid for the space, or the retailers donated it, is a question that leads to two very different, but both very interesting, follow-ups. Hopefully we'll learn more, but we probably won't.
Subject: General Tech | October 5, 2015 - 06:24 PM | Scott Michaud
Tagged: pc gaming, ea, battlefront
So I'm reading PC Gamer and I see an article that says, “Star Wars Battlefront Will Not Use Microtransactions”. Given the previous few Battlefield games, this surprised me. Granted, these titles weren't particularly egregious in their use of payments. Everything (apart from expansion packs of course) could be achieved through a reasonable amount of play. That said, it takes a lot of restraint for a developer to not just ratchet the requirements further and further to widen their net, so I can see the problem.
Regardless, by the third paragraph I noticed that the representative never actually said that they won't (at least according to the snippets that PC Gamer quoted). The phrase is simply, “not part of the core design of how it works”. Granted, I would expect EA to poke PC Gamer for a correction if they did intend to ship microtransactions in a game that releases in about six weeks, so I feel like the interpretation is correct.
That doesn't change the fact that, according to the quotes, the only thing they promised is that the currency system will be fully accessible without payments. I'm not fully convinced that it will be accessible only without payments, though.
Subject: Systems | October 5, 2015 - 05:16 PM | Scott Michaud
Tagged: microsoft, windows 10, surface, Surface Pro, surface pro 4, hp, Lenovo, dell, asus, acer, toshiba
Tomorrow at 10 am ET, Microsoft will host a live stream to announce “new Windows 10 devices from Microsoft”. It's pretty obvious that we'll get at least one new Surface device announced, which rumors suggest will be the Surface Pro 4 with a low-bezel, 13-inch display. W4pHub, via VR-Zone, goes a bit further to claim that the display can shrink to 12 inches when in tablet mode, giving a frame for the user to hold. If true, I wonder how applications will handle the shift in resolution. Perhaps the only problem is a little flicker, which will be hidden by the rest of Continuum's transition?
Image Credit: VR-Zone
The Microsoft Blog post also lists the announcement dates of their partners. Here's the rundown:
- October 7th -- HP
- October 8th -- Dell
- October 9th -- ASUS
- October 12th -- Acer
- October 13th -- Toshiba
- October 19th -- Lenovo
While the rush of Windows 10 devices has missed the back-to-school season, despite Microsoft's attempts to rush development with a July release, it looks like we might get a good number of them for the holiday season. I was a bit worried, seeing how slowly Threshold 2 seems to be advancing, but they seem to have convinced OEMs to make a big deal out of it.
Then again, it could be holiday fever.
Subject: Processors | October 5, 2015 - 04:48 PM | Jeremy Hellstrom
Tagged: amd, PRO A12-8800B, Excavator, carrizo pro, Godavari Pro
AMD recently announced a Pro lineup of Excavator-based chips which match their current Carrizo and Godavari lineups as far as specifications go. This was somewhat confusing, as at first glance no real features separated the Pro chips from their non-Pro cousins in the press material from AMD or HP. Tech ARP posted the slides from the reveal, and they note one key feature that separates the two chip families and why businesses should be interested in them. These are hand-picked dies taken from hand-picked wafers, which AMD chose because they represent the best of the chips they have fabbed. You should expect performance free from any defects that made it past quality control, and if you are unlucky enough to end up with a less-than-perfect chip, they come with a 36-month extended OEM warranty.
In addition to being hand-picked, machines with an AMD Pro chip will also come with an ARM TrustZone technology-based AMD Secure Processor onboard. If you use a mobile device with a TPM and a crypto-processor onboard, you will be familiar with the technology; AMD is the first to bring this security platform to Windows-based machines. Small business owners may also be interested in the AMD PRO Control Center, an inventory management client that will not cost as much as ones designed for the enterprise and, in theory, should be easier to use as well.
This news is of lesser interest to the gamer, but you never know; if you can secure one of these hand-picked chips, you may find it gives you a bit more headroom for tweaking than your average, run-of-the-mill Godavari or Carrizo would.
"We will not only show you the presentation slides, we also recorded the entire conference call and created a special video presentation based on the conference call for you. We hope you enjoy our work."
Here are some more Processor articles from around the web:
- The Skylake Core i3-6320 is the gamer's new best friend @ The Tech Report
- Core i7-5775C CPU Review @ Hardware Secrets
- Pentium N3700 CPU Review @ Hardware Secrets
New Components, New Approach
After 20 or so enclosure reviews over the past year and a half, and some pretty inconsistent test hardware along the way, I decided to adopt a standardized test bench for all reviews going forward. Makes sense, right? It turns out that choosing the best components for a cases and cooling test system was a lot more difficult than I expected going in, as special consideration had to be made for everything from form factor to noise and heat levels.
Along with the new components I will also be changing the approach to future reviews by expanding the scope of CPU cooler testing. After some debate as to the type of CPU cooler to employ I decided that a better test of an enclosure would be to use both closed-loop liquid and air cooling for every review, and provide thermal and noise results for each. For CPU cooler reviews themselves I'll be adding a "real-world" load result to the charts to offer a more realistic scenario, running a standard desktop application (in this case a video encoder) in addition to the torture-test result using Prime95.
But what about this new build? It isn't completely done but here's a quick look at the components I ended up with so far along with the rationale for each selection.
CPU – Intel Core i5-6600K ($249, Amazon.com)
The introduction of Intel’s 6th generation Skylake processors provided the opportunity for an upgrade after using an AMD FX-6300 system for the last couple of enclosure reviews. After toying with the idea of the new i7-6700K, and immediately realizing it was likely overkill and (more importantly) completely unavailable for purchase at the time, I went with the more "reasonable" option in the i5. There has long been a debate as to the need for Hyper-Threading for gaming (though this may be changing with the introduction of DX12), but in any case this is still a very powerful processor that, when stressed, should produce a challenging enough thermal load to adequately test both CPU coolers and enclosures going forward.
GPU – XFX Double Dissipation Radeon R9 290X ($347, Amazon.com)
This was by far the most difficult selection. I don’t think of my own use when choosing a card for a test system like this, as it must meet a set of criteria to be a good fit for enclosure benchmarks. If I choose a card that runs very cool and with minimal noise, GPU benchmarks will be far less significant as the card won’t adequately challenge the design and thermal characteristics of the enclosure. There are certainly options that run at greater temperatures and higher noise (a reference R9 290X for example), but I didn’t want a blower-style cooler with the GPU. Why? More and more GPUs are released with some sort of large multi-fan design rather than a blower, and for enclosure testing I want to know how the case handles the extra warm air.
Noise was an important consideration, as levels from an enclosure of course vary based on the installed components. With noise measurements a GPU cooler that has very low output at idle (or zero, as some recent cooler designs permit) will allow system idle levels to fall more on case fans and airflow than a GPU that might drown them out. (This would also allow a better benchmark of CPU cooler noise - particularly with self-contained liquid coolers and audible pump noise.) And while I wanted very quiet performance at idle, at load there must be sufficient noise to measure the performance of the enclosure in this regard, though of course nothing will truly tax a design quite like a loud blower. I hope I've found a good balance here.
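As an aside on why the quiet-idle GPU matters for measurements: sound levels in dB(A) combine on a logarithmic scale, so the loudest component tends to set the reading. Here's a minimal sketch of that arithmetic (assuming uncorrelated sources and ignoring distance and frequency weighting):

```python
import math

def combined_spl(levels_dba):
    """Combine uncorrelated sound sources, each given in dB(A),
    by summing their acoustic power and converting back to dB."""
    power = sum(10 ** (level / 10) for level in levels_dba)
    return 10 * math.log10(power)

# Two equally loud 30 dBA case fans raise the total by only ~3 dB.
print(round(combined_spl([30, 30]), 1))  # -> 33.0

# A 45 dBA GPU next to a 30 dBA fan: the GPU dominates the reading.
print(round(combined_spl([30, 45]), 1))  # -> 45.1
```

That second result is the whole problem in one number: a loud card contributes nearly all of the measured level, so any fan or pump noise underneath it is effectively invisible to the meter.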
Subject: Cases and Cooling | October 5, 2015 - 02:29 PM | Jeremy Hellstrom
Tagged: air cooling, enermax, ETS-T40F-W
Have you recently bought a white motherboard and a matching case to go with it? Are you now bemoaning the fact that your cooler just isn't matching the Storm Trooper vibe that you have going on in your case? Enermax has a solution: the ETS-T40F-W cooler, 610 grams of glowing white cooling standing 139 x 93 x 160 mm (5.5 x 3.7 x 6.3"). The Thermal Conductive Coating does seem to work effectively, though the cooler is not among the best that [H]ard|OCP has reviewed. They also recommend running the fan at low speed, as high speed does not increase the cooling as noticeably as it increases the fan noise. Then again, at $50, being the only coloured cooler on the market places it in an interesting niche.
"Enermax comes to us with its new compact size model, the ETS-T40F-W CPU air cooler also referred to as the "Fit" series cooler. This model is decked out in its best Storm Trooper white garb which is actually what Enermax calls its "Thermal Conductive Coating." Do the Fit's dual 12cm fans have what it takes to make a good CPU air cooler?"
Here are some more Cases & Cooling reviews from around the web:
- Building your first Custom Designed Watercooled PC: KitGuru TV
- EKWB EK-XLC Predator 240 Pre-filled CPU Xpandable Liquid Cooler Review @ NikKTech
- Phanteks Enthoo Evolv ITX @ Modders-Inc
- Silverstone SG12 Micro-ATX @ eTeknix
- Antec GX505 Window SC Mid-Tower @ Benchmark Reviews
- Streacom ST-F12CS Aluminium ATX HTPC Chassis @ eTeknix
- StarTech 25U Open-Frame Server Rack Cabinet @ Phoronix
Subject: General Tech | October 5, 2015 - 01:14 PM | Jeremy Hellstrom
Tagged: LinuxCon Europe, linux, open source
LinuxCon Europe has just kicked off, and there are some interesting projects being discussed at the event. ARM, Cisco, NexB, Qualcomm, SanDisk and Wind River have formed the OpenChain workgroup to bring some standardization to Linux software development, such as exists in Debian, to ensure that multiple companies are not each attempting to reinvent the wheel simultaneously. The Real-Time Linux Collaborative Project is developing software for application in robotics, telecom, and aviation, and includes members such as Google, Texas Instruments, Intel, ARM and Altera. They will be working towards developing Linux applications for those industries where shaving a few milliseconds off of transaction times can be worth millions of dollars. The last major project announced at the convention is FOSSology 3.0, which will enable you to quickly and easily run licence and copyright scans, something near and dear to the heart of the Free and Open Source Software community. Check out more at The Inquirer.
"Jim Zemlin, chief executive of the Foundation, said in his opening remarks that this year's opening day falls on the 24th anniversary of Linux itself and the 30th of the Free Software Foundation, giving credit to delegates for their part in the success of both."
Here is some more Tech News from around the web:
- Apple's A9 impresses and the Nexus strikes back: The TR Podcast 188
- Shutdowngate: iPhone 6S handsets are randomly turning off @ The Inquirer
- Google spews out Alphabet. Alphabet gobbles Google @ The Register
- Mega Giveaway #7 : LEAGOO Elite 4 Smartphone @ Tech ARP
Subject: Cases and Cooling | October 5, 2015 - 09:01 AM | Scott Michaud
Tagged: wall mount, thermaltake
Personally, I would like to see at least an option for plexiglass on the perimeter. I feel like some might want a bit of protection from things like sneezes, or rogue squirt-gun blasts. The “case” is basically a plate with a clear acrylic pane in front of it. It can stand upright, be rotated horizontally, or even screwed into a wall if you want to show off a custom liquid cooling loop or something.
Interestingly, Thermaltake is providing “3D Printing Accessory Files”. I somehow doubt that this will be the CAD files required to lasercut your own Core P5 case, but it's designed to allow makers to create their own accessories for it. As such, this sounds more like guides and schematics, but I cannot say for sure because I haven't tried it... and they're not available yet.
The Thermaltake Core P5 will be available soon for an MSRP of $169.99, although it's already at a sale price of $149.99. This could be just a pre-order discount, or a sign of its typical price point. We don't know.
Subject: General Tech | October 5, 2015 - 08:31 AM | Scott Michaud
Tagged: pc gaming, humble bundle
Humble Bundle is an organization that sells games for charity. It started with a service that let users pay pretty much whatever they wanted for DRM-free titles, and let them choose how much went to the developers, the organization, and the selected charities of the moment. They have branched out since then, sometimes with praise, sometimes with concerned murmurs.
Humble Bundle mumble, if you will.
Now they have created a subscription service. Basically, on the first Friday of every month, subscribers will receive the game that is promoted. In other words, it is a service that acts similar to what we're used to, except that you don't know what you're getting ahead of time, you cannot select how much you pay for it, and you cannot choose how the proceeds are distributed. Unless it leads to a unique palette of games that are decidedly better than the typical bundles, I cannot see how this is anything more than a restrictive subset for the sake of it.
Still, that doesn't mean said subset isn't worth your money (be careful of the double-negative). If it is, then you can subscribe now and pick up Legend of Grimrock 2. The title is apparently available on Steam for $24, so this would be a half-price deal if it was something that you were interested in buying.
I guess that's a decent first impression.
Subject: General Tech, Mobile | October 5, 2015 - 08:01 AM | Scott Michaud
Tagged: windows 10, microsoft, iot
Microsoft has released a Windows 10 IoT Core starter pack for the Raspberry Pi 2. It retails for $75 without the Raspberry Pi 2 Model B, or $115 with it. Apart from the optional Pi, it is basically a pack of electronic components and an SD card that's pre-loaded with Windows 10 IoT. It is available at the Adafruit store, although both packs are currently out of stock... because of course they are.
Beyond jumper wires, a case, breadboards, resistors, LEDs, switches, and sensors, the pack also comes with a WiFi module. Interestingly, Adafruit claims that this will be the only WiFi adapter for the Raspberry Pi 2 that's supported by Windows 10 IoT. This is weird, of course, because Windows is kind-of the go-to when it comes to driver support. It makes me wonder whether Microsoft changed anything under the hood that affects hardware compatibility and, if it did, whether Windows 10 IoT loses its major advantage over Linux and other OSes in this form factor.
The kit is currently sold out, but retails for $75, or $115 with a Raspberry Pi 2 Model B.
Subject: General Tech | October 5, 2015 - 07:32 AM | Scott Michaud
Tagged: starcraft 2, starcraft, pc gaming, esports
I'm not really seeing anyone pick up this news in English outside of StarCraft II forums, so I'm not sure whether it will be fresh, or completely irrelevant to anyone's interests. Either way, GOM eXP was one of the leading broadcasters of StarCraft tournaments in South Korea. They operated GSL, which was one of the three Blizzard-endorsed leagues for StarCraft II.
Image Credit: Wolf Shröder via Twitter
They have just shut down, but their GSL tournament will not.
afreecaTV, a video streaming service, has bought out the tournament. For viewers, this means that high quality, 1080p streams will be available for free. Previously, GOM was a bit strict about forcing Twitch subscriptions for anything other than Low quality. The quality was bad enough that you often couldn't even read the on-screen text, such as how many units or resources each player has.
Beyond hosting the 2016 GSL tournament, they will also have a couple of StarCraft II show matches and even a StarCraft: Brood War league. I wonder how the original StarCraft holds up for viewers after we have gotten used to the sequel's updated graphics. Hmm.
Subject: Graphics Cards | October 5, 2015 - 07:13 AM | Scott Michaud
Tagged: graphics drivers, amd
Apparently users of AMD's Catalyst 15.9 drivers have been experiencing issues. Specifically, “major memory leaks” could be caused by adjusting windows, such as resizing them or snapping them to edges of the desktop. According to PC Gamer, AMD immediately told users to roll back when they found out about the bug.
They have since fixed it with Catalyst 15.9.1 Beta. This point release also fixes crashes and potential “signal loss” problems with a BenQ FreeSync monitor. As such, if you were interested in playing around with the Catalyst 15.9 beta driver, it should be safe to do so now. I wish I could offer more input, but I just found out about it and it seems pretty cut-and-dried: if you had problems, they should be fixed. The update is available here.
Subject: General Tech | October 5, 2015 - 07:01 AM | Scott Michaud
Tagged: physics, microsoft, Intel, Havok
Microsoft has just purchased Havok from Intel for an undisclosed price. This group develops one of the leading physics engines for video games and other software. It was used in every Halo title since Halo 2, including Halo Wars, and a fork of it drives the physics for Valve's Source Engine. It has been around since 2000, but didn't really take off until Max Payne 2 in 2003.
And the natural follow-up question for just about everything is “why?”
Hopefully this isn't bad taste...
Photo Credit: Havok via Game Developer Magazine (June 2013)
There are good reasons, though. First, Microsoft has been in the video game middleware and API business for decades. DirectX is the obvious example, but they have also created software like Games for Windows Live and Microsoft Gaming Zone. Better software drives sales for platforms, and developers can always use help accomplishing that.
Another reason could be Azure. Microsoft wants to bring cloud services to online titles; offloading some of the tasks that are insensitive to latency allows developers to lower system requirements or do more with what they have (which is especially true when consoles flatten huge install bases into a handful of specifications). If they plan to go forward with services that run on Azure or Xbox Live, then it would make sense to have middleware that's as drop-in as possible. Creating a physics engine from scratch is a bit of a hassle, but so is encouraging existing engines to use it.
It would be better to just buy someone that everyone is using. Currently, that's Havok. The main alternatives are Bullet, an open-source solution that is rarely used outside of other open-source projects, and PhysX, which is owned by NVIDIA (and probably won't leave their grip until their fingers are frigid and lifeless).
That's about all we know, though. The deal doesn't have a close date, value, or official purpose. Intel hasn't commented on the deal, only Microsoft has.
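As a hypothetical sketch of the Azure idea above (the function names here are made up for illustration; this is not Microsoft's or Havok's actual API), the pattern is to hand a latency-insensitive job to a remote service and keep the frame loop running, collecting the result whenever it lands:

```python
from concurrent.futures import ThreadPoolExecutor
import time

def cloud_pathfinding(grid):
    """Stand-in for a latency-insensitive job sent to a server,
    e.g. long-range AI pathfinding; here it just simulates delay."""
    time.sleep(0.05)  # pretend network + compute latency
    return [(0, 0), (1, 1), (2, 2)]  # a dummy path

executor = ThreadPoolExecutor(max_workers=2)
future = executor.submit(cloud_pathfinding, grid=None)

frames = 0
while not future.done():
    frames += 1        # the local render loop keeps going, unblocked
    time.sleep(0.001)  # one "frame" of local work

path = future.result()  # pick up the answer once it arrives
executor.shutdown()
print(frames, path[0])
```

The key design point is that the game never stalls waiting on the network; only work whose answer can arrive a few frames late is eligible for offloading.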
Subject: Graphics Cards | October 5, 2015 - 02:33 AM | Sebastian Peak
Tagged: rumor, report, radeon, graphics cards, Gemini, fury x, fiji xt, dual-GPU, amd
The AMD R9 Fury X, Fury, and Nano have all been released, but a dual-GPU Fiji XT card could be on the way soon according to a new report.
Back in June at AMD's E3 event we were shown Project Quantum, AMD's concept for a powerful dual-GPU system in a very small form-factor. It was speculated that the system was actually housing an unreleased dual-GPU graphics card, which would have made sense given the very small size of the system (and mini-ITX motherboard therein). Now a report from WCCFtech is pointing to a manifest that just might be a shipment of this new dual-GPU card, and the code-name is Gemini.
"Gemini is the code-name AMD has previously used in the past for dual GPU variants and surprisingly, the manifest also contains another phrase: ‘Tobermory’. Now this could simply be a reference to the port that the card shipped from...or it could be the actual codename of the card, with Gemini just being the class itself."
The manifest also indicates a cooler from Cooler Master, the maker of the liquid cooling solution for the Fury X. As the Fury X has had its share of criticism for pump whine issues, it would be interesting to see how a dual-GPU cooling solution would fare in that department, though we could be seeing an entirely new generation of the pump as well. Of course, speculation on an unreleased product like this could be incorrect, and verifiable hard details aren't available yet. Still, if the dual-GPU card is based on a pair of full Fiji XT cores, the specs could be very impressive to say the least:
- Core: Fiji XT x2
- Stream Processors: 8192
- GCN Compute Units: 128
- ROPs: 128
- TMUs: 512
- Memory: 8 GB (4GB per GPU)
- Memory Interface: 4096-bit x2
- Memory Bandwidth: 1024 GB/s
In addition to the specifics above, the report also discussed the possibility of 17.2 TFLOPS of performance based on 2x the performance of Fury X, which would make the Gemini product one of the most powerful single-card GPU solutions in the world. The card seems close enough to the final stage that we should expect to hear something official soon, but for now it's fun to speculate - unless of course the speculation concerns a high initial retail price; unfortunately, something at or above $1000 is quite likely. We shall see.
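For what it's worth, the rumored numbers are internally consistent if Gemini keeps Fury X's per-GPU configuration. A quick sketch (the 1,050 MHz clock is an assumption carried over from Fury X, not a confirmed Gemini spec):

```python
# Theoretical specs for a dual Fiji XT card, derived from one Fury X.
shaders_per_gpu = 4096  # stream processors on a full Fiji XT
clock_ghz = 1.05        # Fury X clock; an assumption for Gemini

# FP32 throughput: 2 ops per shader per clock (fused multiply-add).
tflops_per_gpu = shaders_per_gpu * 2 * clock_ghz / 1000
print(round(2 * tflops_per_gpu, 1))   # -> 17.2 TFLOPS for two GPUs

# HBM1 bandwidth: 4096-bit bus per GPU at 500 MHz, double data rate.
gbps_per_pin = 0.5 * 2                # 1 Gb/s per pin
bw_per_gpu = 4096 * gbps_per_pin / 8  # bits -> bytes: 512 GB/s
print(int(2 * bw_per_gpu))            # -> 1024 GB/s combined
```

So the 17.2 TFLOPS and 1024 GB/s figures in the report are exactly what "two Fury X chips on one board" works out to; whether a dual card could sustain Fury X clocks within its power budget is a separate question.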
GPU Enthusiasts Are Throwing a FET
NVIDIA is rumored to launch Pascal in early (~April-ish) 2016, although some are skeptical that it will even appear before the summer. The design was finalized months ago, and unconfirmed shipping information claims that chips are being stockpiled, which is typical when preparing to launch a product. It is expected to compete against AMD's rumored Arctic Islands architecture, which will, according to its also rumored numbers, be very similar to Pascal.
This architecture is a big one for several reasons.
Image Credit: WCCFTech
First, it will jump two full process nodes. Current desktop GPUs are manufactured at 28nm, which was first introduced with the GeForce GTX 680 all the way back in early 2012, but Pascal will be manufactured on TSMC's 16nm FinFET+ technology. Smaller features have several advantages, but a huge one for GPUs is the ability to fit more complex circuitry in the same die area. This means that you can include more copies of elements, such as shader cores, and do more in fixed-function hardware, like video encode and decode.
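As a rough illustration of that advantage, ideal scaling says transistor density goes with the inverse square of the feature size, so the jump from 28nm to 16nm suggests about 3x the circuitry per unit area (real FinFET processes gain less than this, since node names are partly marketing labels):

```python
# Ideal-scaling sketch: density goes with the inverse square of the
# feature size. Actual 16nm FinFET density gains are smaller, since
# "16nm" does not describe every dimension of the process.
old_node_nm = 28
new_node_nm = 16

density_gain = (old_node_nm / new_node_nm) ** 2
print(round(density_gain, 2))  # -> 3.06x more circuitry per mm^2
```

Even discounted for reality, that is the headroom that lets a GM200-sized design gain shader cores and fixed-function blocks without growing the die.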
That said, we got a lot more life out of 28nm than we really should have. Chips like GM200 and Fiji are huge, relatively power-hungry, and complex, which is a terrible idea to produce when yields are low. I asked Josh Walrath, who is our go-to for analysis of fab processes, and he believes that FinFET+ is probably even more complicated today than 28nm was in the 2012 timeframe, which was when it launched for GPUs.
It's two full steps forward from where we started, but we've been tiptoeing since then.
Image Credit: WCCFTech
Second, Pascal will introduce HBM 2.0 to NVIDIA hardware. HBM 1.0 was introduced with AMD's Radeon R9 Fury X, and it helped in numerous ways -- from smaller card size to a triple-digit percentage increase in memory bandwidth. The 980 Ti can talk to its memory at about 336GB/s, while Pascal is rumored to push that to 1TB/s. Capacity won't be sacrificed, either. The top-end card is expected to contain 16GB of global memory, which is twice what any console has. This means less streaming, higher resolution textures, and probably even left-over scratch space for the GPU to generate content in with compute shaders. Also, according to AMD, HBM is an easier architecture to communicate with than GDDR, which should mean a savings in die space that could be used for other things.
Third, the architecture includes native support for three levels of floating point precision. Maxwell, due to how limited 28nm was, saved on complexity by reducing 64-bit IEEE 754 floating point performance to 1/32nd of the 32-bit rate, because FP64 values are rarely used in video games. This saved transistors, but was a huge, order-of-magnitude step back from the 1/3rd ratio found on the Kepler-based GK110. While it probably won't be back to the 1/2 ratio that was found in Fermi, Pascal should be much better suited for GPU compute.
Image Credit: WCCFTech
Mixed precision could help video games too, though. Remember how I said it supports three levels? The third one is 16-bit, which is half the width of the 32-bit format that is commonly used in video games. Sometimes, that is sufficient. If so, Pascal is said to do these calculations at twice the rate of 32-bit. We'll need to see whether enough games (and other applications) are willing to drop down in precision to justify the die space that these dedicated circuits require, but it should double the performance of anything that does.
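To see why 16-bit is only sometimes sufficient, it helps to look at what half precision can actually represent. Python's struct module supports the IEEE 754 half-precision format directly, so we can round-trip values through it; with only 10 mantissa bits, integers above 2048 already lose exactness:

```python
import struct

def to_half(x):
    """Round-trip a float through IEEE 754 half precision (16-bit)."""
    return struct.unpack('e', struct.pack('e', x))[0]

print(to_half(2048.0))  # -> 2048.0 (exactly representable)
print(to_half(2049.0))  # -> 2048.0 (rounded away: only 10 mantissa bits)
print(to_half(0.1))     # -> 0.0999755859375 (nearest half-precision value)
```

For colors, normals, and other small-range quantities that error is invisible; for large world coordinates or long accumulating sums it isn't, which is why the precision drop has to be chosen per calculation rather than globally.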
So basically, this generation should provide a massive jump in performance that enthusiasts have been waiting for. Increases in GPU memory bandwidth and the amount of features that can be printed into the die are two major bottlenecks for most modern games and GPU-accelerated software. We'll need to wait for benchmarks to see how the theoretical maps to practical, but it's a good sign.
Subject: General Tech | October 3, 2015 - 11:04 PM | Scott Michaud
Tagged: Starcraft II, legacy of the void, blizzard
Third time's the charm, unless they plan another release at some point.
The StarCraft II interface isn't perfect. Even though it is interesting and visually appealing, some tasks are unnecessarily difficult and space is not used in the most efficient way. To see what I mean, try to revert the multiplayer mode to Wings of Liberty or, worse, find your Character Code. Blizzard released a new UI with Heart of the Swarm back in 2013, and they're doing a new one for the release of Legacy of the Void on November 10th. Note that my two examples probably won't be fixed in this update; they are just examples of UX issues.
While the update aligns with the new expansion, Blizzard will patch the UI for all content levels, including the free Starter Edition. This honestly makes sense, because it's easier to patch a title when all variations share a common core. Then again, not every company patches five-year-old titles like Blizzard does, so the back-catalog support is appreciated.
The most heartwarming change for fans, if pointless otherwise, is in the campaign selection screen. As the StarCraft II trilogy will be completed with Legacy of the Void, the interface aligns them as three episodes in the same style as the original StarCraft did.
On the functional side, the interface has been made more compact (which I alluded to earlier). This was driven by the new chat design, which is bigger yet less disruptive than it was in Heart of the Swarm. The column of buttons on the side is now a top bar, which expands down for sub-menu items.
While there are several things that I haven't mentioned, a final note for this post is that Arcade will now focus on open lobbies. Players can still look for the specific game they want, but the initial screen will show lobbies that are waiting to fill. The hope seems to be that players will spend less time waiting for a game. This raises two questions. First, Arcade games tend to have a steep learning curve, so I wonder if this feature will slump off after people try a few rounds and realize that they should stick with a handful of games. Second, I wonder what this means for player numbers in general -- this sounds like a feature that is added during player declines, which Blizzard seems to hint is not occurring.
I'm not sure when the update will land, but it will probably be around the launch of Legacy of the Void on November 10th.
Subject: Displays | October 3, 2015 - 09:12 PM | Sebastian Peak
Tagged: UP3216Q, ultrasharp, UHD, monitor, ips, HDMI 2.0, display, dell, calibration, Adobe RGB, 4k
While not officially launched in the U.S. just yet, on Thursday Tom's Hardware reported news of a trio of upcoming UltraSharp monitors from Dell, the largest of which - the UP3216Q - I was able to locate on Dell's Bermuda site.
For anyone looking for a 4K display for photo or video editing (or any other color critical work) the new Dell UltraSharp UP3216Q looks like a great - and likely very pricey - option. Just how much are we talking? The existing 31.5-inch 4K UP3214Q carries a $1999 MSRP (though it sells for $1879 on Dell's site). For this kind of money there are probably those who will never consider a 16:9 option (or ever give up their 16:10 30-inch displays), but the specifications of this new UP3216Q are impressive:
- Diagonal Viewing Size: 31.5 inch
- Aspect Ratio: Widescreen (16:9)
- Panel Type, Surface: In-Plane Switching
- Optimal resolution: 3840 x 2160 @ 60Hz
- Active Display Area (H x V): 273,996 sq-mm (424.7 sq-inches)
- Contrast Ratio: 1000 to 1 (typical), 2 Million to 1 (dynamic)
- Brightness: 300 cd/m2 (typical)
- Response Time: 6 ms (fast mode, GTG)
- Viewing Angle: 178° vertical / 178° horizontal
- Adjustability: Tilt, Swivel, Height Adjust
- Color Support: 1.07 billion colors
- Pixel Pitch: 0.182 mm
- Backlight Technology: LED light bar system
- Display Screen Coating: Anti-Glare with 3H hardness
- Connectivity: DP, mDP, HDMI (MHL), 4 x USB3 with one charging port, 1 x USB3 upstream, Media Card Reader
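Those spec-sheet numbers hang together, and checking them is simple geometry. A quick sanity check from the resolution and the 31.5-inch diagonal (no assumptions beyond the listed specs):

```python
import math

h_px, v_px = 3840, 2160
diagonal_in = 31.5

# Pixels along the diagonal, then density and pitch.
diag_px = math.hypot(h_px, v_px)
ppi = diag_px / diagonal_in   # ~139.9 pixels per inch
pitch_mm = 25.4 / ppi         # -> 0.182 mm, matching the listed pitch

# Active area from the pitch: lands within ~0.2% of the
# listed 273,996 sq-mm figure.
area_sq_mm = (h_px * pitch_mm) * (v_px * pitch_mm)
print(round(pitch_mm, 3), round(area_sq_mm))
```

At roughly 140 PPI this sits between a standard 27-inch 1440p panel (~109 PPI) and the much denser 24-inch 4K monitors, which is part of why 31.5 inches is a comfortable size for 4K productivity work.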
With the 60 Hz 4K (UHD) IPS panel offering full sRGB and 99.5% Adobe RGB coverage, and factory color calibration promising a deltaE of less than 2, the UP3216Q sounds pretty much ready to go out of the box. However, for those inclined to strive for a more perfect calibration, Dell is offering an X-Rite i1Display Pro colorimeter as an optional accessory, paired with their own Dell UltraSharp Color Calibration Solution software.
A couple of points of interest with this monitor: while it offers DisplayPort and mini-DP inputs, it also supports 4K at 60 Hz via HDMI 2.0. Color support is listed as 1.07 billion colors, but it's not specified whether this indicates a true 10-bit panel or 10-bit color processing on an 8-bit panel - though at this $2k price range it would probably be safe to assume a true 10-bit panel. Lastly, in keeping with the UltraSharp branding, the monitor will also carry Dell's Premium Panel Guarantee and 3-Year Advanced Exchange Service warranty.
Subject: Mobile | October 2, 2015 - 04:09 PM | Tim Verry
Tagged: Tegra X1, tablet, pixel, nvidia, google, android 6.0, Android
During its latest keynote event, Google unveiled the Pixel C, a powerful tablet with optional keyboard that uses NVIDIA’s Tegra X1 SoC and runs the Android 6.0 “Marshmallow” operating system.
The Pixel C was designed by the team behind the Chromebook Pixel. It features an anodized aluminum body that looks (and reportedly feels) smooth, with clean lines and rounded corners. The tablet itself is 7mm thick and weighs approximately one pound. The front of the Pixel C is dominated by a 10.2” display with a resolution of 2560 x 1800 (308 PPI, 500 nits brightness), a wide sRGB color gamut, and a 1:√2 aspect ratio (which Google likened to the size and aspect ratio of an A4 sheet of paper). A 2MP front camera sits above the display, four microphones sit along the bottom edge, and a single USB Type-C port and two stereo speakers sit on the sides of the tablet. Around back, there is an 8MP rear camera and a bar of LED lights that lights up to indicate the battery charge level when you double tap it.
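For what it's worth, the quoted display numbers hang together: computing density from the resolution and the 10.2-inch diagonal lands within a point or two of Google's 308 PPI figure, and 2560 x 1800 is indeed close to the 1:√2 paper-like ratio. A quick check (my own arithmetic, not Google's):

```python
import math

# Pixel C display figures from the announcement
h_px, v_px = 2560, 1800
diag_in = 10.2

ppi = math.hypot(h_px, v_px) / diag_in
aspect = h_px / v_px

print(round(ppi))        # 307, within rounding of the quoted 308 PPI
print(round(aspect, 3))  # 1.422, close to sqrt(2) ~= 1.414
```

The √2 ratio has a neat property that matters for split-screen: halve the long edge and each half keeps the same proportions, just like cutting an A4 sheet into two A5 sheets.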
The keyboard is an important part of the Pixel C, and Google has given it special attention to make it part of the package. The keyboard attaches to the tablet using self-aligning magnets that are powerful enough to keep the display attached while holding it upside down and shaking it (not that you'd want to do that, mind you). It can be attached to the bottom of the tablet for storage and used like a slate or you can attach the tablet to the back of the keyboard and lift the built-in hinge to use the Pixel C in laptop mode (the hinge can hold the display at anywhere from 100 to 135-degrees). The internal keyboard battery is good for two months of use, and can be simply recharged by closing the Pixel C like a laptop and allowing it to inductively charge from the tablet portion. The keyboard is around 2mm thick and is nearly full size at 18.85mm pitch and the chiclet keys have a 1.4mm travel that is similar to that of the Chromebook Pixel. There is no track pad, but it does offer a padded palm rest which is nice to see.
Internally, the Pixel C is powered by the NVIDIA Tegra X1 SoC, 3GB of RAM, and 32GB or 64GB of storage (depending on model). The 20nm Tegra X1 consists of four ARM Cortex A57 and four Cortex A53 CPU cores paired with a 256-core Maxwell GPU. The Pixel C is a major design win for NVIDIA, and the built-in GPU will be great for gaming on the go.
The Pixel C will be available in December ("in time for the holidays") for $499 for the base 32 GB model, $599 for the 64 GB model, and $149 for the keyboard.
First impressions, such as this hands-on by Engadget, seem to be very positive, describing sturdy yet sleek hardware that is comfortable to type on. While the hardware looks more than up to the task, the operating system of choice is a concern for me. Android is not the most productivity- and multitasking-friendly software. There are some versions of Android that enable multiple windows or side-by-side apps, but the feature has always felt rather clunky and limited in its usefulness. With that said, Computerworld's JR Raphael seems hopeful. He points out that the Pixel C is, in Batman fashion, not the hardware Android wants, but the hardware that Android needs (to move forward), and that it is primed for a future version of Android that is more friendly to such productive endeavors. Development versions of Android 6.0 included support for multiple apps running side-by-side, and while that feature will not make the initial production release, it does show that it is something Google is looking into and may enable at some point. The Pixel C's aspect ratio is well suited to such app splitting, with the ability to display four windows that each keep the same aspect ratio.
I am not sure how well received the Pixel C will be by business users who have several convertible tablet options running Windows and Chrome OS. It certainly gives the iPad-and-keyboard combination a run for its money and is a premium alternative to devices like the Asus Transformers.
What do you think about the Pixel C, and in particular, it running Android?
Even if I end up being less-than-productive using it, I think I'd still want the sleek-looking hardware as a second machine, heh.
Subject: Cases and Cooling | October 2, 2015 - 03:29 PM | Jeremy Hellstrom
Tagged: PSU, modular psu, coolermaster, V750W, 80+ gold
Cooler Master's V750W PSU is fully modular and comes with a nice selection of cabling, including four PCIe 6+2 connectors and eight SATA power connectors. At 150x140x86mm (5.9x5.5x3.4") it also takes up less space than many PSUs, though it is not small enough to fit in a truly SFF case. A single 12V rail can provide 744W at 62A, which is enough to power more than one mid-to-high-range GPU, and Bjorn3D's testing shows that it can maintain its 80+ Gold rating under load. The five-year warranty is also a good reason to pick up this PSU, assuming you are not in the market for something in the kilowatt range.
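The headline rail figure is just Ohm's-law bookkeeping, and the 80+ Gold badge gives a rough idea of wall draw. A small sketch (the 87% full-load figure is the approximate 80 Plus Gold requirement at 115V, not a measured number for this unit):

```python
# Single 12 V rail rating from the spec: 62 A
rail_watts = 12 * 62
print(rail_watts)  # 744 W, as listed

# 80 Plus Gold requires roughly 87% efficiency at full load (115 V),
# so a rough estimate of wall draw with the PSU fully loaded:
full_load_w = 750
wall_draw = full_load_w / 0.87
print(round(wall_draw))  # ~862 W from the wall
```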
"One option soon to be available on the market for under $179, and our specimen of review today, is the CoolerMaster V750. CoolerMaster has partnered with Seasonic to produce the high quality compact “V” series PSUs which made a huge statement for CoolerMaster and told the world they were ready to push some serious power."
Here are some more Cases & Cooling reviews from around the web:
- Corsair RM1000x @ Kitguru
- Corsair RM750x @ Kitguru
- Antec HCP Platinum Continuous Power 1000W @ eTeknix
- Be Quiet! Dark Power Pro 11 1000W Power Supply Unit Review @ NikKTech
- Cooler Master V Series 550W @ Kitguru
- be quiet! Dark Power Pro 11 550W @ eTeknix
Introduction and Technical Specifications
Courtesy of GIGABYTE
As the flagship board in their Intel Z170-based G1 Gaming Series product line, GIGABYTE integrated all the premium features you could ever want into their Z170X-Gaming G1 motherboard. The board features a black PCB with red and white accents spread throughout its surface, making for a very appealing aesthetic. GIGABYTE chose to integrate plastic shields covering the rear panel assembly, the VRM heat sinks, the audio components, and the chipset and SATA ports. With the Intel Z170 chipset, the motherboard supports the latest Intel LGA1151 Skylake processor line as well as dual-channel DDR4 memory. Offered at a premium MSRP of $499, the Z170X-Gaming G1 is priced to appeal to premium enthusiast users.
GIGABYTE over-engineered the Z170X-Gaming G1 to take anything you could think of throwing at it, with a massive 22-phase digital power delivery system featuring 4th-gen IR digital controllers and 3rd-gen IR PowerIRstage ICs, as well as Durable Black solid capacitors rated for 10k operational hours. GIGABYTE integrated the following features into the board: four SATA 3 ports; three SATA Express ports; two M.2 PCIe x4-capable ports; dual Qualcomm® Atheros Killer E2400 NICs; a Killer™ Wireless-AC 1535 802.11ac WiFi controller; four PCI-Express x16 slots; three PCI-Express x1 slots; a 2-digit diagnostic LED display; on-board power, reset, CMOS clear, ECO, and CPU Overclock buttons; Dual-BIOS and active BIOS switches; an audio gain control switch; a Sound Blaster Core 3D audio solution; a removable audio OP-AMP; integrated voltage measurement points; Q-Flash Plus BIOS updating; an integrated HDMI video port; and USB 2.0, 3.0, and 3.1 Type-A and Type-C port support.