
We really want the ASUS PG279Q - 2560x1440, IPS, G-Sync...165 Hz

Subject: Displays | October 9, 2015 - 06:32 PM |
Tagged: asus, ROG, swift, pg279q, gsync, g-sync, ips

Okay, we see a lot of monitors here at PC Perspective...but this is probably the current "most coveted" of them all. The ASUS ROG Swift PG279Q looks nearly identical to the first generation ROG Swift display but with a couple of key modifications. Yes, this is still a 27-in 2560x1440 monitor but this time...oh this time...it holds a 165 Hz IPS screen.


Moving away from the world of TN screens and into the improved image quality of IPS, the PG279Q not only brings ASUS' first G-Sync capable IPS 2560x1440 panel to the world but also ups the ante more than any other screen we have seen when it comes to maximum refresh rate: this beast will top out at 165 Hz! High-performance gamers that have taken to the 144 Hz market will surely see the advantages of stepping up yet again, though I am curious how ASUS is able to drive an IPS screen at this speed without artifacts or issues.


Interestingly, this panel not only includes a DisplayPort connection for 165 Hz 2560x1440 throughput but also an HDMI 1.4a input that can run 2560x1440 at 60 Hz, should you need that kind of thing. If you prefer ULMB over G-Sync, you have that option as well. 


I'm not sure yet, but I can feel Allyn's trigger finger on the BUY NOW button...if it existed. We don't have pricing and we don't have any update on availability, but if our past experiences with the ROG Swift line are any indication, I have a feeling this display is going to impress.

Source: ASUS

Microsoft Surface Book 2-in-1 with Skylake and NVIDIA Discrete GPU Announced

Subject: Mobile | October 6, 2015 - 02:38 PM |
Tagged: video, surface book, surface, Skylake, nvidia, microsoft, Intel, geforce

Along with the announcement of the new Surface Pro 4, Microsoft surprised many with the release of the new Surface Book 2-in-1 convertible laptop. Sharing much of the same DNA as the Surface tablet line, the Surface Book adopts a more traditional notebook design while still adding enough to the formula to produce a unique product.


The pivotal part of the design (no pun intended) is the new hinge, a "dynamic fulcrum" design that looks great and also (supposedly) will be incredibly strong. The screen / tablet attachment mechanism is called Muscle Wire and promises secure attachment as well as ease of release with a single button.

An interesting aspect of the fulcrum design is that, when closed, the Surface Book screen and keyboard do not actually touch near the hinge. Instead you have a small gap in this area. I'm curious how this will play out in real-world usage - it creates a natural angle for using the screen in its tablet form but may also find itself "catching" coins, pens and other things between the two sections.


The 13.5-in screen has a 3000 x 2000 resolution (3:2 aspect ratio obviously) with a 267 PPI pixel density. Just like the Surface Pro 4, it has a 10-point touch capability and uses the exclusive PixelSense display technology for improved image quality.

While most of the hardware is included in the tablet portion of the device, the keyboard dock includes some surprises of its own. You get a set of two USB 3.0 ports, a full size SD card slot and a proprietary SurfaceConnect port for an add-on dock. But most interestingly you'll find an optional discrete GPU from NVIDIA, an as-yet-undisclosed GeForce GPU with 1GB (??) of memory. I have sent inquiries to Microsoft and NVIDIA for details on the GPU, but haven't heard back yet. We think it is a 30 watt GeForce GPU of some kind (judging by the power adapter differences) but I'm more interested in how the GPU changes both battery life and performance.

UPDATE: Just got official word from NVIDIA on the GPU, but unfortunately it doesn't tell us much.

The new GPU is a Maxwell based GPU with GDDR5 memory. It was designed to deliver the best performance in ultra-thin form factors such as the Surface Book keyboard dock. Given its unique implementation and design in the keyboard module, it cannot be compared to a traditional 900M series GPU. Contact Microsoft for performance information.


Keyboard and touchpad performance looks to be impressive as well, with a full glass trackpad integration, backlit keyboard design and "class leading" key switch throw distance.

The Surface Book is powered by Intel Skylake processors, available in both Core i5 and Core i7 options, but does not offer Core m-based parts or Iris graphics; the only integrated GPU on offer is the Intel HD 520.


Microsoft promises "up to" 12 hours of battery life on the Surface Book, though that claim was made with the Core i5 / 256GB / 8GB configuration option; no discrete GPU included. 


Pricing on the Surface Book starts at $1499 but can reach as high as $2699 with the maximum performance and storage capacity options. 

Source: Microsoft
Manufacturer: NVIDIA

GPU Enthusiasts Are Throwing a FET

NVIDIA is rumored to launch Pascal in early (~April-ish) 2016, although some are skeptical that it will even appear before the summer. The design was finalized months ago, and unconfirmed shipping information claims that chips are being stockpiled, which is typical when preparing to launch a product. It is expected to compete against AMD's rumored Arctic Islands architecture, which, according to its own rumored numbers, will be very similar to Pascal.

This architecture is a big one for several reasons.


Image Credit: WCCFTech

First, it will jump two full process nodes. Current desktop GPUs are manufactured at 28nm, which was first introduced with the GeForce GTX 680 all the way back in early 2012, but Pascal will be manufactured on TSMC's 16nm FinFET+ technology. Smaller features have several advantages, but a huge one for GPUs is the ability to fit more complex circuitry in the same die area. This means that you can include more copies of elements, such as shader cores, and do more in fixed-function hardware, like video encode and decode.

That said, we got a lot more life out of 28nm than we really should have. Chips like GM200 and Fiji are huge, relatively power-hungry, and complex -- exactly the kind of design that is a terrible idea to produce when yields are low. I asked Josh Walrath, who is our go-to for analysis of fab processes, and he believes that FinFET+ is probably even more complicated today than 28nm was in the 2012 timeframe, when that process launched for GPUs.

It's two full steps forward from where we started, but we've been tiptoeing since then.


Image Credit: WCCFTech

Second, Pascal will introduce HBM 2.0 to NVIDIA hardware. HBM 1.0 was introduced with AMD's Radeon Fury X, and it helped in numerous ways -- from smaller card size to a triple-digit percentage increase in memory bandwidth. The 980 Ti can talk to its memory at about 300GB/s, while Pascal is rumored to push that to 1TB/s. Capacity won't be sacrificed, either. The top-end card is expected to contain 16GB of global memory, which is twice what any console has. This means less streaming, higher resolution textures, and probably even left-over scratch space for the GPU to generate content in with compute shaders. Also, according to AMD, HBM is an easier architecture to communicate with than GDDR, which should mean a savings in die space that could be used for other things.
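
For a rough sense of where those bandwidth numbers come from, here is a quick back-of-envelope sketch: peak bandwidth is just bus width times per-pin data rate. The per-pin rates below are illustrative assumptions (7 Gb/s is the 980 Ti's GDDR5 rate; ~2 Gb/s and a 4096-bit interface are what the HBM 2.0 rumors imply), not confirmed Pascal specifications.

  # Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gb/s.
  def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
      return bus_width_bits / 8 * data_rate_gbps

  # GTX 980 Ti: 384-bit GDDR5 at 7 Gb/s per pin -> 336 GB/s ("about 300GB/s" above)
  print(peak_bandwidth_gbs(384, 7.0))    # 336.0

  # Rumored Pascal HBM 2.0: 4096-bit interface at ~2 Gb/s per pin -> ~1 TB/s
  print(peak_bandwidth_gbs(4096, 2.0))   # 1024.0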

Third, the architecture includes native support for three levels of floating point precision. Maxwell, due to how limited 28nm was, saved on complexity by reducing 64-bit IEEE 754 floating point performance to 1/32nd the rate of 32-bit numbers, because FP64 values are rarely used in video games. This saved transistors, but was a huge, order-of-magnitude step back from the 1/3rd ratio found on the Kepler-based GK110. While it probably won't be back to the 1/2 ratio that was found in Fermi, Pascal should be much better suited for GPU compute.


Image Credit: WCCFTech

Mixed precision could help video games too, though. Remember how I said it supports three levels? The third one is 16-bit, which is half the width of the format commonly used in video games. Sometimes, that is sufficient. If so, Pascal is said to do these calculations at twice the rate of 32-bit. We'll need to see whether enough games (and other applications) are willing to drop down in precision to justify the die space that these dedicated circuits require, but it should double the performance of anything that does.
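
To put those ratios in perspective, here is a small sketch that scales a hypothetical FP32 rate by the precision ratios mentioned above. The 6 TFLOPS baseline is purely an example figure, not a spec for Pascal or any other GPU.

  # Scale an FP32 rate by a precision ratio (e.g. 1/32 for Maxwell-style FP64).
  def rate_tflops(fp32_tflops, ratio):
      return fp32_tflops * ratio

  fp32 = 6.0  # assumed FP32 baseline in TFLOPS, for illustration only

  print(rate_tflops(fp32, 1 / 32))  # Maxwell-style FP64: ~0.19 TFLOPS
  print(rate_tflops(fp32, 1 / 3))   # GK110-style FP64:   2.0 TFLOPS
  print(rate_tflops(fp32, 2))       # double-rate FP16:   12.0 TFLOPS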

So basically, this generation should provide the massive jump in performance that enthusiasts have been waiting for. Memory bandwidth and the amount of hardware that can be printed into the die are two major bottlenecks for most modern games and GPU-accelerated software, and Pascal pushes on both. We'll need to wait for benchmarks to see how the theoretical maps to the practical, but it's a good sign.

Quick! Win 1 of 20 Star Wars Battlefront Beta keys from Logitech G and LucasArts!

Subject: General Tech | October 6, 2015 - 10:53 PM |
Tagged: logitech g, logitech, gleam, giveaway, contest

Look, time is short, and we want to get you these keys SOON!

Sign up using the form below to enter to win 1 of 20 keys for the PC version of the Star Wars Battlefront beta, on-going RIGHT NOW. I played for a couple of hours today and I have to say the game is looking very impressive - both visually and in terms of fun gameplay.


Our thanks to Logitech G and LucasArts for the keys for our readers!!

SW Battlefront Keys

Report: AMD's Dual-GPU Fiji XT Card Might Be Coming Soon

Subject: Graphics Cards | October 5, 2015 - 02:33 AM |
Tagged: rumor, report, radeon, graphics cards, Gemini, fury x, fiji xt, dual-GPU, amd

The AMD R9 Fury X, Fury, and Nano have all been released, but a dual-GPU Fiji XT card could be on the way soon according to a new report.


Back in June at AMD's E3 event we were shown Project Quantum, AMD's concept for a powerful dual-GPU system in a very small form-factor. It was speculated that the system was actually housing an unreleased dual-GPU graphics card, which would have made sense given the very small size of the system (and the mini-ITX motherboard therein). Now a report from WCCFtech points to a shipping manifest that just might describe this new dual-GPU card, and the code-name is Gemini.


"Gemini is the code-name AMD has previously used in the past for dual GPU variants and surprisingly, the manifest also contains another phrase: ‘Tobermory’. Now this could simply be a reference to the port that the card shipped from...or it could be the actual codename of the card, with Gemini just being the class itself."

The manifest also indicates a cooler for the card from Cooler Master, the maker of the liquid cooling solution for the Fury X. As the Fury X has had its share of criticism for pump whine issues, it would be interesting to see how a dual-GPU cooling solution would fare in that department, though we could be seeing an entirely new generation of the pump as well. Of course speculation on an unreleased product like this could be incorrect, and verifiable hard details aren't available yet. Still, if the dual-GPU card is based on a pair of full Fiji XT cores the specs could be very impressive to say the least:

  • Core: Fiji XT x2
  • Stream Processors: 8192
  • GCN Compute Units: 128
  • ROPs: 128
  • TMUs: 512
  • Memory: 8 GB (4GB per GPU)
  • Memory Interface: 4096-bit x2
  • Memory Bandwidth: 1024 GB/s

In addition to the specifics above, the report also discussed the possibility of 17.2 TFLOPS of performance based on 2x the performance of Fury X, which would make the Gemini product one of the most powerful single-card graphics solutions in the world. The card seems close enough to the final stage that we should expect to hear something official soon, but for now it's fun to speculate - unless of course the speculation concerns a high initial retail price, and unfortunately something at or above $1000 is quite likely. We shall see.
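
For what it's worth, the 17.2 TFLOPS figure is straightforward arithmetic from the rumored shader count. A quick sketch, assuming the dual-GPU card keeps Fury X's 1050 MHz reference clock (which is not confirmed):

  # Single-precision FLOPS = stream processors * 2 ops per clock (FMA) * clock speed.
  def sp_tflops(stream_processors, clock_mhz):
      return stream_processors * 2 * clock_mhz * 1e6 / 1e12

  print(sp_tflops(4096, 1050))      # one Fiji XT:  ~8.6 TFLOPS
  print(sp_tflops(4096, 1050) * 2)  # two Fiji XTs: ~17.2 TFLOPS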

Source: WCCFtech

Another hard quarter for PC sellers

Subject: General Tech | October 9, 2015 - 12:35 PM |
Tagged: pc sales, Q3 2015

A total of 73.7 million units, including desktops, laptops and ultrabooks, were sold in the third quarter of 2015, down 7.7% from this time last year.  In EMEA, Japan and Latin America this could be in part because prices have risen by about 10%, but it is also likely due to a lack of any convincing reason to upgrade.  The recent security problems revealed on Lenovo machines do not seem to have hurt their sales in North America, where they saw a 22% increase in sales with the launch of their various 2-in-1 portable devices.  Gartner feels this may change in the latter half of the year, as many companies do not get out of the red until holiday sales start driving consumers, and because machines shipping with Windows 10 will start to hit the market.  Skylake product refreshes should also help out, and we can all hope to see bargains on older kit that distributors want off their shelves as the numerous holiday sales start to ramp up.  You can follow the links from The Inquirer for more detailed information.


"FIGURES FROM GARTNER show that PC shipments declined a further 7.7 percent year on year during the third quarter of 2015, despite the release of Microsoft's Windows 10 operating system during the period."

Here is some more Tech News from around the web:

Tech Talk


Source: The Inquirer
Manufacturer: PC Perspective

New Components, New Approach


After 20 or so enclosure reviews over the past year and a half and some pretty inconsistent test hardware along the way, I decided to adopt a standardized test bench for all reviews going forward. Makes sense, right? Turns out choosing the best components for a cases and cooling test system was a lot more difficult than I expected going in, as special consideration had to be made for everything from form-factor to noise and heat levels.

Along with the new components I will also be changing the approach to future reviews by expanding the scope of CPU cooler testing. After some debate as to the type of CPU cooler to employ I decided that a better test of an enclosure would be to use both closed-loop liquid and air cooling for every review, and provide thermal and noise results for each. For CPU cooler reviews themselves I'll be adding a "real-world" load result to the charts to offer a more realistic scenario, running a standard desktop application (in this case a video encoder) in addition to the torture-test result using Prime95.

But what about this new build? It isn't completely done but here's a quick look at the components I ended up with so far along with the rationale for each selection.

CPU – Intel Core i5-6600K ($249, Amazon.com)


The introduction of Intel’s 6th generation Skylake processors provided the perfect excuse for an upgrade after using an AMD FX-6300 system for the last couple of enclosure reviews. After toying with the idea of the new i7-6700K, and quickly realizing it was likely overkill and (more importantly) completely unavailable for purchase at the time, I went with the more "reasonable" option in the i5. There has long been a debate as to the need for hyper-threading for gaming (though this may be changing with the introduction of DX12), but in any case this is still a very powerful processor, and when stressed it should produce a challenging enough thermal load to adequately test both CPU coolers and enclosures going forward.

GPU – XFX Double Dissipation Radeon R9 290X ($347, Amazon.com)


This was by far the most difficult selection. I don’t think of my own use when choosing a card for a test system like this, as it must meet a set of criteria to be a good fit for enclosure benchmarks. If I choose a card that runs very cool and with minimal noise, GPU benchmarks will be far less significant as the card won’t adequately challenge the design and thermal characteristics of the enclosure. There are certainly options that run at greater temperatures and higher noise (a reference R9 290X for example), but I didn’t want a blower-style cooler with the GPU. Why? More and more GPUs are released with some sort of large multi-fan design rather than a blower, and for enclosure testing I want to know how the case handles the extra warm air.

Noise was an important consideration, as levels from an enclosure of course vary based on the installed components. A GPU cooler with very low output at idle (or zero, as some recent cooler designs permit) allows system idle measurements to reflect case fans and airflow rather than a GPU that might drown them out. (This would also allow a better benchmark of CPU cooler noise - particularly with self-contained liquid coolers and audible pump noise.) And while I wanted very quiet performance at idle, at load there must be sufficient noise to measure the performance of the enclosure in this regard, though of course nothing will truly tax a design quite like a loud blower. I hope I've found a good balance here.

Continue reading our look at the cases and cooling test system build!

Thermaltake Releases Core P5 Wall-Mountable Case

Subject: Cases and Cooling | October 5, 2015 - 09:01 AM |
Tagged: wall mount, thermaltake

The “case” is basically a plate with a clear acrylic pane in front of it. It can stand upright, be rotated horizontally, or even screwed into a wall if you want to show off a custom liquid cooling loop or something. Personally, I would like to see at least an option for plexiglass on the perimeter; I feel like some might want a bit of protection from things like sneezes, or rogue squirt-gun blasts.


Interestingly, Thermaltake is providing “3D Printing Accessory Files”. I somehow doubt that these will be the CAD files required to laser-cut your own Core P5 case, but they're designed to allow makers to create their own accessories for it. As such, this sounds more like guides and schematics, but I cannot say for sure because I haven't tried it... and they're not available yet.

The Thermaltake Core P5 will be available soon for an MSRP of $169.99, although it's already at a sale price of $149.99. This could be just a pre-order discount, or a sign of its typical price point. We don't know.

Source: Thermaltake

Who Decided to Call a Lightweight API "Metal"?

Subject: Graphics Cards | October 7, 2015 - 07:01 AM |
Tagged: opengl, metal, apple

Ars Technica took it upon themselves to benchmark Metal in the latest OSX El Capitan release. Even though OpenGL on Mac OSX is not considered to be on par with its Linux counterpart, probably due to the driver situation until recently, it still pulls ahead of Metal in many situations.


Image Credit: Ars Technica

Unlike the other graphics APIs, Metal uses the traditional binding model. Basically, you have a GPU object that you attach your data to, then call one of a handful of “draw” functions to signal the driver. DirectX 12, Vulkan, and Mantle, on the other hand, treat work like commands on queues. The latter model works better in multi-core environments, and it aligns with GPU compute APIs, but the former is easier to port OpenGL and DirectX 11 applications to.

Ars Technica notes that faster GPUs, such as the NVIDIA GeForce GTX 680MX, show higher gains than slower ones. Their “best explanation” is that “faster GPUs can offload more work from the CPU”. That is pretty much true, yes. The new APIs are designed to keep GPUs loaded and working as much as possible, because they really do sit around doing nothing a lot. If a GPU is already fully loaded, because it can't accept much work in the first place, then there is little benefit to decreasing CPU load or spreading submission across multiple cores.

Granted, there are many ways that benchmarks like these could be used incorrectly. I'll assume that Ars Technica and GFXBench are not making any simple mistakes, but it's good to be critical just in case.

Source: Ars Technica

Centon drops SandForce in favour of Phison

Subject: Storage | October 6, 2015 - 07:22 PM |
Tagged: Phison PS3110-S10, centon, C-380

The last time we heard from Centon they were using the SandForce 2281 SSD controller, which they have now dropped in favour of a Phison controller in their new C-380 series of SSDs.  Benchmark Reviews recently tested their 480GB model, which uses MLC NAND and sports a 4Gb (512MB) DDR3-1600 cache.  The benchmark results were quite varied; sometimes the drive came in at the top of the pack, yet other times it was well below average, especially when writing to the drive.  There is a 1 year warranty on the drive and currently it is on sale at $219 for the 480GB model, down from the list price of $399.99 ... perhaps not a drive to recommend to your friends.


"Centon isn’t a name many enthusiasts will know. I’d never heard of the company myself until this review sample; apparently, they’ve been in business for over 35 years manufacturing DRAM and flash memory products, and have only recently entered the consumer marketplace. The Centon C-380 480GB SSD SATA-III Solid State Drive, part of the “Enthusiast Solutions” series, is the focus of what Benchmark Reviews will be putting through our test suite."

Here are some more Storage reviews from around the web:


ASUS Announces ROG Maximus VIII Impact Mini-ITX Z170 Motherboard

Subject: Motherboards | October 9, 2015 - 06:00 PM |
Tagged: Z170, Skylake, SFF, ROG, motherboard, mini-itx, Maximus VIII Impact, lga1151, asus

ASUS has announced their latest mini-ITX offering in the Republic of Gamers series, and the Maximus VIII Impact motherboard packs an outrageous number of features into one formidable little 6.7-inch square. In fact, short of the second PCIe slot afforded by the larger mATX form-factor, the newest Impact board looks to be every bit as powerful as the recently released Maximus VIII Gene motherboard.


Let’s check out the specs on this new Impact board:

  • CPU: LGA1151 socket for 6th Generation Intel Core i7/i5/i3/Pentium/Celeron processors
  • Chipset: Intel Z170 Express
  • Memory: Dual-channel memory architecture
    • 2x DIMM, max. 32GB DDR4-4133(OC) non-ECC, un-buffered memory
  • PCIe Slot: 1x PCIe 3.0 x16 slot (supports x16 mode)
  • Graphics: Integrated Intel HD Graphics Processor
    • HDMI 1.4b
    • Intel InTru 3D/Quick Sync Video/Clear Video HD Technology/Insider
  • Wi-Fi: 802.11a/b/g/n/ac supports dual frequency band 2.4/5GHz; MU-MIMO
  • Bluetooth: V4.1, 4.0LE
  • USB: 2x USB 3.1 ports (1 Type-A and 1 Type-C) powered by Intel USB 3.1 controller; 6x USB 3.0 ports (2 at mid-board)
  • Storage: 1x U.2 port (PCIe x4, 32Gb/s), 4x SATA 6Gb/s ports. Supports Intel Smart Response Technology
  • LAN: Intel® I219-V Gigabit LAN with Anti-surge LANGuard, ROG GameFirst Technology
  • HD Audio: SupremeFX Impact III
    • ROG SupremeFX 2015 High Definition Audio Codec
    • ESS® ES9023P DAC with Hyperstream™ Architecture
    • 2Vrms Headphone Amp into 32-600 Ohms
    • SupremeFX Shielding Technology
    • Optical S/PDIF output at back panel
    • Sonic Studio II; Sonic Radar II; Sonic SenseAmp; DTS Connect
  • Fan headers: 2x 4-pin onboard; 3x 4-pin on daughter card
  • Form Factor: Mini-ITX, 6.7" x 6.7" (17 cm x 17 cm)


Pricing and availability have not yet been announced.

Source: ASUS
Subject: Editorial, Storage
Manufacturer: PC Perspective

What you never knew you didn't know

While researching a few upcoming SD / microSD product reviews here at PC Perspective, I quickly found myself swimming in a sea of ratings and specifications. This write up was initially meant to explain and clarify these items, but it quickly grew into a reference too large to include in every SD card article, so I have spun it off here as a standalone reference. We hope it is as useful to you as it will be to our upcoming SD card reviews.

SD card speed ratings are a bit of a mess, so I'm going to do my best to clear things up here. I'll start with classes and grades. These are specs that define the *minimum* speed a given SD card should meet when reading or writing (both directions are used for the test). As with all flash devices, the write speed tends to be the more limiting factor. Without getting into gory detail, the tests assume mostly sequential large writes and random reads occurring at no smaller than the minimum memory unit of the card (typically 512KB). This matches the typical use case of an SD card, which is mostly writing larger files (or sequential video streams), with minimal small writes (file table updates, etc).

Speed Class


In the above chart, we see speed 'Class' 2, 4, 6, and 10. The SD card spec calls out very specific requirements for these ratings, but the gist of it is that an unfragmented SD card will be able to write at a minimum MB/s corresponding to its rated class (e.g. Class 6 = 6 MB/s minimum transfer speed). The workload specified is meant to represent a typical media device writing to an SD card, with buffering to account for slower FAT table updates (small writes). With higher bus speed modes (more on that later), we also get higher classes. Older cards that are not rated under this spec are referred to as 'Class 0'.

Speed Grade

As we move higher than Class 10, we get to U1 and U3, which are referred to as UHS Speed Grades (contrary to the above table, which states 'Class') in the SD card specification. The changeover from Class to Grade has something to do with speed modes, which also relate to the standard capacity of the card being used:


U1 and U3 correspond to 10 and 30 MB/s minimums, but the test conditions are slightly different for these specs (so Class 10 is not *exactly* the same as a U1 rating, even though they both equate to 10 MB/sec). Cards not performing to U1 are classified as 'Speed Grade 0'. One final note here is that a U rating also implies a UHS speed mode (see the next section).
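
Put side by side, the two rating systems boil down to a simple minimum-write-speed lookup. A simplified sketch (remember that the actual test conditions in the SD specification differ between Classes and Grades):

  # Minimum sequential write speeds implied by the ratings described above.
  SPEED_CLASS_MIN_MBS = {2: 2, 4: 4, 6: 6, 10: 10}   # Speed Class -> MB/s
  UHS_GRADE_MIN_MBS = {"U1": 10, "U3": 30}           # UHS Speed Grade -> MB/s

  def minimum_write_mbs(rating):
      """Return the minimum guaranteed write speed for a Class number or UHS grade."""
      if isinstance(rating, int):
          return SPEED_CLASS_MIN_MBS.get(rating, 0)  # unrated cards are 'Class 0'
      return UHS_GRADE_MIN_MBS.get(rating, 0)        # unrated cards are 'Grade 0'

  print(minimum_write_mbs(10))    # 10
  print(minimum_write_mbs("U3"))  # 30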

Read on as we decrypt all of the many specs and ratings present on SD and microSD cards!

4K performance when you can spend at least $1.3K

Subject: Graphics Cards | October 6, 2015 - 02:40 PM |
Tagged: 4k, gtx titan x, fury x, GTX 980 Ti, crossfire, sli

[H]ard|OCP shows off just what you can achieve when you spend over $1000 on graphics cards and have a 4K monitor in their latest review.  In Project Cars you can expect never to see less than 40fps with everything cranked to maximum, and if you invested in Titan Xs you can even enable DS2X anti-aliasing, which renders at double the resolution before downsampling.  Witcher 3 is a bit more challenging and no card is up for HairWorks without a noticeable hit to performance.  Far Cry 4 still refuses to believe in CrossFire, and as far as NVIDIA performance goes, if you want to see soft shadows you are going to have to invest in a pair of Titan Xs.  Check out the full review to see what the best of the current market is capable of.


"The ultimate 4K battle is about to begin, AMD Radeon R9 Fury X CrossFire, NVIDIA GeForce GTX 980 Ti SLI, and NVIDIA GeForce GTX TITAN X SLI will compete for the best gameplay experience at 4K resolution. Find out what $1300 to $2000 worth of GPU backbone will buy you. And find out if Fiji really can 4K."

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

NVIDIA Releases 358.50 WHQL Game Ready Drivers

Subject: Graphics Cards | October 7, 2015 - 01:45 PM |
Tagged: opengl es 3.2, nvidia, graphics drivers, geforce

The GeForce Game Ready 358.50 WHQL driver has been released so users can perform their updates before the Star Wars Battlefront beta goes live tomorrow (unless you already received a key). As with every “Game Ready” driver, NVIDIA ensures that the essential performance and stability tweaks are rolled into this version, and tests it against the title. It is WHQL certified too, which is a recent priority for NVIDIA. Years ago, “Game Ready” drivers were often classified as Beta, but the company now intends to pass their work through Microsoft for a final sniff test.


Another interesting addition to this driver is the inclusion of the OpenGL 2015 ARB extensions and OpenGL ES 3.2. Previously, using OpenGL ES 3.2 on the PC, to develop software against it for instance, required a separate driver that NVIDIA published when the spec was announced at SIGGRAPH; it has now been rolled into the main, public driver. The mobile devs who use their production machines to play Battlefront rejoice, I guess. It might also be useful if developers, for instance at Mozilla or Google, want to create pre-release implementations of future WebGL specs.

Source: NVIDIA

Dell Releases Redesigned XPS 15 Laptop with InfinityEdge Display

Subject: Systems, Mobile | October 8, 2015 - 10:05 AM |
Tagged: dell, XPS 15, InfinityEdge, laptop, notebook, Skylake, i3-6100H, i5-6300HQ, i7-6700HQ, GTX 960M

The redesigned Dell XPS 15 is here, now a larger take on the popular XPS 13 featuring the same nearly borderless “InfinityEdge” display and an optional 4K resolution.


Image credit: Engadget

The XPS 13 is among the highest-rated Windows laptops of the past year, and the preferred notebook of our own Ryan Shrout. Dell certainly had a big design win with a 13-inch screen on a laptop that would normally only house an 11.6-inch display, thanks to the razor-thin bezel surrounding the LCD panel. This InfinityEdge display makes a lot of sense for the larger XPS 15, and the newly redesigned notebook now occupies the space of a mere 14-inch notebook, while offering both FHD and UHD/4K screen resolutions.

What good would a beautiful screen be without the horsepower to drive it? For this Dell has implemented the latest 6th Generation Intel Skylake mobile processors, namely the Core i3-6100H, Core i5-6300HQ, and Core i7-6700HQ. Graphics duties are performed either by the integrated Intel HD 530 or an NVIDIA GTX 960M GPU, and 8GB of DDR4 memory comes standard with options up to 32GB available (and this is SoDIMM memory so users can upgrade later as well).


Image credit: Windows Central


  • Processor:
    • 6th Gen Intel Core i3-6100H (3M Cache, up to 2.7 GHz)
    • 6th Gen Intel Core i5-6300HQ Quad-Core (6M Cache, up to 3.2 GHz)
    • 6th Gen Intel Core i7-6700HQ Quad-Core (6M Cache, up to 3.5 GHz)
  • Display: 15.6" FHD (1920x1080) InfinityEdge display or 15.6" UltraSharp 4K Ultra HD (3840x2160) InfinityEdge touch display
  • RAM: 8GB, 16GB or 32GB DDR4 at 2133 MHz (32GB post-RTS) (2 x SoDIMMs)
  • Graphics: Intel HD Graphics 530; NVIDIA GeForce GTX 960M 2GB GDDR5
  • Storage: 500GB HDD + 32GB Flash, 1TB HDD + 32GB Flash, 256GB PCIe SSD, 512GB PCIe SSD, or 1TB PCIe SSD
  • Camera: Widescreen HD (720p) webcam
  • Ports and Connectors: HDMI, USB 3.0 (x2), Headset Jack, SD card reader, Kensington Lock slot, Thunderbolt 3
  • Dimensions: 11-17mm x 357mm x 235mm
  • Weight: Non-touch, starting at 3.9 lbs; Touch, starting at 4.4 lbs

The new Dell XPS 15 is available today and prices start at $999.

MSI Releases GK-701 Mechanical Gaming Keyboard

Subject: General Tech | October 7, 2015 - 09:45 AM |
Tagged: msi, GK-701, gaming keyboard, cherry mx brown

MSI has a new mechanical gaming keyboard available, and the GK-701 features MSI’s black and red "Dragon" styling, red LED backlighting for each key, and Cherry MX Brown switches.


MSI is emphasizing the quality of their build with this new keyboard, stating that each key “is created with precision laser etching for extra resistance to wear and tear”, and the red LED backlight for each key is rated for “over 50 million key presses”. Additionally, the GK-701 offers a braided USB cable with an 18K gold-plated connector, and there is a set of multimedia hotkeys and a game mode that disables the Windows key. As this is a mechanical keyboard, one of the biggest aspects is of course key switch selection, and the Cherry MX Brown switches MSI has chosen for the GK-701 offer a tactile “non-clicky” feel that some prefer.

GK-701 Mechanical Gaming Keyboard specs from MSI:

  • Cherry MX Brown switches
  • Red LED Backlight
  • Windows Key Lock
  • N-Key Rollover
  • Multimedia Hotkeys
  • Anti-slip Rubber Feet
  • Ergonomic Design
  • USB 2.0 connection
  • Braided wire and gold-plated connector
  • Switches lifetime: 50 Million Clicks
  • Dimensions: 450 x 165 x 38mm, 1200g weight


The MSI GK-701 Mechanical Gaming Keyboard is available now and currently selling on Newegg.com for $119.99.

Source: MSI

On-die watercooling

Subject: General Tech | October 7, 2015 - 01:06 PM |
Tagged: watercooling, nifty

These researchers are skipping the waterblock altogether and have made channels in the surface of the die itself for de-ionized water to flow through and cool the chip.  The 28-nanometer Altera FPGA they tested this cooling method on had numerous channels cut into it, which were then sealed up with a layer of silicon.  With a flow rate of 147 ml/minute they kept the chip to a comfortable 24C, a mere 4C higher than the temperature of the water and significantly lower than the 60C the chip would run at using air cooling.  Neither Hack a Day nor PCPer encourage you to try to cut micron-sized channels in your brand new processor; however, we all hope to see this cooling technique incorporated into heatspreaders in future generations of processors.
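
As a quick back-of-envelope check on those numbers, the quoted flow rate and 4C rise imply the water is carrying away roughly 40 W, assuming essentially all of the chip's heat ends up in the coolant and treating the water as roughly 1 g/ml:

  # Heat carried away by the water = mass flow * specific heat * temperature rise.
  flow_ml_per_min = 147
  delta_t_c = 4          # 24C die vs ~20C inlet water
  c_p_water = 4.186      # J/(g*K)

  mass_flow_g_per_s = flow_ml_per_min / 60           # ~2.45 g/s
  watts = mass_flow_g_per_s * c_p_water * delta_t_c  # ~41 W dissipated

  print(round(watts, 1))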


"Researchers at Georgia Tech have been working on cutting fluid channels directly into the back of commercial silicon die (an Altera FPGA, to be exact). The tiny channels measure about 100 micron and are resealed with another layer of silicon. Water is pumped into the channels to cool the device efficiently."

Here is some more Tech News from around the web:

Tech Talk

Source: Hack a Day

Steam "Store Within a Store" at GameStop, GAME UK, and EB

Subject: Systems | October 5, 2015 - 07:39 PM |
Tagged: valve, steam os, steam machines, steam, pc gaming

According to SteamDB, Valve has struck deals with GameStop, GAME UK, and EB Canada to create “store within a store” areas in North American and UK locations. The article does not clarify how many stores will receive this treatment. It does note that the Steam Controller, Steam Link, and even Steam Machines will be sold from these outlets, which will give physical presence to Valve's console platform alongside the existing ones.


The thing about Valve is that, when they go silent, you can't tell whether they reconsidered their position or they are just waiting for the right time to announce. They have been fairly vocal about Steam accessories, but on the machines themselves there has been pretty much radio silence for the better part of a year. There was basically nothing at CES 2015 after a big push in the prior year. The talk shifted to Steam Link, which was obviously part of their original intention but, due to the simultaneous lack of Steam Machine promotion, feels more like a replacement than an addition.

But, as said, that's tricky logic to use with Valve.

As a final note, I am curious about what the transaction entailed. From what I hear, purchasing retail space is pricey and difficult, but some retailers donate space for certain products and initiatives that they find intrinsic value in. Valve probably has a lot of money, but they don't have Microsoft levels of cash. Whether Valve paid for the space, or the retailers donated it, is a question that leads to two very different, but both very interesting, follow-ups. Hopefully we'll learn more, but we probably won't.

Source: SteamDB

StarCraft II v3.0 Gets Another New UI

Subject: General Tech | October 3, 2015 - 11:04 PM |
Tagged: Starcraft II, legacy of the void, blizzard

Third time's the charm, unless they plan another release at some point.

The StarCraft II interface isn't perfect. Even though it is interesting and visually appealing, some tasks are unnecessarily difficult and space is not used in the most efficient way. To see what I mean, try to revert the multiplayer mode to Wings of Liberty, or, worse, find your Character Code. Blizzard released a new UI with Heart of the Swarm back in 2013, and they're doing a new one for the release of Legacy of the Void on November 10th. Note that my two examples probably won't be fixed in this update; they are just examples of UX issues.

While the update aligns with the new expansion, Blizzard will patch the UI for all content levels, including the free Starter Edition. This honestly makes sense, because it's easier to patch a title when all variations share a common core. Then again, not every company patches five-year-old titles like Blizzard does, so the back-catalog support is appreciated.


The most heartwarming change for fans, if pointless otherwise, is in the campaign selection screen. As the StarCraft II trilogy will be completed with Legacy of the Void, the interface aligns them as three episodes in the same style as the original StarCraft did.

On the functional side, the interface has been made more compact (which I alluded to earlier). This was driven by the new chat design, which is bigger yet less disruptive than it was in Heart of the Swarm. The column of buttons on the side is now a top bar, which expands down for sub-menu items.


While there are several things that I don't mention, a final note for this post is that Arcade will now focus on open lobbies. Players can look for the specific game they want, but the initial screen will show lobbies that are waiting to fill. The hope seems to be that players will spend less time waiting for a game to start. This raises two questions. First, Arcade games tend to have a steep learning curve, so I wonder if use of this feature will slump off after people try a few rounds and realize that they should stick with a handful of games. Second, I wonder what this means for player numbers in general -- this sounds like a feature that is added during player declines, which Blizzard seems to hint is not occurring.

I'm not sure when the update will land, but it will probably be around the launch of Legacy of the Void on November 10th.

Source: Blizzard

Ars Technica Reviews Android 6.0 (Marshmallow)

Subject: Mobile | October 6, 2015 - 07:01 AM |
Tagged: google, android 6.0, Android

Android 6.0 was launched yesterday, and Ars Technica has, so far, been the only outlet to give it a formal review. That said, it is a twelve-page review with a table of contents -- so that totally counts for five or so.


The main complaint that the reviewer has is the operating system's inability to be directly updated. There is a large chain of rubber stamps between Google's engineers and the world at large. Carriers and phone manufacturers can delay (or not even attempt to certify) patches for their many handsets. It is not like Windows, where Microsoft controls the centralized update service. In the beginning, this wasn't too big of an issue as updates were typically for features. Sucker, buy a new phone if you want WebGL.

Now it's about security. Granted, it has always been about security, even on the iPhone; we just care more now. If you think about it, every time a phone gets jailbroken, a method exists to steal admin privileges away from Apple and give them to... the user. Some were fairly sophisticated processes involving USB tethering to PCs, while others involved browsing to a malicious website with a payload that the user (but not Apple) wanted to install. Hence why no one cared: the security was being exploited by the user, for the user. It was only a matter of time before either the companies sufficiently crushed the bugs, or it started to look tasty to the wolves.

And Google is getting bit.

Otherwise, Ars Technica mostly praised the OS. Be sure to read their review to get a full sense of their opinion. As far as I can tell, they only tested it on the Nexus 5.

Source: Ars Technica