Windows 10 S ... the S could stand for secure

Subject: General Tech | June 9, 2017 - 02:29 PM |
Tagged: Windows 10 S, security

Microsoft recently pointed out that their new lite version of Windows 10 for students, Windows 10 S, is immune to all currently known ransomware.  This does make sense: the OS is simply unable to install anything that is not from the Windows Store, which does not knowingly host malware, even if some of the available programs are not entirely useful.  That security will last only as long as no one figures out a way to fake the file validation and the connection to Microsoft's online store, or manages to get a malware-infected app approved for sale on the store.  Apple's experience proves that is not an impossibility.  Pop by Slashdot for more.

You could also choose to go with the OS of choice for financial institutions and various other industries, Windows XP Embedded with the Enhanced Write Filter.  Generally secure, and it can be reset with a simple reboot ... in most cases.

windows-apps-topic.png

"However, if you want to guarantee your safety from ransomware, then Microsoft points out there's an even more secure option to consider -- Windows 10 S. The new, hardened Windows 10 variant only runs apps from the Windows Store, which means it can't run programs from outside Microsoft's ecosystem, and that includes malware. Which is why, as Microsoft says, "No known ransomware works against Windows 10 S."

Here is some more Tech News from around the web:

Tech Talk

Source: Slashdot

Logitech's G433 7.1 Gaming Headset: Stylish Looks and Pro-G Drivers for $99

Subject: General Tech | June 9, 2017 - 02:23 PM |
Tagged: wired, surround, Pro-G, logitech, headset, headphones, gaming, G433, DTS Headphone:X, drivers, 7.1

Logitech has released their latest surround gaming headphones with the wired G433 Gaming Headset, a 7.1-channel (via DTS Headphone:X) model that is the latest to use the company's Pro-G drivers.

Logitech G433_Red.jpg

The style of the new G433 is quite eye-catching, with four colors (black, red, blue, and blue camo) of a unique fabric finish that Logitech says is hydrophobic (repels water) for enhanced durability. The G433 primarily functions as an analog headphone (with a 3.5 mm plug) unless an included USB DAC/headphone amp is used, giving PC users access to DTS Headphone:X surround up to 7.1 channels and customizable EQ via Logitech's Gaming Software. The microphone is a removable boom style with noise reduction to help improve voice clarity, and Logitech has used a 5-element double-grounded cable to eliminate crosstalk and prevent game audio from bleeding into voice.

g433_colors.jpg

The G433 arrives with an MSRP of $99, making the headset the least expensive Pro-G option to date, but this comparatively low price tag still provides the buyer with a complete accessory pack including the USB DAC, alternate ear pads, two 3.5 mm audio cables (one with an inline mic), a 3.5 mm audio/mic Y-cable, and a fabric storage bag.

Logitech G433_Blue.jpg

The Logitech G433 is available now, and with a pair on hand we will have a full review up very soon!

Source: Logitech

Samsung Announces FreeSync 2 HDR Displays, includes C49HG90 49-in UltraWide!

Subject: Displays | June 9, 2017 - 11:24 AM |
Tagged: Samsung, hdr, freesync 2, freesync, CHG90, CHG70, amd

Samsung made a surprise announcement this morning, taking the wraps off of the first FreeSync 2 monitors to grace our pages, officially. These gaming displays come in three different sizes, one of them incredibly unique, and all with HDR support and Quantum Dot technology to go along with the variable refresh rate technology of FreeSync.

All three displays utilize the QLED Quantum Dot tech first showcased in the QLED TV lineup launched just this past January at CES. It uses a new metal core and has some impressive color quality capabilities, covering 125% of the sRGB color space and 95% of the DCI-P3 color space! I don't yet know what the peak luminance is, or how many backlight zones there might be for HDR support, but I have asked Samsung for clarification and will update here when I get feedback. All three displays use VA panels.

All three displays also become the first to pass certification with AMD for FreeSync 2, which we initially detailed WAY BACK in January of this year. FreeSync 2 should tell us that this display meets some minimum standards for latency, color quality, and low frame rate compensation. These are all great on paper, though I am still looking for details from AMD on what exactly the minimum standards have been set to. At the time, AMD would only tell me that FreeSync 2 displays "will require a doubling of the perceivable brightness and doubling of the viewable color volume based on the sRGB standards."

C49HG90_006_L-Perspective_Black2.jpg

The bad boy of the group, the Samsung CHG90 (part number C49HG90), is easily the most interesting. It comes in with a staggering screen size of 49-inches and a brand new 32:9 aspect ratio with an 1800R curvature. With a 3840x1080 resolution, I am eager to see this display in person and judge how the ultra-wide design impacts our gaming and our productivity capability. (They call this resolution DFHD, for double full HD.) The refresh rate peaks at 144 Hz and a four-channel scanner is in place to minimize any motion blur or ghosting. A 1ms rated response time also makes this monitor incredibly impressive, on paper. Price for the C49HG90 is set at $1499 with preorders starting today on Amazon.com. (Amazon lists a June 30th release date, but I am looking to clarify.)

Also on the docket is the CHG70, available in two sizes, a 32-in (C32HG70) and a 27-in (C27HG70) model. Both are 2560x1440 resolution screens with 16:9 aspect ratios, 1ms response times, and FreeSync 2 integration. That means the same 125% sRGB and 95% DCI-P3 color space support along with the Samsung Quantum Dot technology. Both will sport a 144 Hz refresh rate and an 1800R curvature. The specifications are essentially identical across all three models, making selection a simpler choice based on price segment and screen real estate. The C27HG70 will be on preorder from Samsung.com exclusively for $599 while the C32HG70 will be on preorder at Newegg.com for $699, just $100 more.

All three displays will feature a Game Mode to optimize image settings for...gaming.

Samsung’s CHG90 extends the playing field for virtual competitors, with its 49-inch design representing the widest gaming monitor available. The monitor delivers a dramatic 1,800R curvature and an ultra-wide 178-degree viewing angle, ensuring that content is clearly visible from nearly any location within a given space. As a result, gamers no longer need to worry about the logistics, expenses, and bezel interference that occur when combining multiple smaller monitors together for an expanded view.

The new CHG90 monitor includes a height adjustable stand (HAS), allowing flexible height adjustment for improved viewing comfort. Designed for the most demanding games, the CHG70 monitor goes a step further with a dual-hinge stand that provides users more precise control over how the display panel is positioned.

In addition to Game Mode, a feature that optimizes image settings for playing games when connected to a PC or a game console, each of the new monitors includes a game OSD dashboard, designed to blend seamlessly into game interfaces.

A full table of specifications is below and trust me on this one guys, I am already up in Samsung's and AMD's face to get these monitors in here for review!

monitor-spec.png

Now all we are missing is the power of a Radeon RX Vega card to push this high resolution, high refresh rate HDR goodness!!

Source: Samsung

Radeon Software Crimson ReLive Edition 17.6.1 - Prey and DiRT 4

Subject: Graphics Cards | June 8, 2017 - 05:26 PM |
Tagged: radeon, Crimson Edition 17.6.1, amd

In the very near future AMD will be releasing an updated driver, focused on improving performance in Prey and DiRT 4. 

index.jpg

For DiRT 4 it will enable a Multi-GPU profile and deliver up to a 30% performance improvement when using 8xMSAA on a Radeon RX 580 8GB, compared to the previous release.

Source: AMD

AI to the rescue? Microsoft assimilates the security company Hexadite

Subject: General Tech | June 8, 2017 - 12:42 PM |
Tagged: microsoft, hexadite, windows defender, security

If you have never heard of Hexadite you are not alone; the online security company was formed in 2014, headquartered in Boston but based in Tel-Aviv.  It was just purchased by Microsoft for around $100 million so that Hexadite's Automated Incident Response Solution can be integrated into Windows Defender Advanced Threat Protection.  AIRS is not antivirus software; instead it is a tool that integrates with existing software and monitors for alerts.  Once an alert is detected the tool automatically investigates it and searches for solutions, in theory saving your security team's sanity by vastly reducing the number of alerts they must deal with directly.  It will be interesting to see if this changes how companies and users perceive the effectiveness of Windows Defender.

More over at The Inquirer.

Capture.PNG

"Hexadite's technology and talent will augment our existing capabilities and enable our ability to add new tools and services to Microsoft's robust enterprise security offerings."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Inquirer

Podcast #453 - More Computex, WWDC, 3D Xpoint, and more

Subject: General Tech | June 8, 2017 - 11:22 AM |
Tagged: X399, x370, x299, wwdc, video, shield, podcast, plex, pixel, macbook, Mac Pro, Logitech G413, Lian-Li, gigabyte, computex, asus, asrock, apollo lake, 3D XPoint

PC Perspective Podcast #453 - 06/07/17

Join us for talk about continued Computex 2017 coverage, WWDC '17, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, Allyn Malventano

Peanut Gallery: Alex Lustenberg, Ken Addison

Program length: 1:33:54
 
Podcast topics of discussion:
  1. Week in Review:
  2. Computex Continued
  3. WWDC 2017:
  4. News items of interest:
  5. Hardware/Software Picks of the Week
  6. Closing/outro
 

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

Source:

IBM Announces 5nm Breakthrough with Silicon Nanosheet Technology

Subject: General Tech | June 7, 2017 - 09:31 PM |
Tagged: silicon nanosheet, Samsung, IBM, GLOBALFOUNDRIES, FinFET, 5nm

It seems like only yesterday that we saw Intel introduce their 22nm FinFET technology, and now we are going all the way down to 5nm.  The "only yesterday" part is obviously an exaggeration; the march of process technology has been more than a little challenging for everyone in the industry over the past 5+ years.  Intel has made it look a little easier by being able to finance these advances a little better than the pure-play foundries, but that does not mean they have not experienced challenges of their own.

We have seen some breakthroughs these past years, with everyone jumping onto FinFETs as TSMC, Samsung, and GLOBALFOUNDRIES introduced their own processes.  GLOBALFOUNDRIES initially set out on its own, but that particular endeavor did not pan out.  They ended up licensing Samsung’s 14nm processes (LPE and LPP) to start producing chips of their own, primarily for AMD's graphics chips and the latest generation of Ryzen CPUs.

NicolasLoubet-1440x960.jpg

These advances have not been easy.  FinFETs are needed at these smaller nodes to provide the required performance and power efficiency at such transistor densities, but the technology will not last forever.  10nm and 7nm lines will continue to use them, but many believe that while densities will improve, the power characteristics will start to lag behind.  The theory is that past the 7nm nodes traditional FinFETs will no longer work as desired.  This is very reminiscent of the sub-28nm processes that attempted to use planar structures on bulk silicon.  In that case the chips could be made, but power issues plagued the designs and eventually support for those process lines was dropped.

IBM and their research partners Samsung and GLOBALFOUNDRIES, working at the SUNY Polytechnic Institute Colleges of Nanoscale Science and Engineering’s NanoTech Complex in Albany, NY, have announced a breakthrough: a new “Gate-All-Around” architecture made on a 5nm process.  A FinFET is essentially a rectangular channel surrounded on three sides by the gate, giving it its “fin” physical characteristics.  This new technology covers the fourth side as well and embeds the channels in nanosheets of silicon.

The problem with FinFETs is that they will eventually be unable to scale power as transistors get closer and closer together.  While density scales, power and performance will get worse compared to previous nodes.  The 5nm silicon nanosheet technology gives a significant boost to power and efficiency, thereby doing to FinFETs what FinFETs did to planar structures at the 20/22nm nodes.

ibm-suny-asml-euv-machine-1440x812.jpg

One of the working EUV litho machines at SUNY Albany.

IBM asserts that a chip the size of a fingernail can contain up to 30 billion transistors while continuing to see the density, power, and efficiency improvements that we would expect from a normal process shrink.  The company expects these process nodes to start rolling out in a 2019 time frame if all goes as planned.

There are few details on how IBM was able to achieve this result, but we do know a couple of things.  EUV lithography was used extensively to avoid the multi-patterning nightmare this would otherwise entail.  For the past two years ASML has been installing 100 watt EUV litho machines for select clients throughout the world, and one of these is located on the SUNY Albany campus where this research was done.  We also know that deposition was done layer by layer with silicon and the other materials.

What we don’t know is how long it takes to create a complete wafer.  Usually these test wafers are packed full of SRAM and very little logic; it is a useful test and creates a baseline for many structures that will eventually be applied to this process.  Considering how the layers appear to be deposited, though, it likely takes a long, long time with current tools and machinery.  Cutting edge wafers in production can take upwards of 16 weeks to complete, and I hesitate to even guess how long each test wafer takes.  Because of the very 3D nature of the design, I am also curious how the litho stages work and how many passes are still needed to complete the design.

This looks to be a very significant advancement in process technology, and one that should reach mass production in the timeline suggested by IBM.  It is a significant jump, but it seems to borrow a lot from previous FinFET structures.  It does not encompass anything exotic like “quantum wells”, but it is able to go lower than the currently specified 7nm processes that TSMC, Samsung, and Intel have hinted at (and yes, process node names should be taken with a grain of salt from all parties at this time).  IBM does appear to be comparing this to what Samsung calls its 7nm process in terms of dimensions and transistor density.

Nanosheet-5nm-for-release-1.jpg

Cross section of a 5nm transistor showing the embedded channels and silicon nanosheets.

While Moore’s Law has been stretched thin as of late, we are still seeing scientists and engineers pushing against the laws of physics to achieve better performance and scaling at incredibly small dimensions.  The silicon nanosheet technology looks to be an effective and relatively affordable path towards smaller sizes that does not require exotic materials.  IBM and its partners look to have produced a process node that will continue the march towards smaller, more efficient, and more powerful devices.  It is not exactly around the corner, but 2019 is close enough to start planning designs that could potentially utilize this node.

Source: IBM

Qt Outlines What Their View on Vulkan Support Is

Subject: General Tech | June 7, 2017 - 09:10 PM |
Tagged: Qt, vulkan

During our recent interview, the Khronos Group mentioned that one reason to consider merging OpenCL into Vulkan was that, at first, the OpenCL working group wasn’t sure whether they wanted an explicit, low-level API or an easy-to-use one that hides the complexity. Vulkan taught them to take a very low-level position, because there can always be another layer above them that hides complexity from everything downstream of it. This is important for them, because the only layers below them are owned by OS and hardware vendors.

QtCompany_logo_1200x630.png

This post is about Qt, though. Qt is UI middleware, written in C++, that has become very popular as of late. The big revamp of AMD’s control panel with the Crimson Edition was a result of switching from .NET to Qt, which greatly sped up launch time. Qt announced its intent to support the Vulkan API on the very day that the API launched.

Yesterday, they wrote a blog post detailing their intentions for Vulkan support in Qt 5.10.

First and foremost, their last bullet point notes that these stances can change as the middleware evolves, particularly with Qt Quick, Qt 3D, Qt Canvas 3D, QPainter, and similar classes; this is a discussion of Vulkan support in Qt 5.10 specifically. As it stands, though, Qt intends to focus on cross-platform window management and “function resolving for the core API”. The application is expected to manage the rest of the Vulkan API itself (or, of course, use another helper for the other parts).
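
To make that division of labor concrete, here is a minimal sketch of what is left to the application, built around the QVulkanInstance and QVulkanWindow classes previewed for Qt 5.10. Treat the exact class and method names as provisional until 5.10 ships; the point is simply that Qt creates the instance and window while the application records the actual Vulkan commands.

```cpp
#include <QGuiApplication>
#include <QVulkanInstance>
#include <QVulkanWindow>

// Qt owns the instance, surface, swapchain, and per-frame synchronization;
// the renderer only fills in the Vulkan commands for each frame.
class Renderer : public QVulkanWindowRenderer
{
public:
    explicit Renderer(QVulkanWindow *w) : m_window(w) {}

    void startNextFrame() override
    {
        // A real renderer would begin m_window->defaultRenderPass() on
        // m_window->currentCommandBuffer() and record draw calls using the
        // function pointers resolved through
        // m_window->vulkanInstance()->deviceFunctions(m_window->device()).
        m_window->frameReady();     // tell Qt the frame has been recorded
        m_window->requestUpdate();  // schedule the next frame
    }

private:
    QVulkanWindow *m_window;
};

class VulkanWindow : public QVulkanWindow
{
public:
    QVulkanWindowRenderer *createRenderer() override { return new Renderer(this); }
};

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    QVulkanInstance inst;            // Qt handles instance creation...
    if (!inst.create())
        qFatal("Failed to create Vulkan instance");

    VulkanWindow window;             // ...and the window-system integration
    window.setVulkanInstance(&inst);
    window.resize(1024, 768);
    window.show();

    return app.exec();
}
```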

This makes sense for Qt’s position. Their lowest level classes should do as little as possible outside of what their developers expect, allowing higher-level libraries the most leeway to fill in the gaps. Qt does have higher-level classes, though, and I’m curious what others, especially developers, believe Qt should do with those to take advantage of Vulkan. Especially when we start getting into WYSIWYG editors, like Qt 3D Studio, there is room to do more.

Obviously, the first release isn’t the place to do it, but I’m curious none-the-less.

Source: Qt

G.Skill Memory Used In World Record Breaking DDR4 5,500 MHz Overclock

Subject: Memory | June 7, 2017 - 08:49 PM |
Tagged: G.Skill, overclocking, ddr4, x299, liquid nitrogen, computex

Amidst the flood of new product announcements at Computex, G.Skill was busy hosting an overclocking competition where its memory was used in a record breaking overclock that saw DDR4 memory clocked at an impressive 5,500 MHz. Professional overclocker Toppc broke his 5,000 MHz record from last year with the new overclock, which was accomplished on Intel’s X299 platform.

Toppc.jpg

Toppc used an MSI X299 Gaming Pro Carbon AC motherboard, an Intel Core X-series processor, G.Skill DDR4 memory built using Samsung 8Gb ICs, and, of course, copious amounts of liquid nitrogen! Looking at the HWBot page, it appears Toppc specifically used an Intel Core i7-7740K (Kaby Lake X) processor and 8GB of G.Skill Trident Z RGB RAM (CL 14-14-14-14 stock). Both the CPU and memory modules were cooled with liquid nitrogen for the overclock. The CPU-Z screenshot shows the processor running 1 core / 2 threads with a 133.06 MHz bus speed. It also shows an 8x multiplier and a core speed of 1064.46 MHz, but I question whether it is reading the Kaby Lake X part correctly, as running at those speeds wouldn’t need such exotic cooling – perhaps it is needed to run at the 133.06 MHz bus speed and to keep the memory controller from overheating (or melting hehe).
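
For what it's worth, the screenshot's numbers are at least internally consistent: the core reading is simply the bus speed times the 8x multiplier (which could just reflect a parked, low-power multiplier at the moment of capture), and the headline 5,500 MHz figure is a transfer rate, so the actual memory clock is half of that.

\[ 133.06\ \text{MHz} \times 8 \approx 1064.5\ \text{MHz} \qquad\qquad \frac{5500\ \text{MT/s}}{2} = 2750\ \text{MHz (actual memory clock)} \]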

G.Skill is currently pushing the envelope on standard air cooled DIMMs with a prototype kit hitting 4,800 MHz. The company's CVP Tequila Huang stated in a press release:

“We are seeing amazing overclocking potential for these newly released hardware and we believe that more overclocking benchmark records will be achieved very soon by professional overclockers worldwide."

I am interested to see whether there is any additional headroom in the memory overclocking department and, if so, how long the 5.5 GHz world record will stand.

Source: G.Skill

Dawn of War III Vulkan Support on Linux to Add Intel GPUs

Subject: General Tech | June 7, 2017 - 04:54 PM |
Tagged: pc gaming, linux, vulkan, Intel, mesa, feral interactive

According to Phoronix, Alex Smith of Feral Interactive has just published a few changes to the open source Intel graphics driver which allow their upcoming Dawn of War III port for Linux to render correctly on Vulkan. This means that the open-source Intel driver should support the game on day one, although drawing correctly and drawing efficiently could be two very different things -- or maybe not, we’ll see.

feral-2017-dawnofwar3.png

It’s interesting seeing things go in the other direction. Normally, graphics engineers parachute in to high-end developers and help them make the most of their software for each respective, proprietary graphics driver. In this case, we’re seeing the game studios pushing fixes to the graphics vendors, because that’s how open source rolls. It will be interesting to do a pros and cons comparison of each system one day, especially if cross-pollination results from it.

Source: Phoronix

Roccat's newest Kone, the EMP gaming mouse

Subject: General Tech | June 7, 2017 - 03:57 PM |
Tagged: input, roccat, Kone EMP, gaming mouse

Roccat's new Kone EMP shares some attributes with earlier members of the Kone lineup, specifically the Owl-Eye optical sensor based on PixArt’s PMW3361DM, which can be set at up to 12,000 DPI, and the SWARM software suite used to program the mouse.  The onboard ARM Cortex-M0 and 512kB of memory allow the mouse to keep that programming, even on another machine which does not have SWARM installed.  Modders-Inc tested the mouse out; see what they thought of it here.

DSC_0395.jpg

"The Roccat Kone EMP is the next mouse in the Kone line up and the successor to the Kone XTD. The Kone EMP features Roccat's OWL-Eye optical sensor and four RGB LEDs for custom lighting."

Here is some more Tech News from around the web:

Tech Talk

Source: Modders Inc

A chat with Paradox on sequels and expansions

Subject: General Tech | June 7, 2017 - 01:21 PM |
Tagged: gaming, paradox

Paradox is well named, as it has a very different philosophy from the rest of the industry about how to treat games after they have been released.  It is becoming quite common for developers to already be working on a sequel to a game that they have just released, or are in the process of releasing.  Once a game launches you can expect to see numerous and often expensive DLC packs released for it, which usually offer little to no new real gameplay or functionality.

Paradox treats games completely differently: their DLC expansions are often expensive but frequently offer a significant change to the base game, and each expansion's release is accompanied by several major new features added, free of charge, for anyone who owns the game.  They do this for a long time after launch; Crusader Kings II, for example, is five years old and has twelve expansions, while the four year old Europa Universalis IV has ten.  Rock, Paper, SHOTGUN sat down with creative director Johan Andersson and CEO Fredrik Wester to discuss the future of these games and Paradox itself, as well as the effects of offering major updates to older games as opposed to the more common constant release of sequels.

Paradox-Logo.jpg

"With Crusader Kings II now five years old and twelve expansions deep, and Europa Universalis IV a relatively sprightly four years and ten expansions, what is the future of these titles? At what point are they done and at what point does the thought of a sequel come up."

Here is some more Tech News from around the web:

Gaming

 

Coming as a shock to no one, Wannacry can exploit Windows 10

Subject: General Tech | June 7, 2017 - 12:42 PM |
Tagged: wannacry, windows 10, security

If you have an unpatched Windows installation you are vulnerable to the SMBv1 exploit, except perhaps if you are still on WinXP, in which case your machine is more likely to crash than to start encrypting. Do yourself a favour and head to Microsoft to manually download the patch appropriate for your OS and run it; if you already have it, the installer will tell you so, otherwise it will repair the vulnerability.  The versions of WannaCry and its progenitor, EternalBlue, which are making life miserable for users and techs everywhere do not currently go after Win10 machines, but you can read how they can easily be modified to do so over at Slashdot.

banner-datarecovery-cryingLady-416x260.jpg

"The publicly available version of EternalBlue leaked by the ShadowBrokers targets only Windows XP and Windows 7 machines. Researchers at RiskSense who created the Windows 10 version of the attack were able to bypass mitigations introduced by Microsoft that thwart memory-based code-execution attacks."

Here is some more Tech News from around the web:

Tech Talk

 

Source: Slashdot

GOG.com Summer Sale Has Just Begun

Subject: General Tech | June 7, 2017 - 07:02 AM |
Tagged: sale, pc gaming, GOG

GOG.com, formerly Good Old Games, because good old names, is having their summer sale. Discounts are advertised at up to 90%, and a copy of Rebel Galaxy will be gifted immediately following your first purchase.

gog-2017-summersale.jpg

For me, this was the moment that I jumped on The Witcher 3. I deliberately avoided it until the DLC were bundled and the whole package was on a significant discount, which is now the case. The Witcher 3 Game of the Year, albeit not the year that we’re in, is now 50% off. Another front-page deal is Dragon Age Origins Ultimate Edition for 80% off, as is the original Mirror’s Edge, although I already have both of them. If you haven’t played it yet, Brothers: A Tale of Two Sons is great, and it’s 85% off (under $2).

Source: GOG.com

MSI Unveils Fanless Cubi 3 PC Powered By Kaby Lake-U Processors

Subject: General Tech | June 7, 2017 - 02:35 AM |
Tagged: msi, SFF, barebones, nuc, kaby lake, Intel, Optane, computex

MSI recently introduced a new member of its Cubi small form factor barebones PC lineup. The Cubi 3 is a fanless PC that is built around Intel’s Kaby Lake-U processors and will arrive sometime this fall.

MSI Cubi 3.jpg

Notebook Italia and Tek.No got hands-on time with the MSI mini PC at Computex.

The Cubi 3 is a bit larger than its predecessors, but with the larger enclosure MSI was able to achieve a fanless design for up to (U series) Core i7 processors. The SFF PC sports a brushed aluminum case that shows off the top of the CPU heatsink through vents that run around the top edge of the case. There are two flat antennas for Wi-Fi and Bluetooth integrated into the left and right sides of the case.

FanlessTech reports that the MSI Cubi 3 will sport 15W Kaby Lake-U processors, from low end Celerons up to Core i7 models. These are dual core parts with HyperThreading (2c/4t), 3 MB or 4 MB of L3 cache, and either HD (615 or 620) or Iris Plus (640 or 650) integrated graphics. The processor is paired with two DDR4 SO-DIMM slots for up to 32 GB of 2133 MHz memory, an M.2 2280 SSD (there is even Intel Optane support), and a single 2.5” drive.

The Cubi 3 has an audio jack and two USB 3.0 ports up front, and what appears to be two USB 2.0 ports on the left side. Rear I/O includes one HDMI, one DisplayPort, two more USB 3.0, two Gigabit Ethernet, two COM ports, and one power jack for the 65W AC power adapter.

There is no word on pricing yet, but it is slated to begin production in August with availability this fall.

It is always nice to see more competition in this niche fanless SFF space, and the little box would not look out of place on a desk or even in the living room. What are your thoughts?

Source: Fanless Tech

Micron Pushes GDDR5X To 16Gbps, Expects To Launch GDDR6 In Early 2018

Subject: Memory | June 7, 2017 - 01:02 AM |
Tagged: micron, gddr6, gddr5x

JEDEC made the GDDR5X memory standard official almost a year and a half ago; it launched at 10 Gbps and quickly hit 12 Gbps. Set to bridge the gap between GDDR5 and the upcoming GDDR6, the “G5X” standard is quickly catching up to and matching the speeds that GDDR6 will run at.

Specifically, Micron’s Graphics Design Team in Munich was able to hit an impressive 16 Gbps on a “meaningful sampling” of its mass production GDDR5X silicon in their high speed test environment, which makes the feat much more impressive as it means these higher speeds are moving closer to reality than theory. Micron measured a PRBS11 (pseudorandom binary sequence) pattern read at 16 Gbps using an oscilloscope and also showed off a chart that compared the stable data rate timing margin against data rates from 10 Gbps to 16 Gbps.

Micron GDDR5X.png

In addition to teasing the 16 Gbps memory speed (it will be awhile yet before we see products like graphics cards running memory at those speeds), Micron announced that it expects to begin mass production of GDDR6 chips in early 2018. GDDR6 will see a new (larger) FBGA1180 package, faster base sort speeds (GDDR6 will start at 12 Gbps vs G5X's 10 Gbps), and a move to a dual channel approach with channels that have half as many I/O links (GDDR5X is x16/x32 while GDDR6 will be x8/x16 per channel). It will be interesting to see how this move stacks up against G5X, but in theory Micron will be able to push clocks even higher (maybe even higher than 16 Gbps) by having more but simpler channels (and it may be easier for graphics card manufacturers to wire up their cards to the memory chips).
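
For a rough sense of what these per-pin rates mean at the card level, total memory bandwidth is just the per-pin data rate times the bus width. Assuming, purely for illustration, a 256-bit bus like those found on many upper-mid-range cards:

\[ \frac{16\ \text{Gb/s per pin} \times 256\ \text{pins}}{8\ \text{bits per byte}} = 512\ \text{GB/s} \]

versus 320 GB/s for the same bus width at G5X's original 10 Gbps.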

SK Hynix, who showed off its first GDDR6 chip at GTC, appears to be following the same I/O design as Micron with two channel memory at x8 or x16 per channel.

Are you ready for faster GDDR5X? Hopefully these new, faster G5X chips come out soon to give AMD and NVIDIA a more appealing alternative to HBM and HBM2 for mid-range and high end consumer graphics cards, since High Bandwidth Memory still seems to be suffering from limited supply and is holding the GPU guys back from cranking up the production lines!


Source: Micron

Valve Ends Steam Greenlight Program

Subject: General Tech | June 6, 2017 - 08:48 PM |
Tagged: valve, steam, pc gaming

As of today, June 6th, Valve has closed their Greenlight program. New submissions will not be accepted and voting has been disabled. Next week, starting on June 13th, Valve will open Steam Direct, which allows anyone to put their game on the platform for a deposit of $100 per title, refunded once the title makes $1,000 in sales. Valve performs a light amount of testing on each game it receives, so it makes sense to have something that keeps them from drowning when the flood gates open, and it’s nice that they refund the deposit once sales are high enough that their typical fees cover their expenses, rather than double-dipping.

SteamLogo.png

There is still some doubt floating around the net, though... especially regarding developers from impoverished nations. For a Canadian like me, it’s by no means unreasonable to spend around a hundred dollars, plus or minus the exchange rate of the year, to put a game representing years of work onto a gigantic distribution platform. That doesn’t hold true everywhere. At the same time, Valve does have a measurable cost per submission, so lowering the barrier below that would be at their expense. It would also be the right thing to do in some cases. Either way, that’s just my unsolicited two cents.

Steam Direct opens on June 13th.

HardwareCanucks on a Computex HDR vs SDR Demo

Subject: Graphics Cards, Displays | June 6, 2017 - 06:06 PM |
Tagged: hdr, sdr, nvidia, computex

Dmitry Novoselov of Hardware Canucks saw an NVIDIA SDR vs HDR demo, presumably at Computex based on timing and the intro bumper, and noticed that the SDR monitor looked flat. According to his post in the YouTube comments, he asked NVIDIA to gain access to the monitor settings, and they let him... and he found that the brightness, contrast, and gamma settings were way off. He then performed a factory reset, to test how the manufacturer defaults hold up in the comparison, and did his video based on those results.

I should note that video footage of HDR monitors will not correctly describe what you can see in person. Not only is the camera not HDR, and thus not capable of showing the full range of what the monitor is displaying, but also who knows what the camera’s (and later video processing) exposure and color grading will actually correspond to. That said, he was there and saw it in person, so his eyewitness testimony is definitely valid, but it may or may not focus on qualities that you care about.

Anywho, the test was Mass Effect: Andromeda, which has a native HDR profile. To his taste, the SDR presentation apparently looks better in a lot of ways, particularly in how the blown out areas behave. He claims that he’s concerned about game-to-game quality, because there will be inconsistency between how one color grading professional chooses to process a scene versus another, but I take issue with that. Even in standard color range, there will always be an art director who decides what looks good and what doesn’t.

They are now given another knob, and it’s an adjustment that the industry is still learning how to deal with, but that’s not a downside to HDR.

Run softly and carry a big Scythe

Subject: Cases and Cooling | June 6, 2017 - 12:39 PM |
Tagged: scythe, Mugen 5, air cooler

Scythe's Mugen 5 has a bit of a list to one side, which is designed to give your RAM a little more breathing room and lets it fit on motherboards with very little clearance between the socket and the DIMMs.  At 890g and 130x110x154.5mm it is not the largest cooler on the market, but it is big enough to warrant attention when picking out a case to install your system in.  [H]ard|OCP's tests show this cooler to be more focused on low noise than on topping the cooling charts; heavy overclockers will be better served by a different cooler, but those building a quiet system should check out the full review.

149591671380fntpns12_2_6_l.jpg

"The Mugen 5 is one of the larger CPU air coolers you will find on the market, and with that is has an "asymmetric design for maximum memory compatibility," so it does not extend deep into DIMM territory. The polished copper baseplate, as well as the rest of the HSF is nickel plated. Also we have a newly engineered mounting mechanism."

Here is some more Tech News from around the web:

Tech Talk

Source: [H]ard|OCP

Skype Deprecates Several Platforms

Subject: General Tech | June 6, 2017 - 02:27 AM |
Tagged: skype, microsoft

Microsoft has just announced that they will be retiring several Skype apps in about a month’s time (July 1st). The affected platforms are Windows Phone 8, Windows Phone 8.1, Messaging for Windows 10 Mobile, Windows RT, and Skype apps for TV. It’s important to note that Skype for Windows Phone still works, although it requires the Windows 10 Mobile Anniversary Update or later. This was originally announced last year, but no date was given at the time (just "in the coming months").

skype-logo-feb_2012_rgb_500.png

Some sites are noting a workaround for affected users: Skype for Web. Unfortunately, this is probably not a viable option in most circumstances. Specifically, Skype for Web does not officially support mobile browsers, which means that Windows RT users might be in luck, but every other affected device is without options come July 1st.

Source: Thurrott