Subject: Storage | June 12, 2017 - 03:42 PM | Jeremy Hellstrom
Tagged: kingston, DCP1000, enterprise ssd, NVMe, PCIe SSD
The Kingston DCP1000 NVMe PCIe SSD comes in 800GB, 1.6TB, and 3.2TB capacities, though as it is an enterprise-class drive even the smallest size will cost you over $1,000. Even with a price beyond the budget of almost all enthusiasts it is interesting to see the performance of this drive, especially as Kitguru's testing showed it to be faster than the Intel DC P3608. Kitguru cracked the 1.6TB card open to see how it worked and found four Kingston 400GB NVMe M.2 SSDs inside, connected by a PLX PEX8725 24-lane, 10-port PCIe 3.0 switch which passes the data on to the card's PCIe 3.0 x8 connector. Each of those 400GB SSDs has its own Phison PS5007-E7 eight-channel, quad-core controller, which leads to very impressive performance. They did have some quibbles about the drive's performance consistency; however, that is something they have seen on most drives of this class and not something specific to Kingston's drive.
"Move over Intel DC P3608, we have a new performance king! In today’s testing, it was able to sustain sequential read and write speeds of 7GB/s and 6GB/s, respectively! Not only that, but it is able to deliver over 1.1million IOPS with 4KB random read performance and over 180K for write."
Here are some more Storage reviews from around the web:
- Seagate Ironwolf: 10TB Storage for NAS Review @ Bjorn3d
- Synology DS916+ NAS @ PC Review
- Thecus W5810 5-bay NAS @ Kitguru
- Seagate BarraCuda 1 TB ST1000LM048 @
Subject: General Tech | June 12, 2017 - 03:34 PM | Scott Michaud
Tagged: xbox, xbox one, controller, gamepad
When the original Xbox launched, back in 2001, it was bundled with a massive controller in most regions, which was eventually nicknamed “Duke”. While some users loved this form factor, Microsoft decided to make the “S” controller (the default for Japanese Xboxes) the international default about a year later. Duke ended up a cult classic.
Now, at E3 2017, Hyperkin Games Inc. is launching an Xbox One controller with a very similar design, which will also be compatible with Windows 10. A few liberties were taken to add and subtract buttons that didn’t exist on one side or the other of the original Xbox / Xbox One design fence. Hyperkin consulted with Seamus Blackley, one of the original developers of the Xbox console, who approved the remake.
No word on pricing, but it will be available this holiday season (2017).
Subject: Mobile | June 12, 2017 - 01:16 PM | Jeremy Hellstrom
Tagged: Huawei, matebook, MateBook X, MateBook D, fanless
Huawei mobile phones are growing in popularity in North America, with products available on Amazon and in brick-and-mortar stores as well. The company has now expanded its product lineup to include 13" laptops, the MateBook X and the MateBook D. These laptops are fanless, thanks to their all-metal design and the incorporation of Huawei's Space Cooling technology, which uses microencapsulated phase-change materials built into the body of the laptop. Inside you will find a seventh-generation Core i5 or i7, either 4GB or 8GB of LPDDR3 RAM, and a 256GB or 512GB SSD. The Inquirer were impressed with almost every aspect of this ultramobile, from performance to the nine hours of battery life; read all about it here.
"LAPTOPS? REALLY? Is there anything Huawei isn't producing these days? We should have known this day would come when Huawei announced its first Windows 10 tablet, the original MateBook. Now, over a year later, the Chinese behemoth has unveiled its successor along with two, very un-tablety laptops: the MateBook X and the MateBook D."
Here are some more Mobile articles from around the web:
- Samsung Secure Folder Kills & Replaces My Knox @ TechARP
- HTC U11 @ The Inquirer
- The 2017 Samsung Galaxy A7 @ TechARP
- The 10.5-inch iPad Pro is much more “pro” than what it replaces @ Ars Technica
Subject: General Tech, Shows and Expos | June 12, 2017 - 12:19 PM | Jeremy Hellstrom
Tagged: ssd, sata, NVMe, M.2, computex 2017, adata
Adata had a flashy booth at Computex, focused on its upcoming storage and memory products, and The Tech Report spent some time there. The company had quite a lineup to show off, including a pair of enterprise-class M.2 drives: the NVMe IM2P33E8, powered by Silicon Motion's upcoming SM2262 controller and reputed to hit 3000 MB/s reads and 1500 MB/s writes, as well as the SATA IM2S33D8 using the SM2259 controller.
For high-end users there are the NVMe XPG SX9000, XPG SX8000, and XPG SX7000; the first pairs a Marvell controller with Toshiba's evergreen 15-nm MLC NAND, while the latter two use a Silicon Motion controller and IMFT 3D MLC flash. For the price sensitive, Adata has launched an M.2 drive that only uses two PCIe lanes; it will not be as fast as the high-end drives but should leave an HDD or older SSD in the dirt.
As for what is below? Why that is an XPG Spectrix S10 drive which is the world's first RGB infected SSD.
"Without high-end motherboards or funky case concepts to show off, Adata focused its Computex presence on its strong point: storage. Join us as we walk through the company's upcoming SSD offerings."
Here is some more Tech News from around the web:
- The AMD Vega Memory Architecture Q&A With Jeffrey Cheng @ TechARP
- Researchers find super flaw in Virgin Media Super Hub routers @ The Inquirer
- Alphabet offloads bot businesses Boston Dynamics and SCHAFT @ The Register
- Microsoft officially hangs up on old Skype phones, users fuming @ The Register
- Windows 10 Creators Update preview: Lovin' for Edge and pen users, nowt much else @ The Register
Subject: General Tech | June 12, 2017 - 03:00 AM | Ryan Shrout
Tagged: powerplay, logitech g, logitech, lightspeed, g903, g703
Logitech has finally released what I can only describe as the holy grail of mouse technologies. By combining the well-established, high-performance wireless connectivity of the G900 mouse with a while-in-use wireless CHARGING system for its new gaming mice, Logitech is promising to bring us “unlimited gaming” and a life that no longer requires cables, battery notifications, or location-based timeouts.
Rather than bury the lead by diving into the new mice that go along with the technology, let’s first discuss PowerPlay, both the brand and the product name that Logitech is giving to the wireless charging mat that makes this all happen. Wireless charging is not a new idea, and it has been implemented on other products prior, but not to this scale. With Logitech PowerPlay you are not required to leave the mouse over a certain section of the surface and pause usage to charge. Instead, PowerPlay, when paired with one of the two mice launching with the technology, affords you continuous power that keeps you charged WHILE you are gaming!
This is a significant advancement, and one that leads to quite a few improvements for gamers. First, by overcoming the need for the device to sit still in one spot, PowerPlay creates the largest single charging surface for any device I have seen. The surface measures 275mm x 320mm and closely mirrors other Logitech G mouse surfaces. Getting a surface that large, with enough power to guarantee the mouse will be provided more power than it can consume while in use, took a long time to engineer. Going above this size will be even more difficult as EMI restrictions from governmental bodies around the world come into play.
PowerPlay is implemented as a USB-attached power input with a hard surface that goes on your desk or table; Logitech then provides a soft surface that goes over it to suit your preference. The mice that support PowerPlay (shown below) will still have USB connections for charging or use while away from your main PC, so you aren’t stuck in one place or lugging around the added hardware if you don’t need it.
The amount of charging power PowerPlay provides wasn’t stated exactly, but it is definitely lower than a direct USB connection. I asked Logitech engineers how I could compare the two power input methods: from a zero state on the mouse to a full charge, the USB cable takes about 2 hours, while PowerPlay would take close to 14 hours. That’s a significant difference, but Logitech assured me that a user could game forever with this system, assuming no interruptions in power to the pad itself. The power delivery has multiple steps, and Logitech says the mouse will charge faster when idle.
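Some back-of-the-envelope math shows why a trickle like that can still keep up with active use. The 2-hour and 14-hour figures are Logitech's; the battery capacity and runtime below are my own rough assumptions for a G900-class mouse, used only to illustrate the relationship:

$$
\frac{\text{PowerPlay rate}}{\text{USB rate}} \approx \frac{2\ \text{h}}{14\ \text{h}} \approx 0.14,
\qquad
I_{\text{draw}} \approx \frac{750\ \text{mAh}}{30\ \text{h}} = 25\ \text{mA}
\quad\text{vs.}\quad
I_{\text{PowerPlay}} \approx \frac{750\ \text{mAh}}{14\ \text{h}} \approx 54\ \text{mA}
$$

As long as the trickle exceeds the mouse's active draw, the battery level trends upward even mid-game, which is the basis for the "game forever" claim.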
I can’t tell you how often I have asked for a feature like this, or how often the idea has been brought up by readers. Logitech has delivered – though it will cost you $99, plus the cost of a new mouse, to get up and running.
Speaking of those new mice, Logitech is bringing two options today that will work just fine with, or without, the PowerPlay feature. The G903 is the successor to the incredibly popular and well-reviewed G900, a wireless gaming mouse that has exceeded my expectations in performance at every turn. Second is the G703, a successor to the G403. These mice are priced at $149 and $99, respectively. The PowerPlay technology is supported by a small module that is put in place on the underside of the mouse. That opening can also house a 10g weight for users that would prefer a heavier model; note that you cannot use the weight and the wireless charging module at the same time.
Finally, Logitech has used this opportunity to brand the wireless data technology that first debuted in the G900 as Lightspeed. I have talked about the engineering and design that went into Logitech’s release of its wireless gaming hardware previously, and it does bear repeating and a deeper dive coming soon. But gamers that worry about wireless not being as fast or as accurate as wired gaming mice should be convinced through the testing and science behind Logitech’s implementation.
In total, this hardware from Logitech provides what I feel is the most robust and feature-rich gaming mouse package that exists today. The G903 and the G703 retain their superior design and capability (with some improvements along the way) while the PowerPlay wireless charging mat offers a new feature that gamers, and PC enthusiasts of all kinds, have been clamoring for over the years.
We should have our sample units in very shortly, with availability starting in late June for the mice and in August for the mat!
Subject: General Tech | June 11, 2017 - 04:56 PM | Jim Tanous
Tagged: Xbox Scorpio, xbox, microsoft, E3
At its E3 2017 keynote Sunday, Microsoft finally unveiled the official details for its upcoming "Project Scorpio" console, now called "Xbox One X." The console, surprisingly smaller than even the Xbox One S, will launch November 7, 2017 and, as expected, will be priced at $499, the same launch price of the original Xbox One in November 2013.
With a maximum 6 teraflops of GPU horsepower and a class-leading 326GB/s memory bandwidth, Microsoft is hoping that its significant performance advantage over Sony's $399 PS4 Pro, as well as its ability to play UHD Blu-ray discs, will help justify the $100 price difference for consumers.
|                  | Xbox One X      | PS4 Pro         |
|------------------|-----------------|-----------------|
| CPU              | 2.3 GHz 8-Core  | 2.16 GHz 8-Core |
| GPU              | 6 TFLOPS        | 4.2 TFLOPS      |
| Memory           | 12GB GDDR5      | 8GB GDDR5       |
| Memory Bandwidth | 326 GB/s        | 218 GB/s        |
| Storage          | 1TB HDD         | 1TB HDD         |
One of the criticisms of the PS4 Pro is that many of the games "optimized" for the system do not utilize 4K assets or run at true 4K resolution. In response, Microsoft clarified repeatedly throughout its keynote that many games designed for Xbox One X will indeed run at 4K/60fps. While Microsoft will likely ensure that its own house-published titles and those from close partners will hit this mark, it remains to be seen how well cross-platform games from third parties will fare.
As for those who don't have 4K displays, Xbox One X will use supersampling to increase perceived resolution and quality at 1080p. The popular Xbox 360 backwards compatibility feature (which will soon include original Xbox games) will also benefit from the Xbox One X's increased horsepower, with Microsoft promising faster load times and improved anti-aliasing.
As with the PS4 Pro, all games will support both console generations, with many titles going forward "enhanced for Xbox One X." One of Sony's biggest problems is the lack of games that truly take advantage of the PS4 Pro's unique features, so Microsoft's ability to bring third party developers on board will be key to the Xbox One X's success.
We'll need the console to hit the market to get a more detailed look at its technical specifications, but based on Microsoft's claimed performance numbers, the Xbox One X looks like a relatively good deal from a hardware perspective. The console's 6 TFLOPS of graphics processing power compares to an NVIDIA GTX 1070, which currently retails for just over $400. Add in the 1TB hard drive, custom 8-core CPU, and UHD Blu-ray player, and the price is suddenly not so unreasonable. Of course, newer cards like the AMD Radeon RX 580 also hit around 6 TFLOPS for ~$220, but you won't be able to find one of those these days. At a $100 premium over the PS4 Pro, however, it's unclear how the console community will value the Xbox One X's hardware advantage.
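For context on where that 6 TFLOPS figure comes from, peak GPU throughput is typically computed as two floating-point operations per shader ALU per clock. Using the Scorpio Engine configuration Microsoft disclosed earlier this year (40 compute units, or 2,560 stream processors, at 1,172 MHz) and the PS4 Pro's 36 CUs at 911 MHz:

$$
2 \times 2560 \times 1.172\ \text{GHz} \approx 6.0\ \text{TFLOPS}
\qquad
2 \times 2304 \times 0.911\ \text{GHz} \approx 4.2\ \text{TFLOPS}
$$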
One thing that is clear is that Microsoft's Xbox team wasn't too happy to be the source of mockery based on performance and sales for the past four years, and they're highly motivated to come out swinging this fall.
Preorders for Xbox One X have yet to be announced, but you'll find the Amazon pre-order page here when orders go live.
Subject: Processors | June 9, 2017 - 03:02 PM | Jeremy Hellstrom
Tagged: amd, ryzen 5, productivity, ryzen 7 1800x, Ryzen 5 1500X, AMD Ryzen 5 1600, Ryzen 5 1600X, ryzen 5 1400
The Tech Report previously tested the gaming prowess of AMD's new processor family and are now delving into the performance of productivity software on Ryzen. Many users who are shopping for a Ryzen will be using it for a variety of non-gaming tasks such as content creation, coding, or even particle flow analysis. The story is somewhat different when looking through these tests: AMD takes the top spot in many benchmarks and in others is surpassed only by the Core i7-6700K, although in some tests that chip leaves all competition in the dust by a huge margin. For budget-minded shoppers, the Ryzen 5 1600 barely trails both the i7-7700K and the 1600X in their productivity tests, making it a very good bargain for someone looking to build a new system. Check out the full suite of tests right here.
"Part one of our AMD Ryzen 5 review proved these CPUs have game, but what happens when we have to put the toys away and get back to work? We ran all four Ryzen 5 CPUs through a wide range of productivity testing to find out."
Here are some more Processor articles from around the web:
- AMD Ryzen 5 1400 3.2 GHz @ techPowerUp
- Intel Skylake X and Kaby Lake X: Luke and Leo Discuss @ Kitguru
- Core i3-7350K @ Hardware Secrets
Subject: General Tech | June 9, 2017 - 02:29 PM | Jeremy Hellstrom
Tagged: Windows 10 S, security
Microsoft recently pointed out that their new lite version of Windows 10 for students, Windows 10 S, is immune to all known ransomware. This does make sense: the OS is simply unable to install anything that is not from the Windows Store, which does not officially host any malware, even if some of the available programs are not entirely useful. That security will last as long as no one figures out a way to fake the file validation and the connection to Microsoft's online store, or manages to get a malware-infected file approved for sale on the store. Apple has had some experiences which prove that is not an impossibility. Pop by Slashdot for more.
You could also choose to go with the OS of choice for financial institutions and various other industries, Windows XP Embedded with the Enhanced Write Filter. Generally secure, and it can be reset with a simple reboot ... in most cases.
"However, if you want to guarantee your safety from ransomware, then Microsoft points out there's an even more secure option to consider -- Windows 10 S. The new, hardened Windows 10 variant only runs apps from the Windows Store, which means it can't run programs from outside Microsoft's ecosystem, and that includes malware. Which is why, as Microsoft says, "No known ransomware works against Windows 10 S."
Here is some more Tech News from around the web:
- Computex 2017: Corsair goes high-concept @
- Blackberry KeyOne consumer alert: You bend it, you break it @ The Inquirer
- Linksys WRT3200ACM AC3200 Wireless Router @ Kitguru
- Skype Retires Older Apps for Windows, Linux @ Slashdot
- COUGAR ARMOR Gaming Chair Review @ NikKTech
Subject: General Tech | June 9, 2017 - 02:23 PM | Sebastian Peak
Tagged: wired, surround, Pro-G, logitech, headset, headphones, gaming, G433, DTS Headphone:X, drivers, 7.1
Logitech has released their latest surround gaming headphones with the wired G433 Gaming Headset, a 7.1-channel (via DTS Headphone:X) model that is latest to use the company's Pro-G drivers.
The style of the new G433 is quite eye-catching, with four colors (black, red, blue, and blue camo) of a unique fabric finish that Logitech says is hydrophobic (repels water) for enhanced durability. The G433 primarily functions as an analog headphone (with a 3.5 mm plug) unless the included USB DAC/headphone amp is used, which gives PC users access to DTS Headphone:X surround up to 7.1 channels and customizable EQ via Logitech's Gaming Software. The microphone is a removable boom style with noise reduction to help improve voice clarity, and Logitech has used a 5-element double-grounded cable to eliminate crosstalk and prevent game audio from bleeding into voice.
The G433 arrives with an MSRP of $99, making the headset the least expensive Pro-G option to date, but this comparatively low price tag for a premium option still provides the buyer a complete accessory pack including the USB DAC, alternate ear pads, two 3.5 mm audio cables (one with inline mic), a 3.5 mm audio/mic Y-cable, and a fabric storage bag.
The Logitech G433 is available now, and with a pair on hand we will have a full review up very soon!
Subject: Displays | June 9, 2017 - 11:24 AM | Ryan Shrout
Tagged: Samsung, hdr, freesync 2, freesync, CHG90, CHG70, amd
Samsung made a surprise announcement this morning, taking the wraps off of the first FreeSync 2 monitors to grace our pages, officially. These gaming displays come in three different sizes, one of them incredibly unique, and all with HDR support and Quantum Dot technology to go along with the variable refresh rate technology of FreeSync.
All three displays utilize the QLED Quantum Dot tech first showcased in the QLED TV lineup launched just this past January at CES. It uses a new metal core and has some impressive color quality capabilities, covering 125% of the sRGB color space and 95% of the DCI-P3 color space! I don't yet know what the peak luminance is, or how many backlight zones there might be for HDR support, but I have asked Samsung for clarification and will update here when I get feedback. All three displays use VA panels.
All three displays also become the first to pass certification with AMD for FreeSync 2, which we initially detailed WAY BACK in January of this year. FreeSync 2 should tell us that this display meets some minimum standards for latency, color quality, and low frame rate compensation. These are all great on paper, though I am still looking for details from AMD on what exactly the minimum standards have been set to. At the time, AMD would only tell me that FreeSync 2 displays "will require a doubling of the perceivable brightness and doubling of the viewable color volume based on the sRGB standards."
The bad boy of the group, the Samsung CHG90 (part number C49HG90), is easily the most interesting. It comes in with a staggering screen size of 49-inches and a brand new 32:9 aspect ratio with an 1800R curvature. With a 3840x1080 resolution, I am eager to see this display in person and judge how the ultra-wide design impacts our gaming and our productivity capability. (They call this resolution DFHD, for double full HD.) The refresh rate peaks at 144 Hz and a four-channel scanner is in place to minimize any motion blur or ghosting. A 1ms rated response time also makes this monitor incredibly impressive, on paper. Price for the C49HG90 is set at $1499 with preorders starting today on Amazon.com. (Amazon lists a June 30th release date, but I am looking to clarify.)
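A quick bit of arithmetic on that 32:9 claim (nothing from Samsung here beyond the published resolution): 3840x1080 reduces exactly to 32:9 and is precisely two 1920x1080 panels side by side, which is where the "double full HD" name comes from:

$$
\frac{3840}{1080} = \frac{32}{9}, \qquad 3840 \times 1080 = 2 \times (1920 \times 1080)
$$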
Also on the docket is the CHG70, available in two sizes, a 32-in (C32HG70) and a 27-in (C27HG70) model. Both are 2560x1440 resolution screens with 16:9 aspect ratios, 1ms response times, and FreeSync 2 integration. That means the same 125% sRGB and 95% DCI-P3 color space support along with the Samsung Quantum Dot technology. Both will sport a 144 Hz refresh rate and an 1800R curvature. Aside from size and resolution, the specifications are essentially identical across all three models, making the selection process an easier choice based on price segment and screen real estate. The C27HG70 will be on preorder from Samsung.com exclusively for $599 while the C32HG70 will be on preorder at Newegg.com for $699, just $100 more.
All three displays will feature a Game Mode to optimize image settings for...gaming.
Samsung’s CHG90 extends the playing field for virtual competitors, with its 49-inch design representing the widest gaming monitor available. The monitor delivers a dramatic 1,800R curvature and an ultra-wide 178-degree viewing angle, ensuring that content is clearly visible from nearly any location within a given space. As a result, gamers no longer need to worry about the logistics, expenses, and bezel interference that occur when combining multiple smaller monitors together for an expanded view.
The new CHG90 monitor includes a height adjustable stand (HAS), allowing flexible height adjustment for improved viewing comfort. Designed for the most demanding games, the CHG70 monitor goes a step further with a dual-hinge stand that provides users more precise control over how the display panel is positioned.
In addition to Game Mode, a feature that optimizes image setting for playing games when connected to a PC or a game console, each of the new monitors include a game OSD dashboard, designed to blend seamlessly into game interfaces.
A full table of specifications is below and trust me on this one guys, I am already up in Samsung's and AMD's face to get these monitors in here for review!
Now all we are missing is the power of a Radeon RX Vega card to push this high resolution, high refresh rate HDR goodness!!
Subject: Graphics Cards | June 8, 2017 - 05:26 PM | Jeremy Hellstrom
Tagged: radeon, Crimson Edition 17.6.1, amd
In the very near future AMD will be releasing an updated driver, focused on improving performance in Prey and DiRT 4.
For DiRT 4 it will enable a Multi GPU profile and up to 30% performance improvement when using 8xMSAA on a Radeon RX 580 8GB compared to the previous release.
Subject: General Tech | June 8, 2017 - 12:42 PM | Jeremy Hellstrom
Tagged: microsoft, hexadite, windows defender, security
If you have never heard of Hexadite you are not alone; the online security company was formed in 2014, headquartered in Boston but based in Tel Aviv. It has just been purchased by Microsoft for around $100 million so that Hexadite's Automated Incident Response Solution (AIRS) can be integrated into Windows Defender Advanced Threat Protection. AIRS is not antivirus software; instead it is a tool that integrates with existing software and monitors for alerts. Once an alert is detected, the tool automatically investigates it and searches for solutions, in theory preserving your security team's sanity by vastly reducing the number of alerts they must deal with directly. It will be interesting to see whether this changes how companies and users perceive the effectiveness of Windows Defender.
"Hexadite's technology and talent will augment our existing capabilities and enable our ability to add new tools and services to Microsoft's robust enterprise security offerings."
Here is some more Tech News from around the web:
- Museum of Failure will help us learn from our 404s @ The Inquirer
- Raspberry Pi Malware Mines BitCoin @ Hack a Day
- AMD Threadripper and Vega: Luke and Leo discuss @ Kitguru
- MediaTek considers placing chip orders with Globalfoundries @ DigiTimes
- Pop-up Android adware uses social engineering to resist deletion @ The Register
Subject: General Tech | June 8, 2017 - 11:22 AM | Alex Lustenberg
Tagged: X399, x370, x299, wwdc, video, shield, podcast, plex, pixel, macbook, Mac Pro, Logitech G413, Lian-Li, gigabyte, computex, asus, asrock, apollo lake, 3D XPoint
PC Perspective Podcast #453 - 06/07/17
Join us for talk about continued Computex 2017 coverage, WWDC '17, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, Allyn Malventano
Peanut Gallery: Alex Lustenberg, Ken Addison
Week in Review:
News items of interest:
1:10:50 Honey, I shrunk the silicon
Hardware/Software Picks of the Week
Subject: General Tech | June 7, 2017 - 09:31 PM | Josh Walrath
Tagged: silicon nanosheet, Samsung, IBM, GLOBALFOUNDRIES, FinFET, 5nm
It seems only yesterday that we saw Intel introduce their 22nm FinFET technology, and now we are going all the way down to 5nm. This is obviously an exaggeration. The march of process technology has been more than a little challenging for the past 5+ years for everyone in the industry. Intel has made it look a little easier by being able to finance these advances a little better than the pure-play foundries, but that does not mean they have not experienced challenges of their own.
We have seen some breakthroughs these past years with everyone jumping onto FinFETs, as TSMC, Samsung, and GLOBALFOUNDRIES introduced their own processes. GLOBALFOUNDRIES initially set out on its own, but that particular endeavor did not pan out. They ended up licensing Samsung’s 14nm processes (LPE and LPP) to start producing chips of their own, primarily for AMD's graphics chips and the latest generation of Ryzen CPUs.
These advances have not been easy. While FinFETs are needed at these lower nodes to keep delivering performance and power efficiency at such transistor densities, the technology will not last forever. 10nm and 7nm lines will continue to use them, but many believe that while densities will improve, the power characteristics will start to lag behind. The theory is that past the 7nm nodes traditional FinFETs will no longer work as desired. This is very reminiscent of the sub-28nm processes that attempted to use planar structures on bulk silicon: the chips could be made, but power issues plagued the designs and eventually support for those process lines was dropped.
IBM and its research partners Samsung and GLOBALFOUNDRIES, working at SUNY Polytechnic Institute Colleges of Nanoscale Science and Engineering’s NanoTech Complex in Albany, NY, have announced a breakthrough: a new “Gate-All-Around” architecture made on a 5nm process. A FinFET is essentially a rectangle surrounded on three sides by the gate, giving it the “fin” physical characteristics. This new technology covers the fourth side as well and embeds the channels in nanosheets of silicon.
The problem with FinFETs is that they will eventually be unable to scale in power as transistors get closer and closer together. While density scales, power and performance will get worse compared to previous nodes. The 5nm silicon nanosheet technology gives a significant boost to power and efficiency, thereby doing to FinFETs what FinFETs did to planar structures at the 20/22nm nodes.
One of the working EUV litho machines at SUNY Albany.
IBM asserts that the average chip the size of a fingernail can contain up to 30 billion transistors and continue to see the density, power, and efficiency improvements that we would expect with a normal process shrink. The company expects these process nodes to start rolling out in a 2019 time frame if all goes as planned.
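To put that claim in rough numbers (the die area is my own assumption of a roughly 50 mm², fingernail-sized chip, not a figure from IBM's release):

$$
\frac{30 \times 10^{9}\ \text{transistors}}{50\ \text{mm}^2} = 600\ \text{million transistors/mm}^2
$$

For comparison, that is several times the roughly 100 million transistors per square millimeter Intel has quoted for its upcoming 10nm process.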
There are few details on how IBM was able to achieve this result, but we do know a couple of things. EUV lithography was used extensively to avoid the multi-patterning nightmare that this design would otherwise entail. For the past two years ASML has been installing 100 watt EUV litho machines around the world for select clients, and one of these is located on the SUNY Albany campus where this research was done. We also know that deposition was done layer by layer with silicon and the other materials.
What we don’t know is how long it takes to create a complete wafer. Usually these test wafers are packed full of SRAM and very little logic. It is a useful test and creates a baseline for many structures that will eventually be applied to this process. We do not know how long it takes to produce such a wafer, but considering how the layers look to be deposited it takes a long, long time with current tools and machinery. Cutting edge wafers in production can take upwards of 16 weeks to complete. I hesitate to even guess how long each test wafer takes. Because of the very 3D nature of the design, I am curious as to how the litho stages work and how many passes are still needed to complete the design.
This looks to be a very significant advancement in process technology that should be mass produced in the timeline suggested by IBM. It is a significant jump, but it seems to borrow a lot of previous FinFET structures. It does not encompass anything exotic like “quantum wells”, but is able to go lower than the currently specified 7nm processes that TSMC, Samsung, and Intel have hinted at (and yes, process node names should be taken with a grain of salt from all parties at this time). IBM does appear to be comparing this to what Samsung calls its 7nm process in terms of dimensions and transistor density.
Cross section of a 5nm transistor showing the embedded channels and silicon nanosheets.
While Moore’s Law has been stretched thin as of late, we are still seeing these scientists and engineers pushing against the laws of physics to achieve better performance and scaling at incredibly small dimensions. The silicon nanosheet technology looks to be an effective and relatively affordable path towards smaller sizes without requiring exotic materials to achieve. IBM and its partners look to have produced a process node that will continue the march towards smaller, more efficient, and more powerful devices. It is not exactly around the corner, but 2019 is close enough to start planning designs that could potentially utilize this node.
Subject: General Tech | June 7, 2017 - 09:10 PM | Scott Michaud
Tagged: Qt, vulkan
During our recent interview, the Khronos Group mentioned that one reason for merging into Vulkan was that, at first, the OpenCL working group wasn’t sure whether they wanted an explicit, low-level API or an easy-to-use one that hides the complexity. Vulkan taught them to take a very low-level position, because there can always be another layer above them that hides the complexity from everything downstream of it. This is important for them, because the only layers below them are owned by OS and hardware vendors.
This post is about Qt, though. Qt is a UI middleware, written in C++, that has become very popular as of late. The big revamp of AMD’s control panel with Crimson Edition was a result of switching from .NET to Qt, which greatly sped up launch time. They announced their intent to support the Vulkan API on the very day that it launched.
First and foremost, their last bulletpoint claims that these stances can change as the middleware evolves, particularly with Qt Quick, Qt 3D, Qt Canvas 3D, QPainter, and similar classes. This is a discussion of their support for Qt 5.10 specifically. As it stands, though, Qt intends to focus on cross-platform, window management, and “function resolving for the core API”. The application is expected to manage the rest of the Vulkan API itself (or, of course, use another helper for the other parts).
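To give a concrete sense of that scope, here is a minimal sketch built on the QVulkanInstance and QVulkanWindow classes that arrive with Qt 5.10. Treat it as an illustration of where Qt's responsibility ends rather than a verified sample: instance creation, core function resolving, and window/swapchain management are handled by Qt, while any actual drawing is left to application code, typically via a QVulkanWindowRenderer subclass.

```cpp
// Minimal sketch of Qt 5.10's Vulkan scope: instance + window, no rendering code.
#include <QGuiApplication>
#include <QVulkanInstance>
#include <QVulkanWindow>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    QVulkanInstance inst;          // loads the Vulkan library and resolves the core API
    if (!inst.create())
        qFatal("Failed to create Vulkan instance");

    QVulkanWindow window;          // manages the surface, swapchain, and device selection
    window.setVulkanInstance(&inst);
    window.resize(1024, 768);
    window.show();                 // without a QVulkanWindowRenderer subclass, nothing is drawn

    return app.exec();
}
```

Everything past this point (pipelines, command buffers, descriptor sets) is the application's problem, which is exactly the division of labor Qt describes above.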
This makes sense for Qt’s position. Their lowest level classes should do as little as possible outside of what their developers expect, allowing higher-level libraries the most leeway to fill in the gaps. Qt does have higher-level classes, though, and I’m curious what others, especially developers, believe Qt should do with those to take advantage of Vulkan. Especially when we start getting into WYSIWYG editors, like Qt 3D Studio, there is room to do more.
Obviously, the first release isn’t the place to do it, but I’m curious nonetheless.
Subject: Memory | June 7, 2017 - 08:49 PM | Tim Verry
Tagged: G.Skill, overclocking, ddr4, x299, liquid nitrogen, computex
Amidst the flood of new product announcements at Computex, G.Skill was busy hosting an overclocking competition where its memory was used in a record-breaking overclock that saw DDR4 memory clocked at an impressive 5,500 MHz. Professional overclocker Toppc broke his 5,000 MHz record from last year with the new overclock, which was accomplished on Intel’s X299 platform.
Toppc used an MSI X299 Gaming Pro Carbon AC motherboard, an Intel Core X-series processor, G.Skill DDR4 memory built using Samsung 8Gb ICs, and, of course, copious amounts of liquid nitrogen! Looking at the HWBot page, it appears Toppc specifically used an Intel Core i7-7740K (Kaby Lake-X) processor and 8GB of G.Skill Trident Z RGB RAM (CL 14-14-14-14 stock). Both the CPU and memory modules were cooled with liquid nitrogen for the overclock. The CPU-Z screenshot shows the processor running 1 core / 2 threads with a 133.06 MHz bus speed. It also shows an 8x multiplier and a core speed of 1064.46 MHz, but I question whether CPU-Z is reading the Kaby Lake-X part correctly, as running at those speeds wouldn’t need such exotic cooling – perhaps it is needed to run at the 133.06 MHz bus speed and to keep the memory controller from overheating (or melting hehe).
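The CPU-Z readings at least hang together arithmetically (this is just my reading of the screenshot, not anything G.Skill has stated): the 8x multiplier on the 133.06 MHz bus matches the reported core speed, and the headline 5,500 MHz figure is the DDR4 effective data rate, which is double the actual memory clock:

$$
8 \times 133.06\ \text{MHz} \approx 1064.5\ \text{MHz}, \qquad \frac{5500\ \text{MT/s}}{2} = 2750\ \text{MHz actual memory clock}
$$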
G.Skill is currently pushing the envelope on standard air cooled DIMMs with a prototype kit hitting 4,800 MHz. The company's CVP Tequila Huang stated in a press release:
“We are seeing amazing overclocking potential for these newly released hardware and we believe that more overclocking benchmark records will be achieved very soon by professional overclockers worldwide."
I am interested to see whether this platform has any additional headroom in the memory overclocking department and, if so, how long the 5.5 GHz world record will stand.
Subject: General Tech | June 7, 2017 - 04:54 PM | Scott Michaud
Tagged: pc gaming, linux, vulkan, Intel, mesa, feral interactive
According to Phoronix, Alex Smith of Feral Interactive has just published a few changes to the open source Intel graphics driver which allow their upcoming Dawn of War III port for Linux to render correctly on Vulkan. This means that the open-source Intel driver should support the game on day one, although drawing correctly and drawing efficiently could be two very different things -- or maybe not, we’ll see.
It’s interesting seeing things go in the other direction. Normally, graphics engineers parachute in to high-end developers and help them make the most of their software for each respective, proprietary graphics driver. In this case, we’re seeing the game studios pushing fixes to the graphics vendors, because that’s how open source rolls. It will be interesting to do a pros and cons comparison of each system one day, especially if cross-pollination results from it.
Subject: General Tech | June 7, 2017 - 03:57 PM | Jeremy Hellstrom
Tagged: input, roccat, Kone EMP, gaming mouse
Roccat's new Kone EMP shares some attributes with earlier members of the Kone lineup, specifically the Owl-Eye optical sensor based on PixArt's PMW3361DM, which can be set at up to 12,000 DPI, and the SWARM software suite used to program the mouse. The onboard ARM Cortex-M0 and 512kB of memory allow the mouse to keep that programming, even on another machine which does not have SWARM installed. Modders-Inc tested the mouse out; see what they thought of it here.
"The Roccat Kone EMP is the next mouse in the Kone line up and the successor to the Kone XTD. The Kone EMP features Roccat's OWL-Eye optical sensor and four RGB LEDs for custom lighting."
Here is some more Tech News from around the web:
- Corsair GLAIVE RGB USB Gaming Mouse @ Benchmark Reviews
- Patriot Viper V770 Mechanical RGB Keyboard @ techPowerUp
- G.SKILL RIPJAWS KM570 MX Mechanical Gaming Keyboard Review @ NikKTech
Subject: General Tech | June 7, 2017 - 01:21 PM | Jeremy Hellstrom
Tagged: gaming, paradox
Paradox is well named, as it has a very different philosophy from the rest of the industry about how to treat games after they have been released. It is becoming quite common for developers to already be working on a sequel to a game they have just released, or are in the process of releasing. Once a game has launched you can expect to see numerous, often expensive DLC packs for it, which usually offer little to no real new gameplay or functionality.
Paradox treats games completely differently. Their DLC expansions are often expensive but frequently make significant changes to the base game, and each release is accompanied by several major new features added free of charge for anyone who owns the game. They keep this up long after launch: Crusader Kings II is five years old with twelve expansions, while the four-year-old Europa Universalis IV has ten. Rock, Paper, SHOTGUN sat down with creative director Johan Andersson and CEO Fredrik Wester to discuss the future of these games and of Paradox itself, as well as the effects of offering major updates to older games as opposed to the more common constant release of sequels.
"With Crusader Kings II now five years old and twelve expansions deep, and Europa Universalis IV a relatively sprightly four years and ten expansions, what is the future of these titles? At what point are they done and at what point does the thought of a sequel come up."
Here is some more Tech News from around the web:
- Total War: Warhammer 2 is taking everything further @ Rock, Paper, SHOTGUN
- Arms review: Nintendo reinvents the fighting game and it’s brilliant @ Ars Technica
- BattleTech is the mech game I’ve always wanted @ Rock, Paper, SHOTGUN
- Battleborn free downloadable experience launched @ HEXUS
- Ealdorlight is a procedural storytelling fantasy RPG @ Rock, Paper, SHOTGUN
- Brigador: Up-Armored Edition ‘relaunches’ the mech combat game @ Rock, Paper, SHOTGUN
Subject: General Tech | June 7, 2017 - 12:42 PM | Jeremy Hellstrom
Tagged: wannacry, windows 10, security
If you have an unpatched Windows installation you are vulnerable to the SMBv1 exploit, except perhaps if you are still on WinXP, in which case your machine is more likely to crash than to start encrypting files. Do yourself a favour and head to Microsoft to manually download the patch appropriate for your OS and run it; if you already have it installed it will tell you so, otherwise it will repair the vulnerability. The version of WannaCry, and its progenitor EternalBlue, which is making life miserable for users and techs everywhere does not currently go after Win10 machines, but you can read how it can easily be modified to do so over at Slashdot.
"The publicly available version of EternalBlue leaked by the ShadowBrokers targets only Windows XP and Windows 7 machines. Researchers at RiskSense who created the Windows 10 version of the attack were able to bypass mitigations introduced by Microsoft that thwart memory-based code-execution attacks."
Here is some more Tech News from around the web:
- Microsoft slaps down Kaspersky's Windows 10 antitrust complaint @ The Inquirer
- LifeTrak Zoom HRV Wearable Body Computer
- Fujitsu PC biz tie-in with Lenovo to happen 'soon' @ The Register
- Why You Must Patch the New Linux sudo Security Hole @ Linux.com
- Foxconn, Amazon, Apple join Toshiba chip plant feeding frenzy @ The Register
- iOS 11 ain't coming to the iPhone 5, iPhone 5C or iPad 4 @ The Inquirer
- TRENDnet TV-NVR104K 4-Channel HD PoE NVR Kit Review @ NikKTech