Putting the Ryzen 5 to work
Subject: Processors | June 9, 2017 - 03:02 PM | Jeremy Hellstrom
Tagged: amd, ryzen 5, productivity, ryzen 7 1800x, Ryzen 5 1500X, AMD Ryzen 5 1600, Ryzen 5 1600X, ryzen 5 1400
The Tech Report previously tested the gaming prowess of AMD's new processor family and is now delving into the performance of productivity software on Ryzen. Many users shopping for a Ryzen will be using it for a variety of non-gaming tasks such as content creation, coding, or even particle flow analysis. The story is somewhat different when looking through these tests: AMD takes the top spot in many benchmarks and in others is surpassed only by the Core i7-6700K, which in some tests leaves all competition in the dust by a huge margin. For budget-minded shoppers, the Ryzen 5 1600 barely trails both the i7-7700K and the 1600X in the productivity tests, making it a very good bargain for someone looking for a new system. Check out the full suite of tests right here.
"Part one of our AMD Ryzen 5 review proved these CPUs have game, but what happens when we have to put the toys away and get back to work? We ran all four Ryzen 5 CPUs through a wide range of productivity testing to find out."
Here are some more Processor articles from around the web:
- AMD Ryzen 5 1400 3.2 GHz @ techPowerUp
- Intel Skylake X and Kaby Lake X: Luke and Leo Discuss @ Kitguru
- Core i3-7350K @ Hardware Secrets
Why?
Astute readers of the site might remember the original story we did on Bitcoin mining in 2011, the good ol' days when the concept of the blockchain was new and exciting and mining Bitcoin on a GPU was still plenty viable.
However, that didn't last long, as the race for cash led people to develop Application-Specific Integrated Circuits (ASICs) dedicated solely to mining Bitcoin quickly while sipping power. The use of these expensive ASICs drove the difficulty of mining Bitcoin through the roof and killed any chance of profitability for mere mortals mining cryptocurrency.
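For anyone who never dug into what "difficulty" actually measures, mining boils down to a brute-force search: hash the candidate block together with a changing nonce until the result falls below a network-set target, and the lower that target, the more hashes (and electricity) a valid block costs on average. Below is a minimal, purely illustrative sketch of that loop; real Bitcoin mining hashes an 80-byte block header with double SHA-256, while std::hash here is only a stand-in.

```cpp
#include <cstdint>
#include <functional>
#include <iostream>
#include <string>

// Toy proof-of-work loop, only to illustrate what "difficulty" means: find a nonce
// so that hash(block data + nonce) falls below a target. The smaller the target,
// the more attempts a valid block costs on average. Real Bitcoin mining hashes an
// actual block header with double SHA-256; std::hash is just a stand-in here.
int main()
{
    const std::string block_data = "prev_hash|merkle_root|timestamp";
    const uint64_t target = UINT64_MAX >> 20;   // roughly 1 in a million hashes succeeds

    std::hash<std::string> hasher;
    for (uint64_t nonce = 0;; ++nonce) {
        const uint64_t h = hasher(block_data + std::to_string(nonce));
        if (h < target) {
            std::cout << "Found nonce " << nonce << " (hash " << h << ")\n";
            break;
        }
    }
    return 0;
}
```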
Cryptomining saw a resurgence in late 2013 with the popular adoption of alternate cryptocurrencies, specifically Litecoin, which was based on the Scrypt algorithm instead of SHA-256 like Bitcoin. This meant that the ASICs developed for mining Bitcoin were useless. This is also the period of time that many of you may remember as the "Dogecoin" era; Dogecoin remains my personal favorite cryptocurrency of all time.
Defenders of these new "altcoins" claimed that Scrypt was different enough that ASICs would never be developed for it, and GPU mining would remain viable for a larger portion of users. As it turns out, the promise of money always wins out, and we soon saw Scrypt ASICs. Once again, the market for GPU mining crashed.
That brings us to today, and what I am calling "Third-wave Cryptomining."
While the general populace stopped caring about cryptocurrency as a whole, the dedicated group that was left continued to develop altcoins. These different currencies are based on various algorithms and other proofs of work (see technologies like Storj, which uses the blockchain for a decentralized, Dropbox-like service!).
As you may have predicted, for reasons that might be difficult to quantify historically, this wave of development produced another very popular cryptocurrency: Ethereum.
Ethereum is based on the Dagger-Hashimoto algorithm and has a whole host of quirks that make it different from other cryptocurrencies. We aren't here to get deep in the woods on the methods behind different blockchain implementations, but if you have some time, check out the Ethereum White Paper. It's all very fascinating.
Continue reading our look at this third wave of cryptocurrency!
Windows 10 S ... the S could stand for secure
Subject: General Tech | June 9, 2017 - 02:29 PM | Jeremy Hellstrom
Tagged: Windows 10 S, security
Microsoft recently pointed out that their new lite version of Windows 10 for students, Windows 10 S, is completely immune to all known malware. This does make sense: the OS is simply unable to install anything that is not from the Windows Store, which does not host any official malware, even if some of the available programs are not entirely useful. That security will last as long as no one figures out a way to fake the file validation and the connection to Microsoft's online store, or manages to get a malware-infected file approved for sale on the store. Apple has had some experiences which prove that is not an impossibility. Pop by Slashdot for more.
You could also choose to go with the OS of choice for financial institutions and various other industries, Windows XP Embedded with the Enhanced Write Filter. Generally secure, and it can be reset with a simple reboot ... in most cases.
"However, if you want to guarantee your safety from ransomware, then Microsoft points out there's an even more secure option to consider -- Windows 10 S. The new, hardened Windows 10 variant only runs apps from the Windows Store, which means it can't run programs from outside Microsoft's ecosystem, and that includes malware. Which is why, as Microsoft says, "No known ransomware works against Windows 10 S."
Here is some more Tech News from around the web:
- Computex 2017: Corsair goes high-concept @
- Blackberry KeyOne consumer alert: You bend it, you break it @ The Inquirer
- Linksys WRT3200ACM AC3200 Wireless Router @ Kitguru
- Skype Retires Older Apps for Windows, Linux @ Slashdot
- COUGAR ARMOR Gaming Chair Review @ NikKTech
Logitech's G433 7.1 Gaming Headset: Stylish Looks and Pro-G Drivers for $99
Subject: General Tech | June 9, 2017 - 02:23 PM | Sebastian Peak
Tagged: wired, surround, Pro-G, logitech, headset, headphones, gaming, G433, DTS Headphone:X, drivers, 7.1
Logitech has released their latest surround gaming headphones with the wired G433 Gaming Headset, a 7.1-channel (via DTS Headphone:X) model that is the latest to use the company's Pro-G drivers.
The style of the new G433 is quite eye-catching, with four colors (black, red, blue, and blue camo) of a unique fabric finish that Logitech says is hydrophobic (repels water) for enhanced durability. The G433 primarily functions as an analog headset (with a 3.5 mm plug) unless the included USB DAC/headphone amp is used, giving PC users access to DTS Headphone:X surround up to 7.1 channels and customizable EQ via Logitech's Gaming Software. The microphone is a removable boom style with noise reduction to help improve voice clarity, and Logitech has used a 5-element double-grounded cable to eliminate crosstalk and prevent game audio from bleeding into voice.
The G433 arrives with an MSRP of $99, making the headset the least expensive Pro-G option to date, but this comparatively low price tag for a premium option still provides the buyer with a complete accessory pack including the USB DAC, alternate ear pads, two 3.5 mm audio cables (one with an inline mic), a 3.5 mm audio/mic Y-cable, and a fabric storage bag.
The Logitech G433 is available now, and with a pair on hand we will have a full review up very soon!
Samsung Announces FreeSync 2 HDR Displays, includes C49HG90 49-in UltraWide!
Subject: Displays | June 9, 2017 - 11:24 AM | Ryan Shrout
Tagged: Samsung, hdr, freesync 2, freesync, CHG90, CHG70, amd
Samsung made a surprise announcement this morning, taking the wraps off of the first FreeSync 2 monitors to grace our pages, officially. These gaming displays come in three different sizes, one of them incredibly unique, and all with HDR support and Quantum Dot technology to go along with the variable refresh rate technology of FreeSync.
All three displays utilize the QLED Quantum Dot tech first showcased in the QLED TV lineup launched just this past January at CES. It uses a new metal core and has some impressive color quality capabilities, hitting 125% of the sRGB color space and 95% of the DCI-P3 color space! I don't yet know what the peak luminance is, or how many backlight zones there might be for HDR support, but I have asked Samsung for clarification and will update here when I get feedback. All three displays use VA panels.
All three displays also become the first to pass certification with AMD for FreeSync 2, which we initially detailed WAY BACK in January of this year. FreeSync 2 should tell us that this display meets some minimum standards for latency, color quality, and low frame rate compensation. These are all great on paper, though I am still looking for details from AMD on what exactly the minimum standards have been set to. At the time, AMD would only tell me that FreeSync 2 displays "will require a doubling of the perceivable brightness and doubling of the viewable color volume based on the sRGB standards."
The bad boy of the group, the Samsung CHG90 (part number C49HG90), is easily the most interesting. It comes in with a staggering screen size of 49-inches and a brand new 32:9 aspect ratio with an 1800R curvature. With a 3840x1080 resolution, I am eager to see this display in person and judge how the ultra-wide design impacts our gaming and our productivity capability. (They call this resolution DFHD, for double full HD.) The refresh rate peaks at 144 Hz and a four-channel scanner is in place to minimize any motion blur or ghosting. A 1ms rated response time also makes this monitor incredibly impressive, on paper. Price for the C49HG90 is set at $1499 with preorders starting today on Amazon.com. (Amazon lists a June 30th release date, but I am looking to clarify.)
Also on the docket is the CHG70, available in two sizes, a 32-in (C32HG70) and a 27-in (C27HG70) model. Both are 2560x1440 resolution screens with 16:9 aspect ratios, 1ms response times and FreeSync 2 integrations. That means the same 125% sRGB and 95% DCI-P3 color space support along with the Samsung Quantum Dot technology. Both will sport a 144 Hz refresh rate and an 1800R curvature. The specifications are essentially identical between all three models, making the selection process an easier choice based on price segment and screen real estate. The C27HG70 will be on preorder from Samsung.com exclusively for $599 while the C32HG70 will be on preorder at Newegg.com for $699, just $100 more.
All three displays will feature a Game Mode to optimize image settings for...gaming.
Samsung’s CHG90 extends the playing field for virtual competitors, with its 49-inch design representing the widest gaming monitor available. The monitor delivers a dramatic 1,800R curvature and an ultra-wide 178-degree viewing angle, ensuring that content is clearly visible from nearly any location within a given space. As a result, gamers no longer need to worry about the logistics, expenses, and bezel interference that occur when combining multiple smaller monitors together for an expanded view.
The new CHG90 monitor includes a height adjustable stand (HAS), allowing flexible height adjustment for improved viewing comfort. Designed for the most demanding games, the CHG70 monitor goes a step further with a dual-hinge stand that provides users more precise control over how the display panel is positioned.
In addition to Game Mode, a feature that optimizes image setting for playing games when connected to a PC or a game console, each of the new monitors include a game OSD dashboard, designed to blend seamlessly into game interfaces.
A full table of specifications is below and trust me on this one guys, I am already up in Samsung's and AMD's face to get these monitors in here for review!
Now all we are missing is the power of a Radeon RX Vega card to push this high resolution, high refresh rate HDR goodness!!
Radeon Software Crimson ReLive Edition 17.6.1 - Prey and DiRT 4
Subject: Graphics Cards | June 8, 2017 - 05:26 PM | Jeremy Hellstrom
Tagged: radeon, Crimson Edition 17.6.1, amd
In the very near future AMD will be releasing an updated driver, focused on improving performance in Prey and DiRT 4.
For DiRT 4 it will enable a Multi GPU profile and deliver up to a 30% performance improvement when using 8xMSAA on a Radeon RX 580 8GB, compared to the previous release.
AI to the rescue? Microsoft assimilates the security company Hexadite
Subject: General Tech | June 8, 2017 - 12:42 PM | Jeremy Hellstrom
Tagged: microsoft, hexadite, windows defender, security
If you have never heard of Hexadite you are not alone; the online security company was formed in 2014, headquartered in Boston but based in Tel Aviv. It was just purchased by Microsoft for around $100 million so that Microsoft can integrate Hexadite's Automated Incident Response Solution (AIRS) into their Windows Defender Advanced Threat Protection. AIRS is not antivirus software; instead it is a tool that integrates with existing software and monitors for alerts. Once an alert is detected the tool automatically investigates it and searches for solutions, in theory saving your security team's sanity by vastly reducing the number of alerts they must deal with directly. It will be interesting to see if this has an effect on the perception of companies and users as to the effectiveness of Windows Defender.
"Hexadite's technology and talent will augment our existing capabilities and enable our ability to add new tools and services to Microsoft's robust enterprise security offerings."
Here is some more Tech News from around the web:
- Museum of Failure will help us learn from our 404s @ The Inquirer
- Raspberry Pi Malware Mines BitCoin @ Hack a Day
- AMD Threadripper and Vega: Luke and Leo discuss @ Kitguru
- MediaTek considers placing chip orders with Globalfoundries @ DigiTimes
- Pop-up Android adware uses social engineering to resist deletion @ The Register
Overview
It's difficult to believe that it's only been a little over 2 years since we got our hands on the revised Dell XPS 13. Placing an emphasis on minimalistic design, large displays in small chassis, and high-quality construction, the Dell XPS 13 seems to have influenced the "thin and light" market in some noticeable ways.
Aiming their sights at a slightly different corner of the market, this year Dell unveiled the XPS 13 2-in-1, a convertible tablet with a 360-degree hinge. However, instead of just putting a new hinge on the existing XPS 13, Dell has designed the all-new XPS 13 2-in-1 from the ground up to be even more "thin and light" than its older sibling, which has meant some substantial design changes.
Since we are a PC hardware-focused site, let's take a look under the hood to get an idea of what exactly we are talking about with the Dell XPS 13 2-in-1.
| | Dell XPS 13 2-in-1 |
|---|---|
| MSRP | $999 / $1199 / $1299 / $1399 |
| Screen | 13.3” FHD (1920 x 1080) InfinityEdge touch display |
| CPU | Core i5-7Y54 / Core i7-7Y75 |
| GPU | Intel HD Graphics 615 |
| RAM | 4GB / 8GB / 16GB |
| Storage | 128GB SATA / 256GB PCIe |
| Network | Intel 8265 802.11ac MIMO (2.4 GHz, 5.0 GHz), Bluetooth 4.2 |
| Display Output | 1 x Thunderbolt 3 |
| Connectivity | USB 3.0 Type-C, 3.5mm headphone, USB 3.0 x 2 (MateDock) |
| Audio | Dual Array Digital Microphone, Stereo Speakers (1W x 2) |
| Weight | 2.7 lbs (1.24 kg) |
| Dimensions | 11.98-in x 7.81-in x 0.32-0.54-in (304 mm x 199 mm x 8-13.7 mm) |
| Battery | 46 WHr |
| Operating System | Windows 10 Home / Pro (+$50) |
One of the more striking design decisions from a hardware perspective is the choice of the low-power Core i5-7Y54 processor, or as you may be familiar with it from its older naming scheme, Core M. In the Kaby Lake generation, Intel has decided to drop the Core M branding (though oddly Core m3 still exists) and integrate these lower power parts into the regular Core branding scheme.
Click here to continue reading our review of the Dell XPS 13 2-in-1
Podcast #453 - More Computex, WWDC, 3D Xpoint, and more
Subject: General Tech | June 8, 2017 - 11:22 AM | Alex Lustenberg
Tagged: X399, x370, x299, wwdc, video, shield, podcast, plex, pixel, macbook, Mac Pro, Logitech G413, Lian-Li, gigabyte, computex, asus, asrock, apollo lake, 3D XPoint
PC Perspective Podcast #453 - 06/07/17
Join us for talk about continued Computex 2017 coverage, WWDC '17, and more!
You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, Allyn Malventano
Peanut Gallery: Alex Lustenberg, Ken Addison
Week in Review:
Computex Continued
WWDC 2017:
News items of interest:
1:10:50 Honey, I shrunk the silicon
Hardware/Software Picks of the Week
Josh: So cheap, so soft...
Allyn: Star Trek Bridge Crew
Closing/outro
Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!
IBM Announces 5nm Breakthrough with Silicon Nanosheet Technology
Subject: General Tech | June 7, 2017 - 09:31 PM | Josh Walrath
Tagged: silicon nanosheet, Samsung, IBM, GLOBALFOUNDRIES, FinFET, 5nm
It seems only yesterday that we saw Intel introduce their 22nm FinFET technology, and now we are going all the way down to 5nm. That is obviously an exaggeration; the march of process technology has been more than a little challenging for everyone in the industry over the past 5+ years. Intel has made it look a little easier by being able to finance these advances a little better than the pure-play foundries, but that does not mean they have not experienced challenges of their own.
We have seen some breakthroughs these past years, with everyone jumping onto FinFETs as TSMC, Samsung, and GLOBALFOUNDRIES introduced their own processes. GLOBALFOUNDRIES initially had set out on its own, but that particular endeavor did not pan out. They ended up licensing Samsung's 14nm processes (LPE and LPP) to start producing chips of their own, primarily for AMD's graphics chips and this latest generation of Ryzen CPUs.
These advances have not been easy. While FinFETs are needed at these smaller nodes to provide the performance and power efficiency required at such transistor densities, the technology will not last forever. The 10nm and 7nm lines will continue to use them, but many believe that while densities will improve, the power characteristics will start to lag behind. The theory is that past the 7nm nodes traditional FinFETs will no longer work as desired. This is very reminiscent of the sub-28nm processes that attempted to use planar structures on bulk silicon. In that case the chips could be made, but power issues plagued the designs and eventually support for those process lines was dropped.
IBM and their research partners Samsung and GLOBALFOUNDRIES, working at the SUNY Polytechnic Institute Colleges of Nanoscale Science and Engineering's NanoTech Complex in Albany, NY, have announced a breakthrough: a new "Gate-All-Around" architecture made on a 5nm process. FinFETs are essentially a rectangular channel surrounded on three sides by the gate, giving the transistor its "fin" physical characteristic. This new technology covers the fourth side as well and embeds the channels in nanosheets of silicon.
The problem with FinFETs is that they will eventually be unable to scale power as transistors get closer and closer together. While density scales, power and performance will get worse compared to previous nodes. The 5nm silicon nanosheet technology gives a significant boost to power and efficiency, thereby doing to FinFETs what FinFETs did to planar structures at the 20/22nm nodes.
One of the working EUV litho machines at SUNY Albany.
IBM asserts that the average chip the size of a fingernail can contain up to 30 billion transistors and continue to see the density, power, and efficiency improvements that we would expect with a normal process shrink. The company expects these process nodes to start rolling out in a 2019 time frame if all goes as planned.
There are few details on how IBM was able to achieve this result, but we do know a couple of things about it. EUV lithography was used extensively to avoid the multi-patterning nightmare this would otherwise entail. For the past two years ASML has been installing 100 watt EUV litho machines for select clients throughout the world, and one of these is located on the SUNY Albany campus where this research was done. We also know that deposition was done layer by layer with silicon and the other materials.
What we don't know is how long it takes to create a complete wafer. Usually these test wafers are packed full of SRAM and very little logic; it is a useful test and creates a baseline for many structures that will eventually be applied to this process. Considering how the layers look to be deposited, producing such a wafer likely takes a long, long time with current tools and machinery. Cutting-edge wafers in production can take upwards of 16 weeks to complete, and I hesitate to even guess how long each test wafer takes. Because of the very 3D nature of the design, I am also curious how the litho stages work and how many passes are needed to complete the design.
This looks to be a very significant advancement in process technology that should be mass produced in the timeline suggested by IBM. It is a significant jump, but it seems to borrow a lot of previous FinFET structures. It does not encompass anything exotic like “quantum wells”, but is able to go lower than the currently specified 7nm processes that TSMC, Samsung, and Intel have hinted at (and yes, process node names should be taken with a grain of salt from all parties at this time). IBM does appear to be comparing this to what Samsung calls its 7nm process in terms of dimensions and transistor density.
Cross section of a 5nm transistor showing the embedded channels and silicon nanosheets.
While Moore’s Law has been stretched thin as of late, we are still seeing these scientists and engineers pushing against the laws of physics to achieve better performance and scaling at incredibly small dimensions. The silicon nanosheet technology looks to be an effective and relatively affordable path towards smaller sizes without requiring exotic materials to achieve. IBM and its partners look to have produced a process node that will continue the march towards smaller, more efficient, and more powerful devices. It is not exactly around the corner, but 2019 is close enough to start planning designs that could potentially utilize this node.
Qt Outlines What Their View on Vulkan Support Is
Subject: General Tech | June 7, 2017 - 09:10 PM | Scott Michaud
Tagged: Qt, vulkan
During our recent interview, the Khronos Group mentioned that one reason to merge OpenCL into Vulkan was because, at first, the OpenCL working group wasn't sure whether they wanted an explicit, low-level API or an easy-to-use one that hides the complexity. Vulkan taught them to take a very low-level position, because there can always be another layer above them that hides complexity from everything downstream of it. This is important for them, because the only layers below them are owned by OS and hardware vendors.
This post is about Qt, though. Qt is a UI middleware, written in C++, that has become very popular as of late. The big revamp of AMD’s control panel with Crimson Edition was a result of switching from .NET to Qt, which greatly sped up launch time. They announced their intent to support the Vulkan API on the very day that it launched.
Yesterday, they wrote a blog post detailing their intentions for Vulkan support in Qt 5.10.
First and foremost, their last bullet point notes that these stances can change as the middleware evolves, particularly with Qt Quick, Qt 3D, Qt Canvas 3D, QPainter, and similar classes; this is a discussion of their support in Qt 5.10 specifically. As it stands, though, Qt intends to focus on cross-platform support, window management, and "function resolving for the core API". The application is expected to manage the rest of the Vulkan API itself (or, of course, use another helper for the other parts).
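To give a feel for where that line is drawn, here is a minimal sketch using the Vulkan convenience classes that ship with Qt 5.10 (QVulkanInstance, QVulkanWindow, QVulkanWindowRenderer). Qt handles instance creation, window management, and function resolving; everything inside startNextFrame(), meaning pipelines, command buffers, and so on, is the application's problem. The do-nothing renderer below is purely illustrative, not an example from Qt's own post.

```cpp
#include <QGuiApplication>
#include <QVulkanInstance>
#include <QVulkanWindow>

// Minimal renderer: Qt hands us a ready VkInstance/VkDevice and a window surface;
// the application records its own Vulkan commands in startNextFrame().
class MinimalRenderer : public QVulkanWindowRenderer
{
public:
    explicit MinimalRenderer(QVulkanWindow *w) : m_window(w) {}

    void startNextFrame() override
    {
        // Real Vulkan work (render passes, command buffers, pipelines) would go here,
        // using the functions Qt resolves for the core API.
        m_window->frameReady();     // tell Qt this frame's submission is complete
        m_window->requestUpdate();  // schedule the next frame
    }

private:
    QVulkanWindow *m_window;
};

class MinimalWindow : public QVulkanWindow
{
public:
    QVulkanWindowRenderer *createRenderer() override
    {
        return new MinimalRenderer(this);
    }
};

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    QVulkanInstance inst;            // Qt creates and owns the VkInstance
    if (!inst.create())
        qFatal("Failed to create Vulkan instance");

    MinimalWindow window;
    window.setVulkanInstance(&inst); // associate the window with the instance
    window.resize(800, 600);
    window.show();

    return app.exec();
}
```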
This makes sense for Qt’s position. Their lowest level classes should do as little as possible outside of what their developers expect, allowing higher-level libraries the most leeway to fill in the gaps. Qt does have higher-level classes, though, and I’m curious what others, especially developers, believe Qt should do with those to take advantage of Vulkan. Especially when we start getting into WYSIWYG editors, like Qt 3D Studio, there is room to do more.
Obviously, the first release isn't the place to do it, but I'm curious nonetheless.
G.Skill Memory Used In World Record Breaking DDR4 5,500 MHz Overclock
Subject: Memory | June 7, 2017 - 08:49 PM | Tim Verry
Tagged: G.Skill, overclocking, ddr4, x299, liquid nitrogen, computex
Amidst the flood of new product announcements at Computex, G.Skill was busy hosting an overclocking competition where its memory was used in a record-breaking overclock that saw DDR4 clocked at an impressive 5,500 MHz. Professional overclocker Toppc broke his 5,000 MHz record from last year with the new overclock, which was accomplished on Intel’s X299 platform.
Toppc used an MSI X299 Gaming Pro Carbon AC motherboard, an Intel Core X-series processor, G.Skill DDR4 memory built using Samsung 8Gb ICs, and, of course, copious amounts of liquid nitrogen! Looking at the HWBot page, it appears Toppc specifically used an Intel Core i7-7740K (Kaby Lake X) processor and 8GB of G.Skill Trident Z RGB RAM (CL 14-14-14-14 stock). Both the CPU and memory modules were cooled with liquid nitrogen for the overclock. The CPU-Z screenshot shows the processor running 1 core / 2 threads with a 133.06 MHz bus speed. It also shows an 8x multiplier and a core speed of 1064.46 MHz, but I question whether it is reading the Kaby Lake X part correctly, as running at those speeds wouldn't need such exotic cooling; perhaps it is needed to run at the 133.06 MHz bus speed and to keep the memory controller from overheating (or melting, hehe).
G.Skill is currently pushing the envelope on standard air cooled DIMMs with a prototype kit hitting 4,800 MHz. The company's CVP Tequila Huang stated in a press release:
“We are seeing amazing overclocking potential for these newly released hardware and we believe that more overclocking benchmark records will be achieved very soon by professional overclockers worldwide."
I am interested to see if it will have any additional headroom in the memory overclocking department and if so how long the 5.5 GHz world record will stand.
Dawn of War III Vulkan Support on Linux to Add Intel GPUs
Subject: General Tech | June 7, 2017 - 04:54 PM | Scott Michaud
Tagged: pc gaming, linux, vulkan, Intel, mesa, feral interactive
According to Phoronix, Alex Smith of Feral Interactive has just published a few changes to the open-source Intel graphics driver which allow their upcoming Dawn of War III port for Linux to render correctly on Vulkan. This means that the open-source Intel driver should support the game on day one, although drawing correctly and drawing efficiently could be two very different things -- or maybe not, we’ll see.
It’s interesting seeing things go in the other direction. Normally, graphics engineers parachute in to high-end developers and help them make the most of their software for each respective, proprietary graphics driver. In this case, we’re seeing the game studios pushing fixes to the graphics vendors, because that’s how open source rolls. It will be interesting to do a pros and cons comparison of each system one day, especially if cross-pollination results from it.
Roccat's newest Kone, the EMP gaming mouse
Subject: General Tech | June 7, 2017 - 03:57 PM | Jeremy Hellstrom
Tagged: input, roccat, Kone EMP, gaming mouse
Roccat's new Kone EMP shares some attributes with earlier members of the Kone lineup, specifically the Owl-Eye optical sensor based on PixArt’s PMW3361DM, which can be set at up to 12,000 DPI, and the SWARM software suite used to program the mouse. The onboard ARM Cortex-M0 and 512kB of memory allow the mouse to keep that programming, even on another machine which does not have SWARM installed. Modders-Inc tested the mouse out; see what they thought of it here.
"The Roccat Kone EMP is the next mouse in the Kone line up and the successor to the Kone XTD. The Kone EMP features Roccat's OWL-Eye optical sensor and four RGB LEDs for custom lighting."
Here is some more Tech News from around the web:
- Corsair GLAIVE RGB USB Gaming Mouse @ Benchmark Reviews
- Patriot Viper V770 Mechanical RGB Keyboard @ techPowerUp
- G.SKILL RIPJAWS KM570 MX Mechanical Gaming Keyboard Review @ NikKTech
A chat with Paradox on sequels and expansions
Subject: General Tech | June 7, 2017 - 01:21 PM | Jeremy Hellstrom
Tagged: gaming, paradox
Paradox is well named, as it has a very different philosophy from the rest of the industry about how to treat games after they have been released. It is becoming quite common for developers to already be working on a sequel to a game that they have just released, or are in the process of releasing. Once a game has launched you can expect to see numerous, often expensive DLC packs released for it, which usually offer little to no real new gameplay or functionality.
Paradox treats games completely differently; their DLC expansions are often expensive but frequently offer a significant change to the base game, and each release also adds several major new features, free of charge, for anyone who owns the game. They do this for a long time after launch: Crusader Kings II is five years old and has twelve expansions, while the four-year-old Europa Universalis IV has ten. Rock, Paper, SHOTGUN sat down with creative director Johan Andersson and CEO Fredrik Wester to discuss the future of these games and of Paradox itself, as well as the effects of offering major updates to older games as opposed to the more common constant release of sequels.
"With Crusader Kings II now five years old and twelve expansions deep, and Europa Universalis IV a relatively sprightly four years and ten expansions, what is the future of these titles? At what point are they done and at what point does the thought of a sequel come up."
Here is some more Tech News from around the web:
- Total War: Warhammer 2 is taking everything further @ Rock, Paper, SHOTGUN
- Arms review: Nintendo reinvents the fighting game and it’s brilliant @ Ars Technica
- BattleTech is the mech game I’ve always wanted @ Rock, Paper, SHOTGUN
- Battleborn free downloadable experience launched @ HEXUS
- Ealdorlight is a procedural storytelling fantasy RPG @ Rock, Paper, SHOTGUN
- Brigador: Up-Armored Edition ‘relaunches’ the mech combat game @ Rock, Paper, SHOTGUN
Coming as a shock to no one, Wannacry can exploit Windows 10
Subject: General Tech | June 7, 2017 - 12:42 PM | Jeremy Hellstrom
Tagged: wannacry, windows 10, security
If you have an unpatched Windows installation you are vulnerable to the SMBv1 exploit, except perhaps if you are still on WinXP, in which case your machine is more likely to crash than to start encrypting. Do yourself a favour and head to Microsoft to manually download the patch appropriate for your OS and run it; if you already have the fix it will tell you so, otherwise it will repair the vulnerability. The version of Wannacry, and its progenitor EternalBlue, which is making life miserable for users and techs everywhere does not currently go after Win10 machines, but you can read how it can easily be modified to do so over at Slashdot.
"The publicly available version of EternalBlue leaked by the ShadowBrokers targets only Windows XP and Windows 7 machines. Researchers at RiskSense who created the Windows 10 version of the attack were able to bypass mitigations introduced by Microsoft that thwart memory-based code-execution attacks."
Here is some more Tech News from around the web:
- Microsoft slaps down Kaspersky's Windows 10 antitrust complaint @ The Inquirer
- LifeTrak Zoom HRV Wearable Body Computer
- Fujitsu PC biz tie-in with Lenovo to happen 'soon' @ The Register
- Why You Must Patch the New Linux sudo Security Hole @ Linux.com
- Foxconn, Amazon, Apple join Toshiba chip plant feeding frenzy @ The Register
- iOS 11 ain't coming to the iPhone 5, iPhone 5C or iPad 4 @ The Inquirer
- TRENDnet TV-NVR104K 4-Channel HD PoE NVR Kit Review @ NikKTech
GOG.com Summer Sale Has Just Begun
Subject: General Tech | June 7, 2017 - 07:02 AM | Scott Michaud
Tagged: sale, pc gaming, GOG
GOG.com, formerly Good Old Games, because good old names, is having their summer sale. Discounts are advertised at up to 90%, and a copy of Rebel Galaxy will be gifted immediately following your first purchase.
For me, this was the moment that I jumped on The Witcher 3. I deliberately avoided it until the DLC were bundled and the whole package was on a significant discount, which is now the case. The Witcher 3 Game of the Year, albeit not the year that we’re in, is now 50% off. Another front-page deal is Dragon Age Origins Ultimate Edition for 80% off, as is the original Mirror’s Edge, although I already have both of them. If you haven’t played it yet, Brothers: A Tale of Two Sons is great, and it’s 85% off (under $2).
MSI Unveils Fanless Cubi 3 PC Powered By Kaby Lake-U Processors
Subject: General Tech | June 7, 2017 - 02:35 AM | Tim Verry
Tagged: msi, SFF, barebones, nuc, kaby lake, Intel, Optane, computex
MSI recently introduced a new member of its Cubi small form factor barebones PC lineup. The Cubi 3 is a fanless PC that is built around Intel’s Kaby Lake-U processors and will arrive sometime this fall.
Notebook Italia and Tek.No got hands-on time with the MSI mini PC at Computex.
The Cubi 3 is a bit larger than its predecessors, but with the larger enclosure MSI was able to achieve a fanless design for up to (U series) Core i7 processors. The SFF PC sports a brushed aluminum case that shows off the top of the CPU heatsink through vents that run around the top edge of the case. There are two flat antennas for Wi-Fi and Bluetooth integrated into the left and right sides of the case.
FanlessTech reports that the MSI Cubi 3 will sport 15W Kaby Lake-U processors from low-end Celerons up to Core i7 models. These are dual-core parts with Hyper-Threading (2c/4t), 3 MB or 4 MB of L3 cache, and either HD (615 or 620) or Iris Plus (640 or 650) integrated graphics. The processor is paired with two DDR4 SO-DIMM slots for up to 32 GB of 2133 MHz memory, an M.2 2280 SSD (there is even Intel Optane support), and a single 2.5” drive.
The Cubi 3 has an audio jack and two USB 3.0 ports up front, and what appears to be two USB 2.0 ports on the left side. Rear I/O includes one HDMI, one DisplayPort, two more USB 3.0, two Gigabit Ethernet, two COM ports, and one power jack for the 65W AC power adapter.
There is no word on pricing yet, but it is slated to begin production in August with availability this fall.
It is always nice to see more competition in this niche fanless SFF space, and the little box would not look out of place on a desk or even in the living room. What are your thoughts?
Micron Pushes GDDR5X To 16Gbps, Expects To Launch GDDR6 In Early 2018
Subject: Memory | June 7, 2017 - 01:02 AM | Tim Verry
Tagged: micron, gddr6, gddr5x
JEDEC made the GDDR5X memory standard official almost a year and a half ago, launching at 10 Gbps and quickly hitting 12 Gbps. Set to bridge the gap between GDDR5 and the upcoming GDDR6, the “G5X” standard is quickly catching up to and matching the speeds that GDDR6 will run at.
Specifically, Micron’s Graphics Design Team in Munich was able to achieve an impressive 16 Gbps in their high-speed test environment. The team was able to hit 16 Gbps on a “meaningful sampling” of its mass production GDDR5X silicon, which makes the feat much more impressive as it means these higher speeds are moving closer to reality than theory. Micron measured a PRBS11 (pseudorandom binary sequence) pattern read at 16 Gbps using an oscilloscope and also showed off a chart that compared the stable data rate timing margin versus data rate from 10 Gbps to 16 Gbps.
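For the curious, a PRBS11 pattern is typically generated with an 11-bit linear-feedback shift register; the sketch below assumes the standard x^11 + x^9 + 1 polynomial and is only meant to illustrate the kind of stress pattern driven across the bus, not Micron's actual test setup.

```cpp
#include <cstdint>
#include <cstdio>

// Illustrative PRBS11 generator: an 11-bit Fibonacci LFSR using the standard
// x^11 + x^9 + 1 polynomial. The sequence repeats every 2^11 - 1 = 2047 bits and
// exercises worst-case bit transitions, which is why test gear uses it to check
// timing margin at high data rates.
int main()
{
    uint16_t lfsr = 0x7FF;                              // any non-zero 11-bit seed
    for (int i = 0; i < 64; ++i) {
        const int out = (lfsr >> 10) & 1;               // output the oldest bit
        const int fb  = ((lfsr >> 10) ^ (lfsr >> 8)) & 1; // XOR of taps 11 and 9
        lfsr = static_cast<uint16_t>(((lfsr << 1) | fb) & 0x7FF);
        std::printf("%d", out);
    }
    std::printf("\n");
    return 0;
}
```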
In addition to teasing the 16 Gbps memory speed (it will be a while yet before we see products like graphics cards running memory at those speeds), Micron announced that it expects to begin mass production of GDDR6 chips in early 2018. GDDR6 will see a new (larger) FBGA1180 package, faster base sort speeds (GDDR6 will start at 12 Gbps vs G5X's 10 Gbps), and a move to a dual channel approach with channels that have half as many I/O links (GDDR5X is x16/x32 while GDDR6 will be x8/x16 per channel). It will be interesting to see how this move will stack up to G5X, but in theory Micron will be able to push clocks even higher (maybe even higher than 16 Gbps) by having more but simpler channels (and it may be easier for graphics card manufacturers to wire up their cards to the memory chips).
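For a rough sense of what a per-pin figure like 16 Gbps means at the card level, here is a back-of-the-envelope calculation; the 256-bit bus width below is a hypothetical chosen only because it is common on mid-range and high-end cards, not anything Micron or the GPU vendors have announced for these chips.

```cpp
#include <cstdio>

int main()
{
    // Hypothetical example: total memory bandwidth is roughly per-pin data rate
    // multiplied by the width of the memory interface. A 256-bit bus at 16 Gbps/pin:
    const double gbps_per_pin = 16.0;
    const int bus_width_bits  = 256;
    const double bandwidth_gb_per_s = gbps_per_pin * bus_width_bits / 8.0; // bits -> bytes
    std::printf("%.0f GB/s\n", bandwidth_gb_per_s); // prints 512 GB/s
    return 0;
}
```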
SK Hynix, who showed off its first GDDR6 chip at GTC, appears to be following the same I/O design as Micron with two channel memory at x8 or x16 per channel.
Are you ready for faster GDDR5X? Hopefully these new faster G5X chips come out soon to give AMD and NVIDIA a more appealing alternative to HBM and HBM2 for mid-range and high end consumer graphics cards since High Bandwidth Memory seems to still be suffering from limited supply and is holding the GPU guys back on being able to crank up the production lines!
Valve Ends Steam Greenlight Program
Subject: General Tech | June 6, 2017 - 08:48 PM | Scott Michaud
Tagged: valve, steam, pc gaming
As of today, June 6th, Valve has closed their Greenlight program. New submissions will not be accepted and voting has been disabled. Next week, starting on June 13th, Valve will open Steam Direct, which allows anyone to put their game on the platform for a deposit of $100 per title, refunded once the title makes $1,000 in sales. Valve performs a light amount of testing on each game it receives, so it makes sense to have something that prevents them from drowning when the flood gates open, and it's nice that they refund the deposit once sales are high enough that their typical fees cover their expenses, rather than double-dipping.
There is still some doubt floating around the net, though... especially regarding developers from impoverished nations. As a Canadian, it’s by no means unreasonable to spend around a hundred dollars, plus or minus the exchange rate of the year, to put a game, made up of years of work, onto a gigantic distribution platform. That doesn’t hold true everywhere. At the same time, Valve does have a measurable cost per submission, so, if they lower the barrier below that, it would be at their expense. It would also be the right thing to do in some cases. Either way, that’s just my unsolicited two cents.
Steam Direct opens on June 13th.