All | Editorial | General Tech | Graphics Cards | Networking | Motherboards | Cases and Cooling | Processors | Chipsets | Memory | Displays | Systems | Storage | Mobile | Shows and Expos
Subject: General Tech | August 17, 2016 - 02:02 PM | Jeremy Hellstrom
Tagged: Enderal, SureAI, mod, skyrim
Enderal: The Shards of Order is a total conversion of the Steam version of Skyrim into a completely new game set in a brand new world. The mod is 8 GB in size and requires a separate launcher, both available at Enderal.com, and you can expect between 30 and 100 hours of playtime. You may remember this team from Nehrim, their previous total conversion mod for Oblivion. Rock, Paper, SHOTGUN have covered this mod previously, however until now it was only available in German. The full English version, including voice acting, is now complete and ready for you to dive into. You might want to consider unmodding your Skyrim before installing; the mod creates a copy of the Skyrim installation, so you can restore your Thomas the Tank Engine mod once you are set up.
"A final version of Enderal: The Shards of Order has been completed and can be downloaded for free now. While ‘Enderal’ sounds like it could be something made by a United States pharmaceutical company, it is actually a massive total conversion mod for Skyrim, not just adding new weapons or turning it into a survival game, but creating a whole new RPG using the raw materials of its parent."
Here is some more Tech News from around the web:
- Final Fantasy 15 hands-on: Brave new direction or just pandering to fans? @ Ars Technica
- MOO! Master Of Orion Reboot Leaves Early Access @ Rock, Paper, SHOTGUN
- The Legend Of THQnix: Dead Publisher Is Back, Kinda @ Rock, Paper, SHOTGUN
- AMD & NVIDIA GPU VR Performance: Valve's Robot Repair @ [H]ard|OCP
- Sunless Sea’s Former Lead Now Writing For Stellaris @ Rock, Paper, SHOTGUN
- Hands On: Dawn Of War III @ Rock, Paper, SHOTGUN
- Battlefield 1 Open Beta starts from 31st August @ HEXUS
- BioShock Original And Remastered Graphics Comparison @ Rock, Paper, SHOTGUN
- Without Kojima, Metal Gear becomes a multiplayer zombie action game @ Ars Technica
- Beyond No Man's Sky: 10 of the best games still to come in 2016 @ The Inquirer
Subject: Graphics Cards, Processors | August 17, 2016 - 01:38 PM | Scott Michaud
Tagged: Xeon Phi, larrabee, Intel
Tom Forsyth, who is currently at Oculus, was once on the core Larrabee team at Intel. Just prior to Intel's IDF conference in San Francisco, which Ryan is attending and covering as I type this, Tom wrote a blog post that outlined the project and its design goals, including why it never hit the market as a graphics device. He even goes into the details of the graphics architecture, which was almost entirely in software apart from the texture units and video out. For instance, Larrabee ran FreeBSD with a program, called DirectXGfx, that gave it the DirectX 11 feature set -- and it worked on hundreds of titles, too.
Also, if you found the discussion interesting, then there is plenty of content from back in the day to browse. A good example is an Intel Developer Zone post from Michael Abrash that discussed software rasterization, doing so with several really interesting stories.
Subject: General Tech | August 17, 2016 - 12:41 PM | Jeremy Hellstrom
Tagged: nvidia, Intel, HPC, Xeon Phi, maxwell, pascal, dirty pool
There is a spat going on between Intel and NVIDIA over the slide below, as you can read about over at Ars Technica. It seems that Intel has reached into the industry's bag of dirty tricks and polished off an old standby: testing new hardware and software against older products from a competitor. In this case the products were high performance computing parts, Intel's new Xeon Phi against NVIDIA's Maxwell, compared using an older version of the Caffe AlexNet benchmark.
NVIDIA points out that not only would they have done better than Intel if an up-to-date version of the benchmarking software had been used, but that the comparison should have been against their current architecture, Pascal. This is not quite as bad as putting undocumented flags into compilers to reduce the performance of competitors' chips, or predatory discount programs, but it shows that the computer industry continues to have only a passing acquaintance with fair play and honest competition.
"At this juncture I should point out that juicing benchmarks is, rather sadly, par for the course. Whenever a chip maker provides its own performance figures, they are almost always tailored to the strength of a specific chip—or alternatively, structured in such a way as to exacerbate the weakness of a competitor's product."
Here is some more Tech News from around the web:
- USB Implementers Forum introduces branding for safe USB-C charging @ The Inquirer
- Some Windows 10 Anniversary Update: SSD freeze @ The Register
- Intel Project Alloy: all-in-one VR headset takes aim at Google's Project Daydream @ The Inquirer
- Wanna build your own drone? Intel emits Linux-powered x86 brains for DIY flying gizmos @ The Register
- Intel's Optane XPoint DIMMs pushed back – source @ The Register
Subject: Cases and Cooling | August 17, 2016 - 11:43 AM | Sebastian Peak
Tagged: cooler master, MasterLiquid Maker 92, AIO, liquid cooler, self contained, convertible
Cooler Master has introduced an unusual all-in-one liquid CPU cooler with their new MasterLiquid Maker 92, a design which places all of the components together on top of the CPU block.
We've seen a similar idea from Corsair with the cooler first found in the Bulldog system, and later introduced separately as the H5 SF mini-ITX liquid cooler. Cooler Master's design uses a different arrangement, with push-pull 92mm fans sandwiching a radiator that rotates 90° to permit either a vertical or horizontal setup. The latter position allows for better low-profile chassis compatibility, and also adds airflow to motherboard components.
- Model: MLZ-H92M-A26PK-R1
- CPU: Intel LGA 2011-v3/ 2011/ 1151/ 1150/ 1155/ 1156 socket
- Power Connector : SATA and 4-Pin
- Radiator Material: Aluminum
- Vertical: 99.9 x 81.6 x 167.5mm (3.9 x 3.2 x 6.6”)
- Horizontal: 99.9 x 142 x 118.8 mm (3.9 x 5.6 x 4.7”)
- Fan Dimensions: Φ95 x 25.4 mm (3.7 x 1”)
- Airflow: 49.7 CFM (max)
- Air Pressure: 6.4 mmH2O (max)
- Noise Level (fan): 30 dBA (max)
- Noise Level (pump): <12 dBA (max)
- MTTF (pump): 175,000 hours
- L-10 Life (fan): 50,000 hours
- Rated Voltage: 12VDC
- Warranty: 5 Years
Cooler Master is offering pre-orders on a first-come, first-served basis beginning August 30 from this page. Pricing is not listed.
Subject: General Tech | August 16, 2016 - 08:10 PM | Scott Michaud
Vulkan-Tutorial.com, while not affiliated with The Khronos Group, is a good read to understand how the Vulkan API is structured. It is set up like the tutorials I followed when learning WebGL, which makes it quite approachable. I mean, we are still talking about the Vulkan API, which was in no way designed to be easy or simple, but introduction-level material is still good for developers of all skill levels, unless they're looking for specific advice.
To emphasize what I mean by “approachable”, the tutorial even includes screenshots of Visual Studio 2015 at some points, to help Windows users set up their build environment. Like... step-by-step screenshots. This explanation is also accompanied by Linux instructions, although those use Ubuntu terminal commands.
IDF 2016: ScaleMP Merges Software-Defined Memory With Storage-Class Memory, Makes Optane Work Like RAM
Subject: Storage | August 16, 2016 - 04:05 PM | Allyn Malventano
Tagged: Virtual SMP, SMP, SDM-S, SDM-F, ScaleMP, IDF 2016, idf
ScaleMP has an exciting announcement at IDF today, but before we get into it, I need to do some explaining. Most IT specialists know how to employ virtualization to run multiple virtual environments within the same server, but what happens when you want to go the other way around?
You might not have known it, but virtualization can go both ways. ScaleMP makes such a solution, and it enables some amazing combinations of hardware all thrown at a single virtualized machine. Imagine what could be done with a system containing 32,768 CPUs and 2,048 TB (2 PB) of RAM. Such a demand is actually more common than you might think:
List of companies / applications of ScaleMP.
ScaleMP's tech can fit into a bunch of different usage scenarios. You can choose to share memory, CPU cores, IO, or all three across multiple physical machines, all combined into a single beast of a virtualized OS, but with the launch of 3D XPoint there's one more thing that might come in handy as a sharable resource, as there is a fairly wide latency gap between NAND and RAM:
Alright, now that we've explained the cool technology and the gap to be filled, on to the news of the day: ScaleMP has announced that their Software Defined Memory tech has been optimized for Intel Optane SSDs. This means that ScaleMP / Optane customers will be able to combine banks of XPoint installed across multiple systems into a single VM. Another key to this announcement is that, due to the way ScaleMP virtualizes the hardware, the currently developing storage-class (NVMe) XPoint/Optane solutions can be mounted as if they were system memory, which should prove to be a nice stopgap until we see second-generation 3D XPoint in DIMM form.
More to follow from IDF 2016. ScaleMP's press blast appears after the break.
Subject: Cases and Cooling | August 16, 2016 - 02:23 PM | Jeremy Hellstrom
Tagged: Alphacool, Eisbaer 240, AIO, watercooler
Alphacool's Eisbaer 240 AiO CPU watercooler sports a dual 120mm radiator along with a DC-LT pump. The tubing is also designed by Alphacool and sports an interesting quick-disconnect coupling which allows you to integrate another radiator, a reservoir, or even an entire second loop for your GPU if you so desire. As well, the fluid reservoir on the waterblock features a window to allow you to quickly check your water levels and see if there are any bubbles present in your loop. All of this sounds good, but the question of performance remains; stop by Hardware Canucks to see this impressive cooler in action.
"Prebuilt AIO or a potentially complicated custom loop? Alphacool believes we shouldn't have to choose and their Eisbaer 240 combines both under one roof and could blaze a new trail."
Here are some more Cases & Cooling reviews from around the web:
- Cooler Master MasterLiquid Pro 240 Review: A Gestalt Approach to Cooling @ Modders-Inc
- be quiet! Dark Rock TF @ techPowerUp
- Thermaltake Core P3 Tower @ HardwareOverclock
- Deepcool Genome @ techPowerUp
- be quiet! Dark Base Pro 900 @ Benchmark Reviews
Subject: Storage | August 16, 2016 - 02:00 PM | Allyn Malventano
Tagged: XPoint, Testbed, Optane, Intel, IDF 2016, idf
IDF 2016 is up and running, and Intel will no doubt be announcing and presenting on a few items of interest. Of note for this Storage Editor are multiple announcements pertaining to upcoming Intel Optane technology products.
Optane is Intel’s branding of their joint XPoint venture with Micron. Intel launched this branding at last year's IDF, and while the base technology is up to 1,000x faster than NAND flash memory, full solutions wrapped around an NVMe-capable controller have been shown to sit at roughly a 10x improvement over NAND. That’s still nothing to sneeze at, and XPoint settles nicely into the performance gap seen between NAND and DRAM.
Since modern M.2 NVMe SSDs are encroaching on the point of diminishing returns for consumer products, Intel’s initial Optane push will be into the enterprise sector. There are plenty of use cases for a persistent storage tier faster than NAND, but most enterprise software is not currently equipped to take full advantage of the gains seen from such a disruptive technology.
XPoint die. 128Gbit of storage at a ~20nm process.
In an effort to accelerate the development and adoption of 3D XPoint optimized software, Intel will be offering enterprise customers access to an Optane Testbed. This will allow for performance testing and tuning of customers’ software and applications ahead of the shipment of Optane hardware.
I did note something interesting in Micron's FMS 2016 presentation. QD=1 random performance appears to start at ~320,000 IOPS, while the Intel demo from a year ago (first photo in this post) showed a prototype running at only 76,600 IOPS. Using that QD=1 example, it appears that as controller technology improves to handle the large performance gains of raw XPoint, so does performance. Given a NAND-based SSD only turns in 10-20k IOPS at that same queue depth, we're seeing something more along the lines of 16-32x performance gains with the Micron prototype. Those with a realistic understanding of how queues work will realize that the type of gains seen at such low queue depths will have a significant impact in real-world performance of these products.
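The multipliers above are simple division; a quick sketch makes the comparison explicit (the NAND QD=1 range is the typical 10-20k IOPS figure quoted above, not a measured spec):

```python
# Sanity check of the QD=1 IOPS multipliers quoted above.
micron_xpoint_qd1 = 320_000   # Micron FMS 2016 prototype, QD=1 random IOPS
intel_proto_qd1 = 76_600      # Intel's year-old demo prototype
nand_qd1_low, nand_qd1_high = 10_000, 20_000  # typical NAND SSD range at QD=1

gain_vs_slow_nand = micron_xpoint_qd1 / nand_qd1_low    # 32x
gain_vs_fast_nand = micron_xpoint_qd1 / nand_qd1_high   # 16x
controller_maturity = micron_xpoint_qd1 / intel_proto_qd1  # ~4.2x in a year

print(f"{gain_vs_fast_nand:.0f}x to {gain_vs_slow_nand:.0f}x over NAND at QD=1")
print(f"{controller_maturity:.1f}x over the year-old Intel prototype")
```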
The speed of 3D XPoint immediately shifts the bottleneck back to the controller, PCIe bus, and OS/software. True 1000x performance gains will not be realized until second generation XPoint DIMMs are directly linked to the CPU.
The raw die 1000x performance gains simply can't be fully realized when there is a storage stack in place (even an NVMe one). That's not to say XPoint will be slow, and based on what I've seen so far, I suspect XPoint haters will still end up burying their heads in the sand once we get a look at the performance results of production parts.
Leaked roadmap including upcoming Optane products
Intel is expected to show a demo of their own more recent Optane prototype, and we suspect similar performance gains there, as their controller tech has likely matured. We'll keep an eye out and fill you in once we've seen Intel's newer Optane goodness in action!
Subject: General Tech, Processors, Displays, Shows and Expos | August 16, 2016 - 01:50 PM | Ryan Shrout
Tagged: VR, virtual reality, project alloy, Intel, augmented reality, AR
At the opening keynote of this summer’s Intel Developer Forum, CEO Brian Krzanich announced a new initiative to enable a completely untethered VR platform called Project Alloy. Using Intel processors and sensors, the goal of Project Alloy is to move all of the necessary compute into the headset itself, including enough battery to power the device for a typical session, removing the need for a high-powered PC and delivering a truly cordless experience.
This is indeed the obvious end-game for VR and AR, though Intel isn’t the first to demonstrate a working prototype. AMD showed the Sulon Q, an AMD FX-based system that was a wireless VR headset. It had real specs too, including a 2560x1440 OLED 90Hz display, 8GB of DDR3 memory, an AMD FX-8800P APU with R7 graphics embedded. Intel’s Project Alloy is currently using unknown hardware and won’t have a true prototype release until the second half of 2017.
There is one key advantage that Intel has implemented with Alloy: RealSense cameras. The idea is simple but the implications are powerful. Intel demonstrated using your hands, and even other real-world items, to interact with the virtual world. RealSense cameras use depth sensing to track hands and fingers very accurately, and with a device integrated into the headset and pointed out and down, Project Alloy prototypes will be able to “see” and track your hands, integrating them into the game and VR world in real-time.
The demo that Intel put on during the keynote definitely showed the promise, but the implementation was clunky and less than what I expected from the company. Real hands just showed up in the game, rather than being represented by rendered hands that track accurately, and it definitely put a schism in the experience. Obviously it’s up to the application developer to determine how your hands would actually be represented, but it would have been better to showcase that capability in the live demo.
Better than just tracking your hands, Project Alloy was able to track a dollar bill (why not a Benjamin Intel??!?) and use it to interact with a spinning lathe in the VR world. It interacted very accurately and with minimal latency – the potential for this kind of AR integration is expansive.
Those same RealSense cameras and their data are used to map the space around you, preventing you from running into things or people or cats in the room. This enables the first “multi-room” tracking capability, giving VR/AR users a new range of flexibility and usability.
Though I did not get hands-on with the Alloy prototype itself, the unit on stage looked pretty heavy and pretty bulky. Comfort will obviously be important for any kind of head mounted display, and Intel has plenty of time to iterate on the design over the next year to get it right. Both AMD and NVIDIA have been talking up the importance of GPU compute to provide high quality VR experiences, so Intel has an uphill battle to prove that its solution, without the need for external power or additional processing, can truly provide the untethered experience we all desire.
Subject: General Tech | August 16, 2016 - 01:38 PM | Jeremy Hellstrom
Tagged: RRAM, flexible silicon
Flexible computers are quickly becoming more of a reality as researchers continue to find ways to make generally brittle components such as processors and memory out of new materials. This latest research has discovered new materials for constructing RRAM which allow working memory to remain viable even when flexed. Instead of using traditional CMOS, they have found certain tungsten oxides which display all of the properties required for flexible memory. The use of those oxides is not new; however, they came with a significant drawback: fabricating the material required much more heat than CMOS. Nanotechweb reports on new developments from a team led by James Tour of Rice University which have led to a fabrication process that can take place at room temperature. Check out their article for an overview and a link to their paper.
"Researchers in the US and Korea say they have developed a new way to make a flexible, resistive random access memory (RAM) device in a room-temperature process – something that has proved difficult to do until now."
Here is some more Tech News from around the web:
- Microsoft is going to roll up all your Windows 7 and 8.1 updates Windows 10-style @ The Inquirer
- TSMC to make chips for iPad coming out in 2017, says report @ DigiTimes
- Minecraft meets Oculus Rift VR on Windows 10 @ The Inquirer
- White hat pops Windows User Account Control with log viewer data @ The Register
- Microsoft: Why we had to tie Azure Stack to boxen we picked for you @ The Register
- Windows 10 Anniversary Update – Heaven for Power Users @ Hardware Secrets
- NVIDIA Pascal Mobile GPU Specifications & Details! @ TechARP
Subject: Systems, Mobile | August 16, 2016 - 11:39 AM | Sebastian Peak
Tagged: Skylake, nvidia, notebook, laptop, Intel Core i7, gtx 1070, gtx 1060, gigabyte, gaming
GIGABYTE has refreshed their gaming laptop lineup with NVIDIA's GTX 10 series graphics, announcing updated versions of the P55 & P57 Series, and thin-and-light P35 & P37.
"GIGABYTE offers a variety of options based on preference while providing the latest GeForce® GTX 10 series graphics and the latest 6th Generation Intel Core i7 Processor for the power and performance to meet the growing demands of top tier applications, games, and Virtual Reality. With the superior performance GIGABYTE also includes industry leading features such as M.2 PCIe SSD, DDR4 memory, USB 3.1 with Type-C connection, and HDMI 2.0."
The notebooks retain 6th-gen Intel (Skylake) Core processors, but now feature NVIDIA GeForce GTX 1070 and GTX 1060 GPUs.
Here's a rundown of the new systems from GIGABYTE, beginning with the Performance Series:
The GIGABYTE P57 Gaming Laptop
"The new 17” P57 is pulling no punches when it comes to performance, including the all-new, ultra-powerful NVIDIA® GeForce® GTX 1070 & 1060 Graphics. With a fresh GPU, come fresh ID changes. Along with its subtle style, curved lines and orange accents, comes all-new additional air intake ventilation above the keyboard to improve thermal cooling. The backlit keyboard itself supports Anti-Ghost with 30-Key Rollover. The Full HD 1920x1080 IPS display provides vivid and immersive visuals, while a Swappable Bay is included for user preference of an optical drive, an additional HDD, or weight reduction."
Next we have the thin-and-light ULTRAFORCE Gaming models:
The ULTRAFORCE P35
"The new 17.3” P37 reiterates what ULTRAFORCE is all about. Despite being a 17” model, the P37 weighs under 2.7kg and retains an ultra-thin and light profile, being less than 22.5mm thin. Paired with extreme mobility is the NVIDIA GeForce GTX 1070 graphics. The display comes in both options of 4K UHD 3840x2160 and FHD 1920x1080, achieving high-res gaming thanks to the performance boost with the new graphics.
The P37 includes a hot-swappable bay for an additional HDD, ODD, or to reduce weight for improved mobility, forming a quad-storage system with multiple M.2 PCIe SSDs and HDDs. The Macro Keys on the left, together with the included Macro Hub software, allows up to 25 programmable macros for one-click execution in any games and applications
Powerful yet portable, the thinnest gaming laptop of the series, the 15.6” P35, also has either a UHD 3840x2160 or FHD 1920x1080 display, delivering perfect and vivid colors for an enhanced gameplay experience. Included in the Ultrabook-like chassis is the powerful all-new NVIDIA® GeForce GTX 1070 GPU. The P35 also features the iconic hot-swappable bay for flexible storage and the quad-storage system."
The P37 keyboard features macro keys
We will update with pricing and availability for these new laptops when known.
Subject: Systems | August 16, 2016 - 08:00 AM | Sebastian Peak
Tagged: PC, nvidia, Lenovo, Intel Core i7, IdeaCentre Y910, GTX 1080, gaming, desktop, all in one, AIO
Lenovo has announced a new all-in-one gaming desktop, and the IdeaCentre Y910 offers up to a 6th-generation Intel Core i7 processor and NVIDIA GeForce GTX 1080 graphics behind its 27-inch QHD display.
But this is no ordinary all-in-one, as Lenovo has designed the Y910 to be "effortlessly upgradeable":
"Designed to game, engineered to evolve, the IdeaCentre™ AIO Y910 is easy to upgrade – no special tools needed. Simply press the Y button to pop out the back panel, for effortless swapping of your GPU, Memory or Storage."
Lenovo's original announcement listed a 7th-gen Intel Core i7 processor, which would have meant Intel Kaby Lake; the specs have since been corrected to 6th-gen Intel Core processors up to an i7. Exactly which SKU will be inside the Y910 isn't clear just yet, and we'll update when we know for sure. It would be limited to 65 W based on the specified cooling, and notice that the CPU isn't on the list of user-upgradable parts (though upgrades could still be possible).
Here's a rundown of specs from Lenovo:
- Processor: Up to a 6th-generation Intel Core i7 Processor
- Graphics: Up to NVIDIA GeForce GTX 1080 8 GB
- Memory: Up to 32 GB DDR4
- Storage: Up to 2 TB HDD + 256 GB SSD
- Display: 27-inch QHD (2560x1440) near-edgeless
- Audio: Integrated 7.1 Channel Dolby Audio, 5W Harman Kardon speakers
- Webcam: 720p, Single Array Microphone
- Networking: Killer DoubleShot WiFi / LAN
- Rear Ports:
- 2x USB 2.0
- HDMI-in / HDMI-out
- Side Ports:
- 3x USB 3.0
- 6-in-1 Card Reader (SD, SDHC, SDXC, MMC, MS, MS-Pro)
- Headphone, Microphone
- Cooling: 65 W
- Dimensions (W x L x H): 237.6 x 615.8 x 490.25 mm (9.35 x 24.24 x 19.3 inches)
- Weight: Starting at 27 lbs (12.24 kg)
Update: The IdeaCentre Y910 starts at $1,799.99 for a version with the GTX 1070, and will be available in October.
Subject: Systems | August 16, 2016 - 08:00 AM | Sebastian Peak
Tagged: small form-factor, SFF, nvidia, Lenovo, Killer Networking, Intel, IdeaCentre Y710 Cube, GTX 1080, gaming, gamescom, cube
Lenovo has announced the IdeaCentre Y710 Cube, a small form-factor system designed for gaming regardless of available space, and it can be configured with some very high-end desktop components for serious performance.
"Ideal for gamers who want to stay competitive no matter where they play, the IdeaCentre Y710 Cube comes with a built-in carry handle for easy transport between gaming stations. Housed sleekly within a new, compact cube form factor, it features NVIDIA’s latest GeForce GTX graphics and 6th Gen Intel Core processors to handle today’s most resource-intensive releases."
The Y710 Cube offers NVIDIA GeForce graphics up to the GTX 1080, and up to a 6th-generation Core i7 processor (though a specific processor number was not mentioned, this is likely the non-K Core i7-6700 given the 65 W cooler specified below).
Lenovo offers a pre-installed Xbox One controller receiver with the Y710 Cube to position the small desktop as a console alternative, and the machines are configured with SSD storage and feature Killer DoubleShot Pro networking (where the NIC and wireless card are combined for better performance).
- Processor: Up to 6th Generation Intel Core i7 Processor
- Operating System: Windows 10 Home
- Graphics: Up to NVIDIA GeForce GTX 1080; 8 GB
- Memory: Up to 32 GB DDR4
- Storage: Up to 2 TB HDD + 256 GB SSD
- Cooling: 65 W
- Networking: Killer LAN / WiFi 10/100/1000M
- Video: 1x HDMI, 1x VGA
- Rear Ports: 1x USB 2.0, 1x USB 3.0
- Front Ports: 2x USB 3.0
- Dimensions (L x D x H): 393.3 x 252.3 x 314.5 mm (15.48 x 9.93 x 12.38 inches)
- Weight: Starting at 16.3 lbs (7.4 kg)
- Carry Handle: Yes
- Accessory: Xbox One Wireless Controller/Receiver (optional)
The IdeaCentre Y710 Cube is part of Lenovo's Gamescom 2016 announcement, and will be available for purchase starting in October. Pricing starts at $1,299.99 for a version with the GTX 1070.
Subject: General Tech | August 16, 2016 - 03:00 AM | Ryan Shrout
Tagged: pro, mouse, logitech g, logitech, gaming
Readers of PC Perspective have noticed that in the last couple of years a very familiar name has been asserting itself again in the world of gaming peripherals. Logitech, once the leader and creator of the gaming-specific market with devices like the G15 keyboard, found itself in a rut and was being closed in on by competitors such as Razer, Corsair and SteelSeries. The Logitech G brand was born and a renewed focus on this growing and enthusiastic market took place. We have reviewed several of the company’s new products including the G933/633 gaming headsets, G402 mouse that included an accelerometer and the G29 racing wheel.
Today Logitech is announcing the Logitech G Pro Gaming Mouse. As the name would imply, this mouse is targeted at gamers that fancy themselves professionals, or aspire to be. Even so, I imagine that many “normie” PC gamers will find the design, features and pricing attractive enough to put next to the keyboard on their desk. This is a wired-only mouse.
The design of the Pro Gaming Mouse is very similar to that of the Logitech G100s, a long running and very popular mouse with the professional community. It falls a bit on the small side, but Logitech claims that the “small and nimble profile allows gamers of many different game types to play as precisely as possible.” It’s incredibly light as well – weighing in at just 83 g!
This mouse has 6 programmable buttons, far fewer than some of the more extreme “gaming” mice on the market, all of which can be controlled through the Logitech Gaming Software platform. The on-board memory on the Pro allows gamers to configure the mouse on their own system and take those settings with them to competitions or friends’ PCs without the need to re-install software.
RGB lighting is of course included with the Pro mouse, and I like the way it wraps around the sides and back of the mouse to add some flair to the design.
Logitech is using the PMW3366 sensor in the Pro Gaming Mouse, the same used in the G502, G900 and others. Though the importance of the sensor in a gaming mouse is often overlooked, the PMW3366 optical sensor is known to deliver accurate tracking from 200-12,000 DPI with no acceleration or smoothing integrated that might hinder the input from the gamer.
The buttons on the Logitech G Pro use a torsion spring system rated at 20 million clicks (!!) which works out to 25 kilometers of button travel for the life of the mouse. The spring system used is designed to minimize effort and distance required for button actuation.
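The 25 km figure follows from the click rating with simple arithmetic; a per-press travel of 1.25 mm (an assumed figure, not an official Logitech spec) makes the numbers line up:

```python
# Back-of-the-envelope check of the 25 km lifetime button travel figure.
rated_clicks = 20_000_000
travel_per_click_mm = 1.25   # assumed button travel per press (not an official spec)

lifetime_travel_km = rated_clicks * travel_per_click_mm / 1_000_000  # mm -> km
print(f"{lifetime_travel_km} km of button travel")  # 25.0 km
```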
All aspects of the mouse were built with gamers in mind, with Logitech’s in-house professional gamers at the design table weighing in on everything from the plastic feel to the size and weight. The scroll wheel is optimized for gamers' use, not productivity, while the braided cable prevents snags. And the best part? The Logitech G Pro Gaming Mouse is set to have an MSRP of just $69.
The full press release is after the break and we are due to have a Logitech G Pro Gaming Mouse in our hands later today. We will follow up with thoughts and impressions soon!
Subject: Systems, Mobile | August 16, 2016 - 12:00 AM | Sebastian Peak
Tagged: pascal, nvidia, notebook, msi, GTX 1080, gtx 1070, gtx 1060, gaming laptop, gaming
MSI has updated their gaming notebook lineup with the new NVIDIA Pascal mobile GPUs, with the GTX 1080, GTX 1070, and GTX 1060 now available across the board. MSI says the new GPUs will provide up to 40% better performance than the company’s previous GT, GS, and GE models.
“MSI’s GT83/73VR Titan series now showcases an even more commanding design with sports car inspired exhausts and MSI’s Cooler Boost Titan, featuring multiple exhausts and dual whirlwind blade fans to guarantee the best performance even under the most stress. Available in 3 different sizes and 17 unique configurations, including with SLI graphics, 4K panels and Tobii’s eye-tracking technology, MSI’s GT series is the optimum laptop for serious gamers.”
Positioned at the top of the heap is the mighty Titan series, which naturally offers the highest possible specs for those who can afford the price tag.
Notice anything about the top-end GT83 model in the chart above? The GT83VR Titan SLI indeed contains not one, but two NVIDIA GTX 1080 graphics chips, making this $5099 gaming machine a monster of a system - though its 1080p screen real estate means a connected VR headset will be more likely to use all of that available GPU power.
Moving down to the GT72/GT62 series, we see a move to the GTX 1070 GPU across the board:
Next up is the GS73, which offers (in addition to Pascal graphics) MSI's "Cooler Boost Trinity", which is the company's advanced cooling system for thin notebook designs.
“MSI’s redesigned GS73/63 VR Stealth Pro series now comes with MSI’s Cooler Boost Trinity, a temperature control system featuring three ultra-thin whirlwind blade fans, and a 5-pipe thermal design optimized for ultra-slim gaming notebooks. Available in 17-inch, 15-inch, and 14-inch options, MSI’s GS series gives power mobile gaming a new meaning with the performance of larger systems while measuring less than 1-inch thick.”
The more modest GTX 1060 powers the <1 inch thick notebooks in the series, and both the GS73 and GS63 VR Stealth Pro are equipped with 4K resolution IPS screens (with the GS43VR Phantom Pro at 1080p).
Next we have the VR Apache series, with another approach to cooling called "Cooler Boost 4":
“MSI’s GE72/62 VR Apache series now features MSI’s Cooler Boost 4 technology, an enhanced cooling system with multiple exhausts to keep temperatures low even during the most heated battles. Starting at $1,649, the VR-ready GE series comes in two different sizes and is the ideal unit for gaming enthusiasts looking for a powerful and reliable unit.”
These lower-cost gaming machines are still equipped with Intel Core i7 processors, and offer GTX 1060 graphics for both models.
As a very interesting addition to the news of these new laptops, MSI has also announced that select machines equipped with NVIDIA GTX 10 Series graphics will feature 120Hz IPS panels with a 5ms response time.
We should have more information on availability soon.
Subject: Motherboards | August 15, 2016 - 10:49 PM | Sebastian Peak
Tagged: X99A WORKSTATION, workstation, socket 2011-3, msi, motherboard, Military Class 5, Intel X99, ECC
MSI’s new X99A WORKSTATION motherboard offers what the company calls “extreme QA testing” to go along with the “industries higher quality components” in a motherboard that doesn’t scrimp on features, either. The ATX motherboard supports ECC Registered DIMMs (with a supported Xeon processor), has dual Intel NICs onboard (with teaming support), and offers Steel Armor PCI-E and DDR4 slots, among other features.
“Engineered to cater even the most demanding professional. By using industries’ highest quality components, with an unmatched R&D design and extreme QA testing, the WORKSTATION motherboards guarantee the best in performance, stability and reliability.
Packed with features, including optimizations for NVIDIA QUADRO graphics cards and industry leading storage solutions, the WORKSTATION motherboard is the perfect multi-tasking powerhouse for demanding productivity applications.”
Features from MSI:
- Supports New Intel Core i7 Processor Extreme Edition for LGA 2011-3 socket.
- DDR4 Steel Armor with Best signal stability. Supports ECC Registered memory.
- MULTI-GPU with Steel Armor: Steel Armor PCI-E slots. Supports Nvidia Quadro SLI.
- Maximized high speed storage support: Turbo M.2 32Gb/s + Turbo U.2 32Gb/s + SATA-E 10Gb/s
- USB 3.1 Gen2 2X FASTER: USB 3.1 Gen2 offers performance twice as fast as a regular USB 3.0 connection.
- Military Class 5: The latest evolution in high quality components featuring the brand new Titanium Chokes.
- Dual Intel LAN with Teaming: higher networking performance and stability.
- Audio Boost 3: Reward your ears with studio grade sound quality
- Guard-Pro: Improved Protection and Power Efficiency
- EZ Debug LED: Easiest way to troubleshoot
- Overvoltage Protection: Prevent Unforeseen Damage
- Click BIOS 5: Award-winning brand new Click BIOS 5 with high resolution scalable font
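MSI's "2X FASTER" claim refers to the signaling rate: USB 3.1 Gen2 signals at 10 Gb/s versus 5 Gb/s for USB 3.0 (Gen1). A quick back-of-envelope calculation, using the encoding schemes defined in the USB specifications, shows what that means for usable payload bandwidth:

```python
# Signaling rates and line encodings from the USB specs:
# USB 3.0 / 3.1 Gen1:  5 Gb/s with 8b/10b encoding (80% efficiency)
# USB 3.1 Gen2:       10 Gb/s with 128b/132b encoding (~97% efficiency)
gen1_raw, gen1_eff = 5.0, 8 / 10
gen2_raw, gen2_eff = 10.0, 128 / 132

gen1_payload = gen1_raw * gen1_eff   # ~4.0 Gb/s usable
gen2_payload = gen2_raw * gen2_eff   # ~9.7 Gb/s usable

print(f"Gen1: {gen1_payload:.1f} Gb/s, Gen2: {gen2_payload:.1f} Gb/s")
print(f"Speedup: {gen2_payload / gen1_payload:.2f}x")
```

Thanks to the far more efficient 128b/132b encoding, the usable bandwidth improvement is actually a bit better than the headline 2x, before accounting for protocol overhead and real-world device limits.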
The MSI X99A WORKSTATION motherboard is not yet listed on Newegg, but the listing is active (though temporarily out of stock) on Amazon with a $389.99 price tag. As to supported processors, that part of the source link currently offers no information, though MSI states "The X99A WORKSTATION supports the latest Intel processors out of the box, both Broadwell-E and XEON based models".
Subject: Systems | August 15, 2016 - 01:30 PM | Jeremy Hellstrom
Tagged: system guide
The Tech Report have released their recommended system components for this summer, which you can check out right here. They have maintained their previous format, offering a choice of several components at the Budget, Sweet Spot, and High End levels and wrapping up with example builds. They recommend holding off on building a budget machine for the nonce; at the time of publishing they recommended the RX 480 4GB, but we have since seen the release of cards more suitable for this level of build. The Sweet Spot is VR Ready and the High End machine remains Broadwell-E; much as with our own Hardware Leaderboard, they cannot recommend moving from the reigning champ.
"In this edition of The Tech Report System Guide, we account for the choices that AMD's Radeon RX 480 and Nvidia's GeForce GTX 1060 afford builders in the under-$300 graphics card market."
Here are some more Systems articles from around the web:
Subject: General Tech | August 15, 2016 - 12:22 PM | Jeremy Hellstrom
Tagged: google, wireless isp, LTE
The FCC bidding was not terribly exciting, but the result was numerous companies buying up parts of the spectrum and, more importantly for this post, the opening of the 3550-3650 MHz band for anyone to use. The 3.5GHz band is already allocated to shipborne navigation and military radar systems, so this will be a test of the ability of computer systems to moderate interference, rather than the blanket bans regulators have always relied on in the past.
Google is about to test that ability: they will be running a test in several US cities to check the propagation of the signal as well as any possible maritime or military interference from the broadcast. This could be a way to get high speed internet to the curb without requiring fibre optic runs, and it would also be compatible with LTE if Google wanted to dip their toes into that market. You can read about the tests and where they will be happening over at Hack a Day.
"In a recently released FCC filing, Google has announced their experimental protocol for testing the new CBRS. This isn’t fast Internet to a lamp pole on the corner of the street yet, but it lays the groundwork for how the CBRS will function, and how well it will perform."
Here is some more Tech News from around the web:
- 7 reasons Windows XP refuses to die @ The Inquirer
- Native Skype for Windows Phone walked behind shed, shot heard @ The Register
- A Trove Of 3D Printer Filament Test Data @ Hack a Day
- Firefox 49 For Linux Will Ship With Plug-in Free Netflix, Amazon Prime Video Support @ Slashdot
- Adobe stops software licence audits in Americas, Europe @ The Register
Subject: Graphics Cards | August 12, 2016 - 06:33 PM | Sebastian Peak
Tagged: report, nvidia, gtx 1060 3gb, gtx 1060, GeForce GTX 1060, geforce, cuda cores
NVIDIA will offer a 3GB version of the GTX 1060, and there's more to the story than the obvious fact that it has half the frame buffer of the 6GB version available now. It appears that this is an entirely different product, with 128 fewer CUDA cores (1152) than the 6GB version's 1280.
Image credit: VideoCardz.com
Boost clocks are the same at 1.7 GHz, and the 3GB version will still operate with a 120W TDP and require a 6-pin power connector. So why not simply name this product differently? It's always possible that this will be an OEM version of the GTX 1060, but in any case expect slightly lower performance than the existing version, even if you don't run at resolutions high enough to require the larger 6GB frame buffer.
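A quick back-of-envelope comparison, assuming (naively) that shader-bound performance scales linearly with CUDA core count at equal clocks, and using the standard 2 FP32 ops per core per clock for fused multiply-add:

```python
# Figures from the report: same 1.7 GHz boost clock, different core counts.
cores_6gb = 1280
cores_3gb = 1152
boost_ghz = 1.7

# Relative shader throughput, all else being equal
ratio = cores_3gb / cores_6gb
print(f"Core ratio: {ratio:.1%}")  # 90.0%

# Peak FP32 throughput = cores * 2 ops/clock (FMA) * clock (GHz) / 1000
tflops_6gb = cores_6gb * 2 * boost_ghz / 1000
tflops_3gb = cores_3gb * 2 * boost_ghz / 1000
print(f"6GB: {tflops_6gb:.2f} TFLOPS, 3GB: {tflops_3gb:.2f} TFLOPS")
```

So even before memory capacity enters the picture, the 3GB card gives up roughly 10% of its peak shader throughput; real-world gaps will vary by workload.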
Subject: Graphics Cards | August 12, 2016 - 05:44 PM | Jeremy Hellstrom
Tagged: rx 470, LatencyMon, dpc, amd
When The Tech Report first conducted their review of the RX 470 they saw benchmark behaviour very different from any other GPU in that family, but could not figure out what caused it, nor resolve it, before the mob arrived with pitchforks and torches demanding they publish or die.
As it turns out there was indeed something rotten in the benchmarks: incredibly high DPC latency on the test machine. Investigation determined the culprit to be the beta BIOS on their ASRock Z170 Extreme7+, specifically the BIOS which allowed you to overclock locked Intel CPUs. They have just released their new findings along with a look at LatencyMon and DPC in general. Take a look at the new benchmarks and information about DPC, but also absorb the consequences of demanding articles arrive picoseconds after the NDA expires; if there is a delay in publishing, there might just be a damn good reason why.
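To see why high DPC (deferred procedure call) latency wrecks benchmarks, here is a minimal sketch with entirely hypothetical numbers: a GPU rendering a steady ~16.7 ms per frame (60 FPS), with an occasional multi-millisecond stall injected to stand in for a misbehaving driver or BIOS. The stall probability and durations are made up for illustration:

```python
import random

random.seed(42)  # deterministic for reproducibility

def percentile(data, p):
    """Return the p-th percentile via nearest-rank on sorted data."""
    s = sorted(data)
    return s[int(p / 100 * (len(s) - 1))]

base_ms = 16.7          # steady frame time: ~60 FPS
frames = []
for _ in range(1000):
    t = base_ms
    if random.random() < 0.05:       # hypothetical: 5% of frames stall
        t += random.uniform(5, 15)   # hypothetical: 5-15 ms DPC stall
    frames.append(t)

avg = sum(frames) / len(frames)
print(f"Average FPS: {1000 / avg:.1f}")
print(f"99th-percentile frame time: {percentile(frames, 99):.1f} ms")
```

The average FPS barely moves, but the 99th-percentile frame time balloons well past the steady 16.7 ms, which is exactly the kind of frame-time spikiness that frame-time-focused reviews like The Tech Report's are built to catch.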
"We retested our RX 470 to account for this issue, and we also updated our review with DirectX 12 benchmarks for Rise of the Tomb Raider and Hitman, plus full OpenGL and Vulkan benchmarks for Doom."
Here are some more Graphics Card articles from around the web:
- AMD & NVIDIA GPU VR Performance in Trials on Tatooine @ [H]ard|OCP
- AMD's Radeon RX 460 @ The Tech Report
- 18-Way GPU Linux Benchmarks, Including The Radeon RX 460 & RX 470 On Open-Source @ Phoronix
- ASUS Radeon RX 460 STRIX OC 4 GB @ techPowerUp
- MSI RX 470 Gaming X 8G @ Kitguru
- MSI GTX 1060 6GB Gaming X @ Kitguru
- MSI GeForce GTX 1070 Gaming Z @ Modders-Inc
- Nvidia Titan X (Pascal) Extended Overclock Guide @ Guru of 3D
- Nvidia Titan X @ Kitguru
- MSI GeForce GTX 1080 Gaming Z 8G Review @ HiTech Legion
- Zotac GTX 1080 AMP! Edition 8 GB @ techPowerUp