
IDF 2016: ScaleMP Merges Software-Defined Memory With Storage-Class Memory, Makes Optane Work Like RAM

Subject: Storage | August 16, 2016 - 04:05 PM |
Tagged: Virtual SMP, SMP, SDM-S, SDM-F, ScaleMP, IDF 2016, idf

ScaleMP has an exciting announcement at IDF today, but before we get into it, I need to do some explaining. Most IT specialists know how to employ virtualization to run multiple virtual environments within the same server, but what happens when you want to go the other way around?

ScaleMP-3.png

You might not have known it, but virtualization can go both ways. ScaleMP makes such a solution, and it enables some amazing combinations of hardware, all thrown at a single virtualized machine. Imagine what could be done with a system containing 32,768 CPUs and 2048TB (2PB) of RAM. Such a demand is actually more common than you might think:

ScaleMP-2.png

List of companies / applications of ScaleMP.

ScaleMP-4.png

ScaleMP's tech fits into a bunch of different usage scenarios. You can choose to share memory, CPU cores, IO, or all three across multiple physical machines, all combined into a single beast of a virtualized OS. With the launch of 3D XPoint, there's one more thing that might come in handy as a sharable resource, since there is a fairly wide latency gap between NAND and RAM:

NAND RAM gap.png

Alright, now that we've explained the cool technology and the gap to be filled, on to the news of the day: ScaleMP has announced that its Software Defined Memory tech has been optimized for Intel Optane SSDs. This means that ScaleMP / Optane customers will be able to combine banks of XPoint installed across multiple systems into a single VM. Another key to this announcement is that, due to the way ScaleMP virtualizes the hardware, the currently developing storage-class (NVMe) XPoint/Optane solutions can be mounted as if they were system memory, which should prove to be a nice stopgap until we see second-generation 3D XPoint in DIMM form.
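
ScaleMP's aggregation happens transparently at the hypervisor level, but for a rough sense of what "mounted as if it were system memory" means to software, here is a crude user-space analogue – a memory-mapped file on a hypothetical NVMe/Optane volume (an illustration only, not ScaleMP's mechanism):

    # Crude analogue only: ScaleMP does this transparently below the OS.
    # Here we simply treat fast NVMe storage as byte-addressable memory by
    # memory-mapping a file that lives on it.
    import mmap, os

    PATH = "/mnt/optane/backing.bin"   # hypothetical file on an Optane/NVMe volume
    SIZE = 1 << 30                     # 1 GiB of "extra memory"

    fd = os.open(PATH, os.O_CREAT | os.O_RDWR)
    os.ftruncate(fd, SIZE)
    buf = mmap.mmap(fd, SIZE)          # reads/writes now look like ordinary memory access

    buf[0:5] = b"hello"                # stores land in the page cache, then on the NVMe device
    print(bytes(buf[0:5]))
    buf.close()
    os.close(fd)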

More to follow from IDF 2016. ScaleMP's press blast appears after the break.

Alphacool Eisbaer 240 CPU Cooler, the Germanic Icewind

Subject: Cases and Cooling | August 16, 2016 - 02:23 PM |
Tagged: Alphacool, Eisbaer 240, AIO, watercooler

Alphacool's Eisbaer 240 AIO CPU watercooler sports a dual 120mm radiator along with a DC-LT pump.  The tubing is also designed by Alphacool and features an interesting quick-disconnect coupling which allows you to integrate another radiator, a reservoir, or even an entire second loop for your GPU if you so desire.  The fluid reservoir on the waterblock also features a window that lets you quickly check your water level and see whether any bubbles are present in your loop.  All of this sounds good, but the question of performance remains; stop by Hardware Canucks to see this impressive cooler in action.

ang4_sm.jpg

"Prebuilt AIO or a potentially complicated custom loop? Alphacool believes we shouldn't have to choose and their Eisbaer 240 combines both under one roof and could blaze a new trail."

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

IDF 2016: Intel To Demo Optane XPoint, Announces Optane Testbed for Enterprise Customers

Subject: Storage | August 16, 2016 - 02:00 PM |
Tagged: XPoint, Testbed, Optane, Intel, IDF 2016, idf

IDF 2016 is up and running, and Intel will no doubt be announcing and presenting on a few items of interest. Of note for this Storage Editor are multiple announcements pertaining to upcoming Intel Optane technology products.

P1020336-.JPG

Optane is Intel’s branding of their joint XPoint venture with Micron. Intel launched this branding at last year's IDF, and while the base technology is as much as 1000x faster than NAND flash memory, full solutions wrapped around an NVMe-capable controller have been shown to sit at roughly a 10x improvement over NAND. That’s still nothing to sneeze at, and XPoint settles nicely into the performance gap between NAND and DRAM.

XPoint.png

Since modern M.2 NVMe SSDs are encroaching on the point of diminishing returns for consumer products, Intel’s initial Optane push will be into the enterprise sector. There are plenty of use cases for a persistent storage tier faster than NAND, but most enterprise software is not currently equipped to take full advantage of the gains seen from such a disruptive technology.

DSC03304.JPG

XPoint die. 128Gbit of storage on a ~20nm process.

In an effort to accelerate the development and adoption of 3D XPoint optimized software, Intel will be offering enterprise customers access to an Optane Testbed. This will allow for performance testing and tuning of customers’ software and applications ahead of the shipment of Optane hardware.

U.2.jpg

I did note something interesting in Micron's FMS 2016 presentation: QD=1 random performance appears to start at ~320,000 IOPS, while the Intel demo from a year ago (first photo in this post) showed a prototype running at only 76,600 IOPS. It appears that as controller technology matures to handle the raw speed of XPoint, delivered performance climbs with it. Given that a NAND-based SSD only turns in 10-20k IOPS at that same queue depth, we're seeing something more along the lines of 16-32x performance gains with the Micron prototype. Those with a realistic understanding of how queues work will realize that gains at such low queue depths have a significant impact on the real-world performance of these products.
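
To put those queue-depth-1 figures in perspective, here's a quick back-of-the-envelope comparison using the IOPS numbers quoted above (the NAND baseline is the midpoint of that 10-20k range; real latencies will vary by workload):

    # At QD=1 only one request is outstanding, so average latency is
    # simply the reciprocal of the IOPS figure.
    figures = {
        "NAND SSD (typical QD=1)":        15_000,   # midpoint of the 10-20k range
        "Intel XPoint prototype (2015)":  76_600,
        "Micron QuantX prototype (2016)": 320_000,
    }
    baseline = figures["NAND SSD (typical QD=1)"]

    for name, iops in figures.items():
        latency_us = 1_000_000 / iops     # microseconds per I/O
        speedup = iops / baseline         # gain over the NAND baseline
        print(f"{name:33} {iops:>8,} IOPS  ~{latency_us:5.1f} us/IO  {speedup:4.1f}x vs NAND")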

future NVM.PNG

The speed of 3D XPoint immediately shifts the bottleneck back to the controller, PCIe bus, and OS/software. True 1000x performance gains will not be realized until second generation XPoint DIMMs are directly linked to the CPU.

The raw die 1000x performance gains simply can't be fully realized when there is a storage stack in place (even an NVMe one). That's not to say XPoint will be slow, and based on what I've seen so far, I suspect XPoint haters will still end up burying their heads in the sand once we get a look at the performance results of production parts.

intel-optane-ssd-roadmap.jpg

Leaked roadmap including upcoming Optane products

Intel is expected to show a demo of their own more recent Optane prototype, and we suspect similar performance gains there, as their controller tech has likely matured. We'll keep an eye out and fill you in once we've seen Intel's newer Optane goodness in action!

IDF 2016: Intel Project Alloy Promises Untethered VR and AR Experiences

Subject: General Tech, Processors, Displays, Shows and Expos | August 16, 2016 - 01:50 PM |
Tagged: VR, virtual reality, project alloy, Intel, augmented reality, AR

At the opening keynote of this summer’s Intel Developer Forum, CEO Brian Krzanich announced a new initiative to enable a completely untethered VR platform called Project Alloy. Using Intel processors and sensors, the goal of Project Alloy is to move all of the necessary compute into the headset itself, including enough battery to power the device for a typical session, removing the need for a high-powered PC and delivering a truly cordless experience.

01.jpg

This is indeed the obvious end-game for VR and AR, though Intel isn’t the first to demonstrate a working prototype. AMD showed the Sulon Q, an AMD FX-based system that was a wireless VR headset. It had real specs too, including a 2560x1440 90Hz OLED display, 8GB of DDR3 memory, and an AMD FX-8800P APU with embedded R7 graphics. Intel’s Project Alloy is currently using unknown hardware and won’t have a true prototype release until the second half of 2017.

There is one key advantage that Intel has implemented with Alloy: RealSense cameras. The idea is simple but the implications are powerful. Intel demonstrated using your hands, and even other real-world items, to interact with the virtual world. RealSense cameras use depth sensing to track hands and fingers very accurately, and with a device integrated into the headset and pointed out and down, Project Alloy prototypes will be able to “see” and track your hands, integrating them into the game and VR world in real-time.

02.jpg

The demo that Intel put on during the keynote definitely showed the promise, but the implementation was clunky and less than what I expected from the company. Real hands just showed up in the game, rather than being represented by rendered hands that track accurately, and it definitely put a schism in the experience. Obviously it’s up to the application developer to determine how your hands are actually represented, but it would have been better to showcase that capability in the live demo.

03.jpg

Better than just tracking your hands, Project Alloy was able to track a dollar bill (why not a Benjamin, Intel??!?) and use it to interact with a spinning lathe in the VR world. It interacted very accurately and with minimal latency – the potential for this kind of AR integration is expansive.

Those same RealSense cameras and their data are used to map the space around you, preventing you from running into things or people or cats in the room. This enables the first “multi-room” tracking capability, giving VR/AR users a new range of flexibility and usability.

04.jpg

Though I did not get hands-on time with the Alloy prototype itself, the unit on stage looked pretty heavy and pretty bulky. Comfort will obviously be important for any kind of head-mounted display, and Intel has plenty of time to iterate on the design over the next year to get it right. Both AMD and NVIDIA have been talking up the importance of GPU compute for providing high-quality VR experiences, so Intel has an uphill battle to prove that its solution, without the need for external power or additional processing, can truly provide the untethered experience we all desire.

RRAM that can do the twist

Subject: General Tech | August 16, 2016 - 01:38 PM |
Tagged: RRAM, flexible silicon

Flexible computers are quickly becoming more of a reality as researchers continue to find ways to make generally brittle components such as processors and memory out of new materials.  This latest research has identified new materials for constructing RRAM which allow working memory to remain viable even when flexed.  Instead of using traditional CMOS, the researchers found certain tungsten oxides which display all of the properties required for flexible memory.  The use of those oxides is not new; however, they came with a significant drawback: fabricating the material required much more heat than CMOS does.  Nanotechweb reports on new developments from a team led by James Tour of Rice University which have led to a fabrication process that can take place at room temperature.  Check out their article for an overview and a link to their paper.

nn-2016-027113_0006.gif

"Researchers in the US and Korea say they have developed a new way to make a flexible, resistive random access memory (RAM) device in a room-temperature process – something that has proved difficult to do until now."

Here is some more Tech News from around the web:

Tech Talk

Source: Nanotechweb

GIGABYTE Announces GeForce GTX 10 Series Gaming Laptops

Subject: Systems, Mobile | August 16, 2016 - 11:39 AM |
Tagged: Skylake, nvidia, notebook, laptop, Intel Core i7, gtx 1070, gtx 1060, gigabyte, gaming

GIGABYTE has refreshed their gaming laptop lineup with NVIDIA's GTX 10 series graphics, announcing updated versions of the P55 & P57 Series, and thin-and-light P35 & P37.

GIGABYTE_logo.jpg

"GIGABYTE offers a variety of options based on preference while providing the latest GeForce® GTX 10 series graphics and the latest 6th Generation Intel Core i7 Processor for the power and performance to meet the growing demands of top tier applications, games, and Virtual Reality. With the superior performance GIAGBYTE also includes industry leading features such as M.2 PCIe SSD, DDR4 memory, USB 3.1 with Type-C connection, and HDMI 2.0."

The notebooks retain 6th-gen Intel (Skylake) Core processors, but now feature NVIDIA GeForce GTX 1070 and GTX 1060 GPUs.

specs.PNG

Here's a rundown of the new systems from GIGABYTE, beginning with the Performance Series:

GIGABYTE P57.jpg

The GIGABYTE P57 Gaming Laptop

"The new 17” P57 is pulling no punches when it comes to performance, including the all-new, ultra-powerful NVIDIA® GeForce® GTX 1070 & 1060 Graphics. With a fresh GPU, come fresh ID changes. Along with its subtle style, curved lines and orange accents, comes all-new additional air intake ventilation above the keyboard to improve thermal cooling. The backlit keyboard itself supports Anti-Ghost with 30-Key Rollover. The Full HD 1920x1080 IPS display provides vivid and immersive visuals, while a Swappable Bay is included for user preference of an optical drive, an additional HDD, or weight reduction."

Next we have the thin-and-light ULTRAFORCE Gaming models:

ULTRAFORCE P35.jpg

The ULTRAFORCE P35

"The new 17.3” P37 reiterates what ULTRAFORCE is all about. Despite being a 17” model, the P37 weights under 2.7kg and retains an ultra-thin and light profile being less than 22.5mm thin. Paired with extreme mobility is the NVIDIA GeForce GTX 1070 graphics. The display comes in both options of 4K UHD 3840x2160 and FHD 1920x1080, achieving high-res gaming thanks to the performance boost with the new graphics.

The P37 includes a hot-swappable bay for an additional HDD, ODD, or to reduce weight for improved mobility, forming a quad-storage system with multiple M.2 PCIe SSDs and HDDs. The Macro Keys on the left, together with the included Macro Hub software, allow up to 25 programmable macros for one-click execution in any games and applications.

Powerful yet portable, the thinnest gaming laptop of the series, the 15.6” P35, also has either a UHD 3840x2160 or FHD 1920x1080 display, delivering perfect and vivid colors for an enhanced gameplay experience. Included in the Ultrabook-like chassis is the powerful all-new NVIDIA® GeForce GTX 1070 GPU. The P35 also features the iconic hot-swappable bay for flexible storage and the quad-storage system."

P37 keyboard.jpg

The P37 keyboard features macro keys

We will update with pricing and availability for these new laptops when known.

Source: GIGABYTE

Lenovo IdeaCentre Y710 Cube: Small Form-Factor Gaming with Desktop Power

Subject: Systems | August 16, 2016 - 08:00 AM |
Tagged: small form-factor, SFF, nvidia, Lenovo, Killer Networking, Intel, IdeaCentre Y710 Cube, GTX 1080, gaming, gamescom, cube

Lenovo has announced the IdeaCentre Y710 Cube, a small form-factor system designed for gaming regardless of available space, and it can be configured with some very high-end desktop components for serious performance.

Lenovo IdeaCentre_Y710_Cube_Left_hero_shot.jpg

"Ideal for gamers who want to stay competitive no matter where they play, the IdeaCentre Y710 Cube comes with a built-in carry handle for easy transport between gaming stations. Housed sleekly within a new, compact cube form factor, it features NVIDIA’s latest GeForce GTX graphics and 6th Gen Intel Core processors to handle today’s most resource-intensive releases."

The Y710 Cube offers NVIDIA GeForce graphics up to the GTX 1080, and up to a 6th-generation Core i7 processor. (Though a specific processor number was not mentioned, this is likely the non-K Core i7-6700 CPU given the 65W cooler specified below).

Lenovo IdeaCentre_Y710_Cube_Birdseye.jpg

Lenovo offers a pre-installed Xbox One controller receiver with the Y710 Cube to position the small desktop as a console alternative, and the machines are configured with SSD storage and feature Killer DoubleShot Pro networking (where the NIC and wireless card are combined for better performance).

Specifications include:

  • Processor: Up to 6th Generation Intel Core i7 Processor
  • Operating System: Windows 10 Home
  • Graphics: Up to NVIDIA GeForce GTX 1080; 8 GB
  • Memory: Up to 32 GB DDR4
  • Storage: Up to 2 TB HDD + 256 GB SSD
  • Cooling: 65 W
  • Networking: Killer LAN / WiFi 10/100/1000M
  • Connectivity:
    • Video: 1x HDMI, 1x VGA
    • Rear Ports: 1x USB 2.0 1x USB 3.0
    • Front Ports: 2x USB 3.0
  • Dimensions (L x D x H): 393.3 x 252.3 x 314.5 mm (15.48 x 9.93 x 12.38 inches)
  • Weight: Starting at 16.3 lbs (7.4 kg)
  • Carry Handle: Yes
  • Accessory: Xbox One Wireless Controller/Receiver (optional)

Lenovo IdeaCentre_Y710_Cube_Back_Port.jpg

The IdeaCentre Y710 Cube is part of Lenovo's Gamescom 2016 announcement, and will be available for purchase starting in October. Pricing starts at $1,299.99 for a version with the GTX 1070.

Source: Lenovo

Lenovo Announces the IdeaCentre Y910 AIO Gaming Desktop

Subject: Systems | August 16, 2016 - 08:00 AM |
Tagged: PC, nvidia, Lenovo, Intel Core i7, IdeaCentre Y910, GTX 1080, gaming, desktop, all in one, AIO

Lenovo has announced a new all-in-one gaming desktop, and the IdeaCentre Y910 offers up to a 6th-generation Intel Core i7 processor and NVIDIA GeForce GTX 1080 graphics behind its 27-inch QHD display.

Y910_01.jpg

But this is no ordinary all-in-one, as Lenovo has designed the Y910 to be "effortlessly upgradeable":

"Designed to game, engineered to evolve, the IdeaCentreTM AIO Y910 is easy to upgrade –
no special tools needed. Simply press the Y button to pop out the back panel, for effortless swapping of your GPU, Memory or Storage."

The specs originally listed a 7th-gen Intel Core i7 processor, which, if it wasn't a typo, would have meant Intel Kaby Lake; the specs have since been corrected to 6th-gen Intel Core processors up to an i7. Exactly which SKU might be inside the Y910 isn't clear just yet, and we'll update when we know for sure. It would be limited to 65 W based on the specified cooling, and notice that the CPU isn't on the list of user-upgradable parts (though upgrading it may still be possible).

Y910_02.jpg

Here's a rundown of specs from Lenovo:

  • Processor: Up to a 6th-generation Intel Core i7 Processor
  • Graphics: Up to NVIDIA GeForce GTX 1080 8 GB
  • Memory: Up to 32 GB DDR4
  • Storage: Up to 2 TB HDD + 256 GB SSD
  • Display: 27-inch QHD (2560x1440) near-edgeless
  • Audio: Integrated 7.1 Channel Dolby Audio, 5W Harman Kardon speakers
  • Webcam: 720p, Single Array Microphone
  • Networking: Killer DoubleShot WiFi / LAN
  • Rear Ports:
    • 2x USB 2.0
    • LAN
    • HDMI-in / HDMI-out
  • Side Ports:
    • 3x USB 3.0
    • 6-in-1 Card Reader (SD, SDHC, SDXC, MMC, MS, MS-Pro) Headphone, Microphone
  • Cooling: 65 W
  • Dimensions (W x L x H): 237.6 x 615.8 x 490.25 mm (9.35 x 24.24 x 19.3 inches)
  • Weight: Starting at 27 lbs (12.24 kg)

Y910_03.jpg

Update: The IdeaCentre Y910 starts at $1,799.99 for a version with the GTX 1070, and will be available in October.

Source: Lenovo

Logitech G Pro Gaming Mouse: Designed With And For Professional eSports Players

Subject: General Tech | August 16, 2016 - 03:00 AM |
Tagged: pro, mouse, logitech g, logitech, gaming

Readers of PC Perspective will have noticed that over the last couple of years a very familiar name has been reasserting itself in the world of gaming peripherals. Logitech, once the leader and creator of the gaming-specific market with devices like the G15 keyboard, found itself in a rut and was being closed in on by competitors such as Razer, Corsair and SteelSeries. The Logitech G brand was born, and a renewed focus on this growing and enthusiastic market took place. We have reviewed several of the company’s new products, including the G933/633 gaming headsets, the G402 mouse with its accelerometer, and the G29 racing wheel.

Today Logitech is announcing the Logitech G Pro Gaming Mouse. As the name implies, this mouse is targeted at gamers who consider themselves professionals, or aspire to be. That said, I imagine many “normie” PC gamers will find the design, features and pricing attractive enough to put one next to the keyboard on their desk. This is a wired-only mouse.

angle.jpg

The design of the Pro Gaming Mouse is very similar to that of the Logitech G100s, a long-running and very popular mouse in the professional community. It falls a bit on the small side, but Logitech claims that the “small and nimble profile allows gamers of many different game types to play as precisely as possible.” It’s incredibly light as well – weighing in at just 83g!

weightgraph.jpg

This mouse has 6 programmable buttons, far fewer than some of the more extreme “gaming” mice on the market, all of which can be controlled through the Logitech Gaming Software platform. The onboard memory on the Pro allows gamers to configure the mouse on their own system and take those settings with them to competitions or friends’ PCs without the need to re-install software.

RGB lighting is of course included with the Pro mouse, and I like how it wraps around the sides and back of the mouse to add some flair to the design.

Logitech is using the PMW3366 sensor in the Pro Gaming Mouse, the same sensor used in the G502, G900 and others. Though mouse sensors are often overlooked, their importance in a gaming mouse is hard to overstate: the PMW3366 optical sensor is known to deliver accurate tracking from 200-12,000 DPI with no acceleration or smoothing that might hinder the input from the gamer.

topdown.jpg

The buttons on the Logitech G Pro use a torsion spring system rated at 20 million clicks (!!), which works out to 25 kilometers of button travel over the life of the mouse. The spring system is designed to minimize the effort and distance required for button actuation.
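
For what it's worth, the 25-kilometer figure checks out against the rated click count (quick arithmetic, not a Logitech spec):

    # Sanity check on "20 million clicks = 25 km of button travel".
    rated_clicks = 20_000_000
    total_travel_km = 25

    travel_per_click_mm = total_travel_km * 1_000_000 / rated_clicks  # km -> mm
    print(f"Implied travel per click: {travel_per_click_mm:.2f} mm")
    # -> 1.25 mm, i.e. roughly 0.6 mm down and 0.6 mm back up per actuation,
    #    which is in the right ballpark for a short-travel mouse switch.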

sideprofile.jpg

All aspects of the mouse were built with gamers in mind, with Logitech’s in-house professional gamers at the design table – everything from the plastic feel to the size and weight. The scroll wheel is optimized for gaming use rather than productivity, while the braided cable prevents snags. And the best part? The Logitech G Pro Gaming Mouse is set to have an MSRP of just $69.

The full press release is after the break and we are due to have a Logitech G Pro Gaming Mouse in our hands later today. We will follow up with thoughts and impressions soon!

Source: Logitech
Author:
Manufacturer: NVIDIA

Take your Pascal on the go

Easily the strongest growth segment in PC hardware today is the adoption of gaming notebooks. Ask companies like MSI and ASUS, even Gigabyte: they now make more models and sell more units of notebooks with a dedicated GPU than ever before.  Both AMD and NVIDIA agree on this point, and it’s something AMD was adamant about discussing during the launch of the Polaris architecture.

pascalnb-2.jpg

Both AMD and NVIDIA predict massive annual growth in this market – somewhere on the order of 25-30%. For an overall culture that continues to believe the PC is dying, seeing projected growth this strong in any segment is not only amazing, but welcome to those of us that depend on it. AMD and NVIDIA have different goals here: GeForce products already have 90-95% market share in discrete gaming notebooks. In order for NVIDIA to see growth in sales, the total market needs to grow. For AMD, simply taking back a portion of those users and design wins would help its bottom line.

pascalnb-4.jpg

But despite AMD’s early talk about getting Polaris 10 and 11 in mobile platforms, it’s NVIDIA again striking first. Gaming notebooks with Pascal GPUs in them will be available today, from nearly every system vendor you would consider buying from: ASUS, MSI, Gigabyte, Alienware, Razer, etc. NVIDIA claims to have quicker adoption of this product family in notebooks than in any previous generation. That’s great news for NVIDIA, but might leave AMD looking in from the outside yet again.

Technologically speaking, though, this makes sense. Despite the improvement that Polaris made on the GCN architecture, Pascal is still more powerful and more power efficient than anything AMD has been able to produce. Looking solely at performance per watt, which is really the defining trait of mobile designs, Pascal is as dominant over Polaris as Maxwell was over Fiji. And this time around NVIDIA isn’t messing with cut-back parts under a different brand – GeForce is diving directly into gaming notebooks in a way we have only seen with one previous release.

g752-open.jpg

The ASUS G752VS OC Edition with GTX 1070

Do you remember our initial look at the mobile variant of the GeForce GTX 980? Not the GTX 980M, mind you, but the full GM204 operating in notebooks. That was basically a dry run for what we see today: NVIDIA will be releasing the GeForce GTX 1080, GTX 1070 and GTX 1060 to notebooks.

Continue reading our preview of the new GeForce GTX 1080, 1070 and 1060 mobile Pascal GPUs!!

MSI Updates Gaming Laptops with NVIDIA Pascal Graphics

Subject: Systems, Mobile | August 16, 2016 - 12:00 AM |
Tagged: pascal, nvidia, notebook, msi, GTX 1080, gtx 1070, gtx 1060, gaming laptop, gaming

MSI has updated their gaming notebook lineup with the new NVIDIA Pascal mobile GPUs, with the GTX 1080, GTX 1070, and GTX 1060 now available across the board. MSI says the new GPUs will provide up to 40% better performance than the company’s previous GT, GS, and GE models.

msi_gt83.jpg

“MSI’s GT83/73VR Titan series now showcases an even more commanding design with sports car inspired exhausts and MSI’s Cooler Boost Titan, featuring multiple exhausts and dual whirlwind blade fans to guarantee the best performance even under the most stress.  Available in 3 different sizes and 17 unique configurations, including with SLI graphics, 4K panels and Tobii’s eye-tracking technology, MSI’s GT series is the optimum laptop for serious gamers.”

Positioned at the top of the heap is the mighty Titan series, which naturally offers the highest possible specs for those who can afford the price tag.

GT83_73.png

Notice anything about the top-end GT83 model in the chart above? The GT83VR Titan SLI indeed contains not one, but two NVIDIA GTX 1080 graphics chips, making this $5099 gaming machine a monster of a system - though its 1080p screen real estate means a connected VR headset will be more likely to use all of that available GPU power.

Moving down to the GT72/GT62 series, we see a move to the GTX 1070 GPU across the board:

GT72_62.png

Next up is the GS73, which offers (in addition to Pascal graphics) MSI's "Cooler Boost Trinity", which is the company's advanced cooling system for thin notebook designs.

msi_gs73.jpg

“MSI’s redesigned GS73/63 VR Stealth Pro series now comes with MSI’s Cooler Boost Trinity, a temperature control system featuring three ultra-thin whirlwind blade fans, and a 5-pipe thermal design optimized for ultra-slim gaming notebooks.  Available in 17-inch, 15-inch, and 14-inch options, MSI’s GS series gives power mobile gaming a new meaning with the performance of larger systems while measuring less than 1-inch thick.”

The more modest GTX 1060 powers the <1 inch thick notebooks in the series, and both the GS73 and GS63 VR Stealth Pro are equipped with 4K resolution IPS screens (with the GS43VR Phantom Pro at 1080p).

GS73_63_43.png

Next we have the VR Apache series, with another approach to cooling called "Cooler Boost 4":

msi_ge72.jpg

“MSI’s GE72/62 VR Apache series now features MSI’s Cooler Boost 4 technology, an enhanced cooling system with multiple exhausts to keep temperatures low even during the most heated battles.  Starting at $1,649, the VR-ready GE series comes in two different sizes and is the ideal unit for gaming enthusiasts looking for a powerful and reliable unit.”

These lower-cost gaming machines are still equipped with Intel Core i7 processors, and offer GTX 1060 graphics for both models.

GE72_62.png

As a very interesting addition to the news of these new laptops, MSI has also announced that select machines equipped with NVIDIA GTX 10 Series graphics will feature 120Hz IPS panels with a 5ms response time.

msi_120.png

We should have more information on availability soon.

Source: MSI

MSI Releases the X99A WORKSTATION Motherboard with ECC Support

Subject: Motherboards | August 15, 2016 - 10:49 PM |
Tagged: X99A WORKSTATION, workstation, socket 2011-3, msi, motherboard, Military Class 5, Intel X99, ECC

MSI’s new X99A WORKSTATION motherboard offers what the company calls “extreme QA testing” to go along with “industries’ highest quality components” in a motherboard that doesn’t scrimp on features, either. The ATX motherboard supports ECC Registered DIMMs (with a supported Xeon processor), has dual Intel NICs onboard (with teaming support), and offers Steel Armor PCI-E and DDR4 slots, among other features.

X99A_W_1.jpg

“Engineered to cater even the most demanding professional. By using industries’ highest quality components, with an unmatched R&D design and extreme QA testing, the WORKSTATION motherboards guarantee the best in performance, stability and reliability.

Packed with features, including optimizations for NVIDIA QUADRO graphics cards and industry leading storage solutions, the WORKSTATION motherboard is the perfect multi-tasking powerhouse for demanding productivity applications.”

X99A_W_2.jpg

Features from MSI:

  • Supports New Intel Core i7 Processor Extreme Edition for LGA 2011-3 socket.
  • DDR4 Steel Armor with Best signal stability. Supports ECC Registered memory.
  • MULTI-GPU with Steel Armor: Steel Armor PCI-E slots. Supports Nvidia Quadro SLI.
  • Maximized high speed storage support: Turbo M.2 32Gb/s + Turbo U.2 32Gb/s + SATA-E 10Gb/s
  • USB 3.1 Gen2 2X FASTER: USB 3.1 Gen2 offers performance twice as fast as a regular USB 3.0 connection.
  • Military Class 5: The latest evolution in high quality components featuring the brand new Titanium Chokes.
  • Dual Intel LAN with Teaming: higher networking performance and stability.
  • Audio Boost 3: Reward your ears with studio grade sound quality
  • Guard-Pro: Improved Protection and Power Efficiency
  • EZ Debug LED: Easiest way to troubleshoot
  • Overvoltage Protection: Prevent Unforeseen Damage
  • Click BIOS 5: Award-winning brand new Click BIOS 5 with high resolution scalable font

The MSI X99A WORKSTATION motherboard is not yet listed on Newegg, but the listing is active (though temporarily out of stock) on Amazon with a $389.99 price tag. As for supported processors, the source link currently offers no information, though MSI states "The X99A WORKSTATION supports the latest Intel processors out of the box, both Broadwell-E and XEON based models".

Source: MSI

Time for fresh system recommendations?

Subject: Systems | August 15, 2016 - 01:30 PM |
Tagged: system guide

The Tech Report has released its recommended system components for this summer, which you can check out right here.  They have maintained their previous format, offering a choice of several components at the Budget, Sweet Spot and High End levels and wrapping up with example builds.  They recommend holding off on building a budget machine for the nonce; at the time of publishing they recommended the RX 480 4GB, but we have since seen the release of cards more suitable for this level of build.  The Sweet Spot is VR Ready, and the High End machine remains Broadwell-E; much as with our own Hardware Leaderboard, they cannot recommend moving on from the reigning champ.

z170asliplus.jpg

"In this edition of The Tech Report System Guide, we account for the choices that AMD's Radeon RX 480 and Nvidia's GeForce GTX 1060 afford builders in the under-$300 graphics card market."

Here are some more Systems articles from around the web:

Systems

Google tests switching to a low fibre diet; WiFi almost all the way

Subject: General Tech | August 15, 2016 - 12:22 PM |
Tagged: google, wireless isp, LTE

The FCC bidding was not terribly exciting, but the result was numerous companies buying up parts of the spectrum and, more importantly for this post, the opening of the 3550-3650 MHz band for anyone to use.  The 3.5GHz band is already allocated to shipborne navigation and military radar systems, so this will be a test of the ability of computer systems to moderate interference instead of the blanket bans regulators have always relied on in the past.

Google is about to test that ability; they will be running trials in several US cities to check the propagation of the signal as well as any possible maritime or military interference from the broadcast.  This could be a way to get high-speed internet to the curb without requiring fibre optic runs, and it would also be compatible with LTE, should Google want to dip their toes into that market.  You can read about the tests and where they will be happening over at Hack a Day.

Google_fiber_logo.jpeg

"In a recently released FCC filing, Google has announced their experimental protocol for testing the new CBRS. This isn’t fast Internet to a lamp pole on the corner of the street yet, but it lays the groundwork for how the CBRS will function, and how well it will perform."

Here is some more Tech News from around the web:

Tech Talk

Source: Hack a Day
Subject: General Tech
Manufacturer: Various

Introduction

Even before the formulation of the term "Internet of things", Steve Gibson proposed home networking topology changes designed to deal with this new looming security threat. Unfortunately, little or no thought is given to the security aspects of the devices in this rapidly growing market.

One of Steve's proposed network topology adjustments involved daisy-chaining two routers together. The WAN port of an IOT-purposed router would be attached to the LAN port of the Border/root router.

di1.png

In this arrangement, only IOT/smart devices are connected to the internal (or IOT-purposed) router. The idea was to isolate insecure or poorly implemented devices from the more valuable personal data devices, such as a NAS holding important files and/or backups. Unfortunately, this clever arrangement leaves any device directly connected to the “border” router open to attack by infected devices running on the internal/IOT router. Such devices could perform a simple trace-route and identify that an intermediate network exists between them and the public Internet. Any device running under the border router with known (or worse - unknown!) vulnerabilities can then be immediately exploited.
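
That trace-route check is trivial for malware to perform: a device only has to notice that more than one private hop sits between it and the public Internet. A minimal sketch of the idea (illustrative helper and example addresses, nothing from Gibson's writeup):

    # Count how many RFC 1918 (private) hops appear before the first public hop.
    # Two or more suggests a daisy-chained, double-NAT arrangement like the one above.
    import ipaddress

    RFC1918 = [ipaddress.ip_network(n) for n in ("10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16")]

    def private_hops_before_internet(hops):
        count = 0
        for hop in hops:
            if any(ipaddress.ip_address(hop) in net for net in RFC1918):
                count += 1
            else:
                break  # first non-private hop reached
        return count

    # An IOT device behind the inner router sees both routers before the Internet.
    hops = ["192.168.2.1", "192.168.1.1", "203.0.113.1"]
    print(private_hops_before_internet(hops))  # -> 2: an intermediate network exists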

di2.png

Gibson's alternative formula reversed the positions of the IOT and border routers. Unfortunately, this solution also came with a nasty side effect: the border router (now used as the "secure" or internal router) became subject to all manner of man-in-the-middle attacks. Since the local Ethernet network basically trusts all traffic within its domain, an infected device on the IOT router (now sitting between the internal router and the public Internet) can manipulate or eavesdrop on any traffic emerging from the internal router. The potential consequences of this flaw are obvious.

di3.png

The third time really is the charm for Steve! On February 2nd of this year (Episode #545 of Security Now!), Gibson presented us with his third (and hopefully final) foray into the magical land of theory-crafting as it relates to securing our home networks against the Internet of Things.

Continue reading our editorial covering IOT security methodology!!

3GB Version of NVIDIA GTX 1060 Has 128 Fewer CUDA Cores

Subject: Graphics Cards | August 12, 2016 - 06:33 PM |
Tagged: report, nvidia, gtx 1060 3gb, gtx 1060, GeForce GTX 1060, geforce, cuda cores

NVIDIA will offer a 3GB version of the GTX 1060, and there's more to the story than the obvious fact that it has half the frame buffer of the 6GB version available now. It appears that this is an entirely different product, with 128 fewer CUDA cores (1152) than the 6GB version's 1280.

NVIDIA-GeForce-GTX-1060-3-GB-Announcement.jpg

Image credit: VideoCardz.com

Boost clocks are the same at 1.7 GHz, and the 3GB version will still operate with a 120W TDP and require a 6-pin power connector. So why not simply name this product differently? It's always possible that this will be an OEM version of the GTX 1060, but in any case expect slightly lower performance than the existing version, even if you don't run at resolutions high enough to require the larger 6GB frame buffer.
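
On paper that cut amounts to ten percent of the shader array, so with identical clocks the peak-throughput gap should be similar (a naive estimate; real-game deltas will differ):

    # Naive peak FP32 comparison, assuming both variants hold the same 1.7 GHz
    # boost clock and counting 2 FLOPs (one FMA) per core per clock.
    boost_ghz = 1.7
    for name, cores in (("GTX 1060 6GB", 1280), ("GTX 1060 3GB", 1152)):
        tflops = cores * 2 * boost_ghz / 1000
        print(f"{name}: {cores} CUDA cores, ~{tflops:.2f} TFLOPS peak")
    # 128 fewer cores works out to a ~10% smaller shader array.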

Source: VideoCardz

Wherein the RX 470 teaches us a valuable lesson about deferred procedure calls

Subject: Graphics Cards | August 12, 2016 - 05:44 PM |
Tagged: rx 470, LatencyMon, dpc, amd

When The Tech Report first conducted their review of the RX 470, they saw benchmark behaviour very different from that of any other GPU in the family, but could not figure out what it was and resolve it before the mob arrived with pitchforks and torches demanding they publish or die.

As it turns out, there was indeed something rotten in the benchmarks: incredibly high DPC latency on the test machine.  Investigation determined the culprit to be the beta BIOS on their ASRock Z170 Extreme7+, specifically the BIOS which allowed you to overclock locked Intel CPUs.  They have just released their new findings, along with a look at LatencyMon and DPC in general.  Take a look at the new benchmarks and the information about DPC, but also absorb the consequences of demanding that articles arrive picoseconds after the NDA expires; if there is a delay in publishing, there might just be a damn good reason why.
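
To see why DPC latency can poison GPU benchmarks, compare a multi-millisecond DPC stall with the frame-time budget at common performance targets (illustrative numbers only, not The Tech Report's measurements):

    # How much of a frame budget a single long DPC stall can consume.
    dpc_stall_ms = 4.0   # hypothetical worst-case stall on a misbehaving system
    for fps in (60, 100, 144):
        budget_ms = 1000 / fps
        share = dpc_stall_ms / budget_ms
        print(f"{fps:>3} fps: {budget_ms:5.2f} ms/frame; a {dpc_stall_ms:.1f} ms stall eats {share:.0%} of it")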

villagers_with_pitchforks.jpg

"We retested our RX 470 to account for this issue, and we also updated our review with DirectX 12 benchmarks for Rise of the Tomb Raider and Hitman, plus full OpenGL and Vulkan benchmarks for Doom."

Here are some more Graphics Card articles from around the web:

Graphics Cards

A mouthfull of a motherboard, the MSI X99A XPower Gaming Titanium Edition

Subject: Motherboards | August 12, 2016 - 03:26 PM |
Tagged: X99A XPower Gaming Titanium Edition, Intel X99, msi

The list of features on this motherboard is as long as the name: five PCIe 3.0 x16 slots, the Game Boost overclocking knob, 10 SATA 6Gbps ports, one each of SATA Express, M.2 and U.2, 13 USB 3.1 ports including a single Type-C, plus WiFi and Bluetooth ... all it is missing is a game fowl in a fruit tree.  [H]ard|OCP did have problems with XMP settings, but nothing they couldn't overcome via manual settings, and the issues they had with the Core i7-6950X turned out to be caused by the PSU and RAM, not the motherboard at all.  If you need a plethora of storage and add-in cards, then check this board out, as it can handle almost anything you want to stick in it.

1469600731dIdWyUE4ur_1_12_l.jpg

"MSI’s X99A XPOWER GAMING TITANIUM is a mouthful, but the XPOWER series has been a favorite of ours here at HardOCP for years now. The latest X99 iteration has much to prove. Is the X99A XPOWER GAMING TITANIUM another pretty face, or is it a fitting addition to the venerable XPOWER line?"

Here are some more Motherboard articles from around the web:

Motherboards

Source: [H]ard|OCP

Xaggerated claims from Intel? Psah, we've heard that a googolplex of times before

Subject: General Tech | August 12, 2016 - 01:16 PM |
Tagged: 3D XPoint, Intel, FMS 2016

You might have caught our reference to this on the podcast: XPoint is amazingly fast, but the marketing claims were an order of magnitude or two off of the real performance levels.  Al took some very nice pictures at FMS and covered what Micron had to say about their new QuantX drives.  The Register also dropped by and offers a tidbit on the pricing: roughly four to five times as much as current flash, or about half the cost of an equivalent amount of RAM.  They also compare the stated endurance of 25 complete drive writes per day to existing flash, which offers between 10 and 17 depending on the technology used.
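
Drive writes per day convert to total write endurance in a straightforward way; as a rough illustration (capacity and warranty period are assumptions here, since neither figure is quoted in the article):

    # Endurance math: DWPD x capacity x warranty length = total data written.
    capacity_tb = 1.6        # hypothetical drive capacity
    warranty_years = 5       # hypothetical warranty period

    for label, dwpd in (("QuantX (claimed)", 25), ("High-end NAND", 17), ("Typical NAND", 10)):
        tbw = dwpd * capacity_tb * 365 * warranty_years
        print(f"{label:17} {dwpd:>2} DWPD -> {tbw:>9,.0f} TB written over {warranty_years} years")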

The question they ask at the end is one many data centre managers will also be asking: is the actual speed boost worth the cost of upgrading, or will other, less expensive alternatives be more economical?

102021829-arms-full-of-money.530x298.jpg

"XPoint will substantially undershoot the 1,000-times-faster and 1,000-times-longer-lived-than-flash claims made by Intel when it was first announced – with just a 10-times speed boost and 2.5-times longer endurance in reality."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Corsair Releases Hydro GFX GTX 1080 Liquid-Cooled Graphics Card

Subject: Graphics Cards | August 12, 2016 - 10:59 AM |
Tagged: overclock, nvidia, msi, liquid cooled, hydro H55, hydro gfx, GTX 1080, graphics card, gaming, corsair

Corsair and MSI have teamed up once again to produce a liquid-cooled edition of the latest NVIDIA GPU, with the GTX 1080 receiving the same treatment these two gave to the Hydro GFX version of GTX 980 Ti last year.

HydroGFX_01.jpg

“The CORSAIR Hydro GFX GTX 1080 brings all the benefits of liquid cooling to the GeForce GTX 1080, boasting an integrated CORSAIR Hydro Series H55 cooler that draws heat from the GPU via a micro-fin copper base cold plate and dissipates it efficiently using a 120mm high-surface area radiator. A pre-installed low-noise LED-lit 120mm fan ensures steady, reliable air-flow, keeping GPU temperatures down and clock speeds high.

With a low-profile PCB and pre-fitted, fully-sealed liquid cooler, the Hydro GFX GTX 1080 is simple and easy to install. Just fit the card into a PCI-E 3.0 x16 slot, mount the radiator and enjoy low maintenance liquid cooling for the lifetime of the card.”

Naturally, with an integrated closed-loop liquid cooler, this GTX 1080 won't be relegated to stock speeds out of the box, though Corsair leaves this up to the user. The card offers three performance modes which allow users to choose between lower noise and higher performance: Silent Mode leaves the GTX 1080 at stock settings (1733 MHz Boost), Gaming Mode increases the Boost clock to 1822 MHz, and OC Mode pushes it slightly further to 1847 MHz (while also increasing memory speed).
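
Relative to the stock 1,733 MHz Boost clock, those presets are modest factory overclocks (quick math from Corsair's own figures):

    # Percentage gain of each preset over the stock (Silent Mode) Boost clock.
    stock_boost = 1733
    for mode, boost in (("Gaming Mode", 1822), ("OC Mode", 1847)):
        gain = (boost / stock_boost - 1) * 100
        print(f"{mode}: {boost} MHz (+{gain:.1f}% over stock)")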

gpu_clocks.png

This liquid-cooled version will provide higher sustained clocks

Here are the full specs from Corsair:

  • GPU: NVIDIA GeForce GTX 1080
  • CUDA Cores: 2,560
  • Interface: PCI Express 3.0 x16
  • Boost / Base Core Clock:
    • 1,847 MHz / 1,708 MHz (OC Mode)
    • 1,822 MHz / 1,683 MHz (Gaming Mode)
    • 1,733 MHz / 1,607 MHz (Silent Mode)
  • Memory Clock:
    • 10,108 MHz (OC Mode)
    • 10,010 MHz (Gaming Mode)
    • 10,010 MHz (Silent Mode)
  • Memory Size: 8192MB
  • Memory Type: 8GB GDDR5X
  • Memory Bus: 256-bit
  • Outputs:
    • 3x DisplayPort (Version 1.4)
    • 1x HDMI (Version 2.0)
    • 1x DL-DVI-D
  • Power Connector: 8-pin x 1
  • Power Consumption: 180W
  • Dimensions / Weight:
    • Card: 270 x 111 x 40 mm / 1249 g
    • Cooler: 151 x 118 x 52 mm / 1286 g
  • SKU: CB-9060010-WW

HydroGFX_Environmental_10.png

The Corsair Hydro GFX GTX 1080 is available now, exclusively on Corsair's official online store, and priced at $749.99.

Source: Corsair