Toshiba and Western Digital announce QLC and 96-Layer BiCS Flash

Subject: Storage | June 28, 2017 - 09:49 PM |
Tagged: wdc, WD, toshiba, QLC, nand, BiCS, 96-layer, 3d

A couple of announcements came out of Toshiba and Western Digital today. First up, Toshiba announced QLC (4 bits per cell) flash on their existing BiCS 3 (64-layer) technology. QLC is not ideal for endurance, as voltage tolerances become extremely tight with 16 individual voltage states per cell, but Toshiba has been working on this technology for a while now.

FMS-QLC.jpg

In the above slide from the Toshiba keynote at last year's Flash Memory Summit, we see the use case here is for 'archival grade flash', which would still offer fast reads but is not meant to be written as frequently as MLC or TLC flash. Employing QLC in Toshiba's current BiCS 3 (64-layer) flash would enable 1.5TB of storage in a 16-die stack (within one flash memory chip package).
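Working backward from those figures (a quick sketch; binary units assumed):

```python
def qlc_states(bits_per_cell=4):
    # QLC stores 4 bits per cell -> 2^4 = 16 distinguishable voltage states
    return 2 ** bits_per_cell

def per_die_gbit(stack_tb=1.5, dies=16):
    # 1.5 TB spread across a 16-die stack -> capacity of each die in Gbit
    total_gbit = stack_tb * 1024 * 8  # TB -> GB -> Gbit
    return total_gbit / dies
```

That works out to 768Gbit per QLC die, which is what the tight 16-state cells buy over a comparable TLC die.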

roadmap.png

Next up is BiCS 4, announced by Western Digital. We knew BiCS 4 was coming but did not know how many layers it would have; we now know that figure, and it is 96. The initial offerings will be the common 256Gbit (32GB) capacity per die, but stacking cells 96 high means each die will be considerably smaller, meaning more dies per wafer and ultimately a lower cost per GB in your next SSD.

While these announcements are welcome, their timing and coordinated launch from both companies seems odd. Perhaps it has something to do with this?

AMD Releases Radeon ProRender for Blender and SolidWorks

Subject: General Tech | June 28, 2017 - 06:24 PM |
Tagged: solidworks, ray tracing, radeon, prorender, nvidia, mental ray, Blender, amd

AMD has released a free ray-tracing engine for Blender, as well as Maya, 3D Studio Max, and SolidWorks, called Radeon ProRender. It uses a physically-based workflow, which allows multiple materials to be expressed in a single, lighting-independent shader, making it easy to color objects and have them usable in any sensible environment.

amd-2017-prorender-mikeP.jpg

Image Credit: Mike Pan (via Twitter)

I haven’t used it yet, and I definitely haven’t tested how it stacks up against Cycles, but we’re beginning to see some test renders from Blender folks. It looks pretty good, as you can see with the water-filled Cornell box (above). Moreover, it’s rendered on an NVIDIA GPU, which I’m guessing they had because of Cycles, but that also shows that AMD is being inclusive with their software.

Radeon ProRender puts more than a little pressure on Mental Ray, which is owned by NVIDIA and licensed on annual subscriptions. We’ll need to see how quality evolves, but, as you see in the test render above, it looks pretty good so far... and the price can’t be beat.

Source: AMD

Beyond Good and Evil 2 Voyager Engine Demo

Subject: General Tech | June 28, 2017 - 05:09 PM |
Tagged: ubisoft, pc gaming

Honestly, I don’t really know how many first-party engines Ubisoft currently maintains anymore. Anvil is one of their more popular ones, which was used in Assassin’s Creed, Steep, For Honor, and Tom Clancy’s Ghost Recon Wildlands. Far Cry 5 will be using the Dunia Engine, which was forked from the original CryEngine. Tom Clancy’s The Division, Mario + Rabbids, and the new South Park use Snowdrop. I know that I’m missing some.

Add another one to the list: Voyager, which will be used in Beyond Good and Evil 2.

From what I gather from the video, this engine is optimized for massive differences in scale. The creative director for Beyond Good and Evil 2, Michel Ancel, showed the camera (in developer mode) smoothly transition from a highly detailed player model out to a view of part of a solar system. They claim that the sunset effects are actually caused by the planet's rotation. Interesting stuff!

Plan 9 from Skylake-X

Subject: Processors | June 28, 2017 - 03:03 PM |
Tagged: 7900x, Core i9, Intel, skylake-x, x299

The Tech Report recently wrapped up the first part of their review of Intel's new Core i9-7900X, focusing on its effectiveness as a production machine.  Their benchmarks cover a variety of scientific tasks such as PhotoWorxx, FPU Julia, and Mandel, as well as creativity benchmarks like picCOLOR, DAWBench DSP 2017, and STARS Euler3D.  During their testing they saw the same peaks in power consumption as Ryan did in his review: 253W under a full Blender load.  Their follow-up review will focus on the new chip's gaming prowess; for now you can take a look at how your i9-7900X will perform when you are not playing around.

skylake-basics.png

"Intel's Core i9-7900X and its Skylake-X brethren bring AVX-512 support, a new cache hierarchy, and a new on-die interconnect to high-end desktops. We examine how this boatload of high-performance computing power advances the state of the art in productivity applications."

Here are some more Processor articles from around the web:

Processors

 

Toshiba's new 64 layer NVMe drive takes the cake

Subject: Storage | June 28, 2017 - 02:12 PM |
Tagged: Toshiba XG5, toshiba, ssd, NVMe, nand, M.2, BiCS, 64-Layer

We first heard about the Toshiba XG5 1TB NVMe SSD at Computex, with its 64-layer BiCS flash and stated read speeds of 3GB/s and writes just over 2GB/s.  Today KitGuru published a review of the new drive, including ATTO results which match and even exceed the advertised read and write speeds.  Their real-world test involved copying 30GB of movies off of a 512GB Samsung 950 Pro to the XG5; only Samsung's new 960 lineup and the OCZ RD400 were able to beat Toshiba's new SSD.  Read more in their full review, right here.

121A9760.jpg

"The Toshiba XG5 1TB NVMe SSD contains Toshiba's newest 3D 64-Layer BiCS memory and our report will examine Toshiba's newest memory, as well as their newest NVMe controller to go along with it."

Here are some more Storage reviews from around the web:

Storage

A game of memory, testing Intel's sensitivity to RAM frequency

Subject: General Tech | June 28, 2017 - 01:07 PM |
Tagged: gaming, Intel, ddr3, ddr4

Overclockers Club have completed a daunting task, testing the effect of RAM frequency on game performance from DDR3-1333 through DDR4-3200.  In theory Intel's chips will not see the same improvements as AMD's Ryzen, as they lack the Infinity Fabric which has proven sensitive to memory frequency.  Since OCC cover two generations of RAM they also needed to test with two different processors, in this case the i7-4770K and i7-7700K, and they tested performance at 1440p as well as 1080p.  Read the full article to see the complete results, which do show some performance deltas, though they are nothing compared to what spending more on your GPU would buy.

6.jpg

"After running through all of the tests, it appears that what I previously thought was an easy and clear answer is in fact more complicated. With the evidence provided I can safely say that memory can play a large role in some games over all frame rates. However, other factors like the processor, type of video card, and resolution will usually provide bigger impact in the final frame rates. Strictly speaking of game performances, the fastest memory tested does yield better results."

Here is some more Tech News from around the web:

Gaming

Celebrate Summer with FSP and PC Perspective! Win PSUs and Cases!

Subject: General Tech | June 28, 2017 - 12:53 PM |
Tagged: giveaway, contest

It seems like it has been forever since we had a contest on the site...let's remedy that with our friends at FSP!

Celebrate Summer with FSP and PC Perspective!

Anyone anywhere on the globe is able to enter - good luck!

Source: FSP

Gigabyte Launches GA-AB350N-Gaming WIFI Mini ITX AM4 Motherboard

Subject: Motherboards | June 28, 2017 - 01:44 AM |
Tagged: gigabyte, mini ITX, b350, amd, AM4, raven ridge, SFF, ryzen

Gigabyte is joining the small form factor Ryzen motherboard market with its new GA-AB350N-Gaming WIFI. The new Mini ITX motherboard sports AMD’s AM4 socket and B350 chipset and supports Ryzen “Summit Ridge” CPUs, Bristol Ridge APUs (7th Gen/Excavator), and future Zen-based Raven Ridge APUs. The board packs a fair bit of hardware into the Mini ITX form factor and is aimed squarely at gamers and enthusiasts.

Gigabyte GA-AB350N-Gaming WIFI.png

The AB350N-Gaming WIFI has an interesting design in that some of the headers and connectors are flipped from where they are traditionally located. The chipset sits to the left of the CPU socket, above the 6-phase VRM and its PowIRStage digital ICs. Four SATA 6Gbps ports and a USB 3.0 header occupy the top edge of the board. Two dual-channel DDR4 memory slots are aligned on the right edge and support (overclocked) frequencies up to 3200 MHz depending on the processor used. The Intel wireless NIC, Realtek Gigabit Ethernet, and Realtek ALC1220 audio chips have been placed in the space between the AM4 socket and the single PCI-E 3.0 x16 slot, and there is a single M.2 (PCI-E 3.0 x4 32Gbps) slot on the underside of the motherboard.

Gigabyte has also integrated its "RGB Fusion" technology, with two on-board RGB LED lighting zones and two RGBW headers for off-board lighting strips, as well as high-end audio capacitors and a headphone amplifier. Smart Fan 5 technology is allegedly capable of automatically differentiating between fans and water pumps connected to the two fan headers, and will provide the correct PWM signal based on fan curves the user can customize in the UEFI BIOS. The motherboard is powered by 24-pin ATX and 8-pin EPS connectors, and while it does not have a very beefy power phase setup, it should be plenty for most overclocks (especially since Ryzen does not easily go much past 4 GHz anyway).

Rear I/O includes:

  • 1 x PS/2
  • 2 x Antenna (Intel 802.11ac Wi-Fi + BT 4.2)
  • 2 x USB 2.0
  • 2 x USB 3.1 Gen 2 (10Gbps)
  • 4 x USB 3.1 Gen 1 (5Gbps)
  • 6 x Audio (5 x analog, 1 x S/PDIF)
  • 1 x DisplayPort 1.2
  • 1 x HDMI 1.4
  • 1 x Realtek GbE

Gigabyte has an interesting SFF motherboard with the GA-AB350N-Gaming WIFI and I am interested in seeing the reviews. More Mini ITX options for Ryzen and other Zen-based systems are a good thing, and moving the power phases to the left may end up helping overclocking and cooling in smaller cases with tower coolers.

Unfortunately, Gigabyte has not yet revealed pricing or availability. Looking around online at its competition, I would guess it will come in around $85, though.

Also read:

Source: Gigabyte

Qualcomm Partners with Bosch, OmniVision, and Ximmerse to Shore Up Mobile VR Sensors

Subject: Mobile | June 27, 2017 - 08:00 PM |
Tagged: xr, VR, qualcomm, google, daydream, AR

Qualcomm has put in steady work on creating a vibrant hardware ecosystem for mobile VR to facilitate broad adoption of wireless, dedicated head-mounted displays. Though the value of Samsung’s Gear VR and Google’s Daydream View in moving the perception of consumer VR forward cannot be overstated, the need to use your smartphone in a slot-in style design has its limitations. It consumes battery that you may require for other purposes, it limits the kinds of sensors that the VR system can utilize, and it creates a sub-optimal form factor in order to allow for simple user installation.

sd835vr.jpg

The Qualcomm Snapdragon 835 VR Reference Device

Qualcomm created the first standalone VR HMD reference design back in early 2016, powered by the Snapdragon 820 processor. Google partnered with Qualcomm at I/O to create the Daydream standalone VR headset reference design with the updated Snapdragon 835 Mobile Platform at its core, improving performance and graphical capability along the way. OEMs like Lenovo and HTC have already committed to Daydream standalone units, with Qualcomm at the heart of the hardware.

Qualcomm Technologies recently announced a HMD Accelerator Program (HAP) to help VR device manufacturers quickly develop premium standalone VR HMDs. At the core of this program is the standalone VR HMD reference design. It goes beyond a simple prototype device, offering a detailed reference design that allows manufacturers to apply their own customizations while utilizing our engineering, design, and experience in VR. The reference design is engineered to minimize software changes, hardware issues, and key component validation.

- Hugo Swart, Qualcomm Atheros, Inc.

As part of this venture, and to continue pushing the VR industry forward to more advanced capabilities like XR (extended reality, a merger of VR and AR), Qualcomm is announcing agreements with key component vendors aiming to tighten and strengthen the VR headset ecosystem.

hugoswart.jpg

Hugo Swart, Senior Director, Product Management, Qualcomm Atheros, Inc.

Ximmerse has built a high-precision and drift-free controller for VR applications that offers low latency input and 3DoF (3 degrees of freedom) capability. This can “provide just about any interaction, such as pointing, selecting, grabbing, shooting, and much more. For precise 6 DoF positional tracking of your head, tight integration is required between the sensor fusion processing (Snapdragon) and the data from both the camera and inertial sensors.”

Bosch Sensortec has the BMX055 absolute orientation sensor that performs the function that its name would imply: precisely locating the user in the real world and tracking movement via accelerometer, gyroscope, and magnetometer.

ov9281.jpg

Finally, OmniVision integrates the OV9282, a 1MP high-speed global shutter image sensor used for feature tracking.

These technologies, paired with the work Qualcomm has already done for the Snapdragon 835 VR Development Kit, including on the software side, are an important step in the growth of this segment of the market. I don’t know of anyone who doesn’t believe standalone, wireless headsets are the eventual future of VR and AR, and the momentum created by Qualcomm, Google, and others continues its steady pace of development.

Source: Qualcomm

Go west young researcher! AMD's Radeon Vega Frontier Edition is available now

Subject: Graphics Cards | June 27, 2017 - 06:51 PM |
Tagged: Vega FE, Vega, HPC, amd

AMD have released their new HPC card, the Radeon Vega Frontier Edition, which Jim told you about earlier this week.  The air-cooled version is available now with an MSRP of $999 USD, followed by a water-cooled edition arriving in Q3 with a price tag of $1,499.

cards.gif

The specs they list for the cards are impressive and compare favourably to NVIDIA's Quadro GP100, the card AMD tested against, offering higher TFLOPS for both FP32 and FP16 operations, though the memory bandwidth lags a little behind.

                     Radeon Vega Frontier Edition   Quadro GP100
GPU                  Vega                           GP100
Peak/Boost Clock     1600 MHz                       1442 MHz
FP32 TFLOPS (SP)     13.1                           10.3
FP64 TFLOPS (DP)     0.819                          5.15
Memory Interface     1.89 Gbps 2048-bit HBM2        1.4 Gbps 4096-bit HBM2
Memory Bandwidth     483 GB/s                       716 GB/s
Memory Size          16GB HBC*                      16GB
TDP                  300 W air, 375 W water         235 W
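As a sanity check, the bandwidth rows follow directly from per-pin rate times bus width (a quick sketch):

```python
def hbm2_bandwidth_gbs(gbit_per_pin_per_s, bus_width_bits):
    # GB/s = (Gbit/s per pin) * (bus width in pins) / (8 bits per byte)
    return gbit_per_pin_per_s * bus_width_bits / 8

vega_fe = hbm2_bandwidth_gbs(1.89, 2048)  # ~483.8 GB/s
gp100 = hbm2_bandwidth_gbs(1.4, 4096)     # ~716.8 GB/s
```

Vega FE's narrower 2048-bit interface runs its pins faster but still comes up short of the 4096-bit GP100.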

The memory size listed for Vega is interesting: HBC is AMD's High Bandwidth Cache, managed by a controller that not only uses the on-card memory more effectively but is able to reach out to other high-performance system memory for help.  AMD states that the Radeon Vega Frontier Edition is capable of expanding traditional GPU memory to 256TB; perhaps allowing new texture mods for Skyrim or Fallout!  Expect to see more detail on this feature once we can get our hands on a card to abuse, nicely of course.

slides-16-740x416.jpg

AMD used the DeepBench benchmark to provide comparative results. The AMD Vega FE system was a dual-socket machine with Xeon E5-2640 v4 CPUs @ 2.4 GHz (10C/20T) and 32GB of DDR4 per socket, running Ubuntu 16.04 LTS with ROCm 1.5 and OpenCL 1.2; the NVIDIA Tesla P100 system used the same hardware with cuDNN 5.1, driver 375.39, and CUDA 8.0.61.  Those tests showed the AMD system completing the benchmark in 88.7 ms while the Tesla P100 completed it in 133.1 ms, quite an impressive lead for AMD.  Again, there will be much more information on performance once the Vega FE can be tested.

img4.jpg

Read on to hear about the new card in AMD's own words, with links to their sites.

Source: AMD

CastAR casts off for perhaps the last time

Subject: General Tech | June 27, 2017 - 01:13 PM |
Tagged: Jeri Ellsworth, Rick Johnson, CastAR, augmented reality

The brainchild of former Valve employees Jeri Ellsworth and Rick Johnson, CastAR, is no more.  The two were part of the original team at Valve that helped create SteamVR; their focus was on augmented reality applications, which Valve eventually decided to drop, and Jeri and Rick were allowed to keep the IP they helped develop.  They went on to launch a very successful Kickstarter to help develop their technology and, when they eventually received $15 million in investments, chose to return the money pledged by their Kickstarter backers; a very different reaction than others have had.

Unfortunately they have not been able to continue to attract investment for their AR products, and according to the information Polygon garnered, they have significantly downsized their workforce and may be seeking to sell their technology.  This is exceptionally bad news, as their first set of AR goggles was set to launch later this year.  The market seems far more willing to invest in VR than in AR, which presents a large hurdle for smaller businesses to clear.  Hopefully we will hear happier news about Jeri, her team, and CastAR in the future, but for now it looks rather bleak.

index.png

"In 2013, Technical Illusions got its start with a hugely successful Kickstarter, netting just north of one million dollars. This success drew the attention of investors and eventually led to a funding round of $15 million. With this success, Technical Illusions decided to refund the backers of its Kickstarter."

Here is some more Tech News from around the web:

Tech Talk

Source: Polygon

NVIDIA Partners Launching Mining Focused P106-100 and P104-100 Graphics Cards

Subject: Graphics Cards | June 26, 2017 - 11:29 PM |
Tagged: pascal, nvidia, nicehash, mining, gp106-100, gp104-100, cryptocurrency

In addition to the AMD-based mining graphics cards built on RX 470 Polaris silicon that have appeared online, NVIDIA and its partners are launching cryptocurrency mining cards based on GP106 and GP104 GPUs. Devoid of any GeForce or GTX branding, these cost-controlled, mining-focused cards lack the usual array of display outputs and have much shorter warranties (rumors point at a 3-month warranty restriction imposed by NVIDIA). So far Asus, Colorful, EVGA, Inno3D, MSI, and Zotac "P106-100" cards based on GP106 (GTX 1060 equivalent) silicon have been spotted online, with Manli and Palit reportedly also working on cards. Many of these manufacturers are also planning "P104-100" cards based on GP104 (the GTX 1070), though much less information is available at the moment. Pricing is still up in the air, but pre-orders are starting to pop up overseas, so release dates and prices will hopefully become official soon.

ASUS GP106-100 MINER.jpg

These mining-oriented cards appear to be equipped with heatsinks similar to those of their gaming-oriented siblings, but have fans rated for 24/7 operation. Further, while the cards can be overclocked, they ship at reference clock speeds and allegedly have bolstered power delivery hardware to keep them mining smoothly around the clock. The majority of cards from NVIDIA partners lack any display outputs (the Colorful card has a single DVI out), which helps a bit with ventilation by leaving both slots vented. These cards are intended to be run in headless systems or in systems with graphics integrated into the CPU (miners not wanting to waste a PCI-E slot!).

Card                        Base Clock   Boost Clock   Memory (Type)         Pricing
ASUS MINING-P106-6G         1506 MHz     1708 MHz      6GB (GDDR5) @ 8 GHz   $226
Colorful P106-100 WK1/WK2   1506 MHz     1708 MHz      6GB (GDDR5) @ 8 GHz   ?
EVGA GTX1060 6G P106        1506 MHz     1708 MHz      6GB (GDDR5) @ 8 GHz   $284?
Inno3D P106-100 Compact     1506 MHz     1708 MHz      6GB (GDDR5) @ 8 GHz   ?
Inno3D P106-100 Twin        1506 MHz     1708 MHz      6GB (GDDR5) @ 8 GHz   ?
MSI P106-100 MINER          1506 MHz     1708 MHz      6GB (GDDR5) @ 8 GHz   $224
MSI P104-100 MINER          TBD          TBD           6GB (GDDR5X) @ ?      ?
ZOTAC P106-100              1506 MHz     1708 MHz      6GB (GDDR5) @ 8 GHz   ?

Looking at the NiceHash profitability calculator, the GTX 1060 and GTX 1070 are rated at 20.13 MH/s and 28.69 MH/s respectively at DaggerHashimoto (Ethereum) mining, with many users able to get a good bit higher hash rates with a little overclocking (and, in the case of AMD, undervolting to optimize power efficiency). NVIDIA cards tend to be good for other algorithms as well, such as Zcash's Equihash and LBRY (at least those were the majority of coins my 750 Ti mined, likely because it does not have the memory to attempt ETH mining). The calculator estimates these GPUs at 0.00098942 BTC per day and 0.00145567 BTC per day respectively. If difficulty and exchange rate were to remain constant, that amounts to an income of $1,197.95 per year for a GP106 and $1,791.73 per year for a GP104 GPU, and ROI in under 3 months. Of course, cryptocurrency-to-USD exchange rates will not remain constant, there are transaction and mining fees, and mining difficulty will rise as more hardware is added to the network by miners, so these estimates will be lower in reality. These numbers are also before electricity, maintenance time, and failed hardware costs, but currently mining alt coins is still very much profitable using graphics cards.
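The ROI arithmetic above can be sketched as follows; the BTC/USD rate here is a hypothetical input (exchange rates in mid-2017 swung wildly), and fees, power, and difficulty growth are ignored:

```python
def mining_payback_days(btc_per_day, card_price_usd, usd_per_btc):
    # daily revenue in USD, ignoring fees, electricity, and difficulty growth
    daily_usd = btc_per_day * usd_per_btc
    return card_price_usd / daily_usd

# GP106 estimate from the calculator, at an assumed ~$3,300/BTC
days = mining_payback_days(0.00098942, 226, 3300)  # roughly 69 days
```

At those inputs the cheapest P106-100 pays for itself in just over two months, which is where the "ROI in under 3 months" figure comes from.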

AMD, NVIDIA, and their AIB partners are hoping to get in on this action with cards binned and tuned for mining, and at their rumored prices, cheaper than the gaming-focused RX and GTX variants, miners are sure to scoop these cards up in huge batches (some of the above cards are only available in large orders). Hopefully this will alleviate the strain on the gaming graphics card market and bring prices back down closer to their original MSRPs for gamers!

Also read:

What are your thoughts on all this GPU mining and cryptocurrency / blockchain technology stuff?

Source: Videocardz

Seasonic's PRIME series of PSUs goes Platinum

Subject: Cases and Cooling | June 26, 2017 - 06:32 PM |
Tagged: Seasonic PRIME, 850W, 80 Plus Platinum, modular psu

It was almost a year ago that Lee reviewed the Seasonic PRIME 750W Titanium PSU; today it is [H]ard|OCP who has a review of a cousin of that PSU.  The Seasonic PRIME 850W Platinum PSU is a new addition to the PRIME family, bearing the same 12 year warranty as its relatives as well as the single 12V rail design and physical Hybrid button.  As [H] have already reviewed the previous 850W PRIME model, the newcomer has some big shoes to fill.  It comes very close to doing so, as you can see in their full review.

1496794935dtrk6eag20_2_9_l.jpg

"As is usual, Seasonic talks softly and carries a big stick. The biggest stick lately has been its Prime series power supplies. Today's Prime comes to us touting excellent efficiency, a fully modular design, tight output voltage, and a quiet noise profile supplied by a fluid dynamic bearing fan. Does Seasonic continue its current reign?"

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

Source: [H]ard|OCP

Let's get dangerous! Windows betas leak to the interwebs

Subject: General Tech | June 26, 2017 - 03:03 PM |
Tagged: microsoft, leak, beta

Someone has uploaded an immense amount of previously secret Windows code from Microsoft to Beta Archive, who are currently trying to take the private content down as quickly as they can.  The leaks include a number of unreleased builds of Server 2016, Windows 10 "Redstone" builds, and even versions that run on 64-bit ARM, which would merely be interesting to look at if that were all that was uploaded.  Unfortunately, along with those builds came Microsoft's PnP code, USB and Wi-Fi stacks, storage drivers, and ARM-specific OneCore kernel code, all of which is a goldmine for those who choose to make life miserable for computer users everywhere.  Take a peek at an overview of what was leaked at The Register.

Loose_lips_might_sink_ships.jpg

"The data – some 32TB of official and non-public installation images and software blueprints that compress down to 8TB – were uploaded to betaarchive.com, the latest load of files provided just earlier this week. It is believed the confidential data in this dump was exfiltrated from Microsoft's in-house systems around March this year."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

Mining specific cards are real - ASUS and Sapphire GP106 and RX 470 show up

Subject: Graphics Cards | June 26, 2017 - 12:21 PM |
Tagged: radeon, nvidia, mining, geforce, cryptocurrency, amd

It appears that the prediction of mining-specific graphics cards was spot on, and we are beginning to see them released by various AMD and NVIDIA board partners. ASUS has launched both a GP106-based solution and an RX 470 offering, each labeled as being built exclusively for mining, and Sapphire has tossed its hat into the ring with RX 470 options as well.

EZUPrFwrJaMtsGyJ_setting_000_1_90_end_500.png

The most interesting release is the ASUS MINING-P106-6G, a card that takes no official NVIDIA or GeForce branding, but is clearly based on the GP106 GPU that powers the GeForce GTX 1060. It has no display outputs, so you won't be able to use this as a primary graphics card down the road. It is very likely that these GPUs have bad display controllers on the chip, allowing NVIDIA to make use of an otherwise unusable product.

MINING-P106-6G_IO_500.png

The specifications on the ASUS page list this product as having 1280 CUDA cores, a base clock of 1506 MHz, a Boost clock of 1708 MHz, and 6GB of GDDR5 running at 8.0 GHz. Those are identical specs to the reference GeForce GTX 1060 product.
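Those figures imply the same peak throughput as a reference GTX 1060; the standard back-of-envelope is cores × 2 FLOPs (one fused multiply-add per clock) × boost clock:

```python
def peak_fp32_tflops(cuda_cores, boost_mhz):
    # each CUDA core retires one fused multiply-add (2 FLOPs) per clock
    return cuda_cores * 2 * boost_mhz / 1e6

peak_fp32_tflops(1280, 1708)  # ~4.37 TFLOPS at boost
```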

The ASUS MINING-RX470-4G is a similar build, but uses the somewhat older Radeon RX 470 GPU, which is very efficient for mining.

MINING-RX470-4G_2D_500.png

Interestingly, the ASUS RX 470 mining card has openings for a DisplayPort and HDMI connection, but they are both empty, leaving the single DVI connection as the only display option.

MINING-RX470-4G_IO_500.png

The Mining RX 470 has 4GB of GDDR5, 2048 stream processors, a base clock of 926 MHz and a boost clock of 1206 MHz, again, the same as the reference RX 470 product.

We have also seen Sapphire versions of the RX 470 for mining show up on Overclockers UK with no display outputs and very similar specifications.

GX37YSP_168450_800x800.jpg

In fact, based on the listings at Overclockers UK, Sapphire has four total SKUs, half with 4GB and half with 8GB, binned by clocks and by listing the expected MH/s (megahash per second) performance for Ethereum mining.

63634076269.png

These releases show the desire of both NVIDIA and AMD (and their partners) to continue cashing in on the rising coin mining and cryptocurrency craze. For AMD, this provides an outlet for RX 470 GPUs that might otherwise have sat in inventory with the upgraded RX 500-series out on the market. For NVIDIA, using GPUs with faulty display controllers for mining-specific purposes allows it to better utilize production and gain some additional profit with very little effort.

Those of you still looking to buy GPUs at reasonable prices for GAMING...you remember, what these products were built for...are still going to have trouble finding stock on virtual or physical shelves. Though the value of compute power has been dropping over the past week or so (an expected result of increased interest in the process), I feel we are still on the rising side of this current cryptocurrency trend.

Source: Various

Microcode Bug Affects Intel Skylake and Kaby Lake CPUs

Subject: Processors | June 26, 2017 - 08:53 AM |
Tagged: xeon, Skylake, processor, pentium, microcode, kaby lake, Intel, errata, cpu, Core, 7th generation, 6th generation

A microcode bug affecting Intel Skylake and Kaby Lake processors with Hyper-Threading has been discovered by Debian developers (who describe it as "broken hyper-threading"), a month after this issue was detailed by Intel in errata updates back in May. The bug can cause the system to behave 'unpredictably' in certain situations.

Intel CPUs.jpg

"Under complex micro-architectural conditions, short loops of less than 64 instructions that use AH, BH, CH or DH registers as well as their corresponding wider register (eg RAX, EAX or AX for AH) may cause unpredictable system behaviour. This can only happen when both logical processors on the same physical processor are active."

Until motherboard vendors begin to address the bug with BIOS updates, the only way to prevent the possibility of this microcode error is to disable Hyper-Threading. From the report at The Register (source):

"The Debian advisory says affected users need to disable hyper-threading 'immediately' in their BIOS or UEFI settings, because the processors can 'dangerously misbehave when hyper-threading is enabled.' Symptoms can include 'application and system misbehaviour, data corruption, and data loss'."

The affected models are 6th and 7th-generation Intel processors with Hyper-Threading, which include Core CPUs as well as some Pentiums, and Xeon v5 and v6 processors.
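Until a fix lands, affected users may want to confirm whether Hyper-Threading is actually on. On Linux, one rough heuristic (a sketch, not an official tool) is comparing the `siblings` and `cpu cores` fields in `/proc/cpuinfo`:

```python
def ht_enabled(cpuinfo_text):
    # /proc/cpuinfo reports "siblings" (logical CPUs per package) and
    # "cpu cores" (physical cores per package); Hyper-Threading is
    # active when siblings exceeds cpu cores.
    siblings = cores = None
    for line in cpuinfo_text.splitlines():
        key, _, value = line.partition(":")
        key = key.strip()
        if key == "siblings":
            siblings = int(value)
        elif key == "cpu cores":
            cores = int(value)
    return bool(siblings and cores and siblings > cores)

# usage: with open("/proc/cpuinfo") as f: print(ht_enabled(f.read()))
```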

Source: The Register

Imagination Technologies Pursues Acquisition Talks

Subject: Graphics Cards, Mobile | June 23, 2017 - 10:45 PM |
Tagged: Imagination Technologies, imagination, apple, gpu

According to a press release from Imagination Technologies, the group has been approached by multiple entities who are interested in acquiring them. None of these potential buyers have been mentioned by name, however. The press release also makes it clear that the group is only announcing that discussions have started, and that other interested parties can contact their financial adviser, Rothschild, to join in.

imaginationtech-logo.png

It’s entirely possible that nothing could come from these discussions, but Imagination Technologies clearly wants as many options to choose from as possible.

This announcement is clearly related to the recent news that Apple plans to stop licensing technology from them, which made up about half of the whole company’s revenue at the time. The press release states, in a dedicated, highly visible, single-line paragraph, that they are still in dispute with Apple. As far as I know, Apple has not yet provided proof that it is legally clear of Imagination Technologies’ licenses, and the press release claims that they still dispute Apple’s claims.

Hopefully we’ll hear more concrete details in the near future.

The GeForce GTX USB drive is real and small and fun

Subject: General Tech | June 23, 2017 - 05:13 PM |
Tagged: nvidia, gtx, geforce gtx usb drive, geforce

What started as merely an April Fool's prank by NVIDIA has now turned into one of the cutest little promotions I've ever seen. Originally "launched" as part of the GeForce G-ASSIST technology that purported to offer AI-enabled gaming if you were away from your keyboard, NVIDIA actually built the tiny, adorable, GeForce GTX USB Key.

gtxusb1.jpg

This drive was made to look like the GeForce GTX 1080 Founders Edition graphics card and was only produced in a quantity of 1080. I happened to find a 64GB option in a FedEx box this morning when I came into the office.

gtxusb2.jpg

Performance on this USB 3.0-based drive is pretty solid, peaking at 111 MB/s on reads and 43 MB/s on writes.

GTX 64GB USB-1.png

If you want one of these for yourself, you need to be signed up with GeForce Experience and opted in to the GeForce newsletter. Do that, and you're entered.

gtxusb3.jpg

We have some more pictures of the USB drive below (including the surprising interior shot!), so click this link to see them.

AMD Radeon Vega Frontier Edition Air and Liquid-Cooled GPUs Now Available for Pre-Order

Subject: Graphics Cards | June 23, 2017 - 02:21 AM |
Tagged: vega frontier edition, Vega, radeon, pre-order, gpu, amd

AMD promised “late June” availability for its Radeon Vega Frontier Edition, and it looks like the company will almost hit that mark. The latest high-end prosumer and workstation GPU from AMD is now available for pre-order, with an expected ship date of July 3rd.

Update [2017-06-24]: The initial pre-order stock at both Newegg and Amazon has sold out. It's unknown if AMD will make additional units available in time for the launch.

radeon-vega-frontier-edition.jpg

"The Radeon Vega Frontier Edition helps drive the new digital world. It nurtures creativity. It is your gateway to parts unknown. Expand the boundaries of what's possible and witness the impossible. With the new "Vega" GPU architecture at its core, you will have no barriers or compromises to what you want to achieve. Take advantage of the massive 16GB of cutting-edge, second-generation high-bandwidth memory to create expansive designs and models. Crunch and manipulate datasets using the sixty-four Next-Gen Compute Units (nCUs - 4096 stream processors) at your disposal. Unleash your imagination to develop games, CGI or VR content leveraging the latest features found on the "Vega" GPU architecture and witness the breathtaking power of "Vega" course through your system."

The Radeon Vega Frontier Edition is available in both air and AIO liquid-cooled designs, and the product page clarifies the following specs. Note, however, that specific core and memory clocks are not listed, which is especially interesting given the liquid-cooled variant's increased TDP.

AMD Radeon Vega Frontier Edition (Air Cooled)

radeon-vega-frontier-edition-air.jpg

  • Memory: 16GB High Bandwidth Cache
  • Memory Bandwidth: 483 GB/s
  • Compute Units: 64
  • Stream Processors: 4096
  • Single Precision Compute (FP32): 13.1 TFLOPS
  • Half Precision Compute (FP16): 26.2 TFLOPS
  • Display Output: 3 x DisplayPort 1.4, 1 x HDMI 2.0
  • TDP: 300W
  • Price: $1,199.99 (Newegg | Amazon)

AMD Radeon Vega Frontier Edition (Liquid Cooled)

radeon-vega-frontier-edition-liquid-1.jpg

  • Memory: 16GB High Bandwidth Cache
  • Memory Bandwidth: 483 GB/s
  • Compute Units: 64
  • Stream Processors: 4096
  • Single Precision Compute (FP32): 13.1 TFLOPS
  • Half Precision Compute (FP16): 26.2 TFLOPS
  • Display Output: 3 x DisplayPort 1.4, 1 x HDMI 2.0
  • TDP: 375W
  • Price: $1,799.99 (Newegg | Amazon)
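Even without published clocks, the peak-TFLOPS figures imply one. Peak FP32 throughput for a GPU is typically stream processors × 2 (one fused multiply-add per clock) × clock speed, so we can back out the clock AMD is assuming. This is my own back-of-the-envelope arithmetic, not an AMD spec:

```python
def implied_clock_mhz(tflops, stream_processors, ops_per_clock=2):
    """Back out the clock implied by a peak compute figure.

    Peak FLOPS = stream processors x ops per clock (2 for FP32 FMA) x clock.
    """
    return tflops * 1e12 / (stream_processors * ops_per_clock) / 1e6

# 13.1 TFLOPS over 4096 stream processors works out to roughly 1600 MHz.
print(implied_clock_mhz(13.1, 4096))
```

Note that the listed FP16 figure (26.2 TFLOPS) is exactly double the FP32 number, consistent with double-rate packed FP16 math at the same clock. Since both cards list identical TFLOPS despite the 75W TDP gap, the liquid-cooled model presumably just sustains that clock more consistently.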

Before you pre-order, however, there’s one big caveat. Although AMD touts the card as ideal for “innovators, creators, and pioneers of the world,” the Radeon Vega Frontier Edition will lack application certification, a factor that is crucial to many who work with content creation software and something typically found in high-end professional GPUs like the Quadro and FirePro lines.

For those hoping for Vega-based professional cards sporting certification, the Vega Frontier Edition product page teases the launch of the Vega-powered Radeon Pro WX in Q3 2017.

FSP embraces summer with new heatsinks like the Windale 6

Subject: Cases and Cooling | June 22, 2017 - 03:06 PM |
Tagged: FSP Group, windale 6

FSP Group is best known for its PSUs, but the company has recently branched out into other components, including heatsinks.  [H]ard|OCP had a chance to test their Windale 6 cooler, which sounds oddly familiar.  Cooling performance was somewhat better than a stock cooler, and noticeably quieter, but overclockers may want to look elsewhere.  The cooler measures 122x110x160mm and sports a 120mm fan; however, the mounting solution presented some challenges.  Drop by for the details.

1497486913ql8e7l7hj5_2_7_l.jpg

"FSP is a very new brand when it comes to CPU air coolers and is entering a market that is highly competitive and seeded with others that have been designing air coolers for quite some time. Its Windale 6 cooler features six direct contact heatpipes, a 120mm fan, and what FSP says is an "optimized fin design." But does it cool?"


Source: [H]ard|OCP