Subject: General Tech | June 28, 2017 - 11:39 PM | Scott Michaud
Tagged: pc gaming, gdq, speedrun
Starting on Sunday, Games Done Quick will be hosting their twice-annual speedrun marathon, running around the clock until 3am on the following Sunday. It will begin with a one-handed playthrough of NieR: Automata, and just keep going through game after game, including a handful of races between popular runners of applicable titles. (Personally, those tend to be my favorite segments.) Many are run on the PC!
This event will benefit Doctors Without Borders.
Until Awesome Games Done Quick 2017, the amount raised per week-long event had settled at around $1.3 million. That one, however, leapfrogged the previous year's total by a whole million dollars, ending up at $2.22 million USD. Summer Games Done Quick, apart from last year, tends to do a little less, but who knows?
Subject: General Tech | June 28, 2017 - 11:17 PM | Scott Michaud
Tagged: Unity, machine learning, deep learning
Unity, who makes the popular 3D game engine of the same name, has announced a research fellowship for integrating machine learning into game development. Two students, who must have been enrolled in a Masters or a PhD program on June 26th, will be selected and provided with $30,000 for a 6-month fellowship. The deadline is midnight (PDT) on September 9th.
We’re beginning to see a lot of machine-learning applications being discussed for gaming. There are some cases, like global illumination and fluid simulations, where it could be faster for a deep-learning algorithm to hallucinate a convincing result than for a physical solver to produce a correct one. In cases like these, it makes sense to post-process each frame, so, naturally, game engine developers are paying attention.
If eligible, you can apply on their website.
Subject: Graphics Cards | June 28, 2017 - 11:00 PM | Scott Michaud
Tagged: epic games, ue4, nvidia, geforce, giveaway
If you are an indie game developer, and you could use a little more GPU performance, NVIDIA is hosting a hardware giveaway. Starting at the end of July, and ongoing until Summer 2018, NVIDIA and Epic Games will be giving away GeForce GTX 1080 and GeForce GTX 1080 Ti cards to batches of Unreal Engine 4 projects.
To enter, you need to share screenshots and videos of your game on Twitter, Facebook, and Instagram, tagging both UnrealEngine and NVIDIA. (The specific accounts are listed on the Unreal Engine blog post that announces this initiative.) They will also feature these projects on both the Unreal Engine and the NVIDIA blog, which is just as valuable for indie projects.
So... hey! Several chances at free hardware!
Subject: General Tech | June 28, 2017 - 10:40 PM | Scott Michaud
Tagged: square enix, pc gaming, eidos montreal, deus ex: mankind divided
Frames of modern video games can be made up of tens of thousands of draw calls, each pairing a set of polygons with the shader pipeline that operates on them, along with compute tasks. Last September, we found an article by Adrian Courrèges that broke down a single frame of DOOM and discussed all of its techniques based on information from debug tools and SIGGRAPH slides.
This time, we found a video from János Turánszki that analyzes the ~32,000 - 33,000 graphics API calls of a single Deus Ex: Mankind Divided frame, using NVIDIA Nsight. As he scrubs through these events, he mentions things like how text is painted, a bug with temporal anti-aliasing, what appears to be a multi-pass blur for frosted glass, and so forth.
János Turánszki develops the open-source (MIT licensed) Wicked Engine.
Subject: Storage | June 28, 2017 - 09:49 PM | Allyn Malventano
Tagged: toshiba, WD, wdc, nand, 3d, BiCS, 96-layer, QLC
A couple of announcements out of Toshiba and Western Digital today. First up is Toshiba announcing QLC (4 bit per cell) flash on their existing BiCS 3 (64-layer) technology. QLC may not be the best for endurance as the voltage tolerances become extremely tight with 16 individual voltage states per cell, but Toshiba has been working on this tech for a while now.
Subject: General Tech | June 28, 2017 - 06:24 PM | Scott Michaud
Tagged: solidworks, ray tracing, radeon, prorender, nvidia, mental ray, Blender, amd
AMD has released a free ray-tracing engine for Blender, as well as Maya, 3D Studio Max, and SolidWorks, called Radeon ProRender. It uses a physically-based workflow, which allows multiple materials to be expressed in a single, lighting-independent shader, making it easy to color objects and have them usable in any sensible environment.
Image Credit: Mike Pan (via Twitter)
I haven’t used it yet, and I definitely haven’t tested how it stacks up against Cycles, but we’re beginning to see some test renders from Blender folks. It looks pretty good, as you can see with the water-filled Cornell box (above). Moreover, it’s rendered on an NVIDIA GPU, which I’m guessing they had because of Cycles, but that also shows that AMD is being inclusive with their software.
Radeon ProRender puts more than a little pressure on Mental Ray, which is owned by NVIDIA and licensed on annual subscriptions. We’ll need to see how quality evolves, but, as you see in the test render above, it looks pretty good so far... and the price can’t be beat.
Subject: General Tech | June 28, 2017 - 05:09 PM | Scott Michaud
Tagged: ubisoft, pc gaming
Honestly, I don’t really know how many first-party engines Ubisoft maintains these days. Anvil is one of their more popular ones, used in Assassin’s Creed, Steep, For Honor, and Tom Clancy’s Ghost Recon Wildlands. Far Cry 5 will be using the Dunia Engine, which was forked from the original CryEngine. Tom Clancy’s The Division, Mario + Rabbids, and the new South Park use Snowdrop. I know that I’m missing some.
Add another one to the list: Voyager, which will be used in Beyond Good and Evil 2.
From what I gather from the video, this engine is optimized for massive differences in scale. The Creative Director for Beyond Good and Evil 2, Michel Ancel, showed the camera (in developer mode) smoothly transition from a highly detailed player model out to a view of part of a solar system. They claim that the sunset effects are actually caused by the planet’s rotation. Interesting stuff!
Subject: Processors | June 28, 2017 - 03:03 PM | Jeremy Hellstrom
Tagged: 7900x, Core i9, Intel, skylake-x, x299
The Tech Report recently wrapped up the first part of their review of Intel's new Core i9-7900X, focusing on its effectiveness in a production machine. Their benchmarks cover a variety of scientific tasks, such as PhotoWorxx, FPU Julia, and Mandel, as well as creativity benchmarks like picCOLOR, DAWBench DSP 2017, and STARS Euler3D. During their testing they saw the same peaks in power consumption as Ryan did in his review: 253W under a full Blender load. Their follow-up review will focus on the new chip's gaming prowess; for now, take a look at how an i9-7900X will perform for you when you are not playing around.
"Intel's Core i9-7900X and its Skylake-X brethren bring AVX-512 support, a new cache hierarchy, and a new on-die interconnect to high-end desktops. We examine how this boatload of high-performance computing power advances the state of the art in productivity applications."
Here are some more Processor articles from around the web:
- Intel Core i9 7900X Linux Benchmarks @ Phoronix
- Intel Core i7 7740X Benchmarks On Linux @ Phoronix
- Ryzen 5 1400 @ Hardware Secrets
Subject: Storage | June 28, 2017 - 02:12 PM | Jeremy Hellstrom
Tagged: NVMe, toshiba, Toshiba XG5, ssd, nand, M.2, BiCS, 64-Layer
We first heard about the Toshiba XG5 1TB NVMe SSD at Computex, with its 64-layer BiCS flash and stated read speeds of 3 GB/s and writes just over 2 GB/s. Today Kitguru published a review of the new drive, including ATTO results which match and even exceed the advertised read and write speeds. Their real-world test involved copying 30GB of movies off of a 512GB Samsung 950 Pro to the XG5; only Samsung's new 960 lineup and the OCZ RD400 were able to beat Toshiba's new SSD. Read more in their full review, right here.
"The Toshiba XG5 1TB NVMe SSD contains Toshiba's newest 3D 64-Layer BiCS memory and our report will examine Toshiba's newest memory, as well as their newest NVMe controller to go along with it."
Here are some more Storage reviews from around the web:
- Toshiba N300 8TB HDD @ Kitguru
- Kingston Gold Series UHS-1 Speed Class 3 64GB MicroSDXC @ Modders-Inc
- Kingston DataTraveler Ultimate GT 2TB USB 3.1 Gen 1 Flash Drive Review @ NikKTech
- Drobo 5D3 DAS Review (Thunderbolt 3) @ Kitguru
- LaCie 2TB Rugged Thunderbolt USB-C Professional All-Terrain Mobile Storage Review @ NikKTech
Subject: General Tech | June 28, 2017 - 01:07 PM | Jeremy Hellstrom
Tagged: gaming, Intel, ddr3, ddr4
Overclockers Club have completed a daunting task: testing the effect of RAM frequency on game performance from DDR3-1333 through DDR4-3200. In theory, Intel's chips will not see the same improvements as AMD's Ryzen, since they lack the Infinity Fabric, which has proved to be sensitive to memory frequency. Since OCC cover two generations of RAM, they also needed to test with two different processors, in this case the i7-4770K and i7-7700K, and they tested performance at 1440p as well as 1080p. Read the full article to see the results, which do show some performance deltas, though they are nothing compared to spending more on your GPU.
"After running through all of the tests, it appears that what I previously thought was an easy and clear answer is in fact more complicated. With the evidence provided I can safely say that memory can play a large role in some games over all frame rates. However, other factors like the processor, type of video card, and resolution will usually provide bigger impact in the final frame rates. Strictly speaking of game performances, the fastest memory tested does yield better results."
Here is some more Tech News from around the web:
- Nintendo New 2DS XL mini-review: The best version of the 3DS hardware yet @ Ars Technica
- The Steam Sale continues ... if you somehow didn't realize it by now
- Unknown Pleasures: Steam’s latest hidden delights @ Rock, Paper, SHOTGUN
- Racing games on sale @ Humble Store
- Wot I Think: Darkest Dungeon – The Crimson Court @ Rock, Paper, SHOTGUN
- Sand worms and lightning: Aven Colony takes city-building to exoplanets @ Ars Technica
- Double Dragon Trilogy free with any purchase @ GOG
Subject: General Tech | June 28, 2017 - 12:53 PM | Ryan Shrout
Tagged: giveaway, contest
It seems like it has been forever since we had a contest on the site...let's remedy that with our friends at FSP!
Anyone on the globe is able to enter - good luck!
Subject: Motherboards | June 28, 2017 - 01:44 AM | Tim Verry
Tagged: gigabyte, mini ITX, b350, amd, AM4, raven ridge, SFF, ryzen
Gigabyte is joining the small form factor Ryzen motherboard market with its new GA-AB350N-Gaming WIFI. The new Mini ITX motherboard sports AMD’s AM4 socket and B350 chipset and supports Ryzen “Summit Ridge” CPUs, Bristol Ridge APUs (7th Gen/Excavator), and future Zen-based Raven Ridge APUs. The board packs a fair bit of hardware into the Mini ITX form factor and is aimed squarely at gamers and enthusiasts.
The AB350N-Gaming WIFI has an interesting design in that some of the headers and connectors are flipped versus where they are traditionally located. The chipset sits to the left of the CPU socket, above the 6-phase VRMs and PowIRStage digital ICs. Four SATA 6Gbps ports and a USB 3.0 header occupy the top edge of the board. Two dual-channel DDR4 memory slots are aligned on the right edge and support (overclocked) frequencies up to 3200 MHz depending on the processor used. The Intel wireless NIC, Realtek Gigabit Ethernet, and Realtek ALC1220 audio chips have been placed in the space between the AM4 socket and the single PCI-E 3.0 x16 slot. There is also a single M.2 (PCI-E 3.0 x4 32Gbps) slot on the underside of the motherboard.

Gigabyte has also integrated “RGB Fusion” technology, with two onboard RGB LED lighting zones and two RGBW headers for off-board lighting strips, as well as high-end audio capacitors and a headphone amplifier. Smart Fan 5 technology is allegedly capable of automatically differentiating between fans and water pumps connected to the two fan headers, and will automatically provide the correct PWM signal based on fan curves the user can customize in the UEFI BIOS. The motherboard is powered by a 24-pin ATX and an 8-pin EPS connector, and while it does not have a very beefy power phase setup, it should be plenty for most overclocks (especially with Ryzen not wanting to go much past 4 GHz easily anyway).
Rear I/O includes:
- 1 x PS/2
- 2 x Antenna (Intel 802.11ac Wi-Fi + BT 4.2)
- 2 x USB 2.0
- 2 x USB 3.1 Gen 2 (10Gbps)
- 4 x USB 3.1 Gen 1 (5Gbps)
- 6 x Audio (5 x analog, 1 x S/PDIF)
- 1 x DisplayPort 1.2
- 1 x HDMI 1.4
- 1 x Realtek GbE
Gigabyte has an interesting SFF motherboard with the GA-AB350N-Gaming WIFI and I am interested in seeing the reviews. More Mini ITX options for Ryzen and other Zen-based systems is a good thing, and moving the power phases to the left may end up helping overclocking and cooling in smaller cases with tower coolers.
Unfortunately, Gigabyte has not yet revealed pricing or availability. Looking around online at its competition, I would guess it will land around $85 though.
- Biostar's ITX Ryzen motherboard in action; the X370GTN
- BIOSTAR Shows Mini-ITX AM4 Motherboard for AMD Ryzen
- ASRock's Fatal1ty X370 Gaming-ITX/ac Mini ITX Motherboard for Ryzen Coming Soon
- The AMD Ryzen 7 1800X Review: Now and Zen
- The Ryzen 5 Review: 1600X and 1500X Take on Core i5
Subject: Mobile | June 27, 2017 - 08:00 PM | Ryan Shrout
Tagged: xr, VR, qualcomm, google, daydream, AR
Qualcomm has put in steady work on creating a vibrant hardware ecosystem for mobile VR to facilitate broad adoption of wireless, dedicated head-mounted displays. Though the value of Samsung’s Gear VR and Google’s Daydream View cannot be overstated in moving the perception of consumer VR forward, the need to utilize your smartphone in a slot-in style design has its limitations. It consumes battery that you may require for other purposes, it limits the kinds of sensors that the VR system can utilize, and it creates a sub-optimal form factor in order to allow for simple user installation.
The Qualcomm Snapdragon 835 VR Reference Device
Qualcomm created the first standalone VR HMD reference design back in early 2016, powered by the Snapdragon 820 processor. Google partnered with Qualcomm at I/O to create the Daydream standalone VR headset reference design with the updated Snapdragon 835 Mobile Platform at its core, improving performance and graphical capability along the way. OEMs like Lenovo and HTC have already committed to Daydream standalone units, with Qualcomm at the heart of the hardware.
Qualcomm Technologies recently announced a HMD Accelerator Program (HAP) to help VR device manufacturers quickly develop premium standalone VR HMDs. At the core of this program is the standalone VR HMD reference design. It goes beyond a simple prototype device, offering a detailed reference design that allows manufacturers to apply their own customizations while utilizing our engineering, design, and experience in VR. The reference design is engineered to minimize software changes, hardware issues, and key component validation.
- Hugo Swart, Senior Director, Product Management, Qualcomm Atheros, Inc.
As part of this venture, and to continue pushing the VR industry forward to more advanced capabilities like XR (extended reality, a merger of VR and AR), Qualcomm is announcing agreements with key component vendors aiming to tighten and strengthen the VR headset ecosystem.
Ximmerse has built a high-precision and drift-free controller for VR applications that offers low latency input and 3DoF (3 degrees of freedom) capability. This can “provide just about any interaction, such as pointing, selecting, grabbing, shooting, and much more. For precise 6 DoF positional tracking of your head, tight integration is required between the sensor fusion processing (Snapdragon) and the data from both the camera and inertial sensors.”
Bosch Sensortec has the BMX055 absolute orientation sensor that performs the function that its name would imply: precisely locating the user in the real world and tracking movement via accelerometer, gyroscope, and magnetometer.
Finally, OmniVision integrates the OV9282 which is a 1MP high speed shutter image sensor for feature tracking.
These technologies, paired with the work Qualcomm has already done for the Snapdragon 835 VR Development Kit, including on the software side, are an important step in the growth of this segment of the market. I don’t know of anyone that doesn’t believe standalone, wireless headsets are the eventual future of VR and AR, and the momentum created by Qualcomm, Google, and others continues its steady pace of development.
Subject: Graphics Cards | June 27, 2017 - 06:51 PM | Jeremy Hellstrom
Tagged: Vega FE, Vega, HPC, amd
AMD have released their new HPC card, the Radeon Vega Frontier Edition, which Jim told you about earlier this week. The air-cooled version is available now with an MSRP of $999 USD, followed by a water-cooled edition arriving in Q3 with a price tag of $1,499.
The specs they list for the cards are impressive and compare favourably to NVIDIA's P100, the card AMD tested against, offering higher TFLOPS for both FP32 and FP16 operations, though the memory bandwidth lags a little behind.
| | Radeon Vega Frontier Edition | Tesla P100 |
|---|---|---|
| Peak/Boost Clock | 1600 MHz | 1442 MHz |
| FP32 TFLOPS (SP) | 13.1 | 10.3 |
| FP64 TFLOPS (DP) | | |
| Memory Interface | 1.89 Gb/s | |
| Memory Bandwidth | 483 GB/s | 716 GB/s |
| Memory Size | 16GB HBC* | 16GB |
| TDP | 300 W air, 375 W water | 235 W |
The memory size for the Vega is interesting. HBC refers to AMD's High Bandwidth Cache, backed by the High Bandwidth Cache Controller, which not only uses the memory cache more effectively but is able to reach out to other high-performance system memory for help. AMD states that the Radeon Vega Frontier Edition is capable of expanding traditional GPU memory to 256TB; perhaps allowing new texture mods for Skyrim or Fallout! Expect to see more detail on this feature once we can get our hands on a card to abuse, nicely of course.
AMD used the DeepBench benchmark to provide comparative results. The AMD Vega FE system used a dual-socket configuration with Xeon E5-2640 v4s @ 2.4 GHz (10C/20T) and 32GB of DDR4 per socket, on Ubuntu 16.04 LTS with ROCm 1.5 and OpenCL 1.2; the NVIDIA Tesla P100 system used the same hardware with cuDNN 5.1, driver 375.39, and CUDA 8.0.61. Those tests showed the AMD system completing the benchmark in 88.7 ms while the Tesla P100 completed it in 133.1 ms, quite an impressive lead for AMD. Again, there will be much more information on performance once the Vega FE can be tested.
Read on to hear about the new card in AMD's own words, with links to their sites.
Subject: General Tech | June 27, 2017 - 01:13 PM | Jeremy Hellstrom
Tagged: Jeri Ellsworth, Rick Johnson, CastAR, augmented reality
The brainchild of former Valve employees Jeri Ellsworth and Rick Johnson, CastAR, is no more. The pair were part of the original team at Valve which helped create SteamVR; their focus was on augmented reality applications, which Valve eventually decided to drop, and Jeri and Rick were allowed to keep the IP they helped develop. They went on to launch a very successful Kickstarter to help develop their technology, and when they eventually received $15 million in investments they chose to return the money pledged by their Kickstarter backers; a very different reaction than others have had.
Unfortunately, they have not been able to continue attracting investment for their AR products, and according to the information Polygon garnered, they have significantly downsized the number of employees and may be seeking to sell their technology. This is exceptionally bad news, as their first set of AR goggles was set to launch later this year. The market seems far more willing to invest in VR than in AR, which presents a large hurdle for smaller businesses to succeed. Hopefully we will hear happier news about Jeri, her team, and CastAR in the future, but for now it looks rather bleak.
"In 2013, Technical Illusions got its start with a hugely successful Kickstarter, netting just north of one million dollars. This success drew the attention of investors and eventually led to a funding round of $15 million. With this success, Technical Illusions decided to refund the backers of its Kickstarter."
Here is some more Tech News from around the web:
- Google Slapped With $2.7 Billion By EU For Skewing Searches @ Slashdot
- Ukrainian Banks, Electricity Firm Hit by Fresh Cyber Attack; Reports Claim the Ransomware Is Quickly Spreading Across the World @ Slashdot
- Solving the NVMeF-JBOF-is-not-a-SAN conundrum @ The Register
- Dell drops optical drive price-fixing lawsuit against Hitachi @ The Register
- Linksys EA9500 MAX-STREAM AC5400 MU-MIMO Gigabit Router Review @ NikKTech
- TP-Link Deco M5 Mesh Wi-Fi Router System @ Custom PC Review
- WiMiUs L1 4K Action Cam @ Benchmark Reviews
Subject: Graphics Cards | June 26, 2017 - 11:29 PM | Tim Verry
Tagged: pascal, nvidia, nicehash, mining, gp106-100, gp104-100, cryptocurrency
In addition to the AMD-based mining graphics cards built on RX 470 Polaris silicon that have appeared online, NVIDIA and its partners are launching cryptocurrency mining cards based on GP106 and GP104 GPUs. Devoid of any GeForce or GTX branding, these cost-controlled, mining-focused cards lack the usual array of display outputs and have much shorter warranties (rumors point at a 3-month warranty restriction imposed by NVIDIA). So far, Asus, Colorful, EVGA, Inno3D, MSI, and Zotac "P106-100" cards based on GP106 (GTX 1060 equivalent) silicon have been spotted online, with Manli and Palit reportedly also working on cards. Many of these manufacturers are also planning "P104-100" cards based on GP104 (the GTX 1070), though much less information is available at the moment. Pricing is still up in the air, but pre-orders are starting to pop up overseas, so release dates and prices will hopefully become official soon.
These mining-oriented cards appear to be equipped with heatsinks similar to their gaming-oriented siblings, but have fans rated for 24/7 operation. Further, while the cards can be overclocked, they ship at reference clock speeds and allegedly have bolstered power delivery hardware to keep them mining smoothly under 24/7 operation. The majority of cards from NVIDIA partners lack any display outputs (the Colorful card has a single DVI out), which helps a bit with ventilation by leaving both slots vented. These cards are intended to run in headless systems or in systems with graphics integrated into the CPU (miners not wanting to waste a PCI-E slot!).
| Card | Base Clock | Boost Clock | Memory (Type) | Pricing |
|---|---|---|---|---|
| ASUS MINING-P106-6G | 1506 MHz | 1708 MHz | 6GB (GDDR5) @ 8 GHz | $226 |
| Colorful P106-100 WK1/WK2 | 1506 MHz | 1708 MHz | 6GB (GDDR5) @ 8 GHz | ? |
| EVGA GTX1060 6G P106 | 1506 MHz | 1708 MHz | 6GB (GDDR5) @ 8 GHz | $284? |
| Inno3D P106-100 Compact | 1506 MHz | 1708 MHz | 6GB (GDDR5) @ 8 GHz | ? |
| Inno3D P106-100 Twin | 1506 MHz | 1708 MHz | 6GB (GDDR5) @ 8 GHz | ? |
| MSI P106-100 MINER | 1506 MHz | 1708 MHz | 6GB (GDDR5) @ 8 GHz | $224 |
| MSI P104-100 MINER | TBD | TBD | 6GB (GDDR5X) @ ? | ? |
| ZOTAC P106-100 | 1506 MHz | 1708 MHz | 6GB (GDDR5) @ 8 GHz | ? |
Looking at the NiceHash Profitability Calculator, the GTX 1060 and GTX 1070 are rated at 20.13 MH/s and 28.69 MH/s respectively for DaggerHashimoto (Ethereum) mining, with many users able to get a good bit higher hash rates with a bit of overclocking (and, in the case of AMD, undervolting to optimize power efficiency). NVIDIA cards tend to be good for other algorithms as well, such as Equihash (Zcash) and LBRY (at least those were the majority of coins my 750 Ti mined, likely due to it not having the memory to attempt ETH mining, heh). The calculator estimates these GPUs at 0.00098942 BTC per day and 0.00145567 BTC per day respectively. If difficulty and exchange rate were to remain constant, that amounts to an income of $1197.95 per year for a GP106 and $1791.73 per year for a GP104 GPU, and ROI in under 3 months. Of course, cryptocurrency-to-USD exchange rates will not remain constant, there are transaction and mining fees, and mining difficulty will rise as more hardware is added to the network, so these estimated numbers will be lower in reality. Also, these numbers are before electricity, maintenance time, and failed hardware costs, but currently mining alt coins is still very much profitable using graphics cards.
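If you want to run that back-of-the-envelope math yourself, it only takes a few lines. A minimal sketch follows; the BTC-per-day figures are the NiceHash estimates quoted above, while the BTC/USD exchange rate and the P104 card price are assumptions for illustration only:

```python
# Rough mining income/ROI estimator. The BTC-per-day rates are the
# NiceHash calculator figures; BTC_USD and the P104 price are assumed,
# and real returns will be lower once fees, electricity, and rising
# difficulty are factored in.

def annual_income_usd(btc_per_day: float, btc_usd: float) -> float:
    """Projected yearly income if difficulty and exchange rate held constant."""
    return btc_per_day * 365 * btc_usd

def roi_days(card_price_usd: float, btc_per_day: float, btc_usd: float) -> float:
    """Days to recoup the card price, ignoring electricity and fees."""
    return card_price_usd / (btc_per_day * btc_usd)

BTC_USD = 2500.0  # assumed mid-2017 ballpark exchange rate

cards = [
    ("P106-100 (GTX 1060 class)", 226.0, 0.00098942),
    ("P104-100 (GTX 1070 class)", 330.0, 0.00145567),  # price assumed
]
for name, price, btc_day in cards:
    print(f"{name}: ~${annual_income_usd(btc_day, BTC_USD):.0f}/yr, "
          f"ROI in ~{roi_days(price, btc_day, BTC_USD):.0f} days")
```

Plug in the current exchange rate and a real street price and the "ROI in under 3 months" claim is easy to sanity-check, or to re-run as difficulty climbs.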
AMD and NVIDIA (and their AIB partners) are hoping to get in on this action with cards binned and tuned for mining, and at their rumored prices, which place them cheaper than their gaming-focused RX and GTX variants, miners are sure to scoop these cards up in huge batches (some of the above cards are only available in large orders). Hopefully this will alleviate the strain on the gaming graphics card market and bring prices back down closer to their original MSRPs for gamers!
- Mining specific cards are real - ASUS and Sapphire GP106 and RX 470 show up
- First look at Pascal-based GPU cryptocurrency mining station @ Videocardz
- ASUS, COLORFUL and MSI showcase their mining graphics cards @ Videocardz
- Riding the Crypto wave @ TechPowerUP Forums (links/info on mining cards collected here)
- Donate to the PC Perspective Mining Pool! A NiceHash How-to
- Let's Talk About Mining - Cryptocurrency Revisited
- Computex 2017: ASRock Launching H110 Pro BTC+ Motherboard With 13 PCI-E Slots
What are your thoughts on all this GPU mining and cryptocurrency / blockchain technology stuff?
Subject: Cases and Cooling | June 26, 2017 - 06:32 PM | Jeremy Hellstrom
Tagged: Seasonic PRIME, 850W, 80 Plus Platinum, modular psu
It was almost a year ago that Lee reviewed the Seasonic PRIME 750W Titanium PSU; today it is [H]ard|OCP who has a review of a cousin of that PSU. The Seasonic PRIME 850W Platinum PSU is a new addition to the PRIME family, bearing the same 12 year warranty as its relatives as well as the single 12V rail design and physical Hybrid button. As [H] have already reviewed the previous 850W PRIME model, the newcomer has some big shoes to fill. It comes very close to doing so, as you can see in their full review.
"As is usual, Seasonic talks softly and carries a big stick. The biggest stick lately has been its Prime series power supplies. Today's Prime comes to us touting excellent efficiency, a fully modular design, tight output voltage, and a quiet noise profile supplied by a fluid dynamic bearing fan. Does Seasonic continue its current reign?"
Here are some more Cases & Cooling reviews from around the web:
- Seasonic PRIME 850W Gold @ Kitguru
- Thermaltake Smart Pro RGB 850W Power Supply Review @ Hardware Asylum
- Super Flower Leadex II 750 W @ techPowerUp
- Which Power Supply do you need? – Seasonic showcase 2017 @ Kitguru
Subject: General Tech | June 26, 2017 - 03:03 PM | Jeremy Hellstrom
Tagged: microsoft, leak, beta
Someone has uploaded an immense amount of previously secret Windows code from Microsoft to Beta Archive, which is currently trying to take the private content down as quickly as it can. The leaks include a number of unreleased builds of Server 2016, Windows 10 "Redstone" builds, and even versions built to run on 64-bit ARM, which would be interesting to look at if that was all that was uploaded. Unfortunately, along with those builds were Microsoft's PnP code, USB and Wi-Fi stacks, storage drivers, and ARM-specific OneCore kernel code, all of which is a goldmine for those who choose to make life miserable for computer users everywhere. Take a peek at an overview of what was leaked at The Register.
"The data – some 32TB of official and non-public installation images and software blueprints that compress down to 8TB – were uploaded to betaarchive.com, the latest load of files provided just earlier this week. It is believed the confidential data in this dump was exfiltrated from Microsoft's in-house systems around March this year."
Here is some more Tech News from around the web:
- MEMS: The Biggest Word in Small @ Hack a Day
- Dobot Magician Robotic Arm @ techPowerUp
- Twitch Announces Six-Day Marathon Of Classic MST3K Episodes @ Slashdot
Subject: Graphics Cards | June 26, 2017 - 12:21 PM | Ryan Shrout
Tagged: radeon, nvidia, mining, geforce, cryptocurrency, amd
It appears that the prediction of mining-specific graphics cards was spot on, and we are beginning to see them released by various AMD and NVIDIA board partners. ASUS has launched both a GP106-based solution and an RX 470 offering, labeled as being built exclusively for mining, and Sapphire has tossed its hat into the ring with RX 470 options as well.
The most interesting release is the ASUS MINING-P106-6G, a card that takes no official NVIDIA or GeForce branding, but is clearly based on the GP106 GPU that powers the GeForce GTX 1060. It has no display outputs, so you won't be able to use this as a primary graphics card down the road. It is very likely that these GPUs have bad display controllers on the chip, allowing NVIDIA to make use of an otherwise unusable product.
The specifications on the ASUS page list this product as having 1280 CUDA cores, a base clock of 1506 MHz, a Boost clock of 1708 MHz, and 6GB of GDDR5 running at 8.0 GHz. Those are identical specs to the reference GeForce GTX 1060 product.
The ASUS MINING-RX470-4G is a similar build but using the somewhat older, but very efficient for mining, Radeon RX 470 GPU.
Interestingly, the ASUS RX 470 mining card has openings for a DisplayPort and HDMI connection, but they are both empty, leaving the single DVI connection as the only display option.
The Mining RX 470 has 4GB of GDDR5, 2048 stream processors, a base clock of 926 MHz and a boost clock of 1206 MHz, again, the same as the reference RX 470 product.
We have also seen Sapphire versions of the RX 470 for mining show up on Overclockers UK with no display outputs and very similar specifications.
In fact, based on the listings at Overclockers UK, Sapphire has four total SKUs, half with 4GB and half with 8GB, binned by clocks and by listing the expected MH/s (megahash per second) performance for Ethereum mining.
These releases show both NVIDIA's and AMD's (and their partners') desire to continue cashing in on the rising coin mining and cryptocurrency craze. For AMD, this provides an outlet for RX 470 GPUs that might have otherwise sat in inventory with the upgraded RX 500-series out on the market. For NVIDIA, using GPUs that have faulty display controllers for mining-specific purposes allows it to better utilize production and gain some additional profit with very little effort.
Those of you still looking to buy GPUs at reasonable prices for GAMING...you remember, what these products were built for...are still going to have trouble finding stock on virtual or physical shelves. Though the value of compute power has been dropping over the past week or so (an expected result of increased interest in the process), I feel we are still on the rising side of this current cryptocurrency trend.
Subject: Processors | June 26, 2017 - 08:53 AM | Sebastian Peak
Tagged: xeon, Skylake, processor, pentium, microcode, kaby lake, Intel, errata, cpu, Core, 7th generation, 6th generation
A microcode bug affecting Intel Skylake and Kaby Lake processors with Hyper-Threading has been discovered by Debian developers (who describe it as "broken hyper-threading"), a month after this issue was detailed by Intel in errata updates back in May. The bug can cause the system to behave 'unpredictably' in certain situations.
"Under complex micro-architectural conditions, short loops of less than 64 instructions that use AH, BH, CH or DH registers as well as their corresponding wider register (eg RAX, EAX or AX for AH) may cause unpredictable system behaviour. This can only happen when both logical processors on the same physical processor are active."
Until motherboard vendors begin to address the bug with BIOS updates, the only way to prevent the possibility of this microcode error is to disable Hyper-Threading. From the report at The Register (source):
"The Debian advisory says affected users need to disable hyper-threading 'immediately' in their BIOS or UEFI settings, because the processors can 'dangerously misbehave when hyper-threading is enabled.' Symptoms can include 'application and system misbehaviour, data corruption, and data loss'."
The affected models are 6th- and 7th-gen Intel processors with Hyper-Threading, which include Core CPUs as well as some Pentiums, and Xeon v5 and v6 processors.
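If you want to confirm whether SMT/Hyper-Threading is currently active on a Linux box before digging into BIOS settings, comparing the logical processor count against the number of unique physical cores in /proc/cpuinfo works. A minimal, Linux-only sketch (the helper name is mine, not part of any official tool):

```python
# Decide whether Hyper-Threading/SMT is active by comparing logical
# processors against unique (physical id, core id) pairs parsed from
# /proc/cpuinfo text. Illustrative helper, not an official utility.
def hyperthreading_active(cpuinfo_text: str) -> bool:
    logical = 0
    phys = "0"  # default when "physical id" lines are absent
    cores = set()
    for line in cpuinfo_text.splitlines():
        key = line.split(":")[0].strip()
        if key == "processor":
            logical += 1
        elif key == "physical id":
            phys = line.split(":")[1].strip()
        elif key == "core id":
            cores.add((phys, line.split(":")[1].strip()))
    # More logical CPUs than physical cores implies SMT is enabled.
    return bool(cores) and logical > len(cores)
```

On a live system you would feed it the file contents, e.g. `hyperthreading_active(open("/proc/cpuinfo").read())`.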