Manufacturer: AMD

Is it time to buy that new GPU?

Testing commissioned by AMD. This means that AMD paid us for our time, but had no say in the results or presentation of them.

Earlier this week Bethesda and Arkane Studios released Prey, a first-person shooter that is a re-imagining of the 2006 game of the same name. Fans of System Shock will find a lot to love about this new title and I have found myself enamored with the game…in the name of science, of course.

Prey-2017-05-06-15-52-04-16.jpg

While doing my due diligence and performing some preliminary testing to see if we would utilize Prey for graphics testing going forward, AMD approached me to discuss this exact title. With the release of the Radeon RX 580 in April, one of the key storylines is that the card offers a reasonably priced upgrade path for users of 2+ year old hardware. With that upgrade you should see some substantial performance improvements and as I will show you here, the new Prey is a perfect example of that.

Targeting the Radeon R9 380, a graphics card that was originally released back in May of 2015, the RX 580 offers substantially better performance at a very similar launch price. The same is true for the GeForce GTX 960: launched in January of 2015, it is slightly longer in the tooth. AMD’s data shows that 80% of the users on Steam are running on R9 380X or slower graphics cards and that only 10% of them upgraded in 2016. Considering the great GPUs that were available then (including the RX 480 and the GTX 10-series), it seems more and more likely that we are going to hit an upgrade inflection point in the market.

slides-5.jpg

A simple experiment was set up: does the new Radeon RX 580 offer a worthwhile upgrade path for the many users of R9 380 or GTX 960 class graphics cards (or older)?

                   Radeon RX 580             Radeon R9 380    GeForce GTX 960
GPU                Polaris 20                Tonga Pro        GM206
GPU Cores          2304                      1792             1024
Rated Clock        1340 MHz                  918 MHz          1127 MHz
Memory             4GB / 8GB                 4GB              2GB / 4GB
Memory Interface   256-bit                   256-bit          128-bit
TDP                185 watts                 190 watts        120 watts
MSRP (at launch)   $199 (4GB) / $239 (8GB)   $219             $199
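
To put the table in rough perspective, you can work out theoretical peak FP32 throughput from the core counts and rated clocks above (cores × 2 FLOPS per clock for a fused multiply-add). A minimal sketch of that arithmetic; keep in mind that theoretical FLOPS only loosely track real game performance across different architectures:

    #include <stdio.h>

    /* Theoretical FP32 throughput = shader cores x 2 ops/clock (FMA) x clock.
       Uses the rated clocks from the table above; real boost clocks vary. */
    static double tflops(int cores, double clock_mhz) {
        return cores * 2.0 * clock_mhz / 1e6;
    }

    int main(void) {
        printf("RX 580:  %.2f TFLOPS\n", tflops(2304, 1340));  /* ~6.17 */
        printf("R9 380:  %.2f TFLOPS\n", tflops(1792,  918));  /* ~3.29 */
        printf("GTX 960: %.2f TFLOPS\n", tflops(1024, 1127));  /* ~2.31 */
        return 0;
    }

By this crude measure the RX 580 offers roughly 1.9x the shader throughput of the R9 380 and about 2.7x that of the GTX 960, which is why the upgrade case is so compelling.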

Continue reading our look at the Radeon RX 580 in Prey!

Meet the GT 1030

Subject: Graphics Cards | May 17, 2017 - 01:55 PM |
Tagged: nvidia, msi, gt 1030, gigabyte, evga, zotac

The GT 1030 quietly launched from a variety of vendors late yesterday amidst the tsunami of AMD announcements.  The low profile card is advertised as offering twice the performance of the iGPU found on Intel Core i5 processors and in many cases is passively cooled.  From the pricing of the cards available now, expect to pay around $75 to $85 for this new card.

evga.jpg

EVGA announced a giveaway of several GT 1030s at the same time as they released the model names.  The card, which is currently available, retails for $75 and is clocked at 1290 MHz base and 1544 MHz boost, with 384 CUDA cores.  The 2GB of GDDR5 is clocked a hair over 6GHz and runs on a 64-bit bus, providing a memory bandwidth of 48.06 GB/s.  Two of their three models offer HDMI + DVI-D out; the third has a pair of DVI-D connectors.
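
That 48.06 GB/s figure follows directly from the bus width and data rate: a 64-bit bus moves 8 bytes per transfer, so at roughly 6008 MT/s (our inference from EVGA's bandwidth number, not a published spec) you land on 48.06 GB/s. A quick sketch of the arithmetic:

    #include <stdio.h>

    int main(void) {
        /* GDDR5 effective data rate "a hair over 6 GHz"; 6008 MT/s is the
           value implied by EVGA's 48.06 GB/s figure (an inference). */
        double bus_bits = 64.0;
        double data_rate_mts = 6008.0;                 /* mega-transfers/sec */
        double gbps = bus_bits / 8.0 * data_rate_mts / 1000.0;
        printf("Memory bandwidth: %.2f GB/s\n", gbps); /* -> 48.06 GB/s */
        return 0;
    }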

zt-p10300a-10l_image5_0.jpg

Zotac's offering provides slightly lower clocks, a base of 1227MHz and boost of 1468MHz; the VRAM remains unchanged at 6GHz.  It pairs HDMI 2.0b with a DVI port and comes with a low-profile bracket if needed for an SFF build.

msi.png

MSI went all out and released a half dozen models, two of which you can see above.  The GT 1030 AERO ITX 2G OC is actively cooled, which allows it to reach a 1265MHz base and 1518MHz boost clock.  The passively cooled GT 1030 2GH LP OCV1 runs at the same frequencies and fits in a single slot externally; however, you will need to leave space inside the system, as the heatsink takes up an additional slot internally.  Both are fully compatible with the Afterburner overclocking utility and its features, such as the Predator gameplay recording tool.

gb pair.png

Last but not least are a pair from Gigabyte, the GT 1030 Low Profile 2G and Silent Low Profile 2G cards.  The cards both offer two modes: in OC Mode the base clock is 1252MHz and the boost clock 1506MHz, while in Gaming Mode you run at 1227MHz base and 1468MHz boost.

Source: EVGA
Subject: Processors
Manufacturer: Various

Application Profiling Tells the Story

It should come as no surprise to anyone that has been paying attention over the last two months that the latest AMD Ryzen processors and architecture are getting a lot of attention. Ryzen 7 launched with a $499 part that bested Intel's $1000 CPU in heavily threaded applications, and Ryzen 5 launched with great value as well, positioning a 6-core/12-thread CPU against quad-core parts from the competition. But part of the story that permeated both the Ryzen 7 and Ryzen 5 launches was the situation surrounding gaming performance, in particular 1080p gaming, and the surprising delta that we see in some games.

Our team has done quite a bit of research and testing on this topic. This included a detailed look at the first asserted reason for the performance gap: the Windows 10 scheduler. Our summary there was that the scheduler was working as expected and that minimal difference was seen when moving between different power modes. We also talked directly with AMD to find out its then-current stance on the results, backing up our claims on the scheduler, and presented a better outlook for gaming going forward. When AMD wanted to test a new custom Windows 10 power profile to help improve performance in some cases, we took part in that too. In late March we saw the first gaming performance update occur, courtesy of Ashes of the Singularity: Escalation, where an engine update to utilize more threads resulted in as much as a 31% increase in average frame rate.

ping-amd.png

As a part of that dissection of the Windows 10 scheduler story, we also discovered interesting data about the CCX construction and how the two modules on the 1800X communicate. The result was significantly longer thread-to-thread latencies than we had seen on any platform before, caused by the fabric implementation that AMD integrated with the Zen architecture.
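
For the curious, the basic idea behind such a measurement (a minimal sketch of the technique, not our actual test harness) is to pin two threads to specific cores and bounce an atomic flag between them; the average round-trip time approximates the thread-to-thread latency. On an 1800X, picking two cores on the same CCX versus cores on opposite CCXs exposes the fabric penalty. The core IDs below are illustrative, since logical core numbering varies by OS and SMT layout:

    /* build: gcc -O2 -pthread ping.c  (Linux) */
    #define _GNU_SOURCE
    #include <pthread.h>
    #include <sched.h>
    #include <stdatomic.h>
    #include <stdio.h>
    #include <time.h>

    #define ITERS 1000000
    static atomic_int flag = 0;

    static void pin(int core) {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(core, &set);
        pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    }

    static void *ponger(void *arg) {
        pin(*(int *)arg);
        for (int i = 0; i < ITERS; i++) {
            /* wait for the ping, then answer */
            while (atomic_load_explicit(&flag, memory_order_acquire) != 1) ;
            atomic_store_explicit(&flag, 0, memory_order_release);
        }
        return NULL;
    }

    int main(void) {
        int core_a = 0, core_b = 4;  /* illustrative: same CCX vs. across CCXs */
        pthread_t t;
        pthread_create(&t, NULL, ponger, &core_b);
        pin(core_a);
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (int i = 0; i < ITERS; i++) {
            atomic_store_explicit(&flag, 1, memory_order_release);
            while (atomic_load_explicit(&flag, memory_order_acquire) != 0) ;
        }
        clock_gettime(CLOCK_MONOTONIC, &t1);
        pthread_join(t, NULL);
        double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
        /* round trip = two one-way hops; halve for one-way latency */
        printf("avg round trip: %.0f ns\n", ns / ITERS);
        return 0;
    }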

This has led me down another rabbit hole recently: wondering if we could further compartmentalize the gaming performance of the Ryzen processors using memory latency. As I showed in my Ryzen 5 review, memory frequency and throughput directly correlate with gaming performance improvements, on the order of 14% in some cases. But what about looking at memory latency alone?

Continue reading our analysis of memory latency, 1080p gaming, and how it impacts Ryzen!!

AMD Announces Radeon Vega Frontier Edition Graphics Cards

Subject: Graphics Cards | May 16, 2017 - 07:39 PM |
Tagged: Vega, reference, radeon, graphics card, gpu, Frontier Edition, amd

AMD has revealed their concept of a premium reference GPU for the upcoming Radeon Vega launch, with the "Frontier Edition" of the new graphics cards.

Vega FE Slide.png

"Today, AMD announced its brand-new Radeon Vega Frontier Edition, the world’s most powerful solution for machine learning and advanced visualization aimed to empower the next generation of data scientists and visualization professionals -- the digital pioneers forging new paths in their fields. Designed to handle the most demanding design, rendering, and machine intelligence workloads, this powerful new graphics card excels in:

  • Machine learning. Together with AMD’s ROCm open software platform, Radeon Vega Frontier Edition enables developers to tap into the power of Vega for machine learning algorithm development. Frontier Edition delivers more than 50 percent more performance than today’s most powerful machine learning GPUs.
  • Advanced visualization. Radeon Vega Frontier Edition provides the performance required to drive increasingly large and complex models for real-time visualization, physically-based rendering and virtual reality through the design phase as well as rendering phase of product development.
  • VR workloads. Radeon Vega Frontier Edition is ideal for VR content creation supporting AMD’s LiquidVR technology to deliver the gripping content, advanced visual comfort and compatibility needed for next-generation VR experiences.
  • Revolutionized game design workflows. Radeon Vega Frontier Edition simplifies and accelerates game creation by providing a single GPU optimized for every stage of a game developer’s workflow, from asset production to playtesting and performance optimization."

Vega FE.jpg

From the image provided on the official product page it appears that there will be both liquid-cooled (the gold card in the background) and air-cooled variants of these "Frontier Edition" cards, which AMD states will arrive with 16GB of HBM2 and offer 1.5x the FP32 performance and 3x the FP16 performance of the Fury X.

From AMD:

Radeon Vega Frontier Edition

  • Compute units: 64
  • Single precision compute performance (FP32): ~13 TFLOPS
  • Half precision compute performance (FP16): ~25 TFLOPS
  • Pixel Fillrate: ~90 Gpixels/sec
  • Memory capacity: 16 GB of High Bandwidth Cache
  • Memory bandwidth: ~480 GB/sec
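
Those numbers are internally consistent with GCN conventions. Assuming 64 stream processors per CU (a long-standing GCN convention that AMD did not restate here), ~13 TFLOPS implies an engine clock of roughly 1.59 GHz, and a 2:1 packed-math FP16 rate yields the ~25 TFLOPS figure. A quick sanity check against the Fury X claims (~8.6 TFLOPS FP32):

    #include <stdio.h>

    int main(void) {
        /* Assumption: 64 stream processors per CU, the long-standing GCN
           convention (not restated in AMD's announcement). */
        double shaders = 64 * 64.0;                   /* 4096 shaders     */
        double clock_ghz = 13000.0 / (shaders * 2.0); /* from ~13 TFLOPS  */
        printf("implied engine clock: ~%.2f GHz\n", clock_ghz);  /* ~1.59 */

        /* Vega's 2:1 packed-math FP16 rate doubles FP32 throughput; AMD
           quotes ~25 TFLOPS, consistent given rounding of the 13.        */
        printf("FP16: ~%.0f TFLOPS\n", 2.0 * 13.0);

        /* Versus Fury X (~8.6 TFLOPS FP32; its FP16 ran at the same rate) */
        printf("vs Fury X: FP32 %.1fx, FP16 %.1fx\n",
               13.0 / 8.6, 25.0 / 8.6);               /* ~1.5x and ~2.9x   */
        return 0;
    }

That lines up with AMD's stated 1.5x FP32 and ~3x FP16 advantage over the Fury X.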

The availability of the Radeon Vega Frontier Edition was announced as "late June", so we should not have too long to wait for further details, including pricing.

Source: AMD

Ryzen and the art of benchmark maintenance

Subject: Processors | May 19, 2017 - 04:15 PM |
Tagged: amd, ryzen, ryzen 5 1400, ryzen 5 1600

Neoseeker tested out the 4-core Ryzen 5 1400 and 6-core 1600 models to see how they stack up against other lower-cost processors.  They ran the tests at the highest stable overclock they could reach; interestingly, both were able to hit 3.8 GHz, paired with DDR4-2400.  The processors were cooled with AMD's Wraith Max cooler, so it may be possible to push these CPUs further if you are willing to overvolt.  Drop by to see how these two processors match up to the competition.

04.jpg

"The two AMD processors for review today are the newest budget offerings of the Ryzen 5 series with the Ryzen 1400 and 1600 models. The Ryzen 1400 is a four core/eight thread and the Ryzen 1600 is a six core/twelve thread processor, with both having a base operating speed of 3.2 GHz. The boost clock for the Ryzen 1400 is 3.4 GHz while the Ryzen 1600 is able to boost to 3.6 GHz."


Source: Neoseeker

AMD Teases Ryzen Mobile APUs with Zen CPU Cores and On-Die Vega Graphics

Subject: Processors | May 18, 2017 - 01:01 AM |
Tagged: Zen, Vega, ryzen mobile, ryzen, raven ridge, APU, amd

AMD teased its upcoming Zen-based APUs aimed at mobile devices during its Financial Analyst Day, where the company revealed the "Raven Ridge" parts will be aptly known as Ryzen Mobile. The Tech Report managed to acquire a couple of slides which confirm some of the broader specifications and reveal how they stack up to AMD's latest Bristol Ridge A-Series APUs – at least as far as AMD's internal testing is concerned (not yet independently verified, so take the numbers with a grain of salt).

AMD Ryzen Mobile APUs.jpg

Ryzen Mobile appears to be the new consumer-facing brand name for what has so far been code-named "Raven Ridge". These parts will use a Zen-based CPU, a Vega GPU, and an integrated chipset. Thanks to the slides, it is now confirmed that the Vega-based graphics processor will be on-die. What has not been confirmed is whether the chipset will be on-die or on-package, along with exact specifications for CPU core counts, GPU Compute Units, cache, memory support, and I/O such as PCI-E lanes (you know, all the good stuff! heh). Note that rumors so far point towards Raven Ridge / Ryzen Mobile utilizing a single 4-core (8-thread) CCX, per-core L2, 8MB of shared L3 cache, and a Vega-based GPU with 1024 cores. HBM2 has also been rumored for a while, but we will have to wait for more leaks and/or an official announcement to know for sure whether these Ryzen Mobile parts, aimed at the second half of 2017, will include it (hopefully!).

With that said, according to AMD, Ryzen Mobile will offer up to 50% better CPU performance, 40% better GPU performance, and up to 50% lower power than the previous 7th generation (Excavator-based) A-Series APUs (e.g. the FX 9830P and A12-9730P). Those are some pretty bold claims, but still within the realm of possibility. Zen and Vega are both much more efficient architectures, and AMD also benefits from a smaller process node (moving from 28nm to Samsung / GlobalFoundries 14nm FinFET). I do wonder how high the APUs will be able to clock on the CPU side of things, with 4 GHz seeming to be the wall for most Zen-based Summit Ridge chips; most of the CPU performance improvement will have to come from architectural changes rather than increases in clock speed (the highest clocked A-Series Bristol Ridge ran at up to 3.7 GHz, and I would expect Raven Ridge to be around that, with maybe the flagship part turbo-ing a bit higher). Raven Ridge will benefit from the shared L3 cache and, more importantly, twice as many threads (8 vs 4), and this may be where AMD is primarily getting that 50% more CPU performance number.

On the graphics side of things, it looks like Bristol Ridge with its R7 graphics (GCN 3, the generation of Tonga/Fiji on the desktop) topped out at 512 cores. Taking into account the rumors that Raven Ridge will have a 1024-core Vega GPU, this may be where the large performance increase comes from (the core increase as well as the newer architecture). On the other hand, the 40% number could suggest Ryzen Mobile will not have twice the GPU cores. My guess is that 1024 cores might be possible, but running at lower clocks, and that is where the discrepancy lies. I will admit I am a bit skeptical about the 1024 (16 CU) number though, because that is a huge jump... I guess we will see!

Further, I am curious whether Ryzen Mobile will use HBC (high bandwidth cache) and, if HBM2 does turn out to be utilized, how that will play into the HBC and whether or not we will finally see the fruits of AMD's HSA labors! I think we will see most systems use DDR4, but certainly some SKUs could use HBM2, and that would definitely open up a lot of performance possibilities on mobile!

There is still a lot that we do not know, but Ryzen Mobile is coming and AMD is making big promises that I hope it delivers on. The company is aiming the new chips at a wide swath of the mobile market, from budget laptops and tablets to convertibles, and even has its sights set on premium thin-and-lights. The mobile space is one where AMD has struggled to get design wins even when it had good parts for that type of system. They will really need to hit Ryzen Mobile out of the park to make inroads into the laptop, tablet, and ultrabook markets!

AMD plans to launch the consumer version of Ryzen Mobile in the second half of this year (presumably with systems featuring the new APUs out in time for the holidays, if not the back-to-school end-of-summer rush). The commercial SKUs (which I think refers to the Ryzen equivalent of AMD's Pro series APUs. Update: Mobile Ryzen Pro) will follow in the first half of 2018.

What are your thoughts on Ryzen Mobile and the alleged performance and power characteristics? Do you think the rumors are looking more or less correct?


Source: Tech Report

AMD Compares 1x 32-Core EPYC to 2x 12-Core Xeon E5s

Subject: Processors | May 17, 2017 - 04:05 AM |
Tagged: amd, EPYC, 32 core, 64 thread, Intel, Broadwell-E, xeon

AMD has formally announced their EPYC CPUs. While Sebastian covered the product specifications, AMD has also released performance claims against a pair of Intel’s Broadwell-E Xeons. While Intel’s E5-2650 v4 processors have an MSRP of around $1170 USD each, we don’t know how that price will compare to AMD’s offering. At first glance, pitting thirty-two cores against two twelve-core chips seems a bit unfair, although it could end up being a very fair comparison if the prices align.

amd-2017-epyc-ubuntucompile.jpg

Image Credit: Patrick Moorhead

Patrick Moorhead, who was at the event, tweeted out photos of a benchmark in which Ubuntu was compiled with GCC. It looks like EPYC completed the job in just 33.7s while the dual Broadwell-E system took 37.2s (AMD’s part finishing in roughly 9.5% less time). While this advantage, again, stems from having a third more cores, its value depends on how much AMD is going to charge for them versus Intel’s current pricing structure.
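
A note on that percentage, since it can be read two ways: EPYC's 33.7s is about 9.4% less wall-clock time, which is equivalent to about 10.4% higher throughput. The quick arithmetic:

    #include <stdio.h>

    int main(void) {
        double epyc = 33.7, xeon = 37.2;  /* seconds, from the tweeted photo */
        printf("time saved: %.1f%%\n", (xeon - epyc) / xeon * 100);  /* ~9.4%  */
        printf("speedup:    %.1f%%\n", (xeon / epyc - 1) * 100);     /* ~10.4% */
        return 0;
    }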

amd-2017-epyc-threads.jpg

Image Credit: Patrick Moorhead

This one chip also has 128 PCIe lanes, rather than Intel’s 80 total lanes spread across two chips.

Western Digital Launches 10TB Red and Red Pro

Subject: Storage | May 17, 2017 - 09:57 PM |
Tagged: western digital, wdc, WD, Red Pro, red, NAS, helium, HelioSeal, hdd, Hard Drive, 10TB

Western Digital has increased the capacity of their Red and Red Pro NAS hard disk lines to 10TB. WD acquired HelioSeal technology, which enables helium-filled, hermetically sealed drives of ever higher capacities, through their purchase of HGST, and used it to expand the Red lines to 8TB (our review of those here). HelioSeal has certainly proven itself, as over 15 million such units have shipped so far.

sshot-54.png

We knew it was just a matter of time before we saw a 10TB Red and Red Pro, as it has been some time since the HGST He10 launched, and Western Digital's own 10TB Gold (datacenter) drive has been shipping for a while now.

  • Red 10TB:        $494
  • Red Pro 10TB: $533

MSRP pricing looks a bit high given the lower cost per TB of the 8TB model, but with some time on the market and volume shipping, these should come down to parity with the lesser capacities.

Press blast appears after the break.

Subject: General Tech
Manufacturer: YouTube TV

YouTube Tries Everything

Back in March, Google-owned YouTube announced a new live TV streaming service called YouTube TV to compete with the likes of Sling, DirecTV Now, PlayStation Vue, and upcoming offerings from Hulu, Amazon, and others. All of these services deliver curated bundles of channels aimed at cord cutters, running over the top of customers’ internet-only connections as replacements for, or additions to, cable television subscriptions.  YouTube TV is the latest entrant to this market, currently available in only seven test markets, but it is off to a good start with a decent selection of content and features, including both broadcast and cable channels, on-demand media, and live and DVR viewing options. A responsive user interface and a generous number of family sharing options (six account logins and three simultaneous streams) are balanced against the requirement to watch ads (even on some DVR’ed shows) and the $35 per month cost.

Get YouTube TV 1.jpg

YouTube TV launched in 5 cities with more on the way. Fortunately, I live close enough to Chicago to be in-market and could test out Google’s streaming TV service. While not a full review, the following are my first impressions of YouTube TV.

Setup / Sign Up

YouTube TV is available with a one-month free trial, after which you will be charged $35 a month. Sign-up is a simple affair and can be started by going to tv.youtube.com or clicking the YouTube TV link from the “hamburger” menu on YouTube. If you are on a mobile device, YouTube TV uses a separate app from the default YouTube app, weighing in at 9.11 MB for the Android version. After verifying your location, the following screens show you the channels available in your market and give you the option of adding Showtime ($11) and/or Fox Soccer ($15) for additional monthly fees. After that, you are prompted for a payment method, which can be the one already linked to your Google account and used for app purchases and other subscriptions. As for the free trial, I was not charged anything and there was no hold on my account for the $35. I like that Google makes it easy to see exactly how many days you have left on your trial and when you will be charged if you do not cancel. Further, the cancel link is not buried away and is intuitively found by clicking your account photo in the upper right > Personal > Membership. Google is doing things right here. After signup, a tour is offered to show you the various features, but you can skip this if you want to get right to it.

In my specific market, I have the following channels. When I first started testing, some of the channels were not available and were just added today. I hope to see more networks added, and if Google can manage that, YouTube TV and its $35/month price are going to shape up to be a great deal.

  • ABC 7, CBS 2, Fox 32, NBC 5, ESPN, CSN, CSN Plus, FS1, CW, USA, FX, Free Form, NBC SN, ESPN 2, FS2, Disney, E!, Bravo, Oxygen, BTN, SEC ESPN Network, ESPN News, CBS Sports, FXX, Syfy, Disney Junior, Disney XD, MSNBC, Fox News, CNBC, Fox Business, National Geographic, FXM, Sprout, Universal, Nat Geo Wild, Chiller, NBC Golf, YouTube Red Originals
  • Plus: AMC, BBC America, IFC, Sundance TV, We TV, Telemundo, and NBC Universal (just added).
  • Optional Add-Ons: Showtime and Fox Soccer.

I tested YouTube TV out on my Windows PCs and an Android phone. You can also watch YouTube TV on iOS devices, and on your TV using Android TV and Chromecast devices (at the time of writing, Google will send you a free Chromecast after your first month; see here for a full list of supported devices). There are currently no Roku or Apple TV apps.

Get YouTube TV_full list.jpg

Each YouTube TV account can share the subscription with 6 total logins, where each household member gets their own login and DVR library. Up to three people can be streaming TV at the same time. While out and about, I noticed that YouTube TV required me to turn on location services in order to use the app. Looking further into it, the YouTube TV FAQ states that you need to verify your location to stream live TV, and can only do so while physically in a market where YouTube TV has launched. You can watch your DVR shows anywhere in the US. However, if you are traveling internationally you will not be able to use YouTube TV at all (I’m not sure if VPNs will get around this or if YouTube TV blocks them like Netflix does). Users will need to log in from their home market at least once every 3 months to keep their account active and able to stream content (every month for MLB content).

YouTube TV verifying location in Chrome (left) and on the Android app (right).

On one hand, I can understand this was probably necessary for YouTube TV to negotiate its licensing deals, and the terms do seem pretty fair. I will have to do more testing on this, as I wasn’t able to stream from the DVR without turning on location services on my Android device – I can chalk this up to growing pains, though, and it may already be fixed.

Features & First Impressions

YouTube TV has an interface that is perhaps best described as a slimmed-down YouTube that takes cues from Netflix (things like the horizontal scrolling of shows in categories). The main interface is broken into three sections: Library, Home, and Live, with Home being the first screen you see when logging in. You navigate by scrolling and clicking, and by pulling the menus up from the bottom while streaming TV, as on YouTube.

YouTube TV Home.jpg

Continue reading for my first impressions of YouTube TV!

AMD Announces EPYC: A Massive 32-Core Datacenter SoC

Subject: Processors | May 16, 2017 - 06:49 PM |
Tagged: Zen, server, ryzen, processor, EPYC, datacenter, cpu, amd, 64 thread, 32 core

AMD has announced their new datacenter CPU built on the Zen architecture, which the company is calling EPYC. And epic they are, as these server processors will be offered with up to 32 cores and 64 threads, 8 memory channels, and 128 PCI Express lanes per CPU.

Epyc_1.jpg

Some of the details about the "Naples" server processors (now EPYC) were revealed by AMD back in March, when the upcoming server chips were previewed:

"Naples" features:

  • A highly scalable, 32-core System on Chip (SoC) design, with support for two high-performance threads per core
  • Industry-leading memory bandwidth, with 8-channels of memory per "Naples" device. In a 2-socket server, support for up to 32 DIMMS of DDR4 on 16 memory channels, delivering up to 4 terabytes of total memory capacity.
  • The processor is a complete SoC with fully integrated, high-speed I/O supporting 128 lanes of PCIe, negating the need for a separate chip-set
  • A highly-optimized cache structure for high-performance, energy efficient compute
  • AMD Infinity Fabric coherent interconnect for two "Naples" CPUs in a 2-socket system
  • Dedicated security hardware 

EPYC Screen.png

Compared to Ryzen (or should it be RYZEN?), EPYC offers a huge jump in core count and available performance - though AMD's other CPU announcement (Threadripper) bridges the gap between the desktop and datacenter offerings with an HEDT product. This also serves to bring AMD's CPU offerings to parity with the Intel product stack with desktop/high performance desktop/server CPUs.

epycpackage.jpg

EPYC is a large processor. (Image credit: The Tech Report)

While specifications were not offered, there have been leaks (of course) to help fill in the blanks. Wccftech offers these specs for EPYC (on the left):

Wccftech Chart.png

(Image credit: Wccftech)

We await further information from AMD about the EPYC launch.

Source: AMD

Sit back and read about the Vertagear Triigger 350 Special Edition Gaming Chair

Subject: General Tech | May 19, 2017 - 02:02 PM |
Tagged: vertagear, Triigger 350 Special Edition, gaming chair

We now know the real reason Kyle agreed to getting a red stripe in his hair: so he can match the chair he is sitting in.  He has been resting his laurels on the Vertagear Triigger 350 Special Edition for the past few months and has published a review of his experiences.  This 55lb beast is constructed of aluminium, mesh, and calf leather, with hubless caster-style wheels which turned out to both work and look good.  If you are in the market for a high-end gaming chair you should check out the full review, especially the last page, where he answers numerous questions asked by his forum members.

149513174587gi9q3da6_1_2.jpg

"What happens when you rope yourself in to doing a gaming chair review? You take your time, do it right, and make sure your butt spends at least a few months in the chair before you write your review. My butt has been in the VertaGear Triigger 350 Gaming Chair for over 3 months, and here are my thoughts."


Source: [H]ard|OCP

AMD's 16-Core Ryzen Threadripper CPUs Coming This Summer

Subject: Processors | May 16, 2017 - 07:22 PM |
Tagged: Zen, Threadripper, ryzen, processor, HEDT, cpu, amd

AMD revealed their entry into high-end desktop (HEDT) with the upcoming Ryzen "Threadripper" CPUs, which will feature up to 16 cores and 32 threads.

Threadripper 2.png

Little information was revealed along with the announcement, other than availability in "summer 2017", though rumors and leaks surrounding Threadripper have (naturally) circulated on the internet leading up to today's announcement, including this one from Wccftech. Not only will Threadripper (allegedly) offer quad-channel memory support and 44 PCI Express lanes, but it is also rumored to arrive in a massive 4094-pin package (the same as "Naples", aka EPYC) that most assuredly will not fit into the AM4 socket.

WCCFTECH Chart 2.png

Image credit: Wccftech

These Threadripper CPUs follow the lead of Intel's HEDT parts on X99, which are essentially re-appropriated Xeons with higher clock speeds and some feature differences such as a lack of ECC memory support. It remains to be seen what exactly will separate the enthusiast AMD platform from the EPYC datacenter platform, though the rumored base clock speeds are much higher with Threadripper.

Source: AMD

Lian Li’s PC-O12 Mid-tower Case is Now Available

Subject: Cases and Cooling | May 18, 2017 - 03:11 PM |
Tagged: Lian Li, PC-O12, e-atx

Lian Li's new PC-O12 is an interesting case.  It bears many similarities to cases currently on the market, with tempered glass windows, cable management features and a separated chamber at the bottom which holds the PSU and drive cages. 

o12-00.jpg

The way it differs is obvious when you look at the back panel and the alignment of the four expansion slots for graphics cards.  They are designed to let you mount your GPUs vertically with the use of a 440mm PCI-E 16X riser cable.  This will let you show off the artwork and LEDs on your card and is touted as increasing cooling efficiency.  While this will give you a unique-looking system, it also carries an impressive price tag of $399.99.

o12-05.jpg

You can read the full PR below the specifications.

Capture.PNG

May 18, 2017, Keelung, Taiwan - Lian-Li Industrial Co. Ltd launches the PC-O12, a compact mid-tower chassis that combines sleek tempered glass panels with strong but lightweight steel and aluminum. This new addition to Lian Li’s latest generation O-series chassis range offers unsurpassed style, plus a slim design with ample space for a powerful but compact PC build. Thanks to its unique design, it offers space for two vertically placed graphics cards in a separate compartment for gorgeous PC builds.

Tempered glass adds a touch of class
The PC-O12’s flawless tempered glass front and side panels make it a sleek and sophisticated showcase for the latest cutting edge computing technologies. Tempered glass is tough, safe and very durable, providing a ‘fresh from the showroom’ appearance indefinitely. The PC-O12’s alluring black aluminum outer body and panels complete the picture. Internally, a rigid steel frame provides a firm foundation for state of the art features.

Ideal balance of chassis size and features
Despite its space-saving format, this mid-tower enclosure offers plenty of room for the most powerful hardware. The 440mm full bandwidth PCI Express 16x riser cable allows flexible vertical graphics card mounting to enhance cooling and to show off the latest graphics technology through the tempered glass side panel. The roomy case interior fits graphics cards up to 340mm long and CPU coolers up to 75mm high.

There’s internal space for up to eight hard disk and SSD drives for terabytes of fast storage capacity. In addition, the newest ultra speedy, powerful external USB 3.1 type C devices are supported, and there are a total of four external USB connectors as standard.

A case with great low-noise cooling performance
With up to five large-format fans, this chassis ensures valuable PC components keep running cool, prolonging life, enhancing performance and reducing noise. There’s space for three 120mm fans at the top of the case, plus two 140mm or 120mm fans at the front. With so many airflow options, users are able to reduce fan speed and reduce noise. In addition, removable mesh dust filters cover the primary fan mounts. The drive cages and PSU mount include rubber vibration dampeners to minimize noise.

Price and Availability
The PC-O12 is now available at Newegg for $399.99. Find detailed specifications for the PC-O12 here.

Additional PCI Express riser cables are available at Performance PC starting in June 2017

Source: Lian Li
Manufacturer: Seasonic

Introduction and Features

Introduction

2-Prime-logo.jpg

Sea Sonic Electronics Co., Ltd has been designing and building PC power supplies since 1981, and they are one of the most highly respected manufacturers on the planet. Not only do they market power supplies under their own Seasonic name, but they are also the OEM for many other big-name brands.

Seasonic began introducing their new PRIME Series power supplies last year and we reviewed several of the flagship Titanium units and found them to be among the best power supplies we have tested to date. But the PRIME Series now includes many more units with both Platinum and Gold level efficiency certification.

3-Prime-compare.jpg

The power supply we have in for review is the Seasonic PRIME 1200W Gold. This unit comes with all modular cables and is certified to comply with the 80 Plus Gold criteria for high efficiency. The power supply is designed to deliver very tight voltage regulation on the three primary rails (+3.3V, +5V and +12V) and provides superior AC ripple and noise suppression. Add in a super-quiet 135mm cooling fan with a Fluid Dynamic Bearing, top-quality components and a 12-year warranty, and you have the makings for another outstanding power supply.

4a-Side.jpg

Seasonic PRIME Gold Series PSU Key Features:

•    650W, 750W, 850W, 1000W or 1200W continuous DC output
•    High efficiency, 80 PLUS Gold certified
•    Micro-Tolerance Load Regulation (MTLR)
•    Top-quality 135mm Fluid Dynamic Bearing fan
•    Premium Hybrid Fan Control (allows fanless operation at low power)
•    Superior AC ripple and noise suppression
•    Fully modular cabling design
•    Multi-GPU technologies supported
•    Gold-plated high-current terminals
•    Protections: OPP, OVP, UVP, SCP, OCP and OTP
•    12-Year Manufacturer’s warranty
•    MSRP for the PRIME 1200W Gold is $199.90 USD

4b-Front-cables.jpg

Here is what Seasonic has to say about the new PRIME power supply line: “The creation of the PRIME Series is a renewed testimony of Sea Sonic’s determination to push the limits of power supply design in every aspect. This elegant-looking, exclusive lineup of new products will include 80 Plus Titanium – in the range of 650W to 1000W, and Platinum-Gold-rated units in the range of 650W to 1200W, with excellent electrical characteristics, top-level components and fully modular cabling.

Seasonic employs the most efficient manufacturing methods, uses the best materials and works with the most reliable suppliers to produce reliable products. The PRIME Series layout, revolutionary manufacturing solutions and solid design attest to the highest level of ingenuity of Seasonic’s engineers and product developers. Demonstrating confidence in its power supplies, Seasonic stands out in the industry by offering the PRIME Series a generous 12-year manufacturer’s warranty period.”

Please continue reading our review of the Seasonic PRIME 1200W Gold PSU!

Good news Battletech fans, Paradox will publish Harebrained Schemes' new game

Subject: General Tech | May 17, 2017 - 02:56 PM |
Tagged: battletech, paradox, gaming, Kickstarter

The Kickstarter for the new turn-based Battletech game was wildly successful, with 41,733 backers pledging $2,785,537, and now we have even more good news.  Paradox Interactive, they of the continual updates and add-ons to published games, have agreed to publish the new Battletech game.  Not only does this ensure solid support for players after launch, it could also mean a long lineup of expansions; Paradox just added another major expansion to EU4 four years after its release.  For backers there is even more news: the closed beta will kick off in June, and there is a new video of multiplayer gameplay you can watch.

"The long life of these internally developed games is a core part of Paradox’s business model, but the company is also expanding as a publisher. That includes not only third-party originals like Battletech, but ports of existing titles such as Prison Architect on tablet."


AMD Releases Radeon Software Crimson ReLive 17.5.2

Subject: Graphics Cards | May 20, 2017 - 07:01 AM |
Tagged: graphics drivers, amd

The second graphics driver of the month from AMD, Radeon Software Crimson ReLive 17.5.2, adds optimizations for Bethesda’s new shooter, Prey. AMD claims that it will yield up to a 4.5% performance improvement, as measured on an RX 580 (versus the same card with 17.5.1). This is over and above the up to 4.7% increase that 17.5.1 had over 17.4.4.
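
As a back-of-the-envelope composition: if both "up to" figures happened to apply to the same scenario, the stacked gain over 17.4.4 would be roughly 9.4%. That is a best case, not a measured result:

    #include <stdio.h>

    int main(void) {
        double gain_1751 = 0.047;  /* 17.4.4 -> 17.5.1 ("up to") */
        double gain_1752 = 0.045;  /* 17.5.1 -> 17.5.2 ("up to") */
        double total = (1 + gain_1751) * (1 + gain_1752) - 1;
        printf("combined best case: %.1f%%\n", total * 100);  /* ~9.4% */
        return 0;
    }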

amd-2016-crimson-relive-logo.png

Outside of that game, 17.5.2 also addresses four issues. The first is a crash in NieR: Automata. The second is long load times in Forza Horizon 3. The third is a system hang with the RX 550 when going to sleep. The fourth fixed issue is a bit more complicated: apparently, in a multi-GPU system where monitors are attached to multiple graphics cards, the primary graphics card can appear disabled in Radeon Settings. All four are now fixed, so, if they affect you, then pick up the driver.

As always, they are available from AMD’s website.

Source: AMD
Subject: Systems, Mobile
Manufacturer: Apple

What have we here?

The latest iteration of the Apple MacBook Pro has been a polarizing topic for both Mac and PC enthusiasts. Replacing the aging Retina MacBook Pro introduced in 2012, the Apple MacBook Pro 13-inch with Touch Bar introduced late last year offered some radical design changes. After much debate (and a good Open Box deal), I decided to pick up one of these MacBooks to see if it could replace my 11" MacBook Air from 2013, which was certainly starting to show its age.

DSC02852.JPG

I'm sure that a lot of our readers, even if they aren't Mac users, are familiar with some of the major changes Apple made with this new MacBook Pro. One of the biggest comes when you take a look at the available connectivity on the machine. Gone are the ports you might expect, like USB Type-A, HDMI, and Mini DisplayPort. These have been replaced with 4 Thunderbolt 3 ports and a single 3.5mm headphone jack.

While it seems like USB-C (which is compatible with Thunderbolt 3) is eventually poised to take over the peripheral market, there are obvious issues with replacing all of the connectivity on a machine aimed at professionals with Type-C connectors. Currently, Type-C devices are few and far between, meaning you will have to rely on a series of dongles to connect the devices you already own.

DSC02860.JPG

I will say, however, that it ultimately hasn't been that much of an issue for me so far in the limited time that I've owned this MacBook. In order to evaluate how bad the dongle issue was, I purchased only a single, simple adapter with my MacBook, which provided me with a Type-A USB port and a pass-through Type-C port for charging.

Continue reading our look at using the MacBook Pro with Windows!

Google Daydream Standalone VR Headset Powered by Snapdragon 835

Subject: Graphics Cards, Mobile | May 17, 2017 - 02:30 PM |
Tagged: snapdragon 835, snapdragon, qualcomm, google io 2017, google, daydream

During the Google I/O keynote, Google and Qualcomm announced a partnership to create a reference design for a standalone Daydream VR headset using the Snapdragon 835, enabling the ecosystem of partners to have deliverable hardware in consumers’ hands by the end of 2017. The timeline is aggressive, impressively so, thanks in large part to the previous work Qualcomm had done with the Snapdragon-based VR reference design we first saw in September 2016. At the time the Qualcomm platform was powered by the Snapdragon 820. Since then, Qualcomm has updated the design to integrate the Snapdragon 835 processor and platform, improving performance and efficiency along the way.

Google has now taken the reference platform, made some modifications to integrate Daydream support, and will offer it to partners to showcase what a standalone, untethered VR solution can do. Even though Google Daydream has been shipping in the form of slot-in phones with a “dummy” headset, integrating the whole package into a dedicated device offers several advantages.

First, I expect the standalone units to have better performance than the phones used as a slot-in solution. With the ability to tune the device to higher thermal limits, Qualcomm and Google will be able to ramp up the clocks on the GPU and SoC to get optimal performance. And, because there is more room for a larger battery in the headset design, there should be an advantage in battery life along with the increase in performance.

sd835vr.jpg

The Qualcomm Snapdragon 835 VR Reference Device

It is also likely that the device will have better thermal properties than those using high-end smartphones today. In other words, with more space there should be more area for cooling, and thus the unit shouldn’t be as warm on the consumer's face.

I would assume as well that the standalone units will have improved hardware over the smartphone iterations. That means better gyros, cameras, sensors, etc. that could lead to improved capability for the hardware in this form. Better hardware, tighter and more focused integration, and better software support should mean lower latency and better VR gaming across the board, assuming everything is implemented as it should be.

The only major change that Google has made to this reference platform is the move away from Qualcomm’s 6DOF technology (6 degrees of freedom, allowing you to move in real space with all necessary tracking done on the headset itself) to what Google calls WorldSense. Based on the Google Project Tango technology, this is the one area I have questions about going forward. I have used three different Tango-enabled devices thus far with long-term personal testing and can say that while the possibilities were astounding, the implementations had been…slow. For VR, that 100% cannot be the case. I don’t yet know how different its integration is from what Qualcomm had done previously, but hopefully Google will leverage the work Qualcomm has already done with its platform.

Google is claiming that consumers will have hardware based on this reference design in 2017 but no pricing has been shared with me yet. I wouldn’t expect it to be inexpensive though – we are talking about all the hardware that goes into a flagship smartphone plus a little extra for the VR goodness. We’ll see how aggressive Google wants its partners to be and if it is willing to absorb any of the upfront costs with subsidy.

Let me know if this is the direction you hope to see VR move – away from tethered PC-based solutions and into the world of standalone units.

Source: Qualcomm

Podcast #450 - AMD Ryzen, AMD EPYC, AMD Threadripper, AMD Vega, and more non AMD news!

Subject: Editorial | May 18, 2017 - 11:46 AM |
Tagged: youtube tv, western digital, video, Vega, Threadripper, spir-v, ryzen, podcast, opencl, Google VR, EPYC, Core i9, battletech, amd

PC Perspective Podcast #450 - 05/18/17

Join us for AMD Announcements, Core i9 leaks, OpenCL updates, and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, Allyn Malventano

Peanut Gallery: Alex Lustenberg

Program length: 1:20:36

Podcast topics of discussion:

  1. Week in Review:
  2. News items of interest:
  3. Hardware/Software Picks of the Week
    1. Ryan: Gigabit LTE please hurry
    2. Allyn: TriboTEX (nanotech engine oil additive)
  4. Closing/outro

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

 

 


Pot, meet kettle. Is it worse to hoard exploits or patches?

Subject: General Tech | May 16, 2017 - 01:27 PM |
Tagged: security, microsoft

Microsoft and the NSA have each been blaming the other for WannaCrypt's ability to spread via a vulnerability in SMBv1.  Microsoft considers the cause of this particular outbreak to be the NSA's decision not to share the vulnerability exploited by its EternalBlue tool with Microsoft and various other security companies.  Conversely, while Microsoft developed patches to address this vulnerability for versions of Windows including WinXP, Server 2003, and Windows 8 RT back in March, it did not release the patches for legacy OSes until the outbreak was well underway.

Perhaps the most compelling evidence in the blame game is the number of systems which should not have been vulnerable but were hit because the available patches were never installed.

These three problems have left us in the pickle we find ourselves in this week: the NSA hoarding vulnerabilities so it can exploit them for espionage; Microsoft ending support for older products because, as a business, it does not find it profitable to support them a decade or more after release; and users not taking advantage of available updates.  On the plus side, this outbreak does have people patching, so we have that going for us.

fingerpointing.jpg

"Speaking of hoarding, though, it's emerged Microsoft was itself stockpiling software – critical security patches for months."


Source: The Register