All | Editorial | General Tech | Graphics Cards | Networking | Motherboards | Cases and Cooling | Processors | Chipsets | Memory | Displays | Systems | Storage | Mobile | Shows and Expos
Subject: Processors | March 4, 2017 - 06:00 AM | Tim Verry
Tagged: xfr, turbo, sensemi, ryzen, overclocking, amd
Following the leaks and the official news and reviews of AMD's Ryzen processors, a few readers asked for clarity on the eXtended Frequency Range (XFR) technology and whether it is enabled on all Ryzen CPUs or only the X models. After quite a bit of digging through forums and contradictory articles, I believe I have the facts in hand to answer those questions. In short, XFR is supported on all Ryzen processors (at least all the Ryzen 7 CPUs released so far), including the non-X Ryzen 7 1700; however, the X SKUs get a bigger boost from XFR than the non-X model(s).
Specifically, the Ryzen 7 1700X and Ryzen 7 1800X, when paired with a high end air or water cooler, are able to boost up to an additional 100 MHz over their advertised boost clocks, while the Ryzen 7 1700 is limited to an XFR boost of up to 50 MHz so long as there is thermal headroom. Interestingly, the Extended Frequency Range boosts are applied in 25 MHz increments (likely achieved by adjusting the multiplier in 0.25x steps).
How does this all work though? Well, with Ryzen AMD introduced a new suite of technologies it calls "SenseMI" which, despite the questionable name (heh), puts a lot of intelligence into the processor and builds on the paths AMD started down with Carrizo and Excavator designs. The five main technologies are Pure Power, Precision Boost, Extended Frequency Range (XFR), Neural Net Prediction, and Smart Prefetch. The first three are important when talking about XFR.
With Ryzen, AMD has embedded a number of sensors throughout the chip that measure temperatures, clock speeds, and voltages to within 1°C, 1mA, 1mW, and 1mV, and it has connected all of the sensors together using its Infinity Fabric. Pure Power lets AMD make localized, adaptive adjustments to optimize power usage without negatively affecting performance. Precision Boost is AMD's equivalent to Intel's Turbo Boost, and it is built on top of Pure Power's sensor network. Precision Boost enables a Ryzen CPU to dynamically clock up beyond the base clock speed across all cores, or to clock even higher across two cores. Lightly threaded workloads benefit from the latter, while workloads using more than two threads get the all core boost. There is not a lot of granularity in number of cores versus allowed boost, but there does not really need to be, and Precision Boost is more granular than Intel's Turbo Boost in its clock speed bumps: 25 MHz increments versus 100 MHz increments, up to the maximum allowed Precision Boost clock. As an example, the Ryzen 7 1800X has a base clock of 3.6 GHz, and so long as there is thermal headroom it can step the clock speed up in 25 MHz increments to 3.7 GHz across all eight cores, or up to as much as 4.0 GHz on two cores.
From there, XFR allows the processor to clock beyond the two core Precision Boost (XFR only increases the two core turbo, not the all core turbo), and as temperatures decrease the allowed XFR boost increases. While initial reports and slides from AMD suggested XFR would scale with the cooler (air, water, LN2, LHe) with no upper limit aside from temperature and other sensor input, it appears AMD has taken a step back and limited X series Ryzen 7 chips to a maximum XFR boost of 100 MHz over the two core Precision Boost, and non-X series Ryzen 7 processors to a maximum XFR boost of 50 MHz over the maximum boosted two core clock speed. The Ryzen 7 1700 gets two extra 25 MHz steps above its two core boost: the chip has a base clock of 3.0 GHz, Precision Boost can take all eight cores to 3.1 GHz or two cores to 3.7 GHz, and, so long as temperatures are still in check, XFR can take those two boosted cores to 3.75 GHz.
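To make the clock stepping behavior above concrete, here is a minimal sketch that models how Precision Boost and XFR might pick a target clock. The numbers are the published Ryzen 7 1700 clocks from this article; the decision logic is an illustration inferred from AMD's public description, not AMD's actual firmware behavior.

```python
STEP_MHZ = 25  # Precision Boost and XFR move in 25 MHz steps (0.25x multiplier)

def target_clock_mhz(busy_cores: int, thermal_headroom: bool,
                     base=3000, all_core_boost=3100,
                     two_core_boost=3700, xfr_limit=50) -> int:
    """Return the highest clock (MHz) the chip may step up to.

    Defaults model the Ryzen 7 1700: 3.0 GHz base, 3.1 GHz all-core
    boost, 3.7 GHz two-core boost, and a 50 MHz XFR ceiling.
    """
    if not thermal_headroom:
        return base
    if busy_cores <= 2:
        # Lightly threaded: two-core boost, with XFR layered on top
        # (two extra 25 MHz steps for the non-X parts)
        return two_core_boost + xfr_limit
    # Three or more busy cores only get the all-core boost
    return all_core_boost

def steps_to_target(current_mhz: int, target_mhz: int) -> int:
    """How many 25 MHz increments separate the current and target clocks."""
    return (target_mhz - current_mhz) // STEP_MHZ

print(target_clock_mhz(2, True))    # two cores busy, cool chip -> 3750 (3.75 GHz)
print(target_clock_mhz(8, True))    # all cores busy -> 3100 (3.1 GHz)
print(steps_to_target(3000, 3750))  # 30 steps of 25 MHz from base to max XFR
```

Swapping in `two_core_boost=4000, xfr_limit=100` models the 1800X's advertised behavior under the same assumptions.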
XFR will be a setting that you are able to toggle on and off via a motherboard setting, and some motherboards may have the feature turned on by default. Unfortunately, if you choose to manually overclock you will lose XFR functionality (and boost). Further, Precision Boost and XFR are connected and you are not able to turn off one but not the other (you either get both or nothing). Note that if you overclock using AMD's "Ryzen Master" software utility, it will also disable Precision Boost and XFR, but the lower power C-states will stay enabled which may be desirable if you want the power bill and room to cool down when not gaming or creating content.
I would expect that as yields and the binning process improve for Ryzen, AMD may lift or extend the XFR limits, either with a product refresh (I am not sure whether a microcode update would make that possible) or perhaps only in the upcoming six core and quad core Ryzen 5 and Ryzen 3 processors that have fewer cores and more headroom for overclocking. That is merely speculation, however. Ryzen 5 and Ryzen 3 should support XFR on both X and non-X models, but it is too early to say what the XFR boost will be.
XFR is neat, though not as big of a deal as I originally thought it would be when it appeared to have no upper limit, and, as many expected, manual overclocking is still going to be the way to go. This is not all bad news though, because it means that the much cheaper Ryzen 7 1700 just got a lot more attractive. You give up a 50 MHz XFR boost that you can't use anyway because you are going to manually overclock, and you gamble a bit on getting a decently binned chip that can hit R7 1800X clock speeds, but you save $170 that you can put towards a better motherboard or a better graphics card (or a second one for CrossFire - even on B350).
I am still waiting on our overclocking results, as well as Kyle's, when it comes to the Ryzen 7 1700, but several other sites are reporting success at hitting at least 4.0 GHz. There are not many results above 4.0 or 4.1 GHz, which isn't unexpected: these are not the highest binned chips, and with yields still young the bins are based more on the actual silicon than on product segmentation alone, so most chips can still reach the higher speeds given enough power, voltage, and cooling. For example, Legit Reviews reports that they were able to manually overclock an R7 1700 to 4.0 GHz on all cores at 1.3875 volts. They were able to keep the non-X Ryzen chip stable with those settings on both aftermarket air and AIO water coolers.
AMD's Ryzen Master overclocking software lets you OC and setup CPU and memory profiles from your OS.
More on overclocking: Tom's Hardware has posted that, according to AMD, the safe voltage ceiling for overclocking is 1.35V if you want the CPU to last, but that up to 1.45V CPU voltage is "sustainable". Further, note that it is recommended not to set the SOC voltage higher than 1.2 volts. Also, much like on Intel's platform, it is possible to adjust the base clock (BCLK), but you may run into stability problems with the rest of the system if you push this too far outside expected specifications. (PC Gamer claims you can set this up to 140 MHz, so AM4/Ryzen may be more forgiving in this area than Intel. Edit: The highest figure I've seen so far is 106.4 MHz being stable before the rest of the system gets too far out of spec and becomes unstable. The main benefit to adjusting this is to support overclocked RAM above 3200 MHz, so unless you need that, your overclocking efforts are probably better spent adjusting the multiplier. /edit) Finally, when manually overclocking you will be able to turn off SMT and/or disable cores in groups of two (e.g. disable two cores or disable four cores; you cannot disable a single core at a time).
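The arithmetic behind the BCLK versus multiplier trade-off above can be sketched briefly. This is illustrative only, under the usual assumptions that the CPU clock is BCLK times the multiplier and that the memory clock scales with BCLK (which is why raising BCLK is mainly useful for pushing RAM past its top divider); the 16x memory ratio shown is the divider that yields DDR4-3200 at a stock 100 MHz BCLK.

```python
def cpu_clock_mhz(bclk: float, multiplier: float) -> float:
    """Effective CPU clock: reference clock times multiplier."""
    return bclk * multiplier

def ddr_rate(bclk: float, mem_ratio: float) -> float:
    """Effective DDR transfer rate: DDR transfers twice per memory clock."""
    return 2 * bclk * mem_ratio

# Stock: 100 MHz BCLK at a 36x multiplier -> 3600 MHz (the 1800X base clock)
print(cpu_clock_mhz(100, 36))   # 3600.0

# The reported ~106.4 MHz stable BCLK with a DDR4-3200 divider:
print(ddr_rate(106.4, 16))      # 3404.8 -> effectively DDR4-3400 memory
```

This shows why the Edit above recommends the multiplier for CPU speed: a 6.4% BCLK bump drags every other clock in the system up with it, while the multiplier moves only the CPU cores.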
Hopefully this helps to clear up the XFR confusion. If you do not need guaranteed clocks with a bonus XFR boost for a stable workstation build, saving money and going with the Ryzen 7 1700 and manually overclocking it to at least attempt to reach R7 1700X or 1800X speeds seems like the way to go for enthusiasts that are considering making the jump to AM4 especially if you enjoy tinkering with things like overclocking. There's nothing wrong with going with the higher priced and binned chips if you want to go that route, but don't do it for XFR in my opinion.
What are your thoughts? Are you planning to overclock your Ryzen CPU or do you think the Precision Boost and XFR is enough?
Subject: General Tech | March 3, 2017 - 03:12 PM | Jeremy Hellstrom
Tagged: mechanical keyboard, input, HyperX ALLOY FPS, Cherry MX
The HyperX Alloy FPS has red LED backlighting only - no Gs or Bs - but you can cycle through a variety of lighting modes using the Function key, which replaces the Windows key on the right side of the keyboard. The shell is aluminium, strong and light for those who tend to abuse their keyboards, and the Cherry MX switches are firmly attached and so should survive a few rage-quits. Modders Inc liked the keyboard overall and the price is reasonable: $80 for Blue switches or $100 if you prefer Red or Brown. Check out the full review for more specifics.
"Over the last couple of years the gaming division of Kingston; HyperX has been working hard to bust into the peripherals market. Their products started off with mouse pads and headsets. In September 2016, the HyperX Alloy FPS was released. The HyperX Allow FPS features a compact, minimalist design to maximize desk space and portability."
Here is some more Tech News from around the web:
- Rosewill RK-9300 Keyboard @ techPowerUp
- Cooler Master MasterKeys Pro M RGB @ Kitguru
- Cooler Master MasterKeys Pro S White @ Kitguru
- Roccat Suora FX @ Modders-Inc
- Roccat Suora FX @ eTeknix
- COUGAR Revenger Optical Gaming Mouse Review @ NikKTech
Subject: Cases and Cooling | March 3, 2017 - 02:12 PM | Jeremy Hellstrom
Tagged: lepa, NEOllusion RGB, air cooler, RGB
LEPA have launched a new series of air coolers, the NEOllusion RGB, which comes with a remote control so you can create a fancy light show inside your case. The screenshots at [H]ard|OCP show that the lights the heatsink produces are quite bright and will certainly be visible even from a distance. For those of you who are more interested in cooling performance than pretty lights, the NEOllusion measures 126x40x161.7mm, with a 120mm fan and a recommended max TDP of 200W. Tests show the cooler favours form over function, keeping temperatures in check but not offering class-leading performance; it does, however, favour visual impact over audible noise, as it is one of the quietest coolers [H] have tested. If you are the type to desire a quiet light show in your case, check out the full review.
"LEPA comes to us today with a new air cooler that is specifically focused on users that are looking for a little more bling inside their desktop computer build. And while really cool lights may or may not be your thing, we wanted to see just how the NEOllusion performed when it comes to its primary function, CPU cooling."
Here are some more Cases & Cooling reviews from around the web:
- Scythe Kabuto 3 @ techPowerUp
- Corsair HD120 RGB PWM Fan Kit @ Benchmark Reviews
- Reeven Polariz RFC-4 Fan Controller @ Benchmark Reviews
- Phanteks Eclipse P400S Tempered Glass Case @ Benchmark Reviews
- Enermax SteelWing @ Modders Inc
- Thermaltake View 31 TG Mid Tower Review @ NikKTech
- KitGuru looks at the Calyos NSG-S0 @ Kitguru
Subject: General Tech | March 3, 2017 - 01:48 PM | Josh Walrath
Tagged: xbox one, wheels, wheel base, rally, racing, PC, Fanatec, ClubSport V2.5, ClubSport V2
Subject: General Tech | March 3, 2017 - 01:33 PM | Jeremy Hellstrom
Tagged: microsoft, windows 10
If you are using Windows 10 Pro or Enterprise, you may have already disabled the automatic reboot function after updates are installed but for Home users after the Anniversary update, that has not been possible. It turns out there are a lot of users quite upset with unplanned reboots, especially those who leave their computers running overnight or while they are away. Microsoft have accepted this feedback and will return the ability to delay reboots to owners of the Home Edition in their next update. In the meantime, The Register describes a way in which you can regain a little more control over automatic reboots with your current build.
"Since the Windows 10 Anniversary Update in 2016, there is no way to prevent Windows 10 [Home] from automatically installing updates and rebooting your PC," fumed one vulture fan, John, who added that a group policy can be set on W10 Pro and Enterprise editions to prevent automated restarts.
Here is some more Tech News from around the web:
- Skype-on-Linux graduates from Alpha to Beta status @ The Register
- Google's troll-destroying AI can't cope with typos @ The Register
- Ubiquiti UniFi AP AC HD WiFi Access Point (UAP-AC-HD) @ Custom PC Review
- Windows Phone users may find it harder to find love @ The Inquirer
Subject: General Tech | March 3, 2017 - 07:01 AM | Scott Michaud
Tagged: zspace, VR, Khronos
About a year before I joined PC Perspective, I acquired a degree in Education, which involved teaching at a local high school. Even though that was just five years after graduating high school, the amount of available technology has exploded in that time. SmartBoards were relevant enough to be taught at my teacher’s college just in case you got one. Contrast this to when I was a high school student, where “overhead projector” was assumed to mean “transparent paper and erasable marker”.
Why do I mention this? Well, basically everyone in the tech industry has been investigating the potential of VR and AR for the last couple of years, and education is a very obvious and practical application of it.
In this case, zSpace reached out and informed us that they just joined the Khronos Group’s OpenXR Working Group. They hope to guide the specification from the educational technology perspective. From what I can see on their website, their products are basically like Wacom Cintiqs, except that the pen can function in the volume of air in front of the screen, and glasses with markers adjust the output image to make it look like objects are floating between you and the display.
If you’re in the education sector, then be sure to check out what zSpace is doing, if only to be aware of the teaching tools that are available in the world. Every teacher I knew enjoyed browsing Staples, looking through the various bits of stationery for ideas, like recipe cards for cheap, impromptu student polls and challenges.
As for the rest of us? The more mainstream VR and AR is, the more innovation will occur, especially when they contribute back to open standards; win win.
Subject: Processors | March 2, 2017 - 03:08 PM | Jeremy Hellstrom
Tagged: Ryzen 1700X, Zen, x370, video, ryzen, amd
Having started your journey with Ryan's quick overview of the performance of the 1800X, and anxiously awaiting our further coverage now that we have both the parts and the time to test them, you might want to take a peek at some other coverage. [H]ard|OCP tested the processor which many may be looking at due to its more affordable pricing, the Ryzen 7 1700X. Their test system is based on a GIGABYTE AX370-Gaming 5 with 16GB of Corsair Vengeance DDR4-3600, which ran at 2933MHz during testing; Kyle reached out to vendors, who assured him that an update making 3GHz memory speeds reachable will arrive soon. Part of their testing focused on VR performance, so make sure to check out the full article.
"Saying that we have waited for a long time for a "real" CPU out of AMD would be a gross misunderstatement, but today AMD looks to remedy that. We are now offered up a new CPU that carries the branding name of Ryzen. Has AMD risen from the CPU graveyard? You be the judge after looking at the data."
Here are some more Processor articles from around the web:
- AMD's Ryzen 7 1800X, Ryzen 7 1700X, and Ryzen 7 1700 CPUs @ The Tech Report
- AMD’s moment of Zen: Finally, an architecture that can compete @ Ars Technica
- AMD Ryzen 7 1800X CPU Review: The Wait is Over @ Modders-Inc
- The AMD Ryzen 7 1800X Performance Review @ Hardware Canucks
- The AMD Ryzen 7 Performance In 3D Rendering & Video Transcoding @ TechARP
- AMD Ryzen 7 1800X @ Kitguru
- AMD Ryzen 7 1800X @ Guru of 3D
- AMD Ryzen 7 1800X, 1700X, and 1700 Processor Review @ OCC
- AMD Ryzen 7 1800X Linux Benchmarks @ Phoronix
Subject: General Tech, Motherboards, Cases and Cooling | March 2, 2017 - 02:05 PM | Jeremy Hellstrom
Tagged: AM4, ryzen, nzxt, fractal design, scythe
We have some good news from several companies about compatibility with that AM4 board you are hoping to set up. NZXT have announced a program through which you can request a free AM4 mounting kit for your Kraken X62, X52, X42, X61, X41 or Kraken X31. Just follow this link to apply for one; kits will ship worldwide starting on the 15th of March. You will need to provide proof of purchase of both your AM4 motherboard and Kraken cooler.
Fractal Design have a similar offer for owners of their Kelvin series of coolers. You can email their Support team for a bracket for your Kelvin T12, S24 or S36; make sure to attach proof of purchase of either a Ryzen processor or an AM4 board.
Scythe is doing things a little differently. If you reside in Europe, they are offering free mounting kits to owners of their Mugen 5 cooler; simply reach out to them via this link, again attaching a receipt for the cooler and either a Ryzen CPU or AM4 motherboard. Owners of a Katana 3 or 4, Kabuto 3, Shuriken Rev. B, Tatsumi “A”, Byakko, or Iori cooler need not even go through that process, as those coolers' mounts are already compatible. Owners of other coolers can reach out to Scythe via the previous link to order a bracket for 3,99€, shipping sometime in May or later. We will let you know when we hear from the NA branch.
"Coinciding with the new AMD Zen-based Ryzen CPUs, and the new AM4 socket, NZXT will be providing a free retention bracket for all current Kraken users. NZXT believes in providing high-quality components to our customers, in addition to exceptional customer service no matter where they reside and we will continue that support alongside the launch of Ryzen."
Here is some more Tech News from around the web:
- Netflix Uses AI in Its New Codec To Compress Video Scene By Scene @ Slashdot
- IBM inexplicably granted patent for 'Out of Office' because FFS @ The Inquirer
- The day after 'S3izure', does anyone feel like moving to the cloud? @ The Register
- Nintendo Switch Game Cartridges Taste Awful @ [H]ard|OCP
- Online shops plundered by bank card-stealing malware after bungling backend Aptos hacked @ The Register
- TSMC seeking stake in Toshiba chip business to expand into 3D NAND sector @ DigiTimes
- SSD push for Seagate to complement its HDD business @ DigiTimes
- Some hateful human has brought Microsoft Clippy to Google Chrome for no reason @ The Inquirer
- Business Foxconn 'very confident' of buying Toshiba's NAND business @ The Register
Subject: Editorial | March 2, 2017 - 11:37 AM | Alex Lustenberg
Tagged: Vega, ryzen, podcast, fcat, 1080Ti, 1080
PC Perspective Podcast #439 - 03/02/17
Join us for GTX 1080 Ti, Radeon RX Vega, Ryzen and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Allyn Malventano, Josh Walrath, Jeremy Hellstrom
Program length: 1:41:49
Week in Review:
News items of interest:
0:36:35 Ryzen News
0:56:25 Ryzen 7 1800X Discussion
Subject: Processors | March 2, 2017 - 11:29 AM | Ryan Shrout
Tagged: amd, ryzen, gaming, 1080p
By far one of the most interesting and concerning points about today's launch of the AMD Ryzen processor is its gaming results. Many other reviewers have seen results similar to what I published in my article this morning: gaming at 1080p, even at "ultra" image quality settings, in many top games shows a performance deficit compared to Intel Kaby Lake and Broadwell-E processors.
I shared my testing results with AMD over a week ago, trying to get answers and hoping to find some instant fix (a BIOS setting, a bug in my firmware). As it turns out, that wasn't the case. To be clear, our testing was done on the ASUS Crosshair VI Hero motherboard with the 5704 BIOS, and any reports you see claiming that the deficits only existed on ASUS products are incorrect.
AMD responded to the issues late last night with the following statement from John Taylor, CVP of Marketing:
“As we presented at Ryzen Tech Day, we are supporting 300+ developer kits with game development studios to optimize current and future game releases for the all-new Ryzen CPU. We are on track for 1000+ developer systems in 2017. For example, Bethesda at GDC yesterday announced its strategic relationship with AMD to optimize for Ryzen CPUs, primarily through Vulkan low-level API optimizations, for a new generation of games, DLC and VR experiences.
Oxide Games also provided a public statement today on the significant performance uplift observed when optimizing for the 8-core, 16-thread Ryzen 7 CPU design – optimizations not yet reflected in Ashes of the Singularity benchmarking. Creative Assembly, developers of the Total War series, made a similar statement today related to upcoming Ryzen optimizations.
CPU benchmarking deficits to the competition in certain games at 1080p resolution can be attributed to the development and optimization of the game uniquely to Intel platforms – until now. Even without optimizations in place, Ryzen delivers high, smooth frame rates on all “CPU-bound” games, as well as overall smooth frame rates and great experiences in GPU-bound gaming and VR. With developers taking advantage of Ryzen architecture and the extra cores and threads, we expect benchmarks to only get better, and enable Ryzen excel at next generation gaming experiences as well.
Game performance will be optimized for Ryzen and continue to improve from at-launch frame rate scores.” John Taylor, AMD
The statement begins with Taylor reiterating the momentum of AMD to support developers both from a GPU and a CPU technology angle. Getting hardware in the hands of programmers is the first and most important step to find and fixing any problem areas that Ryzen might have, so this is a great move to see taking place. Both Oxide Games and Creative Assembly, developers of Ashes of the Singularity and Total War respectively, have publicly stated their intent to demonstrate improved threading and performance on Ryzen platforms very soon.
Taylor then acknowledges the performance concerns at 1080p, attributing those deficits to years of optimization for Intel processors. It's difficult, if not impossible, to know for sure how much weight this argument carries, but it would make some logical sense. Intel CPUs have been the automatic, de facto standard for gaming PCs for many years, and any performance optimizations and development would have been made on those same Intel processors. So it seems plausible that simply seeding Ryzen to developers and having them look at performance as development goes forward would result in a positive change for AMD's situation.
For buyers today that are gaming at 1080p, the situation is likely to remain as we have presented it going forward. Until games get patched or new games are released from developers that have had access and hands-on time with Ryzen, performance is unlikely to change from some single setting/feature that AMD or its motherboard partners can enable.
The question I would love answered is why this is happening at all. What architectural difference between Core and Zen is contributing to this delta? Is it fundamental to the pipeline design, to the caching structure, or to how SMT is implemented? Does Windows 10, with its handling of kernel processes and scheduling, have something to do with it? There is a lot to try to figure out as testing moves forward.
If you want to see the statements from both Oxide and Creative Assembly, they are provided below.
“Oxide games is incredibly excited with what we are seeing from the Ryzen CPU. Using our Nitrous game engine, we are working to scale our existing and future game title performance to take full advantage of Ryzen and its 8-core, 16-thread architecture, and the results thus far are impressive. These optimizations are not yet available for Ryzen benchmarking. However, expect updates soon to enhance the performance of games like Ashes of the Singularity on Ryzen CPUs, as well as our future game releases.” - Brad Wardell, CEO Stardock and Oxide
"Creative Assembly is committed to reviewing and optimizing its games on the all-new Ryzen CPU. While current third-party testing doesn’t reflect this yet, our joint optimization program with AMD means that we are looking at options to deliver performance optimization updates in the future to provide better performance on Ryzen CPUs moving forward. " – Creative Assembly, Developers of the Multi-award Winning Total War Series
Subject: General Tech | March 2, 2017 - 02:59 AM | Scott Michaud
Tagged: Oculus, VR, pc gaming
Alongside the release of Robo Recall from Epic Games, which is free if you own an Oculus Rift and the Oculus Touch controllers, Oculus has changed up how you can purchase the Oculus Rift. As has been the case since the Touch controllers shipped, the Oculus Rift is bundled with these motion controllers. The difference is that, now, the bundle will cost $598 USD. This is a $200 reduction in price compared to someone who purchased the headset and the controllers separately last week. The controllers alone are now $99 USD.
So this is interesting.
According to recent statements by Gabe Newell, who is obviously in the HTC Vive camp, the VR market doesn’t have “a compelling reason for people to spend 20 hours a day in VR”. This assertion was intended to dispel the opinion that a price cut would help VR along. From his perspective, VR will have a huge bump in resolution and frame rate within one or two years, and current headsets are basically the minimum of adequacy.
So, from both a software and technology standpoint, VR can benefit from more time in the oven before tossing it down the garbage disposal. I see that point and I agree with it, but only to a point. A price reduction can still help in several ways. First, the games industry has made some drastic shifts toward the individual. Free tools, from IDEs to AAA-quality game engines, seem to be picking up in adoption. A high entry fee for a segment of that mind share will push those with creative ideas elsewhere.
But, probably more importantly, even if the market is small, pulling in more users makes it grow. The more lead users that you can acquire, the more risk developers can take, which makes for an even better situation by the time we need to start considering the mass market. Imagine if a factor-of-two increase in user base were enough for Microsoft (or Linux distros) to consider virtual desktops for VR. If we reach that threshold a year or two sooner, then it will have a more significant impact on the value for mainstream users whenever the technology catches up to their interest.
And yes, this is coming from the guy who is currently surrounded by four monitors...
Anyway, rant aside, Oculus has jumped into a significant price reduction. This should get the headset into the hands of more people, assuming the injunction order doesn’t get accepted and drop on them like a hammer.
Subject: General Tech | March 2, 2017 - 12:16 AM | Tim Verry
Tagged: youtube red, youtube, live tv, cord cutting, cloud dvr, broadcast tv
YouTube is jumping into the streaming TV market with the launch of YouTube TV. The new "over the top" streaming service is aimed at cord cutters and users that want to watch live and recorded TV on their mobile devices. YouTube TV joins AT&T's DirecTV Now, Dish Network's Sling TV, and PlayStation Vue with a streaming package of ~40 channels for $35 per month that is reportedly the result of licensing negotiations and deals two years in the making.
The streaming platform, which is reportedly coming in the next weeks to months (depending on the market and local market licensing), will come out swinging with two main advantages over the existing competition: YouTube TV will allow more simultaneous streams (six accounts with up to 3 streams going at the same time) and have DVR functionality with unlimited storage and unlimited simultaneous recordings where episodes will be saved for 9 months.
Unfortunately, YouTube TV suffers the same main drawback of these over the top TV streaming services which is channel selection. Due to licensing issues, YouTube TV will have a collection of 40 channels at launch including access to ABC, NBC, FOX, CBS, CBS Sports Network, ESPN, E!, CW, FX, USA, Freeform, FS1, Disney Channel, and more. However, it lacks the cable-only networks like AMC and Viacom (also no MTV, CNN, TNT, TBS, Comedy Central, HGTV, or Food Network). Showtime is available for an extra monthly fee though.
The sports channels are nice to see and are sure to be appreciated, but due to Verizon's exclusivity deal, NFL games are restricted to PCs and cannot be streamed on mobile devices using YouTube TV.
For those interested, CNET has a full list of the channels here. YouTube TV will reportedly also allow access to YouTube Red programming, but the TV programming will still have ads (of course).
Excepting the NFL streams, users can watch live and recorded TV on their PCs, smartphones, tablets, and Chromecasts. Google Home support is currently in development as well and will eventually allow you to tune into a channel on your Chromecast using your voice.
I am excited to see another major player enter this IP TV streaming space, and with a working DVR it will have a leg up on the competition (here's looking at you, DirecTV Now). With Google backing the venture, I am hopeful that it can flex its considerable capital muscle to work out further deals with the stubborn cable networks, and eventually (maybe) we will see a truly a la carte TV streaming service!
What are your thoughts on YouTube TV? Is it enough to get you to cut the cord, or are you too into The Walking Dead?
Subject: Processors | March 1, 2017 - 09:17 PM | Tim Verry
Tagged: solder, Ryzen 1700, ryzen, overclocking, IHS, delid, amd
Professional extreme overclocker Roman "der8auer" Hartung from Germany recently managed to successfully de-lid his AMD Ryzen 7 1700 processor and confirmed that AMD is, in fact, using solder as its thermal interface material of choice between the Ryzen die and IHS (integrated heat spreader). The confirmation that AMD is using solder is promising news for enthusiasts eager to overclock the new processors and see just how far they are able to push them on air and water cooling.
Image credit: Roman Hartung. Additional high resolution photos are available here.
In a video on his YouTube channel, der8auer ("the farmer") shows the steps involved in delidding the Ryzen 7 1700, which involve razor blades, a heating element to get the IHS hot enough to melt the indium (~170°C on the block, with the indium melting around 157°C), and a whole lot of courage. After using the razor blades to cut the glue around the edges, he heated up the IHS enough to start melting the solder, and after a cringe-worthy cracking sound he was able to lift the package away from the IHS with the die and on-package components intact!
He does note that Ryzen's use of a PGA package, rather than the LGA approach Intel has moved to, makes the CPU a bit harder to handle, as the pins are on the CPU rather than in the socket and are easily bent. Compared to the delidding process and the possibility of cracking the die or ripping off some components and killing the $329 CPU, though, bent pins are nothing and can usually be bent back heh. He reportedly went through two Ryzen CPUs before getting a successful de-lid on the third attempt.
It seems that AMD is using two small pads of indium solder along with some gold plating on the inside of the IHS to facilitate heat transfer and allow the solder to mate with the IHS. Because AMD appears to be using a high quality solder TIM, delidding and replacing the TIM does not seem necessary at all; however, Roman "der8auer" Hartung speculates that direct die cooling could work out very well for enthusiasts brave enough to try it, since the cooler does not need to put high pressure on the CPU to hold it in place, unlike an LGA socket.
If you are interested in seeing the overclocking benefits of de-lidding and direct die cooling a Ryzen CPU, keep an eye on his YouTube channel for a video over the weekend detailing his testing using a Ryzen 7 1800X.
I am really looking forward to seeing how far enthusiasts are able to push Ryzen (especially on water), and maybe we can convince Morry to de-lid a Ryzen CPU!
- Overclockers Push Ryzen 7 1800X to 5.2 GHz On LN2, Break Cinebench Record
- Delidding your Intel Haswell CPU @ PC Perspective (Morry Teitelman)
- Photos and Tests of Skylake (Intel Core i7-6700K) Delidded
- Intel Haswell-E De-Lidded: Solder Is Its Thermal Interface
Subject: General Tech | March 1, 2017 - 08:12 PM | Scott Michaud
Tagged: VR, pc gaming, openxr, Khronos
While the Vulkan update headlines the Khronos Group’s presence at GDC 2017, they also re-announced their VR initiative, now called OpenXR. This specification wraps around the individual SDKs, outlining functionality that is to be exposed to the application and the devices. If a device implements the device layer, then it will immediately support everything that uses the standard, and vice-versa.
OpenVR was donated by Valve, leading to OpenXR...
... because an X is really just a reflected V, right?
Like OpenGL and Vulkan, individual vendors will still be allowed to implement their own functionality, which I’m hoping will be mostly exposed through extensions. The goal is to ensure that users can, at a minimum, enjoy the base experience of any title on any device.
They are aiming for 2018, but interested parties should contribute now to influence the initial release.
Subject: Graphics Cards | March 1, 2017 - 05:04 PM | Sebastian Peak
Tagged: video card, RX 580, RX 570, RX 560, RX 550, rx 480, rumor, report, rebrand, radeon, graphics, gpu, amd
According to a report from VideoCardz.com we can expect AMD Radeon RX 500-series graphics cards next month, with an April 4th launch of the RX 580 and RX 570, and subsequent RX 560/550 launch on April 11. The bad news? According to the report "all cards, except RX 550, are most likely rebranded from Radeon RX 400 series".
Until official confirmation of specs arrives, this is still speculative; however, if Vega is not ready for an April launch and AMD will indeed be refreshing its Radeon lineup, an R9 300-series-style speed bump/rebrand is not out of the realm of possibility. VideoCardz offers (unconfirmed, at this point) specs of the upcoming RX 500-series cards, with RX 400 numbers for comparison:
Chart credit: VideoCardz.com
The first graph shows the increased GPU boost clock speed of ~1340 MHz for the rumored RX 580, with the existing RX 480 clocked at 1266 MHz. Both would be Polaris 10 GPUs with otherwise identical specs. The same largely holds for the rumored specs on the RX 570, though this GPU would presumably be shipping with faster memory clocks as well. On the RX 560 side, however, the Polaris 11 powered replacement for the RX 460 might be based on the 1024-core variant we have seen from the Chinese market.
Chart credit: VideoCardz.com
No specifics on the RX 550 are yet known, which VideoCardz says "is most likely equipped with Polaris 12, a new low-end GPU". These rumors come via heise.de (German language), who state that those "hoping for Vega-card will be disappointed - the cards are intended to be rebrands with known GPUs". We will have to wait until next month to know for sure, but even if this is the case, expect faster clocks and better performance for the same money.
Subject: General Tech | March 1, 2017 - 03:46 PM | Jeremy Hellstrom
Tagged: Tegra X1, Nintendo Switch, Joy-Con, gaming
The Nintendo Switch has arrived for those who feel that mobile gaming is lacking in analog joysticks and buttons. The product sits in an interesting place: the 720p screen is nowhere near the resolution of modern phones, though those phones lack a dock that triggers an overclocked mode to send 1080p to a TV. Nintendo's programming teams also have far more resources than most mobile app developers and can incorporate tricks a phone simply will not be able to replicate. Ars Technica took the Switch, its two Joy-Cons, and the limited number of released games on a tour to see just how well Nintendo did on its new portable gaming system. There are some improvements that could be made, but the Joy-Cons do sound more interesting than the Game Boy Advance.
"With the Switch, Nintendo seems to be betting that the continued drum beat of Moore's Law and miniaturization has made that dichotomy moot. The Switch is an attempt to drag the portable gaming market kicking and screaming to a point where it's literally indistinguishable from the experience you'd get playing on a 1080p HDTV."
Here is some more Tech News from around the web:
- XC’mon: Xenonauts 2 releasing free dev builds @ Rock, Paper, SHOTGUN
- Galactic Civilizations III v2.0 Review @ OCC
- Stellaris’ New Horizons mod is the best Star Trek game @ Rock, Paper, SHOTGUN
- Wot I Think – Torment: Tides of Numenera @ Rock, Paper, SHOTGUN
- ARMA Humble Bundle
- Total Warhammer DLC ended with Bretonnia as devs switch to sequel @ Rock, Paper, SHOTGUN
- Radeon vs. NVIDIA Performance For HITMAN On Linux With 17 GPUs @ Phoronix
Subject: Mobile | March 1, 2017 - 02:26 PM | Tim Verry
Tagged: Snapdragon 625, opinion, MWC, keyone, enterprise, Cortex A53, blackberry, Android 7.1, Android
February is quite the busy month with GDC, MWC, and a flurry of technology announcements coming out all around the same time! One of the more surprising announcements from Mobile World Congress in Barcelona came from BlackBerry in the form of a new mid-range smartphone it is calling the KEYone. The KEYone is an Android 7.1 smartphone actually built by TCL with an aluminum frame, "soft touch" plastic back, curved edges, and (in traditional CrackBerry fashion) a full physical QWERTY keyboard!
The black and silver candy bar style KEYone (previously known as "Mercury") measures 5.78" x 2.85" x 0.37" and weighs 0.39 pounds. The left, right, and bottom edges are rounded and the top edge is flat. There are two bottom firing stereo speakers surrounding a USB Type-C port (Type-C 1.0 with USB OTG), a headphone jack up top, and volume, power, and convenience key buttons on the right side. The front of the device, which BlackBerry has designed to be comfortable to use one handed, features a 4.5" 1620 x 1080 LCD touchscreen (434 PPI) protected by Gorilla Glass 4, a front facing camera with LED flash, and a large physical keyboard with straight rows of keys that have a traditional BlackBerry feel. The keyboard, in addition to having physical buttons, supports touch gestures such as swiping, and the spacebar has a fingerprint reader that early hands-on reports indicate works rather well for quickly unlocking the phone. Further, every physical key can be programmed as a hot key to open any application with a long press (B for browser, E for email, etc.).
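As a quick sanity check on that quoted pixel density, PPI is just the diagonal pixel count divided by the diagonal size in inches. A minimal sketch (the function name is mine, not from any spec sheet):

```python
import math

def pixel_density(width_px, height_px, diagonal_in):
    """Pixels per inch from panel resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # diagonal in pixels
    return diagonal_px / diagonal_in

# KEYone: 4.5" display at 1620 x 1080
ppi = pixel_density(1620, 1080, 4.5)
print(round(ppi))  # ~433, in line with the quoted 434 PPI
```

The one-pixel discrepancy versus the official figure comes down to rounding in the quoted panel dimensions.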
On the camera front, BlackBerry is using the same sensor found in the Google Pixel: the Sony IMX378. The 12MP f/2.0 rear camera has a dual LED flash and phase detect autofocus, and it is joined by a front facing 8MP camera. Both cameras can record 1080p30 video and support HDR and software features like face detection. Android Central reports that the camera software is rather good (it even has a pro mode) and the camera is snappy at taking photos.
Internally, BlackBerry has opted to go with squarely mid-range hardware, which is disappointing but not the end of the world. Specifically, the KEYone is powered by a Snapdragon 625 (MSM8953) with eight ARM Cortex A53 cores clocked at 2GHz and an Adreno 506 GPU, paired with 3GB of RAM and 32GB of internal storage. Wireless support includes dual band 802.11ac, FM, Bluetooth 4.2, GPS, NFC, and GSM/HSPA/LTE cellular radios. The smartphone uses a 3,505 mAh battery that is not user removable but at least supports Quick Charge 3.0, which can reportedly charge the battery to 50% in 36 minutes. Storage can be expanded via MicroSD cards. The smartphone runs Android 7.1.1 with some BlackBerry UI tweaks but is otherwise fairly stock. Under the hood, however, BlackBerry has hardened the OS and includes its DTEK security software along with a promise of monthly updates.
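That 50%-in-36-minutes claim implies a fairly ordinary average charge current. A back-of-the-envelope check, which ignores charge tapering and conversion losses (the variable names are mine):

```python
battery_mah = 3505        # rated battery capacity
charged_fraction = 0.5    # claimed 50% charge
minutes = 36              # claimed charge time

charge_mah = battery_mah * charged_fraction           # ~1753 mAh delivered
avg_current_a = charge_mah / 1000 / (minutes / 60)    # amp-hours / hours
print(round(avg_current_a, 2))  # ~2.92 A average
```

Roughly 2.9 A average at battery voltage is well within what Quick Charge 3.0 hardware can sustain, so the claim is plausible on paper.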
Not bad, right? Looking at the specifications and reading/watching the various hands-on reports coming out, it really looks like BlackBerry (finally) has a decent piece of hardware for enterprise customers, niche markets (lawyers, healthcare, etc.), and customers craving a physical keyboard in a modern phone. At first glance the BlackBerry KEYone hits all the key marks of a competitive Android smartphone... except for its $549 price tag. The KEYone is expected to launch in April.
No scroll ball? Blasphemy! (hehe)
Unfortunately, that $549 price is not a typo, and it is what kills the phone even for a CrackBerry addict like myself. After some reflection and discussion with our intrepid smartphone guru Sebastian, I feel BlackBerry would have a competitive smartphone on its hands at $399. At $549, even business IT departments are going to balk, much less consumers, especially as many businesses embrace BYOD culture or have grown accustomed to pricing out and giving everyone whatever basic Android phone or iPhone fits the budget.
While similarly specced Snapdragon 625 smartphones go for around $300 (e.g. the Asus ZenFone 3 at $265.98), there is some precedent for higher priced MSM8953-based smartphones such as the $449 Moto Z Play. There is inherent cost in integrating a physical keyboard, and BlackBerry has also hardened the Android 7.1.1 OS, which I can see them charging a premium for and which business customers (or anyone who does a lot of writing on the go and values security) can appreciate. It seems BlackBerry (and hardware partner TCL) has finally learned how to compete on hardware design in this modern Android-dominated market; now they must learn how to compete on price, especially as more and more Americans buy unlocked, off-contract smartphones! I think the KEYone is a refreshing bit of hardware to come out of BlackBerry (I was not a fan of the Priv design) and I would like to see it do well and give the major players (Apple, Samsung, LG, Asus, Huawei, etc.) some healthy competition with the twist of its focus on better security, but for that to happen I think the BlackBerry KEYone needs to be a bit cheaper.
What are your thoughts on the KEYone and the return of the physical keyboard? Am I onto something or simply off my Moto Rokr on this?
Subject: General Tech | March 1, 2017 - 01:23 PM | Jeremy Hellstrom
Tagged: tesla motors, battery
Hack a Day posted a video of a teardown of the battery that powers the Tesla Model S, for those curious about how it is put together. This is not something to try at home: not only are there a huge number of bolts and Torx screws, it seems each has a specific torque spec that must be adhered to. Inside are 16 battery packs, each containing 444 cells at a nominal 24V and storing about 5.3 kWh. Do not test the charge on these batteries with your tongue! Click on through to watch the video.
"Tesla famously build their battery packs from standard 18650 lithium-ion cells, but it’s safe to say that the pack in the Model S has little in common with your laptop battery. Fortunately for those of a curious nature, [Jehu Garcia] has posted a video showing the folks at EV West tearing down a Model S pack from a scrap car, so we can follow them through its construction."
Here is some more Tech News from around the web:
- Security slip-ups in 1Password and other password managers 'extremely worrying' @ The Register
- Windows 7 market share rises at the expense of Windows 10 @ The Inquirer
- Amazon's AWS S3 cloud storage evaporates: Top websites, Docker stung @ The Register
- HTC to launch mobile VR devices for year-end holiday season @ DigiTimes
- Google Pulls the Plug On Its Pixel Laptops @ Slashdot
- The 15 New AMD Ryzen 7 CPU Coolers Revealed @ TechARP
- Nvidia unveils the GTX 1080 Ti at GDC @ The Tech Report
Subject: Graphics Cards | February 28, 2017 - 10:59 PM | Ryan Shrout
Tagged: pascal, nvidia, gtx 1080 ti, gp102, geforce
Tonight at a GDC party hosted by CEO Jen-Hsun Huang, NVIDIA announced the GeForce GTX 1080 Ti graphics card, coming next week for $699. Let’s dive right into the specifications!
| | GTX 1080 Ti | Titan X (Pascal) | GTX 1080 | GTX 980 Ti | TITAN X | GTX 980 | R9 Fury X | R9 Fury | R9 Nano |
|---|---|---|---|---|---|---|---|---|---|
| GPU | GP102 | GP102 | GP104 | GM200 | GM200 | GM204 | Fiji XT | Fiji Pro | Fiji XT |
| Base Clock | 1480 MHz | 1417 MHz | 1607 MHz | 1000 MHz | 1000 MHz | 1126 MHz | 1050 MHz | 1000 MHz | up to 1000 MHz |
| Boost Clock | 1600 MHz | 1480 MHz | 1733 MHz | 1076 MHz | 1089 MHz | 1216 MHz | - | - | - |
| Memory Clock | 11000 MHz | 10000 MHz | 10000 MHz | 7000 MHz | 7000 MHz | 7000 MHz | 500 MHz | 500 MHz | 500 MHz |
| Memory Interface | 352-bit G5X | 384-bit G5X | 256-bit G5X | 384-bit | 384-bit | 256-bit | 4096-bit (HBM) | 4096-bit (HBM) | 4096-bit (HBM) |
| Memory Bandwidth | 484 GB/s | 480 GB/s | 320 GB/s | 336 GB/s | 336 GB/s | 224 GB/s | 512 GB/s | 512 GB/s | 512 GB/s |
| TDP | 250 watts | 250 watts | 180 watts | 250 watts | 250 watts | 165 watts | 275 watts | 275 watts | 175 watts |
| Peak Compute | 10.6 TFLOPS | 10.1 TFLOPS | 8.2 TFLOPS | 5.63 TFLOPS | 6.14 TFLOPS | 4.61 TFLOPS | 8.60 TFLOPS | 7.20 TFLOPS | 8.19 TFLOPS |
The GTX 1080 Ti looks a whole lot like the TITAN X launched in August of last year. Based on the 12B-transistor GP102 chip, the new GTX 1080 Ti has 3,584 CUDA cores with a 1.60 GHz boost clock. That gives it the same shader count as the Titan X but a slightly higher clock speed, which should make the GTX 1080 Ti faster by at least a few percentage points; it holds a roughly 4.5% edge in base clock compute capability. It has 28 SMs, 28 geometry units, and 224 texture units.
Interestingly, the memory system on the GTX 1080 Ti gets adjusted – NVIDIA has disabled a single 32-bit memory controller, giving the card a 352-bit wide bus and an odd-sounding 11GB memory capacity. The ROP count also drops to 88 units. Speaking of 11, the G5X memory on the GTX 1080 Ti will now run at 11 Gbps, a boost available to NVIDIA thanks to a chip revision from Micron and improvements to equalization and signal distortion.
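The headline numbers in the table above fall out of simple arithmetic: peak FP32 compute is cores × 2 FLOPs (one fused multiply-add) per clock, and memory bandwidth is bus width in bytes × per-pin data rate. A minimal sketch (the function names are mine):

```python
def peak_fp32_tflops(cuda_cores, clock_ghz):
    # Each CUDA core retires one FMA (2 FLOPs) per clock.
    return cuda_cores * 2 * clock_ghz / 1000

def memory_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Bytes per second = (bus width / 8) bytes x per-pin data rate.
    return bus_width_bits / 8 * data_rate_gbps

# GTX 1080 Ti at its 1480 MHz base clock, 352-bit bus at 11 Gbps
print(peak_fp32_tflops(3584, 1.48))     # ~10.6 TFLOPS, matching the table
print(memory_bandwidth_gb_s(352, 11))   # 484 GB/s, matching the table
```

Note that NVIDIA's quoted 10.6 TFLOPS uses the base clock; at the 1.60 GHz boost clock the same math yields closer to 11.5 TFLOPS.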
The TDP of the new part is 250 watts, matching the Titan X and sitting well above the GTX 1080's 180 watts. The cooler has been improved compared to the GTX 1080, offering quieter fan speeds and lower temperatures when operating in the same power envelope.
Performance estimates from NVIDIA put the GTX 1080 Ti about 35% faster than the GTX 1080, the largest generational performance increase we have seen from a flagship Ti launch.
Pricing is going to be set at $699 so don't expect to find this in any budget builds. But for the top performing GeForce card on the market, it's what we expect. It should be on virtual shelves starting next week.
(Side note, with the GTX 1080 getting a $100 price drop tonight, I think we'll find this new lineup very compelling to enthusiasts.)
NVIDIA did finally detail its tiled caching rendering technique. We'll be diving more into that in a separate article with a little more time for research.
One more thing…
In another interesting move, NVIDIA is going to be offering “overclocked” versions of the GTX 1080 and GTX 1060 with +1 Gbps memory speeds. Partners will be offering them with some undisclosed price premium.
I don’t know how much performance this will give us but it’s clear that NVIDIA is preparing its lineup for the upcoming AMD Vega release.
We’ll have more news from NVIDIA and GDC as it comes!
Subject: Graphics Cards | February 28, 2017 - 10:55 PM | Tim Verry
Tagged: pascal, nvidia, GTX 1080, GDC
Update Feb 28 @ 10:03pm It's official, NVIDIA launches $699 GTX 1080 Ti.
NVIDIA is hosting a "Gaming Celebration" live event during GDC 2017 to talk PC gaming and possibly launch new hardware (if rumors are true!). During the event, NVIDIA CEO Jen-Hsun Huang made a major announcement regarding its top-end GTX 1080 graphics card with a price drop to $499 effective immediately.
The NVIDIA GTX 1080 is a Pascal-based graphics card with 2560 CUDA cores paired with 8GB of GDDR5X memory. Graphics cards based on this GP104 GPU are currently selling for around $580 to $700 (most hover around $650), with the "Founders Edition" carrying a $699 MSRP. The $499 price teased at the live stream represents a significant price drop compared to what the graphics cards are going for now. NVIDIA did not specify whether the new $499 MSRP is the new Founders Edition price or an average price that includes partner cards as well, but even if it only applies to the reference cards, the partners would have to adjust their prices downward accordingly to compete.
I suspect that NVIDIA is making such a bold move to make room in their lineup for a new product (the long-rumored 1080 Ti perhaps?) as well as a pre-emptive strike against AMD and their Radeon RX Vega products. This move may also be good news for GTX 1070 pricing as they may also see price drops to make room for cheaper GTX 1080 partner cards that come in below the $499 price point.
If you have been considering buying a new graphics card, NVIDIA has sweetened the pot a bit especially if you had already been eyeing a GTX 1080. (Note that while the price drop is said to be effective immediately, at the time of writing Amazon was still showing "normal"/typical prices for the cards. Enthusiasts might have to wait a few hours or days for the retailers to catch up and update their sites.)
This makes me a bit more excited to see what AMD will have to offer with Vega as well as the likelihood of a GTX 1080 Ti launch happening sooner rather than later!