All | Editorial | General Tech | Graphics Cards | Networking | Motherboards | Cases and Cooling | Processors | Chipsets | Memory | Displays | Systems | Storage | Mobile | Shows and Expos
Subject: General Tech | November 8, 2017 - 03:26 PM | Jeremy Hellstrom
Tagged: gaming, Wolfenstein 2, the new colossus, nvidia, amd, vulkan
Wolfenstein II: The New Colossus uses the Vulkan API, which could favour AMD's offerings; however, NVIDIA have vastly improved their support, so a win is not guaranteed. The Guru of 3D tested the three resolutions most people are interested in, 1080p, 1440p and 4K, on 20 different GPUs in total. They also took a look at the impact of 4-core versus 8-core CPUs, testing the i7-4790K and i7-5960X as well as the Ryzen 7 1800X, and even explored the amount of VRAM the game uses. Drop by to see all their results as well as hints on dealing with the current bugs.
"We'll have a peek at the PC release of Wolfenstein II The New Colossus for Windows relative towards graphics card performance. The game is 100% driven by the Vulkan API. In this test twenty graphics cards are being tested and benchmarked."
Here is some more Tech News from around the web:
- Stellaris FTL changes are in the warp pipes @ Rock, Paper, SHOTGUN
- Humble Strategy Simulator Bundle
- Nearly a year later, video game voice actors end their strike @ Ars Technica
- Call of Duty WWII: Benchmark Performance Analysis @ TechPowerUp
- Take Two: All future games will feature microtransactions @ HEXUS
- Wot I Think: Call of Duty: WW2 Multiplayer @ Rock, Paper, SHOTGUN
- Call of Duty: WW2: PC graphics analysis benchmark @ Guru of 3D
- Total War: Rome II expanding again with Empire Divided @ Rock, Paper, SHOTGUN
- F1 2017 On Linux With 23 Graphics Cards Using Vulkan @ Phoronix
- Radeon vs. NVIDIA Vulkan Performance For F1 2017 On Linux @ Phoronix
Subject: Processors | November 8, 2017 - 02:03 PM | Ryan Shrout
Tagged: qualcomm, centriq 2400, centriq, arm
At an event in San Jose on Wednesday, Qualcomm and its partners officially announced that the Centriq 2400 server processor, based on the Arm architecture, is shipping to commercial clients. This launch is of note as it becomes the highest-profile and most partner-lauded Arm-based server CPU and platform to be released after years of buildup and excitement around several similar products. The Centriq is built specifically for enterprise cloud workloads, with an emphasis on high core count and high throughput, and will compete against Intel’s Xeon Scalable and AMD’s new EPYC platforms.
Paul Jacobs shows Qualcomm Centriq to press and analysts
Built on the same 10nm process technology from Samsung that gave rise to the Snapdragon 835, the Centriq 2400 becomes the first server processor on that particular node. While Qualcomm and Samsung tout that as a significant selling point, on its own it doesn’t hold much value. Where it does come into play and impact the product's positioning is the resulting power efficiency it brings to the table. Qualcomm claims that the Centriq 2400 will “offer exceptional performance-per-watt and performance-per-dollar” compared to competing server options.
The raw specifications and capabilities of the Centriq 2400 are impressive.
| | Centriq 2460 | Centriq 2452 | Centriq 2434 |
|---|---|---|---|
| Process Tech | 10nm (Samsung) | 10nm (Samsung) | 10nm (Samsung) |
| Base Clock | 2.2 GHz | 2.2 GHz | 2.3 GHz |
| Max Clock | 2.6 GHz | 2.6 GHz | 2.5 GHz |
| Memory Speeds | 2667 MHz | 2667 MHz | 2667 MHz |
| Cache | 24MB L2, split | 23MB L2, split | 20MB L2, split |
| PCIe | 32 lanes PCIe 3.0 | 32 lanes PCIe 3.0 | 32 lanes PCIe 3.0 |
Built from 18 billion transistors in a die area of just 398mm², the SoC holds 48 high-performance 64-bit cores running at frequencies as high as 2.6 GHz. (Interestingly, this appears to be about the same peak clock rate as all the Snapdragon processor cores we have seen on consumer products.) The cores are interconnected by a bi-directional ring bus that is reminiscent of the design Intel used on its Core processor family up until Skylake-SP was brought to market. The bus supports 250 GB/s of aggregate bandwidth, and Qualcomm claims that this will alleviate any concern over congestion bottlenecks, even with the CPU cores under full load.
The caching system provides 512KB of L2 cache for every pair of CPU cores, essentially organizing them into dual-core blocks. 60MB of L3 cache facilitates core-to-core communications, and that cache is physically distributed around the die for on-average faster access. A six-channel DDR4 memory system, running at speeds of up to 2667 MHz, supports a total of 768GB of capacity.
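For context, a quick back-of-the-envelope calculation shows what a six-channel controller at 2667 MT/s works out to in theoretical peak bandwidth. The 64-bit (8-byte) channel width is the standard DDR4 figure and an assumption on our part; Qualcomm has not detailed the controller design:

```python
# Theoretical peak DDR4 bandwidth for a six-channel controller at the
# 2667 MT/s listed in the spec table, assuming standard 64-bit channels.
CHANNELS = 6
TRANSFER_RATE_MT_S = 2667  # mega-transfers per second, per channel
BYTES_PER_TRANSFER = 8     # 64-bit DDR4 channel width (assumption)

peak_gb_s = CHANNELS * TRANSFER_RATE_MT_S * BYTES_PER_TRANSFER / 1000
print(f"Theoretical peak memory bandwidth: {peak_gb_s:.1f} GB/s")  # → 128.0 GB/s
```

That roughly 128 GB/s ceiling would sit comfortably under the ring bus's claimed 250 GB/s of aggregate bandwidth.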
Connectivity is supplied by 32 lanes of PCIe 3.0, supporting up to 6 PCIe devices.
As you should expect, the Centriq 2400 supports the ARM TrustZone secure operating environment as well as hypervisors for virtualized environments. With this many cores on a single chip, virtualization seems likely to be one of the key use cases for the server CPU.
Maybe most impressive are the power requirements of the Centriq 2400: it can offer this level of performance and connectivity with just 120 watts of power.
With a price of $1995 for the Centriq 2460, Qualcomm claims that it can offer “4X better performance per dollar and up to 45% better performance per watt versus Intel’s highest performance Skylake processor, the Intel Xeon Platinum 8180.” That’s no small claim. The 8180 is a 28-core/56-thread CPU with a peak frequency of 3.8 GHz, a TDP of 205 watts, and a cost of $10,000 (not a typo).
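Taken at face value, those two numbers let us back out what Qualcomm is implying about raw performance. This is a sanity check on the marketing math only; the 4X figure is Qualcomm's claim, not an independent benchmark:

```python
# Back out the raw-performance ratio implied by Qualcomm's claim of
# "4X better performance per dollar" at the two list prices.
centriq_price = 1995
xeon_price = 10000
perf_per_dollar_advantage = 4.0

# perf_centriq / centriq_price == 4 * (perf_xeon / xeon_price)
implied_perf_ratio = perf_per_dollar_advantage * centriq_price / xeon_price
print(f"Implied Centriq 2460 raw performance: {implied_perf_ratio:.2f}x the Xeon 8180")
```

In other words, even if the Centriq 2460 delivered only about 80% of the 8180's raw throughput, the price gap alone would support the per-dollar claim.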
Qualcomm showed performance metrics from industry-standard SPECint measurements, both in raw single-thread configurations and in performance per dollar and per watt. I will have more on the performance story of Centriq later this week.
More important than simply showing hardware, Qualcomm had several partners on hand at the press event, as well as statements of support from important vendors like Alibaba, HPE, Google, Microsoft, and Samsung. Present to showcase applications running on the Arm-based server platforms was an impressive list of key cloud services providers and partners: Alibaba, LinkedIn, Cloudflare, American Megatrends Inc., Arm, Cadence Design Systems, Canonical, Chelsio Communications, Excelero, Hewlett Packard Enterprise, Illumina, MariaDB, Mellanox, Microsoft Azure, MongoDB, Netronome, Packet, Red Hat, ScyllaDB, 6WIND, Samsung, Solarflare, Smartcore, SUSE, Uber, and Xilinx.
The Centriq 2400 series of SoCs isn’t perfect for all general-purpose workloads, and that is something we have understood from the outset of this venture by Arm and its partners to bring the architecture to enterprise markets. Qualcomm states that its parts are designed for “highly threaded cloud native applications that are developed as micro-services and deployed for scale-out.” The result is a set of workloads that covers a lot of ground:
- Web front end with HipHop Virtual Machine
- NoSQL databases including MongoDB, Varnish, ScyllaDB
- Cloud orchestration and automation including Kubernetes, Docker, metal-as-a-service
- Data analytics including Apache Spark
- Deep learning inference
- Network function virtualization
- Video and image processing acceleration
- Multi-core electronic design automation
- High throughput compute bioinformatics
- Neural class networks
- OpenStack Platform
- Scaleout Server SAN with NVMe
- Server-based network offload
I will be diving more into the architecture, system designs, and partner announcements later this week as I think the Qualcomm Centriq 2400 family will have a significant impact on the future of the enterprise server markets.
Subject: General Tech | November 8, 2017 - 01:15 PM | Jeremy Hellstrom
Tagged: logitech, iot, harmony link
If you own a Logitech Harmony Link and registered it then you already know, but for those who did not receive the email: your device will become unusable in March. According to the information Ars Technica acquired, Logitech have decided not to renew a so-called "technology certificate license", which means the Link will no longer work. It is not clear what this certificate is, nor why the lack of it will brick the Link, but that is what will happen. Apparently if you have a Harmony Link which is still under warranty you can get a free upgrade to a Harmony Hub; if your Link is out of warranty you can get a 35% discount. Why exactly one would want to purchase another one of these devices, which can be remotely destroyed, is an interesting question, especially as there was no monthly contract or service agreement suggesting this was a possibility when customers originally purchased their device.
"Customers received an e-mail explaining that Logitech will "discontinue service and support" for the Harmony Link as of March 16, 2018, adding that Harmony Link devices "will no longer function after this date.""
Here is some more Tech News from around the web:
- This could be our favorite gadget of 2017: A portable projector @ The Register
- 'How Chrome Broke the Web' @ Slashdot
- Don't worry about those 40 Linux USB security holes. That's not a typo @ The Register
- Flaw Crippling Millions of Crypto Keys Is Worse Than First Disclosed @ Slashdot
- Highly flexible organic flash memory for foldable and disposable electronics @ Phys.org
- KRACK whacked, media playback holes packed, other bugs go splat in Android patch pact @ The Register
- The Biggest Tech Fails of the Last Decade @ TechSpot
Subject: Networking | November 7, 2017 - 10:00 PM | Jim Tanous
Tagged: wi-fi, vpn, ubiquiti, networking, mesh, Amplifi HD, amplifi
Earlier this year we took a look at the AmpliFi HD Home Wi-Fi System as part of our review of mesh wireless network devices. AmpliFi is the consumer-targeted brand of enterprise-focused Ubiquiti Networks, and while we preferred the eero Mesh Wi-Fi System in our initial look, the AmpliFi HD still offered great performance and some unique features. Today, AmpliFi is introducing a new member of its networking family called AmpliFi Teleport, a "plug-and-play" device that provides a secure connection to users' home networks from anywhere.
Essentially a zero-configuration hardware-based VPN, the Teleport is linked with a user's AmpliFi account, which automatically creates a secure connection to the user's AmpliFi HD Wi-Fi System at home. Users take the small (75.85mm x 43mm x 39mm) Teleport device with them on the road, plug it in and connect it to the public Wi-Fi or Ethernet, and then connect their personal devices to the Teleport.
This provides a secure connection for private Internet traffic, but also allows access to local resources on the home network, including NAS devices, file shares, and home automation products. AmpliFi also touts that this would allow users to view their local streaming content even in locations where it would otherwise be unavailable -- e.g., watching U.S. Netflix shows while overseas, or streaming your favorite sports team while in a city where the game is blacked out.
In addition to traveling, AmpliFi notes that those with multiple homes or a vacation cottage could also benefit from Teleport, as it would allow you to share the same network resources and media streaming access regardless of location. In any case, a device like Teleport is still reliant on the speed and quality of your home and remote Internet connections, so there may be cases where network speeds are so low that it makes the device useless. That, of course, is a factor that would plague any network-dependent service or device, so while it's not a mark against the Teleport, it's something to keep in mind.
Teleport's features, while incredibly useful, are of course familiar to those experienced with VPNs and other secure remote connection methods. In terms of overall functionality, the AmpliFi Teleport isn't offering anything new here. The benefit, therefore, is its simple setup and configuration. Users don't need to set up and run a VPN on their home hardware, subscribe to a third-party VPN service, or know anything about encryption protocols, firewall configuration, or network tunneling. They simply need to plug the Teleport into power, follow the connection guide, and that's it -- they're up and running with a secure connection to their home network.
You'll pay for this convenience, however, as the Teleport isn't cheap. It's launching today on Kickstarter with "early bird" pricing of $199, which will get you the Teleport device and the required AmpliFi HD router. A second round of early purchasers will see that price increase to $229, while final pricing is $269. Again, that's just for the Teleport and the router. A kit including two AmpliFi mesh access points is $399. There's no word on standalone pricing for the Teleport device only for those who already have an AmpliFi mesh network at home.
Regardless of the package, once you have the hardware there's no extra cost or subscription fee to use the Teleport, so frequent travelers might find the system worth it when compared to some other subscription-based VPN services.
The AmpliFi Teleport is expected to ship to early purchasers in December. We don't have the hardware in hand yet for performance testing, but AmpliFi has promised to loan us review samples as the product gets closer to shipping. Check out the Teleport Kickstarter page and AmpliFi's website for more information.
Subject: Graphics Cards | November 7, 2017 - 03:21 PM | Jeremy Hellstrom
Tagged: pascal, nvidia, gtx 1070 ti, geforce, msi
NVIDIA chose to limit the release of their GTX 1070 Ti to reference cards, all sporting the same clocks regardless of the model. That does not mean that manufacturers skimped on the features which help you overclock successfully. As a perfect example, the MSI GTX 1070 Ti GAMING TITANIUM was built with Hi-C caps, Super Ferrite chokes, Japanese solid caps, and a 10-phase PWM. This resulted in an impressive overclock of 2050MHz on the GPU and a memory frequency of 9GHz once [H]ard|OCP boosted the power delivered to the card. That boost is enough to meet or even exceed the performance of a stock GTX 1080 or Vega 64 in most of the games they tested.
"NVIDIA is launching the GeForce GTX 1070 Ti today, and we’ve got a custom retail MSI GeForce GTX 1070 Ti GAMING TITANIUM video card to test and overclock, yes overclock, to the max. We’ll make comparisons against GTX 1080/1070, AMD Radeon RX Vega 64 and 56 for a complete review."
Here are some more Graphics Card articles from around the web:
- Nvidia's GeForce GTX 1070 Ti @ The Tech Report
- NVIDIA GeForce GTX 1070 Ti, Takes On The Radeon RX Vega 64 Under Linux @ Phoronix
- MSI GeForce GTX 1070 Ti Titanium 8G @ Guru of 3D
- NVIDIA GeForce GTX 1070 Ti Founders Edition Review @ OCC
- ASUS GTX 1070 Ti STRIX 8 GB @ TechPowerUp
- Colorful iGame GTX 1070 Ti Vulcan X TOP 8 GB @ TechPowerUp
- 34-Way Graphics Card Comparison On Ubuntu 17.10 @ Phoronix
Subject: Graphics Cards | November 7, 2017 - 02:04 PM | Jeremy Hellstrom
Tagged: Star Wars, nvidia, titan xp, disney
Priced at $1200, you can choose to power your gaming rig with either the light side of the Force or the dark side. NVIDIA have announced two new Titan Xp GPUs: one battle-scarred and lightsaber-green, representing the Rebel Alliance, and a pristine black card which glows a familiar red. It would seem that they are a bit behind the times, as neither of those organizations exists in the current Star Wars timeline, but that doesn't make them any less attractive to fans.
The specifications are familiar: a Pascal-based GP102 GPU with 3840 CUDA cores @ 1.6GHz and 12GB of GDDR5X memory running at 11.4Gbps. The look is unique, however, so if you are a big fan of Star Wars then this might just be something you want to consider. The full PR and launch video are just below.
Tatooine, Outer Rim Territory—NVIDIA has announced two new collector’s edition NVIDIA TITAN Xp GPUs created for the ultimate Star Wars fan. The new Jedi Order™ and Galactic Empire™ editions of the NVIDIA TITAN Xp have been crafted to reflect the look and feel of the Star Wars galaxy.
These new Star Wars collector’s edition GPUs pay homage to the light side/dark side dichotomy, and contain hints of the Star Wars galaxy, such as the hilt of Luke Skywalker's lightsaber and light panels reminiscent of the Death Star.
The Jedi Order GPU simulates the wear and tear and battle-worn finish of many items used by the Rebel Alliance, resulting from its diecast aluminum cover being subjected to an extensive, corrosive salt spray.
Conversely, the Galactic Empire GPU’s finish features simple, clean lines, emulating the high-end, orderly nature of the resource-rich Empire.
Both versions have multiple windowed areas to showcase internals and lighting, evoking each faction’s lightsabers, green and red, respectively. The finishes of both versions took over a year to perfect.
The retail box packaging also pays homage to the light and dark sides of the Force, with the Jedi Order edition bathed in white, and the Galactic Empire edition bathed in black.
Exclusive Pre-Order Access for GeForce Experience Users
GeForce Experience users get exclusive pre-order access to purchase the Jedi Order and Galactic Empire TITAN Xp editions before the cards are broadly available in mid-November. Starting tomorrow, GeForce Experience users can purchase one card of each design by using their log-in credentials in the NVIDIA store.
Power! Unlimited Power!
The Jedi Order and Galactic Empire TITAN Xp GPUs use the NVIDIA Pascal-based GP102 GPU, each with 3,840 CUDA cores running at 1.6GHz and 12GB of GDDR5X memory running at 11.4Gbps.
Their staggering 12TFLOPs of processing power under the hood allows Star Wars fans to play any of today’s most cutting-edge titles at the highest resolution with the highest detail quality turned on.
Priced at $1,200, each edition also includes a collectible electroformed metal badge containing the insignia of their preferred alliance.
Subject: General Tech | November 7, 2017 - 01:35 PM | Jeremy Hellstrom
Tagged: machine learning, ai
Not to be outdone by the research conducted by Japan's Kyushu University, which led to the "frog is not truck" portion of last week's podcast, MIT researchers have also been tormenting image recognition software. Their findings were a little more worrisome, as a 3D printed turtle was identified as a rifle, which could lead to some very bad situations in airports or other secure locations. In this case, instead of adding a few pixels to the image, they introduced different angles and lighting conditions which created enough noise to completely fool Google's image recognition AI, Inception. The printed turtle was misidentified because of the texture they chose, showing that this issue extends beyond photos to include physical objects. Pop by The Register for more details as well as an ingredient you never want to see on your toast.
"Students at MIT in the US claim they have developed an algorithm for creating 3D objects and pictures that trick image-recognition systems into severely misidentifying them. Think toy turtles labeled rifles, and baseballs as cups of coffee."
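The underlying trick can be sketched with a much simpler classic attack than the one the MIT students used: the fast gradient sign method on a toy linear classifier. The model, labels, and numbers below are purely illustrative, not from the study:

```python
import random

# Toy adversarial example: for a linear score w.x, the gradient with
# respect to x is just w, so nudging each input element against
# sign(w) reduces the score as fast as possible per unit of
# per-element (L-infinity) perturbation budget.
random.seed(0)
dim = 64

w = [random.gauss(0, 1) for _ in range(dim)]        # toy classifier weights
sign_w = [1.0 if wi > 0 else -1.0 for wi in w]
x = [random.gauss(0, 1) + s for s in sign_w]        # input the model scores confidently positive

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def predict(v):
    return "turtle" if dot(w, v) > 0 else "rifle"   # labels for illustration only

epsilon = 2.0                                       # small per-element budget
x_adv = [xi - epsilon * s for xi, s in zip(x, sign_w)]

print(predict(x), predict(x_adv))  # → turtle rifle
```

The physical-object attack in the article is much harder, since the misclassification has to survive changes in angle and lighting rather than a single fixed image, but the core idea of a small, targeted perturbation is the same.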
Here is some more Tech News from around the web:
- No, Samsung, you really do owe Apple $120m for patent infringement @ The Register
- Almost Everything on Computers Is Perceptually Slower Than It Was in 1983 @ [H]ard|OCP
- Get Watch Dogs FREE From Ubisoft This Week! @ TechARP
- Fat-fingered Level 3 techie reduces internet to level zero: Glitch knocks out connections @ The Register
- Kaspersky warns of increased DDoS attacks against gaming companies @ The Inquirer
- Android security update fixes KRACK, slaps Band-Aid on Pixel 2 XL screen @ Ars Technica
- Seldom used 'i' mangled by baffling autocorrect bug in Apple's iOS 11 @ The Register
- Microsoft releases strict standards for 'highly secure' Windows 10 devices @ The Inquirer
- MINIX: Intel's Hidden In-chip Operating System @ Slashdot
Subject: General Tech | November 6, 2017 - 04:54 PM | Jeremy Hellstrom
Tagged: audio, 1MORE, Quad Drivers, in-ear
These new in-ear headphones from 1MORE have a single carbon driver for mids and lows with three balanced armatures to handle the high and ultra-high frequencies. This is not a surround sound implementation but instead an attempt to provide very high quality sound from in-ear monitors. TechPowerUp tested these Dolby Certified headphones and found them to be an improvement on the already impressive Triple Driver model, even powered by a smartphone; they do not require a DAC or pre-amp to provide great sound. On the other hand, $200 is steep for this style of headphone so read through the review before jumping on Amazon.
"On the wings of the raging success they had with their $100 Triple Driver In-Ear Headphones, currently considered one of the best IEMs in terms of price-performance, 1MORE brings us their even more refined sibling equipped with an additional balanced armature. Do the 1MORE Quad Drivers have what it takes to justify a price bump to $200?"
Here is some more Tech News from around the web:
- Razer Tiamat 2.2 V2 @ Kitguru
- CORSAIR VOID PRO RGB Wireless SE Premium Gaming Headset Review @ NikKTech
- Rosewill Nebula GX10 @ TechPowerUp
- Thinksound TS03+mic HD In-Ear Headphones Review @ NikKTech
- Corsair ST100 RGB Premium Headset stand @ Kitguru
Subject: Storage | November 6, 2017 - 03:22 PM | Jeremy Hellstrom
Tagged: crucial, Momentum Cache, NVMe, Crucial Storage Executive
The SSD Review noticed something very interesting in the latest update to Crucial's Storage Executive software: the Momentum Cache feature now works with a variety of non-Crucial NVMe SSDs. The software allows your system to turn part of your RAM into a cache so that reads and writes can initially be sent to that cache, which results in improved performance thanks to RAM's significantly quicker response time. If you have a Crucial SSD installed as well as another NVMe SSD, and are using the default Windows NVMe driver, you can set up caching on the non-Crucial SSD if you so desire. Stop by for a look at the performance impact as well as a list of the drives which have been successfully tested.
"Crucial’s Momentum Cache feature, part of Crucial Storage Executive, is unlocked for all NVMe SSDs, or at least the ones we have tested in our Z170 test system; the key here, of course, is that a compatible Crucial SSD must initially be on the system to enable this feature at all."
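Conceptually, Momentum Cache is a RAM-backed block cache sitting in front of the SSD. Here is a minimal sketch of the general write-back technique; Crucial's actual implementation is closed-source, so everything below is an illustration, not their code:

```python
from collections import OrderedDict

# A tiny RAM-backed write-back cache in front of a slow block device.
# Writes land in RAM immediately; dirty blocks reach the backing store
# on eviction or an explicit flush. Recently used blocks serve from RAM.
class RamCache:
    def __init__(self, backing: dict, capacity: int):
        self.backing = backing      # stands in for the SSD
        self.capacity = capacity    # max number of cached blocks
        self.cache = OrderedDict()  # block -> (data, dirty), LRU order

    def write(self, block, data):
        self.cache[block] = (data, True)   # absorb the write in RAM
        self.cache.move_to_end(block)
        self._evict()

    def read(self, block):
        if block in self.cache:
            self.cache.move_to_end(block)  # cache hit, no SSD access
            return self.cache[block][0]
        data = self.backing[block]         # miss: fetch from the SSD
        self.cache[block] = (data, False)
        self._evict()
        return data

    def _evict(self):
        while len(self.cache) > self.capacity:
            block, (data, dirty) = self.cache.popitem(last=False)
            if dirty:
                self.backing[block] = data  # flush dirty block on eviction

    def flush(self):
        for block, (data, dirty) in self.cache.items():
            if dirty:
                self.backing[block] = data
        self.cache = OrderedDict((b, (d, False)) for b, (d, _) in self.cache.items())

ssd = {}
cache = RamCache(ssd, capacity=2)
cache.write(0, b"boot")
cache.write(1, b"apps")
print(cache.read(1))   # served from RAM: b'apps'
cache.flush()
print(ssd[0])          # dirty block persisted to the backing store: b'boot'
```

The flip side of absorbing writes in RAM is the usual write-back trade-off: dirty blocks that have not yet been flushed are lost if power fails, which is worth keeping in mind before enabling this on a drive holding anything important.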
Here are some more Storage reviews from around the web:
- Patriot Hellfire 240GB @ Benchmark Reviews
- Team Group CARDEA Zero 240GB M2 SSD @ Guru of 3D
- HP S700 SSD Review @ OCC
- Western Digital (WD) My Cloud Home 6TB @ Kitguru
- Seagate IronWolf Pro 12TB SATA III HDD Review @ NikKTech
- Seagate BarraCuda Pro 12TB HDD @ Kitguru
Subject: Processors | November 6, 2017 - 02:00 PM | Josh Walrath
Tagged: radeon, Polaris, mobile, kaby lake, interposer, Intel, HBM2, gaming, EMIB, apple, amd, 8th generation core
In what is probably considered one of the worst kept secrets in the industry, Intel has announced a new CPU line for the mobile market that integrates AMD’s Radeon graphics. For the past year or so rumors of such a partnership were freely flowing, but now we finally get confirmation as to how this will be implemented and marketed.
Intel’s record on designing GPUs has been rather pedestrian. While they have kept up with the competition, a slew of small issues and incompatibilities have plagued each generation. Performance is also an issue when trying to compete with AMD’s APUs as well as discrete mobile graphics offerings from both AMD and NVIDIA. Software and driver support is another area where Intel has been unable to compete due largely to economics and the competitions’ decades of experience in this area.
Many significant issues have been solved in one fell swoop. Intel has partnered with AMD’s Semi-Custom group to develop a modern and competent GPU that can be closely connected to the Intel CPU, all the while utilizing HBM2 memory to improve overall performance. The packaging of this product utilizes Intel’s EMIB (Embedded Multi-die Interconnect Bridge) tech.
EMIB is an interposer-like technology that integrates silicon bridges into the PCB instead of relying upon a large interposer. This allows a bit more flexibility in layout of the chips as well as lowers the Z height of the package as there is not a large interposer sitting between the chips and the PCB. Just as interposer technology allows the use of chips from different process technologies to work seamlessly together, EMIB provides that same flexibility.
The GPU looks to be based on the Polaris architecture, which is a slight step back from AMD’s cutting-edge Vega architecture. Polaris does not implement the Infinity Fabric component that Vega does; it is more conventional in terms of data communication. It is, however, a step beyond what AMD has provided for Sony and Microsoft, who each utilize a semi-custom design for their latest console chips, as AMD is able to integrate the HBM2 controller that is featured in Vega. Using HBM2 provides a tremendous amount of bandwidth along with power savings as compared to traditional GDDR5 memory modules. It also saves dramatically on PCB space, allowing for smaller form factors.
EMIB provides nearly all of the advantages of the interposer while keeping the optimal z-height of the standard PCB substrate.
Intel did have to do quite a bit of extra work on the power side of the equation. AMD utilizes their latest Infinity Fabric for fine-grained power control in their upcoming Raven Ridge based Ryzen APUs; Intel had to modify their current hardware to do much the same work with 3rd party silicon. This is no easy task, as the CPU needs to monitor and continually adjust for GPU usage in a variety of scenarios. This type of work takes time and a lot of testing to fine tune, as well as the inevitable hardware revisions to get things working correctly. This then needs to be balanced against the GPU driver stack, which also tends to take control of power usage in mobile scenarios.
This combination of EMIB, an Intel Kaby Lake CPU, HBM2, and a current AMD GPU makes for a very interesting product for the mobile and small form factor markets. The EMIB packaging provides very fast interconnect speeds and a smaller footprint due to the integration of HBM2 memory. The mature AMD Radeon software stack for both Windows and macOS environments provides Intel with another feature with which to sell their parts in areas where they were previously not considered. The 8th Gen Kaby Lake CPU provides the very latest CPU design on the new 14nm++ process for greater performance and better power efficiency.
This is one of those rare instances where such cooperation between intense rivals actually improves the situation for both. AMD gets a financial shot in the arm by signing a large and important customer for their Semi-Custom division. The royalty income from this partnership should be more consistent than that from the console manufacturers, given the seasonality of console products. This will have a very material effect on AMD’s bottom line for years to come. Intel gets a solid silicon solution with higher graphics performance than they can offer themselves, as well as the aforementioned mature software stack for multiple OSes. Finally, throw in the HBM2 memory support for better power efficiency and a smaller form factor, and it is a clear win for all parties involved.
The PCB savings plus faster interconnects will allow these chips to power smaller form factors with better performance and battery life.
One of the unknowns here is what process node the GPU portion will be manufactured on. We do not know which foundry Intel will use, or if they will stay in-house. Currently TSMC manufactures the latest console SoCs while GLOBALFOUNDRIES handles the latest GPUs from AMD. Initially one would expect Intel to build the GPU in-house, but the current rumor is that AMD will work to produce the chips with one of their traditional foundry partners. Once the chip is manufactured it is then sent to Intel to be integrated into their product.
Apple is one of the obvious candidates for this particular form factor and combination of parts. Apple has a long history with Intel on the CPU side and AMD on the GPU side. This product provides all of the solutions Apple needs to manufacture high performance products in smaller form factors. Gaming laptops also get a boost from such a combination that will offer relatively high performance with minimal power increases as well as the smaller form factor.
The potential (leaked) performance of the 8th Gen Intel CPU with Radeon Graphics.
The data above could very well be wrong about the potential performance of this combination. What we see is pretty compelling though. The Intel/AMD product performs like a higher end CPU with discrete GPU combo. It is faster than an NVIDIA GTX 1050 Ti and trails the GTX 1060. It is also significantly faster than a desktop AMD RX 560 part. We can also see that it is going to be much faster than the flagship 15 watt TDP AMD Ryzen 7 2700U. We do not yet know how it compares to the rumored 65 watt TDP Raven Ridge based APUs from AMD that will likely be released next year. What will be fascinating here is how much power the new Intel combination will draw as compared to the discrete solutions utilizing NVIDIA graphics.
To reiterate, this is Intel as a customer for AMD’s Semi-Custom group rather than a licensing agreement between the two companies. They are working hand in hand in developing this solution and then both profiting from it. AMD getting royalties from every Intel package sold that features this technology will have a very positive effect on earnings. Intel gets a cutting edge and competent graphics solution along with the improved software and driver support such a package includes.
Update: We have been informed that AMD is producing the chips and selling them directly to Intel for integration into these new SKUs. There are no royalties or licensing, but the Semi-Custom division should still receive the revenue for these specialized products made only for Intel.
Subject: General Tech | November 6, 2017 - 01:03 PM | Jeremy Hellstrom
Tagged: broadcom, qualcomm, billions
While the gang does some sleuthing about the current signs of the end of the world (not simply cats and dogs living together, but rumours of AMD and Intel working together), let's look at a different surprise. It seems that Broadcom has set its sights on Qualcomm, offering $130 billion to buy out the company and its assets. In part this might be inspired by Qualcomm's pending release of the Centriq family of processors, seeing as Broadcom cancelled their ARM-based server chip development earlier this year. It seems as though Qualcomm is not looking at this as a way to pay its ever-expanding legal bills in its cases against Apple; according to the story Slashdot has linked to, Qualcomm considers this an offer it can refuse.
Keep an eye out for an update as Josh and Ryan check on the mixing of Intel's Embedded Multi-die Interconnect Bridges and AMD's Polaris.
"Chipmaker Broadcom officially unveiled a $130bn offer, including net debt, for Qualcomm on Monday, in what could be the largest tech deal in history. Under Broadcom's proposal, Qualcomm shareholders would receive $70 per share -- $60 in cash and $10 in shares of its rival. It would value Qualcomm's equity at roughly $103bn."
Here is some more Tech News from around the web:
- Intel and AMD partner on new 8th-gen CPUs to challenge Nvidia @ The Inquirer
- Essen 2017: Best board games from the biggest board game convention @ Ars Technica
- Biggest Tor overhaul in a decade adds layers of security improvements @ The Register
- 'Qualcomm, we will buy you... for... one HUNDRED... BILLION DOLLARS' – Broadcom @ The Register
- Take off, ya hosers! Silicon Valley court says Google can safely ignore Canadian search ban @ The Register
Subject: General Tech | November 5, 2017 - 08:14 PM | Scott Michaud
Tagged: Unity, zspace, xr, AR, VR
The Unity Educator Toolkit was created by Unity3D to integrate learning game development into the K-12 public curriculum. Now zSpace, which we’ve mentioned a few times, is joining the initiative with their mixed-reality platform. The company is known for creating displays that, when viewed with their glasses, track where you are and make the object appear to be in front of you. They also have a stylus that lets you interact with the virtual object.
They are focused on the educational side of VR and AR.
It’s not entirely clear what this means, because a lot of the details are behind a sign-up process. That said, if you’re an educator, then check out the package to see if it’s relevant for you. Creating games is an interesting, albeit challenging and somewhat daunting, method of expressing oneself. Giving kids the tools to make little game jam-style expressions, or even using the technology in your actual lessons, will reach a new group of students.
Subject: Mobile | November 5, 2017 - 07:49 PM | Scott Michaud
Tagged: google, Pixel 2 XL
The Pixel 2 XL launch hasn’t been going so well for Google. Early complaints were about the screen: how it had alleged burn-in problems within the first few days, and how it couldn’t support the sRGB color space. Since then, we’ve been hearing reports that some phones even shipped without the OS installed. Whoops!
Now here’s a specific complaint: people are saying that the phone charges slowly. This is an easy one to test – run a multimeter in-line with the USB cable and see what happens. Google+ user Nathan K. apparently did, and he found that the Pixel 2 XL maxed out at 10.5W. When the screen is on, this drops to a maximum of 6W, which he claims (and I would have guessed) is likely due to the combined heat of a phone that’s both in-use and charging. Lithium batteries are very sensitive to heat.
He also says that this issue isn’t really a problem in-and-of itself. He just wishes that manufacturers advertised more about how the battery should perform, and maybe even provide the switches for users to override if needed. I could see that being a warranty nightmare, but I’m rarely going to fall on the side against user choice as a general rule, so I think that would be nice.
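The in-line measurement above comes down to simple arithmetic: charging power is measured voltage times measured current. A minimal sketch follows; the voltage/current pairs are illustrative guesses (e.g. a USB-PD 9 V profile), not Nathan K.'s actual readings.

```python
def charge_watts(volts: float, amps: float) -> float:
    """Charging power in watts from an in-line voltage/current measurement."""
    return volts * amps

# Hypothetical readings that would match the reported caps:
print(f"{charge_watts(9.0, 1.17):.1f} W")  # 10.5 W (screen off)
print(f"{charge_watts(5.0, 1.20):.1f} W")  # 6.0 W (screen on)
```

Any cheap USB power meter does exactly this multiplication for you, which is why the 10.5W ceiling was so easy to verify.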
Subject: General Tech | November 5, 2017 - 07:13 PM | Scott Michaud
Tagged: blizzard, starcraft 2, pc gaming
Over the last few years, Blizzard has been progressively opening up StarCraft II for non-paying customers. These initiatives include allowing whole parties to share the highest expansion level of any one member, unlocking Terran for unranked games, opening up mods to the Starter edition, and so forth.
Starting on November 14th, after a handful of months of the original StarCraft going free-to-play, Blizzard will allow free access to multiplayer (including the ranked ladder), a handful of co-op commanders, and the Wings of Liberty campaign. If you already own Wings of Liberty, then you will get Heart of the Swarm for free (if you claim it between November 8th and December 8th).
If you already own both, then… well, life as usual for you.
In terms of making money, Blizzard is hoping to sell the two or three remaining campaigns (Heart of the Swarm, Legacy of the Void, and Nova Covert Ops) as well as other up-sells, like announcers, co-op commanders, and so forth. If you’re in it for the vanilla (or Arcade) multiplayer, though, then you can jump in on November 14th without paying a dime.
Subject: General Tech | November 5, 2017 - 05:01 PM | Scott Michaud
About a week ago, the Blender Foundation held their annual Blender Conference. The event was sponsored by AMD, same as last year, who is putting quite a bit of time and money into the free, open-source 3D suite. They are especially focused on OpenCL and Cycles development, which benefits their Radeon GPUs and high-end workstation CPUs.
In fact, AMD, along with Tangent Animation, Nimble Collective, and Aleph Objects, have paid for engineers to work on the project.
A lot of the talk was about Blender 2.8, as it is both upcoming and a significant change. Ton Roosendaal talked a lot about the new scene graph, how objects can be grouped as collections, and how an infinite number of layers is possible. It’s a significant, back-end change that’s been discussed in the past.
There’s still no firm release schedule for Blender 2.8, but it’s coming along. You can download one of the pre-release builds on their website, but don’t expect it to be stable. I found my first crash bug in about 5 minutes.
Subject: General Tech | November 3, 2017 - 12:58 PM | Jeremy Hellstrom
Tagged: microsoft, console, gaming, xbox one x, model numbers gone wild
AMD will once again benefit from the launch of a new console: the Xbox One X is powered by eight Jaguar cores running at 2.3 GHz and 40 custom AMD CUs clocked at 1172 MHz, which together provide six teraflops of processing power. Ars Technica took the new console for a spin and were quite impressed, in theory. The X1X does offer proper 4K HDR video output, assuming you have the TV for it, however most of the available games do not offer both, so you might be somewhat disappointed with a title such as Halo 3. On the other hand, all games do look better on the X1X and perform quite well. Drop by for a large number of screenshots comparing the Xbone to the X1X and details on which games benefit the most from the new device.
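That six-teraflop figure falls straight out of the CU count and clock speed. A quick sketch of the standard GCN throughput arithmetic (64 shader ALUs per CU, with a fused multiply-add counting as 2 FLOPs per clock):

```python
# Theoretical FP32 throughput of the Xbox One X GPU.
CUS = 40             # custom AMD compute units
SHADERS_PER_CU = 64  # GCN stream processors per CU
FLOPS_PER_CLOCK = 2  # fused multiply-add = 2 floating-point ops
CLOCK_GHZ = 1.172    # 1172 MHz

tflops = CUS * SHADERS_PER_CU * FLOPS_PER_CLOCK * CLOCK_GHZ / 1000
print(f"{tflops:.2f} TFLOPS")  # 6.00
```

The same formula applied to the original Xbox One (12 CUs at 853 MHz) gives about 1.3 TFLOPS, which is why Microsoft can claim a roughly 4.5x jump in raw GPU compute.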
"When it comes to hard numbers, the Xbox One X definitely merits Microsoft’s marketing hype as “the most powerful console ever.” Microsoft has pulled out the stops in squeezing stronger components into the same basic architecture of the four-year-old Xbox One."
Here is some more Tech News from around the web:
- Scientists Prove Emoticons Are Not Universally Understood @ Slashdot
- Some Pixel 2 XLs escaped the factory without an operating system installed @ The Inquirer
- SCO vs. IBM case over who owns Linux comes back to life. Again @ The Register
- Qualcomm sues Apple for allegedly blabbing smartphone chip secrets in emails CC'd to Intel @ The Register
- Microsoft to close accessibility loophole to free Windows 10 Upgrades in December @ The Inquirer
- Hackers abusing digital certs smuggle malware past security scanners @ The Register
- Microsoft reveals network simulator that keeps Azure alive @ The Register
Subject: Editorial | November 3, 2017 - 09:00 AM | Jim Tanous
Tagged: video, Ryan Shrout, pcper mailbag, pcper
It's Friday, which means it's time for PC Perspective's weekly mailbag, our video show where Ryan and team answer your questions about the tech industry, the latest and greatest hardware, the process of running a tech review website, and more!
Here's what you'll find on today's show:
00:37 - Reviewers biased on i5-8400 reviews?
02:35 - Vega 56 performance close to Vega 64?
04:43 - HDR limited by monitor panel tech?
07:27 - Why do NVIDIA and AMD use blower style coolers?
09:54 - Backup software recommendation?
12:26 - Where are the 8-core Ryzen laptops?
13:51 - Will NVIDIA ever support FreeSync?
16:24 - Upgrade Xbox One X with SSD?
17:43 - i7-8700 vs. i7-8700K?
19:18 - Ryzen 5 1600 for gaming and Plex Server?
Be sure to subscribe to our YouTube Channel to make sure you never miss our weekly reviews and podcasts, and please consider supporting PC Perspective via Patreon to help us keep videos like our weekly mailbag coming!
Subject: Cases and Cooling | November 2, 2017 - 03:55 PM | Jeremy Hellstrom
Tagged: enermax, MaxTytan, 800w, modular psu, SLEEMAX, 80 Plus Titanium
Enermax have launched a new family of PSUs called MaxTytan, with the 800W model appearing for review on [H]ard|OCP. These PSUs feature SLEEMAX cabling: each wire is individually covered in fabric, which looks nice but adds bulk to the wires. The cables, plus the 80 PLUS Titanium rating, add to the price; the MSRP is $200. That hurt the rating [H] provided, as the power quality they saw in testing was good but not great, and the unit is somewhat more expensive than the competition. Drop by for the review, as the PSU provides decent power and a nice look for cases that expose their components.
"Enermax pulls out a flagship with its MaxTytan PSU, this one rated at 800 watts. The MaxTytan PSU has some interesting features like its on-demand Dust Free Rotation fan system. It also comes with very "custom" looking SLEEMAX cable covers that wrap every single cable individually like you find in custom rigs. And huge Titanium efficiency!"
Here are some more Cases & Cooling reviews from around the web:
- BitFenix Formula Gold 750 W @ TechPowerUp
- Seasonic FOCUS PLUS Gold & Platinum 750W @ [H]ard|OCP
- Bitfenix Formula Gold 750W @ Kitguru
- Enermax Platimax DF 1200W @ [H]ard|OCP
Subject: Graphics Cards | November 2, 2017 - 03:03 PM | Jeremy Hellstrom
Tagged: pascal, nvidia, gtx 1070 ti, geforce
It should come as no surprise to anyone how the GTX 1070 Ti performs: better than a GTX 1070 but not quite as fast as a GTX 1080 ... unless you overclock. With the push of two buttons Ryan was able to hit 1987 MHz, which surpasses your average GTX 1080 by a fair margin. Hardware Canucks saw 2088 MHz when they overclocked, as well as a memory speed of 8.9 Gbps, which pushed performance past the reference GTX 1080 in many games. Their benchmark suite encompasses a few different games, so you should check to see if your favourites are there.
The real hope for this launch was that prices would change, not so much the actual prices you pay but the MSRPs of cards from both AMD and NVIDIA. For now that has not happened but perhaps soon it will, though Bitcoin hitting $7000 does not help.
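For context on what that 8.9 Gbps memory overclock buys you: peak bandwidth is the per-pin data rate times the bus width. A quick sketch, assuming the GTX 1070 Ti's stock 256-bit GDDR5 bus:

```python
# Peak memory bandwidth from the overclocked per-pin data rate.
GBPS_PER_PIN = 8.9    # effective GDDR5 data rate after overclock
BUS_WIDTH_BITS = 256  # GTX 1070 Ti memory bus width

bandwidth_gbs = GBPS_PER_PIN * BUS_WIDTH_BITS / 8  # bits -> bytes
print(f"{bandwidth_gbs:.1f} GB/s")  # 284.8
```

That works out to 284.8 GB/s, up from the stock 8 Gbps / 256 GB/s, and it helps explain why the overclocked card can overtake a reference GTX 1080 in bandwidth-hungry titles.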
"NVIDIA’s launch of their new GTX 1070 Ti is both senseless and completely sensible depending on which way you tend to look at things. The emotional among you are going to wonder why NVIDIA is even bothering to introduce a new product into a lineup that’s more than a year old."
Here are some more Graphics Card articles from around the web:
- GeForce GTX 1070 Ti Founder Edition @ Guru of 3D
- GeForce GTX 1070 Ti 2-way FCAT SLI @ Guru of 3D
- MSI GeForce GTX 1070 Ti Gaming @ Guru of 3D
- Palit GeForce GTX 1070 Ti Super Jetstream @ Guru of 3D
- Nvidia GTX 1070 Ti review: A fine graphics card—but price remains high @ Ars Technica
- GTX 1070 Ti Review- 35 Games benchmarked @ BabelTechReviews
- MSI GTX 1070 Ti Gaming 8 GB @ TechPowerUp
- NVIDIA GeForce GTX 1070 Ti Founders Edition 8 GB @ TechPowerUp
- A Quick Look At NVIDIA’s GeForce GTX 1070 Ti @ Techgage
- MSI GTX 1070 Ti Gaming 8G @ Kitguru
- Palit GTX 1070 Ti Super JetStream 8 GB @ TechPowerUp
- Palit GTX 1070 Ti Super JetStream @ Kitguru
- The NVIDIA GeForce GTX 1080 Ti Founders Edition @ TechARP
- MSI GeForce GTX 1080 Ti GAMING X TRIO @ [H]ard|OCP
- Sapphire RX VEGA 64 Limited Edition @ Modders-Inc
- The AMD Radeon RX Vega 64 @ TechARP
Subject: General Tech | November 2, 2017 - 12:11 PM | Alex Lustenberg
Tagged: Volta, video, podcast, PCI-e 4, nvidia, msi, Microsoft Andromeda, Memristors, Mali-D71, Intel Optane, gtx 1070 ti, cord cutting, arm, aegis 3, 8th generation core
PC Perspective Podcast #474 - 11/02/17
Join us for discussion on Optane 900P, Cord Cutting, 1070 Ti, and more!
The URL for the podcast is: http://pcper.com/podcast - Share with your friends!
- iTunes - Subscribe to the podcast directly through the iTunes Store (audio only)
- Google Play - Subscribe to our audio podcast directly through Google Play!
- RSS - Subscribe through your regular RSS reader (audio only)
- MP3 - Direct download link to the MP3 file
Hosts: Ryan Shrout, Josh Walrath, Jeremy Hellstrom, Allyn Malventano
Peanut Gallery: Ken Addison, Alex Lustenberg
Program length: 1:32:19
0:03:45 PCPer Mailbag #15 - 10/26/2017
Week in Review:
News items of interest:
Hardware/Software Picks of the Week