Subject: Motherboards | December 14, 2018 - 07:04 PM | Tim Verry
Tagged: Z390, water cooling, gigabyte, aorus
Gigabyte’s Aorus brand is planning to make its flagship Z390 Aorus Xtreme motherboard even more extreme by pairing it with an RGB LED-lit perspex monoblock that covers the processor and PCH areas. The aptly named Z390 Aorus Xtreme Waterforce is an E-ATX form factor motherboard for 8th and 9th Generation Intel Core processors (LGA 1151) that comes packed with overclocking features and a plethora of I/O and expansion options.
The Z390 Aorus Xtreme Waterforce is powered by two 8-pin CPU power connectors, a right-angle 24-pin ATX connector, and a 6-pin PCI-E input for extra PCI-E slot power delivery. The LGA 1151 socket sits front and center up top with four DDR4 memory slots off to the right of it. Power delivery is handled by a 16-phase PowIRStage digital VRM. Below the CPU sit three PCI-E x16 slots (which run at x16/x8/x4), two PCI-E x1 slots, and three M.2 slots with heat spreaders. There is also a mini PCI-E slot, but it comes with an Intel wireless card pre-installed. Additional expansion options include six SATA 3 ports. The Gigabyte motherboard includes 10 sensor points and eight hybrid fan headers. It also supports the Smart Fan 5 and RGB Fan Commander software tools as well as external RGB LED strips and addressable LEDs via headers (RGB Fusion). The Aorus board also supports OC Touch, which offers physical buttons and switches for adjusting overclocks without needing to go into the UEFI BIOS.
Audio duties are handled by the Realtek ALC1220-VB codec along with an ESS Sabre ES9018K2M DAC, an LME49720 with dedicated analog power delivery, a TI OPA1622 op amp, and several WIMA and Nichicon audio capacitors. The reported 127 dB SNR audio also supports “Amp Up,” which can automatically adjust to various headphones. Networking on the Z390 Xtreme Waterforce is handled by Intel (CNVi) for 802.11ac Wave 2 2x2 wireless and Gigabit Ethernet and by Aquantia for the 10 GbE.
Rear I/O on the flagship motherboard shouldn’t disappoint with:
- 2 x SMA antenna connectors (Wi-Fi)
- 2 x Thunderbolt 3 40 Gbps
- 4 x USB 3.1 Gen 2 (10 Gbps)
- 2 x USB 3.1 Gen 1 (5 Gbps)
- 2 x USB 2.0
- 1 x 10 Gigabit Ethernet
- 1 x 1 Gigabit Ethernet
- 1 x HDMI
- 5 x Analog audio
- 1 x S/PDIF audio
The Aorus All In One Monoblock is the star of the show though, as it is what differentiates this board from the normal Z390 Aorus Xtreme. The monoblock uses standard G1/4” threads and dense copper fins to help transfer heat to the water loop. Water flows over the CPU, VRM, and PCH areas to keep everything nice and cool even when overclocking. According to Gigabyte, a leak detection circuit will shut down the PC if a leak in the waterblock/loop is detected, to protect your components. The downside to the monoblock is, of course, the added complexity it brings to the build process; it certainly looks nice though, so some enthusiasts may well find it worth it.
Gigabyte has not yet released pricing or availability information, but it’s going to come at a premium price. The Z390 Aorus Xtreme (sans waterblock) has an MSRP of $549.99, for example, and the addition of the Aorus RGB monoblock could add another $50 to $100 to that price.
The reviews on this board and the monoblock should be interesting. While it may be expensive, I'm sure that some watercooling enthusiasts will find uses for it in all-out "cool all the things" builds!
Subject: General Tech | December 13, 2018 - 03:43 PM | Jeremy Hellstrom
Tagged: RGB, mechanical keyboard, input, vulcan 120 aimo, roccat
Roccat's Vulcan 120 Aimo uses low profile Titan mechanical switches, which have a travel distance of 3.6mm and an actuation distance of 1.8mm, compared to a similar Cherry MX switch with 4mm and 2mm respectively. The Tech Report also found the spacing to be rather tight, in part due to the skirted design, so this might be one you want to test drive before purchasing. The included Swarm software lets you program keys in a variety of ways, including associating noises with certain key presses, while the Aimo RGB option offers some interesting lighting behavior that you might have strong feelings about, one way or the other.
"Roccat's Vulcan 120 Aimo keyboard cuts a striking profile with its skirtless key caps and in-house Titan switches. We put the Vulcan 120 Aimo to the test to see whether a new spin on mechanical key switches is enough to help it stand out in a crowded market."
Here is some more Tech News from around the web:
- Nixeus REVEL Fit @ TechPowerUp
- Logitech G Pro Wireless Mouse @ Kitguru
- Dark Project ME3 @ TechPowerUp
Subject: General Tech | December 13, 2018 - 01:15 PM | Jeremy Hellstrom
Tagged: nvidia, machine learning, jetson, AGX Xavier
NVIDIA claims their newly announced Jetson AGX Xavier SoC can provide up to 32 trillion operations per second for specific tasks, requiring a mere 10W to do so. The chips are designed for image processing and recognition along with all those other 'puter learnin' things you would expect, and chances are a device will have several of these chips working in tandem, which offers a lot of processing power. It is already being used for real time monitoring of DNA sequencing and will be installed in car manufacturing lines in Japan.
The Inquirer points out that this performance comes at a cost, currently $1100 per unit as long as you are buying 1000 of them or more.
"Essentially a data wrangling server plonked onto a silicon package, Jetson AGX Xavier is designed to handle all the tech and processing that autonomous things need to go about their robot lives, such as image processing and computer vision and the inference of deep learning algorithms."
Here is some more Tech News from around the web:
- RISC-V Will Stop Hackers Dead From Getting Into Your Computer @ Hackaday
- Small American town rejects Comcast – while ISP reps take issue with your El Reg vultures @ The Register
- In a Test, 3D Model of a Head Was Able To Fool Facial Recognition System of Several Popular Android Smartphones @ Slashdot
- Russia's cutting edge robot turns out to be bloke in costume @ The Inquirer
- Asustek president Jerry Shen to leave in major business revamp @ DigiTimes
- Julius Lilienfeld and the First Transistor @ Hackaday
- Ticketmaster tells customer it's not at fault for site's Magecart malware pwnage @ The Register
- The Worst CPU & GPU Purchases of 2018 @ Techspot
- Hands-on: Switch’s NES controllers offer unmatched old-school authenticity @ Ars Technica
- Standing Desk Starter Guide: Some Dos and Don'ts @ Techspot
Subject: Graphics Cards | December 13, 2018 - 09:01 AM | Jim Tanous
Tagged: Radeon Software Adrenalin Edition, radeon software, radeon, gpu, drivers, amd, Adrenalin Edition
AMD today released the latest major update to its Radeon software and driver suite. Building on the groundwork laid last year, AMD Radeon Software Adrenalin 2019 Edition brings a number of new features and performance improvements.
With this year’s software update, AMD continues to make significant gains in game performance compared to last year’s driver release, with an average gain of up to 15 percent across a range of popular titles. Examples include Assassin’s Creed Odyssey (11%), Battlefield V (39%), and Shadow of the Tomb Raider (15%).
Beyond performance, Adrenalin 2019 Edition introduces a number of new and improved features. Highlights include:
Game Streaming: Radeon gamers can now stream any game or application from their PCs to their mobile devices via the AMD Link app at up to 4K 60fps. The feature supports both on-screen controls as well as Bluetooth controllers. ReLive streaming is also expanding to VR, with users able to stream games and videos from their PCs to standalone VR headsets via new AMD VR store apps. This includes Steam VR titles, allowing users to play high-quality PC-based VR games on select standalone headsets. AMD claims that its streaming technology offers “up to 44% faster responsiveness” than other game streaming solutions.
ReLive Streaming and Sharing: Gamers more interested in streaming their games to other people will find several new features in AMD’s ReLive feature, including adjustable picture-in-picture instant replays from 5 to 30 seconds, automatic GIF creation, and a new scene editor with more stream overlay options and hotkey-based scene transition control.
Radeon Game Advisor: A new overlay available in-game that helps users designate their target experience (performance vs. quality) and then recommends game-specific settings to achieve that target. Since the tool is running live alongside the game, it can respond to changes as they occur and dynamically recommend updated settings and options.
Radeon Settings Advisor: A new tool in the Radeon Software interface that scans system configuration and settings and recommends changes (e.g., enabling or disabling Radeon Chill, changing the display refresh rate, enabling HDR) to achieve an optimal gaming experience.
WattMan One-Click Tuning Improvements: Radeon WattMan now supports automatic tuning of memory overclocking, GPU undervolting, expanded fan control options, and unlocked DPM states for RX Vega series cards.
Display Improvements: FreeSync 2 can now tone-map HDR content to look better on displays that don’t support the full color and contrast of the HDR spec, and AMD’s Virtual Super Resolution feature is now supported on ultra-wide displays.
Radeon Overlay: AMD’s Overlay feature, which allows gamers to access certain Radeon features without leaving their game, has been updated to display system performance metrics, WattMan configuration options, Radeon Enhanced Sync controls, and the aforementioned Game Advisor.
AMD Link: AMD’s mobile companion app now offers easier setup via QR code scanning, voice control of various Radeon and ReLive settings (e.g., start/stop streaming, save replay, take screenshot), WattMan controls, enhanced performance metrics, and the ability to initiate a Radeon Software update.
Radeon Software Adrenalin 2019 Edition is available now from AMD’s support website for all supported AMD GPUs.
Subject: General Tech | December 13, 2018 - 05:31 AM | Jim Tanous
Tagged: Zen 2, Sunny Cove, snapdragon, ryzen 3, ray tracing, radeon pro, podcast, Optane, Intel, edge, chromium, amd, 3dmark
PC Perspective Podcast #525 - 12/12/2018
Our podcast this week features discussion of the new Intel Sunny Cove architecture, Ryzen 3 rumors, the high-end Snapdragon 8cx, an affordable Radeon Pro GPU, and more!
Subscribe to the PC Perspective Podcast
Check out previous podcast episodes: http://pcper.com/podcast
00:03:21 - AMD Radeon Pro WX8200 Review
00:14:50 - Intel Architecture Day: Sunny Cove, Gen11 iGPU, Foveros
00:27:16 - Ryzen 3 Rumors
00:38:57 - Using a 4K TV as a Monitor
00:43:21 - Snapdragon 8cx
00:57:29 - Microsoft Edge Switching to Chromium
01:03:38 - MSI GTX 1060 with GDDR5X
01:05:40 - 3DMark Port Royal Ray Tracing Benchmark
01:09:03 - Hunting Speculative Execution Vulnerabilities
01:11:38 - 7nm Vega Logo
01:13:49 - Intel Optane DIMM Latency
01:30:45 - The Outer Worlds
Subject: General Tech, Systems, Mobile | December 13, 2018 - 01:02 AM | Tim Verry
Intel is planning a new, larger (but still small) form factor NUC system dubbed Ghost Canyon X, slated for an early 2020 release, according to a report by FanlessTech. Ghost Canyon X will feature a larger 5 liter form factor that will be able to accommodate a discrete graphics card along with both M.2 and SATA 3 storage.
The Ghost Canyon X NUC will be powered by 9th Generation Coffee Lake H Refresh processors that will come in i5 and i7 flavors. The chips have a 45W TDP and will come in quad core i5-9XXXH, six core i7, or eight core i7-9XXXH configurations (with HyperThreading) and will be paired with two DDR4 DIMMs (up to 64GB DDR4 2400 MHz or 32GB DDR4 2666 MHz). Ghost Canyon X NUCs will have three HDMI 2.0 video outputs, two Thunderbolt 3 ports, and an SD card slot for external I/O (likely along with USB 3.1 and audio outputs, though those are not pictured). Internal storage includes up to three M.2 drives (two M.2 2242 80/110 and one 80mm) using PCI-E 3.0 x4 links and SATA 3 for standard hard drives and SATA SSDs. The biggest change with the NUC platform is the inclusion of a single PCI-E x16 slot which can be used to add a discrete graphics card to the system. While 5 liters is quite a jump up from the 0.7L standard NUCs and the 1.2L of the Kaby Lake-G powered Hades Canyon gaming NUC, it is still a fairly small system, so not all graphics cards are going to fit, but enthusiasts should be able to use GPUs with shorter Mini ITX designs easily enough.
FanlessTech notes that the reference Ghost Canyon X NUC will most likely be actively cooled, but third party fanless cases from makers like Akasa, Streacom, Tranquil PC and others should be achievable with a 45W TDP CPU (and even GPU if you go with a lower end model).
Further details are still unknown and the pictured case design is still subject to change as the system gets further along in the design process and closer to launch. Curiously, that expected early 2020 Ghost Canyon X launch would coincide with Intel’s plans for launching its own discrete graphics solution so an Intel NUC with an Intel graphics card would be an interesting system to see!
Stay tuned for updated NUC information as we get closer to Computex 2019 and CES 2020!
Subject: Graphics Cards, Mobile | December 12, 2018 - 10:04 PM | Tim Verry
Tagged: turing, rumor, RTX 2070, RTX 2060, nvidia
Rumors have appeared online that suggest NVIDIA may be launching mobile versions of its RTX 2070 and RTX 2060 GPUs based on its new Turing architecture. The new RTX 2070 and RTX 2060 with Max-Q designs were leaked by Twitter user TUM_APISAK who posted cropped screenshots of Geekbench 4.3.1 and 3DMark 11 Performance results.
Allegedly handling the graphics duties in a Lenovo 81HE, the GeForce RTX 2070 with Max-Q Design (8GB VRAM) combined with a Core i7-8750H Coffee Lake six core CPU and 32 GB system memory managed a Geekbench 4.3.1 score of 223,753. The GPU supposedly has 36 Compute Units (CUs) and a core clock speed of 1,300 MHz. The desktop RTX 2070 GPU, which is already available, also has 36 CUs with 2,304 CUDA cores, 144 texture units, 64 ROPs, 288 Tensor cores, and 36 RT (ray tracing) cores. The desktop GPU has a 175W reference (non-FE) TDP and clocks of 1410 MHz base and 1680 MHz boost (1710 MHz for the Founders Edition). Assuming that 36 CU number is accurate, the mobile RTX 2070 may well have the same core counts, just running at lower clocks, which would be nice to see but would require a beefy mobile cooling solution.
As for the RTX 2060 Max-Q Design graphics processor, fewer specifications were revealed, as the leak was limited to two screenshots allegedly from Final Fantasy XV's benchmark results page comparing a desktop RTX 2060 with a Max-Q RTX 2060. The number of CUs (and other numbers like CUDA/Tensor/RT cores, TMUs, and ROPs) was not revealed in those screenshots, for example. The comparison does lend further credence to the rumors of the RTX 2060 utilizing 6 GB of GDDR6 memory, though. Tom's Hardware does have a screenshot that shows the RTX 2060 with 30 CUs, which suggests 1,920 CUDA cores, 240 Tensor cores, and 30 RT cores, with clocks up to 1.2 GHz (which does mesh well with previous rumors of the desktop part).
| Graphics Card | Generic VGA | Generic VGA |
| --- | --- | --- |
| Memory | 6144 MB | 6144 MB |
| Core clock | 960 MHz | 975 MHz |
| Memory clock | 1750 MHz | 1500 MHz |
| Driver name | NVIDIA GeForce RTX 2060 | NVIDIA GeForce RTX 2060 with Max-Q Design |
Also, the TU106 RTX 2060 with Max-Q Design reportedly has a 975 MHz core clock and a 1500 MHz (6 GHz) memory clock. Note that the 960 MHz core clock and 1750 MHz (7 GHz) memory clocks don't match previous RTX 2060 rumors, which suggested higher GPU clocks in particular (up to 1.2 GHz). To be fair, it could just be the software reporting incorrect numbers due to the GPUs not being official yet. One final bit of leaked information included a note about 3DMark 11 performance, with the RTX 2060 Max-Q Design GPU hitting at least 19,000 in the benchmark's Performance preset, which allegedly puts it in between the scores of the mobile GTX 1070 and the mobile GTX 1070 Max-Q. (A graphics score between nineteen and twenty thousand would put it a bit above a desktop GTX 1060 but far below the desktop GTX 1070.)
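For context on those clocks, here is a quick back-of-envelope sketch of how the base memory clocks map to the effective rates quoted in parentheses and to peak bandwidth. The quad-pumped signaling factor matches the effective rates cited above, but the 192-bit bus width is only an assumption based on the rumored 6 GB configuration, not a confirmed spec:

```python
# Rough sketch: GDDR effective rate and peak bandwidth from a base clock.
# Assumptions (not confirmed specs): quad data rate signaling and a
# 192-bit memory bus for a 6 GB card.

def effective_rate_ghz(base_clock_mhz, pump_factor=4):
    """Effective transfer rate in GT/s, assuming quad-pumped signaling."""
    return base_clock_mhz * pump_factor / 1000

def bandwidth_gbs(base_clock_mhz, bus_width_bits=192, pump_factor=4):
    """Peak memory bandwidth in GB/s: transfers/sec times bytes per transfer."""
    return base_clock_mhz * 1e6 * pump_factor * (bus_width_bits / 8) / 1e9

for clk in (1500, 1750):
    print(f"{clk} MHz -> {effective_rate_ghz(clk):.0f} GT/s, "
          f"{bandwidth_gbs(clk):.0f} GB/s on a 192-bit bus")
```

Under those assumptions, the 1500 MHz Max-Q figure works out to 144 GB/s versus 168 GB/s at 1750 MHz, so the lower memory clock alone would cost roughly 14 percent of peak bandwidth.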
As usual, take these rumors and leaked screenshots with a healthy heaping of salt, but they are interesting nonetheless. Combined with the news about NVIDIA possibly announcing new mid-range GPUs at CES 2019, we may well see new laptops and other mobile graphics solutions shown off at CES and available within the first half of 2019 which would be quite the coup.
What are your thoughts on the rumored RTX 2060 for desktops and its mobile RTX 2060 and RTX 2070 Max-Q siblings?
Subject: Mobile | December 12, 2018 - 05:19 PM | Jeremy Hellstrom
Tagged: dell, linux, ubuntu 18.04, XPS developer edition, Kaby Lake R
Dell have updated their Linux-powered XPS Developer Edition laptop with a Kaby Lake R processor, up to a 2TB PCIe SSD, 4-16GB of RAM, and either a 1080p screen or a 4K touchscreen depending on how much you are willing to pay. Dell included all the latest features, including a pair of Thunderbolt 3 ports as well as a USB 3.1 Type-C port; there is even an SD card reader.
Apart from the webcam and the lack of older style USB ports, Ars Technica gives this new Linux-powered laptop top marks.
"Recently, Dell finally sent Ars the latest model of the XPS 13 DE for testing. And while Dell did put a lot of work into this latest iteration, the biggest upgrade with the latest Developer Edition is the inclusion of Ubuntu 18.04."
Here are some more Mobile articles from around the web:
- Huawei Mate 20 Pro @ Kitguru
- HUAWEI Mate 20 @ TechARP
- LG G7 ThinQ @ The Inquirer
- Jacked Up With Less Jack: OnePlus 6T Smartphone Review @ Techgage
- Zeblaze THOR PRO 3G Smartwatch Review @ NikKTech
It's like Skyrim ... with guns ... in space ... with a Firefly meets Borderlands feel? The Outer Worlds teaser
Subject: General Tech | December 12, 2018 - 02:56 PM | Jeremy Hellstrom
Tagged: obsidian, The Outer Worlds, gaming
The teaser trailer for The Outer Worlds certainly looks interesting, though one has to wonder if Obsidian may have tried to combine too many different styles into a single game. On the other hand, they are responsible for the best of the first-person Fallout games, so we can hold out some hope. Even better is the news from Rock, Paper, SHOTGUN that even though Microsoft now owns Obsidian, the game will be published by 2K and will not be a Windows Store exclusive launch!
"Obsidian Entertainment, the studio behind RPGs from Alpha Protocol through Fallout: New Vegas to Pillars Of Eternity, tonight announced The Outer Worlds, a new singleplayer first-person RPG with a space-western twang."
Here is some more Tech News from around the web:
- John Romero's about to make you a gift @ Rock, Paper, SHOTGUN
- Smash Bros. Ultimate review: The best fighting game on any Nintendo system @ Ars Technica
- Wot I Think - X4: Foundations @ Rock, Paper, SHOTGUN
- OCC Reviews Darksiders III
- Conan Unconquered is an RTS coming from Petroglyph @ Rock, Paper, SHOTGUN
- The Best PC Games (You Should Be Playing) @ Techspot
- Humble Team 17 Bundle
- Ark: Survival Evolved creators announce Atlas, a pirate survival MMO @ Rock, Paper, SHOTGUN
- Take a long look through the spyglass at Beyond Good And Evil 2 in action @ Rock, Paper, SHOTGUN
Subject: General Tech | December 12, 2018 - 12:37 PM | Jeremy Hellstrom
Tagged: RTX 2060, nvidia, navi, amd
The majority of today's news will cover Intel's wide range of announcements from their architecture day, from new Optane DIMMs seeking to reduce latency close to that of DRAM, to Foveros chiplets, and hints of coming in off the Lake to spend some time in a Sunny Cove. Indeed, there are more links below the fold offering additional coverage, as yesterday's announcements were very dense.
That might overshadow a rumour that dedicated discrete GPU lovers would be interested in: NVIDIA might be able to get the RTX 2060 to market before AMD can launch a Navi based card. The Inquirer has seen rumours that NVIDIA might be able to release the card in the first half of 2019, while the 7nm Navi isn't expected until the second half of the year. The early supply of mid-range NVIDIA GPUs might attract buyers who no longer want to wait, though depending on how Navi performs they could come to regret that lack of patience.
"GRAPHICS CARDS IN 2019 are set to get a good bit more interesting, as a leak suggests that Nvidia's GeForce RTX 2060 could reach the market before AMD's next-gen Navi Radeon cards."
Here is some more Tech News from around the web:
- Intel 2018 Architecture Day @ [H]ard|OCP
- Intel talks about its architectural vision for the future @ The Tech Report
- Intel introduces Foveros: 3D die stacking for more than just memory @ Ars Technica
- Intel Architecture Day – Foveros, Sunny Cove and Gen11 Graphics Coming Soon @ Legit Reviews
- TSMC to expand 8-inch fab capacity for robust demand for automotive, IoT @ DigiTimes
- The internet is going to hell and its creators want your help fixing it @ The Register
- Synology MR2200ac Mesh Router Review: First WPA3-Certified Wi-Fi Router @ Modders-Inc
- LG's beer-making bot singlehandedly sucks all fun, boffinry from home brewing @ The Register
- Ever Wondered How Those Computer-Controlled Christmas Light Displays Work? @ Techspot
Subject: Storage | December 12, 2018 - 09:17 AM | Allyn Malventano
Tagged: ssd, Optane, Intel, DIMM, 3D XPoint
Intel's architecture day press release contains the following storage goodness mixed within all of the talk about 3D chip packaging:
Memory and Storage: Intel discussed updates on Intel® Optane™ technology and the products based upon that technology. Intel® Optane™ DC persistent memory is a new product that converges memory-like performance with the data persistence and large capacity of storage. The revolutionary technology brings more data closer to the CPU for faster processing of bigger data sets like those used in AI and large databases. Its large capacity and data persistence reduces the need to make time-consuming trips to storage, which can improve workload performance. Intel Optane DC persistent memory delivers cache line (64B) reads to the CPU. On average, the idle read latency with Optane persistent memory is expected to be about 350 nanoseconds when applications direct the read operation to Optane persistent memory, or when the requested data is not cached in DRAM. For scale, an Optane DC SSD has an average idle read latency of about 10,000 nanoseconds (10 microseconds), a remarkable improvement. In cases where requested data is in DRAM, either cached by the CPU’s memory controller or directed by the application, memory sub-system responsiveness is expected to be identical to DRAM (<100 nanoseconds). The company also showed how SSDs based on Intel’s 1 Terabit QLC NAND die move more bulk data from HDDs to SSDs, allowing faster access to that data.
Did you catch that? 3D XPoint memory in DIMM form factor is expected to have an access latency of 350 nanoseconds! That's down from 10 microseconds of the PCIe-based Optane products like Optane Memory and the P4800X. I realize those are just numbers, and showing a nearly 30x latency improvement may be easier visually, so here:
Above is an edit to my Bridging the Gap chart from the P4800X review, showing where this new tech would fall in purple. That's all we have to go on for now, but these are certainly exciting times. Consider that non-volatile storage latencies have improved by nearly 100,000x over the last decade, and are now within striking distance (less than 10x) of DRAM! Before you get too excited, realize that Optane DIMMs will be showing up in enterprise servers first, as they require specialized configurations to treat DIMM slots as persistent storage instead of DRAM. That said, I'm sure the tech will eventually trickle down to desktops in some form or fashion. If you're hungry for more details on what makes 3D XPoint tick, check out how 3D XPoint works in my prior article.
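The quoted latency figures reduce to a couple of simple ratios, which is where the "nearly 30x" comes from (all values are the averages from Intel's release; the arithmetic below is just illustrative):

```python
# Quick arithmetic behind the latency comparison, using the average idle
# read latencies quoted in Intel's press release (in nanoseconds).
optane_dimm_ns = 350       # Optane DC persistent memory (DIMM form factor)
optane_ssd_ns = 10_000     # Optane DC SSD over PCIe, i.e. 10 microseconds
dram_ns = 100              # upper bound quoted for DRAM-cached reads

print(f"DIMM vs. PCIe SSD: {optane_ssd_ns / optane_dimm_ns:.1f}x lower latency")
print(f"DIMM vs. DRAM: still about {optane_dimm_ns / dram_ns:.1f}x slower")
```

That works out to roughly a 28.6x improvement over the PCIe Optane parts, while remaining about 3.5x slower than the quoted DRAM ceiling, comfortably inside the "less than 10x" striking distance mentioned above.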
Subject: Processors | December 12, 2018 - 09:00 AM | Sebastian Peak
Tagged: xeon, Sunny Cove, processor, intel core, Intel, integrated graphics, iGPU, Foveros, cpu, 3D stacking
Intel’s Architecture Day was held yesterday and brought announcements of three new technologies. Intel shared details of a new 3D stacking technology for logic chips, a brand new CPU architecture for desktop and server, and some surprising developments on the iGPU front. Oh, and they mentioned that whole discrete GPU thing…
3D Stacking for Logic Chips
First we have Foveros, a new 3D packaging technology that follows Intel’s previous EMIB (Embedded Multi-die Interconnect Bridge) 2D packaging technology and enables die-stacking of high-performance logic chips for the first time.
“Foveros paves the way for devices and systems combining high-performance, high-density and low-power silicon process technologies. Foveros is expected to extend die stacking beyond traditional passive interposers and stacked memory to high-performance logic, such as CPU, graphics and AI processors for the first time.”
Foveros will allow for a new “chiplet” paradigm, as “I/O, SRAM, and power delivery circuits can be fabricated in a base die and high-performance logic chiplets are stacked on top”. This new approach would permit design elements to be “mixed and matched”, and allow new device form-factors to be realized as products can be broken up into these smaller chiplets.
The first range of products using this technology are expected to launch in the second half of 2019, beginning with a product that Intel states “will combine a high-performance 10nm compute-stacked chiplet with a low-power 22FFL base die,” which Intel says “will enable the combination of world-class performance and power efficiency in a small form factor”.
Intel Sunny Cove Processors - Coming Late 2019
Next up is the announcement of a brand new CPU architecture with Sunny Cove, which will be the basis of Intel’s next generation Core and Xeon processors in 2019. No mention of 10nm was made, so it is unclear if Intel’s planned transition from 14nm is happening with this launch (the last Xeon roadmap showed a 10 nm transition with "Ice Lake" in 2020).
Intel states that Sunny Cove is “designed to increase performance per clock and power efficiency for general purpose computing tasks” with new features included “to accelerate special purpose computing tasks like AI and cryptography”.
Intel provided this list of Sunny Cove’s features:
- Enhanced microarchitecture to execute more operations in parallel.
- New algorithms to reduce latency.
- Increased size of key buffers and caches to optimize data-centric workloads.
- Architectural extensions for specific use cases and algorithms. For example, new performance-boosting instructions for cryptography, such as vector AES and SHA-NI, and other critical use cases like compression and decompression.
Integrated Graphics with 2x Performance
Intel slide image via ComputerBase
Intel did reveal next-gen graphics at the event, though in the form of a new generation of the company’s integrated graphics. The update is nonetheless significant, with the upcoming Gen11 integrated GPU “expected to double the computing performance-per-clock compared to Intel Gen9 graphics” thanks to a huge increase in Execution Units, from 24 EUs with Gen9 to 64 EUs with Gen11. This will provide “>1 TFLOPS performance capability”, according to Intel, who states that the new Gen11 graphics are also expected to feature advanced media encode/decode, supporting “4K video streams and 8K content creation in constrained power envelopes”.
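As a sanity check on that ">1 TFLOPS" figure, here is a rough estimate assuming Gen11 keeps Gen9's per-EU throughput of 16 FP32 FLOPs per clock (two 4-wide FMA units per EU) and runs near 1 GHz; both the throughput carryover and the clock are assumptions, not Intel-confirmed Gen11 figures:

```python
# Back-of-envelope FP32 throughput estimate for Intel's integrated GPUs.
# Assumption: 16 FLOPs per clock per EU (two 4-wide SIMD FMA units, 2 FLOPs
# per FMA lane), which is the Gen9 organization; ~1 GHz clock is assumed.
def gflops(eus, flops_per_clock=16, clock_ghz=1.0):
    return eus * flops_per_clock * clock_ghz

gen9 = gflops(24)    # Gen9 with 24 EUs
gen11 = gflops(64)   # Gen11 with 64 EUs, just over 1 TFLOPS at 1 GHz
print(f"Gen9: {gen9:.0f} GFLOPS, Gen11: {gen11:.0f} GFLOPS "
      f"({gen11 / gen9:.2f}x)")
```

Under these assumptions, 64 EUs land at about 1024 GFLOPS, which lines up with the ">1 TFLOPS" claim; the raw 2.67x EU ratio also suggests Intel's "double the performance-per-clock" framing is on the conservative side.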
And finally, though hardly a footnote, the new Gen11 graphics will feature Intel Adaptive Sync technology, which was a rumored feature of upcoming discrete GPU products from Intel.
And now for that little part about discrete graphics: At the event Intel simply “reaffirmed its plan to introduce a discrete graphics processor by 2020”. Nothing new here, and this obviously means that we won’t be seeing a new discrete GPU from Intel in 2019 - though the beefed-up Gen11 graphics should provide a much needed boost to Intel’s graphics offering when Sunny Cove launches “late next year”.
Subject: Motherboards | December 11, 2018 - 06:19 PM | Jeremy Hellstrom
Tagged: msi, Z390, Intel, MEG Z390 ACE
MSI's MEG was the cream of the crop for Threadripper, even though it carried a significant price. Now we have a chance to see how this design works on Intel, as MSI have the MEG Z390 ACE for under $300, to pair with a processor such as the i7-9900K. MEG sports an enhanced backplate, as you can see from the picture below, for those who like to insert a lot of extras into their motherboard.
As for general performance, stability and overclocking? Check out [H]ard|OCP's review to see why the board was sporting Gold once it was unstrapped from the bench.
"The MSI Enthusiast Gaming lineup expands once again with two Z390 offerings for Intel’s latest 9000 series CPUs. The MEG boards offer a blend of quality, features, with power delivery, and overclocking in mind. MSI has certainly raised the bar for its products over the last few years. So our expectations for the ACE motherboard are high."
Here are some more Motherboard articles from around the web:
- ASRock Z390 Taichi Ultimate @ Guru of 3D
- Aorus Z390 Pro @ Modders-Inc
- MSI MPG Z390 Gaming Pro Carbon @ Guru of 3D
- ASRock Z390 Phantom Gaming 9 @ TechPowerUp
Subject: General Tech | December 11, 2018 - 01:10 PM | Jeremy Hellstrom
Tagged: supreme, oops, Samsung
It will be a surprise to many that Supreme is a skateboard fashion brand; even more surprised was Supreme when Samsung announced they were forming some sort of partnership with the company. It seems that a knock-off version of the New York based provider of duds for skaters exists in Italy, thanks to a less than effective trademark. That company not only convinced Samsung they were the real deal, but also that it would benefit Samsung to partner with them to host a big fashion show in Beijing.
Samsung is rather embarrassed about the whole thing, so don't taunt them too much. Pop by Ars Technica for a bit of a lesson on why you should double check anything a skater tells you is true!
"Supreme is not working with Samsung, opening a flagship location in Beijing or participating in a Mercedes-Benz runway show. These claims are blatantly false and propagated by a counterfeit organization."
Here is some more Tech News from around the web:
- Sony's PlayStation Classic can be easily hacked thanks to weak cryptography @ The Inquirer
- Super Micro Says Review Found No Malicious Chips in Motherboards @ Slashdot
- Latest Google+ flaw leads Chocolate Factory to shut down site early @ The Register
- HoloLens 2 might ditch Intel for Qualcomm's Snapdragon 850 @ The Inquirer
- Did you know that iOS ad clicks cost more than Android? These scammers did @ The Register
- Christmas 2018 Mega Worldwide Joint Giveaway @ NikKTech
Subject: General Tech | December 10, 2018 - 03:51 PM | Jeremy Hellstrom
Tagged: audio, FiiO, m7, Exynos 7270, Sabre 9018Q2C, DAC
There are those for whom the idea of listening to audio via a phone is painful to contemplate, as the lack of a dedicated high fidelity DAC will ruin the experience. They will quite happily drop $200 on something like the FiiO M7 and consider it a bargain. The device is also interesting technically, running an Exynos processor alongside a dedicated DAC, which gives it some appeal for non-audiophiles as well. Check out Nikktech for a look at the interface, hardware, and audio quality if you are curious.
It also has an FM receiver!
"It may not be the flagship music player in the entire High-Resolution lineup by FiiO but thanks to its Exynos 7270 Processor and the Sabre 9018Q2C DAC/Amp the M7 should have no problem satisfying even the most demanding audiophiles."
Here is some more Tech News from around the web:
- E-Lektron EL20-MB & EL16-P Mobile Sound Systems Review @ NikKTech
- Creative Sound BlasterX G6 7.1 External Gaming Audio DAC Review @ Hardware Asylum
- Arozzi Sfera, Sfera Pro & Colonna USB Microphones Review @ NikKTech
- Asus Strix Fusion 700 RGB @ Guru of 3D
Subject: Graphics Cards | December 10, 2018 - 03:28 PM | Sebastian Peak
Tagged: Vega, trademark, rumor, report, radeon, graphics, gpu, amd, 7nm
The logo, with the familiar "V" joined by a couple of new stripes on the right side, could mean a couple of things: it might reference Vega II (2), or perhaps the VII is the Roman numeral 7, hinting at 7nm instead. VideoCardz.com thinks the latter may be the case:
"AMD has registered a new trademark just 2 weeks ago. Despite many rumors floating around about Navi architecture and its possible early reveal or announcement in January, it seems that AMD is not yet done with Vega. The Radeon Vega logo, which features the distinctive V lettering, has now received 2 stripes, to indicate the 7nm die shrink."
Whatever the case may be, it's interesting to consider the possibility of a 7nm Vega GPU before we see Navi. We really don't know, though it does seem a bit presumptuous to count on a new product as early as CES, as Tech Radar speculates:
"We know full well that the next generation of AMD graphics will be built upon a 7nm architecture going by the roadmaps the company released at CES 2018. At the same time, it seems to all sync up with AMD's plans to announce new 7nm GPUs at CES 2019, so it almost seems certain that we’ll see Vega II graphics cards soon."
The prospect of new graphics cards is always tantalizing, but we'll need more than a logo before things really get interesting.
Subject: General Tech | December 10, 2018 - 01:42 PM | Jeremy Hellstrom
Tagged: just cause 4, gaming, benchmarks, 4k, 1440p, 1080p
One of the best pieces of stress relief software* just got a major update, and TechSpot has discovered it may actually cause more stress than it relieves. The focus of their article is performance, but it is worth noting up front that they found Just Cause 4 to be a downgrade from the previous release, with many of the graphics being similar or lower quality than the previous game and at a much higher performance cost.
If you have anything below a GTX 1080 or Vega 64 you will struggle to maintain 60fps on very high quality at 1080p; a GTX 1080 or Vega 64 might let you scrape by at 1440p, but smooth 4K is beyond even an RTX 2080. Since the game itself, apart from some of the detailed scenery, doesn't seem much different from the previous title, it will be interesting to see if the reported performance issues lessen over time.
*There is a game included as well.
"Today we’re benchmarking Just Cause 4 with a boatload of different GPUs to help you determine if your graphics card will handle this brand new title, and if need be, work out a suitable upgrade option."
Here are some more Graphics Card articles from around the web:
- Revisiting Battlefield V Ray Tracing Performance @ TechSpot
- Battlefield V Tides of War GeForce RTX DirectX Raytracing @ TechPowerUp
- AMD Radeon RX 590 Linux Benchmarks, 18-Way NVIDIA/AMD Gaming Comparison @ Phoronix
- The Best Graphics Cards 2018 @ TechSpot
- MSI GeForce RTX 2080 GAMING X TRIO @ [H]ard|OCP
- NVIDIA GeForce RTX 2080 Linux Gaming Benchmarks @ Phoronix
Subject: General Tech | December 10, 2018 - 12:38 PM | Jeremy Hellstrom
Tagged: spectre, splitspectre, speculator, security, arm, Intel, amd
The discovery of yet another variant of the Spectre vulnerability is not good news for already exhausted security experts or reporters, but there is something new in this story which offers a glimmer of hope. A collaborative team of researchers from Northeastern University and IBM found this newest design flaw using an automatic bug-finding tool they designed, called Speculator.
They designed the tool to get around the largest hurdle security researchers face: the secrecy of AMD, Intel, and ARM, who are trying to keep the recipe for their special sauce secret, and rightly so. Protecting their intellectual property is paramount to their stockholders, and there are arguments about the possible effectiveness of security through obscurity in protecting consumers from those with nefarious intent, but it does come at a cost for those hunting bugs for good.
"SplitSpectre is a proof-of-concept built from Speculator, the team's automated CPU bug-discovery tool, which the group plans to release as open-source software."
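For context on the class of bug Speculator hunts, SplitSpectre is a variant of the original Spectre v1 bounds-check-bypass pattern. Below is a toy Python model of that classic gadget; it is purely illustrative (all the names are invented here), since real attacks depend on hardware speculative execution and cache-timing measurement, which Python cannot express. The `touched_lines` set stands in for which cache lines end up warm after speculation is rolled back.

```python
# Toy model of the classic Spectre v1 bounds-check-bypass gadget.
memory = bytearray(256)
ARRAY1_BASE = 0
ARRAY1_SIZE = 16
memory[ARRAY1_SIZE] = ord("S")  # a "secret" byte just past the array

touched_lines = set()  # models which cache lines end up warm

def victim(x):
    # The dependent load executes "speculatively" even when x is out of
    # bounds, leaving a cache footprint keyed on the loaded value.
    value = memory[ARRAY1_BASE + x]
    touched_lines.add(value)       # microarchitectural side effect
    if x < ARRAY1_SIZE:            # architectural bounds check
        return value
    return None                    # result squashed; footprint remains

touched_lines.clear()
assert victim(ARRAY1_SIZE) is None  # architecturally, nothing escapes...
leaked = touched_lines.pop()        # ...but the side channel leaks it
print(chr(leaked))  # → S
```

The point of the model is the asymmetry: the bounds check protects the architectural result, but not the cache footprint left behind by the speculated load, and that footprint is what Spectre-family attacks measure.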
Here is some more Tech News from around the web:
- MAMR Mia – it's not just WD: Toshiba's popped to the microwave too @ The Register
- At least one major carrier lied about its 4G coverage, FCC review finds @ Ars Technica
- APC UPS 600VA BE600M1 Battery Backup & Surge Protector Review @ Legit Reviews
- Hydrogen Powered Nerf Blaster Is Dangerously Awesome @ Hackaday
- Ars Technica’s ultimate board game gift guide, 2018 edition
Subject: Graphics Cards | December 10, 2018 - 10:36 AM | Jim Tanous
Tagged: 3dmark, ray tracing, directx raytracing, raytracing, rtx, benchmarking, benchmarks
After first announcing it last month, UL this weekend provided new information on its upcoming ray tracing-focused addition to the 3DMark benchmarking suite. Port Royal, what UL calls the "world's first dedicated real-time ray tracing benchmark for gamers," will launch Tuesday, January 8, 2019.
For those eager for a glimpse of the new ray-traced visual spectacle, or for the majority of gamers without a ray tracing-capable GPU, the company has released a video preview of the complete Port Royal demo scene.
Access to the new Port Royal benchmark will be limited to the Advanced and Professional editions of 3DMark. Existing 3DMark users can upgrade to the benchmark for $2.99, and it will become part of the base $29.99 Advanced Edition package for new purchasers starting January 8th.
Real-time ray tracing promises to bring new levels of realism to in-game graphics. Port Royal uses DirectX Raytracing to enhance reflections, shadows, and other effects that are difficult to achieve with traditional rendering techniques.
"As well as benchmarking performance, 3DMark Port Royal is a realistic and practical example of what to expect from ray tracing in upcoming games: ray tracing effects running in real-time at reasonable frame rates at 2560 × 1440 resolution.
3DMark Port Royal was developed with input from AMD, Intel, NVIDIA, and other leading technology companies. We worked especially closely with Microsoft to create a first-class implementation of the DirectX Raytracing API.
Port Royal will run on any graphics card with drivers that support DirectX Raytracing. As with any new technology, there are limited options for early adopters, but more cards are expected to get DirectX Raytracing support in 2019."
Subject: Graphics Cards | December 9, 2018 - 05:40 PM | Tim Verry
Tagged: pascal, msi, GP104, GeForce GTX 1060, armor
MSI is launching a refreshed GTX 1060 graphics card that uses GDDR5X rather than GDDR5 for its 6GB of video memory. The aptly named GTX 1060 Armor 6GD5X OC shares many features with the existing Armor 6G OC (and OCV1) it refreshes, including the dual TORX fan Armor 2X cooler and a maximum of four display outputs among three DisplayPort 1.4, one HDMI 2.0b, and one DVI-D.
The new Pascal-based GPU in the upcoming graphics card is reportedly a cut-down variant of NVIDIA's larger GP104 chip rather than the GP106-400 used for previous GTX 1060s, but the core count and other compute resources remain the same at 1,280 CUDA cores, 80 TMUs, 48 ROPs, and a 192-bit memory bus. Clock speeds have been increased slightly versus reference specifications, however, at 1544 MHz base and up to 1759 MHz boost. The GPU is paired with 6 GB of GDDR5X that is curiously clocked at an effective 8 Gbps, no faster than the GDDR5 it replaces. The memory more than likely has quite a bit of overclocking headroom versus GTX 1060 6GB cards using GDDR5, but it appears MSI is leaving those pursuits for enthusiasts to explore on their own.
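Why does the same clock on faster memory chips matter? Peak bandwidth follows directly from bus width and effective data rate, so GDDR5X at the stock 8 Gbps delivers exactly the same 192 GB/s as the GDDR5 card; the win only comes from overclocking headroom. A quick back-of-the-envelope in Python (the helper name is just for illustration):

```python
# Peak memory bandwidth: bytes per transfer times transfers per second.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Return theoretical peak bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gtps

# GTX 1060: 192-bit bus at an effective 8 GT/s, GDDR5 or GDDR5X alike.
print(peak_bandwidth_gb_s(192, 8))   # → 192.0
# GDDR5X is rated well past 10 GT/s, so an enthusiast overclock could reach:
print(peak_bandwidth_gb_s(192, 10))  # → 240.0
```

The second figure is hypothetical, of course; it just shows the headroom MSI appears to be leaving on the table.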
MSI is equipping its GTX 1060 Armor 6GD5X OC graphics cards with an 8+6 pin PCI-E power connection setup which should help overclockers push the cards as far as they can (previous GTX 1060 Armor OC cards had only a single 8-pin). Looking at the specification page, the new card will be slightly shorter than the GDDR5-based card but with a thicker cooler, at 276mm x 140mm x 41mm. As part of the Armor series the card has a white and black design like its predecessors.
MSI has not yet released pricing or availability information, but with the GDDR5-based graphics cards priced at around $275 I would suspect the MSI GTX 1060 Armor 6GD5X OC to sit around $290 at launch.
I am curious how well new GTX 1060 graphics cards will perform when paired with faster GDDR5X memory and how the refreshed cards stack up against AMD's refreshed Polaris 30 based RX 590 graphics cards.
- The GeForce GTX 1060 6GB Review - GP106 Starting at $249
- NVIDIA GeForce GTX 1060 Preview: Pascal with GP106