
Gigabyte Launches Z390 AORUS Xtreme Waterforce Motherboard

Subject: Motherboards | December 14, 2018 - 07:04 PM |
Tagged: Z390, water cooling, gigabyte, aorus

Gigabyte’s Aorus brand is planning to make its flagship Z390 Aorus Xtreme motherboard even more extreme by pairing it with an RGB LED-lit perspex monoblock that covers the processor and PCH areas. The aptly named Z390 Aorus Xtreme Waterforce is an E-ATX form factor motherboard for 8th and 9th Generation Intel Core processors (LGA 1151) that comes packed with overclocking features and a plethora of I/O and expansion options.

Gigabyte Aorus Z390 Xtreme Waterforce Motherboard.png

The Z390 Aorus Xtreme Waterforce is powered by two 8-pin CPU power connectors, a right-angle 24-pin ATX connector, and a six-pin PCI-E input for extra PCI-E slot power delivery. The LGA 1151 socket sits front and center up top, with four DDR4 memory slots off to the right of it. Power delivery is handled by a 16-phase PowIRStage digital VRM. Below the CPU sit three PCI-E x16 slots (which run at x16/x8/x4), two PCI-E x1 slots, and three M.2 slots with heat spreaders. There is also a mini PCI-E slot, but it comes with an Intel wireless card pre-installed. Additional expansion options include six SATA 3 ports. The Gigabyte motherboard includes 10 sensor points and eight hybrid fan headers, and it supports the Smart Fan 5 and RGB Fan Commander software tools as well as external RGB LED strips and addressable LEDs via headers (RGB Fusion). The Aorus board also supports OC Touch, which offers physical buttons and switches for adjusting overclocks without needing to go into the UEFI BIOS.

Audio duties are handled by the Realtek ALC1220-VB codec along with an ESS Sabre ES9018K2M DAC paired with LME 49720 and TI OPA1622 op-amps, and several WIMA and Nichicon audio capacitors. The reported 127 dB SNR audio also supports “Amp Up”, which can automatically adjust to various headphones. Networking on the Z390 Xtreme Waterforce is handled by Intel (CNVi) for 802.11ac Wave 2 2x2 wireless and Gigabit Ethernet, and by Aquantia for 10 GbE.

Rear I/O on the flagship motherboard shouldn’t disappoint with:

  • 2 x SMA antenna connectors (Wi-Fi)
  • 2 x Thunderbolt 3 40 Gbps
  • 4 x USB 3.1 Gen 2 (10 Gbps)
  • 2 x USB 3.1 Gen 1 (5 Gbps)
  • 2 x USB 2.0
  • 1 x 10 Gigabit Ethernet
  • 1 x 1 Gigabit Ethernet
  • 1 x HDMI
  • 5 x Analog audio
  • 1 x S/PDIF audio

The Aorus All In One Monoblock is the star of the show, though, as it is what differentiates this board from the standard Z390 Aorus Xtreme. The monoblock uses standard G1/4” threads and dense copper fins to help transfer heat to the water loop. Water flows over the CPU, VRM, and PCH areas to keep everything nice and cool even when overclocking. According to Gigabyte, a leak detection circuit will shut down the PC if a leak in the waterblock or loop is detected, protecting your components. The downside to the monoblock is, of course, the added complexity to the build process; it certainly looks nice, though, so some enthusiasts may well find it worth it.

Gigabyte has not yet released pricing or availability information, but it’s going to come at a premium price. The Z390 Aorus Xtreme (sans waterblock) has an MSRP of $549.99, for example, and the addition of the Aorus RGB monoblock could add another $50 to $100 to that price.

The reviews on this board and the monoblock should be interesting. While it may be expensive, I'm sure that some watercooling enthusiasts will find uses for it in all-out "cool all the things" builds!

Source: Gigabyte
Subject: Storage
Manufacturer: Samsung

Introduction

For years we have been repeatedly teased by Samsung. Launch after successful launch in the consumer SSD space, topping performance charts nearly every time, but what about enterprise? Oh sure, there were plenty of launches on that side, with the company showing off higher and higher capacity 2.5" enterprise SSDs year after year, but nobody could ever get their hands on one, and even the higher tier reviewers could not confirm Samsung's performance claims. While other SSD makers would privately show me performance comparison data showing some Samsung enterprise part walking all over their own enterprise parts, there was not much concern in their voices since only a small group of companies had the luxury of being on Samsung's short list of clients that could purchase these products. Announcements of potentially groundbreaking products like the Z-SSD were soured by press folk growing jaded by unobtanium products that would likely never be seen by the public.


Samsung has recently taken some rather significant steps to change that tune. They held a small press event in September, where we were assured that enterprise SSD models were coming to 'the channel' (marketing speak for being available on the retail market). I was thrilled, as were some of the Samsung execs who had apparently been pushing for such a move for some time.

As a next step towards demonstrating that Samsung is dedicated to their plan, I was recently approached to test a round of their upcoming products. I accepted without hesitation, have been testing for the past week, and am happy to now bring you detailed results obtained from testing eight different SSDs across four enterprise SSD models. Testing initially began with three of the models, but then I was made aware that the Z-SSD was also available for testing, and given the potential significance of that product and its placement as a competitor to 3D XPoint products like Intel's Optane, I thought it important to include that testing as well, making this into one heck of a Samsung Enterprise SSD roundup!

One large note before we continue - this is an enterprise SSD review. Don't expect to see game launches, SYSmark runs, or boot times here. The density of the data produced by my enterprise suite precludes most easy side-by-side comparisons, so I will instead be presenting the standard full-span random and sequential results for fully conditioned drives, marking the rated specs on the charts as we go along. High-Resolution QoS will also be used throughout, as Quality of Service is one of the most important factors to consider when choosing SSDs for enterprise usage. In short, the SSDs will be tested against their own specifications, with the exception of some necessary comparisons between the Samsung Z-SSD and the Intel Optane SSD DC P4800X which I will squeeze in towards the end of this very lengthy and data-dense review.
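For readers unfamiliar with how QoS numbers are derived: they are simply high percentiles ("nines") of the per-I/O latency distribution, measured over a long steady-state run. A minimal Python sketch of the idea, using synthetic sample data rather than measurements from any of the drives in this review:

import numpy as np

def qos_summary(latencies_us):
    """Percentile levels commonly quoted for enterprise SSD QoS."""
    levels = [50, 99, 99.9, 99.99, 99.999]
    return {f"p{lvl}": round(float(np.percentile(latencies_us, lvl)), 1)
            for lvl in levels}

# Illustrative only: a synthetic log-normal latency distribution
# with a median around 90 microseconds.
rng = np.random.default_rng(42)
samples = rng.lognormal(mean=4.5, sigma=0.4, size=1_000_000)

print(qos_summary(samples))
# The p99.99 and p99.999 figures are what separate a good enterprise
# drive from a great one - averages hide the outliers that stall I/O.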

Read on for our full review of Samsung's new enterprise products!

... and the Roccat is on the Aim-o, Aim-o, Aim-o

Subject: General Tech | December 13, 2018 - 03:43 PM |
Tagged: RGB, mechanical keyboard, input, vulcan 120 aimo, roccat

Roccat's Vulcan 120 Aimo uses low profile Titan mechanical switches, which have a travel distance of 3.6mm and an actuation distance of 1.8mm, compared to 4mm and 2mm respectively for a similar Cherry MX switch.  The Tech Report also found the key spacing to be rather tight, in part due to the skirtless cap design, so this might be one you want to test drive before purchasing.  The included Swarm software lets you program keys in a variety of ways, including associating noises with certain key presses, while the Aimo RGB lighting offers some interesting effects which you might have strong feelings about, one way or the other.

topdown.png

"Roccat's Vulcan 120 Aimo keyboard cuts a striking profile with its skirtless key caps and in-house Titan switches. We put the Vulcan 120 Aimo to the test to see whether a new spin on mechanical key switches is enough to help it stand out in a crowded market."

Here is some more Tech News from around the web:

Tech Talk

Next season on The Jetsons, Professor X TOPS the list of special guests; and now a word from NVIDIA

Subject: General Tech | December 13, 2018 - 01:15 PM |
Tagged: nvidia, machine learning, jetson, AGX Xavier

NVIDIA claims their newly announced Jetson AGX Xavier SoC can provide up to 32 trillion operations per second for specific tasks, requiring a mere 10W to do so.  The chips are designed for image processing and recognition along with all those other 'puter learnin' things you would expect, and chances are a device will have several of these chips working in tandem, which offers a lot of processing power.  It is already being used for real-time monitoring of DNA sequencing and will be installed in car manufacturing lines in Japan.

The Inquirer points out that this performance comes at a cost, currently $1100 per unit as long as you are buying 1000 of them or more.
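Some quick back-of-the-envelope math on those quoted figures (claimed specs, not independent measurements):

tops = 32          # claimed peak operations per second, in trillions
watts = 10         # claimed power envelope for that figure
unit_price = 1100  # USD per unit, at 1,000-unit volume

print(f"{tops / watts:.1f} TOPS per watt")   # 3.2
print(f"${unit_price / tops:.2f} per TOPS")  # $34.38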

12-Jetson-family.jpg

"Essentially a data wrangling server plonked onto a silicon package, Jetson AGX Xavier is designed to handle all the tech and processing that autonomous things need to go about their robot lives, such as image processing and computer vision and the inference of deep learning algorithms."

Here is some more Tech News from around the web:

Tech Talk

Source: The Inquirer

AMD Radeon Software Adrenalin 2019 Edition Adds Mobile & VR Game Streaming, Performance Tuning

Subject: Graphics Cards | December 13, 2018 - 09:01 AM |
Tagged: Radeon Software Adrenalin Edition, radeon software, radeon, gpu, drivers, amd, Adrenalin Edition

AMD today released the latest major update to its Radeon software and driver suite. Building on the groundwork laid last year, AMD Radeon Software Adrenalin 2019 Edition brings a number of new features and performance improvements.

amd-radeon-adrenalin-2019-edition.jpg

With this year’s software update, AMD continues to make significant gains in game performance compared to last year’s driver release, with gains of up to 15 percent across a range of popular titles. Examples include Assassin’s Creed Odyssey (11%), Battlefield V (39%), and Shadow of the Tomb Raider (15%).

amd-adrenalin-2019-games.jpg

New Features

Beyond performance, Adrenalin 2019 Edition introduces a number of new and improved features. Highlights include:

Game Streaming: Radeon gamers can now stream any game or application from their PCs to their mobile devices via the AMD Link app at up to 4K 60fps. The feature supports both on-screen controls and Bluetooth controllers. ReLive streaming is also expanding to VR, with users able to stream games and videos from their PCs to standalone VR headsets via new AMD VR store apps. This includes Steam VR titles, allowing users to play high-quality PC-based VR games on select standalone headsets. AMD claims that its streaming technology offers “up to 44% faster responsiveness” than other game streaming solutions.

amd-adrenalin-2019-game-streaming.jpg

ReLive Streaming and Sharing: Gamers more interested in streaming their games to other people will find several new features in AMD’s ReLive feature, including adjustable picture-in-picture instant replays from 5 to 30 seconds, automatic GIF creation, and a new scene editor with more stream overlay options and hotkey-based scene transition control.

Radeon Game Advisor: A new overlay available in-game that helps users designate their target experience (performance vs. quality) and then recommends game-specific settings to achieve that target. Since the tool is running live alongside the game, it can respond to changes as they occur and dynamically recommend updated settings and options.

amd-adrenalin-2019-game-advisor.jpg

Radeon Settings Advisor: A new tool in the Radeon Software interface that scans system configuration and settings and recommends changes (e.g., enabling or disabling Radeon Chill, changing the display refresh rate, enabling HDR) to achieve an optimal gaming experience.

WattMan One-Click Tuning Improvements: Radeon WattMan now supports automatic tuning for memory overclocking and GPU undervolting, along with expanded fan control options and unlocked DPM states for RX Vega series cards.

Display Improvements: FreeSync 2 can now tone-map HDR content to look better on displays that don’t support the full color and contrast of the HDR spec, and AMD’s Virtual Super Resolution feature is now supported on ultra-wide displays.
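AMD hasn't published the math behind FreeSync 2's HDR tone mapping, but the general idea is standard: compress the content's luminance range into what the panel can actually display. A generic extended-Reinhard sketch with illustrative parameters (not AMD's actual curve):

import numpy as np

def tonemap_extended_reinhard(hdr_nits, content_peak=1000.0, display_peak=300.0):
    # Express luminance relative to the display's peak brightness.
    l = np.asarray(hdr_nits, dtype=np.float64) / display_peak
    l_white = content_peak / display_peak  # input level that maps to full white
    # Extended Reinhard curve: compresses highlights, preserves midtones.
    mapped = l * (1.0 + l / (l_white ** 2)) / (1.0 + l)
    return np.clip(mapped, 0.0, 1.0) * display_peak

# A 1000-nit highlight lands exactly at the panel's 300-nit peak,
# while midtones like 100 nits are compressed far less aggressively.
print(tonemap_extended_reinhard([100.0, 500.0, 1000.0]))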

amd-adrenalin-2019-freesync2-hdr.jpg

Radeon Overlay: AMD’s Overlay feature, which allows gamers to access certain Radeon features without leaving their game, has been updated to display system performance metrics, WattMan configuration options, Radeon Enhanced Sync controls, and the aforementioned Game Advisor.

amd-adrenalin-2019-amd-link-voice-control.jpg

AMD Link: AMD’s mobile companion app now offers easier setup via QR code scanning, voice control of various Radeon and ReLive settings (e.g., start/stop streaming, save replay, take screenshot), WattMan controls, enhanced performance metrics, and the ability to initiate a Radeon Software update.

Availability

Radeon Software Adrenalin 2019 Edition is available now from AMD’s support website for all supported AMD GPUs.

Source: AMD

Intel Plans Ghost Canyon X NUC With Discrete Graphics Support In 2020

Subject: General Tech, Systems, Mobile | December 13, 2018 - 01:02 AM |
Tagged: Intel

Intel is planning a new, larger (but still small) form factor NUC system dubbed Ghost Canyon X, slated for an early 2020 release according to a report by FanlessTech. Ghost Canyon X will feature a larger 5 liter chassis that will be able to accommodate a discrete graphics card along with both M.2 and SATA 3 storage.

Intel Ghost Canyon NUC 5L Mini PC Coffee Lake HR.png

The Ghost Canyon X NUC will be powered by 9th Generation Coffee Lake HR processors in i5 and i7 flavors. The chips have a 45W TDP and will come in quad core i5-9XXXH, six core i7, or eight core i7-9XXXH configurations (with Hyper-Threading), paired with two DDR4 DIMMs (up to 64GB of DDR4 2400 MHz or 32GB of DDR4 2666 MHz). Ghost Canyon X NUCs will have three HDMI 2.0 video outputs, two Thunderbolt 3 ports, and an SD card slot for external I/O (likely along with USB 3.1 and audio outputs, though those are not pictured). Internal storage includes up to three M.2 drives (two M.2 2242 80/110 and one 80mm) using PCI-E 3.0 x4 links, plus SATA 3 for standard hard drives and SATA SSDs. The biggest change to the NUC platform is the inclusion of a single PCI-E x16 slot which can be used to add a discrete graphics card to the system. While 5 liters is quite a jump up from the 0.7L of standard NUCs and the 1.2L of the Kaby Lake-G powered Hades Canyon gaming NUC, it is still a fairly small system, so not all graphics cards are going to fit, but enthusiasts should be able to use GPUs with shorter Mini-ITX designs easily enough.

FanlessTech notes that the reference Ghost Canyon X NUC will most likely be actively cooled, but third party fanless cases from makers like Akasa, Streacom, Tranquil PC, and others should be achievable with a 45W TDP CPU (and even the GPU if you go with a lower end model).

Further details are still unknown, and the pictured case design is subject to change as the system gets further along in the design process and closer to launch. Curiously, the expected early 2020 Ghost Canyon X launch would coincide with Intel’s plans to launch its own discrete graphics solution, so an Intel NUC with an Intel graphics card would be an interesting system to see!

Stay tuned for updated NUC information as we get closer to Computex 2019 and CES 2020!

Source: FanlessTech

NVIDIA Rumored To Launch RTX 2060 and RTX 2070 Max-Q Mobile GPUs

Subject: Graphics Cards, Mobile | December 12, 2018 - 10:04 PM |
Tagged: turing, rumor, RTX 2070, RTX 2060, nvidia

Rumors have appeared online suggesting NVIDIA may be launching mobile versions of its RTX 2070 and RTX 2060 GPUs based on its new Turing architecture. The new RTX 2070 and RTX 2060 with Max-Q designs were leaked by Twitter user TUM_APISAK, who posted cropped screenshots of Geekbench 4.3.1 and 3DMark 11 Performance results.

NVIDIA Max-Q.png

Allegedly handling the graphics duties in a Lenovo 81HE, the GeForce RTX 2070 with Max-Q Design (8GB VRAM), combined with a Core i7-8750H Coffee Lake six core CPU and 32 GB of system memory, managed a Geekbench 4.3.1 score of 223,753. The GPU supposedly has 36 Compute Units (CUs) and a core clockspeed of 1,300 MHz. The desktop RTX 2070, which is already available, also has 36 CUs with 2,304 CUDA cores, 144 texture units, 64 ROPs, 288 Tensor cores, and 36 RT (ray tracing) cores. The desktop GPU has a 175W reference (non-FE) TDP and clocks of 1410 MHz base and 1680 MHz boost (1710 MHz for the Founders Edition). Assuming that 36 CU number is accurate, the mobile part (RTX 2070M) may well have the same core counts, just running at lower clocks, which would be nice to see but would require a beefy mobile cooling solution.

As for the RTX 2060 with Max-Q Design, fewer specifications leaked: the information was limited to two screenshots, allegedly from Final Fantasy XV's benchmark results page, comparing a desktop RTX 2060 with a Max-Q RTX 2060. The number of CUs (and other figures like CUDA/Tensor/RT cores, TMUs, and ROPs) was not revealed in those screenshots, for example. The comparison does lend further credence to the rumors of the RTX 2060 using 6 GB of GDDR6 memory, though. Tom's Hardware does have a screenshot that shows the RTX 2060 with 30 CUs, which suggests 1,920 CUDA cores, 240 Tensor cores, and 30 RT cores, though with clocks up to 1.2 GHz (which does mesh well with previous rumors of the desktop part).

                  Desktop                       Max-Q
Graphics Card     Generic VGA                   Generic VGA
Memory            6144 MB                       6144 MB
Core clock        960 MHz                       975 MHz
Memory clock      1750 MHz                      1500 MHz
Driver name       NVIDIA GeForce RTX 2060       NVIDIA GeForce RTX 2060 with Max-Q Design
Driver version    25.21.14.1690                 25.21.14.1693

Also, the TU106 RTX 2060 with Max-Q Design reportedly has a 975 MHz core clock and a 1500 MHz (6 GHz effective) memory clock. Note that the desktop entry's 960 MHz core clock and 1750 MHz (7 GHz effective) memory clock don't match previous RTX 2060 rumors, which suggested higher GPU clocks in particular (up to 1.2 GHz). To be fair, it could just be the software reporting incorrect numbers since the GPUs are not yet official. One final bit of leaked information was a note about 3DMark 11 performance, with the RTX 2060 Max-Q Design GPU hitting at least 19,000 in the benchmark's Performance preset, which allegedly puts it between the scores of the mobile GTX 1070 and the mobile GTX 1070 Max-Q. (A graphics score between nineteen and twenty thousand would put it a bit above a desktop GTX 1060 but far below the desktop 1070.)
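For those keeping score, the core counts in these leaks all fall out of Turing's fixed per-SM ratios (the "CUs" reported by these tools map to SMs): each TU10x SM carries 64 CUDA cores, 8 Tensor cores, and 1 RT core. A quick sanity check of the figures above:

# Published per-SM resources for NVIDIA's TU10x (Turing) GPUs.
CUDA_PER_SM, TENSOR_PER_SM, RT_PER_SM = 64, 8, 1

def turing_core_counts(sm_count):
    return (sm_count * CUDA_PER_SM,
            sm_count * TENSOR_PER_SM,
            sm_count * RT_PER_SM)

# RTX 2070 (36 SMs) -> (2304, 288, 36), matching the desktop card's specs.
print(turing_core_counts(36))
# Rumored RTX 2060 (30 SMs) -> (1920, 240, 30), the figures cited above.
print(turing_core_counts(30))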

As usual, take these rumors and leaked screenshots with a healthy heaping of salt, but they are interesting nonetheless. Combined with the news about NVIDIA possibly announcing new mid-range GPUs at CES 2019, we may well see new laptops and other mobile graphics solutions shown off at CES and available within the first half of 2019, which would be quite the coup.

What are your thoughts on the rumored RTX 2060 for desktops and its mobile RTX 2060 and RTX 2070 Max-Q siblings?

Related reading:

Source: GND-Tech

Developers! Developers! Developers! ... might just prefer an Ubuntu powered Dell XPS laptop

Subject: Mobile | December 12, 2018 - 05:19 PM |
Tagged: dell, linux, ubuntu 18.04, XPS developer edition, Kaby Lake R

Dell have updated their Linux powered XPS Developer Edition laptop with a Kaby Lake R processor, up to a 2TB PCIe SSD, 4-16GB of RAM, and either a 1080p screen or a 4K touchscreen, depending on how much you are willing to pay.  Dell included all the latest features, including a pair of Thunderbolt 3 ports as well as a USB 3.1 Type-C port; there is even an SD card reader.

Apart from the webcam and the lack of older style USB ports, Ars Technica gives this new Linux powered laptop top marks.

dellxps-front.jpg

"Recently, Dell finally sent Ars the latest model of the XPS 13 DE for testing. And while Dell did put a lot of work into this latest iteration, the biggest upgrade with the latest Developer Edition is the inclusion of Ubuntu 18.04."

Here are some more Mobile articles from around the web:

More Mobile Articles

Source: Ars Technica

It's like Skyrim ... with guns ... in space ... with a Firefly meets Borderlands feel? The Outer Worlds teaser

Subject: General Tech | December 12, 2018 - 02:56 PM |
Tagged: obsidian, The Outer Worlds, gaming

The teaser trailer for The Outer Worlds certainly looks interesting, though one has to wonder if Obsidian may have tried to combine too many different styles into a single game.  On the other hand, they are responsible for the best of the first person Fallout games, so we can hold out some hope.  Even better is the news from Rock, Paper, SHOTGUN that even though Microsoft now owns Obsidian, the game will be released by 2K and will not be a Windows Store exclusive launch!

Head over to watch the teaser.

the-outer-worlds-b-1212x682.jpg

"Obsidian Entertainment, the studio behind RPGs from Alpha Protocol through Fallout: New Vegas to Pillars Of Eternity, tonight announced The Outer Worlds, a new singleplayer first-person RPG with a space-western twang."

Here is some more Tech News from around the web:

Tech Talk


A break from your regular Intel briefing

Subject: General Tech | December 12, 2018 - 12:37 PM |
Tagged: RTX 2060, nvidia, navi, amd

The majority of today's news covers Intel's wide range of announcements from their architecture day, ranging from new Optane DIMMs seeking to reduce latency close to that of DRAM, to Foveros chiplets, to hints of coming in off the Lake to spend some time in a Sunny Cove.  Indeed, there are more links below the fold offering further coverage, as yesterday's announcements were very dense.

That might overshadow a rumour in which dedicated discrete GPU lovers would be interested: NVIDIA might be able to get the RTX 2060 to market before AMD can launch a Navi based card.  The Inquirer has seen rumours that NVIDIA might be able to release the card in the first half of 2019, while the 7nm Navi isn't expected until the second half of the year.  The early supply of mid-range NVIDIA GPUs might attract buyers who no longer want to wait; though depending on how Navi performs, they could come to regret that lack of patience.


"GRAPHICS CARDS IN 2019 are set to get a good bit more interesting, as a leak suggests that Nvidia's GeForce RTX 2060 could reach the market before AMD's next-gen Navi Radeon cards."

Here is some more Tech News from around the web:

Tech Talk


Source: The Inquirer

Intel's Optane DC Persistent Memory DIMMs Push Latency Closer to DRAM

Subject: Storage | December 12, 2018 - 09:17 AM |
Tagged: ssd, Optane, Intel, DIMM, 3D XPoint

Intel's architecture day press release contains the following storage goodness mixed within all of the talk about 3D chip packaging:

Memory and Storage: Intel discussed updates on Intel® Optane™ technology and the products based upon that technology. Intel® Optane™ DC persistent memory is a new product that converges memory-like performance with the data persistence and large capacity of storage. The revolutionary technology brings more data closer to the CPU for faster processing of bigger data sets like those used in AI and large databases. Its large capacity and data persistence reduces the need to make time-consuming trips to storage, which can improve workload performance. Intel Optane DC persistent memory delivers cache line (64B) reads to the CPU. On average, the idle read latency with Optane persistent memory is expected to be about 350 nanoseconds when applications direct the read operation to Optane persistent memory, or when the requested data is not cached in DRAM. For scale, an Optane DC SSD has an average idle read latency of about 10,000 nanoseconds (10 microseconds), a remarkable improvement. In cases where requested data is in DRAM, either cached by the CPU’s memory controller or directed by the application, memory sub-system responsiveness is expected to be identical to DRAM (<100 nanoseconds).
 
The company also showed how SSDs based on Intel’s 1 Terabit QLC NAND die move more bulk data from HDDs to SSDs, allowing faster access to that data.

Did you catch that? 3D XPoint memory in a DIMM form factor is expected to have an access latency of 350 nanoseconds! That's down from the 10 microseconds of PCIe-based Optane products like Optane Memory and the P4800X. I realize those are just numbers, and a nearly 30x latency improvement may be easier to appreciate visually, so here:

gap-5.png

Above is an edit to my Bridging the Gap chart from the P4800X review, showing where this new tech would fall in purple. That's all we have to go on for now, but these are certainly exciting times. Consider that non-volatile storage latencies have improved by nearly 100,000x over the last decade, and are now within striking distance (less than 10x) of DRAM! Before you get too excited, realize that Optane DIMMs will be showing up in enterprise servers first, as they require specialized configurations to treat DIMM slots as persistent storage instead of DRAM. That said, I'm sure the tech will eventually trickle down to desktops in some form or fashion. If you're hungry for more details on what makes 3D XPoint tick, check out how 3D XPoint works in my prior article.
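As an aside, "treating DIMM slots as persistent storage" means applications will see Optane DIMMs as memory-mapped regions rather than block devices. Here's a minimal sketch of that access model using plain mmap against a hypothetical DAX-mounted file (real software would use Intel's PMDK library, which issues the proper cache-line flushes):

import mmap
import os
import struct

PMEM_FILE = "/mnt/pmem/example.bin"  # hypothetical DAX-mounted pmem path

fd = os.open(PMEM_FILE, os.O_CREAT | os.O_RDWR, 0o600)
os.ftruncate(fd, 4096)
buf = mmap.mmap(fd, 4096, mmap.MAP_SHARED)

# On a DAX mount, these loads and stores bypass the page cache and hit
# the 3D XPoint media directly - no block I/O, no 10 microsecond round trip.
buf[0:8] = struct.pack("<Q", 42)
buf.flush()  # msync here; PMDK would use CLWB + SFENCE instead

(value,) = struct.unpack("<Q", buf[0:8])
buf.close()
os.close(fd)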


Intel Unveils Next-Gen Sunny Cove CPU, Gen11 Graphics, and 3D Stacking Technology

Subject: Processors | December 12, 2018 - 09:00 AM |
Tagged: xeon, Sunny Cove, processor, intel core, Intel, integrated graphics, iGPU, Foveros, cpu, 3D stacking

Intel’s Architecture Day was held yesterday and brought announcements of three new technologies. Intel shared details of a new 3D stacking technology for logic chips, a brand new CPU architecture for desktop and server, and some surprising developments on the iGPU front. Oh, and they mentioned that whole discrete GPU thing…

3D Stacking for Logic Chips

First we have Foveros, a new 3D packaging technology that follows Intel’s previous EMIB (Embedded Multi-die Interconnect Bridge) 2D packaging technology and enables die-stacking of high-performance logic chips for the first time.

2d-and-3d-packaging-drive-new-design-flexibility.jpg

“Foveros paves the way for devices and systems combining high-performance, high-density and low-power silicon process technologies. Foveros is expected to extend die stacking beyond traditional passive interposers and stacked memory to high-performance logic, such as CPU, graphics and AI processors for the first time.”

Foveros will allow for a new “chiplet” paradigm, as “I/O, SRAM, and power delivery circuits can be fabricated in a base die and high-performance logic chiplets are stacked on top”. This new approach would permit design elements to be “mixed and matched”, and allow new device form-factors to be realized as products can be broken up into these smaller chiplets.

3d-packaging-a-catalyst-for-product-innovation.jpg

The first range of products using this technology are expected to launch in the second half of 2019, beginning with a product that Intel states “will combine a high-performance 10nm compute-stacked chiplet with a low-power 22FFL base die,” which Intel says “will enable the combination of world-class performance and power efficiency in a small form factor”.

Intel Sunny Cove Processors - Coming Late 2019

Next up is the announcement of a brand new CPU architecture with Sunny Cove, which will be the basis of Intel’s next generation Core and Xeon processors in 2019. No mention of 10nm was made, so it is unclear if Intel’s planned transition from 14nm is happening with this launch (the last Xeon roadmap showed a 10 nm transition with "Ice Lake" in 2020).

Intel_CPUs.jpg

Intel states that Sunny Cove is “designed to increase performance per clock and power efficiency for general purpose computing tasks” with new features included “to accelerate special purpose computing tasks like AI and cryptography”.

Intel provided this list of Sunny Cove’s features:

  • Enhanced microarchitecture to execute more operations in parallel.
  • New algorithms to reduce latency.
  • Increased size of key buffers and caches to optimize data-centric workloads.
  • Architectural extensions for specific use cases and algorithms. For example, new performance-boosting instructions for cryptography, such as vector AES and SHA-NI, and other critical use cases like compression and decompression.
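Applications rarely issue instructions like vector AES directly; crypto libraries probe the CPU at runtime and dispatch to whatever AES hardware is present. The upshot is that unchanged high-level code, like this sketch using Python's cryptography package, simply gets faster on silicon that adds these instructions:

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# OpenSSL (which backs this package) dispatches to AES-NI or vector AES
# at runtime when the CPU advertises support - no code changes needed.
key = AESGCM.generate_key(bit_length=256)
cipher = AESGCM(key)
nonce = os.urandom(12)

ciphertext = cipher.encrypt(nonce, b"data-centric workload", None)
assert cipher.decrypt(nonce, ciphertext, None) == b"data-centric workload"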

Integrated Graphics with 2x Performance

Gen11_Pipeline.png

Intel slide image via ComputerBase

Intel did reveal next-gen graphics at the event, though in the form of a new generation of the company’s integrated graphics. The update is nonetheless significant, with the upcoming Gen11 integrated GPU “expected to double the computing performance-per-clock compared to Intel Gen9 graphics” thanks to a huge increase in Execution Units, from 24 EUs with Gen9 to 64 EUs with Gen11. This will provide “>1 TFLOPS performance capability”, according to Intel, who states that the new Gen11 graphics are also expected to feature advanced media encode/decode, supporting “4K video streams and 8K content creation in constrained power envelopes”.
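That “>1 TFLOPS” claim falls straight out of the EU count if Gen11 EUs retain Gen9's throughput of 16 FP32 FLOPS per clock (two SIMD-4 pipes, each counting a fused multiply-add as two operations); the clock speed below is an assumed round number rather than an announced spec:

eus = 64
flops_per_clock_per_eu = 2 * 4 * 2  # 2 pipes x SIMD-4 x 2 ops for FMA = 16
clock_ghz = 1.0                     # assumed; actual iGPU clocks vary by SKU

gflops = eus * flops_per_clock_per_eu * clock_ghz
print(f"{gflops / 1000:.2f} TFLOPS")  # 1.02 at 1 GHz, >1 TFLOPS as claimed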

And finally, though hardly a footnote, the new Gen11 graphics will feature Intel Adaptive Sync technology, which was a rumored feature of upcoming discrete GPU products from Intel.

Discrete GPUs?

And now for that little part about discrete graphics: At the event Intel simply “reaffirmed its plan to introduce a discrete graphics processor by 2020”. Nothing new here, and this obviously means that we won’t be seeing a new discrete GPU from Intel in 2019 - though the beefed-up Gen11 graphics should provide a much needed boost to Intel’s graphics offering when Sunny Cove launches “late next year”.

Source: Intel

MSI's new Z390; Ace in the hole or jumping the shark?

Subject: Motherboards | December 11, 2018 - 06:19 PM |
Tagged: msi, Z390, Intel, MEG Z390 ACE

MSI's MEG was the cream of the crop for Threadripper, even though it carried a significant price.  Now we have a chance to see how this design works on Intel, as MSI have the MEG Z390 ACE for under $300, to pair with a processor such as the i7-9900K.  The MEG sports an enhanced backplate, as you can see from the picture below, for those who like to insert a lot of extras into their motherboard.

As for general performance, stability and overclocking?  Check out [H]ard|OCP's review to see why the board was sporting Gold once it was unstrapped from the bench.


"The MSI Enthusiast Gaming lineup expands once again with two Z390 offerings for Intel’s latest 9000 series CPUs. The MEG boards offer a blend of quality, features, with power delivery, and overclocking in mind. MSI has certainly raised the bar for its products over the last few years. So our expectations for the ACE motherboard are high."

Here are some more Motherboard articles from around the web:

Motherboards

Source: [H]ard|OCP

Samsung's new Supremely Suspicious Deal

Subject: General Tech | December 11, 2018 - 01:10 PM |
Tagged: supreme, oops, Samsung

It will be a surprise to many that Supreme is a skateboard fashion brand; even more surprised was Supreme when Samsung announced they were forming some sort of partnership with the company.  It seems that a knock-off version of the New York based provider of duds for skaters exists in Italy, thanks to a less than effective trademark, and that company not only convinced Samsung they were the real deal but also that it would benefit Samsung to partner with them to host a big fashion show in Beijing.

Samsung is rather embarrassed about the whole thing, so don't taunt them too much.  Pop by Ars Technica for a bit of a lesson on why you should double check anything a skater tells you is true!


"Supreme is not working with Samsung, opening a flagship location in Beijing or participating in a Mercedes-Benz runway show. These claims are blatantly false and propagated by a counterfeit organization."

Here is some more Tech News from around the web:

Tech Talk


Source: Ars Technica
Author:
Manufacturer: AMD

Vega meets Radeon Pro

Professional graphics cards are a segment of the industry that can look strange to gamers and PC enthusiasts. From the outside, it appears that businesses are paying more for almost identical hardware when compared to their gaming counterparts from both NVIDIA and AMD. 

However, a lot goes into a professional-level graphics card that makes all the difference to the consumers they are targeting. From the addition of ECC memory to protect against data corruption, all the way to a completely different driver stack with specific optimizations for professional applications, there's a lot of work put into these particular products.

The professional graphics market has gotten particularly interesting in the last few years with the rise of the NVIDIA TITAN-level GPUs and "Frontier Edition" graphics cards from AMD. While lacking ECC memory, these new GPUs have brought over some of the application level optimizations, while providing a lower price for more hobbyist level consumers.

However, if you're a professional that depends on a graphics card for mission-critical work, these options are no replacement for the real thing.

Today we're looking at one of AMD's latest Pro graphics offerings, the AMD Radeon Pro WX 8200. 


Click here to continue reading our review of the AMD Radeon Pro WX 8200.

That's no Zune, it's a FiiO M7

Subject: General Tech | December 10, 2018 - 03:51 PM |
Tagged: audio, FiiO, m7, Exynos 7270, Sabre 9018Q2C, DAC

There are those for whom the idea of listening to audio via a phone is painful to contemplate, as the lack of a dedicated high fidelity DAC will ruin the experience.  They will quite happily drop $200 on something like the FiiO M7 and consider it a bargain.  The device is also interesting technically, running an Exynos processor alongside its dedicated DAC, which makes it worth a look for non-audiophiles as well.  Check out Nikktech for a look at the interface, hardware, and audio quality if you are curious.

It also has an FM receiver!

fiio_m7_hd_music_player_review_9.jpg

"It may not be the flagship music player in the entire High-Resolution lineup by FiiO but thanks to its Exynos 7270 Processor and the Sabre 9018Q2C DAC/Amp the M7 should have no problem satisfying even the most demanding audiophiles."

Here is some more Tech News from around the web:

Audio Corner


Source: Nikktech

Report: New AMD Trademark Shows Possible 7nm Vega Logo

Subject: Graphics Cards | December 10, 2018 - 03:28 PM |
Tagged: Vega, trademark, rumor, report, radeon, graphics, gpu, amd, 7nm

News of a new logo trademark from AMD is making the rounds, with VideoCardz.com spotting the image via Twitter user "BoMbY". Time to speculate!

Vega_II_Logo.jpg

AMD trademark image posted by Twitter user BoMbY (via VideoCardz.com)

The logo, with the familiar "V" joined by a couple of new stripes on the right side, could mean a couple of things: it might reference Vega II (2), or perhaps the VII suggests the Roman numeral 7, for 7nm, instead. VideoCardz.com thinks the latter may be the case:

"AMD has registered a new trademark just 2 weeks ago. Despite many rumors floating around about Navi architecture and its possible early reveal or announcement in January, it seems that AMD is not yet done with Vega. The Radeon Vega logo, which features the distinctive V lettering, has now received 2 stripes, to indicate the 7nm die shrink."

Whatever the case may be, it's interesting to consider the possibility of a 7nm Vega GPU before we see Navi. We really don't know, though it does seem a bit presumptuous to expect a new product as early as CES, as Tech Radar speculates:

"We know full well that the next generation of AMD graphics will be built upon a 7nm architecture going by the roadmaps the company released at CES 2018. At the same time, it seems to all sync up with AMD's plans to announce new 7nm GPUs at CES 2019, so it almost seems certain that we’ll see Vega II graphics cards soon."

The prospect of new graphics cards is always tantalizing, but we'll need more than a logo before things really get interesting.

Source: VideoCardz

Just Cause it's new is no excuse Four this performance

Subject: General Tech | December 10, 2018 - 01:42 PM |
Tagged: just cause 4, gaming, benchmarks, 4k, 1440p, 1080p

One of the best pieces of stress relief software* just got a major update, and TechSpot has discovered it may actually cause more stress than it relieves.  The focus of their article is on performance, but before offering a hint at what to expect, it is worth noting they found Just Cause 4 to be a downgrade from the previous release, with many of the graphics similar or lower in quality than the previous game, and at a much higher performance cost.

If you have anything below a GTX 1080 or Vega 64, you will struggle to maintain 60fps on very high quality at 1080p; with one of those cards you might be able to scrape by at 1440p, but smooth 4K is beyond even an RTX 2080.  Since the game itself, apart from some of the detailed scenery, doesn't seem that much different from the previous title, it will be interesting to see if the reported performance issues lessen over time.

*There is a game included as well.


"Today we’re benchmarking Just Cause 4 with a boatload of different GPUs to help you determine if your graphics card will handle this brand new title, and if need be, work out a suitable upgrade option."

Here are some more Graphics Card articles from around the web:

Graphics Cards


Source: TechSpot

Out on a branch, speculating about possible architectural flaws

Subject: General Tech | December 10, 2018 - 12:38 PM |
Tagged: spectre, splitspectre, speculator, security, arm, Intel, amd

The discovery of yet another variant of the Spectre vulnerability is not good news for already exhausted security experts or reporters, but there is something new in this story which offers a glimmer of hope.  A collaborative team of researchers from Northeastern University and IBM found this newest design flaw using an automatic bug-finding tool they designed, called Speculator.

They designed the tool to get around the largest hurdle security researchers face: the secrecy of AMD, Intel, and ARM, who are trying to keep the recipes for their special sauce secret, and rightly so.  Protecting their intellectual property is paramount to their stockholders, and there are arguments about the possible effectiveness of security through obscurity in protecting consumers from those with nefarious intent, but it does come at a cost for those hunting bugs for good.

Pop by The Register for details on how Speculator works.


"SplitSpectre is a proof-of-concept built from Speculator, the team's automated CPU bug-discovery tool, which the group plans to release as open-source software."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register