
NVIDIA Rumored To Launch RTX 2060 and RTX 2070 Max-Q Mobile GPUs

Subject: Graphics Cards, Mobile | December 12, 2018 - 10:04 PM |
Tagged: turing, rumor, RTX 2070, RTX 2060, nvidia

Rumors have appeared online suggesting that NVIDIA may be launching mobile versions of its RTX 2070 and RTX 2060 GPUs based on its new Turing architecture. The new RTX 2070 and RTX 2060 with Max-Q designs were leaked by Twitter user TUM_APISAK, who posted cropped screenshots of Geekbench 4.3.1 and 3DMark 11 Performance results.

NVIDIA Max-Q.png

Allegedly handling the graphics duties in a Lenovo 81HE, the GeForce RTX 2070 with Max-Q Design (8GB VRAM), combined with a six-core Core i7-8750H Coffee Lake CPU and 32 GB of system memory, managed a Geekbench 4.3.1 score of 223,753. The GPU supposedly has 36 Compute Units (CUs) and a core clock speed of 1,300 MHz. The already-available desktop RTX 2070 also has 36 CUs, with 2,304 CUDA cores, 144 texture units, 64 ROPs, 288 Tensor cores, and 36 RT (ray tracing) cores. The desktop GPU has a 175W reference (non-FE) TDP and clocks of 1410 MHz base and 1680 MHz boost (1710 MHz for the Founders Edition). Assuming that 36 CU number is accurate, the mobile part (call it an RTX 2070M) may well have the same core counts, just running at lower clocks. That would be nice to see, but it would require a beefy mobile cooling solution.
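For those curious about the arithmetic, these unit counts follow directly from the SM (CU) count under Turing's published per-SM layout. A minimal sketch in Python, assuming 64 CUDA cores, 8 Tensor cores, 1 RT core, and 4 texture units per SM; it also covers the 30 CU RTX 2060 figure discussed below:

    # Turing's published per-SM resources; ROP count depends on the memory
    # configuration, so it is not derived here.
    TURING_PER_SM = {"cuda_cores": 64, "tensor_cores": 8, "rt_cores": 1, "tmus": 4}

    def turing_counts(sm_count):
        """Scale Turing's per-SM resources by the reported SM (CU) count."""
        return {unit: n * sm_count for unit, n in TURING_PER_SM.items()}

    print(turing_counts(36))  # RTX 2070: 2304 CUDA, 288 Tensor, 36 RT, 144 TMUs
    print(turing_counts(30))  # rumored RTX 2060: 1920 CUDA, 240 Tensor, 30 RT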

As for the RTX 2060 Max-Q Design graphics processor, fewer specifications were leaked, as the leak was limited to two screenshots, allegedly from Final Fantasy XV's benchmark results page, comparing a desktop RTX 2060 with a Max-Q RTX 2060. The screenshots did not reveal the number of CUs (or other figures like CUDA/Tensor/RT core counts, TMUs, and ROPs), but the comparison does lend further credence to the rumors of the RTX 2060 utilizing 6 GB of GDDR6 memory. Tom's Hardware does have a screenshot showing the RTX 2060 with 30 CUs, which suggests 1,920 CUDA cores, 240 Tensor cores, and 30 RT cores, with clocks up to 1.2 GHz (which does mesh well with previous rumors of the desktop part).

Graphics Card     Generic VGA                  Generic VGA
Memory            6144 MB                      6144 MB
Core Clock        960 MHz                      975 MHz
Memory Clock      1750 MHz                     1500 MHz
Driver Name       NVIDIA GeForce RTX 2060      NVIDIA GeForce RTX 2060 with Max-Q Design
Driver Version    25.21.14.1690                25.21.14.1693

Also, per the leaked screenshots, the TU106-based RTX 2060 with Max-Q Design reportedly has a 975 MHz core clock and a 1500 MHz (6 GHz) memory clock. Note that the desktop card's 960 MHz core clock and 1750 MHz (7 GHz) memory clock don't match previous RTX 2060 rumors, which suggested higher GPU clocks in particular (up to 1.2 GHz). To be fair, the software could simply be reporting incorrect numbers since the GPUs are not yet official. One final bit of leaked information was a note about 3DMark 11 performance, with the RTX 2060 Max-Q Design GPU hitting at least 19,000 in the benchmark's Performance preset, which allegedly puts it in between the scores of the mobile GTX 1070 and the mobile GTX 1070 Max-Q. (A graphics score between nineteen and twenty thousand would put it a bit above a desktop GTX 1060 but far below the desktop GTX 1070.)
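A quick note on the clock figures, since the screenshots and the rumors quote them differently: benchmark tools tend to report the memory command clock, while rumors quote the quad-pumped effective rate. A small Python sketch of the conversion the parentheticals above are doing:

    def effective_ghz(reported_mhz):
        # GDDR memory transfers data at four times the reported command clock
        return reported_mhz * 4 / 1000

    print(effective_ghz(1750))  # 7.0 GHz effective (desktop RTX 2060 screenshot)
    print(effective_ghz(1500))  # 6.0 GHz effective (RTX 2060 Max-Q screenshot)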

As usual, take these rumors and leaked screenshots with a healthy heaping of salt, but they are interesting nonetheless. Combined with the news that NVIDIA may announce new mid-range GPUs at CES 2019, we may well see new laptops and other mobile graphics solutions shown off at CES and available within the first half of 2019, which would be quite the coup.

What are your thoughts on the rumored RTX 2060 for desktops and its mobile RTX 2060 and RTX 2070 Max-Q siblings?

Related reading:

Source: GND-Tech

Developers! Developers! Developers! ... might just prefer an Ubuntu powered Dell XPS laptop

Subject: Mobile | December 12, 2018 - 05:19 PM |
Tagged: dell, linux, ubuntu 18.04, XPS developer edition, Kaby Lake R

Dell have updated their Linux-powered XPS Developer Edition laptop with a Kaby Lake R processor, up to a 2TB PCIe SSD, 4-16GB of RAM, and either a 1080p screen or a 4K touchscreen depending on how much you are willing to pay.  Dell included all the latest features, including a pair of Thunderbolt 3 ports as well as a Type-C 3.1 port; there is even an SD card reader.

Apart from the webcam and the lack of older-style USB ports, Ars Technica gives this new Linux-powered laptop top marks.

dellxps-front.jpg

"Recently, Dell finally sent Ars the latest model of the XPS 13 DE for testing. And while Dell did put a lot of work into this latest iteration, the biggest upgrade with the latest Developer Edition is the inclusion of Ubuntu 18.04."

Here are some more Mobile articles from around the web:

More Mobile Articles

Source: Ars Technica

It's like Skyrim ... with guns ... in space ... with a Firefly meets Borderlands feel? The Outer Worlds teaser

Subject: General Tech | December 12, 2018 - 02:56 PM |
Tagged: obsidian, The Outer Worlds, gaming

The teaser trailer for The Outer Worlds certainly looks interesting, though one has to wonder if Obsidian may have tried to combine too many different styles into a single game.  On the other hand, they are responsible for the best of the first-person Fallout games, so we can hold out some hope.  Even better is the news from Rock, Paper, SHOTGUN that even though Microsoft now owns Obsidian, the game will be published by 2K and will not be a Windows Store exclusive!

Head over to watch the teaser.

the-outer-worlds-b-1212x682.jpg

"Obsidian Entertainment, the studio behind RPGs from Alpha Protocol through Fallout: New Vegas to Pillars Of Eternity, tonight announced The Outer Worlds, a new singleplayer first-person RPG with a space-western twang."

Here is some more Tech News from around the web:

Tech Talk

 

A break from your regular Intel briefing

Subject: General Tech | December 12, 2018 - 12:37 PM |
Tagged: RTX 2060, nvidia, navi, amd

The majority of today's news covers Intel's wide range of announcements from their architecture day, from new Optane DIMMs seeking to push latency close to that of DRAM, to Foveros chiplets, to hints of coming in off the Lake to spend some time in a Sunny Cove.  Indeed, there are more links below the fold offering further coverage, as yesterday's announcements were very dense.

That might overshadow a rumour that lovers of dedicated discrete GPUs will find interesting: NVIDIA might be able to get the RTX 2060 to market before AMD can launch a Navi-based card.  The Inquirer has seen rumours that NVIDIA could release the card in the first half of 2019, while the 7nm Navi isn't expected until the second half of the year.  An early supply of mid-range NVIDIA GPUs might attract buyers who no longer want to wait, though depending on how Navi performs, they could come to regret that lack of patience.

IMG_0401.JPG

"GRAPHICS CARDS IN 2019 are set to get a good bit more interesting, as a leak suggests that Nvidia's GeForce RTX 2060 could reach the market before AMD's next-gen Navi Radeon cards."

Here is some more Tech News from around the web:

Tech Talk

 

Source: The Inquirer

Intel's Optane DC Persistent Memory DIMMs Push Latency Closer to DRAM

Subject: Storage | December 12, 2018 - 09:17 AM |
Tagged: ssd, Optane, Intel, DIMM, 3D XPoint

Intel's architecture day press release contains the following storage goodness mixed within all of the talk about 3D chip packaging:

Memory and Storage: Intel discussed updates on Intel® Optane™ technology and the products based upon that technology. Intel® Optane™ DC persistent memory is a new product that converges memory-like performance with the data persistence and large capacity of storage. The revolutionary technology brings more data closer to the CPU for faster processing of bigger data sets like those used in AI and large databases. Its large capacity and data persistence reduces the need to make time-consuming trips to storage, which can improve workload performance. Intel Optane DC persistent memory delivers cache line (64B) reads to the CPU. On average, the average idle read latency with Optane persistent memory is expected to be about 350 nanoseconds when applications direct the read operation to Optane persistent memory, or when the requested data is not cached in DRAM. For scale, an Optane DC SSD has an average idle read latency of about 10,000 nanoseconds (10 microseconds), a remarkable improvement. In cases where requested data is in DRAM, either cached by the CPU’s memory controller or directed by the application, memory sub-system responsiveness is expected to be identical to DRAM (<100 nanoseconds).
 
The company also showed how SSDs based on Intel’s 1 Terabit QLC NAND die move more bulk data from HDDs to SSDs, allowing faster access to that data.

Did you catch that? 3D XPoint memory in DIMM form factor is expected to have an access latency of 350 nanoseconds! That's down from 10 microseconds of the PCIe-based Optane products like Optane Memory and the P4800X. I realize those are just numbers, and showing a nearly 30x latency improvement may be easier visually, so here:

gap-5.png

Above is an edit to my Bridging the Gap chart from the P4800X review, showing where this new tech would fall in purple. That's all we have to go on for now, but these are certainly exciting times. Consider that non-volatile storage latencies have improved by nearly 100,000x over the last decade, and are now within striking distance (less than 10x) of DRAM! Before you get too excited, realize that Optane DIMMs will be showing up in enterprise servers first, as they require specialized configurations to treat DIMM slots as persistent storage instead of DRAM. That said, I'm sure the tech will eventually trickle down to desktops in some form or fashion. If you're hungry for more details on what makes 3D XPoint tick, check out how 3D XPoint works in my prior article.
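If you prefer the arithmetic to the chart, here is a quick Python sketch of the ratios implied by Intel's figures (using 100 ns as the DRAM value, since Intel only says "<100 nanoseconds"):

    latencies_ns = {
        "DRAM": 100,                # upper bound; Intel quotes "<100 ns"
        "Optane DIMM": 350,
        "Optane PCIe SSD": 10_000,
    }

    dimm = latencies_ns["Optane DIMM"]
    print(latencies_ns["Optane PCIe SSD"] / dimm)   # ~28.6x faster than PCIe Optane
    print(dimm / latencies_ns["DRAM"])              # 3.5x slower than DRAM, at worst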

180808-191612.jpg

Intel Unveils Next-Gen Sunny Cove CPU, Gen11 Graphics, and 3D Stacking Technology

Subject: Processors | December 12, 2018 - 09:00 AM |
Tagged: xeon, Sunny Cove, processor, intel core, Intel, integrated graphics, iGPU, Foveros, cpu, 3D stacking

Intel’s Architecture Day was held yesterday and brought announcements of three new technologies. Intel shared details of a new 3D stacking technology for logic chips, a brand new CPU architecture for desktop and server, and some surprising developments on the iGPU front. Oh, and they mentioned that whole discrete GPU thing…

3D Stacking for Logic Chips

First we have Foveros, a new 3D packaging technology that follows Intel’s previous EMIB (Embedded Multi-die Interconnect Bridge) 2D packaging technology and enables die-stacking of high-performance logic chips for the first time.

2d-and-3d-packaging-drive-new-design-flexibility.jpg

“Foveros paves the way for devices and systems combining high-performance, high-density and low-power silicon process technologies. Foveros is expected to extend die stacking beyond traditional passive interposers and stacked memory to high-performance logic, such as CPU, graphics and AI processors for the first time.”

Foveros will allow for a new “chiplet” paradigm, as “I/O, SRAM, and power delivery circuits can be fabricated in a base die and high-performance logic chiplets are stacked on top”. This new approach would permit design elements to be “mixed and matched”, and allow new device form-factors to be realized as products can be broken up into these smaller chiplets.

3d-packaging-a-catalyst-for-product-innovation.jpg

The first range of products using this technology are expected to launch in the second half of 2019, beginning with a product that Intel states “will combine a high-performance 10nm compute-stacked chiplet with a low-power 22FFL base die,” which Intel says “will enable the combination of world-class performance and power efficiency in a small form factor”.

Intel Sunny Cove Processors - Coming Late 2019

Next up is the announcement of a brand new CPU architecture with Sunny Cove, which will be the basis of Intel’s next generation Core and Xeon processors in 2019. No mention of 10nm was made, so it is unclear if Intel’s planned transition from 14nm is happening with this launch (the last Xeon roadmap showed a 10 nm transition with "Ice Lake" in 2020).

Intel_CPUs.jpg

Intel states that Sunny Cove is “designed to increase performance per clock and power efficiency for general purpose computing tasks” with new features included “to accelerate special purpose computing tasks like AI and cryptography”.

Intel provided this list of Sunny Cove’s features:

  • Enhanced microarchitecture to execute more operations in parallel.
  • New algorithms to reduce latency.
  • Increased size of key buffers and caches to optimize data-centric workloads.
  • Architectural extensions for specific use cases and algorithms. For example, new performance-boosting instructions for cryptography, such as vector AES and SHA-NI, and other critical use cases like compression and decompression.
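Once Sunny Cove parts actually ship, those cryptography extensions should be visible as CPU feature flags. As a rough illustration only: on current Linux kernels, vector AES and the SHA extensions are reported as "vaes" and "sha_ni" in /proc/cpuinfo, and a check might look like the sketch below (the flag names are an assumption carried over from existing kernels, not anything Intel has confirmed for Sunny Cove):

    # Hypothetical feature check on Linux; "vaes" and "sha_ni" follow current
    # kernel flag naming and are assumptions for future Sunny Cove parts.
    def cpu_flags():
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    return set(line.split(":", 1)[1].split())
        return set()

    flags = cpu_flags()
    for feature in ("vaes", "sha_ni"):
        print(feature, "supported" if feature in flags else "not reported")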

Integrated Graphics with 2x Performance

Gen11_Pipeline.png

Intel slide image via ComputerBase

Intel did reveal next-gen graphics at the event, though in this case it was a new generation of the company’s integrated graphics. The update is nonetheless significant, with the upcoming Gen11 integrated GPU “expected to double the computing performance-per-clock compared to Intel Gen9 graphics” thanks to a huge increase in Execution Units, from 24 EUs with Gen9 to 64 EUs with Gen11. This will provide “>1 TFLOPS performance capability”, according to Intel, who states that the new Gen11 graphics are also expected to feature advanced media encode/decode, supporting “4K video streams and 8K content creation in constrained power envelopes”.
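The ">1 TFLOPS" claim lines up with the EU count if you assume the usual Gen EU layout of two SIMD-4 FP32 pipes with FMA support, i.e. 16 FLOPS per EU per clock. A back-of-the-envelope check in Python, with the 1.0 GHz clock being our assumption for illustration rather than anything Intel stated:

    FLOPS_PER_EU_PER_CLOCK = 2 * 4 * 2  # 2 FPUs x SIMD-4 x FMA (multiply + add)

    def peak_gflops(eus, clock_ghz):
        return eus * FLOPS_PER_EU_PER_CLOCK * clock_ghz

    print(peak_gflops(24, 1.0))  # Gen9 GT2:  384 GFLOPS at an assumed 1.0 GHz
    print(peak_gflops(64, 1.0))  # Gen11:    1024 GFLOPS, i.e. ">1 TFLOPS"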

And finally, though hardly a footnote, the new Gen11 graphics will feature Intel Adaptive Sync technology, which was a rumored feature of upcoming discrete GPU products from Intel.

Discrete GPUs?

And now for that little part about discrete graphics: at the event Intel simply “reaffirmed its plan to introduce a discrete graphics processor by 2020”. Nothing new here, and this obviously means that we won’t be seeing a new discrete GPU from Intel in 2019 - though the beefed-up Gen11 graphics should provide a much-needed boost to Intel’s graphics offering when Sunny Cove launches “late next year”.

Source: Intel

MSI's new Z390; Ace in the hole or jumping the shark?

Subject: Motherboards | December 11, 2018 - 06:19 PM |
Tagged: msi, Z390, Intel, MEG Z390 ACE

MSI's MEG was the cream of the crop for Threadripper, even though it carried a significant price.  Now we have a chance to see how this design works on Intel, as MSI have the MEG Z390 ACE for under $300, to pair with a processor such as the i9-9900K.  MEG sports an enhanced backplate, as you can see from the picture below, for those who like to insert a lot of extras into their motherboard.

As for general performance, stability and overclocking?  Check out [H]ard|OCP's review to see why the board was sporting Gold once it was unstrapped from the bench.

1543994662wd9p8ixg3i_1_8_l.jpg

"The MSI Enthusiast Gaming lineup expands once again with two Z390 offerings for Intel’s latest 9000 series CPUs. The MEG boards offer a blend of quality, features, with power delivery, and overclocking in mind. MSI has certainly raised the bar for its products over the last few years. So our expectations for the ACE motherboard are high."

Here are some more Motherboard articles from around the web:

Motherboards

Source: [H]ard|OCP

Samsung's new Supremely Suspicious Deal

Subject: General Tech | December 11, 2018 - 01:10 PM |
Tagged: supreme, oops, Samsung

It will be a surprise to many that Supreme is a skateboard fashion brand; even more surprised was Supreme itself when Samsung announced they were forming some sort of partnership with the company.  It seems that a knock-off version of the New York-based provider of duds for skaters exists in Italy, thanks to a less than effective trademark, and that company not only convinced Samsung they were the real deal but also that it would benefit Samsung to partner with them to host a big fashion show in Beijing.

Samsung is rather embarrassed about the whole thing, so don't taunt them too much.  Pop by Ars Technica for a bit of a lesson on why you should double-check anything a skater tells you is true!

ff1da29aed7305cbf098b6f4639b6f92.jpeg

"Supreme is not working with Samsung, opening a flagship location in Beijing or participating in a Mercedes-Benz runway show. These claims are blatantly false and propagated by a counterfeit organization."

Here is some more Tech News from around the web:

Tech Talk

 

Source: Ars Technica
Manufacturer: AMD

Vega meets Radeon Pro

Professional graphics cards are a segment of the industry that can look strange to gamers and PC enthusiasts. From the outside, it appears that businesses are paying more for almost identical hardware when compared to their gaming counterparts from both NVIDIA and AMD. 

However, a lot goes into a professional-level graphics card that makes all the difference to the consumers they are targeting. From the addition of ECC memory to protect against data corruption, all the way to a completely different driver stack with specific optimizations for professional applications, there's a lot of work put into these particular products.

The professional graphics market has gotten particularly interesting in the last few years with the rise of NVIDIA's TITAN-level GPUs and "Frontier Edition" graphics cards from AMD. While lacking ECC memory, these new GPUs have brought over some of the application-level optimizations while providing a lower price for more hobbyist-level consumers.

However, if you're a professional that depends on a graphics card for mission-critical work, these options are no replacement for the real thing.

Today we're looking at one of AMD's latest Pro graphics offerings, the AMD Radeon Pro WX 8200. 

DSC05271.JPG

Click here to continue reading our review of the AMD Radeon Pro WX 8200.

That's no Zune, it's a FiiO M7

Subject: General Tech | December 10, 2018 - 03:51 PM |
Tagged: audio, FiiO, m7, Exynos 7270, Sabre 9018Q2C, DAC

There are those for whom the idea of listening to audio via a phone is painful to contemplate, as the lack of a dedicated high-fidelity DAC will ruin the experience.  They will quite happily drop $200 on something like the FiiO M7 and consider it a bargain.  The device is also technically interesting, with a Sabre DAC and an Exynos processor running it, which makes it worth a look for non-audiophiles as well.  Check out Nikktech for a look at the interface, hardware, and audio quality if you are curious.

It also has an FM receiver!

fiio_m7_hd_music_player_review_9.jpg

"It may not be the flagship music player in the entire High-Resolution lineup by FiiO but thanks to its Exynos 7270 Processor and the Sabre 9018Q2C DAC/Amp the M7 should have no problem satisfying even the most demanding audiophiles."

Here is some more Tech News from around the web:

Audio Corner

 

Source: Nikktech

Report: New AMD Trademark Shows Possible 7nm Vega Logo

Subject: Graphics Cards | December 10, 2018 - 03:28 PM |
Tagged: Vega, trademark, rumor, report, radeon, graphics, gpu, amd, 7nm

News of a new logo trademark from AMD is making the rounds, with VideoCardz.com spotting the image via Twitter user "BoMbY". Time to speculate!

Vega_II_Logo.jpg

AMD trademark image posted by Twitter user BoMbY (via VideoCardz.com)

The logo, with the familiar "V" joined by a couple of new stripes on the right side, could mean a couple of things: it could be a reference to Vega II (2), or perhaps the VII suggests the Roman numeral 7, for 7nm, instead. VideoCardz.com thinks the latter may be the case:

"AMD has registered a new trademark just 2 weeks ago. Despite many rumors floating around about Navi architecture and its possible early reveal or announcement in January, it seems that AMD is not yet done with Vega. The Radeon Vega logo, which features the distinctive V lettering, has now received 2 stripes, to indicate the 7nm die shrink."

Whatever the case may be, it's interesting to consider the possibility of a 7nm Vega GPU before we see Navi. We really don't know, though it does seem a bit presumptuous to expect a new product as early as CES, as Tech Radar speculates:

"We know full well that the next generation of AMD graphics will be built upon a 7nm architecture going by the roadmaps the company released at CES 2018. At the same time, it seems to all sync up with AMD's plans to announce new 7nm GPUs at CES 2019, so it almost seems certain that we’ll see Vega II graphics cards soon."

The prospect of new graphics cards is always tantalizing, but we'll need more than a logo before things really get interesting.

Source: VideoCardz

Just Cause it's new is no excuse Four this performance

Subject: General Tech | December 10, 2018 - 01:42 PM |
Tagged: just cause 4, gaming, benchmarks, 4k, 1440p, 1080p

One of the best pieces of stress relief software* just got a major update, and TechSpot has discovered it may actually cause more stress than it relieves.  The focus of their article is on performance, but before offering a hint at what to expect, it is worth noting they found Just Cause 4 to be a downgrade from the previous release, with many of its graphical elements similar or lower in quality compared to the previous game, and at a much higher performance cost.

If you have anything below a GTX 1080 or Vega 64, you will struggle to maintain 60 fps at very high quality settings at 1080p; with a GTX 1080 or Vega 64 you might be able to scrape by at 1440p, but smooth 4K is beyond even an RTX 2080.  Since the game itself, apart from some of the detailed scenery, doesn't seem that much different from the previous title, it will be interesting to see if the reported performance issues lessen over time.

*There is a game included as well.

2018-12-08-image-4.jpg

"Today we’re benchmarking Just Cause 4 with a boatload of different GPUs to help you determine if your graphics card will handle this brand new title, and if need be, work out a suitable upgrade option."

Here are some more Graphics Card articles from around the web:

Graphics Cards

 

Source: TechSpot

Out on a branch, speculating about possible architectural flaws

Subject: General Tech | December 10, 2018 - 12:38 PM |
Tagged: spectre, splitspectre, speculator, security, arm, Intel, amd

The discovery of yet another variant of the Spectre vulnerability is not good news for already exhausted security experts or reporters, but there is something new in this story which offers a glimmer of hope.  A collaborative team of researchers from Northeastern University and IBM found this newest design flaw using an automated bug-finding tool they designed, called Speculator.

They designed the tool to get around the largest hurdle security researchers face: the secrecy of AMD, Intel, and ARM, who are trying to keep the recipe for their special sauce secret, and rightly so.  Protecting their intellectual property is paramount to their stockholders, and there are arguments about the possible effectiveness of security through obscurity in protecting consumers from those with nefarious intent, but it does come at a cost for those hunting bugs for good.

Pop by The Register for details on how Speculator works.

TreeHouse_0002_20130603_web.jpg

"SplitSpectre is a proof-of-concept built from Speculator, the team's automated CPU bug-discovery tool, which the group plans to release as open-source software."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register

3DMark Port Royal Ray Tracing Benchmark Launches January 8th

Subject: Graphics Cards | December 10, 2018 - 10:36 AM |
Tagged: 3dmark, ray tracing, directx raytracing, raytracing, rtx, benchmarking, benchmarks

After first announcing it last month, UL this weekend provided new information on its upcoming ray tracing-focused addition to the 3DMark benchmarking suite. Port Royal, what UL calls the "world's first dedicated real-time ray tracing benchmark for gamers," will launch Tuesday, January 8, 2019.

For those eager for a glimpse of the new ray-traced visual spectacle, or for the majority of gamers without a ray tracing-capable GPU, the company has released a video preview of the complete Port Royal demo scene.

Access to the new Port Royal benchmark will be limited to the Advanced and Professional editions of 3DMark. Existing 3DMark users can upgrade to the benchmark for $2.99, and it will become part of the base $29.99 Advanced Edition package for new purchasers starting January 8th.

Real-time ray tracing promises to bring new levels of realism to in-game graphics. Port Royal uses DirectX Raytracing to enhance reflections, shadows, and other effects that are difficult to achieve with traditional rendering techniques.

UL's announcement describes the benchmark as follows:

"As well as benchmarking performance, 3DMark Port Royal is a realistic and practical example of what to expect from ray tracing in upcoming games— ray tracing effects running in real-time at reasonable frame rates at 2560 × 1440 resolution.

3DMark Port Royal was developed with input from AMD, Intel, NVIDIA, and other leading technology companies. We worked especially closely with Microsoft to create a first-class implementation of the DirectX Raytracing API.

Port Royal will run on any graphics card with drivers that support DirectX Raytracing. As with any new technology, there are limited options for early adopters, but more cards are expected to get DirectX Raytracing support in 2019."

3DMark can be acquired via Steam or directly from UL's online store. The Advanced Edition, which includes access to all benchmarks, is priced at $29.99.

MSI Launches GTX 1060 6GB Armor 6GD5X OC Graphics Card Using GDDR5X Memory

Subject: Graphics Cards | December 9, 2018 - 05:40 PM |
Tagged: pascal, msi, GP104, GeForce GTX 1060, armor

MSI is launching a refreshed GTX 1060 graphics card that uses GDDR5X rather than GDDR5 for its 6GB of video memory. The aptly named GTX 1060 Armor 6GD5X OC shares many features with the existing Armor 6G OC (and OCV1) it refreshes, including the dual TORX fan Armor 2X cooler and support for a maximum of four simultaneous displays across its five outputs (three DisplayPort 1.4, one HDMI 2.0b, and one DVI-D).

MSI GTX 1060 Armor 6GD5X OC.png

The new Pascal-based GPU in the upcoming graphics card is reportedly a cut-down variant of NVIDIA's larger GP104 chip rather than the GP106-400 used for previous GTX 1060s, but the core count and other compute resources remain the same at 1,280 CUDA cores, 80 TMUs, 48 ROPs, and a 192-bit memory bus. Clock speeds have been increased slightly versus reference specifications, however, at 1544 MHz base and up to 1759 MHz boost. The GPU is paired with 6 GB of GDDR5X that is curiously clocked at 8 GHz, the same effective speed as the GDDR5 it replaces. The memory more than likely has quite a bit of overclocking headroom versus GTX 1060 6GB cards using GDDR5, but it appears MSI is leaving those pursuits for enthusiasts to explore on their own.
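To put that 8 GHz figure in perspective, here is a rough Python sketch of the bandwidth on the card's 192-bit bus at the shipping speed versus the 10-11 Gbps bins GDDR5X is commonly rated for (the rated speeds are typical GDDR5X figures, not anything MSI has specified):

    def bandwidth_gb_s(effective_gbps, bus_width_bits=192):
        # bytes per second = per-pin data rate x pin count / 8
        return effective_gbps * bus_width_bits / 8

    for rate in (8.0, 10.0, 11.0):
        print(rate, "Gbps ->", bandwidth_gb_s(rate), "GB/s")
    # 8 Gbps -> 192 GB/s (as shipped), 10 -> 240 GB/s, 11 -> 264 GB/s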

MSI is equipping its GTX 1060 Armor 6GD5X OC graphics cards with an 8+6 pin PCI-E power connection setup, which should help overclockers push the cards as far as they can go (previous GTX 1060 Armor OC cards had only a single 8-pin). Looking at the specification page, the new card, at 276mm x 140mm x 41mm, will be slightly shorter than the GDDR5-based card but with a thicker cooler. As part of the Armor series, the card has a white and black design like its predecessors.

MSI has not yet released pricing or availability information, but with the GDDR5-based graphics cards priced at around $275, I would suspect the MSI GTX 1060 Armor 6GD5X OC will sit around $290 at launch.

I am curious how well new GTX 1060 graphics cards will perform when paired with faster GDDR5X memory and how the refreshed cards stack up against AMD's refreshed Polaris 30 based RX 590 graphics cards.

Related reading:

Source: MSI

It's Crucial not to try two new things at once

Subject: Storage | December 7, 2018 - 03:24 PM |
Tagged: crucial, QLC, P1, 500gb, PCIe SSD, NVMe

The Crucial P1 SSD marks two firsts for the company: their first NVMe drive as well as their first SSD using QLC flash. The drive differs from Samsung's 860 QVO in its use of Micron's 64-layer 3D QLC flash and an SM2263 controller, but it still uses QLC flash, much to the dismay of The Tech Report, amongst others.  The 500GB drive currently sells for $110, which looks attractive until you consider the performance, at which point it seems perhaps a bit expensive; that is not good.

Check it out here, or read some of our old TLC reviews if you can't stand the QLC.

drive-bare.jpg

"Powered by Micron's 3D quad-level-cell NAND, the Crucial P1 might be a herald of QLC-dominated days to come. We put Crucial's first NVMe drive through its paces to see how increasing the number of bits per cell affects performance."

Here are some more Storage reviews from around the web:

Storage

Microsoft heals some wounds as it moves to Open Source

Subject: General Tech | December 7, 2018 - 01:57 PM |
Tagged: windows, open source, microsoft, edge, chromium, browser, Opera, firefox

One of the big stories this week has been the rumour and confirmation of Microsoft's move to Chromium.  What we hadn't seen until this morning was what the competition thought about it, which we now know thanks to a link from Slashdot.   You will be shocked to learn that Firefox sees this as solid proof you should have been using Firefox all along, or should switch immediately.

Opera and Google both applaud the move, with Opera pointing out that they did something very similar about six years ago, while Google welcomes Microsoft to the open source community it once spurned.  Take a peek at the rest here.

chromium-logo-100751561-large.jpg

"Google largely sees Microsoft's decision as a good thing, which is not exactly a surprise given that the company created the Chromium open source project. "Chrome has been a champion of the open web since inception and we welcome Microsoft to the community of Chromium contributors. We look forward to working with Microsoft and the web standards community to advance the open web, support user choice, and deliver great browsing experiences."

Here is some more Tech News from around the web:

Tech Talk

Source: Slashdot

Microsoft Confirms Edge Browser is Moving to Chromium

Subject: General Tech | December 7, 2018 - 10:02 AM |
Tagged: windows, open source, microsoft, Joe Belfiore, edge, chromium, browser

It's official: Microsoft is indeed moving their Edge browser to Chromium as previously reported. Windows VP Joe Belfiore made the announcement yesterday with a blog post entitled "Microsoft Edge: Making the web better through more open source collaboration".

Microsoft_Edge_logo.PNG

The post begins as follows (emphasis added):

"For the past few years, Microsoft has meaningfully increased participation in the open source software (OSS) community, becoming one of the world’s largest supporters of OSS projects. Today we’re announcing that we intend to adopt the Chromium open source project in the development of Microsoft Edge on the desktop to create better web compatibility for our customers and less fragmentation of the web for all web developers.

As part of this, we intend to become a significant contributor to the Chromium project, in a way that can make not just Microsoft Edge — but other browsers as well — better on both PCs and other devices."

This is not an immediate move; the under-the-hood changes to the Microsoft Edge browser will take place "over the next year or so", with the transition described as happening "gradually over time". From Microsoft:

1. We will move to a Chromium-compatible web platform for Microsoft Edge on the desktop. Our intent is to align the Microsoft Edge web platform simultaneously (a) with web standards and (b) with other Chromium-based browsers. This will deliver improved compatibility for everyone and create a simpler test-matrix for web developers.

2. Microsoft Edge will now be delivered and updated for all supported versions of Windows and on a more frequent cadence. We also expect this work to enable us to bring Microsoft Edge to other platforms like macOS. Improving the web-platform experience for both end users and developers requires that the web platform and the browser be consistently available to as many devices as possible. To accomplish this, we will evolve the browser code more broadly, so that our distribution model offers an updated Microsoft Edge experience + platform across all supported versions of Windows, while still maintaining the benefits of the browser’s close integration with Windows.

3. We will contribute web platform enhancements to make Chromium-based browsers better on Windows devices. Our philosophy of greater participation in Chromium open source will embrace contribution of beneficial new tech, consistent with some of the work we described above. We recognize that making the web better on Windows is good for our customers, partners and our business – and we intend to actively contribute to that end.

The full blog post from Belfiore is available here.

Source: Microsoft

Snapdragon 8cx is Qualcomm’s answer to higher-performance for Windows PCs

Subject: Mobile | December 6, 2018 - 08:38 PM |
Tagged: snapdragon x24, snapdragon, qualcomm, NVMe, kryo 495, adreno 680, 8cx

While yesterday was all about Snapdragon 855, and the enhancements it will bring to mobile devices, Qualcomm’s focus today at their Snapdragon Tech Summit was all about the “Always on, Always connected” (AOAC) PC.

Announced almost exactly a year ago, AOAC is the term that Qualcomm uses to brand Snapdragon devices featuring the Windows operating system.

8cx-1.png

In the past year, Qualcomm has shipped PCs based on both the Snapdragon 835 and the PC-only Snapdragon 850 SoCs.

Today, Qualcomm is taking the wraps off of their higher-performance Snapdragon option for PCs, Snapdragon 8cx.

Snapdragon 8cx Badge.jpg

From the start, Qualcomm assures us that Snapdragon 8cx won't be completely replacing Snapdragon 850 in the marketplace, positioning the new chip as a more upmarket solution.

8cx-2.png

Kryo 495

8cx-4.png

Unlike the Prime Core design on the Snapdragon 855, the 8cx platform is sticking with a more traditional big.LITTLE design with four performance and four efficiency cores. However, we do see larger cache sizes than previous Snapdragons, with a total of 10MB of system cache.

8cx-5.png

Qualcomm did make a few performance claims against Intel's notebook parts, but they are a bit confusing.

While they did compare the Snapdragon 8cx to Intel's mainstream 15W U-series quad-core mobile CPUs, the performance numbers Qualcomm showed were for both CPUs running at 7W. 

Qualcomm says this is because of the thermal constraints of a fanless design, which all the Snapdragon PCs share, but looking at the thermal performance of real-world fanless PCs with Intel U-series processors, like the Core i5 version of the Surface Pro 6, 7W seems to be a lower power level than those PCs ever actually see.

As always, only time and independent performance analysis will tell the true competitive nature of these CPUs.

Adreno 680

Also all-new for Snapdragon 8cx is the Adreno 680 GPU, which Qualcomm is touting as its fastest GPU ever, with a 2x performance improvement and 60% greater power efficiency over Snapdragon 850.

8cx-3.png

On the display side, Adreno 680 will provide desktop-level outputs, including support for up to two simultaneous 4K HDR displays.

Despite the significant performance increases on the GPU side, Qualcomm is claiming that the Adreno 680 GPU in Snapdragon 8cx is 60% more efficient than the Adreno GPU in their current lead PC platform, Snapdragon 850.
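Taken together, those two claims imply something about power draw, since power efficiency is performance divided by power. A trivial sketch of the inference (ours, not a figure Qualcomm provided):

    perf_gain = 2.0        # "2x performance improvement" vs Snapdragon 850
    efficiency_gain = 1.6  # "60% greater power efficiency"

    # efficiency = perf / power, so power = perf / efficiency
    power_ratio = perf_gain / efficiency_gain
    print(power_ratio)  # 1.25 -> implied GPU power rises only ~25%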

Snapdragon X24

Snapdragon 8cx will sport the same X24 modem we saw announced alongside the Snapdragon 855 yesterday.

8cx-7.png

This new modem will enable LTE connections up to 2Gbps, as we saw with Snapdragon 855, but judging from the specification sheet that was provided, the 8cx seems to lack support for Wi-Fi 6 (802.11ax) and 802.11ay.

In addition, Qualcomm teased that 5G-enabled 8cx devices (likely with the Snapdragon X50 modem) will also be coming in 2019.

Storage

One of the most significant downsides for the current generation of Snapdragon-powered PCs has been the carryover of UFS storage from the mobile phone side. While UFS can provide a sufficient experience on Android devices, it became a significant bottleneck on Windows-based devices.

8cx-6.png

Thanks to an available four lanes of PCI Express 3.0 connectivity, the Snapdragon 8cx will provide support for NVMe SSDs. While Qualcomm still hasn't integrated a native NVMe controller into their SoC the way Apple has, this will at least enable the option for faster storage coming from OEMs.

However, it remains to be seen how many OEMs will adopt NVMe SSDs in their Snapdragon 8cx products, due to the added cost and the potential thermal issues of a higher-performance PCIe SSD in a fanless form factor.
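For a sense of why those four PCIe lanes matter, here is a rough Python sketch of the theoretical link throughput versus a typical two-lane UFS 2.1 configuration (the UFS gear-3 rate of 5.8 Gbps per lane is our assumed baseline; real drives land well below these ceilings):

    PCIE3_GT_PER_S = 8.0      # PCIe 3.0: 8 GT/s per lane
    ENCODING = 128 / 130      # 128b/130b line encoding

    pcie_x4_gb_s = PCIE3_GT_PER_S * ENCODING * 4 / 8   # ~3.94 GB/s per direction
    ufs21_x2_gb_s = 5.8 * 2 / 8                        # ~1.45 GB/s

    print(round(pcie_x4_gb_s, 2), "GB/s for PCIe 3.0 x4")
    print(round(ufs21_x2_gb_s, 2), "GB/s for UFS 2.1 x2")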

Software

8cx-8.png

Another pain point for Snapdragon PCs has been software support. The initial Windows on Snapdragon releases could only run native 32-bit ARM applications and emulate 32-bit x86 applications, but software support has come a long way for this platform in the past year.

One of the biggest areas of concern has been native browser support. Currently, the only native ARM browser on Windows is Edge. With Microsoft's announced move of Edge to the Chromium rendering engine, we will now gain an implementation of the open source engine that powers Google Chrome, but not the Chrome browser itself yet.

8cx-9.png

Mozilla, however, is set to ship a native ARM64 version of Firefox in the coming months, which will be the first high-performance alternative to Edge for the Windows on Snapdragon platform.

Microsoft was also on stage today discussing how they are bringing Windows 10 Enterprise to Snapdragon devices, allowing for more wide deployments of these machines in large corporations.

Pricing and Availability

Despite Qualcomm bringing Lenovo on stage at the event to talk about their partnership, no actual devices, or even manufacturers of 8cx devices, were officially announced today.

Because of that, we have no real information on pricing or availability of Snapdragon 8cx-powered systems, beyond the fact that they are coming at some point in 2019.

That being said, since Snapdragon 850 is still sticking around as an option in the marketplace, expect Snapdragon 8cx devices to be more expensive than the current crop of Snapdragon-enabled PCs.

Snapdragon 8cx Chip Comparison, US Coin.jpg

We expect more information to come on Snapdragon 8cx in the coming months at CES and MWC, so stay tuned for more information as it becomes available!

Source: Qualcomm

Let the bodies hit the floor; for some 4K gaming in the living room

Subject: Displays | December 6, 2018 - 03:41 PM |
Tagged: 4k, tv, Samsung, 40NU7100, tcl, 55R617, vizio, PQ65-F1

4K TVs aren't just for those couple of Netflix shows or YouTube videos you use to show off to your friends; they are also a viable replacement for a monitor.  If you pick the right one, you not only get 4K resolution but also HDR, and after investing that much dosh you might not be looking at upgrading your PC's monitor any time soon.  Drop by TechSpot for a look at three TVs they recommend, ranging from a mere $630 up to $2100, with a few honourable mentions as well.

Perhaps you have some suggestions of your own to offer in the comments.

2018-11-12-image-3.jpg

"If you're interested in replacing your desktop monitor with a 4K TV and want to know what to buy, you've come to the right place. Maybe you aren't quite sure where to start or could use a hand in narrowing your search. Whatever the case, this guide is intended to help steer you in the right direction"

Here are some more Display articles from around the web:

Displays

Source: TechSpot