
What's so bright about the Genius Scorpion M8-610?

Subject: General Tech | February 15, 2017 - 09:00 PM |
Tagged: input, genius, scorpion M8-610, gaming mouse, ambidextrous

The symmetrical design of the Genius Scorpion M8-610 will ensure comfort no matter what your chirality is, something that is seemingly ever more uncommon in gaming mice these days.  The Avago ADNS-9800 laser sensor provides between 800 and 8200 DPI and all the buttons use Omron D2FC-F-7N switches, not bad for a mouse that runs less than $40.  Modders Inc took a look at the mouse and the software suite which accompanies it in their latest review; take a look at what they thought right here.

m861008.jpg

"While it is easy to get lured by fancy colors and flashy design when looking for a gaming mouse, it always comes down to functional consistency above all else. Aside from the keyboard, the mouse allows users to communicate with the computer and to the wider world online."

Here is some more Tech News from around the web:

Tech Talk

Source: Modders Inc

50GB of high resolution Fallout, finally a use for that 8GB of VRAM?

Subject: General Tech | February 15, 2017 - 07:33 PM |
Tagged: gaming, fallout 4

[H]ard|OCP took a look at the effect the gigantic high resolution texture pack has on system performance in their latest article.  For those who want the answer immediately, the largest amount of VRAM they saw utilized was a hair over 5GB, in most cases more than double what the default textures use.  This certainly suggests that those with 4GB cards should reconsider installing the texture pack, while a 6GB card shouldn't see performance impacts.  As for the performance deltas, well, we can't provide spoilers for their entire review!

h_quality.PNG

"Bethesda has released its official High Resolution Texture Pack DLC for Fallout 4. We will look at performance impact, VRAM capacity usage levels, and compare image quality to see if this High Resolution Texture Pack might be worthy of your bandwidth in actually improving the gameplay experience."

Here is some more Tech News from around the web:

Gaming

Source: [H]ard|OCP
Subject: Mobile
Manufacturer: Huawei

Introduction and Specifications

The Mate 9 is the current version of Huawei’s signature 6-inch smartphone, building on last year’s iteration with the company’s new Kirin 960 SoC (featuring ARM's next-generation Bifrost GPU architecture), improved industrial design, and exclusive Leica-branded dual camera system.

Mate9_Main.jpg

In the ultra-competitive smartphone world there is little room at the top, and most companies are simply looking for a share of the market. Apple and Samsung have occupied the top two spots for some time, with HTC, LG, Motorola, and others far behind. But the new #3 is not one of the usual suspects; it is a name many of us in the USA had not heard until recently, and it is the manufacturer of the Mate 9. Compared to the preceding Mate 8 (which we looked at this past August), the new handset is a significant improvement in most respects.

With this phone Huawei has really come into their own with their signature phone design, and 2016 was a very good product year with the company’s smartphone offerings. The P9 handset launched early in 2016, offering not only solid specs and impressive industrial design, but a unique camera that was far more than a gimmick. Huawei’s partnership with Leica has resulted in a dual-camera system that operates differently than systems found on phones such as the iPhone 7 Plus, and the results are very impressive. The Mate 9 is an extension of that P9 design, adapted for their larger Mate smartphone series.

DSC_0649.jpg

Continue reading our review of the Huawei Mate 9!

Vulkan is not extinct, in fact it might be about to erupt

Subject: General Tech | February 15, 2017 - 06:29 PM |
Tagged: vulkan, Intel, Intel Skylake, kaby lake

The open-standard Vulkan API just received a big birthday present from Intel, which added official support on its Skylake and Kaby Lake CPUs under Windows 10.  We have seen adoption of this API by a number of game engine designers: Unreal Engine and Unity have both embraced it, the latest DOOM release was updated to support Vulkan, and there is even a Nintendo 64 renderer which runs on it.  Ars Technica points out that both AMD and NVIDIA have been supporting this API for a while and that we can expect to see Android implementations of this close-to-the-metal solution in the near future.

khronos-2016-vulkanlogo2.png

"After months in beta, Intel's latest driver for its integrated GPUs (version 15.45.14.4590) adds support for the low-overhead Vulkan API for recent GPUs running in Windows 10. The driver supports HD and Iris 500- and 600-series GPUs, the ones that ship with 6th- and 7th-generation Skylake and Kaby Lake processors."

Here is some more Tech News from around the web:

Tech Talk

Source: Ars Technica

NVIDIA Releases GeForce 378.66 Drivers with New Features

Subject: Graphics Cards | February 15, 2017 - 02:29 AM |
Tagged: opencl 2.0, opencl, nvidia, graphics drivers

While the headline of the GeForce 378.66 graphics driver release is support for For Honor, Halo Wars 2, and Sniper Elite 4, NVIDIA has snuck something major into the 378 branch: OpenCL 2.0 is now available for evaluation. (I double-checked 378.49 release notes and confirmed that this is new to 378.66.)

nvidia-geforce.png

OpenCL 2.0 support is not complete yet, but at least NVIDIA is now clearly intending to roll it out to end-users. Among other benefits, OpenCL 2.0 allows kernels (think shaders) to enqueue work onto the GPU without the host intervening. This saves one (or more) round-trips to the CPU, especially in workloads where you don't know which kernel will be required until you see the results of the previous run, like recursive sorting algorithms.
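To make the round-trip argument concrete, here is a toy cost model in plain Python (purely illustrative; the latency numbers are assumptions of mine, not NVIDIA or Khronos figures) comparing host-driven dispatch against device-side enqueue for a chain of dependent kernel launches:

```python
# Toy model: a workload needing `depth` dependent kernel launches, where
# each launch depends on the previous kernel's result (e.g. a recursive sort).

HOST_ROUND_TRIP_US = 20.0   # assumed host<->GPU round-trip cost per launch
DEVICE_ENQUEUE_US = 2.0     # assumed cost of a device-side enqueue

def dispatch_overhead(depth, device_side):
    if device_side:
        # One host submission; the kernel enqueues its successors itself.
        return HOST_ROUND_TRIP_US + (depth - 1) * DEVICE_ENQUEUE_US
    # Host must read back each result before enqueuing the next kernel.
    return depth * HOST_ROUND_TRIP_US

print(dispatch_overhead(16, device_side=False))  # 320.0 us of overhead
print(dispatch_overhead(16, device_side=True))   # 50.0 us of overhead
```

The exact numbers are made up, but the shape of the result is the point: the host round-trip cost is paid once instead of once per launch.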

So yeah, that's good, although you usually see big changes like this at the start of version branches.

Another major addition is Video SDK 8.0. This version allows 10- and 12-bit decoding of VP9 and HEVC video. So... yeah. Applications that want to accelerate video encoding or decoding can now hook up to NVIDIA GPUs for more codecs and features.

NVIDIA’s GeForce 378.66 drivers are available now.

Source: NVIDIA

Crucial expands their MX300 line of SSDs all the way up to 2TB

Subject: Storage | February 14, 2017 - 11:51 PM |
Tagged: tlc, slc, MX300, micron, imft, Dynamic Write Acceleration, DWA, crucial, 3DNAND, 3d nand

Last June Al took a look at the Crucial MX300 750GB and its ability to switch its cache dynamically from TLC to SLC, helping Crucial improve how they implemented this feature along the way.  It proved to be a great value for the money; not the best performing drive but among the least expensive on the market.  Crucial has since expanded the lineup and Hardware Canucks took a look at the 2TB model.  This model has more than just a larger pool of NAND: the RAM cache has been doubled to 1GB and the dynamic cache has more space to work in as well.  Take a look at this economy-sized drive in their full review.

board_lg.jpg

"Crucial's newest MX300 series continues to roll on with a new 2TB version. This SSD may be one of the best when it comes to performance, price and capacity all combined into one package."

Here are some more Storage reviews from around the web:

Storage

AMD Releases Radeon Software Crimson ReLive 17.2.1

Subject: Graphics Cards | February 14, 2017 - 10:57 PM |
Tagged: amd, graphics drivers

Just in time for For Honor and Sniper Elite 4, AMD has released a new set of graphics drivers, Radeon Software Crimson ReLive 17.2.1, that targets these games. The performance improvements they quote are in the 4-5% range compared to their previous driver on the RX 480, which works out to saving nearly a millisecond per frame at 60 FPS. (This is just for mathematical reference; I don’t know what performance users should expect with an RX 480.)
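For reference, the arithmetic behind that frame-time figure is simple (a quick sketch; the percentages are AMD's quoted range, and it assumes the speedup translates directly into frame time):

```python
# At 60 FPS a frame takes 1000/60 ms; a 4-5% improvement trims a fraction
# of a millisecond from each frame.
frame_time_ms = 1000 / 60          # ~16.67 ms per frame
for pct in (4, 5):
    saved_ms = frame_time_ms * pct / 100
    print(f"{pct}% faster: ~{saved_ms:.2f} ms saved per frame")
```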

amd-2016-crimson-relive-logo.png

Beyond driver overhead improvements, you will now be able to utilize multiple GPUs in CrossFire (for DirectX 11) on both titles.

Also, several issues have been fixed with this version. If you have a FreeSync monitor, and some games fail to activate variable refresh mode, then this driver might solve this problem for you. Scrubbing through some videos (DXVA H.264) should no longer cause visible corruption. A couple applications, like GRID and DayZ, should no longer crash under certain situations. You get the idea.

If you have an AMD GPU on Windows, pick up these drivers from their support page.

Source: AMD

Amazon Chimes in with a videoconferencing solution that is primed to take on the big players

Subject: General Tech | February 14, 2017 - 06:39 PM |
Tagged: amazon, chime, videoconferencing

If there is one thing we are short on, it is incompatible videoconferencing applications to use and support.  Obviously this is why Amazon purchased Biba and has now leaped into the fray to provide Chime, a truly unique service which will transmit your voice and video over the internet in something called a conference.  Sarcasm aside, Amazon Web Services have proven that they provide a solid set of services, which will be the backbone of the new app.  Those who have struggled with Adobe's offering or tried to have a meeting during many of the outage periods which plague various other providers might want to take a look.

The basic service is free; Plus adds screen sharing and access to corporate directories for $2.50 per user a month; and the Pro version runs $15, allowing up to 100 people in a video call as well as the all-important personalized URL.  Pop by Slashdot if you so desire.

Screen-Shot-2017-02-13-at-9.19.54-PM.png

"Amazon has released new service to make voice and video calls and share screen. Called Chime, the service is aimed at business users. It directly competes with well-known players such as Skype, Google Hangouts, GoToMeeting, Zoom, and Cisco's WebEx, among others."

Here is some more Tech News from around the web:

Tech Talk

Source: Slashdot
Subject: Motherboards
Manufacturer: ECS

Introduction and Technical Specifications

Introduction

02-board.jpg

Courtesy of ECS

The ECS Z170-Lightsaber motherboard is the newest offering in ECS' L337 product line with support for the Intel Z170 Express chipset. The Z170-Lightsaber builds on their previous Z170-based product, adding several enthusiast-friendly features like enhanced audio and RGB LED support. With an MSRP of $180, ECS priced the Lightsaber as a higher-tiered offering, justified by its additional features and functionality compared to their previous Z170-based product.

03-board-profile.jpg

Courtesy of ECS

ECS designed the Z170-Lightsaber with a 14-phase digital power delivery system, using high efficiency chokes and MOSFETs, as well as solid core capacitors, for optimal board performance. ECS built the following features into the Z170-Lightsaber board: six SATA 3 ports; one SATA-Express port; a PCIe x2 M.2 port; a Qualcomm Killer GigE NIC; three PCI-Express x16 slots; four PCI-Express x1 slots; a 3-digit diagnostic LED display; on-board power, reset, quick overclock, BIOS set, BIOS update, BIOS backup, and clear CMOS buttons; a Realtek audio solution; integrated DisplayPort and HDMI video outputs; and USB 2.0, 3.0, and 3.1 Gen2 ports.

Continue reading our review of the ECS Z170-Lightsaber Motherboard!

Anidees AI Crystal Tempered Glass Chassis, so shiny it is hard to photograph

Subject: Cases and Cooling | February 13, 2017 - 06:14 PM |
Tagged: anidees, tempered glass, AI Crystal

Much like HDR displays and VR, it can be hard to show what a tempered glass case looks like with just a few pictures, but that is exactly what eTeknix set out to do in this review.  The case is quite large: E-ATX motherboards are supported, and the three fans at the front of the case are all 140mm, as is the one in the back, with room for three more if you so desire.  There is also an integral fan controller, with its three modes easily accessible on the top of the case. The PSU is hidden under a shroud, SSDs are mounted on the back of the case, and decent cable management ensures that your see-through system is worth looking at.

DSC_2959.jpg

"There’s a big trend in the chassis market this last year or so, as more and more brands shift from plastic side panel windows, to huge chunks of tempered glass. If you really care about having a system that looks like a premium quality product, and having a great way to show off your new build and hardware, then tempered glass is the way to go."

Here are some more Cases & Cooling reviews from around the web:

CASES & COOLING

 

Source: eTeknix

Clearing storage newsblasts from cache

Subject: General Tech | February 13, 2017 - 05:59 PM |
Tagged: acronis, caringo, Cisco, fujitsu, Intel, mimecast

The Register received more than a few tidbits of news from a wide array of storage companies, which they have condensed into this post.  Acronis has released new versions of its Backup suite and True Image, with the Backup suite now able to capture Office 365 mailboxes.  Cisco has released a product which allows your onsite cloud to run like Azure, while Fujitsu announced their mid-range ETERNUS AF650 all-flash array.  Intel has updated its implementation of the open source Lustre parallel file system for supercomputers, and several companies released earnings data, though Mimecast wished their news was better.

acronis-logo.png

"Incoming! Boom, boom and boom again – storage news announcements hit the wires in a relentless barrage. Here's a few we've received showing developments in data protection, cloud storage, hyper-converged storage, the dregs of flash memory and more."

Here is some more Tech News from around the web:

Tech Talk

Source: The Register
Subject: Processors
Manufacturer: AMD

Get your brains ready

Just before the weekend, Josh and I got a chance to speak with David Kanter about the AMD Zen architecture and what it might mean for the Ryzen processor due out in less than a month. For those of you not familiar with David and his work, he is an analyst and consultant on processor architecture and design through Real World Tech, while also serving as a writer and analyst for the Microprocessor Report as part of the Linley Group. If you want to see a discussion forum that focuses on architecture at an incredibly detailed level, the Real World Tech forum will have you covered - it's an impressive place to learn.

zenpm-4.jpg

David was kind enough to spend an hour with us talking about a recently-made-public report he wrote on Zen. It's definitely a discussion that dives into details most articles and stories on Zen don't broach, so be prepared to do some pausing and Googling of phrases and technologies you may not be familiar with. Still, for any technology enthusiast who wants an expert's opinion on how Zen compares to Intel Skylake and how Ryzen might fare when it's released this year, this is not one to miss.

Subject: Mobile
Manufacturer: Lenovo
Tagged: yoga, X1, Thinkpad, oled, Lenovo

Intro, Exterior and Internal Features

Lenovo sent over an OLED-equipped ThinkPad X1 Yoga a while back. I was mid-development on our client SSD test suite and had some upcoming travel. Given that the new suite’s result number crunching spreadsheet extends out to column FHY (4289 for those counting), I really needed a higher res screen and improved computing horsepower in a mobile package. I commandeered the X1 Yoga OLED for the trip, and to say it grew on me quickly is an understatement. While I do tend to reserve my heavier duty computing tasks and crazy spreadsheets for desktop machines and 40” 4K displays, the compute power of the X1 Yoga proved itself quite reasonable for a mobile platform. Sure, there is a built-in pen that comes in handy when employing the Yoga’s flip-over convertibility into tablet mode, but the real beauty of this particular laptop comes with its optional 2560x1440 14” OLED display.

170209-180903.jpg

OLED is just one of those things you need to see in person to truly appreciate. Photos of these screens just can’t capture the perfect blacks and vivid colors. In productivity use, something about either the pixel pattern or the amazing contrast made me feel like the effective resolution of the panel was higher than its rating. It really is a shame that you are likely reading this article on an LCD, because the OLED panel on this particular model of Lenovo laptop really is the superstar. I’ll dive more into the display later on, but for now let’s cover the basics:

Read on for our review of the ThinkPad X1 Yoga!

A Closer Look at Intel's Optane SSD DC P4800X Enterprise SSD Performance

Subject: Storage | February 10, 2017 - 09:22 PM |
Tagged: Optane, XPoint, P4800X, 375GB

Over the past few hours, we have seen another Intel Optane SSD leak rise to the surface. While we previously saw a roadmap and specs for a mobile storage accelerator platform, this time we have some specs for an enterprise part:

optane-leak.png

The specs are certainly impressive. While they don't match the maximum theoretical figures we heard at the initial XPoint announcement, we do see an endurance rating of 30 DWPD (drive writes per day), which is impressive given competing NAND products typically run in the single digits for that same metric. The 12.3 PetaBytes Written (PBW) rating is even more impressive given the capacity point that rating is based on is only 375GB (compare with 2000+ GB of enterprise parts that still do not match that figure).
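Those two endurance ratings are consistent with each other if you assume a 3-year rating window (an assumption on my part; the leaked slide doesn't state the warranty period):

```python
# 30 drive writes per day on a 375GB drive, accumulated over 3 years.
capacity_tb = 0.375
dwpd = 30
years = 3

tb_written = capacity_tb * dwpd * 365 * years
print(f"{tb_written / 1000:.2f} PBW")  # ~12.32 PBW, matching the 12.3 PBW spec
```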

Now I could rattle off the rest of the performance figures, but those are just numbers, and fortunately we have ways of showing these specs in a more practical manner:

rnd.png

Assuming the P4800X at least meets its stated specifications (very likely given Intel's track record there), and also with the understanding that XPoint products typically reach their maximum IOPS at Queue Depths far below 16, we can compare the theoretical figures for this new Optane part to the measured results from the two most recent NAND-based enterprise launches. To say the random performance leaves those parts in the dust is an understatement. 500,000+ IOPS is one thing, but doing so at lower QDs (where real-world enterprise usage actually sits) just makes this more of an embarrassment for NAND parts. The added latency of NAND means far higher, impractical QDs (256+) are needed to reach their maximum ratings.

server workload QD.png

Intel research on typical Queue Depths seen in various enterprise workloads. Note that a lower latency device running the same workload will further 'shallow the queue', meaning even lower QD.

Another big deal in the enterprise is QoS. High IOPS and low latency are great, but where the rubber meets the road here is consistency. Enterprise tests measure this in varying degrees of "9's", which exponentially approach 100% of all IO latencies seen during a test run. The plot method used below acts to 'zoom in' on the tail latency of these devices. While a given SSD might have very good average latency and IOPS, it's the outliers that lead to timeouts in time-critical applications, making tail latency an important item to detail.
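As a concrete illustration of the "9's" convention, here is how tail-latency figures fall out of a set of measured IO latencies (plain Python with a fabricated sample, purely to show the mechanics):

```python
import random

# Fabricated latency sample: mostly ~90 us IOs plus a small population of
# slow outliers, mimicking a drive with good averages but a long tail.
random.seed(1)
latencies_us = [random.gauss(90, 5) for _ in range(100_000)]
latencies_us += [random.uniform(500, 2000) for _ in range(100)]

latencies_us.sort()
n = len(latencies_us)
for nines in (0.99, 0.999, 0.9999):
    idx = min(int(nines * n), n - 1)
    print(f"{nines:.2%} of IOs completed within {latencies_us[idx]:.0f} us")
```

The average here looks perfectly healthy; it is only out past the 99.9% mark that the outliers, the ones that cause timeouts in time-critical applications, become visible.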

qos-r.png

qos-w.png

I've taken some liberties in my approximations below the 99.999% point in these plots. Note that the spec sheet does claim typical latencies "<10us", which falls off to the left of the scale. Not only are the potential latencies great with Optane, the claimed consistency gains are even better. Translating what you see above, the highest percentile latency IOs of the P4800X should be 10x-100x (log scale above) faster than Intel's own SSD DC P3520. The P4800X should also easily beat the Micron 9100 MAX, even despite its IOPS being 5x higher than the P3520 at QD16. These lower latencies also mean we will have to add another decade to the low end of our Latency Percentile plots when we test these new products.

Well, there you have it. The cost/GB will naturally be higher for these new XPoint parts, but the expected performance improvements should make it well worth the additional cost for those who need blistering fast yet persistent storage.

Manufacturer: EVGA

The new EVGA GTX 1080 FTW2 with iCX Technology

Back in November of 2016, EVGA had a problem on its hands. The company had a batch of GTX 10-series graphics cards using the new ACX 3.0 cooler solution leave the warehouse missing thermal pads required to keep the power management hardware on its cards within reasonable temperature margins. To its credit, the company took the oversight seriously and instituted a set of solutions for consumers to select from: RMA, new VBIOS to increase fan speeds, or to install thermal pads on your hardware manually. Still, as is the case with any kind of product quality lapse like that, there were (and are) lingering questions about EVGA’s ability to maintain reliable products with features and new options that don’t compromise the basics.

Internally, the drive to correct these lapses was…strong. From the very top of the food chain on down, it was hammered home that something like this simply couldn’t occur again, and even more so, EVGA was to develop and showcase a new feature set and product lineup demonstrating its ability to innovate. Thus was born, and accelerated, the EVGA iCX Technology infrastructure. While this was something in the pipeline for some time already, it was moved up to counter any negative bias that might have formed for EVGA’s graphics cards over the last several months. The goal was simple: prove that EVGA was the leader in graphics card design and prove that EVGA has learned from previous mistakes.

EVGA iCX Technology

Previous issues aside, the creation of iCX Technology is built around one simple question: is one GPU temperature sensor enough? For nearly all of today’s graphics cards, cooling is based around the temperature of the GPU silicon itself, as measured by NVIDIA (for all of EVGA’s cards). This is how fan curves are built, how GPU clock speeds are handled with GPU Boost, how noise profiles are created, and more. But as process technology has improved and GPU designs have leaned towards power efficiency, the GPU itself is often no longer the thermally limiting factor.

slides05.jpg

As it turns out, converting 12V (from the power supply) to ~1V (necessary for the GPU) is a process that creates a lot of excess heat. The thermal images above clearly demonstrate that, and EVGA isn’t the only card vendor to have taken notice. In fact, EVGA’s product issue from last year was related to this: the fans were only spinning fast enough to keep the GPU cool and did not take into account the temperature of the memory or power delivery.
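The amount of waste heat is easy to ballpark. Assuming a VRM conversion efficiency somewhere in the 85-90% range and a GPU drawing 180W (both figures are my assumptions for illustration; EVGA doesn't publish them):

```python
# Heat dissipated in the VRM while stepping 12V down to ~1V for the GPU.
gpu_power_w = 180                       # assumed GPU power draw
for efficiency in (0.85, 0.90):
    input_power_w = gpu_power_w / efficiency
    vrm_heat_w = input_power_w - gpu_power_w
    print(f"{efficiency:.0%} efficient VRM: ~{vrm_heat_w:.0f} W of waste heat")
```

Twenty to thirty watts dumped into a handful of small components is exactly the kind of load a GPU-only fan curve never sees.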

The fix from EVGA is to ratchet up the number of sensors on the card PCB and wrap them with intelligence in the form of MCUs, updated Precision XOC software and user viewable LEDs on the card itself.

slides10.jpg

EVGA graphics cards with iCX Technology will include 9 total thermal sensors on the board, independent of the GPU temperature sensor directly integrated by NVIDIA. There are three sensors for memory, five for power delivery and an additional sensor for the GPU temperature. Some are located on the back of the PCB to avoid any conflicts with trace routing between critical components, including the secondary GPU sensor.

Continue reading about EVGA iCX Technology!

NVIDIA Announces Q4 2017 Results

Subject: Editorial | February 9, 2017 - 11:59 PM |
Tagged: TSMC, Samsung, Results, quadro, Q4, nvidia, Intel, geforce, Drive PX2, amd, 2017, 2016

It is most definitely quarterly reports time for our favorite tech firms.  NVIDIA’s reporting is unique, with a fiscal year offset from the calendar, as compared to how AMD and Intel report.  This dates back to NVIDIA’s initial public offering, when the company set its fiscal quarters quite a few months ahead of the actual calendar.  So when NVIDIA announces Q4 2017, it is actually reflecting the Q4 period of 2016.  Clear as mud?

Semantics aside, NVIDIA had a record quarter.  Gross revenue was an impressive $2.173 billion US.  This is up slightly more than $700 million from the previous Q4.  NVIDIA has shown amazing growth during this time attributed to several factors.  Net income (GAAP) is at $655 million.  This again is a tremendous amount of profit for a company that came in just over $2 billion in revenue.  We can compare this to AMD’s results two weeks ago that hit $1.11 billion in revenue and a loss of $51 million for the quarter.  Consider that AMD provides CPUs, chipsets, and GPUs to the market and is the #2 x86 manufacturer in the world.

NVLogo_2D_H.jpg

The yearly results were just as impressive.  FY 2017 featured record revenue and net income.  Revenue was $6.91 billion as compared to FY 2016 at $5 billion.  Net income for the year was $1.666 billion, compared to $614 million for FY 2016.  The growth for the entire year is astounding; the company has not seen an expansion like this since the early 2000s.
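Putting the year-over-year figures side by side (just arithmetic on the numbers above):

```python
# FY2016 -> FY2017 growth, in percent.
figures = {
    "revenue ($B)":    (5.00, 6.91),
    "net income ($B)": (0.614, 1.666),
}
for name, (fy16, fy17) in figures.items():
    growth_pct = (fy17 - fy16) / fy16 * 100
    print(f"{name}: {fy16} -> {fy17} ({growth_pct:+.0f}%)")
```

Roughly 38% revenue growth and a net income that nearly tripled in a single fiscal year.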

The core strength of the company continues to be gaming.  Gaming GPUs and products provided $1.348 billion in revenue by themselves.  Since the manufacturing industry was unable to provide a usable 20 nm planar process for large, complex ASICs, companies such as NVIDIA and AMD were forced to innovate in design to create new products with greater feature sets and performance, all the while still using the same 28 nm process as previous products.  Typically, process shrinks accounted for the majority of improvements (more transistors packed into a smaller area, with corresponding switching speed increases).  Many users kept cards that were several years old due to there not being a huge impetus to upgrade.  With the arrival of the 14 nm and 16 nm processes from Samsung and TSMC respectively, users suddenly had a very significant reason to upgrade.  NVIDIA was able to address the entire market from high to low with their latest GTX 10x0 series of products.  AMD, on the other hand, only had new products that hit the midrange and budget markets.

NV-Q4-2014.jpg

The next biggest area for NVIDIA is the datacenter.  This has seen tremendous growth as compared to the other markets (except of course gaming) that NVIDIA covers.  It has gone from around $97 million in Q4 2016 up to $296 million this last quarter.  Tripling revenue in any one area is rare; gaming “only” about doubled during this same time period.  Deep learning and AI are two areas that require this type of compute power, and NVIDIA was able to deliver a comprehensive software stack, as well as strategic partnerships that provided turnkey solutions for end users.

After datacenter we still have the visualization market based on the Quadro products.  This area has not seen the dramatic growth as other aspects of the company, but it remains a solid foundation and a good money maker for the firm.  The Quadro products continue to be improved upon and software support grows.

One area that promises to really explode in the next three to four years is the automotive sector.  The Drive PX2 system is being integrated into a variety of cars and NVIDIA is focused on providing a solid and feature packed solution for manufacturers.  Auto-pilot and “co-pilot” modes will become more and more important in upcoming models and should reach wide availability by 2020, if not a little sooner.  NVIDIA is working with some of the biggest names in the industry from both automakers and parts suppliers.  BMW should release a fully automated driving system later this year with their i8 series.  Audi also has higher end cars in the works that will utilize NVIDIA hardware and fully automated operation.  If NVIDIA continues to expand here, eventually it could become as significant a source of income as gaming is today.

There was one bit of bad news from the company.  Their OEM & IP division has seen revenue drop over the past several quarters.  NVIDIA announced that the IP licensing agreement with Intel would be discontinued this quarter and would not be renewed.  We know that AMD has entered into an agreement with Intel to provide graphics IP for future parts and to cover Intel in potential licensing litigation.  This was a fair amount of money per quarter for NVIDIA, but their other divisions more than made up for the loss of this particular income.

NVIDIA certainly seems to be hitting on all cylinders and is growing into markets that were unavailable five to ten years ago.  They are spreading out their financial base so as to avoid the boom and bust cycles of any one industry.  Next quarter NVIDIA expects revenue to be down seasonally, into the $1.9 billion range.  Even though that number is down, it would still represent the company's 3rd highest quarterly revenue.

Source: NVIDIA

Blender Foundation Releases 2.78b... for Performance!

Subject: General Tech | February 9, 2017 - 10:03 PM |
Tagged: Blender

It has been a few months since the release of 2.78, and the Blender Foundation has been sitting on a bunch of performance enhancements in that time. Since 2.79 is still a couple of months off, they decided to “cherry pick” a bunch of them back into the 2.78 branch and push out an update to it. Most of these updates are things like multi-threading their shader compiler for Cycles, speeding up motion blur in Cycles, and reducing “fireflies” in Cycles renders, which indirectly helps performance by requiring fewer light samples to average out the noise.

blender-2016-278logo.jpg

I tried rendering two frames from different scenes of my upcoming PC enthusiast explanation video. While they’re fairly light motion graphics sequences, they both use a little bit of motion blur (~half of a 60 Hz frame of integration), and one of the two frames is in the middle of three objects with volumetric absorption that are moving quite fast.

0580.png

The "easier" scene to render.

When disabling my GTX 670, and only using my GTX 1080, the easier scene went from 9.96s in 2.78a to 9.99s in 2.78b. The harsher scene, with volumetric absorption and a big streak of motion blur, went from 36.4s in 2.78a to 36.31s in 2.78b. My typical render settings include a fairly high sample count, though, so it’s possible that I could get away with less and save time that way.

0605.png

The "harsher" scene to render.

Blender is currently working on Agent 327, which is an upcoming animated feature film. Typically, these movies guide development of the project, so it makes sense that my little one-person motion graphics won’t have the complexity to show the huge optimizations that they’re targeting. Also, I had a lot of other programs running, which is known to make a significant difference in render time, although they were doing the same things between runs. No browser tabs were opened or closed, the same videos were running on other monitors while 2.78a and 2.78b were working, etc. But yeah, it's not a bulletproof benchmark by any means.

Also, some of the optimizations fix bugs in the CPU implementation on Intel processors as well as increase the use of SSE 4.1+ and AVX2. Unfortunately for AMD, these were pushed out right before the launch of Ryzen, and Blender with Cycles has been one of their go-to benchmarks for multi-threaded performance. While this won’t hurt AMD any more than typical version-to-version variations, it should give a last-minute boost to AMD’s competition on its home turf.

Blender 2.78b is available today, free as always, at their website.

New graphics drivers? Fine, back to benchmarking.

Subject: Graphics Cards | February 9, 2017 - 07:46 PM |
Tagged: amd, nvidia

New graphics drivers are a boon to everyone who isn't a hardware reviewer, especially one who has just wrapped up benchmarking a new card the same day a driver is released.  To see what changes AMD and NVIDIA have implemented in their last few releases, [H]ard|OCP tested a slew of recent drivers from both companies.  The performance of AMD's past releases, up to and including the AMD Crimson ReLive Edition 17.1.1 Beta, can be found here.  For NVIDIA users, recent drivers covering up to the 378.57 Beta Hotfix are right here.  The tests show both companies generally increasing the performance of their drivers, though the changes are small enough that you are unlikely to notice a large difference.

0chained-to-office-man-desk-stick-figure-vector-43659486-958x1024.jpg

"We take the AMD Radeon R9 Fury X and AMD Radeon RX 480 for a ride in 11 games using drivers from the time of each video card’s launch date, to the latest AMD Radeon Software Crimson ReLive Edition 17.1.1 Beta driver. We will see how performance in old and newer games has changed over the course of 2015-2017 with new drivers. "

Here are some more Graphics Card articles from around the web:

Graphics Cards

Source: [H]ard|OCP

Have you ever noticed how popular June 21, 2006 is?

Subject: General Tech | February 9, 2017 - 06:51 PM |
Tagged: workaround, microsoft

Have you ever noticed how many drivers on your system are dated June 21st, 2006?  If not, pop open Device Manager and take a look at some of your devices which don't use a driver directly from the manufacturer.  Slashdot posted a link to the inimitable Raymond Chen, who explains exactly why so many of your drivers bear that date.  The short version is that this is a workaround which prevents newer Microsoft drivers from overwriting manufacturers' drivers, by ensuring the Microsoft driver will never carry a more recent time stamp.  This is especially important for laptop users, as even simple chipset drivers will be supplied by the manufacturer.  For instance, this processor is old, but not that old!

2006.PNG

"When the system looks for a driver to use for a particular piece of hardware, it ranks them according to various criteria. If a driver provides a perfect match to the hardware ID, then it becomes a top candidate. And if more than one driver provides a perfect match, then the one with the most recent timestamp is chosen."

Here is some more Tech News from around the web:

Tech Talk

Source: Slashdot

Podcast #436 - ECS Mini-STX, NVIDIA Quadro, AMD Zen Arch, Optane, GDDR6 and more!

Subject: Editorial | February 9, 2017 - 03:50 PM |
Tagged: podcast, Zen, Windows 10 Game Mode, webcam, ryzen, quadro, Optane, nvidia, mini-stx, humble bundle, gddr6, evga, ECS, atom, amd, 4k

PC Perspective Podcast #436 - 02/09/17

Join us for ECS Mini-STX, NVIDIA Quadro, AMD Zen Arch, Optane, GDDR6 and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

Hosts: Ryan Shrout, Allyn Malventano, Ken Addison, Josh Walrath, Jeremy Hellstrom

Program length: 1:32:21

Podcast topics of discussion:

  1. Week in Review:
  2. News items of interest:
    1. 1:14:00 Zen Price Points Leaked
  3. Hardware/Software Picks of the Week
  4. Closing/outro
 
 
