51 flavours of Radeon to choose from

Subject: Graphics Cards | June 2, 2017 - 03:02 PM |
Tagged: amd, radeon, linux

When Phoronix does a performance round-up they do not mess around.  Their latest look at the performance of AMD cards on Linux stretches all the way back to the HD 2900XT and encompasses almost every GPU released between that part and the RX 580, with a pair of FirePro cards and the Fury included as well.  For comparative performance numbers you will also see 28 NVIDIA cards, making these some of the longest charts you have seen.  Drop by to check out the state of AMD performance on Linux in a variety of games as well as synthetic benchmarks.

image.php_.jpg

"It's that time of the year where we see how the open-source AMD Linux graphics driver stack is working on past and present hardware in a large GPU comparison with various OpenGL games and workloads. This year we go from the new Radeon RX 580 all the way back to the Radeon HD 2900XT, looking at how the mature Radeon DRM kernel driver and R600 Gallium3D driver is working for aging ATI/AMD graphics hardware. In total there were 51 graphics cards tested for this comparison of Radeon cards as well as NVIDIA GeForce hardware for reference."


Source: Phoronix

Imagination PowerVR Ray Tracing with UE4 & Vulkan Demo

Subject: Graphics Cards, Mobile | June 2, 2017 - 02:23 AM |
Tagged: Imagination Technologies, PowerVR, ray tracing, ue4, vulkan

Imagination Technologies has published another video that demonstrates ray tracing with their PowerVR Wizard GPU. The test system this time is a development card running on Ubuntu and powering Unreal Engine 4; specifically, it uses UE4’s Vulkan renderer.

The demo highlights two major advantages of ray traced images. The first is that, rather than applying a baked cubemap with screen-space reflections to simulate metallic objects, this demo calculates reflections with secondary rays. From there, it’s just a matter of feeding the gathered information into the parameters the shader requires and doing the calculations.
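Conceptually, the secondary-ray approach is the standard reflection formula plus a second scene query at the hit point. Below is a minimal C++ sketch of that idea; Imagination has not published the demo's source, so every name here (traceScene, shadeMetal) is a hypothetical stand-in:

```cpp
#include <cmath>
#include <cstdio>

// Minimal 3D vector for the sketch.
struct Vec3 {
    float x, y, z;
    Vec3 operator+(Vec3 b) const { return {x + b.x, y + b.y, z + b.z}; }
    Vec3 operator-(Vec3 b) const { return {x - b.x, y - b.y, z - b.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Standard reflection: r = d - 2(d.n)n, with n the unit surface normal.
Vec3 reflect(Vec3 d, Vec3 n) { return d - n * (2.0f * dot(d, n)); }

// Stand-in for the tracer's scene query; a real implementation would
// intersect the ray against the scene. Here: a fake sky gradient.
Vec3 traceScene(Vec3 /*origin*/, Vec3 dir) {
    float t = 0.5f * (dir.y + 1.0f);
    return Vec3{1.0f, 1.0f, 1.0f} * (1.0f - t) + Vec3{0.5f, 0.7f, 1.0f} * t;
}

// At a metallic hit point, spawn a secondary ray in the mirrored
// direction and shade with whatever it actually hits, instead of
// sampling a baked cubemap.
Vec3 shadeMetal(Vec3 hitPoint, Vec3 normal, Vec3 viewDir) {
    Vec3 reflDir = reflect(viewDir, normal);
    Vec3 origin = hitPoint + normal * 1e-3f;  // offset to avoid self-hit
    return traceScene(origin, reflDir);       // result feeds the shader
}

int main() {
    Vec3 c = shadeMetal({0, 0, 0}, {0, 1, 0}, {0, -0.707f, 0.707f});
    std::printf("reflected colour: %.2f %.2f %.2f\n", c.x, c.y, c.z);
}
```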

The second advantage is that it can handle arbitrary lens effects, like distortion and equirectangular 360° projections. Rasterization, which projects 3D world coordinates into 2D coordinates on a screen, assumes that straight edges stay straight, and that breaks down as the field of view gets very large, especially a full circle. Imagination Technologies acknowledges that workarounds exist, like breaking up the render into the six faces of a cube, but the cleanest solution is casting a ray per pixel and seeing what it hits.
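To make the ray-per-pixel point concrete, here is a short C++ helper, my own illustration rather than anything from the demo, that maps each pixel of an equirectangular output image directly to a ray direction, something a rasterizer's single planar projection cannot express:

```cpp
#include <cmath>
#include <cstdio>

constexpr float kPi = 3.14159265358979f;

struct Vec3 { float x, y, z; };

// Map pixel (px, py) of a W x H equirectangular image to a world-space
// ray direction: longitude sweeps the full 360 degrees horizontally,
// latitude runs pole to pole vertically.
Vec3 equirectRayDir(int px, int py, int width, int height) {
    float lon = ((px + 0.5f) / width) * 2.0f * kPi - kPi;   // [-pi, pi]
    float lat = kPi / 2.0f - ((py + 0.5f) / height) * kPi;  // [+pi/2, -pi/2]
    return { std::cos(lat) * std::sin(lon),
             std::sin(lat),
             std::cos(lat) * std::cos(lon) };
}

int main() {
    // Centre pixel looks straight ahead (+z); edge pixels wrap behind the camera.
    Vec3 d = equirectRayDir(960, 540, 1920, 1080);
    std::printf("centre ray: %.3f %.3f %.3f\n", d.x, d.y, d.z);
}
```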

The demo was originally for GDC 2017, back in February, but the videos have just been released.

Hey cable cutters, Plex does live TV now

Subject: Graphics Cards | June 1, 2017 - 02:38 PM |
Tagged: plex, live tv

Today Plex announced the addition of a live TV streaming service to their Plex Pass, which will let you stream digital over-the-air channels to any of your devices once you set it up on your account.  All you need is a digital tuner, such as one from Hauppauge, AVerMedia, or DVBLogic, or an NVIDIA SHIELD, plus a digital antenna.  Hook the device up to the same network your Plex server resides on and start streaming your favourite shows, without paying your cable company rental fees for that set top box.  You can check out what channels are available in your area on this Plex page.  If you are unfamiliar with Plex or want to read a bit more on the setup, you can check out this story at Gizmodo.

plextv.PNG

"In the latest build of Plex, the server can actually automatically convert the media, resolving this problem. More importantly, you get now watch live TV via a feature called, strangely enough, Live TV."


Source: Gizmodo
Manufacturer: AMD

We are up to two...

UPDATE (5/31/2017): Crystal Dynamics was able to get back to us with a couple of points on the changes that were made with this patch to affect the performance of AMD Ryzen processors.

  1. Rise of the Tomb Raider splits rendering tasks to run on different threads. By tuning the size of those tasks – breaking some up, allowing multicore CPUs to contribute in more cases, and combining some others, to reduce overheads in the scheduler – the game can more efficiently exploit extra threads on the host CPU.
     
  2. An optimization was identified in texture management that improves performance for the combination of AMD CPU and NVIDIA GPU. Overhead was reduced by packing texture descriptor uploads into larger chunks.
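The second point describes a classic batching pattern: each upload call carries a fixed submission overhead, so paying it once for one large buffer beats paying it per descriptor. Here is a minimal, self-contained C++ sketch of the general idea; all names and the descriptor layout are hypothetical, since the game's internals are not public:

```cpp
#include <cstddef>
#include <cstdio>
#include <cstring>
#include <vector>

// Stand-in for a GPU-visible texture descriptor (hypothetical layout).
struct TextureDescriptor { unsigned char data[64]; };

// Stub standing in for the real driver submission; here it just tallies
// how many calls and bytes were issued so the two paths can be compared.
static size_t g_calls = 0, g_bytes = 0;
void uploadToGpu(const void* /*src*/, size_t bytes) { ++g_calls; g_bytes += bytes; }

// Naive path: one submission per descriptor, so the fixed per-call
// overhead is paid once for every small upload.
void uploadEach(const std::vector<TextureDescriptor>& descs) {
    for (const auto& d : descs) uploadToGpu(&d, sizeof(d));
}

// Batched path: pack the descriptors into one contiguous staging buffer
// and issue a single larger upload, paying the overhead once.
void uploadBatched(const std::vector<TextureDescriptor>& descs) {
    std::vector<unsigned char> staging(descs.size() * sizeof(TextureDescriptor));
    for (size_t i = 0; i < descs.size(); ++i)
        std::memcpy(staging.data() + i * sizeof(TextureDescriptor),
                    &descs[i], sizeof(TextureDescriptor));
    uploadToGpu(staging.data(), staging.size());
}

int main() {
    std::vector<TextureDescriptor> descs(1000);
    uploadEach(descs);
    std::printf("naive:   %zu calls, %zu bytes\n", g_calls, g_bytes);
    g_calls = g_bytes = 0;
    uploadBatched(descs);
    std::printf("batched: %zu calls, %zu bytes\n", g_calls, g_bytes);
}
```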

There you have it, a bit more detail on the software changes made to help adapt the game engine to AMD's Ryzen architecture. Not only that, but it does confirm our information that there was slightly MORE to address in the Ryzen+GeForce combinations.

END UPDATE

Despite a couple of growing pains out of the gate, the Ryzen processor launch appears to have been a success for AMD. Both the Ryzen 7 and the Ryzen 5 releases proved to be very competitive with Intel’s dominant CPUs in the market and took significant leads in areas of massive multi-threading and performance per dollar. One area where AMD has struggled, though, is 1080p gaming: performance on both Ryzen 7 and 5 processors fell behind comparable Intel parts by sometimes significant margins.

Our team continues to watch the story to see how AMD and game developers work through the issue. Most recently I posted a look at the memory latency differences between Ryzen and Intel Core processors. As it turns out, the memory latency differences are a significant part of the initial problem for AMD:

Because of this, I think it is fair to claim that some, if not most, of the 1080p gaming performance deficits we have seen with AMD Ryzen processors are a result of this particular memory system intricacy. You can combine memory latency with the thread-to-thread communication issue we discussed previously into one overall system level complication: the Zen memory system behaves differently than anything we have seen prior and it currently suffers in a couple of specific areas because of it.
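For readers who want a feel for what "memory latency" means in practice, the usual way to measure it is a pointer chase: each load depends on the result of the previous one, so the CPU cannot overlap the requests and the loop time approximates the round-trip latency. The following is a rough, self-contained C++ sketch of that technique, not the benchmark code from our article:

```cpp
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

int main() {
    // 1 << 23 entries * 8 bytes = 64 MiB, well past any L3 cache.
    const size_t N = size_t{1} << 23;
    std::vector<size_t> next(N);
    std::iota(next.begin(), next.end(), size_t{0});

    // Sattolo's algorithm: build a single-cycle permutation so the chase
    // visits every slot before repeating (no short cycles hiding in cache).
    std::mt19937_64 rng{42};
    for (size_t i = N - 1; i > 0; --i)
        std::swap(next[i], next[rng() % i]);

    // Chase the chain: every load's address depends on the previous load.
    const size_t hops = 20'000'000;
    size_t idx = 0;
    auto t0 = std::chrono::steady_clock::now();
    for (size_t i = 0; i < hops; ++i) idx = next[idx];
    auto t1 = std::chrono::steady_clock::now();

    double ns = std::chrono::duration<double, std::nano>(t1 - t0).count();
    // Print idx so the compiler cannot optimize the chain away.
    std::printf("avg dependent-load latency: %.1f ns (idx=%zu)\n", ns / hops, idx);
}
```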

In that story I detailed our coverage of the Ryzen processor and its gaming performance succinctly:

Our team has done quite a bit of research and testing on this topic. This included a detailed look at the first asserted reason for the performance gap, the Windows 10 scheduler. Our summary there was that the scheduler was working as expected and that minimal difference was seen when moving between different power modes. We also talked directly with AMD to find out its then current stance on the results, backing up our claims on the scheduler and presenting a better outlook for gaming going forward. When AMD wanted to test a new custom Windows 10 power profile to help improve performance in some cases, we took part in that too. In late March we saw the first gaming performance update occur courtesy of Ashes of the Singularity: Escalation, where an engine update to utilize more threads resulted in as much as a 31% increase in average frame rate.

Quick on the heels of the Ryzen 7 release, AMD worked with the developer Oxide on the Ashes of the Singularity: Escalation engine. Through tweaks and optimizations, the game was able to showcase as much as a 30% increase in average frame rate on the integrated benchmark. While this was only a single use case, it does prove that through work with the developers, AMD has the ability to improve the 1080p gaming positioning of Ryzen against Intel.

rotr-screen4-small.jpg

Fast forward to today, and I was surprised to find a new patch for Rise of the Tomb Raider (Patch #12, v1.0.770.1), a game that was actually one of the worst-case scenarios for AMD with Ryzen. The patch notes mention the following:

The following changes are included in this patch

- Fix certain DX12 crashes reported by users on the forums.

- Improve DX12 performance across a variety of hardware, in CPU bound situations. Especially performance on AMD Ryzen CPUs can be significantly improved.

While we expect this patch to be an improvement for everyone, if you do have trouble with this patch and prefer to stay on the old version we made a Beta available on Steam, build 767.2, which can be used to switch back to the previous version.

We will keep monitoring for feedback and will release further patches as it seems required. We always welcome your feedback!

Obviously the data point that stood out for me was the improved DX12 performance “in CPU bound situations. Especially on AMD Ryzen CPUs…”

Remember how the situation appeared in April?

rotr.png

The Ryzen 7 1800X was 24% slower than the Intel Core i7-7700K, a dramatic difference for a processor that should only have been ~8-10% slower in single-threaded workloads.

How does this new patch to RoTR affect performance? We tested it on the same Ryzen 7 1800X benchmark platform from previous testing, including the ASUS Crosshair VI Hero motherboard, 16GB of DDR4-2400 memory, and a GeForce GTX 1080 Founders Edition using the 378.78 driver. All testing was done under the DX12 code path.

tr-1.png

tr-2.png

The Ryzen 7 1800X score jumps from 107 FPS to 126.44 FPS, an increase of 17%! That is a significant boost in performance at 1080p while still running at the Very High image quality preset, indicating that the developer (and likely AMD) was able to find substantial inefficiencies in the engine. For comparison, the 8-core / 16-thread Intel Core i7-6900K only sees a 2.4% increase from this new game revision. This tells us that the changes to the game were specific to Ryzen processors and their design, but that no performance was taken away from the Intel platforms.

Continue reading our look at the new Rise of the Tomb Raider patch for Ryzen!

Computex 2017: NVIDIA GeForce GTX Max-Q Design Notebooks are Thinner, Lighter

Subject: Graphics Cards, Mobile | May 30, 2017 - 12:48 AM |
Tagged: nvidia, mobile, max-q design, max-q, GTX 1080, geforce

During CEO Jensen Huang’s keynote at Computex tonight, NVIDIA announced a new initiative called GeForce GTX with Max-Q Design, targeting the mobile gaming market with products that are lighter and thinner yet more powerful than previously available gaming notebooks.

slide1.jpg

The idea behind this technology differentiation centers on gaming notebooks, which have seen limited evolution in form factor and design over the last several years. The biggest stereotype of gaming notebooks today is that they must be big, bulky, and heavy to provide a competitive gaming experience compared to desktop computers. NVIDIA is taking it upon itself to help drive innovation forward in this market, in some ways similar to how Intel created the Ultrabook.

slide2.jpg

Using “typical” specifications from previous machines using a GeForce GTX 880M (admittedly a part that came out in early 2014), NVIDIA claims that Max-Q Designs will offer compelling gaming notebooks with half the weight, nearly a third of the thickness, and yet 3x the performance. Utilizing a GeForce GTX 1080 GP104 GPU, the team is focusing on four specific hardware data points to achieve this goal.

slide3.jpg

First, NVIDIA is setting the specifications of the GPUs in this design to run at their maximum efficiency point, allowing the notebook to get the best possible gaming performance from Pascal with the smallest amount of power draw. This is an obvious move and likely something that has been occurring for a while, but further down the product stack. It’s also likely that NVIDIA is heavily binning the GP104 parts to filter those that require the least amount of power to hit the performance target of Max-Q Designs.

Second, NVIDIA is depending on the GeForce Experience software to choose in-game settings that are optimal for power consumption. Though details are light, this likely means running with a frame rate limit enabled, keeping games from rendering at frame rates well above the screen’s refresh rate (static or G-Sync), which is an unnecessary power drain. It could also mean lower quality settings than we might normally associate with a GeForce GTX 1080 graphics card.
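As a rough illustration of why a frame rate cap saves power, here is a minimal C++ frame-limiter loop. This is not NVIDIA's implementation; renderFrame is a hypothetical stand-in for the game's render call:

```cpp
#include <chrono>
#include <thread>

// Hypothetical stand-in for the game's render work.
void renderFrame() { /* draw one frame */ }

// Render `frames` frames at no more than `targetHz` frames per second.
void runCappedLoop(int frames, double targetHz) {
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / targetHz));
    auto deadline = clock::now() + budget;
    for (int i = 0; i < frames; ++i) {
        renderFrame();
        // If the frame finished early, idle out the remainder of the
        // budget instead of starting a frame the display will never show.
        std::this_thread::sleep_until(deadline);
        deadline += budget;
    }
}

int main() {
    runCappedLoop(60, 60.0);  // one second of frames capped at 60 Hz
}
```

The saving comes from the sleep_until call: every millisecond spent idle is a millisecond the GPU and CPU are not burning power on frames nobody would see.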

systemcomparison.jpg

Comparing a 3-year old notebook versus a Max-Q Design

The third and fourth points are heavily related: using the best possible cooling solutions and integrating the most efficient power regulators available. The former allows the GPU to be cooled quickly and quietly (with a quoted sub-40 dBA goal), keeping the GTX 1080 on its peak efficiency curve. Getting the GPU into that state only to lose power through inefficient delivery hardware would be a waste, so NVIDIA is setting standards there too.

UPDATE: From the NVIDIA news release just posted on the company's website, we learned of a couple of new additions to Max-Q Design:

NVIDIA WhisperMode Technology
NVIDIA also introduced WhisperMode technology, which makes laptops run much quieter while gaming. WhisperMode intelligently paces the game's frame rate while simultaneously configuring the graphics settings for optimal power efficiency. This reduces the overall acoustic level for gaming laptops. Completely user adjustable and available for all Pascal GPU-based laptops, WhisperMode will be available soon through a GeForce Experience software update.

Availability
Max-Q designed gaming laptops equipped with GeForce GTX 1080, 1070 and 1060 GPUs will be available starting June 27 from the world's leading laptop OEMs and system builders, including Acer, Aftershock, Alienware, ASUS, Clevo, Dream Machine, ECT, Gigabyte, Hasee, HP, LDLC, Lenovo, Machenike, Maingear, Mechrevo, MSI, Multicom, Origin PC, PC Specialist, Sager, Scan, Terrans Force, Tronic'5, and XoticPC. Features, pricing and availability may vary.

Jensen showed an upcoming ASUS Republic of Gamers notebook called Zephyrus that hits all of these targets; ASUS is likely NVIDIA’s initial build partner. On it they demonstrated Project Cars 2, an impressive looking title for certain. No information was given on image quality settings, resolutions, frame rates, etc.

zephyrus.jpg

The ASUS ROG Zephyrus Max-Q Design Gaming Notebook

This design standard is impressive, and though I assume many gamers and OEMs will worry about having an outside party setting requirements for upcoming designs, I err on the side of this being a necessary step. If you remember notebooks before the Intel Ultrabook push, they were stagnant and uninspiring. Intel’s somewhat forceful move to make OEMs innovate and compete in a new way changed the ecosystem at a fundamental level. It is very possible that GeForce GTX with Max-Q Design will do the same thing for gaming notebooks.

An initiative like this continues NVIDIA’s apparent goal of establishing itself as the “PC brand”, competing more with Xbox and PlayStation than with Radeon. Jensen claimed that more than 10 million GeForce gaming notebooks were sold in the last year, exceeding the sales of Xbox hardware in the same time frame. He also called out the ASUS prototype notebook as having compute capability 60% higher than that of the PS4 Pro. It’s clear that NVIDIA wants to be more than just the add-in card leader, more than just the leader in computer graphics. Owning the ecosystem vertical gives them more control and power to drive the direction of software and hardware.

asusrog.jpg

The ASUS ROG Zephyrus Max-Q Design Gaming Notebook

So, does the Max-Q Design technology change anything? Considering the Razer Blade is already under 18mm thick, the argument could be made that the market was already going down this path and NVIDIA is simply jumping in to get credit for the move. Though Razer is a great partner for NVIDIA, they are likely irked that NVIDIA is pushing all OEMs toward the type of design Razer started and evangelized, stealing some of its thunder.

That political discussion aside, Max-Q Design will bring new and better gaming notebook options to the market from many OEMs, lowering the price of entry for these flagship designs. NVIDIA did not mention anything about cost requirements or segments around Max-Q, so I do expect the first wave of these to be on the premium end of the scale. Over time, as cost-cutting measures come into place and the demand for thinner, lighter gaming notebooks is well understood, Max-Q Designs could find their way into a wide range of price segments.

Source: NVIDIA

Computex 2017: EVGA Unveils GTX 1080 Ti Kingpin With Guaranteed 2GHz+ Overclock

Subject: Graphics Cards | May 29, 2017 - 08:30 PM |
Tagged: Kingpin, gtx 1080 ti, gpu, evga, computex 2017

EVGA today took the wraps off its latest and highest-end NVIDIA GPU with the announcement of the EVGA GeForce GTX 1080 Ti Kingpin Edition. Part of the company's continuing line of "K|NGP|N" licensed graphics cards, the 1080 Ti Kingpin includes performance, cooling, and stability-minded features that are intended to set it apart from all of the other 1080 Ti models currently available.

evga-1080-ti-kingpin-top.jpg

From a design standpoint, the 1080 Ti Kingpin features an oversized PCB, a triple-fan iCX cooler, an expansive copper heat sink, and right-edge PCIe power connectors (2 x 8-pin), meaning that those with an obsession for cable management won't need to pick up something like the EVGA PowerLink. The card's design is also thin enough that owners can convert it into a true single-slot card by removing the iCX cooler, allowing enthusiasts to pack more water- or liquid nitrogen-cooled GPUs into a single chassis.

The GTX 1080 Ti Kingpin also features a unique array of display outputs, with dual-link DVI, HDMI 2.0, and three Mini DisplayPort 1.3 connectors. This compares with the three full-size DisplayPort and single HDMI outputs found on the 1080 Ti reference design. The presence of the DVI port on the Kingpin edition also directly addresses the concerns of some NVIDIA customers who weren't fans of NVIDIA's decision to ditch the "legacy" connector.

evga-1080-ti-kingpin-power.jpg

With its overbuilt PCB and enhanced cooling, EVGA claims that users will be able to achieve greater performance from the Kingpin Edition compared to any other currently shipping GTX 1080 Ti. That includes a "guaranteed" overclock of at least 2025MHz right out of the box, which compares to the 1480MHz base / 1600MHz boost clock advertised for the 1080 Ti's reference design (although it's important to note that NVIDIA's advertised boost clocks have become quite conservative in recent years, and many 1080 Ti owners are able to easily exceed 1600MHz with modest overclocking).

EVGA has yet to confirm an exact release date for the GeForce GTX 1080 Ti Kingpin, but it is expected to launch in late June or July. As for price, EVGA has also declined to provide specifics, but interested enthusiasts should start saving their pennies now. Based on previous iterations of the "K|NGP|N" flagship model, expect a price premium of anywhere between $100 and $400.

Source: EVGA

SoftBank Invests $4 Billion In NVIDIA, Becomes Fourth Largest Shareholder

Subject: General Tech, Graphics Cards | May 27, 2017 - 12:18 AM |
Tagged: vision fund, softbank, nvidia, iot, HPC, ai

SoftBank, the Tokyo-based Japanese telecom and internet technology company, has reportedly quietly amassed a 4.9% stake in graphics chip giant NVIDIA. Bloomberg reports that SoftBank has carefully invested $4 billion into NVIDIA, avoiding the need for US regulatory approval by keeping its stake under 5% of the company. SoftBank has promised the current administration that it will invest $50 billion into US tech companies, and it seems that NVIDIA is the first major part of that plan.

SXM2-VoltaChipDetails.png

NVIDIA's Tesla V100 GPU.

Led by Chairman and CEO Masayoshi Son, SoftBank is not afraid to invest in technology companies it believes in, with major past acquisitions and investments including ARM Holdings, Sprint, Alibaba, and game company Supercell.

The $4 billion investment makes SoftBank the fourth largest shareholder in NVIDIA, whose stock has rallied on SoftBank’s purchases and vote of confidence. The $100 billion Vision Fund (currently at $93 billion) may also follow SoftBank’s lead in acquiring a stake in NVIDIA, which is involved in graphics, HPC, AI, deep learning, and gaming.

Overall, this is good news for NVIDIA and its shareholders. I am curious what other plays SoftBank will make for US tech companies.

What are your thoughts on SoftBank investing heavily in NVIDIA?

EVGA's Hydro Copper waterblock for GTX 1080

Subject: Graphics Cards | May 26, 2017 - 03:56 PM |
Tagged: evga, Hydro Copper GTX 1080, water cooler, nvidia

EVGA's Hydro Copper GTX 1080 is purpose-built to fit any GTX 1080 on the market, with thermal pads for the memory and VRMs already attached and a tube of EVGA Frostbite thermal paste included for the GPU.  The ports that connect into your watercooling loop are further apart than usual, something TechPowerUp was initially skeptical about, though those doubts disappeared once they tested the cooler; they did have other concerns about the design. Check out the review for the full details on this cooler's performance.

block-2.jpg

"The EVGA Hydro Copper GTX 1080 is a full-cover waterblock that offers integrated lighting with no cable management needed, a six-port I/O port manifold, and an aluminum front cover for aesthetics and rigidity alike. It also aims to simplify installation by incorporating pre-installed thermal pads out of the box."


Source: TechPowerUp

ZOTAC also announced an External VGA Box

Subject: Graphics Cards, Shows and Expos | May 25, 2017 - 07:24 PM |
Tagged: external gpu, zotac, thunderbolt 3, computex 2017

zotac-external-vga-box_image01.jpg

They haven't given us much detail, but as you would expect the ZOTAC external GPU box connects a GPU to your system via a Thunderbolt 3 connector, allowing you to add more GPU power to a mobile system or any other computer which needs a little boost to its graphics.  You can fit cards of up to 9" in length, which makes it a perfect match for the two Mini GPUs just below, or other cards which are not as well endowed as your average GTX 1080 or 1080 Ti.  It also adds four USB 3.0 ports and a Quick Charge 3.0 port, so you can leave the box set up at home, attach your laptop via the Thunderbolt cable, and get right to gaming.

zotac-external-vga-box_image02.jpg

Source: Zotac

Zotac announces a pair of really Mini GTX 1080 Ti's

Subject: Graphics Cards, Shows and Expos | May 25, 2017 - 07:00 PM |
Tagged: zotac, GTX 1080 Ti Mini, GTX 1080 Ti Arctic Storm Mini, gtx 1080 ti, computex 2017

ZOTAC is claiming bragging rights for the size of its new GTX 1080 Tis: they are the smallest of their kind.  The two new cards measure a minuscule 210.8mm (8.3") in length, and the Arctic Storm Mini is the lightest watercooled GPU on the market.

ZT-P10810G-10P_image2.jpg

You can gauge the size of the ZOTAC GeForce GTX 1080 Ti Mini by how much of its length is taken up by the PCIe connector; most 1080 Tis are over a foot long.  This card is not even long enough to fit a third fan.

1080Ti-ArcticStorm-Mini-05.png

The Arctic Storm version is the same size as the air-cooled model but opts for the world's lightest watercooler.  That may mean you want a powerful pump in your loop, as there is less metal to transfer heat, but it means small, silent builds can pack a lot of graphical power.

Both of these cards use dual 8-pin PCIe power connectors; expect to see more of them at Computex.

 

Source: Zotac