Leaked AMD Fiji Card Images Show Small Form Factor, Water Cooler Integration

Subject: Graphics Cards | May 22, 2015 - 09:39 AM |
Tagged: wce, radeon, Fiji, amd, 390x

UPDATE (5/22/15): Johan Andersson tweeted out this photo this morning, with the line: "This new island is one seriously impressive and sweet GPU. wow & thanks @AMDRadeon ! They will be put to good use :)"  Looks like we can confirm that at least one of the parts AMD is releasing does have the design of the images we showed you before, though the water cooling implementation is missing or altered.

repi-amd.jpg

END UPDATE

File this under "rumor" for sure, but a cool one none the less...

After yesterday's official tidbit of information surrounding AMD's upcoming flagship graphics card for enthusiasts and its use of HBM (high bandwidth memory), it appears we have another leak on our hands. The guys over at Chiphell have apparently acquired some stock footage of the new Fiji flagship card (whether or not it will be called the 390X has yet to be seen) and it looks...awesome.

390x1.jpg

In that post from yesterday I noted that with an HBM design AMD could in theory build an add-in card with a different form factor than anything we have previously seen for a high end part. Based on the image above, if this turns out to be the high end Fiji offering, it appears the PCB will indeed be quite small as it no longer requires memory surrounding the GPU itself. You can also see that it will in fact be water cooled, though in this image it looks like it has barb inlets rather than a pre-attached cooler.

390x2.jpg

The second leaked image shows display outputs consisting of three full-size DisplayPort connections and a single HDMI port.

All of this could be faked of course, but if it is, the joker did a damn good job of compiling all the information into one design. If it's real, I think AMD might finally have a match for the look and styling of the high-end GeForce offerings.

What do you think: real or fake? Cool or meh? Let us know!

Source: Chiphell

Could it be a 980 Ti, or does the bill of lading lie?

Subject: Graphics Cards | May 21, 2015 - 07:33 PM |
Tagged: rumour, nvidia, 980 Ti

980ti.PNG

The source of leaks and rumours is often unexpected, such as this import data of a shipment headed from China into India. Could this 6GB card be the GTX 980 Ti that so many have theorized would be coming sometime around AMD's release of their new cards? Does the fact that 60,709 Indian Rupees equal roughly 954 US Dollars put a damper on your excitement, or could it be that these 6 lonely cards are being sold at a higher rate overseas than they might be in the US?

We don't know but we do know there is a mysterious card out there somewhere.

Source: Zauba

How about that High Bandwidth Memory

Subject: Graphics Cards | May 19, 2015 - 03:51 PM |
Tagged: memory, high bandwidth memory, hbm, Fiji, amd

Ryan and the rest of the crew here at PC Perspective are excited about AMD's new memory architecture and the fact that they will be first to market with it. However, as any intelligent reader knows, a second opinion on the topic is worth finding. Look no further than The Tech Report, who have also been briefed on AMD's new memory architecture. Read on to see what they learned from Joe Macri, along with their thoughts on this successor to GDDR5 and on HBM2, which is already in the works.

stack-diagram.jpg

"HBM is the next generation of memory for high-bandwidth applications like graphics, and AMD has helped usher it to market. Read on to find out more about HBM and what we've learned about the memory subsystem in AMD's next high-end GPU, code-named Fiji."


NVIDIA Under Attack Again for GameWorks in The Witcher 3: Wild Hunt

Subject: Graphics Cards | May 17, 2015 - 12:04 PM |
Tagged: The Witcher 3, nvidia, hairworks, gameworks, amd

I feel like every few months I get to write more stories focusing on the exact same subject. It's almost as if nothing in the enthusiast market is happening and thus the cycle continues, taking all of us with it on a wild ride of arguments and valuable debates. Late last week I started hearing from some of my Twitter followers that there were concerns surrounding the upcoming release of The Witcher 3: Wild Hunt. Then I found a link to this news post over at Overclock3d.net that put some of the information in perspective.

Essentially, The Witcher 3 uses parts of NVIDIA's GameWorks development tools and APIs, software written by NVIDIA to help game developers take advantage of new technologies and to quickly and easily implement them into games. The problem of course is that GameWorks is written and developed by NVIDIA. That means that optimizations for AMD Radeon hardware are difficult or impossible, depending on who you want to believe. Clearly it doesn't benefit NVIDIA to optimize its software for AMD GPUs financially, though many in the community would like NVIDIA to give a better effort - for the good of said community.

gwlogo.jpg

Specifically in regards to The Witcher 3, the game implements NVIDIA HairWorks technology to add realism to many of the creatures of the game world. (Actually, the game includes HairWorks, HBAO+, PhysX, Destruction, and Clothing, but our current discussion focuses on HairWorks.) All of the marketing and video surrounding The Witcher 3 has been awesome and the realistic animal fur simulation has definitely been a part of it. However, it appears that AMD Radeon GPU users are concerned that performance with HairWorks enabled will suffer.

An example of The Witcher 3: Wild Hunt with HairWorks

One of the game's developers has been quoted as saying:

Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology – the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.

There are several interpretations of this statement floating around the web. The first, and most inflammatory, is that NVIDIA is not allowing CD Project Red to optimize it by not offering source code. Another is that CD Project is choosing not to optimize for AMD hardware due to time considerations. The last is that it simply isn't possible to optimize it because of the hardware's limitations when running HairWorks.

I went to NVIDIA with these complaints about HairWorks and Brian Burke gave me this response:

We are not asking game developers do anything unethical.
 
GameWorks improves the visual quality of games running on GeForce for our customers.  It does not impair performance on competing hardware.
 
Demanding source code access to all our cool technology is an attempt to deflect their performance issues. Giving away your IP, your source code, is uncommon for anyone in the industry, including middleware providers and game developers. Most of the time we optimize games based on binary builds, not source code.
 
GameWorks licenses follow standard industry practice.  GameWorks source code is provided to developers that request it under license, but they can’t redistribute our source code to anyone who does not have a license. 
 
The bottom line is AMD’s tessellation performance is not very good and there is not a lot NVIDIA can/should do about it. Using DX11 tessellation has sound technical reasoning behind it, it helps to keep the GPU memory footprint small so multiple characters can use hair and fur at the same time.
 
I believe it is a resource issue. NVIDIA spent a lot of artist and engineering resources to help make Witcher 3 better. I would assume that AMD could have done the same thing because our agreements with developers don’t prevent them from working with other IHVs. (See also, Project Cars)
 
I think gamers want better hair, better fur, better lighting, better shadows and better effects in their games. GameWorks gives them that.  

Interesting comments for sure. The essential takeaway from this is that HairWorks depends heavily on tessellation performance, and we have known since the GTX 680 was released that NVIDIA's architecture performs better than AMD's GCN at tessellation, often by a significant amount. NVIDIA developed its middleware to utilize the strength of its own GPU technology, not, though it's clear that some disagree, to negatively impact AMD. Did NVIDIA know that would be the case when it was developing the software? Of course it did. Should it have done something to help AMD GPUs more gracefully fall back? Maybe.

Next, I asked Burke directly whether claims that NVIDIA was preventing AMD or the game developer from optimizing HairWorks for other GPUs and platforms were true. I was told that both AMD and CD Project had the ability to tune the game, but in different ways. The developer could change the tessellation density based on the specific GPU detected (lower for a Radeon GPU with less tessellation capability, for example), but that would require dedicated engineering from either CD Project or AMD. AMD, without access to the source code, should be able to make changes in the driver at the binary level, similar to how most other driver optimizations are built. Burke said that in these instances NVIDIA often sends engineers to work with game developers and that AMD "could have done the same had it chosen to." And again, NVIDIA reiterated that in no way do its agreements with game developers prohibit optimization for AMD GPUs.
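To make that fallback concrete, here is a minimal sketch (in Python, purely for illustration) of the kind of per-GPU tuning Burke describes: detect the installed GPU and dial back the tessellation density for hardware that handles it poorly. The detection helper and the factor values are hypothetical; they are not taken from HairWorks, the game, or any driver.

```python
# Hypothetical sketch: scale tessellation density by detected GPU vendor.
# Names and values are illustrative; they are not from HairWorks or CD Project's code.

def detect_gpu_vendor() -> str:
    """Stand-in for a real adapter query (e.g. via DXGI or the driver)."""
    return "AMD"  # pretend an R9-series Radeon was found

def hair_tessellation_factor(vendor: str, base_factor: int = 64) -> int:
    """Pick a tessellation factor for the hair effect based on the GPU.

    Hardware that handles high tessellation factors well keeps the full
    density; other GPUs get a reduced factor, trading some visual density
    for frame rate.
    """
    if vendor == "NVIDIA":
        return base_factor               # full density
    if vendor == "AMD":
        return max(8, base_factor // 4)  # reduced-density fallback
    return max(8, base_factor // 8)      # conservative default for unknown GPUs

if __name__ == "__main__":
    vendor = detect_gpu_vendor()
    print(f"{vendor}: using tessellation factor {hair_tessellation_factor(vendor)}")
```

In practice, logic like this would live in the game's renderer or in the driver, which is exactly where Burke says either CD Project or AMD could have put it.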

It would also be possible for AMD to have pushed for the implementation of TressFX in addition to HairWorks; a similar scenario played out in Grand Theft Auto V where several vendor-specific technologies were included from both NVIDIA and AMD, customized through in-game settings. 

NVIDIA has never been accused of being altruistic; it doesn't often create things and then share them with open arms with the rest of the hardware community. But it has to be understood that game developers know this as well; they are not oblivious. CD Project knew that HairWorks performance on AMD would be poor but decided to implement the technology into The Witcher 3 anyway. They were willing to accept performance penalties for some users to improve the experience of others. You can argue that this was not the best choice, but at the very least The Witcher 3 will let you disable the HairWorks feature completely, removing it from the performance debate altogether.

In a perfect world for consumers, NVIDIA and AMD would walk hand-in-hand through the fields and develop hardware and software in tandem, making sure all users get the best possible experience with all games. But that style of work is only helpful (from a business perspective) for the organization attempting to gain market share, not the one with the lead. NVIDIA doesn't have to do it and chooses not to. If you don't want to support that style, vote with your wallet.

Another similar controversy surrounded the recent release of Project Cars. AMD GPU performance was significantly lower than on comparable NVIDIA GPUs, even though this game does not implement any GameWorks technologies. In that case, the game's developer directly blamed AMD's drivers, saying that it was a lack of outreach from AMD that caused the issues. AMD has since recanted its stance that the performance delta was "deliberate" and says a pending driver update will address gamers' performance issues.

wicher3-1.jpg

All arguing aside, this game looks amazing. Can we all agree on that?

The only conclusion I can come to from all of this is that if you don't like what NVIDIA is doing, that's your right, and you aren't necessarily wrong. There will be plenty of readers that see the comments made by NVIDIA above and continue to believe that they are being, at best, disingenuous and, at worst, straight up lying. As I mentioned above in my own comments, NVIDIA is still a for-profit company that is responsible to shareholders for profit and growth. And in today's world that sometimes means working against other companies rather than with them, resulting in impressive new technologies for its customers and push back from competitors' customers. It's not fun, but that's how it works today.

Fans of AMD will point to G-Sync, GameWorks, CUDA, PhysX, FCAT and even SLI as indications of NVIDIA's negative impact on open PC gaming. I would argue that more users would look at that list and see improvements to PC gaming, progress that helps make gaming on a computer so much better than gaming on a console. The truth likely rests somewhere in the middle; there will always be those individuals that immediately side with one company or the other. But it's the much larger group in the middle, that shows no corporate allegiance and instead just wants to have as much fun as possible with gaming, that will impact NVIDIA and AMD the most.

So, since I know it will happen anyway, use the comments page below to vent your opinion. But, for the benefit of us all, try to keep it civil!

NVIDIA Releases 352.84 WHQL Drivers for Windows 10

Subject: Graphics Cards | May 15, 2015 - 11:36 PM |
Tagged: windows 10, geforce, graphics drivers, nvidia, whql

The last time NVIDIA released a graphics driver for Windows 10, they added a download category to their website for the pre-release operating system. Since about January, graphics driver updates were pushed through Windows Update and, before that, you needed to use Windows 8.1 drivers. Receiving drivers from Windows Update also meant that add-ons, such as the PhysX runtimes and GeForce Experience, would not be bundled with them. I know that some have installed them separately, but I didn't.

nvidia-geforce.png

The 352.84 release, which is their second Windows 10 driver to be released outside of Windows Update, is also certified by WHQL. NVIDIA has recently been touting Microsoft certification for many of their drivers. Historically, they released a large number of Beta drivers that were stable, but did not wait for Microsoft to vouch for them. For one reason or another, they have put a higher priority on that label, even for “Game Ready” drivers that launch alongside a popular title.

For some reason, the driver is only available via GeForce Experience and NVIDIA.com, but not GeForce.com. I assume NVIDIA will publish it there soon, too.

Source: NVIDIA

Oculus Rift "Full Rift Experience" Specifications Released

Subject: Graphics Cards, Processors, Displays, Systems | May 15, 2015 - 03:02 PM |
Tagged: Oculus, oculus vr, nvidia, amd, geforce, radeon, Intel, core i5

Today, Oculus published a list of what they believe should drive their VR headset. The Oculus Rift will obviously run on lower-end hardware. Their minimum specifications, published last month and focused on the Development Kit 2, did not even list a specific CPU or GPU, just a DVI-D or HDMI output. They then went on to say that you really should use a graphics card that can handle your game at 1080p with at least 75 fps.

oculus-dk2-product.jpg

The current list is a little different:

  • NVIDIA GeForce GTX 970 / AMD Radeon R9 290 (or higher)
  • Intel Core i5-4590 (or higher)
  • 8GB RAM (or higher)
  • A compatible HDMI 1.3 output
  • 2x USB 3.0 ports
  • Windows 7 SP1 (or newer)

I am guessing that, unlike the previous list, Oculus has a clearer vision for a development target. They were a little unclear about whether this refers to the consumer version or the current needs of developers. In either case, it would likely serve as a guide for what they believe developers should target when the consumer version launches.

This post also coincides with the release of the Oculus PC SDK 0.6.0. This version pushes distortion rendering into the Oculus Server process rather than the application. It also allows multiple canvases to be sent to the SDK, which means developers can render text and other noticeable content at full resolution but scale back in places that the user is less likely to notice. Layers can also be updated at different frequencies, such as skipping the HUD redraw unless a value changes.
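To illustrate the idea (this is a conceptual sketch, not the actual Oculus PC SDK C API; every name here is made up), think of each canvas as a layer with its own resolution and a dirty flag, with the compositor reusing a layer's last image until something actually changes:

```python
# Conceptual sketch of layered submission with independent update rates.
# This is NOT the Oculus PC SDK API; names and structure are illustrative only.

from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    resolution: tuple      # each layer can render at its own resolution
    dirty: bool = True     # only re-render when content actually changed
    image: object = None   # last submitted image, reused if not dirty

def render(layer: Layer) -> object:
    """Stand-in for rendering a layer into its own texture."""
    return f"frame for {layer.name} at {layer.resolution}"

def submit_frame(layers):
    """Re-render only dirty layers, then hand the full stack to the compositor."""
    for layer in layers:
        if layer.dirty or layer.image is None:
            layer.image = render(layer)
            layer.dirty = False
    return [layer.image for layer in layers]   # compositor receives every layer

if __name__ == "__main__":
    world = Layer("world", (1920, 1080))
    hud = Layer("HUD text", (2560, 1440))      # full-resolution text layer
    print(submit_frame([world, hud]))          # both rendered on the first frame
    world.dirty = True                         # the world changes every frame...
    print(submit_frame([world, hud]))          # ...but the HUD image is simply reused
```

The appeal for VR is that expensive, noticeable content like text stays sharp while everything else can be scaled back or updated less often.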

The Oculus PC SDK (0.6.0) is now available at the Oculus Developer Center.

Source: Oculus

Rumor: AMD Radeon R9 300-series Release Dates

Subject: Graphics Cards | May 14, 2015 - 07:00 AM |
Tagged: tonga, radeon, R9, pitcairn, Fiji, bonaire, amd

Benchlife.info, via WCCFTech, believes that AMD's Radeon R9 300-series GPUs will launch in late June. Specifically, the R9 380, the R7 370, and the R7 360 will arrive on the 18th of June. These are listed as OEM parts, as we have mentioned on the podcast, which Ryan speculates could mean that the flagship Fiji XT might go by a different name. Benchlife.info seems to think that it will be called the R9 390(X), though, and that it will be released on the 24th of June.

WCCFTech is a bit more timid, calling it simply “Fiji XT”.

amd-gaming-evolved.png

In relation to industry events, this has the OEM lineup launching on the last day of E3 and Fiji XT launching in the middle of the following week. This feels a little weird, especially because AMD's E3 event with PC Gamer is on the 16th. While it makes sense for AMD to announce the launch a few days before it happens, that doesn't make sense for OEM parts unless they were going to announce a line of pre-built PCs. The most likely candidate to launch gaming PCs is Valve, and they're one of the few companies that are absent from AMD's event.

And this is where I run out of ideas. Launching a line of OEM parts at E3 is weird unless it was to open the flood gates for OEMs to make their own announcements. Unless Valve is scheduled to make an announcement earlier in the day, or a surprise appearance at the event, that seems unlikely. Something seems up, though.

Source: WCCFTech

Giveaway! Take home a 20th Anniversary ASUS GeForce GTX 980!

Subject: Graphics Cards | May 12, 2015 - 12:11 PM |
Tagged: GTX 980, giveaway, contest, asus, anniversary

I bet many of you are going to feel a little old as you read this, but ASUS is celebrating its 20th anniversary of being in the graphics card market! Starting in 1996, when ASUS launched its first discrete graphics product based on the S3 Virge/DX GPU and running through 2015 with the release of updated GTX 900-series and AMD Radeon 200-series cards, ASUS has been a pivotal part of the GPU landscape.

GiveAwayBanner_20thGTX980_2.jpg

To celebrate this achievement, ASUS built a new, gold, 20th anniversary edition of its GeForce GTX 980 product, limited to only 200 units in all of North America! And the best part? You can win one right here on PC Perspective for FREE!

asus20th-2.jpg

This amazing graphics card has some killer features:

  • GTX 980 GPU
  • 4GB GDDR5 memory at 7.0 GHz
  • 1317 MHz Base / 1431 MHz Boost clocks
  • 8+8-pin power connections
  • DirectCU II 0db Fan Technology
  • Memory Defroster
  • "Safe Mode" VBIOS reload
  • 14-phase Super Alloy power

asus20th-3.jpg

How do you enter for this prize? Pay attention, there are some specifics!

  1. First and foremost, you must be subscribed to the ASUS PCDIY email list, which you can sign up for at http://pcdiy.asus.com/. Repeat: this is a requirement!
     
  2. You must leave a comment on the news story here and tell us what you love about ASUS graphics cards or the new 20th anniversary GTX 980 Special Edition!
     
  3. This is a global contest - so feel free to enter from anywhere in the world!
     
  4. You can gain other entries by utilizing the Gleam.io form below!

PCPer / ASUS 20th Anniversary GTX 980 Contest

The contest will end on May 15th at 9pm ET so be sure to get your entries in early!

Source: ASUS

AMD Previews New Flagship GPU with HBM and 3x Perf/Watt Improvements

Subject: Graphics Cards | May 6, 2015 - 02:21 PM |
Tagged: amd, hbm, radeon, gpu

During today's 2015 AMD Financial Analyst Day, CEO Dr. Lisa Su discussed some of the details of the upcoming enthusiast Radeon graphics product. Though it wasn't given a name, she repeatedly said that the product would be announced "in the coming weeks...at upcoming industry events."

63566517786.png

You won't find specifications here, but understanding the goals and targets AMD has for this new flagship product will help tell the story of this new Radeon product. Dr. Su sees AMD investing at very specific inflection points, the most recent of which are DirectX 12, 4K displays and VR technology. With the adoption of HBM (high bandwidth memory) that sits on the same package as the GPU, rather than spread across the PCB, we will see both a reduction in power consumption and a significant increase in GPU memory bandwidth.

63566517812.png

HBM will accelerate performance improvements at those key inflection points Dr. Su mentioned. Additional memory bandwidth will help discrete GPUs push 4K resolutions and beyond without being held back by the bandwidth demands of larger textures. AMD's LiquidVR software, in conjunction with HBM, will be able to improve latency and reduce performance concerns on current and future generations of virtual reality hardware.

63566517880.png

One interesting comment made during the conference was that HBM would enable new form factors for GPUs now that you no longer need to have memory spread out on a PCB. While there isn't much room in the add-in card market for differentiation, in the mobile space that could mean some very interesting things for higher performance gaming notebooks.

Mark Papermaster, AMD CTO, said earlier in the conference that HBM would aid performance but, perhaps more importantly, will lower power and improve total GPU efficiency. HBM will offer more than 3x improved performance per watt compared to GDDR5 while also running at more than 50% lower power. Lower power and higher performance upgrades don't happen often, so I am really excited to see what AMD does with it.
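Taking those two claims at face value, some quick back-of-the-envelope math shows what they imply for raw bandwidth. The GDDR5 baseline numbers below are placeholders for illustration, not AMD's figures:

```python
# Back-of-the-envelope: what 3x perf/watt at roughly half the power implies.
# Baseline values are illustrative placeholders, not figures from AMD.

gddr5_bandwidth_gbps = 320.0     # e.g. a wide GDDR5 configuration (placeholder)
gddr5_memory_power_w = 30.0      # assumed memory-subsystem power (placeholder)

perf_per_watt_gain = 3.0         # "more than 3x perf/watt" (AMD's claim)
power_ratio = 0.5                # "more than 50% lower power" (AMD's claim)

hbm_power_w = gddr5_memory_power_w * power_ratio
hbm_perf_per_watt = (gddr5_bandwidth_gbps / gddr5_memory_power_w) * perf_per_watt_gain
hbm_bandwidth_gbps = hbm_perf_per_watt * hbm_power_w

print(f"Implied HBM bandwidth: {hbm_bandwidth_gbps:.0f} GB/s at {hbm_power_w:.0f} W "
      f"(vs {gddr5_bandwidth_gbps:.0f} GB/s baseline)")
```

In other words, triple the efficiency at half the power works out to roughly 1.5x the bandwidth of whatever GDDR5 baseline you start from.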

63566516712.png

There weren't any more details on the next flagship Radeon GPU but it doesn't look like we'll have to wait much longer.

Source: AMD

New GeForce Bundle: Witcher 3 and Batman: Arkham Knight

Subject: Graphics Cards | May 5, 2015 - 12:46 PM |
Tagged: The Witcher 3, nvidia, GTX 980, GTX 970, gtx, geforce, batman arkham knight

NVIDIA has announced a new game bundle for GeForce graphics cards starting now, and it’s a doozy: Purchase a qualifying GTX card and receive download codes for both Witcher 3: Wild Hunt and Batman: Arkham Knight!

batman-witcher-glp-header.png

Needless to say, both of these titles have been highly anticipated, and neither has been released just yet: Witcher 3: Wild Hunt is due on May 19, and Batman: Arkham Knight arrives on June 23. So which cards qualify? Amazon has a page specifically created for this new offer here, and depending on which card you select you’ll be eligible for either both upcoming games or just Witcher 3. NVIDIA has this chart on their promotion page for reference:

eligible_cards.png

So GeForce GTX 980 and GTX 970 cards qualify for both games, and GTX 960 cards and the mobile GPUs qualify for Witcher 3 alone. Regardless of which game or card you may choose, these PC versions of the new games will feature graphics that can’t be matched by consoles, and NVIDIA points out the advantages in their post:

"Both Batman and The Witcher raise the bar for graphical fidelity, rendering two very different open worlds with a level of detail we could only dream of last decade. And on PC, each is bolstered by NVIDIA GameWorks effects that increase fidelity, realism, and immersion. But to use those effects in conjunction with the many other available options, whilst retaining a high frame rate, it’s entirely possible that you’ll need an upgrade."

NVIDIA also posted this Witcher 3: Wild Hunt behind-the-scenes video:

In addition to the GameWorks effects present in Witcher 3, NVIDIA worked with developer Rocksteady on Batman: Arkham Knight “to create new visual effects exclusively for the game’s PC release.” The post elaborates:

"This is in addition to integrating the latest and greatest versions of our GPU-accelerated PhysX Destruction, PhysX Clothing and PhysX Turbulence technologies, which work in tandem to create immersive effects that react realistically to external forces. In previous Batman: Arkham games, these technologies created immersive fog and ice effects, realistic destructible scenery, and game-enhancing cloth effects that added to the atmosphere and spectacle. Expect to see similar effects in Arkham Knight, along with new features and effects that we'll be talking about in depth as we approach Arkham Knight’s June 23rd release."

batman-arkham-knight-screenshot-1.jpg

Of note, mobile GPUs included in the promotion will only receive download codes if the seller of the notebook is participating in the "Two Times The Adventure" offer, so be sure to check that out when looking for a gaming laptop that qualifies!

The promotion is going on now and is available “for a limited time or while supplies last”.

Source: NVIDIA

Rumor: Could AMD and PC Gamer Event at E3 Finally Bring New GPU Announcements?

Subject: Graphics Cards | April 30, 2015 - 08:28 PM |
Tagged: PC Gamer, gpu, Fiji, E3 2015, amd

We haven’t had much more than rumor and speculation about upcoming AMD graphics for a while now, but there is too much fresh fuel for the GPU fire today to ignore it completely. It seems that AMD and PC Gamer magazine have teamed up to announce a special (what else?) PC gaming event at this year’s E3 show on June 16, and this would be the perfect place for some new hardware announcements.

PCGAMINGSHOW.jpg

Not enough for you? Well, while the AMD Fiji GPU rumors are nothing new to followers of industry news, it has now been indirectly announced that the upcoming Fiji GPU from AMD will in fact feature 2.5D high-bandwidth memory (HBM). As reported by tech news/rumor site WCCFTech, the announcement came via the official schedule for the upcoming Hot Chips symposium, which is slated for August 23-25 in Cupertino, California.

Screenshot_2015-04-30-10-25-44.png

This screenshot was taken this morning from the official online event schedule

(Note: This part of the day 2 schedule has now been changed to read “AMD’s Next Generation GPU and Memory Architecture”, with all mention of Fiji and HBM removed.)

Whether this gives us insight into the actual release date of the long-awaited Fiji GPU from AMD is unclear, but new AMD GPU products certainly seem to be imminent as we move into the summer months. Speculation is fun (for a while), but hopefully the PC gaming event at E3 in June will provide at least some official news from AMD on the new GPU products we've been waiting for.

Source: PC Gamer

Got Windows 10? Get GeForce 352.63 beta

Subject: Graphics Cards | April 29, 2015 - 03:49 PM |
Tagged: GeForce 352.63, beta, windows 10

The new Windows 10 NVIDIA GeForce driver is here, in two different flavours depending on your form factor. If you spent the money on a gaming laptop with a GeForce 600M through 900M GPU, then this is the driver for you. On the other hand, if you have a traditional desktop and a GPU or two, then head to this page.

index.jpg

If you have a Sony laptop you should double-check that your GPU is covered, and unfortunately at this point Hybrid Power technology is not supported. NVIDIA did not provide much additional information on the desktop side; it is a beta driver and so is the OS, so make sure to record full information about your bugs and crashes when reporting them, not just a frowny face followed by expletives.

 

Source: NVIDIA

GTA V: The GPU review

Subject: Graphics Cards | April 21, 2015 - 04:07 PM |
Tagged: GTA5, gaming, titan x, GTX 980, R9 290X, r9 295x2

Some sort of game involving stolen cars and prostitutes in an open sore world has arrived, and the questions about what it takes to make the game look good are popping up like pills. [H]ard|OCP seems to have heard of the game and tested its performance on the top-performing video cards from AMD and NVIDIA, in both singles and doubles. You will get more out of a double, but unfortunately only around a 50% improvement, so obviously that second shot is watered down a bit. In the end the GTX TITAN X was the best choice for those who want to crank everything up, with the 980 tasting slightly better than the 290X for those that actually have to ask the price. Check the full review here.

1429511282q5iVvFquHG_1_9_l.jpg

"Grand Theft Auto V has finally been released on the PC. In this preview we will look at some video card comparisons in performance, maximize graphics settings at 1440p and 4K. We will briefly test AMD CHS Shadow and NVIDIA PCSS shadow and talk about them. We will even see if SLI and CrossFire work."


Source: [H]ard|OCP

Red Hat Joins Khronos Group

Subject: General Tech, Graphics Cards | April 20, 2015 - 07:30 AM |
Tagged: Red Hat, Khronos

With a brief blog post, Red Hat has announced that they are now members of the Khronos Group. Red Hat, one of the largest vendors of Linux software and services, would like to influence the direction of OpenGL and the upcoming Vulkan API. Also, apart from Valve, they are one of the only Linux vendors that contributes to the Khronos Group as an organization. I hope that their input counter-balances Apple, Google, and Microsoft, who are each members, in areas that are beneficial to the open-source operating system.

redhat-logo.png

As for now, Red Hat intends to use their membership to propose OpenGL extensions as well as influence Vulkan as previously mentioned. It also seems reasonable that they would push for extensions to Vulkan, which the Khronos Group mentioned would support extensions at GDC, especially if something that they need fails to reach “core” status. While this feels late, I am glad that they at least joined now.

Source: Red Hat

Moore's Law Is Fifty Years Old!

Subject: General Tech, Graphics Cards, Processors | April 19, 2015 - 02:08 PM |
Tagged: moores law, Intel

While he was the director of research and development at Fairchild Semiconductor, Gordon E. Moore predicted that the number of components in an integrated circuit would double every year. Later, this time-step would slow to every two years; you can occasionally hear people talk about eighteen months too, but I am not sure who derived that number. A few years later, he would go on to found Intel with Robert Noyce, where they spend tens of billions of dollars annually to keep up with the prophecy.

Intel-logo.png

It works out for the most part, but we have been running into physical issues over the last few years. One major issue is that, with process technology dipping into the single- and low double-digit nanometers, we are running out of physical atoms to manipulate. The distance between silicon atoms in a solid at room temperature is about 0.5nm; a 14nm product has features containing about 28 atoms across, give or take a few in rounding error.
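That atom count is easy to check; using the article's ~0.5 nm spacing as the only assumption, a few lines of arithmetic reproduce it for several process nodes:

```python
# Rough atom count across a process feature, using ~0.5 nm silicon atom spacing.
atom_spacing_nm = 0.5     # approximate spacing between silicon atoms (figure used above)

for feature_nm in (28, 22, 14, 10, 7):
    atoms = feature_nm / atom_spacing_nm
    print(f"{feature_nm:>3} nm feature is roughly {atoms:.0f} atoms across")
# 14 nm works out to roughly 28 atoms, matching the estimate above.
```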

Josh has a good editorial that discusses this implication with a focus on GPUs.

It has been a good fifty years since the start of Moore's Law. Humanity has been developing plans for how to cope with the eventual end of silicon lithography process shrinks. We will probably transition to smaller atoms and molecules and later consider alternative technologies like photonic crystals, which route light in the hundreds of terahertz through a series of waveguides that make up an integrated circuit. Another interesting thought: will these technologies fall in line with Moore's Law in some way?

Source: Tom Merritt

NVIDIA Released 350.12 WHQL and AMD Released 15.4 Beta for Grand Theft Auto V

Subject: Graphics Cards | April 14, 2015 - 01:27 AM |
Tagged: nvidia, amd, GTA5

Grand Theft Auto V launched today at around midnight GMT worldwide. This corresponded to 7PM EDT for those of us in North America. Well, add a little time for Steam to unlock the title and a bit longer for Rockstar to get enough servers online. One thing you did not need to wait for was new video card drivers. Both AMD and NVIDIA have day-one drivers that provide support.

rockstart-gtav_trevor2_1920x1080.jpg

You can get the NVIDIA drivers at their landing page

You can get the AMD drivers at their release notes

Personally, I ran the game for about a half hour on Windows 10 (Build 10049) with a GeForce GTX 670. Since these drivers are not for the pre-release operating system, I tried running it on 349.90 to see how it performed before upgrading. Surprisingly, it seems to be okay (apart from a tree that was flickering in and out of existence during a cut-scene). I would definitely update my drivers if they were available and supported, but I'm glad that it seems to be playable even on Windows 10.

Source: AMD

90-some percent of the performance for 70 percent of the price; PowerColor's PCS+ R9 290X

Subject: Graphics Cards | April 6, 2015 - 04:54 PM |
Tagged: factory overclocked, powercolor pcs+, R9 290X

The lowest priced GTX 980 on Amazon is currently $530 while the PowerColor PCS+ R9 290X is $380, about 72% of the price of the GTX 980. The performance [H]ard|OCP saw after overclocking the 290X was much closer, in some games even matching the GTX 980 but usually trailing it by about 5-10%, making it quite obvious which card is the better value. The GTX 970 is a different story: you can find a card for $310 and its performance is only slightly behind the 290X, although the 290X takes a larger lead at higher resolutions. Read through the review carefully, as the performance delta and overall smoothness vary from game to game, but unless you like paying to brag about a handful of extra frames, the 970 and 290X are the cards offering you the best bang for your buck.
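The value argument is simple arithmetic. Here is a quick sketch using the prices quoted above; the relative-performance figures are placeholders picked from the 5-10% deficit mentioned, not [H]ard|OCP's benchmark results:

```python
# Rough performance-per-dollar comparison. Prices are from the paragraph above;
# the relative-performance figures are illustrative placeholders, not benchmark data.

cards = {
    "GTX 980":      {"price": 530, "relative_perf": 1.00},
    "PCS+ R9 290X": {"price": 380, "relative_perf": 0.92},  # assumed ~8% slower
    "GTX 970":      {"price": 310, "relative_perf": 0.88},  # assumed, slightly behind the 290X
}

for name, card in cards.items():
    perf_per_dollar = card["relative_perf"] / card["price"] * 1000
    print(f"{name:>14}: {perf_per_dollar:.2f} relative performance per $1000 spent")
# The cheaper cards deliver noticeably more performance per dollar than the GTX 980.
```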

1427061263kWJDj1MtDq_1_1.jpg

"Today we examine what value the PowerColor PCS+ R9 290X holds compared to overclocked GeForce GTX 970. AMD's Radeon R9 290X pricing has dropped considerably since launch and constitutes a great value and competition for the GeForce GTX 970. At $350 this may be an excellent value compared to the competition."


Source: [H]ard|OCP

Saved so much using Linux you can afford a Titan X?

Subject: Graphics Cards | March 27, 2015 - 04:02 PM |
Tagged: gtx titan x, linux, nvidia

Perhaps somewhere out there is a Linux user who wants a TITAN X, and if there is, they will like the results of Phoronix's testing. The card works perfectly straight out of the box with the latest 346.47 driver as well as the 349.12 beta; if you want to use Nouveau, then don't buy this card. The TITAN X did not win any awards for power efficiency, but in the OpenCL tests, synthetic OpenGL benchmarks and Unigine on Linux it walked away a clear winner. Phoronix, and many others, hope that AMD is working on an updated Linux driver to accompany the new 300 series of cards we will see soon, to help them be more competitive on open source systems.

If you are sick of TITAN X reviews by now, just skip to their 22 GPU performance roundup of Metro Redux.

image.php_.jpg

"Last week NVIDIA unveiled the GeForce GTX TITAN X during their annual GPU Tech Conference. Of course, all of the major reviews at launch were under Windows and thus largely focused on the Direct3D performance. Now that our review sample arrived this week, I've spent the past few days hitting the TITAN X hard under Linux with various OpenGL and OpenCL workloads compared to other NVIDIA and AMD hardware on the binary Linux drivers."


Source: Phoronix

NVIDIA Quadro M6000 Announced

Subject: Graphics Cards | March 23, 2015 - 07:30 AM |
Tagged: quadro, nvidia, m6000, gm200

Alongside the Titan X, NVIDIA has announced the Quadro M6000. In terms of hardware, they are basically the same component: 12 GB of GDDR5 on a 384-bit memory bus, 3072 CUDA cores, and a reduction in double precision performance to 1/32nd of its single precision. The memory, but not the cache, is capable of ECC (error-correction) for enterprises who do not want a stray photon to mess up their computation. That might be the only hardware difference between it and the Titan X.
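To put that 1/32 ratio in perspective, a rough peak-throughput estimate from the core count is easy to do. The clock speed below is an assumed round number for illustration, not a published M6000 specification:

```python
# Rough peak-FLOPS estimate for a GM200-based card (3072 CUDA cores).
# The clock speed is an assumed round number for illustration only.

cuda_cores = 3072
assumed_clock_ghz = 1.0          # assumption; actual boost clocks vary
fp64_ratio = 1 / 32              # double precision runs at 1/32 of single precision

fp32_tflops = cuda_cores * 2 * assumed_clock_ghz / 1000   # 2 FLOPs per core per clock (FMA)
fp64_tflops = fp32_tflops * fp64_ratio

print(f"FP32: ~{fp32_tflops:.1f} TFLOPS, FP64: ~{fp64_tflops:.2f} TFLOPS")
# Roughly 6 TFLOPS single precision versus ~0.2 TFLOPS double precision.
```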

nvidia-quadro-m6000.jpg

Compared to other Quadro cards, it loses some double precision performance as mentioned earlier, but it will be an upgrade in single precision (FP32). The add-in board connects to the power supply with just a single eight-pin plug. Technically, with its 250W TDP, it is slightly over the rating for one eight-pin PCIe connector, but NVIDIA told Anandtech that they're confident that it won't matter for the card's intended systems.
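For reference, the "slightly over" claim checks out against the standard PCIe power budgets (75 W from the slot, 150 W from a single eight-pin connector):

```python
# Power budget check: one 8-pin connector plus the PCIe slot versus a 250 W TDP.
slot_power_w = 75        # PCIe x16 slot allowance
eight_pin_power_w = 150  # single 8-pin PCIe connector allowance

available_w = slot_power_w + eight_pin_power_w
card_tdp_w = 250

print(f"Rated budget: {available_w} W, card TDP: {card_tdp_w} W, "
      f"over by {card_tdp_w - available_w} W")
# The card nominally exceeds the rated budget by 25 W, hence NVIDIA's reassurance.
```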

That is probably true, but I wouldn't put it past someone to do something spiteful given recent events.

The lack of double precision performance (IEEE 754 FP64) could be disappointing for some. While NVIDIA would definitely know their own market better than I do, I was under the impression that a common workstation setup for GPU compute was a Quadro driving a few Teslas (such as two of these). It would seem weird for such a high-end Quadro to be paired with Teslas that have such a significant advantage in FP64 compute. I wonder what this means for the Tesla line, and whether we will see a variant of Maxwell with a large boost in 64-bit performance, or if that line will be in an awkward place until Pascal.

Or maybe not? Maybe NVIDIA is planning to launch products based on an unannounced, FP64-focused architecture? The aim could be to let the Quadro deal with the heavy FP32 calculations, while the customer could opt to load co-processors according to their double precision needs? It's an interesting thought as I sit here at my computer musing to myself, but then I immediately wonder why did they not announce it at GTC if that is the case? If that is the case, and honestly I doubt it because I'm just typing unfiltered thoughts here, you would think they would kind-of need to be sold together. Or maybe not. I don't know.

Pricing and availability are not currently known, except that it is “soon”.

Source: Anandtech

A TITANic roundup of GPUs

Subject: Graphics Cards | March 19, 2015 - 03:20 PM |
Tagged: titan x, nvidia, gtx titan x, gm200, geforce, 4k

You have read Ryan's review of the $999 behemoth from NVIDIA and now you can take the opportunity to see what other reviewers think of the card. [H]ard|OCP tested it against the GTX 980, which shares the same cooler and is every bit as long as the TITAN X. Along the way they found a use for the 12GB of VRAM, as both Watch_Dogs and Far Cry 4 used over 7GB of memory when tested at 4K resolution, though the frame rates were not really playable; you will need at least two TITAN Xs to pull that off. They will be revisiting this card in the future, providing more tests for a card with incredible performance and an even more incredible price.

14265930473m7LV4iNyQ_1_6_l.jpg

"The TITAN X video card has 12GB of VRAM, not 11.5GB, 50% more streaming units, 50% more texture units, and 50% more CUDA cores than the current GTX 980 flagship NVIDIA GPU. While this is not our full TITAN X review, this preview focuses on what the TITAN X delivers when directly compared to the GTX 980."


Source: [H]ard|OCP