AMD Mantle Private Beta Announced

Subject: General Tech, Graphics Cards | May 1, 2014 - 08:00 AM |
Tagged: Mantle, amd

As our readers are well aware, Mantle is available for use with a few games. Its compatibility began with the beta Catalyst 14.1 driver and an update for Battlefield 4. AMD was quite upfront about the technology, even granting a brief interview with Guennadi Riguer, Chief Architect of the API, to fill in a few of the gaps left from their various keynote speeches.


What is under lock and key, however, is the actual software development kit (SDK). AMD claimed that it was too immature for the public. It was developed in partnership with DICE, Oxide Games, and other established developers to fine-tune its shape, all the while making it more robust. That's fine. They have a development plan. There is nothing wrong with that. Today, while the SDK remains non-public and sealed behind non-disclosure agreements, AMD is accepting applications from developers who are requesting to enter the program.

If you want to develop a Mantle application or game, follow the instructions on AMD's website to be considered. They consider it stable, performant, and functional enough for "a broader audience in the developer community".

AMD cites 40 developers already registered, up from seven (DICE, Crytek, Oxide, etc.).

If you are not a developer, this news does not mean too much to you -- except that progress is being made.

Source: AMD

May 1, 2014 | 11:09 AM - Posted by Anonymous (not verified)

Enjoy your Mantle cards, beta testers. By the time Mantle is ever relevant, current cards will be a thing of the past.

May 1, 2014 | 11:59 AM - Posted by JohnGR

Enjoy your "Wonder driver" and don't forget to buy two Titan Zs.

May 1, 2014 | 12:17 PM - Posted by Anonymous (not verified)


May 1, 2014 | 12:15 PM - Posted by Anonymous (not verified)

Lol, Mantle was the selling point, and lol, a driver doesn't cost $300-1000.

And I don't need a wonder driver with two 780 Classifieds at 1.3 GHz :P

May 1, 2014 | 12:29 PM - Posted by Anonymous (not verified)

And on air at low volts, my 1.3 GHz 780s run cooler and quieter than your power-hungry, overheating-at-stock-speeds leaf blower 290X.

May 1, 2014 | 12:34 PM - Posted by Anonymous (not verified)

curious word "performant" as its usage has increased in the past few years, while its actual definition is up for debate. It is a Buzzword that should raise suspicion, whenever used. It has that Marketing tinge to it.



From French

English

Adjective
performant (comparative more performant, superlative most performant)
1. (jargon, chiefly computing) Capable of or characterized by an adequate or excellent level of performance or efficiency. "Ours is a performant network monitoring and systems monitoring tool." "This software is ten percent more performant than its predecessor."
2. Of or relating to performance.

Synonyms (capable of achieving an adequate or excellent level of performance or efficiency): effective, efficient, high-performing, responsive, successful

Noun
performant (plural performants)
1. Someone who performs something, such as a ritual.

French

Verb
1. Present participle of performer.

Adjective
performant m (feminine performante, masculine plural performants, feminine plural performantes)
1. efficient, effective, performant

May 1, 2014 | 03:12 PM - Posted by Scott Michaud

I chose the word because I was creating a list of adjectives. Performance is a noun. Performant is its adjective form. It fits next to "stable" and "functional", grammatically.

May 1, 2014 | 03:24 PM - Posted by Allyn Malventano

curious word "Buzzword" as its usage has increased in the past few years, while its actual definition is up for debate. It is a word that should raise suspicion, whenever used. It has that Marketing tinge to it.

(Seriously - you're suspicious because Scott used English?)

May 1, 2014 | 04:49 PM - Posted by Anonymous (not verified)

It is English, but the marketing folks have made it into an English buzzword. Avoid that word if you do not want to sound like the Ballmers or the ribbon-lady Larsons out there. So English is not the point, and neither is Scott -- just the word [performant]! And "Buzzword" (negative connotation) has been around for more than 50 years!

May 1, 2014 | 06:02 PM - Posted by Allyn Malventano

And "Buzzword" (negative connotation) has been around for more than 50 years!

Buzzword: 1946.

Performant: 1847.

Please restate your point in the form of a valid argument.

May 1, 2014 | 08:22 PM - Posted by Anonymous (not verified)

And it exists as an older word, but with a limited/questionable modern technical meaning due to its overuse as the buzzword du jour of the marketing departments, who exploit its meanings for obfuscation. The roots of the word are not at issue; the use of jargon is. The GPU folks need to get away from this constant need to push out new hardware faster than their driver/API developers can get the software functioning at a proper level. The CPU industry has never had as many problems keeping its ISAs properly documented, and more needs to be done toward getting the GPU hardware instruction sets fully published and into the open-source community's hands, so that the last five years of GPU hardware can still get attention, driver updates, and open-source support. AMD and Nvidia need some competition in the desktop GPU space; there is plenty of GPU competition in the mobile market.

May 2, 2014 | 02:33 AM - Posted by Tim Verry

In any case, you do realize that the definition that you, yourself, provided backs up Scott's use of the word performant?

May 2, 2014 | 11:18 AM - Posted by Anonymous (not verified)

The word is not in question; it is its overuse as a marketing buzzword that makes anyone who uses it appear to be marketing. The word's syntactical use is not in question. But to not sound like a used-car salesman, reviews should not use buzzwords; those French-derived English words, and French words in general, have been abused by the Brill Building ad companies for ages. And sometimes the ad copy and the reviewer's text run together in an article, so maybe set the ad copy in italics to separate it from the reviewer's text, rather than just sticking a source at the bottom right of the article. By all means keep using buzzwords, but expect readers to think marketing, not objectivity.

May 2, 2014 | 08:06 PM - Posted by Allyn Malventano

Ok, your argument is that it's used by PR folks in their PR speak stuff that they blast out to everyone. Fair enough. Let's search my PCPer email inbox for this word. By your argument, surely it must be in there dozens, if not hundreds of times, as we at PCPer get bombarded by press blasts daily: out of *a lot* of press blast emails, it shows up *once*, as part of a consolidated CES press blast email, from a lesser known company's blurb.

Coincidentally, "buzzword" appears 6 times in the same search.

Of all the places to get this word from, I seriously doubt Scott is just parroting what some PR person sent him.

May 3, 2014 | 02:09 PM - Posted by Anonymous (not verified)

Buzzword as a buzzword overused as a buzzword, you say! Considering some of your junior writers' grammar and writing skills, go back and read some of their articles, or bring the originals (as they first appeared) to a college writing lab. Someone needs to act as a copy editor for your junior writers. The junior writers' knowledge of computer history over the last 30 years, and further back (referring to Intel as "Big Blue" was good for a chuckle), is not up to a competent level, as shown by this and other more serious errors when dealing with the complicated subjects of computing, the computing industry, and computer science.
This is not just one junior writer's problem; all the technology websites have problems. Some websites -- not this one -- really need to give up commenting on the market or technology aspects of PCs and just stick with basic reviews and benchmarking.

Man, all that worthless debate about 32-bit or 64-bit SoCs/CPUs for cellphones and the ability to address 4+ GB of memory nearly made my head explode. The size of the data bus and/or internal general-purpose registers of a CPU does not directly determine how much memory a CPU/SoC can address; it is the width of the address bus that determines how much memory can be directly addressed, without extra virtual-memory hardware support and special addressing pins/traces/tables. But these "tech" journalists kept on arguing. They would have been laughed off the stage at an ACM meeting.

32 bits will address this much memory: 4,294,967,296 bytes*
64 bits: 18,446,744,073,709,551,616 bytes*

*[That's the address-bus width -- 32, 64, whatever bits -- for the purposes of directly addressing memory without the help of hardware virtual-memory circuitry/instructions and OS support]

Note: There were some systems that could multiplex memory addresses over the data bus, and vice versa with data over the address bus, among other hardware gymnastics, but with modern CPUs/SoCs this creates a bandwidth bottleneck and is generally not done.
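[Ed. note: as an illustration of the commenter's arithmetic, the relationship between address-bus width and directly addressable memory is a simple power of two. A minimal sketch in Python (the function name is ours, not from the comment):]

```python
# Directly addressable memory for a given address-bus width,
# ignoring virtual-memory tricks such as PAE or bank switching.
def addressable_bytes(bus_width_bits: int) -> int:
    return 2 ** bus_width_bits

print(addressable_bytes(32))  # 4294967296 bytes (4 GiB)
print(addressable_bytes(64))  # 18446744073709551616 bytes (16 EiB)
```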

May 1, 2014 | 10:35 PM - Posted by Anonymous (not verified)

...and here I am, still using the MSI GTX 275 Twin Frozr. To be fair, it plays all the games I WANT to play at 1080p, and until it stops playing all the games I want to play, I will not replace it.
