AMD versus NVIDIA versus Frostbite3

Subject: General Tech | October 16, 2013 - 01:31 PM |
Tagged: battlefield 4, r9 280x, gtx 770

As the recommended requirements for BF4 indicate that a mid-range GPU should be able to handle the game, [H]ard|OCP tested the public beta with the new R9 280X as well as the GTX 770.  This not only gives an idea of comparative performance but also a chance to see whether the extra memory on AMD's card provides any advantage at 2560 x 1600.  At first glance the charts seem to favour NVIDIA, but that was not what [H] found while gaming: the high peaks represented points with little or no action, and the GTX 770's performance started to suffer during action sequences, while the AMD card remained solid through both the calm and the storms.


"Electronic Arts has opened a public beta of the upcoming Battlefield 4 game debuting the new Frostbite 3 game engine. Today we will preview some gameplay performance in BF4 Beta on an AMD Radeon R9 280X and GeForce GTX 770 and see how the game will challenge today's GPU's. The results are not quite what we expected."


Source: [H]ard|OCP
October 16, 2013 | 03:04 PM - Posted by Terminashunator (not verified)

Why do they use old GPUs as a comparison? I assume a 3870 is close to a 5770/6770/7750/R7 260x?

October 16, 2013 | 03:56 PM - Posted by JohnGR (not verified)

More like a 5570, a 7730, or an R7 240. It does have much more bandwidth, but only 320 stream processors and an older design.

October 16, 2013 | 04:29 PM - Posted by razor512

They use older GPUs to better represent an older gaming PC. Most users looking to play demanding AAA titles will often have a decent gaming PC.

The problem is that not everyone has the money to upgrade each year, so it is likely for a user to have a gaming PC that they built a while back and that is stuck with a Radeon 3870.

Ideally I would like to see these companies release a chart showing the min, avg, and max frame rates for a number of cards for their game; that way users would have a more reliable way to find out how the game will perform on their system.
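The min/avg/max chart razor512 asks for is easy to derive once you log per-frame render times. A minimal sketch (the frame times below are invented illustration values, not benchmark data):

```python
# Derive min/avg/max frame rates (FPS) from per-frame render times (ms).

def frame_rate_stats(frame_times_ms):
    """Return (min, avg, max) frame rate in FPS from frame times in ms."""
    fps = [1000.0 / t for t in frame_times_ms]
    return min(fps), sum(fps) / len(fps), max(fps)

# Example: mostly smooth ~60fps frames with one long hitch.
times = [16.7, 16.7, 16.7, 50.0, 16.7]
lo, avg, hi = frame_rate_stats(times)
print(f"min {lo:.1f} fps, avg {avg:.1f} fps, max {hi:.1f} fps")
```

Note how a single 50ms hitch drags the minimum down to 20fps even though the average still looks healthy, which is exactly why a min column (or full frame times) tells you more than an average.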

October 16, 2013 | 05:37 PM - Posted by Gregster

One of the debates I have most often is about VRAM amounts. When the 680 was first released, I jumped on that, later bought another for SLI, and bought another two Asus VG278H screens to play in 3D Surround (I already owned one). I happily played BF3 in 2D Surround with all settings maxed and never ran out of VRAM. Even when I told people I didn't, I was told I was wrong, as they had seen 2.3GB used on their 7970 @ 1080P, so there was no way I could use that little.

My point is, I do wish reviewers would report the VRAM used on GPUs at full settings, along with frame times, so we can see whether the game is smooth as well. Whoever came up with frame times is a genius :D Now come up with VRAM usage at max settings and various resolutions, please.

October 16, 2013 | 09:33 PM - Posted by razor512

While a game engine can make do with a certain amount of memory, e.g. 2GB depending on the engine, it will use more memory if it is available.

If a user is seeing 2.3GB used, the game may just be storing extra textures that are not strictly needed at the moment but that the engine has not specifically told it to unload.

For example, some old/classic games that wanted 32MB of video memory will use more if it is available, but the extra is not required for the game to run, and limiting them to 32MB will not increase lag spikes, if there are any.

The only time you really get extra lag is when the game engine is starved for video memory and begins to rely on system memory to make up the difference; this will often cause lag spikes or just a generally low frame rate.

October 17, 2013 | 01:52 AM - Posted by Gregster

I fully understand how VRAM and caching work, but it would be sweet for a reviewer who tests several cards on one AAA title to also list the VRAM used and the frame times. That would clearly show what is used/needed and stop the scaremongers (or prove them right).

They would only need to run GPU-Z and look at the max VRAM used.
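Instead of eyeballing GPU-Z's max reading, the same number can be logged during a benchmark run and the peak extracted afterwards. On NVIDIA cards a command along the lines of `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits -l 1` emits one MiB reading per second; the sample log below stands in for such output (the values are made up):

```python
# Extract the peak VRAM reading from a sampled usage log (one MiB value
# per line, as produced by periodic nvidia-smi/GPU-Z style polling).

def peak_vram_mib(log_lines):
    """Return the maximum VRAM reading (MiB) from the sampled log."""
    return max(int(line.strip()) for line in log_lines if line.strip())

sample_log = ["1840", "2210", "2300", "2290", "2155"]
print(peak_vram_mib(sample_log))  # 2300
```

One caveat when comparing such numbers across cards: as discussed above, the peak reflects how much the engine chose to cache on that particular card, not the minimum it needs.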

October 16, 2013 | 06:39 PM - Posted by Branthog

Why in the world would the recommended OS be Windows *8*?

October 16, 2013 | 08:25 PM - Posted by biohazard918

BF4 uses DirectX 11.2 features, which are only available on the Xbone or Windows 8 because Microsoft are dicks.

October 18, 2013 | 11:14 PM - Posted by Anonymous (not verified)

DX11.1 for Windows 8, and DX11.2 for Windows 8.1.

DX11.X for the Xbone, which is supposedly DX11.2 plus some forward-looking stuff.
