
ASUS MG279Q 27-in 1440P 144Hz IPS 35-90Hz FreeSync Monitor Review

Subject: Displays
Manufacturer: ASUS

Introduction, Specifications, and Packaging

AMD fans have been patiently waiting for a proper FreeSync display. The first round of panels using the Adaptive Sync variable refresh rate technology arrived with an ineffective or outright disabled overdrive feature, resulting in less than optimal pixel response times and overall visual quality, especially when operating in variable refresh rate modes. Meanwhile, G-Sync users had properly functioning overdrive, as well as a recently introduced 1440P IPS panel from Acer. The FreeSync camp was overdue for an IPS 1440P display superior to that first round of releases, hopefully with those overdrive issues corrected. Well, it appears that ASUS, maker of the ROG Swift, has just rectified that situation with a panel we can finally recommend to AMD users.


Before we get into the full review, here is a sampling of our recent display reviews from both sides of the camp:

  • ASUS PG278Q 27in TN 1440P 144Hz G-Sync
  • Acer XB270H 27in TN 1080P 144Hz G-Sync
  • Acer XB280HK 28in TN 4K 60Hz G-Sync
  • Acer XB270HU 27in IPS 1440P 144Hz G-Sync
  • LG 34UM67 34in IPS 25x18 21:9 48-75Hz FreeSync
  • BenQ XL2730Z 27in TN 1440P 40-144Hz FreeSync
  • Acer XG270HU 27in TN 1440P 40-144Hz FreeSync
  • ASUS MG279Q 27in IPS 1440P 144Hz FreeSync(35-90Hz) < You are here

The reason no minimum rating is listed for the G-Sync panels above is explained in our article 'Dissecting G-Sync and FreeSync - How the Technologies Differ', though the short version is that G-Sync can effectively remain in VRR down to <1 FPS regardless of the hardware minimum of the display panel itself.
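The frame-repeating trick behind that behavior can be illustrated with a minimal Python sketch. This is only an illustration of the idea, not the module's actual firmware logic; the function name and the 30-144Hz window are assumptions for the example.

```python
# Hypothetical sketch: below the panel's hardware minimum refresh, a
# G-Sync style module redraws the previous frame an integer number of
# times so the physical refresh rate stays inside the supported window.
# This is how VRR can effectively track frame rates down to <1 FPS.

def refresh_for_frame_rate(fps, panel_min_hz=30, panel_max_hz=144):
    """Return (physical_refresh_hz, draws_per_frame) for a given game FPS."""
    if fps >= panel_max_hz:
        return panel_max_hz, 1      # capped at the panel's max refresh
    if fps >= panel_min_hz:
        return fps, 1               # normal VRR: refresh tracks FPS 1:1
    repeats = 2                     # repeat frames until back in range
    while fps * repeats < panel_min_hz:
        repeats += 1
    return fps * repeats, repeats
```

At 20 FPS this returns a 40Hz physical refresh with each frame drawn twice; at 1 FPS the frame is simply redrawn 30 times. A FreeSync panel without such logic falls out of VRR below its hardware minimum, which is why the 35Hz floor on this ASUS matters.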

Continue reading as we take a full look at this new ASUS MG279Q 27" 1440P 144Hz IPS FreeSync display!



Panel Size (diagonal): 27" (68.5 cm) wide screen (16:9)

Display Viewing Area (H x V): 596.74 x 335.66 mm

Panel Backlight / Type: WLED / In-Plane Switching (IPS)

Display Surface:

Color Saturation: 100% sRGB

True Resolution: 2560 x 1440 up to 144Hz (DisplayPort 1.2); 1920 x 1080 up to 120Hz (HDMI 1.4)

Pixel Pitch: 0.233 mm (109 PPI)

Brightness (Max.): 350 cd/m² (typical)

Contrast Ratio (Max.):

ASUS Smart Contrast Ratio (ASCR):

Viewing Angle (CR≥10): 178° (H) / 178° (V)

Display Colors: 16.7 million

Response Time: 4ms (gray to gray)

ASUS EyeCare:

Video Features: Trace Free Technology

GameVisual: Yes (FPS, RTS/RPG, Racing, sRGB, Cinema, Scenery modes)

Skin-Tone Selection: 3 modes

Speakers: 2W x 2, RMS

Color Temperature Selection: 4 modes

GamePlus / Blue Light Filter: Yes

Gaming Hotkeys: 5-way navigation OSD joystick

GamePlus hotkey: Crosshair / Timer

GameVisual hotkey: 6 modes (Scenery, FPS, RTS/RPG, sRGB, Racing, Cinema)

Input / Output: DisplayPort 1.2, Mini-DisplayPort 1.2, HDMI/MHL 2.0 x 2, USB 3.0 ports (1 upstream, 2 downstream), earphone jack

Digital Signal Frequency: 51.2~221.97 kHz (H) / 35~144 Hz (V)

Power Consumption: <38.7W (Energy Star 6.0)

AC Input: 100-240V, 50/60 Hz

Chassis Color: Matte black

Tilt: +20° ~ -5°

Swivel: +60° ~ -60°

Height Adjustment: 0~150 mm

VESA Wall Mount: 100 x 100 mm

Security: Kensington lock

Physical Dimensions (W x H x D, with stand): 625 x 559 x 238 mm

Box Dimensions (W x H x D): 625 x 368 x 63 mm

Net Weight: 7.3 kg

Accessories: Power cord, DisplayPort-to-Mini-DP cable, USB 3.0 cable (optional), MHL cable (optional), DisplayPort cable (optional), HDMI cable (optional), warranty card, quick start guide, support DVD

Regulation Approvals: Energy Star 6.0, UL/cUL, CB, CE, ErP, FCC, CCC, BSMI, CU (EAC), C-Tick, VCCI, J-MOSS, PSE, RoHS, WEEE, Windows 7 / 8.1 WHQL





The MG279Q came well packaged with all necessary cords. One oddity noted was that the DisplayPort cable was actually a DP to Mini-DP cable.


Along with the power, HDMI, and USB cables was an instruction manual and a mystery red clip. More on that and the Mini-DP cable on the next page.

June 29, 2015 | 03:45 PM - Posted by chizow (not verified)

So in summary, the whole time you were testing this you were probably thinking:

"Sheesh, for $200 more than the $1250 I spent, I could've gotten a 980Ti and a better G-Sync panel."

June 29, 2015 | 04:05 PM - Posted by obababoy

Haha, do you ever provide anything constructive that is not related to trolling?
What if you are like me and have a general dislike for Nvidia's business practices regardless of the quality of the products they produce? Or what if you like GPUs that are the best performance/$ winners (my R9 290)? Or what if I was trying to justify to my wife a $600 monitor versus an $800 one?

AMD has an exciting future with DX12 and their very affordable 6- and 8-core CPUs that may benefit from better GPU performance. And even if they don't, I still enjoy supporting them.

June 29, 2015 | 09:04 PM - Posted by chizow (not verified)

Then you get inferior AMD products, as you deserve. Sure you save a buck or two, but you also get 2nd-rate product.

June 30, 2015 | 08:00 AM - Posted by Anonymous (not verified)

Haha, I was about to say the same to him. But diehard AMD fans keep proving their dumb antics over and over again, so what's new?

Great tech is not cheap, and I'd rather spend that much on the better product, thank you very much!

June 30, 2015 | 12:07 PM - Posted by chizow (not verified)

Yeah, I mean he just outlined something I simply wouldn't be happy with. If you're already spending that much money, what's the point in not buying parts that are only marginally more expensive but clearly better when it comes to both performance and features?

8350 vs. i7?
Fury X vs. 980Ti?
FreeSync vs. G-Sync?

You're looking at maybe a $300 difference in price, but the more cheap concessions are made, the more performance and the end-user experience degrade.

June 30, 2015 | 03:27 PM - Posted by obababoy

Well, what if someone like me didn't feel like wasting $200 for specs on a monitor outside of what I will EVER play on? I wish the bottom of this monitor's refresh window was 30Hz, but I will NEVER play a game below that, and I will never play over 90Hz with modern games at max settings at 1440p. Don't you realize there are a ton of people in my exact situation, which is what ASUS knows and priced accordingly for?

June 30, 2015 | 08:57 PM - Posted by chizow (not verified)

How do you know you'll never play outside of those specs, as if you have control over frame rate drops, especially on some of these panels that are 1440p and above? Do you even know what kind of GPU power you're going to need to run at least 30FPS on the panels being discussed?

1440p is going to need 290 *MINIMUM* to drive 30+FPS and you'll still have to turn things down.

2160p/4K is going to need Fury X at *MINIMUM* to drive 30+FPS and same thing, you're going to have to turn things down.

One of the main goals and problems solved by VRR when Nvidia invented it was that you WOULD NOT have to turn down settings to get a perfect, tear and input lag-free experience even when your FPS dropped below native refresh.

And while it's great that you don't THINK you will ever fall outside of those bands, that doesn't change the fact that there are COMPETING PRODUCTS that never settle for these kinds of limitations, that are priced only slightly higher, and that do a BETTER JOB of handling VRR at both higher and lower refresh rates with less ghosting and less input lag as well.

So yes, for someone with low standards, like yourself, this panel will be good enough, but for someone who wants and demands more, $200 will be a small premium to pay for superior results!

July 3, 2015 | 09:10 PM - Posted by Anonymous (not verified)

Hey everyone look at this mentally feeble green teamer who has nothing better to do with his life than submit to the nvidia marketing department and throw extra money at them. "Slightly" lol.

As if a gamer would accept framerates below 30, disgusting. Seems like you have the low standards with your casual games.

July 5, 2015 | 11:01 PM - Posted by Anonymous (not verified)

Calling names makes you look bad as well, so maybe let's just keep things constructive?

And the point about 30FPS is very valid as we often get drops we have no control of. This is why we have a FRAME TIME analysis now in addition to frame rate, and also why we look at the MINIMUM FPS score in addition to the average.

If you want to avoid a LOW frame rate score or get more solid frame times, you have to drop the settings or buy better hardware, which then defeats some of the purpose of buying an adaptive sync monitor.
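The distinction between average frame rate and frame times can be shown with a small sketch. This is illustrative Python, not the actual capture tooling; the data and function name are invented for the example.

```python
def frame_stats(frame_times_ms):
    """Summarize a capture: average FPS, minimum instantaneous FPS,
    and the 99th-percentile frame time in milliseconds."""
    times = sorted(frame_times_ms)
    avg_fps = 1000.0 * len(times) / sum(times)
    min_fps = 1000.0 / times[-1]            # worst single frame
    p99_ms = times[min(len(times) - 1, int(0.99 * len(times)))]
    return avg_fps, min_fps, p99_ms

smooth = [16.7] * 60                 # steady ~60 FPS
spiky  = [10.0] * 55 + [90.0] * 5    # same 60 FPS average, big hitches
```

Both captures average about 60 FPS, but the spiky one bottoms out near 11 FPS with a 90ms 99th-percentile frame time, exactly the kind of drop that can push a game outside a 35-90Hz VRR window.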

August 17, 2015 | 11:37 PM - Posted by Anonymous (not verified)

Nothing was constructive there; that post was all the response he deserved. Quit feigning civility where it wasn't asked for.

July 2, 2015 | 12:21 PM - Posted by Anonymous (not verified)

Dude. Don't you know? The AMD 8350 totally competes with the i7 in gaming experiences.

October 28, 2015 | 11:38 AM - Posted by savydog (not verified)

maybe an i7 850 at like 2 GHz lol, my i7 4790k DESTROYED THEM

June 30, 2015 | 03:12 PM - Posted by obababoy

I have other hobbies that cost far more money than PC gaming at my level. Even with your 5% better gaming experience, I am saving 25% of my money on the monitor alone (the Acer G-Sync panel vs. this one). I also have an R9 290 that was $315 when I got it almost two years ago, and it can still play all of the main titles totally fine at decent framerates with mostly max settings. AMD cards age far better than Nvidia's; just look at how the 780Ti plays today when it was $700+ just over a year ago.

I know Nvidia products generally perform "better", but the premium for me is not worth it at all. It seems like most people who have Nvidia assume EVERYONE with AMD is sad, gets choppy framerates, and doesn't enjoy themselves with their "inferior" product, when Nvidia only recently pulled ahead.

Are you playing on SLI right now with your G-Sync experience? If not, then why do you need it? Same thing goes for me. I only ever want to play with one GPU and avoid all the hassle of SLI/CF profiles.

Then there is all the future talk of DX12 with AMD supporting the more important lower level functions etc etc but that is a whole other topic that we don't know much about yet.

June 30, 2015 | 09:02 PM - Posted by chizow (not verified)

And that's great! The halo performance segment has always carried a hefty premium, and that will never change. So if you want to stay below that 10-20% range and settle for second best, that's fine and good, but don't sit here and say there is no difference between the products or that one isn't worth it, because just because YOU don't see the benefit doesn't mean it isn't important to someone else.

I do use SLI with my ROG Swift, actually, because a single Titan X wasn't enough; I still found myself turning stuff down when I didn't really want to. With 2x Titan X, it's no compromises and it just works!

As for the 780Ti vs. 290/X: again, I buy cards for how they will perform until the next time I can upgrade. As much as AMD fans love to buy a card as if it were a marriage, I know my next upgrade is just 24-36 months away, so no, I don't put a whole lot of stock in how a card will perform in two years, because chances are it will be too slow regardless and I will want to upgrade again. I buy cards for CURRENT and near-future performance, not for how well they have aged when I'm ready to get rid of them.

July 1, 2015 | 07:55 AM - Posted by obababoy

See... MUCH better response... I can dig it, no trolling. But I never said there is no difference between the products.

Clearly you are a no compromise gamer. That is fine. I am a "put $10,000 into my STi" type guy when I will likely never actually compete.

I just want people to realize there is a value where my PC currently stands and it makes little performance compromises for big cost savings.

July 1, 2015 | 10:34 AM - Posted by chizow (not verified)

But of course I'm going to throw some snarky jabs in there to tease all the AMD fanboys; look at all the nonsense they spout in every single one of these reviews. What's the point of saying I told you so without having some fun with it? :D

The problem is it is never an acknowledgement that a premium is justified, first thing AMD fanboys say is "G-Sync is not worth the premium", but in reality, that's just them projecting their penny-pinching budget conscious value system on others.

More discerning individuals like myself understand what G-Sync and VRR sought out to fix, and once both technologies were featured and reviewed, many of us saw the potential. We relied on INDEPENDENT REVIEWERS to give us an honest account before we made the decision to buy into the tech or not.

We also quickly saw the red flags when AMD claimed they had not only come up with a suitable alternative, but one that was better than G-Sync in numerous ways. Except we could see that wasn't the case, and FreeSync still has many flaws today that mean it FUNDAMENTALLY fails to address many of the key points of VRR related to V-Sync and double/triple-buffering.

I'll outline them here for your reference:

1) Screen tearing
2) Input lag
3) Framerate halving below max/native refresh rate.

July 2, 2015 | 08:14 AM - Posted by obababoy

I just got this monitor last night, and I can say without a doubt, coming from my older ASUS TN panel, that this has NO input lag that I can notice and no screen tearing. Below 35fps you can slightly see the lack of smoothness, but it is no different than running my previous monitor at that fps.

July 2, 2015 | 01:02 PM - Posted by Rroc (not verified)

I believe he's referring to the input lag coming from when you're forced to game outside of the "freesync" window of 35-90Hz. If you game inside the window, you are basically golden.

If outside, the animation's smoothness difference can be so jarring that input lag may come into play.

July 3, 2015 | 07:27 PM - Posted by Ass (not verified)

Your use of the reflexive in the first sentence of the third paragraph is incorrect. Don't worry; I blame the system.

June 29, 2015 | 04:08 PM - Posted by Mihemine (not verified)

You're saving $200 on the monitor alone; in the EU you are saving ~190€.
Add that to the fact that you don't need a 980 Ti or Fury X to run this; you can use a cheap R9 290 and have a good experience.

June 29, 2015 | 09:04 PM - Posted by chizow (not verified)

You do realize that if you fallback to old cards and backwards compatibility, FreeSync is going to lose every time to Nvidia right? Every single Kepler card is supported for G-Sync while AMD limits support to GCN 1.1 and newer. Also, still no CF support for AMD, while any Kepler config in SLI is still a very real option for Nvidia users.

June 29, 2015 | 04:43 PM - Posted by StephanS

No, instead of getting a slower & louder GTX 980 + G-Sync monitor, you can get the faster Fury X + FreeSync monitor AND still save about $100.

Less money, faster, quieter... it's a no-brainer.

Also, FreeSync is the new industry standard... good luck with your G-Sync monitor in the future. You lock yourself in, and Nvidia might not support it a few years from now.

June 29, 2015 | 05:20 PM - Posted by Allyn Malventano

Why are you comparing to the 980 and not the 980 Ti, which is (in most games) faster than the Fury X?

I do agree that the Fury X is quieter under load, but that pump whine is certainly louder than the 980 Ti at idle.

June 29, 2015 | 06:16 PM - Posted by Anonymous (not verified)

Good thing games aren't played at idle, right?

June 30, 2015 | 10:49 PM - Posted by Allyn Malventano

Well the thing is that we tend to not notice GPU noise (regardless of card) when gaming as you're listening to the game sounds / music, but then when you're done playing and everything quiets down, you are left with that pump whine, which tends to wear on you (especially in Crossfire).

June 29, 2015 | 08:58 PM - Posted by chizow (not verified)

Well, because AMD fanboys.

June 30, 2015 | 05:27 PM - Posted by Anonymous (not verified)

As if you aren't an nvidia fanboy?

June 30, 2015 | 10:43 PM - Posted by chizow (not verified)

I'm a fan of the best and I'm completely comfortable in my own skin in that regard. Nvidia provides the best PC gaming experience today, if and when that changes, I'll buy something else, but I doubt that will happen anytime soon!

What's the excuse of all these AMD fanboys? Oh right, bargain, cheap, almost as good etc.

June 29, 2015 | 11:39 PM - Posted by fkr

Because you compare things at the same price, Allyn.

A 980Ti and a G-Sync monitor cost $200 more, so to offset that cost you would have to buy a cheaper GPU.

If money is no object when comparing products, then what is the point?

Listen, I love this site and look forward to the podcast every week, but your constant fanboyism for Nvidia is tiresome.

Is that coil whine really that bad at idle, in a good enclosed case, under a desk?

Look, I currently run Nvidia products, but it does not encourage AMD fans to read your articles when you do not make apples-to-apples comparisons.

June 29, 2015 | 11:57 PM - Posted by trenter (not verified)

He's the worst Nvidia fanboy of the whole group. It seems Josh Walrath is the only one that even understands anything about the architectures AMD and Nvidia employ. Unfortunately, we only ever get to hear from the least knowledgeable of the group most of the time.

June 30, 2015 | 12:36 AM - Posted by Anonymous (not verified)

You should see the comments when Ryan from Anandtech rips Chizow a new one.

It was hilarious.

June 30, 2015 | 02:55 AM - Posted by PCPerversion

That was indeed hilarious and a welcome move from the more balanced Ryan. I have to agree 100% with fkr that this site's Nvidia bias is tiresome at best, all the more so when they attempt to "appear" impartial. As for Chizow, his pro-Nvidia troll posts are universally ridiculed at every tech site I have visited, and rightly so. The guy has issues...

June 30, 2015 | 10:39 AM - Posted by chizow (not verified)

Hahah, what's funny is something I've noticed over the years: Nvidia fans, when they encounter a problem, will raise hell and draw attention to it, usually directly to Nvidia on their forums, until the problem is fixed.

Meanwhile, when a problem is brought to light with AMD products, the only thought that goes through their fanboys' minds is how quickly they can damage control and sweep the problem under the rug.

Apparently you AMD fanboys don't understand that YOU are the only ones who are negatively impacted by AMD being lazy with their solutions. But more likely, it's because you're not even users of the actual products; you're just sideline cheerleaders of a product, just because. True fanboys, by definition.

We've seen it twice recently, and your stupid anon handle is a good indication of your position on both.

FCAT/runt frames: nope, not a problem, "PCPerversion" is lying, obviously a shill site for Nvidia. Lo and behold, they are the driving force in getting your junk CF solution fixed for you.

FreeSync: all kinds of problems with ghosting, broken overdrive, and stutters/odd behavior at the low end of the VRR window. "PCPerversion" is lying, it's not true because AMD said so, obvious Nvidia shill site. Lo and behold, after more testing and confirmation, AMD admits the problem and says they are working with panel makers to fix it.

See a pattern here? Yep. AMD fanboys are ignorant and would rather sweep a problem under the rug to their own detriment than demand better from AMD.

I fully expect a nonsensical "hueer dueerr you're a fanboy" reply in response to my cogent explanation of why I think we as an enthusiast tech community are better off without the tech bottomfeeders who overwhelmingly prefer AMD.

June 30, 2015 | 03:24 PM - Posted by obababoy

You are what you hate... Don't you see that, you dumb monkey?! You enjoy trying to piss off anyone who chooses AMD and generalize all of them into one category. Sure, there are some AMD dudes that are self-absorbed, high-cost-regret-insecurity assholes like yourself, but you almost tip the scales. There are shitty AMD fans and shitty Nvidia fans, but it doesn't mean everyone is.
I'm an AMD fan that loves PCPer. They get some kickbacks from Nvidia, but they still report on facts.

Not only is this the best monitor AMD has worked with, but PCPer highly recommends it, and even a 980Ti owner (on another forum) bought it because it was $200 cheaper and he didn't care about G-Sync. Clearly you don't care about forward progress and enjoy employing asshattery all day on forums. I feel sorry for you, bud.

I don't think I will ever understand idiots who blame an entire community for the actions of one, in the comments section of a review about a PC gaming monitor... hahaha

June 30, 2015 | 09:09 PM - Posted by chizow (not verified)

LMAO, are you serious right now? Look at all the AMD fanboys blaming Allyn/Ryan for shilling for Nvidia because they critically reviewed a product and backed it up with science, as any GOOD REVIEWER would do as a SERVICE TO THEIR READERS to allow them to come to an INFORMED DECISION based on the MERITS OF THE ACTUAL PRODUCTS.

It's truly amazing; you even claim they are getting kickbacks to trash AMD products, as if you hope to get inferior products with all of these flaws and deficiencies relative to the competition.

You can claim all you like that it is a generalization, but READ the comments from the AMD fanboys and proponents; it's all the same. Ad hominem attacks on the messenger; rarely if ever do they actually tackle the technical issues, much less address them or voice their concern about them.

And we wonder why AMD has gotten away with this broken model of support and overall lackluster strategy of throwing out some half-assed, poorly supported standard while taking zero accountability for problems when they occur. No thanks; the sooner we are rid of this tech bottomfeeder business model, the better for all of us.

July 1, 2015 | 08:01 AM - Posted by obababoy

Not kickbacks to trash AMD! I spoke with Allyn and he squashed my misguided understanding of Ryan's relationship with Nvidia. I'll own that right now.

I get your point about people bashing PCPer; that's not what I am saying. I was trying to get you to realize that you post 10 straight-up hateful trolling comments before you actually post the really valid points... Your method of delivery, I guess, is the weak link, not your points.

July 1, 2015 | 10:41 AM - Posted by chizow (not verified)

Nah as you can see in this comment thread and others, it really doesn't matter how I deliver the message, at least the trolling gets some laughs and I certainly get a kick out of contriving them. :)

AMD fanboys have this warped perception that any negative or critical press for AMD is a huge detriment to AMD and by proxy, themselves when in reality, it only impacts the product they take home and use.

As for Allyn and Ryan, I can see their cynicism and ambivalence towards AMD stems from the simple fact that people don't like to be lied to, misled, or treated as if they are stupid. AMD and their reps will LIE TO YOUR FACE and think nothing of it. It's the same reason I think so little of AMD and their brand at this point.

It's literally cringeworthy watching AMD reps make some of these claims, then go back just a few weeks later and ask yourself, did any of this happen?

-Did HBM allow them to make the world's fastest GPU?

-4GB isn't a problem for 4K optimized gaming?

-Fury X is convincingly faster than 980Ti across the board as their internal benches indicated?

-Is Fury X an Overclocker's Dream?

-Is the liquid cooler on Fury X really a benefit that enables cool and quiet operation when many of the pumps are shipping with an annoying whine?

Just some questions for critical thinkers to evaluate in regard to AMD. :)

June 30, 2015 | 10:51 PM - Posted by Allyn Malventano

How are we being partial, exactly? Are we making up benchmark results? Are we making up the differences between G-Sync and FreeSync? ...or is it that you just don't like hearing facts?

July 1, 2015 | 12:34 AM - Posted by Any (not verified)

I don't know, Allyn. We never heard how G-Sync had input lag, yet here was Tom Petersen on your live stream saying exactly that.

What we did hear from you and the PC Perspective collective was a verbatim recital of Nvidia's talking points, contradictory to what Tom Petersen voluntarily said without being questioned.

Either Nvidia and Tom Petersen were playing us for fools all this time with marketing, or you never bothered to question it with the contradicting evidence from BlurBusters on the matter.

July 1, 2015 | 11:54 AM - Posted by chizow (not verified)

Sorry, while Allyn and Ryan are more than capable of defending themselves, your statements are simply inaccurate.

Ryan has covered input lag at max refresh numerous times with Tom Petersen, and he has never once denied that at max refresh G-Sync behaves very similar to V-Sync On.

However, Ryan to his credit as usual, has mentioned that some gamers might actually prefer to have Vsync OFF at the top-end above max refresh to avoid any potential input lag, given tearing is generally less noticeable at high frame rates. He also notes in later videos/articles/interviews, that this is something FreeSync is capable of and one of the only real deficiencies he felt G-Sync lacked over FreeSync.

Petersen said their goal for G-Sync was to NEVER have any tearing but he did see the merits of that viewpoint and as always, he was open to the idea of implementing this into G-Sync as they are always looking to improve it. Anyone who knows Tom Petersen's plausible deniability methods knows this means there's a good chance this gets implemented at some point and they are already looking into it.

And lo and behold, the fruits of open, honest, and constructive feedback! But for those that prefer Nvidia, this is a big reason why we prefer Nvidia. So yes, Ryan can take a bow for this positive change to G-Sync and as always, Tom Petersen is the best in the business.
"For enthusiasts, we've included a new advanced control option that enables G-SYNC to be disabled when the frame rate of a game exceeds the maximum refresh rate of the G-SYNC monitor. For instance, if your frame rate can reach 250 on a 144Hz monitor, the new option will disable G-SYNC once you exceed 144 frames per second. Doing so will disable G-SYNC's goodness and reintroduce tearing, which G-SYNC eliminates, but it will improve input latency ever so slightly in games that require lightning fast reactions."

Amazing how easy and painless change can be when a company is open and forthright about its products. Why does the dialogue for change never go this well with AMD and their fanboys? It's always resist, deny, finger-pointing, name-calling, etc., and as a result the products that they use end up being the ones that suffer.
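The behavior described in that quoted release note reduces to a simple three-regime decision. The Python sketch below is an illustration of the logic only; the function name and the 35/144Hz bounds are assumptions for the example, not anyone's actual driver code.

```python
# Illustrative decision logic for a VRR display pipeline: inside the
# window, refresh tracks FPS; above it, the driver can either hold at
# max refresh (V-Sync-like, slight added latency) or let frames tear
# through for minimum latency, per the option quoted above.

def sync_behavior(fps, vrr_min=35, vrr_max=144, tear_above_max=False):
    if fps > vrr_max:
        return "tearing" if tear_above_max else "hold-at-max-refresh"
    if fps >= vrr_min:
        return "vrr"                 # refresh rate matches frame rate
    return "below-window"            # frame doubling or fixed refresh
```

With the new option enabled, a 250 FPS game on a 144Hz panel falls into the "tearing" branch and trades a small amount of tearing for slightly lower input latency.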

July 1, 2015 | 12:02 PM - Posted by Allyn Malventano

We were never able to reproduce any additional input lag in measurements. If there is any, it is immeasurable / negligible. Also, TFTCentral has measured the Swift and Acer Predator (both G-Sync) as the lowest latency panels they have tested. AMD was passing out some PR guidance to reviewers showing supposed lower frame rates with G-Sync vs. FreeSync (implying additional lag), but this was not reproducible on our end. Some other sites just repeated those AMD slides, but I have yet to see any independent testing confirm it.

July 1, 2015 | 04:22 PM - Posted by Anonymous (not verified)


You don't take Tom Petersen's word for it.

July 1, 2015 | 05:52 PM - Posted by Any (not verified)

What B.S., Allyn.

TFT Central measures fixed latency.

The AMD slides came before the FreeSync monitors released.

BlurBusters and users pounded on these issues from day 1 of the G-Sync kit. That's over a year on the market of ignorance from PC Perspective, and now you're deflecting the issue when Nvidia's Tom Petersen not only mentions it but repeats it. Not only that, he also points to differences between his own products.

June 30, 2015 | 09:06 AM - Posted by funandjam

He/she/it is trolling on so many different tech sites, it's hard to believe he/she/it has time to go out and make money to pay for ISP and electricity, unless that is how he/she/it makes money? lol. I had heard about that article before but never looked it up; do you have a link for that comment section?

June 30, 2015 | 10:31 AM - Posted by chizow (not verified)

You AMD fanclowns can't even keep your comments straight. I guess you are referring to Jarred Walton's comments in reply to me ripping his FreeSync slide-deck republish, where he foolishly declared the two technologies equivalent while writing an Engadget review.

Then, when pressed about the differences, many of which were brought to light with the advanced techniques Allyn and Ryan used here at PCPer, he used the "dog don't have an oscilloscope" excuse when it came to actual scientific testing. I simply called him out for writing a piece that was a huge disservice to the tradition of AT; sorry it rubbed you and other AMD fanboys the wrong way! :)

Hilarious, it's funny how he hasn't written anything at AT since then; maybe he decided to actually go write for Engadget instead.

June 30, 2015 | 11:37 AM - Posted by funandjam

The problem here is that you are one of the absolute worst at spreading misinformation and downright trolling, and you do it all over various tech review site forums, not just PCPer.
I read the comments on that Anandtech article: you are wrong and Jarred is right, it's that simple.

Do yourself a favor (and the rest of us too): take your whole anti-AMD bit and turn it down several notches (if not off all the way); it makes you look desperate and childish. (Note: this last sentence is in reference to how you present yourself and the way you behave.)

On topic, this latest offering looks very compelling, but I still think they should have left this particular monitor in the development labs a while longer until better scalers are ready.

June 30, 2015 | 12:29 PM - Posted by chizow (not verified)

What misinformation? Please feel free to outline them here, as I'm more than happy to point to references, including the work done by Allyn here at PCPer that clearly and definitively show FreeSync is NOT equivalent to G-Sync.

Indeed, if AMD and their fanboys weren't so desperately trying to misinform everyone into thinking FreeSync is an equal alternative to G-Sync, I wouldn't have to say a thing!

On topic: so you can admit that FreeSync just isn't as good as G-Sync, and thus G-Sync is fully deserving of any premium. That is certainly a start; you can start by thanking PCPer for making that clear to you and others, because Jarred and AnandTech certainly were not up to the task.

June 30, 2015 | 05:24 PM - Posted by Allyn Malventano

I like how these guys accuse me and Ryan of being Nvidia fanboys in the comments of a review of a FreeSync panel that we are actually recommending.

June 30, 2015 | 09:04 PM - Posted by chizow (not verified)

Well, you know, you did try to fix their broken-ass products on several occasions by bringing the issues to light; surely that makes you an enemy and a fanboy in their book.

June 30, 2015 | 10:27 AM - Posted by chizow (not verified)

Again, fkr, you don't place a premium on products that are better? Who cares if something costs more when it clearly carries additional benefits?

980Ti vs. Fury X, same price, 980Ti carries numerous benefits starting with being faster, more overclockable, and more VRAM to run the games and resolutions you would expect from a flagship card. And that's before you get into the ecosystem and support benefits of one vendor over the other.

Then you get into G-Sync vs. FreeSync, again, if your primary purpose is to buy a panel for VRR and keep it for a few years, why settle on a FreeSync panel that still has worse ghosting/overdrive, has minimum VRR windows (outside of which VRR breaks), has lower max refresh rates, and also has worse input lag within your VRR window?

G-Sync costs more because everything behind it is BETTER.

July 1, 2015 | 03:53 PM - Posted by fkr

I am not just a gamer tho. I like to do Folding@home and also have to decode a lot of video. I think that the Fury X may work out really well for those tasks.

As to who cares about things costing more: I do. I have kids and college funds. Swimming lessons, horse lessons, fcking ninja lessons, son.

June 30, 2015 | 02:22 PM - Posted by Allyn Malventano

It's not 'coil whine', it's the pump. The sound certainly gets to some people from the 'sounds' of it. We are working on a short piece comparing the sound profiles of the different cards, but trying to present something so subjective in a precise manner is taking some time.

June 30, 2015 | 09:18 PM - Posted by chizow (not verified)

Yep, the same pump whine at least 3-4 other reviewers commented on, which AMD assured us didn't make it to retail parts, yet we see numerous reports from users on YouTube and forums stating this is a problem. Now AMD is saying it will be fixed in FUTURE revisions... fool me once shame on me... fool me twice...

But yeah, I think that's a key thing dB readings don't pick up: the pitch of a sound and the annoying nature of low-frequency noise, like buzzing from a low-Hz pump for example.

July 1, 2015 | 01:43 AM - Posted by PCPerversion

LMAO @ Chizow!

Reading your posts is a RIOT EVERY SINGLE TIME DUDE. Thanks!

But I fear for your health... you need to relax bro, take a chill pill or something.

Getting so fired up can't be good for you despite the fact that it's so amusing to watch.

Try repeating over and over to yourself, "It's just a graphics card, not the most important thing in my life."

If that doesn't help, perhaps Allyn or Ryan could recommend a good psychiatrist?

July 1, 2015 | 10:42 AM - Posted by chizow (not verified)

Hey look, more nonsensical rubbish from some anon AMD fanboy posting under a troll pseudonym.

I know it's easy to laugh at all the BS AMD spouts and their fanboys lap up, every last bit of it. Glad you're enjoying the show!

June 30, 2015 | 09:14 PM - Posted by J Nev (not verified)

Allyn your nVidia bias is now confirmed. The pump whine was only in early batch / reviewer products and you know this and/or you don't know for sure that pump whine exists in newer Fury X parts. Using that argument against AMD is appalling.

June 30, 2015 | 09:30 PM - Posted by chizow (not verified)

LOL seriously? This is exactly why I can't stand AMD and their fanboys; it's like you all WANT cards with pump whine.

Philosophical question:

If a Fury X pump or coil whines, and no AMD fanboy admits to hearing it, does it make a sound? :D

June 30, 2015 | 11:00 PM - Posted by Allyn Malventano

My argument is backed up by the additional pair of Fury X's sitting in our office - both of which are *louder* than our sample. We got those retail units to confirm what AMD had told us (that it was fixed in retail). Apparently it wasn't. What you are interpreting as Nvidia bias is actually AMD repeatedly having to go back on what they tell reviewers.

July 1, 2015 | 01:18 AM - Posted by J Nev (not verified)

OK point taken, pity it had to be pried out of you. Next time explain the facts in full the first time and you may sway people to your inner green leanings.

July 1, 2015 | 09:49 AM - Posted by Anonymous (not verified)

Another option could be that you not jump to wild-ass conclusions and name-calling when you don't have ANY facts at all.

July 1, 2015 | 12:04 PM - Posted by Allyn Malventano

Instead of taking the time to back up each and every little point being made in comments, we were busy taking sound measurements and collecting other data for the upcoming post about it. Sacrifice necessary for the greater good I suppose.

Edit: Here are those results:

June 29, 2015 | 09:01 PM - Posted by chizow (not verified)

Who needs luck when Nvidia has done a better job of supporting a superior technology that they invented, out of the gate?

Is AMD doing a good job blaming monitor partners and vendors when they design a half-baked spec and it ends up being broken, requiring firmware and driver fixes to get to a point where it's still not as good as G-Sync?

Is AMD doing a good job when they STILL haven't delivered a CF FreeSync driver?

And as Allyn already called you out on, why are you comparing to a GTX 980 when you can choose a 980Ti + G-Sync monitor and NEVER SETTLE for inferior AMD products? Sure it costs more, but it's also better.

FreeSync isn't an industry standard at all, and based on the lackluster first wave of panels compared to the outstanding 2nd generation of panels from G-Sync partners at Computex, it is obvious that G-Sync will continue to be a premium option.

Oh and also, the ONLY option when it comes to mobile/laptop VRR gaming.

June 30, 2015 | 12:10 AM - Posted by trenter (not verified)

Please stop trying to align yourself with every author of every article you comment on. Anyone with any respect for themselves and the rest of their reader base will ignore you and anything you have to say. You've been an Nvidia troll on almost every tech website I've ever visited. You have a financial stake in Nvidia, or so you've mentioned in the past; therefore you're no use to anyone and your opinions cannot be trusted.

June 30, 2015 | 10:23 AM - Posted by chizow (not verified)

Who said I aligned myself with every author, idiot? Do you not have the ability to critically think, engage, and question what you read? I just happen to highly agree with Allyn and Ryan's take on this issue among many others, and their degree of skepticism and critical reviews of FreeSync are largely responsible for educating the masses on the topic and getting AMD to address their FreeSync solution's deficiencies.

The fact that you and every other AMD fanboy just want to dismiss and sweep these problems under the rug is what is laughable; but unsurprisingly, you keep getting the same 2nd-rate, half-baked solutions from AMD, because they know quite well the kind of tech bottomfeeders they cater to in the marketplace.

July 1, 2015 | 02:44 AM - Posted by PCPerversion

ROFLMAO @ Chiz this is just so good...

July 1, 2015 | 09:13 AM - Posted by Anonymous (not verified)

And this is what the typical AMD fanboy falls back on when cornered. That is the most amusing to me XD

July 1, 2015 | 10:43 AM - Posted by chizow (not verified)

lolol yep.

June 30, 2015 | 12:23 AM - Posted by trenter (not verified)

He already explained why he was comparing the 980 with the Fury X, moron.

June 30, 2015 | 10:20 AM - Posted by chizow (not verified)

Yes, and he stupidly proved he's an AMD fanboy, just as you have for backing him.

The buying decision should hinge on the GPU first and foremost, so obviously any non-idiot is going to make that decision first. Comparing a 980 to a Fury X is something only an AMD fanboy would do; anyone who wasn't a devout AMD fanboy would just pick the much better 980Ti for the SAME MONEY.

Then once you've "locked in" that decision, the monitor decision becomes clear: why not spend a small premium on a much better VRR solution to take advantage of your faster GPU? If the difference is only $200 on a $1200+ expenditure, why not go for the better solution in all aspects? In that respect, the choice makes a lot more sense.

June 30, 2015 | 11:30 AM - Posted by Anonymous (not verified)

I did just that. Putting that much money up front for this kind of setup is a bit more than I thought I'd spend, but if it gets me the better product and experience it's worth every penny, and I'm sooooo happy I did put up that little bit more for that G-Sync monitor, which makes my PC gaming that much more enjoyable.

There is still no FreeSync equivalent out there, period, plus the ghosting and overdrive issues mentioned all over the net, with proof, were a big no-no for me and others.

June 30, 2015 | 12:33 PM - Posted by chizow (not verified)

Exactly, why settle/compromise and wait/hope for improvements that address FreeSync deficiencies when you can just buy G-Sync panels that don't have any of these deficiencies, and haven't since Day 1?

It's great to see PCPer's work in publicizing the issues has paid off and allowed those actually in the market to make INFORMED decisions.

June 30, 2015 | 04:55 AM - Posted by Anonymous (not verified)

At 1440p I would not get a Fury X. That card really seems to underperform at everything below 4K.

June 29, 2015 | 04:44 PM - Posted by Anonymous (not verified)

Still trolling?

Didn't you learn anything when Ryan Smith from AnandTech put you in your place?

June 29, 2015 | 08:57 PM - Posted by chizow (not verified)

Haha yeah, and then Ryan was conveniently sick, so he didn't join in on the parade of sites raining on the Fury X's review.

It's all fail for AMD fanboys, just like I said.

Rebrandeon 300 series happened.
Fury X failed to impress; despite AMD's bogus benchmarks, it is slower than the 980Ti.
4GB is a problem.
It most certainly is NOT an overclocker's dream.

The list of rubbish goes on, but AMD only gets away with it because their target audience just isn't smart enough to call them on it.

June 29, 2015 | 11:40 PM - Posted by fkr

Nice double post, quit using IE.

June 30, 2015 | 08:59 AM - Posted by Anonymous (not verified)

Really that is your best IE? No comeback Nvidia banter? XD

June 30, 2015 | 01:09 AM - Posted by godrilla (not verified)

To play GameWorks titles at 30 Hz? No thanks!


June 29, 2015 | 04:51 PM - Posted by nevzim (not verified)

Nvidia, even if only current GPUs are capable of it, please start supporting FreeSync now.

June 29, 2015 | 05:31 PM - Posted by Anonymous (not verified)

If they supported frame multiplication, then it seems like they could set the frame limits to the higher range. For a 144 Hz display, it seems like they could set the minimum at 72 and frame-multiply anything below that to stay in the 72-144 Hz range. Attempting to support 35 Hz directly at the panel seems to be difficult, but we know it is doable with a smart enough scaler combined with specific panels. Nvidia seems to limit the possible panels to use with gsync. I assume that it is too difficult to get the overdrive to work properly with some panels. This may be due to inconsistencies across the panel. Not every pixel will have the exact same response curve, especially across a wide range of frequencies. They may pick gsync panels based on consistency of this response.
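
The frame-multiplication idea described above is easy to illustrate. The function below is a hypothetical toy (`vrr_refresh` is not any shipping driver's logic; the 35-90 Hz defaults simply match this monitor's FreeSync window): it repeats each rendered frame just enough times to push the effective refresh back inside the panel's range.

```python
def vrr_refresh(fps, min_hz=35.0, max_hz=90.0):
    """Pick a (multiplier, refresh_hz) pair for a given render rate.

    Inside the panel's VRR window the refresh simply tracks the frame
    rate. Below the hardware minimum, each frame is scanned out k times
    so the effective refresh (k * fps) lands back inside the window.
    """
    if fps >= min_hz:
        return 1, min(fps, max_hz)  # native range: one scan per frame
    k = 2
    while k * fps < min_hz:         # smallest multiplier that reaches the window
        k += 1
    return k, k * fps

# e.g. 20 fps on a 35-90 Hz panel -> scan each frame twice, at 40 Hz
```

Whether this logic lives in the GPU driver or in the scaler is exactly the design question debated in this thread.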

It doesn't seem like a bad idea to support frame multiplication at the video card, since it should have more information about when the next frame is ready than a scaler that is guessing based on limited statistics. There really isn't much stopping scaler makers from implementing similar technology to what gsync does (frame multiplication plus variable overdrive) though, and a sufficiently smart scaler should be able to guess quite well.

I think AMD may believe that doing frame multiplication would just be a temporary solution until scaler makers update their hardware to properly support VRR interfaces. This would lead them to not want to implement it, because the scaler makers would then not have a reason to add it to their scalers. I am leaning towards not adding it to the video card. Interfaces exist to hide such implementation details. Ideally, the video card would support a wide VRR window and the display would do whatever it has to to make this look the best (even frame multiplication) depending on the specific characteristics of the panel. The video card maker should not have to concern themselves with certifying panels. It should just work with whatever display supports the proper interface.

I don't think Nvidia's proprietary solution is good for the market. If you want to make a gsync display, you have to use Nvidia's scaler; the gsync module is really just a smarter scaler. Display makers probably do not want a vendor lock-in either. This should set off some competition among scaler/controller makers, but it will take some time for them to catch up. There is nothing stopping them from doing frame multiplication in the scaler with free sync just like gsync does. It will be a more expensive scaler, but probably not much more once they start making actual ASICs. I would expect Nvidia to get rid of the FPGA eventually also, once production volumes are high enough.

It still seems like Nvidia offers the better solution currently, if you are okay with matching a gsync panel with an Nvidia card. Although this free sync panel shows more ghosting, how noticeable is this during actual game play? I personally already have a very nice fixed refresh display, so I would wait until this settles a bit.

June 29, 2015 | 06:14 PM - Posted by Anonymous (not verified)

Nvidia doesn't make the G-Sync scaler; Altera does.

Intel recently bought Altera, which might pose a problem going forward depending on what Intel plans to do with Altera.

June 29, 2015 | 06:38 PM - Posted by Anonymous (not verified)

Altera makes FPGAs (field programmable gate arrays). These are just generic chips that can be programmed to perform the function that Nvidia wants. Nvidia is fabless, so they technically don't "make" much. Nvidia almost certainly creates the programming for the FPGA. An FPGA is a good solution for low volume parts where it may not be worth it to make an ASIC. Also, FPGAs allow the functionality to be updated almost completely after the product is shipped. You would just write a different program into the ROM which stores the programming for the device, which would be loaded on startup.

There are obviously limitations on the functionality of the FPGA. You can only emulate a design up to a certain size depending on how large the FPGA is. It also has a limited number of off chip connections with limits on the kind of signals they can produce. I would expect Nvidia to make an actual ASIC eventually; that is, they would take the design and actually make a chip implementing the functionality rather than programming the design into an FPGA. Nvidia isn't really limited to Altera for FPGAs. They probably could relatively easily port their design to be programmed into a different FPGA.

June 29, 2015 | 06:52 PM - Posted by Anonymous (not verified)

Altera just makes an FPGA which can be programmed to do the work. I would be curious to know which FPGA it is. I just noticed someone reporting it as an Altera Arria V GX FPGA. These appear to be $200 to $500 in small quantities. Nvidia would get them a lot cheaper in large quantities, but it will still be quite expensive. In large enough quantities, it will be much cheaper to make an ASIC, but the initial cost is large.

June 30, 2015 | 05:27 PM - Posted by Allyn Malventano

Yes, frame multiplication could be done in the driver. Nvidia is doing it for Mobile G-sync, so it's definitely possible.

June 29, 2015 | 05:33 PM - Posted by JohnGR

I would expect to see price at the "PROS:"

Anyway, thanks for the review.

June 29, 2015 | 05:41 PM - Posted by BillDStrong

So, um, WTH is Color Saturation, and why is the sRGB listed next to it? sRGB is saturation free. :)

The term you are looking for is Color Space. And the percentage is the coverage. This is important in the new television standards that are coming out. They won't be using sRGB; they will be using BT.2020 coverage.

Saturation is not referenced in sRGB, other than its relation to the white point it uses.

Also, it might be time to start listing monitor bit depth again, and whether it is using frame dithering to achieve it. The new standards require 10-bit color, with the new Blu-ray standard also supporting 12-bit color.

As these displays become more common, they will trickle down into the PC market, if only because they will offer the best picture you can get.

I wouldn't be surprised to see games remastered in Ultra HD, or HDR in a few years time.
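
The "frame dithering" mentioned above is usually called FRC (frame rate control). As a rough illustration only (a toy, not any panel vendor's actual algorithm), a 6-bit panel can approximate an 8-bit level by alternating between the two nearest 6-bit codes so the time-average comes out right:

```python
def frc_levels(value8, frames=4):
    """Toy temporal-dithering (FRC) sketch: approximate an 8-bit level
    on a 6-bit panel by alternating the two nearest 6-bit codes so the
    average over `frames` refreshes matches the target. Real panels also
    vary the pattern spatially to hide flicker."""
    lo = value8 >> 2               # nearest 6-bit code at or below the target
    hi = min(lo + 1, 63)           # clamp at the panel's top code
    frac = value8 & 0b11           # remainder, in quarters of a code
    return [hi] * frac + [lo] * (frames - frac)

# e.g. 8-bit level 130 -> codes [33, 33, 32, 32], averaging 32.5 (= 130/4)
```

This is why a spec sheet saying "8-bit (6-bit + FRC)" matters: the extra depth is simulated over time rather than driven natively.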

June 30, 2015 | 02:30 PM - Posted by Allyn Malventano

That's an awful lot of analysis about what is likely just a bad translation of a table.

Agreed on the bit depth, but these gaming panels are going to be 24 bit (3x8 bit) for a while longer.

July 1, 2015 | 10:49 PM - Posted by BillDStrong

It took me less than 60 seconds to analyze the faults of the table, and it had to be translated to be useful. So, yes, it was an awful lot of analysis that I shouldn't have had to do.

Now, will you make it better next time? If yes, then it wasn't wasted, and I enjoy helping make the quality of the site better, in my own little way.

June 29, 2015 | 06:04 PM - Posted by Anonymous (not verified)

1000:1 contrast...

VRR will be relevant once it's on something other than LCD.

June 29, 2015 | 07:11 PM - Posted by toffty (not verified)

Bought this monitor and I have been very pleased with it. AMD 290 driving it at 40-75 fps in games. FreeSync makes gameplay amazingly smooth. There is light bleed in the lower right corner, though even in darker games I do not notice it; in fact, even if I look for it, I can't see a color difference in anything other than pure black. Took brightness down to 70 and lowered green and blue down to 96 for calibration (red still at 100).

If you have an amd card and want freesync this is the monitor to get.

June 29, 2015 | 07:18 PM - Posted by toffty (not verified)

I'll add that I upgraded from an HP Z24 which has been great; but the tearing...

Freesync alone was worth the price. Everything else is a bonus.

June 29, 2015 | 09:16 PM - Posted by Mandrake

The notion of 'VRR windows' still seems silly to me. :-/

June 30, 2015 | 03:31 PM - Posted by obababoy

Well, a bunch of the creators just don't see it as a money-making spec right now... Still seems silly though.

July 2, 2015 | 12:27 PM - Posted by Anonymous (not verified)

It is what it is. Current "FreeSync" technology, built on the standard, can't compete with G-Sync's ability to avoid dealing with these windows entirely.

June 30, 2015 | 03:58 AM - Posted by Anonymous13 (not verified)

Finally a monitor that's worth getting.

FreeSync or not, it is a 144 Hz IPS even for non-FreeSync-supporting cards. Compare that to my current 60 Hz IPS... hmm, dreamy.

For FreeSync cards, 35 to 90. Ohh I want. Though I hope that with time the sub-35 Hz side gets additional GPU support like frame doubling to allow the screen to still run in FreeSync. From what I read on various sites, that is what Nvidia's G-Sync does (correct me if I'm wrong on that).

Still, I'd gladly upgrade my non-A-Sync 60 Hz IPS to an A-Sync 35-90 Hz (or 144 Hz for non-A-Sync GPUs) IPS.

If I got used to a static 60 Hz, I don't see any problems getting used to an additional 30 Hz at peak ;)

Now to save up them złoty (I live in Poland :P)

June 30, 2015 | 06:25 AM - Posted by Pobear (not verified)

I want a 34 inch version.

June 30, 2015 | 07:24 AM - Posted by Master Chen (not verified)

35 Hz is good, but not good enough yet for me personally.
I need a panel that can go to 28~30 Hz AT THE VERY LEAST while also being an IPS/PLS screen type and having either a thin-bezel or outright bezel-less design akin to that of Dell's U2414H/U2515H/U2715H monitors. Such a FreeSync monitor would be simply perfect for me personally, even if it were just 1080p 24"/27". This here MG279Q is not quite there yet, is what I'm saying. It's close, but not there yet. Maybe two more years or so.

June 30, 2015 | 08:14 AM - Posted by Anonymous (not verified)

Even people with AMD graphics cards that I know (7950, 7970, R9 290) still haven't bought a FreeSync monitor. So far one said there isn't a good one out yet that he wants, and this one did not meet his requirements: the VRR window is still too small, and he also wants up to 144 Hz VRR.

June 30, 2015 | 08:48 AM - Posted by Anonymous (not verified)

Not to mention the ghosting & overdrive issues still present....

June 30, 2015 | 09:15 AM - Posted by justin150 (not verified)

The truth is that both GSync and Freesync are still immature technologies. But this monitor shows how far they have come in a year.

I want my gaming monitor to support the highest possible frame rate with least possible ghosting and overdrive issues.

I will start building my new gaming system at Xmas. I have been a green team member for years, but I still have an old AMD system somewhere.

My one complaint is that we now have to choose monitors to match GPUs. That sort of lock-in (Nvidia!) is not acceptable.

June 30, 2015 | 10:16 AM - Posted by Anonymous (not verified)

I have been using the ACER Predator XB270HU and it's been fantastic. No immature tech here.

I have been playing even more since I have had this thing. Not a single flaw in gaming or general PC use. Best experience since 120-144 Hz monitors were first introduced!

June 30, 2015 | 10:37 AM - Posted by Stefem (not verified)

Honestly, I don't care so much about the VRR window size, but look at page 3; that ghosting is not acceptable for me. Have you ever tried a ULMB display?
I tend to change monitors less often than graphics cards; that's why I waited some time before switching from CRT to LCD. The monitor needs to be overall better than the one I'm replacing, not half better and half worse.
It's nice to see AMD improving its solution, but I cannot overlook that NVIDIA found and solved those issues before commercializing G-Sync.

June 30, 2015 | 11:34 AM - Posted by Anonymous (not verified)

Definitely agree with you!

June 30, 2015 | 02:22 PM - Posted by Anonymousx3 (not verified)

Does freesync support borderless windowed mode gaming or is that only supported by gsync for now?

June 30, 2015 | 02:59 PM - Posted by Allyn Malventano

That's G-Sync only (and very recent at that).

July 1, 2015 | 11:08 AM - Posted by obababoy

Very petty of me, but I wish it had the anodized aluminum headset holder like the... Acer, was it? Thought that was a really cool feature, because my $300 headphones should not be on the desk (and should have come with a stand).

July 1, 2015 | 12:05 PM - Posted by Allyn Malventano

That was the BenQ.

July 1, 2015 | 06:01 PM - Posted by Ultramar

Great review guys.
When will we get the AOC G2460PF review?

July 2, 2015 | 08:19 AM - Posted by obababoy

Got this monitor last night. Newegg shipped my 4-7 day shipping item in 1 day.
So far it is blowing my freakin mind!! I had the ASUS VG236H 23-inch 1080p TN 3D monitor from 2010, went to this, and I couldn't go to sleep last night. Played Witcher 3 at 1440p max settings with my R9 290 and hovered around 37-45 fps the whole time, and I get the same smoothness now with HairWorks on that I got before with HairWorks off (that put me at 60 fps with it off at 1080p).

Only thing I wish I could do is switch FreeSync on and off with the push of one button, or have it go back to 144 Hz when I exit a fullscreen game or something.

July 13, 2015 | 01:42 AM - Posted by Chris M. (not verified)

Can't you just use FRTC (Frame Rate Target Control) in the AMD drivers and set it to just under 90, so that you never go over 90 and therefore remain within the FreeSync range of this monitor?
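
For what it's worth, the effect of such a cap is easy to sketch. This toy limiter (hypothetical, and not how FRTC is actually implemented inside AMD's driver) just sleeps out the remainder of each frame interval so the output rate stays under the panel's 90 Hz FreeSync ceiling:

```python
import time

def frame_limited_loop(render_frame, cap_fps=88.0, frames=10):
    """Crude frame-rate cap: after rendering each frame, sleep out the
    rest of the frame interval so the loop never exceeds cap_fps
    (e.g. 88, just under the MG279Q's 90 Hz FreeSync ceiling)."""
    interval = 1.0 / cap_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                               # the game's work per frame
        leftover = interval - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)                     # cap the presentation rate
```

A driver-level cap like FRTC does this without the jitter of a sleep-based loop, which is why it is the better tool for pinning a game inside a VRR window.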

August 18, 2015 | 09:04 AM - Posted by Azix (not verified)

These damn prices. Looks like I'm skipping freesync

August 23, 2015 | 12:34 AM - Posted by Anonymous (not verified)

A 144 Hz/120 Hz monitor makes no sense unless it supports G-Sync or FreeSync, aside from those who use 3D glasses at 120 Hz. I've got one GTX 980 and am waiting for the new ASUS PG279Q, and hope I'll be fine.

September 27, 2015 | 09:31 AM - Posted by Anonymous (not verified)

I'm sat here reading through these comments and wondering how pathetic you lot are. I mean, Jesus Christ, they are just graphics cards!!! You know, there is a thing called a front door; why not go outside and enjoy life once in a while? You might find something more interesting to bitch about than AMD vs NVIDIA. They are both the same and do the same job: display graphics on your monitor. *Yawn... yawn... yawn...*

November 10, 2015 | 08:09 PM - Posted by Anonymous (not verified)

2 tru mah friend, we need to stop arguing over whose e-peen is bigger.
