G-SYNC is sweet but far from free

Subject: Displays | August 12, 2014 - 03:36 PM |
Tagged: asus, g-sync, geforce, gsync, nvidia, pg278q, Republic of Gamers, ROG, swift, video

Ryan was not the only one to test the ASUS ROG Swift PG278Q G-Sync monitor; Overclockers Club also received a unit to try out.  Their impressions of the 27" 2560 x 1440 TN panel were very similar: once they saw this monitor in action, going back to their 30-inch 60Hz IPS monitor was not as enjoyable as it once was.  The only bad thing they could say about the display was the MSRP; $800 is steep for any monitor and makes it rather difficult to even consider getting two or more of them for a multiple display system.


“When you get down to it, the facts are that even with a TN panel being used for the high refresh rate, the ASUS ROG Swift PG278Q G-Sync monitor delivers great picture quality and truly impressive gaming. I could go on all day long about how smooth each of the games played while testing this monitor, but ultimately not be able to show you without having you sit at the desk with me. No stuttering, no tearing, no lag; it's like getting that new car and having all the sales hype end up being right on the money. When I flip back and forth between my 60Hz monitor and the PG278Q, it's like a night and day experience.”


August 12, 2014 | 04:33 PM - Posted by H1tman_Actua1

nothing is free.

especially nothing awesome is free.

Nor is AMD's so-called "free" solution.

WTF is your point?

August 12, 2014 | 05:05 PM - Posted by Gerlad (not verified)

Well, AMD's FreeSync is better in that it doesn't buffer frames after scanning them, which means it doesn't add one or two frames of input lag on top of the display's own input lag.

Oh, and you don't need to buy a monitor that costs $150-$200 more because of a special Nvidia kit to get a dynamic refresh rate; you just need to buy a monitor with DP 1.2a or higher, which doesn't cost any extra money to implement in a monitor, unlike Nvidia's G-Sync kit.

August 12, 2014 | 05:16 PM - Posted by arbiter

that "lag" as you call it is PR put out by AMD so can't believe everything AMD says given their history of making claims.

With AMD's approach, if you don't already have a supported video card you have to buy one that costs $150 or more to use theirs, so in the end the cost is around the same.

On a side note, people buying Nvidia expect to pay more; their stuff tends to have more R&D in it. If you don't like it, don't buy it; it's your money.

August 12, 2014 | 06:20 PM - Posted by Gerlad (not verified)

It's not PR. Nvidia admitted to MaximumPC that their implementation will cause at least one frame of lag.
G-SYNC is just a cheap attempt by Nvidia to make a quick buck.

Between spending an extra $150-200 on an overpriced G-SYNC monitor and buying a new Radeon, I'd choose the latter.

August 12, 2014 | 06:34 PM - Posted by arbiter

That $150 Radeon is only something like a 260X/265, so not really that great of a deal. Beyond those you're buying into the 290 series.

One frame of lag at 144 fps is only 7ms. Wow, that is so much latency that I will get totally owned in everything.

Cheap attempt, huh? AMD has yet to prove their stuff even works, except for their own controlled demos shown in videos.

August 12, 2014 | 07:30 PM - Posted by Gerlad (not verified)

It was at least one frame of lag. That translates to 20-25ms of extra input lag added to your monitor's signal processing lag. It's quite noticeable on those 28" 60Hz 4K monitors.

August 12, 2014 | 07:41 PM - Posted by gloomfrost

Source? I would love to see a real FreeSync vs G-Sync comparison proving your theories.

August 12, 2014 | 07:50 PM - Posted by arbiter

I would love to know where you get that 1 frame is 20-25ms; 1 frame at 60Hz is about 16ms. The other question is whether that 1 frame of buffer is really a buffer, or whether it's just there to keep feeding the display in case something in the game causes the fps to drop a ton.
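For what it's worth, the frame-time arithmetic both commenters are quoting is just the reciprocal of the refresh rate; here is a minimal Python sketch of that calculation (illustrative only, not anything from either vendor):

# Frame time in milliseconds for a given refresh rate (Hz).
# One frame of buffering delays the image by roughly this long.
def frame_time_ms(refresh_hz):
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per frame")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms

So one buffered frame works out to roughly 17ms at 60Hz and roughly 7ms at 144Hz.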

August 12, 2014 | 07:39 PM - Posted by gloomfrost

Hmm... I think we are not yet fully aware of the benefits of intelligent two-way communication versus FreeSync's one-way communication approach.

August 12, 2014 | 05:09 PM - Posted by flame post (not verified)

getting popcorn

....

August 12, 2014 | 05:20 PM - Posted by Anonymous (not verified)

yeah, that's a pretty troll-tastic headline considering the discourse surrounding this topic.

August 12, 2014 | 05:47 PM - Posted by arbiter

AMD fanboys will come out just to attack the topic while ignoring that AMD's solution isn't much cheaper, if at all.

August 12, 2014 | 07:48 PM - Posted by Daniel (not verified)

Watching brand loyalists go at it is like watching two four-year-olds have a fight. You have fun now.

August 12, 2014 | 08:02 PM - Posted by arbiter

I see it's mostly AMD fans trashing Nvidia any chance they get: Nvidia releases new tech, and five seconds later there's a post bashing them for something. Even when it's an AMD article they find a way to bash Nvidia, and doing that tends to start the war. Yet for all the people that try to bash Nvidia, most people vote to BUY Nvidia with their money; case in point, look at Steam's hardware survey numbers.

August 12, 2014 | 09:50 PM - Posted by Anonymous (not verified)

The majority of Steam users use Intel IGPs, so most people buy Intel rather than AMD or NVIDIA.

August 12, 2014 | 11:57 PM - Posted by wujj123456

I thought G-Sync doesn't work with multiple displays anyway? So Nvidia knows it will sell at most one? :-)

August 13, 2014 | 01:33 AM - Posted by Anonymous (not verified)

Fanbois need to fuck off in general...

No one cares about your worthless opinion.
I'll spend my money the way I feel like it.

G-Sync is here now and it's awesome.
We don't even have a prototype monitor for FreeSync, and no clue if it even works as well as G-Sync. What companies SAY is meaningless.

August 13, 2014 | 05:02 AM - Posted by -Z-

Question - Is a multi-monitor setup supported with G-sync? Or is it 1 card, 1 monitor? In that case, would a tri-SLI system be able to (hypothetically) drive 3 G-sync monitors? Am I asking a dumb question when I ask if that also means that all 3 monitors need to be in sync with each other as well as the graphics cards driving them?

August 13, 2014 | 06:37 AM - Posted by remon (not verified)

So, shouldn't we add the cost of the card for G-Sync too? With this monitor you have to spend up to a thousand dollars to have G-Sync. Wooo.

August 13, 2014 | 02:51 PM - Posted by funandjam

It's the same idea for FreeSync as well. There are plenty of AMD cards that won't work with FreeSync, such as my 7970. If I wanted either -sync tech, I'd have to buy a new card along with a new monitor.

I'm sure that when DisplayPort Adaptive-Sync enabled monitors are released they will carry a premium over regular monitors, but will it be as much as G-Sync's was at release?

August 13, 2014 | 07:52 AM - Posted by Anonymous (not verified)

It seems obvious to me that variable refresh rate should be part of the display standards. I should not be locked into a specific manufacturer for video card and display; this is why standards exist. If you want a proprietary solution, get an Apple computer with their Thunderbolt-only display. That display will basically not work with anything except a Thunderbolt-enabled Mac: Thunderbolt is the only input port it has, and they seem to have deliberately prevented it from working with just a DisplayPort input.

August 13, 2014 | 08:26 AM - Posted by Tim Larsen (not verified)

G-Sync will work on multiple monitors BUT apparently it needs a card per monitor so a surround setup will require 3 cards.
