Acer has announced three new G-Sync gaming monitors, all of which come equipped with eye-tracking technology from Tobii. The displays range from 24.5" to 27" in size, with refresh rates up to 240 Hz.
Acer Predator Z271T
"Each new monitor features NVIDIA G-SYNC and high refresh rates for smooth gaming experiences without lag. The new Predator gaming monitors are available in different sizes and configurations to meet the needs of a wide range of users looking to take their gaming experiences forward."
- Predator Z271T: 27”, curved screen (1800R curvature), FHD 1920 x 1080, 144 Hz
- Predator XB251HQT: 24.5”, flat ZeroFrame screen, FHD 1920 x 1080, 240 Hz
- Predator XB271HUT: 27”, flat ZeroFrame screen, WQHD 2560 x 1440, 165 Hz
Acer Predator XB271HUT
The Z271T is the sole curved option, with an 1800R (1800 mm radius) curve and standard 1920×1080 resolution at 144 Hz. The flat-panel versions offer a choice between a very high refresh rate (240 Hz on the 1920×1080 XB251HQT) and higher resolution (2560×1440 at 165 Hz on the XB271HUT).
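As a rough sanity check on these modes, the uncompressed pixel data rate can be estimated as width × height × refresh rate × bits per pixel. The sketch below ignores blanking intervals and assumes 24-bit color, so real link requirements are somewhat higher, but all three modes land comfortably under DisplayPort 1.2's ~17.28 Gbit/s payload:

```python
# Back-of-envelope pixel data rates for the three announced modes.
# Ignores blanking intervals and assumes 24-bit (8 bpc) RGB color;
# actual link bandwidth requirements are somewhat higher.
MODES = {
    "Z271T":    (1920, 1080, 144),
    "XB251HQT": (1920, 1080, 240),
    "XB271HUT": (2560, 1440, 165),
}

BITS_PER_PIXEL = 24  # 8 bits per channel, RGB

def raw_gbps(width, height, refresh_hz, bpp=BITS_PER_PIXEL):
    """Uncompressed pixel data rate in Gbit/s (no blanking overhead)."""
    return width * height * refresh_hz * bpp / 1e9

for name, (w, h, hz) in MODES.items():
    print(f"{name}: {raw_gbps(w, h, hz):.2f} Gbit/s")
```

The 240 Hz 1080p mode works out to roughly the same data rate as 1440p at 165 Hz, which is likely why both can ship on current DisplayPort hardware.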
Acer Predator XB251HQT back, side view
U.S. pricing and availability have not been announced.
With this level of ‘innovation’ Acer will go bankrupt in no time.
So, they all have the clown shoes? Shame
I hope these are priced between $150 and $250; otherwise, I’ll just wait for the 4K gaming monitor push.
Monitors/TVs are supposed to be purchased at most every 5 years. 1440p arrived in 2012, high refresh in 2013, and G-Sync in 2014. It’s 2016; price accordingly, please.
Are display manufacturers making an effort to produce 4K 100 Hz monitors now that we have DP 1.4?
so much bulk on the bottom
So many gimmicks being thrown at monitors these days, and yet image quality is still terrible. My monitor is 13 years old and there’s still nothing worthwhile to upgrade to. There probably will never be, until we get OLED.
… and yet the biggest drawback of OLED is that they are worse than 1st gen plasma televisions when it comes to latency. My parents bought a new LG OLED and playing games is physically impossible on that screen. HDR 4k content is amazing to look at, but it isn’t that much better than a good IPS screen.
To be fair, that’s the big advantage of Plasma, in that latency is almost zero.
It all comes down to the image processing between the input and the phosphors, which tends to add a lot of latency. That’s OLED’s current problem – too much processing latency. Looks incredible though…
Mark input as PC and turn on gaming mode.
I suspect that quantum dot enhanced LCD is the best overall technology right now. The high contrast of OLED can really pop, but it actually has limited max brightness and other possible issues. As far as I know, Apple has still never used OLED tech on their phones. I assume that there is a reason for this. I have seen burn in on OLED display phones in the store that have been on continuously for a long time. I don’t know if they will work well for a computer display which displays static images most of the time.
I am wondering if LCD tech will advance to the point where the drawbacks of OLED will clearly not be worth it. It would be great if they can get full array LED backlighting down to a cheap price and add a lot more zones. Televisions with full array local dimming are in the $5000 range or so currently.
I have a Dell U3011 Ultra Sharp display (2560×1600@60 Hz). I purchased it because I wanted 16×10 in that size rather than 16×9. I was also not impressed with the LED backlighting of the time. It is the last model with a CCFL backlight. It looks spectacular for images, although I do have some issues with deep color under Linux. It is a 10-bit panel, I believe, so applications which do not support it correctly can look massively over-saturated or under-saturated in some cases. It doesn’t have that great of pixel density when sitting close. It also gets quite hot. A QD enhanced LED based display would probably run a lot cooler and take less power.
I don’t know what I will get as a second display. I would like to have higher pixel density (I sit relatively close) for text. I also want a much higher refresh rate. I don’t think we will be getting a 120 Hz, 4K, quantum dot enhanced computer display anytime soon, at least not at a price I would be willing to pay. The U3011 was about $1200 when I bought it though. I didn’t really need that level of color accuracy, but the choices for 30 inch displays were very limited at the time. Also, the 6-bit TN panels I see on most laptops and cheap displays look terrible to me. Some of the IPS touch screens don’t look too bad in comparison, but I don’t know how they would look next to a much better quality IPS.
I am wondering what you have that is 13 years old and is still of such good quality. Displays have come a long way really, but really good quality is still expensive.
I’ve got the Tobii EyeX that powers these. I mostly use it for Windows Hello, which is surprisingly useful. For games, I’m not too sold on the look-to-change-view function, and thus far no games have implemented look-to-lock-on (as seen effectively in the setup demo) which would be an actually useful functional application of eye-tracking.
240 Hz isn’t a “real” refresh rate, right? I would think that would be outside the actual response time of the panel, leading to massive amounts of ghosting.
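One way to frame this worry is to compare the frame interval at a given refresh rate against the panel's pixel response time. A minimal sketch (the GtG figures in the comments are hypothetical ballpark values, not measurements of these specific panels):

```python
# Frame interval at a given refresh rate, for comparison against a
# panel's pixel response time.
def frame_time_ms(refresh_hz):
    """Time between refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 165, 240):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per frame")

# At 240 Hz each frame lasts ~4.17 ms. A fast TN panel with a ~1 ms GtG
# transition can settle well within one frame; a slower ~5 ms panel
# could not keep up, which is where visible ghosting would come from.
```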
With eye/gaze tracking, Nvidia can use the same hardware that does single-pass stereo for VR headsets to make single-pass overlapping projections for foveated rendering. Two eyes and two resolutions would make 4 viewports, and the Pascal hardware can support up to 16 viewports.
https://pcper.com/reviews/Graphics-Cards/GeForce-GTX-1080-8GB-Founders-Edition-Review-GP104-Brings-Pascal-Gamers/Simul
This should let a GTX1070 get 165+ FPS in next-gen shooters. It might even let mainstream cards like the GTX1060 get over 60 FPS on a 4k display.
https://en.wikipedia.org/wiki/Foveated_imaging
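The potential savings behind that FPS claim can be sketched with some simple arithmetic: render a small gaze-centered region at full resolution and the periphery at reduced resolution. The region size and resolution scale below are illustrative assumptions, not figures from Nvidia.

```python
# Illustrative (hypothetical) estimate of shading savings from foveated
# rendering: a gaze-centered region at full resolution, the periphery at
# reduced resolution.
def effective_pixel_fraction(foveal_area_frac, periphery_res_scale):
    """Fraction of full-resolution shading work remaining, given the
    fraction of screen area kept at full resolution and the linear
    resolution scale applied to the periphery."""
    periphery_frac = 1.0 - foveal_area_frac
    return foveal_area_frac + periphery_frac * periphery_res_scale ** 2

# Example: 20% of the screen at full resolution, periphery at half
# resolution per axis (i.e. 1/4 the pixels).
frac = effective_pixel_fraction(0.20, 0.5)
print(f"shaded pixels: {frac:.0%} of full res -> ~{1.0 / frac:.1f}x fewer")
```

Under those assumptions only 40% of the pixels get shaded at full cost, about a 2.5× reduction, which is the kind of headroom that could push a mid-range card past a high-refresh or 4K target.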