The new Acer Predator X34 claims a “world's first” as a curved 34-inch IPS G-SYNC gaming monitor, and from appearance to specs the new display looks very impressive.
- Curved 34-inch IPS 21:9 ultra-wide QHD display
- 3440×1440 resolution @ 60 Hz
- 4 ms response time
- 100,000,000:1 max contrast ratio
- 300 cd/m² brightness
- 1.07 billion colors (10-bit)
- 100% sRGB
- Panel supports overclocking to 100 Hz
- NVIDIA G-SYNC technology
- Two 7W speakers enhanced with DTS Sound
- Zero frame design maximizes viewing area
- Tilt from –5 to +35 degrees, height adjustments up to 5 inches
- Connectivity includes HDMI, DisplayPort 1.2, and 4x USB 3.0 ports
The 60 Hz native refresh rate might raise some eyebrows, but the adjustable overclock up to 100 Hz should satisfy those looking for a high-FPS experience. In any case, at 3440×1440 it would be difficult to max out even 60 Hz in the newest games at ultra settings on a single GPU, and when a game dips well below 60 FPS, the included NVIDIA G-SYNC variable refresh technology will certainly help smooth things out.
So how much will the new Predator X34 set you back? Acer says the monitor will be available “at leading online retailers in the United States” for a cool $1299.
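To put the variable-refresh benefit in concrete numbers, here is a rough sketch of why G-SYNC matters when a game renders below 60 FPS (the 21 ms render time is an illustrative value, not a measurement):

```python
# Sketch: why variable refresh helps when a game renders below 60 FPS.
# With fixed-refresh V-Sync, a frame that misses the 16.7 ms deadline
# waits for the next scanout; with VRR the panel refreshes on demand.

import math

def vsync_frame_time(render_ms: float, refresh_hz: float) -> float:
    """Effective on-screen frame time with fixed-refresh V-Sync."""
    interval = 1000.0 / refresh_hz
    # The frame is held until the next refresh boundary after it finishes.
    return math.ceil(render_ms / interval) * interval

def vrr_frame_time(render_ms: float) -> float:
    """With VRR, scanout begins as soon as the frame is ready."""
    return render_ms

render_ms = 21.0  # ~48 FPS render rate
print(vsync_frame_time(render_ms, 60))  # held two refresh cycles: ~33.3 ms, effective 30 FPS
print(vrr_frame_time(render_ms))        # 21 ms, effective ~48 FPS
```

The point of the sketch: at a fixed 60 Hz, a 21 ms frame gets quantized to a 33.3 ms display interval, while VRR displays it as soon as it is done.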
I’m new here, so, hi you all 😉
Ordered this baby 9 weeks ago and I’m so excited to see how the experience will be!
Although the first shipped models have some strange issues, I hope they will be fixed in future revisions.
This will be my first proper monitor after playing on old Samsung TVs 🙂 I expect a huge step up in terms of quality and I guess I will not be disappointed. At least if the issues from the first shipped ones are solved.
Can’t wait for a review tho!
What issues have you heard of?
Some people have issues with color banding and coil whine. I don’t know if the backlight bleed is worse than on other IPS panels, but it looks very strong.
There are also samples that aren’t able to handle 100 Hz.
Some links with a few pics (the first is a German forum):
http://extreme.pcgameshardware.de/news-kommentare-zu-monitore/395640-acer-predator-x34-gekruemmter-g-sync-monitor-im-juli-fuer-1-400-euro-7.html#post7697276
http://www.overclock.net/t/1537403/tftcentral-acer-predator-xr341ck-34-curved-gaming-screen-with-g-sync/2130#post_24409297
They have claimed these issues will be solved with a firmware update tho! I don’t know if it’s possible to fix all those things in software; maybe they have to fix some of it in hardware as well. My shop now says “delivery date unknown”… again 🙂
It might be downgrading the image signal once you go into 3D and G-Sync mode.
That’s the only reason banding would show up: if the image signal is lower than the image being displayed.
Hi, it seems the color banding issues have been solved.
TFT Central has a review of this monitor from mid-September and they noticed the issue. Acer replied, and it should have been fixed through firmware.
http://www.tftcentral.co.uk/reviews/acer_predator_x34.htm
They confirmed the firmware update fixed it, but not that Acer has applied the fix to all the monitors.
I missed the PCPerspective article about these problems, with all the photos and videos, and the extensive analysis about how this is Nvidia’s wrongdoing.
lmao this AMDtroll again.
Let’s see, maybe there’s no PCPer article on it because this panel isn’t even available in the US yet, and while there are undoubtedly issues with some of these Acer monitors, there aren’t widespread specification limitations and defects like we’ve consistently seen with FreeJunk panels.
Also, you and every other AMD fanboy should be thanking Allyn and PCPer, that’s 2x now they’ve done you a solid and exposed issues with AMD products that ultimately led to SOLUTIONS for you the AMD end-user, because we all know AMD fanboys are too protective and hush hush about AMD problems to bring these issues to light otherwise. 😉
1) CrossFire #RuntframeGate
2) FreeJunk broken OD and ghosting.
Hey, where have you been? The Nvidia army was falling apart without you posting in 50 hardware sites 200 posts a day to defend Nvidia. And yes if people google “chizow” they will verify those numbers.
Hey, where have you been? The AMD army was falling apart without you posting in 50 hardware sites 200 posts a day to defend AMD. And yes if people google "JohnGR" they will verify those numbers.
You could have a point if I was posting all over the internet like he does, and only on Nvidia – AMD threads. I mean who posts ONLY ON ONE AND VERY SPECIFIC TYPE OF articles?
For example
https://disqus.com/by/chizow/
Almost 22,000 posts in about 2.5 years. That’s about 25 posts per day, and ONLY on Disqus.
People who want to rid the world of AMD fanboys, like you. 🙂
25-50 posts a day, ALL for Nvidia?
Is this a part-time job or a full-time job?
Neither, it’s a hobby: poking fun at all the AMD fanboys and the stupidity they post. If you didn’t post such stupidity, I wouldn’t have anything to reply to. Sadly, it looks like that fun may all be coming to an end soon enough. 🙂
My real job provides me the freedom and resources to enjoy and buy the best while making fun of AMD fanboys like you that don’t even buy the products you fanboy over! 😀
Guys FFS grow up?
Where is the fun in that? :p
Daaamn!
It’s THAT bad?!
LOL nice one Allyn
Such a childish post from someone who should know better, since you work for this site. You really come off looking just as bad as those you are replying to, and more like what you are often accused of being.
He’s just pointing out the logical fallacies that AMD fanboys frequently ignore. It’s OK, there’s a lesson in every post, and I think it’s great that Allyn is willing and able to dish it out. 🙂
Just waiting for AMD to implode; there’s not much else going on until next-gen now that Nvidia has handily won this round. Fiji was a total failboat and the Rebrandeon 300 series is, well, just a bunch of rebrands.
Don’t worry, bad news incoming for AMD fanboys later today when they release earnings (losses); the only question is how AMD fanboys like you will spin it? 😀
And 50 hardware sites? LMAO try again, you must be getting confused with yourself as Allyn alluded the only one spamming multiple sites is you. 😉
I already proved that you spam 25+ posts a day on Disqus alone. Add PCPer, Anandtech and the other sites where you post, and the numbers are way beyond someone who just passes his time online. Especially when his only interest is AMD and Nvidia articles. No posts on politics, athletics, games, other hardware, cars, cinema, weather, whatever.
The description screams LIKE A JOB.
JMO
LMAO, and Allyn already proved that you spam multiple sites a day all across the internet. I rarely post here other than to point out your stupidity and double-standard fanboying, but it’s certainly a nice diversion from the less serious trolling on Disqus. 🙂
And why would I post on that other stuff on a tech site when there are real people in my life to have those conversations with, lol, while few of them are interested in my tech-related hobbies?
No he didn’t. He just supported a fellow Nvidia spammer, brother to brother. Nothing more than that. And that’s the problem with Allyn: he is one of the TOP authors, but he usually can’t fight the Nvidia fanboy in him. He just proved what I am saying.
LOL, once again, anyone who claims they don’t have some bias or affinity for one brand or company is just LYING to themselves, especially anyone who posts non-anonymously and puts their name to a point of view. It’s just human nature; to deny this is to deny yourself.
Allyn probably does have an affinity for Nvidia, does that automatically make him an Nvidia fanboy? No, imo.
It just means:
1) He prefers the better product.
2) He places higher value on inherently good or interesting tech and has the analytical capabilities to quantify this.
3) He doesn’t like BEING LIED TO by vendors lol.
4) He’s smart enough to smell BS when it’s passed under his nose.
No offense, but I know none of this really resonates with you because, well, you’re an AMD fanboy who can’t relate to any of it. For most people who prefer Nvidia, however, I think this makes perfect sense.
So yes, when Raja Koduri, Richard Huddy etc. spout a bunch of claims to his face at CES 2014 and CES 2015 that end up, well, not being truthful, maybe he takes it personally, along with the fact that the competition unapologetically just does it better.
You call it fanboyism, I call it just preferring the better product. It’s no biggie; fans of the best tend to prefer different products from different vendors over the years, while true fanboys like you ride junk products like Sempron to the grave 🙂
I've done several pieces including (or even centered around) G-Sync related issues. They faded quickly because the issues were fixed. Just like your fading memory of my stating this in reply to several of your previous similar accusations/comments. My analytical process trumps your selective memory any day of the week.
Any links to articles like that analysis of the ASUS FreeSync monitor that was topping out at 90 Hz, and how that was AMD’s fault?
Your articles are TOP in their class, as is your love for a specific company.
Don’t worry, just keep ducking your head in the sand:
https://pcper.com/reviews/Editorial/Look-Reported-G-Sync-Display-Flickering
Allyn’s work is great though, any true enthusiast sees the importance of the kind of investigative work he does, as it brings problems to light so that they can be FIXED or IMPROVED upon by the vendor.
Oddly, it seems the most vested AMD fanboys aren’t even interested in the products they fanboy over, they’re just in it to sweep any problems under the rug as if any potential problems are more damaging to AMD’s reputation than a half-baked shoddy solution like FreeJunk. Go figure!
Also, another prime example of PCPer going to bat for the enthusiast while shedding any notions of homering for their favorite brands, Ryan specifically acknowledged G-Sync’s inability to turn V-Sync off at the top of its VRR range as a negative compared to FreeSync, and even mentioned it to Tom Petersen on numerous webcasts.
And guess what happened! Petersen, as he is known to do, took this feedback and IMPROVED THEIR PRODUCT and updated their driver to allow this for G-Sync. That’s how progress is made for the benefit of the consumer, but I know this is all foreign to fanboys like you that don’t even buy the products you fanboy over.
So yes, once again, Ryan, Allyn, and PCPer get to take a bow for making products better for actual users and enthusiasts, while AMD fanboys get to STFD and enjoy the FreeJunk they are trying to protect and pretend is as good as G-Sync, when it’s not; with fanboy apologists like you, it shouldn’t be any wonder WHY that’s the case. 😉
Yes, he does write excellent articles. But he also has double standards with AMD and Nvidia.
As for the rest of your post, I didn’t read it. I don’t get paid to read what you probably get paid to write. JMO of course.
lol yep like I said, burying your head in the sand like a typical AMD fanboy/apologist.
Yes he writes excellent articles and yes, he and Ryan get results. Most tech enthusiasts have always found the value of the press and tech sites because they have always been our strongest and most direct voice to these companies to tell them what we want, what they are doing right, and what they are doing wrong.
So why again do you always take issue when PCPer points out problems with AMD products that ultimately effect positive change, while ignorantly claiming PCPer doesn’t do the same when they find problems with Nvidia products, when they clearly do?
Oh right, because you’re an AMD fanboy that doesn’t actually buy AMD products and therefore doesn’t see the value in his investigative findings, just fanboys over them because, well fanboys will be fanboys. 😀
“My real job provides me the freedom and resources to enjoy and buy the best while making fun of AMD fanboys like you that don’t even buy the products you fanboy over! :D”
Aren’t comments like that kind of the essence of what fanboyism is all about?
No, not really, because the main distinction that I will always hold to is the fact I am a fan of great products, and over time, this will naturally change. AMD fanboys however, well, seldom is AMD the best, they’re just fanboys because they’re fans of a company and are willing to ignore all else in their purchasing decisions.
For example, originally I was a 3Dfx fan like most early PC gamers, they fell, I swung between ATI and Nvidia until Nvidia got a commanding lead with G80 (8800 GTX) and since then I’ve had no reason to go back to AMD, in fact, performance and price aside, Nvidia has only given me more reason to stay within their ecosystem.
Features, products, support, end-user experience and analytical thinking trump fanboyism, every day of the week. 🙂
Oh ouch, feel the burn JohnGR 😉
It’s just in your mind.
Wow, you pre-ordered for $1300? That’s a lot to plunk down on a 1st-run panel from Acer. There’s also the ROG Swift 21:9 3440×1440 expected next year for similar money; personally I am waiting for that, especially since this Acer doesn’t even officially support 100 Hz.
I know Asus has had some problems as well on their high-end panels but to me, Acer will always be a value brand, and while they’ve made some improvements I’d still spend on Asus ROG over Acer if money is the same.
V1 firmware has blue banding as described by TFT Central; that was fixed with the V2 firmware.
g-sync only ?
How much did Nvidia have to ‘pay’ for this to happen?
Yep, they paid a ton of money, those bastards. And ACER screwed ’em anyway!
http://www.amazon.com/Acer-XR341CK-bmijpphz-34-inch-UltraWide/dp/B0111MRT90#Ask
It wouldn’t be too hard for a third party to support both g-sync signals and adaptive sync, but (AFAIK) you currently can’t do g-sync without using a g-sync module from Nvidia. Nvidia isn’t going to support free sync unless they absolutely have to. It would be nice if scaler makers would support frame multiplication in their free sync scalers, as this is about the only real difference.
The FreeSync version has already been out for 2 months.
Overclocking the panel to 100 Hz: does this mean G-Sync will operate up to 100 Hz, or will G-Sync only work up to 60 Hz with anything above that running at a fixed rate?
I have heard that the overclocking is only available with G-Sync. Which seems weird to me, might be wrong… But the overclocking definitely works with G-Sync.
Shouldn’t be a surprise really; Nvidia basically created their own custom scaler 2 years ago when they introduced VRR, and since there were no COTS scaler ASICs on the market that could handle it, they created their own premium FPGA.
They soon found out that they weren’t subject to the restrictions and QC issues found on generic scalers, and the result is the ability to OC beyond traditional scaler ranges.
The original ROG Swift was able to OC from 120Hz (original spec) to 144Hz by release and the Swift 2 IPS is rumored to go from 144Hz all the way up to 165Hz! The Swift 21:9 also supposedly goes up to 100Hz which is a 25Hz improvement over the 75Hz on the FreeSync version.
So all-in-all, the premium for G-Sync can really be attributed to a much higher performing scaler ASIC; not unlike a faster GPU or even a faster CPU, it’s simply better and commands a premium.
Not only does G-Sync allow higher VRR framerates, but due to the local RAM acting as a lookaside buffer as Allyn/PCPer excellently detailed, it allows for virtually no minimum FPS floor in VRR mode by doubling and tripling frames at low refresh rates.
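The frame doubling/tripling described here can be sketched in a few lines (a simplified illustration of the idea, not Nvidia's actual firmware logic; the 30–100 Hz window is an assumed example range):

```python
# Rough sketch of low-framerate frame multiplication: when the game's
# frame rate falls below the panel's minimum VRR refresh, the module
# re-scans the buffered frame 2x, 3x, ... so the physical refresh rate
# stays inside the panel's supported range.

def multiplied_refresh(fps: float, min_hz: float = 30.0,
                       max_hz: float = 100.0) -> tuple:
    """Return (multiplier, physical refresh rate) keeping refresh in range."""
    multiplier = 1
    while fps * multiplier < min_hz:
        multiplier += 1
    refresh = fps * multiplier
    if refresh > max_hz:
        raise ValueError("frame rate cannot be mapped into the panel's range")
    return multiplier, refresh

print(multiplied_refresh(45))  # (1, 45)  within range, no multiplication
print(multiplied_refresh(24))  # (2, 48)  each frame scanned out twice
print(multiplied_refresh(12))  # (3, 36)  each frame scanned out three times
```

The net effect is that there is effectively no minimum FPS floor in VRR mode, since a slow frame is simply shown more than once at a refresh rate the panel can sustain.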
A simple yes or no would have sufficed; I didn’t ask for a lecture on the history of G-Sync. Seriously, it looks like you are trying to place an advert for Nvidia here.
The answer isn’t simple though, and I wasn’t replying to you, consider yourself enlightened.
I was happy to debunk the lies AMD and their fanboys have been regurgitating for months, that the G-Sync module was a useless premium Nvidia was charging, just because “greedy” as AMD fanboys love to say. Not useless after all, I guess!
You’re welcome.
With G-Sync, 100 Hz. Without G-Sync, 60 Hz.
Far too expensive. At this price, I’d rather have that 4k 40″ Seiki. It has a much more usable aspect ratio (while still being even wider than this), no gimmicky curving, better contrast, and a lot cheaper. VRR is nice, but not worth paying $400 more for a monitor that is worse in every other regard.
The curve isn’t a gimmick. 34″ 21:9 ultrawide monitors actually are wide enough to show distortion at the peripheral edges of your vision, so they curve to adjust for that.
On top of that, a 21:9 format makes *MUCH* more of a difference than 4K when it comes to immersive gaming and productivity. It’s equivalent to running a dual-monitor 17″ 1720×1440 (5:4 ratio), without having to deal with the clutter and bezels of two actual monitors.
The only thing that a 16:9 4K monitor brings to the table over what we have today (27″ 1440p monitors) is higher pixel density. And I’d argue that pixel density on a 27″ 1440p monitor isn’t an issue to begin with.
The curve may slightly help if you view it from the absolute center, but it destroys anything off-axis. It also takes up more space and makes the monitor much less flexible for multimonitor setups.
How can 3440×1440 be more immersive than 3840×2160? It’s smaller in either dimension. 4k gives you even more of a multimonitor experience, being the equivalent of a pair of 1920×2160 monitors or 4 1080p monitors.
That 4k 40″ Seiki has almost exactly the same pixel density as this 34″ (109 vs 110 ppi). In the end, you get less horizontal resolution and size, MUCH less vertical resolution and size, worse black levels, IPS glow, and Gsync for $400 more. Seems pretty terrible. It might be worth considering if it cost significantly less than the Seiki, but at this price, even just the poor contrast is enough to turn me off of this. I’ve been spoiled too long by plasma and CRT to be willing to go back to 1000:1 contrast, especially for such a ridiculous price.
“How can 3440×1440 be more immersive than 3840×2160? It’s smaller in either dimension. 4k gives you even more of a multimonitor experience, being the equivalent of a pair of 1920×2160 monitors or 4 1080p monitors.”
What you’re missing is the aspect ratio. 3440×1440 is about 21:9; 3840×2160 is 16:9.
21:9 is the aspect ratio of many movies. It’s more immersive because it gives you a wider field of view. 4K may have more pixels, but the difference is how they are arranged.
21:9 monitors will play most movies without any black borders.
We used to game on 4:3… 16:9 was deemed more immersive.
“21:9 gives you a wider field of view than 4k”.
No it doesn’t. These two monitors have almost the same DPI, and 34″ 21:9 is 31.36″ wide, while 40″ 4k is 34.86″ wide. The claim that 31.36″ is wider than 34.86″ is nonsense. Adding more vertical peripheral vision does not detract from horizontal peripheral vision; if it did, we would all be using monitors that were one pixel tall, since that’s “wider”. This is not a zero-sum game; human FOV is fixed. If you were arguing that 3440×1440 is more immersive than 2880×1800 (which has slightly more pixels than 3440), then you might have a legitimate argument (though I would still disagree), but you are instead arguing that the less-wide screen is more immersive because it is wider.
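For what it's worth, the width and density figures quoted here are easy to check from the diagonal and resolution alone; a quick back-of-the-envelope script (using the two monitors' published specs):

```python
# Verify the physical width and pixel-density claims from diagonal size
# (inches) and native resolution, for a flat panel.

import math

def panel_dimensions(diagonal_in, res_w, res_h):
    """Return (width_in, height_in, ppi) from diagonal and resolution."""
    diag_px = math.hypot(res_w, res_h)       # diagonal in pixels
    width = diagonal_in * res_w / diag_px
    height = diagonal_in * res_h / diag_px
    return width, height, diag_px / diagonal_in

w34, h34, ppi34 = panel_dimensions(34, 3440, 1440)   # Acer X34 (21:9)
w40, h40, ppi40 = panel_dimensions(40, 3840, 2160)   # 40" 4K (16:9)
print(f'34" 21:9: {w34:.2f}" wide, {ppi34:.0f} ppi')  # 31.36" wide, 110 ppi
print(f'40" 16:9: {w40:.2f}" wide, {ppi40:.0f} ppi')  # 34.86" wide, 110 ppi
```

This confirms the numbers in the comment: the 40" 16:9 panel is physically wider than the 34" 21:9 panel at essentially the same pixel density.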
Think of it this way: on the 4k, you can run a centered resolution of 3440×1440. This will almost perfectly simulate the 21:9 (including in terms of size). From there, is enabling the rest of pixels above the currently used pixels going to make the experience less immersive? Is seeing less more immersive? Worst case is that your field of vision does not cover those pixels, so you can’t see them, in which case enabling them does nothing. But if you can see those new pixels, then you see more; a larger proportion of your FOV is covered, and so the experience is more immersive. If you really think that less is more, then you can just leave it like that, and run 3440×1440 on a superior panel while spending $900 instead of $1300. You can’t even make any arguments about wasted physical space in this case, since curved monitors kill any opportunity for multiple monitors anyway.
And what happens when you want to do something that is inherently vertical (reading Pcper, for example)? Having the option of 2160 vertical res is a lot nicer than being stuck with only 1440.
“Think of it this way: on the 4k, you can run a centered resolution of 3440×1440. This will almost perfectly simulate the 21:9 (including in terms of size).”
No you wouldn’t, you would just see a higher density portion of the 21:9, not the full view.
You’ve got all the pixels and all the physical space, so why wouldn’t you see the “full view”? The DPI is the same between the two monitors.
You’re right, it’s all about the field of view being rendered, which is determined by your aspect ratio (as set in your video game settings). Consider a game rendering 1 pixel as 1 inch of camera view: 21:9 gives you 21 inches of horizontal view while 16:9 only gives you 16 inches.
The screen resolution doesn’t matter when it comes to field of view, just the quality of the view being rendered. The 16:9 screen could have an 8K or 100K resolution, but the display will always represent the 16 inches of the aspect view. If you want more view you have to pan horizontally left or right with your mouse; 21:9 reduces this horizontal movement.
Ultra Wide gaming is the new standard and it is AWESOME!
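Whether 21:9 actually yields a wider in-game view depends on how the game scales FOV with aspect ratio. Under the common "Hor+" convention (vertical FOV fixed, horizontal FOV derived from the aspect ratio), the math looks like this; the 60° vertical FOV is just an example value:

```python
# Hor+ FOV scaling: the vertical FOV stays fixed and the horizontal FOV
# is derived from the aspect ratio, so a wider aspect shows more scene.

import math

def horizontal_fov(vertical_fov_deg, aspect):
    """Horizontal FOV (degrees) for a given vertical FOV and aspect ratio."""
    half_v = math.radians(vertical_fov_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half_v) * aspect))

vfov = 60.0  # example vertical FOV
print(horizontal_fov(vfov, 16 / 9))       # ~91.5 degrees at 16:9
print(horizontal_fov(vfov, 3440 / 1440))  # ~108.1 degrees at 21:9
```

In a Hor+ game the 21:9 panel really does render a wider slice of the world; in a Vert- game (fixed horizontal FOV) it would instead crop the top and bottom, which is the crux of the disagreement in this thread.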
Sounds more like a poorly made game that limits your FOV options for some bullshit “competitive” reason. Instead of reducing horizontal FOV to go from 21:9 to 16:9, they could keep the horizontal FOV the same (and therefore add more vertical FOV).
You list G-Sync as if it’s a negative.
(Deleted)
That’s not what I was implying. It’s nice, just not worth $400 + a worse monitor.
Try this: Google “4K vs 21:9”.
Select a few of the articles or forum threads that return from that search, and see if you can spot a trend in opinions.
Hint: literally 100% of the results on the first page (i.e. articles and forum threads) of that search will come back stating that 3440×1440 21:9 is better for productivity, and more immersive for gaming than 4K.
Literally. 100. Percent.
Doesn’t that tell you something?
All of those are talking about small (~28″) 4k monitors where you use scaling to produce something like a 27″ 2560×1440 monitor but with more detail. Compared to that, yes, 34″ 3440×1440 may be better, but I was pointing out a 40″ 4k display that is WIDER than the 34″, and would not need scaling, so the effective resolution is wider than 3440 as well.
And I didn’t see anyone talking about productivity in those comparisons. I certainly would not want to try to read a website, read a book, or type a paper (which are all portrait orientation) on only 1440 vertical res if I could have more.
Even so, that’s an appeal to popularity rather than any real argument.
The increased immersion claims come from the wider aspect ratio. Properly coded games give you a larger field of view at 21:9 vs 16:9. A 4K monitor will just display a higher-resolution version of the same 16:9 image. I guess you could sit closer to your large 4K display and use a high FOV to get a similar effect to 21:9; I don’t think most software defaults to that, though.
Any game that artificially limits your FOV in any way can hardly be called “properly coded”. Players should always have the full freedom to set the FOV to whatever is most comfortable for them and their particular setup.
Most software (at least on Windows) defaults to no scaling, and will run 4k just like it is. With a 40″ 4k, you won’t need scaling, as the DPI is the same as this Acer monitor.
Off-centre matters for curved displays when you are watching a TV from the couch, but how exactly does it matter when you are sitting at a computer? Plus, the whole point of 21:9 is to not have multi-monitor set-ups.
I have one on order; it’s going to be awesome for racing games, and for more peripheral vision in other games.
If you’re not going to have multiple monitors, then 21:9 is not nearly wide enough. I have three monitors side by side which is more or less 5:1, and I wouldn’t want to downsize at all: for multitasking, you still need all the space you can get, even if it’s too large to look at all of it simultaneously.
Also, what about a portrait monitor for reading books or webpages? Most of the internet is designed vertically, and having a single portrait monitor makes reading MUCH nicer. 21:9 would be a big downgrade in that regard.
I agree that off-center is not as important on a monitor as a TV, but it still matters. People still move their head and chairs around a bit, and occasionally you will have a second person watching.
There is no way my wife will allow triple monitors. There is a big difference between 16:9 and 21:9; it’s great for photo and video editing. As for reading websites on a 21:9 monitor, I can have two windows open and glance at both.
You can have as many windows open as you want, but you will be constantly scrolling all of them, since none of them will be able to fit much at any time if you only have 1440 pixels of vertical resolution.
Yeah, I agree, I always sit off centre of my monitor…
Sitting in the middle is way overrated.
Oh shit, no this is the one! Must have!
hopefully the “Predator” badge is just a sticker
The monitor looks great, the stand looks like something upon which I’d like to hang my laundry.
The stand looks rubbish; mine is going straight onto an ERGOTRON MX Desk Mount. It’s only listed for up to 30″, but this monitor is 3 kg under its weight limit.
Just saw those light bleed photos over at Amazon! For that much money this thing should be perfect.
That’s IPS glow, not light bleed.
I bought the 27″ G-Sync 1440p which claimed all kinds of color, but Windows and my Spyder tell a different story. The color banding gives it away. I slacked too long, otherwise I would have returned it. 🙁
I pick mine up tomorrow, can’t wait