Podcast #328 - G-Sync Flickering, In Win D-Frame Mini, Fractal R5 Silent and more!

Subject: General Tech | December 4, 2014 - 03:34 PM |
Tagged: podcast, video, g-sync, flickering, ROG Swift, pg278q, in win, d-frame mini, fractal, define r5 silent, nvidia, amd, Intel, asus, gtx 970 DirectCU Mini, msi, 970 Gaming

PC Perspective Podcast #328 - 12/04/2014

Join us this week as we discuss G-Sync Flickering, In Win D-Frame Mini, Fractal R5 Silent and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Allyn Malventano, Jeremy Hellstrom, Josh Walrath, and Sebastian Peak

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

 

Manufacturer: PC Perspective

Overview

We’ve been tracking NVIDIA’s G-Sync for quite a while now. The comments section on Ryan’s initial article erupted with questions, and many of those were answered in a follow-on interview with NVIDIA’s Tom Petersen. The idea was radical – do away with the traditional fixed refresh rate and only send a new frame to the display once the GPU has finished rendering it. There are many benefits here, but the short version is that you get the low-latency benefit of V-SYNC OFF gaming combined with the image quality (lack of tearing) that you would see if V-SYNC was ON. Despite the many benefits, there are some potential disadvantages that come from attempting to drive an LCD panel at varying intervals, as opposed to the fixed intervals that have been the norm for over a decade.
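
To make that tradeoff a bit more concrete, here is a minimal sketch (our own illustration, not anything from NVIDIA) of how much longer a finished frame has to wait before it appears on screen under each policy. The 60 Hz refresh rate and the frame-completion times are made-up numbers.

```python
# Toy model: display wait added to a finished frame under three sync policies.
# The 60 Hz refresh rate and frame-completion times are illustrative only.
import math

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ

def vsync_on_wait(frame_done_ms):
    """V-SYNC ON: the frame waits for the next fixed refresh tick (no tearing)."""
    next_tick = math.ceil(frame_done_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
    return next_tick - frame_done_ms

def vsync_off_wait(frame_done_ms):
    """V-SYNC OFF: scanout switches to the new frame immediately (tearing possible)."""
    return 0.0

def gsync_wait(frame_done_ms):
    """G-Sync: the panel refresh is triggered by frame completion (no tearing, no wait)."""
    return 0.0

for t in (3.0, 9.5, 14.0):  # hypothetical frame-completion times within one refresh cycle
    print(f"frame done at {t:4.1f} ms -> V-SYNC ON waits {vsync_on_wait(t):4.1f} ms, "
          f"V-SYNC OFF waits {vsync_off_wait(t):3.1f} ms (may tear), "
          f"G-Sync waits {gsync_wait(t):3.1f} ms")
```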

IMG_9328.JPG

As the first round of samples came to us for review, the current leader appeared to be the ASUS ROG Swift. A G-Sync 144 Hz display at 1440p was sure to appeal to gamers who wanted faster response than the 4K 60 Hz G-Sync alternative was capable of. Due to what seemed to be large consumer demand, it has taken some time to get these panels into the hands of consumers. As our Storage Editor, I decided it was time to upgrade my home system, placed a pre-order, and waited in anticipation of finally being able to shift from my trusty Dell 3007WFP-HC to a large panel that can handle >2x the FPS.

Fast forward to last week. My pair of ROG Swifts arrived, and some other folks I knew had also received theirs. Before I could set mine up and get some quality gaming time in, my bro FifthDread and his wife both noted a very obvious flicker on their Swifts within the first few minutes of hooking them up. They reported the flicker during game loading screens and mid-game while background content loaded in some RTS titles. Prior to hearing from them, the most I had seen were some conflicting and contradictory reports on various forums (not limited to the Swift, though that is the earliest panel and would therefore see the majority of early reports), but now we had something more solid to go on. That night I fired up my own Swift and immediately got to doing what I do best – trying to break things. We have reproduced the issue and intend to demonstrate it in a measurable way, mostly to put some actual data out there to go along with the accounts of those trying to describe something that is borderline perceptible for mere fractions of a second.

screen refresh rate-.png

First a bit of misnomer correction / foundation laying:

  • The ‘Screen refresh rate’ option you see in Windows Display Properties is actually a carryover from the CRT days. In terms of an LCD, it is the maximum rate at which a frame is output to the display. It is not representative of the frequency at which the LCD panel itself is refreshed by the display logic.
  • LCD panel pixels are periodically updated by a scan, typically from top to bottom. Newer / higher quality panels repeat this process at a rate higher than 60 Hz in order to reduce the ‘rolling shutter’ effect seen when panning scenes or windows across the screen.
  • In order to engineer faster responding pixels, manufacturers must deal with the side effect of faster pixel decay between refreshes. This is balanced by increasing the frequency of scanning out to the panel (a toy model of this tradeoff follows this list).
  • The effect we are going to cover here has nothing to do with motion blur, LightBoost, backlight PWM, or LightBoost combined with G-Sync (not currently a thing; even though Blur Busters has theorized on how it could work, their method would not work with how G-Sync is actually implemented today).
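
As promised above, here is a toy model of that decay-versus-scan-rate tradeoff. The drift rate and target brightness are invented constants, not measured characteristics of any real panel; the only point is that the same per-millisecond drift produces a smaller brightness ripple when the panel is scanned more often.

```python
# Toy model of LCD pixel brightness drifting between panel scans.
# Each scan resets the pixel to its target level; between scans it drifts toward
# white. Constants are illustrative, not measured values from any real panel.
import numpy as np

TARGET = 0.50          # normalized brightness set by each scan
DRIFT_PER_MS = 0.004   # hypothetical drift toward white per millisecond

def brightness_trace(scan_hz, duration_ms=100.0, step_ms=0.1):
    """Return brightness samples for a pixel rescanned at scan_hz."""
    scan_interval = 1000.0 / scan_hz
    times = np.arange(0.0, duration_ms, step_ms)
    since_last_scan = times % scan_interval
    return TARGET + DRIFT_PER_MS * since_last_scan

for hz in (144, 60, 30):
    trace = brightness_trace(hz)
    ripple = trace.max() - trace.min()
    print(f"{hz:3d} Hz scan rate -> peak-to-peak brightness ripple ~ {ripple:.3f}")
```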

With all of that out of the way, let’s tackle what folks out there may be seeing on their own variable refresh rate displays. Based on our testing so far, the flicker only presents itself when a game enters a 'stalled' state. These are periods where you would see a split-second freeze in the action, like during a background level load during game play in some titles. It also appears during some game level load screens, but as those are normally static scenes, the flicker would have gone unnoticed on fixed refresh rate panels. Since we could absolutely see that something was happening, we wanted to catch it in the act and measure it, so we rooted around the lab and put together some gear to do so. It’s not a perfect solution by any means, but we only needed to observe differences between smooth gaming and the ‘stalled’ state where the flicker was readily observable. Once the solder dust settled, we fired up a game that we knew could instantaneously swing from a high FPS (144) to a stalled state (0 FPS) and back again. As it turns out, EVE Online does this exact thing while taking an in-game screen shot, so we used that for our initial testing. Here’s what the brightness of a small segment of the ROG Swift does during this very event:

eve ss-2-.png

Measured panel section brightness over time during a 'stall' event. Click to enlarge.

The relatively small ripple to the left and right of center demonstrates the panel output at just under 144 FPS. Panel redraw is in sync with the frames coming from the GPU at this rate. The center section, however, represents what takes place when the input from the GPU suddenly drops to zero. In the above case, the game briefly stalled, then resumed a few frames at 144, then stalled again for a much longer period of time. Completely stopping the panel refresh would result in all TN pixels bleeding towards white, so G-Sync has a built-in failsafe to prevent this by forcing a redraw every ~33 msec. What you are seeing are the pixels intermittently bleeding towards white and periodically being pulled back down to the appropriate brightness by a scan. The low latency panel used in the ROG Swift does this all of the time, but it is less noticeable at 144 Hz, as you can see on the left and right edges of the graph. An additional thing happening here is an apparent rise in average brightness during the event. We are still researching the cause of this on our end, but this brightness increase certainly helps draw attention to the flicker event, making it even more perceptible to those who might not otherwise have noticed it.

Some of you might be wondering why this same effect is not seen when a game drops to 30 FPS (or even lower) during the course of normal game play. The original G-Sync upgrade kit implementation simply waited until 33 msec had passed before forcing an additional redraw, which introduced judder at 25-30 FPS. Based on our observations and testing, it appears that NVIDIA has corrected this in the retail G-Sync panels with an algorithm that intelligently re-scans at even multiples of the input frame rate in order to keep the redraw rate relatively high, thereby keeping flicker imperceptible – even at very low continuous frame rates.
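
Below is a short sketch of how we interpret that behavior; it is our reading of the observed data, not NVIDIA's published algorithm, and apart from the ~33 ms failsafe interval mentioned above everything in it is an assumption. The key idea is that a low but continuous frame rate gives the module a known frame interval, so it can schedule extra rescans of the same frame and keep the effective redraw rate high; a sudden stall to 0 FPS gives it nothing to predict from, leaving only the fixed ~33 ms failsafe.

```python
# Sketch of a "rescan at a multiple of the input frame rate" policy, as we interpret
# the retail G-Sync behavior. NOT a published NVIDIA algorithm; the ~33 ms hold limit
# is the failsafe described above, and the rest is assumed for illustration.

MAX_HOLD_MS = 33.0  # longest the panel is allowed to hold a frame before a forced redraw

def redraws_per_frame(frame_interval_ms):
    """Smallest integer N such that the same frame, rescanned N times at even spacing,
    never goes more than MAX_HOLD_MS between redraws."""
    n = 1
    while frame_interval_ms / n > MAX_HOLD_MS:
        n += 1
    return n

for fps in (144, 60, 45, 30, 24, 15):
    interval_ms = 1000.0 / fps
    n = redraws_per_frame(interval_ms)
    print(f"{fps:3d} FPS ({interval_ms:5.1f} ms/frame) -> {n} redraw(s) per frame, "
          f"effective redraw rate ~ {fps * n:5.1f} Hz")
```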

A few final points before we go:

  • This is not limited to the ROG Swift. All variable refresh panels we have tested (including 4K) see this effect to a greater or lesser degree than reported here. Again, this only occurs when games instantaneously drop to 0 FPS, and not when those games dip into low frame rates in a continuous fashion.
  • The effect is less perceptible (both visually and with recorded data) at lower maximum refresh rate settings.
  • The effect is not present at fixed refresh rates (G-Sync disabled or with non G-Sync panels).

This post was primarily meant as a status update and to serve as something for G-Sync users to point to when attempting to explain the flicker they are perceiving. We will continue researching, collecting data, and coordinating with NVIDIA on this issue, and will report back once we have more to discuss.

During the research and drafting of this piece, we reached out to and worked with NVIDIA to discuss this issue. Here is their statement:

"All LCD pixel values relax after refreshing. As a result, the brightness value that is set during the LCD’s scanline update slowly relaxes until the next refresh.

This means all LCDs have some slight variation in brightness. In this case, lower frequency refreshes will appear slightly brighter than high frequency refreshes by 1 – 2%.

When games are running normally (i.e., not waiting at a load screen, nor a screen capture) - users will never see this slight variation in brightness value. In the rare cases where frame rates can plummet to very low levels, there is a very slight brightness variation (barely perceptible to the human eye), which disappears when normal operation resumes."

So there you have it. It's basically down to the physics of how an LCD panel works at varying refresh rates. While I agree that it is a rare occurrence, there are some games that present this scenario more frequently (and noticeably) than others. If you've noticed this effect in some games more than others, let us know in the comments section below. 

(Editor's Note: We are continuing to work with NVIDIA on this issue and hope to find a way to alleviate the flickering with either a hardware or software change in the future.)

Samsung Announces First FreeSync UHD Monitors

Subject: Displays | November 20, 2014 - 10:50 AM |
Tagged: TN, Samsung, nvidia, monitor, ips, g-sync, freesync, amd

We have been teased for the past few months about when we would see the first implementations of AMD’s FreeSync technology, but now we finally have some concrete news about who will actually be producing these products.

Samsung has announced that they will be introducing the world’s first FreeSync enabled Ultra HD monitors.  The first models to include this feature will be the updated UD590 and the new UE850.  These will be introduced to the market in March of 2015.  The current UD590 monitor is a 28” unit with 3840x2160 resolution and up to 1 billion colors.  This looks to be one of those advanced TN panels that are selling from $500 to $900, depending on the model.

Samsung-UD590.jpg

AMD had promised some hands-on time for journalists by the end of this year, and shipping products in the first half of next year.  It seems that Samsung is the first to jump on the bandwagon.  We would imagine that others will be offering the technology as well.  In theory this technology offers many of the same benefits as NVIDIA’s G-SYNC, but it does not require the same level of hardware.  I can imagine that we will be seeing some interesting comparisons next year with shipping hardware and how FreeSync stacks up to G-SYNC.

Joe Chan, Vice President of Samsung Electronics Southeast Asia Headquarters commented, “We are very pleased to adopt AMD FreeSync technology to our 2015 Samsung Electronics Visual Display division’s UHD monitor roadmap, which fully supports open standards.  With this technology, we believe users including gamers will be able to enjoy their videos and games to be played with smoother frame display without stuttering or tearing on their monitors.”

Source: Samsung

Podcast #320 - Micron M600 SSD, NVIDIA and Adaptive Sync, Windows 10 and more!

Subject: General Tech | October 2, 2014 - 02:05 PM |
Tagged: X99 Classified, X99, video, tlc, tegra k1, ssd, Samsung, podcast, nvidia, micron, M600, iphone 6, g-sync, freesync, evga, broadwell-u, Broadwell, arm, apple, amd, adaptive sync, a8, 840 evo, 840

PC Perspective Podcast #320 - 10/02/2014

Join us this week as we discuss the Micron M600 SSD, NVIDIA and Adaptive Sync, Windows 10 and more!

You can subscribe to us through iTunes and you can still access it directly through the RSS page HERE.

The URL for the podcast is: http://pcper.com/podcast - Share with your friends!

  • iTunes - Subscribe to the podcast directly through the iTunes Store
  • RSS - Subscribe through your regular RSS reader
  • MP3 - Direct download link to the MP3 file

Hosts: Ryan Shrout, Jeremy Hellstrom, Josh Walrath, and Allyn Malventano

Subscribe to the PC Perspective YouTube Channel for more videos, reviews and podcasts!!

NVIDIA Confirms It Has No Plans to Support Adaptive Sync

Subject: Graphics Cards | September 27, 2014 - 07:24 PM |
Tagged: nvidia, maxwell, gsync, g-sync, freesync, adaptive sync

During an interview that we streamed live with NVIDIA's Tom Petersen this past Thursday, it was confirmed that NVIDIA is not currently working on, and has no current plans to add, support for the VESA-based and AMD-pushed Adaptive Sync portion of the DisplayPort 1.2a specification. To quote directly:

"There is no truth [to that rumor of NVIDIA Adaptive Sync support] and we have made no official comments about Adaptive Sync. One thing I can say is that NVIDIA as a company is 100% dedicated to G-Sync. We are going to continue to invest in G-Sync and it is a way we can make the gaming experience better. We have no need for Adaptive Sync. We have no intention of [implementing it]."

Discussion of G-Sync begins at 1:27:14 in our interview.

To be clear, the Adaptive Sync portion of DP 1.2a and 1.3+ is an optional part of the VESA spec that is not required for future graphics processors or even future display scaler chips. That means that upcoming graphics cards from NVIDIA could still be DisplayPort 1.3 compliant without implementing support for the Adaptive Sync feature. Based on the comments above, I fully expect that to be the case.

IMG_9328.JPG

The ASUS ROG Swift PG278Q G-Sync monitor

With that new information, you can basically assume that the future of variable refresh monitors is going to be divided: one set for users of GeForce cards and one set for users with Radeon cards. (Where Intel falls into this is up in the air.) Clearly that isn't ideal for a completely open ecosystem, but NVIDIA has made the point, over and over, that what they have developed with G-Sync is difficult and not at all something that could be solved with the blunt instrument that Adaptive Sync is. NVIDIA has a history of producing technologies and then keeping them in-house, focusing on development specifically for GeForce owners and fans. The dream of having a VRR monitor that will run on both vendors' GPUs appears to be dead.

When asked about the possibility of seeing future monitors that can support both NVIDIA G-Sync technology as well as Adaptive Sync technology, Petersen stated that while not impossible, he "would not expect to see such a device."

The future of G-Sync is still in development. Petersen stated:

"Don't think that were done. G-Sync is not done. Think of G-Sync as the start of NVIDIA solving the problems for gamers that are related to displays...G-Sync is our first technology that makes games look better on displays. But you can start looking at displays and make a lot of things better."

tearing5.jpg

Diagram showing how G-Sync affects monitor timings

So now we await the first round of prototype FreeSync / Adaptive Sync monitors to hit our labs. AMD has put a lot of pressure on itself for this release by claiming, numerous times, that FreeSync will be just as good an experience as G-Sync, and I am eager to see if they can meet that goal. Despite any ill feelings that some users might have about NVIDIA and some of its policies, it typically does a good job of maintaining a high quality user experience with these custom technologies. AMD will have to prove that what it has developed is on the same level. We should know more about that before we get too much further into fall.


Subject: Displays
Manufacturer: Acer

Technical Specifications

Here they come - the G-Sync monitors are finally arriving at our doors! A little over a month ago we got to review the ASUS ROG Swift PG278Q, a 2560x1440 144 Hz monitor that was the first retail-ready display to bring NVIDIA's variable refresh technology to consumers. It was a great first option, pairing a high refresh rate with support for ULMB (Ultra Low Motion Blur) technology and giving users a choice between the two modes.

Today we are taking a look at our second G-Sync monitor, which will hit the streets sometime in mid-October at an identical $799 price point. The Acer XB280HK is a 28-in 4K monitor with a maximum refresh rate of 60 Hz and, of course, support for NVIDIA G-Sync.

The Acer XB280HK, first announced at Computex in June, is the first 4K monitor on the market to be announced with support for variable refresh. It isn't that far behind the first low-cost 4K monitors to hit the market, period: the ASUS PB287Q and the Samsung U28D590D both shipped in May of 2014 with very similar feature sets, minus G-Sync. I discussed much of the general usability benefits (and issues) that arose when using a consumer 4K panel with Windows 8.1 in those reviews, so you'll want to be sure you read up on that in addition to the discussion of 4K + G-Sync we'll have today.

02.jpg

While we dive into the specifics on the Acer XB280HK monitor today, I will skip over most of the discussion about G-Sync, how it works and why we want it. In our ASUS PG278Q review I had a good, concise discussion on the technical background of NVIDIA G-Sync technology and how it improves gaming.

The idea of G-Sync is pretty easy to understand, though the implementation method can get a bit more hairy. G-Sync introduces a variable refresh rate to a monitor, allowing the display to refresh at a wide range of rates rather than at fixed intervals. More importantly, rather than the monitor dictating to the PC the rate at which refreshes occur, the graphics card now tells the monitor when to refresh in a properly configured G-Sync setup. This allows a monitor to match the refresh rate of the screen to the draw rate of the game being played (frames per second), and that simple change drastically improves the gaming experience for several reasons.
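
As a quick illustration of that matching (a toy model of our own, with invented frame times), the sketch below shows when each frame becomes visible on a fixed 60 Hz V-SYNC display versus a G-Sync display that refreshes the moment the frame is done. Real drivers buffer and pace frames in ways this ignores.

```python
# Toy timeline: fixed 60 Hz refresh with V-SYNC vs. a G-Sync-style refresh driven by
# frame completion. Frame render times are invented examples, not measurements.

REFRESH_MS = 1000.0 / 60
frame_times_ms = [14, 18, 25, 16, 40, 15, 17]  # hypothetical per-frame render times

done_at = []  # cumulative completion time of each frame
t = 0.0
for ft in frame_times_ms:
    t += ft
    done_at.append(t)

for i, done in enumerate(done_at):
    # Fixed refresh: the frame is shown at the first refresh tick after it completes.
    fixed_shown = (int(done // REFRESH_MS) + 1) * REFRESH_MS
    # G-Sync: the panel refreshes as soon as the frame completes.
    gsync_shown = done
    print(f"frame {i}: done {done:6.1f} ms -> fixed refresh shows it at {fixed_shown:6.1f} ms "
          f"(+{fixed_shown - done:4.1f} ms), G-Sync at {gsync_shown:6.1f} ms")
```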

tearing5.jpg

Continue reading our review of the Acer XB280HK 4K G-Sync Monitor!!

Acer's two new XBO series gaming displays; G-SYNC on both, 4k on one

Subject: Displays | September 18, 2014 - 05:59 PM |
Tagged: Acer XB270H, XB280HK, 4k, g-sync

The Acer XB280HK is a 28" 4K G-SYNC display which will launch next month at an expected price of US$799 or $849.99 CDN.  The XB270H is a 27" 1080p display, also with G-SYNC support, and is currently available at US$599 or $649 CDN.  As both are rated at a 1 ms response time it is likely these are backlit TN panels, but with the recent advances in TN panels the viewing angles should be much better than the original generation.

XB_sku_main.png

SAN JOSE, Calif., Sept. 18, 2014 – Acer America is bringing its new XBO series gaming displays featuring NVIDIA G-SYNC technology to gaming enthusiasts in North America. This cutting-edge line delivers significant performance advantages that infuse gaming with incredibly smooth, realistic and responsive visuals, elevating game play to a new level of stunning realism.

The two XBO series display models for North America include the Acer XB280HK, boasting a 28-inch 4K2K Ultra HD (3840 x 2160) display with a 60Hz refresh rate, and the Acer XB270H, with a 27-inch Full HD 1080p screen and a maximum 144Hz refresh rate. Both models provide a quick 1ms response time, further enhancing in-game performance. They also feature revolutionary NVIDIA G-SYNC technology, comfortable ergonomics and excellent connectivity.

“We’re excited to bring these first-rate gaming displays to gamers in the United States and Canada,” said Ronald Lau, Acer America business manager. “The incredibly sharp and smooth images provided by NVIDIA G-SYNC technology are sure to thrill the most avid gamers. Combined with Acer’s highly flexible ergonomic stand, non-glare ComfyView panel and low dimming technology, users are assured long hours of both comfortable and visually stunning game play.”

NVIDIA G-SYNC: Picture-Perfect Visuals
NVIDIA G-SYNC technology ensures that every frame rendered by the GPU is perfectly portrayed by synchronizing the monitor’s refresh rates to the GPU in a GeForce GTX-powered PC. This breakthrough technology eliminates screen tearing and minimizes display stutter and input lag to deliver a smooth, fast and breathtaking gaming experience on the hottest PC gaming titles. Scenes appear instantly, objects look visually sharp, and gameplay is more responsive to provide faster reaction times, giving gamers a competitive edge.

“NVIDIA G-SYNC technology dramatically improves the way gamers see their games, by delivering images that are fast, sharp and stutter-free,” said Tom Petersen, distinguished engineer at NVIDIA. “This is the way games were meant to be played, and gamers will absolutely love these new Acer XBO monitors.”

Comfortable Ergonomics
By making gaming as comfortable as possible, the XBO series monitors help extend game time with three Acer innovations. Acer flicker-less technology reduces eye strain via a stable power supply that eliminates screen flicker. Its low dimming technology provides users the ability to adjust brightness down to 15 percent in low-light environments and Acer ComfyView non-glare screen reduces reflection for clearer viewing, a significant benefit for gamers.

A flexible, multi-function ErgoStand extends a wide range of options for maximum comfort and viewing perspectives. For finding the best angle, the screen tilts from -5 to -35 degrees and the height can be raised by up to 5.9 inches. In addition, the base rotates 120 degrees from left to right for easy screen sharing during game play and collaboration with others. Plus, the screen pivots from horizontal to vertical to accommodate two entirely different gaming scenarios.

Both new XBO series monitors deliver wide viewing angles up to 170 degrees horizontal and up to 160 degrees vertical. The Acer XB280HK delivers 1.07 billion colors and the Acer XB270HL provides 16.7 million colors, while both offer a native contrast ratio of 1000:1, a 300 nits brightness and a 72 percent NTSC color saturation, a combination that delivers exceptionally vibrant, detailed and high-quality imagery.

Superb Connectivity
The displays come with DisplayPort as well as high-speed USB 3.0 ports (1 up, 4 down) located on the side and bottom of the screen for easily connecting a mouse, keyboard, gaming headset, joystick and other peripherals. One of the USB ports is equipped for battery charging.

Eco-Friendly Design
EPEAT Gold registered, the highest level of EPEAT registration available, the displays meet all of EPEAT’s required criteria and at least 75 percent of EPEAT’s optional criteria. They’re also mercury-free and LED-backlit, which reduces energy costs by consuming less power than standard CCFL-backlit displays. ENERGY STAR 6.0 and TCO 6.0 qualified, they adhere to strict environmental, performance and ergonomic design standards.

Pricing and Availability
The Acer XB270H is available now at leading online retailers in the United States and Canada with a MSRP of US$599 and $649.99 CAD. The Acer XB280HK will be available next month at leading online retailers in the United States and Canada with a manufacturer’s suggested retail price (MSRP) of US$799 and $849.99 CAD.

Acer displays are backed by professional, high-quality technical support and a three-year warranty. Acer’s online community at community.acer.com provides customers discussion forums, answers to frequently asked questions and the opportunity to share ideas for new and enhanced services and products.

Source: Acer
Manufacturer: NVIDIA

A few days with some magic monitors

Last month friend of the site and technology enthusiast Tom Petersen, who apparently does SOMETHING at NVIDIA, stopped by our offices to talk about G-Sync technology. A variable refresh rate feature added to new monitors with custom NVIDIA hardware, G-Sync is a technology that has been frequently discussed on PC Perspective.

The first monitor to ship with G-Sync is the ASUS ROG Swift PG278Q - a fantastic 2560x1440 27-in monitor with a 144 Hz maximum refresh rate. I wrote a glowing review of the display here recently with the only real negative to it being a high price tag: $799. But when Tom stopped out to talk about the G-Sync retail release, he happened to leave a set of three of these new displays for us to mess with in a G-Sync Surround configuration. Yummy.

So what exactly is the current experience of using a triple G-Sync monitor setup if you were lucky enough to pick up a set? The truth is that the G-Sync portion of the equation works great but that game support for Surround (or Eyefinity for that matter) is still somewhat cumbersome. 

IMG_9606.JPG

In this quick impressions article I'll walk through the setup and configuration of the system and tell you about my time playing seven different PC titles in G-Sync Surround.

Continue reading our editorial on using triple ASUS ROG Swift monitors in G-Sync Surround!!

PCPer Live! Recap - NVIDIA G-Sync Surround Demo and Q&A

Subject: Graphics Cards, Displays | August 22, 2014 - 08:05 PM |
Tagged: video, gsync, g-sync, tom petersen, nvidia, geforce

Earlier today we had NVIDIA's Tom Petersen in studio to discuss the retail availability of G-Sync monitors as well as to get hands on with a set of three ASUS ROG Swift PG278Q monitors running in G-Sync Surround! It was truly an impressive sight and if you missed any of it, you can catch the entire replay right here.

Even if seeing the ASUS PG278Q monitor again doesn't interest you (we have our full review of the monitor right here), you won't want to miss the very detailed Q&A that occurs, answering quite a few reader questions about the technology. Covered items include:

  • Potential added latency of G-Sync
  • Future needs for multiple DP connections on GeForce GPUs
  • Upcoming 4K and 1080p G-Sync panels
  • Can G-Sync Surround work through an MST Hub?
  • What happens to G-Sync when the frame rate exceeds the panel refresh rate? Or drops below minimum refresh rate?
  • What does that memory on the G-Sync module actually do??
  • A demo of the new NVIDIA SHIELD Tablet capabilities
  • A whole lot more!

Another big thank you to NVIDIA and Tom Petersen for stopping out our way and for spending the time to discuss these topics with our readers. Stay tuned here at PC Perspective as we will have more thoughts and reactions to G-Sync Surround very soon!!

NVIDIA Live Stream: We Want Your Questions!

Subject: Graphics Cards, Displays, Mobile | August 21, 2014 - 05:23 PM |
Tagged: nvidia, video, live, shield, shield tablet, g-sync, gsync, tom petersen

Tomorrow at 12pm EDT / 9am PDT, NVIDIA's Tom Petersen will be stopping by the PC Perspective office to discuss some topics of interest. There has been no lack of topics floating around the world of graphics cards, displays, refresh rates and tablets recently, and I expect the show tomorrow to be incredibly interesting and educational.

On hand we'll be doing demonstrations of G-Sync Surround (3 panels!) with the ASUS ROG Swift PG278Q display (our review here) and also show off the SHIELD Tablet (we have a review of that too) with some multiplayer action. If you thought the experience with a single G-Sync monitor was impressive, you will want to hear what a set of three of them can be like.

pcperlive.png

NVIDIA Live Stream with Tom Petersen

9am PT / 12pm ET - August 22nd

PC Perspective Live! Page

The topic list is going to include (but is not limited to):

  • ASUS PG278Q G-Sync monitor
  • G-Sync availability and pricing
  • G-Sync Surround setup, use and requirements
  • Technical issues surrounding G-Sync: latency, buffers, etc.
  • Comparisons of G-Sync to Adaptive Sync
  • SHIELD Tablet game play
  • Altoids?

gsyncsurround.jpg

But we want your questions! Do you have burning issues that you think need to be addressed by Tom and the NVIDIA team about G-Sync, FreeSync, GameWorks, Tegra, tablets, GPUs and more? Nothing is off limits here, though obviously Tom may be cagey on future announcements. Please use the comments section on this news post below (registration not required) to ask your questions and we can organize them before the event tomorrow. We MIGHT even be able to come up with a couple of prizes to giveaway for live viewers as well...

See you tomorrow!!