Quality time with G-Sync
Readers of PC Perspective will already know quite a lot about NVIDIA's G-Sync technology. When it was first unveiled in October we were at the event and were able to listen to NVIDIA executives, product designers and engineers discuss and elaborate on what it is, how it works and why it benefits gamers. This revolutionary new take on how displays and graphics cards talk to each other enables a new class of variable refresh rate monitors that offer the smoothness of running with V-Sync off alongside the tear-free images normally reserved for gamers who enable V-Sync.
NVIDIA's Prototype G-Sync Monitor
We were lucky enough to be at NVIDIA's Montreal tech day while John Carmack, Tim Sweeney and Johan Andersson were on stage discussing NVIDIA G-Sync among other topics. All three developers were incredibly excited about G-Sync and what it meant for gaming going forward.
Also on that day, I published a somewhat detailed editorial that dug into the background of V-Sync technology, why the 60 Hz refresh rate exists and why the system in place today is flawed. That piece led up to an explanation of how G-Sync works, including how it integrates with displays by extending the VBLANK interval, and detailed how NVIDIA is enabling the graphics card to retake control over the entire display pipeline.
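To make the VBLANK point concrete, here is a toy Python sketch (my own simplification for illustration, not NVIDIA's implementation) that compares when frames actually reach the screen under fixed 60 Hz V-Sync versus a variable refresh panel. The render times are invented; what matters is the frame-to-frame display intervals each scheme produces.

```python
import math

# Toy model of frame delivery (not NVIDIA's implementation): hypothetical
# render times in milliseconds for a GPU that can't quite hold 60 FPS.
render_times_ms = [14.0, 19.0, 16.0, 22.0, 15.0, 18.0]

def vsync_display_times(render_times, refresh_ms=1000 / 60):
    """With V-Sync on, a finished frame waits for the next fixed refresh tick."""
    t, shown = 0.0, []
    for r in render_times:
        t += r                                    # frame finishes rendering...
        tick = math.ceil(t / refresh_ms) * refresh_ms
        shown.append(tick)                        # ...but displays at the next tick
    return shown

def gsync_display_times(render_times, min_interval_ms=1000 / 144):
    """With variable refresh, the monitor holds VBLANK until the frame is ready,
    so a frame is scanned out as soon as it is done (capped here by the panel's
    144 Hz maximum refresh rate)."""
    t, last, shown = 0.0, 0.0, []
    for r in render_times:
        t += r
        last = max(t, last + min_interval_ms)     # respect the max refresh rate
        shown.append(last)
    return shown

def intervals(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print("V-Sync frame-to-frame (ms):", intervals(vsync_display_times(render_times_ms)))
print("G-Sync frame-to-frame (ms):", intervals(gsync_display_times(render_times_ms)))
```

With V-Sync, a frame that misses a refresh tick waits a full 16.7 ms for the next one, producing the 33 ms hitches gamers perceive as stutter; with variable refresh, the display intervals simply track the render times.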
In reality, if you want the best explanation of G-Sync, how it works and why it is a stand-out technology for PC gaming, you should take the time to watch our interview with NVIDIA's Tom Petersen, one of the primary inventors of G-Sync. In this video we walk through quite a bit of technical explanation of how displays work today and how G-Sync changes gaming for the better. It runs over an hour, but I selfishly believe it is the most concise and well-put-together collection of information about G-Sync available to our readers.
The story today is more about extensive hands-on testing with the G-Sync prototype monitors. The displays we received this week are modified versions of the 144Hz ASUS VG248QE gaming panel, the same one that should, in theory, be upgradeable by end users sometime in the future. These monitors are 1920x1080 TN panels and, though they have incredibly high refresh rates, aren't usually regarded as the highest image quality displays on the market. The story of what you get with G-Sync, however, is really about stutter (or the lack thereof), tearing (or the lack thereof), and a better overall gaming experience for the user.
Subject: Displays | December 16, 2013 - 09:09 AM | Ryan Shrout
Tagged: vg248qe, nvidia, gsync, giveaway, g-sync, contest, asus
I know that LOTS of you have been clamoring for information on how you can get your hands on one of those DIY G-Sync upgrade kits for yourself and I have some good news. Though I can't tell you where to buy one or how much it will cost, I can offer you 1 of 5 FREE G-Sync upgrade kits through a giveaway we are hosting at PC Perspective!
Here are the rules for the sweepstakes:
- You must already own an ASUS VG248QE monitor
- We need you to supply feedback on the G-Sync experience after the upgrade
- Sorry, this is only available in the US and Canada
Now, the real question: how do you enter to win, assuming you meet the requirements above? It's pretty simple!
- Fill out the form below with name and email information
- You have to include a link to a picture of your existing VG248QE monitor. Include text on it (or on a sheet of paper in the photo) that mentions this contest! Use Imgur if you need an image host.
- Leave a comment on this post that describes WHY you want G-Sync technology
- Hey, if you subscribe to our YouTube channel that won't hurt your chances either. Leave your YouTube name in the comment as well!
Our thanks go to NVIDIA for supplying the kits and good luck to all participants! We'll pick our winners on December 23rd and have the units out by the end of the year.
The First Custom R9 290X
It has been a crazy launch for the AMD Radeon R9 series of graphics cards. When we first reviewed the R9 290X and the R9 290, we came away very impressed with the GPU and the performance it provided, and both products earned Gold awards. The 290X established a new class of single-GPU performance while the R9 290 nearly matched it at a crazy $399 price tag.
But there were issues. Big, glaring issues. Clock speeds varied hugely depending on the game: we saw a GPU rated at "up to 1000 MHz" running at 899 MHz in Skyrim and 821 MHz in Bioshock Infinite. Those are significant deltas in clock rate, and they translate almost directly into deltas in performance. These speeds also changed based on whether the graphics card was "hot" or "cold" - had it warmed up and been active for 10 minutes prior to testing? If so, performance was measurably lower than with a "cold" GPU that had just started.
That issue was not necessarily a deal killer; rather, it just made us rethink how we test GPUs. The much bigger deal was that many people were seeing lower performance on retail cards than on the reference cards sent to the press for reviews. In our November testing, the retail card we purchased, which used the exact same cooler as the reference model, ran 6.5% slower than we expected.
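For reference, here is the quick math on how far those observed clocks sit below the rated speed (and, since performance here scales nearly linearly with clock, roughly how much performance is left on the table):

```python
# Percentage drop from the rated "up to 1000 MHz" clock in our testing.
rated_mhz = 1000
for game, observed_mhz in [("Skyrim", 899), ("Bioshock Infinite", 821)]:
    drop = 100 * (rated_mhz - observed_mhz) / rated_mhz
    print(f"{game}: {observed_mhz} MHz, {drop:.1f}% below rated")
```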
The obvious hope was that retail cards with custom PCBs and coolers from AMD's partners would somehow fix this whole dilemma. Today we see if that hope was well placed.
Subject: Graphics Cards | December 12, 2013 - 05:20 PM | Ryan Shrout
Tagged: video, amd, radeon, hawaii, r9 290, R9 290X, bitcoin, litecoin, mining
If you already listened to this week's PC Perspective Podcast, then feel free to disregard this post. For the rest of you - subscribe to our damned weekly podcast already, would you?!?
In any event, I thought it might be interesting to extract this 6 minute discussion we had during last night's live streamed podcast about how the emergence of Litecoin mining operations is driving up prices of GPUs, particularly the compute-capable R9 290 and R9 290X Hawaii-based cards from AMD.
Check out these prices currently on Amazon!
- Radeon R9 290X - $725+
- Radeon R9 290 - $499+
- Radeon R9 280X - $429+
- GeForce GTX 770 - $409+
- GeForce GTX 780 - $509+
- GeForce GTX 780 Ti - $699+
The price of the GTX 770 is a bit higher than it should be, while the GTX 780 and GTX 780 Ti are priced in the same range they have been for the last month or so. The same cannot be said for the AMD cards listed here: the R9 280X is selling for at least $130 more than its expected MSRP, and you'll see quite a few going for much higher on Amazon, eBay (thanks TR) and others. The Radeon R9 290 has an MSRP of $399 from AMD, but the lowest price we found on Amazon was $499, and everything on Newegg.com shows the same price, but sold out. The R9 290X is even more obnoxiously priced when you can find one.
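To put numbers on that markup, here is a quick calculation against the lowest street prices above. The $399 R9 290 MSRP is AMD's own figure; the 290X ($549) and 280X ($299) launch prices are our recollection of the October launches, so treat those two as assumptions:

```python
# Street premium over launch MSRP: (msrp, lowest street price) per card.
cards = {
    "R9 290X": (549, 725),
    "R9 290":  (399, 499),
    "R9 280X": (299, 429),
}
for card, (msrp, street) in cards.items():
    premium = street - msrp
    print(f"{card}: ${premium} over MSRP ({100 * premium / msrp:.0f}% markup)")
```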
Do you have any thoughts on this? Do you think Litecoin mining is really causing these price inflations and what does that mean for AMD, NVIDIA and the gamer?
Subject: Displays | December 16, 2013 - 09:11 AM | Ryan Shrout
Tagged: video, vg248qe, nvidia, gsync, g-sync, asus
It looks like some G-Sync ready monitors are going to be on sale starting today, though perhaps not from the outlets you would have expected. NVIDIA let me know last night that they are working with partners, including ASUS obviously, to make a small number of pre-modified ASUS VG248QE G-Sync monitors available for purchase. These are the same monitors we used in our recent G-Sync preview story, so check that article out if you want our opinions on the display and the technology.
Those selling the displays? Digital Storm, Falcon Northwest, Maingear, and Overlord Computer. This creates some unfortunate requirements for potential buyers. For example, Falcon Northwest is only selling the panels to users who are either buying a new Falcon PC or already own a Falcon custom system. Digital Storm, on the other hand, WILL sell the monitor on its own, or will let you send in your VG248QE monitor to have the upgrade service done for you. The monitor alone will sell for $499 while the upgrade (with module included) is $299.
This distribution model for G-Sync likely isn't what users wanted or expected. After all, we were promised upgrade kits for owners of that specific ASUS VG248QE display, and we still have no details on how NVIDIA plans to sell or distribute them. Being able to purchase the display from the resellers above is at least SOMETHING before the holiday, but it really isn't the way we would like to see G-Sync showcased. NVIDIA needs to get these products into the hands of gamers sooner rather than later.
NVIDIA also prepared a new video to showcase G-Sync. Unlike other marketing videos, this one wasn't placed on YouTube because it must play back at a fixed 60 FPS, something YouTube can't do reliably. For the demonstration to work correctly you need to set your display to a 60 Hz refresh rate and use a video player capable of maintaining static 60 FPS decoding.
To grab a copy of this video, use the link right here to download the file directly from Mega.co.nz. It should help demonstrate the effects of using a G-Sync enabled display for users who don't have access to one in person.
Oh, and I know that LOTS of you have been clamoring for information on how you can get your hands on one of those DIY G-Sync upgrade kits for yourself and I have some good news. Though I can't tell you where to buy one or how much it will cost, I can offer you one of 5 FREE G-Sync ASUS VG248QE upgrade kits through a giveaway we are hosting at PC Perspective! Check out this page for the details!!
A not-so-simple set of instructions
On Friday evening Valve released to the world the first beta of SteamOS, a Linux-based operating system built specifically for PC gaming. We have spent quite a lot of time discussing and debating the merits of SteamOS, but this weekend we wanted to install the new OS on a system and see how it all worked.
Our full video tutorial of installing and configuring SteamOS
First up was selecting the hardware for the build. As is usually the case, we had a nearly-complete system sitting around that needed some tweaks. Here is a quick list of the hardware we used, with a discussion about WHY just below.
| Component | Part | Price |
|---|---|---|
| Processor | Intel Core i5-4670K | $222 |
| Motherboard | EVGA Z87 Stinger Mini ITX | $257 |
| Memory | Corsair Vengeance LP 8GB 1866 MHz (2 x 4GB) | $109 |
| Graphics Card | NVIDIA GeForce GTX TITAN 6GB | $999 |
| | EVGA GeForce GTX 770 2GB SuperClocked | $349 |
| Storage | Samsung 840 EVO Series 250GB SSD | $168 |
| Case | EVGA Hadron Mini ITX Case | $189 |
| Power Supply | Included with case | - |
| Optical Drive | Slot-loading DVD burner | $36 |
| Peak Compute | 4,494 GFLOPS (TITAN), 3,213 GFLOPS (GTX 770) | |
| Total Price | | $1,947 (GTX TITAN), $1,297 (GTX 770) |
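If you are curious where the Peak Compute row comes from: each CUDA core can retire one fused multiply-add (two FLOPs) per clock, so the peak is simply cores x clock x 2. A quick sketch; the clocks below are the ones that reproduce the table's figures (the TITAN near its 836 MHz base clock, the GTX 770 at the 1046 MHz reference base clock), which is an assumption on our part:

```python
# Peak single-precision compute: shader cores x clock x 2 FLOPs per clock
# (each CUDA core retires one fused multiply-add per cycle).
def peak_gflops(cores, clock_mhz):
    return cores * clock_mhz * 2 / 1000

print(f"GTX TITAN: {peak_gflops(2688, 836):,.0f} GFLOPS")   # ~4,494
print(f"GTX 770:   {peak_gflops(1536, 1046):,.0f} GFLOPS")  # ~3,213
```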
We definitely weren't targeting a low cost build with this system, but I think we did create a very powerful system to test SteamOS on. First up was the case, the new EVGA Hadron Mini ITX chassis. It's small, which is great for integration into your living room, yet can still hold a full power, full-size graphics card.
The motherboard we used was the EVGA Z87 Stinger Mini ITX, an offering that Morry recently reviewed and recommended. Supporting the latest Intel Haswell processors, the Stinger includes strong overclocking options and a feature set that won't leave enthusiasts longing for a larger motherboard.
Subject: General Tech, Processors, Mobile | December 14, 2013 - 04:07 AM | Scott Michaud
Tagged: Intel, Broadwell
This leak is from China DIY and, thus, machine-translated into English from Chinese. They claim that Broadwell is coming in the second half of 2014 and will be introduced in four series:
- H will be the high performance offerings
- U and Y have very low power consumption
- M will fit mainstream performance
The high performance offerings will have up to four CPU cores, 6MB of L3 cache, support for up to 32GB of memory, and a thermal rating of 47W. The leak claims that some will be configurable down to 37W, which is pretty clearly an "SDP" rating. The problem, of course, is whether 47W is an actual TDP or, rather, another SDP rating. Who knows.
The H series is said to be available as either one or two chips: both a separate PCH-plus-CPU version and a single-chip solution that integrates the PCH on-die will exist.
There is basically nothing said about the M series beyond acknowledging its existence.
The U and Y series will be up to dual-core with 4MB of L3 cache. The U series will have a thermal rating of 15W to 28W. The Y series will be substantially lower at 4.5W, configurable down to 3.5W. No clue which of these numbers are TDPs and which are SDPs. You can compare this to earlier reports that Haswell will reach as low as a 4.5W SDP.
Hopefully we will learn more about these soon and, perhaps, get a functional timeline of Intel releases. Seriously, I think I need to sit down and draw a flowchart some day.
A slightly smaller MARS
The NVIDIA GeForce GTX 760 was released in June of 2013. Based on the same GK104 GPU as the GTX 680, GTX 670 and GTX 770, the GTX 760 disabled a couple more of the clusters of processor cores to offer up impressive performance levels for a lower cost than we had seen previously. My review of the GTX 760 was very positive as NVIDIA had priced it aggressively against the competing products from AMD.
As for ASUS, they have a storied history with the MARS brand. Typically an over-built custom PCB with two of the highest end NVIDIA GPUs stapled together, the ASUS MARS cards have been limited edition products with a lot of cachet around them. The first MARS card was a dual GTX 285 product and the first card to offer 4GB of memory (though 2GB per GPU, of course). The MARS II took a pair of GTX 580 GPUs and pasted them on a HUGE card; just 1,000 were sold worldwide. It was heavy, expensive and fast; blazing fast. But at a price of $1200+ it wasn't on the radar of most PC gamers.
Interestingly, a MARS iteration for the GTX 680 never occurred, and why that is remains a matter of debate. Some point the finger at poor sales, while others think that NVIDIA restricted ASUS' engineers from being as creative as they needed to be.
Today's release of the ASUS ROG MARS 760 is a bit different: this is still a high end graphics card, but it doesn't utilize the fastest single-GPU option on the market. Instead, ASUS has gone with a more reasonable design that combines a pair of GTX 760 GK104 GPUs on a single PCB with a PCI Express bridge chip between them. The MARS 760 is significantly smaller and less power hungry than previous MARS cards, but it still packs a punch in the performance department, as you'll soon see.
Subject: Cases and Cooling | December 12, 2013 - 04:03 AM | Tim Verry
Tagged: water cooling, nzxt, kraken, gpu cooler
NZXT has a new cooling product coming out next week that caught my attention recently. Called the KRAKEN G10, it is a bracket and fan assembly that allows users to attach All In One (AIO) water coolers – those traditionally aimed at CPUs – to graphics cards. The Kraken G10 consists of a steel bracket with a CPU waterblock mounting hole, 92mm fan, and water tube routing space.
The Kraken G10 is a stylish approach to pairing cheap sealed water coolers with graphics cards that have reference PCBs - a feat that has traditionally been limited to adventurous enthusiasts armed with zip ties and a good deal of patience. The bracket is compatible with many recent GPUs from both AMD and NVIDIA, and it allows enthusiasts to mount AIO water coolers from NZXT as well as other manufacturers. The water cooler can then efficiently cool the GPU while the pre-installed 92mm fan keeps the VRM and memory areas of the graphics card cool. The G10 bracket measures 177 x 32.5 x 110.6 mm (WxHxD) and comes in white, black, or red with white fan blades and NZXT logos on the side and fan.
Specifically, the KRAKEN G10 supports the following graphics card models (with reference PCBs only).
Further, users can use the G10 to mount the waterblocks of the following sealed loop water coolers.
| Antec | Corsair | NZXT | Thermaltake | Zalman |
|---|---|---|---|---|
| KUHLER H2O 920V4 | H110 | KRAKEN X60 | Water 3.0 Extreme | LQ-320 |
| KUHLER H2O 620V4 | H90 | KRAKEN X40 | Water 3.0 Pro | LQ-315 |
| KUHLER H2O 920 | H55 | | Water 3.0 Performer | LQ-310 |
| KUHLER H2O 620 | H50 | | Water 2.0 Extreme | |
| | | | Water 2.0 Pro | |
| | | | Water 2.0 Performer | |
It is a niche but useful product that allows users to more easily upgrade the cooling of their graphics cards for higher overclocks and/or quieter operation. The NZXT KRAKEN G10 will be available on December 16, 2013 for $29.99 from NZXT’s website. Update: It appears the first batch of G10 brackets has already sold out. Fret not, a second batch of Kraken G10s will be available towards the middle of next month, according to NZXT. (End of update.)
Are you planning to unleash the Kraken G10 on your GPUs? (I expect cryptocurrency miners might be especially interested in this product, heh.)
Subject: General Tech, Processors | December 13, 2013 - 08:49 PM | Scott Michaud
Tagged: Intel, haswell
Intel will begin to refresh their Haswell line of processors, according to VR-Zone, starting in Q2 and continue into Q3. This will be accompanied by their 9-series of motherboard chipsets. The Intel Core i7-4770 and Core i7-4771 will be replaced, not just surpassed, by the Core i7-4790. That said, the only difference is a 100MHz bump to both the base and turbo CPU frequencies.
The K-series processors will come in Q3 and are said to be based on Haswell-E with DDR4 memory. I find this quite confusing because of previous reports that Broadwell-K would appear at roughly the same time. I am unsure what this means for Broadwell-K and I am definitely unsure why some Haswell-E components would be considered part of the Haswell refresh instead of the Haswell-E launch.
Subject: General Tech | December 16, 2013 - 12:37 PM | Jeremy Hellstrom
Tagged: idiots, DRM, disney
If you bought a collection of Disney movies to keep the kids placated this Christmas, Disney has a great holiday surprise for you. From what we have heard via [H]ard|OCP, your Christmas specials are going to disappear from your library, and your only *legal* way of watching them will be to order Disney TV and schedule your holidays around their chosen broadcast times. Before you aim all your vitriol at Disney, save a bit for Amazon, the provider that agreed to allow Disney to pull an epic Scrooge move. When Disney first approached Amazon to distribute their movies and shows, Amazon agreed to let Disney pull the content whenever they felt like it. Aren't you glad you paid for those movies and shows now? Too bad there is no other way to get hold of them during the holidays and stop your children from crying.
"Disney has decided to pull access to several purchased Christmas videos from Amazon during the holiday season, as the movie studio wants its TV-channel to have the content exclusively. Affected customers have seen their videos disappear from their online libraries, showing once again that not everything you buy is actually yours to keep."
Here is some more Tech News from around the web:
- Google may drop Intel for own-recipe ARM: Bloomberg @ The Register
- Further Teardown of the Saturn V Flight Computer @ Hack a Day
- Troubleshooting in the Command Line: Tips for Linux Beginners @ Linux.com
- SteamOS vs. Ubuntu 13.10 - Intel HD Graphics Performance @ Phoronix
- Ninjalane Podcast - Kingpin Video Card 4-way SLI goodness and is 4k a waste
- TSSDR Holiday Giveaway – Win 1 of 2 Unreleased Adaptec (By PMC) ASR-8885 12Gbps RAID Adapters
Subject: General Tech, Processors | December 16, 2013 - 09:17 PM | Scott Michaud
Tagged: Intel, Haswell-EP, Broadwell-EP, Broadwell
Intel has made its way on to our news feed several times over the last few days. The ticking and the tocking seem to be back on schedule. Was Intel held back by the complexity of 14nm? Was it too difficult for them to focus on both high-performance and mobile development? Was it a mix of both?
VR-Zone, who knows how to get a hold of Intel slides, just leaked details about Broadwell-EP. This product line is predicted to replace Haswell-EP at some point in the summer of 2015 (they expect right around Intel Developer Forum). They claim it will be Intel's first 14nm Xeon processor which obviously suggests that it will not be preceded by Broadwell in the lower performance server categories.
Image Credit: VR-Zone
Broadwell-EP will have up to 18 cores per socket (Hyper-Threading allows up to 36 threads). Its top level cache, which we assume is L3, will be up to 45MB. TDPs will be the same as Haswell-EP, ranging from 70W to 145W for server parts and from 70W to 160W for workstations. The current parts based on Ivy Bridge, as far as I can tell, peak at 150W and 25MB of cache. Intel will apparently allow Haswell and Broadwell to give off a little more heat than their predecessors, which could be a very good sign for performance.
VR-Zone expects that a dual-socket Broadwell-EP Xeon system could support up to 2TB of DDR4 memory. They expect close to 1 TFLOP of double precision floating point performance per socket. That meets or exceeds the performance available from Kaveri, including its GPU. Sure, the AMD solution will be available over a year earlier and cost a fraction of the multi-thousand-dollar server processor, but it is somewhat ridiculous to think that a CPU could have the theoretical performance to software-render the equivalent of Battlefield 4 at medium settings without a GPU (if software were written with such a rendering engine, which of course it is not).
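For the curious, that ~1 TFLOP figure is easy to sanity-check with the standard peak-FLOPS formula. A short sketch, assuming Broadwell keeps Haswell's two AVX2 FMA units (16 double-precision FLOPs per core per cycle); the clock speeds below are our guesses, not anything from the leak:

```python
# Peak double-precision GFLOPS = cores x clock (GHz) x FLOPs per cycle.
# Haswell retires 16 DP FLOPs per core per cycle via two AVX2 FMA units;
# we assume Broadwell-EP keeps that width.
def peak_dp_gflops(cores, clock_ghz, flops_per_cycle=16):
    return cores * clock_ghz * flops_per_cycle

print(peak_dp_gflops(18, 3.5))  # ~1008 GFLOPS, requiring a ~3.5 GHz all-core clock
print(peak_dp_gflops(18, 2.2))  # ~634 GFLOPS at a more plausible 2.2 GHz
```

In other words, the 1 TFLOP claim reads like a best-case peak; either the all-core clocks will be aggressive or something about the math changes between generations.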
This is obviously two generations off as we have just received the much anticipated Ivy Bridge-E. Still, it is good to see that Intel is keeping themselves moving ahead and developing new top-end performance parts for enthusiasts and high-end servers.
Subject: General Tech | December 12, 2013 - 02:05 PM | Jeremy Hellstrom
Tagged: win 8.1, touchscreen, notebook, fail
It is going to be hard for Microsoft to flog its new OS when notebook manufacturers are not interested in selling touchscreen notebooks. Apparently the idea of greasy fingers obscuring your view of Metro hasn't caught on the way the GUI geniuses behind Win8 predicted. Though DigiTimes does not specify which vendors are abandoning touchscreens, the first tier includes all of the names you are familiar with. The decision is financial, not spiteful: a touchscreen adds around 10% to the cost of producing a notebook, and since no one is buying them it is foolish to continue producing them.
"Some first-tier notebook brand vendors have recently adjusted their notebook roadmaps for 2014 and will delay the releases of touchscreen conventional notebooks to focus on non-touchscreen models, which have a pricing advantage, according to sources from the upstream supply chain."
Here is some more Tech News from around the web:
- The TR Podcast 147: Amazon airlifts, 4K goes mainstream, and 290X goes wobbly @ The Tech Report
- Rambus versus Micron FINALLY OVER @ The Register
- Just when you were considering Red Hat Linux 6.5, here comes 7 @ The Register
- TSSDR Holiday Giveaway – Win Crucial M500 240GB, 480 and 1TB SSDs
Subject: General Tech, Processors | December 15, 2013 - 04:27 AM | Scott Michaud
Tagged: Intel, google, arm
Amazon, Facebook, and Google are three members of a fairly exclusive club. These three companies order custom server processors from Intel (and other companies). Jason Waxman of Intel was quoted by Wired, "Sometimes OEMs and end customers ask us to put a feature into the silicon and it sort of depends upon how big a deal it is and whether it has to be invisible or proprietary to a customer. We're always happy to, if we can find a way to get it into the silicon".
Now, it would seem, Google is interested in developing its own server processors based on architecture licensed from ARM. This could be a big deal for Intel, as Bloomberg believes Google accounts for a full 4.3% of the chip giant's revenue.
Of course this probably does not mean Google will spin up a fabrication plant somewhere. That would just be nutty. It is still unclear whether they will cut in ARM design houses, such as AMD or Qualcomm, or take ARM's design straight to TSMC, GlobalFoundries, or IBM.
I am sure there would be many takers for some sizable fraction of 4.3% of Intel's revenue.
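Napkin math on what that slice is worth (the roughly $53 billion figure for Intel's 2012 revenue is our reference point here, not something from the Bloomberg report):

```python
# Rough annual value of Google's slice of Intel's business.
intel_revenue = 53.3e9  # Intel's 2012 revenue, approximately (assumption)
google_share = 0.043    # Bloomberg's estimate of Google's share
print(f"${intel_revenue * google_share / 1e9:.1f}B per year")  # ~$2.3B
```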
Subject: General Tech | December 13, 2013 - 06:59 PM | Jeremy Hellstrom
Tagged: valve, steam os, Gabe Newell
Well, it is December 13th and, as promised, you can get your hands on SteamOS, more or less. We've tried starting the download a few times here at PC Perspective and are running into difficulties, but maybe you will have better luck. Click this link to head to the SteamDB site and you just might be able to get your hands on Valve's new operating system. We have been led to believe it will bear a lot of resemblance to the already familiar Steam Big Picture mode, though as we have yet to get a working image installed on a machine, that is hard to verify. There is a secondary repository you can try as well.
And a new magnet link torrent just popped up which should help a lot! Magnet link for torrent download.
As they state on the page, "Valve is having server issues (no wonder), download will probably fail," but you probably expected that anyway. Of course you will not be able to download a Steam Machine, unless you are one of those lucky so-and-sos who got in on the beta. Once we have succeeded in installing Gabe's new plaything on a machine you can expect an update, but until then why not try it on your own? No word on whether this will support badgers or not.
Subject: General Tech, Graphics Cards | December 18, 2013 - 04:25 AM | Scott Michaud
Tagged: GeForce GTX 780 Ti, DirectCU II, asus
There have not been many custom coolers for top-end NVIDIA graphics cards as of late. Starting with the GeForce GTX 690, NVIDIA has allegedly demanded that AIB partners stick to the reference designs for certain models. Obviously this is a problem: it limits the innovation partners can deliver when they are forced to compete on fewer metrics (although the reference designs were pretty good regardless). This is especially true because the affected models are the upper high-end, where pricing is more flexible if the product is worth it.
That restriction apparently does not apply to the top-end GTX 780 Ti. ASUS has just announced the GeForce GTX 780 Ti DirectCU II graphics card, claiming 30% cooler operation with one-third the noise. A 6% bump in performance (as measured in Battlefield 4) will accompany that cooler and quieter operation, as the full GK110 GPU will boost to 1020MHz.
ASUS makes custom GPUs for both AMD and NVIDIA. Be sure to check out our review of another high-end DirectCU II card, with 100% less NVIDIA, very soon. It will definitely be a great read and maybe even an excellent podcast topic.
Subject: General Tech | December 15, 2013 - 03:02 AM | Scott Michaud
Tagged: itu, gigabit broadband
And now for something a little different from what we normally report on. G.fast is a telecom standard that allows really fast communication (capable of over a gigabit) over moderate distances (roughly a quarter of a mile) using standard telephone cable. The point of the standard is to avoid installing new infrastructure between the end of a fiber roll-out to the neighborhood and the insides of every individual home.
Eh, it looks enough like a phone cord.
The hope is that it will trigger another wave of infrastructure improvements for upcoming "Ultra-HD" (4K and 8K) video services and online storage solutions. Installing fiber all the way to the home seems to be treated more like a self-imposed obligation than a necessary upgrade. This solution would not even require a technician to enter the home, much like what we currently have with ADSL2.
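As rough napkin math for why a gigabit over phone wire matters for those services (the 25 Mbps per 4K stream figure is a commonly cited estimate, not part of the spec):

```python
# How many simultaneous 4K streams could one G.fast link carry?
gfast_mbps = 1000        # G.fast target: over a gigabit on telephone cable
uhd_stream_mbps = 25     # rough per-stream 4K bitrate (assumption)
print(f"{gfast_mbps // uhd_stream_mbps} simultaneous 4K streams")  # ~40
```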
I do have lingering concerns, however, about the reliability of fiber-based networks. Copper infrastructure was designed to be resilient, and I wonder how reliable G.fast will be compared to that legacy network in areas prone to natural disaster. It sounds like standard telephone service will, unlike a fiber-to-the-home solution, keep functioning in a power outage at the home itself, but what about an outage localized to the neighborhood? Then again, this is definitely not an area of my expertise.
The ITU wants G.fast to be finalized "as early as" April 2014.
Subject: General Tech | December 18, 2013 - 06:20 PM | Ryan Shrout
Tagged: video, Type Heaven, topre, keyboard
I don't consider myself a keyboard guru, but I sure do go through a lot of them in my line of work. At any of five different workstations in our office I'll be using a different keyboard. And we tend to interchange them often enough that I would guess I have typed on as many as 15 different keyboards this year. Some for longer periods of time than others of course, but the ones that make it to my main desk get quite a workout.
When our friends at Seasonic told us they wanted to send along a Topre Type Heaven keyboard for us to try out, I told them to feel free; but in my head I was thinking "oh geez, another keyboard." Turns out I didn't give this brand and this keyboard enough credit out of the gate.
(Note: Seasonic is the official distributor of the Topre keyboard brand in the US now and offers a 2 year warranty on the units!)
With a price tag of $150 on Amazon.com, quite a few of you will instantly tune out. Understandable. Others, though, will appreciate the need for a high quality input device if you do any appreciable amount of typing for work or pleasure. Using a technology called electrostatic capacitive key switches, Topre combines the benefits of Cherry mechanical and standard membrane keyboards in one package.
Check out my video above for a sound comparison as well as my thoughts on using the keyboard long term. Not to spoil it, but I'm keeping this keyboard on my desk despite missing the multimedia controls of my previous one.
Subject: General Tech | December 16, 2013 - 04:20 PM | Scott Michaud
Tagged: Starcraft II, HoTS, bitcoin
Bitcoin Starcraft Challenge is a show match between Scarlett, the Zerg player from Canada, and NaNiwa, the Protoss player from Sweden. These are two of the top 25 players in the world and, along with TLO, the only top-25 players from outside of South Korea (although each spent substantial time training there at some point).
Of course the interesting part is that they are playing for Bitcoins, 12 of them, worth roughly $10,000 USD. Thankfully there are no Terrans to drop MULEs, otherwise the whole economy would collapse (I troll; they are balanced, all things considered).
TotalBiscuit and other (currently TBD) announcers will commentate the event this Saturday at noon EST. It will be a best-of-seven streamed by TotalBiscuit on Twitch; the VODs will be available later on his YouTube page. It should be a very interesting event.
This will probably be the most efficient way to acquire Bitcoins with your GPU for quite some time.
Subject: General Tech, Graphics Cards | December 17, 2013 - 05:02 PM | Scott Michaud
Tagged: nvidia, ShadowPlay, geforce experience
Another update to GeForce Experience brings another anticipated ShadowPlay feature. The ability to stream live gameplay to Twitch, hardware accelerated by Kepler, was demoed at NVIDIA's Montreal event in late October. They showed Batman: Arkham Origins streaming at 1080p and 60 FPS without capping or affecting the in-game output settings.
GeForce Experience 1.8.1 finally brings that feature, in beta of course, to the general public. Once set up, Alt + F8 will launch the Twitch stream and Alt + F6 will activate your webcam. Oh, and by the way, one feature they kept from us (or at least from me) is the ability to overlay your webcam atop your gameplay.
Nice touch NVIDIA.
Of course the upstream bandwidth requirements of video are quite high: 3.5 Mbps at the top end, a more common 2 Mbps happy medium, and a 0.75 Mbps minimum. NVIDIA has been trying to ensure that your machine will not lag, but there is nothing a GPU can do about your internet connection.
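For anyone wondering what those bitrates mean in practice, here is the quick conversion to data uploaded per hour of streaming:

```python
# Upstream data used per hour at ShadowPlay's three bitrate tiers.
for label, mbps in [("maximum", 3.5), ("medium", 2.0), ("minimum", 0.75)]:
    gb_per_hour = mbps * 3600 / 8 / 1000  # megabits/s -> gigabytes per hour
    print(f"{label:>7}: {gb_per_hour:.2f} GB per hour")
```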
GeForce Experience 1.8.1 is available now at the GeForce website.