Subject: General Tech, Mobile, Shows and Expos | February 24, 2012 - 06:18 PM | Scott Michaud
Tagged: nvidia, DirectTouch, MWC 12
As a part of their Tegra 3 product, NVIDIA embedded the ability to offload some of the touchscreen processing onto the CPU. The offloading allows for increased power efficiency, by reducing the number of powered components, as well as improved touch responsiveness. Atmel, Cypress, and Synaptics are three leading touch-controller companies who join N-Trig, Raydium, and Focaltech in supporting the DirectTouch architecture.
Touchy subject, I know -- but...
Advancements in touch technology are definitely welcome, especially when the words power efficiency or responsiveness are involved. Both NVIDIA and Intel have been looking for ways to reduce the amount of electronics behind your phone or tablet. The less hardware required to do the same work, the better off we are. It is great to see NVIDIA taking the lead in innovation where it is needed most.
While I do not mean to rain upon NVIDIA's bright blue skies -- I must make a note. Despite the precision brought by the high sample rate, there appears to be quite a bit of latency between where the demonstrator's finger is and where the touch is reported. I would be curious to see where that latency occurs.
Of course, this issue probably has nothing to do with NVIDIA. Videogames, particularly on consoles, are known to have input latencies approaching 100 ms, because the input is sampled far less often than frames are rendered and those frames are buffered before reaching the display. The latency could come from the touch device itself, from the software, from the operating system, and/or from anywhere else in the chain.
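To see how those stages stack up, here is a back-of-the-envelope sketch of a worst-case latency budget. All of the numbers (sample rate, refresh rate, buffered frame count, OS overhead) are illustrative assumptions for the sake of the arithmetic, not measurements of any actual device.

```python
def worst_case_latency_ms(sample_hz, refresh_hz, pipeline_frames, os_overhead_ms):
    """Sum the worst-case delay contributed by each stage, in milliseconds."""
    sampling = 1000.0 / sample_hz                       # touch controller sample interval
    rendering = pipeline_frames * 1000.0 / refresh_hz   # frames buffered before display
    return sampling + rendering + os_overhead_ms

# Hypothetical numbers: 120 Hz touch sampling, 60 Hz display,
# 3 buffered frames in the render pipeline, ~10 ms in the OS input stack.
total = worst_case_latency_ms(120, 60, 3, 10.0)
print(round(total, 1))  # ~8.3 + 50 + 10 = 68.3 ms
```

Even with these modest assumptions the buffered frames dominate, which is why rendering pipelines, not touch controllers alone, are the usual suspect when latency climbs toward 100 ms.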
We do not know where the latency occurs, but I expect that whoever crushes it will have a throne awaiting them somewhere in Silicon Valley.