Tactus makes touch screens physical

Could Tactus Technology have created the holy grail for touch-powered devices where writing and typing are concerned? The company recently showed off its “Tactile Layer” technology, a new system designed to give modern touchscreens a more physical kind of interaction.

Presented during Display Week, held by the Society for Information Display (SID) in Boston, Tactile Layer is described as a lightweight, power-efficient and easily integrated technology that creates a haptic interface on the fly whenever the user needs to write text or input data.

The technology can be programmed to present several types of interactive features, e.g. keys resembling those of a “real” keyboard, or other kinds of buttons and elements. The interaction is haptic (the user can feel the buttons when touching them) but also visual (once created, the interface looks like a “real” UI rather than a purely virtual one).

Tactus Technology built its Tactile Layer around a microfluidic system that lets the interactive elements “pop out” of the otherwise flat touchscreen and then disappear once they are no longer needed. The company states that its technology can be easily integrated into any touch-based device with very few drawbacks.

According to Tactus, Tactile Layer adds no extra thickness to the display stack: it “replaces [the] window and sits on top of the existing touch screen”. The technology works with existing touch-sensing and display technologies, needs a minimal amount of power, can be scaled from smartphones up to television screens, and is fully customizable in terms of the shape, location and size of its interactive elements.
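Neither Tactus nor the announcement describes a software interface, so the following is purely an illustrative sketch of the concept: a tactile layout is a predefined set of elements, each with a shape, location and size, which the microfluidic layer raises or lowers as a group. Every name and number below is invented for illustration and does not represent a real Tactus API.

```python
# Hypothetical sketch (not a real Tactus API): a tactile layout as a fixed set
# of raised elements that the microfluidic layer can raise or lower together.
from dataclasses import dataclass
from typing import List


@dataclass
class TactileElement:
    shape: str        # e.g. "rounded_rect" or "circle"
    x_mm: float       # position on the panel, in millimetres
    y_mm: float
    width_mm: float
    height_mm: float


@dataclass
class TactileLayout:
    name: str
    elements: List[TactileElement]
    raised: bool = False

    def raise_layout(self) -> None:
        """Conceptually, pump fluid into the channels so the elements pop out."""
        self.raised = True

    def lower_layout(self) -> None:
        """Conceptually, drain the channels so the surface returns flat."""
        self.raised = False


# A keyboard row defined ahead of time, consistent with the pre-defined-layout
# limitation noted in the comments below.
qwerty_row = TactileLayout(
    name="qwerty_top_row",
    elements=[
        TactileElement("rounded_rect", x_mm=5 + i * 7, y_mm=60,
                       width_mm=6, height_mm=6)
        for i in range(10)
    ],
)
qwerty_row.raise_layout()   # keys become physically raised
qwerty_row.lower_layout()   # surface flattens when the keyboard is dismissed
```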

Source: TG Daily.


10 Comments


So why not just use real buttons? All we've succeeded in doing is dumping normal keys/buttons for touch screens, deciding we're not happy with it, and finding a newer, more expensive method of putting buttons back on our devices. I can see how dynamically having the buttons wherever you want would be nifty, but at the end of the day, is it really worth it when you could just have a standardized keyboard that software developers build around?

Gerowen said,
So why not just use real buttons? All we've succeeded in doing is dumping normal keys/buttons for touch screens, deciding we're not happy with it, and finding a newer, more expensive method of putting buttons back on our devices. I can see how dynamically having the buttons wherever you want would be nifty, but at the end of the day, is it really worth it when you could just have a standardized keyboard that software developers build around?

Because touch is more flexible and more intuitive. This is only the beginning of polymorphic displays.

More than anything, this will be great for accessibility. But when will we start seeing it? 2015?

While I have no doubt that this is going to be amazing, the biggest drawback that I see is that it's not "programmable"... The interface elements that are going to "pop out" have to be defined beforehand, and can't be designated on the fly.

So you could have a keyboard when the on-screen-keyboard slides out, or possibly individual keys, but you couldn't use the space for a gamepad afterwards.

Still... I look forward to this being used to entice the people who still cling to their blackberries.

I want this and need this now. I have always wondered when someone would find a solution to this. I hope this solution actually works well!