'(·) @allgebrah · Oct 15, 2017
free idea: a silicone putty with sensors in it that can harden as whatever controller you want

'(·) @allgebrah · Oct 15, 2017
add a number of vibration motors and you also have force feedback; let it not harden at all and you have a screen, but for hands
'(·) @allgebrah · Oct 15, 2017
actually, due to [dayjob] I am mildly experienced in this area and could totally make it work

'(·) @allgebrah · Oct 15, 2017
would need a competent EE person for wireless power and miniaturization of the nodules, and idk if there's a putty that's up to spec

'(·) @allgebrah · Oct 15, 2017
but there's a huge niche for a "gesture recognition interface that 1. works at all and 2. has force feedback"

'(·) @allgebrah · Oct 16, 2017
current video-recognition-based gesture interfaces simply can't deal with hands occluding each other and themselves

'(·) @allgebrah · Oct 16, 2017
force feedback is limited to, like, what was the state of the art, ultrasound and airflow? I've not even used this, it's that experimental
'(·) @allgebrah · Oct 16, 2017
an alternative to the putty would be a reconfigurable skeleton with an elastic skin; less messy, but more unknown and harder to fix
'(·) @allgebrah · Oct 16, 2017
ideally I'd have something like a small shoggoth that could reconfigure itself, but I don't know of anything that would allow it to do that

'(·) @allgebrah · Oct 16, 2017
hmm, trying to think of a workflow: say you have a 3D object on a screen and a bit of putty in front of you; software matches nodules to mesh
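The "software matches nodules to mesh" step above could be sketched as a simple nearest-neighbour assignment: each sensor nodule in the putty gets paired with the closest unclaimed vertex of the target mesh. This is purely a hypothetical illustration of the thread's idea (function name, greedy strategy, and data layout are all assumptions, not anything @allgebrah specified):

```python
# Hypothetical sketch: assign each putty sensor nodule (x, y, z) to the
# nearest unclaimed vertex of a target 3D mesh. Greedy nearest-neighbour,
# O(n*m) distance checks -- fine for a handful of nodules, not production.
import math

def match_nodules_to_mesh(nodules, mesh_vertices):
    """Return {nodule_index: vertex_index}, claiming each vertex once."""
    available = set(range(len(mesh_vertices)))
    assignment = {}
    for i, nodule in enumerate(nodules):
        # Pick the closest vertex that hasn't been claimed yet.
        best = min(available, key=lambda j: math.dist(nodule, mesh_vertices[j]))
        assignment[i] = best
        available.discard(best)
    return assignment
```

A real version would need a globally optimal assignment (e.g. Hungarian algorithm) and continuous re-registration as the putty deforms, but the greedy pass shows the shape of the matching problem.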