July 7, 2020
In recent years, robot development has paid a great deal of attention to vision systems. The more complex image processing becomes, however, the more expensive the overall system grows and the more susceptible it becomes to interference from outside optical noise. Humans use the rough size and position information obtained by sight to move their hands, and then, through the act of grasping, obtain further information from their tactile sense, which enables them to hold objects. Vision and tactile sensation should be treated as a set that makes up for each other's weaknesses. This demonstration video shows how tactile sensing can be used to make determinations that are not possible with vision alone. Furthermore, complex tactile signal processing is a good match for AI, which excels at multivariable processing, so we used this technology in the demonstration unit.
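To illustrate why this kind of multivariable tactile data lends itself to machine learning, here is a minimal, hypothetical sketch in Python. The channel count, the simulated grasp signals, and the chosen features are all illustrative assumptions, not the processing used in the actual demonstration unit.

```python
# Hypothetical sketch: classifying object firmness from multi-channel tactile data.
# Sensor layout, signals, and features are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def simulate_grasp(firm: bool, channels: int = 16, steps: int = 50) -> np.ndarray:
    """Simulate force readings from a grid of tactile channels during one grasp."""
    rise = 2.0 if firm else 0.5          # firm objects build force more quickly
    t = np.linspace(0, 1, steps)
    base = 1 - np.exp(-rise * 5 * t)     # force ramp shared by all channels
    noise = 0.05 * rng.standard_normal((channels, steps))
    return base[None, :] + noise

def features(grasp: np.ndarray) -> np.ndarray:
    """Multivariable features: mean peak force, rise rate, and channel spread."""
    peak = grasp.max(axis=1).mean()
    rate = np.diff(grasp, axis=1).max(axis=1).mean()
    spread = grasp.max(axis=1).std()
    return np.array([peak, rate, spread])

X = np.array([features(simulate_grasp(firm=(i % 2 == 0))) for i in range(200)])
y = np.array([i % 2 for i in range(200)])   # 0 = firm, 1 = soft

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```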
We developed this demonstration unit after receiving similar requests from the medical industry and the robot industry at the same time. Both industries need to know the forces applied through an end effector when grasping objects such as organs or foods, which are soft and have no fixed shape. When the objects being grasped are soft, tactile sensing is essential to ensure that they are not damaged or destroyed. Our sensors are compact, so they can even be mounted in the tips of tools like the one shown in the video. (Pay special attention to the part where the marshmallow is pulled on.)
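As a rough illustration of how a grasp can be kept below a damaging force, here is a minimal sketch of a force-limited closing loop. The functions read_grip_force and command_gripper are hypothetical stand-ins for the real sensor and gripper interfaces, and the force limit and step size are arbitrary example values.

```python
# Hypothetical sketch of a force-limited grasp loop for soft objects.
# read_grip_force() and command_gripper() are placeholders for the actual
# sensor and actuator interfaces of the end effector.
import time

FORCE_LIMIT_N = 0.8      # assumed safe grip force for a soft object
STEP_MM = 0.2            # gripper closes in small increments

def grasp_softly(read_grip_force, command_gripper, opening_mm: float = 40.0) -> float:
    """Close the gripper until the measured force reaches the limit, then hold."""
    while opening_mm > 0.0:
        force = read_grip_force()        # current normal force from the tactile sensor
        if force >= FORCE_LIMIT_N:
            break                        # enough force to grip, not enough to damage
        opening_mm -= STEP_MM
        command_gripper(opening_mm)
        time.sleep(0.01)                 # assumed ~100 Hz control period
    return opening_mm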
The video shows how this type of sensor, often requested by the medical and sports industries, is used. This demonstration unit has sensors arranged at 100% density, but roughly the same coverage can be achieved with sensors spaced slightly further apart, at the cost of some precision. Turning an entire soft mattress into an interface makes it possible to monitor patients' full-body postures and adjust beds accordingly, to measure the shapes of hands or feet, and more.
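As a simple illustration of what such a sensor grid can report, the sketch below computes a center of pressure from a grid of readings. The grid size, sensor spacing, and synthetic reading are assumptions for illustration, not the mattress prototype's actual configuration.

```python
# Hypothetical sketch: estimating the centre of pressure of a body lying on a
# grid of tactile sensors embedded in a mattress.
import numpy as np

ROWS, COLS = 32, 16          # assumed sensor grid (wider spacing trades precision)
CELL_MM = 50.0               # assumed spacing between sensors

def center_of_pressure(pressure: np.ndarray) -> tuple[float, float]:
    """Weighted mean of sensor positions, weighted by measured pressure."""
    total = pressure.sum()
    if total <= 0:
        return (0.0, 0.0)
    ys, xs = np.mgrid[0:ROWS, 0:COLS]
    cx = (xs * pressure).sum() / total * CELL_MM
    cy = (ys * pressure).sum() / total * CELL_MM
    return (cx, cy)

# Example: a synthetic reading with higher pressure under the torso region.
reading = np.zeros((ROWS, COLS))
reading[10:20, 5:11] = 1.0
print("centre of pressure (mm):", center_of_pressure(reading))
```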
We combined tactile sensors to make a video game controller. Of course, this is a quirky design with plenty of issues, so we hope you take it as a light and playful exercise. We used a six-axis tactile sensor for the joystick, enabling three-dimensional control, placed soft buttons on the side as flexible input interfaces, and placed a force plate on the back.
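One plausible way to turn six-axis readings into a three-dimensional control input is sketched below. The wrench layout, dead zone, and gain are assumed values for illustration, not the controller's actual firmware.

```python
# Hypothetical sketch: mapping six-axis tactile readings (Fx, Fy, Fz, Mx, My, Mz)
# to a three-dimensional joystick command. Constants are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Wrench:
    fx: float      # lateral force, x (N)
    fy: float      # lateral force, y (N)
    fz: float      # normal force (N)
    mx: float      # moment about x (N*mm)
    my: float      # moment about y (N*mm)
    mz: float      # moment about z (N*mm)

DEAD_ZONE_N = 0.05
GAIN = 2.0

def joystick_command(w: Wrench) -> tuple[float, float, float]:
    """Map lateral forces to x/y motion and normal force to z (push in/out)."""
    def shape(v: float) -> float:
        return 0.0 if abs(v) < DEAD_ZONE_N else max(-1.0, min(1.0, GAIN * v))
    return (shape(w.fx), shape(w.fy), shape(w.fz))

print(joystick_command(Wrench(0.3, -0.1, 0.6, 0.0, 0.0, 0.0)))  # -> (0.6, -0.2, 1.0)
```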
The demonstration unit uses a smartphone design concept, but this technology could be applied in the same way to large monitors and even non-transparent surfaces. Ordinary touch panels cannot be used while wearing gloves or with wet hands, but force plates are unaffected by either. One novel aspect of this technology is shown toward the middle of the video: the user's finger is on the screen but not moving, yet the sensors recognize the force and the direction in which the finger is pressing. One weakness of this design is that it does not support multitouch. (Instead, it determines the center point of the forces applied by two fingers.)
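The static-finger behavior can be understood through the standard simplified force-plate relations: the normal force and the moments give the contact point, while the lateral force components give the direction of pressing. The sketch below is an illustrative calculation under those simplifications, not the demonstration unit's actual signal processing, and with two fingers only the combined center point can be recovered.

```python
# Hypothetical sketch of force-plate contact estimation: the normal force Fz and
# moments Mx, My yield the centre of pressure even when the finger is static,
# and Fx, Fy give the direction of the applied shear.
def contact_from_force_plate(fx, fy, fz, mx, my):
    """Return ((x, y) contact point, (fx, fy) shear direction) for one reading."""
    if fz <= 0:
        return None
    x = -my / fz        # simplified centre-of-pressure relations
    y = mx / fz
    return (x, y), (fx, fy)

# A finger pressing at roughly (10, 5) mm and pushing toward +x without sliding:
print(contact_from_force_plate(fx=0.4, fy=0.0, fz=2.0, mx=10.0, my=-20.0))
```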