Researchers at NC State University have developed a fabric-based sensor that uses three-dimensional embroidery techniques and machine learning to control electronic devices through touch. Integrated into clothing, the sensor can operate mobile apps and trigger other electronic functions by recognizing distinct gestures. The device is powered by the triboelectric effect: friction between its two contacting materials generates electricity, and the resulting signal is sent to a microchip. The microchip processes these inputs with machine learning algorithms to distinguish intentional gestures from accidental touches, improving interaction accuracy.
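The article does not describe the sensor's actual model or features, but the core idea of separating intentional gestures from accidental touches can be illustrated with a toy classifier. The sketch below (all values and feature names are hypothetical) uses a nearest-centroid classifier over two made-up signal features, peak voltage and contact duration:

```python
# Toy sketch, NOT the NC State implementation: distinguish intentional
# gestures from accidental touches with a nearest-centroid classifier over
# two hypothetical signal features: (peak_voltage_mV, duration_ms).

def centroid(samples):
    """Component-wise mean of a list of equal-length tuples."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(len(samples[0])))

def classify(event, centroids):
    """Return the label whose centroid is closest in squared Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(event, centroids[label]))

# Made-up training data: firm, sustained touches vs brief, light brushes.
intentional = [(120, 180), (140, 220), (130, 200)]
accidental = [(40, 30), (55, 50), (35, 20)]

centroids = {
    "gesture": centroid(intentional),
    "noise": centroid(accidental),
}

print(classify((125, 190), centroids))  # a firm, sustained touch -> "gesture"
print(classify((45, 35), centroids))    # a brief, light brush -> "noise"
```

A production system would likely use richer features of the triboelectric waveform and a trained model rather than hand-picked centroids, but the decision structure, mapping a signal's features to "intended input" or "ignore", is the same.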
Star Trek’s Holodeck could train future robots
Researchers at the University of Pennsylvania have developed “Holodeck,” a system inspired by Star Trek’s virtual environment simulator, to generate interactive 3D spaces for AI and robotics training. Unlike its sci-fi counterpart, which created lifelike scenarios for human interaction, this real-world Holodeck uses generative AI to produce vast numbers of virtual environments from simple language prompts, aiding in developing robots that can navigate complex real-world situations safely. The system represents a significant advancement in using large language models (LLMs) to design spaces.
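The summary does not specify Holodeck's output format, but the pipeline it describes, a language prompt turned into a structured virtual environment, can be sketched. In the hypothetical example below, the LLM step is faked with a keyword lookup and the object library and placement rule are invented for illustration:

```python
# Hypothetical sketch of a prompt-to-scene pipeline in the spirit of
# Penn's Holodeck. The real system queries a large language model; here
# the "LLM" is stubbed out with a keyword lookup over a tiny library.

OBJECT_LIBRARY = {
    "office": ["desk", "chair", "monitor", "bookshelf"],
    "kitchen": ["counter", "stove", "fridge", "sink"],
}

def generate_scene(prompt, room_size=(5.0, 4.0)):
    """Turn a text prompt into a structured scene description (a dict)."""
    room_type = next((k for k in OBJECT_LIBRARY if k in prompt.lower()), None)
    if room_type is None:
        raise ValueError("no known room type in prompt")
    width, depth = room_size
    objects = []
    names = OBJECT_LIBRARY[room_type]
    for i, name in enumerate(names):
        # Naive placement: spread objects evenly along the room's width.
        x = round((i + 0.5) * width / len(names), 2)
        objects.append({"name": name, "position": (x, 0.0)})
    return {"room": room_type, "size": room_size, "objects": objects}

scene = generate_scene("a small office for a graduate student")
print(scene["room"], len(scene["objects"]))  # office 4
```

Generating environments at scale from such descriptions, with an LLM proposing layouts far more varied than a fixed library, is what lets a robot train against many plausible spaces instead of a handful of hand-built ones.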