Thursday, January 23, 2025

Eyeballs and Things


Get your eyeballs ready, they have competition:

Eyes of tomorrow: Smart contact lenses lead the way for human-machine interaction
May 2024, phys.org

They put RFID tags in the contact lens, each encoded at its own frequency so a reader can tell where the eye is pointing, and that's it; like the DIY Wii-remote smartboard projector circa 2011, in reverse.
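
If you want to picture the decoding step, here is a toy sketch in Python, assuming four tags on the lens, each at its own frequency, and a reader that reports signal strength per frequency; the tag layout and the weighted-centroid rule are illustrative assumptions on my part, not the authors' actual algorithm.

import numpy as np

# Assumed tag layout on the lens: frequency (MHz) -> (x, y) offset from center.
# Both the frequencies and the positions are made up for illustration.
TAG_POSITIONS = {
    13.0: np.array([ 1.0,  0.0]),   # temporal
    15.5: np.array([-1.0,  0.0]),   # nasal
    18.0: np.array([ 0.0,  1.0]),   # superior
    20.5: np.array([ 0.0, -1.0]),   # inferior
}

def decode_gaze(rssi_by_freq):
    """Estimate gaze direction as the signal-strength-weighted centroid
    of the tag positions. rssi_by_freq maps frequency (MHz) -> linear RSSI."""
    weights = np.array([rssi_by_freq[f] for f in TAG_POSITIONS])
    points = np.array([TAG_POSITIONS[f] for f in TAG_POSITIONS])
    return (weights[:, None] * points).sum(axis=0) / weights.sum()

# The temporal tag couples most strongly, so the estimate leans that way.
print(decode_gaze({13.0: 0.9, 15.5: 0.3, 18.0: 0.5, 20.5: 0.5}))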

via College of Engineering and Applied Sciences at Nanjing University: Hengtian Zhu et al, Frequency-encoded eye tracking smart contact lens for human–machine interaction, Nature Communications (2024). DOI: 10.1038/s41467-024-47851-y



New flexible film detects eyelash proximity in blink-tracking glasses
May 2024, phys.org

They are seeing with static electricity. A "non-contact sensor" can identify or measure an object without directly touching it, like infrared thermometers and vehicle proximity notification systems; here the sensing is done with static electricity, via a fluorinated ethylene propylene electret that produces an external electrostatic field and can "see" nearby objects without physical contact.
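
A toy sketch of the idea, assuming a small grid of sensing electrodes under the charged FEP film, each reporting the voltage induced when an object disturbs the field; the grid layout, threshold, and peak-finding rule are illustrative assumptions, not the paper's design.

import numpy as np

def locate_object(voltages, threshold=0.05):
    """voltages: 2D array of induced electrode voltages (V).
    Returns the (row, col) of the strongest response, or None if nothing
    is close enough to rise above the noise threshold."""
    v = np.abs(np.asarray(voltages, dtype=float))
    if v.max() < threshold:
        return None                      # no object within sensing range
    return np.unravel_index(v.argmax(), v.shape)

# Example: a fingertip (or an eyelash, in the blink-tracking glasses)
# hovering over the lower-right electrode.
frame = [[0.01, 0.02, 0.01],
         [0.02, 0.04, 0.09],
         [0.01, 0.08, 0.31]]
print(locate_object(frame))   # strongest response at row 2, column 2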

via Shanghai Key Laboratory for Intelligent Sensing and Detection Technology at East China University of Science and Technology: Facile Electret-Based Self-Powered Soft Sensor for Noncontact Positioning and Information Translation, ACS Applied Materials & Interfaces (2024). DOI: 10.1021/acsami.4c02741


AI headphones let wearer listen to a single person in a crowd by looking at them just once
May 2024, phys.org

"Target Speech Hearing" - wear the headphones and look at a person speaking for three to five seconds to "enroll" them. The system then cancels all other sounds in the environment and plays just the enrolled speaker's voice in real time. The sound waves from the speaker's voice reach the microphones on both sides of the headset simultaneously, and that's how works.

via University of Washington: Bandhav Veluri et al, Look Once to Hear: Target Speech Hearing with Noisy Examples, Proceedings of the CHI Conference on Human Factors in Computing Systems (2024). DOI: 10.1145/3613904.3642057, dl.acm.org/doi/10.1145/3613904.3642057


Cutting-edge vision chip brings human eye-like perception to machines
Jun 2024, phys.org

This approach decomposes visual information into primitive-based visual representations. By combining these primitives, it mimics the features of the human visual system. (The "Tianmouc chip" achieves high-speed visual information acquisition at 10,000 frames per second, 10-bit precision, and a high dynamic range of 130 dB, all while reducing bandwidth by 90% and maintaining low power consumption.) 
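
A toy sketch of the primitive idea, assuming a slow, precise stream of full keyframes (a "cognition" pathway) plus a fast, sparse stream of temporal differences (an "action" pathway); the rates and the change threshold are illustrative assumptions, not the Tianmouc chip's actual design.

import numpy as np

def decompose(frames, keyframe_every=100, diff_threshold=8):
    """frames: list of 2D uint8 arrays captured at a high rate.
    Returns (keyframes, sparse_diffs), where the diffs keep only the pixels
    that changed by more than the threshold since the previous frame."""
    keyframes, sparse_diffs = [], []
    prev = None
    for i, frame in enumerate(frames):
        if i % keyframe_every == 0:
            keyframes.append(frame)             # slow, full-precision pathway
        if prev is not None:
            delta = frame.astype(np.int16) - prev.astype(np.int16)
            mask = np.abs(delta) > diff_threshold
            # fast pathway: only the changed pixels (coordinates + values)
            sparse_diffs.append((np.argwhere(mask), delta[mask]))
        prev = frame
    return keyframes, sparse_diffs

Transmitting mostly sparse differences instead of full frames is one way a sensor can cut bandwidth the way the article describes.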

via Tsinghua University Center for Brain Inspired Computing Research: Zheyu Yang et al, A vision chip with complementary pathways for open-world sensing, Nature (2024). DOI: 10.1038/s41586-024-07358-4


You're just a stick figure to this camera - a new camera to prevent companies from collecting private information
Jul 2024, phys.org

PrivacyLens uses both a standard video camera and a heat-sensing camera to spot people in images by their body temperature. The person's likeness is then completely replaced by a generic stick figure, allowing the camera to function without revealing the identity of the person.
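
A toy sketch of the redaction step, assuming an RGB frame and a thermal frame already aligned pixel-to-pixel, plus a simple body-temperature threshold; the threshold and the drawing step are illustrative assumptions, not the PrivacyLens pipeline itself.

import numpy as np

def redact_person(rgb, thermal_c, body_temp_threshold=30.0):
    """rgb: HxWx3 uint8 image; thermal_c: HxW temperatures in Celsius,
    aligned with the RGB frame. Returns the redacted image and the
    bounding box (top, left, bottom, right) of the detected person."""
    mask = thermal_c > body_temp_threshold
    if not mask.any():
        return rgb, None                 # nobody in frame
    redacted = rgb.copy()
    redacted[mask] = 0                   # the likeness never leaves the device
    rows, cols = np.where(mask)
    bbox = (rows.min(), cols.min(), rows.max(), cols.max())
    return redacted, bbox                # draw the stick figure inside bbox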

via University of Michigan: Yasha Iravantchi et al, PrivacyLens: On-Device PII Removal from RGB Images using Thermally-Enhanced Sensing, Proceedings on Privacy Enhancing Technologies (2024). DOI: 10.56553/popets-2024-0146


Ultra-high speed camera for molecules: Attosecond spectroscopy captures electron transfer dynamics
Sep 2024, phys.org

Ultrashort ultraviolet pulses from high-order harmonic sources or free electron laser facilities stand as powerful tools for initiating and observing the response of molecules to photoionization, on timescales ranging from the femtosecond (10⁻¹⁵ seconds) down to the attosecond (10⁻¹⁸ seconds).

via Madrid Institute for Advanced Studies in Nanoscience, Autonomous University and Complutense University of Madrid: Federico Vismarra et al. Few-femtosecond electron transfer dynamics in photoionized donor–π–acceptor molecules. Nature Chemistry (2024). DOI: 10.1038/s41557-024-01620-y 
