hardware and wearables
physical devices for sensing, interaction, and utility — a cluster that reflects a consistent pull toward hands-on building at the hardware/software boundary. the ideas span human-computer interaction (hands-free control via EMG bracelet or pupillometry glasses), signal sensing (sensory capturer, EEG artifact rejection), and environmental detection (acoustic drone detection, predictive maintenance sensors).
the highest-rated ideas in this cluster are the ML-on-hardware ones: acoustic drone detection and predictive maintenance sensors both scored [STRONG] on the spreadsheet because they combine embedded systems engineering with meaningful ML research. the common pattern: novel sensor signal → custom dataset → trained model → edge deployment. this is technically demanding in an interesting way — TinyML, quantization, model compression — and produces demos that are viscerally impressive. EEG artifact rejection is the most academically serious of these: self-supervised neural signal cleaning is a real open problem with research significance.
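to make the "edge deployment" step of that pattern concrete, here's a minimal sketch of the core trick behind int8 quantization — the thing TinyML toolchains like TFLite do under the hood. this is pure numpy and purely illustrative (symmetric per-tensor quantization, names and shapes are made up); a real deployment would use a proper converter, but the memory math is the same:

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """symmetric per-tensor quantization: float32 -> (int8, scale)."""
    scale = np.max(np.abs(weights)) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """recover approximate float32 weights for inference-time math."""
    return q.astype(np.float32) * scale

# hypothetical weight matrix standing in for one layer of an edge model
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=(64, 64)).astype(np.float32)

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
err = np.max(np.abs(w - w_hat))

# int8 storage is 4x smaller than float32 (4096 vs 16384 bytes here),
# and the worst-case rounding error is bounded by half the scale step
print(q.nbytes, w.nbytes, err)
```

the 4x shrink (plus faster int math on microcontrollers) is why quantization shows up in every one of these sensor-to-edge pipelines; the rounding error stays below half a quantization step, which small models usually tolerate with little accuracy loss.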
the human-computer interaction ideas (EMG bracelet, pupillometry glasses, robotic arm assistant) target hands-free computing from different angles. the bracelet approach is probably the most practical, given that Meta is working on the same thing with its smart-glasses wristbands, which validates the space. sensory capturer and the nose device are the most exploratory — capturing sensory experiences beyond audio/video is a genuinely underdeveloped area that connects to the memory and context cluster, where richer sensory capture enables richer memory recall.