What I Work On
I’m a technologist and entrepreneur with a hybrid background in sensing systems, AI platforms, and human–computer interfaces. My work focuses on building platforms that interface with humans, operate in closed loop under real-world constraints, learn from interaction, and adapt in real time under uncertainty.
I’ve built and led systems spanning LLM platforms, multimodal AI, and sensing, from early research through production deployment.
Current themes include:
- Neural interfaces and neuromodulation
- Closed-loop learning and control
- Voice and language as human–AI interfaces
- Computational phenotyping and personalization
- Platform design across hardware, software, and AI
My long-horizon bet is on neurotechnology and human–AI interfaces: systems that can interface with biology, represent complex signals, and operate in closed loop with humans. I’m particularly interested in how new abstractions across hardware, software, and learning systems unlock step-changes in capability, similar to past computing platform shifts, and in how biology becomes an integrated part of engineering and AI systems.
I’m driven by one question:
What happens when AI and the human brain truly understand each other?
Projects · LinkedIn · Google Scholar
Selected Work
My journey has spanned startups, academia, and translational R&D. I’m currently the Founder and CEO of Althea, where I built a production AI platform for autonomous, agent-driven workflows using LLMs, multimodal systems, and real-time orchestration.
Previously, I led brain sensing technology and R&D at Hyperfine and Liminal, developing multimodal sensing platforms combining hardware, signal processing, and machine learning for real-time brain monitoring.
I have also served as Adjunct Faculty at Yale School of Medicine and as a Research Scientist at Stanford University, working on neural interfaces, sensing systems, and computational modeling across engineering and neuroscience.
- AI Systems & Platforms — Production LLM-based agent systems, multimodal AI, and real-time orchestration platforms for autonomous workflow execution (Althea)
- Sensing, Imaging, Modeling — Multimodal sensing systems, ultrasound and transducer technologies, and end-to-end signal acquisition, modeling, and real-time inference pipelines (Hyperfine, Liminal, Stanford, UCL)
- Neurotechnology — Neural interfaces, brain–computer systems, neuromodulation, and closed-loop decoding and interaction with neural signals (Yale, Stanford, Liminal)
Writing
I write about building real-world AI systems, sensing platforms, human–AI interfaces, and entrepreneurship, often focusing on what’s missing between today’s tools and tomorrow’s platforms.