Portfolio

This page contains a non-exhaustive selection of projects I have worked on, grouped by institution and time period.

For context on my early entrepreneurship experience during my PhD, see the Entrepreneurship section at the end of this page.

Amazon (2023-Present)

At Amazon, I have led the development of multimodal agentic AI assistants that translate large language model capabilities into real-world, task-oriented products across healthcare and commerce.

Multimodal Agentic Assistants for Primary Care

A patient-facing, multimodal assistant designed to support text- and voice-based interactions in primary care, integrating language, audio, and structured workflows under safety and regulatory constraints.

Video Avatars & Multimodal Experiences

Applied research and prototyping of video-avatar-based, multimodal AI experiences that combine speech, vision, and structured interaction flows to support natural, task-oriented interactions.

Buy For Me — Agentic Shopping System

A browser-based agentic system that helps customers complete purchases across third-party websites using multimodal perception, reasoning, and tool-enabled actions.
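For readers less familiar with this kind of system, the general pattern is an observe-reason-act loop: the agent perceives the current page, a planner chooses the next constrained action, and the action is executed through a tool interface. The sketch below is purely illustrative and not a description of the production implementation; the tool names, page states, and rule-based "planner" are hypothetical stand-ins for a browser driver and a language-model planner.

```python
# Illustrative observe-reason-act loop for an agentic shopping flow; all tool
# names, page states, and the rule-based planner below are hypothetical
# stand-ins for a browser driver and a language-model planner.
from dataclasses import dataclass

@dataclass
class PageState:
    url: str
    visible_text: str  # stand-in for a multimodal page summary (DOM text, screenshots)

def choose_action(state: PageState, goal: str) -> dict:
    """Stand-in for a planner mapping (page state, goal) to a constrained tool call."""
    if "Order placed" in state.visible_text:
        return {"tool": "stop", "reason": "purchase complete"}
    if "Add to cart" in state.visible_text:
        return {"tool": "click", "target": "Add to cart"}
    if "Checkout" in state.visible_text:
        return {"tool": "click", "target": "Checkout"}
    return {"tool": "stop", "reason": "no applicable action for goal: " + goal}

def execute(action: dict) -> None:
    """Stand-in for tool execution (click/type/navigate) through a browser driver."""
    print("executing:", action)

# Scripted page states simulate the browser updating after each action.
scripted_pages = [
    PageState("https://example.com/widget", "Product: Example Widget. Button: Add to cart."),
    PageState("https://example.com/cart", "Cart: 1 item. Button: Checkout."),
    PageState("https://example.com/confirmation", "Order placed."),
]

for state in scripted_pages:                                  # observe
    action = choose_action(state, "buy one Example Widget")   # reason
    execute(action)                                           # act
    if action["tool"] == "stop":
        break
```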

Tempus AI (2023)

At Tempus, my work focused on developing applied machine learning systems for clinical decision support, with an emphasis on computer vision models for quantitative analysis of pathology data used in oncology workflows.

Google (2019-2023)

At Google, my work focused on building foundational data and machine learning systems for population-scale healthcare applications.

Project Nightingale

A large-scale clinical data infrastructure and analytics initiative focused on aggregating, standardizing, and enabling analysis of population-scale EHR data to support clinical workflows and machine learning in healthcare.

MIT (2014-2019)

At MIT, my PhD research focused on applying AI and multimodal sensing to the objective measurement of human pain, combining physiological signals, computer vision, and affective computing. In parallel, I pursued several additional projects in human-centered AI.

Injection Study

We investigated the use of electrodermal activity (EDA), heart rate variability (HRV), and facial expression analysis as potential objective endpoints for deriving quantitative pain scores.
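To give a sense of how such signals can become model inputs, the sketch below is a minimal, illustrative example rather than the study's actual pipeline: it assumes pre-extracted inter-beat intervals and a sampled EDA trace, computes a standard HRV feature (RMSSD) and simple EDA summary statistics, and fits an off-the-shelf regressor against self-reported pain ratings. The feature set and the synthetic data are hypothetical.

```python
# Illustrative sketch only: maps physiological features to a pain score.
# Assumes pre-extracted inter-beat intervals (seconds) and a sampled EDA
# trace (microsiemens); this is not the study's actual pipeline.
import numpy as np
from sklearn.linear_model import Ridge

def rmssd(ibi_seconds):
    """RMSSD: root mean square of successive inter-beat-interval differences (ms)."""
    diffs_ms = np.diff(np.asarray(ibi_seconds) * 1000.0)
    return float(np.sqrt(np.mean(diffs_ms ** 2)))

def eda_features(eda_trace):
    """Simple summary statistics of an EDA trace (hypothetical feature set)."""
    trace = np.asarray(eda_trace, dtype=float)
    return [float(trace.mean()), float(trace.std()), float(trace.max() - trace.min())]

def feature_vector(ibi_seconds, eda_trace):
    return np.array([rmssd(ibi_seconds)] + eda_features(eda_trace))

# Synthetic example: a handful of sessions with self-reported pain ratings (0-10).
rng = np.random.default_rng(0)
X = np.stack([
    feature_vector(0.8 + 0.05 * rng.standard_normal(60),    # inter-beat intervals (s)
                   2.0 + 0.3 * rng.standard_normal(500))    # EDA samples (uS)
    for _ in range(20)
])
y = rng.uniform(0, 10, size=20)  # self-reported pain ratings

model = Ridge(alpha=1.0).fit(X, y)
print("predicted pain score:", model.predict(X[:1])[0])
```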

Entrepreneurship

MAIC

Co-founded an early-stage startup, incorporated and based in Singapore, through the Antler accelerator. The company focused on applying AI, computer vision, and IoT to workforce and task management in the construction industry.