
We're building superintelligence to reclaim your attention and focus. Attention is how you navigate the world, learn, and form memories; it's how you experience reality. Today, our attention has been hijacked by algorithms designed to extract data from you and feed Big Tech business models.
Our team is building tools that let you see, protect, and own the most important resource you have: your attention.
We're starting with a Mac app that collects behavioral signals and usage patterns across a person's devices: digital phenotyping for cognitive states. We use on-device ML to classify mental states and activities. This enables new kinds of insight into how you think and work (an Oura Ring for your mind) while laying the foundation for a new generation of attention-preserving adaptive interfaces powered by local models.
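To make "behavioral signals" concrete, here is a minimal sketch of the kind of windowing step that turns raw usage events into features a classifier can consume. The event fields, window length, and summary features are illustrative assumptions, not our actual schema.

```python
# Minimal sketch: aggregate raw usage events into fixed-length windows of
# summary features for on-device classification. Field names (timestamp,
# app, keystrokes, window_switches) are placeholders, not the real schema.
from dataclasses import dataclass
from typing import List

@dataclass
class UsageEvent:
    timestamp: float        # seconds since epoch
    app: str                # frontmost application
    keystrokes: int         # keystrokes in this sampling interval
    window_switches: int    # focus changes in this sampling interval

def _summarize(bucket: List[UsageEvent]) -> dict:
    """Collapse one window of events into a small feature dict."""
    apps = {e.app for e in bucket}
    n = max(len(bucket), 1)
    return {
        "keystroke_rate": sum(e.keystrokes for e in bucket) / n,
        "switch_rate": sum(e.window_switches for e in bucket) / n,
        "distinct_apps": len(apps),
    }

def window_features(events: List[UsageEvent], window_s: float = 60.0) -> List[dict]:
    """Group events into consecutive windows and summarize each one."""
    if not events:
        return []
    events = sorted(events, key=lambda e: e.timestamp)
    start = events[0].timestamp
    windows, bucket = [], []
    for e in events:
        if e.timestamp - start >= window_s and bucket:
            windows.append(_summarize(bucket))
            bucket, start = [], e.timestamp
        bucket.append(e)
    if bucket:
        windows.append(_summarize(bucket))
    return windows
```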
Real ML, not just LLM wrappers. We're training temporal models and random forests on behavioral data (see the training sketch after this list).
Access to real data. We're collecting high-fidelity behavioral signals with a clear path to training novel models.
Lots of compute. You will have access to two NVIDIA DGX Sparks and practically unlimited credits for API calls to frontier models.
Privacy-first architecture. Everything runs on-device. We're exploring federated learning for model improvement without centralized data (a toy federated-averaging sketch also follows this list).
Small team, high ownership. You'll work in a pod of 4. No bureaucracy, no waiting for approvals. Ship, communicate, iterate.
Mission that matters. We're building tools to help people reclaim their attention from systems designed to exploit it.
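As a hedged illustration of the random-forest side of the stack mentioned above, the sketch below trains a classifier on stand-in windowed features with scikit-learn. The feature columns and state labels are placeholders, not our production data.

```python
# Illustrative only: fit a random forest on synthetic "windowed feature" rows
# labeled with made-up cognitive states, then report held-out metrics.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Stand-in data: one row per time window, columns = summary features
# (e.g. keystroke_rate, switch_rate, distinct_apps).
X = rng.normal(size=(500, 3))
y = rng.choice(["focused", "fragmented"], size=500)  # illustrative labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print(classification_report(y_test, clf.predict(X_test)))
```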
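And a toy sketch of the federated-learning direction we're exploring: each simulated device fits a simple logistic-regression model locally, and only weights are averaged centrally, so raw behavioral data never leaves the device. This is an illustration of the idea, not a committed design.

```python
# Toy federated averaging: devices train locally; the server only averages
# weight vectors, never sees raw data. Purely illustrative.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One device's local step: logistic regression via gradient descent."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))
        grad = X.T @ (preds - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_w, device_data):
    """Average locally trained weights across devices."""
    updates = [local_update(global_w, X, y) for X, y in device_data]
    return np.mean(updates, axis=0)

# Simulate three devices with private data.
rng = np.random.default_rng(0)
devices = [
    (rng.normal(size=(100, 4)), rng.integers(0, 2, size=100).astype(float))
    for _ in range(3)
]
w = np.zeros(4)
for _ in range(10):
    w = federated_round(w, devices)
print(w)
```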
Right now, parts of the ML workflow are still too manual (e.g., data export → model runs → results). We're hiring a senior engineer to make the Python-side ML system reliable and easy to iterate on: data processing, feature pipelines, on-device inference, training pathways, evaluation, and privacy constraints. For local training and inference, you will have access to two NVIDIA DGX Sparks.
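As a rough picture of the automation this role owns, here is a minimal sketch that expresses the manual export → features → train → evaluate loop as one reproducible script. The paths, stage names, and stage bodies are placeholders standing in for the real pipeline.

```python
# Placeholder pipeline: one entry point that chains export, feature building,
# and training/evaluation into a single reproducible run directory.
from pathlib import Path
import json

def export_data(out_dir: Path) -> Path:
    """Stage 1: pull raw behavioral logs into a versioned run directory."""
    out_dir.mkdir(parents=True, exist_ok=True)
    raw = out_dir / "events.jsonl"
    raw.write_text("")  # placeholder for the real on-device export
    return raw

def build_features(raw: Path, out_dir: Path) -> Path:
    """Stage 2: windowed feature extraction from the exported events."""
    features = out_dir / "features.json"
    features.write_text(json.dumps([]))  # placeholder
    return features

def train_and_evaluate(features: Path, out_dir: Path) -> dict:
    """Stage 3: fit the model and write evaluation metrics."""
    metrics = {"accuracy": None}  # placeholder
    (out_dir / "metrics.json").write_text(json.dumps(metrics))
    return metrics

if __name__ == "__main__":
    run_dir = Path("runs/latest")
    raw = export_data(run_dir)
    features = build_features(raw, run_dir)
    print(train_and_evaluate(features, run_dir))
```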
Must-haves
Nice to have