Lemai Tech builds foundation models, edge vision, and supply-chain agents that design, source, and personalize the next billion pet meals and playtimes across 15 markets.
Five models and agents, trained on 1B+ anonymized pet interactions. Ship faster, personalize deeper, and keep a paw on every supply lane.
Our multimodal pet-nutrition and wellness assistant. Answers owner questions with vet-grade reasoning, grounded in 12M product SKUs and peer-reviewed veterinary literature.
Per-pet diet formulation engine. Consumes breed, age, activity, and allergen signals to generate recipes that manufacturers can produce at batch scale.
On-device CV model for cameras, feeders, and smart collars. Detects gait anomalies, mood, and food-intake patterns, running in under 60 ms on a $3 MCU.
Reinforcement-learning policy that lives inside interactive toys. Learns each pet's engagement curve in hours; upgraded OTA from our fleet telemetry.
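How a toy could learn an engagement curve quickly can be illustrated with a bandit: a minimal sketch, assuming a UCB1 policy over discrete difficulty levels. The actual SmartToy policy is not public; `simulated_engagement` and the engagement curve here are invented for illustration.

```python
import numpy as np

def ucb_difficulty(engagement_fn, n_levels=5, rounds=1000):
    """UCB1 bandit over discrete toy-difficulty levels.

    engagement_fn(level) returns a noisy engagement score in [0, 1];
    the policy converges on the level this pet engages with most.
    """
    counts = np.zeros(n_levels)
    totals = np.zeros(n_levels)
    for t in range(rounds):
        if t < n_levels:
            level = t  # play each difficulty level once first
        else:
            means = totals / counts
            bonus = np.sqrt(2 * np.log(t) / counts)  # exploration bonus
            level = int(np.argmax(means + bonus))
        totals[level] += engagement_fn(level)
        counts[level] += 1
    return int(np.argmax(totals / counts))  # best estimated level

# Hypothetical pet: engagement peaks at mid difficulty (level 2).
rng = np.random.default_rng(0)
TRUE_CURVE = [0.2, 0.5, 0.9, 0.5, 0.2]

def simulated_engagement(level):
    return float(np.clip(TRUE_CURVE[level] + rng.normal(0, 0.05), 0, 1))

best = ucb_difficulty(simulated_engagement)
```

The exploration bonus shrinks as a level is played more, so the policy settles on the pet's preferred difficulty without getting stuck on an early lucky reading.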
Autonomous orchestration across suppliers, warehouses, and 25 online channels. Forecasts demand, re-routes shipments, and negotiates pricing — hands-off.
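The forecasting step at the core of such an agent can be sketched very simply. This is a minimal illustration using exponential smoothing and a reorder-point rule, not SupplyAgent's actual model; `lead_time_days` and `safety_stock` are invented parameters.

```python
def forecast_demand(history, alpha=0.3):
    """One-step-ahead demand forecast via simple exponential smoothing.

    alpha weights recent observations more heavily; history is a list
    of daily unit demand.
    """
    level = history[0]
    for x in history[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

def reorder_point(history, lead_time_days, safety_stock):
    """Reorder when stock falls below expected lead-time demand plus buffer."""
    return forecast_demand(history) * lead_time_days + safety_stock
```

A production system would layer seasonality, promotions, and per-channel signals on top, but the forecast-then-threshold loop is the same shape.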
Safety layer on every model we ship. Red-teamed by veterinarians; blocks unsafe recommendations and surfaces health-risk flags to human reviewers.
We invest in open research on animal-centric AI. Our models combine vision, audio, and telemetry with RLHF from owners and veterinarians.
Self-supervised pretraining on 40,000 hours of video and sensor telemetry across 120 breeds. Transfers zero-shot to species we never trained on.
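One common way to pretrain on paired video and sensor telemetry is a contrastive objective. A minimal sketch, assuming an InfoNCE-style loss over time-aligned embedding pairs (the embedding dimensions and data here are synthetic; this is not the published training recipe):

```python
import numpy as np

def info_nce_loss(z_a, z_b, temperature=0.1):
    """Contrastive InfoNCE loss over paired embeddings.

    z_a, z_b: (batch, dim) L2-normalized embeddings from two views
    (e.g. a video clip and its time-aligned sensor telemetry).
    Matching rows are positives; every other row is a negative.
    """
    logits = z_a @ z_b.T / temperature           # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # positives on the diagonal

def normalize(x):
    return x / np.linalg.norm(x, axis=1, keepdims=True)

rng = np.random.default_rng(0)
video_emb = normalize(rng.normal(size=(8, 32)))
# Simulated well-aligned telemetry embeddings (small perturbation of video).
sensor_emb = normalize(video_emb + 0.05 * rng.normal(size=(8, 32)))
loss = info_nce_loss(video_emb, sensor_emb)
```

Because the objective only requires that aligned pairs embed closer than mismatched ones, the learned representation is not tied to any breed label, which is what makes zero-shot transfer plausible.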
Low-cost fusion of audio, pose, and thermal signals. 3.2% error on breed-agnostic anomaly detection, running in 64 KB of SRAM.
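To see how a multimodal classifier fits in a 64 KB budget: a sketch of early fusion with an int8-quantized two-layer MLP. The per-modality feature sizes and layer widths below are assumptions for illustration, not the shipped architecture.

```python
import numpy as np

# Hypothetical per-modality feature sizes (illustrative, not the real spec).
AUDIO_DIM, POSE_DIM, THERMAL_DIM = 32, 24, 8
HIDDEN, CLASSES = 48, 2  # anomaly vs. normal

rng = np.random.default_rng(1)
# int8 weights, as a tight SRAM budget implies quantized storage.
w1 = rng.integers(-127, 128,
                  size=(AUDIO_DIM + POSE_DIM + THERMAL_DIM, HIDDEN),
                  dtype=np.int8)
w2 = rng.integers(-127, 128, size=(HIDDEN, CLASSES), dtype=np.int8)

def fuse_and_classify(audio, pose, thermal, scale=1 / 127):
    """Early fusion: concatenate modality features, run a tiny MLP.

    Simplified requantization: real int8 inference uses per-layer
    scales and zero points; here a single scale stands in for both.
    """
    x = np.concatenate([audio, pose, thermal]).astype(np.int32)
    h = np.maximum(x @ w1.astype(np.int32), 0)   # ReLU in int32 accumulators
    logits = (h * scale) @ w2.astype(np.int32) * scale
    return int(np.argmax(logits))

weight_bytes = w1.nbytes + w2.nbytes  # total parameter storage
```

Even with generous layer widths, the weights occupy a few kilobytes, leaving most of the 64 KB for activations and the rest of the firmware.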
A preference-learning protocol that incorporates owner feedback without overfitting to individual bias. Deployed in NutriAgent and SmartToy.
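One way to absorb owner feedback without letting any single opinionated owner skew the model is to split scores into a shared component and a regularized per-owner deviation. A minimal Bradley-Terry-style sketch under that assumption (this is an illustration of the idea, not the deployed protocol):

```python
import numpy as np

def fit_preferences(pairs, owners, n_items, n_owners,
                    lam=2.0, lr=0.1, steps=2000):
    """Bradley-Terry preference model with regularized per-owner deviations.

    pairs: list of (winner, loser) item indices; owners[i] says which owner
    labeled pairs[i]. Item scores s are shared across owners; each owner's
    deviation d[o] is shrunk toward zero by the L2 penalty `lam`, so one
    owner's idiosyncratic taste cannot dominate the shared ranking.
    """
    s = np.zeros(n_items)               # shared item scores
    d = np.zeros((n_owners, n_items))   # per-owner deviations
    for _ in range(steps):
        gs = np.zeros(n_items)
        gd = 2 * lam * d                # L2 penalty pulls deviations to zero
        for (w, l), o in zip(pairs, owners):
            delta = (s[w] + d[o, w]) - (s[l] + d[o, l])
            p = 1 / (1 + np.exp(-delta))  # P(this owner prefers w over l)
            g = 1 - p
            gs[w] -= g; gs[l] += g
            gd[o, w] -= g; gd[o, l] += g
        s -= lr * gs
        d -= lr * gd
        s -= s.mean()                   # identifiability: scores sum to zero
    return s, d

# Three owners each prefer item 0 a few times; one owner insists on item 1.
pairs = [(0, 1)] * 9 + [(1, 0)] * 5
owners = [0] * 3 + [1] * 3 + [2] * 3 + [3] * 5
s, d = fit_preferences(pairs, owners, n_items=2, n_owners=4)
```

The dissenting owner's repeated votes are absorbed into their own deviation term, so the shared scores still reflect the majority preference.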
A framework for pre-deployment red-teaming by licensed veterinarians. Open-sourced; 2,100+ stars on GitHub.
The proven distribution and supply stack behind Lemai Tech — now operated by autonomous agents end to end.
Every vertical below runs on a shared agentic stack, so insights compound across products.
Recipe generation, allergen-aware personalization, batch optimization.
Adaptive play policies, RL-tuned difficulty, on-device safety.
Cameras, feeders, collars running PawVision at the edge.
Anonymized telemetry pipeline for licensed research partners.
Manufacturers, marketplaces, and research partners — we’re onboarding a small cohort this quarter.