EdgeLens turns existing smart home cameras into private, on-device AI companions. We analyze context, not pixels—helping families care for seniors, babies, and pets, and improve healthy eating without streaming their lives to the cloud.
Millions of homes have cameras, but most are still motion-first and cloud-dependent for “smart” features.
For OEMs, that means rising cloud storage and AI inference costs as usage scales, layered on top of commoditized hardware margins.
For families, it means worrying about privacy in their most sensitive spaces, like living rooms, bedrooms, and nurseries, because “smart” too often means sending video off-device.
ElderCare Vision Mode for Indoor Cameras. A private, always-on fall detection and wellness agent.
Detects falls instantly on-device. No video leaves the room, preserving dignity.
“Mom was active today, took meds at 9am.” Generated locally by our SLM.
Automatic item detection, real-time inventory, and freshness tracking.
Tracks consumption and provides health-first diet recommendations.
Expiry-driven alerts and usage intelligence to reduce food waste.
Auto-shopping lists & 1-tap ordering via Instacart & Kroger.
Iris is an On-device Vision AI Agent that sees inside your fridge to understand household needs. It acts proactively to help you eat healthier, prevent food waste, and automate grocery shopping.
Our proprietary Orchestration Engine moves heavy AI workloads from the cloud to the edge.
Compact, domain-tuned models for wellness & care running locally on camera chips and home hubs (≤8GB RAM).
ATLAS dynamically manages AI workloads. It runs 99% of inference locally, reaching the cloud only for firmware updates, never for raw video processing or object detection.
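As a purely illustrative sketch (not EdgeLens’s actual implementation), an edge-first placement policy of the kind ATLAS describes could be modeled as a privacy-gated decision: raw video is pinned to the device, and everything else runs locally when memory and latency budgets allow. All names and thresholds below are assumptions for illustration.

```python
from dataclasses import dataclass

# Hypothetical sketch of an edge-first placement policy.
# All names and thresholds are illustrative assumptions,
# not EdgeLens's actual ATLAS internals.

@dataclass
class Workload:
    name: str
    handles_raw_video: bool   # privacy-sensitive input?
    mem_mb: int               # estimated working-set size
    latency_budget_ms: int    # SLA for this task

@dataclass
class DeviceState:
    free_mem_mb: int
    est_local_latency_ms: int
    est_cloud_latency_ms: int  # network RTT + cloud inference

def place(workload: Workload, device: DeviceState) -> str:
    """Return 'edge' or 'cloud' for a workload, edge-first."""
    # Hard privacy rule: raw video never leaves the device.
    if workload.handles_raw_video:
        return "edge"
    # Run locally whenever the device can hold the model
    # and still meet the latency SLA.
    fits = workload.mem_mb <= device.free_mem_mb
    meets_sla = device.est_local_latency_ms <= workload.latency_budget_ms
    if fits and meets_sla:
        return "edge"
    # Fall back to cloud only when it can meet the SLA.
    if device.est_cloud_latency_ms <= workload.latency_budget_ms:
        return "cloud"
    return "edge"  # degrade locally rather than miss the privacy goal

fall_detect = Workload("fall_detection", True, 512, 100)
summary = Workload("daily_summary", False, 4096, 2000)
device = DeviceState(free_mem_mb=2048, est_local_latency_ms=80,
                     est_cloud_latency_ms=400)
print(place(fall_detect, device))  # edge (raw video stays local)
print(place(summary, device))      # cloud (model too large for device)
```

The key design choice in such a policy is ordering: privacy acts as an absolute constraint, while cost and latency are optimized only within it.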
Starting with Guardian & Iris, expanding to the whole home.
On-device care mode for indoor cameras (elder wellness & safety).
On-device kitchen vision for smart fridges (nutrition & food awareness).
Pet care + baby sleep insight.
[SAN JOSE, CA] — EdgeLens AI today announced Early Access for EdgeLens ATLAS™, a production-grade orchestration platform that automatically decides where AI workloads should run (on-device vs. cloud) and how they should run to meet strict SLAs, cost budgets, and privacy requirements.
ATLAS™ uses a dual-orchestrator engine to reduce cloud spend by up to 60% and cut p95 latency by roughly 50%. General Availability (GA) is planned for Q1 2026, following live showcases at CES 2026 in Las Vegas.
AI teams are under pressure to ship richer AI experiences while controlling costs and privacy. ATLAS converts cloud-heavy workflows into edge-first execution through a predict-first dual orchestrator that adapts in real time to device resources and network conditions.
“Cloud AI inference costs are rising fast and squeezing device OEM margins. ATLAS keeps AI experiences fast and private while reducing cloud spend by dynamically orchestrating AI workloads across the device and the cloud with an edge-first approach.”
— Nagendra Kumar, Founder & CEO of EdgeLens AI

EdgeLens ATLAS™ is available today for Early Access partners on NVIDIA Jetson, Apple Silicon, and Qualcomm Snapdragon hardware families. Live demos will be shown at CES 2026 in Las Vegas this January.
Founded by a former Amazon Alexa AI leader, EdgeLens AI debuts its “AI Vision Agent” for smart refrigerators to eliminate cloud dependency, latency, and privacy concerns.
[SAN JOSE, CALIFORNIA] – EdgeLens AI, a new AI startup, today launched its exclusive beta program to bring powerful, on-device generative AI to the Smart Home ecosystem. The company is debuting its “AI Vision Agent” for smart refrigerators, empowering manufacturers to offer next-generation features without the latency, cost, and privacy concerns of the cloud.
In a direct response to growing consumer demand for data privacy and industry pressure to reduce ongoing cloud-related expenses, EdgeLens AI’s platform runs complex AI tasks entirely on the device’s local hardware.
“During my time leading Alexa’s AI, it became clear that the future of ambient computing couldn’t be fully realized while tethered to the cloud. Users expect instantaneous interactions and demand strong data privacy.”
— Nag Lavu, Founder and CEO of EdgeLens AI

EdgeLens AI’s first product for pilot partners, the AI Vision Agent, transforms a standard smart fridge camera into an intelligent kitchen assistant.
Nag Lavu
Founder & CEO
[email protected]
We’ve been building AI products for millions of smart home devices at Amazon Alexa, Google Health, Apple and Walmart Labs. We’ve seen the power of AI at scale, but we have also seen its limits.
Most “smart” devices today are still just motion detectors with an app. They stream video to the cloud, create privacy risks, and hope the user has time to watch endless clips.
We started EdgeLens AI because the next wave of Home AI must be private, instant, and proactive.
We are building the intelligence that stays in the room—quietly watching for risk, helping you understand patterns, and supporting better decisions for the people you love.