DataSapien has launched an open beta of its device-native artificial intelligence platform, aimed at enabling app publishers to run small-scale AI models directly on smartphones and tablets.
The company says that by facilitating on-device processing, it can reduce operational costs, protect user data, and enable private AI-powered functionality without using the cloud.
Cost challenges
Many companies investing in AI technology face high cloud computing costs and unpredictable per-token fees from third-party providers. Industry statistics suggest that, despite an estimated $109 billion in worldwide AI investment, 74% to 80% of AI deployments fail to realize commercial value, often due to high infrastructure costs and low user engagement.
DataSapien’s platform provides a model marketplace where app publishers can select and integrate optimized small language models (SLMs) designed to run efficiently on consumer devices without an active internet connection. Available models include Google’s Gemma 3n, LiquidAI’s LFM Nanos, Meta’s Llama 3, and Microsoft’s Phi-4, each optimized for mobile and edge applications.
Privacy considerations
The growing use of cloud-based AI has raised persistent concerns about privacy and data management. DataSapien’s architecture processes information locally, limiting the sharing of user data and reducing the risk of personal information being exposed or monetized externally. The system relies on what it calls a personal data store – a secure, private repository of contextual information such as health, financial, and location data – designed so that this data never leaves the device.
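DataSapien has not published its SDK interfaces, but the personal-data-store pattern the article describes can be sketched in a few lines. The class name, file layout, and categories below are illustrative assumptions, not DataSapien's API; the point is simply that reads and writes touch local storage only, so contextual data stays on the device.

```python
import json
from pathlib import Path

class PersonalDataStore:
    """Illustrative on-device store: values are persisted to a local file
    only and are never transmitted to a server."""

    def __init__(self, path: Path):
        self.path = path
        # Load any previously saved data from local storage.
        self._data = json.loads(path.read_text()) if path.exists() else {}

    def put(self, category: str, key: str, value):
        # Write goes to the local filesystem; no network call is involved.
        self._data.setdefault(category, {})[key] = value
        self.path.write_text(json.dumps(self._data))

    def get(self, category: str, key: str, default=None):
        # Reads are served entirely from device-local state.
        return self._data.get(category, {}).get(key, default)

store = PersonalDataStore(Path("pds.json"))
store.put("health", "resting_hr", 62)
store.put("location", "last_city", "London")
print(store.get("health", "resting_hr"))  # → 62, read locally
```

An on-device model could then draw on this context for personalization without any of it crossing the network boundary.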
StJohn Deakins, co-founder and CEO of DataSapien, said current cloud-based AI models are problematic for enterprises, especially when considering privacy and operational costs.
“The cloud AI model is broken. Businesses are now trapped. To get smart AI, they have to hand over customer data to big tech and pay ‘taxes’ on every interaction. It’s expensive, time-consuming, and risky.”
“We will return permissions to device owners and app publishers. With the release of amazing small models like Gemma 3n and LiquidAI, the smartest AI is no longer in the data center, but in your pocket. We just built a bridge to get there. Our customers see a 44x increase in engagement and a 100% reduction in cloud AI charges. This is the platform shift mobile has been waiting for,” Deakins said.
Customizable deployment
The platform includes no-code, visual tools for designing AI-driven user journeys, allowing publishers to switch quickly between AI models without redeploying their apps. This flexibility may appeal to those who want to adopt new AI developments as they emerge without risking service interruptions or added technical debt.
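Under the hood, switching models without a redeploy is commonly achieved with config-driven dispatch. The sketch below is a generic illustration of that pattern, not DataSapien's implementation: the model names echo those in the article, and the stub generate functions stand in for real on-device inference runtimes.

```python
# Illustrative sketch: route requests to whichever on-device model the
# current configuration names. Updating the config (e.g. fetched remotely
# at launch) changes the model without shipping a new app binary.
from typing import Callable, Dict

ModelFn = Callable[[str], str]

# Stubs standing in for real on-device SLM runtimes.
MODEL_REGISTRY: Dict[str, ModelFn] = {
    "gemma-3n": lambda prompt: f"[gemma-3n] {prompt}",
    "lfm-nano": lambda prompt: f"[lfm-nano] {prompt}",
}

config = {"active_model": "gemma-3n"}  # hypothetical fetched config

def generate(prompt: str) -> str:
    # Look up the currently configured model at call time.
    return MODEL_REGISTRY[config["active_model"]](prompt)

print(generate("hello"))             # served by gemma-3n
config["active_model"] = "lfm-nano"  # flip config at runtime
print(generate("hello"))             # now served by lfm-nano, no redeploy
```

Because the dispatch happens at call time, a publisher can A/B-test or roll back models by changing one configuration value.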
In addition to cost savings, DataSapien says early adopters of the platform during the testing phase saw a 44x increase in user interactions compared with traditional cloud-based mobile customer experience tools.
Regulatory background
The emergence of data protection regulations and growing consumer concern about privacy are forcing companies to rethink how they process and store user data. Local, device-driven processing models could offer a way to comply with the changing regulatory landscape while delivering personalized digital services directly within mobile applications.
The platform is currently available in open beta and provides access to the DataSapien software development kit and a library of supported models.
