The promise of AI remains immense – but one thing might be holding it back. “The infrastructure that powers AI today won’t sustain tomorrow’s demands,” a recent CIO.com article opens. “CIOs must rethink how to scale smarter – not just bigger – or risk falling behind.”
CrateDB agrees – and the database firm is betting on solving the problem by being a ‘unified data layer for analytics, search, and AI.’
“The challenge is that most IT systems are relying, or have been built, around batch pipeline or asynchronous pipeline, and now you need to reduce the time between the production and the consumption of the data,” Stephane Castellani, SVP marketing, explains. “CrateDB is a very good fit because it really can give you insights to the right data with also a large volume and complexity of formats in a matter of milliseconds.”
A blog post notes the four-step process for CrateDB to act as the ‘connective tissue between operational data and AI systems’: from ingestion, to real-time aggregation and insight, to serving data to AI pipelines, to enabling feedback loops between models and data. The velocity and variety of data are key; Castellani notes the reduction of query times from minutes to milliseconds. In manufacturing, telemetry can be collected from machines in real time, giving predictive maintenance models more to learn from.
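To make the first two of those steps concrete, here is a minimal sketch of the ingest-then-aggregate pattern using CrateDB’s Python client. The table name, columns, connection URL, and sample readings are illustrative assumptions rather than the schema described in the article.

```python
# Minimal sketch: ingest machine telemetry, then run a real-time aggregation.
# Table, columns, and connection URL are hypothetical examples.
from datetime import datetime, timezone

from crate import client  # CrateDB Python DB API client (pip install crate)

conn = client.connect("http://localhost:4200")  # assumed local CrateDB node
cursor = conn.cursor()

# Hypothetical telemetry table for machine sensor readings.
cursor.execute("""
    CREATE TABLE IF NOT EXISTS machine_telemetry (
        machine_id TEXT,
        ts TIMESTAMP WITH TIME ZONE,
        temperature DOUBLE PRECISION,
        vibration DOUBLE PRECISION
    )
""")

# Step 1: ingest readings as they arrive from the shop floor.
cursor.executemany(
    "INSERT INTO machine_telemetry (machine_id, ts, temperature, vibration) "
    "VALUES (?, ?, ?, ?)",
    [("press-01", datetime.now(timezone.utc), 71.3, 0.02),
     ("press-02", datetime.now(timezone.utc), 69.8, 0.05)],
)

# Make freshly inserted rows visible to the query below.
cursor.execute("REFRESH TABLE machine_telemetry")

# Step 2: a real-time aggregation a predictive-maintenance model or dashboard
# could poll, e.g. per-minute averages and peak vibration per machine.
cursor.execute("""
    SELECT machine_id,
           date_trunc('minute', ts) AS minute,
           avg(temperature) AS avg_temp,
           max(vibration) AS peak_vibration
    FROM machine_telemetry
    GROUP BY machine_id, minute
    ORDER BY minute DESC
    LIMIT 10
""")
for row in cursor.fetchall():
    print(row)
```

In this pattern the same store handles both the high-velocity writes and the millisecond-level analytical reads, which is the property the blog post attributes to CrateDB.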
There is another benefit, as Castellani explains. “Some also use CrateDB in the factory for knowledge assistance,” he says. “If something goes wrong, you have a specific error message appear on your machine and say ‘I’m not an expert with this machine, what does it mean and how can I fix it?’, [you] can ask a knowledge assistant, that is also relying on CrateDB as a vector database, to get access to the information, and pull the right manual and right instructions to react in real-time.”
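The retrieval step behind such a knowledge assistant can be sketched as follows, assuming CrateDB’s FLOAT_VECTOR column type and KNN_MATCH vector search. The manuals table, the embed() helper, and the vector dimensionality are hypothetical placeholders, not the factory deployment Castellani describes.

```python
# Minimal sketch of the knowledge-assistant retrieval step over a vector store.
from crate import client


def embed(text: str) -> list[float]:
    # Stand-in for a real embedding model; returns a fixed-size dummy vector.
    values = [float(ord(c) % 7) for c in text[:384]]
    return values + [0.0] * (384 - len(values))


conn = client.connect("http://localhost:4200")  # assumed local CrateDB node
cursor = conn.cursor()

# Hypothetical table holding chunks of machine manuals plus their embeddings.
cursor.execute("""
    CREATE TABLE IF NOT EXISTS manual_chunks (
        machine_id TEXT,
        chunk TEXT,
        embedding FLOAT_VECTOR(384)
    )
""")

# Given the error message shown on the machine, find the most relevant
# manual passages to hand to the LLM as context.
error_message = "E-4012: spindle over-temperature shutdown"
cursor.execute(
    """
    SELECT chunk, _score
    FROM manual_chunks
    WHERE machine_id = ? AND knn_match(embedding, ?, 5)
    ORDER BY _score DESC
    """,
    ("press-01", embed(error_message)),
)
context_passages = [row[0] for row in cursor.fetchall()]
```

The assistant would then feed those passages into its prompt, so the operator gets the right manual section and instructions back in real time.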
AI, however, does not stand still for long; “we don’t know what [it] is going to look like in a few months, or even a few weeks”, notes Castellani. Organisations are looking to move towards fully agentic AI workflows with greater autonomy, yet according to recent PYMNTS Intelligence research, manufacturing – as part of the wider goods and services industry – is lagging. CrateDB has partnered with Tech Mahindra on this front to help provide agentic AI solutions for automotive, manufacturing, and smart factories.
Castellani notes excitement about the Model Context Protocol (MCP), which standardises how applications provide context to large language models (LLMs). He likens it to the trend around enterprise APIs 12 years ago. CrateDB’s MCP Server, which is still at the experimental stage, serves as a bridge between AI tools and the analytics database. “When we talk about MCP it’s pretty much the same approach [as APIs] but for LLMs,” he explains.
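The idea behind that bridge can be illustrated with a short sketch using the MCP Python SDK: a server exposes the database as a tool that any MCP-capable LLM client can call. This is an illustration of the pattern, not CrateDB’s actual experimental MCP Server; the tool name and query behaviour are assumptions.

```python
# Minimal sketch of an MCP server exposing a database query tool to LLM clients.
from crate import client
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("cratedb-demo")


@mcp.tool()
def query_telemetry(sql: str) -> list:
    """Run a SQL query against the analytics database and return the rows."""
    conn = client.connect("http://localhost:4200")  # assumed local CrateDB node
    cursor = conn.cursor()
    cursor.execute(sql)
    return cursor.fetchall()


if __name__ == "__main__":
    # Speak MCP over stdio so an LLM client (e.g. an IDE assistant) can discover
    # and call the query_telemetry tool.
    mcp.run()
```

Just as enterprise APIs gave applications a standard way to call each other, a server like this gives LLM-based tools a standard way to reach the data layer.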
Tech Mahindra is just one of CrateDB’s key partners going forward. “We keep focusing on our basics,” Castellani adds. “Performance, scalability… investing into our capacity to ingest data from more and more data sources, and always minimis[ing] the latency, both on the ingestion and query side.”
Stephane Castellani will be speaking at AI & Big Data Expo Europe on the topic of Bringing AI to Real-Time Data – Text2SQL, RAG, and TAG with CrateDB, and IoT Tech Expo Europe on the topic of Smarter IoT Operations: Real-Time Wind Farm Analytics and AI-Driven Diagnostics.