Redis: enabling the future of real-time AI and ML
Taimur Rashid is a seasoned business development executive with a track record of developing go-to-market strategies, building new market capabilities, and driving revenue growth and market expansion.
Over the past fifteen years, he has held senior leadership positions in sales, business development, customer success, and product development at publicly traded companies including Amazon Web Services (AWS), Microsoft, and Oracle.
Now, as the Chief Business Development Officer at Redis, Rashid shares his insight into Redis and its application of artificial intelligence (AI) and machine learning (ML).
Can you tell me about Redis and your roles and responsibilities there?
Redis is the company behind open-source Redis, the world’s most loved in-memory database, and the commercial provider of Redis Enterprise, a real-time data platform. As Chief Business Development Officer, I oversee emerging business and commercial strategy at Redis. This includes a variety of functional areas including strategic business development, corporate development, and incubation of new initiatives. I am currently leading a new initiative related to AI/ML and the future of work.
How is Redis enabling the future of real-time AI and ML?
We are enabling the future of real-time AI and ML across two specific themes.
The first is data infrastructure modernisation. Companies can enrich their real-time applications with ML capabilities by deploying Redis as an online feature store for low-latency ML. Within the production phase of MLOps, Redis can also serve as a low-latency data store for feature serving. Companies like Uber, Gojek, DoorDash, Netflix, Spotify, Airbnb, and many others have started to share their ML reference architectures, and at the heart of these architectures is the feature store, the interface between data and models.
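The online feature-store pattern described above can be sketched in a few lines. This is a minimal illustration using a plain Python dict to stand in for Redis so it runs anywhere; in a real deployment the same reads and writes map onto redis-py hash commands (`hset`/`hgetall`, noted in comments). The entity key and feature names are hypothetical examples, not Redis or Uber specifics.

```python
import json

# In-memory stand-in for Redis; a real feature store would use a
# redis.Redis client and the hash commands noted below.
store = {}

def write_features(entity_id: str, features: dict) -> None:
    """Offline or streaming pipeline writes the latest features for an entity."""
    # With redis-py: r.hset(f"features:{entity_id}", mapping={...})
    store[f"features:{entity_id}"] = {k: json.dumps(v) for k, v in features.items()}

def read_features(entity_id: str) -> dict:
    """Model server fetches features at inference time with low latency."""
    # With redis-py: r.hgetall(f"features:{entity_id}")
    raw = store.get(f"features:{entity_id}", {})
    return {k: json.loads(v) for k, v in raw.items()}

# Ingestion side (batch or streaming job) writes features...
write_features("user:42", {"avg_order_value": 31.5, "orders_last_7d": 3})

# ...and the serving side reads them back just before scoring a model.
features = read_features("user:42")
```

The key design point is the split: feature computation happens offline or on a stream, while the model server only performs a single low-latency key lookup at prediction time.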
The second is building intelligent applications to drive business outcomes. For this category, Redis is used as a vector database to store vector embeddings, which are numerical representations of raw data. The raw data can be audio, video, images, or even unstructured text within documents. Once stored in Redis, these vector embeddings can be used for similarity searches to power intelligent applications like recommendation systems, visual search, fraud detection, and much more.
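A similarity search over embeddings can be illustrated without any infrastructure. The sketch below uses pure Python in place of Redis vector search (the actual RediSearch KNN query syntax is omitted); the item names and three-dimensional embeddings are made up for illustration, where real embeddings typically have hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vector embeddings keyed by item, as they might be stored in Redis.
embeddings = {
    "red sneaker":  [0.9, 0.1, 0.0],
    "blue sneaker": [0.8, 0.2, 0.1],
    "wool coat":    [0.1, 0.9, 0.3],
}

def most_similar(query_vec, k=2):
    """Return the k items whose embeddings are closest to the query."""
    ranked = sorted(embeddings.items(),
                    key=lambda item: cosine_similarity(query_vec, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

# A query embedding near the "sneaker" region of the space
# retrieves the two sneakers rather than the coat.
results = most_similar([0.85, 0.15, 0.05])
```

This nearest-neighbour lookup is exactly what powers the recommendation, visual search, and fraud-detection use cases mentioned above: the query embedding comes from a user, image, or transaction, and the database returns the stored items closest to it.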
Why do you think it is important to incorporate AI and ML with cloud-based operations?
Organisations that use machine learning to power real-time, customer-facing interactions every day care deeply about performance and cost. Typical use cases include recommendations, search ranking, real-time pricing, and fraud detection. Redis is ideally suited to meet the needs of AI/ML applications that are latency sensitive and that need to scale to manage complex models and ever-growing data sets.
Why do you think AI and ML are becoming a core part of businesses?
Across industries we are witnessing digital innovation accelerating at a record pace. Since the start of the pandemic, organisations have been adopting digital technologies to modernise digital interactions for customers, drive more engagement, and secure their back-end architectures to future-proof their businesses. Increasing digitisation means more data is generated at high velocity, allowing more insights to be gleaned from it. AI/ML are becoming a core part of business because they are key to enabling business transformation.
What can we expect from Redis and its use of AI/ML-enabled technology in the future?
We expect Redis to be a cornerstone in the MLOps architecture and lifecycle as the low-latency data store for real-time ML. This is a foundational architectural approach for organisations that want to modernise their data infrastructure and architecture to support real-time applications. By democratising Redis as a vector database, we expect the pace of AI innovation to accelerate as more developers and organisations can build intelligent capabilities that can be infused within existing applications.