Redis Labs Delivers Powerful Data Platform for Next Wave of AI Applications

RedisAI improves inferencing performance and efficiency by up to 10x by serving models within Redis, where the reference data lives

RedisGears is a serverless engine for infinite data-processing options at millisecond speed across Redis data structures, modules, and cluster nodes

MOUNTAIN VIEW, Calif.–(BUSINESS WIRE)–Redis Labs, the home of Redis and provider of Redis Enterprise, today announced the general availability of RedisAI and RedisGears, previewed last week at RedisConf 2020 Takeaway. Together, RedisAI and RedisGears transform Redis Enterprise into a low-latency, real-time data platform with infinite processing capabilities across any data model, while simultaneously serving artificial intelligence (AI) inference requests, all within Redis.

According to Gartner1, “Through 2024, the shift from pilot to production artificial intelligence (AI) will drive a 5x increase in streaming data analytics infrastructure.” The report further states, “Getting AI into production requires IT leaders to complement DataOps and ModelOps with infrastructures that enable end users to embed trained models into streaming-data infrastructures to deliver continuous near-real-time predictions.”

“We’ve heard the challenges customers have as they move AI into production, in particular the end-to-end AI serving time, which in many cases is influenced by the time it takes to collect, prepare, and feed the data to the AI serving engine. RedisAI and RedisGears were designed to solve this problem by reducing the end-to-end AI serving time to milliseconds,” said Yiftach Shoolman, CTO and co-founder at Redis Labs. “With Redis Enterprise as the perfect high-performance and scalable foundation, RedisAI and RedisGears will enable our customers to successfully utilize AI technologies to create operational efficiencies and solve real business problems in real time.”

RedisAI: Delivering predictions that help the business win

RedisAI (co-developed by Redis Labs and Tensorwerk) is the answer to a challenge every architect or developer faces when designing and implementing AI in production applications: the time spent outside the AI inference engine collecting and preparing the reference data. With the AI serving engine inside Redis, RedisAI reduces the time spent on these external processes and can deliver up to 10x more inferences than other AI serving platforms, at much lower latency. Many leading AI-driven applications, such as fraud detection, transaction scoring, ad serving, recommendation engines, image recognition, autonomous vehicles, and game monetization, will achieve dramatically better business outcomes with these performance improvements.
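As a rough illustration of what serving a model next to its reference data looks like, the sketch below stores a pre-trained TensorFlow graph in Redis and runs an inference entirely inside the database. It assumes a local Redis instance with the RedisAI module loaded and the redis-py client; the key names, model file, graph input/output node names, and feature values are all hypothetical, and the command syntax follows the RedisAI 1.0 documentation (AI.MODELSET, AI.TENSORSET, AI.MODELRUN, AI.TENSORGET).

```python
import numpy as np
import redis

r = redis.Redis(host="localhost", port=6379)

# Store a pre-trained TensorFlow graph in Redis.
# "fraud:model", "fraud_model.pb", and the node names "transaction"/"score"
# are hypothetical placeholders.
with open("fraud_model.pb", "rb") as f:
    r.execute_command(
        "AI.MODELSET", "fraud:model", "TF", "CPU",
        "INPUTS", "transaction", "OUTPUTS", "score",
        "BLOB", f.read(),
    )

# Write the reference data as a tensor in the same keyspace as the model.
features = np.array([[0.42, 0.13, 0.77]], dtype=np.float32)
r.execute_command(
    "AI.TENSORSET", "fraud:input", "FLOAT", 1, 3,
    "BLOB", features.tobytes(),
)

# Run inference where the data lives, then read the prediction back.
r.execute_command(
    "AI.MODELRUN", "fraud:model",
    "INPUTS", "fraud:input", "OUTPUTS", "fraud:output",
)
print(r.execute_command("AI.TENSORGET", "fraud:output", "VALUES"))
```

Because the input tensor, the model, and the output all live in the same keyspace, the round trips and serialization that normally sit between a feature store and an external inference server are avoided.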

Integrated with MLflow, RedisAI eases management of the AI lifecycle by allowing running models to be updated seamlessly and without downtime. With built-in support for the major AI backends (TensorFlow, PyTorch, and ONNX Runtime), RedisAI allows inferencing to run across platforms; for instance, models trained in PyTorch and TensorFlow can be served side by side from the same Redis instance. Finally, in combination with Redis Enterprise’s high availability, linear scalability, and flexible deployment model, RedisAI can be deployed anywhere: cloud, on-premises data centers, the edge, and even Raspberry Pi devices.
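The zero-downtime update mentioned above essentially comes down to replacing the blob stored at the model’s key. A minimal sketch, reusing the hypothetical connection, key, and node names from the previous example:

```python
# Hot-swapping a running model: calling AI.MODELSET on an existing key replaces
# the stored blob in a single command, so callers of AI.MODELRUN on "fraud:model"
# are never pointed at a missing or half-loaded model.
# File name, key name, and node names are hypothetical.
with open("fraud_model_v2.pb", "rb") as f:
    r.execute_command(
        "AI.MODELSET", "fraud:model", "TF", "CPU",
        "INPUTS", "transaction", "OUTPUTS", "score",
        "BLOB", f.read(),
    )
```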

RedisGears: Customize Redis to build better applications

RedisGears is a new serverless engine for infinite programmability options in Redis. RedisGears enables transaction, batch, and event-driven operations to be processed in milliseconds by performing these functions close to the data—all within Redis. RedisGears enables Redis to operate as a single data service by doing all the heavy lifting of managing updates to existing database and data warehouse systems.
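As one sketch of the batch flavor of these operations, the function below is shipped to Redis as a string and executed server-side with RG.PYEXECUTE, so the aggregation runs next to the keys rather than in the client. It assumes a Redis instance with the RedisGears module loaded, the redis-py client, and a hypothetical "sales:*" keyspace holding numeric string values; GB, the KeysReader, and the map, accumulate, and run steps come from the RedisGears Python API.

```python
import redis

r = redis.Redis(host="localhost", port=6379)

# A RedisGears function: read every key matching "sales:*", convert its value
# to a float, and fold the values into a single total, all inside Redis.
gears_fn = """
(GB('KeysReader')
 .map(lambda record: float(record['value']))
 .accumulate(lambda total, value: (total or 0) + value)
 .run('sales:*'))
"""

# RG.PYEXECUTE returns the function's results along with any errors.
print(r.execute_command("RG.PYEXECUTE", gears_fn))
```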

Furthermore, RedisGears allows developers to perform any type of operation across Redis’ data structures and modules. RedisGears functions can be developed once and used across any type of Redis deployment: open source Redis, Redis Enterprise software, or Redis Enterprise Cloud.

1 Gartner, “Predicts 2020: Artificial Intelligence Core Technologies,” Chirag Dekate, Saniye Alaybeyi, Alan Priestley, Daniel Bowers, 11 December 2019.

Additional Resources

Read the Redis Labs blog to learn about the challenges companies face with AI inferencing and for more information on RedisAI and RedisGears.

Also watch the keynote and the RedisAI and RedisGears sessions from RedisConf 2020 Takeaway.

About Redis Labs

Modern businesses depend on the power of real-time data. With Redis Labs, organizations deliver instant experiences in a highly reliable and scalable manner. Redis Labs is the home of Redis, the world’s most popular in-memory database, and the commercial provider of Redis Enterprise, which delivers superior performance, matchless reliability, and unparalleled flexibility for high-speed transactions, recommendation engines, data ingest, fraud mitigation, real-time indexing, session management, and caching.

Redis Labs is ranked as a leader in NoSQL databases by independent analysts and is trusted by five Fortune 10 companies, three of the four credit card networks, the top four telecommunications companies, three of the top six healthcare companies, and four of the top five technology companies.

Redis has been voted the most loved database, rated the most popular database container, and #1 cloud database.

Contacts

Steve Naventi

Redis Labs

press@redislabs.com
