Databricks, a company specializing in data lakehouse technology, announced on Tuesday a new platform designed for the manufacturing industry. Called Lakehouse for Manufacturing, the platform aims to unify data and artificial intelligence (AI) for analytics use cases such as predictive maintenance, quality control and supply chain optimization.
The platform builds on Databricks’ core data lakehouse platform, which leverages Delta Lake, Apache Spark and MLflow — open-source projects that enable scalable data processing and machine learning (ML) workflows. It also integrates with Model Serving, a service Databricks introduced last month to simplify the deployment and management of ML models in production.
“The sheer amount of data is a huge challenge for the manufacturing industry as more companies deploy sensors to connect workers, buildings, vehicles and factories,” Shiv Trisal, global manufacturing industry leader at Databricks, said in an interview with VentureBeat.
“Moreover,” he said, “this data is growing exponentially, with an estimated 200-500% growth rate over the next five years. The lakehouse architecture enables organizations to leverage all of their data in one place to apply AI at scale while also reducing the total cost of ownership (TCO), which is a huge priority for every IT leader today.”