Tecton Partners with Databricks to Help Enterprises Deploy Machine Learning Applications to Production in Minutes
Tecton, the enterprise feature store company, today announced a partnership with Databricks, the Data and AI Company and pioneer of the data lakehouse paradigm, to help organizations build and automate their machine learning (ML) feature pipelines from prototype to production. Tecton is integrated with the Databricks Lakehouse Platform so data teams can use Tecton to build production-ready ML features on Databricks in minutes.
“We are thrilled to have Tecton available on the Databricks Lakehouse Platform. Databricks customers now have the option to use Tecton to operationalize features for their ML projects and effectively drive their business with production ML applications,” said Adam Conway, SVP of Products at Databricks.
Productionizing ML models to serve a broad range of predictive applications, including fraud detection, real-time underwriting, dynamic pricing, recommendations, personalization, and search, poses unique data engineering challenges that keep many organizations from bringing ML into their core business processes and services. Curating, serving, and managing the data signals that fuel these applications, known as ML features, is hard. That is why Databricks and Tecton have collaborated to accelerate and automate the many steps involved in transforming raw data into ML features and serving those features to predictive applications at scale.
Built on an open lakehouse architecture, Databricks allows ML teams to prepare and process data, streamline cross-team collaboration, and standardize the full ML lifecycle from experimentation to production. With Tecton, these same teams can now automate the full lifecycle of ML features and operationalize ML applications in minutes without having to leave the Databricks workspace.
“Building on Databricks’ powerful and massively scalable foundation for data and AI, Tecton extends the underlying data infrastructure to support ML-specific requirements. This partnership with Databricks enables organizations to embed ML into live, customer-facing applications and business processes quickly, reliably, and at scale,” said Mike Del Balso, co-founder and CEO of Tecton.
Available on the Databricks Lakehouse Platform, Tecton acts as the central source of truth for ML features and automatically orchestrates, manages, and maintains the data pipelines that generate them. The integration lets data teams define features as code in Python and SQL, and track and share those definitions through a version-control repository. Tecton then automates and orchestrates production-grade ML data pipelines that materialize feature values in a centralized repository. From there, users can instantly explore, share, and serve features for model training and for batch and real-time predictions across use cases, without worrying about typical roadblocks such as training-serving skew or point-in-time correctness.
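To give a sense of what the “features as code” workflow looks like in practice, the sketch below is loosely modeled on Tecton's Python SDK. It is illustrative only: the entity, data source, table names, feature logic, and decorator parameters are assumptions for this example, not an official sample from either company, and exact names may vary by SDK version.

```python
# Illustrative sketch only: names, parameters, and the data source are assumptions,
# not an official Tecton or Databricks example.
from datetime import datetime, timedelta

from tecton import batch_feature_view, Entity, BatchSource, HiveConfig

# An entity identifies the business object a feature describes (here, a user).
user = Entity(name="user", join_keys=["user_id"])

# A batch source pointing at a hypothetical transactions table in the lakehouse.
transactions = BatchSource(
    name="transactions",
    batch_config=HiveConfig(
        database="demo_db",
        table="transactions",
        timestamp_field="timestamp",
    ),
)

# A feature view declared as code: Tecton materializes it on a schedule and makes
# the values available offline (for training) and online (for real-time inference).
@batch_feature_view(
    sources=[transactions],
    entities=[user],
    mode="spark_sql",
    online=True,
    offline=True,
    feature_start_time=datetime(2022, 1, 1),
    batch_schedule=timedelta(days=1),
    ttl=timedelta(days=30),
)
def user_transaction_amount(transactions):
    # The SQL runs on Databricks; the source is interpolated as a table reference.
    return f"""
        SELECT user_id, amount AS transaction_amount, timestamp
        FROM {transactions}
    """
```

Because the definition lives in a repository as ordinary Python, it can be reviewed, versioned, and promoted from prototype to production like any other code.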
As the interface between the Databricks Lakehouse Platform and their ML models, Tecton allows customers to process features from real-time and streaming data across a wide range of data sources. By automatically building the complex feature engineering pipelines needed to process streaming and real-time data, Tecton eliminates the need for extensive engineering support and enables users to drastically improve model performance, accuracy, and outcomes.
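For a rough picture of how a live application might consume these features at prediction time, the sketch below retrieves online feature values through Tecton's Python SDK. The workspace, feature service, and join-key names are hypothetical, and the model call is omitted.

```python
# Illustrative sketch only: the workspace, feature service, and join-key values
# are hypothetical placeholders.
import tecton

# Connect to a production workspace and look up a feature service that groups
# the features a fraud-detection model expects.
workspace = tecton.get_workspace("prod")
fraud_features = workspace.get_feature_service("fraud_detection_feature_service")

# Fetch the latest feature values for one user from the online store, then pass
# them to the deployed model for a real-time prediction.
features = fraud_features.get_online_features(join_keys={"user_id": "user_268308"})
feature_vector = features.to_dict()
# prediction = model.predict(feature_vector)  # model loading omitted for brevity
```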