Confluent Tableflow Boosts Real-Time AI & Analytics Across Clouds
General availability of Delta Lake and Databricks Unity Catalog integrations simplifies how organizations turn streaming data into governed, analytics-ready tables
New enterprise features and Microsoft OneLake availability strengthen reliability, security, and multicloud flexibility
Confluent, Inc. (Nasdaq: CFLT), the data streaming pioneer, today announced the General Availability (GA) of Delta Lake and Databricks Unity Catalog integrations in Confluent Tableflow, along with Early Access (EA) availability on Microsoft OneLake. These advancements make Tableflow a fully managed, end-to-end solution that connects operational, analytical, and artificial intelligence (AI) systems across hybrid and multicloud environments. Confluent now gets Apache Kafka® topics directly into Delta Lake or Apache Iceberg™ tables with automated quality controls, catalog synchronization, and enterprise-grade security.
Since its debut, Tableflow has transformed how organizations make streaming data analytics-ready. It eliminates the brittle ETL jobs and manual lakehouse integrations that slow teams down. With Delta Lake and Unity Catalog integrations now in GA and Tableflow support for OneLake, Confluent is expanding its multicloud footprint. These updates deliver a unified solution that connects real-time and analytical data with enterprise governance, making it easier to build real-time AI and analytics that propel businesses ahead of the competition.
“Customers want to do more with their real-time data, but the friction between streaming and analytics has always slowed them down,” said Shaun Clowes, Chief Product Officer at Confluent. “With Tableflow, we’re closing that gap and making it easy to connect Kafka directly to governed lakehouses. That means high-quality data ready for analytics and AI the moment it’s created.”
Production-Ready for Enterprise Scale
The GA release introduces new enterprise-grade features that make Tableflow one of the most complete, reliable, and secure stream-to-table solutions available today, enabling organizations to:
- Simplify analytics. Delta Lake support (GA) converts Kafka topics directly into Delta Lake tables stored in cloud object storage (Amazon S3 or Azure Data Lake Storage). Delta Lake and Iceberg formats can now be enabled simultaneously per topic for flexible, cross-format analytics.
- Unify governance. Unity Catalog support (GA) automatically synchronizes metadata, schemas, and access policies between Tableflow and Databricks Unity Catalog for centralized governance and consistent data management.
- Improve reliability. Dead Letter Queue support captures and isolates malformed records without interrupting data flow. This schema-backed error handling provides transparency, faster recovery, and continuous data quality.
- Save time and cut complexity. Upsert functionality automatically updates and inserts records as data changes, keeping Delta Lake and Iceberg tables consistent, deduplicated, and always analytics-ready without manual maintenance.
- Strengthen security. Bring Your Own Key (BYOK) extends customer-managed encryption keys to Tableflow for full control of data at rest. This enables compliance for regulated industries like financial services, healthcare, and the public sector.
Building on existing capabilities such as schema evolution, compaction, and automated table maintenance, plus integrations with Apache Iceberg, AWS Glue, and Snowflake Open Catalog, Tableflow now provides an end-to-end foundation for teams that need their streaming data to be instantly analytics-ready, governed, and resilient.
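On the consumption side, a Tableflow-materialized table behaves like any other Delta Lake table in object storage. The snippet below is a minimal sketch of reading one with PySpark; the bucket path, table location, and column names are hypothetical placeholders, and it assumes the delta-spark package and S3 credentials are already configured (none of these specifics come from the announcement).

```python
# Minimal sketch: querying a Delta Lake table that Tableflow materialized
# from a Kafka topic. Assumes the delta-spark pip package is installed and
# the S3A connector and credentials are configured; the path is a placeholder.
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder
    .appName("tableflow-delta-read")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Hypothetical location of a Tableflow-written Delta table in Amazon S3.
orders = spark.read.format("delta").load("s3a://example-bucket/tableflow/orders")

# The table is immediately queryable with standard DataFrame or SQL APIs.
orders.groupBy("status").count().show()
```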
“At Attune, delivering real-time insights from smart building Internet of Things (IoT) data is central to our mission,” said David Kinney, Principal Solutions Architect at Attune. “With just a few clicks, Confluent Tableflow lets us materialize key Kafka topics into trusted, analytics-ready tables, giving us accurate visibility into customer engagement and device behavior. These high-quality datasets now power analytics, machine learning (ML) models, and generative AI applications, all built on a reliable data foundation. Tableflow has simplified our data architecture while opening new possibilities for how we leverage data.”
Now Available on Microsoft OneLake
Tableflow is also now available in EA on Azure, integrated with OneLake, expanding its footprint and giving customers greater flexibility across multicloud deployments. This is especially impactful for organizations using Azure Databricks and Microsoft Fabric, where Delta Lake and Unity Catalog integrations are now fully supported. Together, they offer a seamless, governed analytics experience from real-time streams to cloud lakehouses. Now customers can:
- Accelerate time to insight. Instantly materialize Kafka topics as open tables in Microsoft OneLake and query them from Microsoft Fabric or your tool of choice using the OneLake Table APIs, with no manual ETL or schema wrangling required (a Spark-based sketch follows this list).
- Eliminate complexity and operational costs. Automate schema mapping, type conversion, and table maintenance for your streaming data, enabling governance and reliability in Azure-native analytics workflows.
- Enable Azure AI and analytics services. Seamlessly integrate with Azure’s analytics and AI services using the Microsoft OneLake Table APIs to power real-time insights and AI use cases, and manage deployments easily through the Confluent Cloud UI, CLI, or Terraform.
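As a rough illustration of the Fabric side, the sketch below reads a Tableflow-materialized Delta table from OneLake in a Spark environment such as a Microsoft Fabric notebook, where a `spark` session and OneLake authentication are already provided. It goes through OneLake’s ADLS-compatible abfss path rather than the OneLake Table APIs mentioned above, and the workspace, lakehouse, table, and column names are hypothetical placeholders rather than values from this announcement.

```python
# Minimal sketch: reading a Tableflow-materialized table from Microsoft OneLake
# inside a Spark environment such as a Microsoft Fabric notebook, where the
# "spark" session and OneLake authentication are already set up by the runtime.
# Workspace, lakehouse, table, and column names are hypothetical placeholders.
onelake_path = (
    "abfss://ExampleWorkspace@onelake.dfs.fabric.microsoft.com/"
    "ExampleLakehouse.Lakehouse/Tables/clickstream"
)

# Load the Delta table that Tableflow keeps up to date from the Kafka topic.
clickstream = spark.read.format("delta").load(onelake_path)

# Standard DataFrame operations work once the table is loaded.
clickstream.select("user_id", "event_type", "event_time").show(10)
```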
This EA release marks a major milestone in extending Tableflow’s multicloud footprint and strengthening Confluent’s partnerships with Microsoft and Databricks.
“Access to real-time data is essential for customers to make fast and accurate decisions,” said Dipti Borkar, Vice President and GM of Microsoft OneLake and ISV Ecosystem. “With Confluent’s Tableflow now available on Microsoft Azure, customers can stream Kafka events to OneLake as Apache Iceberg or Delta Lake tables and query them directly through Microsoft Fabric and popular third-party engines using the OneLake Table APIs, cutting complexity and speeding up decisions.”
Additional Resources
- Learn more about Tableflow’s Delta Lake and Unity Catalog support in Azure Databricks, Microsoft Azure, and Microsoft OneLake in today’s launch blog.
- See it in action in this overview video or Tim Berglund’s lightboard explanation.
- Get started on Confluent Cloud by navigating to the Tableflow section for your cluster. New users can sign up for Confluent Cloud for free and explore Tableflow’s capabilities.
- Contact us for a personalized demo of Tableflow and start unlocking the full potential of your real-time streaming data.
