The Hotels Network (THN) partnered with Meroxa to streamline data flow between their sales and support teams, overcoming siloed data and complex architecture challenges. Using Meroxa’s Conduit Platform, THN achieved a unified, real-time pipeline from Salesforce to Redpanda, reducing operational costs by 30% and enhancing customer support capabilities. The solution’s scalability ensures THN can continue to grow and optimize operations seamlessly.
We’re excited to introduce the latest update to the **Conduit Operator**, now with built-in **schema registry support**. This new feature allows seamless data encoding and decoding, improving data compatibility across your pipelines. Whether you're managing multiple Conduit instances or scaling your data operations, schema registry integration ensures a smoother, more reliable experience for handling complex data flows.
The Conduit team has just released Conduit v0.12, and we're gearing up for the launch of Conduit v1 with a focus on making pipelines more resilient. One key feature of this release is pipeline recovery, designed to automatically restart pipelines that experience temporary errors like network interruptions or service downtime.
With configurable backoff settings, Conduit can efficiently handle retries, reducing the impact of transient issues. Learn more about this feature and how it helps keep your pipelines up and running.
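For a feel of what those backoff settings might look like, here is a minimal sketch of a Conduit configuration snippet. The key names and values below are assumptions for illustration only and may differ by version, so check the v0.12 release notes and documentation before relying on them.

```yaml
# Hypothetical conduit.yaml snippet for pipeline recovery backoff.
# Key names and values are assumptions for illustration, not the confirmed config schema.
pipelines:
  error-recovery:
    min-delay: 1s        # initial wait before the first restart attempt
    max-delay: 10m       # ceiling on the wait between attempts
    backoff-factor: 2    # multiply the delay by this factor after each failed attempt
    max-retries: -1      # -1 retries indefinitely; a positive number caps attempts
```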
We made it: Conduit v0.11 is here! In this latest release, we’ve focused on adding schema support, enabling you to detect schema changes and retain type information end-to-end.
We are thrilled to introduce our latest offering, the Conduit Platform, which brings a host of new features and improvements designed to enhance your real-time data streaming experience, now powered by our robust Conduit open-source core. This transformation delivers improved performance, scalability, and usability, along with access to over 100 connectors maintained by our dedicated open-source community. Here’s a closer look at what’s new and how it can benefit your data operations.
The Conduit connector for Apache Flink is a powerful combination that significantly expands Flink’s capabilities. Apache Flink is renowned for its robust stream processing, while Conduit offers a lightweight, fast data streaming solution that simplifies the creation of connectors.
Explore the power of Conduit to create custom connectors tailored to your specific data integration needs. Learn how to use the Conduit SDK for enhanced data management and discover a world of possibilities in streamlining your data workflows. Start building your custom connector today!
Explore the new features and enhancements in Conduit version 0.10, designed to streamline your data integration processes. Discover how our latest update can help improve efficiency, security, and performance for your data operations. Upgrade today and transform how you manage data with Conduit 0.10.
Discover the excitement of Hackweek! Dive into our latest blog post to explore innovative projects and creative breakthroughs from our most recent Hackweek. Learn how teams collaborate to turn bold ideas into reality, fostering a culture of innovation. Perfect for tech enthusiasts and creative thinkers alike!
Discover the Conduit 0.9 update, which enhances data processing with standalone processors and advanced capabilities for seamless data manipulation and improved efficiency. Explore now.
Conduit 0.8 more than doubles single-pipeline performance.
Revamp your data pipelines with Conduit and Redpanda! Swap the complexities of Kafka and Kafka Connect for a swift, user-friendly option.
Visit conduit.io to download and learn how to use Conduit, the secure and efficient open-source data integration tool accredited by the DoD Iron Bank.
Explore batching in Conduit connectors for improved data pipeline performance. Understand how it boosts throughput and scalability.
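As a rough sketch, batching is typically switched on through settings exposed by the connector SDK’s destination middleware. The parameter names below (`sdk.batch.size`, `sdk.batch.delay`) and the example connector are assumptions for illustration and may vary between connectors and SDK versions.

```yaml
# Hypothetical destination connector configuration enabling batching via SDK middleware.
# Parameter names and the plugin shown are assumptions; consult your connector's docs.
connectors:
  - id: warehouse-destination
    type: destination
    plugin: builtin:postgres
    settings:
      url: "postgres://user:pass@localhost:5432/mydb"
      sdk.batch.size: "100"   # flush once 100 records are buffered...
      sdk.batch.delay: "1s"   # ...or after 1 second, whichever comes first
```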
Conduit 0.7 gets us one step closer to being a fully functioning, feature-rich alternative to Kafka Connect.
Learn how OpenAI's GPT-4 has helped to streamline data connector building for Meroxa, reducing development time and effort.
Conduit 0.5: We made Dead Letter Queues (DLQs) easy to configure through HTTP & gRPC, extended health checking, and added new capabilities with Debezium records.
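As a rough illustration, a dead letter queue can be attached to a pipeline in its configuration file. The structure below is a sketch based on our reading of the pipeline config format; keys such as `window-size` and `window-nack-threshold` are assumptions and may have changed since 0.5.

```yaml
# Hypothetical pipeline configuration attaching a dead letter queue (DLQ).
# Key names and values are assumptions for illustration only.
version: 2.2
pipelines:
  - id: orders-pipeline
    status: running
    dead-letter-queue:
      plugin: builtin:log          # failed records are written to the Conduit log
      settings:
        level: error
      window-size: 4               # look at the last 4 processed records...
      window-nack-threshold: 2     # ...and stop the pipeline if 2 of them failed
```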
Conduit is a tool to help developers build streaming data pipelines between production data stores and messaging systems.
We decided to build a performance benchmark for Conduit early, so we could determine how much it can handle and what it takes to break it.
Connector middleware improves the developer experience: you can use middleware provided by the SDK to enrich the functionality of connectors without reinventing the wheel.
Conduit is a tool that helps developers move data within their infrastructure to the places it’s needed.
The opportunity to delight someone using your tool can happen at any time. The open-source Conduit project team makes the user experience a top priority.
Testing streaming systems and architectures can be difficult because you need to mock data and have an upstream system continuously push that mock data. Conduit makes this easier with a built-in generator that creates fake data for streaming systems.
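To give a flavor of what that looks like, here is a minimal pipeline configuration sketch wiring the built-in generator source to a log destination. Settings such as `format.type`, `format.options.*`, and `rate` are assumptions for illustration and may differ by Conduit version.

```yaml
# Hypothetical pipeline: generator source -> log destination, emitting fake records.
# Setting names are assumptions; check the generator connector documentation.
version: 2.2
pipelines:
  - id: fake-data-demo
    status: running
    connectors:
      - id: fake-source
        type: source
        plugin: builtin:generator
        settings:
          format.type: structured        # emit structured records
          format.options.id: int         # field "id" filled with random integers
          format.options.name: string    # field "name" filled with random strings
          rate: "10"                     # roughly 10 records per second
      - id: print-destination
        type: destination
        plugin: builtin:log
```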
We faced challenges with Protobuf, so we began looking for a resolution... enter Buf!
In this release, Conduit now has an official SDK that will allow developers to build connectors for any data store.
Conduit ships as a tiny, single binary, making it a powerful tool for efficiently moving data from one place to another.
Conduit is a tool to move data around and Heroku is an application platform.
The world is trending towards more rapid delivery of goods and services. We use the term “Real-time” to mean that it happens as close to “now” as possible.
Conduit is an open-source project to make real-time data integration easier for developers and operators.
The Future of the Modern Data Stack looks excellent for data engineers. But where is the modern data stack for software engineers?
We’re open-sourcing Conduit, Meroxa’s data integration tool built to be flexible & extensible, and to provide developer-friendly streaming data orchestration.