

Manoj Kaushik
Architecture COE Lead at Unthinkable Solutions
Manoj Kaushik is a Software Architecture COE Lead at Unthinkable Solutions. With 20 years of experience and a deep understanding of distributed systems, Manoj shares his practical insights into building scalable, event-driven architectures for modern digital platforms.

Anmol Satija
Host
Anmol Satija is driven by curiosity and a deep interest in how tech impacts our lives. As the host of The Unthinkable Tech Podcast, she breaks down big tech trends with industry leaders in a way that’s thoughtful, clear, and engaging.
Episode Overview
In this episode of The Unthinkable Tech Podcast, host Anmol Satija dives deep into Event-Driven Architecture (EDA) with Manoj Kaushik, Software Architecture COE Lead at Unthinkable Solutions. They unpack what EDA is, why it’s gaining traction in modern tech ecosystems, where it’s most effective, and how to navigate its integration with legacy systems. The episode also explores real-world use cases and challenges faced during EDA implementation.
Chapters Covered:
- Introduction to Event-Driven Architecture (EDA)
- How EDA Works: Producers, Brokers, and Consumers
- Why EDA matters in the modern data-driven world
- Best use cases and where to avoid EDA
- Understanding eventual consistency vs data consistency
- Integrating EDA with legacy systems
- Common challenges in EDA adoption
- Final takeaways and strategic advice for EDA adoption
Transcript
Anmol: Hello and welcome to “The Unthinkable Tech Podcast,” the go-to source for the pulse on technology that’s shaping our future. I am your host, Anmol Satija. This is another exciting episode where I am joined by another bright mind who takes us through some interesting tech insights.
Today we’re going to explore a software architecture pattern that’s becoming increasingly crucial in our hyper-connected world: Event-Driven Architecture, or EDA. This architecture pattern is transforming how software systems respond to user actions, notifications, and other events. It’s a shift away from traditional synchronous operations, paving the way for systems that are more scalable, flexible, and capable of handling massive amounts of data generated by users and devices.
To guide us through the intricacies of EDA, we are joined by Manoj Kaushik, Software Architecture COE lead at Unthinkable Solutions. Manoj brings a wealth of knowledge and experience to the table, and I am excited to hear his thoughts.
Welcome to the Podcast, Manoj.
Manoj: Thank you Anmol!
What is Event-Driven Architecture (EDA)?
Anmol: So, let’s begin by understanding what exactly EDA is.
Manoj: So, EDA, or event-driven architecture, is primarily a software design pattern in which the production, detection, and consumption of events drive the flow of the software system. Here, you need to understand the events as well. Everything revolves around events: whatever the user action is, whatever the notification is. It’s not a synchronous thing; the centerpiece is an event. Whenever there is an occurrence in the system, we need to implement a flow around that event in terms of how it will be handled.
Key components of EDA: Event Producers, Brokers & Consumers
Anmol: So, in addition to that, can you explain how event-driven architecture works for better understanding?
Manoj: Okay. So, diving in deep, let’s understand the event first. What is an event? To give an example, if you are clicking on something, that is an event, a user-generated event. You are getting a notification on your mobile; that is also an event. So, any occurrence where the state of the system changes, we define as an event.
And with that thing at the centerpiece, we decouple the architecture of the software by implementing certain components. If we go deep into it, there are majorly two components. One is the producer, which produces the event. So, for example, when the user is clicking on something, that particular UI interface is producing that event, right?
And then there is the event consumer. Once an event has been produced, once it has occurred, there is some processing that needs to be done in the system based on it. That is the event consumer.
And then, for the collaboration of these two components, there is a third: the event broker. So, what happens is the producer generates the event, that event is stored in the broker, and from there, with a certain protocol, the consumer reads that event, sees that an event has occurred, and fires the business logic to handle it.
So, the event broker, essentially, is a queue-like storage.
And then there are certain technologies like Apache Kafka, Amazon SQS, Amazon SNS, RabbitMQ, and other open-source queue systems that can be classified as brokers. So, these are the three major components: the producer, the consumer, and the broker. Together they constitute the event-driven architecture.
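To make these three components concrete, here is a minimal, self-contained sketch in Python. It uses the standard-library queue as a stand-in for a real broker such as Apache Kafka or RabbitMQ, and the event shape and handler are purely illustrative, not something described in the episode.

```python
# Producer, broker, consumer: the three components of EDA.
# queue.Queue stands in for a real broker such as Kafka or RabbitMQ.
import queue
import threading

broker = queue.Queue()  # the event broker: queue-like storage

def producer():
    # The producer emits events (here, simulated user clicks) without
    # knowing anything about who will consume them.
    for i in range(3):
        broker.put({"type": "user_click", "payload": f"button-{i}"})
    broker.put(None)  # sentinel: no more events

def consumer():
    # The consumer reads events from the broker and fires the
    # business logic that handles each one.
    while True:
        event = broker.get()
        if event is None:
            break
        print(f"handling {event['type']}: {event['payload']}")

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
```

The decoupling Manoj describes is visible here: the producer and consumer never call each other directly, so either side could be scaled or replaced independently as long as the broker contract holds.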
Anmol: It’s clear that event-driven architecture (EDA) is all about reacting to events, which can be anything from user actions to system notifications. Considering the benefits you’ve outlined and the components that make EDA so effective, it’s easy to see its potential impact. Recently, a study even suggested that 71% of businesses believe the benefits of event-driven architecture outweigh the costs. Based on that, can you shed some light on why event-driven architecture has become increasingly important and popular in the modern digital landscape?
Why event-driven architecture is crucial in the digital era
Manoj: Sure. So, as to why it is increasingly important in the modern digital landscape: we first need to understand the evolution of the industry. Initially, there was very little data, and over time we can observe that the amount of data being produced has become humongous. Right?
In traditional software architecture, you will find certain limitations in terms of scalability to process the sort of data that is getting generated nowadays. So event-driven architecture is becoming important because it gives you the decoupling and the flexibility that ensure scalability, that ensure your software can process the large amounts of data being generated, be it the number of users, the number of transactions, or the size of the user base of a large system, and specifically with technologies like IoT that are coming up, which generate data every second.
So you need to have a system in place that is decoupled enough and scalable enough to process that quantity of data. Because of that, EDA is becoming more relevant nowadays than the traditional software paradigm.
Anmol: Right. Maybe you can mention some use cases that would help us to understand where it is more beneficial or where we should completely avoid it.
Manoj: Sure. When it comes to where it is beneficial, as I mentioned in my earlier answer, systems that are large enough, that are processing a lot of data, are where event-driven architecture has applicability. For example, when we are dealing with a microservices-based system, with IoT, or with event-driven applications.
Let me explain with a detailed example: say there is a vehicle tracking system. Consider Uber, where you are notified of how the car or the vehicle is moving, or you are using Google Maps. Similarly, we have also created certain vehicle tracking systems, for instance emergency response trackers for ambulances and that sort of thing.
So what is happening? Every five or ten seconds, some data needs to be processed, because that is giving you the geolocation of the vehicle. Imagine thousands of cars plying on the roads, each throwing data every second or two. That would be a huge amount of data, and we need to process all of it. So in those sorts of systems, where we have a large, scalable user base or a lot of data to process, we should go for EDA.
Anmol: Do we have some stats from a client that can maybe tell us how it has increased efficiency?
Synchronous vs Eventual Consistency: When to use what?
Manoj: So, for efficiency, you need to understand how it works. When it comes to synchronous operations, your system receives the data, and at that very moment you have to process it. So imagine a backend or a server processing data in synchronous mode while thousands of cars throw data onto the system. You might then need a large amount of infrastructure, which is practically impossible to sustain.
It is because that sort of consistency or synchronization cannot be achieved there. And it comes down to applicability; you need to understand what sort of applicability you have. So there are two modes of data processing. One is data consistency and the second is eventual consistency. When it comes to data consistency, take, for example, a financial transaction. If you are doing something, you need a response to it immediately because money is involved. So a highly transactional system should be data-consistent. Whereas with eventual consistency, you do something, it gets processed, and eventually the data becomes consistent. Not at that same point, but eventually, because there you can afford some window.
So when it comes to event-driven applications, IoT applications, or the vehicle response tracker that I just mentioned as an example, eventual consistency can work. If you are getting the location of the car after two or three seconds, it doesn’t pose a problem for you. It is an indication of how the car is moving; two to three seconds will not hamper anything.
That’s where eventual consistency comes into the picture. So in those sorts of cases, where we can go with eventual consistency, EDA can be a default choice.
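As a small illustration of the trade-off Manoj is describing, here is a sketch of eventual consistency in the vehicle-tracking scenario, with made-up vehicle IDs and coordinates: location events are queued rather than applied synchronously, so a reader may briefly see stale data before the read model catches up.

```python
# Eventual consistency: events are appended now, applied later.
import collections

event_log = collections.deque()   # incoming location events
last_known_position = {}          # the eventually consistent read model

# Vehicles append location events without waiting for processing.
for tick in range(3):
    event_log.append(("cab-1", (28.600 + tick * 0.001, 77.200)))

# A reader at this instant sees no position yet: not yet consistent.
print(last_known_position.get("cab-1"))  # None

# The consumer drains the log on its own schedule; once it has,
# the read model has converged, i.e. become eventually consistent.
while event_log:
    vehicle_id, position = event_log.popleft()
    last_known_position[vehicle_id] = position

print(last_known_position["cab-1"])  # now reflects the latest event
```

A banking debit, by contrast, would have to update the balance inside the same synchronous transaction; the few-second lag that is harmless for a map marker is unacceptable there.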
Anmol: As per my understanding, there are certain industries, right, like you mentioned the finance sector or the e-commerce sector. The finance sector will not be going toward EDA because they need an immediate response.
Is EDA suitable for every industry?
Manoj: Not like that. You cannot segregate industries in terms of whether EDA is applicable or not. You need to understand the workflows. EDA can be applied to a component of the software; the entire product doesn’t need to be based on EDA. It is more about identifying where we can decouple. So there can be some use cases in a vehicle tracking system that are not based on EDA, and there can be certain use cases in the financial industry where you can go with eventual consistency.
So it’s more about understanding what sort of applicability we have or what workflow we need. Not every workflow needs synchronous communication, and not every workflow can take EDA.
Challenges of integrating EDA with legacy systems
Anmol: How do companies integrate EDA with their existing legacy systems without causing major disruptions?
Manoj: So, to be honest, in my experience this is quite a task; it is tough to implement EDA with a legacy system. Why? Because when you are retrofitting EDA into a legacy system, the system is generally tightly coupled, owing to the traditional software architecture. So things are hooked together. Take the example of a mobile adapter or charger being put into a socket. In different geographies, you have different adapters: in Europe it is different, in the US it is different, and in India it is different. So if you think, okay, let’s decouple it, you need to understand where that has to happen.
So EDA is just like plugging in a universal adapter in between. A universal adapter is a thing where you plug in your charger, and based on the geography, you choose which socket you are using. You need to identify where that adapter can fit in, and that identification is the real problem, because you need to understand the legacy system in depth. What sort of consistency can you afford? Can you afford eventual consistency or not? Then there are specific challenges around skills as well, because when it comes to EDA, it’s not simple, for sure.
It adds an extra layer of complexity, and that is the price of the scalability and decoupling it brings to the table. So you need a very skilled and expert team to carry out the integration or the retrofitting.
Anmol: So have we done something like that for our clients? I believe we have the right set of people with us.
Manoj: Yeah, we have done that. We have handled both cases: modernization of legacy systems, and creating a lot of systems from scratch for digital platforms, where we introduced EDA wherever it could be introduced. Specifically, when there is benchmarking, like a certain volume or sort of data we expect in the system, we need to understand that benchmarking as well. Based on it, we choose whether we can go with EDA or not.
Common challenges in EDA implementation
Anmol: So there must be certain challenges as well that organizations regularly face when transitioning to an event-driven model. Maybe you can highlight those challenges.
Manoj: Yes, there are certain challenges. Everything comes with pros and cons, and EDA has its own. One is that complexity increases with the control flow. Think of how things normally happen: a pretty simple thing where you do something and you get a response to it. It’s a simple request-response structure: you request something, you get a response. That’s the traditional mindset people carry, be it a business user or a technical team.
So what do you expect? When you ask me a question, you expect an answer from me; that’s the normal flow. But with EDA, it’s altogether a different shift. It’s like a cultural shift, and a cultural shift is hard to make. To understand it: when you ask a question, it’s not that I am listening to it directly. There is some interface in between. Your question is recorded there, and whenever I get time, I listen to that question and give the answer. Then my answer is transmitted to you. So it’s not a normal control flow, and that is the cultural shift. Because of that shift toward asynchronous communication, the complexity increases.
For example, debugging, troubleshooting, and coding complexity all increase. If you don’t have an expert and skilled team, it is hard to maintain. The very realistic challenge here is that if there is a flaw in the implementation, you will find it tough to reason about data consistency. You might be losing data and not know where you are losing it. So you need a highly skilled team when you are going for EDA. That is the biggest challenge I have seen: people try, and then there are issues with data consistency, which is of utmost importance.
Building a fail-safe EDA system: The checklist approach
Anmol: So there must be a process that we follow to identify these challenges?
Manoj: Yeah, when it comes to the complexity, or when we know about the challenges, there is always a checklist that we follow, be it identification of the components, implementation of the workflow, or monitoring of things.
By workflows, I mean, going back to the broker example where your message is recorded somewhere and then I listen to it, there should be a fail-safe approach and proper logging: when you have asked the question, the event is generated and that event is logged; whether it has been recorded by the broker or not is logged; and then whether I, as the consumer, was able to access that question or not.
So we need to monitor; we need a workflow for a fail-safe approach. Even if I listen to that question only once, there should be a proper log. And there should be a retry mechanism where I can go for a second round if I fail the first time. There should be a workflow that makes sure it is done again, so there is no issue with data consistency. There are seven to eight checklist items that we maintain and share with the entire team, including the monitoring team, and we comply with that checklist to make sure no such issue occurs in the system.
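A rough sketch of that fail-safe workflow, under assumed details (the retry limit, event shape, and failing handler are all invented for illustration): every stage is logged, failed events are retried, and events that keep failing are parked in a dead-letter list rather than silently lost.

```python
# Fail-safe consumption: log every stage, retry, then dead-letter.
import logging
import queue

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("eda")

MAX_ATTEMPTS = 3          # assumed retry limit
broker = queue.Queue()
dead_letter = []          # events that exhausted their retries

def handle(event):
    # Illustrative business logic that fails for one poison event.
    if event["id"] == "evt-2":
        raise RuntimeError("downstream service unavailable")
    log.info("processed %s", event["id"])

# Producer side: log that each event was generated and handed over.
for i in range(3):
    broker.put({"id": f"evt-{i}", "attempts": 0})
    log.info("produced evt-%d", i)

# Consumer side: retry on failure, dead-letter after MAX_ATTEMPTS.
while not broker.empty():
    event = broker.get()
    event["attempts"] += 1
    try:
        handle(event)
    except Exception as exc:
        log.warning("attempt %d failed for %s: %s",
                    event["attempts"], event["id"], exc)
        if event["attempts"] < MAX_ATTEMPTS:
            broker.put(event)          # go for another round
        else:
            dead_letter.append(event)  # keep it; never lose data silently
            log.error("dead-lettered %s", event["id"])

print([e["id"] for e in dead_letter])  # ['evt-2']
```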
Anmol: Okay, so since we have talked about the challenges and how EDA can be integrated with legacy systems, there must be some lesser-known facts about event-driven architecture that tech leaders should be aware of.
Event sourcing and immutability: Taking EDA to the next level
Manoj: So I think we have already covered the fundamentals of event-driven architecture. But yes, since you ask for certain facts, one interesting thing I can mention here is that we can couple EDA with event sourcing.
So what is event sourcing? It is the idea that any outcome, any current state, is the result of certain events that occurred in sequence.
That is event sourcing. With EDA coupled with event sourcing, you can attain immutability, which can be a unique selling point of your software product.
To take an example: you are using a financial product, and you are checking your balance. It might be your wallet balance or a bank balance, whatever you like. How does the current balance take shape? Something gets credited, and then there are certain expenses. So there is some credit and some debit. Those events are calculated, and the result is your balance.
So if you have just the balance, there might be a doubt about whether it is right or wrong, whether it has been tampered with or not. If the software just gives you a balance, you cannot verify whether it has been tampered with. Is it the same amount or not?
But if you have the entire history, then you are sure of the immutability: whatever balance it is showing is right. The same thing happens when we couple event-driven architecture with event sourcing. Every event is now recorded and saved into persistent storage.
The net outcome, the current state, is the outcome of the entire chronology of events. Whatever has happened goes into the chronology, and the current state is calculated from that chronology. That sort of immutability can be a very unique selling point.
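The wallet-balance example translates directly into code. Here is a minimal event-sourcing sketch with illustrative amounts: the balance is never stored as a standalone number; it is derived on demand from the immutable chronology of credit and debit events.

```python
# Event sourcing: current state derived from the chronology of events.
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: an event is immutable once recorded
class Event:
    kind: str    # "credit" or "debit"
    amount: int  # amount in the smallest currency unit

# The persistent, append-only event store (the chronology).
ledger = [
    Event("credit", 5000),  # money comes in
    Event("debit", 1200),   # an expense
    Event("credit", 300),   # a refund
]

def current_balance(events):
    # The current state is recalculated from the full history, so the
    # displayed balance is always verifiable against the chronology.
    balance = 0
    for e in events:
        balance += e.amount if e.kind == "credit" else -e.amount
    return balance

print(current_balance(ledger))  # 4100
```

Because the events themselves are never updated or deleted, tampering with the displayed balance would require rewriting the whole history, which is exactly the immutability guarantee being described.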
Let me give you an example of one of our health tech clients for whom we built a software product. In their case, the unique selling point was immutability, because people right now are worried that their data can be shared with other players in the market. So the unique selling point of that product was that the data is immutable.
The data is accessible only to providers, and only when the patient has given permission.
Anmol: Consent.
Manoj: Yeah. So it was on a consent basis, and consent, we can say, is a compliance thing. Everyone will say, okay, there’s consent in place. But how do you lend credibility to the statement that the data is immutable, that it is not shared with anyone else?
So what we did there was store the consent events as well, whatever the access might be. If any service accessed the data, that was considered an event, so it was stored. Consent given to a provider: that was an event, so it was stored. A provider asking for consent: that was an event, so it was also stored.
So the entire chronology of events was stored, and it was accessible to the patient, who could see the whole history: how many consents he or she had given, how many consents providers had requested, and for how long. Whenever a consent was revoked, that was also an event.
So we recorded the entire chronology in user-specific wallets. We created the concept of a wallet there: a file personal to that particular user, protected by an encryption key that only the user held and no one else.
That is how we achieved the immutability. I think that is a fabulous example of where this sort of thing can become a USP.
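As an illustration only (the real product’s schema and key management are not described in the episode), a consent wallet along these lines might look like the following sketch, using the third-party cryptography package for the user-held key:

```python
# Hypothetical consent wallet: an append-only chronology of consent
# events, encrypted with a key that only the patient holds.
# Requires: pip install cryptography
import json
from cryptography.fernet import Fernet

user_key = Fernet.generate_key()   # in production, held only by the patient
wallet_cipher = Fernet(user_key)

consent_log = []  # append-only chronology of consent events

def record(event_type, provider):
    consent_log.append({"type": event_type, "provider": provider})

record("consent_requested", "provider-a")  # provider asks: an event
record("consent_granted", "provider-a")    # patient grants: an event
record("data_accessed", "provider-a")      # every access: an event
record("consent_revoked", "provider-a")    # revocation: also an event

# Persist the wallet encrypted; only the user's key can open it.
encrypted_wallet = wallet_cipher.encrypt(json.dumps(consent_log).encode())

# The patient can decrypt and audit the full chronology at any time.
audit = json.loads(wallet_cipher.decrypt(encrypted_wallet))
print([e["type"] for e in audit])
```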
Anmol: I think that must have been a great success, because it builds trust in the mind of the patient, who needs their data kept secure.
Manoj: Yes, and right now, because of all the wariness and the increase in security breaches, this is a unique selling point. Tech companies should take care to ensure the privacy of data.
Final Thoughts: Strategic insights for tech leaders
Anmol: Rightly said, Manoj. So, to summarize, EDA is a powerful architectural model, but it requires a clear understanding and a strategic approach to integration. We always need to keep scalability, data consistency, and the cultural shift toward asynchronous communication in mind. And most importantly, we need to ensure that we have the right team with the necessary expertise to navigate the complexities that come with EDA.
So as we wrap up this insightful conversation, I’d like to thank you, Manoj, for sharing your expertise on event-driven architecture.
Manoj: Thank you, Anmol. Thanks for having me.
Anmol: And to our listeners, we hope you’ve gained a deeper understanding of event-driven architecture and how it can impact your tech strategies. Stay tuned for more episodes of “The Unthinkable Tech Podcast,” where we will continue to unravel crucial tech insights with the experts. Don’t forget to like, share, and subscribe to our podcast. Do let us know the topics that you want us to cover. Until next time, keep innovating, and keep thinking the unthinkable.