Instead of just opening closed systems through APIs, many companies are also looking to use event-based triggers to react to changes in real time.
Competitive pressures are driving the need for new thinking when it comes to developing applications that help a business respond in real time. Being able to make real-time decisions based on events is central to these efforts.
Any event, whether generated internally (a transaction, a state change, a database update) or externally by client activity, may call for action. But working with events introduces new technical requirements.
One way to understand these requirements is to compare event handling with how API-based applications work. Applications use APIs to establish a one-to-one relationship between components. A mobile banking application, for instance, uses an API to let a customer query a backend system for a bank balance: a request is sent and a result is delivered. The application components at each end of the session must both be online at the same time.
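The synchronous, one-to-one interaction described above can be sketched in a few lines. This is an illustration only; the function and the in-memory "backend" are hypothetical stand-ins, not a real banking API.

```python
# Minimal sketch of a synchronous request/response API call.
# BACKEND_ACCOUNTS stands in for the bank's backend system.
BACKEND_ACCOUNTS = {"alice": 1250.75}

def get_balance(customer_id: str) -> float:
    """Request/response: the caller blocks until the backend answers."""
    # Both ends of the session must be available right now;
    # if the backend were down, this call would simply fail.
    return BACKEND_ACCOUNTS[customer_id]

balance = get_balance("alice")
```

The caller can do nothing else until the response arrives, which is exactly the coupling that event-based designs relax.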
Event-based applications break this one-to-one relationship and change the way interactions work. Traditional applications require an intervention to trigger the next action, and things happen in order: a customer requests a bank balance and the amount is returned. With event-based applications, systems respond to events as they occur. A system does not need to wait for a response before performing another action.
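The contrast can be made concrete with a minimal in-memory publish/subscribe sketch. A real deployment would put a broker such as MQ or Kafka in the middle; the `EventBus` class here is purely illustrative.

```python
# Toy publish/subscribe bus: the publisher hands an event off and
# moves on; it does not wait for any consumer to respond.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._subscribers[event_type].append(handler)

    def publish(self, event_type, payload):
        # No reply is expected; each registered handler reacts
        # to the event on its own.
        for handler in self._subscribers[event_type]:
            handler(payload)

bus = EventBus()
received = []
bus.subscribe("balance_changed", lambda e: received.append(e))
bus.publish("balance_changed", {"account": "alice", "balance": 1200.00})
```

Note that the publisher never learns who, if anyone, consumed the event; that independence is what lets new consumers be added without touching the producer.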
Use cases abound
Event-driven applications can be used in a wide range of industries, including manufacturing, financial services, transportation, logistics, retail, and more.
Often a single event is used by multiple applications for different purposes at different times. For example, if an airline passenger changes flights, this change has an impact on the allocation of seats on old and new flights. If the trip was booked through a travel agency, the change may impact other aspects of the trip. A hotel reservation may need to be moved from one night to the next and adjustments may be required for car rental and ground transportation services.
In retail, it’s easy to see how a single event like a website purchase needs to be shared with other systems, including inventory, tax calculation and collection, billing and payment processing, shipping, etc. The point to keep in mind is that multiple disparate systems, some of which may not be controlled by the company, must all work together to provide a seamless customer experience.
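The retail fan-out above can be sketched as one event dispatched to several independent handlers. The handler names (inventory, billing, shipping) are illustrative, not a specific product's API.

```python
# One purchase event fans out to multiple downstream systems.
# Each handler records the action it would take.
def update_inventory(event, results):
    results.append(("inventory", event["sku"]))

def bill_customer(event, results):
    results.append(("billing", event["total"]))

def schedule_shipping(event, results):
    results.append(("shipping", event["order_id"]))

HANDLERS = [update_inventory, bill_customer, schedule_shipping]

def on_purchase(event):
    results = []
    for handler in HANDLERS:   # every system consumes the same event
        handler(event, results)
    return results

actions = on_purchase({"order_id": "A100", "sku": "sku-42", "total": 19.99})
```

Adding a new downstream system (say, a loyalty-points service) means appending a handler, with no change to the code that emits the purchase event.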
MQ vs. Kafka: Not necessarily one or the other
When working with events, the conduit, the software that sits between the various event creators and event consumers, must have special properties. Quite often, the choice comes down to two broad categories of solutions: those based on message queuing, like IBM MQ, and those based on event broadcasting, like Apache Kafka, the open source distributed event streaming platform.
The two are often presented as competing solutions. But in reality, they do different things and are designed for different uses.
Kafka is used to build real-time streaming data pipelines and real-time streaming applications. It allows a data pipeline to reliably process and move data from one system to another and allows a streaming application to consume streams of data.
IBM MQ supports the exchange of information between applications, systems, services, and files by sending and receiving message data through messaging queues. This simplifies the creation and maintenance of business applications. IBM MQ works with a wide range of computing platforms and can be deployed in a range of different environments, including on-premises, cloud and hybrid cloud deployments. IBM MQ supports a number of different APIs, including Message Queue Interface (MQI), Java Message Service (JMS), REST, .NET, IBM MQ Light, and MQTT.
As such, one of the differentiators between Kafka and IBM MQ is that Kafka is essentially about a stream of events or a sequence of events, whereas MQ is more about individual messages.
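That distinction, individual messages versus a stream of events, shows up in the delivery semantics. The sketch below contrasts the two models with toy in-memory stand-ins; it does not use the real IBM MQ or Kafka APIs.

```python
# Point-to-point queue (MQ-style): a message is removed once consumed.
from collections import deque

queue = deque(["m1", "m2"])
first = queue.popleft()   # "m1" is gone after one consumer reads it

# Append-only log (Kafka-style): events are retained, and each
# consumer tracks its own offset, so the stream can be replayed.
event_log = ["e1", "e2", "e3"]
offsets = {"analytics": 0, "billing": 0}

def poll(consumer):
    events = event_log[offsets[consumer]:]
    offsets[consumer] = len(event_log)
    return events

analytics_events = poll("analytics")   # full stream
billing_events = poll("billing")       # independently, also the full stream
```

In the queue model, consuming is destructive and delivery is one-to-one; in the log model, many consumers read the same sequence of events at their own pace, which is what makes Kafka suited to streaming pipelines.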
Separation is the key
The way modern applications are developed results in independent elements working together as one. Decoupling of different elements from a larger application is becoming the norm. APIs, IBM MQ and Kafka serve as the glue between the elements. Each has its own purpose in different applications.
Companies make the various components available as services, often through APIs. However, instead of just opening up formerly closed systems through APIs, many companies are also looking to use event-based triggers to react to changes in real time.