Message Queues, Background Processing and the End of the Monolithic App (reposted from Heroku blog)

Here’s a post of ours on message queuing and background processing that we published on the Heroku blog the other day. It’s definitely worth checking out if you believe, as we do, that distributed multi-tier architectures are the future of production-scale cloud applications.


Iron.io Guest Post on Heroku

Message Queues, Background Processing and the End of the Monolithic App

Platform as a Service has transformed the use of cloud infrastructure and drastically increased cloud adoption for common types of applications, but apps are becoming more complex. There are more interfaces, greater expectations on response times, increasing connections to other systems, and lots more processing around each event. The next shift in cloud development will be less about building monolithic apps and more about creating highly scalable and adaptive systems.

Don’t get us wrong, developers are not going to go around calling themselves systems engineers any time soon, but at the scale and with the capabilities the cloud enables, the title is not far from the truth.

Platforms as Foundation

It makes sense that platforms are great for framework-suited tasks – receiving requests and responding to them – but as the web evolves, more complex architectures are called for, and these architectures haven’t yet matured to the same degree as all-encompassing, framework-centered applications.

By way of example, apps are rapidly evolving away from a synchronous request/response model towards a more asynchronous, evented model. Users are demanding faster response times and more immediate data, and more actions are being triggered by each event.

Rather than thinking of the request and response as the lifecycle of your application, many developers are thinking of each request loop as just another set of input/output opportunities. Your application is always-on, and by building your architecture to support events and process responses asynchronously and highly concurrently, you can increase throughput and reduce operational complexity and rigidity substantially.

Evented Flow: Shorter Response Loops + Async Processing and Callbacks
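
To make this flow concrete, here’s a minimal sketch of the pattern: the request handler simply enqueues a message and responds right away, while a background worker drains the queue and does the heavier processing asynchronously. The sketch is in Python and uses the standard library’s queue and threading modules as a stand-in for a hosted message queue and worker pool (such as IronMQ and IronWorker); the names handle_request, worker, and job_queue are illustrative, not from the article.

```python
import queue
import threading
import time

# Stand-in for an external message queue (e.g. a hosted queue like IronMQ).
job_queue = queue.Queue()

def handle_request(payload):
    """Short response loop: enqueue the work and respond immediately."""
    job_queue.put(payload)
    return {"status": "accepted"}  # respond before any heavy processing happens

def worker():
    """Background worker: processes messages as they arrive on the queue."""
    while True:
        payload = job_queue.get()
        time.sleep(0.1)  # placeholder for slow work (resizing images, sending email, ...)
        print("processed", payload)
        job_queue.task_done()

# Run the worker concurrently with the request handlers.
threading.Thread(target=worker, daemon=True).start()

# Simulate a burst of incoming requests; each one returns right away.
for i in range(3):
    print(handle_request({"event": i}))

job_queue.join()  # wait for the background worker to drain the queue
```

The same shape carries over to a real deployment: swap the in-process queue for a durable message queue and run the workers as separate processes, and the web tier stays responsive no matter how much work piles up behind it.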

 For the rest of the article, go here >>
