
HTTP Long Polling

Web applications were originally developed around a client/server model, where the Web client is always the initiator of transactions, requesting data from the server. Thus, there was no mechanism for the server to independently send, or push, data to the client without the client first making a request.

To overcome this limitation, Web app developers can implement a technique called HTTP long polling, in which the client polls the server for new information. The server holds the request open until new data is available, then responds with it. When the client receives the new information, it immediately sends another request, and the operation repeats. This effectively emulates a server push feature.
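
As a rough sketch of the client side of this loop (in TypeScript, assuming a modern browser or Node 18+ with a global fetch), the code below issues one request at a time and immediately sends the next one after each response. The /events/poll endpoint, its since parameter, and the event shape are placeholders for whatever your server actually exposes; the control flow is the part that matters.

    // Minimal long-polling client loop. One request is held open at a time and
    // immediately renewed after each response, emulating server push.

    interface ServerEvent {
      id: number;
      data: unknown;
    }

    async function longPoll(onEvent: (e: ServerEvent) => void): Promise<void> {
      let lastSeenId = 0;

      while (true) {
        try {
          // The server holds this request open until it has something newer than lastSeenId.
          const res = await fetch(`/events/poll?since=${lastSeenId}`, {
            signal: AbortSignal.timeout(30_000), // give up and retry if the hold lasts too long
          });

          if (res.status === 200) {
            const events: ServerEvent[] = await res.json();
            for (const e of events) {
              onEvent(e);
              lastSeenId = Math.max(lastSeenId, e.id);
            }
          }
          // A 204 "nothing new" response simply falls through and we poll again right away.
        } catch {
          // Network error or timeout: back off briefly, then re-establish the connection.
          await new Promise((resolve) => setTimeout(resolve, 2_000));
        }
      }
    }

    void longPoll((e) => console.log("new event:", e.data));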

Considerations for HTTP Long Polling

There are several things to consider when using HTTP long polling to build realtime interactivity in your application, both in terms of development and operations/scaling:

  • As usage grows, how will you orchestrate your realtime backend?
  • When a mobile device rapidly switches between WiFi and cellular networks, or loses its connection and its IP address changes, does long polling automatically re-establish the connection?
  • With long polling, can you manage the message queue and catch up on missed messages?
  • Does long polling provide load balancing or failover support across multiple servers?

When building a realtime application with HTTP long polling for server push, you’ll have to develop your own communication management system. This means that you’ll be responsible for updating, maintaining, and scaling your backend infrastructure.
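
To make that concrete, here is a minimal sketch of the server-side machinery you would own: a bounded buffer of recent messages so reconnecting clients can catch up, held requests that are answered when something is published, and a timeout so idle clients re-poll. This assumes Node.js with TypeScript and no framework; the endpoint name, message shape, and limits are illustrative, and a production system would also need authentication, fan-out across multiple servers, and monitoring.

    // What you end up owning with hand-rolled long polling: held connections,
    // a catch-up buffer, and timeouts. All names here are illustrative.

    import { createServer, ServerResponse } from "node:http";

    interface Message {
      id: number;
      data: string;
    }

    const buffer: Message[] = [];                               // recent messages, for catch-up
    const waiting = new Map<ServerResponse, NodeJS.Timeout>();  // held responses and their timers
    let nextId = 1;

    // Publish a message: answer every held request and keep the message for late clients.
    function publish(data: string): void {
      const msg: Message = { id: nextId++, data };
      buffer.push(msg);
      if (buffer.length > 1000) buffer.shift();                 // bound memory
      for (const [res, timer] of waiting) {
        clearTimeout(timer);
        res.setHeader("Content-Type", "application/json");
        res.end(JSON.stringify([msg]));
      }
      waiting.clear();
    }

    const server = createServer((req, res) => {
      const url = new URL(req.url ?? "/", "http://localhost");
      if (url.pathname !== "/events/poll") {
        res.writeHead(404).end();
        return;
      }

      const since = Number(url.searchParams.get("since") ?? 0);
      const missed = buffer.filter((m) => m.id > since);

      if (missed.length > 0) {
        // The client is behind (e.g. it reconnected after a network switch): send the backlog now.
        res.setHeader("Content-Type", "application/json");
        res.end(JSON.stringify(missed));
        return;
      }

      // Nothing new: hold the request open until publish() answers it, or time out with 204.
      const timer = setTimeout(() => {
        waiting.delete(res);
        res.writeHead(204).end();
      }, 25_000);
      waiting.set(res, timer);

      // The client disappeared (lost connection, changed IP): drop the held response.
      req.on("close", () => {
        clearTimeout(timer);
        waiting.delete(res);
      });
    });

    server.listen(8080);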

Backend Infrastructure for Realtime Functionality

With these considerations in mind, this is where a realtime data stream network comes in. Such a network takes care of the backend infrastructure for you, so you don’t have to worry about maintaining and orchestrating your realtime network yourself.

For streaming data, PubNub provides a low-latency, low-overhead realtime Web app communication environment, with the ability to send messages to a single client, to groups of clients, or to all clients. Upgrading from HTTP long polling to PubNub is both rapid and easy, since PubNub is based on a publish/subscribe model.

Key Benefits of Protocol-Agnostic Realtime Messaging

Instead of relying solely on HTTP long polling for realtime messaging, a protocol-agnostic approach is beneficial. PubNub automatically chooses the best protocols and frameworks for the environment, taking factors such as latency into account. Any server or client code that wants to communicate makes a single API call to publish or subscribe to data channels.
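
As a sketch of that single-call model, the snippet below shows roughly what publish and subscribe look like with the PubNub JavaScript/TypeScript SDK. The keys, channel name, and message contents are placeholders, and method signatures should be checked against the SDK version you actually use.

    // Publish/subscribe sketch with the PubNub JavaScript SDK (placeholder keys and channel).

    import PubNub from "pubnub";

    const pubnub = new PubNub({
      publishKey: "pub-c-...",     // placeholder keys from your PubNub account
      subscribeKey: "sub-c-...",
      userId: "client-1",
    });

    // Receive: register a listener and subscribe to a channel.
    pubnub.addListener({
      message: (event) => {
        console.log(`received on ${event.channel}:`, event.message);
      },
    });
    pubnub.subscribe({ channels: ["chat"] });

    // Send: the same publish call works from a browser client or a server process.
    pubnub
      .publish({ channel: "chat", message: { text: "hello" } })
      .then((result) => console.log("published at timetoken", result.timetoken))
      .catch((err) => console.error("publish failed", err));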

The code is identical between clients and servers, making implementation much simpler than with HTTP long polling. In terms of connection management, the network handles redundancy, routing topology, load balancing, failover, and channel access. Additionally, it offers core building blocks for adding realtime functionality to your application:

  • Presence – Detect when users enter/leave your app and whether machines are online (see the sketch after this list)
  • Storage & Playback – Store realtime message streams for future retrieval and playback
  • Mobile Push Gateway – Manage the complexities of realtime apps on mobile devices, including Push Notifications
  • Access Management – Fine-grained publish and subscribe permissions down to the person, device, or channel
  • Security – Secure all communications with enterprise-grade encryption standards
  • Analytics – Visualize and gain insight into your realtime data streams
  • Data Sync – Sync application state across clients in realtime
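
As one example of these building blocks, the sketch below uses the same SDK to illustrate Presence: it listens for join/leave events on a channel and takes a one-off occupancy snapshot. Keys and channel names are placeholders; verify the exact method names against current PubNub documentation.

    // Presence sketch with the PubNub JavaScript SDK: get notified when users join or
    // leave a channel, and query current occupancy. Keys and channel are placeholders.

    import PubNub from "pubnub";

    const pubnub = new PubNub({
      publishKey: "pub-c-...",
      subscribeKey: "sub-c-...",
      userId: "dashboard-1",
    });

    pubnub.addListener({
      presence: (event) => {
        // event.action is e.g. "join", "leave", or "timeout"; event.uuid identifies the user
        console.log(`${event.uuid} ${event.action} on ${event.channel}`);
      },
    });
    pubnub.subscribe({ channels: ["chat"], withPresence: true });

    // Snapshot of who is currently subscribed to the channel.
    pubnub
      .hereNow({ channels: ["chat"], includeUUIDs: true })
      .then((result) => console.log("occupancy:", result.channels["chat"].occupancy));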