Before streaming APIs, if you wanted to know whether there were any updates on a specific event, you had to query the API periodically and check whether updates had occurred. Many of these queries would return no results because nothing had changed, yet resources were still consumed in the process: the API call itself and the parsing of the response. Now, what if you want to be notified not of just 1 event but of 10, 20 or 100? Wasting resources like this is acceptable neither for the application consuming the API nor for the API provider, and that is why streaming emerged.
Streaming is the ability of a client to issue a single request and receive in response a continuous stream of updates from the server. These are real-time notifications on any given event upon which you want to trigger some operation. These operations can vary from sending a simple alert to synchronizing with another system hosted in-house or in the cloud. For SalesForce in particular, you might want to be notified when a new Account record is created/updated, or when an Opportunity for a deal of over $1M is created/updated. These notifications are called “topics” in the SalesForce vocabulary. A topic is nothing more than a query on a standard or custom SalesForce object; if this query returns any results different from the previous time it was executed, a notification is sent including whatever details were specified when the topic was created.
When to use streaming
- To receive near real-time notification of new or updated data:
- Taking action on a change in state – e.g. send an alert over SMS using the Twilio Cloud Connector
- Synchronizing state – e.g. sync up with another system, like Atlassian Jira, using the Jira Cloud Connector
- If you are currently polling an API, streaming gives you much more efficient consumption of API calls.
How Force.com streaming API works
- An administrator creates a topic (e.g. “SELECT Name, Id FROM Account”)
- You subscribe your client application to this topic
- You then start receiving updates that match the topic’s query.
Important things to notice:
- All queries MUST include the Id field.
- Any object can be queried, including custom ones, but only one object per query.
- Only updated results will be returned, so if the query is “SELECT Name, Id FROM Account” you won’t receive all the accounts back each time, just the ones that have been created or updated since the last query execution.
- The Streaming API must be enabled for your organization.
3 Steps to go live using the SalesForce Connector
Yes, just 3 steps, and it can be reduced to only 2 if the topic you want to subscribe to already exists:
1) Declare the namespace for the SalesForce Cloud Connector and your credentials:
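A minimal sketch of this configuration is shown below; the schema locations, config attributes and property placeholders are assumptions, so check the connector’s documentation for the exact values:

```xml
<mule xmlns="http://www.mulesoft.org/schema/mule/core"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xmlns:sfdc="http://www.mulesoft.org/schema/mule/sfdc"
      xsi:schemaLocation="
        http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
        http://www.mulesoft.org/schema/mule/sfdc http://www.mulesoft.org/schema/mule/sfdc/current/mule-sfdc.xsd">

    <!-- SalesForce connector configuration with your credentials,
         read here from external properties -->
    <sfdc:config name="salesforce"
                 username="${sfdc.username}"
                 password="${sfdc.password}"
                 securityToken="${sfdc.securityToken}"/>

</mule>
```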
2) Create a topic that will notify us of any Account record updates (this step is optional, as you can also create topics using the SalesForce Workbench):
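For example, a flow along these lines could create the topic through the connector; the publish-topic operation name and its attributes are assumptions based on the connector’s streaming support:

```xml
<flow name="createAccountUpdatesTopic">
    <!-- Message source omitted for brevity; trigger this flow however you like -->
    <!-- Registers a topic whose query watches Account records -->
    <sfdc:publish-topic config-ref="salesforce"
                        name="AccountUpdates"
                        query="SELECT Id, Name FROM Account"/>
</flow>
```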
3) Set up a Mule flow that subscribes to this topic and handles the notifications; in this case we just output the data to a logger:
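A sketch of such a flow is shown below; the subscribe-topic message source and the /topic/ channel prefix are assumptions, so adjust them to match the connector’s actual schema:

```xml
<flow name="accountUpdatesSubscriber">
    <!-- Message source: opens the streaming connection and pushes
         each notification into the flow as it arrives -->
    <sfdc:subscribe-topic config-ref="salesforce"
                          topic="/topic/AccountUpdates"/>
    <!-- Each notification payload is a Map; extract the Account's Name -->
    <logger level="INFO"
            message="Received update for Account: #[map-payload:Name]"/>
</flow>
```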
What is sent to the logger includes information about the Salesforce data that has changed, how it changed, and when. The payload is a Map, that is, a key-value structure that can be accessed with the map-payload evaluator (notice how the Name of the Account is extracted from the payload). That’s all you need! Really simple, isn’t it? In order to use the SalesForce Cloud Connector, make sure you define the following dependency and repository in your pom.xml:
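As a rough sketch, the pom.xml additions look like this; the groupId, artifactId, version and repository URL below are assumptions, so take the real coordinates from the connector’s documentation:

```xml
<!-- Inside <dependencies> -->
<dependency>
    <groupId>org.mule.modules</groupId>
    <artifactId>mule-module-sfdc</artifactId>
    <version><!-- connector version --></version>
</dependency>

<!-- Inside <repositories> -->
<repository>
    <id>mulesoft-releases</id>
    <name>MuleSoft Releases Repository</name>
    <url>http://repository.mulesoft.org/releases/</url>
</repository>
```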
Implementing Streaming APIs using the DevKit
If you were to implement a streaming API in Mule yourself, you would have to manually create a Message Source to listen for notifications and dispatch updates to one or more Message Processors. With the recently released Mule DevKit you don’t need to worry about this, as you can create Message Sources with just one annotation on a regular Java method! More information on implementing streaming APIs here. The SalesForce Cloud Connector implements the SalesForce Streaming API using a CometD client, which follows the Comet model: the client opens a single persistent connection to the server for all Comet events, and these events are incrementally handled and interpreted on the client side each time the server sends a new one, with neither side closing the connection. Source code can be found here.
Real life examples only vary in what you do after a notification is received and that is where you notice the power of Mule ESB to integrate with other systems. Check out the source code of these examples:
- Chatter to SMS example using Twilio
- SalesForce to Freshbooks
If you would like to see an example combining the SalesForce streaming API with one of our available Cloud Connectors, let me know and I will work on something and publish another post explaining it. Stay in touch by leaving your comments/questions here or in our forums. Also follow us on Twitter.
Follow: @federecio @mulejockey