What do the Kiwis really know about APIs?

Reading Time: 3 minutes


It turns out quite a bit.

Media companies, postal services, newspapers and other print publishers have been talking about monetizing digital assets both to create new revenue opportunities and, quite frankly, to remain relevant in the age of the digital prosumer.

New Zealand Post has embarked on a project to unlock their digital assets with APIs to create brand new revenue streams through their affiliate and partner networks. We will feature their story as one of many exciting use cases during MuleSoft’s inaugural conference – CONNECT 2014 in San Francisco, May 27-29th.

We’re excited to count Salesforce, Tesla, and Box – among the most innovative companies in the world today – as our customers. Their transformational CIOs – Ross Meyercord, Jay Vijayan, and Ben Haines – will join our keynote sessions to discuss how each, in his own unique way, is fundamentally changing business patterns by becoming a Connected Company.

The conference agenda also includes industry-specific sessions highlighting the drivers for change in Healthcare, Financial Services, Insurance, Retail, Media, Government, and High Tech. These sessions will show how Connected Companies in each industry use MuleSoft’s Anypoint Platform for SOA, SaaS integration, and APIs to drive great customer experiences, amplify the pace of innovation, and ramp up new revenue channels.


For the developers in the audience, the conference has hands-on workshops, numerous demo stations, and scheduled time for meetings with MuleSoft product managers, engineers and solution architects. We’re also planning a full track to feature new capabilities in the latest Mule 3.5 release, along with robust discussions with early beta customers.

If there are any specific topics you’d like us to include, please let us know.  We also welcome you to submit papers for consideration by April 15, 2014.

Look for future blog posts as we announce new speakers and sessions!

In a connected world, the atmosphere is made of smart dust

Reading Time: 3 minutes

I came across this article recently that highlights where connectivity could go, even beyond the 50 billion connected things expected to be on the planet by 2020.

The author described something called Smart Dust: microscopic sensors floating through our cities, tracking and collecting all kinds of data.

“He and his team use Dust, portable packets of sensors that float in the air throughout the entire city and track movement, biometric indicators, temperature change and chemical composition of everything in their city.”

The scene described in the article is fiction, but the concept of Smart Dust is not. These are devices so small they are invisible to the naked eye, yet they have enough RAM and wireless capability to run tiny operating systems. They aren’t suited to processor-intensive workloads, but they are perfectly capable of gathering data and sending it back to a base station.

This makes the idea of the Internet of Things and 50 billion connected things seem a bit passé. Phones, fitness devices, home thermostats, the smart TV in your living room, your car in the driveway – all connected – is one thing. But imagine trillions of tiny smart devices the size of dust particles floating around the planet, all needing to connect and share data. This is why API languages like RAML, which are discoverable and self-describing, are so important. If a device can broadcast its API and capabilities to other devices in a common language those other devices understand, integration is greatly simplified. It goes beyond knowing how to connect; it means understanding how to communicate with, use, and control one another.
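To make that concrete, here is a hypothetical mote’s self-describing API sketched in RAML 0.8, the version current at the time. All names here are invented for illustration; the point is that another device could discover this description and know exactly what the mote offers and how to call it:

```yaml
#%RAML 0.8
title: Dust Mote API
baseUri: http://mote.example.com/{moteId}
/readings:
  get:
    description: Latest movement, temperature and chemical-composition readings
    responses:
      200:
        body:
          application/json:
```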

The world may soon be quantified by sensors, floating in the wind. Now that’s going to be a real integration challenge.

Sweet & Simple: Using SOQL Relationship Queries with Salesforce.com Connector

Reading Time: 5 minutes

The best things in life are often sweet and simple. However, “S & S” is an easy concept to understand and appreciate but often hard to implement. For example, a sweet and simple way to attract traffic to our blog would be to show women in bikinis playing with cats. In reality that is rather hard to pull off for a technical site. There simply is no budget to publish anything like “API Illustrated, Swimsuit Edition” or “ESBN, the Body Issue”. Instead, this article will focus on sweet and simple features in our products that can make life easier for integration developers.

With 100,000+ customers, Salesforce.com is one of the most popular integration endpoints for ESB implementations. There are a couple of commonly asked questions when it comes to Salesforce.com: how do you reduce the number of API calls, since there are daily limits per instance, and how do you retrieve all the related records in one query? SOQL relationship queries help accomplish both goals, since a developer can make just one API call against different SObject types that are related to each other.

Here we will illustrate how the Anypoint Salesforce.com Connector fully supports relationship queries. We will use the archetypal header-line hierarchy, for example, Opportunity and Opportunity Line Items. This represents the classic use case in which an order (the opportunity) contains multiple products (the opportunity line items). To use a relationship query, simply put the nested SOQL statement into the usual Query Text box:
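For example, a relationship query along these lines fetches opportunities together with their line items in a single API call (the field list is trimmed for brevity):

```sql
SELECT Id, Name, Amount,
       (SELECT Quantity, UnitPrice, PricebookEntry.Product2.Name
        FROM OpportunityLineItems)
FROM Opportunity
WHERE StageName = 'Closed Won'
```

The nested SELECT walks the child relationship (OpportunityLineItems) so each returned Opportunity carries its related line-item records, with no second query and no extra API call.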


Enabling Transactions in Node.js using Domains

Reading Time: 7 minutes

At MuleSoft we have open source software deep in our DNA. We maintain hundreds of public projects and have contributed to many open source projects, including Node.js itself. We’re excited about Node.js and have several large, sophisticated Node.js projects in development. Our use of cutting-edge Node.js features has brought us both a lot of knowledge and, no surprise, a lot of pain.

In this post, we’d like to share our experience developing part of our data access layer for one of our larger Node.js projects. We’ll start with an architectural overview, talk about modules in use, the problems we’ve found and how we worked with the open source community to resolve those issues, making an extremely powerful Node.js feature much more usable by everyone.

This particular project relies on Postgres for data persistence and provides a number of clients, including an AngularJS user interface, with a rich set of APIs. We use RAML to define these APIs, with JSON schemas for the entities. These entities are hierarchical in shape, unlike the tabular data stored by Postgres, so a transformation layer separates our services from our database repositories. As believers in layered architectures and SOLID, we designed the system as shown in the somewhat contrived examples below. Note that all layers in the system operate asynchronously and return promises.

Controller Layer

A controller layer pulls data from the HTTP request and passes it down to a service layer. For example:
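The original snippet isn’t shown here, so this is a minimal sketch of what such a controller might look like. The names (atomService, createAtomController) and the Express-style req/res shape are illustrative assumptions, with the service mocked inline so the sketch stands alone:

```javascript
// Stand-in service: a real one would validate and persist; here it just echoes an entity.
var atomService = {
  createAtom: function (data) {
    return Promise.resolve({ id: 1, symbol: data.symbol });
  }
};

// The controller only translates HTTP concerns (body, status codes) into a
// service call and back; all domain logic lives below it.
function createAtomController(req, res) {
  return atomService.createAtom(req.body)
    .then(function (atom) {
      res.status(201).json(atom);
    })
    .catch(function (err) {
      res.status(500).json({ error: err.message });
    });
}
```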

Service Layer

A service layer returns entities that are of the shape specified by the JSON schema in the API’s RAML. Services make use of a simple mapping subsystem that knows how to reshape database rows into entities:
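The mapping subsystem’s real API isn’t shown in the post, so here is a hedged sketch of the idea: a mapper that reshapes a flat database row into the entity shape the RAML’s JSON schema describes, with the repository mocked so the example is self-contained. All names are illustrative:

```javascript
// Reshape a tabular row into the hierarchical entity the API promises.
var atomMapper = {
  toEntity: function (row) {
    return { id: row.atom_id, symbol: row.atom_symbol };
  }
};

// Stand-in repository: a real one would run a query and return a row.
var atomRepository = {
  findById: function (id) {
    return Promise.resolve({ atom_id: id, atom_symbol: 'He' });
  }
};

// The service composes repository access and mapping, returning entities.
var atomService = {
  getAtom: function (id) {
    return atomRepository.findById(id).then(atomMapper.toEntity);
  }
};
```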

More sophisticated services pull data from multiple data sources, do filtering, validation, and other tasks.

Repository Layer

A repository layer provides APIs that access the database, building queries, performing CRUD operations, and providing data integrity services such as transactions:
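As a hedged sketch of that layer (the project’s real code isn’t shown), a repository might build queries through the DAL’s knex-style fluent interface. Here `dal` is a self-contained mock of that interface; names are illustrative:

```javascript
// Mock DAL exposing a knex-style query builder so the sketch runs on its own;
// the real DAL would hand these calls to knex and Postgres.
var dal = {
  table: function (name) {
    var query = { table: name, filter: null };
    return {
      where: function (criteria) { query.filter = criteria; return this; },
      first: function () {
        // A real implementation would execute SQL; here we echo the query shape.
        return Promise.resolve({ atom_id: query.filter.atom_id, atom_symbol: 'H' });
      }
    };
  }
};

// The repository owns all query construction; callers never see SQL or knex.
var atomRepository = {
  findById: function (id) {
    return dal.table('atoms').where({ atom_id: id }).first();
  }
};
```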

The DAL object in the code above is the deepest layer of the data system, and provides query creation and multi-tenancy services to the repositories.

Here is where it got interesting: we use knex for data access, and knex provides a reasonable implementation of transactions. Transactions need to be started by services: a service may want to insert two entities as a unit – a parent object and a child, for example. It will call two different repositories to do so, and needs both repositories to enlist in the same transaction. However, to maintain a proper level of abstraction, only repositories may call knex code.

Domains

Enter domains.

Domains are an extremely powerful, yet little known and poorly understood feature of Node.js. They provide a context in which code can execute and from which errors can be captured. We decided to make use of a domain, attached to an http-level API call, to store our transactions. This resulted in service layer transactional code looking like this:

transactor.run(function () {
  return self.insert(atom)
    .then(function (atom) {
      var electron = particleService.insertElectron(atom);
      return Promise.all([atom, electron]);
    })
    .spread(function (atom, electron) { // .spread comes from Bluebird promises
      atom.electrons = [electron];
      return atom;
    });
});

The transact method looks like this:

This pattern is similar to the notion of transaction contexts in multi-threaded environments such as Java. It makes it extremely easy to see where transactions begin and end, and relieves services from having to know whether a particular piece of code is running independently or within a transaction. Knex takes care of the details of connection reuse and issuing the proper SQL to the back end, and we use dependable to inject required modules at every step along the way.

It looks straightforward, but making all this actually work in practice was extremely difficult: multiple underlying systems, from knex to Node.js itself, required contributions from our team to work together correctly in the presence of domains. That’s a story we’ll tell in the next blog entry. Stay tuned.

Parallel Multicasting in Mule Made Easy

Reading Time: 10 minutes

A common integration scenario is where a single message needs to be sent through multiple routes.

Take for example a case in which you’re receiving a message about a new client’s on-boarding. The message needs to be routed to the CRM to create the client, to marketing, which will want to know how the client heard about the company, and finally to the provisioning and stock systems so they can work their magic as well.

In this case, the message is broadcast in a “fire and forget” fashion, meaning you don’t need a response from any of these systems to continue processing. Each of those systems is responsible for handling its own logic and its own errors. In Mule ESB, you could do it like this:
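A sketch of what that flow might look like in Mule 3 XML; the flow and endpoint names are invented for illustration, and each <async> scope hands the message off without waiting for a reply:

```xml
<flow name="clientOnboarding">
    <vm:inbound-endpoint path="client.onboarding" exchange-pattern="one-way"/>
    <async>
        <flow-ref name="createClientInCrm"/>
    </async>
    <async>
        <flow-ref name="notifyMarketing"/>
    </async>
    <async>
        <flow-ref name="provisionAndStock"/>
    </async>
</flow>
```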

There are other cases, however, in which you do need the responses from the routes. Suppose you’re building a travel booking application and somebody wants a direct flight from Buenos Aires to San Francisco. Your app needs to contact all known airline brokers, get availability for those flights, and choose the cheapest one. The <async> scope is insufficient in that case because you want the thread processing the request to actually wait for the responses to arrive. Sounds like a job for a multicasting router!

When <all> is just not enough


In a connected world…

Reading Time: 2 minutes


The integration revolution is about to take off. Devices, data, and platforms will share information like never before. During CONNECT 2014 we’ll explore all things integration and see how enterprises unlock the power of being connected.

But what will your connected world look like? We want to hear your thoughts. And as a nice bonus, here’s your chance to win some MuleSoft gear!
Simply tweet, “In a connected world … #Connect14” and let us know what your connected world is like.

In a connected world your car schedules its own maintenance. #Connect14

— MuleSoft (@MuleSoft) March 27, 2014

In a connected world, data entry is a thing of the past #Connect14

— Dan Ahmadi (@Dan_Ahmadi) March 27, 2014

The top 5 tweets win MuleSoft swag!

Terms and conditions

Anypoint Studio Themes

Reading Time: 4 minutes

We’re excited to announce the long-awaited release of our Anypoint Studio Themes™.

Designed to make your integration development environment more pleasurable and productive, Anypoint Studio Themes let you customize your IDE experience with creative, colorful themes. Choose from among several themes available for download within Anypoint Studio™, Mule’s Eclipse-based IDE.

Our in-house UX team continues to work around the clock to improve Studio’s usability and capitalize on existing intuitive functionality. Faced with the intense challenges of designing the interaction for a new product and a completely new platform, they decided to distract themselves with the relatively simple chore of improving the IDE. It’s already a pretty amazing tool, though, so making it even better was going to be kinda hard.

“We tried to think of what we could do to enhance an already-amazing experience,” reports one UX Designer. “Applying themes to your instance of Studio seemed like the quickest way of turning an out-of-the-box integration environment into a delightful, super-awesome, customized experience. My favorite is Peanut Butter & Jelly.”

Eager to get involved, most of our software engineering team contributed numerous ideas for themes. Some were prioritized for development, like “Star Wars”, “Hello Kitty”, and “True Blood”, while others are slotted for development later in the year. Look forward to more productivity-enhancing themes in the coming months, such as:

  • “Hedgehogs!”
  • “The 19th Century”
  • “FPS by the Seashore”
  • “Manga Pirates”

To install a new Anypoint Studio Theme, navigate to the Help menu in Studio, then select Install New Software. Use the drop-down to select Anypoint Studio Themes, then browse the dozens of available themes. Follow the steps to download, then relaunch Studio with your awesome new theme.

We anticipate quick adoption of the themes and expect to see the productivity of our users improve exponentially. Let us know what you think: Will the colorful new themes improve your productivity? Does anime enhance the integration development experience? Do we need more penguins? Please leave your comments below!