
Open platforms in action right in the heart of Austin


We recently collaborated with Visa on a video about APIs. Knowing how important APIs are to the modern business, we asked Susan French, global head of the Visa Developer Program, to share some examples of the interesting applications and ideas built with Visa’s open APIs that she observed at the launch of the Visa Everywhere Initiative in Austin.

“When the first electric grid fired up, I don’t think even Edison expected to see a washing machine at the other end of the outlet. Or a self-driving electric car,” said Sam Shrauger, SVP of Digital Solutions at the Visa Everywhere Lounge in Austin. “Similarly, when we opened up our APIs, we didn’t know what kind of innovations would take us by surprise.”

And so it went for Visa Developer at Austin. Digital innovators poured into the Visa Everywhere Lounge to experience the future of payments. They used fingerprints, irises, or faces to pay for drinks. They took a seat in Visa’s Connected Car. But they weren’t just along for the ride. It was inspiring to see first-hand the ideas some of the greatest thinkers and innovators have brewing in their minds.


The kickoff of our Everywhere Initiative API Challenge gave me a fantastic opportunity to see startups from across the country go head-to-head in their search for new ways to use our APIs. We saw tons of great ideas – many of which we would not even have imagined ourselves.


For example, IoT smart device innovator, Knocki, proposed using Visa APIs to let consumers make purchases simply by knocking on a piece of furniture. Out of detergent? Knock five times on your washing machine and it’s reordered (wow!).

Messaging platform innovator, msg.ai, proposed letting consumers connect with brands and make purchases using Visa Checkout without leaving their chat.

Hiku came out from Silicon Valley to show us how their small connected device allows consumers to scan and automatically repurchase everyday items when they’re running low.

Event Base, a mobile technology platform for major events like SXSW, Comic-Con and Sundance, pitched the integration of APIs to offer VIP exclusivity to account holders.

Thanks to all the competitors for taking up the challenge. Ledge, a social borrowing platform, took the top spot. Ledge enables millennials (and others!) to tap their social networks to request and automatically repay loans from friends and family, and it proposed that the Visa Direct APIs could make this exchange of funds seamless. The panel chose Ledge because of its innovative approach to financial inclusion and the strength of its leadership team. We can hardly wait to roll up our sleeves and start working together.


“We’d like to work with Visa so we don’t have to make individual deals with issuing and acquiring banks,” said Adam Neff, co-founder of Ledge. “That’s not our expertise.”

I was thrilled to have this opportunity to engage with developers and witness them embracing our APIs so enthusiastically. It was great to see the wheels spin as they imagined what Visa could help them design, build, and deliver.

We’ve been around for 60 years – our APIs have been around for 6 weeks – yet they are setting a new foundation for what Visa can enable in the next 60 years. I can’t wait to see what happens next. This was only the first leg of the Visa Everywhere Initiative. So if you’ve got a few ideas up your sleeve, we’d love for you to participate. See how you can join in on the API Challenge.

Thanks, Susan! For more about how you can use APIs to create cool new things, give Anypoint Platform™ — API solutions a test drive.

4 reasons to attend CONNECT


CONNECT is the world’s premier digital business conference. This year it’s being held in San Francisco from May 21 to May 25, and if you’re an IT decision maker, developer, architect, or tech executive, it offers a wealth of truly valuable information.

But why should you attend CONNECT? Here are four great reasons:

  1. On-site training. Are you looking for fast-track training to get yourself up to speed on Anypoint Platform? We’ll have training courses on Saturday, Sunday and Monday, which will get you up and running at lightning speed.
  2. Free MuleSoft certification. Want to become certified? You can at CONNECT. Certification is free with a conference ticket, so not only will you gain the recognition you need, you’ll also get face time with numerous MuleSoft experts who can answer any questions you have in a variety of formal and informal settings.
  3. Networking with top practitioners. With plenty of speakers from top brands and subject matter experts on hand, there will be ample opportunities to network and exchange knowledge and best practices with the top practitioners in the industry.
  4. You’ll get great insight on how you can make a difference in your organization. Check out this welcome message from our CTO, Uri Sarid.

Don’t forget MuleSoft blog readers get 10% off their CONNECT registration when you use this special code: mktg-blog-10. Register today and we’ll see you there!

Best practices for multi-SaaS integrations


It’s not quite accurate to say that businesses operate “in the cloud.” Rather, they often operate in as many as half a dozen clouds, thanks to the proliferation of applications and SaaS providers. Getting these very disparate systems to work together is a significant challenge, but in order to achieve digital transformation, developers have to ensure that all systems within an enterprise are integrated.

“The most critical issues in integrating are correctly mapping the business data objects between solutions and ensuring the integrations are implemented in a way that guarantees reliability, availability and consistency across systems,” says Ken Yagen, our vice president of products. “Every system has its own governance about how you can access data (such as availability SLAs, throttling limits, record limits, et cetera) and their own schedule on when APIs or data objects change.”
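To make that governance point concrete, here is a minimal Python sketch — not Anypoint Platform code, and with a hypothetical endpoint and limits — of a client that respects a SaaS provider's throttling rules by honoring Retry-After and backing off on HTTP 429 responses:

```python
import time

import requests  # third-party HTTP client: pip install requests


def fetch_with_backoff(url: str, max_retries: int = 5, base_delay: float = 1.0) -> dict:
    """GET a SaaS endpoint, backing off when the provider throttles us (HTTP 429)."""
    for attempt in range(max_retries):
        response = requests.get(url, timeout=30)
        if response.status_code != 429:
            response.raise_for_status()
            return response.json()
        # Honor the server's Retry-After header if present; otherwise back off exponentially.
        delay = float(response.headers.get("Retry-After", base_delay * (2 ** attempt)))
        time.sleep(delay)
    raise RuntimeError(f"Still throttled after {max_retries} attempts: {url}")


# Hypothetical usage, respecting a record limit of 200 per request:
# accounts = fetch_with_backoff("https://api.example-saas.com/v1/accounts?limit=200")
```

An integration platform bakes this kind of policy handling into its connectors so that each team does not have to reinvent it for every system it touches.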

But integrating disparate systems isn’t the only thing developers need to consider when helping their businesses on the path of digital transformation. “Developers also need to keep in mind that people within and outside an enterprise will be able to access parts of the network using consumption models that are familiar to their way of working. By decentralizing IT, developers allow the integrated SaaS solutions to be adopted across the enterprise, not just through a central IT function,” Ken notes.

So what do developers need to make these enterprise integrations happen in a way that makes sense across the enterprise? “A developer needs an integration platform that abstracts and manages many of these concerns,” Ken advises. In addition, he recommends “taking an API-led approach to connectivity, where you build purposeful APIs that represent various systems, processes and experiences. The componentized APIs are reusable and can be modified quickly without disrupting business processes.”
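As a rough illustration of that layering — the systems, function names and fields below are invented, and in practice each layer would be a separate, independently managed API rather than a local Python function — API-led connectivity separates system, process and experience concerns like this:

```python
# System APIs: thin, reusable access to individual systems of record (stubbed here).
def salesforce_get_customer(customer_id: str) -> dict:
    return {"id": customer_id, "name": "Acme Corp", "segment": "enterprise"}


def sap_get_orders(customer_id: str) -> list:
    return [{"order_id": "SO-1001", "status": "shipped"}]


# Process API: composes system APIs into a business capability, with no channel-specific logic.
def customer_360(customer_id: str) -> dict:
    return {"customer": salesforce_get_customer(customer_id),
            "orders": sap_get_orders(customer_id)}


# Experience API: reshapes the process API for one consumer, here a mobile app.
def mobile_customer_summary(customer_id: str) -> dict:
    data = customer_360(customer_id)
    return {"name": data["customer"]["name"], "order_count": len(data["orders"])}


print(mobile_customer_summary("0015500000WxyzA"))
```

Because the system and process layers are reusable, a new channel (say, a partner portal) only needs a new experience layer, which is what lets changes happen quickly without disrupting existing business processes.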

Ken points out that a good connectivity platform can assist developers in abstracting away the need to custom code integrations, and can provide templates and out-of-the-box connectors that can make integrations easy and fast. From a business perspective, Ken also notes it’s important to “understand how to shift to composability with API-led connectivity [in order to] build a relationship between developers and the rest of the business to retain a company’s competitive status in the market.”

For more information about what developer skills are needed to manage integrations across multiple SaaS providers, applications, and legacy systems, check out the webinar hosted by MuleSoft senior enterprise architect Eugene Berman.

Setting up queues and exchanges with Anypoint MQ


Messaging helps organizations provide reliable, zero-message-loss environments, decouple applications, enable scale, and unlock data for broader distribution. It’s a critical pattern for most enterprise architectures. In healthcare, for example, a messaging pattern might be used to create a buffer between the electronic medical records system and any number of consuming applications that need to know when a patient record has been updated. In retail, a messaging pattern might be used to ensure a customer has a seamless experience as they modify their shopping cart across in-store, web, and mobile channels.

Until now, however, most messaging has been driven by on-premises technology requirements.  As more and more connectivity moves to the cloud or becomes hybrid, the ability to deliver messaging patterns in the cloud becomes increasingly important. That’s why MuleSoft created Anypoint MQ, and we’re going to be demoing it in a webinar on March 31, 2016, hosted by Senior Product Manager Vivin Nath.


Imagine this scenario:

You have a process in place for updating customer records in your cloud CRM — updates that both your on-premises master customer DB and your on-premises ERP (SAP) need to consume to keep customer data consistent across systems. Rather than sending every customer update to each system individually — an unreliable and error-prone pattern — you could instead set up a message exchange. Through this exchange, your DB and ERP systems subscribe to any published customer updates, pull those messages down into their respective consumer queues, and process them at their own pace. By putting rules and logic in place to ensure the messages are indeed received and processed, you can create a highly reliable delivery system.
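Conceptually — this is not the Anypoint MQ API, just a few lines of illustrative Python — the exchange-and-queues pattern described above looks like this: one publisher, a fan-out exchange, and independent consumer queues that each subscriber drains at its own pace and acknowledges when done.

```python
import queue


class Exchange:
    """Fan-out message exchange: every bound queue gets a copy of each published message."""

    def __init__(self):
        self.queues = {}

    def bind(self, name: str) -> queue.Queue:
        self.queues[name] = queue.Queue()
        return self.queues[name]

    def publish(self, message: dict) -> None:
        for q in self.queues.values():
            q.put(message)


exchange = Exchange()
db_queue = exchange.bind("master-customer-db")   # consumed by the on-premises customer DB
erp_queue = exchange.bind("erp-sap")             # consumed by the on-premises ERP

# The cloud CRM publishes a single customer update; both subscribers get their own copy
# and can pull it whenever they are ready instead of being called synchronously.
exchange.publish({"customer_id": 42, "email": "new@example.com"})

while not erp_queue.empty():
    update = erp_queue.get()
    print("ERP processing", update)
    erp_queue.task_done()  # acknowledge only after the update has actually been applied
```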


You can do this and much more with Anypoint MQ.  In our upcoming webinar, Vivin Nath will show you how. Register today! 

The habits of highly effective APIs


APIs are increasingly important to the way the modern business works, and as such, the usage of APIs is exploding. Today, according to Programmable Web, there are nearly 15,000 APIs in existence. With so many APIs, how can a developer be sure that hers will be both useful and used?

Highly effective APIs have a number of characteristics that make them highly usable, long-lived, and important to the business. It’s important to make sure that current APIs in use have these characteristics, but developers should also ensure that any new APIs being built have them as well. Joe McKendrick says in ZDNet, quoting our book First, Break IT, that “APIs need the same care and planning as any piece of software that is produced. That is, they need to be created with a business need in mind, be secure, and well governed…APIs are a reflection of the business, and will also represent a key part of the workflow.”

The characteristics of highly effective APIs are:

  • Well-designed
  • Well-implemented
  • Consistent
  • Discoverable
  • Complementary to the existing architecture

APIs can provide a wide variety of business benefits. They can enable a company to become more efficient in delivering new products and services. They provide a simpler way to integrate compared to the custom integration and complicated middleware of the past. They enable businesses to become more agile and flexible in their interactions with customers, partners, and suppliers. But they need to be designed well and maintained consistently to provide these benefits.

For more on API design and management, take a look at First, Break IT, edited by MuleSoft founder Ross Mason.

Why microservices are replacing old IT operating models


Tech executives are turning to microservices and APIs to help them achieve their business goals. In a recent MuleSoft survey, more than three-quarters (78 percent) of IT professionals questioned said pressure to deliver IT services had increased during the past 12 months. Of these, two-thirds (66 percent) said that change was needed in order to meet a “significant” or “drastic” increase in pressure. This pressure comes as no surprise – increased competition, the need to deliver services and applications to customers more quickly and the need to unlock the value of siloed business data and assets are all putting pressure on IT organizations to transform.

Ross Mason recently wrote in Computer Business Review, “This need for company-wide agility has become one of the driving forces behind the development of API strategies, and it is fuelling a growing interest in microservices.”

Microservices and APIs add flexibility

IT organizations are interested in increasing flexibility, both in utilizing existing technology and in adding new applications as the business requires. In our survey, we discovered that the majority of IT decision makers said they are currently using APIs to free data — specifically, to link new software with existing systems and applications (72 percent) or to unlock data silos (55 percent). Another 52 percent cited the need to increase agility and enable business teams to self-serve IT as a reason for having an API strategy.

Agility was also shown to be behind much of the interest in microservices. Respondents using or thinking of using microservices identified the ability to add new features or capabilities without re-writing a whole application as most important.

Ross notes, “As CIOs…adopt containerized technology, they will create API-led microservices that will impact the way that teams are structured and managed. The prize is great agility in delivering and changing software. Microservices…can ensure that the organization remains as nimble as possible in order to maintain competitive advantage.”

Business success depends on integration

The world of technology is moving too fast to keep doing things the old way – the role of IT is no longer to manage the data center and keep the lights on, but rather to become a strategic partner to the business, guiding the whole organization to a holistic use of technology. As Ross often says, “businesses will compete on the ability to unlock the value of their siloed data and assets and to connect the unconnected.” These strategic integration decisions – as well as microservices and API strategies – will help your business do exactly that.

For more resources on how transforming IT can help your business, take a look at our eBook First, Break IT: How Composable Services are Changing the Role of the CIO. 

Salesforce Solutions from MuleSoft


MuleSoft makes connecting anything easy, and connecting to Salesforce is no exception. In this blog post, you will learn about the options you have for getting started on your Salesforce integration, from out-of-the-box tooling to packaged integration patterns.

dataloader.io

We will start with dataloader.io, an easy-to-use, 100% web-based application that allows its users to easily export, import and delete data in Salesforce.com. Most commonly, dataloader.io is used to do basic data cleansing or to migrate large sets of records into Salesforce, but there is a lot more that you can do with it.

Salesforce-to-Salesforce one-way sync:

Salesforce migration with dataloader.io

In the above scenario, scheduled task #1 exports data from Salesforce as a CSV file and uploads that file to a file sharing service like FTP, SFTP, Box or Dropbox. You can schedule the tasks in such a way that once task #1 is completed, task #2 imports the data into a designated destination, such as another Salesforce organization. Even if the source of the data is not Salesforce, dataloader.io can similarly import any CSV-formatted data into Salesforce.

As exemplified here, dataloader.io is a great tool for simple export, import and deletion of data, but if you need capabilities beyond that — such as orchestration, bi-directional synchronization or real-time integration — the solutions below may be a better fit.

Anypoint Data Gateway for Lightning Connect

For quick connectivity between back-office systems and Salesforce, use Anypoint Data Gateway for Lightning Connect. Using wizard-based tooling, Anypoint Data Gateway connects to systems like MySQL, IBM DB2, SQL Server, Oracle DB or SAP and exposes their data via an OData API, making it consumable by Salesforce via Lightning Connect. This means that, without any coding, Salesforce users can see critical back-office data inside Salesforce without having to duplicate it there.

Exposing SAP data to Salesforce:


In the scenario above, you can use Anypoint Data Gateway to select the source system, such as SAP, to be exposed into Salesforce. Once configured, real-time data from SAP can be fed into Salesforce as external objects. These external objects from SAP can be shown as a tab within Salesforce for any Salesforce user to leverage.

Anypoint Platform & Anypoint Templates

Both dataloader.io and Anypoint Data Gateway are powered by Anypoint Platform, a hybrid integration platform for SOA, SaaS and APIs. To help users go from a blank canvas to a production application quickly, we have designed a set of Anypoint Templates to demonstrate integration best practices that feature popular endpoints such as Salesforce. Below are a few examples:


Salesforce and database bi-directional sync:

This template shows how to bi-directionally synchronize data between Salesforce and any database. The template is a great place to start, but can be extended or customized to better fit your needs.

Broadcast from Salesforce to multiple systems:


This template serves as a foundation for synchronizing objects from a Salesforce instance to many destination systems using the publish/subscribe pattern. Any time a new object is created or an existing object is changed, the integration captures the change and publishes it to a queue. A subscriber application then polls the changes from the queue and updates the target systems — in this case, Salesforce B and database instances.

In this blog post, we’ve summarized the most common Salesforce integration challenges and mapped them to MuleSoft’s solutions. Give them a try and if you have any questions or feedback, let us know at info@mulesoft.com.


HowTo – File based integrations and transfer


We recently introduced our HowTo blog series, which is designed to present simple use-case tutorials to help you as you evaluate MuleSoft’s Anypoint Platform. In this blog post, we show how an organization can use Anypoint Platform to communicate with its partners using a secure file-based solution.

When an organization communicates with its business partners, there are many options: traditional B2B exchange, file transfers, fast-growing API-based approaches, and so on. B2B and APIs are usually the preferred modes of communication, especially with larger organizations, since they are near real-time and less error-prone. But these options typically require a sophisticated IT team to set up, so some partners may prefer a simpler approach to integration, such as file transfers — i.e., file-based B2B exchange. Another important aspect to consider with file-based B2B is how to secure the communication and data exchanges.

In this blog, I will demonstrate an example that satisfies some of the most typical requirements for a file-based B2B exchange. The requirements for this use case are:

  1. Poll for a flat file (CSV) which contains the product catalog data
  2. Transform this data into a format specified by the partner
  3. Encrypt this data using PGP (Pretty Good Privacy)
  4. Transfer this data as a flat file (CSV) to the partner using a secure file transfer protocol

Some might question the value of encrypting the payload since we are already ensuring transport security by using SFTP. But SFTP does not protect against unauthorized access to the files once they land at the target location, and a PGP-encrypted (and signed) file also lets the partner verify that it has not been modified in transit and was not sent by an unauthorized party.
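In the flow below, the Mule Encryption component takes care of this step. Purely to illustrate what PGP adds on top of SFTP, here is a minimal sender-side sketch using the python-gnupg library; the key identifiers, file names and passphrase are placeholders, and both keys are assumed to already be in the local GnuPG keyring.

```python
import gnupg  # python-gnupg wraps a locally installed GnuPG binary

gpg = gnupg.GPG()  # uses the default local keyring

with open("product_catalog.csv", "rb") as source:
    result = gpg.encrypt_file(
        source,
        recipients=["partner@example.com"],     # encrypted: only the partner can read the file at rest
        sign="ops@infinity-solutions.example",  # signed: the partner can verify who sent it, unmodified
        passphrase="our-signing-key-passphrase",
        output="product_catalog.csv.pgp",
    )

print("encrypted and signed" if result.ok else result.status)
```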

Pre-requisites: Anypoint Studio installed, the sample.csv product catalog file and the PGP key ring files (pubring.gpg and secring.gpg) referenced in the steps below, and access to an SFTP server that can receive the transferred file.

Steps:

  1. Start MuleSoft Anypoint Studio and point it to a new workspace.
  2. Give the project a name and click Finish. You will see the Mule configuration file, where we can now add the Mule flows. (Here is more information on Mule configuration.)
  3. Create three folders under the src/main/resources folder and call them ‘input’, ‘sftp’ and ‘archive’.
  4. Copy sample.csv into the src/main/resources/archive folder. Also copy pubring.gpg and secring.gpg into the src/main/resources folder.
  5. Add a File endpoint component to the Mule configuration file; this automatically creates a flow (What is a Mule flow?). In the File endpoint properties tab, click on connector configuration.
  6. Select the default File connector configuration and click OK.
  7. Rename the display name of the File endpoint to ‘Poll Product Catalog’. Add the path to the input directory and the directory where the file should be moved after reading. These endpoints can be configured as properties, the values for which are set in the src/main/app/mule-app.properties file.
  8. Click on the Metadata tab on the left-hand side of the File endpoint to configure the output metadata for the flat file.
  9. Now that we have the File endpoint set up to read the flat file, drag the Transform Message component (DataWeave) into the flow and rename it to ‘Transform Product Catalog for Partner’. (Here is more information on the DataWeave component.)
  10. Set the output metadata for the Transform Message component to match the flat file format specified by the partner. Click Define metadata -> Create new type -> Type = CSV -> Type Id = partner, configure the fields of the flat file, and click Finish.
  11. Map the data in DataWeave by dragging and dropping from source (left) to target (right). Hardcode the field Company with the value ‘Infinity Solutions’.
  12. After the mapping is complete, we need to convert the output of the DataWeave component to a string so that we can encrypt the payload. Drag the Object to String transformer from the palette into the flow and rename it ‘Convert to String for Encryption’.
  13. Add the Encryption component from the palette to the flow. Rename the display name to ‘PGP Encryption’ and click the + to create a new connector configuration. (Here is more detail on the PGP Encrypter.)
  14. Select “PGP_Encrypter” as the Default Encrypter, click on the PGP Encrypter tab to configure the encrypter component, and click OK.
  15. Configure the rest of the Encrypter settings. Here we select ‘Encrypt’ as the operation and ‘PGP_Encrypter’ as the encrypter.
  16. Add a File component from the palette to the flow. Rename the display name to ‘Create Product Catalog Target File’ and change the path to point to the ‘sftp’ folder created in step 3. Use a MEL (Mule Expression Language) expression — #[function:datestamp]-#[message.inboundProperties['originalFilename']] — for the File Name/Pattern to prepend the datestamp to the original file name. (More information on MEL expressions is available here.)
  17. Now that we have the flat file in the required target format, drag the SFTP component from the palette into the flow and rename the display name to ‘Transfer file to SFTP Server’. Click the + to add a connector configuration; in the connector configuration settings, leave the defaults as they are and click OK. Use the same MEL expression — #[function:datestamp]-#[message.inboundProperties['originalFilename']] — for the Output Pattern, then complete the configuration by adding the target SFTP server settings.
  18. The last step is to set up the mule-app.properties file with your environment-specific values. You can download the sample mule-app.properties file from the project, which has blank placeholders, fill it out, and replace the file at src/main/app/mule-app.properties with it.
  19. Run the project and copy the sample.csv file from the src/main/resources/archive folder to the src/main/resources/input folder. The process will pick up the file, process it, and transfer it to the specified SFTP server location. (For a conceptual end-to-end summary of what this flow does, see the sketch below.)
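All of the steps above are point-and-click in Anypoint Studio, so there is no code to paste. As a rough mental model of what the finished flow does — not MuleSoft code, and with hypothetical hosts, paths, credentials and column names — the pipeline behaves like this Python sketch:

```python
import csv
import datetime
import io
import shutil
from pathlib import Path

import gnupg     # pip install python-gnupg (needs a local GnuPG install)
import paramiko  # pip install paramiko

INPUT_DIR, ARCHIVE_DIR = Path("input"), Path("archive")
SFTP_HOST, SFTP_USER, SFTP_PASS, SFTP_DIR = "sftp.partner.example", "infinity", "secret", "/inbound"


def transform_for_partner(raw_csv: str) -> str:
    """Map our catalog columns to the partner's flat file format (the DataWeave step)."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["SKU", "Description", "Price", "Company"])
    writer.writeheader()
    for row in csv.DictReader(io.StringIO(raw_csv)):          # source column names are assumptions
        writer.writerow({"SKU": row["productId"], "Description": row["name"],
                         "Price": row["price"], "Company": "Infinity Solutions"})  # hardcoded field
    return out.getvalue()


def process(path: Path) -> None:
    partner_csv = transform_for_partner(path.read_text())                  # transform
    encrypted = gnupg.GPG().encrypt(partner_csv, ["partner@example.com"])  # PGP encryption step
    target_name = f"{datetime.date.today():%Y%m%d}-{path.name}"            # the MEL datestamp pattern

    transport = paramiko.Transport((SFTP_HOST, 22))                        # secure file transfer
    transport.connect(username=SFTP_USER, password=SFTP_PASS)
    sftp = paramiko.SFTPClient.from_transport(transport)
    sftp.putfo(io.BytesIO(str(encrypted).encode()), f"{SFTP_DIR}/{target_name}")
    transport.close()

    shutil.move(str(path), str(ARCHIVE_DIR / path.name))                   # archive after reading


for csv_file in INPUT_DIR.glob("*.csv"):                                   # the File endpoint's poll
    process(csv_file)
```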

We have now successfully created a file-based B2B process which enables an organization to communicate securely with partners.

To create this file-based B2B we have leveraged the following features of the Anypoint Platform:

  • File connector, which can be used to read and write files in different formats such as CSV, XML, JSON, etc.
  • Encryption component, used to enforce security at the payload level. Different encryption strategies such as JCE, XML, and PGP are available out of the box with a click-and-configure approach.
  • DataSense, which uses message metadata to facilitate application design.
  • Mule Expression Language (MEL), a lightweight Mule-specific expression language that can be used to access and evaluate the data in the payload.
  • Transformers/DataWeave, a simple, powerful way to query and transform data on the platform.

As you can see from the above example, it is very straightforward to set up a file-based B2B exchange to communicate with individual partners who do not have the flexibility to set up more sophisticated B2B communication. With files so ingrained in the B2B exchanges across many organizations, the best strategy is to have multiple options to communicate as organizations evolve to API-based B2B communication.

How to create and use OData APIs for any connectivity need


In my blog post last week, I shared how, in just 5 minutes, you can expose a MySQL, DB2, SQL Server, Oracle or SAP datasource as an OData API into Salesforce using Anypoint Data Gateway for Lightning Connect.


But let’s say what Data Gateway offers out-of-the-box is not a perfect fit for what you want to do. Maybe you want to create an OData API for a different datasource, expose a legacy API as an OData API or do data orchestration before exposing data into Salesforce. So what do you do?

Introducing OData extension for APIkit

APIkit is an open-source Maven tool within Anypoint Studio that enables developers to create well-designed REST APIs. We are excited to introduce a new OData extension, which allows users to create and expose OData/REST APIs in three simple steps:


Before moving forward, please note that the OData extension for APIkit is still in Beta! GA is planned for mid-2016.

Step 1: Defining the model

For starters, we need to define the model to be exposed. Create a model.json file within your Studio project with a clear definition of the entities and their properties, so that they can be exposed as OData entities.


In this example, we have a “workers” entity with a single key (the “workerId” integer field) and a “firstName” string field. This entity is what we will expose through the OData API.

Details about the supported data types and model.json definition can be found here.

Step 2: Implementing the API

Now that we have the model.json file completed, we need to implement it in Studio. We will do so using the APIkit OData extension. Once the plugin is installed, right-click on the model.json file and select “APIkit > Generate flows” to generate two new elements in your project: a RAML file and an api.xml file.


The RAML file

This RAML file is the documentation for your RESTful API. All entities defined in step 1 are available as resources with proper CRUD operations, both at the collection and the element levels.

The api.xml file

Following the familiar APIkit flow-generation process, this XML file has the HTTP entry point and the scaffolded flows for your API.

With these steps, we can now implement the API in any way we want. We can pull data from any supported datasource or API, do data aggregation, orchestration and/or transformation. At this point, you can use all the connectors and operations available within Studio. The only requirement is to conform to the expected flows output defined in the RAML file.

Step 3: Running / Deploying

With our REST/OData API ready, we can now run it in Studio or deploy it to a Mule runtime. This newly created app exposes two endpoints: a RESTful API accessible through “/api”, and an OData API accessible through “/api/odata.svc”.
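Once the app is running, you can hit both endpoints with any HTTP client. As a quick sanity check from Python — assuming the app is listening locally on port 8081 and uses the “workers” entity defined earlier; adjust the host, port and resource names to your own project — you might do something like this:

```python
import requests

BASE = "http://localhost:8081"  # assumed local HTTP listener; change to match your project

# The plain RESTful API generated from the RAML (collection and element resources).
print(requests.get(f"{BASE}/api/workers").json())

# The same data through the OData endpoint, using standard OData query options.
print(requests.get(f"{BASE}/api/odata.svc/workers",
                   params={"$format": "json", "$filter": "workerId eq 1"}).json())
```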

And there’s more. Do you need an OData API for a set of SAP BOs/BAPIs? Need a bidirectional OData API for your Oracle, DB2 or Azure DB? Or do you need to do complex data orchestration within your OData API? All of that (and more) is possible with this extension.

For more information, feel free to check out the OData for APIkit documentation.

MuleSoft CONNECT is back!


MuleSoft CONNECT 2016, the premier digital business conference where CIOs, IT leaders, and developers meet to exchange ideas and insights on driving digital transformation, is back! If you care about harnessing the digital revolution, mastering the API economy, or learning the best way to connect applications, data, and devices, this conference is for you. MuleSoft blog readers have a special discount code entitling them to 10% off (code: mktg-blog-10) so you’ll definitely want to register for CONNECT today. 

Our list of speakers is incredible. Hear real stories from Unilever, Coca-Cola, Siemens, Splunk, Autotrader, Anheuser-Busch InBev, Redwood Logistics, Rent-a-Center, StubHub, University of Chicago, and more about how they are using our products to innovate faster, deliver products more quickly, and create delightful experiences for their customers. There will also be industry-specific sessions as well as exclusive events on the topics of microservices, IoT, APIs, mobility, security, and more. Plus there will be hands-on training and demos from MuleSoft and industry thought leaders.

You’re not going to want to miss this event. It’s at the San Francisco Hilton Union Square from May 21-25, 2016. With 40+ sessions and special demos, breakout sessions, case studies, and hands-on labs, there will be programming customized for your role whether you’re an architect, a developer or an executive. Simply put, we will pack the most relevant content into this week to make sure you come away energized and inspired to bring about change and opportunity into your organization.

Don’t forget to register with your special discount code (mktg-blog-10) and book a hotel, as space will fill up fast. We’re looking forward to seeing you there! For a taste of what you’ll experience at CONNECT 2016, take a look at the keynotes from CONNECT 2015.