Meet a MuleSoft Champion: Felipe Ocadiz, Integration Engineer at Twitter

September 6 2017


The long-term success of a technology ultimately lies in the strength of its community. At MuleSoft, we are fortunate to have a vibrant community with many members who contribute by answering questions, sharing best practices, and organizing meetups. Their knowledge and effort make the experience of building solutions with MuleSoft simpler and more rewarding.
(more…)

How Automatic Streaming in Mule 4 Beta Works

August 10 2017


Streaming in Mule 4 is now as easy as drinking beer!

There are incredible improvements in the way that Mule 4 enables you to process, access, transform, and stream data. For streaming specifically, Mule 4 enables multiple parallel data reads without side effects and without the user caching that data in memory first.
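To see the underlying problem this solves, consider a plain Java InputStream: it can be consumed exactly once, so you previously either gave the payload to a single consumer or cached the whole thing in memory yourself before handing out copies. The sketch below only illustrates that limitation and the manual workaround; it is not Mule's implementation.

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class StreamReplayDemo {

    // Reads an InputStream to the end and returns its contents as a String.
    static String consume(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[1024];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
        return out.toString(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        InputStream payload = new ByteArrayInputStream("some payload".getBytes(StandardCharsets.UTF_8));

        System.out.println("first read:  '" + consume(payload) + "'");
        // The stream is now exhausted, so a second consumer sees nothing.
        System.out.println("second read: '" + consume(payload) + "'");

        // The classic workaround: cache the whole payload in memory yourself and
        // hand each consumer a fresh stream over the cached copy.
        byte[] cached = "some payload".getBytes(StandardCharsets.UTF_8);
        System.out.println("replayed:    '" + consume(new ByteArrayInputStream(cached)) + "'");
    }
}

Mule 4's repeatable streams take that buffering concern off your hands, which is what makes the parallel reads described above possible without any caching code of your own.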

(more…)

Connectivity is the Cornerstone of the Digital Revolution

March 17 2017


It had just turned 2 a.m. on April 18th, 2003. I was getting ready to release my first open source project, and I was about to pull the trigger on a name. I settled on MULE as I was trying to solve the hard, unrelenting work of connecting applications and data—MULE was going to take the “donkey work” out of integration. What I didn’t know was that I was choosing the stock ticker symbol for a company that would go on to solve much bigger problems for companies globally. It’s pretty cool to reflect on that now, and it has been an amazing experience getting here.

In 2003, I was trying to solve a hard technical problem. I wanted to make it less painful and more efficient for people like me and my teams to connect disparate systems. I was working for an investment bank in London, where a project to connect seven systems was going to cost 30 million euros and take 18 months. I was struck by how incredibly painful and complicated this turned out to be.

I looked at the middleware market—ostensibly the companies who could make this easier—and realized that it was highly fragmented. The lightning bolt that hit me was that all the problems of connecting systems and data could be boiled down into a distinct set of components. Every connectivity problem could be solved using a different combination of these building blocks. The old way was to build specific software offerings for every connectivity problem, whether batch, EAI, orchestration, event-driven architecture, web services, file movement, or many more. My new way was to build a common set of components that you compose in different ways to solve each of these connectivity and architectural problems on one platform.

After much complaining to my girlfriend (now my wife), she got frustrated and told me to stop talking about it and do something about it. So, I built a container, established the components, and productized them. I had two key principles in these early days of MULE:

  1. The software has to do what you say it does. It seems obvious, but so much software over-promises and under-delivers.
  2. Own the success of MULE’s users. This means you support any user as if they were your friend and then pave the way with documentation, examples or bug fixes so others can get further on their own.

This made the technology easier to consume and productized an area of the IT market that had never been properly productized before.

It is interesting and fortunate that the patterns that we established in MULE back then have been very relevant and are replicated throughout emerging trends today, like moving to the cloud. The patterns are very similar to one another. The unique way in which we’ve solved connectivity has turned out to be flexible enough and adaptable enough to work in many different situations. We call it declarative modularity. It’s a pretty simple idea that a software component should do one thing well, and expose its interface so it can be used in different ways but always provide the same result. You could argue this is just good software practice; we’ve just applied it to a large set of problems and made it fly. We’ve changed the notion of what’s possible in enterprise technology by approaching this problem differently than anyone else. And that’s very much part of our DNA as a company. We’ve had to pivot, we’ve always experimented and will continue to do so; the fundamental premise of the way we solve this significant problem continues to be our compass.

The reason connectivity is so important is that, not surprisingly, everything is becoming more connected. In this new digital era that we now live in, connectivity is the foundation of every consumer experience, business model, process, product and service; every digital innovation needs connectivity to make it viable. The more connections you make, the more value you get out of digital assets. Connectivity links your applications, data and devices with those of partners, suppliers, customers and even employees—to create a new breed of service offerings and capabilities. Connectivity is critical to new technology advances in areas like the internet of things (IoT), artificial intelligence (AI) and augmented reality (AR). None of this works if you can’t connect to the data and assets. And given the number of applications, data sources and devices, the old way of doing this is just too heavy, costly and resource intensive.

The way in which you connect really matters. With the massive growth of cloud applications, APIs, microservices, IoT, and real-time big data, as well as new technology like AI, intelligent voice interfaces and VR/AR emerging, I’ve seen a change in the way organizations are thinking about this problem. Connectivity is critical to most (if not all) projects and no longer can central IT teams be order-takers to deliver projects to the business. The demands on IT are too high. Instead, IT is looking to deliver more of their capability “as-a-Service” to the business and then enable business users to self-serve more on their own. The modern API is at the center of this shift; it has become the de facto way to exchange value between providers and consumers.

The internet exploded thanks to HTML and browsers making it possible for anyone on the web to exchange information. I believe we are at the beginning of the next era of the web where modern APIs are making any capability available to people and machines. We’ve already seen the lighthouses in the consumer space like Amazon and Uber make the impossible possible by leveraging modern APIs, and I believe we’re going to see more innovation in this era. It’s a matter of survival.

In the next few years, I expect we’ll see many organizations make tremendous gains in speed, agility and innovation that will rival those in the consumer space, and it’s because business leaders are starting to realize that reusable, self-service connectivity is the way to get value out of all their enterprise assets. I believe it is the cornerstone of the next digital revolution and the opportunity we face today.

We believe the way for organizations to drive this transformation is through a new concept called application networks. These are networks of applications, data and devices connected with APIs to make them pluggable and to create reusable services. APIs are the building blocks used to define how data is accessed, exposed and shared across the application network. What makes this so transformational is that now a much broader community of developers or analysts can get self-serve access to data and capabilities within the organization, as well as partners externally, without being bottlenecked by IT. We are seeing strong traction with this concept in our customer base. I believe many organizations in the next few years will be delivering much more of their IT capability to the business through their application network, which will unlock the value of digital assets within organizations, change the role of IT and make businesses more agile and drive innovation.

One of the core strengths that has pushed us as a company and opened the aperture to what is possible in enterprise technology is our people. We've managed to pull together some of the best and brightest in this company, and I truly believe there's nothing we can't do. I am extraordinarily grateful to every Muley. I also want to thank our developer community, which has adopted MuleSoft so enthusiastically and tirelessly, and which continues to help it expand and evolve.

I also want to thank our customers, who have really changed the way businesses operate with technology. A great example is McDonald’s. What I love about their story is that they are striving to create a set of capabilities across the entire global franchise, a digital platform for restaurant owners and new digital channels for their consumers. This holistic, transformative way of thinking is the right way to think about evolving the modern business. And it’s exciting to be a part of that for all of our customers.

We are just getting started. Right now, this is an exciting point in our journey. The stage we've created for change in the enterprise is tremendous, and it would be a shame not to push it as far as possible. As for me, it's really rewarding to say we are changing the way businesses operate to drive competitive outcomes. With the support of our team, our customers, our developers, and our partners, we believe the possibilities are endless. And that's what makes me excited and proud to be a Muley.


 

Webinar: The Best of Anypoint Studio: MuleSoft Champions Edition [Demos]

February 28 2017


Since 2015, the MuleSoft Champions program has grown to over 12,000 developers from around the world. Developers not only complete challenges to redeem prizes (from training and certification vouchers to items from the MuleSoft swag store), but they also share valuable best practices with others in the program.

To celebrate the great thought leadership that is brewing within the community, we invited 3 champions from the program to share, through a demo, their favorite Anypoint Studio feature. On Wednesday, you’ll see:

  • Francesco Ganora, MuleSoft Champion, showing how to collaborate on API design within Studio and API designer
  • Harish Kumar, Solution Architect, AXA Direct Insurance Japan, sharing how to use MUnit and Maven to run unit and functional tests
  • Karthik Selvaraj, Senior Technical Consultant, Perficient India Pvt Ltd, going over how to do batch integration and basic data transformation using DataWeave.

What are these features?  

API design collaboration using API designer and Anypoint Studio

The goal for Anypoint Platform is to make collaboration as easy as possible. As part of this effort, we brought API design to every developer by embedding API design capabilities in Studio. This allows each developer to collaborate on designing an API while staying within their own preferred tooling (API owners in API designer and integration specialists in Anypoint Studio). Within each tool, they have the ability to push/pull changes made by the other developer, which makes real-time collaboration simple. 

Using MUnit & Maven to run MUnit tests

We've seen great uptake and feedback on MUnit, the testing framework for Mule, since its launch last year. With MUnit, it is simple to do unit and integration testing for Mule applications, and you can easily integrate it into your existing CI/CD process. This demo will show you how to do basic testing with MUnit and how you can use Maven to run MUnit tests.

Doing batch integration & basic DataWeave transformation

If you're getting started with Anypoint Studio and want to learn how to do batch integration, don't miss Karthik's demo! It will show how to take data from a SQL database, transform that data into JSON using DataWeave, and trigger the batch load using an HTTP call. While this example only covers how to bulk load using a manual trigger (an HTTP call), users can easily set up polling at a fixed frequency (e.g., every hour or day).

All of this in 1 webinar! Don’t miss it and register below!

Webinar Date: Wednesday, March 1, 2017

Time: 10:00 am – 11:00 am PST

register now


 

AQuA API with Zuora Connector

November 29 2016


Zuora offers three different types of API: the REST API, the SOAP API, and the AQuA API. The majority of Zuora customers are believed to use the SOAP API to integrate with Zuora because of its breadth. (MuleSoft uses the SOAP API internally!) While MuleSoft's Zuora Connector was based on the SOAP API, we started to hear from customers that some tasks are easier with the AQuA API, and Zuora is now recommending that customers use it. To support this trend, we are excited to announce the release of Zuora Connector v3.1.0.

For those who are not familiar with it, AQuA stands for Aggregate Query API. The AQuA API offers an easier way to send multiple queries to Zuora and retrieve the results as files or segments of files. One caveat is that not all the queries may complete at the same time, so the user needs to call Zuora periodically to check their status. We thought the experience around this could be better and wanted to build a connector that addressed some of these limitations. So instead of relying on status-check logic you build yourself in Studio (which still works, since the connector supports the AQuA API), our recommended approach is to put the job in an object store and let the connector return the file IDs with the query results once the job is finished.
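If you are curious what the connector is automating, here is a rough Java sketch of that post-then-poll interaction. It is illustrative only: the endpoint paths, JSON fields, and the naive extract() helper are assumptions made for the sketch, and authentication is omitted, so consult Zuora's AQuA documentation for the real contract.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AquaPollingSketch {

    // Illustrative base path only -- check Zuora's AQuA documentation for the real endpoints.
    private static final String BASE = "https://apisandbox.zuora.com/apps/api/batch-query/";

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // 1. Post a job containing one or more ZOQL / Export ZOQL queries.
        String jobBody = "{\"format\":\"csv\",\"name\":\"demo-job\",\"queries\":["
                + "{\"name\":\"invoices\",\"query\":\"SELECT AccountId FROM Invoice\",\"type\":\"zoql\"}]}";
        HttpRequest post = HttpRequest.newBuilder(URI.create(BASE))
                .header("Content-Type", "application/json")
                // Authentication headers omitted for brevity.
                .POST(HttpRequest.BodyPublishers.ofString(jobBody))
                .build();
        String jobId = extract(client.send(post, HttpResponse.BodyHandlers.ofString()).body(), "\"id\"");

        // 2. Poll the job periodically until it reports completion.
        String status;
        do {
            Thread.sleep(1000);
            HttpRequest poll = HttpRequest.newBuilder(URI.create(BASE + "jobs/" + jobId)).GET().build();
            status = extract(client.send(poll, HttpResponse.BodyHandlers.ofString()).body(), "\"status\"");
        } while (!"completed".equalsIgnoreCase(status));

        // 3. The completed response carries file IDs whose contents can then be downloaded.
        System.out.println("Job " + jobId + " finished; fetch each batch's file ID next.");
    }

    // Naive helper that pulls the string value following a quoted key out of a JSON payload.
    private static String extract(String json, String quotedKey) {
        int keyIndex = json.indexOf(quotedKey);
        int start = json.indexOf('"', keyIndex + quotedKey.length() + 1) + 1;
        return json.substring(start, json.indexOf('"', start));
    }
}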

In this blog post, I will walk you through a demo app to show how to use the AQuA API with the Zuora Connector. Before you start with the demo app, please make sure you have the Zuora Connector v3.1.0 downloaded from the Exchange.


After you install the Zuora Connector, load this demo app called zuora-aqua-api-operations-demo. This blog post will focus on the last two flows of that Mule application.


Once you load the example, let’s configure the connector. To configure the Zuora Connector, go to Global Elements, and find “Zuora.” After selecting “Zuora,” click “Edit”.


In the “Zuora: Configuration”, you can specify the configuration for Zuora. You could directly add your Zuora information in the configuration, but we recommend using the properties file to add your configuration information. In mule-app.properties, please configure the following properties based on your Zuora instance.

config.username=
config.password=
config.endpoint= (i.e. https://apisandbox.zuora.com/apps/services/a/75.0 )
config.restEndpoint= (i.e. https://apisandbox.zuora.com/apps/api/)
config.wsdlLocation= (Please locate your Zuora WSDL. You might need to download a WSDL file from Zuora.)

After you complete the configuration for your Zuora environment, run the app. When you open up a browser and hit localhost:8081, your browser will show the following page. Please click on Post Query Results to Object Store.

[screenshot]

The Post Query Results to Object Store form allows you to send two queries in one job: one in Export Zuora Object Query Language (Export ZOQL) and the other in ZOQL. For this demo, the following queries have been used:

  • SELECT name,Currency,Status FROM Account WHERE name='Name Test for PostQuery' in ExportZOQL Query
  • SELECT AccountId FROM Invoice in ZOQL Query

When you go back to Studio and check the folder (src/test/resources), you will find the following two files in less than a minute.

[screenshot]

2c92c08557ff9b2e01580215bf9832e5 includes the results of SELECT AccountId FROM Invoice in ZOQL Query. 

4028e6963424ef6001342503xxxx
4028e69734172d4f01341fd11xxxx
2c92c0f83b02a9dc013b05d1dxxxx
2c92c0f83b02a9dc013b05d1d3xxxx
2c92c0f83b02a9dc013b05d1d5xxxx
2c92c0f83b02a9dc013b05d1d7xxxx
2c92c0f83b02a9dc013b05d1d9xxxx

2c92c08557ff9b2e01580215bfe632e6 includes the results of SELECT name,Currency,Status FROM Account WHERE name='Name Test for PostQuery' in ExportZOQL Query.

Account: Name,Account: Currency,Account: Status
Name Test for PostQuery,USD,Draft
Name Test For PostQuery,USD,Draft
Name Test for PostQuery,USD,Draft
Name Test For PostQuery,USD,Draft

Let me briefly explain how I retrieved these two files. As you can see below, when "Save Job to Object Store" is checked, the Post Query response with its id and batchId(s) is stored in an object store (in the demo, PostQueryResultsStore).

[screenshot]

In the following flow, the Zuora Connector, acting as the flow's source, checks the status of the job using the Post Query response stored in PostQueryResultsStore by calling Zuora periodically (every second in this demo). Once the job is completed, the rest of the flow continues, and the Zuora Connector's "Get Export File Stream" operation retrieves a file for each query.

[screenshot]

For new users, try the above example to get started, and for others, please share with us how you are planning to use the Zuora Connector! Also, explore the Anypoint Exchange to see other resources you can leverage today.


 

Using Bulk API with MongoDB Connector

November 1 2016


You can visit the MongoDB Connector page here.

With MongoDB being one of the most popular NoSQL databases, we are excited to announce the release of our MongoDB Connector v4.2.0. This version includes improvements in connector configuration and support for batch/bulk operations. Let's walk through an example of using the bulk operation (Bulk.insert()) with the MongoDB Connector v4.2.0.

Since v2.6, MongoDB has supported bulk operations. If you list bulk operations and execute them with Bulk.execute(), MongoDB groups your operations (up to 1000 operations) by operation type. Among these bulk operations, Anypoint Connector for MongoDB v4.2.0 supports Bulk.insert().
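For context, the sketch below shows roughly what a bulk insert looks like with the plain MongoDB Java driver, which is the behavior the connector's "Insert Documents" operation surfaces inside Mule. The connection details and document contents are placeholders, and the unordered option is shown because it matches the failure behavior described later in this post; the connector's exact driver settings may differ.

import com.mongodb.MongoBulkWriteException;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.BulkWriteOptions;
import com.mongodb.client.model.InsertOneModel;
import org.bson.Document;

import java.util.ArrayList;
import java.util.List;

public class BulkInsertSketch {
    public static void main(String[] args) {
        // Placeholder connection details -- use your own host, credentials, and database.
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            MongoCollection<Document> demo = client.getDatabase("test").getCollection("demo");

            // Build one InsertOneModel per document instead of issuing 5,000 separate insert calls.
            List<InsertOneModel<Document>> inserts = new ArrayList<>();
            for (int i = 0; i < 5000; i++) {
                inserts.add(new InsertOneModel<>(new Document("index", i).append("source", "bulk-demo")));
            }

            try {
                // ordered(false): a failing document does not stop the remaining inserts.
                demo.bulkWrite(inserts, new BulkWriteOptions().ordered(false));
            } catch (MongoBulkWriteException e) {
                // Only the documents that actually failed are reported here.
                System.err.println("Failed writes: " + e.getWriteErrors().size());
            }
        }
    }
}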

Before getting started, please make sure you have v4.2.0 downloaded in Anypoint Studio. If not, please download it from Exchange.


Find the demo-batch app from this page and import it into Studio. The app shows you two different use cases: using the "Insert Documents" (bulk) operation in a simple flow as is, or using it in a batch flow.

[screenshot]

 

Once you import the demo app, go to "Global Elements" and open the MongoDB configuration by clicking "Edit."


You can specify your MongoDB configuration here, but I recommend you use mule-app.properties instead. In mule-app.properties, configure the following keys:

username=
password=
database=
host=
collection=demo (you can choose a different collection.)

If you start the demo app in Studio and hit localhost:8081/ with your browser, the following documents will be added to the collection called "demo". In this case, the first set of documents is added by "Bulk Insert" in a simple flow, and the next set is added by "Bulk Insert" inside a batch job.

Before the "Insert Documents" (Bulk.insert()) operation existed, a user who wanted to insert 5,000 documents into MongoDB had to use the "Insert Document" operation 5,000 times. If, for example, the 500th document failed to be inserted, the remaining 4,500 documents would not be inserted. Now, in the same scenario, you can use "Insert Documents" once to insert all the documents, and one failed document will not prevent the rest from being inserted.

Also, you can use the "Insert Documents" operation in a regular flow or in a batch flow. If you use it in a regular flow and process a huge number of documents, you need to make sure you have enough memory, or manage memory carefully so it doesn't overflow. If you use it in a batch flow, however, batch processing handles memory management for the large document set, and you can focus on handling the failed records.

For new users, try the above example to get started, and for others, please share with us how you use or are planning to use “Insert Documents” in the MongoDB Connector! Also, explore the Anypoint Exchange to see other resources you can leverage today.


 

EDI is System to System not B2B

October 27 2016


90% of everything is moved around the world in containers across countries and continents. The containerization of cargo is a highly efficient and open standard of transport. It’s efficient because the same physical container can be easily handled by a super sophisticated automated terminal in Rotterdam as well as the more basic terminals across Africa and Asia. It’s a simple structure that can easily flow through and between businesses and it’s easy to work with agnostically.

However, the data that often accompanies these containers is crippled by manual protocols and archaic structures that are difficult to adapt (or adopt). Most container terminals and their business partners use the EDIFACT or ANSI EDI standards. This is an inter-industry standard of Electronic Data Interchange (EDI), but as a standard, it's pretty cumbersome to deal with.

EDI Message Structure

Understanding should be unambiguous

To give you an idea of what these messages look like, take a look at the following EDIFACT message:

  • Try to understand what the EDI message is for
  • Then look through it and determine what the key pieces of information are
  • What percentage of the information did you understand?
  • How long did it take you to read through it and answer the above questions?

UNA:+.? '
UNB+IATB:1+6XPPC+LHPPC+940101:0950+1'
UNH+1+PAORES:93:1:IA'
MSG+1:45'
IFT+3+XYZCOMPANY AVAILABILITY'
ERC+A7V:1:AMD'
IFT+3+NO MORE FLIGHTS'
ODI'
TVL+240493:1000::1220+FRA+JFK+DL+400+C'
PDI++C:3+Y::3+F::1'
APD+74C:0:::6++++++6X'
TVL+240493:1740::2030+JFK+MIA+DL+081+C'
PDI++C:4'
APD+EM2:0:1630::6+++++++DA'
UNT+13+1'
UNZ+1+1'


Many people will recognize that this is a message related to flight bookings; it's actually an EDI/B2B exchange between an OTA (Online Travel Agent) and a flight booking system. You may be able to decipher some of the airport acronyms. However, like me, most will answer that they understood at most 20% of the message, and if you tried hard it probably took around 5-6 minutes and a lot of head scratching. These EDI-based B2B interactions are common, and many are significantly larger and more complicated.
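To make the structure slightly less opaque: an EDIFACT interchange is a sequence of segments terminated by an apostrophe, with data elements separated by + and components separated by :, and with ? as the release (escape) character declared in the UNA header. A rough Java sketch that splits a few of the segments above (ignoring the UNA header and the release character for brevity) looks like this:

import java.util.List;

public class EdifactSplitSketch {
    public static void main(String[] args) {
        String message =
                "UNB+IATB:1+6XPPC+LHPPC+940101:0950+1'" +
                "UNH+1+PAORES:93:1:IA'" +
                "TVL+240493:1000::1220+FRA+JFK+DL+400+C'" +
                "UNT+13+1'UNZ+1+1'";

        // Segments end with an apostrophe; elements are separated by '+', components by ':'.
        for (String segment : message.split("'")) {
            List<String> elements = List.of(segment.split("\\+"));
            String tag = elements.get(0);
            // e.g. the TVL segment yields [240493:1000::1220, FRA, JFK, DL, 400, C]
            System.out.println(tag + " -> " + elements.subList(1, elements.size()));
        }
    }
}

Even with the message mechanically split apart, you still need domain knowledge to interpret what each segment tag (TVL, PDI, APD, and so on) actually means, which is exactly the expertise problem described next.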

Can you imagine now building a modern day iPhone or Android app and asking a whiz kid developer to deal with messages like the above? The likelihood would be that:

  1. You end up having to supplement your UX developer with an EDI expert
  2. You then supplement your two developers with a domain expert on flight bookings
  3. You end up with three times the cost and deliver your project twice as slowly.

Imagine you now need to change some of the underlying systems and the messages. Who would you need to be involved to make sure the change does not impact the end UI your customers see? How quickly could you make that change?

I imagine at this point you're thinking "I really wish I didn't have to deal with EDI," and you're not alone. Many modern-day systems end up using parsers to get the data into a more human-readable format before processing it, but when working with multiple systems, you may just end up recreating the mess you were trying to avoid: no single format for exchanging information.

Transmission

Communication should be transparent

Unfortunately, things now get worse. The UI developer likely has to deal not only with the message and its structure, but also with the way these messages are passed around. In a shipping terminal, the likelihood is that you may get this data from various channels:

  • A flat file from an FTP server that needs to be batch processed
  • An attachment to an email that needs cleaning up
  • Maybe the odd web service
  • In an advanced terminal, maybe JMS

All of these different methods create a burden on your developer to consume EDI transactions. They have to invest time in building more point-to-point integrations to fetch/receive EDI messages and to send/respond to messages. This keeps slowing down your UI developer's real goal: creating a new user-friendly app on the latest smartphones.

Once again you may "fast track" this by adding another expert to deal with the communication protocol, but this adds cost, time, and complexity, once again creating technical debt that hinders you from adapting quickly if one of the underlying systems changes.

How do developers traditionally deal with the issue?

Agnostically working with data

The reason your UI developer has issues is that their sphere of influence is their app; building in all the components to deal with different message types and transmission protocols is typically done within their application. They will likely need to master this, which, although it may speed up the next project, also means they may continue to build upon this constrained and fragile model, creating more point-to-point solutions. Relying solely on EDI between applications perpetuates technical debt by incorporating the processing and transmission logic within each application.

The unintended consequences of EDI

Silos of data

Great value can be found in analyzing and harvesting data for traditional businesses in the transport & logistics, manufacturing, and healthcare industries; exposing these digital assets can greatly improve their revenue and reduce their costs:

  • Providing visibility as a value added service to customers
  • Sharing information to support other links in the supply chain for just in time (JIT) manufacturing
  • Predictive analytics supporting better process optimization

Will industries change?

Absolutely, and it's already started! More modern-day systems already use various types of parsers to convert data back into a more usable format such as XML. Larger EDI community systems have already begun discussions with partners on leveraging a better, XML-based data transmission format. Destin8 (MCP PLC), a UK-based port community system, already offers XML-based formats for several types of messages, and this trend will likely continue as their customers modernize systems, often building on much more modern technologies.

Where EDI excels

EDI is here to stay; after decades of use across many different industries, it is well established as the de facto standard for communicating between systems, and therefore for B2B communication.

EDI provides a common structure and language for communicating via shared standards. Although these standards are often interpreted differently, the overall structure remains the same. EDI is also a very dense medium for transmitting information, thanks to its highly abbreviated formats.

Therefore EDI is something that we need to work with, but working with it in a smart way and ensuring that it does not hinder future innovation is key.

The best of both worlds

Legacy modernization

For those with legacy systems, adding a wrapper to modernize their interactions with other systems is far more cost effective (and efficient) than re-implementing mission-critical systems. This enables developers to quickly consume data without having to understand EDI; it also unlocks the data so they can look at how to innovate with it, whether by providing value-added services and visibility to their customers or by using the data to optimize their business processes.

API-led connectivity

The alternative (or next step) is to have connectivity at the heart of the business and IT landscape, ensuring that the application network is structured in a way that:

  • Data is easy to access and discoverable
  • Information can be easily aggregated from multiple systems
  • Developers are insulated from back-end systems, allowing them to focus on innovation
  • Legacy systems can be replaced in a controlled way
  • New systems can be added to the application landscape without the need for deep experts in back-end systems
  • In essence, creating a composable layer of APIs that adapts to the business's needs

Take a look at how some businesses are adopting an API-led approach to connectivity and innovating on top of EDI, improving their business agility and creating new and valuable experiences for their customers and stakeholders.


Understand your customers with Watson AlchemyLanguage Connector

September 29 2016


The amount of data in the world is growing exponentially. IDC predicted in 2014 that the amount of data on the planet would grow 10-fold in the next six years, from around 4.4 zettabytes to 44 zettabytes by 2020. Additionally, the majority of recently generated data – like the comments you just made on Facebook – is unstructured or unmanaged.

Luckily, with leading enterprises such as Amazon (Alexa), Apple (Siri), and IBM (Watson) investing in AI, machine learning, and natural language processing, this unstructured data has the potential to be transformed into meaningful, informative data for enterprises.

Among many players, IBM Watson has been driving the adoption of cognitive computing in enterprises, which is believed to become a $2 trillion opportunity over the next decade. For example, Sesame Workshop, the nonprofit educational organization that produces Sesame Street, is developing a new educational platform and products that can adapt to individual preschoolers' learning preferences and aptitude levels. In healthcare, many hospitals are using or planning to use IBM Watson to find cures for rare diseases.

To help more organizations explore the possibilities in machine learning and natural language processing, IBM opened its services through APIs in its developer cloud. To help MuleSoft's ecosystem participate in this opportunity, Admios developed the MuleSoft Certified IBM AlchemyLanguage Connector, leveraging IBM's APIs that enable text analysis through natural language processing. Using this connector, organizations can understand what customers are saying about them and their products, and turn those insights into meaningful actions.

Using the connector for IBM® Watson AlchemyLanguage

The following example shows how to analyze tweets with AlchemyLanguage.

First, set up your mule project by taking the steps below:

  1. Make sure you have the latest version of Anypoint Studio. (i.e. 6.x.x)
  2. Since the app will use MuleSoft’s connector for Twitter, please find the following information from your Twitter account.
    1. twitter.access.key=
    2. twitter.access.secret=
    3. twitter.consumer.key=
    4. twitter.consumer.secret=
  3. In order to use the AlchemyLanguage Connector, you need to get an API Key from IBM BlueMix. If you don’t have an IBM BlueMix account, you can create one here.

Once you complete the instructions above, please create an empty Mule project in Studio and download the Twitter Connector and the AlchemyLanguage Connector.

 

Copy and paste the following XML into the "Configuration XML" view of your Mule project, and put the connector configurations for the Twitter Connector and the AlchemyLanguage Connector in mule-app.properties.

After these steps, you will see the following flow in your Message Flow view. When you run this flow, it polls for 15 tweets containing @mulesoft every 60 seconds and uses the AlchemyLanguage Connector in a batch flow to analyze the sentiment of each tweet.

[screenshot]

Here are some results of sentiment analysis I performed with this flow.

Replace @mulesoft with your company’s Twitter handle, or the Twitter handle for your products to quickly and easily assess the sentiment of the tweets you are receiving. Of course, you can also replace the Twitter Connector with Facebook or any other social media platform.

[screenshot]

For new users, try the above example to get started, and for others, please share with us how you are using or planning to use Watson AlchemyLanguage or other natural language processing services! Also, feel free to check out MuleSoft Certified Connectors or visit the connector certification program page to learn more and be a part of MuleSoft ecosystem.


 

How to solve the top 4 retail B2B challenges

September 26 2016


You don’t need us to tell you how drastically the retail industry is changing. The convergence of channels driven by always-connected consumers and emerging technology is completely altering the way we shop for everything. To keep up with the pace of change, retailers need to be agile, flexible, and adaptable, which isn’t always easy with B2B technology standards. Here are the top 4 B2B challenges that retailers have to solve in order to thrive amidst digital disruption.

The top 4 B2B challenges for retailers

What are the main challenges that retailers are facing? They all have to do with speed and agility — how quickly they can adapt to change.

1) Responding to consumer trends
Consumers want more high-quality goods at a lower price delivered quickly across multiple channels. That’s not too much to ask, is it? Shoppers are focused on price and convenience no matter where they shop, whether it’s in-store or online — or some combination of both. However, there is a disconnect between what shoppers expect of a modern omnichannel retailer and what they often get. Providing the shopper what they want when they want it requires a streamlined, connected supply chain which can provide the development, sourcing, and production of seasonal and in-demand products just-in-time to service multiple markets and channels.

2) Reducing Operational Inefficiency
Every retailer wants to maintain an efficient operation, which means that managing complexity and change is a necessity. But it's quite difficult to do. Organizational silos hamper communication, consistency, and collaboration across the business, and the dependency on manual operations, particularly in B2B transactions, is often cumbersome. Believe it or not, according to a supply chain benchmark study from Boston Retail Partners, 46% of North American retailers are using spreadsheets to manage their supply chain planning. This isn't just inefficient; it can actually create costs.

3) Staying ahead of slipping profit margins
Anything that creates extra work and inefficiency affects the retailer’s bottom line — profit. We know customers are price-sensitive, the global economy is still on shaky ground, and consumer shopping patterns are changing.
Retailers are turning to technology to help them shore up declining profit margins. In an IDC Retail Insights Survey, “improving profit margins” was listed as a top objective driving IT investment.

4) Implementing omnichannel integration
Multiple selling channels are expanding, and quickly. These days, consumers who use at least three channels when shopping spend more, generate more profit, and are much more loyal. In order for multi-channel shopping to be a satisfying experience for consumers, retailers must offer a consistent product offering, price, and customer experience across all of their channels. Therefore, retailers have to align organizational objectives, streamline and improve their business processes, make order and inventory management more seamless, and deliver information to consumers in a timely and relevant fashion.

Is there a single solution to these challenges?

To achieve many of the objectives outlined above, retailers rely on B2B communication. And the global standard for B2B communication in retail is the notoriously cumbersome EDI. Over time, EDI message standards have become frayed, such that the vision of a consistent, industry-wide message standard providing economies of scale and scope in B2B interactions has become significantly diluted.

EDI messages are verbose and reflect a previous age where B2B/EDI transactions were traditionally processed in batches, rather than the real-time processing and processing transparency that today’s trading landscape demands. B2B systems that transmit EDI messages are integrated into a point-to-point system, rather than a platform, increasing fragility and complexity, and creating yet more demands on central IT to maintain.

When EDI first came into use, supply chains were simpler, with a limited number of suppliers. Now, thanks to globalization and specialized manufacturing, there might be hundreds of suppliers in a supply chain, which, as market conditions evolve, may change as well. Businesses want control and visibility over their processes, even as they grow more complex, because they are so integral to business strategy and success.

Yet EDI isn’t going away anytime soon. To succeed, companies must find a way to marry new architectural approaches to B2B/EDI challenges that insulate the legacy technologies from the new and abstract its limitations.

MuleSoft's work with retail leaders suggests that it is possible to solve the challenges outlined above, despite the constraints that B2B/EDI can impose. These organizations are applying API and microservices approaches that support innovation while still utilizing B2B/EDI technologies.

Modernizing B2B/EDI through API-led connectivity

API-led connectivity is a multi-layered approach that scales IT capacity by emphasizing modular components, decentralized authority over application development, and reusable assets. It is a different IT operating model; it promotes decentralized access to data and capabilities while retaining security.

[Diagram: three-layered API-led connectivity]

API-led connectivity calls for a three-layered architecture with the following components (a minimal code sketch follows the list):

  • System Layer: Underlying all IT architectures are core systems of record (e.g. one’s ERP, key customer and billing systems, proprietary databases etc). Often these systems are not easily accessible due to connectivity concerns; APIs conceal that complexity from the user. System APIs provide a means of accessing underlying systems of record and exposing that data, often in a canonical format, while providing downstream insulation from any interface changes or rationalization of those systems. These APIs will also change more infrequently and often will be governed by central IT given the importance of the underlying systems.
  • Process Layer: The business processes that interact with and shape this data should be encapsulated independently of the source systems where the data originates, as well as from the target channels through which data is to be delivered. For example, in a purchase order process, there is some logic that is common across products, geographies, and retail channels that can and should be distilled into a single service that can then be called by other services.
  • Experience Layer: Data is now consumed across a broad set of channels, each of which wants access to the same data but in a variety of different forms. For example, a retail branch POS system, e-commerce site, and mobile shopping application may all want to access the same customer information fields, but each will require that information in very different formats. Experience APIs reconfigure data so that it is most easily consumed by its intended audience, all from a common data source, as opposed to setting up separate point-to-point integrations for each channel.
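To make the layering concrete, here is a deliberately tiny Java sketch of the idea. All class and field names are hypothetical, and in practice each layer is a separately built and deployed API rather than a class in one program; the point is only that the experience layer reshapes what the process layer composes from system-level records.

import java.util.Map;

public class ApiLedLayersSketch {

    // System layer: exposes a record from a core system (ERP, billing, ...) in a canonical shape.
    static class CustomerSystemApi {
        Map<String, String> getCustomer(String id) {
            return Map.of("id", id, "name", "Jane Doe", "creditLimit", "5000");
        }
    }

    // Process layer: business logic shared across channels, composed from system APIs.
    static class OrderStatusProcessApi {
        private final CustomerSystemApi customers = new CustomerSystemApi();

        Map<String, String> orderSummary(String customerId) {
            Map<String, String> customer = customers.getCustomer(customerId);
            return Map.of("customer", customer.get("name"), "openOrders", "2");
        }
    }

    // Experience layer: reshapes the same data for one specific channel (here, a mobile app).
    static class MobileExperienceApi {
        private final OrderStatusProcessApi process = new OrderStatusProcessApi();

        String orderCard(String customerId) {
            Map<String, String> summary = process.orderSummary(customerId);
            return summary.get("customer") + " has " + summary.get("openOrders") + " open orders";
        }
    }

    public static void main(String[] args) {
        System.out.println(new MobileExperienceApi().orderCard("42"));
    }
}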

In the EDI context, an API-led connectivity approach provides both the flexibility to serve different partners and tight control over core ERP systems.

The architectural benefits of this approach are a decoupled architecture that abstracts away complexity, and a more agile response to change. All of your channels are able to reuse the same process logic, so as you onboard new partners, you only need to manage the logic of receiving messages; the purchase order processing logic is already baked in, allowing greater speed.

Case study: Engaging customers in new ways and new markets

One retailer in North America has a business model of bricks-and-mortar stores that show goods you can rent instead of purchase. You pay a monthly fee to lease sofas, TVs, desks, etc. They realized that revenue growth potential is limited by the number of bricks-and-mortar stores they have, so they decided to enter into a number of partnerships with other retailers to install digital kiosks or booths in those retailers' stores, allowing those retailers to also benefit from the company's services, albeit through the digital kiosk.

They are able to build an end-to-end supply chain process from customers buying things in the kiosks, to actual purchase orders being generated on the back end. This is an example of a business being extended by B2B/EDI functionality and engaging their customers in new and interesting ways.

Interested in more?

MuleSoft has numerous resources for retailers taking on digital transformation initiatives. Take a look at more information on modernizing your supply chain, taking on omnichannel initiatives, and connecting silos in your organization.


 

The 5 most important Salesforce integration patterns

September 20 2016


Modern businesses require modern customer relationship management. For many organizations, Salesforce fulfills that need. Salesforce can increase and accelerate sales, grow customer loyalty, and enhance marketing capabilities. It gives teams across an organization the ability to access and leverage the most up-to-date customer information in order to streamline business processes and create the most effective services and solutions. But in order for it to happen, businesses need to develop a Salesforce integration strategy to make sure it connects with the necessary enterprise systems.

Effective and efficient Salesforce integration with systems like databases, ERP systems, and custom applications is critical to making it a valuable business tool. Many businesses are recognizing the importance of Salesforce integration and are developing connections between Salesforce and the adjacent systems. But these types of point-to-point integrations are neither practical nor sustainable. There are numerous touchpoints and opportunities for Salesforce integrations to provide value for an enterprise, such as dealing with legacy systems, incorporating systems from M & As, developing a partner ecosystem, or building new company initiatives. All of these needs will lead to uncovering new opportunities, new ways to approach an account, or new value-adds to your customers, and all will need to be integrated into Salesforce.

Salesforce Integration Challenges Are Addressed by Common Integration Patterns

When considering the variety of Salesforce integration needs, common patterns emerge for how to address them. Patterns, as denoted here, are the most logical sequence of steps to solve a specific type of Salesforce integration problem, and are established from actual use cases.

First, we need to define an integration application; it includes both a business use case and a pattern. The business use case consists of the value obtained from a particular integration, and the pattern is the generic process for data movement and handling.

The following structure can be used to delineate the format of a simple point-to-point atomic integration:

Application(s) A to Application(s) B – Object(s) – Pattern

In order to templatize common integration needs or best practices, patterns must first be established to make integrations reusable, extensible, and understandable. A pattern must contain a combination of at least two of the below elements:

  • The source system where data resides prior to execution
  • The criteria that determines the scope of data to be copied, moved or replicated
  • The transformation which the data set will undergo
  • The destination system where the data will be inserted
  • The results capture to compare the original state with the desired state.

The Five Most Common Salesforce Integration Patterns

The five most common Salesforce integration patterns are:

  1. Migration
  2. Broadcast
  3. Aggregation
  4. Bi-directional synchronization
  5. Correlation

The Migration Pattern

[Diagram: migration pattern]

Data migration is moving a specific set of data at a particular point in time from one system to another. A migration pattern allows developers to build automated migration services that create functionality to be shared across numerous teams in an organization. Developers can set the configuration parameters to pass into the API calls, so that the migration can dynamically migrate scoped Salesforce data in or out of Salesforce either on command or on an as-needed basis via an API. One way to save a great deal of time for development and operations teams is to create reusable services for frequent data migrations.

Migrations are appropriate for numerous Salesforce integration use cases, including migrating data from a legacy system to Salesforce, backing up a customer master dataset, consolidating CRM systems, etc. They are intended to handle large volumes of data, process many records in batches, and have a graceful failure case. Migrations are essential to any data systems and are used extensively in any organization that has data operations — in other words, every organization. In addition, migration is important for keeping enterprise data agnostic from the tools used to create it, view it, and manage it, so it can be used and reused by multiple systems. Without migration, data would be lost any time tools were changed, deeply affecting productivity.

The Broadcast Pattern

[Diagram: broadcast pattern]

The broadcast Salesforce integration pattern moves data from a single source system to multiple destination systems on an ongoing, near real-time, or real-time basis. Essentially, it is one-way synchronization from one to many. Typically "one way sync" implies a 1:1 relationship; the broadcast pattern creates a 1:many relationship.

In contrast to the migration pattern, the broadcast pattern is transactional and is optimized for processing records as quickly as possible. Broadcast patterns keep data up-to-date between multiple systems across time. It’s important that a broadcast Salesforce integration pattern be highly reliable to avoid losing critical data in transit. And because these integration patterns generally have low human oversight as they are usually initiated in mission-critical systems by a push notification or are scheduled, reliability becomes even more crucial.

The broadcast pattern allows for the immediate transfer of customer data between systems. As an example, the pattern can enable an action in Salesforce to immediately translate into order fulfillment processing. Some common use cases for the broadcast pattern include creating a sales order in SAP when an opportunity is marked as CLOSED WON in Salesforce, or synchronizing real-time data from Siebel to Salesforce.

The Aggregation Pattern

[Diagram: aggregation pattern]

The aggregation Salesforce integration pattern takes or receives data from multiple systems and copies or moves it into just one system. Aggregation removes the need to run multiple migrations on a regular basis, removing concerns about data accuracy and synchronization. It is the simplest way to extract and process data from multiple systems into a single application or report.

By using a Salesforce integration template built on an aggregation pattern, it's possible to query multiple systems on demand and merge the data sets to create or store reports in .csv or other formats of choice, for example. Aggregation contains custom logic that can be modified to merge and format data as needed, and that can be easily extended to insert data into multiple systems, such as Salesforce, SAP, and Siebel.

Some uses for the aggregation pattern include: updating Salesforce with data from both ERP and issue-tracking systems; creating a dashboard that pulls data from multiple Salesforce instances while ensuring data consistency; or building APIs that collect and return data from multiple systems, or that report across multiple systems.

The aggregation Salesforce pattern enables the extraction and processing of data from multiple systems and merges it into one application; this ensures that data is always up to date, does not get replicated, and can be processed or merged to produce any desired dataset or report. This avoids the need for a separate database for merged content and makes reports available in any format or within any repository. Creating orchestration APIs that retrieve data from multiple systems and process it into one response to modernize legacy systems, and creating a central data repository used for compliance or auditing purposes, are some real-world scenarios in which the aggregation Salesforce integration pattern is particularly useful.

Some key considerations for using aggregation include collecting the data, the scope of the source data and the insert data, merging multiple datasets, formatting data, and any additional destinations. For example, there are two ways to collect data: either create a system that listens for messages from multiple systems and aggregates them in real time, or create an application that is triggered by an event. When combining multiple datasets, you must consider how to merge them and how to present the data in the final report or destination system.
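As a purely illustrative sketch of that merge step (the two maps stand in for query results from, say, Salesforce and an ERP system, and the field names and CSV output are assumptions), the aggregation idea reduces to joining the datasets on a shared key and formatting one combined report:

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class AggregationSketch {
    public static void main(String[] args) {
        Map<String, String> crmAccountOwners = Map.of("ACME", "Jane Doe", "Globex", "John Roe");
        Map<String, Integer> erpOpenInvoices = Map.of("ACME", 3, "Globex", 0);

        // Merge the two datasets on the shared account key and format each row as CSV.
        List<String> report = crmAccountOwners.entrySet().stream()
                .map(e -> String.join(",", e.getKey(), e.getValue(),
                        String.valueOf(erpOpenInvoices.getOrDefault(e.getKey(), 0))))
                .sorted()
                .collect(Collectors.toList());

        System.out.println("account,owner,openInvoices");
        report.forEach(System.out::println);
    }
}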

The Bi-Directional Sync Pattern

[Diagram: bi-directional sync pattern]

Bi-directional sync Salesforce integration patterns unite multiple datasets in multiple different systems, causing them to behave as one system while allowing them to recognize the existence of different datasets. This type of integration comes in handy when different tools or different systems, which are needed for their own specific purposes, must accomplish different functions in the same data set. Using bi-directional sync enables both systems to be used and maintains a consistent real-time view of the data across systems.

Bi-directional sync integration enables the systems to perform optimally while maintaining data integrity across both synchronized systems. It can modularly add and remove two or more systems that subspecialize inside a domain as storage. This integration pattern is advantageous when object representations of reality must be comprehensive and consistent.

Some use cases of this particular Salesforce integration pattern include integrating Salesforce with multiple systems that contribute to operational efficiencies and a streamlined quote-to-cash process but still serve as the system of record for all data that needs to be synchronized.

The Correlation Pattern

[Diagram: correlation pattern]

Correlation and bi-directional sync Salesforce integration patterns are very similar, but there is one important difference. The correlation pattern singles out the intersection of two data sets and does a bi-directional synchronization of that scoped dataset, but only for items that occur in both systems naturally. Bi-directional synchronization will create new records if they are found in one system and not the other. The correlation pattern does not discern the data object's origin; it will agnostically synchronize objects as long as they are found in both systems.

This pattern is useful for cases in which two groups or systems want to share data, but only if they both have records representing the same items or contacts in reality. For example, hospitals in the same healthcare network might want to correlate patient data for shared patients across hospitals, but want to avoid the privacy violation of sharing patient data with a hospital that has never admitted or treated the patient.

With the correlation pattern, the most important consideration is the definition of the term “same” across records. This definition can vary by industry; in addition, consequences for unclear definitions also are variable. For example, in the retail industry, when targeting offers to customers, the same name may be close enough to achieve the goal; however, in a healthcare setting, relying on a name alone could have serious consequences if two patients have the same name and different courses of treatment. The table below illustrates what can occur when the definition of “same” is too strict, too lax, or accurate across correlation and bi-directional sync integration patterns:

[Chart: outcomes when the definition of "same" is too strict, too lax, or accurate]

The correlation pattern allows shared account data to be synchronized across applications, including Salesforce instances, either across an organization or between a company and a partner. It can also allow for synchronization of customer data entered by two different employees in the same or different departments.
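A minimal sketch of the distinction, assuming email address as the definition of "same" and using in-memory maps in place of the two systems: correlation only updates records whose key already exists on both sides, whereas bi-directional sync would also create the missing ones.

import java.util.HashMap;
import java.util.Map;

public class CorrelationSketch {
    public static void main(String[] args) {
        Map<String, String> systemA = new HashMap<>(Map.of(
                "jane@example.com", "Jane Doe (updated phone)",
                "only.in.a@example.com", "A-only contact"));
        Map<String, String> systemB = new HashMap<>(Map.of(
                "jane@example.com", "Jane Doe",
                "only.in.b@example.com", "B-only contact"));

        for (String key : systemA.keySet()) {
            // Correlation: synchronize only records present in BOTH systems.
            if (systemB.containsKey(key)) {
                systemB.put(key, systemA.get(key)); // push A's newer version to B
            }
            // Bi-directional sync would instead also create only.in.a@example.com in system B
            // (and only.in.b@example.com in system A).
        }
        System.out.println(systemB);
    }
}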

Resources for further Salesforce integration development

Clearly, Salesforce integration can have numerous benefits for data management in the enterprise. But how can you get started utilizing these patterns? And what out-of-the-box templates are available to make implementing Salesforce even easier in your organization?

Take a look at the numerous resources and solutions we have to help you.