
In the last month there have been a handful of stories about the discovery of unsecured sensitive data in “the cloud.” For example, Microsoft has been in the news over the potential inadvertent exposure of 38 million customer data records, traced to a configuration that was unsecured by default. Around the same time, the review website SeniorAdvisor was found to have misconfigured its instance of the AWS S3 cloud storage service, exposing about three million identifying records. In another case reported this month, the Japanese company Murata learned that a subcontractor had, against company policy, uploaded more than 75 thousand documents containing sensitive customer data to a cloud provider.
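
The details of each incident differ, but the S3 case points at a class of misconfiguration that is straightforward to audit. The sketch below is a minimal illustration, assuming the boto3 library and read-only AWS credentials that can list buckets and read their public access configuration; it flags any bucket whose Block Public Access settings are incomplete, as a starting point for a proper review rather than a complete one.

```python
# Minimal audit sketch: flag S3 buckets whose Block Public Access settings
# are incomplete or missing. Assumes boto3 and read-only AWS credentials.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        config = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        if not all(config.values()):
            # At least one of the four public-access protections is disabled.
            print(f"REVIEW: {name} has one or more public-access settings disabled")
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            print(f"REVIEW: {name} has no public-access block configured at all")
        else:
            raise
```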

While each case is unique, there is a common thread here and in other similar stories. As companies expand their digital transformation activities beyond the confines of the internal IT “castle,” we all need to be diligent about where instances of company data can be found and who is accessing them. And one of the keys to success is relying on platforms that provide extensive monitoring and a rich set of alerting mechanisms.

Prioritizing API security when you choose your tools

To be clear, these companies did not set out to create systems that expose customer data. However, it is possible that the default data access and publication settings of the products they relied on, combined with confusing user interfaces, contributed to customer data being publicly exposed to anonymous access.

How systems are designed and how they behave “by default” is a choice made by the people who create hosting products and services. And those choices, by extension, can become the policies of all the companies that use those systems. Our choices matter.

It’s hard to protect what you can’t see

As companies embrace new “information democratization” patterns like low/no-code platforms and robotic process automation (RPA), there are important tradeoffs to deal with. Old command-and-control patterns of governance, reliance on barriers to entry, and big-up-front design can easily be undercut. In some cases, established governance and review might simply be bypassed by new onsite tools and cloud-based services that offer an independent, streamlined data access and publishing workflow. These security challenges can be hard to see because they arrive as part of a new wave of positive changes, at a time when agility is advertised as a precious advantage in a competitive environment.

IT leadership needs to have “eyes on the field” to know when new products are brought into the company. Security reviews for new low-code data systems need to include a careful review of common data connector patterns, typical publishing workflows, and default security settings. Any new public endpoints, whether exposed through local gateways or hosted remotely in the cloud, need to be added to periodic penetration testing and regular data reviews. And all active services need to be continuously monitored for unusual traffic and access patterns. All of this comes in addition to your organization’s regular data management and threat reviews.

It turns out that most of this work (monitoring traffic, periodic pen testing, etc.) is not complicated. It just requires diligence and dedication, both from those implementing the testing and from those analyzing the results. However, a key challenge is often knowing where to look. It’s hard to protect your company, your employees, and your customers from data exposures you can’t see.
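
As one illustration of how simple such checks can be, the sketch below assumes the Python requests library and a hypothetical endpoints.txt inventory file (one URL per line) maintained by your team. It sends deliberately unauthenticated requests to each known public endpoint and flags any that respond successfully; results like these feed naturally into the periodic penetration tests and data reviews described above.

```python
# Minimal anonymous-access probe: flag public endpoints that answer
# an unauthenticated request. Assumes a maintained endpoints.txt inventory.
import requests

with open("endpoints.txt") as inventory:
    endpoints = [line.strip() for line in inventory if line.strip()]

for url in endpoints:
    try:
        response = requests.get(url, timeout=10)  # deliberately no credentials
    except requests.RequestException as err:
        print(f"ERROR  {url}: {err}")
        continue
    if response.status_code == 200:
        print(f"REVIEW {url}: returned 200 to an anonymous request")
    else:
        print(f"OK     {url}: {response.status_code}")
```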

Accessibility and governance are the keys to API security

As systems become more decentralized, and as more “citizen developers” engage in creative and innovative practices within the company, there is a need to increase the level of observability in general. That means getting more information about who is doing what with your data.

Often, you don’t need to prevent people from accessing data sources within the organization. In fact, you likely want to enable innovative use of your internal data; that’s what these new agile, low-code tools are all about. What is needed is better information about who is accessing and publishing data. Think of it as another kind of “control plane,” one that can alert IT teams when new sources go online, allow them to review the new connectors, and set up continuous monitoring of their use.
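
To make the “control plane” idea concrete, here is a deliberately small sketch. Everything specific in it is an assumption: discover_sources() stands in for whatever inventory your API gateway, data catalog, or low-code platform actually exposes, and the alert is just a print statement where a real system would open a ticket or post to a chat channel. The underlying loop is the important part: compare what is live now against the last reviewed snapshot and raise an alert for anything new.

```python
# Sketch of a "new data source" alert: diff the current inventory of live
# connectors/endpoints against the last approved snapshot and flag additions.
import json
from pathlib import Path

SNAPSHOT = Path("approved_sources.json")

def discover_sources():
    """Placeholder: return identifiers for every connector/endpoint currently live.
    In practice, query your API gateway, iPaaS catalog, or cloud inventory here."""
    return set()

def main():
    known = set(json.loads(SNAPSHOT.read_text())) if SNAPSHOT.exists() else set()
    current = discover_sources()

    for source in sorted(current - known):
        # A real control plane would route this to IT review, not stdout.
        print(f"ALERT: new data source online and not yet reviewed: {source}")

    # Persist the current state so the next run only alerts on further changes.
    SNAPSHOT.write_text(json.dumps(sorted(current), indent=2))

if __name__ == "__main__":
    main()
```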

Quality data publishing systems have this kind of alerting and monitoring built in as a core feature. They include the ability to flag potential data publishing leaks and dangerous data consumption patterns. With power comes responsibility. A reliable platform partner is one that helps you protect your data, not just one that makes it easy to publish data on the open web.

With great power comes great responsibility

No company wants to be named in a story about mismanaged data access. Nor does any organization want its reputation tarnished in a legal tangle over who is responsible for damages from a data breach. With little to no margin for error, CIOs are advised to focus on partners that eagerly and vigilantly aspire to be secure, ones that make it easy to do “the right thing” and difficult to mistakenly export or expose sensitive data.

As we build systems that are more composable, more “plug-and-play,” we all run the risk of creating solutions that are harder to observe and, therefore, harder to manage. Selecting partners and vendors takes on new importance when the tools you are enabling are general-purpose components designed for use by a broad spectrum of stakeholders, including internal teams and external partners.

An important lesson in these recent news stories is that we are all responsible for the choices we make. We choose our vendors, we choose our platforms, we choose our levels of monitoring and observability, and we choose the internal processes that teams follow to successfully access and publish important information on platforms both within and outside the company firewalls. 

Now is a great time to talk within your own teams about the state of data publishing across your company and to confirm you have sufficient initial review and ongoing monitoring in place. Now is not the time to pull back on enabling citizen developers and innovation inside your organization. Instead, you can choose to both protect your data and empower your staff to meet the challenge of enabling speed and agility at scale.

Check out this guide to API security to learn more about the key components you need to take into account when establishing your security strategy.