The API Evangelist Blog

API Evangelist

The Caltech University API Landscape

19 Feb 2020

I regularly take a look at what different universities are up to when it comes to their APIs. I spent two days talking with different universities at the University API summit in Utah a couple weeks back, and I wanted to continue working my way through the list of schools I am speaking with, profiling their approach to doing APIs, while also providing some constructive feedback on what schools might consider doing next when it comes to optimizing API delivery and consumption across campus. Next up on my list is Caltech, who we have been having conversations with at Postman, so I wanted to conduct an assessment of the current state of APIs across the school. The university reflects what I see at most universities: there are plenty of APIs in operation, but no visible organized effort to bring all the existing APIs together into a single developer portal, or to centralize API knowledge and best practices around the interesting API things going on across campus, and with other partners and stakeholders.

APIs in the Caltech Library

When it comes to APIs at the university level, the first place to start is always the library, and things are no different at Caltech. While there is no official landing page specifically for APIs, the Caltech library has a GitHub page dedicated to a variety of programmatic solutions https://caltechlibrary.github.io/, and you can find many signals of API activity behind the scenes, like this announcement that the write API is available https://www.library.caltech.edu/news/write-api-now-operational. You can also find an interesting case study on how the library is using APIs provided by an interesting API provider called Clarivate, which I will be looking to understand further. As with every other university, there is a huge opportunity for Caltech to be more organized and public about the API resources offered as part of the library--even if it isn't widely available to...[Read More]



API Interrogation

14 Feb 2020

I was doing some investigation into how journalists are using APIs, or could / should be using APIs. After some quick Googling, Binging, and DuckDuckGoing, I came across a workshop by David Eads of ProPublica Illinois, called A Hitchhiker's Guide to APIs. As I began reading, I was struck by how well it captured not only the usage of Postman in journalism, but also what Postman does in general, in a single precise sentence: "In this hands-on session, you will use Postman to interrogate a web API." That is how I use Postman. That is why 10 million developers use Postman. APIs are how we can interrogate the digital world unfolding around us. It is increasingly how we can interrogate the digital world emerging across our physical worlds. I like the concept in general, but definitely think it is something I should explore further when it comes to journalism and investigative storytelling. Postman provides a pretty powerful way to get at the data being published by city, county, state, and federal government. It also provides a robust way to get at the social currents flowing around us on Twitter, Facebook, LinkedIn, and other leading platforms. Postman and APIs provide technical and non-technical users with what they need to target a source of data or content, authenticate, and begin interrogating the source for all relevant information. I find that interrogating a startup is best done via their own API, as well as their digital presence on Twitter, LinkedIn, GitHub, Stack Overflow, Facebook, YouTube, and Instagram using APIs, rather than speaking with them directly. I find that interrogating a federal agency is often only possible through the datasets it publishes, providing me with a self-service way to understand a specific slice of how our society works (or doesn't). While I can interrogate a company, organization, institution, or government agency using their websites, I find that also being able to interrogate their platform,...[Read More]
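This kind of interrogation usually starts with composing a filtered query against a public data API, then saving that request so it can be replayed and shared. A minimal Python sketch, using a hypothetical open-data endpoint (the URL and parameters are illustrative, not a real agency's API):

```python
from urllib.parse import urlencode

def interrogation_url(base, **filters):
    """Compose a filtered query against an open-data API endpoint --
    the same request you would save into a Postman collection and
    replay while working a story. The endpoint used below is hypothetical."""
    # Sort the filters so the generated URL is stable and easy to diff.
    return f"{base}?{urlencode(sorted(filters.items()))}"

url = interrogation_url("https://data.example.gov/api/spending",
                        agency="education", year=2019)
print(url)  # https://data.example.gov/api/spending?agency=education&year=2019
```

The stable query string matters more than it looks: when you are interrogating a source repeatedly, reproducible requests are what let you compare answers over time.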



All of the Discussions from the BYU API University Workshop in Utah

12 Feb 2020

I went to Provo, Utah a couple weeks ago and participated in the sixth annual Brigham Young University (BYU) University API Workshop. I was the keynote opener for the first edition of the conference, and I was the same for the sixth edition of the event, which brings many different universities together to talk about API usage across their campuses. When the event began it was primarily BYU staff, but it has expanded to include administrators and faculty from what I counted to be over twenty other universities from across the United States--making for a pretty interesting mix of conversation from higher education API practitioners looking to solve problems and share their stories of how APIs have helped make an impact on how universities serve students and the public. The University API Workshop is an "unConference Focused on University & Personal APIs & Their Use in Improving Learning". It brought together around one hundred folks to discuss a wide variety of API topics. Since it was an unconference, everyone pitched their own ideas, with some of them being about sharing API knowledge, while others were about soliciting knowledge from the other attendees. The result was a pretty compelling list of sessions spread across two days. You can browse through the sessions using the Google Docs that every session organizer published, providing a pretty compelling look at how APIs are making an impact at the higher education level, shining a light on the concerns of API stakeholders across the campus.

Session One
- Let's stop using usernames & passwords
- User Experience in the API World
- Postman Fundamentals
- Securing APIs/data with proper authorization

Session Two
- Walk, Talk, and API Stalk
- API Governance at Scale: taking ideas to consistent execution
- Mendix (HPAPaaS/Low Code) After a Year at BYU
- Our New NGDLE | Open Courses Made With Web Components, Microservices, Docker, CI/CD and more!
- DDD vs. BI - Balancing Centralizing and Decentralizing Forces in Data Architecture

Session Three
- How do I test...[Read More]



Postman Governance as the Foundation for Wider API Governance

11 Feb 2020

This is an overview of possible strategies for governing how Postman is used across a large organization. It is common for Postman to already be in use across an organization by individuals operating in isolation on a free tier of access. Governance of not just Postman, but the end-to-end API life cycle, begins with getting all developers using Postman under a single organizational team, working across master planned workspaces. If there are concerns about how Postman is being used across an enterprise organization, governance of this usage begins by bringing all enterprise Postman users together under a single license and team, so that activity can be managed collectively.

Postman Users

Over the last five years Postman has become an indispensable tool in the toolbox of developers. 10 million developers have downloaded the application and are using it to authorize and make requests to APIs, then debug the responses. The benefit to API operations for the enterprise is clear, but the challenge now for enterprise organizations is to identify their individual Postman users and encourage them to operate under a single pro, team, or enterprise license. Currently users are operating in isolation, defining, storing, and applying secrets and PII locally on their own workstations within Postman, and syncing to the cloud as part of their regular usage--isolating details about APIs, secrets, potentially PII, and other sensitive data within these three areas:

- Personal Workspaces - Storing collections and environments within their localized personal workspaces and individual Postman account.
- Personal Collections - Developing API collections in isolation, leaving them inaccessible to other teams, and not reusable across operations.
- Personal Environments - Using environments to store secrets, PII, and other data within their localized personal workspaces and individual Postman account.

When it comes to enterprise API governance, observability, and security, the problem isn't with Postman being used by developers; the problem is that developers are not using Postman together under a single license, across managed shared workspaces. Putting...[Read More]



Conducting API Weaponization Audits

11 Feb 2020

I've been thinking about chaos engineering lately, the discipline of experimenting on a software system in production in order to build confidence in the system's capability to withstand turbulent and unexpected conditions. I listened to a talk by Kolton Andrus, the CEO of Gremlin, the other day, and my partner in crime at Postman, Joyce (@petuniagray), is an avid evangelist on the subject. So I have been thinking about the concept, how it applies to your average enterprise organization, and the impact it could make on the way we operate our platforms. I don't think chaos engineering is for every company, but I think there are lessons involved in chaos engineering that are relevant for every company. Similarly, I think we need an equivalent approach in the area of weaponization, and how APIs can easily be used to harm a platform, its community, and the wider public--a sort of weaponization audit. Let's take what we've learned from Twitter, Facebook, YouTube, and others. Let's look at the general security landscape, and let's get more creative when it comes to coloring within the lines of an API platform, but in unexpected ways. Let's get women and people of color involved. Let's focus on ways in which a platform can be abused, using the web, mobile, and device applications, or the APIs underneath. I'd like to consider security, privacy, reliability, and observability, as well as out-of-the-box ways to game the system. Let's assume that nobody can be trusted, while recognizing we still need to offer a certain quality of service and community for our intended users. I am guessing it won't be too hard to hire a savvy group of individuals who could poke and prod at a platform until the experience gets compromised in some harmful way. Like chaos engineering, I'm guessing most organizations wouldn't be up for an API weaponization audit. It would reveal some potentially uncomfortable truths that leadership probably isn't too concerned with addressing, and...[Read More]



The Basics of Working with the Postman API

10 Feb 2020

It is pretty easy to think of Postman as the platform where you engage with your internal APIs, as well as other 3rd party APIs. It doesn't always occur to developers that Postman has an API as well. Most everything you can do through the Postman interface you can do via the Postman API. Not surprisingly, the Postman API also has a Postman collection, providing you with quick and easy access to your Postman collections, workspaces, teams, mocks, and other essential elements of the Postman platform and client tooling--providing you with the same automation opportunities you have come to expect from other APIs. API access, integration, and automation should be the default with everything you do online--desktop, web, mobile, and device applications all use APIs. Your API infrastructure is no different. Postman takes this seriously, and works to make sure that anything you can do through the desktop or web interfaces you can also do via the Postman API--allowing API providers and consumers to seamlessly integrate and automate the Postman platform into their operations by leveraging the following APIs:

- Collections - Programmatically create and manage the Postman API collections in use.
- Environments - Add and manage the details of the environments applied across Postman collections.
- Mocks - Create, retrieve, and delete mock APIs that are generated from Postman collections.
- Monitors - Create, update, retrieve, delete, and run monitors that execute Postman collections.
- Workspaces - Create, retrieve, update, and delete the workspaces that collections are organized in.
- Users - Provides a /me endpoint that allows for pulling information about the API key being used.
- Import - Allows for the import of Swagger, OpenAPI, and RAML API definitions into Postman.
- API - Programmatically create and manage APIs, including versions, schema, and link relations.

These eight API paths give you full control over managing the full API life cycle of APIs you are developing, and the integration and automation of the APIs...[Read More]
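As a quick sketch of what consuming the Postman API looks like in code, here is a minimal Python example that prepares an authenticated call to the /me path. The base URL and the X-Api-Key header follow Postman's public API documentation; the key value is a placeholder, and the request is prepared rather than sent so you can inspect it first:

```python
import requests

POSTMAN_API = "https://api.getpostman.com"  # Postman API base URL

def postman_request(method, path, api_key):
    """Prepare (but don't send) a Postman API call, authenticated with
    the X-Api-Key header the platform expects."""
    return requests.Request(
        method, f"{POSTMAN_API}{path}",
        headers={"X-Api-Key": api_key},
    ).prepare()

# "PMAK-..." is a placeholder key; a real call would be dispatched with
# requests.Session().send(prepared)
prepared = postman_request("GET", "/me", "PMAK-your-key-here")
print(prepared.url)                   # https://api.getpostman.com/me
print(prepared.headers["X-Api-Key"])  # PMAK-your-key-here
```

Swap /me for /collections or /workspaces and the same helper covers the other paths listed above.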



Standardizing My API Life Cycle Governance

10 Feb 2020

I am working on redesigning all of my base APIs, as well as producing a mess of new ones. As part of the process I am determined to be more thoughtful and consistent in how I design and deliver the APIs. API governance always begins with using API definitions, as you can't govern something you can't measure and track, so having machine-readable artifacts is essential. After that, the design of the API is the first place to look when it comes to standardizing each of the APIs coming off the assembly line. Then I am looking to do my best to begin defining, measuring, and standardizing how I do many other areas of API operations, helping me keep track of the many moving parts of doing microservices. To help me govern the life cycle for each API, I am going to be quantifying and measuring as many of the following areas as I can. These are what I consider to be the essential building blocks of each API that I deliver, and since I'm using Postman to not just interact with these APIs once they are in production, I will be using Postman to also deliver and govern each stop along the API life cycle--using Postman collections to define, deliver, and govern each of these areas, with scripts, runners, and monitors automating the enforcement of standards and consistency across the APIs I am delivering on a regular basis.

Definitions
- OpenAPI - There is an OpenAPI for each individual API.
- Collection - There is a Postman collection for each individual API.
- JSON Schema - There is a JSON schema for each individual schema.

Design Requests
- Base - Ensure the base path is planned.
- Versioning - Define how APIs are versioned.
- Resource - Evaluate each resource published.
- Sub-Resources - Evaluate each sub-resource published.
- Methods - Ensure common use of HTTP methods.
- Actions - Determine how actions are taken beyond methods.
- Path Parameters - Establish a common approach for path parameters.
- Query...[Read More]



Backend AWS API Gateway Integration OpenAPI Extensions

10 Feb 2020

I have spent a lot of time automating my AWS API infrastructure, working to make it so I can automatically deploy API infrastructure to AWS. I am using AWS API Gateway as part of this suite of API deployments, so I have been working hard to understand how AWS speaks OpenAPI as part of their implementation. As part of my work there are three distinct types of APIs I am deploying using AWS API Gateway, each with its own way of extending OpenAPI to describe it.

The Pass Through

Just passing what comes in to an HTTP host and path I give it, and then passing the response back through without any transformations or other voodoo along the way. This is a basic OpenAPI extension for defining a pass through API using the AWS API Gateway.

A DynamoDB Backend

For my basic CRUD databases I am just using a DynamoDB backend, because it allows me to quickly launch data APIs that let me Create, Read, Update, and Delete (CRUD) data I am storing in the NoSQL database--providing me with a pretty basic approach to delivering data API infrastructure. Here is the OpenAPI vendor extension for wiring things up using a DynamoDB backend. I like DynamoDB because you can just make API calls to get most of what you need without any sort of business logic or code in between. If I am just looking to manage data using simple web API endpoints, this is what I am doing when it comes to deploying API infrastructure.

Logic with Lambda

I would say the previous two types of APIs represent the most common implementations I have, but I am working to evolve my infrastructure to take advantage of newer approaches to delivering APIs like Lambda. Here is the OpenAPI extension for defining a Lambda backend, which I can then wire up to a database and storage, or purely implement some business logic to do what I...[Read More]
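For the pass through case, the wiring is done with the x-amazon-apigateway-integration OpenAPI extension. A minimal sketch of what that looks like, with the backend host as a placeholder (consult the AWS API Gateway OpenAPI extension reference for the full set of fields):

```yaml
paths:
  /proxy:
    get:
      x-amazon-apigateway-integration:
        type: http_proxy                       # pass the request straight through
        httpMethod: GET
        uri: https://backend.example.com/proxy # placeholder backend host
        passthroughBehavior: when_no_match
```

The DynamoDB and Lambda variants use the same extension with different `type` and `uri` values pointing at AWS service actions instead of an HTTP host.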



API Links For Every UI Element

10 Feb 2020

I've showcased Cloudflare's approach to making their API available as part of their user interface several times now. It is a practice I want to see replicated in more desktop, web, and mobile applications, so I want to keep finding new ways of talking about it, and introducing it to new readers. If you sign up for or use Cloudflare, and navigate your way to their SSL/TLS section, you will see a UI element for changing the level of your SSL/TLS encryption, and below it you see some statistics on the traffic that has been served over TLS over the last 24 hours--providing you full control over SSL/TLS within the Cloudflare UI. At the bottom of the UI element for managing your SSL/TLS you will see an API link, which, if you click it, gives you three API calls for getting, changing, and verifying the SSL/TLS status of your domain. Providing one-click access to the API calls behind the UI elements gives you two separate options for managing your DNS. This is how all user interfaces within applications should be. APIs shouldn't just be located via some far off developer portal; they should be woven into the UI experience, revealing the API pipes behind the UI at every opportunity. This allows for the automation of any activity a user is taking through the interface using the platform's API. You could also consider embedding a simple Postman collection for each API capability, allowing a user to run it in Postman--to further support this, you could also make a Postman environment available, pre-populated with a user's API key, making execution of each platform capability outside of the platform possible in just one or two clicks. Once each UI capability is defined as a Postman collection it can immediately be executed by a user in a single click. It can also be executed using a Postman runner as part of an existing CI/CD process, or on a schedule using a...[Read More]
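Those one-click API calls map onto Cloudflare's zone settings endpoints. Here is a hedged Python sketch of the "get" and "change" calls behind the SSL/TLS UI element -- the path and bearer-token header are drawn from Cloudflare's public API docs, but treat it as a sketch and verify against their reference; the zone ID and token below are placeholders:

```python
import json
import requests

CF_API = "https://api.cloudflare.com/client/v4"

def ssl_setting_request(zone_id, token, level=None):
    """Prepare (not send) the calls behind Cloudflare's SSL/TLS UI element:
    a GET reads the current SSL mode; a PATCH changes it (values such as
    'off', 'flexible', 'full', 'strict' per Cloudflare's docs)."""
    url = f"{CF_API}/zones/{zone_id}/settings/ssl"
    headers = {"Authorization": f"Bearer {token}",
               "Content-Type": "application/json"}
    if level is None:
        return requests.Request("GET", url, headers=headers).prepare()
    return requests.Request("PATCH", url, headers=headers,
                            data=json.dumps({"value": level})).prepare()

# Placeholder zone ID and token; dispatch with requests.Session().send(req)
req = ssl_setting_request("023e105f4ecef8ad9ca31a8372d0c353", "TOKEN", "full")
print(req.method, req.url)
```

Wrapping each UI capability in a tiny helper like this is essentially what embedding a Postman collection per UI element gives non-developers for free.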



Secrets and Personally Identifiable Information (PII) Across Our API Definitions

27 Jan 2020

As API providers and consumers we tend to have access to a significant amount of credentials, keys, and tokens, as well as personally identifiable information (PII). We use this sensitive information throughout the API integration and delivery life cycles. We depend on credentials, keys, and tokens to authorize each of our API requests, and we potentially capture PII as part of the request and response for each of the individual API requests we execute regularly. Most developers, teams, and organizations I've spoken with do not have a strategy for addressing how secrets and PII are applied across the internal and external API landscape. API management over the last decade has helped us as API providers better manage how we define and manage authentication for the APIs we are providing, but there hasn't been a solution that has emerged to help us manage the tokens we use across many internal and external APIs. With this reality, there are a lot of developers who are self-managing how they authenticate with APIs, and how they work with the PII that gets returned from APIs. I am working on several talks with enterprise organizations about this challenge, and to prepare I want to work through my thoughts on the problem, as well as some possible solutions. I wanted to map out how we integrate with the APIs we are developing and consuming, and think about the common building blocks of how we can better define, educate, execute, audit, and govern the secrets and PII that are applied throughout the API life cycle across all of the APIs we depend on--allowing me to have a more informed conversation about how we can get better at managing the more sensitive parts of our operations.

What Are The Types of Sensitive Information?

First I wanted to understand the types of common information being applied by API developers, helping me establish and evolve a list of the types of data we are looking for when securing the API...[Read More]



An Introduction to API Authentication

27 Jan 2020

APIs operate using the web, and like web applications, many APIs require some sort of authentication or authorization before you can access the valuable resources available within each API path. When you open up your APIs on the web you aren't just giving away access to your resources to anyone who comes along. API providers employ a number of different authentication mechanisms to ensure only the applications and systems that should have access are actually able to make a successful API call. To help refresh the types of authentication available across the API landscape, while also demonstrating the reach of Postman as an API client, I wanted to take a fresh look at authentication to help my readers understand what is possible. Depending on the API provider, platform, and the types of resources being made available, you will encounter a number of different authentication methods--here are the 11 that Postman supports, reflecting 90% of the APIs you will come across publicly, as well as within the enterprise organization, and reflecting what the API sector employs for authentication of their APIs.

- No Authentication - Like the web, these APIs are publicly available and accessible without any authentication. You can just make a request to a specific URL, and you get the response back without needing any credentials or key. This reflects a very small portion of the API economy, but is still an important aspect of the overall authentication discussion, and what is possible.
- API Key - An application programming interface key (API key) is a unique identifier used to authenticate a user, developer, or calling program to an API. However, they are typically used to authenticate a project with the API rather than a human user. Different platforms may implement and use API keys in different ways.
- Bearer Token - A bearer token is an opaque string, not intended to have any meaning to clients using...[Read More]
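To make the first three schemes concrete, here is a small Python sketch that prepares a request under each one. The X-API-Key header name is just a common convention -- key placement varies by provider (header, query parameter, or otherwise) -- while the Bearer scheme follows the standard Authorization header form:

```python
import requests

def authed_request(url, scheme="none", credential=None):
    """Prepare (not send) a GET request using one of three common auth
    schemes: 'none', 'api_key', or 'bearer'."""
    headers = {}
    if scheme == "api_key":
        # Header name varies by provider; X-API-Key is a common choice.
        headers["X-API-Key"] = credential
    elif scheme == "bearer":
        headers["Authorization"] = f"Bearer {credential}"
    return requests.Request("GET", url, headers=headers).prepare()

r = authed_request("https://api.example.com/v1/things", "bearer", "abc123")
print(r.headers["Authorization"])  # Bearer abc123
```

The remaining schemes Postman supports (Basic, Digest, OAuth 1.0/2.0, AWS Signature, and so on) layer more ceremony on top, but all of them ultimately decorate the request the same way -- headers, query parameters, or a signature derived from both.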



Profiling Adobe APIs

23 Jan 2020

As I was profiling APIs on my list of APIs I found myself profiling Adobe. I am moving through the list of companies alphabetically, so you can see how far along I am. Anyways, like any other large company, Adobe requires a decision about how I am going to manage the profiling of different API products and lines of business. Companies like Amazon, Google, Azure, and Adobe have large numbers of APIs, and I always know I will need some sort of plan for documenting everything that is going on. With Adobe, I am going to track everything in a single GitHub repository, but will be working to create separate API definitions (OpenAPI and Postman collections) for each of the individual APIs being offered. To provide some context, it helps to understand why I profile APIs in the first place. As the API Evangelist I review public API operations, studying how API providers are doing what they do. I then aggregate the "building blocks" of their public operations into a master set of research that I use to drive my storytelling and API strategy workshops. So, with the Adobe APIs, I'm not looking to review their API operations as much as I am looking to understand how they operate, and develop an understanding of how far along they are in their enterprise API journey. As with any profiling of a company, I begin by Googling their name plus API, and then dive as deep as I can into the details of what I find with each click. When you Google Adobe APIs you get this main landing page with the tagline, "APIs and SDKs for all Adobe products – create mobile, web and desktop apps". You can tell Adobe is working hard to bring together their APIs under one big tent, with the following main areas to support developers:

- Landing Page - Adobe API landing page.
- Authentication - Overview of authentication.
- Open...[Read More]



Three Ways to Use Postman and Azure DevOps

22 Jan 2020

I set out to understand the role that Postman can play in an Azure DevOps powered API life cycle. I was fully prepared to crash course Azure DevOps and begin mapping out the role that Postman can play, but before I got started I began Googling Postman + Azure DevOps. I was happily surprised to find a number of rich walkthroughs written by the passionate Postman community--surpassing anything I could have put together for a version 1.0 of my Azure DevOps Postman guidance. I will still work to pull together my own official Azure DevOps Postman walkthrough, but to prepare I wanted to publish a summary of what I have found while thinking about how Postman and Azure DevOps can work together.

The Postman Basics

Before we get going with what I have found, I wanted to point to a couple of key concepts readers will need to be familiar with before they set out trying to use Postman with Azure DevOps, helping set the tone for any integration. It always helps to start with the basics and not assume all of my readers will understand what Postman delivers.

- Intro to collections - Getting familiar with what collections are, and how they work.
- Intro to collection runs - Understanding the nuance of how collections can be run.
- Intro to scripts - Learning how to script within the collections being run.

It is critical that you have a decent grasp on what is possible with Postman collections, and how they can be applied as part of any CI/CD pipeline. Most developers think of Postman as simply an HTTP client for making calls to APIs. Once you understand how collections can be run, and the many different ways that scripts can be applied, you will be much more effective at applying them as part of any pipeline, including with Azure DevOps--providing a great place to start.

Testing Azure DevOps APIs Using Postman

While mapping out this walk...[Read More]



The State of California Doing APIs The Right Way By Starting Simple

22 Jan 2020

I got introduced to the CA.gov Alpha Team by my fellow government change maker Luke Fretwell (@lukefretwell) the other day, and I am beginning to tune into what they are up to in similar ways to how I've done with other city, state, and federal government entities over the years. We kicked off a conversation around their approach to delivering APIs, and what was possible with Postman. After we were done kicking things off they shared some links with me to help me get up to speed on what they have been doing with their new approach to delivering technology across the State of California. As far as first impressions go, I am super stoked with their approach. They are starting small, and working hard to be as public as possible about how they are doing everything. The CA.gov Alpha Team gets right down to the core of doing APIs well, setting up the essential communication channels you need across any small or large organization:

- GitHub - All of the projects they develop are published to GitHub.
- Twitter - Providing a social stream of what is happening.
- Blog - Shaping the narrative around all of the work that is occurring.

The CA.gov Alpha Team has not just gone all in on GitHub; they are all about their work truly existing in the public domain. It looks like everything they are doing is first being defined as a GitHub repository, providing a default way for other government stakeholders, as well as the public at large, to stay in tune with what is going on, and even contribute to what is happening. This is how all government should be by default, and the CA.gov Alpha Team provides one possible blueprint for other city, state, and federal agencies to follow. I really like that the CA.gov Alpha Team is seeding and managing everything out in the open on Twitter, and being so vocal about it all with a...[Read More]



Help Defining 13 of the AsyncAPI Protocol Bindings

22 Jan 2020

I have been evolving my definition of what my API toolbox covers, remaining focused on HTTP APIs, while also making sure I am paying attention to HTTP/2 and HTTP/3 APIs, as well as those that depend on TCP only. My regular call with Fran Méndez (@fmvilas) of AsyncAPI reminded me that I should be using the specification to ground me in the expansion of my API toolbox, just as OpenAPI has defined much of it for the last five years. For this particular multi-protocol API toolbox research, the AsyncAPI protocol bindings reflect how I am looking to expand my API toolbox. Here are the 13 protocols being defined around the AsyncAPI specification:

- AMQP binding - This document defines how to describe AMQP-specific information on AsyncAPI.
- AMQP 1.0 binding - This document defines how to describe AMQP 1.0-specific information on AsyncAPI.
- HTTP binding - This document defines how to describe HTTP-specific information on AsyncAPI.
- JMS binding - This document defines how to describe JMS-specific information on AsyncAPI.
- Kafka binding - This document defines how to describe Kafka-specific information on AsyncAPI.
- MQTT binding - This document defines how to describe MQTT-specific information on AsyncAPI.
- MQTT5 binding - This document defines how to describe MQTT 5-specific information on AsyncAPI.
- NATS binding - This document defines how to describe NATS-specific information on AsyncAPI.
- Redis binding - This document defines how to describe Redis-specific information on AsyncAPI.
- SNS binding - This document defines how to describe SNS-specific information on AsyncAPI.
- SQS binding - This document defines how to describe SQS-specific information on AsyncAPI.
- STOMP binding - This document defines how to describe STOMP-specific information on AsyncAPI.
- WebSockets binding - This document defines how to describe WebSocket-specific information on AsyncAPI.

Not all of the protocol bindings are fully fleshed out, and AsyncAPI could use help from the community to quantify what is required with each of the protocols. I am going to try and contribute what I can as I make my way through each of the protocols as part of my API toolbox research. I am defining the building blocks for each of the protocols, which...[Read More]
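To ground the idea of a binding, here is a minimal AsyncAPI 2.0 document sketch using the MQTT binding at the operation level. The broker URL and channel are made up, and the qos field follows my reading of the MQTT binding draft, so treat this as an illustration rather than a reference:

```yaml
asyncapi: '2.0.0'
info:
  title: Example Events        # hypothetical service
  version: '1.0.0'
servers:
  production:
    url: broker.example.com    # placeholder broker
    protocol: mqtt
channels:
  user/signedup:
    subscribe:
      bindings:
        mqtt:
          qos: 1               # MQTT-specific delivery guarantee
      message:
        payload:
          type: object
```

The binding block is the protocol-specific escape hatch: everything above it is protocol-agnostic AsyncAPI, while the `mqtt` object carries the details only that transport understands.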



My Upcoming Talk with the UK Government Digital Services (GDS): The API Life Cycle Is For Everyone

21 Jan 2020

I am heading to London in February to talk to the UK government about APIs. They invited me out to talk about my history of work with government in the US and EU, and to share my views of the API life cycle. To help share my view of the API landscape I pulled together a talk titled, "The API Life Cycle Is For Everyone". I am hoping to share my view of the fundamentals of a modern API life cycle, as well as emphasize the importance of both developers and non-developers having a place at the table. Here is what I've pulled together for my time with the GDS in London. APIs are widely considered to be something that is exclusively in the domain of software developers. While it is true that APIs are often a very technical and abstract concept that requires a more technically inclined individual to engage, APIs are something that impacts everyone across today's digital landscape, affecting both business users and developers, making the API development life cycle something all parties should be educated on, made aware of, and equipped to participate in. As part of my contribution to the GDS talks on interoperability and open standards I'd like to spend an hour with you talking through the human-machine intersection across:

- API Definitions - Talking about Swagger / OpenAPI, as well as Postman collections and environments, and how they are being put to use.
- API Documentation - Understanding common approaches to delivering and maintaining documentation for APIs that are being delivered.
- API Mocks - Thinking about how API mocking can be used to articulate and share what an API delivers for all stakeholders involved.
- API Testing - Understanding the role that API assertions and testing play in defining the operations and reliability of our API infrastructure.
- API Management - Looking at how API management secures our APIs, but also helps us develop awareness of how they are used.
- API Contracts...[Read More]


API Evangelist

Looking at Electronic Data Interchange (EDI) Reminds Me that the API Economy is Just Getting Started

21 Jan 2020

I am neck deep in the expansion of what I consider to be my API toolbox, and I have been spending time mapping out the world of EDI. If you aren't familiar with Electronic Data Interchange (EDI), it "is the electronic interchange of business information using a standardized format; a process which allows one company to send information to another company electronically rather than with paper. Business entities conducting business electronically are called trading partners." EDI is the original API, providing a "technical basis for automated commercial "conversations" between two entities, either internal or external. The term EDI encompasses the entire electronic data interchange process, including the transmission, message flow, document format, and software used to interpret the documents". EDI is everywhere, and truly the backbone of the global supply chain, but one that you only hear about lightly as part of the overall API conversation. I have regularly come across the overlap between EDI and API over the last 10 years of doing API Evangelist, and while I have engaged in discussions around modernizing legacy EDI approaches in healthcare and commerce, most other fundamental building blocks of the global supply chain are entirely new to me. This reveals how little I know about the bigger picture of EDI, and how small my API world actually is. I don't claim to know everything about information exchange and interoperability, but EDI is something that should be a bigger part of my storytelling, and the fact that it isn't is, I think, revealing about how much more work we actually have in front of us when it comes to delivering on the promise of the API economy. Take a look at some of the major EDI standards to get a sampling of the scope I am talking about. These are the electronic data interchange standards that governed commerce before the Internet was around, and continue to define how data moves around.
EDIG@S (EDIGAS) - The EDIG@S...[Read More]


API Evangelist

I Think We Will Have To Embrace Chaos With the Future of APIs

21 Jan 2020

I like studying APIs. I like to think about how to do APIs well. I enjoy handcrafting a fully fleshed out OpenAPI definition for my APIs. The challenge is convincing other folks of the same. I see the benefits of doing APIs well, and I understand the consequences of not doing them well. But, do others? I never assume they do. I assume that most people are just looking to get an immediate job done, and aren't too concerned with the bigger picture. I think people have the perception that technology moves too fast, and they either do not have the time to consider the consequences, or they know that they will have moved on by the time the consequences are realized. I'm pretty convinced that most of our work on API design, governance, and other approaches to try and standardize how we do things will fall on deaf ears. Not that we shouldn't keep trying, but I think it helps if we are honest about how this will ultimately play out. If I give a talk about good API design at a tech conference, everyone who shows up for the talk is excited about good API design. If I give a talk about good API design within an enterprise organization and leadership mandates everyone attend, not everyone present is excited, let alone cares about API design. I wish people would care about API design, and be open to learning about how others are designing their APIs, but they aren't. Mostly it is because developers aren't given the space within their regular sprints to care, but it is also because people are only looking to satisfy the JIRA ticket they are given, and oftentimes the ticket says nothing about the API being well designed and consistent with other teams. Even with teams that have been given sufficient API design training and governance, if it isn't explicitly called out as part of the...[Read More]


API Evangelist

Expanding My API Toolbox for the Next Decade

21 Jan 2020

I am continuing to iterate on what I consider to be a modern API toolbox. API Evangelist research was born out of the SOA and API worlds colliding, and while I have been heavily focused on HTTP APIs over the years, I have regularly acknowledged that a diverse API toolbox is required for success, and invested time in understanding just what I mean when I say this, working to broaden my own understanding of the technologies in use across the enterprise, and realistically map out what I mean when I say API landscape. I am still workshopping my new API toolbox definition for 2020, but I wanted to work on some of the narrative around each of the items in it, helping me learn along the way, while also expanding the scope of what I am talking about.

Transmission Control Protocol (TCP) - The Transmission Control Protocol (TCP) is one of the main protocols of the Internet protocol suite, and provides reliable, ordered, and error-checked delivery of a stream of bytes between applications running on hosts communicating via an IP network. The Web and APIs both rely on TCP, which is part of the transport layer of the TCP/IP suite. SSL/TLS often runs on top of TCP. It is the backbone of our API toolbox, but there are many different ways you can put TCP to work when it comes to the programming interfaces behind the applications we depend on. It can be tough to separate what is a protocol and what is a methodology when looking at the API landscape. I'm still working to understand each of these tools in the toolbox, and organize them in a meaningful way, which is why I am writing this post. While all APIs technically rely on TCP, these approaches to communication and information exchange are often implemented directly using TCP.

Electronic Data Interchange (EDI) - Electronic Data Interchange (EDI) is the electronic interchange of business information using a standardized format;...[Read More]
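Since TCP is the transport everything else in the toolbox rides on, here is a minimal sketch of that reliable byte stream in action: a tiny echo server and client using only Python's standard library (the message contents are arbitrary, chosen just for illustration).

```python
# Minimal sketch of the reliable byte stream beneath every API:
# a TCP echo server and client using only the standard library.
import socket
import threading

def run_echo_server(server_sock):
    conn, _ = server_sock.accept()
    with conn:
        data = conn.recv(1024)   # ordered, error-checked bytes
        conn.sendall(data)       # echo them back

# Bind to an ephemeral port on localhost
server = socket.create_server(("127.0.0.1", 0))
port = server.getsockname()[1]
threading.Thread(target=run_echo_server, args=(server,), daemon=True).start()

# Any API protocol layered above (HTTP, EDI over TCP, etc.) ultimately
# does this: open a connection, write bytes, read bytes back.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"GET /status")
    reply = client.recv(1024)

print(reply.decode())
```

Everything higher up the stack, from HTTP to messaging protocols, is a set of conventions for what those bytes mean.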


API Evangelist

DevOps Azure Style

17 Jan 2020

I am spending time thinking more deeply about how APIs can be delivered via Azure. I spent much of the holidays looking at how to deliver APIs on AWS, but only a small amount of time looking at Azure. I'm looking at how Azure can be used for the development and delivery of APIs, trying to understand the different ways you can use not just Azure for managing APIs, but also use Azure APIs for managing your APIs. Next up is Azure DevOps, and learning more about the nuts and bolts of how the orchestration solution allows you to streamline and stabilize the delivery of your API infrastructure using Azure. First, I want to just break down the core elements of Azure DevOps, learning more about how Azure sees the DevOps workflow and how they have provided a system to put their vision to work. Here are the main elements of Azure DevOps that help us understand the big picture when it comes to mapping to your API life cycle:

Azure DevOps Server - Share code, track work, and ship software using integrated software delivery tools, hosted on-premises.
Azure Boards - Deliver value to your users faster using proven agile tools to plan, track, and discuss work across your teams.
Azure Pipelines - Build, test, and deploy with CI/CD that works with any language, platform, and cloud. Connect to GitHub or any other Git provider and deploy continuously.
Azure Repos - Get unlimited, cloud-hosted private Git repos and collaborate to build better code with pull requests and advanced file management.
Azure Test Plans - Test and ship with confidence using manual and exploratory testing tools.
Azure Artifacts - Create, host, and share packages with your team, and add artifacts to your CI/CD pipelines with a single click.
Azure DevTest Labs - Fast, easy, and lean dev-test environments.

Not every API implementation will use all of these elements, but it is still nice to understand...[Read More]
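To make the Azure Pipelines piece a little more concrete, here is a hypothetical azure-pipelines.yml for an API project; the trigger branch, scripts, and file paths are illustrative assumptions of mine, not taken from any specific project.

```yaml
# Hypothetical azure-pipelines.yml -- branch names, scripts, and
# paths are placeholders for illustration.
trigger:
  - main

pool:
  vmImage: 'ubuntu-latest'

steps:
  - script: npm ci
    displayName: 'Install dependencies'
  - script: npm test
    displayName: 'Run API contract tests'
  - task: PublishTestResults@2
    inputs:
      testResultsFiles: 'results/*.xml'
```

A definition like this lives in the repo alongside the API code, which is part of what makes the Azure Repos and Azure Pipelines pairing work.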


API Evangelist

A View of the API Delivery Life Cycle from the Azure Getting Started Page

17 Jan 2020

I am working my way through doing more work around the multi-cloud deployment of APIs and spending some more time on the Azure platform here in 2020, and I found their getting started page pretty reflective of what I'm seeing out there when it comes to delivering the next generation of software. When landing on the AWS home page it can be overwhelming to make sense of everything, and I thought that Azure organized things into a coherent vision of how software is being delivered in the cloud.

Infrastructure - Providing the fundamental building blocks of compute for all of this: Linux virtual machines, and Windows virtual machines. I never thought I'd see Linux and Windows side by side like this.

Languages - Acknowledging there are multiple programming languages to get the job done: .NET, Python, Java, PHP, Node.js, and Go. Again, I never thought I'd see such strong support for anything beyond .NET.

Application - This nails the different layers in which I see folks delivering API infrastructure: Web Apps, Serverless Functions, Containers, Microservices with Kubernetes, and Microservices with Service Fabric. I think it's silly to put microservices there, because APIs are delivered in all of them.

Database - The database layers behind the APIs we are all delivering across operations: Relational Databases, SQL Database as a service, SQL Database for the edge, SQL Server on Azure, PostgreSQL database as a service, MySQL database as a service, and Azure Cosmos DB (NoSQL). Again, I am blown away to see MySQL and PostgreSQL along with SQL Server.

Storage - Where you put all of your blobs and other objects used across your APIs: Blob Storage. I'd say this layer is a little anemic compared with other cloud environments.

Machine Learning - Acknowledging that machine learning is a growing area of API deployment: Machine Learning, Cognitive Services, and Azure Notebooks. This area will continue to grow pretty rapidly in coming years in all industries.
Interfaces - The ways in which we are interfacing with the software development life cycle: Azure CLI ...[Read More]


API Evangelist

What Is Your API Development Workflow?

16 Jan 2020

I am going to invest in a new way to tell stories here on API Evangelist—we will see if I can make this stick. I enjoy doing podcasts but I am not good at the scheduling and reliable repetition many expect of a podcast. Getting people to join me on a podcast takes a lot of work (I know from experience) to do reliably. People usually want to talk, but finding slots in both of our schedules and getting them to jump online and successfully record an episode isn't easy to do on a regular basis. However, I still want to be able to craft audio narratives around specific topics that are relevant to the API sector, while also allowing many different voices to chime in. So I've come up with a formula I want to test and see if I can build some momentum. To help stimulate the API conversation and bring in other voices I want to pose a single question on a regular basis and solicit audio responses from folks across the API space, then compile the results into a single podcast that I will publish on the blog and via other channels. All folks need to do in their response to one of my questions is open up their phone, record their response, and send me the resulting audio file via email, DM, or carrier pigeon. Then I will organize all the responses into a single coherent podcast, with me opening, asking my question, then chaining together the responses, and closing up with a little analysis. Make sense? A kind of asynchronous podcast conversation amongst several participants. Ok, let's start with my first question: How do you develop APIs? Describe how you or your team actually develops an API. What is the workflow for how you go from idea to production, and what tools and services are involved? Be honest. I am not looking for fluff or pie in...[Read More]


API Evangelist

My Eventbrite API Keys Were Easy To Find

16 Jan 2020

If you read my blog regularly you know I rant all the time about having to sign up for new APIs and then find my API keys and tokens. API providers excel at making it extremely difficult to get up and running with an API, even once you have read their documentation and figured out what their API is all about. So when I come across API providers doing it well, I have to showcase it here in a blog post. Today's shining example of how to make it easy to find your API keys comes from the Eventbrite API. I was crafting a Postman API capability collection for my boss the other day, and I needed to find an API key to get the data I needed out of the Eventbrite API. Finding the API paths we needed to get the event and registration data had already taken us some time, so I was fully expecting the usual friction when it came to finding my API key. Then I clicked on the Eventbrite authentication page, clicked on the link telling me to visit my API keys page, and there they were! No hunting or detective work required—my keys were prominently placed above the fold. Amazing!!! This is how it should be. I shouldn't have to look around for my key—it is the 2020s. Please stop hiding my keys and making it hard for me to find what I need to get up and running with your API. As you are planning out how to develop and deploy the user experience for the API management layer of your operations, make sure you pick 25 existing public APIs, then sign up and find your keys. Learn from the experience and put your keys at a common URL that is prominently linked from your documentation and authentication page. If you have a favorite API where you think adding an application and finding your keys is the pattern...[Read More]


API Evangelist

API Life Cycle Governance Beyond Just API Design

16 Jan 2020

When you hear enterprise organizations talk about API governance they usually mean the governance of API design practices across the organization. This is the place where everyone starts when it comes to standardizing how APIs are delivered. It makes sense to start here because this is where the most pain is experienced at scale when you try to put APIs to work across a large enterprise organization. Even if all APIs and microservices are REST(ish), there are so many different ways you can deliver the details of an API--you might as well be using APIs from different companies when trying to put APIs developed across different teams to use in a single application. This makes API design the first stumbling block teams consider when planning API governance, and something that makes a meaningful impact on how APIs are delivered. After working with enterprise organizations who have been on their API journey for 5+ years I have begun to see API governance move beyond API design, and begin to look at other stops along the API life cycle, working to standardize other critical elements. Here are some of the next steps I see enterprise organizations taking when it comes to getting a handle on API governance across teams:

Documentation - Making sure everyone is using the same services and tooling for documenting APIs, making sure the most common elements are present, and all APIs are well defined.
Monitoring - Requiring all teams to monitor APIs and report upon the availability of each API, establishing a common monitoring and reporting practice that is consistent across all development teams.
Testing - Standardizing tooling and approaches to API testing, indexing and cataloging the tests that are in place, and beginning to measure the test coverage for any API in production.
Performance - Looking at the speed of APIs and making sure that all APIs are benchmarked as soon as they are developed, then measured against that across multiple...[Read More]
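Some of these documentation-level governance checks can be expressed in a few lines once an OpenAPI definition is in hand. This is an illustrative Python sketch, not any specific vendor's tooling; the rule names and the sample spec are invented for the example.

```python
# Illustrative governance sketch: check that common documentation
# elements are present in an OpenAPI definition held as a dict.
HTTP_METHODS = {"get", "post", "put", "patch", "delete"}

def lint_openapi(spec):
    """Return a list of governance violations found in the spec."""
    violations = []
    if not spec.get("info", {}).get("description"):
        violations.append("info: missing description")
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            if method not in HTTP_METHODS:
                continue
            for field in ("summary", "description", "operationId"):
                if not op.get(field):
                    violations.append(f"{method.upper()} {path}: missing {field}")
    return violations

# Invented sample spec with two deliberate gaps
spec = {
    "info": {"title": "Images API"},  # no description -> violation
    "paths": {
        "/images": {
            "get": {"summary": "List images", "operationId": "listImages"},
        }
    },
}
for v in lint_openapi(spec):
    print(v)
```

Running checks like this in CI is one way the "same services and tooling" requirement gets enforced rather than just recommended.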


API Evangelist

Eventbrite Events with Order Count and Capacity Using the API

15 Jan 2020

My boss asked me if I could build a Postman collection that would pull our future events from Eventbrite and display ticket counts for each individual event. So I got to work hacking on the Eventbrite API, learning each of the events API paths, stitching together what I needed to pull together my Postman collection for this new API capability. I'm a big fan of not just creating reference collections for different APIs like the Eventbrite API, but also creating individual capability collections that use one or many API requests to deliver on a specific business objective. I was able to craft my Postman API capability collection using two Eventbrite APIs, getting me the data I needed to satisfy my boss's request:

Events By Organization - Pulls all of the future active events for our Eventbrite organization.
Event Orders - Pulls the orders for each individual event, pulling the relevant information needed to assess each event.

This Eventbrite event order Postman capability collection only has one request in it, but I call the second API multiple times using a test script for the request. So in the end I'm making multiple API calls using a single Postman request, allowing me to get at what I need for each future event across multiple APIs--abstracting away some of the complexity. I have published the collection as a Postman template which you can access via the Postman documentation I've published, but you will need to add your own Eventbrite token and organization id to actually execute it. Once you have entered these properties you can click send and see a listing of events with ticket counts as well as maximum capacity for all the future events using the Postman visualizer tab. I've added this Postman capability collection to my list of individual API collections I've been building, providing a list of the common things I need to accomplish across the different platforms I depend on for my...[Read More]
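The aggregation a capability collection like this performs is simple once the responses are in hand: join events to their orders and total the tickets. Here is a hedged Python sketch of that idea, with field names (id, name, capacity, quantity) chosen for illustration rather than taken from the actual Eventbrite schema.

```python
# Sketch of the capability's aggregation step: combine a list of
# events with per-event orders into ticket counts vs capacity.
# Field names here are illustrative, not the Eventbrite schema.
def summarize_events(events, orders_by_event):
    rows = []
    for event in events:
        orders = orders_by_event.get(event["id"], [])
        sold = sum(o["quantity"] for o in orders)  # tickets across all orders
        rows.append({"name": event["name"], "sold": sold,
                     "capacity": event["capacity"]})
    return rows

# Invented sample data standing in for API responses
events = [
    {"id": "e1", "name": "API Workshop", "capacity": 100},
    {"id": "e2", "name": "Postman Meetup", "capacity": 50},
]
orders_by_event = {
    "e1": [{"quantity": 2}, {"quantity": 3}],
    "e2": [{"quantity": 1}],
}
for row in summarize_events(events, orders_by_event):
    print(f'{row["name"]}: {row["sold"]}/{row["capacity"]}')
```

In the collection itself this same join happens in the request's test script, feeding the result to the Postman visualizer.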


API Evangelist

Why Hasn’t There Been Another Stripe or Twilio?

13 Jan 2020

Stripe and Twilio are held up as shining examples of how to do APIs in our world. This shining blueprint of how to do APIs has been around for a decade for others to follow. It isn't a secret. So, why haven't we seen more Stripes or Twilios emerge? Don't get me wrong, there are other well done APIs that have emerged, but none of them have received the attention and level of business that Stripe and Twilio have enjoyed. These things always get me thinking and wondering what the reality really is, and if the narrative we are crafting is the one that fits with reality on the ground—pushing me to ask the questions that others aren't always prepared to ask. I am going to spend some time flagging some of the new APIs who do rise to the occasion, but while I am working on that I wanted to pose some questions about why we haven't seen the Twilio and Stripe model followed by more API providers. Here are a few of my thoughts as I work through this view of the API landscape, helping me understand why there aren't more API rockstars to showcase:

Investment - Investment cycles have changed, and the investment you need to do this right hasn't been available to startups in the last five years.
Blueprint - Twilio and Stripe are not a blueprint that applies universally to other APIs, but worked well in those business verticals.
APIs - This use case of APIs is not as universal as we think it is, and is not something that will work being applied to all business verticals.
Skills - It takes more skills than we anticipate when it comes to actually delivering an API as well as Twilio and Stripe have done.
Cloud - The dominance of the cloud providers is making it harder for small API startups to get traction and the attention of investors.
Wrong - These...[Read More]


API Evangelist

The State of Simple CRUD API Creation

09 Jan 2020

With all the talk of APIs you would think it would be easier to publish a simple Create, Read, Update, and Delete (CRUD) API. Sure, there are a number of services and open source solutions for publishing a CRUD API from your database, but for me to just say I want a CRUD resource, give it a name, push a button, and have it—there isn't much out there. I should be able to just write the word "images", hit go, and have a complete images API that I can add properties to the schema, and query parameters to each method. After ten years of doing this I am just amazed that the fundamentals of API delivery are still so complicated and verbose. We even have the vocabulary to describe all of the details of my API (OpenAPI), and I still can't just push a button and get my API. I can take my complete OpenAPI definition and publish it to AWS, Azure, or Google and "generate my API", but it doesn't create the backend for me. There have been waves of database or spreadsheet to API solutions over the years, but there is no single API solution to plant the seeds when there is no existing data source. Over the holidays I managed to create a Postman collection that will take my OpenAPI from a Postman-defined API and generate an AWS DynamoDB and AWS API Gateway instance of the API, but it was the closest I could get to what is in my head across AWS, Azure, and Google. Why can't I just hit GO on my OpenAPI, and have an API in a single click? No matter which cloud provider I am on! The reasons why I can't immediately have a CRUD API are many. Some are technical. Most are business reasons. I would say it is primarily a reflection of our belief that we are all innovative special snowflakes, when in reality we are all...[Read More]
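The "give it a name, hit go" idea can at least be sketched for the definition side of the equation. Assuming a resource name like "images", a minimal CRUD OpenAPI 3.0 skeleton could be generated like this; the structure and operationId naming here are my own illustration, not an existing tool.

```python
# Sketch: generate a minimal CRUD OpenAPI 3.0 skeleton from a
# single resource name. Naming conventions are illustrative.
def crud_openapi(resource):
    item = f"/{resource}/{{id}}"
    return {
        "openapi": "3.0.0",
        "info": {"title": f"{resource.title()} API", "version": "1.0.0"},
        "paths": {
            f"/{resource}": {
                "get": {"operationId": f"list_{resource}"},
                "post": {"operationId": f"create_{resource}"},
            },
            item: {
                "get": {"operationId": f"get_{resource}"},
                "put": {"operationId": f"update_{resource}"},
                "delete": {"operationId": f"delete_{resource}"},
            },
        },
    }

spec = crud_openapi("images")
print(sorted(spec["paths"].keys()))
```

The hard part the post is complaining about is everything this sketch leaves out: provisioning the data store and wiring the paths to it.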


API Evangelist

A Postman API Governance Collection

09 Jan 2020

You can use Postman to test your APIs. With each request you can include a test script which evaluates each incoming response and validates specific elements, displaying the test results along with each response. However, you can also use the same mechanisms to evaluate the overall design of any API you are managing with Postman. One of the new beta features of Postman is being able to manage your APIs, allowing you to define each API using OpenAPI 3.0, then generate collections, mocks, docs, and tests with Postman. This got me thinking—why can't we use the new Postman API manager, plus the Postman API, and script testing for governing the design of an API? To explore the possibilities I created a Postman collection for applying some basic API design governance to any API you have defined in a Postman workspace. The collection uses the Postman API to pull the OpenAPI for each API and store it within an environment, then there are a range of basic requests that can be made to evaluate the design of the APIs that we have defined as an OpenAPI. The collection is a proof of concept, and is meant to be a starting point for designing many different types of API governance rules, and thinking about how Postman collections can be used to govern the API life cycle, starting with the design of our APIs—something that is exposed as OpenAPI. My new Postman API governance collection has a handful of folders, and the following requests:

Info - Looking at the general info for the API: Validate the Name of the API, and Validate the Description for the API.
Paths - Evaluating the design patterns of each API path: Ensure Words Are Used in Paths.
Methods - Looking at the details of each API method: Check for GET, POST, PUT, and DELETE; Check All Methods Have Summaries; Check All Methods Have Descriptions; Check All Methods Have Operation Ids; Check All...[Read More]
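To make the path-level governance idea concrete, here is a small Python sketch of the kinds of checks a "Paths" folder might perform; the specific rules below are examples of mine, not the actual rule set in the collection.

```python
# Illustrative path-design checks -- the rules are example
# conventions, not the governance collection's actual rules.
def check_path(path):
    problems = []
    if path != path.lower():
        problems.append("use lowercase")
    if "_" in path:
        problems.append("use hyphens, not underscores")
    if len(path) > 1 and path.endswith("/"):
        problems.append("no trailing slash")
    return problems

print(check_path("/Images_Archive/"))
```

In the Postman collection the equivalent logic lives in request test scripts running against the OpenAPI stored in the environment.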


API Evangelist

Spreading API Collections From My Personal Workspaces Across Multiple Workspaces

08 Jan 2020

As a Postman user for a number of years I have several hundred random collections littering my personal workspace. I had noticed when workspaces emerged a while back, but really hadn't ever put much thought into how I organize my collections. As the number of collections grows I'm noticing performance issues within Postman, and general chaos, because I work primarily from within my personal workspace. This is pushing me to step back and think more holistically about how I create, store, organize, and share my API collections within the Postman platform and beyond, using AWS S3 and GitHub, forcing a little organization and structure on how I move APIs forward across their own API life cycle trajectory. First, when working in my personal workspace there were performance issues using Postman. There were just too many Postman collections in there to be efficient. This further slowed me down when it came to finding the collections I needed. Having to look purely alphabetically for collections that could have any sort of naming conventions applied to them took way too much time. Thinking about the different buckets in which I operate and get work done proved to be helpful, helping me create a handful of workspaces to organize my API collections into, rather than just operating from a single workspace filled with hundreds of APIs I have imported over the years. My first task was to just delete things that were clearly junk. Then I looked at all my collections via the Postman API to see if there was any last modified or run date—sadly there isn't. I will have to think about ways in which I can track the evolution and usage of my Postman collections so that I can consider automating the cleanup of collections, or at least archiving them based upon whether they have been modified or not. Once I cleaned up a little bit I was able to see...[Read More]
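One way to impose that structure is to group the flat list of collections the Postman API returns into candidate workspaces using a naming convention. A sketch of the idea, where the "prefix before the first colon" convention is my own invention, not a Postman feature:

```python
# Sketch: group a flat list of collections into candidate
# workspaces by name prefix. The colon-prefix convention is
# an invented example, not anything Postman enforces.
from collections import defaultdict

def group_by_prefix(collections):
    groups = defaultdict(list)
    for c in collections:
        name = c["name"]
        prefix = name.split(":", 1)[0].strip() if ":" in name else "Uncategorized"
        groups[prefix].append(name)
    return dict(groups)

# Invented sample standing in for the Postman API response
collections = [
    {"name": "Eventbrite: Event Orders"},
    {"name": "Eventbrite: Events by Organization"},
    {"name": "Random scratch requests"},
]
print(group_by_prefix(collections))
```

Anything landing in the "Uncategorized" bucket is a candidate for renaming, archiving, or deletion.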


API Evangelist

Postman Tutorials are Common but the Postman Collection is Often Missing

08 Jan 2020

I am amazed at the number of blog posts I come across from API providers explaining how their API consumers can use Postman with their API, that do not actually share a complete Postman collection for developers to use. API providers obviously see Postman as a tool for making API calls, but do not fully grasp the ability to document an API with a Postman collection, then save, publish, and share this collection with documentation or the Run in Postman button. As part of this realization I am not looking to shame API providers for not understanding what is possible, I am more looking to acknowledge how much work we (Postman) have to do when it comes to helping folks understand what is possible with the Postman platform, moving folks beyond the notion that Postman is just an HTTP client. There are some pretty involved tutorials out there for using Postman with a variety of APIs. API providers have invested a lot into these blog posts, tutorials, and other resources to help their API consumers on-board with their APIs, but do not publish Postman collections as part of their documentation or tutorials. This tells me that API providers aren't seeing the portability and share-ability of Postman collections. They simply see Postman as an API client, not as tooling for defining, sharing, publishing, versioning, saving, organizing, and evolving API requests. This means we have a lot of work ahead of us to educate folks about what Postman collections are, and how they will make your life easier, while reducing redundancy across operations, helping folks move beyond simply operating Postman as an isolated HTTP client. Having full control over defining a request to an API while being able to see the details of that response is the core value of Postman. Developers get it. Clearly they also see the need to share this ability, and to equip others to realize the same value. They are crafting tutorials and blog posts...[Read More]


API Evangelist

Deploy, Publish or Launch An API?

08 Jan 2020

I'm always fascinated by the words we use to describe what we do in a digital world. One dimension of the API life cycle that perpetually interests me is the concept of deploying an API, or as some might call it, publishing or launching. I am fascinated by how people describe the act of making an API available, but I'm even more interested in the shadows that exist within these realities. Meaning, within a 30 minute Googling session for publish, deploy, and launch an API, I come across many real world examples of delivering an API, but few of them will deliver the actual tangible, functional, nuts and bolts of the API. After searching for publish API, here is what stood out: Apigee, SwaggerHub, Postman, Oracle, Broadcom, Azure, MuleSoft, WSO2, SAP, and Socrata. After searching for deploy API, here is what stood out: AWS API Gateway, Firebase, Google, Serverless Stack, Mendix, API Platform, API Evangelist, GitHub, and Heroku. After searching for launch API, here is what stood out: Adobe Launch, SpaceX, Apple Launch Services, and RapidAPI. 80% of these will not actually deliver the API, they will just take an existing API and make it available. I know most of these service providers believe that their solution does deploy an API because it proxies an existing API, but really very few of these actually deliver the API; they more publish, deploy, and launch it into some state of availability—the final act of making it available and open for business. After all these years of studying API gateway and management providers I'm still fascinated by the lack of true API deployment present, and how much it is about proxying what already exists, creating a shadow that continues to prevent us from standardizing how we deliver APIs.[Read More]


API Evangelist

Dead Simple Real World API Management

08 Jan 2020

I began API Evangelist research almost a decade ago by looking into the rapidly expanding concept of API management, so I think it is relevant to go into 2020 by taking a look at where things are today. In 2010, the API management conversation was dominated by 3Scale, Mashery, and Apigee. In 2020, API management is a commodity that is baked into all of the cloud providers, and something every company needs. In 2010 there were no open source API management providers, and in 2020 there are numerous open source solutions. While there are forces in 2020 looking to continue moving the conversation forward with service mesh and other next generation API management concepts, I feel the biggest opportunity is in tackling the mundane work of just effectively managing our APIs using simple real world API management practices. I am neck deep in working to deploy a simple set of APIs, looking for the path of least resistance when it comes to going from 0 to 60 with a new API. After playing around with AWS, Azure, and Google for a couple days, reminded of how robust, but also complex, some of their API management approaches can be, I find myself on the home page of API Evangelist, staring at the page, and I click on my sole sponsor Tyk—finding myself pleasantly reminded how effective simple real world API management can be. Within 10 minutes I have signed up for an account and begun managing one of my prototype APIs, allowing me to:

Add API - Add the URL and authentication for one of my project APIs.
Version - Choose to version, or not version, the API I am deploying.
Endpoints - Design a fresh set of endpoints transforming my API.
Load Balance - Round-robin load-balance traffic to all my APIs.
Regions - Manage the geographic distribution of my API infrastructure.
Rate Limit - Limit the number of API calls that can be made to each API.
Users...[Read More]
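To ground one of those mundane management features, here is a token bucket rate limiter sketched in Python; this illustrates the general concept behind a rate limit setting, not how Tyk actually implements its limiting.

```python
# Token bucket sketch: tokens refill at a fixed rate, each request
# spends one, and requests are rejected when the bucket is empty.
# Illustrative only -- not Tyk's implementation.
import time

class TokenBucket:
    def __init__(self, rate, capacity):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill based on elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=1, capacity=2)   # 1 request/sec, burst of 2
results = [bucket.allow() for _ in range(3)]
print(results)
```

A gateway applies something like this per key or per API, which is what turns a raw endpoint into a managed one.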


API Evangelist

Postman Open Source

07 Jan 2020

I get asked a lot if Postman is open source. I get told occasionally that people wish it was open source. I have to admit I didn't fully grasp how open Postman was until I helped work on the new open source philosophy page for Postman. While the Postman application itself isn't open source (it is built on open source), the core building blocks of Postman are open source, shifting my view of how you can use the application across operations, expanding Postman usage beyond just being a solitary desktop application, and turning it into a digitally scalable gear on the API factory floor. Postman as a desktop application is not open source, but here are the core components that are open source, making Postman something you can run anywhere:

Postman Runtime - The core runtime of Postman that allows you to run collections, including requests, scripts, etc., anywhere, extending the work that gets done within the application to anywhere the runtime can be installed and executed.
Postman Collections Format - The collections you save and share with Postman are all open source and can be shared, exported, published, and used as a unit of currency within any application or system, further extending the reach of the platform.
Newman - Command-line tool for running and testing a Postman Collection as part of any pipeline, making Postman collections a unit of compute that can be baked into the software development life cycle, and leveraged as API truth wherever it is needed.
Postman Collection SDK - SDK to quickly unlock the power of the Postman Collections format using JavaScript, allowing you to create, manage, and automate how collections are defined and put to work across a platform without depending on the application.
Postman Code Generators - Convert Postman collections to usable code in more than 20 different programming languages, generating simple client scripts for consumers that are defined by the Postman collections used as the code generator's definition.
I am...[Read More]
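To make the openness concrete, here is a rough sketch of what a collection looks like in the open Collection Format. The schema URL and overall shape follow the published v2.1 format, while the request itself is just an illustrative example of my own:

```javascript
// A minimal document in the open Postman Collection Format v2.1.
// The schema URL is the published one; the request is a made-up example.
const collection = {
  info: {
    name: "Echo Demo",
    schema: "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
  },
  item: [
    {
      name: "Get Echo",
      request: {
        method: "GET",
        url: "https://postman-echo.com/get?hello=world"
      }
    }
  ]
};

// Because the format is plain JSON, a collection can be saved, diffed,
// and shared like any other file in a repository.
const json = JSON.stringify(collection, null, 2);
console.log(json.includes("v2.1.0"));
```

Once saved to a file, a collection like this is exactly what Newman runs from the command line, no desktop application required.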


API Evangelist

Challenges Binding APIs Deployed Via Gateway To Backend Services

07 Jan 2020

I spent some of the holidays immersed in the backend integrations of the top three cloud providers, AWS, Azure, and Google. Specifically I was studying the GUI, APIs, schema, mapping, and other approaches to wiring up APIs to backend systems. I am looking for the quickest API-driven way to deploy an API, and hook it up to a variety of meaningful resources on the backend, beginning with SQL and NoSQL data stores, but then branching out to discover the path of least resistance for more complex backends. Maybe it is because of my existing experience with Amazon, but I found the AWS approach to wiring up integrations using OpenAPI to be the easiest to follow and implement, over what Azure and Google offered. Eventually I will be mapping out the landscape for each of the providers, but at first look, Azure and Google required substantially more work to understand and implement even the most basic backends for a simple API. Don’t get me wrong, if you want to just gateway an existing API using AWS, Azure, or Google, it is pretty straightforward. You just have to learn each of their mapping techniques and you can quickly define the incoming request, and outgoing response mappings without much effort. However, for this exercise I was looking for an end-to-end actual deployment of an API, not the proxying or Hollywood front for an existing API. If you want to launch a brand new API from an existing datasource, or a brand new API with a brand new data source, I found AWS to be the path of least resistance. I was able to launch a full read / write API using AWS API Gateway + AWS DynamoDB with no code, something I couldn’t do on Azure or Google, without specific domain knowledge of their database solutions. I had only light exposure to DynamoDB, and while there were some quirks of the implementation I had to get over,...[Read More]
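As a hedged illustration of the OpenAPI wiring I found easiest on AWS, this sketch attaches the x-amazon-apigateway-integration extension to a single operation. The extension name is real, but the ARNs, role, and request mapping below are placeholders of my own, not a working deployment:

```javascript
// Sketch of how AWS API Gateway binds an OpenAPI operation to a backend
// using the x-amazon-apigateway-integration extension. The ARNs, role,
// and table name are illustrative placeholders.
const pathItem = {
  "/items": {
    get: {
      responses: { "200": { description: "OK" } },
      "x-amazon-apigateway-integration": {
        type: "aws",                         // direct AWS service integration
        httpMethod: "POST",                  // DynamoDB actions are POSTed
        uri: "arn:aws:apigateway:us-east-1:dynamodb:action/Scan",
        credentials: "arn:aws:iam::123456789012:role/my-gateway-role",
        requestTemplates: {
          "application/json": JSON.stringify({ TableName: "items" })
        },
        responses: { default: { statusCode: "200" } }
      }
    }
  }
};

console.log(Object.keys(pathItem["/items"].get));
```

When an OpenAPI document carrying extensions like this is imported into API Gateway, the gateway knows both the interface and the backend it should call, which is what made the no-code DynamoDB deploy possible.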


API Evangelist

Academic or Street API Tooling

07 Jan 2020

It always seems like there are two separate types of tools in my world, the academic tools that consider the big picture and promise to steer me in the right direction, and then the street tooling that helps me get my work done on a day to day basis. After going to work for a street tooling vendor who has some academic tooling aspirations, it has gotten me thinking more about the tools I depend on, and learning more about what people are using within the enterprise to get their work done each day. I have used different academic tooling over my life as the API Evangelist. I’d say every API management tool I’ve adopted has been very academic until recently. From my view API management started as academic and then became a factory floor commodity. I would say Kong and Tyk are the only ones that have achieved street level status within all of this, and NGINX is looking to turn its street cred into something that is more academic, and visionary. There aren’t many academic API tools that have gone from vision to implementation—they just can’t survive the investment and acquisition cycles that gobble them up, making it difficult to see the real adoption they need to become baked into our daily lives. API management has done it, but very few other stops along the API life cycle have realized this level of adoption. Street tooling, or the hand tools developers use to get their jobs done on a daily basis, are a much different beast. Postman and NGINX are both examples of tools that developers know about and depend on to operate each day, using NGINX to deploy and Postman to consume APIs. These aren’t tools that promise some grand vision of how we could or should be, these are tools about dealing with what is right in front of us. These are tools that keep...[Read More]


API Evangelist

The Fundamentals: Deploying APIs From Your Databases

06 Jan 2020

You know, I tend to complain about a lot of things across the API space, while focusing on the damage caused by fast moving technology startups and the venture capital that fuels them. Amidst all of this forward motion I easily forget to showcase the good in the space. The things that are actually moving the conversation forward and doing the hard work of connecting the dots when it comes to APIs. I easily forget to notice when there are real businesses chugging along delivering useful services for all of us when it comes to APIs. One of my favorite database-to-API businesses out there, and one of the companies that has been around for a significant portion of my time as the API Evangelist, working hard to help people deploy APIs from their databases, is SlashDB. If you want to deploy APIs from your databases, SlashDB is the solution. If you are looking to make data within MySQL, PostgreSQL, SQLite, MS SQL Server, Oracle, IBM DB2, Sybase, RedShift, NoSQL, or another data source available quickly as an API, SlashDB has the solutions you are looking for. SlashDB isn’t one of those sexy new startups with a bunch of venture funding looking to be your new API best friend. SlashDB is looking to do the mundane, difficult work needed to make the data within your legacy databases available as APIs, so that you can use it across your applications. SlashDB is all about securely exposing your data using standardized web APIs, making your digital resources available wherever you need them. SlashDB doesn’t have the splashy website, but they have the goods when it comes to doing one of the most common tasks when deploying APIs—wiring up your APIs to their data backends. They also have straightforward pricing tiers for you to navigate as you expand the number of data sources you are wiring up, and the number of consumers you have consuming data...[Read More]


API Evangelist

Postman Collections For Pulling My Twitter Friends And Followers

06 Jan 2020

I have been cranking out the Twitter API capabilities lately, crafting single request Postman collections that focus on a specific capability of the popular social API. I use the API for a number of different things around API Evangelist, and as I assess how I use the social media API I wanted to engineer my integrations as Postman collections so I can better organize and execute using Postman, while also adding to the list of API capabilities I’m sharing with my audience of developers and non-developers. Today I cranked out two individual Twitter API capabilities helping me better manage my Twitter followers and friends:

Twitter Followers - Pulls your Twitter followers 200 at a time, saves them within an environment, then allows you to increment through each page of followers, eventually pulling and storing all of your followers.

Twitter Friends - Pulls your Twitter friends 200 at a time, saves them within an environment, then allows you to increment through each page of friends, eventually pulling and storing all of your friends.

These capabilities are separate Postman collections so that they can be used independently, or together. I am keeping them organized in a Postman workspace so that I can use them manually, but then also have a daily monitor running, pulling any new followers or friends from my Twitter. I pull the resulting JSON from the environments I pair up with each collection using the Postman API and integrate it into some of my other API Evangelist monitoring and automation. Next I am going to create a Postman collection that will reconcile the two lists and tell me which people I am following do not follow me back, creating a third list that I can use to unfollow and clean up my profile. Crafting these types of collections helps me renew my understanding of some of the APIs I already use. It also helps me better define the individual capabilities I put to work on a daily basis, and...[Read More]
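The incrementing both collections do can be sketched as a simple cursor walk. The fetchPage function below is a stub of my own, standing in for Twitter's followers/list endpoint, which hands back up to 200 users per page along with a next_cursor value (a cursor of 0 means you are done):

```javascript
// Stub standing in for Twitter's followers/list endpoint: two pages of
// fake followers, then a terminating cursor of 0.
function fetchPage(cursor) {
  const pages = {
    "-1": { users: ["a", "b"], next_cursor: 123 },
    "123": { users: ["c"], next_cursor: 0 }
  };
  return pages[String(cursor)];
}

// Walk every page, accumulating followers until the cursor runs out.
function pullAllFollowers() {
  let cursor = -1;             // -1 asks for the first page
  const followers = [];
  while (cursor !== 0) {
    const page = fetchPage(cursor);
    followers.push(...page.users);
    cursor = page.next_cursor; // in Postman this lives in the environment between runs
  }
  return followers;
}

const all = pullAllFollowers();
console.log(all); // => [ 'a', 'b', 'c' ]
```

In the actual collections the cursor is saved to the Postman environment after each request, which is what makes it possible to increment through pages manually or on a monitor schedule.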


API Evangelist

My Levels of Postman API Environment Understanding To Date

06 Jan 2020

I have been a pretty hardcore Postman user since the beginning. Over the years I felt like I understood what Postman was all about, but one of the first concepts that blew up my belief around what Postman could do was the concept of the Postman environment. Like other Postman features, environments are extremely versatile, and can be used in many different ways depending on your understanding of Postman, as well as the sophistication of the APIs and the workflow you are defining using Postman. My Postman environments awakening has occurred in several phases, consistently blowing my mind about what is possible with Postman and Postman collections. Postman environments are already one of the edges I have given Postman collections over a pure OpenAPI definition—they just provide more environmental context than you can get with OpenAPI alone. However, at each shift in my understanding of how Postman environments can be used, entirely new worlds opened up for me regarding how that context can be applied and evolved over time across many different APIs. Resulting in four distinct layers of understanding about how Postman environments work and can be applied in my world—I’m sure there will be more dimensions to this, but this is a snapshot of how I see things going into 2020.

Environment Settings For Single API Calls

I have to start with the ground floor and express why environments matter in the first place, and provide an edge over OpenAPI all by itself. Being able to define key / value pairs for authorization and other variables across one or many different API collections helps speed up the on-boarding, orchestration, and reuse of API requests within those collections. It quickly allows you to switch users or other context, but still use the same collection of API requests, shifting how we automate and orchestrate across our API infrastructure. However, simply putting the base url for your API as a variable, and defining tokens and other...[Read More]
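A minimal sketch of why that ground floor matters. The resolve() function below is my own stand-in for Postman's {{variable}} substitution, not the actual implementation, but it shows how one request template plus two environments yields two different API calls:

```javascript
// My stand-in for Postman's {{variable}} substitution: swap each
// {{key}} for the matching value in the supplied environment.
function resolve(template, environment) {
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in environment ? environment[key] : match);
}

// One request template, reused across environments.
const request = "{{baseUrl}}/users?token={{token}}";

// Two key/value environments, like Postman environment files.
const staging = { baseUrl: "https://staging.example.com", token: "abc" };
const production = { baseUrl: "https://api.example.com", token: "xyz" };

console.log(resolve(request, staging));    // staging call
console.log(resolve(request, production)); // production call
```

Switching environments swaps every variable at once, which is why the same collection can serve on-boarding, testing, and production without touching a single request.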


API Evangelist

A Dynamic Salesforce REST API Postman Collection Builder Collection

06 Jan 2020

I have been working on developing new ways to make the Salesforce API more accessible and easier to onboard with over the last couple of months, helping reduce friction every time I have to pick up the platform in my work. One of the next steps in this work is to develop a prototype for generating a dynamic Postman collection for the Salesforce REST API. I had created a Postman collection for the API earlier, but the Salesforce team pointed out to me that the available APIs will vary not only from version to version, but also from user account to user account. With this in mind I wanted to develop a tool for dynamically generating a Postman collection for the Salesforce API, and as I got to work building it I realized that I should probably just make the tool a Postman collection itself (mind blown). To help make on-boarding with the Salesforce API easier I created a Postman collection that uses the Salesforce API to autogenerate the Postman collection based upon the available objects and endpoints for the Salesforce REST API. The Postman collection has three requests within it to accomplish the creation of a dynamic collection. The first request pulls all the latest versions of the Salesforce API, using the Salesforce API. Once I have the version of the Salesforce API I am targeting for a build I add it to the Postman environment I am using to define the operations of my Postman collection, and then I pull the list of available objects for this version, and for my own Salesforce account. The objects that exist will vary for each Salesforce account, as well as version, making it pretty critical that any Postman collection is dynamic, being generated from this personalized list of objects. The next request in our Salesforce Postman collection builder is the build, which generates individual requests for all of the available objects. After you run, the...[Read More]
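The build step can be sketched in a few lines. The buildCollection function and the stubbed object list are mine, while the /services/data/{version}/sobjects path follows the real Salesforce REST API pattern; in practice the objects come from the account's own sobjects listing:

```javascript
// Placeholder version and object list; in the real builder these are
// pulled from the Salesforce API and stored in the Postman environment.
const version = "v47.0";
const objects = ["Account", "Contact", "Lead"];

// Emit one request per available object, Collection Format style.
function buildCollection(objects, version) {
  return {
    info: { name: "Salesforce REST API (" + version + ")" },
    item: objects.map(name => ({
      name: "Describe " + name,
      request: {
        method: "GET",
        url: "{{instanceUrl}}/services/data/" + version +
             "/sobjects/" + name + "/describe"
      }
    }))
  };
}

const built = buildCollection(objects, version);
console.log(built.item.length); // one request per available object
```

Because the object list is personalized, two Salesforce accounts running the same builder end up with two different, correctly scoped collections.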


API Evangelist

The Many Differences Between Each API

03 Jan 2020

I’m burning my way through profiling, updating, and refreshing the listings for the 2K+ APIs in my directory. As I refresh the profile of each of the APIs in my index I am looking to make sure I have an adequate description of what they do, that they are well tagged, and I always look for an existing OpenAPI or Postman collection. These API definitions are really the most valuable thing I can find for an API provider, telling me what each provider’s API delivers, but more importantly doing the same for other consumers, service, and tooling providers. API definitions are the menu for each of the APIs I’m showcasing as part of my API research. As I refresh the profile for each API I re-evaluate how they do their API, not just the technical details of their API, but also the business and on-boarding of their API. If an API provider doesn’t have an OpenAPI, Postman collection, or other machine readable definition for their APIs, depending on the value of the API and the standardization of their API design and documentation, I will craft a simple scrape script to harvest the API definition, and generate the OpenAPI and Postman collection automatically. As I cycle through this process for each API in my index I’m reminded of just how different APIs can be, even if they are just RESTful or web APIs. Demonstrating that there are many interpretations of what an API should be, both technically, and from a business perspective. Some APIs have many different paths, representing a wide variety of resources and capabilities. Some APIs have very few paths, and heavily rely on query parameters to work the magic when it comes to applying an API. Others invest heavily in enumerators and the values of query parameters to extract what you need from each API—oftentimes forgetting to tell you what these values should or could be. Some of the time...[Read More]
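The harvesting I describe can be sketched as a small transform. Here the scraped endpoint list is hand-written, standing in for whatever a scrape script actually pulls out of a documentation page; the output is a bare-bones OpenAPI skeleton ready to be fleshed out:

```javascript
// Hand-written stand-in for the output of a docs-page scrape script.
const scraped = [
  { method: "get",  path: "/widgets" },
  { method: "post", path: "/widgets" },
  { method: "get",  path: "/widgets/{id}" }
];

// Fold the scraped endpoints into an OpenAPI 3.0 skeleton.
const openapi = {
  openapi: "3.0.0",
  info: { title: "Harvested API", version: "0.1.0" },
  paths: {}
};
for (const { method, path } of scraped) {
  openapi.paths[path] = openapi.paths[path] || {};
  openapi.paths[path][method] = {
    responses: { "200": { description: "OK" } }
  };
}

console.log(Object.keys(openapi.paths).length); // distinct paths found
```

Even a skeleton like this is enough to generate a Postman collection from, which is why scraping is worth the sixty seconds when a provider publishes nothing machine readable.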


API Evangelist

Pricing Comparison for Screen Capture APIs

03 Jan 2020

There is a pricing comparison between 33 separate screen capture APIs halfway down the page of this interesting piece about how to choose the right screen capture service. This type of comparison should exist across every business sector being impacted by APIs, as well as the new ones emerging to introduce entirely new digital resources for use in our desktop, web, mobile, device, and network applications. Sadly, right now these types of machine readable, let alone human readable, lists do not exist across the sector. Assembling these types of comparisons takes a lot of time and energy, and isn’t always possible in a special API snowflake of a world where seemingly similar APIs are actually very different beasts—sometimes intentionally, but usually unintentionally. I have had a machine readable schema for defining API pricing for almost five years now. I’ve profiled common resources like email, SMS, and others, but ultimately haven’t had the resources to invest in the work at the levels needed. I know how much work goes into establishing an exhaustive list of APIs in any business sector, as well as finding a price, and defining the access tiers for each individual API provider. I wish I had more resources to invest in the profiling of APIs, down to the level of detail where each of the individual API resources they offer has some sort of semantic vocabulary applied, and a machine readable definition of the pricing and on-boarding required for each API provider. This is how we are going to get to the API economy we all like to fantasize about, where we can automatically discover, on-board, pay for, and switch between valuable API resources as we need in real-time. We need to get to work on doing this for the most tangible, consistent, and valuable APIs across the sector. We won’t be able to do it for all types of APIs, and sometimes it will be an apples to oranges comparison, but we...[Read More]
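To show the kind of machine readable pricing I mean, here is a strawman entry. The property names are my own vocabulary for illustration, not an established standard, but once entries like this exist for every provider, comparing cost per call becomes a calculation instead of manual research:

```javascript
// Strawman machine readable pricing entry for one (fictional) provider.
const pricing = {
  provider: "example-screenshots",
  resource: "screen-capture",
  tiers: [
    { name: "free", monthlyPrice: 0,  includedCalls: 100 },
    { name: "pro",  monthlyPrice: 29, includedCalls: 10000, overagePerCall: 0.002 }
  ]
};

// With pricing as data, cost comparison across providers is trivial.
function costFor(tier, calls) {
  const extra = Math.max(0, calls - tier.includedCalls);
  return tier.monthlyPrice + extra * (tier.overagePerCall || 0);
}

console.log(costFor(pricing.tiers[1], 12000)); // 29 + 2000 calls of overage
```

Multiply this across 33 screen capture APIs and the comparison table in that article could regenerate itself every time a provider changes their tiers.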


API Evangelist

Not Just An API Provider But Also An API Matchmaker

03 Jan 2020

Your API is always the best. Of course it is. However, not everyone will see the value your API delivers without a little enlightenment. Sometimes the value of an API is missed in isolation, when you are just looking at what a single API can do. To help developers, as well as business users, understand what is possible it can help to connect the dots between your API and other valuable 3rd party APIs. This is something you see from API providers who have integration pages showcasing the different integrations that are already available, and those who have invested in making sure their API is available on integration platform as a service (iPaaS) providers like IFTTT and Zapier. If a new user isn’t up to speed on what your API does, it can help to put it side by side with other APIs they are already familiar with. Being aware of not just the industry you are operating an API within, but also complementary industries, is what we should all be striving for as an API provider. The most competitive API providers all have integration pages demonstrating the value that an API provides, but more importantly the value it can deliver when bundled with other popular services their customers are already using. This means that API providers have to be solving a real world problem, but also have done their homework when it comes to understanding a real world version of this problem that other people face. Or simply have enough consumers of an API demanding integrations with other commonly used platforms. Regardless of how an API provider gets there, having an awareness of other platforms that companies are depending on as part of their operations, and ensuring that your API solutions are compatible and interoperable by default, just makes sense. I find that playing with a complementary API in Postman helps you think about the moving parts of...[Read More]


API Evangelist

What Is The API Life Cycle?

02 Jan 2020

I regularly struggle with the words and phrases I use in my storytelling. I’m never happy with my level of word-smithing, as well as the final output. Ultimately I don’t let it stop me, I just push myself to constantly re-evaluate how I speak, being forever critical and often pedantic about why I do things, and why I don’t. One word I struggle with is lifecycle. First I struggle with whether it is one word or two. Historically I have been team one word, but more recently I’ve switched to two words. However, this round of anxiety over the phrase is more operational, and existential, than it is about how I use the word in my storytelling. I am more interested in whether we should even be using the phrase, and if we are, how we get more formal about quantifying exactly what we mean by the API life cycle. As I work to flesh out my API life cycle Postman collection, defining API-driven guard rails for how I deliver my APIs, and distilling each step down to a single request and set of pre and post request scripts, I am forced to think about what the API life cycle really is. Pushing me to go beyond just talking about some abstract concept, to actually having a set of interfaces and scripts that quantify each stop along the API life cycle. While I will be adding more stops to my Postman API life cycle collection, I currently have 27 stops defined, providing me with some concrete actions I can take at each point in the evolution of my APIs.

Define - Defining the central truth of the API using OpenAPI, JSON Schema, and Postman collections and environments.

Environments - Providing environments that drive different stages of the API life cycle in conjunction with various collections.

Design - Quantifying, standardizing, and evolving the HTTP and other design patterns I use across the APIs I deliver....[Read More]


API Evangelist

Deploying My Postman OpenAPI To AWS API Gateway

02 Jan 2020

I created a bunch of different Postman collections for AWS services leading up to re:Invent this year, and now I’m using individual requests to deliver on some different Postman AWS API life cycle workflows. To flesh out the scaffolding for how I define and deliver APIs throughout their API life cycle I got to work on a Postman collection for defining and executing every single stop in my API life cycle in a way that I could consistently apply across many different APIs. I am using Postman to define the central truth of each of my APIs with OpenAPI, and I want to use Postman to deliver and execute on that truth across every single stop along the API life cycle. One of the more critical stops I wanted to provide a solution for was API deployment, providing me with a simple way to immediately deploy an API from an OpenAPI definition. Deploying APIs is hard. It is one of the most complicated and least standardized stops along the API life cycle. Regardless, I wanted a simple, straightforward Postman collection that would allow me to take an API definition within Postman, and publish an API to one of the major cloud platforms—AWS won out for simplicity. Ultimately, using Postman I was able to pull an OpenAPI for one of my APIs, then deploy an API in five steps, providing a basic, introductory Postman collection for deploying a Postman API to AWS API Gateway.

Pull API - Loads up the specific version of a Postman API into the environment for processing within each of the next steps.

Create Table - Actually creates an AWS DynamoDB table derived from the name of the API being pulled from Postman.

Prepare OpenAPI - Takes the OpenAPI and generates AWS API Gateway integration extensions that define the backend.

Publish OpenAPI - Takes the new OpenAPI with integration extensions and publishes it to AWS API Gateway.

Deploy API - Actually deploys the API...[Read More]
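The ordering of those five steps can be sketched as a tiny pipeline. Every function below is a stub of mine standing in for one Postman request (Postman API, DynamoDB, and API Gateway calls respectively); the point is the sequence, and the shared environment object carrying state between steps the way a Postman environment does:

```javascript
// Shared state between steps, playing the role of a Postman environment.
const env = {};

const steps = [
  function pullApi()        { env.openapi = { info: { title: "demo" }, paths: {} }; },
  function createTable()    { env.tableName = env.openapi.info.title + "-table"; },
  function prepareOpenApi() { env.openapiPrepared = true; },  // would add gateway extensions
  function publishOpenApi() { env.restApiId = "abc123"; },    // placeholder gateway id
  function deployApi()      {
    env.invokeUrl = "https://" + env.restApiId +
      ".execute-api.us-east-1.amazonaws.com/prod";            // placeholder region/stage
  }
];

// A collection runner executes requests in order much like this loop.
for (const step of steps) step();

console.log(env.invokeUrl);
```

Each real step only needs what the previous steps left in the environment, which is what lets the whole deploy run unattended from a single collection.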


API Evangelist

A Postman Collection for Managing the Life Cycles Of My APIs

02 Jan 2020

I had grown weary of just researching, talking, and teaching about the API life cycle over the last ten years as the API Evangelist. This was one of the major motivators for me to join the Postman team. I want to take my knowledge of the API life cycle and work to make sure the rubber meets the road a little more when it comes to actually realizing much of what I talk about. I began investing in this vision over the holidays by crafting a Postman collection that isn't for defining a single API, it is meant to define the life cycle of a single API. I can manage multiple stops along the API life cycle already with Postman--I just wanted to bring it all together into a single machine readable collection that uses the Postman API, but also other APIs I use to orchestrate my world each day. My API life cycle collection is still a work in progress, but it is coming together nicely, and is the most tangible format of what has been in my head when I think of Postman as an API delivery platform. This collection centers around managing an OpenAPI truth within Postman, then moving this API definition down the life cycle, and even deploying development or production versions of each API using AWS API Gateway. Of course everything is API-driven, and designed to work across many different APIs to define, deliver, and manage any single API, maintaining a definition of the life cycle within a single Postman environment that can be used to bridge multiple API platforms via a single collection. So far I have over a hundred individual capabilities defined as Postman requests, and organized into folders that are broken down by different stops along the API life cycle. I'm still moving them around and abstracting away the friction, while I work hard to define the most sensible workflows with each of my API life cycle...[Read More]


API Evangelist

Pulling Your Twitter Bookmarks Via The Twitter API

30 Dec 2019

I created two Twitter API capabilities the other day to help someone pull a list of their Twitter favorites using the Twitter API. They said they wanted bookmarks and I assumed they used favorites in the same way I do (as bookmarks), and created one Postman collection for pulling API favorites, and another to parse the URLs present in the body. I use Twitter public likes as a way of bookmarking, then I harvest those via the Twitter API--something I've done for over a decade. I had heard of Twitter bookmarks, and seen them in the desktop and mobile apps, but hadn't really made the shift in my brain, so I assumed they were talking about likes. DOH! Anyways, they tweeted back at me and helped me realize my misconception. Ok, so how do we still get them their bookmarks? After some quick investigation there is no Twitter API for your private bookmarks, making the pulling of your data a little more challenging, but not impossible. This is where I begin helping people not just understand the technology of APIs, but also the politics of API operations. Meaning, Twitter has an API for your bookmarks, they just don't want you to get at it via the public API (I am not sure why). Anyways, in this scenario I can't make a ready-to-go Postman collection for you to use. I am going to have to teach you a little bit more Postman kung fu, and show you how to sniff out the APIs that exist behind everything you do each day. It is still something you can do without programming, and with Postman you can still get at your data in the same way we did for the public Twitter favorites API. You just have to be curious enough to not turn away as I pull back the curtain of the world of APIs a little bit more, with a simple walk through. Something that...[Read More]


API Evangelist

Pulling Links From Those Tweets You Have Favorited

29 Dec 2019

I am busy crafting new API capabilities from my laundry list of requests I have from folks. When I get an email or come across a Tweet with someone asking how they do something on Twitter I will add it to my list, and at some point pull together a simple Postman collection for accomplishing what is being desired. Providing a single Twitter capability that I can add to my list, and anyone (hopefully) can put to use with their own Twitter account and application, within their own local Postman environment. My goal here is to help provide simple API-driven capabilities that anyone can use, while also pushing my skills when it comes to crafting useful Postman collections that aren’t just for developers. Today’s API capability is from Elana Zeide (@elanazeide) who asked on Twitter, “So now I have a lot of twitter bookmarks of amazing things you people have shared ... is there any way to export/download them to another app? (I know you can do it w/ likes) Anyone come up with some clever workaround/automation?”. To possibly help her out I started by creating a single Postman collection that just pulls the favorites for any Twitter user via the Twitter API.

Pull Twitter Favorites Capability - It authenticates with the Twitter API and pulls the likes for any Twitter user using their handle, publishing the list of favorites to the visualizer screen.

This is a perfectly usable API capability all by itself, but once I was done I used it as the base for pulling any URL that is present in each Tweet, making for an entirely separate Twitter API capability that I hope folks will find useful.

Pull Links From Twitter Favorites Capability - It authenticates with the Twitter API and pulls the likes for any Twitter user using their handle, extracts all of the links from those tweets, and publishes the list of links to the visualizer screen.

Both of these...[Read More]
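The link extraction in that second capability can be sketched like this. Twitter's API returns expanded URLs under each tweet's entities.urls, so no regex scraping of the tweet text is needed; the tweets below are stubs I made up in that same shape:

```javascript
// Stub tweets in the shape Twitter's favorites endpoint returns them,
// with expanded URLs under entities.urls.
const favorites = [
  { text: "great read https://t.co/x",
    entities: { urls: [{ expanded_url: "https://example.com/post" }] } },
  { text: "no links here",
    entities: { urls: [] } },
  { text: "two good ones",
    entities: { urls: [{ expanded_url: "https://example.com/a" },
                       { expanded_url: "https://example.com/b" }] } }
];

// Collect every expanded URL across all favorited tweets.
const links = favorites.flatMap(tweet =>
  tweet.entities.urls.map(u => u.expanded_url));

console.log(links); // the expanded URLs from the stub tweets
```

In the collection this logic lives in a test script that hands the list to the visualizer, so a non-developer never has to touch the JSON.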


API Evangelist

How My API Evangelist Research and Writing Works

28 Dec 2019

Many folks don’t quite get my work and writing style. They are confused by the erratic flow of stories being published to API Evangelist, the incomplete nature of some of my research sites, and other annoying factors that don’t quite make sense when you view API Evangelist a particular way. If you think of it as a technology blog like Techcrunch, ReadWrite, The New Stack, or others, you will be passing certain judgement on the content of my work, the tone of what I say, and the chaotic way in which I publish my research and stories across hundreds of separate sub-domains. People expect me to write up their API, review their approach, or know everything about the thousands of APIs that exist across the public landscape. API Evangelist isn’t this type of blog—it is simply my workbench for things that interest me, are relevant to the industry and my career, or are valuable to someone who pays me to generate value in the API universe.

Two Distinct Layers Of Research

There are two main layers to my research, which I use to mine API information and knowledge. These two dimensions feed off of each other, but ultimately drive my research, storytelling, and at times the wider conversation in the API space, helping me organize everything into these two buckets:

Landscape - Reviewing the public and private API offerings across many different business sectors, providing me with a unique view of how API providers are doing what they do.

Life Cycle - Taking what I’ve learned across the landscape and organizing information and knowledge by stops along the API life cycle, for use in my regular work and storytelling.

These two layers are constantly feeding each other. For example, after making a pass through all the payment APIs, updating the landscape for that area, I will add new building blocks I’ve stumbled across to my API life cycle research. Then when I embark on research into the...[Read More]


API Evangelist

Atlassian Provides Run in Postman and OpenAPI by Default for Jira, Confluence, and BitBucket APIs

27 Dec 2019

I was profiling the Atlassian APIs, considering what is possible with JIRA, Confluence, and Bitbucket, three services that are baked into many enterprise organizations I’ve worked with over the years. My intention was to create a Postman collection for JIRA, but once I landed on the home page for the API I noticed they had a button in the top corner for Run in Postman, and a dropdown for getting an OpenAPI 3.0 spec—something that I strongly believe should be the default for all APIs, ensuring there is a prominently placed link to the machine readable truth behind each API. I like seeing Postman as the default executable in the top corner of the documentation for APIs. I also enjoy seeing the orange Run in Postman button across documentation, blog posts, and other resources—helping folks quickly on-board with some API resource or capability. I want to see more of this. I’d like it all to become the default mode of operating for API providers. I want all API providers to manage an OpenAPI truth for their API, while also developing and evolving many different Postman derivatives of that truth. Providing reference collections that describe the full surface area of our APIs, while also making sure there are more on-boarding, workflow, and capability style collections that empower end-users to put APIs to work, distributed across API documentation and the stories we tell about what is possible with our APIs. Interestingly the Postman collection isn’t just a unit of representation for the JIRA, Confluence, and BitBucket APIs. The Postman collection is also a representation of the unit of work that is executed across these platforms. If you have worked in software development across the enterprise you know what I am talking about. Postman is the Swiss Army Knife for how enterprise developers not only develop and deliver their work, which is defined and tracked using JIRA, Confluence, and BitBucket, but Postman collections are also how...[Read More]


API Evangelist

Applying An API-First Approach To Understanding The Pacific Northwest Mushroom Industry

23 Dec 2019

This is an API first project for mapping out the mushroom industry. I have always had a passion for mushrooms, but as I get older I am looking at investing in more side projects that aren’t always 100% about APIs. I wanted to spend some time this holiday season refreshing my memory about what types of mushrooms are available on the market, and what types of products are being made from them. As I do with any data or content driven research I begin by creating an API to store all of the data and content I am gathering, helping me flesh out the dimensions of each business sector I am interested in. As with all of my work I really don’t know where this research is headed—I am just interested in learning about mushrooms. Eventually I’d like to use this data and content in a variety of web and mobile applications, but since I’m just getting started I don’t really understand all of the data I need to gather. A situation that is perfectly suited for beginning as an API first project, helping me not just gather the data I need, but also do it in a way that will help me prepare for the future, while also not investing too much into wiring up a database, coding a web or mobile application, and any other costly infrastructure that may (or may not) be needed down the road. By starting as API first, I am able to flesh out the schema and structure of my APIs, which will drive my research and the resulting applications I will be needing down the road. To get started I spent about 10 minutes thinking about the main areas of resources I will need to track across my work, and created ten separate individual resources.

Mushrooms - A list of the mushrooms, their scientific names, description, and the beginning of what I will need to map...[Read More]
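Starting API first can be as simple as writing the schema down before any database exists. Here is a sketch of what that looks like for the Mushrooms resource; beyond the scientific name and description mentioned above, the field names are my own guesses for illustration:

```javascript
// A plain schema object for one resource, fleshed out before any
// database or application is wired up. Field names beyond
// scientific_name and description are illustrative guesses.
const mushroomSchema = {
  name: "mushrooms",
  fields: [
    { name: "id",              type: "string", required: true },
    { name: "common_name",     type: "string", required: true },
    { name: "scientific_name", type: "string", required: true },
    { name: "description",     type: "string", required: false }
  ]
};

// With the schema as data, records can be validated long before any
// costly infrastructure decisions get made.
function validate(record, schema) {
  return schema.fields
    .filter(f => f.required)
    .every(f => record[f.name] !== undefined);
}

const morel = { id: "1", common_name: "Morel", scientific_name: "Morchella esculenta" };
console.log(validate(morel, mushroomSchema));
```

Iterating on a schema like this costs minutes, which is the whole appeal of API first for research that does not yet know where it is headed.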


API Evangelist

API Providers Should Maintain Their Own API Definitions

23 Dec 2019

I am working my way through 2K+ API providers, refreshing my view of the API landscape and the data I use to tune into the API economy. As I refresh the profile of each API provider, one of the main things I'm doing is looking for an OpenAPI or Postman collection. While profiling their API operations using APIs.json is critical, having a machine readable definition for each API is arguably the most important part of this process. Having an OpenAPI or Postman collection gives me a machine readable list of the value that each API delivers, and allows me (and others) to more easily integrate an API into other applications and systems. Sadly, not every API provider understands the need, or is able to invest the resources to produce an API definition. While profiling an API provider, the most ideal situation I can come across is when an OpenAPI already exists in service of API documentation, or the API provider just gets the struggle of their API consumers and has a Postman collection already published. Ideally, the OpenAPI is publicly available and I don't have to dig it out from behind the documentation, or they have the Run in Postman button clearly published on their website. In the best situations, API providers have their OpenAPI and / or Postman collections published to GitHub, and are actively maintaining their API definitions using Git, which allows API consumers and API service providers to depend on an authoritative source of truth for each API they use. I wish every API provider would maintain their own API definitions in this way, but sadly very few do. The majority of APIs I come across do not have documentation driven by OpenAPI and do not have Postman collections. When I encounter one of these API providers I usually spend about 60 seconds googling for Swagger, OpenAPI, and Postman + their name in...[Read More]


API Evangelist

Where Does The Exhaust For Your API Operations End Up Being Stored?

20 Dec 2019

As part of my ongoing API discovery and observability research, I am interested in better defining the common places within the enterprise where we find API signals: the log files and other exhaust by-products of API operations that contain the hosts, paths, parameters, and other parts and pieces of the APIs already in operation. API discovery is complex, and it isn't something I think we are going to solve by mandating that teams make their APIs more discoverable. I think it is something we are going to have to do for them, augmenting their existing work with services and tooling that define what APIs they are producing and consuming as part of their existing tools, applications, and systems, further expanding the definition of API observability by tapping the exhaust from existing infrastructure to help us map out the API landscape that exists within the enterprise. I am currently helping the Optic folks think beyond the personal value their proxy delivers for individual developers, proxying desktop, web, mobile, and Postman traffic and automatically generating OpenAPI definitions, and consider what the more industrial grade use cases will be. As part of these conversations I am thinking more deeply about how APIs are operated within the enterprise, and being more formal in how I discuss where you can tap into the existing exhaust captured around API operations, building on the following list I already have.

Apache Log File - The most ubiquitous open source web server out there is the default for many API providers.
NGINX Log File - The next most ubiquitous open source web server is definitely something I should be looking for.
IIS Log File - Then of course, many Microsoft web server folks are still using IIS to serve up their API infrastructure.
Amazon CloudWatch - Looking at how the enterprise is centralizing their logs with CloudWatch...[Read More]
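The idea of mining API signals from web server exhaust can be sketched in a few lines. The log lines below are invented, but they follow the common Apache/NGINX access log shape, and the point is extracting the hosts, paths, and methods that are already sitting in those files:

```python
import re

# Sketch: pulling API signals out of Apache-style access log lines. The log
# lines are made up; the point is extracting methods and paths from the
# exhaust that API operations already produce.
LOG_PATTERN = re.compile(r'"(?P<method>[A-Z]+) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

sample_logs = [
    '10.0.0.1 - - [20/Dec/2019:10:00:00] "GET /api/v1/users?page=2 HTTP/1.1" 200 512',
    '10.0.0.2 - - [20/Dec/2019:10:00:01] "POST /api/v1/orders HTTP/1.1" 201 128',
]

def extract_api_paths(lines):
    """Return the unique (method, path-without-query) pairs seen in the logs."""
    seen = set()
    for line in lines:
        match = LOG_PATTERN.search(line)
        if match:
            path = match.group("path").split("?")[0]
            seen.add((match.group("method"), path))
    return sorted(seen)

print(extract_api_paths(sample_logs))
```

Run against real log files at scale, a crawl like this is the raw material for the enterprise API map described above, without asking any team to change how they work.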


API Evangelist

OpenAPI is the Static Truth and Postman Collections are Real World Derivatives of that Truth

20 Dec 2019

I was talking with the Optic folks this morning about API definitions when they asked me for my opinion on the difference between OpenAPI and Postman. It is a question that isn't easy to answer, and will produce many different answers depending on who you are talking with. It is a question I've been struggling with since before I started at Postman, and will continue to struggle with over the coming years as their Chief Evangelist. The best I can do right now is keep writing about it, continue talking with smart people like Optic, and iterate upon the answer until I can better see what is happening. Here is how I see things currently: OpenAPI is the static truth, and Postman collections are the real world, real time derivatives of this truth. Each individual Postman collection reflects the derived value of an API, representing how a developer, application, or system integration is applying this value in the real world. If you squint your eyes, all of those Postman collection derivatives roll up into a single OpenAPI truth. OpenAPI is essential for nailing down the overarching truth of what an API contract delivers, while Postman is essential for quantifying, realizing, and executing this truth on the ground for a specific business use case. There are definitely ways in which OpenAPI and Postman collections overlap, but there are also ways in which they bring different value to the table. When it comes to capital G Governance, OpenAPI is more meaningful to business leadership. It represents a more constant truth that can be translated into services, tooling, and policy at the macro level. When it comes to lowercase g governance, Postman collections are more meaningful to developers, because they represent the transactions they need to accomplish each day, which are derived from the greater truth, but have more context regarding each specific business transaction a developer is expected to deliver. This...[Read More]
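The truth-and-derivative relationship can be made concrete. A minimal sketch, using simplified stand-ins for both formats (the spec fragment and the collection item shape are illustrative, not the full OpenAPI or Postman schemas), showing how one task-focused request is derived from the spec:

```python
# Sketch: deriving a single task-focused Postman-style request from an
# OpenAPI "truth". The spec fragment and collection item format are
# simplified stand-ins for the real formats.
openapi_fragment = {
    "servers": [{"url": "https://api.example.com"}],
    "paths": {
        "/invoices": {
            "get": {"summary": "List invoices"},
        }
    },
}

def derive_request(spec, path, method):
    """Build a minimal Postman-style request item from one OpenAPI operation."""
    base = spec["servers"][0]["url"]
    operation = spec["paths"][path][method]
    return {
        "name": operation["summary"],
        "request": {"method": method.upper(), "url": base + path},
    }

item = derive_request(openapi_fragment, "/invoices", "get")
print(item)
```

Many such derivations, each scoped to one business task, can all be traced back to the single static contract, which is the relationship the post describes.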


API Evangelist

How I Profile The Typeform API

20 Dec 2019

I was asked for more information about how I profile APIs, and how I deal with the many differences I come across. It isn't easy navigating the differences between APIs and coming out with a standard API definition (OpenAPI or Postman collection) that you can use across different stops along the API life cycle. I'm pretty agile and flexible in how I approach profiling different APIs, with a variety of tools and tricks I use to vacuum up as many details as I possibly can with as little manual labor as possible. The example for profiling that was thrown at me was the Typeform API, which is a pretty sophisticated API, but will still need some massaging to create an acceptable set of API definitions. The first thing I do is search for an OpenAPI definition, hopefully published to GitHub or prominently linked off their documentation, but I will settle for having to sniff it out from behind an API's documentation. Typeform doesn't have an OpenAPI or Swagger available (from what I can tell). Next, I go looking for a Postman collection. Boom!! Typeform has a Postman collection. The question now is why hasn't Typeform published their Postman collection to the Postman Network? I will Tweet at them. Ok, now I have a machine readable definition for the Typeform API that I can import into my API monitoring system, which is just an API that I use in Postman to import a Postman collection (head explodes). My Postman collection import API grabs as many elements from the Postman collection definition as it can, normalizing the paths, parameters, and other details for an API. I am always adding to what my API is capable of, but it does a pretty good job of giving me what I need to begin to profile the surface area of an API. Now I have all of the paths imported into my monitoring system. However, I am still at the mercy of how...[Read More]
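The normalization step can be sketched quickly. The collection snippet below is invented (real Typeform collections are much larger), but it follows the Postman v2.1 habit of nesting requests inside "item" arrays, which is what an import process has to walk:

```python
# Sketch: normalizing the request paths out of a Postman collection export.
# The collection snippet is made up; the v2.1 format nests requests in
# "item" arrays like this, including folders that contain more items.
collection = {
    "info": {"name": "Example Forms API"},
    "item": [
        {"name": "List forms",
         "request": {"method": "GET", "url": {"raw": "https://api.example.com/forms?page=1"}}},
        {"name": "Responses",
         "item": [
             {"name": "Get responses",
              "request": {"method": "GET", "url": {"raw": "https://api.example.com/forms/{{form_id}}/responses"}}},
         ]},
    ],
}

def walk_requests(items):
    """Recursively yield (method, normalized path) from nested collection folders."""
    for item in items:
        if "item" in item:               # a folder, recurse into it
            yield from walk_requests(item["item"])
        elif "request" in item:
            raw = item["request"]["url"]["raw"]
            path = "/" + raw.split("://", 1)[1].split("/", 1)[1].split("?")[0]
            yield (item["request"]["method"], path)

print(list(walk_requests(collection["item"])))
```

Walking the nested structure and stripping query strings is most of what the "normalizing the paths" step in the post amounts to.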


API Evangelist

What Else Has Influenced APIs Over the Last 50+ Years?

19 Dec 2019

Because I have so many smart folks in my Twitter timeline, I want to put out some of the seeds for stories I am working on for 2020. I want your help determining what has set the stage for the world of APIs we all believe in so much. Here are just a handful of the nuggets I have pulled out of my research and reading.

Early On
1933 - Telex Messaging
1949 - Memex (Linked Documents)
1949 - Computer Talk Over Phone
1958 - Digital Phone Lines
1959 - Semi-Automatic Ground Environment (SAGE) Wide Area Network
1961 - Computer Time Sharing
1963 - Hypertext
1963 - Hypermedia
1964 - Sync Satellite Television Network
1964 - IBM Sabre Reservation System
1966 - Michigan Educational Research Information Triad (MERIT)
1968 - Multiplexing
1969 - Mass Produced Software Components by Malcolm Douglas McIlroy
1969 - Host Software (The First RFC)
1969 - ARPANET Four Initial Nodes Established
1969 - CompuServe

1970s
1970 - ARPANET Reaches East Coast (MIT)
1971 - Email
1971 - File Transfer Protocol (FTP)
1971 - TELNET
1971 - ARPANET Has 23 Nodes
1972 - ARPANET Has 29 Nodes
1973 - ARPANET Has 40 Nodes
1974 - ARPANET Has 46 Nodes
1974 - Transmission Control Program (TCP)
1974 - Systems Network Architecture (SNA)
1975 - ARPANET Has 57 Nodes
1976 - CYCLADES Computer Network
1976 - X.25 Packet Switching Protocol
1979 - First Commercial Cellular Network

1980s
1980 - USENET
1981 - ARPANET Has 213 Nodes
1981 - TCP/IP
1982 - Simple Mail Transfer Protocol (SMTP)
1983 - ARPANET Switches to TCP/IP
1983 - IPv4
1983 - Berkeley Sockets
1984 - CD-ROM
1984 - Domain Name System (DNS)
1984 - Dynamic Host Configuration Protocol (DHCP)
1984 - Open Systems Interconnect (OSI)
1985 - Whole Earth 'Lectronic Link (WELL)
1985 - National Science Foundation Network (NSFNET)
1987 - Transport Layer Interface (TLI)

1990s
1991 - Gopher
1991 - Windows Sockets API
1991 - Common Object...[Read More]


API Evangelist

The 3dcart Developer Home Page Is Nice and Clean

19 Dec 2019

I look through a lot of API developer portals, and when I come across interesting layouts I like to pause and highlight them, showing other API providers what is possible, while turning API Evangelist into a sort of style guide for crafting your API operations. I was asking the folks over at 3dcart if they have an OpenAPI or Postman collection for their API, to help me round out my machine readable index of the commerce API provider, and after I stumbled across their developer portal I thought I'd share it here. I like it because, in addition to the global navigation for their portal, it really gets at the primary next steps anyone will be taking off the landing page of their developer portal. You can tell it forced them to pause and think about the narrative around what people will be looking for, helping people understand what is possible, while also routing them down the most common paths taken when it comes to building an application on 3dcart.[Read More]


API Evangelist

Taming The Salesforce API Scope

18 Dec 2019

I was recently looking to build a prototype integration between Salesforce and Workday, where I found myself needing to on-board with the Salesforce REST API for probably the 50th time in my career. I am always looking for projects that use the API so that I can keep my skills sharp with one of the leading API platforms out there. Even with this experience, each time I on-board with the API I find myself having to work pretty hard to make sense of the Salesforce REST API: first wading through a sea of information to find the API reference documentation, then setting up an OAuth application, and finally getting to where I am actually making my first API call. Once I am successfully making calls to the Salesforce API, I then have to further explore the surface area of the Salesforce REST API before I can fully understand all the resources that are available, and what is truly possible with my integration. After spending about an hour in the Salesforce documentation it all came back to me. I remembered how powerful and versatile the API is, but my moment of déjà vu left me feeling like it would be pretty easy to reduce the time needed to go from landing on the home page of developer.salesforce.com to making your first API call. The challenge with the Salesforce API is that it is extremely large, and possesses a number of resources that will vary based upon two important dimensions: version and your user account. The API you see with a base developer account isn't the same one you'll see with a well established corporate Salesforce implementation. Each individual Salesforce customer might be using a specific version, and have a completely different set of resources available to them, making it pretty challenging to properly document the API. Even with these challenges I think there are a number of ways in which the Salesforce API could be made...[Read More]
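The "first API call" after OAuth is typically listing the available API versions via the versions resource. A minimal sketch of composing that request; the instance URL and token below are placeholders, and no request is actually sent here:

```python
# Sketch: composing the first Salesforce REST API call you typically make
# after OAuth, listing available API versions under /services/data/.
# The instance URL and token are placeholder values; nothing is sent.
def first_call(instance_url, access_token, version=None):
    """Build the URL and headers for a Salesforce REST API request."""
    path = "/services/data/" if version is None else f"/services/data/v{version}/"
    return {
        "url": instance_url.rstrip("/") + path,
        "headers": {"Authorization": f"Bearer {access_token}"},
    }

request = first_call("https://example.my.salesforce.com", "placeholder-token")
print(request["url"])
```

The same helper with a version argument points at the versioned resource root, which is where the account- and version-specific surface area described above starts to reveal itself.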


API Evangelist

APIs For Victoria Australia

17 Dec 2019

I was helping out someone trying to download air quality data in Australia today, and while I was playing around with the Victoria, Australia government AirWatch data API I thought I'd go ahead and add them to my API Evangelist network, importing their Swagger 2.0 files and converting them to OpenAPI 3.0, while also publishing Postman collections for each of their APIs. This expands the APIs I have in my directory, while also encouraging the state to publish the Postman collections I've created to the Postman API Network. The State of Victoria has some pretty interesting APIs that they have made available using Axway. I have published an APIs.json index for the state's developer portal, providing an index of their API operations, as well as the individual APIs. You can get at the Postman collections I've generated using these links:

ABS Labour Force API Postman Collection
Agriculture Victoria Soils API Postman Collection
DataVic CKAN API Postman Collection
DataVic Open Data API Postman Collection
EPA AirWatch API Postman Collection
Important Government Dates API Postman Collection
Museums Victoria Collections API Postman Collection
Popular Baby Names Victoria API Postman Collection
Victorian Heritage API Postman Collection

I would go ahead and publish the Postman collections to the Postman Network, but I have asked them to publish them instead. I would rather the listings be more authoritative, something owned by the API operators. I'm just looking to maintain a GitHub repository with fresh copies of their OpenAPI, Postman collections, and APIs.json that I can use as the source of truth for these APIs across API Evangelist, APIs.io, and other iPaaS and integration providers. I am working through several different business sectors and government APIs, updating my directory of APIs, while also sharing with some other API service providers I have been talking to. If there is a particular API provider you'd like to see added to my list, go ahead and submit a pull request...[Read More]
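The Swagger 2.0 to OpenAPI 3.0 conversion mentioned above is mostly mechanical. A minimal sketch of one piece of it, mapping host, basePath, and schemes to the OpenAPI 3.0 servers array; the spec content is an invented example, not an actual Victorian government definition, and real converters handle parameters, bodies, and responses too:

```python
# Sketch: the server-information piece of a Swagger 2.0 to OpenAPI 3.0
# conversion. The document below is an invented example; real converters
# also translate parameters, request bodies, and responses.
swagger2 = {
    "swagger": "2.0",
    "info": {"title": "Example Air Quality API", "version": "1.0"},
    "host": "api.example.vic.gov.au",
    "basePath": "/airwatch",
    "schemes": ["https"],
    "paths": {"/measurements": {"get": {"summary": "List measurements"}}},
}

def convert_servers(doc):
    """Map host/basePath/schemes to the OpenAPI 3.0 servers array."""
    servers = [{"url": f"{scheme}://{doc['host']}{doc.get('basePath', '')}"}
               for scheme in doc.get("schemes", ["https"])]
    return {
        "openapi": "3.0.0",
        "info": doc["info"],
        "servers": servers,
        "paths": doc["paths"],
    }

print(convert_servers(swagger2)["servers"])
```

For real conversions there are mature open source converters, but seeing the mapping spelled out helps explain why the upgrade is usually low-risk.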


API Evangelist

A Portable 23andMe API Sandbox

17 Dec 2019

I was creating a Postman collection for the 23andMe API. The 23andMe API is still available, despite the company pulling back somewhat when it comes to access to their DNA and genetics data. You can still get access to the API for research purposes, but you have to email their business development group and convince them of the merits of your research before you'll get access to the data. It is pretty common for companies with valuable data like 23andMe to have significant concerns about who has access to it. This is why API management exists as a fundamental building block of API operations: so you can have total control over who has access to your data, and possess detailed logs regarding what has been accessed by consumers. Requiring approval of a developer account before you get your API keys is common, pushing API developers to justify their access and establish a trusted relationship with API providers. This is something you can set up with your API management tooling or services, providing a public sign-up form, yet making each new API consumer wait to be approved before they get their API keys. Even with this security layer in place, you may still want to allow API consumers to kick the tires and see what is possible while awaiting approval for API access. One way you can accomplish this is by creating Postman collections for all the API endpoints, making sure there are one or more examples for each individual API path so that they can be mocked by any consumer using Postman. I went ahead and did this for the 23andMe API. Their documentation is still available, and there are examples for each individual path. I wanted to create a Postman collection for the API to round out my collection of API definitions, and since their documentation had examples, I thought I'd demonstrate how to create portable API sandboxes using...[Read More]
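The ingredient that makes a collection mockable is the saved example response attached to each request. A minimal sketch of the idea; the endpoint and payload below are invented, not actual 23andMe paths, and the structures are simplified stand-ins for the Postman format:

```python
# Sketch: a collection item with a saved example response, the ingredient
# that lets a mock server answer without hitting the real API. The endpoint
# and payload are invented, not actual 23andMe paths.
item = {
    "name": "Get profile",
    "request": {"method": "GET", "url": {"raw": "https://api.example.com/profile"}},
    "response": [
        {
            "name": "Example profile",
            "code": 200,
            "body": '{"id": "demo", "genotyped": true}',
        }
    ],
}

def mock_response(item):
    """Return the first saved example, the way a mock server would answer."""
    example = item["response"][0]
    return example["code"], example["body"]

print(mock_response(item))
```

Once every path in a collection carries at least one example like this, any consumer can spin up a mock and explore the full surface area while waiting for real access to be approved.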


API Evangelist

Being Flexible With Authorization When It Comes To Multiple APIs Within A Single API Collection

16 Dec 2019

I am working on a Postman collection that deploys an API to AWS. I'm pulling the OpenAPI from Postman using the Postman API API (mind blown), and then publishing the API to AWS using the AWS API Gateway API (mind blown again). As part of this process I also need a DynamoDB instance to use as a persistent data store behind the API, which I will create using the DynamoDB API. I need all of these capabilities organized within a single Postman collection, but because of the need to authenticate with multiple API services I will be organizing each capability by AWS service, so I can set the authorization for each folder and let each individual API request inherit from the folder. Otherwise I would have to set authorization on each individual request as I work. I abstract away the variables I use across the authorization as part of a Postman environment, but I still want to logically think through how I apply authorization across services. When defining Postman collections you can apply authorization at the collection, folder, or request level. This allows you to be more thoughtful about how you authenticate across multiple APIs within a single Postman collection. This Postman collection is going to end up being what I'd consider a workflow collection, meaning it will walk through each step of deploying an API to AWS using Postman, so eventually it will most likely just be a series of individual API requests which can be run manually by a user, or automated with a Postman runner or monitor. However, as I am architecting my collection I don't want to have to define the authorization for each individual request. I just want them to inherit authorization, so I am going to add a folder for each service. This gives me the ability to set authorization for Postman at the header level for an individual request, which I will move...[Read More]
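The collection, folder, request inheritance described above resolves from the most specific level outward. A minimal sketch of that resolution logic, using simplified structures rather than the real Postman collection schema:

```python
# Sketch: resolving which authorization applies when auth can be set at the
# collection, folder, or request level, with requests inheriting from their
# parent. Simplified structures, not the real Postman collection schema.
collection = {
    "auth": {"type": "apikey", "key": "collection-key"},
    "folders": [
        {
            "name": "DynamoDB",
            "auth": {"type": "awsv4", "key": "dynamo-creds"},
            "requests": [{"name": "CreateTable"}],
        },
        {
            "name": "Misc",
            "requests": [{"name": "Ping"}],
        },
    ],
}

def effective_auth(collection, folder_name, request_name):
    """Walk request -> folder -> collection and return the first auth found."""
    for folder in collection["folders"]:
        if folder["name"] != folder_name:
            continue
        for request in folder["requests"]:
            if request["name"] == request_name:
                return request.get("auth") or folder.get("auth") or collection.get("auth")
    return None

print(effective_auth(collection, "DynamoDB", "CreateTable")["key"])
print(effective_auth(collection, "Misc", "Ping")["key"])
```

Setting auth once per service folder means every request under it picks up the right credentials automatically, which is exactly the workflow the post is organizing around.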


API Evangelist

API Observability Is More Than Just Testing And Monitoring

16 Dec 2019

API observability is something I have written about for a while now, after learning about it from Stripe. It is a phrase that has grown popular in API testing, monitoring, and performance circles. Borrowed from the physical world of control systems, observability is a measure of how well the internal states of a system can be inferred from knowledge of its external outputs. I am all about getting on the API observability train when it comes to monitoring our systems, but if you've read my work you know I am all about expanding the definition to include not just the technical, but also the business and politics of API operations. One of the key aspects of observability is using the outputs or exhaust from the existing services and tooling used to operate your system. To help increase API observability within the enterprise, I am always on the hunt for the existing services and tooling that are in place, so that I can better understand what the existing outputs are. If a service or tool is already in place within the enterprise, and we can tap into its existing outputs, the chances of successfully changing behavior significantly increase. One tool that is ubiquitous across enterprise organizations is Postman, which provides a wealth of opportunity when it comes to tapping into existing outputs to provide more observability into what is going on across the API life cycle. 90% of the Postman usage I come across within the enterprise occurs in isolation: one developer working with internal and external APIs within Postman. This occurs hundreds or thousands of times over within medium and large enterprise organizations. Developers are profiling APIs, building requests, and saving them into collections with very little communication, coordination, and sharing of API awareness across teams. While this represents a pretty concerning trend across enterprise organizations, where leadership has so little visibility into what teams are working on, it also represents a pretty significant opportunity for leadership to take...[Read More]


API Evangelist

Believing The Technology Startup Hype And Refusing To See Anything Else

14 Dec 2019

I've been in numerous discussions with defenders of the Instructure Canvas platform after posting the Instructure LMS data points. These folks are blindly and passionately defending Instructure as a force for good, founded by good people, and taking offense that I would see beyond the static representation of the startup they are demanding everyone see and believe. I find it endlessly fascinating how we as a society continue to believe the storytelling around startups, and receive the marketing they put out as some sort of truth that will play out forever into the future. Even more dangerously, these people don't just believe, they actively police the front line of critics who are shining a light on what is really going on, doing the bidding of not just startups and their investors, but the capitalist machine. First, let me quickly disarm some of the standard tactics folks will use to come back at me on this piece. No, I am not anti-startup. I work for one, and have worked for many others. No, I am not anti-investor. I know lots of investors, advise them regularly, and derive most of my income from venture capital. This does not mean I am always walking the line and curbing my criticism of the bad that is perpetuated by startups and investors across the landscape. People who feel the need to blindly defend technology are one of the reasons why there are so many bad actors on the stage, and why people are able to get away with the shenanigans they are pulling. Critics and whistleblowers are one of the forces that help keep exploitative companies in check, and minimize the damage they cause. I'm not saying all critique is constructive or helpful, but I am saying that if you are actively pushing back on the critics and not listening to what they are saying, you are most likely part of the problem. To recap for those who are just jumping in--Instructure,...[Read More]


API Evangelist

Remember That An Application Is Not Just About Someone Building A Web or Mobile Application With Your API

13 Dec 2019

I regularly encounter waves of API providers who are discouraged with the traffic to their API portal, as well as the number of developers who are actually building something on top of their API. Many suffer from the hangover of "if you build it they will come" syndrome, believing that if they just publish their APIs, developers will show up and build amazing things. While many of us evangelists and advocates have over-promised amazing outcomes when it comes to publishing APIs, many of us in the trenches have long been honest about the hard work it takes to make your APIs something developers will want to use. Just publishing your APIs to a developer portal is not enough. Having a well designed and documented API is not enough. Making enough noise so that people find your API is a full time job, and ideally it is done by a whole team of people. Study how Twilio has done it if you need a working example. You also have to regularly re-evaluate the possibilities when it comes to building or developing "applications". This isn't the API ecosystem of a decade ago, where we focused on building just widgets and mobile applications. There are many more ways in which people can put your APIs to work in 2019, and you should be making time to understand what those possibilities are. The chance that some new developer will randomly find your API and build the next killer mobile application is pretty slim, and the real world implementations are probably going to be much more mundane and granular. The important thing to remember about the word "application" is that it does not necessarily mean a web, mobile, or device application. It can simply mean "applying" your API, which, in a world of integration platform as a service (iPaaS), bots, voice, and other use cases, could mean many different things. Don't expect that all of your API...[Read More]


API Evangelist

The Postman Business Model Is In Alignment With Enterprise Objectives

12 Dec 2019

One of the things that became very clear during my conversations with folks at AWS re:Invent last week is that Postman's revenue model is in alignment with what is needed within the enterprise. To help explain, let's answer the question I got over and over at re:Invent: how does Postman make money? Steady waves of folks would show up at our booth in the re:Invent expo hall, and usually after a couple minutes of talking about their FREE usage of Postman, and how ubiquitous the tool is at their organization, they would inquire about our pro and enterprise offerings, which are all about helping enterprise organizations get more organized when it comes to doing APIs. The Postman Pro and Enterprise offerings are all about scaled usage of the platform, which includes the number of collections, users, teams, and workspaces, and the collaboration, automation, and orchestration across them. All the core Postman features are free, and will remain free. Developers love Postman because of its utility, and we do not intend to mess with that. Paying for Postman means you are getting more organized about how you manage users, collections, environments, teams, and workspaces, as well as using more monitors, runners, mocks, and documentation. More usage doesn't always mean an organization is doing things in a more logical fashion, but Postman enterprise features center around the organized governance of users, teams, workspaces, collections, and environments, steering enterprise customers towards optimizing how things get done. Having observability into all of your teams delivering and consuming APIs is one of the most important things you can invest in as part of your enterprise operations. The Postman enterprise tier is centered around charging for collaboration, automation, and scaling your teams using a tool they are already using, which increases observability across your API operations. This is why I am working for Postman. I am regularly conflicted about how the companies I rely upon tier their pricing and scale the business side of what they...[Read More]


API Evangelist

Focusing On Digital API Capabilities Over Just Doing APIs

12 Dec 2019

As I work on creating more useful Postman collections, I am distilling my API definitions down to the smallest possible unit I can. While I have many robust reference Postman collections and OpenAPIs, I am enjoying creating Postman collections that accomplish just a single thing, representing each digital capability that I have. Currently my digital capabilities are spread across a number of servers, GitHub repositories, and Postman workspaces. If I use one of the APIs in my long list of API providers, it is pretty common that I use less than 5% of the API paths from each individual provider. So, other than for sharing as part of my API Evangelist research, why do I need to wade through entire reference API collections to get at the one or two capabilities I actually need to use? I'm slowly working through the stack of APIs that I use, pulling out the different capabilities I put to work as part of my API Evangelist work, defining them as single Postman collections that I list on my GitHub capabilities page. I have published the two Twitter API capabilities I have defined, which I will be expanding on pretty quickly, helping document all of the essential API calls I make to the Twitter platform. Twitter Tweet Search - A basic search for Tweets on Twitter by query. Twitter Account Status - Doing a lookup on users and returning only the fields that describe their status. I have probably another 50 individual Twitter platform capabilities I am putting to work across my platform. I am going to list them all out here, and then begin documenting how I put each of these capabilities to work. Then I'm going to evaluate whether there is an opportunity for me to scale each capability using Postman monitors, helping me automate the orchestration of Twitter data across my operations. Next, I got to work defining a handful of GitHub capabilities I put to use on a regular...[Read More]


API Evangelist

Automatically Generate OpenAPI For Your APIs Just By Using Them

12 Dec 2019

I am a big fan of tools that augment our normal existence and make our lives easier without requiring much additional work. One of the tools that fits into this category is Optic, an open source tool that will generate OpenAPI definitions from your proxied traffic. A while back I developed my own script for doing this by processing Charles Proxy files synced with Dropbox, but I never evolved it beyond Swagger when OpenAPI 3.0 was released. So I was pleased to talk with the Optic team at API World in San Jose a while back. Like many notes in my notebook, my thoughts on Optic got buried by the constant flow of conversations and ideas coming in on a daily basis, but a Tweet from them the other day reminded me that I wanted to showcase and talk a little more about what they are up to, and why Optic is important to the API sector. Optic will take your browser, CLI, and Postman API traffic and automatically generate an OpenAPI based upon the API calls that are routed through the Optic proxy, helping us automate the generation of machine readable API contracts which can then be used across our API operations. The generation of OpenAPI from the exhaust of our existing work is valuable, but what also grabs my attention is that Optic handles the diffs between each OpenAPI generation, which can be used to help you detect changes in APIs that are already in use. As I said, I have had this capability for a while now, and it is something you can do within Postman, turning on a proxy and generating a Postman collection. But having an open source component that produces OpenAPI contracts as a standalone service is pretty critical for helping us make sense of our API exhaust at scale. Optic's core feature is generating OpenAPIs and managing the diff between each...[Read More]
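The core idea behind proxy-generated OpenAPI can be sketched simply. The traffic sample below is invented, and real tools like Optic infer far more (parameter patterns, schemas, path templating), but collapsing observed calls into a paths object is the heart of it:

```python
# Sketch: collapsing observed (method, path) pairs from proxied traffic
# into an OpenAPI-style paths object. The traffic sample is invented;
# real generators infer much more than this.
observed = [
    ("GET", "/users"),
    ("GET", "/users"),
    ("POST", "/users"),
    ("GET", "/users/42"),
]

def to_openapi_paths(traffic):
    """Collapse observed calls into an OpenAPI-style paths dictionary."""
    paths = {}
    for method, path in traffic:
        # A real generator would also merge /users/42 into /users/{id}.
        paths.setdefault(path, {})[method.lower()] = {"summary": f"{method} {path}"}
    return paths

print(sorted(to_openapi_paths(observed)))
```

Diffing the paths object from one generation run against the next is then what surfaces changes in APIs that are already in use.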


API Evangelist

We Will Not Convince Everyone To Go API Design First

11 Dec 2019

I believe in going API first. I think it provides a positive step forward for development teams, and one that makes sense for most enterprise organizations. But if I have learned anything in the last decade of working with APIs, it is that I rarely get what I want, and people don't always do what is right or what makes sense. I have gotten a lot of pushback from developers, API providers, and service providers regarding the notion that code-first API delivery is bad, as well as the idea that API design first is good. For me, the real challenge here isn't about selling folks on one approach or the other. It is about injecting more stakeholders and communication into the process earlier in the evolution of APIs, rather than waiting until they are baked into production before iterating upon the design of the interface. There are a lot of folks who are trained to deliver code across the enterprise landscape. I've heard repeatedly from folks that they are a programmer, not a templater, artifactor, or API designer. I get it. We have a lot of momentum based upon the way things have historically been done, and I don't doubt that it will be difficult to change our behavior. The challenge lies in understanding how much of the pushback on API design first is purely about resistance to change, versus there being multiple valid ways to tackle the delivery of an API. I feel pretty confident that there are multiple ways to actually deliver an API, and I don't care if you are mocking it, delivering via a gateway, with a framework, or hand pounding your artisan APIs on the forge in the workshop. I just care that there is as much light on the overall process as possible, as many stakeholders as possible involved, and a feedback loop around what the design of the APIs should...[Read More]


API Evangelist

My Thoughts On Amazon EventBridge Schema Registry And Discovery

10 Dec 2019

My friend Fran Méndez (@fmvilas) over at AsyncAPI asked me to share my opinions on Amazon's EventBridge schema registry and discovery, which is in preview. It is looking to be a pretty critical add-on to Amazon EventBridge, which provides a serverless event bus that connects application data from your own apps, SaaS, and AWS services. Event-driven approaches to APIs are growing in popularity for a number of reasons, which is only increasing the need for us to get our schema house in order, resulting in solutions like the schema registry for EventBridge being pretty valuable to general API operations. I haven't taken EventBridge for a test drive, so all of my thoughts are purely superficial, but at first glance it looks like something that can have a meaningful impact on how people are making sense of the schema we have floating around. I think there will be some key elements that will make or break a solution like the schema registry, something Amazon is already thinking about with their code generation from the schema registry. Here are some of the initial thoughts I am having as I read through the announcements and documentation around EventBridge and the schema registry. JSON Schema Generation - The auto publishing, diff'ing, versioning, discovery, and evolving of JSON Schema representations for all schema in use will be pretty critical in making the registry tangible. Protocol Buffers - There will need to be easy generation and conversion of Protocol Buffers as part of the process. I don't see that EventBridge supports gRPC or Protocol Buffers, but it was a thought I was having. AsyncAPI Generation - The schema catalog should automatically generate and version AsyncAPI specifications for all the schema, and ultimately the channels and topics being defined as part of EventBridge. 
Tagging - Being able to tag schema, organize them, and discover based upon an evolving taxonomy that helps make sense of the expanding schema landscape will...[Read More]
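The diff'ing piece of that first item is worth making concrete. A minimal sketch of what a schema registry has to do when a new version of a JSON Schema arrives might look like this (hypothetical code, not Amazon's implementation):

```python
def schema_property_diff(old, new):
    # Compare the top-level properties of two JSON Schema documents
    old_props = old.get("properties", {})
    new_props = new.get("properties", {})
    added = sorted(set(new_props) - set(old_props))
    removed = sorted(set(old_props) - set(new_props))
    changed = sorted(k for k in set(old_props) & set(new_props)
                     if old_props[k] != new_props[k])
    # A removed property, or a change to a required one, likely breaks consumers
    breaking = bool(removed) or any(k in new.get("required", [])
                                    for k in changed)
    return {"added": added, "removed": removed,
            "changed": changed, "breaking": breaking}
```

A real registry would recurse into nested objects and compare type constraints more carefully, but versioning and flagging breaking changes is the essence of what makes a registry more than a file bucket.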


API Evangelist

Abstracting Away API Response Complexity With Postman Visualizer

10 Dec 2019

I was creating a Postman collection for validating the status of Twitter users, where I was looking to extract a subset of data from the very verbose Twitter API response for a Tweet Lookup. There is a lot of data contained within a single Tweet JSON response, and all I was looking for was a handful of the fields. I thought this would be a great opportunity to show off the new Postman visualizer, where you can display the API response for each request however you want. To get started I crafted an API request for the Twitter lookup API path, allowing me to pass in up to 100 Twitter user handles and return a JSON response for all the Twitter users whose status I want to check in on—leveraging Postman to authorize and see the details of the API response. This response has the data I need, but reading through the entirety of the JSON response is a lot more than I can ask of many of the people I will be sharing the collection with. I'm going to be sharing it with mostly non-developers, hoping to provide them with a quick way to check the status of various Twitter users, and wading through the JSON is unacceptable, so I used the new Postman visualizer to render an HTML list of only the data I wanted. The Postman visualizer allows me to pull only the fields I need and publish them as HTML to the visualizer tab. Providing a more human readable view of the Twitter Lookup API response, making the Twitter API more accessible to developers and non-developers who are looking for a quick way to validate the status of one or many Twitter users. To make this happen, all I did was add a test script to the API request, adding a little JavaScript that takes the JSON response, loops through each user being returned, and retrieves only the fields...[Read More]
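A sketch of what that test script can look like follows. The field names are illustrative of the Twitter users/lookup response shape, and the extraction logic is pulled out into a plain function so it can be read (and tested) outside of Postman:

```javascript
// Pull just the fields we care about from the verbose user objects
// (field names based on the Twitter users/lookup response shape)
function extractUsers(users) {
  return users.map(function (u) {
    return {
      screen_name: u.screen_name,
      name: u.name,
      followers: u.followers_count
    };
  });
}

// Handlebars template rendered in the Postman visualizer tab
var template = '<ul>{{#each users}}' +
  '<li><strong>{{screen_name}}</strong> ({{name}}) ' +
  '- {{followers}} followers</li>' +
  '{{/each}}</ul>';

// Inside a Postman test script, this single line wires it together:
// pm.visualizer.set(template, { users: extractUsers(pm.response.json()) });
```

The `pm.visualizer.set(template, data)` call is the whole Postman-specific surface area; everything else is plain JavaScript.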


API Evangelist

Validating Twitter Users Using The Twitter API Without Writing Code

09 Dec 2019

I was asked on Twitter if I had any code for pulling the status of Twitter users. Since I am the API Evangelist, and the Chief Evangelist at Postman, I tend to prefer sharing Postman collections rather than actual language-specific code. It is easier for me to craft a single Postman request that accomplishes a specific purpose, then share it as a template in the Postman API network, than it is to write code. Plus, any user can then execute it on their own within their own local Postman client. To satisfy this request, and demonstrate how Postman collections work, I created one for looking up the status of a list of Twitter handles, verifying the state of each individual Twitter user—providing only the basic information you need to validate one or many different Twitter accounts. Before you can use this API collection you will have to download the Postman application, then set up your own Twitter application so that you can make calls to the Twitter API--do not worry, it is painless, and something even a non-developer can do. When filling out the form, all you need to do is give your app a name, description, and website URL, and tell them how it will be used. You can ignore the rest of the fields. Once the application is added you can obtain your access tokens by clicking on the keys and tokens tab. The first time you create your application you will have to regenerate your access token and access token secret, and then you will need all four tokens to authenticate (don't worry, those aren't my real tokens). Once you have your own tokens, go back to your Postman application and click on the Twitter Lookup Postman collection, and edit its details. Once the edit window pops up select the Authorization tab, then select to use OAuth 1.0 and add your auth data to request headers from the dropdown boxes, then add your own...[Read More]


API Evangelist

Twitter API Authorization Using Postman

09 Dec 2019

I just created a new Postman collection for validating Twitter users via the Twitter API. As part of the Postman collection documentation and tutorial I included the steps for authorizing with the Twitter API. This is something that can easily be a hurdle for developers, and will definitely run most non-developers off. In reality, setting up your own Twitter API application, then copying your API tokens and using them in a Postman collection is something anyone can accomplish. This is an authorization workflow that I will be referencing in many different Twitter API tutorials and stories on the blog, so I wanted to have it as a standalone URL that I could easily share with anyone. Before you can make any call to the Twitter API you will need to have four application tokens, which you can only obtain via your own Twitter developer account. The first step of this process is to set up a Twitter developer account, which is different from your regular account, and can be done via the Twitter developer portal. Once you have a Twitter developer account you can visit the application listing page, choose to create a new application in the top corner, and manage any existing applications you have already added in the past. Allowing you to manage the details of your access to the Twitter API. While creating an application there are a number of details you will need to consider, but to jumpstart your API integration all you will need is the name, description, and website URL, and to tell Twitter how this app will be used. You can always edit these settings at any point in the future, so do not worry too much about them when getting started. Once you have created your Twitter application you can visit the keys and tokens tab to obtain your consumer API keys as well as the access token and access token secret. Providing the four tokens you will need to actually authorize...[Read More]
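Under the hood, what Postman's OAuth 1.0 helper does with those four tokens is sign each request. Here is a stdlib-only sketch of the HMAC-SHA1 signing step, per RFC 5849; the parameter values below are placeholders, not real credentials:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote


def pct(value):
    # RFC 3986 percent-encoding, as required by OAuth 1.0
    return quote(str(value), safe="-._~")


def oauth1_signature(method, url, params, consumer_secret, token_secret):
    # Sort and encode all parameters (query params plus the oauth_* ones)
    pairs = sorted((pct(k), pct(v)) for k, v in params.items())
    param_str = "&".join(f"{k}={v}" for k, v in pairs)
    base_string = "&".join([method.upper(), pct(url), pct(param_str)])
    # The signing key combines the consumer secret and token secret
    key = f"{pct(consumer_secret)}&{pct(token_secret)}"
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()


# Placeholder values standing in for your four Twitter tokens
params = {
    "screen_name": "kinlane",
    "oauth_consumer_key": "YOUR_CONSUMER_KEY",
    "oauth_nonce": "abc123",
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_timestamp": "1575000000",
    "oauth_token": "YOUR_ACCESS_TOKEN",
    "oauth_version": "1.0",
}
signature = oauth1_signature(
    "GET", "https://api.twitter.com/1.1/users/lookup.json",
    params, "YOUR_CONSUMER_SECRET", "YOUR_TOKEN_SECRET")
```

The resulting value goes into the `oauth_signature` parameter of the `Authorization` header; Postman handles all of this for you once the four tokens are plugged in, which is exactly why the workflow is approachable for non-developers.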


API Evangelist

A Postman Meetup This Tuesday In Seattle

09 Dec 2019

I am all recovered from being at AWS re:Invent all week in Las Vegas, and gearing up for a Postman meetup in Seattle this Tuesday. I am stoked to be doing an event on my home turf, but I am also pretty happy with the lineup. I am going to be kicking things off with a quick look at Postman collections, then I have Tableau and NGINX giving some talks, followed by a quick closing with a look at the Postman visualizer--here is the lineup for Tuesday night's goings on. Postman API Collections Kin Lane (@kinlane), Chief Evangelist @ Postman You save your Postman requests as collections each day, but have you learned about all the ways in which collections can be applied? Let's move beyond just reference collections for every endpoint in your API and make collections reflect the real world business use cases your APIs solve. Pushing your Postman collections to be more intuitive and useful for your developers, helping on-board them with the possibilities while also documenting what your APIs can do, providing portable, shareable, machine readable, and executable representations of what your APIs deliver. How Tableau uses Postman to enable developers Geraldine Zanolli a.k.a Gigi (@illonage), Developer Advocate @ Tableau Tableau, well-known in the Business Intelligence industry for building great data visualizations, also offers a set of APIs and Developer Tools that allow developers to integrate, customize, automate, and extend Tableau to fit the specific needs of their organization. Learn how Tableau uses Postman to give developers an interface for making their first API request. The NGINX API Gateway Liam Crilly (@liamcrilly), Director of Product Management @ NGINX APIs are changing the way we build applications and changing the way we expose data, both inside and outside our organizations. But what is the most efficient and effective way to deliver these APIs? That's the job of the API gateway. In this session, we will look at different deployment patterns for...[Read More]


API Evangelist

We Will Not Discuss APIs Without A Postman Collection

02 Dec 2019

I heard a story this morning while having breakfast with someone at the Venetian before I made my way to the re:Invent registration counter, which reminded me of the now infamous myth about the secret to Amazon's API success. I can't mention the company involved because they are pretty confident they'd never get approval to tell this story publicly, but as I so often do, I still feel it is worth telling even in an anonymous way. Internal groups at this company were having such a problem coherently discussing the details of APIs between teams that they made a rule that they will not talk with other teams about any API without a Postman collection present (ha, Postman mediator) to facilitate the conversation. There have been several stories on this blog about the problems with emailing API responses between teams, and sending Microsoft Word documents with XML or JSON responses embedded in them. If you work within the enterprise you know that this is a common way to share API responses, get guidance, ask questions, and generally discuss the details of each API being put to use. Imagine if all of this was banned, and if you had a question about the details of making an API request or parsing the response, it was mandatory to provide a Postman collection of each API request and response in question. Ensuring that ALL the details of the request, with a real-life example of the response, were present before any discussion would commence. Talk about a game changer when it comes to making sure people are on the same page when discussing some very abstract concepts. Ensuring team members are all on the same page when it comes to what an API is, let alone the endless number of details regarding query parameters, headers, authentication and other details, takes a lot of work. Even if all stakeholders in a conversation are...[Read More]


API Evangelist

Mock AWS Services Using Postman Collections With Examples

02 Dec 2019

As I create each of the 50+ Postman collections for AWS services I am always striving to establish as complete a collection as I possibly can—this includes having examples for each API request being defined. This is easier said than done, as there are many different ways in which you can set up your AWS infrastructure and work with it using AWS APIs, but nonetheless, I still strive to ensure there is an example saved as part of each Postman collection. While this helps me better define each request, there are numerous benefits to having API examples, and one of the most beneficial is being able to generate mock APIs from the AWS Postman collections I'm publishing. Taking a look at the Postman collection I have published for Amazon DynamoDB, I have managed to save examples for each of the API requests documented as part of the AWS reference Postman collection. This makes it so anyone can run the Postman collection within their own Postman platform account, and then generate a mock server for the Amazon DynamoDB API. Allowing developers to develop against the API without actually having to use a live AWS account or have the proper infrastructure and permissions set up, making it quicker and easier to jumpstart the development of desktop, web, and mobile applications. Allowing developers to publish their own mock servers for AWS services, and save time and money when it comes to their AWS budget. I can envision developing AWS Postman collections that are complete with examples derived from specific AWS infrastructure deployments. Tailoring a specific setup and configuration, then making API requests to the AWS APIs needed for orchestrating against these existing infrastructure configurations, and saving the examples returned from each API response. 
Essentially taking a snapshot of an existing AWS setup across multiple services, then making that snapshot available as a mocked set of AWS APIs that return the responses developers are needing...[Read More]
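To make the mechanics concrete, here is an illustrative fragment of what a collection item with a saved example looks like in the Postman collection v2.1 format. The request and response bodies are simplified placeholders; it is the `response` array that a Postman mock server replays:

```json
{
  "info": {
    "name": "Amazon DynamoDB (illustrative fragment)",
    "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
  },
  "item": [
    {
      "name": "ListTables",
      "request": {
        "method": "POST",
        "url": "https://dynamodb.{{region}}.amazonaws.com/",
        "header": [
          { "key": "X-Amz-Target", "value": "DynamoDB_20120810.ListTables" }
        ]
      },
      "response": [
        {
          "name": "Example: two tables",
          "code": 200,
          "status": "OK",
          "body": "{\"TableNames\": [\"Music\", \"Orders\"]}"
        }
      ]
    }
  ]
}
```

With an example like that saved for every request, spinning up a mock server is a couple of clicks, and no AWS credentials are involved at all.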


API Evangelist

gRPC's Potentially Fatal Weakness

02 Dec 2019

I was reading an article on Microsoft's DevBlog about gRPC vs HTTP APIs. It makes the usual arguments about how gRPC compares with HTTP APIs. While the arguments for gRPC are definitely compelling, I find the weaknesses of gRPC at this moment in time even more interesting, for two reasons: 1) they are something we can overcome with the right tooling and services, and 2) they reflect the challenge between the human and machine readability of all of this, which many of us technologists really suck at, leaving me concerned about whether or not we will be able to get this right—as I think we underestimated this characteristic of HTTP APIs, and have missed the full potential of this opportunity even as we are faced with this next step. Here is what was said in the blog post, highlighting two distinct weaknesses of gRPC, which I view as more about systemic illnesses in the wider view of the API landscape, and our inability to understand the important role that humans play in all of this: Limited browser support gRPC has excellent cross-platform support! gRPC implementations are available for every programming language in common usage today. However, one place you can't call a gRPC service from is a browser. gRPC heavily uses HTTP/2 features and no browser provides the level of control required over web requests to support a gRPC client. For example, browsers do not allow a caller to require that HTTP/2 be used, or provide access to underlying HTTP/2 frames. gRPC-Web is an additional technology from the gRPC team that provides limited gRPC support in the browser. gRPC-Web consists of two parts: a JavaScript client that supports all modern browsers, and a gRPC-Web proxy on the server. The gRPC-Web client calls the proxy and the proxy will forward on the gRPC requests to the gRPC server. Not all of gRPC's features are supported by gRPC-Web. Client and bidirectional streaming isn't supported, and there is limited support...[Read More]


API Evangelist

I Am Heading To Vegas For AWS re:Invent

01 Dec 2019

I'm sitting in the Seattle airport waiting for my flight to Las Vegas. I'm heading to AWS re:Invent, spending the entire week talking everything APIs with the masses at the flagship conference. It is my 3rd time in Vegas for re:Invent, but my first time exhibiting with such a strong brand—Postman. Like years before, I will be there to talk to as many people as I can about how they are delivering APIs, and to learn about the challenges they face when consuming APIs. However, this year I won't just be there as a representative for API Evangelist—this year I am also there to talk about the role Postman plays in the delivery and consumption of APIs. To get fired up for the conference I've spent the last couple of weeks developing Postman collections for as many AWS APIs as I could—I had set 25 services as my target, and managed to get a little more than 50 separate services defined as reference Postman collections. I learned a lot throughout the process, loading up a whole lot of details about common AWS API infrastructure into my brain, and priming it for the conversations I will be having at re:Invent. Helping me not just think deeply about AWS services, but also about how Postman can be used to work with AWS APIs. These Postman reference collections are just the foundation for my API understanding, API conversations, and other ways of considering how AWS APIs impact how we deliver applications in the cloud. While the AWS Postman collections help jumpstart anyone's usage of AWS, I'm also looking at how to use them to actually define, deploy, manage, and evolve APIs that operate on AWS. AWS APIs have long fascinated me, and have played a significant role in my evolution as the API Evangelist. In 2020 I'm still keen on learning from AWS as an API pioneer, but I am more interested in learning how we can...[Read More]


API Evangelist

I Am Happy I Chose The Term Evangelism

27 Nov 2019

There is always a lot of discussion around the proper term to use for describing what it is we all do when it comes to getting the word out about our APIs. Some of us use the word evangelism, while others prefer advocacy, relations, or being an ambassador or champion. Sometimes it is focused on APIs, but other times it is focused on developers or other aspects of what we are looking to highlight. While there are many "announced" reasons why we evangelize, advocate, and champion, the real honest reason is always that we want to bring attention to our products and services. Sure, in some cases we are interested in educating and supporting developers, but really all of this is about bringing attention to whatever we are peddling—me included. I didn't grow up religious—the opposite actually. I never went to a church ceremony until I was an adult. So the term evangelism doesn't carry a lot of baggage for me. However, I do fully understand that it does for many other people. Even though I didn't go to church, I did grow up around enough people who were very religious to understand the meaning evangelism can bring to the table. Early on in doing API Evangelism I felt somewhat bad for using this term, and felt like I was bringing a whole lot of unnecessary meaning and baggage to the table as I was trying to "enlighten" folks of the API potential. Now, after a decade of doing this, I'm happy I chose the term evangelism, because I feel it best represents what it is I do in this technology obsessed world we have created for ourselves. Technology is the new religion for many people. You know what two of the fastest growing areas of APIs are? Blockchain and churches. When you have so many people blindly believing in technology, it begins to look and smell a lot like religion. When you embark...[Read More]


API Evangelist

Bulk Updating My Postman Collections Using The Postman API

27 Nov 2019

I had recently pulled all of the AWS Postman collections I have created and spread across Postman workspaces. After creating over 50 AWS Postman collections I learned some things along the way, and realized I needed to update the variable for the baseURL of each API. But I had already created all my collections, and updating these variables manually would take me hours, if not days. So I got to work writing a script that would pull the latest JSON for each collection, conduct a find and replace on the baseURL variable, replacing it with a service-specific base URL, then write the collection back using the Postman API. This is something that would take me many hours to update across 50+ collections and nearly 1000 individual requests, but is something that I could accomplish in less than an hour with the Postman API. Once again, when I can't get what I need quickly in the Postman UI, I can quickly get things done using the Postman API. This is how it should be. I don't expect the Postman UI to keep pace with all of my needs. I like Postman as it is, carefully plodding forward and adding features that make sense to as wide of an audience as possible. I always know that I can get at what I need through the API, and automate the changes I need. In this case, I'm able to rapidly make updates at scale across many different API collections, relying on Postman to help me manage API definitions manually through the interface and in many different automated ways via their API. I am still getting my bearings when it comes to managing the variables I use across my many Postman collections. I am rapidly iterating upon how I name my variables for maximum flexibility within Postman environments, and where I apply them within my Postman collections. This is something that requires a lot...[Read More]
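A sketch of that script might look like the following. The Postman API endpoints are real (`GET` and `PUT` on `api.getpostman.com/collections/{uid}`, authenticated with an `X-Api-Key` header), but the URLs being swapped are illustrative, and this is my reconstruction rather than the actual script:

```python
import json
import urllib.request

POSTMAN_API = "https://api.getpostman.com"


def replace_base_url(collection, old_base, new_base):
    # Serialize the whole collection, find-and-replace the base URL,
    # and parse it back; returns a new object, leaving the input intact
    text = json.dumps(collection)
    return json.loads(text.replace(old_base, new_base))


def update_collection(uid, api_key, old_base, new_base):
    # Pull the latest JSON for the collection from the Postman API
    req = urllib.request.Request(
        f"{POSTMAN_API}/collections/{uid}",
        headers={"X-Api-Key": api_key})
    with urllib.request.urlopen(req) as resp:
        collection = json.load(resp)
    # Rewrite the base URL, then PUT the collection back
    updated = replace_base_url(collection, old_base, new_base)
    body = json.dumps(updated).encode()
    put = urllib.request.Request(
        f"{POSTMAN_API}/collections/{uid}", data=body, method="PUT",
        headers={"X-Api-Key": api_key, "Content-Type": "application/json"})
    with urllib.request.urlopen(put) as resp:
        return resp.status
```

Looping `update_collection` over a list of collection UIDs is the whole batch job, which is why 1000 requests across 50+ collections collapses into under an hour of work.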


API Evangelist

So You Wanna Build An iPaaS Solution

26 Nov 2019

I'm getting more emails and DMs from startups doing what I'd consider to be integration platform as a service (iPaaS) solutions. These are services that help developers or business users integrate using multiple APIs. Think IFTTT or Zapier. I've seen many waves of them come, and I've seen many waves of them go away. I'm always optimistic that someone will come along and make one that reflects my open API philosophy and can still make revenue to support itself. So far, Zapier is the closest one we have, and I'm sure there are others, but honestly I've grown weary of test driving new ones, and I am not as up to speed as I should be on what's out there. When it comes to iPaaS providers the bar is pretty high to convince me that I should be test driving your solution, and why I should support you in the long run. This is partly from just having supported so many services over the years, only to have them eventually go away. It is also because of the new problems introduced for consumers by abstracting away the complexities of APIs, rather than joining forces to educate API providers and fix these complexities. I'm always willing to talk with new iPaaS providers that come along, but I have a few requirements I like to put out there, which usually filter those who end up reaching out and engaging from those who do not. Due Diligence - Make sure you are test driving and reviewing as many existing iPaaS platforms as you possibly can, because there are a lot of them, and the more you kick the tires on, the more robust and valuable your solution will be. API Definitions - Your solution needs to be API definition driven, adopting OpenAPI, Postman collections, and existing formats for defining each of the APIs you are integrating with, as well as...[Read More]


API Evangelist

Pulling All My Postman Collections Using The Postman API

26 Nov 2019

I needed access to all of the AWS Postman collections I am building. The problem is they are distributed across multiple different workspaces. I had organized over 50 AWS Postman collections based upon the resource they were making available. Now I just wanted a list of all of them, so I could report on what I have done. It sounded like a good idea to group them by resource at first, but now that I needed to work with all of them in a single list, I'm thinking maybe not. So I got to work pulling all of my collections from the Postman API and filtering out any collection that wasn't from AWS. I find it easy to get caught up in what features are available to me via the interface of the services and tooling I use, letting the UI define what is possible. This is why I only use services and tooling that have APIs if I can help it—as the API is always the relief valve that allows me to get done what I need. In this case, while the Postman UI is very robust, I couldn't get everything I needed done with it in the time period required, so I switched to the API, and was able to programmatically get at the data I needed. Allowing me to pull all of my collections from across workspaces, then organize and generate exactly the list of collections I needed for a specific report I'm working on. While talking with folks about Postman I regularly encounter individuals who speak about the limitations of the product, stating they couldn't use it to accomplish something because it didn't do this or that, without ever considering that they could accomplish it via the API. Personally, I am impressed at how thoughtful Postman has been about adding new features, helping minimize the complexity and bloat of the platform. This is why I expect platforms to have APIs, so that I can get...[Read More]


API Evangelist

Is It An Amazon Or AWS Branded Service?

26 Nov 2019

I'm working on 50+ AWS Postman collections at the moment, as well as crafting Postman environments for use across them. I've encountered some namespace challenges in this work, and needed to establish a naming convention for the key / value pairs I'm using within my Postman environments. To help establish the namespaces I am just taking the display name for each of the APIs I am profiling, but one thing I'm noticing is that there are two different names in use across APIs, being either AWS or Amazon—with no rhyme or reason why a service is labeled one way or the other. Looking down the list of all the services they have, I would say that AWS is more prominent than Amazon as the beginning namespace for each service. I'm just curious if there is any guidance, or rhyme or reason, to the naming of services launched under AWS. At first it feels like I'm being too pedantic, but from a branding perspective, and even programmatically across services, it seems like having a common naming convention for services would make sense. Like my thoughts on API design consistency across AWS APIs, I'm not trying to shame AWS, I am just trying to learn from what is happening, and share what I find with other API providers. I regularly use Amazon as an example to learn from in my API storytelling, which unfortunately sets them up for constructive criticism as well--I am sure they can handle it. ;-) For my environment variable challenge I am simply going to prefix my variables with aws as the service namespace. I'm just standardizing for shortness, ease of use, and distinguishing AWS APIs from the other API providers I'm profiling. I was more interested in just pausing and thinking about why this occurs, and working to think more deeply about how we name our APIs. For me, the lessons around naming our APIs are more about pushing us to...[Read More]


API Evangelist

Where O Where Is My API Key

25 Nov 2019

Finding your API key for an API provider can be a real pain in the ass. Depending on the account it can be buried deep within your settings, or possibly out in the back 40 in a separate developer account. I'm not even talking about OAuth here, I am just talking about obtaining one or more API keys to access the valuable API resources you desire from a free service, or even from a service you are paying for. There is no standard for how to create, define, store, and access API keys. It is something that API providers have helped standardize somewhat (I guess?), but ultimately there is no unified way for me to access all the API keys I use across the thousands of APIs I use — yes, I do use thousands of APIs, because I am the API Evangelist. Why can't API keys be easier to find, and exist as a default part of our platform accounts? They shouldn't be this hard to generate, find, and put to use. I keep coming back to my CloudFlare DNS application experience on this subject. Next to some of the actions I can take in the CloudFlare UI is a link to the API call to make the same action, complete with my API keys to make the calls. I don't have to do anything else other than use the application to find the keys. Can you imagine if every UI element in every application had the underlying API call available right next to it with API keys? Can you imagine if every API call came back with a link in the response to where I can take the same action in the UI? Maybe I have a different view of the world than others, but this seems like it should be commonplace in the tech sector. I'm afraid many folks have seen APIs as some technical thing over there for too...[Read More]


API Evangelist

API Design Consistency Across Amazon Web Services

25 Nov 2019

I have been crafting Postman collections for as many AWS APIs as I can before re:Invent. As I work my way through the different APIs I'm reminded of the difficulties involved in API consistency and governance at large enterprise organizations. While most AWS APIs employ a pretty formulaic XML RPC design, there are variations within how these RPC APIs work, and there are also several outliers of other more RESTful and even full blown hypermedia APIs present. Making for a pretty wild mix of API resources to put to work, something that has been abstracted away as part of their SDKs, but is painfully present when directly integrating with APIs across multiple services. From the lay of the land I'm guessing AWS instituted their primary XML RPC approach, and baked it into governance law across the organization in the early days. Over the years, after significant growth, some groups were able to publish APIs outside of this pattern, resulting in the patchwork quilt of services that are present. The most notable and ironic deviation from this pattern is the API for the AWS API Gateway, which employs a RESTful approach using the HAL media type. Personally, I would prefer that as the dominant pattern across all the services, but sadly it is the more legacy XML RPC pattern that dominates. I get it, you can't go changing the AWS S3 or EC2 APIs now, they are known for their stability, but I still think there are some important API design and governance lessons present in this valuable cloud API stack. The first lesson in all of this, I'd say, is that we need to establish our API design governance early on and socialize it across all teams—even new ones. The second lesson is to review your API design governance regularly, making sure you aren't missing any healthier patterns that may have come along. You don't want to...[Read More]


API Evangelist

API Copyright: Directories

25 Nov 2019

I am gearing up for API copyright heading to the Supreme Court, having another look at whether or not the naming and ordering of your API interface is copyrightable, as well as whether or not reimplementation of it can be considered fair use. To help strengthen my argument that APIs should not be copyrightable I wanted to work through my thoughts about how APIs are similar to other existing concepts that are not copyrightable. One of the newer concepts I'm working with to help strengthen my argument that copyright does not apply to APIs involves the directory, shining a light on the fact that APIs are just a directory of our digital resources. As with all of my API storytelling, I am focused exclusively on web APIs. Occasionally you'll hear me talking about language and platform SDK APIs, browser APIs, and other variations, but the majority of what I mean when I say API is done via public DNS over HTTP, HTTP/2, and sometimes TCP. All of these APIs are just a directory of Uniform Resource Identifiers (URIs) of corporate and institutional digital assets. Modern APIs most often leverage DNS, providing a machine readable listing of resources that are available within a specific domain. The domain and the resulting API directory are not creative expressions, they are an address that points to a directory of digital assets, allowing them to be found by developers for use in other applications and systems. APIs are not a form of creative expression, they are just helping companies, organizations, institutions, and government agencies make their digital resources and capabilities discoverable. I can easily make the argument that APIs are simply a directory or menu of organizational resources and capabilities. Due to the diverse nature of what an API can be, I can also easily apply the analogy that APIs are recipes. All of the above is true. As with my restaurant menu and recipe stories, people have trouble believing...[Read More]


API Evangelist

Reducing API On-Boarding Friction With API Environments

22 Nov 2019

I’m obsessed with making my Postman collections more accessible and executable to developers and non-developers. I’m really frustrated that on-boarding with APIs is still so difficult, and I’m determined to reduce friction in this area in 2020. I’m confident that Postman collections plus environments represent the next stage in API integration, automation, and orchestration, but we still have some sharp corners to round off when it comes to streamlining the on-boarding of new users to each API being referenced within the Postman reference, workflow, capability, collaboration, walkthrough, and numerous other types of Postman collections that are emerging across the API landscape. I’ve been working hard to create a number of new Postman collections, but even once you click on the Run in Postman button for each API collection, you still have to go sign up for an account, create an OAuth application, and plug in the OAuth details for the API before you can make your first API call — this is dumb. We desperately need to redefine what an application is in the context of the API management layer for providers. As I’ve said many times before, every API provider needs to provide personal OAuth access tokens by default as part of each account, like GitHub does. This should be the baseline for all APIs, but we also need more services and tooling that helps us generate and manage tokens for use across our low-code and no-code API integration solutions. I got frustrated enough recently that I just began hacking together what I am simply calling: API Evangelist Environments. It’s a pretty crude tool, but it represents a pretty good start. It helps establish a base OAuth application for a handful of common APIs, allowing anyone to click and generate a token which can be used to make API calls. If you put your Postman API key in the text box, it will also generate a Postman environment and place it in your Postman account....[Read More]
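The final environment-generating step described above can be sketched in code. This is a minimal sketch, not the actual tool: it assumes the Postman API's environments endpoint (POST to api.getpostman.com/environments, authenticated with an X-Api-Key header), and buildEnvironmentPayload is a hypothetical helper name of my own.

```javascript
// Hypothetical sketch: build the JSON body that would be sent to the
// Postman API to create an environment holding a freshly generated token.
// The payload shape (environment.name, environment.values) follows the
// Postman API's environments endpoint; all names here are illustrative.
function buildEnvironmentPayload(apiName, token) {
  return {
    environment: {
      name: `${apiName} (generated)`,
      values: [
        { key: "oauthToken", value: token, enabled: true }
      ]
    }
  };
}

const payload = buildEnvironmentPayload("GitHub", "example-token");
console.log(payload.environment.name); // "GitHub (generated)"
```

In the actual tool this payload would be POSTed using the Postman API key the user supplied, placing the environment directly in their Postman account.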


API Evangelist

Developing An API Environment Naming Strategy

22 Nov 2019

I’m creating more Postman environments lately. I’m realizing the potential for using environments as a pivotal layer in defining and working with Postman collections. Environments are the missing ingredient from OpenAPI, and provide a powerful way to abstract away key value pairs, including keys and tokens, from the Postman collections, allowing me to apply the same Postman collection over and over but in different contexts. As I’m pushing the boundaries of working with API environments I’m being pushed to think more deeply about how I name my variables so that they stay in sync with Postman collections, but more specifically work seamlessly across many different types of Postman collections. An example of this shift emerged while I was crafting Postman collections for AWS APIs, defining a base environment for each collection with a handful of key / value pairs which I will need to actually fire up a collection and begin making calls to each AWS API I am targeting with each Postman collection:

baseURL - The baseURL for use across all the requests I have organized into collections.
accessKey - One half of the keys needed for authenticating against any AWS API.
secretKey - The other half of the keys needed for authenticating against any AWS API.
region - The region where each of the APIs I’m engaging with are operating within AWS.

As I build each Postman environment I am naming each one with the same name as I am applying to the Postman reference collection I am creating for each AWS API. This works well in isolation, but the approach begins to break down as I begin crafting capability, workflow, and walkthrough Postman collections that will be putting multiple AWS APIs from across multiple AWS services to work. This realization has made me aware that I am going to need to begin developing a more comprehensive and long term strategy for crafting my environments, as well as the variables I use...[Read More]
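One way to reduce that breakdown is a scoping convention for variable names, keeping shared credentials at the account level and service specific values prefixed by service. A rough sketch of the idea, with a naming convention that is entirely my own, not an established Postman practice:

```javascript
// Sketch of a scoped naming convention for Postman environment variables.
// Shared AWS credentials (accessKey, secretKey, region) get an account-level
// prefix, while values like baseURL get a service-scoped prefix, so one
// environment can back collections spanning multiple AWS services.
function environmentKey(scope, service, name) {
  return scope === "shared"
    ? `aws_${name}`               // e.g. aws_accessKey, aws_secretKey, aws_region
    : `aws_${service}_${name}`;   // e.g. aws_ec2_baseURL
}

console.log(environmentKey("shared", null, "accessKey"));  // "aws_accessKey"
console.log(environmentKey("service", "ec2", "baseURL"));  // "aws_ec2_baseURL"
```

A collection then references {{aws_accessKey}} and {{aws_ec2_baseURL}} style variables, and a single environment can serve reference, capability, and workflow collections alike.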


API Evangelist

API Copyright: Restaurant Menu

22 Nov 2019

I am gearing up for API copyright heading to the Supreme Court, having another look at whether or not the naming and ordering of your API interface is copyrightable, as well as whether or not reimplementation of it can be considered fair use. To help strengthen my arguments that APIs should not be copyrightable I wanted to work through my thoughts about how APIs are similar to other existing concepts that are not copyrightable. One of the older concepts, which I had worked through back in May of 2014, and which was used by Google as part of their argument, was the notion that your API is just a menu for your organizational digital resources—I wanted to take a fresh look at this concept, and add it to the toolbox for when we head to DC. The restaurant menu analogy is an important one to help people who really don’t understand APIs, as well as those who do understand APIs, but don’t understand the wider world of copyright. As I’ve said before, the problem here is purely that people can’t separate the interface from the backend, from the data, content, and media that flows through them. People simply see the entire thing as an API. I want to keep making the argument that your backend code, and the data, content, and media that flows through it, can be copyrighted and patented—I don’t care. I’m arguing that the interface to these valuable resources needs to stay open, in the public domain, and should not be subject to copyright, just like the menu of a restaurant. Your application programming interface (API) is not your application, it is the programming interface to your application. Your API is the menu to the digital resources and capabilities you possess within your company, organization, institution, or government agency. APIs are how your digital resources and capabilities are made known, accessed, and baked into your partners’ technological solutions. You want your APIs to be as open,...[Read More]


API Evangelist

The Missed Revenue Opportunities For The State Of California Because They Do Not Have A Business Registry API

21 Nov 2019

I was talking with OpenCorporates CEO Chris Taggart (@countculture) while in Washington DC a couple of weeks ago, reminding me of a previous conversation we had about the current state of business registry submission and search for the State of California. He had asked me a couple months ago if I knew anyone working for the State of California on opening up data, or possibly specifically with the business registry—I didn’t. Chris wants to talk with anyone in the know about evolving and modernizing the state’s approach to managing their corporate entity information. Clearly he has a desire to get better access to the data, but as our conversation in DC reminded me, he has some serious concerns about the importance of not just access to corporate data, but being able to keep up to speed with what is happening at the business level in not just California, but other states as well. There are many ways in which government can’t keep up with the pace of the private sector, and corporate filings is definitely one of the more critical areas that is falling behind. Like many areas of government, I’m guessing this is by design. Not because of some grand government or corporate conspiracy, but just by the ongoing suffocation of the resources government has to get things done. I’m guessing the folks running the State of California business search are doing the best they can with the resources they have. I’m guessing that the private sector has no real motivation to lobby and compel the State of California to do any better when it comes to filing, managing, and searching business data. The result is a business registry system that barely meets the needs of everyone involved, leaving a lot of opportunities on the table when it comes to understanding and responding to change in the way business gets done in the state. When it comes to business registry in...[Read More]


API Evangelist

Reference And Walkthrough API Documentation And Collections From ShipEngine

21 Nov 2019

One of the things I have been thoroughly enjoying as part of my work with Postman is the many different ways in which Postman collections are being put to work. If you’ve followed my blog over the years you know I’ve been a big supporter of OpenAPI—it has changed how we communicate about our APIs. Until I began working with Postman I viewed Postman collections as just another API definition like OpenAPI, where you can define the surface area of your APIs. My perspective began changing once I learned about Postman environments, but my world has been completely shifted once I began learning the different ways in which Postman customers are putting collections to work as part of their API operations. Reference Postman collections are the most common approach you will find in the wild, outlining all of the API paths that exist for an API, but over the last couple months I’ve seen entirely new types of collections emerge, including workflow, capability, maintenance, governance, and now walkthrough Postman collections. I saw ShipEngine tweeting out about giving a live demo of their “fancy new @getpostman docs” tonight at the #AustinAPI meetup, and after landing on the ShipEngine documentation home page I found two types of Postman collections, with supporting documentation, providing two distinct ways of communicating around the ShipEngine APIs, one set for on-boarding new users, and another for making sure every API path is documented and accessible. First they provide the ShipEngine Walkthrough, a collection that provides a guided tour through ShipEngine's most popular features, reducing the overhead for any developer looking to quickly understand what the ShipEngine API provides—highlighting the following capabilities:

Create and download shipping labels
Calculate shipping costs for any package
Compare rates across UPS, FedEx, USPS and other carriers
Track packages in real-time or on-demand
Parse and validate mailing addresses for any country on Earth!

Then ShipEngine provides the ShipEngine Reference, which is a collection that contains sample requests for every...[Read More]


API Evangelist

Attracting The Big Customers You Desire Requires A Steady Stream Of API Storytelling

21 Nov 2019

API storytelling is the number one tool in my toolbox, and this should be the same for API providers and service providers. I know that many folks, especially the more technical folks, snicker every time I write about storytelling being so important, but this is just because of a lack of awareness regarding all the influence that stories have on our personal and professional lives. One hallmark of successful API operations is always a healthy dose of storytelling via blogs, social media, GitHub, and the mainstream press. One example of this out in the wild is Capital One, who have embraced being very transparent in how they do APIs, both internally and externally, being very active when it comes to telling stories about what they are up to, and encouraging their employees to do the same. Being this vocal as a large enterprise organization isn’t easy, especially one that operates within such a highly regulated industry. As with any enterprise organization Capital One is still very reserved in many ways, and has its official communication channels and overall narrative. With that said, they have a slightly more amplified voice out of their technical groups, and they give more agency to their employees, as well as outside voices like me. I’ve never had to sign an NDA as part of regular engagements with their API teams, leaving it up to me to be sensible in what I filter and what I do not. I wouldn’t say that Capital One is wildly open with their storytelling like the startups I’ve worked with, but they are just one or two notches more open than many other enterprise organizations I work with, especially when it comes to the banking industry—I’d say Capital One is the most visible bank in the United States when it comes to API and related infrastructure storytelling. After doing my video series with Capital One about two years ago I had people from competing banks reach out...[Read More]


API Evangelist

Company Specific API Workspaces

20 Nov 2019

I have many different workspaces defined within my Postman team account. I’m organizing a couple thousand APIs into different topical categories that help articulate the value they deliver. Once I’ve sufficiently profiled a company and their APIs, producing a reference Postman collection, I’ll share it to, or create, a workspace for each tag I have applied to each API as part of my profiling process. Depending on the company, I am looking to establish a complete (as possible) Postman reference collection, as well as some more specialized capability, workflow, and other relevant collections that are derived from, and complement, the reference APIs present, but go further towards accomplishing specific business objectives with each API path available. I am currently profiling the Postman API, while also defining the capabilities present within the API that I depend on to conduct API Evangelist business. I have already added the Postman API to several other workspaces including design, mocking, documentation, testing, and client, but now I’m going to create a company specific workspace for managing my API collections for Postman. My master copy of the Postman collection, which I’ve downloaded from the Postman API Network, will act as the master API collection for Postman, and I will share the collection out to any relevant workspace, but I will also be creating some more specific workflow and capability collections that reflect how I use the Postman API to run API Evangelist. I’m not sure this approach will scale, but I’m purposely looking to push the number of collections and workspaces I’m managing using Postman to see where the friction is. I’m using the Postman API to help me organize and audit how I’m managing APIs at scale, so I can adapt and pivot based upon how I’m breaking things up. If I had to manage everything through the Postman interface it would take me days to get things reorganized each time I change my strategy, but since I have the...[Read More]


API Evangelist

Azure Provides SDK Governance Guidelines

20 Nov 2019

Most companies I encounter who are doing API governance are purely focused on API design, with a handful also thinking more deeply about documentation, testing, monitoring, and other stops along the API life cycle. API governance done well involves every stop, but since most organizations are only just getting started investing in API governance, the area of API design is a sensible place to start, defining how APIs should be designed, helping introduce consistency in how APIs are developed, and reducing friction for consumers when it comes to integrating them into applications. Anytime I come across examples of API governance in the wild I try to showcase it here on the blog, and for this story I wanted to shine a light on how the Microsoft Azure team provides guidelines for how SDKs should be developed and delivered. Azure provides the following scaffolding when it comes to their guidelines, answering many of the questions teams will have when it comes to crafting consistent SDKs for APIs across any product or group:

Introduction - Providing general information regarding what the SDK guidelines are for, and referencing the need to work with the Azure Developer Experience Architecture Board to define the right experience.
Terminology - Offering a glossary of the most common terms and phrases in use that might not be readily known by stakeholders, providing easy to find definitions for words that will be used across the guidelines.
API Design - Articulating how the API surface of each client library must be given the most thought, as it is the primary interaction that the consumer has with each service, defining the overall developer experience.
Implementation - Guidance for actually beginning to implement your SDKs once you have thoughtfully defined the surface area of the API for the SDK, beginning to actually craft the intended language library SDKs.
Documentation - Providing details of what is required when it comes to delivering documentation to support all language client libraries, and...[Read More]


API Evangelist

The Common Building Blocks Of Evangelism

19 Nov 2019

As part of my work as the Chief Evangelist for Postman I find myself regularly talking to other devrel, advocates, and evangelists who are looking for ideas on how to expand their evangelism toolbox, and be more successful in their work. As part of these conversations I wanted to work my way through my own toolbox to see what is still relevant, and share the tools and tricks I have with other evangelists I am talking with, and hopefully also learn about a few new ones myself, as I engage with different API personalities from across the space. Much of what I do as the API Evangelist is very formulaic and repetitive. While there is a creative storytelling element to all of this, the more tangible, repeatable, and quantifiable elements of what I do are pretty well known, and do not represent any sort of secret sauce or special ability--here are the building blocks of API evangelism you will find in my toolbox.

Blog Posts - Publishing a single blog post on a specific topic, process, or idea I think the community will find valuable.
Social Media Post - Publishing a single Tweet, LinkedIn, Facebook, or other post to a social network for my profile(s).
Questions / Answers - Publishing questions or answers to online properties, helping seed, drive, and amplify conversations.
Emails - Crafting a newsletter, broadcast, and personalized email involving a specific concept, narrative, or other relevant item.
Meetup Talk - A 10 to 30 minute talk to a targeted audience of less than 100 people in a Meetup environment.
Conference Talk - A 15 to 60 minute talk to a larger audience of 50 to 500 people in a physical conference setting.
Webinar Talk - A 15 to 60 minute talk on a specific subject to an online audience via a web conferencing platform.
Images - Creating an image that represents a specific concept for use across my evangelism activity on and offline.
Video - ...[Read More]


API Evangelist

API Copyright: Blank Forms

19 Nov 2019

I am gearing up for API copyright heading to the Supreme Court, having another look at whether or not the naming and ordering of your API interface is copyrightable, as well as whether or not reimplementation of it can be considered fair use. To help strengthen my arguments that APIs should not be copyrightable I wanted to work through my thoughts about how APIs are similar to other existing concepts that are not copyrightable. One of the new concepts I’m looking to understand better, and add to my toolbox of API copyright arguments, is the blank form, which APIs resemble in more than one way. As I was refreshing my understanding of what copyright is and isn’t I was working my way through Copyright.gov, and found this snippet about the copyrightability of forms—which I think applies pretty nicely to what APIs accomplish in a digital way. Blank forms typically contain empty fields or lined spaces as well as words or short phrases that identify the content that should be recorded in each field or space. Blank forms that are designed for recording information and do not themselves convey information are uncopyrightable. Similarly, the ideas or principles behind a blank form, the systems or methods implemented by a form, or the form’s functional layout are not protected by copyright. A blank form may incorporate images or text that is sufficiently creative to be protected by copyright. For example, bank checks may be registered if they contain pictorial decoration that is sufficiently creative. Contracts, insurance policies, and other documents with “fill-in” spaces may also be registered if there is sufficient literary authorship that is not standard or functional. In all cases, the registration covers only the original textual or pictorial expression that the author contributed to the work, but does not cover the blank form or other uncopyrightable elements that the form may contain.
Examples of blank forms include • Time...[Read More]


API Evangelist

Subway Map Visualization Postman Collection

18 Nov 2019

I have been working to migrate all the different API driven JavaScript solutions I have developed over the years and run on GitHub using Jekyll to operate as self-contained Postman collections. Now that Postman has a JavaScript visualization layer, I can make calls to APIs, parse the response, and generate HTML, CSS, and JavaScript visualizations, allowing me to begin organizing all my API-driven visualization tools as simple, sharable, and executable Postman collections. I had developed a way to visualize the API lifecycle a while back using the Subway Map Visualization jQuery Plugin, by Nik Kalyani. It provides a pretty slick way of drawing lines, establishing stations, connectors, and other iconic subway map visualizations. I have been running this on GitHub using Jekyll, but wanted to make it something that I could keep portable and machine readable so that anyone else could run it locally or on the web. I haven't hooked the visualization up to any specific APIs yet. I’m going to make it run from my API lifecycle training, allowing users to visualize and then explore the stops along the API life cycle they want. Then I want to see what I can do to hook it up to AWS, Google, and Azure for helping visualize API infrastructure, allowing me to map out different APIs, and organize them into lines based upon OpenAPI tags or Postman collection folders. The Subway Map visualization Postman collection is published as a template in the Postman network, and I have published supporting documentation from the collection. Once I have updated it to work with any particular API I will publish a separate template, keeping this one as my base, but then evolving it to meet different API life cycle and infrastructure needs, helping create a subway map for navigating the complex API landscapes we are building.[Read More]
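Inside a Postman test script the visualization layer boils down to handing pm.visualizer.set() an HTML template plus the parsed response data. The sketch below factors the rendering into a plain function so it can be read and run outside the Postman sandbox; the station data shape is my own illustration, not the plugin's actual format:

```javascript
// Sketch: render subway "stations" (stops along the API life cycle) as HTML.
// In a Postman test script this string would be passed to pm.visualizer.set();
// here it is a plain function so it runs anywhere. Data shape is illustrative.
function renderStations(stations) {
  const items = stations
    .map(s => `<li class="station" data-line="${s.line}">${s.name}</li>`)
    .join("");
  return `<ul class="subway-map">${items}</ul>`;
}

const html = renderStations([
  { name: "Define", line: "design" },
  { name: "Mock", line: "design" }
]);
// html contains one <li> per lifecycle stop, ready to hand to the visualizer
```

Feeding the function from an API response instead of a hard-coded array is what would let the same collection draw AWS, Google, or Azure infrastructure as different lines.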


API Evangelist

Organizing EC2 API Actions As A Postman Collection

18 Nov 2019

I’m crafting Postman collections in support of the upcoming re:Invent conference in Vegas in December. One of the first collections I crafted was for Amazon EC2, allowing anyone to put the Postman collection to work managing their AWS EC2 infrastructure. At first glance of the 359 actions available via the AWS EC2 API documentation page, I was overwhelmed. I definitely needed a way to tame the AWS EC2 API, making it more accessible and usable by a human—while APIs are meant for system to system integration, and delivering desktop, web, mobile, and device applications, they still have to be implemented by a human. When crafting the AWS EC2 Postman collection I wanted to take some time to better organize the wealth of actions you can take, making them more accessible via a single Postman collection, organized by resource. You can access the collection here, and the Postman generated API documentation for the AWS EC2 API by clicking on the image below. The Postman collection helps organize the 350+ actions into folders by resource—making them a little easier on the eyes, and to navigate, while also making every AWS EC2 action immediately executable once you add your keys and secrets to the Postman environment that accompanies the collection. By organizing all of the individual resources into more coherent groups, it increases the chance that the resource you need will be found, and put to use.
Account Attributes - Describe Account Attributes (Docs)
Address To Classic - Restore Address To Classic (Docs)
Address To VPC - Move Address To VPC (Docs)
Addresses - Allocate Address (Docs), Associate Address (Docs), Describe Addresses (Docs), Disassociate Address (Docs), Release Address (Docs)
Aggregate ID Format - Describe Aggregate ID Format (Docs)
Availability Zones - Describe Availability Zones (Docs)
BYOIP CIDR - Advertise BYOIP CIDR (Docs), Deprovision BYOIP CIDR (Docs), Provision BYOIP CIDR (Docs), Withdraw BYOIP CIDR (Docs)
Bundle Task - Cancel Bundle Task (Docs)
Capacity Reservation - Cancel Capacity Reservation (Docs), Create Capacity Reservation (Docs), Modify Capacity Reservation (Docs)
Capacity Reservation...[Read More]
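The grouping itself is mechanical enough to sketch: strip the leading verb from each action name and use the remainder as the folder. A rough illustration of the approach, with a partial verb list of my own; the real collection was organized by hand against the AWS docs:

```javascript
// Sketch: group flat EC2 action names into folders by resource.
// The verb list is partial and illustrative; edge cases (Addresses vs
// Address) show why the real organization still needed a human pass.
function groupActions(actions) {
  const folders = {};
  for (const action of actions) {
    const resource =
      action.replace(/^(Describe|Create|Delete|Modify|Cancel|Allocate|Release)/, "") || action;
    (folders[resource] = folders[resource] || []).push(action);
  }
  return folders;
}

const folders = groupActions(["DescribeAddresses", "AllocateAddress", "CancelBundleTask"]);
// { Addresses: ["DescribeAddresses"], Address: ["AllocateAddress"], BundleTask: ["CancelBundleTask"] }
```

A pass like this gets you most of the way to folders, with a manual cleanup step to merge near-duplicate resource names.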


API Evangelist

API Copyright Heading To The Supreme Court

17 Nov 2019

I received an email this last Friday that the Supreme Court agreed to hear the case on the freedom to reimplement APIs, as well as reconsider the copyrightability of APIs, and whether their reimplementation constitutes fair use. I’ve been a signer on two briefs as well as a supporter of the case since 2012, and to help fire up my imagination and storytelling around why APIs should NOT be copyrightable, I wanted to revisit some of my storytelling over the years, and brainstorm some possible new arguments that might help in this latest wave of litigation. Here is just a sampling of the stories I have written over the years:

May 2012 - APIs Have Been Copyrightable for 22 Years
November 2012 - Help EFF Make Case For No Copyright on APIs
June 2013 - Helping EFF Urge The Courts to Block Copyright Claims in Oracle v. Google API Fight
November 2013 - Putting The Open In API With The API Commons
November 2013 - API Commons Is More Than Just The Definition, Specification or Schema
December 2013 - It's Between Copyright And Fair Use In Oracle vs Google API Case
May 2014 - Where Will Your API Stand In The Oracle v Google API Copyright Debate?
May 2014 - Restaurant Menus As Analogy For API Copyright
August 2014 - Continuing With The API Restaurant Analogy
November 2014 - My Continued Support As Signer Of Oracle v Google Amicus Brief From EFF
August 2015 - What We Can Do To Make A Difference In The Wake Of Oracle v Google API Copyright Case

Each time I pick up this torch I have to re-educate myself on exactly what copyright is and isn’t, and build upon all my former learnings, helping pour the latest batch of concrete that acts as the foundation for my belief system around API copyright, patents, trademark, and licensing. For this round, let’s once again revisit the basics: what is copyright? Copyright is the...[Read More]


API Evangelist

Why Is API On-Boarding And Authentication Still So Hard?

13 Nov 2019

I was teaching a class to business users yesterday and they were very curious about being able to play with some of the public APIs that exist, but once again I found myself struggling with how to help them get over the hurdle of signing up for an API and getting the key or token they need to begin making their first call to an API. Even with some ready to go Postman collections that help introduce business users to some very well known APIs like Facebook, Twitter, and Instagram, I still can’t on-board them without pushing them to set up a developer account, create a new application, and obtain a set of keys or tokens. This is a regular hurdle for developers to jump over, and one that will keep most “normals” out of the conversation. This bothers me. After over a decade of evangelizing the streamlining of API on-boarding processes, and listening to API management and other API service providers profess how they’ll reduce friction when it comes to putting APIs to work—clearly nothing has changed. Few APIs are truly open, meaning you don’t need any API keys or tokens to access them, and the rest of the APIs use a mix of different authentication strategies to secure their APIs—introducing friction for developers and non-developers. This isn’t just a problem for non-developers, it is also a problem for developers, making it pretty challenging to go from learning about a new API to actually making a call to that API. Even with my experience, I find it very difficult to on-board with most APIs—this is a problem. When it comes to putting APIs to work, there are a handful of common areas where friction exists, making it difficult and in many cases impossible to actually make an API call, even once they have gone through the hard work of finding an API and learning about what it does:

Sign-Up - Finding where you...[Read More]


API Evangelist

What You Mean When You Say You Have An Open API (Not OpenAPI)

13 Nov 2019

My friend Lorenzino Vaccari (@lvaccari) asked me to help him understand what I think of as an open API. Not to be confused with the OpenAPI specification, but an API that is “open”. I’ll begin with the state of things and the reality that there are many API providers who proclaim that they have an open API, when in reality there are very, very, very, very, very few APIs that are actually open. Honestly it is a term that I’d completely avoid using, mostly because it doesn’t have any meaning anymore, but also because of the unfortunate name that Swagger was given after it was put into the Linux Foundation. Even so, I’d do anything for Lorenzino, so let’s work through this exercise of what it actually means when you wield the term “open API”, in my opinion—so here we go. In my opinion, the most common thing that “open API” means is open for business. It has almost nothing to do with API access, but if I was to explore an honest real world bar for what “open API” SHOULD mean, here is a list of things I would consider. First, I’d say that being an open API is not black or white, but many shades of gray, resulting in a laundry list of things API providers and consumers should be considering when they wield the term.

Discovery - Can I find the API? Is it easy to discover and explore for any potential consumer?
Portal - Is the landing page or portal for the API accessible to the public, allowing me to learn about the API?
Docs - Is there complete, up to date API documentation for the API that is accessible publicly?
Paths - Do developers have access to all of the paths that are available for the API?
Enumerators - As a developer do I have access to all enumerated values that can be used for an...[Read More]
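Since openness is shades of gray rather than a yes/no label, one way to make the checklist concrete is a simple score across its dimensions. The dimension names follow the checklist above; the flat weighting is purely my own illustration:

```javascript
// Sketch: score an API's "openness" as the fraction of checklist
// dimensions it satisfies, rather than a binary open/closed label.
// Flat weighting is illustrative; a real assessment would weigh these.
function opennessScore(api) {
  const dimensions = ["discovery", "portal", "docs", "paths", "enumerators"];
  const met = dimensions.filter(d => api[d] === true).length;
  return met / dimensions.length;
}

const score = opennessScore({ discovery: true, portal: true, docs: false, paths: false, enumerators: false });
console.log(score); // 0.4
```

Even a crude score like this forces the conversation past "we have an open API" toward which specific dimensions are actually open.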


API Evangelist

People Are Aware Of Public APIs But Less Aware Of Mobile APIs

13 Nov 2019

After some time in DC talking API governance I’m reminded that the “normals” are increasingly aware of public APIs, being able to actively discuss Facebook, Twitter, and other APIs, but are still very unaware of the larger mass of APIs that exist behind the mobile applications we are all dependent on. I don’t blame them, as many of the API providers I talk to who develop mobile APIs often do not fully see them either, leaving mobile APIs often unsecured, and operating in the shadows. Public APIs are easy to talk about, and I’m glad we’ve made progress in helping the “normals” be more aware, but we need to also be investing in more storytelling that helps bring mobile APIs out of the dark. Most people, developers or non-developers, just see a mobile application when looking at the icons available on our mobile devices. When I look at them, I just see APIs. The mobile applications are just a Hollywood facade, and it is APIs that move data and content back and forth between mobile phones and the platforms behind each application. It is APIs that broadcast our locations, access our cameras, and use our microphones and speakers. APIs are the plumbing behind the applications we depend on, but do not receive much of the conversation and attention when it comes to privacy, security, and the observability of our personal and professional data resources—this is a problem, and we need to invest more in helping educate folks about the API pipes just below the surface. Even when APIs are hidden behind mobile applications they are still public APIs. Even though these APIs may not be published via a public developer portal, they still use public DNS and are accessible simply by running a mobile application through a proxy. These APIs should be receiving the same amount of attention, scrutiny, and auditing as any other public API. We should be having open discussions around...[Read More]