The API Evangelist Blog

API Evangelist

The Layers of the API Specifications, Definitions, and Schema Onion

18 Apr 2020

I struggle with using the right words in my API storytelling, striking a blend between what people are saying across the sector and what people should be saying. There are many words and phrases in the space that help describe what it is we do, while there are others that confuse more than they describe anything in particular. Mostly I struggle because all of this API stuff can be complicated and very abstract, but also because I can be a little dyslexic at times, seeing some words as interchangeable, depending on what day it is. To help me (once again) think through the world of API definitions, I wanted to riff off of my talk from the AsyncAPI virtual conference this week and peel back the layers of the API specifications, definitions, and schema onion. The words API specification, definition, and schema are often used interchangeably in API discussions, but there are some realities on the ground when working with these artifacts that can increase the friction across operations if we allow them to be used interchangeably. It is pedantic as hell to want to write a story about the nuance of these terms, but if it helps me be more precise in my work, I’ll do it. To help illustrate the dimensions here, I wanted to highlight the artifacts around the Slack API that I am using for my talk next week.

Slack Web API OpenAPI - The OpenAPI for the Slack Web API defines the surface area of this single HTTP API instance.
Slack Events API AsyncAPI - The AsyncAPI for the Slack Events API defines the surface area of this real time HTTP API instance.

These two artifacts describe the surface area of specific APIs, leveraging two open source API specification formats, but also adopting a third API specification format that these two specifications use to describe the underlying schema being used as part of the structure for request and response, message,...[Read More]
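To make the layering more concrete, here is a minimal sketch of the onion's layers expressed as Python dictionaries: one JSON Schema object shared by both an OpenAPI and an AsyncAPI definition. The message schema and API names are hypothetical examples, not taken from the actual Slack definitions.

```python
# The innermost layer: a JSON Schema describing the underlying data.
message_schema = {
    "type": "object",
    "properties": {
        "channel": {"type": "string"},
        "text": {"type": "string"},
    },
    "required": ["channel", "text"],
}

# An OpenAPI definition describing the surface area of an HTTP API,
# referencing the shared schema for its response body.
openapi_definition = {
    "openapi": "3.0.0",
    "info": {"title": "Example Web API", "version": "1.0.0"},
    "paths": {
        "/messages": {
            "get": {
                "responses": {
                    "200": {
                        "description": "A list of messages.",
                        "content": {
                            "application/json": {
                                "schema": {"$ref": "#/components/schemas/Message"}
                            }
                        },
                    }
                }
            }
        }
    },
    "components": {"schemas": {"Message": message_schema}},
}

# An AsyncAPI definition describing the surface area of an event-driven
# API, reusing the exact same schema for its message payload.
asyncapi_definition = {
    "asyncapi": "2.0.0",
    "info": {"title": "Example Events API", "version": "1.0.0"},
    "channels": {
        "message": {"subscribe": {"message": {"payload": message_schema}}}
    },
}
```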


API Evangelist

We Should Have Built an API First

15 Apr 2020

It has been a while since I wrote a simple breakdown of why APIs matter, but also why API-First matters. I went down the API-First rabbit hole with API-First [Design || Code] and API-First [Business], but I haven’t just made the basic case of why we should have built an API in the first place. It really isn’t a concept you fully grasp until you’ve made the very costly mistake of being API-Last several times over, so it makes sense to break things down into a single blog post to help folks [hopefully] learn from, without going down the same paths I’ve been down in my application development career. To help on-board folks with what I mean when I say API-First, let’s recap how we got here with a simple fictional product API story.

One Product Catalog With Multiple Destinations

Products are a ubiquitous resource in an online world. By 2000, if we were selling products in the real world, we also began needing to publish those products to a website, which by 2005 would morph into a mix of database-driven web applications. However, by 2010, we also needed to have the same products available in our mobile applications, resulting in a handful of channels we needed our products available on.

Websites - Public websites that a product catalog needs to be published to for consumption.
Web Applications - Specific web applications that need access to our product catalog.
Mobile Applications - Making sure the product catalog is available via iPhone and Android phones.

In 2020, these channels are morphing into a suite of single page and static applications that work across web and mobile properties, but the need for a single set of content or data to make available across these channels hasn’t changed. By 2010, most companies were realizing that HTTP APIs were the most effective way to deliver content and data across applications, and by 2020 this has emerged as the reality for...[Read More]


API Evangelist

Growth in the Number of API Collections Over the Last Five Years

15 Apr 2020

There are surprisingly few meaningful API numbers to showcase across the API sector. There are few API providers or API service providers who have a view of the landscape that can produce meaningful numbers, and most prefer to keep their numbers close to their chests for a variety of reasons. Over the last decade we've all grown accustomed to the ProgrammableWeb hockey stick chart showing the number of public APIs added to the PW directory, due to the ubiquitous graphic being used in conference talks and blog posts since before 2010. I've always been a big advocate of API service providers developing and sharing their data, something that hasn't diminished since I joined Postman. Postman has a pretty unique view of the API landscape, possessing many interesting data points. The company is working on a strategy for compiling, organizing, publishing, and sharing the data it has in a thoughtful way, something that you can see trickling out through a variety of slides from recent events like the Postman Galaxy Tour, and Postcon last year. Like this one showing the growth in API collections, topping out at 34.9M collections published in 2020. The important thing with this visual is that it reflects both private and public APIs. This graphic is just one of many data points Postman possesses that I am working to encourage the company to package up, so that others can reference and use it, much like the ProgrammableWeb public API chart over the last decade. Visuals like this are important for all of us to make sense of what is going on, and to truly see the scope of what is happening. I'd love to see other API service providers share their numbers, but like Postman, I understand it is difficult to do in a meaningful way that respects the privacy of your users, and the interests of your investors. I will keep pushing for more observability into the API community through the Postman lens, and hopefully produce some interesting visuals...[Read More]


API Evangelist

Real Time Email Notifications About API Deprecations Down the Road

13 Apr 2020

I got an email from GitHub after firing up an older Postman collection I had. The collection was originally engineered to just pass in a GitHub token using a query parameter, which historically has been accepted, but is something that will be going away soon. It makes sense, and while query parameters are much easier for authentication, using headers is just a more logical and secure way to pass your tokens in with each API call. The token usage itself isn't what caught my attention, what gave me pause was the usage of real time email to notify users of features they are currently using which will be going away in the future. Here is the email I got from GitHub about my usage of the access token query parameter that is being deprecated:

Hi @kinlane,

On March 24th, 2020 at 03:55 (UTC) your personal access token ([TOKEN NAME]) using [USER AGENT] was used as part of a query parameter to access an endpoint through the GitHub API:

https://api.github.com/search/repositories

Please use the Authorization HTTP header instead, as using the `access_token` query parameter is deprecated. If this token is being used by an app you don't have control over, be aware that it may stop working as a result of this deprecation.

Depending on your API usage, we'll be sending you this email reminder on a monthly basis for each token and User-Agent used in API calls made on your behalf. Just one URL that was accessed with a token and User-Agent combination will be listed in the email reminder, not all.

Visit https://developer.github.com/changes/2020-02-10-deprecating-auth-through-query-param for more information about suggested workarounds and removal dates.

Thanks,
The GitHub Team

I like this type of communication from API providers. I think this is a nice addition to any API management solution, where you could flag any element of an API, and when analytics reveal a developer using that feature, a transactional email is sent off to the user. Allowing API providers to be more organized about how they plan for deprecations,...[Read More]
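For anyone untangling this on their end, here is a hedged sketch of the change GitHub is asking for, using Python's requests library: moving a personal access token out of the query string and into the Authorization header. The token value is a placeholder.

```python
import requests

token = "PERSONAL_ACCESS_TOKEN"  # placeholder, not a real token

# Deprecated style: the token travels in the query string, where it can
# end up in logs and browser histories.
# requests.get(
#     "https://api.github.com/search/repositories",
#     params={"q": "postman", "access_token": token},
# )

# Preferred style: the token travels in the Authorization header.
response = requests.get(
    "https://api.github.com/search/repositories",
    params={"q": "postman"},
    headers={"Authorization": f"token {token}"},
)
print(response.status_code)
```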


API Evangelist

Establishing an API-First Reference Implementation

06 Apr 2020

I do a lot of API blah blah blah’ing about abstract technical concepts. Sometimes I am able to craft a coherent narrative around some complex technology goings on, but most of the time I am just practicing. Workshopping different concepts until I find one that will make an impact on the way folks see APIs. One of the challenges I face in my storytelling is that I operate too far into the abstract, and don’t make the rubber meet the road enough. Another challenge I face is going too far down the rabbit hole with a particular company’s implementation, which usually turns down the volume significantly on my storytelling, because most companies, organizations, institutions, and government agencies aren’t equipped to be 100% transparent about what they are doing. After a decade of storytelling exploration I find that operating somewhere in between the abstract and the real world is the best place to be, resulting in me desiring a reference implementation that I could use as an anchor for my storytelling, helping keep me grounded when it comes to how I talk about APIs.

Welcome Union Fashion

One of my co-workers at Postman had created a fictional company called Union Fashion when he started working on our solutions engineering team, but hadn’t put much more work into the project since. When I heard about it, it sounded exactly like what I was looking for. An e-commerce reference implementation that a wide audience could relate with, providing us with a model API implementation that we could use across webinars, workshops, and other storytelling channels. I’m big on building on the work of others, so I adopted Union Fashion, and I am working to define the fictional company as an API-First approach to operating a common real world business.

Products - (Repo) (Docs) - Defining all of the products that Union Fashion offers.
Orders - (Repo) (Docs) - Allows for the ordering of Union Fashion products online.
Baskets - (Repo) (Docs) - Allows...[Read More]


API Evangelist

Crowdsourcing COVID-19 Testing Location Data

03 Apr 2020

I have been heads down working on resources for Postman's COVID-19 response, pulling together a variety of COVID-19 data and information that developers (and non-developers) can put to use when trying to make sense of what is going on around us. Identifying existing API resources that were available, while shining a light on the hard work of others, was the first wave of our response, but along the way we identified some areas where there were no existing APIs, and felt there was an opportunity to step in. So I got to work on developing a couple of proof of concepts (POCs) that we could rally around as a company, and further contribute to the COVID-19 / Coronavirus fight. One of the POCs that came out of this work was an idea for crowdsourcing COVID-19 testing location data, resulting in a pretty interesting blueprint for making data available as APIs, which could be used for a variety of open data efforts—not just COVID-19 testing locations.

Framing the COVID-19 Testing Location Problem

Before I dive into what I built, let me talk a little about how I landed on this being a problem in the first place, which is an important first step in any technological response to a real world problem. I was listening to the regular highlighting of drive-through COVID-19 testing locations during the press conferences coming out of this administration, and I was seeing or hearing it in the news I digest each day. Recognizing that the availability of COVID-19 testing locations was a politically charged topic, I wanted to better understand where we could or should go if we had Corona symptoms. I don’t have a doctor, even though I have health insurance, so I really have no idea where to go if I came down with it. I have anxiety about where I would go in my community if I came down with symptoms, and I can imagine that other...[Read More]


API Evangelist

COVID-19 Data and Information

26 Mar 2020

When it comes to coping with the stressful world unfolding around us I like to lose myself in my work. Data and APIs are a great way to tune out the world and keep myself busy while in isolation. Like most other technosolutionists I want to do some good in this crazy time, even if I don’t quite fully know what that means. So, to help me define what that means I sat down and began scratching at what was already occurring across the landscape. Identifying what sources of data were available out there, and what types of information were available that would truly make a difference in everyone’s world--not make it worse.

Informational API Collections

To begin I wanted to better understand where the top sources of information were, so I began documenting who the most relevant government agencies were in the COVID-19 conversation, going directly to the source of information at the highest levels.

Centers for Disease Control and Prevention (CDC) (Website) (Collection) - A simple collection for pulling information from the CDC.
European Centre for Disease Prevention and Control (ECDC) (Website) (Collection) - A simple collection for pulling information from the ECDC.
World Health Organization (WHO) (Website) (Collection) - A simple collection for pulling information from the WHO.

This seemed to reflect the authoritative resources available to me, so I got to work defining how each of these agencies shares information, mapping out the top channels I could profile as a Postman collection, aggregating relevant information, and then allowing it to be pulled manually or in some automated way.

Twitter - Each agency uses Twitter as a way of providing updates.
YouTube - Each agency uses YouTube to publish video resources.
RSS / Atom Feeds - Each agency provides RSS feeds of info.

I created a Postman collection for each agency; all someone has to do is enter their Twitter and YouTube API authentication, and they are up and running pulling data from across each of the agencies. My goal...[Read More]


API Evangelist

The Official Cloudflare API Postman Collection

12 Mar 2020

I use Cloudflare for my DNS. I like the threat protection they offer, the dead simple DNS management, and their robust API. I automate the management of a handful of my domains, providing maintenance for 100+ of my API life cycle sites, a couple hundred API landscape sites, and the range of tooling, APIs, and other side projects I have. I’ve written several times about how Cloudflare weaves their API into their UI, so I am happy to write about their new Postman collection for the complete Cloudflare API. The Cloudflare API Postman collection provides 447 individual API requests organized into different folders, making the entire surface area of the API much easier to navigate and make sense of. The Cloudflare API Postman collection gives you quick access to working with Users, Accounts, Organizations, Zones, DNS, Certificates, Workers, Firewalls, Load Balancer, Logs, and other essential infrastructure assets. I use about 1/20th of the valuable API resources Cloudflare provides, but I couldn’t operate my infrastructure like I do without it. I have added the Cloudflare API Postman collection to internal Postman workspaces, allowing me to automate more of my API infrastructure work. I haven’t expanded my usage of the Cloudflare API because I haven’t had time to kick the tires and learn about what is going on. With the Cloudflare API Postman collection I am able to quickly play around with different APIs and learn more about what is possible. I’ve been wanting to play with Cloudflare Workers for some time, and to think more about how I can use them to deliver or consume APIs at the edge, and the Cloudflare API Postman collection makes it easier for me to make the time to learn more about how it all works. I’d love to see the Cloudflare API Postman collection get added to the Postman API Network. If you work at Cloudflare and are in charge of maintaining the Postman collection, all you have to...[Read More]
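If you want to hit the API directly rather than through the collection, here is a minimal sketch using Python's requests library against Cloudflare's documented v4 endpoint for listing zones. The token value is a placeholder, and the response handling assumes Cloudflare's standard envelope with a result array.

```python
import requests

API_TOKEN = "CLOUDFLARE_API_TOKEN"  # placeholder, created in the Cloudflare dashboard

# List the zones (domains) on the account using the v4 API.
response = requests.get(
    "https://api.cloudflare.com/client/v4/zones",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
)

# Cloudflare wraps payloads in an envelope with a "result" array.
for zone in response.json().get("result", []):
    print(zone["name"], zone["id"])
```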


API Evangelist

A Proof of Concept API Service Tier

12 Mar 2020

If you have followed me over the years you know that I get very frustrated by the access, or lack of access, to APIs, as well as the services and tooling that target the sector. As someone who is perpetually kicking the tires of API providers and service providers, not being able to on-board at all, on-board without my credit card, or encountering one of the many other ways companies introduce friction, leaves me regularly pissed off. So anytime someone makes my world easier, and accommodates my need, desire, and obsession with playing with every damn API tool out there, I have to say something. Today’s example is from my partner in crime Tyk, with their proof of concept option. Tyk acknowledges that most of us might not be ready for pro status—we just need to kick the tires a bit. I love this approach. It’s an evolution of the freemium model that I think is more honest and acknowledges the need to play around before entering a credit card. This isn’t all about getting something for free, or always being ready to pay for a service. This is about me getting access to your service, being able to develop my proof of concept (which I was able to do in < 10 minutes with Tyk), and then justifying the cost of going pro with other stakeholders. Nice work Tyk—definitely what I like to see when playing with any API solution. [Read More]


API Evangelist

API-First [Business]

10 Mar 2020

I am working my way through defining a more precise definition of what API-First means which I can use across my API storytelling and conversations. I workshopped the widest definition possible of what API-First means to me yesterday, and by the end of the day I posted another more precise definition of what API-First means to a more technical crowd, which I dubbed API-First [Design || Code]. Today, I’m once again thinking more about the business side of the conversation, and focusing on what I would like to eventually be a more precise definition of what API-First means to business stakeholders, which I am dubbing API-First [Business]. As I said in my broader definition of API-First, if these conversations aren’t including business stakeholders we are doing it wrong. These people are making many of the decisions around the why and how of the desktop, web, mobile, device, and network applications we are delivering on top of our API infrastructure, so we can’t argue that API-First is a developer-only or purely technical concept. We need business stakeholders also thinking API-First, otherwise our projects will never have the resources they need, and are more likely to fall short in meeting real world business objectives. API-First is not just a developer concept; it is a concept that business and developer audiences should both be aware of. There are separate inner cores to the definition of API-First, one API-First [Design || Code], and the other API-First [Business], which can help bring a more precise definition to the table for each dimension of our operations.

Some Common Business Productivity APIs

To help make this definition a little more real I wanted to actually apply it to a handful of services I am currently working with at Postman alongside business stakeholders. I am working with some very smart, technically savvy folks who aren’t programmers, to understand how I can help them be more effective and efficient in their daily work. Working together,...[Read More]


API Evangelist

What Is API First?

09 Mar 2020

I really struggled with this piece on API-first. It is one of those holistic API advice pieces I am very conflicted about. API-first feels like yet another marketing phrase or latest trend, like microservices. As I have been writing down my thoughts on this over the last couple weeks, my bullshit-o-meter kept going off. Honestly it still is, but I feel like there is enough value here that I can move forward with a story. As my co-worker Joyce rightfully pointed out in a meeting recently, API-first is one of those phrases we regularly throw out there without much assessment, agreement, or real definition of what it means. That is one reason it feels so wrong at times, because I feel like it is one of those feel good things we throw out there, but never really think too deeply about while doing it, or after it fails and we’ve moved on to the next thing. With all of that said, I still believe that API-first can matter, if we, as Joyce points out, actually define what we mean by it. I think there is a lot of misconception about what we mean by API-first, and I’d like to stimulate conversation around what it means, if not just get more precise around how I talk about it. One concern I have about the API-first discussion is that once again it is framed as something that only concerns developers when delivering APIs, and that it is something that business folks shouldn’t worry their pretty little heads about. This is a classic historical technique for dividing and conquering the technology-human paradigm that is spreading across society, and is something I am not interested in perpetuating. So I have broken the API-first discussion down into two main parts, one through the technical lens, and another through what business folks will need to be made aware of as they continue to employ technology as part of their everyday work. Looking Through the...[Read More]


API Evangelist

API-First [Design || Code]

09 Mar 2020

I worked through my thoughts on what API first is, which I consider to be the outer layers of what is going on when we use this phrase. I wanted to focus on the technical and business rift that exists in this discussion first, and now I want to dive into the more technical core of this phrase, and get to the heart of how developers are going to see it. Depending on the type of developer you are, and your exposure to different aspects of the API industry or API operations within your organization, you are going to make different assumptions about what API first is or isn’t. Some will feel it is more just about doing APIs before you build applications, while others are going to see it as being more about going API design first, before you ever write any code. Ultimately I want to establish a definition of API first that is inclusive, and not pushing people out, while also helping me ground how I use the phrase.

Let’s Recap, What Is API-First?

From the previous post, let’s take a fresh look at what API-First is from the vantage point of a more technically inclined stakeholder like an architect, developer, or other IT actor. Setting the stage for how API-First [Code] can be approached.

Before developing a web application, develop an API first.
Before developing a mobile application, develop an API first.
Before developing a device application, develop an API first.
Before attempting any system integration, develop an API first.
Before directly connecting to a database, develop an API first.

Also, Why Do API-First?

Naturally people will ask why. To help flesh out why API-First matters, before we separate API-First [Code] from API-First [Design], let’s look at the benefits of going API-First, so that separating code from design approaches might make a little more sense.

Allow potential stakeholders to communicate about what is needed before applications are actually built.
APIs will reduce...[Read More]


API Evangelist

What Is My API Network

06 Mar 2020

I am working on the vision for the Postman Network. As I do with everything, I want to start with the basic human aspects of what is going on, and then relate them to the more technical, and then business, aspects of it all. Right now, the Postman Network is a listing of teams and individuals who have published Postman collections under a handful of categories. While visiting the network you can browse collections by category or search by keyword, and view the team or individual profile, select the “Run in Postman” button, or view the documentation. My goal is to brainstorm what is next for the Postman Network, but also help define what network means in a world of API collaboration. When it comes to my API network, I like to focus on the meaningful elements, the “person, place, or thing” that makes my world go around. I am not interested in nouns that do not enrich my world, and I am keen to emphasize the humanity of all of this over purely tech for the sake of tech. So what are the nouns that make up my world?

People - While I don’t always like people, because I am one, they tend to be the center of my world. People are the most important building block of my network, and drive what I truly care about when it comes to APIs.
Teams - I engage with a variety of teams as part of my job, both internal to Postman, but also externally across the many different enterprise organizations I am working with. While I have relationships with individuals on the team, I find myself regularly thinking about how to add value to the entire team, and influence how and why they are doing APIs.
Projects - My world is littered with projects. Some projects move forward, while others simmer, and sometimes wither on the vine. Projects usually involve one or many...[Read More]


API Evangelist

The Building Blocks of API Partner Programs

04 Mar 2020

I’m doing a deep dive into partner API research, taking a fresh look at how API providers and service providers are operating their partner programs. I looked through around a hundred partner programs I have indexed, and listed a few of the notable ones below. It can be difficult to study partner API programs because many organizations consider their API program a type of partner program by itself, and then there are also a lot of partner APIs that provide actual services involving partners, offering programmatic access to a variety of partner resources. I’ll be rolling up this research into several other more formal strategies and guides that I will publish as part of API Evangelist, but like I do with all my work, I wanted to publish my notes and research here as I’m working through it.

Purpose

The reasons behind having a partner program, and what value it brings to an organization and its partners. Providing a list of reasons why you will want to invest in a partner program, which you can use to sell the concept to other stakeholders.

Increase Exposure - Providing more exposure opportunities for platform partners.
Increase Skills - Expand upon the skills of partners who are putting a platform to work.
Increase Awareness - Grow the awareness amongst partners about what is possible.
Increase Sales - Making it about the money, and expanding the sales intake for partners.
Drive Communications - Push the platform and its partners to communicate more.
Increase Collaboration - Pushing partners to work together, and with the platform more.
Encourage Usage - Incentivize more usage of the platform and its products and services.
Encourage Adoption - Drive adoption of the platform, pushing partners to depend on it more.
Encourage Syndication - Increase the syndication of content and other branded assets.
Opportunity for Growth - Allow partners to grow by using the platform more.
Protect Users From Unwanted Behavior - Solicit partner assistance to help keep users safe....[Read More]


API Evangelist

Postman API Reference and Capability Collections

04 Mar 2020

Postman collections are a great way to document every detail of an API, defining the host, path, parameters, headers, and body of each API request. Allowing any single API request to be captured as a machine and human readable Postman collection that can be shared and used by any technical or non-technical user. The most common approach to defining a Postman API collection is to document the requests across all available APIs, providing a complete collection of all API requests that can be made, then using that reference to mock, document, test, monitor, and execute individual requests manually or as part of any automated process. However, there are other ways to evolve these requests to ensure that they more closely resemble common business tasks, accomplishing the everyday activities that technical and non-technical individuals need to get done.

An AWS EC2 Reference Collection

An example of a Postman reference API collection can be found in the collections I worked on leading up to AWS re:Invent last December. One of the reference API collections I have been crafting is for Amazon EC2, providing a portable and executable collection of all API requests possible for the cloud compute platform. The AWS EC2 reference collection has over 350 individual requests, providing a dizzying amount of control over delivering, operating, and evolving compute capacity across AWS regions. While this collection is a robust representation of all the available AWS EC2 resources, it will take additional work to understand what is possible, find the specific request needed for any particular integration or application, and populate the request with relevant values to realize any specific business need. It is a great start when it comes to putting AWS EC2 to work, but to make things more usable, it will take a little more work.

An Amazon EC2 Capability Collection

This AWS EC2 reference collection provides a foundation for delivering integrations and applications, and can be used as a seed for a different...[Read More]


API Evangelist

Peeling the OpenAPI-Driven API Life Cycle Collaboration Onion

03 Mar 2020

I am trying to better understand how we all work together to deliver and consume APIs. Fleshing out more meaning behind some of the common words we use in the space such as collaboration, platform, hubs, workspaces, feedback loops, comments, sharing, notifications, and other communication channels. I want to push my thoughts forward on what the gears of API collaboration are, and how we can better work together to move many different APIs forward as provider and consumer. API collaboration isn’t very straightforward, and in my mind there are several layers to how things are actually playing out across the API landscape. This is my best attempt at breaking things out into different buckets to help us make sense of how we are working together to move API infrastructure forward at the organizational and industry level.

Layer One - Single OpenAPI Management

In 2020, OpenAPI has won the great API specification wars of the previous decade. OpenAPI is helping individual developers and architects more efficiently define and design their APIs, using the core objects of the API specification as our guide. Providing us with a box of gears we can assemble to define the floor of our digital factories putting out the digital products and services we provide to our customers each day.

Info - Helping manage the name and description for OpenAPI definitions.
Contact - Helping integrate and manage contact info as part of wider team management.
License - Helping manage the licensing for the APIs being defined.
Server - More management for available servers (ie. mock, development, production, etc.).
Server Variables - Helping manage server variables as part of environment management.
Paths - Helping manage the design and definition of API paths.
Operation - Better operation management (verbs, summary, description, operationIds, etc.).
Parameter - Helping more consistently name and define query and path parameters.
Headers - Being more deliberate and aware about how headers are defined and used.
Request Body - More tools for...[Read More]
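To ground the list above, here is a minimal sketch of where each of these objects lives inside a single OpenAPI 3.0 definition, expressed as a Python dictionary. The API name, URLs, and values are hypothetical, just to show the shape.

```python
# A minimal OpenAPI 3.0 definition showing where the core objects live.
# All names and URLs are hypothetical examples.
openapi = {
    "openapi": "3.0.0",
    "info": {                                          # Info
        "title": "Products API",
        "description": "Manage the product catalog.",
        "version": "1.0.0",
        "contact": {"name": "API Team", "email": "api@example.com"},  # Contact
        "license": {"name": "Apache 2.0"},             # License
    },
    "servers": [                                       # Server
        {
            "url": "https://{environment}.example.com/v1",
            "variables": {                             # Server Variables
                "environment": {"default": "api", "enum": ["api", "mock", "dev"]}
            },
        }
    ],
    "paths": {                                         # Paths
        "/products": {
            "get": {                                   # Operation
                "operationId": "listProducts",
                "summary": "List products",
                "parameters": [                        # Parameter
                    {"name": "limit", "in": "query", "schema": {"type": "integer"}}
                ],
                "responses": {"200": {"description": "OK"}},
            }
        }
    },
}
```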


API Evangelist

The Technology, Business, and Politics of the OpenAPI Conversation

02 Mar 2020

I was pondering a tweet from Aidan Cunniffe (@aidandcunniffe) over at Optic the other day. He was expressing what he says is a “controversial opinion that keeps getting backed up by conversations. Each version of OpenAPI and JSON Schema map to ~15 versions. All the implementations by vendors, cloud providers, and open source libs implement a useful (but not always the same) subset.” I don’t think it is a controversial opinion at all, I think he points to a pretty critical deficiency in our belief around APIs and specifications like OpenAPI. Something that begins with the specification itself and how it evolves, but as Aidan points out, echoes out through API service and tooling providers, and then also across the API providers themselves who put the OpenAPI specification to work as part of their own operations. On Twitter, Aidan continues with, “what is the point of a data-spec if it's not enforceable the same way, everywhere? We have to acknowledge that there's no one spec (a versioned markdown file doesn't count), there's 15, 20, 30, 50 of them in the wild today -- and that's blocking teams from using tooling end-end.” He then continues by suggesting “a wasm reference implementation that every vendor and lib could drop-in and link to across programming languages might actually solve this problem and truly enable end-end use of OpenAPI I'd make this objective #1 for 2020 if I had the keys. I just have the tweets :)”. Makes sense to me, and I’d say it is something that the OpenAPI community should adopt. Honestly, and I’ve made the argument before, I think the OAI should be investing in stabilizing core OpenAPI tooling, going beyond just the spec.

Technical Solutions Require Business and Industry Political Understanding

I support the technical solution Aidan puts forward, and would love to see investment across multiple providers to make it happen. However, I think we will need to better understand the business and politics of it all to see the change we want—consistent support of...[Read More]


API Evangelist

Design and Build APIs with Postman

25 Feb 2020

I am doing more talks and workshops within enterprise organizations educating teams about designing and building APIs, helping Postman customers be more successful in not just using Postman, but in defining, designing, delivering, supporting, and evolving high quality APIs using Postman. 90% of the teams I work with are still build-first when it comes to delivering API capabilities across the enterprise, so we are invested in helping bring that number down. Empowering teams to go API-first when it comes to designing and building their APIs, moving beyond the more costly approach of writing code first, and developing healthier practices that involve business and technical stakeholders in the process. It is natural for developers to want to roll up their sleeves and begin coding to deliver an API. It is what they are trained to do. However, it makes a lot more sense to involve business stakeholders earlier on in the process, and avoid the costly, isolating, and more time intensive approach of purely approaching APIs as writing code. Postman has been working internally, and with our most engaged customers, to better define an API-first workflow involving the following stops along the API life cycle:

APIs Builder - On the Postman platform, all APIs begin with the new APIs tab—the beta implementation of being able to manage the API life cycle within Postman.
Create - You can create a new API by starting fresh, or importing an existing API definition in the OpenAPI, RAML, or GraphQL formats, and use it as the definition for each new API.
Definition - To change the design of an API, you can directly edit the OpenAPI, RAML, or GraphQL definition, manipulating the design of the API and the underlying schema.
Mock - With an API definition you can then mock each API, providing a virtualized representation of each path, with examples returned as mocked responses.
Environment - Defining key / value pairs and globals that can be used...[Read More]


API Evangelist

Managing API Secrets Using Postman Environments

24 Feb 2020

Postman environments are machine readable definitions of design, development, staging, and production environments that can be used across API operations. When used properly they contain the keys, tokens, and other secrets needed for authorizing each individual API request, or collection of API requests. Making them an excellent place to begin getting more organized about how API secrets are applied, managed, and audited across teams. Secrets can also be littered throughout Postman collections, but when collections and environments are used properly, developers should be isolating secrets to environments, helping make sure Postman collections contain the technical details of the surface area of an API, while the unique values applied to each API are actually present as part of well defined Postman environments. Providing the opportunity for managing and governing how API secrets are being applied and stored by developers, and opening up the opportunity to use Postman as part of wider API governance efforts. Environments are an essential building block to be considered as part of a wider API governance strategy. Like Postman collections, environments will need the greatest amount of governance to inject the most observability, reliability, and security across API operations. When used right, Postman environments help isolate and standardize how secrets, PII, and other sensitive information are used across the delivery and integration of APIs. Allowing for centralized control over environments by leveraging Postman for the management of environments through the interface and the API.

GUI - Managing all of the environments in use with the Postman web interface.
All Environments - Manually manage all of the environments in use using the central Postman web interface, allowing any member of governance to audit how environments are being used.
API - Automating the management of environments using the Postman API, opening up the opportunity for auditing, managing, and enforcing governance at scale across the environments being applied by all enterprise teams engaging with API operations.
All Environments - Programmatically pulling all environments via the Postman API, so that they can be evaluated...[Read More]
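As a rough illustration of the API-driven side of this governance, here is a hedged sketch that pulls every environment via the Postman API and flags values whose key names look like credentials. It assumes the documented envelope of the /environments endpoints, and the API key is a placeholder.

```python
import requests

HEADERS = {"X-Api-Key": "POSTMAN_API_KEY"}  # placeholder key
BASE = "https://api.getpostman.com"

# Pull the list of environments, then inspect each one for values whose
# key names suggest a credential may be stored in plain text.
environments = requests.get(f"{BASE}/environments", headers=HEADERS).json()

for env in environments.get("environments", []):
    detail = requests.get(f"{BASE}/environments/{env['uid']}", headers=HEADERS).json()
    for value in detail.get("environment", {}).get("values", []):
        key = value.get("key", "").lower()
        if any(word in key for word in ("key", "token", "secret", "password")):
            print(f"{env['name']}: {value['key']} looks like a secret to audit")
```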


API Evangelist

Content Negotiation for APIs and the Web

24 Feb 2020

APIs often seem like another one of those very technical acronyms that only the most technical people will care about. If you don’t aspire to be a software developer, why should you ever care about application programming interfaces (APIs)? To push back on this notion I regularly push myself to make APIs more accessible to business users. I feel it is important that anyone who uses the web daily as part of their professional career should possess a working understanding of the tools they depend on, and have a grip on how APIs aren’t some add-on to the World Wide Web we depend on each day--understanding that APIs and the web are one and the same. Over the last twenty years the web has become a fundamental aspect of how we do business online, and APIs are just the latest evolution of how the web is being put to use as part of the digital transformation businesses are going through across every business sector today. The World Wide Web, commonly known as the Web, is an information system where documents and other web resources are identified by Uniform Resource Locators, which may be interlinked by hypertext, and are accessible over the Internet—with “documents and other web resources” being the bridge between APIs and the web. If you are using the web you can use APIs, as long as you understand one of the fundamental building blocks of the web--that you can negotiate “documents and other web resources” in the following information formats.

Hyper Text Markup Language (HTML) - Each time you use the web you are getting and posting HTML documents using the Internet. HTML is a machine readable format that renders each web page you view, helping make it easier for humans to read in a variety of languages. While you may not write HTML or directly “read” HTML, you are using HTML each day as you make your way around to different web...[Read More]
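Content negotiation is easy to demonstrate in a few lines. Here is a hedged sketch using Python's requests library, assuming a server that honors the Accept header; the URL is a hypothetical example, since not every site negotiates both formats.

```python
import requests

url = "https://example.com/products/123"  # hypothetical resource

# Ask for the same resource as HTML (what a browser does)...
as_html = requests.get(url, headers={"Accept": "text/html"})

# ...and as JSON (what an API client does).
as_json = requests.get(url, headers={"Accept": "application/json"})

# A negotiating server answers with a different representation each time.
print(as_html.headers.get("Content-Type"))
print(as_json.headers.get("Content-Type"))
```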


API Evangelist

The Caltech University API Landscape

19 Feb 2020

I regularly take a look at what different universities are up to when it comes to their APIs. I spent two days talking with different universities at the University API summit in Utah a couple weeks back, and I wanted to continue working my way through the list of schools I am speaking with, profiling their approach to doing APIs, while also providing some constructive feedback on what schools might consider doing next when it comes to optimizing API delivery and consumption across campus. Next up on my list is Caltech, who we have been having conversations with at Postman, and I wanted to conduct an assessment of what the current state of APIs is across the school. The university reflects what I see at most universities, meaning there are plenty of APIs in operation, but not any visible organized effort when it comes to bringing together all the existing APIs into a single developer portal, or centralizing API knowledge and best practices around the interesting API things going on across a campus, and with other partners and stakeholders.

APIs in the Caltech Library

When it comes to APIs at the university level the first place to start is always the library, and things are no different at Caltech. While there is no official landing page specifically for APIs, the Caltech library has a GitHub page dedicated to a variety of programmatic solutions https://caltechlibrary.github.io/, but you can find many signals of API activity behind the scenes, like this API announcement that the write API is available https://www.library.caltech.edu/news/write-api-now-operational. You can also find an interesting case study on how the library is using APIs provided by an interesting API provider called Clarivate, which I will be looking to understand further. As with every other university, there is a huge opportunity for Caltech to be more organized and public about the API resources offered as part of the library--even if it isn't widely available to...[Read More]


API Evangelist

API Interrogation

14 Feb 2020

I was doing some investigation into how journalists are using APIs, or could / should be using APIs. After some quick Googling, Binging, and DuckDuckGoing, I came across a workshop by David Eads of ProPublica Illinois, called A Hitchhiker's Guide to APIs. As I began reading, I was struck by how well it captured not only the usage of Postman in journalism, but also what Postman does in general, in a single precise sentence: “In this hands-on session, you will use Postman to interrogate a web API.” That is how I use Postman. That is why 10 million developers use Postman. APIs are how we can interrogate the digital world unfolding around us. It is increasingly how we can interrogate the digital world emerging across our physical worlds. I like the concept in general, but definitely think it is something I should explore further when it comes to journalism and investigative storytelling. Postman provides a pretty powerful way to get at the data being published by city, county, state, and federal governments. It also provides a robust way to get at the social currents flowing around us on Twitter, Facebook, LinkedIn, and other leading platforms. Postman and APIs provide technical and non-technical users with what they need to target a source of data or content, authenticate, and begin interrogating the source for all relevant information. I find that interrogating a startup is best done via their own API, as well as their digital presence via Twitter, LinkedIn, GitHub, Stack Overflow, Facebook, Youtube, and Instagram using APIs, over speaking with them directly. I find that interrogating a federal agency is often only possible through the datasets it publishes, providing me with a self service way to understand a specific slice of how our society works (or doesn’t). While I can interrogate companies, organizations, institutions, and government agencies using their websites, I find that also being able to interrogate their platform,...[Read More]


API Evangelist

All of the Discussions from the BYU API University Workshop in Utah

12 Feb 2020

I went to Provo, Utah a couple weeks ago and participated in the sixth annual Brigham Young University (BYU) University API Workshop. I was the keynote opener for the first edition of the conference, and I was the same for this sixth edition of the event, which brings together many different universities to talk about API usage across their campuses. When the event began it was primarily BYU staff, but it has expanded to include administrators and faculty from what I counted to be over twenty other universities from across the United States--making for a pretty interesting mix of conversation from higher education API practitioners looking to solve problems, and share their stories of how APIs have helped make an impact on how universities serve students and the public. The University API Workshop is an “unConference Focused on University & Personal APIs & Their Use in Improving Learning”. It brought together around one hundred folks to discuss a wide variety of API topics. Since it was an unconference, everyone pitched their own ideas, with some of them being about sharing API knowledge, while others were about soliciting knowledge from the other attendees. Resulting in a pretty compelling list of sessions spread across two days. You can browse through the sessions using the Google Docs that every session organizer published. Providing a pretty compelling look at how APIs are making an impact at the higher education level, shining a light on the concerns of API stakeholders across the campus.

Session One
Let’s stop using usernames & passwords
User Experience in the API World
Postman Fundamentals
Securing APIs/data with proper authorization

Session Two
Walk, Talk, and API Stalk
API Governance at Scale taking ideas to consistent execution
Mendix (HPAPaaS/Low Code) After a Year at BYU
Our New NGDLE | Open Courses Made With Web Components, Microservices, Docker, CI/CD and more!
DDD vs. BI - Balancing Centralizing and Decentralizing Forces in Data Architecture

Session Three
How do I test...[Read More]


API Evangelist

Postman Governance as the Foundation for Wider API Governance

11 Feb 2020

This is an overview of possible strategies for governing how Postman is used across a large organization. It is common for Postman to already be in use across an organization by individuals operating in isolation using a free tier of access. Governance of not just Postman, but also the end to end API life cycle, begins with getting all developers using Postman under a single organizational team, working across master planned workspaces. If there are concerns about how Postman is being used across an enterprise organization, governance of this usage begins by focusing on bringing all enterprise Postman users together under a single license, and team, so that activity can be managed collectively.

Postman Users

Over the last five years Postman has become an indispensable tool in the toolbox of developers. 10 million developers have downloaded the application and are using it to authorize and make requests to APIs, then debug the responses. The benefit to API operations for the enterprise is clear, but the challenge now for enterprise organizations is to identify their individual Postman users and encourage them to operate under a single pro, team, or enterprise license. Currently users are operating in isolation, defining, storing, and applying secrets and PII locally on their own workstations within Postman, and syncing to the cloud as part of their regular usage of Postman—isolating details about APIs, secrets, potentially PII, and other sensitive data within these three areas.

Personal Workspaces - Storing collections and environments within their localized personal workspaces and individual Postman account.
Personal Collections - Developing API collections in isolation, leaving them inaccessible to other teams, and not reusable across operations.
Personal Environments - Using environments to store secrets, PII, and other data within their localized personal workspaces and individual Postman account.

When it comes to enterprise API governance, observability, and security, the problem isn’t with Postman being used by developers, the problem is that developers are not using Postman together under a single license, across managed, shared workspaces. Putting...[Read More]


API Evangelist

Conducting API Weaponization Audits

11 Feb 2020

I’ve been thinking about chaos engineering lately, the discipline of experimenting on a software system in production in order to build confidence in the system's capability to withstand turbulent and unexpected conditions. I listened to a talk by Kolton Andrus, the CEO of Gremlin, the other day, and my partner in crime at Postman, Joyce (@petuniagray), is an avid evangelist on the subject. So I have been thinking about the concept, how it applies to your average enterprise organization, and the impact it could make on the way we operate our platforms. I don’t think chaos engineering is for every company, but I think there are lessons involved in chaos engineering that are relevant for every company. Similarly, I think we need an equal approach in the area of weaponization, and how APIs can easily be used to harm a platform, its community, and the wider public—a sort of weaponization audit. Let’s take what we’ve learned from Twitter, Facebook, Youtube, and others. Let’s look at the general security landscape, but let’s get more creative when it comes to coloring within the lines of an API platform in unexpected ways. Let’s get women and people of color involved. Let’s focus on ways in which a platform can be abused, using the web, mobile, device, or APIs underneath. I’d like to consider security, privacy, reliability, and observability, as well as out of the box ways to game the system. Let's assume that nobody can be trusted, while recognizing we still need to offer a certain quality of service and community for our intended users. I am guessing it won’t be too hard to hire a savvy group of individuals who could poke and prod at a platform until the experience gets compromised in some harmful way. Like chaos engineering, I’m guessing most organizations wouldn’t be up for an API weaponization audit. It would reveal some potentially uncomfortable truths that leadership probably isn’t too concerned with addressing, and...[Read More]


API Evangelist

The Basics of Working with the Postman API

10 Feb 2020

It is pretty easy to think of Postman as the platform where you engage with your internal APIs, as well as other 3rd party APIs. It doesn’t always occur to developers that Postman has an API as well. Most everything you can do through the Postman interface you can do via the Postman API. Not surprisingly, the Postman API also has a Postman collection, providing you with quick and easy access to your Postman collections, workspaces, teams, mocks, and other essential elements of the Postman platform and client tooling. Providing you with the same automation opportunities you have come to expect from other APIs. API access, integration, and automation should be the default with everything you do online—desktop, web, mobile, and device applications all use APIs. Your API infrastructure is no different. Postman takes this seriously, and works to make sure that anything you can do through the desktop or web interfaces, you can also do via the Postman API--allowing API providers and consumers to seamlessly integrate and automate the Postman platform into their operations by leveraging the following APIs.

Collections - Being able to programmatically create and manage the Postman API collections in use.
Environments - Adding and managing the details of the environments applied across Postman collections.
Mocks - Creating, retrieving, and deleting mock APIs that are generated from Postman collections.
Monitors - Create, update, retrieve, delete, and run monitors that execute Postman collections.
Workspaces - Creating, retrieving, updating, and deleting the workspaces that collections are organized in.
Users - Provides a /me endpoint that allows for pulling information about the API key being used.
Import - Allowing for the import of Swagger, OpenAPI, and RAML API definitions into Postman.
API - Programmatically creating and managing APIs, including versions, schema, and link relations.

These eight API paths give you full control over managing the life cycle of the APIs you are developing, and the integration and automation of the APIs...[Read More]
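As a quick, hedged sketch of what getting started looks like, here is the /me check and a collection listing using Python's requests library against api.getpostman.com. The API key is a placeholder, and the response fields assume the documented envelope.

```python
import requests

HEADERS = {"X-Api-Key": "POSTMAN_API_KEY"}  # placeholder key from your Postman account
BASE = "https://api.getpostman.com"

# Confirm the API key works by pulling information about it via /me.
me = requests.get(f"{BASE}/me", headers=HEADERS)
print(me.status_code)

# List the collections available to the key.
collections = requests.get(f"{BASE}/collections", headers=HEADERS).json()
for collection in collections.get("collections", []):
    print(collection["name"], collection["uid"])
```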


API Evangelist

Standardizing My API Life Cycle Governance

10 Feb 2020

I am working on redesigning all of my base APIs, as well as producing a mess of new ones. As part of the process I am determined to be more thoughtful and consistent in how I design and deliver the APIs. API governance always begins with using API definitions, as you can't govern something you can't measure and track, so having machine readable artifacts is essential. After that, the design of the API is the first place to look when it comes to standardizing each of the APIs coming off the assembly line. Then I am looking to do my best to begin defining, measuring, and standardizing how I do many other areas of API operations, helping me keep track of the many moving parts of doing microservices. To help me govern the life cycle for each API, I am going to be quantifying and measuring as many of the following areas as I can. These are what I consider to be the essential building blocks of each API that I deliver, and since I'm using Postman to not just interact with these APIs once they are in production, I will be using Postman to also deliver and govern each stop along the API life cycle. Using Postman collections to define, deliver, and govern each of these areas, using scripts, runners, and monitors to automate the enforcement of standards and consistency across the APIs I am delivering on a regular basis.

Definitions
OpenAPI - There is an OpenAPI for each individual API.
Collection - There is a Postman collection for each individual API.
JSON Schema - There is a JSON schema for each individual schema.

Design Requests
Base - Ensure the base path is planned.
Versioning - Define how APIs are versioned.
Resource - Evaluate each resource published.
Sub-Resources - Evaluate each sub-resource published.
Methods - Ensure common use of HTTP methods.
Actions - Determine how actions are taken beyond methods.
Path Parameters - Establish common approach for path parameters.
Query...[Read More]
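One hedged sketch of what automating a few of these checks could look like: a small Python routine that lints an OpenAPI definition (loaded as a dictionary) against some of the design rules named above. The specific rules are my own illustrative assumptions, standing in for whatever standards a team actually adopts.

```python
# A minimal governance linter sketch. The rules below are illustrative
# assumptions, not a standard, and it simplifies by treating every key
# under a path as an HTTP verb.
def lint_openapi(openapi: dict) -> list:
    problems = []
    info = openapi.get("info", {})
    if not info.get("description"):
        problems.append("info.description is missing")
    if not info.get("license"):
        problems.append("info.license is missing")
    for path, operations in openapi.get("paths", {}).items():
        if path != path.lower():
            problems.append(f"path {path} should be lowercase")
        for verb, operation in operations.items():
            if not operation.get("operationId"):
                problems.append(f"{verb.upper()} {path} is missing an operationId")
    return problems


# Example run against a deliberately incomplete definition.
example = {
    "openapi": "3.0.0",
    "info": {"title": "Products API", "version": "1.0.0"},
    "paths": {"/Products": {"get": {"responses": {"200": {"description": "OK"}}}}},
}
for problem in lint_openapi(example):
    print(problem)
```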


API Evangelist

Backend AWS API Gateway Integration OpenAPI Extensions

10 Feb 2020

I have spent a lot of time automating my AWS API infrastructure, working to make it so I can automatically deploy API infrastructure to AWS. I am using AWS API Gateway as part of this suite of API deployments, so I have been working hard to understand how AWS speaks OpenAPI as part of their implementation. As part of my work there are three distinct types of APIs I am deploying using AWS API Gateway, each with its own way of extending OpenAPI to describe it.

The Pass Through

Just passing what comes in to an HTTP host and path I give it, and then passing the response back through without any transformations or other voodoo along the way. This is a basic OpenAPI extension for defining a pass through API using the AWS API Gateway.

A DynamoDB Backend

For my basic CRUD databases I am just using a DynamoDB backend because it allows me to quickly launch data APIs that let me Create, Read, Update, and Delete (CRUD) data I am storing in the NoSQL database—providing me with a pretty basic approach to delivering data API infrastructure. Here is the OpenAPI vendor extension for wiring things up using a DynamoDB backend. I like DynamoDB because you can just make API calls to get most of what you need without any sort of business logic or code in between. If I am just looking to manage data using simple web API endpoints, this is what I am doing when it comes to deploying API infrastructure.

Logic with Lambda

I would say that the previous two types of APIs represent the most common implementations I have, but I am working to evolve my infrastructure to take advantage of newer approaches to delivering APIs, like Lambda. Here is the OpenAPI extension for defining a Lambda backend, which I can then wire up to a database and storage, or purely implement some business logic to do what I...[Read More]
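The excerpt references the extensions without showing them, so here is a hedged sketch of the pass through flavor, using AWS's documented x-amazon-apigateway-integration extension expressed as the JSON form of an OpenAPI path (a Python dictionary here). The backend URL is a hypothetical example, and the DynamoDB and Lambda variants follow the same pattern with different type and uri values.

```python
# A sketch of the pass through style: API Gateway proxies the request to
# a backend host untouched. The backend URL is a hypothetical example.
passthrough_path = {
    "/products": {
        "get": {
            "responses": {"200": {"description": "OK"}},
            "x-amazon-apigateway-integration": {
                "type": "http_proxy",             # proxy without transformation
                "httpMethod": "GET",
                "uri": "https://backend.example.com/products",
                "passthroughBehavior": "when_no_match",
            },
        }
    }
}
```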


API Evangelist

API Links For Every UI Element

10 Feb 2020

I’ve showcased Cloudflare's approach to making their API available as part of their user interface several times now. It is a practice I want to see replicated in more desktop, web, and mobile applications, so I want to keep finding new ways of talking about it, and introducing it to new readers. If you sign up for or use Cloudflare, and navigate your way to their SSL/TLS section, you will see a UI element for changing the levels of your SSL/TLS encryption, and below it you see some statistics on the traffic that has been served over TLS over the last 24 hours. Providing you full control over SSL/TLS within the Cloudflare UI. At the bottom of the UI element for managing your SSL/TLS you will see an API link, which, if you click it, gives you three API calls for getting, changing, and verifying the SSL/TLS status of your domain. Providing you with one click access to the API calls behind the UI elements, giving you two separate options for managing your DNS. This is how all user interfaces within applications should be. APIs shouldn’t just be located via some far off developer portal; they should be woven into the UI experience, revealing the API pipes behind the UI at every opportunity. This allows for the automation of any activity a user is taking through the interface using the platform's API. You could also consider embedding a simple Postman collection for each API capability, allowing a user to run it in Postman—to further support this, you could also make a Postman environment available, pre-populated with a user's API key, making execution of each platform capability outside of the platform possible in just one or two clicks. Once each UI capability is defined as a Postman collection it can immediately be executed by a user in a single click. It can also be executed using a Postman runner as part of an existing CI/CD process, or on a schedule using a...[Read More]


API Evangelist

Secrets and Personally Identifiable Information (PII) Across Our API Definitions

27 Jan 2020

As API providers and consumers we tend to have access to a significant amount of credentials, keys, and tokens, as well as personally identifiable information (PII). We use this sensitive information throughout the API integration and delivery life cycles. We depend on credentials, keys, and tokens to authorize each of our API requests, and we potentially capture PII as part of the request and response for each of the individual API requests we execute regularly. Most developers, teams, and organizations I’ve spoken with do not have a strategy for addressing how secrets and PII are applied across the internal and external API landscape. API management over the last decade has helped us as API providers better manage how we define and manage authentication for the APIs we are providing, but no solution has emerged that helps us manage the tokens we use across many internal and external APIs. With this reality, there are a lot of developers who are self-managing how they authenticate with APIs, and work with PII that gets returned from APIs. I am working on several talks with enterprise organizations about this challenge, and to prepare I want to work through my thoughts on the problem, as well as some possible solutions. I wanted to map out how we integrate with the APIs we are developing and consuming, and think about the common building blocks of how we can better define, educate, execute, audit, and govern the secrets and PII that are applied throughout the API life cycle across all of the APIs we depend on. Allowing me to have a more informed conversation about how we can get better at managing the more sensitive parts of our operations. What Are The Types of Sensitive Information? First I wanted to understand the types of common information being applied by API developers, helping me establish and evolve a list of the types of data we are looking for when securing the API...[Read More]


API Evangelist

An Introduction to API Authentication

27 Jan 2020

APIs operate using the web, but like web applications, many APIs require some sort of authentication or authorization before you can access the valuable resources available within each API path. When you open up your APIs on the web you aren’t just giving away access to your resources to anyone who comes along. API providers employ a number of different authentication mechanisms to ensure only the applications and systems who should have access are actually able to make a successful API call. To help refresh the types of authentication available across the API landscape, while also demonstrating the reach of Postman as an API client, I wanted to take a fresh look at authentication to help my readers understand what is possible. Depending on the API provider, platform, and the types of resources being made available you will encounter a number of different authentication methods—here are the 11 that Postman supports, reflecting 90% of the APIs you will come across publicly, as well as within the enterprise organization. They reflect what the API sector employs for authentication of their APIs, as well as what Postman supports as an API client. No Authentication - Like the web, these APIs are publicly available and accessible without any authentication. You can just make a request to a specific URL, and you get the response back without needing any credentials or key. This reflects a very small portion of the API economy, but it is still an important aspect of the overall authentication discussion, and what is possible. API Key - An application programming interface key (API key) is a unique identifier used to authenticate a user, developer, or calling program to an API. However, they are typically used to authenticate a project with the API rather than a human user. Different platforms may implement and use API keys in different ways. Bearer token - A Bearer Token is an opaque string, not intended to have any meaning to clients using...[Read More]
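
To make the first few of these concrete, here are minimal sketches of how each scheme typically shows up on the wire, against a hypothetical endpoint—the header names and values are examples, so always check what your API provider documents.

```javascript
// API Key - often a custom header, sometimes a query parameter.
fetch("https://api.example.com/resources", {
  headers: { "X-API-Key": "YOUR_API_KEY" }
});

// Bearer token - an opaque string passed in the Authorization header.
fetch("https://api.example.com/resources", {
  headers: { Authorization: "Bearer YOUR_TOKEN" }
});

// Basic auth - a base64-encoded username:password pair.
fetch("https://api.example.com/resources", {
  headers: {
    Authorization: "Basic " + Buffer.from("username:password").toString("base64")
  }
});
```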


API Evangelist

Profiling Adobe APIs

23 Jan 2020

As I was profiling APIs on my list of APIs I found myself profiling Adobe. I am moving through the list of companies alphabetically, so you can see how far along I am. Anyways, like any other large company I need to make a decision about how I am going to manage the profiling of different API products and lines of business. Companies like Amazon, Google, Azure, and Adobe have large numbers of APIs, and I always know I will need to have some sort of plan for documenting everything that is going on. With Adobe, I am going to track everything in a single GitHub repository, but will be working to create separate API definitions (OpenAPI and Postman collections) for each of the individual APIs being offered. To provide some context, it helps to understand why I profile APIs in the first place. As the API Evangelist I review public API operations, studying how API providers are doing what they do. I then aggregate the "building blocks" of their public operations into a master set of research that I use to drive my storytelling and API strategy workshops. So, with the Adobe APIs I'm not looking to review their API operations as much as I am looking to understand how they operate, and develop an understanding of how far along they are in their enterprise API journey. As with any profiling of a company, I begin by Googling their name plus API, but then dive as deep as I can into the details of what I find with each click. When you Google Adobe APIs you get this main landing page with the tagline, “APIs and SDKs for all Adobe products – create mobile, web and desktop apps”. You can tell Adobe is working hard to bring together their APIs under one big tent, with the following main areas to support developers: Landing Page - Adobe API landing page. Authentication - Overview of authentication. Open...[Read More]


API Evangelist

Three Ways to Use Postman and Azure DevOps

22 Jan 2020

I set out to understand the role that Postman can play in an Azure DevOps powered API life cycle. I was fully prepared to crash course Azure DevOps, and begin mapping out the role that Postman can play, but before I got started I began Googling Postman + Azure DevOps. I was happily surprised to find a number of rich walkthroughs written by the passionate Postman community--surpassing anything I could have put together for a version 1.0 of my Azure DevOps Postman guidance. I will still work to pull together my own official Azure DevOps Postman walkthrough, but to prepare I wanted to publish a summary of what I have found while thinking about how Postman and Azure DevOps can work together. The Postman Basics Before we get going with what I have found, I wanted to point to a couple of key concepts readers will need to be familiar with before they set out trying to use Postman with Azure DevOps, helping set the tone for any integration. It always helps to start with the basics and not assume all of my readers will understand what Postman delivers. Intro to collections - Getting familiar with what collections are, and how they work. Intro to collection runs - Understanding the nuance of how collections can be run. Intro to scripts - Learning about how to script within the collections being run. It is critical that you have a decent grasp on what is possible with Postman collections, and how they can be applied as part of any CI/CD pipeline. Most developers think of Postman as simply an HTTP client for just making calls to APIs. Once you understand how collections can be run, and the many different ways that scripts can be applied, you will be much more effective at applying them as part of any pipeline, including with Azure DevOps--providing a great place to start. Testing Azure DevOps APIs Using Postman While mapping out this walk...[Read More]
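
As a taste of what those walkthroughs wire together, here is a minimal sketch of running a collection from a pipeline step using Newman, Postman's open source command-line collection runner—the file names are placeholders, and Azure Pipelines can publish the resulting JUnit report as test results.

```javascript
// Sketch: run a Postman collection as a pipeline step with Newman.
const newman = require('newman');

newman.run({
  collection: require('./my-api-tests.postman_collection.json'), // placeholder
  environment: require('./staging.postman_environment.json'),    // placeholder
  reporters: ['cli', 'junit'],
  reporter: { junit: { export: './results/newman-results.xml' } }
}, (err, summary) => {
  if (err) { throw err; }
  console.log(`${summary.run.stats.assertions.failed} failed assertions`);
});
```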


API Evangelist

The State of California Doing APIs The Right Way By Starting Simple

22 Jan 2020

I got introduced to the CA.gov Alpha Team by my fellow government change maker Luke Fretwell (@lukefretwell) the other day, and I am beginning to tune into what they are up to in similar ways to how I’ve done with other city, state, and federal government entities over the years. We kicked off a conversation around their approach to delivering APIs, and what was possible with Postman. After we were done kicking things off they shared some links with me to help me get up to speed on what they have been doing with their new approach to delivering technology across the State of California. As far as first impressions go I am super stoked with their approach. They are starting small, and working hard to be as public as possible with how they are doing everything. The CA.gov Alpha Team gets right down to the core of doing APIs well, by setting up the essential communication channels you need to do APIs well across any small or large organization. GitHub - All of the projects they develop are published to GitHub. Twitter - Providing a social stream from what is happening. Blog - Shaping the narrative around all of the work that is occurring. The CA.gov Alpha Team has not just gone all in on GitHub, they are all about their work truly existing in the public domain. It looks like everything they are doing is first being defined as a GitHub repository, providing a default way for other government stakeholders, as well as the public at large, to stay in tune with what is going on, and even contribute to what is happening. This is how all government should be by default, and the CA.gov Alpha Team provides one possible blueprint for other city, state, and federal agencies to follow. I really like that the CA.gov Alpha Team is seeding and managing everything out in the open on Twitter, and being so vocal about it all with a...[Read More]


API Evangelist

Help Defining 13 of the AsyncAPI Protocol Bindings

22 Jan 2020

I have been evolving my definition of what my API toolbox covers, remaining focused on HTTP APIs, but also making sure I am paying attention to HTTP/2 and HTTP/3 APIs, as well as those that depend on TCP only. My regular call with Fran Méndez (@fmvilas) of AsyncAPI reminded me that I should be using the specification to ground me in the expansion of my API toolbox, just as OpenAPI has defined much of it for the last five years. For this particular round of multi-protocol API toolbox research, the AsyncAPI protocol bindings reflect how I am looking to expand upon my API toolbox. Here are the 13 protocols being defined around the AsyncAPI specification: AMQP binding - This document defines how to describe AMQP-specific information on AsyncAPI. AMQP 1.0 binding - This document defines how to describe AMQP 1.0-specific information on AsyncAPI. HTTP binding - This document defines how to describe HTTP-specific information on AsyncAPI. JMS binding - This document defines how to describe JMS-specific information on AsyncAPI. Kafka binding - This document defines how to describe Kafka-specific information on AsyncAPI. MQTT binding - This document defines how to describe MQTT-specific information on AsyncAPI. MQTT5 binding - This document defines how to describe MQTT 5-specific information on AsyncAPI. NATS binding - This document defines how to describe NATS-specific information on AsyncAPI. Redis binding - This document defines how to describe Redis-specific information on AsyncAPI. SNS binding - This document defines how to describe SNS-specific information on AsyncAPI. SQS binding - This document defines how to describe SQS-specific information on AsyncAPI. STOMP binding - This document defines how to describe STOMP-specific information on AsyncAPI. WebSockets binding - This document defines how to describe WebSocket-specific information on AsyncAPI. Not all of the protocol bindings are fully fleshed out, and AsyncAPI could use help from the community to quantify what is required with each of the protocols. I am going to try and contribute what I can as I make my way through each of the protocols as part of my API toolbox research. I am defining the building blocks for each of the protocols which...[Read More]


API Evangelist

My Upcoming Talk with the UK Government Digital Services (GDS): The API Life Cycle Is For Everyone

21 Jan 2020

I am heading to London in February to talk to the UK government about APIs. They invited me out to talk about my history of work with government in the US and EU, and share my views of the API life cycle. To help share my view of the API landscape I pulled together a talk titled, "The API Life Cycle Is For Everyone". I am hoping to share my view of the fundamentals of a modern API life cycle, as well as emphasize the importance of both developers and non-developers having a place at the table. Here is what I've pulled together for my time with the GDS in London. APIs are widely considered to be something that is exclusively in the domain of software developers. While it is true that APIs are often a very technical and abstract concept which requires a more technically inclined individual to engage, APIs are something that touches everyone across today's digital landscape, impacting both business users and developers, making the API development life cycle something all parties should be educated on, made aware of, and equipped to participate in. As part of my contribution to the GDS talks on interoperability and open standards I’d like to spend an hour with you talking through the human-machine intersection across: API Definitions - Talking about Swagger / OpenAPI, as well as Postman collections and environments, and how they are being put to use. API Documentation - Understanding common approaches to delivering and maintaining documentation for APIs that are being delivered. API Mocks - Thinking about how API mocking can be used to articulate and share what an API delivers for all stakeholders involved. API Testing - Understanding the role that API assertions and testing play in defining the operations and reliability of our API infrastructure. API Management - Looking at how API management secures our APIs, but also helps us develop the awareness of how they are used. API Contracts...[Read More]


API Evangelist

Looking at Electronic Data Interchange (EDI) Reminds Me that the API Economy is Just Getting Started

21 Jan 2020

I am neck deep in the expansion of what I consider to be my API toolbox, and I have been spending time mapping out the world of EDI. If you aren’t familiar with Electronic Data Interchange (EDI), it “is the electronic interchange of business information using a standardized format; a process which allows one company to send information to another company electronically rather than with paper. Business entities conducting business electronically are called trading partners.” EDI is the original API, providing a, “technical basis for automated commercial "conversations" between two entities, either internal or external. The term EDI encompasses the entire electronic data interchange process, including the transmission, message flow, document format, and software used to interpret the documents”. EDI is everywhere, and truly the backbone of the global supply chain, but one that you only hear lightly about as part of the overall API conversation. I have regularly come across the overlap between EDI and API over the last 10 years of doing API Evangelist, and while I have engaged in discussion around modernizing legacy EDI approaches in healthcare and commerce, most other fundamental building blocks of the global supply chain are entirely new to me. Revealing how little I know about the bigger picture of EDI, and how small my API world actually is. I don’t claim to know everything about information exchange and interoperability, but EDI is something that should be a bigger part of my storytelling, and the fact that it isn’t I think is revealing about how much more work we actually have in front of us when it comes to delivering on the promise of the API economy. Take a look at some of the major EDI standards to get a sampling of the scope I am talking about. These are the electronic data interchange standards that have governed commerce before the Internet was around, and continue to define how data moves around. Edig@s (EDIGAS) - The Edig@s...[Read More]


API Evangelist

I Think We Will Have To Embrace Chaos With the Future of APIs

21 Jan 2020

I like studying APIs. I like to think about how to do APIs well. I enjoy handcrafting a fully fleshed out OpenAPI definition for my APIs. The challenge is convincing other folks of the same. I see the benefits of doing APIs well, and I understand the consequences of not doing them well. But, do others? I never assume they do. I assume that most people are just looking to get an immediate job done, and aren’t too concerned with the bigger picture. I think people have the perception that technology moves too fast, and they either do not have the time to consider the consequences, or they know that they will have moved on by the time the consequences are realized. I’m pretty convinced that most of our work on API design, governance, and other approaches to try and standardize how we do things will fall on deaf ears. Not that we shouldn’t keep trying, but I think it helps if we are honest about how this will ultimately play out. If I give a talk about good API design at a tech conference, everyone who shows up for the talk is excited about good API design. If I give a talk about good API design within an enterprise organization and leadership mandates everyone attend, not everyone present is excited, let alone cares about API design. I wish people would care about API design, and be open to learning about how others are designing their APIs, but they aren’t. Mostly it is because developers aren’t given the space within their regular sprints to care, but it is also because people are only looking to satisfy the JIRA ticket they are given, and oftentimes the ticket says nothing about the API being well designed and consistent with other teams. Even with teams that have been given sufficient API design training and governance, if it isn’t explicitly called out as part of the...[Read More]


API Evangelist

Expanding My API Toolbox for the Next Decade

21 Jan 2020

I am continuing to iterate on what I consider to be a modern API toolbox. API Evangelist research is born out of the SOA and API worlds colliding, and while I have been heavily focused on HTTP APIs over the years, I have regularly acknowledged that a diverse API toolbox is required for success, and invested time in understanding just what I mean when I say this. Working to broaden my own understanding of the technologies in use across the enterprise, and realistically map out what I mean when I say API landscape. I am still workshopping my new API toolbox definition for 2020, but I wanted to work on some of the narrative around each of the items in it, helping me learn along the way, while also expanding the scope of what I am talking about. Transmission Control Protocol (TCP) The Transmission Control Protocol (TCP) is one of the main protocols of the Internet protocol suite, and provides reliable, ordered, and error-checked delivery of a stream of bytes between applications running on hosts communicating via an IP network. The Web and APIs both rely on TCP, which is part of the Transport Layer of the TCP/IP suite. SSL/TLS often runs on top of TCP. It is the backbone of our API toolbox, but there are many different ways you can put TCP to work when it comes to the programming interfaces behind the applications we depend on. It can be tough to separate what is a protocol, and what is a methodology when looking at the API landscape. I’m still working to understand each of these tools in the toolbox, and organize them in a meaningful way—which is why I am writing this post. While all APIs technically rely on TCP, these approaches to communication and information exchange are often implemented directly using TCP. Electronic Data Interchange (EDI) - Electronic Data Interchange (EDI) is the electronic interchange of business information using a standardized format;...[Read More]


API Evangelist

DevOps Azure Style

17 Jan 2020

I am spending time thinking more deeply about how APIs can be delivered via Azure. I spent much of the holidays looking at how to deliver APIs on AWS, but only a small amount of time looking at Azure. I'm looking at how Azure can be used for the development and delivery of APIs, trying to understand the different ways you can use not just Azure for managing APIs, but also use Azure APIs for managing your APIs. Next up is Azure DevOps, and learning more about the nuts and bolts of how the orchestration solution allows you to streamline and stabilize the delivery of your API infrastructure using Azure. First, I want to just break down the core elements of Azure DevOps, learning more about how Azure sees the DevOps workflow and how they have provided a system to put their vision to work. Here are the main elements of Azure DevOps that help us understand the big picture when it comes to mapping to your API life cycle. Azure DevOps Server - Share code, track work, and ship software using integrated software delivery tools, hosted on-premises. Azure Boards - Deliver value to your users faster using proven agile tools to plan, track, and discuss work across your teams. Azure Pipelines - Build, test, and deploy with CI/CD that works with any language, platform, and cloud. Connect to GitHub or any other Git provider and deploy continuously. Azure Repos - Get unlimited, cloud-hosted private Git repos and collaborate to build better code with pull requests and advanced file management. Azure Test Plans - Test and ship with confidence using manual and exploratory testing tools. Azure Artifacts - Create, host, and share packages with your team, and add artifacts to your CI/CD pipelines with a single click. Azure DevTest Labs - Fast, easy, and lean dev-test environments. Not every API implementation will use all of these elements, but it is still nice to understand...[Read More]


API Evangelist

A View of the API Delivery Life Cycle from the Azure Getting Started Page

17 Jan 2020

I am working my way through doing more work around the multi-cloud deployment of APIs and spending some more time on the Azure platform here in 2020, and I found their getting started page pretty reflective of what I'm seeing out there when it comes to delivering the next generation of software. When landing on the AWS home page it can be overwhelming to make sense of everything, and I thought that Azure organized things into a coherent vision of how software is being delivered in the cloud. Infrastructure Providing the fundamental building blocks of compute for all of this. Linux virtual machines  Windows virtual machines  I never thought I'd see Linux and Windows side by side like this. Languages Acknowledging there are multiple programming languages to get the job done. .NET  Python  Java  PHP  Node.js  Go Again, I never thought I'd see such strong support for anything beyond .NET. Application This nails the different layers in which I see folks delivering API infrastructure. Web Apps  Serverless Functions  Containers  Microservices with Kubernetes  Microservices with Service Fabric I think it's silly to put microservices there, because APIs are delivered in all of them. Database The database layers behind the APIs we are all delivering across operations. Relational Databases  SQL Database as a service  SQL Database for the edge  SQL Server on Azure  PostgreSQL database as a service  MySQL database as a service  Azure Cosmos DB (NoSQL) Again, I am blown away to see MySQL and PostgreSQL alongside SQL Server. Storage Where you put all of your blobs and other objects used across your APIs. Blob Storage I'd say this layer is a little anemic compared with other cloud environments. Machine Learning Acknowledging that machine learning is a growing area of API deployment. Machine Learning  Cognitive Services  Azure Notebooks This area will continue to grow pretty rapidly in coming years in all industries. Interfaces The ways in which we are interfacing with the software development life cycle. Azure CLI ...[Read More]


API Evangelist

What Is Your API Development Workflow?

16 Jan 2020

I am going to invest in a new way to tell stories here on API Evangelist—we will see if I can make this stick. I enjoy doing podcasts but I am not good at the scheduling and reliable repetition many expect of a podcast. Getting people to join me on a podcast takes a lot of work (I know from experience) to do reliably. People usually want to talk, but finding slots in both of our schedules and getting them to jump online and successfully record an episode isn’t easy to do on a regular basis. However, I still want to be able to craft audio narratives around specific topics that are relevant to the API sector, while also allowing many different voices to chime in. So I’ve come up with a formula I want to test and see if I can build some momentum with. To help stimulate the API conversation and bring in other voices I want to pose a single question on a regular basis and solicit audio responses from folks across the API space, then compile the results into a single podcast that I will publish on the blog and via other channels. All folks need to do in their response to one of my questions is open up their phone, record their response, and send me the resulting audio file via email, DM, or carrier pigeon. Then I will organize all the responses into a single coherent podcast with me opening, asking my question, then chaining together the responses, and closing up with a little analysis. Make sense? A kind of asynchronous podcast conversation amongst several participants. Ok, let’s start with my first question: How do you develop APIs? Describe how you or your team actually develops an API. What is the workflow for how you go from idea to production, and what tools and services are involved? Be honest. I am not looking for fluff or pie in...[Read More]


API Evangelist

My Eventbrite API Keys Were Easy To Find

16 Jan 2020

If you read my blog regularly you know I rant all the time about having to sign up for new APIs and then find my API keys and tokens. API providers excel at making it extremely difficult to get up and running with an API, even once you have read their documentation and figured out what their API is all about. So when I come across API providers doing it well, I have to showcase it here in a blog post. Today’s shining example of how to make it easy to find your API keys comes from the Eventbrite API. I was crafting a Postman API capability collection for my boss the other day, and I needed to find me an API key to get the data I needed out of the Eventbrite API. Finding the API paths we needed to get the event and registration data had already taken us some time, so I was fully expecting the usual friction when it came to finding my API key. Then I clicked on the Eventbrite authentication page, clicked on the link telling me to visit my API keys page, and there they were! No hunting or detective work required—my keys were prominently placed above the fold. Amazing!!! This is how it should be. I shouldn’t have to look around for my key—it is the 2020s. Please stop hiding my keys and making it hard for me to find what I need to get up and running with your API. As you are planning out how to develop and deploy the user experience for the API management layer of your operations, make sure you pick 25 existing public APIs, then sign up and find your keys. Learn from the experience and put your keys at a common URL that is prominently linked from your documentation and authentication page. If you have a favorite API that you think adding an application and finding your keys is the pattern...[Read More]


API Evangelist

API Life Cycle Governance Beyond Just API Design

16 Jan 2020

When you hear enterprise organizations talk about API governance they usually mean the governance of API design practices across the organization. This is the place where everyone starts when it comes to standardizing how APIs are delivered. It makes sense to start here because this is where the most pain is experienced at scale when you try to put APIs to work across a large enterprise organization. Even if all APIs and microservices are REST(ish), there are so many different ways you can deliver the details of an API--you might as well be using APIs from different companies when trying to put APIs developed across different teams to use in a single application. This makes API design the first stumbling block teams consider when planning API governance, and something that can make a meaningful impact on how APIs are delivered. After working with enterprise organizations who have been on their API journey for 5+ years I have begun to see API governance move beyond API design, and begin to look at other stops along the API life cycle, working to standardize other critical elements. Here are some of the next steps I see enterprise organizations taking when it comes to getting a handle on API governance across teams: Documentation - Making sure everyone is using the same services and tooling for documenting APIs, making sure the most common elements are present, and all APIs are well defined. Monitoring - Requiring all teams to monitor APIs and report upon the availability of each API, establishing a common monitoring and reporting practice that is consistent across all development teams. Testing - Standardizing tooling and approaches to API testing, indexing and cataloging the tests that are in place, and beginning to measure the test coverage for any API in production. Performance - Looking at the speed of APIs and making sure that all APIs are benchmarked as soon as they are developed, then measured against that across multiple...[Read More]


API Evangelist

Eventbrite Events with Order Count and Capacity Using the API

15 Jan 2020

My boss asked me if I could build a Postman collection that would pull our future events from Eventbrite and display ticket counts for each individual event. So I got to work hacking on the Eventbrite API, learning each of the event API paths, stitching together what I needed to pull together my Postman collection for this new API capability. I’m a big fan of not just creating reference collections for different APIs like the Eventbrite API, but also creating individual capability collections that use one or many API requests to deliver on a specific business objective. I was able to craft my Postman API capability collection using two Eventbrite APIs, getting me the data I needed to give my boss the updates he was looking for. Events By Organization - Pulls all of the future active events for our Eventbrite organization. Event Orders - Pulls the orders for each individual event, pulling the relevant information needed to assess each event. This Eventbrite event order Postman capability collection only has one request in it, but I call the second API multiple times using a test script for the request. So in the end I’m making multiple API calls using a single Postman request, allowing me to get at what I need for each future event across multiple APIs--abstracting away some of the complexity. I have published the collection as a Postman template which you can access via the Postman documentation I’ve published, but you will need to add your own Eventbrite token and organization id to actually execute it. Once you have these properties entered you can click send and see a listing of events with ticket counts as well as maximum capacity for all the future events using the Postman visualizer tab. I’ve added this Postman capability collection to my list of individual API collections I’ve been building. Providing a list of the common things I need to accomplish across the different platforms I depend on for my...[Read More]
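
Here is a sketch of the test script pattern described above—one request pulls the events, then pm.sendRequest() calls the orders API once per event. The paths and fields reflect my reading of the Eventbrite v3 API, and the environment variable name is my own.

```javascript
// Test script on the "Events By Organization" request: call the orders
// API for each event that came back in the response.
const token = pm.environment.get("eventbrite_token"); // my own variable name
const events = pm.response.json().events || [];

events.forEach((event) => {
  pm.sendRequest({
    url: `https://www.eventbriteapi.com/v3/events/${event.id}/orders/`,
    method: "GET",
    header: { Authorization: `Bearer ${token}` }
  }, (err, res) => {
    if (!err) {
      console.log(`${event.name.text}: ${res.json().orders.length} orders`);
    }
  });
});
```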


API Evangelist

Why Hasn’t There Been Another Stripe or Twilio?

13 Jan 2020

Stripe and Twilio are held up as shining examples of how to do APIs in our world. This shining blueprint of how to do APIs has been around for a decade for others to follow. It isn’t a secret. So, why haven’t we seen more Stripes or Twilios emerge? Don’t get me wrong, there are other well done APIs that have emerged, but none of them have received the attention and level of business that Stripe and Twilio have enjoyed. These things always get me thinking and wondering what the reality really is, and if the narrative we are crafting is the one that fits with reality on the ground—pushing me to ask the questions that others aren’t always prepared to ask. I am going to spend some time flagging some of the new APIs who do rise to the occasion, but while I am working on that I wanted to pose some questions about why we haven’t seen the Twilio and Stripe model adopted by more API providers. Here are a few of my thoughts as I work through this view of the API landscape, helping me understand why there aren’t more API rockstars to showcase: Investment - Investment cycles have changed, and the investment you need to do this right hasn't been available to startups over the last five years. Blueprint - Twilio and Stripe are not a blueprint that applies universally to other APIs, but worked well in those business verticals. APIs - This use case of APIs is not as universal as we think it is, and will not work when applied to all business verticals. Skills - It takes more skills than we anticipate when it comes to actually delivering an API as well as Twilio and Stripe have done. Cloud - The dominance of the cloud providers is making it harder for small API startups to get traction and the attention of investors. Wrong - These...[Read More]


API Evangelist

The State of Simple CRUD API Creation

09 Jan 2020

With all the talk of APIs you would think it would be easier to publish a simple Create, Read, Update, and Delete (CRUD) API. Sure, there are a number of services and open source solutions for publishing a CRUD API from your database, but for me to just say I want a CRUD resource, give it a name, push a button, and have it—there isn’t much out there. I should be able to just write the word “images”, hit go, and have a complete images API that I can add properties to the schema, and query parameters to each method. After ten years of doing this I am just amazed that the fundamentals of API delivery are still so complicated and verbose. We even have the vocabulary to describe all of the details of my API (OpenAPI), and I still can’t just push a button and get my API. I can take my complete OpenAPI definition and publish it to AWS, Azure, or Google and “generate my API”, but it doesn’t create the backend for me. There have been waves of database or spreadsheet to API solutions over the years, but there is no single API solution to plant the seeds when there is no existing data source. Over the holidays I managed to create a Postman collection that will take my OpenAPI from a Postman-defined API and generate an AWS DynamoDB and AWS API Gateway instance of the API, but it was the closest I could get to what is in my head across AWS, Azure, and Google. Why can’t I just hit GO on my OpenAPI, and have an API in a single click? No matter which cloud provider I am on! The reasons why I can’t immediately have a CRUD API are many. Some technical. Most are business reasons. I would say it is primarily a reflection of our belief that we are all innovative special snowflakes, when in reality we are all...[Read More]
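
To show how small the "give it a name, push a button" idea really is, here is a toy sketch that generates a minimal OpenAPI 3.0 skeleton with CRUD paths for a named resource—nothing official, just the shape of what I wish the cloud providers offered.

```javascript
// Toy sketch: generate an OpenAPI 3.0 CRUD skeleton from a resource name.
function crudOpenApi(resource) {
  const ok = (code, description) => ({ [code]: { description } });
  return {
    openapi: "3.0.0",
    info: { title: `${resource} API`, version: "1.0.0" },
    paths: {
      [`/${resource}`]: {
        get: { summary: `List ${resource}`, responses: ok("200", "OK") },
        post: { summary: `Create ${resource}`, responses: ok("201", "Created") }
      },
      [`/${resource}/{id}`]: {
        parameters: [
          { name: "id", in: "path", required: true, schema: { type: "string" } }
        ],
        get: { summary: `Read ${resource}`, responses: ok("200", "OK") },
        put: { summary: `Update ${resource}`, responses: ok("200", "OK") },
        delete: { summary: `Delete ${resource}`, responses: ok("204", "Deleted") }
      }
    }
  };
}

// "images" in, a complete CRUD surface area out.
console.log(JSON.stringify(crudOpenApi("images"), null, 2));
```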


API Evangelist

A Postman API Governance Collection

09 Jan 2020

You can use Postman to test your APIs. With each request you can include a test script which evaluates each incoming response and validates for specific elements, displaying the test results along with each response. However, you can also use the same mechanisms to evaluate the overall design of any API you are managing with Postman. One of the new beta features of Postman is being able to manage your APIs, allowing you to define each API using OpenAPI 3.0, then generate collections, mocks, docs, and tests with Postman. This got me thinking—why can’t we use the new Postman API manager, plus the Postman API, and scripted testing to govern the design of an API? To explore the possibilities I created a Postman collection for applying some basic API design governance to any API you have defined in a Postman workspace. The collection uses the Postman API to pull the OpenAPI for each API and store it within an environment, then there is a range of basic requests that can be made to evaluate the design of the APIs that we have defined as an OpenAPI. The collection is a proof of concept, and is meant to be a starting point for designing many different types of API governance rules, and thinking about how Postman collections can be used to govern the API life cycle, starting with the design of our APIs—something that is exposed as OpenAPI. My new Postman API governance collection has a handful of folders, and the following requests: Info - Looking at the general info for the API. Validate the Name Of The API Validate the Description for the API Paths - Evaluating the design patterns of each API path. Ensure Words Are Used in Paths Methods - Looking at the details of each API method. Check For GET, POST, PUT, and DELETE Check All Methods Have Summaries Check All Methods Have Descriptions Check All Methods Have Operation Ids Check All...[Read More]
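
To give a sense of what these governance requests look like inside, here is a sketch of the kind of test script each one carries—it assumes an earlier request already pulled the target OpenAPI via the Postman API and stored it in an environment variable I am calling openapi.

```javascript
// Sketch: design governance assertions against an OpenAPI stored in the
// environment by an earlier request in the collection.
const spec = JSON.parse(pm.environment.get("openapi")); // my own variable name

pm.test("Validate the name of the API", () => {
  pm.expect(spec.info.title).to.not.be.empty;
});

pm.test("Validate the description for the API", () => {
  pm.expect(spec.info.description).to.not.be.empty;
});

pm.test("All methods have summaries and operation ids", () => {
  // Naive walk: assumes every key under a path is an HTTP method.
  Object.values(spec.paths).forEach((path) => {
    Object.values(path).forEach((method) => {
      pm.expect(method.summary, "missing summary").to.not.be.empty;
      pm.expect(method.operationId, "missing operationId").to.not.be.empty;
    });
  });
});
```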


API Evangelist

Spreading API Collections From My Personal Workspaces Across Multiple Workspaces

08 Jan 2020

As a Postman user for a number of years I have several hundred random collections littering my personal workspace. I had noticed that workspaces emerged a while back, but really hadn’t ever put much thought into how I organize my collections. As the number of collections grows I’m noticing performance issues within Postman, and general chaos, because I work primarily from within my personal workspace. Pushing me to step back and think more holistically about how I create, store, organize, and share my API collections within the Postman platform and beyond, using AWS S3 and GitHub. Forcing a little organization and structure on how I move APIs forward across their own API life cycle trajectory. First, when working in my personal workspace there were performance issues using Postman. There were just too many Postman collections in there to be efficient. This further slowed me down when it came to finding the collections I needed. Having to look purely alphabetically for collections that could have any sort of naming conventions applied to them took way too much time. This reality pushed me to think about the different buckets in which I operate and get work done, which proved to be helpful, leading me to create a handful of workspaces to organize my API collections into, rather than just operating from a single workspace filled with hundreds of APIs I have imported over the years. My first task was to just delete things that were clearly junk. Then I looked at all my collections via the Postman API to see if there was any last modified or run date—sadly there isn’t. I will have to think about ways in which I can track the evolution and usage of my Postman collections so that I can consider automating the cleanup of collections, or at least archiving them based upon whether they have been modified or not. Once I cleaned up a little bit I was able to see...[Read More]


API Evangelist

Postman Tutorials are Common but the Postman Collection is Often Missing

08 Jan 2020

I am amazed at the number of blog posts I come across from API providers explaining how their API consumers can use Postman with their API, but that do not actually share a complete Postman collection for developers to use. API providers obviously see Postman as a tool for making API calls, but do not fully grasp the ability to document an API with a Postman collection, and to save, publish, and share this collection with documentation or the Run in Postman button. As part of this realization I am not looking to shame API providers for not understanding what is possible, I am more looking to acknowledge how much work we (Postman) have to do when it comes to helping folks understand what is possible with the Postman platform, moving folks beyond the notion that Postman is just an HTTP client. There are some pretty involved tutorials out there for using Postman with a variety of APIs. API providers have invested a lot into these blog posts, tutorials, and other resources to help their API consumers on-board with their APIs, but do not publish Postman collections as part of their documentation or tutorial. This tells me that API providers aren’t seeing the portability and share-ability of Postman collections. They simply see Postman as an API client, not as tooling for defining, sharing, publishing, versioning, saving, organizing, and evolving API requests. This means we have a lot of work ahead of us to educate folks about what Postman collections are, and how they will make your life easier, while reducing redundancy across operations. Helping folks move beyond simply operating Postman as an isolated HTTP client. Having full control over defining a request to an API while being able to see the details of that response is the core value of Postman. Developers get it. Clearly they also see the need to share this ability, and equip others to realize the same value. They are crafting tutorials and blog posts...[Read More]


API Evangelist

Deploy, Publish or Launch An API?

08 Jan 2020

I’m always fascinated by the words we use to describe what we do in a digital world. One dimension of the API life cycle that perpetually interests me is the concept of deploying an API, or as some might call it, publishing or launching. I am fascinated by how people describe the act of making an API available, but I’m even more interested in the shadows that exist within these realities. Meaning, within a 30 minute Googling session for publish, deploy, and launch an API, I come across many real world examples of delivering an API, but few of them will deliver the actual tangible, functional, nuts and bolts of the API. After searching for publish API, here is what stood out: Apigee SwaggerHub Postman Oracle Broadcom Azure MuleSoft WSO2 SAP Socrata After searching for deploy API, here is what stood out: AWS API Gateway Firebase Google Serverless Stack Mendix API Platform API Evangelist GitHub Heroku After searching for launch API, here is what stood out: Adobe Launch SpaceX Apple Launch Services RapidAPI 80% of these will not actually deliver the API, they will just take an existing API and make it available. I know most of these service providers believe that their solution does deploy, because it proxies an existing API, but really very few of these actually deliver the API, they more publish, deploy, and launch it into some state of availability—the final act of making it available and open for business. After all these years of studying API gateway and management providers I’m still fascinated by the lack of true API deployment present, and how much it is about proxying what already exists, creating a shadow that continues to prevent us from standardizing how we deliver APIs.[Read More]


API Evangelist

Dead Simple Real World API Management

08 Jan 2020

I began API Evangelist research almost a decade ago by looking into the rapidly expanding concept of API management, so I think it is relevant to go into 2020 by taking a look at where things are today. In 2010, the API management conversation was dominated by 3Scale, Mashery, and Apigee. In 2020, API management is a commodity that is baked into all of the cloud providers, and something every company needs. In 2010 there were no open source API management providers, and in 2020 there are numerous open source solutions. While there are forces in 2020 looking to continue moving the conversation forward with service mesh and other next generation API management concepts, I feel the biggest opportunity is in tackling the mundane work of just effectively managing our APIs using simple real world API management practices. I am neck deep in working to deploy a simple set of APIs, looking for the path of least resistance when it comes to going from 0 to 60 with a new API. After playing around with AWS, Azure, and Google for a couple of days, and being reminded of how robust, but also complex, some of their API management approaches can be, I find myself on the home page of API Evangelist, staring at the page, and I click on my sole sponsor Tyk—finding myself pleasantly reminded how effective simple real world API management can be. Within 10 minutes I have signed up for an account and begun managing one of my prototype APIs, allowing me to: Add API - Add the url and authentication for one of my project APIs. Version - Choose to version, or not version, the API I am deploying. Endpoints - Design a fresh set of endpoints transforming my API. Load Balance - Round-robin load-balance traffic to all my APIs. Regions - Manage the geographic distribution of my API infrastructure. Rate Limit - Limit the number of API calls that can be made to an API. Users...[Read More]


API Evangelist

Postman Open Source

07 Jan 2020

I get asked a lot if Postman is open source. I get told occasionally that people wish it was open source. I have to admit I didn't fully grasp how open Postman was until I helped work on the new open source philosophy page for Postman. While the Postman application itself isn't open source (it is built on open source), the core building blocks of Postman are open source, shifting my view of how you can use the application across operations. Expanding Postman usage beyond just being a solitary desktop application, and turning it into a digitally scalable gear on the API factory floor. Postman as a desktop application is not open source, but here are the core components that are open source, making Postman something you can run anywhere: Postman Runtime - The core runtime of Postman that allows you to run collections, including requests, scripts, etc., anywhere, extending the work that gets done within the application to anywhere the runtime can be installed and executed. Postman Collections Format - The collections you save and share with Postman are all open source and can be shared, exported, published, and used as a unit of currency within any application or system, further extending the reach of the platform. Newman - Command-line tool for running and testing a Postman Collection as part of any pipeline, making Postman collections a unit of compute that can be baked into the software development life cycle, and leveraged as API truth wherever it is needed. Postman Collection SDK - SDK to quickly unlock the power of the Postman Collections format using JavaScript, allowing you to create, manage, and automate how collections are defined and put to work across a platform without depending on the application. Postman Code Generators - Convert Postman collections to usable code in more than 20 different programming languages, generating simple client scripts for consumers that are defined by the Postman collections used as the code generator's definition. I am...[Read More]
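
As a small example of what these open source pieces unlock outside of the app, here is a sketch that uses the postman-collection SDK to build and serialize a collection programmatically—the collection name and request details are placeholders.

```javascript
// Sketch: build a Postman collection in code with the open source SDK.
const { Collection, Item } = require('postman-collection');

const collection = new Collection({
  info: { name: 'My Generated Collection' } // placeholder name
});

collection.items.add(new Item({
  name: 'List products',
  request: {
    url: 'https://api.example.com/products', // placeholder endpoint
    method: 'GET'
  }
}));

// Serialize to the open Postman Collection format as JSON.
console.log(JSON.stringify(collection.toJSON(), null, 2));
```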


API Evangelist

Challenges Binding APIs Deployed Via Gateway To Backend Services

07 Jan 2020

I spent some of the holidays immersed in the backend integrations of the top three cloud providers, AWS, Azure, and Google. Specifically I was studying the GUI, APIs, schema, mapping, and other approaches to wiring up APIs to backend systems. I am looking for the quickest API-driven way to deploy an API and hook it up to a variety of meaningful resources on the backend, beginning with SQL and NoSQL data stores, but then branching out to discover the path of least resistance for more complex backends. Maybe it is because of my existing experience with Amazon, but I found the AWS approach to wiring up integrations using OpenAPI to be the easiest to follow and implement, over what Azure and Google offered. Eventually I will be mapping out the landscape for each of the providers, but at first look, Azure and Google required substantially more work to understand and implement even the most basic backends for a simple API. Don’t get me wrong, if you want to just gateway an existing API using AWS, Azure, or Google, it is pretty straightforward. You just have to learn each of their mapping techniques and you can quickly define the incoming request and outgoing response mappings without much effort. However, for this exercise I was looking for an actual end-to-end deployment of an API, not the proxying of, or a Hollywood front for, an existing API. If you want to launch a brand new API from an existing data source, or a brand new API with a brand new data source, I found AWS to be the path of least resistance. I was able to launch a full read / write API using AWS API Gateway + AWS DynamoDB with no code, something I couldn’t do on Azure or Google without specific domain knowledge of their database solutions. I had only light exposure to DynamoDB, and while there were some quirks of the implementation I had to get over,...[Read More]
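
For reference, here is a hedged sketch of what the no code DynamoDB wiring looks like in the OpenAPI you import into AWS API Gateway—the table name, region, and IAM role ARN are placeholders, and the mapping template details will vary with your schema.

```json
"post": {
  "x-amazon-apigateway-integration": {
    "type": "aws",
    "httpMethod": "POST",
    "uri": "arn:aws:apigateway:us-east-1:dynamodb:action/PutItem",
    "credentials": "arn:aws:iam::123456789012:role/api-gateway-dynamodb",
    "requestTemplates": {
      "application/json": "{ \"TableName\": \"products\", \"Item\": { \"id\": { \"S\": \"$context.requestId\" }, \"name\": { \"S\": $input.json('$.name') } } }"
    },
    "responses": { "default": { "statusCode": "200" } }
  }
}
```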


API Evangelist

Academic or Street API Tooling

07 Jan 2020

It always seems like there are two separate types of tools in my world: the academic tools that consider the big picture and promise to steer me in the right direction, and the street tooling that helps me get my work done on a day to day basis. After going to work for a street tooling vendor who has some academic tooling aspirations, it has gotten me thinking more about the tools I depend on, and learning more about what people are using within the enterprise to get their work done each day. I have used different academic tooling over my life as the API Evangelist. I’d say every API management tool I’ve adopted has been very academic until recently. From my view API management started as academic and then became a factory floor commodity. I would say Kong and Tyk are the only ones that have achieved a street level status within all of this, and NGINX is looking to turn its street cred into something that is more academic, and visionary. There isn't much academic API tooling that has gone from vision to implementation—it just can’t survive the investment and acquisition cycles that gobble it up. Making it difficult to see the real adoption needed to become baked into our daily lives. API management has done it, but very few other stops along the API life cycle have realized this level of adoption. Street tooling, or the hand tools developers use to get their jobs done on a daily basis, are a much different beast. Postman and NGINX are both examples of tools that developers know about and depend on to operate each day. Using NGINX to deploy and Postman to consume APIs each day. These aren’t tools that promise some grand vision of how we could or should be, these are tools about dealing with what is right in front of us. These are tools that keep...[Read More]


API Evangelist

The Fundamentals: Deploying APIs From Your Databases

06 Jan 2020

You know, I tend to complain about a lot of things across the API space while focusing on the damage caused by fast moving technology startups and the venture capital that fuels them. Amidst all of this forward motion I easily forget to showcase the good in the space. The things that are actually moving the conversation forward and doing the hard work of connecting the dots when it comes to APIs. I easily forget to notice when there are real businesses chugging along delivering useful services for all of us when it comes to APIs. One of my favorite database to API businesses out there, and one of the companies who have been around for a significant portion of my time as the API Evangelist, working hard to help people deploy APIs from their databases, is SlashDB. If you want to deploy APIs from your databases, SlashDB is the solution. If you are looking to make data within MySQL, PostgreSQL, SQLite, MS SQL Server, Oracle, IBM DB2, Sybase, RedShift, NoSQL, or another data source available quickly as an API, SlashDB has the solutions you are looking for. SlashDB isn’t one of those sexy new startups with a bunch of venture funding looking to be your new API best friend. SlashDB is doing the mundane, difficult work needed to make the data available within your legacy databases available as APIs so that you can use it across your applications. SlashDB is all about securely exposing your data using standardized web APIs, making your digital resources available wherever you need them. SlashDB doesn’t have the splashy website, but they have the goods when it comes to doing one of the most common tasks when deploying APIs—wiring up your APIs to their data backends. They also have straightforward pricing tiers for you to navigate as you expand the number of data sources you are wiring up, and the number of consumers you have consuming data...[Read More]


API Evangelist

Postman Collections For Pulling My Twitter Friends And Followers

06 Jan 2020

I have been cranking out the Twitter API capabilities lately, crafting single request Postman collections that focus on a specific capability of the popular social API. I use the API for a number of different things around API Evangelist, and as I assess how I use the social media API I want to be engineering my integrations as Postman collections so I can better organize and execute them using Postman, while also adding to the list of API capabilities I’m sharing with my audience of developers and non-developers. Today I cranked out two individual Twitter API capabilities helping me better manage my Twitter followers and friends: Twitter Followers - Pulls your Twitter followers 200 at a time, saves them within an environment, then allows you to increment through each page of followers, eventually pulling and storing all of your followers. Twitter Friends - Pulls your Twitter friends 200 at a time, saves them within an environment, then allows you to increment through each page of friends, eventually pulling and storing all of your friends. These capabilities are separate Postman collections so that they can be used independently, or together. I am keeping them organized in a Postman workspace so that I can use them manually, but then also have a daily monitor running, pulling any new followers or friends from my Twitter. I pull the resulting JSON from the environments I pair up with each collection using the Postman API, and integrate it into some of my other API Evangelist monitoring and automation. Next I am going to create a Postman collection that will reconcile the two lists and tell me which people I am following do not follow me back, creating a third list that I can use to unfollow and clean up my profile. Crafting these types of collections helps me renew my understanding of some of the APIs I already use. It also helps me better define the individual capabilities I put to work on a daily basis, and...[Read More]
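
Here is a sketch of the paging pattern these collections use—a test script on the single request that accumulates each page of 200 into the environment and loops the request until Twitter's cursor runs out. The request name and environment variable names are my own.

```javascript
// Test script on the "Twitter Followers" request (v1.1 followers/list,
// called with count=200 and cursor={{cursor}}).
const data = pm.response.json();

// Accumulate this page of followers in the environment.
const stored = JSON.parse(pm.environment.get("followers") || "[]");
pm.environment.set("followers", JSON.stringify(stored.concat(data.users)));

// Twitter returns a cursor of "0" when there are no more pages.
if (data.next_cursor_str && data.next_cursor_str !== "0") {
  pm.environment.set("cursor", data.next_cursor_str);
  postman.setNextRequest("Twitter Followers"); // loop (collection runner only)
} else {
  postman.setNextRequest(null); // all pages pulled
}
```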


API Evangelist

My Levels of Postman API Environment Understanding To Date

06 Jan 2020

I have been a pretty hardcore Postman user since the beginning. Over the years I felt like I understood what Postman was all about, but one of the first concepts that blew up my belief around what Postman could do was the concept of the Postman environment. Like other Postman features, environments are extremely versatile, and can be used in many different ways depending on your understanding of Postman, as well as the sophistication of the APIs and the workflow you are defining using Postman. My Postman environments awakening has occurred in several phases, consistently blowing my mind about what is possible with Postman and Postman collections. Postman environments are already one of the edges I have given Postman collections over a pure OpenAPI definition—they just provide more environmental context than you can get with OpenAPI alone. However, at each shift in my understanding of how Postman environments can be used, entirely new worlds opened up for me regarding how that context can be applied and evolved over time across many different APIs. Resulting in four distinct layers of understanding about how Postman environments work and can be applied in my world—I’m sure there will be more dimensions to this, but this is a snapshot of how I see things going into 2020. Environment Settings For Single API Calls I have to start with the ground floor and express why environments matter in the first place, and provide an edge over OpenAPI all by itself. Being able to define key / value pairs for authorization and other variables across one or many different API collections helps speed up the on-boarding, orchestration, and reuse of API requests within those collections. It quickly allows you to switch users or other context, but still use the same collection of API requests, shifting how we automate and orchestrate across our API infrastructure. However, simply putting the base url for your API as a variable, and defining tokens and other...[Read More]
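
A minimal sketch of that ground floor, with hypothetical variable names—values defined once in the environment, referenced with double curly braces in requests, and read or written from scripts:

```javascript
// In a request URL:    {{baseUrl}}/products
// In an auth header:   Authorization: Bearer {{token}}

// Pre-request script: read a value from the active environment.
const baseUrl = pm.environment.get("baseUrl");

// Test script: capture a value from a response for later requests to use.
const json = pm.response.json();
pm.environment.set("token", json.access_token);
```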


API Evangelist

A Dynamic Salesforce REST API Postman Collection Builder Collection

06 Jan 2020

I have been working on developing new ways to make the Salesforce API more accessible and easier to onboard with over the last couple of months, helping reduce friction every time I have to pick up the platform in my work. One of the next steps in this work is to develop a prototype for generating a dynamic Postman collection for the Salesforce REST API. I had created a Postman collection for the API earlier, but the Salesforce team pointed out to me that the available APIs will vary from not only version to version, but also user account to user account. With this in mind I wanted to develop a tool for dynamically generating a Postman collection for the Salesforce API, and as I got to work building it I realized that I should probably just make the tool a Postman collection itself (mind blown). To help make on-boarding with the Salesforce API easier I created a Postman collection that uses the Salesforce API to autogenerate a Postman collection based upon the available objects and endpoints for the Salesforce REST API. The Postman collection has three requests to accomplish the creation of a dynamic collection. The first request pulls all the latest versions of the Salesforce API, using the Salesforce API. Once I have the version of the Salesforce API I am targeting for a build I add it to the Postman environment I am using to define the operations of my Postman collection, and then I pull the list of available objects for this version, and for my own Salesforce account. The objects that exist will vary for each Salesforce account, as well as version, making it pretty critical that any Postman collection is dynamic, being generated from this personalized list of objects. The next request in our Salesforce Postman collection builder is the build, which generates individual requests for all of the available objects. After you run, the...[Read More]
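
As a rough illustration of the middle step, here is a hedged sketch of a test script that could take Salesforce's list of objects and turn them into Postman request definitions. The instanceUrl, version, and collectionItems variable names are my own, and the query request it generates is just one example of what you might build per object:

```javascript
// Test script sketch for a request like:
// GET {{instanceUrl}}/services/data/v{{version}}/sobjects
const sobjects = pm.response.json().sobjects;

// Turn each object available to this account and version into a request.
const items = sobjects.map((obj) => ({
    name: 'Query ' + obj.name,
    request: {
        method: 'GET',
        url: '{{instanceUrl}}/services/data/v{{version}}/query?q=SELECT Id FROM ' + obj.name + ' LIMIT 10'
    }
}));

// Stash the generated requests for the build step, which POSTs them to
// the Postman API (https://api.getpostman.com/collections) as a new collection.
pm.environment.set('collectionItems', JSON.stringify(items));
```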


API Evangelist

The Many Differences Between Each API

03 Jan 2020

I’m burning my way through profiling, updating, and refreshing the listings for the 2K+ APIs in my directory. As I refresh the profile of each of the APIs in my index I am looking to make sure I have an adequate description of what they do, that they are well tagged, and I always look for an existing OpenAPI or Postman collection. These API definitions are really the most valuable thing I can find for an API provider, telling me about what each provider’s API delivers, but more importantly doing the same for other consumers, and for service and tooling providers. API definitions are the menu for each of the APIs I’m showcasing as part of my API research. As I refresh the profile for each API I re-evaluate how they do their API, not just the technical details of their API, but also the business and on-boarding of their API. If an API provider doesn’t have an OpenAPI, Postman collection, or other machine readable definition for their APIs, then depending on the value of the API and the standardization of their API design and documentation, I will craft a simple scrape script to harvest the API definition, and generate the OpenAPI and Postman collection automatically. As I cycle through this process for each API in my index I’m reminded of just how different APIs can be, even if they are just RESTful or web APIs. Demonstrating that there are many interpretations of what an API should be, both technically, and from a business perspective. Some APIs have many different paths, representing a wide variety of resources and capabilities. Some APIs have very few paths, and heavily rely on query parameters to work the magic when it comes to applying an API. Others invest heavily in enumerators and the values of query parameters to extract what you need from each API—oftentimes forgetting to tell you what these values should or could be. Some of the time...[Read More]
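
Those scrape scripts do not have to be sophisticated. Here is a hedged Node.js sketch of the idea, assuming a documentation page that lists its endpoints in code tags; the URL and the markup convention are placeholders, since every provider's docs are different:

```javascript
// Hypothetical scraper: pull endpoint paths out of an HTML docs page and
// emit a bare-bones OpenAPI skeleton to build on. Uses only Node core.
const https = require('https');

https.get('https://example.com/api/docs', (res) => {
  let html = '';
  res.on('data', (chunk) => html += chunk);
  res.on('end', () => {
    // Grab anything that looks like "GET /path" inside <code> tags.
    const matches = [...html.matchAll(/<code>(GET|POST|PUT|DELETE)\s+(\/[^<\s]*)<\/code>/g)];
    const paths = {};
    for (const [, method, path] of matches) {
      paths[path] = paths[path] || {};
      paths[path][method.toLowerCase()] = { responses: { '200': { description: 'OK' } } };
    }
    const openapi = { openapi: '3.0.0', info: { title: 'Scraped API', version: '0.0.1' }, paths };
    console.log(JSON.stringify(openapi, null, 2));
  });
});
```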


API Evangelist

Pricing Comparison for Screen Capture APIs

03 Jan 2020

There is a pricing comparison between 33 separate screen capture APIs halfway down the page on this interesting piece about how to choose the right screen capture service. This type of comparison should exist across every business sector being impacted by APIs, as well as new ones emerging to introduce entirely new digital resources for use in our desktop, web, mobile, device, and network applications. Sadly, right now these types of machine readable, let alone human readable, lists do not exist across the sector. Assembling these types of comparisons takes a lot of time and energy, and isn’t always possible in a special API snowflake of a world where seemingly similar APIs are actually very different beasts—sometimes intentionally, but usually unintentionally. I have had a machine readable schema for defining API pricing for almost five years now. I’ve profiled common resources like email, SMS, and others, but ultimately haven’t had the resources to invest in the work at the levels needed. I know how much work goes into establishing an exhaustive list of APIs in any business sector as well as finding a price, and defining the access tiers for each individual API provider. I wish I had more resources to invest in profiling of APIs, but also profiling down to this level of detail where each of the individual API resources they offer has some sort of semantic vocabulary applied, and a machine readable definition of the pricing and on-boarding required for each API provider. This is how we are going to get to the API economy we all like to fantasize about, where we can automatically discover, on-board, pay for, and switch between valuable API resources as we need them in real-time. We need to get to work on doing this for the most tangible, consistent, and valuable APIs across the sector. We won’t be able to do this for all types of APIs, and sometimes it will be an apples to oranges comparison, but we...[Read More]
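
To make the idea of machine readable pricing more concrete, here is a hedged sketch of what one entry might look like. The property names are illustrative only, not the actual schema I maintain:

```javascript
// Hypothetical machine readable pricing profile for one screen capture API.
const pricingProfile = {
  provider: 'example-screenshots',
  baseUrl: 'https://api.example.com',
  plans: [
    { name: 'free',    pricePerMonth: 0,  includedCaptures: 100,  overagePerCapture: null  },
    { name: 'starter', pricePerMonth: 19, includedCaptures: 5000, overagePerCapture: 0.004 }
  ],
  freeTrial: true,
  rateLimit: '10 requests per second'
};

// With a profile like this for every provider, a price-per-capture
// comparison becomes a simple map and sort instead of a research project.
```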


API Evangelist

Not Just An API Provider But Also An API Matchmaker

03 Jan 2020

Your API is always the best. Of course it is. However, not everyone will see the value your API delivers without a little enlightenment. Sometimes the value of an API is missed in isolation when you are just looking at what a single API can do. To help developers, as well as business users, understand what is possible it can help to connect the dots between your API and other valuable 3rd party APIs. This is something you see from API providers who have integration pages showcasing the different integrations that are already available, and those who have invested in making sure their API is available on integration platform as a service (iPaaS) providers like IFTTT and Zapier. If a new user isn’t up to speed on what your API does, it can help to put it side by side with other APIs they are already familiar with. Being aware of not just the industry you are operating an API within, but also complementary industries is what we should all be striving for as an API provider. The most competitive API providers all have integration pages demonstrating the value that an API provides, but more importantly the value it can deliver when bundled with other popular services their customers are already using. This means that API providers have to be solving a real world problem, but also have done their homework when it comes to understanding a real world version of this problem that other people face. Or they simply have enough consumers of their API who are demanding integrations with other commonly used platforms. Regardless of how an API provider gets there, having an awareness of other platforms that companies are depending on as part of their operation, and ensuring that your API solutions are compatible and interoperable by default just makes sense. I find that playing with a complementary API in Postman helps you think about the moving parts of...[Read More]


API Evangelist

What Is The API Life Cycle?

02 Jan 2020

I regularly struggle with the words and phrases I use in my storytelling. I’m never happy with my level of word-smithing, or with the final output. Ultimately I don’t let it stop me, I just push myself to constantly re-evaluate how I speak, being forever critical and often pedantic about why I do things, and why I don’t. One word I struggle with is lifecycle. First I struggle with whether it is one word or two. Historically I have been on team one-word, but more recently I’ve switched to two words. However, this round of anxiety over the phrase is more operational, and existential, than it is about how I use the word in my storytelling. I am more interested in whether we should even be using the phrase, and if we are, how we get more formal about quantifying exactly what we mean by the API life cycle. As I work to flesh out my API life cycle Postman collection, defining API-driven guard rails for how I deliver my APIs, and distilling each step down to a single request and set of pre and post request scripts, I am forced to think about what the API life cycle really is. Pushing me to go beyond just talking about some abstract concept, to actually having a set of interfaces and scripts that quantify each stop along the API life cycle. While I will be adding more stops to my Postman API life cycle collection, I currently have 27 stops defined, providing me with some concrete actions I can take at each point in the evolution of my APIs. Define - Defining the central truth of the API using OpenAPI, JSON Schema, and Postman collections and environments. Environments - Providing environments that drive different stages of the API life cycle in conjunction with various collections. Design - Quantifying, standardizing, and evolving the HTTP and other design patterns I use across the APIs I deliver....[Read More]


API Evangelist

Deploying My Postman OpenAPI To AWS API Gateway

02 Jan 2020

I created a bunch of different Postman collections for AWS services leading up to re:Invent this year, and now I’m using individual requests to deliver on some different Postman AWS API life cycle workflows. To flesh out the scaffolding for how I define and deliver APIs throughout their API life cycle I got to work on a Postman collection for defining and executing every single stop in my API life cycle in a way that I could consistently apply across many different APIs. I am using Postman to define the central truth of each of my APIs with OpenAPI, and I want to use Postman to deliver and execute on that truth across every single stop along the API life cycle. One of the more critical stops I wanted to provide a solution for was API deployment, providing me with a simple way to immediately deploy an API from an OpenAPI definition. Deploying APIs is hard. It is one of the most complicated and least standardized stops along the API life cycle. Regardless, I wanted a simple straightforward Postman collection that would allow me to take an API definition within Postman, and publish an API to one of the major cloud platforms—AWS won out for simplicity. Ultimately, using Postman I was able to pull an OpenAPI for one of my APIs, then deploy an API in five steps. Providing a basic, introductory Postman collection for deploying a Postman API to AWS API Gateway. Pull API - Loads up the specific version of a Postman API into the environment for processing within each of the next steps. Create Table - Actually creates an AWS DynamoDB table derived from the name of the API being pulled from Postman. Prepare OpenAPI - Takes the OpenAPI and generates AWS API Gateway integration extensions that define the backend. Publish OpenAPI - Takes the new OpenAPI with integration extensions and publishes it to AWS API Gateway. Deploy API - Actually deploys the API...[Read More]
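
The Prepare OpenAPI step is the least obvious of the five, so here is a hedged sketch of what it can look like as a Postman test script. The openapi environment variable name and the HTTP proxy backend are placeholders; the real collection wires the integration extensions to the DynamoDB-backed setup described above:

```javascript
// Sketch of the "Prepare OpenAPI" step: decorate every operation in the
// OpenAPI with an x-amazon-apigateway-integration extension so AWS API
// Gateway knows what backend to call when the definition is imported.
const openapi = JSON.parse(pm.environment.get('openapi'));

for (const [path, operations] of Object.entries(openapi.paths)) {
  for (const method of Object.keys(operations)) {
    operations[method]['x-amazon-apigateway-integration'] = {
      type: 'http_proxy',                         // placeholder backend style
      httpMethod: method.toUpperCase(),
      uri: 'https://backend.example.com' + path,  // hypothetical backend host
      payloadFormatVersion: '1.0'
    };
  }
}

// The "Publish OpenAPI" request then sends this decorated definition to
// the API Gateway import endpoint.
pm.environment.set('openapiWithIntegrations', JSON.stringify(openapi));
```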


API Evangelist

A Postman Collection for Managing the Life Cycles Of My APIs

02 Jan 2020

I had grown weary of just researching, talking, and teaching about the API life cycle over the last ten years as the API Evangelist. This was one of the major motivators for me to join the Postman team. I want to take my knowledge of the API life cycle and work to make sure the rubber meets the road a little more when it comes to actually realizing much of what I talk about. I began investing in this vision over the holidays by crafting a Postman collection that isn't for defining a single API, it is meant to define the life cycle of a single API. I can manage multiple stops along the API life cycle already with Postman--I just wanted to bring it all together into a single machine readable collection that uses the Postman API, but also other APIs I use to orchestrate my world each day. My API life cycle collection is still a work in progress, but it is coming together nicely, and is the most tangible version of what I have had in my head when I think of Postman as an API delivery platform. This collection centers around managing an OpenAPI truth within Postman, then moving this API definition down the life cycle, and even deploying development or production versions of each API using AWS API Gateway. Of course everything is API-driven, and designed to work across many different APIs to define, deliver, and manage any single API, maintaining a definition of the life cycle within a single Postman environment that can be used to bridge multiple API platforms via a single collection. So far I have over a hundred individual capabilities defined as Postman requests, and organized into folders that are broken down by different stops along the API life cycle. I'm still moving them around and abstracting away the friction, while I work hard to define the most sensible workflows with each of my API life cycle...[Read More]


API Evangelist

Pulling Your Twitter Bookmarks Via The Twitter API

30 Dec 2019

I created two Twitter API capabilities the other day to help someone pull a list of their Twitter favorites using the Twitter API. They said they wanted bookmarks and I assumed they used favorites in the same way I do (as bookmarks), and created one Postman collection for pulling your favorites via the API, and another to parse the URLs present in the body. I use Twitter public likes as a way of bookmarking, then I harvest those via the Twitter API--something I've done for over a decade. I had heard of Twitter bookmarks, and seen them in the desktop and mobile apps, but hadn't really made the shift in my brain. So I assumed they were talking about likes. DOH! Anyways, they tweeted back at me and helped me realize my misconception. Ok, so how do we still get them their bookmarks? After some quick investigation, there is no Twitter API for your private bookmarks, making the pulling of your data a little more challenging, but not impossible. This is where I begin helping people not just understand the technology of APIs, but also the politics of API operations. Meaning Twitter has an API for your bookmarks, they just don't want you to get at it via the public API (I am not sure why). Anyways, in this scenario I can't make a ready to go Postman collection for you to use, I am going to have to teach you a little bit more Postman Kung Fu, showing you how to sniff out the APIs that exist behind everything you do each day. It is still something you can do without programming, and with Postman you can still get at your data in the same way we did for the public Twitter favorites API. You just have to be curious enough to not turn away as I pull back the curtain of the world of APIs a little bit more, with a simple walk through. Something that...[Read More]


API Evangelist

Pulling Links From Those Tweets You Have Favorited

29 Dec 2019

I am busy crafting new API capabilities from my laundry list of requests I have from folks. When I get an email or come across a Tweet with someone asking how they do something on Twitter I will add it to my list, and at some point pull together a simple Postman collection for accomplishing what is being desired. Providing a single Twitter capability that I can add to my list, and anyone (hopefully) can put to use with their own Twitter account and application, within their own local Postman environment. My goal here is to help provide simple API-driven capabilities that anyone can use, while also pushing my skills when it comes to crafting useful Postman collections that aren’t just for developers. Today’s API capability is from Elana Zeide (@elanazeide) who asked on Twitter, “So now I have a lot of twitter bookmarks of amazing things you people have shared ... is there any way to export/download them to another app? (I know you can do it w/ likes) Anyone come up with some clever workaround/automation?”. To possibly help her out I started by creating a single Postman collection that just pulls the favorites for any Twitter user via the Twitter API. Pull Twitter Favorites Capability - It authenticates with the Twitter API and pulls the likes for any Twitter user using their handle, publishing the list of favorites to the visualizer screen. This is a perfectly usable API capability all by itself, but once I was done I used it as my base for pulling any URL that is present in the Tweets. Making for an entirely separate Twitter API capability that I hope folks will find useful. Pull Links From Twitter Favorites Capability - It authenticates with the Twitter API and pulls the likes for any Twitter user using their handle, extracts all of the links from those tweets and publishes the list of links to the visualizer screen. Both of these...[Read More]
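
The link extraction is simpler than it sounds, because Twitter already parses URLs into each tweet's entities. Here is a minimal sketch of the kind of test script behind the second capability, assuming a request against the v1.1 favorites endpoint; the handle variable is a placeholder:

```javascript
// Test script sketch for a request like:
// GET https://api.twitter.com/1.1/favorites/list.json?screen_name={{handle}}&count=200
const links = [];
pm.response.json().forEach((tweet) => {
    // Twitter expands shortened t.co links for us in the entities object.
    (tweet.entities.urls || []).forEach((url) => links.push(url.expanded_url));
});

// Render the harvested links as a clickable HTML list in the visualizer tab.
const template = '<ul>{{#each links}}<li><a href="{{this}}">{{this}}</a></li>{{/each}}</ul>';
pm.visualizer.set(template, { links: links });
```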


API Evangelist

How My API Evangelist Research and Writing Works

28 Dec 2019

Many folks don’t quite get my work and writing style. They are confused by the erratic flow of stories being published to API Evangelist, the incomplete nature of some of my research sites, and other annoying factors that don’t quite make sense when you view API Evangelist a particular way. If you think of it as a technology blog like Techcrunch, ReadWrite, The New Stack, or others, you will be passing certain judgement on the content of my work, the tone of what I say, and the chaotic way in which I publish my research and stories across hundreds of separate sub-domains. People expect me to write up their API, review their approach, or know everything about the thousands of APIs that exist across the public landscape. API Evangelist isn’t this type of blog—it is simply my workbench for things that interest me, are relevant to the industry and my career, or are valuable to someone who pays me to generate value in the API universe. Two Distinct Layers Of Research There are two main layers to my research, which I use to mine API information and knowledge. These two dimensions feed off of each other, and ultimately drive my research, storytelling, and at times the wider conversation in the API space. Helping me organize everything into these two buckets: Landscape - Reviewing the public and private API offerings across many different business sectors, providing me with a unique view of how API providers are doing what they do. Life Cycle - Taking what I’ve learned across the landscape and organizing information and knowledge by stops along the API life cycle, for use in my regular work and storytelling. These two layers are constantly feeding each other. For example, after making a pass through all the payment APIs, updating the landscape for that area, I will add new building blocks I’ve stumbled across to my API life cycle research. Then when I embark on research into the...[Read More]


API Evangelist

Atlassian Provides Run in Postman and OpenAPI by Default for Jira, Confluence, and BitBucket APIs

27 Dec 2019

I was profiling the Atlassian APIs, considering what is possible with JIRA, Confluence, and Bitbucket. Three services that are baked into many enterprise organizations I’ve worked with over the years. My intention was to create a Postman collection for JIRA, but once I landed on the home page for the API I noticed they had a button in the top corner for Running in Postman, and a dropdown for getting an OpenAPI 3.0 spec. Which is something that I strongly believe should be the default for all APIs, ensuring there is a prominently placed link to the machine readable truth behind each API. I like seeing Postman as the default executable in the top corner of the documentation for APIs. I also enjoy seeing the orange Run in Postman button across documentation, blog posts, and other resources—helping folks quickly on-board with some API resource or capabilities. I want to see more of this. I’d like it all to become the default mode of operating for API providers. I want all API providers to manage an OpenAPI truth for their API, while also developing and evolving many different Postman derivatives of that truth. Providing reference collections that describe the full surface area of our APIs, but also making sure there are more on-boarding, workflow, and capability style collections that empower end-users to put APIs to work distributed across API documentation, and the stories we tell about what is possible with our APIs. Interestingly the Postman collection isn’t just a unit of representation for the JIRA, Confluence, and BitBucket APIs. The Postman collection is also a representation of the unit of work that is executed across these platforms. If you have worked in software development across the enterprise you know what I am talking about. Postman is the Swiss Army Knife for how enterprise developers not only develop and deliver their work, which is defined and tracked using JIRA, Confluence, and BitBucket, but Postman collections are also how...[Read More]


API Evangelist

Applying An API-First Approach To Understanding The Pacific Northwest Mushroom Industry

23 Dec 2019

This is an API first project for mapping out the mushroom industry. I have always had a passion for mushrooms, but as I get older I am looking at investing in more side projects that aren’t always 100% about APIs. I wanted to spend some time this holiday season refreshing my memory about what types of mushrooms are available on the market, and what types of products are being made from them. As I do with any data or content driven research I begin by creating an API to store all of the data and content I am gathering, helping me flesh out the dimensions of each business sector I am interested in. As with all of my work I really don’t know where this research is headed—I am just interested in learning about mushrooms. Eventually I’d like to use this data and content in a variety of web and mobile applications, but since I’m just getting started I don’t really understand all of the data I am needing to gather. A situation that is perfectly suited for beginning as an API first project, helping me not just gather the data I need, but also do it in a way that will help me prepare for the future, while also not investing too much into wiring up a database, coding a web or mobile application, and any other costly infrastructure that may (or may not) be needed down the road. By starting as API first, I am able to flesh out the schema and structure of my APIs which will drive my research, and the resulting applications I will be needing down the road. To get started I spent about 10 minutes thinking about the main areas of resources I will need to track across my work, and created ten separate individual resources. Mushrooms - A list of the mushrooms, their scientific names, description, and the beginning of what I will need to map...[Read More]


API Evangelist

API Providers Should Maintain Their Own API Definitions

23 Dec 2019

I am working my way through 2K+ API providers, refreshing my view of the API landscape, and the data I use to tune into the API economy. As I refresh the profile of each API provider, one of the main things I’m doing is looking for an OpenAPI or Postman collection. While the profiling of their API operations using APIs.json is critical, having a machine readable definition for each API is kind of the most important part of this process. Having an OpenAPI or Postman collection gives me a machine readable list of the value that each API delivers, and allows me (and others) to more easily integrate an API into other applications and systems. Sadly, not every API provider understands the need, or is able to invest the resources to produce an API definition. While profiling an API provider the most ideal situation I can come across is when an OpenAPI already exists in service of API documentation, or the API provider just gets the struggle of their API consumers and they have a Postman collection already published. Ideally, the OpenAPI is publicly available and I don’t have to dig it out from behind the documentation, or they have the Run in Postman button clearly published on their website. In the best situations, API providers have their OpenAPI and / or their Postman collections published to GitHub, and are actively maintaining their API definitions using Git, which allows other API consumers and API service providers to depend on an authoritative source of truth when it comes to API definitions for each API they use. I wish every API provider would maintain their own API definitions in this way; sadly, very few do. The majority of APIs I come across do not have documentation driven by OpenAPI and do not have Postman collections. When I encounter one of these API providers I usually spend about 60 seconds googling for Swagger, OpenAPI, and Postman + their name in...[Read More]


API Evangelist

Where Does The Exhaust For Your API Operations End Up Being Stored?

20 Dec 2019

As part of my ongoing API discovery and observability research, I am interested in better defining the common places within the enterprise where we find API signals. Those log files and other exhaust by-products from API operations that will contain hosts, paths, parameters, and other parts and pieces of the APIs that are already in operation. API discovery is complex, and it isn’t something I think we are going to be able to solve by mandating teams to make their APIs more discoverable. I think it is something we are going to have to do for them, augmenting their existing work with services and tooling that then defines what APIs they are producing and consuming as part of their existing tools, applications, and systems. Further expanding the definition of API observability by tapping the exhaust from the outputs of existing infrastructure to help us map out the API landscape that exists within the enterprise. I am currently helping the Optic folks think beyond the personal value their proxy delivers for individual developers by proxying your desktop, web, mobile, and Postman traffic and automatically generating OpenAPI definitions for you, and consider what the more industrial grade use cases will be. As part of these conversations I am thinking more deeply about how APIs are operated within the enterprise, and being more formal in how I discuss where you can tap into the existing exhaust that is captured around API operations, building on the following list I already have. Apache Log File - The most ubiquitous open source web server out there is the default for many API providers. NGINX Log File - The next most ubiquitous open source web server is definitely something I should be looking for. IIS Log File - Then of course, many Microsoft web server folks are still using IIS to serve up their API infrastructure. Amazon CloudWatch - Looking at how the enterprise is centralizing their logs with CloudWatch...[Read More]
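
To show how low-tech this mining can be, here is a hedged Node.js sketch that counts the API-looking traffic in a combined-format web server access log. The file name and the /api/ path convention are assumptions about your environment:

```javascript
// Mine an Apache/NGINX combined-format access log for API traffic,
// producing method + path counts that can seed an API inventory.
const fs = require('fs');
const readline = require('readline');

const seen = {};
const rl = readline.createInterface({ input: fs.createReadStream('access.log') });

rl.on('line', (line) => {
  // Combined log format wraps the request in quotes: "GET /api/v1/users?page=2 HTTP/1.1"
  const match = line.match(/"(GET|POST|PUT|PATCH|DELETE) ([^ ?"]+)[^"]*"/);
  if (match && match[2].startsWith('/api/')) {
    const key = match[1] + ' ' + match[2];
    seen[key] = (seen[key] || 0) + 1;
  }
});

rl.on('close', () => console.table(seen));
```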


API Evangelist

OpenAPI is the Static Truth and Postman Collections are Real World Derivatives of that Truth

20 Dec 2019

I was talking with the Optic folks this morning about API definitions when they asked me for my opinions on what the difference between OpenAPI and Postman was. A question that isn’t easy to answer, and will produce many different answers depending on who you are talking with. It is a question I’ve been struggling with since before I started at Postman, and will continue to struggle with over the coming years as their Chief Evangelist. The best I can do right now is keep writing about it, and continue talking with smart people like Optic, and iterate upon the answer until I can better see what is happening. Here is how I see things currently: OpenAPI is the static truth, and Postman collections are the real world, real time derivatives of this truth. Each individual Postman collection reflects the derived value of an API, representing how a developer, application, or system integration is applying this value in the real world. Now if you squint your eyes, all of those Postman collection derivatives roll up into a single OpenAPI truth. OpenAPI is essential for nailing down the overarching truth of what an API contract delivers, while Postman is essential in quantifying, realizing, and executing this truth on the ground for a specific business use case. There are definitely ways in which OpenAPI and Postman collections overlap, but then there are the ways in which they bring different value to the table. When it comes to capital G Governance, OpenAPI is more meaningful to business leadership—it represents a more constant truth that can then be translated within services, tooling, and defining policy at the macro level. When it comes to lowercase g governance, Postman collections are more meaningful to developers, because they represent the transactions they need to accomplish each day, which are derived from the greater truth, but have more context regarding each specific business transaction that a developer is expected to deliver. This...[Read More]


API Evangelist

How I Profile The TypeForm API

20 Dec 2019

I was being asked for more information about how I profile APIs, and deal with the many differences I come across. It isn’t easy navigating the differences between APIs, and coming out with a standard API definition (OpenAPI or Postman collection) that you can use across different stops along the API life cycle. I’m pretty agile and flexible in how I approach profiling different APIs, with a variety of tools and tricks I use to vacuum up as many details as I possibly can with as little manual labor as I possibly can. The example for profiling that was thrown at me was the TypeForm API, which is a pretty sophisticated API, but will still need some massaging to create an acceptable set of API definitions. First thing I do is search for an OpenAPI definition, hopefully published to GitHub or prominently linked off their documentation, but I will settle for having to sniff it out from behind an API’s documentation. TypeForm doesn’t have an OpenAPI or Swagger available (from what we can tell). Next, I go looking for a Postman collection. Boom!! Typeform has a Postman collection. The question now is why hasn’t Typeform published their Postman collection to the Postman Network? I will Tweet at them. Ok, now I have a machine readable definition for the Typeform API that I can import into my API monitoring system—which is just an API that I use in Postman to import a Postman collection (head explodes). My Postman collection import API grabs as many elements from the Postman collection definition as it can, normalizing the paths, parameters, and other details for an API. I am always adding to what my API is capable of, but it does a pretty good job of giving me what I need to begin to profile the surface area of an API. Now I have all of the paths imported into my monitoring system. However, I am still at the mercy of how...[Read More]
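
If you want to do something similar without my import API, the raw material is easy to get at. Here is a hedged Node.js sketch that pulls a collection through the Postman API and flattens its folders into a simple list of requests; the collection UID is a placeholder, and the walk logic is my own simplification:

```javascript
// Fetch a collection via the Postman API and list its requests.
const https = require('https');

const options = {
  hostname: 'api.getpostman.com',
  path: '/collections/YOUR-COLLECTION-UID', // placeholder UID
  headers: { 'X-Api-Key': process.env.POSTMAN_API_KEY }
};

https.get(options, (res) => {
  let body = '';
  res.on('data', (chunk) => body += chunk);
  res.on('end', () => {
    const { collection } = JSON.parse(body);
    // Folders nest via the item property, so walk recursively.
    const walk = (items) => items.flatMap((item) =>
      item.item
        ? walk(item.item)
        : [item.request.method + ' ' + (item.request.url.raw || item.request.url)]
    );
    console.log(walk(collection.item));
  });
});
```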


API Evangelist

What Else Has Influenced APIs Over the Last 50+ Years?

19 Dec 2019

Because I have so many smart folks in my Twitter timeline I want to put out some of the seeds for stories I am working on for 2020. I want your help determining what has set the stage for the world of APIs we all believe in so much. Here are just a handful of the nuggets I have pulled out of my research and reading. Early On 1933 - Telex Messaging 1949 - Memex (Linked Documents) 1949 - Computer Talk Over Phone 1958 - Digital Phone Lines 1959 - Semi-Automatic Ground Environment (SAGE) Wide Area Network 1961 - Computer Time Sharing 1963 - Hypertext 1963 - Hypermedia 1964 - Sync Satellite Television Network 1964 - IBM Sabre Reservation System 1966 - Michigan Educational Research Information Triad (MERIT) 1968 - Multiplexing 1969 - Mass Produced Software Components by M. Douglas McIlroy 1969 - Host Software, The First RFC 1969 - ARPANET Four Initial Nodes Established 1969 - CompuServe 1970s 1970 - ARPANET Reaches East Coast (MIT) 1971 - Email 1971 - File Transfer Protocol (FTP) 1971 - TELNET 1971 - ARPANET Has 23 Nodes 1972 - ARPANET Has 29 Nodes 1973 - ARPANET Has 40 Nodes 1974 - ARPANET Has 46 Nodes 1974 - Transmission Control Program (TCP) 1974 - Systems Network Architecture (SNA) 1975 - ARPANET Has 57 Nodes 1976 - CYCLADES Computer Network 1976 - X.25 Packet Switching Protocol 1979 - First Commercial Cellular Network 1980s 1980 - USENET 1981 - ARPANET Has 213 Nodes 1981 - TCP/IP 1982 - Simple Mail Transfer Protocol (SMTP) 1983 - ARPANET Switches to TCP/IP 1983 - IPV4 1983 - Berkeley Sockets 1984 - CD-ROM 1984 - Domain Name System (DNS) 1984 - Dynamic Host Configuration Protocol (DHCP) 1984 - Open Systems Interconnect (OSI) 1985 - Whole Earth 'Lectronic Link (WELL) 1985 - National Science Foundation Network (NSFNET) 1987 - Transport Layer Interface (TLI) 1990s 1991 - Gopher 1991 - Windows Sockets API 1991 - Common Object...[Read More]


API Evangelist

The 3dcart Developer Home Page Is Nice and Clean

19 Dec 2019

I look through a lot of API developer portals and when I come across interesting layouts I like to pause and highlight them, showing other API providers what is possible, while turning API Evangelist into a sort of style guide when it comes to crafting your API operations. I was asking the folks over at 3dcart if they have an OpenAPI or Postman collection for their API to help me round out my machine readable index of the commerce API provider, and after I stumbled across their developer portal, I thought I'd share it here. I like it because in addition to the global navigation for their portal, it really gets at the primary next steps anyone will be taking off the landing page of their developer portal. You can tell it really forced them to pause and think about the narrative around what people will be looking for. Helping people understand what is possible, while also routing them down the most common paths taken when it comes to building an application on 3dcart.[Read More]


API Evangelist

Taming The Salesforce API Scope

18 Dec 2019

I was recently looking to build a prototype integration between Salesforce and Workday, where I found myself needing to on-board with the Salesforce REST API for probably the 50th time in my career. I am always looking for projects that use the API so that I can keep my skills sharp when it comes to one of the leading API platforms out there. Even with this experience, each time I on-board with the API I find myself having to work pretty hard to make sense of the Salesforce REST API, first wading through a sea of information to find the API reference documentation, setting up an OAuth application, and getting to where I am actually making my first API call. Once I am successfully making calls to the Salesforce API, I then have to further explore the surface area of the Salesforce REST API before I can fully understand all the resources that are available, and what is truly possible with my integration. After spending about an hour in the Salesforce documentation it all came back to me. I remembered how powerful and versatile the API is, but my moment of déjà vu left me feeling like it would be pretty easy to reduce the time needed to go from landing on the home page of developer.salesforce.com to making your first API call. The challenge with the Salesforce API is that it is extremely large, and possesses a number of resources that will vary based upon two important dimensions, version and your user account. The API you see with a base developer account isn't the same one you'll see with a well established corporate Salesforce implementation. Each individual Salesforce customer might be using a specific version, and have a completely different set of resources available to them, making it pretty challenging to properly document the API. Even with these challenges I think there are a number of ways in which the Salesforce API could be made...[Read More]


API Evangelist

APIs For Victoria Australia

17 Dec 2019

I was helping out someone trying to download air quality data in Australia today, and while I was playing around with the Victoria, Australia government AirWatch data API I thought I'd go ahead and add them to my API Evangelist network by importing their Swagger 2.0 files and converting them to OpenAPI 3.0, while also publishing Postman collections for each of their APIs. Expanding out the APIs I have in my directory, while also encouraging the state to publish the Postman collections I've created to the Postman API Network. The State of Victoria has some pretty interesting APIs that they have made available using Axway. I have published an APIs.json index for the state's developer portal, providing an index of their API operations, as well as the individual APIs. You can get at the Postman collections I've generated using these links. ABS Labour Force API Postman Collection Agriculture Victoria Soils API Postman Collection DataVic CKAN API Postman Collection DataVic Open Data API Postman Collection EPA AirWatch API Postman Collection Important Government Dates API Postman Collection Museums Victoria Collections API Postman Collection Popular Baby Names Victoria API Postman Collection Victorian Heritage API Postman Collection I would go ahead and publish the Postman collections to the Postman Network myself, but I have asked the state to publish them. I would rather the listings be more authoritative and something that is owned by the API operators. I'm just looking to maintain a GitHub repository with fresh copies of their OpenAPI, Postman collections, and APIs.json so I can use it as the source of truth for the APIs across API Evangelist, APIs.io, and other iPaaS and integration providers. I am working through several different business sectors and government APIs, updating my directory of APIs, while also sharing with some other API service providers I have been talking to. If there is a particular API provider you'd like to see added to my list, go ahead and submit a pull request...[Read More]
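
The Swagger 2.0 to OpenAPI 3.0 upgrade is scriptable. Here is a minimal sketch assuming the open source swagger2openapi npm package; the file names are placeholders for whichever of the state's definitions you are converting:

```javascript
// Convert a Swagger 2.0 definition to OpenAPI 3.0 with swagger2openapi.
const converter = require('swagger2openapi');
const fs = require('fs');

const swagger = JSON.parse(fs.readFileSync('epa-airwatch-swagger.json', 'utf8'));

// patch: true asks the converter to repair minor spec errors along the way.
converter.convertObj(swagger, { patch: true }, (err, result) => {
  if (err) throw err;
  // result.openapi holds the upgraded OpenAPI 3.0 document.
  fs.writeFileSync('epa-airwatch-openapi.json', JSON.stringify(result.openapi, null, 2));
});
```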


API Evangelist

A Portable 23andMe API Sandbox

17 Dec 2019

I was creating a Postman collection for the 23andMe API. The 23andMe API is still available, despite the company pulling back somewhat when it comes to accessing the DNA and genetics API. You can still get access to the API for research purposes, but you have to email their business development group and convince them of the merits of your research before you’ll get access to the data. It is pretty common for companies to have valuable data like 23andMe does, and there being a significant amount of concern regarding who has access to it. This is why API management exists as a fundamental building block of API operations, so you can have total control over who has access to your data, and possess detailed logs regarding what has been accessed by consumers. Requiring approval of a developer account before you get your API keys is common, pushing API developers to justify their access and establish a trusted relationship with API providers. This is something you can set up with your API management tooling or services, providing a public sign-up form, yet making each new API consumer wait to be approved before they get their API keys. Even with this security layer in place you may still want to allow API consumers to kick the tires more and see what is possible while awaiting approval for API access. One way you can accomplish this is by creating Postman collections for all the API endpoints, making sure there are one or more examples for each individual API path so that they can be mocked by any consumer using Postman. I went ahead and did this for the 23andMe API. Their documentation is still available, and there are examples for each individual path. I wanted to create a Postman collection for the API to round out my collection of API definitions, and since their documentation had examples, I thought I’d demonstrate how to create portable API sandboxes using...[Read More]
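
What makes a collection mockable is the saved example attached to each request. Here is a hedged sketch of a single collection item in the v2.1 format; the endpoint and body shown are placeholders rather than 23andMe's actual schema:

```javascript
// A single collection item carrying a saved example, which is what a
// Postman mock server replays when no live API access is available.
const item = {
  name: 'Get Genotypes',
  request: {
    method: 'GET',
    url: '{{baseUrl}}/1/genotypes/:profile_id/' // placeholder endpoint
  },
  response: [
    {
      name: 'Example genotype response',
      originalRequest: { method: 'GET', url: '{{baseUrl}}/1/genotypes/demo_profile/' },
      status: 'OK',
      code: 200,
      header: [{ key: 'Content-Type', value: 'application/json' }],
      body: JSON.stringify({ id: 'demo_profile', genotypes: [{ location: 'rs3094315', call: 'AA' }] })
    }
  ]
};
```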


API Evangelist

Being Flexible With Authorization When It Comes To Multiple APIs Within A Single API Collection

16 Dec 2019

I am working on a Postman collection that deploys an API to AWS. I’m pulling the OpenAPI from Postman using the Postman API API (mind blown), and then publishing the API to AWS as an API using the AWS API Gateway API (mind blown again). As part of this process I also need a DynamoDB instance to use as a persistent data store behind the API, which I will create using the DynamoDB API. I need all of these capabilities organized within a single Postman collection, but because of the need to authenticate with multiple API services I will be organizing each capability by AWS service so I can set the authorization for each folder, and let each individual API request inherit from the folder, otherwise I will have to set authorization on each individual API request while working—I abstract away the variables I use across the authorization as part of a Postman environment, but I still want to logically think through how I can apply authorization across services. When defining Postman collections you can apply the authorization at the collection, folder, or request levels. This allows you to be more thoughtful about how you authenticate across multiple APIs within a single Postman collection. This Postman collection is going to end up being what I’d consider to be a workflow collection, meaning it will walk through each step for the deployment of an API to AWS using Postman, so eventually it most likely will just be a series of individual API requests which can be run manually by a user, or automated with a Postman runner or monitor. However, as I am architecting my collection I don’t want to have to define the authorization for each individual request—I just want them to inherit authorizations, so I am just going to add a folder for each service. This gives me the ability to set authorization for Postman at the header level for an individual request, which I will move...[Read More]
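
In collection JSON terms, the folder-level inheritance looks roughly like this hedged sketch. The variable names are placeholders that live in a Postman environment, and the awsv4 parameters shown are the standard ones Postman's AWS Signature auth expects:

```javascript
// Folder sketch (collection format v2.1): every request inside the item
// array inherits this folder's AWS Signature auth unless it overrides it.
const dynamoFolder = {
  name: 'AWS DynamoDB',
  auth: {
    type: 'awsv4',
    awsv4: [
      { key: 'accessKey', value: '{{awsAccessKey}}', type: 'string' },
      { key: 'secretKey', value: '{{awsSecretKey}}', type: 'string' },
      { key: 'region',    value: '{{awsRegion}}',    type: 'string' },
      { key: 'service',   value: 'dynamodb',         type: 'string' }
    ]
  },
  item: [] // the CreateTable, PutItem, and other requests go here
};
```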


API Evangelist

API Observability Is More Than Just Testing And Monitoring

16 Dec 2019

API observability is something I have written about for a while now after learning about it from Stripe. It is a phrase that has grown popular in API testing, monitoring, and performance circles. Borrowed from the physical world of control systems, observability is a measure of how well internal states of a system can be inferred from knowledge of its external outputs. I am all about getting on the API observability train when it comes to monitoring of our systems, but if you’ve read my work you know I am all about expanding the definition to not just include the technical, but also the business and politics of API operations. One of the key aspects of observability is using the outputs or exhaust from the existing services and tooling used to operate your system. To help increase API observability within the enterprise I am always on the hunt for the existing services and tooling that are in place, so that I can better understand what the existing outputs are. If a service or tool is already in place within the enterprise, and we can tap into existing outputs, the chances for successfully changing behavior significantly increase. One tool that is ubiquitous across enterprise organizations is Postman, which provides a whole wealth of opportunity when it comes to tapping into the existing outputs to provide more observability into what is going on across the API life cycle. 90% of the Postman usage within the enterprise I come across occurs in isolation. One developer working with internal and external APIs within Postman. This occurs hundreds, or thousands, of times over within medium and large enterprise organizations. Developers are profiling APIs, building requests, and saving them into collections with very little communication, coordination, and sharing of API awareness across teams. While it represents a pretty concerning trend across enterprise organizations where leadership has so little visibility into what teams are working on, it also represents a pretty significant opportunity for leadership to take...[Read More]


API Evangelist

Believing The Technology Startup Hype And Refusing To See Anything Else

14 Dec 2019

I’ve been in numerous discussions with defenders of the Instructure Canvas platform after posting the Instructure LMS data points. These folks are blindly and passionately defending Instructure as a force for good, founded by good people, and being offended that I would see beyond the static representation of the startup they are demanding that everyone see and believe. I find it endlessly fascinating how we as a society continue to believe the storytelling around startups, and receive the marketing they put out as some sort of truth that will play out forever into the future. Even more dangerously, these people don’t just believe, they actively police the front-line of critics who are shining a light on what is really going on, doing the bidding of not just startups and their investors, but the capitalist machine. First, let me quickly disarm some of the standard tactics folks will use to come back at me on this piece. No, I am not anti-startup. I work for one, and have worked for many others. No, I am not anti-investor, I know lots of investors, advise them regularly, and derive most of my income from venture capital. This does not mean I am always walking the line and curbing my criticism of the bad that is perpetuated by startups and investors across the landscape. People who feel the need to blindly defend technology are one of the reasons why there are so many bad actors on the stage, and why people are able to get away with the shenanigans they are pulling. Critics and whistleblowers are one of the forces that help keep exploitative companies in check, and minimize the damage they cause. I’m not saying all critique is constructive or helpful, but I am saying that if you are actively pushing back on the critics and not listening to what they are saying, you are most likely part of the problem. To recap for those who are just jumping in--Instructure,...[Read More]


API Evangelist

Remember That An Application Is Not Just About Someone Building A Web or Mobile Application With Your API

13 Dec 2019

I encounter regular waves of API providers who are discouraged with the traffic to their API portal as well as the number of developers who are actually building something on top of their API. Many suffer from the hangover of “if you build it they will come” syndrome. Believing that if they just publish their APIs, developers will just show up and build amazing things. While many of us evangelists and advocates have over-promised amazing outcomes when it comes to publishing APIs, many of us in the trenches have long been honest about the hard work it takes to make your APIs something developers will want to use. Just publishing your APIs to a developer portal is not enough. Having a well designed and documented API is not enough. Making enough noise so that people find your API is a full time job, and ideally it is done by a whole team of people—study how Twilio has done it if you need a working example. Also, you have to regularly re-evaluate what the possibilities are when it comes to building or developing “applications”. This isn’t the API ecosystem of a decade ago where we focused on building just widgets and mobile applications. There are many more ways in which people can put your APIs to work in 2019, and you should be making time to understand what those possibilities are. The chance that some new developer will randomly find your API and build the next killer mobile application is pretty slim, and the real world implementations are probably going to be much more mundane and granular. The important thing to remember about the word “application” is that it does not necessarily mean a web, mobile, or device application—it can simply mean “applying” your API. Which in a world of integration platform as a service (iPaaS), bots, voice, and other use cases, applying your API could mean many different things. Don’t expect that all of your API...[Read More]


API Evangelist

The Postman Business Model Is In Alignment With Enterprise Objectives

12 Dec 2019

One of the things that became very clear during my conversations with folks at AWS re:Invent last week is that Postman’s revenue model is in alignment with what is needed within the enterprise. To help explain, let’s answer the question I got over and over at re:Invent—how does Postman make money? Steady waves of folks would show up at our booth in the re:Invent expo hall, and usually after a couple minutes of talking about their FREE usage of Postman, and how ubiquitous the tool is at their organization, they would inquire about our pro and enterprise offerings—which are all about helping enterprise organizations get more organized when it comes to doing APIs. The Postman Pro and Enterprise offerings are all about scaled usage of the platform, which includes the number of collections, users, teams, workspaces, and the collaboration, automation, and orchestration across them. All the core Postman features are free, and will remain free—developers love Postman because of its utility, and we do not intend to mess with that. Paying for Postman means you are getting more organized about how you manage users, collections, environments, teams, and workspaces, as well as using more monitors, runners, mocks, and documentation. While more usage doesn’t always mean an organization is doing things in a more logical fashion, Postman enterprise features center around the organized governance of users, teams, workspaces, collections, and environments, steering enterprise customers towards optimizing how things get done. Having observability into all of your teams delivering and consuming APIs is the most important thing you can be investing in as part of your enterprise operations—the Postman enterprise tier is centered around charging for collaboration, automation, and scaling your teams using a tool they are already using, which increases observability across your API operations. This is why I am working for Postman. I am regularly conflicted about how the companies I rely upon tier their pricing and scale the business side of what they...[Read More]


API Evangelist

Focusing On Digital API Capabilities Over Just Doing APIs

12 Dec 2019

As I work on creating more useful Postman collections I am distilling my API definitions down to the smallest possible units I can. While I have many robust reference Postman collections and OpenAPIs, I am enjoying creating Postman collections that accomplish just a single thing—representing each digital capability that I have. Currently my digital capabilities are spread across a number of servers, GitHub repositories, and Postman workspaces. If I use one of the APIs in my long list of API providers it is pretty common that I use less than 5% of the API paths from each individual provider. So, other than for sharing as part of my API Evangelist research, why do I need to wade through entire reference API collections to get at the one or two capabilities I need to actually use? I’m slowly working through the stack of APIs that I use, pulling out the different capabilities I put to work as part of my API Evangelist work, defining them as single Postman collections that I list on my GitHub capabilities page. I have published two of the Twitter API capabilities I have defined, which I will be expanding on pretty quickly, helping document all of the essential API calls I make to the Twitter platform. Twitter Tweet Search - A basic search for Tweets on Twitter by query. Twitter Account Status - Doing a lookup on users and returning only the fields that describe their status. I have probably another 50 individual Twitter platform capabilities I am putting to work across my platform. I am going to list them all out here, and then begin documenting how I put each of these capabilities to work. Then I’m going to evaluate whether there is any opportunity for me to scale each capability using Postman monitors, helping me automate the orchestration of Twitter data across my operations. Next, I got to work defining a handful of GitHub capabilities I put to use on a regular...[Read More]


API Evangelist

Automatically Generate OpenAPI For Your APIs Just By Using Them

12 Dec 2019

I am a big fan of tools that just augment our normal existence and make our lives easier without having to do much additional work. One of the tools that fits into this category is Optic, an open source tool that will generate OpenAPI definitions from your proxied traffic. A while back I developed my own script for doing this by processing Charles Proxy files synced with Dropbox, but I never evolved it beyond Swagger when OpenAPI 3.0 was released. So I was pleased to talk with the Optic team at API World in San Jose a while back. Like many notes in my notebook, my thoughts on Optic got buried by the constant flow of conversations and ideas coming in on a daily basis, but a Tweet from them the other day reminded me that I wanted to showcase and talk a little more about what they are up to and why Optic is important to the API sector. Optic will take your browser, CLI, and Postman API traffic and automatically generate an OpenAPI based upon your API calls that are routed through the Optic proxy. Helping us automate the generation of machine readable API contracts which can then be used across our API operations. The generation of OpenAPI from the exhaust of our existing work is valuable, but what also grabs my attention is that Optic helps handle the diffs between each OpenAPI generation, which can be used to help you detect changes in APIs that are already in use. As I said, I have had this capability for a while now, and it is something you can do within Postman—turning on a proxy and generating a Postman collection. But having a standalone open source component that produces OpenAPI contracts is pretty critical for helping us make sense of our API exhaust at scale. Optic’s core feature is generating OpenAPIs and managing the diff between each...[Read More]
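
Optic handles the diffing for you, but the core idea is simple enough to sketch generically. This is not Optic's actual algorithm, just a minimal illustration of comparing the operation sets of two OpenAPI documents:

```javascript
// Report which operations appeared or disappeared between two OpenAPIs.
function operationSet(openapi) {
  const ops = new Set();
  for (const [path, methods] of Object.entries(openapi.paths || {})) {
    for (const method of Object.keys(methods)) {
      ops.add(method.toUpperCase() + ' ' + path);
    }
  }
  return ops;
}

function diffOpenAPI(previous, current) {
  const before = operationSet(previous);
  const after = operationSet(current);
  return {
    added:   [...after].filter((op) => !before.has(op)),
    removed: [...before].filter((op) => !after.has(op))
  };
}

// Example: diffOpenAPI(lastWeeksSpec, todaysSpec) flags new or retired endpoints.
```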


API Evangelist

We Will Not Convince Everyone To Go API Design First

11 Dec 2019

I believe in going API first. I think it provides a positive step for development teams. I think it is one that makes sense for most enterprise organizations. But if I have learned anything in the last decade of working with APIs, it is that I rarely get what I want, and people don’t always do what is right and what makes sense. I have gotten a lot of pushback from developers, API providers, and service providers regarding the notion that code first API delivery is bad, as well as the idea that API design first is good. For me, the real challenge here isn’t about selling folks on one approach or the other, it is more about injecting more stakeholders and communication into the process earlier on in the evolution of APIs, rather than waiting until they are baked into production before iterating upon the design of the interface. There are a lot of existing folks who are trained to deliver code across the enterprise landscape. I’ve heard repeatedly from folks that they are a programmer, and not a templater, artifactor, or API designer. I get it. We have a lot of momentum occurring based upon the way things have been historically, and I don’t doubt that it will be difficult to change our behavior. The challenge here lies around understanding how much of the pushback on API design first is purely about being resistant to change versus there being multiple ways to tackle the delivery of an API. I feel pretty confident about there being multiple ways to actually deliver an API, and I don’t care if you are mocking it, delivering via a gateway, with a framework, or hand pounding your artisan APIs on the forge in the workshop. I just care that there is as much light on the overall process, as many stakeholders as possible involved, and there is a feedback loop around what the design of the APIs should...[Read More]


API Evangelist

My Thoughts On Amazon EventBridge Schema Registry And Discovery

10 Dec 2019

My friend Fran Méndez (@fmvilas) over at AsyncAPI asked me to share my opinions on Amazon’s EventBridge schema registry and discovery, which is in preview. It is looking to be a pretty critical add-on to Amazon EventBridge, which provides a serverless event bus that connects application data from your own apps, SaaS, and AWS services. Event-driven approaches to APIs are growing in popularity for a number of reasons, which is only increasing the need for us to get our schema house in order, making solutions like the schema registry for EventBridge pretty valuable to general API operations. I haven’t taken EventBridge for a test drive, so all of my thoughts are purely superficial, but at first glance it looks like something that can have a meaningful impact on how people make sense of the schema we have floating around. I think there will be some key elements that will make or break a solution like the schema registry, something Amazon is already thinking about with their code generation from the schema registry. Here are some of the initial thoughts I am having as I read through the announcements and documentation around EventBridge and the schema registry. JSON Schema Generation - The auto publishing, diffing, versioning, discovery, and evolving of JSON Schema representations for all schema in use will be pretty critical in making the registry tangible. Protocol Buffers - There will need to be easy generation and conversion of Protocol Buffers as part of the process. I don’t see that EventBridge supports gRPC or Protocol Buffers, but it was a thought I was having. AsyncAPI Generation - The schema catalog should automatically generate and version AsyncAPI specifications for all the schema, and ultimately the channels and topics being defined as part of EventBridge. Tagging - Being able to tag schema, organize them, and discover them based upon an evolving taxonomy that helps make sense of the expanding schema landscape will...[Read More]
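To ground what a registry full of JSON Schema gets you, here is a small hypothetical sketch of putting one of those schema to work: validating an event payload before publishing it to a bus. It uses Ajv, a common JavaScript JSON Schema validator; the schema and event are made-up examples, not anything pulled from EventBridge:

```javascript
// Ajv is a widely used JSON Schema validator for JavaScript.
const Ajv = require("ajv");
const ajv = new Ajv();

// A made-up schema of the kind a registry might store for an event.
const orderCreatedSchema = {
  type: "object",
  required: ["orderId", "total"],
  properties: {
    orderId: { type: "string" },
    total: { type: "number" }
  }
};

const validate = ajv.compile(orderCreatedSchema);

// Validate a sample event payload before it ever hits the event bus.
const event = { orderId: "1234", total: 19.99 };
console.log(validate(event) ? "valid event" : validate.errors);
```

Multiply this by every producer and consumer on an event bus and you can see why the discovery, versioning, and diffing of these schema matters so much.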


API Evangelist

Abstracting Away API Response Complexity With Postman Visualizer

10 Dec 2019

I was creating a Postman collection for validating the status of Twitter users, where I was looking to extract a subset of data from the very verbose Twitter API response for a Tweet lookup. There is a lot of data contained within a single Tweet JSON response, and all I was looking for was just a handful of the fields. I thought this would be a great opportunity to show off the new Postman visualizer, where you can display the API response for each request however you want. To get started I crafted an API request for the Twitter lookup API path, allowing me to pass in up to 100 Twitter user handles, and return a JSON response for all the Twitter users I want to check in on the status of—leveraging Postman to authorize and see the details of the API response. This response has the data I need, but looking through the entirety of the JSON response is a lot more than I can ask of many of the people I will be sharing the collection with. I’m going to be sharing it with mostly non-developers, hoping to provide them with a quick way to check the status of various Twitter users, and wading through the JSON is unacceptable, so I used the new Postman visualizer to render an HTML list of only the data I wanted. The Postman visualizer allows me to pull only the fields I need and publish them as HTML to the visualizer tab. Providing a more human readable view of the Twitter lookup API response, making the Twitter API more accessible to developers and non-developers who are looking for a quick way to validate the status of one or many Twitter users. To make this happen, all I did was add a test script to the API request, adding a little JavaScript which takes the JSON response, loops through each user being returned, and retrieves only the fields...[Read More]
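For anyone who wants to replicate this, here is a minimal sketch of what such a test script can look like. The template and field choices are my own illustration (screen_name, name, and followers_count are fields on the Twitter user object), so treat it as a starting point rather than the exact script from my collection:

```javascript
// Runs in the Tests tab of the Postman request. The Postman visualizer
// supports Handlebars templates out of the box.
const template = `
  <ul>
    {{#each users}}
      <li><b>{{screen_name}}</b>: {{name}} ({{followers_count}} followers)</li>
    {{/each}}
  </ul>`;

// Reduce each verbose Twitter user object down to just the fields we need.
const users = pm.response.json().map((user) => ({
    screen_name: user.screen_name,
    name: user.name,
    followers_count: user.followers_count
}));

// Render the HTML list in the Visualize tab of the response pane.
pm.visualizer.set(template, { users: users });
```

Anyone running the request then just clicks the Visualize tab and gets a readable list instead of a wall of JSON.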


API Evangelist

Validating Twitter Users Using The Twitter API Without Writing Code

09 Dec 2019

I was asked on Twitter if I had any code for pulling the status of Twitter users. Since I am the API Evangelist, and Chief Evangelist at Postman, I tend to prefer sharing Postman collections rather than actual language specific code. It is easier for me to craft a single Postman request that accomplishes a specific purpose, then share it as a template in the Postman API network, than it is to write code. Plus, any user can then execute it on their own within their local Postman client. To satisfy this request, and demonstrate how Postman collections work, I created one for looking up the status of a list of Twitter handles, verifying the state of each individual Twitter user—providing only the basic information you need to validate one or many different Twitter accounts. Before you can use this API collection you will have to download the Postman application, then set up your own Twitter application so that you can make calls to the Twitter API--do not worry, it is painless, and something even a non-developer can do. When filling out the form, all you need to do is give your app a name, description, and website URL, and tell them how it will be used. You can ignore the rest of the fields. Once the application is added you can obtain your access tokens by clicking on the keys and tokens tab. The first time you create your application you will have to regenerate your access token and access token secret, and then you will need all four tokens to authenticate (don't worry, those aren't my real tokens). Once you have your own tokens, go back to your Postman application and click on the Twitter Lookup Postman collection, and edit its details. Once the edit window pops up, select the Authorization tab, then select OAuth 1.0, choose to add your auth data to request headers from the dropdown boxes, then add your own...[Read More]


API Evangelist

Twitter API Authorization Using Postman

09 Dec 2019

I just created a new Postman collection for validating Twitter users via the Twitter API. As part of the Postman collection documentation and tutorial I included the steps for authorizing with the Twitter API. This is something that can easily be a hurdle for developers, and will definitely run most non-developers off. In reality, setting up your own Twitter API application, then copying your API tokens and using them in a Postman collection is something anyone can accomplish. This is an authorization workflow that I will be referencing in many different Twitter API tutorials and stories on the blog, so I wanted to have it as a standalone URL that I could easily share with anyone. Before you can make any call to the Twitter API you will need to have four application tokens, which you can only obtain via your own Twitter developer account. The first step of this process is to set up a Twitter developer account, which is different from your regular account, and can be done via the Twitter developer portal. Once you have a Twitter developer account you can visit the application listing page, and choose to create a new application in the top corner, or manage any existing application you have already added in the past. Allowing you to manage the details of your access to the Twitter API. While creating an application there are a number of details you will need to consider, but to jumpstart your API integration all you will need is a name, description, and website URL, and to tell Twitter how this app will be used. You can always edit these settings at any point in the future, so do not worry too much about them when getting started. Once you have created your Twitter application you can visit the keys and tokens tab to obtain your consumer API keys as well as the access token and access token secret. Providing the four tokens you will need to actually authorize...[Read More]
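Once those four tokens are in hand, they end up stored on the collection as an OAuth 1.0 auth block. Here is an approximation of what that block looks like in the underlying collection format; this is a hand-written sketch rather than an export, and the values reference placeholder Postman variables rather than real tokens:

```javascript
// Approximate shape of a collection's auth block after selecting
// OAuth 1.0 and choosing to add auth data to request headers.
{
  "auth": {
    "type": "oauth1",
    "oauth1": [
      { "key": "consumerKey", "value": "{{consumer_key}}", "type": "string" },
      { "key": "consumerSecret", "value": "{{consumer_secret}}", "type": "string" },
      { "key": "token", "value": "{{access_token}}", "type": "string" },
      { "key": "tokenSecret", "value": "{{token_secret}}", "type": "string" },
      { "key": "addParamsToHeader", "value": true, "type": "boolean" }
    ]
  }
}
```

Keeping the tokens in Postman variables like this means the collection itself can be shared without ever exposing your actual credentials.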


API Evangelist

A Postman Meetup This Tuesday In Seattle

09 Dec 2019

I am all recovered from being at AWS re:Invent all week in Las Vegas, and gearing up for a Postman meetup in Seattle this Tuesday. I am stoked to be doing an event on my home turf, but I am also pretty happy with the lineup. I am going to be kicking things off with a quick look at Postman collections, then I have Tableau and NGINX giving some talks, followed by a quick closing with a look at the Postman visualizer--here is the lineup for Tuesday night's goings on. Postman API Collections Kin Lane (@kinlane), Chief Evangelist @ Postman You save your Postman requests as collections each day, but have you learned about all the ways in which collections can be applied? Let’s move beyond just reference collections for every endpoint in your API and make collections reflect the real world business use cases your APIs solve. Pushing your Postman collections to be more intuitive and useful for your developers, helping on-board them with the possibilities while also documenting what your APIs can do, providing portable, shareable, machine readable, and executable representations of what your APIs deliver. How Tableau uses Postman to enable developers Geraldine Zanolli a.k.a Gigi (@illonage), Developer Advocate @ Tableau Tableau, well-known in the Business Intelligence industry for building great data visualizations, also offers a set of APIs and Developer Tools that allow developers to integrate, customize, automate, and extend Tableau to fit the specific needs of their organization. Learn how Tableau uses Postman to give developers an interface to make their first API request. The NGINX API Gateway Liam Crilly (@liamcrilly), Director of Product Management @ NGINX APIs are changing the way we build applications and changing the way we expose data, both inside and outside our organizations. But what is the most efficient and effective way to deliver these APIs? That’s the job of the API gateway. In this session, we will look at different deployment patterns for...[Read More]


API Evangelist

We Will Not Discuss APIs Without A Postman Collection

02 Dec 2019

I heard a story this morning while having breakfast with someone at the Venetian before I made my way to the re:Invent registration counter, which reminded me of the now infamous myth about the secret to Amazon’s API success. I can’t mention the company involved because they are pretty confident they’d never get approval to tell this story publicly, but as I so often do, I still feel it is worth telling even in an anonymous way. Teams at this company were having such a problem coherently discussing the details of APIs with other internal groups that they made a rule that they will not talk with other teams about any API without a Postman collection being present (ha, Postman mediator) to facilitate the conversation. There have been several stories on this blog about the problems with emailing API responses between teams, and sending Microsoft Word documents with XML or JSON responses embedded in them. If you work within the enterprise you know that this is a common way to share API responses, get guidance, ask questions, and generally discuss the details of each API being put to use. Imagine if all of this was banned, and if you had a question about the details of making an API request or parsing the response, it was mandatory to provide a Postman collection of each API request and response in question. Ensuring that ALL the details of the request, with a real-life example of the response, were present before any discussion would commence. Talk about a game changer when it comes to making sure people were on the same page when it came to discussing some very abstract concepts. Ensuring team members are all on the same page when it comes to what an API is, let alone the endless number of details regarding query parameters, headers, and authentication, takes a lot of work. Even if all stakeholders in a conversation are...[Read More]


API Evangelist

Mock AWS Services Using Postman Collections With Examples

02 Dec 2019

As I create each of the 50+ Postman collections for AWS services I am always striving to establish as complete a collection as I possibly can—this includes having examples for each API request being defined. This is something that is easier said than done, as there are many different ways in which you can set up your AWS infrastructure, and work with your infrastructure using AWS APIs, but nonetheless, I still strive to ensure there is an example saved as part of each Postman collection. While this helps me better define each request, there are numerous benefits from having API examples, and one of the most beneficial of these is being able to generate mock APIs from the AWS Postman collections I’m publishing. Taking a look at the Postman collection I have published for Amazon DynamoDB, I have managed to save examples for each of the API requests documented as part of the AWS reference Postman collection. This makes it so anyone can run the Postman collection within their own Postman platform account, and then generate a mock server for the Amazon DynamoDB API. Allowing developers to develop against the API without actually having to use a live AWS account, or have the proper infrastructure and permissions set up, making it quicker and easier to jumpstart the development of desktop, web, and mobile applications. Allowing developers to publish their own mock servers for AWS services, and save time and money when it comes to their AWS budget. I can envision developing AWS Postman collections that are complete with examples derived from specific AWS infrastructure deployments. Tailoring a specific setup and configuration, then making API requests to the AWS APIs needed for orchestrating against these existing infrastructure configurations, and saving the examples returned from each API response. Essentially taking a snapshot of an existing AWS setup across multiple services, then making that snapshot available as a mocked set of AWS APIs that return the responses developers are needing...[Read More]
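Mock servers can be created from the Postman UI, but they can also be spun up programmatically. Here is a rough sketch of doing it through the Postman API; the collection UID is a placeholder, and the exact request and response fields reflect my reading of Postman's API documentation, so double check before relying on it:

```javascript
// Assumes Node.js with axios installed and a POSTMAN_API_KEY env var.
const axios = require("axios");

async function createMock(collectionUid, name) {
  // Create a mock server backed by the saved examples in the collection.
  const { data } = await axios.post(
    "https://api.getpostman.com/mocks",
    { mock: { collection: collectionUid, name: name } },
    { headers: { "X-Api-Key": process.env.POSTMAN_API_KEY } }
  );
  return data.mock.mockUrl;
}

// Placeholder UID; use the uid of the DynamoDB collection in your workspace.
createMock("1234567-abcd-ef01-2345-6789abcdef01", "Amazon DynamoDB Mock")
  .then((url) => console.log("Mock server:", url))
  .catch((err) => console.error(err.message));
```

Point your application at the returned mock URL and every request that matches a saved example will get that example back as the response, no live AWS account required.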


API Evangelist

gRPC's Potentially Fatal Weakness

02 Dec 2019

I was reading an article on Microsoft's DevBlog about gRPC vs HTTP APIs. It makes the usual arguments of how gRPC compares with HTTP APIs. While the arguments for gRPC are definitely compelling, I find the weaknesses of gRPC at this moment in time even more interesting, for two reasons: 1) they are something we can overcome with the right tooling and services, and 2) they reflect our challenge between the human and machine readability of all of this, which many of us technologists really suck at, leaving me concerned whether or not we will be able to get this right—as I think we underestimated this characteristic of HTTP APIs, and have missed the full potential of this opportunity even as we are faced with this next step. Here is what was said in the blog post, highlighting two distinct weaknesses of gRPC, but which I view as more about systemic illnesses in the wider view of the API landscape, and our inability to understand the important role that humans play in all of this: Limited browser support gRPC has excellent cross-platform support! gRPC implementations are available for every programming language in common usage today. However one place you can’t call a gRPC service from is a browser. gRPC heavily uses HTTP/2 features and no browser provides the level of control required over web requests to support a gRPC client. For example, browsers do not allow a caller to require that HTTP/2 be used, or provide access to underlying HTTP/2 frames. gRPC-Web is an additional technology from the gRPC team that provides limited gRPC support in the browser. gRPC-Web consists of two parts: a JavaScript client that supports all modern browsers, and a gRPC-Web proxy on the server. The gRPC-Web client calls the proxy and the proxy will forward on the gRPC requests to the gRPC server. Not all of gRPC’s features are supported by gRPC-Web. Client and bidirectional streaming isn’t supported, and there is limited support...[Read More]
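To illustrate the browser workaround the post describes, here is roughly what a gRPC-Web call looks like from the client side, following the pattern in the gRPC-Web hello world example. The message and service names come from stubs generated by protoc, so this sketch assumes you have already generated helloworld_pb.js and helloworld_grpc_web_pb.js and have a gRPC-Web proxy running locally:

```javascript
// Generated stubs (assumed): produced by protoc with the gRPC-Web plugin.
const { HelloRequest } = require("./helloworld_pb.js");
const { GreeterClient } = require("./helloworld_grpc_web_pb.js");

// The client talks to the gRPC-Web proxy, not the gRPC server directly.
const client = new GreeterClient("http://localhost:8080");

const request = new HelloRequest();
request.setName("API Evangelist");

// Unary call; client and bidirectional streaming are not supported here.
client.sayHello(request, {}, (err, response) => {
  if (err) return console.error(err);
  console.log(response.getMessage());
});
```

The extra proxy hop and the generated stubs are exactly the kind of tooling overhead that keeps gRPC out of reach for the casual, human-readable exploration that HTTP APIs make so easy.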


API Evangelist

I Am Heading To Vegas For AWS re:Invent

01 Dec 2019

I’m sitting in the Seattle airport waiting for my flight to Las Vegas. I’m heading to AWS re:Invent, spending the entire week talking everything APIs with the masses at the flagship conference. It is my 3rd time in Vegas for re:Invent, but my first time exhibiting with such a strong brand—Postman. Like years before, I will be there to talk to as many people as I can about how they are delivering APIs, and to learn about the challenges they face when consuming APIs. However, this year I won’t just be there as a representative for API Evangelist—this year I am also there to talk about the role Postman plays in the delivery and consumption of APIs. To get fired up for the conference I’ve spent the last couple of weeks developing Postman collections for as many AWS APIs as I could—I had set 25 services as my target, and managed to get a little more than 50 separate services defined as reference Postman collections. I learned a lot throughout the process, loading up a whole lot of details about common AWS API infrastructure into my brain, and priming me for the conversations I will be having at re:Invent. Helping me not just think deeply about AWS services, but also about how Postman can be used to work with AWS APIs. These Postman reference collections are just the foundation for my API understanding, API conversations, and other ways of considering how AWS APIs impact how we deliver applications in the cloud. While the AWS Postman collections help jumpstart anyone's usage of AWS, I’m also looking at how to use them to actually define, deploy, manage, and evolve APIs that operate on AWS. AWS APIs have long fascinated me, and have played a significant role in my evolution as the API Evangelist. In 2020 I’m still keen on learning from AWS as an API pioneer, but I am more interested in learning how we can...[Read More]


API Evangelist

I Am Happy I Chose The Term Evangelism

27 Nov 2019

There is always a lot of discussion around the proper term to use for describing what it is we all do when it comes to getting the word out about our APIs. Some of us use the word evangelism, while others prefer advocacy, relations, or being an ambassador or champion. Sometimes it is focused on APIs, but other times it is focused on developers or other aspects of what we are looking to highlight. While there are many “announced” reasons why we evangelize, advocate, and champion, the real honest reason is always that we want to bring attention to our products and services. Sure, in some cases we are interested in educating and supporting developers, but really all of this is about bringing attention to whatever we are peddling—me included. I didn’t grow up religious—the opposite actually. I never went to a church ceremony until I was an adult. So the term evangelism doesn’t carry a lot of baggage for me. However, I do fully understand that it does for many other people. Even though I didn’t go to church, I did grow up around enough people who were very religious to understand the meaning evangelism can bring to the table. Early on in doing API Evangelism I felt somewhat bad for using this term, and felt like I was bringing a whole lot of unnecessary meaning and baggage to the table as I was trying to “enlighten” folks of the API potential. Now, after a decade of doing this, I’m happy I chose the term evangelism, because I feel it best represents what it is I do in this technology obsessed world we have created for ourselves. Technology is the new religion for many people. You know what two of the fastest growing areas of APIs are? Blockchain and churches. When you have so many people blindly believing in technology, it begins to look and smell a lot like religion. When you embark...[Read More]


API Evangelist

Bulk Updating My Postman Collections Using The Postman API

27 Nov 2019

I had recently pulled all of the AWS Postman collections I have created and spread across Postman workspaces. After creating over 50 AWS Postman collections I learned some things along the way, and realized I needed to update the variable I use for the baseURL of each API. But I had already created all my collections, and updating these variables manually would take me hours, if not days. So I got to work writing a script that would pull the latest JSON for each collection, conduct a find and replace on the baseURL variable, replacing it with a service specific base URL, and then write each collection back using the Postman API. This is something that would take me many hours to update across 50+ collections and nearly 1000 individual requests, but is something that I could accomplish in less than an hour with the Postman API. Once again, when I can’t get what I need done quickly in the Postman UI, I can quickly get things done using the Postman API. This is how it should be. I don’t expect the Postman UI to keep pace with all of my needs. I like Postman as it is, carefully plodding forward, adding features that make sense to as wide an audience as possible. I always know that I can get at what I need through the API, and automate the changes I need. In this case, I’m able to rapidly make updates at scale across many different API collections, relying on Postman to help me manage API definitions manually through the interface and in many different automated ways via their API. I am still getting my bearings when it comes to managing the variables I use across my many Postman collections. I am rapidly iterating upon how I name my variables for maximum flexibility within Postman environments, and where I apply them within my Postman collections. This is something that requires a lot...[Read More]
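Here is a rough sketch of what that script boils down to. It assumes Node.js with axios installed, a POSTMAN_API_KEY environment variable, and placeholder collection UIDs and variable names, so treat it as an outline of the approach rather than my exact script:

```javascript
const axios = require("axios");
const headers = { "X-Api-Key": process.env.POSTMAN_API_KEY };

// Replace the generic baseURL variable with a service specific one,
// e.g. {{baseURL}} -> {{dynamodbBaseURL}} (names are placeholders).
async function updateBaseUrl(uid, newVariable) {
  // Pull the latest JSON for the collection.
  const { data } = await axios.get(
    `https://api.getpostman.com/collections/${uid}`, { headers });

  // Find and replace across the entire collection document.
  const patched = JSON.parse(
    JSON.stringify(data.collection).split("{{baseURL}}").join(newVariable));

  // Write the updated collection back via the Postman API.
  await axios.put(
    `https://api.getpostman.com/collections/${uid}`,
    { collection: patched }, { headers });
}

updateBaseUrl("1234567-abcd-ef01-2345-6789abcdef01", "{{dynamodbBaseURL}}")
  .then(() => console.log("collection updated"));
```

Serializing the whole collection to a string for the find and replace is crude, but it reaches every request, folder, and variable in one pass, which is the whole point of the exercise.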


API Evangelist

So You Wanna Build An iPaaS Solution

26 Nov 2019

I’m getting more emails and DMs from startups doing what I’d consider to be integration platform as a service (iPaaS) solutions. These are services that help developers or business users integrate using multiple APIs. Think IFTTT or Zapier. I’ve seen many waves of them come, and I’ve seen many waves of them go away. I’m always optimistic that someone will come along and make one that reflects my open API philosophy and can still make the revenue to support itself. So far, Zapier is the closest one we have, and I’m sure there are others, but honestly I’ve grown weary of test driving new ones, and I am not as up to speed as I should be on what’s out there. When it comes to iPaaS providers the bar is pretty high to convince me that I should be test driving your solution, and why I should support you in the long run. This is partly from just having supported so many services over the years, only to have them eventually go away. It is also because of the new problems introduced for consumers by abstracting away the complexities of APIs, rather than joining forces with API providers to educate and fix these complexities. I’m always willing to talk with new iPaaS providers that come along, but I have a few requirements I like to put out there, which usually separates the people who end up reaching out and engaging from those who do not. Due Diligence - Make sure you are test driving and reviewing as many existing iPaaS platforms as you possibly can, because there are a lot of them, and the more you kick the tires on, the more robust and valuable your solution will be. API Definitions - Your solution needs to be API definition driven, adopting OpenAPI, Postman collections, and existing formats for defining each of the APIs you are integrating with, as well as...[Read More]


API Evangelist

Pulling All My Postman Collections Using The Postman API

26 Nov 2019

I needed access to all of the AWS Postman collections I am building. The problem is they are distributed across multiple different workspaces. I had organized over 50 AWS Postman collections based upon the resource they were making available. Now I just wanted a list of all of them, so I could report on what I have done. It sounded like a good idea to group them by resource at first, but now that I needed to work with all of them in a single list, I’m thinking maybe not. So I got to work pulling all of my collections from the Postman API and filtering out any collection that wasn’t from AWS. I find it easy to get caught up in what features are available to me via the interface of the services and tooling I use, letting the UI define what is possible. This is why I only use services and tooling that have APIs if I can help it—the API is always the relief valve that allows me to get done what I need. In this case, while the Postman UI is very robust, I couldn’t get everything I needed done with it in the time period required, so I switched to the API, and was able to programmatically get at the data I needed. Allowing me to pull all of my collections from across workspaces, then organize and generate exactly the list of collections I needed for a specific report I’m working on. While talking with folks about Postman I regularly encounter individuals who speak about the limitations of the product, stating they couldn’t use it to accomplish something because it didn’t do this or that. Without ever considering that they could accomplish it via the API. Personally, I am impressed at how thoughtful Postman has been about adding new features, helping minimize the complexity and bloat of the platform. This is why I expect platforms to have APIs, so that I can get...[Read More]
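The pull itself is a few lines against the Postman API. Here is a quick sketch, assuming Node.js with axios, a POSTMAN_API_KEY environment variable, and that my AWS collections can be spotted by an "AWS" prefix in the name (the filter is my own convention, not anything in the API):

```javascript
const axios = require("axios");

// List every collection across all workspaces the API key can see.
async function listAwsCollections() {
  const { data } = await axios.get("https://api.getpostman.com/collections", {
    headers: { "X-Api-Key": process.env.POSTMAN_API_KEY }
  });
  // Keep only the collections whose names mark them as AWS collections.
  return data.collections.filter((c) => c.name.startsWith("AWS"));
}

listAwsCollections().then((collections) => {
  collections.forEach((c) => console.log(`${c.name} (${c.uid})`));
  console.log(`${collections.length} AWS collections total`);
});
```

From there it is easy to sort, group, or dump the list into whatever shape the report calls for.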