The API Evangelist Blog

API Evangelist

The Open Source Community Tooling Built on API Blueprint

11 Jun 2020

Last on my list of API specifications to profile is API Blueprint. Once one of the more promising API specifications leading the API design first charge, there hasn't been a whole lot of forward motion post-acquisition. However, API Blueprint still provides a pretty compelling view of the landscape, and is worth tuning in to understand what is happening. Here are the API Blueprint open source tools I harvested from the GitHub API, organized by the stop along the API life cycle they are serving.

Specification
- api blueprint - (forks: 2095) (stars: 7969) (watchers: 7969) - api blueprint

Generators
- aglio - (forks: 489) (stars: 4480) (watchers: 4480) - an api blueprint renderer with theme support that outputs static html
- laravel blueprint docs - (forks: 27) (stars: 204) (watchers: 204) - api blueprint renderer for laravel, customizable via blade templates

Toolchains
- snowboard - (forks: 100) (stars: 641) (watchers: 641) - api blueprint toolkit

Converters
- api spec converter - (forks: 91) (stars: 616) (watchers: 616) - convert api descriptions between popular formats such as openapi(fka swagger), raml, api blueprint, wadl, etc.
- apiary2postman - (forks: 25) (stars: 188) (watchers: 188) - tool for generating a postman collection from blueprint api markup or the apiary api
- blueman - (forks: 18) (stars: 143) (watchers: 143) - convert a generated api blueprint json file into a postman collection
- swagger2blueprint - (forks: 9) (stars: 98) (watchers: 98) - convert swagger api descriptions into api blueprint
- pmtoapib - (forks: 1) (stars: 33) (watchers: 33) - tool to convert postman collection exports to api blueprint documentation
- postman2apiary - (forks: 10) (stars: 22) (watchers: 22) - tool for generating blueprint api markup or the apiary api from a postman collection.
- apib2json - (forks: 10) (stars: 14) (watchers: 14) - a command-line utility for convert api blueprint to json schema

Mocking
- api mock - (forks: 63) (stars: 492) (watchers: 492) - creates a mock server based on an api blueprint
- drakov -...[Read More]



The Open Source Community Tooling Built on gRPC

09 Jun 2020

I am finally getting time to map out the diverse API toolbox landscape I see unfolding lately. I am aggregating the most used open source projects across the leading API specifications in use. While gRPC is more of a style or pattern than it is a specification, it is a fast growing standard. I'll be doing a separate analysis of Protocol Buffers, but I wanted to look at the cream on top of what is being built within the gRPC community. Revealing almost a hundred open source tools, loosely organized by what they deliver as part of the API life cycle.

Specification
- grpc - (forks: 6625) (stars: 26420) (watchers: 26420) - the c based grpc (c++, python, ruby, objective-c, php, c#)

Implementations
- grpc - (forks: 6625) (stars: 26420) (watchers: 26420) - the c based grpc (c++, python, ruby, objective-c, php, c#)
- rpcx - (forks: 772) (stars: 4675) (watchers: 4675) - a zero cost, faster multi-language bidirectional microservices framework in go, like alibaba dubbo, but with more features, scale easily. try it. test it. if you feel it's better, use it! 𝐉𝐚𝐯𝐚有𝐝𝐮𝐛𝐛𝐨, 𝐆𝐨𝐥𝐚𝐧𝐠有𝐫𝐩𝐜𝐱!
- surging - (forks: 860) (stars: 2875) (watchers: 2875) - surging is a micro-service engine that provides a lightweight, high-performance, modular rpc request pipeline. the service engine supports http, tcp, ws, grpc, mqtt, udp, and dns protocols. it uses zookeeper and consul as a registry, and integrates it. hash, random, polling, fair polling as a load balancing algorithm, built-in service governance to ensure reliable rpc communication, the engine contains diagnostic, link tracking for protocol and middleware calls, and integration skywalking distributed apm
- iris - (forks: 2035) (stars: 18281) (watchers: 18281) - the fastest community-driven web framework for go. grpc, automatic https with public domain, mvc, sessions, caching, versioning api, problem api, websocket, dependency injection and more. fully compatible with the standard library and 3rd-party middleware packages. | https://bit.ly/iriscandothat1 | https://bit.ly/iriscandothat3 |
- grpc web - (forks: 295) (stars: 2995) (watchers: 2995) - grpc web implementation for golang and typescript
- tonic - (forks: 167) (stars: 2166)...[Read More]



The Open Source Community Tooling Built on GraphQL

09 Jun 2020

I have done several dives into the world of GraphQL. As part of some API specification work I am now getting another chance to look at what the open source community around GraphQL looks like. Along with other API specifications in our modern API toolbox, I am looking at how GraphQL is being leveraged, and what the motivations are behind the open source tooling that has emerged. The result is the following list of the cream off the top of the open source tooling built on top of GraphQL, broken down by different stops along the API life cycle.

Formatting
- prettier - (forks: 2366) (stars: 36727) (watchers: 36727) - prettier is an opinionated code formatter.

Server Code
- strapi - (forks: 3163) (stars: 25979) (watchers: 25979) - 🚀 open source node.js headless cms to easily build customisable apis
- parse server - (forks: 4325) (stars: 17541) (watchers: 17541) - api server module for node/express
- graphql js - (forks: 1553) (stars: 16196) (watchers: 16196) - a reference implementation of graphql for javascript
- graphql engine - (forks: 1428) (stars: 17060) (watchers: 17060) - blazing fast, instant realtime graphql apis on postgres with fine grained access control, also trigger webhooks on database events.
- apollo server - (forks: 1391) (stars: 9707) (watchers: 9707) - 🌍 graphql server for express, connect, hapi, koa and more
- graphql - (forks: 557) (stars: 6454) (watchers: 6454) - an implementation of graphql for go / golang
- api platform - (forks: 669) (stars: 5853) (watchers: 5853) - rest and graphql framework to build modern api-driven projects (server-side and client-side)
- graphene - (forks: 612) (stars: 5758) (watchers: 5758) - graphql framework for python
- graphql yoga - (forks: 366) (stars: 5763) (watchers: 5763) - 🧘 fully-featured graphql server with focus on easy setup, performance & great developer experience
- express graphql - (forks: 483) (stars: 5319) (watchers: 5319) - create a graphql http server with express.
- graphql ruby - (forks: 936) (stars: 4297) (watchers: 4297) - ruby implementation of graphql
- graphql java - (forks: 781) (stars: 4267) (watchers: 4267) - graphql java implementation
- graphql php -...[Read More]



The Open Source Community Tooling Built on Swagger

08 Jun 2020

I am finally finding time to pick up some old work quantifying the open source that has risen up around API specifications. I just pulled all the GitHub repos you get when you search for "Postman" and "OpenAPI", and now I wanted to do "Swagger". I'm looking to evaluate the cream off the top of what is going on in each of these buckets, but also eventually evaluate the long tail of what is going on. I've been trying to understand the evolution of Swagger 2.0 to OpenAPI 3.0 from a tooling perspective for some time now--this is me getting a handle on what is happening. Here are the top repositories you get when you search for "Swagger" on GitHub, roughly broken down by the stops along the API life cycle they are serving.

Deployment
- fastapi - (forks: 942) (stars: 14208) (watchers: 14208) - fastapi framework, high performance, easy to learn, fast to code, ready for production
- connexion - (forks: 542) (stars: 3150) (watchers: 3150) - swagger/openapi first framework for python on top of flask with automatic endpoint validation & oauth2 support
- surging - (forks: 860) (stars: 2875) (watchers: 2875) - surging is a micro-service engine that provides a lightweight, high-performance, modular rpc request pipeline. the service engine supports http, tcp, ws, grpc, mqtt, udp, and dns protocols. it uses zookeeper and consul as a registry, and integrates it. hash, random, polling, fair polling as a load balancing algorithm, built-in service governance to ensure reliable rpc communication, the engine contains diagnostic, link tracking for protocol and middleware calls, and integration skywalking distributed apm
- light 4j - (forks: 468) (stars: 2789) (watchers: 2789) - a fast, lightweight and more productive microservices framework
- loopback next - (forks: 499) (stars: 2810) (watchers: 2810) - loopback makes it easy to build modern api applications that require complex integrations.
- swag - (forks: 386) (stars: 2668) (watchers: 2668) - automatically generate restful api documentation with swagger 2.0 for go.
- scalatra - (forks: 337) (stars: 2454) (watchers: 2454) - tiny scala high-performance, async web...[Read More]



The Open Source Community Tooling Built on Postman

08 Jun 2020

I am finally finding time to pick up some old work quantifying the open source that has risen up around API specifications. I am pulling all of the open source tooling available on GitHub when you search for "Postman". A portion of this is open source by Postman, others are collections built by API providers helping developers on-board more quickly, but there is another set of tooling that builds on top of the concept of the Postman collection as a specification. Providing an interesting look at what developers are wanting when it comes to integrating the Postman platform into their operations. I have a longer list of everything, but here is the cream off the top.

Documentation
- postmanerator - (forks: 69) (stars: 470) (watchers: 470) - a http api documentation generator that use postman collections
- docgen - (forks: 52) (stars: 335) (watchers: 335) - transform your postman collection to html/markdown documentation
- docodile - (forks: 23) (stars: 54) (watchers: 54) - generate html api documentation from a postman collection
- docman - (forks: 18) (stars: 47) (watchers: 47) - a simple page to generate documentation from postman collections
- Postdown - (forks: 14) (stars: 36) (watchers: 36) - generate markdown api document from postman.
- postman2doc - (forks: 5) (stars: 25) (watchers: 25) - convert postman collection.json to markdown/html/docx.

Educational
- All Things Postman - (forks: 86) (stars: 304) (watchers: 304) - a selection of examples using postman rest client

Conversion
- apiary2postman - (forks: 25) (stars: 188) (watchers: 188) - tool for generating a postman collection from blueprint api markup or the apiary api
- API Flow - (forks: 19) (stars: 181) (watchers: 181) - universal data structure and converter for api formats (swagger, raml, paw, postman…)
- blueman - (forks: 18) (stars: 143) (watchers: 143) - convert a generated api blueprint json file into a postman collection
- api spec converter - (forks: 76) (stars: 108) (watchers: 108) - this package helps to convert between different api specifications (postman, swagger, raml, stoplight).
- swagger2 to postman - (forks: 46) (stars: 92) (watchers: 92) - converter for...[Read More]



The Open Source Community Tooling Built on OpenAPI

08 Jun 2020

I am finally finding time to pick up some old work quantifying the open source that has risen up around API specifications. I am pulling all of the open source tooling available on GitHub when you search for "OpenAPI". I just published the same assessment of searching for "Postman", but since Postman's API builder is centered around OpenAPI, it makes sense to do the same for OpenAPI. I'm looking to develop an understanding of many of the tooling providers listed here, but I am also looking to understand what developers are needing when it comes to OpenAPI. I have gone through the cream off the top of the search for "OpenAPI" on GitHub and here is what I have come up with so far.

Specifications
- OpenAPI Specification - (forks: 6456) (stars: 17676) (watchers: 17676) - the openapi specification repository

Parser
- swagger parser - (forks: 96) (stars: 604) (watchers: 604) - swagger 2.0 and openapi 3.0 parser/validator
- kin openapi - (forks: 111) (stars: 503) (watchers: 503) - openapi 3.0 implementation for go (parsing, converting, validation, and more)
- oas kit - (forks: 84) (stars: 436) (watchers: 436) - convert swagger 2.0 definitions to openapi 3.0 and resolve/validate/lint
- openapi.tools - (forks: 114) (stars: 174) (watchers: 174) - a collection of editors, linters, parsers, code generators, documentation, testing

Validator
- swagger parser - (forks: 96) (stars: 604) (watchers: 604) - swagger 2.0 and openapi 3.0 parser/validator
- kin openapi - (forks: 111) (stars: 503) (watchers: 503) - openapi 3.0 implementation for go (parsing, converting, validation, and more)
- oas kit - (forks: 84) (stars: 436) (watchers: 436) - convert swagger 2.0 definitions to openapi 3.0 and resolve/validate/lint
- openapi cop - (forks: 10) (stars: 325) (watchers: 325) - a proxy that validates responses and requests against an openapi document.
- express openapi validator - (forks: 51) (stars: 253) (watchers: 253) - 🦋 auto-validates api requests, responses, and securities using expressjs and an openapi 3.x specification
- openapi.tools - (forks: 114) (stars: 174) (watchers: 174) - a collection of editors, linters, parsers, code generators, documentation, testing

Generators...[Read More]



Setting Up API Broker Workspaces

08 Jun 2020

I had begun playing around with the concept of API brokers back in 2014, and it is something that is recurring and evolving in a handful of the conversations I am having in the Postman ecosystem lately. API brokering is the concept that, instead of developers directly engaging with all of the public APIs they will need for an application, a professional API broker could discover, sign up, set up applications, and aggregate documentation, client libraries, and other essential items for developers. So all an application developer has to do is fire up a workspace, and they have all the docs, code, keys, and other elements ready to go for them to just start building. Saving an organization time and money by outsourcing much of the tedious work involved with discovering APIs, on-boarding with them, and preparing to build an API, letting organizations focus on what they do best. Most of my talk in the past has been conceptual around being an API broker, but now with Postman I can actually do it for realz. I can take the many different workspaces of API collections I have and use them to populate client-specific workspaces. Meaning I can set up a workspace specifically for companies I know that want quick access to a variety of APIs--without the friction of on-boarding themselves. Here are the building blocks I’ve established for my own definition of the API broker / client relationship.

- User - I am on the business plan, so I add a new slot for each of my clients, paying for their team license as part of the overall value that I am delivering to them. They use this user account whenever they are engaging with the API I am maintaining for them. I can apply appropriate roles, and terminate access whenever the relationship ends.
- Workspace - I set up a single workspace for my client, allowing them to access only the API, collections, environments, tests, and...[Read More]



Writing and Working in a COVID-19 #BlackLivesMatter Uprising Storm

05 Jun 2020

Business is anything but usual these days. I have a lot of time on my hands when it comes to writing and working online, but the reality in the chair is anything but easy. When I sit down to tackle even the most basic of tasks I can usually make it through about 50% of the work before I feel drained, and left with a blank screen for a brain. In addition to COVID-19 and the #BlackLivesMatter uprising, we also lost the kid this month. Making for a swirling emotional mess of a reality that really isn’t conducive to doing much beyond just reading a book or watching a movie. Looking through my notebook there are numerous half-finished, or even complete, stories about APIs I could be publishing, but my blank screen of a brain can’t even properly edit them, let alone grok and finish many of the API stories I was pushing forward. It is difficult for me on the best day to conjure up some storytelling about APIs here on the blog. With that said, it also can be useful to lose myself thinking about some technical topics if I can find ways to convince myself that they are meaningful in these times. Technology is no replacement for direct human action, but the forces we are up against are actively wielding API-driven technology against us, and pushing back on this has always been what API Evangelist is all about. It isn’t about just saying the Twitter API can be used for social justice, it is also about demonstrating to folks that the Twitter API is also being used to surveil and manipulate us. Pulling back the curtain on how all of this works is the cornerstone of what API Evangelist does, and I’m determined to find ways of doing this in service of the #BlackLivesMatter uprising. This is one of those “getting back on the horse” posts. Where I just practice using my...[Read More]



The Building Blocks of API Sharing and Collaboration

05 Jun 2020

I am tasked with defining what sharing and collaboration means when it comes to API operations at work. I have never had a tool like Postman to help me define how we work as a team across an organization. Normally, I just do a lot of hand waving and say, “you do it like this”, and call it good. With Postman, I now have an opportunity to anchor what I am talking about using the current state of the Postman platform, as well as help shape the future by influencing the Postman road map. To help me prepare for more coherent storytelling on the Postman blog, and as part of other conversations, I want to work my way through what the core features of Postman are when it comes to sharing and collaboration. There are many levels to how you can share and collaborate within Postman, which will have different impacts on the ground within enterprise organizations. To help me map Postman functionality to the real world of API operations, I wanted to break down the different levels of sharing and collaboration that exist within Postman.

Team Level

The essential ingredient of API sharing and collaboration is a team. You just can’t effectively share and collaborate at the provider level without a team. You can share and collaborate in spontaneous ways amongst API provider stakeholders, as well as API consumer stakeholders, but with a well-defined team of stakeholders you can take sharing and collaboration to new levels. Here are a couple of the core Postman features that facilitate sharing and collaboration across API operations.

- Create a Team - The act of creating a team in Postman, and upgrading your account to support the size and scope of your team, is the first thing you can do to help lay a foundation that facilitates sharing and collaboration within a team, but also between teams.
- Invite to a Team - The process of inviting someone to...[Read More]



What Layer Are You Used to API Complexity Being At?

22 May 2020

I’d say one of the most controversial aspects of the world of APIs involves where people are used to, and prefer, dealing with API complexity. After you look at thousands of APIs you begin to better understand where people introduce complexity into the design of an API, and as more people become familiar with any single approach they are much more likely to stick with it over time, bake it into their APIs, and passionately believe it is the normal pattern that everyone else is, and should be, using. You can see what I am talking about at the heart of the microservices debate, where API complexity is dealt with at the API level, or in the number of APIs you have—something many increasingly are finding to be unacceptable. Another way you can see the API complexity debate unfolding across the API space is within GraphQL, where the complexity is at the schema level, and believers truly think that this is where API complexity should live. I try not to be too prescriptive on where API complexity should lie, because the reasons why you want to increase or decrease the scope and complexity of an API will vary widely depending on what you are delivering with the API, the organization where it is being produced, and who will be consuming it.

First, What is API Complexity?

I feel like it is important to set the stage with what is complex. To provide a complicated view of what complexity can be, the definition of complexity is, “the state or quality of being intricate or complicated”, with the definition of complicated being, “consisting of many interconnecting parts or elements; intricate”, and the definition of intricate being, “very complicated or detailed”. ;-)  Here are some of the questions that nag me as I think about API complexity. Is complexity in each individual API instance necessary? Is complexity in each individual API...[Read More]
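The contrast between complexity living in the number of APIs versus in the schema can be sketched in a few lines. This is purely illustrative — the endpoint paths, types, and fields are hypothetical, not from any real API:

```python
# Where does the complexity live? A purely illustrative sketch.
# Microservices spread complexity across many fine-grained endpoints:
rest_endpoints = [
    "GET /orders",
    "GET /orders/{id}",
    "GET /orders/{id}/items",
    "GET /customers/{id}/orders",
]

# GraphQL concentrates complexity in the schema and in each query made
# against a single endpoint (hypothetical types and fields):
graphql_query = """
{
  customer(id: "123") {
    orders { id items { sku quantity } }
  }
}
"""

print(len(rest_endpoints), "REST endpoints vs 1 GraphQL endpoint")
```

Neither shape removes the complexity; it only moves where you and your consumers have to deal with it.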



Where is the API Value?

08 May 2020

I had some thoughts bouncing around in my head the last couple days about where the value in this whole API game actually resides. When it comes to the types of APIs I am seeing deployed, and where I see things heading in the near future, I wanted to try and map out where the different moving parts of an API are, and assign value to each component. At first I just wanted to think more deeply about how Claude Shannon, the father of information theory, would see (or not see) the value in APIs. In 1948 Claude Shannon wrote in his Mathematical Theory of Communication:

The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. The system must be designed to operate for each possible selection, not just the one which will actually be chosen since this is unknown at the time of design.

In 2020, I feel like the message is still the center of value, but I would disagree about the meaning of those messages being irrelevant--that is another post. However, I would say that other dimensions have emerged in the 70+ years since Shannon wrote his theory. I would argue that Shannon's view of the message still holds true in 2020, but I just think he couldn't have imagined the many ways in which protocols, connections, formats, channels, and the network, performance, and volume of messages would influence the purpose, meaning, and ultimately value of each message, and messages in bulk. This is my first draft at trying to map out the moving parts. It is...[Read More]
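Shannon's framing — the actual message is one selected from a set of possible messages — is exactly what his entropy formula quantifies. A minimal sketch using the standard definition H = -Σ p·log2(p); the probability distributions here are made-up examples, not real API traffic:

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) of a message distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely API responses carry 2 bits of information per message.
uniform = [0.25, 0.25, 0.25, 0.25]
print(entropy_bits(uniform))  # 2.0

# A skewed distribution (one response dominates) carries less information,
# which hints at why message selection, not just delivery, is where value lives.
skewed = [0.97, 0.01, 0.01, 0.01]
print(round(entropy_bits(skewed), 3))
```

The system, as Shannon says, has to be designed for every possible selection, which is why the shape of the whole message set matters more than any single response.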



API Spec-First Development

07 May 2020

I am fleshing out ideas from a couple of recent conversations around API life cycle religion and philosophy. We’ve made our way through several lofty ideologies around how you should or shouldn’t do APIs, and next up in the queue is API specification first, or API spec first development. I like the phrase. I like it a lot. However, I am afraid that if we aren’t being precise in what we mean by it we might be sacrificing a more meaningful usage of it for delivering a certain class of API that is meant for reuse. This is another one of my pedantic API definition stories, but since it is meant to help me use my words better in these conversations I am having, and secondarily for a wider reading here on the blog, you are going to have to bear with me as I work through my feels about API spec first development (should there be any dashes in there?). To help guide me in my work, I am using my API reference implementation “Union Fashion” as a foundation for this narrative. I’ve been calling what I am doing with Union Fashion API-first, but honestly I’m just looking to build out the best reference implementation that I can, showing how to do APIs thoughtfully—I am not trying to be prescriptive about there being one way to do APIs, or one phrase to describe it. It just doesn’t exist. However, I do like peeling back the layers of the onion until I cry. When it comes to Union Fashion I might be agreeable to calling what we are doing API spec first development, if we are using it in two possible ways. Let’s use the Union Fashion products API to help break things down and see if we can’t flesh out what we mean by API spec first development.

The Product API Began With Three Specifications

When I first started development of the Union Fashion product API...[Read More]



The Basics of Postman Role Based Access Control (RBAC)

05 May 2020

I am working up towards a loftier piece on the importance of RBAC to the API life cycle, and as part of my research I was going through all of the documentation Postman has for roles and permissions at the team, workspace, API, and collection levels. RBAC is one of those layers of the API discussion that touches on almost every other layer, making it an area you have to think about not just at the microscopic level within workspaces, but also at the macro level of organizational and team impact. I’m feeling like I am going to need to flesh out several dimensions of what RBAC is here on the blog, as well as the other critical factors that influence RBAC across operations.

The Basic Building Blocks of Postman RBAC

To get a handle on things I wanted to think deeply about the core building blocks of Postman RBAC, and consider the roles, as well as the objects in which access is being controlled, but also the downstream effects of how RBAC gets applied or not. When you browse the Postman Learning Center under roles and permissions you get the following breakdown:

Team Roles
- Admin: manage team members and team settings
- Billing: manage team plan and payments
- Developer: access team resources and workspaces

Workspace Roles
- Admin: manage workspace details and members
- Collaborator: work on team resources in a workspace

API Roles
- Editor: edit APIs directly
- Viewer: view, fork, and export APIs

Collection Roles
- Editor: edit collections directly
- Viewer: view, fork, and export collections

That represents nine individual roles, defined by the API resources in which they are applied. Postman does a good job of breaking out the control each role has in each of the respective areas, but the overall effectiveness of RBAC will be determined by how solid of a strategy you have for defining and managing each of these areas—meaning you are going to have a strategy in place,...[Read More]
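The nine roles above can be sketched as a simple lookup table. This is a minimal illustration of the breakdown, not Postman's actual data model or API — the permission strings are just the capability summaries from the Learning Center breakdown:

```python
# A sketch of the Postman role breakdown: four scopes, nine roles total.
ROLES = {
    "team": {
        "admin": "manage team members and team settings",
        "billing": "manage team plan and payments",
        "developer": "access team resources and workspaces",
    },
    "workspace": {
        "admin": "manage workspace details and members",
        "collaborator": "work on team resources in a workspace",
    },
    "api": {
        "editor": "edit APIs directly",
        "viewer": "view, fork, and export APIs",
    },
    "collection": {
        "editor": "edit collections directly",
        "viewer": "view, fork, and export collections",
    },
}

def can(scope, role):
    """Return the capability a role grants at a given scope, or None."""
    return ROLES.get(scope, {}).get(role)

print(can("collection", "viewer"))
```

Even a toy table like this makes the macro-level point: the same role name ("admin", "editor") means something different at each scope, which is why a strategy has to cover all four areas.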



Learn by API

05 May 2020

I was playing around with my co-worker Sue Smith’s Learn by API project today, and found it to be a pretty powerful usage of Postman for not just teaching users about Postman, but also teaching them about healthy practices when it comes to API design. The Learn by API Postman collection provides an interesting building block for development of other introductory API concepts, as well as API design concepts in service of a formal API governance strategy across an organization. I recommend you just click on the Run in Postman button for the Learn by API collection to better understand the potential, but I’d also like to break down what she did to help illustrate how it could be used as an API education and training blueprint.

Learning Collection

The Learn by API collection is a single request wrapped in a portable Postman collection, allowing anyone (with Postman installed) to download it into their desktop client and execute it. Teaching anyone (developers or non-developers) about the fundamentals of Postman, API design, as well as how the web works in general. To begin learning, all you do is click send in Postman once you have imported the collection, and you’ll be introduced to the next step. The collection introduces each user to how Postman makes requests, and how APIs are designed, by showing the mechanics of how the web is working behind our web and mobile applications. It also really shows how Postman is all about helping you understand APIs by peeling back the layers of HTTP requests and responses in a hands-on way.

Visualizer Experience

Once you have clicked send you see the response body as JSON, however if you click on the Visualize tab you are given an HTML view of the tutorial you have initiated. The collection helps you understand what you just set into motion by sending the request, fusing the tutorial with the response of the request you initiated. Immediately immersing you...[Read More]



HHS and CMS Finalize Rules to Provide Patients More Control of Their Health Data Using APIs

05 May 2020

I have had a pretty massive API story in my notebook for a couple of weeks now that I just didn’t have the emotional bandwidth to process, but eventually I’m finding the energy to think about APIs at this scope. The TL;DR is that the US Department of Health & Human Services (HHS) finalized a historic rule to provide patients more control of their health care using APIs--from the HHS announcement: “ONC’s final rule establishes secure, standards-based application programming interface (API) requirements to support a patient’s access and control of their electronic health information. APIs are the foundation of smartphone applications (apps). As a result of this rule, patients will be able to securely and easily obtain and use their electronic health information from their provider’s medical record for free, using the smartphone app of their choice.” I wanted to understand what this means for healthcare APIs. I’ve gone down the Blue Button API rabbit hole several times before, and I have an intimate awareness of how much time is involved with just loading all the moving parts into your head. However, this is why I write on API Evangelist, to help me process big ideas like this through ongoing API storytelling, and whittle away at projects I may not have time for in waves. I finally made time to do a first dive into this monumental precedent in the US federal government directing healthcare providers to deploy APIs, loading it all up in my head so I can chew on it for a bit.

The CMS Interoperability and Patient Access Final Rule

The technical meat of this can be found within the CMS interoperability and patient access final rule. Helping me better understand what is being asked of healthcare providers, and who the providers even are. At this point the rule sounds promising because of its scope, but really the devil is in the details--the CMS interoperability and patient access final rule document(s) provide: Information and tools...[Read More]



Blog Posts to Work Through My API Task List

05 May 2020

I would say today reflects the purpose of API Evangelist in my world. Helping me get through the work I have on the table, while expanding my awareness of what is going on in the world of APIs. Most people think my blog is for them, and it is, but first and foremost it is about me working through my ideas and projects. I haven’t been feeling like writing much during the current situation we find ourselves in, but this morning I was needing some help getting to some items on my task list that have been sitting a little too long. Resulting in three posts here on the blog, and me remembering why I do this, which always helps me find renewed energy in my work. I have about 10 different items I was looking to accomplish today, but three of them made for pretty worthy stories that helped me better understand what is going on, while also pushing me to articulate my ideas to other people. Here is the result of me clearing three items from my task list today:

- Learn by API - I was impressed by a new type of collection my co-worker at Postman came up with, making learning about APIs and Postman a hands-on experience. I am going to adopt her approach for my Union Fashion API reference project.
- The Basics of Postman Role Based Access Control (RBAC) - Refreshing my memory of what is possible with Postman RBAC, so that I can pull together a compelling story about how Postman RBAC benefits our enterprise customers.
- HHS and CMS Finalize Rules to Provide Patients More Control of Their Health Data Using APIs - Digging deeper into HHS and CMS finalizing a rule for payment providers to deploy Blue Button compliant APIs to give patients more access to their data.

This process reflects how I like to work. It represents how I have managed to move my API research forward...[Read More]


API Evangelist

The Quickest Way To Make An Idea for an API Usable By Others

04 May 2020

I have used Postman in a handful of webinars and talks recently to demonstrate how you can quickly go from idea to a tangible, usable API. My goal is to equip myself with a way I can quickly demonstrate an API I have in my head to someone else in five minutes or less. This is a walkthrough of that process, hopefully showing you a quick way to be more effective in how you share and collaborate around APIs, allowing anyone, even non-developers, to make their abstract ideas a little more tangible and real for others. Oakland Restaurants JSON To demonstrate how you can quickly deploy an API using Postman I’ve compiled a list of some of the restaurants in Oakland (where I live) that are still open for take-out and delivery. To demonstrate what is possible I have added four restaurants to a JSON file, providing me with what I’d like to see for my API response. This JSON will become the response for my new API, providing a list of restaurants for Oakland. However, with an eye to the future I will be looking to create separate JSON responses for Berkeley and other surrounding cities, breaking up my API into more usable chunks. New API Request in Postman To launch my new restaurants API I am opening up Postman, creating a new request for the API I am wanting to share with others, adding a path resource named restaurants with a city parameter with a value of Oakland, allowing me to break things up by city in the near future. Save Request as a Collection Before I can actually put my new imaginary API to work I will need to save the request as part of a collection, helping provide me with a way to organize my API, but also begin to make it more tangible and executable as a Postman collection. Taking the first step towards making my API idea a...[Read More]
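To make the restaurants example a little more concrete, here is a minimal sketch of the kind of JSON response and city filter the post describes. The restaurant names and fields below are hypothetical stand-ins, not the actual data from the post's JSON file.

```python
# Hypothetical restaurant records, standing in for the JSON file the post
# describes as the response body for the new API.
restaurants = [
    {"name": "Example Taqueria", "city": "Oakland", "takeout": True},
    {"name": "Example Pho House", "city": "Oakland", "delivery": True},
    {"name": "Example Pizzeria", "city": "Berkeley", "takeout": True},
]

def list_restaurants(city):
    """Mimic GET /restaurants?city={city} by filtering the payload."""
    return [r for r in restaurants if r["city"] == city]

print(len(list_restaurants("Oakland")))  # Oakland entries only
```

Splitting the data by a city parameter up front is what lets the later Berkeley and surrounding-city responses slot in without changing the request shape.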


API Evangelist

Using Postman Workspaces and GitHub Repositories Together

01 May 2020

I find that it helps to have defined boundaries for your APIs. If you have the resources and interest I recommend studying subjects like domain driven design. Investment in properly defining the boundaries of your organization and lines of business is a very worthy endeavor—if you have the time. However, if you already have a lot on your plate, and are just looking for incremental changes that can help make your life easier when it comes to the API sprawl we’ve introduced into our worlds, I recommend just spending a few moments thinking about how you can better use Postman workspaces to divide and conquer your API landscape. I have begun using Postman workspaces in ways similar to how I use GitHub repositories. The GitHub repository has long represented a unit of API value in my world, but with the introduction of GitHub two-way sync into Postman I now have a one-for-one matchup between workspace and repository when it comes to moving each API forward. For my API-first e-commerce reference implementation Union Fashion I am currently developing five separate APIs, which each have their own Postman workspace and GitHub repository pairing. Products - (Workspace) (Repo) (Docs) - Defining all of the products that Union Fashion offers. Orders - (Workspace) (Repo) (Docs) - Allows for the ordering of Union Fashion products online. Baskets - (Workspace) (Repo) (Docs) - Allows shoppers to assemble baskets of Union Fashion products. Users - (Workspace) (Repo) (Docs) - Defines users who engage with the Union Fashion platform. Search - (Workspace) (Repo) (Docs) - Provides a universal search for products, orders, and users. Postman workspaces and GitHub repositories address many overlapping concerns along the API life cycle; however, I am finding that Postman workspaces are better suited for establishing a tighter team-level definition of who should have access and voice in moving an API forward, while GitHub is better for a more organization-wide, or public, level of engagement.
With both workspaces containing many of...[Read More]


API Evangelist

An API Deployment Narrative

01 May 2020

This is the narrative from one of the last webinars I did for Postman on my API-first e-commerce reference implementation Union Fashion. I always try to write what I am going to be saying during a webinar so that it is loaded up in my brain in a natural way—think Neo learning Kung Fu before sparring with Morpheus. Anyways, here is the narrative, with the accompanying video embedded below, helping share the narrative behind how I am building Union Fashion, trying to learn, grow, and expand how I use APIs out in the open, so that others can learn along the way. Jeff Jones, the VP of Engineering at Union Fashion, has been working to get more organized about how they deliver applications using APIs at the company. In our last webinar episode we looked at how Union Fashion was building and testing APIs, moving their operations towards an API-first way of delivering application infrastructure. The build and test planning session was all about ensuring Jeff could lay the foundation for organizational change at Union Fashion by making sure his teams had the proper planning, and everyone was in agreement on an API-first way of delivering APIs behind the web and mobile applications they were needing to run their business—investing in the following areas to help define how they build and test APIs before actually building them: Organizational - Making sure the team was well defined, including having dedicated workspaces for each API. APIs - Making sure that common API definitions are at the center of the conversation for each API being developed. Collections - Being very structured in how collections are defined and used to power different stops along the API life cycle. Environments - We made sure secrets and some key value pairs were abstracted away from each of the API collections. Mocks - Mock servers are published from all APIs, using a specific collection, making each API being developed more tangible....[Read More]


API Evangelist

API Deployment Collections - AWS API Gateway, Lambda, and RDS

21 Apr 2020

After over a decade of API evolution, API deployment is still much more of a dark art than it is something that ever sees the light of day. Sure, you can set up a pipeline for an API, making a known pattern a repeatable process, but there isn’t much consistency out there when it comes to repeatable patterns for use across many different APIs. Certain vendors are working on optimizing the API deployment cycle within the context of their platforms, but I’ve always wanted to see more open blueprints for how APIs can be deployed, designed for applying across a mix of APIs which are defined using OpenAPI. To push this conversation forward I began exploring what the common patterns for API deployment on the leading cloud providers would look like, and whether it was something I could accomplish using API infrastructure, which means I can do it using Postman. One API deployment blueprint I had on my list to develop was using AWS API Gateway, Lambda, and RDS, providing me with a quick, consistent, and repeatable way of deploying APIs that I could define as a Postman collection that can be shared and used by others. The result is a first draft of a Postman API deployment collection, which can be used to deploy an API to AWS using Postman, taking an existing OpenAPI and bringing it to life using common AWS solutions. Rather than outlining what I did as I normally would, I am going to play around with producing content in other mediums, and publish a video walkthrough of how my new AWS API Gateway, Lambda, and RDS API deployment Postman collection works—let me know what you think. My AWS API Gateway, Lambda, and RDS API deployment collection is still under development and being refined each week, but feel free to kick the tires and ask questions. I’ll be showcasing how the collection fits into the bigger picture of an...[Read More]
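To give a rough feel for the "OpenAPI in, AWS deployment out" idea, here is a small sketch that derives one Lambda function definition per OpenAPI operation. This is an illustration, not the collection's actual logic; the paths, operation IDs, role ARN, bucket, and handler names are all placeholders, and nothing here calls AWS.

```python
# Hypothetical OpenAPI fragment standing in for the spec the collection
# would read out of the Postman API builder.
openapi = {
    "paths": {
        "/products": {"get": {"operationId": "listProducts"}},
        "/orders": {"post": {"operationId": "createOrder"}},
    }
}

def lambda_definitions(spec, role_arn="arn:aws:iam::123456789012:role/demo"):
    """Build one function-definition dict per OpenAPI operation.

    The keys mirror the kwargs a Lambda create-function call expects,
    but no AWS API is invoked here.
    """
    defs = []
    for path, methods in spec["paths"].items():
        for method, op in methods.items():
            defs.append({
                "FunctionName": op["operationId"],
                "Runtime": "nodejs12.x",
                "Role": role_arn,
                "Handler": "index.handler",
                "Code": {"S3Bucket": "my-deploy-bucket",
                         "S3Key": f"{op['operationId']}.zip"},
            })
    return defs

for d in lambda_definitions(openapi):
    print(d["FunctionName"])
```

The same walk over `paths` could just as easily emit API Gateway routes or RDS table names, which is what makes an OpenAPI-driven deployment collection repeatable across APIs.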


API Evangelist

API Deployment Collections - AWS API Gateway and DynamoDB

21 Apr 2020

After over a decade of API evolution, API deployment is still much more of a dark art than it is something that ever sees the light of day. Sure, you can set up a pipeline for an API, making a known pattern a repeatable process, but there isn’t much consistency out there when it comes to repeatable patterns for use across many different APIs. Certain vendors are working on optimizing the API deployment cycle within the context of their platforms, but I’ve always wanted to see more open blueprints for how APIs can be deployed, designed for applying across a mix of APIs which are defined using OpenAPI. To push this conversation forward I began exploring what the common patterns for API deployment on the leading cloud providers would look like, and whether it was something I could accomplish using API infrastructure, which means I can do it using Postman. One API deployment blueprint I had on my list to develop was using AWS API Gateway and DynamoDB, providing me with a quick, consistent, and repeatable way of deploying APIs that I could define as a Postman collection that can be shared and used by others. The result is a first draft of a Postman API deployment collection, which can be used to deploy an API to AWS using Postman, taking an existing OpenAPI and bringing it to life using common AWS solutions. Rather than outlining what I did as I normally would, I am going to play around with producing content in other mediums, and publish a video walkthrough of how my new AWS API Gateway and DynamoDB API deployment Postman collection works—let me know what you think. My AWS API Gateway and DynamoDB API deployment collection is still under development and being refined each week, but feel free to kick the tires and ask questions. I’ll be showcasing how the collections fit into the bigger picture of an enterprise API life...[Read More]


API Evangelist

Running and Organizing AWS Lambdas with Postman Collections

20 Apr 2020

I am auto-generating and manually producing a number of Postman collections lately. I am creating a Postman collection that autogenerates AWS Lambdas from an OpenAPI stored in the Postman API builder, as well as a handful of infrastructure AWS Lambdas that accomplish bigger picture items like creating a database in RDS, or zipping up AWS Lambda packages to deploy APIs. So, I have a lot more AWS Lambdas laying around that I am needing to organize and put to use. I find the first rule of AWS Lambda club is remembering you created the thing in the first place and remembering to actually put the thing to use—when you have so many of them laying around, you need a way to make them more discoverable, browsable, and usable in your everyday life, something Postman excels at. When it comes to deploying APIs with AWS infrastructure using a Postman collection, there were two things I couldn’t do purely with AWS APIs, which pushed me to create AWS Lambda functions that would get the job done. This demonstrated that I was going to be creating a growing number of Lambda functions that I was going to need to organize, retrieve, and execute regularly as part of my manual work, but also automated processes, beginning with these two functions: Create AWS RDS Aurora SQL Table - There is no way to create a table using the AWS RDS API, so I created an AWS Lambda function that would let me pass in the table name and fields using environment variables, mounting the database and creating the table I needed. Generate AWS Lambda Deployment Package - I was dynamically generating a lot of the code powering Lambdas, as well as the layers of Node.js dependencies they were using, so I created a Lambda script that would take files from an AWS S3 bucket and folder and generate a zipped up AWS Lambda deployment package. I then created myself a Postman...[Read More]
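The second function above, generating a zipped deployment package, can be sketched locally without the S3 plumbing. This is a simplified stand-in for what that Lambda does, assuming plain files and no dependency layers; the file name and contents are made up.

```python
import io
import zipfile

def build_deployment_package(files):
    """Zip a set of in-memory files into a deployment package.

    files: dict of {archive_name: bytes}. Returns the zip archive as
    bytes, the shape the Lambda create/update APIs accept for code.
    """
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in files.items():
            zf.writestr(name, data)
    return buf.getvalue()

# A one-file package, standing in for dynamically generated handler code.
package = build_deployment_package({
    "index.js": b"exports.handler = async () => 'ok';",
})
print(len(package))
```

The real function described in the post would read the files out of an S3 bucket and folder first, but the packaging step is the same idea.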


API Evangelist

The Layers of the API Specifications, Definitions, and Schema Onion

18 Apr 2020

I struggle with using the right words in my API storytelling. Striking a blend between what people are saying across the sector, and what people should be saying. There are many words and phrases in the space that help describe what it is they do, while there are others that confuse more than they describe anything in particular. Mostly I struggle because all of this API stuff can be complicated and very abstract, but also because I can be a little dyslexic at times, seeing some words as interchangeable, depending on what day it is. To help me (once again) think through the world of API definitions, I wanted to riff off of my talk from the AsyncAPI virtual conference this week and peel back the layers of the API specifications, definitions, and schema onion. The terms API specification, definition, and schema are often used interchangeably as part of API discussions, but there are some realities on the ground when working with these artifacts that can increase the friction across operations if we allow them to be used interchangeably. It is pedantic as hell to want to write a story about the nuance of these terms, but if it helps me be more precise in my work, I’ll do it. To help illustrate the dimensions here, I wanted to highlight the artifacts around the Slack API that I am using for my talk next week. Slack Web API OpenAPI - The OpenAPI for the Slack Web API defines the surface area of this single HTTP API instance. Slack Events API AsyncAPI - The AsyncAPI for the Slack Events API defines the surface of this real time HTTP API instance. These two artifacts describe the surface area of specific APIs, leveraging two open source API specification formats, but also adopting a third API specification format that these two specifications use to describe the underlying schema being used as part of the structure for request and response, message,...[Read More]
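A toy illustration of the onion's inner layer: one schema object reused by both an OpenAPI and an AsyncAPI document. The field names and versions below are invented for the sketch; the point is only that the schema layer is shared rather than owned by either specification.

```python
# A single JSON Schema style definition, the innermost layer of the onion.
message_schema = {
    "type": "object",
    "properties": {"text": {"type": "string"}, "channel": {"type": "string"}},
    "required": ["text"],
}

# The OpenAPI document describes an HTTP request/response surface...
openapi_doc = {
    "openapi": "3.0.0",
    "components": {"schemas": {"Message": message_schema}},
}

# ...while the AsyncAPI document describes an event-driven surface,
# yet both point at the very same underlying schema object.
asyncapi_doc = {
    "asyncapi": "2.0.0",
    "components": {"schemas": {"Message": message_schema}},
}

print(openapi_doc["components"]["schemas"]["Message"] is
      asyncapi_doc["components"]["schemas"]["Message"])  # True
```

Keeping the schema as its own artifact is what lets the specification layer change (HTTP today, events tomorrow) without redefining the data.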


API Evangelist

We Should Have Built an API First

15 Apr 2020

It has been a while since I wrote a simple breakdown of why APIs matter, but also why API-First matters. I went down the API-First rabbit hole with API-First [Design || Code] and API-First [Business], but I haven’t just made the basic case of why we should have built an API in the first place. It really isn’t a concept you fully grasp until you’ve made the very costly mistake of being API-Last several times over, so it makes sense to break things down into a single blog post to help folks [hopefully] learn from, without going down the same paths I’ve been down in my application development career. To help on-board folks with what I mean when I say API-First, let’s recap how we got here with a simple fictional product API story. One Product Catalog With Multiple Destinations Products are a ubiquitous resource in an online world. By 2000, if we were selling products in the real world, we also began needing to publish products to a website, which by 2005 would morph more into a mix of database-driven web applications. However by 2010, we also needed to have the same products available in our mobile applications, resulting in a handful of channels we need our products available on. Websites - Public websites that a product catalog needs to be published to for consumption. Web Applications - Specific web applications that need access to our product catalog. Mobile Applications - Making sure the product catalog is available via iPhone and Android phones. In 2020, these channels are morphing into a suite of single page and static applications that work across web and mobile properties, but the need for a single set of content or data to make available across these channels hasn’t changed. By 2010, most companies were realizing that HTTP APIs were the most effective way to deliver content and data across applications, and by 2020 this has emerged as the reality for...[Read More]
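The product catalog story above boils down to one source of truth feeding every channel, instead of each channel keeping its own copy. Here is a bare-bones sketch of that shape; the product data and channel renderers are hypothetical.

```python
# One catalog, maintained in one place.
products = [{"id": 1, "name": "T-Shirt", "price": 20.0}]

def get_products():
    """The single API every channel consumes."""
    return products

def render_web(items):
    """The website channel renders the same catalog as HTML."""
    return "".join(f"<li>{p['name']}</li>" for p in items)

def render_mobile(items):
    """The mobile channel renders the same catalog as JSON-ish payloads."""
    return [{"title": p["name"], "price": p["price"]} for p in items]

# Website, web app, and mobile app all read from the same source of truth.
print(render_web(get_products()))
print(render_mobile(get_products()))
```

Being API-Last means each channel grows its own `products` list; being API-First means `get_products()` exists before any of the renderers do.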


API Evangelist

Growth in the Number of API Collections Over the Last Five Years

15 Apr 2020

There are surprisingly few meaningful API numbers to showcase across the API sector. There are few API providers or API service providers who have a view of the landscape that can produce meaningful numbers, and most prefer to keep their numbers close to their chests for a variety of reasons. Over the last decade we've all grown accustomed to the ProgrammableWeb hockey stick chart showing the number of public APIs added to the PW directory, due to the ubiquitous graphic being used in conference talks and blog posts since before 2010. I've always been a big advocate of API service providers developing and sharing their data, something that hasn't diminished at Postman. Postman has a pretty unique view of the API landscape, possessing many interesting data points. The company is working on a strategy for compiling, organizing, publishing, and sharing the data it has in a thoughtful way, something that you can see trickling out through a variety of slides from recent events like the Postman Galaxy Tour, and Postcon last year. Like this one showing the growth in API collections, topping out at 34.9M collections published in 2020. The important thing with this visual is that it reflects both private and public APIs. This graphic is just one of many data points Postman possesses that I am working to encourage packaging up, so that others can reference and use, much like the ProgrammableWeb public API chart over the last decade. Visuals like this are important for all of us making sense of what is going on, and being able to truly see the scope of what is happening. I'd love to see other API service providers share their numbers, but like Postman, I understand it is difficult to do in a meaningful way that respects the privacy of your users, and the interests of your investors. I will keep pushing for more observability into the API community through the Postman lens, and hopefully produce some interesting visuals...[Read More]


API Evangelist

Real Time Email Notifications About API Deprecations Down the Road

13 Apr 2020

I got an email from GitHub after firing up an older Postman collection I had. The collection was originally engineered to just pass in a GitHub token using a query parameter, which historically has been accepted, but is something that will be going away soon. It makes sense, and while query parameters are much easier for authentication, using headers is just a more logical and secure way to pass your tokens in with each API call. The token usage itself isn't what caught my attention, what gave me pause was the usage of real time email to notify users of features they are currently using which will be going away in the future. Here is the email I got from GitHub about my usage of the deprecating access token query parameter:

Hi @kinlane,

On March 24th, 2020 at 03:55 (UTC) your personal access token ([TOKEN NAME]) using [USER AGENT] was used as part of a query parameter to access an endpoint through the GitHub API: https://api.github.com/search/repositories

Please use the Authorization HTTP header instead, as using the `access_token` query parameter is deprecated. If this token is being used by an app you don't have control over, be aware that it may stop working as a result of this deprecation.

Depending on your API usage, we'll be sending you this email reminder on a monthly basis for each token and User-Agent used in API calls made on your behalf. Just one URL that was accessed with a token and User-Agent combination will be listed in the email reminder, not all.

Visit https://developer.github.com/changes/2020-02-10-deprecating-auth-through-query-param for more information about suggested workarounds and removal dates.

Thanks,
The GitHub Team

I like this type of communication from API providers. I think this is a nice addition to any API management solution, where you could flag any element of an API, and when analytics reveals a developer using this feature, a transactional email is sent off to the user.
Allowing API providers to be more organized about how they plan for deprecations,...[Read More]
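For anyone making the same migration GitHub's email asks for, here is a sketch of the before and after. The token value is a placeholder, and nothing is sent over the network here; the point is simply where the secret travels.

```python
from urllib.parse import urlencode

token = "ghp_exampletoken"  # placeholder, not a real token

base = "https://api.github.com/search/repositories"

# Deprecated style: the token rides in the URL, where proxies,
# browser history, and server logs can all see it.
deprecated_url = f"{base}?{urlencode({'q': 'api', 'access_token': token})}"

# Preferred style: the token rides in the Authorization header instead.
preferred_url = f"{base}?{urlencode({'q': 'api'})}"
preferred_headers = {"Authorization": f"token {token}"}

print("access_token" in deprecated_url)   # True -- leaks into URLs and logs
print("access_token" in preferred_url)    # False
```

In a Postman collection the equivalent change is moving the token from a query parameter into the request's Authorization settings.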


API Evangelist

Establishing an API-First Reference Implementation

06 Apr 2020

I do a lot of API blah blah blah’ing about abstract technical concepts. Sometimes I am able to craft a coherent narrative around some complex technology goings on, but most of the time I am just practicing. Workshopping different concepts until I find one that will make an impact on the way folks see APIs. One of the challenges I face in my storytelling is that I operate too far into the abstract, and don’t make the rubber meet the road enough. Another challenge I face is going too far down the rabbit hole with a particular company’s implementation, which usually turns down the volume significantly on my storytelling, because most companies, organizations, institutions, and government agencies aren’t equipped to be 100% transparent about what they are doing. After a decade of storytelling exploration I find that operating somewhere in between the abstract and the real world is the best place to be, resulting in me desiring a reference implementation that I could use as an anchor for my storytelling, helping keep me grounded when it comes to how I talk about APIs. Welcome Union Fashion One of my co-workers at Postman had created a fictional company called Union Fashion when he started working on our solutions engineering team, but hadn’t put much more work into the project since. When I heard about it, it sounded exactly like what I was looking for. An e-commerce reference implementation that a wide audience could relate with, providing us with a model API implementation that we could use across webinars, workshops, and other storytelling channels. I’m big on building on the work of others, so I adopted Union Fashion, and I am working to define the fictional company around an API-First approach to operating a common real world business. Products - (Repo) (Docs) - Defining all of the products that Union Fashion offers. Orders - (Repo) (Docs) - Allows for the ordering of Union Fashion products online. Baskets - (Repo) (Docs) - Allows...[Read More]


API Evangelist

Crowdsourcing COVID-19 Testing Location Data

03 Apr 2020

I have been heads down working on resources for Postman's COVID-19 response, pulling together a variety of COVID-19 data and information that developers (and non-developers) can put to use when trying to make sense of what is going on around us. Identifying existing API resources that were available, while shining a light on the hard work of others, was the first wave of our response, but along the way we identified some areas where there were no existing APIs, and felt there was an opportunity to step in. So I got to work on developing a couple of proof of concepts (POCs) that we could rally around as a company, and further contribute to the COVID-19 / Coronavirus fight. One of the POCs that came out of this work was an idea for crowdsourcing COVID-19 testing location data, resulting in a pretty interesting blueprint for making data available as APIs, which could be used for a variety of open data efforts—not just COVID-19 testing locations. Framing the COVID-19 Testing Location Problem Before I dive into what I built, let me talk a little about how I landed on this being a problem in the first place, which is an important first step in any technological response to a real world problem. I was listening to the regular highlighting of drive-through COVID-19 testing locations during the press conferences coming out of this administration, and I was seeing or hearing it on the news I am digesting each day. Recognizing that the availability of COVID-19 testing locations was a politically charged topic, I wanted to better understand where we could or should go if we had Corona symptoms. I don’t have a doctor, even though I have health insurance, so I really have no idea where to go if I came down with it. I have anxiety about where I would go in my community if I came down with symptoms, and I can imagine that other...[Read More]


API Evangelist

COVID-19 Data and Information

26 Mar 2020

When it comes to coping with the stressful world unfolding around us I like to lose myself in my work. Data and APIs are a great way to tune out the world and keep myself busy while in isolation. Like most other technosolutionists I want to do some good in this crazy time, even if I don’t quite fully know what that means. So, to help me define what that means I sat down and began scratching at what was already occurring across the landscape. Identifying what sources of data were available out there, and what types of information were available which would truly make a difference in everyone’s world--not make it worse. Informational API Collections To begin I wanted to better understand where the top sources of information were, so I began documenting who the most relevant government agencies were in the COVID-19 conversation, going directly to the source of information at the highest levels. Centers for Disease Control and Prevention (CDC) (Website) (Collection) - A simple collection for pulling information from the CDC. European Centre for Disease Prevention and Control (ECDC) (Website) (Collection) - A simple collection for pulling information from the ECDC. World Health Organization (WHO) (Website) (Collection) - A simple collection for pulling information from the WHO. This seemed to reflect the authoritative resources available to me, so I got to work defining how each of these agencies shares information, mapping out the top channels I could profile as a Postman collection, aggregating relevant information, and then allowing it to be pulled manually or in some automated way. Twitter - Each agency uses Twitter as a way of providing updates. YouTube - Each agency uses YouTube to publish video resources. RSS / Atom Feeds - Each agency provides RSS feeds of info. I created a Postman collection for each agency; all someone has to do is enter their Twitter and YouTube API authentication, and they are up and running pulling data from across each of the agencies.
My goal...[Read More]
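Of the three channels above, the RSS feeds need no authentication at all, which is part of what makes them easy to aggregate. Here is a small stand-in for what the collections do with a feed, pulling item titles out of the XML; the feed content below is a made-up sample, not actual agency data.

```python
import xml.etree.ElementTree as ET

# A tiny RSS 2.0 sample, standing in for a live agency feed.
sample_feed = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example Agency Updates</title>
  <item><title>Situation report 1</title></item>
  <item><title>Situation report 2</title></item>
</channel></rss>"""

def feed_titles(xml_text):
    """Extract the title of every <item> in an RSS feed."""
    root = ET.fromstring(xml_text)
    return [item.findtext("title") for item in root.iter("item")]

print(feed_titles(sample_feed))
```

Against a live feed the same parsing would run after fetching the feed URL, which is exactly the kind of request a Postman collection can schedule.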


API Evangelist

The Official Cloudflare API Postman Collection

12 Mar 2020

I use Cloudflare for my DNS. I like the threat protection they offer, the dead simple DNS management, and their robust API. I automate the management of a handful of my domains, providing maintenance across 100 of my API life cycle sites, a couple hundred API landscape sites, and the range of tooling, APIs, and other side projects I have. I’ve written several times about how Cloudflare weaves their API into their UI, so I am happy to write about their new Postman collection for the complete Cloudflare API. The Cloudflare API Postman collection provides 447 individual API requests organized into different folders, making the entire surface area of the API much easier to navigate and make sense of. The Cloudflare API Postman collection gives you quick access to working with Users, Accounts, Organizations, Zones, DNS, Certificates, Workers, Firewalls, Load Balancer, Logs, and other essential infrastructure assets. I use about 1/20th of the valuable API resources Cloudflare provides, but I couldn’t operate my infrastructure like I do without it. I have added the Cloudflare API Postman collection to internal Postman workspaces, allowing me to automate more of my API infrastructure work. I haven’t expanded my usage of the Cloudflare API because I haven’t had time to kick the tires and learn about what is going on. With the Cloudflare API Postman collection I am able to quickly play around with different APIs and learn more about what is possible. I’ve been wanting to play with Cloudflare Workers for some time, and think more about how I can use them to deliver or consume APIs at the edge, and the Cloudflare API Postman collection makes it easier for me to make the time to learn more about how it all works. I’d love to see the Cloudflare API Postman collection get added to the Postman API Network. If you work at Cloudflare and are in charge of maintaining the Postman collection, all you have to...[Read More]
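As a flavor of one request from that collection, here is the shape of a call to list DNS records for a zone. The endpoint path and Bearer-token style follow Cloudflare's v4 API as I understand it; the zone ID and token are placeholders, and nothing is sent over the network here.

```python
# Placeholder credentials -- in Postman these would live in an environment.
zone_id = "ZONE_ID_PLACEHOLDER"
api_token = "API_TOKEN_PLACEHOLDER"

# GET on this URL lists DNS records for the zone in Cloudflare's v4 API.
url = f"https://api.cloudflare.com/client/v4/zones/{zone_id}/dns_records"
headers = {
    "Authorization": f"Bearer {api_token}",
    "Content-Type": "application/json",
}

print(url)
```

The collection's value is largely that all 447 of these request shapes come pre-built, with the credentials abstracted into environment variables.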


API Evangelist

A Proof of Concept API Service Tier

12 Mar 2020

If you have followed me over the years you know that I get very frustrated by the access, or lack of access, to APIs, as well as the services and tooling that target the sector. As someone who is perpetually kicking the tires of API providers and service providers, not being able to on-board at all, on-board without my credit card, or one of the many other ways companies introduce friction, I find myself regularly pissed off. So anytime someone makes my world easier, and accommodates my need, desire, and obsession with playing with every damn API tool out there, I have to say something. Today’s example is from my partner in crime Tyk, with their proof of concept option. Tyk acknowledges that most of us might not be ready for pro status—we just need to kick the tires a bit. I love this approach. It’s an evolution of the freemium model that I think is more honest and acknowledges the need to play around before entering a credit card. This isn’t all about getting something for free or always being ready to pay for a service. This is about me getting access to your service, being able to develop my proof of concept (which I was able to do in < 10 minutes with Tyk), and then justify the cost of going pro with other stakeholders. Nice work Tyk—definitely what I like to see when playing with any API solution. [Read More]


API Evangelist

API-First [Business]

10 Mar 2020

I am working my way through defining a more precise definition of what API-first means which I can use across my API storytelling and conversations. I workshopped the widest definition possible of what API-First means to me yesterday, and by the end of the day I posted another more precise definition of what API-First means to a more technical crowd, which I dubbed API-First [Design || Code]. Today, I’m once again thinking more about the business side of the conversation, and focusing on what I would like to eventually be a more precise definition of what API-First means to business stakeholders, which I am dubbing API-First [Business]. As I said in my broader definition of API-First, if these conversations aren’t including business stakeholders we are doing it wrong. These people are making many of the decisions around the why and how of the desktop, web, mobile, device, and network applications we are delivering on top of our API infrastructure, so we can’t argue that API-First is a developer or technical only concept. We need business stakeholders also thinking API-First, otherwise our projects will never have the resources they need, and are more likely to fall short in meeting real world business objectives. API-First is not a developer concept, it is a concept that business and developer audiences should both be aware of, and then there are separate inner cores to the definition of API-First, one API-First [Design || Code], and the other API-First [Business], which can help bring a more precise definition to the table for each dimension of our operations. Some Common Business Productivity APIs To help make this definition a little more real I wanted to actually apply it against a handful of services I am currently working with business stakeholders at Postman on. I am working with some very smart, technically savvy folks who aren’t programmers to understand how I can help them be more effective and efficient in their daily work. Working together,...[Read More]


API Evangelist

What Is API First?

09 Mar 2020

I really struggled with this piece on API-first. It is one of those holistic API advice pieces I am very conflicted about. API-first feels like yet another marketing phrase or latest trend like microservices. So as I am writing down my thoughts over the last couple weeks on this, my bullshit-o-meter kept going off. Honestly it still is, but I still feel like there is enough value here that I can move forward with a story. As my co-worker Joyce rightfully pointed out in a meeting recently, API-first is one of those phrases we regularly throw out there without much assessment, agreement, or real definition of what it means. That is one reason it feels so wrong at times, because I feel like it is one of those feel good things we throw out there, but never really think too deeply about while doing, or after it fails and we’ve moved on to the next thing. With all of that said, I still believe that API-first can matter, if we, as Joyce points out, actually define what we mean by it. I think there is a lot of misconception about what we mean by API-first, and I’d like to stimulate conversation around what it means, if not just get more precise around how I talk about it. One concern I have about the API-first discussion is that once again it is something that only concerns developers when delivering APIs, and that it is something that business folks shouldn’t worry their pretty little heads about. This is a classic historical technique for dividing and conquering the technology-human paradigm that is spreading across society, and is something I am not interested in perpetuating. So I have broken down the API-first discussion into two main parts, one through the technical lens, and another through what business folks will need to be aware of as they continue to employ technology as part of their everyday work. Looking Through the...[Read More]


API Evangelist

API-First [Design || Code]

09 Mar 2020

I worked through my thoughts on what API-first is, which I consider to be the outer layers of what is going on when we use this phrase. I wanted to focus on the technical and business rift that exists in this discussion first; now I want to dive into the more technical core of this phrase, and get to the heart of how developers are going to see it. Depending on the type of developer you are, and your exposure to different aspects of the API industry or API operations within your organization, you are going to make different assumptions about what API-first is or isn’t. Some will feel it is more just about doing APIs before you build applications, while others are going to see it as being more about going API design first, before you ever write any code. Ultimately I want to establish a definition of API-first that is inclusive, not pushing people out, while also helping me ground how I use the phrase. Let’s Recap, What Is API-First? From the previous post, let’s take a fresh look at what API-First is from the vantage point of a more technically inclined stakeholder like an architect, developer, or other IT actor, setting the stage for how API-First [Code] can be approached.

- Before developing a web application, develop an API first
- Before developing a mobile application, develop an API first
- Before developing a device application, develop an API first
- Before attempting any system integration, develop an API first
- Before directly connecting to a database, develop an API first

Also, Why Do API-First? Naturally people will ask why. To help flesh out why API-First matters, before we separate API-First [Code] from API-First [Design], let’s look at the benefits of going API-First; then the separate code and design approaches might make a little more sense. Allow potential stakeholders to communicate about what is needed before applications are actually built. API will reduce...[Read More]


API Evangelist

What Is My API Network

06 Mar 2020

I am working on the vision for the Postman Network. As I do with everything, I want to start with the basic human aspects of what is going on, and then relate them to the more technical and then business aspects of it all. Right now, the Postman Network is a listing of teams and individuals who have published Postman collections under a handful of categories. While visiting the network you can browse collections by category or search by keyword, and view the team or individual profile, select the “Run in Postman” button, or view the documentation. My goal is to brainstorm what is next for the Postman Network, but also help define what network means in a world of API collaboration. When it comes to my API network, I like to focus on the meaningful elements, the “person, place, or thing” that makes my world go around. I am not interested in nouns that do not enrich my world, and I am keen on emphasizing the humanity of all of this over purely tech for the sake of tech. So what are the nouns that make up my world?

- People - While I don’t always like people, because I am one, they tend to be the center of my world. People are the most important building block of my network, and drive what I truly care about when it comes to APIs.
- Teams - I engage with a variety of teams as part of my job, both internal to Postman and externally across the many different enterprise organizations I am working with. While I have relationships with individuals on each team, I find myself regularly thinking about how to add value to the entire team, and influence how and why they are doing APIs.
- Projects - My world is littered with projects. Some projects move forward, while others simmer, and some wither on the vine. Projects usually involve one or many...[Read More]


API Evangelist

The Building Blocks of API Partner Programs

04 Mar 2020

I’m doing a deep dive into partner API research, taking a fresh look at how API providers and service providers are operating their partner programs. I looked through around a hundred partner programs I have indexed, and listed a few of the notable ones below. It can be difficult to study partner API programs because many organizations consider their API program a type of partner program by itself, and then there are also a lot of partner APIs, providing actual services involving partners, and providing programmatic access to a variety of partner resources. I’ll be rolling up this research into several other more formal strategies and guides that I will publish as part of API Evangelist, but like I do with all my work I wanted to publish my notes and research here as I’m working through it. Purpose The reasons behind having a partner program, and what value it brings to an organization and its partners. Providing a list of reasons why you will want to invest in a partner program, which you can use to sell the concept to other stakeholders.

- Increase Exposure - Providing more exposure opportunities for platform partners.
- Increase Skills - Expand upon the skills of partners who are putting a platform to work.
- Increase Awareness - Grow the awareness amongst partners about what is possible.
- Increase Sales - Making it about the money, and expanding the sales intake for partners.
- Drive Communications - Push the platform and its partners to communicate more.
- Increase Collaboration - Pushing partners to work together, and with the platform more.
- Encourage Usage - Incentivize more usage of the platform and its products and services.
- Encourage Adoption - Drive adoption of the platform, pushing partners to depend on it more.
- Encourage Syndication - Increase the syndication of content and other branded assets.
- Opportunity for Growth - Allow partners to grow by using the platform more.
- Protect Users From Unwanted Behavior - Solicit partner assistance to help keep users safe....[Read More]


API Evangelist

Postman API Reference and Capability Collections

04 Mar 2020

Postman collections are a great way to document every detail of an API, defining the host, path, parameters, headers, and body of each API request. This allows any single API request to be captured as a machine and human readable Postman collection that can be shared and used by any technical or non-technical user. The most common approach to defining a Postman API collection is to document the requests across all available APIs, providing a complete collection of all API requests that can be made, then using that reference to mock, document, test, monitor, and execute individual requests manually or as part of any automated process. However, there are other ways to evolve these requests to ensure that they more closely resemble common business tasks, accomplishing the everyday activities that technical and non-technical individuals need to accomplish. An AWS EC2 Reference Collection An example of a Postman reference API collection can be found in the collections I worked on leading up to AWS re:Invent last December. One of the reference API collections I have been crafting is for Amazon EC2, providing a portable and executable collection of all API requests possible for the cloud compute platform. The AWS EC2 reference collection contains over 350 individual requests, providing a dizzying amount of control over delivering, operating, and evolving compute capacity across AWS regions. While this collection is a robust representation of all the available AWS EC2 resources, it will take additional work to understand what is possible, find the specific request needed for any particular integration or application, and populate the request with relevant values to realize any specific business need. It is a great start when it comes to putting AWS EC2 to work, but to make things more usable, it will take a little more work. 
An Amazon EC2 Capability Collection This AWS EC2 reference collection provides a foundation for delivering integrations and applications, and can be used as a seed for a different...[Read More]
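Under the hood a Postman collection is just a JSON document, which is what makes it portable, shareable, and machine readable. Here is a minimal, hypothetical single-request sketch in the v2.1 collection format -- the request name, host, path, and variables are illustrative and not taken from the actual AWS EC2 collection:

```json
{
  "info": {
    "name": "Example Reference Collection",
    "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
  },
  "item": [
    {
      "name": "Describe Instances (illustrative)",
      "request": {
        "method": "GET",
        "header": [
          { "key": "Accept", "value": "application/json" }
        ],
        "url": {
          "raw": "https://{{host}}/instances?region={{region}}",
          "host": ["{{host}}"],
          "path": ["instances"],
          "query": [
            { "key": "region", "value": "{{region}}" }
          ]
        }
      }
    }
  ]
}
```

Each entry in the `item` array documents one request; the `{{host}}` and `{{region}}` variables get their values from a Postman environment, keeping the collection itself free of deployment-specific details.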


API Evangelist

Peeling the OpenAPI-Driven API Life Cycle Collaboration Onion

03 Mar 2020

I am trying to better understand how we all work together to deliver and consume APIs. Fleshing out more meaning behind some of the common words we use in the space such as collaboration, platform, hubs, workspaces, feedback loops, comments, sharing, notifications, and other communication channels. I want to push my thoughts forward on what the gears of API collaboration are, and how we can better work together to move many different APIs forward as provider and consumer. API collaboration isn’t very straightforward, and in my mind there are several layers to how things are actually playing out across the API landscape. This is my best attempt at breaking things out into different buckets to help us make sense of how we are working together to move API infrastructure forward at the organizational and industry level. Layer One - Single OpenAPI Management In 2020, OpenAPI has won the great API specification wars of the previous decade. OpenAPI is helping individual developers and architects more efficiently define and design their APIs, using the core objects of the API specification as our guide. Providing us with a box of gears we can assemble to define the floor of our digital factories putting out the digital products and services we provide to our customers each day.

- Info - Helping manage the name and description for OpenAPI definitions.
- Contact - Helping integrate and manage contact info as part of wider team management.
- License - Helping manage the licensing for the APIs being defined.
- Server - More management for available servers (i.e. mock, development, production, etc.)
- Server Variables - Helping manage server variables as part of environment management.
- Paths - Helping manage the design and definition of API paths.
- Operation - Better operation management (verbs, summary, description, operationIds, etc.)
- Parameter - Helping more consistently name and define query and path parameters.
- Headers - Being more deliberate and aware about how headers are defined and used.
- Request Body - More tools for...[Read More]
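The objects listed above map directly onto a small OpenAPI 3.0 document. As a minimal, hypothetical sketch (all names and URLs are illustrative), here is where each object lives:

```yaml
openapi: 3.0.3
info:                           # Info object
  title: Example API
  description: A minimal sketch.
  version: 1.0.0
  contact:                      # Contact object
    email: team@example.com
  license:                      # License object
    name: Apache 2.0
servers:                        # Server objects
  - url: https://{environment}.example.com
    variables:                  # Server Variables
      environment:
        default: api
        enum: [api, mock, dev]
paths:                          # Paths object
  /items/{itemId}:
    get:                        # Operation object
      operationId: getItem
      summary: Retrieve a single item
      parameters:               # Parameter objects
        - name: itemId
          in: path
          required: true
          schema:
            type: string
      responses:
        "200":
          description: The requested item
```

Governing each of these objects consistently across many such documents is what the layers described in this post are about.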


API Evangelist

The Technology, Business, and Politics of the OpenAPI Conversation

02 Mar 2020

I was pondering a tweet from Aidan Cunniffe (@aidandcunniffe) over at Optic the other day. He was expressing what he says is a “controversial opinion that keeps getting backed up by conversations. Each version of OpenAPI and JSON Schema map to ~15 versions. All the implementations by vendors, cloud providers, and open source libs implement a useful (but not always the same) subset.” I don’t think it is a controversial opinion at all, I think he points to a pretty critical deficiency in our belief around APIs and specifications like OpenAPI. Something that begins with the specification itself and how it evolves, but as Aidan points out, echoes out through API service and tooling providers, and then also across API providers themselves who put the OpenAPI specification to work as part of their own operations. On Twitter, Aidan continues with, “what is the point of a data-spec if it's not enforceable the same way, everywhere? We have to acknowledge that there's no one spec (a versioned markdown file doesn't count), there's 15, 20, 30, 50 of them in the wild today -- and that's blocking teams from using tooling end-end.” Then he continues by suggesting “a wasm reference implementation that every vendor and lib could drop-in and link to across programming languages might actually solve this problem and truly enable end-end use of OpenAPI I'd make this objective #1 for 2020 if I had the keys. I just have the tweets :)”. Makes sense to me, and I’d say it is something that the OpenAPI community should adopt. Honestly, and I’ve made the argument before, I think the OAI should be investing to stabilize core OpenAPI tooling, going beyond just the spec. Technical Solutions Require Business and Industry Political Understanding I support the technical solution Aidan puts forward, and would love to see investment across multiple providers to make it happen. 
However, I think we will need to better understand the business and politics of it all to see the change we want—consistent support of...[Read More]


API Evangelist

Design and Build API with Postman

25 Feb 2020

I am doing more talks and workshops within enterprise organizations educating teams about designing and building APIs, helping Postman customers be more successful in not just using Postman, but in defining, designing, delivering, supporting, and evolving high quality APIs using Postman. 90% of the teams I work with are still build-first when it comes to delivering API capabilities across the enterprise, so we are invested in helping bring that number down. Empowering teams to go API-first when it comes to designing and building their APIs, moving beyond the more costly approach of writing code first, and developing more healthy practices that involve business and technical stakeholders in the process. It is natural for developers to want to roll up their sleeves and begin coding to deliver an API. It is what they are trained to do. However, it makes a lot more sense to involve business stakeholders earlier on in the process, and avoid the costly, isolating, and more time intensive approach of purely approaching APIs as writing code. Postman has been working internally, and with our most engaged customers, to better define an API-first workflow involving the following stops along the API life cycle:

- APIs Builder - On the Postman platform, all APIs begin with the new APIs tab, the beta implementation of being able to manage the API life cycle within Postman.
- Create - You can create a new API by starting fresh, or by importing an existing API definition in the OpenAPI, RAML, or GraphQL formats, and use it as the definition for each new API.
- Definition - To change the design of an API, you can directly edit the OpenAPI, RAML, or GraphQL definition, manipulating the design of the API and the underlying schema.
- Mock - With an API definition you can then mock each API, providing a virtualized representation of each path, with examples returned as mocked responses.
- Environment - Defining key/value pairs and globals that can be used...[Read More]


API Evangelist

Managing API Secrets Using Postman Environments

24 Feb 2020

Postman environments are machine readable definitions of design, development, staging, and production environments that can be used across API operations. When used properly they contain the keys, tokens, and other secrets needed for authorizing each individual API request, or collection of requests. This makes them an excellent place to begin getting more organized about how API secrets are applied, managed, and audited across teams. Secrets can also be littered throughout Postman collections, but when collections and environments are used properly, developers should be isolating secrets to environments, helping make sure Postman collections contain the technical details of the surface area of an API, while the unique values applied to each API are actually present as part of well defined Postman environments. Providing the opportunity for managing and governing how API secrets are being applied and stored by developers, and opening up the opportunity to use Postman as part of wider API governance efforts. Environments are an essential building block to be considered as part of a wider API governance strategy. Like Postman collections, environments will need the greatest amount of governance to inject the most observability, reliability, and security across API operations. When used right, Postman environments help isolate and standardize how secrets, PII, and other sensitive information is used across the delivery and integration of APIs. Allowing for centralized control over environments by leveraging Postman for the management of environments through the interface and the API.

- GUI - Managing all of the environments in use with the Postman web interface.
  - All Environments - Manually manage all of the environments in use using the central Postman web interface, allowing any member of governance to audit how environments are being used.
- API - Automating the management of environments using the Postman API, opening up the opportunity for auditing, managing, and enforcing governance at scale across the environments being applied by all enterprise teams engaging with API operations.
  - All Environments - Programmatically pulling all environments via the Postman API, so that they can be evaluated...[Read More]
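An exported Postman environment is itself just a small JSON document of key/value pairs, which is what makes this kind of centralized auditing possible. A hypothetical example that keeps a secret out of the collection itself (the variable names and values are illustrative):

```json
{
  "name": "Production",
  "values": [
    {
      "key": "baseUrl",
      "value": "https://api.example.com",
      "enabled": true
    },
    {
      "key": "apiKey",
      "value": "replace-with-real-key",
      "type": "secret",
      "enabled": true
    }
  ]
}
```

A collection then references `{{baseUrl}}` and `{{apiKey}}` instead of hard-coded values, so the same collection can be run against any environment, and the secret lives in exactly one governable place.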


API Evangelist

Content Negotiation for APIs and the Web

24 Feb 2020

APIs often seem like another one of those very technical acronyms that only the most technical people will care about. If you don’t aspire to be a software developer, why should you ever care about what application programming interfaces (APIs) are? To push back on this notion I regularly push myself to make APIs more accessible to business users. I feel it is important that anyone who uses the web daily as part of their professional career should possess a working understanding of the tool(s) they depend on, and have a grip on how APIs aren’t some add-on to the World Wide Web we depend on each day--understanding that APIs and the web are one and the same. Over the last twenty years the web has become a fundamental aspect of how we do business online, and APIs are just the latest evolution of how the web is being put to use as part of the digital transformation businesses are going through across every business sector today. The World Wide Web, commonly known as the Web, is an information system where documents and other web resources are identified by Uniform Resource Locators, which may be interlinked by hypertext, and are accessible over the Internet—with “documents and other web resources” being the bridge between APIs and the web. If you are using the web you can use APIs, as long as you understand one of the fundamental building blocks of the web--that you can negotiate “documents and other web resources” in the following information formats. Hyper Text Markup Language (HTML) - Each time you use the web you are getting and posting HTML documents using the Internet. HTML is a machine readable format that renders each web page you view, helping make it easier for humans to read in a variety of languages. While you may not write HTML or directly “read” HTML, you are using HTML each day as you make your way around to different web...[Read More]
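The negotiation itself happens in the Accept header of each HTTP request. As an illustrative sketch (the host and path are hypothetical), here is the same resource being requested first as a web page and then as raw data:

```http
GET /products/42 HTTP/1.1
Host: www.example.com
Accept: text/html

GET /products/42 HTTP/1.1
Host: www.example.com
Accept: application/json
```

The first request asks the server to render the resource as an HTML page for a browser; the second asks for the same “document or other web resource” as JSON, which is all most web APIs are doing.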


API Evangelist

The Caltech University API Landscape

19 Feb 2020

I regularly take a look at what different universities are up to when it comes to their APIs. I spent two days talking with different universities at the University API summit in Utah a couple weeks back, and I wanted to continue working my way through the list of schools I am speaking with, profiling their approach to doing APIs, while also providing some constructive feedback on what schools might consider doing next when it comes to optimizing API delivery and consumption across campus. Next up on my list is Caltech, who we have been having conversations with at Postman, and I wanted to conduct an assessment of the current state of APIs across the school. The university reflects what I see at most universities, meaning there are plenty of APIs in operation, but no visible organized effort when it comes to bringing together all the existing APIs into a single developer portal, or centralizing API knowledge and best practices when it comes to the interesting API things going on across a campus, and with other partners and stakeholders. APIs in the Caltech Library When it comes to APIs at the university level the first place to start is always the library, and things are no different at Caltech. While there is no official landing page specifically for APIs, the Caltech library has a GitHub page dedicated to a variety of programmatic solutions https://caltechlibrary.github.io/, and you can find many signals of API activity behind the scenes, like this announcement that the write API is now operational https://www.library.caltech.edu/news/write-api-now-operational. You can also find an interesting case study on how the library is using APIs provided by an interesting API provider called Clarivate, which I will be looking to understand further. As with every other university, there is a huge opportunity for Caltech to be more organized and public about the API resources offered as part of the library--even if it isn't widely available to...[Read More]


API Evangelist

API Interrogation

14 Feb 2020

I was doing some investigation into how journalists are using APIs, or could / should be using APIs. After some quick Googling, Binging, and DuckDuckGoing, I came across a workshop by David Eads of ProPublica Illinois, called A Hitchhiker's Guide to APIs. As I began reading, I was struck by how well it captured not only the usage of Postman in journalism, but also what Postman does in general, in a single precise sentence: “In this hands-on session, you will use Postman to interrogate a web API.” That is how I use Postman. That is why 10 million developers use Postman. APIs are how we can interrogate the digital world unfolding around us. It is increasingly how we can interrogate the digital world emerging across our physical worlds. I like the concept in general, but definitely think it is something I should explore further when it comes to journalism and investigative storytelling. Postman provides a pretty powerful way to get at the data being published by city, county, state, and federal government. It also provides a robust way to get at the social currents flowing around us on Twitter, Facebook, LinkedIn, and other leading platforms. Postman and APIs provide technical and non-technical users with what they need to target a source of data or content, authenticate, and begin interrogating the source for all relevant information. I find that interrogating a startup is best done via their own API, as well as their digital presence via Twitter, LinkedIn, GitHub, Stack Overflow, Facebook, Youtube, and Instagram using APIs, over speaking with them directly. I find that interrogating a federal agency is often only possible through the datasets it publishes, providing me with a self-service way to understand a specific slice of how our society works (or doesn’t). While I can interrogate a company, organization, institution, and government agencies using their websites, I find that also being able to interrogate their platform,...[Read More]


API Evangelist

All of the Discussions from the BYU API University Workshop in Utah

12 Feb 2020

I went to Provo Utah a couple weeks ago and participated in the sixth annual Brigham Young University (BYU) University API Workshop. I was the keynote opener for the first edition of the conference, and I did the same for the sixth edition of the event, which brings many different universities together to talk about API usage across their campuses. When the event began it was primarily BYU staff, but it has expanded to include administrators and faculty from what I counted to be over twenty other universities from across the United States--making for a pretty interesting mix of conversation from higher education API practitioners looking to solve problems, and share their stories of how APIs have helped make an impact on how universities serve students and the public. The University API Workshop is an “unConference Focused on University & Personal APIs & Their Use in Improving Learning”. It brought together around one hundred folks to discuss a wide variety of API topics. Since it was an unconference, everyone pitched their own ideas, with some of them being about sharing API knowledge, while others were about soliciting knowledge from the other attendees. Resulting in a pretty compelling list of sessions spread across two days. You can browse through the sessions using the Google Docs that every session organizer published. Providing a pretty compelling look at how APIs are making an impact at the higher education level, shining a light on the concerns of API stakeholders across the campus.

Session One
- Let’s stop using usernames & passwords
- User Experience in the API World
- Postman Fundamentals
- Securing APIs/data with proper authorization

Session Two
- Walk, Talk, and API Stalk
- API Governance at Scale taking ideas to consistent execution
- Mendix (HPAPaaS/Low Code) After a Year at BYU
- Our New NGDLE | Open Courses Made With Web Components, Microservices, Docker, CI/CD and more!
- DDD vs. BI - Balancing Centralizing and Decentralizing Forces in Data Architecture

Session Three
- How do I test...[Read More]


API Evangelist

Postman Governance as the Foundation for Wider API Governance

11 Feb 2020

This is an overview of possible strategies for governing how Postman is used across a large organization. It is common for Postman to already be in use across an organization by individuals operating in isolation using a free tier of access. Governance of not just Postman, but also the end to end API life cycle, begins with getting all developers using Postman under a single organizational team, working across master planned workspaces. If there are concerns about how Postman is being used across an enterprise organization, governance of this usage begins by focusing on bringing all enterprise Postman users together under a single license, and team, so that activity can be managed collectively. Postman Users Over the last five years Postman has become an indispensable tool in the toolbox of developers. 10 million developers have downloaded the application and are using it to authorize and make requests to APIs, then debug the responses. The benefit to API operations for the enterprise is clear, but the challenge now for enterprise organizations is to identify their individual Postman users and encourage them to operate under a single pro, team, or enterprise license. Currently users are operating in isolation, defining, storing, and applying secrets and PII locally on their own workstations within Postman, and syncing to the cloud as part of their regular usage of Postman—isolating details about APIs, secrets, potentially PII, and other sensitive data within these three areas.

- Personal Workspaces - Storing collections and environments within their localized personal workspaces and individual Postman account.
- Personal Collections - Developing API collections in isolation, leaving them inaccessible to other teams, and not reusable across operations.
- Personal Environments - Using environments to store secrets, PII, and other data within their localized personal workspaces and individual Postman account.

When it comes to enterprise API governance, observability, and security, the problem isn’t with Postman being used by developers, the problem is that developers are not using Postman together under a single license, across managed shared workspaces. Putting...[Read More]


API Evangelist

Conducting API Weaponization Audits

11 Feb 2020

I’ve been thinking about chaos engineering lately, the discipline of experimenting on a software system in production in order to build confidence in the system's capability to withstand turbulent and unexpected conditions. I listened to a talk by Kolton Andrus, the CEO of Gremlin, the other day, and my partner in crime at Postman, Joyce (@petuniagray), is an avid evangelist on the subject. So I have been thinking about the concept, how it applies to your average enterprise organization, and the impact it could make on the way we operate our platforms. I don’t think chaos engineering is for every company, but I think there are lessons involved in chaos engineering that are relevant for every company. Similarly I think we need an equivalent approach in the area of weaponization, and how APIs can easily be used to harm a platform, its community, and the wider public—a sort of weaponization audit. Let’s take what we’ve learned from Twitter, Facebook, Youtube, and others. Let’s look at the general security landscape, and let’s get more creative when it comes to coloring within the lines of an API platform, but in unexpected ways. Let’s get women and people of color involved. Let’s focus on ways in which a platform can be abused, using the web, mobile, device, or APIs underneath. I’d like to consider security, privacy, reliability, and observability, as well as out of the box ways to game the system. Let's assume that nobody can be trusted, while recognizing we still need to offer a certain quality of service and community for our intended users. I am guessing it won’t be too hard to hire a savvy group of individuals who could poke and prod at a platform until the experience gets compromised in some harmful way. Like chaos engineering, I’m guessing most organizations wouldn’t be up for an API weaponization audit. It would reveal some potentially uncomfortable truths that leadership probably isn’t too concerned with addressing, and...[Read More]


API Evangelist

The Basics of Working with the Postman API

10 Feb 2020

It is pretty easy to think of Postman as the platform where you engage with your internal APIs, as well as other 3rd party APIs. It doesn’t always occur to developers that Postman has an API as well. Most everything you can do through the Postman interface you can do via the Postman API. Not surprisingly, the Postman API also has a Postman collection, providing you with quick and easy access to your Postman collections, workspaces, teams, mocks, and other essential elements of the Postman platform and client tooling. Providing you with the same automation opportunities you have come to expect from other APIs. API access, integration, and automation should be the default with everything you do online—desktop, web, mobile, and device applications all use APIs. Your API infrastructure is no different. Postman takes this seriously, and works to make sure that anything you can do through the desktop or web interfaces, you can also do via the Postman API--allowing API providers and consumers to seamlessly integrate and automate the Postman platform into their operations by leveraging the following APIs.

- Collections - Being able to programmatically create and manage the Postman API collections in use.
- Environments - Adding and managing the details of the environments applied across Postman collections.
- Mocks - Creating, retrieving, and deleting mock APIs that are generated from Postman collections.
- Monitors - Create, update, retrieve, delete, and run monitors that execute Postman collections.
- Workspaces - Creating, retrieving, updating, and deleting the workspaces that collections are organized in.
- Users - Provides a /me endpoint that allows for pulling of information about the API key being used.
- Import - Allowing for the import of Swagger, OpenAPI, and RAML API definitions into Postman.
- API - Programmatically creating and managing APIs, including versions, schema, and link relations.
These eight API paths give you control over managing the full life cycle of the APIs you are developing, and the integration and automation of the APIs...[Read More]
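All eight areas are called the same way: a request against api.getpostman.com with your API key in the X-Api-Key header. As a minimal sketch (the key value is a placeholder, and build_request is just an illustrative helper, not part of any SDK), listing your collections looks roughly like this:

```python
import urllib.request

POSTMAN_API = "https://api.getpostman.com"
API_KEY = "PMAK-your-key-here"  # placeholder -- generate a real key in your Postman account


def build_request(path: str) -> urllib.request.Request:
    """Build an authenticated request against the Postman API.

    Every endpoint follows the same pattern: the base URL plus a path,
    with the API key passed in the X-Api-Key header.
    """
    return urllib.request.Request(
        f"{POSTMAN_API}{path}",
        headers={"X-Api-Key": API_KEY},
    )


# List your collections (actually sending this requires a valid key and network access):
req = build_request("/collections")
# body = urllib.request.urlopen(req).read()

print(req.full_url)  # https://api.getpostman.com/collections
```

Swap `/collections` for `/environments`, `/mocks`, `/monitors`, `/workspaces`, or `/me` and the same pattern covers the rest of the platform.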


API Evangelist

Standardizing My API Life Cycle Governance

10 Feb 2020

I am working on redesigning all of my base APIs, as well as producing a mess of new ones. As part of the process I am determined to be more thoughtful and consistent in how I design and deliver the APIs. API governance always begins with using API definitions, as you can't govern something you can't measure and track, so having machine readable artifacts is essential. After that, the design of the API is the first place to look when it comes to standardizing each of the APIs coming off the assembly line. Then I am looking to do my best to begin defining, measuring, and standardizing how I do many other areas of API operations, helping me keep track of the many moving parts of doing microservices. To help me govern the life cycle for each API, I am going to be quantifying and measuring as many of the following areas as I can. These are what I consider to be the essential building blocks of each API that I deliver, and since I'm using Postman to not just interact with these APIs once they are in production, I will be using Postman to also deliver and govern each stop along the API life cycle. Using Postman collections to define, deliver, and govern each of these areas, using scripts, runners, and monitors to automate the enforcement of standards and consistency across the APIs I am delivering on a regular basis.

Definitions
- OpenAPI - There is an OpenAPI for each individual API.
- Collection - There is a Postman collection for each individual API.
- JSON Schema - There is a JSON schema for each individual schema.

Design Requests
- Base - Ensure the base path is planned.
- Versioning - Define how APIs are versioned.
- Resource - Evaluate each resource published.
- Sub-Resources - Evaluate each sub-resource published.
- Methods - Ensure common use of HTTP methods.
- Actions - Determine how actions are taken beyond methods.
- Path Parameters - Establish common approach for path parameters.
- Query...[Read More]


API Evangelist

Backend AWS API Gateway Integration OpenAPI Extensions

10 Feb 2020

I have spent a lot of time automating my AWS API infrastructure, working to make it so I can automatically deploy API infrastructure to AWS. I am using AWS API Gateway as part of this suite of API deployments, so I have been working hard to understand how AWS speaks OpenAPI as part of their implementation. As part of my work there are three distinct types of APIs I am deploying using AWS API Gateway, each with a distinct way of extending OpenAPI to describe it. The Pass Through Just passing what comes in to an HTTP host and path I give it, and then passing the response back through without any transformations or other voodoo along the way. This is a basic OpenAPI extension for defining a pass through API using the AWS API Gateway. A DynamoDB Backend For my basic CRUD databases I am just using a DynamoDB backend because it allows me to quickly launch data APIs that allow me to Create, Read, Update, and Delete (CRUD) data I am storing in the NoSQL database—providing me with a pretty basic approach to delivering data API infrastructure. Here is the OpenAPI vendor extension for wiring things up using a DynamoDB backend. I like DynamoDB because you can just make API calls to get most of what you need without any sort of business logic or code in between. If I am just looking to manage data using simple web API endpoints, this is what I am doing when it comes to deploying API infrastructure. Logic with Lambda I would say the previous two types of APIs represent the most common implementations I have, but I am working to evolve my infrastructure to take advantage of newer approaches to delivering APIs like Lambda. Here is the OpenAPI extension for defining a Lambda backend, which I can then wire up to a database and storage, or purely implement some business logic to do what I...[Read More]
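The vendor extension in question is `x-amazon-apigateway-integration`, which AWS API Gateway reads out of an OpenAPI document on import. Below is a rough sketch of a pass-through fragment, expressed as a Python dict for illustration; the backend URI is a placeholder, and the full set of supported fields should be checked against the AWS docs.

```python
# Sketch of the OpenAPI vendor extension AWS API Gateway reads for a simple
# HTTP pass-through integration. The backend URI is a placeholder, and this
# is only the minimal shape, not the full field set AWS supports.

path_item = {
    "/example": {
        "get": {
            "responses": {"200": {"description": "pass-through response"}},
            "x-amazon-apigateway-integration": {
                "type": "http_proxy",            # pass the request straight through
                "httpMethod": "GET",
                "uri": "https://backend.example.com/example",
                "passthroughBehavior": "when_no_match",
            },
        }
    }
}

integration = path_item["/example"]["get"]["x-amazon-apigateway-integration"]
```

In practice this fragment would live inside the `paths` object of a full OpenAPI document that you import into API Gateway.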


API Evangelist

API Links For Every UI Element

10 Feb 2020

I've showcased CloudFlare's approach to making their API available as part of their user interface several times now. It is a practice I want to see replicated in more desktop, web, and mobile applications, so I want to keep finding new ways of talking about it, and introducing it to new readers. If you sign up for or use CloudFlare, and navigate your way to their SSL/TLS section, you will see a UI element for changing the levels of your SSL/TLS encryption, and below it you see some statistics on the traffic that has been served over TLS over the last 24 hours, providing you full control over SSL/TLS within the CloudFlare UI. At the bottom of the UI element for managing your SSL/TLS you will see an API link, which when you click it gives you three API calls for getting, changing, and verifying the SSL/TLS status of your domain. Providing you with one click access to the API calls behind the UI elements, giving you two separate options for managing your DNS. This is how all user interfaces within applications should be. The API shouldn't just be located via some far off developer portal, it should be woven into the UI experience, revealing the API pipes behind the UI at every opportunity. This allows for the automation of any activity a user is taking through the interface using the platform's API. You could also consider embedding a simple Postman collection for each API capability, allowing a user to run it in Postman—to further support this, you could also make a Postman environment available, pre-populated with a user's API key, making execution of each platform capability outside of the platform possible in just one or two clicks. Once each UI capability is defined as a Postman collection it can immediately be executed by a user in a single click. It can also be executed using a Postman runner as part of an existing CI/CD process, or on a schedule using a...[Read More]
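To make the idea concrete, here is a sketch of the kind of call that sits behind that SSL/TLS UI element, following the shape of CloudFlare's public v4 API. The zone id and token are placeholders, the request is built but never sent, and the exact endpoint should be verified against CloudFlare's API reference.

```python
from urllib.request import Request

# The kind of API call CloudFlare exposes behind its SSL/TLS UI element.
# ZONE_ID and TOKEN are placeholders; the endpoint shape follows CloudFlare's
# public v4 API, but verify it against their reference docs before relying on it.

ZONE_ID = "023e105f4ecef8ad9ca31a8372d0c353"   # placeholder zone id
TOKEN = "YOUR_API_TOKEN"                       # placeholder credential

def ssl_setting_request(zone_id, token):
    """Build (but do not send) the request for a zone's SSL/TLS setting."""
    url = f"https://api.cloudflare.com/client/v4/zones/{zone_id}/settings/ssl"
    return Request(url, headers={"Authorization": f"Bearer {token}"})

req = ssl_setting_request(ZONE_ID, TOKEN)
```

The same URL and header pair is what a one-click Postman collection for this UI capability would carry, with the token supplied by a pre-populated environment.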


API Evangelist

Secrets and Personally Identifiable Information (PII) Across Our API Definitions

27 Jan 2020

As API providers and consumers we tend to have access to a significant amount of credentials, keys, and tokens, as well as personally identifiable information (PII). We use this sensitive information throughout the API integration and delivery life cycles. We depend on credentials, keys, and tokens to authorize each of our API requests, and we potentially capture PII as part of the request and response for each of the individual API requests we execute regularly. Most developers, teams, and organizations I've spoken with do not have a strategy for addressing how secrets and PII are applied across the internal and external API landscape. API management over the last decade has helped us as API providers better manage how we define and manage authentication for the APIs we are providing, but there hasn't been a solution emerge that helps us manage the tokens we use across many internal and external APIs. With this reality, there are a lot of developers who are self-managing how they authenticate with APIs and work with the PII that gets returned from APIs. I am working on several talks with enterprise organizations about this challenge, and to prepare I want to work through my thoughts on the problem, as well as some possible solutions. I wanted to map out how we integrate with the APIs we are developing and consuming, and think about the common building blocks of how we can better define, educate, execute, audit, and govern the secrets and PII that are applied throughout the API life cycle across all of the APIs we depend on, allowing me to have a more informed conversation about how we can get better at managing the more sensitive parts of our operations. What Are The Types of Sensitive Information? First I wanted to understand the types of common information being applied by API developers, helping me establish and evolve a list of the types of data we are looking for when securing the API...[Read More]
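One place auditing can start is the API definitions themselves: exported Postman collections and OpenAPI files are exactly where stray keys and tokens tend to hide. Here is a naive sketch of such a scan over a collection loaded as a dict; the patterns are illustrative only, and real secret scanners use far richer rule sets and entropy checks.

```python
import re

# Naive sketch of scanning an exported Postman collection (as a dict) for
# values that look like secrets. The patterns are illustrative examples,
# not a production rule set.

SECRET_PATTERNS = {
    "bearer_token": re.compile(r"Bearer\s+[A-Za-z0-9\-_\.]{20,}"),
    "api_key_field": re.compile(r"(?i)(api[_-]?key|secret|token)"),
}

def scan(obj, path="$", findings=None):
    """Walk a nested dict/list and flag keys or values that look sensitive."""
    if findings is None:
        findings = []
    if isinstance(obj, dict):
        for key, value in obj.items():
            if SECRET_PATTERNS["api_key_field"].search(str(key)):
                findings.append((f"{path}.{key}", "suspicious key name"))
            scan(value, f"{path}.{key}", findings)
    elif isinstance(obj, list):
        for i, item in enumerate(obj):
            scan(item, f"{path}[{i}]", findings)
    elif isinstance(obj, str) and SECRET_PATTERNS["bearer_token"].search(obj):
        findings.append((path, "bearer token in value"))
    return findings

collection = {
    "info": {"name": "demo"},
    "auth": {"apikey": "sk-live-abcdefabcdefabcdefabcdef"},
    "header": ["Authorization: Bearer abcdefghijklmnopqrstuvwxyz123456"],
}
findings = scan(collection)
```

Running something like this across every definition in a repository before it is shared is one small, concrete step toward the kind of auditing and governance discussed above.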


API Evangelist

An Introduction to API Authentication

27 Jan 2020

APIs operate using the web, and like web applications, many APIs require some sort of authentication or authorization before you can access the valuable resources available within each API path. When you open up your APIs on the web you aren't just giving away access to your resources to anyone who comes along. API providers employ a number of different authentication mechanisms to ensure only the applications and systems that should have access are actually able to make a successful API call. To help refresh the types of authentication available across the API landscape, while also demonstrating the reach of Postman as an API client, I wanted to take a fresh look at authentication to help my readers understand what is possible. Depending on the API provider, platform, and the types of resources being made available you will encounter a number of different authentication methods—here are the 11 that Postman supports, reflecting roughly 90% of the APIs you will come across publicly, as well as within the enterprise organization, covering what the API sector employs for authentication as well as what Postman supports as an API client. No Authentication - Like the web, these APIs are publicly available and accessible without any authentication. You can just make a request to a specific URL, and you get the response back without needing any credentials or key. This reflects a very small portion of the API economy, but is still an important aspect of the overall authentication discussion, and what is possible. API Key - An application programming interface key (API key) is a unique identifier used to authenticate a user, developer, or calling program to an API. However, they are typically used to authenticate a project with the API rather than a human user. Different platforms may implement and use API keys in different ways. Bearer token - A bearer token is an opaque string, not intended to have any meaning to clients using...[Read More]
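The first three styles in the list differ simply in where the credential travels in the HTTP request. Here is a minimal sketch of how each one shapes a request; the header and parameter names are common conventions, and individual providers vary, so always check each API's docs.

```python
# Minimal sketch of how the first three authentication styles shape an HTTP
# request: no auth, API key (often a query parameter or header), and bearer
# token (Authorization header). Names like "api_key" are common conventions,
# not a universal standard.

def build_request(url, auth=None):
    """Return the URL, headers, and query params for a given auth style."""
    headers, params = {}, {}
    if auth is None:
        pass                                     # no authentication at all
    elif auth["type"] == "api_key":
        params[auth.get("param", "api_key")] = auth["key"]
    elif auth["type"] == "bearer":
        headers["Authorization"] = f"Bearer {auth['token']}"
    return {"url": url, "headers": headers, "params": params}

public = build_request("https://api.example.com/status")
keyed = build_request("https://api.example.com/data",
                      {"type": "api_key", "key": "abc123"})
token = build_request("https://api.example.com/me",
                      {"type": "bearer", "token": "eyJ0..."})
```

Postman's auth helpers do exactly this kind of request shaping for you, which is why switching between auth types in the client is usually a matter of picking the style and pasting the credential.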


API Evangelist

Profiling Adobe APIs

23 Jan 2020

As I was profiling APIs on my list of APIs I found myself profiling Adobe. I am moving through the list of companies alphabetically, so you can see how far along I am. Anyways, like any other large company, I need to make a decision about how I am going to manage the profiling of different API products and lines of business. Companies like Amazon, Google, Azure, and Adobe have large numbers of APIs, and I always know I will need some sort of plan for documenting everything that is going on. With Adobe, I am going to track everything in a single GitHub repository, but will be working to create separate API definitions (OpenAPI and Postman collections) for each of the individual APIs being offered. To provide some context, it helps to understand why I profile APIs in the first place. As the API Evangelist I review public API operations, studying how API providers are doing what they do. I then aggregate the "building blocks" of their public operations into a master set of research that I use to drive my storytelling and API strategy workshops. So, with the Adobe APIs I'm not looking to review their API operations as much as I am looking to understand how they operate, and develop an understanding of how far along they are in their enterprise API journey. As with any profiling of a company, I begin by Googling their name plus API, but then dive as deep as I can into the details of what I find with each click. When you Google Adobe APIs you get this main landing page with the tagline, "APIs and SDKs for all Adobe products – create mobile, web and desktop apps". You can tell Adobe is working hard to bring together their APIs under one big tent, with the following main areas to support developers: Landing Page - Adobe API landing page. Authentication - Overview of authentication. Open...[Read More]


API Evangelist

Three Ways to Use Postman and Azure DevOps

22 Jan 2020

I set out to understand the role that Postman can play in an Azure DevOps powered API life cycle. I was fully prepared to crash course Azure DevOps and begin mapping out the role that Postman can play, but before I got started I began Googling Postman + Azure DevOps. I was happily surprised to find a number of rich walkthroughs written by the passionate Postman community--surpassing anything I could have put together for a version 1.0 of my Azure DevOps Postman guidance. I will still work to pull together my own official Azure DevOps Postman walkthrough, but to prepare I wanted to publish a summary of what I have found while thinking about how Postman and Azure DevOps can work together. The Postman Basics Before we get going with what I have found, I wanted to point to a couple of key concepts readers will need to be familiar with before they set out trying to use Postman with Azure DevOps, helping set the tone for any integration. It always helps to start with the basics and not assume all of my readers will understand what Postman delivers. Intro to collections - Getting familiar with what collections are, and how they work. Intro to collection runs - Understanding the nuance of how collections can be run. Intro to scripts - Learning about how to script within the collections being run. It is critical that you have a decent grasp on what is possible with Postman collections, and how they can be applied as part of any CI/CD pipeline. Most developers think of Postman as simply an HTTP client for making calls to APIs. Once you understand how collections can be run, and the many different ways that scripts can be applied, you will be much more effective at applying them as part of any pipeline, including with Azure DevOps--providing a great place to start. Testing Azure DevOps APIs Using Postman While mapping out this walk...[Read More]


API Evangelist

The State of California Doing APIs The Right Way By Starting Simple

22 Jan 2020

I got introduced to the CA.gov Alpha Team by my fellow government change maker Luke Fretwell (@lukefretwell) the other day, and I am beginning to tune into what they are up to in similar ways to how I've done with other city, state, and federal government entities over the years. We kicked off a conversation around their approach to delivering APIs, and what was possible with Postman. After we were done kicking things off they shared some links with me to help me get up to speed on what they have been doing with their new approach to delivering technology across the State of California. As far as first impressions go, I am super stoked with their approach. They are starting small, and working hard to be as public as possible about how they are doing everything. The CA.gov Alpha Team gets right down to the core of doing APIs well, by setting up the essential communication channels you need across any small or large organization. GitHub - All of the projects they develop are published to GitHub. Twitter - Providing a social stream from what is happening. Blog - Shaping the narrative around all of the work that is occurring. The CA.gov Alpha Team has not just gone all in on GitHub, they are all about their work truly existing in the public domain. It looks like everything they are doing is first being defined as a GitHub repository, providing a default way for other government stakeholders, as well as the public at large, to stay in tune with what is going on, and even contribute to what is happening. This is how all government should be by default, and the CA.gov Alpha Team provides one possible blueprint for other city, state, and federal agencies to follow. I really like that the CA.gov Alpha Team is seeding and managing everything out in the open on Twitter, and being so vocal about it all with a...[Read More]


API Evangelist

Help Defining 13 of the AsyncAPI Protocol Bindings

22 Jan 2020

I have been evolving my definition of what my API toolbox covers, remaining focused on HTTP APIs, but also making sure I am paying attention to HTTP/2 and HTTP/3 APIs, as well as those that depend on TCP only. My regular call with Fran Méndez (@fmvilas) of AsyncAPI reminded me that I should be using the specification to ground me in the expansion of my API toolbox, just as OpenAPI has defined much of it for the last five years. For this particular multi-protocol API toolbox research, the AsyncAPI protocol bindings reflect how I am looking to expand upon my API toolbox. Here are the 13 protocols being defined around the AsyncAPI specification: AMQP binding - This document defines how to describe AMQP-specific information on AsyncAPI. AMQP 1.0 binding - This document defines how to describe AMQP 1.0-specific information on AsyncAPI. HTTP binding - This document defines how to describe HTTP-specific information on AsyncAPI. JMS binding - This document defines how to describe JMS-specific information on AsyncAPI. Kafka binding - This document defines how to describe Kafka-specific information on AsyncAPI. MQTT binding - This document defines how to describe MQTT-specific information on AsyncAPI. MQTT5 binding - This document defines how to describe MQTT 5-specific information on AsyncAPI. NATS binding - This document defines how to describe NATS-specific information on AsyncAPI. Redis binding - This document defines how to describe Redis-specific information on AsyncAPI. SNS binding - This document defines how to describe SNS-specific information on AsyncAPI. SQS binding - This document defines how to describe SQS-specific information on AsyncAPI. STOMP binding - This document defines how to describe STOMP-specific information on AsyncAPI. WebSockets binding - This document defines how to describe WebSocket-specific information on AsyncAPI.
Not all of the protocol bindings are fully fleshed out, and AsyncAPI could use help from the community to quantify what is required with each of the protocols. I am going to try and contribute what I can as I make my way through each of the protocols as part of my API toolbox research. I am defining the building blocks for each of the protocols which...[Read More]
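For readers who have not seen a binding in the wild: bindings attach protocol-specific details to an AsyncAPI channel or operation. Here is a rough sketch of where an MQTT operation binding sits in a document, expressed as a Python dict; the field names (qos, retain, bindingVersion) follow the AsyncAPI bindings repository as I understand it, but treat the exact shape as illustrative and check the spec.

```python
# Sketch of where a protocol binding sits in an AsyncAPI document. The MQTT
# operation binding fields here (qos, retain, bindingVersion) are taken from
# my reading of the AsyncAPI bindings repo; verify against the spec itself.

asyncapi_doc = {
    "asyncapi": "2.0.0",
    "info": {"title": "Sensor Stream", "version": "1.0.0"},
    "channels": {
        "sensors/temperature": {
            "subscribe": {
                "bindings": {
                    # MQTT-specific delivery details for this operation.
                    "mqtt": {"qos": 1, "retain": False, "bindingVersion": "0.1.0"}
                }
            }
        }
    },
}

mqtt_binding = (asyncapi_doc["channels"]["sensors/temperature"]
                ["subscribe"]["bindings"]["mqtt"])
```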


API Evangelist

My Upcoming Talk with the UK Government Digital Services (GDS): The API Life Cycle Is For Everyone

21 Jan 2020

I am heading to London in February to talk to the UK government about APIs. They invited me out to talk about my history of work with government in the US and EU, and share my views of the API life cycle. To help share my view of the API landscape I pulled together a talk titled, "The API Life Cycle Is For Everyone". I am hoping to share my view of the fundamentals of a modern API life cycle, as well as emphasize the importance of both developers and non-developers having a place at the table. Here is what I've pulled together for my time with the GDS in London. APIs are widely considered to be something that is exclusively in the domain of software developers. While it is true that APIs are often a very technical and abstract concept which requires a more technically inclined individual to engage, APIs are something that impacts everyone across today's digital landscape, business users and developers alike, making the API development life cycle something all parties should be educated on, made aware of, and equipped to participate in. As part of my contribution to the GDS talks on interoperability and open standards I'd like to spend an hour with you talking through the human-machine intersection across: API Definitions - Talking about Swagger / OpenAPI, as well as Postman collections and environments, and how they are being put to use. API Documentation - Understanding common approaches to delivering and maintaining documentation for APIs that are being delivered. API Mocks - Thinking about how API mocking can be used to articulate and share what an API delivers for all stakeholders involved. API Testing - Understanding the role that API assertions and testing play in defining the operations and reliability of our API infrastructure. API Management - Looking at how API management secures our APIs, but also helps us develop the awareness of how they are used. API Contracts...[Read More]


API Evangelist

Looking at Electronic Data Interchange (EDI) Reminds Me that the API Economy is Just Getting Started

21 Jan 2020

I am neck deep in the expansion of what I consider to be my API toolbox, and I have been spending time mapping out the world of EDI. If you aren't familiar with Electronic Data Interchange (EDI), it "is the electronic interchange of business information using a standardized format; a process which allows one company to send information to another company electronically rather than with paper. Business entities conducting business electronically are called trading partners." EDI is the original API, providing a "technical basis for automated commercial "conversations" between two entities, either internal or external. The term EDI encompasses the entire electronic data interchange process, including the transmission, message flow, document format, and software used to interpret the documents". EDI is everywhere, and truly the backbone of the global supply chain, but one that you rarely hear about as part of the overall API conversation. I have regularly come across the overlap between EDI and API over the last 10 years of doing API Evangelist, and while I have engaged in discussions around modernizing legacy EDI approaches in healthcare and commerce, most other fundamental building blocks of the global supply chain are entirely new to me. Revealing how little I know about the bigger picture of EDI, and how small my API world actually is. I don't claim to know everything about information exchange and interoperability, but EDI is something that should be a bigger part of my storytelling, and the fact that it isn't is, I think, revealing about how much more work we actually have in front of us when it comes to delivering on the promise of the API economy. Take a look at some of the major EDI standards to get a sampling of the scope I am talking about. These are the electronic data interchange standards that governed commerce before the Internet was around, and continue to define how data moves around.
Edig@s (EDIGAS) - The Edig@s...[Read More]


API Evangelist

I Think We Will Have To Embrace Chaos With the Future of APIs

21 Jan 2020

I like studying APIs. I like to think about how to do APIs well. I enjoy handcrafting a fully fleshed out OpenAPI definition for my APIs. The challenge is convincing other folks of the same. I see the benefits of doing APIs well, and I understand the consequences of not doing them well. But, do others? I never assume they do. I assume that most people are just looking to get an immediate job done, and aren't too concerned with the bigger picture. I think people have the perception that technology moves too fast, and they either do not have the time to consider the consequences, or they know that they will have moved on by the time the consequences are realized. I'm pretty convinced that most of our work on API design, governance, and other approaches to try and standardize how we do things will fall on deaf ears. Not that we shouldn't keep trying, but I think it helps if we are honest about how this will ultimately play out. If I give a talk about good API design at a tech conference, everyone who shows up for the talk is excited about good API design. If I give a talk about good API design within an enterprise organization and leadership mandates everyone attend, not everyone present is excited, let alone cares about API design. I wish people would care about API design, and be open to learning about how others are designing their APIs, but they aren't. Mostly it is because developers aren't given the space within their regular sprints to care, but it is also because people are only looking to satisfy the JIRA ticket they are given, and oftentimes the ticket says nothing about the API being well designed, and consistent with other teams. Even with teams that have been given sufficient API design training and governance, if it isn't explicitly called out as part of the...[Read More]


API Evangelist

Expanding My API Toolbox for the Next Decade

21 Jan 2020

I am continuing to iterate on what I consider to be a modern API toolbox. API Evangelist research is born out of the SOA and API worlds colliding, and while I have been heavily focused on HTTP APIs over the years, I have regularly acknowledged that a diverse API toolbox is required for success, and invested time in understanding just what I mean when I say this. Working to broaden my own understanding of the technologies in use across the enterprise, and realistically map out what I mean when I say API landscape. I am still workshopping my new API toolbox definition for 2020, but I wanted to work on some of the narrative around each of the items in it, helping me learn along the way, while also expanding the scope of what I am talking about. Transmission Control Protocol (TCP) The Transmission Control Protocol (TCP) is one of the main protocols of the Internet protocol suite, and provides reliable, ordered, and error-checked delivery of a stream of bytes between applications running on hosts communicating via an IP network. The Web and APIs both rely on TCP, which is part of the Transport Layer of the TCP/IP suite. SSL/TLS often runs on top of TCP. It is the backbone of our API toolbox, but there are many different ways you can put TCP to work when it comes to the programming interfaces behind the applications we depend on. It can be tough to separate what is a protocol, and what is a methodology when looking at the API landscape. I’m still working to understand each of these tools in the toolbox, and organize them in a meaningful way—which is why I am writing this post. While all APIs technically rely on TCP, these approaches to communication and information exchange are often implemented directly using TCP. Electronic Data Interchange (EDI) - Electronic Data Interchange (EDI) is the electronic interchange of business information using a standardized format;...[Read More]
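Since TCP sits at the bottom of every tool in the box, a tiny example helps ground the abstraction. Here is a minimal sketch of the reliable, ordered byte stream described above: a throwaway echo server and client talking over localhost using Python's standard socket module.

```python
import socket
import threading

# A tiny TCP echo exchange over localhost, illustrating the reliable,
# ordered byte stream that every API in the toolbox ultimately rides on.

def echo_server(server_sock):
    conn, _ = server_sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)            # echo the bytes back unchanged

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))         # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

thread = threading.Thread(target=echo_server, args=(server,))
thread.start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello over tcp")
    reply = client.recv(1024)

thread.join()
server.close()
```

Everything higher in the toolbox, from HTTP APIs to EDI exchanges, is ultimately some convention layered on top of an exchange like this one.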


API Evangelist

DevOps Azure Style

17 Jan 2020

I am spending time thinking more deeply about how APIs can be delivered via Azure. I spent much of the holidays looking at how to deliver APIs on AWS, but only a small amount of time looking at Azure. I'm looking at how Azure can be used for the development and delivery of APIs, trying to understand the different ways you can use not just Azure for managing APIs, but also use Azure APIs for managing your APIs. Next up is Azure DevOps, and learning more about the nuts and bolts of how the orchestration solution allows you to streamline and stabilize the delivery of your API infrastructure using Azure. First, I want to break down the core elements of Azure DevOps, learning more about how Azure sees the DevOps workflow and how they have provided a system to put their vision to work. Here are the main elements of Azure DevOps that help us understand the big picture when it comes to mapping to your API life cycle. Azure DevOps Server - Share code, track work, and ship software using integrated software delivery tools, hosted on-premises. Azure Boards - Deliver value to your users faster using proven agile tools to plan, track, and discuss work across your teams. Azure Pipelines - Build, test, and deploy with CI/CD that works with any language, platform, and cloud. Connect to GitHub or any other Git provider and deploy continuously. Azure Repos - Get unlimited, cloud-hosted private Git repos and collaborate to build better code with pull requests and advanced file management. Azure Test Plans - Test and ship with confidence using manual and exploratory testing tools. Azure Artifacts - Create, host, and share packages with your team, and add artifacts to your CI/CD pipelines with a single click. Azure DevTest Labs - Fast, easy, and lean dev-test environments. Not every API implementation will use all of these elements, but it is still nice to understand...[Read More]


API Evangelist

A View of the API Delivery Life Cycle from the Azure Getting Started Page

17 Jan 2020

I am working my way through doing more work around the multi-cloud deployment of APIs and spending some more time on the Azure platform here in 2020, and I found their getting started page pretty reflective of what I'm seeing out there when it comes to delivering the next generation of software. When landing on the AWS home page it can be overwhelming to make sense of everything, and I thought that Azure organized things into a coherent vision of how software is being delivered in the cloud. Infrastructure Providing the fundamental building blocks of compute for all of this. Linux virtual machines Windows virtual machines I never thought I'd see Linux and Windows side by side like this. Languages Acknowledging there are multiple programming languages to get the job done. .NET Python Java PHP Node.js Go Again, I never thought I'd see such strong support for anything beyond .NET. Application This nails the different layers in which I see folks delivering API infrastructure. Web Apps Serverless Functions Containers Microservices with Kubernetes Microservices with Service Fabric I think it's silly to put microservices there, because APIs are delivered in all of them. Database The database layers behind the APIs we are all delivering across operations. Relational Databases SQL Database as a service SQL Database for the edge SQL Server on Azure PostgreSQL database as a service MySQL database as a service Azure Cosmos DB (NoSQL) Again, I am blown away to see MySQL and PostgreSQL along with SQL Server. Storage Where you put all of your blobs and other objects used across your APIs. Blob Storage I'd say this layer is a little anemic compared with other cloud environments. Machine Learning Acknowledging that machine learning is a growing area of API deployment. Machine Learning Cognitive Services Azure Notebooks This area will continue to grow pretty rapidly in coming years across all industries.
Interfaces The ways in which we are interfacing with the software development life cycle. Azure CLI ...[Read More]


API Evangelist

What Is Your API Development Workflow?

16 Jan 2020

I am going to invest in a new way to tell stories here on API Evangelist—we will see if I can make this stick. I enjoy doing podcasts, but I am not good at the scheduling and reliable repetition many expect of a podcast. Getting people to join me on a podcast takes a lot of work (I know from experience) to do reliably. People usually want to talk, but finding slots in both of our schedules and getting them to jump online and successfully record an episode isn't easy to do on a regular basis. However, I still want to be able to craft audio narratives around specific topics that are relevant to the API sector, while also allowing many different voices to chime in. So I've come up with a formula I want to test and see if I can build some momentum. To help stimulate the API conversation and bring in other voices I want to pose a single question on a regular basis and solicit audio responses from folks across the API space, then compile the results into a single podcast that I will publish on the blog and via other channels. All folks need to do in response to one of my questions is open up their phone, record their response, and send me the resulting audio file via email, DM, or carrier pigeon. Then I will organize all the responses into a single coherent podcast with me opening, asking my question, then chaining together the responses, and closing up with a little analysis. Make sense? A kind of an asynchronous podcast conversation amongst several participants. Ok, let's start with my first question: How do you develop APIs? Describe how you or your team actually develops an API. What is the workflow for how you go from idea to production, and what tools and services are involved? Be honest. I am not looking for fluff or pie in...[Read More]


API Evangelist

My Eventbrite API Keys Were Easy To Find

16 Jan 2020

If you read my blog regularly you know I rant all the time about having to sign up for new APIs and then find my API keys and tokens. API providers excel at making it extremely difficult to get up and running with an API, even once you have read their documentation and figured out what their API is all about. So when I come across API providers doing it well, I have to showcase it here in a blog post. Today's shining example of how to make it easy to find your API keys comes from the Eventbrite API. I was crafting a Postman API capability collection for my boss the other day, and I needed to find me an API key to get the data I needed out of the Eventbrite API. Finding the API paths we needed for the event and registration data had already taken us some time, so I was fully expecting the usual friction when it came to finding my API key. Then I clicked on the Eventbrite authentication page, clicked on the link telling me to visit my API keys page, and there they were! No hunting or detective work required—my keys were prominently placed above the fold. Amazing!!! This is how it should be. I shouldn't have to look around for my key—it is the 2020s. Please stop hiding my keys and making it hard for me to find what I need to get up and running with your API. As you are planning out how to develop and deploy the user experience for the API management layer of your operations, make sure you pick 25 existing public APIs, then sign up and find your keys. Learn from the experience and put your keys at a common URL that is prominently linked from your documentation and authentication page. If you have a favorite API that you think adding an application and finding your keys is the pattern...[Read More]


API Evangelist

API Life Cycle Governance Beyond Just API Design

16 Jan 2020

When you hear enterprise organizations talk about API governance they usually mean the governance of API design practices across the organization. This is the place where everyone starts when it comes to standardizing how APIs are delivered. It makes sense to start here because this is where the most pain is experienced at scale when you try to put APIs to work across a large enterprise organization. Even if all APIs and microservices are REST(ish), there are so many different ways you can deliver the details of an API--you might as well be using APIs from different companies when trying to put APIs developed across different teams to use in a single application. Making API design the first stumbling block teams consider when planning API governance, and something that would make a meaningful impact on how APIs are delivered. After working with enterprise organizations who have been on their API journey for 5+ years, I have begun to see API governance move beyond API design, and begin to look at other stops along the API life cycle, and work to standardize other critical elements. Here are some of the next steps I see enterprise organizations taking when it comes to getting a handle on API governance across teams: Documentation - Making sure everyone is using the same services and tooling for documenting APIs, making sure the most common elements are present, and all APIs are well defined. Monitoring - Requiring all teams monitor APIs and report upon the availability of each API, establishing a common monitoring and reporting practice that is consistent across all development teams. Testing - Standardizing tooling and approaches to API testing, indexing and cataloging the tests that are in place, and beginning to measure the test coverage for any API in production. Performance - Looking at the speed of APIs and making sure that all APIs are benchmarked as soon as they are developed, then measured against that across multiple...[Read More]
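The monitoring and performance stops above boil down to two numbers teams can actually standardize on: availability and a latency benchmark. Here is an illustrative sketch of computing both from recorded response samples; the sample shape and any thresholds you would enforce against these numbers are arbitrary examples, not an industry standard.

```python
# Illustrative sketch of the monitoring and performance governance checks
# described above: given recorded response samples for an API, compute its
# availability and a latency benchmark. The sample format is an assumption
# made for this sketch.

def availability(samples):
    """Fraction of samples that returned a successful (2xx) status."""
    ok = sum(1 for s in samples if 200 <= s["status"] < 300)
    return ok / len(samples)

def p95_latency(samples):
    """95th-percentile latency in milliseconds (nearest-rank method)."""
    latencies = sorted(s["ms"] for s in samples)
    index = max(0, int(0.95 * len(latencies)) - 1)
    return latencies[index]

samples = [
    {"status": 200, "ms": 120}, {"status": 200, "ms": 95},
    {"status": 500, "ms": 40},  {"status": 200, "ms": 210},
]
uptime = availability(samples)        # 3 of 4 samples succeeded
benchmark = p95_latency(samples)
```

Once every team reports these same two numbers the same way, governance conversations shift from "do you monitor?" to "what is your target, and are you meeting it?".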


API Evangelist

Eventbrite Events with Order Count and Capacity Using the API

15 Jan 2020

My boss asked me if I could build a Postman collection that would pull our future events from Eventbrite and display ticket counts for each individual event. So I got to work hacking on the Eventbrite API, learning each of the events API paths, stitching together what I needed to pull together my Postman collection for this new API capability. I’m a big fan of not just creating reference collections for different APIs like the Eventbrite API, but also creating individual capability collections that use one or many API requests to deliver on a specific business objective. I was able to craft my Postman API capability collection using two Eventbrite APIs, getting me the data I needed to satisfy my boss's request:
Events By Organization - Pulls all of the future active events for our Eventbrite organization.
Event Orders - Pulls the orders for each individual event, pulling the relevant information needed to assess each event.
This Eventbrite event order Postman capability collection only has one request in it, but I call the second API multiple times using a test script for the request. So in the end I’m making multiple API calls using a single Postman request, allowing me to get at what I need for each future event across multiple APIs--abstracting away some of the complexity. I have published the collection as a Postman template which you can access via the Postman documentation I’ve published, but you will need to add your own Eventbrite token and organization id to actually execute it. Once you have these properties entered you can click send and see a listing of events with ticket counts as well as maximum capacity for all the future events using the Postman visualizer tab. I’ve added this Postman capability collection to my list of individual API collections I’ve been building, providing a list of the common things I need to accomplish across the different platforms I depend on for my...[Read More]
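The fan-out pattern described above (one Postman request whose test script calls the orders API once per event) can be sketched as a plain function. The Eventbrite orders path and field names here are assumptions for illustration; inside Postman, each request object would be handed to pm.sendRequest():

```javascript
// Hedged sketch: build one follow-up orders request per future event.
// The /v3/events/{id}/orders/ path and token handling are illustrative
// assumptions, not the author's published collection.
function buildOrderRequests(events, token) {
  return events.map(function (event) {
    return {
      url: "https://www.eventbriteapi.com/v3/events/" + event.id + "/orders/",
      method: "GET",
      header: { Authorization: "Bearer " + token }
    };
  });
}

// Stubbed events standing in for the Events By Organization response.
var requests = buildOrderRequests(
  [{ id: "123" }, { id: "456" }],
  "EVENTBRITE_TOKEN"
);
console.log(requests.length); // 2
console.log(requests[0].url);
```

In a Postman test script, the loop body would call pm.sendRequest(request, callback) for each entry, which is how a single saved request can fan out into many API calls.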



Why Hasn’t There Been Another Stripe or Twilio?

13 Jan 2020

Stripe and Twilio are held up as shining examples of how to do APIs in our world. This shining blueprint of how to do APIs has been around for a decade for others to follow. It isn’t a secret. So, why haven’t we seen more Stripes or Twilios emerge? Don’t get me wrong, there are other well done APIs that have emerged, but none of them have received the attention and level of business that Stripe and Twilio have enjoyed. These things always get me thinking and wondering what the reality really is, and if the narrative we are crafting is the one that fits with reality on the ground—pushing me to ask the questions that others aren’t always prepared to ask. I am going to spend some time flagging some of the new APIs that do rise to the occasion, but while I am working on that I wanted to pose some questions about why we haven’t seen Twilio and Stripe being modeled by more API providers. Here are a few of my thoughts as I work through this view of the API landscape, helping me understand why there aren’t more API rockstars to showcase:
Investment - Investment cycles have changed, and the investment you need to do this right hasn't been available to startups in the last five years.
Blueprint - Twilio and Stripe are not a blueprint that applies universally to other APIs, but worked well in those business verticals.
APIs - This use case of APIs is not as universal as we think it is, and is not something that will work being applied to all business verticals.
Skills - It takes more skill than we anticipate when it comes to actually delivering an API as well as Twilio and Stripe have done.
Cloud - The dominance of the cloud providers is making it harder for small API startups to get traction and the attention of investors.
Wrong - These...[Read More]



The State of Simple CRUD API Creation

09 Jan 2020

With all the talk of APIs you would think it would be easier to publish a simple Create, Read, Update, and Delete (CRUD) API. Sure, there are a number of services and open source solutions for publishing a CRUD API from your database, but for me to just say I want a CRUD resource, give it a name, push a button, and have it—there isn’t much out there. I should be able to just write the word “images”, hit go, and have a complete images API that I can add properties to the schema, and query parameters to each method. After ten years of doing this I am just amazed that the fundamentals of API delivery are still so complicated and verbose. We even have the vocabulary to describe all of the details of my API (OpenAPI), and I still can’t just push a button and get my API. I can take my complete OpenAPI definition and publish it to AWS, Azure, or Google and “generate my API”, but it doesn’t create the backend for me. There have been waves of database or spreadsheet to API solutions over the years, but there is no single API solution to plant the seeds when there is no existing data source. Over the holidays I managed to create a Postman collection that will take my OpenAPI from a Postman-defined API and generate an AWS DynamoDB and AWS API Gateway instance of my API, but it was the closest I could get to what is in my head across AWS, Azure, and Google. Why can’t I just hit GO on my OpenAPI, and have an API in a single click? No matter which cloud provider I am on! The reasons why I can’t immediately have a CRUD API are many. Some are technical. Most are business reasons. I would say it is primarily a reflection of our belief that we are all innovative special snowflakes, when in reality we are all...[Read More]
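A rough sketch of the push-button idea, assuming nothing more than a resource name as input: generate the OpenAPI path items a CRUD generator would need. The field choices are illustrative, not any vendor's actual output:

```javascript
// Minimal sketch: given a resource name like "images", produce the
// OpenAPI 3.0 path items for a CRUD API. Summaries and the singular/
// plural handling are assumptions for illustration.
function crudPaths(resource) {
  var paths = {};
  paths["/" + resource] = {
    get:  { summary: "List " + resource },
    post: { summary: "Create a " + resource.replace(/s$/, "") }
  };
  paths["/" + resource + "/{id}"] = {
    get:    { summary: "Read one" },
    put:    { summary: "Update one" },
    delete: { summary: "Delete one" }
  };
  return paths;
}

var spec = { openapi: "3.0.0", paths: crudPaths("images") };
console.log(Object.keys(spec.paths)); // [ '/images', '/images/{id}' ]
```

The hard part the post identifies is not this definition step but provisioning the backend behind it, which is exactly what the cloud gateways leave out.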



A Postman API Governance Collection

09 Jan 2020

You can use Postman to test your APIs. With each request you can include a test script which evaluates each incoming response and validates specific elements, displaying the test results along with each response. However, you can also use the same mechanisms to evaluate the overall design of any API you are managing with Postman. One of the new beta features of Postman is being able to manage your APIs, allowing you to define each API using OpenAPI 3.0, then generate collections, mocks, docs, and tests with Postman. This got me thinking—why can’t we use the new Postman API manager, plus the Postman API, and scripted testing to govern the design of an API? To explore the possibilities I created a Postman collection for applying some basic API design governance to any API you have defined in a Postman workspace. The collection uses the Postman API to pull the OpenAPI for each API and store it within an environment, then there are a range of basic requests that can be made to evaluate the design of the APIs that we have defined as an OpenAPI. The collection is a proof of concept, and is meant to be a starting point for designing many different types of API governance rules, and thinking about how Postman collections can be used to govern the API life cycle, starting with the design of our APIs—something that is exposed as OpenAPI. My new Postman API governance collection has a handful of folders, and the following requests:
Info - Looking at the general info for the API. Validate the Name Of The API. Validate the Description for the API.
Paths - Evaluating the design patterns of each API path. Ensure Words Are Used in Paths.
Methods - Looking at the details of each API method. Check For GET, POST, PUT, and DELETE. Check All Methods Have Summaries. Check All Methods Have Descriptions. Check All Methods Have Operation Ids. Check All...[Read More]
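As a hedged sketch of what such governance rules might look like, here are a few of the checks above written as a plain function over an OpenAPI document; inside Postman the same assertions would live in a test script using pm.test() and pm.expect(). The rule wording is illustrative, not the author's actual collection:

```javascript
// Illustrative governance rules: name/description on info, plus
// summary and operationId on every method of every path.
function governanceReport(openapi) {
  var failures = [];
  if (!openapi.info || !openapi.info.title) failures.push("API is missing a name");
  if (!openapi.info || !openapi.info.description) failures.push("API is missing a description");
  Object.keys(openapi.paths || {}).forEach(function (path) {
    Object.keys(openapi.paths[path]).forEach(function (method) {
      var op = openapi.paths[path][method];
      if (!op.summary) failures.push(method.toUpperCase() + " " + path + " has no summary");
      if (!op.operationId) failures.push(method.toUpperCase() + " " + path + " has no operationId");
    });
  });
  return failures;
}

// Sample OpenAPI fragment with two deliberate gaps.
var report = governanceReport({
  info: { title: "Images API" },
  paths: { "/images": { get: { summary: "List images" } } }
});
console.log(report);
// [ 'API is missing a description', 'GET /images has no operationId' ]
```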



Spreading API Collections From My Personal Workspaces Across Multiple Workspaces

08 Jan 2020

As a Postman user for a number of years I have several hundred random collections littering my personal workspace. I had noticed that workspaces emerged a while back, but really hadn’t ever put much thought into how I organize my collections. As the number of collections grows I’m noticing performance issues within Postman, and general chaos, because I work primarily from within my personal workspace. This is pushing me to step back and think more holistically about how I create, store, organize, and share my API collections within the Postman platform and beyond, using AWS S3 and GitHub—forcing a little organization and structure on how I move APIs forward across their own API life cycle trajectory. First, when working in my personal workspace there were performance issues using Postman. There were just too many Postman collections in there to be efficient. This further slowed me down when it came to finding the collections I needed. Having to look purely alphabetically for collections that could have any sort of naming conventions applied to them took way too much time. Thinking about the different buckets in which I operate and get work done proved to be helpful, leading me to create a handful of workspaces to organize my API collections into, rather than just operating from a single workspace filled with hundreds of APIs I have imported over the years. My first task was to just delete anything that was clearly junk. Then I looked at all my collections via the Postman API to see if there was any last modified or run date—sadly there isn’t. I will have to think about ways in which I can track the evolution and usage of my Postman collections so that I can consider automating the cleanup of collections, or at least archiving them based upon whether they have been modified. Once I cleaned up a little bit I was able to see...[Read More]
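The cleanup audit described above leans on the Postman API's collections listing. A minimal sketch: the endpoint and X-Api-Key header are from Postman's public API, while the stubbed response and the alphabetical grouping logic are illustrative assumptions:

```javascript
// Since (as the post notes) the listing carries no modified/run date,
// all we can really do programmatically is sort names alphabetically
// before deciding what to keep, archive, or delete.
function groupCollections(response) {
  return response.collections
    .map(function (c) { return c.name; })
    .sort();
}

// A live call would look like:
//   fetch("https://api.getpostman.com/collections",
//         { headers: { "X-Api-Key": apiKey } })
// Stubbed response so the sketch runs without a key:
var names = groupCollections({
  collections: [{ name: "Twitter Followers" }, { name: "Eventbrite Orders" }]
});
console.log(names); // [ 'Eventbrite Orders', 'Twitter Followers' ]
```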



Postman Tutorials are Common but the Postman Collection is Often Missing

08 Jan 2020

I am amazed at the number of blog posts I come across from API providers explaining how their API consumers can use Postman with their API, but which do not actually share a complete Postman collection for developers to use. API providers obviously see Postman as a tool for making API calls, but do not fully grasp the ability to document an API with a Postman collection, then save, publish, and share this collection with documentation or the Run in Postman button. As part of this realization I am not looking to shame API providers for not understanding what is possible, I am more looking to acknowledge how much work we (Postman) have to do when it comes to helping folks understand what is possible with the Postman platform, moving folks beyond the notion that Postman is just an HTTP client. There are some pretty involved tutorials out there for using Postman with a variety of APIs. API providers have invested a lot into these blog posts, tutorials, and other resources to help their API consumers on-board with their APIs, but do not publish Postman collections as part of their documentation or tutorials. This tells me that API providers aren’t seeing the portability and share-ability of Postman collections. They simply see Postman as an API client, not as tooling for defining, sharing, publishing, versioning, saving, organizing, and evolving API requests. This means we have a lot of work ahead of us to educate folks about what Postman collections are, and how they will make your life easier while reducing redundancy across operations—helping folks move beyond simply operating Postman as an isolated HTTP client. Having full control over defining a request to an API while being able to see the details of the response is the core value of Postman. Developers get it. Clearly they also see the need to share this ability, and to equip others to realize the same value. They are crafting tutorials and blog posts...[Read More]



Deploy, Publish or Launch An API?

08 Jan 2020

I’m always fascinated by the words we use to describe what we do in a digital world. One dimension of the API life cycle that perpetually interests me is the concept of deploying an API, or as some might call it, publishing or launching. I am fascinated by how people describe the act of making an API available, but I’m even more interested in the shadows that exist within these realities. Meaning, within a 30 minute Googling session for publish, deploy, and launch an API, I come across many real world examples of delivering an API, but few of them deliver the actual tangible, functional nuts and bolts of the API. After searching for publish API, here is what stood out:
Apigee, SwaggerHub, Postman, Oracle, Broadcom, Azure, MuleSoft, WSO2, SAP, Socrata
After searching for deploy API, here is what stood out:
AWS API Gateway, Firebase, Google, Serverless Stack, Mendix, API Platform, API Evangelist, GitHub, Heroku
After searching for launch API, here is what stood out:
Adobe Launch, SpaceX, Apple Launch Services, RapidAPI
80% of these will not actually deliver the API—they will just take an existing API and make it available. I know most of these service providers believe that their solution does deploy an API because it proxies an existing API, but really very few of these actually deliver the API; they more publish, deploy, and launch it into some state of availability—the final act of making it available and open for business. After all these years of studying API gateway and management providers I’m still fascinated by the lack of true API deployment present, and how much it is about proxying what already exists, creating a shadow that continues to prevent us from standardizing how we deliver APIs.[Read More]



Dead Simple Real World API Management

08 Jan 2020

I began API Evangelist research almost a decade ago by looking into the rapidly expanding concept of API management, so I think it is relevant to go into 2020 by taking a look at where things are today. In 2010, the API management conversation was dominated by 3Scale, Mashery, and Apigee. In 2020, API management is a commodity that is baked into all of the cloud providers, and something every company needs. In 2010 there were no open source API management providers, and in 2020 there are numerous open source solutions. While there are forces in 2020 looking to continue moving the conversation forward with service mesh and other next generation API management concepts, I feel the biggest opportunity is in tackling the mundane work of just effectively managing our APIs using simple real world API management practices. I am neck deep in working to deploy a simple set of APIs, looking for the path of least resistance when it comes to going from 0 to 60 with a new API. After playing around with AWS, Azure, and Google for a couple days, reminded of how robust, but also complex, some of their API management approaches can be, I find myself on the home page of API Evangelist, staring at the page, and I click on my sole sponsor Tyk—finding myself pleasantly reminded how effective simple real world API management can be. Within 10 minutes I have signed up for an account and begun managing one of my prototype APIs, allowing me to:
Add API - Add the url and authentication for one of my project APIs.
Version - Choose to version, or not version, the API I am deploying.
Endpoints - Design a fresh set of endpoints transforming my API.
Load Balance - Round-robin load-balance traffic to all my APIs.
Regions - Manage the geographic distribution of my API infrastructure.
Rate Limit - Limit the number of API calls that can be made to my API.
Users...[Read More]



Postman Open Source

07 Jan 2020

I get asked a lot if Postman is open source. I get told occasionally that people wish it was open source. I have to admit I didn't fully grasp how open Postman was until I helped work on the new open source philosophy page for Postman. While the Postman application itself isn't open source (it is built on open source), the core building blocks of Postman are open source, shifting my view of how you can use the application across operations. Expanding Postman usage beyond just being a solitary desktop application, and turning it into a digitally scalable gear on the API factory floor. Postman as a desktop application is not open source, but here are the core components that are open source, making Postman something you can run anywhere:
Postman Runtime - The core runtime of Postman that allows you to run collections, including requests, scripts, etc. anywhere, extending the work that gets done within the application to anywhere the runtime can be installed and executed.
Postman Collections Format - The collections you save and share with Postman are all open source and can be shared, exported, published, and used as a unit of currency within any application or system, further extending the reach of the platform.
Newman - Command-line tool for running and testing a Postman collection as part of any pipeline, making Postman collections a unit of compute that can be baked into the software development life cycle, and leveraged as API truth wherever it is needed.
Postman Collection SDK - SDK to quickly unlock the power of the Postman Collections format using JavaScript, allowing you to create, manage, and automate how collections are defined and put to work across a platform without depending on the application.
Postman Code Generators - Convert Postman collections to usable code in more than 20 different programming languages, generating simple client scripts for consumers that are defined by the Postman collections used as the code generator's definition.
I am...[Read More]
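Since the collection format listed above is open source JSON, a minimal v2.1 collection can be written by hand and then run with the open source tooling (for example, newman run collection.json). The schema URL is the published one; the example request itself is illustrative:

```javascript
// A hand-rolled, minimal Postman collection in the open v2.1 format.
// The info.schema URL identifies the format version; item holds requests.
var collection = {
  info: {
    name: "Hello API",
    schema: "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
  },
  item: [
    {
      name: "Get status",
      request: { method: "GET", url: "https://example.com/status" }
    }
  ]
};

// Saved as collection.json, this is exactly what Newman or the
// Collection SDK consume outside the desktop application.
console.log(collection.item.length); // 1
```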



Challenges Binding APIs Deployed Via Gateway To Backend Services

07 Jan 2020

I spent some of the holidays immersed in the backend integrations of the top three cloud providers, AWS, Azure, and Google. Specifically I was studying the GUI, APIs, schema, mapping, and other approaches to wiring up APIs to backend systems. I am looking for the quickest API-driven way to deploy an API and hook it up to a variety of meaningful resources on the backend, beginning with SQL and NoSQL data stores, but then branching out to discover the path of least resistance for more complex backends. Maybe it is because of my existing experience with Amazon, but I found the AWS approach to wiring up integrations using OpenAPI to be the easiest to follow and implement, over what Azure and Google offered. Eventually I will be mapping out the landscape for each of the providers, but at first look, Azure and Google required substantially more work to understand and implement even the most basic backends for a simple API. Don’t get me wrong, if you want to just gateway an existing API using AWS, Azure, or Google, it is pretty straightforward. You just have to learn each of their mapping techniques and you can quickly define the incoming request and outgoing response mappings without much effort. However, for this exercise I was looking for an actual end-to-end deployment of an API, not the proxying or Hollywood front for an existing API. If you want to launch a brand new API from an existing data source, or a brand new API with a brand new data source, I found AWS to be the path of least resistance. I was able to launch a full read / write API using AWS API Gateway + AWS DynamoDB with no code, something I couldn’t do on Azure or Google without specific domain knowledge of their database solutions. I had only light exposure to DynamoDB, and while there were some quirks of the implementation I had to get over,...[Read More]
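For reference, AWS's OpenAPI-based wiring works through the x-amazon-apigateway-integration extension on each method. A hedged sketch of a method-to-DynamoDB integration, with the region, table name, and role ARN as placeholder assumptions (shown as a JavaScript object so the mapping logic is inspectable):

```javascript
// Illustrative x-amazon-apigateway-integration body wiring a GET method
// directly to DynamoDB's GetItem action, no Lambda in between.
// Account id, role, region, and table name are placeholders.
var integration = {
  type: "aws",
  httpMethod: "POST", // service integrations call AWS actions via POST
  uri: "arn:aws:apigateway:us-east-1:dynamodb:action/GetItem",
  credentials: "arn:aws:iam::123456789012:role/api-gateway-dynamodb",
  requestTemplates: {
    // VTL mapping template: lift the path parameter into the DynamoDB key.
    "application/json": JSON.stringify({
      TableName: "images",
      Key: { id: { S: "$input.params('id')" } }
    })
  }
};

console.log(integration.uri.indexOf("dynamodb") > -1); // true
```

In an actual OpenAPI document this object sits under the method definition, which is why the AWS route felt like "just OpenAPI" compared to the other providers.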



Academic or Street API Tooling

07 Jan 2020

It always seems like there are two separate types of tools in my world: the academic tools that consider the big picture and promise to steer me in the right direction, and then the street tooling that helps me get my work done on a day to day basis. After going to work for a street tooling vendor who has some academic tooling aspirations, it has gotten me thinking more about the tools I depend on, and learning more about what people are using within the enterprise to get their work done each day. I have used different academic tooling over my life as the API Evangelist. I’d say every API management tool I’ve adopted has been very academic until recently. From my view API management started as academic and then became a factory floor commodity. I’d say Kong and Tyk are the only ones that have achieved a street level status within all of this, and NGINX is looking to turn its street cred into something that is more academic and visionary. There aren’t many academic API tools that have gone from vision to implementation—they just can’t survive the investment and acquisition cycles that gobble them up, making it difficult to see the real adoption they need to become baked into our daily lives. API management has done it, but very few other stops along the API life cycle have realized this level of adoption. Street tooling, or the hand tools developers use to get their jobs done on a daily basis, are a much different beast. Postman and NGINX are both examples of tools that developers know about and depend on to operate each day—using NGINX to deploy and Postman to consume APIs each day. These aren’t tools that promise some grand vision of how we could or should be, these are tools about dealing with what is right in front of us. These are tools that keep...[Read More]



The Fundamentals: Deploying APIs From Your Databases

06 Jan 2020

You know, I tend to complain about a lot of things across the API space, while focusing on the damage caused by fast moving technology startups and the venture capital that fuels them. Amidst all of this forward motion I easily forget to showcase the good in the space. The things that are actually moving the conversation forward and doing the hard work of connecting the dots when it comes to APIs. I easily forget to notice when there are real businesses chugging along delivering useful services for all of us when it comes to APIs. One of my favorite database-to-API businesses out there, and one of the companies that has been around for a significant portion of my time as the API Evangelist, working hard to help people deploy APIs from their databases, is SlashDB. If you want to deploy APIs from your databases, SlashDB is the solution. If you are looking to make data within MySQL, PostgreSQL, SQLite, MS SQL Server, Oracle, IBM DB2, Sybase, RedShift, NoSQL, or another data source available quickly as an API, SlashDB has the solutions you are looking for. SlashDB isn’t one of those sexy new startups with a bunch of venture funding looking to be your new API best friend. SlashDB is looking to do the mundane, difficult work needed to make the data within your legacy databases available as APIs so that you can use it across your applications. SlashDB is all about securely exposing your data using standardized web APIs, making your digital resources available wherever you need them. SlashDB doesn’t have the splashy website, but they have the goods when it comes to doing one of the most common tasks when deploying APIs—wiring up your APIs to their data backends. They also have straightforward pricing tiers for you to navigate as you expand the number of data sources you are wiring up, and the number of consumers you have consuming data...[Read More]



Postman Collections For Pulling My Twitter Friends And Followers

06 Jan 2020

I have been cranking out the Twitter API capabilities lately, crafting single request Postman collections that focus on a specific capability of the popular social API. I use the API for a number of different things around API Evangelist, and as I assess how I use the social media API I want to be engineering my integrations as Postman collections so I can better organize and execute using Postman, while also adding to the list of API capabilities I’m sharing with my audience of developers and non-developers. Today I cranked out two individual Twitter API capabilities helping me better manage my Twitter followers and friends:
Twitter Followers - Pulls your Twitter followers 200 at a time, saves them within an environment, then allows you to increment through each page of followers, eventually pulling and storing all of your followers.
Twitter Friends - Pulls your Twitter friends 200 at a time, saves them within an environment, then allows you to increment through each page of friends, eventually pulling and storing all of your friends.
These capabilities are separate Postman collections so that they can be used independently, or together. I am keeping them organized in a Postman workspace so that I can use them manually, but then also have a daily monitor running, pulling any new followers or friends from my Twitter. I pull the resulting JSON from the environments I pair up with each collection using the Postman API and integrate it into some of my other API Evangelist monitoring and automation. Next I am going to create a Postman collection that will reconcile the two lists and tell me which people I am following do not follow me back, creating a third list that I can use to unfollow and clean up my profile. Crafting these types of collections helps me renew my understanding of some of the APIs I already use. It also helps me better define the individual capabilities I put to work on a daily basis, and...[Read More]
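The 200-at-a-time paging described above follows Twitter's cursor convention: each page of followers/list returns a next_cursor, and a cursor of 0 means you are done. A sketch of the loop with a stubbed page fetcher so it runs anywhere; the parameter names follow the v1.1 API, while the stub data is illustrative:

```javascript
// Generic cursor pagination: keep fetching until next_cursor is 0.
// In the Postman collection, each iteration is one call to
// GET followers/list?count=200&cursor=..., with the cursor kept
// in the environment between runs.
function collectAll(fetchPage) {
  var users = [];
  var cursor = -1; // -1 requests the first page
  while (cursor !== 0) {
    var page = fetchPage(cursor);
    users = users.concat(page.users);
    cursor = page.next_cursor;
  }
  return users;
}

// Stubbed pages standing in for the live API responses.
var pages = { "-1": { users: ["a", "b"], next_cursor: 99 },
              "99": { users: ["c"], next_cursor: 0 } };
var all = collectAll(function (cursor) { return pages[String(cursor)]; });
console.log(all); // [ 'a', 'b', 'c' ]
```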



My Levels of Postman API Environment Understanding To Date

06 Jan 2020

I have been a pretty hardcore Postman user since the beginning. Over the years I felt like I understood what Postman was all about, but one of the first concepts that blew up my belief about what Postman could do was the concept of the Postman environment. Like other Postman features, environments are extremely versatile, and can be used in many different ways depending on your understanding of Postman, as well as the sophistication of the APIs and the workflow you are defining using Postman. My Postman environments awakening has occurred in several phases, consistently blowing my mind about what is possible with Postman and Postman collections. Postman environments are already one of the edges I have given Postman collections over a pure OpenAPI definition—it just provides more environmental context than you can get with OpenAPI alone. However, at each shift in my understanding of how Postman environments can be used, entirely new worlds opened up for me regarding how that context can be applied and evolved over time across many different APIs. This has resulted in four distinct layers of understanding about how Postman environments work and can be applied in my world—I’m sure there will be more dimensions to this, but this is a snapshot of how I see things going into 2020. Environment Settings For Single API Calls - I have to start with the ground floor and express why environments matter in the first place, and provide an edge over OpenAPI all by itself. Being able to define key / value pairs for authorization and other variables across one or many different API collections helps speed up the on-boarding, orchestration, and reuse of API requests within those collections. It quickly allows you to switch users or other context, but still use the same collection of API requests, shifting how we automate and orchestrate across our API infrastructure. However, simply putting the base url for your API as a variable, and defining tokens and other...[Read More]
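A minimal sketch of that ground-floor usage: an environment as key/value pairs, with the base URL and token as variables so the same collection can be pointed at different stages or users. The names and values here are placeholders, and the resolver is a simplified stand-in for what Postman does with {{variable}} syntax:

```javascript
// An environment is just named key/value pairs; swapping environments
// swaps the context without touching the collection's requests.
var environment = {
  name: "Acme API - Staging", // placeholder name
  values: [
    { key: "baseUrl", value: "https://staging.example.com/v1", enabled: true },
    { key: "token",   value: "REPLACE_ME",                     enabled: true }
  ]
};

// Simplified {{variable}} substitution, standing in for Postman's own.
function resolve(template, env) {
  return template.replace(/{{(\w+)}}/g, function (m, key) {
    var match = env.values.filter(function (v) { return v.key === key; })[0];
    return match ? match.value : m;
  });
}

console.log(resolve("{{baseUrl}}/images", environment));
// https://staging.example.com/v1/images
```

Switching to a "Production" environment with a different baseUrl and token is then a one-click change that leaves every request definition untouched.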



A Dynamic Salesforce REST API Postman Collection Builder Collection

06 Jan 2020

I have been working on developing new ways to make the Salesforce API more accessible and easier to onboard with over the last couple of months, helping reduce friction every time I have to pick up the platform in my work. One of the next steps in this work is to develop a prototype for generating a dynamic Postman collection for the Salesforce REST API. I had created a Postman collection for the API earlier, but the Salesforce team pointed out to me that the available APIs will vary not only from version to version, but also from user account to user account. With this in mind I wanted to develop a tool for dynamically generating a Postman collection for the Salesforce API, and as I got to work building it I realized that I should probably just make the tool a Postman collection itself (mind blown). To help make on-boarding with the Salesforce API easier I created a Postman collection that uses the Salesforce API to autogenerate a Postman collection based upon the available objects and endpoints for the Salesforce REST API. The Postman collection has three requests within it to accomplish the creation of a dynamic collection. The first request pulls all the latest versions of the Salesforce API, using the Salesforce API. Once I have the version of the Salesforce API I am targeting for a build I add it to the Postman environment I am using to define the operations of my Postman collection, and then I pull the list of available objects for this version, and for my own Salesforce account. The objects that exist will vary for each Salesforce account, as well as version, making it pretty critical that any Postman collection is dynamic, being generated from this personalized list of objects. The next request in our Salesforce Postman collection builder is the build, which generates individual requests for all of the available objects. After you run, the...[Read More]
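The build step described above can be sketched as a function that turns the per-account objects list into one request per object. The response shape follows Salesforce's /services/data/vXX.X/sobjects endpoint, while the stub data, instance URL, and SOQL query are illustrative assumptions:

```javascript
// Hedged sketch: generate one Postman-style request item per available
// Salesforce object, so the collection reflects this account's objects.
function buildItems(sobjectsResponse, instanceUrl, version) {
  return sobjectsResponse.sobjects.map(function (obj) {
    return {
      name: "List " + obj.name,
      request: {
        method: "GET",
        // Illustrative SOQL query against each object.
        url: instanceUrl + "/services/data/v" + version +
             "/query/?q=SELECT+Id+FROM+" + obj.name
      }
    };
  });
}

// Stubbed sobjects response standing in for a live account's list.
var items = buildItems(
  { sobjects: [{ name: "Account" }, { name: "Contact" }] },
  "https://example.my.salesforce.com",
  "47.0"
);
console.log(items.length);  // 2
console.log(items[1].name); // List Contact
```

In the actual builder, the version would come from the first request (GET /services/data returns the available versions) and the resulting items would be pushed into a collection via the Postman API.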



The Many Differences Between Each API

03 Jan 2020

I’m burning my way through profiling, updating, and refreshing the listings for about 2K+ APIs in my directory. As I refresh the profile of each of the APIs in my index I am looking to make sure I have an adequate description of what they do, that they are well tagged, and I always look for an existing OpenAPI or Postman collection. These API definitions are really the most valuable thing I can find for an API provider, telling me what each provider's API delivers, but more importantly doing the same for other consumers, service, and tooling providers. API definitions are the menu for each of the APIs I’m showcasing as part of my API research. As I refresh the profile for each API I re-evaluate how they do their API, not just the technical details of their API, but also the business and on-boarding of their API. If an API provider doesn’t have an OpenAPI, Postman collection, or other machine readable definition for their APIs, depending on the value of the API and the standardization of their API design and documentation, I will craft a simple scrape script to harvest the API definition, and generate the OpenAPI and Postman collection automatically. As I cycle through this process for each API in my index I’m reminded of just how different APIs can be, even if they are just RESTful or web APIs. Demonstrating that there are many interpretations of what an API should be, both technically, and from a business perspective. Some APIs have many different paths, representing a wide variety of resources and capabilities. Some APIs have very few paths, and heavily rely on query parameters to work the magic when it comes to applying an API. Others invest heavily in enumerators and the values of query parameters to extract what you need from each API—oftentimes forgetting to tell you what these values should or could be. Some of the time...[Read More]



Pricing Comparison for Screen Capture APIs

03 Jan 2020

There is a pricing comparison between 33 separate screen capture APIs halfway down the page of this interesting piece about how to choose the right screen capture service. This type of comparison should exist across every business sector being impacted by APIs, as well as the new ones emerging to introduce entirely new digital resources for use in our desktop, web, mobile, device, and network applications. Sadly, right now these types of machine readable, let alone human readable, lists do not exist across the sector. Assembling these types of comparisons takes a lot of time and energy, and isn’t always possible in a special API snowflake of a world where seemingly similar APIs are actually very different beasts—sometimes intentionally, but usually unintentionally. I have had a machine readable schema for defining API pricing for almost five years now. I’ve profiled common resources like email, SMS, and others, but ultimately haven’t had the resources to invest in the work at the levels needed. I know how much work goes into establishing an exhaustive list of APIs in any business sector, as well as finding a price and defining the access tiers for each individual API provider. I wish I had more resources to invest in the profiling of APIs, but also profiling down to this level of detail, where each of the individual API resources they offer has some sort of semantic vocabulary applied, and a machine readable definition of the pricing and on-boarding required for each API provider. This is how we are going to get to the API economy we all like to fantasize about, where we can automatically discover, on-board, pay for, and switch between valuable API resources as we need in real-time. We need to get to work on doing this for the most tangible, consistent, and valuable APIs across the sector. We won’t be able to do it for all types of APIs, and sometimes it will be an apples to oranges comparison, but we...[Read More]
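To make the idea concrete, here is a hedged sketch of what one machine readable pricing entry might look like—the field names and values are illustrative, not the author's actual schema—along with the arithmetic a comparison tool would run on it:

```javascript
// Illustrative pricing entry for one provider in a sector.
var pricingEntry = {
  provider: "example-screenshots", // placeholder provider
  resource: "screen capture",
  plans: [
    { name: "free", monthlyPrice: 0,  includedCalls: 100 },
    { name: "pro",  monthlyPrice: 29, includedCalls: 10000,
      overagePerCall: 0.002 }
  ]
};

// With entries like this, cross-provider comparison is just arithmetic:
// monthly price plus overage for calls beyond the included allotment.
function costFor(plan, calls) {
  var overage = Math.max(0, calls - plan.includedCalls);
  return plan.monthlyPrice + overage * (plan.overagePerCall || 0);
}

console.log(costFor(pricingEntry.plans[1], 12000)); // 33
```

Collect one such entry per provider and the 33-API comparison table in the piece referenced above becomes a sortable, automatable dataset rather than a one-off editorial effort.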


API Evangelist

Not Just An API Provider But Also An API Matchmaker

03 Jan 2020

Your API is always the best. Of course it is. However, not everyone will see the value your API delivers without a little enlightenment. Sometimes the value of an API is missed in isolation, when you are just looking at what a single API can do. To help developers, as well as business users, understand what is possible, it can help to connect the dots between your API and other valuable 3rd party APIs. This is something you see from API providers who have integration pages showcasing the different integrations that are already available, and those who have invested in making sure their API is available on integration platform as a service (iPaaS) providers like IFTTT and Zapier. If a new user isn’t up to speed on what your API does, it can help to put it side by side with other APIs they are already familiar with. Being aware of not just the industry you are operating an API within, but also complementary industries, is what we should all be striving for as an API provider. The most competitive API providers all have integration pages demonstrating the value that an API provides, but more importantly the value it can deliver when bundled with other popular services their customers are already using. This means that API providers have to be solving a real world problem, but also have done their homework when it comes to understanding a real world version of this problem that other people face. Or they simply have enough consumers of their API demanding integrations with other commonly used platforms. Regardless of how an API provider gets there, having an awareness of other platforms that companies are depending on as part of their operations, and ensuring that your API solutions are compatible and interoperable by default, just makes sense. I find that playing with a complementary API in Postman helps you think about the moving parts of...[Read More]


API Evangelist

What Is The API Life Cycle?

02 Jan 2020

I regularly struggle with the words and phrases I use in my storytelling. I’m never happy with my level of word-smithing, or the final output. Ultimately I don’t let it stop me, I just push myself to constantly re-evaluate how I speak, being forever critical and often pedantic about why I do things, and why I don’t. One word I struggle with is lifecycle. First I struggle with whether it is one word or two. Historically I have been team one word, but more recently I’ve switched to two words. However, this round of anxiety over the phrase is more operational and existential, rather than being about how I use the word in my storytelling. I am more interested in whether we should even be using the phrase, and if we are, how we get more formal about quantifying exactly what we mean by the API life cycle. As I work to flesh out my API life cycle Postman collection, defining API-driven guard rails for how I deliver my APIs, and distilling each step down to a single request and set of pre and post request scripts, I am forced to think about what the API life cycle really is. Pushing me to go beyond just talking about some abstract concept, to actually having a set of interfaces and scripts that quantify each stop along the API life cycle. While I will be adding more stops to my Postman API life cycle collection, I currently have 27 stops defined, providing me with some concrete actions I can take at each point in the evolution of my APIs. Define - Defining the central truth of the API using OpenAPI, JSON Schema, and Postman collections and environments. Environments - Providing environments that drive different stages of the API life cycle in conjunction with various collections. Design - Quantifying, standardizing, and evolving the HTTP and other design patterns I use across the APIs I deliver....[Read More]


API Evangelist

Deploying My Postman OpenAPI To AWS API Gateway

02 Jan 2020

I created a bunch of different Postman collections for AWS services leading up to re:Invent this year, and now I’m using individual requests to deliver on some different Postman AWS API life cycle workflows. To flesh out the scaffolding for how I define and deliver APIs throughout their API life cycle, I got to work on a Postman collection for defining and executing every single stop in my API life cycle in a way that I could consistently apply across many different APIs. I am using Postman to define the central truth of each of my APIs with OpenAPI, and I want to use Postman to deliver and execute on that truth across every single stop along the API life cycle. One of the more critical stops I wanted to provide a solution for was API deployment, providing me with a simple way to immediately deploy an API from an OpenAPI definition. Deploying APIs is hard. It is one of the most complicated and least standardized stops along the API life cycle. Regardless, I wanted a simple, straightforward Postman collection that would allow me to take an API definition within Postman, and publish an API to one of the major cloud platforms—AWS won out for simplicity. Ultimately, using Postman I was able to pull an OpenAPI for one of my APIs, then deploy an API in five steps. Providing a basic, introductory Postman collection for deploying a Postman API to AWS API Gateway. Pull API - Loads up the specific version of a Postman API into the environment for processing within each of the next steps. Create Table - Actually creates an AWS DynamoDB table derived from the name of the API being pulled from Postman. Prepare OpenAPI - Takes the OpenAPI and generates AWS API Gateway integration extensions that define the backend. Publish OpenAPI - Takes the new OpenAPI with integration extensions and publishes to AWS API Gateway. Deploy API - Actually deploys the API...[Read More]
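The Prepare OpenAPI step is where the AWS-specific details get injected. A minimal Python sketch of what that step might do, assuming a simple HTTP proxy backend; the `add_gateway_integrations` helper and the backend URL are hypothetical, not the actual collection's script:

```python
def add_gateway_integrations(openapi: dict, backend_url: str) -> dict:
    """Inject an x-amazon-apigateway-integration extension into each
    operation, pointing every path at an HTTP proxy backend."""
    for path, operations in openapi.get("paths", {}).items():
        for method, operation in operations.items():
            operation["x-amazon-apigateway-integration"] = {
                "type": "http_proxy",
                "httpMethod": method.upper(),
                "uri": backend_url.rstrip("/") + path,
                "passthroughBehavior": "when_no_match",
            }
    return openapi

# A tiny invented OpenAPI document standing in for one pulled from Postman
spec = {
    "openapi": "3.0.0",
    "info": {"title": "example-api", "version": "1.0.0"},
    "paths": {"/items": {"get": {"responses": {"200": {"description": "OK"}}}}},
}

prepared = add_gateway_integrations(spec, "https://backend.example.com")
print(prepared["paths"]["/items"]["get"]["x-amazon-apigateway-integration"]["uri"])
```

The resulting document can then be handed to AWS API Gateway (for example via the `import_rest_api` call in the AWS SDK), which is roughly what the Publish OpenAPI step does.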


API Evangelist

A Postman Collection for Managing the Life Cycles Of My APIs

02 Jan 2020

I had grown weary of just researching, talking, and teaching about the API lifecycle over the last ten years as the API Evangelist. This was one of the major motivators for me to join the Postman team. I want to take my knowledge of the API life cycle and work to make sure the rubber meets the road a little more when it comes to actually realizing much of what I talk about. I began investing in this vision over the holidays by crafting a Postman collection that isn't for defining a single API; it is meant to define the life cycle of a single API. I can manage multiple stops along the API life cycle already with Postman--I just wanted to bring it all together into a single machine readable collection that uses the Postman API, but also other APIs I use to orchestrate my world each day. My API life cycle collection is still a work in progress, but it is coming together nicely, and is the most tangible format of what I have had in my head when I think of Postman as an API delivery platform. This collection centers around managing an OpenAPI truth within Postman, then moving this API definition down the life cycle, and even deploying development or production versions of each API using AWS API Gateway. Of course everything is API-driven, and designed to work across many different APIs to define, deliver, and manage any single API, maintaining a definition of the life cycle within a single Postman environment that can be used to bridge multiple API platforms via a single collection. So far I have over a hundred individual capabilities defined as Postman requests, and organized into folders that are broken down by different stops along the API life cycle. I'm still moving them around and abstracting away the friction, while I work hard to define the most sensible workflows with each of my API life cycle...[Read More]


API Evangelist

Pulling Your Twitter Bookmarks Via The Twitter API

30 Dec 2019

I created two Twitter API capabilities the other day to help someone pull a list of their Twitter favorites using the Twitter API. They said they wanted bookmarks, and I assumed they used favorites in the same way I do (as bookmarks), so I created one Postman collection for pulling API favorites, and another to parse the URLs present in the body. I use Twitter public likes as a way of bookmarking, then I harvest those via the Twitter API--something I've done for over a decade. I had heard of Twitter bookmarks, and seen them in the desktop and mobile apps, but hadn't really made the shift in my brain. So I assumed they were talking about likes. DOH! Anyways, they tweeted back at me and helped me realize my misconception. Ok, so how do we still get them their bookmarks? After some quick investigation, there is no Twitter API for your private bookmarks, making the pulling of your data a little more challenging, but not impossible. This is where I begin helping people not just understand the technology of APIs, but also the politics of API operations. Meaning, Twitter has an API for your bookmarks, they just don't want you to get at it via the public API (I am not sure why). Anyways, in this scenario I can't make a ready to go Postman collection for you to use; I am going to have to teach you a little bit more Postman Kung Fu, and show you how to sniff out the APIs that exist behind everything you do each day. It is still something you can do without programming, and with Postman you can still get at your data in the same way we did for the public Twitter favorites API. You just have to be curious enough to not turn away as I pull back the curtain on the world of APIs a little bit more, with a simple walkthrough. Something that...[Read More]


API Evangelist

Pulling Links From Those Tweets You Have Favorited

29 Dec 2019

I am busy crafting new API capabilities from the laundry list of requests I have from folks. When I get an email or come across a Tweet with someone asking how they do something on Twitter I will add it to my list, and at some point pull together a simple Postman collection for accomplishing what is being desired. Providing a single Twitter capability that I can add to my list, and anyone (hopefully) can put to use with their own Twitter account and application, within their own local Postman environment. My goal here is to help provide simple API-driven capabilities that anyone can use, while also pushing my skills when it comes to crafting useful Postman collections that aren’t just for developers. Today’s API capability is from Elana Zeide (@elanazeide) who asked on Twitter, “So now I have a lot of twitter bookmarks of amazing things you people have shared ... is there any way to export/download them to another app? (I know you can do it w/ likes) Anyone come up with some clever workaround/automation?”. To possibly help her out I started by creating a single Postman collection that just pulls the favorites for any Twitter user via the Twitter API. Pull Twitter Favorites Capability - It authenticates with the Twitter API, pulls the likes for any Twitter user using their handle, and publishes the list of favorites to the visualizer screen. This is a perfectly usable API capability all by itself, but once I was done I used it as my base for pulling any URL that is present in each Tweet. Making for an entirely separate Twitter API capability that I hope folks will find useful. Pull Links From Twitter Favorites Capability - It authenticates with the Twitter API, pulls the likes for any Twitter user using their handle, extracts all of the links from those tweets, and publishes the list of links to the visualizer screen. Both of these...[Read More]
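The link extraction in the second capability leans on the `entities.urls` array the Twitter API returns with each tweet. A rough Python equivalent of that script logic, using a made-up sample shaped like the API's response:

```python
def extract_links(tweets):
    """Pull expanded URLs out of the entities block of each tweet,
    the structure Twitter's favorites endpoints return per tweet."""
    links = []
    for tweet in tweets:
        for url in tweet.get("entities", {}).get("urls", []):
            # Prefer the expanded URL over Twitter's shortened t.co form
            links.append(url.get("expanded_url") or url.get("url"))
    return links

# Hypothetical sample shaped like a Twitter API favorites response
favorites = [
    {"text": "Great read", "entities": {"urls": [{"expanded_url": "https://example.com/post"}]}},
    {"text": "No links here", "entities": {"urls": []}},
]

print(extract_links(favorites))
```

In the Postman collection this same walk happens in a test script, with the resulting list handed to the visualizer rather than printed.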


API Evangelist

How My API Evangelist Research and Writing Works

28 Dec 2019

Many folks don’t quite get my work and writing style. They are confused by the erratic flow of stories being published to API Evangelist, the incomplete nature of some of my research sites, and other annoying factors that don’t quite make sense when you view API Evangelist a particular way. If you think of it as a technology blog like Techcrunch, ReadWrite, The New Stack, or others, you will be passing certain judgement on the content of my work, the tone of what I say, and the chaotic way in which I publish my research and stories across hundreds of separate sub-domains. People expect me to write up their API, review their approach, or know everything about the thousands of APIs that exist across the public landscape. API Evangelist isn’t this type of blog—it is simply my workbench for things that interest me, are relevant to the industry and my career, or are valuable to someone who pays me to generate value in the API universe. Two Distinct Layers Of Research There are two main layers to my research, which I use to mine API information and knowledge. These two dimensions feed off of each other, but ultimately drive my research, storytelling, and at times the wider conversation in the API space. Helping me organize everything into these two buckets: Landscape - Reviewing the public and private API offerings across many different business sectors, providing me with a unique view of how API providers are doing what they do. Life Cycle - Taking what I’ve learned across the landscape and organizing information and knowledge by stops along the API life cycle, for use in my regular work and storytelling. These two layers are constantly feeding each other. For example, after making a pass through all the payment APIs, updating the landscape for that area, I will add new building blocks I’ve stumbled across to my API life cycle research. Then when I embark on research into the...[Read More]


API Evangelist

Atlassian Provides Run in Postman and OpenAPI by Default for Jira, Confluence, and BitBucket APIs

27 Dec 2019

I was profiling the Atlassian APIs, considering what is possible with JIRA, Confluence, and Bitbucket. Three services that are baked into many enterprise organizations I’ve worked with over the years. My intention was to create a Postman collection for JIRA, but once I landed on the home page for the API I noticed they had a button in the top corner for running in Postman, and a dropdown for getting an OpenAPI 3.0 spec. Which is something that I strongly believe should be the default for all APIs, ensuring there is a prominently placed link to the machine readable truth behind each API. I like seeing Postman as the default executable in the top corner of the documentation for APIs. I also enjoy seeing the orange Run in Postman button across documentation, blog posts, and other resources—helping folks quickly on-board with some API resource or capability. I want to see more of this. I’d like it all to become the default mode of operating for API providers. I want all API providers to manage an OpenAPI truth for their API, while also developing and evolving many different Postman derivatives of that truth. Providing reference collections that describe the full surface area of our APIs, while also making sure there are more on-boarding, workflow, and capability style collections that empower end-users to put APIs to work, distributed across API documentation and the stories we tell about what is possible with our APIs. Interestingly, the Postman collection isn’t just a unit of representation for the JIRA, Confluence, and BitBucket APIs. The Postman collection is also a representation of the unit of work that is executed across these platforms. If you have worked in software development across the enterprise you know what I am talking about. Postman is the Swiss Army Knife for how enterprise developers develop and deliver their work, which is defined and tracked using JIRA, Confluence, and BitBucket, but Postman collections are also how...[Read More]


API Evangelist

Applying An API-First Approach To Understanding The Pacific Northwest Mushroom Industry

23 Dec 2019

This is an API first project for mapping out the mushroom industry. I have always had a passion for mushrooms, but as I get older I am looking at investing in more side projects that aren’t always 100% about APIs. I wanted to spend some time this holiday refreshing my memory about what types of mushrooms are available on the market, and what types of products are being made from them. As I do with any data or content driven research, I begin by creating an API to store all of the data and content I am gathering, helping me flesh out the dimensions of each business sector I am interested in. As with all of my work I really don’t know where this research is headed—I am just interested in learning about mushrooms. Eventually I’d like to use this data and content in a variety of web and mobile applications, but since I’m just getting started I don’t really understand all of the data I need to gather. A situation that is perfectly suited for beginning as an API first project, helping me not just gather the data I need, but also do it in a way that will help me prepare for the future, while also not investing too much into wiring up a database, coding a web or mobile application, and any other costly infrastructure that may (or may not) be needed down the road. By starting API first, I am able to flesh out the schema and structure of the APIs which will drive my research, and the resulting applications I will be needing down the road. To get started I spent about 10 minutes thinking about the main areas of resources I will need to track across my work, and created ten separate individual resources. Mushrooms - A list of the mushrooms, their scientific names, description, and the beginning of what I will need to map...[Read More]
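As a sketch of what starting API first can look like in practice, this is the kind of minimal OpenAPI skeleton such a project might begin from; the Mushrooms resource name comes from the post, while the title and schema fields are assumptions:

```python
# A minimal, hypothetical OpenAPI skeleton for an API first research
# project; only the resource name is from the post, the rest is assumed.
spec = {
    "openapi": "3.0.0",
    "info": {"title": "Mushroom Industry Research API", "version": "0.1.0"},
    "paths": {
        "/mushrooms": {
            "get": {
                "summary": "List mushrooms",
                "responses": {"200": {"description": "OK"}},
            }
        }
    },
    "components": {
        "schemas": {
            "Mushroom": {
                "type": "object",
                "properties": {
                    "name": {"type": "string"},
                    "scientific_name": {"type": "string"},
                    "description": {"type": "string"},
                },
            }
        }
    },
}

print(sorted(spec["paths"].keys()))
```

The point of sketching the contract first is that each of the ten resources becomes a path and a schema long before any database or application code exists.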


API Evangelist

API Providers Should Maintain Their Own API Definitions

23 Dec 2019

I am working my way through 2K+ API providers, refreshing my view of the API landscape, and the data I use to tune into the API economy. As I refresh the profile of each API provider, one of the main things I’m doing is looking for an OpenAPI or Postman collection. While the profiling of their API operations using APIs.json is critical, having a machine readable definition for each API is kind of the most important part of this process. Having an OpenAPI or Postman collection gives me a machine readable list of the value that each API delivers, and allows me (and others) to more easily integrate an API into other applications and systems. Sadly, not every API provider understands the need, or is able to invest the resources to produce an API definition. While profiling an API provider, the most ideal situation I can come across is when an OpenAPI already exists in service of API documentation, or the API provider just gets the struggle of their API consumers and they have a Postman collection already published. Ideally, the OpenAPI is publicly available and I don’t have to dig it out from behind the documentation, or they have the Run in Postman button clearly published on their website. In the best situations, API providers have their OpenAPI and / or their Postman collections published to GitHub, and are actively maintaining their API definitions using Git, which allows API consumers and API service providers to depend on an authoritative source of truth when it comes to API definitions for each API they use. I wish every API provider would maintain their own API definitions in this way; sadly very few do. The majority of APIs I come across do not have documentation driven by OpenAPI and do not have Postman collections. When I encounter one of these API providers I usually spend about 60 seconds googling for Swagger, OpenAPI, and Postman + their name in...[Read More]


API Evangelist

Where Does The Exhaust For Your API Operations End Up Being Stored?

20 Dec 2019

As part of my ongoing API discovery and observability research, I am interested in better defining the common places within the enterprise where we find API signals. Those log files and other exhaust by-products from API operations that will contain hosts, paths, parameters, and other parts and pieces of the APIs that are already in operation. API discovery is complex, and it isn’t something I think we are going to be able to solve by mandating that teams make their APIs more discoverable; I think it is something we are going to have to do for them. Augmenting their existing work with services and tooling that then define what APIs they are producing and consuming as part of their existing tools, applications, and systems. Further expanding the definition of API observability by tapping the exhaust from the outputs of existing infrastructure, to help us map out the API landscape that exists within the enterprise. I am currently helping the Optic folks think beyond the personal value their proxy delivers for individual developers by proxying your desktop, web, mobile, and Postman traffic and automatically generating OpenAPI definitions for you, and consider what the more industrial grade use cases will be. As part of these conversations I am thinking more deeply about how APIs are operated within the enterprise, and being more formal in how I discuss where you can tap into the existing exhaust that is captured around API operations, building on the following list I already have. Apache Log File - The most ubiquitous open source web server out there is the default for many API providers. NGINX Log File - The next most ubiquitous open source web server is definitely something I should be looking for. IIS Log File - Then of course, many Microsoft web server folks are still using IIS to serve up their API infrastructure. Amazon CloudWatch - Looking at how the enterprise is centralizing their logs with CloudWatch...[Read More]
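The first two log sources above share the common "combined" access log format, which makes the harvesting idea concrete. A minimal Python sketch that pulls unique method and path pairs out of Apache or NGINX style log lines; the regex only covers the request portion of each line, and the sample entries are invented:

```python
import re

# Matches the request and status portion of an Apache/NGINX combined log line
LOG_LINE = re.compile(r'"(?P<method>[A-Z]+) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def harvest_api_paths(log_lines):
    """Collect the unique method + path pairs seen in web server logs,
    a starting point for mapping the APIs already in operation."""
    seen = set()
    for line in log_lines:
        match = LOG_LINE.search(line)
        if match:
            seen.add((match.group("method"), match.group("path")))
    return sorted(seen)

# Invented sample lines in the combined log format
sample = [
    '127.0.0.1 - - [20/Dec/2019:10:00:00 +0000] "GET /api/v1/users HTTP/1.1" 200 512 "-" "curl"',
    '127.0.0.1 - - [20/Dec/2019:10:00:01 +0000] "POST /api/v1/orders HTTP/1.1" 201 64 "-" "curl"',
]
print(harvest_api_paths(sample))
```

Run across enough infrastructure exhaust, this kind of pass yields the raw hosts and paths needed to seed OpenAPI definitions for APIs nobody ever formally documented.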


API Evangelist

OpenAPI is the Static Truth and Postman Collections are Real World Derivatives of that Truth

20 Dec 2019

I was talking with the Optic folks this morning about API definitions when they asked me for my opinion on what the difference between OpenAPI and Postman is. A question that isn’t easy to answer, and will produce many different answers depending on who you are talking with. It is a question I’ve been struggling with since before I started at Postman, and will continue to struggle with over the coming years as their Chief Evangelist. The best I can do right now is keep writing about it, continue talking with smart people like Optic, and iterate upon the answer until I can better see what is happening. Here is how I see things currently: OpenAPI is the static truth, and Postman collections are the real world, real time derivatives of this truth. Each individual Postman collection reflects the derived value of an API, representing how a developer, application, or system integration is applying this value in the real world. Now if you squint your eyes, all of those Postman collection derivatives roll up into a single OpenAPI truth. OpenAPI is essential for nailing down the overarching truth of what an API contract delivers, while Postman is essential in quantifying, realizing, and executing this truth on the ground for a specific business use case. There are definitely ways in which OpenAPI and Postman collections overlap, but then there are the ways in which they bring different value to the table. When it comes to capital G Governance, OpenAPI is more meaningful to business leadership—it represents a more constant truth that can then be translated within services, tooling, and defining policy at the macro level. When it comes to lowercase g governance, Postman collections are more meaningful to developers, because they represent the transactions they need to accomplish each day, which are derived from the greater truth, but have more context regarding each specific business transaction that a developer is expected to deliver. This...[Read More]


API Evangelist

How I Profile The TypeForm API

20 Dec 2019

I was asked for more information about how I profile APIs, and deal with the many differences I come across. It isn’t easy navigating the differences between APIs and coming out with a standard API definition (OpenAPI or Postman collection) that you can use across different stops along the API life cycle. I’m pretty agile and flexible in how I approach profiling different APIs, with a variety of tools and tricks I use to vacuum up as much detail as I possibly can with as little manual labor as I possibly can. The example for profiling that was thrown at me was the TypeForm API, which is a pretty sophisticated API, but will still need some massaging to create an acceptable set of API definitions. The first thing I do is search for an OpenAPI definition, hopefully published to GitHub or prominently linked off their documentation, but I will settle for having to sniff it out from behind an API’s documentation. TypeForm doesn’t have an OpenAPI or Swagger available (from what I can tell). Next, I go looking for a Postman collection. Boom!! Typeform has a Postman collection. The question now is why hasn’t Typeform published their Postman collection to the Postman Network? I will Tweet at them. Ok, now I have a machine readable definition for the Typeform API that I can import into my API monitoring system—which is just an API that I use in Postman to import a Postman collection (head explodes). My Postman collection import API grabs as many elements from the Postman collection definition as it can, normalizing the paths, parameters, and other details for an API. I am always adding to what my API is capable of, but it does a pretty good job of giving me what I need to begin to profile the surface area of an API. Now I have all of the paths imported into my monitoring system. However, I am still at the mercy of how...[Read More]
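The import and normalization step described here can be sketched simply. Assuming a collection in the Postman v2.x format, something like this walks the nested folders and pulls out each request's method and path; the `collection_paths` helper and the sample fragment are my own illustration, not the actual import API:

```python
def collection_paths(collection):
    """Walk a Postman collection (v2.x shape) and pull out each request's
    method and path; folders nest through the 'item' array."""
    paths = []

    def walk(items):
        for item in items:
            if "request" in item:
                request = item["request"]
                url = request.get("url", {})
                path = "/" + "/".join(url.get("path", []))
                paths.append((request.get("method", "GET"), path))
            # Folders (and even requests) may carry nested items
            walk(item.get("item", []))

    walk(collection.get("item", []))
    return paths

# Hypothetical fragment shaped like a Postman v2.1 collection export
collection = {
    "item": [
        {"name": "Forms", "item": [
            {"name": "List forms", "request": {"method": "GET", "url": {"path": ["forms"]}}}
        ]}
    ]
}
print(collection_paths(collection))
```

A real import also has to normalize path variables, query parameters, and headers, but the recursive walk over `item` is the backbone of the whole process.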


API Evangelist

What Else Has Influenced APIs Over the Last 50+ Years?

19 Dec 2019

Because I have so many smart folks in my Twitter timeline I want to put out some of the seeds for stories I am working on for 2020. I want your help determining what has set the stage for the world of APIs we all believe in so much. Here are just a handful of the nuggets I have pulled out of my research and reading.

Early On

- 1933 - Telex Messaging
- 1949 - Memex (Linked Documents)
- 1949 - Computer Talk Over Phone
- 1958 - Digital Phone Lines
- 1959 - Semi-Automatic Ground Environment (SAGE) Wide Area Network
- 1961 - Computer Time Sharing
- 1963 - Hypertext
- 1963 - Hypermedia
- 1964 - Sync Satellite Television Network
- 1964 - IBM Sabre Reservation System
- 1966 - Michigan Educational Research Information Triad (MERIT)
- 1968 - Multiplexing
- 1969 - Mass Produced Software Components by Malcolm Douglas McIlroy
- 1969 - Host Software, The First RFC
- 1969 - ARPANET Four Initial Nodes Established
- 1969 - Compuserve

1970s

- 1970 - ARPANET Reaches East Coast (MIT)
- 1971 - Email
- 1971 - File Transfer Protocol (FTP)
- 1971 - TELNET
- 1971 - ARPANET Has 23 Nodes
- 1972 - ARPANET Has 29 Nodes
- 1973 - ARPANET Has 40 Nodes
- 1974 - ARPANET Has 46 Nodes
- 1974 - Transmission Control Program (TCP)
- 1974 - Systems Network Architecture (SNA)
- 1975 - ARPANET Has 57 Nodes
- 1976 - CYCLADES Computer Network
- 1976 - X.25 Packet Switching Protocol
- 1979 - First Commercial Cellular Network

1980s

- 1980 - USENET
- 1981 - ARPANET Has 213 Nodes
- 1981 - TCP/IP
- 1982 - Simple Mail Transfer Protocol (SMTP)
- 1983 - ARPANET Switches to TCP/IP
- 1983 - IPV4
- 1983 - Berkeley Sockets
- 1984 - CD-ROM
- 1984 - Domain Name System (DNS)
- 1984 - Dynamic Host Configuration Protocol (DHCP)
- 1984 - Open Systems Interconnect (OSI)
- 1985 - Whole Earth 'Lectronic Link (WELL)
- 1985 - National Science Foundation Network (NSFNET)
- 1987 - Transport Layer Interface (TLI)

1990s

- 1991 - Gopher
- 1991 - Windows Sockets API
- 1991 - Common Object...[Read More]


API Evangelist

The 3dcart Developer Home Page Is Nice and Clean

19 Dec 2019

I look through a lot of API developer portals, and when I come across interesting layouts I like to pause and highlight them, showing other API providers what is possible, while turning API Evangelist into a sort of style guide when it comes to crafting your API operations. I was asking the folks over at 3dcart if they have an OpenAPI or Postman collection for their API, to help me round out my machine readable index of the commerce API provider, and after I stumbled across their developer portal, I thought I'd share it here. I like it because in addition to the global navigation for their portal, it really gets at the primary next steps anyone will be taking off the landing page of their developer portal. You can tell it really forced them to pause and think about the narrative around what people will be looking for. Helping people understand what is possible, while also routing them down the most common paths taken when it comes to building an application on 3dcart.[Read More]


API Evangelist

Taming The Salesforce API Scope

18 Dec 2019

I was recently looking at building a prototype integration between Salesforce and Workday, where I found myself needing to on-board with the Salesforce REST API for probably the 50th time in my career. I am always looking for projects that use the API so that I can keep my skills sharp when it comes to one of the leading API platforms out there. Even with this experience, each time I on-board with the API I find myself having to work pretty hard to make sense of the Salesforce REST API, first wading through a sea of information to find the API reference documentation, setting up an OAuth application, and getting to where I am actually making my first API call. Once I am successfully making calls to the Salesforce API, I then have to further explore the surface area of the Salesforce REST API before I can fully understand all the resources that are available, and what is truly possible with my integration. After spending about an hour in the Salesforce documentation it all came back to me. I remembered how powerful and versatile the API is, but my moment of déjà vu left me feeling like it would be pretty easy to reduce the time needed to go from landing on the home page of developer.salesforce.com to making your first API call. The challenge with the Salesforce API is that it is extremely large, and possesses a number of resources that will vary based upon two important dimensions: version and your user account. The API you see with a base developer account isn’t the same one you’ll see with a well established corporate Salesforce implementation. Each individual Salesforce customer might be using a specific version, and have a completely different set of resources available to them, making it pretty challenging to properly document the API. Even with these challenges I think there are a number of ways in which the Salesforce API could be made...[Read More]


API Evangelist

APIs For Victoria Australia

17 Dec 2019

I was helping out someone trying to download air quality data in Australia today, and while I was playing around with the Victoria Australia government AirWatch data API I thought I'd go ahead and add them to my API Evangelist network by importing their Swagger 2.0 files and converting them to OpenAPI 3.0, while also publishing Postman collections for each of their APIs. Expanding the APIs I have in my directory, while also encouraging the state to publish the Postman collections I've created to the Postman API Network. The State of Victoria has some pretty interesting APIs that they have made available using Axway. I have published an APIs.json index for the state's developer portal, providing an index of their API operations, as well as the individual APIs. You can get at the Postman collections I've generated using these links. ABS Labour Force API Postman Collection Agriculture Victoria Soils API Postman Collection DataVic CKAN API Postman Collection DataVic Open Data API Postman Collection EPA AirWatch API Postman Collection Important Government Dates API Postman Collection Museums Victoria Collections API Postman Collection Popular Baby Names Victoria API Postman Collection Victorian Heritage API Postman Collection I would go ahead and publish the Postman collections to the Postman Network myself, but I have asked them to go ahead and publish them. I would rather the listings be more authoritative, and something that is owned by the API operators. I'm just looking to maintain a GitHub repository with fresh copies of their OpenAPI, Postman collections, and APIs.json that I can use as the source of truth for these APIs across API Evangelist, APIs.io, and other iPaaS and integration providers. I am working through several different business sectors and government APIs, updating my directory of APIs, while also sharing with some other API service providers I have been talking to.
If there is a particular API provider you'd like to see added to my list, go ahead and submit a pull request...[Read More]
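For anyone curious what converting Swagger 2.0 files to OpenAPI 3.0 actually involves, here is a deliberately minimal, lossy Python sketch of the main structural changes; real tooling handles far more (parameters, request bodies, security schemes), and the host and paths in the sample are invented:

```python
def swagger2_to_openapi3(swagger: dict) -> dict:
    """A minimal, lossy sketch of Swagger 2.0 to OpenAPI 3.0 conversion:
    builds the servers array from host/basePath/schemes and moves
    definitions under components.schemas."""
    scheme = (swagger.get("schemes") or ["https"])[0]
    host = swagger.get("host", "")
    base_path = swagger.get("basePath", "")
    spec = {
        "openapi": "3.0.0",
        "info": swagger.get("info", {}),
        "servers": [{"url": f"{scheme}://{host}{base_path}"}],
        "paths": swagger.get("paths", {}),
    }
    if "definitions" in swagger:
        spec["components"] = {"schemas": swagger["definitions"]}
    return spec

# Invented Swagger 2.0 fragment; the host and basePath are hypothetical
legacy = {
    "swagger": "2.0",
    "info": {"title": "EPA AirWatch API", "version": "1.0"},
    "host": "gateway.example.gov.au",
    "basePath": "/environment/air",
    "schemes": ["https"],
    "paths": {"/sites": {"get": {"responses": {"200": {"description": "OK"}}}}},
}

converted = swagger2_to_openapi3(legacy)
print(converted["servers"])
```

In practice the per-operation details (body parameters becoming `requestBody`, response schemas moving under `content`) are where most of the conversion effort goes, which is why dedicated converters exist.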