The API Evangelist Blog

API Evangelist

API Industry Guide: API Definitions

24 Sep 2019

This is a guide to the world of API definitions, introducing the machine and human readable schema and API specification formats used to define API-driven capabilities. Formats like JSON Schema, OpenAPI, AsyncAPI, and Postman Collections are changing how we define our APIs, and how we then use those definitions to engage with developers throughout the API life cycle. This API definitions guide is the result of almost 10 years of research and participation in the evolution of API definitions through community development, conferences, and the API efforts behind the specifications themselves. I've worked to distill my research on API definitions down into a short guide that business as well as technical individuals can follow, providing a single guide to the impact API definitions are making across the sector, and empowering both API providers and consumers when it comes to putting APIs to work in web, mobile, and device applications.

Table of Contents:
API Definitions - Overview of what API definitions mean to stakeholders.
Defining APIs - How API definitions are used to define each individual API.
Machine Readable - Providing what is needed for systems to understand.
Human Readable - Providing what is needed for humans to understand.
JSON Schema - A standard for describing the properties of each API object.
OpenAPI - Standard for defining API contracts for HTTP 1.1 or web APIs.
AsyncAPI - Standard for defining API contracts for Kafka, AMQP, and other APIs.
Postman - An executable format for defining and interacting with APIs.
APIs.json - An open standard for defining the operations that surround an API.
Tagging - Using tags to label and organize API definitions across API operations.
Versioning - Using semantic versioning to properly evolve API definitions used.
Extending - Understanding that you can extend API definition formats to meet needs.
Life Cycle - API definitions are used across multiple stops of a modern API life cycle.
Tooling - Tools that can...[Read More]
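To make the machine readable side of this concrete, here is a minimal sketch of a JSON Schema for a hypothetical "Note" API object. The object name and properties are my own illustration, not taken from any specific API in the guide.

```python
import json

# A minimal JSON Schema for a hypothetical "Note" API object, the kind of
# machine readable definition that formats like OpenAPI build upon.
note_schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "title": "Note",
    "type": "object",
    "properties": {
        "id": {"type": "integer", "description": "Unique identifier"},
        "title": {"type": "string", "description": "Short title of the note"},
        "body": {"type": "string", "description": "The note content"},
    },
    "required": ["id", "title"],
}

print(json.dumps(note_schema, indent=2))
```

The same object description can then be referenced from an OpenAPI or AsyncAPI contract, or used to seed example responses in a Postman Collection.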



Adding the Twilio Referral Program As An API Building Block

24 Sep 2019

I am always on the hunt for interesting new building blocks for how we can all operate our APIs, and the most mature API providers out there are usually where I find the most innovative ones. This week's interesting find comes from API pioneer Twilio, with their new referral program, providing a way for Twilio users to invite a friend to become a Twilio user. That isn't anything groundbreaking, as most SaaS and API platforms offer this, but with Twilio's referral program you get $10.00 applied to your Twilio account for each referral you make. They made the Twilio referral program pretty straightforward, which is nice because many affiliate related solutions can be pretty cumbersome. It goes something like this:

- You get a personal referral link that you can share with your networks
- Users sign up with your link, upgrade, then receive $10 to spend with Twilio
- For each person you refer who signs up for Twilio and upgrades, you get $10 in your Twilio account

As Twilio says in their announcement, “depending on the country where you’re sending… that’s over 1250 SMS messages… or 1,000 free voice minutes.... or over 12,000 chats. And that’s just for 1 referral”. As part of their announcement they even give you some template referral text, and recommendations of where you can share your referral message and code, taking advantage of common social networks like LinkedIn, Twitch, YouTube, Reddit, Twitter, Facebook, and your email signature.

Twilio provides a pretty basic example of an API referral program, a building block I will be adding to my list of the more business-focused elements we can be baking into our platforms. I could even envision this being offered by a startup or existing API service providers, allowing API providers to quickly implement it as part of their API management, portal, and other existing services and tooling. Making for dead simple, plug and play API referrals for API providers,...[Read More]



Making An API Request To Update Examples In My API Documentation And Power My API Mocks

23 Sep 2019

As I was working to improve upon a couple of the API collections I’m defining, and define what constitutes a “complete enough” or “robust enough” collection, I noticed how the process within Postman is iterative, outcome based, and driven from actual requests and responses. Meaning, a complete enough Postman collection is relative to the outcomes I’m looking for, things like documentation and mock APIs. To achieve robust API documentation and mocks I simply need to add my API request, successfully make a call to the API I’ve defined, and save the response as part of my API collection. Then, once I’m ready, all I need to do to update my API documentation and generate API mocks is hit publish for each of these desired outcomes. Making the documenting and mocking of APIs very straightforward and logical for me.

I will be stepping through my API profiling process again and writing a fresh post on how I profile the operational and API level data of the companies, organizations, institutions, and government agencies I’m covering. Having an example of each API request is valuable for more than just rounding off the definition of an API so I understand the schema being applied; I’m finding that making sure I have it in service of documentation, mocking, testing, and other essential stops along the lifecycle makes it much more meaningful and valuable. Personally, I want the request and response for each API I’m profiling to help increase the surface area of each API I have for better searchability. However, thinking about making it so anyone can find the API and more easily put it to use with more complete documentation is another high water mark that improves the quality of my profiling work, making it more valuable to my audience.

Managing my profiling of APIs using Postman, and specifically as Postman collections, is changing how I profile the API sector, as well as how I see the...[Read More]
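A sketch of what this looks like as data, assuming the Postman Collection v2.1 format: a single request with a saved example response, which is the raw material documentation and mocks are generated from. The API name, URL, and payload are hypothetical.

```python
import json

# A one-request Postman Collection (v2.1 shape) with a saved example
# response attached to the request. Documentation and mock servers are
# generated from exactly this kind of saved request/response pair.
collection = {
    "info": {
        "name": "Example Notes API",
        "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json",
    },
    "item": [
        {
            "name": "Get Notes",
            "request": {"method": "GET", "url": "https://api.example.com/notes"},
            "response": [
                {
                    "name": "Successful response",
                    "code": 200,
                    "body": json.dumps([{"id": 1, "title": "My first note"}]),
                }
            ],
        }
    ],
}

print(json.dumps(collection, indent=2))
```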



Looking For APIs By Industry

23 Sep 2019

I am working to expand the vocabulary I use to search for new APIs. I already have a pretty extensive set of keywords and phrases from mining the world of APIs over the last nine years, but I wanted to get more formal about how I find new and interesting APIs across as many industries as I possibly can. To help me in my effort I adopted the North American Industry Classification System (NAICS) as the official vocabulary I use for searching and organizing the businesses, organizations, tooling, APIs, and datasets I'm profiling as part of my research. Helping me standardize how I uncover new APIs, keeping my vocabulary in alignment with government agencies like Census and Labor, as well as industry organizations, and the companies that operate within each business sector.

After settling on NAICS, I found the NAICS API and created a Postman Collection for it. The open source API implementation doesn't have the most recent 2017 update, and the response schema didn't really work for what I was looking to build. So I got to work creating my own version of NAICS from the 2017 index, shaping the structure of the JSON response to better meet my needs. Resulting in a single JSON file that represents the entire NAICS index, while also reflecting each tier of the specification. Now that I have the JSON in a more usable format I am using it as part of my automated API search engine, but I also wanted to develop ways to visualize and manually engage with any layer of the NAICS scaffolding while searching for new APIs on the web and GitHub. To help me see the scope of the vocabulary I was implementing I published it as a series of HTML lists, allowing me to quickly search for different APIs using the Bing Cognitive Search API, or via the public Bing search page. Producing an interesting list of industries which...[Read More]
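To illustrate the tiered structure being described, here is a sketch of one branch of the NAICS index shaped as nested JSON. The field names are my own choice rather than an official format, and the branch shown is a single illustrative slice of the hierarchy.

```python
import json

# A sketch of one branch of the NAICS hierarchy as nested JSON, with each
# tier (sector, subsector, industry) reflected in the structure. Field
# names are my own choice, not an official NAICS format.
naics_branch = {
    "code": "51",
    "title": "Information",
    "children": [
        {
            "code": "518",
            "title": "Data Processing, Hosting, and Related Services",
            "children": [
                {
                    "code": "518210",
                    "title": "Data Processing, Hosting, and Related Services",
                    "children": [],
                }
            ],
        }
    ],
}

def flatten(node, depth=0):
    """Walk the tree, yielding (code, title, depth) for every tier."""
    yield (node["code"], node["title"], depth)
    for child in node["children"]:
        yield from flatten(child, depth + 1)

for code, title, depth in flatten(naics_branch):
    print("  " * depth + f"{code} {title}")
```

The `flatten` helper is what makes a single nested file usable for both search indexing and rendering flat HTML lists.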



An Open Source Sitemap API For Every Domain

23 Sep 2019

I invest a lot of resources into spidering domains. I spend probably about $50.00 to $75.00 a month in compute resources to scrape domains I have targeted as part of my API research. This is something that will increase 3x in the next year as I expand my API search and discovery efforts. While it is very valuable to use the GitHub and Bing APIs to uncover new and interesting domains, they don’t always give me very meaningful indexes of what each site contains. I prefer actually spidering the site and looking for meaningful words from my vocabulary like Swagger, OpenAPI, Documentation, etc. All the words that help me understand whether there is really an API there, or just a blog post mentioning an API, or some other ephemeral reference to an API.

As I work to refine and evolve my API search tooling I find myself wishing that website owners would own and operate their own sitemap API, providing a rich index of every public page available on their website. I should probably begin to adopt Common Crawl instead of running my own scraper and crawler; however, the overhead of setting up Common Crawl for the special types of searches I’m conducting has prevented that from ever occurring. I’ll most likely just keep doing my home grown version of harvesting domains, but I can’t help but dream and design the future I’d like to see along the way. Wouldn’t it be nice if EVERY domain had a base API for getting at the sitemap? Letting the domain owner control what is indexed and what is not, while also providing a simple, alternative, machine readable interface for people to get access to content. Hell, I’d even pay a premium to get more structured data, or direct API access to the complete index of a website. It seems like something someone could cobble together and standardize using the ELK stack, and wrap in a simple white label API...[Read More]
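As a sketch of the machine readable index I am wishing for, here is what parsing a standard sitemap.xml into a simple list of pages might look like, the shape a sitemap API could return. The domain and URLs are hypothetical.

```python
import xml.etree.ElementTree as ET

# Parse a standard sitemap.xml into the simple, machine readable index a
# sitemap API could serve. The sitemap content here is a hypothetical example.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><lastmod>2019-09-01</lastmod></url>
  <url><loc>https://example.com/api/docs</loc><lastmod>2019-09-15</lastmod></url>
</urlset>"""

def parse_sitemap(xml_text):
    """Return a list of {url, lastmod} dicts from sitemap XML."""
    root = ET.fromstring(xml_text)
    pages = []
    for url in root.findall(f"{SITEMAP_NS}url"):
        pages.append({
            "url": url.findtext(f"{SITEMAP_NS}loc"),
            "lastmod": url.findtext(f"{SITEMAP_NS}lastmod"),
        })
    return pages

for page in parse_sitemap(sitemap_xml):
    print(page["url"], page["lastmod"])
```

Serving this list as JSON behind a simple endpoint is essentially the base sitemap API described above, with the domain owner deciding what goes into the file.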



Publishing Your Early Canary (Beta) APIs In Their Own API Workspace

20 Sep 2019

Providing early access to your APIs is nothing new. It is something all API providers should do, whether they are publishing their APIs publicly or keeping them private for internal or partner use. However, I feel that the concept of API workspaces introduced by Postman provides you with an opportunity to segment off your APIs using different collections and make them available to an invite only audience. The concept of API workspaces allows you to easily create a working area dedicated to giving trusted users access to early versions of your latest APIs. Leveraging the Postman platform as a conduit for engaging with your developers, and getting feedback on the designs and functionality of your APIs before they are ever generally released to the public.

Think of Postman workspaces as a separate namespace: you are already developing your APIs there, and now you can just turn on access to collections containing all the technical details of your API, including variables and authentication in an environment, allowing beta users to make calls to each API and play around with the new functionality. All you have to do is produce one or more Postman collections for your new beta APIs, create a new workspace, and share your collections to the workspace. You can also publish API documentation for your collections if you want, or you can just begin with one or more Postman collections that provide everything developers will need to get up and running. Establishing special access to your APIs, without all the overhead of having to manage your own beta program; you just rely on the functionality that Postman already delivers.

You might even consider publishing your early API collections with a mock API if you don’t have the API fully baked yet. Moving forward the timeline for when you can get an API into your beta users’ hands, and begin getting early feedback on what is going on. This can be done with internal...[Read More]



Going Outside The API Echo Chamber With Your API Services And Tooling

20 Sep 2019

You ever feel like you are just preaching to the choir when evangelizing your API tool or service? I do. All the time! While crafting stories for the blog I am constantly burying topics that I find super interesting, but realize the “normals” I’m interested in reaching just aren’t going to care about. Of course, I don’t always focus on this audience, but I’d say I prefer a 60% normals to 40% geek focus in my storytelling. With this in mind I’m investing some more energy in getting out of the echo chamber with my storytelling, and spending more time at healthcare, financial, energy, automobile, home improvement, and other conferences catering to specific industries and verticals, rather than just hanging with my API pals at the same old events. Don’t get me wrong, I will still be showing up at some of the best API events, but I’m going to be focusing on expeditionary mission work. It is the most dangerous type of API evangelism, but in my experience it can really pay off.

When going outside of the API echo chamber to evangelize APIs you have to work hard to refine your vocabulary and keep your stories precise and compelling. One downside is that you have to tell a lot of the same stories over and over, but I guess if you aren’t that keen on creating new content, and perpetually having to reinvent yourself, this could be a positive thing. I enjoy pushing the boundaries of my storytelling within the echo chamber; it keeps my mind busy. However, reaching outside the echo chamber pushes me to refine my storytelling in ways that only speaking to “normals” can do. Also, getting the real world questions that are more closely aligned with the actual business problems we are looking to solve with APIs helps me ground my API blah blah blah in reality, and keep it out of the stratosphere above the clouds.

Being the API tooling presence at a consumer...[Read More]



Some Questions To Ask When Quantifying Your Organizational API Maturity

19 Sep 2019

The government agencies, institutions, organizations, and companies that I talk to on a regular basis always express their desire to get a handle on how to consistently deliver APIs across the API lifecycle, and ask me to help quantify their overall API maturity. They are looking for honest answers on how mature their approach is compared to other API providers, and companies that operate within the same business sector that they do. To help folks self-analyze, as well as help guide conversations I am having with them on the ground, I’ve drafted a short list of questions you can ask of your operations.

I have broken down my questions into six separate areas of the API lifecycle, which go a long way towards defining how mature a company is, asking some of the questions I commonly ask new companies I am engaging with in workshops, consulting, and now working with Postman. The goal of this is to help establish the overall maturity of an API operation, not to shame the people involved, but to better get at where we should begin investing to help move teams along in their API journey. Here is a breakdown of the API maturity questions I currently have.

Discovery - Knowing where all of your digital assets are at scale.
- Do you know where all of your APIs are?
- Do you have a way to track the schema used across APIs?
- Do you have an actively used catalog of APIs?
- Do you use Swagger, OpenAPI, RAML, API Blueprint, or Postman Collections to define your APIs?
- Do you use JSON Schema to articulate the underlying data structure for APIs?
- Do your APIs have owners or teams assigned to them?

Design - Focusing in on a design-first area of the lifecycle.
- Do you hand edit and work with Swagger, OpenAPI, RAML, API Blueprint, or Postman Collections?
- Do you auto-generate Swagger, OpenAPI, or Postman Collections?
- Do you mock your APIs at any stage of their...[Read More]
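One simple way to turn answers to questions like these into a number is a per-area tally. The scoring approach and the answers below are my own illustration, not a formal methodology from the post.

```python
# Tally yes/no answers to maturity questions into a per-area score.
# The questions are abbreviated and the answers are a hypothetical example.
answers = {
    "Discovery": {
        "Do you know where all of your APIs are?": True,
        "Do you have an actively used catalog of APIs?": False,
        "Do your APIs have owners or teams assigned to them?": True,
    },
    "Design": {
        "Do you hand edit your API definitions?": True,
        "Do you mock your APIs at any stage?": False,
    },
}

for area, questions in answers.items():
    yes = sum(questions.values())
    print(f"{area}: {yes}/{len(questions)} ({100 * yes // len(questions)}%)")
```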



Publishing My FHIR API Collection As Documentation And Making Available In The Postman Network

19 Sep 2019

I generated a Postman Collection for the Fast Healthcare Interoperability Resources (FHIR) specification the other day. Making a simple, easy to use, executable representation of any FHIR compliant API. I wanted to get intimate with the healthcare API standard so that I can better contribute to the specification and the conversation, but I also wanted to use the specification to map out what industry level API contracts can be, using Postman. Over time the Postman collection will evolve, and become more robust, containing the pieces of the puzzle that help move it from being an API definition to an API contract.

After creating the collection I have now published my FHIR API Postman Collection as documentation and to the Postman API Network. It still has to go through a review process before it is available in the Postman Network, but you can get at the documentation and Postman Collection directly. The process demonstrates how you can go from a working Postman Collection to documentation and discovery in just a couple of clicks. Then make your API collection and documentation available to developers, helping them go from discovery to making actual API calls in just a couple of clicks. For me the process demonstrates how ALL APIs should be defined, ensuring all APIs are well documented and executable by default, not something developers have to figure out for themselves.

I am still working out which of my API definitions I want to publish to the Postman API Network. I’ll be working to convert all of my API definitions to Postman Collections, and publishing API documentation, but I’ll be very selective about which ones I publish to the Postman API Network. I really want the API provider to take charge of this process. I’m happy to have all these APIs associated with my account, but it would be more authoritative to have the Postman collections, documentation, and the directory page managed by the API providers themselves. It doesn’t take much to manage...[Read More]



JSON Schema, Examples, And Postman Collections For 600 Schema.org Objects

19 Sep 2019

I wanted a ready to go supply of JSON examples from a variety of industries to use in my storytelling. No better place to begin a project like this than with Schema.org, which provides a schema for just about anything you can think of. Karate Studio API anyone? I already had some scripts set up for creating Swagger 2.0 files from Schema.org, so I got to work retrofitting my Schema.org index to produce JSON Schema representations for each of the 600 objects in the index. Providing me with a wealth of schema to choose from when I am defining new APIs, and telling stories about how APIs can make an impact across many different industries.

After generating JSON Schema for 600 Schema.org objects, I worked to generate an example JSON for each of the objects, so I can demonstrate the schema as well as what it will look like while in use. The challenge with this work is how deep you go when following the wormhole of Schema.org object properties; I opted to only do one level at the moment. I started hydrating each property that was an object, but after going 5 levels deep I realized I should probably put more thought into this before creating massive JSON objects. I’ll work on this as a separate task, tackle a handful of the objects to better understand just how deep things can go, and come up with a better strategy for handling it.

Next I wanted to take each JSON example and create a simple one request Postman Collection, so I wrote a script to generate 600 basic collections, each producing just a single GET request, with the single level JSON example as a response. I’m not going to actually import my collections into Postman until I get them a little more polished, but so far the process has provided me with a nice proof of concept for starting new APIs from Schema.org using Postman. Next...[Read More]
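The generation step can be sketched roughly like this, with a small, simplified property map standing in for the actual Schema.org index. The type mapping and example values are my own one-level illustration, not the real scripts described above.

```python
import json

# Generate a one-level JSON Schema and an example object from a simplified
# Schema.org-style property map. The property list here is a small
# hypothetical slice, not the full Schema.org definition.
schema_org_object = {
    "name": "Offer",
    "properties": {"price": "Number", "priceCurrency": "Text", "name": "Text"},
}

TYPE_MAP = {"Text": "string", "Number": "number"}
EXAMPLE_VALUES = {"string": "example", "number": 1.0}

def to_json_schema(obj):
    """Map Schema.org-style property types to JSON Schema types."""
    props = {k: {"type": TYPE_MAP.get(v, "string")} for k, v in obj["properties"].items()}
    return {"title": obj["name"], "type": "object", "properties": props}

def to_example(schema):
    """Produce a single-level example object from a JSON Schema."""
    return {k: EXAMPLE_VALUES[v["type"]] for k, v in schema["properties"].items()}

schema = to_json_schema(schema_org_object)
example = to_example(schema)
print(json.dumps(schema, indent=2))
print(json.dumps(example, indent=2))
```

Running the same two functions over all 600 objects in an index is what produces the schema-plus-example pairs described above.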



Postman Has Documentation And Executable Collection For Their Own API

18 Sep 2019

A true test of any API service provider is whether or not they have an API. It is one of the most critical tests I have for any company selling a service to API providers. If you sell services to people who provide APIs, and do not have your own API, I’m immediately skeptical about how you are playing the game. Postman passes a lot of the quality and ethical tests I have, but the fact that they eat their own dog food, and have documentation and a machine readable Postman Collection for their own API, demonstrates they get it.

I’m looking to sync my API catalog with a series of my Postman workspaces, ensuring that if an API in my catalog has an OpenAPI, it also has an up to date Postman Collection. Actually, I am going to outsource the profiling of a company’s APIs to a series of Postman workspaces, and only use my catalog for profiling the other elements of a company’s operations. The presence of a Postman Collection and a Postman Environment has long been the signal that an API in my catalog has a “complete” profile available. This is why profiling APIs with Postman is valuable; it builds the definition around the actual request and response, thus completing the circle for me.

Anyways, I thought it was pretty significant that Postman has an API, and manages it using the same services that they offer to their customers. I also thought it was significant that the completeness of Postman’s API is enabling me to move faster when it comes to profiling the completeness of other APIs I’m tracking. It really is a pretty virtuous cycle when you think about it. While I am happy to find an OpenAPI for any API I’m profiling and learning about, I’d rather find a Postman Collection, plus a Postman Environment, accompanied by a frictionless signup process to get my keys. Wouldn’t it be nice if I could just download...[Read More]
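For reference, calling Postman's own API looks something like this. The sketch assumes the api.getpostman.com base URL and X-Api-Key header from their public documentation, and only builds the request details rather than sending anything; the key is a placeholder.

```python
# Build (but do not send) the request details for Postman's own API,
# which lists the collections in your account. The key is a placeholder.
POSTMAN_API_BASE = "https://api.getpostman.com"
API_KEY = "YOUR-POSTMAN-API-KEY"

def list_collections_request():
    """Return the URL and headers for a GET /collections call."""
    return {
        "method": "GET",
        "url": f"{POSTMAN_API_BASE}/collections",
        "headers": {"X-Api-Key": API_KEY},
    }

req = list_collections_request()
print(req["method"], req["url"])
```

Sending this with any HTTP client returns a JSON list of your collections, assuming a valid key, which is what makes syncing a catalog with workspaces scriptable.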



Is This The Offer API You Were Needing?

18 Sep 2019

Going from an idea for an API to something you can actually share with another team member or stakeholder is historically something that can take hours, days, or even weeks. Even if you manage to convey your idea to another team member, you can’t always guarantee they’ll understand the API design as intended. The best way to convey your thoughts is to mock and document your API as if it was a real API, letting your intended audience actually review the documentation, make actual calls to the API, and provide feedback on each individual request, or even begin changing the design and implementation as part of their feedback.

I wanted to see how easy it was to take a new idea for an API and make it something that can be easily shared and worked with using Postman. I had recently been updating the JSON Schema for my Schema.org objects, so I quickly wrote a script to generate JSON samples from each of the standardized objects, and picked a random object to use in my demonstration: Offers. Now that I have a JSON example I can get to work generating a new mock API using my JSON snippet, creating a single API GET request, with the JSON as the request response. First I create a new mock server using the orange new button. Then I create a single GET endpoint for my API, and paste the JSON for my Schema.org Offers snippet into the response body for my new API endpoint. Then I just follow the rest of the wizard, giving my mock API a name, and leaving most other settings at their defaults before moving on to actually being able to use my mock API.

Now I have an API collection with a mock server that I can make calls to and see my JSON returned as part of the API response, making my Offers API come to life. I can now share this API...[Read More]
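For reference, here is the kind of JSON snippet described above: a minimal Schema.org Offer example you could paste into a mock response body. The specific product and price values are made up.

```python
import json

# A minimal Schema.org Offer example, suitable for pasting into a mock
# API's response body. The name and price values are hypothetical.
offer = {
    "@context": "https://schema.org",
    "@type": "Offer",
    "name": "Example Widget",
    "price": "29.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock",
}

print(json.dumps(offer, indent=2))
```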



Creating A Postman Collection For The Fast Healthcare Interoperability Resources (FHIR) Specification

18 Sep 2019

I have been working on healthcare APIs in government for over five years now, providing feedback regularly to the Blue Button API effort over at Health and Human Services (HHS), including having them speak at APIStrat last year in Nashville, and speaking at the White House as part of the Blue Button 2.0 summit. I’m always looking for more ways that I can contribute to the healthcare API ecosystem, especially when it comes to investing in the Fast Healthcare Interoperability Resources (FHIR) specification. Now that I have the bandwidth to invest in healthcare APIs again, I wanted to get my mind thinking about the potential of the FHIR specification, and find ways to help move the conversation forward.

To jumpstart my work on the Fast Healthcare Interoperability Resources (FHIR) specification I generated a Postman Collection from the FHIR test server, reverse engineering the specification and creating my first draft of a collection to center my effort around. The Postman Collection covers the following resource types: Account ActivityDefinition AdverseEvent AllergyIntolerance Appointment AppointmentResponse AuditEvent Basic Binary BiologicallyDerivedProduct BodyStructure Bundle CapabilityStatement CarePlan CareTeam CatalogEntry ChargeItem ChargeItemDefinition Claim ClaimResponse ClinicalImpression CodeSystem Communication CommunicationRequest CompartmentDefinition Composition ConceptMap Condition Consent Contract Coverage CoverageEligibilityRequest CoverageEligibilityResponse DetectedIssue Device DeviceDefinition DeviceMetric DeviceRequest DeviceUseStatement DiagnosticReport DocumentManifest DocumentReference EffectEvidenceSynthesis Encounter Endpoint EnrollmentRequest EnrollmentResponse EpisodeOfCare EventDefinition Evidence EvidenceVariable ExampleScenario ExplanationOfBenefit FamilyMemberHistory Flag Goal GraphDefinition Group GuidanceResponse HealthcareService ImagingStudy Immunization ImmunizationEvaluation ImmunizationRecommendation ImplementationGuide 
InsurancePlan Invoice Library Linkage List Location Measure MeasureReport Media Medication MedicationAdministration MedicationDispense MedicationKnowledge MedicationRequest MedicationStatement MedicinalProduct MedicinalProductAuthorization MedicinalProductContraindication MedicinalProductIndication MedicinalProductIngredient MedicinalProductInteraction MedicinalProductManufactured MedicinalProductPackaged MedicinalProductPharmaceutical MedicinalProductUndesirableEffect MessageDefinition MessageHeader MolecularSequence NamingSystem NutritionOrder Observation ObservationDefinition OperationDefinition OperationOutcome Organization OrganizationAffiliation Parameters Patient PaymentNotice PaymentReconciliation Person PlanDefinition Practitioner PractitionerRole Procedure Provenance Questionnaire QuestionnaireResponse RelatedPerson RequestGroup ResearchDefinition ResearchElementDefinition ResearchStudy ResearchSubject RiskAssessment RiskEvidenceSynthesis Schedule SearchParameter ServiceRequest Slot Specimen SpecimenDefinition StructureDefinition StructureMap Subscription Substance SubstanceSpecification SupplyDelivery SupplyRequest Task TerminologyCapabilities TestReport TestScript ValueSet VerificationResult VisionPrescription The FHIR API specification collection points at the FHIR test server by default, and some of the paths need parameters...[Read More]
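To give a feel for what the requests in such a collection look like, here is a sketch of building a couple of standard FHIR REST URLs. The base URL is a placeholder for whichever FHIR server you point the collection at, and the search parameter values are hypothetical.

```python
from urllib.parse import urlencode

# Build standard FHIR REST URLs for a couple of resource types. The base
# URL is a placeholder; point it at your own or a public FHIR test server.
FHIR_BASE = "https://fhir.example.com/baseR4"

def fhir_search(resource_type, **params):
    """Return a FHIR search URL like [base]/Patient?name=smith."""
    query = urlencode(params)
    return f"{FHIR_BASE}/{resource_type}" + (f"?{query}" if query else "")

def fhir_read(resource_type, resource_id):
    """Return a FHIR read URL like [base]/Patient/123."""
    return f"{FHIR_BASE}/{resource_type}/{resource_id}"

print(fhir_search("Patient", name="smith"))
print(fhir_read("Observation", "example"))
```

Every resource type in the list above follows this same `[base]/[type]` pattern, which is what makes generating a collection from the specification tractable.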



Business Users Do Not Search For API Solutions, They Just Search For Solutions

18 Sep 2019

As I craft stories for my blog I am always working to reach as wide an audience as I can. It is one of the reasons I write so many stories, because the process helps me refine how I say things, and the words that I use. The process is a double-edged sword, because I want to reach my more technical audience by using very precise and meaningful terms, but I also want to reach a more business focused audience by using other, more general terms. This is a tough balance to find in any single post, so I tend to mix things up across posts, going down the technical rabbit hole on some, while keeping things high level for my business users on others. Keeping the site reaching as many folks as possible.

Bridging what developers are thinking with what business users are thinking has been one of the sustained missions of API Evangelist since 2010. I’m not always successful in realizing this mission, but I work as hard as I possibly can to reach business users whenever possible. When I write my posts I try to think about what a business user will be Googling for, and I realize that this will rarely ever contain the phrase application programming interface, or the acronym API. Business users are going to be searching for terms and phrases that match the problems they face, and the solutions they need to make their lives easier. Oftentimes they are unaware that they are looking for an API, a connector, or an integration between platforms using APIs, rendering many of the technical concepts us API aware folks depend on useless when actually reaching outside the API echo chamber. To help me push my word-smithing outside of the realm of API, I wanted to explore some terms and phrases that might resonate with business users when they are googling:

Integrations - The...[Read More]



Being The Source Of API Truth At Your Organization

18 Sep 2019

When doing web services and API inventory at enterprise organizations I always come across one or two individuals or groups who are the keepers of the APIs, schema, and related knowledge and truth within the organization. It is the new version of the database keepers within large organizations, but instead of being just about data, it is about access to all types of resources, including data, content, algorithms, network, and other infrastructure elements. Being the API knowledgebase, directory, and source of truth within an organization takes a lot of work, but it is something that will pay off down the road when it comes to actually getting things done within an organization.

I have had a database of APIs since 2011, but it has mostly been for my own discovery. Historically I have published some of these to GitHub, and I’m working on different ways of making them available via search, but now that I’m working at Postman, I am considering making them available in different ways via Postman workspaces and collections. Ideally, API providers would publish their own API definitions to the Postman API Network, so I could just rely on API provider verified collections, but until that day I'll keep working on crafting them myself. Even once I have a complete API collection, I still find that my way of organizing a collection, and pre-populating a Postman environment with certain variables, often goes far beyond what most API providers will offer up in their own collections.

I’m looking for some of my hand-crafted Postman Collections and templates to become the source of truth for some of the APIs I'm telling stories about. I don’t just want to be the person people come to for interesting APIs with complete API definitions, I want them to come to me because I am pre-populating them with meaningful values, naming them using more meaningful labels, and making them available via accessible workspaces. Helping...[Read More]
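As a sketch, pre-populating a Postman environment amounts to filling in a simple set of key value pairs, shown here in the exported JSON shape of name plus values. The variable names and values are hypothetical.

```python
import json

# A pre-populated Postman environment, sketched in the exported JSON
# shape of {name, values}. Variable names and values are hypothetical.
environment = {
    "name": "Example API - Sandbox",
    "values": [
        {"key": "baseUrl", "value": "https://api.example.com/v1", "enabled": True},
        {"key": "apiKey", "value": "YOUR-API-KEY", "enabled": True},
    ],
}

print(json.dumps(environment, indent=2))
```

Shipping a collection together with an environment like this is what lets a consumer make their first call without hunting for base URLs and key placement.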



What Are You Doing With Postman?

17 Sep 2019

I am now immersed in all things Postman. After a week at Postman, attending POSTCON, and listening to how API developers are putting the API development environment to work, I have realized what a Swiss army knife the platform is. I get the core functionality of the application very well, and have used it for several years. However, I only lightly understood the mocking, testing, monitors, runners, documentation, and other key features. I was also introduced to how pre and post request scripts reduce friction, and to the versatility, portability, and shareability of Postman Collections. I’m beyond ecstatic to be studying and evangelizing all things Postman, and to get to spend full time thinking deeply about how it can be applied across the API lifecycle.

My days will be filled with playing around with Postman, defining, mocking, testing, monitoring, and documenting APIs, and moving them forward, then telling the story of what I am doing here on API Evangelist, on the Postman Blog, and as part of other conversations I’m engaged in. While this will keep me very busy, I’d like to hear what you are doing with Postman. I want it all. I want your mundane stories about how you use it just to make simple API requests. I want to hear the stories about why you aren’t using some of the other features like mocking and testing. I also want to hear all about the amazing lifecycle workflows you’ve cobbled together using Postman. After talking with companies at POSTCON I’ve come to realize how much Postman enables many of the common API lifecycle workflows we hear about, but also that there is a long tail of workflows cobbled together to meet the specific needs of teams on the ground, lighting up my imagination when it comes to what is possible with Postman.

I am happy to showcase what you are working on here on the blog, as well as anonymize your story; it is something I do all the time. It’s up...[Read More]


API Evangelist

It Is Hard To Not Just Get To Work Coding My API

17 Sep 2019

I need a couple of generic APIs for some storytelling and workshop materials. Just some basic example of web API in action managing some common everyday resources like notes, products, and company information. As a developer, I have to admit that it is hard to not just get to work sling’n code to bring an API to life. I’m a full blown API-first believer, and I still struggle with wanting to roll up my sleeves and being code first. As a developer, it is what we do, and it isn’t going to be easy to change behavior, but I find that I need to stop being my own worst enemy, and stop ignoring the benefits of being API-first and invest what in what I need to practice what I preach. My APIs always start with a schema, and in this case I’m going to use Schema.org, keeping my APIs as standardized as possible. Now that I”m evangelizing Postman I’m going to to start all my APIs within the Postman API development environment, starting with a single GET request that returns the seed schema I’m looking to deliver. Then I’ll add a POST, PUT, and DELETE for my API, rounding off the CRUD basics, and fire up a mock using Postman, giving me my API-first capability I need. Then I can make requests to my API, and make available as part of my stories and tutorial, demonstrating how a basic API works. Now that I’m fleshing this idea out, I’m thinking I will also make a how to deploy an API tutorial using the same API. Originally I was just going to use it to show to consume an API, but since I”m going with an API-first approach with my API, there is an API-first lesson in there as well.I think one of the things to get me over my hurdle will be the tooling. To be empowered to go from Schema.org schema to a mock...[Read More]
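The schema-first starting point described above might look something like this. The seed schema is loosely inspired by Schema.org's Product type, but the specific fields shown here are illustrative assumptions, not Schema.org's full definition.

```python
import json

# A rough sketch of a seed schema for a simple product resource
# (loosely inspired by Schema.org's Product type; the fields are
# illustrative, not Schema.org's actual definition).
seed_schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "title": "Product",
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "description": {"type": "string"},
        "sku": {"type": "string"},
    },
    "required": ["name"],
}

# The single seed GET request would return an example instance that
# conforms to the schema, which a mock server can then serve up.
seed_response = {"name": "Example Product", "description": "A sample.", "sku": "SKU-001"}

# Naive required-field check (a real setup would use a JSON Schema validator).
missing = [field for field in seed_schema["required"] if field not in seed_response]
print("valid" if not missing else "missing: " + ", ".join(missing))
```

From here, POST, PUT, and DELETE requests against the same schema round out the CRUD basics before any code is written.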


API Evangelist

I Will Be At The API Specifications Conference In Vancouver Next Month

17 Sep 2019

I’m happy to have the time in my new role to make it up to the API Specifications Conference next month. It really is the most important API conference out there, not just because it used to be my baby APIStrat, but because it is about the bedrock of the API sector—API definitions. As the API sector continues to go mainstream it is becoming increasingly important that we continue to standardize how we deliver APIs, and develop a common vocabulary for everything API. The discussions we will be having at the API Specifications Conference will impact every stop along the API lifecycle, making it the most important API event out there in my opinion. Looking through the ASC program schedule I am seeing coverage of the entire API toolbox, covering REST, Hypermedia, GraphQL, gRPC, and event-driven APIs, with representation for JSON Schema, JSON-LD, OData, OpenAPI, RAML, API Blueprint, and AsyncAPI. Which for me represents the future of API design and development, preparing us all for the realities we will be facing on the ground across the organizations we work at each day. While REST is the cornerstone of everything API, I support everyone being well versed in a diverse API toolbox, as well as there being multiple API specification formats to rule them all—again, making the API Specifications Conference the most important API event out there in my opinion. I will be heading to Vancouver for ASC in October. No plans to speak yet, but Abhijit Kane from Postman will be speaking on API schema formats and his learnings using them at Postman. If nothing else, I will just be roaming the halls panhandling and maybe busking a little bit in the corner. The lineup is pretty amazing—I’m looking forward to actually plotting out which talks I will be attending, and trying to take in sessions from across the spectrum. I’m looking to learn more about what the future holds for JSON Schema, AsyncAPI, as well as...[Read More]


API Evangelist

API-First Development

17 Sep 2019

This is part of my on-boarding work as Chief Evangelist with Postman. Part of my work to understand how Postman users are putting the platform to work is to process each one of their use cases in a narrative format. The process helps these things stick in my mind, and since they are all driven by actual feedback and stories from their users, it populates my brain with relevant talking points when I'm engaging with customers. API-first is a philosophy and way of life that involves beginning every software capability as an API before you ever begin developing any user interface, or any other element. API-first ensures that all organizational capabilities are defined as simple, intuitive, well-designed API interfaces that can be used and reused across many different applications and systems. Right now, API-first is more of a myth than it is reality in most companies, but regardless, API-first is something many companies strive for when it comes to the development of their software. Postman has a dedicated API-first development use case which provides a simple set of steps to consider when it comes to establishing an API-first development process—breaking things down into four separate phases.

1. Specification Phase - Getting to work defining what each API capability will actually do.
- Create a new API - Begin defining a new API from scratch, not working from an existing pattern.
- Write or import a new specification - Working on an API from an existing API specification.
2. Development Phase - The actual development of an API that is being specified.
- Create a mock server - Turn the specification into a mock representation of the API.
- Create documentation - Publish documentation for your API using the API specification.
- Debug API - Track, coordinate, and fix bugs and defects in the API being developed.
3. Testing Phase - Ensuring you have the surface area of your API covered by API tests.
- Explore the API - With an API mocked and being...[Read More]
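A minimal machine readable contract of the kind the specification phase produces could look like this. The notes API, its paths, and its responses are illustrative assumptions for the sake of the sketch, not an actual Postman artifact.

```python
import json

# A minimal OpenAPI 3.0 contract of the kind a specification phase
# produces. The notes API, paths, and responses are illustrative
# assumptions.
spec = {
    "openapi": "3.0.0",
    "info": {"title": "Notes API", "version": "1.0.0"},
    "paths": {
        "/notes": {
            "get": {"summary": "List notes", "responses": {"200": {"description": "OK"}}},
            "post": {"summary": "Create a note", "responses": {"201": {"description": "Created"}}},
        },
    },
}
print(json.dumps(spec, indent=2))
```

The development phase then turns a contract like this into a mock server and documentation before any backend code exists.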


API Evangelist

The Stops Along The API Lifecycle That Postman Services

15 Sep 2019

I’m currently processing the common Postman use cases, and overlaying what the platform offers in the context of what I’ve historically called my API lifecycle research. I plan on bringing my vision more into alignment with Postman’s approach because I want to be telling consistent API lifecycle stories across API Evangelist and Postman, but also have things be in sync when I’m out speaking in public. I spent some time going through the Postman website and application and documented all of the stops along the API lifecycle they service, and then I took that and mapped it to the vocabulary I’ve been using to describe these areas as part of my own API lifecycle research. Providing me with a bulleted list of stops, not in any particular order, but demonstrating for me the scope of the impact Postman is making right now.

Definitions - Postman imports RAML, WADL, Swagger, and OpenAPI, while also investing in their own API definition format called Postman Collections. (Postman) (API Evangelist)
Versioning - Via collections, Postman provides you with tooling for versioning your APIs. (Postman) (API Evangelist)
Collaboration - The killer feature of Postman is all about collaborating with your team around well defined API collections. (Postman)
Environment - You can define different environments with variables for use across different collections by teams. (Postman)
Examples - You can save API request responses as examples and publish them as part of API documentation. (Postman)
Search - Pro and enterprise plans let you search across collections, folders, requests, and responses. (Postman) (API Evangelist)
Sync - You can sync your Postman activity across accounts, devices, and Git repositories. (Postman)
Sharing - You can share collections using Postman in several ways, focusing on wide collaboration through the API lifecycle. (Postman)
Workspaces - You can define different workspaces where you collaborate with teams to define API activity across platforms. (Postman)
Marketplace - You can publish your APIs to the Postman API Network, providing a marketplace of interesting APIs. (Postman)...[Read More]


API Evangelist

She Asked “What Now” After Seeing Me Put Together My NASA API Collection

15 Sep 2019

I thoroughly enjoy engaging with my wife when it comes to APIs. She has been along for the entire API Evangelist ride, and she has absorbed more about APIs from listening to me talk about APIs than almost anyone else in the industry. While she tunes me out most of the time, she does listen and engage regularly, providing me with a pretty valuable sounding board to bounce ideas and concepts off of from time to time. I’m cautious about how often I tap this resource, but when the time is right I will introduce her to an API concept I’m working with to see what she has to say. Yesterday was one of these days, and I took advantage of an opportunity to show her a Postman Collection I had made of several NASA APIs. My wife isn’t known for beating around the bush, just read her blog Hack Education for a sampling, and after walking her through what Postman does when you have a bunch of APIs defined as collections, complete with an environment and API keys, her response was “cool, but what now”? Providing me with the regular brutal honesty I look for in my wife, but also what I like to hear when it comes to my API blah blah blah. My motivation behind sharing this Postman Collection with her was all about testing the waters for building API collections and potentially sharing them with non-developers. All of the NASA APIs are read only, and while they provide a wealth of rich data as part of each API response, the resulting value of these API collections is all about the JSON responses, which you can save in Postman, but then as she states, “what now”? I am going to work on some other API collections that allow you to add or update data to demonstrate the wider API potential, but her question pushed me to think about how I can also work to better...[Read More]


API Evangelist

I Preemptively Apologize For The Flood Of Postman Storytelling You Are About To Endure

15 Sep 2019

As some of you may have heard, I joined Postman as their Chief Evangelist last week. What does this mean? It means I will be telling a LOT of stories about my journey with the Postman team. Abhinav and team understand who I am and what I do, so not a whole lot will change, but like many phases of the last decade, my stories will be heavily focused on what I’m doing with Postman. If you know my style, you know it won’t be the usual product marketing storytelling you will find on an API service provider blog, it will be relevant stories about what I’m doing and seeing as I do my research. Postman gets the value of me doing my research, and the importance of it remaining generic enough that you can apply it outside the Postman ecosystem, but because Postman is such a versatile Swiss army knife of an API development environment (ADE), much of my storytelling will involve Postman. My goal is to always keep the stories I tell interesting, valuable, and relevant enough that people want to read them, and hopefully I do not lose my soul within the storytelling process. With that said, I am beyond excited to be diving into the Postman wormhole. If you’ve followed my storytelling over the years, you know I can crank out the content when I’m researching any topic, and when it comes to the core API lifecycle Postman is looking to dominate, I’m confident I will be fully engaged. When I am fully engaged with a topic I care about, I can easily crank things up to OCD levels, and reach real-time levels of narration here on the blog regarding whatever I am working on in the moment. Every idea in my head gets written down, and the "best" of it gets fleshed out here on the blog, providing me with dizzying levels of momentum when it comes to moving ideas forward,...[Read More]


API Evangelist

Giving A Postman Collection To Your Sales Team

15 Sep 2019

EasyPost spoke at the Postman User Conference (POSTCON) last week, and while they shared a number of very interesting stories, one that really stuck with me was about how they create Postman Collections for use by their non-technical sales teams. They put some of the common API driven tasks that a sales person would need to execute in the course of their daily work into each collection, and provided environments so that they didn’t have to mess around with authentication. Making for a pretty compelling tale of non-developers putting APIs to work, which is kind of the holy grail of API consumption—empowering average business users to realize the potential of APIs. I am going to be creating several proofs of concept to see if I can create Postman Collections that are valuable enough, and simple enough, to be executed by a non-technical user. The Postman interface can be a little busy at times because it is so feature rich, but I don’t think it is out of the question for a business user to be able to get in there and execute on some pre-configured templates and collections. We’ll see. I think it depends on the quality of the collection, and the fearlessness of the would-be user. I think with the right amount of attention while crafting, naming, and organizing the API collection, as well as a dedicated effort to abstract away the complexity of any of the underlying APIs, making it one click to get some value from some common, or even specialized APIs, we can make APIs much more accessible to a business audience. Postman provides a unique opportunity to make existing APIs more commonplace--reaching beyond just developers. I think we just have to work harder at defining collections, environments, and the scripts that help automate potential API engagements. We’ll have to make sure we select useful API paths, pre-populate them with relevant values, and properly name and organize useful API capabilities, and then I...[Read More]


API Evangelist

Creating A NASA API Postman Collection And Environment

15 Sep 2019

This is a story derived from work to help develop a Postman Collection that could be used by the International Space Apps Challenge, establishing a machine readable definition for all the NASA APIs available at https://api.nasa.gov/ that can be used by participants in the challenge, helping developers quickly get up to speed with the valuable APIs NASA provides. The International Space Apps Challenge is in need of sponsors and participants to help make the event a success--please visit http://www.spaceappschallenge.org, or on Twitter at @SpaceApps to get involved. I met Katelyn Hertel (@Katers_Potaters) from the International Space Apps Challenge at POSTCON, and we got to talking about the interesting APIs NASA has, and how we can use Postman to make things easier for the developers who are participating in the global challenge. I immediately knew what was needed, and as soon as I got home this weekend I got to work crafting a Postman Collection for all of the NASA APIs. It took me about four hours to apply for a key, and get up and running with each of the NASA APIs using Postman. As I was playing around with each of the individual API resources, which come from across many different groups, I took the liberty to better organize them, and came up with the following sub-folders for the NASA API Postman Collection.

Asteroids Near Earth Objects - Information on near-earth asteroids.
Astronomy Picture of the Day (APOD) - A single astronomy picture each day.
CelesTrak Two-line Element (TLE) - Orbital elements of an Earth-orbiting object.
Earth Observatory Natural Event Tracker (EONET) - Curated source of continuously updated natural event metadata.
Earth Polychromatic Imaging Camera (EPIC) - Daily imagery collected by DSCOVR's Earth Polychromatic Imaging Camera (EPIC) instrument.
Exoplanet Archive - Access to NASA's Exoplanet Archive database.
GeneLab - Bioinformatics and biotechnology information search.
Image and Video Library - Access to NASA's image and video library.
Kepler Objects of Interest (KOI) - Stars...[Read More]
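As a small taste of what a request from a collection like this looks like, here is a sketch of a call to the Astronomy Picture of the Day (APOD) API. DEMO_KEY is the public demo key documented at https://api.nasa.gov/; in a Postman Collection the key and base URL would live in environment variables, and the date shown is just an example.

```python
from urllib.parse import urlencode

# Constructing a request to NASA's Astronomy Picture of the Day (APOD)
# API. DEMO_KEY is NASA's documented public demo key; in Postman the
# key would be an environment variable instead of a literal value.
base_url = "https://api.nasa.gov/planetary/apod"
params = {"api_key": "DEMO_KEY", "date": "2019-09-15"}
request_url = base_url + "?" + urlencode(params)
print(request_url)
```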


API Evangelist

API Workspace and Environment Management

15 Sep 2019

I have hundreds of collections within my Postman account, but I’ve never done much collaboration with others in there until now. I find myself getting more organized with the Postman collections for API Evangelist, because I’m doing more coordination and sharing with the collections I maintain, but I’m also preparing to ramp up as Chief Evangelist for Postman. The process has highlighted for me the importance of API workspaces and environments alongside our API definitions. These are two killer features of Postman which I think are concepts that should be ubiquitous across the API sector, and of course this is a conversation in which Postman will continue to play an increasingly important role. Establishing workspaces for different projects, and being more thoughtful about how I craft the environments along with the Postman collections I’m defining for a variety of APIs, has once again shifted how I see the API landscape. Workspaces enable the logical grouping, organization, access, and collaboration of API definitions. You shouldn’t just have a big blob of API definitions you use for a variety of reasons. You should be managing API definitions that are maintained and shared across a variety of workspaces, giving access to relevant stakeholders. My personal workspace in Postman is littered with hundreds of collections because it has only been me in there. In this reality I tend to be my own worst enemy and not have any coherent organization of collections in mind, just relying on the quick search to pull up what I need. However, I’m beginning to work with my wife and step-son on a variety of API definitions to help him get in on the “family business”, and now I’m needing to be more organized with my approach. I can see small, medium, and large organizations needing someone dedicated to helping manage API workspaces, and the environments in use across teams, for both internal and external APIs. Thinking more critically around which APIs teams need access to while...[Read More]


API Evangelist

I Am Joining Postman As Their Chief Evangelist

12 Sep 2019

I am determined to continue taking my career to the next level. I’ve done well doing API Evangelist over the years, but I feel like I’ve gone as far as I can all by my lonesome. To level things up I feel I need more than just my words. I need some meaningful tooling, and collaboration with people who are building interesting and meaningful things with APIs, to get to the place I envision in my mind's eye. There are just a handful of startups out there who have captured my API imagination, are making a meaningful impact on the API sector, and share my ethical view of the API sector, and one of them is Postman. I’ve supported Postman since they emerged on the scene, and as of today, I am joining their team as their Chief Evangelist. Moving forward I will keep telling stories here on the blog, and doing what I do as the API Evangelist, but I will be actively supporting the Postman team when it comes to developer relations, marketing, product, and business development. I will be merging my visions of the API lifecycle with Postman's vision of how their API Development Environment (ADE) helps developers realize many of the stops along the API lifecycle that I talk about, while also working with enterprise groups, startups, and government agencies to develop meaningful API blueprints that can help us all on our API journey. This will be just one of many tasks on my plate as I get my bearings within the Postman community, get to know the team, and become more acquainted with the product roadmap. After a six month break, I will now be showing up at more events, cranking up my storytelling on the API Evangelist and Postman blogs, and working to craft other meaningful stories in a variety of formats, including guides, blueprints, videos, and other variations—helping us all better understand the API lifecycle. I am really eager to...[Read More]


API Evangelist

AsyncAPI Version 2.0 Is Ready For Use

11 Sep 2019

The latest major version of the event-driven API specification format AsyncAPI is ready for production. The AsyncAPI community has been working hard in recent months to hammer out the next generation specification, helping us all establish a common definition for our event-driven APIs that we can use across the API lifecycle. AsyncAPI is a sister specification to OpenAPI (fka Swagger), and while you can describe HTTP APIs using the format, it excels at helping us define our event-driven API infrastructure, bringing all the same documentation, mocking, testing, and other critical stops along the API lifecycle we will need to be successful. I’m going to be playing around with a couple of implementations using AsyncAPI 2.0, helping me define a prototype Kafka API, as well as NATS APIs, but I wanted to take some time to share the list of new objects in the latest version of AsyncAPI. If you are familiar with version 1.x of AsyncAPI, some of these schema objects will look familiar, but they’ve added some interesting objects to help us describe the surface area of our event-driven infrastructure.

AsyncAPI Object - This is the root document object for the API specification. It combines resource listing and API declaration together into one document.
AsyncAPI Version String - The version string signifies the version of the AsyncAPI Specification that the document complies with.
Info Object - The object provides metadata about the API. The metadata can be used by the clients if needed.
Contact Object - Contact information for the exposed API.
License Object - License information for the exposed API.
Servers Object - An object representing a server.
Channels Object - Holds the relative paths to the individual channels and their operations. Channel paths are relative to servers.
Channel Item Object - Describes the operations available on a single channel.
Operation Object - Describes a publish or a subscribe operation. This provides a place to document how and why messages...[Read More]
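To make the list above concrete, here is a minimal AsyncAPI 2.0 document exercising several of those objects: the root AsyncAPI Object, the version string, the Info Object, the Channels Object, and an Operation Object. The channel name and message payload are hypothetical.

```python
import json

# A minimal AsyncAPI 2.0 document. The "order/created" channel and its
# payload schema are hypothetical examples, not from any real API.
asyncapi_doc = {
    "asyncapi": "2.0.0",
    "info": {"title": "Order Events", "version": "1.0.0"},
    "channels": {
        "order/created": {
            "subscribe": {
                "message": {
                    "payload": {
                        "type": "object",
                        "properties": {"orderId": {"type": "string"}},
                    },
                },
            },
        },
    },
}
print(json.dumps(asyncapi_doc, indent=2))
```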


API Evangelist

API Management For An Event-Driven API Landscape

11 Sep 2019

I’ve talked recently about a second coming of API management, which in my opinion is a wave that has several fronts. One of those fronts I see rolling out is in the area of event-driven API infrastructure. API management for request and response API infrastructure is pretty well defined, but the same concepts applied to event-driven infrastructure represent a pretty significant opportunity within this second evolution. Something that will span HTTP 1.1, HTTP/2, and TCP API implementations, providing the essential API management building blocks we might take for granted with HTTP 1.1 implementations. As API providers expand their API toolbox to possess a variety of implementation patterns, they are beginning to look for consistent management capabilities across all their APIs. Over the last year I have had several questions from readers regarding who they should be talking to when it comes to API management for their Kafka, NATS, and other event-driven API infrastructure, including something as simple as better management for Webhooks. To help me better tune into the existing API management solutions out there that might help out in this area, as well as brainstorm what the next generation of event-driven API management solutions might be, I wanted to explore this topic a little more here on the blog. Beginning with restating some of the essential building blocks of API management:

Portal - A single URL to find out everything about an API, and get up and running working with the resources that are available.
On-Boarding - Think about how you get a new developer from landing on the home page of the portal to making their first API call, and then to an application in production.
Accounts - Allowing API consumers to sign up for an account, either for individual or business access to API resources.
Applications - Enable each account holder to register one or many applications which will be putting API resources to use.
Authentication - Providing one, or multiple...[Read More]


API Evangelist

The API Reality In Our Heads Versus The Reality On The Ground

10 Sep 2019

I am spending some time grounding my views of the API landscape. Working my way through all of my beliefs, and systematically blowing them to bits to see how they hold up against the stress of reality on the ground. This is something I’ve become very good at when it comes to my personal beliefs in recent years, and something I’ve been working to transfer to my professional world to help me keep a grip on what is going on. There are a number of reasons why I fall prey to things that are not real in this game, and I’m pretty aware of the shady things that occur in the business world, but when it comes to technology I find the stories it whispers in my ear prove to be particularly enchanting, and they seem to go from whisper to truth at a velocity I don’t always understand. One of the things I need to develop is a way of better evaluating, in the moment, the velocity at which things will happen. How fast adoption of APIs will occur within the mainstream. How quickly a company will adopt an API-first approach. And the time it will take a new tool to go from creation to adoption. Technology has this way of convincing me that everything is moving faster than ever before, and it is something that ends up as a residue on everything I touch, relative to how deeply I believe in this myth. As I approach a decade of doing this, I can say that API adoption and awareness has never played out on a timeline anywhere close to what I envisioned in the early days. At this point I’d say that most things have moved at about 6X the timeline I had imagined. Sure, there are exceptions, but when it comes to the normal pace of change, especially within the enterprise, it has taken about 6 times as long for things to take...[Read More]


API Evangelist

Discovering The Confluent Schema Registry

10 Sep 2019

While spending time doing some research into schema management tooling I came across the Confluent Schema Registry. The schema management solution is one of the first formal tools I’ve come across that is specifically designed for helping folks get their schema house in order when it comes to APIs. I’m sure there are others out there, but this was the first solution I've documented that addresses this in an API context as well as having an API, providing some of the critical features needed to make sense of the crazy schema mess enterprise organizations find themselves in. Here is the language from the Confluent website describing what the registry is all about: Confluent Schema Registry provides a RESTful interface for developers to define standard schemas for their events, share them across the organization and safely evolve them in a way that is backward compatible and future proof. The Confluent Schema Registry allows you to centralize your schema, provides a REST API to integrate, save, and retrieve schemas, and delivers functionality for automatically converting JSON messages to make your data human friendly. Providing a pretty fundamental schema management solution that other API service providers should be thinking about. Clearly this one is for use with your Kafka infrastructure, but the model applies across any API you are deploying, whether it is HTTP, TCP, MQTT, or otherwise—Confluent just provides us with one compelling model to follow. Now that I have schema registries added to my monitoring system vocabulary, it will be notifying me of news, blogs, and other signals when it comes to how API providers are managing their schema, as well as any other API service providers like Confluent who are investing in this area of the API lifecycle. It is an area that I’ve been beating the drum about for a while now, and something I’d like to see more investment in. If companies are not able to get their schema house in order, they...[Read More]
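To give a feel for the registry's REST interface, here is a sketch of what registering a schema looks like: the schema is serialized as an escaped JSON string and POSTed to a subjects endpoint. This is based on my reading of Confluent's documented `/subjects/{subject}/versions` pattern; the host, subject name, and Avro record are hypothetical, and the request is only constructed here, not sent.

```python
import json

# Sketch of registering an Avro schema with a schema registry over
# REST. Host, subject, and record are hypothetical; nothing is sent.
registry_url = "http://localhost:8081"
subject = "orders-value"
avro_schema = {
    "type": "record",
    "name": "Order",
    "fields": [{"name": "orderId", "type": "string"}],
}

# The schema travels as an escaped JSON string inside a JSON envelope.
endpoint = registry_url + "/subjects/" + subject + "/versions"
payload = json.dumps({"schema": json.dumps(avro_schema)})
print(endpoint)
print(payload)
```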


API Evangelist

Continue Pushing The API Documentation Conversation Forward

09 Sep 2019

I am finally seeing the investment across the API sector I wanted to see when it comes to API documentation. There are multiple API definition driven API documentation offerings available on the market now. Both open source and high quality commercial services. A couple of years after Swagger UI made its splash I began lobbying for more investment in open source API documentation tooling, and after four years I’m starting to see it happen. However, let’s not rest on our laurels; let's make sure we keep investing and building on the momentum that we have established, and continue making API documentation more valuable to developers, but also to business users who are interested in putting API resources to work. One of the major improvements in API documentation that I would like to see in coming years centers around visualizations. I’d like to see interactive documentation be augmented and extended using D3.js, and other visualization components, rendering API responses in a more visually pleasing way. Helping make API responses more meaningful to developers, and potentially to business users who are trying to understand the value an API delivers. Visualizations have the potential to make API documentation something that introduces and educates developers on how to integrate with an API, but also demonstrates and illustrates how the data, content, and other resources can be valuable. Bonus points for any tooling provider if the visual results are actually embeddable and shareable across websites and social media, allowing anyone to take the results of each API response and quickly make them available to a developer or business user's network. Beyond just visualizations, I’d like to see more interactive API documentation that makes results savable, exportable, and shareable. Allowing any developer or business user to easily make an API request, then export the results as JSON, CSV, or possibly as a spreadsheet. Empowering API consumers to quickly use API documentation to understand what an API delivers, get at the valuable at...[Read More]


API Evangelist

Bridging Grand Visions of an API Lifecycle With People on the Ground Being Successful In Their Work

09 Sep 2019

While my work as the API Evangelist can burn me out some of the time, I generally find it intellectually challenging. The work takes me from industry to industry, country to country, and from the highest levels of technology down to the work that goes on across development teams within companies. APIs are everywhere. Really, they are. They are impacting each one of our personal and professional lives, and this is one of the things that keeps me coming back to doing what I do. The diversity of API implementations, and the levels at which I can engage with the industry, keep me interested, continuously learning and producing. I enjoy thinking about the API space from the 250K level. It is interesting to study what is, what has been, and where things might be going when it comes to APIs. I find it compelling to learn about how API providers, service providers, and investors see things, then reconcile that with what actually happens on the ground. Looking at API change across business sectors has lifted my view from 100K to 250K, allowing me to not just understand how the tech echo chamber sees APIs, but also how mainstream businesses and government agencies see APIs. Always pushing me in new directions, helping me see beyond any single silo I might get trapped in along the way. Watching the API lifecycle come into focus over the last decade has been interesting. Watching API management shift how we generate revenue from our software, and witnessing API design come to life, as well as mocking, testing, and other stops along the API lifecycle, have all been educational experiences. I like to think about API-first, and how progressive developers are delivering high quality APIs from end to end. Studying, listening, and writing about these concepts, then repeating, repeating, and repeating, until I push my own understanding to new heights. This is what API Evangelist has been for me. It has been something that has...[Read More]


API Evangelist

Where Do You Like Your API Complexity?

06 Sep 2019

I prefer my API complexity at the path, query, then schema levels of my API design—specifically in that order. I don’t mind a huge number of individual API calls to get the job done, because I just script away this aspect of API complexity. However, I do fully understand that many folks prefer their complexity at the query and schema levels over having lots of individual paths. I find that developers love to rant about API complexity, and in my experience the folks who don’t like path level API complexity are also some of the most vocal folks, very confident that their way is the right way, when in reality there is no right way—just the way YOUR consumers will want or need to access the API resources you are serving up. In my experience, how someone learned about the web, and then APIs, dictates much of where they like their API complexity. If developers prefer their complexity at the query parameter layer, their API doorway was the web. If developers prefer their complexity in the schema, their doorway was likely mobile. If you understand the role that paths can play in managing complexity, you’ve probably embraced the overall concept of web APIs. If you understand the opportunity and necessity of sensibly spreading complexity across the path, query, and schema layers of your API, you probably don’t just have experience providing APIs, you probably have a lot of experience supporting many different types of API consumers—the key that unlocks the door to seeing the bigger picture of managing API complexity. Not all API consumers are created equal. They have different views of what an API is, and how you use one. If you’ve supported a wide audience of API consumers, you realize there is no single silver bullet when it comes to where you offload your API complexity. You have to make tradeoffs at every turn when designing your API, and...[Read More]


API Evangelist

API Management Should Not Just Limit Me, It Should Allow Me To Scale

06 Sep 2019

I do a lot of thinking about API management. After almost a decade of contemplating how we manage our API infrastructure, I feel it is still the most important stop along the API lifecycle. I don’t care how well designed, deployed, documented, and supported your APIs are, if you aren’t tuned in using API management, you aren’t going to be successful. API management provides you with the tools you need to define and understand how your consumers will put your API resources to work. After almost 15 years of evolution, API management hasn’t changed too much, but there is one core capability I’d like to see evolve, expanding upon the go-to feature of API rate limiting. API rate limiting has been a staple of API management since the beginning, allowing you to limit how much of any resource a group or individual consumer can get access to—limiting the rate at which they can make API calls. The reason for rate limiting will vary from provider to provider, but the most common reason is to conserve compute resources so that an API remains usable by all consumers. Next, I’d say that pricing is the second most common reason for rate limiting, carving up API resources by access tier, and limiting the number of calls each API consumer can make per second, minute, day, or other time frame. While these concepts are still applicable to the business of APIs in 2019, I’d like to see the concept evolve to keep up with how we deploy infrastructure in a containerized, Kubernetes, serverless cloudy landscape. Instead of capping what I can consume of your API, why not allow me to pay for more access, as well as more performance, for a short period of time, or whatever duration I desire. You can still impose rate limits to measure everything I’m consuming, but allow me to also give the OK and turn on the firehose. If I need to...[Read More]


API Evangelist

API Evangelist Does Not Run On GitHub Anymore

06 Sep 2019

I migrated the main API Evangelist site off of GitHub the other day. The move followed the migration of the 100+ network sites of my API research a couple of weeks back. While I still have a handful of definitions and tooling published to GitHub, the migration of my main site signals a pretty huge shift in how I operate the site. I’ve operated the site 100% on GitHub since 2014, using YAML as the backend data store, and Jekyll to publish the pages, blogs, and data-driven pages. I have always done this to keep the site as open and accessible as I possibly can, sharing all of the data behind what I was doing. However, in 2019, due to tighter GitHub API rate limits, Jekyll build timeout limits, and shifts in the purpose of API Evangelist, I don’t see the value in working to keep things open and available on GitHub anymore. To operate API Evangelist I am still going with a static approach, meaning all of the pages are published as static HTML, rather than being generated dynamically from a CMS or database--however, I won't be using Jekyll anymore. I will maintain all the content and data within my own home brew CMS and database, and I will publish things out on a schedule, and in response to specific events that occur. The move significantly reduces the complexity and workload on my part when it comes to maintaining the many different repositories, schema, and increasingly complex publishing process. It is much easier to just publish HTML files to the file system of a Linux server than to use Git and APIs to orchestrate changes across hundreds of repositories. It was something that was becoming untenable due to increased error rates with Jekyll builds when I committed a change, and impossible to do via the GitHub API after a recent shift in API rate limits. I’m a little sad that it is all over. I enjoyed the performance...[Read More]


API Evangelist

The Different Ways API Providers Use The OpenAPI Servers Collection

05 Sep 2019

I was looking through the OpenAPI definitions I have harvested via some automated scripts I have running, and I came across an API definition that had a variety of URLs available for its APIs, making this part of the definition something I want to study more, identifying the common patterns in use. I harvest a growing number of OpenAPI definitions and Postman Collections to help me stay in tune with who the interesting API providers are, and to document the common building blocks of APIs, helping shine a light on the useful practices that exist across API providers within many different industries. The OpenAPI servers collection is being used to help automate switching between a variety of locations, and is most commonly used to differentiate between the different stages of an API server. This is just the most common usage of the OpenAPI servers collection out there. I’d say the second most common example is publishing the multiple regions in which an API is available—leveraging DNS to make an API more available and performant, and to meet local and regional regulations. After harvesting and processing a couple thousand OpenAPI 3.0 definitions, following doing the same with a slightly larger number of Swagger 2.0 files, the importance of moving from a single host in Swagger 2.0 to multiple potential servers in OpenAPI 3.0 revealed itself, signaling that APIs aren’t just being deployed and made available in a single location or way. I will be regularly pulling the values for the servers collection across all the OpenAPI definitions I index to develop a better understanding of how API providers are using it. It provides an interesting look at how API providers roll out their infrastructure. I don’t expect every API provider to document their APIs this thoroughly, but since I’m scanning GitHub for most of these API definitions, many of the API providers are publishing their OpenAPI definitions to GitHub because it is part of some...[Read More]
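As a point of reference, a typical OpenAPI 3.0 servers collection used to differentiate between the stages of an API looks something like this (the hostnames and stage names here are illustrative, not from any specific provider):

```yaml
openapi: 3.0.0
info:
  title: Example API
  version: 1.0.0
# The servers collection replaces the single host / basePath
# fields from Swagger 2.0, allowing multiple locations.
servers:
  - url: https://api.example.com/v1
    description: Production
  - url: https://staging-api.example.com/v1
    description: Staging
  - url: https://dev-api.example.com/v1
    description: Development
paths: {}
```

The same structure is used for the multi-region pattern, with each entry pointing at a regional hostname instead of a stage.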


API Evangelist

Controlling The Conversation Around Your Mobile Application APIs

03 Sep 2019

I have seen it play out over and over since I began monitoring the API conversation: companies who launch APIs to power a mobile application, but refuse to, or are unaware of how they should be, controlling the conversation around public API infrastructure. The most common reason for this is that companies do not view the APIs behind their mobile applications as public APIs, believing that somehow, because they are buried within their mobile application, they are safe from hackers, completely oblivious to the fact that anyone can run any mobile application through a proxy and reverse engineer the API infrastructure behind it. Mobile application platforms that do not control the conversation around their public APIs are the ones who end up having security incidents down the road. This is due to the fact that these providers end up having a pretty significant blind spot stemming from their lack of awareness and control of the conversation around their APIs, and eventually someone pays closer attention to their APIs and finds a vulnerability to exploit. If you have a publicly available mobile application, then you have publicly available APIs, and you should be treating your APIs like they are public. I’m not saying you should offer the public free and unfettered access to your APIs, I am simply saying that you should be operating a public API program around your APIs, controlling who has access, and shaping the message around what your APIs do, or don’t do. To see examples of companies who do not have a handle on their API conversation, search for TikTok API, or for Tinder API, and you’ll see that hackers own the conversation when it comes to the APIs for these platforms. When you aren’t dealing with this side of your operations, rogue API operators step up and dominate the conversation on GitHub, Stack Exchange, NPM, and other places that coders hang out. We’ve already seen Tinder...[Read More]


API Evangelist

Benefits Of Treating My API Infrastructure As API-First

26 Aug 2019

Most API providers I speak with see the value of consistently delivering API infrastructure to power desktop, web, mobile, device, and network applications. However, less than 10% of these providers see the infrastructure that powers their APIs, and ultimately their applications, as itself being made up of APIs. Meaning, these providers do not view every stop along the API lifecycle as a set of APIs, ensuring that their API definitions, design, mocking, deployment, management, monitoring, testing, orchestration, security, and documentation all have APIs, and are able to be governed programmatically. Mostly it is because they are just getting started on their API journey, and/or they just don’t have the bandwidth to step back and look holistically at what they are trying to accomplish. In this truly API-first world, every architectural decision begins with a README in a Git repo, and with the business, architectural, and development teams working together to hammer out a simple name, description, and list of capabilities for each individual architectural component. Once in agreement, they can then get to work hammering out a contract for the schema, and the interfaces that are available for potential consumers. Once again, working between business, architecture, and development teams to come up with the simplest, most useful, and durable contract possible. One that will work for all stakeholders, and best represent the architectural capabilities of each component you will need to successfully operate each stop along your API life cycle. Once there is an overview and contract in place for each architectural component, you can begin mocking, testing, and documenting it for wider potential consumption by other stops along the API life cycle. 
With all stakeholders in agreement over the capabilities of each architectural component, and how it will work, you can begin wiring up specific back-end capabilities, which might be as simple as data and content storage, or as complex as API management, monitoring, and testing. Leveraging 3rd party services, open source solutions, or...[Read More]


API Evangelist

Doing A Diff Between Available Web, Mobile and Public APIs

22 Aug 2019

I spend a lot of time running web and mobile applications through a proxy to reverse engineer their APIs. I generally use Charles Proxy for routing my desktop, web, and mobile traffic through, which then automatically saves a JSON dump of sessions every five minutes, and syncs with Dropbox via my shared folders. From there I have a scheduled service that looks in the shared Dropbox folder every hour, sifts through the Charles Proxy JSON dump, and looks for JSON, XML, CSV, and other common machine readable formats—which are then converted into OpenAPI definitions. This allows me to reverse engineer desktop, web, and mobile applications as I use them, and map the API surface area of these targeted applications. Recently I started playing with doing the same thing as part of my use of Postman. You can use the built-in proxy in the Postman native apps, or use the Interceptor extension for the Postman app. Postman walks you through how to configure your laptop, mobile device, and Postman application, and ultimately capture HTTP requests and save them to history or as a Postman Collection. Doing essentially the same thing I’m doing, but doing it with the Postman application, and leveraging collections instead of OpenAPI. I’d say there are pros and cons to both approaches, but Postman gives me the ability to manage environments, workspaces, and other essential concepts that would help take my API profiling work to the next level. One of the benefits of working with Postman Collections is you get all the benefits of using the Postman app. Things like the built-in proxy for capturing traffic, but also history, and the ability to generate, fork, merge, and diff collections. My work profiling APIs is all about reverse engineering desktop, web, and mobile applications, as well as quickly translating API documentation into machine readable API definitions, like OpenAPI and Postman Collections. When profiling an API, most of the time I have the API documentation open in one browser window,...[Read More]
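The sifting step described above—scan a proxy's session dump and keep only the calls with machine readable responses—can be sketched roughly like this. Note that the session structure used here is a hypothetical simplification for illustration; Charles Proxy's actual JSON export format differs and varies by version:

```python
import json
from urllib.parse import urlparse

# MIME types treated as machine readable API responses.
MACHINE_READABLE = {"application/json", "application/xml", "text/csv"}

def extract_endpoints(sessions):
    """Pull unique (method, host, path) tuples out of a list of
    captured HTTP sessions. The session shape is a hypothetical
    simplification of a proxy tool's JSON export."""
    endpoints = set()
    for s in sessions:
        content_type = s.get("response", {}).get("mimeType", "")
        # Drop charset parameters like "; charset=utf-8" before matching.
        if content_type.split(";")[0] in MACHINE_READABLE:
            url = urlparse(s["url"])
            endpoints.add((s["method"], url.netloc, url.path))
    return sorted(endpoints)

# A tiny fake dump standing in for a five-minute proxy session export.
dump = json.loads("""[
  {"method": "GET", "url": "https://api.example.com/v1/users",
   "response": {"mimeType": "application/json"}},
  {"method": "GET", "url": "https://cdn.example.com/logo.png",
   "response": {"mimeType": "image/png"}}
]""")

print(extract_endpoints(dump))
```

The image request is filtered out, leaving only the API call; a real pipeline would then feed each surviving endpoint into an OpenAPI path entry.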


API Evangelist

An API Policy Domain Specialist At Twitter

22 Aug 2019

There are some jobs on the Internet I apply for no matter what my current situation is, and an API policy domain specialist at Twitter was one of them that popped up recently. I applied for the job within the first couple of hours after it came out, but haven’t heard from them. I can speculate on the reasons why, but I think a story about the job posting itself is actually more interesting, so I’ll focus there. It is the first time I’ve seen a job posting for this role, but I think it will eventually become a required role for any company with a public API—that is, if companies want to avoid the trouble Twitter is going through right now, which again is making Twitter the poster child for how to do APIs both right and wrong. To highlight what this role is all about, I think Twitter’s own posting sums it up well, so let’s start by just reviewing what you’ll be doing if you get this job at Twitter: [Read More]


API Evangelist

Multiple Overlapping API Life Cycle(s)

21 Aug 2019

One of the toughest parts about teaching people about APIs is that there are many different views of what the API life cycle can be, depending on who you are and what your intentions are. As an advocate or evangelist for a single API you are speaking externally to the API consumption life cycle, but internally you are focused on the API delivery life cycle. As an API Evangelist for many APIs, targeting providers, consumers, and anyone else who comes along, I find it a constant challenge to properly speak to my intended audience. One problematic part of my storytelling I regularly see emerge is that I speak of a single API life cycle, when in reality there are many overlapping life cycles. So, to help me think through all of this I wanted to explore what these overlapping tracks might be—coming up with four distinct iterations of overlapping API building blocks. The API Delivery Life Cycle
The most common way we refer to the API life cycle is from the perspective of the API provider, where it is all about delivering an API. Referencing the stops along the life cycle that are most relevant to someone who is delivering a new API, or might be moving an API forward as part of the evolution of an existing resource. From my vantage point, I consider these to be the most common stops along the API delivery life cycle:

Definitions - Defining what an API does, crafting the JSON Schema, OpenAPI, AsyncAPI, and other machine readable definitions of what is potentially being delivered, initializing the contract that will guide an API through the life cycle.
Design - Stepping back and considering the healthiest API design practices to apply, working from a diverse API toolbox, and ensuring you have a good handle on who your audience is, and the design patterns they’ll respond to.
Mocking - Generating a mock instance of an API using the contract to...[Read More]


API Evangelist

The API Conferences I Am Tracking On For The Fall

20 Aug 2019

As we approach the fall it is time to begin thinking about the conference season, and what the most relevant API conferences are. I haven’t been doing any events this year, but staying in tune with the conference circuit has always been important to my work. Who knows, maybe I will spend some more time investing in API related events after taking a break for six months. When it comes to API events and conferences, here is what I am tracking on. [Read More]


API Evangelist

Human Empathy Is One Of My Most Important API Outreach Tools

20 Aug 2019

I am an empathic human being. It is one of my top strengths, as well as one of my top weaknesses. It is also one of the most important tools in my API toolbox. Being able to understand the API experience from the position of different people throughout the world of APIs is a cornerstone of the API Evangelist brand. Personally, I find APIs themselves to be empathy triggering, something that has regularly forced me out of my silos, allowing me to put myself in the shoes of my consumers. Something that, when realized in a perpetual fashion, can become a pretty powerful force for dialing in the services you offer, and for establishing, maintaining, and strengthening connections with other people within the community. Being able to listen to people in the hallways of conferences, and within the meeting rooms across enterprises, institutions, and government agencies, then internalize, process, and position my writing from what I learn from people, is how I have written on API Evangelist for the last nine years. I rarely am positioning my narrative from my own vantage point, or that of a company. Most of the time I am channeling someone I’ve met along the way, speaking from their perspective, and analyzing the world of APIs as they would see it. While I wish that the world always resembled my view of the API landscape, from experience I know better, and that there are many diverse ways of seeing the value or damage APIs are responsible for. While API design, and the overall user experience around API services and tooling, goes a long way to speak to end-users, I still think the human touch, and positioning our messaging from the vantage point of our consumers, will have the greatest impact. Making a personal connection will last much longer than any single blog post, advertisement, Tweet, image, video, or other common unit of engagement. Of course, what you gather from putting...[Read More]


API Evangelist

Postman Collection As A Single Quantifiable, Shareable, Executable Unit Of Representation For Any Digital Capability

19 Aug 2019

In my world API definitions are more valuable than code. Code is regularly thrown away and rewritten. API definitions hold the persistent detail of what an API delivers, and contain all of the proprietary value when they are properly matured. OpenAPI has definitely risen to the top when it comes to which API definition formats you should be using, however, Postman Collections have one critical ingredient that makes them ultimately more usable, sharable, and meaningful to developers—environmental context. This small but important difference is what makes Postman Collections so valuable as a single quantifiable, shareable, executable unit of representation for any digital capability. Like OpenAPI, Postman Collections describe the surface area of a web API, but they have that added layer describing the environment you are running in, which makes them much more of a run-time and execute-time experience. This may seem like a minor detail, but for developers who want instant gratification, a Postman Collection bundled with the Postman API lifecycle tooling makes for a pretty powerful representation of a company’s, organization’s, institution’s, or government agency’s digital capability. Allowing API providers (or consumers) to describe what an API does in a machine readable format, bundle with it the environment context needed to actually execute the digital capability, and enable the unit of value to be realized within the Postman API development ecosystem. I can take any of my internal, or 3rd party public APIs I depend on, make successful calls to them, including authentication, tokens, and other environment variables, then export a portable Postman Collection, and share it with anyone I want using a simple URL, or by embedding the Run in Postman button within documentation, blog posts, and other resources. 
Then, any potential consumer can take that Postman Collection, load it into their Postman client, and be able to realize the same digital capability I was using—no documentation, on-boarding, or other friction required. You get instant gratification regarding putting the digital capability to work,...[Read More]
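To make that environmental context concrete, here is a minimal Postman Collection v2.1 sketch; the {{baseUrl}} and {{apiKey}} variables resolve against whichever Postman environment the consumer loads alongside it (the collection name, endpoint, and variable names are illustrative, not from any real API):

```json
{
  "info": {
    "name": "Example Digital Capability",
    "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"
  },
  "item": [
    {
      "name": "List Users",
      "request": {
        "method": "GET",
        "url": "{{baseUrl}}/v1/users",
        "header": [
          { "key": "Authorization", "value": "Bearer {{apiKey}}" }
        ]
      }
    }
  ]
}
```

Swapping the environment swaps the host and credentials without touching the collection, which is what makes the same file executable against development, staging, or production.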


API Evangelist

A Second Wave of API Management is Going On

19 Aug 2019

I fully surfed the first wave of API management. API Evangelist began by researching what Mashery, Apigee, and 3scale had set into motion. API Evangelist continued to exist through funding from 3scale, MuleSoft, WSO2, and continues to exist because of the support of next generation providers like Tyk. I intimately understand what API management is, and why it is valuable to both API providers and consumers. API management is so relevant as infrastructure that it is now baked into the AWS, Azure, and Google clouds. However, if you listen to the technological winds blowing out there, you will mostly hear that the age of API management is over, when in reality it is just getting started. The folks telling these tales are purely seeing the landscape from an investment standpoint, and not from an actual boots on the ground within the mainstream enterprise perspective—something that is going to burn them from an investment standpoint, because they are going to miss out on the second wave of API management that is going on. The basics of API management haven’t changed from the first to the second wave, so let’s start with the fundamental building blocks of API management before I move into describing what the next wave will entail:

Portal - A single URL to find out everything about an API, and get up and running working with the resources that are available.
On-Boarding - Think about how you get a new developer from landing on the home page of the portal to making their first API call, and then to an application in production.
Accounts - Allowing API consumers to sign up for an account, either for individual, or business access to API resources.
Applications - Enabling each account holder to register one or many applications which will be putting API resources to use.
Authentication - Providing one, or multiple ways for API consumers to authenticate and get access to API resources.
Services - Defining which services...[Read More]


API Evangelist

Seeing API Consumers As Just The Other Ones

16 Aug 2019

As API providers, it can be easy to find ourselves in a very distant position from the consumers of our APIs. In recent weeks I have been studying the impacts of behavioral approaches to putting technology to work, something that has led me to the work of Max Meyer, and his Psychology of the Other-One (1921). I haven’t read his book yet, but have finished other works citing his ideas on how to “properly” study how animals (including humans) behave. While the psychological impact of all of this interests me, I’m most interested in how this perspective has amplified and driven how we use technology, and specifically how APIs can be used to create or bridge the divide between us (API providers) and them (API consumers). While web and mobile technology is often portrayed as connecting and bringing people together, it also can be used to establish a separation between providers and consumers. We often get caught up in the scale and growth of delivering API infrastructure, and we forget that our API consumers are humans, and we can end up just seeing them as personas, or just a demographic. Of course, as API providers, we can’t be expected to make a direct connection with every single consumer, but we also have to be wary of becoming so distant from their reality that we can’t make a connection with them at all. Leaving our products, services, and tooling something that doesn’t serve them in any way, so that we fail to meet our own business objectives behind what we were building in the first place. There will always be some distance between API provider and consumer. However, we have to regularly work to narrow this divide, otherwise negative forces can make their way in between us and our consumers. If we simply see our API consumers and end-users as the “other ones”, it will make supporting, and investing in, their success much more difficult. Trust with...[Read More]


API Evangelist

Four Phases Of Internal API Evangelism

16 Aug 2019

General evangelism around what APIs are, as well as more precise advocacy around specific APIs or groups of API resources, takes a lot of work, and repetition. Even as a seasoned API evangelist I can never assume my audience will receive and understand what it is that I am evangelizing, and I regularly find myself having to reassess the impact (or lack of it) that I’m making, then retool, refresh, and repeat my messaging to get the coverage and saturation I’m looking for. After a decade of doing this, I cannot tell which is more difficult, internal or external public evangelism, but I do find that after almost 10 years, I’m still learning from each audience I engage with—proving to me that no single evangelism strategy will ever reliably work, and that I need a robust, constantly evolving toolbox if I am going to be successful. I have many different tools in my internal evangelism toolbox, but I find that the most important aspect of what I do is repetition. I never assume that my audience understands what I’m saying after a single session, or simply by reading one wiki page, blog post, or white paper. Quality internal evangelism requires regular delivery and reinforcement, and an acknowledgement that my first engagements with teams will not have the impact I desire. When it comes to internal evangelism, I tend to encounter the following phases when engaging with internal teams: Silence - The first time I present material to internal teams I almost always encounter silence. Teams will often listen dutifully, but rarely will they engage with me, ask questions, and want to get to the details of what is going on. I can rarely assess the state of things, and find that the silence stems from a range of emotions, from not caring at all, to being very distrustful of what I am presenting. I can never assume that teams will care, or trust me, and that silence is always...[Read More]


API Evangelist

An Observable Regulatory Provider Or Industry API Management Gateway

15 Aug 2019

I wrote a separate piece on an API gateway specification standard recently, merging several areas of my research and riffing on a recent tweet from Sukanya Santhanam (@SukanyaSanthan1). I had all these building blocks laying around as part of my research on API gateways, but also from the other areas of the API lifecycle that I track on. Her tweet happened to coincide with other thoughts I had simmering, so I wanted to jump on the opportunity to publish some posts, and see if I could slow jam a conversation in this area. Now, after I defined what I’d consider to be a core set of API gateway building blocks, I wanted to take another crack at refining my vision for how we make it more observable, and something that could be used as a tech sector regulatory framework. Copying and pasting from my API gateway core specification, here is what my v1 draft vision for an API gateway might be:

Paths - Allowing many different API paths to exist.
Schema - Allowing me to manage all of my schema.
Integrations - Providing backend lego architecture.
Resource - Allow for integration with other APIs.
Database - Provide a stack of database integrations.
Other - Define a whole buffet of integration definitions.
Requests - Define all of my HTTP 1.1 requests.
Methods - Providing me with my HTTP verbs.
Path Parameters - Able to define path parameters.
Query Parameters - Able to define query parameters.
Bodies - Providing control over the request body.
Headers - Full management of HTTP request headers.
Encoding - Defining the media types in use for requests.
Validate - Providing validation for all incoming requests.
Mappings - Allowing for mapping of requests to the backend.
Transformations - Transformation before sending to the backend.
Examples - Ensuring there are samples for each request.
Schema - Able to reference all schema used in requests.
Tags - Being able to organize API requests using tags.
Responses -...[Read More]
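None of this is a real specification yet, but to make the draft tangible, a single request entry in such a gateway definition might serialize along these lines. Every field name here is entirely hypothetical, sketched only to show how the building blocks above could hang together:

```yaml
# Hypothetical serialization of one request from the draft
# gateway specification described in the post.
requests:
  - path: /users/{id}
    method: GET
    parameters:
      path:
        - name: id
          schema: { type: string }
      query:
        - name: fields
          schema: { type: string }
    headers:
      - name: Accept
        value: application/json
    encoding: application/json
    validate: true
    mapping:
      backend: user-service
      transformation: strip-internal-fields
    tags: [users]
```

An observable, regulator-facing version of this would presumably add reporting hooks alongside each request, which is the direction the post is exploring.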


API Evangelist

An API Platform Reliability Engineer At Stripe

15 Aug 2019

I find that the most interesting and telling API building blocks come out of the companies who are furthest along in their API journey, and have made the conscious effort to heavily invest in their platforms. I try to regularly visit the API platforms doing the most interesting work, because I am almost always able to find some gem of an approach that I can showcase here on the blog. This week's gem is from API rockstar Stripe, and their posting for a reliability engineer for their API platform. Here is a summary from their job posting: As a Reliability Engineer, you’ll help lead an active area of high impact by defining and building proactive ways to further hone the reliability of our API. You’ll collaborate with team members across Engineering, as well as with our business, sales and operations teams to determine areas of greatest leverage. You Will: Work with engineers across the company to identify key areas for reliability improvement. Gather requirements and make thoughtful tradeoffs to ensure we are focusing our efforts on the most impactful projects. Work on services and tools to proactively improve the quality and reliability of our production API. Debug production issues across services and multiple levels of the stack. Improve operational standards, tooling, and processes. I’ve studied thousands of APIs over almost a decade, and seeing a company invest this heavily in API reliability is a rare thing. For me, this demonstrates two things: that Stripe takes their API seriously, and that it takes a huge amount of investment and resources to do APIs right. Something I don’t think many API providers realize as they try to do APIs as a side project, and then wonder why they aren’t seeing the results they want. I find that the job postings from API providers are one of the most telling signals I can harvest to understand how committed a company is to their APIs....[Read More]



Some Of The API Gateway Building Blocks

05 Aug 2019

The inspiration for this post wasn’t fully mine; I’m borrowing and building upon what Sukanya Santhanam (@SukanyaSanthan1) tweeted out the other day. It is a good idea, and something that should be open sourced and moved forward. I’ve been studying API management since 2010, and using gateways for over 15 years. I’ve watched gateways evolve, and partnered regularly with API management and gateway providers (Shout out to Tyk). Modern API gateways aren’t your grandfather’s SOA tooling, they’ve definitely gone through several iterations. While I still prefer hand rolling and forging my APIs out back in my woodshed on an anvil, I find myself working with a lot of different API gateways lately. I’ve kept feeling like I needed to map out the layers of what I’d consider to be a modern API gateway, and begin providing links to the most relevant API gateways out there, and the most common building blocks for an API gateway. Now that you can find API gateways baked into the fabric of the cloud, it is time that we work to standardize the definition of what they can deliver. I’m not looking to change what already is. Actually, I’m looking to just document and build on what already is. As with every other stop along the API lifecycle I’m looking to just map out the common building blocks, and establish a blueprint going forward that might influence existing API gateway providers, as well as any newcomers. After going through my API gateway research for a while, I quickly sketched out these common building blocks for helping deploy, manage, monitor, and secure your APIs: Paths - Allowing many different API paths to exist. Schema - Allowing me to manage all of my schema. Integrations - Providing backend lego architecture. Resource - Allowing for integration with other APIs. Database - Providing a stack of database integrations. Other - Defining a whole buffet of integration definitions. Requests - Define all of my HTTP...[Read More]



A Look At Behavioral API Patents

05 Aug 2019

I have been studying uses of behavioral technology lately. Riffing off my partner in crime’s work on the subject, but putting my own spin on it, and adding APIs (of course) into the mix. Applying one of my classic techniques, I figured I’d start with a patent search for “behavioral application programming interfaces”. I find patents to be a “wonderful” source of inspiration and understanding when it comes to what is going on with technology. Here are the top results for my patent search, with title, abstract, and a link to understand more about each patent. User-defined coverage of media-player devices on online social networks In one embodiment, a method includes detecting, by a media-player device including multiple antennas, a client system of a user is within a wireless communication range of the media-player device. In response to the detection, the media-player device broadcasts an authentication key for the user of the client system. The media-player device then registers the user to the media-player device based on the authentication key being verified by the client system. The media-player device further receives from the client system instructions to adjust a power level of each of the multiple antennas. The instructions are determined based on broadcast signals received at the client system and on a respective position of the client system associated with each received broadcast signal. The respective position of the client system is determined with respect to a position of the media-player device. 
Controlling use of vehicular Wi-Fi hotspots by a handheld wireless device A system and method of controlling use of vehicular Wi-Fi hotspots by a handheld wireless device includes: detecting that the handheld wireless device is communicating via a Wi-Fi hotspot; determining at the handheld wireless device that the Wi-Fi hotspot is provided by a vehicle; and enabling one or more default prohibitions against transmitting low-priority data from the handheld wireless device via a cellular wireless carrier system while the handheld wireless...[Read More]



Reverse Engineering Mobile APIs To Show A Company Their Public APIs

02 Aug 2019

One story I tell a lot when talking to folks about APIs, is how you can reverse engineer a mobile phone to map out the APIs being used. As the narrative goes, many companies that I talk with claim they do not have any public APIs when I first engage with them. Then I ask them, “Do you have any mobile applications?”. To which the answer is almost always, “Yes!”. Having anticipated this regular conversation, if I am looking to engage with a company in the area of API consulting, I will have made the time to reverse engineer their application to produce a set of OpenAPI definitions that I can then share with them, showing that they indeed have public APIs. The process isn’t difficult, and I’ve written about this several times. To reverse engineer a mobile application, all you have to do is download the targeted application to your phone, configure your phone to use your laptop as a proxy, and be running Charles Proxy on your laptop. I’m not going to share a walkthrough of this; it is easy enough to Google and find the technical details of doing it. I’m more looking to educate the average business person that this is possible. Once you have your mobile phone proxied through Charles, it will capture every call made home to the mother ship, logging the request and response structure of all APIs being used by the mobile application—which uses public DNS for routing, making it a public API. I have an API that I developed to which I can upload the resulting Charles Proxy output file, converting all of the API calls into an OpenAPI. Making quick work of documenting the APIs behind a mobile application. Which, when you share it with someone who is under the belief that their mobile APIs are private APIs, can make quite a splash. Usually you get a response, like “well, we don’t allow access to the...[Read More]
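The Charles-to-OpenAPI step described above can be sketched roughly as follows. This is a minimal, hypothetical example, assuming you use Charles Proxy's HAR (JSON) export; the `har_to_openapi` function and the sample data are my own illustration, not the author's actual API:

```python
import json
from collections import defaultdict
from urllib.parse import urlparse

def har_to_openapi(har: dict, target_host: str) -> dict:
    """Build a minimal OpenAPI 3.0 skeleton from requests captured in a HAR export."""
    paths = defaultdict(dict)
    for entry in har.get("log", {}).get("entries", []):
        request = entry["request"]
        url = urlparse(request["url"])
        if url.hostname != target_host:
            continue  # only document calls to the host we are mapping
        method = request["method"].lower()
        paths[url.path][method] = {"responses": {"200": {"description": "Observed response"}}}
    return {
        "openapi": "3.0.0",
        "info": {"title": f"APIs observed behind {target_host}", "version": "0.0.1"},
        "paths": dict(paths),
    }

# A tiny stand-in for a real Charles Proxy HAR export.
sample_har = {
    "log": {
        "entries": [
            {"request": {"method": "GET", "url": "https://api.example.com/v1/users"}},
            {"request": {"method": "POST", "url": "https://api.example.com/v1/orders"}},
            {"request": {"method": "GET", "url": "https://tracker.example.net/pixel"}},
        ]
    }
}

spec = har_to_openapi(sample_har, "api.example.com")
print(json.dumps(spec, indent=2))
```

A real pass over a mobile app's traffic would also want to collapse IDs in paths (e.g. `/users/123` into `/users/{id}`) and capture request and response schemas, but even this bare skeleton is enough to show a company their "private" mobile APIs on paper.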



Didn’t We Already Do That?

02 Aug 2019

When you are in the API game you hear this phrase a lot, “didn’t we already do that?”. It is a common belief system that because something was already done, it means it will not work ever again. When you operate solely in a computational world, you tend to see things as 1s and 0s, and if something was tried and “didn’t work”, there is no resetting of that boolean switch for some reason. We excel at believing things are done, without ever unpacking why something failed, or how the context may have changed. One of the things I’m going to do over the next couple months is go through the entire SOA toolbox and take accounting of everything we threw away, and evaluate what the possible reasons were behind it—good and bad. I strongly believe that many folks who abandoned SOA, willingly or unwillingly, and especially the people who enjoy saying, “didn’t we already do that”, have never spent time unpacking why we did it, why it didn’t work, let alone whether or not it might work today. I think this paradigm reflects one of the fundamental illnesses we encounter in the tech sector—we have a dysfunctional and distorted relationship with the past (this is by design). I’ve written about this before, but I’ll say it again. Can you imagine saying, “didn’t we already do that” about other non-technical things? Fashion. Art. Music. Stories. Law. Why do we say it in technology? When it comes to SOA, there are many reasons why it didn’t work overall, and most of the reasons were not technical. So why would we not want to re-evaluate some of the technologies and practices to see if there would be a new opportunity to apply an old pattern or approach? This question isn’t just something I’ve heard a handful of times. There have been literally a hundred plus blog posts on API Evangelist where people have commented this—especially when...[Read More]



The Future Of APIs Will Be In The Browser

01 Aug 2019

I have been playing with the new browser Reporting API lately, and while it isn’t widely supported, it does work in Chrome, and soon Firefox. I won’t go into too much technical detail, but the API provides an interesting look at reporting on API usage in the browser. Offering a unique view into the shadows of what is happening behind the curtain in our browser when we are using common web applications each day. I have been proxying my web traffic for a long time to produce a snapshot of the domains that are operating beneath the covers, but it is interesting for browsers to begin baking in a look at the domains that are violating policies, generating errors, and up to other shenanigans. As I’m contemplating the API discovery universe I can’t help but think of how “API innovation” is occurring within the browser. When I say “API innovation”, I don’t mean the kind that got us all excited from 2005 through 2010, or the golden days from 2010 through 2015—I am talking the exploitative kind. Serving advertisers, trackers, and other exploitative practices. Most people would scoff at me calling these things APIs, but they are using the web to deliver machine readable information, so they are APIs. I’ve been tracking the APIs I use behind the scenes in my browser using Charles Proxy for a while now, but I’m feeling I should formalize my analysis. I’m thinking I’ll take a sampling of domains, maybe 250+, and automate the browsing of each page, while also running through Charles Proxy. Then aggregate all of the domains that are loaded, and categorize them by media type—just to give me a sampling of the APIs in operation behind the scenes of some common sites. I’m sure most are advertising or social related, but I’m guessing there are a lot of other surprises in there. While some of the APIs will be publicly showcased in some way, there are...[Read More]
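The aggregation step described above (grouping proxied traffic by domain and media type) could be sketched like this. The capture data is hypothetical, standing in for what a proxy like Charles would log:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical captures, standing in for proxied traffic: (request URL, response Content-Type).
captures = [
    ("https://ads.example.com/serve", "application/json"),
    ("https://ads.example.com/track", "image/gif"),
    ("https://social.example.org/widget.js", "application/javascript"),
    ("https://api.example.com/v1/profile", "application/json"),
]

def aggregate_by_domain(captures):
    """Count how often each background domain shows up, and what media types it returns."""
    domains = Counter()
    media_types = Counter()
    for url, content_type in captures:
        domains[urlparse(url).hostname] += 1
        media_types[content_type.split(";")[0]] += 1  # drop charset parameters
    return domains, media_types

domains, media_types = aggregate_by_domain(captures)
print(domains.most_common())
print(media_types.most_common())
```

Running this across a few hundred automated page visits would surface exactly the kind of advertising and social domain rankings the post is after.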



About Giving Away API Knowledge For Free

01 Aug 2019

I’m in the business of providing access to the API knowledge accumulated over the last decade. Despite what many people claim, I do not know everything about APIs, but after a decade I have picked up a few things along the way. Historically, I have really enjoyed sharing my knowledge with people, but I’m increasingly becoming wary of sharing too openly because of the nature of how business gets conducted online. Specifically, that there are endless waves of folks who want to tap what I know without giving anything in return, who work at companies that enjoy a lot of resources. I know people have read the startup handbook, which tells them to reach out to people who know and get their view, but when everyone does this, and doesn’t give anything in return, it is, well…I guess business as usual? Right? ;-( Taking a sampling from the usual week in my inbox, I’ll get a handful of requests reflecting these personas: Analysts / Consultants - Other analysts reaching out to share information, and get my take on what I’m seeing. There is usually reciprocity here, so I’m usually happy to engage, especially if I know them personally, and have worked with them before. Startup Founders - I get a wide range of startup founders reaching out, many of which I do not know, wanting to get validation of their idea, and understand the marketplace they are targeting—usually if I know them, or they come with a reference I’ll engage. Venture Capitalists - There is a regular stream of VCs wanting to know what is happening, and get my take on things, but they usually are just interested in validating what they already know, and getting introduced to some new concepts. Students - There is a growing number of students reaching out, and increasingly PhD students who are working on something API related as part of their studies. This represents the usual suspects. There are plenty...[Read More]



The Challenges Of API Discovery Conversations Being Purely Technical

31 Jul 2019

Ironically, one of the biggest challenges facing API discovery on the web, as well as within the enterprise, is that most conversations focus purely on the technical, rather than the human and often business implications of finding and putting APIs to work. The biggest movement in the realm of API discovery in the last couple years has been part of the service mesh evolution of API infrastructure, and how your gateways “discover” and understand the health of APIs or microservices that provide vital services to applications and other systems. Don’t get me wrong, this is super critical, but it is more about satisfying a technical need, which is also being fueled by an investment wave—it won’t contribute much to the overall API discovery and search conversation because of its limited view of the landscape. Runtime API discovery is critical, but there are so many other times we need API discovery to effectively operate the enterprise. Striving for technical precision at runtime is a great goal, but enabling all your groups, both technical and business, to effectively find, understand, engage, and evolve with existing APIs should also be a priority. It can be exciting to focus on the latest technological trends, but doing the mundane profiling, documentation, and indexing of existing API infrastructure can have a much larger business impact. Defining the technical details of your API Infrastructure using OpenAPI, Postman, and other machine readable formats is just the beginning; ideally you are also working to define the business side of things along the way. I find defining APIs using OpenAPI and JSON Schema to be grueling work. However, I find documenting the teams and owners behind APIs, the licensing, dependencies (both technical and business), pricing, and other business aspects of an API to be even more difficult. 
Over the last decade we’ve gotten to work standardizing how we define the technical surface area of our APIs, but we’ve done very little work to standardize how...[Read More]
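The business-side profiling described above can be captured in a machine readable way too. Here is a hedged sketch, loosely modeled on an APIs.json-style property list; the field names (`x-team`, `x-owner`, and so on) are my own illustration, not an established standard:

```python
# A hypothetical profile capturing the business side of an API, loosely modeled
# on an APIs.json-style property list. Field names here are illustrative.
api_profile = {
    "name": "Orders API",
    "description": "Internal API for order management.",
    "properties": [
        {"type": "x-openapi", "url": "https://example.com/orders/openapi.json"},
        {"type": "x-team", "value": "commerce-platform"},
        {"type": "x-owner", "value": "jane.doe@example.com"},
        {"type": "x-license", "value": "internal-use-only"},
        {"type": "x-pricing", "value": "chargeback-per-call"},
        {"type": "x-dependencies", "value": ["Payments API", "Inventory API"]},
    ],
}

def missing_business_properties(profile, required=("x-team", "x-owner", "x-license")):
    """Flag which business-side properties have not been documented yet."""
    present = {p["type"] for p in profile["properties"]}
    return [r for r in required if r not in present]

print(missing_business_properties(api_profile))
```

A check like this, run across an API catalog, is one way to make the "grueling" business documentation work visible and measurable.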



Differences Between API Observability Over Monitoring, Testing, Reliability, and Performance

31 Jul 2019

I’ve been watching the API observability coming out of Stripe, as well as Honeycomb, for a couple of years now. The observability of systems is not a new concept, but it is one that progressive API providers have embraced to help articulate the overall health and reliability of their systems. In control theory, observability is a measure of how well internal states of a system can be inferred from knowledge of its external outputs. Everyone (except me) focuses only on the modern approaches for monitoring, testing, performance, status, and other common API infrastructure building blocks to define observability. I insist on adding the layers of transparency and communication, which I feel are the critical aspects of observability—I mean if you aren’t transparent and communicative about your monitoring, testing, performance, and status, does it really matter? I work to define observability as a handful of key API building blocks that every API provider should be investing in: Monitoring - Actively monitoring ALL of your APIs to ensure they are up and running. Testing - Performing tests to ensure APIs aren’t just up but also doing what they are intended to. Performance - Adding an understanding of how well your APIs are delivering to ensure they perform as expected. Security - Actively locking down, scanning, and ensuring all your API infrastructure is secure. Many folks rely on the outputs from these areas to define observability, but there are a couple more ingredients needed to make it observable: Transparency - Sharing the practices and results from each of these areas is critical. Communication - If you aren’t talking about these things regularly they do not exist. Status - Providing real time status updates for all these areas is essential. 
You can be actively observing the outputs from monitoring, testing, performance, and security operations, but if this data isn’t accessible to other people on your team, within your company, to partners, and to the public as required, then things aren’t observable....[Read More]
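The building blocks listed above can be tied together in a simple status report, the kind of artifact that makes the transparency layer concrete. A minimal sketch, assuming each area exposes a pass/fail check; the check functions here are stubs standing in for real monitoring, testing, performance, and security tooling:

```python
from datetime import datetime, timezone

# Stub checks standing in for real monitoring, testing, performance,
# and security tooling.
def check_uptime():   return True
def check_contract(): return True
def check_latency():  return False  # pretend performance has degraded
def check_security(): return True

CHECKS = {
    "monitoring": check_uptime,
    "testing": check_contract,
    "performance": check_latency,
    "security": check_security,
}

def build_status_report(checks):
    """Run each check and produce a report suitable for publishing to a status page."""
    results = {name: ("pass" if fn() else "fail") for name, fn in checks.items()}
    return {
        "generated": datetime.now(timezone.utc).isoformat(),
        "overall": "degraded" if "fail" in results.values() else "operational",
        "results": results,
    }

report = build_status_report(CHECKS)
print(report["overall"], report["results"])
```

Publishing a report like this on a regular cadence is what turns internal monitoring outputs into observability that partners and the public can actually act on.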



Peer API Design Review Sessions

30 Jul 2019

Public APIs have always benefitted from something that internal APIs do not always receive—feedback from other people. While the whole public API thing didn’t play out as I originally imagined, there is still a lot of benefit in letting others see, and provide open feedback on, your API. It is painful for many developers to receive feedback on their API, and it is something that can be even more painful when it is done publicly. This is why so many of us shy away from establishing feedback loops around our APIs. It is hard work to properly design, develop, and then articulate our API vision to an external audience. It is something I understand well, but still suffer from when it comes to properly accepting peer review and feedback on my API designs. I prefer opening up to peer reviews of my API designs while they are still just mocks. I’m less invested in them at this point, and it is easier to receive feedback on them. It is way less painful to engage in an ongoing discussion of what an API should (and shouldn’t) do early on than it is to define the vision, deliver an API as code or within a gateway, and then have people comment on your baby that you have given birth to. It hurts to have people question your vision, and what you’ve put forth. Especially for us fragile white men who aren’t often very good at accepting critical feedback, and want to just be left to our own devices. I’d much prefer just being a coder, but around 2008 through 2010 I saw the benefits to my own personal development when I opened up my work to my peers and let a little sunlight in. I am a better developer because of it. One tool in my API toolbox that is growing in importance is the open, peer API design review session. Taking an OpenAPI draft,...[Read More]



API For Processing Common Logging Formats And Generating OpenAPI Definitions

30 Jul 2019

I’ve invested a lot of time in the last six months into various research, scripts, and tooling to help me with finding APIs within the enterprise. This work is not part of my current role; it is a side project to help me get into the mindset of how to help the enterprise understand where their APIs are, and what APIs they are using. Almost every enterprise group I have consulted for has trouble keeping tabs on what APIs are being consumed across the enterprise, and I’m keen on understanding the best ways to help them get their API houses in order. While there are many ways to trace out how APIs are being consumed across the enterprise, I want to start with some of the basics, or the low hanging fruit when it comes to API logging within the enterprise. I’m sure there are a lot of common logging locations to tackle, but I began with some of the common cloud platforms in use for logging operations—focusing on the following three cloud logging solutions: Amazon CloudFront - Beginning with the cloud leader, and looking at how the enterprise is centralizing their logs with CloudFront. Google StackDriver - Next, I found Google’s multi-platform approach interesting and worth evaluating as part of this work. Azure Logging - Of course, I have to include Azure in all of this as they are a fast growing competitor to Amazon in this space. After establishing a short list of cloud platform logging solutions, I began looking at which of the common web server formats I should be looking for within these aggregate logging locations, trying to map out how the enterprise is logging web traffic. Providing me with a short list of the three most common web server formats I should be looking at when it comes to mapping the enterprise API landscape—providing artifacts of the APIs that enterprise groups are operating....[Read More]
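One of the most common web server formats you will find in those aggregate logging locations is the NCSA/Apache Common Log Format, and mining it for API paths can be sketched in a few lines. The sample log lines and the `/api/` prefix convention below are hypothetical, just to illustrate the approach:

```python
import re
from collections import defaultdict

# Apache/NCSA Common Log Format pattern; the sample lines below are made up.
CLF_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+'
)

sample_logs = [
    '10.0.0.1 - - [30/Jul/2019:10:00:00 +0000] "GET /api/v1/users HTTP/1.1" 200 512',
    '10.0.0.2 - - [30/Jul/2019:10:00:01 +0000] "POST /api/v1/orders HTTP/1.1" 201 64',
    '10.0.0.3 - - [30/Jul/2019:10:00:02 +0000] "GET /index.html HTTP/1.1" 200 1024',
]

def discover_api_paths(lines, api_prefix="/api/"):
    """Pull unique method + path pairs out of logs, as raw material for an OpenAPI draft."""
    paths = defaultdict(set)
    for line in lines:
        match = CLF_PATTERN.match(line)
        if match and match.group("path").startswith(api_prefix):
            paths[match.group("path")].add(match.group("method").lower())
    return {path: sorted(methods) for path, methods in paths.items()}

print(discover_api_paths(sample_logs))
```

The same pattern extends to the combined log format and to cloud-specific log schemas; the output is exactly the kind of artifact of operating APIs the post describes.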



API Storytelling Within The Enterprise

28 Jul 2019

Storytelling is important. Storytelling within the enterprise is hard. Reaching folks on the open web is hard work too, but there is usually an audience that will eventually tune in, and over time you can develop and cultivate that audience. The tools you have at your disposal within the enterprise are much more prescribed and oftentimes dictated—even controlled. I also find that they aren’t always as effective as they are made out to be, with the perception being one thing, and the reach, engagement, and noise being the much harder realities you face when trying to get a message out. Email might seem like a good idea, and is definitely a critical tool when reaching specific individuals or groups, but as a general company wide thing, it quickly becomes exponentially ineffective with each person you add on as CC. I’d say that you are better off creating a daily or weekly email newsletter if you are going to be sending across large groups of the enterprise rather than participating in the constant email barrage that occurs on a daily basis. Email is an effective tool when used properly, but I’d say I haven’t perfected the art of using email to reach my intended audience within the enterprise. My preferred storytelling format is relatively muted within the enterprise — people rarely read blogs in this world. Blog reading is something you do out on the web apparently. This means I have to get pretty creative when it comes to getting my stories out. It doesn’t mean you shouldn’t be using this format of storytelling, but you just can’t count on folks to regularly consume a blog, or subscribe to an RSS feed. You can still have a blog, but you have to find other ways of slipping the links into existing conversations, documentation, and other avenues in which people consume information within the enterprise. I would say this reality of reading within the enterprise is...[Read More]



APIs and Browser Bookmarklets

24 Jul 2019

I have quite a few API driven bookmarklets I use to profile APIs. I recently quit using Google Chrome, so I needed to migrate all of them to Firefox. I saw this work as an opportunity to better define and organize them, as they had accumulated over the years without any sort of strategy. When I needed some new functionality in my browser, I would create a new API, and craft a bookmarklet that would accomplish whatever I needed. I wish I had more browser add-on development skills, something I regularly try to invest in, but I find that bookmarklets are the next best thing when it comes to browser and API interactions. There are a number of tasks I am looking to accomplish when I’m browsing the web pages of an API provider. The first thing I want to do is record their domain, then retrieve as much intelligence about the company behind the domain in a single click of the bookmarklet. This was the first bookmarklet and API I developed. Since then, I’ve made numerous others to record the pricing page, parse the terms of service, OpenAPI, and other valuable API artifacts from across the landscape. Bookmarklets are a great way to provide just a little more context combined with a URL pointer, for harvesting, processing, and possibly some human review. Allowing me to augment, enrich, and automate how I consume information as I’m roaming around the web, researching specific topics, and do what I do. At this point I am actually glad I didn’t invest a lot of energy into developing a Chrome browser extension, because it wouldn’t have easily translated to a Firefox world. Since I have been investing in APIs plus bookmarklets, I can easily import, or copy and paste my bookmarklets over. I’m spending the time to go through them, inventory them, and better organize them for optimal usage, so the migration is a little more work than just import...[Read More]



Absolutism Around API Tools Increases Friction And Failure

24 Jul 2019

I know you believe your tools are the best. I mean, from your vantage point, they are. I also know that when you are building a new API tool, your investors want you to position your tooling as the best. The one solution to rule them all. They want you to work hard to make your tools the best, but also make sure to demonize other tooling as being inferior, out of date, and something the dinosaurs use. I know this absolute belief in your tooling feels good and right, but you are actually creating friction for your users, and potentially failure or at least conflict within their teams. Absolutism, along with divide and conquer strategies for evangelizing API tooling, works great as a short term financial strategy, but doesn’t do much to help us on the ground actually developing, managing, and sustaining APIs. Ironically, there are many diverse factors that contribute to why API tooling providers and their followers resort to absolutism when it comes to marketing and evangelizing their tools. Much of which has very little to do with the merits of the tools being discussed, and everything to do with those who are making the tools. I wanted to explore a few of them so they are available on the tip of my tongue while working within the enterprise. No Due Diligence On What Is Out There - Most startups who are developing API tooling do not spend the time understanding what already exists across the landscape, or getting outside of the echo chamber to learn what real world companies are using to get the job done each day. No Learning Around Using Existing Tools - Even if startups are aware of existing tools, patterns, and processes, they rarely invest the time to actually understand what existing tools deliver—spending time to deeply understand how existing tools are being put to use by their would-be customers. Lack Of Awareness Around The Problem - There is a...[Read More]



The Higher Level Business Politics That I Am Not Good At Seeing In The API Space

23 Jul 2019

I have built successful startups. I’m good at the technology of delivering new solutions. I am decent at understanding and delivering much of the business side of bringing new technological solutions to market. What I’m not good at is the higher level business politics that occur. These are the invisible forces behind businesses that I’m not good at seeing or playing in, and they almost always catch me off guard, reach me too late, or just simply piss me off that they are going on behind the scenes. Unfortunately it is in this realm where most of the money is to be made doing APIs, resulting in me being written out of numerous successful API efforts, because I’m not up to speed on what is going on. Startups are great vehicles for higher level economic strategies. They are the playground of people with access to resources, and have economic visions that rarely jive with what is happening on the ground floors. Startup strategies count on a handful at the top understanding the vision, with most at the bottom levels not being able to see the vision, and a small group of disposable stooges in the middle, ensuring that the top level vision is realized—at all costs. You can work full time at a startup, and even enjoy a higher level position, and still never see the political goings on that are actually motivating the investment in your startup. This is by design. The whole process depends on the lower levels working themselves to the bone, working on, marketing, and selling one vision, while there are actually many other things going on above, usually with a whole other set of numbers. After 30 years of playing in this game I still suck at seeing the higher level influences. I’ve seen shiny API tooling solution after shiny API tooling solution arrive on the market, and I still fall for the shine. Ooooohhh, look at that. It will solve X, or Y problem. I really...[Read More]



API Provider And Consumer Developer Portals

23 Jul 2019

I’ve been studying API developer portals for almost a decade. I’ve visited the landing pages, portals, websites, and other incarnations from thousands of API providers. I have an intimate understanding of what is needed for API providers to attract, support, and empower API consumers. One area I’m deficient in, and I also think it reflects a wider deficiency in the API space, is how you make an API portal serve both API providers and API consumers. Providing a single portal within the enterprise where everyone can come and understand how to deliver or consume an API. There are plenty of examples out there now when it comes to publishing an API portal for your consumers, but only a few that show you how to publish an API. I’d say the most common example are API marketplaces that allow both API consumers and providers to coexist, but this model isn’t exactly what you want within the enterprise. One thing the model lacks is the on-boarding of new developers when it comes to actually developing an API. Suffering from many of the same symptoms API management service providers have historically suffered from—not providing true assistance when it comes to delivering a quality API. When I envision an API portal that serves both providers and consumers, either publicly or privately, I envision just as much assistance when it comes to delivering a new API as we provide for new consumers of an API. Helping with API definition, design, deployment, management, testing, monitoring, documentation, and other critical stops along the API lifecycle. We need to see more examples of the split between API provider and consumers, equally helping both sides of the coin get up to speed, and be successful with what they are looking to achieve. I think we’ve spent almost 15 years investing in perfecting and monetizing the API portal with a focus on the consumer, and now we need to invest in helping...[Read More]



The Role Having Awareness Of Your API Traffic Plays In API Security

22 Jul 2019

One of the biggest reasons we adopt new technology, and justify the development of new technology, is we do not want to do the heavy lifting when it comes to what we already have in place. A common illness when it comes to API security that I’ve been battling since 2015 is the belief that you will have API security addressed once you have adopted an API management solution. Your APIs require API keys, and thus they are secure. No further work necessary. The unwillingness or lack of knowledge regarding what is needed next leaves a vacuum for new technology providers to come in and sell you the solution for what is next, when you should be doing more work to use the tools you already have. When it comes to API management, most vendors sold it as purely a security solution, and when companies implement it they become secure. Missing the entire point for why we do API management—to develop an awareness of our API usage and consumption. Having keys for your APIs is not enough. You actually have to understand how those API consumers are putting API resources to work, otherwise your API security will always be deficient. Some of the fundamentals of API management you should be employing as part of your API security are: Registration - Make all developers sign up for API usage, establishing the terms of use. API Keys - Require all developers internal or external to use API keys for every application. API Usage - Knowing which APIs are being used by which consumers, and in which applications. API Errors - Understanding what errors are being generated, and who is responsible for them. Logging - The logging of all API traffic, reconciling against what you know as reported usage. Invoicing - Invoicing of all consumers for their usage, even if they aren’t paying you money. Reporting - Provide reports on API usage for all stakeholders, to regularly develop...[Read More]
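The logging fundamental above, reconciling raw API traffic against reported usage, can be sketched in a few lines. The key names and call counts are hypothetical, standing in for what a gateway's logs and a management layer's reports might contain:

```python
# Hypothetical numbers: calls counted in raw gateway logs per API key,
# versus the usage the management layer reported for the same period.
logged_calls = {"key-alpha": 1200, "key-beta": 300, "key-unknown": 45}
reported_usage = {"key-alpha": 1200, "key-beta": 250}

def reconcile(logged, reported):
    """Flag consumers whose logged traffic does not line up with reported usage."""
    findings = []
    for key, count in logged.items():
        if key not in reported:
            findings.append((key, "unregistered key seen in logs"))
        elif reported[key] != count:
            findings.append((key, f"reported {reported[key]} but logged {count}"))
    return findings

for key, issue in reconcile(logged_calls, reported_usage):
    print(key, "->", issue)
```

An unregistered key showing up in the logs, or a mismatch between reported and logged counts, is exactly the kind of awareness that API keys alone will never give you.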



Happy Path API Testing Bias

22 Jul 2019

I see a lot of happy path bias when it comes to the development of APIs, specifically when it comes to crafting testing to ensure APIs are delivering as expected. Happy path is a term used in testing to describe the desired outputs a developer and product owner are looking for. The not so happy path, then, is about testing for outcomes that a developer and product owner do not want to occur. When it comes to API development, most developers and product owners are only interested in the happy path, and will almost always cut corners, minimize the investment in, or completely lack an imagination when it comes to less than happy path API testing. There are many reasons why someone will have a bias towards the happy path when developing an API. Every API provider is invested in achieving the happy path for delivering, providing, and consuming an API. This is what generates revenue. However, in this quest for revenue, we often become our own worst enemy. Shining a spotlight on the happy path, while being completely oblivious to what the not so happy paths will look like for end users. Why do we do this?

- Greed - We are so interested in getting an API up and running, used in our applications, and generating behavioral surplus, that we are more than willing to ignore all other possible scenarios if we can easily meet our revenue goals by ignoring the unhappy path and there are no consequences.
- Tickets - Most development occurs using JIRA or other software development “tickets”, which tell developers what they are supposed to do to meet the requirements of their employment. Tickets are written with the happy path in mind, and developers are rarely willing to do more.
- Imagination - While many of us technologists think we are imaginative creatures, most of us are pretty stuck in a computational way of thinking, and elaborating, iterating, and exploring beyond the initial...[Read More]
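The contrast reads clearly in code. Below is a sketch of happy path versus not so happy path tests; the `validate_order` handler and its rules are hypothetical stand-ins for whatever your API actually does.

```python
# Pretend API handler: returns (status, body). Purely illustrative.
def validate_order(payload):
    if not isinstance(payload, dict):
        return 400, {"error": "body must be a JSON object"}
    if "quantity" not in payload:
        return 422, {"error": "quantity is required"}
    if payload["quantity"] <= 0:
        return 422, {"error": "quantity must be positive"}
    return 200, {"ok": True}

# Happy path: the one test most teams write.
def test_happy_path():
    status, body = validate_order({"quantity": 2})
    assert status == 200 and body["ok"]

# Not so happy paths: the tests that tend to get skipped.
def test_missing_field():
    status, _ = validate_order({})
    assert status == 422

def test_negative_quantity():
    status, _ = validate_order({"quantity": -1})
    assert status == 422

def test_wrong_type():
    status, _ = validate_order("not json")
    assert status == 400
```

Notice the unhappy paths outnumber the happy one three to one, which is roughly the ratio real-world traffic will impose on you whether you tested for it or not.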


API Evangelist

What Makes You Think Your GraphQL Consumers Will Want To Do The Work

18 Jul 2019

Data work is grueling work. I’ve been working with databases since my first job developing student information databases in 1988 (don’t tell my wife). I’ve worked with Cobol, Foxpro, SQL Server, Filemaker, Access, MySQL, PostGres, and now Aurora databases over the last 30 years. I like data. I can even trick myself into doing massive data and schema refinement tasks on a regular basis. It is still grueling work that I rarely look forward to doing. Every company I’ve worked for has a big data problem. Data is not easy work, and the more data you accumulate, the more this work grows out of control. Getting teams of people to agree upon what needs to happen when it comes to schema and data storage, and actually execute upon the work in a timely, cost effective, and efficient way is almost always an impossible task. Which leaves me questioning (again) why GraphQL believers think they are going to be successful in offloading the cognitive and tangible workload of understanding what data delivers, and then successfully applying it to a meaningful and monetizable business outcome. Don’t get me wrong. I get the use cases where GraphQL makes sense. Where you have an almost rabid base of known consumers, who have a grasp on the data in play, and possess an awareness of the schema behind it. I have made the case for GraphQL as a key architectural component within a platform before. The difference in my approach over the majority of GraphQL believers is that I’m acknowledging there is a specific group of savvy folks who need access to data and understand the schema. I’m also being honest about who ends up doing the heavy data lifting here, and making sure this group wants it. However, I have an entirely separate group of users (the majority) who do not understand the schema, and have zero interest in doing the hard work to understand the schema, navigate relationships, and develop...[Read More]


API Evangelist

What Is An Application?

18 Jul 2019

I have struggled asking this question in many discussions I’ve had around the world, at technology conferences, on consulting projects, and in the back rooms of dimly lit bars. What is an application? You get ten different answers if you ask this question of ten different people. I’d say the most common response is to reference the applications on a mobile device. These are the most relevant, most accessible, and the most active and visible form of application in our modern life. Older programmers see them as desktop applications, while younger programmers see them as web applications, with varying grades of server applications in between. If you operate at the network layer, you’ve undoubtedly bastardized the term to mean several different things. Personally, I’ve always tried to avoid these obvious and tangible answers to this question, looking beyond the technology. My view of what an application is stems from a decade of studying the least visible, and least tangible aspect of an application, its programming interface. When talking to people about applications, the first question I ask folks is usually, “do you know what an API is”? If someone is API savvy I will move to asking, “when it comes to the application programming interface (API), who or what is being programmed? Is it the platform? The application? Or, is it the end-user of the applications?” I’ve spent a decade thinking about this question, playing it over and over in my head, never quite being satisfied with what I find. Honestly, the more I scratch, the more concerned I get, and the more I’m unsure of exactly what an “application” is, and precisely who or what is actually being programmed. Let’s further this line of thinking by looking at the definitions of “application”:

- noun - The act of applying.
- noun - The act of putting something to a special use or purpose.
- noun - A specific use to which something is put.
- noun - The capacity of being...[Read More]


API Evangelist

The Many Ways In Which APIs Are Taken Away

17 Jul 2019

APIs are notorious for going away. There are so many APIs that disappear that I really stopped tracking on it as a data point. I used to keep track of APIs that were shuttered so that I could play a role in the waves of disgruntled pitchfork mobs rising up in their wake. It used to be a great way to build up your Hacker News points! But, after riding a couple hundred waves of APIs shuttering, you begin to not really give a shit anymore, growing numb to it all. API deprecation grew so frequent, I wondered why anyone would make the claim that once you start an API you have to maintain it forever. Nope, you can shut down anytime. Clearly. In the real world, APIs going away is a fact of life, but it is rarely a boolean value, or black and white. There are many high profile API disappearances and uprisings, but there are also numerous ways in which some API providers giveth, and then taketh away from API consumers:

- Deprecation - APIs disappear regularly, both communicated and not so communicated, leaving consumers scratching their heads.
- Disappear - Companies regularly disappear specific API endpoints, acting like they were never there in the first place.
- Acquisition - This is probably one of the most common ways in which high profile, successful APIs disappear.
- Rate Limits - You can always rate limit users, making APIs inaccessible, or barely usable, essentially making them go away.
- Error Rates - Injecting elevated error rates, either intentionally or unintentionally, can make an API unusable to everyone, or to a select audience.
- Pricing Tiers - You can easily be priced out of access to an API, making it act just like a deprecation for a specific group.
- Versions - With rapid versioning usually comes rapid deprecation of earlier versions, moving faster than some consumers can handle.
- Enterprise - APIs moving from the free or paid tier into the magical enterprise,...[Read More]


API Evangelist

Paying for API Access

17 Jul 2019

APIs where I can’t pay for more access grind my gears. I am looking at you GitHub, Twitter, Facebook, and a few others. I spend $250.00 to $1500.00 a month on my Amazon bill, depending on what I’m looking to get done. I know I’m not the target audience for all of these platforms, but I’m guessing there is a lot more money on the table than is being acknowledged. I’m guessing that the reason companies don’t cater to this is that there are larger buckets of money involved in what they are chasing behind the scenes. Regardless, there isn’t enough money coming my way to keep my mouth shut, so I will keep bitching about this one, alongside the inaccessible pricing tiers startups like to employ as well. I’m going to keep kvetching about API pricing until we are all plugged into the matrix. Unless the right PayPal payment lands in my account, then I’ll shut up. ;-) I know. I know. I’m not playing in all your reindeer startup games, and I don’t understand the masterful business strategy you’ve all crafted to get rich. I’m just trying to do something simple like publish data to GitHub, or do some basic research on an industry using Twitter. I know there are plenty of bad actors out there who also want access to your data, but it is all something that could be remedied with a little pay as you go pricing, applied to some base unit of cost for your resources. If I could pay for more Twitter and GitHub requests without having to be in the enterprise club, I’d be very appreciative. I know that Twitter has begun expanding into this area, but it is something that is priced out of my reach, and not the simple pay as you go pricing I prefer with AWS, Bing, and other APIs I happily spend money on. If you can’t apply a unit of value...[Read More]


API Evangelist

Imperative, Declarative, and Workflow APIs

16 Jul 2019

At every turn in my API work I come across folks who claim that declarative API solutions are superior to imperative ones. They want comprehensive, single implementation, do it all their way approaches, over granular, multiple implementation API calls that are well defined by the platform. Declarative calls allow you to define a single JSON or YAML declaration that can then be consumed to accomplish many things, abstracting away the complexity of doing those many things, and just getting it done. Imperative API interfaces require many individual API calls to tweak each and every knob or dial on the system, something often viewed as more cumbersome by a seasoned integrator, but for newer, lower level integrators a well designed imperative API can be an important lifeline. Declarative APIs are almost always positioned against imperative APIs. Savvier, more experienced developers almost always want declarations. Whereas newer developers, and those without a full view of the landscape, often prefer well designed imperative APIs that do one thing well. From my experience, I always try to frame the debate as imperative and declarative, where the most vocal developers on the subject prefer to frame it as declarative vs imperative. I regularly have seasoned API developers “declare” that I am crazy for defining every knob and dial of an API resource, without any regard for use cases beyond what they see. They know the landscape, and don’t want to be burdened with having to pull every knob and dial, just give them one interface to declare everything they desire. A single endpoint with a massive JSON or YAML POST, or abstract it all away with an SDK, Ansible, GraphQL, or a Terraform solution. Demanding that a platform meet their needs, without ever considering how more advanced solutions are delivered and executed, or the lower level folks who are onboarding with a platform, and may not have the same view of what is required to...[Read More]
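The two styles above can be sketched side by side. The hypothetical provisioning client below is not any real platform’s API, just an illustration of granular imperative calls versus one consumed declaration.

```python
# Hypothetical client illustrating imperative vs declarative styles.
class Client:
    def __init__(self):
        self.state = {}

    # Imperative: one well-defined call per knob and dial.
    def set_region(self, region): self.state["region"] = region
    def set_size(self, size): self.state["size"] = size
    def attach_disk(self, gb): self.state.setdefault("disks", []).append(gb)

    # Declarative: one call consumes a whole declaration and
    # abstracts away the individual steps.
    def apply(self, declaration):
        for key, value in declaration.items():
            self.state[key] = value

# Imperative consumer: granular, explicit, learnable one call at a time.
a = Client()
a.set_region("us-west")
a.set_size("large")
a.attach_disk(100)

# Declarative consumer: a single JSON/YAML-style declaration, all at once.
b = Client()
b.apply({"region": "us-west", "size": "large", "disks": [100]})
```

Both consumers arrive at the same state; the debate is really about who carries the cognitive load of getting there, which is why framing it as imperative *and* declarative, rather than versus, matters.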


API Evangelist

Hoping For More Investment In API Design Tooling

16 Jul 2019

I was conducting an accounting of my API design toolbox, and realized it hasn’t changed much lately. It is still a very manual suite of tooling, and sometimes services, that help me craft my APIs. There are some areas I am actively investing in when it comes to API design, but honestly there really isn’t that much new out there to use. To help me remember how we got here, I wanted to take a quick walk through the history of API design, and check in on what is currently available when it comes to investing in your API design process. API design has historically meant REST. Many folks still see it this way. While there have been plenty of books and articles on API design for over a decade, I attribute the birth of API design to Jakub and Z at Apiary (https://apiary.io). I believe they first cracked open the seed of API design, and the concept of API design first. Which is one of the reasons I was so distraught when Oracle bought them. But we won’t go there. The scars run deep, and where has it got us? Hmm? Hmm?? Anyways, they set into motion an API design renaissance which has brought us a lot of interesting thinking on API design, a handful of tooling and services, some expansion on what API design means, but ultimately not a significant amount of movement overall. Take a look at what AP+I Design (https://www.apidesign.com/) and API Plus (https://apiplus.com/) have done for architecture, what API has done for the oil and gas industry (https://www.api.org/), and what API4Design has done for the packaging industry (http://api4design.com/). I am kidding. Don’t get me wrong, good things have happened. I am just saying we can do more. The brightest spot that represents the future for me is over at Stoplight.io - they are not just moving forward API design, they are also investing in the full API lifecycle, including governance. I rarely plug startups,...[Read More]


API Evangelist

What Is An API Contract?

15 Jul 2019

I am big on regularly interrogating what I mean when I use certain phrases. I’ve caught myself repeating and reusing many hollow, empty, and meaningless phrases over my decade as the API Evangelist. One of these phrases is “an API contract”. I use it a lot. I hear it a lot. What exactly do we mean by it? What is an API contract, and how is it different or similar to our beliefs and understanding around other types of contracts? Is it truth, or is it just a way to convince people that what we are doing is just as legitimate as what came before? Maybe it is even more legitimate, like in a blockchain kind of way? It is an irreversible, unbreakable, digital contract, think bro! If I were to break down what I mean when I say API contract, I’d start with being able to establish a shared articulation of what an API does. We have an OpenAPI definition which describes the surface area of the request and response of each individual API method being offered. It is available in a machine and human readable format for both of us to agree upon. It is something that both API provider and API consumer can agree upon, and get to work developing and delivering, and then integrating and consuming. An API contract is a shared understanding of what the capabilities of a digital interface are, allowing for applications to be programmed on top of it. After an API contract establishes a shared understanding, I’d say that an API contract helps mitigate change, or at least communicates it, again in a human and machine readable way. It is common practice to semantically version your API contracts, ensuring that you won’t remove or rename anything within a minor or patch release, committing to only changing things in a big way with each major release. Providing an OpenAPI of each version ahead of time, allowing consumers to review that...[Read More]
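The semantic versioning promise can be stated as a rule small enough to fit in a few lines. Here is a sketch, assuming a hypothetical helper that checks whether a set of removed fields is allowed between two contract versions.

```python
# Sketch of the semver promise an API contract makes: removals or
# renames only arrive with a major version bump. Helper is hypothetical.
def parse(version):
    major, minor, patch = (int(p) for p in version.split("."))
    return major, minor, patch

def change_is_allowed(old_version, new_version, removed_fields):
    """Removing anything from the contract requires a major release."""
    old_major = parse(old_version)[0]
    new_major = parse(new_version)[0]
    if removed_fields and new_major <= old_major:
        return False
    return True

assert change_is_allowed("1.2.0", "1.3.0", removed_fields=[])          # additive, fine
assert not change_is_allowed("1.2.0", "1.3.0", removed_fields=["id"])  # breaking in a minor
assert change_is_allowed("1.2.0", "2.0.0", removed_fields=["id"])      # breaking, but major
```

Wired into a CI pipeline against two OpenAPI definitions, a check like this is what turns the contract from a metaphor into something enforceable.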


API Evangelist

My Primary API Search Engines

12 Jul 2019

I am building out several prototypes for the moving parts of an API search engine I want to build, pushing my usage of APIs.json and OpenAPI, but also trying to improve how I define, store, index, and retrieve valuable data about thousands of APIs through a simple search interface. I’m breaking out the actual indexing and search into their own areas, with the rating system being another separate dimension, but even before I get there I have to actually develop the primary engines for my search prototypes, feeding the indexes with fresh signals of where APIs exist across the online landscape. There isn’t an adequate search engine out there, so I’m determined to jumpstart the conversation with an API search engine of my own. Something that is different from what web search engines do, and tailored to the unique mess we’ve created within the API industry. My index of APIs.json and OpenAPI definitions, even with a slick search interface, is just a catalog, directory, or static index of a small piece of the APIs that are out there. I see a true API search engine as three parts:

- The Humans Searching for APIs - Providing humans with a web application to search for new and familiar APIs.
- The Search Engine Searching For APIs - Ensuring that the search engine is regularly searching for new APIs.
- Other Systems Searching For APIs - Providing an API for other systems to search for new and familiar APIs.

Without the ability for the search engine to actually seek out new APIs, it isn’t a search engine in my opinion, it is a search application. Without an API for searching for APIs, in my opinion, it isn’t an API search engine. It takes all three of these areas to make an application a true API search engine, otherwise we just have another catalog, directory, marketplace, or whatever you want to call it. To help me put the engine into my API search engine,...[Read More]


API Evangelist

Taking A Fresh Look At The Nuance Of API Search

11 Jul 2019

I have a mess of APIs.json and OpenAPI definitions I need to make sense of. It is something where I could easily fire up an ElasticSearch instance, point it at my API “data lake”, and begin defining facets and angles for making sense of what is in there. I’ve done this with other datasets, but I think this round I’m going to go a more manual route. Take my time to actually understand the nuance of API search over other types of search, take a fresh look at how I define and store API definitions, but also how I search across a large volume of data to find exactly the API I am looking for. I may end up going back to a more industrial grade solution in the end, but I am guessing I will at least learn a lot along the way. I am using API standards as the core of my API index: APIs.json and OpenAPI. I’m importing other formats like API Blueprint, Postman, RAML, HAR, and others, but the core of my index will be APIs.json and OpenAPI. This is where I feel solutions like ElasticSearch might overlook some of the semantics of each standard, and I may not immediately be able to dial in on the preciseness of the APIs.json schema when it comes to searching API operations, and the OpenAPI schema when it comes to searching the granular details of what each individual API method delivers. While this process may not get me to my end solution, I feel like it will allow me to more intimately understand each data point within my API index, in a way that helps me dial in exactly the type of search I envision. The first dimensions of my API search index are derived from the APIs.json schema properties I use to define every entity within my API search index:

- Name - The name of a company, organization, institution, or government agency.
- Description - The details of what a particular...[Read More]
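To ground the dimensions above, here is a trimmed-down sketch of an APIs.json-style index entry and a naive search across its name and description properties. The entry is a simplified fragment with made-up values, not the full specification.

```python
# Simplified APIs.json-style entry (hypothetical values, trimmed fields).
index_entry = {
    "name": "Example Agency",
    "description": "The details of what a particular entity does.",
    "url": "https://example.gov/apis.json",
    "tags": ["government", "open data"],
    "apis": [
        {
            "name": "Facilities API",
            "description": "Lookup of public facilities.",
            "humanURL": "https://example.gov/developers",
            "baseURL": "https://api.example.gov/v1",
        }
    ],
}

def search(entries, term):
    """Naive search across the name and description dimensions only."""
    term = term.lower()
    return [
        e["name"] for e in entries
        if term in e["name"].lower() or term in e["description"].lower()
    ]
```

Even this toy version shows why a generic full-text index can miss the semantics: the entity-level properties and the per-API properties carry different meanings, and a search that flattens them loses exactly the preciseness discussed above.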


API Evangelist

The JSON Schema Tooling In My Life

10 Jul 2019

I am always pushing for more schema order in my life. I spend way too much time talking about APIs, when a significant portion of the API foundation is schema. I don’t have as many tools to help me make sense of my schema, and to improve them as definitions of meaningful objects. I don’t have the ability to properly manage and contain the growing number of schema objects that pop up in my world on a daily basis, and this is a problem. There is no reason I should be making schema objects available to other consumers if I do not have a full handle on what schema objects exist, let alone a full awareness of everything that has been defined when it comes to the role that each schema object plays in my operations. To help me better understand the landscape when it comes to JSON Schema tooling, I wanted to take a moment and inventory the tools I have bookmarked and regularly use as part of my daily work with JSON Schema:

- JSON Schema Editor - https://json-schema-editor.tangramjs.com/ - An editor for JSON Schema.
- JSON Schema Generator - https://github.com/jackwootton/json-schema - Generates JSON Schema from JSON.
- JSON Editor - https://json-editor.github.io/json-editor/ - Generates a form and JSON from JSON Schema.
- JSON Editor Online - https://github.com/josdejong/jsoneditor/ - Allows me to work with JSON in a web interface.
- Another JSON Schema Validator (AJV) - https://github.com/epoberezkin/ajv - Validates my JSON using JSON Schema.

I am going to spend some time consolidating these tools into a single interface. They are all open source, and there is no reason I shouldn’t be localizing their operation, and maybe even evolving and contributing back. This helps me understand some of my existing needs and behavior when it comes to working with JSON Schema, which I’m going to use to seed a list of my JSON Schema needs, and drive a road map for things I’d like to see developed. Getting a little more structure...[Read More]
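For a feel of what the validators in that list are doing under the hood, here is a hand-rolled sketch covering only a few JSON Schema keywords (`type`, `required`, `properties`). Real work should use a full validator like AJV or the Python `jsonschema` package; this toy exists purely to make the mechanics visible.

```python
# Toy JSON Schema validator: handles type, required, and properties only.
def validate(instance, schema):
    errors = []
    expected = schema.get("type")
    type_map = {"object": dict, "string": str, "integer": int, "array": list}
    if expected and not isinstance(instance, type_map[expected]):
        errors.append(f"expected type {expected}")
        return errors
    if expected == "object":
        for field in schema.get("required", []):
            if field not in instance:
                errors.append(f"missing required property: {field}")
        for field, subschema in schema.get("properties", {}).items():
            if field in instance:
                errors.extend(validate(instance[field], subschema))
    return errors

schema = {
    "type": "object",
    "required": ["name"],
    "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
}
assert validate({"name": "Kin", "age": 50}, schema) == []
assert validate({"age": "fifty"}, schema) == [
    "missing required property: name", "expected type integer"]
```

Getting a handle on schema objects starts with being able to mechanically answer “is this instance what the schema says it is”, which is exactly the question this loop asks.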


API Evangelist

Navigating API Rate Limit Differences Between Platforms

10 Jul 2019

I always find an API provider’s business model to be very telling about the company’s overall strategy when it comes to APIs. I’m currently navigating the difference between two big API providers, trying to balance my needs spread across very different approaches to offering up API resources. I’m working to evolve and refine my API search algorithms, and I find myself having to do a significant amount of work due to the differences between GitHub and Microsoft Search. Ironically, they are both owned by the same company, but we all know their business models are seeking alignment as we speak, and I suspect my challenges with the GitHub API are probably a result of this alignment. The challenges with integrating with GitHub and Microsoft APIs are pretty straightforward, and something I find myself battling regularly when integrating with many different APIs. My use of each platform is pretty simple. I am looking for APIs. The solutions are pretty simple, and robust. I can search for code using the GitHub Search API, and I can search for websites using the Bing Search API. Both produce different types of results, but what both produce is of value to me. The challenge comes in when I can pay for each API call with Bing, and I do not have that same option with GitHub. I am also noticing much tighter restrictions on how many calls I can make to the GitHub APIs. With Bing I can burst, depending on how much money I want to spend, but with GitHub I have no relief valve. I can only make X calls a minute, per IP, per user. This is a common disconnect in the world of APIs, and something I’ve written a lot about. GitHub (Microsoft) has a more “elevated” business model, with the APIs being just an enabler of that business model. Whereas Bing (Microsoft) is going with a much more straightforward API monetization strategy: pay for what you use. In...[Read More]
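The two rate limit models being navigated here can be sketched as a fixed calls-per-window limit (the GitHub style described above) versus a pay-per-call budget that allows bursting (the Bing style). The limits and prices below are made-up numbers, purely illustrative.

```python
# Two client-side rate limit models, with illustrative numbers only.
class FixedWindowLimit:
    """You get N calls per window, with no relief valve."""
    def __init__(self, limit_per_window):
        self.limit = limit_per_window
        self.used = 0

    def allow(self):
        if self.used >= self.limit:
            return False
        self.used += 1
        return True

class PayPerCallBudget:
    """Burst as much as you like, as long as the budget holds out."""
    def __init__(self, budget_mills, price_mills):
        # Integer tenths of a cent, keeping the arithmetic exact.
        self.budget = budget_mills
        self.price = price_mills

    def allow(self):
        if self.budget < self.price:
            return False
        self.budget -= self.price
        return True

fixed_style = FixedWindowLimit(limit_per_window=30)
paid_style = PayPerCallBudget(budget_mills=5000, price_mills=5)  # $5.00 at $0.005/call

fixed_calls = sum(1 for _ in range(2000) if fixed_style.allow())
paid_calls = sum(1 for _ in range(2000) if paid_style.allow())
```

Under the fixed window you hit the wall at 30 calls no matter what; under the budget model the only wall is how much you are willing to spend, which is exactly the disconnect between the two business models.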


API Evangelist

The Details Of My API Rating Formula

09 Jul 2019

Last week I put some thoughts down about the basics of my API rating system. This week I want to go through each of those basics, and try to flesh out the details of how I would gather the actual data needed to rank API providers. This is a task I’ve been through with several different companies, only to have it abandoned, and then operated on my own for about three years, only to abandon it once I ran low on resources. I’m working to invest more cycles into actually defining my API rating in a transparent and organic way, then applying it in a way that allows me to continue evolving, while also using it to make sense of the APIs I am rapidly indexing. First, I want to look at the API-centric elements I will be considering when looking at a company, organization, institution, government agency, or other entity, and trying to establish some sort of simple rating for how well they are doing APIs. I’ll be the first to admit that rating systems are kind of bullshit, and are definitely biased, and hold all kinds of opportunity for gaming, but I need something. I need a way to articulate in real time how good of an API citizen an API provider is. I need a way to rank the searches for the growing number of APIs in my API search index. I need a list of questions I can ask about an API in either a manual, or hopefully automated way:

- Active / Inactive - APIs that have no sign of life need a lower rating.
- HTTP Status Code - Do I get a positive HTTP status code back when I ping their URL(s)?
- Active Blog - Does their blog have regular activity on it, with relevant and engaging content?
- Active Twitter - Is there a Twitter account designated for the API, and is it playing an active role in its operations?
- Active GitHub -...[Read More]
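Turning yes/no questions like these into a number can be sketched simply. The signals and weights below are hypothetical, chosen purely to illustrate the mechanics rather than to propose an actual formula.

```python
# Sketch: weighted yes/no signals rolled up into a simple rating.
# Signal names and weights are hypothetical.
SIGNALS = {
    "active": 3,          # signs of life
    "http_ok": 2,         # positive HTTP status code on ping
    "active_blog": 1,
    "active_twitter": 1,
    "active_github": 1,
}

def rate(provider):
    """Score a provider as a fraction of the maximum possible points."""
    earned = sum(w for s, w in SIGNALS.items() if provider.get(s))
    return round(earned / sum(SIGNALS.values()), 2)

score = rate({"active": True, "http_ok": True, "active_blog": False,
              "active_twitter": True, "active_github": True})
```

Keeping the signal list and weights in a plain dictionary is what makes the rating transparent: anyone can see exactly why a provider scored what it did, and argue with the weights instead of the output.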


API Evangelist

Thinking Differently When Approaching OpenAPI Diffs And Considering How To Layer Each Potential Change

08 Jul 2019

I have a lot of OpenAPI definitions, covering about 2,000 separate entities. For each entity I often have multiple OpenAPIs, and I am finding more all the time. One significant challenge I have in all of this centers around establishing a master “truth” OpenAPI, or series of definitive OpenAPIs, for each entity. I can never be sure that I have a complete definition of any given API, so I want to keep vacuuming up any OpenAPI, Swagger, Postman, or other artifact I can, and compare it with the “truth” copy I have indexed. Perpetually layering in the additions and changes I come across while scouring the Internet for signs of API life. This perpetual update of API definitions in my index isn’t easy, and any tool that I develop to assist me will need constant refinement and evolution to be able to make sense of the API fragments I’m finding across the web. There are many nuances of API design, as well as nuances in how the OpenAPI specification is applied when quantifying the design of an API, making the process of doing a “diff” between two OpenAPI definitions very challenging. Rendering the common “diff” tools baked into GitHub and other solutions ineffective when it comes to understanding the differences between two API definitions that may represent a single API. These are some of the things I’m considering as I’m crafting my own OpenAPI “diff” tooling:

- Host - How the host is stored, defined, and applied across sandbox, production, and other implementations injects challenges.
- Base URL - How OpenAPIs define their base URL versus their host will immediately cause problems in how diffs are established.
- Path - Adding even more instability, many paths will often conflict with the host and base URL, providing different fragments that show up as differences.
- Verbs - Next I take account of the verbs available for any path, understanding what the differences are in the methods applied.
- Summary - Summaries are...[Read More]
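The host and base URL considerations above can be sketched as a normalization step run before the diff. The fragments below are trimmed-down, hypothetical definitions using Swagger 2.0-style `host` and `basePath` fields, just to show why deployment-specific noise must be stripped first.

```python
# Sketch: normalize two OpenAPI fragments before diffing, so host and
# basePath differences do not register as false changes.
def normalized_operations(openapi):
    """Reduce a definition to a set of 'VERB /path' keys, ignoring
    host, basePath, and other deployment-specific noise."""
    ops = set()
    for path, methods in openapi.get("paths", {}).items():
        for verb in methods:
            ops.add(f"{verb.upper()} {path}")
    return ops

production = {
    "host": "api.example.com",
    "basePath": "/v1",
    "paths": {"/images": {"get": {}, "post": {}}},
}
sandbox = {
    "host": "sandbox.example.com",
    "basePath": "/",
    "paths": {"/images": {"get": {}}},
}

# Only the genuinely different operation surfaces in the diff.
added = normalized_operations(production) - normalized_operations(sandbox)
```

A text-level diff of these two fragments would flag the host and basePath lines as changes; the normalized set diff reports only that production has a POST the sandbox lacks, which is the difference that actually matters.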


API Evangelist

Why The Open Data Movement Has Not Delivered As Expected

05 Jul 2019

I was having a discussion with my friends working on API policy in Europe about API discovery, and the topic of failed open data portals came up. It is a regular recurring undercurrent I have to navigate in the world of APIs. Open data is a subset of the API movement, and something I have first-hand experience in, having built many open data portals, contributed to city, county, state, and federal open data efforts, and most notably ridden the open data wave into the White House, working on open data efforts for the Obama administration. Today, there are plenty of open data portals. The growth in the number of portals hasn’t decreased, but I’d say the popularity, utility, and publicity around open data efforts has not lived up to the hype. Why is this? I think there are many dimensions to this discussion, and few clear answers when it comes to peeling back the layers of this onion, something that always makes me tear up.

- Nothing There To Begin With - Open data was never a thing, and never will be a thing. It was fabricated as part of an early wave of the web, and really never got traction because most people do not care about data, let alone it being open and freely available.
- It Was Just Meant To Be A Land Grab - The whole open data thing wasn’t about open data for all, it was meant to be open for business for a few, and they have managed to extract the value they needed, enrich their own datasets, and have moved on to greener pastures (AI / ML).
- No Investment In Data Providers - One of the inherent flaws of the libertarian-led vision of web technology is that government is bad, so don’t support it with taxes. Of course, when they open up data sets that is good for us, but supporting them in covering compute, storage, bandwidth, and...[Read More]


API Evangelist

API Interoperability is a Myth

03 Jul 2019

There are a number of concepts we cling to in the world of APIs. I’ve been guilty of inventing, popularizing, and spreading many myths in my almost decade as the API Evangelist. One that I’d like to debunk, and be more realistic about, is API interoperability. When you are focused on just the technology of APIs, and maybe the low-level business of APIs, you are an API interoperability believer. Of course everyone wants API interoperability, and all APIs should work seamlessly together. However, if you begin operating at the higher levels of the business of APIs at all, and spend any amount of time studying the politics of why and how we do APIs at scale, you will understand that API interoperability is a myth. This reality is one of the reasons us technologists, who possess just a low-level understanding of how the business behind our tech operates, are such perfect tools for the higher level business thinkers, and the people who successfully operate and manipulate at the higher levels of industries, or even at the policy level. We are willing to believe in API interoperability, and work to convince our peers that it is a thing, and we all work to expose, and open up the spigots across our companies, organizations, institutions, and government agencies. Standardized APIs and schema that play nicely with each other are valuable, but only within certain phases of a company’s growth, or as part of a myth-information campaign to convince the markets that a company is a good corporate citizen. However, once a company achieves dominance, or the winds change around particular industry trends, most companies just want to ensure that all roads lead towards their profitability. Believing in API interoperability without a long term strategy means you are opening up your company to value extraction by your competitors. I don’t care how good your API management is, if your API...[Read More]


API Evangelist

Your API and Schema Is That Complex Because You Have Not Done The Hard Work To Simplify

02 Jul 2019

I find myself looking at a number of my more complex API designs, and saying to myself, “this isn’t complicated because it is hard, it is complicated because I did not spend the time required to simplify it appropriately”. There are many factors contributing to this reality, but I find that more often than not it is because I’m over-engineering something, caught up in the moment focusing on a purely computational approach, and not considering the wider human, business, and other less binary aspects of delivering APIs. While I am definitely my own worst enemy in many API delivery scenarios, I’d say there is a wide range of factors influencing how well, or poorly, I design my API resources, with just a handful of them being:

- Domain - I just do not have the domain knowledge required to get the job done properly.
- Consumer - I just do not have the knowledge I need of my end consumers to do things right.
- Bandwidth - I just do not have the breathing room to properly sit down and make it happen.
- Narcissism - I am the best at this, I know what is needed, and I deem this complexity necessary.
- Lazy - I am just too damn lazy to actually dig in and get this done properly in the first place.
- Caring - I just do not give a shit enough to actually go the extra distance with API design.
- Dumb - This API is dumb, and I really should not be developing it in the first place.

These are just a few of the reasons why I settle for complexity over simplicity in my API designs. It isn’t right. However, it seems to be a repeating pattern in some of my work. It is something that I should be exploring more. For me to understand why my work isn’t always of the highest quality possible I need to explore each...[Read More]


API Evangelist

The Complexity of API Discovery

01 Jul 2019

I can’t get API discovery out of my mind. Partly because I am investing significant cycles in this area at work, but it is also something I have been thinking about for so long, that it is difficult to move on. It remains one of the most complex, challenging, and un-addressed aspects of the way the web is working (or not working) online today. I feel pretty strongly that there hasn’t been investment in the area of API discovery because most technology companies providing and consuming APIs prefer that things be un-discoverable, for a variety of conscious and un-conscious reasons.
 What API Discovery Means Depends On Who You Are… One of the reasons that API discovery does not evolve in any significant ways is because there is no real clarity on what API discovery is. Depending on who you are, and what your role in the technology sector is, you’ll define API discovery in a variety of ways. There are a handful of key actors that contribute to the complexity of defining, and optimizing in the area of API discovery. 
 Provider - As an API provider, being able to discover your APIs informs your operations regarding what your capabilities are, building a wider awareness regarding what a company, organization, institution, or government agency does, helping eliminate inefficiencies, and allowing for more meaningful decisions to be made at run-time across operations. Consumer - What you need as an internal consumer of APIs, or maybe a partner, or 3rd party developer will significantly shift the conversation around what API discovery means, and how it needs to be “solved”. There is another significant dimension to this discussion, separating human from other system consumption, further splintering the discussion around what API discovery is when you are a consumer of APIs. Analyst - Analysts covering specific industries, or technology in general, need to understand the industries they are watching, and how API tooling is...[Read More]


API Evangelist

The Basics of My API Rating Formula

01 Jul 2019

I have been working on various approaches to rating APIs since about 2012. I have different types of algorithms, even having invested in operating one from about 2013 through 2016, which I used to rank my daily intake of API news. Helping me define what was the cream on top of each industry being impacted by APIs, while also not losing sight of interesting newcomers to the space. I have also had numerous companies and VCs approach me about establishing a formal API rating system—many of whom concluded they could do it fairly easily and went off to try, then failed, and gave up. Rating the quality of APIs is subjective and very hard. When it comes to rating APIs I have a number of algorithms to help me, but I wanted to step back and think of it from a simpler human vantage point, and after establishing a new overall relationship with the API industry. What elements do I think should exist within a rating system for APIs: Active / Inactive - APIs that have no sign of life need a lower rating. Free / Paid - What something costs impacts our decision to use or not. Openness - Is an API available to everyone, or is it a private thing. Reliability - Can you depend on the API being up and available. Fast Changing - Does the API change a lot, or remain relatively stable. Social Good - Does the API benefit a local, regional, or wider community. Exploitative - Does the API exploit its users data, or allow others to do so. Secure - Does an API adequately secure its resources and those who use it. Privacy - Does an API respect privacy, and have a strategy for platform privacy. Monitoring - Does a platform actively monitor its platform and allow others as well. Observability - Is there visibility into API platform operations, and its processes. Environment - What is the environment footprint or...[Read More]
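To make the shape of such a rating system concrete, here is a minimal weighted-scoring sketch. The signal names and weights below are illustrative assumptions, not the author's actual algorithm, which treats each element as a boolean signal and reduces it to a 0-100 score.

```python
# Hypothetical API rating sketch: boolean signals, fixed weights.
# Signals and weights are illustrative, not the author's formula.
SIGNALS = {
    "active": 3,       # signs of life (recent changes, status page)
    "free_tier": 1,    # a free tier lowers the barrier to entry
    "open_access": 2,  # available to everyone, not invite-only
    "reliable": 3,     # uptime you can depend on
    "secure": 3,       # adequately secures resources and users
    "privacy": 2,      # respects user privacy
    "observable": 1,   # visibility into platform operations
}

def rate_api(api: dict) -> float:
    """Return a 0-100 score from whichever signals are present and true."""
    total = sum(SIGNALS.values())
    earned = sum(w for name, w in SIGNALS.items() if api.get(name))
    return round(100 * earned / total, 1)

example = {"active": True, "free_tier": True, "open_access": True,
           "reliable": True, "secure": True}
print(rate_api(example))  # 80.0
```

A real rating would weight negative signals like "Exploitative" as penalties rather than absences, but the reduction to a comparable score is the same idea.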


API Evangelist

Why Schema.org Does Not See More Adoption Across The API Landscape

25 Jun 2019

I’m a big fan of Schema.org. A while back I generated an OpenAPI 2.0 (fka Swagger) definition for each of its schema types and published them to GitHub. I’m currently cleaning up the project, publishing them as OpenAPI 3.0 files, and relaunching the site around it. As I was doing this work, I found myself thinking more about why Schema.org isn’t the go-to schema solution for all API providers. It is a challenge that is multi-layered like an onion, and probably just as stinky, and will no doubt leave you crying. First, I think tooling makes a big difference when it comes to why API providers haven’t adopted Schema.org by default across their APIs. If more API design and development tooling would allow for the creation of new APIs using Schema.org defined schema, I think you’d see a serious uptick in the standardization of APIs that use Schema.org. In my experience, I have found that people are mostly lazy, and aren’t overly concerned with doing APIs right, they are just looking to get them delivered to meet specifications. If we provide them with tooling that gets the API delivered to specifications, but also in a standardized way, they are going to do it. Second, I think most API providers don’t have the time and bandwidth to think of the big picture, like using standardized schema for all of their resources. Most people are under pressure to do more with less, and standards are something that can be easily lost in the shuffle when you are just looking to satisfy the man. It takes extra effort and space to realize common standards as part of your overall API design. This is a luxury most companies, organizations, and government agencies can not afford, resulting in many ad hoc APIs defined in isolation. Third, I think some companies just do not care about interoperability, resulting in them being lazy, or not making it a priority as stated in the first and second points. Most...[Read More]


API Evangelist

Avoiding Complexity and Just Deploying YAML, JSON, and CSV APIs Using GitHub or GitLab

24 Jun 2019

I find that a significant portion of what I should be doing when defining, designing, developing, and delivering an API is about avoiding complexity. Every step along the API journey I am faced with opportunities to introduce complexity, forcing me to constantly question and say no to architectural design decisions. Even after crafting some pretty crafty APIs in my day, I keep coming back to JSON or YAML within Git as the most sensible API architectural decision I can make. Git, with JSON and YAML stored within a repository, fronted with a Jekyll front-end, does much of what I need. The challenge with selling this concept to others is that it is a static publish approach to APIs, instead of a dynamic pull of relevant data. This approach isn’t for every API solution, and I’m not in the business of selling one API solution to solve all of our problems. However, for many of the API uses I’m building for, a static Git-driven approach to publishing machine readable JSON or YAML data is a perfect low cost, low tech solution to delivering APIs. A Git repository hosted on GitHub or GitLab will store a significant amount of JSON, YAML, or CSV data. Something you can easily shard across multiple repositories within an organization / group, as well as across many repositories within many organizations / groups. Both GitHub and GitLab offer free solutions, essentially letting you publish as many repositories as you want. As I said earlier, this is not a solution to all API needs, but when I’m looking to introduce some constraints to keep things low cost, simple, and easy to use and manage—a Git-driven API is definitely worth considering. However, going static for your API will force you to think about how you automate the lifecycle of your data, content, and other resources. The easiest way to manage JSON, CSV, or YAML data you have on GitHub or GitLab is to use the...[Read More]
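The static publish approach described above can be sketched in a few lines: render each record to its own JSON file, plus an index, inside a repository that GitHub Pages or GitLab Pages will then serve as-is. The directory layout and field names here are illustrative assumptions, not a prescribed structure.

```python
# Minimal sketch of a Git-driven static API: one JSON file per record,
# plus an index file, written into a repo that Pages will serve as-is.
# Paths and fields are illustrative assumptions.
import json
from pathlib import Path

def publish_static_api(records: list, out_dir: str) -> list:
    """Write api/{id}.json for each record plus api/index.json; return paths."""
    api_dir = Path(out_dir) / "api"
    api_dir.mkdir(parents=True, exist_ok=True)
    written = []
    for record in records:
        path = api_dir / f"{record['id']}.json"
        path.write_text(json.dumps(record, indent=2))
        written.append(str(path))
    index = api_dir / "index.json"
    index.write_text(json.dumps([r["id"] for r in records], indent=2))
    written.append(str(index))
    return written

# Committing and pushing these files is the entire "deploy" step:
#   git add api/ && git commit -m "publish" && git push
```

The trade-off is exactly the one the post names: consumers get plain GET requests against static files, and every change to the data is a commit rather than a database write.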


API Evangelist

Organizing My APIs Using OpenAPI Tags

19 Jun 2019

I like my OpenAPI tags. Honestly, I like tags in general. Almost every API resource I design ends up having some sort of tagging layer. To help me organize my world, I have a centralized tagging vocabulary that I use across my JSON Schema, OpenAPI, and AsyncAPI, to help me group, organize, filter, publish, and maintain my catalog of API and schema resources. The tag object for the OpenAPI specification is pretty basic, allowing you to add tags for an entire API contract, as well as apply them to each individual API method. Tooling, such as API documentation, uses these tags to group your API resources, allowing you to break down your resources into logical bounded contexts. It is a pretty basic way of defining tags, that can go a long way depending on how creative you want to get. I am extending tags with an OpenAPI vendor extension, but I also see that there is an issue submitted suggesting they move the specification forward by allowing for the nesting of tags–potentially taking OpenAPI tagging to the next level. I’m allowing for a handful of extensions to the OpenAPI specification to accomplish the following: Tag Grouping - Help me nest, and build multiple tiers of tags for organizing APIs. Tag Sorting - Allowing me to define a sort order that goes beyond an alphabetical list. I am building listing, reporting, and other management tools based upon OpenAPI tagging to help me in the following areas: Tag Usage - Reporting how many resources are available for each tag, and tag group. Tag Cleanup - Helping me de-dupe, rename, deal with plural challenges, etc. Tag Translations - Translating old tags into new tags, helping keep things meaningful. Tag Clouds - Generating D3.js tag clouds from the tags applied to API resources. Packages - Deployment of NPM packages based upon different bounded contexts defined by tags. I am applying tags to the following specifications, stretching my OpenAPI tagging...[Read More]
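A tag usage report like the one described above is straightforward to build, because OpenAPI 3.0 puts tags as a string array on each operation. This sketch walks the `paths` object and counts operations per tag; the sample document is made up for illustration.

```python
# Sketch of a tag-usage report over an OpenAPI 3.0 document.
# Operation objects carry a "tags" array per the OpenAPI spec;
# the sample paths below are illustrative.
from collections import Counter

HTTP_METHODS = {"get", "put", "post", "delete", "patch", "options", "head"}

def tag_usage(openapi: dict) -> Counter:
    """Count how many operations carry each tag."""
    counts = Counter()
    for path_item in openapi.get("paths", {}).values():
        for method, operation in path_item.items():
            if method in HTTP_METHODS:
                counts.update(operation.get("tags", []))
    return counts

doc = {
    "paths": {
        "/images": {"get": {"tags": ["Images"]},
                    "post": {"tags": ["Images"]}},
        "/users": {"get": {"tags": ["Users"]}},
    }
}
print(tag_usage(doc))  # Counter({'Images': 2, 'Users': 1})
```

The same walk is the foundation for the cleanup and translation tooling mentioned in the post, since renames and de-duping are just transforms over the same `tags` arrays.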


API Evangelist

Doing The Hard Work To Define APIs

17 Jun 2019

Two years later, I am still working to define the API driven marketplace that is my digital self. Understanding how I generate revenue from my brand (vomits in mouth a little bit), but also fight off the surveillance capitalists from mining and extracting value from my digital existence. It takes a lot of hard work to define the APIs you depend on to do business, and quantify the digital bits that you are transacting on the open web, amongst partners, and unknowingly with 3rd parties. As an individual, I find this to be a full time job, and within the enterprise, it is something that everyone will have to own a piece of, which in reality, is something that is easier said than done. Convincing enterprise leadership of the need to be aware of every enterprise capability being defined at the network, system, or application level is a challenge, but doable. Getting consensus on how to do this at scale, across the enterprise, will be easier said than done. Identifying how the network, systems, and applications across a large organization are being accessed, and what schema, messages, and other properties are being exchanged, is not a trivial task. It is a massive undertaking to reverse engineer operations, and identify the core capabilities being delivered, then define and document them in a coherent way that can be shared with others, and included as part of an organization’s messaging campaign. Many will see the work to define all enterprise API capabilities as a futile task–something that is impossible to deliver. Many others will not see the value of doing it in the first place, and unable to realize the big picture, they will see defining of APIs and underlying schema as meaningless busy work. Even when you do get folks on-board with the importance, having the discipline to see the job through becomes a significant challenge. If morale is low within any organizational group, and team members do not have...[Read More]


API Evangelist

There Is No Single Right Way To Do APIs

16 Jun 2019

My time working in the API sector has been filled with a lot of lessons. I researched hard, paid attention, and found a number of surprising realities emerge across the API landscape. The majority of surprises have been in the shadows cast by the computational belief scaffolding I’ve been erecting since the early 1980s. A world where there have to be absolutes, known knowns, things that are black and white, or more appropriately 1 or 0. If I studied all APIs, I’d find some sort of formula for doing APIs that is superior to everyone else’s approach to doing APIs. I was the API Evangelist man–I could nail down the one right way to do APIs. (Did I mention that I’m a white male autodidact?) I was wrong. There is no one right way to do APIs. Sure, there are many different processes, practices, and tools that can help you optimize your API operations, but despite popular belief, there is no single “right way” to define, design, provide, or consume APIs. REST is not the one solution. Hypermedia is not the one solution. GraphQL is not the one solution. Publish / Subscribe is not the one solution. Event-driven is not the one solution. APIs in general are not the one solution. Anyone telling you there is one solution to doing APIs for all situations is selling you something. Understanding your needs, and what the pros and cons of different approaches are, is the only thing that will help you. If you are hyper focused on the technology, it is easy to believe in a single right way of doing things. Once you start having to deliver APIs in a variety of business sectors and domains, you will quickly begin to see your belief system in a single right way of doing things crumble. Different types of data require different types of approaches to API enablement. Different industries are knowledgeable in different ways of API enablement,...[Read More]


API Evangelist

API Definitions Are Important

12 Jun 2019

I found myself scrolling down the home page of API Evangelist and thinking about what topic(s) I thought were still the most relevant in my mind after not writing about APIs for the last six months. Hands down it is API definitions. These machine and human readable artifacts are the single most important thing for me when it comes to the APIs I’m building, and putting to work for me. Having mature, machine readable API definitions for all APIs that you depend on is essential. It also takes a lot of hard work to make happen. It is why I went API define first a long time ago, defining my APIs before I ever get to work designing, mocking, developing, and deploying my APIs. Right now, I’m heavily dependent on my: JSON Schema - Essential for defining all objects being used across API contracts. OpenAPI - Having OpenAPI contracts for all my web APIs is the default–they drive everything. AsyncAPI - Critical for defining all of my non HTTP 1.1 API contracts being provided or consumed. Postman Collections - Providing me with the essential API + environment definitions for run-time use. APIs.json - Helping me define all the other moving parts of API operations, indexing all my definitions. While there are plenty of other stops along the API lifecycle that are still relevant to me, my API definitions are hands down the most valuable intellectual property I possess. These API specifications are essential to making my world work, but there are other more formalized specifications I’d love to be able to put to work: Orchestrations - I’d like to see a more standardized, machine readable way for working with many API calls in a meaningful way. I know you can do this with Postman, and I’ve done it with OpenAPI, and like Stoplight.io’s approach, but I want an open source solution. Licensing - I am no longer actively using API Commons, but I’d like to invest...[Read More]
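The APIs.json index mentioned above is what ties the other definitions together. A minimal index might look like the following sketch; the URLs and names are made-up placeholders, and the property `type` values follow common APIs.json usage rather than any one project's conventions.

```json
{
  "name": "Example API Operations",
  "description": "Index of the APIs and definitions behind this project.",
  "url": "http://example.com/apis.json",
  "apis": [
    {
      "name": "Example API",
      "baseURL": "http://api.example.com",
      "properties": [
        {"type": "OpenAPI", "url": "http://example.com/openapi.yaml"},
        {"type": "JSONSchema", "url": "http://example.com/schema.json"},
        {"type": "PostmanCollection", "url": "http://example.com/collection.json"}
      ]
    }
  ]
}
```

Each entry in `properties` points at one of the machine readable artifacts listed in the post, which is what lets a single index drive discovery across all of them.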


API Evangelist

API Evangelist Is Open For Business

10 Jun 2019

After six months of silence I've decided to fire API Evangelist back up again. I finally reached a place where I feel like I can separate out the things that caused me to step back in the first place. Mostly, I have a paycheck now, some health insurance, and I don't have to pretend I give a shit about APIs, startups, venture capital, and the enterprise. I'm being paid well to do an API job. I can pay my rent. I can go to the doctor when my health takes a hit. My basic needs are met. Beyond that, I'm really over having to care about building an API community, making change with APIs, and counteracting all of the negative effects of APIs in the wild. I can focus on exactly what interests me about technology, and contribute to the 3% of the API universe that isn't doing yawnworthy, half-assed, or straight up shady shit. I don't have to respond to all the emails in my inbox just because I need to eat, and have a roof over my head. I don't have to jump, just because you think your API thing is the next API thing. I can just do me, which really is the founding principle of API Evangelist. Third, I got a kid to put through college, and I'm going to make y'all pay for it. So, API Evangelist is open for business. I won't be producing the number of stories I used to. I'll only be writing about things I actually find interesting, and will explore other models for generating content, traffic, and revenue. So reach out, and pitch me. I'm looking for sponsors, and open to almost anything. Don't worry, I'll be my usual honest self and tell you whether I'm interested or not, and have strong opinions on what should be said, but pitch me. I'm open for business, and I'll entertain any business offer that keeps API Evangelist moving forward...[Read More]


API Evangelist

Asking The Honest Questions When It Comes To Your API Journey

27 Nov 2018

I engage with a lot of enterprise organizations in a variety of capacities. Some are more formal consulting and workshop engagements. While others are just emails, direct messages, and random conversations in the hallways and bars around API industry events. Many conversations are free flowing, and they trust me to share my thoughts about the API space, and provide honest answers to their questions regarding their API journey. Others express apprehension, concern, and have trouble engaging with me because they are worried about what I might say about their approach to doing APIs within their enterprise organizations. Some have even told me that they’d like to formally bring me in for discussions, but they can’t get me past legal or their bosses–stating I have a reputation for being outspoken. While in Vegas today, I had breakfast with Paolo Malinverno, analyst from Gartner, who mentioned the Oscar Wilde quote, “Whenever people agree with me I always feel I must be wrong.” Referring to the “yes” culture that can manifest itself around Gartner, but also within industries and the enterprise regarding what you should be investing in as a company. People get caught up in culture, politics, and trends, and don’t always want to be asking, or be asked, the hard questions. Which is the opposite of what any good API strategist, leader, and architect should be doing. You should be equipped and ready to be asked hard questions, and be searching out the hard questions. This stance is fundamental to API success, and you will never find what you are seeking when it comes to your API journey if you do not accept that many questions will be difficult. The reality that not all API service providers truly want to help enterprise organizations genuinely solve the business challenges they face, and that many enterprise technology leaders aren’t actually concerned with truly solving real world problems, has been one of the toughest pills...[Read More]


API Evangelist

A Diverse API Toolbox Driving Hybrid Integrations Across An Event-Driven Landscape

25 Nov 2018

I’m heading to Vegas in the morning to spend two days in conversations with folks about APIs. I am not there for AWS re:Invent, or the Gartner thingy, but I guess in a way I am, because there are people there for those events, who want to talk to me about the API landscape. Folks looking to swap stories about enterprise API investment in possessing a diverse API toolbox for driving hybrid integrations in an event-driven landscape. I’m not giving any formal talks, but as with any engagement, I’m brushing up on the words I use to describe what I’m seeing across the space when it comes to the enterprise API lifecycle. The Core Is All About Doing Resource Based, Request And Response APIs Well I’m definitely biased, but I do not subscribe to popular notions that at some point REST, RESTful, web, and HTTP APIs will have to go away. We will be using web technology to provide simple, precise, useful access to data, content, and algorithms for some time to come, despite the API sector’s continued evolution, and investment trends coming and going. Sorry, it is simple, low-cost, and something a wide audience gets from both a provider and consumer perspective. It gets the job done. Sure, there are many, many areas where web APIs fall short, but that won’t stop success continuing to be defined by enterprise organizations who can do microservices well at scale. Despite relentless assaults by each wave of entrepreneurs, simple HTTP APIs driving microservices will stick around for some time to come. API Discovery And Knowing Where All Of Your API Resources Actually Are API discovery means a lot of things to a lot of people, but when it comes to delivering APIs well at scale in a multi-cloud, event-driven world, I’m simply talking about knowing where all of your API resources are. Meaning, if I walked into your company tomorrow, could you show me a complete list...[Read More]


API Evangelist

YAML API Management Artifacts From AWS API Gateway

23 Nov 2018

I’ve always been a big supporter of creating machine readable artifacts that help define the API lifecycle. While individual artifacts can originate and govern specific stops along the API lifecycle, they can also bring value when applied across other stops along the API lifecycle, and most importantly when it comes time to govern everything. The definition and evolution of individual API lifecycle artifacts is the central premise of my API discovery format APIs.json–which depends on there being machine readable elements within the index of each collection of APIs being documented, helping us map out the entire API lifecycle. OpenAPI provides us with machine readable details about the surface area of our API which can be used throughout the API lifecycle, but it lacks other details about the surface area of our API operations. So when I do come across interesting approaches to extending the OpenAPI specification that inject a machine readable artifact supporting other stops along the API lifecycle, I like to showcase what they are doing. I’ve become very fond of one within the OpenAPI export of any AWS API Gateway deployed API I’m working with, which provides some valuable details that can be used as part of both the deployment and management stops along the API lifecycle:

x-amazon-apigateway-integration:
  uri: "http://example.com/path/to/the/code/behind/"
  responses:
    default:
      statusCode: "200"
  requestParameters:
    integration.request.querystring.id: "method.request.path.id"
  passthroughBehavior: "when_no_match"
  httpMethod: "GET"
  type: "http"

This artifact is associated with each individual operation within my OpenAPI. It tells the AWS gateway how to deploy and manage my API. When I first import this OpenAPI into the gateway, it will deploy each individual path and operation, then it helps me manage it using the rest of the available gateway features.
From this OpenAPI definition I can design, then autogenerate and deploy the code behind each individual operation, then deploy each individual path and operation to the AWS API Gateway and map them to the code behind. I can do this...[Read More]
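Because the extension lives on each operation object, pulling the integration details back out of an exported OpenAPI document is a simple traversal. This sketch maps each operation to the backend URI behind it; the extension key is the real AWS one, while the sample document and function name are illustrative.

```python
# Sketch: extract the x-amazon-apigateway-integration artifact from an
# OpenAPI export, mapping each operation to its backend URI. The sample
# document is illustrative; the extension key is AWS's.
EXT = "x-amazon-apigateway-integration"

def backend_uris(openapi: dict) -> dict:
    """Map 'METHOD /path' to the integration URI behind that operation."""
    uris = {}
    for path, path_item in openapi.get("paths", {}).items():
        for method, operation in path_item.items():
            if isinstance(operation, dict) and EXT in operation:
                uris[f"{method.upper()} {path}"] = operation[EXT].get("uri")
    return uris

doc = {"paths": {"/widgets/{id}": {"get": {
    EXT: {"uri": "http://example.com/path/to/the/code/behind/",
          "httpMethod": "GET", "type": "http"}}}}}
print(backend_uris(doc))
```

A report like this is exactly the kind of governance use the post describes: one artifact, defined at deployment time, reused later to audit where every operation's code actually lives.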


API Evangelist

The API Journey

23 Nov 2018

I’ve been researching the API space full time for the last eight years, and over that time I have developed a pretty robust view of what the API landscape looks like. You can find almost 100 stops along what I consider to be the API lifecycle on the home page of API Evangelist. While not every organization has the capacity to consider all 100 of these stops, they do provide us with a wealth of knowledge generated throughout my own journey. Where I’ve been documenting what the API pioneers have been doing with their API operations, how startups leverage simple web API infrastructure, as well as how the enterprise has been waking up to the API potential in the last couple of years. Over the years I’ve tapped this research for my storytelling on the blog, and for the white papers and guides I’ve produced. I use this research to drive my talks at conferences, meetups, and the workshops I do within the enterprise. I’ve long had a schema for managing my research, tracking the APIs, companies, people, tools, repos, news, and other building blocks I track across the API universe. Now, after a year of working with them on the ground at enterprise organizations, I’m partnering with Streamdata.io (SDIO) to continue productizing my approach to the API lifecycle, which we are calling Journey, or specifically SDIO Journey. Our workshops are broken into four distinct areas of the lifecycle: Discovery (Goals, Definition, Data Sources, Discovery Sources, Discovery Formats, Dependencies, Catalog, Communication, Support, Evangelism) - Defining what your digital resources are and what your enterprise capabilities are. Design (Definitions, Design, Versioning, Webhooks, Event-Driven, Protocols, Virtualization, Testing, Landing Page, Documentation, Support, Communication, Road Map, Discovery) - Going API first, as well as API design first when it comes to the delivery of all of your API resources. 
Development (Definitions, Discovery, Virtualization, Database, Storage, DNS, Deployment, Orchestration, Dependencies, Testing, Performance, Security, Communication, Support) - Considering what is needed...[Read More]


API Evangelist

What Does The Next Chapter Of Storytelling Look Like For API Evangelist?

21 Nov 2018

I find myself refactoring API Evangelist again this holiday season. Over the last eight years of doing API Evangelist I’ve had to regularly adjust what I do to keep it alive and moving forward. As I close up 2018, I’m finding the landscape shifting underneath me once again, pushing me to begin considering what the next chapter of API Evangelist will look like. Pushing me to adjust my presence to better reflect my own vision of the world, but hopefully also find balance with where things are headed out there in the real world. I started API Evangelist in July of 2010 to study the business of APIs. As I was researching things in 2010 and 2011 I first developed what I consider to be the voice of the API Evangelist, which continues to be the voice I use in my storytelling here in 2018. Of course, it is something that has evolved and matured over the years, but I feel I have managed to remain fairly consistent in how I speak about APIs throughout the journey. It is a voice I find very natural to speak, and is something that just flows on some days whether I want it to or not, but then also something I can’t seem to find at all on other days. Maintaining my voice over the last eight years has required me to constantly adjust and fine tune, perpetually finding the frequency required to keep things moving forward. First and foremost, API Evangelist is about my research. It is about me learning. It is about me crafting stories that help me distill down what I’m learning, in an attempt to articulate to some imaginary audience, which has become a real audience over the years. I don’t research stuff because I’m being paid (not always true), and I don’t tell stories about things I don’t actually find interesting (mostly true). API Evangelist is always about me pushing my skills forward...[Read More]


API Evangelist

The Ability To Link To API Service Provider Features In My Workshops And

16 Nov 2018

All of my API workshops are machine readable, driven from a central YAML file that provides all the content and relevant links I need to deliver what I need during a single, or multi-day API strategy workshop. One of the common elements of my workshops are links out to relevant resources, providing access to services, tools, and other insight that supports whatever I’m covering in my workshop. There are two parts to this equation, 1) me knowing to link to something, and 2) being able to link to something that exists. A number of API services and tooling I use don’t follow web practices and do not provide any easy way to link to a feature, or other way of demonstrating the functionality that exists. The web is built on this concept, but along the way within web and mobile applications, we seem to have lost our understanding of this fundamental concept. There are endless situations where I’m using a service or tool, and think that I should reference it in one of my workshops, but I can’t actually find any way to reference it as a simple URL. Value buried within a JavaScript nest, operating on the web, but not really behaving like you depend on the web. Sometimes I will take screenshots to illustrate the features of a tool or service I am using, but I’d rather have a clean URL and bookmark to a specific feature on a service’s page. I’d rather give my readers, and workshop attendees the ability to do what I’m talking about, not just hear me talk about it. In a perfect world, every feature of a web application would have a single URL to locate said feature. Allowing me to more easily incorporate features into my storytelling and workshops, but alas many UI / UX folks are purely thinking about usability and rarely thinking about instruct-ability, and being able to cite and reference a feature externally, using the fundamental...[Read More]


API Evangelist

Flickr And Reconciling My History Of APIs Storytelling

06 Nov 2018

Flickr was one of the first APIs that I profiled back in 2010 when I started API Evangelist. Using their API as a cornerstone of my research, resulting in their API making it into my history of APIs storytelling, continuing to be a story I’ve retold hundreds of times in the conversations I’ve had over the eight years of being the API Evangelist. Now, after the second (more because of Yahoo?) acquisition, Flickr users are facing significant changes regarding the number of images we can store on the platform, and what we will be charged for using the platform–forcing me to step back, and take another look at the platform that I feel has helped significantly shape the API space as we know it. When I step back and think about Flickr, its most important contribution to the world of APIs was all about the resources it made available. Flickr was the original image sharing API, powering the growing blogosphere at the beginning of this century. Flickr gave us a simple interface for humans in 2004, and an API for other applications just six months later, that provided us all with a place to upload the images we would be using across our storytelling on our blogs. Providing the API resources that we would need to power the next decade of storytelling via our blogs, but also setting into motion the social evolution of the web, demonstrating that images were an essential building block of doing business on the web, and in just a couple of years, on the new mobile devices that would become ubiquitous in our lives. Flickr was an important API resource, because it provided access to an important resource–our images. The API allowed you to share these meaningful resources on your blog, via Facebook and Twitter, and anywhere else you wanted. In 2005, this was huge. At the time, I was working to make a shift from being an...[Read More]


API Evangelist

The Impact Of Travel On Being The API Evangelist

01 Nov 2018

Travel is an important part of what I do. It is essential to striking up new relationships, and reinforcing old ones. It is important for me to get out of my bubble, expose myself to different perspectives, and see the world in different ways. I am extremely grateful for the ability to travel around the US, and the world, the way that I do. I am also extremely aware of the impact that travel has on me being the API Evangelist–the positive, the negative, and the general shift in the tone of my storytelling after roaming the world. One of the most negative impacts that traveling has on my world is on my ability to complete blog posts. If you follow my work, when I’m in the right frame of mind, I can produce 5-10 blog posts across the domains I write for on a daily basis. The words just do not flow in the same way when I am on the road. I’m not in a storyteller frame of mind, at least in the written form. When I travel, I exist in a more physical and verbal sense as the API Evangelist, something that doesn’t always get translated into words on my blog(s). This is ok for short periods of time, but after extended periods on the road, it begins to take a toll on my overall digital presence. After the storytelling impact, the next area to suffer when I am on the road is my actual project work. I find it very difficult to write code, or think at architectural levels, while on the road. I can flesh out and move forward smaller aspects of the projects I’m working on, but because of poor Internet, packed schedules, and the logistics of being on the road, my technical mind always suffers. This is something that is also related to the impact on my overall storytelling....[Read More]


API Evangelist

What Are Your Enterprise API Capabilities?

22 Oct 2018

I spend a lot of time helping enterprise organizations discover their APIs. All of the organizations I talk to have trouble knowing where all of their APIs are–even the most organized of them. Development and IT groups have simply been moving too fast over the last decade to know where all of their web services and APIs are. This results in large organizations not fully understanding what all of their capabilities are, even when those capabilities are actively operated and may drive existing web or mobile applications. Each individual API within the enterprise represents a single capability–the ability to accomplish a specific enterprise task that is valuable to the business. While each individual engineer might be aware of the capabilities present on their team, without group-wide, comprehensive API discovery across an organization, the extent of the enterprise capabilities is rarely known. If architects, business leadership, and other stakeholders can’t browse, list, search, and quickly get access to all of the APIs that exist, the knowledge of the enterprise capabilities cannot be quantified or articulated as part of regular business operations. In 2018, the capabilities of any individual API are articulated by its machine readable definition–most likely OpenAPI, but it could also be something like API Blueprint, RAML, or another specification. For these definitions to speak to not just the technical capabilities of each individual API, but also the business capabilities, they will have to be complete, utilizing a higher level strategic set of tags that help label and organize each API into a meaningful set of business capabilities that best describes what each API delivers. This provides a sort of business capabilities taxonomy that can be applied to each API’s definition and used across the rest of the API lifecycle, but most importantly as part of API discovery, and the enterprise digital product catalog.
One of the first things I ask any enterprise organization I’m working with upon...[Read More]
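The capability tagging described above can be sketched in an OpenAPI definition. This is a minimal, hypothetical fragment–the API name, path, and tag values are all invented for illustration–showing how business capability tags might be declared at the top level and then applied to individual operations:

```yaml
openapi: 3.0.0
info:
  title: Orders API            # hypothetical enterprise API
  version: 1.0.0
tags:
  - name: order-management     # entries from a business capability taxonomy
    description: The ability to create and track customer orders.
  - name: fulfillment
    description: The ability to ship and deliver customer orders.
paths:
  /orders:
    get:
      summary: List orders
      operationId: listOrders
      tags:
        - order-management     # labels this operation with a capability
      responses:
        '200':
          description: A list of orders.
```

With tags like these applied consistently across definitions, discovery tooling and an enterprise digital product catalog can group APIs by business capability, rather than only by team or technical domain.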