{"API Evangelist"}

Serverless Approaches To Deploying Code Will Help Unwind Some Of The Technical Debt We Have

I am sure there is some equation we could come up with to describe the amount of ideology and/or dogma present alongside each bit and byte of code. Something that exponentially increases with each additional line of code or MB on disk. An example of this in action, in the wilds of the API space, is the difference between an SDK for an API, and just a single sample API call.

The single API sample is the minimum viable artifact that enables you to get value from an API -- allowing you to make a single API request and receive a single API response. There is very little ideology or dogma present (it's there, but in smaller quantities). Now, if an API provider gives you a Laravel SDK in PHP, a JAX-RS SDK in Java, or a React.js SDK, they are significantly cranking up the volume on the ideology and dogma involved with this code. All of this contributes to the type of technical debt I'm willing to assume along the way, with each one of my API integrations, and wider technological solutions.
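
To ground this a bit, here is what I mean by a single sample API call -- a minimal sketch in PHP, using a hypothetical endpoint and API key, with nothing but the language's own curl extension:

```php
<?php
// A minimal "single API sample" -- the endpoint and API key are hypothetical.
$ch = curl_init('https://api.example.com/v1/photos?tag=mountains');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, ['Authorization: Bearer YOUR-API-KEY']);

$response = curl_exec($ch);
curl_close($ch);

// One request, one response -- no framework, no ideology.
print_r(json_decode($response, true));
```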

I work hard to never hold up any single technology as an absolute solution, as there are none, but I can see the potential for the latest wave of "serverless" approaches to delivering code to help us unwind some of our technical debt. Like most other areas of technology, simply choosing to go "serverless" will not provide you the relief you need, but if you are willing to do the hard work to decouple your existing code, and apply the philosophy consistently to future projects, the chances that "serverless" will pay dividends in the reduction of your technical debt increase greatly.

I Am Seeing Significant Returns From Investing In Definitions Over Code When It Comes To My API Strategy

I am doing way more work on the creation of machine-readable OpenAPI Specs for APIs, indexed using machine-readable APIs.json files, than on the actual creation of APIs lately. About half of the API definitions I create are for existing APIs, with the rest of them describing APIs that should exist. With the existing APIs, in some cases, I am creating client-side code, but mostly just focusing on a well-crafted API definition. When it comes to the new API designs, I am focusing on a complete API definition, but also crafting both server-side and client-side code around the definition--when needed.
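
For those unfamiliar with APIs.json, here is a rough sketch of the kind of index I am describing, built and emitted with a little PHP -- the project name and URLs are hypothetical placeholders:

```php
<?php
// A rough sketch of an APIs.json index pointing at a machine-readable
// OpenAPI Spec. All names and URLs here are hypothetical placeholders.
$index = [
    'name' => 'Example API Project',
    'url'  => 'https://example.com/apis.json',
    'apis' => [[
        'name'       => 'Example API',
        'baseURL'    => 'https://api.example.com',
        'properties' => [[
            // Pointer to the OpenAPI Spec that describes this API.
            'type' => 'Swagger',
            'url'  => 'https://example.com/openapi.json',
        ]],
    ]],
];

file_put_contents('apis.json', json_encode($index, JSON_PRETTY_PRINT));
```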

Even when I do craft server or client code for an API definition, the value of the code is significantly lower than the definition(s). In my mind the code is disposable. I want to be able to throw it away, and start over with anything I am building, at any point. While I have made significant ideological investments into using Linux for my OS, AWS for my compute and storage hosting, MySQL for my database, and PHP + Slim for my API deployment framework, the code that operates within this framework has to be transient. Some code might end up having a long, long life, but if a piece of code isn't generating value, and is in the way, I want to either get rid of it or rewrite it to better meet the requirements.

When it comes to delivering technology in my world, my investments are increasingly in the API definitions, the underlying data schemas, and the data and content that is actually stored and transmitted within. The PHP, MySQL, JavaScript, CSS, and HTML are valuable, but second-class citizens to the JSON and YAML representations of my APIs, schemas, and the valuable data and content stored. For me personally, having made significant investments in a variety of tech solutions historically, this provides me with the flexibility I need to live in the current online climate. This is something that has only been coming into focus in the last year, so I assume it will continue to evolve over the next couple of years, but I am already seeing significant returns from investing in definitions over code when it comes to my API strategy.

The API Evangelist API Deployment Guide

There are many different ways to actually deploy an API. If you are a larger, more established company, you probably have existing tools, services, and processes set forth by IT for deploying APIs. If you are a startup, your developers probably have their own frameworks, or possibly a cloud service they prefer using. This guide looks to map out the world of API deployment, and some of the common ways companies, organizations, institutions, and government agencies are deploying APIs in 2016.

This API deployment guide breaks out many of the common companies and tools, while also looking to identify some of the common building blocks employed by leading providers, in support of API deployment across organizations of all shapes and sizes. This research is conducted by API Evangelist, and is just one aspect of a modern API lifecycle, one that includes over 50 stops, from API design to deprecation, with deployment being just one area you should be considering.

I work to update my guides regularly, and if you purchase a copy, you'll get any updates to it for the next year. Every reader of API Evangelist guides helps support this research and enables the work to continue--thank you!

APIs At Brigham Young University

I have been tracking how APIs are used in higher education for some time now, keeping an eye on almost 50 campus API related efforts. I have my University API guide that I regularly update, but I was eager to push forward my storytelling around what is going on, so I have been working on some papers telling the story behind some of the top higher ed API implementations I have access to.

What better way to kick this off than to showcase my favorite campus API group, over at Brigham Young University (BYU). The team, led by CIO Kelly Flanagan (@kelflanagan), has embraced APIs on campus in a way that keeps me excited about APIs, and what is possible. In my opinion, BYU isn't just using APIs to shift IT strategy at the school, they are using APIs to shift how their students and faculty see technology, and put it to work on their terms.

I'm pretty familiar with what has been happening at BYU when it comes to APIs, as I see Kelly and his team regularly, but I jumped on a Google Hangout with them so that I could get the latest. The result is a short five-page essay about the history of the API efforts on campus, some of the benefits to faculty and students, and a glimpse at the future of APIs at the school.

The paper is freely available for you to download as a PDF, but I will also be working on some stories drawn from the paper, as part of my regular blogging here on API Evangelist. I want to keep bringing attention to what they are up to at BYU, but also generate attention at other schools about what is possible when it comes to APIs. As I note in the paper, I'm also working with Davidson College in North Carolina, and will be working to keep spreading the conversation to other schools.

You can find me speaking at the University of California, San Diego this June, so stay tuned for more information about how APIs are used in higher ed this summer--showcasing my conversations with BYU, Davidson, and UCSD, as examples of how APIs are making an impact in higher education.

Working To Establish A Complete OpenAPI Spec For Leading APIs

I am always working as hard as I can to develop OpenAPI Specs that are as complete as possible for the APIs that I monitor. I call this my API Stack research. When possible, in addition to mapping out API operations for an API using APIs.json, I also work to create a machine-readable OpenAPI Spec for each API.

In most cases I only have the time to profile the surface area of an API -- the host, base URL, all properties, and required parameters. I don't always have time to reach what I'd consider to be a complete API definition. That requires authenticating, achieving a successful request and response for each endpoint present, and then generating JSON schema for each response -- a significant amount of effort to do properly.
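
To illustrate the work involved, here is a simplified sketch of that last step -- inferring a crude JSON schema from an API response. This is not my actual tooling, just a minimal example of the concept, with a hypothetical endpoint, and any real schema still needs hand editing:

```php
<?php
// Crude JSON schema inference from a decoded API response. Nulls default to
// string in this sketch, and edge cases are ignored -- it is only a starting
// point for the hand-crafted schema work described above.
function inferSchema($value) {
    if (is_array($value) && array_keys($value) !== range(0, count($value) - 1)) {
        $properties = [];
        foreach ($value as $key => $item) {
            $properties[$key] = inferSchema($item);
        }
        return ['type' => 'object', 'properties' => $properties];
    }
    if (is_array($value)) {
        return ['type' => 'array', 'items' => inferSchema($value[0] ?? null)];
    }
    if (is_int($value) || is_float($value)) return ['type' => 'number'];
    if (is_bool($value)) return ['type' => 'boolean'];
    return ['type' => 'string'];
}

// Hypothetical endpoint, standing in for an authenticated API call.
$response = json_decode(file_get_contents('https://api.example.com/v1/orgs'), true);
echo json_encode(inferSchema($response), JSON_PRETTY_PRINT);
```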

Thankfully, in addition to investing in this work myself, I am also getting the assistance of partners like Dream Factory, who are interested in getting specific API definitions completed, and making sure complete API definitions exist for the leading APIs out there today. Here are a handful of them I'm working on producing this week:

  • AngelList - Working to define all the endpoints for the startup, and investment platform API.
  • AlchemyAPI - Working to define all the endpoints for the machine learning API.
  • CenturyLink - Working to define all the endpoints for the cloud computing API.
  • NTT Docomo - Working to define all the endpoints for the telco, image, and machine learning APIs.
  • Stripe - Working to define all the endpoints for the payment API.
  • Twilio - Working to define all the endpoints for the voice & messaging API.
  • Verizon - Working to define all the endpoints for their Internet of Things (IoT) API.

I've assigned each API its own Github repo to store the OpenAPI Specs, as well as indexing them using APIs.json, and facilitating the conversation using Github issues. My goal is to streamline the process of making sure all the endpoints are represented, and a complete definition for both the request and the response exists--while keeping an eye on the future, in a potentially crowdsourced, Github-centric way.

If there is an API you'd like to see get completed, let me know. I am happy to prioritize other APIs after this. My goal is to focus a dedicated amount of time each week on this, but unfortunately this takes money, so I'm looking to get investment from partners, and the general community, to help make sure it happens. If you want to help, feel free to ping me, and let me know what you need.

Thinking About An API Proxy To Add Link Header To Each API Response

I was learning more about using the Link header for pagination yesterday, as part of my work on the Human Services Data Specification (HSDS), and this approach to putting hypermedia links in the header got me thinking about other possibilities. Part of the reason I was considering using the Link header for pagination on this particular project was that I was looking to alter the existing schema as little as possible -- I liked that I could augment the response with links, using the header.

Another side thought I had along the way was around the possibilities for using it to augment 3rd party APIs, and APIs from an external vantage point. It wouldn't be too hard to route API requests through a proxy, which could add a header with a personalized set of links tailored for each API request. If the request was looking up flights, the links could be to value-add services that might influence the decision, like links to hotels, events, and other activities. If you were looking up the definition of a word, the links could be to synonyms--endless possibilities.
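
To make the idea a little more concrete, here is a minimal sketch of such a proxy in PHP -- the target API and the injected links are hypothetical, and a real proxy would need to forward methods, headers, and errors:

```php
<?php
// A bare-bones pass-through proxy that injects a Link header. The target
// API and the link URLs below are hypothetical placeholders.
$target = 'https://api.example.com' . $_SERVER['REQUEST_URI'];

$ch = curl_init($target);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$body = curl_exec($ch);
curl_close($ch);

// Augment the response with a personalized set of links, via the header,
// without touching the response body at all.
header('Link: <https://hotels.example.com/search?near=SFO>; rel="related", ' .
       '<https://events.example.com/?city=SFO>; rel="related"');
header('Content-Type: application/json');

echo $body; // The upstream response passes through untouched.
```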

You wouldn't have to just use it for pagination and other common link relations; you could make it more about discovery, search, or even serendipity injected from outside sources. Anyways, I think the fact that you can augment an existing response using a header opens up a lot of possibilities for adding hypermedia behaviors to existing APIs. It might also be an interesting way to introduce existing API owners to hypermedia concepts, by showing them the value that can be added when you provide valuable links.

The JSON Schema Is Proving To Be As Valuable As The OpenAPI Spec

As my API Stack work gets more attention, folks are reaching out to me to see if I have done certain APIs, or see if I'd prioritize some of the ones already on the list. One thing I'm also noticing is that people are often looking for the JSON schema of each of the API responses, just as much as they are looking for the OpenAPI Spec for the API surface area.

I had someone looking for the complete schema for the Reddit API today, while I was working to authenticate, and record the responses to each endpoint for AngelList, AlchemyAPI, CenturyLink, NTT Docomo, Stripe, Twilio, and Verizon. An OpenAPI Spec for the request structure of an API will get you a long way in on-boarding and learning about an API, but you will need the schema to complete any integration.

This is one of the reasons I'm working to establish a better process for certifying that an OpenAPI Spec is complete, because without definitions providing a JSON schema, a spec will only get you part way--there is still so much work to be done. This is also why API service providers are looking to have it defined, because a complete spec is what will be used to map to other systems, tools, and services.

After spending a couple days going through Schema.org, and working on the Human Services Data Specification (HSDS), I'm feeling the gravity of having common, machine-readable schema available for the industry to work from. I've spent a number of cycles in the last two years creating OpenAPI Specs for APIs, but often stopping short of requiring the JSON schema to be present. From here forward, I will equally prioritize the response schema--it is proving to be just as valuable as the request surface area described using OpenAPI Spec.

HTTP Header Awareness: Using The Link Header For Pagination

I was revisiting the concept of pagination for a specific project I am working on, and after consulting my API research, I came up with a suitable approach using the Link header. Beyond applying this in my specific project, I thought the usage of the header for pagination would be a suitable topic for helping with HTTP header awareness -- a topic I will be writing about regularly, to help folks be more aware of useful approaches to using HTTP headers.

Github has the best example of using the Link header for pagination that I could find. Github uses the Link response header to hold a handful of hypermedia link relations, including next, last, first, and prev, providing a nice way to handle not just pagination, but potentially any other related action you might want to take around an API response. It also provides a way to add link relations to any existing API design, without adding to the actual response body -- which is one reason I decided to use it for my existing project.
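
Here is a quick sketch of what consuming that header can look like in PHP -- the header value mirrors the format Github documents, parsed into a simple array of relations:

```php
<?php
// Parse a Github-style Link header into its link relations. The header
// value below is an example of the format Github returns.
$link = '<https://api.github.com/user/repos?page=3>; rel="next", ' .
        '<https://api.github.com/user/repos?page=50>; rel="last"';

$relations = [];
foreach (explode(',', $link) as $part) {
    if (preg_match('/<([^>]+)>;\s*rel="([^"]+)"/', trim($part), $matches)) {
        $relations[$matches[2]] = $matches[1]; // e.g. $relations['next']
    }
}

print_r($relations); // ['next' => '...page=3', 'last' => '...page=50']
```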

HTTP headers are a critical aspect of API integration, and an area where I feel many developers are lacking awareness, which is why I will be working harder to write up simple uses like the Link header for pagination, that can help API providers, as well as API consumers, better use common HTTP header patterns. To be clear, the Link header is not unique to Github, and is something articulated in RFC 5988 for web linking. I will add it to my API design research as tooling, so it can be considered as part of an overall API design toolbox.

The Month At A Glance, Road Map, Support, And The Recent Posts For Your API Platform

I was playing with Microsoft's API Catalog, a tool to visualize and analyze the API overlap between standards specifications and type systems within browsers, and their footer caught my eye. I am always looking for quality examples of companies doing platform communications and support well, and I think their layout, and the available building blocks in their footer, are worthy of showcasing.

For me, these numbers, and the available communication and support building blocks, send the right signals--that a platform is alive and active. These are the data points I tune into to understand how well a platform is doing, or not doing. There are no absolutes when it comes to this type of monitoring, as anything can be gamed, but signs like Github activity, road map evolution, and blog storytelling can provide a vital heartbeat for any healthy API platform.

What Are The Most Relevant API Topics To Discuss At @APIStrat in Boston This Fall?

We are picking up speed with the planning for @APIStrat in Boston this November, and with the event committee assembled, and the call for talks open, we are looking at what the possible topics are for sessions. We have seeded the list with the usual suspects like API design, management, testing, monitoring, and others, but we want to also consider what the alternative topics are, and what is relevant to the API sector in 2016. 

What is most important to you in 2016? Is it the trendier topics like bots, voice, and IoT? Is it more practical things like evangelism, versioning, and discovery? Is it the technical, business, or politics of APIs that are impacting your operations most in 2016? If you want to come and present on it, make sure you submit your talk by May 20th. Otherwise you are welcome to tweet at us, email us, write a blog post, commit to a Github repo, or any other way you desire--we'll add it into the mix for the event committee to review.

It is a lot of work to make sure @APIStrat keeps reflecting the wider API movement after 3 years, and we depend on you keeping us informed. I'm enjoying the fact that we are only doing one event this year, as it is giving us more lead time to engage in discussions with the community, and continue making it an inclusive event. Another thing I'd like to hear about from y'all is what is relevant to Boston, something I'm sure I will learn more about as I continue my profiling of Boston area businesses.

Let me know your thoughts please!

More Patents Turning APIs Into An Invasive, Royalty Generating Virus

The relationship between API provider and consumer is a fragile one. As an API provider, I am offering up my valuable digital assets, data, content, and digital resources. I would like you to take this Application Programming Interface or SDK that I have built, and put it in your business systems and applications. As an API consumer, I'm given access to a valuable business API asset, which I'm expected to use in a respectful way, always in alignment with a provider's terms of service -- an environment where so much can go wrong, and does each day.

I watch companies take a wide range of tones when it comes to setting the stage for this relationship. Some companies are super protective of their valuable content (like Marvel Comics), some are very communicative and inviting like Slack (inject your bot into us), where others wobble between more open, then back to strict, like Twitter has done over the last couple years. In my opinion, it comes down to how you see your digital resources, and your belief in intellectual property -- and when I say you, I mean you and your investors.

I try NOT to read all the patent notifications that come into my reader, as it fucking depresses me, but every once in a while I have to revisit them, to help remind everyone what an illness patents can be when it comes to APIs working. This fragile relationship I speak of above operates well, or not very well, depending on how loose things are, and how much friction exists at this layer. There are plenty of examples out there of APIs that do not do well when they restrict, or too heavily meter, API consumers.

To help build an image in your mind, imagine the Twitter API, and all the apps that have been built on it. OK, now add in Stripe for payments, Dropbox for storage, Instagram for images, Facebook for social, Amazon EC2 for compute, and YouTube for videos. Now, think about enforcing copyright on every API design, and patent licensing on each process involved. It would grind these billion dollar revenue engines to a screeching halt. Nobody will build on these platforms if they have to bake your legal-heavy design and process into their business.

Modern web APIs are a balance between the technical, business, and politics of how data, content, and algorithms are exchanged. Simple, ubiquitous web technology is working on the technical side of things. Simple, pay as you go, utility based, tiered access is helping strike balance on the business side of things. TOS, privacy, security, and transparency are the knobs and dials of the political side of things--let's not willingly invite something as invasive as patents into the API layer. Patent your algorithm, copyright your content, and license your data, but keep the API definition copyright free, and your API process patent free.

In the "secure cloud storage distribution and aggregation" patent I'm reading this morning, its not the patent that offends me. It is the patent being so focused on the API being the thing that makes the patent. Patent how you index, search, and secure your file storage wizardry, but the API design, and operations should NOT be a focal point of your patent. You want people to integrate with your "secure cloud storage distribution and aggregation", bake the API design and process into their businesses, you want the cloud storage industry to speak your API -- don't lock it down, otherwise API consumers will look elsewhere.

A Regular Reminder That Storytelling Is The Most Important Tool In Your API Toolbox

I was reminded by my friend Mike Amundsen of the importance of storytelling in our world. When I am asked by anyone doing APIs what the most important thing they should be doing is, my answer is always "storytelling". I don't care if it's internally, publicly, on the corporate blog, or your personal blog, tell the story of what you are doing. I do not care how good your API is; if you aren't telling its story, nobody is going to care.

When I hold up storytelling as the most important tool in your toolbox within developer groups, they often snicker, dismissing it. I also have a contingent of my NOT fanbase in the API space who love to dismiss me with -- nobody reads or cares about stories! I'm never fazed by these folks; my focus is on the people who are truly trying to build community, and reach their intended audience. #NoHaterz

While stories swirl all around us in many forms, the one I like to reference the most is the mating ritual that is startup funding. If you are looking to get funding for your startup, you have to be sending the right signals, otherwise your VC mate will never be interested in you. You will need to be on TechCrunch, VentureBeat, and other tech blogs. Hacker News and Reddit need to be in sync. Your stories should also be in sync with what is floating around VC firm blogs, aggregators, and other investor-related streams.

Startup valuations are defined by an ocean of stories crashing on the shore each day. Bots are a thing because of storytelling. We are focusing on selling to the enterprise and abandoning B2C solutions because of storytelling. Google loves them some moonshot storytelling. Elon Musk understands the importance of the right story at the right time. Stories of what people are using. Stories of what people are investing in. Stories of what the future will hold, and what will be a $20B industry by 2020.

That is marketing, Kin! I'd put it in the fiction section of the library if I was in charge, but I'm not. It's all storytelling. If you need another example, visit my secret to Amazon's success post. This story does 2K page views a month, and I've seen it referenced in keynotes, framed on the wall of a bank, and on the home page of a federal agency's internal portal. This story is more fiction than it is fact, yet it keeps on resonating with folks, getting passed around, and turning people on to the concept of APIs, and why they are important.

Please keep telling your story, if for no other reason, so that I have stories to retell. ;-)

The Wikimedia Unique Devices Data API

I came across the Wikimedia Unique Devices data set, which is also served up as an API endpoint, along with the other APIs the platform offers. The data set and API provide access to a list of unique devices that have visited Wikipedia over a specific period of time.
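
Here is a quick sketch of calling it from PHP. I believe the REST endpoint follows the pattern below, but treat the URL as an approximation, and consult the Wikimedia documentation for the specifics:

```php
<?php
// Calling the Wikimedia unique devices endpoint -- the URL pattern is my
// best understanding of their REST API, so double-check against their docs.
$url = 'https://wikimedia.org/api/rest_v1/metrics/unique-devices'
     . '/en.wikipedia.org/all-sites/monthly/20160101/20160401';

$data = json_decode(file_get_contents($url), true);

// Each item should carry the project, timestamp, and device count.
foreach ($data['items'] as $item) {
    echo $item['timestamp'] . ': ' . $item['devices'] . " unique devices\n";
}
```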

The data set and API only have data back to January currently, but I'm sure it is something that will evolve over time. I really like the fact that we have organizations who are operating at scale, and are open and willing to share their data. This type of information is high value, and is something that can help us all better understand the ever-shifting digital landscape around us--thank you Wikimedia for making it accessible.

While playing with their Unique Devices API, I also noticed how transparent they are with their page view data, also making it available as a simple set of API endpoints. I know Wikimedia is a non-profit, but I can't help but feel their way of doing things provides a blueprint that other commercial platforms should consider following.

Content That Lives On When You Invest In The Right API Stories, Training, and Guides

I am working with my partner Cloud Elements to build out a community of evangelists who are interested in delivering on many of the essential building blocks I track on when it comes to API evangelism. I get regular requests from companies who are looking for evangelism and advocacy resources -- in desperate need of smart folks who can help craft content, and crank out code, in the service of evangelizing a platform.

The work I'm doing with Cloud Elements is the early work to help deliver on what API providers, and service providers, are needing. Right now we are on-boarding new folks to the concept, and getting feedback on what some of the common elements, and challenges, will be. One of the folks I'm engaging with on this is expert PHP and API developer, tech lead, author, consultant, and open source maintainer, Lorna Mitchell (@lornajane).

Lorna is one of several folks I've been working with to help deliver on API Evangelist projects, ranging from blog posts and white papers, to training and how-to guides. If you are a PHP developer, or have been learning about Git and Github in the last couple of years, you have probably come across Lorna's work. I was talking to my friend Mark Boyd on Skype today, and mentioned Lorna, and he said oh yeah, I've used her Git training materials before, in my stories and talks -- something I echoed as well, having pointed my readers and clients to her training materials regularly.

After five years of doing what I call hacker storytelling, writing code, JSON, and stories about APIs, I understand how much hard work goes into creating content that is educational and simple, focused on very technical topics like APIs, while also driving search engine traffic over long periods of time. Lorna has done very well in these areas, crafting blog posts that will keep being surfaced in Google searches for a long time to come, and help walk people through some very lofty technical concepts--wave after wave.

If you are needing help with storytelling as part of your industry, or API platform, let me know. I'm happy to see where I can help. Feel free to reach out to Lorna as well; she is looking to help the right companies and platforms craft white papers, guides, training, and tutorials. If you are an evangelist who is looking to do the same, let me know, and I'm happy to plug you into the community we are assembling, and see where you can help deliver. If you want to test drive some of the first tasks and challenges we have going on in the beta phase, you are welcome to sign up on your own at Mt. EveREST--things are just getting going, and we could use your feedback.

API Providers & Consumers Keeping In Touch Is How You Can Set The Right Tone For An API Community

One of the side effects of the recent bot craze is that I'm getting to showcase the often very healthy API practices of Slack, as they grow, scale, and manage their developer ecosystem. Slack is beginning to renew my faith that there are API providers out there who give a shit, and aren't just looking to exploit their ecosystems. There are two Slack blog posts that have triggered these thoughts, one on the Slack platform road map, and a little thing about release notes, both of which reflect what I would love to see other API providers emulate in their platform operations.

Slack is going the extra mile to set the right tone in their community, with what I consider to be some of the essential communication building blocks of API operations, which they simply call "keep in touch":

  • Recent Updates - We improve the Slack platform every day by releasing new features, squashing bugs, and delivering fresh documentation. Here's an account of what's recently happened.
  • Support and Discussion - Whether you're working on an app for our App Directory or a custom integration for your team, you'll find yourself in good company, from all of us here at Slack and the wide community of developers working with the platform.
  • @SlackAPI - Slack tweets, news, features and tips can be found at @SlackHQ, but this? This is all API, all the time.
  • Platform Blog - A Medium blog dedicated to the Slack API platform.
  • Slack Engineering Blog - A Medium blog dedicated to the Slack engineering team.
  • Platform Roadmap - Come, transform teams, and build the future of work with us: About our road map, Explore our roadmap, Review recent platform updates, and Discover what teams want.
  • Register As a Developer - Working with the Slack API? Tell us a bit about yourself! We'll use the answers you supply here to notify you of updates to the Slack API, general Slack API news, and to get a better sense of the variety of developers building into Slack. 

I just copied and pasted that from their developer portal. Slack does not stop there, also providing an FAQ, a Code of Conduct, and an Ideaboard to further set the tone for how things can and should work in a community. What I like about the tone Slack is taking is that it is balanced--"keep in touch"! It really is just as much about us API consumers as it is about Slack. Slack has done the hard work of providing most of the essential API building blocks, as well as a valuable API resource; now it's up to the community to deliver--this balance is important, and we should be staying in touch.

Remember the tone Twitter took with us? Developer Rules of the Road!! A very different tone than "keep in touch". The tone really matters, as does the investment in the common building blocks that enable "keeping in touch", both synchronously and asynchronously. Having a road map and change log for your API goes a long way, but telling the story behind the why, how, and vision of your road map and change log--that gives me hope that this API thing might actually work.

Vital Resources Like The Court Listener API Depend On Our Donations To Operate

Despite popular belief in Silicon Valley, there are many different ways to fund the design, development, deployment, and operation of valuable API resources. Not all APIs are destined to be the next Amazon, Twitter, or even Twilio. Some APIs just need to exist, and be available, and will never be a revenue engine--one of these APIs is the Court Listener API.

Version 3.0 of the Court Listener API possesses 15 valuable endpoints, providing access to courts, dockets, opinions, people, sources, ratings, and other details about how laws are made in the United States. Their latest release contains a comprehensive database of judges and the judiciary, linked to Court Listener's corpus of legal opinions authored by those judges.

Increasingly, APIs like Court Listener, and the Human Services Data Specification (HSDS) API, are capturing my attention. These types of APIs represent the type of impact I'm looking to make using APIs, going beyond what APIs can do for a single application or industry, to what they can do for wider sections of our society. Focusing on getting agile, nimble, and more efficient in how we craft laws in our country, and how we make sure people are fed, and find the government services they need in their lives.

The Court Listener API depends on grants and donations to operate, which is an approach I will be showcasing more when it applies to APIs that deliver valuable human, civic, and research related resources. APIs like the Court Listener API provide an important window into how our local, state, and federal judicial system is operating (or not), and will need our financial support to do what they do. I think this is a very viable approach to designing, deploying, and operating APIs -- that is, if we all step up and support these efforts.

Slack Meets The Minimum Viable API Platform Requirements

I am using my minimum viable API operations definition tool to continue profiling the API sector, this time to size up the Slack API community. Slack is kind of a darling of the API space, so it kind of seems silly to profile them, but profiling those who are doing this API thing right is what API Evangelist is all about--whether I follow the hype or not.

Using my minimum viable API definition, I went through the Slack API portal looking for what I'd consider to be the essential building blocks that any modern API platform should have.

API Overview
Name: Slack API
Description: All of our APIs can be used alone or in conjunction with each other to build many different kinds of Slack apps. Whether you're looking to build an official Slack app for your service, or you just want to build a custom integration for your team, we can help you get started!
Image: https://a.slack-edge.com/ae57/img/slack_api_logo.png
API Portal: https://api.slack.com/
API Base URL: https://slack.com/api/
Getting Started: https://api.slack.com/slack-apps
Registration: https://slackhq.typeform.com/to/kOHQvo
Documentation: https://api.slack.com/rtm
Code: https://api.slack.com/community
Road Map: https://api.slack.com/roadmap
Change Log: https://api.slack.com/changelog
Pricing: You should be at least sharing some rate limits, acceptable uses, and other pricing and access related information.
Terms of Service: https://slack.com/terms-of-service/api
OpenAPI Spec: A machine readable OpenAPI Specification for an API is fast becoming an essential element of API operations.
API Blueprint: A machine readable API Blueprint for an API is fast becoming an essential element of API operations.
Postman Collection: A machine readable Postman Collection for an API is fast becoming an essential element of API operations.
Github Org / User: https://github.com/slackhq
Twitter Account: https://twitter.com/slackapi
Blog: https://medium.com/slack-developer-blog
Blog RSS: https://medium.com/feed/slack-developer-blog
Support Page: https://api.slack.com/docs/support
Contact Info
Contact Name: Slack
Contact Email: https://apievangelists.slack.com/help/requests/new

Performing better than the review of the i.Materialise 3D printing API that I conducted the other day, Slack checks off all but one of the essential building blocks--everything except for pricing. The only other area I find deficient is when it comes to machine-readable API definitions like OpenAPI Specs and Postman Collections. These aren't required for success, but they can sure go a long way in helping developers on-board, from documentation, to generating the code and tooling that will be needed for integration.

I'm assuming Slack hasn't generated OpenAPI Specs because they have a more XML-RPC style design, which I think many folks assume can't be documented in this way. While it doesn't lend itself to easily being documented with OpenAPI Spec, I found some simple little hacks that make it doable, allowing you to document even XML-RPC designs. Having some OpenAPI Specs and Postman Collections would make the API more accessible for people looking to play with it.
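
To give you a sense of the hack, here is a rough sketch of how an RPC-style method can still be described -- treating each method name as its own path. This is my own illustration in PHP, not an official Slack definition:

```php
<?php
// Sketch of describing an RPC-style method (Slack's chat.postMessage) as an
// OpenAPI (Swagger 2.0) path. This is my illustration, not Slack's spec.
$spec = [
    'swagger'  => '2.0',
    'info'     => ['title' => 'Slack API', 'version' => '1.0.0'],
    'host'     => 'slack.com',
    'basePath' => '/api',
    'paths'    => [
        // Each RPC method name simply becomes its own path.
        '/chat.postMessage' => [
            'post' => [
                'summary'    => 'Posts a message to a channel.',
                'parameters' => [
                    ['name' => 'token',   'in' => 'query', 'type' => 'string', 'required' => true],
                    ['name' => 'channel', 'in' => 'query', 'type' => 'string', 'required' => true],
                    ['name' => 'text',    'in' => 'query', 'type' => 'string', 'required' => true],
                ],
                'responses' => ['200' => ['description' => 'Message posted.']],
            ],
        ],
    ],
];

file_put_contents('slack-openapi.json', json_encode($spec, JSON_PRETTY_PRINT));
```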

Anyways, I just wanted to test out the minimum viable API operations tool on another API. I am trying to profile several APIs in this way each week, helping the number of APIs I am monitoring grow, while also encouraging other API providers to follow Slack's lead.

OpenAPI Specifications For 642 Of The Schema.org Types

I am gearing up for another wave of API definition work, so I am taking the opportunity to produce some more tooling that assists me in the process. One of the tools I want to build is a simple solution for walking me through one or many OpenAPI Specs, pushing me to make sure every parameter has a complete set of descriptions. I possess amazing powers of bullshit, and can craft a default description for almost anything I come across, but it would be nice to have an ever-evolving autocomplete dictionary to augment my existing super powers.

I already have an APIs.json driven autocomplete tool, that loops through all the parameters within the OpenAPI Specs that are indexed. I just needed a rich set of fields and parameters to pull from -- Schema.org. The rich vocabulary that is Schema.org possesses 992 properties across 642 schema types. I've long wanted to craft OpenAPI Specs for Schema.org, but needed a reason to help push things forward, and this is a perfect opportunity to kick things off.

As a starting point, I created 642 separate OpenAPI Specs, one for each of the schema types. I already have an API that will generate an OpenAPI Spec from any JSON schema, building out GET, POST, PUT, and DELETE methods, as well as a default 200 response, and connecting it to the schema for the API response definition. As I was doing the work, I realized that I didn't want to limit the OpenAPI Specs to just the JSON version, so as I generated each one I also published a YAML version.
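
Here is a simplified sketch of that generation step -- the function and file names are my own placeholders, not my actual tooling:

```php
<?php
// Sketch: build a Swagger 2.0 spec for a schema type, with the common
// methods and a default 200 response pointing back at the schema definition.
function generateSpec($typeName, $jsonSchema) {
    $plural = strtolower($typeName) . 's'; // e.g. /persons for Person
    return [
        'swagger' => '2.0',
        'info'    => ['title' => $typeName . ' API', 'version' => '1.0.0'],
        'paths'   => [
            '/' . $plural => [
                'get'  => ['responses' => ['200' => [
                    'description' => 'Successful response.',
                    'schema'      => ['$ref' => '#/definitions/' . $typeName],
                ]]],
                'post' => ['responses' => ['200' => ['description' => 'Created.']]],
            ],
            '/' . $plural . '/{id}' => [
                'parameters' => [['name' => 'id', 'in' => 'path', 'type' => 'string', 'required' => true]],
                'put'    => ['responses' => ['200' => ['description' => 'Updated.']]],
                'delete' => ['responses' => ['200' => ['description' => 'Deleted.']]],
            ],
        ],
        'definitions' => [$typeName => $jsonSchema],
    ];
}

// Hypothetical file locations for the Schema.org Person type.
$personSchema = json_decode(file_get_contents('schema-org/person.json'), true);
file_put_contents('person-openapi.json',
    json_encode(generateSpec('Person', $personSchema), JSON_PRETTY_PRINT));
```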

Next I'm going to use these 642 OpenAPI Specs as an autocomplete index for helping me quickly fluff up the parameters of other existing API definitions. After that, I'll work on wiring up the hierarchies and relationships present in the Schema.org definitions. Right now, none of the OpenAPI Specs will validate, as the parameter types aren't all valid, but I didn't want to lose the object references.

Beyond being an autocomplete index which I can use across my micro tools for working with API definitions, I am thinking this Schema.org work will bear a shit ton of fruit in future work. I would like to have API implementations for many of the more common Schema.org types, allowing me to help folks design and deploy Schema.org compliant APIs--something that I feel will go a long way toward stabilizing and standardizing APIs, by helping them speak using a more common vocabulary.

Using My APIs.json Annotation Tool To Drive An API Design Conversation Via Github Issues

I am working on one possible API definition for the Human Services Data Specification (HSDS), and the next phase of this work involves bringing in a small group of technical, and non-technical, folks to have discussions around their API designs, in the context of the master specification I am looking to pull together.

To help drive the discussion, I wanted to use the OpenAPI Specification that I created for HSDS, and I also knew I wanted to use Github issue management to help keep track of the synchronous, and asynchronous, conversation that would occur around it. However, Github tends to have a perceived barrier to entry for many non-developers (which I don't fully get), so I wanted to leverage the platform, but also soften the way everyone discussed the overall specification, as well as each individual endpoint.

The HSDS specification is pretty verbose, and I needed a way to have conversations at the high level, but also down to the endpoint level. To help facilitate this, I got to work on a prototype micro tool which enables a conversation around any API(s) indexed within an APIs.json file, producing a human-readable list of endpoints, parameters, etc., and then using Github issue management as the core of the conversation.

The result is my APIs.json Annotation tool. It is just a first draft, so there will be a lot of changes that need to occur. I'm going to test it against the 20+ APIs.json collections I have defined as part of my API Stack work, to try and harden it a little bit. My APIs.json Annotation tool runs in my single repo app style, leveraging Jekyll + Github Pages + the Github API to operate--Github is the frontend and backend.

Anyone can view the conversation, but if you want to participate you have to be added to the Github repository, and pass in a Github personal token. This is something I often automate with a simple Github login, where I use OAuth.io to obtain a token, but I kind of see the token as a minimum viable investment in understanding Github, for anyone using each tool.
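
To illustrate the Github-as-backend pattern, here is a bare-bones sketch of opening an issue with a personal token -- the repo and issue content are hypothetical, while the endpoint is Github's standard issues API:

```php
<?php
// Open a Github issue using a personal token. The repo name and issue
// content are hypothetical placeholders.
$token = getenv('GITHUB_TOKEN'); // the "minimum viable investment"

$ch = curl_init('https://api.github.com/repos/example-org/hsds-api/issues');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_HTTPHEADER     => [
        'Authorization: token ' . $token,
        'User-Agent: apis-json-annotation', // Github requires a User-Agent
        'Content-Type: application/json',
    ],
    CURLOPT_POSTFIELDS => json_encode([
        'title' => 'Feedback on the /services endpoint',
        'body'  => 'Should the response include location details by default?',
    ]),
]);

print_r(json_decode(curl_exec($ch), true));
curl_close($ch);
```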

It is really important to me that each app stands on its own feet. Not all of the micro tools I develop in this way will enjoy zero outside dependencies, but most of them can be easily forked, and run in any Github user account or org (with little or no setup). Conversations around APIs are just one area I am looking to stimulate with this approach to delivering tooling, and specifically APIs.json tooling, that can be used throughout an API life cycle.

You are welcome to play with the APIs.json Annotation tool, or any of the other tools I have listed on my home page. I will keep adding them here, so that they can be found, but like all my work they are a work in progress. Each tool has its own dedicated repo and issue management, where you are welcome to join in the conversation around the road map for each one. I am just looking to develop a robust toolbox of small micro tools that will help me be more successful across the life cycle of the APIs I am working on, but maybe you can benefit from using them too.

I Like Working With JSON On Github Because CORS Is Never An Issue

I tend to only work in environments where I have full control over the server, so Cross-Origin Resource Sharing (CORS) is never really an issue for any of the APIs I have control over, but it is a pervasive problem with the APIs, and JSON files, I come across on the web. This is one of the reasons I really enjoy the fact that I publish all of my JSON driven, hacker storytelling projects using Github Pages and Jekyll.

In the last couple of weeks I have been working on a bunch of micro tools that deliver specific functionality which can then be used throughout the API life cycle. All of these apps are developed using JavaScript, and depend on being able to read from any number of JSON file stores. Sometimes these files are stored within the project, but more often than not I'm calling the remote JSON files of other projects, something that depends on the sharing of resources across domains.

If I am publishing a JSON file publicly on the open Internet, I want it to be accessible from anywhere. CORS has to be the default. The speed and agility at which I'm able to ingest and work with APIs.json files, and the OpenAPI Spec indexes they contain, sets me up for some serious nimbleness across my work.
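
For anyone publishing their own APIs or JSON, here is what "CORS as the default" boils down to on the server side -- a minimal PHP sketch (Github Pages handles the equivalent for you automatically):

```php
<?php
// Send the CORS header with every JSON response, so JavaScript running on
// any other domain can consume it.
header('Access-Control-Allow-Origin: *');
header('Content-Type: application/json');

echo json_encode(['name' => 'Example API', 'apis' => []]);
```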

If you are working with open data on the web, make sure to consider enabling CORS by default when working with your JSON data on Github. It will make your life easier, as well as the life of anyone else who will be looking to consume the valuable data you are putting out there.