{"API Evangelist"}

The Concept of Patient-Directed APIs

I am reading through the API task force recommendations out of the Office of the National Coordinator for Health Information Technology (ONC), which look to address privacy and security concerns around mandated API usage as part of the Common Clinical Data Set, Medicare, and Medicaid Electronic Health Records. The recommendations contain a wealth of valuable insights around healthcare APIs, but they are also full of patterns that we should be applying across other sectors of our society where APIs are making an impact. To help me work through the task force's recommendations, I will be blogging through many of the different concepts at play.

One phrase used regularly across the task force recommendations really caught my attention: the concept of "patient-directed APIs". It is a powerful concept when you think about APIs existing not just to help integrate healthcare systems, or to drive innovation in the development and delivery of web and mobile applications, but to do it all because of, in service of, and at the direction of the patient. The document has grabbed my attention because this is the first time I've seen such an end-user focused API vision in the wild.

While these electronic healthcare record APIs will be used for system integration, delivering web and mobile apps, and benefiting healthcare platform operators and developers, the reason the APIs exist is to benefit the patient. This just isn't how things are done in Silicon Valley, where you always focus on benefits for the platform, its partners, and investors first, maybe the developers secondarily, but in most cases the end-user is just an afterthought. In startup culture, things are turned on their head, allowing for some serious imbalance in how we do things with APIs.

There is a wealth of topics for me to work through from the task force recommendations out of ONC and HHS. I will keep blogging as I read through, working through the important recommendations the document contains. Once I have my head wrapped around the document more, and am up to speed with what they are doing, I'm hoping I can contribute some more ideas that might help stimulate things as healthcare providers roll out their APIs.

The API Evangelist API Definition Guide

How we define our APIs has dramatically changed in recent years. Since Swagger came onto the scene around five years ago, there has been rapid growth in the number of open formats, tooling, and services to help us define APIs. Companies are using API definitions like the OpenAPI Spec, Postman, and APIs.json to communicate about their APIs at almost every stop along the API life cycle. This is my research to better understand all the moving parts in this fast-growing sector.

This guide focuses on API definition formats like the OpenAPI Spec and API Blueprint, as well as schema formats like JSON Schema and MSON, but I also touch on media types, link relations, and other common building blocks for defining our APIs so they speak a common language. While there is some open tooling included in this guide, I'm trying to focus on the specifications and formats themselves, and help make sense of the amount of information available to us.

I fund my research through consulting, selling my guides, and the generous support of my sponsors. I appreciate your support in helping me continue with my work and hope you find it relevant in your API journey.

Purchase My API Definitions Guide

Cutting Through The Smoke & Mirrors Of IT Discussions Using API Definitions

I get brought into a lot of API discussions with IT departments from companies, institutions, and government agencies, often coordinated by business groups who are interested in better meeting their goals using APIs. This is often an immediately charged conversation, with IT coming to the table with a whole array of baggage.

In about 75% of these situations, the IT and developer representatives are nice, or rather they are tight-lipped, relying on a myriad of smoke & mirrors to defend their dark arts. Let me stop for a moment and put out there that I was an IT director from 1998 through 2010. I'm not saying IT folks are bad people, but there is a wide variety of ways we slow, obfuscate, and distort the conversation to be in our favor -- takes one to know one. I wouldn't say that I was 100% honest in my approach to being an IT leader, but I tried my hardest to keep things as transparent as I possibly could.

Anyways, in a couple of the IT discussions I've had lately, there was an OpenAPI Spec available to define the resources that were on the table, and in a handful of other conversations there was not. Keep in mind that most of these scenarios are with a more traditional version of IT, not with startup technology groups (a whole different beast). As I step back, I am taking notice of the harmonizing effect that an API definition can have in keeping conversations focused, productive, and moving forward toward a common goal.

In the conversations without an OpenAPI Spec, back-end systems and legacy processes dominated the discussion, even though we were all on a conference call to discuss an external, partner, and public facing API. In the discussions where an OpenAPI Spec was present, we focused on exactly which resources were needed (nothing more), and the details (params, responses, etc.) that were needed by all consumers--essentially providing us with a scaffolding for the discussion that kept things moving forward, and not bogged down in legacy sludge.

Backend focused discussions always seemed to get slowed down by what was, and what is. The API definition focused conversations seemed to focus on what was needed, using a common language that everyone at the table understood. The presence of an OpenAPI Spec seemed to cut through the smoke & mirrors, which I think often alienates many of the business users. I found that having three versions of an OpenAPI Spec and APIs.json file present: 1) simple outline, 2) YAML, and 3) JSON, was also something that significantly improved discussions, keeping conversations focused while also making them as inclusive as possible.
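
To give a sense of what actually gets put on the table in these meetings, here is a purely illustrative definition in its YAML form, using the 2016-era Swagger 2.0 flavor of the OpenAPI Spec -- the JSON version carries exactly the same information, and the simple outline is just the list of paths and verbs:

    swagger: "2.0"
    info:
      title: Example Partner API
      version: "1.0.0"
    paths:
      /orders:
        get:
          summary: List orders for an account
          parameters:
            - name: status
              in: query
              type: string
          responses:
            200:
              description: A list of orders

Business users can follow the outline, analysts can read the YAML, and developers can work from the JSON -- the same artifact at three levels of detail.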

I think people will always bring their baggage to these discussions, but I'm liking the harmonization effects that API definitions like the OpenAPI Spec, API Blueprint, Postman, and APIs.json are having in these conversations. I'm hopeful that these API definitions can continue providing bridges between business and IT groups, helping close a canyon that has existed for decades.

A Healthy Stance On Privacy And Security When It Comes To Healthcare APIs

I am reading through the API task force recommendations out of the Office of the National Coordinator for Health Information Technology (ONC), which look to address privacy and security concerns around mandated API usage as part of the Common Clinical Data Set, Medicare, and Medicaid Electronic Health Records. The recommendations contain a wealth of valuable insights around healthcare APIs, but they are also full of patterns that we should be applying across other sectors of our society where APIs are making an impact. To help me work through the task force's recommendations, I will be blogging through many of the different concepts at play.

Beyond the usage of "patient-directed APIs" that I wrote about earlier, I thought the pragmatic view on API privacy and security was worth noting. When it comes to making data, content, and other digital resources available online, I hear the full spectrum of concerns, and it leaves me optimistic to hear government agencies speak about security and privacy in such a balanced way.

Here is a section from the API task force recommendations:

Like any technology, APIs allow new capabilities and opportunities and, like any other technology, these opportunities come with some risks. There are fears that APIs may open new security vulnerabilities, with apps accessing patient records "for evil", and without receiving proper patient authorization. There are also fears that APIs could provide a possible "fire hose" of data as opposed to the "one sip at a time" access that a web site or email interface may provide.

In testimony, we heard almost universally that, when APIs are appropriately managed, the opportunities outweigh the risks. We heard from companies currently offering APIs that properly managed APIs provide better security properties than ad-hoc interfaces or proprietary integration technology.

While access to health data via APIs does require additional considerations and regulatory compliance needs, we believe existing standards, infrastructure, and identity proofing processes are adequate to support patient directed access via APIs today.

The document is full of recommendations on how to strike this balance. It is refreshing to hear such a transparent vision of what APIs can be. They weigh the risks alongside the benefits that APIs bring to the table, while also being fully aware that a "properly managed API" provides its own security. Another significant aspect of these recommendations for me is that they also touch on the role that APIs will play in the regulatory and compliance process.

I have to admit, healthcare APIs aren't one of the most exciting stacks among the over 50 areas I track on across the API space, but I'm fully engaged with this because of the potential for a blueprint for privacy and security that can be applied to other types of APIs. When it comes to social, location, and other data, the bar has been set pretty low for privacy and security, but healthcare data is different. People tend to be more concerned with access, security, privacy, and all the things we should already be discussing when it comes to the rest of our digital existence--opening the door for some valuable digital literacy discussions.

Hopefully, I don't run you off with all my healthcare API stories, and you can find some gems in the healthcare API task force's recommendations, as I have.

Providing A Set Of API Keys For Developers To Test Out Different API Outcomes

I wrote a post about Twilio using magic phone numbers that let their developers test out functionality without incurring charges for deploying live phone numbers, making calls, or sending SMS. After publishing my post, Runscope CEO John Sheehan (@johnsheehan) said he was behind the original spec for the Twilio magic numbers.

John continued to share some of the logic that went into his original spec.

I think this adds another dimension to the concept of having test numbers like this. Different phone numbers give you different outcomes, and different credit card numbers give you different results. What else could you do with test numbers and unique identifiers? Existing invoice and order numbers for different commerce situations. It seems like you could load just about anything into any alpha and / or numeric identifier that you would want to.

Around the turn of the century (I don't think I've ever said that before), I used to work on web applications for non-profit organizations, where I built campaign code tracking for large, lengthy mail, phone, fax, and other types of campaigns. We had 6 to 8 digit identifiers, where every two digits had a unique meaning--allowing us to build a pretty robust set of scenarios that helped us track every step in a campaign's evolution.

Anyways, I think the concept is worthy of further exploration. I could see API providers crafting a pretty robust set of keys that could represent almost any object served up as part of API operations, with a very structured approach to how you tailor a multitude of outcomes involving these objects. For me, this type of stuff goes way beyond just having a sandbox for your API, and could provide a much more meaningful way to help developers polish their integrations.
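
To make that a little more concrete, here is a minimal Python sketch of the pattern -- the identifiers and outcomes are entirely hypothetical, just showing how magic values could map to simulated results:

    # A hypothetical lookup of "magic" test identifiers to simulated outcomes,
    # loosely in the spirit of Twilio's magic numbers -- none of these values
    # come from a real API.
    MAGIC_OUTCOMES = {
        "+15005550001": ("error", "invalid_number"),
        "+15005550006": ("success", None),
        "+15005550009": ("error", "cannot_receive_sms"),
    }

    def simulate_send(to_number):
        """Return a canned outcome when a magic test identifier is used."""
        status, error = MAGIC_OUTCOMES.get(to_number, ("success", None))
        return {"to": to_number, "status": status, "error": error}

    print(simulate_send("+15005550001"))
    # {'to': '+15005550001', 'status': 'error', 'error': 'invalid_number'}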

APIs Will Wither On The Vine And Never Reach Their Full Potential

As I push out stories on the next round of the Oracle v. Google API copyright case, consider how I will write about the API deprecations and acquisitions I'm privy to, and document the continued march by the enterprise into the world of APIs, I begin to see how APIs will continue to wither on the vine, never reaching their full potential in an increasingly toxic environment.

Oracle v. Google is just the first of many battles that will occur between tech giants when it comes to APIs and the application of intellectual property laws. With the number of patents out there that focus on the API as the thing being protected, not just the thing behind the API, the copyright of the common API patterns we enjoy will be just the tip of the iceberg. This is how the giants will battle it out, leaving the rest of us fighting for scraps in the cracks.

Legal tussles like the one we see between Oracle and Google will become commonplace in the future, as the enterprise fights for what it thinks is its "property" when it comes to integrations, collaboration, sharing, and automation. These battles will take place after this current wave of "interest in APIs" by the enterprise, where these large entities have sent their scouts out to map this uncharted (and mildly threatening) wilderness, understand what everyone is doing, and file patents based on what they see. Simplifying enterprise operations, actually innovating, or truly achieving integration is the farthest thing from their minds. While we are all busy doing the work, these IP cartographers are mapping out the exhaust from our labor.

Alongside all of this, another shift in the winds is occurring that will reduce some very useful, innovative, and valuable API patterns to just portfolio items of the enterprise. As VC funding cycles shift, the startups who are doing the most interesting things with APIs will have their work gobbled up by the enterprise, either through direct acquisitions or fire sales -- in any case, APIs are IP, nothing more. These winds are strong, and will not just blow away the valuable APIs, but also much of the shade trees that are needed for other APIs to flourish on the vine.

All of this change will sweep up this API experiment as we know it into the IP portfolios of the enterprise giants, so they can use it as leverage in the court battles and venture negotiations of the future. The enablement that APIs bring to the table is just too small of a thing currently for the enterprise to even see. Startups are often too greedy, or too beholden to their investors, to understand the opportunity they are passing on when they see APIs as intellectual property. API enablement is not IP; the thing being enabled is the IP, and by freezing up the enabling factor, you will miss out on your IP ever reaching its full potential, and will be left with a vineyard of perpetually green grapes.

I know, I know. All my enterprise and startup friends will tell me how naive I am, and that this is how business is done. Again, I will say, you are so blinded by your greed, and your belief in this broken IP system, that you are willing to kill off this very interesting experiment--one where we all had a seat at the table. At this point, I am left without any hope, and the concept of APIs as we know it will wither on the vine, never reaching its full potential -- this API economy thing we all saw in our mind's eye will not happen.

Don't get me wrong, I'm gonna keep on fighting and pushing for API usage in healthcare, education, government, and other important areas. I will keep building services and tools that embrace APIs at their core, but you will rarely hear me speak of the bright, API-enabled future anymore. The experiment is over, and the believers in a broken IP system are winning -- they just have too much money and too many resources to play the long game, and the wider API space, and the rest of us, will lose out.

Quality of Service API Endpoint For Your API Platform

I'm spending a lot of time in the Twilio API ecosystem this week, so you will hear multiple stories about what they are up to. This one highlights their Call Feedback API, and the growing number of what I'd consider infrastructure APIs from leading API platforms. The more API platforms mature, the more I see APIs deployed to assist API consumers in managing their integrations, as well as getting at the core API resources being made available.

Twilio's feedback API focuses on a single API resource, a call, but the approach could just as easily be applied as a quality of service feedback endpoint for anything you are serving up, like video, bot responses, recommendations, and beyond. I like the idea of having one endpoint for serving up a resource, and another endpoint for reporting the quality of service around that resource.
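
As a rough sketch of what this looks like from the consumer side, here is a call modeled on Twilio's Call Feedback endpoint -- the parameter names are from my notes, so double check their documentation before relying on any of this:

    # Report quality of service for a single call resource.
    import requests

    account_sid = "ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"  # placeholder credentials
    auth_token = "your_auth_token"
    call_sid = "CAXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"     # the call being rated

    response = requests.post(
        "https://api.twilio.com/2010-04-01/Accounts/%s/Calls/%s/Feedback.json"
        % (account_sid, call_sid),
        data={"QualityScore": 4, "Issue": "imperfect-audio"},
        auth=(account_sid, auth_token),
    )
    print(response.status_code)

One endpoint serves up the call, and a sibling endpoint reports on the quality of that call -- the same pairing could wrap video, bot responses, or recommendations.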

Now that I have filed a report on Twilio's approach to using APIs as part of their platform operations, I will be keeping a closer eye out for other platforms doing similar things, and will consider adding this pattern to my stack of existing API infrastructure APIs, alongside access to account, application, billing, and analytics via an API.

As the growth of API integration continues, I will keep looking to the API pioneers for examples of how we can provide more infrastructure focused API resources that allow API consumers to manage, orchestrate, and automate their API integrations. As developers use a larger number of APIs to drive any single application or system integration, the need to automate accounts, apps, pricing, billing, analytics, and feedback loops is only going to grow.

An Acceptable Business Model Page For Your API Platform

One thing I look at closely when I review API platforms is how they approach the monetization of their API resources, and the resulting plans, pricing, and access tiers. How platforms think about and present this aspect of their operations provides a wealth of insight into a company's motivations behind its API operations.

I found the approach of the 3D printing API platform i.materialise interesting, because it goes beyond just charging for API access and focuses on helping API consumers align their business model with i.materialise's. i.materialise presents API consumers with two possible routes: the first is referral based, and the second is a deeper, white label relationship.


The i.materialise API provides consumers with access to a full stack of 3D printing APIs, allowing you to upload your model and determine what it will cost to print, all the way to assisting you with order fulfillment, delivery, and invoicing. The i.materialise approach to API monetization is different from the other APIs I talk about, because it's not about paying for API access. In this situation, it's about providing API access to the manufacturing life cycle -- one that is 3D printer driven.

While I think utility-based, pay as you go API pricing represents the majority of how we approach API consumption at the moment, I think having your business model aligned with, and focused on, existing, tangible products and services holds a huge amount of untapped potential for APIs. I'm thinking there are endless numbers of small businesses out there who would benefit from the process of mapping out their product supply chain and service life cycles, then opening these things up as a simple web API--even if it is just for internal and trusted partner use.

Twilio Provides Test API Credentials With Magic Phone Numbers

I am always on the hunt for the little things that make API integration easier, and while working to certify my Twilio API definition, I noticed their test credentials. When you are playing with the Twilio API, it's pretty easy to add new keys and create new apps, but they also offer test credentials, along with what they call "magic numbers", so that you can play without connecting to real phones or incurring actual charges on your account.

Many APIs provide you access to data or content, but Twilio enters the additional realm of much more complex, programmatic API resources. When getting up and going with these types of APIs, it really helps to have a sandbox to play in, and a ready to go set of test credentials provides this for users by default.
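
As an example of how little is required to start playing, here is a sketch using Twilio-style test credentials and one of the magic numbers -- the specific values are from my notes, so verify them against Twilio's docs before depending on them:

    # Send a "message" using test credentials -- nothing real happens, and
    # nothing is billed to the account.
    import requests

    test_sid = "ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"  # your *test* credentials
    test_token = "your_test_auth_token"

    response = requests.post(
        "https://api.twilio.com/2010-04-01/Accounts/%s/Messages.json" % test_sid,
        data={
            "From": "+15005550006",  # magic number that acts like a valid sender
            "To": "+14155551234",    # any number; no real SMS is sent
            "Body": "Hello from the sandbox",
        },
        auth=(test_sid, test_token),
    )
    print(response.status_code, response.json())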

If you are offering up more than just data and content via your API, you may want to follow Twilio's lead and create a set of permanent, or even regularly rotating, test accounts for consumers to use. It made my onboarding with Twilio significantly easier.

Serverless Approaches To Deploying Code Will Help Unwind Some Of The Technical Debt We Have

I am sure there is some equation we could come up with to describe the amount of ideology and / or dogma present alongside each bit and byte of code. Something that increases exponentially with each additional line of code or MB on disk. An example of this in action, in the wilds of the API space, is the difference between an SDK for an API and just a single sample API call.

The single API sample is the minimum viable artifact that enables you to get value from an API -- allowing you to make a single API request and receive a single API response. Very little ideology or dogma is present (it's there, but in much smaller quantities). Now, if an API provider hands you a Laravel SDK in PHP, a JAX-RS SDK in Java, or a React.js SDK, they are significantly cranking up the volume on the ideology and dogma involved with that code. All of it contributes to the type of technical debt I'm willing to assume along the way, with each one of my API integrations and wider technological solutions.
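
For contrast, this is the entire footprint of a single sample API call -- the endpoint and key are placeholders, but the minimal shape is the point:

    # The minimum viable artifact: one request, one response.
    import requests

    response = requests.get(
        "https://api.example.com/v1/words/serendipity",  # hypothetical endpoint
        headers={"Authorization": "Bearer YOUR_API_KEY"},
    )
    print(response.status_code)
    print(response.json())

Everything an SDK layers on top of this buys you convenience, at the price of someone else's ideology.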

I work hard to never hold up any single technology as an absolute solution, because there are none, but I can see the potential for the latest wave of "serverless" approaches to delivering code to help us unwind some of our technical debt. Like most other areas of technology, simply choosing to go "serverless" will not provide you the relief you need, but if you are willing to do the hard work to decouple your existing code, and apply the philosophy consistently to future projects, the chances that "serverless" will pay dividends in the reduction of your technical debt increase greatly.

I Am Seeing Significant Returns From Investing In Definitions Over Code When It Comes To My API Strategy

I am doing way more work on the creation of machine-readable OpenAPI Specs for APIs, indexed using machine-readable APIs.json files, than I am on the actual creation of APIs lately. About half of the API definitions I create are for existing APIs, with the rest of them describing APIs that should exist. With the existing APIs, in some cases I am creating client-side code, but mostly I am just focusing on a well crafted API definition. When it comes to the new API designs, I am focusing on a complete API definition, but also crafting both server-side and client-side code around the definition--when needed.
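
For those unfamiliar with the indexing side of this, here is roughly what a minimal APIs.json file looks like -- written from memory of the format, so consult apisjson.org for the authoritative spec:

    {
      "name": "Example API Stack",
      "description": "A machine-readable index of the APIs in this project.",
      "url": "http://example.com/apis.json",
      "specificationVersion": "0.14",
      "apis": [
        {
          "name": "Example API",
          "humanURL": "http://example.com",
          "baseURL": "http://api.example.com",
          "properties": [
            {
              "type": "Swagger",
              "url": "http://example.com/swagger.json"
            }
          ]
        }
      ]
    }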

Even when I do craft server or client code for an API definition, the value of the code is significantly lower than that of the definition(s). In my mind, the code is disposable. I want to be able to throw it away and start over with anything I am building, at any point. While I have made significant ideological investments in using Linux for my OS, AWS for my compute and storage hosting, MySQL for my database, and PHP + Slim for my API deployment framework, the code that operates within this framework has to be transient. Some code might end up having a long, long life, but if a piece of code isn't generating value and is in the way, I want to either get rid of it or rewrite it to better meet the requirements.

When it comes to delivering technology in my world, my investments are increasingly in the API definitions, the underlying data schemas, and the data and content that is actually stored and transmitted within them. The PHP, MySQL, JavaScript, CSS, and HTML are valuable, but second class citizens to the JSON and YAML representations of my APIs, schemas, and the valuable data and content they store. For me personally, having historically made significant investments in a variety of tech solutions, this provides me with the flexibility I need to live in the current online climate. This is something that has only been coming into focus in the last year, so I assume it will continue to sharpen over the next couple of years, but I am already seeing significant returns from investing in definitions over code when it comes to my API strategy.

The API Evangelist API Deployment Guide

There are many different ways to actually deploy an API. If you are a larger, more established company, you probably have existing tools, services, and processes set forth by IT for deploying APIs. If you are a startup, your developers probably have their own frameworks, or possibly a cloud service they prefer using. This guide looks to map out the world of API deployment, and some of the common ways companies, organizations, institutions, and government agencies are deploying APIs in 2016.

This API deployment guide breaks out many of the common companies and tools, while also looking to identify some of the common building blocks employed by leading providers in support of API deployment across organizations of all shapes and sizes. This research is conducted by API Evangelist, and deployment is just one aspect of a modern API life cycle that includes over 50 stops, from API design to deprecation.

I work to update my guides regularly, and if you purchase a copy, you'll get any updates to it for the next year. Every reader of API Evangelist guides helps support this research and enables the work to continue--thank you!

APIs At Brigham Young University

I have been tracking how APIs are used in higher education for some time now, keeping an eye on almost 50 campus API related efforts. I have my University API guide that I regularly update, but I was eager to push my storytelling forward around what is going on, so I have been working on some papers telling the stories behind some of the top higher ed API implementations I have access to.

What better way to kick this off than to showcase my favorite campus API group, over at Brigham Young University (BYU). The team, led by CIO Kelly Flanagan (@kelflanagan), has embraced APIs on campus in a way that keeps me excited about APIs and what is possible. In my opinion, BYU isn't just using APIs to shift the IT strategy at the school; they are using APIs to shift how their students and faculty see technology, and put it to work on their terms.

I'm pretty familiar with what has been happening at BYU when it comes to APIs, as I see Kelly and his team regularly, but I jumped on a Google Hangout with them so that I could get the latest. The result is a short five page essay about the history of the API efforts on campus, some of the benefits to faculty and students, and a glimpse at the future of APIs at the school.

The paper is freely available for you to download as a PDF, but I will also be working some of the stories from the paper into my regular blogging here on API Evangelist. I want to keep bringing attention to what they are up to at BYU, but also generate attention at other schools to what is possible when it comes to APIs. As I note in the paper, I'm also working with Davidson College in North Carolina, and will be working to keep spreading the conversation to other schools.

You can find me speaking at the University of California, San Diego this June, so stay tuned for more information about how APIs are being used in higher ed this summer--showcasing my conversations with BYU, Davidson, and UCSD as examples of how APIs are making an impact in higher education.

Working To Establish A Complete OpenAPI Spec For Leading APIs

I am always working as hard as I can to develop OpenAPI Specs that are as complete as possible for the APIs that I monitor. I call this my API Stack research. When possible, in addition to mapping out API operations using APIs.json, I also work to create a machine readable OpenAPI Spec for each API.

In most cases, I only have the time to profile the surface area of an API -- the host, base URL, all properties, and required parameters. I don't always have time to reach what I'd consider to be a complete API definition. That takes authenticating, achieving a successful request and response for each endpoint present, then generating a JSON schema for each response -- a significant amount of effort to do properly.
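
For a sense of the difference, a complete definition describes the response schema, not just the request surface area -- something like this sketch of a single, hypothetical endpoint in 2016-era Swagger 2.0:

    swagger: "2.0"
    info:
      title: Example API
      version: "1.0.0"
    host: api.example.com
    basePath: /v1
    paths:
      /messages:
        get:
          summary: List messages
          parameters:
            - name: page
              in: query
              type: integer
          responses:
            200:
              description: A page of messages
              schema:
                type: array
                items:
                  $ref: "#/definitions/Message"
    definitions:
      Message:
        type: object
        properties:
          id:
            type: string
          body:
            type: string

Getting to the definitions section for every response is the part that requires authenticating and making live calls.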

Thankfully, in addition to investing in this work myself, I am also getting the assistance of partners like DreamFactory, who are interested in getting specific API definitions completed, and making sure complete API definitions exist for the leading APIs out there today. Here are a handful of them I'm working on producing this week:

AngelList - Working to define all the endpoints for the startup and investment platform API.
AlchemyAPI - Working to define all the endpoints for the machine learning API.
CenturyLink - Working to define all the endpoints for the cloud computing API.
NTT Docomo - Working to define all the endpoints for the telco, image, and machine learning APIs.
Stripe - Working to define all the endpoints for the payments API.
Twilio - Working to define all the endpoints for the voice & messaging API.
Verizon - Working to define all the endpoints for their Internet of Things (IoT) API.

I've assigned each API its own Github repo, where I store the OpenAPI Specs, index them using APIs.json, and facilitate the conversation using Github issues. My goal is to streamline the process of making sure all the endpoints are represented, and that a complete definition exists for both the request and the response--while keeping an eye on the future, in a potentially crowdsourced, Github-centric way.

If there is an API you'd like to see completed, let me know, and I am happy to prioritize it after this batch. My goal is to focus a dedicated amount of time each week on this, but unfortunately that takes money, so I'm looking for investment from partners and the general community to help make sure it happens. If you want to help, feel free to ping me and let me know what you need.

Thinking About An API Proxy To Add Link Header To Each API Response

I was learning more about using the Link header for pagination yesterday as part of my work on the Human Services Data Specification (HSDS), and this approach to putting hypermedia links in the header got me thinking about other possibilities. Part of the reason I was considering the Link header for pagination on this particular project was that I was looking to alter the existing schema as little as possible -- I liked that I could augment the response with links using the header.

Another side thought I had along the way was around the possibilities for using it to augment 3rd party APIs, and APIs from an external vantage point. It wouldn't be too hard to route API requests through a proxy, which could add a header with a personalized set of links tailored for each API request. If the request was looking up flights, the links could be to value add services that might influence the decision, like links to hotels, events, and other activities. If you were looking up the definition of a word, the links could be to synonyms--endless possibilities.
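
A bare-bones sketch of the idea, using Flask and requests -- the downstream API and the injected links are hypothetical:

    # Relay a request to a downstream API and augment the response with a
    # Link header, without touching the response body.
    import requests
    from flask import Flask, Response, request

    app = Flask(__name__)
    DOWNSTREAM = "https://api.example.com"  # the API being proxied

    @app.route("/<path:path>")
    def proxy(path):
        upstream = requests.get("%s/%s" % (DOWNSTREAM, path), params=request.args)
        resp = Response(upstream.content, status=upstream.status_code)
        resp.headers["Link"] = (
            '<https://hotels.example.com/search>; rel="related", '
            '<https://events.example.com/nearby>; rel="related"'
        )
        return resp

    if __name__ == "__main__":
        app.run(port=8080)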

You wouldn't have to just use it for pagination and other common link relations; you could make it more about discovery, search, or even serendipity injected from outside sources. Anyways, I think the fact that you can augment an existing response using a header opens up a lot of possibilities for adding hypermedia behaviors to existing APIs. It might also be an interesting way to introduce existing API owners to hypermedia concepts, by showing them the value that can be added when you provide valuable links.

The JSON Schema Is Proving To Be As Valuable As The OpenAPI Spec

As my API Stack work gets more attention, folks are reaching out to me to see if I have done certain APIs, or to see if I'd prioritize some of the ones already on the list. One thing I'm also noticing is that people are often looking for the JSON schema of each API response just as much as they are looking for the OpenAPI Spec of the API surface area.

I had someone looking for the complete schema for the Reddit API today, while I was working to authenticate and record the responses for each endpoint of AngelList, AlchemyAPI, CenturyLink, NTT Docomo, Stripe, Twilio, and Verizon. An OpenAPI Spec for the request structure of an API will get you a long way in onboarding and learning about an API, but you will need the schema to complete any integration.
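
To show the kind of artifact people are asking for, here is a small JSON Schema describing a single, hypothetical response object:

    {
      "$schema": "http://json-schema.org/draft-04/schema#",
      "title": "Message",
      "type": "object",
      "properties": {
        "id": { "type": "string" },
        "body": { "type": "string" },
        "sent_at": { "type": "string", "format": "date-time" }
      },
      "required": ["id", "body"]
    }

Multiply that by every response for every endpoint, and you can see why reaching a complete definition takes real work.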

This is one of the reasons I'm working to establish a better process for certifying that an OpenAPI Spec is complete, because without definitions providing a JSON schema for each response, a spec will only get you part of the way--there is still so much work to be done. This is also why API service providers are looking to have it all defined, because a complete spec is what will be used to map to other systems, tools, and services.

After spending a couple of days going through Schema.org, and working on the Human Services Data Specification (HSDS), I'm feeling the gravity of having common, machine readable schema available for the industry to work from. I've spent a number of cycles in the last two years creating OpenAPI Specs for APIs, but often stopping short of requiring the JSON schema to be present. From here forward, I will equally prioritize the response schema--it is proving to be just as valuable as the request structure described using the OpenAPI Spec.

HTTP Header Awareness: Using The Link Header For Pagination

I was revisiting the concept of pagination for a specific project I am working on, and after consulting my API research, I came up with a suitable approach using the Link header. Beyond applying this in my specific project, I thought the usage of the header for pagination would be a suitable topic for helping with HTTP header awareness -- a topic I will be writing about regularly, to help folks be more aware of useful approaches to using HTTP headers.

Github has the best example of using the Link header for pagination that I could find. Github uses the Link response header to hold a handful of hypermedia link relations, including next, last, first, and prev--providing a nice way to handle not just pagination, but potentially any other related action you might want to take around an API response. It also provides a way to add link relations to any existing API design without adding to the actual response body -- which is one reason I decided to use it for my existing project.
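
Here is what that looks like in practice, along with a naive parser -- the header value mirrors the format Github documents for its paginated endpoints:

    # Parse a Github-style Link header into a dict of rel -> URL.
    LINK_HEADER = (
        '<https://api.github.com/user/repos?page=3&per_page=100>; rel="next", '
        '<https://api.github.com/user/repos?page=50&per_page=100>; rel="last"'
    )

    def parse_link_header(value):
        """Naive parser; assumes no commas inside the URLs themselves."""
        links = {}
        for part in value.split(", "):
            url, rel = part.split("; ")
            links[rel[5:-1]] = url[1:-1]  # strip rel="..." and <...>
        return links

    print(parse_link_header(LINK_HEADER))
    # {'next': '...page=3&per_page=100', 'last': '...page=50&per_page=100'}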

HTTP headers are a critical aspect of API integration, and an area where I feel many developers are lacking awareness. This is why I will be working harder to write up simple uses like the Link header for pagination, which can help API providers, as well as API consumers, make better use of common HTTP header patterns. To be clear, the Link header is not unique to Github; it is articulated in RFC 5988 on web linking--something I will add as tooling to my API design research, so it can be considered as part of an overall API design toolbox.

The Month At A Glance, Road Map, Support, And The Recent Posts For Your API Platform

I was playing with Microsoft's API Catalog, a tool to visualize and analyze the API overlap between standards specifications and type systems within browsers, and their footer caught my eye. I am always looking for quality examples of companies doing platform communications and support well, and I think their layout, and the building blocks available in their footer, are worthy of showcasing.

For me, these numbers, and the available communication and support building blocks, send the right signals--that a platform is alive and active. These are the data points I tune into to understand how well a platform is doing, or not doing. There are no absolutes when it comes to this type of monitoring, as anything can be gamed, but signals like Github activity, road map evolution, and blog storytelling can provide a vital heartbeat for any healthy API platform.

What Are The Most Relevant API Topics To Discuss At @APIStrat in Boston This Fall?

We are picking up speed with the planning of @APIStrat in Boston this November, and with the event committee assembled and the call for talks open, we are looking at the possible topics for sessions. We have seeded the list with the usual suspects, like API design, management, testing, and monitoring, but we also want to consider the alternative topics, and what is relevant to the API sector in 2016.

What is most important to you in 2016? Is it the trendier topics like bots, voice, and IoT? Is it more practical things like evangelism, versioning, and discovery? Is it the technical, business, or politics of APIs that are impacting your operations most in 2016? If you want to come and present, make sure you submit your talk by May 20th. Otherwise, you are welcome to tweet at us, email us, write a blog post, commit to a Github repo, or whatever else you desire--we'll add it into the mix for the event committee to review.

It is a lot of work to make sure @APIStrat keeps reflecting the wider API movement after three years, and we depend on you to keep us informed. I'm enjoying the fact that we are only doing one event this year, as it is giving us more lead time to engage in discussions with the community, and continue making it an inclusive event. Another thing I'd like to hear about from y'all is what is relevant to Boston, something I'm sure I will learn more about as I continue my profiling of Boston area businesses.

Let me know your thoughts please!

More Patents Turning APIs Into An Invasive, Royalty Generating Virus

The relationship between API provider and consumer is a fragile one. As an API provider, I am offering up my valuable digital assets: data, content, and digital resources. I would like you to take this application programming interface or SDK that I have built, and put it in your business systems and applications. As an API consumer, I'm given access to these valuable business assets, which I'm expected to use in a respectful way that is always in alignment with the provider's terms of service -- an environment where so much can go wrong, and does each day.

I watch companies take a wide range of tones when it comes to setting the stage for this relationship. Some companies are super protective of their valuable content (like Marvel Comics), some are very communicative and inviting, like Slack (inject your bot into us), while others wobble between a more open approach, then back to strict, like Twitter has done over the last couple of years. In my opinion, it comes down to how you see your digital resources, and your belief in intellectual property -- and when I say you, I mean you and your investors.

I try NOT to read all the patent notifications that come into my reader, as it fucking depresses me, but every once in a while I have to revisit them, to help remind everyone what an illness patents will be when it comes to APIs working. The fragile relationship I speak of above operates well, or not very well, depending on how loose things are, or how much friction exists at this layer. There are plenty of examples out there of APIs that do not do well when they restrict, or too heavily meter, API consumers.

To help build an image in your mind, imagine the Twitter API, and all the apps that have been built on it. OK, now add in Stripe for payments, Dropbox for storage, Instagram for images, Facebook for social, Amazon EC2 for compute, and YouTube for videos. Now, think about enforcing copyright on every API design, and patent licensing on each process involved. It would grind these billion dollar revenue engines to a screeching halt. Nobody will build on these platforms if they have to bake your legal-heavy design and process into their business.

Modern web APIs are a balance between the technical, business, and politics of how data, content, and algorithms are exchanged. Simple, ubiquitous web technology is working on the technical side of things. Simple, pay as you go, utility based, and tiered access is helping strike balance on the business side of things. TOS, privacy, security, and transparency are the knobs and dials of the political side of things. Let's not willingly invite something as invasive as patents into the API layer. Patent your algorithm, copyright your content, and license your data, but keep the API definition copyright free, and your API process patent free.

In the "secure cloud storage distribution and aggregation" patent I'm reading this morning, its not the patent that offends me. It is the patent being so focused on the API being the thing that makes the patent. Patent how you index, search, and secure your file storage wizardry, but the API design, and operations should NOT be a focal point of your patent. You want people to integrate with your "secure cloud storage distribution and aggregation", bake the API design and process into their businesses, you want the cloud storage industry to speak your API -- don't lock it down, otherwise API consumers will look elsewhere.