{"API Evangelist"}

Managing Your API Terms of Service And Privacy Policies On Github Like Medium Does

I always dig it when API stories spin out of control, and I end up down story holes. I'm sure certain people waiting for other work from me do not appreciate it, but these are where some of the best stories in my world come from. While writing the story about Best Buy limiting access to their API when you have a free email account, I ended up writing a story about Best Buy using Medium for their API platform blog presence, which ended up pushing me to read Medium's terms of service.

Anyways, to help manage the evolution of their terms of service and privacy policy, Medium uses Github to establish a historical record of all the changes to the legal documents that impact platform and API operations. Seems like a pretty sensible, default approach to the legal department for any platform. Hosting your legal department on Github, and making sure there is as much transparency and logging as possible across this area of operations, will play a central role in establishing and maintaining trust with your partners and consumers.

Maintaining the legal side of your platform operations on Github, taking advantage of the version control built in, makes a lot of sense. It also opens the door for using Github issue management, and the other more social aspects of Github, to assist in the communication of legal changes, as well as facilitate ongoing conversations around changes in real time. I can see eventually working this into some sort of rating system for API providers, a sort of open source regulatory consideration that is totally opt-in by API platforms -- if you give a shit you'll do it, if you don't, you won't.
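The version control piece is easy to demonstrate. As a minimal sketch (the filename and policy text here are made up for illustration), Python's difflib can produce the same kind of unified diff that Github displays between two revisions of a legal document:

```python
import difflib

def policy_diff(old_text, new_text, name="terms-of-service.md"):
    """Return a Github-style unified diff between two policy revisions."""
    return "".join(difflib.unified_diff(
        old_text.splitlines(keepends=True),
        new_text.splitlines(keepends=True),
        fromfile="a/" + name,
        tofile="b/" + name,
    ))

old = "We may share your data with partners.\n"
new = "We will never share your data without consent.\n"
print(policy_diff(old, new))
```

Every removed line shows up prefixed with a minus and every added line with a plus, which is exactly the kind of transparent change log that builds trust when the file in question is your privacy policy.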

See The Full Blog Post

I Am Seeing More API Platforms Manage Their Blog Presence Using Medium

One thing that struck me as I wrote my post about Best Buy no longer issuing API keys to free email accounts was the fact that Best Buy operates their developer blog on Medium--something I am seeing more of. As I discover new API-centric companies via my blog, Twitter, Product Hunt, AngelList, and the many other ways I tune into the space, I'm seeing more companies operating the blog portion of their presence in this way.

One of the voices in my head points out that this just doesn't seem like a good idea. It reminds me of hosting our blogs on Posterous. Medium doesn't let you map your domain, or sub-domain, to your blog (invite only), but I'm sure it's something they'll do soon. They don't have RSS, and they don't have a read API either? *warning bells* While I get the Medium thing, it seems like one of those neatly tended gardens, where many of the roads out are gated off with friendly hand-painted signs.

However, some of the other voices in my head chimed in that Medium is just asking for a non-exclusive license to use your content. They let you delete your account, and I know they are working hard on their API strategy. Also, I'm a big supporter of API providers being as scrappy as possible, as many of us have to do as much as we can with often non-existent budgets. Best Buy rocks it at leveraging Github, Medium, Twitter, and all the channels I usually recommend. So in the end I really can't bitch about API platforms using Medium for their blog presence.

When you use Medium you get the network effect for your blog presence--something I'm working to understand better, and leverage more as part of my own work. I'll keep tracking on the APIs I find that use Medium for their blog presence. Right now my only complaint is simple -- THERE IS NO RSS!!! Beyond that, I'll just track on and see how it all plays out. My advice is still to utilize WordPress, Jekyll, Blogger, or another hosted service to which you can map a subdomain, keeping all the valuable exhaust within your domain. Then follow the POSSE principles when incorporating Medium into the communication strategy for your API platform.

See The Full Blog Post

Best Buy Will Not Issue API Keys To Free Email Accounts And Wants To Get To Know Your Company

Best Buy is one of many recent examples I am seeing of public API providers working to strike a healthy balance within their API community. In an attempt to incentivize the behavior they desire within the Best Buy API community, the platform will not be issuing API keys to any email address that comes from the popular free email platforms (@gmail.com, @yahoo.com, etc.). While I hate seeing any public API access be tightened up, I can't help but sympathize with their move, and I have to support any API provider who works to set a healthy balance within their community.
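Best Buy hasn't published the actual mechanics of their check, but the basic idea is simple enough to sketch in a few lines of Python--the domain list here is my own guess at what counts as a "free email platform," not their real list:

```python
# My own guess at a "free email" blocklist -- not Best Buy's actual list.
FREE_EMAIL_DOMAINS = {"gmail.com", "yahoo.com", "hotmail.com", "outlook.com"}

def eligible_for_api_key(email):
    """Return True when the email's domain is not a known free provider."""
    domain = email.rsplit("@", 1)[-1].lower()
    return domain not in FREE_EMAIL_DOMAINS

print(eligible_for_api_key("dev@bestbuy.com"))    # True
print(eligible_for_api_key("someone@gmail.com"))  # False
```

The interesting part isn't the code, it is the policy decision behind it--trading a little friction at signup for a better understanding of who is actually consuming the API.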

While I am not sure that limiting access based upon email account is the solution they are looking for, they hit all the right notes for me:

  • Develop Human Relationships >> If we want to have a better relationship with you, our active users, we need to make better connections between your alpha-numeric key and the services we provide to you
  • Respect For What You Build >> If we disable a key because the email address is old, we may break an app. We don’t like breaking things.
  • Business With Companies >> Over the next couple of months we will transition to a new system that will associate API keys with a company and not an individual.
  • Empowering Educational Usage >> We are developing a program that we intend to have up and running before the start of the next school year that will accommodate educational use. 
  • Allow For Play & Exploration >> Similarly, we have ideas for how to accommodate events, hackathons and developer sandboxes to allow folks to test the waters without needing to go through a formal key sign up process.

I will always encourage companies, organizations, institutions, government agencies, and individuals to be as public as possible. More importantly, I will always encourage them to do it in the safest, most meaningful way possible, and when they are just working to cultivate and get to know their community -- I can only lend my support. Looking beyond the tech, APIs are all about hammering out how you discover and maintain digital relationships and partnerships, via your website, web and mobile applications, as well as your API platform.

Not every company will be able to do API the way Best Buy is. It's just not in the DNA of every company. The ones that will be most successful with it will be the ones that do the hard work of getting to know their community and establish sensible ground rules, but in my opinion the most critical part of it all is that you be as communicative and transparent about it as you possibly can. The Best Buy API team does this well, by sharing the thinking behind their difficult decision to stop issuing API keys to users with free email accounts.

See The Full Blog Post

The Annotated Code Walk-Throughs and Tutorials Over At Twilio

I am always trying to identify the common building blocks employed by leading API providers, and Twilio is one of the usual suspects I showcase. This time I am focusing on their annotated code walk-throughs and tutorials, which provide a pretty good model that other API providers can follow when planning their own tutorials.

Twilio offers up a tutorial for almost every API endpoint they offer. Some of the more popular tutorials have versions for almost every programming language, while some of the lesser traveled ones only have a single version. Once you click into each tutorial, it delivers, as the title implies, an annotated walk-through, showing you how to get up and running and make the API call in the language of your choice.

Each tutorial provides you a direct link to the code libraries, available on Twilio's Github account. It can get pretty busy in the walk-through section for Twilio, but the value is clear. I could envision a more portable, embeddable, and machine readable tool that would help API providers do this as well as Twilio does, but in a cleaner, plug and play way.
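To make that notion a little more concrete, here is a sketch of what a portable, machine readable tutorial definition could look like. This is purely my speculation and not an actual Twilio format, though the endpoint path shown mirrors their public SMS API:

```python
import json

# A hypothetical tutorial blueprint -- my speculation, not a Twilio format.
tutorial = {
    "title": "Send an SMS",
    "endpoint": "POST /2010-04-01/Accounts/{AccountSid}/Messages.json",
    "languages": ["python", "ruby", "node"],
    "steps": [
        {"annotation": "Authenticate with your account SID and auth token."},
        {"annotation": "POST the To, From, and Body parameters."},
        {"annotation": "Inspect the returned message SID and status."},
    ],
}

# Serialized as JSON, the same definition could drive an embeddable widget.
print(json.dumps(tutorial, indent=2))
```

A definition like this is what would make tutorials portable--any provider could publish one, and any documentation tool could render it.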

I've had tutorials as a common API management building block for some time. Maybe I'll take Twilio's model and expand on it, providing a more detailed blueprint that API providers can follow when planning their own approach. Maybe someone could also turn it into a simple service, or an open source solution that the API space could put to use.

See The Full Blog Post

The Lack Of An API And Healthy Partner Integrations Is An Early Warning System For Service Providers

I was disappointed to see the email in my inbox this morning from IFTTT about the end of their Pinboard integration. I also helped amplify Pinboard's founder when he was Tweet'n up a storm earlier, and I recommend you read his post: My Heroic and Lazy Stand Against IFTTT.

However, I had work to get done on an essay about the API effort over at Brigham Young University, and I had to prepare for some meetings this week around the Open Referral API definition, which helps people find government services--IFTTT bums me out, but priorities.

The loss of Pinboard integration in IFTTT sucks, but there is always Zapier. ;-) I began moving away from my IFTTT support back in 2014, after I began seeing their lack of an API, and the absence of a forward-facing business model, as early warning signs about the sustainability of IFTTT as a service provider.

I prefer having the option of paying for services on which I have a growing dependence for my personal digital presence, and it is vital when it comes to my business presence. This was a deal breaker for me when it comes to IFTTT being a service I could support.

I also strongly believe that if you are building a startup using the APIs of other companies, you should be offering an API for your own aggregation, automation, interoperability, and any other aspect of your tech--otherwise it isn't a service I want to support. 

There are many other approaches to integrating your cloud services, and the concept of iPaaS is a growing layer of the API space. Unfortunately there will also continue to be incentives for startups to not offer an API, to be secretive and shady about their operations, to not communicate honestly with their community, and to have fucked up terms of service. 

All we can do to combat this is to make sure we only use services that have a sensible business model in alignment with their community, provide a truly open API, possess terms of service that are more human than lawyer, and know how to actually communicate with their community. 

Which is why I support API providers like Pinboard, a service that plays a central role in the operation of API Evangelist. 

See The Full Blog Post

Exploring A New Way To Fund API Life Cycle Tooling By Open Sourcing The API Garage

The news out of Runscope makes today a good day to kick off discussion around a project that I've been helping push forward with the API Garage team, assisting them in finding the healthiest path forward for their API client tooling. As Runscope demonstrates, it is a tough time for API startups, something that adds fuel to my personal mission to do what I can to help startups find success. 

First, what is API Garage? It is one of the HTTP / web API client tools available today, alongside Postman, Paw, DHC, and Stoplight. API Garage is an Electron based solution you can download and put to work in helping you integrate with web APIs, allowing you to make calls and see the requests and responses, all without having to write code. This approach to working with APIs has evolved beyond use by just API consumers, and is quickly becoming the tool of choice for API development teams, applied at almost every stop along the API life cycle.

The API Garage team approached me a couple months back to discuss the roadmap, and figure out how we can evolve it beyond just being a web API client, helping it emerge as a "garage environment" where individuals and teams can work on APIs at any stage of the API life cycle. Inevitably this discussion led us to talking about how the API Garage would be licensed, and how it would generate revenue to sustain their vision. At this point, the API Garage team expressed interest in open sourcing the solution, and focusing on alternative approaches to generating the money they need to sustain the solution, and keep them working on meaningful API projects.

After a number of conversations, they settled on open sourcing the API Garage, and began focusing revenue generation on several key areas:

  • Sponsorship - The API Garage website, and the open source download, will have a handful of sponsorship spots, ranging from the default home page to a well placed banner location, which they will be filling with quality, complementary services and tooling providers.
  • Default APIs - When you download the API Garage, there will be a handful of default APIs available. The team will be carefully considering a small group of valuable, relevant, and usable API partners to fill these slots.
  • API Services - The current solution provides API testing and mocking services, and the new open source solution will offer opportunities for a diverse range of API services that serve almost every stop of the API life cycle. There will be a handful of default slots available for quality API services to sponsor.
  • Private Label - With the focus on the API Garage being open source, the team is committed to helping deploy it as custom, and private label solutions for companies who would like to establish an API Garage within their local organizational environment. 
  • Consulting - The team is also opening up to API consulting services, helping companies of all shapes and sizes with their API design, development, management, testing, strategy, and educational needs.

These are just a handful of the approaches we've sketched out to help the team make ends meet. The team has a short runway to focus on transforming the current API Garage into the open source version (targeting summer 2016), and will need to bring in additional revenue by the time they launch if they are going to make all of this work. 

There are two things that attracted me to this project. 1) The opportunity for an open source API life cycle solution to be a window for managing all stops along the life cycle. 2) The opportunity to help cultivate alternative, collaborative approaches to funding this window into the API life cycle, as well as potentially the APIs and API services that need to exist within any thriving API Garage. As demonstrated by Runscope's announcement, delivering the valuable API services we need is challenging, and we need to be pushing forward open source software, and open community revenue models, alongside the VC funded vision that is dominating the sector right now.

I have a notebook full of stories to publish, from weeks of conversations with the API Garage team, and lots of work to help flesh out what exactly an API Garage is. Think about the tech startup myth stories of the HP garage, and imagine how we can create an API focused environment where API engineers can craft their solutions. If you would like to know more about what the API Garage team is up to, head over to their website and contact them directly, or feel free to ping me directly, and I'll get you plugged in. 

I'm curious to see what the community can do with an open, collaborative API Garage, where we can share our API designs, and put to use the best of breed API services and tooling, in the service of a modern API life cycle.

See The Full Blog Post

This Is How APIs Will Deliver The Change We Need

I recently caught a glimpse of how APIs are going to deliver the change we need in this world. It began while I was attending a gathering of indie ed-tech folks on the campus of Davidson College in North Carolina, where 20-30 mostly non-developers discussed what indie ed-tech is, including many visions of what was dubbed "the personal API". While the gathering was very enlightening, it is what this gathering has set into motion, after we all parted ways, that I think has the most potential.

Since the gathering occurred, a rolling wave of API driven awareness has been picking up speed, with Indie Ed Tech, Colleagues/Friends, APIs, Unexpected Emergent Ideas, and dot dot dot, and Can The JED-API Power a Certification from a fellow who barks about and plays with web tech. There was Indie Ed-Tech: Revue/Reflections from the ed-tech Cassandra. I took a Journey to discover what is Indie Ed-tech with an expert generalist, and heard about Indie Educational Technology from a university chief information officer (CIO). I then read about how we are Pushing/Pulling Data – Thinking Computationally? Differently? from someone focused on technology integration in K12 and higher ed. Then I explored Lo-Fi Ed-Tech and The Personal in Indie from an edupunk with a mountaintop compound in Italy, and enjoyed the Reflections on Indie Ed-Tech from his partner in crime, someone who is just making things up as he goes. I enjoyed the recaps of the #IndieEdTech Design Sprint, #IndieEdTech Personal APIs & The Current State of Ed-Tech, and IndieEdTech Keynote Reflections from an instructional psychology and technology graduate student. I thoroughly enjoyed Framing Indie EdTech, Indie EdTech Design Sprint, and Indie EdTech: Future and Funding, and finally listening to the Vinyl API of One's Own from a director of digital learning.

This collective vision of Indie Ed-Tech, which includes some very personal views of what an API is, is how APIs will deliver the change we need.

It will not be the e-commerce vision of SalesForce, eBay, PayPal, and other API pioneers that will move the needle with APIs. It will not be the following wave of social API leaders like Twitter and Facebook, who connect all of us together using APIs. It will not be the API as a product vision of startups like Twilio, SendGrid, and Stripe, who shift the landscape with a complete API package.

A perfectly designed REST API that follows all the hypermedia rules, and the linked data vision of API visionaries, will not save us. No single API specification, schema, or standard handed down from above will provide the framework we need to make the change necessary. Our belief in the perfect API implementation will never unite and connect humans in the ways we need, and bring the balance that is necessary.

The well oiled API platforms of the big five: Amazon, Google, Microsoft, Facebook, and Apple will not bring the collective power needed to make web scale API change. Their API platforms, organization-wide unity, design strategies, and CEO mandates will never support the API power that is needed to achieve the global scale we will need.

The better late than never API implementations of last generation tech gorillas like Oracle, IBM, SAP, and AT&T will not begin to power even 5% of the potential that APIs will deliver. These 1000 lb legacy tech giants like to talk API, and like to pretend they get what APIs can do, but they will never realize what is actually needed to be API, beyond their own selfish needs.

Even the mighty US, UK, and other top governments, with their all knowing, all seeing, almighty NSA, military, and bureaucratic institutions, will never fully realize API, with all their open data efforts and global surveillance networks. Their belief that they have all the data, intelligence, network, and mobile access they will need will only distract them from what is actually possible with APIs.

It won't even be the over eager API evangelists, who spend all their time understanding everything that is API, that will change the world using APIs. These evangelists will only be the channel through which each individual receives the information and awareness about APIs each day, setting the stage for what will bring the change we need. Evangelists can spend the next 33 years watching, writing, speaking, and hacking on APIs, and still not move the needle in the same way that the API literacy showcased above will set into motion.

It will be the API literate individual, who understands that they can get access to their own data and information from any website, system, application, connected device, company, and institution using APIs. It will be people who understand that they can make their education, career, and the web into what they want, using APIs. That the web is programmable. A digitally aware individual who assumes full control over their online self, taking it back from the tech giants, understanding that they own all the exhaust from their online (and increasingly offline) personal and professional life. 

I have been excited about some of what I've seen while monitoring the API space over the last five years--something that is getting harder and harder to find each year. However, nothing I have seen makes me more hopeful and optimistic about what APIs can do than reading about each of these individuals who have been turned on to what an API is. It is not my vision. It's not IT's vision. It's not Silicon Valley's vision. It is their own vision of what an API is, an understanding that APIs are all around them, and their own interpretation of what APIs can do.

No single API implementation, tool, service, specification, or standard will set into motion the change that is needed. The real API story is about empowering every single individual to take control over their own digital self using APIs--something I believe should begin in education at the K-12, as well as the university level. This is the front line of APIs. This is how we will push back on Silicon Valley, technology, digital exploitation, and the NSAs of the world.

This is how APIs will help to deliver the change we need. #IndieEdTech

See The Full Blog Post

Some Milestones From The Last 15 Years Of Web API History

History is everything. Understanding where we have come from is critical to knowing where we are going. While pushing forward with the latest technology, it is always healthy to pause and take a look at the past. Someone Tweeted the link to my history page, and I realized it has been three years since I refreshed my view of the overall history of the space, so I wanted to take some time and add a few other milestones that I feel were significant along the way.

When I talk about APIs, I'm focused on one version that was born out of the enterprise during the Service Oriented Architecture (SOA) movement. Sometime around 2000, a portion of the SOA experiment left the enterprise and found a more fertile environment in the world of start-ups. In 2016, this version of API has re-captured the attention of the enterprise, as they see it being used in popular, public API driven services, and in the startups they are acquiring and gobbling up.

Where we stand in 2016, there are some obvious technical reasons why web APIs are finding success in companies of all shapes and sizes, and even within government; but not all the reasons for this success are technical. There are many other, less obvious aspects of web APIs that have contributed to their success, things we can only learn by closely studying the past and looking at why some of the pioneers of web APIs were successful, and have continued to be successful over the years.

In 2016, it is critical that we emulate the best practices that have been established over the last 16 years, following the lead of early API providers like Amazon, Salesforce, eBay, and Twitter--much of which is still being emulated by new API practitioners in 2016. Whether you are a startup, SMB, enterprise, institution, or government agency, you don't have to follow every example set in this 16 year history, but you should be aware of this history, and understand your place in the sector.

As I look back at each year, I see some clear patterns emerge that have defined the industry--patterns that need to be emulated, and some that should be avoided, as we plan our own API strategy and presence.


As the first .COM bubble was bursting, platforms were looking for innovative ways to syndicate products across e-commerce web sites, and web APIs, built on the backs of existing HTTP infrastructure proved to be the right tool for the job.

With this in mind, a handful of tech pioneers stepped up to define the earliest uses of APIs as part of sales and commerce management, kicking-off a ten year evolution that I consider as the early history of web APIs, defining the sector we all enjoy today.

However, even with the early success of APIs, the sector would struggle to reach a mature point, without several other critical ingredients that would prove to be as important as essential commerce variables like social, payments, and messaging. 


February 7th, 2000 Salesforce.com officially launched at the IDG Demo 2000 conference.

Salesforce.com launched its enterprise-class, web-based sales force automation as an "Internet as a service". XML APIs were part of Salesforce.com from day one. Salesforce.com identified that customers needed to share data across their different business applications, and APIs were the way to do this.

Marc R. Benioff, chairman and founder of salesforce.com stated, "Salesforce.com is the first solution that truly leverages the Internet to offer the functionality of enterprise-class software at a mere fraction of the cost."

Salesforce.com was the first cloud provider to take an enterprise class web application and API and deliver what we know today as Software-as-a-Service.

Even with SalesForce being the first mover in the world of web APIs, they are still a powerhouse in 2016. SalesForce continues to lead when it comes to real-time APIs, testing, deployment and most recently taking a lead when it comes to mobile application development and backend as a service (BaaS).


On November 20, 2000, eBay launched the eBay Application Program Interface (API), along with the eBay Developers Program.

The eBay API was originally rolled out to only a select number of licensed eBay partners and developers.

As eBay stated:

"Our new API has tremendous potential to revolutionize the way people do business on eBay and increase the amount of business transacted on the site, by openly providing the tools that developers need to create applications based on eBay technology, we believe eBay will eventually be tightly woven into many existing sites as well as future e-commerce ventures."

The launch of the eBay API was a response to the growing number of applications that were already relying on its site either legitimately or illegitimately.

The API aimed to standardize how applications integrated with eBay, and make it easier for partners and developers to build a business around the eBay ecosystem.

eBay is considered the leading pioneer in the current era of web-based APIs and web services, and still leads with one of the most successful developer ecosystems today. 


On July 16, 2002, Amazon launched Amazon.com Web Services allowing developers to incorporate Amazon.com content and features into their own web sites.

Amazon.com Web Services (AWS) allowed third party sites to search and display products from Amazon.com. Product data was made accessible using XML and SOAP.

From day one the API was integrated with the Amazon.com Affiliate Program, allowing developers to monetize their sites through purchases made at Amazon.com via links from their web sites.

Internet visionary Tim O'Reilly was quoted in the original Amazon Web Services press release saying, "This is a significant leap forward in the next-generation programmable internet."

APIs and Amazon both have roots in e-commerce, but APIs were quickly applied to other areas, resulting in social media, cloud computing, and almost every other component necessary to build the web and mobile Internet that we all use every day.


As API driven commerce platforms were still finding their footing, working to understand the best way to put APIs to work, a new breed of technology platforms emerged when it came to using content, media, and messaging on the web, in a way that was very user centric and socially empowering for individuals and businesses.

Publishing user generated content, and sharing web links, photos, and other media via APIs emerged with the birth of new social platforms between 2003 and 2006. This was an entirely new era for APIs, one that wasn't about money--it was about connections.

These new API driven social platforms would take technology to new global heights, and ensure that applications from here forward would always contain essential social features, defined via their platform APIs. 

Social, was an essential ingredient the API industry was missing.


del.icio.us is a social bookmarking service for storing, sharing, and discovering web bookmarks, founded by Joshua Schachter in 2003.

del.icio.us implemented a very simple tagging system which allowed users to easily tag their web bookmarks in a meaningful way, while also establishing a kind of folksonomy across all users of the platform--which proved to be a pretty powerful way of cataloging and sharing web links.


The innovative tagging methodology used by del.icio.us allowed you to pull a list of your tags, or public web bookmarks, by using the URL http://del.icio.us/tag/[tag name]. So if I was searching for bookmarks on airplanes, I could visit http://del.icio.us/tag/airplane and I would GET a list of all bookmarks that had been tagged airplane. It was that simple.

When it came to the programmatic del.icio.us interface, the API was built into the site, creating a seamless experience--if you wanted the airplane tags via HTML you entered http://del.icio.us/tag/airplane, if you wanted RSS of the tags you entered http://del.icio.us/rss/tag/airplane, and if you wanted XML returned you used http://del.icio.us/api/tag/airplane. This has changed with the modern version of the Delicious API.
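The pattern is worth spelling out, because it was essentially the whole API. A few lines of Python capture how one resource mapped to three representations--keep in mind these are the historical endpoints described above, not live URLs:

```python
# Historical del.icio.us URL patterns -- the service has changed hands
# since, so treat these as an illustration of the design, not live URLs.
def delicious_urls(tag):
    """Return the HTML, RSS, and XML variants for a tag's bookmarks."""
    base = "http://del.icio.us"
    return {
        "html": base + "/tag/" + tag,
        "rss": base + "/rss/tag/" + tag,
        "xml": base + "/api/tag/" + tag,
    }

for fmt, url in delicious_urls("airplane").items():
    print(fmt, url)
```

Same tag, three formats, and the format is the only thing that changes in the path--a design any non-developer could reason about.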

del.icio.us was the first concrete example of how the web could deliver HTML content alongside machine readable formats like RSS and XML, using a URL structure that was simple and human readable. This approach to sharing bookmarks would set the stage for future APIs, making APIs easy to understand for developers and non-developers alike. Any slightly technical user could easily parse the XML or RSS, and develop or reverse engineer widgets and apps around del.icio.us content.

del.icio.us has been sold twice since its early popularity: to Yahoo! in 2005, and to AVOS Systems in April 2011. However, del.icio.us was one of the pillar platforms that ushered in the social era of the API movement, establishing sharing via APIs as critical to the API economy, while also showing that simplicity rules when it comes to API design.


In February 2004 the popular photo sharing site Flickr launched. Six months later they launched their now infamous API, and six months after that, they were acquired by Yahoo.

Flickr was originally created as an online game, but quickly evolved into a social photo sharing sensation.

The launch of the RESTful API helped Flickr quickly become the image platform of choice for the early blogging and social media movement by allowing users to easily embed their Flickr photos into their blogs and social network streams.

The Flickr API is the driving inspiration behind the concept of BizDev 2.0, a term coined by Flickr co-founder Caterina Fake.  Flickr couldn't keep up with the demand for its services, and established the API as a self-service way to deal with business development.

The core concepts established by Flickr using its API would transcend the company and its acquisition by Yahoo. Business development using APIs is embedded in the philosophy of the business of APIs, pushing APIs into something beyond the technical. 

APIs became something that any company could use to actually conduct business with its partners and the public, but we still had a ways to go before APIs would grow up.


On August 15th, 2006, Facebook launched its long-awaited development platform and API. Version 1.0 of the Facebook Development Platform allowed developers access to Facebook friends, photos, events, and profile information.

The API used REST, and responses were available in an XML format, following common approaches by other social API providers of the time.

Almost immediately, developers began to build social applications, games, and mashups with the new development tools.

The Facebook Development Platform gave Facebook an edge over its then popular competitor MySpace, and helped Facebook establish itself as the top social gaming platform with games like Farmville.

While the Facebook API and platform is considered by many developers to be unstable, it continues to play a significant role in the evolution of the entire platform with applications and partnerships that drive new features and experiences on Facebook.


On September 20, 2006 Twitter introduced the Twitter API to the world.

Much like the release of the eBay API, Twitter's API release was in response to the growing usage of Twitter by those scraping the site or creating rogue APIs.

Twitter exposed the Twitter API via a REST interface using JSON and XML.

In the beginning, Twitter used Basic Auth for API authentication, resulting in the now infamous Twitter OAuth Apocalypse almost four years later, when Twitter forced all those using the API to switch to OAuth.

In four short years, Twitter's API had become the center of countless desktop clients, mobile applications, web apps, and businesses -- used even by Twitter itself in its iPhone, iPad, and Android apps, and via its public website for much of its existence (no longer true).

Twitter is one of the most important API platforms available, showing what is possible when a dead simple platform does one thing well, then opens up access via an API and lets an open API ecosystem build the rest.

Twitter is also one of the most cautionary tales, showing how your API ecosystem can begin to work against you unless you properly address the political considerations of an API ecosystem as it grows.

Business and Marketing

As APIs evolved from commerce through social, it was clear the industry was going to need some standardizing, by introducing common business practices. The industry needed to standardize how APIs were deployed, and needed marketing to help get the word out about the potential of APIs.

The establishment of common business and marketing practices for the API space took a lot of grassroots outreach, as well as storytelling on behalf of APIs, the companies behind them, and the industries they rose out of. Two separate API pioneers stepped up to help define the API industry we know today, between the years of 2005 and 2012.


While writing about the history of APIs, it is easy to be so focused on just APIs, that you overlook the single most important player in the entire history of the web API--ProgrammableWeb.

In July 2005, John Musser started ProgrammableWeb. According to his original about page:

ProgrammableWeb is a web-as-platform reference site and blog delivering news, information and resources for developing applications using the Web 2.0 APIs.
I started this site because I couldn't find what I was looking for: a technology focused starting point for web platform development. (For a bit more see my initial post.) Although no guarantees, the last time I started a reference site it somehow became Google's highest rated link on the topic. Given that this site will be a collaborative effort with community input as well, this can be what we make it.
I hope you find the site useful.
John Musser - Seattle, August 2005

John’s original blog post on why he started ProgrammableWeb, says it all: Why? Because going From Web Page to Web Platform is a big deal.

Web APIs are a big deal! Whether it's social networking, government, healthcare or education--having a programmable platform to make data and resources available will be a critical part of how commerce and society operate from here forward.

John made an early decision to showcase open and RESTful approaches to deploying APIs over the parallel attempts at Service Oriented Architecture (SOA) and Web Services, and focused on telling stories about open APIs--way before it was the thing to do in Silicon Valley.

When I started API Evangelist in July 2010 (5 years after ProgrammableWeb), and started talking about the business of APIs, the technology of web APIs was already widely accepted in Silicon Valley, because of the stories that had been told on ProgrammableWeb.

As we progress through 2013, a year in which I think we can confidently say APIs are moving mainstream, I feel we owe much of this success to ProgrammableWeb. The stories John, Adam, and other writers have been telling on ProgrammableWeb have been crucial to quantifying and defining the API industry--allowing us all to build, iterate, and move things forward.

Without stories around the technical, business and politics of APIs, these virtual interfaces would not have been able to find a place in our real life worlds.


In November 2006, the first API service provider, Mashery, came out of "stealth mode" to offer documentation support, community management, and access control for companies wishing to offer public or private APIs--covered in a TechCrunch blog post titled API Management Service is Open for Business.

At this point in time, in 2006, we were moving from the social period of APIs into the cloud computing phase with the introduction of Amazon Web Services. It was clear that the world of web APIs was getting real, and there was opportunity for companies to offer API management as a service.


While there were tools for deploying APIs, there was no standard approach to managing your API deployment. Mashery was the first to bring a standard set of services to API providers, helping set the stage for the future growth of the API industry.

It would take almost six more years before the API industry would come of age, something Mashery significantly helped contribute to. The space we all know today was defined by early API commerce pioneers like SalesForce and Amazon, social pioneers like Flickr and Delicious, and from Mashery, who helped define what is now known as the business of APIs.

In 2013, Mashery was acquired by Intel, and then by TIBCO in 2015, further validating that the API industry truly is coming of age.


One early API pioneer saw the need to provide web developers with a simple JavaScript map that could assist users in navigating online, localized content, or even navigating the real world.

Web developers quickly saw the potential of embeddable maps, and found ways to hack these mapping sources to innovate and build the web properties users desired, focused on solving the local problems we all face daily.

This early use of APIs in providing mapping tools and services for developers laid the groundwork for much of the early mobile developer talent that would drive the coming mobile API period.

Google Maps

On June 29th, 2005, Google launched the Google Maps API, allowing developers to put Google Maps on their own sites using JavaScript.

The API launch was just shy of six months after the release of Google Maps as an application, and was in direct response to the number of rogue applications that were hacking the application. Google Maps was immediately so popular that developers hacked the JavaScript interface and developed applications such as housingmaps.com and chicagocrime.org. The demand was so great for information on hacking Google Maps that it spawned books such as Mapping Hacks and Google Hacks from O'Reilly.

Google Maps API started a trend of API mashups with its valuable location based data, with over 2000 mashups to date.

The API demonstrates the incredible value of geographic data and mapping APIs, as well as the power users can have in influencing the direction an application or API takes. Lars Rasmussen, the original developer of Google Maps, commented on how much the team learned from the developer community by watching how they hacked the application in real time, applying what they learned to the API we know today.

Few other companies have the resources to tackle a problem like mapping the world's resources and delivering a reusable, API driven resource, like Google did. Google has played many roles in moving the API space forward, but Google Maps played a pivotal role in the history of APIs.

Cloud Computing

As APIs were generating social buzz across the Internet, Amazon saw the potential of a RESTful approach to business, internalized it, and saw APIs in a way that nobody had before--giving birth to an approach to using APIs that was much more than just e-commerce; it would re-invent the way we compute.

Amazon transformed the way we think about building applications on the web, delivering one of the essential ingredients we needed for APIs to work, by putting APIs to work. What we now know as cloud computing changed everything, and made the mobile, tablet, sensor, and other API driven realms possible.

Amazon S3

In March 2006, Amazon launched a new web service, something completely different from the Amazon bookseller and e-commerce site we've come to know. This was a new endeavor for Amazon: a storage web service called Amazon S3.

Amazon S3 provides a simple interface that can be used to store and retrieve any amount of data, at any time, from anywhere on the web. It gives developers access to the same highly scalable, reliable, fast, inexpensive data storage infrastructure that Amazon uses to run its own global network of websites.

Amazon S3 or Simple Storage Service was initially just an API. There was no Web interface or mobile app. It was just a RESTful API allowing PUT and GET requests with objects or files.

Developers using the Amazon S3 API were charged $0.15 a gigabyte per month for storing files in the cloud.

With this new type of API and billing model, Amazon ushered in a new type of computing we now know as cloud computing.

This also meant that APIs were no longer just for data or simple functionality. Now they could be used to deliver computing infrastructure.
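The S3 model really was that simple: every object lives at a predictable URL, and you store or retrieve it with an HTTP verb. A minimal Python sketch of the request shape (the bucket and key names are hypothetical, and real S3 calls must also carry signed AWS credentials):

```python
# Build the raw HTTP request line and Host header for an S3-style call,
# illustrating the simple PUT/GET object model of the original S3 REST API.
def s3_request(verb: str, bucket: str, key: str) -> str:
    # Each object is addressed by a simple, human-readable URL path
    return f"{verb} /{key} HTTP/1.1\r\nHost: {bucket}.s3.amazonaws.com"

# Store an object, then fetch it back, with nothing but HTTP verbs
put_request = s3_request("PUT", "my-bucket", "photos/cat.jpg")
get_request = s3_request("GET", "my-bucket", "photos/cat.jpg")

print(put_request.splitlines()[0])  # PUT /photos/cat.jpg HTTP/1.1
```

In practice developers of the era used signed REST calls or SOAP; the point of the sketch is that the entire storage service surface reduced to verbs against URLs.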

Amazon EC2

In August 2006, shortly after Amazon launched its new cloud storage service Amazon S3, the company followed with a new cloud computing service dubbed Amazon EC2 or Elastic Compute Cloud.

Amazon EC2 provides re-sizable compute capacity in the cloud, by allowing developers to launch different sizes of virtual servers within Amazon data centers.

Just like its predecessor Amazon S3, Amazon EC2 was just a RESTful API. Amazon wouldn't launch a web interface for another three years.

Using the Amazon EC2 API developers can launch small, large and extra large servers and pay for every hour that the server is running.
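That metered, pay-per-hour model is easy to sketch. The rates below are hypothetical placeholders, not historical EC2 prices:

```python
# Hypothetical hourly rates per instance size -- illustrative only
HOURLY_RATES = {"small": 0.10, "large": 0.40, "xlarge": 0.80}

def running_cost(size: str, hours_running: int) -> float:
    """You pay only for the hours a virtual server is actually running."""
    return round(HOURLY_RATES[size] * hours_running, 2)

# A small server running around the clock for a 30-day month
print(running_cost("small", 24 * 30))  # 72.0

# A large server spun up for a single 8-hour batch job, then terminated
print(running_cost("large", 8))  # 3.2
```

The elasticity was the point: infrastructure cost became a function of actual usage, called up and torn down through the API.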

Amazon EC2, combined with Amazon S3 has provided the platform for the next generation of computing with APIs at the core.

Cloud computing was an essential ingredient the API industry was missing, and it would fuel almost every aspect of growth over the next 10 years. The most significant part of this story is that Amazon's cloud APIs were not just making the company's digital resources available to other businesses, they were also driving much of the growth across every other sector of the API economy.

A Mobile World

With the introduction of the iPhone and Android smartphones, APIs had evolved from powering e-commerce, social, and the cloud, to delivering valuable resources to the mobile phones in our pockets, which were quickly becoming commonplace around the globe.

APIs make valuable resources modular, portable, and distributed, making them a perfect channel for developing mobile and tablet applications of all shapes and sizes.

A small group of API driven technology platforms have helped define the space and won over the hearts and minds of both developers and the end users of the applications they develop.


In March 2009 Foursquare launched at the SXSW interactive festival in Austin, TX.

Foursquare is a location-based mobile platform that makes cities more interesting to explore. By checking in via a smartphone app or SMS, users share their location with friends while collecting points and virtual badges.

In November 2009, after a round of angel funding from several investors including Union Square Ventures and O'Reilly AlphaTech Ventures, Foursquare launched their API.

At the time of the API launch, Foursquare had an impressive set of applications developed by a closed group of partners, including an Android application and an augmented reality experience with Layar.

Even with growing competition from early mover Gowalla and major players like Facebook and Google, Foursquare has emerged as the dominant mobile location-sharing and check-in platform.


On October 6, 2010 Instagram launched its photo-sharing iPhone application.

Less than three months later, it had one million users.

Kevin Systrom, the founder of Instagram, focused on delivering a powerful but simple iPhone app that solved common problems with the quality of mobile photos and users' frustrations with sharing.

Immediately, many users complained about the lack of a central Instagram web site or an API, with Instagram remaining firm on focusing its energy on the core iPhone application.

In December, a developer named Mislav Marohnić took it upon himself to reverse engineer how the iPhone app worked, and built his own unofficial Instagram API.

By January Instagram shut down the rogue API and announced it was building one of its own.

Then in February of 2011, Instagram released the official API for the photo platform.

Within days, many photo applications, photo-sharing sites, and mashups built around the API started showing up.

Instagram became a viral iPhone app sensation, but quickly needed an API to realize its full potential, asserting the platform's place in history as one of the defining players in the mobile period of APIs.



In 2008, a new API-as-a-product platform launched, called Twilio, which introduced a voice API allowing developers to make and receive phone calls from any cloud application. In recent years Twilio has also released text messaging and SMS short code APIs, making itself a critical telephony resource in many developers' toolboxes.

Twilio is held up as a model platform to follow when evangelizing to developers. Twilio has helped define which technical and business building blocks are essential for a healthy API driven platform, set the bar for on-the-ground evangelism at events and hackathons, and worked hard to showcase, support, and invest in its developer ecosystem.

Alongside Foursquare and Instagram, Twilio has come to define mobile application development, helping push APIs into the mainstream. While Twitter has sometimes been held up as a cautionary tale when it comes to APIs, Twilio has demonstrated that, when done right, API driven ecosystems do work.

By 2011, the bar for delivering APIs via HTTP had been well established by early pioneers like SalesForce and Amazon, but Twilio showed how mature the business of APIs had become with its evolution into the mobile period. However, mobile development via APIs owes its roots to the foundation laid by the commerce, social, and cloud API pioneers.

JavaScript Object Notation

In the years between 2005 and 2010, as the web API space matured, JSON, or JavaScript Object Notation, an open standard, grew to be the dominant choice of API developers. It took a number of years, but JSON steadily outpaced XML as the preferred format for sending and receiving data by API providers.

JSON use evolved out of a need for stateless, real-time server-to-browser communication, without using browser plugins such as Flash or Java applets, which had been the dominant methods in the early 2000s. The JSON organizational website was officially launched in 2002, but it wasn't until Yahoo! began offering some of its Web services in JSON in 2005, and then Google used it for its GData protocol in 2006, that we started to see widespread adoption of the format by API providers and consumers.

JSON is a language agnostic data format derived from JavaScript, but as of 2016, the code to generate and parse JSON-format data is available in many programming languages. While we are seeing some evolution in the space towards another machine readable format called YAML, which has shed the brackets that define JSON, making it more readable for some--JSON is still the clear winner when it comes to web APIs.
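The appeal is easy to see: JSON structures map directly onto the native types of nearly every language. A quick Python illustration of a typical API round trip:

```python
import json

# A typical API response body: nested objects and arrays
payload = {"user": {"name": "kin", "tags": ["api", "history"]}, "count": 2}

# Serialize to the compact, bracket-delimited JSON wire format
wire = json.dumps(payload)

# Any consumer, in any language, can parse it back into native types
parsed = json.loads(wire)
print(parsed["user"]["tags"][0])  # api
```

Compared with XML, there is no schema, no namespaces, and no tag boilerplate standing between the developer and the data, which is a big part of why web API providers converged on it.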

The switch from XML to JSON has marked the maturing of the web API space, going from hobby to an actual business solution that can be used to describe essential business resources--resulting in near complete adoption in 2016.

The Ongoing Evolution of Online Commerce

Over the first decade of the 21st century, online commerce APIs were still evolving, with essential elements like products, sales, auctions, shopping carts, and payments playing a central role. Many API providers would come and go, but only a handful delivered a precise approach to APIs that would elevate their offering, making an impact on how we approach commerce APIs, as well as almost any other digital resource.


By September 2011, startups and investors had read the writing on the wall, and the proven "API as a product" model began being applied to disrupt the payment industry, with the launch of Stripe. Like Twilio, Stripe was built for developers, and did everything right, from API design to documentation, support, and pricing that worked for web and mobile application developers integrating payments into their business and consumer solutions.

Right along with compute, storage, location, and messaging, payments are an essential resource for any commercial web or mobile application, and a simply priced, easy-to-adopt payment API proved an instant hit with developers. I considered adding Authorize.net and Paypal to the history of APIs, but in my opinion it took 10 years of digital commerce evolving via APIs, with providers like Amazon and eBay, and the API-as-a-product business model established by Twilio, before a standalone payment provider like Stripe could exist and make the impact that it has.

Payments are a mission critical resource for developers, and will continue to be in the future. Stripe continues to set the bar for how you do payment APIs, as well as how you do APIs in general, and is held up by the entire API industry as the model to follow. Stripe continues to do one thing (payments) and do it well, setting the tone for what APIs can do to disrupt a well established industry like online payments.

Hardening Security Practices

As more companies looked to open up their digital assets via web APIs, the need to harden security practices emerged, but at the same time these practices needed to reflect the simple nature of the modern web API that developers expected. Traditional enterprise approaches to identity and access management would not always fly within web API implementations, with the majority of providers opting to go with Basic Auth or API keys when securing their APIs, but two approaches to securing APIs have evolved along the way.


In 2006, a movement was born out of Twitter and the social bookmarking site Ma.gnolia, out of a frustration that there were no existing standards for platforms, developers, and users to manage API access and resource delegation. By 2007, a small group gathered to draft a proposal for a new protocol, resulting in what became the OAuth Core 1.0 draft, which then evolved into an OAuth working group within the Internet Engineering Task Force (IETF).

By October 2012, OAuth 2.0 had emerged as the next evolution of the protocol, focusing on client developer simplicity while also providing specific authorization flows for web applications, desktop applications, mobile phones, and devices. OAuth 2.0 has seen wide adoption by leading API providers, quickly establishing it as one of the first major open standards the web API community would embrace.
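The most common of those flows, the authorization code grant, has a simple shape: send the user to the provider's authorize endpoint, then exchange the returned code for an access token server-to-server. A sketch of the two requests, where the endpoint URL, client id, and redirect URI are all hypothetical:

```python
from urllib.parse import urlencode

# Hypothetical provider endpoint -- every OAuth 2.0 provider publishes its own
AUTHORIZE_URL = "https://provider.example.com/oauth/authorize"

def build_authorize_url(client_id: str, redirect_uri: str, scope: str) -> str:
    # Step 1: redirect the end-user here so they can grant (or deny) access
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
    }
    return f"{AUTHORIZE_URL}?{urlencode(params)}"

def build_token_request(client_id: str, client_secret: str,
                        code: str, redirect_uri: str) -> dict:
    # Step 2: the app exchanges the returned code for an access token,
    # server-to-server, so the user's password never touches the app
    return {
        "grant_type": "authorization_code",
        "code": code,
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,
    }

url = build_authorize_url("my-app", "https://my-app.example.com/callback", "read")
```

The key property is delegation: the third-party app ends up with a revocable token scoped to "read", never the user's credentials.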

While OAuth can be celebrated as a security standard for the API space, the evolution hasn't been without its problems. In July 2012, one of the original OAuth champions, Eran Hammer, resigned his role as lead author for the OAuth 2.0 project, withdrew from the IETF working group, and removed his name from the specification, citing a conflict between the open web and enterprise cultures, stating that the IETF as a community is "all about enterprise use cases", and "not capable of simple." What is now offered is a blueprint for an authorization protocol, he says, and "that is the enterprise way", providing a "whole new frontier to sell consulting services and integration solutions."

While OAuth 2.0 is not the perfect solution for delegating access to resources via APIs, it is the best we have at the moment. The approach provides a viable solution that allows API platform providers to secure resources in a way that enables developers to easily access them, with the involvement of end-users. Even if OAuth 2.0 has become a tool of the enterprise, it provides meaningful delegation, and has enabled the space to safely and securely expand and integrate at a steady pace over the last few years.

JSON Web Tokens (JWT)

At the same time OAuth has been maturing, another industry standard (RFC 7519) evolved, called JSON Web Tokens, providing an open way to securely represent exchanges between two parties online. The tokens are designed to be compact, URL-safe, and usable in single sign-on (SSO) contexts via the web. JWT claims are typically used to pass the identity of authenticated users between an identity provider and a service provider, or any other types of claims as part of regular business activity.
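The compact format is just three base64url-encoded pieces joined by dots: header, claims, and signature. A minimal HS256 sketch of that structure using only the Python standard library (the claim values are hypothetical, and real deployments should use a vetted JWT library rather than rolling their own):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use URL-safe base64 with the trailing "=" padding stripped
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(claims: dict, secret: bytes) -> str:
    # header.payload.signature, each part base64url encoded
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    # HMAC-SHA256 over the first two parts proves the token wasn't altered
    signature = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{signature}"

token = make_jwt({"sub": "user-123", "iss": "example-idp"}, b"shared-secret")
print(token.count("."))  # 2 -- header.payload.signature
```

Because the signature covers the header and claims, a service provider holding the shared secret can verify the identity provider's claims without a callback, which is what makes JWT such a lightweight middle ground.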

JWT began being worked on in September of 2010, with the first draft becoming available in July of 2011. A growing number of API providers are using JWT as a middle ground between simple API keys and the sometimes overwhelming OAuth implementations that can create friction for developers.

I think JWT has the potential to flourish outside of the challenges OAuth has faced from the enterprise, at least for a couple more years, until it sees the same amount of adoption as OAuth.

Both OAuth and JWT have helped round out the API security stack, where, along with Basic Auth and API keys, API providers now have a robust set of tools that allow them to secure the valuable resources being made available via web APIs.

The Now Infamous Yegge Rant

Echoing the API history Amazon has been putting down, in October of 2011 there was an accidental post from a Google employee about Google+. The internal rant was accidentally shared publicly, and provides some insight into how Google approached APIs for their new Google+ platform, as well as insight into how Amazon adopted an internal service oriented architecture (SOA).

The insight about how Google approached the API for Google+ is interesting, but what is far more interesting is the insight the Google engineer who posted the rant, Steve Yegge, provides about his time working at Amazon, before he was an engineer with Google.

During six years at Amazon he witnessed the transformation of the company from a bookseller into the almost $1B Infrastructure as a Service (IaaS) cloud computing leader. As Yegge recalls, one day Jeff Bezos issued a mandate, sometime back around 2002 (give or take a year):

  • All teams will henceforth expose their data and functionality through service interfaces.
  • Teams must communicate with each other through these interfaces.
  • There will be no other form of inter-process communication allowed: no direct linking, no direct reads of another team's data store, no shared-memory model, no back-doors whatsoever. The only communication allowed is via service interface calls over the network.
  • It doesn't matter what technology they use.
  • All service interfaces, without exception, must be designed from the ground up to be externalizable. That is to say, the team must plan and design to be able to expose the interface to developers in the outside world. No exceptions.

The mandate closed with:

 Anyone who doesn't do this will be fired.  Thank you; have a nice day! 

Everyone got to work, and over the next couple of years, Amazon transformed itself internally into a service-oriented architecture (SOA), learning a tremendous amount along the way. While this story has proven to be more myth than reality, I think the real impact of the story is in how this myth has been heard and passed around the API sector, told and retold by IT, developer, and business people around the globe.

This story came at a time when many companies were struggling with the scary possibility of operating public APIs, and it has allowed them to refocus much of the value that APIs bring to the table internally. The Yegge rant provides an important story that companies can tell themselves as they begin their API journey, keeping things internal in the beginning, but with hopes that someday they can go public, and find the success that Amazon has with their API platform.

Open Source Software and Now APIs With Github

In tandem with the evolution of the cloud, another company was being born that would make yet another monumental impact across the API space. In late 2007, Tom Preston-Werner and Chris Wanstrath came together to improve on the open source distributed version control system, Git. The pair were looking to improve on the existing Git experience and develop a hub for coders, and by mid-January of 2008, after three months of nights and weekends, they launched Github into private beta.

Along with the simplification of using Git at its heart, and the social network that brought coders together, Github has also leveraged APIs all along the way. The Github API provides developers with access to all aspects of the Github platform, providing the ability to manage the software development life cycle, while also building community along the way.

As the potential of Github in software development was being realized, Github did another seemingly simple thing, which would further expand its use across the API sector, by launching Github Pages. The new solution would allow project websites to be deployed alongside Github master repos, something that would tweak the meaning of exactly what a repo could be used for. 

API providers would begin using Github Pages to host their API developer portals, to host API SDKs and code samples, and even to publish event presentations and manage the publishing of open data. Github has emerged as the platform of choice in the API space, and is used at almost every stop along the API life cycle, leveraging Git and a robust API to orchestrate and automate the API driven backend of the latest wave of web, mobile, and device-based solutions.

Changing The Way We Communicate Around Our APIs With Swagger And The Swagger UI

In 2010 and 2011, a new way to approach the old SOA way of describing services emerged, called Swagger. The new API definition format was developed by Tony Tam (@fehguy) to meet API needs at Wordnik, helping manage the evolution of their dictionary API. Swagger provides API providers a new way to describe the surface area of any web API, allowing for the generation of documentation, code libraries, and many other things developers need to understand what an API does, and how to put it to work.

Swagger is often known for its tooling for deploying a new type of API documentation, one that is interactive, allowing developers to make API requests and see the details of the request and the results before they ever write any client code. However, the interactive API documentation was just the beginning, and the API definition format would eventually be applied to almost every stop along the API life cycle.
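A minimal Swagger 2.0 definition gives a feel for how an API's surface area gets described in a machine readable way. The host and path below are hypothetical, loosely in the spirit of Wordnik's dictionary API:

```yaml
swagger: "2.0"
info:
  title: Example Dictionary API   # hypothetical example, not Wordnik's actual spec
  version: "1.0.0"
host: api.example.com
paths:
  /words/{word}:
    get:
      summary: Look up a word
      parameters:
        - name: word
          in: path
          required: true
          type: string
      responses:
        "200":
          description: The word's definitions
```

From a document like this, tooling can generate interactive documentation, client SDKs, mock servers, and tests, which is what turned the format into the central contract described below.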

Swagger has matured to version 2.0, and has become the central contract that defines the arrangement between API provider and consumer. In 2015, Swagger was acquired by SmartBear Software, with the specification put into the Linux Foundation. In 2016, the specification re-emerged as the OpenAPI Spec, and is now governed by the Open API Initiative (OAI), the organization formed as part of the move to the Linux Foundation. Even amidst all the turmoil, the OpenAPI Spec is still rapidly expanding in use across the web, providing a machine readable way for API providers, consumers, and even business stakeholders to describe the valuable API resources being exchanged as part of the API economy.

Apiary Teaches Us To Be API Design First

Swagger gave us a way to describe our APIs, but many API providers still applied it after an API had been developed, until one company came along and helped move the conversation earlier in the API life cycle. The Apiary.io team used their own API definition format, called API Blueprint, not just to describe and document an API, but also to allow designers to mock it before anyone got their hands dirty writing code. This API design first approach to API development has had a profound effect on how we look at the API life cycle, allowing us to make mistakes and bring in key stakeholders before things ever get costly.

What Apiary brought to the table wasn't just about making it easier to design, mock, develop, and document our APIs, they pushed the space to open up the API conversation with consumers, and key business stakeholders much earlier on in the life cycle, before things went down a bad road, and were set in stone. This process allows everyone involved to get to know the resources being made available via APIs, and design a solution that better matches how the resources will be experienced, not just how the resource is stored. 

API design first has become a mantra for many companies, and API service providers. While it isn't truly a reality for all who recite the phrase, it provides a healthy focus for API designers, architects, and business stakeholders, at varied stages in their API journey. Many companies will need this focus to get them through many of the challenges they face along the way, as they try to operate in this new online, API driven web, mobile, and device driven world. 

A Glimpse At The Internet of Things From Fitbit

By 2009 and 2010, it was becoming clear that APIs could be used to deliver the resources we need for the increasing number of mobile phones that were becoming ubiquitous. Amidst this rapid growth of mobile, another company popped up that would see the potential of connecting devices to the Internet, with the birth of the Fitbit. The new fitness and health tracking device would allow users to track their activity, health, and other key wellness indicators, which could then be connected to our mobile phones, helping plant the seeds for what we now call the Internet of Things (IoT).

In February 2011 Fitbit quietly launched their API, providing connectivity to the data that was uploaded to the Internet from the tracking device, via our mobile phones. Two months after Fitbit launched their API, they announced the first wave of partners who had integrated with the fitness and health device. This partner potential is why companies of all shapes and sizes were beginning to deploy APIs, allowing for 3rd party companies to tap into the growing number of valuable resources being made available online.

While Fitbit is not responsible for the Internet of Things, as devices being connected to the Internet via wifi and bluetooth is nothing new, they do provide a solid example of IoT in action, one that is publicly traded, and has seen both consumer, and commercial success. Whether you call it the quantified self, wearables, or Internet of Things, Fitbit has captured the imagination when it comes to Internet connected devices. 

Integration Platform as a Service (iPaaS)

As developers realized the potential of web APIs, a wave of new companies emerged that saw the potential for non-developers to put APIs to work in the everyday business and consumer world. In November 2011, Zapier began publishing simple connectors between popular cloud platforms that would allow anyone to put APIs to work in managing their increasingly online world. 

By June of 2015, Zapier launched its third-party developer platform, which allowed API providers to build their own connectors. The connectivity that companies like Zapier offer reflects older, more enterprise approaches like Extract, Transform, and Load (ETL), which helped businesses move data and information around on their networks. The big difference with this new breed of provider is that the connectors employ simple icons representing popular API driven services, and focus on the API driven cloud, moving beyond the company network.

There are more than 50 providers that I track on who provide iPaaS services, of all shapes and sizes, continuing to legitimize the concept, but not all pay it forward by providing an API as well--a significant part of the concept working. While iPaaS helps smooth over some of the more difficult aspects of API integration, it shouldn't hide it altogether, and eliminate the possibility for API access by consumers.

iPaaS isn't just about moving data and content from point A to B, it is about aggregating, syncing, and migrating valuable API driven resources. As the number of APIs grows, the number of iPaaS providers also increases, providing a wealth of API driven resources that any business user, or even developer, can put to work for them.
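The pattern behind these connectors is easy to sketch out: a trigger on one platform feeds an action on another. Here is a hypothetical, stripped down illustration of that trigger and action model in Python, with stand-in functions in place of the real API calls:

```python
# A minimal, hypothetical sketch of an iPaaS-style connector:
# poll a trigger for new items, and feed each one to an action.
# The trigger and action below are stand-ins for real API calls.

def new_form_entries(seen_ids, entries):
    """Trigger: return entries we have not processed yet."""
    return [e for e in entries if e["id"] not in seen_ids]

def add_crm_contact(crm, entry):
    """Action: push the entry into a (stand-in) CRM store."""
    crm.append({"name": entry["name"], "email": entry["email"]})

def run_connector(seen_ids, entries, crm):
    """One polling cycle: move anything new from trigger to action."""
    for entry in new_form_entries(seen_ids, entries):
        add_crm_contact(crm, entry)
        seen_ids.add(entry["id"])

entries = [{"id": 1, "name": "Ann", "email": "ann@example.com"},
           {"id": 2, "name": "Bob", "email": "bob@example.com"}]
seen, crm = {1}, []
run_connector(seen, entries, crm)
print(crm)  # only the unseen entry moves across
```

The value these providers add is in hiding exactly this plumbing behind simple icons, while keeping track of what has already been seen, retried, and delivered.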

Obama Mandates Federal Government To Go Machine Readable By Default

As a follow-up to Executive Order 13571, issued on April 27, 2011, requiring executive departments and agencies to identify ways to use innovative technologies to streamline their delivery of services, lower costs, decrease service delivery times, and improve the customer experience--Barack Obama directed federal agencies to deploy Web APIs.

The White House CIO released a strategy, entitled "Digital Government: Building a 21st Century Platform to Better Serve the American People", providing federal agencies with a 12-month plan that focuses on:

  • Enabling more efficient and coordinated digital services delivery
  • Encouraging agencies to deliver information in new ways that fully utilize the power and potential of mobile and web-based technologies
  • Requiring agencies to establish central online resources for outside developers and to adopt new standards for making applicable Government information open and machine-readable by default
  • Requiring agencies to use web performance analytics and customer satisfaction measurement tools

While the mandate itself didn't do much to move the open data and API needle in the federal government, it did mobilize many people who were looking to make change in government. In addition to the mandate, a wave of open data, and API savvy CTOs and CIOs have led the charge at the White House, and groups like 18F have taken up the cause of open data and APIs across the federal government. 

At the same time this change was happening at the federal government level, open data and APIs were also making change on the ground in city, state, and county governments across the country. While not all early visions of open data have been realized, the Obama mandate marked a major milestone in how our government works, due in part to the concept of the web API.

Setting A Very Negative Precedent In The Oracle v Google API Copyright Case

Even with all the gains the API industry has made in the last 15 years, it hasn't been without its major potholes, speed bumps, toll booths, detours, and disruptions. Just as the API space was seeing some amazing contributions, and growth, a chill was sent across the industry by a court case brought by Oracle against Google, which claimed that the Java API had been copied by Google, and was something that was protected under copyright. 

In May 2012, a jury in the case found that Google did not infringe on Oracle's patents, and the trial judge ruled that the structure of the Java APIs used by Google was not copyrightable. However, by 2014, the Federal Circuit court partially reversed the district ruling, deciding in Oracle's favor that the APIs were indeed protected under copyright. A petition was submitted to the United States Supreme Court on June 29, 2015, but was denied, sending the remaining issue of fair use back down to the district court.

While the Java API is a different breed of API, than the web APIs that have gained momentum, and there remains the fair use discussion, the court case has sent shockwaves across the API sector. There is a lot of uncertainty involved with companies doing APIs, and the API copyright precedent adds yet another concern for both API providers, and consumers, adding unnecessary strain to the space. Web APIs flourish when they are used as an external R&D lab between a company, its partners, and the public, and the dark cloud of API copyright threatens this balance. 

Twitter Sends All The Wrong Signals To Its Community in 2012

At the same time we were dealing with the fallout from the Oracle v Google case, one of the poster children of the modern API movement sent a series of chilling messages, and veiled threats, to its then fast growing API ecosystem. In June 2012 Twitter published a post explaining the need for delivering a consistent Twitter experience for users, followed up by a very ominous post in August of 2012 talking about changes coming down the line for the Twitter API.

While Twitter was just tightening up its control over its brand, applications, and its community, something all API providers face, the way it approached the situation sent such a negative vibe to its community that the developers revolted. Twitter made it clear that it was in competition with its API ecosystem, and was trying to take back control over some of the more successful areas of development that had been occurring within the ecosystem, areas already being met by businesses built by API developers.

Everything we know of as Twitter was built by its developer ecosystem, a relationship that was very public, and encouraged by Twitter, until the company no longer needed the free labor, and took on a significant amount of funding, requiring it to shift its course. Twitter needed to generate revenue, and made it very clear that they were taking back the most successful areas of the platform, something that would have a very chilling effect on the API community, and is something that the company has never recovered from.

Even though Twitter co-founder Jack Dorsey reassured developers that Twitter cares about its developers, as he retook the reins of his stumbling company in 2015, the trust had already been broken. Proving that trust is one of the most important aspects of API platform operations--something that, once it has been broken, will be almost impossible to recover from.


As the momentum in the API space grew in 2011 and 2012, the traction API service providers were seeing caught the attention of some of the more established tech giants who have dominated the tech sector for decades. The well defined discipline of API management, set into motion by Mashery who is showcased above, had ripened to the point that made it a very attractive acquisition target for the enterprise, and we saw a handful of acquisitions ring out across the space in 2013.

In late 2012 we saw the first acquisition, of Vordel by Axway, which set off a series of high profile API management provider acquisitions in 2013, beginning with Mashery being purchased by Intel, then Layer 7 by CA Technologies, and Apiphany by Microsoft later in the year. The acquisitions sent the signal to markets that the API space had come of age, the space was maturing, and the big boys were taking notice.

In a little less than a decade, API management had grown up to be a legitimate business, proving to be one that would attract the attention of the biggest tech companies in the space. While the acquisitions have legitimized the value of API management solutions, it hasn't all been good, as the attention from the enterprise has often meant a shift in focus by the investors of popular API service providers, looking for the big pay-off, shifting away from many of the priorities that made APIs successful--operating on the open Internet, as opposed to behind a corporate firewall.

Apigee IPO

Even with all the acquisitions in 2013, the biggest milestone for the API management space was the IPO of one of the API pioneers, Apigee. In May of 2015, Apigee Corporation, the developer of an API-based software platform, filed a registration statement on Form S-1 with the U.S. Securities and Exchange Commission (SEC) relating to a proposed initial public offering of shares of its common stock.

The API management acquisitions were validating, but one of the leading companies going public was a significant milestone, marking that the space was indeed a real thing (we hope), and potentially something that mainstream markets now acknowledged. In the tech sector we are all surrounded by like-minded folks who are usually believers by default, when out in the real world there are large sectors of business who are much more skeptical about what is relevant.

While the Apigee IPO performance has been mild at best, it still legitimizes not just API management, but also brings validation to the wider concept that the web can be used as a driver of real world business, not just mashups, and online play. In 2015, after fifteen years of evolution, web APIs now have a representative on Wall Street, setting the stage for wider growth in many established industries like banking, insurance, health care, and beyond.


In early 2014, Stewart Butterfield, one of the original founders behind the pioneering photo sharing platform included in this history, launched a team messaging solution named Slack. After Butterfield left Yahoo, who had acquired Flickr in 2005, he began building a game called Glitch, which, while enjoying a small cult following, was not a commercial success, and by 2012 had to shut the doors and lay off staff.

One byproduct of the gaming platform was a messaging core they had built, which, after shutting down, they spun off into a separate product they continued to work on throughout 2013. Once released in 2014, the platform was an immediate hit with the VC and Silicon Valley community, and quickly became a huge messaging success, but equally as important, via its API the platform spawned a huge number of successful integrations, as well as a fast moving bot ecosystem. 

In 2016 Slack has become the epicenter of a chat and messaging bot evolution that originally focused on the Twitter ecosystem, but has become more about business productivity, and other business solutions, injected into the workplace team environment via the popular messaging platform. This bot movement has spawned a whole new wave of interest from VCs, and while the concept is nothing new, Slack, Twitter, and other API or messaging driven platforms are giving rise to this new bot as an API client environment.

Amazon API Gateway

In 2015, AWS continued to define the API space, and demonstrate their dominance, by releasing the AWS API Gateway, which allows any AWS customer to design, deploy, manage, and monitor their APIs via their existing AWS cloud infrastructure. While many cried that this was a killer of many of the existing API management service providers, after time has passed, it seems to be a natural progression of the API space, as well as telling of Amazon's role in the space.

As the AWS API Gateway press release information states:

“create an API that acts as a “front door” for applications to access data, business logic, or functionality from your back-end services, such as workloads running on Amazon Elastic Compute Cloud (Amazon EC2), code running on AWS Lambda, or any Web application”

The new gateway will take all that existing infrastructure you have accumulated (in the cloud), and it:

“..handles all the tasks involved in accepting and processing up to hundreds of thousands of concurrent API calls, including traffic management, authorization and access control, monitoring, and API version management.”

Distilling down the lessons from the last five years, and selling it:

“With the proliferation of mobile devices and the rise in the Internet of Things (IoT), it is increasingly common to make backend systems and data accessible to applications through APIs.”

To me, the release of the AWS API Gateway is a pretty significant milestone in the evolution of the API. By 2006 the web had matured, the Internet was being used for much more than just consumption, and the API community was realizing that we could deploy vital digital resources using the Internet as a vehicle. Almost 10 years later, Amazon understands the opportunity in enabling you to do this for yourself--helping you either embark on, or speed up, your API journey, which they've been on for over 15 years.

Allowing you to manage any of your digital assets as an API, using AWS API Gateway, is just the beginning of the expertise that Amazon is packaging up for all of us in the latest release.

Delivering On Promise Of Voice Enablement With The Alexa Voice Service & Skills Kit

Joining in on the wider conversation around the Internet of Things, Amazon has released several IoT focused solutions, but none have made an impact on the space, and potentially the future of APIs, more than the Alexa Voice Service. I hesitate to include this as a milestone in what I consider to be the history of APIs, but what Amazon is doing is already making significant waves when it comes to how APIs are consumed.

In the summer of 2015, Amazon introduced their voice enabled device, the Amazon Echo, which was supported by a suite of APIs they bundled under the Alexa Skills Kit (ASK), and now also the Alexa Voice Service (AVS). Much like the rest of the IoT platforms, the Amazon Echo still has to prove itself with real world usage, but the skills kit and voice service have emerged at just the right time in the evolution of mobile and voice, as well as complementing the number of API resources available. 

Just as messaging platforms like Slack are providing a potential new way to reach consumers, the Alexa Voice Service is providing a new way to access valuable API driven resources. More importantly, I feel the concept of the "Skills Kit" is providing an entirely new way for API providers to think about how they expose their valuable resources, making them available in ways that are more meaningful to home, and business users. Only time will tell if Alexa becomes part of the overall API consciousness, but after less than a year of operation, I am seeing signs of the platform being a very important milestone in the evolution of the space. 

Understanding History

Understanding history is critical to understanding where we are going. Calling this document the "history" of web APIs seems kind of silly--we are actually talking about a span of just 16 years. But there are so many important lessons to be learned from the approach of these API pioneers, and the marks they've left on the space, that we can't ignore this history. If the technologists had their way, APIs would have remained purely about commerce, but with the radical innovation of companies like Amazon, Twitter, Twilio, Slack, and others, we now understand that APIs needed several essential ingredients to succeed: commerce, social, cloud, messaging, voice, and more.

Of course, all of this has to make money, but APIs need to be scalable, while also delivering the meaningful tools, services, and resources that are important to end users, otherwise none of this matters. As we stand solidly in the mobile period of API evolution, looking ahead to a period that will be more about devices and an Internet of Things (IoT), we need to understand our own history and how we've gotten here, to make sure we make the right decisions for what is next.

Web APIs are about delivering valuable, meaningful, scalable, and distributed resources across the World Wide Web. While Silicon Valley keeps pushing forward with the next generation of technology solutions, we need to make sure that we know our past.

See The Full Blog Post

The Unintended Consequences Of API Patents

As I was reviewing patent #20160070605: Adaptable Application Programming Interfaces And Specification Of Same, from yet another person I know, after I pick my head up off the desk, I begin thinking about all of the unintended consequences of API patents. Here is the abstract from this work of beauty:

Aspects of the disclosure relate to defining and/or specifying an application programming interface (API) between a client and a computing device (such as a server) in a manner that the client, the computing device, or both, can evolve independently while preserving inter-operability.

This is something that WE ALL should be building into our clients, and is fundamental to all this API shit even working at scale. This particular patent is owned by Comcast, so I'm assuming it will be wielded in the courtroom, and leveraged in back room net neutrality discussions among cable and telco giants. There really aren't a lot of unintended consequences involved when two or three 1,000 lb gorillas battle it out. The little people always suffer in these battles, and there really isn't much I can do--I will leave this to my friends in the federal government to take on.

Where I feel like I should be speaking up more is when it comes to the patents being filed by my startup friends, like the hypermedia patents I talked about a few months ago. For weeks after I published that story, my friends came out of the woodwork to tell me this is what you have to do in this game, and that they are well meaning, and would never ever use the patent for anything bad. Everyone I talked to I respect, and I do believe they mean what they say, but it's the unintended consequences that keep me up at night.

What happens after your startup is acquired, and you have cashed out, and are long gone? Are you going to follow the company who acquired your startup, track their litigation, and speak up? Maybe spend some of your fortune to protect the little guy or gal? What happens when your startup fails, and your investors part out your company, and all of its assets (cough cough, patents)--what recourse do you have here? Your VCs will assume your friendly stance on patent usage, right? Right. 

I know many of you like to brush me off as being an optimistic hippie, naive about how things work (all without actually looking at my background). My response to that is you should probably look around a little, understand the beast you work for and the damage they do in the world. I think you should examine the decisions you make when getting into bed with certain money men, and realize the machine you've become a cog in. Only then will you realize that the stories you tell me, and yourself, to help you sleep at night, are fairy tales handed to you by the machine that will consume you. 

See The Full Blog Post

The More I Gather OpenAPI Specs The More I Realize How We Obsess Over The Unnecessary

I spend a lot of time gathering, creating, and organizing machine readable OpenAPI Specs, as part of my API Stack, and personal API stack work. I'm not insane enough to think I can create OpenAPI Specs for all of the public APIs out there, I'm just trying to tip the scales regarding how many API definitions are out there, to increase the usage and value of open tooling, which in turn will increase the number of people who will create their own OpenAPI Specs. (just kind of sort of insane)

As I do this work, I realize how easily I can obsess over unnecessary, and meaningless metrics and goals in the tech space. Two things have stood out for me as I do this work:

  • Overall Number of API Specs - Much like our obsession over the completely meaningless and incorrect number of APIs that are in the ProgrammableWeb directory, the number of OpenAPI Specs means nothing. At some arbitrary point, things will change and people will get on board--it is already happening.
  • The Complete OpenAPI Spec - Having a complete OpenAPI Spec that contains ALL endpoints, parameters, and underlying data schema is a myth, and really doesn't matter. You only need as much as you are going to use. Maybe to API providers this matters, but to consumers, you only need what you need. The trick is figuring out what this is, without dumping everything on a consumer.

I won't stop what I am doing. As an experienced technologist I know when I am simply a tool in a larger game, and I perform as the automaton that is expected of me. However, I will not obsess over the completeness of the API specs, or ever showcase the overall number of OpenAPI Specs that I have--I'll just keep on working.

In this data driven world, it is easy to get caught up in focusing on mythical, meaningless data points. I'm not saying we shouldn't be doing this, because I often use these data points to motivate and push my work forward, but I think the lesson for me is to know when an obsession over some arbitrary number becomes unnecessary or unhealthy. Another important aspect of this realization for me is separating out when I personally set these numbers, versus when they are handed down from the wider tech space, and I had no part in helping craft them--this is when I think the chance of moving into the unhealthy realm is greatest.

See The Full Blog Post

FullContact But For OpenAPI Specifications

I get these regular updates from FullContact when there is new information available about the contacts I have added to my contact list of people I care about. Anytime there is a new photo, social network, or other element of their contact information updated, I get notified, and I can choose to update it in my CRM.

Would someone go ahead and create this, but for OpenAPI Specifications? All you have to do is use Github, and build your own index of the websites of leading companies who are publishing APIs (Common Crawl is a good start), and begin keeping track of ALL of the OpenAPI Specs and API Blueprints that are increasingly spread across the web. Then you will need to develop some sort of API definition diff solution (which I've talked about before), and then send me any changes or updates that I do not have in my directory of API definitions--which you know, because you have it indexed already.
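To illustrate the diff piece of this idea, here is a crude, hypothetical sketch in Python that compares the paths and operations of two already-parsed OpenAPI Spec documents, reporting what was added or removed between crawls:

```python
# A crude sketch of the API definition diff idea: compare the paths
# and operations of two OpenAPI Spec documents (already parsed into
# dictionaries), and report what changed between two crawls.

def diff_specs(old_spec, new_spec):
    """Return (added, removed) sets of 'METHOD /path' operations."""
    def operations(spec):
        ops = set()
        for path, methods in spec.get("paths", {}).items():
            for method in methods:
                ops.add(f"{method.upper()} {path}")
        return ops
    old_ops, new_ops = operations(old_spec), operations(new_spec)
    return new_ops - old_ops, old_ops - new_ops

old = {"paths": {"/users": {"get": {}}, "/orders": {"get": {}}}}
new = {"paths": {"/users": {"get": {}, "post": {}}}}

added, removed = diff_specs(old, new)
print(added)    # the new POST operation
print(removed)  # the dropped /orders endpoint
```

A real service would obviously go deeper, diffing parameters, responses, and schema, but even a path-level diff like this is enough to trigger the FullContact-style notifications I am asking for.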

You can offer premium services, like private repositories, and pathways to the wealth of API definition driven tooling out there. Again, this is something I could very well do, except I am one person, and not really interested in doing anything beyond all the work I already do. Especially if it involves taking VC money, scaling, and being any closer to the machine than I already am. So if you could get to work on this for me, and help solve this growing need in the API space, that would be great! Yeah. Also make sure and cut me in for 10% of all the $$$ you are going to make.

See The Full Blog Post

Deploying The API Service Providers That I Depend On Within My Own Infrastructure

I play with a lot of services that are looking to provide solutions to the API industry, and I'm always looking to better understand what leading API service providers are using to deploy their warez. I was test driving the testing and monitoring solutions from Opsee this week, and separate from the solutions they provide (which I'll talk about later), I thought the deployment of their API testing and monitoring solutions was worthy of talking about all by itself.

Opsee deploys as a micro-instance within my AWS stack, and gets to work testing and monitoring the APIs that I direct it to, providing a very precise, and effective way of doing monitoring.

I do not think this approach will work in all scenarios, for all API providers, but I think packaging up the services, so that API providers can deploy within their stack, and run within the cloud or on-premise environment they choose, is a potentially very powerful formula.

I have written before about offering up your APIs as wholesale or private label solutions, and I would categorize what Opsee is doing as offering up your API industry service as a wholesale or private label solution. Many companies will do just fine consuming your SaaS or publicly available API driven solution, but more sophisticated operations, and potentially regulated companies, are going to need a solution that will run within existing infrastructure, not outside the firewall. 

I could see bandwidth and CPU intensive situations also benefiting from this approach. Opsee's way of doing things has gotten me thinking more about how we package up and deploy the services we are selling to API providers. Once Opsee was up and running in my stack, using a set of keys I setup and configured especially for it, it got to work monitoring the endpoints I told it to. I could also see this approach working as a locally available API, where I tell my systems to integrate with an API made available from the deployed instance as well--either permanently, or on a more ephemeral time frame.

There is lots to consider, but with the evolution in container tech, I could see this approach being applied in a lot of different ways, allowing companies to pick exactly the API services they need (a la carte), and deploy them exactly where they are needed, eliminating the need to depend on services outside the firewall.

See The Full Blog Post

Exposing The Meaningful Skills Our APIs Possess For Use In The Next Gen Messaging And Voice Apps

As I'm working through my morning work monitoring the API space, I'm processing stories about the availability of valuable resources, like the House Rules Committee data being released in XML formats, and ExoMol, the molecular line lists DB used in simulation of atmospheric models of exoplanets, brown dwarfs, and cool stars.

I feel fortunate to live in a time where the world is opening up such valuable resources, making them available online--available for anyone to use, remix, improve, and make better. My faith in APIs doesn't come from any single API, it comes from the possibilities that will exist when individuals, companies, organizations, institutions, and government agencies all publish valuable resources using APIs.

While there is still a lot of work ahead, I'm seeing the early signs of this reality emerging across my API monitoring in 2016. I'm coming across so many extremely valuable, openly licensed, machine readable resources that can be used in some very interesting ways. The trick now is figuring out how we expose the most meaningful parts of these resources, and make sure they get found by the people who will actually put them to use. As the number of APIs increases, this is something that is going to get harder and harder, and the need for value even more critical.

Another dimension to this discussion is the growing number of channels we need to make our API resources available in. Web and mobile are still king when it comes to consuming APIs, but quickly devices, messaging, voice, bots, and other channels are growing in use. The next wave of API evangelism is going to require that the right people (domain experts) are available to help expose the most meaningful skills that our APIs possess, via this growing number of quick moving channels.

An example of this in action, using one of the valuable resources above, could involve making the Congressional activity that is most relevant and important to me available in my Slack channel (or messaging app of choice), or even available via my Amazon Echo, using Alexa Voice Skills. How do we start carving out meaningful skills from government, and other open data, using simple APIs? How do we use these to educate individuals, either as an average citizen, or maybe in a professional or commercial scenario?

We have many, many years ahead of us, helping individuals, companies, institutions, and government understand why they need to be exposing valuable data, content, and other digital resources via simple web APIs. However, alongside these efforts, we are going to need armies of other individuals who have the ability to identify valuable resources, and help craft simple, usable, and meaningful endpoints, that can be added as skills within the web, mobile, device, messaging, bot, and voice apps of the future.

See The Full Blog Post

Your Microservices Effort Will Fail If You Only Focus On The Technical Details

I have self-censored stories about microservices, because I have felt the topic is as charged as linked data, REST, and some parts of the hypermedia discussion. Meaning, there are many champions of the approach who insist on telling me, and other folks, how WRONG they are in their approach, as opposed to helping us work through our understanding of exactly what microservices are, and how to do them well.

For me, when I come across tech layers that feel like this, they are something that is very tech saturated, with the usual cadre of tech bros leading the charge, often on behalf of a specific vendor, or specific set of vendor solutions. Even with this reality, I've read a number of very smart posts, and white papers on microservices in the last year, outlining various approaches to designing, engineering, and orchestrating your business, using the "microservices diet".

Much of what I read nails much of the technology in some very logical, and sensible ways--crafted by some people with mad skills when it comes to making sense of very large companies, and software ecosystems. Even with all of this lofty thinking, I'm seeing one common element missing in most of the microservices approaches I have digested--the human element.

I hear a lot of discussion about technical approaches to unwinding the bits and bytes of technical debt, but very little guidance for anyone on how to unwind the human aspect of all of this, acknowledging the years of business and political decisions that contributed to the technical debt. It's one thing to start breaking apart your databases and systems, but it's another thing to start decoupling how leadership invests in technology, the purchasing decisions that have been made, and the politics that surround these existing legacy systems.

I don't know about you, but every legacy system I've ever come across has almost always had a gatekeeper--an individual, or group of individuals, who will fight you to the death to defend their budget, and what they know about tech (or do not know). I've encountered systems that have a dedicated budget, which only exists because the system is legacy, and with that gone, the money goes away too--sell me a microservices solution for this scenario!

Another dimension to this discussion is that investors in microservice solutions are not interested in their money being used for this area. It just isn't sexy to be spending money on dealing with corporate politics, unwinding years of piss poor business decisions, and educating and training the average business user around Internet technology. If you do not unwind these very human led, politically charged business environments, you will never unwind the systems that exist within their tractor beams. Never. I don't care how much YOU believe.

In the end, I'm not trying to make you feel like you are going to fail. My goal is to encourage more investment in this area by the microservice pundits, the vendors providing solutions to the space, and the VCs who are pouring money into these solutions. Many of the young, energetic folks at the helm of startups do not fully grasp the human side of corporate operations, and the potential quagmire that exists on the ground in front of them.

I am hoping that a handful of service providers out there can lower the rhetoric around their services and tooling, so that expectations get set at more realistic levels. Otherwise the push-back against the first couple failed waves of microservice implementations will become impenetrable, and blow any chance of making it work.

See The Full Blog Post

What I Am Seeing As A Minimum Viable Bot Presence

I have looked at way more bots than I should have in the last couple days, and I'm beginning to see similar patterns emerging across bot implementations, in sync with what I shared as part of my advice to API service providers. After you look at hundreds of APIs, and now a couple hundred bot implementations, you really begin to see what some of the common building blocks of the successful bot implementations are:

  • Domain - Having a domain dedicated to the bot, its operations, and the community around it.
  • Website - A simple, modern, and informative website for folks to discover, and put your bot to work.
  • Logo - Having a simple, modern, and often clever logo and overall branding for your bot's presence.
  • Twitter - Have a genuine, active Twitter account that actually engages in conversations with the community.
  • Github - Establish an active, and useful Github presence via a dedicated user account or organization.
  • Blog - Provide a thoughtful, active, and informative blog that engages with the community, customers, and the public at large.
  • API - You are using open APIs, and messaging formats to make your bot work, so pay it forward with APIs and webhooks.
  • Monetization - You gotta make some money, so how are you going to keep the lights on, and feed and clothe your bot? ;-)
  • Support - Every bot will need support, even if it is automated. Make sure you answer questions via Twitter, and Github.

The most interesting bots I came across, whether they are Twitter, Slack, or Telegram bots, all had at least half of the items listed above. I'm guessing we are going to see a huge surge in the number of bots that are available, as well as the platforms on which bots can operate, and I am thinking that the bots that follow these patterns will be floating to the top of the churn.

I'm just getting started documenting the common building blocks in this recent surge of API driven bot activity. I'll keep adding the most interesting bot solutions to my research, and keep track of what I feel the best parts of the sector are. I'm curious to see where all of this goes. Not every bot is capturing my attention right now, but I am seeing enough interesting approaches to using APIs for bot delivery, as well as providing the resources bots will need, to keep my attention.

See The Full Blog Post

Savvy API Providers Submit Pull Requests To Update Their OpenAPI Specs In My Github Repo

I'm constantly working to hand-craft, scrape-craft, and auto-generate OpenAPI Specs, and APIs.json files for as many of the top APIs as I can. It is something Steve Willmott (@njyx), the CEO of 3Scale, always flicks me shit about, saying I shouldn't have to do that--API providers should be doing this! While I agree, I feel like we haven't reached the point where all providers understand the importance of having an up to date OpenAPI Spec available for their API (some even have them, and work to hide them!).

This is something the savvy API providers like SendGrid are doing, rolling around Github, making sure copies of their OpenAPI Specs are up to date. 

Thanks Elmer, you da man. Now I just need to convince another 2K API providers of the importance of having up to date machine readable API definitions available, and actively maintaining them. Having your definitions up to date, and easy to find, increases the chance a developer will load them up in their favorite HTTP client like Postman, PAW, or API Garage.

It's going to happen Steve! I tell you! Some day, all the good API providers will maintain their own API definitions. You should always have an easy to find copy available as part of your API documentation, but you should also search for them on Github, and submit pull requests to keep them up to date. It is like having little machine readable business cards for your API, sitting on the desks of developers, except this business card allows you to submit a pull request to update it.

P.S. While on the subject--PUBLISH AN RSS FEED FOR YOUR BLOG!!! ;-)

See The Full Blog Post

I Am Hearing A Lot More Talk About Restricting Free And Freemium Tiers Of API Access

I'm seeing a significant shift in the conversations around how SaaS, and API-first platforms are planning access to their APIs. I'm seeing a pretty significant back pedaling around free, and freemium access levels. I'm trying to keep notes on what I'm hearing, so that I can better understand what is causing this, and see if I can identify where the balance might exist in providing self-service access to our valuable API resources.

Let me explore some of the main reasons I'm hearing for reducing, restricting, or completely doing away with these lower levels of access:

  • Too Many API Freeloaders - There is a growing number of poorly behaved API consumers who are just looking for a free ride.
  • At Odds With Sales Teams - Free layers of access cannibalize our sales cycle, and make it harder for sales teams to close the deal.
  • These Layers Do Not Convert - The users coming in at these layers are just not converting, and becoming paid customers.
  • We Built It And Nobody Came - Nobody seems to care about the API, and nobody signed up for access, so we are shutting down.
  • No More Money To Support Free - We just don't have the money to pay for the infrastructure and support it takes to keep a free tier running.
  • VC's Told Us To Focus On Enterprise - Our investors told us to focus all our attention on selling to the enterprise, consumer focus is gone. 

When it comes to providing and consuming APIs, I've seen it all. I sympathize with many of these reasons for shrinking the free tiers of access to APIs. There will be many contributing factors to why things might be off in an API community. As my friend Ed Anuff (@edanuff) focused on in his latest post, a large number of us are doing API wrong, with many companies' approach to APIs being fundamentally at odds with their ecosystem.

Ultimately I think that API providers WILL need to tighten down their access levels, but this can't be done without properly thinking things through. You need to consider the bigger picture, around how you have planned API access, communicated and engaged with your consumers, and be honest with yourself about what you've done right, and what you've done wrong. While it might be a lot of work to manage this free level of access, and do what is right, you want to make sure you still maintain an environment where serendipity can happen.

Then again, maybe you weren't actually interested in this happening in the first place. You were just looking to get someone to build some things for free on your resources, looking to offload the hard work on an external community. I'm not saying everyone who has a self-service, publicly available API will find success, but the ones that work hard to strike a balance here are more likely to have invested in all the right areas--setting the stage for a healthier balance between API provider and consumer.

See The Full Blog Post

Sucked Into The World Of Bots And APIs

I am slowly getting sucked into the world of bots. I've been tagging stories related to Twitter bots for some time, but it was the growing buzz of Slack bots that has really grabbed my attention. It pushed me to light up a research area, so that I can begin to look at things closer, and work to understand the common building blocks, like I do for the other areas of the API space.

The world of bots intrigues me, from the perspective of how APIs can be used to execute bots, but also provide the valuable resources needed to deliver bot functionality. I feel like the categories of available Slack bots are somewhat telling of the business potential for bots.

While I find Twitter bots creative and interesting, there are many I also found annoying as hell. I've had similar responses to some of the bots I've encountered on Slack. I tend to be pretty boring when it comes to goofy shit online, so my threshold for bot silliness is low. I don't care what others do, I just don't go there that often, so you'll see me highlighting more of the business productivity bots, or the more creative ones. 

Another thing I think is significant, is the growing number of platforms in which bots are becoming common practice, with many of the bots operating on multiple platforms.

  • Telegram bots
  • Snapchat bots
  • Kik bots
  • Trello bots

Another interesting part of this, is that not all bots are executed via API. Some bots simply use the chosen protocol of popular messaging applications, and operate via an account setup on the platform. However, even with these implementations, APIs still come into play in providing the bot with its required skills.

I think bots are significant to API providers, and they should be working to better understand how their APIs can be used to drive bot behavior via popular messaging platforms, as well as how bots can operate on their own platform, either using APIs, or a common messaging format. There is a lot of chatter around bots online to sort through, and it will be something that takes me a while to produce some coherent research from, but you can keep an eye on it as I progress, via my Github research project for bots.

    See The Full Blog Post

    Further Simplifying My API Docs And Providing API Samples

For some workshop preparation this week, I needed to isolate just the best of the API calls and documentation from a handful of APIs I am trying to teach my intended audience about. I have almost twenty separate companies targeted, with a couple hundred individual endpoints across the APIs served up by these companies. I needed a way to easily define, organize, and present a subset of API samples, intended for a specific purpose and audience.

In this workshop, I need the simplest, most intuitive samples possible across these popular APIs. For Twitter I need to be able to just send a tweet, or list friends, and on Facebook I need to post to my wall, and search for users. I need simple actions, that will be meaningful to my higher education audience--most likely students. I don't want to bury all of them with the endless API possibilities that surround the APIs I'm showcasing, I need the twenty use cases that they'll give a shit about, and that will result in them being interested in what an API is.

As I do with all my research, I organize my lists of APIs into APIs.json collections, and publish the results as a Github repository. This allows me to quickly assemble relevant collections of APIs, designed for specific audiences, in a way that I can wrap with informative content and stories, helping them on-board with the importance of APIs, as well as the individual APIs. This got me thinking: since, as part of an APIs.json index, I have all the moving parts identified, I just need a way to simplify, and distill things down into little API samples, that I can offer up through the experience I am crafting.

To help me in my effort I started defining a new APIs.json API type schema, which allows me to define a sample subset of any API I have defined using OpenAPI Spec. Within each schema, I can map to a specific endpoint + method present in the OpenAPI Spec, select which parameters will be used, and which default values and enums will be applied. I also provide a title, and description, and now I have a machine readable schema for my API sample. Next, I just need to craft a simple API Samples JS library for rendering these samples into widgets and other embeddable goodies.
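To give a rough idea of what I am talking about, here is a minimal sketch of what one of these sample definitions might look like--every field name, and the URL, are just my working assumptions, not a finished schema:

```python
# A hypothetical API sample definition, pointing at a single operation in
# an OpenAPI Spec -- all field names here are working assumptions.
sample = {
    "title": "Send A Tweet",
    "description": "Post a simple status update to Twitter.",
    "openapi": "https://example.com/twitter/openapi.json",  # hypothetical URL
    "path": "/statuses/update.json",
    "method": "post",
    "parameters": {
        "status": {
            "default": "Hello from my first API call!",
            "enum": None  # free-form text, so no fixed list of values
        }
    }
}

def validate_sample(definition):
    """Return the list of required fields missing from a sample definition."""
    required = ["title", "description", "openapi", "path", "method", "parameters"]
    return [field for field in required if field not in definition]

# A complete definition validates cleanly.
assert validate_sample(sample) == []
```

The point of keeping it this small is that a JS client only needs the path, method, and a handful of parameter defaults to render a working sample--everything else stays behind in the full OpenAPI Spec.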

    Each sample will use the security definition defined within the OpenAPI Spec to get the authentication details it needs, render it as a simple widget, which allows users to quickly make a relevant API request, and see the results immediately. This really is no different than API explorers and interactive docs like Swagger UI, but rather than providing a complete UI or docs, it provides just a sample, to help educate the API consumer around what is an API, as well as what any single API does.
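As a rough sketch of how the rendering side might work--with a hypothetical OpenAPI fragment and helper function, since none of this tooling exists yet--the widget pulls the security definition out of the spec, and assembles the request details:

```python
# A rough sketch of how a sample widget might assemble a request, reading
# the securityDefinitions from an OpenAPI Spec -- the spec fragment and
# helper below are hypothetical, just to illustrate the flow.
openapi_fragment = {
    "host": "api.example.com",
    "basePath": "/v1",
    "securityDefinitions": {
        "api_key": {"type": "apiKey", "name": "X-Api-Key", "in": "header"}
    }
}

def build_request(spec, path, method, params, credentials):
    """Turn a sample definition plus user credentials into request details."""
    url = "https://" + spec["host"] + spec["basePath"] + path
    headers = {}
    # Apply any apiKey-in-header security definitions found in the spec.
    for definition in spec["securityDefinitions"].values():
        if definition["type"] == "apiKey" and definition["in"] == "header":
            headers[definition["name"]] = credentials["api_key"]
    return {"method": method.upper(), "url": url,
            "headers": headers, "params": params}

request = build_request(openapi_fragment, "/statuses/update.json", "post",
                        {"status": "Hello!"}, {"api_key": "my-secret-key"})
assert request["url"] == "https://api.example.com/v1/statuses/update.json"
assert request["headers"] == {"X-Api-Key": "my-secret-key"}
```

A real widget would obviously hand these details off to an HTTP client and show the response, but the shape of the flow is the same.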

While I think samples can live alongside the regular API docs, and explorers, within the portal of each API, I'm thinking they will become exponentially more valuable when used in a hacker storytelling, API broker, and evangelism setting. I have the base JSON schema defined for my API Samples, and once I get a working prototype for the JS client, I will publish a simple page, demonstrating a few use cases. At first I'll just be using them to help people understand the value of APIs in general, as well as specific APIs, but I think in the future I will be able to get more sophisticated in how I tell stories around stacks of APIs, and how APIs can be assembled to accomplish bigger things than any single API can accomplish on its own.

    Hopefully in the near future, I will have a wealth of API Samples for the most important APIs I profile as part of my API Stack work, providing me with some valuable education and storytelling tooling to assist me in my evangelism work. 

    See The Full Blog Post

    Personal APIs Are Not Just A Local Destination, They Are A Journey

I am preparing a project for the conversations, and a workshop I have on my schedule this week at Davidson College, called: Indie EdTech & The Personal API. I'll be going on campus, talking to campus leadership, administrators, teachers, and students about APIs. To put myself into the right frame of mind, I wanted to explore what the concept of a personal API really means.

    The Concept Of Personal APIs Is Ridiculous
First, I'll set the stage with what is a common reaction, when I mention the term "personal API" to other API folk in the space: "It's a nice idea, but it just isn't something the average person will ever need, let alone care about what an API is--it is a non problem." To me, that response sounds just like what you'd hear in the early 2000s when asked if any single individual would ever need a web presence--something that blogging and the social media era have continued to evolve, while also proving the naysayers wrong.

At first glance, a personal API seems like it would be a stack of APIs that you would set up, and manage yourself. Again, something the average developer or IT person would dismiss as out of the realm of reason for the average person--there just isn't a need. I would probably agree here. While having a set of APIs that could help you manage your life bits sounds like a great idea, I just don't think the average Internet user is going to care about their digital life bits at this level--the average person will have to have an actual problem to solve, before they will ever need and care about an API (the secret is that this is true for business too).

    Everybody Already Has Personal APIs
I prefer looking at this topic as something that has already been answered--everyone already has personal APIs! I manage all my public messages and social networks using the Twitter API, my photos using the Flickr and Instagram APIs, my documents with the Google Drive and Dropbox APIs, and my email with the Gmail API. I already have a stack of personal APIs I depend on each day to drive the web and mobile applications I use for my personal and professional existence--they just are spread all over the Internet, and not owned and operated by yours truly.

OK, sure. These APIs aren't technically personal APIs, but they do provide API access to my personal information. For me, personal APIs are going to be a very personal API focused journey--as well as a local destination. Even in the business API world, where having an API is not a ridiculous notion, what an API actually is varies widely. Rarely is there a coherent stack of APIs within a company, and the reality of providing and consuming APIs is actually a spaghetti mess of services, open source tooling, and custom code. The API journey is always about pulling together this vision, organizing it, discovering new APIs, evolving and deprecating old ones, while also having a plan to actually conduct business in this volatile online environment--a journey that will be no different for the individual.

    In My World There Are Two Lists Of My Personal APIs 
I am an API professional, making my living with APIs, and I have two lists of what I would consider my "personal APIs". First, there is my master list of APIs I have hand developed, then there are the public APIs I depend on, that are developed by external people and organizations. I'm guessing the first list comes closest to what someone would consider to be a stack of personal APIs. While I do prioritize the development of APIs in my own stack, the reality of a modern business owner is that you depend on a whole suite of API driven services, that accomplish specific business objectives, something that is often done via integration using APIs. I also have to note, that the overlap between these two spheres is huge. (Bernie Sanders huge!)

In the end, my point is that the lines between personal and other APIs are very blurred, and will always be a mix. I have a bunch of personal and professional life bits I create, move around, and share as I need throughout any given day or week. My personal storage API is always Amazon S3, Dropbox, or Google Drive, no matter how you look at it. I make decisions about where I publish, store, and manage files, documents, and other heavy objects, based upon where I need them, the cost of storage, and any number of other factors. The leading storage API providers are my personal APIs any way you look at it, a theme that comes up again and again, blurring the concept of what a personal API is for me.

    Breaking Down The Core Personal API Stack
To help think through this, I wanted to take a moment and break down what the core personal API stack might be. With my experience it is pretty easy to identify, as one area of my API research, the backend as a service (BaaS) space, works hard to deliver many of the common objects that are used across the mobile apps that developers are building today. These are just a handful of the common, end-user facing API resources that are available as part of the standard BaaS offering:

    • Profiles - The account and profile data for users.
    • People - The individual friends and acquaintances.
    • Companies - Organizational contacts, and relationships.
    • Photos - Images, photos, and other media objects.
    • Videos - Local, and online video objects.
    • Music - Purchased, and subscription music.
    • Documents - PDFs, Word, and other documents.
    • Status - Quick, short, updates on current situation or thoughts.
    • Posts - Wall, blog, forum, and other types of posts.
    • Messages - Email, SMS, chat, and other messages.
    • Payments - Credit card, banking, and other payments.
    • Events - Calendar, and other types of events.
    • Location - Places we are, have been, and want to go.
    • Links - Bookmarks and links of where we've been and going.

Obviously there are many other objects that represent our digital existence, but to help keep things focused, I will only highlight these very common life bits we generate, store, move around, and share online each day. Depending on what we do for a living, the bits and bytes we will be managing using APIs will vary widely, but for the most part all users will have something to put into one or more of these areas of one possible personal API stack.

    Understanding Where Our Personal APIs Operate and Store Information
    With a core stack defined, let's think about where this personal API stack will live. There is no way all of these resources could possibly live in a single location, and we will always need the help of a variety of companies, organizations, government agencies, and other individuals, to help realize even this core set of personal APIs. Let's spend a moment exploring where each of these life bits already exist, and considering what our motivations around storing, syncing, syndicating, and backing up might be.


Profiles
  • Facebook - Your account and profile on Facebook.
  • Twitter - Your account and profile on Twitter.
  • Instagram - Your account and profile on Instagram.
  • Tinder - Your account and profile on Tinder.
  • Yik Yak - Your account and profile on Yik Yak.
  • Snapchat - Your account and profile on Snapchat.

People
  • Facebook - Maintaining connections and friends on Facebook.
  • Twitter - Maintaining connections and friends on Twitter.
  • Instagram - Maintaining connections and friends on Instagram.
  • Tinder - Maintaining connections and friends on Tinder.
  • Yik Yak - Maintaining connections and friends on Yik Yak.
  • Snapchat - Maintaining connections and friends on Snapchat.

Companies
  • Facebook - Managing your own, and engaging with other business profiles & pages.
  • Twitter - Managing your own, and engaging with other business accounts.
  • LinkedIn - Managing your own, and engaging with other business profiles & pages.

Photos
  • Facebook - Managing the photos that are primarily published, or syndicated to Facebook.
  • Instagram - Managing the photos that are primarily published, or syndicated to Instagram.

Videos
  • Youtube - Managing your own videos, and the videos curated from other users on Youtube.
  • Facebook - Managing your own videos, and the videos curated from other users on Facebook.
  • Instagram - Managing your own videos, and the videos curated from other users on Instagram.

Music
  • Spotify - Accessing your music, and experiencing music discovery via the Spotify API.

Documents
  • Dropbox - Using the Dropbox platform for storage, management, and sharing of documents.
  • Google Drive - Using the Google Drive platform for storage, management, and sharing of documents.
  • Google Sheets - Using the Google Sheets platform for storage, management, and sharing of documents.

Status
  • Facebook - What is the current status, as well as the historical archive, on Facebook.
  • Twitter - What is the current status, as well as the historical archive, on Twitter.
  • Instagram - What is the current status, as well as the historical archive, on Instagram.

Posts
  • Facebook - Managing longer form content published on Facebook.
  • Twitter - Managing longer form content published on Twitter.
  • WordPress - Managing longer form content published on WordPress.
  • Blogger - Managing longer form content published on Blogger.

Messages
  • Facebook - All of the public and private messaging occurring via Facebook.
  • Slack - All of the public and private messaging occurring via Slack.
  • SnapChat - All of the public and private messaging occurring via SnapChat.
  • Yik Yak - All of the public and private messaging occurring via Yik Yak.

Payments
  • Paypal - All payments made and received via Paypal.
  • Facebook - All payments made and received via Facebook.

Events
  • Facebook - Adding, updating, and deleting of Facebook events.
  • Google Calendar - Adding, updating, and deleting of Google Calendar events.

Location
  • Facebook - What is the current and historical location, as well as the places of others.
  • Twitter - What is the current and historical location, as well as the places of others.
  • Instagram - What is the current and historical location, as well as the places of others.

Links
  • Pinboard - All bookmarked URLs added, and stored via Pinboard.
  • Google URL Shortener - All URLs that were shortened via the Google URL shortener.

The APIs of these platforms are your APIs, and much of this information is never going anywhere, unless you are simply looking to back up, or sync to other locations (which you should be doing). I try to look at this as a positive: by default, you have a rich stack of APIs available, to help you manage your digital information. Whether you are a company, or an individual, you will always have to make some trade-offs about how you manage your information, where things are stored, and how important the resources are--ultimately you are seeking to strike some sort of balance.

    OK, So What? Nobody Cares About Their Information
Even if we look at APIs in this way, that our personal APIs will always be a mix of APIs across the services we use, the average individual doesn't know or care about APIs--this is a non-problem. #truth The average individual most likely will never care about the bits and bytes they generate online each day. The photos, videos, messages, and other exhaust from our daily lives can just be lost (and monetized and owned by someone else), along with many of our memories in the physical world--not everything needs to be saved.

Within this reality, there will always be some of the bits and bytes that we choose to look at differently. There will be situations where some videos, and some photos are more valuable than others. Maybe we are a professional speaker, and our photos and videos are used as part of our professional services. Maybe we are a writer, and we need to be paid for our words, either through advertisements, pay to download, paywalls, or subscriptions. There are many reasons why we will want to keep better track of our digital bits and bytes, and there will always be a need to educate new individuals around the opportunities to take control of our digital selves, even with the majority of people never caring about their information.

    Personal APIs Will Always Require Regular Doses Of API Literacy
Nobody will ever care about APIs, and put the proper thought into their existence, if they do not know about APIs. APIs are not a destination, they are a journey, and whether it is an individual, business, organization, or other entity, there has to be proper education about what an API is, and what APIs are already in use. Throughout this journey, APIs will continue to evolve in their meaning, and as this understanding becomes more nuanced, so will the personal API stack.

Personal APIs Will Evolve And Be Strengthened By Living POSSE First
    API literacy will be exercised and strengthened through operating and managing your online domain, living a Publish (on your) Own Site, Syndicate Elsewhere (POSSE) existence. POSSE is less about your bits and bytes living within your domain, than it is about thinking critically about what your bits and bytes are, and where they do, and should live. Regular POSSE rituals help you understand the API possibilities, and experience the limitations of APIs, but also hopefully the potential of APIs, always guiding you down a path, toward your fuller awareness of what your personal API can be.
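If you wanted to sketch the POSSE ritual in code, it really is just publish on your own site first, then push copies out, each pointing back home--the publish and syndicate functions here are hypothetical in-memory stand-ins, not any real platform's API:

```python
# A minimal POSSE sketch: publish on your own site first, then syndicate
# copies elsewhere, each pointing back at the canonical copy. The targets
# here are hypothetical in-memory stand-ins, not real platform APIs.
my_site = []
syndicated = {"twitter": [], "facebook": []}

def publish(post):
    """Publish on your own site first -- this is the canonical copy."""
    my_site.append(post)
    return post

def syndicate(post, targets):
    """Push copies elsewhere, each linking back to the canonical URL."""
    for target in targets:
        syndicated[target].append({"text": post["text"],
                                   "canonical": post["url"]})

post = publish({"url": "https://example.com/posts/1", "text": "Hello POSSE"})
syndicate(post, ["twitter", "facebook"])

# Your domain holds the original; the copies elsewhere point back home.
assert my_site[0]["text"] == "Hello POSSE"
assert syndicated["twitter"][0]["canonical"] == "https://example.com/posts/1"
```

The ordering is the whole point--your domain always gets the post first, and everything elsewhere is just a syndicated copy.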

Personal APIs Will Be Defined By The Services We Use (or Don't Use!)
Our personal APIs will always be defined by the services we use, or don't use. Having the control that we desire over our life bits amplifies the open (or the closed) nature of the services and platforms we are using. Once we are API literate, and have embarked on our personal API journey, platforms that do not give us the control we are used to make less sense to use. This experience will raise our expectations of the services we use, shaping our regular POSSE rituals, feeding our thoughts about what is a personal API, and driving our decisions around the online services we use, or do not use in the future.

    Personal APIs Are Validated By The Solutions They Deliver For Us
Ultimately our personal APIs will always be defined by the solutions that they deliver. If we cannot use an API as part of our regular rituals, sync, share, and publish as we desire, an API will either never exist in the first place, or wither on the vine, and end up orphaned, and in disrepair. This reality will continue to define what is a personal API, and it will be a driving force for anyone to learn about what is an API, how to live POSSE, and making decisions around which services we use. Personal APIs will continually be validated, or invalidated, by the solutions they enable, or do not enable.

    The Need For Personal APIs Will Grow As We Take Control
As API literacy matures, and we take more control of our world through our POSSE rituals, which are strengthened by the services and tools we use, the concept of personal APIs will grow, evolve, and take deeper root. Something that brings me back to all of the root concerns I hear from folks, about people not caring, and there being no problem or need for personal APIs. This is all true of the average employee who subscribes to current IT norms. This is all true of the average online digital citizen, who subscribes to current Silicon Valley, tech industry norms. I see the naysayers of the concept of the personal API as the gatekeepers of traditional power structures, with the handful of us who strike out on our personal API journeys simply demanding to live a somewhat safer, saner, and healthier life in the cracks of this digital circus.

    See The Full Blog Post