27 Mar 2016
History is everything. Understanding where we have come from is critical to knowing where we are going. While pushing forward with the latest technology, it is always healthy to pause and take a look at the past. Someone tweeted a link to my history page, and I realized it has been three years since I refreshed my view of the overall history of the space, so I wanted to take some time and add a few other milestones that I feel were significant along the way.
When I talk about APIs, I'm focused on the version that was born out of the enterprise during the Service Oriented Architecture (SOA) movement. Sometime around 2000, a portion of the SOA experiment left the enterprise and found a more fertile environment in the world of startups. In 2016, this version of the API has re-captured the attention of the enterprise, as companies see APIs being used in popular, public API driven services, and in the startups they are acquiring and gobbling up.
Where we stand in 2016, there are some obvious technical reasons why web APIs are finding success in companies of all shapes and sizes, and even within government; but not all the reasons for this success are technical. There are many other, less obvious aspects of web APIs that have contributed to their success, things we can only learn by closely studying the past and looking at why some of the pioneers of web APIs were successful, and have continued to be successful over the years.
In 2016, it is critical that we emulate the best practices that have been established over the last 16 years, following the lead of early API providers like Amazon, Salesforce, eBay, and Twitter--much of which is still being emulated by new API practitioners in 2016. Whether you are a startup, SMB, enterprise, institution, or government agency, you don't have to follow every example set in this 16 year history, but you should be aware of this history, and understand your place in the sector.
As I look back each year, I see some clear patterns emerge that have defined the industry--patterns that need to be emulated, and some that should be avoided, as we plan our own API strategy and presence.
As the first .COM bubble was bursting, platforms were looking for innovative ways to syndicate products across e-commerce web sites, and web APIs, built on the backs of existing HTTP infrastructure proved to be the right tool for the job.
With this in mind, a handful of tech pioneers stepped up to define the earliest uses of APIs as part of sales and commerce management, kicking-off a ten year evolution that I consider as the early history of web APIs, defining the sector we all enjoy today.
However, even with the early success of APIs, the sector would have struggled to reach maturity without several other critical ingredients, like social, payments, and messaging, that would prove just as important as commerce.
February 7th, 2000 Salesforce.com officially launched at the IDG Demo 2000 conference.
Salesforce.com launched its enterprise-class, web-based, sales force automation as an "Internet as a service". XML APIs were part of Salesforce.com from day one. Salesforce.com identified that customers needed to share data across their different business applications, and APIs were the way to do this.
Marc R. Benioff, chairman and founder of salesforce.com stated, "Salesforce.com is the first solution that truly leverages the Internet to offer the functionality of enterprise-class software at a mere fraction of the cost."
Salesforce.com was the first cloud provider to take an enterprise class web application and API and deliver what we know today as Software-as-a-Service.
Even as the first mover in the world of web APIs, Salesforce is still a powerhouse in 2016. Salesforce continues to lead when it comes to real-time APIs, testing, and deployment, and has most recently taken a lead in mobile application development and backend as a service (BaaS).
On November 20, 2000, eBay launched the eBay Application Program Interface (API), along with the eBay Developers Program.
The eBay API was originally rolled out to only a select number of licensed eBay partners and developers.
As eBay stated:
"Our new API has tremendous potential to revolutionize the way people do business on eBay and increase the amount of business transacted on the site, by openly providing the tools that developers need to create applications based on eBay technology, we believe eBay will eventually be tightly woven into many existing sites as well as future e-commerce ventures."
The launch of the eBay API was a response to the growing number of applications that were already relying on its site either legitimately or illegitimately.
The API aimed to standardize how applications integrated with eBay, and make it easier for partners and developers to build a business around the eBay ecosystem.
eBay is considered the leading pioneer in the current era of web-based APIs and web services, and still leads with one of the most successful developer ecosystems today.
On July 16, 2002, Amazon launched Amazon.com Web Services allowing developers to incorporate Amazon.com content and features into their own web sites.
Amazon.com Web Services (AWS) allowed third party sites to search and display products from Amazon.com. Product data was made accessible using XML and SOAP.
From day one the API was integrated with the Amazon.com Affiliate Program, allowing developers to monetize their sites through purchases made at Amazon.com via links from their web sites.
Internet visionary Tim O'Reilly was quoted in the original Amazon Web Services press release saying, "This is a significant leap forward in the next-generation programmable internet."
APIs and Amazon both have roots in e-commerce, but APIs were quickly applied to other areas, giving rise to social media, cloud computing, and almost every single component necessary to build the web and mobile Internet that we all use every day.
As API driven commerce platforms were still finding their footing, working to understand the best way to put APIs to work, a new breed of technology platforms emerged when it came to using content, media, and messaging on the web, in a way that was very user centric and socially empowering for individuals and businesses.
Publishing user generated content, and sharing web links, photos, and other media via APIs emerged with the birth of new social platforms between 2003 and 2006. This was an entirely new era for APIs, one that wasn't about money--it was about connections.
These new API driven social platforms would take technology to new global heights, and ensure that applications from here forward would always contain essential social features, defined via their platform APIs.
Social was an essential ingredient the API industry was missing.
del.icio.us is a social bookmarking service for storing, sharing, and discovering bookmarks to web pages, founded by Joshua Schachter in 2003.
Del.icio.us implemented a very simple tagging system which allowed users to easily tag their web bookmarks in a meaningful way, while also establishing a kind of folksonomy across all users of the platform--which proved to be a pretty powerful way of cataloging and sharing web links.
The innovative tagging methodology used by del.icio.us allowed you to pull a list of your tags, or public web bookmarks, by using the URL http://del.icio.us/tag/[tag name]. So if I was searching for bookmarks on airplanes, I could request http://del.icio.us/tag/airplane and I would GET a list of all bookmarks that had been tagged airplane. It was simple.
When it came to the programmatic del.icio.us interface, the API was built into the site, creating a seamless experience--if you wanted the airplane tags via HTML you entered http://del.icio.us/tag/airplane, if you wanted RSS of the tags you entered http://del.icio.us/rss/tag/airplane, and if you wanted XML returned you used http://del.icio.us/api/tag/airplane. This has changed with the modern version of the Delicious API.
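The pattern is easy to sketch in code. Here is a minimal helper that builds the three flavors of tag URL described above--purely illustrative, since the original del.icio.us endpoints no longer resolve:

```python
def delicious_url(tag, fmt="html"):
    """Build a del.icio.us tag URL in the style described above.

    fmt may be "html" (the site itself), "rss" (the feed),
    or "xml" (the API). Illustrative only; the historical
    service no longer answers at these paths.
    """
    prefixes = {"html": "", "rss": "/rss", "xml": "/api"}
    if fmt not in prefixes:
        raise ValueError("fmt must be html, rss, or xml")
    return f"http://del.icio.us{prefixes[fmt]}/tag/{tag}"

print(delicious_url("airplane"))         # http://del.icio.us/tag/airplane
print(delicious_url("airplane", "rss"))  # http://del.icio.us/rss/tag/airplane
print(delicious_url("airplane", "xml"))  # http://del.icio.us/api/tag/airplane
```

The same human readable path served a person in a browser and a program parsing RSS or XML, which is exactly what made the design so approachable.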
del.icio.us was the first concrete example of how the web could deliver HTML content alongside machine readable formats like RSS and XML, using a URL structure that was simple and human readable. This approach to sharing bookmarks would set the stage for future APIs, making APIs easy to understand for developers and non-developers alike. Any slightly technical user could easily parse the XML or RSS, and develop or reverse engineer widgets and apps around del.icio.us content.
del.icio.us has been sold twice since its early popularity, first to Yahoo! in 2005 and then to AVOS Systems in April 2011. However, del.icio.us was one of the pillar platforms that ushered in the social era of the API movement, establishing sharing via APIs as critical to the API economy, but also showing that simplicity rules when it comes to API design.
In February 2004 the popular photo sharing site Flickr launched. Six months later they launched their now famous API, and six months after that, they were acquired by Yahoo.
Flickr was originally created as an online game, but quickly evolved into a social photo sharing sensation.
The launch of the RESTful API helped Flickr quickly become the image platform of choice for the early blogging and social media movement by allowing users to easily embed their Flickr photos into their blogs and social network streams.
The Flickr API is the driving inspiration behind the concept of BizDev 2.0, a term coined by Flickr co-founder Caterina Fake. Flickr couldn't keep up with the demand for its services, and established the API as a self-service way to deal with business development.
The core concepts established by Flickr with its API would transcend the company and its acquisition by Yahoo. Business development using APIs is now embedded in the philosophy of the business of APIs, pushing APIs beyond the purely technical.
APIs became something that any company could use to actually conduct business with its partners and the public, but we still had a ways to go before APIs would grow up.
On August 15th, 2006, Facebook launched its long-awaited development platform and API. Version 1.0 of the Facebook Development Platform allowed developers access to Facebook friends, photos, events, and profile information.
The API used REST, and responses were available in an XML format, following common approaches by other social API providers of the time.
Almost immediately, developers began to build social applications, games, and mashups with the new development tools.
The Facebook Development Platform gave Facebook an edge over then-popular competitor MySpace, and helped Facebook establish itself as the top social gaming platform with games like Farmville.
While the Facebook API and platform is considered by many developers to be unstable, it continues to play a significant role in the evolution of the entire platform with applications and partnerships that drive new features and experiences on Facebook.
On September 20, 2006 Twitter introduced the Twitter API to the world.
Much like the release of the eBay API, Twitter's API release was in response to the growing usage of Twitter by those scraping the site or creating rogue APIs.
Twitter exposed the Twitter API via a REST interface using JSON and XML.
In the beginning, Twitter used Basic Auth for API authentication, resulting in the now infamous Twitter OAuth Apocalypse almost four years later, when Twitter forced everyone using the API to switch to OAuth.
In four short years Twitter's API had become the center of countless desktop clients, mobile applications, web apps, and businesses--even Twitter itself used it, in its iPhone, iPad, and Android apps, as well as its public website for much of its existence (no longer true).
Twitter is one of the most important API platforms available, showing what is possible when a dead simple platform does one thing well, then opens up access via an API and lets an open API ecosystem build the rest.
Twitter is also one of the most cautionary tales of how your API ecosystem can begin to work against you, unless you properly address the political considerations of an API ecosystem as it grows.
Business and Marketing
As APIs evolved from commerce through social, it was clear that the industry was going to need some standardization in the form of common business practices. The industry needed to standardize how APIs were deployed, as well as market the potential of APIs and these common business practices to the world.
The establishment of common business and marketing practices for the API space took a lot of grassroots outreach, as well as storytelling on behalf of APIs, the companies behind them, and the industries they rose out of. Two separate API pioneers stepped up to help define the API industry we know today, between 2005 and 2012.
While writing about the history of APIs, it is easy to be so focused on just APIs, that you overlook the single most important player in the entire history of the web API--ProgrammableWeb.
In July 2005, John Musser started ProgrammableWeb. According to his original about page:
ProgrammableWeb is a web-as-platform reference site and blog delivering news, information and resources for developing applications using the Web 2.0 APIs.
I started this site because I couldn't find what I was looking for: a technology focused starting point for web platform development. (For a bit more see my initial post.) Although no guarantees, the last time I started a reference site it somehow became Google's highest rated link on the topic. Given that this site will be a collaborative effort with community input as well, this can be what we make it.
I hope you find the site useful.
John Musser - Seattle, August 2005
John's original blog post on why he started ProgrammableWeb says it all: Why? Because going From Web Page to Web Platform is a big deal.
Web APIs are a big deal! Whether it's social networking, government, healthcare, or education--having a programmable platform to make data and resources available will be a critical part of how commerce and society operate from here on forward.
John made an early decision to showcase open and RESTful approaches to deploying APIs over the parallel attempts at Service Oriented Architecture (SOA) and Web Services, and focused on telling stories about open APIs--way before it was the thing to do in Silicon Valley.
When I started API Evangelist in July 2010 (five years after ProgrammableWeb), and started talking about the business of APIs, the technology of web APIs was already widely accepted in Silicon Valley, because of the stories that had been told on ProgrammableWeb.
By 2013, a year in which I think we could confidently say APIs moved mainstream, we owed much of that success to ProgrammableWeb. The stories John, Adam, and other writers have been telling on ProgrammableWeb have been crucial to quantifying and defining the API industry--allowing us all to build, iterate, and move things forward.
Without stories around the technical, business and politics of APIs, these virtual interfaces would not have been able to find a place in our real life worlds.
In November 2006, the first API service provider, Mashery, came out of "stealth mode" to offer documentation support, community management, and access control for companies wishing to offer public or private APIs--announced in a TechCrunch blog post titled API Management Service is Open for Business.
At this point in time, in 2006, we were moving from the social period of APIs into the cloud computing phase with the introduction of Amazon Web Services. It was clear that the world of web APIs was getting real, and there was an opportunity for companies to offer API management as a service.
While there were tools for deploying APIs, there was no standard approach to managing your API deployment. Mashery was the first to bring a standard set of services to API providers, which helped set the stage for the future growth of the API industry.
It would take almost six more years for the API industry to come of age, a coming of age Mashery contributed to significantly. The space we all know today was defined by early API commerce pioneers like Salesforce and Amazon, social pioneers like Flickr and Delicious, and by Mashery, who helped define what is now known as the business of APIs.
In 2013, Mashery was acquired by Intel, and again by TIBCO in 2015, further validating that the API industry truly is coming of age.
Web developers quickly saw the potential of embeddable maps, and found ways to hack these mapping sources to innovate and build the web properties users desired, focused on solving the local problems we all face daily.
This early use of APIs in providing mapping tools and services for developers laid the groundwork for much of the early mobile developer talent that would drive the coming mobile API period.
Google Maps API started a trend of API mashups with its valuable location based data, with over 2000 mashups to date.
The API demonstrates the incredible value of geographic data and mapping APIs, as well as the power users can have in influencing the direction an application or API takes. Lars Rasmussen, the original developer of Google Maps, commented on how much he learned from the developer community by watching how they hacked the application in real-time, and the team took what they learned and applied it to the API we know today.
Few other companies have the resources to tackle a problem like mapping the world's resources and delivering a reusable, API driven resource, like Google did. Google has played many roles in moving the API space forward, but Google Maps played a pivotal role in the history of APIs.
As APIs were generating social buzz across the Internet, Amazon saw the potential of a RESTful approach to business, internalized it, and saw APIs in a way that nobody had seen them before--giving birth to an approach to using APIs that was about much more than just e-commerce; it would re-invent the way we compute.
Amazon transformed the way we think about building applications on the web, delivering one of the essential ingredients we needed for APIs to work, by putting APIs to work. What we now know as cloud computing changed everything, and made the mobile, tablet, sensor, and other API driven realms possible.
In March 2006, Amazon launched a new web service, something completely different from the Amazon bookseller and e-commerce site we've come to know. This was a new endeavor for Amazon: a storage web service called Amazon S3.
Amazon S3 provides a simple interface that can be used to store and retrieve any amount of data, at any time, from anywhere on the web. It gives developers access to the same highly scalable, reliable, fast, inexpensive data storage infrastructure that Amazon uses to run its own global network of websites.
Amazon S3, or Simple Storage Service, was initially just an API. There was no web interface or mobile app. It was just a RESTful API allowing PUT and GET requests with objects or files.
Developers using the Amazon S3 API were charged $0.15 a gigabyte per month for storing files in the cloud.
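To make that simplicity concrete, here is a rough sketch of what those early REST calls looked like on the wire. The bucket and key names are made up, and the required authentication headers are omitted for brevity:

```python
def s3_request(method: str, bucket: str, key: str) -> str:
    """Render the request line and Host header for an early-style S3
    object call. PUT stored an object; GET retrieved it. Auth headers
    are omitted; bucket and key are illustrative placeholders."""
    if method not in ("PUT", "GET"):
        raise ValueError("the core S3 object operations were PUT and GET")
    return f"{method} /{key} HTTP/1.1\r\nHost: {bucket}.s3.amazonaws.com\r\n\r\n"

print(s3_request("GET", "my-photos", "2006/cat.jpg").splitlines()[0])
# GET /2006/cat.jpg HTTP/1.1
```

A plain HTTP verb against a URL was all it took to store or retrieve a file in the cloud, which is a big part of why developers adopted S3 so quickly.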
With this new type of API and billing model, Amazon ushered in a new type of computing we now know as cloud computing.
This also meant that APIs were no longer just for data or simple functionality. Now they could be used to deliver computing infrastructure.
In August 2006, shortly after Amazon launched its new cloud storage service Amazon S3, the company followed with a new cloud computing service dubbed Amazon EC2 or Elastic Compute Cloud.
Amazon EC2 provides re-sizable compute capacity in the cloud, by allowing developers to launch different sizes of virtual servers within Amazon data centers.
Just like its predecessor Amazon S3, Amazon EC2 was just a RESTful API. Amazon wouldn't launch a web interface for another three years.
Using the Amazon EC2 API developers can launch small, large and extra large servers and pay for every hour that the server is running.
Amazon EC2, combined with Amazon S3 has provided the platform for the next generation of computing with APIs at the core.
Cloud computing was an essential ingredient the API industry was missing, and it would grow to fuel almost every aspect of growth over the next 10 years. The most significant part of this story is that Amazon's cloud APIs were not just making the company's digital resources available to other businesses, they were also driving much of the growth across every other corner of the API sector.
A Mobile World
With the introduction of iPhone and Android smartphones, APIs evolved from powering e-commerce, social, and the cloud, to delivering valuable resources to the mobile phones in our pockets, which were quickly becoming commonplace around the globe.
APIs make valuable resources modular, portable, and distributed, making them a perfect channel for developing mobile and tablet applications of all shapes and sizes.
A small group of API driven technology platforms have helped define the space and won over the hearts and minds of both developers and the end users of the applications they develop.
In March 2009 Foursquare launched at the SXSW interactive festival in Austin, TX.
Foursquare is a location-based mobile platform that makes cities more interesting to explore. By checking in via a smartphone app or SMS, users share their location with friends while collecting points and virtual badges.
In November 2009, after a round of angel funding from several investors including Union Square Ventures and O'Reilly AlphaTech Ventures, Foursquare launched their API.
At the time of the API launch, Foursquare had an impressive set of applications developed by a closed group of partners, including an Android application and augmented reality with Layar.
Even with growing competition from early mover Gowalla and major players like Facebook and Google, Foursquare emerged as the dominant mobile location-sharing and check-in platform.
On October 6, 2010 Instagram launched its photo-sharing iPhone application.
Less than three months later, it had one million users.
Kevin Systrom, the founder of Instagram, focused on delivering a powerful but simple iPhone app that solved common problems with the quality of mobile photos and users' frustrations with sharing.
Almost immediately, many users complained about the lack of a central Instagram web site or an API, but Instagram remained firm, focusing its energy on the core iPhone application.
In December, a developer named Mislav Marohnić took it upon himself to reverse engineer how the iPhone app worked, and built his own unofficial Instagram API.
By January Instagram shut down the rogue API and announced it was building one of its own.
Then in February of 2011, Instagram released the official API for the photo platform.
Within days, many photo applications, photo-sharing sites, and mashups built around the API started showing up.
Instagram became a viral iPhone app sensation, but quickly needed an API to realize its full potential, asserting the platform's place in history as one of the defining players in the mobile period of APIs.
In 2007, a new API-as-a-product platform launched, called Twilio, introducing a voice API that allowed developers to make and receive phone calls from any cloud application. In the years since, Twilio has also released text messaging and SMS short code APIs, making itself a critical telephony resource in many developers' toolboxes.
Twilio is held up as a model platform to follow when evangelizing to developers. Twilio has helped define which technical and business building blocks are essential for a healthy API driven platform, set the bar for on the ground evangelism at events and hackathons, and worked hard to showcase, support, and invest in its developer ecosystem.
Alongside Foursquare and Instagram, Twilio has come to define mobile application development, helping push APIs into the mainstream. While Twitter has sometimes been held up as a cautionary tale when it comes to APIs, Twilio has demonstrated that, when done right, API driven ecosystems do work.
By 2011, the bar for delivering APIs via HTTP had been well established by early pioneers like Salesforce and Amazon, but Twilio showed how mature the business of APIs had become with its evolution into the mobile period. Still, mobile development via APIs owes its roots to the foundation laid by the commerce, social, and cloud API pioneers.
JSON use evolved out of a need for stateful server-to-browser communication, without using browser plugins such as Flash or Java applets, which had been the dominant methods in the early 2000s. The JSON organizational website was officially launched in 2002, but it wasn't until Yahoo! began offering some of its Web services in JSON in 2005 and then Google used it for its GData protocol in 2006, that we started to see widespread adoption of the format by API providers, and consumers.
The switch from XML to JSON marked the maturing of the web API space, going from hobby to an actual business solution that can be used to describe essential business resources--resulting in near complete adoption of JSON by 2016.
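To see why JSON won developers over, compare the same resource in the XML shape early commerce APIs returned with the JSON shape that replaced it. The product fields here are made up for illustration:

```python
import json
import xml.etree.ElementTree as ET

# The same (hypothetical) product resource, first in XML...
xml_doc = "<product><id>42</id><name>Widget</name><price>9.99</price></product>"
root = ET.fromstring(xml_doc)

# Flatten the XML elements into a plain dictionary...
product = {child.tag: child.text for child in root}

# ...which serializes directly into the JSON most APIs return today.
json_doc = json.dumps(product)
print(json_doc)  # {"id": "42", "name": "Widget", "price": "9.99"}
```

JSON maps straight onto the native data structures of JavaScript and most scripting languages, with none of the parsing ceremony XML requires, which is a big part of why adoption happened so fast.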
The Ongoing Evolution of Online Commerce
Over the first decade of the 21st century, online commerce APIs were still evolving, with essential elements like products, sales, auctions, shopping carts, and payments playing a central role. Many API providers would come and go, but only a handful delivered an approach precise enough to elevate their offering, making an impact on how we approach commerce APIs, as well as almost any other digital resource.
By September 2011, startups and investors had read the writing on the wall, and the proven "API as a product" model began being applied to disrupt the payment industry, with the launch of Stripe. Like Twilio, Stripe was built for developers, and did everything right, from API design to documentation, support, and pricing that worked for the web and mobile application developers integrating payments into their business and consumer solutions.
Right along with compute, storage, location, and messaging, payments are an essential resource for any commercial web or mobile application, and a simply priced, easy to get up and running payment API proved an instant hit with developers. I considered adding Authorize.net and PayPal to the history of APIs, but in my opinion it took 10 years for digital commerce to evolve via APIs, with providers like Amazon and eBay, and the API-as-a-product business model established by Twilio, before a standalone payment provider like Stripe could exist and make the impact that it has.
Payments are a mission critical resource for developers, and will continue to be in the future. Stripe continues to set the bar for how you do payment APIs, as well as how you do APIs in general, and is held up by the entire API industry as the model to follow. Stripe continues to do one thing (payments), and do it well, setting the tone for what APIs can do to disrupt a well established industry like online payments.
Hardening Security Practices
As more companies looked to open up their digital assets via web APIs, the need to harden security practices emerged, but at the same time these practices needed to reflect the simple nature of the modern web API that developers expected. Traditional enterprise approaches to identity and access management would not always fly within web API implementations, with the majority of providers opting for Basic Auth or API keys when securing their APIs, but two approaches to securing APIs have evolved along the way.
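Basic Auth in particular is about as simple as API security gets, which is why so many early providers reached for it: the client just base64 encodes "username:password" into a standard header (RFC 7617). A quick sketch, with made-up credentials:

```python
import base64

def basic_auth_header(username: str, password: str) -> dict:
    """Return the Authorization header for HTTP Basic Auth: the
    "Basic " scheme followed by base64("username:password").
    The credentials below are made up for illustration."""
    credentials = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {"Authorization": f"Basic {credentials}"}

print(basic_auth_header("alice", "secret"))
# {'Authorization': 'Basic YWxpY2U6c2VjcmV0'}
```

That simplicity is also the weakness: credentials travel with every request, which is part of what pushed the space toward delegated approaches like OAuth.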
In 2006 a movement was born out of Twitter and the social bookmarking site Ma.gnolia, out of a frustration that there were no existing standards for platforms, developers, and users to manage API access and resource delegation. By 2007 a small group had gathered to draft a proposal for a new protocol, resulting in what became the OAuth Core 1.0 draft, which then evolved into an OAuth working group within the Internet Engineering Task Force (IETF).
By October 2012, OAuth 2.0 had emerged as the next evolution of the protocol, focusing on client developer simplicity while also providing specific authorization flows for web applications, desktop applications, mobile phones, and devices. OAuth 2.0 has seen wide adoption by leading API providers, quickly establishing it as one of the first major open standards the web API community would embrace.
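The most visible piece of OAuth 2.0's authorization code flow (RFC 6749, section 4.1) is the URL an application sends a user to when asking for access. A sketch using placeholder endpoint and client values, not any real provider:

```python
from urllib.parse import urlencode

def authorization_url(auth_endpoint, client_id, redirect_uri, scope):
    """Build the user-facing authorization request for the OAuth 2.0
    authorization code flow. All values passed in are placeholders."""
    params = {
        "response_type": "code",  # ask the provider for an authorization code
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
    }
    return f"{auth_endpoint}?{urlencode(params)}"

url = authorization_url(
    "https://provider.example.com/oauth/authorize",
    "my-client-id",
    "https://app.example.com/callback",
    "read",
)
print(url)
```

The user grants access on the provider's own site, the application receives a short-lived code at the redirect URI, and exchanges it for a token--so the user's password never passes through the application at all.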
While OAuth can be celebrated as a security standard for the API space, the evolution hasn't been without its problems. In July 2012, one of the original OAuth champions, Eran Hammer, resigned as lead author of the OAuth 2.0 project, withdrew from the IETF working group, and removed his name from the specification, citing a conflict between the open web and enterprise cultures, stating that the IETF as a community is "all about enterprise use cases", and "not capable of simple." What is now offered is a blueprint for an authorization protocol, he says, and "that is the enterprise way", providing a "whole new frontier to sell consulting services and integration solutions."
While OAuth 2.0 is not the perfect solution for delegating access to resources via APIs, it is the best we have at the moment. It provides a viable approach that allows API platform providers to secure resources in a way that lets developers easily access them, with the involvement of end-users. Even if OAuth 2.0 has become a tool of the enterprise, it is providing some meaningful delegation, and has enabled the space to safely and securely expand and integrate at a steady pace over the last few years.
JSON Web Tokens (JWT)
At the same time OAuth has been maturing, another industry standard (RFC 7519) has evolved, called JSON Web Tokens, providing an open way to securely represent online exchanges between two parties. The tokens are designed to be compact, URL-safe, and usable in single sign-on (SSO) contexts on the web. JWT claims are typically used to pass the identity of authenticated users between an identity provider and a service provider, or to carry any other types of claims as part of regular business activity.
Work on JWT began in September of 2010, with the first draft becoming available in July of 2011. A growing number of API providers are using JWT as a middle ground between simple API keys and the sometimes overwhelming OAuth implementations that can create friction for developers.
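The token format itself is simple enough to sketch by hand: a base64url-encoded header, payload, and HMAC signature joined by dots (RFC 7519). This is illustrative only; a real system should use a vetted library such as PyJWT rather than rolling its own:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use unpadded base64url encoding (RFC 7515)
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(claims: dict, secret: bytes) -> str:
    """Assemble a minimal HS256 token: header.payload.signature.
    The claims and secret used below are made-up examples."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    signature = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{signature}"

token = make_jwt({"sub": "user-123", "iss": "example-idp"}, b"shared-secret")
print(token.count("."))  # 2 -- three dot-separated segments
```

Because the signature covers the header and payload, a service provider sharing the secret can verify the claims without a round trip to the identity provider, which is what makes JWT attractive for SSO.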
I think JWT has the potential to flourish outside of the challenges OAuth has faced from the enterprise, at least for a couple more years, until it sees the same amount of adoption as OAuth.
Both OAuth and JWT have helped round off the API security stack, where along with Basic Auth and API keys, API providers now have a robust set of tools that allow them to secure the valuable resources being made available via web APIs.
The Now Infamous Yegge Rant
Echoing the API history Amazon had been laying down, in March of 2011 a Google employee accidentally published an internal post about Google+. The rant, shared publicly by mistake, provides some insight into how Google approached APIs for its new Google+ platform, as well as insight into how Amazon adopted an internal service oriented architecture (SOA).
The insight about how Google approached the API for Google+ is interesting, but what is far more interesting is the insight the Google engineer who posted the rant, Steve Yegge, provides about his time working at Amazon, before he was an engineer with Google.
During his six years at Amazon he witnessed the transformation of the company from a bookseller to the almost $1B Infrastructure as a Service (IaaS) API and cloud computing leader. As Yegge recalls, one day Jeff Bezos issued a mandate, sometime back around 2002 (give or take a year):
- All teams will henceforth expose their data and functionality through service interfaces.
- Teams must communicate with each other through these interfaces.
- There will be no other form of inter-process communication allowed: no direct linking, no direct reads of another team's data store, no shared-memory model, no back-doors whatsoever. The only communication allowed is via service interface calls over the network.
- It doesn't matter what technology they use.
- All service interfaces, without exception, must be designed from the ground up to be externalizable. That is to say, the team must plan and design to be able to expose the interface to developers in the outside world. No exceptions.
The mandate closed with:
Anyone who doesn't do this will be fired. Thank you; have a nice day!
Everyone got to work, and over the next couple of years Amazon transformed itself internally into a service-oriented architecture (SOA), learning a tremendous amount along the way. While this story has proven to be more myth than reality, I think the real impact of the story is in how this myth has been heard and passed around the API sector, told and retold by IT, developer, and business people around the globe.
This story came at a time when many companies were struggling with the scary possibility of operating public APIs, and it allowed them to refocus on much of the value that APIs bring to the table internally. The Yegge rant provides an important story that companies can tell themselves as they begin their API journey, keeping things internal in the beginning, but with hopes that someday they can go public and find the success that Amazon has had with its API platform.
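The core of the mandate is that a team's data store is private, and every other team goes through the service interface. A hypothetical sketch of what that boundary looks like in code (the `OrdersService` name and fields are invented for illustration):

```python
class OrdersService:
    """One team's service: the interface is the ONLY way in."""

    def __init__(self):
        self._orders = {}   # private data store: no direct reads by other teams

    def create_order(self, order_id: str, item: str) -> dict:
        # the externalizable interface other teams (or the public) would call
        order = {"id": order_id, "item": item, "status": "pending"}
        self._orders[order_id] = order
        return dict(order)

    def get_order(self, order_id: str) -> dict:
        # hand back a copy, never a reference into the private store
        return dict(self._orders[order_id])
```

In a real SOA these calls would travel over the network rather than in-process, but the discipline is the same: no back doors, no shared data stores, only the interface.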
Open Source Software and Now APIs With Github
In tandem with the evolution of the cloud, another company was being born that would make yet another monumental impact across the API space. In late 2007, Tom Preston-Werner and Chris Wanstrath came together to improve on the open source distributed version control system, Git. The pair were looking to improve the existing Git experience and develop a hub for coders, and by mid-January of 2008, after three months of nights and weekends, they launched Github into private beta.
Along with a simplified Git experience at its heart, and the social network that brought coders together, Github has also leveraged APIs all along the way. The Github API provides developers with access to all aspects of the Github platform, providing the ability to manage the software development life cycle, while also building community along the way.
As the potential of Github in software development was being realized, Github did another seemingly simple thing that would further expand its use across the API sector: it launched Github Pages. The new solution allowed project websites to be deployed alongside Github repositories, something that would tweak the meaning of exactly what a repo could be used for.
API providers began using Github Pages to host their API developer portals, SDKs, and code samples, and even to publish event presentations and manage the publishing of open data. Github has emerged as the platform of choice in the API space, and is used at almost every stop along the API life cycle, leveraging Git and a robust API to orchestrate and automate the API driven backend of the latest wave of web, mobile, and device-based solutions.
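As a small taste of the Github API, here is a sketch that builds the URL for the public repository endpoint and parses the response into a couple of fields. The parsing is separated out so it can be exercised without a network connection; the live call requires internet access and is subject to Github's rate limits.

```python
import json
import urllib.request

API_ROOT = "https://api.github.com"   # public Github API root

def repo_url(owner: str, repo: str) -> str:
    # the repository endpoint: GET /repos/{owner}/{repo}
    return f"{API_ROOT}/repos/{owner}/{repo}"

def parse_repo(body: str) -> dict:
    # pull a couple of fields out of the repository JSON payload
    data = json.loads(body)
    return {"full_name": data["full_name"],
            "stars": data.get("stargazers_count", 0)}

def fetch_repo(owner: str, repo: str) -> dict:
    # live network call -- needs connectivity, honors no auth here
    with urllib.request.urlopen(repo_url(owner, repo)) as resp:
        return parse_repo(resp.read().decode())
```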
Changing The Way We Communicate Around Our APIs With Swagger And The Swagger UI
In 2010 and 2011, a new way to approach the old SOA practice of describing services emerged, called Swagger. The new API definition format was developed by Tony Tam (@fehguy) to meet API needs at Wordnik, when it came to helping manage the evolution of their dictionary API. Swagger provides API providers a new way to describe the surface area of any web API, allowing for the generation of documentation, code libraries, and many other things developers need to understand what an API does, and how to put it to work.
Swagger is often known for its tooling for deploying a new type of API documentation, one that is interactive, allowing developers to make API requests and see the details of the request and the results before they ever write any client code. However, the interactive API documentation was just the beginning, and the API definition format would eventually be applied to almost every stop along the API life cycle.
Swagger has matured to version 2.0, and has become the central contract that defines the arrangement between API provider and consumer. In 2015, Swagger was acquired by SmartBear Software, with the specification put into the Linux Foundation. In 2016, the specification re-emerged as the OpenAPI Spec, and is now governed by the Open API Initiative (OAI), the organization formed as part of the move to the Linux Foundation. Even amidst all the turmoil, the OpenAPI Spec is still rapidly expanding in use across the web, providing a machine readable way for API providers, consumers, and even business stakeholders to describe the valuable API resources being exchanged as part of the API economy.
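To show what "describing the surface area" looks like, here is a minimal Swagger 2.0 document for a single, hypothetical word-lookup endpoint (the paths and titles are invented for illustration), built as a Python dictionary and serialized to JSON:

```python
import json

# a minimal Swagger 2.0 definition: one path, one operation
spec = {
    "swagger": "2.0",
    "info": {"title": "Dictionary API", "version": "1.0.0"},
    "paths": {
        "/words/{word}": {
            "get": {
                "summary": "Look up a word",
                "parameters": [{
                    "name": "word", "in": "path",
                    "required": True, "type": "string",
                }],
                "responses": {"200": {"description": "The word definition"}},
            }
        }
    },
}

def to_json(document: dict) -> str:
    # serialize the machine readable contract for tooling to consume
    return json.dumps(document, indent=2)
```

From a document like this, Swagger tooling can generate interactive documentation, client code, and more, which is exactly why it became the central contract between provider and consumer.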
Apiary Teaches Us To Be API Design First
Swagger gave us a way to describe our APIs, but many API providers still apply it after an API has been developed--until one company came along and helped move the conversation earlier in the API life cycle. The Apiary.io team used their own API definition format, called API Blueprint, not just to describe and document an API, but also to allow designers to mock it before anyone ever got their hands dirty writing code. This API design first approach to API development has had a profound effect on how we look at the API life cycle, allowing us to make mistakes and bring in key stakeholders before things ever get costly.
What Apiary brought to the table wasn't just about making it easier to design, mock, develop, and document our APIs; they pushed the space to open up the API conversation with consumers and key business stakeholders much earlier in the life cycle, before things went down a bad road and were set in stone. This process allows everyone involved to get to know the resources being made available via APIs, and to design a solution that better matches how the resources will be experienced, not just how they are stored.
API design first has become a mantra for many companies and API service providers. While it isn't truly a reality for all who recite the phrase, it provides a healthy focus for API designers, architects, and business stakeholders at varied stages of their API journey. Many companies will need this focus to get them through the challenges they face along the way, as they try to operate in this new online, API driven web, mobile, and device driven world.
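The mocking idea at the heart of design first can be sketched in a few lines: record the example responses in the design document, and serve them before any backend exists. This is not API Blueprint itself (which is a markdown-based format), just a hypothetical illustration of the pattern, with an invented dictionary endpoint:

```python
# the "design": example payloads recorded before any backend code exists
design = {
    "/words/cat": {
        "GET": {
            "status": 200,
            "example": {"word": "cat",
                        "definition": "a small domesticated feline"},
        },
    }
}

def mock_request(design: dict, method: str, path: str):
    # answer straight from the design, so stakeholders can try the API early
    endpoint = design.get(path, {}).get(method.upper())
    if endpoint is None:
        return 404, {"error": "not designed yet"}
    return endpoint["status"], endpoint["example"]
```

Consumers can integrate against the mock while the design is still cheap to change, which is the whole point of moving the conversation earlier.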
A Glimpse At The Internet of Things From Fitbit
By 2009 and 2010, it was becoming clear that APIs could be used to deliver the resources needed by the increasing number of mobile phones that were becoming ubiquitous. Amidst this rapid growth of mobile, another company popped up that saw the potential of connecting devices to the Internet, with the birth of the Fitbit. The new fitness and health tracking device would allow users to track their activity, health, and other key wellness indicators, which could then be connected to our mobile phones, helping plant the seeds for what we now call the Internet of Things (IoT).
In February 2011 Fitbit quietly launched their API, providing connectivity to the data that was uploaded to the Internet from the tracking device, via our mobile phones. Two months after Fitbit launched their API, they announced the first wave of partners who had integrated with the fitness and health device. This partner potential is why companies of all shapes and sizes were beginning to deploy APIs, allowing for 3rd party companies to tap into the growing number of valuable resources being made available online.
While Fitbit is not responsible for the Internet of Things, as connecting devices to the Internet via wifi and bluetooth is nothing new, they do provide a solid example of IoT in action, one that is publicly traded and has seen both consumer and commercial success. Whether you call it the quantified self, wearables, or the Internet of Things, Fitbit has captured the imagination when it comes to Internet connected devices.
Integration Platform as a Service (iPaaS)
As developers were realizing the potential of web APIs, a wave of new companies was also emerging that saw the potential for non-developers to put APIs to work in the everyday business and consumer world. In November 2011, Zapier began publishing simple connectors between popular cloud platforms that would allow anyone to put APIs to work in managing their increasingly online world.
By June of 2015, Zapier had launched its third-party developer platform, which allowed API providers to build their own connectors. The connectivity that companies like Zapier offer reflects older, more enterprise approaches like Extract, Transform, and Load (ETL), which helped businesses move data and information around on their networks. The big difference with this new breed of provider is that the connectors employ simple icons representing popular API driven services, and focus on the API driven cloud, moving beyond the company network.
There are more than 50 providers I track on who provide iPaaS services of all shapes and sizes, continuing to legitimize the concept, but not all pay it forward by providing an API as well--a significant part of making the concept work. While iPaaS helps smooth over some of the more difficult aspects of API integration, these providers shouldn't hide it altogether, and eliminate the possibility of API access by consumers.
iPaaS isn't just about moving data and content from point A to point B; it is about aggregating, syncing, and migrating valuable API driven resources. As the number of APIs grows, the number of iPaaS providers also increases, providing a wealth of API driven resources that any business user, or even developer, can put to work for them.
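Under the hood, an iPaaS connector is essentially a trigger, a field mapping, and an action. A hypothetical sketch of that pipeline (the function and field names are invented for illustration, not any provider's actual API):

```python
def run_connector(trigger, field_map, action):
    """Poll a trigger for new records, remap fields, hand them to an action.

    trigger:   callable returning an iterable of new source records (dicts)
    field_map: {source_field: destination_field} mapping
    action:    callable that receives each mapped record
    """
    moved = 0
    for record in trigger():
        # translate the source service's fields into the destination's
        mapped = {dest: record[src] for src, dest in field_map.items()}
        action(mapped)
        moved += 1
    return moved
```

In a real iPaaS, the trigger would poll one platform's API and the action would POST to another's; the value the provider adds is hiding that plumbing behind simple icons.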
Obama Mandates Federal Government To Go Machine Readable By Default
As a follow-up to Executive Order 13571, issued on April 27, 2011, requiring executive departments and agencies to identify ways to use innovative technologies to streamline their delivery of services, lower costs, decrease service delivery times, and improve the customer experience, Barack Obama directed federal agencies to deploy web APIs.
The White House CIO released a strategy entitled "Digital Government: Building a 21st Century Platform to Better Serve the American People", providing federal agencies with a 12-month plan that focuses on:
- Enabling more efficient and coordinated digital services delivery
- Encouraging agencies to deliver information in new ways that fully utilize the power and potential of mobile and web-based technologies
- Requiring agencies to establish central online resources for outside developers and to adopt new standards for making applicable Government information open and machine-readable by default
- Requiring agencies to use web performance analytics and customer satisfaction measurement tools
While the mandate itself didn't do much to move the open data and API needle in the federal government, it did mobilize many people who were looking to make change in government. In addition to the mandate, a wave of open data and API savvy CTOs and CIOs have led the charge at the White House, and groups like 18F have taken up the cause of open data and APIs across the federal government.
At the same time this change was happening at the federal level, open data and APIs were also making change on the ground in city, state, and county governments across the country. While not all early visions of open data have been realized, the Obama mandate marked a major milestone in how our government works, thanks in part to the concept of the web API.
Setting A Very Negative Precedent In The Oracle v Google API Copyright Case
Even with all the gains the API industry has made in the last 15 years, it hasn't been without its major potholes, speed bumps, toll booths, detours, and disruptions. Just as the API space was seeing some amazing contributions and growth, a chill was sent across the industry by a court case brought by Oracle against Google, which claimed that the Java API had been copied by Google, and was something protected under copyright.
In May 2012, the jury in the case found that Google did not infringe on Oracle's patents, and the trial judge ruled that the structure of the Java APIs used by Google was not copyrightable. However, in 2014 the Federal Circuit court partially reversed the district court ruling, deciding in Oracle's favor that the APIs were indeed protected under copyright. A petition was submitted to the United States Supreme Court on June 29, 2015, but was denied, sending the remaining issue of fair use back down to the district court.
While the Java API is a different breed of API than the web APIs that have gained momentum, and the fair use question remains open, the court case has sent shockwaves across the API sector. There is a lot of uncertainty involved with companies doing APIs, and the API copyright precedent adds yet another concern for both API providers and consumers, adding unnecessary strain to the space. Web APIs flourish when they are used as an external R&D lab between a company, its partners, and the public, and the dark cloud of API copyright threatens this balance.
Twitter Sends All The Wrong Signals To Its Community in 2012
At the same time we were dealing with the fallout from the Oracle v Google case, one of the poster children of the modern API movement sent a series of chilling messages and veiled threats to its then fast growing API ecosystem. In June 2012, Twitter published a post explaining the need for delivering a consistent Twitter experience for users, followed by a very ominous post in August of 2012 talking about changes coming down the line for the Twitter API.
While Twitter was just tightening up control over its brand, applications, and community, something all API providers face, the way it approached the situation sent such a negative vibe through its community that developers revolted. Twitter made it clear that it was in competition with its own API ecosystem, and was trying to take back control over some of the more successful areas of development that had been occurring within the ecosystem, areas already being met by businesses built by API developers.
Everything we know of as Twitter was built by its developer ecosystem, a relationship that was very public and encouraged by Twitter, until the company no longer needed the free labor, and had taken on a significant amount of funding, requiring it to shift course. Twitter needed to generate revenue, and made it very clear that it was taking back the most successful areas of the platform, something that had a very chilling effect on the API community, and something the company has never recovered from.
Even though Twitter co-founder Jack Dorsey reassured developers that Twitter cares about its developers as he retook the reins of his stumbling company in 2015, the trust had already been broken, proving that trust is one of the most important aspects of API platform operations--once it has been broken, it is almost impossible to recover.
As momentum in the API space grew in 2011 and 2012, the traction API service providers were seeing caught the attention of some of the more established tech giants who have dominated the tech sector for decades. The well defined discipline of API management, set into motion by Mashery, who is showcased above, had ripened to the point of making these companies very attractive acquisition targets for the enterprise, and we saw a handful of acquisitions ring out across the space in 2013.
In late 2012 we saw the first acquisition, Vordel by Axway, which set off a series of high profile API management provider acquisitions in 2013, beginning with Mashery being purchased by Intel, then Layer 7 by CA Technologies, and Apiphany by Microsoft later in the year. The acquisitions sent the signal to markets that the API space had come of age, the space was maturing, and the big boys were taking notice.
In a little less than a decade, API management had grown up to be a legitimate business, and proved to be one that would attract the attention of the biggest tech companies in the space. While the acquisitions have legitimized the value of API management solutions, it hasn't all been good, as the attention from the enterprise has also meant a shift in focus by the investors of popular API service providers, looking for the big pay-off, and shifting away from many of the priorities that have made APIs successful, operating on the open Internet as opposed to behind a corporate firewall.
Even with all the acquisitions in 2013, the biggest milestone for the API management space was the IPO of one of the API pioneers, Apigee. In May of 2015, Apigee, the developer of an API-based software platform, filed a registration statement on Form S-1 with the U.S. Securities and Exchange Commission (SEC) relating to a proposed initial public offering of shares of its common stock.
The API management acquisitions were validating, but one of the leading companies going public was a more significant milestone, marking that the space was indeed a real thing (we hope), and potentially something that mainstream markets now acknowledged. In the tech sector we are surrounded by like-minded folks who are usually believers by default, while out in the real world there are large sectors of business who are much more skeptical about what is relevant.
While the Apigee IPO performance has been mild at best, it still legitimizes not just API management, but also brings validation to the wider concept that the web can be used as a driver of real world business, not just mashups and online play. In 2015, after fifteen years of evolution, web APIs now have a representative on Wall Street, setting the stage for wider growth in many established industries like banking, insurance, health care, and beyond.
In early 2014, Stewart Butterfield, one of the original founders behind Flickr, the pioneering photo sharing platform included in this history, launched a team messaging solution named Slack. After Butterfield left Yahoo, which had acquired Flickr, he began building a game called Glitch; while it enjoyed a small cult following, it was not a commercial success, and by 2012 the company had to shut its doors and lay off its staff.
One byproduct of the gaming platform was the messaging core they had built, which after shutting down they spun off into a separate product that they continued to work on throughout 2013. Once released in 2014, the platform was an immediate hit with the VC and Silicon Valley community, and quickly became a huge messaging success; equally important, via its API the platform spawned a huge number of successful integrations, as well as a fast moving bot ecosystem.
In 2016, Slack has become the epicenter of a chat and messaging bot evolution that originally focused on the Twitter ecosystem, but has become more about business productivity and other business solutions, injected into the workplace team environment via the popular messaging platform. This bot movement has spawned a whole new wave of interest from VCs, and while the concept is nothing new, Slack, Twitter, and other API and messaging driven platforms are giving rise to this new bot as an API client environment.
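The bot-as-API-client idea can be sketched with a minimal handler in the shape of a Slack slash command: the platform POSTs a payload with fields like `command` and `text`, and the bot answers with a JSON body. This is a hypothetical `/echo` command for illustration, not a complete Slack integration (a real one would also verify the request token and be served over HTTPS):

```python
def handle_command(payload: dict) -> dict:
    """Answer a slash-command style payload with a Slack-shaped JSON reply."""
    command = payload.get("command", "")
    text = payload.get("text", "")
    if command == "/echo":
        reply = text or "(nothing to echo)"
    else:
        reply = f"unknown command: {command}"
    # "in_channel" posts the reply visibly rather than only to the caller
    return {"response_type": "in_channel", "text": reply}
```

The bot here is just another API client: it consumes one platform's webhook payloads, and in a richer example would call other APIs before replying.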
Amazon API Gateway
In 2015, AWS continued to define the API space and demonstrate its dominance by releasing the AWS API Gateway, which allows any AWS customer to design, deploy, manage, and monitor their APIs via their existing AWS cloud infrastructure. While many cried that this was a killer of the existing API management service providers, now that time has passed it seems to be a natural progression of the API space, as well as telling of Amazon's role in it.
As the AWS API Gateway press release information states:
“create an API that acts as a “front door” for applications to access data, business logic, or functionality from your back-end services, such as workloads running on Amazon Elastic Compute Cloud (Amazon EC2), code running on AWS Lambda, or any Web application”
The new gateway will take all that existing infrastructure you have accumulated (in the cloud), and it:
“..handles all the tasks involved in accepting and processing up to hundreds of thousands of concurrent API calls, including traffic management, authorization and access control, monitoring, and API version management.”
Distilling down the lessons from the last five years, and selling it:
“With the proliferation of mobile devices and the rise in the Internet of Things (IoT), it is increasingly common to make backend systems and data accessible to applications through APIs.”
To me, the release of the AWS API Gateway is a pretty significant milestone in the evolution of what an API is. By 2006 the web had matured, the Internet was being used for much more than just consumption, and the API community was realizing that we could deploy vital digital resources using the Internet as a vehicle. Almost 10 years later, Amazon understands the opportunity in enabling you to do this for yourself--helping you either embark on, or speed up, the API journey they've been on for over 15 years.
Allowing you to manage any of your digital assets as an API, using AWS API Gateway, is just the beginning of the expertise that Amazon is packaging up for all of us in the latest release.
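The "front door" tasks the press release lists (access control, traffic management, routing to a backend) can be illustrated with a toy in-process gateway. This is a hypothetical sketch of the pattern, not AWS's implementation or API; the class and route names are invented:

```python
import time

class Gateway:
    """Toy front door: API key check, throttling, then routing to a backend."""

    def __init__(self, rate_limit: int, window: float = 1.0):
        self.routes, self.keys = {}, set()
        self.rate_limit, self.window = rate_limit, window
        self.calls = []   # timestamps of recently accepted calls

    def add_route(self, path, backend, api_key):
        self.routes[path] = backend
        self.keys.add(api_key)

    def request(self, path, api_key, now=None):
        now = time.monotonic() if now is None else now
        if api_key not in self.keys:
            return 403, "forbidden"                  # access control
        self.calls = [t for t in self.calls if now - t < self.window]
        if len(self.calls) >= self.rate_limit:
            return 429, "throttled"                  # traffic management
        self.calls.append(now)
        backend = self.routes.get(path)
        if backend is None:
            return 404, "no such route"
        return 200, backend()                        # hand off to the backend
```

The real service does all of this, plus versioning and monitoring, at the scale of hundreds of thousands of concurrent calls, which is exactly the operational burden it lifts off the API provider.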
Delivering On The Promise Of Voice Enablement With The Alexa Voice Service & Skills Kit
Joining in on the wider conversation around the Internet of Things, Amazon has released several IoT focused solutions, but none have made an impact on the space, and potentially the future of APIs, more than the Alexa Voice Service. I hesitate to include this as a milestone in what I consider to be the history of APIs, but what Amazon is doing is already making significant waves when it comes to how APIs are consumed.
In the summer of 2015, Amazon introduced their voice enabled device, the Amazon Echo, which was supported by a suite of APIs bundled under the Alexa Skills Kit (ASK), and now also the Alexa Voice Service (AVS). Much like the rest of the IoT platforms, the Amazon Echo still has to prove itself with real world usage, but the skills kit and voice service have emerged at just the right time in the evolution of mobile and voice, as well as complementing the number of API resources available.
Just as messaging platforms like Slack are providing a potential new way to reach consumers, the Alexa Voice Service is providing a new way to access valuable API driven resources. More importantly, I feel the concept of the "Skills Kit" is providing an entirely new way for API providers to think about how they expose their valuable resources, making them available in ways that are more meaningful to home and business users. Only time will tell if Alexa becomes part of the overall API consciousness, but after less than a year of operation, I am seeing signs of the platform being a very important milestone in the evolution of the space.
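A skill is, at bottom, another API client shape: Alexa sends the skill a JSON request naming an intent, and the skill answers with JSON containing speech. Here is a minimal handler in roughly the shape of an Alexa Skills Kit exchange; the intent name is invented, and a real skill would handle slots, sessions, and more:

```python
def handle_alexa(event: dict) -> dict:
    """Turn an Alexa-style request event into a speech response."""
    request = event.get("request", {})
    if request.get("type") == "IntentRequest":
        intent = request.get("intent", {}).get("name", "")
        text = f"Handling the {intent} intent." if intent else "I did not catch that."
    else:
        # e.g. a LaunchRequest, when the user opens the skill with no intent
        text = "Welcome to the sample skill."
    return {
        "version": "1.0",
        "response": {"outputSpeech": {"type": "PlainText", "text": text}},
    }
```

The interesting shift is that the "client" here is a voice in a living room, yet the plumbing underneath is the same request and response exchange the rest of this history has been describing.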
Understanding history is critical to understanding where we are going. Calling this document the "history" of web APIs seems kind of silly--we are really only talking about a span of 16 years. But there are so many important lessons to be learned from the approach of these API pioneers, and the marks they've left on the space, that we can't ignore this history. If the technologists had their way, APIs might have remained purely about commerce, but with the radical innovation of companies like Amazon, Twitter, Twilio, Slack, and others, we now understand that APIs needed several essential ingredients to succeed: commerce, social, cloud, messaging, voice, and more.
Of course, all of this has to make money, and APIs need to be scalable, while also delivering the meaningful tools, services, and resources that are important to end users--otherwise none of this matters. As we stand solidly in the mobile period of API evolution, looking ahead to a period that will be more about devices and the Internet of Things (IoT), we need to understand our own history and how we've gotten here, to make sure we make the right decisions for what is next.
Web APIs are about delivering valuable, meaningful, scalable, and distributed resources across the World Wide Web. While Silicon Valley keeps pushing forward with the next generation of technology solutions, we need to make sure that we know our past.