{"API Evangelist"}

Converting Between The Popular API Definition Formats Using API Transformer

My own API management system allows me to import Postman collections, HAR files, Charles Proxy XML files, and Swagger version 1.2, but when it comes to output, it only speaks Swagger 2.0. I've been wanting to create a tool for outputting my definitions as API Blueprint for some time now, but just haven't had the time to do the work.

I have been secretly hoping someone would build a good quality solution, so I wouldn't have to do this work myself. Now I have API Transformer, an API definition translation platform developed by the APIMATIC team. Using API Transformer you can upload or pull API definitions in the following formats:

  • API Blueprint
  • Swagger 1.0 - 1.2
  • Swagger 2.0 JSON
  • Swagger 2.0 YAML
  • WADL - W3C 2009
  • Google Discovery
  • RAML 0.8
  • I/O Docs - Mashery
  • APIMATIC Format

Then you can output API definitions in the following formats:

  • API Blueprint
  • Swagger 1.2
  • Swagger 2.0 JSON
  • Swagger 2.0 YAML
  • WADL - W3C 2009
  • APIMATIC Format

With API Transformer, APIMATIC is offering a pretty valuable service that any API service provider, and API consumer, can put to work right away. I quickly generated four API Blueprints, for my API, audio, blog, and building block APIs, which I also indexed as part of each API's APIs.json file.

As any good API service provider should, API Transformer has an API of its own, so you can build definition transformations into any service or tooling. In my opinion, every API service provider in 2015 should speak as many API definition formats as possible, always allowing customers to import and export in all of the formats offered above.

API Transformer reflects how API service providers should work: do one thing, and do it well, providing a simple web interface, as well as a dead simple API. There is no way I will be building out my own service now that APIMATIC has launched API Transformer.
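
To give a sense of how simple wiring this into a pipeline could be, here is a rough sketch in Python. The endpoint URL and parameter names are placeholders, not APIMATIC's documented interface, so consult the API Transformer docs for the actual details:

    import requests

    # Rough sketch: convert a Swagger 2.0 definition into API Blueprint
    # using a conversion API. The URL and parameter names below are
    # placeholders, not the documented API Transformer interface.
    API_URL = "https://transformer.example.com/convert"  # hypothetical endpoint

    with open("swagger.json", "rb") as swagger_file:
        response = requests.post(
            API_URL,
            params={"output": "apiblueprint"},  # assumed output format name
            files={"file": swagger_file},
        )

    response.raise_for_status()

    with open("api-blueprint.md", "w") as blueprint_file:
        blueprint_file.write(response.text)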

See The Full Blog Post


Adding A Page To My Research For Tracking On Swagger Extensions

I have a research project dedicated to trying to understand all things Swagger. I try to add any new research or tooling there when I can. The latest thing I added was a page to list out Swagger extensions that I find in the wild.

I knew Apigee had extended Swagger in some interesting ways, but I kept coming across other interesting examples, and wanted to aggregate them into a single location that others can reference and build upon.

There is now an extensions page on my Swagger research. I have added APIMATIC's x-codegen-settings, and a couple from Apigee, to kick things off. If you know of any interesting examples of Swagger extensions, please let me know via the project's Github issues, or feel free to fork the project and add to the extensions page via the _config.yml file, where they are stored as a YAML collection.
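
For anyone who hasn't seen one in the wild, vendor extensions are simply extra properties prefixed with x-, dropped into an otherwise standard Swagger document. Here is a minimal sketch in YAML--the settings shown are illustrative, not APIMATIC's documented options:

    swagger: "2.0"
    info:
      title: Example API
      version: "1.0"
    # Vendor extensions are any properties prefixed with x-. The values
    # here are illustrative, not APIMATIC's documented settings.
    x-codegen-settings:
      clientClassName: ExampleClient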

Hopefully we can start centralizing all the innovative extensions of Swagger into a single location, helping us all not re-invent the wheel when it comes to extending the popular API definition format, affectionately known as Swagger.

See The Full Blog Post


Having The Resources You Need To Scale Your Startup API Team For That Enterprise Project

I just sat in on an APIWare call with a fast-growing startup, developing a better understanding of how we can assist their team. They have a pretty solid development team behind their API, so providing core API development resources is not in the cards.

Where this startup needs the most help is where the APIWare team shines:

  • API Operations Review - Taking a look at how an API platform works from the inside out, preferably starting with an outside perspective, so we can bring that fresh perspective to the table, and immediately begin adding value to the road-map.
  • Crafting API Strategy - With a better understanding of how an API works, inside and out, we can take a walk through our massive list of best practices, and see what we can apply when it comes to bringing together a strategy.

Providing companies with a comprehensive review of their API operations from an outside perspective, while also walking through their architecture, team, and other internal resources, is a valuable process on its own. Having a team of experienced API professionals listening, and walking through the current makeup of your API operations, helps you better articulate your vision while also helping us define how things work, so a coherent API strategy can be crafted by our team.

This is the stage we are in with this particular startup conversation--kicking off a formal review, so we can help them craft a coherent API strategy. Once this is done, we can re-assess how we can help them, and probably begin telling more public stories about the project. This conversation reflects other conversations the APIWare team is having with startups, SMBs, and enterprise groups. However, one thing that did present itself on this call is how APIWare is also becoming a burstable development team that can act as a backup when small startups are faced with the potentially large enterprise relationships that come with API success.

This particular startup just had a new enterprise customer come in through their API--the dream of any small API provider, until you actually have to scramble to meet the demands. I have seen the interest of the enterprise be both a blessing and a curse for small teams in the past, something that has the potential to derail a startup, taking them away from the daily work that matters. The enterprise has the resources to throw at these projects, but small startup teams do not always have what it takes to keep day to day operations running smoothly, while also meeting the needs of a large project or partnership.

This is where APIWare comes in. In addition to helping this startup review their current API operations, and craft a coherent API strategy, the APIWare team is here to help their core API developer team expand and contract as necessary. We are getting to know the company's existing infrastructure and operations, primarily to be able to help craft the overall API strategy, but this awareness also puts us in a good position to step in and help with a larger project, or augment the core team while they address the needs of larger, temporary projects.

The ability for APIWare to be there for a company in this capacity always starts with an API operations review. We can't be waiting in the wings, ready to work, if we aren't up to speed on how things are currently operating, and in tune with the road-map. It is nice to be starting these conversations early, so we are up to speed when we are needed most. I am happy to see that staying specialized in the areas of API design, deployment, and management is proving to be a valuable approach to the startups we are talking with so far.

See The Full Blog Post


A Suggested And Sponsored Link Relation Engine For Hypermedia APIs

Sometimes I have ideas that are sticky, meaning they won't leave my brain, but they are not always concepts I personally enjoy exploring. This is just a little insight into the madness that is my brain--focus Kin...FOCUS! So, I found myself thinking last week about API monetization, while I was also updating my API design and hypermedia research--then, during this Reese's peanut butter cup moment, I came up with an idea for a suggested or sponsored link relation engine for hypermedia APIs.

First, I'd prefer to start with what I'd consider to be a hypermedia suggestion engine, one that could provide external suggestions for related links to any structured data being returned via an API. It is a suggestion engine that an API provider, who has followed hypermedia principles as part of their API design, could use to augment the resources they are serving up. Such a suggestion engine would have to be pretty smart, and work from an existing index that API providers could train via an external API.

One possible example from my existing work is a scientific research API that serves up research papers, where each API call could return some native related links for annotating, tagging, and other opportunities dictated by the API provider. But what if the institution where the research occurred could provide related links to other research going on at the institution, or a governing scientific organization could suggest other related research or resources, with the API provider deciding which link relations to whitelist or blacklist?

Taking this to the next level, which is inevitable, and I might as well be the asshole who suggests it: what if API providers could approve a paid index of links to suggest as link relations for any product in the catalog? Sure, you can add this product to your wishlist, or shopping cart, but you could also donate it to a non-profit organization, or buy it through a partner who will modify it for you, and deliver it in a way that improves on the original product experience. Such a link relation engine could inject valuable links into the API stream, and the API provider, and potentially developers, could tap into this as a potential revenue stream. Of course, all links included would be fully vetted, and certified secure using a service like Metacert.
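
To make the idea a little more concrete, here is a rough sketch of what a response from such an engine might look like, using HAL-style links. The link relations and URLs are entirely made up for illustration:

    {
      "title": "Effects of Light Exposure on Plant Growth",
      "_links": {
        "self": { "href": "https://api.example.com/papers/1234" },
        "annotations": { "href": "https://api.example.com/papers/1234/annotations" },
        "suggested:related-research": {
          "href": "https://institution.example.edu/projects/987",
          "title": "Related research at the originating institution"
        },
        "sponsored:donate": {
          "href": "https://nonprofit.example.org/donate?paper=1234",
          "title": "Vetted, sponsored link injected by the relation engine"
        }
      }
    }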

If you are still reading this, remember that this link relation engine does not exist. This is just a random idea derived from the collision of several of my active projects, with a little forward thought about what one possible dystopian, post-advertising API monetization world might look like. I'm sure, like current advertising, 90% of the links in a link relation engine would be total shit, but who knows, they might also add value to the increasing number of structured objects being served up via APIs, and maybe act as a next generation financial engine tailored specifically for the API economy.

See The Full Blog Post


API Definitions Broker Critical Conversations Between Business And Developers Who Are Building NextGen Web, Mobile, and Device Apps

If you are in an industry being impacted by technology, you have probably become very aware of the term Application Programming Interfaces, more widely known as APIs, and how they are driving web applications, mobile applications, and increasingly everyday objects in our homes, cars, businesses, and across the public landscape. If you find yourself part of this growing conversation, you have most likely also heard talk of a new breed of API definition formats that are becoming ubiquitous, like Swagger and API Blueprint.

API definitions are a way to describe what an API does, providing a machine readable blueprint of how to put the digital resource to work. API definitions are not new, but this latest round of available formats are taking the API conversation out of just IT and developer groups, and enabling business units, and other key stakeholders to participate throughout the API life-cycle. Much like the latest wave of web APIs has made data, content, and other digital resources like video and images more accessible, API definitions are making APIs more accessible across the rapidly expanding digital business landscape.

The first widely available API definition format was the Web Services Description Language (WSDL), an XML format established in 2001 for describing web services. Much like web services (an API predecessor), WSDL was a very technical vision of APIs, something dictated by IT and developer groups, with heavy top down governance from business and industry leadership. While web services and WSDL are still ubiquitous across the enterprise, they are rapidly being replaced with much lighter weight, simpler web APIs that use the Internet as a way of delivering the digital data, content, and resources that web, mobile, and device applications are demanding in 2015.

Along the way, newer, more web friendly API definition formats emerged, such as the Web Application Description Language (WADL), but ultimately WADL never took root, suffering from many of the same illnesses as its predecessor WSDL. It wasn't until a new format called Swagger was born that we started to see the conversation around how we define, communicate, and develop standardized tooling around APIs evolve, providing an open specification for defining all the details that go into an API.

Swagger provided developers a way to describe an API that was more in sync with everything else modern API developers were used to, including using JSON, rather than the XML of the earlier web services formats, WSDL and WADL. Swagger gave us something more than just a way to define APIs--it gave us Swagger UI, an interactive version of API documentation that made learning about what an API does, and how to integrate with it, a hands-on, interactive experience. This new approach to documentation gave us a solution to the number one problem plaguing API providers: out of date documentation that confused consumers.

Shortly after Swagger began seeing wide adoption because of the interactive documentation it provided, a new API definition format emerged called API Blueprint, which also provided interactive documentation, but rather than using JSON, it used Markdown, making the process of defining APIs a little less intimidating for non-developers. Apiary, the makers of API Blueprint, did another thing that would move the conversation forward again, making the reason for defining APIs in these formats more about API design than just delivering up-to-date documentation.

Using API Blueprint, API designers could define an API before any code was actually written. Developers could craft an API using Apiary's tooling, then generate a mock version of the API, which could be shared with other project stakeholders, from business users to potential web or mobile developers. This process saves considerable time, money, and other resources in ensuring that an API will be something web, mobile, and device developers can actually put to use. With two API definition formats, Swagger and now API Blueprint, the process of defining and designing APIs in a machine readable way was accessible to everyone, across a rapidly expanding API life-cycle.

This evolution has all occurred over the last 4 years, a period which has also produced other API definition formats like RSDL, RADL, RAML, I/O Docs, and MSON--just to name a few. These API definition formats are quickly becoming the preferred way of not just defining and designing APIs, but also delivering documentation. Another positive by-product has been that a new breed of API service providers are using these definitions as the central truth for quickly putting their services to work on any API--for documentation, mocking and virtualizing APIs, generating server code, producing software development kits (SDK), and setting up the essential testing and monitoring that keep APIs stable and reliable for consumers. The API definition driven life-cycle continues to expand.

In 15 years, like APIs themselves, API definition formats have moved out of the realm of the purely technical, and are providing vital business interactions that ensure APIs meet critical internal, partner, and public needs. They are also being applied to bring much needed balance to the political side of API operations, from making sure APIs are stable and available, to defining pricing, rate limits, and terms of service, and even helping secure APIs that operate on the open Internet.

API definitions have become a machine readable contract that defines the boundaries of the relationship between an API provider and its consumers, acting as a central truth crafted by developers and API architects to govern how an API operates, from mocking to client integration, while also setting the technical, business, and legal expectations of consumers. This API definition-driven contract is transcending the often proprietary, black box algorithms that make an API function behind the scenes, providing a portable, shareable, machine readable contract that can be shared internally, and externally with partners or the general public.

The importance of this new layer, and its role in the future of software development, can be seen playing out in the Oracle v Google API copyright case, where Oracle (using the courts) has set the precedent that the naming and ordering of your interface is separate from the code, and falls under copyright protection. Beyond the core legal case, the questions around exactly what counts as code have been even more interesting. Many API architects do not see APIs as anything but code, having not yet seen the impact of the modern API definition movement within their own architecture.

API definitions aren't just about defining the URLs, parameters, headers, and other aspects of API operations that developers need to know--they are also bringing much needed clarity and awareness of the value generated by APIs among business users, and the end-users of the applications that APIs are powering. API definitions provide a common format in Markdown, YAML, or JSON that describes the technical surface of an API, and then allow that technical specification to be applied across every stop along the API life-cycle, from idea to deployment, to the resulting integration with web, mobile, and device applications.
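
For business stakeholders who have never actually seen one, here is a minimal sketch of what a Swagger 2.0 definition looks like in YAML--a trimmed down, illustrative example, not any specific API:

    swagger: "2.0"
    info:
      title: Example Notes API
      version: "1.0"
    host: api.example.com
    basePath: /v1
    paths:
      /notes:
        get:
          summary: List all notes
          parameters:
            - name: limit
              in: query
              type: integer
              description: Maximum number of notes to return
          responses:
            "200":
              description: A list of notes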

As APIs make their way into almost every aspect of our business and personal lives--driving our social relationships with family and friends, metering our connections to our utility companies, connecting us to educational and healthcare opportunities--this touch-point between a platform and the web, mobile, and device applications it powers is becoming increasingly critical. To businesses this layer represents critical supply chains, but to each individual this touch-point is where all of our life bits flow--further emphasizing the importance of, but also the sensitivity required in, defining APIs in a meaningful way that makes sense to EVERYONE involved.

In 2001, a WSDL definition was very much about communicating what a service did between a platform and the system that integrated with it, something that only involved IT and developers. In 2015, a modern API definition format provides the same benefits that WSDL has historically delivered, but it is also addressing the business and political elements of how Internet enabled software works. A modern API definition provides:

  • a medium for API designers, architects, and business stakeholders to craft exactly the API that is needed, before any production code is written
  • the set of instructions a quality assurance (QA) team needs to make sure an API meets business requirements
  • a definition of sandbox, mocking, simulation, and virtualization environments that developers may need to be successful
  • what a developer needs to integrate with another system, or build an application through interactive documentation, and even complete Software Development Kits (SDK)
  • what the API testing, monitoring, and performance groups will need to ensure service level agreements are met or exceeded
  • the known surface area that a security auditor will need to properly secure the infrastructure that web, mobile, devices, and ultimately users will depend on
  • a map that government regulators can use to understand the industry landscape, and help keep all players in alignment with the nation's priorities

This is just a sampling of how API definitions are being used as a driver for what is widely being called the API economy, which is at the heart of cloud, mobile, big data, Internet of Things (IoT), and almost every other technical trend of the last ten years. While API definitions provide the much needed machine readable instructions for computers to understand what occurs at these vital API touch-points, they also provide the much needed human readable instructions that people can use to interpret the business agreements and individual relationships playing out across our increasingly digital lives.

See The Full Blog Post


Adding A 3rd Dimension To My API Monetization Thinking

When it comes to the API space, it always takes numerous conversations with API providers and practitioners before something comes into focus for me. I've spent five years having API management conversations, an area that is very much in focus for me when it comes to my own infrastructure, as well as a metric for the public and private APIs that I review regularly.

While I have been paying attention to API monetization for a couple years now (thank you @johnmusser), in 2015 I find myself involved in 20+ conversations, forcing the topic to slowly come into focus for me, whether I like it or not. When talking to companies and organizations about how they can generate revenue from their APIs, I generally find the conversation going in one of two directions:

  • Resource - We will be directly monetizing whatever resource we are making available via the API, charging for access to the resource, and composing multiple access tiers depending on volume, and partnerships.
  • Technology - We believe the technology behind the API platform is where the money is, and we will be charging for others to use this technology, resulting in a growing wholesale / private label layer to the API economy.

90% of the conversations I engage in are focused in the first area, and how to make money off API access to a resource. The discussion is almost always about what someone will pay for a resource, something that is more art than science--even in 2015. The answer is, we don't know until there is a precedent, resulting in an imbalance where developers expect things for free, and providers freak the fuck out--then call me. ;-)

As more of my thoughts around API monetization solidify, a third dimension is slowly coming into focus, one that won't exist for all API providers (especially those without consumers), but is something I think should be considered as part of a long term roadmap.

  • Exhaust - Capture the logs, usage data, tooling, and other resources that are developed and collected through the course of API operations, and make them available in a self-service, pay as you go approach.

There are many ways you can capture the exhaust around API operations, and sell access to it. This is where the ethics of APIs come into play--you either do this right, or you do it in a way that exploits everything along the way. This could be as simple as providing an API endpoint for accessing the search terms executed against an API, all the way to providing a franchise model around the underlying technology behind an API, with all the resources someone needs to get up and running with their own version of it. If you are very short-sighted, this could be just about selling all your exhaust behind the scenes to your partners and investors.

To me this is all about stepping back, and looking at the big picture. If you can't figure out a theoretical, 3rd dimension strategy for making money off the exhaust generated by the resource you are making available via an API, and the underlying technology used to do so--there probably isn't a thing there to begin with. And if you can't do this in an ethical way, one that you will want to talk about publicly, and with your grandmother, you probably shouldn't be doing it in the first place. I'm not saying there isn't money to be made, I'm saying there isn't real value, and money to be made, that also lets you sleep at night.

This opens up a number of ethical benchmarks for me. If you are looking at selling the exhaust from everything to your VC partners, and never opening it up via your public APIs, you will probably do very well in the current venture capital (VC) driven climate. What I'm talking about is how you generate a new layer of revenue based upon the legitimate exhaust that is generated from the valuable resource you are making available, and the solid technological approach behind it. If there is really something there, and you are willing to talk about it and share publicly, the chances I'm going to care, and talk about it on my blog, increase dramatically.

If you do not have a clue what I'm talking about, you probably aren't that far along in your API journey. That is fine. This isn't a negative. Just get going as soon as you can. If you are further along, and have people and companies actually using your API, there is probably a lot of value already being generated. If you partner with your ecosystem, and educate, as well as communicate with end-users properly--I am just highlighting that there is a lot of opportunity to be captured in this 3rd dimension.

See The Full Blog Post


There Is A Big Opportunity Right Now When It Comes To API Design Tooling

API design and definitions are the number one area when it comes to talks submitted for APIStrat 2015 in Austin, and when it comes to traffic across the API Evangelist network in 2015. After diving into the Amazon API Gateway a little more over the weekend, I was reminded of the opportunity out there when it comes to API design tooling.

Amazon did a good job, providing a GUI interface for crafting the methods, resources, and underlying data models for the APIs you are deploying using the gateway. However, when you compare it to some of the GUI API design editors I mentioned in my last post on this subject, from Restlet, APIMATIC, and Gelato, the Amazon interface clearly needs to evolve a little more.

AWS is just getting started with their solution, so I'm not knocking what they have done. I just wanted to keep comparing all of the solutions as they emerge, and highlight the opportunity for some standardization in this layer of API tooling. I see a pretty big opportunity for some player to step up and provide an open source API design editor that provides a seamless experience across API service providers.

This post is nothing new. I am just trumpeting the call for open API design tooling each time I see another new interface introduced for crafting APIs--their paths, resources, parameters, headers, authentication, and underlying data models. At some point, a new player will emerge with the open source API design editor I am looking for, or one of the existing players will open source their current offering, and evolve it in the context of the wider API design space, providing an abstracted layer that supports all API definition formats.

With the growth in the number of service providers I see stepping up to serve the API space, the need for common, open tooling when it comes to API design is only going to grow. It took almost four years of waiting for the API management space to figure this out; I'm hoping I don't have to wait as long on the API design side of things.

See The Full Blog Post


Are There Really Any Monetization Opportunities Around Open Data And APIs?

One of my readers recently reached out to me, in response to some of my recent stories about monetization opportunities around government and scientific open data and APIs. I'm not going to publish his full email, but he brought up a couple of key, and very important, realities of open data and APIs that I don't think get discussed enough, so I wanted to craft a story around them, to bring them front and center in my work.

  • Most open data published from government is crap, and requires extra work before you can do anything with it
  • There currently are very few healthy models for developers to follow when it comes to building a business around open data
  • Business people and developers have zero imagination when it comes to monetization -- aka ads, ads, ads, and more ads.

My reader sums it all up well with:

I don't dispute that with some pieces of government data, they can be integrated into existing businesses, like real estate, allowing a company to value add. But the startup space levering RAW open, or paid government data is a lot harder. Part of my business does use paid government data, accessible via an API, but these opportunities the world over are few and far between in my mind.

I think his statement reflects the often unrealized challenges around working with open data, but in my opinion it also reflects the opportunity of the API journey, when applied to this world of open data.

APIs do not equal good, and if you build a simple API on top of open government data, it does not equal instant monetization opportunity as an API provider. It will take domain experts (or aspiring ones) to really get to know the data, make it accessible via simple web APIs, and begin iterating on new approaches to using the open data to enrich web and mobile applications in ways that someone is willing to pay for.

Taking an open data set, cleaning it up, and then being able to monetize access to it directly via an API is simply not a reality, and is something that will work in probably less than 5% of the scenarios where it is applied. However, this doesn't mean that there aren't opportunities out there when it comes to monetizing adjacent to, and in relationship to, the open data.

Before you can develop any APIs that a business or organization would want to pay for, you have to add value. You do this by adding more meaningful endpoints that do not just reflect the original data or database resources, and provide actual value to end users of the web and mobile applications being built--this is the API journey you hear me speak of so often.

You can also do this by connecting the dots between disparate data-sets, in the form of crosswalks, and by establishing common data formats that can be applied across local and regional governments, or possibly an industry. Once common data formats and interface models are established, along with a critical mass of high value open data, common tooling can begin to evolve, creating opportunities for further software, service, and partnership revenue models.

The illness that exists when it comes to the current state of open data is something partly shared between early open data advocates, who over-promised the potential of open data and under-delivered, and governments, which under-delivered on the actual roll-out and execution of their open data efforts. Most of the data published cannot be readily put to work, requiring several additional steps before the API journey even begins--making more work for anyone looking to develop around it, and putting up obstacles, instead of removing them.

There is opportunity to generate revenue from open data published by government, but it isn't easy, and it definitely isn't a VC scale opportunity. But for companies like HDScores, when it comes to selling aggregate restaurant inspection data to companies like Yelp, there is a viable business model. Companies that are looking to build business models around open data need to temper their expectations of being the next Twitter, and open data advocates need to stop trumpeting that open data and APIs will fix all that is wrong with government. We need to lower the bar, and just get to work doing the dirty work of exposing, cleaning up, and evolving how open data is put to work.

It will take a lot of work to find more of the profitable scenarios, and it will take years and years of hard work to get government open data to where it is the default, and the cleanliness and usefulness levels are high enough, before we see the real potential of open data and APIs. All this hard work, and the shortage of successful models, doesn't mean we shouldn't do it. For example, just because we can't make money providing direct access to the Recreation Information Database (RIDB), doesn't mean there aren't potentially valuable APIs when it comes to understanding how people plan their summer vacations at our national parks--it will just take time to get there.

My Adopta.Agency project is purely about the next steps in this evolution, making valuable government "open data" that has been published as CSVs and Excel files more accessible and usable, by cleaning it up and publishing it as JSON and / or APIs. I am just acknowledging how much work there is ahead of us when it comes to making the currently available open data accessible and usable, so we can begin the conversation about how we make it better, as well as how we generate revenue to fund this journey.

See The Full Blog Post


A Data Migration Tool To Help You Import, Export, and Sync Your Data In The Cloud

The Element Loader from API service provider Cloud Elements came across my monitoring this week, providing a configurable JavaScript application that helps simplify data migration, allowing you to move content and data between cloud platforms, by putting their APIs to work.

The Element Loader is interesting to me as an evolution of the concept of what I've long called API reciprocity, where companies like Zapier allow you to migrate your bits and bytes between the platforms we are all increasingly finding ourselves dependent on.

I think Cloud Elements is moving the needle forward just a bit, by formalizing a tool that is dedicated to real-time sync between the platforms you depend on. You can accomplish similar things with Zapier, but I think looking at it purely as sync of specific life bits (objects) can be a very valuable exercise.

I have been calling this reclaim your domain for a couple years now. I think the process of identifying the services we depend on is extremely valuable, and establishing a plan for how your bits and bytes work in concert really pushes things into the realm of actually healthy IT operations--for both individuals and businesses.

I do not have my world synced. My contacts on Google, LinkedIn, and other platforms are totally out of sync, and my documents are spread between Google, Amazon, and Dropbox, without any coherency at all. Don't get me started on my images. This is a real problem that is only growing, and a segment where I'd like to see more solutions like Element Loader emerge.

I'll start tracking on reciprocity providers like Cloud Elements who are doing specific things like sync, which will become one of the common building blocks I'll add to my research when I get the time to update it. Hopefully I will find some more time soon to take a deeper look at my API automation and interoperability research--it has been a while.

See The Full Blog Post


Working Hard To Generate API Value Within An Entrenched Legacy Industry Like Real Estate

I am preparing a talk this week in Portland, OR, at the IDX Developer Summit. IDX serves the real estate industry, providing real estate professionals access to hundreds of multiple listing service (MLS) groups from around the United States. If you are a real estate broker or agent, and you need real estate listings on your site, IDX is how you do this--they are a leading player in the space.

If you aren't familiar with the world of real estate data, it has long been controlled by a network of MLS groups, totaling almost 2000 (I think), spread around the nation. These MLS organizations tightly control their data, deciding exactly who has access to it, exactly how it can be used, and how it MUST be displayed in print and across the web. This is a process that long existed prior to the web, but since 1995 it is a mechanism that has gotten even stricter, and more litigious, seeking to maintain control over a very valuable, and increasingly digital, layer of our physical world.

When it comes to APIs, the real estate industry is the OG API provider, making data available via FTP locations as soon as the web was a thing. However, when it comes to the core principles of what makes APIs work, the real estate industry is the anti-API. MLS groups hoard facts, something that cannot have copyright applied, but if you are litigious enough, it is something you can defend. The addresses and details of residential and commercial properties are data that should be accessible to everyone, but MLS groups and the National Association of Realtors (NAR) have created a cartel that prevents this from ever being a reality. Think what the Recording Industry Association of America (RIAA) and record labels have done to music--the MLS and NAR do this to real estate.

IDX has struck a balance between hundreds of these MLS organizations, allowing it to process their prized data, and enabling real estate agents and brokers to publish this data on their websites using seamless, and often embeddable, tooling that adheres to the distribution and branding guidelines set by the MLS. IDX provides a bridge between the online digital world and this legacy world of data control, potentially providing the real estate industry with the online tooling it will need to be successful.

Ok, if the MLS, NAR, and real estate industry is the anti-API, what the hell am I doing speaking at a real estate industry developer conference? Well, I was the original tech founder of IDX. ;-) I built the original system for pulling real estate data, targeting the first handful of MLS organizations. I exited the company sometime around 2005, and my technology is long gone (thank god), and my two co-founders have gone on to do some very interesting things, building a thriving company in a very difficult space.

I won't be going to the IDX Developer Summit to talk shit on the MLS and NAR; I will be helping to inspire the developers about how much opportunity is available out there right now, even in the real estate industry. There are a lot of open data and Internet of Things related opportunities emerging when it comes to residential and commercial buildings, as well as neighborhood and city level development possibilities. My objective is to help them understand the realities of the space they exist in, and still build value within, and around, an industry that is so entrenched when it comes to data sharing--it can be done!

I also hope to get them to use my nationwide MLS API, and bypass the MLS and NAR system. Just kidding!! If you have ever worked in the industry, this is the question every newbie asks, "can't I just get access to the nationwide MLS?". ;-)

I am actually really honored to be speaking at the IDX Developer Summit this week. My buddies Chad and Jeff have done some good work, in a very difficult industry--I am proud of them.

See The Full Blog Post


Some Potentially Very Powerful API Orchestration With The Amazon API Gateway

I sat down for a second, more in-depth look at the Amazon API Gateway. When it was first released I took a stroll through the interface and documentation, but this time I got my hands dirty playing with the moving parts, and considering how the solution fits into the overall API deployment picture.

API Design Tools
As soon as you land on the Amazon API Gateway dashboard page, you can get to work adding APIs by defining endpoints, crafting specific resources (paths) and methods (verbs), and rounding off your resources with parameters, headers, and underlying data models. You can even map the custom sub-domain of your choosing to your Amazon API Gateway generated API, giving it exactly the base URL you need.

API Mapping Templates
One feature provided by the Amazon API Gateway that I find intriguing is the mapping templates. Using the data models and the mapping template tool, you can transform data from one schema to another. This is very interesting when you are thinking about evolving your own legacy APIs, but I'm also thinking it could come in real handy for mapping to public APIs, and demonstrating to clients what is possible with a next version, designed from the outside in--mapping is something I've wanted to see for some time now.
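
For a sense of what these look like, mapping templates are JSON peppered with Velocity template expressions that pull values out of the incoming request. A rough sketch--the field names are made up for illustration:

    {
      "id": "$input.params('id')",
      "fullName": $input.json('$.userName'),
      "source": "legacy-backend"
    }

Here $input.params() reads a request parameter and $input.json() plucks a value out of the request body, letting you reshape a legacy payload into the schema you actually want to expose.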

API Integration Types
Up until now, in this review, we have just been talking about designing APIs, and possibly mapping our data models together. There are many other ways you can gateway existing systems, databases, and other resources using Amazon API Gateway, but the one that seems to be getting the lion's share of the discussion is deploying APIs with Lambda functions as the back-end.

API Integration Using Lambda Functions
Lambda functions give you the ability to create, store, and manage Node.js and Java code snippets, and wire up these resources using the Amazon API Gateway. When you create your first Lambda function, you are given a small selection of blueprints, like a microservice or a database connection, and you can edit your code inline, upload a .zip file, or pull a .zip file from Amazon S3 (where is the Github love?).

Identity & Access Management (IAM)
The Amazon API Gateway gives you some pretty simple ways to secure your APIs using API keys, but then also gives you the whole AWS IAM platform, and its resources, to leverage as well. I think IAM will be more than many API providers need, but for those that need it, I can see this part of the gateway solution sealing the deal.

Scaling Lambda Functions Behind The API
Being scalable is one of the promises of a Lambda backed API deployed using Amazon API Gateway, which I can see being pretty alluring for devops focused teams. You can allocate to each Lambda function the memory it needs, and individually monitor and scale as needed. While I see the recent containerization movement taking care of 50% of API back-end needs, I can also see being able to quickly scale individual functions in the cloud taking care of the other 50%.

Events For Lambda Functions
Another powerful aspect of Lambda functions is that you can engineer them to respond to events. Using the interface, command line, or API, you can define one or many event sources for each Lambda function. Amazon provides some pretty interesting sources for triggering each Lambda function.

These event sources provide some pretty potent triggers for specific functions in your vast Lambda code library. You can rely on running code stored as Lambda functions using the API you deploy with Amazon API Gateway, and / or you can have your code run in response to the variety of events you define.

Beyond Lambda
When it comes to defining a back-end for the APIs you deploy using Amazon API Gateway, Lambda is just the beginning. Amazon provides three other really interesting ways to power APIs. I see a lot of potential in managing code using Lambda, and using it to scale the back-end of many APIs pretty quickly, but these areas provide some pretty interesting potential as well.

HTTP Proxy
A quick way to put Amazon API Gateway to use is as a proxy for an existing API. The potential in this area really opens up when you put mapping templates to work, transforming the methods, resources, and models. I haven't mapped it to any existing APIs yet, but will make sure to do so soon, to better understand the HTTP proxy potential.

Mock Integration
Another way to quickly deploy an API is to mock your integration, providing a quick API that can be hacked on, making sure an API will meet developers' needs. You may even want to mock an existing public API, rather than use a live resource while you are developing an application. There are many uses for mock integration.

AWS Service Proxy
The final way Amazon provides for you to power your API(s) is by proxying an existing AWS service. This opens up the entire AWS cloud stack for exposure as API resources, using the Amazon API Gateway. This reminds me of other existing API gateway solutions, except instead of targeting your on-premise, legacy infrastructure, it targets your more recent infrastructure in the cloud. I'm guessing this will incentivize many companies to migrate their legacy infrastructure into the cloud, or at least make it cloud friendly, so they can put the AWS service proxy to use--lots of possibilities here.

Defining The Stages Of Your Lifecycle
Going beyond the types of integration you can employ when crafting and deploying APIs using the Amazon API Gateway, the platform also provides a way to define the stages that APIs will exist in, from design, development, QA, and production, to any other stage you wish. I like the concept of having a stage defined for each API, designating where it exists in the API life-cycle. I tend to just have dev and prod, but this might make me consider the question a little more deeply, as it seems to be a big part of defining the API journey.

API Monitoring By Default
Amazon has built monitoring into the API Gateway, and Lambda functions, by default. You can connect APIs, and their designated back-end integrations, to CloudTrail, and monitor everything about your operations. CloudTrail is very much a cloud infrastructure logging solution rather than an API analytics solution, but I could see it evolve into something more than just monitoring and logging, providing an overall awareness of API consumption. Maybe an opportunity for the ecosystem to step in via the API(s).

A CLI And API For The API Gateway
You have to admit, Amazon gets interfaces, making sure every service on the platform has a command line interface as well as an API. This is where a lot of the API orchestration magic will come into play, in my opinion. The ability to automate every aspect of API design, deployment, management, and monitoring, across your whole stack, using an API, is the future.

There Are Some Limitations
There are some current limitations of the Amazon API Gateway. They limit things to 60 APIs maximum per AWS account, 300 resources maximum per API, 10 stages maximum per API, and 10-second timeout for both AWS Lambda and HTTP back-end integration. They are just getting going, so I'm sure they are just learning how people will be using the API deployment and management infrastructure in the cloud, and we'll see this evolve considerably.

What Will This Cost?
Lambda provides the first 1 million requests per month for free, and costs $0.20 per 1 million requests thereafter, or $0.0000002 per request. The Amazon API Gateway costs $3.50 per million API calls received, plus the cost of data transfer out, in gigabytes. It will be interesting to see what this costs at scale, but I'm sure overall it will be very inexpensive to operate, like other AWS services, and with time the cost will come down even further as they dial it all in.

AWS API Gateway Has Me Thinking
I won't be adopting AWS API Gateway right away--I'd prefer to watch it evolve some more--but overall I like where they are taking things. The ability to quickly deploy code with Lambda, and use blueprints to clone and deploy the code behind APIs, has a lot of potential. Most of my APIs are just simple code that either returns data from a database, or conducts some sort of programmatic function, making Lambda pretty attractive, especially when it comes to helping you scale and monitor everything by default.

My original criticism of the platform still stands. Amazon is courting the enterprise with this, providing the next generation of API gateway, in the cloud, for the legacy resources we have all accumulated--something that really doesn't help large companies sort through their technical debt, just lets them grow it, and manage it in the cloud. It is a win for AWS, so honestly it makes sense, even though it doesn't deliver the critical API life-cycle lessons the enterprise will need along the way to actually make change.

This is a reason I won't be getting hooked on Lambda + Amazon API Gateway anytime soon: I really don't want to be locked into their services. I'm a big fan of my platform employing common, open server tooling (Linux, Apache, NGINX, MySQL, PHP), and not relying on specialty solutions to make things efficient--I rely on my skills, experience, and knowledge of the resources I'm deploying to deliver efficiency at scale. My farm to table approach to deploying APIs keeps me in tune with my supply chain, something that may not work for everyone.

While the tooling I use may not be the most exciting, it is something I can move from AWS, and run anywhere. All of my APIs can easily be recreated on any hosting environment, and I can find skills to help me with this work almost anywhere in the world. After 25 years of managing infrastructure, I'm hyper-aware of lock-in, even the subtle moves that happen over time. However, my infrastructure is much smaller than many of the companies who will be attracted to AWS Lambda + API Gateway, which actually for me, is another big part of the API lesson and journey, but if you don't know this already, I'll keep it to myself.

I'd say AWS gives a healthy nod to the type of platform portability I'm looking for, with the ability to import and export your back-end code using Lambda, and the emerging ability to use API definitions like Swagger as part of Amazon API Gateway. These two things will play a positive role in the overall portability and interoperability of the platform, but the deeper connections made with other AWS services will be a lot harder to evolve away from if you ever have to migrate off AWS.

For now, I'll keep playing with Amazon API Gateway, because it definitely holds a lot of potential for some very powerful API orchestration, and while the platform may not work for me 100%, AWS is putting some really interesting concepts into play.

See The Full Blog Post


You Have 24 Hours Left To Submit Your Talk For APIStrat Austin

There is a little more than 24 hours left for you to submit your talk for APIStrat in Austin, TX, this November 18th, 19th, and 20th. With this sixth edition of APIStrat, we are taking things back to our roots, and not choosing a theme, but making it a conversation about the most important topics in the space facing API providers and consumers in 2015. 

From looking at the talks that have been submitted so far, API definitions, design, and Internet of Things seems to be leading the pack. We've also seen a couple session talk submissions that we think are probably more worthy of being keynotes, because they are just that good.

The APIStrat team feels like 400 people is the sweet spot when it comes to having a productive API discussion, so we chose a venue that fits this vision, which will most definitely sell out. Check out the sponsors who have already lined up, and we have only announced two keynotes (more coming this next week).

Make sure to submit your talk before tomorrow night, so that you are part of the conversation. Also, make sure you jump in and sponsor the event, as we are already looking at closing off one tier of sponsorship--contact us today if you want to get in on the action.

We'll see you in Austin! 

See The Full Blog Post


When Are We Going To Get A Save As JSON In Our Spreadsheets?

My last rant of the evening, I promise. Then I will shut up and move back to actual work instead of telling stories. I'm working on my Adopta.Agency project, processing a pretty robust spreadsheet of Department of Veterans Affairs expenditures by state. As I work to convert yet another spreadsheet to CSV, then to JSON, and publish it to Github, I can't help but think, "where is the 'save as JSON' in Microsoft or Google spreadsheets?"

I can easily write scripts to help me do this, but I'm trying to keep the process as close to what the average person, who will be adopting a government agency data set, will experience. I could build a tool that they could also use, but I really want to keep the tools required for the work as minimal as possible. 
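
For what it is worth, the script version of this is tiny, which is exactly why it is so frustrating the feature isn't built in. A minimal sketch in Python, with file names as placeholders for whatever data set you are adopting:

    import csv
    import json

    # Minimal sketch: turn a CSV export of a spreadsheet into JSON.
    # The file names are placeholders for the data set being adopted.
    with open("expenditures.csv") as csv_file:
        rows = list(csv.DictReader(csv_file))  # the header row becomes the keys

    with open("expenditures.json", "w") as json_file:
        json.dump(rows, json_file, indent=2)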

It would just be easier if Microsoft and Google would get with the program, and give us a built in feature for saving our spreadsheets as JSON.

See The Full Blog Post


State of Popular Database Platforms And Their Native API Publishing Features

I had a reminder on my task list to check in on where some of the common database platforms are when it comes to APIs. I think it was a Postgres announcement from a while back that put the thought in my notebook, but as an old database guy I tend to check in regularly on the platforms I have worked with most.

The point of this check-in is to see how far along each of the database platforms is when it comes to easy API deployment, directly from tables. The three relational database platforms I'm most familiar with are:

  • SQL Server - The platform has APIs for management, and you can deploy an OData service, as well as put .NET to work, but nothing really straightforward that would allow any developer to quickly expose a simple RESTful API.
  • PostgreSQL - I'd say PostgreSQL is furthest along with their "early draft proposal of an extension to PostgreSQL allowing clients to access the database using HTTP", as they have the most complete information about how to deploy an API.
  • MySQL - There was a writeup in InfoQ about MySQL offering a REST API, but from what I can tell it is still in MySQL Labs, without much movement or other stories I could find to show any next steps.

The database that drives my API platform is MySQL running via Amazon RDS. I haven't worked on Postgres for years, and I jumped ship on SQL Server a while back (my therapist says I cannot talk about it). I automate the generation of my APIs using Swagger and the Slim framework, then do the finish work, like polishing the endpoints to look less like their underlying database, and more like how they will actually be used.

Maybe database platforms shouldn't get into the API game, leaving API deployment to API gateway providers like SlashDB and DreamFactory? It just seems like really low hanging fruit for these widely used database solutions to make it dead simple for developers to expose and craft APIs from existing data sources. Something like the sketch below is all I am asking for.
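
To make concrete what "dead simple" could look like, here is a hypothetical sketch of a read-only endpoint generated straight from a single table. It uses Flask and SQLite purely for illustration (my own stack is Slim and MySQL), and the table and column names are made up:

    import sqlite3
    from flask import Flask, jsonify

    # Hypothetical sketch: expose one database table as a read-only JSON API.
    # Flask and SQLite are used for illustration; the table and column
    # names are invented.
    app = Flask(__name__)

    @app.route("/places")
    def list_places():
        conn = sqlite3.connect("example.db")
        conn.row_factory = sqlite3.Row  # rows behave like dictionaries
        rows = conn.execute("SELECT id, name, city FROM places LIMIT 100").fetchall()
        conn.close()
        return jsonify(places=[dict(row) for row in rows])

    if __name__ == "__main__":
        app.run()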

If you are using any database-to-API solutions for SQL Server, PostgreSQL, or MySQL, please let me know.

See The Full Blog Post


Cultivating the Next Generation of API Design Skills

There was a pretty interesting conversation around API design going on in one of my API Slack channels over the last couple days, about what API design is, and what is needed to make developers, and even non-developers, more aware of the best practices that exist. It is a private group, so I won't broadcast it as part of this post, but I did want to extract a narrative from it, to help me set a bar for other API design discussions I am having.

The Restafarian, hypermedia, and linked data folks have long been frustrated by the slow adoption of sensible API design practices across the sector, and basic HTTP illiteracy among developers and non-developers remains at dangerously high levels. The good news is some of this is beginning to change, but we still have so much work to do, something that won't be easy, and unfortunately it won't always have the clean outcomes leaders in the space are looking for.

APIs returning JSON are just the next step in the evolution of the web, and when you consider how much work it took to get everyone on board with HTML, and building web pages, then web apps, and recently mobile apps, you can begin to understand the work that still lies ahead. We have to help the next generation of developers be more HTTP literate (something previous generations of developers aren't), and possess a baseline knowledge of common API design best practices. This needs to be done in a world where many of these developers really aren't going to care about the specifics of good API design like us API architects and early pioneers have.

The average API designer in the future will not always be willing to argue about the nuance of how to craft URLs, whether to use a header or parameter, caching, and how to version. They just want to get the outcome they seek, accomplish their project, and get paid for their work. Consider the solar industry as a quick comparison: the first wave of installers were domain experts, while the following waves of installers, focused on scaling and maintaining the industry, will only be trained on what is required to get the job done in a stable, profitable way.

Ok. So how do we do this right? I feel like we are already on a good path. We just need you to publish your own API design guide somewhere we can all learn from it, like the other leading API providers already present in my API design research. As we build a critical mass of these, we need to also work to aggregate the best practices across them, so that instructors and textbook publishers can incorporate them into their curriculum. If you have an API platform, and have ever wished that there were more highly skilled API designers out there, make sure you have your API design practices documented, and shared with the world.

This will get healthy API design practices out of the trenches of startups, SMBs, enterprises, and government agencies, and into the educational institutions around the world. Then we can start equipping the next generation of programmers with the knowledge they will need to be successful in delivering the resources needed for the next generation of Internet-powered apps, networks, and devices.

I want to add one more thing. API service companies who are looking to provide tooling that API providers can use to deploy APIs will have to share in the load here. This is core to my criticism of the AWS API Gateway: I applaud their use of HAL, but please make sure you also provide a healthy dose of hypermedia literacy along the way, and don't just hide it behind the curtain. I really do not want to see another FrontPage for APIs, so if you are building an API editor, let me know so I can provide you with some ideas.

We all have a lot of work to do in preparing the next generation of developers, and business users, when it comes to a baseline of HTTP literacy, as well as a healthy dose of API awareness. We are going to need an army of API designers to help us deliver on the API economy we are all seeing in our heads--so let's get to work. If you do not have a formal API design strategy, get to work on one (let me know if you need help). If you have one, please share it, so I can add it to my API design research, for others' reference.

See The Full Blog Post


Catching My Breath On My API Monetization Ramblings Before I Enter Into Some New Conversations

I have two more conversations kicking off on the topic of API monetization, so I just needed to take a moment to gather up the last wave of posts on the subject, catch my breath, and refresh my overall thoughts in the area. What I really like about this latest wave is that it is about providing much-needed funding for some potentially very important API-driven resources. Another thing is that these are pretty complicated, unproven approaches to monetizing APIs--breaking ground!!

Over the last couple of weeks, I have been engaged in four specific conversations that have shifted my gaze to the area of API monetization:

  • Audiosear.ch - Talking with the PopupArchive team about making money around podcast search APIs.
  • Department of Interior - Providing feedback on the Recreation Information Database (RIDB) API initiative.
  • Caltech Wormbase - Helping organize a grant effort to fund the next generation of research, from Wormbase, and other scientific databases.
  • HDScores - Mapping out how HDScores is funding the efforts around aggregating restaurant inspection data into a single, clean API.

As I think through the approaches above, I'm pushed to exercise what I can from these discussions on my own infrastructure:

  • My API Monetization - As I look to add more APIs to my stack, I'm being forced to clearly define all the moving parts of my API monetization strategy.
  • Blockchain Delusions - While thinking again about my API pricing and credit layer, I'm left thinking about how the blockchain can be applied to API monetization.

The API Evangelist network is my research notebook. I search, re-read, and refine all the thoughts curated and published here. It helps me aggregate areas of my research, especially the fast-moving areas where I am receiving the most requests for assistance. Not only does it refresh my memory of what the hell I've written in the last couple of weeks, I also hope it gives you a nice executive summary in case you missed anything.

If you are looking for assistance in developing your API monetization strategy, or have your own stories you'd like to share, let me know. If you have any feedback on my stories, including feedback for the folks I'm talking to, as well as items missing from my own API monetization approach, or blockchain delusions--let me know!

See The Full Blog Post


Algorithmic Transparency With Containers and APIs

I believe in the ability of APIs to pull back the curtain of the great Oz that we call IT. The average business and individual technology consumer has long been asked to just believe in the magic behind the tech we use, putting the control into the hands of those who are in the know. This is something that has begun to thaw, with the introduction of the Internet, and the usage of web APIs to drive applications of all shapes and sizes.

It isn't just that we are poking little holes into the corporate and government firewall to drive the next generation of applications, it is also that a handful of API pioneers like Amazon, Flickr, Twitter, Twilio, and others saw the potential of making these exposed resources available to any developer. The pulling back of the curtain was done via these two acts: exposing resources using the Internet, and inviting in 3rd parties to learn about, and tap into, these resources.

Something that is present throughout this evolution of software development is trust. API providers have to have a certain amount of trust that developers will respect their terms of service, and API consumers have to have a certain amount of trust that they can depend on API providers. To achieve this, there needs to be a healthy dose of transparency present, so API providers can see what consumers are doing with their resources, and API consumers can see into the operations, and roadmap, of the platform.

When transparency and trust do not exist, this is when the impact of APIs begins to break down and they become simply another tech tool. If a platform is up to no good, has ill intentions, is selling vaporware, or there is corruption behind the scenes, the API concept is just going to create problems, for both provider and consumer. How much is exposed via an API interface is up to the API designer, architect, and ultimately the governing organization.

There are many varying motivations behind why companies open up APIs, and the reasons they make them public or not. APIs allow companies to keep control over their data, content, and algorithmic resources, while also opening them up so "innovation" can occur, or so they are simply accessible by 3rd party resources, bypassing the historical friction or bottleneck that is IT and developer groups. Some companies I work with are aware of this balance being struck, while many others are not aware at all--they are simply trying to make innovation happen, or provide access to resources.

As I spend some brain cycles pondering algorithmic transparency, and the recent concept of "surge pricing" used by technology providers like Uber and Gogo, I am looking to understand how APIs can help pull back the curtain that is in front of many algorithms impacting our lives, in the same way APIs have pulled back the curtains on traditional IT operations, and software development. As part of this thought exercise I'm thinking about the role Docker and other virtualized containers can play in providing us with more transparency in how algorithms are making decisions around us.

When I deploy one of my APIs using my microservices model, it has two distinct API layers: one for the container, and one for what runs inside of it. Docker comes ready to go with an API for all aspects of its operations--here is a Swagger definition of it. What if all algorithms came with an API by default, just like each Docker container does? We would put each algorithm into a container, and it would have an interface for every aspect of its operation. The API wouldn't expose the actual inner workings of the algorithm, and its calculations, but would provide a complete interface for all its functionality.
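
As a quick illustration, here is a minimal sketch of talking to that Docker API--listing containers over the local Docker socket. The socket path is the common default, and this assumes a PHP build whose cURL supports Unix sockets:

    <?php
    // List containers via the Docker Engine API over the local Unix socket.
    $ch = curl_init('http://localhost/containers/json');
    curl_setopt($ch, CURLOPT_UNIX_SOCKET_PATH, '/var/run/docker.sock');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

    $containers = json_decode(curl_exec($ch), true);
    curl_close($ch);

    foreach ($containers as $container) {
        // Each entry exposes the image, status, and more--an interface for the
        // wrapper, without revealing anything about what runs inside it.
        echo $container['Id'] . ' (' . $container['Image'] . '): ' . $container['Status'] . "\n";
    }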

How much of this API a company wishes to expose would vary, just like with any API, but companies who care about the trust balance between them, their developers, and end-users could offer a certain amount of transparency to build trust. The API wouldn't give away the proprietary algorithm, but would give 3rd party groups a way to test assumptions, and verify the promises made around what an algorithm delivers, thus pulling back the curtain. With no API, we have to trust Uber, Gogo, and other providers about what goes into their surge pricing. With an API, 3rd party regulators, and potentially any individual, could run tests, validating what is being presented as algorithmic truth.

I know many companies, entrepreneurs, and IT folks will dismiss this as bullshit. I'm used to that. Most of them don't follow my beliefs around the balance between the tech, business, and politics of APIs, as well as the balance between platform, developer, end-users, and what I consider to be an inevitable collision with government regulation. For now this is just a thought exercise, but it is something I will be studying, and pondering more, and as I gather more examples of algorithmic, or "surge", pricing, I will work to evolve these thoughts, and solidify them into something more presentable.

See The Full Blog Post


Expanding On My API Monetization Strategy And Research

This is a full walk-through of me trying to distill down my approach to API monetization, in a way that can be applied across not just 30 APIs, but potentially 300, or 3,000. There are several things converging for me right now, which include the maturing of my own infrastructure, as well as conversations I'm having with startups, enterprise groups, federal government agencies, and my own partner(s).

I need to keep hammering on this to help me operate my own infrastructure, but I am also partnering with APIWare to help me deliver on much of the API design, deployment, and management, so I need to have a good handle on my costs. As with all of my areas of research, within the area of API monetization I am just trying to get a handle on the common building blocks, and provide a checklist of considerations to be employed when I'm planning and operating my API infrastructure.

To help me build a base, let's walk through some of the building blocks of my own API monetization strategy.

Acquisition
What do I have invested in any single API? Even if I am building something from scratch, what went into it? Every API I possess has some sort of acquisition cost, even if it is just $14.00 for the two pints of beer I bought while I was brainstorming the idea.

  • Discover - What did I spend to find this? I may have had to buy someone dinner or beer to find it, as well as spend time on the Internet searching, and connecting the dots.
  • Negotiate - What time do I have invested in actually getting access to something? Most of the time it happens on the Internet, and other times it requires travel, and meeting with folks.
  • Licensing - There is a chance I would license a database from a company or institution, so I want to have this option in here. Even if this is open source, I want the license referenced as part of acquisition.
  • Purchase - There is also the chance I may buy a database from someone outright, or pay them to put the database together, resulting in a one-time fee, which I'm going to call "purchase".

Having a framework for thinking about the acquisition of each API resource I possess makes it easier to think things through when I am brainstorming new API ideas. It makes sure I am tracking all the details from the moment of inception, to when I commit to actually making a resource available via an API on my platform.

Development
What does it actually take to stand up an API? There are a lot of moving parts in making an API happen, and not all of them are technical. Am I willing to invest the time necessary to stand up an API, or will it require outside investment, as well as resources? What is needed to take an API from acquisition to actual operation?

  • Investment - Who put up the money to support the development of this API resource? Was it internal, or did we have to take external investment?
  • Grant - Was the development of this API rolled up in a larger grant, or was there a grant specifically for its development, covering the costs involved?
  • Normalization - What does it take me to clean up and normalize a dataset, or across content? This is usually the busy janitorial work necessary.
  • Design - What does it take me to generate a Swagger and API Blueprint definition, something that isn't just auto-generated, but also has the hand polish it requires (a bare-bones example follows this list)?
  • Database - How much work am I putting into setting up the database? A lot of this I can automate, but there is always a setup cost involved.
  • Server - Defining the amount of work I put into setting up, and configuring the server to run a new API, including where it goes in my overall operations plan.
  • Coding - How much work do I put into actually coding an API? I use the Slim PHP framework, and with automation scripts I can usually generate 75% of it, but there is always finish work.
  • DNS - What was the overhead in defining, and configuring the DNS for any API, setting up the endpoint domain, as well as potentially a portal to house operations?
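
To illustrate the design step referenced above, here is a bare-bones sketch of the kind of Swagger 2.0 definition that gets auto-generated and then hand-polished--the API name and path are hypothetical:

    swagger: "2.0"
    info:
      title: Notes API
      version: "1.0.0"
    host: api.example.com
    paths:
      /notes/{id}:
        get:
          summary: Retrieve a single note
          parameters:
            - name: id
              in: path
              required: true
              type: integer
          responses:
            "200":
              description: A single note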

Historically when it came to APIs, I just dove into writing code with little consideration for what went into it. I'd say one by-product of the microservices way of thinking is that I have decoupled the moving parts of each of my APIs, allowing me to approach development in this way. I'm sure I will keep slicing off other elements within the development process as I progress.

Operation
What goes into keeping an API operational, reliable, and available? How much do I spend on all aspects of an existing API's lifecycle to make sure it meets the standards of API consumers? Ideally, operational costs go down the more efficient the platform gets with overall operations, reducing overhead, and streamlining across everything.

  • Definition - How many resources am I applying to creating and maintaining APIs.json, Swagger, and API Blueprint definitions for my APIs?
  • Compute - What percentage of my AWS compute is dedicated to an API? A flat percentage of the server it runs on, until usage history exists.
  • Storage - How much on-disk storage am I using to operate an API? This could fluctuate from month to month, and exponentially increase for some.
  • Bandwidth - How much bandwidth in / out is an API using to get the job done?
  • Management - What percentage of API management resources is dedicated to the API? A flat percentage of API management overhead, until usage history exists.
  • Code - What does it cost me to maintain my code samples, libraries, and SDKs for each individual API, or possibly across multiple APIs and endpoints?
  • Evangelism - How much energy do I put into evangelizing any single API? Did I write a blog post, or am I buying Twitter or Google Ads? How is the word getting out?
  • Monitoring - What percentage of the API monitoring, testing, and performance service budget is dedicated to this API? How large is the surface area for monitoring?
  • Security - What does it cost me to secure a single API, as part of the larger overall operations? Do internal resources spend time on it, or is this a 3rd party service?
  • Virtualization - What am I spending on virtualization for an API, as part of the QA process, for retail sandbox and simulation environments, or for specific partner needs?

Ideally, the more APIs you operate, the more detail you will get about each of these areas, and in some of them you should get better deals as more volume runs through--compute and storage costs going down as we do more business, for example. The more we understand the details of operations, the more we can optimize them.
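
To make the flat-percentage allocation above concrete, here is a rough sketch--all of the dollar figures are made up for illustration:

    <?php
    // Allocate shared monthly infrastructure costs across APIs, using a flat
    // share per API until real usage history exists. Figures are hypothetical.
    $monthlyCosts = array(
        'compute'    => 150.00,
        'storage'    => 40.00,
        'bandwidth'  => 25.00,
        'management' => 60.00,
    );
    $apiCount = 30;

    foreach ($monthlyCosts as $area => $cost) {
        printf("%s: $%.2f per API per month\n", $area, $cost / $apiCount);
    }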

Access Levels
What sort of access levels are we going to provide across ALL APIs? Not all APIs will use all areas, but we should be ready for as many scenarios as we possibly can. We need to be clear about what access the free layer includes, as well as the tiers of access, and any wholesale, partner, or re-seller aspects.

  • Free (unlimited) - This is just a free API, and I won't be rate limiting its usage. It will act much like any website I put out there, but instead of HTML it is JSON.
  • Free Trial - I am only going to offer limited use, or a limited time period, for accessing a resource, giving just a taste test, but it won't be in the main pool of APIs available.
  • Not For Profit - This API is being subsidized somehow. Either there is direct investment from internal or external resources to subsidize it, or there is a grant involved.
  • Educational Access - Is this API available as an educational resource, with special pricing for students and teachers? This will usually be reflected in the tier, and credit availability.
  • Tier(s) - Which of these service tiers is an API available in, and which endpoint paths + verbs are accessible in the tier (api-pricing definition)?
    • Public - General access, where you usually don't even need a key. Only limited to specific APIs.
    • Retail - This is the pay-as-you-go level for public access to all APIs. This is where the retail side of business operations will occur.
    • Trusted - These are just a handful of trusted individuals or companies, who may have write access to some endpoints.
    • Education - Providing a specific access tier for education partners, including students, teachers, and institutions. Providing higher levels of free access, and lower price points.
    • Partner - These are partners I have prearranged agreements with, something I will be transparent about, showcasing them on a partner page.
    • Wholesale - The wholesale, often non-rate-limited portion of my platform, where I deploy APIs in other people's infrastructure, or our own, for flat fees.
    • Platform - This is all internal access by applications I build for my own usage. I still rate limit, and manage this access, I just give myself different privileges.
  • Partner Program - A structured program allowing API consumers to achieve higher levels of access, with reduced pricing levels, flat rate access, and other benefits.
  • Reseller Program - A structured program for allowing API consumers to prove themselves, and share in revenues from API usage, affiliate links, and revenue share.

My intent around access levels is to be as transparent as possible. Not all users will access at all levels, and not all APIs, and their endpoints, will be available at all access levels. The goal is to optimize access, and remain as open as makes sense, while also sensibly monetizing resources to cover costs, and make a fair profit.

Pricing & Credits
I am employing a universal credit system that will be used by all APIs. The goal is to expand the units of currency I employ beyond just API calls, and attach a universal unit of value that can be applied across all APIs. API consumers will be given a certain amount of API credits to be used each day, as well as be able to buy and sell credits at various rates.

  • API Value - Each API will have its own credit rate set, where some will be one credit, while others may be 100 credits to make a single call; it can be defined by API, or by a specific endpoint.
  • Daily Limit - The daily allowed credit limit will be determined by the access level tier a consumer is registered at, starting with daily free public access, up through retail, trusted, and potentially custom tiers.
  • Usage - How many credits does any one user use during a day, week, or month, across all APIs? When each API is used, it will apply the defined credit value for the single API call.
  • Incentive - How can the platform give credits as an incentive for use, or even pay credits for writing to certain APIs, enriching the system, or driving traffic?
  • Purchase - What does it cost to buy a credit, something that could fluctuate from day to day, week to week, or month to month?
  • Buyout - Allow API consumers to get paid for the credits on their account. Preferably all users are encouraged to spend credits, but a buyout is an option.
  • Discounts - Can we give discounts when you buy credits through specific channels, promotions, or other types of planned deals?
  • Volume - Are there volume discounts for buying credits, allowing consumers to purchase credits in bulk when they need to, and apply them when they desire?
  • Applying - Can you wait to apply credits you have accumulated? You are given the option with each billing cycle to apply them, or you may want to wait and use them at a future date.

I envision credits being the lifeblood of the API monetization strategy for my platform, and would love to see the approach spread beyond any single API ecosystem, becoming something that all providers could put to work. The benefits would be seen by both API providers and consumers, helping us establish a currency for the API economy.
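
For a sense of how this plays out in my service composition layer, here is a hedged sketch of metering a single API call against a consumer's daily credit limit. The rates, tiers, and function are illustrative, not my production code:

    <?php
    // Hypothetical per-API credit rates, and per-tier daily credit limits.
    $creditRates = array('blog' => 1, 'images' => 5, 'search' => 100);
    $dailyLimits = array('public' => 100, 'retail' => 10000, 'trusted' => 100000);

    // Deduct the API's credit rate from the account's daily allowance.
    function meterCall(&$account, $api, $creditRates, $dailyLimits)
    {
        $cost  = isset($creditRates[$api]) ? $creditRates[$api] : 1;
        $limit = $dailyLimits[$account['tier']];
        if ($account['usedToday'] + $cost > $limit) {
            return false; // over the daily limit--bill per credit, or reject
        }
        $account['usedToday'] += $cost;
        return true;
    }

    $account = array('tier' => 'retail', 'usedToday' => 9950);
    var_dump(meterCall($account, 'search', $creditRates, $dailyLimits)); // bool(false)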

Indirect Value Generation
What value is generated via API operations that isn't directly monetized, but drives value in other ways? These indirect value generators are often overlooked, and under-showcased, areas of operation, often resulting in API failure--always showcase the buzz.

  • Marketing Vehicle - Having an API is cool these days, and some APIs are worth just having for the PR value, and extending the overall brand of the platform.
  • Web or Mobile Traffic - The API exists solely for distributing links to web and mobile applications, driving traffic to specific properties - is this tracked as part of other analytics?
  • Brand Awareness - Applying a brand strategy, and using the API to incentivize publishers to extend the reach of the brand and ultimately the platform - can we quantify this?
  • Data & Content Acquisition - Using the API, and the applications built on top as part of a larger data and content acquisition strategy--can we quantify this?

I could see data and content acquisition grow into an area we can better quantify soon--putting a base value on each resource in the system, and figuring out how much each resource grows in size, and quality, over time. Applying value to these indirect areas is something I'd like to expand upon in future iterations.

Partner Revenue Generation
Ideally any platform should be sharing the revenue and value exhaust generated via the ecosystem, providing revenue opportunities for web, and mobile application developers. There are a handful of ways revenue can be shared via API operations.

  • Link Affiliate - What revenue is generated and paid out via links that are made available via the API, with potentially external affiliate links embedded.
  • Revenue Share - What portion of API revenue is paid out to potential re-sellers who drive API usage. Revenue is a percentage of overall credit purchases / usage.
  • Credits to Bill - All revenue is paid in credits on account, and the user can decide to buy out of their credits at any time, or possibly use them in other aspects of system operation.

I will be expanding on these areas in the future, as I play with ways to incentivize content or data creation, or just driving API consumption well into the paid tiers. Right now many API platforms I study are essentially sharecropping plantations, reaping the value generated from developer activity. In the future, developers should be incentivized with cash and credit to help achieve platform monetization goals, which is something I want to explore via my own API resources when I have the bandwidth.

Internal Revenue Generation
Where are we making money? What revenue is generated across the platform, and then what are the breakdowns. I want to understand who my rockstar users and applications are, something that isn't isolated to external users. I am looking to craft all of my applications as individual citizens within the API ecosystem, measuring and limiting what type of access they have, and treat them like any other consumer on the platform.

  • Monthly - How much revenue is being brought in on a monthly basis for an API and all of its endpoints.
  • Users - How much revenue is being brought in on a monthly basis for a specific user, for an API and all of its endpoints.
  • Applications - How much revenue is being brought in on a monthly basis for a specific application, for an API and all of its endpoints.
  • Tiers - Which tiers generate the most usage and revenue? This should apply just as easily to aspects of platform / internal usage as well.
  • Affiliate Revenue - What was generated from affiliate links made available via APIs, minus what percentage was paid out to API consumers.
  • Advertising Revenue - What advertising revenue was derived from web or mobile application traffic resulting from the API, minus whatever was paid out as rev share to API consumers.

The goal of my platform is not simply to make money. Sure, I like making money, but I'm looking to flesh out a reproducible framework to hang each API on, and make sense of it as part of my larger API platform operations. Not all APIs will be created equally, but I should be able to equally measure what it costs to develop and operate each one, and apply consistent ways of generating revenue around its use.

All of this looks intimidating when you scroll back through. However, my goal is to produce a standardized pricing page that can exist across all of my API ecosystem(s), which are growing in number, and prompting me to think in this way. I need a better handle on my costs, and ultimately to be able to generate more revenue to keep a roof over my head, food on the table, and my AWS bill paid.

While I only have a single API portal right now, I'm preparing to deploy a specific collection using APIs.json, and publish it as version 2.0 of my API Evangelist developer portal. I'm also looking to immediately publish a few other API portals, designed to support various collections or stacks of APIs available in my network (images, API definitions, etc.). I need a standard way to deliver on-boarding and pricing for the APIs, and this backend framework gives me the v1 approach to that.
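
For anyone unfamiliar with the format, here is a stripped-down sketch of what one of those APIs.json collections might look like--the names and URLs are placeholders, not my actual portals:

    {
      "name": "Example API Collection",
      "description": "A collection of APIs published as part of my network.",
      "url": "http://example.com/apis.json",
      "specificationVersion": "0.14",
      "apis": [
        {
          "name": "Images API",
          "description": "Search and retrieve images across my network.",
          "humanURL": "http://images.example.com",
          "baseURL": "http://api.example.com/images",
          "properties": [
            { "type": "Swagger", "url": "http://example.com/images/swagger.json" }
          ]
        }
      ]
    }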

Each API that I launch will have a pricing page, with each of the available service tiers as a box, and within each box it will list how many free credits you get each day, along with other available features, like the per-credit rate beyond the daily allowed limit, support elements, and other relevant details for that tier. There should also be links to more detail about the partner, re-seller, and wholesale options for each API portal I launch. The API consumer never sees all of this. This framework is for me to hang each API upon, and think through it in the context of the overall API lifecycle and platform operations.

I'm applying this outline to the 30 APIs I have in my stack, and then also applying it to a handful of new data APIs I'm working on. Along the way I will flesh it out a little more, before I get to work on some of the more advanced pieces, like partner and re-seller programs. I'm not a big fan of advertising, but I do have some single page apps that perform pretty well, and it wouldn't be too intrusive to have some advertising on them. All of these SPAs are driven by my APIs, and they often exist as tools across my API driven content network as well.

This post will be published to my API monetization research, and this list will be published as common building blocks that can be considered as part of any API monetization strategy. It makes me happy to see this portion of my research finally move forward, and evolve, especially since it's based upon my own platform goals, as well as my wider monitoring and review of the space.

See The Full Blog Post


A Blockchain To Act As A Universal API Credit Layer That Can Be Used By Both API Provider And Consumer

I have had this discussion several times now, in the dark corners of bars in San Francisco, Paris, Berlin, and Barcelona. It is something I just want to make sure is published on my site, as part of my latest expansion of my API monetization research. I'm rolling out a standardized credit system as part of my API operations, and using my API service composition layer to apply credit usage as part of each API call. 

My objective is to make it as easy as possible to buy, and even sell, credits via my platform, and easily apply those credits across any API I publish, in a variety of access tiers. As I think through this, I can't help but start thinking about how this could play out on a larger scale. Meaning, how do we get API providers to adopt a universal API credit system, so you could buy and sell credits anywhere, use them on any API platform, and transfer them between platforms?

Every time I think about this topic, I end up at the blockchain. It just makes sense that we would use blockchain technology to establish a currency that could be used as fuel for the API economy. The blockchain ledger could be used to manage API credit exchanges, but also to store other relevant details that could impact API driven transactions in real-time. The most important piece is interoperability, and the fact that you could use it in any API ecosystem.

An API world where you could generate credits on one platform, and transfer them to buy API calls on another platform, opens up some pretty interesting API driven scenarios for me. Not many platforms pay developers to use APIs; most are just free, or charge per API call. There isn't much incentive for API providers to shell out cash to incentivize API consumers, but if you could pay in credits that were transferable, it might change the dynamics.

Honestly, I am not very knowledgeable on the blockchain. It is something I've only recently started educating myself about. I'm also still mapping out my own API monetization strategy, and gathering my thoughts on an API credit system that is based upon my needs, but one that is also rooted in my monitoring of common API platforms. I have a lot to learn about the blockchain, and have numerous details to work out as part of my larger API monetization strategy, but I wanted to at least put this out there, and make sure it is part of my API monetization conversations, and storytelling, in the future.

See The Full Blog Post


Learn From The Google Maps API And Just Have A Standard Approach To Free And Paid Tiers For Your API From The Beginning

It has been almost 10 years since Google launched the Maps API. With as many APIs as Google has, you'd think they'd have a better handle on a standard approach to pricing across all of them. As we've seen with Google Translate, they have struggled with the right way of pricing APIs, as well as communicating this to their developer ecosystem.

This week Google took further steps down the road to standardizing how you pay for the Google Maps API, opening up pay-as-you-go purchasing via the Google Developer Console. Here is how they put it:

In this new purchasing structure, the Google Maps Geocoding, Directions, Distance Matrix, Roads, Geolocation, Elevation, and Time Zone APIs remain free of charge for the first 2,500 requests per day, and developers may now simply pay $0.50 USD per 1,000 additional requests up to 100,000 requests per API per day. Developers requiring over 100,000 requests per day should contact us to purchase a premium licence.
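
Running the numbers makes the structure clear. A quick sketch, assuming billing is prorated per request rather than rounded to 1,000-request blocks:

    <?php
    // Daily cost under the quoted structure: first 2,500 requests free,
    // $0.50 per 1,000 additional requests, up to 100,000 per API per day.
    function dailyCost($requests)
    {
        $billable = max(0, min($requests, 100000) - 2500);
        return ($billable / 1000) * 0.50; // assumes prorated billing
    }

    echo dailyCost(10000) . "\n";  // 7,500 billable requests  => 3.75
    echo dailyCost(100000) . "\n"; // 97,500 billable requests => 48.75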

It is good to see Google run with some pretty standard pricing, in line with what other API providers are putting to work. Just like Google worked to standardize OAuth 2.0 across their APIs, and got their approach to API management organized via the developer console, it looks like they are now working hard to get their API pricing in order.

I still see many API providers struggle with API pricing, both large and small. Many of the larger API providers do what Google did, and offer a free access level, and then once they need to generate revenue they shut things down, or tighten things up, rather than just offering a clear path to paying for resources. This approach has fueled the debate around whether you can depend on APIs, and also whether or not a freemium approach to APIs will work at all.

Another common stumbling block I see API providers trip over when it comes to pricing is a disconnect between each level of access--meaning there is a freemium layer, and there are paid tiers of access, but there are no even steps between them, making it impossible for many developers to traverse. This generally makes a clear statement: it is ok to play with our API, but you really need to be the enterprise to actually do business with us.

This is all something that can be mitigated with a clear, pay-as-you-go, pay-for-what-you-use API model, with a free tier of access, sensibly tiered paid levels of access, and a robust, transparent partner program for higher levels of need. There are plenty of models out there to follow when crafting your own API monetization strategy, so make sure you have something in place from day one, even if you are tweaking it along the way like Google has had to do.

See The Full Blog Post