{"API Evangelist"}

A Fun Way To Explore HTTP Status Codes With A Subway Map From Restlet

If you were at @defrag or @apistrat in November, you know that I am working to make sense of the often complex world of APIs using the subway map concept. My goal is to better understand the overall API life cycle, as well as the life cycle of individual APIs, and how I can articulate, strategize, and execute on it all using a subway experience.

It made me happy to see the folks over at Restlet playing with the same concept (we honestly didn't coordinate on it) to help articulate HTTP status codes--a very important topic for the space, and one we need more educational tools and stories around. Using the subway map analogy, Restlet provides a representation of the five classes of status codes, giving you a simple way to explore them, and find a description of each individual status code.

The subway map they provide is currently static, but this is one of the biggest potential areas I see in using this analogy--with the right JavaScript + JSON voodoo, you can make it real-time and interactive. This is something I'm working on this month, to bring the entire API life cycle to the forefront.
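
To give a sense of what I mean, here is a minimal sketch of how the underlying data might be structured so a map could highlight stops as live requests come through--the interfaces and values are my own illustration, not anything Restlet publishes:

```typescript
// A hypothetical shape for status code "subway map" data -- one line per
// status code class, each with its stops.
interface Stop {
  code: number;        // e.g. 404
  name: string;        // e.g. "Not Found"
  description: string; // shown when a rider "exits" at the stop
}

interface Line {
  class: string; // "2xx Success", "4xx Client Error", etc.
  color: string; // color used to render the line on the map
  stops: Stop[];
}

const map: Line[] = [
  {
    class: "2xx Success",
    color: "#2ecc71",
    stops: [
      { code: 200, name: "OK", description: "The request has succeeded." },
      { code: 201, name: "Created", description: "A new resource has been created." },
    ],
  },
  {
    class: "4xx Client Error",
    color: "#e74c3c",
    stops: [
      { code: 404, name: "Not Found", description: "The server can't find the requested resource." },
    ],
  },
];

// Find the line and stop for a given status code, so the map can light it
// up when a live request comes through.
function locate(code: number): { line: Line; stop: Stop } | undefined {
  for (const line of map) {
    const stop = line.stops.find((s) => s.code === code);
    if (stop) return { line, stop };
  }
  return undefined;
}

console.log(locate(404)?.stop.name); // "Not Found"
```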

Nice work Restlet team, I enjoy these efforts by providers to help educate the space, especially when they do it in creative and fun ways like this.

Disclosure: Restlet is an API Evangelist partner.

See The Full Blog Post


75% Of Your API Efforts In The Enterprise Will Be Cultural And Political, Not Technical

I started API Evangelist because I saw a huge deficiency in the overall API conversation--nobody was talking about the business of all of this, and how you actually make money doing this emerging web API thing. Over time, I also discovered that very few people were studying and discussing the politics of APIs. Sure, when something flares up around terms of service violations, or there is an acquisition the community dislikes, we discuss it, but we have to talk about these political issues in real-time, not at polarize-time.

From my vantage point, the business and politics of API operations, internal as well as external influences, continue to be the number one thing that negatively impacts API operations, over anything technical. Let's look at a couple of examples of this in action:

  • Money - How are you going to operate an API without any money, whether from paying customers, or internally from other sources?
  • Doers - If your business has been built on traditional sales and support models, evolving consumers to a doer mentality will not be easy.
  • Competing Interests - Various groups within an organization compete for budget and attention, and regular day-to-day operations will impact your API vision in countless other ways.
  • Management Change - Your CTO had your back, and the new one not so much. It could be about lower level or mid-management support, all the way to the top. When the champions leave, API programs often wither on the vine.
  • Industry Regulation - Silicon Valley loves to be in denial of the larger world, and the regulatory frameworks that are in place across many industries--once you grow big enough, or operate in the right space, this will become clearer.
  • Scope - How big are things? Software? Teams? Systems? Processes? There are many things that are big in the enterprise, and even in small businesses, that will resist decoupling and unbundling. People just do not think of things in small, bite-size chunks--it just isn't what they've been taught.
  • You - Your vision might not be a fit for a business, organization, or industry, and you may be coming to the table with unrealistic expectations of what APIs can do.

These are just a handful of the common business and political roadblocks you will encounter doing APIs. I do not care how solid your API design and deployment strategy might be--when you come up against the currents listed above, the tech does not always win. I've seen API effort after API effort fail because of these common challenges that we all face.

For us technologists, it can be easy to sit in our corners and craft a vision of the perfect world we want, but often it is oil in a water-filled world. I am not saying your API vision can't make change, but you have to be equipped to understand that 75% of your efforts in the enterprise, organizations, and government agencies will be about culture, politics, and business--not the tech.

See The Full Blog Post


API Client, Tool, Garage, Hub, Playground, Studio, Workbench, And Builders

I am spending some time taking another look at my "client research", which started out as being just about Postman and PAW, but now contains ten separate services I am bundling into this area of research. As with all my research areas, these project repos shift, evolve, split, and merge over time, as the API space changes, and my awareness of it grows.

I completely understand that the term "client" doesn't provide an adequate label for this bucket of research, but for now, it will have to do. As I added a couple of new services to the bucket, and made my way through some of the existing ones I had, I wanted to step back and look at what they were offering--but more importantly, the messaging that went into quantifying what these companies were offering.

When it comes to what I call "lines along the API lifecycle", I saw these areas represented.

This is where the API client line potentially intersects with all of these other API life-cycle lines. However, when you start to analyze the features or building blocks offered by these service providers, you begin to see each stop along the API client line, which becomes pretty critical to other areas of the API life-cycle.

I know that what I am saying might not be completely clear--it isn't for me either. That is why I tell stories, to try and find the patterns, and learn how to articulate all the moving parts. I'm still trying to figure out what to call my research, alongside all of these API service providers who are working to define exactly what it is they are selling as well.

The more time I spend with my API client research, the more all of this comes into focus. The problem is that these companies are rapidly adding new features, in response to what their customers need, which keeps me on my toes, as well as increases the overlap with other lines that I track along the API life-cycle.

I just wanted to take a moment, update my research, and take another look at the companies, and tooling at play.

See The Full Blog Post


What Is API Service Composition?

This is another one of those topics I talk a lot about, but have only found a few examples of me talking about on the blog--API service composition. If you aren't familiar with the concept, it is the art of taking digital resources (aka APIs), and mixing and matching them in different ways, until you find the right approach to delivering APIs--one that provides value for both provider and consumer.

API service composition is about taking the basic building blocks of any web API--the URL, path, and verbs (the ability to get, add, update, and delete)--and putting them into as many different configurations as you think make sense. Limiting who can access, and how much they can use, restricting by time frame, crafting different pricing for different users or groups, requiring trial periods or setup costs, and even restricting access to specific countries. API service composition is all about taking your APIs, and if you are using modern API infrastructure like that from 3Scale, you can compose any service you desire, serving it up to anyone, in a secure way, using the open Internet.
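
To make this concrete, here is a rough sketch of service composition expressed as data--the field names are mine, for illustration, and not from 3Scale or any other vendor:

```typescript
// An illustrative sketch of service composition as data -- the same
// underlying API resource, composed into two different plans.
interface Plan {
  name: string;
  methods: Array<"GET" | "POST" | "PUT" | "DELETE">; // which verbs this plan can use
  rateLimit: { requests: number; per: "minute" | "hour" | "day" };
  pricePerMonth: number;       // 0 = freemium tier
  trialDays?: number;          // optional trial period
  allowedCountries?: string[]; // optional geographic restriction
}

const plans: Plan[] = [
  // Read-only and rate-limited for the public...
  { name: "free", methods: ["GET"], rateLimit: { requests: 1000, per: "day" }, pricePerMonth: 0 },
  // ...full CRUD, higher limits, and a trial period for paying partners.
  {
    name: "partner",
    methods: ["GET", "POST", "PUT", "DELETE"],
    rateLimit: { requests: 100, per: "minute" },
    pricePerMonth: 99,
    trialDays: 14,
  },
];

// A composition check the gateway would run on every request.
function canCall(plan: Plan, method: Plan["methods"][number]): boolean {
  return plan.methods.includes(method);
}

console.log(canCall(plans[0], "DELETE")); // false -- the free tier is read-only
```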

This is the magic of modern API-driven solutions, where an API can be any digital resource, from simple data and content, to media like images, audio, and video, all the way to an increasing number of devices like fitness trackers, thermostats in our homes, and even our cars. When you have all of these resources at your fingertips, and the ability to compose them into any possible package, to satisfy any group of consumers--you have unlimited potential.

This is why so many companies, organizations, institutions, and government agencies are jumping on the API train. When you do APIs right, you gain a new level of control over the increasing amount of digital resources that are dominating our lives. API service composition is the key to all of this, and is one of the parts of the journey I enjoy the most--composing meaningful API services that offer value to consumers, and have the potential to change how businesses, industries, and even governments can work.

See The Full Blog Post


Automating API Key Management Using API Service Provider APIs, And Other Open Source Solutions

I'm working my way through some of the low hanging fruit in my API notebook when it comes to stories, and found a story thread I was working on regarding automating API key management. I am personally storing my keys across the private master branches of my API repos, because I don't have any super-sensitive data, and it helps me manage keys across hundreds of APIs in a central way.

I've talked about the need to automate API key management in the past--with the number of APIs we are all using, to reach the level of security we will need, this lower level of keys will need a global refresh and management process. This level of keys most likely won't ever result in large scale security breaches, but it will cause plenty of headaches for both API providers and consumers.

If you use one of the following API management solutions, they provide you with an API for managing your API keys:

This will help you manage your keys if you are an API provider, but doesn't do a lot to help you manage your API keys across providers, as an API consumer. Amazon provides a key management solution, but at first glance it appears to be something you can use to manage keys across your AWS infrastructure (am I wrong?)--which makes sense for supporting AWS objectives. ;-)

When I wrote my last post on the growing need for API key management solutions, I received a number of emails and DMs, which yielded two pretty interesting open source key management solutions, Vault and Keywhiz. I'm going to evaluate these solutions for their viability as a back-end for an API driven, API key management solution, but I have a lot more work to do.
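
As a first taste of what I'm after, here is a minimal sketch of storing and retrieving a provider's keys using Vault's HTTP API--it assumes a Vault server with the KV secrets engine mounted at secret/, and a token in the environment; check the Vault docs for the current paths before trusting my memory:

```typescript
// A minimal sketch of using Vault's HTTP API as a backend for API key
// storage -- assumes a Vault server with the KV secrets engine at secret/,
// and VAULT_ADDR / VAULT_TOKEN set in the environment.
const VAULT_ADDR = process.env.VAULT_ADDR ?? "http://127.0.0.1:8200";
const VAULT_TOKEN = process.env.VAULT_TOKEN ?? "";

// Store the key and secret for one of the APIs I consume.
async function storeKey(provider: string, key: string, secret: string): Promise<void> {
  await fetch(`${VAULT_ADDR}/v1/secret/apis/${provider}`, {
    method: "POST",
    headers: { "X-Vault-Token": VAULT_TOKEN, "Content-Type": "application/json" },
    body: JSON.stringify({ key, secret }),
  });
}

// Read it back whenever a script needs to call that provider.
async function readKey(provider: string): Promise<{ key: string; secret: string }> {
  const res = await fetch(`${VAULT_ADDR}/v1/secret/apis/${provider}`, {
    headers: { "X-Vault-Token": VAULT_TOKEN },
  });
  const body = await res.json();
  return body.data; // the KV engine returns the stored fields under "data"
}

// storeKey("twitter", "my-consumer-key", "my-consumer-secret");
```

With something like this in place, a global key refresh becomes a loop over providers, rather than a hunt through hundreds of repos.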

I'm also working with a partner of mine, SecureDB, to consider the possibility of developing an encrypted API key management solution, which would then be accessible via their secure API. They are looking for interesting solutions like this to be developed on their platform, so if you are a developer looking for a viable micro startup idea--let me know.

As with everything in my world, the concept of automating API keys using APIs, and managing of keys across API platforms, is a work in progress--stay tuned!

See The Full Blog Post


What Licensing Should I Be Considering When I Take Open Source Software And Offer Up As An API?

I've done this a couple of times now. I took PhantomJS and created my Screenshot API, and used ImageMagick to create my Image Manipulation API. These are two openly licensed software solutions, which I took, and am offering up as APIs.

What are my licensing considerations? If I keep my server side code licensed according to the specifications, am I fine? PhantomJS is licensed under BSD, and ImageMagick is Apache 2.0. Does the licensing extend itself to the commercial services I would potentially offer via an API interface? There are lots of questions to answer before I move forward--I guess I am looking for a precedent.

I am evaluating a number of openly licensed software solutions right now to deliver a variety of stops along the API life-cycle, ranging from design, to deployment, virtualization, and much more. As always, I am trying to better understand the layers involved, and how software licenses, patents, and potentially copyright might apply.

Just putting it out there to the universe, and curious to see what comes back.

See The Full Blog Post


Using The Wikimedia Objective Revision Evaluation Service To Move Beyond Just GET With Your API

I stumbled across the Objective Revision Evaluation Service (ORES) last night, a web service running in Wikimedia Labs that provides machine learning as a service across Wikimedia projects. It is designed to help automate the detection and removal of content vandalism, and is being developed as part of the R:Revision scoring as a service project.

As I came across it, I was also considering different access plans across my APIs, with some of the plans allowing for updating existing content in the system--so the topic of abuse of API access was on my mind. I'm curious whether ORES could be applied to any sort of content or data posted via a PUT / PATCH API request.

Even if it didn't work 100% out of the repo, maybe it could be evolved to help manage the PUT / PATCH layer of API operations, allowing platforms to open up a little bit more, exposing more of their HTTP verbs to a wider audience. Something like this could go a long way toward helping API providers secure and stabilize their API operations, and loosen their service composition restrictions a little further. A rough sketch of what I'm imagining is below.
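
This is purely a thought sketch--the scoring endpoint and response shape are hypothetical stand-ins, not the actual ORES interface:

```typescript
// Gate PUT / PATCH requests behind a revision-scoring service before
// persisting them. The URL and response shape below are hypothetical.
async function scoreEdit(original: string, proposed: string): Promise<number> {
  const res = await fetch("https://scoring.example.com/v1/damaging", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ original, proposed }),
  });
  const { probability } = await res.json(); // 0..1 likelihood the edit is vandalism
  return probability;
}

// Inside a PATCH handler: only apply edits that score below a threshold,
// everything else goes to a human moderation queue.
async function handlePatch(current: string, incoming: string): Promise<string> {
  const damaging = await scoreEdit(current, incoming);
  if (damaging > 0.8) {
    throw new Error("Edit flagged for review"); // e.g. respond 202 and queue it
  }
  return incoming; // safe to persist
}
```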

Just a thought, as I'm kicking the tires of some of the open source API offerings I come across. It seems to me there is an opportunity for someone to deploy these open solutions as a service, and help API providers open up a little more. Just sharing with my audience, as a possible service that would benefit the API space, and hopefully make someone a little beer money--who knows!

See The Full Blog Post


Realizing I Need Hypermedia To Bring My API Lifecycle Vision To Life

I have been learning about hypermedia for three years now, and only earlier this year did I begin playing with Siren to help me craft a better experience around my API industry news and link curation API. My motivations in going down this hypermedia road were never about easing my client side pain, or helping me with future versions of my API--I am just not that sophisticated of an operation.

I started playing with hypermedia to help evolve the experience around the API news I was curating each week, making it so you could browse week by week, and month by month, but also by topic, company, author, etc. I'm still trying to figure it all out, and honestly the project is currently in the ditch, after hitting the wall this fall, and not really giving a shit about the flow of API news. (I am better now, thx!)

Now in December, I'm turning my attention to my building block API, which provides access to over 600 of the common patterns I've tracked on across the API space. These are the features offered by API service providers, and the competitive advantages brought to the table by the successful API providers I keep an eye on, and they are all potential stops along the API life-cycle I am working to define.

My API building block API is a pretty standard content API, providing each element broken down by category and type, with other supporting details. However, now I need to be able to plot them on a subway map, with an endless number of configurations, and dimensions to the journey. Via my building block API, I need to return any single stop along the API life-cycle, but along with it I need to provide the next stop in the line, and the previous stop (paging 101)--and then if I hit a transfer station, or other element, I need to offer an unlimited number of dimensions.

While I am concerned with the behavior of my clients, these aren't the normal hypermedia-focused arguments I usually hear. I need to be able to deliver a subway-like transport experience for over 600 stops along the API life-cycle, for any possible API you can imagine. A simple API design just isn't going to cut it--I need a request and response model that can dynamically deliver the API life-cycle transport experience that I am looking to extend to my users.
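
To show where my head is at, here is a rough draft of what a single stop might look like in Siren--the property names and link relations are just my working draft, nothing standardized beyond Siren itself:

```typescript
// A rough Siren representation of one stop along an API life-cycle line --
// next / prev give the paging, and transfer links open up the extra
// dimensions at a transfer station.
const stop = {
  class: ["stop"],
  properties: {
    name: "API Definitions",
    line: "design",
    category: "lifecycle",
  },
  links: [
    { rel: ["self"], href: "/lines/design/stops/api-definitions" },
    { rel: ["next"], href: "/lines/design/stops/prototyping" }, // next stop on the line
    { rel: ["prev"], href: "/lines/design/stops/ideation" },    // previous stop (paging 101)
    // A transfer station can expose any number of additional dimensions:
    { rel: ["transfer"], href: "/lines/deployment/stops/api-definitions" },
    { rel: ["transfer"], href: "/lines/documentation/stops/api-definitions" },
  ],
};
```

The client never has to know the shape of the whole map--it just follows whichever link relations each stop offers.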

I wish I had thought this out more, when I first got started designing my building block API. ;-(

See The Full Blog Post


API Economy Tooling For The Business Masses

Most of the tooling and services I come across in the API space are designed for developers. As I play with more services, and put tools to work, trying to understand their role in the API space, some take a lot of work to figure out, while others are pretty clear--and it will be these clearer tools and services that the masses of business users adopt, as part of the API evolution that is currently occurring online.

There are just a handful of services out there right now that I think are mainstream ready, and are something the average business user of the web should be playing with right now--here are three of them:

  • Restlet - Deploy an API from a Google Spreadsheet.
  • Blockspring - Use APIs in a Google Spreadsheet.
  • Form.io - Deploy an API as a form, via a single page app.

There are other services and tools out there that will help you deploy and consume APIs, but these stand out, because they help the average business user solve real world business problems they are facing, without needing to write code. They also anchor their solutions in existing tooling that is ubiquitous, and part of everyday business operations--the spreadsheet and the form.

Developers might care about the technical details of APIs, and evangelists like me might be able to convince some of the average business users of the potential of APIs, but before we can bring the masses on-board, we will need the right tools. Restlet, Blockspring, and Form.io have the potential to be these tools, and help the "normals" become active participants in the API economy, but more importantly find the (API driven) solutions they need for the problems they face every day.

See The Full Blog Post


The Growing Need For API Virtualization Solutions

This conversation has come up over ten times this month, at Defrag, at APIStrat, and in online conversations via Skype and Google Hangouts: the concept of API virtualization solutions. I am not talking about virtualization in the cloud computing and Docker sense, although that will play a fundamental role in the solutions I am trying to articulate. What I am focusing on is providing sandboxes, QA environments, and other specialized virtualization solutions for API providers, that make API consumers' worlds much easier.

I've touched on examples of this in the wild, with my earlier post on the API sandbox and simulator from Carvoyant, which is an example of the need for virtualization solutions tailored for the connected automobile. Think of this, but for every aspect of the fast growing Internet of Things space. The IoT demand is about the future opportunity; I've talked about the current need before, when I published I Wish All APIs Had Sandbox Environment By Default.

I envision 1/3 of these solutions being about deploying Docker containers on demand, 1/3 being about virtualizing the API using common API definitions, and the final 1/3 being about the data provided with the solutions. This is where I think the right team(s) could develop some pretty unique skills when it comes to delivering specialized simulations tailored for testing home, automobile, agriculture, industrial, transportation, and other unique API solutions.
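
As a taste of that second third, here is a bare-bones sketch of virtualizing an API--serving canned responses for each path, so consumers get a sandbox without ever touching production. The paths and payloads are illustrative, loosely inspired by the connected car example:

```typescript
import { createServer } from "http";

// A bare-bones API virtualization sketch: serve simulated responses for
// each path in an API, so QA and sandbox traffic never hits production.
const virtualized: Record<string, unknown> = {
  "/cars/123/fuel": { level: 0.62, units: "fraction" },  // simulated vehicle data
  "/cars/123/location": { lat: 45.52, lon: -122.68 },
};

createServer((req, res) => {
  const body = virtualized[req.url ?? ""];
  if (!body) {
    res.writeHead(404, { "Content-Type": "application/json" });
    res.end(JSON.stringify({ error: "no simulation for this path" }));
    return;
  }
  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify(body));
}).listen(3000); // point your QA environment at http://localhost:3000
```

A real offering would generate this map from a common API definition, and layer in realistic, time-varying data--that is where the specialized skills come in.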

We are going to need "virtualized" versions of our APIs, whether it is for web, mobile, devices, or just for managing APIs throughout their life cycle. You can see a handful of the current API virtualization solutions out there in my research on this area, but I predict the demand for more robust, and specialized API virtualization solutions is going to dramatically increase as the space continues its rapid expansion. I just wanted to put it out there, and encourage all y'all to think more about this area, and push forward the concept of API virtualization as you are building all the bitch'n tooling you are working on.

See The Full Blog Post


Freemium Access For Your API Is Not Bad, It Is Just One Tool In Your Provider's Toolbox

I regularly get links sent to me, and folks telling me that freemium API access is a bad idea. That it doesn't help your API sales funnel, and was something that was just a thing for a brief moment in time. The folks who always bring me these stories are not API consumers, they are business folks.

I agree, there are some really bad examples of freemium in play across the API space, and with the bad behavior we see from API consumers, I fully understand why these stories make their rounds--leaving a bad taste in the mouths of API providers when it comes to freemium access.

The disconnect that allows this to happen, in my opinion, is folks not thinking through their monetization, plans, and pricing in an API centric way. Folks treat it as an across-the-board monetization approach, and do not think of it as a tool to apply on a resource by resource basis.

Freemium is just one tool in your toolbox, and should be evaluated at every step in the monetization planning for your APIs, and used, or not used, based upon your well planned on-boarding funnel for your API consumers.

Consider educational access to your API resources (also one tool in your toolbox), which is a common offering from API providers. You wouldn't offer this in all scenarios--it just doesn't make sense--but you also wouldn't just dismiss it as a bad idea; you keep it in your toolbox for when it makes sense.

I see API operations led by business folks dismiss freemium as a bad idea because they don't have experience with how it influences the on-boarding process, but I also see API operations led by developers dismiss the need for traditional sales approaches, and hurt themselves in similar ways.

Ultimately, I recommend iterating through the list of common API monetization building blocks I've aggregated from leading API providers, for each resource you are planning to release. If you have a modern API infrastructure provider like 3Scale, you will be able to define individual plans, using a variety of units of value, at different rates, rooted in who you are targeting with your API resources--make sure you have as many tools as possible in your toolbox, and don't throw any away.

See The Full Blog Post


Evolving My API Stack To Be A Public Repo For Sharing API Discovery, Monitoring, And Rating Information

My API Stack began as a news site, and evolved into a directory of the APIs that I monitor in the space. I published APIs.json indexes for the almost 1,000 companies I am tracking on, with almost 400 OADF files for some of the APIs I've profiled in more detail. My mission around the project so far has been to create an open source, machine readable repo for the API space.

I have had two recent occurrences that are pushing me to expand on my API Stack work. First, I have other entities who want to contribute monitoring data and other elements I would like to see collected, but haven't had time for. The other is that I have started spidering the URLs of the API portals I track on, and need a central place to store the indexes, so that others can access them.

Ultimately, I'd like to see the API Stack act as a public repo, where anyone can grab the data they need to discover, evaluate, integrate with, and stay in tune with what APIs are doing, or not doing. In addition to finding OADF, API Blueprint, and RAML files by crawling and indexing API portals, and publishing them in a public repo, I want to build out the other building blocks that I index with APIs.json, like pricing, and TOS changes, and potentially make monitoring, testing, and performance data available.
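
For those who haven't seen one, this is roughly what a single APIs.json index in the repo looks like--the URLs are placeholders, but the fields follow the APIs.json format, with X- extensions for the extra building blocks I want to index:

```typescript
// Roughly one APIs.json index entry in the API Stack repo -- URLs here
// are placeholder values, not real endpoints.
const index = {
  name: "Example Company",
  description: "APIs.json index for one of the companies tracked in the API Stack.",
  url: "https://example.com/apis.json",
  apis: [
    {
      name: "Example API",
      humanURL: "https://example.com/developers", // the portal a human visits
      baseURL: "https://api.example.com/v1",      // where the API actually lives
      properties: [
        { type: "Swagger", url: "https://example.com/swagger.json" },
        // X- extensions for the extra building blocks worth indexing:
        { type: "X-pricing", url: "https://example.com/pricing" },
        { type: "X-terms-of-service", url: "https://example.com/tos" },
      ],
    },
  ],
};
```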

Next, I will publish some pricing, monitoring, and portal site crawl indexes to the repo for some of the top APIs out there, and start playing with the best way to store the JSON, and other files, and provide an easy way to explore and play with the data. If you have any data that you are collecting and would like to contribute, or have a specific need you'd like to see tracked on, let me know, and I'll add it to the road map.

My goal is to go for quality and completeness of data there, before I look to scale, and expand the quantity of information and tooling available. Let me know if you have any thoughts or feedback.

See The Full Blog Post


Each Of My APIs Has Its Own Github Repo With Two Branches, One Private And One Public

While it will never be 100% complete or perfect, I finally have my API stack organized in a way that lets me add, evolve, scale, and deprecate endpoints as I need. I've been centralizing all of my APIs underneath a single Github organization, with each API as a single repository.

Each API has from one to around 50 endpoints, existing in its own repo, with the master branch kept private, and the gh-pages branch acting as the public face of the API (aka the portal). This approach gives me a standardized way to manage my fast growing API stack, in a way where I know how to find anything, anytime.

While the API itself runs on a series of AWS EC2 instances, the private repository is the blueprint for its operation, with server side code, database backups, and details of its overall configuration. The public, Github Pages branch of the API acts as the public portal, with docs, client code, an OADF file, and of course an APIs.json file.

Within my master API Github organization I have 33 separate API repos, with a single master public page, which acts as the central index of all of my APIs--a single portal with its own APIs.json file. I use this page to navigate to all my APIs, and I rely on Github heavily to manage the public and private sides of my API operations (emphasis on public).

Now that I have my house somewhat in order, when I'm looking to start a new API, I simply create a new repo, add an APIs.json and OADF file, and begin crafting the API resource that I need. Every other stop along the API life cycle, from deployment to deprecation, uses this central Github repo, and its API definitions, as its central source of truth and operations.
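
Since every new API starts the same way, this setup is scriptable--here is a sketch of bootstrapping a new API repo with the Github API; the org name and file contents are placeholders, you need a token with repo scope, and it assumes the gh-pages branch already exists:

```typescript
// Sketch: bootstrap a new API repo via the Github API. Placeholder org and
// content; requires a personal access token with repo scope.
const GH = "https://api.github.com";
const token = process.env.GITHUB_TOKEN ?? "";
const headers = {
  Authorization: `token ${token}`,
  "Content-Type": "application/json",
};

async function bootstrapApi(org: string, name: string): Promise<void> {
  // 1. Create the repo, keeping the master branch private.
  await fetch(`${GH}/orgs/${org}/repos`, {
    method: "POST",
    headers,
    body: JSON.stringify({ name, private: true }),
  });

  // 2. Seed the gh-pages branch (assumed to exist) with an APIs.json file,
  //    which becomes the public portal's index.
  const apisJson = JSON.stringify({ name, apis: [] }, null, 2);
  await fetch(`${GH}/repos/${org}/${name}/contents/apis.json`, {
    method: "PUT",
    headers,
    body: JSON.stringify({
      message: "Add APIs.json index",
      content: Buffer.from(apisJson).toString("base64"), // contents API wants base64
      branch: "gh-pages",
    }),
  });
}

// bootstrapApi("my-api-org", "screenshot-api");
```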

I still have a lot of housekeeping left to do to get things in order, but I can't help myself, and I keep adding new APIs. I'm hoping I can keep automating my API life cycle using this approach, allowing me to tackle a growing API stack, while staying a one person operation.

See The Full Blog Post


Deploying An API From Your Critical Twitter Data Without Being A Programmer

I am continuing my series on helping non-developers realize they can publish, and put APIs to work, without having an API expert in their pocket, using Zapier, Google Sheets, and Restlet. It's no secret that Restlet is an API Evangelist partner, and they are my partner because they are the easiest way to deploy a web API--something I am trying to help the "normals" understand the untapped potential of.

As I finish up another @APIStrat conference, for the sixth time, I'm reminded that I need to harvest the essential Twitter exhaust from the conference right away. You see, Twitter limits what you can grab from the Twitter API by either time, or number of Tweets, so I can only gather so many Tweets, and if I wait too long, I'm often out of luck. It is critical that I do this immediately, and depending on how loud the Twitter exhaust is, I may need to do it while the event is still going on.

To start, you obviously need a Twitter account, but you will also need Zapier, Google, and Restlet accounts. Then, using Zapier, you can gather the following Twitter data points, and send them to a Google Sheet:

  • New Mentions
  • New Followers
  • New Favorites
  • New Tweets

How you store these in Google Sheets is up to you, but I recommend breaking each data point down into its own sheet, and even by separate time frames like day, week, or month. It helps to keep things broken up into smaller chunks when you are using Google Sheets as a data store--tricks of the trade!

Zapier gives you everything you need to set up the Zaps that route mentions, followers, favorites, and your tweets to the Google Sheet(s), and once they are there, you can clean up, edit, and organize as you see fit. Next, Restlet comes into the picture, allowing you to deploy an "entity store" from your Google Spreadsheet. Once you have your entity store connected to the Google Spreadsheet, using their simple wizard, you can then deploy it as a web API--Restlet walks you through the entire process, no code necessary.

What you do with your API is up to you, but the important part for me, is I have the references to the Tweets and followers from Twitter that are important to me. When I need to reference the mentions for an event back in 2013, I can do it. When I need to create a special outreach list to anyone who Tweeted at our most recent event, I can do it. 

Simple API solutions, using Zapier, Google Sheets, and Restlet, are empowering to the average business and organizational user who is looking to just get their work done, and track on what is important to them--but in an API-centric way, which will then allow the information to be used in web and mobile apps, as well as spreadsheets, widgets, and other API driven goodies.

See The Full Blog Post


When Intelligent Programmers Realize They Do Not Understand HTTP And The Web That They Use Daily

I've been seeing something at an ever increasing pace lately: situations where very intelligent software engineers hit a wall, and realize they do not understand the fundamental building blocks of HTTP, and the web that we are all using daily. It makes my heart ache, because I remember when I found myself in the same place, and I still suffer from the deprivation I experienced.

Whether it was my time programming in Microsoft-land, Drupal, WordPress, or any other Web 2.0-land, I eventually realized how much was hidden from me behind the curtain. When you bundle this with the fact that I'm a fairly privileged white male software engineer who can be fairly clueless about what is around me, I missed a lot. It is something I still find myself recovering from in 2015, even after five years of exclusively studying web APIs--you know, the ones that use HTTP for transport?

I'm not stupid, but with my lazerfocus ™, I often miss a lot. I feel that many of the smart people I know suffer from a similar illness, but do not have the fortitude for transparency, or the humility to accept it, which would allow them to move on. Programming language dogma, and platform or framework dependencies, can be a powerful thing, but they can also do a lot to protect us from what we need to learn to actually be successful, and grow.

At this point, I question everything. What I know. What a platform might be hiding from me, to sell back to me as a feature. And what is the bigger picture of the Internet that I use every day, and often take for granted.

See The Full Blog Post


Making Sure Everything You Offer As An API Service Provider Is Portable

Runscope added the ability to import and export your API tests as JSON, helping make API monitoring a much more flexible and portable thing. You can import and export using the Runscope UI, as well as import via their API, helping you automate the setup of your API tests.

I judge API service providers based upon whether they have an API, and I am increasingly encouraging my readers to do the same. I will also be studying the portability of the services that are being sold to API providers, and pushing for more import / export features like the ones Runscope offers.

I will be tracking on the API definitions used by service providers like Runscope, across the 26+ areas I monitor, and trying to better understand the JSON schemas they are using to encourage the portability of their services. The API definition, response, and request models put forth by companies tell a lot about the services they offer, in my opinion, and I'd say the overall portability of services is another strong characteristic I will be keeping an eye out for.
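
Here is the portability test I want to be able to run against any API service provider, sketched with hypothetical endpoints--substitute whatever import / export API the provider actually documents, as Runscope does:

```typescript
// The portability test for any API service provider, with hypothetical
// endpoint paths standing in for the provider's real import / export API.
async function exportTests(baseUrl: string, token: string): Promise<unknown[]> {
  const res = await fetch(`${baseUrl}/tests/export`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  return res.json(); // my test definitions, as portable JSON
}

async function importTests(baseUrl: string, token: string, tests: unknown[]): Promise<void> {
  await fetch(`${baseUrl}/tests/import`, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify(tests),
  });
}

// If I can round-trip my definitions between accounts (or providers),
// the service passes the portability test.
```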

See The Full Blog Post


I Like Being Able To Verify A Developer Is Real Before Giving Them Access to My APIs

As I think about the bad behavior that occurs on the API consumption side of API operations, I'm considering ways that I can help API providers address these problems when they arise within their ecosystems. What can you do when bad actors have access to your APIs? Also more critically for some providers, what can you do to prevent bad actors from on-boarding with your API program at all?

I strongly believe that companies should be as public with their API efforts as possible, but when it comes to which developers you let in, and which ones you don't, I'm finding I'm becoming more conservative in my thoughts--as long as you are transparent about the process. I'm still forming all of my thoughts around this (hence the blog post), and I'm sure this is something that will keep evolving as I continue to push forward my awareness of the API space.

When I see a new sign-up for my own APIs, I like to be able to verify who the new consumer is. I like to see a real name, and potentially a company name, but also, when I Google the combination, I like to see an active Twitter, LinkedIn, or Github account. It is easy to tell real people from personas that live in the shadows, and I prefer that verifiable people use my APIs.
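
A lightweight version of the check I do by hand could even be automated--this sketch uses Github's public users endpoint, which really exists, but the activity thresholds are just my own judgment call:

```typescript
// Does a new sign-up's Github account look real and active? Uses Github's
// public users endpoint; the thresholds below are arbitrary, mine alone.
async function looksReal(username: string): Promise<boolean> {
  const res = await fetch(`https://api.github.com/users/${username}`);
  if (!res.ok) return false; // no account at all

  const user = await res.json();
  const ageDays = (Date.now() - new Date(user.created_at).getTime()) / 86_400_000;

  // Arbitrary bar: account older than 90 days with some public footprint.
  return ageDays > 90 && (user.public_repos > 0 || user.followers > 0);
}

// looksReal("kinlane").then(console.log); // true, one would hope
```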

If you are a public API consumer, I do not think it is unreasonable to ask you to maintain some sort of public presence, to verify who you are, and what you do. I know for many enterprise developers this is insanity, which is why I put LinkedIn profiles in the mix--I do not expect everyone to be super popular on Twitter, or a die-hard Github user. However, in 2015, you really should consider it!

As I'm going through my own API on-boarding process, trying to make it smoother (it isn't the best right now), I am considering how I will articulate what behavior I expect of my API consumers--in plain English. This post is just part of my iteration, putting my thoughts out to the universe, getting feedback when I can, but ultimately knowing that I am the sole decision maker when it comes to setting the tone in my own API community.

What tone are you setting? Do you verify your API consumers? What is the bar you set for a place at the table?

See The Full Blog Post


How Do I Price My API Resources?

I am continuing to push my research around API monetization, plans, and partners forward, while preparing for my API lifecycle keynote at @Defrag and @APIStrat. Along the way, I am also exercising some of my API pricing and planning strategies with my partner in crime at APIware, as we think through some new products that we are developing.

I approach pricing for API products on a resource by resource basis, considering all of the API monetization tools in my toolbox. For example, when I launch two new API endpoints--for exploding API definitions, and telling me how big my APIs are--I quickly run down my list of monetization building blocks to see which areas will apply to these new resources. The reality is, until these APIs leave the alpha stage, they will just exist in my internal consumption tier.

Even though these APIs aren't ready for prime time yet, I am already thinking through which of my API monetization building blocks will apply. Will I charge for access? Are they available in the freemium tier? Do I offer only a trial version of them? Ultimately, it all comes down to who I will be targeting with these resources, and in this case it will be primarily hardcore API architects--something that helps me define how I will price these resources, and which service tiers I will make them available in.

When it comes to generating revenue from APIs, rarely is there a one size fits all solution. You should be considering each resource based upon the value it brings to the table, who you are targeting with the resource, and weighing all the tools in a modern API monetization toolbox. Remember, the business of APIs resembles the technology of APIs, where you do things in small chunks, and you iterate, until you find the optimal value proposition that works for the platform, and your consumers.

See The Full Blog Post


Bridging How We Currently Document Our APIs Now With How We Should Be Experiencing APIs Via Hypermedia

I am still catching up on my feeds, and open browser tabs, and one tab that has been open for a couple of weeks is Why Your Colleagues Still Don’t Understand Hypermedia APIs, by Luke Stokes (@lukestokes) of FoxyCart. The post is very thought provoking for me, and represents what I feel is the very pragmatic front of the hypermedia movement, from someone who has helped move the concept of a hypermedia API from academic discussion to reality, with the FoxyCart API.

His challenges at the end of his post really set the stage for me:

So how do we find a balance between idealism about what Hypermedia API documentation systems “should” be and what they practically are? How can we move the whole ecosystem forward by encouraging client developers to code to link relationships instead of hard-coded URLs? How do we help pave the way for the future but not look like unsophisticated outsiders in the process? What pragmatic steps should we take to be like the other cool kids using standard documentation presentations while at the same time saying, “Um, yeah, don’t do that. Don’t code to the URL, code to the link relationship instead.”

For me, his questions illuminate the canyon between where the API community currently is with API design, and the vision of where we should be going with our API design practices. The more I play with, and learn about hypermedia, the less I see it as a lofty vision for the future, and the more I see it as a set of practical design patterns that will make my API work more meaningful.

The shortcoming of API definitions like Swagger (now OADF) is that they were designed for documenting your API after it was already designed--awesome, but not remotely about API design. The concept of hypermedia sits all the way back in the other direction, where you have to put some serious thought into your API design before you ever roll up your sleeves to write code--there is a big gap between these worlds.

Jakub, Z, and the Apiary.io team have done some amazing work to bring the mainstream conversation closer towards API design, much more than the hypermedia folks had ever done--until recently. What Luke asks reflects the work we have left to bridge the divide, and emphasizes that it has to be something that scales. This is why Swagger was successful: it provided the section of the bridge that API architects needed to solve an immediate problem that was on the table, at scale--documentation.

The problem is, the masses are pretty short-sighted, and often unwilling to do the heavy learning and lifting needed to "code to link relationships, instead of hard-coded URLs". Unfortunately, we will need more tooling to help us build more sections of the bridge (beyond Swagger), as we won't be able to build it all at once. I think the important thing with this tooling is that we need to make sure, with each section of the bridge that we build, we are also educating API architects, and client developers, around HTTP, media types, link relations, and other empowering hypermedia concepts.
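
The difference Luke is pointing at fits in a few lines of code--resolve the URL from a link relation at runtime, instead of hard-coding it. The response shapes here are a generic hypermedia sketch, not any one media type:

```typescript
// Coding to link relationships instead of hard-coded URLs, in miniature.
interface Link { rel: string[]; href: string; }
interface Resource { links: Link[]; [key: string]: unknown; }

async function follow(resource: Resource, rel: string): Promise<Resource> {
  const link = resource.links.find((l) => l.rel.includes(rel));
  if (!link) throw new Error(`no "${rel}" link relation offered`);
  const res = await fetch(link.href);
  return res.json();
}

// Hard-coded (brittle): fetch("https://api.example.com/orders?page=2")
// Link-driven (durable): the server can move URLs without breaking clients.
async function nextPage(orders: Resource): Promise<Resource> {
  return follow(orders, "next");
}
```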

Anyways, Luke got me pumped with his questions. I do not know what the answers are either, but I want to lend a hand to help figure it out, and I will be asking these questions over and over in my head, showcasing the solutions that y'all are working on, and telling stories here on the blog, until we make some progress.

See The Full Blog Post


APIs Dedicated To Elections At the City, County, State, Or Federal Level

I'm neck deep in government open data again, and as we are gearing up for the presidential election, you really begin to see the potential for accurate, real-time election data via APIs. There are a number of leading election-related APIs at the federal level, like we have from the Sunlight Foundation, and you see the emergence of high value APIs out of government, like the Federal Election Commission (FEC) API from 18F, but with the amount of money in politics, and the scope of what is at stake, I can't help but feel there is a huge opportunity out there for more election APIs.

It seems to me that there is an opportunity for some API savvy activistpreneur to step up at the city, county, state, and even the federal level. I'm not just talking about election districts, candidates, and the other common building blocks of the elections we experience, but real-time sentiment, super PAC spending, and other influential information that, if available for distribution via APIs, could shift the balance one way or another. It would take a healthy assessment of the current landscape, but I think you could identify some low hanging fruit to get started with.

Once you get going on the API journey around election data, if you do it right, you will learn a lot, discover other valuable content and data sources, and see patterns that the rest of us might not see. The use of social platforms over the last 8+ years of elections was all API driven, and is just the tip of the iceberg--an API dedicated to elections seems like a huge opportunity to me, in the coming months and years.

Let's get to work!

See The Full Blog Post