API Evangelist

The Work It Takes To Connect And Keep Up With Each Valuable API I Come Across

If I find a new company during my research, it takes me a minimum of about 15 minutes to lightly profile what they are up to, and add them to my system, and if I do a full profile, it can take several hours. All of this adds up to a lot of work, time that would be better spent on actually writing stories, hacking on the APIs, and publishing the longer form research papers on what I’m seeing.

In an ideal world, each of these APIs would have an APIs.json in the root of their domain, with details about the company and people responsible for API operations, and as much information about each API as possible. With a single click, I could import the entire profile of a company, and its valuable API resources into my monitoring system, ready for publishing to relevant areas of the API Evangelist network, as well as some of my partner sites like APIs.io--the beauty is, so could anyone else. ;-)

If API providers used APIs.json, I could be connected to their Blog via RSS, their platform Twitter account via the Twitter API, and their Github profile or organization using the Github API. Not only would I have relevant metadata about a company like their name, description, logo, and details about APIs, I would also be connected to a real-time heartbeat of API operations via the platform blog, Twitter, and Github accounts. This is just the start--there are numerous other building blocks you could wire up with an APIs.json file, to keep potential consumers up to speed.
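To make this concrete, here is a rough sketch of what such an apis.json file might look like for a hypothetical provider. The company, URLs, and accounts are all made up for illustration, and the exact property names should be checked against the current APIs.json specification:

```json
{
  "name": "Example Company",
  "description": "Hypothetical profile of a company and its API operations.",
  "url": "https://example.com/apis.json",
  "specificationVersion": "0.14",
  "apis": [
    {
      "name": "Example API",
      "description": "The company's core API.",
      "humanURL": "https://developer.example.com",
      "baseURL": "https://api.example.com/v1",
      "properties": [
        { "type": "Swagger", "url": "https://api.example.com/swagger.json" },
        { "type": "X-blog-rss", "url": "https://developer.example.com/blog/feed.xml" },
        { "type": "X-twitter", "url": "https://twitter.com/exampleapi" },
        { "type": "X-github", "url": "https://github.com/example" }
      ]
    }
  ],
  "maintainers": [
    { "FN": "Jane Developer", "email": "api@example.com" }
  ]
}
```

With blog, Twitter, and Github listed as properties like this, anything monitoring the file can wire itself into the heartbeat of the platform automatically.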

I know it can be hard to see the potential of APIs.json, because there is limited tooling available currently, beyond the open source search engine APIs.io. Even before the next wave of tooling like the internal enterprise search engine, visualizations, browser plugins, and other tooling is available, you can realize the benefit of being listed on APIs.io, and make sure you are fully plugged into my regular monitoring of the API space, which means the chances are much, much higher that I will write a story about what you, and your developers are up to. ;-)



Encouraging Feedback By Thinking Through Your API Road Map, Out In The Open, On Your Blog

This story is derived from listening to the API podcast, Traffic and Weather. One of the things John Sheehan (@johnsheehan), and Steve Marx (@smarx) discussed during the recent show, was a post Steve wrote as part of his role as product manager, called "How many HTTP status codes should your API use?" I think the story is extremely useful to read, helping us all think through how we use HTTP status codes in our own platforms, but personally I think the real story is Dropbox talking out loud about their strategy.

John and Steve pointed out on the podcast that someone had tweeted that the story was not in sync with the way Dropbox actually did their HTTP status codes. To which Steve replied, that is why he wrote the story, to talk it out, and solicit feedback. This is how you should be thinking through your API road map, out loud, on your platform blog. When I mention to API providers that the blog is the most important tool in their toolbox, this is what I mean. Your blog isn’t just for broadcasting your platform message, it is also about you thinking through your thoughts, sharing those thoughts with your API consumers, and potentially getting feedback, which you can consider as part of the overall decision making process.

I am addicted to thinking through my ideas publicly on my blog. For me, a thought isn’t fully formed, until it's published. I also depend heavily on the feedback from my audience to my stories, in their public comments on Disqus, and Twitter, as well as privately via email and DMs. Talking things through, publicly on your blog, teaches you to think outside the box, let the sunlight in on your thoughts, and forces you to (potentially) properly form your ideas (not always), at least enough to share with the public--even if you don’t end up getting feedback, that process alone, is highly valuable.

The best part of this is that after you’ve thought through your stories, and received feedback from your API consumers, the story also has the potential to live on as a marketing vehicle, which is what I feel most people think of when I mention a blog for API operations. All of this creates a potentially virtuous cycle, that is essential to API evangelism and the overall road map. If you do this type of storytelling regularly, thoroughly thinking through your ideas, and opening them up to public comment before you finalize your road map, eventually you will find that you have momentum in some very positive directions, and are generating some extremely valuable exhaust from your efforts that also brings in new API consumers, and helps keep the existing API consumers engaged.



Considering The 3D Robotics Drone API As Potential Blueprint For The Drone Industry

As my Internet of Things (IoT) research continues, I’m applying much of my API thinking from the mainstream API space to the world of IoT. A couple weeks ago I stumbled across the 3D Robotics Drone API, which to my surprise, was documented using a Swagger specification, providing a machine readable definition of the API, which helped me quickly get up to speed on what the Drone API delivers.

I haven't put much thought into the details of any drone APIs, something that has been adjacent to my mainstream research for over a year now, but I did have some basic opinions about what a drone API could offer. So, when it comes to potential resources, I have to say the 3D Robotics Drone API delivers in all the areas I had been considering--with five main endpoints:

  • The Vehicle API - Exposes operations for browsing and searching lists of vehicles, and retrieving a single vehicle, including airspeed, ground speed, and battery.
  • The Mission API - Exposes operations for browsing and searching lists of missions, and retrieving a single mission.
  • The Administrator API - Access to administrative resources including debugging, logging, and flight simulation.
  • The User API - Access operations for browsing and searching lists of users, and retrieving a single user.
  • Authentication - Session operations for logging in and out, and identifying the user.
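To give a feel for the Vehicle API pattern, here is a small sketch of how a client might work with a single vehicle record. The payload shape, field names, and values are my assumptions based on the summary above, not 3D Robotics' actual interface, and the HTTP call is stubbed out with a sample response:

```python
import json

# Hypothetical response body for retrieving a single vehicle --
# in a real client this would come from an HTTP GET against the API.
sample_response = """
{
  "id": 42,
  "name": "quad-one",
  "airspeed": 12.5,
  "groundspeed": 11.8,
  "battery": {"percent": 76}
}
"""

def summarize_vehicle(vehicle):
    """Reduce a vehicle record to the telemetry fields a dashboard would show."""
    return {
        "name": vehicle["name"],
        "airspeed": vehicle["airspeed"],
        "battery_percent": vehicle["battery"]["percent"],
    }

vehicle = json.loads(sample_response)
summary = summarize_vehicle(vehicle)
print(summary)
```

The interesting part for me is less the client code, and more that a Swagger definition makes payloads like this discoverable before you ever write a line of it.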

The vehicle stuff I expected, but I was pleasantly surprised to see the resources around logging, and specifically the mission API. When it comes to transparency, and accountability around drone activity, I feel pretty strongly that these types of resources will be essential, and eventually will be mandated by government entities. I’m glad that 3D Robotics is ahead of the game, in providing API driven resources that deliver in this way.

What I didn’t expect, when I stumbled across the user and mission management APIs, was the social, online layer. As part of the 3D Robotics Drone API platform, the company offers DroneShare, where you can publish your user profile, and your missions. I was coming at this from the angle of a potential industry regulation tool, or possible drone pilot resource management--I didn't think it would be done as a social, and potentially entertainment, platform.

Here is the description from the site:

"This is the beta release of Droneshare. Droneshare is a mission viewing and sharing application that works with ground control applications to let you share your mission data. This project is open source, if you would like to improve it, it is probably best to start at our github repo."

This is why I thoroughly enjoy doing my research, because it expands my understanding of specific business sectors that APIs are touching. While I was discovering DroneShare, I also learned about the existence of the “Don’t Fly Drones Here” project, from Mapbox, which provides a database resource of restricted zones where you can’t fly drones--which brought me back to my original thoughts, regarding what resources we will need to keep the drone industry sane, rather than just focusing on social and entertainment.

Ultimately I see a lot of patterns here, that reflect several possible drone futures we are potentially crafting. I see social networks, where drone pilots compete for the best jobs, based upon their missions, and equipment usage. I also see potential for entertainment--think about the networks that have arisen for watching people play videogames, and competitive gaming platforms, but apply that to the world of drones. I also see potential elements of government regulation, where the FAA and other agencies, require drone pilots to log their equipment, and activity, logging all drone flights, and potentially sharing publicly via drone social networks to increase accountability.

In the end, I’m thinking that the 3D Robotics Drone API provides a potential blueprint that could be used by other drone manufacturers, and cloud platforms that serve the drone industry. The 3D Robotics Drone API pattern could be held up as an open specification, that the government could weigh in on, making sure the proper regulatory resources are available, and that drone activity can be accessed, and audited via a secure API, available via the Internet.

The world of drones is fascinating to me. On one hand, the security and privacy considerations scare the shit out of me. On the other hand, they are endlessly fascinating to me, and ultimately I’m going to be purchasing a drone, so I can play with it more, and understand it better--it's a business expense you know!

Man, I have a great career. I'm so privileged to be doing what I do—even if our world is under a threat of attack by drones!



A Little Standardization Around How We Do Internet of Things (IoT) APIs And Developer Portals Could Go a Long Ways

I was dedicating some time to researching APIs in the Internet of Things (IoT) space, and stumbled across the Myfox API, serving the Myfox home security products. While the developer portal for the Myfox API doesn't have everything I'd like to see in an IoT developer portal, it is one of the better ones I've seen.

If you think web or mobile app developers suck at providing a simple, and coherent developer portal and experience, device platform providers are even worse. The Myfox API has the essential building blocks, with a simple overview, authentication info, simple Swagger driven API documentation, application management tools, and a clear terms of service for the platform.

Additionally, there is a clear link between the primary Myfox website, the developer portal, and a myfox.me user portal, for device consumers and would-be developers to easily on-board with everything. I haven't fully profiled that Myfox platform, but at first glance it is clean, well laid out, and most importantly the link between the website, API, and user accounts is clear, and easy to navigate.

As I think this through, and after recently finishing a white paper called Building an API-driven Ecosystem for the Internet of Things, for the now defunct GigaOm, I can't help but think the IoT space is going to need to agree on some sort of standard approach for not just managing how they design, deploy, and manage APIs, but also how they deliver developer and user portals. To help you visualize what I mean, consider the number of Internet connected devices any single household will own, all with their own portal, and on-boarding process. #FUN

In short, developers, and end-users will get IoT account fatigue pretty quick. I don't think that we are going to standardize everything across the space, I am not delusional, but I can't help but think that the sharing of some common blueprints, could go a long ways in reducing friction for developers, and device consumers. I'll take my research from the GigaOm white paper and produce a blueprint, that IoT device makers can use, to hopefully be more successful in deploying their developer portals, platform accounts, and maybe help influence a tiny bit of consistency in how IoT platforms function.



Why The New API Blueprint Sharing Button From Apiary Is So Important

API design service provider Apiary, quietly launched a new sharing button for API Blueprints, in their interactive API documentation, the other week. They added a setting in their account area, which allows users to be more open with their API designs:

“Start sharing your API blueprint with other API enthusiasts, by enabling this feature within your API settings. Simply toggle the ‘Public Blueprint’ setting on and you’re ready to start sharing your API blueprint,” also stating that, “this is just the beginning of being able to socialize and learn from other public API blueprints.”

Now when you are visiting the Apiary driven API documentation for common platforms like:

  • Akamai Imaging API
  • Relayr Definition API
  • Loader.io From SendGrid
  • CloudBreak From SequenceIQ

The machine readable API Blueprint definition driving the documentation now has the potential to be accessible. This may not seem like much, but it is actually fundamental to the health, and growth of the overall API industry. Developers cannot learn from other developers, and from leading API platforms, and share best practices for API design across the space without it--we need this to grow, and evolve.
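For anyone who hasn't peeked behind the documentation curtain before, a minimal API Blueprint is just a flavor of Markdown. Here is a sketch of one, for an entirely hypothetical Notes API (the resource and response are made up, but the FORMAT, resource, and action syntax follow the API Blueprint format):

```
FORMAT: 1A

# Notes API
A hypothetical API, described in the API Blueprint format.

## Notes Collection [/notes]

### List All Notes [GET]

+ Response 200 (application/json)

        [
            {"id": 1, "body": "Hello, API Blueprint"}
        ]
```

This same document is what Apiary renders into interactive documentation, which is why being able to get at the raw definition matters so much.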

This is one of the reasons Swagger has seen such wide adoption, because it isn’t just an open format, it is also because the API definitions that these public platforms generate are also available to share, and learn from. You can still lock down the valuable resources made available via an API, but the interface patterns we apply to our APIs should be shared as widely as possible, for everyone to see. This is why I’ve been fighting so hard to push back against Oracle in their API copyright case against Google—API designs do not need to be locked up.

I’ve been able to find API Blueprints on the open web, and on Github for some time now, but this opens up the playing field to easily discover, and engage with existing Apiary, and API Blueprint driven APIs at a new level. Much of what I’ve learned about Swagger and API design has come from being able to click on the raw link on Swagger UI, and reverse engineer the Swagger definition behind the curtain. I’ve never been able to do this with API Blueprint, or Mashery I/O Docs. I have been able to get at the WADL behind the Apigee Explorer, but it is a pain in the ass, and the definitions are usually out of date by the time I get to it.

The effects of Apiary’s move won’t be some immediate thing, it will take time, but the more API Blueprints that are shared, the more users will learn about the powerful API definition format driving many public APIs—this is good for Apiary, good for API providers, and good for API consumers. Thanks Apiary for doing this, it makes me happy. It also gives me a head start on making sure my API Stack is API Blueprint fluent, and that more API Blueprint driven APIs are present in my research.



Where You Will Find Me Next: Berlin, Barcelona, and Broomfield

I’ve been fortunate enough to be able to cut back on my travel in 2015, and focus on some important research, coding, and writing. I apologize to all the events I’ve said no to over the last couple months, but I hit a wall last year with speaking, and am trying hard to make 2015 a much healthier, and more balanced year.

With that said, here are the three places you can find me in April and May:

That is all the speaking I’m doing this spring, with one tentative date early in June at BYU. I won’t be traveling again until the fall, when we do @APIStrat again, and of course @DefragCon!

I’m excited to be in Berlin and Barcelona, two of my favorite European cities, and of course Broomfield, CO (specifically the Taproom). I am also excited to not be traveling as much, and able to focus on my wider research, coding, and writing. #thanks



I Wish API Providers Published Their Developer Portals On Github So I Could Submit Pull Requests

I spend a lot of time looking through the developer portals of API providers. I see a lot of things, both the good and bad, while navigating these portals, and while some of the bad stuff I see is way too big for me to do anything about, there are many little things I see that I could help do something about. Sometimes it is just spelling mistakes, sometimes broken links, and other times I want to rewrite API descriptions, and add to the resources that are available for an API.

During these travels, I got to thinking...what if API providers published their developer portals to Github? Just like I do with my API Evangelist sites. Then I could fork their portal, and submit pull requests with rewrites of the description of what their API does, adding little tweaks, here and there, helping polish the portal--potentially making things easier for developers. I’m sure this thought scares the hell out of some companies, who can’t imagine allowing outside input like this, but if you sit down and think about it for a while, it just might make sense.

When I was in Washington D.C., I created a default portal template that runs on Github. I might upgrade this, and make it available for API providers to use as a forkable template, to seed any API developer portal. Doing this makes it easy to provide a base set of building blocks that every API provider should start with, allowing them to delete the elements that they don’t want. I am also considering making a default one, that can be used as a starter developer portal for Internet of Things (IoT) platforms.
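As a rough sketch of what such a forkable portal template might look like as a Github Pages repo, consider a layout like this (the file and folder names here are just my suggestion, not an existing template):

```
developer-portal/
├── _config.yml      # Jekyll settings, picked up automatically by Github Pages
├── index.html       # API overview, and getting started
├── docs/            # interactive API documentation
├── apis.json        # machine readable index of the APIs
├── blog/            # platform communication and road map storytelling
└── tos.html         # terms of service
```

An API provider could fork it, delete the building blocks they don't want, and have a portal that anyone can submit pull requests against.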

I thoroughly enjoy the pull requests that I get for spelling and grammar mistakes I made on API Evangelist, from some of my favorite grammar trolls. Running all of my websites on Github Pages has changed the way I tell stories, and interact with my audience. I’ve always encouraged API providers to use Github for an increasing number of API management tasks, but actually running your entire portal as a Github repo would take this to a new level. This approach could be used for larger, or smaller developer platforms, and I even use Github Pages as a front-end for private API developer portals--ask me how.

It is just a suggestion for you API providers, and soon to be API providers. Github provides a very low cost, scalable way to launch your portal, and you never know, people like me who spend a lot of time in your developer portal, might actually help make it a better place. I wish more API providers would publish their portals as Github repositories, and that graphic designers in the space would help craft template portals that providers can put to work. My portal definitions will always reflect best practices that I see across the API landscape, but they won’t always be the slickest looking when it comes to graphic design--just not my core competency.



Combined Calls: Monetization Through The Bundling Of API Calls

I was doing my regular monitoring, and found myself on the AlchemyAPI site. Not exactly sure how I got there, but I stumbled across their HTMLGetCombinedData API, which can be used for analysis of HTML content, and is one of three separate APIs AlchemyAPI is calling "combined calls".

If you aren’t familiar with AlchemyAPI, the company has a number of valuable APIs, which you can use to make sense of content and data from online, or offline sources. I use AlchemyAPI for API Evangelist, to pull keywords, and the content out of blog posts, helping me strip away the overall look of a site, and any advertisements--getting down to the raw content. What I thought was particularly interesting about this API, was their approach to combined calls, and specifically their approach to monetizing these aggregated API calls.

There are three specific APIs they are considering "combined calls":

These three APIs are only available in the AlchemyAPI pro and enterprise packages, which for me makes this a potentially new approach to API monetization. I don’t see it as something that works for all API providers, but when you have numerous decoupled APIs, which developers may be implementing several at a time, or daisy chaining together--a combined API call might save some developers valuable time.

Combined API calls also seem like a potential opportunity for API platform developers themselves. If an API platform provides tools for developers to aggregate, and stitch together multiple APIs, and publish their recipes, it is something that could produce some interesting patterns, that may better deliver solutions to the problems developers actually face during integration. At the very least, allowing developers to publish SDKs that bundle multiple calls might achieve a similar result.
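The pattern itself is simple enough to sketch in a few lines. The three stand-in functions below represent what would really be separate HTTP calls (they are my own hypothetical examples, not AlchemyAPI's actual endpoints), and the combined call bundles them into a single response:

```python
# Stand-ins for what would really be three separate API calls,
# each one a decoupled service doing one thing well.
def extract_text(url):
    return {"text": "raw article text for %s" % url}

def extract_keywords(url):
    return {"keywords": ["apis", "monetization"]}

def extract_sentiment(url):
    return {"sentiment": "positive"}

def combined_call(url):
    """Bundle several calls into one response, saving the developer
    round trips -- the kind of endpoint a provider might sell as premium."""
    result = {}
    for call in (extract_text, extract_keywords, extract_sentiment):
        result.update(call(url))
    return result

print(combined_call("https://example.com/post"))
```

One round trip instead of three is exactly the kind of time savings a provider can charge for in a pro or enterprise tier.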

I am just looking to share my thoughts on AlchemyAPI's approach to aggregating their API calls, and specifically the focus on monetization, adding the concept to my research. Maybe it is something others can do for their API platforms, or maybe API developers could provide aggregated API recipes, for specific API platforms, or across multiple platforms.



Weekly API.Report For April 6th, 2015

Swagger is now Open API Definition Format (OADF) -- READ MORE

My Weekly API.Report represents the best of what I've read throughout the week, and is only what I personally felt should be showcased. Each news item comes with a link, and some thoughts I had after curating the piece of API related news. I'm trying to break down stories into as coherent buckets as I can. It remains something that is ever changing for me, but ultimately I will settle on a clear definition for each of the research areas.

Always a good report, when I start with 101:

More on the legal side of 3D Printing, than API:

An interesting Account Management feature:

A handful of interesting Acquisitions to note:

I'm seeing more Analytics related items like this focused on gaming lately:

I'm calling these API Aggregation, but they are kind of a new take on the topic:

  • AlchemyAPI Web Combined Call - I thought this was an interesting approach to monetizing APIs, by saving developers time. In a world where you have a bunch of microservices that do one thing, and do it well, I think aggregate or combined API calls like this as a premium service will be more common.
  • AlchemyAPI Text Combined Call - Seems to be something they are doing across the board. Will keep an eye out for others doing this too.

This reflects some of the API Broker style services I'd like to see more of:

More movement on API Definitions, and not just Swagger, some pretty nice moves from API Blueprint last week:

Handful of API Deployment resources this last week:

API Deprecation rumors and misconceptions out of Facebook:

The API Design advice for the week:

The always evergreen topic of the API Economy:

A single, slick approach to API Evangelism:

Couple of API Event related items this week:

Some API Integration insight from SalesForce:

The growing area of API Lifecycle discussion:

Lots of API Management moves to evaluate from the last week:

Some very different looks at API Monetization:

An API Monitoring discussion:

The other API News from the space:

A single API On-Boarding story, to showcase how NOT to on-board:

Some API Performance thoughts from Restlet:

Some API Reciprocity stories from the private and public sector:

The growing area of API Testing:

The only lightly touched on area of API Virtualization:

  • Patterns of API Virtualization - I think API virtualization is interesting, but it is hard to help people understand the potential. I'm thinking there needs to be a lot more examples, and storytelling before they become common practice.

An interesting API blip in the world of Architecture:

The PopUp Archive team rock'n the Audio discussion in my roundup:

Pulling out just the acquisitions area, and looking at it in terms of Authentication:

I just couldn't resist putting this one in the Automobile area:

The Banking conversation keeps on heating up:

The ever enjoyable general bucket of the Business of APIs:

Two things to look at on the Internet connected Cameras front:

Where I place Career related items:

I'm putting this under Censorship, even though it might also be legitimate use cases:

An interesting API related Certification item:

  • An Open Can of Tin Badges - Tin Can API - Not quite certification for APIs, but an experience API driven badging and certification. So you could craft specific API experiences, using the Tin Can API, and have the result be a badge delivery. 

A Chat story out of the business social network world:

Just a single City Government story this week:

Some expansion, and contraction from the world of Cloud Computing:

Two Cloud Storage stories that barely caught my attention:

A single, important Commerce story:

The Containers bucket is pretty small this week:

Diverse number of Content related items this last week:

A single County Government too:

A number of Data nuggets this last week:

And directly Database related:

I am seeing a consistent uptick in number of Device related API approaches lately:

DNS is so important to all of this working:

Two takes on the Documents when it comes to Dropbox:

Only a handful of Drone related stories I felt were worthy enough to discuss:

Kind of sort of some Education stuff:

A single Email item:

Some Embeddable thoughts:

An Encryption talk in the messaging world:

APIs in the Federal Government:

Couple of Hackathons to showcase:

I love having Hacker Storytelling items to showcase:

Two Healthcare stories:

A single, weird Home story:

One item to note in my IDE related research:

Two International items:

Internet of Things is kind of tame this week:

JavaScript wisdom from the week:

Yes! Libraries:

IBM continuing to dominate Machine Learning:

A single Mapping item out of Google:

Media related items in the news:

Messaging talk:

A single Microservices tale:

A handful of Mobile to aggregate:

A single Museums story:

Some thoughts from the world of Music:

Love the Outdoors news:

A single Partner story to note:

Payments not as hot this last week:

A robust Politics of APIs tag this last week:

Some helpful, and some frustrating Privacy discussions:

Good to see more API discussion in the world of Real Estate and Mortgages:

The Real-Time discussion always dominated by PubNub with their kick ass content:

Security discussions I am watching:

A cool Showcase out of the Noun Project ecosystem:

Some Shuttering of companies in the cloud space to note by itself:

The ever fascinating Single Page Applications how-tos:

Adding a single Software Defined Networking item, cause it has good content:

I love anything out of NASA that is Space related:

Isolating this again as a Speech related items, because it goes beyond just machine learning:

Big, big week for Spreadsheets:

And a single State Government item to cover each level of government this week:

Busted out Telecommunications, to show how stunted the industry can be:

Not as much Transparency talk this last week, but a handful to showcase by themselves:

Blockspring Shifts The API Client Conversation With Their Google Spreadsheet API Add-On

The one thing I've learned in five years as the API Evangelist is that us technologists and developers don't always see the world like everyone else. We focus on the perfection of the technology, our own desires for the future, and often miss the mark on what end-users actually need. One of the hallmark successes of APIs over SOA is that, by accident, APIs jumped out of the SOA petri dish (thanks Daniel Jacobson - @daniel_jacobson), and were used to solve everyday problems that end-users face, using the technology that is readily available (aka HTTP).

While I think us API folks have done a great job of delivering valuable resources to mobile applications, and a decent enough job at delivering the same resources to web applications--and I guess we are figuring out the whole device thing? maybe? maybe not?--one area where we have failed a major aspect of the business world, is in delivering valuable API resources to the #2 client in the world--the spreadsheet.

We have done a decent job of providing resources to data stewards, helping them deploy APIs from spreadsheets using services like API Spark, but other than a handful of innovative implementations from companies like Octoparts and Twilio, there are no solid API consumption resources that target the spreadsheet environment. Meaning there is no easy way for mainstream spreadsheet users to put common API driven resources to work within the spreadsheets that they live in daily--that is, until today, with the launch of the Blockspring Google Spreadsheets Add-On.

Yeah I know, making APIs work in spreadsheets has been done for a while, via Google Spreadsheets and Excel Spreadsheets, but nobody has standardized it like Blockspring just did. So let’s take a quick look at the implementation. I went to the Google Chrome App Store, and downloaded the add-on.

Then, using a new spreadsheet, I clicked on add-ons > Blockspring, and logged into my account. After giving Blockspring access to the Google Spreadsheet via my Google Account OAuth, I was given an API console in the right hand sidebar of my spreadsheet interface. The API options I was given aren't the usual geek buffet, they are everyday use scenarios that would attract the average spreadsheet user.


I selected the IMDB movie search, and once chosen, I was given the option to populate my spreadsheet with results, providing me with API driven resources right in my worksheets. The best part is it comes complete with one cell as a search term, allowing me to customize my IMDB search.

Using Blockspring, I’m given easy to use, API driven resources, that anyone can implement, like visualizing the recent news:

Or possibly evaluate stock volatility clustering, using stock market data APIs (cause you know we all do a lot of this):


Blockspring gives me over 1000 API driven functions that I can use in my Google Spreadsheet--kicking everyone’s asses when it comes to potential API client delivery. While us technologists are arguing over whether or not we can automatically generate Swagger driven SDKs, and the importance of hypermedia APIs when deploying the next generation of clients, someone like Blockspring comes along and pipes APIs into the #2 client in the world--the spreadsheet. #winning

Now the game will be about getting the attention of Google Spreadsheet users, developing comparable Microsoft Excel tooling, and getting mainstream Excel users' attention as well. The rest of you will have to get the attention of Blockspring, and make sure your API resources have simple, meaningful endpoints that can be piped in as Blockspring Google Spreadsheet functions. Spreadsheet driven business units should not have to learn about APIs and go looking for them at each individual API portal--API providers should find and educate business users about their resources, via one of the most ubiquitous tools in business.

Nice work Blockspring, in helping ensure the space moves beyond Excel as just a data source for API deployment, and focusing on it as an API client, delivering vital API resources to the business users who can potentially benefit the most, and are willing and able to pay for API access in my opinion.

P.S. As soon as I finished this I remembered this story from last week's API.Report: Free Federal Energy and Economic Information Delivered Straight to Your Spreadsheet--not a standardized approach, but definitely an important implementation to showcase.



API Streaming: Cache And Push Data From APIs Using StreamData.io

I was introduced to StreamData.io the other day, by Gabriel Dillon (@gjdillon) of Readme.io. StreamData.io provides a caching, and push layer for apps to take advantage of, that can be deployed on top of existing APIs. I haven’t implemented StreamData.io yet, as I am still evaluating it, but at first glance it looks like an interesting new real-time layer that you can use on top of APIs, to improve application performance and user experience.

I’m going to add StreamData.io to my real-time research right now, however once I start playing with it I may add it as an API deployment resource as well. I have some verification to do, but I’d love to test out the potential for StreamData.io to deliver a cache and sync layer for common APIs, doing for APIs what StreamData.io is doing for apps—delivering on one of my ongoing ideas for caching high use APIs.

When I was working in Washington DC, I had the pleasure of working during the federal government shutdown, where I wrote a piece asking if we can depend on our federal government APIs, and talked about caching APIs with AWS CloudFormation, or OpenShift. My objective is to ensure there are redundant sources for APIs coming out of government, providing developers with a more stable solution they can depend on. I even added this thinking into the early APIs.json design, allowing you to define cached indexes of APIs.
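
While I am at it, here is roughly how I picture that redundancy layer working, sketched in Python. The cache location and the fallback behavior are my own assumptions, not how StreamData.io (or any government platform) actually does it:

```python
import json
import time
import urllib.request
from pathlib import Path

CACHE_DIR = Path("cache")  # hypothetical local cache location
CACHE_DIR.mkdir(exist_ok=True)

def cached_fetch(url, max_age=3600):
    """Fetch a JSON API, caching the response; serve the cached
    copy if the origin API is unreachable (e.g. a shutdown)."""
    key = CACHE_DIR / (url.replace("/", "_").replace(":", "") + ".json")
    # Serve a fresh-enough cached copy without hitting the origin.
    if key.exists() and time.time() - key.stat().st_mtime < max_age:
        return json.loads(key.read_text())
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            data = json.loads(resp.read())
        key.write_text(json.dumps(data))  # refresh the cache
        return data
    except OSError:
        if key.exists():  # origin is down, fall back to the cache
            return json.loads(key.read_text())
        raise
```

Pointing a cron job at a list of government endpoints with something like this would give you the kind of dependable, redundant source I keep talking about.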

I’ve added StreamData.io to my real-time research, and I’ll get in there and play with it more as I have the time. Then I’ll get a better understanding of what you can do with the API caching and syncing, seeing if it is possible to deliver this layer for my APIs as well as my apps—something that is even more appealing in this age of Docker containers. #FoodForThought



Temp API Keys: Leave Them Laying Around The Web Where Devs Will Find Them

I was reading Are You Or Your Customers Leaking Your API Keys? from Darrel Miller (@darrel_miller) the other day, which spurred me to float up a story in my Evernote from a couple months back. My thoughts are only related to Darrel’s because they are about API keys, and their accessibility—the similarities stop there. Darrel is talking about a very different layer of key management, which we should all be thinking deeply about.

API key management is an API provider issue, as much as it is an API consumer issue. As an API provider, I need to have an easy way to manage all the keys I provide to developers. As an API consumer, I need to have an easy way to manage all the API keys I get across the API providers that I depend on. In short, key management isn’t easy, and there are no clear solutions that serve both ends of the key management coin.

As an API provider, I use 3Scale API Infrastructure to manage my API keys. Using my API portal, developers can sign up for their own accounts, manage their apps, and the keys that are issued. 3Scale does all the heavy lifting for me; all I have to do is manage my API service composition, and keep an eye on dev usage, as well as my own app usage. I use all my own APIs for my apps too! Sometimes my apps are the bad actors; it's not always the developers who abuse API access.

As an API consumer, I’m working hard to pull my API key world together, using a home brew format I’m calling simple api-keys. I am using Github for all of my API and microservice related projects, so I’m beginning to centralize how I store keys, and how I manage them across all my apps. With a central storage and management process, I’m hoping to get a better handle on which APIs I use, and introduce more regular cycles of key management, where I refresh API keys on a strict schedule.

With the work I’m doing on the API consumption front, reading Darrel’s post got me thinking about keys. If our API management house is in order, and we can manage different groups of API keys, monitor their usage, and revoke them at will, why can’t we just have a pool of API keys that we leave laying around? I’ve seen many API providers embed default keys in the URLs of their API documentation, allowing you to make sample calls, so why can’t we apply this a little more widely? It seems like we could have a rolling pool of API keys that we leave laying around in code snippets, how-to guides, and stories, allowing instant access to APIs. We monitor them, and if we see abuse we revoke them, and refresh with a new set—easily done if you use Github Gists for storytelling (snippet centralization).
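
To make the rolling pool idea a little more concrete, here is a quick Python sketch. None of this reflects any real provider's key management; the pool size and daily limit are made up for illustration:

```python
import secrets

class KeyPool:
    """A rolling pool of throwaway API keys meant to be embedded in
    docs, snippets, and stories, and revoked/refreshed when abused."""

    def __init__(self, size=5, daily_limit=1000):
        self.daily_limit = daily_limit
        self.usage = {}  # key -> calls seen today
        for _ in range(size):
            self._issue()

    def _issue(self):
        key = secrets.token_hex(16)
        self.usage[key] = 0
        return key

    def is_valid(self, key):
        return key in self.usage

    def record_call(self, key):
        """Count a call; rotate the key out if it looks abused."""
        if key not in self.usage:
            return False
        self.usage[key] += 1
        if self.usage[key] > self.daily_limit:
            self.revoke(key)
            return False
        return True

    def revoke(self, key):
        """Pull an abused key, and backfill the pool with a fresh one."""
        self.usage.pop(key, None)
        self._issue()
```

The real work would be wiring `record_call` into your API management layer, and republishing any Gists that embedded a revoked key.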

I’m going through my entire API stack, generating Postman Collections, and I’m considering generating some API keys that are designed for use in specific stories, 101 resources, and other public and private evangelism efforts. It is just a thought at this point that I wanted to put out there for discussion. Seems to me, if we really want to reduce friction in on-boarding, we could start using “some” API keys in conjunction with SEO, SMM, and other existing evangelism metrics and analysis tools that are in our toolbox.



Weekly API.Report For March 30th, 2015

I swear 3D Printing is continuing to cross over in some interesting ways:

Interesting week for acquisitions in my opinion:

Seems like agriculture is a continuing trend:

Several diverse areas within analytics caught my attention:

Some API aggregation tidbits:

Helpful API debugging assistance from Splunk:

Big shifts in the world of API definitions:

Some API deployment advice from this week:

Valuable API design from this week:

Keeping developers up to speed with a little API evangelism:

Some API events activity (mostly from my events ;-)

Great API lifecycle guidance out of Nordic APIs:

  • The Entire API Lifecycle - I think I'm going to break these lifecycle discussions into their own area. Normally I put under business of APIs, but recently some of the conversation is elevating way beyond just business.

Lots to highlight from API management this week:

Also some interesting API monetization chatter:

Some good API monitoring leads from this week:

I am not the only one busting out the good API news:

API performance is something I'll be breaking out more as well:

Big moves from Talend in the area of API reciprocity, and I guess cloud storage:

Like seeing talk on API stability:

Previous weeks have been a lot of talk by me, but this week there were two amazing visualization discussions:

Two interesting authentication pieces: 

Thoughts on automobiles:

The banking chatter I felt worthy of highlighting:

I am tracking more on blockchain now too:

Bots can be fun to throw in as well:

Browser APIs are normal, but I'm keeping a closer eye on them to educate myself:

The infamous area called the business of APIs:

Oh the magic of cameras with IPs:

Two API career paths to highlight this week:

From the city government front:

Hard to tell who is winning when it comes to cloud computing:

Some interesting cloud storage specific items too:

Have to look at comments out of Facebook:

Only a handful of container related stories this week:

I will try to push more content to Tumblr as part of my evangelism work:

A single conversion story to share:

  • Parsing EDI to XML (and vice versa) - I was working on some Swagger conversion tools, and came across this while doing my monitoring. Figured it was interesting enough to track on for future reference.

Paying our respects to cURL:

At the cybersecurity front:

Data is always a big thing:

The conversation around the data center continues to grow:

Two database items caught my attention:

Couple of deprecation items flagged as part of story:

Devops creeping into the mix:

And like that, drones are on the menu:

I'm hoping for a lot more election data and API activity in the future:

An interesting email integration between platforms I track on:

The coverage of embeddable comments at Reddit:

Encryption love from AWS:

Some great energy stories this week:

Focus on the enterprise:

The super fun area of facial recognition:

The news worth discussing out of the federal government:

Three things caught my eye in the financial space:

Look, fitness from the gaming console!

Hackathon news that makes me happy:

Some of it is more IoT than healthcare, but still pretty robust this week:

A single, very cool hotel related story this week:

IDE space is coming into focus when it comes to APIs:

Good ideas are contagious in the API space:

Keeping an eye on the international picture:

  • Data Protection Around the World - More great stuff coming out of DataSift, and this time an important international view of things. Something we are going to need to be more educated on.

The always active Internet of Things (IoT):

Investment blips on the radar:

An important discussion about the legal system in our country:

Libraries stories are the best:

Location is critical:

Pulling back the machine learning curtain:

Some mapping activity I followed this week:

Media is one area Microsoft is kicking butt in:

Mostly about Facebook messaging, but a handful of stories from the week:

The always fun microservices conversation(s):

Mobile chatter:

Museums stories too!

Love this music item this week:

Two new APIs to highlight:

From the mainstream news space:

Telling partners is critical to the space going round:

Some good, some weird patent news:

Strong week for payments:

Not really API, but photos are key to what I do:

Interesting police stories this week:

Several fronts from world of politics:

Stories from the Politics of APIs:

A lone printing story:

Lots of privacy chatter to discuss:

Some more protection stories this week:

Two real-time thoughts from the week:

The area of Reclaim Your Domain is an area I don't showcase much, but track on all the time:

An eye on regulation:

API providers, make sure and provide resources like a calendar of events:

A fat SDK section this week:

Lots more security news, as expected:

Semantics isn't something you see often in my coverage:

I track on sensors, or they'll track on me:

Can't tell whether this is a thing, or anti smart meter propaganda:

  • Smart meters – not so smart - An interesting take on smart meters not working, and possibly being the first casualty of IoT hype. Not sure I subscribe to everything that he lays out, but definitely think we should be skeptical. I will keep an eye out for other examples supporting or against these thoughts.

Some interesting spreadsheets related items to discuss:

Yes, some state government news I can showcase:

Couple of transparency items that caught my attention:

Obligatory Transportation story:

Showcase the two-factor authentication so it becomes widespread:

Video talk from several of the big API players I track on:

Hacking the wearables this week:



Quantifying The Community Around The Swagger API Specification

This post is extracted from a deep dive I've been doing into the Swagger ecosystem, as part of regular conversations I have been having with Tony Tam (@fehguy), trying to understand how we can better tell stories about Swagger. I've been tuned into the Swagger community for a while now, and there is a lot going on, but from the outside, some of this can be hard to see. Seeking to better understand what the Swagger community is, I wanted to take a walk through the sites, forum, and Github repositories.

First let’s start with the basics: what is Swagger? Merriam-Webster defines swagger as: “to walk in a very confident way : to walk with a swagger.” ;-) Seriously though, Swagger is a machine readable format (either JSON or YAML) for describing an API, providing a description of API operations in a way that other applications can read and understand. Even though Swagger is machine readable, and designed for other applications, it has also become a language for humans to describe and collaborate around a shared, quantifiable vision of what an API can do, throughout all stages of the API life cycle.

If you want to learn about Swagger yourself, you can tune into these channels to learn more:

My goal with this post is to evolve my understanding and awareness of what Swagger is, and hopefully along the way I can help increase yours as well. In my experience, many people have a very distorted, and limited understanding of what Swagger is, and I’d like to take some time to evolve that. When I started this research, my objective was to help Tony deliver a new website, and on-boarding materials for Swagger, assisting people of all skill levels in finding all things Swagger. I am just taking this moment to walk through what I have so far, and craft a more robust version 1.0 narrative about what Swagger is, something that blogging helps me work through.

One of the biggest misconceptions I find when I talk with folks about Swagger is that it is interactive documentation, when that is actually Swagger UI--when it comes to Swagger, all roads start with the Swagger specification:

  • Swagger-Spec - The goal of Swagger is to define a standard, language-agnostic interface to REST APIs which allows both humans and computers to discover and understand the capabilities of the service.

You can find version 2.0 of the spec here, which is the current active version in use. The Swagger spec lets you describe all the essential aspects of API operations, with information about the operator, the host and basePath for the API, and details about what type of content it produces or consumes, like JSON or XML. Most importantly, it lets you describe each of the paths for an API, with details on the parameters, responses, and other aspects of each specific API path. Swagger also provides you with the ability to describe the underlying data model definitions, describing the valuable resources being served up via APIs.

To grasp the last paragraph, you have to have a pretty solid understanding of the moving parts of an API, but in short, the Swagger spec gives us a language to potentially map out all of the common parts of an API. It also allows me to describe the security for each API, and apply tags to organize APIs into specific buckets. I use Swagger to describe the almost 20 APIs my business uses daily, with varying amounts of detail about each resource, and the information it serves up. All 20 of the Swagger definitions provide me with a machine readable map of my own IT infrastructure—as well as a common language that almost any other developer, or potentially non-developer, can learn to read and work with.
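
To make this less abstract, here is a minimal, entirely hypothetical Swagger 2.0 definition, expressed as a Python dictionary so it is easy to load and walk programmatically. The API name, host, and paths are made up:

```python
# A minimal, hypothetical Swagger 2.0 definition for a single-path API.
swagger_definition = {
    "swagger": "2.0",
    "info": {"title": "Example Notes API", "version": "1.0.0"},
    "host": "api.example.com",
    "basePath": "/v1",
    "produces": ["application/json"],
    "paths": {
        "/notes": {
            "get": {
                "summary": "List notes",
                "parameters": [
                    {"name": "limit", "in": "query",
                     "type": "integer", "required": False}
                ],
                "responses": {
                    "200": {
                        "description": "A list of notes",
                        "schema": {
                            "type": "array",
                            "items": {"$ref": "#/definitions/Note"},
                        },
                    }
                },
            }
        }
    },
    "definitions": {
        "Note": {
            "type": "object",
            "properties": {
                "id": {"type": "integer"},
                "body": {"type": "string"},
            },
        }
    },
}

# Walking the map: list every verb + path the API exposes.
operations = [
    (verb.upper(), path)
    for path, verbs in swagger_definition["paths"].items()
    for verb in verbs
]
```

That last little loop is exactly the kind of thing I mean by a machine readable map of your infrastructure; every tool in the ecosystem is a variation on walking this structure.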

With Swagger we now have a common way to communicate around the APIs we develop, and also map out the public APIs we depend on, like Twilio, SendGrid, or even Twitter. Now what? Why do we do this? There are an increasing number of incentives emerging for generating machine readable API definitions like Swagger, which is part of the reason I’m looking to further quantify the Swagger community, so I can add to the master list.

Let’s start with what I’d consider to be the primary building blocks of Swagger, developed by the Swagger team, as part of the core offering:

  • swagger-ui - Generation of interactive API documentation, which Swagger is often known for.
  • swagger-codegen - Generation of client code in a variety of languages using a Swagger definition.
  • swagger-js - A JavaScript library to connect to swagger-enabled APIs via browser or Node.js.
  • swagger-socket - A REST over WebSocket tool for creating real-time integrations with Swagger.
  • swagger-node-express - Swagger module for Node.js w/express module
  • swagger-scala-module - Swagger support for Scala.
  • swagger-editor - A visual editor for Swagger.

Those seven repositories represent the core tooling that the Swagger team has evolved around the specification, providing what I’d consider to be three stops along the API life cycle:

  • Documentation
  • Server Code
  • Client Code

Swagger is often known for its interactive API documentation (aka Swagger UI), but there has also been a lot of work done on the codegen platform, and editor, taking things well beyond just API documentation. This is just the beginning of what is possible with Swagger. In an effort to find out what else is being done with Swagger, and what is being built with the specification, I wanted to see what the community is up to, which platforms they are integrating it into, and what other standalone tooling is being built using Swagger.

I want a better understanding of the core group of companies who have embraced the Swagger specification, and baked it into their own platform. I know there are hundreds of them out there, but finding them can be easier said than done, especially when I'm looking for some sort of public announcement, blog post, press release, or service description to use as my reference. So far I have 49 companies that I’m tracking on, who have pulled the Swagger spec, and incorporated it into their own platform and infrastructure.

3scale (reference)

Akana (reference)

Apache Apollo (reference)

Apache Camel (reference)

Apigee (reference)

Apigility (reference)

APIMATIC (reference)

apis.io (reference)

APIs.json (reference)

APItools (reference)

AppNow (reference)

Ardoq (reference)

Axway (reference)

Beego (reference)

Catch Software (reference)

Cloud Elements (reference)

Django REST (reference)

DreamFactory (reference)

elastic.io (reference)

Espresso Logic (reference)

Gluu (reference)

IBM (reference)

Magnolia CMS (reference)

Microsoft Azure (reference)

Neo4J (reference)

Netflix (reference)

Nomos Software (reference)

Open311 (reference)

OpenDaylight Project (reference)

OPENi (reference)

OpenShift by Red Hat (reference)

Pivotal (reference)

Postman REST Client (reference)

Quandl (reference)

REST United (reference)

RESTBase (reference)

RESTFiddle (reference)

Restlet / APISpark (reference)

Sandbox (reference)

Service Stack (reference)

SmartBear Software (reference)

SnapLogic (reference)

StrongLoop (reference)

Synapp.io (reference)

TIBCO Software (reference)

Visual Studio (reference)

WaveMaker, Inc. (reference)

WSO2 (reference)

Yelp (reference)

This list represents a pretty interesting mix of folks who understand the potential of Swagger, and have invested in the community by making it a core part of their operations. I know there are more of these out there (a lot more), but they haven’t talked publicly about what they are up to. This is just what I’ve aggregated from going through the Swagger site, group, and Github repositories, and from what I already know of the API space, from my monitoring.

After looking at the platforms who have pulled the Swagger specification into their operations, I wanted to see if I could also begin to quantify the community of APIs who have deployed Swagger UI, providing interactive API documentation for their own API platform. Since this is what Swagger is known for, I knew I would find quite a few to showcase. So far I have found 62:

World Population Project (reference)

3scale (reference)

ACI Blog Index (reference)

API Evangelist (reference)

API2Cart (reference)

AppSpin (reference)

Banckle (reference)

BetISN (reference)

Bitdango (reference)

BitTitan (reference)

CallFire (reference)

Carnival Mobile (reference)

CDScience (reference)

CityGro (reference)

CitySDK (reference)

Clarify API (reference)

Climate Change Costs (reference)

Cloudify (reference)

CommaFeed (reference)

Concur (reference)

DataValidation API (reference)

Datumbox (reference)

DreamFactory (reference)

DroneKit (reference)

Evercam (reference)

Expedia (reference)

fanart.tv (reference)

Formagg.io (reference)

GeoTrellis (reference)

Getty Images (reference)

GroupDocs (reference)

Guru (reference)

HabitRPG (reference)

hapi.js (reference)

Kubernetes (reference)

League of Legends (reference)

League of Legends eSports (reference)

Likibu (reference)

Mobovivo (reference)

Mozilla Open Badges (reference)

MuckRock (reference)

MyFox (reference)

MyNeighbor (reference)

MySMS (reference)

Pingup (reference)

Pop Up Archive (reference)

Project Chronos (reference)

PTisp (reference)

Puppet Labs (reference)

QI Bench (reference)

Rackspace (reference)

ReliefWeb (reference)

Sensr.net (reference)

Shoeboxed (reference)

Skimlinks (reference)

Subledger (reference)

TaDaweb (reference)

taxamo (reference)

The Resumator (reference)

UK Commission for Employment and Skills (reference)

Wordnik (reference)

YunoHost (reference)

Finding websites that have implemented Swagger UI is harder than you might think. There are only three search engines I could find that let you search the source code of websites for JavaScript files and code. Ultimately I ended up mostly depending on my own ability to find new APIs, and manually uncovering whether or not Swagger was actually in use. I’m working on a stack of Swagger and API Blueprint discovery search engines, but they aren’t quite ready for this dive.
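
For what it is worth, my manual check boils down to something you could script. A rough Python sketch, where the Swagger UI markers are my own guesses at useful signatures, not a definitive list:

```python
import urllib.request

# Hypothetical markers; a real detector would need more signatures,
# and would follow links to likely documentation pages first.
SIGNATURES = ("swagger-ui.js", "swagger-ui.css", "swagger.json")

def scan_html(html):
    """Does this page source look like it embeds Swagger UI?"""
    return any(sig in html for sig in SIGNATURES)

def looks_like_swagger_ui(url):
    """Fetch a docs page, and scan it for Swagger UI markers."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return scan_html(resp.read().decode("utf-8", errors="replace"))
    except OSError:
        return False
```

Run across a list of developer portal URLs, this would at least shortlist candidates for the manual verification I ended up doing by hand.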

Next I wanted to look through some of the tooling that has evolved as part of the Swagger community, and Github is definitely the place to begin this journey. When you search Github for the word Swagger, you get almost 1000 repositories. I was going to programmatically explore these results, but found that manually browsing through them was the best path to take, allowing me to make a decision about what each developer was up to.

The one thing I learned going through this process—Swagger means many different things, to many different people. Most refer to Swagger and mean Swagger UI, while others refer to it as generating server side code, or maybe client side code. Moving forward, there needs to be a lot of work done to help educate people about, first, what Swagger is (a specification), and then each of the areas of tooling that are built on top of the Swagger specification.

Even after looking through the almost 1000 Swagger related Github projects, things are still very hazy for me, but I did learn a lot along the way about which languages, frameworks, and platforms developers are using Swagger with, and the types of tools and integrations they are developing.

The best way to look at all of these projects is by language—here is the Github breakdown.

  • 411 JavaScript
  • 135 Java
  • 61 Ruby
  • 41 Python
  • 32 PHP
  • 31 Scala
  • 22 C#
  • 17 Shell
  • 13 Clojure
  • 8 Groovy

This isn’t entirely accurate, because many of the JavaScript repositories are actually other languages. I forget to make sure Github knows what language a repo is sometimes too. Regardless, it gives you an idea of which programming languages developers are using to work with Swagger—one layer to consider.

Beyond the programming languages, I want to know which platforms and frameworks developers are applying Swagger to. Here are 36 frameworks and platforms I’ve identified so far.

AngularJS (reference)

Apache Ant (reference)

Apache Camel (reference)

Apache Maven (reference)

Apache Wink (reference)

ArangoDB Foxx (reference)

BackboneREST (reference)

CakePHP (reference)

CKAN (reference)

Django REST (reference)

Django Tastypie (reference)

elasticsearch (reference)

Express (reference)

Feathers (reference)

Flask (reference)

Flask MongoRest (reference)

Flask Restful (reference)

Gradle (reference)

Grails (reference)

Grape (reference)

Grunt (reference)

Guzzle (reference)

Jersey (reference)

Laravel (reference)

LINQPad (reference)

MongoDB (reference)

Nancy (reference)

Pyramid (reference)

Salesforce (reference)

Sequelize (reference)

Silex (reference)

Sinatra (reference)

Spring (reference)

Symfony (reference)

Tornado-JSON (reference)

Veneer (reference)

Beyond the programming languages, frameworks, and platform integrations, I’m looking to also understand the types of tools developers are actually building, taking Swagger beyond what the core development team, and working group could do. It is hard to understand the objectives of each of the developer projects I came across, but here are the twelve areas I’ve identified so far:

  • Converter
  • Generator
  • Parser
  • Schema
  • Validator
  • Server
  • Command-Line
  • Powershell
  • Client
  • Proxy Generator
  • Aggregator
  • Tester

This added a couple of stops along the API life cycle that Swagger is serving. I hadn’t envisioned a Swagger powered CLI or PowerShell tool. I’ve only lightly thought about proxies, and aggregators driven by Swagger. When you bundle this with the platforms above who have integrated Swagger into their products and services, it starts to paint an even bigger picture of what is possible with Swagger.

I think I’m going to wrap up this dive into the Swagger ecosystem here. I still have all of the contributors, watchers, and people who have forked the core Swagger products, as well as many of the people involved in Swagger discussions via the Swagger Google Group and Twitter account to consider. Plus I think this article will flush out some other projects that are out there, being executed by my readers. Storytelling along the way is the best way to flush out hidden details in my opinion, and is one of the main reasons I’m so transparent--I depend on people emailing and tweeting at me about stuff.

This exercise has given me a new perspective on the Swagger community, and introduced me to some new companies, and tools. If you are doing anything cool with Swagger, please let me know, and I’ll add it to my research. Remember, if you don’t tell the story about what you are doing with APIs, nobody knows, including me—API evangelism 101. This is just my first attempt to better quantify the Swagger community, and I’m happy to include what you are up to in future updates.



Amazon Echo: Voice Enablement Will Be Major API Driven Channel In The Future

I've been tracking on the potential of voice APIs since Siri was first announced, a topic that has often meant telephony, like from Twilio, or audio transcription from Popup Archive and Clarify. When I close my eyes and think about the future of APIs and voice enablement, it is more akin to the Siri example, where the digital world is (supposedly) just a voice command away.

Imagine making your entire employee directory, company calendar, or product catalog available via voice commands. How do you do this? You do it with APIs, and a voice enablement platform in-between the application developers, and the available API resources. Much like all other voice enablement, I think we have a huge amount of work to do to get anywhere close to the pictures we all have in our heads when it comes to voice enablement.
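
To illustrate the pattern, here is a toy Python sketch of that in-between layer: the voice platform hands you a parsed intent, and you answer it from an internal API. The intent name, slots, and directory data are all made up:

```python
# Hypothetical sketch: a voice platform hands us a parsed intent,
# and we answer it from an internal directory API (faked here).
DIRECTORY = {"jane doe": {"title": "CTO", "extension": "x402"}}

def handle_intent(intent, slots):
    """Map a recognized voice intent to an API-backed spoken answer."""
    if intent == "FindEmployee":
        person = DIRECTORY.get(slots.get("name", "").lower())
        if person:
            return f"{slots['name']} is our {person['title']}, at {person['extension']}."
        return "I could not find that person in the directory."
    return "Sorry, I do not know how to help with that yet."
```

The interesting part is that `DIRECTORY` would really be a call to your company API; the voice platform only ever sees the sentence that comes back.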

It is my mission to find these signs across the API landscape, and keep an eye on what they are doing. One platform that is now open for beta is Amazon Echo. Amazon says, "Echo is designed around your voice. It's always on—just ask for information, music, news, weather, and more.” APIs are how we will deliver on the “more” part of this equation. The difference between Apple Siri, and Amazon Echo at this point, is Amazon will let you (API providers) help deliver on the “more” part of Amazon Echo discovery equation.

I’m signing up for the beta, and if I get access, I will share more stories. I encourage you to sign up as well, and if you have any work you’re doing with Amazon Echo, or know of other API driven voice-enablement platforms I should be paying attention to, let me know.



Even More Pondering On My Microservice Definition

I am evolving my own personal microservices definition, something that is constantly changing, as I work on my infrastructure, read other people’s definitions (no shortage of these lately), and continue having conversations with smart folks across the space. I had the pleasure of having Mike Amundsen over the other night for dinner, and after some interesting discussions about community, and potential microservice design, I’m adding a couple of elements to my microservice definition list.

In my last post on this, I listed seven criteria I consider as part of my microservices definition:

  • minimal surface
  • minimal on disk
  • minimal compute
  • minimal message
  • minimal network
  • minimal time to rebuild
  • minimal time to throw away

I wanted to add two elements to my list, considering some of the other elements I’ve noticed at play when it comes to API bloat:

  • minimal ownership
  • minimal dependencies

Minimal ownership is a pretty easy one for me, as I’m the owner of all my microservices—the buck always stops with me. However, it is a good reminder to make sure all of my services have a vCard applied to their APIs.json file index. It is important to easily know who is responsible for any public or private API, and with APIs.json and Swagger, this is easy to do.
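
Checking for minimal ownership is easy to automate against an APIs.json index. A rough Python sketch, using a simplified dict in place of a full APIs.json file, with made up API names and contacts:

```python
# Hypothetical APIs.json-style index; each API should carry a
# maintainers entry (vCard-style FN/email) so ownership is clear.
apis_index = {
    "name": "Example, Inc.",
    "apis": [
        {"name": "Notes API", "baseURL": "http://api.example.com/notes",
         "maintainers": [{"FN": "Jane Doe", "email": "jane@example.com"}]},
        {"name": "Search API", "baseURL": "http://api.example.com/search",
         "maintainers": []},
    ],
}

def apis_missing_owner(index):
    """Return the names of APIs with nobody on the hook for them."""
    return [api["name"] for api in index.get("apis", [])
            if not api.get("maintainers")]
```

Run against your real APIs.json, a check like this makes "minimal ownership" something you can enforce, not just aspire to.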

When it comes to minimal dependencies, this gets a little tougher for me. My world isn’t too bad, because my systems are small, and I’ve been the only developer. The downside is that I’ve been the only developer, and I’ve been lazy about how many dependencies an API might have, and also fairly sloppy about how I integrate with 3rd party APIs. Adding this item to my list will help me keep an eye on how I establish dependencies between services, keeping them minimal, and well documented.

That is it, I just wanted to add ownership and dependencies to my list of considerations—I just like making a production out of everything.



With The Addition Of The DroneKit API, I Am Now Tracking On The World Of Drone APIs

There are many areas I track on as part of my research, things that do not show up in my weekly API.Reports, or in my analysis on API Evangelist. One of these is the area of drones. I’m fascinated by this corner of the Internet of Things, but until now it was more of a side note, or one possible path for APIs and the Internet of Things.

Amongst the numerous drone stories I curate, I'm seeing more of a shift in the space towards a developer focus. It isn't just about the drone itself, it's about customizing, developing, and using the drone as a platform. One recent story I noticed was 3D Robotics (3DR), one of North America's largest consumer drone manufacturers, announcing the release of a new DroneKit, which includes an API for drone app development.

DroneKit has a web API defined using Swagger, which is the first drone API I have actually added to my system. I am sure there are more out there, and now that I’m tracking on the area closer, I'm sure I'll find them. The number of drone related stories that flow through my feeds is getting pretty out of control, and I'm thinking the space needs some standard approaches to APIs identified, to help develop better drone solutions, but also potentially inject some transparency into this growing layer of IoT.

Now that I have a blueprint for a drone API, in Swagger too! ;-) I will be able to start thinking about what some of the API patterns for drones might look like, and what some of the common building blocks might be for managing the technical, but also the business and political sides of drone operation. I’m thinking that when it comes to drone operations, APIs might not just be a nice-to-have; they might be something that becomes mandatory in trying to bring this fast growing, not well defined space under some control.



WhatsApp's Reason For Not Opening Up An API Is Just Not Informed And Short-Sighted

There was a quick post from Mashable the other day about a conversation with WhatsApp cofounder Brian Acton, called WhatsApp cofounder: Sorry developers, no API for you. It's a pretty common argument you hear from business leaders regarding why they won’t open up APIs. It has been a while since I railed on this line of thinking, so I am taking the opportunity to share why this is such an uninformed, short-sighted view of APIs.

I wasn’t at the conference, so I am completely dependent on Mashable for this, but apparently someone in the audience asked:

"We have done a lot of transactional messaging using traditional email and SMS, and we've recently moved into the international market," the engineer explained to the audience, who clapped and hollered. "We're desperate to communicate with our members where WhatsApp would be a wonderful platform. We're good developers."

Acton stated that they had no current plans to do so, adding:

"We don't want to inundate users with messages they don't want," Acton explained of the choice. "I am very empathetic to your cause. I receive emails on a regular basis from people who want to run their business or want to run something using WhatsApp as the backbone for communication, but we're balancing that with the user experience."

Before I point out how narrow this thinking is, let’s focus on why I use the word uninformed to describe this perspective. Let’s say the motivation for not opening up an API at WhatsApp is purely about UX, and explore some of the options on the table for API providers to strike the balance they are looking for in this area.

API Management Service Composition
With modern API management services, you have a critical concept called "service composition", where you, the API provider, can carve up different levels of access to valuable API resources, deciding exactly who has access to what. APIs do not equal a firehose to the public, where anyone can build whatever they want. If you still see APIs in this way, spend some time playing with the more successful APIs out there like Twilio and SendGrid, and expand your definition of just exactly what an API is, and the role service composition plays.
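To make the idea concrete, here is a minimal sketch of service composition in action. The plan names, endpoints, and limits are all hypothetical, not from any real provider--the point is simply that the same resources can be carved into very different levels of access:

```python
# Hypothetical service composition: the provider decides exactly who
# has access to what. Plan names, endpoints, and limits are made up.
PLANS = {
    "sandbox":  {"endpoints": {"messages.read"},                  "rate_limit": 100},
    "partner":  {"endpoints": {"messages.read", "messages.send"}, "rate_limit": 10_000},
    "internal": {"endpoints": {"*"},                              "rate_limit": None},
}

def is_allowed(plan: str, endpoint: str, calls_today: int) -> bool:
    """Decide whether a consumer on a given plan may call an endpoint."""
    plan_def = PLANS[plan]
    allowed = "*" in plan_def["endpoints"] or endpoint in plan_def["endpoints"]
    limit = plan_def["rate_limit"]
    within_limit = limit is None or calls_today < limit
    return allowed and within_limit
```

With a composition like this, a sandbox key can read messages but never send them, no matter what a developer builds--exactly the kind of control the "firehose" objection assumes is impossible.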

OAuth For Platform Access & End-User Control
OAuth is a pretty proven way of managing who is accessing API resources, in a way that gives the end-user a say in operations. An API does not automatically open access to all users when a developer signs up for API access. In the beginning, you have access to your own account as a developer, and that is it. It is up to end-users to find these applications, and using OAuth, give (or revoke) access to whatever application they choose. As a platform owner you should consider giving end-users oversight, via OAuth, into how you, the platform provider, are accessing their data, and require ALL partners to follow this path. If you need an example of this in action, go look at your Google Account profile--you’ll see Google Apps in the OAuth list.
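The end-user control piece can be sketched in a few lines. This is illustrative only--a real OAuth server also handles authorization codes, token expiry, refresh, and per-token scopes--but it shows the core idea that grants live with the user, who can withdraw them at any time:

```python
# Sketch of the end-user control OAuth enables: access is stored per
# (user, app) pair, and the user can revoke any grant. Names and
# scopes here are hypothetical, not any real platform's API.
grants: dict = {}  # (user, app) -> set of granted scopes

def grant(user: str, app: str, scopes: set) -> None:
    """End-user approves an application for specific scopes."""
    grants[(user, app)] = set(scopes)

def revoke(user: str, app: str) -> None:
    """End-user withdraws an application's access entirely."""
    grants.pop((user, app), None)

def can_access(user: str, app: str, scope: str) -> bool:
    """The API checks the user's grant on every request."""
    return scope in grants.get((user, app), set())
```

A developer signing up gets nothing here by default; until a specific user calls `grant`, every `can_access` check fails, which is exactly the balance the WhatsApp team claims is out of reach.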

Branding And Design Guidelines
What are the official internal branding and design guidelines you use to direct your own WhatsApp development teams--you have them, right? So share these with developers, and give them a chance to share in your master UX vision, the one to rule them all. You should be able to articulate the platform UX vision as a set of professionally crafted, publicly shared design guides that everyone can see, and use to understand the vision--you never know, maybe there is someone out there in the world who could iterate on them in a meaningful way, that even you might agree with.

Starter Kits and UX Frameworks
Additionally, go beyond just design guides, and provide developers with HTML, CSS, image, and JavaScript goodies that will demonstrate how visionary the WhatsApp UX vision is. Think Twitter Bootstrap. I’m not saying that Twitter nailed it on this front--they did well, but their overall timing and execution left a lot to be desired. As a platform provider who possesses a UX vision that works, open up the platform, and provide developers with the proper starter kit, or UX framework, that they need to succeed. If, as a developer, I fail to deliver on your UX vision, it is more a failure of your platform to provide me with proper tooling than it is my fault as a dev.

Developer Sandbox & Simulation Environments
A common approach to managing developer on-boarding, and overall quality control throughout the API and resulting app lifecycle, is to provide developers with a sandbox. When nothing is live, and developers do not have any access to actual end-users, what harm is done? What harm to the overall UX is there when developers are only building in sandbox and simulation environments? No developer should move to a production environment until their solution is proven to meet WhatsApp's strict UX guidelines.

Application & Developer Certification
Now let’s talk about how WhatsApp could certify developers, giving them access to more resources, or possibly investment from the WhatsApp platform. Many leading platforms require developers to get certified, and / or their applications certified before they get full access to production environments, and end-users. How you craft your developer and application certification is a reflection of your overall UX requirements, and platform goals.

Application Showcase
When you have proper branding and design guidelines, starter kits and supporting UI / UX frameworks, proper developer environments, and official application and developer certification, you should end up with a handful of apps worth showcasing. Maybe it's not just a single, competing application--it could be integration with an existing platform (Salesforce, Zendesk, etc.), or just providing access to valuable messages and conversations. With proper API ecosystem management, an application showcase provides an incentive for developers to build applications that work properly, and reflect the well defined UX vision of WhatsApp.

Partner Program
With a robust ecosystem of certified developers, and applications that meet the strict UX guidelines of WhatsApp, surely some cream will rise to the top of what is developed on the APIs. Which apps and platform integrations are the best designed, and provide the most value to WhatsApp end-users? These are the ones that WhatsApp should be investing in, bringing them closer to internal operations, and providing them with additional access to resources, and investment when it makes sense. What does the WhatsApp partner program look like currently? I'm sure there is one--just not a formal, transparent approach that includes any public involvement.

Zooming Back Out
That represents just a handful of the common API management building blocks that API providers put to work to help manage their API ecosystems. It doesn’t just take technology to make APIs work--it takes a healthy business and political approach to make sure any API actually works as an ecosystem. I could go on for days about how proper support, communication, and other proven elements could also be used by WhatsApp to strike the developer balance they are looking for, and even possibly create a positive feedback loop with the community that could be applied to the roadmap.

Now that we are paused, and zoomed out a little bit, let’s talk about the narrow perception demonstrated by WhatsApp's current stance on APIs. If you open up APIs, developers will just come and mess with your UX vision? Nothing else will happen? This is where we shift from being educated about common building blocks, to the limited perceptions that result from this lack of awareness--you just don't know what you don’t know.

You Have APIs, You Just Aren’t Open
WhatsApp uses APIs. This is how the mobile app works. The company just does not want to share these resources with developers, because they fear losing control over the UX. I guarantee that WhatsApp is sharing data and resources with 3rd party providers--they are just very picky about who these are, and they don't have a formal process for doing so, so they keep their cards very close to their chest--aka, not open.

Let’s explore the darker side of the WhatsApp API conversation. I’m not implying that WhatsApp has less than honorable intentions--my goal is to shine a light into the dark areas that exist, regardless of WhatsApp's lack of education about modern API practices, and missing the wider picture of how APIs are actually put to work. This moves into the area of responsibility I feel platforms assume when they step up and invite end-users to sign up for, depend on, and stay engaged with any online, public, API driven web, mobile, or device based application.

Data Access & Portability
WhatsApp end-users deserve access to their accounts, and their messages. APIs are not just about developers building the cool app that will compete with WhatsApp, or mess up the WhatsApp UX vision. APIs provide basic data access and portability to the information end-users generate via the WhatsApp platform. Platforms that do not empower end-user data portability do not operate in the best interest of their own customers, detracting from the overall user experience--something you don't see, because your UX vision is so narrowly aligned with your business goals, not widely aligned with your end-users' needs.

Platform Reciprocity
Using services like IFTTT and Zapier, API driven platforms are enabling users to migrate their information between the services they depend on, adding to their own user experience on each platform, and most importantly between them. APIs and OAuth enable the flow of information between cloud platforms in a way that protects the interests of users, as well as that of the platforms. APIs aren't just about developers building apps--they are also about users enhancing their own overall experience, on their terms, moving beyond the platform owner's single UX vision.

Diversity & Platform Balance
I just can't imagine any single company, in 2015, thinking they have a single UX vision, and that nobody could possibly contribute to it from an outside perspective. As a product owner, I would love to get as much input from my community regarding the next generation UX as I can, because in the end, it is ultimately up to me what I incorporate into the roadmap, what the best practices in the API community are, and ultimately which apps get certified, showcased in the application gallery, and pushed to end-users. When you operate a global platform like WhatsApp, you can't address the diverse needs of end-users over time, around the globe, without the help of users augmenting and evolving exactly what the WhatsApp UX vision is. When you are a white dude in the U.S., you just aren't going to understand the UX for women of color in Africa--I don't care how much vision you think you have.

In Conclusion, Just Shy of Innovation
I’m going to end this there. I’m not even going to pull out the innovation card and talk about all the potential for new ideas a WhatsApp API could generate. I’ll stick with talking about a handful of common building blocks used across the space to mitigate the UX concerns that API driven platform providers have, and pointing out how narrow those concerns are. There are plenty of other reasons for having an API, beyond just developers building apps that will mess with your UX vision.

Users should have a choice to use their preferred messaging channel (WhatsApp) in any platform they depend on, and be able to manage their experience using 3rd party services of their choice. With an informed approach to API management, you can do all of this, giving end-users more, without compromising the overall WhatsApp UX—you can carefully craft and evolve it, with the help of your community. As it stands, WhatsApp has an API, it is what the mobile app uses, and I’m pretty sure that WhatsApp does provide access to 3rd party partners of their choice, they just don't open that up to public knowledge—they are already doing what they say they don't do, they just aren't open and transparent about the process.

It doesn't take much to launch a proper developer portal, with the essential building blocks like documentation, code samples, and self-service registration. You don’t have to give production access to every new developer--make them build something cool, and prove themselves first. That way you deliver only the best to WhatsApp users. People are already building unsanctioned apps on WhatsApp (just Google WhatsApp API), so not having an API isn’t stopping these potentially bad seeds from entering the UX paradigm. There is just no sanctioned place for users to go when they are looking to extend or augment WhatsApp, make it work with other services they use, or assume control over their own account and data.

Alright, I'll stop ranting there. I know I’m biased, but if you have a popular public mobile application in 2015, there is no reason you shouldn't have at least a public face for your developer program, to answer these questions. In the end you are just hurting your own operations, as well as the overall experience for your users. Additionally, I just can't see how the leading position of any mobile application is defensible for very long without an API. Things are just too volatile, and end-users are just too picky--someone will always come along and disrupt you, unless you can shift to deliver the UX consumers are demanding.



Politics of APIs: Talk Of API Driven Regulation Is Increasing

When I started API Evangelist, I was all about the Business of APIs, something I still focus on, but increasingly over the last couple years I am focusing more on what I call the politics of APIs. In my opinion, the politics of APIs can be anything from terms of service and privacy policies, to rate limits, pricing, and security, all the way up to court cases, patents, and government regulation.

Today I want to focus on the very top of that spectrum--government regulation. The government isn't just getting into the API game by deploying its own APIs--it is also going to be increasingly involved with private sector APIs. In my weekly monitoring I'm seeing more chatter across several industries, like the UK treasury getting involved in banking API standards, and a push in the healthcare industry for more interoperability via APIs--just to highlight two recent stories.

This isn’t anything new--governments have been defining API standards and pitching them to industries for a while now. I personally have been involved in healthcare and energy related standards, as part of the Blue Button and Green Button data services. In coming years, what is going to increase is the number of industries the government is helping define standards for, as well as how heavy handed it gets about mandating APIs as part of the regulatory process.

Personally I’m not the biggest fan of government regulation, something I feel gets abused on both sides of the tracks, but I understand it is a necessary actor when it comes to balance in markets. To help address some of the abuse that occurs, I think APIs could significantly help bring much needed transparency, and self-service access to the process, for both the public and private sector. I feel the conversation in coming years will move beyond the government just defining industry API standards, to pushing for real-time interoperability and execution by companies who operate in heavily regulated industries.

I’ll continue to keep track of the patterns I see emerging when it comes to API driven regulation, and try to stay informed regarding what is coming down the pipes across business sectors. At various points along the way, I’ll do roundups of regulation related news, analysis, and API definitions, or platforms that focus on industry regulation. When it comes to API driven regulation, there will be no black or white, just a high frequency blur, which will need transparency, and a machine readable interface, to make any sense of.



How Do You Make Something, Something, The API Edition

After reflecting on API management and the Apigee IPO, I’m thinking deeper on how the API space has come to be a “thing”. To kick off this story, you have to start with the API itself, something very abstract, hard to see, let alone convey to the average person. A single API is difficult to quantify, let alone an entire sector, or the companies rising up to service this perceived sector--think of the argument that early API management service providers like Mashery, 3Scale, and Apigee had to make to their VCs.

Don’t get me wrong, the API space has always been a “thing” to me, but along the road from 2010-2015, there have been plenty of times where I questioned the way things are. I have to give it to Mashery, they saw it early on, then 3Scale and Apigee joined in, and in 2015 we have a real industry, with 35 API management service providers, in addition to the expanding number of service providers catering to API design, deployment, management, discovery, integration, and other evolving layers of the API lifecycle.

In 2015, I feel like the API sector is a real thing. Not because 3Scale is still growing, Mashery was acquired by Intel, Apigee is IPOing, or that there are over 35 API management service providers, including a number of open source solutions. The API sector is a real thing because many other companies, in many supporting areas, have emerged, and the overall awareness of the potential of APIs among business leaders has grown significantly. How did this happen? It happened because people were telling the story of APIs, whether it was marketing for their products and services, or stories on ProgrammableWeb and API Evangelist--working in concert to create an industry (something).

Within the same thought, I try to grasp how it was that I created API Evangelist. There sure as hell wasn’t any strategy to do it. Early on I had a vision in my mind, but five years later, that vision is radically different. I did what I had to do, when the time came, and told the stories that mattered (at least I tried). I am not trying to connect API Evangelist to the overall existence of the API management space--I’m trying to work through the details of how you define something to a wide audience, in this very abstract world of APIs. I think this applies at the highest levels of industry like API management, down to the microservice level (soon to be nanoservice, trust me it will be something ;-))--it isn’t something, until it is something, and people believe.

I know. All of this sounds like bullshit. If you think about it though, it begins to make sense (hopefully). Web APIs weren’t a thing until John Musser aggregated them together into a directory in 2005. API management wasn’t a thing until Mashery, 3Scale, and Apigee made it into something. API evangelism wasn’t a thing until API providers like Twilio and SendGrid started sending out armies of evangelists to hackathons. API design wasn’t a thing until Tony Tam gave us Swagger to help us visualize our APIs, and Jakub Nesetril of Apiary allowed us to design, mock, and collaborate around an API Blueprint. None of these things were things, until people stepped up and worked really hard to convince others that they were a thing.

I’m working hard to understand this in action across the API space. The Apigee IPO, Swagger changing hands, and other shifts in the landscape are making me think a little deeper about how we got where we are. How did we all build momentum in API management and design, and how do we now apply that to the world of API discovery? How do you make government APIs a thing? How do we ensure hypermedia becomes / is a thing--I mean, not just one possible design constraint, but well understood by all developers, along with when it makes most sense to put to use? How has Swagger evolved? What is next for Swagger? These are all things that make me curious about what the future holds, and how I can help get us there safely, with as few scratches as possible.



Swagger Shifts Hands From Reverb to SmartBear

The news came out yesterday that SmartBear was taking over ownership of Swagger, from Reverb. Swagger has been outgrowing its home at Reverb for some time now, becoming much more than I think even Tony originally imagined. You can find the official announcement from SmartBear on their site, and Tony’s thoughts over on the Reverb blog. No need for me to recap, but I did want to provide some of my personal thoughts on the shift in this very important project.

I feel like we are embarking on the next stage in the evolution of Swagger. Over the last couple of years, the Swagger community has seen growth, adoption, and investment at an unprecedented pace, culminating in version 2.0 of the spec and now its transfer to SmartBear. The scope and passion of the Swagger community, and the investment that Tony Tam specifically, and the wider API community, have made--these have all brought us to this moment.

I've known the SmartBear team for years, and after speaking with them about this transition of the Swagger specification, I believe SmartBear is the right steward for Swagger during the next stage of its evolution. SmartBear understands the gravity of what they are now taking charge of: the Swagger specification represents a central truth throughout the entire API lifecycle.

SmartBear fully understands too that it was organic growth that made Swagger the force that it has become, and SmartBear is aware that a unique balance of leadership, open governance, and a strong community pull will be just as essential during the next stage of growth. I look forward to working with the Swagger community and SmartBear team to ensure Swagger continues playing a vital role across the API space, meeting the demands of the fast-growing API economy.

As I mentioned, I see Swagger as a central truth in the API economy, something I’m invested in beyond just my own IT infrastructure, as Swagger is also the cornerstone of the APIs.json format that Steven Willmott (@njyx) and I created to assist in the next generation of API discovery. I am behind Swagger because I believe in Tony’s vision, and I also believe that the Swagger community will make this transition more about shifting gears for the Swagger spec and tooling than about who is in charge of the project.

I’ve been tuned into the Swagger community since its inception, and the community support, contributions, and chatter have become pretty loud over the last year. I trust that SmartBear has experienced this scope as well, and understands the importance of Swagger having an open governance model--which would make this current transition all about shifting Swagger into 2nd gear, with the setup of open governance representing the shift into 3rd gear, and the ability to shift into 4th and 5th gear left up to the Swagger community itself.

I am looking forward to what SmartBear and the Swagger community can achieve out on the open API highway!



Reflecting On API Management And The Apigee IPO

It has been a little over three years since I published my first roundup of API management providers. I’ve been tracking on this new breed of companies long before I started API Evangelist, but in 2011 I started formalizing how I monitored what these companies were up to. In 2015, I now track on over 35 API management service providers, offering everything from simple proxies, to the full API management infrastructure stack you get from 3Scale.

I have met most of the API management providers, and after 5 years of covering them, it is no secret I have my favorites (you know who you are ;-). The business side of being an API management service provider has never really excited me, so I tend to stay away from the investment and acquisition stories, or speculating too much on who is winning the cash game. With that said, I think the Apigee IPO is a pretty significant milestone for the industry, something I have to give pause to, and reflect on how far we’ve come, and evaluate how an IPO compares to the acquisitions we’ve seen in recent years, or the other significant industry milestones.

The acquisitions of Mashery, Layer7, Vordel, and Apiphany showed the space was really maturing, and the recent name change by SOA Software to Akana shows the space has evolved over the last ten years. I think the Apigee IPO shows the overall space is actually growing up. I don’t think Apigee had many other alternatives, with a gagillion dollars in funding, but I still think it shows the space is moving out of its juvenile phase. I guess the actual IPO will be the true test of how grown up we all actually are, eh?

For me, another thing to note is where 3Scale is at. You see, I consider Mashery, 3Scale, and Apigee to be the OG three of API management service providers. Yeah, I know some of you have been around longer--SOA Software, Vordel, and other gateway solutions--but those three are the "OG API management”. 3Scale has been plugging along slow and steady, taking on only the funding it needs, while Mashery was acquired, and now Apigee is IPOing. I know I’m biased when it comes to 3Scale, but I think their longevity and success is as notable as any IPO or acquisition milestone--it's been a long haul.

Another thing to think about is the amount of open source tooling that is available in 2015--I was pleased by the number of new players when I did my last roundup. I was also happy to work with WSO2 early on to understand what the space needed from its open source API management solution, and with the API Umbrella platform as part of the federal government work I’ve done. The amount of open source tooling available is a clear sign for me that API management is truly growing up, and really is a thing (it was a little shaky there for a while--couldn't tell if I was dreaming, or awake ;-).

I would like to end this reflection on the most important sign for me that the API space will continue its growth, and reach new heights in coming years: the fact that the conversation has moved way beyond just API management, with companies, services, and tooling emerging throughout the API life-cycle, stimulating design, deployment, discovery, integration, and management conversations that are generating the growth I speak of. Things were only about API management for a while, something that really worried me, but in 2015 the conversation goes much wider and deeper, pushing into even more exciting territory like visualizations, containerization, and objects.

In closing, congrats to Apigee--wishing you the best of luck in your IPO. It's been a fun ride, and I’m looking forward to it not ending anytime soon.

Disclosure: 3Scale, and WSO2 are API Evangelist partners.



Weekly API.Report For March 23rd, 2015

Ok, it seems like Monday is going to be the regular posting day for my weekly API.Report. No matter how hard I try to get it done over the weekend, something always seems to get in the way. Regardless, I'm still going to do it, as the process is more than worth it.

The Weekly API.Report represents the best of what I've read throughout the week, and is only what I personally felt should be showcased. Each news item comes with a link, and some thoughts I had after curating the piece of API related news. I'm trying to break stories down into as coherent buckets as I can. The buckets remain ever changing for me, but ultimately I will settle on a clear definition for each of the research areas.

A couple of 3D printing footnotes for the week:

Accessibility is a critical aspect of the API space that needs more discussion:

  • Accessibility APIs: A Key To Web Accessibility - This is an important aspect of the API conversation that isn't discussed enough, let alone one where APIs are delivering a solution. I'm going to break this down, and try to incorporate accessibility into my research more.

Only one acquisition I wanted to showcase this week:

Interesting usage of analytics in an acronym, and some money rais'n:

I will be tracking on more annotation related discussions in the future:

Some API aggregation action, mostly from Google:

Interesting API analysis piece that was a guest post on Twitter:

Handful of worthy API deployment things I was checking out this week:

Lots of API design opinions this week:

API discovery roundup, and more APIs.json love:

Couple of things falling in the API evangelism bucket:

API events news (ok, its almost @APIStrat time!):

Lots of interesting API integration nuggets:

A number of interesting posts, and additions to the world of API management:

My API monetization exploration:

Some other API News roundups:

The latest in API podcasts (cause there are so many):

The single API reciprocity story from the week:

  • Zapier Integration - Zapier is something every API should have, and when you successfully integrate, make sure to tell the story about it, and showcase it on your website.

A potential API simulation related story:

Some API visualizations gems from the week, do not miss the architecture of a data visualization post:

Some tips on application management from the big players:

Was going to put this under SDN, but I'm going to call it architecture:

My single art story:

An artificial intelligence piece that stuck around this week:

Interesting parody in the automobiles section this week:

Some banking thoughts:

I am going to begin putting blockchain stories in weekly roundup:

Random business stories this week:

Big, big week for the business of APIs:

From the careers section:

When it comes to city government and smart cities:

I like that cloud computing is much shorter this week:

Two Command Line Interface (CLI) pieces I felt like sharing:

Slow containerization news week too:

One content delivery network story:

A very diverse data category this week:

One data center story to rule them all this week:

Lone deprecation reminder this week:

You don't find me referencing desktop implementations often:

Diversity from the government:

Document API movement this week:

Some valuable education blips:

Two embeddable stories to highlight:

Energy stories are important:

So are environmental related news:

Beyond just API events, and movement with event management APIs:

The always active federal government sector:

Some very important movement in the financial space this week:

Just some regular government stories, that apply across the board:

Three hackathon stories pass inspection this week:

Lots to consider in the healthcare sector:

Interesting home discussions, and companies:

Google leading with HTTP:

I smell huge opportunity for change in the insurance space:

  • Insurance: Risk and reward - Interesting peek into the potential of APIs in the insurance industry. This is an area I'd like to do a proper white paper on, but I lack the bandwidth at the moment.

Yay! We made it to the Internet of Things section again:

One investment story that floated up on radar:

Two JavaScript nuggets from the week:

Two interesting legal stories I found this week:

Important story about libraries:

A single location story:

Had to highlight Google's move into logging:

I don't always bite on the machine learning stories, but device centric ones seem valuable:

Some mapping stories:

Media stories from the cloud:

A basic messaging story:

The microservices stories keep getting better in my opinion:

Three mobile stories to keep you happy:

Some OAuth tales:

Beyond just data and into the realm of open data:

Open Source discussions:

Lone patents story:

Payments discussions:

My new bucket for tracking what I consider to be Platform Development Kits (PDK):

Interesting policing, surveillance, and data story:

Some of the darker areas around the politics of APIs:

Lots of privacy discussions:

I'm calling this area protection for now:

A single quantified self thought:

One interesting rate limits story from the week:

Don't get many real estate API stories:

A new real-time addition:

  • Netty: Home - A new event-driven, real-time platform I added to my list of tooling. I'll spend more time learning about what they are up to as I can.

The ever growing regulation bucket:

Case studies in the restaurant space:

The somewhat interesting retail front:

The scraping goings on:

Three SDK entries to consider:

Security. Security. Security.

Sensors are everywhere:

Single Page Applications related talk:

Some patterns from the world of smart watches:

AI and the social good:

Growing number of Software Defined Networking (SDN) news items:

Also some software defined storage news as well:

Oh yeah, sports:

More spreadsheets services:

  • SpreadCloud - An interesting addition to my spreadsheet stack. They found me after I wrote a story last week.

State government can rock it too:

More telephony from Twilio:

A single translation post:

Essential transparency:

Two transportation stories:

Wearables keep dominating the conversation:

A single weather cell:

More container, but also a worker entry:

That concludes my report on what I read last week across the API space. I'm still working on pulling together a summary e-newsletter version, allowing people to get an executive summary of what I thought was important from the week--I am hoping it will be available next week. I'm also going to auto-generate some visualizations, and summary counts for each week. I'd like to see a tag cloud, and overall counts to help me understand the scope of news I cover each week.

If you know of something I missed, feel free to email or tweet at me, and make sure I know about your company, and have a link to your blog RSS. If you don't tell the story and share, the world will never know it happened.

It is a good time to be tracking on the API space--lots going on!



Overcoming API Rate Limits Like They Did With WebhookDB

I was listening to the all-too-infrequent Traffic and Weather API podcast today, and one of the topics John and Steve covered was WebhookDB, an interesting approach to API consumption, and to getting past API rate limits. I agree with John that WebhookDB is pretty clever, and represents what I'd consider classic API developer ingenuity when it comes to getting access to the resources they need. I’d have to give this one to my friend and API adversary Tyler Singletary (@harmophone), who says rate limits stimulate developer creativity. +1 Tyler.
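It is exactly this kind of bookkeeping that rate limits force on consumers, and that drives developers toward creative workarounds. As a rough sketch, GitHub's API really does return X-RateLimit-Remaining and X-RateLimit-Reset headers on every response; the helper name and threshold below are my own illustration, not anything from WebhookDB:

```python
import time

# Hypothetical helper: given the rate-limit headers from the previous
# GitHub API response, decide how long to pause before the next request.
# The header names are real; the function and "floor" threshold are mine.

def seconds_to_wait(headers, now=None, floor=1):
    """Return seconds to sleep before the next API call."""
    now = time.time() if now is None else now
    remaining = int(headers.get("X-RateLimit-Remaining", 1))
    reset = int(headers.get("X-RateLimit-Reset", now))
    if remaining > floor:
        return 0  # plenty of budget left, keep going
    return max(0, reset - now)  # wait out the window
```

Multiply that pause across every pull-request and issue you want to query, and the appeal of a local replica becomes obvious.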

So, what does WebhookDB do?

...allows you to replicate Github's database over HTTP using webhooks. It's useful if you want to treat Github's APIs as a database, querying over pull requests and issues. Github doesn't like that, and you'll quickly hit the API's rate limits -- but if you use WebhookDB, you don't have to worry about it! Just populate the initial data into the database, set up the webhook replication to keep it in sync, and query your local database however you'd like...
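The pattern they describe boils down to: seed a local database once, then apply each webhook payload as it arrives, and query locally instead of hitting the API. A minimal sketch of that idea, assuming GitHub's "issues" webhook payload shape (the SQLite schema and function names are mine, not WebhookDB's actual implementation):

```python
import sqlite3

# Sketch of the WebhookDB pattern: maintain a local, queryable replica
# of GitHub issues by upserting each incoming webhook payload.

def open_replica(path=":memory:"):
    conn = sqlite3.connect(path)
    conn.execute("""CREATE TABLE IF NOT EXISTS issues (
        id INTEGER PRIMARY KEY,
        number INTEGER,
        title TEXT,
        state TEXT)""")
    return conn

def apply_issue_event(conn, payload):
    """Upsert the issue carried by a GitHub 'issues' webhook payload."""
    issue = payload["issue"]
    conn.execute(
        "INSERT OR REPLACE INTO issues (id, number, title, state) "
        "VALUES (?, ?, ?, ?)",
        (issue["id"], issue["number"], issue["title"], issue["state"]))
    conn.commit()

def open_issue_count(conn):
    """Query the local replica -- no API call, no rate limit."""
    return conn.execute(
        "SELECT COUNT(*) FROM issues WHERE state = 'open'").fetchone()[0]
```

Once the replica is in sync, you can run whatever queries you like against it, as often as you like.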

There is more back-story over at the REST API gotchas and WebhookDB story that Traffic and Weather linked to. I agree with John’s thoughts that this is an opportunity for a service provider to step up and deliver on. I envision a pretty simple, open source, containerized version, as well as a cloud-based version that people could tap into, paying for premium services on top of each index.

John is right, there would be significant overhead in defining the schema of each API you would support. Let me know--I can help with some of the targeting, alongside what I’m doing for my API Stack. Each API would need some special attention, but with containers you could easily build out a pretty slick deployment solution that runs in the cloud, or on the end-user's infrastructure of choice. With some heavy lifting up front, it would be a pretty viable solution for API consumers, and I also think it would provide an interesting caching opportunity for API providers--think Datasift.

I also see a higher, more analyst-level opportunity here: helping establish a common definition of webhook patterns across the space, identifying those patterns, and bringing much needed education and potential consistency to the API space when it comes to webhook execution. Kind of like what I work to do for API design, deployment, management, integration, and other key areas of the API space. People are hungry for this type of information, and I’m happy to aggregate the work of anyone who steps up and delivers in this way.

I would also take this one step further, and charge someone with stepping into more of a webhook advocacy role, where you could push back on API providers to offer webhooks (many don’t), and improve existing webhook designs, again helping establish consistency in the space. In my experience many API providers want to understand the bigger picture, but don’t often have the time or awareness, and with a little education and guidance, you could make a significant impact.

I look at webhooks as the important second lane, making APIs a two-way street, and shifting API operations to be just as much push as pull. Webhooks play a different role in each API's operations and ecosystem, but for many platforms they will continue to play an important role in giving developers more control over API resources, while also eliminating much of the burden on API providers.

Great idea John and Steve, thanks for sharing. One more reason to tune into Traffic and Weather--startup ideas!!



Bringing My IT Infrastructure Out Of The Shadows With An API-First Approach

I'm slowly migrating my own infrastructure towards a microservices-first approach, something you can follow the details of on my alpha.apievangelist.com blog. I've been running on my own home-brewed content management system (CMS) since I started API Evangelist in 2010. There are many APIs in use across my system, both external publicly available APIs like Twitter and Crunchbase, and a wide variety of internal APIs I've developed myself.

As I work to replicate each aspect of my CMS, carving off each potential microservice, I'm reminded of how much of what I do lives in the shadows. Little scripts I've written here and there, jobs I haven't run in months, and just general code exhaust from years of operation. I’m a little different than most shops, as I'm a one-person operation who codes all my own stuff, but based upon my experience in other shops, the same behavior exists in almost any organization.

Throughout this migration I’m reminded of what I'd consider to be one of the biggest security lapses in my world, which is simply about things being known. As I build out new features, jobs, bells, and whistles, are they all cataloged, mapped out, and either consistently used or deprecated? Anything that is unknown and unaccounted for lives in the shadows of my operations, and potentially presents a security risk. Oftentimes the culprit in a security lapse is an unknown workstation, server, application, or single script.

With a single doorway (microservices), a single approach to authentication (3Scale API infrastructure), and a map of my entire surface area using Swagger + APIs.json, I’m getting a much better handle on things. It will take a while to clean up all my legacy stuff and map everything out, but once I do, I can easily audit, monitor, deprecate, and ultimately secure the entire surface area of my infrastructure, because it is all done with an API-first approach. Then I’ll continue to address other aspects of my security, like SSL and app management, but for now, being able to map out my infrastructure is doing wonders for bringing it out of the shadows.
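To give a feel for how an APIs.json index makes that audit possible, here is a minimal sketch. The field names (apis, properties, a property of type "Swagger") follow the APIs.json format, but the sample document and helper function are my own illustration, not my actual index:

```python
import json

# Hypothetical APIs.json fragment -- URLs and API names are made up.
SAMPLE = """{
  "name": "Example Operation",
  "apis": [
    {"name": "Blog API",
     "baseURL": "http://example.com/blog",
     "properties": [{"type": "Swagger",
                     "url": "http://example.com/blog/swagger.json"}]},
    {"name": "Notes API",
     "baseURL": "http://example.com/notes",
     "properties": []}
  ]
}"""

def surface_area(apis_json_text):
    """List each API and its Swagger definition URL, if one is declared.

    Any API that comes back without a Swagger URL is exactly the kind of
    unmapped, shadow-dwelling resource an audit should flag.
    """
    doc = json.loads(apis_json_text)
    out = []
    for api in doc.get("apis", []):
        swagger = next((p["url"] for p in api.get("properties", [])
                        if p.get("type") == "Swagger"), None)
        out.append((api["name"], swagger))
    return out
```

Walking an index like this, and following each Swagger URL, is how a script could enumerate every endpoint I expose, turning "what do I even have running?" into a routine, automatable question.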