{"API Evangelist"}

An API Has No Value Until It Is Actually Used

I am putting a lot of thought into the value of an API request and response lately. I've talked about the potential of a Postman collection as a unit of measurement for the API economy. I'm also spending time breaking down the monetization strategies of the 700+ companies I track on, but I'm really looking to get beyond just API pricing, and actually measure any evidence of real value that I can.

This particular post is born out of a conversation with John Sheehan (@johnsheehan), CEO of Runscope, while we were in Sydney, Australia last month for API Days Sydney. John had mentioned that when people ask him which are the best APIs out there, he replies "none of them, until they are used." When you think about it, that really is the truth--APIs do not offer any value until they are actually put to use in an app or system integration.

It is easy to place the value in the actual API definition, or in the request or response from the API. There is also a lot of talk about much of the value being in the documentation, or possibly the SDK. You can even point to the value being that it is a public API, with easy on-boarding, that you can rely on and scale as you need. All of this may contribute units to the overall value, but at this point any API still doesn't have value to anyone.

As John says, the point at which an API has value is when it is used. Until that point, an API isn't a thing. I'm not saying the API call is the single metric to rule them all, but it is at this point that I feel you can really begin to measure the value an API delivers, and capture the exhaust from it. Honestly, I don't know exactly what I mean by all of this; it really is just an exercise for me to break down the moving parts of each API, so that I can better understand how APIs will potentially fuel the API economy--think what Runscope does, but for the monetization portion of each API transaction.



APIs.json As A Distributed Transport Layer For The API Economy

I'm working with multiple partners to define what I'd consider to be the first stops along the API lifecycle, when you use APIs.json as the scaffolding for your API operations. APIs.json is a machine readable format for indexing APIs that exist under a specific website domain, or are part of an aggregate collection designed for a specific project or objective.

To help me, and the folks I'm working with, better see the API lifecycle that APIs.json is enabling, I wanted to take a stroll through each of the stops along the API lifecycle, and think through the potential for additional formats, services, and tooling throughout the entire journey. I want to explore the possibilities for individual API providers, API consumers, and API service providers, as well as the aggregate level, where companies and individuals are delivering valuable tools and services to the overall space.

First, let's start with what I envision APIs.json enabling at the direct, one-to-one API provider and consumer level (a minimal example index follows the list):

  • API Metadata - Quick access to the most important info about any API: its title, description, image, endpoint, URL, and tags.
  • API Surface Area - (The Truth) - Using Swagger and API Blueprint, any developer can quickly quantify the surface area of an API or microservice.
  • Execute API - Include Postman Collections, making each API execute-ready, so consumers can load it into Postman and deliver the value behind any API.
  • On-Board API - One click reference to where an API consumer can onboard with an API.
  • Existing Server-Side Code - Locating existing server-side code in multiple languages.
  • Auto-Generate Server-Side Code - Locating services for generating server-side code in multiple languages.
  • Supporting Container Images - Where Docker, and other container images reside, providing instant deployment opportunities.
  • Existing Client-Side Code - Locating existing client-side code in multiple languages.
  • Auto-Generate Client-Side Code - Locating services for generating client-side code in multiple languages.
  • API Pricing - Understand pricing, access tiers, and partnership opportunities around API operations.
  • Terms of Service - Access, and make sense of, the terms of service that apply to API usage.
  • Licensing - Understand the licensing of the API, data, content, and code related to an API.
  • General Questions - Be able to ask, and get answers to common questions about any API.
  • API Conversations - Understand where to engage in conversations, and experience streams from common sources like Github and forums.
  • Augment APIs - Add endpoints, verbs, and parameters to existing endpoints, augmenting what any single API provider can deliver.
  • API Presets - Establish preset values for API entries, allowing the single or bulk publishing of data, content, and other values to APIs.
  • API Mapping and Domain Specific Language (DSL) - Bridging API definitions, and APIs, providing much needed aggregation and reciprocity for APIs.
  • API Sandboxes - Establish API sandboxes, based upon existing API designs, allowing developers to develop against an alternative API rather than a production environment.
  • API Simulations - Beyond just a sandbox, actually establish alternative APIs that provide specific simulations of production scenarios that an API would serve.
  • API Cookbooks - Assemble API cookbooks that help onboard API consumers using wizards, blueprints, and other established orchestration patterns.
  • API Testing - Initiate API testing tools and services, covering the surface area of each API.
  • API Monitoring - Initiate API monitoring tools and services, covering the surface area of each API.
  • Audit Security - Audit the surface area of any API that is indexed using APIs.json.
  • API Adjacent Monitoring - Initiate monitoring of other API adjacent resources like documentation, terms of services, and other essential building blocks of operations.
  • Visualize API Surface Area - Simple, embeddable visualizations that help us see the surface area an API provides.
  • Visualize API Resources - Simple, embeddable visualizations that help us interact with the resources APIs deliver.
  • Analyze API Surface Area - Simple, plug and play tooling and services for analyzing the surface area of any API.
  • Analyze API Resources - Simple, plug and play tooling and services for analyzing the resources APIs deliver.
  • Load API Response In Spreadsheet - Resources for loading and importing API responses into an Excel or Google Spreadsheet.
  • Craft experiences - Weave together experiences made up of multiple API resources, and the resources that are available to support them.
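
To make this less abstract, here is a minimal sketch of what such an APIs.json index could look like. The domain, URLs, and property types are hypothetical placeholders of my own choosing, but the overall shape follows the APIs.json format, with a properties array pointing to the machine readable pieces that enable the stops above:

    {
      "name": "Example API Operations",
      "description": "Machine readable index of the APIs at example.com",
      "url": "http://example.com/apis.json",
      "apis": [
        {
          "name": "Example Jobs API",
          "description": "Search and manage job listings",
          "humanURL": "http://developer.example.com",
          "baseURL": "http://api.example.com/jobs",
          "tags": ["jobs", "search"],
          "properties": [
            {"type": "Swagger", "url": "http://example.com/jobs/swagger.json"},
            {"type": "X-Postman", "url": "http://example.com/jobs/postman.json"},
            {"type": "X-Pricing", "url": "http://example.com/jobs/pricing.json"},
            {"type": "TermsOfService", "url": "http://example.com/legal/tos.html"}
          ]
        }
      ]
    }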

This list represents my master vision for stops along the API lifecycle that could be served via APIs.json, providing value directly to API providers and their consumers. Each of these areas would benefit both sides of the API coin, and as you can see, will add value to the APIs.json engine in a way that goes well beyond just API discovery--which is what APIs.json is known for.

It is hard for me to articulate, and map out, the opportunities that will be available when APIs.json is fully realized. Our goal is to push the number of APIs.json files available to a critical point, where this full lifecycle vision is a reality. Once this occurs, I see another plane of existence emerging, one that can be applied across hundreds or thousands of public or private APIs and microservices.

Once a critical mass with the number of APIs.json files is achieved, the wider opportunities for expansion, growth, and monetization will span three main areas in my mind:

  • API Definitions - Additional API definitions like Swagger, API Blueprint, and API Commons, providing machine readable access to vital components of API operations that can be baked into the overall API lifecycle.
    • API Pricing - A machine readable format for breaking down pricing, access tiers, and partnership opportunities around APIs indexed with APIs.json.
    • API Questions - A machine readable format for aggregating questions around API operations, including terms of service, privacy policy, and other elements that impact API integration.
    • API Conversations - A machine readable format for aggregating conversations that occur around API operations, across multiple channels like Github, Stack Overflow, Twitter, and more.
  • API Services - The opportunity for new API services to be developed on top of the APIs.json lifecycle.
    • Testing - Introduction of testing related services to known, and custom, API definitions.
    • Monitoring - One-time and scheduled monitoring services, triggered via APIs.json indexing.
    • Security - Scanning of the entire API surface area defined by an APIs.json definition.
    • Broker - Use of APIs.json in API broker related activity, crafting custom API backends and stacks.
    • Analyst - Delivery of vital industry and area data and content, using the APIs.json format, connecting analysis directly to sources.
  • API Tooling - The opportunity for new API tooling to be developed on top of the APIs.json lifecycle.
    • Search - Continued growth in the number of API search engines like APIs.io, providing options, and competition.
    • Collections - Establishment of common tooling for doing API roundups, collections, stacks, or any other grouping that is defined.
    • Notebook - Deployment of tools for saving, organizing, and remixing using APIs.json formats, either publicly or privately.
    • Spots / Hubs - Deployment of API spots and hubs that allow aggregation of API definitions, supporting building blocks, and API consumers in a wide variety of API areas.
    • Dashboards - Deployment of dynamic, APIs.json driven dashboards that generate visualizations, listings, and other detail via machine readable API definition formats.
    • Infographics / Reports - Deployment of dynamic, APIs.json driven infographics and reports that generate visualizations, listings, and other detail via machine readable API definition formats, in a portable format.

That is just a sampling of the definition, services, and tooling opportunities that are already emerging, and will eventually emerge, around an APIs.json driven lifecycle. The objective of this post was to flesh out my ideas around the one-to-one, and the wider API economy, opportunities of an APIs.json driven API lifecycle, for my own purposes, but also to communicate my thoughts to a handful of partners who are already executing on definitions, services, and tooling.

If you are curious about what this all means, and want more clarification about how your company, its services, and tooling fit into this, feel free to reach out, and I'll do what I can to help you understand where you fit. Some of the stops along this API driven supply chain I'm depending on existing efforts like Swagger and API Blueprint to execute, some of the areas I'm pushing forward myself, like API Commons, API Pricing, API Questions, and API Conversations, while other areas I'm depending on 3rd party individuals and companies to step up and own. If you think you might be interested in delivering an API definition, service, or tooling somewhere in the areas I've defined, feel free to reach out.

This post is a product of several conversations I'm having with folks, and the regular conversations that Steve Willmott (@njyx) of 3Scale and I are having around the APIs.json format. Additionally, we are currently working on the roadmap for APIs.io with Nicolas Grenie (@picsoung), which is the first of many tools being built on the APIs.json format. You can influence the roadmap for APIs.json via the Github repo, and influence the individual communities of each of the APIs.json sub-formats, like Swagger, API Blueprint, and API Commons, on their own sites and repos.

If you made it this far, thanks for listening! ;-)



What We Do In The API Community Influences How The Rest of The World Is Making Change

I was just talking with my friend Oliver Seiler (@0seiler) in New Zealand via email. Oliver is great at keeping me in tune with API related stories out of New Zealand. I was making sure he knew how much I appreciate people like him sending me regular updates, and that it is what makes API Evangelist go around—to which he replied, reminding me how important the work we are doing here in the US API space is to his own efforts.

Oliver told me what I hear a lot: "We still need to sell our story every single day and you wouldn't believe how much anything that comes from particularly you and 18F, but also ProgrammableWeb and GDS in the UK matters to our work." This isn't an isolated case; I hear it a lot from people I talk to in government, enterprises, and other institutions around the globe--the stories we tell are used to make change in how they think, and potentially how they operate. Our stories from the trenches become the stories they use to sell APIs to their bosses.

These reminders are what keep me going after almost five years of evangelizing. The stories we are sharing about API best practices, and the challenges we face across API design, deployment, management, monetization, evangelism, discovery, and integration, are important not just to the health of the overall API space, but to each of the public and private sectors that APIs are touching in 2015.

The message for me here is that we all need to keep pushing forward, while also making sure we tell our stories publicly, sharing our knowledge in real-time, because people are watching, and depending on us to help influence how things work in their businesses, institutions, and government agencies around the world. #NoPressure



The API Journey: We Do Not Always Get Our API Strategy 100% Perfect, But We Can Communicate, Learn and Evolve

Running the perfect API operation is pretty much a delusional dream. Even leaders like Twilio and AWS have platform and ecosystem produced problems on a regular basis. In my opinion, APIs are all about the journey, and while we may never get our strategy 100% perfect, we can communicate, and evolve along the way—this is what I consider the API journey.

To help demonstrate this in action, here is a post from the GSA, about the SAM API, on the US Government API Forum:

We've, rightfully, heard complaints about the SAM API languishing and the documentation not being as functional as it needs to be. We want to make sure users of the API feel comfortable using it to build real applications off of. We've adjusted how we're staffing the API, as well as raised the priority of the API in the eyes of our developers.

To you, this means:

1) At a minimum, we're sweeping through new GitHub issues weekly, but we will do everything to go through those issues more rapidly.
2) We're going to increase our transparency by making sure that anything in our backlog that's API-related is in a GitHub issue - if it's prioritized with a date, that'll be made clear, too. We want to make sure you know what's coming and on what days - and the docs will be updated to reflect what's currently implemented and what's coming.
3) We're going to publish our version management plan to the SAM API site so it's more clear what's going to cause a version change. One of the things about the SAM data is that regulations may change our data, so we should be treating some of it (like reps and certs) as variable-length instead of fixed to keep systems from breaking, so we absolutely document that.

We've already added some updates that include reference data that wasn't available before, and a few minor cleanups. Expect more as we ramp up our people on this, to be worked through by the end of next week. We appreciate being held to task on this - IAE's larger strategy relies on usable, exciting APIs, and we can't do that if we aren't serving you.

This is why I prescribe APIs: not because APIs are a technological wonder that magically fixes everything (they do, oh they do). I prescribe APIs because of the API journey, and if done right, the API provider can change, evolve, and learn to better serve their consumers, and in this case, their constituents.

APIs are born out of breaking down the legacy walls built up between IT and business operations, but they have also become about breaking down the walls built up between inside the firewall and outside the firewall. In the end, I do not think APIs are the thing that will change government or companies; it is the process of doing them, learning to open up and communicate, that will move the needle forward.

Something that will not happen at your agency, organization, institution, or business if you do not start on the API journey yourself.



More Pondering On My Own Microservice Definition

Like API, the term microservice has emerged as a force, with a meaning that is either very precise or very broad, depending on who you are. The only thing I'm sure of at this point is that the term microservice is very personal, and means very different things to different people, depending on where you stand in the industry.

Regardless, I can't help but consider the implications of the term, internalize it, and see what value I can derive from its meaning in my own world. This exercise leaves me asking, over and over: what is a microservice? I acknowledge I may never fully know the meaning, and that like API, it will be more of a journey than a destination.

First and foremost, in my world a microservice means simple, and small. Simple and small. Simple and small. Say it until you can feel it, my inner Buddhist colliding with API Evangelist. Simple and small penetrates all layers of meaning for me.

  • minimal surface
  • minimal on disk
  • minimal compute
  • minimal message
  • minimal network
  • minimal ui
  • minimal time to rebuild
  • minimal time to throw away

Ultimately I want all of this, coupled with the ability to achieve maximum scale with minimum effort. I want to do one thing, as well as I possibly can, with the minimal resources necessary, while delivering the maximum value at any scale—with the value benefiting myself, my API consumers, and the end-users being served, equally.

None of this is set in stone for me. I acknowledge this is my own perception, and that others will see microservices as something entirely different. Even so, I share my definition publicly, as I also enjoy reading others' definitions. I kind of think that is part of the whole microservices thing, sharing your own personal definition. In reality, microservices do not change my API mission, but they do allow the conversation to be distilled down, for a new generation of business leaders working to make sense of all of this.



Weekly API.Report For March 16th, 2015


Phew!! This week was hard. I just couldn't find the mojo to plow through, but I did it. A little late on Monday evening, but still so worth doing--putting the week into perspective.

I'm still trying to assess the best balance with what I post as individual news stories on API.Report. I think more of this could be pulled out as actual news stories, but will have to see if I have the time. Overall I am happy with how the weekly API.Report is evolving, but there is so much more I could do.

The Weekly API.Report represents the best of what I've read throughout the week, and is only what I personally felt should be showcased. Each news item comes with a link, and some thoughts I had after curating the piece of API related news. I'm trying to break stories down into as coherent buckets as I can; the buckets remain ever changing for me, but ultimately I will settle on a clear definition for each of the research areas.

I track on a lot of 3D printing stories, but few of them are directly API related--finally, here is one that moves the conversation forward:

A handful of acquisitions I noticed this week, with one more story on last week's IBM acquisition:

Several interesting API analysis related things I am tracking on:

Some API definition stories from me this week:

  • Crafting Exactly The API Definition You Need With Swagger Vendor Extensions - Swagger vendor extensions were new to me, so I wanted to share the fact that they exist, in case you thought Swagger wasn't extensible. I will be using, and adding to, the spec where it makes sense.
  • What is ALPS? - I'm working to better understand ALPS, and paying attention to conversations going on around it. What Mike and Mark are working on is important, and I just need to find some entry level use cases to help onboard people.

Some serious API deployment movement this week, with a couple of new offerings:

Some approaches to API deprecation:

Mostly my own API design thoughts this week:

Again, mostly my own stories, but a handful of API discovery items this week:

A wide variety of API evangelism stories:

API integration related stories, with commerce taking over:

Some interesting shifts, and additions in the world of API management:

Thinking I'll keep breaking API monitoring out of integration stories:

I can usually count on Zapier for one good API reciprocity story a week:

Like monitoring, I will keep breaking out API testing stories into their own category:

I can usually find some API visualization related stories, even if it is more visualization than just API:

Thought this was a very cool nod in the world of architecture:

Anything art related I can sneak in:

Two automobile, API, and data related items:

Business of APIs is always a good catch-all bucket:

Putting any interesting API related job posts under what I call careers:

A small CDN related nugget from AWS:

I call it city government, but put anything API, and data related to city operations here:

Always several cloud computing items to talk about between AWS, Microsoft, and Google:

Two items standing out specifically dealing with cloud storage:

Didn't quite know how to categorize this one, so simply tagged as code:

Magento integrations were a big part of the commerce stories:

Not too much worth talking about on the communications front:

Lighter load of stories on the containerization front:

Direct CORS stories aren't something you always see:

The always full list of data related stories I'm tracking on:

Some of the SDN related talks bleeding over into the data center:

Two database stories stood out this week for me:

Several education related items that caught my attention:

I'll keep an eye on election related stories as we get closer to 2016 here in the US:

  • Quick Sketch – Election Maps - I will be tracking on data, analysis, visualization, mapping and other areas in support of elections. I'm all about shaking things up here.

Two encryption related thoughts:

Always makes me happy when there are energy related stories:

Two sides of the enterprise API conversation:

Always lots of activity in the federal government:

The financial world is continuing to heat up with API conversations:

Another interesting view on using GitHub:

A handful of hackathon stories:

Apple-fueled healthcare stories:

One item to add to my history of tech stories:

API related talk in the home:

One hypermedia discussion to note:

Infrastructure conversations out of Netflix are often worth showcasing:

Some crossover with other areas in the world of Internet of Things:

Couple of items to note in the legal bucket:

I wish I had more library related items each week:

The always interesting concept of machine learning:

Three mapping related items:

Two interesting items on microservices that I read:

When it comes to data, I'm going to start slicing off open data specific stories:

Movement on the outdoors API front out of the government:

  • Open Data for the Outdoors - A nice call to action around the RIDB, and work in the federal government around a recreation API.
  • mheadd/node-ridb - My man Mark got inspired and built an SDK for the RIDB database. Nice!
  • Agencies tap into recreational data - A view a little closer to the government press side. I'm tracking on all the stories I come across, to show the full view of a story from all sides.

Always something payment related to discuss:

Only one entry in the area of policing this week:

Politics of APIs is where I put all the politically charged ideas from the week:

The growing concern of privacy:

PubNub is always good for some real-time stories:

Reporting stories coming out of SalesForce have caught my attention:

Anything related to scraping and Kimono is worthy of talking about:

Couple of SDK specific stories were interesting:

Lots of security items this week, with cross-over from a couple of other areas:

Two Single Page Applications (SPA) items:

Smart watches will probably be tracked on separately from IoT in the future:

A single social related item:

  • Introducing the MailUp App for Facebook - I do not see many announcements regarding new apps on Facebook, so I figured I'd record it when I do, keeping track of what is being deployed in this environment.

The hot topic of Software Defined Networking (SDN):

A story I did on spreadsheets sparked some interesting discussions, and finds, this week:

The important area of transparency:

An interesting take on how to educate consumers about API versions:

Wearables will be tracked on separately from IoT:

Two interesting weather items:

The API for Wikipedia sucked me in this week, and it is worth showcasing the wiki API tech behind it:

  • RESTBase docs - I did not know about RESTBase. I'm looking into adding it to my API deployment and management research. I just need to understand it better.

A handful of Women in Tech stories in my feed this week:

That concludes my report on what I read last week across the API space. I'm still working on pulling together a summary e-newsletter version, allowing people to get an executive summary of what I thought was important from the week--I am hoping it will be available next week. I'm also going to auto-generate some visualizations, and summary counts for each week. I'd like to see a tag cloud, and overall counts to help me understand the scope of news I cover each week.

As I did last week, I'm walking away with a better awareness of what is happening across the space. It isn't enough for me to read all of this API news in isolation, it helps to see it side by side with other news, allowing me to see and understand patterns that I may have missed. 

Thanks for reading. ;-)

 

Updated November 27, 2015: Links were updated as part of the switch from Swagger to OADF.

The Enterprise Will Make The Same Mistakes With APIs And Microservices That They Did With SOA, Because Essential API Concepts Go Right Over Their Heads

I would say the enterprise space fleet has successfully shifted course, heading in the general direction of everything API. SAP, Oracle, IBM, Microsoft, and the rest of the usual suspects have pledged their faith to APIs, mobile, and IoT, all in a very public way. For those of us who jumped off the SOA bandwagon a while back (getting on the API bandwagon, yeehaw), this can be amusing to watch, because the API bandwagon is better, y'know? ;-)

I sincerely hope that these large companies can make a shift towards an API way of life, or a microservice devops way of operating, or whatever the cool enterprise kids are calling it these days. Some companies will find success in these areas, while many, many others will just talk about it, and buy a wide range of vendor services that will help them talk about it—there will be a lot of money to be made (and spent).

The thing I fear is that many of the core principles of why this whole thing is working will go right over the heads of many technology and business leaders in the enterprise. Elements like simplicity, transparency, openness, and functioning in a decoupled, decentralized way. You are the enterprise; many of the essential ingredients of API are just fundamentally at odds with what you are. You are a big, complex, powerful, and political corporate presence—a beast that does not like being taken apart, reinvented over and over, and being forced to work in concert with other internal, partner, and public resources, in an open way.

This doesn't mean that some enterprise entities won't be successful—some will. I will be carefully watching the waves of enterprise organizations for the tech savvy few who internalize the core philosophy around APIs, successfully uploading their corporate soul to the API singularity. However, I predict I will have to work very hard to find these companies amidst the wreckage of the enterprise organizations that make the same mistakes they did with SOA, because APIs are just not compatible with their DNA, and ultimately will be rejected for reasons that are more enterprise than they are API.



Postman Collections As Unit Of Measurement For Transactions In The API Economy

I'm immersed in deep thoughts about the implications of machine readable API definitions like APIs.json, Swagger, and Postman Collections, which are being applied to APIs in some very different, but overlapping and complementary, ways. Currently I'm working to define the overlap between APIs.json and Postman Collections, and how they work together.

For me, any good API starts with a Swagger definition or API Blueprint, which is the single, machine readable truth of the surface area of any API--what I consider a fingerprint for what is possible with any API or microservice. Additionally, when I am working with any API, or stack of APIs, I am using APIs.json to organize them into a single coherent set of APIs, then relying on each API's properties to point to essential resources like Swagger, terms of service, pricing, and anything else I feel is essential to operations.

In addition to providing links to Swagger definitions for each APIs.json API entry, I'm also experimenting with adding a property that points to any Postman Collection for each API and microservice. I'll start by just calling the type "x-postman-collection", and providing a URL to a Postman Collection I've generated around the API call, providing a link between what an API is, and a measurement of the transaction an API produces.
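
As a rough sketch of how this might look, here is the properties array for an API entry in my APIs.json index, carrying the new type alongside the Swagger reference--the URLs are hypothetical placeholders:

    "properties": [
      {"type": "Swagger", "url": "http://example.com/jobs/swagger.json"},
      {"type": "x-postman-collection", "url": "http://example.com/jobs/postman-collection.json"}
    ]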

I'd say that Swagger is the promise of the value an API delivers, and Postman is a machine readable representation of that value, ready for execution—a machine readable unit you can load into Postman, or any other system, and actually experience the result. When this unit is sitting in your Postman client, you can see the value behind a single collection, but when you see a couple hundred of these collections in your Postman, you start seeing the important role they can play in the overall API economy, as a portable unit of measurement for API transactions.



Making The API Feedback Loop Machine Readable With APIs.json

If you have been following some of my recent stories, you know I'm heavily focused on APIs.json, as I work to organize my own stack of microservices using APIs.json and Swagger. I'm working hard to define additional, machine readable layers to my microservices index, layers that answer as many as possible of the questions I have about the microservices and APIs that I depend on.

The first machine readable definition I include in any APIs.json collection I'm building is Swagger, which provides a JSON or YAML definition of the surface area for any microservice or API that I depend on. The second thing I put to work for my own APIs is API Commons, which provides a JSON definition for the licensing of my APIs. Beyond that, I'm pushing forward a couple of other machine readable definitions that help me make sense of not just a handful of APIs in my own collections, but potentially thousands of APIs in the public space.

I just published two potential new machine readable definitions to add to my stack of APIs.json properties:

  • api-pricing - My quest to understand the API pricing across the API space, from providers like Amazon to Twilio, there are some healthy patterns to follow when crafting API pricing.
  • api-questions - My realization that nobody would give a shit about a machine readable terms of service, so I created a simple JSON list of questions and answers that can be applied to APIs (a rough sketch follows the list).
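
Since api-questions is still just my experiment, treat this as a hypothetical sketch of what one of these simple JSON lists might look like--the field names are mine, and will likely change:

    {
      "questions": [
        {
          "question": "Can I use the API for commercial purposes?",
          "answer": "Yes, on any paid access tier."
        },
        {
          "question": "Does the provider claim ownership of the data I store via the API?",
          "answer": "No, you retain full ownership of your data."
        }
      ]
    }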

These are two areas that I am trying to better understand as I build the machine readable index of APIs that I call the API Stack, which also helps fuel APIs.io. Another area I'm struggling to get a handle on is the message layer around APIs, and more widely, what I'd consider the feedback loop around APIs. So, what the heck do I mean by this? I'm not entirely sure; I'm still trying to figure it out in real-time--which is why I'm writing this post.

If an API is one of my internal APIs, the messaging, communication, and feedback loop starts with issue management in Github. Each API has its own Github repo as its home base, and any communication around its design, deployment, and management happens there. However, there may be other references to the API made on Twitter, Stack Overflow, or other channels, and I am looking for a way to aggregate them all in a single, machine readable way that is connected to the API.

To do this, I'm employing the same approach I'm using to get a handle on pricing, and other questions I have around API operations, and creating a new APIs.json API property type, which I'm going to call api-conversations. Right now I'm just going to include message entries for Github issues, but I will work to make it as inclusive as I can, covering any possible channel. We'll see where I go with api-conversations; right now I'm going to use it to manage the feedback loop within my own microservices stack, but then I'm also going to apply it to the API Stack, and see what it looks like when I try to aggregate the feedback loop around public APIs.
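
Since the format is still taking shape, here is only a rough, hypothetical sketch of how api-conversations might aggregate Github issue entries--the channel names and fields are placeholders I am experimenting with:

    {
      "conversations": [
        {
          "channel": "github-issues",
          "url": "https://github.com/example/jobs-api/issues/42",
          "title": "Rate limit headers missing from search endpoint",
          "status": "open",
          "updated": "2015-03-16"
        }
      ]
    }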



Going Beyond Excel As A Data Source For API Deployment And Focusing On It As An API Client

I've said it before, and I will say it again — Excel and spreadsheets will continue to be super critical for the growth of the API industry. There are an increasing number of solutions like APISpark for deploying and managing APIs using spreadsheets, something that will get easier over time, but so far I'm not seeing equal acknowledgment of the potential of Microsoft Excel as an API client.

The majority of the world's data is locked up in spreadsheets and CSV files. Something I learned during my short time in Washington DC is that the API community is going to have to court the legions of data stewards who spend their days in spreadsheets at companies and government agencies around the world, if we are going to be successful. The tooling for deploying APIs from spreadsheets has emerged, but we have a lot of work ahead to make it simpler and easier to use.

With the majority of the world's data locked up in spreadsheets, this also means many business decision makers have their heads in a spreadsheet on a daily basis, depending on the data, calculations, and visualizations that influence their daily decision-making. I'm seeing only light efforts around delivering API driven services in the spreadsheet, something that is going to have to grow significantly before the API industry can reach the scale we would like.

I know that the spreadsheet does not excite API providers and API integrators, but it is a comfortable tool for many in the business ranks, and if we are going to get them to buy into the API economy, and play nicely, we are going to have to accommodate their world. When thinking of spreadsheets and APIs, don't just think about delivering content and data to APIs, but also about how APIs can deliver vital content and data back to spreadsheet users—acknowledging that the ubiquitous tool can provide huge benefits as an API client, as well as a data source.



Get API Results Into A Google Spreadsheet By Pasting The Following Into A Cell

My API hero of the day is Alan deLevie (@adelevie) over at 18F. I was tweeting out my story "Going Beyond Excel As A Data Source For API Deployment And Focusing On It As An API Client" today, and he tweeted me back his slick usage of the ImportXML function to get Federal Communications Commission (FCC) Electronic Comment Filing System data into a Google Spreadsheet.

When you visit ecfa.link and scroll to the bottom of the page, you will see simple instructions on how to get the data into your Google Spreadsheet:
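
The pattern is a single Google Spreadsheets formula pasted into a cell, along these lines--the URL and XPath shown here are illustrative placeholders, not the actual values from ecfa.link:

    =IMPORTXML("http://ecfa.link/filings.xml", "//filing/title")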

This is what we need more of, something that should be a default for all data APIs. There are some other RSS, JSON, and XML goodies on the page too, but that is another story. Providing a simple copy / paste option for taking API driven data and making it available in Google Spreadsheets is what I'd consider a fundamental building block, getting us towards what I talk about in my post. Spreadsheet users need a wealth of simple, copy and paste resources like this that they can tap, from both public and private data sources.

You rock, Alan! I think his approach to providing a copy / paste ImportXML formula should be standard for all data APIs. We can start simple, by building a library of simple data sources like cities, states, and postal codes, but then graduate up to more advanced data sources, or even subscriptions to valuable, curated, premium data--the trick is going to be keeping it as dead simple as Alan made it.

I will have to think about adding the ImportXML copy / paste as a standard building block, along with RSS, XML, and JSON. It feeds nicely into my wider spreadsheet and API research. Thanks, Alan, for this story, and keep up the good work!



Transitory APIs: Intentionally Building APIs That Can Go Away At Any Time

I use many types of APIs. Some are public APIs operated by other people, and then there are the APIs I have designed, operate, and consume on my own. As I design, redesign, deploy, and manage my APIs, I'm in a fortunate place where I get to push the boundaries of API operations, without much of the friction of normal API platforms. I'm the only one providing and consuming many of my APIs, essentially running an API dictatorship.

Over time I have built quite a few APIs that are still up and active, but nobody is using them. I built them during some specific research, or to support a prototype, and once I was done with the core work, I did not need the API to actually be alive and running. At some point in the future when I revisit the research, I may want to fire the APIs back up, but for now it's OK if they are down, with a dormant Docker image and server-side code living within their requisite Github repos.

This is just one example of how I'm building APIs that could possibly go away at any time. The more I think about it, the more use cases I come up with. It is helping me rethink some of my architecture and workflow. Another potential approach I am contemplating involves having a single UI that may use one or many APIs, but those APIs aren't even live until I load the screen—which triggers the containerized deployment of the last known snapshot of each of the dependent microservices.
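
As a minimal sketch of that load-triggered deployment--assuming a local Docker daemon, and image and container names that are entirely my own hypothetical examples--the UI backend could do something like this before rendering the screen:

    import subprocess

    def ensure_api_running(image, name, port):
        # Start the last known snapshot of a microservice if it is not already live.
        # Ask the local Docker daemon which containers with this name are running.
        running = subprocess.check_output(
            ["docker", "ps", "--filter", "name=" + name, "--format", "{{.Names}}"]
        ).decode().split()
        if name not in running:
            # Fire up the dormant snapshot; it can be thrown away again at any time.
            # (A stopped container with the same name would need a docker start instead.)
            subprocess.check_call(
                ["docker", "run", "-d", "--name", name,
                 "-p", "{0}:80".format(port), image]
            )

    # Called when the screen loads, before the UI makes its first API call.
    ensure_api_running("example/jobs-api:snapshot", "jobs-api", 8001)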

If nothing else, I'm having fun. I'm not sure if all these scenarios are viable in the real world, but it makes for an interesting exercise, and provides me with a kind of fire drill approach to my workflow that lets me rethink my APIs, and the workflow around them, with each drill. For me, it shines a whole new light on the old "APIs can go away at any time" versus "no wait, you have to support them forever" argument. Docker and containerized deployment of my APIs are really warping how I see the services I depend on—fun, fun!



Crafting Exactly The API Definition You Need With Swagger Vendor Extensions

I was listening to the APIs Uncensored podcast last weekend, where Ole Lensmar (@olensmar) and Lorinda Brandon (@lindybrandon) sat down for a conversation with Tony Tam (@fehguy), the creator of the API definition format Swagger. There are a lot of interesting API informational nuggets throughout the podcast, some of which I'll be writing about, but this one is about something Tony mentioned—Swagger vendor extensions.

I've been using Swagger for a while, and specifically Swagger 2.0 since its release, and Swagger vendor extensions were a new thing to me—most likely user error. I don't ever sweat things I miss; because I consume so much information each day, things are bound to get past me. Anyways, Swagger vendor extensions are simply the ability to add any property to an API definition by using the "x-" prefix.
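
For example--and this is just an illustrative fragment I made up, not from any real spec--a Swagger 2.0 definition could carry a custom property like this:

    {
      "swagger": "2.0",
      "info": {
        "title": "Example Jobs API",
        "version": "1.0.0",
        "x-api-pricing": {
          "freeTier": true,
          "rateLimit": "1000 calls per day"
        }
      },
      "paths": {}
    }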

This is something we do with APIs.json as well, and it is something that I think becomes the R&D layer for any specification. Since I just learned about this ability, I do not have many examples of what has been done with these extensions. I found a post from Apigee demonstrating how you can add "capabilities such as caching, quota, OAuth, and analytics to your API with Swagger vendor extensions via a project we call Volos.js", and I came across Wikimedia using an "x-subspecs" property in the API definition for Wikipedia—now that I'm tracking on Swagger vendor extensions, I will look for them in the Swagger specs I come across.

I've come across blog posts about Swagger where people outline how they don't use it because they can't define something they want. Maybe they just didn't know about Swagger vendor extensions, like me, IDK. Regardless, I will be tracking on how people are extending Swagger using vendor extensions, and brainstorming some of my own ideas on how they can be used to craft exactly the API definition you need.



Augmenting A Read Only API With An External POST, PUT, And DELETE

I am revisiting some work that I started when I was working in Washington DC as a Presidential Innovation Fellow (PIF). Actually, there are several things going on here, a sort of perfect storm of API design thoughts. After reviewing hundreds of APIs, including APIs in the federal government, you start to want to either go mad, or start making changes to the API designs you are exposed to.

This leads me to the other layer of this story, a common question I get regularly, regarding whether or not there are any write APIs in the federal government—meaning, can you POST or PUT to a common government resource. Write APIs are important in government, and their scarcity reflects some of the systemic illnesses I feel exist in government IT. With that in mind, bundled with the regular process of reviewing an API implementation out of government—you get this late night API Evangelist story. Enjoy. ;-)

I was profiling the data.usajobs.gov jobs API, which included crafting a machine readable Swagger API definition.

After crafting this machine readable definition of the surface area of data.usajobs.gov, I am once again reminded of the lack of write APIs in government--the data.usajobs.gov API is GET only, providing just a single search endpoint. Don't get me wrong, I am thankful for the work they've done with the data.usajobs.gov jobs API; it is just my nature to study, then push and iterate on API designs anywhere I possibly can. Sorry, I have an addiction. ;-(

As I was profiling data.usajobs.gov, I was reminded of the work I did as a PIF in DC, and how, after the government shutdown, I wanted to see more APIs have an augmented layer to their operations—eliminating the ability to turn off the switch for government APIs. Meaning, I would like read only APIs like data.usajobs.gov to have one or many other APIs that sync with, and augment, API functionality, beyond what the original government agency intended. It may sound radical, but I think it is how we are going to get this stuff done.

To help understand what I’m talking about, I generated a basic evolution of the data.usajobs.gov jobs API design, allowing for potential POST, PUT, and DELETE capabilities, opening up the federal jobs API for community input:
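
Here is a simplified, hypothetical fragment of what that augmented Swagger definition could look like--the paths and summaries are illustrative, not the actual data.usajobs.gov definition:

    "paths": {
      "/jobs": {
        "get": {"summary": "Search jobs, mirroring the existing read only API"},
        "post": {"summary": "Add a job listing to the augmented layer"}
      },
      "/jobs/{id}": {
        "put": {"summary": "Update a job listing in the augmented layer"},
        "delete": {"summary": "Remove a job listing from the augmented layer"}
      }
    }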

Of course, this API isn't a real API--it is just a Swagger representation of one possible API that could exist to augment the read only API provided by data.usajobs.gov. My goal is just to think through how we might be able to help iterate on existing API resources being provided by government agencies, allowing for extending, building upon, and potentially crowdsourcing assistance in the management of public data. I'm not saying this should be a fully open API, available to anyone, but the idea of a well managed resource, or possibly multiple well managed resources, for the same public data is an interesting thought.

This approach would allow for the evolution of public data beyond what the government can do on its own. External entities could build upon government data, like the jobs data from data.usajobs.gov, and then data.usajobs.gov could decide if any 3rd party resource becomes worthy of paying attention to. I'm thinking about applying this type of model in other areas of government I am working through, like the FAFSA API, and some of the API efforts I'm seeing out of the Department of Interior and Department of Energy.

In the end, this post is primarily about the potential in augmenting existing read only APIs with an external POST, PUT, and DELETE. I'm not saying the data.usajobs.gov jobs API needs this; it just happened to be the API I was profiling when this story came up, and it seemed as good an API as any to leverage as part of this story.

Not much else will happen for me on this topic. I am not going to develop this API. I just wanted to add the data.usajobs.gov jobs API to my federal API stack, and along the way I got sidetracked with adding the POST, PUT, and DELETE layer of this story--damn ADD! ;-)

I'm going to find a couple of other federal government data sources to possibly push this idea forward, and eventually I'd like to see a single API, developed on top of a government resource, surpass its original source in quality and value, resulting in the original agency accepting the data back into their internal systems. When it comes to data stewardship, open data, and APIs in government, it is all about trust—without trust, any external, outside-in effort around open data and APIs in the federal government will be rejected.



A Glimpse At What I Am Imagining For API Driven Analysis, Visualization, And Beyond

I came across an interesting piece of technology today while doing news curation for API.Report: RASON, an interesting approach to API driven analytics, and potential UI and visualization, that kind of resembles what I have been envisioning as one possible future. The analytics tool is created by a company called Frontline Systems, and I'll let them articulate what it is:

RASON™ software is designed to make it much easier to create, test and deploy analytic models that use optimization, simulation, and data mining. RASON stands for Restful Analytic Solver™ Object Notation.

RASON targets analytical professionals, Excel power users, and web app developers, but here is where it gets over my head: "Problems you can solve with the RASON service include linear programming and mixed-integer programming problems, quadratic programming and second-order cone problems, nonlinear and global optimization problems, problems requiring genetic algorithm and tabu search methods -- from small to very large." Sounds impressive to me!

I signed up and played with RASON a little bit, but it wasn't as intuitive as I hoped, and I think I have a little more to learn. The RASON models are very cool--I like the airport hub example--I just don't have enough knowledge to make it work right now. However, I'm digging the idea, and it reflects what I've been seeing in my head when it comes to defining API driven analysis--when you connect that with API generated visualizations, hypermedia, spreadsheets, APIs.json, and more, I start to get a little too excited.

Anyhoo. Just sharing a new technology that I found. Once I learn more about RASON, hopefully I will be able to see where RASON fits into the bigger API life-cycle, and share a little more.



What is ALPS?

I was watching an open thread around ALPS over at API Craft, something that is on my working list to better understand, apply more in my world, and tell the story of all along the way. ALPS author, and API visionary (;-), Mike Amundsen (@mamund) responded to the thread with a nice overview, which I wanted to repost and share with you.

So, what is ALPS?

  • ALPS is not a runtime format like HTML or HAL
  • ALPS is not a designtime format like RAML or Swagger

ALPS is a Profile format for describing the bounded context of a service. ALPS can be used as source material for designtime formats like RAML, WADL, Swagger, WSDL, etc. on the server side.

ALPS can also be used as source material for client-side frameworks like Ember.js, Angular, etc on the client side.

ALPS describes the operations (actions) and data elements of a service. That's all. That description is the same no matter the designtime tooling, protocol, or message format used. That description is the same whether you are implementing code on the client-side or server-side.

"Hence state is maintained SERVER SIDE by mapping the URL's when it is implemented... correct?"

ALPS has nothing to say about how or where state is maintained. You can do whatever you wish when you implement the bounded context ALPS describes.

ALPS tells you the WHAT of the service, not the HOW.

For example, you can use the ALPS document to implement a service that employs your API Chaining on the server side. Or you can use the same ALPS document to implement a classic hypermedia-driven service using Cj or some other hypermedia-rich format. You can also use the same ALPS document to implement a simple JSON-object CRUD-style service.

When you use ALPS as the "shared bounded context", you can be confident that each of the services, regardless of their local implementation choices, is supporting the same bounded context. This is especially handy when you want to provide the same operations using different implementations (e.g. JSON-object CRUD for mobile, HTML for browser, etc.).
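
To ground Mike's overview for myself, here is a minimal, hypothetical ALPS document describing a single search operation--I am still learning the format, so treat this as a sketch, not a reference example:

    {
      "alps": {
        "version": "1.0",
        "descriptor": [
          {"id": "jobTitle", "type": "semantic",
           "doc": {"value": "The title of a job listing"}},
          {"id": "search", "type": "safe",
           "doc": {"value": "Search job listings by title"},
           "descriptor": [{"href": "#jobTitle"}]}
        ]
      }
    }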



Visualize Your Cloud Presence Using Mohiomap

There are lots of good visualization stories recently, or maybe I'm just more focused on API driven visualizations. Who knows? This particular post is an API story because Mohiomap, the company at the center of it, is using APIs to accomplish their "vision", not because they have an API—which actually would be a nice way to pay it forward, but that is another story. Regardless, I think what Mohiomap is up to is interesting on a lot of levels, but primarily because in coming years we need to be able to better understand this new, cloud-based world we are creating for ourselves.

Why am I blogging about this? Because Mohiomap is driven by APIs, and if done right, it can make a big difference in how we see our virtual selves, whether that is the business or brand personas we are crafting online, or our personal selves spread across Twitter, Facebook, Dropbox, and other API driven platforms. We currently can easily see our personas within each silo, like a Facebook profile, Twitter profile, or Github profile, but understanding ourselves across the places we frequent online is much more difficult—something only possible when you begin using APIs.

I've had a visualization research project on my radar for some time now, but I think it's time that I start publishing it for everyone to enjoy. API driven visualization is an important layer of my overall API economy research, and whether it's for the meta layer of APIs, or for the valuable resources they are serving up, like wall posts, messages, photos, videos, or business documents--visualizations will continue to drive how we evolve our understanding of the digital world that is unfolding at breakneck speed.

With the addition of Mohiomap, I’m starting to develop quite a list of API driven visualization services and tooling. Enough to warrant its own research area, and be given a certain amount of attention each week. Ultimately it will depend on what the community delivers, but I’m pretty sure that visualizations will dominate the stories here on API Evangelist for the foreseeable future.



Targeting Some APIs In My Stack For House Cleaning And Maybe Some Design Iterations

As I look through more APIs, I don't just play around in their developer portal and look at documentation; I actually get my hands dirty generating Swagger definitions, authenticating, and making calls to an API. There is no better way to get to know an API than generating a Swagger definition and integrating with it—when done, you always walk away with a new perspective.

Once I get more of the APIs profiled in my API Stack, I'm going to see if I can configure my API editor to let me iterate on other people's API designs a little bit. As an API consumer, you don't always get a voice in the design of the next version of an API, but with API definitions like Swagger and API Blueprint, you can actually make edits, and then share them back with a provider. Of course, it is still up to the provider to decide what they accept, but there is no better way to make suggestions than a machine readable API definition that is ready to go.

When I get around to doing some housecleaning on these API designs I generated, and possibly making some design iterations, I'll focus on some of the smaller operations, leaving the bigger teams to do their own heavy lifting. Also, my goal isn't just to iterate on API designs to help the original provider improve on their design; my goal is to iterate on, and improve, the API designs in my stack. The more designs I play with, the more designs I have in my library, and the more I get a feel for what I think API designs could, or should, look like.

I can't fix all the APIs in the world, but I can improve on the hard work that has already occurred, and who knows, maybe some providers will see it for the compliment that it is, and listen to some of this API designing that is occurring out loud.



@Broadcom, I Am Going To Need Your Switches To Support Virtualized Containers So I Can Deploy My Own APIs Too

While processing the news today over at API.Report, I came across a story about Broadcom delivering an API for managing their latest network infrastructure. The intersection of Software Defined Networking (SDN) and Application Programming Interface (API) is something I’m paying closer attention to lately. Hmmm. SDN + API = Brand New Bullshit Acronym? Meh. Onward, I just can’t slow down to care--{"packet keep moving"}.

At the networking level, I'm hearing Broadcom make a classic API infrastructure argument: "With the OpenNSL software platform, Broadcom is publishing APIs that map Broadcom's Software Development Kit (SDK) to an open north bound interface, enabling the integration of new applications and the ability to optimize switch hardware platforms," with examples of what you could build including "network monitoring, load balancing, service chaining, workload optimization and traffic engineering."

This new API driven approach to networking is available in the Broadcom Tomahawk and Trident II switches, looking to build up a developer community who can help deliver networking solutions, with Broadcom interested in giving, "users the freedom to control their technology, share their designs and boost application innovation.” Everything Broadcom is up to is in alignment with other valid Internet of Things (IoT) efforts I’m seeing across not just the networking arena, but almost any other physical object being connected to the Internet in 2015.

I think what Broadcom is doing is a very forward leaning effort, and providing native API support at the device level is definitely how you support "innovation" around your networking infrastructure. To keep in sync with the leading edge of the current API evolution as I'm seeing it, I would also recommend adding virtualized container support at the device level. As a developer, I am thankful for the APIs that you are exposing, allowing me to develop custom solutions using your hardware, but I need you to take it one level further--I need to be able to deploy my own APIs using Docker, as well as work with your APIs, all running on your infrastructure.

I need your devices to support not just the web and mobile apps I will build around your hardware, and the API surface area you are providing with the new Tomahawk and Trident II switches; I need to also plug in my own microservice stack, and the microservices that vendors will be delivering to me. I need the next generation of switches to be API driven, but I also need to guarantee it is exactly the stack I need to achieve my networking objectives.

That concludes my intrusion into your road-map. I appreciate you even entertaining my ideas. I cannot share much more detail on what I intend to do with your new SDN and API driven goodness, but if you work with me to let me innovate on your virtual and physical API stack—I feel that you will be surprised with what the overall community around your hardware will deliver.



An Outside-In Approach Will Play A Critical Role In Driving The API Economy

I'm a big fan of private APIs, and am all for keeping API access tailored to meaningful groups of users, versus just opening up data to the public without first thinking critically about the possible pros and cons. With that said, I always encourage API providers to seriously consider the outside-in effect, where APIs are designed to be as accessible as possible, allowing unintended things to occur around your API resources.

It isn't just about the developers who are directly working with your API, it is also about the entire API industry, and the service providers that serve it. A number of companies exist across the API space, providing valuable products, services, and tooling derived from one or many API platforms. The API space is big enough now that it isn't just about the handful of strongest players--it is about the entire community.

An example of this is API ChangeLog, a monitoring service for API documentation. API ChangeLog keeps an eye on 68 of the leading APIs, tracks whether or not they have updated their API documentation, and keeps developers informed of the changes. These types of services benefit both API providers and consumers, making for a healthier space overall. While some API providers are good at keeping developers in tune with documentation changes, not all of us are able to keep up, and sometimes developers just aren't tuned in--API ChangeLog helps fill in the gaps.

Without outside access to API resources (like your docs), services like API ChangeLog would not exist, leaving developers to fend for themselves, and not providing valuable industry data that can be used by anyone. Services like API ChangeLog will be the lifeblood of the API economy, providing valuable exhaust that informs API driven markets as they grow, and helping give these markets some of the positive forces they are so well known for.

Keep your APIs, and as much of their supporting building blocks as possible, accessible, and while you are at it, make sure to index them using APIs.json, helping services like API ChangeLog provide vital data to developers, and the wider API economy.
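To make that last point tangible, here is a minimal, hypothetical sketch of an APIs.json index, including a documentation property that a service like API ChangeLog could discover and monitor--the domain, API, and URLs are all made up for illustration:

    {
      "name": "Example API Operations",
      "description": "Machine readable index of the APIs available at example.com.",
      "url": "http://example.com/apis.json",
      "specificationVersion": "0.14",
      "apis": [
        {
          "name": "Example API",
          "description": "Programmatic access to the example.com catalog.",
          "humanURL": "http://developer.example.com",
          "baseURL": "http://api.example.com",
          "properties": [
            {
              "type": "Swagger",
              "url": "http://example.com/swagger.json"
            },
            {
              "type": "X-documentation",
              "url": "http://developer.example.com/docs"
            }
          ]
        }
      ]
    }

With an index like this in place, a monitoring service only has to crawl for apis.json files, pull out the documentation properties, and start watching for changes.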



What Exactly Does Your API Do?

A short, concise, portable description of what your API does is one of the most critical building blocks of API management, and is essential to reducing friction when new users are on-boarding with any API. It is easy to design and populate your API developer portal from a perspective that is very much in the know--I do it all the time. Structuring your portal to reach the widest possible audience can be hard—it is why I’m here! ;-)

Here is a great example of providing all the right elements, but leaving out the essential detail that new API consumers will need. Visit the Dailymotion developer area, land on the home page, and without clicking, tell me what Dailymotion does.

[Screenshot of the Dailymotion developer portal home page]

The portal has all the right moves, except it neglects to say anything about video, or show a single video image. I guess you could argue that anyone landing here will probably be getting there via the main Dailymotion site, with the proper knowledge of what is going on. This is probably true, but there will always be people landing on the API developer portal page without any clue of what Dailymotion does—do not make them click to find out.

This really isn’t that big of a mistake, and something that only someone like me, who looks at numerous developer portals, would think about. For Dailymotion, all they have to do is update the FMA area, put video in the title and short description, and maybe include a video image in the background. Boom--everyone understands the value offered by Dailymotion, in two seconds or less.

The description you provide on your API portal landing page is your first impression, make it count. It is also likely that the press, bloggers, and other 3rd parties will copy / paste that description to describe what you do on remote websites—so make it good!
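This same concise description is also exactly what belongs in the machine readable index for your API. Here is a quick, hypothetical sketch of how that first impression could be baked into an APIs.json entry--the description, image, and tags are mine, not Dailymotion's:

    {
      "name": "Dailymotion API",
      "description": "Search, upload, and manage videos, playlists, and channels on Dailymotion.",
      "image": "http://developer.dailymotion.com/video-icon.png",
      "tags": ["video", "streaming", "media"]
    }

Get the human version right on your portal landing page, and the machine readable version comes almost for free.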



Weekly API.Report For March 9th, 2015


This is the third week in a row I've managed to do this weekly report, and this one took me about 9 hours of work, so I can see that the amount of work will vary pretty dramatically each week. Doesn't matter, I'm in for the long haul now. I'm guessing Monday morning is going to be the common release date, as I start wrapping up on Saturday, but often can't do all the heavy lifting until Sunday.

The Weekly API.Report represents the best of what I've read throughout the week, and is only what I personally felt should be showcased. Each news item comes with a link, and some thoughts I had after curating the piece of API related news. I'm trying to break stories down into buckets that are as coherent as I can make them--the categories remain ever changing for me, but ultimately I will settle on a clear definition for each of the research areas.

You don't often see me lead into a story with accounting, but Xero is pushing forward with their world domination plan, worthy of highlighting:

There were a number of acquisitions that were worth showcasing this week:

Advertising is not something you usually see here, but at the intersection with gaming, I'm a little more interested:

I am working to pull together an analytics specific research project over the next week. It is one of those areas I'm carving off of my data research:

I am moving API definitions out of the API design realm, into their own area where I can showcase the benefits of machine readable formats:

  • Apigee Product Highlight Video: SmartDocs - I thought this was an interesting highlight of SmartDocs on the Apigee platform, or as I call them, interactive docs. I like seeing how each of the API providers view the benefits of machine readable API definitions like Swagger--Apigee is definitely invested in Swagger.

These are my thoughts on API deployment for the week:

  • We Need An Open Library Of The Most Common Utility API Implementations - This seems like a no brainer to me. Someone should step up and build a platform to house these, and figure out some way of generating revenue around it, in a way that still offers the open source designs for free. Maybe premium services, or custom deployments, could be enough to pay for it? It is another area where, if someone doesn't step up, I may start creating and publishing designs to a Github repo.

Some interesting API design discussions this week, from the pragmatic to the RESTafarian:

As usual, I'm leading the API discovery conversation:

A handful of API evangelism discussions throughout the week:

The event season is warming up, with lots of API events to attend:

I predict the API integration conversation will keep dominating the landscape:

Lots to learn in the area of API management this week:

  • A Little Care Package for PHP Developers - A pretty straightforward code library story, which I usually do not share as part of the weekly update, but I like well told stories that show API providers care about their devs.
  • How Promoting a Developer Ecosystem Strengthens an API - Great story from Mark over at PW. These types of stories are important to motivate API providers in a positive way. 
  • Adding Four New Building Blocks Providing An API Management API Blueprint - You will see more work from me in this area, as I define a base blueprint API providers can use to provide an API for their API management infrastructure. I'm using 3Scale to drive the base definition, as I use them for my own infrastructure.
  • A note about rate limits - A nice, transparent look into API rate limits over at Keen.io. I love it when platforms break down their limits, and the logic that goes into that. It goes a long way to building trust. 
  • Top 20 Intuit Developer Questions - I like the idea of doing a top X FAQ blog post regularly. As an API provider you should also have a dedicated FAQ page, but grabbing the top questions, and publishing on the blog regularly can help get the word out!

I am hoping that the API podcasts that have emerged keep it up, and new ones continue to emerge:

  • APIs Uncensored - Another episode from SmartBear on their APIs Uncensored podcast. They had Tony Tam over from Swagger, for a very interesting conversation. 

Artificial Intelligence (AI) is always a fun look at a possible future:

Some stories from the world of connected automobiles:

A few moves this week on the banking front:

An interesting enough API launching in the online billing space:

One area I will profile deeper in coming weeks is calendaring:

Always good stuff to cover in the world of city government operations:

Lots of things to watch in the cloud computing wars:

The container movement always has interesting stories and players:

Just keeping the API and copyright conversation going:

Like net neutrality, cybersecurity will continue to leak into my weekly roundup:

  • The Democratization of Cyberattack - All the cybersecurity rhetoric is like a train or car wreck for me, I can't help but tune in and be captivated by the 1984-esque tone of it all.

Data continues to lead the API conversation:

Always like covering the nextgen databases like Orchestrate, but cloud blueprints are interesting too:

You won't see DNA too often in the roundup (I hope):

Interesting DNS stories, from doing the Internet at global scale:

Embeddable is still critical to API integrations:

  • Embed Twitter-hosted video on your website - Ok. Twitter investing on the embeddable front. Just when I'm about to de-emphasize some of the embeddable research I'm doing, a big player like Twitter steps up and makes another investment. #value
  • Google Maps API Checker Uncovers Mapping API Problems - This is interesting, because they made an API to help debug and solve the problems faced when using their embeddable, JS API. I think this is something other providers of embeddable JS APIs should consider. Paying attention to what pioneers like Google Maps API are up to is important. 
  • Twitter introduces video embeds for sites - Just a little outside perspective on what Twitter is up to with their new investment in the embeddable space.

An encryption story to keep an eye on:

I wish there were more environment related API stories available:

As usual the federal government is dominating, or maybe it's because I'm obsessed with the area:

Couple of financial related items:

I am reminded of how important FOIA is, and will continue to be, in the work I do:

This was an interesting piece from the world of forms:

APIs will continue to evolve in the gaming sector this year:

More amazing GitHub lessons:

A couple of interesting hackathons occurring:

GE shows it is interested in the healthcare space:

Adding to my history of compute archives:

Some fun and disturbing thoughts from the connected home:

An idea I had this week, that generated some great conversation:

Identity related products I'm researching further:

Of course, more Internet of Things (IoT) stories:

A couple of healthy API career options emerged this week:

One of the more mature layers to the API space, mapping:

I keep saying it, keep an eye on Microsoft, and media is the focus this week:

Google dominated the messaging discussion this week:

Couple of microservices conversations I was paying attention to:

Continuing to demonstrate that mobile is top of the list for big companies in 2015:

I like stories about API mocking:

Net neutrality crept into my roundup. I track on lots of stories in this area, but they rarely make it into the API conversation:

One of the top, new growth areas for me in the API conversation is networking:

The API Journey Or What Is the Point of an API, By Tony Hirst (@psychemedia)

Tony Hirst (@psychemedia) wrote an interesting story about what I would consider the API journey, which he called “What’s The Point Of An API?”. I’m the first to call bullshit on the term API, which for me has always been more storytelling magic than technical reality. In my opinion, the point of an API is rarely the actual technical implementation—sorry devs. ;-( The point of the API is the journey, for the API provider, and the API consumer (both dev and end-user).

The point of the API is about the API provider reducing friction when it comes to understanding, and putting to work, valuable digital assets--from data and content, to more programmatic, algorithmic elements like image filtering or video encoding, to more physical elements like a Nest thermostat, webcam access, or drone controls. An API does not automatically equal success, and it takes so much more than just launching an API to establish something that delivers value, and is sustainable--something that can be easier said than done in today's volatile, digital environment.

The point of an API is about API consumers having the freedom to discover, access, explore, and put to use valuable digital resources that they couldn’t deliver on their own, in new and interesting ways. Developers need to be part of a feedback loop with API providers, one that helps iterate APIs, evolve them toward being more experience based, beyond their resource oriented roots, and integrate them in meaningful ways into the systems, apps, and devices end-users depend on the most.

The point of an API is about empowering the end-user with applications that deliver value, all while opening up as much as possible of the backend, and the pipes that are delivering valuable API driven resources, giving end-users a voice throughout the supply chain, using open standards like OAuth—bringing much needed transparency to the process. This isn’t just about tech, it is also about developing sustainable businesses built around applications that truly deliver value, and that along the way also protect the interests, security, and privacy of the end-user.

The API is not purely the technical elements, it is made up of the technical, business, and political building blocks, which include the documentation, SDK, terms of service, pricing, and the many other essential aspects of API operations. APIs are also made up of their projections into the user's environment, and back again, facilitating a real-time journey that, if done right, can benefit the API platform, the developers who build upon it, and the end-users who ultimately define what the point of any API is.



Slowly Adding The People Layer To The API Evangelist Network

I'm adding a new layer to my monitoring of the API space, what I consider to be the people layer of the API Evangelist network--the actual people who are executing on much of what I talk about across my research and storytelling. This is all born out of the network of people I already talk to regularly to get the stories, and details, for the research that I publish.

I have been tracking many of the doers in the space for a while now, behind the scenes, and this is just about exposing a select handful of individuals who can help you with some of the areas I discuss in my research and storytelling.

Up until now, my research has been about showcasing:

  • News
  • Companies
  • Tools

All I’m doing now is exposing some of the people who are doing interesting things as well, and are open to being contacted about their work, to discuss how you can potentially tap their expertise to help you achieve your API objectives. Over the next couple weeks, you will see me start listing individuals who are open to helping out in the core areas I research:

  • API Design
  • API Deployment
  • API Management
  • API Discovery
  • API Integration
  • API Evangelism
  • API Monetization
  • API Security

I already showcase companies in many of these areas, and track the features, or building blocks, employed to deliver the products and services they offer--all I’m going to do now is provide more information on individuals who can help execute in these areas as well. Along the way you will also find me showcasing specific companies and tooling--API Evangelist partners like 3Scale, WSO2, and Restlet, but also other products I’m personally invested in like APIs.json, and the companies and tools I use daily like Swagger, API Blueprint, APITools, API Science, and Runscope.

After five years of doing this, I’m trying to understand what scale looks like for API Evangelist. My objective is to keep funding my research, producing the short form (blog), and long form (guides, white papers, and blueprints), while keeping it all rooted in what the leading API providers, and ultimately the people behind them are doing on the ground. With this in mind, you will find more areas of my research, providing links to people actually doing these things in the field, and you will also hear more about the conversations I am having with these people as they execute their craft.

As always, if you, or your company, is doing something interesting in these areas, let me know—and if you are an individual looking to freelance in these areas, or build a consultancy delivering services in these areas, let me know as well. I can’t guarantee I’ll include you in my research, but if you are doing interesting things, you’ll stand out, and I’m sure I can find some way to showcase your work in a way that helps you, me, and the wider API space. Also, if you need help with a project in one of these areas, feel free to reach out, and I’ll do my best to point you in the right direction.



Postman Collections Will Take Your API Productivity To The Next Level

If you are an API developer, it is likely you have used Postman, the simple tool for building, testing, and documenting your APIs. I have Postman open as a Google Chrome App, which allows me to make API calls as I’m designing, developing, and integrating with the APIs across my world. It opens up the requests and responses of the API calls I’m making, giving me more insight into how things are actually working (or not).

One of the key aspects of Postman is the collection, which is described as follows:

A collection lets you group individual requests together. These requests can be further organized into sub-collections/folders to completely mirror your API. Requests can also store variations and responses when saved in a collection. You can add metadata like name and description too so that all the information that a developer needs to use your API is available right where he(she) needs it. Collections are listed in the sidebar alphabetically. You can search through collection requests using the search form for quick access.

To me, Postman Collections are API discovery at the API transaction level. They allow you to define a single unit of currency in the API economy, save and organize those units, and execute them at any point in your API lifecycle. Postman Collections measure not just an API transaction that has happened, they describe future transactions that can happen, complete with any relevant metadata that will be needed along the way.
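For anyone who hasn't cracked one of these files open, here is a minimal, hand-rolled sketch of a collection containing a single request--the API is hypothetical, and the actual collection schema carries more fields (ids, folders, ordering) and has evolved over versions:

    {
      "name": "Example Video API",
      "description": "Everything needed to execute the core Example API calls.",
      "requests": [
        {
          "name": "Search Videos",
          "method": "GET",
          "url": "http://api.example.com/videos?q=apis",
          "headers": "Accept: application/json",
          "description": "Full text search across the video catalog."
        }
      ]
    }

Everything a consumer needs to replay this transaction--method, URL, headers, and a human readable description--travels along in a single, portable JSON file.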

Along with Swagger, and API Blueprint, I’m working to better understand how Postman Collections impact, and fuel the API lifecycle, including conversations with Abhinav Asthana (@a85), Founder and CEO of Postman, about their roadmap. Postman Collections are not just about defining, organizing, and executing your API calls in the Postman client as a solo API developer, they are also about collaborating with other key players involved throughout the API lifecycle.

I’m setting up my API Stack as Postman Collections, to help me understand how I can use the format to improve my own API workflows. My goal is to ensure Postman works seamlessly with all stops along my own API and microservice lifecycle, from design to evangelism, and is plug and play as a machine readable element of APIs.json.