API Evangelist

What Do I Mean When I Say APIs Are Just The Next Step In The Evolution Of The Web?

I remember the vision clearly from 2004, when I first changed the URL for my Delicious social bookmarking account to make it return a list of bookmarks as XML instead of HTML. It was a vision of the programmable web, where everything I explored on the Internet wasn’t just consumable: right below the surface of any website or application I was using, there was also a machine-readable version, allowing me to build whatever I desired.

People are often surprised when they realize I do not have anything to sell them, and that I evangelize that APIs are not some new product, but just the next step in the evolution of the web. This is easy to say, however it can be much harder to demonstrate to the “normals”, leaving me always hunting for easy-to-understand API implementations I can use to help bring people closer to understanding.

Steve Ziegler (@stevezieglerva) introduced me to a great new example of this in the wild. It is an API I hadn’t come across before, and I was pleasantly surprised to see it is a partnership between the Departments of Commerce, Energy, Interior, State, and Transportation, the EPA, Health & Human Services, NASA, the National Science Foundation, the Smithsonian, USAID, and USDA.

According to its own description, the Global Change Information System is:

The US Global Change Research Program (USGCRP) has established the Global Change Information System (GCIS) to better coordinate and integrate the use of federal information products on changes in the global environment and the implications of those changes for society.

For me, the Global Change Information System is an example of how websites, linked data, and APIs should work in concert, but it is also something I understand very little about how to actually pull off myself. The GCIS platform organizes an amazing amount of information, along with all the people, organizations, and relationships involved, in a very elegant way. You can see it in action by browsing the database, via the menu in the top left corner.

Immediately I notice how structured everything is, and then as I scroll to the bottom I see that everything is available in a machine-readable format. What is even cooler is that it isn’t just available as JSON; you get it in YAML, Turtle, RDF, and some formats I’m not familiar with. Then, of course, you get a robust, yet simple web API as well.
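To give a sense of how those multiple formats surface, here is a minimal sketch of building URLs for the same GCIS resource in different representations. The base URL, the file-extension convention, and the example resource path are my assumptions from browsing the site, not an official reference.

```python
# Sketch: the same GCIS record, requested as different machine-readable
# formats by swapping a file extension on the resource URL. The base
# URL and extension convention are assumptions from browsing the site.
BASE = "https://data.globalchange.gov"

def resource_url(path, fmt="json"):
    """Build a URL for a GCIS resource in the given format."""
    return f"{BASE}/{path.strip('/')}.{fmt}"

# The same hypothetical report, in three representations:
for fmt in ("json", "yaml", "ttl"):
    print(resource_url("report/nca3", fmt))
```

The nice part of this style of API design is that the human-readable HTML page and the machine-readable versions all hang off the same resource path, which is exactly the "right below the surface" quality I remember from the early programmable web.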

I’m impressed with the amount of detail available in the Global Change Information System, and the amount of thought put into the relationships between all the information and actors involved. It makes me optimistic about what can come out of government, and that something so forward thinking is being applied to an area as important as the environment.

I’m just getting started reviewing the Global Change Information System API, and will make more time to evaluate how it works under the hood, maybe generating a Swagger spec for the interface to help me better understand it. I’m also going to reach out to the team and see if I can get more information on the story behind it, and possibly what the roadmap looks like, making it likely you will see more stories about it in the near future.
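For a taste of what such a Swagger spec might look like, here is a minimal Swagger 2.0 sketch covering a single, hypothetical GCIS endpoint. The host, path, and response description are my assumptions from browsing the site, not taken from any official API definition.

```yaml
# A minimal, hypothetical Swagger 2.0 sketch for one GCIS resource.
swagger: "2.0"
info:
  title: Global Change Information System (GCIS)
  version: "0.1"
host: data.globalchange.gov
schemes:
  - https
paths:
  /report/{identifier}:
    get:
      summary: Retrieve a single report by its identifier
      produces:
        - application/json
      parameters:
        - name: identifier
          in: path
          required: true
          type: string
      responses:
        "200":
          description: The report record, as JSON
```

Even a skeleton like this, once fleshed out against the live API, would make the surface area of the platform much easier to reason about, and easier for others to pick up.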



Recap Of APIs At Dept of Education, And The FAFSA API

My work on APIs for the Department of Education, and the FAFSA API, began while I was working in Washington DC as a Presidential Innovation Fellow. Shortly after leaving DC, I was informed that conversations around an API for the FAFSA had been put on the back burner, and in response I developed a prototype FAFSA API to help jumpstart the conversation.

In December 2013, I went to one of the two data jams put on by the White House and the Dept. of Education, up in Palo Alto, CA at Stanford. Then, in January 2014, I heard there was talk at the secretary level about officially pursuing a FAFSA API. Yay! By February I got an email that there was an opportunity for funding such an initiative, if there was a technical specification available. Three days later I finished a couple of options, which ultimately resulted in a draft technical proposal for a possible FAFSA API.

I’ve heard nothing about the outcome of these efforts. I provided some thoughts on APIs in general in June of 2014, in response to their RFI, but beyond that it has been radio silence at the Department of Education when it comes to a FAFSA API. Granted, I’m not actively pursuing this, but when it comes to the FAFSA API discussion, I own the SEO conversation, so if it came up, I’m sure someone would ping me.

I have no confidence that the Department of Education will pursue a FAFSA API. My motivations around contributing to the conversations stem from a desire to jumpstart investment from both the public and the private sector. While jumpstarting conversations in the federal government, I was hoping I could also jumpstart the development and deployment of a federated FAFSA API, which in turn would apply pressure on the federal government to participate. Doing this is not trivial, and it is a cause that needs a champion or full-time evangelist, and I’m not your man.

The FAFSA API is an excellent example of the technology, business, and politics of APIs, something that is more art than science. If it is going to become a thing, I think it has to happen inversely to the way the IRS ecosystem occurred: from the outside in. I just don’t have confidence that the Department of Education can own this one. I think multiple leaders from the private sector have to make it happen, get private sector buy-in, and then convince the Department of Ed to play nicely in a federated FAFSA API ecosystem.

Which is no small task.

P.S. This post is meant to help several groups understand where I am at with the FAFSA API. I just posted a round of updates directly to my FAFSA API research, as well as my overall Dept. of Education research.



I See An Opportunity In Paying Attention To Other Types Of APIs

I've been pretty focused on web APIs in my API Evangelist world, steering clear of hardware, networking, desktop software, and the American Petroleum Institute. While you will never catch me paying attention to oil, I am slowly changing my tune on other types of legacy APIs.

As I read through the first 700 of the 16K API-related patents I’ve harvested from USPTO XML files, I initially started dismissing hardware-related patents, and then some of the more network-related ones as well. Then I started evaluating the impact these patents could have on the Internet of Things (IoT), and I am beginning to shift my stance.

I may not fully profile the approaches of some of these API providers, but I think I will at least bookmark and consider the approach, and how it’s being applied, while also putting on my web API architect hat. As I read through a press release today about Infolytica Corporation releasing the next generation of their MotorSolve software, complete with mention of an API, I can’t help but think of the implications if Infolytica embraced a web API strategy.

Imagine the potential if MotorSolve were broken up, migrated into the cloud, and containerized. Then add in the necessary business building blocks like docs, code, and pricing, and the requisite political building blocks like rate limits, terms of service, etc. This is the lens through which I am looking at the older patents I am reading through: they may not 100% reflect modern web APIs, but because our virtual and physical worlds are increasingly merging with the growth of the Internet of Things (IoT) and Software Defined Networking (SDN), they might have significant impact if just looked at a little differently.

Overall I see a pretty interesting opportunity in trying to consider all types of APIs, no matter their origins, reconsidering them in light of shifts in compute like the cloud, mobile, IoT, and SDN, and seeing what we can learn. You’ll see more reports of more “low level” APIs on API.Report, and some of what I learn will evolve my analysis here on API Evangelist.



APIs Used To Close, Rather Than Open The Internet

I get a lot of folks who come to my blog, see the title, read one or two posts, and assume that I’m a blind lover of API technology, and that I see APIs as a solution to everything. While some of this is true, I do love APIs and think they are a great solution (in some cases), at the same time I’m also an outspoken critic of APIs, and work hard to be a voice of reason when I see people doing stupid shit with them.

With this theme in mind, I want to once again remind everyone that APIs are neither good, nor bad, nor neutral by themselves; they are merely one of the tools companies can wield, and completely reflect the motivations of their masters. One example of this in action, where I believe an API is being used for some pretty bad reasons, is the AT&T sponsored data API.

I’ve tried to support AT&T as much as I can, because I really want to help the enterprise make sense of web APIs, and teach them to wield them in positive ways, but I have to say the sponsored data API is not something I can get behind. Upon closer examination, this API is working to close down, control, and meter the Internet, rather than opening up the Internet and making it more accessible and usable by AT&T customers.

Allowing mobile users to get their music, video, and other content delivered in a way that doesn’t impact their phone bill seems like a good idea, and allowing companies to step in and sponsor the delivery of data and content for users may smell like a good opportunity when you own the pipes, but this is leading us down a dark road. I’m sorry, there are much more interesting ways to optimize the delivery of content and make money off the Internet pipes you have. AT&T, you lack imagination and creativity.

I'm sure you are well aware, but what you are doing with sponsored data delivery to mobile phones is another push toward allowing the prioritization of Internet traffic, and being able to pay for a better Internet experience for those who can afford it, rather than making the web accessible to everyone. Additionally, if you consider that many providers will actively work to slow the delivery of content behind the scenes, just so they can generate revenue using approaches like sponsored data, things start to get really ugly. I'm not OK with you doing this via APIs, and teaching your customers to see an API as something attached to their bill, a way of paying to speed up their Internet.

This use of an API to close the Internet down, and such bad examples of API monetization, really bum me out. There is so much opportunity for monetization if everyone has open, free access to the Internet. We can get more creative than this when it comes to monetizing the pipes, and approaches like this from AT&T are just going to continue fucking up the Internet, similar to what we are seeing from Verizon, and the other leading telcos who don't get the Internet, let alone APIs.

PS: I wrote this 8 months ago, and just now found it in my Evernote. Figured I'd publish it in the shadow of the FCC announcement on net neutrality.



An Increase In Number Of Press Releases Involving API Integration

I spend a portion of my time each day reviewing press release sites, in addition to the 1000+ blogs I keep an eye on, for syndication to API.Report. During the course of my work this year, I'm noticing an uptick in the number of press releases about some new app, feature, or partnership that has an API at its core.

Telling the story of prominent integrations is something I am a big advocate for, but I think the growth in the number of official press releases about API integrations shows that the mainstream SMB and enterprise markets are putting APIs to work more, and looking to showcase that work. For me, this demonstrates that APIs are playing a more central role not just in the deployment of apps, but in the course of regular business for an increasing number of companies in 2015.

I’m guessing that many more companies will be showcasing the API integrations they achieve long before they talk about the APIs they possess, let alone make publicly available. Ideally, everyone would be both a public API provider and consumer, but I’m afraid that many companies just don’t have the culture for such a thing, keeping their APIs close to their chest while still beating the API integration drum. Because it is what you do in 2015.