{"API Evangelist"}

Different API Rate Limits For Verified And Unverified Free Tiers Of Access

One of the approaches to API plans I was studying recently is from the data provider Factual, which provides access to places, products, and other valuable datasets. I felt Factual had a pretty straightforward approach to the free tier of access for their platform that was worthy of sharing.

When you visit the page for the Factual data services, they offer two distinct levels of free access to data resources:

  • Free (unverified) - Up to 100 read requests per day. No access to resolve or write APIs.
  • Free (verified) - Up to 10,000 requests per day for most tables.

When I look at the plans for many APIs, they almost always have free tiers of access, but normally there is just one dimension to it. You get a certain amount of API calls per second, day, or month, for free. What is unique about the approach from Factual is they offer another free level, which gives you more API calls if you verify yourself--this is a pretty interesting approach, that other providers should consider.

If you want free access but don't want to verify yourself, you get 100 read requests per day, but if you go through the verification process, you get 10,000 requests per day--a pretty significant difference. I am talking with a number of providers lately about how they are tightening up the free tiers of API access, and working to incentivize users to become paying customers, which made me take another look at Factual's approach.

I think free access to APIs is essential to operations, but providers need to be more thoughtful about just how much of their resources they give away at this tier. It is OK to want to get to know your consumers before you give away too much of the farm. There can be a lot of bad behavior from API consumers at the free tier level, and developer verification is one way you can minimize the impact on your API operations.



Automated Mapping Of The API Universe With Charles Proxy, Dropbox, OpenAPI Spec, And Some Custom APIs

I have been working hard for about a year now trying to craft machine readable API definitions for the leading APIs out there. I've written before about my use of Charles Proxy to generate OpenAPI Spec files, something I've been evolving over the last couple of days, making it more automated, and hopefully making my mapping of the API universe much more efficient.

Hand crafting even the base API definition for any API is time consuming, something that swells quickly into hours when you consider the finish work that is required, so I was desperately looking at how I could automate this aspect of my operations more. I have two modes when looking at an API: a review mode, where I'm documenting the API and its surrounding operations, and a second mode where I'm actually using the API. While I will still be reviewing APIs, my goal is to immediately begin actually using an API, where I feel most of the value is, while also kicking off the documentation process in the same motion.

Logging All Of My Traffic Using Charles Proxy On My Machine
Using Charles Proxy, I route all of my network traffic on my MacBook Pro through a single proxy which I am in control of, allowing me to log every Internet location my computer visits throughout the day. It is something I cannot leave running 100% of the time, as it breaks certificates, and sets off security warnings from a number of destinations, but it is something I can run about 75% of my world through--establishing a pretty interesting map of the resources I consume, and produce, each day.

Auto Saving Charles Proxy Session Files Every 30 Minutes
While running Charles Proxy, I have it set up to auto save a session XML file every 30 minutes, giving me bite-size snapshots of transactions throughout my day. I turn Charles Proxy on or off, depending on what I am doing. I chose to save as a session XML file because, after looking at each format, I felt it had the information I needed, while also being easily imported into my database back end.

Leverage Dropbox Sync And API To Process Session Files
The session XML files generated by Charles Proxy get saved into my local Dropbox folder on my MacBook Pro. Dropbox does the rest: it syncs all of my session XML files to the cloud, securely stored in a single application folder. This allows me to easily generate profiles of websites and APIs, something that passively occurs in the background while I work on specific research. The only time Dropbox will connect and sync my files is when I have Charles Proxy off; otherwise it can't establish a secure connection.

Custom API To Process Session Files Available In Dropbox
With network traffic logged, and stored in the cloud using Dropbox, I can then access the session files via the Dropbox API. To handle this work, I set up an API that will check the specified Dropbox app folder, associated with its Dropbox API application access, and import any new files that it finds. Once a file has been processed, I delete it from Dropbox, dumping any personally identifiable information that may have been present--however, I am not doing banking, or other vital things, with Charles Proxy on.
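To make this step more concrete, here is a rough sketch of the kind of logic sitting behind that processing API, assuming the official Dropbox Python SDK--the access token and the import_session helper are placeholders for illustration, not my actual implementation:

    # Sketch: pull new Charles Proxy session files from a Dropbox app folder,
    # hand them to an importer, then delete them so no personal data lingers.
    import dropbox

    ACCESS_TOKEN = "YOUR_DROPBOX_APP_TOKEN"  # scoped to the app folder only

    def import_session(xml_bytes):
        """Placeholder for the importer that parses the session XML and
        stores each transaction in the backend database."""
        raise NotImplementedError

    def process_new_sessions():
        dbx = dropbox.Dropbox(ACCESS_TOKEN)
        result = dbx.files_list_folder("")  # root of the app folder
        for entry in result.entries:
            if not entry.name.endswith(".xml"):
                continue
            metadata, response = dbx.files_download(entry.path_lower)
            import_session(response.content)
            dbx.files_delete_v2(entry.path_lower)  # dump the file once processed

    if __name__ == "__main__":
        process_new_sessions()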

Custom API To Organize Transactions By Host & Media Type
I now have all my logged transactions stored in a database, and I can begin to organize them by host and media type--something I'm sure I will evolve with time. To facilitate this process I have created a custom API that allows me to see each unique domain or sub-domain that I visit during my logging with Charles Proxy. I am mostly interested in API traffic, so I'm looking for JSON, XML, and other API related media types. I do not process images, or many other common media types, but I do log traffic to HTML sites, routing it into a separate bucket, which I describe below.
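The bucketing logic itself is simple enough to sketch out--the blacklist, and the transaction fields referenced here, are illustrative placeholders rather than my actual setup:

    from collections import defaultdict
    from urllib.parse import urlparse

    # Media types I never process (placeholder blacklist)
    MEDIA_TYPE_BLACKLIST = {"image/png", "image/jpeg", "image/gif", "font/woff"}

    def bucket_transactions(transactions):
        """Group logged transactions by host, separating API traffic
        from plain HTML website traffic."""
        api_traffic = defaultdict(list)
        website_traffic = defaultdict(list)
        for tx in transactions:
            media_type = tx.get("response_content_type", "").split(";")[0]
            if media_type in MEDIA_TYPE_BLACKLIST:
                continue
            host = urlparse(tx["url"]).netloc
            if media_type == "text/html":
                website_traffic[host].append(tx)  # separate bucket, described below
            else:
                api_traffic[host].append(tx)
        return api_traffic, website_traffic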

Custom API To Generate OpenAPI Spec For Each Transaction
In addition to storing the primary details for each transaction I log, for each transaction with an application/json response, I auto-generate an OpenAPI Spec file, mapping out the surface area of the API endpoint. The goal is to provide a basic, machine readable definition of the transaction, so that I can group by host, and other primary details I'm tracking on. This is the portion of the process that generates the map I need for the API universe.
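As a rough sketch, generating a bare-bones definition from a single logged transaction looks something like this--I am using a Swagger 2.0 style structure here, and the transaction fields are again illustrative placeholders:

    from urllib.parse import urlparse, parse_qs

    def transaction_to_openapi(tx):
        """Build a bare-bones OpenAPI (Swagger 2.0) definition for a single
        logged transaction with an application/json response."""
        parsed = urlparse(tx["url"])
        parameters = [
            {"name": name, "in": "query", "required": False, "type": "string"}
            for name in parse_qs(parsed.query)
        ]
        return {
            "swagger": "2.0",
            "info": {"title": parsed.netloc, "version": "1.0"},
            "host": parsed.netloc,
            "basePath": "/",
            "schemes": [parsed.scheme],
            "paths": {
                parsed.path: {
                    tx["method"].lower(): {
                        "parameters": parameters,
                        "produces": ["application/json"],
                        "responses": {
                            str(tx.get("status", 200)): {"description": "response"}
                        },
                    }
                }
            },
        }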

Custom API To Generate JSON Schema For Each Transaction
In addition to generating an OpenAPI Spec for each transaction that I track on with an application/json response, I generate a JSON Schema for the JSON returned. This allows me to map out what data is being returned, without it containing any of the actual data itself. I will do the same for any request body as well, providing a JSON Schema definition for what data is being sent, as well as received, within any transaction that occurs during my Charles Proxy monitoring.
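A naive version of that schema generation can be sketched out in a few lines--it keeps the structure and types of the JSON, but none of the actual values:

    def infer_json_schema(value):
        """Naively derive a JSON Schema from a decoded JSON value,
        keeping the shape of the data but none of the data itself."""
        if isinstance(value, dict):
            return {
                "type": "object",
                "properties": {k: infer_json_schema(v) for k, v in value.items()},
            }
        if isinstance(value, list):
            return {"type": "array", "items": infer_json_schema(value[0]) if value else {}}
        if isinstance(value, bool):  # check bool before int, since bool is an int in Python
            return {"type": "boolean"}
        if isinstance(value, int):
            return {"type": "integer"}
        if isinstance(value, float):
            return {"type": "number"}
        if value is None:
            return {"type": "null"}
        return {"type": "string"}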

Automation Of The Process Using The EasyCRON Layer Of My Platform
I now have four separate APIs that help me automate the logging of my network traffic, the storing and processing of all the transactions I record, and then the automatic generation of an OpenAPI Spec and JSON Schema for each API call. This provides me with a more efficient way to kick off the API documentation process, automatically generating machine readable API definitions and data schema from the exhaust of my daily work, which includes numerous API calls, for a wide variety of different reasons.
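If I were to run the same pipeline by hand, it would look something like the sketch below--the endpoint URLs are purely hypothetical stand-ins for my internal APIs, which EasyCRON simply calls on a schedule:

    import requests

    # Hypothetical endpoints for the four custom APIs in the pipeline.
    PIPELINE = [
        "https://example.com/charles/import",       # pull new session files from Dropbox
        "https://example.com/charles/organize",     # bucket transactions by host and media type
        "https://example.com/charles/openapi",      # generate OpenAPI Specs
        "https://example.com/charles/json-schema",  # generate JSON Schemas
    ]

    def run_pipeline():
        for url in PIPELINE:
            response = requests.get(url, timeout=60)
            response.raise_for_status()  # stop if any stage of the pipeline fails

    if __name__ == "__main__":
        run_pipeline()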

Helping Me Map Out The World Of Web APIs As The API Evangelist
The primary goal of this work is to help me map out the world of APIs, as part of my work as the API Evangelist. Using this process, all I have to do is turn on Charles Proxy, fire up Postman, visit an API I want to map out, and start using the API. Usually within an hour, I will then have an OpenAPI Spec for each transaction, as well as one aggregated by host, along with a supporting JSON Schema for the underlying request or response data model--everything I need to map out more APIs, more efficiently scaling what I do.

Helping Me Understand The Media Types In Use Out There Today
One thing I noticed right away was the variety of media types I was coming across. At first I locked things down to application/json, but then I realized I wanted XML, and others. So I reversed my approach, let through all media types, and started building a blacklist for which ones I did not want to let through. Leaving this part of the process open, and requiring manual evaluation of media types, is really pushing forward my awareness of alternative media types, and is something that was an unexpected aspect of this work.

Helping Me Understand The Companies I Use Daily In My Business
It is really interesting to see the list of hosts that I have generated as part of this work. Some of these are companies I depend on for applications like TweetDeck, GitHub, and Dropbox, while others are companies I'm looking to learn more about as part of API Evangelist research, and storytelling. I'm guessing this understanding of the companies that I'm using daily in my work will continue to evolve significantly as I continue looking at the world through this lens.

Helping Me Understand The Data I Exchange Daily With Companies
The host of each transaction gives me a look at the companies I transact with daily, but the JSON Schemas derived from requests and responses that are JSON also give me an interesting look at the information I'm exchanging in my daily operations, either directly with platforms I depend on, or casually with websites I visit, and the web applications I'm testing out. I have a lot of work ahead of me to actually catalog, organize, and derive meaning from the schemas I am generating, but at least I have them in buckets for further evaluation in the near future.

Routing Websites That I Visit Into A Separate Bucket For Tracking On
At first I was going to just ditch all GET requests that returned HTML, but instead I decided to log these transactions, keeping the host, path, and parameters in a separate media type bucket. While I won't be evaluating these domains like I do the APIs that return JSON, XML, etc, I will be keeping an eye on them. I'm feeding these URLs into my core monitoring system, and for some companies I will pull their blog RSS, Twitter handles, and Github accounts, in addition to looking for other artifacts like OpenAPI Specs, API Blueprints, Postman Collections, APIs.json, and other machine readable goodies.

Targeting Of Specific Web, Mobile, Device, And API Driven Platforms
Now that I have this new, more automated API mapping system set up, it will encourage me to target specific web, mobile, device, and API driven platforms. I will be routing my iPhone, and iPad, through the proxy, allowing me to map out mobile applications. If all I have to do is get to work using an API in my Postman client, or use the website or mobile app, and auto-generate a map of the APIs in use as OpenAPI Specs, along with data models using JSON Schema, you are going to find me mapping a number of new platform targets in 2016.

Ok, So What Now? What Am I Going To Do With This Mapping Info Next?
You know, I'm not sure what is next. I learned a lot from this 25 hour sprint to better automate this process. I think I will just sit back and let it run for a week or two, and do what I regularly do: visit the websites and developer areas of platforms that I'm keeping an eye on. I will keep using APIs to run my own operations, as well as play with as many APIs as I possibly can fit into my days. Periodically I will check in to see how my new API mapping system is working, and see if I can't answer some pressing questions I have:

  • How much do I create vs. consume? i.e. POST, PUT & PATCH over GET? (see the sketch after this list)
  • How often do I use my own resources vs the API resources of others?
  • Do I have any plan B or C for all resources I am using?
  • Do I agree with the terms of service for these platforms I am using?
  • Do I pay any of the services that are a regular part of my daily operations?
  • Am I in control of my account and data for these platforms & companies?
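The first of these questions is easy enough to sketch out against the transaction database--a rough example, assuming each logged transaction carries its HTTP method:

    def create_vs_consume(transactions):
        """Rough answer to the first question: how much do I create vs. consume?"""
        writes = sum(1 for tx in transactions if tx["method"] in ("POST", "PUT", "PATCH"))
        reads = sum(1 for tx in transactions if tx["method"] == "GET")
        return {"writes": writes, "reads": reads}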

For the moment, I am just looking to establish a map of the digital surface area I touch each day, and further scale my ability to map out unknown areas of the API wilderness. I am curious to see how many OpenAPI Specs and JSON Schemas I can generate in a week or month now. I have no idea how I'm going to store or organize all of these maps of the API sector, but it is something I'm sure I can find a solution for using my APIs.json format.

This is the type of work I really enjoy. It involves scaling what I do, better understanding what already exists out there, something that will fuel my storytelling, and is something that pushes me to code, and craft custom APIs, while also employing other open tooling, formats, and services along the way--this is API Evangelist.



Offering A Monthly To Annual Toggle For Your API Pricing Page

I am continuing to work through notes from a recent push forward of my API monetization, and API plan research--something that yielded a number of valuable nuggets that I think API providers should be considering when crafting their own strategies. One of the pricing pages I was looking at as part of my research was from authentication provider Auth0, which provides a nice way for their customers to toggle between monthly and annual pricing.

I organize small elements like this, into my lists of common API building blocks, which help API providers, and API service providers, with a list of things they can consider applying in their own strategies. I like the approach from Auth0 especially, because the toggle actually changes the plan pricing on the page, reflecting the shorter, or longer term costs associated with their authentication services.

Whether your intent is to offer price breaks for longer term relationships, or helping your customers better understand and manage their costs, a simple thing like a time-frame toggle, that adjusts pricing, could make a positive impact. Anyhoo, I just wanted to share this nugget with you, as I was adding it as a building block to my API plan research.



If You Are Proud Of Your API Patents Publish Your Portfolio And Showcase Them

I'm going to keep beating the API patent drum, until I bring more awareness to the topic, and shine a light on what is going on. While I will still be my usual self and call out the worst behavior in the space, I am also going to try and be a little friendlier around my views, and try and help bring more options to the table. This is a serious problem that nobody is talking about, and one that has many dimensions and nuances--if you want my raw stance on API patents, you can read it here.

One area I wanted to cover is in response to my friends trying to convince me they aren't bad people for having patents. I know you aren't, and it isn't my goal to make you look bad in this, it is only to shine light on the entire process, how broken it is, and call out the worst offenders. If you truly believe in patents, protecting the work you've done, and that your intentions are good, share your patent portfolio with the world, and showcase it like you do the other aspects of the work you do. You will craft a press release about everything else you do--do the same for your patents.

I do not think patents are bad. I think ill-conceived patent ideas that haven't been properly vetted by the under-resourced USPTO, that are used in back-door dealings as leverage, and litigated in a court of law, are bad. I'll take your word that your patents are good, and you aren't operating in any of these areas, if you are public, transparent, and openly proud of them, as you state in private conversations.

Part of the purpose of my research is to encourage good behavior in the sector, by highlighting the common building blocks of the space. I think I will add a patent portfolio building block to my research. While I have ZERO examples to highlight, I encourage API companies to do this, and would love to highlight, in a positive way, any company that is straight up enough to showcase their patents. If you are proud of your API patents, and do not have bad intentions in having them, please publish your portfolio, and showcase them as you would anything else you are doing--help bring API patents out of the shadows.



Parse Shutting Down: Maybe We Should Lower Our Expectations Of Tech Just A Little Bit

The mobile backend as a service (MBaaS) platform Parse is shutting down. I started tracking on Parse as part of my BaaS research a couple years back, something that resulted in having all of the BaaS providers, including Ilya Sukhar of Parse, on stage at @APIStrat NYC in early 2013--this conversation was just a couple months before Parse was acquired by Facebook.

Parse was widely considered to be the top BaaS platform, which resulted in wide adoption, something that I'm sure grew exponentially after the purchase by Facebook. Parse is giving everyone a year to migrate, providing a database migration tool, as well as open sourcing a version of the platform--which I think is a pretty fair deprecation strategy for customers, even with the unexpected news.

Even though the tech highway is littered with these types of acquisitions and deprecations, the tech blogosphere, and social bookmarkosphere (is that a word?), love to squawk when they happen. Competitors like AWS, Google, and others love to invite you to come use their platform, and the technorati love to point out how you cannot depend on any platform--which is the truth.

Personally, I enjoy taking these moments to explore why the space thinks technology is such an absolute, when these ways of thought rarely exist in other sectors. I think there are a couple of distinct things at play:

  1. Promises Of Tech Providers - In an effort to get new users, tech solution providers make some pretty wild promises about how they'll make your life easier, and do all the hard work for you--all you have to do is just believe in them. They never mention that you aren't really their true target customer; an acquisition by a big tech company is their true customer.
  2. Religious Belief In Technology - Like the marketing of providers, developers, and other folks who drink the Silicon Valley Kool-Aid really, really, really believe that technology is the answer, it will save us, and all of this is inevitable. Thus we believe the tech will always be there to save us, and are so willing to ignore the actual business and politics of all of this.

As I've stated in earlier posts, there are no guarantees your vendors will always be there in other business sectors, so what the hell makes us think our tech vendors will always be there? There is no basis for believing a platform or API will ALWAYS be there, no matter what you are promised. Companies go out of business, get acquired, and in this fast paced tech climate, companies are always looking to deliver the latest product, and features. Everything in the space points to disruption, change, and evolution, so where the hell did we get the idea these services shouldn't go away?

I think the tech blogosphere, social bookmarkosphere, startup elite, and believers should lower our expectations of technology just a little bit. Internet outages, acquisitions, and roadmap shifts will always happen--seems to me, these are the only constants! As a small business operator (I am not a startup), I am constantly evaluating what my Plan B, C, D, E, and even F is. While I may not always be prepared for changes in the landscape, rarely will you catch me squawking too loudly, as I'm busy executing on the next stage of my evolution.

However, you will hear me exploring, and understanding these topics, as they occur--because that is fun, and educational!



My Vision For One Possible Future Of The API Life Cycle Present In A Real-Time Subway Map For Helsinki

If you caught my keynotes at @Defrag and @APIStrat last year, you know I'm working on using the subway map as a method for visualizing, understanding, and eventually exploring the API life cycle. I feel like the subway map concept has helped us find a globally universal way of understanding the transport of humans, via some very complex transportation systems, in cities around the globe--something I feel can be applied to the world of APIs.

I was giving a version of my API life cycle talk to a group in Finland the other evening (their morning), and someone in the audience sent me a link to the real-time subway map for Helsinki. If you watch it closely, it updates based upon where the trains are, sharing times and locations. 

Click To See Real Interactive Map

This is what I'm working toward across the API space. I see the subway map analogy being key to understanding our API life cycle, across the 40+ areas I track on in my research, as well as a potential real-time window into understanding how each API is being used, by employing modern API management infrastructure like 3Scale.

API providers (which ALL companies will be shortly) should have real-time windows into where each of their vital API resources are in the life cycle, whether they are being designed, developed, managed, or deprecated. We should also be able to experience real-time views of how our APIs are being consumed, which apps are currently making calls, and potentially even security threats against our API infrastructure.

The subway map is also providing me a way to educate new, and existing, API folks in the space, providing them with an interactive journey within the design, definition, deployment, management, monitoring, and 40+ other areas of the API life cycle I keep an eye on. I'm optimistic about what is possible when it comes to applying the subway map concept to the world of APIs, but like earlier subway engineers, I think it will take some time before I find just the right approach. ;-)



Breaking Out API Support Into A Separate Research Area

Supporting your community is not unique to the API space, but supporting API operations does have some unique needs, and approaches that are proven by leading platforms. Like other areas of my research, I'm pulling out API support into its own area, so I can start shining a light on successful patterns I find in the area of API support.

Two things pushed me to spin out this research area. One, I was tagging more blog posts, and other resources, as support, and without a dedicated research area, this information would rarely float to the surface for me. Two, my partners over at Cloud Elements have an API hub dedicated to "Help Desk". While their aggregate API solution targets audiences beyond the API community, it is API driven, and can also be applied to providing an aggregate support solution for API communities.

As with most of the areas of the API space, there are several dimensions to how APIs are being applied to support customers, and online communities. With my research, I will focus on tracking approaches to community support for API providers, and API service providers. There will also be that layer of tracking on help desk and support platforms that employ APIs, as well as API aggregation and API interoperability solutions from leaders (and my partners) in the space like Cloud Elements.

You can visit my API support research via its Github repository, and I will try to make sure and continue linking to it from my API management research, which it was born out of.



My Stance On APIs And Patents

My post the other day on the hypermedia API focused patents from ElasticPath has resulted in some very interesting conversations, ranging from folks trying to understand this world, to those who are patent believers, to those who are just doing what they have to--in a world they do not control. This is why I write these stories, and frankly, it is why I am looking to push the patent conversation to new levels, to bring all y'all out of the woodwork.

In a collective response to these conversations, I wanted to share my stance on patents, when it comes to the world of APIs. Let's start closer to the median, and talk about patents, and the world that "exists today", and explore the common responses when you look to discuss API patents in the current climate.

I Do Not Want To Do Patents, I Only Do Them As a Defensive Response!
C'mon Kin! You do not understand why we have to do patents. We only do them to protect our space, against the worst of the worst in the tech space--you know SAP, Sun, Oracle, Microsoft, IBM, and the other evils that lurk (the ones with the most patents). I get it. It is the same reason all my redneck, and now hippie, friends own guns--those over there have guns, and I need to defend myself. You never know when one of those situations will happen, and I will need to defend what I've invested so much to build.

I Need To Generate Intellectual Property (IP), Because It Is How We Define Value!
If I could spend my time as I wish, I wouldn't be writing up patents, and spending money on the patent process. My investors, CEO, and wider stakeholders expect that, as a startup, I patent the most valuable of the exhaust from our operations. Ok. So you do this because the people who give you money tell you to? Or do you do this because it actually defines the value you generate as a startup? So, as a startup in 2015, a process acknowledged in ancient Greece, evolved after the dark ages, and further refined in the industrial age, defines you? Ok. #DisruptionW00T!

Patents Are How You Make Money, Do You Know How Much Money IBM, Microsoft, SAP and Others Make?
The amount of money the large enterprises make off their patent portfolios is in the BILLIONS! This is how business gets done, and fortunes are made! This is how the leading companies that we hold up as shining examples make their billions, so why wouldn't I follow in their footsteps? Well, you know, all that disruption, innovation, and other bullshit you hear--that is why! Actually it's not, but I'm just trying to use your own marketing against you, to try and change your tune.

Let's Start With The Basic Differences Between Your View, And My View Of The Space
The concept of the patent process is about taking a unique and innovative process, locking it up, and preventing others from doing it. Which in a capitalist, physical world might make some sense, some of the time, but in a digital world it actually works against the way things operate--where reuse, interoperability, and global access are how you reach a wider audience, encourage ubiquitous integration, and yes make money (open source duh?). Locking things up, preventing others from using a successful pattern, and metering that usage doesn't get you as far in this digital world, as it did in the physical world of manufacturing and distribution. #Sorry

In conflict with your world of patents, I am in the business of opening things up, reducing friction, increasing interoperability, educating more people, and making things work together. I like to make money, but in the end, I'm more in the business of making you money, than I am about making money--you just do not realize it. You are so short-sighted in your view of business, that you do not get what API is. You think SOA didn't work because of some technical flaw, and ignore the role your heavy hand has played in gumming up the gears. Only now are you noticing that API is even working, and you know what? It has nothing to do with you. It has been working because of your absence, and now that your heavy hand has come into play, trust me, shit will slow down. #GiveItTime

The Patent Process Is Broken And Few Actually Care (Or Are Willing To Talk)
The biggest thing that bothers me in taking my patent stance, and challenging the larger world of patents, is that the process is broken, and that many know, few care, and even fewer are willing to talk about it. Let's start with the basics: the assumption that a process developed in the physical world, applied to physical processes, is transferable to the digital world with no adjustments. WTF! You will argue that taking an existing patent applied in the physical world, and reapplying it to web or mobile, is worthy of a new patent, but the patent process itself doesn't warrant the same response? WTF! #Broken. We could leave it there, but wait, there is more!

The US Patent Office Is Underfunded, Undertrained, And Overwhelmed With Demand
When you take the time to look through the work that goes into looking for prior works, you realize how little actually goes into the overall patent vetting process. It's less about invalidating an idea for a patent, than it is about making sure it gets approved. The USPTO is just one casualty in a larger war to defund the United States government, and part of the beautiful gift the 1% receives in waging this war. This is how power works. Weaken the process, defund the regulators, and give the power to those who own the patent. You are either smart and innovating, or you are a loser, right? #VoteTrump

Patents Are A Rich Person's Game, Something The Rest Of Us Should Avoid
The patent game is for those with the resources, on the front-end and on the back-end. Do you have the time to carve out and define your patent, and hire a lawyer to write and file it? Do you have the time to wait for the USPTO to approve your patent? Are you filing your patent, or is the company you work for filing the patent? You do!! Now do you have the time, and resources, to defend your patent, upon attack by your enemies, in a court of law? A sustained attack? This is the two-sided beauty of how patents are a rich person's game. You have to have what it takes to generate patents to please your VCs, and to defend against your perceived enemies, but do you also have what it takes to defend them in the real world, in a court of law? I do not. Even if I could afford to patent my unique process on patenting APIs, I couldn't litigate it ;-)

Nobody Wants To Talk About Patents, Because Someday I Need Funding, Or Will Be In The Club
Blogs like Techcrunch love to do gotcha stories when it helps the cause of the powers that pull their strings, and similar publications will call foul when really scary patent related cases enter the legal system, but patent filings, and approved applications, even the scariest of them, rarely make the news. As a patent filer or owner, you rarely talk about your patent portfolio publicly, let alone feed this information to news outlets, and interestingly, very few organizations go looking for this information. Hmmmm....

Patent Conversations Happen Behind Closed Doors, Way Before They Ever Enter A Court Of Law
When I hear the statement that patents are being filed in a defensive posture, focusing in on the legal battles that will no doubt come in the future from the most aggressive in the space--I hear the passive-aggressive tones from the shadows (takes one to know one). This statement leaves out the posturing that takes place behind closed doors, between executives, and investors. Do not tell me your patent portfolio, or perceived patent portfolio (patent pending), doesn't give you leverage in every day dealings, never even bringing litigation into the picture.

Are Patents Sending The Signals You Think They Are In The Space
I just can't help but think that even though companies are playing the "patent game" by the rules, they may not fully understand the signals they are sending. If I am watching the patent filings, and application approvals, and I know you are, don't you think your competition, and potential enemies, are? Are you better off gaining a first mover advantage with your "unique process", or are you better off filing a patent for your process? IDK. I'm guessing this depends on how big your company is, and your general position in the space. Which means, I'm guessing it might not mean what you think it means for your little startup, and your view of the value vs. the view of your VC's might differ. Maybe it actually makes you a target for other types of bad behavior, and in the end, if necessary, some actual patent litigation.

Patents Are Like Guns, I'm Doing Them Because Everyone Else Is Doing Them
Each time I listened to the stories of why my friends are doing patents over the last week, I thought about the same stories I mentioned above from friends on why they have guns. The only difference is I have a long history with guns (love em), and not so much with patents. My friends have guns, because there are many other bad guys out there with guns, creating this really amazingly hostile environment, where nobody is safe. As a former thug, guess who you go rob when you see yourself as top dog in a market? You go rob the guys with all the guns, who might not have the resources to actually defend them. Guns have enjoyed much more "need" in the past, much like patents, but in the current climate we live in, we need to reassess their role--not do away with them, just take another look at the process involved--we should do the same with patents.

Patent Trolls Are Just The Illness Coming To The Surface Of The Tech Space
The reason patent trolls can exist in this environment is because of the buy / sell nature of companies, the scale at which the USPTO is equipped, the downsizing of government, and it coming down to who can actually afford the lawyers during the on-boarding, as well as the offensive or defensive portions of the long patent game. If you have ever seen patent litigation play out, you know such a small portion of it is ever public, so what we hear publicly about the problem is a very small piece of what is going on. This is just the part that plays out in the courtroom, and NEVER touches on what happens in the boardrooms of leading tech companies, or the investors behind the curtain.

Now We Get To The Part Where You Actually Hear My Stance On Patents As They Apply To APIs
The concept of the patent is broken, and the process that is the patent is broken. For this reason I do not believe in it. Then you add the fact that it is a rich person's game, which seals the deal for me. I cannot afford a patent, even if I wanted one, something that makes it an agent of power in my world. Something that also makes it such a hypocritical tell for startups. Why don't you push out press releases on the patents that your VC's require of you? Alongside all of the innovation and disruption press releases? Oh yeah, we are using this really old process of the existing power structure. That is right, we aren't actually that proud of our patents--unless we work for IBM.

I Cannot Support This Concept Moving Forward If Nobody Will Stand Up And Fight To Evolve It
As a startup you will dismiss history, so that you can disrupt, find your funding, and convince all of us that your thing, is a thing. Government, banks, unions, institutions, and other entrenched entities are bad, and need disrupting in a digital era, but this centuries-old process of locking up business processes, developing intellectual property, and building wealth portfolios deserves NO CHANGE in a digital age? WTF? Are you serious? The same concept applied to the cotton gin, applies to my cotton gin API? Don't even think about it, I've already applied for the patent! ;-) Sorry, I'm just not sold.

I understand why you are doing this, and I will keep accepting your requests to talk about it, because I learn a lot along the way, and it is something that continues to anchor me in my stance. You will look to quickly dismiss me because I am not "playing in your league", and I don't understand the stakes, and the rules of the game. #Truth. I would also challenge that maybe you are so caught up in the stakes and rules of your game, you might be missing some opportunity, and potentially a whole other way of operating--one that actually makes all of this work, and is less about you getting rich.



Embedding Your Language SDK(s) In Your Apiary Documentation Using APIMATIC

I'm seeing a resurgence in my embeddable API research lately, based upon signals I'm seeing across the space, and conversations I'm having with folks. The interesting part for me is that this wave isn't about API providers like Twitter and Facebook using JavaScript to create buttons, badges, and widgets. This latest round is about API service providers making their services more accessible to both API providers, and API consumers, using embeddable tooling, and most importantly, API definitions.

API driven embeddable tools come in many shapes and sizes, and are something I work hard to understand, and track on in the space. I have several new embeddable stories to talk about this week, but today's is from my friends over at APIMATIC, as well as Apiary. The two service providers now offer the ability to embed your APIMATIC driven SDKs into your Apiary driven API documentation. All you do is plug the URL of your Apiary portal page for your API into your APIMATIC account, and you are returned embeddable markdown you can paste into your Apiary API documentation--Apiary addresses your API design, documentation, testing, and virtualization needs, and APIMATIC comes in with the API SDK assist.

I like API service providers working together like this; it is something that makes API providers' and API consumers' lives easier. This approach to API service interoperability is what it will take to weave together the patchwork of API services that will be needed across the API life cycle. As more API service providers emerge to fill gaps in the life cycle, the ability to stitch these stops together will grow more critical, something embeddability will contribute to. Depending on a single provider in 2016 is just not a reality, and I need the services that I depend on to work together.

As I will work to showcase in future stories, embeddability comes in many shapes and sizes. I'm hoping we are on the cusp of a new wave of API driven embeddability, one that is exponentially more fruitful, and easier to implement for each individual API provider. An approach to using JavaScript and image driven embeddability that uses APIs like previous waves did, but one that is designed more for API providers and API consumers, as well as for end-users, the way historical approaches to embeds, like Twitter tweet buttons, and Facebook share and login buttons, were.



API Definitions Are The Contract For Doing Business With APIs

I held a Hangout with API Evangelist this morning with Steve Willmott (@njyx) of 3Scale, and Jakub Nesetril (@jakubnesetril) of Apiary, where we discussed API definitions. Both Steve and Jakub are CEOs of leading tech companies, who are taking frontline positions when it comes to the whole API definition conversation.

My role in this hangout was just bringing together these two API leaders, to discuss the most important topic facing us in the world of APIs. API definitions are touching on every aspect of the API life cycle, and as Steve and Jakub discuss, playing a central role in their businesses, and their customers' businesses. We published the hour and a half conversation on YouTube, so you can join in, even if you couldn't make the hangout.

 

 

The focus on API definitions being the contract was the most important part of today's hangout for me. This wasn't a conversation about just Swagger, or API Blueprint, it was a discussion about the role API definitions play in the API life cycle, from the latest wave of API description formats like Swagger and API Blueprint, to API discovery formats like APIs.json, and even media types, and schemas. These are the contracts we are using to communicate our ideas, generate code, set up tests, drive documentation, and make the API economy go round.

This hangout was scheduled to be an hour, but there was so much to cover, we took it an extra 30 minutes. The conversation left me feeling like I need to do this more often, expanding on the API definition discussion, but also pushing into other important areas like API deployment, virtualization, discovery, monitoring, and the almost 40 other areas of the API lifecycle I track on. We'll see how much time I have to do these, and more importantly whether I can keep bringing smart people like Steve and Jakub together--something I hope I can make work.

Thanks to Jakub and Steve for participating today!



Reverse Engineering APIs From The Common APIs Models We Know

As I work to complete more API definitions, with all API endpoints defined as an OpenAPI Spec, API Blueprint, and Postman Collection, and everything wrapped in a complete APIs.json index--I can't help but consider the importance of these definitions in helping others reverse engineer these APIs, and apply what they learn in their own API design, development, and management processes.

Whether you are learning about an API for consumption purposes, or learning about it from a provider's perspective, there is a lot to learn from APIs that are defined using OpenAPI Spec, API Blueprint, and Postman Collection, and it is something I'm working to push APIs.json to deliver on more. Right now I'm struggling to just capture the basics of each API, its individual methods, parameters, and underlying schemas. I am also working to index their overall operations using APIs.json. Soon though, I will reach the point where I have a nice collection of existing APIs defined, and I will be able to do much, much more--this is what I'm planning for now.
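For a sense of what that index looks like, here is a stripped down, illustrative APIs.json style entry for a single API--the URLs are placeholders, and the property type labels are approximate:

    {
      "name": "Example API Stack",
      "description": "Index of a single API, with machine readable definitions attached",
      "url": "http://example.com/apis.json",
      "apis": [
        {
          "name": "Example API",
          "humanURL": "http://example.com",
          "baseURL": "http://api.example.com",
          "properties": [
            {"type": "Swagger", "url": "http://example.com/swagger.json"},
            {"type": "ApiBlueprint", "url": "http://example.com/api-blueprint.md"},
            {"type": "PostmanCollection", "url": "http://example.com/postman-collection.json"}
          ]
        }
      ]
    }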

Now that I am closing in on having a couple hundred complete (enough) API definitions for leading platforms, I think I will take another look through the stack, and evaluate how I can position them better to help potential API consumers, as well as API providers. API consumers are going to care about learning only what they need to get at the valuable resources made available via the API, while providers are going to want to better understand the API design, schemas, business models, and other aspects of the operations.

I'm thinking more deeply about the API provider and API consumer sides of the same coin, so that when you land on the home page of any API service provider, there are personalized, easy to use, visual elements that draw you in to learn more about an existing API you already know about. Want to learn how to connect to APIs using Postman? Here is a Twitter guide. Want to learn more about designing a file storage API using Apiary? Here is a Dropbox guide. There is no reason that these common guides couldn't be easily available in the tooling we use everyday, driven by APIs.json, OpenAPI Spec, and API Blueprint.

To help illustrate my point, I am going to pick 10 of the most well known APIs, and craft examples of what I am talking about--blueprints that could be used in any service, to easily provide reverse engineering lessons from the home pages, dashboards, and help sections we are using each day. I learn a lot about APIs from reverse engineering the API definitions of leading APIs, and I am hoping others will too.

If there is a specific API you'd like to see, let me know, and I'll consider prioritizing it.



Moving Towards A Meaningful Set Of Icons For The API Community

There are many inconsistencies I struggle with in the API space, and the lack of meaningful icons to express myself is one of them. I was meeting with my friend Jerome Louvel of Restlet this last week, and he also articulated (again) the lack of imagery that represents APIs. To understand what we are talking about: simple icons, like what RSS has, but representing an API, instead of an XML feed.

When I need an image to represent an API, I always borrow the simple icon from the API Commons. Like my API Evangelist logo, it is a pretty simple, black and white, minimum viable representation of "API". The folks over at Restlet also have their own image that they use across their platform, and suggested it as one potential design that could be used. This is a concept that I would like to push forward. I think branding, and iconography, is very important, and is something long overdue for the API space.

I would like to see a single universal, community driven icon to represent the concept of a web API. I would also like to see similar icons for formats used across the space, as well as vendor specific icons. If you look at my personal API stack, as well as my overall stack, you see me using a mix of format, and vendor icons to document a pretty complex assortment of API driven resources, and information. I'd like to step this up a bit, be able to standardize across not just my network of sites, but the overall API community. 

I am guessing this is something I will have to slowly push forward, like the numerous other areas of my API industry research, but would love to hear your input, and see what else we could do to establish a common set of API industry icons. I'm already working with a handful of API vendors to define their API driven embeddable strategy, and looking to push forward my API branding research, so I wanted to make sure and publish a story on this--which in my world is essential to the ideation process, and essential to moving anything forward.



Join @3Scale, @Apiary, And I For A Hangout On API Definitions This Wednesday

Join me, Steve Willmott (@njyx) of 3Scale, and Jakub Nesetril (@jakubnesetril) of Apiary, for a hangout on API definitions this week. I wanted to explore doing more hangouts under the APIStrat, as well as API Evangelist, brand(s)--for this one I wanted to bring together some experts to talk about the fast moving world of API definitions, as a Hangout with API Evangelist.

This Wednesday, January 27th, at 11:00 AM PST, the three of us will jump on a Google Hangout, and you are welcome to join in the conversation. We will be doing the gathering as a Hangout on Air, so that you can ask questions if you want, joining in the live conversation, or you can wait until after we are done--I will make sure to publish the video to YouTube.

It's a pretty important time for API definitions, with the Swagger specification reborn as the OpenAPI Spec, and Apiary, the creator of API Blueprint and MSON, also adopting the OpenAPI Spec this last week, allowing you to design and mock your API in both formats. 3Scale was an early adopter of Swagger, has taken a leadership position in shepherding it into the Linux Foundation, and is a member of the governance working group.

I figured that it is a pretty good time to check in with Steve and Jakub on the current state of API definitions, and how they see them impacting their own platforms, as well as the overall API space. If you have any specific questions you'd like me to ask, or have any specific topics you'd like to see discussed, feel free to tweet at me. I'll tweet out the link for the event, but all you need to do is visit hangoutwith.apievangelist.com this Wednesday at 11:00 PST, and join in the conversation.



A New Open Source Interactive API Documentation From Folks Over At Lucybot

I am happy to be showcasing a new open source, OpenAPI Spec driven, interactive API documentation solution from the LucyBot team. The API definition driven documentation solution is one of the best implementations I've seen to date, and reflects the future of where API documentation should be going, in my opinion. While the Swagger UI has significantly moved the API documentation conversation forward, it has never been the most attractive solution, pushing some API providers to the more attractive, but less interactive, open source solution Slate.

The LucyBot API Console is both! It is API definition driven, using the OpenAPI Spec as its engine, but also provides a conversion tool for RAML, WADL, or API Blueprint. The LucyBot API Console is also very attractive, responsive, and flows much like Slate does, but goes much further by being an actual interactive API console, not just good looking, static docs. You can see the LucyBot Console in action over at AnyAPI, or as a small set that demonstrates different CSS variations.

I have forked, and started playing with, the LucyBot API Console--I encourage you to do the same. There are a number of features I'd like to see in there, like more embeddability, and templatability of the UI, as well as some APIs.json support, and D3.js visualization integration--so I will be getting more knowledgeable of the code base, and in tune with the road map. I know the LucyBot team is open sourcing the code to encourage contributions, and they are looking for investment in dev and business resources to move the project forward--so get involved!

It is good to see the API definition driven API documentation conversation moving forward in 2016!



The New API Definition Abstraction Layer Over At Apiary

I was on the road last week, so I didn't maintain my usual barrage of API analysis. As I go through my monitoring for the week, I'd say the biggest news of the week was Apiary's support of the OpenAPI Spec. I got a test drive of the support for the API definition format over the holidays, and was impressed with how smoothly Apiary integrated the OpenAPI Spec into their API design, virtualization, and management platform.

Another very interesting dimension of the Apiary release for me was how they seamlessly integrated the new API definition into their road map. This wasn't a switch to the OpenAPI Spec from API Blueprint, it was about opening up, embracing, and abstracting away multiple API definition formats across their platform operations. As an API service provider, it is just smart business to support as many API definition formats as you possibly can. The 2016 road map for Apiary acknowledges the value of using the OpenAPI Spec, but still reflects the strengths that API Blueprint + MSON bring to the table.

Last September, I walked around San Francisco with Jakub Nesetril (@jakubnesetril), the CEO of Apiary, talking about the need for an open abstraction layer to help us better define our APIs, across all stops of the API life cycle. This week's OpenAPI Spec support at Apiary is Jakub's vision playing out, making sure the process of defining your APIs across the design, virtualization, and management areas of your API life cycle is as robust, and agile, as it possibly can be. For me, this makes this week's news much more about Apiary abstracting away the complexity of switching between leading API definitions, than it is about their support for the OpenAPI Spec.

I had beers with Emmanuel Paraskakis (@manp) and Jakub at the Toronado on Haight Street in San Francisco this week, and had another conversation with them about their abstraction layer, which helps them efficiently switch between using API Blueprint and OpenAPI Spec in Apiary. They expressed interest in exploring the open sourcing of this layer, to help others achieve similar abstraction in their own platforms. During our talk Jakub reiterated his earlier vision of an open abstraction layer, where each API definition provider (OpenAPI Spec, RAML, Postman, APIs.json, etc.) can maintain their own plugins, abstracting away this work for API service providers.

Consider the opportunities when any API service provider in the community, serving any stop along the API life cycle, can seamlessly integrate this abstraction layer into their platform--where they can work with a single API definition layer that navigates the differences between each individual API definition, and the owner of each definition steps up and maintains the individual API definition side of the equation. If you are selling your services to API providers across the sector, you should support as many of the leading API definition formats as you can, and what Apiary is cooking up will make this reality much easier to achieve.



Taking A Snapshot Of Just The Essential API Building Blocks Across My Research

My API industry research is constantly moving forward, shifting, and being added to--much like the space itself. As I work to update each of my research areas each week, my process involves adding any news I've curated, and changes to the companies who are doing things in the space, as well as adding or removing any common building blocks I've identified. These building blocks are the common patterns I've identified by studying how API providers are operating, and what features API service providers are bringing to the table, for API providers to put to work.

This last fall, in preparation for my keynotes at Defrag and APIStrat, I pushed forward much of my research, fleshing out more of the building blocks across 20+ of my research areas. As I was preparing for another big push forward with my research, I wanted to stop and take a snapshot of just what I'd consider to be the essential building blocks across the most mature areas of my API monitoring research.

I track almost 1000 stops along the modern API life cycle, but this snapshot represents what I'd consider to be a master list that API providers, and service providers, should be considering as they design, develop, manage, and execute on almost 20 other areas of a modern API lifecycle. This snapshot is meant to be a guide to what I'm seeing in the space, which can be used as a checklist in your own API strategy. I do not recommend every company do everything listed in this document, but these are elements that you should be considering, and hopefully have a better understanding of.

Like my other guides, this one is in beta. I'm adding a couple new areas to my research in January that will impact this work, and I will also be exposing links to other companies, services, and resources that support each building block. You can download a PDF version, or access it via its own Github repository. If you see any mistakes, you are welcome to correct them there and submit a pull request, or simply submit an issue to let me know via the Github repo. I will roll any edits into the next edition.

While not as short as I'd hoped it would be, I think this essential API building block guide, across these 20+ areas of my research, provides an interesting snapshot of the space in 2016. Stay tuned...more to come.



The Four Categories Of Dwolla API Consumers

I just finished spending an hour talking with Brent Baker (@norcaljhawk), head of product for Dwolla, and Jordan Lampe (@JsLampe), about the vision behind the developer experience for their new developer portal. I will be able to craft many stories from the notes I generated during our conversation, but there was one aspect of how they view consumers of their ACH transfer API that I wanted to quickly share.

This part of our conversation came up when they mentioned how they broke up their API users as they were rethinking the overall developer experience. They put API consumers into four distinct categories:

  • seeker - individual looking for a solution
  • implementer - individual implementing solution
  • maintainer - individual maintaining solution
  • hacker - individual playing with their solution

I think this is a great way to look at your API consumers. I've heard many different approaches to labeling your API consumer buckets, but I'd say this is the first time that I've heard it put so elegantly. A moment I couldn't help but ruin, by selfishly adding:

  • analyst - individual looking to understand role solution plays in big picture

You can't forget about us analyst types who will just talk about your APIs, and rarely actually do any "real work". ;-) Anyways, I just wanted to share this small part of our conversation. The Dwolla developer portal experience is rich with API management examples that other API providers should be looking to emulate. I will keep processing the notes from our conversation, and share any other nuggets I collected during my walk through the API journey the Dwolla team has crafted within their new API portal.



The API Feedback Loop: Your Feedback Powers Everything We Do

One of the benefits of doing an API is that you can take advantage of the potential for a community feedback loop, driven by internal groups, external partners, and even in some cases the public. Under my API management research, you can find a number of building blocks I recommend for helping power your feedback loop, but sometimes I like to just showcase examples of this in the wild, to show how it all can work.

I was reading the Letter from our co-founder: 2016 Product vision, from Electronic Health Record (EHR) provider Practice Fusion, and I thought the heart of the blog post, represented a well functioning feedback loop. As Practice Fusion looked back over 2015, they noted the following activity:

Acknowledging that this feedback "powers everything we do", they continue listing some of the major customer requests that were fulfilled in 2015, and close the letter with an "eye towards the future". It is a simple, but nice, example of how a platform's community can drive the platform road map. Sure, a lot of this comes from the SaaS side of the Practice Fusion operations, but it works in sync with the Practice Fusion developer community as well.

The lesson from this one, to include in my overall research, is to always have a way to collect feedback from the community, tag ideas for discussion as potential additions to the road map, and carefully track which ideas make it onto the road map, and which ones actually get delivered. This is something that is pretty easy to do with Github Issue Management, which I use for my own personal projects to drive both my road maps, and the resulting change logs.
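
To make that a little more concrete, here is a minimal sketch of how a road map and change log could be pulled out of GitHub Issues, assuming a hypothetical repository and a "roadmap" label for tagging community ideas--just one possible way to wire up the loop, not how Practice Fusion or anyone else actually does it.

```python
import requests

# Hypothetical repository used to collect community feedback.
REPO = "example-org/api-feedback"
ISSUES_URL = "https://api.github.com/repos/{}/issues".format(REPO)

def fetch_issues(state, label):
    """Pull issues with a given state and label from the GitHub Issues API."""
    response = requests.get(ISSUES_URL, params={"state": state, "labels": label, "per_page": 100})
    response.raise_for_status()
    # Note: this endpoint also returns pull requests; real code would filter those out.
    return response.json()

# Open issues tagged "roadmap" become the public road map.
roadmap = fetch_issues("open", "roadmap")

# Closed issues tagged "roadmap" become the change log.
delivered = fetch_issues("closed", "roadmap")

print("Road map:")
for issue in roadmap:
    print(" - {}".format(issue["title"]))

print("Change log:")
for issue in delivered:
    print(" - {} (closed {})".format(issue["title"], issue["closed_at"]))
```

Run against a public repo, this needs nothing more than the labels being applied consistently--everything else falls out of the issue state.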

This post also pushed me to begin tagging these types of stories, organizing them into a simple API playbook, full of API platform how-tos, like this product vision letter from the co-founder of Practice Fusion.



Just The Best Parts Of The API Documentation Please

I was just talking API documentation with Brent Baker (@norcaljhawk), and Jordan Lampe (@JsLampe) from Dwolla. As we were going through their API documentation, they mentioned how they had used Slate for version 1.0 of their API documentation, but for this round they took what they felt were just the best parts of Slate, and were looking to craft a new experience.

Interestingly, I had written about their use of Slate for API docs back in 2014, so it makes sense for me to keep tracking on it, but more importantly I think the move reflects the wider evolution of API documentation. If you aren't familiar with Slate, it is a very attractive way to document your APIs that many API providers and consumers have latched on to. Slate, in contrast to the very utilitarian style of Swagger UI, has certain elements developers prefer--something I can understand, after looking at a lot of API docs.

Dwolla's evolution from their old static API docs, to Slate, and then to their current version, highlighted two important aspects of the modern API documentation evolution for me. First, the feedback to the Dwolla team revealed that the three-column format Slate used made the documentation seem like it was not part of the overall Dwolla experience--it felt separate. This is one unique thing Swagger UI did well: it could be embedded within any API portal, even though the look was not as attractive as Slate.

Second, as they evolve their overall portal experience, Dwolla was sure they wanted to keep the code sample viewer, which allows for inline viewing of raw, PHP, Ruby, Python, and JavaScript samples for each API. In rethinking their API docs, the Dwolla team wanted to decouple the three-pane experience, but keep the attractiveness, flowing navigation, and inline language samples that Slate delivered--keeping just the best parts of the increasingly ubiquitous API documentation format.

For me this highlights some of the elements the latest wave of API documentation developers, like Any API, may want to consider as they help push forward the overall API documentation conversation. Any API did what I wanted to see, combining the interactive functionality of Swagger UI with the clean UI / UX that Slate brought to the table. I recommend that anyone crafting the next generation of API documentation solutions consider the overall layout, making things more decoupled and responsive, and allowing for embedding of all, or even just portions, of the API documentation--making the API documentation experience more portable, and re-mixable.

Even though I'm not seeing everything I would like when it comes to the evolution of API documentation, I am optimistic, and my conversation with the Dwolla team shows me there is a lot of positive momentum in how leading API providers are thinking about what matters when it comes to API documentation.



I Just Cannot Get Behind API Patents, Especially When They Apply To HTTP And Hypermedia

I got an email in my inbox earlier today about a new API modeling language from Elastic Path. The product is called Helix, and it is sold as being "the first enterprise-class API modeling language designed specifically for REST Level 3".

The Elastic Path team is where I first learned about the concept of hypermedia, I believe back in 2011--honestly, I had heard the concept before, but never fully grasped what it was, or its potential, until 2011 (I know I am slow). However, it is an awareness that has grown rapidly the more I learn about hypermedia, study how people are practicing it, and even dabble in it myself with my curation and building block APIs.

Elastic Path is an expert in the area of hypermedia, so it makes sense they would step up with some hypermedia-focused API tooling, and even a modeling language. No surprises here, but I was a little surprised when I read the "Why Helix":

Elastic Path built Helix because existing API modeling languages, such as RAML, Swagger, and API Blueprint, while good for describing standard APIs, are not well suited to designing hypermedia APIs. Elastic Path Cortex, our patented API technology, utilizes Helix definitions and a Helix-compatible implementation, allowing our partners and customers to extend existing APIs and quickly create new ones.

After reading, I wanted to explore the "patented API technology" portion of that description. A quick Google Patents search turns up five separate patents held by Elastic Path:

Linking functionality for encoding application state in linked resources in a stateless microkernel web server architecture
Publication # US9143385 B2
A method of serving a resource to a client via a computer network is provided. The method may include at an HTTP server system having a stateless microkernel architecture, the server system including one or more link resource servers, receiving an HTTP request for a resource from an HTTP client via a computer network, the request being to perform a resource operation, the resource operation being to retrieve the resource and send the resource to the requesting client, wherein the resource is a data object. The method may further include determining if the resource operation is authorized based on the request. If the resource operation is authorized, the method may include sending the resource operation to an object server associated with the resource identified by the request, in response receiving a data object from the object server, providing, via a linking engine, the data object to each link resource server of the one or more link resource servers, in response receiving one or more links from each of the one or more link resource servers, embedding the links in the data object, and sending the data object to the requesting client via the computer network.

Follow location handler and selector functionality in a stateless microkernel web server architecture
Publication # US8959591 B2
A method of serving a resource to a client via a computer network is provided. The method may include providing a follow location handler logically positioned on a WAN side of an HTTP server. At the follow location handler, the method may include receiving a POST request from the client, and forwarding the POST request to the HTTP server. At the HTTP server, the method may include receiving the POST request, creating a modified data object based upon the form data, generating a link to the modified data object, and returning the link. At the follow location handler, the method may include intercepting the link to the modified data object from the server, sending a GET request to the server to retrieve the modified data object, and, in response, receiving the modified data object. The method may further include forwarding the modified data object to the client.

Stop condition functionality in a stateless microkernel web server architecture
Publication # US20130179498 A1
A method of serving a resource from an HTTP server system having a stateless microkernel architecture and one or more link resource servers is provided. The method may include generating a data object in response to an HTTP request, sending the data object to each of the link resource servers, and at each link resource server receiving the data object from the handler and examining the data object for pre-determined information to perform a linking operation. The method may further include if the data object includes the pre-determined information, performing the linking operation by returning one or more links to the handler linking to related information provided by the link resource server. The method may further include if the data object does not include the pre-determined information, not performing the linking operation and instead returning one or more stop condition links indicating that the pre-determined information is not included. 

Stateless microkernel web server architecture with self discoverable objects
Publication # US20130179545 A1
A method is provided for exchanging a self discoverable data object between a client executed on a client computing device and a server with a stateless REST-compliant software architecture, which is configured to reply to HTTP requests from a browser engine of the client and to messages from a runtime executable program executed by a runtime executable program interpreter of the client. The method may include receiving an HTTP response from the server, the response including the data object, the data object including a self entity including a URI and a content type of the data object, passing the data object to the runtime executable program at the runtime executable program interpreter for processing. The runtime executable program may communicate with the server using the URI and content type of the data object. Cache controls and an HREF may also be contained in the self entity. - https://www.google.com/patents/US20130179545?dq=inassignee:%22Elastic+Path+Software,+Inc.%22&hl=en&sa=X&ved=0ahUKEwiRnsHvlaXKAhVC9GMKHbyMAuIQ6AEIOzAE

Encoding application state in linked resources in a stateless microkernel web server architecture
Publication # WO2013102272 A1
A method of serving a resource to a client via a network is provided. The method may include at an HTTP server system having a stateless microkernel architecture with one or more link resource servers, receiving an HTTP request from an HTTP client to perform a resource operation to retrieve a resource data object. If the resource operation is authorized, the method may include sending the resource operation to an object server associated with the resource identified by the request, and in response receiving a data object from the object server; providing, via a linking engine, the data object to each link resource server of the one or more link resource servers; and in response receiving one or more links from each of the one or more link resource servers, embedding the links in the data object, and sending the data object to the requesting client via the computer network.

Hmmmmmm.....

I know you all think I'm some hippie dippie API dude, who doesn't get all this VC driven, business shit you are all calling startups, and innovation, and you are probably right. However, I'd argue that many of you really do not get this Internet, or API thing either. This is a hard one for me, because I've worked with Elastic Path since way back in API Evangelist history, and I personally know and love their team. So it's hard for me to call bullshit on this, but I have to.

I'm not all the way through the details of the patents, but in my opinion, they should never have been approved. They go against everything that is the Internet, and why APIs work. The only thing I see that is unique in the patent titles, abstracts, and art is the term "microkernel". Not sure exactly what that is, but the rest of it sounds like what we are all already doing with APIs to me--correct me if I am wrong?
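
To make that "what we are all already doing" point concrete, here is a minimal, hypothetical sketch of the generic pattern the abstracts describe--a server fetching a data object, embedding hypermedia links into it based on its state, and returning it to the client. The resource names and link relations are made up, and this is not based on any Elastic Path implementation.

```python
# A minimal, hypothetical sketch of embedding hypermedia links in an API response.
# Resource names and link relations are made up for illustration.

def load_order(order_id):
    # Stand-in for retrieving a data object from an object server or database.
    return {"id": order_id, "status": "open", "total": 49.99}

def link_resources(order):
    # Stand-in for the "link resource server" idea: contribute links related
    # to the data object, based on its current state.
    links = {"self": {"href": "/orders/{}".format(order["id"])}}
    if order["status"] == "open":
        links["payment"] = {"href": "/orders/{}/payment".format(order["id"])}
        links["cancel"] = {"href": "/orders/{}/cancel".format(order["id"])}
    return links

def get_order(order_id):
    # Retrieve the data object, embed the links, and return the representation.
    order = load_order(order_id)
    order["_links"] = link_resources(order)
    return order

if __name__ == "__main__":
    print(get_order(1001))
```

This is the same link-what-you-can-do-next pattern you will find in most hypermedia APIs in the wild.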

Luckily patents are coming back onto my radar, thanks to my lovely partner in crime doing the research for her piece, Ed-Tech Patents: Prior Art and Learning Theories. Her work, and this release from Elastic Path, are encouraging me to spend some fresh time going through my patent research, and make sure that I am paying attention to this type of stuff in real-time. I started my dive last January, making it a perfect time to refresh my research in the area.

I swear, with the Oracle v Google, last year's Swagger bullshittery, and now this, y'all are going to turn API into SOA, and then everyone is gonna be whining at me that I promised this API thing would work!! Luckily someone will come along with the next thing to sell you fuckers, make money, and y'all can fuck that up too. #IFuckingQuit!



How Do I Provide A List Of Certified Applications To My API Ecosystem Partners

I was emailed by someone working in government, asking some pretty interesting questions around using an application showcase to make trusted applications available to an ecosystem of partners. I'm not going to talk specifically about the agency, as I have not gotten approval to talk publicly, but I think the question is an interesting mix of several areas I am researching that I wanted to explore a little further, in an anonymous fashion.

This is a heavily edited, summarized version of what was asked, but it essentially went like this:

There are 400 apps that want to get data from an organization, but only some portion of them meet or exceed the governance criteria to be deemed "trustworthy". This usually involves ensuring certain legal commitments are followed and other privacy requirements are satisfied--these 400 apps will be evaluated, and if they qualify, they will be put into an application registry. Another aspect of it is that, rather than each of the 400 apps having to go to each of the partners to get authorized to access the partner's API, it would be more efficient for partners to rely on the application registry to determine if they can expose their APIs to that app's request(s) as well.

Application showcases that share approved applications with an ecosystem are common, but what I found interesting about this agency's question is that they also want to use the application approval process as a sign of trust for other partners, when it comes to accessing their own API resources. As I said, this conversation spans three key areas of what I'm seeing occur in modern API ecosystems:

  • 3rd Party Applications (i.e. approval, showcase, case studies, etc.)
  • Partner Programs (i.e. certification, access tiers, etc.)
  • Service Composition (i.e. plans, monetization, rate limiting, etc.)

Modern approaches to API planning, and well thought-out, API-driven partner programs, provide what you need to approve partners (i.e. companies) and the applications they build. Then your service composition, using API management infrastructure like 3Scale, is how you craft the partner tiers of access, and provide the infrastructure APIs your partners need to know which applications have which levels of access (I've included a rough sketch of what this registry-driven trust check could look like at the end of this post). I've written about service composition many times, but my partner research is just getting going, so I wanted to take a look at the API pioneers as potential blueprints in these areas--let's start with Amazon's approach:

As you can see, both the individual and group levels offer several layers of certification, depending on the individual, as well as the software being sold (i.e. government, hardware, AWS Marketplace, SaaS, etc.). For some more examples of partner programs, I am taking a look at Salesforce's certification program, eBay's Solutions Directory, Twitter's official Partner Program, Facebook's Marketing Partners program, and the Edmunds Certified Developer Network, for a start.

I felt this conversation provided me an interesting look at certifying partners (companies, organizations, agencies, individuals), and the software integration(s) they produce (app showcase, directory, etc.), then using that as a trust driver for the API service composition of your own platform, while also extending that identity, trust, and authentication for your partners to use in their own API service composition.

The application showcase portion of this story is interesting, but I think the use of partner program certification, and extending this to your partners so that they can use it in their own API service composition, is unique. Something that is totally possible if your API management, planning, and partner layers all have APIs. You can then use your infrastructure APIs to provide facades for your partners to access users, applications, partner program details, and other aspects of API operations, in a seamless way using APIs. *mind blown*
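
Here is the rough sketch I mentioned above--a hypothetical Python example of a partner API consulting a shared application registry before answering a request. The registry endpoint, field names, and resources are all made up for illustration, and this is not anything the agency in question has actually built.

```python
import requests

# Hypothetical application registry API maintained by the central organization.
REGISTRY_URL = "https://registry.example.gov/applications/{app_id}"

def is_certified(app_id):
    """Ask the shared registry whether an application has been certified."""
    response = requests.get(REGISTRY_URL.format(app_id=app_id))
    if response.status_code != 200:
        return False
    application = response.json()
    # Made-up field: assume the registry returns a certification status.
    return application.get("status") == "certified"

def handle_partner_request(app_id, resource):
    """A partner API checks the registry before exposing its own resources."""
    if not is_certified(app_id):
        return {"status": 403, "error": "application not certified"}
    # ...serve the requested resource to the trusted application...
    return {"status": 200, "resource": resource}

if __name__ == "__main__":
    print(handle_partner_request("app-1234", "/records/summary"))
```

In a real deployment the registry lookup would be cached and authenticated, but the basic idea is the same: certification happens once, centrally, and every partner can lean on it.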



Helping The Average Business User With More Information On How To Put APIs To Work

API Evangelist has long been dedicated to helping the average business user understand all that is API. I saw early on, in 2010, the potential of APIs when used to empower the IT, or even shadow IT, of the average business user. I think I've done well in this mission, except for one thing: the API Evangelist network is heavily focused on providing APIs, and much lighter on topics and information around consuming APIs--something I will be working hard to shift over the next five years.

To help me tackle this is my new partner Cloud Elements. I do not partner with organizations unless they help fuel my research, and Cloud Elements is helping me push forward several areas, including API aggregation and API reciprocity, as well as pushing me to profile 50 of the top business sectors. While these areas of my research will have lots of information for API providers, many of the companies, services, and tooling I profile in these areas will be about empowering API consumers--and even more importantly, all types of API consumers, not just developers.

Services like IFTTT and Zapier make APIs accessible to the average business user because they allow them to move things around in the cloud using APIs. API service providers like Cloud Elements are allowing the average user to not just move things around in a way that protects their interest, and the interest of platforms (aka reciprocity), but they also allow for aggregation, automation, and orchestration of critical business "life bits" like documents, contacts, and images. This is why I seek out partners like them, to work in concert to better understand the space, but also tell stories that help the average business user solve their actual problems. #winwin

You will hear me talk a lot about what Cloud Elements is doing in 2016. Partly because they give me money, yes, but more because that money lets me operate, spend my time researching how APIs can be put to work by the average business, and tell more stories that focus on API consumption. 3Scale, Restlet, and WSO2 have helped me push forward my API provider focused research, and I look forward to helping balance the API consumer side of the scales through my partnership with Cloud Elements.



API Aggregation, Reciprocity, and Orchestration

I struggle a lot with how I separate out my research areas--there are a lot of reasons why I will break off, or group, information in a certain way. Really it all comes down to some layer of separation in my head, or possibly what I perceive will be in my readers' heads. For example, I broke off hypermedia into its own research project, but now I'm considering just weaving it into my API design research.

This is one of the reasons I conduct my research the way I do: it lets me spin out research if I feel it is necessary, but I can also easily combine projects when I want. As I move API aggregation and reciprocity out of my "trends" category, and into my primary bucket of research, I'm considering adding a third area dedicated to just orchestration. Right now I'm thinking aggregation stays focused on providing APIs that bring together multiple APIs into a single interface, and reciprocity is about moving things between two API-driven services--while orchestration will be more about the bigger picture, involving automation, scheduling, events, jobs, logging, and much more.

I enjoy my research being like my APIs, keeping each the smallest possible unit. When they start getting too big, I can carve off a piece into its own area. I can also easily daisy chain them together, like API design, definitions, and hypermedia are. Some companies I track on will only enable API reciprocity at the consumer level, like IFTTT, while others like Cloud Elements will live in aggregation, reciprocity, and orchestration. I also think orchestration will always deal with business or industrial grade API usage, while individual users can look to some of the lighter weight, more focused solutions available in reciprocity.

Who knows? I might change my tune in the future, but for now I have enough curated stories, and companies focused on API orchestration, to warrant spinning it off into its own research. Once added, I will link it off the home page of API Evangelist with the other 35+ research projects into how APIs are being put to work. I'm hoping that, just like my research into API monitoring, testing, and performance produced a critical Venn diagram for me, API aggregation, reciprocity, and orchestration will help me better see the overlap in these areas for both API providers and consumers.



API Stack, APIs.io, And APIs.Guru Need You To Create And Share Your API Definitions

I feel pretty strongly that for the next wave of growth in the API sector, we need the majority of public APIs in use today to have well crafted, as complete as possible API definitions in either OpenAPI Spec or API Blueprint. Yes I know, we actually need all of these APIs to be crafted using hypermedia approaches, but until then we need them all to possess machine readable API definitions, making them discoverable, learnable, and consumable.

It is easy to get hung up on this being about API discovery, but API definitions are enabling almost every step along the 35 areas of the API life cycle I am mapping out. Historically API definitions have been used for interactive API documentation, but more recently they are also being used to light up other aspects of API integration, such as setting up your API monitoring, loading into the API client of your choosing, or lighting up a mock server for use in development.

In addition to enabling services and tooling throughout the life cycle, well crafted, complete API definitions are driving API design literacy. Many API developers and architects learn by reverse engineering the APIs they know, and a well crafted OpenAPI Spec or API Blueprint provides a detailed blueprint for enabling this experience. API definitions make it easier to learn about good, and bad, API design in terms of the APIs you actually care about--equaling a much more open mind.
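
If you have never looked at one of these definitions, here is a minimal sketch that assembles a bare-bones OpenAPI Spec (Swagger 2.0) document in Python for a hypothetical API--the host, path, and descriptions are made up, and a real definition would carry much more detail about parameters, schemas, and responses.

```python
import json

# A minimal, hypothetical OpenAPI Spec (Swagger 2.0) definition for a single endpoint.
# The host, path, and descriptions are made up for illustration.
definition = {
    "swagger": "2.0",
    "info": {
        "title": "Example API",
        "version": "1.0.0",
    },
    "host": "api.example.com",
    "basePath": "/v1",
    "schemes": ["https"],
    "paths": {
        "/speakers": {
            "get": {
                "summary": "List all speakers",
                "produces": ["application/json"],
                "responses": {
                    "200": {"description": "A list of speakers"}
                },
            }
        }
    },
}

# Write the definition to disk so it can be shared, indexed, or loaded into tooling.
with open("swagger.json", "w") as handle:
    json.dump(definition, handle, indent=2)
```

Even a skeleton like this is enough for tooling to generate documentation, a mock server, or a monitor, which is why getting these definitions created and shared matters so much.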

I've been working to craft API definitions as part of my API Stack work for over a year now. You can access all of the APIs.json + OpenAPI Spec + API Blueprint + Postman collections under the /data folder of the project repository. Additionally, this repo is one of the sources for APIs.io, an APIs.json driven open source API search engine, which provides an API for you to register and search for API definitions.

In 2015, I saw another strong player emerge that I'm big on supporting--APIs.guru. The open source API definitions repository is looking to be the Wikipedia of API definitions, providing a single place where we can find robust, complete API definitions in a variety of formats. They have done a lot of work to seed the repository with 196 APIs, possessing over 4000 endpoints, but they are looking to take things to the next level, and will need our help to make this Wikipedia of APIs become a reality.

The API Stack, APIs.io, and APIs.guru all need you to help contribute to, and refine, the API definitions in their indexes. Developers around the world are using these definitions in their work, and modern API tooling and service providers are using them to define the value they bring to the table. To help the API sector reach the next level, we need you to step up and share the API definitions you have with API Stack, APIs.io, or APIs.guru, and if you have the time and skills, we could use your help crafting new API definitions for other popular services available today.

If you need help getting in touch with APIs.io or APIs.guru, there is contact information on both their sites. Alternatively, feel free to just ping me with the URL of your own Github repository, and I'll make sure your API definition index gets in sync with all of this new wave of open API repositories.



Public GETs, In Concert With Private POST, PUT, And DELETE For Your APIs

Another story I wanted to tell from my work yesterday to expose an API, so I could get help with it, is focused on the service composition that I used. I feel like this is a powerful story that should be told, and retold, among API evangelists, across conversations with folks who are new to the API space, and to the concept of putting APIs to work in their daily business worlds.

The largest concern I hear from people who don't fully understand APIs is the perceived loss of control that comes from putting things up on the open Internet. When you don't understand modern API management infrastructure, and the nuance of API service composition, what an API does can seem pretty scary. The first lesson from exposing an API from the @APIStrat speaker database was about soliciting help from Nicolas Grenié (@picsoung), and the second lesson is centered on how I used a combination of public / private endpoints to make this work.

The first two endpoints or methods I published from my speaker database were simple GETs:

All 351 speakers in our database are already public, on the schedules for each of the six @APIStrat events, so there really is no reason why I would lock up the APIs for getting this information--I am just returning JSON representations, in addition to the HTML I already publish on the websites.

However, for the collaboration part with @picsoung--the POST, PUT, and DELETE (aka add, update, delete)--I'm going to need to secure things a little more. Using my already in place 3Scale API management infrastructure, I have an access tier specifically for partners like 3Scale and WSO2, and @picsoung already has a set of API keys at this API plan level. I simply put the three endpoints / methods for POST, PUT, and DELETE for the URLs above into my "partner" level, and they require an appID and appKey for each API call--secured.
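
As a rough illustration of this kind of service composition--not my actual implementation, and with made-up paths, data, and keys--here is a minimal Python (Flask) sketch where the GET is wide open and the write method requires an appID / appKey pair, the check that an API management layer like 3Scale would normally handle for you:

```python
from flask import Flask, request, jsonify, abort

app = Flask(__name__)

# Hypothetical partner keys; in practice the appID / appKey check happens in your
# API management layer (3Scale in this story), not hard coded in the application.
PARTNER_KEYS = {"example-app-id": "example-app-key"}

speakers = [{"id": 1, "name": "Jane Doe"}]  # stand-in data

def require_partner_keys():
    """Reject write requests that do not carry a valid appID / appKey pair."""
    app_id = request.args.get("app_id")
    app_key = request.args.get("app_key")
    if not app_id or PARTNER_KEYS.get(app_id) != app_key:
        abort(403)

@app.route("/speakers", methods=["GET"])
def list_speakers():
    # Public read access--no keys required.
    return jsonify({"speakers": speakers})

@app.route("/speakers", methods=["POST"])
def add_speaker():
    # Write access is limited to the "partner" plan.
    require_partner_keys()
    speaker = request.get_json()
    speakers.append(speaker)
    return jsonify(speaker), 201
```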

This is a simple, yet powerful example of how APIs work, and specifically the level of control that API service composition gives you. I can easily make the speakers across our six conferences available in a machine readable format for use in other websites, and mobile applications. With the same API, I am also able to open up write capabilities to the partners I trust, enabling them to help me evolve my speaker information. I am able to publicly share my resources via an API, making HTML and JSON versions available, while also opening up important collaboration which will move my work forward, in a safe, and secure way.

This is APIs in action, and how they actually give you more control over the digital resources that are fueling your daily operations.