{"API Evangelist"}

Are Your APIs Ready For The Coming Containerization Evolution Of The API Space?

If you were at Defrag in Colorado last November, or in Sydney, Australia for API Days last week, you've heard me talk about what containers are doing to APIs. A subtle but important shift in how APIs are being deployed is occurring right now, and as John Sheehan (@johnsheehan), the CEO of Runscope, says, containers are doing for APIs what APIs have been doing for businesses.

As I was processing news for API.Report this morning, I found more evidence of this with the release of a logging API container from Logentries. APIs have made resources like logging much more modular and portable, available for use across multiple channels like mobile apps and websites. A containerized logging API makes the concept even more portable by adding an entirely new dimension for deployment. You don't just have a logging API, you now have a logging API that can be deployed anywhere: in the cloud, on-premise, or on any physical object.
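
To make that concrete, here is a minimal sketch of what consuming such a containerized logging API might look like from application code once the container is running locally. The port, the endpoint, and the payload shape are my own assumptions for illustration, not the actual Logentries interface:

import requests  # pip install requests

# Hypothetical: a logging API container running on the local host.
# The port, the /logs endpoint, and the payload shape are assumptions
# for illustration, not the actual Logentries container interface.
LOGGING_API = "http://localhost:8080/logs"

def log_event(level, message):
    """Send a log entry to the locally deployed logging API."""
    response = requests.post(
        LOGGING_API,
        json={"level": level, "message": message},
        timeout=5,
    )
    response.raise_for_status()

# The same client code works whether the container is deployed in the
# cloud, on-premise, or on a network device; only the host changes.
log_event("info", "user signed in")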

This opens up a whole new world of APIs, one that goes beyond just a programmable web, quickly evolving us towards a programmable world, for better and for worse. Last month I asked, when will my router have Docker containers by default? I received an email from a major network equipment manufacturer this week, letting me know that they are working on it. This means little containers on Internet-enabled objects across our world, ushering in the ability to deploy localized, wholesale APIs, and letting us manifest exactly the virtual API stacks we need, for any objective.

I try not to hype up where things are going in the API space, so I will stick with calling containers a significant evolution in how APIs are done. This new way of deploying APIs will push the evolution of the business of APIs, changing how we generate revenue from valuable resources, while also putting even more stress on the politics of APIs, with the introduction of more privacy and security concerns, not to mention a whole new dimension of complexity.

I'm not 100% sure where all of this is going. As with the rest of the API space, I struggle with making sense of all of this in real-time, and with allocating the mental bandwidth to see the big picture. All I can say at this moment is: make sure you work to better understand the various approaches to containerization, and adopt a microservices approach to your API design. Beyond this, all we can do is keep an eye on what companies like Logentries are doing when it comes to containerized API deployment, and try to learn as fast as we possibly can.



An API For The Interactive JumboTron Floor Display At The National Museum of Mathematics (MoMath) In New York

I just found one of the coolest API stories I've seen in a while over at CHANCE, the quarterly magazine designed for "anyone who has an interest in the analysis of data, informally highlighting sound statistical practice." CHANCE talked with Glen Whitney, the executive director of The National Museum of Mathematics (MoMath), about their new hands-on, API-driven exhibit, through which the "museum has created a physical and virtual recreational math community to nurture this generation and the next in their mathematical pursuits."

As part of their plans to reach people outside New York City, and encourage them to join the conversation at the museum, they have installed a JumboTron on the floor, which, as Whitney describes:

"You can walk right onto it and it's equipped with sensing technology, so it'll know the location of everyone who's standing on the floor. We have a variety of exploratory mathematical activities on that floor. We'll have mazes that have special rules or maybe a lot of turnstiles that trigger changes as you walk through them. It shows the notion that math is about exploring the consequences of simple rules."

At the heart of the interactive mathematics exhibit:

"There will be an API (application programming interface)—a system by which groups can submit their own activity to be displayed on the Math Square floor. We will invite submissions from across the country. And we’ll have a curation process, of course. If one group’s exhibit is selected, we’ll give them the opportunity through live streaming video where the class can see another group in the museum interacting with their creation and get feedback about what these other students experienced as they explored whatever puzzle, problem, or illustration the originators created. We’re looking forward to that as a way of connecting people from around the country."

At a moment when I'm most concerned about the Internet of Things (IoT) API efforts I'm seeing emerge across the landscape, an API project like the MoMath API shows up to make me happy. :-) Can you imagine the possibilities here? Not just for interactive, API-driven displays on the floor at the National Museum of Mathematics, but interactive mathematics anywhere you can install a visual display, or a network of API-driven human interfaces.

I am very curious to see what mathematicians around the world do with the MoMath API project, and to better understand how we can use APIs to make math a much more fun, accessible, and interactive experience, one that can be woven into our daily lives.

P.S. I really, really hope this is good enough to make it into the Hack Education roundup! ;-)



Changes To LinkedIn Developer Program Are No Surprise

LinkedIn recently announced some changes to their developer program, which involve further tightening down the screws on who has access to the API, limiting public access to only a handful of very superficial APIs. If you want greater access to the business social network's API, you will need to be an officially approved partner.

As a result of LinkedIn's announcement, you will hear more discussion about the demise of public APIs, as this is a narrative many API providers would like to employ to support their own command-and-control positions around their clients' or their very own API-driven resources. There is nothing wrong with having private APIs with supporting partner programs, but this has no bearing on the viability of publicly available APIs.

In reality, LinkedIn's API never really was open. Sure, it is a public API, but it has never been developer friendly, oftentimes taking a very adversarial stance toward its community, as opposed to embracing, nurturing, and investing in its developer ecosystem. Honestly, this is okay. Not every company has the DNA, or the business model, to make public APIs work; this latest move by LinkedIn reflects their ability, not the potential of public APIs.

We can't expect all companies to be able to make public APIs work; it isn't easy. When it comes to making money around valuable content and data online, a closed ecosystem is seen as being better. Tighter control over your users' data exhaust allows you to decide who can do what, limiting access to just the partners who have business relationships with you. You just can't monetize user-generated content to the extent LinkedIn would like without taking away users' control of, and access to, this data.

Even with LinkedIn's stance, there are a number of lessons to be learned by studying their approach. As with Twitter and Facebook, there are plenty of positive moves to analyze, as well as numerous negative elements, that you can learn from when crafting the tone for your own API. As an API provider, do not dismiss what you can take away from LinkedIn's platform; as a consumer, LinkedIn is a valuable lesson in what you should look for in an API platform.

Ultimately, the move by LinkedIn is no surprise to me, and the platform is purely a distribution channel for me, and has been for some time. Meaning, I only syndicate content there, and you will never find me actually engaging very deeply on the platform, or building relationships there, because, along with other platforms like Quora, I do not have any ownership over the exhaust I generate. As a professional this is unacceptable to me, as I have a valuable brand that I carefully maintain. As other professionals realize this, they too will mostly abandon the business social network, leaving it a spammy corner of the Internet where HR professionals prey upon the semi-professional, aspiring employee types.



Migrating My Own API Infrastructure Conversations To My Personal Blog And Keeping API Evangelist About Mainstream Stories

After seeing the conversation around my post In The Future There Will Be No Public vs. Private APIs, I'm reminded of my own mission. I write on API Evangelist first and foremost for my own education, and secondarily to help educate the normals about the importance of APIs. Not for page-views. Not to educate the API echo chamber. Not to drive conversation over at DZone or Hacker News. Definitely not to insult anyone.

That story was me working through my own service composition, and looking at one possible future. The exchange you hear in that story, and in all my stories, is the conversation between the voices in my head, and is never meant to insult anyone (think Michael Keaton in Birdman). All of this has reminded me that API Evangelist is not about cutting-edge stories, like the unproven stuff I'm doing with my own API architecture, Docker, and microservices. However, it is critical that I still flesh out my ideas, in my own way, so I'll move these stories to my personal blog, kinlane.com.

As I re-read that post, I'm faced with the link-baity title, which was not crafted with that intent, and with the error and brevity in the one statement that was singled out and ultimately fueled the conversation. I'm always happy to see conversation stirred, but not in the way it was around David's post. It is ridiculous that I would allude to privacy being gone from the conversation around APIs (really?), but ultimately I was pleased to see most people make the same argument that I was having in my own head.

I'm super thankful for APIEvangelist.com, and for my readers. It makes it possible to bring in the little bit of money I do from 3Scale, Restlet, and WSO2, pay my rent, and fund my research and learning. I'm also thankful for moments like this that help me remember why it is that I do this, and to stay true to my API Evangelist mission.



APIMATIC Code-Generation-as-a-Service Has Built-In Support For API Commons Manifest

The API savvy folks over at Apimatic are at it again, pushing forward the conversation around the generation of software development kits using machine-readable API formats, and this time the doorway to your SDK is via the API Commons manifest.

I'm going to go ahead and use their own description, as it sums it up well, no augmentation needed. Using the code generation API, you can generate SDKs for your API directly from your GitHub repository.

Step 1: Describe your API using some format. You may choose from the Swagger, RAML, API Blueprint, IODocs, and Google Discovery formats. Automatic code generation makes use of the information in your API description to generate method and class names. Please be as expressive as possible by not leaving out optional fields where applicable, e.g., types and schemas for your parameters and fields.

Step 2: Define meta information about your API using the API Commons manifest format. You can generate your API Commons manifest using the API Commons Manifest generator. Be sure to enter all relevant information. Upload the generated manifest as a new file in the root directory of your GitHub repo by the name "api-commons-manifest.json". Be sure to have the correct name and location for this file.

Step 3: Open/Create a markdown file (README.md is a good candidate). Add the following markdown syntax to render an image link.

[![apimatic][apimatic-{platform-name}-image]][apimatic-{platform-name}-url]

[apimatic-{platform-name}-url]: https://apimatic.io/api/github/{account-name}/{repo-name}/{branch-name}?platform={platform-name}

[apimatic-{platform-name}-image]: https://apimatic.io/img/github/{platform-name}.svg

Replace the {platform-name} token with one of the following values: windows, android, ios
Replace the {account-name} token with your URL-encoded GitHub account name
Replace the {repo-name} token with your URL-encoded GitHub repository name
Replace the {branch-name} token with your URL-encoded GitHub branch name where the API Commons manifest file is present.
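
For instance, with a hypothetical GitHub account "acme", a repository "weather-api", a "master" branch, and the android platform, the three lines above would become:

[![apimatic][apimatic-android-image]][apimatic-android-url]

[apimatic-android-url]: https://apimatic.io/api/github/acme/weather-api/master?platform=android

[apimatic-android-image]: https://apimatic.io/img/github/android.svg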

To validate, open the following URL after replacing the tokens. This URL should open the raw manifest file: https://raw.githubusercontent.com/{account-name}/{repo-name}/{branch-name}/api-commons-manifest.json
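
If you would rather script that check than click through, here is a minimal sketch using the URL pattern above; the account, repo, and branch values are hypothetical placeholders for your own:

import requests  # pip install requests

# Hypothetical placeholder values; substitute your own GitHub details.
account, repo, branch = "acme", "weather-api", "master"

# The raw manifest URL from the validation step above.
manifest_url = (
    "https://raw.githubusercontent.com/"
    f"{account}/{repo}/{branch}/api-commons-manifest.json"
)

response = requests.get(manifest_url, timeout=10)
response.raise_for_status()   # fails if the file is missing or misnamed
response.json()               # fails if the manifest is not valid JSON
print("API Commons manifest found and parses as JSON.")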

You can see an example here. Commit your changes and navigate to your Markdown file in your browser. You will see Apimatic widgets (image links), which you can click to generate SDKs for your API. To see an example, open this link to view the README.md file in raw text form.

The Apimatic team is owning the conversation when it comes to the generation of full-fledged SDKs for your APIs. I always hear folks talk about the limitations of auto-generating client-side code, but the Apimatic team is pushing the conversation forward with their persistent approach.