The API Evangelist Blog

This blog represents the thoughts I have while I'm researching the world of APIs. I share what I'm working on each week, and publish daily insights on a wide range of topics from design to deprecation, spanning the technology, business, and politics of APIs. All of this runs on Github, so if you see a mistake, you can either fix it by submitting a pull request, or let me know by submitting a Github issue for the repository.


OpenAPI Spec Google Spreadsheet to Github Jekyll Hosted YAML

I have been playing around with different ways of using Google Spreadsheets to drive YAML and JSON data for Jekyll data projects hosted as Github repositories. It is an approach I started playing around with in Washington DC, while I was helping data stewards publish government services as JSON-LD. It is something I've been using lately to drive D3.js visualizations and even a comic book.

There are a couple of things going on here. First, you are managing machine-readable data using Google Spreadsheets, and publishing this data as two separate machine-readable formats: JSON and YAML. When these formats are combined with the data capabilities of a Jekyll website hosted on Github Pages, it opens up some pretty interesting possibilities for using data to fuel some fun things. Plus...no backend needed.

To push this approach forward I wanted to apply it to managing OpenAPI Specs that can be used across the API life cycle. I pulled together a spreadsheet template for managing the details I need for an OpenAPI Spec. Then I created a Github repository, forked my previous spreadsheet to YAML project, and modified it to pull data from a couple of worksheets in the Google Doc and publish it as both JSON and YAML OpenAPI Specs.
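
To give you a rough sense of the moving parts, here is a minimal sketch of the sync step in Python, assuming the spreadsheet is published to the web--the sheet key, worksheet id, and file paths are all placeholders for whatever your project uses:

```python
# A sketch of pulling a published Google Spreadsheet and writing it out as
# both YAML and JSON for a Jekyll project. Sheet key and gid are placeholders.
import csv
import io
import json

import requests
import yaml  # PyYAML

SHEET_KEY = "YOUR-SPREADSHEET-KEY"  # placeholder
GID = "0"                           # placeholder worksheet tab id

# Published spreadsheets can be exported as CSV -- no backend needed.
url = f"https://docs.google.com/spreadsheets/d/{SHEET_KEY}/export?format=csv&gid={GID}"
rows = list(csv.DictReader(io.StringIO(requests.get(url).text)))

# Files in _data/ become available to Liquid templates in the Jekyll site.
with open("_data/openapi.yaml", "w") as f:
    yaml.safe_dump(rows, f, default_flow_style=False)

with open("openapi.json", "w") as f:
    json.dump(rows, f, indent=2)
```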

My OpenAPI Spec Google Sheet to YAML for use in a Jekyll project hosted on Github is just a prototype. The results don't always validate, and I'm playing with different ways to represent and manage the data in the Google Sheet. It is a fun start though! I am going to keep working on it, and will probably start a similar project for managing an APIs.json index using Google Sheets. When done right, it might provide another way for non-developers to participate in the API design process, and apply OpenAPI Specs to other stops along the API life cycle like API documentation, SDK generation, or testing and monitoring.


How Do We Keep The Fire Alive In API Space?

It is tough to keep a sustained fire burning in the world of technology, at the individual, organizational, and community level. I have been doing API Evangelist full time for six years, and it is no secret that I have had several moments where I've experienced a crisis of faith, and I do not doubt that there will be many more of these in my future--there is no perfect solution. It takes hard work, creativity, and a myriad of other considerations to keep going, stay energized, and keep other folks doing the same.

I have spent a great deal of time this fall thinking about all of the factors that influence me, contributing to the fire burning, or acting as a flame retardant for me and the API space. When exploring these contributing factors, it is only logical we start with the external forces, right? Because this all sucks because of everything external, aka all of you! Couldn't possibly be me?

So what are some of the external forces out there that contribute to the fire burning brightly, or possibly being dampened, across the API space?

  • People Aren't Always Nice - For some reason, the Internet has brought out the worst in many of us. I personally feel this is the nature of technology -- it isn't human, and the more time we spend with it, the less human we are, and the less empathy we have for other people.
  • Everyone Takes A Little - Until you've spent a great deal of time in the spotlight writing, speaking, or otherwise, you don't fully grasp this one. Every person you talk to, every hand you shake, takes a little bit from you -- making it super critical for people to give back -- it all takes a toll, whether we acknowledge it or not.
  • Few Ever Give Back - The general tone set by startup culture and VC investment is to take, take, take, and rarely give back. The number of people who want to pick your brain, take your open work, and not ever give anything in return is far greater than the people openly giving back, or acknowledging they should pay you for your time.
  • Use & Subscribe To Open Channels - Follow me on Twitter and Medium. Subscribe to the Atom feed and email newsletter. Support the people in the space who provide open channels of information by tuning in and engaging.
  • Fork, Contribute & Build Upon - When you fork someone's work, plan how you will contribute back, build upon, and cite their work. Don't just use open source, contribute back to it, and augment it with the value you bring to the table.
  • Events Are Exhausting - Producing, organizing, and pulling off valuable events that focus on ideas over products is fucking hard, and you should support the open tech events you think contribute most to the tech space. Invest time, money, and your energy wherever you can afford it. I know your company demands you get the most out of your sponsorships, but step back and consider how you can give the most as part of your sponsorship as well.
  • Where We Invest - 95% of the investment in the API space is in proprietary products, services, and technology. The other 5% is spread across ideas, open concepts, specifications, definitions, and software. If companies do invest in "open" it is in name only, and not a true investment. Everyone suffers from this behavior, making the space pretty fucking business as usual--which is a fire that nobody wants to tend for very long.
  • Intellectual Property - Unhealthy views on IP lock up ideas. I'm not saying copyright, patents, and licensing shouldn't exist. I am saying that aggressive, greedy views will continue to act as a wet blanket when it comes to APIs making a meaningful impact, and make the game pretty untenable for individuals.

Next up, what are some of the internal forces (aka my responsibility) that can contribute to the fire burning more brightly in the API space?

  • Going To Burning Man - Yes, Burning Man will reinvigorate you each year, but we have to find a way to establish a regular altar that we can burn on, finding renewal on a regular basis without so much fucking dust involved.
  • Eat Well - If we've learned anything over the last six years it is that man cannot live on pizza alone. Words of caution for you youngsters--eventually you will start falling apart, and this shit will break if you do not eat well.
  • Alcohol - Drinking is fun and an important social lubricant, but it can also lead to some very bad behavior in real-time, as well as contributing to a multitude of other issues in life ranging from health to relationships. 
  • Exercise - We just can't sit on our asses all the time, as much as we'd like to think we can. Again, this isn't a problem for the youngsters, but as we get on in life, it catches up with us, and we are better off finding a way to work it in regularly.
  • Family Time - Spending time with family is important. Too much travel, screen time, and work all hurt this. Regularly check in on what is important when it comes to family. Even if it is just working on API integrations with your kids (did I tell you I'm doing a project with my daughter?) -- woohoo!
  • Creativity - Take time to invest in and move forward creative projects. All work and no play makes us all very fucking boring, and is not conducive to any flame burning. The less I invest in my creative side, the less productive I am in my regular work. As an individual and a business, make sure there is investment in creative projects and space.
  • Money - Money is not everything, but making some is required. I've had several waves of $$ making good in my career, and it rarely ever brought me more happiness, and always brought me more concern. There is a lot of focus on VC investment, and showcasing the successful founders in the space. To keep a sustained fire burning we have to realize there is a lot more to all of this than just making money.

These are just some of the key external and internal forces contributing to the fire burning within me individually when it comes to APIs, and I feel they also contribute to the fire burning across the community (which I am part of). Startup and VC investment success do not equal community and individual success. Rarely does a startup winning contribute to any single individual's success, or the wider community being more vibrant, creative, and rich. You have the rare rock star founder, and always the wealth of corporate and brand success, but these do not make a fire burn brighter in the community. It might attract a few moths to the flame along the way, but it doesn't truly enrich everyone, and provide fuel for a sustained burn--it is about burning bright, fast, and hard, which isn't good for most of us humans.

I keep going as the API Evangelist because I'm interested in what I'm doing. I'm fulfilled by learning, writing, sharing, and building. I will keep going for another 10, 20 years, hopefully until the end of my life, because a real fire is truly burning--not because I met my sales goals, sold my startup, or reached API economy nirvana (that is API singularity). Most of the time I'm learning, being creative, and making more money than is required to pay my rent, cover my bills, and eat well. More meetings. More projects. More handshakes. More money does not always nurture me, and keep the fire alive personally, or within the wider API community.


The Twitter Branding Page Provides Minimum Bar For API Providers

API branding is an area that I find to be contradictory in the space, with the loss of brand control being among the top concerns for companies when doing APIs, while simultaneously one of the most deficient areas of API operations, with most API providers having no branding guidance in their developer portal whatsoever. I think it is just one of those telling aspects of how dysfunctional many companies are, and how their concerns are out of alignment with reality, and with where they are investing their resources.

Every API should have some sort of branding page or area for their API operations--I even have a branding page. ;-) If you are looking for a healthy example to use as a baseline for your branding page, take a look at Twitter's branding page, which provides the essential building blocks you should be considering:

  • Simple URL - Something easy to find, easily indexed by search engines, even a subdomain like Twitter does.
  • Logos & Assets - Provide us with at least a logo for your company, if not a wealth of assets to put to use.
  • Branding Guidelines - Offer up some structure, helping guide us, and show you've put some thought into branding.
  • Link to Embeddables - If you have any buttons, badges, and widgets, point us to your embeddable resources.
  • Link to Terms of Service - Provide us with a quick link to the terms of service as it applies to branding.
  • Contact Information - Give me an email, or another channel for asking a question if I get confused at any time.

I do not agree with all of Twitter's branding enforcement, but I respect that they have set a bar, and provide the ecosystem with guidance. At the very least, it makes sure all of us are using the latest logo, and when done right it can help encourage consistency across every integration that is putting API resources to work. I have a hard time respecting branding concerns from companies who don't have a dedicated branding resource area for me to consult.

When done right, APIs extend the reach of any brand. Think about Twitter and Facebook sharing, and other embeddables. Most API consumers are more than happy to help extend your brand, but we are going to be lazy, and will ALWAYS need guidance and things to copy / paste. At the very least, make sure you have the minimum viable branding assets and guidance like Twitter does--it really is the minimum bar for any API operating out there, and should not be ignored.


Learning About APIs Has To Be Relevant And Interesting

I am working on a project with a 16-year-old young lady to extract and tell a story using the YouTube API. I'm pretty excited about the project because the young lady happens to be my daughter Kaia Lane. If you've ever seen my API Blah Blah Blah t-shirt, you've seen her work. Historically she couldn't care less about APIs, until recently when pulling data about one of her favorite YouTube stars came up--suddenly she is interested in learning more about APIs.

During regular chatting with my daughter, I shared a story on the entire history of Kickstarter projects broken down by city. She is a little geeky and likes Kickstarter, so I figured I'd share the approach to telling stories with data, and said that if she ever wanted help telling a story like this using YouTube, Instagram, or another platform, she should let me know. She came back to me a couple days later asking to learn more about how she could tell a story like this using data pulled from one of the YouTube stars she follows.

Ok. I have to stop there for a moment. My 16-year-old daughter just asked me to learn more about APIs. :-) As an old goofy dad who happens to be the API Evangelist, I am beside myself.

I'm not 100% sure where this project will go. Right now I'm just seeing what data I can pull on Dan & Phil's video game YouTube channel, and from there we'll talk more about what type of story we want to tell about their followers and get to work pulling and organizing the data we need. I couldn't think of a tougher audience to be trying to get interested in APIs. She isn't going to care about APIs, want to learn about APIs, let alone become proficient with APIs, unless they are relevant and interesting to her world.
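
Here is a rough sketch of the kind of data pull we are starting with, using the YouTube Data API v3--the API key and channel id are placeholders you'd swap in from the Google API Console:

```python
# A sketch of pulling channel stats and recent uploads from the YouTube Data
# API v3. The API key and channel id are placeholders.
import requests

API_KEY = "YOUR-API-KEY"  # placeholder, from the Google API Console
CHANNEL_ID = "UC..."      # placeholder id for the channel we are profiling

# Pull subscriber, view, and video counts for the channel.
stats = requests.get(
    "https://www.googleapis.com/youtube/v3/channels",
    params={"part": "statistics", "id": CHANNEL_ID, "key": API_KEY},
).json()
print(stats["items"][0]["statistics"])

# List the most recent uploads, the raw material for the story we will tell.
videos = requests.get(
    "https://www.googleapis.com/youtube/v3/search",
    params={
        "part": "snippet",
        "channelId": CHANNEL_ID,
        "order": "date",
        "maxResults": 10,
        "key": API_KEY,
    },
).json()
for item in videos.get("items", []):
    print(item["snippet"]["title"])
```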

I do not think this lesson is exclusive to teaching 16-year-olds about APIs. I think this applies to ANYONE potentially learning about APIs. I am a big fan of EVERYONE learning about APIs because they are the plumbing that moves our bits and bytes around in our personal and professional worlds. The more we are aware, and the more we know we can put APIs to work, the more successful we are going to be in our lives. I want EVERYONE to open up and learn about APIs for this reason, but I REALLY REALLY want my daughter to find success in this way.

Just something to consider as we are trying to help key internal, essential partner, and other public stakeholders understand the API potential. How can we present APIs in a way that is relevant and interesting? Otherwise, most people probably aren't going to care, and it will all just be API Blah Blah Blah!


Are API Docs & Definition Formats A Single Thing Or Separate?

I was reading a virtual panel: document and description formats for web APIs, and thought the conversation was very productive when it comes to helping bring the world of API documentation and definitions into better focus. I encounter daily reminders that folks do not see the many dimensions of API definitions, and the role they play in almost every stop along the life cycle. This virtual panel helps move this discussion forward for me, providing some clarification when it comes to the separation between API definitions and API documentation.

One of the questions asked of the panel was "Do you see API Documentation and Description formats as a single thing? Or multiple things?" I found Zdenek Nemec's (@zdne) answer to be a great introduction for folks when it comes to understanding the importance of this separation:

There are definitely two different things. But truth be told, the initial incentive for the use of API description formats was definitely the vision of API documentation without much work. However, the tide is turning as more and more people are discovering the benefits of the upfront design, API contracts, and quick prototyping.

Many people still see machine-readable definitions as purely something that drives API documentation: OpenAPI Specs are just for deploying Swagger UI, and API Blueprint is just for using Apiary. In reality, the why and how of API definitions goes much, much deeper. As Z from Apiary points out, they are key to the API design and prototyping process, and critical to establishing the API contract.

Realizing that crafting machine-readable API definitions is not just about API documentation, and that it is essential to establishing a meaningful technical, business, and legal contract internally, with partners, and maybe the public, early on in the API life cycle, is empowering. I would say that I didn't fully appreciate API design, and understand the depth of it, until I had the OpenAPI Spec providing me with a scaffolding to hang things on.

Anyways, it is a great conversation from some very smart folks over at InfoQ, and I recommend heading over and spending time absorbing it. I'm leaving it open for a week and rereading it until it all sinks in.


All The Right Channel Icons In Support Of Your API Platform

I look at a lot of websites for companies who are providing APIs and selling services to the API space. When I find a new company, I can spend upwards of 10 minutes looking for all the relevant information I need to connect, like where their Twitter and Github accounts are. These are all the key channels I am looking for so that I can better understand what a company does and stay in tune with any activity, but they are also the same channels that developers will be looking for so that they can stay in tune with a platform as well.

I spend a great deal of time looking for these channels, so I'm always happy when I find companies who provide a near complete set of icons for all the channels that matter. Restlet, the API design, deployment, management, and testing platform, has a nice example of this in action, providing the following channels:

  • Facebook
  • Twitter
  • Google+
  • LinkedIn
  • Vimeo
  • Slideshare
  • Github
  • Stack Overflow
  • Email

All of these channels are available as orderly icons in the footer of their site, making my job easier, and I'm sure making it easier for other would-be API developers. They also provide an email newsletter signup along with the set of icons. While this provides me with a nice set of channels to tune into, more than I usually find, I would still like to have blog and Atom feed icons, as well as maybe AngelList or Crunchbase, so that I can peek behind the business curtain a little.

I know. I know. I am demanding, and never happy. I am just trying to provide an easy checklist of the common channels that companies looking to do interesting things with APIs should consider offering. You should only offer up channels that you can keep active, but I recommend that you think about offering up as many of these as you can possibly manage. No matter which ones you choose, make sure you organize them all together, in the header and footer of your website, so nobody has to go looking for them.


A More Honest And Flexible API Contract Using Hypermedia

One of the reasons I write so much on API Evangelist is to refine how I tell stories about APIs, and hopefully make a bigger impact by being more precise in what I'm saying. I feel like one of the reasons hypermedia API concepts have taken longer than we anticipated to spread is because many (not all) of the hypermedia elite suck at telling stories. I am sorry, but you have done a shit job selling the importance of hypermedia, and oftentimes were just confrontational, turning many folks off to the subject.

I am playing around with telling different stories about hypermedia, hoping to soften some of the sharp edges of the hypermedia stories we tell. One of the core elements of hypermedia APIs is that they provide us with links as part of each API response, emulating much of what works on the web in the system-to-system and application layers. One of the benefits of these links is that they help facilitate the evolution and change that is inevitable in our API infrastructure.

The default argument for hypermedia folk is often about versioning and change-resistant API clients. I'm looking to make these concepts more human, showing that employing hypermedia as part of our API contracts is more about providing an honest and flexible view of the business relationship we are entering into. As API provider and consumer, we are acknowledging that there will be change and evolution in the resources being delivered, and being more honest that this exists as we craft this API contract.

As an API provider, I am not just going to dump this JSON product response on you, expect you to know what to do, and refuse to have a discussion with my consumers about how this product will change. It might be out of inventory, it might be replaced by another product, and any number of other changes may occur in the normal course of business. Hypermedia gives me a framework to acknowledge that things will indeed evolve upon entering our agreed upon API contract, and provides a channel for all integrated API clients to receive future instructions about what is possible when things do change--our API contract is flexible.

As an API provider, when I share this machine-readable representation of my product, I'm not just assuming you know you can add it to the cart, or to a wish list--I am providing you with the links to do this, and the link relations that define this relationship. As things change and evolve, I have a way to share other opportunities, and changes to this business contract. When a product is recalled, there is a link routing you to what you need. When a product is replaced, you are routed to what you need. As an API provider, I'm committed to providing you with what you need to be successful when putting API resources to work.
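
To make this less abstract, here is a minimal sketch of the kind of product response I'm talking about, loosely following the HAL convention--the fields, links, and URLs are all made up for illustration:

```python
# A made-up, HAL-style product response showing how links and link relations
# carry the business contract alongside the data itself.
product = {
    "id": "prod-1234",
    "name": "Example Widget",
    "price": 9.99,
    "in_stock": True,
    "_links": {
        "self": {"href": "/products/prod-1234"},
        # What the consumer can do with this product right now.
        "add-to-cart": {"href": "/carts/items", "method": "POST"},
        "add-to-wishlist": {"href": "/wishlists/items", "method": "POST"},
        # Where to go when things change -- recalls, replacements, etc.
        "replacement": {"href": "/products/prod-5678"},
    },
}

# When the product goes out of inventory, the provider swaps the links rather
# than breaking the client -- the client simply follows whatever is there.
if not product["in_stock"]:
    product["_links"].pop("add-to-cart", None)
```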

Hypermedia goes beyond just providing links, managing versioning, and ensuring more resilient API clients--a very technical story. Hypermedia gives us a more honest and flexible way to define the API business contracts we are entering into when we make API resources available internally, with partners, and even with the public at large. Change is inevitable, and I feel like we need to be more honest about this change, and be less rigid in how we are doing things--I keep coming back to hypermedia as a meaningful way we can make this happen.


Thinking More About API Driven Conversational Interfaces

I am spending a lot of time thinking about conversational interfaces, and how APIs are driving the voice and bot layers of the space. While I am probably not as excited about Siri, Alexa and the waves of Slack bots being developed as everyone else, I am interested in the potential when it comes to some of the technology and business approaches behind them.

When it comes to these "conversational interfaces", I think voice can be interesting, but not always practical for actually interacting with everyday systems--I just can't be talking to devices to get what I need done each day, but maybe that is just me. I'm also not very excited about the busy, chatty bots in my Slack channels, as I'm having trouble even dealing with all the humans in there, but then again maybe this is just me. 

I am interested in the interaction between these conversational interfaces and the growing number of API resources I track, and how voice and bot applications that are done thoughtfully might be able to do some interesting things and enable some healthy interactions. I am also interested in how webhooks, iPaaS, and push approaches, like we are seeing out of Zapier, can influence the conversation around conversational interfaces.

Conceptually I can be optimistic about voice enablement, but I work in the living room across from my girlfriend, and I'm just not going to be talking a lot to Siri, Alexa, or anyone else...sorry. Even if I move back to our home office, I'm really not going to be having a conversation with Siri or Alexa to get my work done, but then again maybe it's just me. I'm also really aware of the damaging effects of having too many messaging, chat, and push notification channels open, so the bot thing just doesn't really work for me, but then again maybe it's me.

I am more of a fan of asynchronous conversations than I am of the synchronous variety, which I guess could be more about saved conversations--crafted phrases or statements that run as events triggered by different signals, or even by me when I need them via my browser, like Push by Zapier does. I see these as conversations that enable a single API-enabled event, or a series of them, to occur. This feels more like orchestration, or scripted theater, which accomplishes more of what I'm looking to do than synchronous conversations would.
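
Here is a sketch of the kind of scripted, asynchronous "conversation" I have in mind--a single trigger that kicks off a chain of API-enabled events. The webhook URL is a placeholder in the style of a Zapier catch hook:

```python
# A sketch of a scripted conversation: one event fired at an iPaaS webhook,
# which then orchestrates whatever APIs are wired up behind it.
import requests

WEBHOOK_URL = "https://hooks.zapier.com/hooks/catch/12345/abcde/"  # placeholder

def run_conversation(trigger: str) -> None:
    """Fire a pre-scripted set of statements as a single event."""
    payload = {
        "conversation": "morning-report",
        "trigger": trigger,
        "statements": [
            "pull yesterday's analytics",
            "summarize new Github issues",
            "post a digest to my channel",
        ],
    }
    requests.post(WEBHOOK_URL, json=payload, timeout=10)

# Triggered by me via the browser, or on a schedule -- no talking required.
run_conversation("manual")
```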

Anyways, just some exercising of my brain when it comes to conversational interfaces. I know that I'm not the model user that voice and bot enablement will be targeting with their services, but I can't be all that far out in left field (maybe I am). Do we really want to have conversations with our devices, or the imaginary elves that live on the Internet in our Slacks and Facebook chats? Maybe for some things? What I'd really like to see is a number of different theaters where I can script and orchestrate one-time and recurring conversations with the systems and services I depend on daily, with the occasional synchronous engagement with myself or other humans, when it is required.


People Do Not Know What Your API Does If You Do Not Showcase It

With a lot of my storytelling, I feel like Captain Obvious, but I also recognize the importance of simple, and sometimes repetitive, storytelling to help reach my audience of time and resource-strapped API providers. Sometimes API providers are just too busy to remember the small things, and this is where I come in, helping you remember some of the most obvious aspects of providing APIs that can be essential to success.

This morning's reminder is that nobody will know the cool things your API does if you do not showcase what is being done with it. This is why I added my API showcase research, to highlight the approach of the successful API providers, and give me a regular reminder to write about the topic. I understand you are busy, but so are your API consumers, and there is a good chance they need a little help understanding what can be done with your super valuable API resource(s).

Showcasing your successful API integrations is where the rubber meets the road with API operations. Tell us all about the cool things people are doing with your APIs. It doesn't have to be a full-blown case study (although that would be nice too)--it can be 250 words, plus some bullets, and another 250 words, helping us understand the possibilities in 2 minutes or less. From my experience in the space, people eat up these simple, easy-to-read examples of APIs solving problems and providing solutions.

Ideally, you are showcasing the cool things being done with your API on a regular basis via your blog, but if for some reason you are too busy, make sure to at least share your thoughts with me via email. You never know--if it is worthy, I might take the time to share it here on API Evangelist.


Discovering New APIs Through Security Alerts

I tune into a number of different channels looking for signs of individuals, companies, organizations, institutions, and government agencies doing APIs. I find APIs using Google Alerts, by monitoring Twitter and Github, through press releases, and via patent filings. Another way I am learning to discover APIs is via alerts and notifications about security events.

An example of this can be found via the Industrial Control Systems Cyber Emergency Response Team out of the U.S. Department of Homeland Security (@icscert), with the recently issued advisory ICSA-16-287-01 (OSIsoft PI Web API 2015 R2 Service Account Permissions Vulnerability) on the ICS-CERT website, leading me to the OSIsoft website. They aren't very forthcoming with their API operations, but this is something I am used to, and in my experience, companies who aren't very public with their operations tend to also cultivate an environment where security issues go unnoticed.

I am looking to aggregate API-related security events and vulnerabilities like the feed coming out of Homeland Security. This information needs to be shared more often, opening up further discussion around API security issues, and possibly even providing an API for sharing real-time updates and news. I wish more companies, organizations, institutions, and government agencies would be more public with their API operations, and more honest about the dangers of providing access to data, content, and algorithms via HTTP, but until this is the norm, I'll continue using API-related security alerts and notifications to find new APIs operating online.
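
Here is a sketch of how this kind of aggregation might start, scanning a security advisory feed for entries that mention APIs--the feed URL is a placeholder for whichever advisory feed you are tracking:

```python
# A sketch of scanning a security advisory feed for API-related entries.
import feedparser  # pip install feedparser

FEED_URL = "https://ics-cert.us-cert.gov/advisories/advisories.xml"  # placeholder

feed = feedparser.parse(FEED_URL)
api_related = [
    {"title": entry.title, "link": entry.link}
    for entry in feed.entries
    if "api" in entry.title.lower()
    or "api" in getattr(entry, "summary", "").lower()
]

for advisory in api_related:
    print(advisory["title"], "->", advisory["link"])
```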


Defining OAuth Scope Inline Within The API Documentation

I am working on a project using the YouTube API, and came across their inline OAuth 2.0 scopes, allowing you to explore what the API does as you are browsing the API docs. I am a huge fan of what interactive documentation like Swagger UI and Apiary brought to the table, but I'm an even bigger fan of the creative ways people are evolving the concept, making learning about APIs a hands-on, interactive experience wherever possible.

To kick off my education on the YouTube API, I started playing with the search endpoint for the YouTube Data API. As I was playing with it, I noticed they had an API explorer allowing me to call the search method and see the live data.

Once I clicked on the "Authorize requests using OAuth 2.0" slider, I got a popup that gave me options for selecting OAuth 2.0 scopes, which would be applied by the API explorer when I make API calls.

The inline OAuth is simple, intuitive, and exactly what I needed to define my API consumption, inline within the YouTube API documentation. I didn't have to write any code or jump through a bunch of classic OAuth hoops. It gives me what I need for OAuth, right in the documentation--simple OAuth is something you don't see very often.
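
For those curious what the explorer is doing under the hood, here is a sketch of the authorization request it makes on your behalf--the client id and redirect URI are placeholders from a hypothetical Google API Console project:

```python
# A sketch of the OAuth 2.0 authorization request behind the explorer's
# popup, with the selected scope applied. Client id and redirect URI are
# placeholders.
from urllib.parse import urlencode

CLIENT_ID = "your-client-id.apps.googleusercontent.com"  # placeholder
REDIRECT_URI = "http://localhost:8080/callback"          # placeholder

params = {
    "client_id": CLIENT_ID,
    "redirect_uri": REDIRECT_URI,
    "response_type": "token",
    # The scope selected in the popup -- read-only access to YouTube data.
    "scope": "https://www.googleapis.com/auth/youtube.readonly",
}
auth_url = "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)
print(auth_url)  # send the user here; the token comes back on the redirect
```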

I'm a supporter of more API documentation being an attractive static HTML layout like this, with little interactive modules embedded throughout the API docs. I'm also interested in seeing more web literacy being thrown in at this layer as well, pulling common web concepts and specification details, and providing popups, tooltips, and other inline API design learning opportunities.

I'm adding YouTube's approach to OAuth to my list of modular approaches to delivering interactive API documentation, for use in future storytelling.


Convert OpenAPI Spec to Slate / Shins Markdown API Docs

Someone turned me on to an OpenAPI Spec to Slate / Shins compatible markdown converter on Github this last week. I have been an advocate for making sure we are still using machine readable API definitions for our API documentation, even if we are deploying the more attractive Slate. I've been encouraging folks to develop an attractive option for API documentation driven by OpenAPI Spec for some time, so I am happy to add this converter to my API documentation research and toolbox.

The OpenAPI Spec to markdown converter also introduced me to a version of Slate that is ported to JavaScript / Node.js called Shins. I'm going to add Shins to my API documentation research, and "widdershins" the OpenAPI Spec to markdown converter to my API definition research. The auto-generation of attractive API documentation like Slate and Shins seems like a valid approach to getting things done, and worth including in my research.

I am increasingly publishing YAML editions of my OpenAPI Specs, which drive API documentation that operates on Jekyll, using Liquid. So I am all about having many different ways to skin the API documentation beast, allowing it to be easily deployed as part of any CI flow, and enabling the publishing of API docs for many different APIs, in many different developer portals, or embedded on any device as part of IoT deployments. I think a diverse range of approaches is optimal, as long as we do not lose our machine-readable core.
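
If you are wondering what keeping both editions in sync looks like, here is a minimal sketch that could run as part of a CI or Jekyll build--the file paths are placeholders for whatever your project uses:

```python
# A sketch of publishing a JSON copy alongside the YAML edition of an
# OpenAPI Spec. File paths are placeholders.
import json

import yaml  # PyYAML

with open("_data/api.yaml") as f:
    spec = yaml.safe_load(f)

# A tiny sanity check that this is actually an OpenAPI (Swagger) definition.
assert "swagger" in spec or "openapi" in spec, "not an OpenAPI Spec"

with open("api.json", "w") as f:
    json.dump(spec, f, indent=2)
```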


Transparency In Police Access To Social Platforms Using OAuth And APIs

I was learning about Geofeedia providing law enforcement access to social media data from Twitter, Facebook, and Instagram via their API(s) this week. Geofeedia was making money by selling surveillance services to law enforcement, built on top of these social APIs. From what I gather, Facebook and Instagram have cut off access, but Geofeedia could still have Twitter access through a reseller (Gnip?).

This isn't something that will just go away. If law enforcement wants access to users' data on Facebook, Twitter, and Instagram, they are going to get it. I am guessing that the rules regarding what law enforcement can or can't do aren't clear (I will have to learn more), and are something that is just left up to platforms to enforce via their terms of service. It is a problem that modern approaches to API authentication, management, and analytics are well designed to help make sense of--we just have to come up with a new layer defined specifically for law enforcement.

Law enforcement should be able to fire up any standard or customized solution they desire to search against social media data via APIs. However, they should be required to obtain an application key, and obtain OAuth tokens just like any other developer would. Rather than law enforcement being the customer of companies like Geofeedia, each agency should get its own app id and keys, providing an identifying application that represents a specific law enforcement agency. They can still buy the software from providers, they just need the unique identifier when it comes to API consumption.

Along with this access, we also need to begin to define an auditable or regulatory layer, where other government agencies or 3rd party auditors can get access to the access logs for all applications registered to law enforcement agencies--a kind of real-time FOIA access to the API management layer, allowing for a window into how law enforcement agencies are searching and putting social media data to use.
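
Purely as a hypothetical, here is a sketch of the shape an auditable access-log record might take at the API management layer--every field here is made up for illustration:

```python
# A hypothetical access-log record for the auditable layer described above,
# exposed via its own API to auditors, watchdogs, and journalists.
from dataclasses import asdict, dataclass
from datetime import datetime, timezone
import json

@dataclass
class LawEnforcementAccessLog:
    app_id: str     # unique application id registered to a specific agency
    agency: str     # the identified law enforcement agency
    platform: str   # e.g. "twitter", "facebook", "instagram"
    endpoint: str   # which API resource was queried
    query: str      # the search terms or filters applied
    timestamp: str  # when the request happened

record = LawEnforcementAccessLog(
    app_id="le-app-00123",
    agency="Example City Police Department",
    platform="twitter",
    endpoint="/search/tweets",
    query="geocode=45.52,-122.68,5mi",
    timestamp=datetime.now(timezone.utc).isoformat(),
)

print(json.dumps(asdict(record), indent=2))
```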

Of course, there will be special considerations regarding the OAuth interactions at play. At what point are end-users notified that their data is being accessed by law enforcement, and at what point do other government agencies and 3rd party auditors have API access to the log files for the law enforcement applications that are consuming Facebook, Instagram, Twitter, and other APIs?

There is a lot of work ahead to define how law enforcement can put social data to work via APIs, but the tools are there. Modern API infrastructure excels at this when done right. We can give law enforcement access to the data they need, while also enabling transparency in the process, making the platform operators like Twitter and Facebook feel better, while also respecting the privacy of US citizens. We need to just hammer out the OAuth scopes for these relationships similar to how we do it for energy, healthcare, and other vital data being served up via APIs.

This is a problem that will keep popping up. We can't just rely on groups like the ACLU to find the companies who are acting as brokers, waiting for the platforms to play whack-a-mole when these companies are singled out. We need a formal definition to guide how law enforcement obtains access to increasingly vital social media and network data via APIs. We need some transparency and consistency in the process, something that APIs do well when executed properly.

While it makes me cringe to think about, I predict that many companies will be required to have API access in the future--specifically for this purpose. My hope is that there is also some transparency and consistency baked into this approach, leveraging what web APIs do best: allowing law enforcement to get what they legally need, while allowing other government agencies, watchdog groups, and journalists to get self-service, predefined access to understanding what law enforcement is up to when it comes to surveillance using online services.

I will spend more time on this subject, mapping out how it might work across the top platforms like Twitter, Facebook, and Instagram. I'm hoping that we can make some movement in this area before too many other episodes occur.


The Monitoring Layer Of The DevOps Aggregation API Platform

While spending some time going through my API monitoring research, I found myself creating an OpenAPI Spec and APIs.json index for the DataDog API, and had the realization that this is the beginning of what I was looking for when I talked about a DevOps aggregation API platform. DataDog is just the monitoring layer of this vision I have, but it has many of the other elements I'm looking for.

DataDog has all the monitoring elements present in their API platform, and they have all the platform integrations I'm envisioning in a DevOps aggregate API. We just need the same thing for design, deployment, virtualization, serverless, DNS, SDK, documentation, and the other critical stops along a modern API life cycle.

I'll keep profiling the APIs of the service providers in my life cycle research until I get more of this DevOps aggregate API definition mapped out. Hopefully, I will stumble across other providers like DataDog who are doing such an interesting job with the choreography and orchestration that will be needed to work across so many platforms. I appreciate API aggregation service providers who 1) have an API, and 2) share so much of the definition behind their work.

The next thing I will work on is profiling the metrics that DataDog has defined across the platforms they integrate with. Take a look at the metrics they have defined for each integration--there are some valuable patterns available in their work. I'd love to see a common set of API monitoring metrics emerge across providers, something that, if we standardize and share in a machine-readable way, others will emulate--making interoperability much smoother when it comes to monitoring.
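
To give a sense of what profiling this layer looks like, here is a sketch of listing the active metrics a DataDog account is reporting, using their v1 API as I understand it from their docs--the keys are placeholders:

```python
# A sketch of listing active metrics via the DataDog API. Keys are placeholders.
import time

import requests

headers = {
    "DD-API-KEY": "YOUR-API-KEY",          # placeholder
    "DD-APPLICATION-KEY": "YOUR-APP-KEY",  # placeholder
}

# Metrics reported over the last 24 hours.
resp = requests.get(
    "https://api.datadoghq.com/api/v1/metrics",
    headers=headers,
    params={"from": int(time.time()) - 86400},
)

for metric in resp.json().get("metrics", []):
    print(metric)
```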

I just wanted to keep beating my drum about the fact that APIs aren't just about building applications, they are also critical to the API life cycle, and making sure there are stable, scalable APIs to build applications on top of in the first place.


Taking A Fresh Look At The Twitter API

I am working on profiling the Twitter API again, and realized their stack of APIs has evolved significantly beyond what we tend to think of as the Twitter API, and is worth taking another look at. It is easy to think of the Twitter API as being about tweeting, friends, following people, and #hashtags, but they have an interesting mix that I think tells its own story about Twitter's journey.

Here is the current Twitter API stack:

  • Public REST API - The public REST APIs provide programmatic access to read and write Twitter data -- what we think of when we talk about the Twitter API.
  • Media API - The API for managing the photos, videos, or animated GIFs that are used by other Twitter API endpoints when tweeting, direct messaging, and more.
  • Collections API - The API for managing collections of tweets to tell specific stories, providing a single URL that represents each Twitter collection.
  • The TON (Twitter Object Nest) API - Allows implementers to upload media and various assets to Twitter, supporting both resumable and single file uploads.
  • Curator API - Provides broadcasters with their curator-created streams for on-air graphics systems, or other digital displays.
  • Streaming APIs - Deliver new responses to REST API queries over a long-lived HTTP connection, providing a regular stream of tweets from the platform.
  • Ads API - The Ads API gives partners a way to integrate Twitter advertising management into their product. Selected partners have the ability to create their own tools to manage Twitter Ad campaigns while easily integrating into existing, cross-channel advertising management solutions.
  • Gnip - Gnip is Twitter's enterprise API platform, delivering real-time and historical Twitter firehose data for large-scale applications.

It is interesting to think about the long API evolution that got Twitter here. I often hear people reference Twitter as the most extreme example of a public API out there. Granted, it is definitely the original example and has a very public element to it, but it also has several APIs that require partner status or special permissions to access, with the documentation available publicly--pushing back on the Twitter stereotype often used when we discuss APIs.

While most API providers will never reach Twitter scale, I think there are lessons in growth present here: that you don't always have to be 100% public, and that you'll probably need streaming and higher volume solutions, including sensible handling of heavy media objects like images and video, as well as a way to make money--do not forget to make money. It makes me sad that monetization on the Twitter platform is all about advertising, a huge missed opportunity for them in my opinion, but the advertising API is still worth documenting alongside the others.

Ok, that concludes my fresh look at the Twitter API stack. I'm going through each of them and documenting all available endpoints while profiling their current approach to the business of API operations. I figured that I better document everything before they get purchased by someone for Christmas. I haven't heard back on my offer yet, so I'm guessing I was outbid. ;-)


Slack Shares Their View On Bot Advertising

I was reading the hard questions on bot ethics from Slack, and their thoughts on bot advertising grabbed my attention. Trying to understand how bots will be monetized has been something I'm learning about, so I found Slack's post rather timely, and relevant to this fast-growing layer of the API world.

Here was Slack's view on advertising with bots by developers:

A bot should not serve ads unless it has a strong, expressed purpose that benefits the user in doing so, and even then only on B2C platforms. I would hate to see bots becoming the new tracking pixel. Bots should not be prompting users to click on things and buy things unless explicitly asked to do so.

They continue by adding that "ads in apps are against the Slack API terms of service, and that makes me rather proud". I'm hoping that Slack's business model is solid enough that they'll never need to consider bot advertising. I think it is an interesting constraint upon the community, and one where I'm curious how developers will work around it when it comes to making money with bots. Could we be looking at a post-advertising world when it comes to generating revenue?

From what I can tell, bot monetization will either be about mining data from users, paying for premium features, or a little of both. I'm still a few weeks out from having more examples of how bots are generating revenue, but I wanted to make sure I recorded Slack's stance on the subject, for reference in future work.


Including The Twitter Object Nest API As A File Upload API Example

One request I get from folks on a regular basis is for examples of file upload APIs. Each time I get one of these requests, I regret that I do not have more file upload and storage APIs profiled, allowing me to share a list of examples. So file upload APIs are high on my list to keep an eye out for as I'm doing my regular monitoring and mapping of the API universe.

An API I wanted to add to this list is the TON (Twitter Object Nest) API, which "allows implementers to upload media and various assets to Twitter". The TON API is an interesting model for me because it supports resumable and non-resumable uploads--with all files over 64MB required to be resumable. I wanted to profile the API in a story, and add some of the key aspects to my research on file upload APIs, so that I can reference it in future conversations.

Some of the core features of how the TON API operates are:

  • The Content-Type of requests cannot be application/x-www-form-urlencoded.
  • The Content-Type of requests must be a valid media type as defined by IANA.
  • Chunks should be in integer multiples of X-TON-Min-Chunk-Size (except the last).
  • The Location header after upload needs to be saved to be used in other Twitter API calls.

Here is the basic makeup of the initial request to kick off a resumable upload:

  • Authorization: See 3-legged authorization
  • Content-Length: Set to 0
  • Content-Type: The Content-Type of the asset to be uploaded.
  • X-TON-Content-Type: Identical to Content-Type
  • X-TON-Content-Length: Total number of bytes of the asset to be uploaded

The initialization response contains a Location header which can then be used in other calls to the Twitter API. After you make the resumable upload initialization call, you can make each of the follow-up chunk uploads for the file--here is an example resumable video upload request:

  • PUT /1.1/ton/bucket/{bucket}/SzFxGfAg_Zj.mp4?resumable=true&resumeId=28401873 HTTP/1.1
  • Authorization: // oAuth1.0a (3-legged) details here
  • Content-Type: video/mp4
  • Content-Length: {number of bytes transferred in this request}
  • Content-Range: bytes {starting position, inclusive, 0-indexed}-{end position, inclusive}/{total content length}

Anything under 64MB in size can just be done in a single chunk. Next, I'm going to create an OpenAPI Spec for the Twitter TON API, and hack together a simple server-side edition of it in PHP, just so I can play with a complete example in a sandbox environment. I will play with the Twitter TON API as well, and get familiar with how it works in relationship to the rest of the Twitter API.
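
Pieced together from the headers above, here is a rough client sketch of the resumable flow--the host, bucket, and OAuth handling are placeholders, so consult the TON API docs for the authoritative details:

```python
# A rough sketch of the TON API resumable upload flow. Host, bucket, and
# OAuth signing are placeholders -- verify against the official docs.
import requests

TON_HOST = "https://ton.twitter.com"    # assumption
BUCKET = "my-bucket"                    # placeholder bucket name
AUTH = {"Authorization": "OAuth ..."}   # 3-legged OAuth 1.0a details go here

def resumable_upload(path: str, content_type: str) -> str:
    data = open(path, "rb").read()

    # Step 1: initialization -- zero-length body, declaring size and type.
    init = requests.post(
        f"{TON_HOST}/1.1/ton/bucket/{BUCKET}?resumable=true",
        headers={
            **AUTH,
            "Content-Length": "0",
            "Content-Type": content_type,
            "X-TON-Content-Type": content_type,
            "X-TON-Content-Length": str(len(data)),
        },
    )
    location = init.headers["Location"]  # save for use in other API calls
    chunk = int(init.headers["X-TON-Min-Chunk-Size"])  # assumption: returned on init

    # Step 2: PUT each chunk, sized in integer multiples of the minimum
    # chunk size (except the last), with a Content-Range for its position.
    for start in range(0, len(data), chunk):
        piece = data[start:start + chunk]
        end = start + len(piece) - 1
        requests.put(
            f"{TON_HOST}{location}",
            headers={
                **AUTH,
                "Content-Type": content_type,
                "Content-Length": str(len(piece)),
                "Content-Range": f"bytes {start}-{end}/{len(data)}",
            },
            data=piece,
        )
    return location

# location = resumable_upload("video.mp4", "video/mp4")
```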

Once I profile the file upload APIs of a couple of other providers, I will add them as a single area of my API Stack research. I'm hoping to establish a common set of design patterns that I can point people to when designing their own file upload APIs, providing a single repository of API definition patterns that anyone can fork and put to use.


The Open Skills API From Dept of Labor & University of Chicago

We like to talk about the API economy in this space. It is kind of the grand dream of the API obsessed, helping us articulate how big of a deal we think APIs are going to be. We know APIs are going to be big, but in reality, the impact most APIs make doesn't size up. This is one of the reasons I'm such a big supporter of APIs in the public sector, as the potential for a positive impact tends to be greater in my opinion.

One of the APIs that has the potential to contribute at API economy scale comes out of a partnership between the Department of Labor and the University of Chicago--the DataAtWork Open Skills API, providing "a complete and standard data store for canonical and emerging skills, knowledge, abilities, tools, technologies, and how they relate to jobs", and opening up a pretty useful API from their collaborative work to "map the DNA or genome of the U.S. labor market."

The Open Skills API uses Swagger UI for the documentation, which always makes me happy because it means there is an OpenAPI Spec behind it, for use in Postman, APIMATIC, and other API solutions. It's a simple, open API that has some serious potential for use in web and mobile apps, as well as visualizations, analysis, and other types of applications--making it worthy of adding to my list of APIs that I think could actually make an impact at API economy scale.
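
If you want to poke at it yourself, here is a quick sketch based on my read of their Swagger UI docs--the base URL and response fields should be verified against the live spec:

```python
# A sketch of pulling a page of canonical skills from the Open Skills API.
# Base URL and field names reflect my read of their docs -- verify them.
import requests

BASE = "http://api.dataatwork.org/v1"  # as published by the DataAtWork project

payload = requests.get(f"{BASE}/skills").json()
# The docs suggest a list of skill objects; adjust if the shape differs.
for skill in payload[:10]:
    print(skill.get("normalized_skill_name"))
```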


Saving and Versioning API Definitions In Editor Using Github Gists

My friend Jordan Walsh (@jordwalsh) just released a new take on the Swagger editor that inches closer to my vision of a dream API sketchbook and portfolio. His swagger-gist.io tool allows you to open and save your API definitions as Github Gists, allowing you to use the snippet sharing solution to manage your API definitions and their evolution.

While it isn't my entire vision for an API sketchbook and portfolio, swagger-gist.io's usage of Github Gists is a move in the right direction. This is just the first draft of his tool, and it looks like he plans on building in more of the API definition management features I am looking for--leveraging Github Gists as the book, in my sketchbook definition. #Creative

I like this model, especially when it comes to collaboration and storytelling around the API design process. I could see offering more sharing features for API definitions within the editor, enabling you to email, Slack, and share throughout an API's life cycle. I can also see more copy and paste opportunities, embedding API definitions using Github Gists in blog, knowledge base, and forum posts--grabbing the embed code from within the editor.
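
The underlying move is pretty simple, and here is a minimal sketch of it using the Github Gists API--the token is a placeholder personal access token:

```python
# A sketch of saving an OpenAPI definition as a Github Gist, which Github
# versions as revisions on each update. The token is a placeholder.
import requests

TOKEN = "YOUR-GITHUB-TOKEN"  # placeholder personal access token

with open("openapi.yaml") as f:
    spec = f.read()

resp = requests.post(
    "https://api.github.com/gists",
    headers={"Authorization": f"token {TOKEN}"},
    json={
        "description": "OpenAPI Spec definition",
        "public": True,
        "files": {"openapi.yaml": {"content": spec}},
    },
)
print(resp.json()["html_url"])  # shareable and embeddable anywhere
```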

I'm curious to see where Jordan takes it. I have lots of ideas, but will just keep an eye on his work. My only critique at the moment is to not couple the functionality too tightly with the word "Swagger", as that is a trademarked product. I recommend relying on "OpenAPI Spec" or even better, some other way of identifying Gists that contain an OpenAPI Spec definition. ;-)

Cool stuff Jordan, keep up the good work.


Preserving The Sunlight On Github

I'm following along as the Sunlight Foundation winds down their operations, gathering any lessons along the way that open data and transparency folks like us can learn from as we do our work. I wrote earlier that we should be learning from the Sunlight Foundation situation, making sure we bake transparency into our projects, and I wanted to continue extracting wisdom we can reuse as they turn out the lights.

The Sunlight Foundation shared that they are working with the Internet Archive and Github to preserve their projects, and that they are "trying to ensure the open source community can understand and use our projects in the future" by:

  • adding documentation
  • standardizing licenses 
  • scrubbing sensitive info

This is the benefit of being transparent and open by default: you tend to do all of this in real-time. If you are in the business of opening up data, making it accessible with open APIs, you should be using Github, documenting, and telling the story as you go along. Then you do not have to do it all when you are walking away--everything is open by default.

This is why I work out in the open on Github each day. It allows people to take my tools, like the CSV Converter and the API Stack, and put them to work, even as I'm still evolving them. When I step away from a project, it can continue to live on with all the code, definitions, schema, data, licensing, and the story behind it in a nice forkable package--no extra work necessary.

Anyways, just a couple of nuggets to consider as we are working on open data and API projects across the space.