{"API Evangelist"}

Storing API Keys In The Private Master Github Repository For Use In Github Pages

My public websites have been running on Github Pages for almost two years now, and slowly the private management tools for my platform are moving there as well. Alongside my public websites, I’m adding administrative functions for each project. Most of the content is API driven already, so it makes sense to put some of the management tools side by side with the content or data that I’m publishing.

These management tools are simple JavaScript that use the Github API to manage the HTML and JSON files I have stored either publicly or privately within repositories. I use Github oAuth to work with the Github API, but to work with other APIs I need a multitude of other API keys, including the 3Scale generated API keys I use to access my own API infrastructure.

My solution is to store a simple api-keys.json file in the root of my private master repository, and then, again using Github oAuth and the Github API, access this file and read the contents of the JSON file into a temporary array I can use within my management tools. If you do not have access to the Github repository, you won’t be able to read the contents of api-keys.json, rendering the management tools useless.
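As a rough sketch of what this looks like in practice, the JavaScript below pulls api-keys.json out of a private repository using the Github contents API, and decodes the base64 payload into a usable object. The repository and account names here are placeholders, and the token handling is assumed to happen through the normal Github oAuth flow, not shown:

    // Minimal sketch: read api-keys.json from a private repo via the
    // Github contents API, using an oAuth token obtained elsewhere.
    const GITHUB_API = 'https://api.github.com';

    async function loadApiKeys(owner, repo, token) {
      // The contents API returns file metadata plus a base64-encoded body.
      const response = await fetch(
        `${GITHUB_API}/repos/${owner}/${repo}/contents/api-keys.json`,
        { headers: { Authorization: `token ${token}` } }
      );
      if (!response.ok) {
        // Without access to the private repo this call fails, and the
        // management tools simply have no keys to work with.
        throw new Error(`Could not read api-keys.json: ${response.status}`);
      }
      const file = await response.json();
      // Strip the newlines Github inserts into the base64 payload, then decode.
      const json = atob(file.content.replace(/\n/g, ''));
      return JSON.parse(json); // the temporary in-memory set of API keys
    }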

Eventually I will develop a centralized solution to help manage API keys across all my projects, allowing me to re-use keys for different projects, and easily update or remove outdated API keys. This approach to storing API keys in my private Github repository allows me to easily access keys in the client-side apps I run on Github Pages, as well as via server-side applications and APIs--something that I’m hoping will give me more flexibility in how I put multiple APIs to work across my infrastructure.



Using Containers To Bridge What Swagger Cannot Define On The Server-Side For My APIs

When I discuss what is possible when it comes to generating both server and client side code using machine readable API definitions like Swagger, I almost always get push-back from people making sure I understand there are limitations to what can be auto-generated.

Machine readable API definitions like Swagger provide a great set of rules for describing the surface area of an API, but they often miss many of the other elements necessary to define what server side code should actually be doing. The closest I’ve gotten to fully generating server side APIs is with very CRUD-based APIs that possess very simple data models--beyond that it is difficult to make ends meet.

Personally, I do not think my Swagger specs should contain everything needed to define server implementations; this would make for a very bloated, and unusable, Swagger specification. For me, Swagger is my central truth, one that I use to generate server side skeletons, client side samples and libraries, and to define testing, monitoring, and interactive documentation for developers.

Working to push this approach forward, I’m also now using Swagger as a central truth for my Docker containers, allowing me to use virtualization as a bridge that delivers everything my Swagger definitions cannot define for each micro-service or API deployed within a specific container. This approach leaves Swagger as purely a definition of the micro-service surface area, and leaves my Docker image to deliver on the back-end vision of that surface area.
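To give a sense of the pattern, here is a hypothetical sketch of how one of these images might be assembled, baking the Swagger definition in right alongside the back-end code that delivers it--the file names and base image are my own assumptions, not a prescription:

    # Hypothetical sketch: the image carries its Swagger definition as its
    # fingerprint, alongside whatever back-end code delivers that surface area.
    FROM node:latest
    WORKDIR /app
    # The spec travels with the image, defining the only surface area
    # this container is meant to serve.
    COPY swagger.json /app/swagger.json
    COPY server/ /app/
    RUN npm install
    CMD ["node", "server.js"]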

These Docker images use Swagger as the definition of their container’s surface area, but also as a fingerprint for the value they deliver. As an example, I have a notes API definition in Swagger, for which I have two separate Docker images that support this interface. Each Docker image knows it only serves this particular Swagger specification, using it as a kind of fingerprint or truth for its own existence, but ultimately each Docker image will be delivering its own back-end experience derived from this notes API spec.
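For reference, a shared notes API definition like this might look something like the minimal Swagger 2.0 sketch below--the paths and responses are hypothetical stand-ins, just enough to show the surface area both images would have to honor:

    {
      "swagger": "2.0",
      "info": { "title": "Notes API", "version": "1.0.0" },
      "basePath": "/notes",
      "paths": {
        "/": {
          "get": {
            "summary": "List all notes",
            "responses": { "200": { "description": "An array of notes" } }
          },
          "post": {
            "summary": "Add a new note",
            "responses": { "201": { "description": "The note that was created" } }
          }
        }
      }
    }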

I’m just getting going with this new approach to using Swagger as a central truth for each Docker container, but as I roll out each of my API templates this way, I will blog more about it here. Eventually I am hoping to standardize my thoughts on using machine readable API definition formats like Swagger to guide the API functionality I'm delivering in virtualized containers.



Using APIs.json To Organize My Swagger Defined APIs Running In Docker Containers

I’m continuing to evolve my use of Swagger as a kind of central truth in my API design, deployment, and management lifecycle. This time I’m using it as a fingerprint for defining how the APIs or micro-services that run in my Docker containers function. Along the way I’m also using APIs.json as a framework for organizing these Swagger driven containers into coherent API stacks or collections that work together to accomplish a specific objective.

For the project I’m currently working on, I’m deploying multiple Swagger defined APIs, each as a separate Docker container, providing some essential components I need to operate API Evangelist, like blog, links, images, and notes. Each of these components has its own Swagger definition, a corresponding Docker image, and a specific instance of that Docker image deployed within a container.

I’m using APIs.json to daisy chain all of these APIs or micro-services together. I have about 15 APIs deployed, each a standalone service with its own APIs.json and supporting Swagger definition, but the overall project has a centralized APIs.json which, using the include property, provides linkage to each individual micro-service's APIs.json--loosely coupling all of these container driven micro-services under a single umbrella.
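A stripped-down sketch of what this parent APIs.json might look like is below--the URLs and version number are placeholders, and each include entry would point at the APIs.json file an individual micro-service publishes for itself:

    {
      "name": "API Evangelist Stack",
      "description": "The micro-services that work together to operate API Evangelist.",
      "url": "http://example.com/apis.json",
      "specificationVersion": "0.14",
      "apis": [],
      "include": [
        { "name": "Blog", "url": "http://blog.example.com/apis.json" },
        { "name": "Links", "url": "http://links.example.com/apis.json" },
        { "name": "Images", "url": "http://images.example.com/apis.json" },
        { "name": "Notes", "url": "http://notes.example.com/apis.json" }
      ]
    }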

At this point, I’m just using APIs.json as a harness to define and bring together the technical aspects of my new API stack. As I pull together other business elements like on-boarding, documentation, and code samples, I will add these as properties to my parent APIs.json. My goal is to demonstrate the value APIs.json brings to the table when it comes to API and micro-service discovery, specifically when you deploy distributed API stacks or collections composed of Docker deployed micro-services.



Scrubbing Individuals And Company Names From Stories I Tell

I find it more valuable to scrub the names of the APIs from about 75% of my stories, which I feel helps them be received by the widest possible audience. If I say “API Provider X”, other providers just think I’m talking about that company, but when I make it generic, I find API providers think I’m talking about them.

There are many reasons I would scrub individual or company names from a story; most often it is because they’ve asked me not to reference them publicly, but are fine with me telling the story in an anonymous fashion. However, it occurred to me over time that there are other, more powerful storytelling influences potentially behind leaving out specific characters.

Early on I identified that there are a lot of API lessons to be learned across business sectors, but if you reference the specific company or business sector, many business leaders or developers will put their blinders on, feeling this doesn’t apply to them. Ideally companies would identify the potential of hearing stories from other ways of doing business, but unfortunately many lack the imagination to make the transition--luckily I have enough imagination to go around! ;-)

I thoroughly enjoy storytelling in the API space. Over the last four years I’ve developed a voice I enjoy speaking in, and have slowly built an entire toolbox of approaches, like scrubbing people or company names, to make my storytelling more impactful and potentially reach a wider audience. I feel strongly that I would never have been able to do this if API Evangelist was purely a marketing blog for a company, but because I’m lucky enough to focus exclusively on the entire API space, spanning multiple business sectors, and am left to my own devices when it comes to editorial choices--I’ve been able to grow my storytelling in ways I never imagined when I first started in 2010.



Providing An oAuth Signature Generator Inline In Documentation

I talked about Twitter's inclusion of rate limits inline with documentation the other day, which is something I added as a new building block that API providers can consider when crafting their own strategy. Another building block I found while spending time in the Twitter ecosystem was an oAuth signature generator inline within the documentation.

While browsing the Twitter documentation, right before you get to the example request, you get a little dropdown that lets you select one of your own applications and generate an oAuth signature without leaving the page.
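To give a sense of what such a generator computes behind the scenes, here is a rough sketch of building an oAuth 1.0a HMAC-SHA1 signature, the kind Twitter’s API expects. The function and parameter names are my own, and a production generator would need strict RFC 3986 percent-encoding, which plain encodeURIComponent only approximates:

    // Rough sketch of an oAuth 1.0a HMAC-SHA1 signature; all secrets and
    // tokens are placeholders supplied by the caller.
    const crypto = require('crypto');

    function oauthSignature(method, url, params, consumerSecret, tokenSecret) {
      // 1. Sort and percent-encode the request and oauth_* parameters.
      const paramString = Object.keys(params).sort()
        .map(key => `${encodeURIComponent(key)}=${encodeURIComponent(params[key])}`)
        .join('&');

      // 2. Build the signature base string: METHOD&URL&PARAMS, each encoded.
      const baseString = [
        method.toUpperCase(),
        encodeURIComponent(url),
        encodeURIComponent(paramString)
      ].join('&');

      // 3. The signing key is the consumer secret and token secret joined by "&".
      const signingKey = `${encodeURIComponent(consumerSecret)}&${encodeURIComponent(tokenSecret)}`;

      // 4. HMAC-SHA1 the base string and base64 encode the result.
      return crypto.createHmac('sha1', signingKey).update(baseString).digest('base64');
    }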

I am seeing oAuth signature generators emerge on a number of API platforms, but this is the first inline version I’m seeing. I’ve added it to my tentative list of oAuth and security building blocks I recommend, but will give it some time before I officially add it. I like to see more than one provider do something before I put it in there, but sometimes when it is just Twitter, that can be enough.