{"API Evangelist"}

The Community Has Spoken: API Evangelist Will Stay As It Is

I came out of the wilderness this week (literally, the Kalmiopsis Wilderness) to an inbox full of emails and a Twitter stream full of messages and DMs. I expected a little buzz from my exit, but not the extent it reached, let alone the direction it took.

I am incredibly humbled by the community's response, and honestly, I'm not well equipped to deal with this type of attention. I know there are some folks out there who think I am some sort of attention seeker, but that is the part of all of this that I really struggle with the most. I have to step aside from myself to be able to write. Otherwise, I'm left incapacitated.

I am incredibly thankful for the API community. I have had the bar set pretty high for this community throughout the last six years, and you managed to raise it beyond what I had already set in my mind. I kindasorta thought some of you were reading my rants, but it kind of terrifies me now how many of you actually were.

Sooooooo....while I received some very fine bids on API Evangelist this week, as well as some pretty creative solutions for keeping it alive and generating revenue, I'm going to accept the community's bid for the domain. It will remain the community's API Evangelist voice, as it has 87.34% of the time (I am a sellout, just not a very good one) over the last six years -- I will just leave it as is.

The blog and Twitter account will remain dormant, as I will have no time to do any work. If you have a post which you feel would be "worthy" of publishing to the blog over the summer, feel free to fork, and submit it as a pull request. When I come out of the woods, I'll consider accepting it for publishing. If not, you will have to wait until I get my head back into tech to read anything more.

All the rest of the bids I received for the site, I'll have to decline. Thank you for your kind offers -- I will try to get everyone a short email response. That's all I got. I have enough money to get through the summer now, to pay for gas, food, and the occasional campsite and motel. Oh, and some cool drone equipment! I have no idea what the summer holds, so I can't make any predictions for the fall -- we will see.

I can say that you should take Steve's advice in his blog post, and don't take anything for granted. Especially going to the bathroom in a toilet. Don't take that for granted. It's a wonderful thing!! ;-)


See The Full Blog Post

API Evangelist Is Up For Sale - Get Your Bids In By Friday

It has been six years since I started API Evangelist. A personal matter has come up which will require my attention for at least six months, probably upwards of a year -- which in the tech world can be centuries, when you are dealing in real time.

With this in mind, I will be putting the apievangelist.com domain and @apievangelist Twitter account up for auction. When I come back (if I do), I'm perfectly happy starting over. I need all the funds I can get to get me through the summer, and API Evangelist is the only asset I own.

I am very sorry for the impact on my business partners, but family and human beings come first, and I'm sure they will be fine. If you want the short and sweet on what I'm doing head over to kinlane.com blog, and if you want to follow the story of what I'm up to this summer you can head over to dronerecovery.org.

The apievangelist.com website does over 50K page views monthly with about 30K uniques. The Twitter account has 5K followers, and is growing every day--perfect for any API company. I'm taking bids for all of it to fund what I'm doing over the next few months. API Evangelist is worth a lot to me, so I'm starting the bidding at 10K, and accepting the highest bid by May 27th.

Email your bids to [email protected]. The entire site runs on Jekyll via GitHub, so I can just transfer the domain, and transfer the organization to you via GitHub. If the site does not sell, I will just leave it up, and contemplate its future at a later date.

Thank you for all your support--it was a fun ride.

See The Full Blog Post

The API Evangelist API Management Guide

API management is the oldest area of my research. The space has been defined since Mashery first opened its doors in 2006, and continues to be defined by Apigee's recent IPO and the entry of AWS into the sector with the AWS API Gateway. API management is often defined by the services offered by these companies, but also by the strategies of leading API providers in the space.

My API management research walks through the common building blocks of the space, the open source tooling, and the API service providers who are serving the space. I first started this guide in 2010 and 2011, and have worked to evolve it with each release, keeping up with the fast pace of change that seems to occur in the space.

This guide touches on, and often overlaps with, my other areas of research (as everything was born out of this), but should provide you with what you need as a checklist for evolving your existing API management strategy, or help you craft a brand new one for your new API. This guide has a heavy focus on a service-provider-led approach to API management, but with the recent growth in open source solutions, I'm working to evolve it more towards a DIY approach.

I fund my research through consulting, selling my guides, and the generous support of my sponsors. I appreciate your support in helping me continue with my work and hope you find it relevant in your API journey.

Purchase My API Management Guide

See The Full Blog Post

The API Evangelist API Design Guide

In the last couple of years, I've seen the concept of API design go from being something the API elite discuss, to something that involves business users, and something that has spawned a whole ecosystem of service and tooling providers. 

This API design research covers the concepts people often associate with API design, like REST, and Hypermedia, but also touches on other media types, API definitions, and many of the services, tooling and other solutions that have emerged to serve the space.

I fund my research through consulting, selling my guides, and the generous support of my sponsors. I appreciate your support in helping me continue with my work and hope you find it relevant in your API journey.  

Purchase My API Design Guide

See The Full Blog Post

Oracle vs Google: APIs Are Kind Of Hard To Understand

One of the things that stood out for me reading through the Oracle v Google trial coverage today was Sarah Jeong acknowledging how APIs are kind of hard to understand -- something that is casting a shadow over the case, and probably my top concern.

Also pointing out what I think is true of the Oracle legal team:

While this might be more about bringing things down to a lower level for "normals", I think it still represents how difficult it is for people to grok what APIs are:

Then continuing with more of the skeuomorph theater:

Then I guess we need to make some API calls? IDK

None of which I think is helping anyone involved:

I agree with Sarah. She's a super smart chick, and if all the theater is confusing her, I can only imagine what effect it will have on the jury. What is code, what is an API, what is a definition, and what actually runs, or is there to enable and facilitate interoperability, is blurry even when you are in the game--I cannot even imagine anymore what it is like for the normals.

The only thing that keeps me optimistic in all of this is the possibility for some API literacy lessons for the masses, amidst all the theater. I guess we have to see how it goes and hope there are more opportunities to help folks understand what APIs actually are.

See The Full Blog Post

Enabling A Patient's HIPAA Right To Access Their Personal Health Information (PHI) With APIs

I am reading through the API task force recommendations out of the Office of the National Coordinator for Health Information Technology (ONC), to help address privacy and security concerns around mandated API usage as part of the Common Clinical Data Set, Medicare, and Medicaid Electronic Health Records. The recommendations contain a wealth of valuable insights around healthcare APIs, but are also full of patterns that we should be applying across other sectors of our society where APIs are making an impact. To help me work through the task force's recommendations, I will be blogging through many of the different concepts at play.

In addition to highlighting the usage of "patient-directed APIs" that I wrote about earlier, and taking a healthy stance on privacy and security when it comes to healthcare APIs, I wanted to separate out the conversation around a patient's right to access their own personal health information, and how APIs are being used as the enabler. Here is the chapter from the task force's recommendations:

Many of the discussions within the task force centered around the notion that the patient directed app of our purview supports the patient’s HIPAA right to access his/her own PHI from a Covered Entity, as required under HIPAA § 164.502.

This could be characterized in several ways:

  1. the individual requesting access to their information
  2. an entity designated by the individual to receive a copy of PHI (as part of the individual exercising his/her right to access PHI)
  3. the medium on which the individual requests that PHI be provided or transmitted as part of the individual exercising his/her right to obtain a copy of PHI

Alternatively, the patient directed app may also be characterized as a third party formally authorized by the individual to receive PHI, or a tool for engaging the individual in treatment. Each of these scenarios creates challenges when attempting to determine oversight of an app’s behavior; there is not one clear solution.

I am going to educate myself about HIPAA § 164.502, and get to work understanding what other precedents exist--maybe with FERPA or COPPA, or other similarly regulated industries. I am just looking to understand where the lines are drawn when it comes to people having a "right to access" when it comes to their data, especially when APIs are playing a central role like they are with healthcare interoperability. 

I have read the healthcare API task force recommendations several times now, but I am only a couple pages in when it comes to cherry-picking the ideas I want to consider more deeply, and have indexed as part of my overall API industry research. So stay tuned for continued posts about how APIs are being used to drive patient-centered access to healthcare data.

See The Full Blog Post

Twitter Collection Of @SarahJeong, @Xor & @Swiftstories Oracle vs Google Coverage

The next round of the Oracle v Google Java API Copyright battle has kicked off again in San Francisco after being sent back to the lower court by the United States Supreme Court. This round is all about deciding if Google's usage of the Java API in their Android platform was allowed under fair use. If I was in San Francisco I would be there to hear the case first hand--I am lucky there are some really smart folks covering the case and live tweeting things.

I went through the Tweets from Sarah Jeong (@sarahjeong), contributing editor at @Motherboard, Parker Higgins (@xor), EFF activist, and Mike Swift (@Swiftstories), Senior Correspondent for MLex. They are all providing some smart, and pretty amusing commentary on what is going on, so I wanted to organize the relevant Tweets into a Twitter collection summary for inclusion in my research. 

I will add more tweets to it each day of the trial, and since it's a Twitter collection, it will update here on the blog post, or, if you'd rather, you can access it directly via Twitter. I have no idea how long this round of the case will play out, but I will definitely be tuning in, and ranting about it when I have time.

Let me know if there is anyone else covering the trial that I should be including in the Twitter collection. I want to try and capture as complete a record as I can, from multiple perspectives. While I am bummed that the precedent for copyright has been established for APIs, I'm hopeful that we can look at this round as an API literacy opportunity. ;-)

See The Full Blog Post

The Concept Of Patient-Directed APIs

I am reading through the API task force recommendations out of the Office of the National Coordinator for Health Information Technology (ONC), to help address privacy and security concerns around mandated API usage as part of the Common Clinical Data Set, Medicare, and Medicaid Electronic Health Records. The recommendations contain a wealth of valuable insights around healthcare APIs, but are also full of patterns that we should be applying across other sectors of our society where APIs are making an impact. To help me work through the task force's recommendations, I will be blogging through many of the different concepts at play.

One phrase that is used regularly across the task force recommendations, and that really caught my attention, was the concept of "patient-directed APIs". It is a powerful concept, when you think about APIs existing not just for helping integrate healthcare systems and innovate around the development and delivery of web and mobile applications, but doing it all because of, in service of, and at the direction of the patient. The document has grabbed my attention because this is the first time I've seen such an end-user focused API vision in the wild.

While these electronic healthcare record APIs will be used for system integration, delivering web and mobile apps, and benefiting healthcare platform operators and developers, the reason the APIs exist is to benefit the patient. This just isn't how things are done in Silicon Valley, where you always focus on benefits for the platform, its partners, and investors first, maybe the developers secondarily, but in most cases the end-user is just an afterthought. In startup culture, things are turned on their head, allowing for some serious imbalance in how we do things with APIs.

There is a wealth of topics for me to work through from the task force recommendations out of the ONC and HHS. I will keep blogging as I read through, and work through, the important recommendations it contains. Once I have my head wrapped around the document more, and am up to speed with what they are doing, I'm hoping I can contribute some more ideas that might help stimulate things as healthcare providers roll out their APIs.

See The Full Blog Post

The API Evangelist API Definition Guide

How we define our APIs has dramatically changed in recent years. Since Swagger came onto the scene around five years ago, there has been a rapid growth in the number of open formats, tooling, and services to help us define APIs. Companies are using API definitions like OpenAPI Spec, Postman, and API.json to communicate about their APIs at almost every stop along the API life cycle. This is my research to better understand all the moving parts in this fast growing sector.

This guide focuses on API definition formats like OpenAPI Spec and API Blueprint, as well as schema formats like JSON Schema, and MSON, but I also touch on media types, link relations and other common building blocks for defining our APIs so they speak a common language. While there is some open tooling included in this guide, I'm trying to focus on the specifications, and formats themselves, and help make sense of the amount of information available to us.
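As a point of reference, here is what one of the schema formats covered in the guide looks like in practice -- a minimal JSON Schema fragment describing a hypothetical API response object (the field names are illustrative, not from any particular API):

```python
# A minimal JSON Schema fragment describing a hypothetical "user"
# object returned by an API; all field names here are illustrative.
import json

user_schema = {
    "$schema": "http://json-schema.org/schema#",
    "type": "object",
    "properties": {
        "id": {"type": "integer"},
        "email": {"type": "string"},
    },
    "required": ["id"],
}

print(json.dumps(user_schema, indent=2))
```

Small fragments like this are what let formats such as JSON Schema and MSON give teams a shared, machine-readable vocabulary for the shape of their APIs.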

I fund my research through consulting, selling my guides, and the generous support of my sponsors. I appreciate your support in helping me continue with my work and hope you find it relevant in your API journey.

Purchase My API Definitions Guide

See The Full Blog Post

Cutting Through The Smoke & Mirrors Of IT Discussions Using API Definitions

I get brought into a lot of API discussions with IT departments from companies, institutions, and government agencies, which are often coordinated by business groups who are interested in better meeting their goals using APIs. This is often an immediately charged conversation, with IT coming to the table with a whole array of baggage.

In about 75% of the situations, IT and developer representatives are nice, or rather they are tight-lipped, relying on a myriad of smoke & mirrors to defend their dark arts. Let me stop for a moment, and put out there that I was an IT director from 1998 through 2010. I'm not saying IT are bad people, but there are a wide variety of ways we slow, obfuscate, and distort the conversation to be in our favor -- it takes one to know one. I wouldn't say that I was 100% honest in my approach to being an IT leader, but I tried my hardest to keep things as transparent as I possibly could.

Anyways, in a couple of the IT discussions I've had lately, there was an OpenAPI Spec available to define the resources that were on the table, and in a handful of other conversations there was not. Keep in mind that most of these scenarios are with a more traditional version of IT, not with startup technology groups (a whole different beast). As I step back, I am taking notice of the harmonizing effect that an API definition can have in keeping conversations focused, productive, and moving forward toward a common goal.

In the conversations without an OpenAPI Spec, back-end systems and legacy processes dominated the discussion, even though we were all on a conference call to discuss an external, partner, and public-facing API. In the discussions where an OpenAPI Spec was present, we focused on exactly which resources were needed (nothing more), and the details (params, responses, etc.) that were needed by all consumers--essentially providing us with a scaffolding for the discussion that kept things moving forward, and not bogged down in legacy sludge.

Backend-focused discussions always seemed to get slowed down by what was, and what is. The API definition focused conversations seemed to focus on what was needed, using a common language that everyone at the table understood. The presence of an OpenAPI Spec seemed to cut through the smoke & mirrors, which I think often alienates many of the business users. I found that having three versions of an OpenAPI Spec and APIs.json file present -- 1) a simple outline, 2) YAML, and 3) JSON -- also significantly improved discussions, keeping conversations focused while also making them as inclusive as possible.
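To give a sense of what the YAML version brings to the table, a minimal OpenAPI Spec (Swagger 2.0) fragment for a single hypothetical resource might look like this -- just enough to anchor a conversation about paths, params, and responses:

```yaml
swagger: "2.0"
info:
  title: Partner Orders API   # hypothetical resource under discussion
  version: "1.0.0"
paths:
  /orders:
    get:
      summary: List orders visible to the authenticated partner
      parameters:
        - name: status
          in: query
          type: string
          description: Filter by order status
      responses:
        "200":
          description: A list of orders
```

Even this much gives business users and IT the same artifact to point at, instead of arguing about back-end systems nobody else can see.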

I think people will always bring their baggage to these discussions, but I'm liking the harmonization effects API definitions like OpenAPI Spec, API Blueprint, Postman, and APIs.json are having in these conversations. I'm hopeful that these API definitions can continue providing bridges between business and IT groups, helping close a canyon that has existed for decades.

See The Full Blog Post

A Healthy Stance On Privacy And Security When It Comes To Healthcare APIs

I am reading through the API task force recommendations out of the Office of the National Coordinator for Health Information Technology (ONC), to help address privacy and security concerns around mandated API usage as part of the Common Clinical Data Set, Medicare, and Medicaid Electronic Health Records. The recommendations contain a wealth of valuable insights around healthcare APIs, but are also full of patterns that we should be applying across other sectors of our society where APIs are making an impact. To help me work through the task force's recommendations, I will be blogging through many of the different concepts at play.

Beyond the usage of "patient-directed APIs" that I wrote about earlier, I thought the pragmatic view on API privacy and security was worth noting. When it comes to making data, content, and other digital resources available online, I hear the full spectrum of concerns, and it leaves me optimistic to hear government agencies speak about security and privacy in such a balanced way.

Here is a section from the API task force recommendations:

Like any technology, APIs allow new capabilities and opportunities and, like any other technology, these opportunities come with some risks. There are fears that APIs may open new security vulnerabilities, with apps accessing patient records "for evil", and without receiving proper patient authorization. There are also fears that APIs could provide a possible "fire hose" of data as opposed to the "one sip at a time" access that a web site or email interface may provide.

In testimony, we heard almost universally that, when APIs are appropriately managed, the opportunities outweigh the risks. We heard from companies currently offering APIs that properly managed APIs provide better security properties than ad-hoc interfaces or proprietary integration technology.

While access to health data via APIs does require additional considerations and regulatory compliance needs, we believe existing standards, infrastructure, and identity proofing processes are adequate to support patient directed access via APIs today.

The document is full of recommendations on how to strike this balance. It is refreshing to hear such a transparent vision of what APIs can be. They weigh the risks alongside the benefits that APIs bring to the table, while also being fully aware that a "properly managed API" provides its own security. Another significant aspect of these recommendations for me is that they also touch on the role that APIs will play in the regulatory and compliance process.

I have to admit, healthcare APIs aren't one of the most exciting stacks among the over 50 areas I track across the API space, but I'm fully engaged with this because of the potential for a blueprint for privacy and security that can be applied to other types of APIs. When it comes to social, location, and other data, the bar has been set pretty low for privacy and security, but healthcare data is different. People tend to be more concerned with access, security, privacy, and all the things we should already be discussing when it comes to the rest of our digital existence--opening the door for some valuable digital literacy discussions.

Hopefully, I don't run you off with all my healthcare API stories, and you can find some gems in the healthcare API task force's recommendations, like I am. 

See The Full Blog Post

Providing A Set Of API Keys For Developers To Test Out Different API Outcomes

I wrote a post about Twilio using magic phone numbers that let their developers test out functionality without incurring any charges for deploying live phone numbers, making calls, and sending SMS. After publishing my post, Runscope CEO, John Sheehan (@johnsheehan) said he was behind the original spec for the Twilio magic numbers.

John continued to share some of the logic that went into his original spec:

I think this adds another dimension to the concept of having test numbers like this. Different numbers give you different outcomes, and different credit card numbers give you different results. What else could you do with test numbers and unique identifiers? Existing invoice and order numbers for different commerce situations come to mind. It seems like you could load just about anything into any alphanumeric identifier that you would want to.

Around the turn of the century (I don't think I've ever said that) I used to work on web applications for non-profit organizations, where I built campaign code tracking for large, lengthy mail, phone, fax, and other types of activities. We had 6 to 8 digit identifiers, in which every two digits had a unique meaning--allowing us to build a pretty robust set of scenarios that helped us track every step in a campaign's evolution.

Anyways, I think the concept is worthy of further exploration. I could see API providers crafting a pretty robust set of keys that could represent almost any object served up as part of API operations, with a very structured approach to how you tailor a multitude of outcomes involving these objects. For me, this type of stuff goes way beyond just having a sandbox for your API, and could provide a much more meaningful way to help developers polish their integrations.
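As a rough sketch of what I'm describing -- and this scheme is entirely made up for illustration, not Twilio's or anyone else's -- you could reserve the last two digits of a test identifier to select the outcome a consumer wants to simulate:

```python
# Hypothetical scheme: the last two digits of a test identifier
# select the outcome the API should simulate for that request.
OUTCOMES = {
    "00": "success",
    "01": "invalid_number",
    "02": "unroutable",
    "03": "rate_limited",
}

def simulated_outcome(identifier: str) -> str:
    """Return the outcome encoded in the identifier's last two digits."""
    return OUTCOMES.get(identifier[-2:], "success")

print(simulated_outcome("500555000001"))  # invalid_number
print(simulated_outcome("ORDER-2016-03"))  # rate_limited
```

The same lookup works whether the identifier is a phone number, a credit card, or an invoice number, which is what makes the pattern so reusable.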

See The Full Blog Post

APIs Will Wither On The Vine And Never Reach Their Full Potential

As I push out stories on the next round of the Oracle v Google API copyright case, considering how I will write about API deprecations and acquisitions I'm privy to, and document the continued march by the enterprise into the world of APIs -- I begin to see how APIs will continue to wither on the vine, and never reach their full potential in an increasingly toxic environment.

The Oracle v Google case is just the first of many battles to occur between tech giants when it comes to APIs and the application of intellectual property laws. With the number of patents out there that treat the API itself as the thing protected by IP, not just the thing behind the API, the copyright of the common API patterns we enjoy will be just the tip of the iceberg. This is how the giants will battle it out, leaving the rest of us fighting for scraps in the cracks.

Legal tussles like the one we see between Oracle and Google will become commonplace in the future as the enterprise fights for what they think is their "property" when it comes to integrations, collaboration, sharing, and automation. These battles will take place after this current wave of "interest in APIs" by the enterprise, where these large entities have sent their scouts out to map this uncharted (and mildly threatening) wilderness, understand what everyone is doing, and file patents based on what they see. Simplifying enterprise operations, actually innovating, or truly achieving integration is the farthest thing from their minds. While we are all busy doing the work, these IP cartographers are mapping out the exhaust from our labor.

Alongside all of this, another shift in the winds is occurring that will reduce some very useful, innovative, and valuable API patterns to just portfolio items of the enterprise. As VC funding cycles shift, the startups who are doing the most interesting things with APIs will have their work gobbled up by the enterprise, either through direct acquisitions or fire sales -- in any case, APIs are IP, nothing more. These winds are strong, and will not just blow away the valuable APIs, but also much of the shade tree cover that is needed for other APIs to flourish on the vine.

All of this change will sweep up this API experiment as we know it into the IP portfolios of the enterprise giants, so they can use it as leverage in the court battles and venture negotiations of the future. The enablement that APIs bring to the table is just too small a thing currently for the enterprise to even see. Startups are often too greedy, or too beholden to their investors, to understand the opportunity they are passing on when they see APIs as intellectual property. API enablement is not IP; the thing it enables is IP, and by freezing up the enabling factor, you will miss out on your IP ever reaching its full potential, and will be left with a vineyard of perpetually green grapes.

I know, I know. All my enterprise and startup friends will tell me how naive I am, and that this is how business is done. Again, I will say: you are so blinded by your greed, and your belief in this broken IP system, that you are willing to kill off this very interesting experiment--where we all had a seat at the table. At this point, I am left without any hope, and the concept of APIs as we know it will wither on the vine, never reaching its full potential -- this API economy thing we all saw in our mind's eye will not happen.

Don't get me wrong, I'm gonna keep on fighting, and pushing for API usage in healthcare, education, government, and other important areas. I will keep building services and tools that embrace APIs at their core, but you will rarely hear me speak of the bright, API-enabled future anymore. The experiment is over, the believers in a broken IP system are winning -- they just have too much money and too many resources to play the long game, and the wider API space, and the rest of us, will lose out.

See The Full Blog Post

Quality of Service API Endpoint For Your API Platform

I'm spending a lot of time in the Twilio API ecosystem this week, so you will hear multiple stories about what they are up to. This one highlights their Call Feedback API, and the growing number of what I'd consider infrastructure APIs from leading API platforms. The more API platforms mature, the more I see APIs deployed to help API consumers manage their integrations, alongside the core API resources being made available.

Twilio's feedback API focuses on a single API resource, a call, but the pattern could just as easily be applied as a quality of service feedback endpoint for anything you are serving up, like video, bot responses, recommendations, and beyond. I like the idea of having one endpoint for serving up a resource, and another endpoint for reporting the quality of service around that resource.
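A minimal sketch of the pattern, with in-memory stand-ins for the two endpoints -- the routes, payload shape, and 1-5 score scale here are my assumptions for illustration, not Twilio's actual interface:

```python
# One function stands in for GET /calls/{id} (the resource itself),
# the other for POST /calls/{id}/feedback (the quality of service report).
calls = {"CA123": {"to": "+15551234567", "status": "completed"}}
feedback = {}

def get_call(call_id):
    """Serve the core resource."""
    return calls[call_id]

def post_feedback(call_id, quality_score, issues=None):
    """Record a 1-5 quality-of-service score against a served resource."""
    if call_id not in calls:
        raise KeyError(call_id)
    if not 1 <= quality_score <= 5:
        raise ValueError("quality_score must be between 1 and 5")
    feedback[call_id] = {"quality_score": quality_score, "issues": issues or []}
    return feedback[call_id]

post_feedback("CA123", 2, issues=["dropped-call"])
print(feedback["CA123"])
```

Keeping the feedback endpoint separate from the resource endpoint means quality reporting can evolve (new issue types, new scales) without touching the core resource at all.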

Now that I have filed a report on Twilio's approach to using APIs as part of their platform operations, I will be keeping a closer eye out for other platforms doing similar things. I will consider adding it to my stack of existing API infrastructure APIs, alongside access to account, application, billing, and analytics via an API.

As the growth of API integration continues, I will keep looking to the API pioneers for examples of how we can provide more infrastructure-focused API resources that allow API consumers to manage, orchestrate, and automate their API integrations. As developers use a larger number of APIs to drive any single application or system integration, the need to automate accounts, apps, pricing, billing, analytics, and feedback loops is only going to grow.

See The Full Blog Post

An Acceptable Business Model Page For Your API Platform

One thing I look at closely when I review API platforms is how they approach the monetization of their API resources, and the resulting plans, pricing, and access tiers. How platforms think about, and present, this aspect of their operations provides a wealth of insights into what a company's motivations are behind their API operations.

I found the approach by 3D printing API platform i.materialise interesting: it goes beyond just charging for API access, and focuses on helping API consumers align their business model with i.materialise's. i.materialise presents API consumers with two possible routes, the first being referral-based, and the second a deeper, white label relationship.


The i.materialise API provides consumers with access to a full stack of 3D printing APIs, allowing you to upload your model and determine what it will cost to print, all the way through order fulfillment, delivery, and invoicing. The i.materialise approach to API monetization is different from other APIs I talk about because it's not about paying for API access. In this situation, it's about providing API access to the manufacturing life cycle -- one that is 3D printer driven.

While I think utility-based, pay-as-you-go API pricing represents the majority of how we approach API consumption at the moment, I think having your business model aligned with, and focused on, existing, tangible products and services holds a huge amount of untapped potential for APIs. I'm thinking there are endless numbers of small businesses out there who would benefit from the process of mapping out their product supply chain and service lifecycles, then opening things up as a simple web API--even if it is just for internal and trusted partner use.

See The Full Blog Post

Twilio Provides Test API Credentials With Magic Phone Numbers

I am always on the hunt for the little things that make API integration easier, and while working to certify my Twilio API definition, I noticed their test credentials. When you are playing with the Twilio API, it's pretty easy to add new keys and create new apps, but they also offer test credentials, along with what they call "Twilio's Magic Numbers", so that you can play without connecting to real phones or making actual charges on your account.

Many APIs provide you access to data or content, but Twilio enters the additional realm of much more complex, programmatic API resources. When getting up and going with these types of APIs, it really helps to have a sandbox to play in, and a ready-to-go set of test credentials provides this for users by default.

If you are offering up more than just data and content via your API, you may want to follow Twilio's lead and create a permanent, or even regularly changing, set of test accounts for consumers to use. It made my onboarding with Twilio significantly easier.

See The Full Blog Post

Serverless Approaches To Deploying Code Will Help Unwind Some Of The Technical Debt We Have

I am sure there is some equation we could come up with to describe the amount of ideology and/or dogma present alongside each bit and byte of code -- something that increases exponentially with each additional line of code or MB on disk. An example of this in action, in the wilds of the API space, is the difference between an SDK for an API and a single sample API call.

The single API sample is the minimum viable artifact that enables you to get value from an API -- allowing you to make a single API request and receive a single API response. Very little ideology or dogma is present (it's there, just in smaller quantities). Now, if an API provider hands you a Laravel SDK in PHP, a JAX-RS SDK in Java, or a React.js SDK, they are significantly cranking up the volume on the ideology and dogma involved with this code -- all contributing to the type of technical debt I'm willing to assume along the way, with each one of my API integrations and wider technological solutions.
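For contrast, the minimum viable artifact described above really can be this small -- one request, one response, standard library only, so no framework ideology comes along for the ride. The endpoint is hypothetical.

```python
# The minimum viable API artifact: a single request and a single
# response, using only the standard library. Endpoint is hypothetical.
import json
from urllib.request import Request

req = Request(
    "https://api.example.com/v1/status",  # hypothetical endpoint
    headers={"Accept": "application/json"},
)

# Parsing the kind of response such a call would return:
sample_body = '{"status": "ok", "uptime": 99.9}'
data = json.loads(sample_body)
print(data["status"])
```

Everything an SDK adds on top of this -- framework conventions, dependency trees, opinionated abstractions -- is where the ideology, and the debt, accumulates.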

I work hard to never hold up any single technology as an absolute solution, as there are none, but I can see the potential for the latest wave of "serverless" approaches to delivering code to help us unwind some of our technical debt. Like most other areas of technology, simply choosing to go "serverless" will not provide the relief you need, but if you are willing to do the hard work to decouple your existing code, and apply the philosophy consistently to future projects, the chances "serverless" will pay dividends in the reduction of your technical debt increase greatly.
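A minimal AWS Lambda-style handler illustrates the unit of deployment in question: a single, stateless function with no framework scaffolding around it, which is what makes individual pieces easy to throw away or rewrite. The event shape here is illustrative.

```python
# A minimal AWS Lambda-style handler: one stateless function, no
# framework scaffolding. The event shape is illustrative.
import json

def handler(event, context=None):
    """Return a greeting for the supplied name."""
    name = (event or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

print(handler({"name": "api"})["statusCode"])
```

Because each function stands alone, retiring or rewriting one carries none of the coupling cost that a monolithic codebase imposes -- which is exactly the decoupling work the paragraph above describes.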

See The Full Blog Post

I Am Seeing Significant Returns From Investing In Definitions Over Code When It Comes To My API Strategy

I am doing way more work lately on the creation of machine-readable OpenAPI Specs for APIs, indexed using machine-readable APIs.json files, than on the actual creation of APIs. About half of the API definitions I create are for existing APIs, with the rest describing APIs that should exist. With the existing APIs, in some cases, I am creating client-side code, but mostly just focusing on a well-crafted API definition. When it comes to the new API designs, I am focusing on a complete API definition, but also crafting both server-side and client-side code around the definition--when needed.
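For readers who haven't worked with one, here is a minimal example of the kind of machine-readable definition being described, in the OpenAPI Spec (Swagger 2.0) format of the era. The title and path are illustrative.

```yaml
# A minimal OpenAPI Spec (Swagger 2.0) definition. Names are illustrative.
swagger: "2.0"
info:
  title: Example API
  version: "1.0.0"
paths:
  /items:
    get:
      summary: List items
      produces:
        - application/json
      responses:
        "200":
          description: A list of items
```

A handful of YAML like this can drive documentation, client generation, and server scaffolding -- which is why the definition, not the generated code, is where the durable value sits.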

Even when I do craft server or client code for an API definition, the value of the code is significantly lower than the definition(s). In my mind the code is disposable. I want to be able to throw it away and start over with anything I am building, at any point. While I have made significant ideological investments in using Linux for my OS, AWS for my compute and storage hosting, MySQL for my database, and PHP + Slim for my API deployment framework, the code that operates within this framework has to be transient. Some code might end up having a long, long life, but if a piece of code isn't generating value and is in the way, I want to either get rid of it or rewrite it to better meet the requirements.

When it comes to delivering technology in my world, my investments are increasingly in the API definitions, the underlying data schemas, and the data and content that is actually stored and transmitted within. The PHP, MySQL, JavaScript, CSS, and HTML are valuable, but second-class citizens next to the JSON and YAML representations of my APIs, schemas, and the valuable data and content they store. For me personally, having made significant investments in a variety of tech solutions historically, this provides me with the flexibility I need to live in the current online climate. This approach has only been coming into focus over the last year, so I assume it will continue to evolve over the next couple of years, but I am already seeing significant returns from investing in definitions over code when it comes to my API strategy.

See The Full Blog Post

The API Evangelist API Deployment Guide

There are many different ways to actually deploy an API. If you are a larger, more established company, you probably have existing tools, services, and processes set forth by IT for deploying APIs. If you are a startup, your developers probably have their own frameworks, or possibly a cloud service they prefer using. This guide looks to map out the world of API deployment, and some of the common ways companies, organizations, institutions, and government agencies are deploying APIs in 2016.

This API deployment guide breaks out many of the common companies and tools, while also looking to identify some of the common building blocks employed by leading providers in support of API deployment across organizations of all shapes and sizes. This research is conducted by API Evangelist, and deployment is just one aspect of a modern API lifecycle -- one that includes over 50 stops, from API design to deprecation -- and just one of the areas you should be considering.

I work to update my guides regularly, and if you purchase a copy, you'll get any updates to it for the next year. Every reader of API Evangelist guides helps support this research and enables the work to continue--thank you!

See The Full Blog Post

APIs At Brigham Young University

I have been tracking how APIs are used in higher education for some time now, keeping an eye on almost 50 campus API related efforts. I have my University API guide that I regularly update, but I was eager to push forward my storytelling around what is going on, so I have been working on some papers telling the story behind some of the top higher ed API implementations that I have access to.

What better way to kick this off than by showcasing my favorite campus API group, over at Brigham Young University (BYU). The team, led by CIO Kelly Flanagan (@kelflanagan), has embraced APIs on campus in a way that keeps me excited about APIs and what is possible. In my opinion, BYU isn't just using APIs to shift the IT strategy at the school; they are using APIs to shift how their students and faculty see technology, and put it to work on their own terms.

I'm pretty familiar with what has been happening at BYU when it comes to APIs, as I see Kelly and his team regularly, but I jumped on a Google Hangout with them so that I could get the latest. The result is a short, five-page essay about the history of the API efforts on campus, some of the benefits to faculty and students, and a glimpse at the future of APIs at the school.

The paper is freely available for you to download as a PDF, but I will also be working on some stories drawing from the paper as part of my regular blogging here on API Evangelist. I want to keep bringing attention to what they are up to at BYU, but also generate attention at other schools about what is possible when it comes to APIs. As I note in the paper, I'm also working with Davidson College in North Carolina, and will keep working to spread the conversation to other schools.

You can find me speaking at the University of California, San Diego this June, so stay tuned for more information about how APIs are used in higher ed this summer--showcasing my conversations with BYU, Davidson, and UCSD as examples of how APIs are making an impact in higher education.

See The Full Blog Post