{"API Evangelist"}

I Think The Parse Twitter Page Sums It Up Pretty Well

Building a business is hard. Building a business that depends on other businesses is hard. We would like it if all of our vendors stuck around forever, but this is not the reality of doing business in today's climate. My stance is that nothing lasts forever, but startups and the enterprise could be more honest about the business of startups--the current lack of honesty is seriously beginning to erode the trust we all have in the platforms, tools, and APIs we depend on for our businesses.

I was working my way through some legacy tweets, and I came across Parse's Twitter home page, which I think sums up the promise being made by each wave of startups, and the end results of these promises--although I have to say that Parse actually handled it pretty well, compared with other startups that I have seen in action.

Building apps is not easy. However, we need solutions that will get us all the way there. I actually think that, in the end, Parse handled it pretty well--better than StackMob did when PayPal bought them. In the end, you could take the open source version of Parse and install it yourself, and they communicated the deprecation pretty well, giving folks quite a bit of time to take care of business. However, this is the way it should be from day one. There should be APIs and open source offerings available to ease operation and migration--as well as communication about what the future will hold.

If API focused startups don't start being more honest about their true business strategy, and the realities of the future from day one, fewer developers and users will buy what is being sold. Sure, there will always be new waves of young developers who will still believe, but the more experienced folks, who are often in charge of making purchasing decisions, will become increasingly skeptical about baking APIs into their operations--screwing this up for the rest of us trying to actually make a living, and not just get rich.


Being Able To See An API Request In The Browser Is Important

There are a number of things that make this whole web API thing actually work. One of them came up a couple of weeks ago while I was at Google discussing APIs, listening to Dan Ciruli (@danciruli): the importance of being able to see an API request in the browser. It is something I think we often overlook when it comes to understanding why web APIs have reached such a wide audience.

I remember when I first realized I could change the URL in my Delicious account and get an XML listing of my bookmarks--this is when the API light first went on in my head. The web wasn't just for humans; it could be structured for use in other websites. Seeing the XML in the browser presented my links in a machine-readable way, which triggered me to think about what else I could do with them, and which other systems I could put them to work in.
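
To make the idea concrete, here is a minimal sketch of a browser-viewable API--a hypothetical bookmarks endpoint, with all names invented for illustration, not the actual Delicious API:

```python
# A tiny API whose output you can see directly in the browser.
# Everything here is hypothetical--the pattern, not Delicious.
from flask import Flask, jsonify

app = Flask(__name__)

BOOKMARKS = [
    {"title": "API Evangelist", "url": "http://apievangelist.com"},
    {"title": "Web Concepts", "url": "http://webconcepts.info"},
]

@app.route("/bookmarks")
def bookmarks():
    # Opening http://localhost:5000/bookmarks in any browser shows the
    # raw, machine-readable JSON--no client tooling required.
    return jsonify({"bookmarks": BOOKMARKS})

if __name__ == "__main__":
    app.run()
```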

Being able to see the results of an API call in the browser helps stimulate the imagination when it comes to what is possible. This is similar to why API client tooling like Postman and Restlet Client are popular with developers--they help us see the possibilities. While not all APIs are simple enough to allow for viewing in the browser, when at all possible, we should keep things this easy, because you never know when it will make a mark, and help folks better understand what is going on under the hood, allowing them to put our APIs to work in ways we never expected.


API Definition: API Transformer

This is an article from the current edition of the API Evangelist industry guide to API definitions. The guide is designed to be a summary of the world of API definitions, providing the reader with a recent summary of the variety of specifications that are defining the technology behind almost every part of our digital world.

The OpenAPI Spec is currently the most used API definition format out there when it comes to the number of implementations and the amount of tooling, with API Blueprint, Postman Collections, and other formats trailing behind. It can make sense to support a single API definition when it comes to an individual platform's operations, but when it comes to interoperability with other systems, it is important to be multi-lingual and support several of the top machine-readable formats in use today.

In my monitoring of the API sector, one service provider has stood out when it comes to being a truly multi-lingual API definition service provider--the SDK generation provider, APIMATIC. The company made API definitions the heart of its operations, generating what they call development experience (DX) kits, from a central API definition uploaded by users--supporting OpenAPI Spec, API Blueprint, Postman Collections, and other top formats. The approach has allowed the company to quickly expand into new areas like documentation, testing, continuous integration, as well as opening up their API definition translation as a separate service called API Transformer.

API Transformer allows anyone to input an API Blueprint, Swagger, WADL, WSDL, Google Discovery, RAML 0.8 - 1.0, I/O Docs - Mashery, HAR 1.2, Postman Collection 1.0 - 2.0, Mashape, or APIMATIC Format API definition and then translate and export it as API Blueprint, Swagger 1.0 - 1.2, Swagger 2.0 JSON, Swagger 2.0 YAML, WADL - W3C 2009, RAML 0.8 - 1.0, Postman Collection 1.0 - 2.0, or their own APIMATIC Format. You can execute API definition translations through their web interface or seamlessly integrate with the API Transformer conversion API.
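
To give you an idea of what that integration might look like, here is a hedged sketch of converting an API Blueprint into a Swagger 2.0 definition over HTTP--the endpoint, parameter, and file names are my assumptions for illustration, so check the API Transformer documentation for the actual contract:

```python
# Converting an API definition programmatically; the endpoint and
# output parameter shown here are assumptions, not the documented
# API Transformer contract.
import requests

with open("my-api.apib", "rb") as f:
    response = requests.post(
        "https://apitransformer.com/api/transform",  # assumed endpoint
        params={"output": "swagger20"},              # assumed parameter
        data=f.read(),
    )

# Save the translated definition for use in other tooling.
with open("my-api-swagger.json", "wb") as out:
    out.write(response.content)
```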

There is no reason that API providers and API service providers shouldn’t be multi-lingual. It is fine to adopt a single API definition as part of your own API operations, but when it comes to working with external groups, there is no excuse for not being able to work with any of the top API definition formats. The translation of API definitions will increasingly be essential to doing business throughout the API life cycle, requiring each company to have an API definition translation engine baked into their continuous integration workflow, transforming how they do business and build software.


 If you have a product, service, or story you think should be in the API Evangelist industry guide to API definitions, you can email me, or you can submit a Github issue for my API definition research, and I will consider adding your suggestion to the roadmap.


I Predict A Future Flooded With Google Prediction Galleries

I was roaming through Google's Prediction API, and I think their prediction gallery gives us a look at a shift occurring right now in how we deliver APIs. I predict that machine learning galleries and marketplaces will become all the rage, operating independently like Algorithmia, or as part of a specific API, like the Google prediction gallery.

Ok, let me put it out there that I hate the use of the word prediction. If I were naming the service, I would have called it "execution", or more precisely a "machine training (MT) model execution API". I know I'll never get my way, but I have to put it out there how bullshit many of the terms we use in the space are--ok, back to the API blah blah blah, as my daughter would say.

A common element of API portals for the last decade has been an application gallery, showcasing the apps that are developed on top of an API. Now, when an API offers up machine learning (ML) model execution capabilities, either general or very specialized (ie, genomics, financial), there should also be a gallery of the models that have been delivered as part of platform operations--just like we have done with APIs and applications.

I see this stuff as the evolution of the algorithmic layer of API operations. Most APIs are delivering data or content, but there is a third layer that is about wrapping algorithms, and in the current machine learning and artificial intelligence craze, this approach to delivering algorithmic APIs will continue to be popular. It's not an ML revolution, it is simply an evolution in how we are delivering API resources, leveraging ML models as the core wrapper (Google was smart to open source TensorFlow).
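
For a sense of what this model execution layer looks like in practice, here is a hedged sketch of calling a trained model behind the Google Prediction API--the URL shape and payload follow the v1.6 documentation as I remember it, and the project and model names are made up:

```python
# Executing a hypothetical trained model via a prediction-style API.
# Treat the endpoint shape and payload as assumptions, not gospel.
import requests

PROJECT = "my-project"        # hypothetical project identifier
MODEL = "language-detector"   # hypothetical trained model

response = requests.post(
    f"https://www.googleapis.com/prediction/v1.6/projects/{PROJECT}"
    f"/trainedmodels/{MODEL}/predict",
    headers={"Authorization": "Bearer YOUR_OAUTH_TOKEN"},
    json={"input": {"csvInstance": ["Bonjour tout le monde"]}},
)

print(response.json())  # the predicted label and scores come back as JSON
```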

Developing ML models, and making them deployable via AWS, Google, Azure, as well as marketplaces like Algorithmia, will become a common approach for the API providers who are further along in their API journey, and have their resources well-defined, modular, and easily deployed in a retail or wholesale manner. While 90% of the ML implementations we will see in 2017 will be smoke and mirrors, 10% will deliver some interesting value that can be used across a growing number of industries--making machine training (MT), and machine training execution APIs like Google Prediction, something I will be paying attention to. ;-)


API Definition: U.S. Data Federation

This is an article from the current edition of the API Evangelist industry guide to API definitions. The guide is designed to be a summary of the world of API definitions, providing the reader with a recent summary of the variety of specifications that are defining the technology behind almost every part of our digital world.

The U.S. Data Federation is a federal government effort to facilitate data interoperability and harmonization across federal, state, and local government agencies by highlighting common data formats, API specifications, and metadata vocabularies. The project is focused on coordinating interoperability across government agencies by showcasing and supporting use cases that demonstrate unified and coherent data architectures across disparate agencies, institutions, and organizations.

The project is designed to shine a light on “emerging data standards and API initiatives across all levels of government, convey the level of maturity for each effort, and facilitate greater participation by government agencies”--definitely in alignment with the goal of this guide. There are currently seven projects profiled as part of the U.S. Data Federation, including Building & Land Development Specification, National Information Exchange Model, Open Referral, Open311, Project Open Data, Schema.org, and the Voting Information Project.

By providing a single location for agencies to find common schema documentation tools, schema validation tools, and automated data aggregation and normalization capabilities, the project is hoping to incentivize and stimulate reusability and interoperability across public data and API implementations. Government agencies of all shapes and sizes can use the common blueprints available in the U.S. Data Federation to reduce costs and speed up the implementation of projects, while also opening them up for augmentation and extension using their APIs and common schema.

It is unclear what resources the U.S. Data Federation will have available in the current administration, but it looks like the project is just getting going, and intends to add more specifications as they are identified. The model reflects an approach that should be federated and evangelized at all levels of government, but also provides a blueprint that could be applied in other sectors like healthcare, education, and beyond. Aggregating common data formats, API specifications, metadata vocabularies, and authentication scopes will prove to be critical to the success of the overall climate of almost any industry doing business on the web in 2017.


 If you have a product, service, or story you think should be in the API Evangelist industry guide to API definitions, you can email me, or you can submit a Github issue for my API definition research, and I will consider adding your suggestion to the roadmap.


A Looser, More Evolvable API Contract With Hypermedia

I wrote about how gRPC API implementations deliver a tighter API contract, but I wanted to explore more thoughts from that same conversation, about how hypermedia APIs can help deliver a more evolvable API contract. The conversation where these thoughts were born was focused on the differences between REST and gRPC, in which hypermedia and GraphQL also came up--leaving me thinking about how our API design and deployment decisions impact the API contract we are putting forth to our consumers.

In contrast to gRPC, going with a hypermedia design for your API allows your client relationships to change and evolve, providing an environment where things can flex and shift. Some APIs, especially internal APIs and those for trusted partners, might be better suited to gRPC performance, but when you need to manage volatility and change across a distributed client base, hypermedia might be a better approach. I'm not advocating one over the other, I am just trying to understand the different types of API contracts brought to the table with each approach, so I can better articulate them in my storytelling.

I'd say that hypermedia and gRPC approaches give API providers different types of control over the API clients that are consuming resources. gRPC enables dictating a high-performance, tighter coupling by generating clients, while hypermedia allows for shifts in what resources are available, what schema are being applied, and changes that might occur with each version, potentially without changes to the client. The API contract can evolve (within logical boundaries), without the autogeneration of clients, or interference at this level.
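
A minimal sketch helps show what this looser coupling looks like from the client side--the API below is hypothetical, using HAL-style _links purely for illustration:

```python
# A client that keys off link relations instead of hardcoded URLs, so
# the provider can move or restructure resources without breaking it.
# The base URL and response shape are assumptions for illustration.
import requests

BASE = "https://api.example.com"  # hypothetical hypermedia API

root = requests.get(BASE, headers={"Accept": "application/hal+json"}).json()

# Follow the "orders" relation to wherever the server says it lives today.
orders_url = root["_links"]["orders"]["href"]
orders = requests.get(orders_url).json()

for order in orders.get("_embedded", {}).get("orders", []):
    print(order["id"], order["status"])
```

Because the client discovers URLs at runtime, the provider can reorganize paths, add resources, or shift schema behind the scenes--the contract lives in the link relations, not the URL structure.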

As I learn about this stuff and consider the API contract implications, I feel like hypermedia helps API providers navigate change, evolving and shifting to deliver resources to a more unknown, and distributed client base. gRPC seems like it provides a better contract for use in your known, trusted, and higher performance environments. Next, I will be diving into what API discovery looks like in a gRPC world, and coming back to this comparison with hypermedia and delivering a contract. I am feeling like API discovery is another area where hypermedia will make sense, further helping API providers and API consumers conduct business in a fast changing environment.


Uber Is Painting A Bigger Picture Of Their Drivers With Driver API Partnerships

I was taking a look at the new Uber Driver API, trying to understand the possibilities with the API, and some of the motivations behind Uber's launch of it. According to Uber, "our Driver API lets you build services and solutions that make the driver experience more productive and rewarding. With the driver's permission, you can use trip data, earnings, ratings and more to shape the future of the on-demand economy." This provides an interesting opportunity for partners to step up and build useful apps that Uber drivers can leverage in their worlds, helping them be more successful in their work.
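
To ground that, here is a hedged sketch of what a partner integration might look like--the endpoints follow the Driver API documentation as I understand it, and should be treated as assumptions rather than a verified contract:

```python
# Pulling a driver's profile and trip history with a driver-granted
# OAuth token; the endpoint paths are assumptions for illustration.
import requests

HEADERS = {"Authorization": "Bearer DRIVER_OAUTH_TOKEN"}

profile = requests.get("https://api.uber.com/v1/partners/me",
                       headers=HEADERS).json()
trips = requests.get("https://api.uber.com/v1/partners/trips",
                     headers=HEADERS).json()

# A tax partner might aggregate earnings across these trips from here.
print(profile)
print(trips)
```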

The first dimension of the Uber Driver API I find interesting is that it is not an API that is about their core business--ridesharing. It is focused on making their drivers more successful, and developing tools, and integrations that make their lives easier. I could see something like this evolving to other platforms like AirBnB, and other sharing or gig economy platforms, helping operators make sense of their worlds, while also strengthening partnerships and hopefully their relationship with operators along the way. 

The second dimension I find interesting is thinking about why Uber is publishing their Driver API. At first glance, it is all about making their drivers happy--something Uber needs a lot of help with currently. However, once you think a little more about it, you can start to see the bigger picture Uber is looking to paint of their drivers. The company's leverage over drivers has proven to only go so far, and they need to understand more about the lives of their drivers--if they can invite corporate partners to do their taxes, and potentially other key tasks in their lives, a greater picture of their lives will come into focus.

If an Uber driver is also a Lyft driver, the Uber Driver API gives Uber more of a look into their competitor's accounting. They also get a better understanding of what else their drivers have going on, and other ways they can increase their leverage over them. It's fascinating to think about. Right now, all the examples Uber provides are tax-related, but I'm sure other types of partners will emerge--it is why you do APIs. Open up an API, and people will line up to innovate for you, helping you understand the dimensions of your drivers' lives. It is a fascinating look at why you would do APIs, and how there is almost always more than one dimension to why a company will deploy an API.

I am sure there are other reasons I am not considering here. Maybe now that I've planted the seed in my brain, I'll come up with some other incentives for why Uber operates their APIs as they do--it is fascinating stuff to unpack.


Moving APIs Out Of The Partner Realm And Making Them More Public

It is common for API providers to be really private with their APIs, and we often hear about providers restricting access as time goes by. So, when API providers loosen up restrictions on their APIs, inviting wider use by developers, and making them more public--I think it is worth taking notice.

A recent example of this in the wild comes from the API poster child Twitter, with their Periscope video service. Twitter has announced that they are slowly opening up access to the Periscope video API, something that has only been available to a handful of trusted partners, and via the mobile application--there was no way to upload a video without using your mobile device. Twitter is still "limiting access to fewer strategic partners for a period", but at least you can apply to see if your interests overlap with Twitter's interests. It also sounds like Twitter will continue to widen access as time goes on, and as it makes sense to their Periscope strategy.

While I wish we lived in a world where API developers were all well behaved, and API providers could open up services to the public from day one, this isn't the reality we find on the web today. Twitter's cautious approach to rolling out the Periscope API should provide other API providers with an example of how you can do this sensibly, only letting in the API consumers that are in alignment with your goals--slowly opening up, making sure the service is stable and meets the needs of consumers and partners, as well as helping meet the platform's business objectives.

Hopefully, Twitter's approach to launching the Periscope API provides us with a positive example of how you can make your APIs more public, instead of always locking things down. There is no single way to launch your APIs, but I'd say that Twitter's cautious approach is probably a good idea when you operate in such a competitive space, or when you don't have a lot of experience operating a fully public API--start with a handful of trusted partners, and open things up to a slightly wider group as you feel comfortable.

Thanks, Twitter, for providing me with a positive example I can showcase--it is good to see that after a decade you can still launch new APIs, and understand their value to your core business--even amidst the often intense and volatile environment that is the Twitter platform.


API Definition: WebConcepts.info

This is an article from the current edition of the API Evangelist industry guide to API definitions. The guide is designed to be a summary of the world of API definitions, providing the reader with a recent summary of the variety of specifications that are defining the technology behind almost every part of our digital world.

Keeping up with standards bodies like the International Organization for Standardization (ISO) and the Internet Engineering Task Force (IETF) can be a full-time job. Thankfully, Erik Wilde (@dret) has helped simplify the concepts and specifications that make the web work, making them more accessible and easier to understand with his WebConcepts.info project.

According to Erik, “the Web’s Uniform Interface is based on a large and growing set of specifications. These specifications establish the shared concepts that providers and consumers of Web services can rely on. Web Concepts is providing an overview of these concepts and of the specifications defining them.” His work is a natural fit for what I am trying to accomplish with my API definition industry guide, as well as supporting other areas of my research.

One of the things that slows API adoption is a lack of awareness of the concepts and specifications that make the web work among the developers who are providing and consuming APIs. The modern API leverages the same technology that drives the web--this is why it is working so well. The web is delivering HTML for humans, and APIs are using the same technology to deliver machine-readable data, content, and access to algorithms online. If a developer is not familiar with the fundamental building blocks of the web, the APIs they provide, and the applications they build on top of APIs, will always be deficient.

This project provides an overview of 28 web concepts with 643 distinct implementations aggregated across five separate organizations, including the International Organization for Standardization (ISO), Internet Engineering Task Force (IETF), Java Community Process (JCP), Organization for the Advancement of Structured Information Standards (OASIS), and the World Wide Web Consortium (W3C)--who all contribute to what we know as the web. An awareness and literacy around the 28 concepts aggregated by Web Concepts is essential for any API developer and architect looking to fully leverage the power of the web as part of their API work.

After aggregating the 28 web concepts from the five standards organizations, Web Concepts additionally aggregates 218 actual specifications that API developers, architects, and consumers should be considering when putting APIs to work. Some of these specifications are included as part of this API definition guide, and I will be working to add additional specifications in future editions, as it makes sense. The goal of this guide is to help bring awareness, literacy, and proficiency with common API and data patterns--making use of the Web Concepts project, and building on the web literacy work already delivered by Erik, just makes sense.

Web Concepts is published as a Github repository, leveraging Github Pages for the website. Erik has worked hard to make the concepts and specifications available as JSON feeds, providing machine-readable data that can be integrated into existing API design, deployment, and management applications--providing web literacy concepts and specifications throughout the API life cycle. All JSON data is generated from the source data, which is managed as a set of XML descriptions of specifications, with the build process based upon XSLT and Jekyll, providing multiple ways to approach all concepts and specifications, while maintaining the relationships and structure of all the moving parts that make up the web.
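
As a quick, hedged sketch of what integrating those feeds might look like--the feed URL below is my assumption, so check webconcepts.info for the actual machine-readable locations and structure:

```python
# Pulling the Web Concepts JSON into your own tooling; the URL and
# the response structure are assumptions for illustration.
import requests

FEED_URL = "http://webconcepts.info/concepts.json"  # assumed location

feed = requests.get(FEED_URL).json()

# Print whatever comes back, without assuming too much about the
# exact shape of the feed.
items = feed if isinstance(feed, list) else feed.get("concepts", [])
for item in items:
    print(item)
```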

When it comes to the startup space, the concepts that make up the web, and the specifications that make it all work, might seem a little boring--something only the older engineers pay attention to. Web Concepts helps soften these critical concepts and specifications, making them accessible and digestible for a new generation of web and API developers--think of them as the gummy versions of vitamins. If we are going to standardize how APIs are designed, deployed, and managed--making all of this much more usable, scalable, and interoperable--we are going to have to all get on the same page (aka the web).

Web Concepts is an open source project, and Erik encourages feedback on the concepts and specifications. I encourage you to spend time on the site regularly and see where you can integrate the JSON feeds into your systems, services, and tooling. We have a lot of work ahead of us to make sure the next generation of programmers has the base amount of web literacy necessary to keep the web strong and healthy. There are two main ways to contribute to the building blocks of the web: participate as a contributor to a standards body, or make sure you are implementing common concepts and specifications throughout your work--contributing to the web, and not just to walled gardens and closed platforms.


 If you have a product, service, or story you think should be in the API Evangelist industry guide to API definitions, you can email me, or you can submit a Github issue for my API definition research, and I will consider adding your suggestion to the roadmap.


Sharing API Data Validation Examples

I was studying examples of how I can validate the data returned from my human services API demo, developing a set of API tests, as well as a list of API service providers who can implement them, for cities to consider as part of the API deployments that serve up the organizations and locations where you can find critical services. I'm looking for examples of the common things like API availability and response time, but I'm also looking to get very granular and specialized when it comes to organization, location, and service APIs.

The image I borrowed from Runscope helps visualize what I'm talking about, showing how we can keep an eye on the basics, while also getting really granular when specifying what we expect from our APIs. I have a pretty good imagination when it comes to thinking of scenarios I want to test for, but I'm also looking for any API providers who might already be sharing their tests, and being more transparent when it comes to their API monitoring and testing practices. If you know of any API providers that would be willing to share the lists of what types of things they test for, I'd love to hear more.
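
To show the spectrum I have in mind, from the basics down to the granular, here is a hedged sketch of a couple of tests against a hypothetical human services API--the URL and fields loosely follow the HSDS schema, not any specific deployment:

```python
# Hypothetical tests, runnable with pytest, mixing basic availability
# checks with granular, domain-specific expectations.
import requests

API = "https://api.example.city/locations"  # hypothetical endpoint

def test_availability_and_response_time():
    response = requests.get(API, timeout=2)  # fail if slower than 2 seconds
    assert response.status_code == 200
    assert response.headers["Content-Type"].startswith("application/json")

def test_location_records_are_sane():
    locations = requests.get(API, timeout=2).json()
    assert isinstance(locations, list) and locations

    for location in locations:
        # Every location should name itself and its parent organization.
        assert location.get("name")
        assert "organization_id" in location
        # Coordinates, when present, should actually be on the planet.
        if "latitude" in location:
            assert -90 <= float(location["latitude"]) <= 90
        if "longitude" in location:
            assert -180 <= float(location["longitude"]) <= 180
```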

I'm thinking a regular blog series sharing different examples of how people are testing APIs, from a diverse range of business sectors, might help stimulate people's imaginations when it comes to API testing concepts. It is another area where we could all learn a lot from each other if there was just a little bit of sharing. I'd love it if the examples were machine-readable and reusable in any API testing service, but I would settle for just a blog post, or the sharing of a bulleted list of API tests via email, or another channel. ;-)


Deploying Your APIs Exactly Where You Need Them

Building on earlier stories about how my API partners are making API deployment more modular and composable, and pushing forward my understanding of what is possible with API deployment, I'm looking into the details of what DreamFactory enables when it comes to API deployment. "DreamFactory is a free, Apache 2 open source project that runs on Linux, Windows, and Mac OS X. DreamFactory is scalable, stateless, and portable"--making it a pretty good candidate for running wherever you need it.

After spending time at Google and hearing about how they want to enable multi-cloud infrastructure deployment, I wanted to see how my API service provider partners are actually able to power these visions of running your APIs anywhere, on any infrastructure. Using DreamFactory, you can deploy your APIs using Docker, Kubernetes, or directly from a Github repository--something I'm exploring as standard operating procedure for government agencies, like we see with 18F's US Forest Service ePermit Middlelayer API. In my opinion, all federal, state, and local government agencies should be able to deploy API infrastructure like this.

One of the projects I am working on this week is creating a base blueprint of what it will take to deploy a human services API for any city in Google or Azure. I have a demo working on AWS already, but I need a basic understanding of what it will take to do the same in any cloud environment. I'm not in the business of hosting and operating APIs for anyone, let alone for government agencies--this is why I have partners like DreamFactory, to whom I can route specific projects as they come in. Obviously, I am looking to support my partners, as they support me, but I'm also looking to help other companies, organizations, institutions, and government agencies better leverage the cloud providers they are already using.

I'll share more stories about how I'm deploying APIs to AWS, as well as Google and Azure, as I do the work over the next couple of weeks. I'm looking to develop a healthy toolbox of solutions for government agencies to use. This week's project is focused on the human services data specification, but next week I'm going to look at replicating the model for other Schema.org vocabularies, providing simple blueprints for deploying other common APIs like products, directories, and link listings. My goal is to provide a robust toolbox of APIs that anyone can launch in AWS, Google, and Azure with the push of a button--eventually.


API Definition: OpenAPI Specification 3.0.0-RC0

This is an article from the current edition of the API Evangelist industry guide to API definitions. The guide is designed to be a summary of the world of API definitions, providing the reader with a recent summary of the variety of specifications that are defining the technology behind almost every part of our digital world.

The OpenAPI Specification, formerly known as Swagger, is approaching an important milestone: version 3.0 of the specification. It is also the first major release since the specification was entered into the Linux Foundation. Swagger was the creation of Tony Tam of Wordnik back in 2010, but after the project was acquired by SmartBear in 2015, the company donated the specification to the newly formed OpenAPI Initiative (OAI), which is part of the Linux Foundation. Swagger has now been reborn as the OpenAPI Specification, and in early 2017 it is approaching its first major release under the guidance of a diverse group of governing members.

Version 3.0 of the API specification format takes a much more modular and reusable approach to defining the surface area of an API, enabling more power and versatility when it comes to describing the request and response models, as well as providing details on the common components that make up API usage, like the underlying data schema and security definitions. There are numerous changes to the specification, but there are just a handful that will have a significant impact across every stop along the API life cycle where API definitions are being put to work.

Hosts

When describing your API, you can now provide multiple hosts, allowing you to better deal with the complexity of how APIs might be deployed in a single location, or spread across multiple cloud locations and global infrastructure.

Content Negotiation

You can now provide content objects to define the relationship between response objects, media types, and schema, opening up the negotiation of exactly the type of content needed, encouraging the design of multiple rich dimensions for any single API resource.

Body

The latest version of the specification plays catch-up when it comes to allowing the body of a request and response to be defined separately from the request parameters, allowing for more control over the payload of any API calls.

Schema

There is an increased investment in JSON Schema, including support for the `oneOf`, `anyOf`, and `not` keywords, allowing for alternative schema beyond the standard JSON Schema definitions already included.

Components

The new components architecture really reflects how APIs get composed, making everything very modular, reusable, and much more coherent and functional. The new version encourages good schema and component reuse, helping further bring the definition into focus.

Linking

While not quite full hypermedia, version 3.0 of the OpenAPI Spec supports linking, allowing for the description of relationships between paths--a nod towards the hypermedia design pattern that makes this version the most resilient so far.

Webhooks

Like the nod towards hypermedia, the specification now allows for the inclusion of callbacks that can be attached to a subscription operation, describing an outbound operation--providing much-needed webhook descriptions as part of API operations, and making the specification much more real time and event-driven.

Examples

The new version enables API architects to better describe, and provide examples of, API requests and responses--helping make API integration a learning experience by providing examples for use beyond just documentation descriptions.

Cookies

Responding to a large number of API providers, and the reality on the ground for many implementations, the new version allows for the definition of cookies as part of API requests.

These nine areas represent the most significant changes in OpenAPI Spec 3.0. The most notable shift is the componentized, modular, reusable aspect of the specification. After that, it is the content negotiation, linking, webhooks, and other changes that are moving the needle. It is clear that the 3.0 version of the specification has considered the design patterns across a large number of implementations, providing a pretty wide-reaching specification for defining what an API does, in a world where every API can be a special snowflake.
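
To pull a few of these changes together, here is a minimal, hand-written OpenAPI 3.0 fragment for a hypothetical API--embedded as a YAML string so it can be loaded and inspected--showing multiple hosts, a request body with content negotiation, and reusable components:

```python
# A hypothetical OpenAPI 3.0 fragment exercising servers, requestBody
# content, and components; loaded with PyYAML just to prove it parses.
import yaml

DEFINITION = """
openapi: 3.0.0
info:
  title: Example Notes API
  version: 1.0.0
servers:
  - url: https://api.example.com/v1
  - url: https://eu.api.example.com/v1
paths:
  /notes:
    post:
      requestBody:
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/Note'
      responses:
        '201':
          description: Note created
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/Note'
components:
  schemas:
    Note:
      type: object
      properties:
        id:
          type: string
        body:
          type: string
"""

spec = yaml.safe_load(DEFINITION)
print(len(spec["servers"]), "hosts,", list(spec["components"]["schemas"]))
```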

In 2017, the OpenAPI Spec is the clear leader among API definition formats, with the widest adoption, as well as the largest amount of tooling developed. While documentation and SDK generation are still the top two reasons for crafting API definitions, there are numerous other reasons for using them, including mocking, testing, monitoring, IDE integration, and much, much more.


 If you have a product, service, or story you think should be in the API Evangelist industry guide to API definitions, you can email me, or you can submit a Github issue for my API definition research, and I will consider adding your suggestion to the roadmap.


Opportunity For Push Button API Deployment With Google Cloud Launcher

I'm keeping an eye on the different approaches to deploying infrastructure coming out of AWS, Google, Microsoft, and other providers. In my version of the near future, we should be able to deploy any API we want, on any infrastructure we want, with a single push of a button. We are getting there, as I'm seeing more publish-to-Heroku buttons, AWS and Azure deployment packages, and I recently came across the Google Cloud Launcher, which I think will work well for deploying a variety of API driven solutions--we just need more selection and a button!

All the parts and pieces for this type of push button API deployment exist already; we just need someone to step up and provide a dead simple framework for defining and embedding the buttons, abstracting away the complexities of each cloud platform. I want to be able to take a single manifest for my open source or wholesale API on Github, and allow anyone to deploy it into Heroku, AWS, Google, Azure, or anywhere else they want. I want the technical, business, and legal complexities of deployment abstracted away for me, the API provider.

API management has matured a lot over the last 10 years, and API design and definitions are currently flourishing. We need a lot more investment in helping people easily deploy APIs wherever they need. I think this layer of interoperability is the responsibility of the emerging API service providers like Restlet, DreamFactory, or maybe even APIMATIC. I will keep tracking what I'm seeing evolve out of the leading cloud platforms like AWS, Azure, and now Google with their Cloud Launcher. I will also keep pushing on my API service provider partners in the space to enable API deployment like this--I am guessing they will just need a little nudging to see the opportunity in providing API deployment in this seamless, cloud-agnostic way.


API Icon Vocabulary

I am working on profiling 75 of the Google APIs, and one thing I struggle with at this scale is standardizing the images I use--or more specifically, the icons that represent each service, as well as the value they deliver under the hood--something Google seriously needs to get more organized about, by the way. I have written before about having a set of icons for the API sector, about SDK related icons, and also about how Amazon is getting more organized when it comes to icons for the AWS platform, as I beat this drum about the need for common imagery.

While I am glad that Amazon has started to think about iconography when it comes to working with APIs at scale--a lead that Google and Microsoft should follow--I'm hoping that API icons are something someone will tackle at the same level as, say, Schema.org. I would like to see API provider (company) level icons, building on the work of Devicon, but I'd also like to see individual icons developed for the common resources that are made available via APIs--like compute, storage, images, video, etc.

What Amazon is doing provides the best model we have so far, but I want to make sure icons are not vendor specific. I would like to see a universal icon to represent a compute API for instance, whether it was Amazon, Google, or Microsoft. Think about IFTTT or Zapier, but something that would universally represent the valuable bits and bytes we are putting to use via individual platforms, but are increasingly also moving around between platforms--I want a common visual iconography we can use to communicate about what is being done with APIs.

There is a ton of work involved with establishing a project of this scale. Ideally, it is something that would also involve API providers, and not just be an external thing. I'd also like to see each icon be more than just a visual; I'd like there to be semantics and vocabulary attached to each image. Imagine if we had a common set of icons describing machine learning APIs, that helped us quickly understand what they do, but also helped us more consistently articulate the reality of what they do, and do not do (ie, facial recognition, sentiment analysis, etc.).
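
As a sketch of what I mean--entirely hypothetical, since no such standard exists yet--an entry in this kind of icon vocabulary might pair the image with machine-readable semantics:

```python
# A hypothetical icon vocabulary entry: a vendor-neutral term, a
# visual asset, and semantics linking it to the vendor-specific
# services it represents. None of these URLs reflect a real standard.
COMPUTE_ICON = {
    "term": "compute",
    "label": "Compute API",
    "description": "A universal icon representing compute resources, "
                   "independent of any single vendor.",
    "icon": "https://icons.example.org/compute.svg",
    "sameAs": [
        "https://aws.amazon.com/ec2/",
        "https://cloud.google.com/compute/",
        "https://azure.microsoft.com/services/virtual-machines/",
    ],
}
```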

I am going to keep talking about this until someone either does it or gives me the money to pay someone to do it. Sadly, I feel like it will end up being like the rest of common API definitions and tooling--we'll see each provider do it on their own, where all meaning and vocabulary becomes platform-driven, and not about actually finding a common language to communicate about what is possible with APIs.


API Definition: Human Services API Specification

This is an article from the current edition of the API Evangelist industry guide to API definitions. The guide is designed to be a summary of the world of API definitions, providing the reader with a recent summary of the variety of specifications that are defining the technology behind almost every part of our digital world.

A lot of attention is given to APIs and the world of startups, but in 2017 this landscape is quickly shifting beyond the heart of the tech space, with companies, organizations, institutions, and government agencies of all shapes and sizes putting APIs to work. API definitions are being applied to the fundamental building blocks of the tech sector, quantifying the compute, storage, image, video, and other essential resources powering web, mobile, and device-based applications. This success is now spreading to other sectors, defining other vital resources that are making a real impact in our communities.

One API making an impact in communities is the Human Services Data Specification (HSDS), also known as the Open Referral Ohana API. The project began as a Code for America project, providing an API, website, and administrative system for managing the organizations, locations, and human services that communities depend on. Open Referral, the governing organization for HSDS and the Ohana API, is working with API Evangelist and other partners to define the next generation of the human services data specification and API definition, as well as the next generation of the API, website, admin, and developer portal implementations.

The HSDS API isn’t any single API; it is a suite of API-first definitions, schema, and open tooling that cities, counties, states, and federal government agencies can download or fork, and employ to help manage the vital human services their communities depend on. It provides not just a website for finding vital services, but a complete API ecosystem that incentivizes developers to build important web, mobile, and other applications on top of a central human services system--better delivering on the mission of human services organizations, and meeting the demands of their constituents.

This approach to delivering APIs centers around a common data schema, extending it as an OpenAPI Spec definition that describes how that data is accessed and put to use across a variety of applications, including a central website and administrative system. While the server-side HSDS API, website, mobile, administrative, developer portal, and other implementations are important, the key to the success of this model is the central OpenAPI definition of the HSDS API. This definition connects all the implementations within an API’s ecosystem, but it also provides the groundwork for a future where all human services implementations are open and interoperable with each other--establishing a federated network of services meeting the needs of the communities they serve.

Right now, each city is managing one or multiple human services implementations. Even though some of these implementations operate in overlapping communities, few of them provide 3rd party access, let alone integration between overlapping geographic regions. The HSDS model employs an API-first approach, focusing on the availability and access of the HSDS schema, then adding on a website, administrative system, and API developer portal to support it. This model opens up human services to humans via the website, which is integrated with the central API, but then also opens up the human services for inclusion in other websites, mobile and device applications, as well as integration with other systems.
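
For a sense of what that 3rd party access looks like, here is a hedged sketch of consuming an HSDS/Ohana-style API--the base URL is a hypothetical city deployment, and the paths and fields only loosely follow the HSDS schema:

```python
# Searching a hypothetical city's human services API and joining the
# results back to their parent organizations; the paths and parameters
# are assumptions modeled loosely on the Ohana API.
import requests

BASE = "https://hsds.example.city/api"  # hypothetical deployment

locations = requests.get(
    f"{BASE}/search",
    params={"keyword": "food", "lat_lng": "37.7749,-122.4194"},
).json()

for location in locations:
    org = requests.get(
        f"{BASE}/organizations/{location['organization_id']}"
    ).json()
    print(location["name"], "--", org["name"])
```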

The HSDS OpenAPI spec and schema provide a reusable blueprint that can be used to standardize how we provide human services. The open source approach to delivering definitions, schema, and code reduces the cost of deployment and operation for cash-strapped public organizations and agencies. The API-first approach to delivering human services also opens up resources for inclusion in our applications and systems, potentially outsourcing the heavy lifting to trusted partners, and 3rd party developers interested in helping augment and extend the mission of human services organizations and groups.

If you’d like to learn more about the HSDS API, you can visit Open Referral. From there you can get involved in the discussion, and find the existing open source definitions, schema, and code for putting HSDS to work. If you’d like to contribute to the project, there are numerous opportunities to join the discussion about the next generation of the schema and OpenAPI Spec, as well as develop server-side and client-side implementations.


 If you have a product, service, or story you think should be in the API Evangelist industry guide to API definitions, you can email me, or you can submit a Github issue for my API definition research, and I will consider adding your suggestion to the roadmap.


With Each API We Increase The Attack Surface Area

It is easy for me to get excited about a new API. I'm an engineer. I'm a dude. I am the API Evangelist. It is easy to think about the potential for good when it comes to APIs. It is much harder to suspend the logical side of my brain and think about the ways in which APIs can be used in negative ways. As a technologist, it is natural for me to focus in on the technology, and tune out the rest of the world--it is what we do. It takes a significant amount of extra effort to stop, suspend the portion of your brain that technology whispers to, and think about the unintended consequences, and the pros and cons of why we are doing APIs.

Technologists aren't very good at slowing down and thinking about the pros and cons of connecting something to the Internet, let alone whether or not an API should even exist in the first place (it has to exist!). As I read a story about the increase in DDoS attacks on the network layer of our online world, I can't help but think that with each new API we deploy, we are significantly increasing the attack surface area of our businesses, organizations, institutions, and government agencies. It feels like we are good at thinking about the amazing potential of APIs, but we really suck at seeing what a target we are putting on our backs when we do APIs.

We seem to be marching forward, drunk on the potential of APIs and Internet-connected everything. We aren't properly securing the technology we already have, something we can see playing out with each wave of vulnerabilities, breaches, and leaks. We are blindly pushing forward with new API implementations, using the same tactics we used for our web and mobile technology--something we are seeing play out with the Internet of Things, and the vulnerable cameras, printers, and other objects we are connecting to the Internet using APIs.

With each API we add, we are increasing the attack surface area of our systems and devices. APIs can be secured, but from what I'm seeing, we aren't investing in security for our existing APIs, something that is being replicated with each new wave of deployments. We need to get better at thinking about the negative consequences of doing APIs. We need to stop making ourselves targets. We need to get better at thinking about whether or not an API should exist. We need a way to better visualize the target surface area we've crafted for ourselves using APIs, and be a little more honest with ourselves about why we are doing this.


Reminding Myself Of Why I Do API Evangelist

This is my regular public service reminder of why I do API Evangelist. I do not evangelize APIs because I think everybody should be doing them, or because they are the solution to all of our problems, or because I have an API I want you to buy (I have other things for you to buy). I do API Evangelist because I want to better understand how platforms like Facebook, Twitter, Uber, and others are operating and impacting our personal and professional lives.

I do believe in APIs as an important tool in our professional toolboxes, but Silicon Valley, our government(s), and many other bad actors have shown me that APIs will more often be used for shady things, rather than the positive API vision I have in my head. I still encourage companies, organizations, institutions, and agencies to do APIs, but I spend an equal amount of time ensuring people are doing APIs in a more open and equitable way, while still encouraging folks to embark on their API journeys--taking more control over how we store, share, and put our bits and bytes to work each day.

APIs are playing a role in almost every news story we read today, from fake news and elections, to cybersecurity, to healthcare with the FHIR API standard, to banking with PSD2, to automobiles and transportation with Tesla and Uber. I can keep going all day long, talking about the ways APIs are influencing and delivering vital aspects of our personal and professional lives. In ALL of these situations, it's not the API that is important, it is the access, availability, and observability of the technology that is impacting the lives of humans--APIs are just the digital connector by which our mobile phones and other devices are being connected to the Internet.

I like to regularly remind myself why the fuck I'm doing API Evangelist, so I don't burn out like I have before and end up roaming the streets foaming at the mouth again. I also do it to remind people of why I do API Evangelist, so they don't just think I'm a cheerleader for technology (rah rah rah, gooo APIs). My mission isn't just about APIs, it's about ensuring APIs are in place that allow us to better understand the inputs and outputs of the technology invading all aspects of our lives. Without observability into the backend systems and algorithms that are driving our personal and professional lives, we are screwed--which is why I do API Evangelist, to help ensure there is observability into how technology is impacting our world.


Considering Standards In Our API Design Over Being A Special Snowflake

Most of the APIs I look at are special snowflakes. The definitions and designs employed are usually custom-crafted without considering other existing APIs, or the standards that are already in place. There are several contributing factors to why this is, ranging from the types of developers who are designing APIs, to the incentive models put in place by investment and intellectual property constraints. So, whenever I find an API that is employing an existing standard, I feel compelled to showcase it, and help plant the seed in others' minds that we should be speaking a common language instead of always being special snowflakes.

One of these APIs that I came across recently was the Google Spectrum Database API, which has employed a standard defined by the IETF, the Protocol to Access White Space (PAWS). I wouldn't say it is the best-designed API, but it does follow a known standard that is already in use by an industry, which in my experience can go further than having the best-designed API. The best product doesn't always win in this game; sometimes it is just about getting adoption with the widest possible audience. I am guessing that the Google Spectrum Database API is targeting a different type of engineering audience than their more modern machine learning and other APIs are, so following standards is probably more of a consideration.

I wish more API providers would share a little bit about the thinking that went into the definition and design of their APIs, sharing their due diligence of existing APIs and standards, and the other considerations that were included in the process of crafting an API. I'd like to see some leadership in this area, as well as some folks admitting that they didn't have the time, budget, expertise, or whatever the other reasons are for being a special snowflake. It is a conversation we should be having; otherwise, we may never fully understand why we aren't seeing the adoption we'd like to see with our APIs.


The OpenAPI Toolbox And My API Definition Research

I have the latest edition of my API definition research published, complete with a community-driven participation model, but before I move on to my design, deployment, and management guides, I wanted to take a moment and connect my OpenAPI Toolbox to this research.

My API definition research encompasses any specification, schema, or authentication and access scope used as part of API operations, providing a pretty wide umbrella. I am always on the hunt for specifications, schema, media types, generators, parsers, converters, as well as semantics and discovery solutions that are defining the layers of the API space. 

This is one reason I have my OpenAPI Toolbox, which helps focus my research on the fast-growing ecosystem developing around the OpenAPI specification. I'm always looking for people who are doing anything interesting with the OpenAPI specification. When I find something, I get to work crafting a title, description, image, and project link, so that I can add it to the OpenAPI Toolbox YAML file that drives the toolbox website. If you are developing any open tooling that uses the OpenAPI specification, please let me know by submitting a Github issue for the toolbox--or, if you are feeling brave, go ahead and add yourself and submit a pull request.
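
Each entry in the toolbox boils down to those four fields--here is a hedged sketch of what one might look like, with a made-up project, and not the literal structure of the toolbox YAML file:

```python
# A hypothetical toolbox entry; the field names mirror what I craft
# for each project, not the exact schema of the YAML file.
TOOLBOX_ENTRY = {
    "title": "Example OpenAPI Linter",
    "description": "An open source linter that validates OpenAPI "
                   "definitions as part of a CI workflow.",
    "image": "https://example.com/openapi-linter-logo.png",
    "url": "https://github.com/example/openapi-linter",
}
```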

My OpenAPI Toolbox is joined at the hip with my API definition research--the toolbox is just a sub-project of that wider work. If you are doing anything interesting with API definitions, schema, or scopes, please let me know, and I will be happy to add it to my API definition research. The best of the best from my API definition research and the OpenAPI Toolbox will be included in my industry guide, receiving wider distribution than just my network of API Evangelist sites.


API Evangelist Industry Guide To API Definitions

I keep an eye on over 70 areas of the API sector, trying to better understand how API providers are getting things done, and what services and tooling they are using, while also keeping my perspective as an API consumer--observing everything from the outside-in. The most important area of my research is API definitions--where I pay attention to the specifications, schema, scopes, and other building blocks of the API universe. 

The way my research works is that I keep an eye on the world of APIs by monitoring the social media accounts, blogs, Github repositories, and other channels of companies, organizations, institutions, and agencies doing interesting things with APIs. I curate information as I discover and learn across the API sector, then I craft stories for my blog(s). Eventually, all of this ends up being published to the main API Evangelist blog, as well as the 70+ individual areas of my research, from definition to deprecation.

For a couple of years now, I have also published a guide for the top areas of my research, including API definitions, design, deployment, and management. These guides have always been a complete snapshot of my research, but in 2017 I am rebooting them to be a more curated summary of each area. My API definition guide is now ready for prime time after receiving some feedback from my community, so I wanted to share it with you.

I am versioning and managing all of my API industry guides using the Github repositories in which I publish each of my research areas, so you can comment on the API definition guide, and submit suggestions for future editions, using the Github issues for the project. My goal with this new edition of the API Evangelist API definition guide is to make it a community guide to the world of API definitions, schema, and scopes. So please, let me know your thoughts.

I am keeping my API definition guide available to the public for free. You can also purchase a copy via Gumroad if you'd like to support my work--I depend on your support to pay the bills. I will also be funding this work through paid one-page or two-page articles in future editions, as well as some sponsored spots located on certain pages--let me know if you are interested in sponsoring. Thanks, as always, for your support. I hope my API definitions guide helps you better understand this important layer of the API universe, something that is touching almost every aspect of the API and software development life cycle.