How Many APIs Are Too Many?

I hit a ceiling with my Artisanal APIs.json API profiling, in that I am pushing over a thousand individual APIs. It reminds me of running conferences: once you get over 500 people, everything begins to change. I only have a little over 100 API providers, but when some providers have 100+ individual APIs, it adds up very quickly. When you are working to properly profile APIs using OpenAPI, plus the operations around them with APIs.json, then augmenting with search, ratings, and other overlays, it becomes a lot to deal with. But it really isn't just the volume of data, it is the cognitive load involved with working with so many different types of APIs at once; the context switching begins to kill ya.

To help make things more manageable, I broke my Artisanal APIs.json work into what I am calling search nodes. I started with 12 individual topics, breaking down the top 25 API providers I have in the Artisanal index. It is just a start, and I will dump more in there now that I have it all set up, but it is already more manageable to have things broken up into nodes. There is something about having things broken down into manageable and more meaningful chunks that changes the rules of the game. The scripts I run to automate things run faster, and I feel less overwhelmed when I am working in the middle of the search index. We'll see how it all plays out over the next couple of rounds of work, but I am feeling like I can more comfortably scale this all in a federated way.
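To make the node idea a little more concrete, here is a minimal sketch of how a node's central index could be assembled from per-provider files. It assumes each provider lives in its own folder with an apis.yaml (APIs.json expressed as YAML), and that the node index points at them using the APIs.json include collection; the folder layout, filenames, and version string are my assumptions, not a fixed convention.

```python
# build_node_index.py - assemble a search node's central APIs.json index (YAML)
# Assumes: one folder per API provider, each containing an apis.yaml file.
import os
import yaml

NODE_ROOT = "."          # root of the search node repository (assumption)
INDEX_FILE = "apis.yaml" # central node index being generated (assumption)

def build_node_index(node_name: str, node_description: str) -> dict:
    includes = []
    for entry in sorted(os.listdir(NODE_ROOT)):
        provider_index = os.path.join(NODE_ROOT, entry, "apis.yaml")
        if not os.path.isfile(provider_index):
            continue
        with open(provider_index) as f:
            provider = yaml.safe_load(f)
        # The include collection points at each provider's own APIs.json.
        includes.append({
            "name": provider.get("name", entry),
            "url": f"./{entry}/apis.yaml",
        })
    return {
        "name": node_name,
        "description": node_description,
        "specificationVersion": "0.16",  # assumption; use whatever version you target
        "apis": [],
        "include": includes,
    }

if __name__ == "__main__":
    index = build_node_index("Cloud", "Search node for cloud API providers")
    with open(INDEX_FILE, "w") as f:
        yaml.safe_dump(index, f, sort_keys=False)
```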

Once I got the APIs.json broken up into 12 separate repositories, I needed a way to search across each of these nodes. To do this I wrote a simple starter search script that spiders the YAML in each repository. Each search node is defined using APIs.json, with an individual APIs.json for each API provider, as well as a single central apis.json serving as the node index. On top of that I have what I am calling a network node, a single APIs.json that provides me with an index of the 12 individual topical search nodes, with more coming soon. The search is janky as hell, but it works. I don't want to overthink it; I just need the bare minimum to prove that I can do this, and I'll harden the search over time. This is just a simple network search, and eventually I aim to index all my nodes using a database and provide a richer and faster API-driven search.
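Here is a rough sketch of what that kind of network search can look like: start from a single network-level APIs.json, follow its include links down to each node index, then down to each provider's APIs.json, and grep names, descriptions, and tags along the way. The URL, the YAML filenames, and the exact fields being matched are assumptions for illustration, not the actual script.

```python
# network_search.py - naive federated keyword search across APIs.json search nodes
# A minimal sketch: assumes the network index and each node index are published
# as YAML at reachable URLs and are linked together via the include collection.
import requests
import yaml

NETWORK_INDEX = "https://example.com/apis.yaml"  # network-level APIs.json (assumption)

def fetch_yaml(url: str) -> dict:
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return yaml.safe_load(response.text)

def search_network(keyword: str) -> list:
    keyword = keyword.lower()
    results = []
    network = fetch_yaml(NETWORK_INDEX)
    for node_ref in network.get("include", []):        # one entry per search node
        node = fetch_yaml(node_ref["url"])
        for provider_ref in node.get("include", []):    # one entry per API provider
            provider = fetch_yaml(provider_ref["url"])
            for api in provider.get("apis", []):
                haystack = " ".join([
                    api.get("name", ""),
                    api.get("description", ""),
                    " ".join(api.get("tags", [])),
                ]).lower()
                if keyword in haystack:
                    results.append({
                        "node": node_ref.get("name"),
                        "provider": provider.get("name"),
                        "api": api.get("name"),
                        "url": api.get("humanURL", api.get("baseURL", "")),
                    })
    return results

if __name__ == "__main__":
    for hit in search_network("storage"):
        print(hit)
```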

I have things federated, and I have my automation in place to find and process new APIs that I want to profile. I have a new incubator repository set up where I am taking in any new APIs, but before an API can graduate to one of the search nodes it has to have an OpenAPI. This is the line I've drawn between API intake and APIs that I publish to a search node. I have a number of other scripts I run against each API I am profiling to clean up names and tags and apply the rating, but having an OpenAPI describing the surface area is a pretty critical piece of the puzzle. Now that I have my approach to federating, the APIs.json network property added, and the first draft of a search, I am feeling pretty good about this approach. We'll see if it sticks. I could hit other challenges that make it untenable and force me to adjust, but I will deal with that when I encounter it.
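As a rough illustration of that graduation rule, a check like the one below can gate the move from the incubator to a search node, only letting an API through if its APIs.json entry references an OpenAPI. It assumes the OpenAPI link lives in the entry's properties collection and that the incubator path shown is hypothetical.

```python
# graduate_check.py - gate incubator APIs on having an OpenAPI reference
# Assumes each incubator provider is described in an apis.yaml (APIs.json as YAML)
# and that OpenAPI documents are referenced via the properties collection.
import yaml

def has_openapi(api: dict) -> bool:
    """True if the API entry references an OpenAPI (or legacy Swagger) property."""
    for prop in api.get("properties", []):
        if prop.get("type", "").lower() in ("openapi", "swagger"):
            return True
    return False

def ready_to_graduate(provider_index_path: str) -> bool:
    """A provider graduates only when every API it lists has an OpenAPI."""
    with open(provider_index_path) as f:
        provider = yaml.safe_load(f)
    apis = provider.get("apis", [])
    return bool(apis) and all(has_openapi(api) for api in apis)

if __name__ == "__main__":
    path = "incubator/example-provider/apis.yaml"  # hypothetical path
    status = "ready" if ready_to_graduate(path) else "needs an OpenAPI"
    print(f"{path}: {status}")
```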

I am feeling like I will come out of this round of work with strong opinions on how many APIs are too many. I have a couple of big ones like GitHub, GitLab, and others to break up into OpenAPIs, and I think that my cloud node is probably going to represent the ceiling when it comes to a search node. Along with this latest bit of work I switched the UI to use Bootstrap, and I am going to invest more into making search nodes forkable. I want this next round of work with APIs.io to produce something that anyone can fork and run on their own GitHub, in the cloud or on-premise. I am going to spend the next couple of weeks adding more of my APIs to several of the indexes, adding a couple of new indexes, hardening the search, and refining how it returns results. I am looking to make API discovery doable with any number of APIs, done in a federated way, with vendors enriching the indexes using APIs.json overlays over time.