{"API Evangelist"}

API Evangelist and APIWare Partnering To Help You Do The Hard Work Of Developing and Managing Your API

I am fortunate to be able to take part in numerous conversations around API strategy, with companies, organizations, and government agencies that are trying to understand and maximize the role they play in the API economy. However, one side effect of this world is that most of these conversations begin and end with strategy planning, and when it comes to deployment and management, I rarely stay on in the discussion. I do have clients who check in with me at different stops along their journey, but for the most part my role ends once the strategy discussions end.

I would like to change this with a new relationship I've established with APIWare, a new API-first agency dedicated to defining, developing, deploying, and managing APIs at every stage of the life-cycle. I have worked with the core team behind APIWare on earlier projects, and after a couple of years I've learned to trust the work they do.

API Evangelist and APIWare are partnering to make sure that someone is always available to help you meet the demands of your API operations, by exclusively focusing on:

  • Defining Your API - Where your resources originate, and what the best path forward is in defining and designing your APIs.
  • Developing Your API - The actual hard work of crafting your APIs from scratch, all the way to using common API gateway solutions and frameworks.
  • Managing API Operations - Taking care of the day-to-day challenges of operating your API, from monitoring and availability to developer on-boarding.

The APIWare team is determined to stay focused on these areas while you are busy doing other things, and the team is working hard to follow many of the best practices that I showcase across my work in the API space. Now you don't just get a handful of initial conversations with me about best practices--you have a team that will help you actually address the pieces of API operations that are falling through the cracks.

I will not be doing any of the development--the core team is way more skilled at this than I am--however I will be involved with defining overall best practices for the team, and will be touching each project as it comes in the door, and at regular stages throughout its evolution. This approach helps make sure I can continue having conversations throughout an API's life-cycle, be there to share thoughts on the road-map, while also continuing to tell the important stories about API operations from the trenches.

The API Evangelist + APIWare partnership reflects a shift in how I will be doing things in the future. I am looking to capture more of the exhaust from my world--filling a critical gap by providing companies and organizations with a reliable source of API development and management resources, one that operates in a structured, yet agile, project management way, while sticking to an API-first vision.

This new evolution in API Evangelist will not change my tone. My goal is not to build a startup, but to help meet a growing demand in the space. I will still be just as opinionated and honest about my approach--I will just have more stories to tell! ;-)

If you need help, there will always be a link on the main menu for you to request assistance, and of course feel free to ping me directly.



The Potential When You Are Able To Spawn Virtualized Environments For Your API

I was made aware of the Citi Mobile Challenge during a conversation last week with the Anypresence team about their new JustAPIs offering. The conversation with them has triggered several potential stories, but one that stood out for me was the API virtualization opportunities that emerge when you use solutions like the JustAPIs gateway.

To support the Citi Mobile Challenge, the bank launched a set of virtual APIs for the event. I think their description tells a lot of the story:

The APIs made available through Citi® Mobile Challenge power some of Citi's latest digital offerings around the globe. These APIs will give developers an opportunity to create solutions that could function with existing Citi technology. No customer data will be shared. Citi encourages developers to also use APIs outside of Citi to support and add additional features to their innovations.

The bank needed to mimic their production environment, but in a way that developers could still build meaningful mobile apps, without the immediate security and privacy concerns that are present when working with live systems. I wish sandbox environments were the default across all API platforms, which unfortunately is not a reality, but when it comes to fintech the stakes are much higher--which smells like an opportunity for API definitions to me.

It can be a lot of work to establish and maintain both sandbox and production environments, but if you do, it can open up even more opportunity when it comes to supporting hackathons, providing QA environments, and much more. Additionally, if you embrace modern API definition formats like Swagger and API Blueprint in this process, as well as approaches to containerization like Docker, a whole new agility and flexibility can be realized.
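To make this a little more concrete, here is a minimal sketch of what a virtualized sandbox endpoint might look like. The route, the payload, and the port are made up purely for illustration, and I am using Python and Flask here rather than any particular vendor's tooling--the same interface could be described in a Swagger definition and packaged up in a Docker container for hackathon or QA use.

    # A minimal sketch of a virtualized sandbox endpoint -- the route and the
    # canned records below are hypothetical, standing in for production data.
    from flask import Flask, jsonify

    app = Flask(__name__)

    # Fabricated records that mimic the shape of production data, so no
    # customer data is ever exposed to sandbox consumers.
    SANDBOX_ACCOUNTS = [
        {"id": "acct-001", "type": "checking", "balance": 1250.75},
        {"id": "acct-002", "type": "savings", "balance": 8900.00},
    ]

    @app.route("/sandbox/accounts")
    def list_accounts():
        # Return the static sandbox data in the same format production would use.
        return jsonify({"accounts": SANDBOX_ACCOUNTS})

    if __name__ == "__main__":
        app.run(port=8080)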

Like API monitoring, performance, security, and the other growing incentives for embracing modern API definition formats, I see the demand for API virtualization growing. When your APIs are well defined, a new world of API definition driven services emerges, making things like virtualization for hackathons, QA, load testing, and simulation, as well as just a permanent sandbox, possible.

I have an API virtualization research area started; I just need to spend the time organizing the companies doing interesting things in the space, and consider some of the possible building blocks, before it is ready for prime time--stay tuned!



Providing A Web Scraping Toolkit On Github To Jumpstart The API Conversation At The University of Toronto

I found a pretty cool Github repository during my latest review of my university API research, with a mission "to provide a collection of RESTful web APIs that can allow developers to create applications or services that utilize public data from the University of Toronto."

I am a big fan of jump-starting API efforts on campus in this way, something I did to help fuel the API discussion at the University of Oklahoma, and a topic I will be talking more about in the near future as these conversations continue to play out. In my opinion, many campuses are just not ready for APIs, and to get things rolling it may often take outside influences.

This API project, dedicated to the University of Toronto, is interesting because it identifies three target APIs that are being worked on.

They are also building their own scraping module, which is available in the repository, complete with a list of target sources where they will be scraping valuable data for inclusion in the course, building, and food APIs. I find the project repository very focused, informative, and simple--well done.
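To give a sense of what a campus scraping module like this involves, here is a rough sketch in Python--the URL, the CSS selectors, and the output shape are all hypothetical, not the actual University of Toronto sources or code.

    # A hypothetical course scraper: fetch a public calendar page, pull out
    # course codes and titles, and emit JSON that a simple course API could serve.
    import json

    import requests
    from bs4 import BeautifulSoup

    def scrape_courses(url="https://example.edu/course-calendar"):
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        courses = []
        for row in soup.select("div.course"):
            courses.append({
                "code": row.select_one("span.code").get_text(strip=True),
                "title": row.select_one("span.title").get_text(strip=True),
            })
        return courses

    if __name__ == "__main__":
        print(json.dumps(scrape_courses(), indent=2))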

I would call this a minimum viable blueprint for API operations on any campus, and it is something that should be forked, cleansed of University of Toronto information, and replaced with generalized how-to instructions--complete with a scraping tool. Anyone involved with a university should be able to set a project like this in motion by forking it and following a simple blueprint.

With all of the work available on Github, anyone else can potentially get involved and contribute, and if a single project gets abandoned, someone else can pick it up. Most importantly, when a university is actually ready to legitimize the API efforts, it can also fork and build on top of any work that has already been accomplished.

While this approach to doing APIs in higher education may not be optimal, sometimes it's all we have to stimulate the API discussion on campus, something we just can't wait for--it needs to be set in motion as soon as possible.



Is Your Monetization Rooted In The Resource Or Experience Side Of Your API Operations?

I had another one of my regular check-ins with Anne and Bailey over at Popup Archive today, with a focus on the monetization strategy for their AudioSear.ch API. I thoroughly enjoy my conversations with the team, because they have worked really hard on their audio API solutions, and are extremely open to discussing and sharing the stories of their API journey.

During our talk, they tuned me into where they are at with their road-map, which now includes a solid set of API resources that provide access to the episodes, people, and shows involved in podcasts across the Internet. They talked about how they have been iterating on these core API resources with more experience-focused endpoints, including relatedness, trending, and tastes, to name a few. All of this demonstrates that the company is well along in their API journey, moving beyond just the definition of core API resources, and actually getting in tune with how these resources will be put to work.

Next they walked me through some of the core use cases for their podcast API, ranging from radio stations looking to provide better recommendations, to advertisers who want to better understand the exploding world of podcasts and where the opportunity is. AudioSear.ch is working to quantify the world of podcasting by indexing not just the metadata surrounding audio files, but also their contents and the relationships between them--busting open a brand new realm in cyberspace that, like video, is poised for explosive growth and will be ripe with opportunities.

Now that AudioSear.ch can crack open, index, and develop awareness around the contents of podcasts, which includes the episodes and people involved with shows, they are also faced with the hard part of actually defining and delivering APIs that developers will need, and actually pay for. Meaning: which API endpoints do we provide, and what do we charge for them? This is the number one conversation I am having with companies who have embarked on their own API journey.

These conversations always begin with discussing the hard costs: What did we invest in developer hours? What are our licensing costs? What does compute cost? What does storage cost? What does bandwidth cost? All of these costs are rooted in the resource side of the conversation. Where do the data, content, and other resources originate? How much work have we put into them? How do we perceive the value of our own resources? This leaves every API provider trying to figure out what people will pay and calculating their margins--a resource-based monetization strategy.
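As a back-of-the-envelope illustration of what this resource-based math looks like, here is a quick sketch--every number in it is invented, just to show how a cost-per-call floor gets calculated.

    # Resource-based pricing sketch: total up the hard costs, divide by expected
    # call volume, and mark up by a target margin. All figures are made up.
    monthly_costs = {
        "developer_hours": 8000.00,  # amortized build and maintenance
        "licensing": 500.00,
        "compute": 1200.00,
        "storage": 300.00,
        "bandwidth": 450.00,
    }

    expected_calls_per_month = 2_000_000
    target_margin = 0.40  # 40% on top of hard costs

    cost_per_call = sum(monthly_costs.values()) / expected_calls_per_month
    price_per_call = cost_per_call * (1 + target_margin)

    print(f"cost per call:  ${cost_per_call:.5f}")
    print(f"price per call: ${price_per_call:.5f}")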

Resource-based API monetization strategies are where all API operations begin when they are trying to figure things out. Once you are further along in your API journey, like Anne and Bailey are, you need to start thinking about more experience-based pricing, something that will be more closely aligned with how API consumers actually see things. As an API provider, you tend to be too closely aligned with your resources, focused on an API design derived from this view, and a monetization strategy that looks to cover the costs while also bringing home a potentially good margin. When you are rooted in this way of thinking, you are always chained to your core resource-based assumptions, which are often very distant from what will be important to your customers.

When you look at how AudioSear.ch is iterating on more experience-based API designs like /trends, /relatedness, and /tastes for podcasts, you see how they are beginning to uncover what is actually valuable to end-users. As they keep iterating on these experience-based API endpoints, driven by API consumer needs, pricing can't be solely anchored to the resource--it needs to also reflect the value perceived by the consumer. You never know when you will find just the right endpoint for a resource, one that has high perceived value to end-users but relatively low costs associated with actually delivering it--adding entirely new dimensions to your margins.
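Here is the same kind of sketch from the experience side--the endpoint names echo the AudioSear.ch examples above, but the costs and prices are entirely invented, only there to show how margins detach from the underlying resource once pricing follows perceived value.

    # Experience-based pricing sketch: each endpoint is priced on the value the
    # consumer perceives, not on what it costs to serve. All figures are made up.
    endpoints = {
        # endpoint: (cost_to_serve_per_call, price_per_call)
        "/search":      (0.0010, 0.0020),  # core resource, priced near cost
        "/trends":      (0.0012, 0.0100),  # high perceived value, wide margin
        "/relatedness": (0.0015, 0.0080),
        "/tastes":      (0.0011, 0.0060),
    }

    for endpoint, (cost, price) in endpoints.items():
        margin = (price - cost) / price
        print(f"{endpoint:14s} margin: {margin:.0%}")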

Approaching API monetization based upon how resources will be experienced, decoupling from a purely resource-based approach, will first and foremost increase the chance that anyone will purchase API access for any sustained period, but it also opens up a whole new world of price increases (or reductions) based upon demand (think AWS). These are just some initial thoughts from our conversation, something I will be thinking about more in the coming weeks, and hopefully I can provide some more concrete examples of this in the wild.

I just needed to work through some initial thoughts as I got out of the Google Hangout--thanks Anne and Bailey #GoodTimes



Going Beyond Just Data APIs With Code Endpoints At Apitite

In my time as the API Evangelist I've seen a number of API-deployment-as-a-service providers come and go. It is something that is technically hard to do properly, and something that historically has been hard to monetize, but as a wider awareness of APIs grows in the mainstream, the demand landscape for API deployment is shifting.

I got a demo today of an API provider called Apitite, which provides API deployment and management services in the cloud. The core of the service is built around the common need to deploy APIs from datasources like MySQL, SQL Server, etc. However, I wanted to highlight another feature of their platform, which they call code endpoints.
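Before getting to code endpoints, it is worth showing the basic data API pattern the platform is built around. This is not how Apitite implements it--just a rough Python sketch of the idea, with SQLite standing in for MySQL or SQL Server and a hypothetical products table.

    # Exposing a datasource as a simple data API -- SQLite stands in for the
    # MySQL / SQL Server sources a platform like Apitite would connect to.
    import sqlite3

    from flask import Flask, jsonify

    app = Flask(__name__)
    DB_PATH = "example.db"  # assumed to contain a simple `products` table

    @app.route("/api/products")
    def list_products():
        conn = sqlite3.connect(DB_PATH)
        conn.row_factory = sqlite3.Row
        rows = conn.execute("SELECT id, name, price FROM products").fetchall()
        conn.close()
        # Serialize each row to a plain dict so it can be returned as JSON.
        return jsonify({"products": [dict(row) for row in rows]})

    if __name__ == "__main__":
        app.run(port=8080)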

When I help folks understand what is possible with APIs, I break it into four main groups:

  • Data - APIs are about making data more accessible and usable.
  • Content - With the focus on data, many forget to talk about the serving of media and content via APIs.
  • Functional - Providing more algorithmic and functional features that accomplish a task.
  • Hardware - Allowing a device to be connected to by other applications and systems.

I know this is simplified, and some tech pundits will love to poke holes in it, but when you are explaining the API opportunity to business leaders, as well as the average business user, this has worked well for me. 

Apitite delivers on the functional portion of this API equation for me with their code endpoints service. Their platform is Node.js based, and allows very small scripts to be deployed as simple API endpoints. When you couple this with marketplace-like abilities, so users can find ready-to-go scripts they can reverse engineer, as well as a place for developers to publish and share or sell their scripts--you have a potentially very powerful API deployment formula.
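Apitite's code endpoints are Node.js scripts, but to illustrate the general idea of a functional endpoint, here is a hypothetical Python sketch--a tiny script wrapped up as an API endpoint that accomplishes a task rather than serving data. The route and payload shape are mine, not Apitite's.

    # A functional (code) endpoint sketch: accept a JSON payload of numbers and
    # return basic statistics. The route and payload shape are hypothetical.
    from statistics import mean, median

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/api/summarize", methods=["POST"])
    def summarize():
        # Expects a body like {"values": [1, 2, 3]}.
        values = request.get_json(force=True).get("values", [])
        if not values:
            return jsonify({"error": "no values provided"}), 400
        return jsonify({
            "count": len(values),
            "mean": mean(values),
            "median": median(values),
        })

    if __name__ == "__main__":
        app.run(port=8080)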

I have seen other similar solutions emerge, like Algorithmia, but nothing that is bundled with the data API capabilities that Apitite also brings to the table. Anyways, I just wanted to highlight this single point from the conversation while it was fresh in my head. I will have more to share on Apitite as I play with the platform more and incorporate it into a project I am working on around government data.