{"API Evangelist"}

Reducing Our Hard Work To A Transaction With APIs and Serverless

I'm thinking about cloud pricing after profiling over 60 of the AWS API resources, as I play with building tools on Algorithmia, and evaluate a variety of serverless options. As someone who is regularly blindsided by the devious undercurrents of business while I'm busy focusing on shiny technological objects, I can't help but feel like we developers are contributing to the commoditization of our (currently) valuable skillset when it comes to APIs, microservices, devops, and serverless.

This isn't exclusive to these areas of technology, and I think it is something we've all set into motion with APIs and microservices over the last decade. We are decoupling some very complex, and often large codebases and dependencies that take a certain amount of experience and skill to tackle, and reducing them down to individual reusable resources that are automatically scaled, and may not require as many advanced skills to daisy chain and connect together. I'm all for doing this in the spirit of enabling the average business person to accomplish what they need on a daily basis, but from my experience with the savviness and deviousness of business leaders, I can't help but feel that eventually developers' labor will be devalued in this environment.

If we succeed, all company resources will be available via APIs and microservices, all events and actions will be present as "serverless" technology, and each individual resource will have a price tag, opening everything up for commoditization and extreme competition--this is good, right? There will still be room for a handful of highly paid architects in this environment, but new pieces of code will be able to be developed in isolation, at a very reduced price, with a huge amount of competition driving what developers will be paid way down. We are ultimately working against ourselves, and our own best interest, by achieving the current vision of "scale".

I do not think that the technology being employed (APIs, microservices, devops, and serverless) will be entirely responsible for this. Business elements will be a driving force: the constant push for efficiency, scalability, lower costs, and a quest for profits will significantly fuel things. Also, political forces like a lack of awareness of the history of labor, and the toxic view of unions in Silicon Valley, set in motion by libertarian white male dominance in venture capital, are establishing an extremely rich environment for this to rapidly move into realms that are out of the control of the average technology worker--it will happen so quickly we won't even notice it.

This is an area I am just getting started thinking on. It is a feeling that kept bubbling up as I was reviewing Amazon Web Services, and thinking about APIs, microservices, devops, and serverless at a high level. I'm not as knowledgeable on the history of labor as some folks around me, but I know enough to understand that automation sounds good in theory, but often benefits those in power, and rarely benefits those in the middle and at the bottom. I think many of us predominantly white male technologists are willfully blind to the role we are playing in our own commoditization, and in reducing our hard work to something that will barely sustain a living wage in the near future.


The State of California Drinking Water Program Repository

One of the side projects I work on regularly is focused on moving forward the conversation around water data. My next wave of work is targeting the State of California Drinking Water Program Repository, and helping make some of the valuable spreadsheets and CSV files more usable. Here are the six datasets I'm targeting for processing in coming weeks:

When it comes to public data there are few datasets that are going to be as valuable as water data in coming years. I'd put healthcare and education at the top, but water is only going to become increasingly important as the world continues to evolve. If you have an interest in water data let me know, as I could use some help processing these important data sets.


Oracle Acquiring Apiary

Oracle has purchased API design provider Apiary. I'm a big fan of what Apiary does, and what the team has accomplished. I don't trade in Silicon Valley currency, so I'm not going to congratulate the team on their exit. For me, it is just a reminder of how we can't have anything nice in this space. No matter how good your team is, or how good your product or services are, the thousand pound gorillas will always come into the room and fuck things up.

I am bummed about the acquisition of Apiary because they are essential to my API design origin story. Jakub, Z, and the Apiary team made API definitions about API design, not just API documentation. They pushed the conversation earlier in the API lifecycle, opening up the concept that API definitions could be used not just for documenting your APIs after they are live, but for design early on in the process, something that opened them up for use across every stop along the API life cycle. This resulted in API definitions being core to mocking, deployment, management, SDKs, testing, monitoring, and much, much more. #thankyou

Apiary was something I was proud to support in the API sector--Oracle is not. The negative mark this company has left in the space far outweighs anything good they could possibly do through acquisitions or any new solutions. With this post, I will get the usual waves of people telling me this is "just business", which is code for how people fuck each other over under their capitalist religion. So, go ahead and tell me how I live in a fantasy world, when in reality it is you who live in a fantasy world where it isn't about producing a good product, or service, it is just about your love of money, quest to be rich, and your willingness to allow for good things (and people) to be screwed over along the way.

Over the last decade, I've often been pretty naive about how money works, allowing myself to get overly excited about good people, products, interesting technology, and tooling. You won't find me making these mistakes in the future, even if I have to be a grumpy fucker to new startups that come along. I hate being in the business of telling energetic young startup founders about the realities of the world, throwing a wet blanket on their enthusiasm, but when I do jump on a call with new entrants I will be reminding them of the realities of how all of this works, and that at some point they will be fucking over their end-users and developers--it's just business.

I will be rewriting my history of APIs and will be removing emphasis on Apiary's role in the API design world. Isn't this what white dudes do? It is why it's called "his"-story. The impact they made will remain, and I will always be thankful for it, but I'll be damned if I give Oracle any traffic and exposure in my research. Another sad day for the API space, leaving me believing more and more in my Tech Gypsy philosophy that there is no way to have anything nice in the world of technology, and that we will all just have to learn how to survive in the cracks, because power will always creep in and ruin the platforms that we love. :-( ;-( ;-(

Fuck You Oracle!!! - Kin Lane, the API Evangelist.


Focusing On Common API Definitions, Schema, Scopes and Specifications

The API universe is rapidly expanding as more companies, organizations, institutions, and government agencies are sharing data, content, and algorithms using web APIs. It has expanded so much in the last year that I can't keep up with everything that is going on. I can't test new APIs, or the emerging services and tooling from providers who are targeting the space, fast enough--that is ok, I'm not sweating it.

However, I do have to prioritize and focus on the areas where I can make the biggest impact when it comes to my understanding, and when it comes to helping the API community. While I will still be maintaining a general awareness of all technologies in 2017, I'm going to be heavily focused on four areas:

  • API Definition - The machine-readable definition of an API interface, security and data models.
  • Data Schema - A JSON schema, MSON, or other machine-readable description of data schema.
  • OAuth Scope - Shared definitions of OAuth scopes in play for an API and its resources.
  • Web Specifications - Web concepts and specifications that make the web go around.
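
To make these four areas concrete, here is a minimal sketch of how they can come together in a single machine-readable artifact--an OpenAPI (Swagger 2.0) definition that describes the interface, carries a JSON Schema data model, and declares the OAuth scopes in play. The paths, schema, and scope names below are illustrative placeholders, not a real API.

    import json

    # A minimal, illustrative OpenAPI (Swagger 2.0) definition showing how the
    # API interface, the data schema (as JSON Schema), and shared OAuth scopes
    # all live in one machine-readable document. Paths, schema, and scope names
    # are hypothetical placeholders.
    definition = {
        "swagger": "2.0",
        "info": {"title": "Example API", "version": "1.0.0"},
        "paths": {
            "/records": {
                "get": {
                    "summary": "List records",
                    "security": [{"oauth2": ["records:read"]}],
                    "responses": {
                        "200": {
                            "description": "A list of records",
                            "schema": {
                                "type": "array",
                                "items": {"$ref": "#/definitions/Record"},
                            },
                        }
                    },
                }
            }
        },
        "definitions": {
            "Record": {
                "type": "object",
                "properties": {
                    "id": {"type": "string"},
                    "name": {"type": "string"},
                },
                "required": ["id"],
            }
        },
        "securityDefinitions": {
            "oauth2": {
                "type": "oauth2",
                "flow": "accessCode",
                "authorizationUrl": "https://example.com/oauth/authorize",
                "tokenUrl": "https://example.com/oauth/token",
                "scopes": {"records:read": "Read access to records"},
            }
        },
    }

    print(json.dumps(definition, indent=2))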

Code plays a significant role, and specific implementations are important, but for me, these four areas represent the core of the API space. Defining, employing, discussing, and sharing in these areas is how the API space will scale, and meet the needs of specific industries. Investing in these areas will make the API sector healthier and stronger, and make an impact when it comes to combatting much of the focus on investment in proprietary technology that works against things being truly interoperable and reusable.

In 2017 I'm focusing the majority of my work in these areas. These are not the areas where the money will be in the future, but they will be what enables all the areas that do make startups money and make a difference on the ground at small businesses and the enterprise. Investment in common definitions, schema, scopes, and specifications will be the #2 challenge for the API sector in coming years, right after security. I'm not sure what I can do on the security front, but I know I can make an impact when it comes to helping define common patterns of the API universe.


When We Lose Trust In The Reporting Numbers Our Providers Feed Us

What happens when we can't trust the numbers our service providers report to us? I personally do not stress over the analytics, traffic, views, and other numbers we are engineering our worlds to run by, but when you are paying for a service--I definitely have an opinion. Facebook recently had a series of misreporting events around their metrics, leaving us questioning the numbers we are fed by our service providers on a regular basis.

There is no way that we can be 100% sure our service providers are telling us the truth--we have to trust them. However, there are ways that API providers can be more transparent when it comes to the data behind the numbers. It is easy enough to open up the log files, and other data that went into calculating the numbers when operating a platform. Many of the advertising and other service providers in question have APIs to control the parts of their systems they want you to engage with--they just don't like pulling back the curtain, and showing what is truly going on.

If you do not fully trust your provider's numbers make sure and let them know publicly and privately that you would like access to the raw data behind them. APIs can be designed to provide access to any data, as well as provide observability into the algorithms driving any process. This is not a question of whether it's possible technically, it is a question of whether it is possible politically, and whether or not your service provider is willing to be transparent enough with you to develop the required trust needed for all of this to work properly.


Patent #20150363171: Generating Virtualized API From Narrative API Documentation

I like to pick worrisome patents from my API patent research and share them on my blog regularly. Last week I talked about Patent #US9300759 B1: API Calls With Dependencies, and today I want to talk about patent #US09471283: Generating virtualized application programming interface (API) implementation from narrative API documentation, which according to its abstract is:

A virtualized Application Program Interface (API) implementation is generated based upon narrative API documentation that includes sentences that describe the API, by generating programming statements for the virtualized API implementation based on parsing the narrative API documentation and generating the virtualized API implementation based upon the programming statements for the virtualized API implementation. The parsing of the narrative documentation may use a natural language parser and a domain-specific ontology for the API that may be obtained or created for the API. The virtualized API implementation may be generated using an API virtualizer.
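
To illustrate the general idea being claimed (and only the idea--this is not the patented method, which relies on a natural language parser and a domain-specific ontology), here is a naive toy sketch that pulls an HTTP verb and path out of a documentation sentence and emits a stub for a mock implementation:

    import re

    # Toy illustration only: extract an HTTP verb and path from a narrative
    # sentence, then emit a stub ("virtualized") implementation. The patent
    # describes a far more involved approach using NLP and a domain ontology.
    doc_sentence = "The service returns a JSON list of users when you GET /users."

    match = re.search(r"\b(GET|POST|PUT|DELETE)\s+(/\S+)", doc_sentence, re.IGNORECASE)
    if match:
        method, path = match.group(1).upper(), match.group(2).rstrip(".")
        stub_name = f"{method.lower()}_{path.strip('/').replace('/', '_')}"
        # Generate a "programming statement" for the mock implementation.
        stub = (
            f"def {stub_name}():\n"
            f"    # virtualized response for {method} {path}\n"
            f"    return []\n"
        )
        print(stub)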

I generally won't talk smack about folks filing patents, except I'm a pretty strong believer that the API interface, and the supporting elements that make an API do the API thing, should be left out of the patent game. All the elements present in this patent like virtualization, documentation, and narrative need to be open--this is how things work. Just when we are all beginning to get good at crafting useful APIs, and learning to speak in complete sentences with multiple APIs, we are going to patent the process of telling stories with APIs? Sorry, it just doesn't compute for me. I know this is what y'all do at bigcos, but it's counterproductive.

Instead of filing this patent I wish they would have taken their idea and launched it as an open source tool, then also launched an accompanying service running in the cloud, and gotten to work letting us craft narratives around our APIs. As I've said before, these patents really mean nothing, and it will all come down to keeping an eye on the court filings using the CourtListener API for any cases that are being litigated using API related patents.

It won't stop me from complaining, though. I just wish folks would spend their time on more productive things, instead of slowing down the world of APIs.


What Are The Goals Behind Launching An API Portal?

I was getting ready to do some work on a developer portal for a project I'm working on and I wanted to stop, step back, and try to define what the goals in launching this portal are. As the technologist on this project, it is easy for me to impose my own beliefs about why we are launching this portal (to publish documentation), but I feel it is important that we get the perspective of all stakeholders--so, I asked everyone involved what the goals were.

In the short term, the goals are to engage these groups around the API resources:

  • Engage Third-Parties - Bring in new usage, and stimulate existing usage, of data made available via APIs.
  • Internal Departments - Ensure the internal groups are also engaged, and part of the outreach.

While incentivizing the following among engaged stakeholders:

  • Awareness - Help make people aware of available resources, and what is being done with them.
  • Conversations - Stimulate focused conversations among stakeholders around data and APIs.
  • Issues - Focus on solving tangible issues that actually make an impact in the community.

Long term, the goal is to stimulate projects that matter, and can possibly bring in sustainable relationships and revenue that will help ensure the platform stays up and running, and serving the target audiences. It takes money to do APIs properly: keeping the servers running, evolving the code, active outreach and support in the community, and a passionate, engaged community to develop valuable projects.

I know all of this sounds overly simplified, but it is actually the thought process I'm applying to this project. I would say it is oversimplified to say that we are just launching a portal so that people will use our APIs. We need some basic goals about who we are reaching out to, and what we are trying to accomplish--we can evolve, and continue to define from there. When I lay out the outline for this project's portal I will make sure each element that I include speaks to these goals, or I will not put it in the version 1.0 of the portal. 

Next, I want to make sure my minimum viable API portal definition speaks to the widest possible audience, and not just to developers. Then I will get to work on setting up the first version of the portal for use in this API project, and we'll consider what's next.


Profiling Facebook ThreatExchange API

I'm spending some cycles on discovering what "cybersecurity" or "security" API solutions are out there, specifically looking at threat information related to operating on the web. First up on the list is Facebook's ThreatExchange API, and I wanted to go through and break down what they offer via their API as I work to define an OpenAPI Spec, and their overall API operations as I populate an APIs.json file. This process helps me better understand what Facebook is offering in this area, as well as producing a machine readable definition that I can use across the rest of my research. 
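
For reference, this is roughly the shape of the APIs.json index I assemble while profiling a platform like this. The structure follows the APIs.json format, but the URLs below are placeholders for illustration, not confirmed links:

    import json

    # A rough sketch of the APIs.json index entry I build while profiling a
    # platform. The structure follows the APIs.json format; the URLs here are
    # placeholders for illustration, not confirmed links.
    apis_json = {
        "name": "Facebook ThreatExchange",
        "description": "Index for my profile of the Facebook ThreatExchange API.",
        "url": "https://example.com/apis.json",
        "apis": [
            {
                "name": "ThreatExchange API",
                "description": "Share and consume threat data via the Facebook Graph API.",
                "humanURL": "https://example.com/threat-exchange-docs",
                "baseURL": "https://graph.facebook.com",
                "properties": [
                    {
                        "type": "x-openapi",
                        "url": "https://example.com/threatexchange-openapi.json",
                    }
                ],
            }
        ],
    }

    print(json.dumps(apis_json, indent=2))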

So, what is Facebook ThreatExchange?

Learn about threats. Share threat information back. Everyone becomes more secure. Most threat intelligence solutions suffer because the data is too hard to standardize and verify. Facebook created the ThreatExchange platform so that participating organizations can share threat data using a convenient, structured, and easy-to-use API that provides privacy controls to enable sharing with only desired groups.

  • Scale your intelligence - Threats like malware and phishing will often attack many targets. Safeguard yourself using shared intelligence from other participants.
  • A better way to share - Utilize a structured platform to send and receive information about threats.
  • Join ThreatExchange - Learn about threats. Share threat information back. Everybody becomes more secure. *in beta

I want to understand Facebook's motivations behind doing the ThreatExchange API, and better understand the technical, business, and political aspects of providing a platform like this. When profiling a platform I always start with the endpoints, as I feel like they give the most honest accounting of what is going on.

Next, I look at the underlying data and object model, and since Facebook uses their Graph model for their ThreatExchange API, it is pretty easy to pick up and get running with things--here are the graph objects used as part of the ThreatExchange:

Facebook also allows for "types" to be applied across the ThreatExchange objects, adding additional dimensions to the objects available via the API:

I was pleased to see webhook infrastructure available, making things a two-way street, and a little more responsive and real-time. I would also consider a streaming technology at this layer eventually, making it easier to manage at scale.

  • Webhooks  - ThreatExchange Webhooks allow you to receive information in real-time for MalwareAnalyses, ThreatDescriptors, and MalwareFamilies.
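
As a rough sketch, a webhook receiver for these real-time updates can be as simple as an endpoint that accepts the JSON payload. The route, and the subscription and verification details, are illustrative and would need to follow the ThreatExchange webhook documentation:

    from flask import Flask, request

    app = Flask(__name__)

    # Minimal webhook receiver sketch: accept the JSON payload pushed by the
    # platform and hand it off for processing. Route name and payload handling
    # are illustrative; subscription and verification follow the provider docs.
    @app.route("/threatexchange/webhook", methods=["POST"])
    def receive_update():
        event = request.get_json(force=True, silent=True) or {}
        print("received webhook payload:", event)  # log or queue the update
        return "", 200

    if __name__ == "__main__":
        app.run(port=8080)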

It was also good to see privacy considerations right out of the box. If you are going to get other companies to participate at this level and share real-world threat information, there have to be granular controls to help mediate the politics of all of this. The Facebook approach also doesn't just deal with identity, groups, and access, it also dictates the sharing of information--I am guessing a logging layer at this level will be needed in the future, with an API for it as well, to enable 3rd party auditing.

  • Privacy Controls - All submissions to the ThreatExchange API allow for limiting the visibility of any Malware and ThreatDescriptor objects. Currently, ThreatExchange supports three levels of visibility: allow all members; allow a ThreatPrivacyGroup;  and allow a whitelist of specific members. The desired privacy setting on an object is specified by the values at the time of a create or edit submission to the API. Privacy settings can also be changed retroactively for data you've already submitted.
  • Reshare - All submissions to the ThreatExchange API allow for defining how the data can be re-shared by its recipients. The level of re-sharing is applied via the share_level attribute.

This is where this API becomes an exchange. With the required privacy and sharing controls in place, the submitting of new data, connecting the dots between data, and evolving the history of everything in real time can occur, allowing trusted 3rd parties to not just access data, but confidently share it with the ThreatExchange.

  • Submitting New Data - You may submit data to the graph via an HTTP POST request to the following URL: https://graph.facebook.com/v2.8/threat_descriptors (see the sketch after this list)
  • Submitting Connections - ThreatExchange supports creating connections (aka edges) between ThreatIndicator objects to express relationships. Examples of when this can be useful are for describing URL re-direct chains or domain to IP address relationships.
  • Editing Existing Data - The ThreatExchange API allows for editing existing ThreatIndicator objects. As with all Facebook Graph APIs, editing is performed via an HTTP POST request to the object's unique ID URL.
  • Deleting Data - ThreatExchange currently supports true deletes only for connections between ThreatIndicator objects. Examples of when this can be useful are for describing URL re-direct chains or domain to IP address relationships.
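
Here is a minimal sketch of what submitting a new threat descriptor to that endpoint might look like. The field names and values are illustrative, and a real call requires a valid access token and the exact fields defined in the ThreatExchange documentation:

    import requests

    ACCESS_TOKEN = "APP_ID|APP_SECRET"  # placeholder credentials

    # Illustrative submission of a new threat descriptor to the endpoint above.
    # Field names and values are examples; check the ThreatExchange docs for
    # the authoritative list of fields and accepted values.
    response = requests.post(
        "https://graph.facebook.com/v2.8/threat_descriptors",
        data={
            "access_token": ACCESS_TOKEN,
            "indicator": "evil-domain.example",
            "type": "DOMAIN",
            "description": "Example domain observed in a phishing campaign",
            "share_level": "GREEN",     # re-share level, per the Reshare control
            "privacy_type": "VISIBLE",  # visibility, per the Privacy Controls
            "status": "MALICIOUS",
        },
        timeout=30,
    )
    print(response.status_code, response.json())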

After that, there are some more support level elements present. I track on these elements as part of my indexing with an APIs.json, so that anyone can easily find the code, examples, and other supporting resources without clicking around, and digging through the documentation.

There are more general elements of the Facebook Developer community present as well, but this provides a nice roundup of all the elements that make up the Facebook ThreatExchange API, providing one possible blueprint for how we share threat information associated with operating any platform on the web. I'm going to be looking at other solutions in this same space, but I wanted to profile Facebook's effort in this area first, as they probably have the most insight and investment in this area. 

What Are Facebook's Motives With ThreatExchange?
One thing I want to learn more about is why Facebook would want to do the ThreatExchange. I'm sure they truly want to share the wealth of data they have on threats, but I'm sure they equally want to incentivize other tech giants to share their data as well, sweetening the entire pot. I would add that they might also be doing this to alleviate any future regulatory pressures that may unfold around cybersecurity legislation. All we can do is monitor what is going on, and see how much Facebook and others invest into ThreatExchange, before we know the full extent of the motivations behind the platform.

A Possible Blueprint For A Wider Threat Exchange Model 
This post is part of my research into security and cybersecurity. As with the other areas of my research I am looking for the common building blocks of how APIs are being used to understand, map out, defend, and share threat information. I'm happy to see Facebook being proactive in this area, but this feels like it needs to become an open standard and wider blueprint that can be operated by an independent party, with trusted arrangements from each provider regarding what is being submitted and shared--for it to work properly. I have a lot more research to conduct before I make this determination, but it is where I'm headed with my thinking in this area. 

Researcher and Journalist Access To Data On Exchange
I applied for access to the Facebook ThreatExchange API, but I'm unsure if I will be given access, being that I am just a blogger. This opens up a wider question, though, regarding opening up this valuable information to researchers, journalists, and 3rd party auditors. I talked about the need for API access tiers and potentially service level agreements for researchers and journalists previously, and threat information is definitely an area where we need to deeply consider who has access to this important data, in a similar way. We need to ensure the privacy of companies, organizations, institutions, and government agencies is protected, but we also need to allow for a wider understanding of this problem to be established, with more eyes on the data.

Further Assessment of APIs and Threat Information Solutions
I am just getting going with this research, and it will take some time for me to paint a complete picture of the landscape when it comes to online threat data, and how APIs are being used. I'll publish each profile as I complete them, and eventually, I'd like to have a common definition of all the API paths, and underlying schema being employed as part of the leading threat data sharing platforms. If we are going to scale the effort to address this growing problem, we are going to need a wealth of APIs in operation, and ensure they are all speaking a common dialect. I need to develop a better understanding of the politics around the operation of a threat data API, but I'm guessing it is something we don't want in the hands of any single, leading tech provider. 


No Innovation Around Terms of Service Reveals True Motives

Silicon Valley startups and entrepreneurs love to point out that they are trying to make the world a better place. Over a 25+ year career, I have fallen for the belief that I was improving a situation through technology. Hell, I still do this regularly as the API Evangelist, stating that a certain approach to opening up access to data, content, and algorithms can make things better, when in numerous situations it will not. I walk a fine line with this and I hope that I'm a little more critical about where technology should be applied, and focus primarily on making existing technology more accessible using APIs--not starting new ones.

When you are critical of technology in the current climate, there are plenty of folks who like to push back on you, leaning on the fact that they are trying to make the world a better place. I'm not sure where this line of bullshit first began, but it is something I should look into. I mean, when you hang out with enough entrepreneurs you really begin to see through the bullshit and hype, and you know that this is all about them selling you something, and then selling you and your data as part of an exit via acquisition or going public. It rarely ever has anything to do with making the world a better place, with what was promised as part of the marketing, or with helping the average end-user.

This is where my entrepreneur friends have stopped reading and lean on the fact that they actually believe they are trying to make the world a better place, and their firm belief that Kin is an annoying hippie. You know I love you. You are a special snowflake. But, you are a minority, and I am talking about the other 95% of the system you choose to be willfully ignorant of. If you want some evidence of this in the wild, take a look at all the world saving, innovative, revolutionary startups out there and ask how many are dedicated to terms of service.

You know, those little legal documents we agree to for our usage of every device in our life, and every single application we use in our professional and personal worlds. Terms of service touch ALL of our lives, and act as a central force throughout our days--kind of like gravity, except it's a corporate force, not anything natural--pushing down on everything we do. As important as terms of service are, and as big a role as they play in our lives, where is the innovation around these legal documents? You know, the kind that assists the average citizen in making sense of all of this bullshit and makes their life better?

There are Terms of Service Didn't Read, Docracy, and TOSBack, but they just don't have the resources to tackle this properly. There are some service providers who do a good job of helping users make sense of their own terms of service, but there is NO INNOVATION when it comes to terms of service. There is nothing that serves the end-users over the platforms. With all of the innovation going on out there why isn't there more investment, and solutions in the area of terms of service? I mean, if we are trying to make our user's lives easier with technology, and not more complicated, why isn't there more technology to make sense of this legalese that is governing our lives?

Hint, it is because we aren't out for the greater good. With all the money flowing around out there, I just can't believe there isn't the money to do this right. Provide a suite of modern tooling and a wealth of legal resources to tackle this problem properly. If nothing else, let's throw TOS Didn't Read, Docracy, and EFF's TOSBack more money. Maybe there should be a "guilt tax" imposed on every terms of service owner to pay for a plain English, human translation of the legal documents that guide their platforms. Maybe there should be a machine-readable schema enforced at the regulatory level, making sure a suite of tools can process, and make sense of TOS changes in real-time. 

Maybe some day we will stop just saying we are trying to help the world with our technology, and actually invest in what truly does help.


Requiring SSL For All API Calls

This is one of those regular public service announcements that if at all possible, you should be requiring SSL for all your API calls. I recently got an email from the IBM Watson team telling me that they would be enforcing encryption on all calls to the Alchemy API in February.

SSL is something I've started enforcing on my own internal APIs. I do not have wide usage of my APIs by third-party providers, but I do have a variety of systems making calls to my APIs, transmitting some potentially sensitive information--luckily nothing too serious, as I'm just a simple API Evangelist.

Encryption is an area I research regularly, hoping to stay in tune (as much as I can) with where discussions are going when it comes to encryption and API operations. Much of it doesn't apply to the average API provider, but requiring encryption for API calls, and emailing your developers when and if you do begin enforcing it, is what I'd consider an essential building block for all API providers.
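
For API providers wondering what enforcement looks like in practice, here is a minimal sketch of rejecting any API call that does not arrive over HTTPS at the application layer. In production this is usually handled at the load balancer or web server, with a check like this as a backstop; the framework and route are illustrative:

    from flask import Flask, abort, request

    app = Flask(__name__)

    # Sketch of application-layer enforcement: refuse any call that did not
    # arrive over HTTPS, honoring the forwarded protocol header when running
    # behind a proxy or load balancer.
    @app.before_request
    def require_ssl():
        forwarded_proto = request.headers.get("X-Forwarded-Proto", "")
        if not (request.is_secure or forwarded_proto == "https"):
            abort(403, description="All API calls must be made over HTTPS.")

    @app.route("/api/resource")
    def resource():
        return {"status": "ok"}

    if __name__ == "__main__":
        app.run()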

I'll keep crafting a unique blog post each time I receive one of these emails from the APIs I depend on. Hopefully some day soon, all APIs will be SSL by default.


IFTTT vs Zapier vs DataFire

Integration Platform as a Service (iPaaS) solutions are something I've been tracking on for a while, and an area I haven't seen too much innovation in, except by Zapier for most of that time. I'm a big fan of what IFTTT enables, but I'm not a big fan of companies who build services that depend on APIs but do not offer APIs in turn, so you don't find me highlighting them as an iPaaS solution.

Instead, you'll find me cheering for Zapier, who has an API, and even though I wish they had more APIs, I am grateful they are paying it forward a little bit. I wish we had better solutions, but the politics of API operations seems to slow the evolution of iPaaS, usually leaving me disappointed.

That was until recently when some of my favorite API hackers released DataFire:

DataFire is an open source integration framework - think Grunt for APIs, or Zapier for the command line. It is built on top of open standards such as RSS and Open API. Flows can be run locally, on AWS Lambda, Google Cloud, or Azure via the Serverless framework, or on DataFire.io.

"DataFire natively supports over 250 public APIs including: • Slack • GitHub • Twilio • Trello • Spotify • Instagram • Gmail • Google Analytics • YouTube, as well as MongoDB, RSS feeds, and custom integrations." They have a sample flows available as an individual Github repositories. Integrations can be added by the URL of an Open API (Swagger) specification or an RSS feed, you can also specify --raml, --io_docs, --wadl, or --api_blueprint.

DataFire is new, so it has a lot of maturing to do as an API framework, but it has EVERYTHING that iPaaS solutions should have at their core, in my opinion. It's API definition-driven, it's open source, and there is a cloud version that any non-developer user can put to use. DataFire is encouraging everyone to share each of their flows as machine readable templates, each as their own Github repository, so that anyone can fork, modify, and put them to work. #IMPORTANT

This is the future of iPaaS. There is lots of money to be made in the sector, empowering average business, professional, and individual users when it comes to managing their own bits on the web--if current providers get out of the way. The problem with current solutions is that they work too hard to capture the exhaust of these workflows and limit their execution to specific walled garden platforms. DataFire acknowledges that these flows will need to be executed across all the leading cloud providers, orchestrated in serverless environments, as well as offered as a more managed level of service in a single, premium cloud edition. 

DataFire is the iPaaS solution I've been waiting to see emerge, and I will be investing time into learning more about how it works, developing integrations and flows, and telling stories about what others are doing with it. DataFire and DataFire.io need a lot of polishing before they are ready for use by the average, non-technical IFTTT or Zapier user, but I don't think it will take much to get there pretty quickly. I'm stoked to finally have an iPaaS solution that I can get behind 100%. 


A Missed Opportunity With The Medium API

In addition to using the news of Medium's downsizing as a moment to stop and think about who owns our bits, I wanted to point out what a missed opportunity the Medium API is. Having an API is no guarantee of success, and after $132M in 3 Rounds from 21 Investors, I'm not sure an API can even help out, but it is fun to speculate about what might be possible if Medium had a robust API in operation.

Medium has an API, but it is just a Github repository, with reference to a handful of paths allowing you to get details on yourself, the publications you are part of, and post entries to the site. There are no APIs for allowing me to get the posts of other users, or publications, let alone any of the analytics, or traffic for this. I'm guessing there is no API vision or champion at Medium, which results in the simple, restrictive API we see today.

Many media and content companies see APIs as giving away all the value they are trying to monetize, and are unaware of the control that modern approaches to API management bring to the table. Many people see the pain that other API pioneers like Twitter have suffered, and want to avoid the same problems, completely unaware that many of Twitter's ecosystem problems were Twitter-induced, and not inflicted by 3rd party developers.

If Medium had a proper developer portal, a complete set of API paths, proper OAuth controls, and other API management tooling, they could open up innovation in content delivery, publishing, analytics, visualizations, voice, bots, and the numerous other areas where APIs are changing how we consume, create, and engage with information. I get that control over the user experience is key to the Medium model, but there are examples of how this can be done well, and still have an API. The best part is that it only costs you the operation of the API.

I do not think more money will save Medium. I think they have to innovate. I think part of this innovation needs to come from the passionate base of users they have already developed. I've seen Medium carefully plot out a path forward when it comes to their trusted partners, and there is no reason this type of quality control can't be replicated when it comes to API consumers, as well as the applications and integrations that they develop. The Medium API just seems like a really huge missed opportunity to me, but of course, I'm a little biased.


Why I Still Believe In APIs--The 2017 Edition

As I approach my seventh year as the API Evangelist and find myself squarely in 2017, I wanted to take a moment to better understand, and articulate, why I still believe in APIs. To be the API Evangelist I have to believe in this, or I just can't do it. It is how my personality works--if I am not interested in, or do not believe in something, you will never find me doing it for a living, let alone as obsessively as I have delivered API Evangelist.

First, What Does API Mean To Me?
There are many, many interpretations, and incarnations, of "API" out there. I have a pretty wide definition of what an API is, one that spans the technical, business, and politics of APIs. API does not equal REST, although it does employ the same Internet technologies used to drive the web. API is not the latest vendor solution, or even a standard. The web delivers HTML for humans to consume in the browser, and web APIs deliver machine-readable media types (XML, JSON, Atom, CSV, etc.) for use in other applications. When I say applications, I do not just mean web, mobile, and device applications--I mean other applications, as in "the action of putting something into operation". An API has to find harmony between the technical, business, and political aspects of API operations, and strike a balance between platform, 3rd party developer, and end-user needs--with every stakeholder being well informed along the way.

I Still Believe In My Early API Vision
When I close my eyes I still believe in the API vision that captured my attention in 2005 using the Delicious API, again in 2006 with the Amazon S3 and EC2 APIs, and with the Twitter API in 2007. Although today, I better understand that a significant portion of this early vision was very naive, and too trusting in the fact that people (API providers and consumers) would do the right thing with APIs. APIs use Internet technology to make data, content, and algorithms more accessible and observable, securely using the web. APIs can keep private information safe, and ensure only those who should have access do. I believe that an API-first approach can make companies, organizations, institutions, and government agencies more agile, flexible, and ultimately more successful. APIs can bring change, and make valuable resources more accessible and usable. When done right.

APIs Are Not Going Away Or Being Replaced
Yes, many other technologies are coming along to evolve how we exchange data, content, and algorithms on the web, but doing this as a platform, in a more open, transparent, self-service, and machine-readable way is not going anywhere. I try not to argue in favor of any technological dogma like REST, Hypermedia, GraphQL, and beyond, but I do try to be aware of, and make recommendations about, the benefits and consequences along the way. Having a website or mobile application for your business, organization, institution, government agency, and oftentimes for individuals, isn't going to go away anytime soon. Neither will making the same data, content, and algorithms available for use in applications in a machine readable way, for use by 3rd party groups using Internet technology. The term API has gained significant mindshare in the tech space, enjoys use as an acronym in the titles of leading news publications, and has taken root in the vocabulary of average folks, as well as the entrepreneurs, developers, and technology companies who helped popularize the term. 

APIs Are Not Good, Nor Bad, Nor Neutral--People And Companies Are
APIs get blamed for a lot of things. They can go away at any time. They are unstable. They are not secure. The list goes on and on, and many also like to think that I blindly evangelize that APIs will make the world better all by themselves--I do not. APIs are a reflection of the individuals, companies, organizations, institutions, and agencies that operate them. In the last seven years, I have seen some very badly behaved API providers, consumers, and companies who are selling services to the sector. Going forward, I will never again be surprised by how badly behaved folks can be when using APIs, and by the appetite for an API-driven surveillance economy that is fueled by greed and fear. 

API Specifications, Schema, and Scopes Are The Most Critical Part
I track on over 75 stops along my definition of the API life cycle, and many of them are very important, but my API definition research, which encompasses the world of API specifications, schema, and scopes, will play the most significant role in the future of web, mobile, and connected device technology. This is why I will be dedicating the majority of my bandwidth to this area in 2017, and beyond. Other areas will get attention from me, but API specifications, schema, and scopes touch on every other aspect of the API sector, as well as almost EVERY business vertical driving our economy today. Industry level standards like PSD2 for the banking industry will become clearer in coming years, as a result of the hard work that is going on right now with API definitions.

Help Ensure The Algorithms That Are Increasingly Impacting Our Lives Are Observable
APIs provide access to data and content, but they also provide some observability into the black box algorithms that are increasingly governing our world. While APIs can't provide 100% visibility into what algorithms do, or do not do, they can provide access to the inputs and the outputs of the algorithms, making them more observable. When you combine this with modern approaches to API management, it can be done securely, and in a way that considers end-user, developer, and platform interests. I'm going to keep investing in fine tuning my argument for APIs to be used as part of artificial intelligence, machine learning, and the other classifications of algorithms that are being put to use across most business sectors. It's a relatively new area of my API research, but something I will invest more time, and resources into, to help push the conversation forward. 

More API Storytelling Is Needed, And The Most Rewarding Part Of What I Do
I've done an OK job at instigating other API service providers, API operators, integrators, and individuals to tell their story. I feel that my biggest impact on the space has been producing the @APIStrat conference with my partner in crime Steve Willmott at 3Scale / Red Hat, which is focused on giving people across the API sector a diverse, and inclusive place to tell their story, on stage and in the hallways. Next, I'd say my regular storytelling on my blog, like my evergreen post on the secret to Amazon's success, has had the biggest impact, while also being the most rewarding and nourishing part for me personally. I've heard stories about my Amazon blog post being framed on the wall in a bank in Europe, baked into the home page of the IT portal for government agencies, and many other scenarios. Stories matter. We need more of them. I am optimistic regarding much of the new writing I'm seeing on Medium, but this storytelling being mostly isolated to Medium worries me.

Keep On, Keep'n On With API Evangelist The Best I Can
I still believe in APIs. Over the last seven years, my belief in what APIs can do has not diminished. My belief in what people are capable of doing with them has diminished tremendously. I enjoy staying in tune with what is going on, and trying to distill it down into something people can use in their own API operations. I think the biggest areas of concern for the API life cycle in coming years will be in the areas of API definitions, security, terms of service, privacy, discovery, intellectual property, and venture capital.

I just needed to refresh my argument of why I still believe in APIs. Honestly, 2015 and 2016 really stretched my faith when it came to APIs. Once I realized it was primarily a crisis of faith in humans and not in APIs, I was able to find a path forward again with API Evangelist. Really, I couldn't ask for a better career focus. It keeps me reading, learning, and thinking deeply about something that touches all of us, and possesses some really meaningful areas that could make an actual impact on people's lives. Things like Open Referral, my work on FAFSA, EPA, National Park, Census, Energy, API Commons, and APIs.json, keep me feeling that APIs can make a positive impact on how our world works.

In 2017 I don't think I need to spend much time convincing people to do APIs. They are already doing this. There is more going on than I can keep track of. I think I need to focus on convincing people to do APIs as openly and observably as they possibly can. Push for a balanced approach to not just the technology, but the business and legal side of platform operations, in a way that protects the platform's operations, but also in a way that is looking out for 3rd party developer, and end-user interests. I still believe in APIs, but only if they possess the right balance which I talk about regularly, otherwise they are just another approach to technology that we are letting run amok in our lives.


Using An OpenAPI Spec As Central Truth In Stakeholder Discussions

I am working with Open Referral to evolve the schema for the delivery of human services, as well as helping craft a first draft of the OpenAPI Spec for the API definition. The governing organization is looking to take this to the next level, but there are also a handful of the leading commercial providers at the table, as well as other groups closer to the municipalities who are implementing and managing Open211 human service implementations.

I was working with Open Referral on this before checking out this last summer, and would like to help steward the process, and definition(s), forward further in 2017. This means that we need to speak using a common language when hammering out this specification, and be using a common platform where we can record changes, and produce a resulting living document. I will be borrowing from existing work I've done on API definitions, schema, and scopes across the API space, and putting together a formal process designed specifically for the very public process of defining, and delivering human services at the municipal level.

I use OpenAPI Spec (openapis.org) as an open, machine readable format to drive this process. It is the leading standard for defining APIs in 2017, and now is officially part of the Linux Foundation. OpenAPI Spec provides all stakeholders in the process with a common language when describing the Open Referral in JSON Schema, as well as the surface area of the API that handles responses & requests made of the underlying schema.

I have an OpenAPI Spec from earlier work on this project, with the JSON version of the machine-readable definition, as well as a YAML edition--OpenAPI Spec allows for JSON or YAML editions, which helps the format speak to a wider, even less technical audience. These current definitions are not complete agreed upon definitions for the human services specification, and are just meant to jumpstart the conversation at this point.

OpenAPI Spec provides us with a common language to use when communicating around the API definition and design process with all stakeholders, in a precise, and machine readable way. OpenAPI Spec can be used to define the master Open Referral definition, as well as the definition of each individual deployment or integration. This opens up the opportunity to conduct a "diff" between the definitions, showing what is compatible, and what is not, at any point in time.
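
As a small sketch of what that "diff" can look like in practice, here is a script that compares the paths in the master Open Referral definition against an individual implementation's definition--the file names are illustrative:

    import yaml  # pip install pyyaml

    # Compare the paths in the master Open Referral definition against an
    # individual implementation's definition. File names are illustrative.
    with open("open-referral-master.yaml") as handle:
        master = yaml.safe_load(handle)
    with open("city-implementation.yaml") as handle:
        implementation = yaml.safe_load(handle)

    master_paths = set(master.get("paths", {}))
    implementation_paths = set(implementation.get("paths", {}))

    print("Missing from implementation:", sorted(master_paths - implementation_paths))
    print("Extra in implementation:", sorted(implementation_paths - master_paths))
    print("Shared paths:", sorted(master_paths & implementation_paths))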

The platform I will be using to facilitate this discussion is Github, which provides the version control, "diff", communication, and user interactions that will be required throughout the lifecycle of the Open Referral specification, allowing each path, parameter, response, request, and other elements to be discussed independently, with all history present. Github also provides an interesting opportunity for developing other tools, like I have for annotating the API definition as part of the process.

This approach to defining a common data schema and API definition requires that all stakeholders involved become fluent in OpenAPI Spec, and JSON Schema, but it is something that I've done successfully with other technical, as well as non-technical teams. This process allows us all to be on the same page with all discussion around the Open Referral API definition and schema, with the ability to invite and include new participants in the conversation at any time using Github's existing services.

Once a formal draft API specification + underlying JSON schema for Open Referral is established, it will become the machine readable contract and act as a central source of truth regarding the API definition as well as the data model schema. It is a contract that humans can follow, as well as be used to drive almost every other stop along the API life cycle like deployment, mocking, management, testing, monitoring, SDKs, documentation, and more.

This process is not unique to Open Referral. I want to be as public as possible with the process, to help other people who are working to define data schema understand what is possible when you use APIs, OpenAPI Spec, JSON Schema, and Github. I am also looking to reach the people who do the hard work of delivering human services on the ground in cities, and help them understand what is possible with Open Referral. Some day I hope to have a whole suite of server-side and client-side tooling developed around the standard, empowering cities, organizations, and even commercial groups to deliver human services more effectively.


The Google Baseline For A User Account Area

I have a minimum definition for what I consider to be a good portal for an API, and was spending some time thinking about a baseline definition for the API developer account portion of that portal, as well as potentially any other authenticated, and validated platform user. I want a baseline user account definition that I could use as a base, and the best one out there, off the top of my head, would be from Google.

To support my work I went through my Google account page and outlined the basic building blocks of the Google account:

  • Sign-in & Security - Manage your account access and security settings
    • Signing in to Google - Control your password and account access, along with backup options if you get locked out of your account.
      • Password & sign-in method - Your password protects your account. You can also add a second layer of protection with 2-Step Verification, which sends a single-use code to your phone for you to enter when you sign in. 
        • Password - Manage your password.
        • 2-Step Verification - Manage 2-Step verification.
        • App Passwords - Create and manage application passwords.
      • Account recovery options - If you forget your password or cannot access your account, we will use this information to help you get back in.
        • Account Recovery Email - The email to send recovery instructions.
        • Account Recovery Phone - The phone number to send recovery instructions.
        • Security Question - A secret question to use as verification during recovery.
    • Device activity & notifications - Review which devices have accessed your account, and control how you want to receive alerts if Google thinks something suspicious might be happening.
      • Recent security events - Review security events from the past 28 days.
      • Recently used devices - Check when and where specific devices have accessed your account.
      • Security alerts settings - Decide how we should contact you to let you know of a change to your account’s security settings or if we notice something suspicious.
    • Connected apps & sites - Keep track of which apps and sites you have approved to connect to your account and remove ones you no longer use or trust.
      • Apps connected to your account - Make sure you still use these apps and want to keep them connected.
      • Saved passwords - Use Google Smart Lock to remember passwords for apps & sites you use on Chrome & Android.
  • Personal Info & Privacy - Decide which privacy settings are right for you
    • Your personal info - Manage this basic information — your name, email, and phone number.
    • Manage your Google activity - You have lots of tools for controlling and reviewing your Google activity. These help you decide how to make Google services work better for you.
      • Activity controls - Tell Google which types of data you’d like to save to improve your Google experience.
      • Review activity - Here’s where you can find and easily review your Google activity.
    • Ads Settings - You can control the information Google uses to show you ads.
    • Control your content - You are in control of the content in your Google Account, even if you stop using Google products or decide to delete your account altogether.
      • Copy or move your content - You can make a copy of the content in your account at any time, and use it for another service.
      • Download your data - Create an archive with a copy of your data from Google products.
      • Assign an account trustee - Approve a family member or friend to download some of your account content in the event it is left unattended for an amount of time you've specified.
      • Data Awareness - We want you to understand what data we collect and use.
  • Account Preferences - Manage options for language, storage, and accessibility. Set up your data storage and how you interact with Google.
    • Language & Input Tools - Set Google services on the web to work in the language of your choice.
    • Accessibility - Adjust Google on the web to match your assistive technology needs.
    • Your Google Drive storage - Your Google account comes with free Google Drive storage to share across Gmail, Google Photos, Google Docs and more. If this isn't enough, you can add more storage here.
    • Delete your account or services - If you are no longer interested in using specific Google services like Gmail or Google+, you can delete them here. You can even delete your entire Google Account.
      • Delete a Google service - A Google Account offers many services. Some of these services can be deleted from your account individually.
      • Delete your Google Account - You're trying to delete your Google Account, which provides access to various Google services. You'll no longer be able to use any of those services, and your account and data will be lost.

These are all building blocks I will add to my API management research, with an overlap with my API portal research. I'm not sure how many of them I will end up recommending as part of my formal guide, but it provides a nice set of things that seem like they SHOULD be present in all online services we use. Google also had two other tools present here, that overlap with my security and privacy research:

  • Security Checkup - Protect your account in just a few minutes by reviewing your security settings and activity.
  • Privacy Checkup - Take this quick checkup to review important privacy settings and adjust them to your preference.

I am going to be going through Google's privacy and security sections, grabbing any relevant building blocks that providers should be considering as part of their API operations as well. For now, I'm just going to add this to the list of things I think should be present in the accounts of third party platform users, whether they are a developer, partner, or otherwise.

I would also like to consider what other providers, like Amazon, Dropbox, and other leading platforms, offer in the way of account features I'd like to emulate. I would also like to take another look at what API management providers like 3Scale offer in this area. Eventually, I want to have an industry guide that API providers can follow when thinking about what they should be offering as part of their user accounts.


Your State Issued ID Is Required To Signup For This Online Service

I am evaluating Shutterstock as a new destination for some of my photos and videos. I've been a Shutterstock user for their stock images, but I'm just getting going as a publisher. I thought it was worth noting that as part of their sign up process they require me to upload a copy of my state issued identification before I can sell photos or images as a Shutterstock publisher.

This is something I've encountered with other affiliate, partner, and verified solutions. I've also had domains expire, go into limbo, and had to fax in or upload my identification. It isn't something I've seen with many API providers yet, but I'm guessing it will be something we'll see more of, with API providers further locking down their valuable resources. 

I am not sure how I feel about it being a regular part of the partner and developer validation process--I will have to think about it more. I'm just adding to the list of items I consider as part of the API management process. It makes sense to certify and establish trust with developers, but I'm not 100% sure this is the way to do it in the digital age. IDK, I will consider more, and keep an eye out for other examples of this with other API providers.

The Shutterstock publisher program isn't necessarily connected directly to the API, but once I'm approved I will be uploading, and managing my account via their API, so it is a developer validation process for me. The topic of developer validation and trust keeps coming up in other discussions for me, and with the increasing number of APIs we are all developing with, it seems like we are going to need a more streamlined, multi-platform, API-driven solution to tackle this. 

For me, it would be nice if this solution was associated with my Github account, which plays a central role in all of my integrations. When possible, I create my developer accounts using my Github authentication. It would be nice if I had some sort of validation and trust ranking based upon my Github profile, something that would increase the more APIs I use and establish trust with.
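To make that idea a little more concrete, here is a rough sketch of the kind of Github-based trust signal I have in mind, pulling a public profile from the Github REST API and turning account age, repositories, and followers into a naive score. The weighting is entirely made up for illustration--an actual solution would need far more signals than this.

```python
from datetime import datetime, timezone

import requests  # assumes the requests package is available

def github_trust_score(username: str) -> float:
    """Pull a public Github profile and turn it into a naive trust score."""
    resp = requests.get(f"https://api.github.com/users/{username}", timeout=10)
    resp.raise_for_status()
    profile = resp.json()

    # Account age in years, from the created_at timestamp (e.g. "2011-01-25T18:44:36Z").
    created = datetime.fromisoformat(profile["created_at"].replace("Z", "+00:00"))
    age_years = (datetime.now(timezone.utc) - created).days / 365

    # Invented weighting: older, more active, more followed accounts rank higher.
    return round(age_years * 2 + profile["public_repos"] * 0.5 + profile["followers"] * 0.1, 2)

print(github_trust_score("octocat"))
```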

I will file this away under my API management, authentication, and partner research, and will look for other examples of it in the wild--especially at the partner layer. Certifying that developers are truly human, or a valid corporate entity, seems like something that will only grow in the bot-infested Internet landscape we are building.


Intercom Providing Docker Images Of Their SDKs

I regularly talk about the evolving world of API SDKs, showcasing what API service providers like APIMATIC are up to when it comes to orchestration, integration, and other dimensions of providing client code for API integrations. A new example of this that I have found in the wild is from the communication and support API platform Intercom, with their publishing of Docker images of their API SDKs. This overlaps my SDK research with the influence that containerization is having on the world of providing and integrating with APIs.

Intercom provides Docker images for their Ruby, Node, Go, and PHP API SDKs. It's a new approach to making API code available to API consumers that I haven't seen before, (potentially) making their integrations easier and quicker. I like their approach to providing the containers, and specifically the fact that they are looking for feedback on whether or not having SDK Docker containers offers any benefit to developers. I'm guessing this won't benefit all API integrators, but for those who have successfully adopted a containerized way of life, it might streamline the process and overall time to integration.

I just wanted to have a reference on my blog to their approach. I'll keep an eye on their operations, and see if their SDK Docker images become something that gets some traction when it comes to SDK delivery. Since they are sharing the story on their blog, maybe they'll also provide us with an update in a couple of months regarding whether developers found it useful or not. If nothing else, their story has reminded me to keep profiling Intercom, and other similar API providers, as part of my wider API communication and support research.


Evernote: Reaffirming Our Commitment to Your Privacy

A couple of weeks back, the online note-taking platform Evernote made a significant blunder by releasing a privacy policy update revealing they would be reading our notes to improve their machine learning algorithms. It is something they have since rolled back with the following statement, "Reaffirming Our Commitment to Your Privacy":

Evernote recently announced a change to its privacy policy and received a lot of customer feedback expressing concerns. We’ve heard that feedback and we apologize for the poor communication. We have decided not to move forward with those changes that would have taken effect on January 23, 2017. Instead, in the coming months we will be revising our existing Privacy Policy. The main thing to know is this: your notes remain private, just as they’ve always been.

Evernote employees have not read, and do not read, your note content. We only access notes under strictly limited circumstances: where we have your permission, or to comply with our legal obligations. Your privacy, and your trust in Evernote are the most important things to us. They are at the heart of our company, and we will continue to focus on that now and in the future.

While I am thankful for their change of heart, I wanted to take a moment to point out the wider online environment that incentivizes this type of behavior. This isn't an isolated situation with Evernote reading our notes; this is the standard mindset for startups operating online, and via our mobile devices, in 2017. This is just one situation that was called out, resulting in a change of heart by the platform. Our digital bits are being harvested in the name of machine learning and artificial intelligence across the platforms we depend on daily for our business and personal lives.

In these startups' quest for profit, and ultimately their grand exit, they are targeting our individual and business bits. Use this free web application. Use this free mobile application. Plug this device in at home on your network. Let our machine learning evaluate your database for insights. Let me read your most personal and intimate thoughts in your diary, journal, and notebook. I will provide you with entertainment, convenience, and insight that you would never be able to achieve on your own, without the magic of artificial intelligence.

In my experience, the position a company takes on their API provides an early warning system for these situations. Evernote sent all the wrong signals to their developer community years ago, letting me know it was not a platform I could depend on and trust for my business operations, let alone my personal, private thoughts. I was able to get off the platform successfully, but the trick in all of this is identifying other platforms that are primed for making similar decisions, and helping average users understand the motives behind all of these "solutions" we let into our lives, so they can make an educated decision on their own.


The Design Process Helping Me Think Through My Data And Content

I'm working on the next evolution in my API research, and I'm investing more time and energy into the design of the guides I produce as a result of each area of my research. I've long produced 20+ page PDF dumps of the leading areas of my research like API design, definitions, deployment, and management, but with the next wave of industry guides, I want to polish my approach a little more.

The biggest critique I get from folks about the API industry guides I produce is that they provide too much information, aren't always polished enough, and sometimes contain some obvious errors. I'm getting better at editing, but this only goes so far, so I'm bringing in a second pair of eyes to review things before they go out. Another thing I'm introducing into the process is the use of professional design software (Adobe InDesign) rather than just relying on PDFs generated from my internal system with a little HTML spit shine.

While it is taking me longer to dust off my Adobe skills than I anticipated, I am finding the design process to be extremely valuable. I've often dismissed the idea that my API research needed to look good, feeling it was more important that it simply be available publicly on my websites. This is fine, and is something that will continue, but I'm finding a more formal design process is helping me think through all of the material, better understand what is valuable and what is important, and hopefully better tell a story about why it is relevant to the API space. It is helping me take my messy backend data and content, and present it in a more coherent and useful way.

As I'm formalizing my approach using my API definition guide, I'm moving on to my API design guide, and I can't help but see the irony in learning the value of design while publishing a guide on API design, where I highlight the benefits of API design in making sense of our growing digital assets. Once I've applied this new approach to my definition, design, deployment, DNS, and management guides, I am going to go back to my API design for these resources, and take what I've learned and apply it to the way I share the raw data and content. The lesson that stands out most at the moment is that less is more, and simple speaks volumes.


Patent US9300759 B1: API Calls With Dependencies

I understand that companies file for patents to build their portfolios, assert their stance in their industry, and, when necessary, use patents as leverage in negotiations and in a court of law. There are a number of things that I feel patents logically apply to, but I have trouble understanding why folks insist on patenting things that make the web, and this whole API thing, work.

One such filing is patent number US9300759 B1: API Calls With Dependencies, which is defined as:

Techniques are disclosed for a client-and-server architecture where the client makes asynchronous API calls to the server. Where the client makes multiple asynchronous API calls, and where these API calls have dependencies (i.e., a result of one call is used as a parameter in a second call), the client may send the server these multiple asynchronous API calls before execution of a call has completed. The server may then execute these multiple asynchronous API calls, using a result generated from one call as a parameter to another call.

Maybe I live in a different dimension than everyone else, but this doesn't sound unique or novel, and it really just feels like folks are mapping out all the things that are working on the web and filing for patents on them. I found this patent while going through the almost 1300 API related patents in the Amazon Web Services portfolio. Many of their patents make sense to me, but this one felt like it didn't belong.
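To underline how ordinary this pattern feels, here is a minimal sketch of the same idea using Python's asyncio--a client queues two asynchronous calls before the first completes, and the second call simply awaits the first call's result to fill in its dependent parameter. The function and parameter names are mine, invented for illustration, not the patent's or any specific API's.

```python
import asyncio

# Illustrative stand-ins for two dependent API calls.
async def create_order(item: str) -> dict:
    await asyncio.sleep(0.1)  # simulate a network round trip
    return {"order_id": 42, "item": item}

async def get_shipping_estimate(order_id: int) -> dict:
    await asyncio.sleep(0.1)
    return {"order_id": order_id, "days": 3}

async def main():
    # Both calls are queued before the first has completed; the second
    # awaits the first call's result to resolve its dependent parameter.
    order_task = asyncio.create_task(create_order("widget"))

    async def dependent_call():
        order = await order_task  # dependency resolved here
        return await get_shipping_estimate(order["order_id"])

    estimate_task = asyncio.create_task(dependent_call())
    print(await estimate_task)  # {'order_id': 42, 'days': 3}

asyncio.run(main())
```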

When I read these patents, I worry about the future of the web. Ultimately I can only monitor the courts for API related patent litigation, and keep an eye out for new cases, as this is where the whole API patent game is going to play out. I'll keep squawking every time I read a patent that just doesn't belong, and when I see any new legal cases I will help shine a light on what is going on.