{"API Evangelist"}

Who Is Going To Do The DevOps Aggregation API Platform?

There are two distinct types of APIs I keep an eye on. The first is what I call my life cycle APIs: the APIs of the service providers who sell services and tools to API providers and developers. The second is what I call my stack network: the individual API providers who offer a wide range of API resources. You can find both of these types on the home page of API Evangelist.

The 50+ life cycle APIs I track can be used by companies to manage almost every stop along a modern API life cycle. In theory, all of these service providers have APIs. In reality, most of them do, but they do not practice what they preach, and often do not make their APIs easily discoverable. I have said it a thousand times before--if you sell online services to API providers, you should have an API. Period.

At some point in the future, I will have profiled all of the companies included in my API life cycle research, like I did for API monitoring, and will be able to provide a comprehensive API stack across all the providers, for all stops along the life cycle. Ideally, each provider would have their own OpenAPI spec, but I'm still working to get many of them to make their APIs public; convincing them of the importance of also having an API definition for their API will come next. Then I'll continue pushing on them to allow for the import / export of API definitions, so their customers can more easily get up and running with their services--if you need an example of this in the wild, take a look at Sandbox, or over at API Metrics.

I'd love to see someone take this idea and run with it beyond what I'm able to do as a one-man act. There are numerous API aggregation solutions already out there for financial, healthcare, images, documents, and more. What about an aggregated API across providers in the name of DevOps or microservices orchestration? An aggregated solution would allow you to automate the definition of your APIs in multiple formats with API Transformer, deploy them using Docker or Heroku APIs, manage with 3Scale APIs, deploy sandboxes with Sandbox, monitor with Runscope, and handle almost every other stop along the life cycle.
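To make the idea more concrete, here is a rough sketch of what a thin aggregation layer might look like, walking a single API definition through each stop along the life cycle. Every endpoint and token here is hypothetical--each vendor's actual API, payload, and auth scheme differs, so treat this as a shape, not an implementation:

```python
import requests

# Hypothetical endpoints for each stop along the life cycle --
# each vendor's real API, payloads, and auth will differ.
STOPS = [
    ("define",  "https://api.apitransformer.example/convert"),        # multiple definition formats
    ("deploy",  "https://api.deploy-provider.example/apps"),          # e.g. Docker or Heroku APIs
    ("manage",  "https://api.management-provider.example/services"),  # e.g. 3Scale
    ("sandbox", "https://api.sandbox-provider.example/sandboxes"),    # e.g. Sandbox
    ("monitor", "https://api.monitoring-provider.example/buckets"),   # e.g. Runscope
]

def run_life_cycle(api_definition: dict, tokens: dict):
    """Push one API definition through every stop, one vendor at a time."""
    for stop, url in STOPS:
        resp = requests.post(
            url,
            json=api_definition,
            headers={"Authorization": f"Bearer {tokens[stop]}"},
            timeout=30,
        )
        resp.raise_for_status()
        print(f"{stop}: done ({resp.status_code})")
```

The value of the aggregator would be in normalizing the auth, payloads, and error handling behind a single interface, so the consumer only has to learn one API instead of five.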

I'm sure I've written this one up before, but I couldn't find it, so I wanted to get a fresh post up on the subject. With all the agile, orchestration, DevOps, microservices, and continuous integration going on, having a coherent, cross-vendor API stack, and a suite of the usual analytics, billing, and other vital middleware services, just makes sense. Let me know when you get up and running, and I'll send over my bank account information for the royalty payments. ;-)


The Expanding API Layers That Overlap Our Physical And Virtual Worlds

I wrote the other day about the interesting opportunity opening up within the satellite imagery API layer, and earlier about the similar opportunity being expanded within the fast-growing dimension of our world opened up by drones. Layers within maps are nothing new, and are something Google pushed forward early on in the history of APIs with Google Maps, but I feel they are being further expanded as APIs open new dimensions like satellites and drones. They are then expanded again by adding API access to each layer, for augmenting and injecting other valuable API resources into these newly created API dimensions.

Let's see if I can successfully describe the multiple API dimensions being opened up here. APIs are providing access to maps of our physical world, whether on the ground, from the air with drones, or from space with satellites. These API-driven maps have layers, which are also made available via APIs, allowing other API-driven resources like weather, forest fires, restricted spaces, and temporary or permanent elements to be injected. Once injected, these API-driven mapping resources, along with the API-injected resources, are also being made available via APIs, providing entirely new, specialized resources--that is a lot of APIs!

I am not even touching on the physical devices that put these maps to work also possessing APIs, like drones, GPS units, cars, etc. This is just the expanding layer opening up via the multitude of API-driven mapping resources, and it expands further when you look at the video layer that drones, mobile phones, automobiles, security cameras, and other Internet-connected devices are opening up. Drones, automobiles, and others will share layers with the mapping resources, but other video resources will also possess their own layers for augmenting the experience on the web, mobile, and television.

The part of this that is really striking to me isn't just the overlapping layers between mapping, video, and other channels--it is the overlap between our physical and virtual worlds. Think about what Pokemon Go has done for gaming, but now consider drones, consumer automobiles, as well as commercial fleets. It can be very difficult to wrap your mind around the different dimensions of opportunity opening up, but it doesn't take much imagination to understand that there is a growing opportunity for APIs to thrive in this expanding universe.


Sharing Your API Platform Road Map And Telling The Story Like Readme.io

Sharing your platform's road map with the public and your community is an often overlooked aspect of API operations, but one that can go a long way toward communicating your plans for the future. This is why I carved the concept out into its own research area, to help me better understand how successful API providers, as well as API service providers, are publishing, sharing, and communicating around their road maps.

One recent example of this is out of the API documentation service provider Readme.io, who didn't just publish a nice, simple, and clean road map for their platform--they also told the story of the process. This is a great way to announce that you have a road map for the platform, but it is also something you should repeat as often as possible, telling the story of what you just added to the road map, with as much detail as possible on why.

Sharing your road map, and the story behind it, goes a long way in lowering the anxiety around what the future holds for your API consumers--it lets them know that you care about them enough to share what you have planned. In my opinion, a road map for an API platform shows that you have empathy for your community, and is something I like to encourage and support by showcasing the process of your road map storytelling here on my blog when I can.


Prioritizing Commonly Requested Information With Your API Deployment

I was reading the post from open data service provider Socrata about "putting citizens first" when it comes to opening up city, county, state, and federal government data. One of the headlines they showcased was "Texas overhauls open data portal, prioritizes commonly requested info"--which is a pretty sensible thing to consider, not just for government, but also for companies thinking about what to open up next.

First, let me emphasize that I am talking about open data that is already published on the web in another format (or should be). What the State of Texas is doing is what I call the low-hanging fruit for API deployment--if it is on your website, it should also be available in a machine-readable format. Ideally, you offer HTML, as well as JSON, XML, and other relevant formats side by side within a single domain using content negotiation, but no matter how you accomplish it, the priority is making sure that commonly requested information is accessible to those who need it.
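For anyone wondering what content negotiation looks like in practice, here is a minimal sketch using Flask--the route, record, and markup are made up for illustration, but the Accept header mechanics are the standard ones:

```python
from flask import Flask, jsonify, render_template_string, request

app = Flask(__name__)

RECORDS = [{"id": 1, "name": "Commonly requested record"}]

@app.route("/records")
def records():
    # One URL, multiple representations: the client's Accept header
    # decides whether it gets JSON or HTML back.
    best = request.accept_mimetypes.best_match(["application/json", "text/html"])
    if best == "application/json":
        return jsonify(RECORDS)
    return render_template_string(
        "<ul>{% for r in records %}<li>{{ r.name }}</li>{% endfor %}</ul>",
        records=RECORDS,
    )
```

A browser gets the HTML list, while `curl -H "Accept: application/json"` against the same URL gets the machine-readable version.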

It is a shame that Texas is only now considering this with the latest revision of their portal; ideally, government agencies and companies would apply this way of thinking by default. If it is on your website as HTML, most likely it has already been deemed important, which is why it was made self-service on the open web in the first place. If you are planning on deploying an API or open data portal, and you are wondering where you should start, make sure to learn from the State of Texas, and prioritize the commonly requested information.


API Providers Could Add A Page To Showcase Their Bots

I am coming across more API providers who have carved off specific "skills" derived from their API, offering them up as part of the latest push to acquire new users on Slack or Facebook. The services that API providers and developers are putting to work, like Github, Heroku, and Runscope, increasingly have bots of their own, extending their API-driven solutions to Slack and Facebook.

Alongside having an application gallery and an iPaaS solution showcase, maybe it's time to start having a dedicated page to showcase the bot solutions that are built on your API. Of course, these would start with your own bot solutions, but like application galleries, you could showcase bots that were built within your community as well.

I'm not going to add a dedicated bot showcase page to my research until I've seen at least a handful in the wild, but I like documenting these things as I think of them. It gives me some dates to better understand at which point certain things in the API universe began expanding (or not). Also, if you are doing a lot of bot development around your API, or maybe your community is, this might be the little nudge you need to be one of the first APIs out there with a dedicated bot showcase page.


What Is A RESTful API And Why Does It Matter To IoT?

I'm pretty skeptical about many of the reasons companies give for connecting devices to the Internet using APIs--I am just not convinced this is the best idea when we already have so many security issues with the standard and mobile web. Regardless, I'm constantly working to understand the motivations behind a company's decision to do APIs, as well as what they are telling their customers.

I published a story last week about defining an industrial programmable automation controller (PAC) strategy using an API, which focuses on the approach taken by Opto 22. To support their efforts, the industrial automation provider offers up a dedicated page educating their customers on why you would want to use REST, providing some bullets:

  • Archive I/O and variable data from the PAC directly into Microsoft SQL Server using Microsoft's T-SQL—no OPC or ODBC required
  • Read data from and write data to the PAC from your browser or web-based application using JavaScript.
  • Read or write PAC data using your favorite programming language—C, C++, C#, Java, PHP, Python, and many more
  • Build a mobile application that directly accesses data on your PAC—using Java, Swift, or Xcode 
  • Build a data flow application for communicating with cloud platforms and cloud APIs, using Node-RED and our new SNAP PAC Nodes.

Each of the industrial controllers "includes an HTTP/HTTPS server and RESTful API, compatible with any programming language that supports JavaScript Object Notation (JSON)". In my opinion, this reflects the wider API space serving web and mobile objectives, allowing for integration using any programming language, as well as opening up the devices to API orchestration solutions using iPaaS, and the variety of other API service provider solutions available in the market.
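To give a feel for what this looks like from the developer's side, here is a rough sketch of reading data off a controller's REST API in Python. The host, path, and auth scheme are illustrative--consult the device's actual REST reference before using anything like this:

```python
import requests

PAC_HOST = "https://pac.example.local"  # your controller's address (hypothetical)
API_KEY = "your-api-key"                # auth scheme varies by device; check the docs

# Illustrative endpoint path for reading strategy variables off the controller.
resp = requests.get(
    f"{PAC_HOST}/api/v1/device/strategy/vars/int32s",
    headers={"apiKey": API_KEY},
    verify=False,  # many controllers ship with self-signed certificates
)
resp.raise_for_status()
print(resp.json())  # plain JSON, consumable from any language with an HTTP client
```

The point Opto 22 is making is exactly this: no OPC, no ODBC, no vendor SDK--just HTTP and JSON.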

Ultimately, I think using web technology is inexpensive, and avoids the usage of proprietary, vendor-specific solutions. As the ability to offer up a web server on any physical object becomes easier and cheaper, the usage of web APIs to interact, integrate, and orchestrate around physical objects will only increase, for better or worse.


Thinking In Terms of API Skills And Moving Beyond Just API Resources

The APIs that have seen the greatest adoption across the API space always provide the functionality that developers need in their applications. It is either because the platform is already in use by their users (e.g. Twitter, Facebook), or because it provides the core feature that is required (e.g. SMS, email). There are an unprecedented number of high-value APIs out there, but I think many API providers still struggle when it comes to defining them in a way that speaks to the needs of web, mobile, and device app developers.

I have explored this topic before, discussing the importance of exposing the meaningful skills our APIs possess for use in the next generation of messaging and voice apps, as well as asking whether or not our APIs have the skills they need in a voice- and bot-enabled world. I am not 100% behind the idea that voice and bots are the future, but I am 100% behind defining our API resources in a way that immediately delivers value, like they do in these environments.

The approach used by Alexa when it comes to developing "skills" is an important concept for other API providers to consider. Even if you aren't targeting voice enablement with your APIs, the model provides many positive characteristics you should be emulating in your API design, helping you deliver more meaningful APIs. For me, thinking in terms of the skills your APIs should be enabling better reflects the API journey, where we move beyond just database and other very technical resources, and provide the meaningful skills developers need for success, and end-users (aka humans) desire.
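A quick sketch of the difference, with entirely hypothetical routes: the first endpoint exposes the underlying storage, while the second names the outcome a developer actually wants delivered:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Resource-oriented: exposes the database table, leaving all the hard
# decisions (channel, timing, formatting) to the developer.
@app.route("/messages", methods=["POST"])
def create_message_row():
    row = request.get_json()
    # ... insert the row into storage ...
    return jsonify(row), 201

# Skill-oriented: names the meaningful capability, and handles the
# hard decisions internally, the way an Alexa skill would.
@app.route("/skills/send-reminder", methods=["POST"])
def send_reminder():
    params = request.get_json()  # e.g. {"to": ..., "when": ..., "text": ...}
    # ... schedule delivery, pick the channel, handle time zones ...
    return jsonify({"status": "scheduled", "reminder": params}), 202
```

Same data underneath, but the second version speaks in terms of what the application (and the human behind it) is trying to accomplish.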


The Racial Bias Being Baked Into Our Algorithms

My "fellow" Presidential Innovation Fellow Mollie Ruskin (@mollieruskin), was doing some work with veterans recently and stumbled across a pretty disturbing example of how racial bias is being baked into the algorithms that are driving our online, and increasingly offline worlds.

This morning I was searching #Google for images of #Veterans for a project. I stumbled upon a photographer who had taken hundreds of beautiful photographs of Veterans all in the same style.

I clicked on a few striking portraits...I quickly noticed something very troubling.

When doing image searches, I often use the 'related images' feature to uncover more pictures relevant to what I'm hunting for, as was the case this time around. Where most of the photos returned related images of other veterans, one photo of a smiling black male vet in his uniform fatigues, garnered a series of related images that were all mugshots of CRIMINALS.

The tools we use to fuel our 21st century lives are not the seemingly neutral blank slates we imagine them to be. They are architected and shaped by people, informed by our conscious and unconscious biases. Whether this is reflecting back a dark mirror on what people click on or surfacing a careless design in an algorithm, this random search result shines a little more light on the more subtle and insidious ways racism is baked into our modern lives.

For my friends who work at the big tech giants which are increasingly the infrastructure to our lives, please help make sure your institutions are addressing this stuff. (And thanks to those of you who already have been.)

#BlackLivesMatter

PS: Recently saw a great talk about this idea of 'oppression' in our algorithms. Def worth a watch: https://www.youtube.com/watch?v=iRVZozEEWlE

This isn't some weird edge case; this is what happens when we craft algorithms using development teams that lack diversity. Racial bias continues to get baked into our algorithms because we refuse to admit we have a problem, or are unwilling to actually do anything about it. Sadly, this is just one of many layers in which bias is being built into our algorithms, which are increasingly deciding everything from what shows up on your Facebook wall, all the way to predicting who will commit crimes in the future.

You will hear more stories like this on API Evangelist as I push forward my APIs and algorithms research, working to identify ways we can use open source, and open APIs, to make these often black box algorithms more transparent, so we can potentially identify the bias inside. Even with this type of effort, we are still left with the hard work of changing the culture that perpetuates this--I am just focusing on how we crack things open, and more easily identify the illness inside.


The Expanding World of Technology Evangelism

Technology evangelists are nothing new, but their ranks are something I think will continue to expand as the Internet continues to crack open more of the core areas of the tech sector. I specifically chose the term API Evangelist to define what I do, evangelizing for all APIs, but all I was really doing was following the lead of evangelism pioneers like Amazon, Google, and even Microsoft.

There has long been discussion around evangelists vs. advocates, and I've seen companies also choose to adopt an ambassador format. I have also been interested to see the evolution of Docker's Captains program--they are "Docker experts and leaders in their communities who demonstrate a commitment to sharing their Docker knowledge with others".

I also stumbled across a post out of the MongoDB-as-a-service provider Compose showcasing what they call the database advocate, whose "job is not to guard the database from the world but to advocate the best ways to use it and offering the tools to optimize that usage". In their view, the outdated DBA is going away, with the database advocate emerging as a much more friendly, outward-facing, pragmatic gatekeeper for databases within the enterprise.

It makes me happy to see the open ethos brought to the table by web APIs spreading to all the layers of the tech stack, making the world of virtualization and containers more accessible. As an old database guy, it makes me really, really happy to see it also spread to the world of databases--I am hoping that it is something that continues to spread to all layers.


Delivering API Docs Using OpenAPI Spec Driven Templates For Angular

I have been talking with Nick Houghton over at Sandbox about the state of OpenAPI Spec driven API documentation, and the lack of a machine-readable core when you deploy Slate-driven documentation. He wanted the same thing--good looking, dynamic API documentation that is OpenAPI Spec driven.

He recently got back to me with a solution that worked for them: "Ended up just templating the Swagger JSON myself rather than relying on Slate etc to do it. So model/resources are Swagger annotated, CI pushes out Swagger JSON and Angular UI parses in the browser, works quite well I think".

Nick is on a similar path to mine, as I work to simplify API documentation using OpenAPI Spec, and provide specialized views of APIs using Liquid. We are looking for the simplicity, control, and beauty of Slate, but with the machine-readable core of OpenAPI Spec--allowing us to keep the core specification for the API up to date, so the documentation is always the latest.
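Their solution is Angular parsing the Swagger JSON in the browser, but the same pattern works anywhere. Here is a minimal server-side sketch of the idea in Python with Jinja2, treating the OpenAPI Spec as the single source of truth and the docs as just a template over it (the file names are assumptions):

```python
import json
from jinja2 import Template

# The docs are nothing but a template rendered over the machine-readable spec.
TEMPLATE = Template("""
<h1>{{ spec.info.title }} ({{ spec.info.version }})</h1>
{% for path, methods in spec.paths.items() %}
  {% for verb, op in methods.items() %}
    <h2>{{ verb | upper }} {{ path }}</h2>
    <p>{{ op.summary or op.description or "" }}</p>
  {% endfor %}
{% endfor %}
""")

with open("swagger.json") as f:   # pushed out by CI, per Nick's setup
    spec = json.load(f)

with open("docs.html", "w") as f:
    f.write(TEMPLATE.render(spec=spec))
```

Regenerate the HTML every time CI pushes a new swagger.json, and the documentation can never drift from the API definition.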

They are going to write up their journey on their blog (as all API service providers should), and share it with us. I'll probably do another write-up once I get more details on how they created the templated API docs using OpenAPI Spec and Angular. I also like how they have the OpenAPI Spec JSON pushed out as part of the CI life cycle--something I'll cover more as part of my API life-cycle orchestration research.


When Your API Consumption Influences The Acquisition Of Your Startup

I saw that the contact API solution FullContact recently purchased the professional network management solution Conspire. Thankfully, FullContact is good about blogging about the move, and the details of the motivations behind their decision--without this type of storytelling I wouldn't even have known it happened.

One thing I noticed in the blog post was that "Paul, Alex, and the entire Conspire team have been fabulous partners with FullContact, having utilized our Person API as a part of the Conspire offering". Acquisitions from within an API ecosystem are not new. It is why many companies do APIs in the first place: to help identify talent within their ecosystem like Paypal does, and complementary companies and teams like FullContact has done.

There are many mating rituals you can perform as a startup these days, and building an interesting product, service, and company on top of an API is a pretty cool way to accomplish this. Obviously, it is easier said than done, but if you can identify a real-world business problem, develop a solution to it on top of an API, and get some real traction--it can lead to some pretty interesting outcomes.


Providing Multiple Types of API Sandboxes To Develop Against

I was going through the Cisco DevNet ecosystem and stumbled across their sandbox environments. I thought it was worth noting that they provide several different types of sandbox environments, with a rolling list of available sandbox instances at any point in time.

Cisco provides seven different types of sandboxes:

  • Networking - The Networking Sandbox allows you to remotely access Cisco Networking technologies. Each Sandbox contains either simulated or physical network elements as well as access to developer tools. Some Sandboxes also provide for the creation of synthetic traffic. 
  • Collaboration - The Communication and Collaboration Sandbox allows you to remotely access Cisco Collaboration technologies in a cloud lab. Labs contain Cisco UC services: Unified Communications Manager, Unified Presence, and Unity Connection. In these labs you can build and test integrations supporting features such as instant messaging/presence, voicemail, and conferencing services in your application, using e.g. the Jabber Web SDK.
  • Compatibility Testing - The DevNet Sandbox IVT program allows users to complete Interoperability Verification Tests (IVT) in our labs with your engineer, as an option to using an authorized Cisco IVT partner services lab. Cisco Solution Partner Program members will be eligible for a Cisco Compatible logo once testing is complete and deemed passed. The labs contain the architecture, configuration, and products needed to complete an IVT for supported products and categories.
  • IoT - The IoT Sandbox allows you to remotely access Cisco IoT technologies. Labs contain architectures with products from the DevNet IoT product portfolio, with simulated and actual hardware elements, as well as access to tools and synthetic traffic.
  • Cloud - The Cloud Sandbox allows you to remotely access Cisco Cloud technologies. Labs contain architectures with products from the DevNet Cloud product portfolio.
  • Security - The Security Sandbox allows you to remotely access Cisco Security technologies. Labs contain architectures with products from the DevNet Security product portfolio, with simulated and actual hardware elements, as well as access to tools and synthetic traffic.
  • Datacenter - The DataCenter Sandbox allows you to remotely access Cisco DataCenter technologies. Labs contain architectures with products from the DevNet DataCenter product portfolio, with simulated and actual hardware elements, as well as access to tools and synthetic traffic.

I like the diversity of environments represented here. I've been seeing more virtualized environments show up in support of device-based API integrations--you just can't expect everyone to develop and test against the real thing. The most significant areas represented here, for me, are the compatibility and security testing sandboxes--important areas if we are going to harden integrations.

API definitions like OpenAPI Spec and API Blueprint, combined with recent advances in virtualization (aka containers), make for a pretty rich environment for pushing forward the number of available sandbox environments that developers can take advantage of. I'd like to see more API providers offer sandbox environments, build up their capacity in this area, and get to the level where Cisco is already operating, offering a rich variety of virtualized environments for developers to test their integrations against.
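This is also an area where the definitions do real work for you. As a rough sketch of the idea, here is a tiny mock sandbox in Python that reads a Swagger 2.0 spec and serves each GET path's documented example response--assuming the spec actually includes `examples` entries:

```python
import json
from flask import Flask, jsonify

app = Flask(__name__)

with open("swagger.json") as f:   # the API definition drives the sandbox
    spec = json.load(f)

def make_handler(example):
    def handler(**kwargs):        # accepts any path parameters
        return jsonify(example)
    return handler

for path, methods in spec.get("paths", {}).items():
    get = methods.get("get")
    if not get:
        continue
    example = (
        get.get("responses", {}).get("200", {})
        .get("examples", {}).get("application/json",
                                 {"message": "no example in spec"})
    )
    rule = path.replace("{", "<").replace("}", ">")  # /pets/{id} -> /pets/<id>
    app.add_url_rule(rule, endpoint=path, view_func=make_handler(example))

if __name__ == "__main__":
    app.run(port=4010)   # developers point their integrations here, not at production
```

It only mimics happy-path GETs, but it shows how far a machine-readable definition plus a little virtualization can take you toward the kind of sandbox variety Cisco offers.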


Providing An Anonymous Layer To Your API Provider Service Like Stoplight.io

I was playing around with the free, and now paid, layers of Stoplight.io, having written a previous piece about their lack of a public pricing page, and I noticed they provide an anonymous layer to their API modeling service--without logging in, you can play around with their HTTP client tool, and make requests to an API.

The anonymous version is super limited compared to their full solution, but I think the presence of an anonymous edition opens up an interesting discussion. It appears Stoplight.io has done a lot of work lately to separate the layers of v2 of their service, providing public, free, paid, and enterprise editions of their API modeling solution.

With the shrinkage of freemium these days in the API space, and the tightening down on free trials, an anonymous layer is compelling. It isn't something that would work for all API service providers, but it is at least something to consider as you work to define your layers like Stoplight.io has been doing.


I Know It Is Hard When You Are Just Getting Started, But Please Make Your Pricing Page Public

I received an email from Stoplight.io about their version updates, which included the phasing out of the free beta period--makes sense. I clicked on the "you can view pricing, and setup billing, on your account billing page" link in the email, and was taken to the registration page.

To clarify a little bit: I have an account with Stoplight.io, which I registered using my @kinlane Github account. I'm presently logged in with my @apievangelist Github account as I'm doing some work with multiple repos, and I really didn't want to log out of @apievangelist and log in with @kinlane just to see the pricing.

So I headed directly to the public website of Stoplight.io to look for pricing--which I couldn't find within 30 seconds (my standard threshold). When this happens, I Google for the [API name] + "pricing"--nothing. Ultimately, I did log in with my @kinlane Github account so that I could see the pricing, because I genuinely want to keep my account--Stoplight.io has made it into the highly useful tool category for my world.

I just wanted to articulate the friction I experienced, so Stoplight.io can consider it, but also so that the rest of you can consider it. My preference is that you always make your pricing page public. I understand that this is difficult for startups that are just getting going, are in a beta phase, etc., but I feel like in 2016 it should be the default practice for API providers, as well as service providers.

Even if your service is in beta, and you aren't charging for it yet, you should have a dedicated page explaining this, and keep it updated as you evolve. Please do not make me log in just to review your service, understand what it does, and find your pricing. This is bad for helping analysts like me understand what you are up to (no, I don't want a briefing), and is bad for your customers who are trying to understand where their accounts stand, and whether they can afford to move forward.


Tweeting Out The iPaaS Opportunities That Are Available For Your API

I've been advocating for API providers to embrace integration platform as a service (iPaaS) providers for three years now, encouraging them to make sure their API is accessible via popular platforms like Zapier. While I don't push these as required building blocks for all providers, they definitely are what I'd consider a common building block across many of the successful APIs I keep an eye on.

Making sure your API is available on iPaaS platforms, and showcasing these opportunities, is becoming more and more important. Another positive move you can make when it comes to iPaaS is to tweet out what is possible on a regular basis, like the transactional and marketing email API provider Mailjet does--there are a couple of examples of this in the wild on their Twitter timeline.

I may add this type of activity to my list of common API evangelism building blocks. It is something that I think complements having iPaaS integration solutions, and showcasing them within your API portal. It just makes sense that you should also be regularly tweeting out these opportunities, making them known to your followers, and hopefully your API consumers.
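If you wanted to automate the practice, a rough sketch with the tweepy library might look like the following--the credentials, recipes, and links are all placeholders, and you'd want to space the posts out on a schedule rather than fire them all at once:

```python
import tweepy

# Credentials from your Twitter app (hypothetical placeholders).
auth = tweepy.OAuthHandler("consumer-key", "consumer-secret")
auth.set_access_token("access-token", "access-token-secret")
api = tweepy.API(auth)

# A rotating list of iPaaS recipes worth showcasing (all illustrative).
RECIPES = [
    "Sync new signups to your CRM automatically: https://zapier.example/recipe/123",
    "Get an SMS when a campaign bounces: https://zapier.example/recipe/456",
]

for recipe in RECIPES:
    api.update_status(recipe)  # run one per day via cron, not in a single burst
```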

Tweeting out the iPaaS opportunities available for an API is a great way to reach beyond just the developer API consumer, and potentially reach the actual business consumer, who probably has a real-world problem that they will hopefully be able to solve using your API--opening up a whole other dimension to how your API can be put to work.


The Historic Newspaper API From The Library Of Congress

It always bums me out that the cool kid startup APIs always get the lion's share of the attention when it comes to APIs in the tech news. I guess that makes it my responsibility to show the ACTUAL cool kid APIs, like the Chronicling America API from The Library of Congress, which provides access to information about historic newspapers and select digitized newspaper pages.

The Library of Congress provides APIs for searching the newspaper directory and digitized page contents using OpenSearch, auto-suggesting of newspaper titles, links using a stable URL pattern, and JSON views of all newspaper resources. They also provide linked data, allowing you to process and analyze newspaper information with "conceptual precision" (oooooh I like that), as well as bulk data for your deeper research needs.
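The JSON view makes it easy to get started. Here is a small example of searching digitized pages--the endpoint follows the documented pattern, but treat the parameters and response fields as a best guess and check the API docs:

```python
import requests

# Search digitized newspaper pages via the Chronicling America JSON view.
resp = requests.get(
    "https://chroniclingamerica.loc.gov/search/pages/results/",
    params={"andtext": "suffrage", "format": "json", "rows": 5},
)
resp.raise_for_status()
for item in resp.json().get("items", []):
    print(item.get("date"), "-", item.get("title"))
```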

I wish that ALL newspapers had APIs like the New York Times and The Guardian do, that information was available to the public by default, and that everything was automatically synced to any archive or collection that was interested in it, like the Library of Congress. I know that newspapers are having a hard time in the digital age, and I can't help but feel that APIs would help them evolve and shift until they find their voice.


Providing Additional Support Options For Your API In Your Twitter Bio

As I was writing up a story on Mailjet tweeting out the iPaaS opportunities around their email API, I noticed their Twitter bio. It is subtle, but having spent a great deal of time looking for the support channels of APIs, this is a potentially huge time saver. It is what I do best: discovering the simple, subtle things that successful API providers are doing.

I always encourage API providers to use Twitter as a support channel, because it doesn't just provide support--it is also a public demonstration that you give a shit about your API consumers, and will actively work to help them solve their problems. I've seen API providers who offer no support, or a minimal number of support channels, while also seemingly working very hard to hide them--leaving you feeling like you will never get help when you need it.

Mailjet provides a link to their status page, as well as a link where you can submit a trouble ticket, in their Twitter bio. I would consider their Twitter bio pretty well crafted, with the first half explaining what they do (for new users), and the second half providing information on how to get support (for existing users). It is a subtle, positive thing that all API providers should consider doing--thanks for the lead, Mailjet.


Maybe A Save As JSON Option For Excel Wasn't Forward Thinking Enough

In September of 2015, I asked when we are going to get a save as JSON option in our spreadsheets. I was doing a lot of work saving spreadsheets as CSV files, something I can easily do programmatically, but I was doing it manually as part of a workshop. After I downloaded each CSV file, I then converted it to a JSON file--leaving me asking, "where is the save as JSON?"

As I've been reviewing the new Microsoft Excel API, I got to thinking about the need for a save as JSON option, and now I think that this line of thought was not forward thinking enough. A "save as" just does not speak to the future of machine-readable spreadsheet interactions in an online world. Save as CSV or TSV are very desktop-oriented visions of using Excel, and in 2016 we need more.

The Microsoft Excel API plus OAuth opens up an endless number of opportunities for working with data available in spreadsheets. Microsoft will have to open up the navigation in the online version of Microsoft Excel to the API developer community, allowing users to subscribe to 3rd party API-driven solutions like save as JSON, open in Github as CSV, visualize with D3.js, and anything else that developers dream up via the Microsoft Excel API.
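As a rough sketch of what a third-party "save as JSON" could look like, here is the shape of pulling a worksheet through the Excel REST API on Microsoft Graph with an OAuth token and writing it out as JSON. The endpoint follows Graph's documented pattern, but the IDs, sheet name, and header assumptions are all illustrative:

```python
import json
import requests

TOKEN = "oauth-access-token"   # obtained via the normal OAuth flow (hypothetical)
ITEM_ID = "workbook-item-id"   # the spreadsheet's ID in OneDrive (hypothetical)

# Pull the used range of a worksheet, then write it out as JSON --
# a do-it-yourself "save as JSON" built on the API instead of the menu.
resp = requests.get(
    f"https://graph.microsoft.com/v1.0/me/drive/items/{ITEM_ID}"
    "/workbook/worksheets('Sheet1')/usedRange",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
rows = resp.json()["values"]   # 2D array; this sketch assumes row one holds headers
records = [dict(zip(rows[0], row)) for row in rows[1:]]

with open("sheet.json", "w") as f:
    json.dump(records, f, indent=2)
```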

Maybe this is already possible in the Microsoft Excel online navigation; regardless, there will also be opportunities for extending these options via browser add-ons, as well as integration directly within 3rd party solutions that use OAuth and the Microsoft Excel API to access valuable data that is locked up in spreadsheets. I enjoy that APIs are constantly pushing me to re-evaluate my legacy ways of thinking, and helping me look more toward the future.


Expanding My Awareness Of How APIs Are Being Used At The Network Level

I work as hard as I can to understand every sector being opened up using web APIs, and the network level is one where I need to push my awareness, partially because I find it interesting, but mostly because of the impact it can have on every other aspect of how the Internet works (or doesn't).

To get started, I went over to Cisco DevNet and took a look at their pxGrid solution, which is a "multivendor, cross-platform network system that pulls together different parts of an IT infrastructure such as security monitoring and detection systems, network policy platforms, asset and configuration management, identity and access management platforms", and which provides "you with an API that will open up a unified framework that will enable you to integrate to pxGrid once, then share context with any other platform that supports pxGrid".

I'd categorize pxGrid as a network API aggregator in my world, providing a single, seamless API to access resources at the network level. Of course, all the network endpoints have to speak pxGrid, but the platform provides me with an introductory blueprint for how web APIs are being applied to network resources, for device configuration and management, as well as identity and security services. I'm not a network professional, but I do know that Cisco devices are pretty ubiquitous, and with the company historically holding over 50% market share, I'm betting it is a good place to kick off my learning.


Continuing My Struggle For Reciprocity As ETL Evolves Into The Cloud As iPaaS

Early in 2013, I started a research project to keep an eye on a specific type of API-driven service provider, like IFTTT and Zapier, who were enabling individuals and businesses to move data around in the cloud. This new wave of startups was taking what we traditionally called ETL in the enterprise, which was about extracting, transforming, and loading data between various systems, into the cloud era.

I'm an old enterprise database guy, and ETL has been an essential tool in my toolbox for quite some time--according to Wikipedia ETL is:

Extract, Transform and Load (ETL) refers to a process in database usage and especially in data warehousing that performs: Data extraction – extracts data from homogeneous or heterogeneous data sources.

The IT teams that I have historically been a part of have employed ETL to make sure data was where it was needed within the enterprise. As IT began its evolution to the web, the need to migrate data between systems outside the firewall increased. We were increasingly extracting, transforming, and loading data via FTP, web services, and web APIs outside the firewall, and even migrating data between systems that exclusively existed in the cloud.

Increasingly, the data we were migrating was moving over web APIs, and began employing a more granular approach to how authentication occurred for each "extract and load"--OAuth. With the growth of shadow IT, and the adoption of software as a service (SaaS) solutions, individuals and businesses needed to move documents, media, and other vital data or content between the cloud services we were increasingly depending on.
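The basic shape of this cloud-era ETL is simple enough to sketch. Everything here is hypothetical--the endpoints, tokens, and schemas stand in for whatever two SaaS services are involved--but the pattern of one OAuth token per leg is the important part:

```python
import requests

SOURCE_TOKEN = "oauth-token-for-source"   # granted by the data's owner (hypothetical)
TARGET_TOKEN = "oauth-token-for-target"   # a separate, equally scoped grant

# Extract: pull records out of the source SaaS API.
contacts = requests.get(
    "https://api.source-saas.example/contacts",
    headers={"Authorization": f"Bearer {SOURCE_TOKEN}"},
).json()

# Transform: reshape each record to the target system's schema.
payloads = [{"full_name": c["name"], "email": c["email"]} for c in contacts]

# Load: push the reshaped records into the target SaaS API.
for p in payloads:
    requests.post(
        "https://api.target-saas.example/people",
        json=p,
        headers={"Authorization": f"Bearer {TARGET_TOKEN}"},
    ).raise_for_status()
```

What OAuth adds over the old firewall-era ETL is that each leg runs with permissions the actual owner of the data granted, and can be revoked by them just as easily.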

APIs were enabling savvy tech users to migrate and sync their data between systems, and startups like IFTTT and Zapier saw the opportunity and jumped in with their new offerings. I do not talk about IFTTT, as they choose not to pay all of this forward by offering an API of their own, and also opt to ignore the transparency that APIs bring to the table--so I will simply refer to Zapier as my example of this new breed of service provider. ;-)

In addition to evolving beyond FTP and ODBC as the primary channels, and being about the migration of information in the cloud, the other significant characteristic that stood out for me is that this new approach was also paying attention to the individual needs of the stakeholders, owners, and people to whom the migration of information was important. This information was also increasingly being migrated between multiple systems while adhering to the terms of service, as well as the privacy, of each party involved.

When I launched my research back in 2013, I called it reciprocity, which the dictionary defines as:

  • the quality or state of being reciprocal : mutual dependence, action, or influence
  • a mutual exchange of privileges; specifically : a recognition by one of two countries or institutions of the validity of licenses or privileges granted by the other

When I looked in the thesaurus, reciprocity also had a definition of "interchange", with synonyms of cooperation, exchange, mutuality, and reciprocation. Reciprocity is also a synonym of connection, with a definition of "person who aids another in achieving a goal", and with synonyms of acquaintance, agent, ally, associate, association, contact, friend, go-between, intermediary, kin, kindred, kinship, mentor, messenger, network, reciprocity, relation, relative, and sponsor.

All of these terms apply to what I was seeing unfold with this new generation of ETL providers. ETL was moving into the clouds, and out from behind the firewall, using the open web, cloud platforms, and APIs, and now we have to rethink ETL, and make it accessible to the masses--putting it within reach of the everyday problem owner.

After three years, I've seen this area continue to grow, but the growth in my website traffic to this research was not in alignment with what I saw in other areas. Over time, I realized the area had been dubbed integration platform as a service (iPaaS), and while I resisted using the term for a while, as I wanted to emphasize the human aspect of all this, I am now giving in. Reciprocity was the only term I have ever tried to push on the API community, and I have to admit it will be the last term or phrase I ever try to brand in this way.

Sadly, iPaaS has become the leading acronym for talking about this evolution, and because IT vendors and analysts couldn't give a shit about the users whose information is being moved around, they have defined it simply as:

Integration Platform as a Service (iPaaS) is a suite of cloud services enabling development, execution, and governance of integration flows connecting any combination of on-premises and cloud-based processes, services, applications, and data within individual or across multiple organizations.

In classic IT fashion, no emphasis has been placed on the humans in this equation. In the early days of web APIs, startups focused on what the people using their solutions needed. Then, with each wave of VC investment into the space, with enterprise vendors shifting their focus to this new world, and with the industry analyst pundits realizing the value that lies in this new era, we are going back to the old ways of thinking about IT--ways that rarely ever focus on the human aspect of the bits and bytes we are shuffling around.

I am losing this same fight in almost every area of the API space that I keep an eye on. I'm not so naive as to think I can cut through the noise of the space, and truly take on the IT and developer class's obsessive belief in technology, and the blindness that comes from a focus on the money, but I do believe I can at least influence some of the conversations. This is why I'll keep trying to make sure there is reciprocity across the API space, and that iPaaS pays attention to the human aspect of integration, migration, and keeping our increasingly online world in sync.