{"API Evangelist"}

Quantifying A Minimum Viable API Footprint Definition As Real APIs.json Driven Portal

I wrote a post the other day laying out what I'd consider a minimum viable footprint for API operations. My vision of exactly what an API is has gone beyond just the technical ever since I started API Evangelist back in July of 2010. Early on I saw this was about more than just the API endpoints--documentation, code samples, and many other building blocks were essential to the success (or failure) of any API platform, area, or ecosystem.

This recent post was an attempt, here in 2015, to quantify what I would consider to be a minimum definition for API operations. After writing it I wanted to take another stab at actually creating a portal that would stand up to the API rhetoric I regularly produce. What better place to start than my own personal master API stack, where I am working to get control over my own infrastructure? Once I had a version 1.0 definition, I forked it, and set up a default API portal that I am calling my demo APIs.json driven portal.

I have published to Github a version of my minimum API footprint, which uses APIs.json as its engine. After looking at 10,000+ APIs, I have a pretty good idea of what I like to see. This is my interpretation of all that monitoring of the space--a distillation of what I've seen while reviewing APIs. It isn't perfect, but neither is the space that we have, and my portal is just an attempt at quantifying what I'm seeing.
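To give you a feel for what I mean by APIs.json being the engine, here is a rough sketch of the kind of index a portal like this runs on--the field names follow the APIs.json format, but the API entries, URLs, and maintainer are placeholders I made up for illustration, not my actual index.

```python
# A minimal sketch of an APIs.json index that a static portal could load.
# Field names follow the APIs.json specification; entries are hypothetical.
import json

index = {
    "name": "Example API Stack",
    "description": "A minimum viable API footprint, indexed with APIs.json.",
    "url": "http://example.com/apis.json",
    "specificationVersion": "0.14",
    "apis": [
        {
            "name": "Blog API",
            "description": "Read and manage blog posts.",
            "humanURL": "http://example.com/blog/",
            "baseURL": "http://api.example.com/blog/",
            "properties": [
                {"type": "Swagger", "url": "http://example.com/blog/swagger.json"},
                {"type": "X-documentation", "url": "http://example.com/blog/docs/"}
            ]
        }
    ],
    "maintainers": [{"FN": "Jane Doe", "email": "jane@example.com"}]
}

# Write the index where a static portal (e.g., a Github Pages site) can load it.
with open("apis.json", "w") as handle:
    json.dump(index, handle, indent=2)
```

The portal itself is then just a page that reads this file and renders each API, its documentation links, and its other building blocks.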

I'm going to create a minimum viable Internet of Things (IoT) version of this portal as well, and use APIs.json to deploy different interpretations of what constitutes a minimum viable API presence. If you have anything you'd like to see in my base template, let me know. If you want to fork it, add to it, and submit a pull request, even better. I'm just playing around, but also looking to establish a suite of APIs.json driven tools that help me (and you) better understand the API space.

To help onboard you with my vision, I also added a walk-through for when you land on the site--something I will be adding to when I have time.



Thinking Through How We Handle The Internet of Things Data Exhaust, And Responsible API Monetization, With Carvoyant

I'm very fortunate to be the API Evangelist, as I get to spend my days discussing some very lofty ideas with extremely smart and entrepreneurial folks from across many different business sectors. An ongoing conversation I have with Bret Tobey (@batobey), the CEO of Carvoyant, is around the big data generated by the Internet of Things. Many companies that I engage with are very closed about their big data operations, where Carvoyant is the opposite--they want to openly discuss it and figure it out as a community, out in the open--something I fully support.

In reality, the Internet of Things (IoT) is less about devices than it is about the data that is produced. If someone is exclusively focusing on the device as part of their pitch, it is most likely they are trying to distract you from the data being generated, because they are looking to monetize the IoT data exhaust for themselves. Carvoyant is keen on discussing the realities of IoT data exhaust, and not just from an Internet-connected automobile perspective, but also the wider world of Internet-connected devices. If you want to join in the discussion, of course you can comment on this blog, and via Twitter, but you can also come to Gluecon in May, where 3Scale, API Evangelist, and Carvoyant will be conducting an IoT big data workshop on this topic.

If you aren't familiar with Carvoyant, they offer an API-driven device that you can connect to your vehicle, if it was manufactured after 1996, and access the volumes of data being produced each day. Carvoyant's tagline says "Your Car, Your Data, Your API"--I like that. This is why I enjoy talking through their strategy, because they see the world of APIs, connected devices, and big data the way I do. Sure there is lots of opportunity for platforms and developers to make money in all of this, but if there is user-generated content and data involved, the end-users should also have a stake in the action--I do not care how good your business idea is.

Carvoyant does a great job of acknowledging that the car is central to our existence, and the data generated around our vehicles is potentially very valuable, but it is also a very personal thing, even when anonymized. Here is how they put it:

Connected car data tells the repair industry when a vehicle needs service – the moment it happens. Connected car data tells an insurer how a driver actually drives and if they are eligible for a better priced policy. Connected car data tells gas stations if a vehicle is low on gas. If a vehicle has not been to the grocery store for a while, then it may be time to make an offer.

The difference in this conversation is that Carvoyant isn't just making this pitch to investors, the automobile industry, and developers--they are making it to the automobile owners as well. They are working to establish best practices for gathering, accessing, storing, and making sense of Internet of Things data exhaust, in a way that keeps the end-users' interests in mind. Every platform should think this way. There is too much exploitation going on around user-generated data, and Carvoyant's vision is important.

Monetization Via Classic Affiliate Program
When it comes to figuring out a healthy monetization strategy for the Carvoyant platform, and the resulting data, the company is starting with a familiar concept: the affiliate program. They are establishing potential referral networks, where end-consumers and developers of apps can refer potential customers to businesses, and when there is a successful sale, an affiliate commission is paid out. This is a great place to start, because it is a concept that businesses, developers, and consumers will understand and be able to operate within, without changing much on-the-ground behavior.

If you identify a customer in need of vehicle servicing, and successfully refer them to a local service center, you can be paid money for making the connection--something that, when done right, could also be applied to the end-user in the form of credits, discounts, and other loyalty opportunities. An affiliate approach to the monetization of data via the Carvoyant API makes for an easy sell, but one that can be applied to a myriad of business sectors ranging from automobile services to food, shopping, travel, and much, much more. While an affiliate base is being established, Carvoyant can also begin to look towards the future, and shifting behavior.

Monetization Beyond Affiliate
While I'm fine with Carvoyant kicking off their monetization strategy with a classic affiliate program, I feel pretty strongly there are many other opportunities for monetization, something Bret agrees with too. When you think about the central role cars play in our lives, the opportunities for inciting meaningful experiences are endless. While the real money is probably around the mundane realities of the average car owner, the chance for serendipity beyond these known areas is the exciting aspect. How do you not just help users find the best time to get their oil changed, but also encourage them to take that side street instead of the freeway--one that might lead to a chance experience ranging from a dinner, to a concert, to just finding the right sunset location?

There is a pretty clear conversion event involved with the affiliate model. A user clicks on just the right deal, a sale is made, and revenue is kicked back from the business to the platform and developer, and hopefully it reaches the end-user in a meaningful way. What other types of "conversion events" (man I hate that phrase) can we identify in car culture? How do we encourage people to take public transit, share vehicles, or how do we make large fleets operate more in harmony? With the right platform, I think we can quickly go beyond the traditional affiliate transaction, and develop a new wave of monetization around IoT data that goes beyond just eyeballs and links, and is more about engagement and true experiences.

Experience Credits Not Just Sales 
Just like moving beyond the affiliate conversion event, I think we can also go beyond the transaction being currency or sale based. How do we create a more experience-based currency or credit system that can be used equally to help businesses generate sales and establish loyalty with customers, but also allow developers and end-consumers to exchange units of value attached to valuable automobile data? If a $20.00 deal on a $30.00 oil change results in a $2.00 affiliate revenue share, what would the value of pulling off the freeway, taking the side road, and catching just the right sunset picture be worth? How do we incentivize experiences, not just sales? If we are continuing to weave data generated from our physical worlds, it cannot always be about money--there has to be other value generated that will keep end-users engaged in valuable, yet meaningful ways.
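To make the arithmetic concrete, here is a quick sketch of how that $2.00 share might get split, and how an experience credit could sit alongside it--the percentages and credit values are mine for illustration, not anything Carvoyant has defined.

```python
# Hypothetical numbers only: a $20.00 deal on a $30.00 oil change pays a
# $2.00 affiliate share, which the platform splits with the developer and
# the end-user (as credits). The percentages are illustrative.
AFFILIATE_SHARE = 2.00

SPLIT = {
    "platform": 0.40,   # keeps the lights on
    "developer": 0.40,  # the app that surfaced the deal
    "end_user": 0.20,   # returned as credits, discounts, or loyalty points
}

payout = {party: round(AFFILIATE_SHARE * share, 2) for party, share in SPLIT.items()}
print(payout)  # {'platform': 0.8, 'developer': 0.8, 'end_user': 0.4}

# An "experience credit" has no sale attached; the value assigned to the
# event is a made-up example of incentivizing experiences, not just sales.
experience_event = {"event": "sunset_detour", "credits": 5}
```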

Sharing Economy
As our worlds continue to change, partially because of technology, but also because of other environmental and societal pressures, what value, sales, and experiences can be generated from the data exhaust produced as part of the sharing economy? If it's our car, our data, and our API--does this apply when it involves our ZipCar usage, rental car, or possibly Uber? How does the sharing economy impact data generated via connected cars? When the end-user is the center of the conversation, these alternate use cases have to be included, making sure privacy is protected, but also that there are opportunities for that data to be securely shared with users. This isn't just a shared automobile data conversation--it is potentially applicable to any other objects we rent and share, like tools, equipment, supplies, and anything else we may use as part of our business and personal lives.

Commercial Fleets
Just like the data exhaust from personally owned vehicles, or automobiles used as part of the sharing economy, the opportunities and patterns available for commercial fleets will look very different. How do we incentivize efficiency, cost savings, and safety in fleet operations? What do the affiliate deals and experiences look like for fleet vehicle drivers, and the companies that own and manage them? Think of the implications of big data exhaust from vehicles in heavily regulated industries, and public service entities like police and fire. We will also have to think very differently about how revenue is generated and shared, as well as look at privacy and security very differently. This doesn't reduce the opportunity in the area of fleet management, it just requires a significantly different conversation about what we need for this class of automobile.

Establishing Common Blueprints 
Beyond the individual conversion events for individual drivers, or the wider opportunities for sharing economy companies and commercial fleet operators, where are the opportunities around identifying common patterns of vehicle usage at scale? How does the vehicle usage of the LAPD differ from the NYPD? What does the average residential vehicle owner in San Diego look like, versus the rental car tourist in San Diego? Using connected vehicle technology like Carvoyant opens up a huge opportunity for better understanding car culture at a macro level, beyond what the auto industry, or maybe the Department of Transportation, sees. How do we begin having honest conversations about our vehicle usage, educate drivers about larger studies, and allow them, as a company or individual, to opt in and share data to participate in those studies? We have to make sure to consider the bigger opportunities for understanding beyond any single endpoint on the connected car network, and look at entire cities, states, countries, and other meaningful demographics.

Lots More To Discuss, Something That Needs Transparency
This is just the beginning of these types of discussions. I have a handful of companies, like https://www.carvoyant.com, who have access to huge volumes of extremely valuable user-generated data, who are trying to figure out how to develop useful tech and make money, all while doing it in a healthy way that protects end-users' privacy and security. I am not opposed to companies making money off their API platforms and user-generated data, I just insist that APIs always be used to make it more transparent, and technology such as OAuth employed to give end-users more control, and a vote in how their data is collected, stored, and shared.
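To show the kind of user control I am talking about, here is a rough sketch of an OAuth 2.0 authorization code flow, where the vehicle owner explicitly approves an app's access to their data--the endpoints, scopes, and credentials below are hypothetical placeholders, not Carvoyant's actual API.

```python
# Sketch of an OAuth 2.0 authorization code flow for user-approved access to
# connected car data. All URLs, scopes, and credentials are hypothetical.
import requests

AUTHORIZE_URL = "https://iot-platform.example.com/oauth/authorize"
TOKEN_URL = "https://iot-platform.example.com/oauth/token"
CLIENT_ID = "my-app"
CLIENT_SECRET = "my-secret"
REDIRECT_URI = "https://my-app.example.com/callback"

# 1) Send the vehicle owner to the platform to approve (or deny) access.
consent_link = (
    f"{AUTHORIZE_URL}?response_type=code&client_id={CLIENT_ID}"
    f"&redirect_uri={REDIRECT_URI}&scope=vehicle.read"
)
print("Ask the user to visit:", consent_link)

# 2) After approval, the platform redirects back with a one-time code,
#    which the app exchanges for an access token tied to that user.
def exchange_code(code: str) -> str:
    response = requests.post(TOKEN_URL, data={
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    })
    response.raise_for_status()
    return response.json()["access_token"]

# 3) Every API call then carries the user-scoped token, so access can be
#    audited, and revoked by the end-user at any time.
def fetch_trips(access_token: str):
    return requests.get(
        "https://iot-platform.example.com/api/vehicles/me/trips",
        headers={"Authorization": f"Bearer {access_token}"},
    ).json()
```

The point isn't the specific endpoints--it is that every access to the data is tied to a token the end-user granted, and can take away.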

This post is about me working through my last conversation with Bret, and hopefully it will result in several more stories here on the blog. I also want to prime the pump for the APIStrat IoT Big Data Workshop at Gluecon in May, and make sure my readers are aware the workshop will be happening, in case you want to join in the conversation. This is just one of many posts you'll see from me discussing the big data exhaust generated from Internet connected devices, but also the potential for transparency and healthier platform operations when APIs and OAuth are employed. These types of conversations are only going to become more critical as more of our physical worlds are connected to the Internet.



If Government Faces Technical Hurdles When Meeting Open Data Requirements, We Need to Lean On Vendors For Solutions

I was reading Obama administration agrees with Sunlight: Agencies should disclose what data they keep private, which is a topic I follow closely, as I'm passionate about government opening up their data inventory, but also because I worked as a Presidential Innovation Fellow on helping agencies open up their data. I completely agree that agencies should have to disclose lists of even their private enterprise data repositories, to help establish a clear picture of just exactly what is public or private government data.

While reading the post, I noticed the statement that "most agencies seem to have accepted and understood the need for this change, but some concerns have been voiced":

  • The Department of Commerce, which holds a very sizable amount of data, has cited “certain technological issues that are creating major barriers to creating the PDL as described [in OMB’s guidance].” This is a concern, but Commerce appears to be working through these issues with relevant officials in the General Services Administration and OMB.
  • NASA has expressed significant concerns with the new process, specifically citing pushback and time constraints from its FOIA office and legal team. They claim that “identifying and indexing our open data is our priority,” and indexing its non-public data will take valuable capacity away. Fortunately, NASA has bought into the public process and are engaged in meaningful conversation about the agency's concerns on the Project Open Data Github.

It is interesting that agencies cite technology challenges in complying with open data mandates. In my experience working on opening up government data, right after the education of the average government employee, the challenges imposed by the technology they use are one of the biggest hurdles. To make my point, you can't find "Export Spreadsheet to JSON" as a Microsoft Excel option. Most systems encourage the output of PDFs from core systems, which, when combined with the lack of open data export capabilities, makes opening up data inventories extra work.
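To underline just how low the technical bar really is, here is a minimal sketch of the kind of export routine that could ship as a default feature, turning a spreadsheet saved as CSV into JSON with nothing but the standard library--the file names are placeholders.

```python
# Convert a spreadsheet export (CSV) into machine-readable JSON using only
# the Python standard library. File names are placeholders.
import csv
import json

with open("data_inventory.csv", newline="") as source:
    rows = list(csv.DictReader(source))  # header row becomes the field names

with open("data_inventory.json", "w") as target:
    json.dump(rows, target, indent=2)

print(f"Exported {len(rows)} records to data_inventory.json")
```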

We need to continue the hard work to educate the average government employee about open data principles, and the little things they can do in their daily work to help collectively move their agency towards being more open, and machine readable by default. However, we also need to push harder on software vendors who are providing technological solutions, to make sure the import and export of data in open, machine-readable formats is a default feature. There also needs to be more mandatory API language in procurement contracts, requiring APIs to be baked in by default in all government systems--they should never be an afterthought.

There is only so much responsibility that we can place on the average government employee, who is just trying to get their job done on a daily basis. All of the companies who are profiting off of selling software solutions to the government also need to share in the responsibility, and help make sure our government is equipped with the proper tools. Government being open and machine readable by default shouldn't create extra work for agencies, it should help reduce their workload.



Exploring The Ways We Can Put API Driven Tools Like Known To Work

I see a lot of software come and go. I adopt applications, tools, and online services for a variety of reasons, with some of these tools remaining a regular part of my operations, and others coming and going with time. I'm currently abandoning Evernote in favor of my own home brew note tool, while I'm also adopting a handful of new HTTP client tools like Paw. There are a number of reasons these tools go away: sometimes the reason I needed them shifts, other times they evolve to a point where they are no longer useful to me, and other times they go away after an acquisition, or after running out of money and not being able to keep the lights on. In the end, when I find tools I believe in, I deeply integrate them into my life, and want to better understand how I can help ensure they will be successful.

One tool that I am using as part of my daily operations is Known. If you are unfamiliar, Known is a simple, social publishing platform for your blog or website. I am personally invested in helping the Known founders find the customers they need to be successful, and as part of this effort I wanted to practice explaining Known to my readers, while I also expand my own understanding of the role Known can play in my own operations.

Reclaim Your Domain
For me, Known is about reclaiming my domain. Rather than posting tweets to Twitter, I post to Known, and using the Twitter API, Known publishes a copy of the Tweet to Twitter. I maintain the original copy of my Tweet. When I write a wall post I can publish to LinkedIn, as well as Facebook, with the original content staying in Known. This is all about POSSE for me, which is Publish (on your) Own Site, Syndicate Elsewhere. As a professional who operates in the online world, it is extremely important to me to retain ownership over my content and intellectual exhaust.
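To show what POSSE looks like in practice, here is a rough sketch of the flow: the canonical copy of a post is stored under my own domain first, then a copy is syndicated to Twitter via its REST API. Known handles all of this for me--the storage helper, file names, and credentials below are just placeholders showing the shape of the flow.

```python
# POSSE sketch: publish on your own site first, then syndicate a copy to
# Twitter. Credentials and the local storage helper are placeholders.
import json
from requests_oauthlib import OAuth1Session  # pip install requests-oauthlib

def save_to_my_site(post: dict):
    # The canonical copy lives with me, e.g. one JSON object per line.
    with open("posts.jsonl", "a") as archive:
        archive.write(json.dumps(post) + "\n")

def syndicate_to_twitter(post: dict):
    twitter = OAuth1Session("CONSUMER_KEY",
                            client_secret="CONSUMER_SECRET",
                            resource_owner_key="ACCESS_TOKEN",
                            resource_owner_secret="ACCESS_SECRET")
    twitter.post("https://api.twitter.com/1.1/statuses/update.json",
                 data={"status": post["text"]})

post = {"text": "Short status update, published on my own site first."}
save_to_my_site(post)       # the original stays under my domain
syndicate_to_twitter(post)  # Twitter only ever gets a copy
```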

Managing My Life Bits
Known allows me to manage a significant portion of my life bits, which include simple status updates, posts, photos, bookmarks, and audio, then provides me with a handful of connected services like Twitter, Facebook, LinkedIn, and SoundCloud which I can then syndicate these life bits to. The important piece of this for me is thinking about my life bits independent of the services I use to publish and manage them. Employing a POSSE approach allows me to decouple these valuable life bits and manage them in a central location, which helps me achieve my goal of keeping control of my digital exhaust.

Syndication Channels
The cloud version of Known offers a handful of common channels for me to syndicate my life bits to, including Twitter, Facebook, LinkedIn, Flickr, Foursquare, and SoundCloud. I'm not a big Foursquare user anymore, but I'm still active on all the other channels, and Known provides me a simple interface for managing syndication to these channels. The cloud version of Known is a great start for any individual or organization looking to get a handle on their basic online presence.

Public or Private Page
Beyond the whole syndication aspect, Known provides an important aggregation and centralization feature for me, by providing a single public or private page (if I so decide), where I can find all of my syndicated Known content. Since most of my presence is public, I keep it open as a single, public place where you can find all of the content I syndicate across Twitter, Facebook, and LinkedIn. With a simple keyword search available, the platform provides a valuable resource that I am increasingly using to re-discover content across my network.

Plugins and Webhooks
Where Known really starts to get exciting for me is the extensibility via plugins and webhooks. First, I can create simple webhooks for syndicating out my life bits to custom channels of my choosing. It is easy for me to set up custom webhooks that can handle various syndication scenarios for me. After that I can create plugins, something I haven't done yet, so I can't speak intelligently about it. I am working to carve out time to set up my own custom Known installation, and craft my first API driven plugin. I envision this as the gateway to not just syndicating to other public channels, but also to custom, private channels that I define.
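Here is a rough sketch of the kind of custom webhook receiver I have in mind--Known (or any POSSE tool) would POST each new life bit to a URL I control, and my handler forwards it wherever I choose. The payload fields are hypothetical, not Known's actual webhook format.

```python
# Minimal webhook receiver: accept a POSTed life bit and forward it to a
# custom channel of my choosing. The payload shape is hypothetical.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def forward_to_private_channel(life_bit: dict):
    # Stand-in for pushing to a private archive, chat room, search index, etc.
    print("Syndicating:", life_bit.get("type"), "-", life_bit.get("title"))

class SyndicationHook(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        life_bit = json.loads(self.rfile.read(length) or b"{}")
        forward_to_private_channel(life_bit)
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), SyndicationHook).serve_forever()
```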

API Driven Known Targets
The default syndication channels that Known provides, coupled with the opportunities the plugins and webhooks introduce, open up an endless amount of POSSE scenarios for me. I envision an entire API stack deployed to support various types of Known installations. I'd like to see API driven Docker containers that can auto deploy with a push of a button to provide additional syndication channels for Known. Once you have a POSSE approach established with Known, defining new API driven channels that you can take advantage of, brokering further syndication, transformation, and storage of all of my personal and business life bits, becomes much easier to envision.

Who Could Put Known To Work?
This is where I need to work through some of my initial thoughts about Known, and push things to new levels. Known makes sense to me when it comes to reclaiming control over my content, while also giving me a single doorway to create additional channels for syndication and publishing, beyond the Twitter and Facebook channels I am already working with. However, when it comes to other folks who aren't familiar with reclaim or POSSE principles, it becomes challenging to explain how it can be put to work.

First, I want to disconnect Known from being seen as just a tool for individuals. Known can bring together any online channels used by an individual, organization, business, event, or any other entity you can think of that would possibly have an online presence. What are some of the possibilities for putting Known to work:

  • Businesses - To manage a centralized social media, blogging, and other online presence for a small to large business.
  • Churches - Provide a central place for church groups to manage their online presence, from an interface everyone can use.
  • Libraries - Give the local library a single way to control messaging online, and engage with constituents via a single interface.
  • Museums - Perfect for museum collections to manage the communication and messaging around exhibits that have an online presence.
  • Schools - Known has already made inroads into schools, perfect for overall institutions, as well as departments, classes, or individual projects.
  • Public Space - Parks, dog parks, open spaces, and other public spaces often have a social media and messaging presence, where Known can be put to work.
  • Farmers Market - A farmers market needs an active presence to stay engaged with constituents, and Known can be used to control, and centralize, messaging.
  • Neighborhoods - Bringing it down to the hyper-local level, allowing people to work together and manage the messaging and communication within a neighborhood.
  • Government - Providing a common platform for government agencies to work together on projects, as well as manage digital strategies for any size of agency.
  • Group Projects - I can think of a number of other project scenarios that have an online presence ranging from open source projects, to public / private sector partnerships that could use Known.

This is just a start when it comes to the potential uses of Known. I'll keep working on them, and post individual stories that help people see the ways they can put Known to work. I see the platform providing a free, low cost, as well as a privately hosted solution that individuals and organizations can use to get a handle on their content and digital presence.

The toughest part of the Known proposition is quantifying the endless scenarios where it can be applied. In my opinion this is the potential, but also a significant challenge, for the platform.

How Do I Explain What Known Is?
One of the biggest problems with concepts like Reclaim Your Domain and POSSE is that the average Joe or Jane citizen could often care less about these concepts--that is, until you find a relevant context that is important for them--which is why I'm writing this post. If you are a photographer, you are going to care about your photos more, and think more critically about where and how you publish. If you are a writer, or creative professional, keeping track of your work, ideas, and intellectual exhaust should be important, but for many other people, businesses, and organizations, this is just not relevant or prominent in their existence.

Known is not a social media management tool, even though it possesses many of the same characteristics. It is more about staying in control of content across all of your digital channels, in an easily extensible way. The trick is to help people see the potential, and encourage enough developers to develop plugins and craft valuable webhooks, so that individuals, businesses, organizations, institutions, and government agencies find it immediately useful. In a regular start-up, the focus would be on the most lucrative targets, like within the enterprise, but I'd like to see Known find an audience within some lucrative circles, as well as in the long tail of use, serving those who can't afford costly solutions, but could be providing a valuable community service.

The trick is going to be helping everyone understand the wide, API driven potential, to the point where they will pay for a pro service, and help keep the platform thriving. Not everyone is moved by the important reclaim and POSSE concepts, but they could really benefit from the ease of use, centralization, and management of their presence across the increasing number of platforms we are publishing to across any number of industries.

Where Would I Pay For Known?
I have upgraded my Known account to the pro version. It is important to me that I support them, and I use the tool every day, so it is worth me paying for the service. Beyond that, I'd be into paying for premium services, like the ability to publish links, stories, and other items to specialty channels. Think event or blog syndication, or possibly tapping interesting transformation services that I can apply to my audio notes, podcasts, or other life bits (translation, transcription, etc.).

I also imagine in the future there being premium plugins that I can discover, and add to my Known projects with a push of a button. I'd be happy to pay developers a premium for these bells and whistles, something that Known could definitely take a cut of. Once I get more comfortable with plugin development, and the potential for API driven channels, I'm sure I'll have more ideas about areas I'd cough up cash for, within the Known ecosystem.

POSSE In Addition To Reciprocity
I am increasingly using Zapier as what I call a reciprocity service. I use it to move valuable information between the cloud services I depend on. Countries establish reciprocity agreements for trade; Zapier gives me tools for managing trade between the digital spaces I operate in. Known adds a POSSE layer to my digital operations--where Zapier allows me to move things around, Known allows me to publish once, and syndicate wherever I need to. While I could hand code all of this myself, these services allow me to do it in a simple, extensible way that saves me a lot of time and resources.

An Open Source Solution
In 2015, it is getting increasingly rare for me to adopt a new tool or service if there isn't an open source aspect to it. Most of the time I end up depending on cloud services instead of the open source edition, but having the option to migrate to my own platform, or help deploy customized solutions using the open source version, is important. In my opinion open source, software as a service, and APIs all work together in creating a winning business model, something Known is taking full advantage of.

The Known Team
Another important factor in all of this for me, is the Known team. I believe in Ben Werdmuller and Erin Jo Richey, the Known founders. I've spent enough time with both of them at the Reclaim Your Domain events we have organized to get to know them well. They don't just have the skills, they have the vision, and a healthy view of how the Internet should work. Ben believes that technology is an empowering force that makes the world better by leveling playing fields, informing citizens, and connecting us all, and with a background in cognitive science, Erin loves creating engaging, human-centric software within the context of sustainable businesses. #win #win

The Known Mission
Known is a start-up, but it has a larger mission: to help empower people through simple, standards-based, open source software, which also has free and low cost cloud services for those without the resources to deploy all of it on their own. This type of approach to software is the future in my opinion. It is not just the open source software, in conjunction with a free and paid cloud model, it is also the embracing of APIs as a distribution channel--this API-centric focus is why I believe in Known.

Making Sure We Can Have Simple, Powerful API Driven Tools Like Known
I am writing this post first to help me better understand, and articulate, the potential of Known. I've been using it for over a month now, and it is proving to be very useful in managing the daily syndication of content across my primary channels. I'm working to better understand the potential for webhooks and plugins, and extend it to better meet my own operational needs. Along the way I want to better understand how to on-board other people with the concept, whether they are into the reclaim or POSSE concepts, or just looking to get a better handle on their social media.

Along the way I am going to tell stories about how I am using Known, in hopes of helping people learn how they can put it to work as well. Services like Known are good for the API industry, and I want to see it succeed, so in addition to helping the platform find new users, I also want to help encourage the development of API driven plugins and webhooks that can be employed via Known. I can see Known becoming an essential building block, like I've identified Zapier as--I think it could become that important for API providers to consider offering a plugin for.

It is also important for me to help Known find success using its simple, pro business model, and potentially find future revenue opportunities that help keep the lights on, but also don't change the platform in ways that take it down the wrong path. I trust Ben and Erin to make the right technological and business decisions; what worries me is that people won't see the potential, and the importance, of API and standards driven tools like Known that help us take control of our digital personas. It often seems in this crazy tech driven world that the best products don't always win, and I intend to ensure this doesn't happen to Known.

We need more tools like Known that help empower everyone to own the valuable exhaust they produce online daily, not ceding control to the endless waves of exploitative platforms and applications that step up and try to own people's valuable life bits. Look for more stories about Known from me, as I continue to extend its functionality to help me maintain my POSSE stance when it comes to my digital presence.



A Healthy API Strategy Does Not Involve Scheduling A Briefing To Discuss--Just Do It

I get a number of folks contacting me about their API ideas, emailing or calling me, looking to schedule a meeting or briefing to discuss them. I always reply with a request for existing links to the portal, blog, Twitter, and Github accounts of those involved. Even when companies are doing something interesting, I prefer to approach it like any random Joe or Jane would, from the public portal, with very little information up-front. Your overall approach to APIs, the value they deliver, and the ease of integration should speak for themselves, in my opinion.

I am sure there are plenty of companies who would still prefer a briefing, meeting, call, or other personal-touch, sales or pitch-like engagement, but increasingly in an API driven world, a self-service, pull approach is preferred. A well done API shouldn't have to rely only on traditional marketing. I'm not saying that you shouldn't have these mechanisms in place, but by pushing this approach on media, developers, and other potential consumers, you are doing yourself a disservice in many situations--going backwards, not forwards.

There is also another group of people who contact me who really don't have a thing yet. They are just looking for validation, and potentially to stimulate discussions that will get them more attention from would-be investors. As with the previous group, I think the "just do it" approach works best. You are going to stimulate much more meaningful discussions amongst investors if you are actually doing something, and generating real buzz amongst developers. Additionally, you never know--you may be able to bootstrap and make it happen, without attracting major investment.

I know many of you traditional business folks are used to a push approach to getting your business out there, but when you are looking to make inroads in the API economy, the majority of your investment should be about creating a platform that people can visit, and pull the value you generate into their worlds, via a self-service model. Right behind the portal, you should have robust marketing, sales, and other outreach mechanisms, but your primary outreach approach should be just about doing what you preach, not scheduling calls to discuss and convince someone of what you do.

For me, when someone insists on jumping on a call to discuss their API, it is an immediate filter for bad ideas. If someone is just talking, and not showing, more often than not there isn't anything really there. If you have to defend or hide your idea, it probably shouldn't be out there anyways.



Preparing Postman Collections Ahead Of Time For Developers Like JustGiving Does

I have been slowly adding Postman Collections to many of the API indexes for my new master stack. I index each API using an APIs.json file, as well as provide a master APIs.json which brings together all of the APIs into the single portal I've published to Github. Within some of the APIs, you will see a Postman icon, allowing you to easily import the API definition into Postman, directly from each API's landing page.

When I first wrote about publishing these Postman Collections on my Alpha API Evangelist blog, someone from the donation API platform JustGiving said they also prepare Postman Collections for their developers, and publish them to Github for their API consumers:

At JustGiving we give our API consumers a set of collections to import to make their jobs even easier, we also use the "Environments" feature to substitute different values depending on whether you're using our Sandbox or Production environment - it's a real time saver and also helps when supporting our API consumers.
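Here is my rough sketch of the pattern JustGiving describes: the collection's requests reference a {{base_url}} variable, and separate sandbox and production environment files supply the value. The structures below only approximate the Postman JSON formats and are purely illustrative--they are not JustGiving's actual files.

```python
# Sketch of Postman environment substitution: one collection, two
# environments. Structures approximate the Postman JSON formats.
import json

collection = {
    "name": "Donations API",
    "requests": [
        {"name": "Get donation", "method": "GET",
         "url": "{{base_url}}/v1/donation/{{donation_id}}"}
    ],
}

environments = [
    {"name": "Sandbox",
     "values": [{"key": "base_url", "value": "https://api.sandbox.example.com"}]},
    {"name": "Production",
     "values": [{"key": "base_url", "value": "https://api.example.com"}]},
]

with open("donations.postman_collection.json", "w") as handle:
    json.dump(collection, handle, indent=2)

for env in environments:
    with open(env["name"].lower() + ".postman_environment.json", "w") as handle:
        json.dump(env, handle, indent=2)
```

Switching between sandbox and production then becomes a matter of selecting a different environment in the client, with no edits to the requests themselves.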

I feel pretty strongly that Postman is a measurable unit for transactions in the API economy, and I feel it goes a long way in helping API consumers on-board with each new API. As a developer, if I can import a Postman Collection into my client within just a couple of clicks, and be up and running with an API--I am going to be a much happier camper, and be that much closer to integration.

I'm considering adding Postman Collections as a building block under API on-boarding, and encourage API providers to follow JustGiving's lead. Postman Collections are kind of the new API explorer, allowing developers to quickly test drive an API they are going to integrate with. With my own platform, I'm also going to play with publishing Postman Collections, complete with API keys that I refresh regularly, allowing immediate, but limited, access to my APIs for anyone who stumbles across them.

Like Swagger, Postman is increasingly a machine-readable definition that API providers should be producing--alongside Swagger and API Blueprint for the entire API surface area, and APIs.json for indexing all API operations.



On Twitter, Gnip, DataSift, And Making The Hard Platform Decision

For over a week now, I have had Steve Willmott of 3Scale's response to the announcement that Twitter was cutting off firehose access, and bringing the higher level access completely in-house via their Gnip acquisition. You can read the response from Gnip via their blog post Working Directly With the Twitter Data Ecosystem, and from DataSift in their The Twitter/Gnip Gap: Data Licensing vs Data Processing. Steve and I are in agreement with the points made via his How Not to be a Platform: Twitter's Firehose Mistake post. Where I really struggled over the last week, in producing an official response, was after reading the response How To Be A Platform: Making Tough Partnership Choices, from my friend Tyler Singletary.

You see, I agree with everything Tyler pointed out as well. He's right. Twitter has probably made the right decision for Twitter, and based upon their goals and understanding of the Twitter API ecosystem, did exactly what should be prescribed. I have lots of opinions around the Twitter ecosystem, and how Twitter Continues to Restrict Access to Our Tweets. However, when I step back and put on my API operations hat, everything Tyler points out is valid in my opinion. Twitter gets to decide who has access, and with respect to their goals, choosing the Gnip offering over the DataSift offering makes sense. It's all about the winning solution, and where the ROI is when it comes to platform operations. Life is good when you are on the winning team. However #winning != #good, and #winning != best outcome. I also guess that #open != good, and #open != best outcome either. There are no absolutes in API land--sorry.

Where Tyler's stance really begins to lose its shine for me is when you zoom out. You see Tyler, you are on the winning team. You have access to the firehose, and you operate squarely within the #winning conversation (I know you don't have direct firehose access, and rely on Gnip). For those of us who aren't, the conversation looks very different. Let's take a look at the Gnip vs. DataSift product comparison you provide, but from my stance.

After two phone calls with the Gnip sales team, they realized I didn't have the requisite $45K to move forward--not worth continuing to talk to this one. All I'm left with is the Twitter REST and Streaming APIs, and DataSift. Gnip isn't even on the menu, isn't even an option in my world. So only the players who can afford to be at the table get to play--kind of sounds like a familiar story to me. Where have I heard this before? Oh, in every industry, throughout history. This isn't the API promise I've been hearing! ;-)

I remember in 2007 and 2008 when Twitter was built on these same principles. Only people with the big money could step up and play on the platform. Oh wait, no! It was the other way around--the people with money didn't give a shit about Twitter, and only us small fish were playing on the platform. We were building widgets, Wordpress plugins, developing meaningful analytics, mobile apps, and Twitter encouraged it all. Prove what you are building on the public API, and when you are ready, we'll get rid of any rate limits, or even give you the firehose--just ask us, we'll give you the access you need. The firehose rhetoric was much different 5 years ago:

Full investment in this ecosystem of innovation, means all our partners should have access to the same volume of data, regardless of company size. More than fifty thousand interesting applications are currently using our freely available, rate-limited platform offerings. With access to the full Firehose of data, it is possible to move far beyond the Twitter experiences we know today. In fact, we’re pretty sure that some amazing innovation is possible.

The rush of innovation did happen, but does anyone have access to the same volume of data regardless of company size in 2015? Does the argument that Tyler lays out today apply in a 2007 or 2010 Twitter ecosystem? How does this line of thinking impact the new startups who are trying to get API developers to come out and use their ecosystems, now, when the big money players don't care about them? Sure things have changed, things have evolved, but I just don't buy the argument that the better product (Gnip) won out over the other product (DataSift). In the neatly packaged and positioned argument, sure, it makes sense, but if you zoom out, it just doesn't hold weight, now or then. Things have changed for Twitter, but things haven't changed for new startups who are building platforms--they are still depending on early adopters to build on their platform and turn them into a success story, and attract VCs and users.

This is what leaves many of us so disgruntled. It's not Twitter making the hard decisions around platform operations. It is their abrasive stance towards those of us who are not in the club, their lack of communication about the ecosystem, their sharecropping and poaching of the best ideas along the way, and changing the tune slowly over time. Honestly, I feel like they've done a good job recently in correcting much earlier bad behavior, with their overall partner program, API operations, and evangelism, but their whole firehose position is lacking the bigger ecosystem vision--it is just self centered. Twitter exploited the community when they could, they hand picked a group of partners (Gnip, DataSift, DataMinr, Mass Relevance, and many more) to serve this need when they didn't have the time or resources to do it on their own, and now that they see there are more dumbass enterprise people willing to pay Gnip prices for our data, over people building on the long tail, I respect the whole thing even less--to the point where I have almost stopped giving a shit, and begun moving on.

Steve is right, Tyler is right, and sure, Twitter is right. You might even be able to build an argument that Twitter doesn't owe the long tail of ecosystem developers anything for helping make Twitter what it is, which would be a tough argument in my opinion, but you can't argue that Twitter doesn't owe a certain amount of responsibility to end-users to keep platform operations real. This means not making firehose access something only the IBMs, Intels, Oracles, and other big $$ companies of the world can afford to build (at scale) on Twitter. Sure the rest of us can still build on the public REST and Streaming APIs, but what then? Why would we build on the platform if, when we reach the 100K token limit, we have no guarantees for the future--that is, unless we are in the good ol' boy club, and don't have anything to worry about.

I guess in the end, this is all about access (pun intended). Like with many other areas of the world, it helps if you are lucky enough to have a privileged position--and I'm not just pointing out the Klout / Twitter relationship, I'm also talking about the other cozy VC relationships I've witnessed companies enjoy with Twitter, versus the companies who I've seen that do not enjoy such privileged access. The conversation cannot just be about the merits of Twitter's decision, and the performance of Gnip and DataSift, without including the bigger ecosystem picture. The hard-API-ecosystem-decision argument is only valid if it takes into consideration all of the macro issues that truly impact a platform, not just those of the platform owners and their most cherished partners and investors. While I think Twitter and its investors feel like they own Twitter, those of us developers and power users who have been on the platform since the early days also feel like we have a certain amount of ownership--an ownership Twitter has shown they don't respect.

Twitter definitely holds the cards at the moment, but eventually, as more of us power users realize we are just fine without a Twitter presence, Twitter will face becoming the NBC of social media--still relevant, but are you still truly relevant? Not forever.



Hypermedia APIs From Sébastien Cevey of The Guardian at @APIDaysBerlin / @APIStrat Next Week

The most important stories told across the API space, the ones that have the biggest impact on API providers, and ultimately API consumers, are the stories that come out of the trenches of the API operations at the leading public API providers. This is why 3Scale and API Evangelist started APIStrat, and what the audiences over the last 3 years of operating APIDays and APIStrat have consistently requested--more stories from the trenches.

With this in mind, we've asked Sébastien Cevey (@theefer) of The Guardian to come to @APIDaysBerlin / @APIStrat Europe 2015, and share a story from API operations at the world's leading media API provider. Sébastien wanted to share their view of what Hypermedia APIs are, by comparing them to a classical RPC architecture and ad-hoc "JSON APIs". He will also cover key benefits of using Hypermedia APIs (cacheability, discoverability, evolvability/extensibility, simplicity, interactive nature), presented in the context of real-world examples and the different constraints of REST (HTTP, URIs, verbs, hypermedia controls like links and forms).
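As a rough illustration of the comparison Sébastien will be drawing: an ad-hoc JSON API forces the client to hard-code its URLs, while a hypermedia response carries links the client can follow, letting the server evolve without breaking consumers. The payloads below are made up, not The Guardian's actual API.

```python
# Illustrative only: contrast an ad-hoc JSON response with a hypermedia-style
# response that carries its own links (roughly HAL-like). Payloads are made up.
import requests

# Ad-hoc JSON: the client must already know how to build the next URL.
adhoc = {"articles": [{"id": 42, "title": "Example"}], "page": 1}
next_url = "https://api.example.com/articles?page=2"  # hard-coded convention

# Hypermedia: the representation tells the client where it can go next.
hypermedia = {
    "articles": [{"id": 42, "title": "Example",
                  "_links": {"self": {"href": "https://api.example.com/articles/42"}}}],
    "_links": {"next": {"href": "https://api.example.com/articles?page=2"}},
}

def follow(resource: dict, rel: str):
    """Follow a named link if the server advertises it."""
    link = resource.get("_links", {}).get(rel)
    return requests.get(link["href"]).json() if link else None

# The client discovers the next page instead of constructing the URL itself:
# next_page = follow(hypermedia, "next")
```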

What I like about Sébastien's viewpoint is 1) he works at one of the most progressive media outlets in the world, where he is actually putting APIs to work, and 2) his pragmatic approach. Sébastien will talk about what hypermedia APIs are, anchoring his lessons in real-world examples from Guardian operations, but he will also discuss the caveats of this API style, acknowledging the realities of API operations. There are no perfect solutions, but technologists like Sébastien are pushing the discipline of API design forward, while also balancing it with the real-world constraints that exist in their workplaces.

The keynote from Sébastien Cevey (@theefer) of The Guardian will kick off the first day of @APIDaysBerlin / @APIStrat Europe 2015, so make sure you are registered today, so you don't get locked out when we sell out (which we are very close to doing). If you are looking to learn more about hypermedia APIs, I recommend starting with Sébastien's keynote, but also make sure to join us Saturday morning for the entire hypermedia API session, where you can learn more about this fast growing API design consideration, from those pushing the conversation forward, and putting hypermedia APIs to work in the real world.

I look forward to seeing you all in Berlin, next week!