Posted on 04-16-2014
I just wrote up a piece about how to deploy an API-driven application backend using Orchestrate.io, and wrote a piece last week on API deployment using Solr and government data. After writing about both of these approaches, I can't shake the thought that external, API-driven approaches like this will become commonplace in the next couple of years.
Both the approach using Solr and the one using Orchestrate.io start from data that is available as a machine-readable data dump, or even an API, but in neither case does the source provide the simple web API access that would make application development easy.
In both of these stories, 18F and Orchestrate.io are simply looking to get the job done and achieve their development goals--in one case deploying an API of federal business opportunities, and in the other building a web application that makes browsing superhero characters easier.
Granted these two scenarios have radically different outcomes, but both share a common approach to meeting their goals:
- Orchestrate.io - Uses Orchestrate.io to extract, store and normalize data from the Marvel API, making it much more accessible, searchable and usable in their MarvelousDB web application
- FBOpen - Uses Solr to import and index federal business opportunities from an XML dump, then delivers a web API that powers the FBOpen web interface, and also provides an open source API that anyone can use as is, or deploy on their own infrastructure
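The shared pattern behind both projects--pull data from a dump or upstream API, normalize it, index it, then serve it through your own simple API--can be sketched in a few lines of Python. This is a minimal illustration, not either project's actual code; the field names and records are entirely hypothetical stand-ins for what an upstream source might return:

```python
import json

# Hypothetical raw records, as they might arrive from an upstream
# API or bulk data dump (the schema here is illustrative only).
RAW_DUMP = json.loads("""[
  {"id": 1, "title": "IT Services RFP", "agency": "GSA", "desc": "Cloud hosting"},
  {"id": 2, "title": "Road Repair",     "agency": "DOT", "desc": "Highway 101"}
]""")

def normalize(record):
    """Flatten an upstream record into the shape our own API will serve."""
    return {
        "id": record["id"],
        "title": record["title"].strip(),
        "agency": record["agency"],
        "summary": record["desc"],
    }

def build_index(records):
    """Store normalized records keyed by id -- a stand-in for the role
    Solr or Orchestrate.io plays in the two projects above."""
    return {r["id"]: r for r in map(normalize, records)}

def search(index, term):
    """Simple substring match across title and summary, the kind of
    query a thin web API layer would expose."""
    term = term.lower()
    return [r for r in index.values()
            if term in r["title"].lower() or term in r["summary"].lower()]

index = build_index(RAW_DUMP)
print(search(index, "cloud"))  # matches record 1 via its summary
```

In a real deployment the dictionary would be replaced by a search engine or hosted database, and `search` would sit behind HTTP endpoints, but the flow--extract, normalize, index, serve--is the same.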
Both of these approaches are about making the data and content needed for web and mobile apps easily accessible via a simple API. Both projects acknowledge that the data source they have to work with doesn't meet their exact requirements, but for very different reasons. One project uses open source software to tackle the problem, while the other uses the latest cloud services to provide a solution--both get the job done.
I don't think there will be one formula for this type of development that works in all scenarios, but I do think the pattern of extracting the data you need via a data dump or an API, and deploying an external API stack, independent of where you get the data, is a pattern that can be reused. This is especially true in our federal government, where I think much of the innovation needs to occur adjacent to the agencies that produce and manage valuable data and content resources, establishing API-driven, public / private sector partnerships.
I envision a healthier API ecosystem where the government and the private sector both encourage the repurposing and wholesale distribution of content and data, opening up a much more federated approach to delivering the resources we need to build both public and private sector web and mobile applications.