Posted on 10-12-2013
I spend a lot of time finding valuable data sets and manually converting, processing, and outputting them into more usable formats, so that they can be used in the APIs that drive mobile and web applications.
I am always on the lookout for new tools that will help me be more efficient in my work. I'm currently test driving a new platform called Delray that takes the older concept of extract, transform, and load (ETL) and brings it into the API age, allowing me to define common data resources as inputs, process them once or on a schedule, and output the data in a cleaner, more usable format.
Using Delray, I can define an input from CSV, JSON, MSSQL, and other common sources, and save it as the starting point for my workflow.
Next, I can configure a cleansing stage to process the data, allowing me to trim whitespace, replace spaces, convert to lowercase, and handle other common cleanup tasks I tend to do manually on my data sets.
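Delray handles these transforms inside the platform, but the same cleanup can be sketched in a few lines of Python. This is my own rough equivalent, not Delray's implementation; the function name and the choice of underscores as the space replacement are assumptions on my part:

```python
import re

def cleanse(value: str) -> str:
    """Common cleanup: trim whitespace, replace spaces, lowercase."""
    value = value.strip()               # trim leading/trailing whitespace
    value = re.sub(r"\s+", "_", value)  # replace runs of whitespace with underscores
    return value.lower()                # normalize to lowercase

print(cleanse("  Field Name  "))  # field_name
```

Running each field of a messy data set through a function like this is exactly the kind of repetitive work a cleansing stage automates.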
Finally, I can output the CSV input as a JSON file for use in my APIs and other open data efforts.
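Delray performs this conversion for you, but as a point of reference, the CSV-to-JSON step it replaces looks roughly like this in Python (the sample data below is made up for illustration):

```python
import csv
import io
import json

# A tiny stand-in for an uploaded CSV file (hypothetical data).
raw_csv = "city,population\nPortland,609456\nEugene,156185\n"

# Read the CSV into a list of dictionaries, then emit JSON.
rows = list(csv.DictReader(io.StringIO(raw_csv)))
output = json.dumps(rows, indent=2)
print(output)
```

Each CSV row becomes a JSON object keyed by the header row, which is the format most web and mobile APIs expect to serve.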
Delray represents the next generation of tools that will turn anyone into a data steward, allowing non-developers to take control of critical data flows within their business, organization, or government agency.
Opening up data in machine-readable formats is not just for IT and developers anymore.