Postman /me API Experimentation

I was doing my regular check-in with my partner in crime over in Postman devrel, Joyce Lin (@PetuniaGray), and she reminded me of the concept of a /me API, and Abhinav Asthana’s (@a85) vision that the Postman platform can be used to help establish a kind of universal /me API. The conversation got me thinking about the different ways Postman can be used to profile our digital selves, and I wanted to see if I could brainstorm around one slice of this digital pie I had swirling around in my head. On my regular morning walk around Lake Merritt I found myself thinking about how a collection could be used as an executable and sharable representation of a specific slice of our digital self. Let’s see if I can do what I am seeing in my head justice, and lay the foundation for some more work to move forward how we define ourselves online each day.

To understand a piece of /me, I wanted to break down my LinkedIn footprint, using Postman Interceptor to map out my digital presence there. I turned on Interceptor, filtered by the domain, pointed it at a collection I had added to Postman on the desktop, and began making my way around the LinkedIn website in my Chrome browser. Postman Interceptor records every API call made behind the scenes of the LinkedIn web application, allowing me to play around with the requests in Postman, export them locally, or access them via the Postman API. After clicking around the application for a few minutes I had generated 973 individual API requests, available in a dedicated collection.

Ok, what can I do with these requests? The Postman collection provides the URLs, paths, and query parameters, as well as the responses, for each API call made behind my LinkedIn web application usage during that period of time. This represents just a momentary slice of my online world that exists within the LinkedIn domain, and I wanted to play around with different ways I could use the Postman visualizer to “see” this slice of my world. To help me do this I created a separate collection which pulls my LinkedIn collection from the Postman API, and then loops through each of the 973 API requests stored within it. A lot of interesting information is contained within the domain, path, and query parameters sent with each request, as well as the structure and data returned as part of each response. I chose to just loop through the elements of the paths for all 973 requests, then organize and count them, returning them as a simple tag cloud visualization.
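To make that tallying step a little more concrete, here is a minimal sketch of the logic in plain JavaScript. The collection shape follows Postman’s v2 collection format, where each request URL carries a `path` array of segments; the sample collection and its segment names below are purely illustrative stand-ins, not my actual 973-request LinkedIn capture.

```javascript
// Tally how often each path segment appears across the requests in a
// Postman collection (v2 format), producing the counts behind a tag cloud.
function tallyPathSegments(collection) {
  const counts = {};
  for (const item of collection.item) {
    // Each request URL in the v2 collection format carries a `path` array.
    const segments =
      (item.request && item.request.url && item.request.url.path) || [];
    for (const segment of segments) {
      counts[segment] = (counts[segment] || 0) + 1;
    }
  }
  return counts;
}

// Illustrative stand-in for a captured collection -- hypothetical segments.
const captured = {
  item: [
    { request: { url: { path: ['api', 'identity', 'profiles'] } } },
    { request: { url: { path: ['api', 'feed', 'updates'] } } },
  ],
};

console.log(tallyPathSegments(captured));
// { api: 2, identity: 1, profiles: 1, feed: 1, updates: 1 }
```

Inside a Postman test script, counts like these could then be handed to `pm.visualizer.set()` along with a Handlebars template to render the tag cloud itself.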

Some tags stand out more than others because they occur more often, but the cloud provides a nice visual representation of the API paths used to define my digital self within the LinkedIn domain. I’ll spend time cleaning things up, splitting the camelCase variables, snake case, dashes, and other more cryptic aspects of the LinkedIn API design into meaningful words and phrases that convey the digital resources and capabilities they represent. Then I will look closer at the query parameters, as well as the schema of each response being returned. For now I just wanted to lay the foundation for visualizing collections generated from my LinkedIn web traffic using Interceptor. It provides a pretty compelling way to visualize each slice of my digital footprint within, or across, domains, establishing an approach that can be applied to other domains, reverse engineering the APIs behind any of the web applications I use each day.
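The splitting step I have in mind could look something like this rough sketch, which turns camelCase, snake case, and dashed path segments into space-separated lowercase phrases. The example segments are hypothetical, not actual LinkedIn path elements.

```javascript
// Split a cryptic path segment into a human-readable phrase:
// camelCase, snake_case, and dashed-names all become lowercase words.
function humanize(segment) {
  return segment
    .replace(/([a-z0-9])([A-Z])/g, '$1 $2') // camelCase -> camel Case
    .replace(/[_-]+/g, ' ')                 // snake_case / dashed-names
    .toLowerCase()
    .trim();
}

// Hypothetical segments, for illustration only.
console.log(humanize('profileViewCounts'));  // "profile view counts"
console.log(humanize('connection_invites')); // "connection invites"
console.log(humanize('job-search-alerts'));  // "job search alerts"
```

Run over every tallied path segment, a cleanup pass like this would turn the raw tag cloud into readable words and phrases before rendering.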

Next I want to see what else I can do with this collection. I need to get my cookies properly recording, and then I can actually reuse my LinkedIn collection to replay each of the requests I recorded, opening up the possibilities for automation around this slice of my digital self. However, I can’t help but think I could also deploy this collection as an actual representation of my /me API, reversing what I’ve gathered here and publishing it as an API, while using the collection to make the underlying calls to LinkedIn. Essentially this collection would act as a sort of portable proxy connector that I can use to define each slice of the /me API. This is all just hypothetical at this point. I think I will need to develop some vocabulary to overlay what I parse from the domain map I have created, otherwise it will not be as coherent as it could be. I like this type of experimentation that helps me “see” the API currents that flow beneath our world each day, allowing us to see the digital slices of our daily digital life.