
API Machine Learning News

These are the news items I've curated in my monitoring of the API space that have some relevance to the machine learning API showcase conversation and that I wanted to include in my research. I'm using all of these links to better understand how the space is testing their APIs, going beyond just monitoring and understanding the details of each request and response.

Making Machine Learning Accessible To Spreadsheet Power Users

My friends over at Algorithmia are up to some good things–making their algorithms available within a spreadsheet. Algorithmia has created a set of open source scripts and a walkthrough to help you inject the algorithms from their marketplace into your Google Spreadsheets.

They have seven useful algorithms to inject into spreadsheets:

  • Linear Detrend – removes increasing or decreasing trends in time series
  • Autocorrelate – analyzes the seasonality of a time series
  • Outlier Detection – flags unusual data points
  • Forecast – predicts a given time series into the future
  • Summarizer – creates a text summary by extracting key topic sentences
  • Social Sentiment Analysis – assigns sentiment ratings of “positive”, “negative” and “neutral”
  • Count Social Shares – returns the number of times a URL has been shared on various social media sites

The Google scripts are available on GitHub, thanks to the hard work of Ken Burcham. They provide yet another interesting example of how a spreadsheet can be used as an API client, but they also show how machine learning API providers can get their ML warez in front of the average business user. Developers building applications with your ML APIs is one thing, but getting the average business spreadsheet power user to put your ML API to work in their everyday workflow is a whole other world of API integration opportunity.

I have been preaching the spreadsheet-to-API connection for a while now. I know that many API developers want to do away with the spreadsheet, but I think they would be better off focusing on injecting their API solutions into spreadsheets like Algorithmia is doing. When you are designing your algorithm-centric APIs, providing access to your machine learning models, make sure you keep your APIs simple, doing one thing well like Algorithmia does; then your ML APIs can be injected into the spreadsheets around the globe that are driving business each day.
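
To give a sense of how simple these one-thing-well algorithm APIs are to put to work, here is a rough Python sketch of calling one of the algorithms listed above over HTTP, following Algorithmia's documented pattern of POSTing to an algorithm path with a "Simple" authorization header. The algorithm path, version, and API key below are placeholders, so treat this as an illustration rather than copy-and-paste code.

```python
import requests

ALGORITHMIA_API_KEY = "YOUR_API_KEY"  # placeholder -- swap in your own key

def call_algorithm(algo_path, payload):
    """POST a JSON payload to a single Algorithmia algorithm endpoint."""
    response = requests.post(
        f"https://api.algorithmia.com/v1/algo/{algo_path}",
        json=payload,
        headers={"Authorization": f"Simple {ALGORITHMIA_API_KEY}"},
        timeout=30,
    )
    response.raise_for_status()
    # Algorithmia responses wrap the algorithm output in a "result" field
    return response.json().get("result")

# Example: summarize a block of text (algorithm path and version are illustrative)
summary = call_algorithm("nlp/Summarizer/0.1.8", "Long article text goes here...")
print(summary)
```

That one-function-per-algorithm shape is exactly what makes these APIs easy to drop into a spreadsheet cell.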


API Wrappers To Help Bring Machine Learning Into Focus

I was taking a look at the TensorFlow Object Detection API, and while I am interested in the object detection, the use of the word API is something I find more intriguing. It is yet another example of how diverse APIs can be. This is not a web API, but an API on top of a single dimension of the machine learning platform TensorFlow.

“The TensorFlow Object Detection API is an open source framework built on top of TensorFlow that makes it easy to construct, train and deploy object detection models.” It is just a specialized code base that helps abstract away the complexity of one aspect of using TensorFlow, specifically detecting objects in images. You could actually wrap this API with another web API and run it on any server, or within a single container, as a proper object recognition API.
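
To make that concrete, here is a minimal sketch of what such a wrapper could look like, using Flask as the web layer. The detect_objects() function is a stand-in for whatever model invocation you build with the TensorFlow Object Detection API, and the /detect route and response shape are my own assumptions, not anything the framework prescribes.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def detect_objects(image_bytes):
    """Placeholder for the actual TensorFlow Object Detection model call --
    whatever inference code you build with the framework would live here."""
    # Illustrative output shape: label, confidence score, normalized bounding box
    return [{"label": "person", "score": 0.92, "box": [0.1, 0.2, 0.5, 0.8]}]

@app.route("/detect", methods=["POST"])
def detect():
    # Accept a raw image upload and return detections as JSON
    image_bytes = request.files["image"].read()
    return jsonify({"objects": detect_objects(image_bytes)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

Packaged in a container, that handful of lines turns a machine learning code base into an object recognition API any developer can consume.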

For me, it demonstrates one possible way of wrapping a single aspect or cross section of a machine learning implementation to abstract away the complexity and help you train and deploy ML models in that particular area. This approach to deploying an API on top of ML shows that you can use APIs to help simplify and abstract ML for developers. This can be done to help satisfy business, regulatory, privacy, security, and other real or perceived concerns when it comes to artificial intelligence, machine learning, or any other digital voodoo that resembles magic.

No matter how complex the inputs and outputs of an algorithm are, you can always craft an API wrapper, or series of API wrappers, that helps others make sense of those inputs and outputs from a technical, business, or even political perspective. I just wanted to highlight this example of ML being wrapped with an API, even if it isn’t for all the same reasons that I would be doing it. It’s just part of a larger toolbox I’m looking to create to help me make the argument for more algorithmic transparency in the machine learning platforms we are developing.


A Conference Focused On Machine Learning APIs

I try to pay attention to events going on in the API space beyond just APIStrat in Portland this fall (submit your CFP!!), and I saw a notification for PAPIs in São Paulo in two weeks, as well as Boston in October. I’m glad we’ve always kept @APIStrat a wider community thing, but if I had to pick one vertical to focus on in 2017 and beyond, it would definitely be machine learning APIs.

PAPIs has been on my radar for a while now, but I think their foresight is going to start paying off this year. While there are a number of trends moving the API space forward, things like microservices, serverless, and GraphQL, nothing will compare to what is happening with machine learning (ML). I think 90% of the ML will be BS, but there will be 5-10% of it that actually moves industries forward in a meaningful way, and the scope of the investment into everything ML is going to be dizzying for the foreseeable future.

Conferences like PAPIs are going to become increasingly important to help us sit down and have conversations about what ML and AI APIs do, or do not do. I see machine learning, cognitive, artificial intelligence, and the other buzzwords everybody likes to use as just the algorithmic evolution of the API industry, where we move beyond just data and content APIs as the default, and having a robust toolbox of algorithmic resources to bake into all of our applications becomes standard operating procedure. I’m guessing we’ll see an increased presence of PAPIs conferences in cities around the globe, as well as waves of other ML and AI API-focused events popping up.


Algorithmia Invests More Resources Into Machine Learning APIs For Working With Video

I got my regular email from Algorithmia this last week and I like where they are going with some of their machine learning APIs. They have been heavily investing in machine learning applied to video, allowing for the extraction of information from video, as well as applying interesting transformations to your videos.

Here are some of the video tools they have been working on:

These are all things I’m interested in using as part of the drone and other video work I’ve been doing as a hobby. I’m interested in the video pipeline aspect because it’s fun to work with the video I capture, but I also see the potential when it comes to drones in agriculture and mining, and I am curious about the business models associated with this type of video pipeline. I think video and images, plus APIs, coupled with the API monetization strategy Algorithmia already has in place, is their formula for success.

I’m keeping an eye on what Amazon, Google, and Microsoft are up to, but I think Algorithmia has a first mover advantage when it comes to the economics of all of this. I’m glad they are investing more into their video resources. I think there are endless uses for API-driven pipelines that process images and video and apply machine learning models using APIs, with each call metered and made available via an algorithmic catalog like the one Algorithmia offers.
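
As a rough illustration of the kind of metered, API-driven video pipeline I am imagining, here is a short Python sketch that sends extracted frames to an ML API and tallies the billable calls. The endpoint URL, authentication scheme, and response format are all hypothetical placeholders, not any particular provider's API.

```python
import requests

ML_API_URL = "https://example.com/v1/video/label-frame"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"  # placeholder

def process_frames(frame_paths):
    """Send each extracted video frame to an ML API and tally the metered calls."""
    results, calls = [], 0
    for path in frame_paths:
        with open(path, "rb") as f:
            resp = requests.post(
                ML_API_URL,
                files={"image": f},
                headers={"Authorization": f"Bearer {API_KEY}"},
                timeout=30,
            )
        resp.raise_for_status()
        results.append(resp.json())
        calls += 1  # each call is a billable unit in a metered pricing model
    return results, calls

# frames extracted beforehand, e.g. with ffmpeg
labels, billable_calls = process_frames(["frame_0001.jpg", "frame_0002.jpg"])
print(f"{billable_calls} metered API calls made")
```

The metering is the interesting part to me: every frame that passes through the pipeline is a unit of value that can be priced, which is where Algorithmia's existing monetization strategy fits in.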


On Device Machine Learning API Stack

I was reading about Google’s TensorFlow Lite in TechCrunch, and their mention of Facebook’s Caffe2Go, and I was reminded of a conversation I was having with the Oxford Dictionaries API team a couple months ago.

The OED team, and other dictionary and language content API teams, wanted to learn more about on-device API deployment, so their dictionaries could become the default. I asked a while ago when we will have containers running natively on our routers, but I’d also like to add to that request: when will we have a stack of containers on-device where we can deploy API resources that can be used by applications, augmenting the existing on-device hardware and OS APIs?

API providers should be able to deploy their APIs exactly where they are needed. API deployment, management, monitoring, logging, and analytics should exist by default in these micro-containerized environments on any device. Whether it’s our mobile phones, our automobiles, or weather, solar, or other industrial device integrations, we are going to need API-driven data, ML, AI, augmented, and other resources on-device, in a localized environment.
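
Here is a minimal sketch, using nothing but the Python standard library, of what one of these on-device API resources might look like: a local HTTP service that applications on the device call instead of a remote API, with logging in place by default. The dictionary data and endpoint shape are purely illustrative assumptions on my part.

```python
import json
import logging
from http.server import BaseHTTPRequestHandler, HTTPServer

# Local log file stands in for the management/monitoring layer that should
# exist by default in an on-device API stack.
logging.basicConfig(filename="on_device_api.log", level=logging.INFO)

DICTIONARY = {"api": "application programming interface"}  # stand-in for a local dataset

class LocalAPIHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        word = self.path.lstrip("/")
        logging.info("lookup %s", word)  # logged locally, no network required
        body = json.dumps({"word": word, "definition": DICTIONARY.get(word)})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    # Bind to the loopback interface so the API is only reachable on-device.
    HTTPServer(("127.0.0.1", 8080), LocalAPIHandler).serve_forever()
```

Wrap something like this in a container, and you have a localized API resource that applications can depend on even when the device is offline.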


Key Factors Determining Who Succeeds In The API and ML Marketplace Game

I was having a discussion with an investor today about the potential of algorithmic-centered API marketplaces. I’m not talking about API marketplaces like Mashape; I’m talking more about ML API marketplaces like Algorithmia. This conversation spans multiple areas of my API lifecycle research, so I wanted to explore my thoughts on the subject some more.

I really do not get excited about API marketplaces when you think just about API discovery–how do I find an API? We need solutions in this area, but I feel good implementations will immediately move from useful to commodity, with companies like Amazon already pushing this towards a reality.

There are a handful of key factors for determining who ultimately wins the API Machine Learning (ML) marketplace game:

  • Always Modular - Everything has to be decoupled and deliver micro value. Vendors will be tempted to build in dependencies and emphasize relationships and partnerships, but the smaller and more modular will always win out.
  • Easy Multi-Cloud - Whatever is available in a marketplace has to be available on all major platforms. Even if the marketplace is AWS, each unit of compute has to be transferable to the Google or Azure clouds without ANY friction.
  • Enterprise Ready - The biggest failure of API marketplaces has always been being public. On-premise and private cloud API ML marketplaces will always be more successful than their public counterparts. The marketplace that caters to the enterprise will do well.
  • Financial Engine - The key to marketplaces is their financial engine. This is one area where AWS is way ahead of the game; their approach to monetizing digital bits and their sophisticated pricing calculators for estimating and predicting costs give them a significant advantage. Whichever marketplace allows for innovation at the financial engine level will win.
  • Definition Driven - Marketplaces of the future will have to be definition driven. Everything has to have a YAML or JSON definition, from the API interface and the schema defining inputs and outputs, to the pricing, licensing, TOS, and SLA. The technology, business, and politics of the marketplace need to be defined in a machine-readable way that can be measured, exchanged, and syndicated as needed (see the sketch after this list).
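
Here is a rough sketch, expressed as a Python dictionary, of the kind of machine-readable listing definition I am describing. The field names and values are illustrative assumptions on my part, not an existing specification.

```python
# Illustrative machine-readable marketplace listing -- field names are
# assumptions, not an existing specification.
listing_definition = {
    "interface": "openapi.yaml",  # pointer to the API definition
    "schema": {"input": "image/jpeg", "output": "application/json"},
    "pricing": {"model": "metered", "price_per_call_usd": 0.0001},
    "licensing": "CC-BY-4.0",
    "terms_of_service": "https://example.com/tos",
    "sla": {"uptime_percent": 99.9, "max_latency_ms": 500},
}
```

Once every listing is defined this way, the pricing, terms, and interfaces of a marketplace can be compared, exchanged, and syndicated programmatically, which is the point.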

Google has inroads into this realm with their G Suite and Play marketplaces, but it feels more fragmented than the Azure and AWS approaches. None of them are as far along as Algorithmia when it comes to specifically ML-focused APIs. In the coming months I will invest more time into mapping out what is available via these marketplaces, trying to better understand their contents, whether application, SaaS, data, content, or algorithmic API.

I feel like many marketplace conversations often get lost in the discovery layer. In my opinion, there are many other contributing factors beyond just finding things. I talked about the retail and wholesale economics of Algorithmia’s approach back in January, and I continue to think the economic engine will be one of the biggest factors in any API ML marketplace’s success: how it allows marketplace vendors to understand, experiment with, and scale the revenue part of things without giving up too big of a slice of the pie.

Beyond revenue, modularity and portability will be just as important as the financial engine, providing vital relief valves for some of the classic silo and walled garden effects we’ve seen impact the success of previous marketplace efforts. I’ll keep studying the approach of smaller providers like Algorithmia, as well as those of the cloud giants, and see where all of this goes. It is natural to default to AWS’s lead when it comes to the cloud, but I’m continually impressed with what I’m seeing out of Azure, and I feel that Google has a significant advantage when it comes to TensorFlow, as well as their overall public API experience. We will see.


If you think there is a link I should have listed here, feel free to tweet it at me, or submit it as a GitHub issue. Even though I do this full time, I'm still a one-person show, and I miss quite a bit, so I depend on my network to help me know what is going on.