Getting started with AYLIEN Text Analysis and R

As you may already know, we like to feature interesting and useful examples of what the AYLIEN community is building with our API. Previously, we’ve showcased a PowerShell wrapper and a Laravel wrapper. For this edition of our blog, we’re going to feature a data scientist who spent some time building an R binding for our Text Analysis API.

Arnab is a solutions architect and data analytics expert based in India. He recently developed an R binding for our Text Analysis API that we think is going to be very popular amongst our R users. This R binding makes it really quick and easy for the R community to get up and running with our API, and we’re glad he spent the time putting it together.


If you’re new to AYLIEN and Text Analysis, the first thing you’ll need to do is sign up for free access to the API. Take a look at our getting started page, which will take you through the signup process. We have a free plan available which allows you to make up to 1,000 calls to the API per day.

The second thing you need to do is install the packages the binding depends on from your R console.
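The exact package list isn’t shown here, but judging from the functions used later in this post (getURL, xmlToList, xpathSApply, ldply), you’ll likely need the RCurl, XML and plyr packages:

```r
# Likely dependencies, inferred from the functions used in this post:
# RCurl (getURL, curlEscape), XML (xmlToList, xmlParse, xpathSApply), plyr (ldply)
install.packages(c("RCurl", "XML", "plyr"))

library(RCurl)
library(XML)
library(plyr)
```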




Note: to view the results you must have XQuartz installed, which you can download here.


To show how easy it is to use the API with R, we’re going to run a simple analysis using the binding: we’ll take a news article, extract the main body of text from the page, and classify the article based on IPTC NewsCodes.

Code Snippet:

aylienAPI <- function(APPLICATION_ID, APPLICATION_KEY, endpoint, parameters, type) {
  # The API base URL was omitted in the original snippet; prepend it here
  url <- paste0('', endpoint)
  httpHeader <- c(Accept = "text/xml",
                  'X-AYLIEN-TextAPI-Application-ID' = APPLICATION_ID,
                  'X-AYLIEN-TextAPI-Application-Key' = APPLICATION_KEY,
                  'Content-Type' = "application/x-www-form-urlencoded; charset=UTF-8")
  # URL-encode the parameter (e.g. "url=http%3A%2F%2F...") before POSTing
  paramEncode <- paste0(type, "=", curlEscape(parameters))
  resp <- getURL(url, httpheader = httpHeader,
                 postfields = paramEncode, verbose = FALSE)
  resp
}


Don’t forget to add your AYLIEN credentials:
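For instance, with placeholder values (use the Application ID and Key from your own AYLIEN account):

```r
# Placeholders: replace with the credentials from your AYLIEN account
APPLICATION_ID  <- "YOUR_APPLICATION_ID"
APPLICATION_KEY <- "YOUR_APPLICATION_KEY"
```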


Arnab has made it really easy to call each endpoint: all you need to do is specify the endpoint in the code. To call the classification endpoint, for example, we simply use “classify”.

endpoint = "classify"
parameters = ""
type = "url"
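Putting these pieces together, a full call might look like the sketch below. This assumes the aylienAPI function and credential variables shown earlier; the article URL is a placeholder, not the one from the original analysis.

```r
# Hypothetical usage; requires valid credentials and a network connection
endpoint   <- "classify"
type       <- "url"
parameters <- "http://example.com/some-article"  # placeholder article URL

results <- aylienAPI(APPLICATION_ID, APPLICATION_KEY, endpoint, parameters, type)
```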


It’s up to you how you want to display your results, but the following command converts the output to a data frame and displays it nice and clearly in table format, as shown in the image below.

resultsdf <- ldply(xmlToList(results), data.frame)
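As a self-contained sketch of what that conversion does, here is the same pattern applied to a toy XML string loosely shaped like a classification response (the real response structure may differ):

```r
library(XML)
library(plyr)

# Toy XML, loosely shaped like a classify response (assumed structure)
xml <- "<results><category><label>Sport - Soccer</label><confidence>0.92</confidence></category></results>"

# xmlToList converts the XML into a nested list; ldply then flattens
# each top-level element into a row of a data frame
resultsdf <- ldply(xmlToList(xml), data.frame)
print(resultsdf)
```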

As you can see from the results, the API returned an accurate two-tier classification of “Sport – Soccer”.

You can also retrieve data from the XML result using XPath. Here PARSED is the response parsed into an XML document tree (e.g. PARSED <- xmlParse(results), using the XML package):

View(xpathSApply(PARSED, "//category", xmlValue))

If you have an app or a wrapper you’ve built with our APIs, we’d love to hear about it and feature it on our blog. Get in touch and tell us what you’re building.

Text Analysis API - Sign up