
Most of our users will make three or more calls to our API for every piece of text or URL they analyze. For example, if you’re a publisher who wants to extract insight from an article or URL, it’s likely you’ll want to use more than one of our features to get a proper understanding of that particular article or URL.

With this in mind, we decided to make it faster, easier and more efficient for our users to run multiple analysis operations in a single call to the API.

Our Combined Calls endpoint allows you to run more than one type of analysis on a piece of text or URL without having to call each endpoint separately.

  • Run multiple operations at once
  • Speed up your analysis process
  • Write cleaner, more efficient code

Combined Calls

To showcase how useful the Combined Calls endpoint can be, we’ve run through a typical process that many of our news- and media-focused users follow when analyzing URLs or articles on news sites.

In this case, we’re going to Classify the article in question and extract any Entities and Concepts present in the text. Running a process like this would typically involve passing the same URL to the API three times, once for each analysis operation, and then retrieving three separate results, one per operation. With Combined Calls, however, we make just one call to the API and retrieve one set of results, which is far more efficient and cleaner for the end user.
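For contrast, here is a sketch of the three separate calls a combined call replaces. Note this is illustrative only: `analyzeSeparately` is a hypothetical helper, and it assumes the client exposes per-endpoint methods (`entities`, `concepts`, `classify`) that each take a params object and a Node-style callback, as in the snippet below.

```javascript
// Sketch: the three per-endpoint calls that a single combined call replaces.
// `client` stands in for a configured SDK client; we assume (hypothetically)
// that each operation is exposed as client[op](params, callback).
function analyzeSeparately(client, url, done) {
  var operations = ["entities", "concepts", "classify"];
  var results = {};
  var remaining = operations.length;
  operations.forEach(function (op) {
    // One network round-trip per operation.
    client[op]({ url: url }, function (err, result) {
      if (err === null) {
        results[op] = result;
      }
      if (--remaining === 0) {
        done(results); // merged client-side once all three return
      }
    });
  });
}
```

Three round-trips and manual merging on the client, versus a single call and a single result set with the Combined Calls endpoint.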

Code Snippet:

var AYLIENTextAPI = require('aylien_textapi');
var textapi = new AYLIENTextAPI({
    application_id: "APP_ID",
    application_key: "APP_KEY"
});

textapi.combined({
    "url": "http://www.bbc.com/news/technology-33764155",
    "endpoint": ["entities", "concepts", "classify"]
}, function(err, result) {
  if (err === null) {
    console.log(JSON.stringify(result));
  } else {
    console.log(err);
  }
});

The code snippet above was written using our Node.js SDK. SDKs are available for a variety of languages on our SDKs page.

Results

We’ve broken the results below into three sections, Entities, Concepts and Classification, to help with readability, but using the Combined Calls endpoint all of these results are returned together.

Entities:

{
    "results": [
        {
            "endpoint": "entities",
            "result": {
                "entities": {
                    "keyword": [
                        "internet servers",
                        "flaw in the internet",
                        "internet users",
                        "server software",
                        "exploits of the flaw",
                        "internet",
                        "System (DNS) software",
                        "servers",
                        "flaw",
                        "expert",
                        "vulnerability",
                        "systems",
                        "software",
                        "exploits",
                        "users",
                        "websites",
                        "addresses",
                        "offline",
                        "URLs",
                        "services"
                    ],
                    "organization": [
                        "DNS",
                        "BBC"
                    ],
                    "person": [
                        "Daniel Cid",
                        "Brian Honan"
                    ]
                },
                "language": "en"
            }
        },

Concepts:

{
            "endpoint": "concepts",
            "result": {
                "concepts": {
                    "http://dbpedia.org/resource/Apache": {
                        "support": 3082,
                        "surfaceForms": [
                            {
                                "offset": 1261,
                                "score": 0.9726336488480631,
                                "string": "Apache"
                            }
                        ],
                        "types": [
                            "http://dbpedia.org/ontology/EthnicGroup"
                        ]
                    },
                    "http://dbpedia.org/resource/BBC": {
                        "support": 61289,
                        "surfaceForms": [
                            {
                                "offset": 1108,
                                "score": 0.9997923194235071,
                                "string": "BBC"
                            }
                        ],
                        "types": [
                            "http://dbpedia.org/ontology/Agent",
                            "http://schema.org/Organization",
                            "http://dbpedia.org/ontology/Organisation",
                            "http://dbpedia.org/ontology/Company"
                        ]
                    },
                    "http://dbpedia.org/resource/Denial-of-service_attack": {
                        "support": 503,
                        "surfaceForms": [
                            {
                                "offset": 264,
                                "score": 0.9999442627824017,
                                "string": "denial-of-service attacks"
                            }
                        ],
                        "types": [
                            ""
                        ]
                    },
                    "http://dbpedia.org/resource/Domain_Name_System": {
                        "support": 1279,
                        "surfaceForms": [
                            {
                                "offset": 442,
                                "score": 1,
                                "string": "Domain Name System"
                            },
                            {
                                "offset": 462,
                                "score": 0.9984593397878601,
                                "string": "DNS"
                            }
                        ],
                        "types": [
                            ""
                        ]
                    },
                    "http://dbpedia.org/resource/Hacker_(computer_security)": {
                        "support": 1436,
                        "surfaceForms": [
                            {
                                "offset": 0,
                                "score": 0.7808308562314218,
                                "string": "Hackers"
                            },
                            {
                                "offset": 246,
                                "score": 0.9326746054676964,
                                "string": "hackers"
                            }
                        ],
                        "types": [
                            ""
                        ]
                    },
                    "http://dbpedia.org/resource/Indian_School_Certificate": {
                        "support": 161,
                        "surfaceForms": [
                            {
                                "offset": 794,
                                "score": 0.7811847159512098,
                                "string": "ISC"
                            }
                        ],
                        "types": [
                            ""
                        ]
                    },
                    "http://dbpedia.org/resource/Internet_Systems_Consortium": {
                        "support": 35,
                        "surfaceForms": [
                            {
                                "offset": 765,
                                "score": 1,
                                "string": "Internet Systems Consortium"
                            }
                        ],
                        "types": [
                            "http://dbpedia.org/ontology/Agent",
                            "http://schema.org/Organization",
                            "http://dbpedia.org/ontology/Organisation",
                            "http://dbpedia.org/ontology/Non-ProfitOrganisation"
                        ]
                    },
                    "http://dbpedia.org/resource/OpenSSL": {
                        "support": 105,
                        "surfaceForms": [
                            {
                                "offset": 1269,
                                "score": 1,
                                "string": "OpenSSL"
                            }
                        ],
                        "types": [
                            "http://schema.org/CreativeWork",
                            "http://dbpedia.org/ontology/Work",
                            "http://dbpedia.org/ontology/Software"
                        ]
                    }
                },
                "language": "en"
            }
        },

Classification:

{
    "endpoint": "classify",
    "result": {
        "categories": [
            {
                "code": "04003005",
                "confidence": 1,
                "label": "computing and information technology - software"
            }
        ],
        "language": "en"
    }
}
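Since everything comes back in a single `results` array, splitting it back out per operation is a short loop. A minimal sketch (the `byEndpoint` helper is hypothetical, assuming the response shape shown above):

```javascript
// Index a combined response by endpoint name, assuming the response
// has the { results: [{ endpoint, result }, ...] } shape shown above.
function byEndpoint(response) {
  var indexed = {};
  response.results.forEach(function (item) {
    indexed[item.endpoint] = item.result;
  });
  return indexed;
}

// Example with a response shaped like the excerpts above:
var combined = {
  results: [
    { endpoint: "classify", result: { categories: [{ code: "04003005" }] } },
    { endpoint: "entities", result: { entities: { organization: ["BBC"] } } }
  ]
};
var indexed = byEndpoint(combined);
console.log(indexed.classify.categories[0].code); // "04003005"
```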

You can find more information on using Combined Calls in our Text Analysis Documentation.

Note that the existing rate limits also apply when using Combined Calls. You can read more about our rate limits here.
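If you’re batching many combined calls, it can help to space them out on the client side so you stay under your limit. A minimal sketch (the limiter and the 60-calls-per-minute figure are hypothetical, not our actual limits):

```javascript
// Hypothetical client-side limiter: returns how many milliseconds to wait
// before issuing the next call so calls stay evenly spaced.
function makeLimiter(maxPerMinute) {
  var interval = 60000 / maxPerMinute; // ms between calls
  var next = 0;                        // earliest time the next call may go out
  return function () {
    var now = Date.now();
    var wait = Math.max(0, next - now);
    next = Math.max(now, next) + interval;
    return wait;
  };
}

var delayMs = makeLimiter(60); // illustrative figure only
// e.g. setTimeout(makeCombinedCall, delayMs()) before each API call
```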






If you’re new to AYLIEN and don’t yet have an account, you can take a look at our blog on getting started with the API, or go directly to the getting started page on our website, which will take you through the signup process. We provide a free plan which allows you to make up to 1,000 calls per day to the API, free forever.

Downloading and Installing the PHP SDK

All of our SDK repositories are hosted on Github. To use the SDK, start by adding the following to your composer.json.


{
  "require": {
    "aylien/textapi": "0.1.*"
  }
}

Once you’ve installed the SDK you’re ready to start coding! For the remainder of this blog we’ll walk you through making calls using the PHP SDK and show the output you should receive in each case while showcasing a few simple features of the API.

Configuring the SDK with your AYLIEN credentials

Once you’ve subscribed to our API and have downloaded the SDK you can start making calls by adding the following PHP code.


require __DIR__ . "/vendor/autoload.php";
$textapi = new AYLIENTextAPI("YourApplicationId", "YourApplicationKey");

When calling the API you can pass a piece of text directly to the API for analysis or you can pass a URL and we will automatically extract the main piece of text on that webpage.

Language Detection

Let’s take a look at the Language Detection endpoint. The Language Detection endpoint is quite straightforward: it will automatically tell you what language a piece of text is written in. In this example we’ll detect the language of the following sentence: ‘What language is this sentence written in?’

You can call the endpoint using the following piece of code.


$text = "What language is this sentence written in?";
$language = $textapi->Language(array("text" => $text));
echo sprintf("Text: %s <br/>", $language->text);
echo sprintf("Language: %s <br/>", $language->lang);
echo sprintf("Confidence: %F <br/>", $language->confidence);

You should receive an output similar to the one shown below, which shows that the language detected was English and that the confidence it was detected correctly (a number between 0 and 1) is very close to 1, indicating you can be pretty sure it is correct.

Language Detection Results


Text: What language is this sentence written in?
Language: en
Confidence: 0.999997

Sentiment Analysis

Next, we’ll look at analyzing the sentence “John is a very good football player” to determine its sentiment, i.e. positive, neutral or negative. The endpoint will also determine whether the text is subjective or objective. You can call the endpoint with the following piece of code.


$text = "John is a very good football player!";
$sentiment = $textapi->Sentiment(array("text" => $text));
echo sprintf(" <br/>Text: %s <br/>", $sentiment->text);
echo sprintf("Sentiment Polarity: %s <br/>", $sentiment->polarity);
echo sprintf("Polarity Confidence: %F <br/>", $sentiment->polarity_confidence);

You should receive an output similar to the one shown below, which indicates that the sentence is positive with a high degree of confidence.

Sentiment Analysis Results


Text: John is a very good football player!
Sentiment Polarity: positive
Polarity Confidence: 0.999999

Article Classification

Now we’re going to take a look at our Classification endpoint. The Classification endpoint automatically assigns an article or piece of text to one or more categories, making it easier to manage and sort. The classification is based on IPTC International Subject News Codes and can identify up to 500 categories. The code below analyses a BBC news article about how animals evolved on earth.


$url = "http://www.bbc.com/earth/story/20150112-did-snowball-earth-make-animals";
echo sprintf("<br/>Classification:<br/>");
$classify = $textapi->Classify(array("url" => $url));
echo sprintf("URL: %s", $url);
foreach($classify->categories as $val) {
  echo sprintf("<br/>Label        :   %s     ", $val->label);
  echo sprintf("<br/>IPTC code    :   %s     ", $val->code);
  echo sprintf("<br/>Confidence   :   %F     ", $val->confidence);
  }

When you run this code you should receive an output similar to that shown below which assigns the article an IPTC label of “natural science – geology” with an IPTC code of 13004001.

Article Classification Results


Classification:
URL: http://www.bbc.com/earth/story/20150112-did-snowball-earth-make-animals
Label : natural science - geology
IPTC code : 13004001
Confidence : 1.000000

Hashtag Analysis

Next, we’ll look at analyzing the same BBC article to extract hashtag suggestions for sharing the article on social media with the following code.


echo sprintf("<br/><br/>Hashtags:<br/>");
echo sprintf("URL: %s", $url);
$hashtags = $textapi->Hashtags(array("url" => $url));
foreach($hashtags->hashtags as $val) {
  echo sprintf(" <br/> %s", $val );
}

You should receive the output shown below.

Hashtag Suggestion Results


Hashtags:
URL: http://www.bbc.com/earth/story/20150112-did-snowball-earth-make-animals
#SnowballEarth
#Oxygen
#Earth
#Evolution
#CarbonDioxide
#Glacier
#GlacialPeriod
#OperationDeepFreeze
#CambrianExplosion ....

If you’re more of a Node or Java fan, check out the rest of our SDKs for Node.js, Go, Ruby, Python, Java and .Net (C#). For more information regarding the APIs, go to the documentation section of our website.

We’re publishing ‘getting started’ blogs for the remaining SDKs over the coming weeks so keep an eye out for them.






This is the fourth in our series of blogs on getting started with our various SDKs. Depending on what your language preference is, we have SDKs available for Node.js, Python, Ruby, PHP, GO, Java and .Net (C#). Last week’s blog focused on using our Go SDK. This week we’ll focus on getting up and running with Ruby.

If you are new to our API and you don’t have an account, you can go directly to the Getting Started page on our website, which will take you through the signup process. You can sign up for a free plan to get started, which allows you to make up to 1,000 calls per day to the API for free.

Downloading and Installing the Ruby SDK

All of our SDK repositories are hosted on Github; you can get the Ruby repository here. The simplest way to install the repository is by using “gem”.

$ gem install aylien_text_api

Configuring the SDK with your AYLIEN credentials

Once you have received your AYLIEN APP_ID and APP_KEY from the signup process and have downloaded the SDK you can start making calls by passing your configuration parameters as a block to AylienTextApi.configure.

require 'aylien_text_api'

AylienTextApi.configure do |config|
  config.app_id        =    "YOUR_APP_ID"
  config.app_key       =    "YOUR_APP_KEY"
end
client = AylienTextApi::Client.new

Alternatively, you can pass them as parameters to AylienTextApi::Client class.

require 'aylien_text_api'

client = AylienTextApi::Client.new(app_id: "YOUR APP ID", app_key: "YOUR APP KEY")

When calling the various API endpoints you can specify a piece of text directly for analysis or you can pass a url linking to the text or article you wish to analyze.

Language Detection

First let’s take a look at the language detection endpoint. Specifically, we will detect the language of the sentence “What language is this sentence written in?”

You can call the endpoint using the following piece of code.

text = "What language is this sentence written in?"
language = client.language text: text
puts "Text: #{language[:text]}"
puts "Language: #{language[:lang]}"
puts "Confidence: #{language[:confidence]}"

You should receive an output very similar to the one shown below, which shows that the language detected was English and that the confidence it was detected correctly (a number between 0 and 1) is very close to 1, indicating you can be pretty sure it is correct.

Language Detection Results

Text: What language is this sentence written in?
Language: en
Confidence: 0.9999962069593649

Sentiment Analysis

Next we will look at analyzing the sentence “John is a very good football player” to determine its sentiment, i.e. positive, neutral or negative. The endpoint will also determine whether the text is subjective or objective. You can call the endpoint with the following piece of code.

text = "John is a very good football player!"
sentiment = client.sentiment text: text
puts "Text            :  #{sentiment[:text]}"
puts "Sentiment Polarity  :  #{sentiment[:polarity]}"
puts "Polarity Confidence  :  #{sentiment[:polarity_confidence]}"
puts "Subjectivity  :  #{sentiment[:subjectivity]}"
puts "Subjectivity Confidence  :  #{sentiment[:subjectivity_confidence]}"

You should receive an output similar to the one shown below which indicates that the sentence is objective and is positive, both with a high degree of confidence.

Sentiment Analysis Results

Text            :  John is a very good football player!
Sentiment Polarity  :  positive
Polarity Confidence  :  0.9999988272764874
Subjectivity  :  objective
Subjectivity Confidence  :  0.9896821594138254

Article Classification

Next we will take a look at the classification endpoint. The Classification endpoint automatically assigns an article or piece of text to one or more categories, making it easier to manage and sort. The classification is based on IPTC International Subject News Codes and can identify up to 500 categories. The code below analyses a BBC news article about mega storms on the planet Uranus.

url = "http://www.bbc.com/earth/story/20150121-mega-storms-sweep-uranus"
classify = client.classify url: url
classify[:categories].each do |cat|
	puts "Label : #{cat[:label]}"
	puts "Code : #{cat[:code]}"
	puts "Confidence : #{cat[:confidence]}"
end

When you run this code you should receive an output similar to that shown below which assigns the article an IPTC label of “natural science – astronomy” with an IPTC code of 13004007.

Article Classification Results

Label : natural science - astronomy
Code : 13004007
Confidence : 1.0

Hashtag Analysis

Next we will look at analyzing the same BBC article to extract hashtag suggestions for sharing the article on social media with the following code.

url = "http://www.bbc.com/earth/story/20150121-mega-storms-sweep-uranus"
hashtags = client.hashtags url: url
hashtags[:hashtags].each do |str|
	puts str
end

You should receive a list of suggested hashtags for the article as output.

If Ruby isn’t your preferred language then check out our SDKs for node.js, Go, PHP, Python, Java and .Net (C#). For more information regarding the APIs go to the documentation section of our website.

We will be publishing ‘getting started’ blogs for the remaining languages over the coming weeks so keep an eye out for them. If you haven’t already done so you can get free access to our API on our sign up page.






This is the fourth edition in our series of blogs on getting started with AYLIEN’s various SDKs. There are SDKs available for Node.js, Python, Ruby, PHP, GO, Java and .Net (C#). For this week’s instalment we’re going to focus on C#.

If you are new to AYLIEN and Text Analysis and you do not have an account yet, you can take a look at our blog on how to get started with the API or alternatively you can go directly to our Getting Started page which will take you through the signup process. We provide a free plan to get started which allows users to make up to 1,000 calls per day to the API for free.

Downloading and Installing the C# SDK

All of our SDK repositories are hosted on Github. You can find the C# repository here. The simplest way to install the repository is with “nuget package manager”. Simply type the following from a command line tool.


nuget install Aylien.TextApi

Alternatively, from Visual Studio under the “Project” Menu choose “Manage Nuget Packages” and search for the AYLIEN package under online packages.

Once you have installed the SDK you’re ready to start coding. The Sandbox area of the website has a number of sample applications in node.js which help demonstrate what the APIs can do. In the remainder of this blog we will walk you through making calls using the C# SDK and show the output you should receive in each case.

Configuring the SDK with your AYLIEN credentials

Once you have received your AYLIEN APP_ID and APP_KEY from the signup process and you have downloaded the SDK, you can start making calls by adding the AYLIEN namespace to your C# code.


using Aylien.TextApi;
using System;

And initialising a client with your AYLIEN credentials


Client client = new Client(
                "YOUR_APP_ID", "YOUR_APP_KEY");

When calling the various API endpoints you can specify whether you want to analyze a piece of text directly or a URL linking to the text or article you wish to analyze.

Language Detection

First let’s take a look at the language detection endpoint by analyzing the following sentence: ‘What language is this sentence written in?’

You can call this endpoint using the following piece of code.


Language language = client.Language(text: "What language is this sentence written in?");
Console.WriteLine("Text: {0}", language.Text);
Console.WriteLine("Language: {0}", language.Lang);
Console.WriteLine("Confidence: {0}", language.Confidence);

You should receive an output very similar to the one shown below which shows the language detected as English and a confidence score. The confidence score is very close to 1, so, you can be pretty sure it’s correct.

Language Detection Results


Text: What language is this sentence written in?
Language: en
Confidence: 0.9999982

Sentiment Analysis

Next, we’ll look at analyzing the sentence “John is a very good football player” to determine its sentiment, i.e. whether it’s positive, neutral or negative. The endpoint will also determine whether the text is subjective or objective. You can call the endpoint with the following piece of code.


Sentiment sentiment = client.Sentiment(text: "John is a very good football player!");
Console.WriteLine("Text: {0}", sentiment.Text);
Console.WriteLine("Sentiment Polarity  : {0}", sentiment.Polarity);
Console.WriteLine("Polarity Confidence  : {0}", sentiment.PolarityConfidence);
Console.WriteLine("Subjectivity  : {0}", sentiment.Subjectivity);
Console.WriteLine("Subjectivity Confidence  : {0}", sentiment.SubjectivityConfidence);

You should receive an output similar to the one shown below. This indicates that the sentence is objective and is positive, both with a high degree of confidence.

Sentiment Analysis Results


Text: John is a very good football player!
Sentiment Polarity  : positive
Polarity Confidence  : 0.999998827276487
Subjectivity  : objective
Subjectivity Confidence  : 0.989682159413825

Article Classification

We’re now going to take a look at the Classification endpoint. The Classification endpoint automatically assigns an article or piece of text to one or more categories making it easier to manage and sort. Our classification is based on IPTC International Subject News Codes and can identify up to 500 categories. The code below analyses a BBC news article about scientists who have managed to slow down the speed of light.


Classify classify = client.Classify(url: "http://www.bbc.com/news/uk-scotland-glasgow-west-30944584");
Console.Write("\nClassification: \n");
foreach(var item in classify.Categories)
{
    Console.WriteLine("Label        :   {0}     ", item.Label.ToString());
    Console.WriteLine("IPTC code    :   {0}     ", item.Code.ToString());
    Console.WriteLine("Confidence   :   {0}     ", item.Confidence.ToString());
}

When you run this code you should receive an output similar to that shown below which assigns the article an IPTC label of “applied science – particle physics” with an IPTC code of 13001004.

Article Classification Results


Classification:
Label        :   applied science - particle physics
IPTC code    :   13001004
Confidence   :   0.9877892

Hashtag Analysis

Next, we’ll analyze the same BBC article and extract hashtag suggestions for sharing the article on social media.


Hashtags hashtags = client.Hashtags(url: "http://www.bbc.com/news/uk-scotland-glasgow-west-30944584");
Console.Write("\nHashtags: \n");
foreach(var item in hashtags.HashtagsMember)
{
    Console.WriteLine(item.ToString());
}

You should receive the output shown below.

Hashtag Suggestion Results


Hashtags:
#Glasgow
#HeriotWattUniversity
#Scotland
#Moon
#QuantumRealm
#LiquidCrystal
#Tie
#Bicycle
#Wave-particleDuality
#Earth
#Physics

Check out our SDKs for node.js, Go, PHP, Python, Java and Ruby if C# isn’t your preferred language. For more information regarding the APIs go to the documentation section of our website.


Last week’s getting started blog focused on the Python SDK. This week we’re going to focus on using the API with Go. This is the third in our series of blogs on getting started with AYLIEN’s various SDKs. You can access all our SDK repositories on Github.

If you are new to our API and Text Analysis in general and don’t have an account, you can go directly to the Getting Started page on the website, which will take you through how to open an account. You can choose a free plan to get started, which allows you to make up to 1,000 calls per day to the API for free.

Downloading and Installing the Go SDK

The simplest way to install the repository is with “go get”. Simply type the following from a command line tool.


$ go get github.com/AYLIEN/aylien_textapi_go

Utilizing the SDK with your AYLIEN credentials

Once you’ve subscribed to our API and have downloaded the SDK you can start making calls by adding the following code to your go program.


import (
    "fmt"
    textapi "github.com/AYLIEN/aylien_textapi_go"
)

auth := textapi.Auth{"YOUR_APP_ID", "YOUR_APP_KEY"}
client, err := textapi.NewClient(auth, true)
if err != nil {
    panic(err)
}

When calling the API you can specify whether you wish to analyze a piece of text directly or a URL linking to the text or article you wish to analyze.

Language Detection

We’re going to first showcase the Language Detection endpoint by analyzing the sentence “What language is this sentence written in?” using the following piece of code.


languageParams := &textapi.LanguageParams{Text: "What language is this sentence written in?"}
lang, err := client.Language(languageParams)
if err != nil {
    panic(err)
}
fmt.Printf("\nLanguage Detection Results\n")
fmt.Printf("Text            :   %s\n", lang.Text)
fmt.Printf("Language        :   %s\n", lang.Language)
fmt.Printf("Confidence      :   %f\n\n", lang.Confidence)

You should receive an output very similar to the one shown below, which shows that the language detected was English and that the confidence it was detected correctly (a number between 0 and 1) is very close to 1, meaning you can be pretty sure it is correct.

Language Detection Results


Text            :   What language is this sentence written in?
Language        :   en
Confidence      :   0.999997

Sentiment Analysis

Next we’ll look at analyzing the following short piece of text “John is a very good football player” to determine its sentiment, i.e. whether it’s positive, neutral or negative.


sentimentParams := &textapi.SentimentParams{Text: "John is a very good football player!"}
sentiment, err := client.Sentiment(sentimentParams)
if err != nil {
    panic(err)
}
fmt.Printf("Sentiment Analysis Results\n")
fmt.Printf("Text            :   %s\n", sentiment.Text)
fmt.Printf("Sentiment Polarity  :   %s\n", sentiment.Polarity)
fmt.Printf("Polarity Confidence  :   %f\n", sentiment.PolarityConfidence)
fmt.Printf("Subjectivity  : %s\n", sentiment.Subjectivity)
fmt.Printf("Subjectivity Confidence  :   %f\n\n", sentiment.SubjectivityConfidence)

You should receive an output similar to the one shown below which indicates that the sentence is objective and is positive, both with a high degree of confidence.

Sentiment Analysis Results


Text            :   John is a very good football player!
Sentiment Polarity  :   positive
Polarity Confidence  :   0.999999
Subjectivity  : objective
Subjectivity Confidence  :   0.989682

Article Classification

AYLIEN’s Classification endpoint automatically assigns an article or piece of text to one or more categories, making it easier to manage and sort. The classification is based on IPTC International Subject News Codes and can identify up to 500 categories. The code below analyses a BBC news article about a one ton pumpkin ;).


classifyParams := &textapi.ClassifyParams{URL: "http://www.bbc.com/earth/story/20150114-the-biggest-fruit-in-the-world"}
class, err := client.Classify(classifyParams)
if err != nil {
    panic(err)
}
fmt.Printf("Classification Analysis Results\n")
for _, v := range class.Categories {
    fmt.Printf("Classification Label        :   %s\n", v.Label)
    fmt.Printf("Classification Code         :   %s\n", v.Code)
    fmt.Printf("Classification Confidence   :   %f\n\n", v.Confidence)
}

When you run this code you should receive an output similar to that shown below which assigns the article an IPTC label of “natural science – biology” with an IPTC code of 13004008.

Classification Results


Classification Label        :   natural science - biology
Classification Code         :   13004008
Classification Confidence   :   0.929754

Hashtag Suggestion

Next, we’ll have a look at analyzing the same BBC article and extracting hashtag suggestions for it.


hashtagsParams := &textapi.HashtagsParams{URL: "http://www.bbc.com/earth/story/20150114-the-biggest-fruit-in-the-world"}
hashtags, err := client.Hashtags(hashtagsParams)
if err != nil {
    panic(err)
}
fmt.Printf("Hashtag Suggestion Results\n")
for _, v := range hashtags.Hashtags {
    fmt.Printf("%s\n", v)
}

You should receive an output similar to the one below.

Hashtags


Hashtag Suggestion Results
#Carbon
#Sugar
#Squash
#Agriculture
#Juicer
#BBCEarth
#TopsfieldMassachusetts
#ArnoldArboretum
#AtlanticGiant
#HarvardUniversity
#Massachusetts

If Go isn’t your weapon of choice then check out our SDKs for node.js, Ruby, PHP, Python, Java and .Net (C#). For more information regarding our API go to the documentation section of our website.

We will be publishing ‘getting started’ blogs for the remaining languages over the coming weeks so keep an eye out for them.


This is the second in our series of blogs on getting started with AYLIEN’s various SDKs. There are SDKs available for Node.js, Python, Ruby, PHP, Go, Java and .Net (C#). Last week’s blog focused on the Node.js SDK. This week we will focus on Python.

If you are new to AYLIEN and don’t have an account you can take a look at our blog on getting started with the API or alternatively you can go directly to the Getting Started page on the website which will take you through the signup process. We have a free plan available which allows you to make up to 1,000 calls to the API per day for free.

Downloading and Installing the Python SDK

All of our SDK repositories are hosted on GitHub. You can find the Python repository here. The simplest way to install the repository is with the Python package installer, pip. Simply run the following from a command line tool.


$ pip install --upgrade aylien-apiclient

The following libraries will be installed when you install the client library:

  • httplib2

    Once you have installed the SDK you’re ready to start coding! The Sandbox area of the website has a number of sample applications in node.js which help to demonstrate what the API can do. For the remainder of this blog we will walk you through making calls to three of the API endpoints using the Python SDK.

    Utilizing the SDK with your AYLIEN credentials

    Once you have received your AYLIEN credentials and have downloaded the SDK you can start making calls by adding the following code to your python script.

    
    from aylienapiclient import textapi
    c = textapi.Client("YourApplicationID", "YourApplicationKey")
    

    When calling the various endpoints you can specify a piece of text directly for analysis or you can pass a URL linking to the text or article you wish to analyze.
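To illustrate the two input styles, here is a small helper of our own (it is not part of the SDK) that builds the right parameter dict depending on whether the input is a URL or raw text:

```python
# Hypothetical helper (not part of the aylienapiclient SDK): pick the
# parameter key for an endpoint call based on the kind of input.
def make_params(source):
    """Return {'url': ...} for web addresses, {'text': ...} otherwise."""
    if source.startswith(("http://", "https://")):
        return {"url": source}
    return {"text": source}

print(make_params("What language is this sentence written in?"))
print(make_params("http://www.bbc.com/news/science-environment-30747971"))
```

The resulting dict can be passed straight to any of the endpoint methods used below, e.g. `c.Language(make_params(...))`.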

    Language Detection

    First let’s take a look at the language detection endpoint. We’re going to detect the language of the sentence “What language is this sentence written in?”

    You can do this by simply running the following piece of code:

    
    language = c.Language({'text': 'What Language is this sentence written in?'})
    print("Language Detection Results:\n\n")
    print("Language: %s\n" % (language["lang"]))
    print("Confidence: %f\n" % (language["confidence"]))
    

    You should receive an output very similar to the one shown below, which shows that the language detected was English. The confidence that it was detected correctly (a number between 0 and 1) is very close to 1, indicating that you can be pretty sure the detection is correct.

    Language Detection Results:

    
    Language	: en
    Confidence	: 0.9999984486883192
    
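Since the confidence score runs from 0 to 1, a common follow-up is to gate downstream processing on a minimum score. A minimal sketch (the helper and the 0.95 threshold are our own choices for illustration, not part of the SDK):

```python
# Hypothetical helper: accept a detection only when the API's
# confidence score clears a chosen threshold.
def is_reliable(confidence, threshold=0.95):
    """Return True when the confidence score clears the threshold."""
    return confidence >= threshold

print(is_reliable(0.9999984486883192))  # the score shown above -> True
print(is_reliable(0.60))                # an ambiguous detection -> False
```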

    Sentiment Analysis

    Next we will look at analyzing the sentence “John is a very good football player!” to determine its sentiment, i.e. whether it is positive, neutral or negative. The API will also determine if the text is subjective or objective. You can call the endpoint with the following piece of code:

    
    sentiment = c.Sentiment({'text': 'John is a very good football player!'})
    print("\n\nSentiment Analysis Results:\n\n")
    print("Polarity: %s\n" % sentiment["polarity"])
    print("Polarity Confidence: %s\n" % (sentiment["polarity_confidence"]))
    print("Subjectivity: %s\n" % sentiment["subjectivity"])
    print("Subjectivity Confidence: %s\n" % (sentiment["subjectivity_confidence"]))
    

    You should receive an output similar to the one shown below which indicates that the sentence is positive and objective, both with a high degree of confidence.

    Sentiment Analysis Results:

    
    Polarity: positive
    Polarity Confidence: 0.9999988272764874
    Subjectivity: objective
    Subjectivity Confidence: 0.9896821594138254
    

    Article Classification

    Next we will take a look at the classification endpoint. The Classification Endpoint automatically assigns an article or piece of text to one or more categories making it easier to manage and sort. The classification is based on IPTC International Subject News Codes and can identify up to 500 categories. The code below analyzes a BBC news article about the first known picture of an oceanic shark giving birth.

    
    category = c.Classify({'url': 'http://www.bbc.com/news/science-environment-30747971'})
    print("\nArticle Classification Results:\n\n")
    print("Label\t: %s\n" % category["categories"][0]["label"])
    print("Code\t: %s\n" % category["categories"][0]["code"])
    print("Confidence\t: %s\n" % category["categories"][0]["confidence"])
    
    

    When you run this code you should receive an output similar to that shown below, which assigns the article an IPTC label of “science and technology – animal science” with an IPTC code of 13012000.

    Article Classification Results:

    
    Label   : science and technology - animal science
    Code    : 13012000
    Confidence      : 0.9999999999824132
    

    If Python is not your preferred language then check out our SDKs for Node.js, Ruby, PHP, Go, Java and .Net (C#). For more information regarding the APIs go to the documentation section of our website.

    We will be publishing ‘getting started’ blogs for the remaining languages over the coming weeks so keep an eye out for them. If you haven’t already done so, you can get free access to our API on our sign up page.

    Happy Hacking!


    If you’re a regular reader of our blog, you will have heard us mention Time to First Hello World (TTFHW) quite a bit. It’s all part of our focus on Developer Experience and our efforts to make our APIs as easy as possible to use and integrate with.

    In line with this initiative, in late 2014 we launched Software Development Kits for Ruby, PHP, Node.js and Python and we promised to add more SDKs for other popular languages early in the new year. The idea behind the SDKs was to make it as easy as possible for our users to get up and running with the API and making calls as quickly as possible.

    We’re happy to announce that AYLIEN Text Analysis SDKs are now available for Java, C# and Go. We know there were a few of our users waiting on the Java SDK, so we’re particularly happy to now offer the Java SDK among others. You can download them directly from our AYLIEN GitHub repository below.

    If you have requests for features or improvements to our API, or our other product offerings, make sure you let us know about them. Also, if you haven’t played with it yet, check out our Developer Sandbox. It’s a Text Analysis playground for developers. A place you can go to test the API, fiddle with ideas and build the foundations of your Text Analysis service. Happy Hacking!

     


    We recently developed and released SDKs for Node.js, Python, Ruby and PHP, with Java, Go and C# coming out next week. This is the first blog in a series on using AYLIEN’s various Software Development Kits (SDKs), and for today’s blog we are going to focus on using the Node.js SDK.

    If you are new to AYLIEN and do not yet have an account you can take a look at our blog on getting started with the API or alternatively you can go directly to the Getting Started page on the website which will take you through the signup process. We have a free plan to get you started which allows you to make up to 1,000 calls per day to the API for free.

    Downloading and Installing the Node.js SDK

    All of our SDK repositories are hosted on GitHub. The simplest way to install the repository is with the node package manager “npm”, by typing the following from the command line:

    $ npm install aylien_textapi

    Once you have installed the SDK you are ready to start coding! The Sandbox area of the website has a number of sample applications available which you can use to get things moving.

    For this guide in particular we are going to walk you through some of the basic functionality of the API incorporating the “Basic Functions” sample app from the Sandbox. We’ll illustrate making a call to three of the endpoints individually and interpret the output that you should receive in each case.

    Accessing the SDK with your AYLIEN credentials

    Once you have received your AYLIEN credentials from the signup process and have downloaded the SDK you can begin making calls by adding the following code to your node.js application.

    var AYLIENTextAPI = require('aylien_textapi');
    var textapi = new AYLIENTextAPI({
      application_id:"YOUR_APP_ID",
      application_key: "YOUR_APP_KEY"
    });

    When calling the various endpoints you can specify whether your input is a piece of text or you can pass a URL linking to the text or article you wish to analyze.

    Language Detection

    First let’s take a look at the language detection endpoint. The Language Detection endpoint is pretty straightforward: it is used to detect the language of a sentence. In this case we are analyzing the following sentence: “What language is this sentence written in?”

    You can call the endpoint using the following piece of code.

    textapi.language('What language is this sentence written in?', function(err, resp) {
      console.log("\nLanguage Detection Results:\n");
      if (err !== null) {
        console.log("Error: " + err);
      } else {
        console.log("Language\t:", resp.lang);
        console.log("Confidence\t:", resp.confidence);
      }
    });
    

    You should receive an output very similar to the one below which shows that the language detected was English. It also shows a confidence score (a number between 0 and 1) of close to 1, indicating that you can be pretty sure English is correct.

    Language Detection Results:
    
    Language	: en
    Confidence	: 0.9999984486883192

    Sentiment Analysis

    Next we’ll look at analyzing the sentence “John is a very good football player!” to determine its sentiment, i.e. whether it is positive, neutral or negative. The endpoint will also determine if the text is subjective or objective. You can call the endpoint with the following piece of code.

    textapi.sentiment('John is a very good football player!', function(err, resp) {
      console.log("\nSentiment Analysis Results:\n");
      if (err !== null) {
        console.log("Error: " + err);
      } else {
        console.log("Subjectivity\t:", resp.subjectivity);
        console.log("Subjectivity Confidence\t:", resp.subjectivity_confidence);
        console.log("Polarity\t:", resp.polarity);
        console.log("Polarity Confidence\t:", resp.polarity_confidence);
      }
    });

    You should receive an output similar to the one shown below which indicates that the sentence is objective and is positive, both with a high degree of confidence.

    Sentiment Analysis Results:
    
    Subjectivity	: objective
    Subjectivity Confidence	: 0.9896821594138254
    Polarity	: positive
    Polarity Confidence	: 0.9999988272764874

    Article Classification

    Next we will take a look at the classification endpoint. The Classification endpoint automatically assigns an article or piece of text to one or more categories making it easier to manage and sort. The classification is based on IPTC International Subject News Codes and can identify up to 500 categories. The code below analyzes a BBC news article about the Philae Lander which found organic molecules on the surface of a comet.

    textapi.classify('http://www.bbc.com/news/science-environment-30097648', function(err, resp) {
      console.log("\nArticle Classification Results:\n");
      if (err !== null) {
        console.log("Error: " + err);
      } else {
        for (var i = 0; i < resp.categories.length; i++) {
          console.log("Label\t:", resp.categories[i].label);
          console.log("Code\t:", resp.categories[i].code);
          console.log("Confidence\t:", resp.categories[i].confidence);
        }
      }
    });
    

    When you run this code you should receive an output similar to that shown below, which assigns the article an IPTC label of “science and technology – space programme” with an IPTC code of 13008000.

    Article Classification Results:
    
    Label	: science and technology - space programme
    Code	: 13008000
    Confidence	: 0.9999999983009931

    Now that you have seen how simple it is to access the power of Text Analysis through the SDK, jump over to our Sandbox and start playing with the sample apps. If Node.js is not your preferred language then check out our SDKs for Python, Ruby and PHP on our website. We will be publishing ‘getting started’ blogs for each of these languages over the coming weeks, so keep an eye out for those too. If you haven’t already done so, you can get free access to our API on our sign up page.

     


    We recently spoke about TTFHW (Time to First Hello World) in a previous blog post and how important developer experience is to us. Ensuring our API is easy to use, quick to get started with and flexible is a major priority for us at AYLIEN. With this in mind, we recently launched SDKs in various languages and today we are happy to announce that our developer Sandbox is now live on our developer portal. The idea behind the Sandbox is to provide a simple environment for our API users, making it quicker and easier for developers to play with, test and build upon our API.

    The Sandbox is a development and test environment that allows you to start making calls to the API as quickly and as easily as possible. It consists of an editor and output window that lets users start coding and making calls to the API without having to set up a dedicated environment. It currently has two sample apps: one is a “Basic Features App”, a simple application that showcases each feature of the API, and the other is an application that grabs the top 5 discussions on Hacker News and summarizes them. Both are fully functional and editable for you to use as a working app, to test the API, or as the building blocks for an idea you might have. We plan to add more applications over the next couple of weeks to showcase some interesting use cases for the API.

    You can access the Sandbox with an AYLIEN Text Analysis API account. If you haven’t signed up for a free account, you can register for one here.

    Getting Started

    Once you’ve logged in, the Sandbox will automatically generate your App ID and account information so you can start making calls immediately. Choose the SDK for the language of your choice from the menu on the right-hand side and start hacking.

     

     

    Choose an APP

    Choose which app you want to use in the drop down menu. As we mentioned there are two sample apps currently available. A feature focused app and a Hacker News summarization app.

     

     

    Build your own

    The code is fully editable so you can use the apps as building blocks for new ideas, or even build your own app from right within the Sandbox. Your work can also be downloaded and saved on your local machine.

     

     

    Invite your friends!

    If you are having fun in the Sandbox then please invite your friends: simply grab the URL and pass it on. Note: whoever you share it with will need their own AYLIEN account to make any calls.

     

     

    We hope you enjoy using our Developer Sandbox. You can get access to it here. We’d love to hear about any cool apps you’re building and we would be happy to feature them as sample apps in the Developer Sandbox.

    Happy Hacking!





    This edition, the 5th in our “Getting up and running with AYLIEN Text Analysis API” series, will focus on working with the API using C#. Previously we published code snippets and getting started guides for Node.js, PowerShell, Python, Java and Ruby.

    As we did in our previous blogs, we’re going to perform some basic Text Analysis like detecting what language a piece of text is written in, analyzing the sentiment of a piece of text, classifying an article and finally generating some hashtags for a URL. The idea here is to make it as easy as possible for developers to get up and running with our API and to showcase how easy it is to get started in your chosen language.

    We’re first going to look at the code in action. We’ll then go through the code, section by section, to investigate each of the endpoints used in the code snippet.

    What are we going to do?:

    • Detect what language the following text is written in: “What language is this sentence written in?”
    • Analyze the sentiment of the following statement: “John is a very good football player!”
    • Generate a classification (IPTC Code and label) for the URL: “http://www.bbc.com/earth/story/20141110-earths-magnetic-field-flips-more”
    • Generate hashtags for the URL: “http://www.bbc.com/earth/story/20141110-earths-magnetic-field-flips-more”

    Note: The getting started page on our website has a range of code snippets in other programming languages you can use if you would prefer not to use C#.

    Overview of the code in action

    The complete code snippet is given at the end of this blog for you to copy and paste. To get it running, open a text editor of your choice and copy and paste the snippet. Before running the code, make sure you replace the YOUR_APP_ID and YOUR_APP_KEY placeholders in the code with your own application id and application key. You would have been sent these when you signed up as an API user. If you haven’t signed up you can make your way to our sign up page to register for free.

    Save the file as TextAPISample.cs and open a developer command prompt. Navigate to the folder where you saved the code snippet and compile the code by typing “csc TextAPISample.cs”. You can now run the executable by typing TextAPISample at the command prompt.

    Once you run it, you should receive the following output:

    
    C:\src\GettingStarted>TextAPISample
    
    
    Text     : What Language is this sentence written in?
    Language : en (0.9999971593789524)
    
    
    Text     : John is a very good football player!
    Sentiment: positive (0.9999988272764874)
    
    
    Classification:
    
    Label        :   natural science - geology
    IPTC code    :   13004001
    Confidence   :   1.0
    
    
    Hashtags:
    
    #EarthsMagneticField
    #RockMusic
    #SantaCruzCalifornia
    #Lava
    #California
    #Earth
    #Convection
    #Iron
    #Helsinki
    #SpaceColonization
    #HistoryOfTheEarth
    #Granite
    #Finland
    #DynamoTheory
    #Compass
    #SouthMagneticPole
    #Hematite
    #SolarRadiation
    

    In this case we have detected that the first piece of text is written in English and the sentiment or polarity of the second statement is positive. We have also generated hashtags for the URL and classified the content.

    Next, we’ll go through the code snippet, section by section.

    Language Detection

    Using the Language Detection endpoint you can analyze a piece of text or a URL and automatically determine what language it is written in. In the piece of code we have used in this blog, the “parameters” variable controls whether the call is made specifying the text directly or as a URL.

    
    parameters.text = "What Language is this sentence written in?";
    var language = CallApi("language", parameters);
    

    In this case we have specified that we want to analyze the text “What Language is this sentence written in?” and, as you can see from the output below, the API determined that the text is written in English, with a confidence score of roughly 0.999998 that the language was detected correctly. For all of the endpoints, the API returns the text which was analyzed for reference, and we have included it in the results in each case.

    Result:

    
    Text     : What Language is this sentence written in?
    Language : en (0.999997599332592)
    
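For readers curious about what the CallApi helper (listed in full at the end of this post) actually sends, here is a rough Python sketch of the same form-encoded POST. The credentials are placeholders, and the request is only constructed, never sent:

```python
# Sketch of the request CallApi assembles: a form-encoded POST to the
# Language endpoint with the AYLIEN credentials passed as custom headers.
from urllib.parse import urlencode
from urllib.request import Request

APP_ID, APP_KEY = "YOUR_APP_ID", "YOUR_APP_KEY"  # placeholder credentials

body = urlencode({"text": "What Language is this sentence written in?"})
req = Request(
    "https://api.aylien.com/api/v1/language",
    data=body.encode("utf-8"),
    headers={
        "Content-Type": "application/x-www-form-urlencoded",
        "Accept": "application/json",
        "X-AYLIEN-TextAPI-Application-ID": APP_ID,
        "X-AYLIEN-TextAPI-Application-Key": APP_KEY,
    },
)

print(req.get_method())  # POST: urllib infers it once a body is attached
print(body)
```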

    Sentiment Analysis

    Similarly, the Sentiment Analysis endpoint takes a piece of text or a URL and analyzes it to determine whether it is positive, negative or neutral; it also provides a confidence score in the results.

    
    parameters.text = "John is a very good football player!";
    var sentiment = CallApi("sentiment", parameters);
    

    Analyzing the text “John is a very good football player!”, the API has determined that the sentiment of the piece of text is positive. We can also be pretty sure it’s correct based on the returned confidence score of 0.999998.

    Result:

    
    Text     : John is a very good football player!
    Sentiment: positive (0.9999988272764874)
    

    Hashtag Suggestions

    The Hashtag Suggestion endpoint analyzes a URL and generates a list of hashtag suggestions which can be used to ensure that your content or URLs are optimally shared on social media:

    
    parameters.url = "http://www.bbc.com/earth/story/20141110-earths-magnetic-field-flips-more";
    var hashtags = CallApi("hashtags", parameters);
    

    For hashtag suggestions, we have used an article about changes in the orientation of the Earth’s magnetic field published on the BBC news website “http://www.bbc.com/earth/story/20141110-earths-magnetic-field-flips-more”. The hashtag suggestion endpoint first extracts the text from the URL and then analyzes that text and generates hashtag suggestions for it.

    Result:

    
    Hashtags:
    #EarthsMagneticField
    #RockMusic
    #SantaCruzCalifornia
    #Lava
    #California
    #Earth
    #Convection
    #Iron
    #Helsinki
    #SpaceColonization
    #HistoryOfTheEarth
    #Granite
    #Finland
    #DynamoTheory
    #Compass
    #SouthMagneticPole
    #Hematite
    #SolarRadiation
    

    Article Classification

    The classification endpoint automatically assigns or tags an article or piece of text to one or more categories making it easier to manage and sort. The classification endpoint is based on IPTC International Subject News Codes and can identify up to 500 categories.

    
    parameters.url = "http://www.bbc.com/earth/story/20141110-earths-magnetic-field-flips-more";
    var classify = CallApi("classify", parameters);
    

    When we analyze the URL pointing to the BBC news story, we receive the results shown below. As you can see it has labelled the article as “natural science – geology” with a corresponding IPTC code of 13004001 and a confidence of 1.

    Result:

    
    Classification:
    
    Label        :   natural science - geology
    IPTC code    :   13004001
    Confidence   :   1.0
    

    For more getting started guides and code snippets to help you get up and running with our API, visit our getting started page on our website. If you haven’t already done so you can get free access to our API on our sign up page.

    The Complete Code Snippet

    
    using System;
    using System.Collections;
    using System.Collections.Generic;
    using System.Collections.ObjectModel;
    using System.Dynamic;
    using System.IO;
    using System.Linq;
    using System.Net;
    using System.Text;
    using System.Web;
    using System.Web.Script.Serialization;
    
    namespace AylienTextAPI
    {
        class Program
        {
            static void Main(string[] args)
            {
                var url = "http://www.bbc.com/earth/story/20141110-earths-magnetic-field-flips-more";
                string[] endpoints = new string[] {"language", "sentiment", "classify", "hashtags"};
    
    foreach(var endpoint in endpoints){
      switch(endpoint){
        case "language":
        {
          dynamic parameters = new System.Dynamic.ExpandoObject();
          parameters.text = "What Language is this sentence written in?";
          var language = CallApi("language", parameters);
          Console.WriteLine("\nText     : {0}", language["text"]);
          Console.WriteLine("Language : {0} ({1})",
          language["lang"], language["confidence"]);
          break;
        }
        case "sentiment":{
          dynamic parameters = new System.Dynamic.ExpandoObject();
          parameters.text = "John is a very good football player!";
          var sentiment = CallApi("sentiment", parameters);
          Console.WriteLine("\nText     : {0}", sentiment["text"]);
          Console.WriteLine("Sentiment: {0} ({1})",
          sentiment["polarity"], sentiment["polarity_confidence"]);
          break;
        }
        case "classify": {
          dynamic parameters = new System.Dynamic.ExpandoObject();
          parameters.url = url;
          var classify = CallApi("classify", parameters);
          Console.Write("\nClassification: \n");
          foreach(var item in classify["categories"])
          {
              Console.WriteLine("Label        :   {0}     ", item["label"].ToString());
              Console.WriteLine("IPTC code    :   {0}     ", item["code"].ToString());
              Console.WriteLine("Confidence   :   {0}     ", item["confidence"].ToString());
          }
          break;
        }
        case "hashtags":{
          dynamic parameters = new System.Dynamic.ExpandoObject();
          parameters.url = url;
          var hashtags = CallApi("hashtags", parameters);
          Console.Write("\nHashtags: \n");
          foreach(var item in hashtags["hashtags"])
          {
              Console.WriteLine(item.ToString());
          }
          break;
        }
      }
    }
            }
    
            private static dynamic CallApi(String endpoint, dynamic parameters)
            {
                String APPLICATION_ID = "YOUR_APPLICATION_ID";
                String APPLICATION_KEY = "YOUR_APPLICATION_KEY";
                Uri address = new Uri("https://api.aylien.com/api/v1/" +
                    endpoint);
    
                // Create the web request
                HttpWebRequest request = WebRequest.Create(address) as
                    HttpWebRequest;
    
                // Set type to POST
                request.Method = "POST";
                request.ContentType = "application/x-www-form-urlencoded";
    
                // Create the data we want to send
                StringBuilder data = new StringBuilder();
                var i = 0;
                foreach (var item in parameters)
                {
                    if (i == 0)
                        data.Append(item.Key + "=" +
                            HttpUtility.UrlEncode(item.Value));
                    else
                        data.Append("&" + item.Key + "=" +
                            HttpUtility.UrlEncode(item.Value));
    
                    i++;
                }
    
                // Create a byte array of the data we want to send
                byte[] byteData = UTF8Encoding.UTF8.GetBytes(data.ToString());
    
                // Set the content length in the request headers
                request.ContentLength = byteData.Length;
                request.Accept = "application/json";
                request.Headers.Add("X-AYLIEN-TextAPI-Application-ID",
                    APPLICATION_ID);
                request.Headers.Add("X-AYLIEN-TextAPI-Application-Key",
                    APPLICATION_KEY);
    
                // Write data
                using (Stream postStream = request.GetRequestStream())
                {
                    postStream.Write(byteData, 0, byteData.Length);
                }
    
                // Get response
                using (HttpWebResponse response = request.GetResponse()
                    as HttpWebResponse)
                {
                    // Get the response stream
                    StreamReader reader =
                        new StreamReader(response.GetResponseStream());
    
                    // Serialize to JSON dynamic object
                    var serializer = new JavaScriptSerializer();
                    serializer.RegisterConverters(new[] { new DynamicJsonConverter() });
                    return serializer.Deserialize(reader.ReadToEnd(),
                        typeof(object));
                }
            }
        }
    
        // http://stackoverflow.com/a/3806407/1455811
        public sealed class DynamicJsonConverter : JavaScriptConverter
        {
            public override object Deserialize(IDictionary<string, object>
                dictionary,
                Type type, JavaScriptSerializer serializer)
            {
                if (dictionary == null)
                    throw new ArgumentNullException("dictionary");
    
                return type == typeof(object) ?
                    new DynamicJsonObject(dictionary) : null;
            }
    
            public override IDictionary<string, object>
                Serialize(object obj, JavaScriptSerializer serializer)
            {
                throw new NotImplementedException();
            }
    
            public override IEnumerable<Type> SupportedTypes
            {
                get { return new ReadOnlyCollection<Type>
                    (new List<Type>(new[] { typeof(object) })); }
            }
    
            #region Nested type: DynamicJsonObject
    
            private sealed class DynamicJsonObject : DynamicObject
            {
                private readonly IDictionary<string, object> _dictionary;
    
                public DynamicJsonObject(IDictionary<string, object> dictionary)
                {
                    if (dictionary == null)
                        throw new ArgumentNullException("dictionary");
                    _dictionary = dictionary;
                }
    
                public override string ToString()
                {
                    var sb = new StringBuilder("{");
                    ToString(sb);
                    return sb.ToString();
                }
    
                private void ToString(StringBuilder sb)
                {
                    var firstInDictionary = true;
                    foreach (var pair in _dictionary)
                    {
                        if (!firstInDictionary)
                            sb.Append(",");
                        firstInDictionary = false;
                        var value = pair.Value;
                        var name = pair.Key;
                        if (value is string)
                        {
                            sb.AppendFormat("{0}:\"{1}\"", name, value);
                        }
                        else if (value is IDictionary<string, object>)
                        {
                            new DynamicJsonObject((IDictionary<string, object>)
                                value).ToString(sb);
                        }
                        else if (value is ArrayList)
                        {
                            sb.Append(name + ":[");
                            var firstInArray = true;
                            foreach (var arrayValue in (ArrayList)value)
                            {
                                if (!firstInArray)
                                    sb.Append(",");
                                firstInArray = false;
                                if (arrayValue is IDictionary<string, object>)
                                    new DynamicJsonObject(
                                        (IDictionary<string, object>)
                                        arrayValue).ToString(sb);
                                else if (arrayValue is string)
                                    sb.AppendFormat("\"{0}\"", arrayValue);
                                else
                                    sb.AppendFormat("{0}", arrayValue);
    
                            }
                            sb.Append("]");
                        }
                        else
                        {
                            sb.AppendFormat("{0}:{1}", name, value);
                        }
                    }
                    sb.Append("}");
                }
    
                public override bool TryGetMember(GetMemberBinder binder,
                    out object result)
                {
                    if (!_dictionary.TryGetValue(binder.Name, out result))
                    {
                        // return null to avoid exception.
                        // caller can check for null this way...
                        result = null;
                        return true;
                    }
    
                    result = WrapResultObject(result);
                    return true;
                }
    
                public override bool TryGetIndex(GetIndexBinder binder,
                    object[] indexes, out object result)
                {
                    if (indexes.Length == 1 && indexes[0] != null)
                    {
                        if (!_dictionary.TryGetValue(indexes[0].ToString(),
                            out result))
                        {
                            // return null to avoid exception.
                            // caller can check for null this way...
                            result = null;
                            return true;
                        }
    
                        result = WrapResultObject(result);
                        return true;
                    }
    
                    return base.TryGetIndex(binder, indexes, out result);
                }
    
                private static object WrapResultObject(object result)
                {
                    var dictionary = result as IDictionary;
                    if (dictionary != null)
                        return new DynamicJsonObject(dictionary);
    
                    var arrayList = result as ArrayList;
                    if (arrayList != null && arrayList.Count > 0)
                    {
                        // wrap a list of objects as dynamic JSON objects;
                        // otherwise return the values as a plain list
                        return arrayList[0] is IDictionary
                            ? new List<object>(
                                arrayList.Cast<IDictionary>()
                                    .Select(x => new DynamicJsonObject(x)))
                            : new List<object>(arrayList.Cast<object>());
                    }
    
                    return result;
                }
            }
    
            #endregion
        }
    }
    
    
    
    




