Start here

What Microsoft and Apple may never figure out

So let’s say that you need to build something really cool, like a virtual assistant that understands human language. Now let’s say that you take a bunch of MIT mathematics guys and tell them to figure it out. How likely is it, do you venture, that these guys will ever pay attention to the actual meanings of the words being evaluated? Right. Thus we have “state of the art” virtual assistants based on prediction algorithms and models. So why don’t Apple and Microsoft care about the meaning of words? Probably because hundreds, if not thousands, of developer jobs depend upon statistical methods. This is why Microsoft and Apple will never get to the next stage of NLU. The theory of feeding millions of words into a hidden Markov model, on the premise that “the more data, the better,” has hit a wall, and no one (mathematicians) wants to admit it. Let’s face it: if “more data is better,” then why does Google Translate put out such awful translations most of the time? They’ve been trawling the web adding more data for a decade.

The missing link in all of this? Semantics. Right, the idea of understanding the meaning of all of the words in an utterance. Betting on the most probable word leaves an error rate by definition, because the odds are almost never 100%. Forget about the rest of the world; English alone is incredibly ambiguous. Take the word “tank”. Inside the LinguaSys lexicons, English “tank” has nine possible meanings: I tanked the test. Fill the gas tank. The tank rolled into battle. And so on. The truth, as I see it and experience it, is that semantics beats statistics hands down. In fact, we at LinguaSys have beaten statistics hands down. We recently competed against two of the very largest providers of natural language understanding at a major auto manufacturer. All three vendors had to complete the same task. Guess what, we won! And although I don’t have the actual metrics for this, I imagine that we did it in about a tenth of the time and cost. Because we deal in semantics, if I need to include the option to bring pets in my hotel reservation virtual assistant, I don’t have to build an embedded grammar listing all of the kinds of pets there could be. I already know, from my semantic tree, all of the “children” of pets; one line of code versus hours of grammar building (see the sketch below). Now expand this tree to 19 other languages, and you start to see the power of semantics.
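Here is a toy sketch of what that one line of semantics buys you, using a made-up hypernym tree rather than the actual LinguaSys lexicon (the tree, the is_a helper, and the concept names are all hypothetical, for illustration only):

```python
# Toy semantic tree: each concept maps to its parent (hypernym).
# The real LinguaSys lexicon is far larger; this is illustrative only.
HYPERNYMS = {
    "poodle": "dog",
    "dog": "pet",
    "siamese": "cat",
    "cat": "pet",
    "goldfish": "pet",
    "pet": "animate_object",
}

def is_a(word: str, concept: str) -> bool:
    """Walk up the hypernym chain: is `word` a kind of `concept`?"""
    node = word
    while node is not None:
        if node == concept:
            return True
        node = HYPERNYMS.get(node)
    return False

# Grammar approach: enumerate every pet a guest might mention.
# Semantic approach: one membership test against the tree.
print(is_a("poodle", "pet"))    # True
print(is_a("toaster", "pet"))   # False
```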

We live in an era of statistics, and it certainly has its place, but the Microsofts and Apples of the world will, I fear, never fulfill their vision, simply because embracing semantics would cost too many developers their jobs. When you are a hammer, everything looks like a nail.

New Natural Language Processing API Portal

By Adrian Bridgewater, Dr. Dobb’s, October 22, 2014

http://www.drdobbs.com/tools/new-natural-language-processing-api-port/240169200

Human language big data company LinguaSys has created a new API portal called GlobalNLP to reach what it calls “the flourishing global developer population” building Natural Language Processing (NLP) applications with their own business logic.

GlobalNLP understands and extracts meaning from unstructured or conversational text, comprehending textual human dialog across more than 20 languages.

LinguaSys works across the “more challenging” Asian and Middle Eastern languages with high-quality semantics. There are customizable language models, attribute tags, concept tagging, domain detection, full semantic network, hyponyms, hypernyms, synonyms, keyword extraction, language detection, lemmatization, morphology, parsing, part of speech tagging, relation extraction, sentiment analysis, translation, transliteration, crosslingual retrieval, anaphora resolution, and natural language interfaces.

The company provides services to developers via a RESTful cloud-based API, with a free tier of up to 20 API calls a minute and up to 500 API calls a month for testing.
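For developers sizing up that free tier, a client-side call might look something like the sketch below; the base URL, endpoint name, parameters, and authentication scheme shown are illustrative assumptions rather than the documented GlobalNLP interface.

```python
import time
import requests

# Hypothetical values; consult the GlobalNLP portal for the real base URL,
# endpoint names, parameters, and authentication scheme.
BASE_URL = "https://api.example-globalnlp.com/v1"
API_KEY = "your-api-key-here"
MIN_INTERVAL = 60 / 20          # free tier: at most 20 calls per minute

_last_call = 0.0

def call_globalnlp(endpoint: str, **params) -> dict:
    """Issue one REST call, pausing if needed to stay under 20 calls/minute."""
    global _last_call
    wait = MIN_INTERVAL - (time.monotonic() - _last_call)
    if wait > 0:
        time.sleep(wait)
    response = requests.get(
        f"{BASE_URL}/{endpoint}",
        params=params,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    _last_call = time.monotonic()
    response.raise_for_status()
    return response.json()

# Example: language detection as described above (endpoint and parameter names assumed).
print(call_globalnlp("detectlanguage", text="Wo ist der nächste Bahnhof?"))
```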

“LinguaSys is also making Story Mapper available on GlobalNLP. [This] provides insights to media intelligence companies, analyzing large volumes of unstructured text in native languages, then categorizing and summarizing the data,” said Brian Garr, CEO of LinguaSys.

Story Mapper searches big data, social media, news feeds, and other digital text for content, sentiment, names, quotations, and relationships; it extracts facts, topics, and events, assesses tone, and determines the prominence, relevance, and dependence of entities in multiple original languages.

LinguaSys launching GlobalNLP API for natural language processing in the cloud

By Patricio Robles, ProgrammableWeb, October 20, 2014

http://www.programmableweb.com/news/linguasys-launching-globalnlp-api-natural-language-processing-cloud/2014/10/20

Human language technology company LinguaSys is this week launching an API offering that allows developers to use its GlobalNLP natural language processing software in the cloud.

Using the GlobalNLP API, developers can access a suite of core natural language processing offerings that include: concept tagging, domain detection, keyword extraction, language detection, lemmatization, morphology, part of speech tagging, relation extraction, sentiment analysis and transliteration. The API supports more than 20 languages, including Asian and Middle Eastern languages, which the company notes are more challenging to deal with.

According to LinguaSys CEO Brian Garr, “GlobalNLP creates opportunities for developers to access heretofore unavailable, or highly expensive, linguistic capabilities, such as anaphora resolution, or something as simple, but complex, as tokenization for Asian languages without spaces or punctuation.”
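To see why tokenization without spaces is harder than it sounds, consider the toy sketch below; the dictionary and the greedy longest-match segmenter are illustrative only and are not how the Carabao engine actually segments text.

```python
# "I live in Tokyo" in Japanese: no spaces, no punctuation.
text = "私は東京に住んでいます"
print(text.split())    # ['私は東京に住んでいます'] -- whitespace splitting yields one useless token

# Toy dictionary-based greedy longest-match segmenter (illustrative only).
DICTIONARY = {"私", "は", "東京", "に", "住んで", "います"}

def segment(s: str) -> list:
    tokens, i = [], 0
    while i < len(s):
        for length in range(len(s) - i, 0, -1):   # try the longest candidate first
            candidate = s[i:i + length]
            if candidate in DICTIONARY or length == 1:
                tokens.append(candidate)
                i += length
                break
    return tokens

print(segment(text))   # ['私', 'は', '東京', 'に', '住んで', 'います']
```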

Garr says that the GlobalNLP API, which is based on LinguaSys’ proprietary Carabao Linguistic Virtual Machine, is differentiated in a number of ways from other natural language processing solutions it competes against. For instance, “we do all linguistic determinations in the source language, using a large vocabulary, full sentence semantic model. We don’t ‘hunt for words’, we actually understand the whole utterance,” he stated.

For developers needing access to a higher-level natural language processing solution, LinguaSys is also offering an API for its StoryMapper solution. As Garr explained to me, “Story Mapper turns content, such as news stories, into highly tagged XML output, including the general theme of the story, was it positive or negative, what were the major entities and who said what about them. This does in minutes what it used to take people hours to do.”
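To get a feel for consuming that kind of tagged XML, here is a small sketch; the element and attribute names (theme, sentiment, entity, quote) are invented for illustration and are not StoryMapper’s documented schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical StoryMapper-style output; the real schema may differ.
sample = """
<story theme="automotive recall" sentiment="negative">
  <entity name="Acme Motors" prominence="high">
    <quote speaker="Jane Doe">We are cooperating fully with regulators.</quote>
  </entity>
</story>
"""

root = ET.fromstring(sample)
print("Theme:", root.get("theme"), "| Sentiment:", root.get("sentiment"))
for entity in root.findall("entity"):
    print("Entity:", entity.get("name"), "| prominence:", entity.get("prominence"))
    for quote in entity.findall("quote"):
        print("  ", quote.get("speaker"), "said:", quote.text)
```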

LinguaSys’ API portal contains documentation for both APIs and contains code samples for a variety of languages, including JavaScript, C#, Python, PHP, Java and Ruby. The company is encouraging developers to test its APIs by offering a free tier that includes 500 API calls a month. Paid packages offer up to 5,000 calls per week or 100,000 calls per week for $99 per month and $1,499 per month, respectively. For companies with greater needs, LinguaSys can quote a custom package.

Betting on APIs

The market for natural language processing technology has grown rapidly as companies grapple with making sense of all the big data they’re generating and have access to. From social media to public data sets, there is no shortage of opportunities to turn raw, semi-structured information into actionable intelligence, and numerous companies are offering natural language processing APIs to take advantage of those opportunities.

Given the demand for natural language processing technology, Garr expects his new API to be a crucial part of his company’s business going forward. “We think the future for our business is in highly available, highly affordable, highly accurate APIs to handle a multitude of language related business problems,” he told me.

With that in mind, Garr revealed that LinguaSys is working to make its multilingual natural language understanding technology available as an API.

Natural Language Processing You Can Afford. Yes, You!!

By Amy Stapleton, Technology of Virtual Agents, Virtual Agent Chat, November 4, 2014

http://virtualagentchat.com/category/technology-of-virtual-agents/

I’m not a linguist. Quite honestly, the vast majority of what LinguaSys is offering (for free, for up to 20 API calls a minute and up to 500 calls a month, I might add) in their robust GlobalNLP platform consists of things I don’t even understand (lemmatization, stemming, and morphological synthesis, anyone?). But what I do get is that GlobalNLP offers a ton of extremely useful capability to anyone who needs to process language input to run their applications. In fact, the GlobalNLP platform is built on LinguaSys’s Carabao Linguistic Virtual Machine, offering the same tools and underlying semantic library that top companies use to process language input in a multitude of different languages for a variety of business-critical use cases.

I signed up for the developer platform and tried it out as best I could. Though I’m not a programmer, the API library is super easy to use, and it even comes with a testing function. You can actually execute the APIs right from the library without any need for setting up a development platform. The “Open Console” feature allows you to input data into all of the API’s parameters and execute the function. The resulting output is published at the bottom of the screen, so you can see exactly what you’d get if you were running the API from your own program.

GlobalNLP is a full suite of tools. There are APIs for detecting the language, parsing sentences, translating, and more. The site comes with a very thorough Q&A section, lots of helpful documentation, and an online support forum. Each API also comes with helpful source code examples in a wide variety of popular programming languages.

I put several of the APIs through their paces. The detectlanguage API does a great job of ferreting out the language of text input. I tried some German, French, and Spanish, and the API always came back with the correct answer. I even tried to trick it by entering a mix of languages, but it did a good job of determining which one was dominant in the phrase.
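If you’d rather run that experiment from code than from the console, it might look something like the sketch below; the base URL, authorization header, and parameter name are placeholders, and only the detectlanguage endpoint name comes from the portal.

```python
import requests

# Placeholders: substitute the real base URL and credentials from the GlobalNLP portal.
BASE_URL = "https://api.example-globalnlp.com/v1"
HEADERS = {"Authorization": "Bearer your-api-key-here"}

samples = [
    "Wo ist der nächste Bahnhof?",               # German
    "Où se trouve la gare la plus proche ?",     # French
    "¿Dónde está la estación más cercana?",      # Spanish
    "Je voudrais ein großes Bier, por favor.",   # deliberately mixed
]

for text in samples:
    response = requests.get(
        f"{BASE_URL}/detectlanguage",
        params={"text": text},       # parameter name is an assumption
        headers=HEADERS,
        timeout=10,
    )
    print(text, "->", response.json())
```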

The parse API is fun to use, as is the listSenses API, which helps to decipher the words in a search query, enabling you to better understand the user’s intent behind the search. The translate function is worth trying out too, although it’s not designed to compete with human translation. Instead, the LinguaSys automated translation is based on a semantic model that is intended to give you the gist of the source.
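For instance, asking listSenses about a famously ambiguous word like “tank” might look like the sketch below; the request format and response shape are assumptions, and only the endpoint name appears above.

```python
import requests

# Hypothetical sketch: list the candidate senses of an ambiguous word.
response = requests.get(
    "https://api.example-globalnlp.com/v1/listSenses",    # placeholder base URL
    params={"text": "tank"},                              # parameter name is an assumption
    headers={"Authorization": "Bearer your-api-key-here"},
    timeout=10,
)
print(response.json())   # expect candidate senses: armored vehicle, fuel container, "to fail badly", ...
```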

If you’re developing an app that needs to interpret language input in different languages, or if you want your existing app to go global, you’ll definitely want to explore the possibilities offered by LinguaSys’s GlobalNLP.
