How do Chatbots Work? An Inside Look


Chatbots are changing the way brands engage with their customers. Their combination of constant connectivity and near-instantaneous response time makes them an attractive way to extend or even replace the functionality of web and mobile apps. But what makes a chatbot tick? In this article we’ll look at what sets chatbots apart from typical web and mobile apps and how they use Natural Language Processing to turn human language into commands an app can understand.


HOW CHATBOTS PROCESS HUMAN LANGUAGE

At a glance, a chatbot can look like a normal app. There’s an application layer, a database, and APIs to call external services. The main thing that’s missing is the UI, which in the case of a bot is replaced by the chat interface. While this setup is convenient for users (that’s why chatbots are on the rise, after all), it does add a layer of complexity for the app to handle. Without the benefit of a rich interface that allows a user to input specific, discrete instructions, it falls on the app to figure out what the user wants and how best to deliver that.

Unlike normal app inputs, human language tends to be messy and imprecise. That’s where the NLP engine comes in. Built from a number of specialized libraries, the NLP engine identifies and extracts entities (relevant pieces of information provided by the user) using common NLP tasks like tokenization and named entity recognition. Tokenization breaks sentences down into discrete words, stripping out punctuation, while named entity recognition looks for words in pre-defined categories (for example, place names or addresses). The engine might also use a library called a normalizer, which catches common spelling errors, expands contractions and abbreviations, and converts UK English to US English.
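To make these two steps concrete, here’s a minimal, hand-rolled sketch of tokenization and normalization. Real libraries like NLTK or SpaCy are far more sophisticated; the regular expression and the normalization table below are invented for illustration only.

```python
import re

# A tiny, made-up normalization table; real normalizers cover far more cases.
NORMALIZATIONS = {
    "pls": "please",      # common abbreviation
    "can't": "cannot",    # expand contractions
    "colour": "color",    # UK -> US spelling
}

def tokenize(sentence):
    """Break a sentence into lowercase word tokens, stripping punctuation."""
    return re.findall(r"[a-z']+", sentence.lower())

def normalize(tokens):
    """Replace known abbreviations, contractions, and UK spellings."""
    return [NORMALIZATIONS.get(tok, tok) for tok in tokens]

print(normalize(tokenize("Pls order my usual colour, can't wait!")))
```

A production engine would also handle stemming, sentence splitting, and many edge cases (hyphenation, emoji, numbers) that this sketch ignores.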

For example, let’s take a simple bot, one that only does one thing: orders takeout. When someone texts the message “order a pizza,” the bot would hopefully recognize the command (“order”) and the request (“pizza”). While these techniques alone might allow a chatbot to understand basic commands, they’re a far cry from actually understanding the structure and purpose of language.
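A bot this simple could get by with little more than keyword matching. The sketch below shows one way that might look; the command and menu vocabularies are invented for the example, not part of any real bot.

```python
# A toy intent extractor for a single-purpose takeout bot.
# The command and item vocabularies here are invented for illustration.
COMMANDS = {"order", "get", "buy"}
MENU_ITEMS = {"pizza", "burger", "salad"}

def parse_request(message):
    """Return the first recognized command and menu item, if any."""
    tokens = message.lower().split()
    command = next((t for t in tokens if t in COMMANDS), None)
    item = next((t for t in tokens if t in MENU_ITEMS), None)
    return command, item

print(parse_request("order a pizza"))
```

Notice how brittle this is: any phrasing outside the hard-coded vocabulary fails, which is exactly the gap the techniques in the next section try to close.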

UNDERSTANDING COMPLEX REQUESTS

What if you’re trying to build a bot that’s a more generalized assistant rather than a text-powered version of a simple web app? For that, your bot is going to need to understand context and intent. To establish context and intent, you’ll need some additional NLP tasks that allow the NLP engine to understand the relationships between words. Part-of-speech tagging takes a sentence and identifies nouns, verbs, adjectives, etc., while dependency parsing identifies phrases, subjects, and objects. For example, the sentence “please deliver a large veggie pizza with no mushrooms” might confuse a more basic bot that can only process simple commands, but our dependency parser would hopefully recognize that “no mushrooms” is meant to modify “veggie pizza.”

Understanding context and intent allows bots to understand and act upon a much wider array of actions, or even ask the user additional questions until they understand the request. From there, you can add more complex NLP tasks like sentiment analysis, which can identify when a user is becoming frustrated and perhaps escalate the interaction to a human CS rep.
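To see why attaching modifiers matters, here’s a deliberately simplified stand-in for what a dependency parser contributes. A real parser (SpaCy’s, for instance) builds a full parse tree over the sentence; this toy version only scans for “no X” phrases and records them as exclusions rather than order items.

```python
def extract_order(tokens):
    """Toy stand-in for dependency parsing: attach 'no <word>' as an exclusion.

    A real dependency parser builds a full tree relating every word;
    here we just detect a 'no' token and capture the word that follows it.
    """
    items, excluded = [], []
    skip_next = False
    for i, tok in enumerate(tokens):
        if skip_next:
            skip_next = False
            continue
        if tok == "no" and i + 1 < len(tokens):
            excluded.append(tokens[i + 1])
            skip_next = True
        else:
            items.append(tok)
    return items, excluded

tokens = "deliver a large veggie pizza with no mushrooms".split()
print(extract_order(tokens))
```

The basic bot from the previous section would have treated “mushrooms” as just another keyword; even this crude modifier handling keeps it out of the order.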

When it comes to building an NLP engine, there are a lot of options out there, depending on the functionality your bot requires and the language you’re using to build it. Python is often celebrated for its robust machine learning libraries, which include NLTK, SpaCy, and Pattern, all of which provide support for basic NLP tasks, as well as some more advanced applications like deep learning.


The Database

Like most apps, your chatbot will probably be connected to a database. Unlike many apps, however, your chatbot probably won’t be producing discrete, easily parsed metrics like what buttons users clicked on or how long they stayed on a certain page. For this reason, many chatbots are natural candidates for NoSQL databases. Among these, MongoDB is a popular document-oriented option, especially for organizations that want to perform analytics on the data they collect, whether that’s to learn about their users or to improve their chatbot’s performance via machine learning techniques.
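Part of the appeal is that each chat turn maps naturally onto a single JSON-like document. Here’s a sketch of what that might look like; the field names are illustrative, not a required schema, and the PyMongo insertion is left commented out since it needs a running MongoDB instance.

```python
from datetime import datetime, timezone

def build_message_doc(user_id, text, entities):
    """Shape one chat turn as a document for a store like MongoDB.

    The field names here are an illustrative assumption, not a standard.
    """
    return {
        "user_id": user_id,
        "text": text,
        "entities": entities,  # whatever the NLP engine extracted
        "timestamp": datetime.now(timezone.utc),
    }

doc = build_message_doc("u42", "order a pizza",
                        {"command": "order", "item": "pizza"})

# With PyMongo and a running MongoDB server, inserting is one call:
# from pymongo import MongoClient
# MongoClient()["chatbot"]["messages"].insert_one(doc)
print(doc["entities"])
```

Because the schema is flexible, later messages can carry different entity fields (sentiment scores, parse results) without a migration, which is exactly what makes analytics over raw conversations convenient.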

Getting Started

As we’ve seen, the basic structure of a chatbot is not so different from a typical app. That said, chatbots have very different requirements than web or mobile apps. Where other apps can be feature-rich and complex, chatbots should be fast and lightweight. That’s one reason why the MEAN stack (consisting of MongoDB, Express.js, Angular, and Node.js) is a popular option for its agility, scalability, and support for mobile.

If you’re bootstrapping your bot and don’t want to develop your own NLP system, there are also a number of AI/NLP services available. Wit.ai (owned by Facebook) and API.ai are both platforms that allow you to create question-and-answer chat routines (called “stories” or “flows”) to “train” your chatbot to recognize normal requests and entities. Once you’ve created these routines, the services will use machine learning to teach your bot to recognize similar routines and requests, based on data gathered from other bots on their platforms.
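Conceptually, a “story” pairs example utterances with an intent, and the platform generalizes from those examples. The sketch below simulates that idea locally; it does not call Wit.ai or API.ai, and its crude word-overlap scoring is a stand-in for the machine learning those services actually use.

```python
# Toy stand-in for platform "stories": each intent lists example utterances.
# Real services generalize with machine learning; this just scores word overlap.
STORIES = {
    "order_food": ["order a pizza", "get me some takeout", "I want a burger"],
    "check_status": ["where is my order", "how long until delivery"],
}

def guess_intent(message):
    """Pick the intent whose best example shares the most words with the message."""
    words = set(message.lower().split())

    def best_overlap(examples):
        return max(len(words & set(ex.lower().split())) for ex in examples)

    return max(STORIES, key=lambda intent: best_overlap(STORIES[intent]))

print(guess_intent("can you order me a pizza"))
```

The hosted services improve on this by pooling training data across many bots, which is why a new bot on their platform can handle phrasings its own stories never covered.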


by Tyler

Tyler writes about data for the content team at Upwork. Based in Berkeley, California, he's written and edited article- and book-length projects for a variety…