This comprehensive glossary provides concise, easy-to-understand definitions for key AI terms, concepts, and technologies. Unlock the language of AI and deepen your understanding of this transformative field.

 

AI Chatbot

An entity that can chat with you and provide a conversational experience, like a human expert on the topic. It can answer questions, generate content, and connect to data sources. The key output is "information". Examples: an AI chatbot at a bank providing information about loan rates, or an AI chatbot on your employee portal that retrieves your benefits information.

AI Copilot

An AI assistant that provides suggestions, resources, or actions for human users. As a feature of AI in customer service, AI copilots act much like aeroplane copilots, helping with tasks and even taking over certain functions.

AI Agent

An AI program built on top of Large Language Models (LLMs) that uses information-processing capabilities to obtain data, make decisions, and take actions on behalf of human users. AI agents can autonomously pursue goals and take action, and they can be conversational or automatically triggered. The key output is "action".

 

Examples: An 'Account creation' agent that helps open an account - asking for identity proof, running fraud checks, notifying a human for final approval, and creating the account. A 'Lead enricher' agent that triggers when a lead is added to your CRM, looks up web and internal sources, and enriches the lead with information such as company profile (from the web), interests (from social), and other products tried (from internal systems).

AI Assistants

These are applications such as Claude, ChatGPT, or Gemini that provide a conversational interface to help users get answers and get their job done.

AI Code Builders

These are applications that help users build code. They could be "low code" builders like Lovable, v0, or Bolt, or "professional" builders like Cursor or Windsurf. Either way, they all help build code, which typically powers applications or services. We don't call them "app builders", since these code builders can build just services as well - and services are great targets for Sinch APIs.

AI Agent Builders

These are applications and platforms that help you build an agent. As with code builders, these range from "low code" platforms to more code-centric, professional tools.

AI Systems

A broad term that covers any application or platform that makes deep use of AI, including AI assistants, AI agent builders, and AI code builders. For instance, MCP servers work with any "AI system".

Automation

Automation is the use of AI and other technologies to perform tasks with minimal human intervention. Intelligent automation can improve response times, reduce workload, and boost efficiency in various business processes.

Conversational AI

Conversational AI encompasses intelligent technologies and systems that simulate natural, human-like interactions between humans and machines. This common AI term typically describes the technology used to run chatbots, virtual assistants, and other dialogue-based interfaces.

LLM

A Large Language Model (LLM) is a type of machine learning model that can understand and generate human language.

LLMs are trained on huge sets of data — hence the name "large." LLMs are built on machine learning: specifically, a type of neural network called a transformer model.

In simpler terms, an LLM is a computer program that has been fed enough examples to be able to recognize and interpret human language or other types of complex data.

MCP

Model Context Protocol (MCP) is an open protocol that defines a universal standard for how applications provide context to LLMs, simplifying connections between AI systems and various data sources and tools. This open-source protocol addresses the challenge of fragmented data access, allowing for more efficient and context-aware AI applications. By making it easier to interact with different data sources, MCP improves the relevance and accuracy of AI-generated responses.

MCP helps you build agents and complex workflows on top of LLMs. LLMs frequently need to integrate with data and tools, and MCP provides:

  • A growing list of pre-built integrations that your LLM can directly plug into
  • The flexibility to switch between LLM providers and vendors
  • Best practices for securing your data within your infrastructure.

MCP (Model Context Protocol) provides AI agents with a simple, standardized way to plug into tools, data, and services. Having an agreed-upon way for clients to discover and use external tools and context servers standardizes how AI agents access capabilities like Sinch APIs, while giving providers control over how those capabilities are presented and used.

In summary: MCP manages API calls between an AI agent and Sinch.

 

MCP Client

An MCP client lives inside the AI assistant or app (like Claude or Cursor). When the AI wants to use a tool, it goes through this client to talk to the matching server.

For example:

  • Cursor can use a client to interact with your local development environment.
  • Claude might use it to access files or read from a spreadsheet.

The client handles all the back-and-forth — sending requests, receiving results, and passing them to the AI.
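
To make this concrete, here is a minimal sketch of an MCP client in TypeScript, assuming the official `@modelcontextprotocol/sdk` package. The server script name, tool name, and arguments (`sinch-mcp-server.js`, `send_sms`, the phone number) are hypothetical placeholders, not a real implementation.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch a (hypothetical) MCP server as a subprocess and connect to it over stdio.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["sinch-mcp-server.js"], // placeholder server script
  });

  const client = new Client({ name: "example-host", version: "0.1.0" });
  await client.connect(transport);

  // Tool discovery: ask the server what it can do.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Call a tool on behalf of the AI; the name and arguments are illustrative.
  const result = await client.callTool({
    name: "send_sms",
    arguments: { to: "+15550100", body: "Hello from an MCP client" },
  });
  console.log(result);
}

main().catch(console.error);
```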

MCP Host

The AI agent or application (such as Claude or Cursor) that wants access to a tool or data via MCP. A host runs one or more MCP clients to connect to servers.

MCP Messages

There are four types of messages used in MCP:

  • Requests: The client asks for information from an MCP server.
  • Results: The MCP server replies with the desired information.
  • Errors: Sent when the server cannot give a reply.
  • Notifications: One-way messages that need no response.
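
As a sketch of what these look like on the wire: MCP messages follow the JSON-RPC 2.0 format. The shapes below (shown as TypeScript object literals) are illustrative; the tool name, arguments, and error text are made up.

```typescript
// Request: the client asks the server to run a tool.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: { name: "send_sms", arguments: { to: "+15550100", body: "Hi!" } },
};

// Result: the server replies to the same id with the desired information.
const result = {
  jsonrpc: "2.0",
  id: 1,
  result: { content: [{ type: "text", text: "SMS accepted for delivery" }] },
};

// Error: the server cannot fulfil the request.
const error = {
  jsonrpc: "2.0",
  id: 1,
  error: { code: -32602, message: "Unknown tool: send_sms" },
};

// Notification: a one-way message with no id and no response expected.
const notification = {
  jsonrpc: "2.0",
  method: "notifications/progress",
  params: { progressToken: "job-42", progress: 50, total: 100 },
};
```
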
MCP Protocol

The MCP protocol is what keeps everything in sync. It defines how the MCP client and MCP server communicate — what the messages look like, how actions are described, and how results are returned.

It’s super flexible:

  • Can run locally (e.g., between your AI and your computer’s apps)
  • Can run over the internet (e.g., between your AI and an online tool)
  • Uses structured formats like JSON so everything stays clean and consistent

Thanks to this shared protocol, an AI agent can connect with a new tool — even one it’s never seen before — and still understand how to use it.
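
Tool discovery itself is just such a protocol exchange. The sketch below shows a `tools/list` request and a possible response describing a hypothetical Sinch tool, including the JSON Schema the agent reads to learn how to call it; the `send_sms` tool and its schema are illustrative, not the real Sinch definition.

```typescript
// Client -> server: "what tools do you offer?"
const listRequest = { jsonrpc: "2.0", id: 2, method: "tools/list" };

// Server -> client: tool names, descriptions, and input schemas.
const listResponse = {
  jsonrpc: "2.0",
  id: 2,
  result: {
    tools: [
      {
        name: "send_sms",
        description: "Send an SMS message via the Sinch SMS API",
        inputSchema: {
          type: "object",
          properties: {
            to: { type: "string", description: "Recipient phone number in E.164 format" },
            body: { type: "string", description: "Message text" },
          },
          required: ["to", "body"],
        },
      },
    ],
  },
};
```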

MCP Servers

An MCP server is like a smart adapter for a tool or app. It knows how to take a request from an AI (like “Get today’s sales report”) and translate it into the commands that tool understands.

For example:

  • A GitHub MCP server might turn “list my open pull requests” into a GitHub API call.
  • A YouTube MCP server could transcribe video links on demand.
  • A Sinch MCP server would turn “send SMS notifications to my Managers contact” into a Sinch SMS API call.

MCP servers also:

  • Tell the AI what they can do (tool discovery)
  • Interpret and run commands
  • Format results the AI can understand
  • Handle errors and give meaningful feedback.

The Sinch MCP server provides AI agents with a set of tools for calling the Sinch APIs. In a future phase, this could be extended to also search our knowledge base (documentation, support articles, and so on).
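
To illustrate the adapter role, here is a minimal sketch of an MCP server in TypeScript, assuming the official `@modelcontextprotocol/sdk` and `zod` packages. The `send_sms` tool, environment variable names, request fields, and Sinch endpoint are placeholders for illustration, not the actual Sinch MCP server implementation.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "sinch-sms-demo", version: "0.1.0" });

// Expose one tool; the SDK advertises it to clients during tool discovery.
server.tool(
  "send_sms",
  "Send an SMS message via the Sinch SMS API",
  { to: z.string(), body: z.string() },
  async ({ to, body }) => {
    // Placeholder endpoint and credentials; consult the Sinch SMS API docs for the real call.
    const response = await fetch(
      `https://sms.api.sinch.com/xms/v1/${process.env.SINCH_SERVICE_PLAN_ID}/batches`,
      {
        method: "POST",
        headers: {
          Authorization: `Bearer ${process.env.SINCH_API_TOKEN}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({ from: process.env.SINCH_NUMBER, to: [to], body }),
      }
    );

    // Format a result the AI can understand; report failures as meaningful feedback.
    return {
      content: [
        {
          type: "text" as const,
          text: response.ok
            ? `SMS batch submitted (HTTP ${response.status})`
            : `Sinch API error (HTTP ${response.status})`,
        },
      ],
    };
  }
);

// Speak MCP over stdio so any MCP host can launch and use this server.
await server.connect(new StdioServerTransport());
```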

n8n

n8n is a source-available workflow automation platform that combines AI capabilities with business process automation, giving technical teams the flexibility of code with the speed of no-code.

 

n8n is a versatile tool designed to simplify complex workflow automation. It stands out as a low-code, source-available platform that is self-hostable, offering a reliable alternative to conventional automation tools. What sets n8n apart is its flexibility in using JavaScript for intricate tasks and a user-friendly drag-and-drop interface for more straightforward processes. Its visual building block approach makes it easy to configure and manage workflows, enhancing efficiency. n8n is highly regarded for its straightforward user experience and a wide range of integrations, including seamless GitHub webhook setups. Ideal for technical users, it accelerates the creation of sophisticated workflows without the complexity of traditional scripting.

 

n8n can also be used to develop AI-powered workflows by incorporating LLM chains.
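
As a small illustration of the JavaScript flexibility mentioned above, here is a sketch of what an n8n Code node might contain. n8n Code nodes run JavaScript inside the node's sandbox (which provides globals like `$input` and wraps the snippet in a function), and the incoming `name` field is an assumed example, not a real workflow.

```typescript
// n8n Code node, "Run Once for All Items" mode.
// Reads the items produced by the previous node and adds a derived field.
const items = $input.all();

// The node returns an array of { json: ... } objects for the next node to consume.
return items.map((item) => ({
  json: {
    ...item.json,
    // Hypothetical enrichment: build a greeting from an assumed "name" field.
    greeting: `Hello, ${item.json.name ?? "there"}!`,
  },
}));
```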