Build a Text Translator Web App using Flask and Azure Cognitive Services
Last Updated: 30 Mar, 2023
We're going to create a website that translates text into multiple languages using Artificial Intelligence (AI). We'll use the Flask framework for the front end and Azure Cognitive Services (a collection of AI services) for the back end.
To get started writing our Flask application with Python, we need to set up our development environment, which will require a couple of items to be installed.
Step 1: Creating the project directory
We can create a project directory from a command or terminal window with one of the following commands:
For Windows:
md FlaskAiApp
cd FlaskAiApp
For macOS or Linux:
mkdir FlaskAiApp
cd FlaskAiApp
Note: Keep the command or terminal window open for the entire module.
Step 2: Creating the Python virtual environment
A Python virtual environment isn't as complex as it sounds. Rather than a virtual machine or container, a virtual environment is a folder containing all the libraries we need to run our application, including the Python runtime itself. Using a virtual environment keeps our applications modular and separate from one another, which avoids versioning conflicts. As a best practice, you should always use virtual environments when working with Python. Run the following commands in the command or terminal window.
For Windows:
# Create the environment
python -m venv venv
# Activate the environment
.\venv\Scripts\activate
For macOS and Linux :
# Create the environment
python3 -m venv venv
# Activate the environment
source ./venv/bin/activate
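Once the environment is activated, you can confirm that the interpreter you are running actually lives inside it. A minimal sketch (the `in_virtual_env` helper name is our own, for illustration):

```python
import sys

def in_virtual_env() -> bool:
    """Return True when this interpreter is running inside a virtual environment."""
    # Inside a venv, sys.prefix points at the environment folder, while
    # sys.base_prefix still points at the base Python installation.
    return sys.prefix != sys.base_prefix

print(in_virtual_env())
```

Running this with the environment active should print True; from the system Python it prints False.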
Step 3: Installing Flask and other libraries
With the virtual environment created and activated, we can install Flask, the library our website needs. We'll follow a common convention and install it via a requirements.txt file. The requirements.txt file is not special in itself; it is a plain text file listing the libraries our application depends on. The convention is widely used by developers because it makes applications with multiple dependencies easier to manage.
Open the project directory in Visual Studio Code, create the requirements.txt file, and add the following text:
flask
python-dotenv
requests
Return to the command prompt or terminal window and perform the installation by using pip to run the following command:
pip install -r requirements.txt
Step 4: Creating app.py
A file called app.py is usually the entry point for a Flask application. Create a file named app.py in the project directory.
After creating the app.py file, add the following code:
Python3
from flask import Flask, render_template, request
app = Flask(__name__)
The import statement brings in Flask, the core of any Flask application; render_template, which we'll use shortly to return our HTML; and request, which we'll need later to read the values the user submits through the form.
We will use one route: '/'. This route is often called the default route or the index route, since it is used when the user doesn't specify anything beyond the domain name of the server. Now let's add the route by using the following code.
Python3
@app.route('/', methods=['GET'])
def index():
    return render_template('index.html')
By using @app.route, we indicate the route we want to create. The path is '/', the default route, and we indicate that it handles GET requests. When a GET request arrives for /, Flask automatically calls the decorated function. In the body of index, we return the HTML template named index.html to the user.
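As a quick sanity check, the same route shape can be exercised with Flask's built-in test client, without starting a server. This is only a sketch: it returns a plain string instead of rendering index.html, so it does not depend on a templates folder being present.

```python
from flask import Flask

app = Flask(__name__)

# Same route shape as in app.py, but returning a plain string so the
# sketch stays self-contained.
@app.route('/', methods=['GET'])
def index():
    return 'index route reached'

# Flask's test client lets us exercise routes without running a server.
with app.test_client() as client:
    response = client.get('/')

print(response.status_code)  # 200
```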
Step 5: Create the HTML template
Let's create the Jinja template (Jinja is the templating engine Flask uses). First, create a folder named templates and create the index.html file inside it. Add the following code:
HTML
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/[email protected]/dist/css/bootstrap.min.css"
integrity="sha384-TX8t27EcRE3e/ihU7zmQxVncDAy5uIKz4rEkgIXeMed4M0jlfIDPvg6uqKI2xXr2" crossorigin="anonymous">
<title>Text Translator</title>
</head>
<body>
<div class="container">
<h1>Text Translation service</h1>
<div>Enter the text you wish to translate, choose the language, and click the Translate button.</div>
<div>
<form method="POST">
<div class="form-group">
<textarea name="text" cols="20" rows="10"
class="form-control"></textarea>
</div>
<div class="form-group">
<label for="language">Language:</label>
<select name="language" class="form-control">
<option value="en">English</option>
<option value="it">Italian</option>
<option value="ja">Japanese</option>
<option value="ru">Russian</option>
<option value="de">German</option>
</select>
</div>
<div>
<button type="submit" class="btn btn-success">Translate</button>
</div>
</form>
</div>
</div>
</body>
</html>
Step 6: Testing the application
Open the terminal in the Visual Studio Code and run the following commands:
For Windows:
set FLASK_ENV=development
For macOS/Linux:
export FLASK_ENV=development
Run the application:
flask run
Open the application in a browser by navigating to http://localhost:5000. The index.html will be displayed.

Step 7: Creating Back-end using Azure Cognitive Services
Azure offers Cognitive Services, which include computer vision, speech to text, text to speech, and text translation services. You can access any of these services through software development kits (SDKs) or by calling them the way you would call any other HTTP endpoint. You'll need an Azure account to use Cognitive Services, and a key to call the Translator service (or any other Cognitive Service). This key is sent with every request to the service, so treat it like a password.
Now let's create the Translator Service key using the Azure portal and store it into the .env file in our application.
1. Browse to Azure portal.
2. Select Create a resource.
3. In the search box enter Translator.
4. Select Create.
5. Complete the given form:
- Subscription: Your subscription
- Resource group:
- Select Create new
- Name: AIFlask
- Resource group region: Select a region near you
- Resource region: Select the same region as above
- Name: A unique value, such as ai-yourname
- Pricing tier: Free F0
6. Select Review+Create.
7. Select Create.
8. Select Go to resource.
9. Select Keys and Endpoint on the left side under RESOURCE MANAGEMENT, and copy one of the keys, the endpoint, and the location/region values.
10. Create a .env file in the project directory to store the key, with the following contents:
KEY=your_key
ENDPOINT=your_endpoint
LOCATION=your_location
Replace the placeholders in the .env file:
- your_key with the key you copied above
- your_endpoint with the endpoint from Azure
- your_location with the location/region from Azure
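Since the application cannot work without these three settings, it can help to fail fast when any of them are missing. A minimal sketch (the placeholder values set below are hypothetical stand-ins for what load_dotenv() would read from your .env file):

```python
import os

# Hypothetical placeholder values; in the real app these come from the
# .env file via load_dotenv().
os.environ.setdefault('KEY', 'example-key')
os.environ.setdefault('ENDPOINT', 'https://api.cognitive.microsofttranslator.com')
os.environ.setdefault('LOCATION', 'westus')

# Fail fast if any required setting is absent or empty.
missing = [name for name in ('KEY', 'ENDPOINT', 'LOCATION')
           if not os.environ.get(name)]
if missing:
    raise RuntimeError('Missing settings: ' + ', '.join(missing))
print('All Translator settings present')
```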
Step 8: Calling the Translator service
At the top of the app.py file, insert the following code:
Python3
# Importing the required libraries
import requests
import os
import uuid
import json
from dotenv import load_dotenv
load_dotenv()
Now add the code that creates the route and the logic for translating text at the bottom of app.py:
Python3
@app.route('/', methods=['POST'])
def index_post():
    # Read the values from the form
    original_text = request.form['text']
    target_language = request.form['language']

    # Load the values from .env
    key = os.environ['KEY']
    endpoint = os.environ['ENDPOINT']
    location = os.environ['LOCATION']

    # Indicate that we want to translate, the API version (3.0),
    # and the target language
    path = '/translate?api-version=3.0'

    # Add the target language parameter
    target_language_parameter = '&to=' + target_language

    # Create the full URL
    constructed_url = endpoint + path + target_language_parameter

    # Set up the header information, which includes our subscription key
    headers = {
        'Ocp-Apim-Subscription-Key': key,
        'Ocp-Apim-Subscription-Region': location,
        'Content-type': 'application/json',
        'X-ClientTraceId': str(uuid.uuid4())
    }

    # Create the body of the request with the text to be translated
    body = [{'text': original_text}]

    # Make the call using post
    translator_request = requests.post(
        constructed_url, headers=headers, json=body)

    # Retrieve the JSON response
    translator_response = translator_request.json()

    # Retrieve the translation
    translated_text = translator_response[0]['translations'][0]['text']

    # Call render_template, passing the translated text,
    # original text, and target language to the template
    return render_template(
        'results.html',
        translated_text=translated_text,
        original_text=original_text,
        target_language=target_language
    )
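The double indexing used to pull out the translation is easier to see against a sample payload. The dictionary below mirrors the shape of a Translator 3.0 response (the translated text itself is made up for illustration):

```python
# Sample payload shaped like a Translator 3.0 response, so the indexing
# logic can be checked without calling Azure.
sample_response = [
    {
        'detectedLanguage': {'language': 'en', 'score': 1.0},
        'translations': [
            {'text': 'Ciao mondo', 'to': 'it'}
        ]
    }
]

# The service returns one result per input text; each result holds one
# translation per requested target language.
translated_text = sample_response[0]['translations'][0]['text']
print(translated_text)  # Ciao mondo
```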
Step 9: Creating the template to display the results
Create a file named results.html in the templates folder and add the following HTML code:
HTML
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/[email protected]/dist/css/bootstrap.min.css"
integrity="sha384-TX8t27EcRE3e/ihU7zmQxVncDAy5uIKz4rEkgIXeMed4M0jlfIDPvg6uqKI2xXr2" crossorigin="anonymous">
<title>Result</title>
</head>
<body>
<div class="container">
<h2>Results</h2>
<div>
<strong>Original text:</strong> {{ original_text }}
</div>
<div>
<strong>Translated text:</strong> {{ translated_text }}
</div>
<div>
<strong>Target language code:</strong> {{ target_language }}
</div>
<div>
<a href="{{ url_for('index') }}">Try another one!</a>
</div>
</div>
</body>
</html>
Step 10: Let's Test our Text Translation Service Application
Execute the command flask run in the terminal to restart the service and browse to http://localhost:5000 to test your application.
Result:
We have successfully created an AI application using Python, Flask, and Azure Cognitive Services that translates text into different languages.