How to Enhance AI Chatbots with Real-Time Data from Bright Data using OpenAI and LangChain

Leverage Bright Data’s Data for AI, OpenAI, and LangChain to Build an AI Chatbot with Real-Time Access to Comprehensive Knowledge Bases.

Victor Yakubu · Published in Python in Plain English · 9 min read · Jan 28, 2025

AI chatbots have become an integral part of customer engagement, information retrieval, and task automation. However, their usefulness depends largely on the quality and relevance of the data they can access. While many chatbots rely solely on pre-defined knowledge bases, integrating real-time structured data can significantly elevate their capabilities.

This article covers the process of enhancing your AI chatbot with Bright Data’s Data for AI datasets (built to power AI and LLM projects), OpenAI’s cutting-edge language models, and LangChain’s advanced data processing tools. You will use these tools to create a chatbot that delivers accurate, context-rich, real-time responses to user queries.

Why Real-Time Data Matters for AI Chatbots

The value of AI chatbots lies in their ability to provide accurate, contextually relevant, and timely information. Static datasets can quickly become outdated, limiting your chatbot’s usefulness, so you need to be able to feed it real-time data to keep its answers accurate. Real-time data integration offers the following benefits:

1. Enhanced Accuracy
Chatbots can provide precise answers that reflect the latest information available by
incorporating up-to-date datasets like Bright Data’s Wikipedia Articles. This ensures
users receive accurate responses, especially for topics where facts are constantly
changing, such as scientific discoveries, current events, or product updates.

2. Improved User Engagement

When users feel that their chatbot interactions are dynamic and contextually relevant, engagement increases. Real-time data enables chatbots to handle nuanced queries, making conversations more personalised and meaningful. This not only enhances user satisfaction but also fosters long-term trust in the chatbot’s capabilities.

3. Broader Use Cases

With real-time access to data, AI chatbots can address a wide range of applications, including:

Education: Answering queries with up-to-date information for students and researchers.

Customer Support: Offering real-time solutions based on the latest knowledge about products or services.

Market Analysis: Analyzing and responding to trends based on current data.


4. Competitive Advantage
Businesses that deploy chatbots with real-time data gain an advantage over
competitors using static systems. They can respond faster to evolving customer
needs and more effectively adapt to market changes.

Overview of Bright Data’s Data for AI Dataset


Bright Data’s Data for AI dataset is a resource designed to fuel AI and LLM (Large
Language Model) projects at every stage — from pre-training to fine-tuning and
beyond. It offers a vast collection of over 5 billion LLM-friendly records sourced
from 100+ trusted providers, all structured, cleaned, validated, and refreshed on a
monthly basis. This ensures the dataset remains accurate and up-to-date for high-
quality AI applications.

If the specific data you require isn’t available, Bright Data also enables you to build a custom web data pipeline tailored to your needs, using either a code or a no-code option. This flexibility makes the platform a go-to choice for developers and non-developers seeking robust, scalable, and relevant data for their AI systems.

For this article, you’ll use Wikipedia articles about engineering from Bright Data’s
Data for AI dataset to build a chatbot. Follow these steps to access and prepare the
dataset:

Steps to Access the Dataset



1. Sign in to the Bright Data Dashboard


Log in to your Bright Data account to access their extensive suite of tools and
datasets.

2. Navigate to the Web Datasets Section

On the sidebar menu, click on “Web Datasets.”

Next, click on “Dataset Marketplace” to explore available datasets.

3. Select the Data for AI Category

Under the “Categories” filter, choose “Data for AI.”

This will redirect you to a curated list of datasets specifically designed for AI
applications.

4. Choose the Wikipedia Articles Dataset

Scroll down and locate “Wikipedia Articles.”

Within this category, select “Wikipedia articles about engineering.” This dataset
contains structured information relevant to engineering topics, perfect for
training or enhancing AI chatbots.


5. Purchase and Download the Dataset

Click on “Proceed to Purchase” to acquire the dataset.

You can choose between JSON and CSV formats (this article uses the CSV format).

Once purchased, the dataset will be downloaded to your local machine. Save it
as Wikipedia_articles.csv for easy access.
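To confirm the download before building anything, you can load the file with pandas and check that it contains the columns used later in this tutorial (title, raw_text, see_also, references, external_links, categories). A minimal sanity-check sketch, assuming the file sits in your working directory under the name above:

import pandas as pd

# Quick sanity check on the downloaded dataset
df = pd.read_csv("Wikipedia_articles.csv")
print(df.shape)  # number of articles and columns
print(df.columns.tolist())

# Columns the chatbot code in this article expects
expected = {"title", "raw_text", "see_also", "references", "external_links", "categories"}
print("Missing columns:", expected - set(df.columns) or "none")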

What are the benefits of using Bright Data’s Data for AI dataset?

This dataset is ideal for AI-driven projects because of its:

1. Access to a Wide Range of Datasets: Bright Data offers an expansive library of


datasets, including over 5 billion LLM-friendly records sourced from 100+
reliable providers. These datasets cover various categories, such as engineering,
healthcare, finance, and more, enabling you to find relevant data for virtually
any AI application.

2. Easy Integration with AI Workflows: Bright Data’s datasets are available in


formats like JSON and CSV, making them easy to integrate with tools like
OpenAI’s GPT models and frameworks like LangChain. This compatibility
accelerates the development and deployment of AI solutions.

3. User-Friendly Interface: The platform’s intuitive dashboard and streamlined


navigation make data accessible to users of all skill levels.

4. Dedicated Support and Documentation: Bright Data provides customer support,


data experts and detailed documentation to help you get the most out of your
data.

5. Ethically sourced: Bright Data emphasises ethical data sourcing, ensuring


compliance with intellectual property laws and fair usage policies. This
guarantees that AI applications using the dataset operate transparently and
responsibly, a critical aspect for businesses and developers.

6. Scalability: You can fetch large datasets without manual collection or worrying
about changes in website structure or anti-scraping mechanisms.

How to Integrate Bright Data’s Wikipedia Dataset into Your AI Chatbot


In this section, you’ll walk through integrating Bright Data’s Wikipedia dataset into
your AI chatbot using OpenAI and LangChain. This allows your chatbot to provide
accurate, context-rich responses by leveraging the vast wealth of structured data
from Wikipedia.

Step 1: Prerequisites
Before proceeding, ensure you have the following tools and data set up:

1. Get Bright Data’s Wikipedia Dataset:

Obtain and download the dataset in CSV format.


Save the file as Wikipedia_articles.csv in your project directory.

2. OpenAI API Key: Register with OpenAI and generate your API key.

3. Python Environment: Install the necessary Python libraries using the following
command:

pip install flask pandas langchain-openai langchain-community faiss-cpu langchain langchain-text-splitters

4. Project Structure: Organize your files as outlined in the folder structure below:

your_project_folder/
├── app.py
├── wiki_chatbot.py
├── Wikipedia_articles.csv
└── templates/
└── index.html
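Note: app.py later in this article hardcodes the OpenAI API key for simplicity. If you prefer not to keep the key in your code, a minimal alternative sketch is to read it from an environment variable (OPENAI_API_KEY is the name the OpenAI and LangChain clients pick up by default):

import os

# Read the key from the environment instead of hardcoding it,
# e.g. after running `export OPENAI_API_KEY="sk-..."` in your shell.
OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY")
if not OPENAI_API_KEY:
    raise RuntimeError("OPENAI_API_KEY environment variable is not set")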

Step 2: Load and Process the Dataset


The wiki_chatbot.py script handles dataset loading, processing, and integration
with the AI model.

1. Read and Combine Data:

The load_and_process_data method in WikipediaChatbot reads the dataset using pandas and combines relevant columns like title, raw_text, references, etc., into a single combined_text column.

df['combined_text'] = df.apply(lambda row: f"""
Title: {row['title']}
Content: {row['raw_text']}
See Also: {row['see_also']}
References: {row['references']}
External Links: {row['external_links']}
Categories: {row['categories']}
""", axis=1)

2. Text Splitting for Efficiency:

The RecursiveCharacterTextSplitter divides large texts into smaller chunks for better indexing and faster retrieval.

text_splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)
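If you want to see what the splitter actually produces before indexing, you can run it on a small sample string. The sample text and print statements below are illustrative only:

from langchain_text_splitters import RecursiveCharacterTextSplitter

# Illustrative check: split a repeated sample sentence and inspect the chunks
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=200)
sample = "Engineering is the application of science and mathematics to solve problems. " * 40
chunks = splitter.split_text(sample)
print(f"{len(chunks)} chunks; first chunk is {len(chunks[0])} characters long")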

3. Create a Vector Store:

Use the FAISS library to build a vector database from the processed text, allowing the chatbot to retrieve relevant information efficiently.

self.vectorstore = FAISS.from_texts(texts, self.embeddings)

4. Initialise a Conversational Chain:

The ConversationalRetrievalChain is set up using OpenAI’s GPT model, enabling intelligent responses and retrieval of source documents.

self.qa_chain = ConversationalRetrievalChain.from_llm(
llm=self.llm,
retriever=self.vectorstore.as_retriever(),
return_source_documents=True
)

Here is the complete code for the wiki_chatbot.py script


import pandas as pd
from langchain_openai.embeddings import OpenAIEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_openai.chat_models import ChatOpenAI
from langchain.chains import ConversationalRetrievalChain
from langchain_text_splitters import RecursiveCharacterTextSplitter

import os
from typing import List, Dict


class WikipediaChatbot:
    def __init__(self, openai_api_key: str):
        """
        Initialize the Wikipedia chatbot with OpenAI API key

        Args:
            openai_api_key (str): Your OpenAI API key
        """
        self.openai_api_key = openai_api_key
        os.environ["OPENAI_API_KEY"] = openai_api_key

        # Initialize OpenAI and embedding models
        self.llm = ChatOpenAI(temperature=0.7, model_name="gpt-4")
        self.embeddings = OpenAIEmbeddings()

        # Initialize conversation history
        self.chat_history = []

    def load_and_process_data(self, csv_path: str) -> None:
        """
        Load and process Wikipedia data from CSV

        Args:
            csv_path (str): Path to the CSV file containing Wikipedia data
        """
        # Read CSV file
        df = pd.read_csv(csv_path)

        # Combine relevant text columns
        df['combined_text'] = df.apply(lambda row: f"""
        Title: {row['title']}
        Content: {row['raw_text']}
        See Also: {row['see_also']}
        References: {row['references']}
        External Links: {row['external_links']}
        Categories: {row['categories']}
        """, axis=1)

        # Split text into chunks
        text_splitter = RecursiveCharacterTextSplitter(
            chunk_size=1000,
            chunk_overlap=200,
            length_function=len
        )

        texts = []
        for text in df['combined_text'].tolist():
            texts.extend(text_splitter.split_text(text))

        # Create vector store
        self.vectorstore = FAISS.from_texts(texts, self.embeddings)

        # Initialize conversation chain
        self.qa_chain = ConversationalRetrievalChain.from_llm(
            llm=self.llm,
            retriever=self.vectorstore.as_retriever(),
            return_source_documents=True
        )

    def get_response(self, query: str) -> Dict:
        """
        Get response from the chatbot

        Args:
            query (str): User's question

        Returns:
            Dict: Response containing answer and source documents
        """
        # Using invoke() instead of direct call
        result = self.qa_chain.invoke({
            "question": query,
            "chat_history": self.chat_history
        })

        # Update chat history
        self.chat_history.append((query, result["answer"]))

        return {
            "answer": result["answer"],
            "sources": [doc.page_content for doc in result["source_documents"]]
        }

    def clear_history(self) -> None:
        """Clear conversation history"""
        self.chat_history = []
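Before wiring up Flask, you can exercise the class directly from a short script or the Python shell. A minimal sketch, assuming wiki_chatbot.py and Wikipedia_articles.csv are in the same directory and you substitute your own OpenAI API key; the example question is arbitrary:

from wiki_chatbot import WikipediaChatbot

# Instantiate the chatbot and ask a single question
bot = WikipediaChatbot(openai_api_key="sk-...")  # replace with your actual key
bot.load_and_process_data("Wikipedia_articles.csv")

result = bot.get_response("What is civil engineering?")
print(result["answer"])
print(f"{len(result['sources'])} source chunks returned")

bot.clear_history()  # reset the conversation when switching topics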

Step 3: Create the Flask Backend


The app.py script connects the chatbot with a frontend interface and manages API
endpoints.

1. Load the Dataset:

Instantiate the chatbot and load the processed dataset during app initialisation.

chatbot.load_and_process_data(csv_path)

2. Set Up Chat API:

The /api/chat endpoint receives user queries, fetches responses from the
chatbot, and returns answers with sources.

@app.route('/api/chat', methods=['POST'])
def chat():
data = request.json
user_message = data.get('message', '')
response = chatbot.get_response(user_message)
return jsonify({
'answer': response['answer'],
'sources': response['sources'][:3]
})

Here is the complete code for the app.py script

from flask import Flask, render_template, request, jsonify
from wiki_chatbot import WikipediaChatbot
import os

app = Flask(__name__)

# Initialize the chatbot
OPENAI_API_KEY = "<replace with your openai key>"  # Replace with your actual API key
chatbot = WikipediaChatbot(OPENAI_API_KEY)

# Load your Wikipedia CSV data
csv_path = "Wikipedia_articles.csv"  # Replace with your CSV file path
chatbot.load_and_process_data(csv_path)


@app.route('/')
def home():
    return render_template('index.html')


@app.route('/api/chat', methods=['POST'])
def chat():
    try:
        data = request.json
        user_message = data.get('message', '')

        if not user_message:
            return jsonify({'error': 'No message provided'}), 400

        response = chatbot.get_response(user_message)

        return jsonify({
            'answer': response['answer'],
            'sources': response['sources'][:3]  # Limiting to top 3 sources for brevity
        })

    except Exception as e:
        return jsonify({'error': str(e)}), 500


if __name__ == '__main__':
    app.run(debug=True)
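Once the server is running (see Step 5), you can verify the /api/chat endpoint without the frontend. A small sketch using the requests library (install it separately with pip install requests; the question text is just an example):

import requests

# Send one test question to the locally running Flask app
resp = requests.post(
    "http://127.0.0.1:5000/api/chat",
    json={"message": "Give me a short overview of mechanical engineering."},
)
resp.raise_for_status()

payload = resp.json()
print(payload["answer"])
for i, source in enumerate(payload["sources"], start=1):
    print(f"Source {i}: {source[:100]}...")  # show only the start of each source chunk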

Step 4: Build the Frontend Interface


The index.html file provides a simple and interactive user interface for the chatbot.

1. Chatbox UI:

The interface allows users to type questions, view bot responses, and check
sources.

CSS is used to style the chat container, user and bot messages, and loading
indicators.

2. Send User Input to the Backend:

JavaScript captures user input and sends it to the /api/chat endpoint using
Axios.

async function sendMessage() {
    const response = await axios.post('/api/chat', { message });
    appendMessage('bot', response.data.answer, response.data.sources);
}

3. Display Responses:

Bot responses, along with truncated source information, are displayed in the
chat window.

function appendMessage(type, message, sources = null) {
    const sourcesDiv = document.createElement('div');
    // Truncation length below is illustrative; see the full index.html for the exact markup
    sourcesDiv.innerHTML = '<strong>Sources:</strong><br>' +
        sources.map(source => source.substring(0, 150) + '...').join('<br>');
}

You can find the code for the index.html file here.

Step 5: Run the Chatbot


1. Start the Flask Server: Run the Flask app to host the chatbot.

python app.py

2. Access the Chatbot: Open your browser and navigate to http://127.0.0.1:5000/ to interact with the chatbot.


Key Features

Enhanced Responses: The chatbot provides detailed answers supported by references from the Wikipedia dataset.

Source Attribution: Users can verify information with links to original sources.

Scalable Design: The setup allows integration with other Bright Data datasets for
diverse applications.

By following these steps, you’ve successfully integrated Bright Data’s Wikipedia dataset into your AI chatbot. This setup allows your chatbot to deliver accurate and context-rich responses to user queries. You can find the complete code for this tutorial on GitHub.

Final Thoughts
This article covered the process of building a chatbot with real-time data obtained
through Bright Data’s Data for AI platform. It demonstrated how straightforward it is
to access structured public data, specifically tailored for AI and LLMs, without the
need for complex scripting or manual extraction.

By integrating the Wikipedia datasets with OpenAI models and LangChain, you built a chatbot capable of providing accurate and up-to-date responses in even the most specialised fields of engineering.
