Build and Deploy Your AI Chatbot Using GitHub Copilot
Prerequisites:
Before starting, ensure you have the following:
Python 3.9 or higher installed.
Visual Studio Code with the Python extension.
GitHub Copilot installed (free for students via GitHub Education).
Access to AI Foundry Services at ai.azure.com.
Step 1: Setting Up Your Environment
1.1 Install Visual Studio Code and Extensions
1. Download and install Visual Studio Code.
2. Open VS Code and navigate to Extensions (Ctrl + Shift + X).
3. Install the following extensions:
o Python: Adds language support for Python (IntelliSense, linting, and debugging).
o GitHub Copilot: Provides AI code suggestions (free for students via GitHub Education).
4. Verify your Python installation from a terminal:
python --version
Step 2: Understanding the Chatbot
This project demonstrates a simple AI chatbot by:
Accepting user inputs through a web interface.
Using AI Foundry Services to generate conversational responses.
Demonstrating Prompt Scripting for modular, reusable AI prompts.
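The flow above boils down to a simple JSON request/response contract. The sketch below illustrates it with the standard library; the "message" and "script" field names are assumptions drawn from the Flask handler used later in this guide:

```python
import json

# Hypothetical request body the web interface would POST to /chat.
request_body = json.dumps({"message": "Hello!", "script": "greeting"})

# The server parses the payload and wraps the model's reply in a
# response envelope. The reply text here is a placeholder.
payload = json.loads(request_body)
response_body = {"response": f"(model reply to: {payload['message']!r})"}

print(json.dumps(response_body))
```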
Step 3: Building the Chatbot
1. Create a requirements.txt listing the dependencies, then install them with pip install -r requirements.txt:
flask
ai-foundry-sdk
2. Create app.py with the Flask application:
from flask import Flask, request, jsonify
from ai_foundry import FoundryClient  # import path may vary by SDK version

app = Flask(__name__)
client = FoundryClient(api_key="YOUR_API_KEY")

@app.route("/chat", methods=["POST"])
def chat():
    user_input = request.json.get("message")
    response = client.chat(user_input, prompt_script="general-conversation")
    return jsonify({"response": response})

if __name__ == "__main__":
    app.run(debug=True)
Replace "YOUR_API_KEY" with your API key from ai.azure.com.
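Hardcoding the key is fine for a quick demo, but a safer pattern reads it from an environment variable at startup. The variable name FOUNDRY_API_KEY below is illustrative, not mandated by the SDK:

```python
import os

# "FOUNDRY_API_KEY" is an illustrative name chosen for this guide;
# set it in your shell (e.g. export FOUNDRY_API_KEY=...) before running.
api_key = os.environ.get("FOUNDRY_API_KEY", "YOUR_API_KEY")

# client = FoundryClient(api_key=api_key)  # pass the key without hardcoding it
print(api_key)
```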
Step 4: Understanding Prompt Scripting
What is Prompt Scripting?
Prompt Scripting is a method for pre-defining reusable, modular AI prompts. It reduces token usage by sending only the context each API call actually needs, improving efficiency.
Example Script Explanation:
1. Create a file scripts.py:
scripts = {
    "greeting": "You are a friendly assistant. Respond warmly to greetings.",
    "farewell": "You are a formal assistant. Respond politely to farewells.",
}
This defines modular system prompts for specific intents (e.g., greetings or farewells).
2. Use the script in the app (importing the dictionary from scripts.py):
from scripts import scripts

@app.route("/chat", methods=["POST"])
def chat():
    user_input = request.json.get("message")
    script = scripts.get(request.json.get("script"), "general-conversation")
    response = client.chat(user_input, prompt_script=script)
    return jsonify({"response": response})
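The lookup-with-fallback logic above can be exercised on its own. Here is a minimal sketch using plain dictionaries, so it runs without Flask or the SDK installed:

```python
# Same scripts dictionary as in scripts.py above.
scripts = {
    "greeting": "You are a friendly assistant. Respond warmly to greetings.",
    "farewell": "You are a formal assistant. Respond politely to farewells.",
}

def resolve_script(request_json):
    # Unknown or missing script names fall back to the general script.
    return scripts.get(request_json.get("script"), "general-conversation")

print(resolve_script({"message": "Hi!", "script": "greeting"}))
print(resolve_script({"message": "Hi!"}))  # no script given: falls back
```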
Step 5: Deploying the Chatbot
5.1 Local Testing
Run the app locally to verify:
python app.py
Access it at http://127.0.0.1:5000.
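To verify the endpoint contract end to end, you can POST a JSON message and inspect the reply. The sketch below uses only the standard library, with a stub HTTP handler standing in for the Flask app (so it runs even without Flask or an API key); the request shape mirrors the /chat route shown earlier:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

class ChatStub(BaseHTTPRequestHandler):
    # Stand-in for the Flask app: echoes the message instead of calling the model.
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"response": f"echo: {payload['message']}"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the console quiet
        pass

server = HTTPServer(("127.0.0.1", 0), ChatStub)  # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

req = Request(
    f"http://127.0.0.1:{server.server_port}/chat",
    data=json.dumps({"message": "Hello"}).encode(),
    headers={"Content-Type": "application/json"},
)
with urlopen(req) as resp:
    reply = json.loads(resp.read())
print(reply)
server.shutdown()
```

Against the real app, the same POST to http://127.0.0.1:5000/chat should return a model-generated reply instead of the echo.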
5.2 Deploying to Azure
1. Use GitHub Copilot for guidance. Prompt: "Deploy this Flask app to Azure App Service."
Copilot responds with step-by-step deployment instructions.
2. Automate deployment with the Azure Developer CLI (AZD). Prompt: "Set up and deploy using AZD."
Copilot suggests:
azd init
azd auth login
azd up
Step 6: Enhancing the Chatbot
Suggestions for advanced features:
Error Handling: Add robust error handling using Copilot.
Persistent Memory: Integrate a database for multi-turn conversations.
Contextual Conversations: Extend Prompt Scripting for complex workflows.
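For the error-handling suggestion, one common pattern wraps the model call in a try/except and returns a graceful fallback instead of a server error. The sketch below stubs out the SDK call (call_model is a hypothetical stand-in for client.chat) so it runs standalone:

```python
def call_model(message):
    # Hypothetical stand-in for client.chat(...); simulate an outage here.
    raise ConnectionError("service unavailable")

def safe_chat(message):
    try:
        return {"response": call_model(message)}
    except Exception as exc:
        # Return a graceful fallback (and surface the error for logging)
        # instead of letting the request fail with a 500.
        return {"response": "Sorry, something went wrong.", "error": str(exc)}

print(safe_chat("Hello"))
```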
Conclusion
By following this guide, you’ve built and deployed an AI chatbot using modern tools such as the AI Foundry SDK, AI Foundry Services, and GitHub Copilot. With Prompt Scripting, you’ve kept prompts modular and reusable while reducing token usage. This approach highlights the power of combining cutting-edge AI tools with cloud-based deployment.