In this Gen AI world, companies are racing to release the best LLM models; a few of them are OpenAI, Google, Meta, and Anthropic.
Today I will show you how we can chat with our MySQL database using an OpenAI LLM model. You can use any other model as well, even an open source model.
Pre-requisite:
1. You should have an account on OpenAI to get an OpenAI API key.
2. MySQL database
3. LangChain (LangChain is an open source framework for building applications based on large language models (LLMs)).
4. Visual Studio Code (you can use any other editor to develop the code; I feel VS Code is easy).
5. Python 3.10 or later
So let's start...
I have two files: .env and app.py.
app.py is where I write my code, and .env is where I keep the OpenAI API key.
1. The first step is to install the libraries (including python-dotenv for reading the .env file and a MySQL driver such as pymysql).
pip install langchain langchain-community langchain-openai langchain-core python-dotenv pymysql
2. Set up the environment variable. (Here I basically load the OpenAI API key as an environment variable from my .env file.)
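A minimal sketch of that setup, assuming the key is stored in .env under the name OPENAI_API_KEY:

import os
from dotenv import load_dotenv

# .env contains a single line like: OPENAI_API_KEY=sk-...
load_dotenv()  # loads the variables from .env into the process environment
openai_api_key = os.getenv("OPENAI_API_KEY")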
3. Connect to the MySQL database. We connect using a db_uri connection string, where chinook is my database name.
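Roughly like this, assuming pymysql as the driver; the credentials (root/admin@localhost) are placeholders, not from the original post:

from langchain_community.utilities import SQLDatabase

# format: mysql+<driver>://<user>:<password>@<host>:<port>/<database>
db_uri = "mysql+pymysql://root:admin@localhost:3306/chinook"
db = SQLDatabase.from_uri(db_uri)
print(db.get_usable_table_names())  # quick sanity check of the connection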
"create_sql_query_chain" function generate the SQL query using llm model and db schema as input
As you can see, this generates the SQL query for the question.
5. Now we have to run that SQL query on the DB to get the result.
Then I create a chain that I invoke with the user question: it first generates the query, then executes it, and finally I print the result. Here I have 8 employees in the employee table.
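One way to wire that up is with LangChain's QuerySQLDataBaseTool; this is a sketch, not necessarily the exact code from the post:

from langchain_community.tools.sql_database.tool import QuerySQLDataBaseTool

execute_query = QuerySQLDataBaseTool(db=db)

# generate the SQL from the question, then run it against the database
chain = generate_query | execute_query
result = chain.invoke({"question": "How many employees are there?"})
print(result)  # e.g. [(8,)]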
6. Here everything looks good, but we need the output in natural language, not something like [(8,)].
To do that, I created a custom prompt for the LLM that takes the user question, the SQL query, and the query result, and gives the answer.
Then I created another chain that builds on the output of step 5.
The chain uses RunnablePassthrough, to which we pass the user question.
The first assign saves the generated SQL query into the query variable, and the second assign takes that query, executes it, and saves the output into the result variable.
Then we fill the prompt (which now has the user question, the query, and the result), pass it to the LLM model, and StrOutputParser() basically gives back the natural language output.
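Putting it together, a sketch of that chain (the prompt wording below is my approximation of such a prompt, not the exact text from the post):

from operator import itemgetter
from langchain_core.prompts import PromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough

answer_prompt = PromptTemplate.from_template(
    """Given the following user question, SQL query, and SQL result, answer the user question.

Question: {question}
SQL Query: {query}
SQL Result: {result}
Answer: """
)

rephrase_chain = (
    RunnablePassthrough.assign(query=generate_query)
    .assign(result=itemgetter("query") | execute_query)
    | answer_prompt
    | llm
    | StrOutputParser()
)

response = rephrase_chain.invoke({"question": "How many employees are there?"})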
When I print the response, you can see the natural language output.
7. I also tried some complex queries and it worked like magic :)
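For example, you can invoke the same chain with a more involved question (this particular question is just an illustration, not one from the original post):

response = rephrase_chain.invoke(
    {"question": "Which 3 customers have spent the most in total?"}
)
print(response)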