Crafting Context-Aware Conversational Agents: A Deep Dive into OpenAI and FastAPI Integration
In this tutorial, we will explore the process of creating a Conversational Agent with a memory microservice using OpenAI and FastAPI. Conversational agents have become an essential component in various applications, including customer support, virtual assistants, and information retrieval systems. However, many traditional chatbot implementations cannot retain context across a conversation, resulting in limited capabilities and frustrating user experiences. This is particularly challenging when building agent services that follow a microservice architecture.
The link to the GitHub repository is at the bottom of the article.
Motivation
The motivation behind this tutorial is to address the limitations of traditional chatbot implementations and create a Conversational Agent with a memory microservice, which becomes especially important when deploying agents in complex environments like Kubernetes. In Kubernetes or similar container orchestration systems, microservices are subject to frequent restarts, updates, and scaling operations. During these events, the conversation state held in a traditional chatbot's in-memory buffer would be lost, leading to disjointed interactions and a poor user experience.
By building a Conversational Agent with a memory microservice, we can ensure that crucial conversation context is preserved even in the face of microservice restarts or updates, or when interactions are not continuous. This preservation of state allows the agent to seamlessly pick up conversations where they left off, maintaining continuity and providing a more natural, personalized user experience. Moreover, this approach aligns with best practices of modern application development, where containerized microservices often interact with other components, making the memory microservice a valuable addition to the conversational agent's architecture in such distributed setups.
For this project, we will primarily work with the following technologies and tools:
OpenAI GPT-3.5: We'll leverage OpenAI's GPT-3.5 language model, which is capable of performing various natural language processing tasks, including text generation, conversation management, and context retention. We will need to generate an OpenAI API key; make sure to visit this URL to manage your keys.
FastAPI: FastAPI will serve as the backbone of our microservice, providing the infrastructure for handling HTTP requests, managing conversation state, and integrating with the OpenAI API. FastAPI is great for building microservices with Python.
In this section, we will dive into the step-by-step process of building our Conversational Agent with a memory microservice. The development cycle will include:
Environment Setup: We'll create a virtual environment and install the required dependencies, including OpenAI's Python library and FastAPI.
Designing the Memory Microservice: We'll outline the architecture and design of the memory microservice, which will be responsible for storing and managing conversation context.
Integrating OpenAI: We'll integrate OpenAI's GPT-3.5 model into our application and define the logic for processing user messages and generating responses.
Testing: We'll test our conversational agent step by step.
Environment Setup
For this setup, we will use the following structure to build the microservice. It is convenient for future expansion with other services under the same project, and I personally like this structure.
├── Dockerfile          <-- Container
├── requirements.txt    <-- Libraries and dependencies
├── setup.py            <-- Build and distribute microservices as a Python package
└── src
    ├── agents          <-- Name of your microservice
    │   ├── __init__.py
    │   ├── api
    │   │   ├── __init__.py
    │   │   ├── routes.py
    │   │   └── schemas.py
    │   ├── crud.py
    │   ├── database.py
    │   ├── main.py
    │   ├── models.py
    │   └── processing.py
    └── agentsfwrk      <-- Name of your common framework
        ├── __init__.py
        ├── integrations.py
        └── logger.py
We need to create a folder named src in the project, which will contain the Python code for the services. In our case, agents contains all the code related to our conversational agents and their API, and agentsfwrk is our common framework for use across services.
The Dockerfile contains the instructions to build the image once the code is ready, requirements.txt lists the libraries to use in our project, and setup.py contains the instructions to build and distribute our project.
For now, just create the service folders along with the __init__.py files, add the following to requirements.txt and setup.py at the root of the project, and leave the Dockerfile empty; we will come back to it in the Deployment Cycle section.
# requirements.txt
fastapi==0.95.2
ipykernel==6.22.0
jupyter-bokeh==2.0.2
jupyterlab==3.6.3
openai==0.27.6
pandas==2.0.1
sqlalchemy-orm==1.2.10
sqlalchemy==2.0.15
uvicorn<0.22.0,>=0.21.1

# setup.py
from setuptools import find_packages, setup

setup(
    name        = 'conversational-agents',
    version     = '0.1',
    description = 'microservices for conversational agents',
    packages    = find_packages('src'),
    package_dir = {'': 'src'},
    # This is optional, btw
    author           = 'XXX XXXX',
    author_email     = 'XXXX@XXXXX.ai',
    maintainer       = 'XXX XXXX',
    maintainer_email = 'XXXX@XXXXX.ai',
)
Let's activate the virtual environment and run pip install -r requirements.txt in the terminal. We won't run the setup file yet, so let's move on to the next section.
Designing the Common Framework
We'll design our common framework so we can use it across all the microservices in the project. This is not strictly necessary for small projects, but thinking ahead, you can expand it to use multiple LLM providers, add other libraries to interact with your own data (e.g. LangChain, VoCode), and add other common capabilities such as voice and image services, without having to implement them in each microservice.
Create the folder and the files following the agentsfwrk structure. Each file and its description are below:
└── agentsfwrk          <-- Name of your common framework
    ├── __init__.py
    ├── integrations.py
    └── logger.py
The logger is a very basic utility to set up a common logging module, and you can define it as follows:
# logger.py
import logging
import multiprocessing
import sys

APP_LOGGER_NAME = 'CaiApp'

def setup_applevel_logger(logger_name = APP_LOGGER_NAME, file_name = None):
    """
    Setup the logger for the application
    """
    logger = logging.getLogger(logger_name)
    logger.setLevel(logging.DEBUG)
    formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")

    sh = logging.StreamHandler(sys.stdout)
    sh.setFormatter(formatter)
    logger.handlers.clear()
    logger.addHandler(sh)

    if file_name:
        fh = logging.FileHandler(file_name)
        fh.setFormatter(formatter)
        logger.addHandler(fh)

    return logger

def get_multiprocessing_logger(file_name = None):
    """
    Setup the logger for the application for multiprocessing
    """
    logger = multiprocessing.get_logger()
    logger.setLevel(logging.DEBUG)
    formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")

    sh = logging.StreamHandler(sys.stdout)
    sh.setFormatter(formatter)

    if not len(logger.handlers):
        logger.addHandler(sh)

    if file_name:
        fh = logging.FileHandler(file_name)
        fh.setFormatter(formatter)
        logger.addHandler(fh)

    return logger

def get_logger(module_name, logger_name = None):
    """
    Get the logger for the module
    """
    return logging.getLogger(logger_name or APP_LOGGER_NAME).getChild(module_name)
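The idea behind this module is simple: the app-level logger is configured once with its handlers, and every module then grabs a child of it via getChild, so all records flow through the same handlers. A quick standalone sanity check of that pattern (the snippet re-creates a minimal version of the setup so it runs on its own):

```python
import logging
import sys

APP_LOGGER_NAME = 'CaiApp'

# Configure the app-level logger once (what setup_applevel_logger does).
root = logging.getLogger(APP_LOGGER_NAME)
root.setLevel(logging.DEBUG)
sh = logging.StreamHandler(sys.stdout)
sh.setFormatter(logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s"))
root.handlers.clear()
root.addHandler(sh)

# Any module then asks for a child logger (what get_logger does).
log = logging.getLogger(APP_LOGGER_NAME).getChild('integrations')
log.info("hello from the integrations module")
print(log.name)  # CaiApp.integrations
```

Records from the child propagate up to the parent's handler, which is why modules never need to attach handlers themselves.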
Next, our integration layer is implemented in the integrations module. This file acts as a middleman between the microservice logic and OpenAI, and it is designed to expose LLM providers in a common way to our application. Here we can implement common strategies to handle exceptions, errors, retries, and timeouts in requests or in responses. I learned from a great manager to always place an integration layer between external services/APIs and the inner world of our application.
The integration code is defined below:
# integrations.py
# LLM provider common module
import json
import os
import time
from typing import Union

import openai
from openai.error import APIConnectionError, APIError, RateLimitError

import agentsfwrk.logger as logger

log = logger.get_logger(__name__)

openai.api_key = os.getenv('OPENAI_API_KEY')

class OpenAIIntegrationService:
    def __init__(
        self,
        context: Union[str, dict],
        instruction: Union[str, dict]
    ) -> None:

        self.context = context
        self.instructions = instruction

        if isinstance(self.context, dict):
            self.messages = []
            self.messages.append(self.context)
        elif isinstance(self.context, str):
            self.messages = self.instructions + self.context

    def get_models(self):
        return openai.Model.list()

    def add_chat_history(self, messages: list):
        """
        Adds chat history to the conversation.
        """
        self.messages += messages

    def answer_to_prompt(self, model: str, prompt: str, **kwargs):
        """
        Collects prompts from the user, appends them to the messages of the same
        conversation, and returns responses from the GPT models.
        """
        # Preserve the messages in the conversation
        self.messages.append(
            {
                'role': 'user',
                'content': prompt
            }
        )

        retry_exceptions = (APIError, APIConnectionError, RateLimitError)
        for _ in range(3):
            try:
                response = openai.ChatCompletion.create(
                    model    = model,
                    messages = self.messages,
                    **kwargs
                )
                break
            except retry_exceptions as e:
                if _ == 2:
                    log.error(f"Last attempt failed, Exception occurred: {e}.")
                    return {
                        "answer": "Sorry, I'm having technical issues."
                    }
                retry_time = getattr(e, 'retry_after', 3)
                log.error(f"Exception occurred: {e}. Retrying in {retry_time} seconds...")
                time.sleep(retry_time)

        response_message = response.choices[0].message["content"]
        response_data = {"answer": response_message}
        self.messages.append(
            {
                'role': 'assistant',
                'content': response_message
            }
        )

        return response_data

    def answer_to_simple_prompt(self, model: str, prompt: str, **kwargs) -> dict:
        """
        Collects context and appends a prompt from a user and returns a response from
        the GPT model given an instruction.
        This method only allows a single message exchange.
        """

        messages = self.messages + f"\n<Client>: {prompt} \n"

        retry_exceptions = (APIError, APIConnectionError, RateLimitError)
        for _ in range(3):
            try:
                response = openai.Completion.create(
                    model  = model,
                    prompt = messages,
                    **kwargs
                )
                break
            except retry_exceptions as e:
                if _ == 2:
                    log.error(f"Last attempt failed, Exception occurred: {e}.")
                    return {
                        "intent": False,
                        "answer": "Sorry, I'm having technical issues."
                    }
                retry_time = getattr(e, 'retry_after', 3)
                log.error(f"Exception occurred: {e}. Retrying in {retry_time} seconds...")
                time.sleep(retry_time)

        response_message = response.choices[0].text

        try:
            response_data = json.loads(response_message)
            answer_text = response_data.get('answer')
            if answer_text is not None:
                self.messages = self.messages + f"\n<Client>: {prompt} \n" + f"<Agent>: {answer_text} \n"
            else:
                raise ValueError("The response from the model is not valid.")
        except ValueError as e:
            log.error(f"Error occurred while parsing response: {e}")
            log.error(f"Prompt from the user: {prompt}")
            log.error(f"Response from the model: {response_message}")
            log.info("Returning a safe response to the user.")
            response_data = {
                "intent": False,
                "answer": response_message
            }

        return response_data

    def verify_end_conversation(self):
        """
        Verify if the conversation has ended by checking the last message from the user
        and the last message from the assistant.
        """
        pass

    def verify_goal_conversation(self, model: str, **kwargs):
        """
        Verify if the conversation has reached its goal by checking the conversation history.
        Format the response as specified in the instructions.
        """
        messages = self.messages.copy()
        messages.append(self.instructions)

        retry_exceptions = (APIError, APIConnectionError, RateLimitError)
        for _ in range(3):
            try:
                response = openai.ChatCompletion.create(
                    model    = model,
                    messages = messages,
                    **kwargs
                )
                break
            except retry_exceptions as e:
                if _ == 2:
                    log.error(f"Last attempt failed, Exception occurred: {e}.")
                    raise
                retry_time = getattr(e, 'retry_after', 3)
                log.error(f"Exception occurred: {e}. Retrying in {retry_time} seconds...")
                time.sleep(retry_time)

        response_message = response.choices[0].message["content"]
        try:
            response_data = json.loads(response_message)
            if response_data.get('summary') is None:
                raise ValueError("The response from the model is not valid. Missing summary.")
        except ValueError as e:
            log.error(f"Error occurred while parsing response: {e}")
            log.error(f"Response from the model: {response_message}")
            log.info("Returning a safe response to the user.")
            raise

        return response_data
Some notes about the integration module:
The OpenAI key is defined as an environment variable named "OPENAI_API_KEY"; we should obtain this key and define it in our terminal or via the python-dotenv library.
There are two methods to integrate with the GPT models: one for the chat endpoint (answer_to_prompt) and one for the completion endpoint (answer_to_simple_prompt). We will focus on using the first one.
There is a method to check the goal of a conversation, verify_goal_conversation, which simply follows the agent's instructions and creates a summary of the conversation.
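All three request methods share the same retry loop: three attempts, a wait taken from the exception's retry_after attribute when present, and a fallback (or re-raise) on the last attempt. That loop can be isolated as a small stdlib-only helper; this is an illustrative sketch, not part of the article's codebase, and with_retries/flaky are made-up names:

```python
import time

def with_retries(fn, retry_exceptions, attempts = 3, default_wait = 3):
    """Call fn(), retrying on the given exceptions; re-raise after the last attempt."""
    for attempt in range(attempts):
        try:
            return fn()
        except retry_exceptions as e:
            if attempt == attempts - 1:
                raise
            # Honor the provider's suggested wait when the exception carries one.
            time.sleep(getattr(e, 'retry_after', default_wait))

calls = {'n': 0}
def flaky():
    # Fails twice, then succeeds, like a transient network error would.
    calls['n'] += 1
    if calls['n'] < 3:
        raise ConnectionError("transient")
    return "ok"

print(with_retries(flaky, (ConnectionError,), default_wait = 0))  # ok
```

Centralizing the loop this way would avoid repeating the same for/try/except block in every method, at the cost of slightly less per-method control over the fallback response.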
Designing the (Memory) Microservice
The best exercise is to design, and consequently draw, a diagram to visualize what the service needs to do, including the actors and their actions when interacting with it. Let's start by describing our application in simple terms:
Our microservice is a provider of artificially intelligent agents, which are experts on a subject and are expected to hold conversations in response to an outbound message and follow-up prompts.
Our agents can hold multiple conversations and are equipped with memory that is meant to be persisted, which means they should be able to retain the conversation history regardless of the session of the user who is interacting with them.
The agents should receive, at creation time, clear instructions on how to handle a conversation and respond accordingly over the course of it.
For programmatic integration, the agents should also conform to an expected response shape.
Our design looks like the following diagram:
From this simple diagram, we know that our microservice needs to implement methods responsible for these specific tasks:
Creation of agents & definition of instructions
Conversation starter & preservation of conversation history
Chat with agents
We will code these functionalities in that order, but before we dive into that, we will build the skeleton of our application.
Application Skeleton
To kickstart development, we begin by building the FastAPI app skeleton. The app skeleton consists of the essential components, including the main application script, database configuration, processing script, and routing modules. The main script serves as the entry point of the application, where we set up the FastAPI instance.
Main File
Let's create/open the main.py file in your agents folder and type the following code, which simply defines a root endpoint.
from fastapi import FastAPI

from agentsfwrk.logger import setup_applevel_logger

log = setup_applevel_logger(file_name = 'agents.log')

app = FastAPI()

@app.get("/")
async def root():
    return {"message": "Hello there conversational ai user!"}
Database Configuration
We then create/open the database configuration script called database.py, which establishes the connection to our local database for storing and retrieving conversation context. We'll start with a local SQLite database for simplicity, but feel free to try other databases for your environment.
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

SQLALCHEMY_DATABASE_URL = "sqlite:///agents.db"

engine = create_engine(
    SQLALCHEMY_DATABASE_URL, connect_args = {"check_same_thread": False}
)
SessionLocal = sessionmaker(autocommit = False, autoflush = False, bind = engine)

Base = declarative_base()
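The connect_args={"check_same_thread": False} part deserves a note: by default, the stdlib SQLite driver refuses to use a connection from a thread other than the one that created it, and FastAPI may handle a request on a different thread than the one that opened the session. SQLAlchemy passes the flag straight through to sqlite3.connect; a minimal stdlib-only demonstration of what it enables:

```python
import sqlite3
import threading

# With check_same_thread=False the connection may be used across threads
# (SQLAlchemy coordinates that access; sharing manually needs your own locking).
conn = sqlite3.connect(":memory:", check_same_thread = False)
conn.execute("CREATE TABLE agents (id TEXT PRIMARY KEY)")

def insert_from_worker():
    # Allowed only because of the flag above; the default raises ProgrammingError.
    conn.execute("INSERT INTO agents VALUES ('agent-1')")

t = threading.Thread(target = insert_from_worker)
t.start()
t.join()
print(conn.execute("SELECT COUNT(*) FROM agents").fetchone()[0])  # 1
```

For a production setup you would typically switch to a client/server database and drop this flag entirely.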
API Routes
Finally, we define the routing modules that handle incoming HTTP requests, encompassing the endpoints responsible for processing user interactions. Let's create the api folder, then create/open the routes.py file and paste the following code.
from typing import List

from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session

import agents.api.schemas
import agents.models
from agents.database import SessionLocal, engine

from agentsfwrk import integrations, logger

log = logger.get_logger(__name__)

agents.models.Base.metadata.create_all(bind = engine)

# Router basic information
router = APIRouter(
    prefix = "/agents",
    tags = ["Chat"],
    responses = {404: {"description": "Not found"}}
)

# Dependency: Used to get the database in our endpoints.
def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()

# Root endpoint for the router.
@router.get("/")
async def agents_root():
    return {"message": "Hello there conversational ai!"}
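One detail worth highlighting in this skeleton is the get_db dependency: because it yields inside a try/finally, the session is closed after the response is sent, even when the endpoint raises. The mechanics can be seen with a stand-in session object (FakeSession is illustrative only, not part of the project):

```python
class FakeSession:
    """Stand-in for SessionLocal() so the pattern runs without a database."""
    def __init__(self):
        self.closed = False
    def close(self):
        self.closed = True

def get_db():
    db = FakeSession()
    try:
        yield db
    finally:
        db.close()

gen = get_db()
db = next(gen)   # FastAPI injects the yielded value into the endpoint
assert db.closed is False
gen.close()      # after the response, FastAPI finalizes the generator
print(db.closed) # True
```

This is why endpoints can simply declare db: Session = Depends(get_db) and never worry about cleanup.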
With this structured skeleton, we're ready to start coding the application we designed.
Create Agents and Assign Instructions
In this section, we will focus on implementing the "Create Agent" endpoint. This endpoint allows users to initiate new conversations and interact with agents, providing a context and a set of instructions for the agent to follow throughout the rest of the conversation. We'll start by introducing two data models for this process: one for the database and another for the API. We will be using Pydantic for our data models. Create/open the schemas.py file in the api folder and define the AgentBase, AgentCreate, and Agent data models.
from datetime import datetime
from typing import List, Optional
from pydantic import BaseModel

class AgentBase(BaseModel): # <-- Base model
    context: str            # <-- Our agent's context
    first_message: str      # <-- Our agents will approach the users with a first message.
    response_shape: str     # <-- The expected shape (for programmatic communication) of the response of each agent's interaction with the user
    instructions: str       # <-- Set of instructions that our agent should follow.

class AgentCreate(AgentBase): # <-- Creation data model
    pass

class Agent(AgentBase): # <-- Agent data model
    id: str
    timestamp: datetime = datetime.utcnow()

    class Config:
        orm_mode = True
The fields in the agent's data model are detailed below:
Context: This is the overall context of what the agent is.
First message: Our agents are meant to start a conversation with the users. This can be as simple as "Hello, how can I help you?" or something like "Hi, you requested an agent to help you find information about stocks, is that correct?".
Response shape: This field is mainly used to specify the output format of our agent's responses, and should be used to transform the text output of our LLM into a desired shape for programmatic communication. For example, we may want to specify that our agent should wrap the response in a JSON format with a key named response, i.e. {'response': "string"}.
Instructions: This field holds the instructions and guidelines each agent should follow during the whole conversation, such as "Gather the following entities [e1, e2, e3, ...] during each interaction", "Reply to the user until they are no longer interested in the conversation", or "Do not deviate from the main topic and steer the conversation back to the main goal when needed".
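Putting the four fields together, a request body for creating an agent could look like the following (the values are illustrative, not taken from the repository):

```python
import json

# Example create-agent payload matching the AgentBase/AgentCreate schema.
create_agent_payload = {
    "context": "You are a friendly assistant that is an expert on the stock market.",
    "first_message": "Hi, you requested an agent to help you find information about stocks, is that correct?",
    "response_shape": "{'response': 'string'}",
    "instructions": (
        "Gather the entities [company_name, ticker] during each interaction. "
        "Do not deviate from the main topic and steer the conversation back to the main goal when needed."
    ),
}
print(json.dumps(create_agent_payload, indent = 2))
```

This is the shape of the JSON you will later submit through the Swagger UI when testing the endpoint.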
We now proceed to open the models.py file, where we will code the database table that belongs to the agent entity.
from sqlalchemy import Column, ForeignKey, String, DateTime, JSON
from sqlalchemy.orm import relationship
from datetime import datetime

from agents.database import Base

class Agent(Base):
    __tablename__ = "agents"

    id = Column(String, primary_key = True, index = True)
    timestamp = Column(DateTime, default = datetime.utcnow)

    context = Column(String, nullable = False)
    first_message = Column(String, nullable = False)
    response_shape = Column(JSON, nullable = False)
    instructions = Column(String, nullable = False)
This code is quite similar to the Pydantic model; it defines the agent's table in our database.
With our two data models in place, we're ready to implement the creation of the agent. For this, we will start by modifying the routes.py file and adding the endpoint:
@router.post("/create-agent", response_model = agents.api.schemas.Agent)
async def create_agent(agent: agents.api.schemas.AgentCreate, db: Session = Depends(get_db)):
    """
    Create an agent
    """
    log.info("Creating agent")
    # db_agent = create_agent(db, agent) # <-- Placeholder: we implement this CRUD function next.
    log.info(f"Agent created with id: {db_agent.id}")

    return db_agent
We need to create a new function that receives an Agent object from the request and creates it in the database. For this, we will create/open the crud.py file, which will hold all the interactions with the database (create, read, update, delete).
# crud.py
import uuid
from sqlalchemy.orm import Session

from agents import models
from agents.api import schemas

def create_agent(db: Session, agent: schemas.AgentCreate):
    """
    Create an agent in the database
    """
    db_agent = models.Agent(
        id             = str(uuid.uuid4()),
        context        = agent.context,
        first_message  = agent.first_message,
        response_shape = agent.response_shape,
        instructions   = agent.instructions
    )
    db.add(db_agent)
    db.commit()
    db.refresh(db_agent)

    return db_agent
With our function created, we can now return to routes.py, import the crud module, and use it in the endpoint's method.
import agents.crud

@router.post("/create-agent", response_model = agents.api.schemas.Agent)
async def create_agent(agent: agents.api.schemas.AgentCreate, db: Session = Depends(get_db)):
    """
    Create an agent endpoint.
    """
    log.info(f"Creating agent: {agent.json()}")
    db_agent = agents.crud.create_agent(db, agent)
    log.info(f"Agent created with id: {db_agent.id}")

    return db_agent
Now let's go back to the main.py file and add the "agents" router. The changes:
# main.py
from fastapi import FastAPI

from agents.api.routes import router as ai_agents # NOTE: <-- new addition
from agentsfwrk.logger import setup_applevel_logger

log = setup_applevel_logger(file_name = 'agents.log')

app = FastAPI()
app.include_router(router = ai_agents) # NOTE: <-- new addition

@app.get("/")
async def root():
    return {"message": "Hello there conversational ai user!"}
Let's test this functionality. First, we need to install our services as a Python package; second, we start the application on port 8000.
# Run from the root of the project.
$ pip install -e .
# Command to run the app.
$ uvicorn agents.main:app --host 0.0.0.0 --port 8000 --reload
Navigate to http://0.0.0.0:8000/docs, where you will see the Swagger UI with the endpoint to test. Submit your payload and check the output.
We'll continue building our application, but successfully testing the first endpoint is a good sign of progress.
Create Conversations & Preserve Conversation History
Our next step is to allow users to interact with our agents. We want users to interact with specific agents, so we need to pass the ID of the agent along with the first interaction message from the user. Let's make some changes to the Agent data model so that each agent can hold multiple conversations, by introducing the Conversation entity. Open the schemas.py file and add the following models:
class ConversationBase(BaseModel): # <-- Base of our conversations; they must belong to an agent
    agent_id: str

class ConversationCreate(ConversationBase): # <-- Conversation creation object
    pass

class Conversation(ConversationBase): # <-- The conversation object
    id: str
    timestamp: datetime = datetime.utcnow()

    class Config:
        orm_mode = True

class Agent(AgentBase): # <-- Agent data model
    id: str
    timestamp: datetime = datetime.utcnow()
    conversations: List[Conversation] = [] # <-- NOTE: we've added the conversations as a list of Conversation objects.

    class Config:
        orm_mode = True
Note that we've modified the Agent data model and added conversations to it; this is so each agent can hold multiple conversations, as designed in our diagram.
We now need to modify our database objects and include the conversation table in the database models script. We'll open the models.py file and modify the code as follows:
# models.py

class Agent(Base):
    __tablename__ = "agents"

    id = Column(String, primary_key = True, index = True)
    timestamp = Column(DateTime, default = datetime.utcnow)

    context = Column(String, nullable = False)
    first_message = Column(String, nullable = False)
    response_shape = Column(JSON, nullable = False)
    instructions = Column(String, nullable = False)

    conversations = relationship("Conversation", back_populates = "agent") # <-- NOTE: We add the conversations relationship into the agents table

class Conversation(Base):
    __tablename__ = "conversations"

    id = Column(String, primary_key = True, index = True)
    agent_id = Column(String, ForeignKey("agents.id"))
    timestamp = Column(DateTime, default = datetime.utcnow)

    agent = relationship("Agent", back_populates = "conversations") # <-- We add the relationship between the conversation and the agent
Note how we added the relationship to each agent's conversations in the agents table, as well as the relationship between a conversation and its agent in the conversations table.
We'll now create a set of CRUD functions to retrieve agents and conversations by their IDs, which will help us craft the process of creating a conversation and preserving its history. Let's open the crud.py file and add the following functions:
def get_agent(db: Session, agent_id: str):
    """
    Get an agent by its id
    """
    return db.query(models.Agent).filter(models.Agent.id == agent_id).first()

def get_conversation(db: Session, conversation_id: str):
    """
    Get a conversation by its id
    """
    return db.query(models.Conversation).filter(models.Conversation.id == conversation_id).first()

def create_conversation(db: Session, conversation: schemas.ConversationCreate):
    """
    Create a conversation
    """
    db_conversation = models.Conversation(
        id       = str(uuid.uuid4()),
        agent_id = conversation.agent_id,
    )
    db.add(db_conversation)
    db.commit()
    db.refresh(db_conversation)

    return db_conversation
These new functions will help us in the normal workflow of our application: we can now get an agent by its ID, get a conversation by its ID, and create a conversation by providing the ID of the agent that should hold it.
We can go ahead and create an endpoint that creates a conversation. Open routes.py and add the following code:
@router.post("/create-conversation", response_model = agents.api.schemas.Conversation)
async def create_conversation(conversation: agents.api.schemas.ConversationCreate, db: Session = Depends(get_db)):
    """
    Create a conversation linked to an agent
    """
    log.info(f"Creating conversation assigned to agent id: {conversation.agent_id}")
    db_conversation = agents.crud.create_conversation(db, conversation)
    log.info(f"Conversation created with id: {db_conversation.id}")

    return db_conversation
With this method ready, we're still one step away from having an actual conversational endpoint, which we will review next.
It is important to make a distinction here: when we initialize an agent, we can either create a conversation without triggering a two-way exchange of messages, or we can trigger the creation of a conversation when the "Chat with an agent" endpoint is called. This provides some flexibility for orchestrating workflows outside the microservice; in some cases you may want to initialize the agents and pre-kick-off conversations with clients, and as messages start to come in, begin preserving the message history.
Important note: if you're following this guide step by step and see an error related to the database schema at this point, it's because we're not applying migrations to the database with each modification of the schemas. Make sure to shut down the application (exit the terminal command) and delete the agents.db file that is created at runtime. You will then need to run each endpoint again and take note of the IDs.
Chat with an agent
We're going to introduce the last entity type in our application, the Message entity. It is responsible for modeling the interaction between a user's message and an agent's message (the two-way exchange of messages). We will also add API data models that define the structure of our endpoints' responses. Let's go ahead and create the data models and API response types first; open the schemas.py file and modify the code:
##########################################
# Internal schemas
##########################################
class MessageBase(BaseModel): # <-- Every message is composed of the user/client message and the agent's message
    user_message: str
    agent_message: str

class MessageCreate(MessageBase):
    pass

class Message(MessageBase): # <-- Data model for the Message entity
    id: str
    timestamp: datetime = datetime.utcnow()
    conversation_id: str

    class Config:
        orm_mode = True

##########################################
# API schemas
##########################################
class UserMessage(BaseModel):
    conversation_id: str
    message: str

class ChatAgentResponse(BaseModel):
    conversation_id: str
    response: str
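In plain JSON terms, UserMessage and ChatAgentResponse describe the request and response bodies of the chat endpoint we are about to build. A sketch of both shapes (the conversation id and texts are generated here purely for illustration):

```python
import json
import uuid

conversation_id = str(uuid.uuid4())

# What the client sends to the chat endpoint (UserMessage).
user_message = {
    "conversation_id": conversation_id,
    "message": "Hi, can you tell me more about your services?",
}

# What the endpoint returns (ChatAgentResponse): the same conversation id plus the agent's reply.
chat_agent_response = {
    "conversation_id": conversation_id,
    "response": "Of course! What would you like to know?",
}

print(json.dumps(user_message))
print(json.dumps(chat_agent_response))
```

Keeping the conversation_id in both directions is what lets a stateless client resume the same conversation on every call.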
We now need to add the data model to our database models script, which represents the table in our database. Open the models.py file and modify it as below:
# models.py

class Conversation(Base):
    __tablename__ = "conversations"

    id = Column(String, primary_key = True, index = True)
    agent_id = Column(String, ForeignKey("agents.id"))
    timestamp = Column(DateTime, default = datetime.utcnow)

    agent = relationship("Agent", back_populates = "conversations")
    messages = relationship("Message", back_populates = "conversation") # <-- We define the relationship between the conversation and its multiple messages.

class Message(Base):
    __tablename__ = "messages"

    id = Column(String, primary_key = True, index = True)
    timestamp = Column(DateTime, default = datetime.utcnow)

    user_message = Column(String)
    agent_message = Column(String)

    conversation_id = Column(String, ForeignKey("conversations.id")) # <-- A message belongs to a conversation
    conversation = relationship("Conversation", back_populates = "messages") # <-- We define the relationship between the messages and the conversation.
Note that we've modified our conversations table to define the relationship between messages and a conversation, and we created a new table that represents the interactions (exchanges of messages) that belong to a conversation.
We're now going to add a new CRUD function to interact with the database and create a message for a conversation. Let's open the crud.py file and add the following function:
def create_conversation_message(db: Session, message: schemas.MessageCreate, conversation_id: str):
    """
    Create a message for a conversation
    """
    db_message = models.Message(
        id              = str(uuid.uuid4()),
        user_message    = message.user_message,
        agent_message   = message.agent_message,
        conversation_id = conversation_id
    )
    db.add(db_message)
    db.commit()
    db.refresh(db_message)

    return db_message
Now we're ready to build the final and most interesting endpoint: the chat-agent endpoint. Let's open the routes.py file and follow the code along, as we will be implementing some processing functions along the way.
````python
@router.post("/chat-agent", response_model=agents.api.schemas.ChatAgentResponse)
async def chat_completion(message: agents.api.schemas.UserMessage, db: Session = Depends(get_db)):
    """Get a response from the GPT model given a message from the user using the chat
    completion endpoint.

    The response is a json object with the following structure:
    ```
    {
        "conversation_id": "string",
        "response": "string"
    }
    ```
    """
    log.info(f"User conversation id: {message.conversation_id}")
    log.info(f"User message: {message.message}")

    conversation = agents.crud.get_conversation(db, message.conversation_id)

    if not conversation:
        # If there is no conversation, we can choose to create one on the fly OR raise an exception.
        # Whichever you choose, make sure to uncomment when necessary.

        # Option 1:
        # conversation = agents.crud.create_conversation(db, message.conversation_id)

        # Option 2:
        raise HTTPException(
            status_code=404,
            detail="Conversation not found. Please create conversation first.")

    log.info(f"Conversation id: {conversation.id}")
````
In this part of the endpoint, we make sure to create the conversation or raise an exception if it doesn't exist. The next step is to prepare the data that will be sent to OpenAI through our integration. For this, we will create a set of processing functions in the processing.py file that craft the context, first message, instructions, and expected response shape for the LLM.
```python
# processing.py
import json


########################################
# Chat Properties
########################################
def craft_agent_chat_context(context: str) -> dict:
    """Craft the context for the agent to use for chat endpoints."""
    agent_chat_context = {
        "role": "system",
        "content": context
    }
    return agent_chat_context


def craft_agent_chat_first_message(content: str) -> dict:
    """Craft the first message for the agent to use for chat endpoints."""
    agent_chat_first_message = {
        "role": "assistant",
        "content": content
    }
    return agent_chat_first_message


def craft_agent_chat_instructions(instructions: str, response_shape: str) -> dict:
    """Craft the instructions for the agent to use for chat endpoints."""
    agent_instructions = {
        "role": "user",
        "content": instructions + f"\n\nFollow a RFC8259 compliant JSON with a shape of: {json.dumps(response_shape)} format without deviation."
    }
    return agent_instructions
```
Note the last function, which expects the response_shape defined during the creation of the agent. This input will be appended to the messages sent to the LLM during the course of a conversation and will guide the agent to follow the guidelines and return the response as a JSON object.
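To make that concrete, here is what the crafted instruction message looks like for a toy instruction/shape pair (the function is repeated from above so the snippet is self-contained; the input values are illustrative, not the real agent's):

```python
import json


def craft_agent_chat_instructions(instructions: str, response_shape: str) -> dict:
    """Craft the instructions for the agent to use for chat endpoints."""
    return {
        "role": "user",
        "content": instructions + f"\n\nFollow a RFC8259 compliant JSON with a shape of: {json.dumps(response_shape)} format without deviation."
    }


# Illustrative values only.
msg = craft_agent_chat_instructions(
    instructions="Extract the recipes mentioned in the conversation.",
    response_shape="{'recipes': 'List of strings with the name of the recipes'}",
)
# msg is a single "user" message whose content is the agent's instructions
# followed by the JSON-shape directive.
```

Because the directive travels as an ordinary user message, it is re-sent on every call, which keeps the model anchored to the expected output shape throughout the conversation.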
Let's return to the routes.py file and finish our endpoint implementation:
````python
# New imports from the processing module.
from agents.processing import (
    craft_agent_chat_context,
    craft_agent_chat_first_message,
    craft_agent_chat_instructions
)


@router.post("/chat-agent", response_model=agents.api.schemas.ChatAgentResponse)
async def chat_completion(message: agents.api.schemas.UserMessage, db: Session = Depends(get_db)):
    """Get a response from the GPT model given a message from the user using the chat
    completion endpoint.

    The response is a json object with the following structure:
    ```
    {
        "conversation_id": "string",
        "response": "string"
    }
    ```
    """
    log.info(f"User conversation id: {message.conversation_id}")
    log.info(f"User message: {message.message}")

    conversation = agents.crud.get_conversation(db, message.conversation_id)

    if not conversation:
        # If there is no conversation, we can choose to create one on the fly OR raise an exception.
        # Whichever you choose, make sure to uncomment when necessary.

        # Option 1:
        # conversation = agents.crud.create_conversation(db, message.conversation_id)

        # Option 2:
        raise HTTPException(
            status_code=404,
            detail="Conversation not found. Please create conversation first.")

    log.info(f"Conversation id: {conversation.id}")

    # NOTE: We craft the context first and pass the chat messages in a list,
    # appending the first message (the approach from the agent) to it.
    context = craft_agent_chat_context(conversation.agent.context)
    chat_messages = [craft_agent_chat_first_message(conversation.agent.first_message)]

    # NOTE: Append to the conversation all messages up to the last interaction from the agent.
    # If there are no messages, this has no effect.
    # Otherwise, we append each one in order by timestamp (which makes logical sense).
    hist_messages = conversation.messages
    hist_messages.sort(key=lambda x: x.timestamp, reverse=False)
    if len(hist_messages) > 0:
        for mes in hist_messages:
            log.info(f"Conversation history message: {mes.user_message} | {mes.agent_message}")
            chat_messages.append({
                "role": "user",
                "content": mes.user_message
            })
            chat_messages.append({
                "role": "assistant",
                "content": mes.agent_message
            })

    # NOTE: We could control the conversation by simply adding
    # rules on the length of the history.
    if len(hist_messages) > 10:
        # Finish the conversation gracefully.
        log.info("Conversation history is too long, finishing conversation.")
        api_response = agents.api.schemas.ChatAgentResponse(
            conversation_id=message.conversation_id,
            response="This conversation is over, good bye."
        )
        return api_response

    # Send the message to the AI agent and get the response.
    service = integrations.OpenAIIntegrationService(
        context=context,
        instruction=craft_agent_chat_instructions(
            conversation.agent.instructions,
            conversation.agent.response_shape
        )
    )
    service.add_chat_history(messages=chat_messages)

    response = service.answer_to_prompt(
        # We can test different OpenAI models.
        model="gpt-3.5-turbo",
        prompt=message.message,
        # We can test different parameters too.
        temperature=0.5,
        max_tokens=1000,
        frequency_penalty=0.5,
        presence_penalty=0
    )

    log.info(f"Agent response: {response}")

    # Prepare the response to the user.
    api_response = agents.api.schemas.ChatAgentResponse(
        conversation_id=message.conversation_id,
        response=response.get('answer')
    )

    # Save the interaction to the database.
    db_message = agents.crud.create_conversation_message(
        db=db,
        conversation_id=conversation.id,
        message=agents.api.schemas.MessageCreate(
            user_message=message.message,
            agent_message=response.get('answer'),
        ),
    )
    log.info(f"Conversation message id {db_message.id} saved to database")

    return api_response
````
Voilà! This is our final endpoint implementation. If we look at the notes added to the code, we see that the process is quite straightforward:
- We make sure the conversation exists in our database (or we create one)
- We craft the context and instructions for the agent from our database
- We make use of the "memory" of the agent by pulling the conversation history
- Finally, we request the agent's response through OpenAI's GPT-3.5 Turbo model and return the response to the user
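The "memory" step above boils down to replaying the stored interactions as alternating user/assistant messages before the new prompt is sent. Here is a pure-function sketch of that assembly; the function name and signature are illustrative, not the service's actual API:

```python
def build_chat_messages(context: str, first_message: str, history: list) -> list:
    """Assemble the message list sent to the chat completion API.

    history is a list of (user_message, agent_message) tuples,
    already ordered by timestamp.
    """
    messages = [
        {"role": "system", "content": context},           # the agent's context
        {"role": "assistant", "content": first_message},  # the agent's opener
    ]
    # Replay the stored interactions so the model "remembers" the
    # conversation even after a microservice restart.
    for user_msg, agent_msg in history:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": agent_msg})
    return messages


msgs = build_chat_messages(
    context="You are a chef.",
    first_message="Hello, what can I cook for you?",
    history=[("I want seafood.", "How about grilled shrimp?")],
)
```

Since the history lives in the database rather than in an in-process buffer, any replica of the service can rebuild this list and continue the conversation.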
Locally Testing Our Agents
Now we are ready to test the whole workflow of our microservice. We will start by going to our terminal and typing uvicorn agents.main:app --host 0.0.0.0 --port 8000 --reload to launch the app. Next, we will navigate to our Swagger UI by going to http://0.0.0.0:8000/docs, and we will submit the following requests:
Create the agent: give it a payload that you'd like to test. I'll submit the following:

```json
{
  "context": "You are a chef specializing in Mediterranean food that provides recipes with a maximum of 10 simple ingredients. The user may have many food preferences or ingredient preferences, and your job is always to analyze and guide them to use simple ingredients for the recipes you suggest, and these should also be Mediterranean. The response should include detailed information on the recipe. The response should also include questions to the user when necessary. If you think your response may be inaccurate or vague, do not write it and answer with the exact text: `I don't have a response.`",
  "first_message": "Hello, I am your personal chef and cooking advisor and I am here to help you with your meal preferences and your cooking skills. What can I do for you today?",
  "response_shape": "{'recipes': 'List of strings with the name of the recipes', 'ingredients': 'List of the ingredients used in the recipes', 'summary': 'String, summary of the conversation'}",
  "instructions": "Run through the conversation messages and discard any messages that are not relevant for cooking. Focus on extracting the recipes that were mentioned in the conversation, and for each of them extract the list of ingredients. Make sure to provide a summary of the conversation when asked."
}
```

Create the conversation: assign the conversation to the agent_id that you got from the previous response.

```json
{
  "agent_id": "Replace with the UUID from the agent you just created."
}
```

Let's chat with our agent through the outbound message: our agent will initialize the conversation by asking us a question or approaching us with a message.
We will follow the conversation by submitting a message back. I'll use this one:

```json
{
  "conversation_id": "74485ccb-21e5-45bd-8008-001cbd6d3e37",
  "message": "Hey, I am interested in cooking dinner for me and my friends tomorrow night, and I want your advice on what to cook for 5 people."
}
```
Our agent replied to us with a response, and we can continue this conversation by replying in a natural way.
```
# Our question
{
  "conversation_id": "74485ccb-21e5-45bd-8008-001cbd6d3e37",
  "message": "There is someone that is allergic to peanuts and that is it. My only preference is seafood and anything that goes perfectly with white wine."
}

# Our agent's response
{
  "conversation_id": "74485ccb-21e5-45bd-8008-001cbd6d3e37",
  "response": "Great! We can definitely work around the peanut allergy and incorporate seafood into the menu. How about we start with a refreshing appetizer of Grilled Shrimp Skewers with Lemon and Garlic? It's simple, delicious, and pairs well with white wine.\n\nHere's the recipe for Grilled Shrimp Skewers with Lemon and Garlic:\n\nIngredients:\n1. 1 pound of large shrimp, peeled and deveined\n2. 2 cloves of garlic, minced\n3. 2 tablespoons of fresh lemon juice\n4. 2 tablespoons of olive oil\n5. Salt and pepper to taste\n6. Wooden skewers, soaked in water for 30 minutes\n\nInstructions:\n1. Preheat your grill to medium-high heat.\n2. In a bowl, combine minced garlic, lemon juice, olive oil, salt, and pepper.\n3. Thread the shrimp onto the soaked wooden skewers.\n4. Brush the shrimp skewers with the garlic-lemon marinade on both sides.\n5. Place the skewers on the preheated grill and cook for about 2-3 minutes per side until they turn pink and opaque.\n6. Remove from the grill and serve hot.\n\nFor the main course, how about a Mediterranean-style Baked Salmon? It's flavorful, healthy, and pairs wonderfully with white wine.\n\nHere's the recipe for Mediterranean Baked Salmon:\n\nIngredients:\n1. 1 pound of salmon fillets\n2. 2 tablespoons of olive oil\n3. 2 cloves of garlic, minced\n4. 1 teaspoon of dried oregano\n5. Juice of half a lemon\n6. Salt and pepper to taste\n\nInstructions:\n1. Preheat your oven to 375°F (190°C).\n2. Place the salmon fillets on a baking sheet lined with parchment paper or aluminum foil.\n3. In a small bowl, mix together olive oil, minced garlic, dried oregano, lemon juice, salt, and pepper.\n4. Brush the mixture over the salmon fillets, making sure they are evenly coated.\n5. Bake the salmon in the preheated oven for about 12-15 minutes or until it flakes easily with a fork.\n6. Remove from the oven and let it rest for a few minutes before serving.\n\nTo complement the meal, you can serve a side of Roasted Mediterranean Vegetables. It's a colorful and nutritious addition to your dinner.\n\nHere's the recipe for Roasted Mediterranean Vegetables:\n\nIngredients:\n1. 1 red bell pepper, sliced\n2. 1 yellow bell pepper, sliced\n3. 1 zucchini, sliced\n4. 1 eggplant, diced\n5. 1 red onion, sliced\n6. 2 tablespoons of olive oil\n7. Salt and pepper to taste\n8. Fresh herbs like basil or parsley (optional)\n\nInstructions:\n1. Preheat your oven to 425°F (220°C).\n2. In a large bowl, toss together all the vegetables with olive oil, salt, and pepper until well coated.\n3. Spread the vegetables in a single layer on a baking sheet.\n4. Roast in the preheated oven for about 20-25 minutes or until they are tender and slightly caramelized.\n5. Remove from the oven and sprinkle with fresh herbs if desired.\n\nFor dessert, you can keep it simple yet satisfying with a classic Mediterranean dessert like Fresh Fruit Salad with Honey and Mint.\n\nHere's how to prepare Fresh Fruit Salad with Honey and Mint:\n\nIngredients:\n1. Assorted fresh fruits of your choice (such as berries, melons, grapes)\n2. Honey for drizzling\n3. Fresh mint leaves for garnish\n\nInstructions:\n1. Wash and cut the fresh fruits into bite-sized pieces.\n2. Combine all the fruits in a bowl.\n3. Drizzle honey over the fruit salad and gently toss to coat.\n4. Garnish with fresh mint leaves.\n5. Serve chilled.\n\nI hope you and your friends enjoy this Mediterranean dinner menu! If you have any other questions or need further assistance, feel free to ask."
}

# Our question - Note that we are not mentioning the allergy, as we expect the agent to remember we are talking about an allergy to peanuts.
{
  "conversation_id": "74485ccb-21e5-45bd-8008-001cbd6d3e37",
  "message": "Is the recipe okay for my friend who is allergic?"
}

# Our agent's response
{
  "conversation_id": "74485ccb-21e5-45bd-8008-001cbd6d3e37",
  "response": "Yes, the Grilled Shrimp Skewers with Lemon and Garlic recipe should be safe for your friend with a peanut allergy. However, it is always important to double-check the ingredients you use to ensure they are free from any potential allergens or cross-contamination."
}
```
Go ahead and play around with the code and your new agents. In the next section, I will focus on the deployment of this service.
We will deploy our application in a container environment in the cloud, such as Kubernetes, Azure Container Service, or AWS Elastic Container Service. This is where we create a docker image and add our code so we can run it in one of these environments. Go ahead and open the Dockerfile we created at the beginning and paste in the following code:
```dockerfile
# Dockerfile
FROM python:3.10-slim-bullseye

# Set the working directory
WORKDIR /app

# Copy the project files to the container
COPY . .

# Install the package using setup.py
RUN pip install -e .

# Install dependencies
RUN pip install pip -U && pip install --no-cache-dir -r requirements.txt

# Set the environment variable
ARG OPENAI_API_KEY
ENV OPENAI_API_KEY=$OPENAI_API_KEY

# Expose the required ports
EXPOSE 8000

# Run the application
# CMD ["uvicorn", "agents.main:app", "--host", "0.0.0.0", "--port", "8000"]
```
The Dockerfile installs the app and then runs it via the CMD, which is commented out. You should uncomment the command if you want to run the container locally as a standalone, but for other platforms such as Kubernetes, this is defined in the command section of the manifest when you define the deployment or pods.
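For reference, this is roughly what that looks like in a Kubernetes Deployment manifest. The names, labels, and image tag below are illustrative placeholders, not values from the repository:

```yaml
# Illustrative Deployment fragment: command/args take the place of
# the Dockerfile's commented-out CMD.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: agents-app
spec:
  replicas: 1
  selector:
    matchLabels:
      app: agents-app
  template:
    metadata:
      labels:
        app: agents-app
    spec:
      containers:
        - name: agents-app
          image: agents-app:latest  # the image built in the next step
          command: ["uvicorn"]
          args: ["agents.main:app", "--host", "0.0.0.0", "--port", "8000"]
          ports:
            - containerPort: 8000
```

Keeping the start command in the manifest rather than baked into the image is what lets the same image serve different roles (API server, worker, one-off job) across environments.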
Build the image, wait until the build is complete, and then test it by running the run command below:
```shell
# Build the image
$ docker build --build-arg OPENAI_API_KEY=<Replace with your OpenAI Key> -t agents-app .

# Run the container with the command from the agents app (use the -d flag for a detached run).
$ docker run -p 8000:8000 agents-app uvicorn agents.main:app --host 0.0.0.0 --port 8000

# Output
INFO:     Started server process [1]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
INFO:     172.17.0.1:41766 - "GET / HTTP/1.1" 200 OK
INFO:     172.17.0.1:41766 - "GET /favicon.ico HTTP/1.1" 404 Not Found
INFO:     172.17.0.1:41770 - "GET /docs HTTP/1.1" 200 OK
INFO:     172.17.0.1:41770 - "GET /openapi.json HTTP/1.1" 200 OK
```
Great, you are now ready to start using the application in your deployment environment.
Finally, we will integrate this microservice with a front-end application that serves the agents and the conversations by calling the endpoints internally, which is the common way of building and interacting between services in this architecture.
We can use this new service in multiple ways, and I will only focus on building a front-end application that calls the endpoints from our agents and makes it possible for users to interact via a UI. We will use Streamlit for this, as it is a simple way to spin up a front-end using Python.
Important note: there are additional utilities that I added to our agents' service, which you can copy directly from the repository. Look for get_agents(), get_conversations(), and get_messages() in the crud.py module and the api/routes.py routes.
Install Streamlit and add it to our requirements.txt:

```shell
# Pin a version if you need
$ pip install streamlit==1.25.0

# Our requirements.txt (added streamlit)
$ cat requirements.txt
fastapi==0.95.2
ipykernel==6.22.0
jupyter-bokeh==2.0.2
jupyterlab==3.6.3
openai==0.27.6
pandas==2.0.1
sqlalchemy-orm==1.2.10
sqlalchemy==2.0.15
streamlit==1.25.0
uvicorn<0.22.0,>=0.21.1
```

Create the application by first creating a folder named frontend in our src folder. Then create a new file named main.py and place the following code in it:

```python
import streamlit as st
import requests

API_URL = "http://0.0.0.0:8000/agents"  # We will use the local URL and port defined for our microservice in this example


def get_agents():
    """Get the list of available agents from the API"""
    response = requests.get(API_URL + "/get-agents")
    if response.status_code == 200:
        agents = response.json()
        return agents

    return []


def get_conversations(agent_id: str):
    """Get the list of conversations for the agent with the given ID"""
    response = requests.get(API_URL + "/get-conversations", params={"agent_id": agent_id})
    if response.status_code == 200:
        conversations = response.json()
        return conversations

    return []


def get_messages(conversation_id: str):
    """Get the list of messages for the conversation with the given ID"""
    response = requests.get(API_URL + "/get-messages", params={"conversation_id": conversation_id})
    if response.status_code == 200:
        messages = response.json()
        return messages

    return []


def send_message(conversation_id, message):
    """Send a message to the conversation with the given ID"""
    payload = {"conversation_id": conversation_id, "message": message}
    response = requests.post(API_URL + "/chat-agent", json=payload)
    if response.status_code == 200:
        return response.json()

    return {"response": "Error"}


def main():
    st.set_page_config(page_title="🤗💬 AIChat")

    with st.sidebar:
        st.title("Conversational Agent Chat")

        # Dropdown to select agent
        agents = get_agents()
        agent_ids = [agent["id"] for agent in agents]
        selected_agent = st.selectbox("Select an Agent:", agent_ids)

        for agent in agents:
            if agent["id"] == selected_agent:
                selected_agent_context = agent["context"]
                selected_agent_first_message = agent["first_message"]

        # Dropdown to select conversation
        conversations = get_conversations(selected_agent)
        conversation_ids = [conversation["id"] for conversation in conversations]
        selected_conversation = st.selectbox("Select a Conversation:", conversation_ids)

    if selected_conversation is None:
        st.write("Please select a conversation from the dropdown.")
    else:
        st.write(f"**Selected Agent**: {selected_agent}")
        st.write(f"**Selected Conversation**: {selected_conversation}")

        # Display chat messages
        st.title("Chat")
        st.write("This is a chat interface for the selected agent and conversation. You can send messages to the agent and see its responses.")
        st.write(f"**Agent Context**: {selected_agent_context}")

        messages = get_messages(selected_conversation)
        with st.chat_message("assistant"):
            st.write(selected_agent_first_message)

        for message in messages:
            with st.chat_message("user"):
                st.write(message["user_message"])
            with st.chat_message("assistant"):
                st.write(message["agent_message"])

        # User-provided prompt
        if prompt := st.chat_input("Send a message:"):
            with st.chat_message("user"):
                st.write(prompt)
            with st.spinner("Thinking..."):
                response = send_message(selected_conversation, prompt)
                with st.chat_message("assistant"):
                    st.write(response["response"])


if __name__ == "__main__":
    main()
```
The code above connects to our agents' microservice via API calls and allows the user to select an agent and a conversation and chat with the agent, similar to what ChatGPT offers. Let's run this app by opening another terminal (make sure you have the agents microservice up and running on port 8000) and typing streamlit run src/frontend/main.py, and you are ready to go!