Trip_advisor
AI-powered trip planning assistant that collaboratively creates personalized itineraries based on user preferences, accessible through a CLI, a FastAPI service, and a Streamlit interface.
Author

Dips1902
🏖️ Trip Planner: Streamlit with CrewAI
Introduction
Trip Planner leverages the CrewAI framework to automate and enhance the trip planning experience, integrating a CLI, a FastAPI service, and a user-friendly Streamlit interface.
CrewAI Framework
CrewAI simplifies the orchestration of role-playing AI agents. In VacAIgent, these agents collaboratively decide on cities and craft a complete itinerary for your trip based on specified preferences, all accessible via a streamlined Streamlit user interface.
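For orientation, here is a minimal, hypothetical sketch of that orchestration pattern (not the project's actual code in `trip_agents.py` / `trip_tasks.py`; the roles, goals, and task text below are illustrative assumptions):

```python
from crewai import Agent, Task, Crew, LLM

# Shared model; the project defaults to Gemini (see "Using LLM Models" below).
llm = LLM(model="gemini/gemini-2.0-flash")

city_selector = Agent(
    role="City Selection Expert",
    goal="Pick the best destination based on the traveller's dates and preferences",
    backstory="An analyst who weighs weather, season, and travel costs.",
    llm=llm,
)
travel_concierge = Agent(
    role="Travel Concierge",
    goal="Craft a detailed day-by-day itinerary for the chosen destination",
    backstory="A seasoned planner with an eye for local highlights.",
    llm=llm,
)

select_city = Task(
    description="Choose a destination for 2 adults travelling 2024-05-01 to 2024-05-10 "
                "who enjoy hiking, food, and water sports.",
    expected_output="One recommended city with a short justification.",
    agent=city_selector,
)
plan_itinerary = Task(
    description="Create a day-by-day itinerary for the selected city, covering activities and food.",
    expected_output="A complete itinerary with daily plans and a budget estimate.",
    agent=travel_concierge,
)

# The crew runs the tasks in order, so the itinerary builds on the selected city.
crew = Crew(agents=[city_selector, travel_concierge], tasks=[select_city, plan_itinerary])
print(crew.kickoff())
```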
Running the Application
To experience the VacAIgent app:
1. Configure the environment: Set up the environment variables for Browserless, Serper, and OpenAI. Use `secrets.example` as a guide to add your keys, then move that file (renamed to `secrets.toml`) to `.streamlit/secrets.toml` (a sketch of the file follows this list).
2. Install dependencies: Run `pip install -r requirements.txt` in your terminal.
3. Launch the CLI mode: Run `python cli_app.py -o "Bangalore, India" -d "Krabi, Thailand" -s 2024-05-01 -e 2024-05-10 -i "2 adults who love swimming, dancing, hiking, shopping, food, water sports adventures, rock climbing"`.
4. Launch the FastAPI server: Run `uvicorn api_app:app --reload`.
5. Launch the Streamlit app: Run `streamlit run streamlit_app.py`.
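A minimal sketch of what `.streamlit/secrets.toml` might look like; the exact key names are defined by `secrets.example` in the repository and may differ from those shown here:

```toml
# Hypothetical key names -- copy the real ones from secrets.example
GEMINI_API_KEY = "your-gemini-key"
OPENAI_API_KEY = "your-openai-key"
SERPER_API_KEY = "your-serper-key"
BROWSERLESS_API_KEY = "your-browserless-key"
```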
★ Disclaimer: The application uses Gemini by default. Ensure you have access to the Gemini API and be aware of the associated costs.
Details & Explanation
- Streamlit UI: The Streamlit interface is implemented in `streamlit_app.py`, where users can input their trip details (a simplified sketch of this flow follows this list).
- Components:
  - `./trip_tasks.py`: Contains the task prompts for the agents.
  - `./trip_agents.py`: Manages the creation of the agents.
  - `./tools` directory: Houses the tool classes used by the agents.
  - `./streamlit_app.py`: The heart of the Streamlit app.
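As a rough illustration of that input flow (a simplified, hypothetical sketch, not the real `streamlit_app.py`; the widget labels and the `run_trip_crew` stub below are assumptions):

```python
import streamlit as st

def run_trip_crew(origin, destination, start_date, end_date, interests):
    # Stub standing in for the real crew wiring in trip_agents.py / trip_tasks.py.
    return f"Sample itinerary for {destination}, {start_date} to {end_date}, tailored to: {interests}"

origin = st.text_input("Where are you traveling from?", "Bangalore, India")
destination = st.text_input("Which city are you interested in visiting?", "Krabi, Thailand")
start_date = st.date_input("Trip start date")
end_date = st.date_input("Trip end date")
interests = st.text_area("Trip details and interests",
                         "2 adults who love swimming, hiking, food, and water sports")

if st.button("Plan my trip"):
    st.markdown(run_trip_crew(origin, destination, start_date, end_date, interests))
```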
Using LLM Models
To switch between LLMs from different providers:
from crewai import LLM
from langchain_core.language_models import BaseChatModel  # base class used in the type hint


class TripAgents():
    def __init__(self, llm: BaseChatModel = None):
        if llm is None:
            # Default model; uncomment the line below to use Groq instead
            # self.llm = LLM(model="groq/deepseek-r1-distill-llama-70b")
            self.llm = LLM(model="gemini/gemini-2.0-flash")
        else:
            self.llm = llm
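For example (a minimal sketch, assuming the relevant provider API key is already set in your environment; the model name below is illustrative):

```python
from crewai import LLM

# Requires e.g. GROQ_API_KEY in the environment for a Groq-hosted model.
agents = TripAgents(llm=LLM(model="groq/deepseek-r1-distill-llama-70b"))
```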
Integrating Ollama with CrewAI
Pass the Ollama model to agents in the CrewAI framework:
from crewai import Agent, LLM

agent = Agent(
    role='Local AI Expert',
    goal='Process information using a local model',
    backstory="An AI assistant running on local hardware.",
    llm=LLM(model="ollama/llama3.2", base_url="http://localhost:11434"),
)
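This assumes an Ollama server is running locally on its default port (11434) and that the model has already been pulled, e.g. with `ollama pull llama3.2`.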
License
Trip Planner is open-sourced under the MIT License.