streamlit_application
AI Chatbot Documentation
Overview
This project is a Streamlit-based web application that implements a personal gym chatbot. It uses the Ollama API for local AI processing and provides a user-friendly interface for interacting with different language models.
Features
- Dark-themed user interface with custom CSS styling
- Configurable model selection (llama2, mistral, codellama)
- Adjustable temperature setting for AI responses
- Chat history persistence using Streamlit session state
- Real-time chat interface with user and assistant messages
- Clear chat functionality
- Error handling for API requests
Exciting News...
This project is now an official part of GirlScript Summer of Code – GSSoC'25! We're thrilled to welcome contributors from all over India and beyond to collaborate, build, and grow the streamlit application project! Let's make learning and career development smarter, together!
GSSoC is one of India's largest three-month-long open-source programs that encourages developers of all levels to contribute to real-world projects while learning, collaborating, and growing together.
With mentorship, community support, and collaborative coding, it's the perfect platform for developers to:
- Improve their skills
- Contribute to impactful projects
- Get recognized for their work
- Receive certificates and swag!
We can't wait to welcome new contributors from GSSoC 2025 to the streamlit application project family! Let's build, learn, and grow together, one commit at a time.
Prerequisites
- Python 3.8+
- Streamlit
- Requests library
- Ollama server running locally on port 11434
Installation
- Install required Python packages:
```bash
pip install streamlit requests
```
- Ensure Ollama is installed and running locally:
- Follow Ollama's official documentation for installation
- Start the Ollama server:
```bash
ollama serve
```
Project Structure
```
project_directory/
│
├── app.py        # Main application code
└── README.md     # This documentation file
```
Usage
- Run the Streamlit app:
```bash
streamlit run app.py
```
- Access the application through your web browser (typically at http://localhost:8501)
- Configure settings in the sidebar:
- Select desired model
- Adjust temperature slider
- View application information
- Interact with the chatbot:
- Enter messages in the text input field
- Click "Send" to get AI responses
- Use "Clear Chat" to reset the conversation
Code Explanation
Imports
- `streamlit`: for creating the web interface
- `requests`: for making API calls to Ollama
- `json`: for handling JSON data
Page Configuration
```python
st.set_page_config(
    page_title="AI Chatbot",
    page_icon="🤖",
    layout="wide"
)
```
Sets up the Streamlit page with a title, icon, and wide layout.
CSS Styling
Custom CSS is applied using st.markdown with unsafe_allow_html=True to create a dark theme and style various UI components.
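As an illustrative sketch (the exact rules and selectors in app.py may differ), a dark theme could be injected like this:

```python
# Illustrative dark-theme CSS; the actual styles in app.py may differ.
DARK_CSS = """
<style>
.stApp { background-color: #0e1117; color: #fafafa; }
.stTextInput > div > div > input { background-color: #262730; color: #fafafa; }
</style>
"""

# Inside the app, the CSS is injected with:
# st.markdown(DARK_CSS, unsafe_allow_html=True)
```

Because Streamlit escapes HTML by default, `unsafe_allow_html=True` is required for the `<style>` block to take effect.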
Session State
```python
if 'messages' not in st.session_state:
    st.session_state.messages = []
```
Initializes chat history storage using Streamlit's session state.
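A common pattern is to store each chat turn as a role/content dict in that list. The helper below is hypothetical (the exact keys used by app.py may differ) and mimics the append logic with a plain list:

```python
# Hypothetical helper mirroring how turns are typically appended to
# st.session_state.messages; the exact dict keys in app.py may differ.
def add_message(messages, role, content):
    """Append one chat turn as a {'role', 'content'} dict."""
    messages.append({"role": role, "content": content})
    return messages

history = []
add_message(history, "user", "What is a good warm-up?")
add_message(history, "assistant", "Try 5 minutes of light cardio.")
```

Because session state survives Streamlit reruns, appending to this list is enough to keep the conversation visible across interactions.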
Sidebar Configuration
```python
with st.sidebar:
    st.title("⚙️ Configuration")
    model = st.selectbox(...)
    temperature = st.slider(...)
```
Creates a sidebar for model selection and temperature adjustment.
Main Interface
- Displays the chat history in a container
- Shows user and assistant messages with proper formatting
- Uses markdown for styling
Ollama Integration
```python
def query_ollama(prompt, model_name, temp):
    response = requests.post("http://localhost:11434/api/generate", ...)
```
Handles API calls to the local Ollama server for generating responses.
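A fuller sketch of what such a function might look like (the payload shape follows Ollama's `/api/generate` endpoint; the exact parameter names and defaults in app.py may differ):

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama port

def build_payload(prompt, model_name, temp):
    """Assemble the JSON body for Ollama's /api/generate endpoint."""
    return {
        "model": model_name,
        "prompt": prompt,
        "stream": False,              # request a single JSON response
        "options": {"temperature": temp},
    }

def query_ollama(prompt, model_name, temp, timeout=60):
    """POST the prompt to the local Ollama server and return its reply text."""
    response = requests.post(
        OLLAMA_URL, json=build_payload(prompt, model_name, temp), timeout=timeout
    )
    response.raise_for_status()
    return response.json().get("response", "")
```

With `"stream": False`, Ollama returns one JSON object whose `response` field holds the generated text, which keeps the parsing logic simple.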
Chat Functionality
- Text input for user messages
- Send button to trigger AI response
- Clear chat button to reset conversation
- Spinner during API processing
Error Handling
- Checks for valid user input
- Handles API request errors
- Displays warning messages when appropriate
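One hedged way to wrap the request so the UI can show a friendly message instead of a stack trace (function and message text are illustrative, not the app's exact implementation):

```python
import requests

def safe_query(url, payload, timeout=5):
    """Return (ok, text): the model reply on success, or a friendly error."""
    try:
        response = requests.post(url, json=payload, timeout=timeout)
        response.raise_for_status()
        return True, response.json().get("response", "")
    except requests.exceptions.ConnectionError:
        return False, "Could not reach the Ollama server. Is `ollama serve` running?"
    except requests.exceptions.Timeout:
        return False, "The request timed out. Try again with a shorter prompt."
    except requests.exceptions.RequestException as exc:
        return False, f"Request failed: {exc}"
```

The caller can then show the message with `st.warning(...)` when `ok` is false instead of letting the exception crash the rerun.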
Limitations
- Requires local Ollama server
- Limited to supported models (llama2, mistral, codellama)
- No persistent storage for chat history
- Basic error handling for API failures
Future Improvements
- Add persistent storage for chat history
- Implement message streaming
- Add support for more models
- Enhance error handling
- Add conversation export functionality
Troubleshooting
- Ensure Ollama server is running on port 11434
- Verify model availability in Ollama
- Check internet connection for package installation
- Monitor Streamlit logs for errors
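For the first check, a small health probe against Ollama's `/api/tags` endpoint (which lists locally available models) can confirm the server is up; this helper is a suggestion, not part of app.py:

```python
import requests

def ollama_running(base_url="http://localhost:11434", timeout=2):
    """Return True if the Ollama server answers on its /api/tags endpoint."""
    try:
        return requests.get(f"{base_url}/api/tags", timeout=timeout).ok
    except requests.exceptions.RequestException:
        return False
```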
Code of Conduct
Please refer to the Code of Conduct for details on contributing guidelines and community standards.
Contribution Guidelines
We love our contributors! If you'd like to help, please check out our CONTRIBUTE.md file for guidelines.
Thank you once again to all our contributors who have contributed to streamlit application! Your efforts are truly appreciated.
See the full list of contributors and their contributions on the GitHub Contributors Graph.
Suggestions & Feedback
Feel free to open issues or discussions if you have any feedback, feature suggestions, or want to collaborate!
Support & Star
If you find this project helpful, please give it a star to support more such educational initiatives!
License
This project is licensed under the MIT License - see the License file for details.
Stargazers
Forkers
Manideep Botsa
Mentors – streamlit application (GSSoC'25)
| Role | Name | GitHub Profile | LinkedIn Profile |
|---|---|---|---|
| Mentor 1 | Aayush Kumar Gupta | AayushKGupta12 | aayush-kumar-gupta |
Built with ❤️ by the streamlit application Team
❤️ Manideep Botsa and Contributors ❤️
Ready to show off your coding achievements? Get started with streamlit application today!