👋 Introduction:
- This is Humphrey, an AI Minecraft Assistant. The project aims to create a helpful tool that provides assistance and answers questions related to the game Minecraft. It leverages Meta's Llama 3 large language model (LLM) to understand user queries and provide relevant information and guidance.
- This project is not affiliated with or endorsed by Mojang Studios, the creators of Minecraft.
📦 Tech Stack:
- Meta's Llama 3 large language model (LLM) for natural language understanding and generation.
- Python for backend scripting.
- HTML/CSS/JavaScript for frontend development.
- A text2text-generation model from Finnish-NLP on HuggingFace (loaded roughly as sketched below).
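As a rough illustration of how a HuggingFace text2text-generation model can be loaded, here is the standard transformers pipeline call. The model ID below is only a placeholder, since the exact Finnish-NLP model the project uses is not named here.

```python
# Sketch of loading a text2text-generation model with the transformers
# library. The model ID is a hypothetical placeholder, not the actual
# Finnish-NLP model used by the project.
from transformers import pipeline

generator = pipeline(
    "text2text-generation",
    model="Finnish-NLP/placeholder-model",  # replace with the real model ID
)

result = generator("How do I craft a torch in Minecraft?", max_length=128)
print(result[0]["generated_text"])
```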
👨‍💻 Features:
- Listening for and recognizing speech, responding only when "Humphrey" is spoken (similar to how "Alexa" works); a wake-word sketch follows this list.
- Answering questions related to gameplay mechanics, crafting recipes, item descriptions, etc.
- Providing tips, tricks, and strategies for gameplay improvement.
- Offering assistance with troubleshooting common issues and errors.
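Below is a minimal sketch of the wake-word behavior described above, assuming the SpeechRecognition library and the free Google Web Speech API; the actual project may use a different speech stack.

```python
# Minimal wake-word sketch (an assumption, not the project's actual code):
# listens on the microphone, transcribes speech, and only returns a query
# when the utterance begins with the wake word "Humphrey".
from typing import Optional

import speech_recognition as sr

recognizer = sr.Recognizer()

def listen_for_query() -> Optional[str]:
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)
    try:
        text = recognizer.recognize_google(audio)  # free Google Web Speech API
    except sr.UnknownValueError:
        return None  # speech was unintelligible
    if text.lower().startswith("humphrey"):
        return text[len("humphrey"):].strip()  # drop the wake word, keep the question
    return None  # wake word not spoken; ignore the utterance

if __name__ == "__main__":
    query = listen_for_query()
    if query:
        print("Question for the assistant:", query)
```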
💭 Process:
- The project begins with defining the scope and objectives of the AI Minecraft Assistant.
- Development starts with setting up the backend infrastructure, including installing and configuring the language model (a minimal backend sketch follows this list).
- Backend development continues with implementing speech recognition.
- Frontend development is initiated to create a user-friendly interface that is continuously updated with AJAX.
- Testing and debugging are performed to ensure the assistant functions correctly and provides accurate responses.
- Documentation and user guides are created to assist users in utilizing the AI Minecraft Assistant effectively.
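The sketch below shows one plausible way the backend pieces could fit together, assuming Flask for the web server and a local Ollama instance serving a Llama 3 model on its default port; the route name, model tag, and prompt are illustrative assumptions, not the project's actual implementation.

```python
# Minimal backend sketch (assumptions: Flask, a local Ollama server on the
# default port 11434, and a pulled "llama3" model). The /ask route is the
# kind of endpoint an AJAX frontend could call with {"question": "..."}.
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_llama(question: str) -> str:
    # Forward the question to Ollama's generate endpoint and return its answer.
    response = requests.post(
        OLLAMA_URL,
        json={
            "model": "llama3",
            "prompt": f"You are a helpful Minecraft assistant. {question}",
            "stream": False,
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]

@app.route("/ask", methods=["POST"])
def ask():
    question = request.get_json().get("question", "")
    return jsonify({"answer": ask_llama(question)})

if __name__ == "__main__":
    app.run(debug=True)
```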
📚 What We Learn:
- Understanding and leveraging the capabilities of language models for natural language processing tasks.
- Integrating external APIs or SDKs with existing software (e.g., HuggingFace NLP, Ollama API requests).
- Frontend development skills for creating user interfaces.
- Project management and coordination in software development.
- Problem-solving and troubleshooting skills, especially when dealing with complex systems and integration challenges.
✨ Improvements:
- Fine-tuning the LLM instead of relying on prompt engineering.
- Adding support for additional languages to cater to a broader audience.
- Gathering user feedback and iteratively improving the assistant based on user needs and preferences.
🚦 Running the Project:
- Clone the repository from GitHub.
- Install and run Ollama on your local machine.
- Install dependencies as specified in the project's requirements.txt.
- Run the program with `python3 app.py`, using a Python version other than 3.12 (tested with 3.11 and 3.9).
- Speak a question beginning with the word "Humphrey", similar to how "Alexa" works.
📸 Video of the Website:
