AI Skill Report Card
Building AI Workstations
Quick Start (15 / 15)
Check system readiness:
```bash
# System info
uname -a && lsb_release -a
free -h && df -h
lscpu | grep -E "(Model name|Thread|Core)"
nvidia-smi || echo "No NVIDIA GPU detected"

# Essential tools check
which python3 git docker curl wget || echo "Missing tools detected"
```
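The same readiness check can be scripted so agents or CI can run it. A minimal sketch using only the standard library; the tool list is illustrative and should match your own stack:

```python
# check_ready.py - quick sanity check for an AI workstation (tool list is illustrative)
import shutil

REQUIRED_TOOLS = ["python3", "git", "docker", "curl", "wget"]

def missing_tools(tools=REQUIRED_TOOLS):
    """Return the subset of tools not found on PATH."""
    return [t for t in tools if shutil.which(t) is None]

if __name__ == "__main__":
    missing = missing_tools()
    if missing:
        print("Missing tools:", ", ".join(missing))
    else:
        print("All essential tools found")
```

A non-empty result maps directly onto the "Missing tools detected" branch of the shell check above.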
Recommendation:
Add actual input/output pairs for the system readiness check showing what good vs problematic output looks like
Workflow (15 / 15)
Phase 1: Base System Setup
Progress:
- Verify Ubuntu/Debian system requirements
- Install essential development tools
- Configure GPU drivers (if NVIDIA)
- Set up Python environment
- Install and configure Docker
Phase 2: AI Infrastructure
Progress:
- Install local LLM runtime (Ollama recommended)
- Set up model storage on appropriate drive
- Test GPU acceleration
- Configure development environment
Phase 3: Agent Framework
Progress:
- Set up coding agent tools
- Configure browser automation
- Implement safety boundaries
- Test agent workflows
Recommendation:
Include specific error messages and their solutions in the troubleshooting examples rather than just general categories
Base System Setup
Essential packages:
```bash
sudo apt update && sudo apt install -y \
  build-essential curl wget git vim \
  python3 python3-pip python3-venv \
  apt-transport-https ca-certificates \
  software-properties-common \
  htop tree jq unzip
```
NVIDIA GPU setup (if applicable):
```bash
# Check GPU
lspci | grep -i nvidia

# Install drivers
sudo apt install nvidia-driver-535 nvidia-cuda-toolkit
sudo reboot

# Verify after reboot
nvidia-smi
```
Docker setup:
```bash
# Install Docker
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list
sudo apt update && sudo apt install -y docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin

# Configure user permissions
sudo usermod -aG docker $USER
newgrp docker

# Test
docker run hello-world
```
Local AI Setup
Ollama installation:
```bash
# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Start service
sudo systemctl start ollama
sudo systemctl enable ollama

# Test with small model
ollama pull phi3:mini
ollama run phi3:mini "Hello, test response"
```
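Once the service is running, Ollama also exposes a local REST API on port 11434, which is handy for testing from scripts before wiring up a framework. A stdlib-only sketch; the model name matches the pull above and the port is Ollama's default:

```python
import json
import urllib.request

def build_payload(prompt, model="phi3:mini"):
    """Request body for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt, model="phi3:mini", host="http://localhost:11434"):
    """Send a prompt to the local Ollama server and return the response text."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `stream=False` the server returns one JSON object whose `response` field holds the full completion, which keeps the client trivial.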
Model storage configuration:
```bash
# Check disk space
df -h

# If moving to larger drive
sudo systemctl stop ollama
sudo mkdir -p /mnt/models/ollama
# Note: the official installer runs the service as the 'ollama' user,
# which also needs access to this directory
sudo chown -R $USER:$USER /mnt/models
echo 'OLLAMA_MODELS=/mnt/models/ollama' | sudo tee -a /etc/environment

# Restart and verify
sudo systemctl start ollama
ollama list
```
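Before pulling large models, the free-space check can be automated rather than eyeballed from `df -h`. A small stdlib sketch; the 20 GB threshold is an arbitrary example, not a requirement:

```python
import shutil

def free_gb(path="/"):
    """Free disk space at `path`, in whole gigabytes (decimal GB)."""
    return shutil.disk_usage(path).free // 10**9

def enough_space_for_model(path="/", needed_gb=20):
    """True if `path` has at least `needed_gb` GB free."""
    return free_gb(path) >= needed_gb
```

Pointing `path` at the models drive (e.g. `/mnt/models`) makes this a quick pre-pull guard in scripts.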
Development Environment
Python setup:
```bash
# Create project structure
mkdir -p ~/ai-workspace/{models,agents,tools}
cd ~/ai-workspace

# Virtual environment
python3 -m venv venv
source venv/bin/activate

# Essential packages
pip install --upgrade pip
pip install fastapi uvicorn requests python-dotenv
pip install openai anthropic ollama
pip install selenium beautifulsoup4 playwright
playwright install  # download browser binaries for Playwright
```
VS Code with AI extensions:
```bash
# Install VS Code
wget -qO- https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > packages.microsoft.gpg
sudo install -o root -g root -m 644 packages.microsoft.gpg /etc/apt/trusted.gpg.d/
sudo sh -c 'echo "deb [arch=amd64,arm64,armhf signed-by=/etc/apt/trusted.gpg.d/packages.microsoft.gpg] https://packages.microsoft.com/repos/code stable main" > /etc/apt/sources.list.d/vscode.list'
sudo apt update && sudo apt install code

# Launch and install Python, Docker, Remote extensions
code ~/ai-workspace
```
Examples (17 / 20)
Example 1: Local API Server
Input: Need API endpoint for local model
Output:
```python
# api.py
from fastapi import FastAPI
import ollama

app = FastAPI()

@app.post("/chat")
async def chat(prompt: str):
    response = ollama.chat(model='phi3:mini', messages=[
        {'role': 'user', 'content': prompt}
    ])
    return {"response": response['message']['content']}

# Run: uvicorn api:app --host 0.0.0.0 --port 8000
```
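Because the endpoint declares a bare `str` parameter, FastAPI treats `prompt` as a query parameter rather than a request body. A stdlib client sketch for the server above; the base URL assumes the default port from the run command:

```python
import json
import urllib.parse
import urllib.request

def chat_url(prompt, base="http://localhost:8000"):
    """Build the /chat URL with the prompt as a query parameter."""
    return f"{base}/chat?" + urllib.parse.urlencode({"prompt": prompt})

def ask(prompt, base="http://localhost:8000"):
    """POST a prompt to the local API server and return the model's reply."""
    req = urllib.request.Request(chat_url(prompt, base), method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

For longer prompts it would be cleaner to switch the endpoint to a Pydantic request model and send a JSON body instead.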
Example 2: Safe Coding Agent
Input: Automate code reviews in workspace
Output:
```python
# coding_agent.py
from pathlib import Path

class SafeCodeAgent:
    def __init__(self, workspace_path):
        self.workspace = Path(workspace_path).resolve()

    def review_file(self, filepath):
        if not self._is_safe_path(filepath):
            raise ValueError("File outside workspace")
        # Analysis logic here

    def _is_safe_path(self, filepath):
        # resolve() follows symlinks, so links pointing outside the
        # workspace are also rejected
        return Path(filepath).resolve().is_relative_to(self.workspace)
```
Recommendation:
Provide concrete performance benchmarks or success criteria for each setup phase (e.g., 'Ollama should respond in <2s for phi3:mini')
Best Practices
System Management:
- Use systemd for service management
- Keep models on fastest/largest drive
- Monitor GPU memory with `nvidia-smi -l 1`
- Regular cleanup: `docker system prune -a`
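The GPU monitoring tip can be folded into a script for dashboards or agent pre-flight checks. `--query-gpu` and `--format=csv` are standard `nvidia-smi` flags; the wrapper below is a sketch that degrades gracefully on machines without an NVIDIA GPU:

```python
import subprocess

QUERY = ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

def parse_memory(line):
    """Parse one CSV line like '1024, 8192' into (used_mb, total_mb)."""
    used, total = (int(x.strip()) for x in line.split(","))
    return used, total

def gpu_memory():
    """Return a list of (used_mb, total_mb) per GPU, or [] if nvidia-smi fails."""
    try:
        out = subprocess.run(QUERY, capture_output=True, text=True,
                             check=True).stdout
    except (OSError, subprocess.CalledProcessError):
        return []
    return [parse_memory(line) for line in out.strip().splitlines() if line]
```

Returning `[]` instead of raising mirrors the `nvidia-smi || echo` fallback used in the Quick Start check.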
Security Boundaries:
- Run agents in workspace-scoped containers
- Use read-only mounts for sensitive data
- Implement file path validation
- Never give agents sudo access
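The boundaries above can be combined into a single `docker run` invocation. A sketch that builds the command as a list (image name, mount point, and resource limits are placeholder examples; all flags are standard `docker run` options):

```python
def sandboxed_cmd(image, workspace, command, memory="2g", cpus="2"):
    """Build a docker run command that scopes an agent to one workspace."""
    return [
        "docker", "run", "--rm",
        "--network", "none",            # no network access by default
        "--memory", memory,             # cap RAM usage
        "--cpus", cpus,                 # cap CPU usage
        "--read-only",                  # read-only root filesystem
        "-v", f"{workspace}:/work",     # writable only inside the workspace mount
        "-w", "/work",
        image, *command,
    ]
```

For sensitive reference data, an extra `-v /path/to/data:/data:ro` mount keeps it readable but immutable; pass the resulting list to `subprocess.run` without `shell=True` so no shell interpolation occurs.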
Performance Optimization:
- GPU: Use CUDA-enabled containers
- RAM: Monitor with `htop`, consider swap on SSD
- Storage: SSD for models, HDD for archives
- Network: Test local vs cloud latency
Common Pitfalls
GPU Issues:
- NVIDIA driver mismatch → `sudo apt purge 'nvidia-*'`, then reinstall the driver
- CUDA version conflicts → Use Docker images with a specific CUDA base
- GPU memory exhaustion → Monitor with `nvidia-smi`
Docker Problems:
- Permission denied → Check user is in the docker group: `groups $USER`
- Port conflicts → `docker ps` and `netstat -tlnp`
- Storage full → `docker system df && docker system prune`
Python Environment:
- Multiple Python versions → Use explicit `python3`
- Package conflicts → Fresh venv for each project
- Missing system deps → Install dev packages with apt
Agent Safety:
- Unbounded file access → Implement path validation
- Resource exhaustion → Set container limits
- Network access → Use restricted Docker networks
- Persistent state → Scope to project directories
Path and Environment Issues:
- Stale symlinks → `find -L . -type l`
- Wrong PATH → Check with `echo $PATH`
- Environment not activated → `which python`
- Service conflicts → `systemctl status service-name`