Welcome to the MCP Fundamentals Workshop by learnwithparam.com!
This hands-on workshop teaches you the Model Context Protocol (MCP) — an open protocol that standardizes how AI applications connect to external tools and data sources. You'll go from manual tool integration to building MCP servers and multi-server clients, step by step.
Regional pricing is available for eligible learners, with discounts of up to 60% in supported regions. Start here: https://www.learnwithparam.com/ai-bootcamp
You're building a DevTools AI Assistant — an internal tool that helps your engineering team find research papers (arXiv) and look up technical concepts (Wikipedia). You'll start by wiring everything up manually, feel the pain of the M x N problem, then rebuild it with MCP to see everything click.
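The arithmetic behind the M x N problem is worth making concrete: with M AI applications and N tools, point-to-point wiring needs M × N custom integrations, while a shared protocol needs only M clients plus N servers. A quick sketch (the numbers are illustrative):

```python
def integrations_without_mcp(apps: int, tools: int) -> int:
    # Every app wires up every tool by hand: one custom integration per pair.
    return apps * tools

def integrations_with_mcp(apps: int, tools: int) -> int:
    # Each app implements one MCP client; each tool ships one MCP server.
    return apps + tools

# Example: 3 AI apps (IDE assistant, chat app, CI bot) x 4 data sources.
print(integrations_without_mcp(3, 4))  # 12 bespoke integrations
print(integrations_with_mcp(3, 4))     # 7 adapters total
```

Every new data source you add without MCP multiplies the work; with MCP it adds one server.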
| Step | File | Description |
|---|---|---|
| 01 | 01-why-mcp.md | The M x N integration problem and why MCP exists |
| 02 | 02-mcp-architecture.md | Host, Client, Server roles + Tools, Resources, Prompts |
| 03 | 03-transport-protocols.md | STDIO vs SSE vs Streamable HTTP — when to use each |
| 04 | 04-tool-use-arxiv.py | Manual tool integration for arXiv (notice the boilerplate) |
| 05 | 05-tool-use-wikipedia.py | Same boilerplate again for Wikipedia (the M x N problem) |
| 06 | 06-build-mcp-server.py | Build a server from scratch with @mcp.tool() |
| 07 | 07-mcp-client-basics.py | Your first MCP client — CLI, no LLM, STDIO transport |
| 08 | 08-mcp-client-single.py | Full chat UI client — Streamlit + LLM + SSE transport |
| 09 | 09-resources-and-prompts.md | Resources (read-only data) and Prompts (pre-built workflows) |
| 10 | 10-mcp-client-multi.py | Connect to multiple servers — add sources via config |
| File | Transport | Features |
|---|---|---|
| servers/arxiv-server.py | STDIO | Tools only |
| servers/wikipedia-server-stdio.py | STDIO | Tools only |
| servers/wikipedia-server-sse.py | SSE | Tools only |
| servers/wikipedia-server-full-stdio.py | STDIO | Tools + Resources + Prompts |
| servers/wikipedia-server-full-sse.py | SSE | Tools + Resources + Prompts |
| servers/wikipedia-server-full-http.py | Streamable HTTP | Tools + Resources + Prompts |
- LLM: LiteLLM (supports Gemini, OpenAI, Anthropic, etc.)
- MCP SDK: FastMCP (Python)
- Chat UI: Streamlit
- APIs: arXiv, Wikipedia
- Environment: UV, Docker
- Python 3.11+
- UV (fast Python package manager)
- An LLM API key (Gemini recommended for its free tier)
- Docker (optional, for containerized setup)
- Node.js (only needed for step 10's filesystem server via npx)
```bash
# 1. Initialize environment and install dependencies
make setup

# 2. Add your API key
# Edit .env and set GOOGLE_API_KEY (or any LiteLLM-supported provider)

# 3. Read the theory (in order)
# Open 01-why-mcp.md, 02-mcp-architecture.md, 03-transport-protocols.md

# 4. Run the "pain" demos (steps 04-05)
make run-tool-use-arxiv   # Manual arXiv integration
make run-tool-use-wiki    # Manual Wikipedia integration

# 5. Build and test an MCP server (step 06)
# Read 06-build-mcp-server.py, then test with MCP Inspector:
npx @modelcontextprotocol/inspector uv run python 06-build-mcp-server.py

# 6. Run an MCP client (steps 07-08)
uv run python 07-mcp-client-basics.py   # CLI client
make run-server-wiki-sse                # Start server (terminal 1)
make run-client-single                  # Start client (terminal 2)

# 7. Multi-server (step 10)
make run-client-multi
```
- The Problem (10 min): Walk through 01-why-mcp.md. Ask the audience: "How many of you have written custom API integrations? How many times?"
- Architecture (10 min): 02-mcp-architecture.md + 03-transport-protocols.md. Focus on the three roles and three capabilities.
- Live Coding: The Pain (20 min): Open 04-tool-use-arxiv.py. Run it. Then open 05-tool-use-wikipedia.py. Point out the copy-pasted boilerplate. Ask: "What happens when we add a 3rd data source?"
- Live Coding: Build a Server (15 min): Walk through 06-build-mcp-server.py step by step. Show how @mcp.tool() eliminates hand-written schemas. Test with MCP Inspector.
- Live Coding: The Client (15 min): Run 07-mcp-client-basics.py. Show tool discovery. Then run 08-mcp-client-single.py with the SSE server for the full chat experience.
- Resources & Prompts (10 min): Walk through 09-resources-and-prompts.md. Show the full server variants.
- Multi-Server (10 min): Show server_config.json, then run 10-mcp-client-multi.py. Add/remove a server entry live.
- Wrap-Up (10 min): Production considerations, the MCP ecosystem, what to build next.
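The multi-server client is driven entirely by server_config.json: adding a data source means adding an entry, not writing code. The exact schema in this repo may differ, but a typical config in the widely used `mcpServers` style looks like this:

```json
{
  "mcpServers": {
    "wikipedia": {
      "command": "uv",
      "args": ["run", "python", "servers/wikipedia-server-stdio.py"]
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }
  }
}
```

Deleting the `filesystem` block (or adding a third entry) live is an effective way to show that the client code never changes.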
- Read the theory (steps 01-03) to understand the concepts
- Run the pain demos (steps 04-05) to feel the M x N problem
- Build a server (step 06) — read the code, then test it
- Connect a client (steps 07-08) to see MCP in action
- Learn Resources & Prompts (step 09) and explore the full server variants
- Go multi-server (step 10) to see the full power of MCP
```bash
# Setup
make setup                        # Initialize .env, install uv, create venv
make install                      # Sync dependencies
make validate                     # Validate Python & JSON syntax

# Servers
make run-server-arxiv             # arXiv MCP server (STDIO)
make run-server-wiki-stdio        # Wikipedia MCP server (STDIO)
make run-server-wiki-sse          # Wikipedia MCP server (SSE, port 8000)
make run-server-wiki-full-stdio   # Full Wikipedia server (STDIO)
make run-server-wiki-full-sse     # Full Wikipedia server (SSE)
make run-server-wiki-full-http    # Full Wikipedia server (Streamable HTTP)

# Demos & Clients
make run-tool-use-arxiv           # Step 04: Manual arXiv integration
make run-tool-use-wiki            # Step 05: Manual Wikipedia integration
make run-client-basics            # Step 07: CLI MCP client
make run-client-single            # Step 08: Single-server Streamlit client
make run-client-multi             # Step 10: Multi-server Streamlit client

# Testing & Cleanup
make test                         # Run full test suite
make clean                        # Remove venv, cache, auto-generated data
make clean-all                    # Also remove Docker volumes
```

Created with care by learnwithparam.com.