MCP Servers for Python Devs
Episode Deep Dive
Guest Introduction and Background
Den Delimarsky is a Principal Product Engineer at Microsoft working in the Core AI division, focusing on AI tools for developers. Den is one of the core maintainers of the Model Context Protocol (MCP), having initially joined the project through his expertise in security and authorization. When MCP first launched with an auth specification, Den identified opportunities to improve it for enterprise scale and worked with the Anthropic team to rewrite the authorization framework, which was merged into the June 2025 revision of the protocol. Beyond MCP, Den helps drive projects like GitHub SpecKit, which enables spec-driven development with agentic coding tools. His work centers on building developer tools and experiences in the rapidly evolving AI ecosystem, including Copilot and other Microsoft AI initiatives.
What to Know If You're New to Python
- Understanding of async/await: MCP servers heavily use Python's async/await syntax for handling streaming responses and progress updates. This is fundamental to building responsive MCP servers that can report progress during long-running operations.
- Familiarity with decorators: The FastMCP framework uses a Flask-like decorator pattern (@mcp.tool, @mcp.prompt, @mcp.resource) to expose functions as MCP primitives. Understanding how Python decorators work will make the programming model intuitive.
- Basic HTTP and JSON-RPC concepts: While MCP abstracts much of the complexity, understanding that MCP servers communicate via JSON-RPC messages over either stdio (local processes) or HTTP (remote servers) helps with debugging and architecture decisions.
- Pydantic models: Structured output in MCP commonly uses Pydantic for data validation, similar to FastAPI. This ensures type safety and structured data exchange between AI agents and your server.
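To make these prerequisites concrete, here is a small stdlib-only sketch (no MCP involved) of the two patterns the SDK builds on: a registering decorator and an async function. The `tool` decorator and registry here are illustrative toys, not the real SDK API.

```python
import asyncio

def tool(fn):
    """Toy decorator that records a function in a registry,
    mimicking the @mcp.tool pattern (illustrative only)."""
    tool.registry = getattr(tool, "registry", {})
    tool.registry[fn.__name__] = fn
    return fn

@tool
async def add(a: int, b: int) -> int:
    # A real MCP tool might await network or database I/O here
    await asyncio.sleep(0)  # stand-in for async work
    return a + b

print(asyncio.run(add(2, 3)))  # 5
```

If decorating and awaiting a function like this feels natural, the FastMCP programming model will too.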
Key Points and Takeaways
1. MCP as "USB-C for AI" - Universal Integration Layer
The Model Context Protocol solves a fundamental problem in AI systems: LLMs are trained on data that gets locked at a specific point in time, but users need to work with fresh, dynamic data. MCP provides a universal interface that allows any LLM or AI client to connect to data sources, applications, and services without custom integrations. Just as the Language Server Protocol (LSP) standardized how editors communicate with language tools, MCP standardizes how AI agents access external capabilities. The protocol is highly opinionated about authentication, message passing, and primitive exposure, eliminating the inconsistency found in traditional REST API integrations.
The protocol went from non-existent to widely adopted in less than a year, with major companies across banking, healthcare, and gaming now integrating MCP into their AI strategies. The composability of MCP means you can connect multiple servers to a single client, allowing an LLM to coordinate across different data sources and services seamlessly.
- modelcontextprotocol.io - Official MCP specification and documentation
- github.com/modelcontextprotocol/python-sdk - Python SDK with 143+ contributors
- MCP Registry Blog Post - Introduction to the MCP Registry
- github.com/mcp - GitHub MCP registry for discovering servers
2. Building MCP Servers with Python and FastMCP
The Python SDK makes building MCP servers remarkably simple through the FastMCP framework, which provides a Flask-like developer experience. Creating an MCP tool is as straightforward as writing a Python function and adding a decorator. The SDK handles all the complex JSON-RPC envelope creation, streaming, and protocol compliance automatically. Developers can focus on business logic rather than protocol implementation details.
FastMCP is integral to the official Python SDK and simplifies common pain points like authorization. The programming model supports async functions naturally, allowing you to await user input via elicitations without complex callback patterns. The framework also includes built-in support for structured output using Pydantic models, progress reporting, and image handling.
- FastMCP Documentation - Flask-like framework for MCP servers
- Install with uv add mcp or pip install mcp
- FastMCP GitHub examples showing decorator patterns for tools, prompts, and resources
3. Three Core MCP Primitives: Tools, Prompts, and Resources
MCP servers expose three fundamental primitives that LLMs can interact with. Tools are function calls that perform actions - think of them as API endpoints that do something like sending an email, querying a database, or creating a 3D scene in Blender. Prompts are reusable templates that help LLMs interact with your server effectively, such as "create a recipe with substitutions." Resources allow LLMs to reference and work with entities like databases, files, or API objects.
Each primitive serves a distinct purpose in the agent workflow. Tools enable actions and side effects. Prompts guide the LLM on how to best use your server. Resources provide structured access to data and entities. Together, these primitives create a complete interaction model that's both powerful and constrained enough to be reliable.
- Tools: Actions and functions the LLM can invoke (add decorator to any Python function)
- Prompts: Pre-configured templates for common interactions
- Resources: References to databases, files, and API entities
- Elicitations: Structured input requests for user confirmation or data
4. Two Transport Modes: stdio and Streamable HTTP
MCP servers can run in two distinct modes depending on your architecture needs. Local MCP servers use stdio (standard input/output) to communicate via native OS constructs between the MCP client and server processes. This is perfect for development machines where you want your editor or AI tool to access local capabilities without network overhead. Remote MCP servers use streamable HTTP and can be hosted anywhere - AWS, Azure, GCP, your home lab, or behind a reverse proxy like Nginx or Caddy.
The transport layer is abstracted by the SDK, so the same server code can work in both modes with minimal changes. For local development with remote access, tools like Tailscale provide secure overlay networks without exposing ports or configuring complex VPN setups. This makes it trivial to run an MCP server on your home lab and access it securely from anywhere.
- stdio transport: Local process communication for development and desktop integration
- Streamable HTTP: Cloud hosting with standard HTTP conventions for JSON-RPC
- Containerization: Run in Docker with Nginx/Caddy for production deployment
- Tailscale - Secure overlay networks for private MCP server access without VPN complexity
5. The MCP Registry and Discovery Ecosystem
The MCP Registry launched in September 2025 as a centralized API that indexes all publicly available MCP servers. Think of it like Docker Hub for MCP servers - you can discover servers, but you're not required to use the registry. The registry supports both public servers (like the GitHub-maintained registry) and private enterprise registries for internal company use. This allows organizations to maintain approved MCP servers behind security gates while still benefiting from the discoverability infrastructure.
Discovery is improving rapidly with better integration into clients like VS Code, Cursor, and Claude Desktop. The Awesome MCP Servers list on GitHub has become a valuable community resource with hundreds of servers categorized by function - from biology and medicine to gaming, marketing, and sports analytics.
- github.com/mcp - Browse available MCP servers with one-click install to editors
- Awesome MCP Servers - Comprehensive list with 72,000+ GitHub stars
- Public and private registry support for enterprise use cases
- Category-based discovery: command line, cloud platforms, gaming, marketing, sports, and more
6. OAuth 2.1 Security Without the Complexity
Security and authorization were Den's entry point into MCP development. The June 2025 spec introduced a reworked OAuth 2.1-based authorization framework, eliminating the need for developers to implement custom auth flows or check API keys into source control. The brilliant part is that MCP server developers don't need to become OAuth experts - the SDKs handle it. For consumers, authentication is as simple as logging in when you connect a server. The client bootstraps the auth flow, stores tokens securely, and ensures you access only your data.
MCP servers can specify whether they use API keys (stored in configuration) or OAuth (handled via standard browser-based login flows). This approach scales from hobby projects to enterprise deployments where data access controls are critical. The standardization means you don't face "17 different dances" to get authentication tokens from different services.
- OAuth 2.1 specification built into MCP protocol
- SDK handles authentication complexity for both servers and clients
- Browser-based login flows for OAuth servers
- Secure token storage by MCP clients
- MCP Authorization Specification - Technical details on auth implementation
7. Spec-Driven Development with GitHub SpecKit
GitHub SpecKit represents Microsoft's hypothesis for how spec-driven development works with AI coding tools. The approach starts with defining what and why you're building in a specification document, then breaks down the technical implementation plan, and finally decomposes it into consumable tasks that AI can execute iteratively or in parallel. This isn't the only way to do spec-driven development, but it provides a recipe book and ingredient box for teams wanting to adopt this workflow.
The philosophy recognizes that there's no single correct approach to spec-driven development - it depends on your models, team structure, and project complexity. However, starting with a thorough planning phase using high-quality models, then executing with faster models guided by those specs, has proven effective for managing AI agent workflows on complex projects.
- devblogs.microsoft.com - GitHub SpecKit - Den's article on spec-driven development
- github.blog - Spec-Driven Development with AI - Official GitHub announcement
- Open source toolkit for structured AI development
- Separates planning (high-quality models) from execution (faster models)
8. Real-World Examples: From Blender to Minecraft to DoorDash
The MCP ecosystem has exploded with creative and practical implementations. The Blender MCP server lets you describe a medieval scene with a dragon and lighting, and it builds it for you by translating high-level descriptions into Blender's native API calls. Gaming servers exist for Unity 3D, Minecraft, and even analyzing Halo stats. Marketing professionals can connect Facebook Ads, Google Ads, and Amazon Ads MCP servers to ask "how are my ads performing this week" across all platforms without clicking through dashboards.
Sports enthusiasts can use Strava MCP for running and biking analytics, or the Formula 1 Multiviewer MCP that controls viewing angles and telemetry during live races. For developers, there are Jira and Atlassian MCP servers to automate bug triage and ticket management. The diversity shows MCP's flexibility - it's not just for data retrieval, but for controlling applications, analyzing information, and automating workflows across domains.
- Blender MCP: 3D modeling via natural language descriptions
- Unity/Minecraft MCP: Game engine integration
- Marketing MCP servers: Facebook Ads, Google Ads, Amazon Ads integration
- Strava MCP: Fitness and activity data analysis
- Jira/Atlassian MCP: Bug tracking and project management automation
- github.com/punkpeye/awesome-mcp-servers - Full catalog of available servers
9. MCP vs RAG: Different Solutions for Different Problems
Retrieval Augmented Generation (RAG) and MCP serve different purposes in the AI architecture landscape. RAG builds vector databases to augment an LLM's context with additional knowledge, helping it understand what exists in a codebase or documentation set. It's primarily about giving the LLM more relevant context for making decisions. MCP, on the other hand, provides universal access to live data and actionable capabilities. It's not just about knowing what exists - it's about doing something with that information.
While RAG helps an LLM understand that an authorization component exists in your codebase, MCP lets it actually invoke authentication services, update records, or chain multiple actions across services. The two technologies can complement each other: RAG for knowledge augmentation and MCP for capability extension. Many real-world AI applications benefit from using both - RAG for understanding context and MCP for taking action.
- RAG: Vector databases for augmenting LLM knowledge with custom data
- MCP: Universal interface for live data access and action invocation
- RAG focuses on "knowing", MCP focuses on "doing"
- Complementary technologies that work well together in production systems
10. Local vs. General-Purpose Models and MCP Composition
There's ongoing debate about whether specialized local models or general-purpose cloud models work better for specific tasks. Den's perspective is that general-purpose models like Claude and GPT-4 will typically outperform local models for most scenarios due to superior training resources and compute power. However, local models excel for privacy-sensitive workloads - like organizing family photos without sending them to remote servers - or domain-specific tasks where a small, focused model can be as effective as a large general one.
MCP enables an interesting hybrid approach: use powerful general-purpose models for orchestration and decision-making, but delegate specific subtasks to specialized local models or services via MCP servers. For example, a general model could coordinate a photo organizing workflow while a local computer vision model handles the actual image analysis. This composability allows building sophisticated systems that balance capability, privacy, cost, and latency.
- General-purpose models excel at broad capabilities and reasoning
- Local models valuable for privacy-sensitive data and specialized tasks
- MCP enables composition: general models for orchestration, specialized models for specific subtasks
- Privacy considerations: keep sensitive data local while leveraging cloud AI for coordination
11. Developer Experience: Flask-Like Simplicity Meets Modern AI
The Python MCP SDK prioritizes developer experience through familiar patterns and minimal boilerplate. The decorator-based approach (@mcp.tool) mirrors Flask and FastAPI, making it immediately intuitive for Python web developers. Async/await support is first-class, allowing natural progress reporting and elicitations without callback hell. The SDK includes 143+ contributors, ships releases every few days, and maintains "good first issue" tags for new contributors.
Documentation and samples are comprehensive, with the official Python SDK repo containing multiple example servers. The team actively solicits feedback and iterates quickly on developer pain points. Installation is as simple as uv add mcp or pip install mcp, and you can have a working MCP server in under 10 lines of code. The combination of low barrier to entry and production-ready features makes MCP accessible to Python developers at all skill levels.
- Flask/FastAPI-like decorator patterns for familiar developer experience
- First-class async/await support for modern Python practices
- 143+ contributors with active maintenance (releases every few days)
- Comprehensive documentation with practical examples
- Low barrier to entry: working servers in ~10 lines of code
- github.com/modelcontextprotocol/python-sdk - Actively maintained with good first issues
12. Security Considerations and Best Practices
While MCP provides secure authentication mechanisms, users must still exercise caution when installing third-party MCP servers. Like any software that accesses your data, you should verify the source and understand what an MCP server does before connecting it. An MCP server that reads your iMessages to "sort by importance" could potentially scan for credit card numbers or social security numbers. The responsibility for vetting servers lies with the user, just as it does with browser extensions or system-level applications.
Best practices include reviewing source code for open-source MCP servers, starting with servers from trusted organizations, using private registries for enterprise deployments, and being cautious about granting broad permissions. Never check API keys into source control - use environment variables or OAuth flows instead. The MCP community is working on improved discovery with trust signals, but individual diligence remains essential for security.
- Exercise caution with third-party MCP servers like any software
- Review source code and understand what permissions servers request
- Use OAuth flows instead of checking API keys into repositories
- Private registries for enterprise-approved servers
- Start with servers from trusted organizations and open-source projects
- Model Context Protocol Security Best Practices - Official security guidance
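The "never check API keys into source control" practice is simple to follow in server code: read secrets from the environment and fail loudly when they're missing. A minimal sketch; the variable name WEATHER_API_KEY is hypothetical:

```python
import os

def get_api_key(var: str = "WEATHER_API_KEY") -> str:
    """Read a secret from the environment instead of source control.
    The variable name here is a hypothetical example."""
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"Set {var} in your environment; never commit keys.")
    return key

os.environ["WEATHER_API_KEY"] = "demo-value"  # for illustration only
print(get_api_key())  # demo-value
```

For OAuth-based servers the SDK handles token acquisition and storage, so this pattern is mainly for the API-key-in-configuration case.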
Interesting Quotes and Stories
"Think about it like last year at this time, like at the time when we were recording the work item episode, MCP did not exist. Just not a thing. And now everybody's talking about MCP. Like you talk to any big companies, you talk to like the banks, the healthcare, the gaming, everybody, everybody cares about MCP." -- Den Delamarski
"The way the folks at Anthropic have been describing it, it is USB-C for AI." -- Den Delamarski
"Look at the simplicity of this. You literally have a Python function, you have def add, and there is your arguments, you would pass you a function, like two integers. And then all you need to do to make that a tool that an LM can invoke is just add that @mcp.tool decorator. That's it. You're not going and crafting elaborate JSON RPC envelopes and converters and all these things." -- Den Delamarski on the developer experience
"I'll tell you what, the LLMs are getting really good at analyzing the stats. You give them the data, they can make some conclusions." -- Den Delamarski on his Halo stats MCP server
"Do you remember the days when you had to work, this episode is not sponsored by Tailscale, for the record. Should be." -- Den Delamarski and Michael Kennedy discussing VPN complexity vs. Tailscale simplicity
"The power is composability. It's the fact that you can compose things together and have them work together based on the prompts that you have and scenarios that you have." -- Den Delamarski
"There's an MCP server for everything. Like, this list is massive. I'm actually like, every time I discover these things, I was like, oh, I didn't know there was one for multiviewer." -- Den Delamarski exploring the Awesome MCP Servers list
"These are the life hacks you learned only from this podcast. Query all the bugs assigned to me, reassign them to somebody else." -- Den Delamarski joking about Jira MCP automation
"Exercise caution, just like you would exercise with any other software and APIs and websites where you log in because the responsibility is kind of on you to figure out what's safe, what's not." -- Den Delamarski on MCP server security
Key Definitions and Terms
Model Context Protocol (MCP): An open protocol that provides a standardized way for AI applications to connect to data sources, services, and tools. It acts as a universal translation layer between LLMs and external systems, similar to how LSP standardized language tooling for editors.
MCP Server: A service that implements the MCP specification and exposes tools, prompts, and resources that AI clients can use. Servers can run locally via stdio or remotely via HTTP.
MCP Client: An application or editor that connects to MCP servers and makes their capabilities available to LLMs. Examples include VS Code, Cursor, Claude Desktop, and custom applications.
Tools: Function calls that MCP servers expose to LLMs, allowing them to perform actions like querying databases, sending emails, or controlling applications.
Prompts: Reusable templates that MCP servers provide to guide LLMs on how to interact effectively with their capabilities.
Resources: References to databases, files, or API entities that MCP servers make available to LLMs for data access and manipulation.
Elicitations: A mechanism for MCP servers to request structured input from users during tool execution, enabling confirmation dialogs, dropdown selections, and data validation.
FastMCP: The primary framework within the Python SDK that provides a Flask-like decorator-based programming model for building MCP servers quickly.
stdio Transport: A local communication method where MCP servers use standard input/output pipes to exchange JSON-RPC messages with clients on the same machine.
Streamable HTTP Transport: A remote communication method where MCP servers expose HTTP endpoints for JSON-RPC message exchange, enabling cloud deployment and distributed architectures.
JSON-RPC: The underlying message format used by MCP for communication between clients and servers, abstracted away by SDKs for developer convenience.
MCP Registry: A centralized index of available MCP servers, similar to Docker Hub, that enables discovery and installation of servers into MCP clients. Supports both public and private registries.
OAuth 2.1: The authentication and authorization standard used by MCP for secure access to protected resources, handled automatically by SDKs.
RAG (Retrieval Augmented Generation): A technique that builds vector databases to augment LLM context with additional knowledge, complementary to MCP's action-oriented approach.
Spec-Driven Development: A development methodology where projects start with detailed specifications that guide AI coding tools through implementation, promoted by GitHub SpecKit.
Learning Resources
If you want to dive deeper into the topics covered in this episode, these courses from Talk Python Training can help you build the foundational skills and advanced techniques you'll need.
LLM Building Blocks for Python: This concise 1.2-hour course teaches you to move beyond basic "text in, text out" with LLMs, covering structured data, chat workflows, async pipelines, and caching - essential skills for building MCP servers that integrate AI capabilities.
Modern APIs with FastAPI and Python: Since FastMCP uses FastAPI-like patterns, this course provides deep knowledge of building modern Python APIs with type hints, async/await, and data validation - all of which directly apply to MCP server development.
Async Techniques and Examples in Python: MCP servers heavily use async/await for streaming responses and progress reporting. This course covers Python's entire async ecosystem, from basic async/await to parallel processing and thread safety.
Rock Solid Python with Python Typing: Type hints are fundamental to MCP servers and structured output with Pydantic. Learn how to use Python's typing system effectively, which powers frameworks like FastAPI and FastMCP.
Build An Audio AI App: This course combines AI, FastAPI, and MongoDB to build real applications - a perfect companion for creating MCP servers that work with audio content, transcripts, and multimedia data.
Overall Takeaway
The Model Context Protocol represents a fundamental shift in how we build AI-powered applications. Rather than creating custom integrations for every data source and service, MCP provides a universal standard that works across LLMs, editors, and agentic tools. The Python ecosystem has embraced MCP with remarkable speed, delivering a developer experience that feels as natural as Flask or FastAPI while handling the complexity of JSON-RPC, streaming, and authentication behind the scenes.
What makes MCP truly powerful is its composability. You can connect multiple servers to a single client, enabling LLMs to coordinate sophisticated workflows across different services. The registry ecosystem is exploding with servers for everything from 3D modeling in Blender to analyzing Formula 1 telemetry to automating Jira tickets. Yet beneath this diversity lies a consistent, well-designed protocol that makes both building and consuming MCP servers straightforward.
For Python developers, now is the perfect time to explore MCP. The barriers to entry are low - you can have a working server in minutes. The community is active and welcoming, with good first issues available for contributors. The use cases span every domain imaginable, from enterprise data integration to creative hobby projects. Whether you're building the next generation of AI agents or simply want to give your AI tools access to your custom data, MCP provides the plumbing that just works. As Den put it, "MCP can do anything - it's just a pipe. What you do with that pipe is up to you."
Links from the show
Agentic AI Programming for Python Course: training.talkpython.fm
Model Context Protocol: modelcontextprotocol.io
Model Context Protocol Specification (2025-03-26): modelcontextprotocol.io
MCP Python Package (PyPI): pypi.org
Awesome MCP Servers (punkpeye) GitHub Repo: github.com
Visual Studio Code Docs: Copilot MCP Servers: code.visualstudio.com
GitHub MCP Server (GitHub repo): github.com
GitHub Blog: Meet the GitHub MCP Registry: github.blog
MultiViewer App: multiviewer.app
GitHub Blog: Spec-driven development with AI (open source toolkit): github.blog
Model Context Protocol Registry (GitHub): github.com
mcp (GitHub organization): github.com
Tailscale: tailscale.com
Watch this episode on YouTube: youtube.com
Episode #527 deep-dive: talkpython.fm/527
Episode transcripts: talkpython.fm
Theme Song: Developer Rap
🥁 Served in a Flask 🎸: talkpython.fm/flasksong
---== Don't be a stranger ==---
YouTube: youtube.com/@talkpython
Bluesky: @talkpython.fm
Mastodon: @talkpython@fosstodon.org
X.com: @talkpython
Michael on Bluesky: @mkennedy.codes
Michael on Mastodon: @mkennedy@fosstodon.org
Michael on X.com: @mkennedy