Tip Sheet #22: Implementing an MCP sample project and behind the scenes of book launch day


Hello Tip-Sheeters, I hope you're having a great week. In this issue, I'll take you through some initial Python coding for Model Context Protocol (MCP), a recent open-source project led by Anthropic. I also share a behind-the-scenes look at book launch day, which was both fun and successful.

MCP and Your APIs

In Tip Sheet #19, I shared 7 steps to getting up to speed on MCP (or any new technology). In that issue, I gave some background on what MCP is and how it relates to using APIs with AI.

TL;DR: MCP is a new open standard that lets API producers wrap their APIs (and other data sources) so LLMs can interact with them.

In that tip sheet, I said that I would save for another day Step 7: Try it out for yourself. Today is that day.

Running the Anthropic Demo in GitHub Codespaces

I created the MCP Client and MCP Server quickstart projects that Anthropic has on their website. (GitHub repo shared at the end.) Here's the architecture diagram that Anthropic includes in their introduction page, with the parts the demo covered circled in red:

I started by creating the MCP server quickstart, which creates the code to call the Weather.gov API. The quickstart instructions assume that you'll be installing the code in a local Python environment, but if you've been following my work you know that my go-to for Python development is GitHub Codespaces. I created a new GitHub repo and a Codespace on the main branch, then followed the instructions in the demo. The demo uses the uv package manager to set up the project, which is a tool I'd been wanting to try, so that was a bonus of working on this project.
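The real quickstart server defines a couple of tools and more error handling, but here's a condensed sketch of its shape so you can see what an MCP server looks like. The tool name and helper structure are simplified, and the Weather.gov calls below are based on the public NWS API, not copied from the quickstart:

```python
import httpx
from mcp.server.fastmcp import FastMCP

# Create an MCP server named "weather"
mcp = FastMCP("weather")

NWS_API_BASE = "https://api.weather.gov"
HEADERS = {"User-Agent": "weather-demo/1.0"}  # Weather.gov asks for a User-Agent


@mcp.tool()
async def get_forecast(latitude: float, longitude: float) -> str:
    """Return the next few forecast periods for the given coordinates from Weather.gov."""
    async with httpx.AsyncClient() as client:
        # Look up the forecast URL for this point, then fetch the forecast itself
        points = await client.get(f"{NWS_API_BASE}/points/{latitude},{longitude}", headers=HEADERS)
        forecast_url = points.json()["properties"]["forecast"]
        forecast = await client.get(forecast_url, headers=HEADERS)
        periods = forecast.json()["properties"]["periods"]
        return "\n".join(f"{p['name']}: {p['detailedForecast']}" for p in periods[:3])


if __name__ == "__main__":
    # Serve the tools over stdio so an MCP client can launch this file directly
    mcp.run(transport="stdio")
```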

Next, I implemented the MCP client quickstart in the same Codespace (in a separate directory). To implement the client you need an Anthropic API account and an API key. The instructions show you adding the key to an environment file, but I instead put the key in a Codespace secret, which is also available to the Python code.
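A Codespace secret shows up as an ordinary environment variable inside the Codespace, so the client code can pick it up without a .env file. Here's a minimal sketch, assuming the secret is named ANTHROPIC_API_KEY:

```python
import os

from anthropic import Anthropic

# A Codespace secret named ANTHROPIC_API_KEY appears as an environment variable,
# so there's no .env file to load. The Anthropic client reads it automatically,
# but checking first gives a friendlier error if the secret is missing.
if "ANTHROPIC_API_KEY" not in os.environ:
    raise RuntimeError("Set the ANTHROPIC_API_KEY Codespace secret before running.")

client = Anthropic()  # picks up ANTHROPIC_API_KEY from the environment
```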

With the API key set up, I was ready to test. Roughly following the instructions, I kicked off the MCP client with this command:

python client.py ../weather/weather.py

After a few seconds to start up, the MCP client told me what tools were available from the Weather.gov API and prompted me for a query. Since I was traveling, I asked for the weather for St. Louis, MO. Here's what I got back:

So far so good: the quickstarts were running.
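Under the hood, the client side isn't much code either. Here's a stripped-down sketch of how a client like the quickstart's client.py launches a server script and discovers its tools over stdio; the real client.py also passes the tool list to the Anthropic API and runs an interactive query loop:

```python
import asyncio
import sys

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def list_server_tools(server_script: str) -> None:
    # Launch the server script as a subprocess and talk to it over stdio
    params = StdioServerParameters(command="python", args=[server_script])
    async with stdio_client(params) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            # Ask the server what tools it exposes (this is the list printed at startup)
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")


if __name__ == "__main__":
    asyncio.run(list_server_tools(sys.argv[1]))
```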

Customizing the MCP Server to use my own API

It was great to have the demos working with the Anthropic LLM and the Weather.gov API, but I wanted MCP to work with my own API. This is where having a core project comes in handy. For my book, I built the SportsWorldCentral API, so I know its ins and outs. I decided to create a new MCP server that can call my API's get_count endpoint. Here's the Swagger UI summary info for that endpoint:

I followed the server quickstart instructions, but created a football server instead of a weather server (I'll link to all the code at the bottom). This time the command was different: it uses the same MCP client (which is apparently generic) but points to my new server:

python client.py ../football/footballserver.py
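The full code is in the repo linked at the end, but here's a minimal sketch of the shape of that football server. The base URL and endpoint path are placeholders rather than the real SportsWorldCentral details, and the tool is kept to a bare httpx call:

```python
import httpx
from mcp.server.fastmcp import FastMCP

# Create an MCP server named "football"
mcp = FastMCP("football")

# Placeholder base URL: point this at wherever the SportsWorldCentral API is hosted
SWC_API_BASE = "https://api.example.com"


@mcp.tool()
async def get_counts() -> str:
    """Return the league, team, and player counts from the SportsWorldCentral API."""
    async with httpx.AsyncClient() as client:
        response = await client.get(f"{SWC_API_BASE}/v0/counts/")  # placeholder path
        response.raise_for_status()
        # Hand the raw JSON back to the LLM and let it phrase the answer
        return response.text


if __name__ == "__main__":
    # Serve over stdio so client.py can launch this file, just like the weather server
    mcp.run(transport="stdio")
```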

The conversation this time was based on the one API endpoint I put in my code. I asked for a count of players and got the correct number back from the API (1,018), then threw in counts of teams and leagues for good measure:

I did a few more quick tests with the weather and football servers. Before I disabled my Anthropic key (to protect it in case I had exposed it somehow), I saw that I had spent 10 cents on the Anthropic APIs:

To see the code of the quickstart client and weather server along with my custom football server, check out the repo here: https://github.com/Ryandaydev/mcp_server_demo

Quick Takeaways

• The MCP client in the demo is the part of the code that's using the LLM (in this case, Anthropic's models).

• The client didn't change based on the data source I connected it to; it seems capable of calling a variety of servers.

• To connect a different API, I created a new MCP Server and defined the endpoints available to it. The server doesn't use the LLM; it's a wrapper for the API.

• Having your own core project (in this case, my SportsWorldCentral API) is a big help when learning new tools, because it forces you to sift through the sample code and figure out which parts are template code and which are specific to your use case.

Next steps

There are two directions I could take this MCP project next:

  1. Implement more endpoints to give the model the ability to answer more questions from the API.
  2. Use my API's SDK (which is called swcpy) instead of calling the API directly using the httpx library.

I'll let you know if I progress on either of those paths in the future.

Behind the Scenes on Launch Day

Last Tuesday was the official book launch day when I got the word out on LinkedIn that the book was ready to buy. Although I live in the Kansas City area, launch day found me on King Street in downtown Charleston, SC at the Spread Bagelry.

I posted the official announcement early in the morning, and as people shared the post, I responded to all the kind comments. (If a post gets a lot of interaction, LinkedIn shares it with a wider audience.) While interacting with all the well-wishers, I had a nice light-roast coffee and a rosemary olive oil Montreal-style bagel, fresh out of the wood-fired oven.

As word spread on LinkedIn, the sales came in on Amazon, and by mid-morning I had the good news: 🌟 my book was an Amazon #1 New Release! 🌟

I owe a big thanks to the Tip-Sheeters who shared and commented on the LinkedIn post to help me get the word out about the book. You all have been going along on this journey as I share technical tips about APIs for AI and Data Science. I'm having a lot of fun sharing the tips and I'm glad that so many people continue to engage with the newsletter.

Feel free to continue to share the word about the book by reposting the original post or making one of your own. Thank you all!

Keep coding,

Ryan Day

https://handsonapibook.com/
