Tip Sheet #22: Implementing an MCP sample project and behind the scenes of book launch day


Hello Tip-Sheeters, I hope you're having a great week. In this issue, I'll take you through some initial Python coding for Model Context Protocol (MCP), a recent open-source project led by Anthropic. I also share a behind-the-scenes look at book launch day, which was a lot of fun and a big success.

MCP and Your APIs

In Tip Sheet #19, I shared 7 steps to getting up to speed on MCP (or any new technology). In that, I gave some background on what MCP is and how it relates to using APIs with AI.

TL;DR: MCP is a brand-new method for API producers to wrap their APIs (and other data sources) so LLMs can interact with them.

In that tip sheet, I said that I would save for another day Step 7: Try it out for yourself. Today is that day.

Running the Anthropic Demo in GitHub Codespaces

I built the MCP client and MCP server quickstart projects from Anthropic's website. (GitHub repo shared at the end.) Here's the architecture diagram that Anthropic includes on their introduction page, with the parts the demo covers circled in red:

I started with the MCP server quickstart, which creates the code to call the Weather.gov API. The quickstart instructions assume that you'll be installing the code in a local Python environment, but if you've been following my work you know that my go-to for Python development is GitHub Codespaces. I created a new GitHub repo and spun up a Codespace on the main branch. Then I followed the instructions in the demo. It uses uv, a Python package and project manager, to set up the project. That's a tool I'd been wanting to try, so it was a bonus of working on this project.

Next, I implemented the MCP client quickstart in the same Codespace (in a separate directory). To implement the client, you need an Anthropic API account and an API key. The instructions show you adding the key to an environment file, but I instead put the key in a Codespaces secret, which is also available to the Python code.
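Because a Codespaces secret is injected as an ordinary environment variable, the client just needs to read it from the environment. Here's a minimal sketch; ANTHROPIC_API_KEY is the variable name the Anthropic Python SDK looks for by default, and the helper function name is my own:

```python
# Minimal sketch: read the Anthropic key from the environment instead of
# a .env file. In GitHub Codespaces, a secret named ANTHROPIC_API_KEY
# shows up as a normal environment variable, so python-dotenv isn't needed.
import os


def get_anthropic_key() -> str:
    """Return the Anthropic API key, failing loudly if it isn't set."""
    key = os.environ.get("ANTHROPIC_API_KEY")
    if not key:
        raise RuntimeError(
            "ANTHROPIC_API_KEY is not set. Add it as a Codespaces secret "
            "or export it in your shell before running the client."
        )
    return key
```

A nice side effect: the key never lands in a file you might accidentally commit.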

With the API key set up, I was ready to test. Roughly following the instructions, I kicked off the MCP client with this command:

python client.py ../weather/weather.py

After a few seconds to start up, the MCP client told me what tools were available from the Weather.gov API and prompted me for a query. Since I was traveling, I asked for the weather for St. Louis, MO. Here's what I got back:

So far so good -- the quickstarts were running.

Customizing the MCP Server to use my own API

It was great to have the demos working with the Anthropic LLM and the Weather.gov API, but I wanted MCP to work with my own API. This is where having a core project comes in handy. For my book, I built the SportsWorldCentral API, and I know its ins and outs. I decided to create a new MCP server that can call my API's get_count endpoint. Here's the Swagger UI summary for that endpoint:

I followed the server quickstart instructions but created a football server instead of a weather server (I'll link to all the code at the bottom). This time the command was different -- it uses the same MCP client (which is a generic client) but points it at my new server: python client.py ../football/footballserver.py

The conversation this time was based on the one API endpoint I put in my code. I asked for a count of players and got the correct number from the API (1,018), then threw in counts of teams and leagues for good measure:

I did a few more quick tests with the weather and football servers. Before disabling my Anthropic key (to protect it in case I had exposed it somewhere), I saw that I had spent 10 cents on the Anthropic APIs:

To see the code of the quickstart client and weather server along with my custom football server, check out the repo here: https://github.com/Ryandaydev/mcp_server_demo

Quick Takeaways

• The MCP client in the demo is the part of the code that uses the LLM (in this case, Anthropic's models).

• The client didn't change based on the data source I connected it to; it seems capable of calling a variety of servers.

• To connect a different API, I created a new MCP server and defined which of the API's endpoints it makes available as tools. The server doesn't use the LLM -- it's a wrapper for the API.

• Having your own core project (in this case my SportsWorldCentral API) is a big help for learning new tools, because it forces you to sift through the sample code to see which part is template code and which is specific to your use case.

Next steps

There are two directions I could take this MCP project next:

  1. Implement more endpoints to give the model the ability to answer more questions from the API.
  2. Use my API's SDK (which is called swcpy) instead of calling the API directly using the httpx library.

I'll let you know if I progress on either of those paths in the future.

Behind the Scenes on Launch Day

Last Tuesday was the official book launch day when I got the word out on LinkedIn that the book was ready to buy. Although I live in the Kansas City area, launch day found me on King Street in downtown Charleston, SC at the Spread Bagelry.

I published the official post early in the morning, and as people shared it, I responded to all the kind comments. (If a post gets a lot of interaction, LinkedIn shows it to a wider audience.) While interacting with the well-wishers, I had a nice light-roast coffee and a rosemary olive oil Montreal-style bagel, fresh out of the wood-fired oven.

As word spread on LinkedIn, sales were coming in on Amazon, and by mid-morning I had the good news: 🌟 my book was an Amazon #1 New Release! 🌟

I owe a big thanks to the Tip-Sheeters who shared and commented on the LinkedIn post to help me get the word out about the book. You all have been going along on this journey as I share technical tips about APIs for AI and Data Science. I'm having a lot of fun sharing the tips and I'm glad that so many people continue to engage with the newsletter.

Feel free to continue to share the word about the book by reposting the original post or making one of your own. Thank you all!

Keep coding,

Ryan Day

https://handsonapibook.com/

Ryan Day

This is my weekly newsletter where I share useful tips that I've learned while researching and writing the book Hands-on APIs for AI and Data Science, a #1 New Release from O'Reilly Publishing.
