Tip Sheet #22: Implementing an MCP sample project and behind the scenes of book launch day


Hello Tip-Sheeters, I hope you're having a great week. In this issue, I'll take you through some initial Python coding for the Model Context Protocol (MCP), a recent open-source project led by Anthropic. I also share a behind-the-scenes look at book launch day, which was both a lot of fun and a success.

MCP and Your APIs

In Tip Sheet #19, I shared 7 steps to getting up to speed on MCP (or any new technology). In that, I gave some background on what MCP is and how it relates to using APIs with AI.

TL;DR: MCP is a brand-new method for API producers to wrap their APIs (and other data sources) so LLMs can interact with them.

In that tip sheet, I said that I would save for another day Step 7: Try it out for yourself. Today is that day.

Running the Anthropic Demo in GitHub Codespaces

I created the MCP client and MCP server quickstart projects that Anthropic has on their website. (GitHub repo shared at the end.) Here's the architecture diagram that Anthropic includes in their introduction page, with the parts the demo covered circled in red:

I started by creating the MCP server quickstart, which creates the code to call the Weather.gov API. The quickstart instructions assume that you'll be installing the code in a local Python environment, but if you've been following my work, you know that my go-to for Python development is GitHub Codespaces. I created a new GitHub repo and created a Codespace on the main branch. Then I followed the instructions in the demo. It uses the uv package manager to set up the project, which is a tool I'd been wanting to try, so that was a bonus of working on this project.
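If you're curious what the server code looks like, here's a trimmed sketch of the shape the quickstart produces. The real quickstart tools fetch live forecasts with httpx; to keep this sketch self-contained, the tool below only builds the Weather.gov URL, and a small stand-in class is included so it runs even without the mcp package installed.

```python
# Trimmed sketch of the MCP server shape from the Anthropic quickstart.
# The real quickstart's tools call the Weather.gov API with httpx; this
# version only builds the endpoint URL so the sketch is self-contained.
try:
    from mcp.server.fastmcp import FastMCP
except ImportError:
    # Minimal stand-in so the sketch runs without the mcp package installed.
    class FastMCP:
        def __init__(self, name: str):
            self.name = name

        def tool(self):
            def decorator(fn):
                return fn
            return decorator

        def run(self, transport: str = "stdio") -> None:
            pass

mcp = FastMCP("weather")

@mcp.tool()
def forecast_url(latitude: float, longitude: float) -> str:
    """Return the Weather.gov points endpoint for a location."""
    return f"https://api.weather.gov/points/{latitude},{longitude}"

if __name__ == "__main__":
    # The quickstart server communicates with the client over stdio.
    mcp.run(transport="stdio")
```

The key idea is that each @mcp.tool() function becomes a tool the LLM can discover and call through the client.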

Next, I implemented the MCP client quickstart in the same Codespace (in a separate directory). To implement the client, you need to get an Anthropic API account and generate an API key. The instructions show you adding the key to an environment file, but I instead put the key in a Codespace secret, which is also available to the Python code.
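One reason the Codespace secret works seamlessly: Codespaces exposes secrets as environment variables, and ANTHROPIC_API_KEY is the variable the Anthropic SDK reads by default. A minimal sketch of reading it yourself, which works the same whether the value came from a .env file loaded into the shell or a Codespace secret:

```python
# Read the Anthropic key from the environment. This works identically for
# a .env file exported into the shell and a GitHub Codespaces secret.
import os

def get_anthropic_key() -> str:
    key = os.environ.get("ANTHROPIC_API_KEY")
    if not key:
        raise RuntimeError("ANTHROPIC_API_KEY is not set")
    return key
```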

With the API key set up, I was ready to test. Roughly following the instructions, I kicked off the MCP client with this command:

python client.py ../weather/weather.py

After a few seconds to start up, the MCP client told me what tools were available from the Weather.gov API and prompted me for a query. Since I was traveling, I asked for the weather for St. Louis, MO. Here's what I got back:

So far so good: the quickstarts are running.

Customizing the MCP Server to use my own API

It was great that I had the demos working with the Anthropic LLM and Weather.gov API, but I want MCP to work with my API. This is where having a core project comes in handy. For my book, I built the SportsWorldCentral API and I know the ins and outs of it. I decided to create a new MCP server that can call my API's get_count endpoint. Here's the Swagger UI summary info for that endpoint:

I followed the server quickstart instructions, but created a football server instead of a weather server (I'll link to all the code at the bottom). This time the command was different -- it uses the same MCP client (which is apparently a generic client), but I pointed it at my new server:

python client.py ../football/footballserver.py
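The client can stay generic because all it needs is the path to a server script; it infers how to launch the server from the file extension before opening a stdio connection. Here's a small sketch of that dispatch logic (the function name is mine, for illustration):

```python
# Sketch of how a generic MCP client launches whichever server script it is
# pointed at: infer the runtime from the file extension, then run the server
# as a subprocess and talk to it over stdio.
from pathlib import Path

def server_command(server_script: str) -> list[str]:
    """Build the launch command for a Python or Node MCP server script."""
    suffix = Path(server_script).suffix
    if suffix == ".py":
        return ["python", server_script]
    if suffix == ".js":
        return ["node", server_script]
    raise ValueError("Server script must be a .py or .js file")
```

This is why swapping the weather server for the football server was just a matter of changing the path on the command line.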

The conversation this time was based on the one API endpoint I put in my code. I asked it for a count of players and got the correct one from the API (1,018), and I threw in counts of teams and leagues for good measure:
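Inside the football server, the tool's job is to fetch the counts and hand the LLM something it can relay in plain language. The real tool calls the SportsWorldCentral API with httpx; the sketch below covers just the formatting step, and the key names are my assumptions about the get_count response shape:

```python
# Hedged sketch: the formatting step of a get_count tool. The real tool
# fetches the counts from the SportsWorldCentral API with httpx; the key
# names below are assumptions about the response payload.
def summarize_counts(counts: dict[str, int]) -> str:
    """Turn a counts payload into a sentence the LLM can relay to the user."""
    parts = [f"{value:,} {name}" for name, value in counts.items()]
    return "The API reports " + ", ".join(parts) + "."
```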

I did a few more quick tests with the weather and football servers. Before I disabled my Anthropic key (to protect it in case I exposed it somehow) I saw that I had spent 10 cents on the use of the Anthropic APIs:

To see the code of the quickstart client and weather server along with my custom football server, check out the repo here: https://github.com/Ryandaydev/mcp_server_demo

Quick Takeaways

• The MCP client in the demo is the part of the code that's using the LLM (in this case Anthropic's models).

• The client didn't change based on the data source I connected it to; it seems capable of calling a variety of servers.

• To connect a different API, I created a new MCP Server and defined the endpoints available to it. The server doesn't use the LLM -- it's a wrapper for the API.

• Having your own core project (in this case my SportsWorldCentral API) is a big help for learning new tools, because it forces you to sift through the sample code to see which part is template code and which is specific to your use case.

Next steps

There are two directions I can go next with this project using MCP:

  1. Implement more endpoints to give the model the ability to answer more questions from the API.
  2. Use my API's SDK (which is called swcpy) instead of calling the API directly using the httpx library.

I'll let you know if I progress on either of those paths in the future.

Behind the Scenes on Launch Day

Last Tuesday was the official book launch day when I got the word out on LinkedIn that the book was ready to buy. Although I live in the Kansas City area, launch day found me on King Street in downtown Charleston, SC at the Spread Bagelry.

I posted the official post early in the morning, and then as people shared it, I responded to all the kind comments. (If a post gets a lot of interaction, LinkedIn shares it with a wider audience.) While interacting with all the well-wishers, I had a nice light-roast coffee and a rosemary olive oil Montreal-style bagel, fresh out of the wood-fired oven.

As word spread on LinkedIn, the sales were coming in on Amazon, and by mid-morning I had the good news: 🌟my book was an Amazon #1 New Release! 🌟

I owe a big thanks to the Tip-Sheeters who shared and commented on the LinkedIn post to help me get the word out about the book. You all have been going along on this journey as I share technical tips about APIs for AI and Data Science. I'm having a lot of fun sharing the tips and I'm glad that so many people continue to engage with the newsletter.

Feel free to continue to share the word about the book by reposting the original post or making one of your own. Thank you all!

Keep coding,

Ryan Day

https://handsonapibook.com/


This is my weekly newsletter where I share some useful tips that I've learned while researching and writing the book Hands-on APIs for AI and Data Science, a #1 New Release from O'Reilly Publishing.
