Self-hosted Open Source and more
Time for another update from Talk Python (the podcast and courses). Here are a few fun things in that space since we last spoke.
We’ve recently completely overhauled how we run our websites and APIs at Talk Python. It’s highly influenced by the whole “Leaving the Cloud” series of conversations I’ve mentioned here and there on some of my shows.
Now the full story is out. I was fortunate to be a guest on the Django Chat podcast with Carlton and Will, episode 162: Self-Hosted Open Source with Michael Kennedy. It’s a really fun conversation. If you are thinking about the ideas of the big clouds being way overpriced and overly complicated, you’ll definitely enjoy the chat.
Keeping with podcasts, a few weeks ago, I was a panelist on the Teaching Python podcast. Kelly and Sean invited me to participate and answer the controversial question, inspired by the meteoric growth of LLMs in coding:
Coding is Dead?
It was a great discussion and a bit too nuanced for a paragraph here. So just give episode 130: Is Coding Dead a listen.
All this talk about LLMs and coding got me paying more attention to my use of LLMs for software development. So I thought you might enjoy a peek into my weird world of podcaster/programmer/dev-ops-er.
First, rather than ChatGPT, I usually open up my private, local client, LM Studio (highly recommended), and use a downloaded Llama 3 model.
Here’s what I was asking LM Studio on the day I took notes.
- I have many zombie processes that are defunct. What bash command can I use to close all of them and clean up resources?
- I’m working on a docker compose file with health checks. What do the following arguments mean exactly? (Passes in a partial YAML config)
- How long will docker compose wait between retries for health checks?
- My docker health checks seem to be causing many zombie processes on the Linux host. How do I prevent this?
- What does the nohup Linux command do?
- Yes. I am using gunicorn. I want to know about the feature of restarting worker processes. Can you describe what max-requests does? ← The answer here was amazing.
- For max-requests, is the request count per worker or for all worker processes?
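For anyone hitting the same zombie-process issue, here's a sketch of the cleanup the first question was after. Note that zombies can't be killed directly (they've already exited); you signal the parent so it reaps them:

```shell
#!/usr/bin/env sh
# Zombie ("defunct") processes show up with a "Z" state in ps. They are
# already dead -- kill -9 does nothing -- so the fix is to nudge the parent
# process into calling wait() on them.

# List zombies along with their parent PIDs:
ps -eo pid,ppid,stat,comm | awk '$3 ~ /^Z/'

# Send SIGCHLD to each zombie's parent so it (hopefully) reaps its children:
for ppid in $(ps -eo ppid,stat | awk '$2 ~ /^Z/ {print $1}' | sort -u); do
  kill -s CHLD "$ppid" 2>/dev/null
done

# For the Docker health-check variant of this problem, the usual fix is to
# run a real init process as PID 1 (e.g. `init: true` in docker-compose.yml),
# which reaps orphaned children automatically.
```

If the parent ignores SIGCHLD, the zombies persist until the parent itself exits and init adopts and reaps them.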
Does that code me out of a job? I don't think so. But it sure made quick work of the docs and man pages!
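And for the curious, the gunicorn answer boils down to this: max-requests is a per-worker count, not a global one. A sketch of how you'd use it (the `app:application` module path is hypothetical):

```shell
# Recycle each worker after it has handled 1,000 requests -- a blunt but
# effective guard against slow memory leaks. The count is PER WORKER, not
# across all workers. --max-requests-jitter randomizes the threshold so
# the workers don't all restart at the same moment.
gunicorn app:application \
  --workers 4 \
  --max-requests 1000 \
  --max-requests-jitter 100
```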
Of course, there are some new Talk Python episodes since I last wrote you as well. Here are the highlights. I hope you have time to listen to the full episodes.
On episode 460, I spoke with Jimmy Chan from Dropbase about Building Internal Tools with Python. If you live in “forms over data” land, this Python low-code app platform is pretty cool.
We had Keiland Cooper on episode 461 to discuss Python in Neuroscience and Academic Labs. It was a super cool look at how Python is being used in “applied academia.”
I had the chance to talk to the influential Wes McKinney (creator of Pandas and other data sci things) on episode 462 where we talked about Pandas history and his current data science projects.
Giovanni Barillari was on episode 463 to tell us about his web framework Emmett and web server Granian. We’re running some of our workloads on Granian already over at Talk Python.
If you haven’t seen Kolo in action, definitely visit their website, then listen to Wilhelm Klopp tell us about it on episode 464.
Last but not least, we have Ines Montani back on the show to talk about how open source will influence the AI revolution. It’s a fun and upbeat conversation that shouldn’t be missed.
And we have a new course as well, Getting Started with NLP and spaCy! This course is super fun. It takes 9 years of Talk Python transcripts and uses spaCy and LLMs to pull out trends and tools that we’ve talked about over the years.
If you have text data to process, do give the course a look.
Before I say goodbye for now, one more amazing experience to share. I had a chance to spend 3 days riding in the desert at the Giant Loop Rally with hundreds of offroad motorcycle riders. We covered around 350 miles (560 km) and did almost 36,000 ft of elevation change (11,000 m)! You can catch some pictures on my Instagram channel.
As always, if you or your team is looking to level up your Python, check out our over 250 hours of courses at Talk Python.
Thank you very much for supporting our work.
Note: This post originally appeared on Talk Python’s mailing list. Join for free to get these updates immediately.