
#396: AI Goes on Trial For Writing Code (crossover) Transcript

Recorded on Tuesday, Nov 29, 2022.

00:00 Hi, folks. For our final episode of 2022 here on Talk Python, we're crossing the streams with my other show, Python Bytes. I present to you one of the more popular episodes of the year. We cover many small topics on Python Bytes. It's the whole theme of the show, as you may know, but the headline of this one is AI Goes On Trial for Writing Code. Thank you for listening to Talk Python to me in 2022 and have a great holiday break.

00:25 The opportunity for me to do this work and have you all appreciate it and support it is humbling and truly an honor. Thank you, and I'll see you next week. Here's that episode. Hello, and welcome to Python Bytes, where we deliver Python news and headlines directly to your earbuds. This is episode 312, recorded November 29, 2022. I'm Michael Kennedy.

00:46 And I'm Brian Okken.

00:47 We've got a lot to cover today. You want to just jump right into it?

00:49 What I want to talk about is Simon Willison.

00:53 This is incredible. So Simon did a talk at DjangoCon 2022, and then he wrote it up with the slides and everything. And we're going to link to his blog. His blog post title is Coping Strategies for the Serial Project Hoarder, and the talk title was Massively Increase Your Productivity on Personal Projects with Comprehensive Documentation and Automated Tests. Yes, that's a mouthful. But really, I don't know what a good name for this is, other than everybody that works in development needs to watch this talk, because it's incredible.

01:28 So he goes through a lot of stuff I'm going to go through.

01:33 Luckily, he's got screenshots on here. But he starts out, so this is important not just for open source projects or personal projects. This is also for if you're working in a company; I think this is equally true. So he talks about how he got these techniques from working at, now I'm going to forget where he worked, but it was a large company spread across multiple continents, and this model was helpful there. So what is he talking about? One of the things he talks about is the perfect commit. So as a professional software developer, you're not really writing new code all the time. What you're doing is maintaining existing software.

02:21 So the commit is your unit of work, and a perfect commit includes the implementation of whatever you've done. But it also has tests and documentation and a link to the issue thread. And it seems like a lot to me, but walking through his talk, it totally makes sense.

02:40 So he gives an example of one of his commits, with some cool highlights showing that he's got documentation changes also. And the documentation might just be a single line change or something.

02:54 But on the tests, he does pause here and say tests are hard for some people, some developers. So it's important to get a working test framework in place quickly, so that a developer isn't starting from scratch when they're writing tests.

03:12 It's not like comprehensive testing has to be there, but it is a test that passes when your change is there and fails when it's not there or fails when it's not working. That's enough. You can do more thorough testing, but that's enough to get us started. And I think that's a good way to think about it.
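
To make that concrete, here is a tiny, purely illustrative pytest example of that kind of "just enough" test; the module and function names are made up and not from Simon's talk.

```python
# Purely illustrative: the smallest useful test for a hypothetical change.
# It fails if the new slugify() function isn't there or is broken,
# and passes once the change works.
from my_package.text import slugify  # hypothetical module added in this commit


def test_slugify_basic():
    assert slugify("Hello, World!") == "hello-world"
```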

03:34 He throws in this little cool thing of, just keep common types of projects that you have around as cookiecutters in your own GitHub area. He's got a Python library, a Click app, and a Datasette plugin for him. I might have different things, like a pytest plugin or something. And that way you can just keep up with your best practices, what you think of as best practices, in one place. This is a cool idea.
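
If you haven't used cookiecutter that way, here's a rough sketch of the idea; the template repository name below is hypothetical, though Simon keeps his real ones (python-lib, click-app, datasette-plugin) under his own GitHub account.

```python
# Rough sketch: start a new project from a template repo you keep on GitHub.
# The template name below is a placeholder; substitute your own.
from cookiecutter.main import cookiecutter

# Prompts for project name, author, etc., then renders a new project folder
# with your preferred defaults (tests, CI, logging config, ...) already in place.
cookiecutter("gh:your-username/python-lib-template")
```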

03:59 Totally going to steal this for myself as well. I built this thing that predated cookiecutter, but I built this thing: I always want to have logging, and I want it like this. I always want to connect to this other service and ping it and make sure the thing is alive, or whatever it was we decided on for monitoring inside the company I worked at. And all the new projects would just start that way, and it was so nice, because you didn't have to ask, is it really worth taking the time to make sure we can monitor this? You just run the one command line thing and it's there, right? We could give it to an intern and they could run it to start their projects. It was great.

04:34 And then he's got this thing, supposedly with a GitHub hook, and I'm going to have to dig into this more because I didn't quite understand how it works. But he's got a way within the GitHub interface to say, I want a new project, and it automatically gives you the choices of what kind of project and then fills out all the defaults from the start, instead of just getting a readme like normally. So this is kind of neat, and I guess I want to try figuring this out. The documentation bit.

05:02 At least, one of the things about including this is that, even if it's difficult, you can make it part of the code review requirements. So don't accept a code review until the documentation is there also.

05:14 So this is a cool idea.

05:16 And then a bonus trick of testing documentation, which is a cool idea. And then the links to the issue. And I thought this was just sort of, yes, you should do this. But the real meat of the talk is him putting his entire thought process in the issue thread. And he even gives examples where there are sometimes up to 50, 60 comments and it's just him talking to himself. But this is brilliant, and I'm not going to convince you as well as he does, but include screenshots and dead ends. Like, I tried this thing and it didn't work, and we're going to go back and do this other thing.

05:55 He calls it temporal documentation. And I just love this idea, and I'm going to try to follow this myself because I have like a memory issue. I write stuff down and then I forget where I wrote it down. And this way, he says, you don't have to remember anything, you just dump it there. And that way, let's say you get pulled off of a project and you don't get back to it for like six months because you've been fighting fires and doing other stuff, and then you get back to it, you won't remember where you're at. And with this line of thinking of keeping all of your thinking in the issue thread, you can just jump in and go, oh, that's where I was, and get started pretty quickly. I love this, it's cool. So then the rest of the talk is pretty interesting too, talking about how scientists have been doing this and other engineers have been doing this for a long time. They called them lab notebooks before, and we kind of got out of the habit of doing that with software. But anyway, lots of great techniques, and I think this is just how to be a professional software developer now.

06:56 I love that it's like a casual conversation and not like, here are my four recommendations, but the playing around and the dead ends are really valuable.

07:05 Someone out in the audience says the cookiecutter approach also works beautifully from a DevOps perspective for setting up developers to use your firm-specific infrastructure. Yeah, I totally agree. And that's a little bit of what I was hinting at, of like, here's how we integrate with this uptime manager and stuff. But it obviously wasn't DevOps like they were talking about. We were deploying to a server in a closet, which tells you all you need to know.

07:27 Yes.

07:28 Speaking of stuff you might need to know, Brian: GitHub Copilot and these code-writing AIs. I have more to say about that at the very end of the show, but in this little bit: they train themselves on lots of code. And GitHub Copilot, for those of you who don't know, you basically can give it a comment like, I want to connect to a Postgres database with SQLAlchemy, and then boom, it'll literally write all the code, add the imports, come up with a connection string, all that kind of stuff. It's pretty fantastic. It has some privacy issues.
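
To give a sense of that comment-driven flow, here is roughly the kind of code such a prompt might yield. To be clear, this is hand-written for illustration, not actual Copilot output, and the connection string is a placeholder.

```python
# Prompt comment: "connect to a postgres database with SQLAlchemy"
# Hand-written illustration of the kind of completion described above.
from sqlalchemy import create_engine, text

# Placeholder credentials; swap in your real connection details.
engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/mydb")

with engine.connect() as conn:
    result = conn.execute(text("SELECT version()"))
    print(result.scalar())
```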

08:01 I don't know what it's doing now. It used to send your source code that you wrote up to GitHub, which made me not want to use it already. But the big news here is the website githubcopilotlitigation.com. And that's as ominous as it sounds. What the website announces is: we filed a lawsuit challenging GitHub Copilot, an AI product that relies on unprecedented open source software piracy. Why piracy? Because it's trained on things that are under licenses like the GPL and Creative Commons ShareAlike or Attribution, and then it outputs code based on that original input that has no GPL, no whatever license. Right. The license is stripped and there's no attribution. What do you think?

08:50 Something we talked about from the very beginning of like, how is this okay? Yeah, not sure.

08:55 Absolutely. If it did things like, we are only going to look at MIT-licensed code and other permissive licenses, I don't think there'd be much to say about it, but apparently that's not the case. So there's a couple of updates as well. I suppose we should also, like they do on this page, say we are not lawyers, please don't take legal advice from us. We write code, not legal documents. But nonetheless, it says this is Matthew Butterick, and they've set up to investigate GitHub Copilot, and they filed a class action lawsuit in US federal court in San Francisco on behalf of a couple of folks. So they're challenging the legality of GitHub Copilot and a related product, OpenAI Codex, which powers Copilot. The suit has been filed against a set of defendants that includes GitHub, Microsoft, and OpenAI. Wow, there's an update down here somewhere.

09:53 Let's see. It says, and here's the motivation for their lawsuit: by training their AI systems on public GitHub repositories, though based on their public statements, possibly much more, we contend that the defendants have violated the legal rights of a vast number of creators who posted code or other work under certain open source licenses on GitHub. Which licenses? A set of eleven popular open source licenses that all require attribution of the author's name and copyright, including, I guess, the MIT license, as well as the GPL and the Apache license. And it's listed out there. There's a whole bunch more detail, and it says Update, November 10. The original was November 3. There's an update here: we filed a second class action lawsuit on behalf of two additional plaintiffs. The defendants and claims are otherwise similar to the initial one. So there you go. It's going to be interesting. It's not going to be just interesting for GitHub Copilot, but basically AI in general, right? It's going to challenge whether AI strips the ownership and other requirements off its inputs and outputs, right? And maybe it does, maybe it doesn't. I mean, we heard that APIs are not copyrightable in the Google Oracle Java lawsuit. So we're going to find out here.

11:11 Interesting. When we looked into this a little bit earlier, if it's helping you fill in parameters to a function, or what likely things you're going to fill in for a function call, that's one thing. But when it plops down like 20 lines of code for you, where did it get those 20 lines of code?

11:32 Open source doesn't necessarily mean you can copy it. It's just open to read.

11:38 You can put your own license in there. You can make up your own license that says anybody can read this, but you can't copy it, use it, or do anything else with it at all. Can't even fork it.

11:49 And there's nothing stopping you from doing that sort of a license, right.

11:54 The default, if you put code on GitHub with no license, I believe, means you're granting no license whatsoever, right?

12:01 Yeah, it means it's just like writing a book. When you write a book, you have the full copyright unless you give it to somebody else.

12:10 Yeah, absolutely.

12:13 This portion of Talk Python to Me is brought to you by Sentry. How would you like to remove a little stress from your life? Do you worry that users may be encountering errors, slowdowns, or crashes with your app right now? Would you even know it until they sent you that support email? How much better would it be to have the error or performance details immediately sent to you, including the call stack and values of local variables and the active user recorded in the report? With Sentry, this is not only possible, it's simple. In fact, we use Sentry on all the Talk Python web properties. We've actually fixed a bug triggered by a user and had the upgrade ready to roll out as we got the support email. That was a great email to write back: hey, we already saw your error and have already rolled out the fix. Imagine their surprise. Surprise and delight your users. Create your Sentry account at talkpython.fm/sentry, and if you sign up with the code talkpython, all one word, it's good for two free months of Sentry's business plan, which will give you up to 20 times as many monthly events, as well as other features. Create better software, delight your users, and support the podcast. Visit talkpython.fm/sentry and use the coupon code talkpython.

13:27 All right. Over to Brian. What's the next one?

13:31 This is a silly thing, but sometimes I've got Python code where I want to have a pop-up window. And I've always been using, what is it, PySimpleGUI. Well, not always, but that's what I've been using lately for a really easy, simple pop-up thing, especially if I need it to run on Macs and really anywhere, because it's totally fast to get it done and I don't have to think about it anymore. However, with PySimpleGUI, I haven't mastered the art of getting it to look just like a native dialog box, and maybe there are some tricks that you can do that I just don't know. But if I know it's on Windows, maybe we could just go ahead and use the Windows DLLs and do a native Windows dialog.

14:22 Just go straight into the Win32 API for it? Yeah, sure.

14:28 That shouldn't be too hard, right? It sounds scary to me, but I ran across Matt Callahan's blog.

14:36 Matt Callahan has an article called Display a Message Box in Python Without Using a Non-Standard Library or Other Dependency. Actually, you can just do this. You don't have to install anything.

14:48 And I got this, where did I get this from? Give credit where credit is due. I got this from the PyCoders Weekly newsletter. So thanks to them.

14:59 Anyway, this is not hard. So he has a little pop up example. And I should have read the article, but I just skimmed for the code.

15:07 Here's some code. That's it. This makes a dialog box pop up, and it's just calling one function with a couple of flags. It's like ten lines of code.

15:18 It calls ctypes.windll.user32.MessageBoxExW, whatever that means, with some stuff in it, like a title and a message and everything. So it's using ctypes, which I don't use much, but which you can use to get into DLLs. And ctypes is built into Python.

15:39 So this message box, I wanted to play with it a little bit more. As I was playing with it, I looked into the Microsoft documentation for the message box dialog. One of the parameters is this uType, and it's like a hex value thing, a bit field, and you can OR in a whole bunch of stuff. So you can use this to get, like, an OK box or an OK/Cancel box; you get different types of dialog boxes using this flag. And then once you've got this popped up, you need to know what the user clicked on and stuff. So there are return values from this, and you can just check the return value. And it's defined to be, like, a three for Abort and a two for Cancel and one for OK, and you can just check this value. So with just a little bit of code, you can have a native dialog box pop up if you need to in your code.
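
For reference, here is a minimal Windows-only sketch along those lines. The article uses MessageBoxExW; this sketch uses its simpler sibling MessageBoxW, and the flag and return values come from the Win32 MessageBox documentation.

```python
# Windows-only: show a native message box via ctypes and user32.dll.
import ctypes

# A few of the uType flag values (a bit field you can OR together).
MB_OKCANCEL = 0x00000001
MB_ICONWARNING = 0x00000030

# A few of the possible return values.
IDOK, IDCANCEL, IDABORT = 1, 2, 3

result = ctypes.windll.user32.MessageBoxW(
    None,                       # no owner window
    "Proceed with the thing?",  # message text
    "Python Bytes",             # title bar text
    MB_OKCANCEL | MB_ICONWARNING,
)

if result == IDOK:
    print("User clicked OK")
elif result == IDCANCEL:
    print("User clicked Cancel (or hit Escape)")
```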

16:31 Yeah, that's awesome. And it does things natively like you would expect. For example, you hit Escape when you have an OK/Cancel box, and it'll return Cancel. I hate some of these UI things. They show up and you're like, well, it's got one text input and a submit button. You hit Enter, it does nothing. You're like, yeah, great. Okay, apparently this is not real, I'm going to just go click it or whatever. Right? So hooking into the native OS is sweet like that. This looks like a thing that would be ripe for a short, simple little package that wraps up, say, all the cancelation.

17:03 Cancel. What kind of icon you want? Do you want like a warning? Do you want an informational icon?

17:09 The button? Yeah, it seems really great, but this is fantastic.

17:13 So neat and built in.

17:16 Anyway, just a quickie. Yeah, it comes included. And, yeah, I really like it. And it's also a bit of a roadmap to show what you could do beyond that. Right. There's more than just really simple dialog boxes. For example, the open file dialog box on Windows could probably be real similar, right?

17:30 Oh, yeah, probably. Someone was looking it up; there's a whole bunch of dialog boxes you've got access to.

17:37 Yeah, exactly. It's like a roadmap to, like, oh, I can create a file or any of these things, which I think is pretty cool. All right, let's flip away from OS-specific to OS-general, but stick with PyCoders for a minute. So this one also comes from PyCoders. I don't know if it's the same issue or not, but pretty cool. It says, write Chrome extensions, which also means Brave and Vivaldi and others, Edge maybe, write Chrome extensions in Python.

18:05 How does it work? PyScript, of course. So, yeah, we just take PyScript, and this is an article by Pete Fiston, and it sort of walks through how he was able to use PyScript, which is Python on WebAssembly running in the browser, to power a Chrome extension. And it doesn't really matter if it's a bit of a nine-meg download, because you install it once and it's local on your computer. Right. So if you want to do this, it walks you through all the things you've got to do in order to use PyScript to write Chrome extensions, or Python to write Chrome extensions. What do you think? Cool. It even shows you how to put an icon on it.

18:43 That's pretty cool.

18:44 Yeah. But I have more for you. Just in case people don't know, this is an extra, hear all about it, actually, because I want to hit a whole bunch of things. So as of recently, I just published this episode, let me look, 30, 31 minutes ago, and it's called PyScript Powered by MicroPython. So one of the challenges that PyScript has had traditionally is that it's based on nearly the full CPython runtime compiled into WebAssembly, which, after you strip out a bunch that doesn't work in the browser, comes down to, like, nine megabytes. Okay. For this browser extension thing, that's reasonable. But you would never use it in place of, like, Vue.js on a popular page, because you want that page to load quickly. You want it to be good for SEO, all those things. But you know what's small and fast? MicroPython.

19:33 That would be neat.

19:34 So I just had Brett Cannon, Nicholas Tollervey, and Fabio Pliger on Talk Python to talk about the work that they're doing to make PyScript run not only on full CPython, but also on MicroPython. MicroPython you can get to load up in 100 milliseconds on your page, and it's only a couple of hundred KB. All of a sudden, that starts to sound a lot like the level of stuff a pretty rich frontend framework makes you download to get started, and you cache it, then you're good to go. That's exciting.

20:04 That's super exciting. Yeah.

20:06 So this Chrome extension thing is cool when you look at the shipping version. I don't know if you can call it shipping, because PyScript is still, like, super alpha, but that's what you can get today. So Nicholas said probably spring that they'll have something to share in terms of being able to use MicroPython. But I think that's pretty excellent. That could really, really unlock some super cool features if we could now build, like, a Vue.js type thing, but with Python. And one of the goals that they stated is that they're looking to build this as a framework, or excuse me, a platform that you can build frameworks on top of. So it's not just, here's how you write some Python code in the browser, but here's a foundation on which people could create, like, a PyVue or a PyAngular or whatever they wanted to create. Right.

20:54 Question, just for my own personal use: will Chrome extensions work on Vivaldi?

21:01 Yeah, they do. One of the things that's interesting about Vivaldi, and I think it probably affects its reporting a little bit when you look at the user agent of Vivaldi, it's exactly the user agent of Chrome. So it lies to the world and tells the world it's Chrome.

21:16 There's no user agent for Vivaldi. It's just whatever version of Chrome it's using. Okay, so when you go to the Chrome Web Store, it's like, put this in Chrome, you click it, and yeah, it goes. So it works perfectly. Sure.

21:27 Cool.

21:27 Yes. And John Sheen says, yes. They do. All right, next extra.

21:32 Brian, I've been excited a little bit about Mastodon. I don't know if you noticed.

21:36 Yeah, me, too.

21:37 I know. It's fantastic. There are really tons of great interactions. And I started putting in our show notes, which you'll see when I publish this, your Mastodon account and mine, so people can connect with us and have even more conversations over there. But there was a really interesting article by Eugen, the guy who created Mastodon. I've been looking and looking; it's about scaling Mastodon and the challenges they were having, and, boy, I would love to link to it, but I just can't find it. But it's written in Ruby, right? And so it talks so much about, these are the challenges of scaling out threads, and, oh, we have this thing called a GIL, and it really doesn't allow you to use threads very easily.

22:20 It was so interesting to look at how a technology that doesn't have asyncio and async and await gets all tangled up trying to do IO-based things. So it's like, well, maybe we should have ten to 20 threads to do the network communication, but if we have more than 20, then we get, like, context switching and contention in the operating system.

22:44 That just comes with having OS threads. Well, guess what? You can do really well with no threads, or one thread. You can talk to the web, you can call other websites, you can receive web requests. And the mechanism for doing that in Python is async and await. And asyncio requires no additional threads, very little overhead, no context switching. So this project by Andrew, sorry, Andrew Godwin, I forgot his last name for a moment, of Django Channels fame, said, what if I rewrote this, but in Python with async and await? Okay, there's a bunch of challenges of running Mastodon. People want to have their own server, because they're like, oh, I want my own server so I'm not stuck in one of these communities and beholden to them. The problem is every one of those is like a standalone DevOps adventure. There are tons of things working together and it's a lot of work. It would be better if you could host more of them on one machine and sort of scale that up in a nice way. So this one lets you host multiple domains for small to medium instances, and it's written with async and await, which is pretty awesome.
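
As a tiny sketch of what that no-threads IO model looks like in Python, here is async and await with httpx (chosen only because it shows up in Takahē's requirements); the URLs are placeholders.

```python
# Minimal sketch: concurrent IO with async/await, no extra OS threads.
import asyncio

import httpx


async def fetch_status(client: httpx.AsyncClient, url: str) -> int:
    response = await client.get(url)
    return response.status_code


async def main() -> None:
    urls = ["https://example.com/"] * 10  # placeholder URLs
    async with httpx.AsyncClient() as client:
        # All ten requests are in flight at once on a single thread.
        statuses = await asyncio.gather(*(fetch_status(client, u) for u in urls))
    print(statuses)


if __name__ == "__main__":
    asyncio.run(main())
```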

23:58 Anyway, I don't know how to say it, but check this out.

24:01 I didn't know if I caught you trying to pronounce it Tikahi or Takahe.

24:06 I don't know. I'm going to go with that.

24:08 And of course, Andrew Godwin just said I could probably write this in Python and get it out in a couple of weeks.

24:15 I think it was like five days or something.

24:18 So key features, multiple domain support, multiple identities per user, which is kind of interesting.

24:24 Desktop, mobile, and PWA compatible. Again, in how many days? And easy deployment: a web worker, a background worker, and one database. Not all this crazy stuff. So anyway, people can check it out. Let's check out the requirements, see what we've got going on here. Uvicorn and httpx, I mean, that pretty much says it right there. Oh, interesting. It's based on Django. HTMX is pretty interesting as one of the building blocks, but yeah, super cool.

24:55 So there's another one.

24:57 This portion of Talk Python to me is brought to you by the AWS Insiders podcast.

25:02 When was the last time you ordered a physical server to host your functions as a service, your latest API, or your most recent web app? I remember the last time I did that was around the year 2001. And yes, it was quite the odyssey. Of course, we don't do that anymore. We run our code in the cloud with near-instant provisioning and unparalleled data centers. And the most popular cloud provider is AWS. But for all the ways that AWS has made our lives easier, it has also opened a massive box of choices. Should you choose platform as a service? Or maybe it's still VMs with IaaS. What about your database? Maybe you should choose a managed service like RDS with Postgres. Or is DynamoDB better? Maybe Aurora? No, wait. I hear good things about Amazon DocumentDB, too. And that's where the AWS Insiders podcast comes in. This podcast helps technology leaders stay ahead of Amazon's constant pace of change and innovation. Some relevant recent episodes include Storage Wars: Database Edition, Microservices or Macro Disaster, and Exploring Computer Vision at the Edge with AWS Panorama. They bring on guests to debate the options, and the episodes are vibrant and fun. So if you want to have fun and make sense of AWS, head on over to talkpython.fm/awsinsiders. Yes, I know you probably already have a podcast player, and you can just search for it there, but please use the link so that they know you came from us. Thank you to the AWS Insiders podcast for keeping this podcast going strong.

26:37 We just had our Black Friday sale over at Talk Python. Cool. And that was really excellent. Sold a bunch of courses. We sold some pytest courses, by the way.

26:44 Yeah, I'm just excited, because sometimes we have these sorts of conversations about, you know, cool sales and stuff, and I'm glad that I get to be a part of that.

26:53 Now, we've tried other fun things, but we couldn't sell your book through them because it's through the publisher and it gets tricky. Right. So I'm really excited as well. So we did our Black Friday sale, and guess what? I noticed something a little bit unusual.

27:09 After a little bit, I opened up Glances on the main web server, and it said, CPU usage is 85%. I'm like, oh, that's not so good. 88, 91, 92.

27:18 But what was super interesting was that NGINX, not Python, was the thing getting hammered. So both NGINX workers were at almost 100%, and Python was just chilling. I'm like, okay, that is a really interesting story for Python performance, that something amazing like NGINX, which people say is fast all the time, is the bottleneck. And it turned out it survived, but just barely, right? If it were twice as bad, it would have keeled over, which would have been bad. So I talked to a bunch of people about this, and I realized that there's one HTTP response. I've got to spell that better.

27:52 And twelve CSS files, 43 images, and one JavaScript file on the page I was sending them. So I'm like, all right, maybe I should try to use some interesting CDN, which I'd gotten a recommendation for from one of our listeners but otherwise hadn't heard about. What a cool service. So now we have 112 different locations serving up those static files, nice and fast. So I went back today when we did our Cyber Monday, well, it was yesterday when I pushed out the announcement for Cyber Monday closing, and I pulled up the real-time data. Look at that traffic. That's CSS and JavaScript and images, 1.4 gigs a second.

28:31 Oh my gosh.

28:32 Insane, dude. And check this out on the server. This is the most important part: 3% CPU usage on NGINX, and across the whole computer, across all of the uWSGI processes, just a couple more percent.

28:45 CDN to the rescue.

28:47 Exactly. But the thing that's also interesting is that Python is just like, yeah, it was nothing, we can take that. But it's all those static files. So anyway, I put a write-up together for people.

28:59 In order to serve out that data, I paid $2 for 0.35 terabytes.

29:04 And by the way, it's going right now. I got to refresh here. They have these cool real time maps and whatnot, but that little spike right there is when I released the Talk Python episode.

29:17 And that's about four and a half terabytes per second, which is just insane. So anyway, I totally recommend people check this out. It's super fun.

29:25 You're reaching people all over the world. That's pretty cool.

29:27 Yes. Isn't it amazing? You get all these different locations. I think it lost its WebSocket connection because it stopped updating.

29:37 It's a little warning. This live monitor is like a little bit of a suggestion of how things might be, but yeah. Anyway yeah.

29:44 What's up with the Alaska people not listening? Hey, Alaska.

29:47 Hey, man. Come on. They're going to have to cede the land over to Canada anyway. So, not that, this final one here. No, not final. One second. Final one, then, read all about it, or hear all about it: Reeder 5. I've actually been really getting back into RSS.

30:07 What's your RSS story these days?

30:09 No, I use Feedly on my phone just to keep up on stuff.

30:14 Nice. I switched to things like Zite, which is sadly gone, and Flipboard and these sort of Apple News-like things, where they kind of curate a bunch of different sources. I'm like, you know what, there's a bunch of great places I would really like to just get content from directly and curate a little more myself, rather than just hitting "suggest more Python." Because do you know how many times my Python channel in Flipboard has "woman scared of python that comes out of toilet"? Like, you know, no, not that python. Really not.

30:44 Oh, no, no.

30:45 And so I've just been super loving it. I've been using Reeder 5, with two E's, and what a nice piece of software this thing is for $10.

30:54 Okay.

30:54 Really, really cool. Yeah. So we'll check it out.

30:57 Yeah.

30:57 And another thing I would like: if people have awesome recommendations for blogs, especially Python blogs, that I should be following or people listening should be following, put that in the YouTube channel comments or send it to us on Mastodon or on our Twitter, and maybe I'll give a shout-out to the ones that are extra good. But very cool. Let's see.

31:18 Check this out. There's a podcast called Sing for Science. And on season three, episode eight, which just came out six days ago, Rivers Cuomo of Weezer and Guido van Rossum sit down for a conversation. How cool is that?

31:32 That's pretty cool.

31:33 That's really cool.

31:34 Have you listened to it?

31:35 Yeah, I listened to it. I grabbed my phone and my dog and went for a walk and listened to It because the sun came out and that was rare right now. So, yeah, it's really interesting.

31:46 Neat.

31:47 It's a lot of the host talking to Rivers and talking to Guido and a little bit of interaction. I would have loved a little more facilitation of them talking to each other rather than being talked to directly. But both great people.

31:57 Rivers is awesome. He does really cool stuff with Python. I had him on Talk Python episode 327 about little automation tools, which was fun. So, yeah, he's a legit developer these days, which is pretty neat. All right. Final thing, Brian. Final extra.

32:13 We started with, well, I started at least my segment with AI coding, and I'm going to end it with AI coding. Kite. Do you remember Kite? It was like the original GitHub Copilot.

32:23 Yeah.

32:24 Unfortunately, they are shutting down. They've been around for ten years or so. Not quite, seven years, something like that. Really quite a while. But they're shutting down, so thanks for all the code, I suppose. And that's it. That's all I've got for all my extras.

32:40 I want to add one. So we talked about Simon Willison.

32:45 One of the things I didn't mention about his talk is that he encouraged people to write blogs. Because, well, blogs were huge for a while, and then everybody was doing it, and now not so much. And so you do get noticed more if you're writing a blog. I think that's a good thing. Plus, we can link to it easier if you've got your article on a blog.

33:07 Absolutely.

33:08 But also, RSS-wise, Planet Python is something I still check out. So planetpython.org, if you haven't heard of it. You can either get the full content, so you can read it there, since it pulls all of this through RSS from different blogs, or titles only. If you have a Python blog or you're starting one, check out planetpython.org and try to get your name on the list. Maybe put out, like, three or four articles first, and then try to get your name on the list, or your blog on the list. And that way it gets seen by people like us, even if you don't notify us.

33:46 Yeah, that's excellent. I didn't subscribe to that because I feel like it's a little bit too much of everything. But I went through all the recent posts and said, this writer looks interesting, or this source looks interesting, and subscribed directly. So I kind of used it to start my exploration of the things I want to subscribe to.

34:02 Yeah, not a bad idea. And, you know, they have RSS feeds because they're in here.

34:06 Exactly.

34:08 Since you brought it up, I just want to also point out that one of my roadblocks to writing a lot was, well, I don't have time to write, like, an article, something well thought out and 1,000 words. And you know what my new philosophy has been? Let's just write really short posts. Like, here's one about a fun thing I did with spammers, and it's like three paragraphs. Or here's one about installing something as a PWA; it's two pictures and four paragraphs. And you don't have to write long essays to contribute interesting things and ideas, I think. So I was following up on that.

34:41 Yeah. My thoughts are, if it's going to be a thread, make it a post instead.

34:46 Yeah, exactly.

34:48 All right, well, my jokes have vanished. I had a cool joke on social media and it got taken down.

34:57 It was really funny. It was totally benign. I don't know why it's gone, but whatever. And then, by the way, following up on this, Jeremy Page says you can also get RSS feeds for Mastodon users. Okay.

35:08 And Mastodon hashtags as well.

35:13 Okay. Yeah, I follow the Python Hashtag over there. I could RSS it, I suppose. Excellent. All right. Brian, do you have a joke for us?

35:21 Yeah. So speaking of Mastodon, on Mastodon I said, I'm getting a lot of great Python content on Mastodon, but I need some joke people to follow. I need some nerd jokes, asking for people. And nobody really told me a person to follow; I'm still looking for people to follow with good jokes. So send them my way, or send me their way if you know of people. But here's one that I got from somebody on Mastodon, so I got it from wherever they got it from. I should probably give credit.

35:52 So this came from Stephenbox. Nice. Thanks, Stephen. So, Exit Condition, from monkeyuser.com.

36:00 It took me a while to get this. So there's a couple of people sitting at a desk, pair programming, I'm guessing, and then somebody else is frustrated. They hear, wait. And the frustrated guy says, that's it.

36:16 And he starts going towards a door that's labeled Recursion. And somebody says, wait. He says, I'm going in. He goes in, and he gets to the other side and says, wait. He's the person trying to say, wait, there's no exit condition.

36:36 It's a dumb joke, but it's really good.

36:38 And it's got some cleverness where, like, the speech of the other one is off-screen, so it kind of looks like it comes from the original group, but in fact it's coming from the recursion of the first one.

36:52 One more. Somebody said I should follow Oliver.

36:59 Anyway, I just thought this was dumb and funny. Bobby Pin? No, I go by my full name, Robert Pindel.

37:08 And it reminded me of the Bobby Tables thing.

37:10 Yes, exactly. I love it. All right, well, thank you, everyone, for listening. And Brian, thanks for being here.

37:16 Thank you.

37:17 Bye, everyone.

37:18 Bye.

37:20 I hope you all enjoyed this crossover Python Bytes episode. If you did, please consider subscribing to Python Bytes. If you don't already, do so. Thanks for being part of Talk Python to me. Have a great New year and I'll see you in 2023.
