Transcript for Episode #314:
Ask us about modern Python projects and tools
00:00 Here's an episode that I did not see coming. Sebastian Witowski and I put together a live stream 'Ask Me Anything' as a follow-up to some of his ideas around his recent course, 'Modern Python Projects'. We dove deep into comparisons of 'Poetry' vs 'pip' and 'pipenv', and answered questions like: do I need to use Docker? And when should I? And so on. After the AMA was over, I realized it would make a great podcast too. So here you go. This is our AMA with Sebastian, all around the ideas of modern Python workflows. I hope you enjoy it. This is Talk Python To Me, Episode 314, recorded April 19, 2021.
00:48 Welcome to Talk Python To Me, a weekly podcast on Python, the language, the libraries, the ecosystem, and the personalities. This is your host, Michael Kennedy. Follow me on Twitter, where I'm @mkennedy, and keep up with the show and listen to past episodes at 'talkpython.fm'. And follow the show on Twitter via '@talkpython'. This episode is brought to you by '45Drives' and us over at Talk Python Training. Please check out what we're offering during those segments. It really helps support the show.
01:15 Hey everyone out there, thank you so much for being here today. It's exciting to have you here in the live stream, or if you're watching the recording later. So Sebastian, welcome. It's great to have you part of this ask-me-anything. In fact, it's your ask-me-anything, really. Thank you. Yeah, I'm excited to be here. Yeah, of course, I'll give my thoughts as well. But you've done a lot of thinking about tooling, and putting the right tools together, and what maybe constitutes what you might call a modern Python project, or the toolchain of a modern Python developer. And there's a couple of things that we've already done together that are maybe worth calling out that we'll build on here today. So a while ago, I guess, what was this? Oh my gosh, this was back last year, August 29, 2020. You were on Talk Python, and you came on and talked about the modern Python developer's toolkit. And then we talked a little bit after that and said, you know, it would be really cool, actually, to put a course like this together for people. So over at Talk Python Training, you created the 'Modern Python Projects' course. This is almost a nine-hour course that takes a lot of the ideas you mentioned previously in the podcast and makes them concrete, makes them something that people can actually employ and use. And so yeah, those are some of the things that we've done together before, but we're just gonna take a broader view and talk about what this whole idea of modern Python projects might be. Yeah, yeah. So let's just kick it off with a quick, high-level overview. What are your thoughts? What constitutes modern Python development? How has it changed over time, things like that? Then we'll get to people's individual questions. Yeah.
So as you mentioned, this whole idea started as a Python workshop where I wanted to share with people some ideas for tools that I've been using, and that I know a lot of other people have been using. Because I sometimes see people start programming in Python, and they still stick to using the default Python shell. And they don't know about many great tools that are in the ecosystem. You hear like, oh, I used IDLE to try to run this. And you're like, whoa, please don't use IDLE. I mean, it's built in, but there are really more helpful options these days, right? Something like this. Yeah, I mean, Python is really cool with the batteries included, because you have a lot of things there. But at the same time, there are a lot of other tools and projects that can make your life much, much easier. And that's what I wanted to share with people. Nice. I would like to point out, for people who maybe don't know, there are a lot of amazing tools that are not part of Python, on purpose. And the reason they're not part of Python is not that they aren't amazing. It's that Python releases used to be every 18 months, and are now every 12 months, and once something goes into Python, it can almost never be taken out. So it's very hard to have a move-fast-and-break-things type of mentality with Python itself, in the standard library and so on. But stuff outside of it can move much more rapidly. For example, there was a conversation with the core devs around making 'requests', the package, part of Python, because it's a vastly better way to do it than the built-in HTTP libraries. They decided not to put 'requests' into Python itself, because they said it would actually slow down and hamper 'requests' and make it less valuable. And it's better to leave it as its own standalone thing. And I think that touches on a lot of what you're talking about, Sebastian: there's what's built into Python, and a lot of it is really good.
But oftentimes there are better things outside, and they're not likely to be moved into Python, for their own good. Yeah, exactly. And since installing new packages or tools in Python is just one pip command away, it's very easy to add new tools and experiment with them. And at the same time, it's worth knowing which tools are kind of backed by the
05:00 Python community, like which are the most popular, right? Which are the de facto primary ways, like 'pytest', for example, would be in that case. Yeah, exactly. Okay. Yeah, for example, I wanted to mention 'cookiecutter', because if you're looking for a way to start a new project, and you're wondering how, no matter if it's a Django website or a Python module, a lot of people know about 'cookiecutter' templates. And they are a great way to start, because you use a template that many people used before you, and it has a lot of sane defaults already set up for you. But if you've never heard about cookiecutter templates, there is no way you're gonna randomly find it, because cookiecutter doesn't even have Python in its name. So unless you know there is a tool for that, there is no way for you to find it. Right, exactly. But it's super helpful. Okay, fantastic. Well, we have a bunch of questions from the ask-us-anything, or ask-you-anything, that we've already gotten, because we sent out an email and said, hey, send us your questions. Also, Norbert has a question here in the live chat. Which would you like to go with first? Do you want to do the live chat question, or pick some out of our previous ones to make sure we get to them? You ask first, what are your thoughts? So maybe go with the ones that we got beforehand, because there are some interesting questions that we picked up, and I think they will be useful to more people. And then we'll do the live chat. Yeah. All right, that sounds great. So folks out there, please keep putting your questions and follow-up questions to what we're about to talk about in the live chat, and we'll get to them. Alright, so the first one is: what's the point of setting up something like 'pyenv', using 'venv' and so on, when you can just use Docker?
Yeah, so maybe a little bit of background first, for people who are not really sure what all these things are, and then the trade-offs? Exactly. So 'pyenv' is a tool that you can use to manage different Python versions on your computer. So for example, if you want to have Python 3.6, 3.7, 3.8, and 3.9 installed at the same time and easily switch between them, you can use that. And then 'venv' is a tool used to create virtual environments. So it's a way to isolate the dependencies of your project. So basically, you would use 'pyenv' and 'venv' to isolate Python and Python packages, right? So 'pyenv' is getting the version you want, and then 'venv' is isolating that for a particular project, right? Exactly. Okay, they work together. And then the same thing could be solved with Docker, where you have a Docker image that you use to spin up Docker containers, and everything is isolated inside of it. So you have a specific Python version that you choose based on which Docker image you use. And then inside this Docker container, you install the pip packages, and they are isolated from both your computer and other Docker images and containers. And so, coming back to the question, I actually use both. I use Docker in a lot of my projects, and it's very convenient, especially if you want to later deploy your project or share it with your colleagues. But at the same time, I don't want to spin up a Docker image if I just want to run a Python script. So I still use 'pyenv' to manage Python versions on my computer. So I can use 'pyenv' to install and to change the global version of Python on my computer. And I found that it's already useful for me, because sometimes I'm installing packages with Homebrew. By the way, I'm using a Mac, so 'Homebrew' is the package manager for Mac. Yeah, I use 'Homebrew' as well. And I think that's also part of the modern toolchain, that you have something that updates your tools. Yeah, exactly. Automatically. Yeah.
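As a rough sketch of the split described here (the pyenv commands in the comments are illustrative and require pyenv to be installed separately; the `.venv` directory name is just a common convention), the stdlib `venv` module is all it takes to isolate a project's packages:

```python
import sys
import venv
from pathlib import Path

# With pyenv installed, you would first pin the interpreter version in
# a shell, e.g.:
#     pyenv install 3.9.4
#     pyenv local 3.9.4
# (illustrative commands; pyenv is a separate install, not part of Python)

# venv then isolates this project's packages from everything else.
env_dir = Path(".venv")
venv.create(env_dir, with_pip=True)  # equivalent to: python -m venv .venv

# The environment gets its own interpreter and site-packages directory.
bin_dir = "Scripts" if sys.platform == "win32" else "bin"
print((env_dir / bin_dir).exists())  # → True
```

Activating the environment (`. .venv/bin/activate` on Mac/Linux) then makes `python` and `pip` point inside it, which is the isolation being discussed.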
And each time it's installing something, I see that it's trying to install some Python version in the background. And at the beginning, I was worried, because it was messing up the global Python versions. But with 'pyenv', I have Python versions separated from my system. Whatever Homebrew is messing up there, I don't really care: pyenv's Pythons live in a separate folder, and it modifies the system path variables so that these separate folders are read before the system Python, and before whatever Homebrew installs. So I don't really care what the system Python on my computer is. By the way, Mac is still using 2.7. I don't care what Homebrew is installing; I have 'pyenv', and that works really flawlessly. But Docker is also really great, because if you want to start a project using a different Python version, you just change one variable in the Dockerfile, and that's it. Yeah. So there are many benefits to using that. I think Docker is really useful, and I do use it some of the time, but it's not the first tool that I reach for, because a lot of times I don't need that complexity. You know, if I'm gonna say, I need 'Redis' just like this, and I need 'Celery' like that, and I need MongoDB set up like that, and then I want to make that whole package work together, and I really would like it to be exactly on 'Ubuntu', which is the way I run things in production, then I would definitely reach for Docker. But
10:00:00 if I just want to develop one of my web apps or some API I'm working on, chances are running it locally is totally good, and it's just fewer things in the mix to worry about. So to me, a virtual environment seems totally good for that. To me, it's easy to see what everyone is super excited about and then go, oh, I have to do that as well. And that makes sense when there's enough complexity, or you're in the same sort of situation that they're also excited about. But it doesn't mean it applies everywhere. Like, if I just wanted to do basic Jupyter work on something that I could run on my machine, I wouldn't necessarily fire up Docker to make that happen. Yeah. So I want to also point out that this question is specifically about using Docker to replace 'pyenv' and virtual environments. It's not about using Docker in general; I think we have another question coming up later about using Docker in general. Yeah, yeah, exactly. Alright, another thing worth pointing out while we have this on the screen is that it doesn't work exactly on Windows, right? But there is 'pyenv-win'. Yeah. So the question is, what's the benefit of using that on Windows over, say, 'Conda'? And I feel like it's almost a general question: what are the advantages of something like 'pyenv' plus virtual environments and 'pip' over 'Conda'? Almost, I would say, if you're using Conda, on Windows or whatever operating system, and you are happy with it, I don't really see any reason to switch to something else. I mean, conda is one huge tool that you can use to manage Python versions, dependencies, installing packages, and stuff like that. So I would say, if you're happy, stick with it. But if you're not using conda, and for some reason you don't want to use it, but you're on Windows and you want to install different versions of Python, then 'pyenv-win' is a great tool to do that. Because unlike conda, it's very specific.
It's just for managing Python versions; it doesn't do a lot of the other things that conda does. So if you're looking for this specific type of tool, then I would go with 'pyenv-win'.
12:01:01 This portion of Talk Python is brought to you by '45Drives'. 45Drives offers the only enterprise data storage servers powered by open source. They build their solutions with off-the-shelf hardware and use software-defined open source designs that are unmatched in price and flexibility. The open source solutions 45Drives uses are powerful, robust, and completely supported from end to end. And best of all, they come with zero software licensing fees and no vendor lock-in. 45Drives offers servers ranging from four to 60 bays and can guide your organization through any sized data storage challenge. Check out what they have to offer over at 'talkpython.fm/45drives'. If you get in touch with them and say you heard about their offer from us, you'll get a chance to win a custom front plate. So visit 'talkpython.fm/45drives' or just click the link in your podcast player.
12:01:01 I think one of the things that I would point out here is that the conda packages, to some degree, and certainly in the official Anaconda version, not necessarily the conda-forge version, do lag a little in version number behind what is out there. Something gets published by the author of that package to PyPI immediately, and then eventually, once it's been verified to be working well and all that, it gets brought into Anaconda. And generally, that's not a big deal. But sometimes you want to be on the absolute latest, and there's a little bit of a delay. The place where the absolute latest matters most is if there's a security vulnerability in an API framework or a web framework. If something like that were to come out, say with Django, which has happened recently, you would like to update to the new version immediately, before anybody starts poking around on those things. And things like conda and these other more verified places sometimes lag behind. If it takes a little bit longer to get the latest notebook UI or whatever, who cares. But if there's an active exploit on the internet against my thing, then you definitely are going to want the latest, latest, latest as soon as possible. So I think that's one thing to consider, if that's your world. I know that's something I've run into. Yeah. Alright, let's take a couple of things out of the livestream comments, just because they're follow-ups. Right. So one question is: what's the best 'Homebrew' equivalent for Windows? I've got some thoughts, but go ahead, Sebastian. Okay, I'm not a Windows user, I have no idea. I've heard about 'Chocolatey', but I've only heard about it. Sorry. Yeah. So, Crusted says 'Chocolatey', and then 'Scoop' and 'Winget'. I don't know about 'Scoop' or 'Winget', but 'Chocolatey' is very much like Homebrew, and I'm familiar with that one. And that one works pretty well. So that's pretty cool.
But I definitely recommend: if you're on Windows, check out 'Chocolatey'. If you're on Mac, check out Homebrew. And then a lot of Linux distributions already have this built in, right?
15:00:01 apt on Ubuntu and so on. You don't need to go searching for those, because that's the primary way. Yeah. And then finally, Mike Levin says the best Homebrew equivalent for Windows is the Ubuntu repos under WSL. Okay, that's an interesting one as well. But yeah, so you also have WSL, but then you have apt as well. Alright, cool. Thank you, everyone, for that. And we'll get back to some of the main questions as well. The next one, Sebastian, is about 'Poetry': I've been using 'setup.py' in my Python projects, but I see a lot of tutorials recommending other tools like 'Pipenv' or 'Poetry'. Should I switch? So basically, Poetry and pipenv can be used to manage projects, so they are much more than just a 'setup.py' replacement. Actually, 'pipenv' is not a replacement for 'setup.py', but I will get to that in a moment. So basically, those are bigger tools that you can use to manage your whole project. You can use them to add dependencies, to update dependencies; they will manage the virtual environments for you, so you don't have to activate anything, and things like that. And with Poetry, you can also publish packages to PyPI, which is basically a replacement for setup.py. You don't need to have 'setup.py' with 'Poetry', which is kind of convenient, because I always have to Google what I should put in the setup.py. And there is also a package called 'flit', if you want a lightweight replacement for 'setup.py', but that's off topic. And if you like using setup.py, then again, if you don't have a real reason to switch to something else, I would say stick with setup.py, because that's the de facto way of doing things in Python. setup.py is supported by Python itself. There is also the 'pyproject.toml' file that is now supported by recent pip, but for many years setup.py was the way of building packages. So I wouldn't just switch for the sake of switching.
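For reference, the file Poetry manages in place of setup.py looks roughly like this (the package name, versions, and author here are made-up examples, and this reflects the pyproject.toml layout Poetry generated at the time, not an official template):

```toml
# pyproject.toml — Poetry's replacement for setup.py (illustrative values)
[tool.poetry]
name = "my-package"
version = "0.1.0"
description = "An example package"
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = "^3.9"
requests = "^2.25"

[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
```

Commands like `poetry add`, `poetry build`, and `poetry publish` then read and update this file, which is what makes Poetry a setup.py replacement for packages headed to PyPI.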
But if you're looking for a tool that can manage your whole project, then I would use either Pipenv or 'Poetry'. And if you want to publish, then I would go with Poetry. Yeah. And I think you also want to consider: are you building a package, or are you building an application? Because some of those things you do differently; particularly, 'Pipenv' is a little bit more targeted at building applications than packages, from my understanding. Yeah, exactly. And one thing to consider is that both Poetry and 'Pipenv' will be an external dependency for your project. So if for some reason they break or stop being maintained, then you're gonna have to migrate to something else. While if you stick with setup.py and the requirements files, then all the tools are there when you install Python. Yeah, that's a good point. And also, yeah, it's like a bootstrapping step to get started. So if you have setup.py, you don't need to install anything else to kind of check it out and try it. Whereas, you know, if you have Poetry, the first thing you have to do in a new environment is set up Poetry, so that you can then initialize all the stuff, right? Yeah, exactly. I think I was recording an episode for the course with Poetry, and the first package that I wanted to add was the latest 'pytest'. And the first thing I got was an error, because Poetry already had an older version of pytest pinned down. So I had to open the config file and remove it. The first impression was kind of weird: you try to add a package and it breaks. But otherwise, I would say it's a good tool. If you like it, use it, but keep in mind that it's yet another dependency for you. Okay, let me ask a follow-up from Crusted on the live stream: what is the best way, if any, for distributing Python applications to non-developers? Because this conversation about Poetry and Pipenv and all of that is for developers, right? Yeah. So what's the story, where's the icon thing
I can put in my dock or pin to my taskbar and let people click? What do I do there? Yeah, that's a good question. I also covered this in the course, because I was also looking for a way that you can distribute your Python application to people who don't know what Python is. And there is this 'PyInstaller' package that you can use to create an executable file that contains all the dependencies. And it works really great. I was able to spin up a Windows machine, create an executable, then spin up another Windows machine without even Python installed, move it there, and it was working fine. It was using a graphical user interface. The only downside I saw is that it doesn't cross-compile. So if you want to build an application for Windows, you have to compile it on Windows. If you want to build a Mac application, you have to do the same thing on a Mac. Otherwise, I would say PyInstaller is a really great way to build executable Python binaries. Yeah, I agree. And so says Patrick as well. Hey, Patrick, thanks for that. I'll also add one thing really quick,
20:00:01 really quickly, partly just to show it as an example for people, and partly because this is a thing that I use all the time: I built a little Mac menu bar app, 100% in Python, and distributed it as a '.app' file, just like we're talking about here. So I created this thing; it looks like this. It does a few more things now. But you know, if you're copying some title and you want to turn it into a file name, or you want to make sure you strip off stuff or clean out the text before you paste it somewhere else, I put this little thing that goes in the menu bar, built it with Python, and it runs with PyInstaller, I believe... no, it uses 'py2app', actually, that's the one. I think so. Anyway, there are a couple of options. But yeah, it's usually some variation of what you need to do on Windows and on Mac. And you actually have to have a Windows machine, and you have to have a macOS machine, in order to create those distributables. So kind of a pain there. By the way, if someone is wondering what this 'py2app' is, because I was checking this as well: there is 'PyInstaller' that can be used for Windows, Mac, and Linux distributions. And there is this 'py2app' that is like PyInstaller, but for creating Mac applications. And there's also 'py2exe', which is, again, like a subset of PyInstaller, for Windows, right? Yeah, 'py2exe'. I feel like those might be more specific, like covering things like adding icons and other stuff maybe a little bit better, I'm not sure. For example, this one has an icon set that you create for it. And the reason I chose 'py2app' is that it's based on 'rumps', which is 'Ridiculously Uncomplicated macOS Python Statusbar apps', or something; I don't remember exactly what the acronym stands for. But it's 100% a Mac app.
So I figured I'd just use the Mac app builder thing for it, because there's no chance of it running on anything else; the framework itself is only for Macs. So anyway, let's go to the next question. Here we go: which one should I choose, Pipenv or Poetry or something else? I would say they are both very similar. It's just that Poetry allows you to easily publish packages on PyPI. So if you're building a Python package, I would go with Poetry. If you're not building a Python package, then I think both should be fine; they offer a similar set of functionality. Okay. Yeah, very cool. Conda versus pip: we sort of touched on this a little bit, but maybe a little more. Yeah, so that's an interesting question. Because with conda, you install binaries from the conda repository, while with pip, you install packages from PyPI, and they are not always binaries. So the main difference shows up when there is no binary. If there is no binary, pip will try to build the package from source, and if you are missing some dependencies on your computer, then it's going to crash, right? If you're on Windows, and you see 'cannot find vcvars.bat', this is the problem. Yeah, and if you're somewhere else, and it just fails to install with something that looks like a C++ compiler error, that's the problem. And that's part of the problem that conda is trying to solve, right? Exactly. So with conda, there is no such problem, because every package that is published in the conda repository is a binary, so it contains all the dependencies. Which can be super helpful if you don't have a computer science background, or if you're not very experienced with debugging that missing whatever-you-said, Michael.
20:00:01 So let's say, I don't know, you're a data scientist, and you just want to have your code running. You don't care about setting up stuff; you just want to install some packages and have it running. Conda is an excellent choice, because it has very good support for all those data science packages. You just install them, and you have all the dependencies installed. So it makes things much, much easier in that sense. Yeah, especially in the data science world, where there are really weird dependencies, like you need a Fortran compiler. Yeah, I definitely don't have that on my machine. Yeah. On the other hand, as you already mentioned, Michael, those binaries don't always have the latest version. Someone first has to create the binary, and if there is a vulnerability, then you have a problem. And there might be some less popular packages that are available on PyPI, but no one created the binary for conda. So either you have to wait until someone does that, or you have to create it yourself. Yeah. Because with conda, you can also install packages from PyPI, but you basically have to build a binary yourself. So again, if you're not a very experienced developer, you probably don't know how to do this. Yeah. So I'd say the benefit of using pip is that you can install any package from PyPI, while with Conda you can't, but with Conda, installing things is much easier. Yeah, I think that sounds like a pretty good summary. It's worth pointing out that the workflow with a virtual environment is quite different from Conda. Right, like with conda, you create the environment off somewhere magical with a name,
20:00:01 and you have to activate it by saying the name. Whereas with pip and venv, it's more that you explicitly say: this one, here, in this location. Yeah. So, Mike Levin has some comments in the chat stream; I'll go through them here. Thanks, Mike. So he says conda was, only he's not sure, so I'll say was, more necessary before pip included binaries, specifically with wheels. And now that there are wheels, it doesn't happen as much, and pip can replace Anaconda. Now, that's true, but Frankie one says, yeah, but not everything has a cross-platform wheel. And I do agree with that. For example, uWSGI: when I install that, it seems to compile away for a good 30 seconds, instead of coming with some binary version that can just drop in, or so on. So yeah, I agree, Mike, that things are generally much better now that we have wheels, but it's not 100%. What do you think? Yeah, I agree with that. And one thing that I want to add, while we are talking about binaries and dependencies: if you are, let's say, a web developer, and you're installing Django or Flask APIs, there is very little chance that you're gonna be missing some dependencies. So you probably will never have problems with pip. So for some cases, using pip is going to be perfectly fine for your whole work life. Yeah. So it's the weirder, farther out it gets, like I said, Fortran dependencies or some weird C library or something. And then also, Joan Pino says: I've used PyInstaller for some side projects, and it works really well. Yeah, quite cool. That's definitely a neat option.
20:00:01 Alright, Sebastian, on to the next: how do you hide secrets for development and for production? I don't think we've talked about this yet, but it's quite unnerving to see bots going along and finding all the secrets that people have put into public repos, or what were private repos that then got turned public. And people are pulling these out. That's very much not good; you do not want to just put the secret right in your source code. So what do you do? What are your recommendations here? So my recommendation is pretty standard: use the environment to hold all the secrets, and store them as environment variables. And, for example, you can have a file with the 'yaml' extension that you can put in the Git repository; it's just that you don't put the secrets there. You just put the structure and, let's say, dummy values or whatever, so you remember that you have to set up all those environment variables. Right, that's one of the challenges, because if you put it in environment variables, or you put it into some 'secrets-template.yaml' versus 'secrets.yaml', where the 'secrets.yaml' is not in the repository, then in a real project, there's gonna be a bunch of stuff, and you gotta have the names just right. And knowing what you're even supposed to fill out for the app to work is a challenge. So having this template around is super helpful, but not with real secrets, right? Yeah. And there are some libraries that you can use that will make managing those templates much easier for you. And the reason I'm mentioning those templates is that, quite often, when there is no template, people start putting variables into the environment, and then they spin up a new server and realize they forgot about some environment variables, and then things stop working.
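A minimal sketch of this pattern in Python (the variable names and the template file name are made-up examples, not from the episode): the committed template lists the names with dummy values, and the app reads the real values from the environment at startup, failing loudly if one was forgotten.

```python
import os

# Names the app needs. A committed "secrets-template.yaml" (or .env.template)
# would list these same keys with dummy values — never the real secrets.
REQUIRED = ["DATABASE_URL", "API_KEY"]

def load_settings() -> dict:
    """Read required settings from environment variables.

    Raises RuntimeError listing anything missing, so a new server that
    forgot a variable fails immediately at startup instead of later.
    """
    missing = [name for name in REQUIRED if name not in os.environ]
    if missing:
        raise RuntimeError("Missing environment variables: " + ", ".join(missing))
    return {name: os.environ[name] for name in REQUIRED}
```

With `DATABASE_URL` and `API_KEY` exported, `load_settings()` returns them; with one unset, it raises, which catches the "forgot a variable on the new server" problem described above.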
So a good practice is to have this example environment variables file where you list all the variables that have to be set. And then you have a proper environment file, which you never put into the GitHub repository, where you actually specify the real secrets, and then you load it into the environment. Yeah. And Crest has an interesting location:
20:00:01 under a rock. Under a rock, yeah. No, but then more seriously, he points out that there are tools for keeping secrets in version control too, like certain vaults, right? Like 'HashiCorp' has something, and we've had some sponsors here on Talk Python with things that were basically set up like the equivalent of 1Password or LastPass, but for your server. Definitely, if you have to be more careful about your secrets, it makes sense to use an external tool. But a lot of those things can be mitigated by setting up the API keys properly. For example, if you have an API key for AWS, you just limit what this account can do. That way, if it leaks, then you don't spin up 1,000 servers mining bitcoins for someone, and then wake up with like a $1 million debt or something like that. So there are different ways to mitigate it, depending on the level of risk, but the easiest thing is to use the environment. Yeah. And you can combine them as well, right? Like, you could have an encryption key in an environment variable and then actually encrypt the other elements, and use that. Because if it's all just in the environment, and you hack into the process, you could just go, well, 'os.environ', what's in here? Let's just look around. If it's all full of encrypted stuff, you're like, I have no idea how to decrypt this. So there are layers, right?
20:00:01 Obviously, running your app with lower privileges is really important, and firewalls, and isolation. There are layers; it's not the only thing. But one of the things is not to put raw passwords and API keys in your source code. Cool. All right, let's talk Docker again. Yeah. So, should people use Docker? What do you think, more broadly? Yeah, exactly, the same but broader question than the first one we had. It depends who you ask, because there are some people who will tell you that you should always use Docker, and when you start using Docker, you should spin up a Kubernetes cluster, and then you finally will be able to deploy your almost-static blog there. Yeah, but don't listen to those people. I mean, it really depends on what you're trying to do. Because Docker has this amazing benefit that it makes deploying your application much, much easier. So especially when you want to scale something up, Docker really pays off. For example, let's say you're deploying your website for the first time. Then probably renting a virtual private server and installing a Django website there is easier than writing a Dockerfile. But if you want to scale your Django application to, like, hundreds of machines, with Docker it's much easier, because you always use the same Dockerfile, and spinning up 100 virtual private servers by hand is probably not something that you want to do. So there is a higher cost up front, because you need this additional tool in your toolset, but in the long run, it will pay off. And another huge advantage of Docker is that it makes collaborating with people much easier. Because I remember, back in the day, when I was joining a new team, there were some setup scripts, and you had to install some dependencies, and you had to set up some environment variables, and then, after two days, you were finally able to start coding.
And now, when I join new projects that are using Docker, it's just two commands, docker build and docker run, and you have the whole development environment running: you have the website there, you have the database, you have the Redis server, whatever. So it makes joining a new team much, much easier. On the other hand, let's say you have a Python script, like a one-off script that you want to use to, I don't know, scrape some data from the internet. There's absolutely no point in adding more work for yourself by writing a Dockerfile if you know you're going to be throwing away this code tomorrow. So it depends on your use case. For simple things, I wouldn't bother, unless you really like using Docker. But for more advanced projects, it will probably pay off. Yeah, I was speaking with one of the Talk Python listeners, having a 'what do you think about this, and how do we solve this problem' type of conversation.
20:00:01 And it was basically: how do I make sure that everyone on my team has exactly the same version of Python? Right? I want to make sure that they're using 3.9.2, not 3.9.3, not 3.9.1 — you definitely don't want the one that has that floating point vulnerability — but you know what I mean. You don't want variation; you want exactly that thing. And if you really care down to that degree, you know, Docker is great, because then you just make everyone run the same Docker image. You provide it in some, I don't know, local Docker Hub type thing, or just put the Dockerfile into version control, and if it changes, then it just rebuilds and runs, and off you go, right? Yeah, yeah. So it helps a lot for that. But I totally agree, I wouldn't use it for super simple things. You're just causing yourself more challenges when often it's not really that useful, or that needed, I guess, is the way to put it. So there are some interesting follow-ups here in the live stream. In addition, one listener says: I thought GitHub now automatically detects when you commit secrets accidentally. I think I did hear that it takes an attempt at that, but I wouldn't count on GitHub catching all of my secrets; I would be more deliberate about it. Do you know anything about this? Not really. I know that there are those bots that check for vulnerabilities and stuff like that, so I would imagine it also has a bot that detects secrets in the commits. But I'm just wondering how it works, because when you push something to GitHub, it's automatically available there. So does GitHub put some blocker on your repo to not show the latest commit? Because usually what I see people doing is they push a commit with a secret, and they realize they pushed it, so they push another commit removing the secret. But there are bots running through GitHub harvesting those, so the moment you push it, it's already distributed to somebody. It's too late.
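As a sketch of what that looks like, a Dockerfile checked into version control can pin the interpreter down to the patch release everyone on the team runs. The base image tag and file names here are illustrative:

```dockerfile
# Pin the exact interpreter version for the whole team
FROM python:3.9.2-slim

WORKDIR /app

# Install pinned dependencies first so Docker can cache this layer
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .
CMD ["python", "main.py"]
```

The "two commands" Sebastian mentions are then roughly `docker build -t myapp .` and `docker run myapp`, and everyone gets the same Python, byte for byte.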
Yes, too late, the stream has already been observed. It's really not good. Related to this, a sort of follow-on — I think related to the Docker discussion a little bit, or maybe in some ways. So thp asks: if you're not an expert Pythonista, how do you manage supply chain and tool chain attacks appropriately? Do you want to maybe give people an idea of what supply chain vulnerabilities are, and then we can talk about this? I don't know
20:00:01 what supply chain vulnerabilities are? I'll give you the rundown. So, for example, recently there were about 4,000 malicious packages that were put on PyPI. Then there was another vulnerability where, if you're running a private 'PyPI' server so that your team can publish and share packages across applications, often those requests will fall through to the real PyPI. So I could just say pip install requests and it'll get the one from PyPI, but if I say pip install data-layer, it'll get our local data-layer package, right? One of the vulnerabilities was: if somebody finds a way to publish your private package name to the public PyPI with a higher version number, it would prefer that one and give you that instead. So that's one sort of vulnerability along those lines. And maybe more concerning: in PHP, somebody put in some code so that if you put the word 'zerodium', I believe, in the headers — in the user agent — along with some PHP code, and made a request against a PHP site, which something like 79% of the internet runs on, it would actually just run that arbitrary code. So those are the types of things I think we're talking about here. That's a very good question, and to be honest, I don't really know how you can mitigate those kinds of attacks. Because, yeah, the package name typos can happen, and there is probably not much you can do. The only thing that comes to my mind is: if you don't know how to solve those problems, it's probably worth paying someone to take care of that. So, I don't know, maybe using Heroku would be a bit more helpful — I don't know how they mitigate this stuff. What about you, Michael? Yeah, I've got a couple of ideas, none of them perfect. You can pin your versions. So you can go and look at all the libraries you're using and say, okay, I trust them as they are now.
And if you pin the version in your requirements.txt, your pyproject.toml, or whatever, you're not going to automatically get the latest version, which may have had some kind of vulnerability snuck into it. So you can consciously decide to move to the next version. Hopefully, you know, if somebody were to hack into Django, word of that gets out, and it wouldn't last very long, right? And so if you don't automatically upgrade on the next deploy, you're at least going to be a little bit better off. So pinning your versions is probably one. You could whitelist: you can do things like set up a private PyPI server and whitelist packages, so people can't arbitrarily install various things; they only install ones you approve. And you can use this thing, the 'Snyk' package advisor. Say I want to learn about FastAPI — like, what's the story of FastAPI? This is cool, because it's not just security. It gives you the health, the maintainability level, how actively it's maintained, whether it has any known security issues, whether the community is active, whether it's influential, all sorts of cool graphs about who's working on it and when it's been worked on, the amount of funding, security analysis by 'Snyk', and so on. So this is a pretty good thing. So put those together, right? If I were to pin my versions, and then those versions are going to be verified — or maybe I'm going to create a whitelist and only allow installs from the whitelist, and I'll make sure that everything that gets through the whitelist has at least been checked out by Snyk — it's not perfect, but it's better than pip install random-thing and hope for the best. So yeah, anyway, let's see. incrusted also follows on with: is vendoring dependencies an appropriate mitigation? Like, instead of pip installing requests, finding the key bits of requests and just copying that code into your application. It certainly would help.
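As a toy illustration of the pin-and-allowlist idea (the package names and versions here are made up), a check like this could run in CI to flag anything installed outside the approved set:

```python
def audit_packages(installed: dict[str, str], allowed: dict[str, str]) -> list[str]:
    """Compare installed packages against an allowlist of pinned versions,
    returning human-readable problems (an empty list means all clear)."""
    problems = []
    for name, version in sorted(installed.items()):
        if name not in allowed:
            problems.append(f"{name} {version}: not on the allowlist")
        elif allowed[name] != version:
            problems.append(f"{name}: pinned {allowed[name]}, found {version}")
    return problems

# Hypothetical snapshot of an environment versus the team's approved pins:
allowed = {"requests": "2.25.1", "flask": "1.1.2"}
installed = {"requests": "2.25.1", "flask": "1.1.4", "asteroids": "0.0.1"}
print(audit_packages(installed, allowed))
```

For real projects, pip's hash-checking mode (`pip install --require-hashes -r requirements.txt`) takes this further by pinning not just versions but the exact artifact hashes.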
On the downside, if a non-intended vulnerability were to get into requests — not a hack, but just something that ultimately became a security hole — you've now vendored that in, instead of automatically getting an upgrade, right? I don't know, what do you think? Yeah, I think it's easy to do this, because you can basically put the virtual environment folder with your project, and that's it. In the long run, it's probably harder to maintain. And I think, as you said, pinning the dependency versions and making sure you don't have typos can probably save you from a lot of those problems. Yeah, be very careful when you pip install something that you get the spelling right, because one of those vulnerable packages was something like 'asteroids' versus 'asteroid' — plural versus singular — and that was it, right? It's very subtle. Yeah. And I guess if you're using more popular packages, then you should probably be safe. I would say with some more obscure packages, it can maybe happen that they get abandoned and someone takes over the Git repository or squats the name. I don't know if that's actually possible on PyPI, but with bigger packages, it's probably not an issue.
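A tiny illustration of just how subtle those typos are: the standard library's difflib can flag near-misses against a list of popular package names (the list here is only a small sample, not a real registry):

```python
import difflib

POPULAR = ["requests", "numpy", "pandas", "flask", "django", "asteroid"]

def suspicious(name: str, known: list[str], cutoff: float = 0.85) -> list[str]:
    """Return known package names that are suspiciously similar to `name`
    (but not identical) -- a crude typosquat warning."""
    matches = difflib.get_close_matches(name, known, n=3, cutoff=cutoff)
    return [m for m in matches if m != name]

print(suspicious("asteroids", POPULAR))  # flags the singular 'asteroid'
print(suspicious("requests", POPULAR))   # exact match, nothing suspicious
```

This is nowhere near a real defense, just a sketch of the plural-versus-singular trap mentioned above; real tooling checks against the full index and known attack lists.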
20:00:01 Yeah, for sure. I think it's a relative thing. So let's see, another commenter says you can develop in a VM or Docker. And related to that, Doug Farrell — hey, Doug — says he's really interested in how to set up a Docker-based development environment. So one of the things I think you want to distinguish here in this supply chain conversation is: maybe I'm just trying something out, and if it's going to run that setup.py when I pip install it, it could pip install a virus onto my machine. If that's your concern, you can develop in a virtual machine, or you can develop in Docker, because then all it sees is what's inside the Docker container, which is basically nothing. So that's really cool. But also, as incrusted points out, if this is your application and you're pushing it to production, that vulnerability comes along inside the web server, so that's a challenge, right? So in this exploratory world, I think things like Docker actually help a lot to solve this problem, but in a production world, it does little to help. Let's follow on real quick with that. I think this development environment based on Docker is pretty interesting. For example, PyCharm now lets you set up a Docker image as the run target, which you can even debug into. And so you just press run, and it starts the Docker container, attaches to it, and you just run it there. I think VS Code also has something along these lines. What do you think about this? Yeah, I'm using VS Code, and I was trying to set it up with Docker some time ago. And it was possible to also put breakpoints there — say, run a Django application in Docker and have breakpoints work. It just requires a bit of setup. But I would guess that either by now, or soon in the future, the folks from VS Code will have further simplified it, because they just keep adding things to make stuff easier there. Yeah, that's definitely a viable option. Yeah, that sounds good.
Here's the thing for PyCharm, right? You basically configure an interpreter using Docker, press go, and boom, off it runs in Docker, which is, I think, pretty awesome.
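For the VS Code side of what we just discussed, the pattern is now packaged as "dev containers". A minimal `.devcontainer/devcontainer.json` might look like this — the image tag, extension list, and post-create command are illustrative, not a recommendation:

```json
{
  "name": "python-dev",
  "image": "python:3.9-slim",
  "customizations": {
    "vscode": {
      "extensions": ["ms-python.python"]
    }
  },
  "postCreateCommand": "pip install -r requirements.txt"
}
```

With a file like this in the repo, VS Code offers to reopen the project inside the container, and breakpoints, the terminal, and the interpreter all live in Docker rather than on the host.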
20:00:01 Talk Python to Me is partially supported by our training courses. Do you want to learn Python, but you can't bear to subscribe to yet another service? At Talk Python Training, we hate subscriptions too. That's why our course bundle gives you full access to the entire library of courses for one fair price. That's right: with the course bundle, you save 70% off the full price of our courses, and you own them all forever. That includes courses published at the time of the purchase, as well as courses released within about a year of the bundle. So stop subscribing and start learning at 'talkpython.fm/everything'.
20:00:01 Yeah, there are some comments about this being so much about dependencies and stuff, but I do think a lot of it really is. It's like: how do you get the right libraries? How do you keep those libraries up to date? How do you do that securely? How do you deploy those things out to the world? How do you share that with the user? So much of this stuff is a pretty interesting thing to talk about. One follow-up here from Doug is: are there resources or links that would help me, and everyone, understand how to build a debuggable Django Docker environment? I think, actually, the thing I have on the screen here, if you're into PyCharm, is one — I'll throw that into the live chat. I mean, it is quick: these three buttons, and you press the debug button, and it runs in Docker. Do you have something like this for VS Code? So what I did is I took the documentation — I think VS Code was also using Django as an example in the documentation. So you'd have to dig into the docs, but they're pretty easy to follow. Off the top of my head, I don't know of anything else. Yeah, okay. So let's see, I think incrusted has a pretty interesting question here: what's the story around code formatters, linters, and type checkers? Autocomplete got the better of them there, but no problem. Could you be more specific? Because I'm not sure. Well, yeah, I'm guessing like: do you use 'Black'? Do you use MyPy? Things like that, I would have guessed. But yeah, incrusted, give us a follow-up if you want. I will start, and you can clarify the question. So I would say definitely use some of those tools. Definitely, I would recommend using Black.
20:00:01 Because some people don't like it — it's opinionated, and you might not like how it formats some code, even though you can actually put comments to disable formatting if you want to preserve your beautifully formatted dictionary of lists of tuples of dictionaries and stuff like that. But if everyone on your team is using a code formatter, no matter if it's Black or something else, then at least during code reviews you don't have silly discussions about code style. And I had those discussions in the past, and they were terrible, because code reviews should focus on what's happening in the code, not how the code looks. So definitely, some kind of formatter is very, very useful. And Black is
20:00:01 the most popular one; it works out of the box, so I would suggest using that. And speaking of linters, I really like 'flake8', but there's also 'Pylint', and you can actually use both of them. And even though I have, I don't know, eight or nine years of Python, flake8 still finds some silly bugs in my code. Because sometimes you're tired and you, like, assign to an undefined variable and stuff like that. So instead of waiting for your code to actually run and give you this error — because there is no compilation step, so you can't get the errors beforehand — you immediately get those easy-to-spot errors for free. So definitely spend some time adding them to your code editor. And then you can keep going, because flake8 has plenty of plugins that let you extend it further, and there's a bunch of other static analysis tools. I list them in the course — I think there is a website with the resources for the course where you can find some cool plugins. And there is stuff like, for example, 'Sourcery', which I really like, that gives you recommendations for how to refactor your code. For example, when you're writing a for loop that's basically building a list, you will get a recommendation to use a list comprehension, showing you how that comprehension will look. So that's yet another tool. Yeah, one of the things about Sourcery that annoyed me quite a bit is, in the early days, it would recommend all these refactorings, and, like, nine out of ten times I'm like, yeah, that's a great refactoring. And other times, like, no, this is a horrible refactoring — I don't want this change, and I know the pattern I'm using is better than this, even though you're programmed to think this is it — and you couldn't disable the notification. So I just had this permanent warning mark on my screen for certain bits of code. They recently added a way to disable that on a per-project basis.
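The for-loop-to-comprehension suggestion Sebastian describes is easy to picture; both versions below build the same list:

```python
# The pattern a tool like Sourcery flags: a loop that only appends to a list...
squares = []
for n in range(10):
    squares.append(n * n)

# ...and the comprehension it would suggest instead:
squares_comprehension = [n * n for n in range(10)]

assert squares == squares_comprehension
print(squares_comprehension)
```

Same result, but the comprehension states the intent ("build a list from this expression") in one line, which is why linters and refactoring tools push you toward it.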
So now I'm all about Sourcery, now that I can turn off the one or two refactorings that I'm not a huge fan of, and the rest are really nice, like you say. Yeah, I didn't have that experience with Sourcery; I like it. But as I said, there are plenty of tools, so you can check them out. Each of them works in a slightly different way; you can combine them, you can disable some stuff if you don't like it. For example, I didn't like Pylint, because it was too strict. Yeah. Like, when you had a class that had too few methods, it was saying, well, this shouldn't be a class, and stuff like that. Right? But maybe you're putting it there because it's going to get bigger in a month as the thing grows — you're putting that flexibility in now, and it's going to tell you, no, you shouldn't have it, which is not necessarily the case. Exactly. I mean, it's trying to make you write the most beautiful code; it's just, sometimes I prefer my ugly code.
20:00:01 Well, yeah, for sure. And so a couple of other follow-ups: Patrick says his setup is Black and flake8 with pre-commit — very cool. Doug had the same thought as I did: what he really likes about Python, VS Code, and the tools and plugins is that they basically do this while you're writing code. And to me — like, I don't run flake8 separately, I just use PyCharm with all its settings turned on, and it's super obvious if something's wrong. Actually, what I did when I ran Sourcery on my stuff: I went through, like, the 20,000 lines of Python code that run Talk Python Training, and I accepted every refactoring that Sourcery was giving me, except for that one that I don't like. And while I was going through, I found there were a few 'PyCharm' warnings that I had been ignoring — they were fine, but I really should have done this other thing — and I just fixed everything, got it pristine and perfect. And so now I really pay attention to those warnings, because there are no leftover junk warnings. And I think that's actually a really big deal about a lot of this. And who was it over here — someone, I don't see the comment, maybe I'll find it — was asking, basically, how hard do you enforce this? Yeah, here you go. Chris asks: MyPy and others — should 'Black' be hard enforced? How strictly should you adhere to those? Like, for example, do you break continuous integration if some linter fails, or something like that? Yeah, that's a very good question. I think it's a matter of what the rest of the team likes. I worked with some teams that were very reluctant to use any of those tools; I worked with some team leaders that said CI has to pass — your commit has to pass all the CI checks. So it's a matter of preference. I mean, those tools are there to help you, not to hinder you. So if some tool doesn't work well for you, you should configure it or maybe replace it.
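For anyone curious what Patrick's Black-plus-flake8-plus-pre-commit setup looks like, a `.pre-commit-config.yaml` along these lines wires both tools into every commit. The `rev` values here are illustrative — pin whichever releases your team has vetted:

```yaml
repos:
  - repo: https://github.com/psf/black
    rev: 22.3.0          # illustrative; pin a vetted release
    hooks:
      - id: black
  - repo: https://github.com/PyCQA/flake8
    rev: 4.0.1           # illustrative
    hooks:
      - id: flake8
```

After `pip install pre-commit` and `pre-commit install`, both checks run automatically before each commit, so CI rarely sees style failures at all.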
So I know one question that we had queued up, and some people maybe touched on it a little bit, but we haven't really talked about the language hardly at all, have we? Like using language features and stuff. So what do you think about things like type hints, that kind of stuff? Yeah, and that's another sensitive topic. I would say they can definitely be useful. I mean, especially if you have a lot of code, lots of legacy code, being able to immediately see what the
20:00:01 argument types are that a given function takes, what the return value is, what the type of the return value is, right in your code editor — it's super useful. Yeah. And even though with Python we have duck typing, so we can get quite far without this, there can be some subtle bugs that type hints can help you with. But the thing with type hints is — my problem with type hints is the same problem as with tests and documentation. It's not strictly a problem, but it's yet another thing that you have to take care of. I'm not saying I don't like writing type hints. But, for example, let's say you're writing code, and then you have to add a test, you have to have documentation, and you have to have type hints. And then your code changes, and you have to update all three places. Because it's not only about updating tests and documentation; you also have to make sure that your type hints are updated, because if you have wrong type hints, they are useless. Yeah. And of course, there are tools like MyPy, but they are external tools, so you have to configure them. At the beginning, you'll probably have a lot of warnings and stuff like that. So I would say that type hints will give you back as much love as you give them. So if you spend time making sure that the type hints are properly declared, that you're not using Any all around your code, then they are going to be very, very useful. Yeah, I absolutely love type hints, but I think you should remain Pythonic as you think about using them. So I think there are two blends. One is: do you want MyPy and the checkers to completely verify that everything absolutely hangs together, like C# or C++ compilers do — it's 100% an integer here and an integer everywhere — and you lay all of that out. That's one way of using these. The other way, which is the way that I like, is to put them on the boundaries, right?
If I'm creating a package, and I want to put that package on PyPI, it would be awesome if the public API of that package all had type hints or type annotations, because then when I use it, I know automatically — my editor knows automatically — am I doing the right thing or the wrong thing, without having to constantly dive into the documentation. Is this a string, or a class that contains the information that I'm supposed to pass for location? Is that a "Portland, Oregon" string, or is it a location object, or a tuple, or a dictionary, or what the heck goes in here? Right? Those kinds of questions just go away instantly. And when you're using it wrong, your editor will put some kind of squiggly or something to say, no, no, you're using it wrong. And I think that's the biggest value of type hints: right on those boundaries, where you're maybe unfamiliar with the code, they'll really communicate that across. Yeah. I wouldn't use type hints on, like, throwaway code, but I usually try to start adding them as my project grows bigger. So for a small project, it's easy to understand what's going on, but as it keeps growing, it really pays off to start adding type hints. Yeah, for sure. All right, we're getting short on our time here, but I know a couple of people got in right at the beginning, and we kind of skipped over it a little bit just because we were getting to the pre-submitted questions first. But Patrick has one I think is worth touching on: what do you think about good Python GUI development options? Why do you think they're rare, especially for beginners? You know, a real UI is way more motivating than terminal output. I agree that it is way more motivating, and I also agree that we're quite short on good GUI development options. Yep, same opinion. I was never using a lot of GUIs, so I'm not very familiar with how many tutorials are out there, but I was building one for the course.
And I decided to use 'Tkinter', because it comes with Python. And there weren't that many tutorials — you're right. I found one that was very useful, because it was showing a lot of the different components of Tkinter, but apart from that, I would say there weren't many showing you how to build a specific GUI, like a specific project or something like that. Yeah, I just dream of the days when we had something like VB6, or Windows Forms in .NET, where you have a nice UI designer: here's a button, here's a text box, you put it over there. You want code to run? You double-click it, you write the three lines of code, you go back to work. And, like, you push a button and you get a thing that you distribute to people as an application. I don't really understand why we don't have something like that in Python: a sort of visual layout, push a button, it does some PyInstaller magic — here's your app, give it to someone. That would be transformative, and we just don't have it. I think it's a bit of a chicken-and-egg thing, but one can dream. All right, Norbert Kirkpatrick also asks about some enterprise development stuff: what's your take on things like Azure Active Directory and security topics like DB encryption, major cloud platform deployments, and so on? It's a bit of an open-ended one, but what are your thoughts, generally?
54:57 I have almost no experience with Azure.
55:00 And yeah, as you said, it's kind of a broad topic, so I don't really know what to say here without a specific question. I'll throw a few things out there. One: Azure Active Directory seems pretty interesting to me. One of the things that's an ultra pain is federated identity, or trusting other people's identity — like, if I want to let all of your users single sign-on at your workplace, and then single sign-on on my site. That's not an easy thing to do right now, and I think Azure Active Directory makes it quite easy. I haven't done a lot with it, but in that use case, I think it's pretty useful. I honestly don't do much with Active Directory at all, or corporate accounts — I'm building public web apps and things like that, so they don't make sense in that world. Database encryption: very interesting. There are a lot of levels of this: data encrypted at rest, encrypted in memory, encrypted in transit. A lot of the modern databases have encrypted columns; for example, MongoDB has different encrypted columns you can have, and then certain clients, as they talk to it, may or may not be permitted to decrypt that column. So it could return all the column values without overexposing the data, potentially. I don't do anything like that. But also encrypting backups, right? Like, what do you think about: I've got to take a backup of the database, now what? This is, like, the worst possible thing I could lose — the backup of the database; the only thing worse is letting people get access to the live database. Do you have any thoughts or recommendations on that? I can tell people what I've been doing lately, but it's not perfect. No, I will leave this question to you. So I've been using encrypted volumes — like, on a Mac, you can create virtual encrypted volumes that you can just open and close, highly, highly encrypted.
And it appears as a drive, so I've been doing my backups lately straight to that thing, and then closing the encrypted vault when I'm not actively backing up a database. Then I put that somewhere safe. But if, for some reason, something were to happen to it, at least what they get hold of is an encrypted vault with a huge password that they probably can't mess with. So, I don't know — lots to think about, lots to be up late at night worrying about. Yeah, and then Vince throws out there: we have VB style in Anvil. And yes, Anvil — it was certainly on my mind. I didn't quite call it out, I forgot about it, but then I remembered it from some conferences. And yeah, it was really interesting. The thing that's different here is that you get a specific type of web application. But if that specific type of web application works for you, then they do have this VB-style drag-and-drop, double-click, run-your-code experience, which is absolutely beautiful. But if I wanted, like, an app to build a to-do list, or a 1Password clone in Python, there's no real way to do that nicely. You know, I think one nice thing about Anvil is that it actually generates nice code that is readable. Because the one problem I have with these what-you-see-is-what-you-get editors is that I remember them from the time when you were building websites like that, and the resulting HTML was an abomination — you just couldn't modify it by hand anymore. So I was worried that if we get tools like that for Python, it's also going to result in unreadable code. But I know that Anvil actually generates pretty nice code. Yeah, I remember some of those tools — you would look at the auto-generated code, and there were basically giant comments: don't touch this, don't try to read this, just leave it alone. Any attempt to mess with it will probably just break everything. Don't touch it.
So Doug also thinks Anvil is awesome. Yeah, very cool. And it used to be that it had to run in the Anvil cloud; I believe they've open-sourced the runtime, so you can self-host it and so on — that's worth checking out. And then, I don't know, maybe this is the last comment — I see maybe one more out there. Christian asks: are there any courses out there that teach you how to get started with a lot of what we're talking about, without any background knowledge? I certainly will give another shout-out to your course, Sebastian — Modern Python Projects, over at Talk Python Training. That is definitely a great course that goes deep into all these things: like nine hours of 'here's my screen, let's go do these things'. Is it with no background knowledge? I mean, you do have to know Python, right, to do this? So basically, the assumption for the course is that you have to know Python, and that's it. I show you a lot of those tools, but I always start from scratch. I even used, like, an empty fresh account, so I made sure that I don't miss some dependencies and stuff like that. I also, like, spin up a Windows VM for some parts that required Windows. So I think you should be good with just knowing Python. Yeah, okay. A couple of real quick ones here. Doug says: oh man, yes, a Visual Python would be amazing — I liked Visual Basic. I agree. And then tsp says the fman build system is pretty cool — fman is a pretty cool UI and app.
01:00:00 And a build system for building those things and distributing them — that's pretty interesting. I haven't done anything with fman. You know, it's the first time I've heard the name. Yeah, it's like a file manager, built in Python, I believe; the person who created it then also built this build system and this UI on top of it. So, pretty interesting. Looks cool. Let's see, there's another one from Joan: is Quart and its async capabilities ready to level up Flask web apps for medium-sized enterprises, or is it currently better to stick with Flask? I have no experience with Quart, so I don't know. What about you, Michael? None, really — I mean, I interviewed the guy who made it, and I've played around with it, and it seems okay. But at enterprise, like, got-to-be-five-nines level? I've never run anything like that. Honestly, if I was really looking to build a web app with async capability, I would either look at FastAPI or Starlette. Like, I know people might think, oh, FastAPI — that's an API thing, not a web thing. But honestly, it can definitely be done, and I actually built a whole class on how to take FastAPI and sort of replace Flask with it. So it's not that hard: it supports templates, it supports static files, all those kinds of things. It's super cool. And yeah, thanks, Norbert, for the shout-out on the 10 apps course — that's really a fun one. But yeah, so I think, actually, if I was trying to go all in on async, I feel like FastAPI is one of the best async frameworks out there; it just all holds together really, really well. So, all right. Yeah, I think that might be it. Sebastian, do you want to maybe give us some wrap-up thoughts here on this whole idea? Well, thank you for having me, and thanks for those questions — they were really great, and I hope we answered all of them. Yeah, I think, you know, there's so much variety. I've worked in different technologies over the years — I've been doing development, like, 25 years or so.
And some of the time, you'll find some language or some ecosystem where there's sort of a central 'this is how you do things' — like, specifically in the Microsoft space with .NET: here's the one web framework you use, here's the one database ORM you use to talk to the one database you use. And while that's really helpful — I now know what to do, I do these four things and I'm good, this is the recommended way — in Python, we don't have that. And that's absolutely a blessing, but it's something of a curse as well, in that there's this paradox of choice, right? Oh, it's amazing, we have this cool ORM library. Oh wait, there are ten. Well, if there are ten ORM libraries, which one should I choose? And we're having this conversation of: should we use Flask? Should we use FastAPI? Should we use Quart instead of Flask? Should we use Django? Like, you can just go on and on and on about all the trade-offs that you have to make. So I think one of the big challenges is choosing a path and then going down it — saying, I'm going to use Poetry, Flask, MongoDB, let's go, or something like that, right, whatever the branch is that you happen to follow. I think that's a big challenge. Maybe I'll throw out one more thing for people out there on the live stream, if I can pull it up quick enough: awesome-python.com. This kind of helps at least narrow that list, right? So, for example, if I'm interested in caching, here are probably the eight most popular caching libraries that you might use, or something like that. It's not perfect, but at least it gives you some way to explore when you're totally new. Because I think that's one of the big challenges here, Sebastian: it's not that there are not enough choices, but that there are so many choices at each step of the way. That's the challenge, right? There's that t-shirt — not 'I learned Python in a weekend', but 'I learned Python. It was a great weekend.' Right?
Like, that's funny and also kind of true at the language level. But I don't know about you; I'm still learning Python after so many years, and we're spending all day in it. Yeah, thank you. And, like, tools come and go. I bet in like five years, this list will be completely different than it is now. So it's definitely important to spend some time trying to figure out which tool you should use. Because, as you say, this is Python: you can mix and match different tools, and sometimes there are, like, no tutorials explaining how to do things. Sometimes a tool might get popular, but then it's abandoned and no longer maintained. So yeah, I mean, a lot of tools to choose from. Yeah, I guess we'll leave it with that: there are a lot of tools to choose from, and we've covered a bunch of options and a lot of trade-offs here. So, Sebastian, it's been great to chat with you about that. Thanks for coming on the live stream. Thank you. Likewise, it was great to talk with you. Yeah, you as well. And thank you, everyone, for all the questions, those who emailed them in previously and the live ones. It was a great conversation. See you next time, guys. This has been another episode of Talk Python To Me. Our guest in this episode was Sebastian Witoski, and it's been brought to you by '45Drives' and us over at Talk Python Training.
01:04:50 Solve your storage challenges with hardware powered by open source. Check out '45Drives' storage servers at 'talkpython.fm/45drives' and skip
01:05:00 the vendor lock-in and software licensing fees. Want to level up your Python? We have one of the largest catalogs of Python video courses over at Talk Python. Our content ranges from true beginners to deeply advanced topics like memory and async. And best of all, there's not a subscription in sight. Check it out for yourself at 'training.talkpython.fm'. Be sure to subscribe to the show: open your favorite podcast app and search for Python. We should be right at the top. You can also find the iTunes feed at /iTunes, the Google Play feed at /play, and the direct RSS feed at /RSS on 'talkpython.fm'. We're live streaming most of our recordings these days. If you want to be part of the show and have your comments featured on the air, be sure to subscribe to our YouTube channel at
01:05:00 This is your host, Michael Kennedy. Thanks so much for listening. I really appreciate it. Now get out there and write some Python code!