WEBVTT

00:00:00.000 --> 00:00:03.380
The OWASP Top 10 just got a fresh update, and there are some big changes.

00:00:03.900 --> 00:00:07.060
Supply chain attacks, exceptional condition handling, and more.

00:00:07.780 --> 00:00:11.740
Tanya Janca is back on Talk Python to walk us through every single one of them.

00:00:12.280 --> 00:00:14.000
And we're not just talking theory here.

00:00:14.140 --> 00:00:19.580
We're going to turn Claude Code loose on a particularly crappy web project and see what it finds.

00:00:20.100 --> 00:00:20.960
Let's do this.

00:00:21.320 --> 00:00:26.460
It's Talk Python To Me, episode 545, recorded April 8th, 2026.

00:00:28.460 --> 00:00:29.820
Talk Python To Me.

00:00:30.000 --> 00:00:31.100
Yeah, we ready to roll.

00:00:31.420 --> 00:00:32.480
Upgrading the code.

00:00:32.640 --> 00:00:33.940
No fear of getting old.

00:00:34.040 --> 00:00:35.160
Async in the air.

00:00:35.300 --> 00:00:36.560
New frameworks in sight.

00:00:36.700 --> 00:00:37.740
Geeky rap on deck.

00:00:38.040 --> 00:00:39.740
Quart crew, it's time to unite.

00:00:39.860 --> 00:00:41.280
We started in Pyramid.

00:00:41.380 --> 00:00:42.760
Cruising old school lanes.

00:00:43.040 --> 00:00:44.540
Had that stable base, yeah, sir.

00:00:44.540 --> 00:00:49.000
Welcome to Talk Python To Me, the number one Python podcast for developers and data scientists.

00:00:49.440 --> 00:00:50.860
This is your host, Michael Kennedy.

00:00:51.200 --> 00:00:54.840
I'm a PSF fellow who's been coding for over 25 years.

00:00:55.400 --> 00:00:56.540
Let's connect on social media.

00:00:56.540 --> 00:01:00.020
You'll find me and Talk Python on Mastodon, Bluesky, and X.

00:01:00.220 --> 00:01:02.160
The social links are all in your show notes.

00:01:02.880 --> 00:01:06.420
You can find over 10 years of past episodes at talkpython.fm.

00:01:06.500 --> 00:01:09.920
And if you want to be part of the show, you can join our recording live streams.

00:01:10.100 --> 00:01:10.600
That's right.

00:01:10.780 --> 00:01:14.160
We live stream the raw uncut version of each episode on YouTube.

00:01:14.160 --> 00:01:19.160
Just visit talkpython.fm/youtube to see the schedule of upcoming events.

00:01:19.340 --> 00:01:22.980
Be sure to subscribe there and press the bell so you'll get notified anytime we're recording.

00:01:23.600 --> 00:01:27.120
This episode is brought to you by Temporal, durable workflows for Python.

00:01:27.520 --> 00:01:34.100
Write your workflows as normal Python code and Temporal ensures they run reliably, even across crashes and restarts.

00:01:34.620 --> 00:01:37.400
Get started at talkpython.fm/Temporal.

00:01:38.000 --> 00:01:39.160
Hello, Tanya Janca.

00:01:39.440 --> 00:01:40.940
Welcome back to Talk Python To Me.

00:01:40.960 --> 00:01:41.680
Awesome to have you here.

00:01:41.900 --> 00:01:42.600
Oh my gosh, Michael.

00:01:42.680 --> 00:01:43.680
It's so nice to see you.

00:01:43.680 --> 00:01:45.220
Yeah, it's really great to see you as well.

00:01:45.480 --> 00:01:54.140
I remember last time, I was nervous you were on the show, because you were going to make me feel concerned about all my software running on the internet that, I just realized, has all these issues.

00:01:55.400 --> 00:01:57.280
We're back for the 2025 edition.

00:01:57.840 --> 00:01:59.260
And I know the year is 2026.

00:01:59.580 --> 00:02:00.420
Please don't email me.

00:02:00.720 --> 00:02:01.960
I mean, email me, but not for that reason.

00:02:02.180 --> 00:02:07.380
But this is the 2025 OWASP top 10, which is pretty new, right?

00:02:07.580 --> 00:02:13.580
Yeah, so we released it December 31st, 2025, so that it could still say 2025 on it.

00:02:13.580 --> 00:02:17.000
Do you know how much branding has to change if we don't get it out this year?

00:02:17.080 --> 00:02:17.660
Let's just go.

00:02:18.260 --> 00:02:18.860
That's incredible.

00:02:19.000 --> 00:02:20.460
I didn't realize it was that close to the wire.

00:02:20.660 --> 00:02:23.040
We had released the release candidate.

00:02:23.040 --> 00:02:29.580
So every time it's released, we release a release candidate to say, this is what we're thinking.

00:02:29.660 --> 00:02:31.340
And then we ask the community for feedback.

00:02:32.040 --> 00:02:39.000
And I don't know if you remember in previous versions before I joined, there was some drama where sometimes the community is like, absolutely not.

00:02:39.220 --> 00:02:40.000
You are incorrect.

00:02:40.000 --> 00:02:42.080
Or there's vendor influence or whatever.

00:02:42.080 --> 00:02:43.640
And then they've had to rework it.

00:02:43.640 --> 00:02:53.560
But this time it was literally the smoothest it's ever been since the first one, where all the GitHub issues were great.

00:02:53.860 --> 00:02:57.680
Like, hey, you know, here's a great example of that attack.

00:02:57.860 --> 00:02:59.060
Do you want to use it?

00:02:59.100 --> 00:02:59.740
And we're like, yes.

00:02:59.840 --> 00:03:03.900
Or, you know, the grammar is wrong here or the, you know, the links are wrong.

00:03:03.900 --> 00:03:08.200
We had a couple of, well, this one should be number one because that's what my product solves.

00:03:08.560 --> 00:03:10.660
We're like, well, we'll hear that feedback.

00:03:11.000 --> 00:03:11.360
Exactly.

00:03:11.700 --> 00:03:13.020
You failed to mention our product.

00:03:13.100 --> 00:03:14.880
I'm like, oh, I do see how that happened.

00:03:15.260 --> 00:03:15.480
Yeah.

00:03:16.160 --> 00:03:22.620
But other than that, the feedback was overall just overwhelmingly, yes, we agree, which was very validating.

00:03:22.780 --> 00:03:22.980
Yeah.

00:03:23.020 --> 00:03:27.980
I'm sure there was a lot of, boy, if I could get the OWASP top 10 to reference my solution.

00:03:28.380 --> 00:03:29.880
That's some good marketing right there.

00:03:30.240 --> 00:03:30.660
Incredible.

00:03:30.660 --> 00:03:39.220
So before we dive into that with a bit of a Python focus, let's just hear a little bit about you, who you are.

00:03:39.320 --> 00:03:41.800
You've started a podcast since you've been on the show.

00:03:42.240 --> 00:03:43.220
Tell us about you.

00:03:43.380 --> 00:03:48.220
So I'm Tanya and I was a software developer that switched into application security.

00:03:48.340 --> 00:03:49.660
I went to the dark side, Michael.

00:03:50.200 --> 00:03:55.580
I started speaking at conferences so I could get in free and writing and then ended up writing two books.

00:03:55.580 --> 00:04:02.460
And now I teach secure coding and like how to use AI securely and all of those things to large companies.

00:04:02.460 --> 00:04:04.040
And then speak at conferences.

00:04:04.040 --> 00:04:05.080
It's like a habit.

00:04:05.200 --> 00:04:05.940
I can't stop.

00:04:06.500 --> 00:04:08.800
You don't get paid to do that usually.

00:04:09.460 --> 00:04:12.360
And so recently I started a podcast called DevSec Station.

00:04:12.360 --> 00:04:18.600
And it's five to 10 minute lessons on secure coding and it's free.

00:04:19.140 --> 00:04:25.980
I used to have a podcast called We Hack Purple Podcast and I, like my company got bought and absorbed, et cetera.

00:04:26.180 --> 00:04:27.920
And eventually the podcast was retired.

00:04:28.420 --> 00:04:29.980
I've missed having a podcast, Michael.

00:04:30.060 --> 00:04:31.960
I'm sure as a podcast host, you can relate.

00:04:31.960 --> 00:04:34.920
It's really nice to be able to create a piece of art and release it.

00:04:35.100 --> 00:04:43.380
It's a very interesting medium and you get to just reach out to people or explore ideas that are just interesting to you.

00:04:43.480 --> 00:04:46.940
And as long as there's a through line, you can kind of do whatever you want to.

00:04:47.300 --> 00:04:48.240
It's great.

00:04:48.340 --> 00:04:48.920
Yeah, I love it.

00:04:48.980 --> 00:04:53.220
I wanted to teach some lessons and I wanted them to be just really short.

00:04:53.980 --> 00:04:58.700
And the first season I'm exploring the idea that the supply chain is changing.

00:04:58.700 --> 00:05:02.440
The supply chain security used to be just dependencies that people worried about.

00:05:02.580 --> 00:05:07.840
But now I'm like, what if that attack surface is actually very different than we realize?

00:05:08.000 --> 00:05:14.320
And so I'm talking about how developers can protect themselves, protect their organizations, protect their build pipelines, et cetera.

00:05:14.480 --> 00:05:17.620
And so, yeah, I'm excited to see what people think of it.

00:05:17.820 --> 00:05:18.900
Yeah, I encourage people to subscribe.

00:05:19.040 --> 00:05:19.640
That's really cool.

00:05:19.860 --> 00:05:20.800
Five to 10 minutes.

00:05:21.260 --> 00:05:22.300
Is it daily or weekly?

00:05:22.500 --> 00:05:27.500
So I released one two weeks ago and I was kind of thinking of releasing one tomorrow, but I need to just get the edits.

00:05:27.500 --> 00:05:33.000
I hired some students and they're learning video editing, and it's been very exciting.

00:05:33.420 --> 00:05:43.120
Yeah, that's the back end of all this type of work that people don't realize: the episode is an hour or 10 minutes or whatever, but then there's the whole production, distribution, et cetera.

00:05:43.480 --> 00:05:44.520
I'm just a control freak.

00:05:44.600 --> 00:05:45.520
That's the problem, Michael.

00:05:46.560 --> 00:05:48.240
I just want to do it myself.

00:05:48.400 --> 00:05:49.020
Yes, exactly.

00:05:49.680 --> 00:05:51.080
There's a bit of a wind noise thing.

00:05:51.140 --> 00:05:51.940
Can we do that again?

00:05:52.320 --> 00:05:53.620
No, it is tough.

00:05:53.620 --> 00:05:56.540
It is really, really tough to kind of find that balance.

00:05:56.640 --> 00:05:57.620
But yeah, people check that out.

00:05:57.740 --> 00:05:58.320
That's awesome.

00:05:58.740 --> 00:05:58.900
Yeah.

00:05:59.040 --> 00:06:01.900
And is SheHacksPurple still your domain?

00:06:02.340 --> 00:06:02.540
Yeah.

00:06:02.680 --> 00:06:08.140
So if people go to SheHacksPurple.ca, you will find my website and my services and my blog.

00:06:08.280 --> 00:06:19.820
So lately I'm blogging a lot about how we can apply behavioral economic interventions, which is like the science of why people make decisions, to the software development ecosystem

00:06:19.820 --> 00:06:30.480
so that we basically set up secure defaults and other things that just nudge developers to do the secure thing and make the secure thing always the easiest path.

00:06:30.480 --> 00:06:38.500
And so not how do we manipulate them and pressure them and make them feel bad, but more how can we remove cognitive load that's not necessary?

00:06:39.120 --> 00:06:42.560
How can we make it more obvious what we hope that they'll do?

00:06:42.560 --> 00:06:46.500
How can we make it so like it requires effort to do the bad thing?

00:06:46.700 --> 00:06:53.420
There's a phrase that I got, I think from Scott Guthrie, who spoke about it at Microsoft, but it doesn't really matter.

00:06:53.940 --> 00:06:58.600
Help people, help developers and security folks, fall into the pit of success.

00:06:58.600 --> 00:07:02.860
Like, you've got to climb out of the pit you're supposed to be in and actively do it wrong.

00:07:02.960 --> 00:07:03.260
You know what I mean?

00:07:03.400 --> 00:07:03.800
Exactly.

00:07:04.140 --> 00:07:04.520
Exactly.
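The "pit of success" idea can be sketched in a few lines. This is a hypothetical Python sketch (the `fetch` helper and its `allow_insecure_tls` flag are invented for illustration, not a real API): the secure behavior is the zero-effort default, and doing the insecure thing requires explicit, noisy opt-out.

```python
# Hypothetical "secure by default" API design: the safe path costs nothing,
# the unsafe path takes deliberate effort.
def fetch(url: str, *, allow_insecure_tls: bool = False) -> str:
    """Pretend HTTP client: plain HTTP is refused unless loudly allowed."""
    if url.startswith("http://") and not allow_insecure_tls:
        raise ValueError(
            "Plain HTTP refused; pass allow_insecure_tls=True if you really must."
        )
    return f"fetched {url}"

print(fetch("https://example.com"))   # the easy call is the safe call
try:
    fetch("http://example.com")       # the bad thing requires climbing out of the pit
except ValueError as exc:
    print("blocked:", exc)
```

The design choice is the point: nobody has to remember to turn security on, and the insecure escape hatch documents itself at every call site.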

00:07:04.700 --> 00:07:07.960
So I'm writing a series on that based on a talk I did.

00:07:08.100 --> 00:07:10.740
So sometimes I'll do a talk and then I'm really excited about it.

00:07:10.740 --> 00:07:13.480
And I'm like, well, now I can nerd out as much as I want on my blog.

00:07:13.640 --> 00:07:17.040
I want to talk about what is the OWASP top 10?

00:07:17.360 --> 00:07:17.920
What is OWASP?

00:07:17.960 --> 00:07:20.280
What is the OWASP top 10 and all of that?

00:07:20.620 --> 00:07:34.320
But before we kind of get into that, I do want to set the stage just a little bit, because when people think about Python security or you pick your language, there's like every language and framework has a little security gotchas.

00:07:34.320 --> 00:07:40.860
Like in Python, there's, I think it's the YAML parser, and it can run arbitrary code.

00:07:41.040 --> 00:07:43.240
So you got to do like the safe YAML parsing.

00:07:43.340 --> 00:07:47.580
And then there's pickle, which is a serialization format that can run arbitrary code.

00:07:47.680 --> 00:07:50.440
So people say, these are the things to look out for.
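As a quick illustration of the pickle gotcha: a minimal sketch using the standard library, where the harmless `os.mkdir` stands in for what an attacker would really use, something like `os.system` with a shell command.

```python
import os
import pickle
import tempfile

target = os.path.join(tempfile.mkdtemp(), "pwned")

class Evil:
    # __reduce__ tells pickle how to "rebuild" this object on load:
    # here, by calling os.mkdir(target). A real attacker would return
    # os.system or similar to run arbitrary commands instead.
    def __reduce__(self):
        return (os.mkdir, (target,))

payload = pickle.dumps(Evil())  # looks like innocent serialized bytes
pickle.loads(payload)           # "deserializing" actually calls os.mkdir
print(os.path.isdir(target))    # → True: code executed during loads()
```

This is why unpickling untrusted data is never safe, and it's the same class of issue that `yaml.safe_load` exists to guard against in the YAML case.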

00:07:50.520 --> 00:07:52.420
I actually think those are fairly useless.

00:07:52.880 --> 00:08:03.620
I think the real problem is things that developers do when they write code: they either omit or add actions or steps that they should or shouldn't have done, depending on the situation.

00:08:04.040 --> 00:08:06.720
And really that's what OWASP focuses on, right?

00:08:07.020 --> 00:08:19.560
Technically, it is a nonprofit organization, but what I feel it is, is an international community of thousands and thousands and thousands of people who want there to be more secure software.

00:08:20.100 --> 00:08:22.800
And so we have chapters where people meet each month.

00:08:22.920 --> 00:08:25.060
There's over 300 worldwide, Michael.

00:08:25.240 --> 00:08:26.580
It's amazing.

00:08:27.000 --> 00:08:27.120
Wow.

00:08:27.120 --> 00:08:27.440
Yeah.

00:08:27.560 --> 00:08:30.760
Almost every big city in all of Canada has one.

00:08:30.960 --> 00:08:34.520
And like, we can't agree on a lot of things in Canada, but apparently we love OWASP.

00:08:34.720 --> 00:08:37.200
And like in India, oh my gosh, they have so many.

00:08:37.360 --> 00:08:41.040
And in the United States, like people love, oh, I love OWASP.

00:08:41.180 --> 00:08:45.040
And then we have over 100 active open source projects.

00:08:45.300 --> 00:08:51.180
So we have free books, free documents, free tools, free software, like everything you can think of.

00:08:51.200 --> 00:08:52.580
We're like, oh, I'll build one for you.

00:08:53.020 --> 00:08:56.940
And then, you know, there's a Slack channel with thousands of us on there being nerds together.

00:08:56.940 --> 00:09:08.500
And then twice a year, we have one in Europe and one in North America where we have a conference and we gather and there's talks and teachings and all the things, right?

00:09:08.620 --> 00:09:16.040
And I've been a part of OWASP since 2015, when I attended my first event.

00:09:16.240 --> 00:09:19.700
And then 2016, I was a chapter leader because I don't know how to do small things.

00:09:19.700 --> 00:09:23.180
I only know how to do really big things because I was like, oh, I'm in.

00:09:23.680 --> 00:09:32.040
And I've just been involved ever since, so I'm now a lifetime distinguished member, because I've volunteered for over 10 years and I'm their biggest fan.

00:09:32.220 --> 00:09:34.980
I'm totally president of the non-existent fan club.

00:09:35.560 --> 00:09:40.400
And the thing, so the weird thing, Michael, so OWASP does so many amazing things.

00:09:40.460 --> 00:09:41.540
There's so many amazing people.

00:09:41.540 --> 00:09:46.100
But the thing that we're absolutely most famous for is called the OWASP top 10.

00:09:46.520 --> 00:09:55.480
And I volunteered in Norway at the OWASP booth at a conference because I gave a talk and then I had nothing else to do.

00:09:55.700 --> 00:09:58.880
And I don't mean to sound rude, but unless it's a security talk, I'm not going.

00:09:59.000 --> 00:10:00.080
And it was a developer talk.

00:10:00.320 --> 00:10:03.360
And so I went to the other two security talks and then I was like, what am I going to do?

00:10:03.420 --> 00:10:05.060
So I volunteered at the OWASP booth.

00:10:05.060 --> 00:10:08.100
And every person that walked by went top 10.

00:10:08.560 --> 00:10:09.160
That's awesome.

00:10:09.680 --> 00:10:12.620
And any of them that knew us, they only knew the top 10.

00:10:13.020 --> 00:10:14.060
They didn't know we had chapters.

00:10:14.300 --> 00:10:16.300
They didn't know we had any other things to help them.

00:10:16.660 --> 00:10:19.340
But literally person after person, top 10.

00:10:20.400 --> 00:10:21.280
That's pretty funny.

00:10:21.380 --> 00:10:24.600
Like a few weeks later, the top 10 team invited me to join.

00:10:24.700 --> 00:10:25.940
And I was like, me?

00:10:26.780 --> 00:10:28.120
And I was like, okay.

00:10:28.460 --> 00:10:32.300
I knew about the whole project because of the top 10 as well, honestly.

00:10:32.560 --> 00:10:34.140
Well, I mean, then we're doing a great job.

00:10:34.140 --> 00:10:36.760
And so we finally wrote a new one.

00:10:36.880 --> 00:10:39.680
And the team's like, okay, Tanya, you like talking the most.

00:10:39.720 --> 00:10:40.540
So you go tell them.

00:10:41.240 --> 00:10:42.480
You go tell them it's ready.

00:10:42.800 --> 00:10:46.980
Actually, if you go to the OWASP top 10, there's a GitHub link at the top.

00:10:47.280 --> 00:10:55.640
And if you go there, you can actually see the historical versions: 2003, '04, '07, '10, 2017, 2021, and 2025.

00:10:55.960 --> 00:10:59.380
And you go and there's all the markdown files and presentations and et cetera.

00:10:59.580 --> 00:11:02.860
So you can kind of get the historical evolution as well.

00:11:02.860 --> 00:11:04.300
People file issues.

00:11:04.480 --> 00:11:08.860
And then I have to, me or, you know, someone else on the team, that's what we do.

00:11:09.120 --> 00:11:09.920
We respond.

00:11:10.240 --> 00:11:11.880
Sometimes we respond and we're like, no.

00:11:13.180 --> 00:11:14.840
This is the third time you've asked.

00:11:14.960 --> 00:11:15.200
No.

00:11:15.420 --> 00:11:16.740
We talked about it twice.

00:11:16.740 --> 00:11:17.220
Yeah.

00:11:17.220 --> 00:11:23.020
We try really hard to always be open because the community, sometimes we don't have the data that supports the thing.

00:11:23.080 --> 00:11:24.660
But let me tell everyone what it is.

00:11:24.720 --> 00:11:37.560
So if you've been hiding under a rock, the OWASP top 10 is an awareness document of the top 10 risks to web applications, according to the data we gathered and multiple community surveys.

00:11:37.560 --> 00:11:42.940
Most of them relate to all software, but technically this one is about web applications.

00:11:43.400 --> 00:11:51.540
And you might not realize, but underneath the next steps, there's three more secret items because we couldn't decide, Michael.

00:11:52.060 --> 00:11:54.580
And Vibe coding needed to be on there.

00:11:54.580 --> 00:11:57.880
And then there was sort of a tie for the number 10.

00:11:58.240 --> 00:12:00.420
So we want to include the one that was the tie.

00:12:00.840 --> 00:12:05.560
And then we felt memory safety is still so unbelievably critical.

00:12:05.780 --> 00:12:07.620
We had to comment on that as well.

00:12:07.920 --> 00:12:10.560
So we had to talk about those as well.

00:12:10.760 --> 00:12:11.960
So that was good.

00:12:12.060 --> 00:12:16.640
So it's the top 10, but these are three other things for your consideration that are very important.

00:12:16.920 --> 00:12:17.780
I did not realize those.

00:12:17.880 --> 00:12:18.300
How interesting.

00:12:18.300 --> 00:12:32.420
I do think whenever you release your next version, in two, three, four years, whatever, there's going to be a strong AI bent, and we're going to get into the AI angle of security in this conversation, which is going to be really fun.

00:12:32.740 --> 00:12:35.120
But yeah, I think we're just beginning this.

00:12:35.260 --> 00:12:38.120
Like you called it out with Vibe coding, but there's more to that even.

00:12:38.380 --> 00:12:42.400
Making the top 10 is complicated because we have to do it based on data.

00:12:43.140 --> 00:12:54.560
And the data I would like would be the postmortems from security incidents and would be, you know, the AppSec team telling us the things that are happening, not what we actually

00:12:54.580 --> 00:13:00.060
get, which is a bunch of SAST vendors and DAST vendors telling us what their automated tools are capable of finding.

00:13:00.600 --> 00:13:08.800
And then a bunch of boutique pen tester companies who are so generous to give us their reports and to try to normalize that for us.

00:13:09.220 --> 00:13:13.480
And like we end up with millions and millions and millions of records and that's great.

00:13:13.480 --> 00:13:22.160
But if a SAST tool is really good at finding X, that doesn't necessarily mean that X is the biggest problem in our industry.

00:13:22.400 --> 00:13:31.480
And so when we put supply chain security on there and expanded it from just being libraries, like, oh, you're using outdated or vulnerable components.

00:13:31.940 --> 00:13:33.420
Like, yeah, that's bad.

00:13:33.680 --> 00:13:35.520
But there's also malicious components.

00:13:35.520 --> 00:13:45.200
There's also: you didn't lock down your CI, and then the co-op student, the co-op, or you'd call them interns, put too many zeros on the Kubernetes deployment.

00:13:45.200 --> 00:13:50.280
And then you've got a $30,000 bill you weren't expecting. Like, I've seen it, right?

00:13:50.400 --> 00:13:54.480
I feel like it's hard to get data that tells the full picture.

00:13:54.480 --> 00:13:58.080
And we're not allowed to just say, well, this is what the team thinks, right?

00:13:58.100 --> 00:13:59.820
So that's where the surveys come in.

00:13:59.940 --> 00:14:02.100
And we're like, this is what we see.

00:14:02.160 --> 00:14:02.880
This is what we think.

00:14:02.920 --> 00:14:03.420
Do you agree?

00:14:03.540 --> 00:14:07.840
And like across the board, everyone was like, supply chain must be on that list.

00:14:07.900 --> 00:14:08.780
You must expand it.

00:14:08.860 --> 00:14:09.240
We agree.

00:14:09.400 --> 00:14:12.800
I think it's the biggest issue of the year, at least of the last six months.

00:14:12.800 --> 00:14:23.780
If we look at reports like the Verizon breach report and the CrowdStrike report, the big reports covering many, many breaches, if we look at the big ones, the nation-state ones,

00:14:24.460 --> 00:14:30.880
they're supply chain attacks, or in my opinion, if we really look at it and think about it, they exploited the developer.

00:14:31.120 --> 00:14:33.960
They compromised a developer within an organization.

00:14:33.960 --> 00:14:37.480
It gave them access to multiple parts of the supply chain.

00:14:37.700 --> 00:14:40.040
And then they owned the entire organization.

00:14:40.040 --> 00:14:46.040
So if you get SQL injection in one app, you got into one database and maybe you could read sensitive data.

00:14:46.620 --> 00:14:48.700
Maybe you could delete sensitive data.
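The SQL injection she's describing, a string-built query versus a parameterized one, can be shown in a few lines with Python's built-in sqlite3 (the table and data here are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'hunter2')")

user_input = "' OR '1'='1"  # classic injection payload

# Vulnerable: interpolating input lets the payload rewrite the WHERE clause
leaked = conn.execute(
    f"SELECT secret FROM users WHERE name = '{user_input}'"
).fetchall()
print(leaked)  # → [('hunter2',)] — every row matches, so data leaks

# Safe: a parameterized query treats the payload as a plain string value
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (user_input,)
).fetchall()
print(safe)    # → [] — no user is literally named "' OR '1'='1"
```

The fix is always the same shape: never splice user input into SQL text; hand it to the driver as a parameter.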

00:14:49.080 --> 00:14:56.420
If that database was completely unpatched in a total like terrible mess, then maybe you could take over that server.

00:14:56.800 --> 00:15:06.040
Then if your network's totally not secure and crappy, which is not exactly uncommon, like, I'll look at, you know, network diagrams with clients and I'm like, oh, that line, is that a firewall?

00:15:06.180 --> 00:15:07.600
They're like, it's more aspirational.

00:15:07.840 --> 00:15:09.620
I've heard that many times, right?

00:15:09.620 --> 00:15:17.900
Then maybe they could pivot and get to a couple of places, but that's like, maybe, maybe, maybe, maybe, maybe you get a little bit, but you compromise a senior developer.

00:15:18.520 --> 00:15:18.620
Right.

00:15:18.700 --> 00:15:25.040
And so, yeah, I was really glad when the team agreed that we would do this and then the community supported it.

00:15:25.100 --> 00:15:26.520
So I was like, yes, win.

00:15:26.920 --> 00:15:26.940
Yeah.

00:15:27.000 --> 00:15:29.200
You compromise a developer, then you get.

00:15:29.400 --> 00:15:30.300
Especially a senior.

00:15:30.600 --> 00:15:31.120
Yeah, exactly.

00:15:31.240 --> 00:15:36.160
You get arbitrary code execution on potentially all the stuff that they send out to the world.

00:15:36.160 --> 00:15:37.240
It's, it's really bad.

00:15:37.320 --> 00:15:40.220
You know, an example of that would be the LastPass breach.

00:15:40.360 --> 00:15:40.540
Yeah.

00:15:40.600 --> 00:15:53.060
And the way that all started, from what I understand, is one of the developers had an outdated version of Plex, the streaming media server, on his home network that was open on the internet.

00:15:53.060 --> 00:15:54.100
That got taken over.

00:15:54.200 --> 00:15:57.760
Then they got into the dev machine and then they got everybody's password vaults.

00:15:57.760 --> 00:16:03.360
It's like, excuse me, because you had a movie player on your home automation network.

00:16:03.500 --> 00:16:04.280
That's crazy.

00:16:04.480 --> 00:16:10.640
There's been some recent hacks where the developers download like a plugin or something and it's malicious.

00:16:10.640 --> 00:16:17.160
And then not only is it trying to steal like secrets off their computer, but then it robs their crypto wallets.

00:16:17.160 --> 00:16:21.160
Because why don't you just kick someone when they're down like jerks?

00:16:21.300 --> 00:16:21.960
It's terrible.

00:16:22.320 --> 00:16:23.460
Well, we'll circle back to that.

00:16:23.460 --> 00:16:29.040
But I do think the vibe coding side is going to be a big deal in the future and you'll have a little bit of a challenge there.

00:16:29.120 --> 00:16:33.900
I think obviously professional developers love to bag on vibe coding.

00:16:34.020 --> 00:16:35.560
And I think that that's totally fair.

00:16:35.880 --> 00:16:39.460
But the problem is, I think a lot of that is kind of dark matter.

00:16:39.800 --> 00:16:42.960
Like you'll never see those people, the people who do vibe coding.

00:16:43.040 --> 00:16:45.400
They don't know to go and fill out a survey for OWASP.

00:16:45.660 --> 00:16:47.700
They don't even know what a line of code looks like.

00:16:47.740 --> 00:16:49.160
They're just make me happy.

00:16:49.300 --> 00:16:50.200
You know, it's crazy.

00:16:50.200 --> 00:17:04.240
And now some companies are building dark factories, which is a term that was new for me, which is where they replace their entire software development team only with complete AI end to end solutions where there's no human in the loop whatsoever.

00:17:05.260 --> 00:17:12.940
And oh my gosh, Michael, imagine the security posture of what's being released there and the fact that they don't know what the posture is, right?

00:17:13.220 --> 00:17:16.860
But they're getting to market faster than someone else.

00:17:16.860 --> 00:17:16.980
Yes.

00:17:17.360 --> 00:17:22.980
And so consumers don't know that what they're buying has been put together with duct tape and glue.

00:17:23.500 --> 00:17:27.120
So I actually, we didn't talk about this beforehand, but I hope it's okay to mention.

00:17:27.440 --> 00:17:30.260
So I'm trying to push a secure coding law in Canada.

00:17:30.540 --> 00:17:34.100
So Canada is cute and quaint, just like all the stereotypes.

00:17:34.420 --> 00:17:39.800
And a citizen is allowed to create a petition if she could get a member of parliament to support her.

00:17:39.920 --> 00:17:42.160
And after three years of letter writing, one of them did.

00:17:42.440 --> 00:17:42.540
Wow.

00:17:42.880 --> 00:17:43.240
Congratulations.

00:17:43.640 --> 00:17:44.040
Thank you.

00:17:44.040 --> 00:17:47.980
It's in the House of Commons, and I have enough signatures, so it's going to go to a vote.

00:17:47.980 --> 00:17:54.240
And so right now I'm lobbying all the public to try to get them to call their member of parliament and ask them to vote.

00:17:54.320 --> 00:17:54.500
Yes.

00:17:54.520 --> 00:18:00.360
Because what happens is my member of parliament will be like, hey, petition e-7115 says this.

00:18:01.000 --> 00:18:08.960
We should create a secure coding law for all governmental organizations, have a standard, and then, you know, assure the standard, make sure there's compliance.

00:18:09.400 --> 00:18:12.080
And I also like wrote the standard for them and sent it to them.

00:18:12.080 --> 00:18:15.700
Because that's what I'm like. I'm like, you don't have to use it, but here it is in case you want one.

00:18:16.300 --> 00:18:17.640
I've been writing letters for years.

00:18:17.640 --> 00:18:18.320
I'm very annoying.

00:18:18.720 --> 00:18:19.060
Guess what?

00:18:19.140 --> 00:18:21.320
Members of parliament don't know what the word cyber means.

00:18:21.420 --> 00:18:22.300
Like they're very smart.

00:18:22.520 --> 00:18:23.660
I'm not trying to mock them.

00:18:23.760 --> 00:18:25.640
I'm not an expert in what they do either.

00:18:25.800 --> 00:18:26.060
Right.

00:18:26.480 --> 00:18:28.000
And so I need members.

00:18:28.000 --> 00:18:33.400
So Canadians, if you're listening, go look up petition e-7115.

00:18:33.500 --> 00:18:36.420
You'll find it. Sign it, and then call or write your member of parliament.

00:18:36.560 --> 00:18:44.980
So if enough of us call, if like 10 or 20, 30 people call the member of parliament or they receive emails, Michael, when the petition comes up, they're like, Oh yeah, my constituents care.

00:18:45.400 --> 00:18:46.000
Therefore I do.

00:18:46.020 --> 00:18:48.800
And they'll raise their hand and I get one chance.

00:18:49.340 --> 00:18:54.720
And I really want a lot of hands going up specifically at least half of 334 people.

00:18:54.900 --> 00:18:55.360
Good luck with that.

00:18:55.360 --> 00:18:56.420
I hope that that goes through.

00:18:56.480 --> 00:18:56.800
That's cool.

00:18:56.880 --> 00:18:57.980
We'll know by June.

00:18:58.220 --> 00:18:58.480
Yeah.

00:18:58.500 --> 00:19:01.660
That's the challenge with legislatures and government in general.

00:19:01.780 --> 00:19:08.300
So many of the people, especially elected officials, they're not elected because they're developers or security specialists or whatever.

00:19:08.440 --> 00:19:08.680
Right.

00:19:08.840 --> 00:19:13.060
But the difference is that you said it's fine because you don't know what they do.

00:19:13.180 --> 00:19:14.640
You're not an expert in law or whatever.

00:19:14.760 --> 00:19:15.660
That's that is true.

00:19:15.660 --> 00:19:23.220
But they have to choose how technology is going to work for us, whereas you don't have to choose how law works for them as a technologist.

00:19:23.280 --> 00:19:23.720
You know what I mean?

00:19:23.720 --> 00:19:25.320
Like it's they decide.

00:19:25.460 --> 00:19:29.900
So it's I'm not saying that it's their fault or anything, but it is a very tricky thing to balance.

00:19:30.180 --> 00:19:30.380
It is.

00:19:30.460 --> 00:19:37.680
And this is why I, as like an influential person or whatever, am trying to use my influence for good.

00:19:37.740 --> 00:19:39.700
And I'm trying to protect Canada.

00:19:39.820 --> 00:19:46.020
And here's the thing, Michael, is that if Canada creates a law that does this, that is huge momentum for every other country.

00:19:46.020 --> 00:19:50.180
And Canada was one of the first countries to have privacy laws.

00:19:50.180 --> 00:19:51.480
Like we really led the way in that.

00:19:51.480 --> 00:19:56.260
We really have led the way in like laws for quantum as well.

00:19:56.580 --> 00:20:04.100
And like, we're not really used to leading; we're just, you know, struggling along, and we could lead the way in this.

00:20:04.100 --> 00:20:04.320
Right.

00:20:04.320 --> 00:20:07.440
And then that means other countries can say, well, they have one.

00:20:07.440 --> 00:20:09.460
Like, do we really want to be behind Canada?

00:20:09.460 --> 00:20:10.540
I mean, come on.

00:20:10.540 --> 00:20:11.300
We love Canada.

00:20:11.300 --> 00:20:11.860
Canada is awesome.

00:20:11.860 --> 00:20:13.060
Yeah, we're sweet and we're wonderful.

00:20:13.060 --> 00:20:14.740
And we mean well all the time.

00:20:16.020 --> 00:20:18.580
This portion of Talk Python To Me is sponsored by Temporal.

00:20:18.580 --> 00:20:25.460
Ever since I had Mason Egger on the podcast for episode 515, I've been fascinated with durable workflows in Python.

00:20:25.940 --> 00:20:30.620
That's why I'm thrilled that Temporal has decided to become a podcast sponsor since that episode.

00:20:30.960 --> 00:20:39.200
If you've built background jobs or multi-step workflows, you know how messy things get with retries, timeouts, partial failures, and keeping state consistent.

00:20:39.820 --> 00:20:45.000
I'm sure many of you have written brutal code to keep the workflow moving and to track when you run into problems.

00:20:45.000 --> 00:20:46.360
But it's trickier than that.

00:20:46.360 --> 00:20:51.560
What if you have a long-running workflow and you need to redeploy the app or restart the server while it's running?

00:20:52.120 --> 00:20:55.460
This is where Temporal's open source framework is a game changer.

00:20:56.140 --> 00:21:10.400
You write workflows as normal Python code and Temporal ensures that they execute reliably, even across crashes, restarts, or long-running processes while handling retries, states, and orchestrations for you so you don't have to build and maintain that logic yourself.

00:21:10.400 --> 00:21:16.340
You may be familiar with writing asynchronous code using the async and await keywords in Python.

00:21:16.840 --> 00:21:25.100
Temporal's brilliant programming model leverages the exact same programming model that you are familiar with but uses it for durability, not just concurrency.

00:21:25.840 --> 00:21:28.080
Imagine writing await workflow.sleep.

00:21:28.080 --> 00:21:29.960
timedelta, 30 days.

00:21:30.300 --> 00:21:31.020
Yes, seriously.

00:21:31.300 --> 00:21:32.220
Sleep for 30 days.

00:21:32.380 --> 00:21:33.040
Restart the server.

00:21:33.280 --> 00:21:34.280
Deploy new versions of the app.

00:21:34.520 --> 00:21:34.940
That's it.

00:21:35.140 --> 00:21:36.280
Temporal takes care of the rest.

00:21:36.820 --> 00:21:41.320
Temporal is used by teams at Netflix, Snap, and NVIDIA for critical production systems.

00:21:41.440 --> 00:21:44.560
Get started with the open source Python SDK today.

00:21:44.860 --> 00:21:47.280
Learn more at talkpython.fm/Temporal.

00:21:47.560 --> 00:21:49.600
The link is in your podcast player's show notes.

00:21:50.000 --> 00:21:52.020
Thank you to Temporal for supporting the show.

00:21:52.020 --> 00:22:03.500
Let's shift just a little bit to maybe people who don't necessarily mean well, the people who might exploit, you know, broken access control or other types of things.

00:22:03.720 --> 00:22:13.660
And to aid us here, going through the OWASP top 10, I've come up with a little example of, well, this is what some of the concrete examples might look like.

00:22:13.760 --> 00:22:16.100
Maybe I'll put a link to this and I'll reference them.

00:22:16.220 --> 00:22:18.520
And I don't know how often I'll totally use this.

00:22:18.660 --> 00:22:20.700
But let's start with this one.

00:22:20.700 --> 00:22:23.540
And if I understand it correctly, you all don't like suspense.

00:22:24.060 --> 00:22:27.420
It's the worst comes first, then the second worst, and then the third worst.

00:22:27.620 --> 00:22:29.560
Yeah, we go in order of this.

00:22:29.920 --> 00:22:32.720
So this is, it causes lots and lots of damage.

00:22:33.060 --> 00:22:35.220
It's not that hard to find or exploit.

00:22:35.540 --> 00:22:37.080
And it's everywhere.

00:22:37.400 --> 00:22:38.000
It's everywhere.

00:22:38.380 --> 00:22:43.060
When I talk to people that do pen testing, they're like, yeah, I find this basically every time.

00:22:43.680 --> 00:22:51.060
Like one of my friends, Katie Paxton-Fear, she does API content and pen testing and bug bounty and stuff.

00:22:51.060 --> 00:22:58.540
And she said, I have never not once found broken access control in an API, like never once.

00:22:59.160 --> 00:23:08.320
And it's really hard to get right, Michael, because every single page, every single record, every single access, we should check that the role is allowed.

00:23:08.320 --> 00:23:11.180
And that they still are that role, right?

00:23:11.240 --> 00:23:15.000
So we continue to make sure the session's accurate and then grant access.

00:23:15.000 --> 00:23:18.120
And unfortunately, we just forget a spot.

00:23:18.420 --> 00:23:23.260
Or we return the entire record set, and we'll just sort it out on the front end.

00:23:23.340 --> 00:23:25.440
And the malicious actor is like, thanks for the data set.

00:23:26.680 --> 00:23:28.680
That's so sweet of you to give that to me.

00:23:29.220 --> 00:23:32.200
And we just screwed up so much, so much.

00:23:32.400 --> 00:23:35.440
I would like to point out that these are not a single vulnerability.

00:23:35.600 --> 00:23:38.020
It's not like this is the number one issue.

00:23:38.220 --> 00:23:39.740
They're like categories, right?

00:23:39.840 --> 00:23:49.740
It's like violations of these could be you didn't use least privilege, or you bypass a control check, or you didn't put access control on the delete part of the API.

00:23:49.900 --> 00:23:52.360
You know, like there's a bunch of things that fall into each one of these, right?

00:23:52.400 --> 00:23:52.940
It's a category.

00:23:52.940 --> 00:23:54.340
They're all a bucket.

00:23:55.040 --> 00:24:00.140
And when we looked at the data, poor code quality was a bucket.

00:24:00.260 --> 00:24:03.360
I'm like, no, because all of the things go into that bucket, right?

00:24:03.500 --> 00:24:04.360
The OWASP top one.

00:24:04.500 --> 00:24:05.240
OWASP top one.

00:24:05.340 --> 00:24:09.900
And then also the mitigation advice is, what if you sucked less, right?

00:24:10.000 --> 00:24:12.820
Like there's no constructive feedback to poor code quality.

00:24:12.940 --> 00:24:14.020
It's not specific enough.

00:24:14.180 --> 00:24:16.220
So we didn't want buckets like that.

00:24:16.280 --> 00:24:17.260
Keep it actionable, right?

00:24:17.380 --> 00:24:17.900
Yeah, exactly.

00:24:18.040 --> 00:24:21.420
If it's not actionable, it's not worth raising awareness about, we felt.

00:24:21.420 --> 00:24:24.160
And broken access control, Michael, it's everywhere.

00:24:24.360 --> 00:24:28.100
And I wish there was like a product that you could buy that could just do this for you.

00:24:28.160 --> 00:24:34.960
So you can buy authentication products that manage session and identity like really, really, really well, right?

00:24:35.000 --> 00:24:38.180
And people buy the crap out of them because they work really well.

00:24:38.320 --> 00:24:48.120
I'd like to be able to buy an access control tool that was as easy to implement as, I'm going to try not to name brands, but you know the products, right?

00:24:48.120 --> 00:24:57.100
People pay a lot of money for Okta, you know, or like Active Directory, you know, Cognito, because they work, right?

00:24:57.140 --> 00:24:58.020
And they work well.

00:24:58.240 --> 00:25:04.080
And the less painful they are to implement, the more likely, like they're willing to pay more for that, right?

00:25:04.140 --> 00:25:08.600
And so if we could solve this issue, like I think that could be pretty huge.

00:25:08.940 --> 00:25:09.000
Yeah.

00:25:09.020 --> 00:25:12.300
I got a couple of examples here; some are not obvious.

00:25:12.660 --> 00:25:15.220
Some are obvious, kind of like, here's a Django example.

00:25:15.220 --> 00:25:16.660
We might see this later.

00:25:16.960 --> 00:25:25.160
So you might have an admin endpoint and it has @login_required, which is a decorator in Django that will do the validation before the function even runs.

00:25:25.500 --> 00:25:26.920
And so you look at it like, oh yeah, this is fine.

00:25:27.000 --> 00:25:30.460
It's using authentication here, but it's not using authorization.

00:25:30.460 --> 00:25:33.620
It's not checking that the person necessarily is an admin.

00:25:33.780 --> 00:25:35.420
It's just that they're logged in, right?

00:25:35.780 --> 00:25:37.260
Like that's a real simple example.
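That authentication-versus-authorization gap can be sketched without any framework at all. This is a minimal, framework-free illustration of the idea behind Django's @login_required; every name here (User, role_required, delete_account) is made up for the example:

```python
# Framework-free sketch of authentication vs authorization.
# The decorator names mirror Django's @login_required idea; all names are illustrative.
from functools import wraps

class User:
    def __init__(self, name, is_authenticated=False, role="user"):
        self.name, self.is_authenticated, self.role = name, is_authenticated, role

def login_required(view):
    @wraps(view)
    def wrapper(user, *args):
        if not user.is_authenticated:      # authentication: who are you?
            return "401 login required"
        return view(user, *args)
    return wrapper

def role_required(role):
    def deco(view):
        @wraps(view)
        def wrapper(user, *args):
            if user.role != role:          # authorization: what may you do?
                return "403 forbidden"
            return view(user, *args)
        return wrapper
    return deco

@login_required                            # broken: ANY logged-in user can delete
def delete_account_broken(user, target):
    return f"deleted {target}"

@login_required
@role_required("admin")                    # fixed: logged in AND an admin
def delete_account(user, target):
    return f"deleted {target}"

alice = User("alice", is_authenticated=True, role="user")
print(delete_account_broken(alice, "bob"))  # deleted bob  <- the bug
print(delete_account(alice, "bob"))         # 403 forbidden
```

The key point: the check has to happen on every request, for every endpoint, including the delete ones.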

00:25:37.540 --> 00:25:37.800
Yep.

00:25:38.060 --> 00:25:39.260
And I see the problem.

00:25:39.400 --> 00:25:43.980
Another one would be if you say we're going to let people read and write files.

00:25:43.980 --> 00:25:46.320
Maybe it's like a WordPress type thing or something.

00:25:46.560 --> 00:25:56.920
But then if they can put dot, dot in their path and break out, let me read the file dot, dot, slash, dot, dot, slash, et cetera, dot password and read the passwords or whatever, or usernames.

00:25:57.100 --> 00:26:01.300
This is the type of stuff that falls into broken access control, or just no login checks at all.
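The dot-dot breakout can be sketched in a few lines of Python. The base directory and file names are illustrative; a real implementation should also resolve the base path and handle symlinks:

```python
# Sketch of a "../" path traversal and one way to block it.
from pathlib import Path

BASE = Path("/srv/app/uploads")  # illustrative upload directory

def read_file_broken(name: str) -> Path:
    # Naive join: "../../../etc/passwd" walks right out of the uploads dir.
    return (BASE / name).resolve()

def read_file_safe(name: str) -> Path:
    candidate = (BASE / name).resolve()
    # Access control on the path itself: the result must stay inside BASE.
    if not candidate.is_relative_to(BASE):
        raise PermissionError(f"path escapes upload dir: {name}")
    return candidate

print(read_file_broken("../../../etc/passwd"))  # /etc/passwd -- oops
print(read_file_safe("report.pdf"))             # /srv/app/uploads/report.pdf
```

`Path.is_relative_to` needs Python 3.9 or later; on older versions you'd compare against `os.path.commonpath` instead.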

00:26:01.480 --> 00:26:05.460
I literally did this yesterday, Michael, because someone was like, hey, go get this file from there.

00:26:05.560 --> 00:26:07.400
And then I go in the folder and it's not there.

00:26:07.600 --> 00:26:08.840
Like the link wasn't correct.

00:26:08.900 --> 00:26:11.520
So I just went through the web directory with that.

00:26:11.520 --> 00:26:14.360
Like, but they wanted to send me the file.

00:26:14.500 --> 00:26:17.860
So just to be clear, like, they sent me the link and told me to go get it.

00:26:18.060 --> 00:26:19.600
I wasn't stealing anything.

00:26:19.660 --> 00:26:22.060
And I didn't end up eventually finding it either.

00:26:22.540 --> 00:26:25.300
So then they had to send me another link that was correct.

00:26:25.400 --> 00:26:27.180
But I was like, oh, I'll just like not waste their time.

00:26:27.220 --> 00:26:29.840
I'll just go look for myself because it's that easy.

00:26:30.260 --> 00:26:30.420
Incredible.

00:26:30.600 --> 00:26:33.780
It's like, I'm going to use their tools, but not in a way that they necessarily expected.

00:26:33.920 --> 00:26:35.700
Well, I mean, they could have just sent me the right link.

00:26:35.800 --> 00:26:36.120
Exactly.

00:26:36.240 --> 00:26:37.900
They should have just sent you the right link.

00:26:37.900 --> 00:26:45.640
Yeah, I have some stories I cannot recount here, but my dad, I've got to help take care of him and stuff these days.

00:26:45.640 --> 00:26:48.140
And he can't do a lot of his own paperwork and things.

00:26:48.220 --> 00:26:54.560
And so I've had to do some crazy stuff to get access to, or to help him fill out something that, yeah, it's insane.

00:26:55.060 --> 00:26:57.560
OK, let's go on to number two.

00:26:57.960 --> 00:27:02.840
This is just setting the wrong configurations, not following the hardening guide, not doing patching.

00:27:02.940 --> 00:27:06.460
And this is so easy for a malicious actor to find because there's scanners.

00:27:06.460 --> 00:27:12.860
I joke whenever anyone's like, oh, like, you know, we don't want to do a pen test because it might break our thing.

00:27:12.940 --> 00:27:15.660
And I'm like, well, you're actually having a penetration test done all the time.

00:27:15.780 --> 00:27:18.040
If you're on the Internet, you just aren't receiving the report.

00:27:18.380 --> 00:27:21.520
That is absolutely so true and so disturbing.

00:27:21.720 --> 00:27:34.520
If you're out there listening and you have something on the Internet, API, website, whatever, and you have not tailed its log: you'll just see /wp/admin/this/that coming at it left and right.

00:27:34.520 --> 00:27:36.260
You're like, what is going on?

00:27:36.300 --> 00:27:37.960
Because it doesn't show up in your analytics.

00:27:37.960 --> 00:27:49.400
I actually had someone find something on my website that I was surprised about because I had a user from my blog from long ago and they could see my user and then it gave my

00:27:49.400 --> 00:27:57.460
email address, and it was actually a personal email address, because I had a backup admin account and I used my personal email. And he's like, did you want that on there?

00:27:57.460 --> 00:28:01.340
I'm like, no, only my mom and my dad email me there because I'm bad.

00:28:01.340 --> 00:28:05.420
And if my parents write me, I should write back.

00:28:06.060 --> 00:28:06.360
Right.

00:28:07.160 --> 00:28:11.560
And so I'm like, oh, that's the personal email that I'm supposed to answer on time.

00:28:12.040 --> 00:28:19.920
He wrote me and helped me like turn off that setting that I had no idea about despite having multiple security plugins and having run an audit.

00:28:20.020 --> 00:28:20.740
I'd missed that.

00:28:20.740 --> 00:28:24.680
This one, for Python people, could involve like Django with DEBUG equals True.

00:28:24.820 --> 00:28:30.680
Now this is the most, probably most used misconfiguration example for Django apps out there.

00:28:30.740 --> 00:28:34.160
You're like, well, of course, Michael, I know you don't set debug true in production.

00:28:34.400 --> 00:28:35.000
You do it all the time.

00:28:35.000 --> 00:28:35.720
People do it.

00:28:35.840 --> 00:28:36.220
You're right.

00:28:36.320 --> 00:28:42.480
And then too, there's like 10 other settings that should be set in production that are not, in Django.

00:28:42.720 --> 00:28:43.980
Like HSTS.

00:28:44.440 --> 00:28:45.260
Yes, please.

00:28:45.420 --> 00:28:53.520
And a bunch of other, you know, do not allow me to be put into an iframe, and all sorts of other things, content security policies and security headers.
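The settings being talked about here look something like this in a Django settings.py. This is a sketch, not a complete hardening guide, and the exact values depend on your site:

```python
# Illustrative Django production settings (settings.py).
DEBUG = False                          # never True in production

SECURE_SSL_REDIRECT = True             # force HTTPS
SECURE_HSTS_SECONDS = 31536000         # HSTS: tell browsers HTTPS-only for a year
SECURE_HSTS_INCLUDE_SUBDOMAINS = True
SECURE_HSTS_PRELOAD = True

SESSION_COOKIE_SECURE = True           # cookies only over HTTPS
CSRF_COOKIE_SECURE = True

X_FRAME_OPTIONS = "DENY"               # don't allow framing (clickjacking)
SECURE_CONTENT_TYPE_NOSNIFF = True     # X-Content-Type-Options: nosniff
```

Django will actually nag you about most of these if you run `python manage.py check --deploy` against your settings.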

00:28:53.800 --> 00:28:54.620
Yes, exactly.

00:28:54.880 --> 00:28:56.380
This is what happened with Claude Code.

00:28:56.560 --> 00:29:09.580
And this is how they essentially allowed debug in production, like by not suppressing their map file and then also not having it as part of their .gitignore, which is essentially having debug mode in prod.

00:29:09.840 --> 00:29:11.560
And that's how they lost their source code.

00:29:11.880 --> 00:29:14.360
Just to be clear, I'm not shaming the developer that did that.

00:29:14.580 --> 00:29:17.260
They probably didn't have a checklist for that person.

00:29:17.260 --> 00:29:21.560
They probably didn't have anything that scanned to tell them that those settings were incorrect.

00:29:21.920 --> 00:29:22.320
Right.

00:29:22.780 --> 00:29:25.020
They probably don't even have a policy that clarifies.

00:29:25.120 --> 00:29:26.900
They're like, oh, they should just know not to do that.

00:29:27.160 --> 00:29:27.300
Right.

00:29:27.340 --> 00:29:28.580
And then they were rushed.

00:29:28.660 --> 00:29:30.020
Probably they're in a hurry.

00:29:30.460 --> 00:29:30.620
And then.

00:29:31.000 --> 00:29:31.120
Yeah.

00:29:31.140 --> 00:29:34.420
They're shipping three or four times a day, which I appreciate, but at the same time.

00:29:34.660 --> 00:29:34.900
Yeah.

00:29:34.960 --> 00:29:37.700
And then now very bad things are happening.

00:29:38.060 --> 00:29:41.120
It is going to be the most audited code that has ever happened.

00:29:41.280 --> 00:29:41.560
Michael.

00:29:41.560 --> 00:29:45.040
I've seen so many videos parsing that stuff apart.

00:29:45.160 --> 00:29:45.660
It's wild.

00:29:45.980 --> 00:29:46.380
Yeah.

00:29:46.420 --> 00:29:54.660
For people who don't know, I'm sure people heard Claude Code got leaked, but basically the map file in JavaScript says, here's the minified version.

00:29:54.740 --> 00:29:59.100
But if you want to show the full source version for helpful debugging, here it is.

00:29:59.180 --> 00:30:00.300
And there's how you get to it.

00:30:00.320 --> 00:30:01.380
And that apparently got shipped.

00:30:01.720 --> 00:30:02.880
And people were just like, you know what?

00:30:02.960 --> 00:30:04.900
Why don't we find out what those files are actually?

00:30:04.900 --> 00:30:13.740
And it was two security misconfigurations, because one of them would have stopped it from being served, and the other one would have kept it out of the package in the first place.

00:30:14.080 --> 00:30:14.200
Right.

00:30:14.280 --> 00:30:16.020
And it's number two.

00:30:16.160 --> 00:30:23.180
And it happens even to the really, really, really, really, really high profile, you know, high security assurance requiring places.

00:30:23.640 --> 00:30:28.800
I have a third one on this list that I think will really surprise people like for real.

00:30:29.160 --> 00:30:30.280
So imagine this.

00:30:30.280 --> 00:30:35.220
I've got a self-hosted app and I'm going to run it in Docker.

00:30:35.600 --> 00:30:36.560
You know, it could be Kubernetes.

00:30:36.700 --> 00:30:37.460
It could be whatever.

00:30:37.600 --> 00:30:40.360
But I'm going to run it in Docker on my server.

00:30:40.820 --> 00:30:44.440
And it has both a web interface and a database.

00:30:44.640 --> 00:30:47.800
And the database is running on the default port and so on.

00:30:47.860 --> 00:30:48.560
So that's all fine.

00:30:48.640 --> 00:30:51.080
But it's in Docker and everything's locked down.

00:30:51.160 --> 00:30:55.780
And so what you could do is you can use this thing called UFW, uncomplicated firewall on Linux.

00:30:55.980 --> 00:30:58.400
Turn that on and say block or don't block.

00:30:58.500 --> 00:31:00.180
Only allow my web port.

00:31:00.280 --> 00:31:08.580
And in your Docker Compose file, or even just plain docker statements, you often see: map, say, Postgres port 5432 to 5432.

00:31:09.200 --> 00:31:09.560
Guess what?

00:31:09.620 --> 00:31:13.840
That's actually open on the Internet, probably with the default password on that database.

00:31:13.840 --> 00:31:21.540
Because if you look at the Docker docs and you go to the bottom, it says Uncomplicated Firewall is a frontend that ships with Debian and Ubuntu and lets you manage firewall rules.

00:31:21.960 --> 00:31:25.820
Docker and UFW use firewall rules in ways that make them incompatible.

00:31:26.020 --> 00:31:30.020
When you publish a container's ports, the Docker traffic gets diverted before those UFW rules apply.

00:31:30.020 --> 00:31:30.660
So guess what?

00:31:30.700 --> 00:31:31.840
That's open on the Internet.

00:31:32.280 --> 00:31:33.600
Holy smokes.

00:31:33.860 --> 00:31:39.400
It's so common to just see this port on the container mapped to this port on the server.

00:31:39.640 --> 00:31:44.700
And if you're thinking that UFW on your firewall is going to save you, it's actually just open on the Internet.

00:31:45.020 --> 00:31:46.020
Like I didn't realize that.

00:31:46.020 --> 00:31:47.600
And so, for example, what do you do?

00:31:47.660 --> 00:31:51.880
Well, you say localhost colon my port on the server.

00:31:52.140 --> 00:31:54.000
So then you're only listening locally.

00:31:54.160 --> 00:31:55.500
Or you just don't map the port at all.

00:31:55.580 --> 00:32:00.000
But that's a really subtle and sneaky one that people should be aware of.
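In docker-compose terms, the difference is one prefix on the port mapping. Service names and the image here are illustrative:

```yaml
# docker-compose.yml sketch -- service names and image are illustrative
services:
  db:
    image: postgres:16
    ports:
      - "5432:5432"              # BAD: binds 0.0.0.0 and bypasses UFW rules
      # - "127.0.0.1:5432:5432"  # better: only reachable from the host itself
    # Best of all: no "ports:" entry at all -- the web container can still
    # reach "db:5432" over the internal compose network without publishing
    # anything to the outside world.
```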

00:32:00.220 --> 00:32:02.380
The thing is no one can memorize all of this.

00:32:02.600 --> 00:32:03.840
And so what is the answer, Michael?

00:32:04.420 --> 00:32:04.780
Checklists?

00:32:05.080 --> 00:32:05.880
Checklists are good.

00:32:06.200 --> 00:32:06.440
Scanners?

00:32:06.800 --> 00:32:07.600
Scanners are good.

00:32:07.600 --> 00:32:13.020
Honestly, I think the modern top tier AI agentic tools are really good.

00:32:13.360 --> 00:32:16.160
They find a surprising amount of these things.

00:32:16.360 --> 00:32:23.540
They find them if you ask them to find them, or they make it part of the code that they give you when you just ask for it.

00:32:23.600 --> 00:32:24.860
Because people just say, I want the app.

00:32:24.920 --> 00:32:26.580
They don't say, I want a secure app necessarily.

00:32:26.980 --> 00:32:29.080
And well, it's more efficient to not worry about the security.

00:32:29.280 --> 00:32:30.080
We'll save you some tokens.

00:32:30.080 --> 00:32:32.960
Even if you just say, I want a secure app.

00:32:33.060 --> 00:32:36.880
So I gave a conference talk two weeks ago at RSA called Insecure Vibes.

00:32:37.300 --> 00:32:44.100
In the demo that I recorded in advance that was not part of the slides when I gave my live presentation, but it's on my YouTube.

00:32:44.580 --> 00:32:49.060
I just asked Claude, I'm like, can you make a login function that's for an insulin pump?

00:32:49.480 --> 00:32:51.880
So this is a medical device that needs to be really secure.

00:32:52.240 --> 00:32:53.220
And it does it.

00:32:53.300 --> 00:32:55.340
And then after I'm like, analyze it for vulnerabilities.

00:32:55.960 --> 00:32:59.240
And multiple AIs found critical vulnerabilities in it.

00:32:59.240 --> 00:33:00.700
So I asked for it to be secure.

00:33:00.860 --> 00:33:02.480
So you can't just say, I want to be secure.

00:33:02.720 --> 00:33:05.180
You have to say, and this is what secure means.

00:33:05.460 --> 00:33:08.260
I do think you can find a lot if you use the tools in the right way.

00:33:08.320 --> 00:33:09.700
But like you said, you've got to ask.

00:33:09.820 --> 00:33:11.560
And it's a proper step.

00:33:11.920 --> 00:33:13.480
Software supply chain failures.

00:33:13.580 --> 00:33:14.200
Number three.

00:33:14.420 --> 00:33:15.400
This is the expansion.

00:33:15.620 --> 00:33:20.800
So this used to be vulnerable and outdated components, which is part of your software supply chain.

00:33:20.880 --> 00:33:25.700
But Michael, I'm sure that I've told you this before, but for people that haven't heard me, blah, blah, blah, about it.

00:33:25.700 --> 00:33:30.920
Every single thing that you use to create and maintain your software is part of your supply chain.

00:33:31.320 --> 00:33:43.740
So that includes your browser, the plugins in the browser, the sandbox you created, your CI, all the settings in the CI, where you're getting your libraries and packages from, how you're getting them.

00:33:43.740 --> 00:33:46.940
So do they maintain integrity across the wire when you get them?

00:33:47.220 --> 00:33:48.860
Could it be that you got something else?

00:33:49.680 --> 00:33:55.880
Like every single thing that you're using to maintain and create is part of the supply chain.

00:33:56.020 --> 00:33:58.680
And so we need to protect the whole thing.

00:33:59.220 --> 00:34:04.200
And like we were saying earlier, developers themselves are becoming targets of malicious actors.

00:34:04.200 --> 00:34:12.660
We need to find ways to defend the developer themselves, protect them, make them safer doing their jobs, right?

00:34:12.720 --> 00:34:19.660
And help them find ways to secure the whole supply chain that's not too painful because they still need flexibility in order to be creative.

00:34:20.180 --> 00:34:23.960
So some Python things that you can do concretely here: pin your dependencies.

00:34:24.680 --> 00:34:28.300
You can use pip compile or you can use uv lock files.

00:34:28.440 --> 00:34:31.700
There's all sorts of things that are possible there.
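The pin-in-dev, deploy-exact workflow can be sketched with pip-tools; the package names and versions below are illustrative, not recommendations:

```
# requirements.in -- the loose, intent-level file you edit by hand
django>=5.0,<6

# requirements.txt -- generated by `pip-compile requirements.in`; exact pins
# that travel with the app so prod matches exactly what you tested
django==5.0.6
asgiref==3.8.1     # pulled in transitively, pinned too
sqlparse==0.5.0
```

uv's lock file (`uv lock`) gets you the same property: every environment installs exactly the resolved set, not whatever happens to be latest on deploy day.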

00:34:31.700 --> 00:34:37.660
And then you can also, I think the other side that we haven't mentioned, Tanya, is like known vulnerabilities in packages.

00:34:38.000 --> 00:34:50.660
I think a lot of people, I would say over 95% of the people that install libraries from PyPI, they don't even check whether or not there's a vulnerability in that package before they install it.

00:34:50.780 --> 00:34:51.700
I would like to see.

00:34:51.860 --> 00:34:56.100
So in 2022, the first company announced this idea of reachability.

00:34:56.580 --> 00:34:58.220
So let's say you want to do math.

00:34:58.300 --> 00:34:59.740
So you install a math library.

00:34:59.740 --> 00:35:02.540
We don't actually want to do all of math, right?

00:35:02.660 --> 00:35:07.760
We probably just want to do calculus, but maybe the vulnerability is in the statistics function, right?

00:35:07.880 --> 00:35:13.260
And so when your code calls all the calculus functions, you're like, woo, derivatives.

00:35:14.240 --> 00:35:18.480
You're not actually, there's no reachable path from your code to the vulnerability.

00:35:19.120 --> 00:35:24.000
Most of the time, that means it's not exploitable, except for if it's log4j, then you're just screwed.

00:35:24.200 --> 00:35:26.060
Just to be clear, you're just in trouble, right?

00:35:26.060 --> 00:35:31.400
But for most things, like 99.9% of the time, then you're fine if there's no reachability.

00:35:31.660 --> 00:35:39.720
And so software composition analysis tools, sometimes called supply chain security tools, when the marketing teams got a little out of hand.

00:35:39.980 --> 00:35:47.780
I feel like if you only do one of the 19 attack surfaces within the supply chain, you don't get to call yourself a supply chain tool, but I digress.

00:35:47.780 --> 00:36:17.760
I have strong feels.

00:36:17.760 --> 00:36:18.460
That's so overwhelming.

00:36:18.580 --> 00:36:19.800
I'm just not even going to look at it.

00:36:20.120 --> 00:36:24.780
I have a known vulnerability in one of the packages that I am shipping to production.

00:36:25.220 --> 00:36:28.180
I think it's a PDF package or something like that.

00:36:28.220 --> 00:36:28.760
I can't remember.

00:36:29.100 --> 00:36:33.580
And I scan all my builds with pip audit and it will fail the build.

00:36:33.660 --> 00:36:40.280
So I have to ignore it because it is a vulnerability when you call a path that I don't call and you're running on Windows.
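That ignore-with-a-reason workflow can be sketched as a CI step. The syntax here is GitHub Actions and the GHSA identifier is a placeholder, not a real advisory:

```yaml
# CI step sketch (GitHub Actions syntax); the GHSA ID is a placeholder.
- name: Audit dependencies
  run: |
    pip install pip-audit
    # Fail the build on any known vulnerability, except the one we
    # reviewed and accepted (a code path we never call on our platform).
    pip-audit --ignore-vuln GHSA-xxxx-xxxx-xxxx
```

The important part is that the exception is explicit and visible in the pipeline, rather than the audit being silently turned off.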

00:36:40.280 --> 00:36:42.320
And I'm trying to deploy it to Docker.

00:36:42.900 --> 00:36:44.400
And I'm not calling that path.

00:36:44.460 --> 00:36:50.100
I'm like, I understand it's a problem that could be an issue under some circumstances, but it doesn't apply here.

00:36:50.100 --> 00:36:51.200
And I just need to use it.

00:36:51.320 --> 00:36:54.340
And it's a Windows problem on my Docker Linux.

00:36:54.460 --> 00:36:55.940
I don't really care right now.

00:36:56.020 --> 00:36:56.880
I mean, it's fine.

00:36:56.980 --> 00:36:58.440
Until they fix it, I'll be okay.

00:36:58.440 --> 00:37:05.420
Well, and especially if you know what the problem is and you're not going to suddenly switch to Windows, why would you do that?

00:37:05.900 --> 00:37:09.360
So the tools are maturing, but they're not perfect.

00:37:09.800 --> 00:37:13.200
And lots of them are going at different speeds.

00:37:13.260 --> 00:37:13.900
We'll just say that.

00:37:13.900 --> 00:37:18.600
So I look forward to the day where there's reachability done on all of those things.

00:37:20.220 --> 00:37:22.860
This portion of Talk Python To Me is brought to you by us.

00:37:23.000 --> 00:37:30.620
I want to tell you about a course I put together that I'm really proud of, Agentic AI Programming for Python Developers.

00:37:31.260 --> 00:37:37.180
I know a lot of you have tried AI coding tools and come away thinking, well, this is more hassle than it's worth.

00:37:37.540 --> 00:37:40.820
And honestly, all the vibe coding hype isn't helping.

00:37:40.820 --> 00:37:44.300
It's a smokescreen that hides what these tools can actually do.

00:37:44.920 --> 00:37:56.720
This course is about agentic engineering, applying real software engineering practices with AI that understands your entire code base, runs your tests, and builds complete features under your direction.

00:37:57.360 --> 00:38:03.580
I've used these techniques to ship real production code across Talk Python, Python Bytes, and completely new projects.

00:38:04.020 --> 00:38:10.160
I migrated an entire CSS framework on a production site with thousands of lines of HTML in a few hours.

00:38:10.160 --> 00:38:10.760
Twice.

00:38:11.120 --> 00:38:15.540
I shipped a new search feature with caching and async in under an hour.

00:38:15.980 --> 00:38:24.080
I built a complete CLI tool for Talk Python from scratch, tested, documented, and published to PyPI in an afternoon.

00:38:24.580 --> 00:38:28.560
Real projects, real production code, both Greenfield and legacy.

00:38:29.020 --> 00:38:30.680
No toy demos, no fluff.

00:38:31.240 --> 00:38:37.380
I'll show you the guardrails, the planning techniques, and the workflows that turn AI into a genuine engineering partner.

00:38:37.380 --> 00:38:41.440
Check it out at talkpython.fm/agentic-engineering.

00:38:41.740 --> 00:38:44.800
That's talkpython.fm/agentic-engineering.

00:38:45.000 --> 00:38:47.120
The link is in your podcast player's show notes.

00:38:47.120 --> 00:38:51.820
One of the other things I wanted to mention is like, so you said pin dependencies.

00:38:52.240 --> 00:38:59.460
And so I teach this and then inevitably every time someone's like, well, if I pin dependencies forever, then I just have all these really old dependencies.

00:38:59.720 --> 00:39:00.860
That's not what Michael means.

00:39:01.300 --> 00:39:05.080
He means you do development, you update your dependencies to a version.

00:39:05.720 --> 00:39:10.500
Like ideally you're at, like, the latest, you know, stable version of whatever the thing is.

00:39:10.500 --> 00:39:23.760
Because you're trying to keep like definitely as supported version, recent, you're not picking terrible things where, you know, it hasn't been updated in two years, or there's one maintainer and they happen to live in Russia and work for the Russian government, right?

00:39:23.780 --> 00:39:25.280
So you're picking like decent ones.

00:39:25.560 --> 00:39:26.840
You're updating it in dev.

00:39:26.920 --> 00:39:27.820
You're like, okay, this is the one.

00:39:27.980 --> 00:39:28.680
Then you pin it.

00:39:28.960 --> 00:39:33.440
So as it goes up to different environments, you don't get a surprise update and it changes.

00:39:33.440 --> 00:39:39.120
And then there's something different in prod than what you tested in UAT and approved with the security tools.

00:39:39.560 --> 00:39:40.720
That's what, that's what.

00:39:40.880 --> 00:39:41.600
And a hundred percent.

00:39:41.680 --> 00:39:42.920
Because it gets misinterpreted.

00:39:43.100 --> 00:39:43.340
Yeah.

00:39:43.500 --> 00:39:52.240
And another thing you can do real simple is, like with some of the tools, like with uv, you can say, I have pinned dependencies, update them to the current ones

00:39:52.240 --> 00:39:58.540
with a very important caveat that you can say that are older than a week or older than a day or something.
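With uv specifically, that "nothing newer than X" idea can be expressed in pyproject.toml; the timestamp below is illustrative:

```toml
# pyproject.toml sketch -- the cutoff date is illustrative
[tool.uv]
# Only resolve to package versions published before this timestamp,
# so a just-published (possibly hijacked) release can't sneak in.
exclude-newer = "2026-04-01T00:00:00Z"
```

uv also accepts this as an `--exclude-newer` option on the command line, so you can roll the cutoff forward on each deliberate update.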

00:39:58.540 --> 00:40:05.900
But because, you know, the really big example here is LiteLLM, just this, was that just this week or was that last week?

00:40:05.940 --> 00:40:06.540
I can't keep it.

00:40:06.560 --> 00:40:07.400
It was very recent.

00:40:07.400 --> 00:40:07.680
Yeah.

00:40:07.800 --> 00:40:20.320
This thing has a dependency that itself got compromised, like you talked about, the developer's account got taken over, I believe, and malware got put in, and it was only out for like half an hour or something, but it's so popular.

00:40:20.540 --> 00:40:25.220
It took over like 50,000 machines because it gets downloaded millions of times a day.

00:40:26.340 --> 00:40:26.660
Automatically.

00:40:26.760 --> 00:40:36.980
If you say, give me the latest. Like, obviously waiting a week is not hardcore security, but at the same time, so many of these poisoned releases that people pull in, they only last briefly, right?

00:40:36.980 --> 00:40:37.780
For a few moments.

00:40:37.900 --> 00:40:41.320
And then somebody's like, oh my gosh, why is this thing using 100% CPU?

00:40:41.500 --> 00:40:41.900
You know what I mean?

00:40:42.080 --> 00:40:49.180
And here's the thing, Michael, is that not all of those 50,000 got the memo that this happened, and they're still vulnerable in prod.

00:40:49.280 --> 00:40:49.440
Yeah.

00:40:49.520 --> 00:40:50.760
It could be for a long time.

00:40:50.860 --> 00:40:51.540
Yes, I agree.

00:40:51.700 --> 00:40:56.580
Like, update, but to one that's three days old or one week old. It's weird.

00:40:56.580 --> 00:40:59.920
So this advice has drastically changed over the past six months.

00:41:00.100 --> 00:41:03.180
The best practice used to be auto update to latest version, period.

00:41:03.480 --> 00:41:06.220
That used to be the advice and that's no longer the advice.

00:41:06.220 --> 00:41:09.080
And it's kind of heartbreaking, especially if you use npm.

00:41:09.420 --> 00:41:10.780
NPM is just like under siege.

00:41:11.680 --> 00:41:12.000
It is.

00:41:12.080 --> 00:41:12.260
Yeah.

00:41:12.620 --> 00:41:16.920
PyPI is as well, but it looks over at npm and is thankful for its situation.

00:41:17.420 --> 00:41:18.780
Number four.

00:41:18.780 --> 00:41:20.820
All right, I've got to keep cruising here.

00:41:21.120 --> 00:41:22.420
Four, cryptographic failures.

00:41:22.920 --> 00:41:23.840
Cryptographic failures.

00:41:23.980 --> 00:41:24.180
Yeah.

00:41:24.400 --> 00:41:25.300
Not encrypting.

00:41:25.960 --> 00:41:27.940
Encrypting using something really old.

00:41:28.420 --> 00:41:32.060
You start off encrypted and then briefly you're not encrypted and then you're encrypted again.

00:41:32.560 --> 00:41:34.280
You don't encrypt it when you're supposed to.

00:41:34.720 --> 00:41:41.720
Also in this realm, one-way hashing, not just reversible encryption, right?

00:41:41.780 --> 00:41:42.800
It would probably fall in here.

00:41:42.800 --> 00:41:47.060
Encrypting user passwords and storing them in the database along with the key.

00:41:47.820 --> 00:41:52.520
Ideally, we would salt and then hash user passwords.

00:41:52.700 --> 00:41:53.540
That would be the best.

00:41:53.680 --> 00:41:56.820
If you really, really, really, really are intense, you could pepper it too.

00:41:56.940 --> 00:41:58.080
And no, I did not make that up.

00:41:58.440 --> 00:42:01.180
That is a mathematical nerd joke, not an app sec joke.

00:42:01.760 --> 00:42:06.120
But a salt is unique per user and the salt itself isn't really a secret.

00:42:06.460 --> 00:42:09.460
Whereas a pepper is unique per system or per organization.

00:42:09.460 --> 00:42:11.020
And it is a secret.

00:42:11.200 --> 00:42:11.280
Right.

00:42:11.360 --> 00:42:14.860
Like a secret key that you set and then it gets factored in there.

00:42:14.920 --> 00:42:15.260
That's cool.

00:42:15.400 --> 00:42:15.560
Yeah.

00:42:15.700 --> 00:42:15.900
Yeah.

00:42:15.900 --> 00:42:19.120
Also, maybe choose more modern hashing algorithms, right?

00:42:19.200 --> 00:42:23.060
Obviously not MD5, but maybe something memory-hard like Argon2, maybe.

00:42:23.260 --> 00:42:23.540
I don't know.

00:42:23.640 --> 00:42:23.860
Yes.

00:42:23.960 --> 00:42:24.460
Argon2.

00:42:24.920 --> 00:42:26.360
That would be better for sure.

00:42:27.160 --> 00:42:32.500
And this is something where if you're going to do it, it's very easy to look up what you're supposed to do on the internet.

00:42:33.420 --> 00:42:38.480
This is something where you can ask the AI, like, are you using a good algorithm?

00:42:38.480 --> 00:42:39.060
Are you doing this?

00:42:39.160 --> 00:42:40.000
Like, make sure it's secure.

00:42:40.160 --> 00:42:43.020
And then it's good as long as it does it.

00:42:43.260 --> 00:42:53.600
Because one suggestion, if you are going to VibeCode and not do the 400 other things we'll talk about later when we talk about my prompt library, but ask it to list its security assumptions.

00:42:54.260 --> 00:42:56.780
So whatever it is you prompt, you give it to make a thing.

00:42:56.840 --> 00:42:58.600
You're like, make this, then do that, blah, blah, blah.

00:42:59.120 --> 00:43:00.740
Please list all your security assumptions.

00:43:01.300 --> 00:43:06.120
And it'll be like, oh, yeah, but obviously like you wouldn't do like authentication like that because that's terrible.

00:43:06.120 --> 00:43:16.980
And like in production, you would do this other thing and you're like, oh, yeah, because it'll assume that you're going to change a bunch of things later that it doesn't tell you unless you ask it to tell you its assumptions.

00:43:18.120 --> 00:43:18.300
Yeah.

00:43:18.400 --> 00:43:21.080
We'll use no password here, but when you ship it, you're going to add that, right?

00:43:21.180 --> 00:43:23.540
Like, no, I wasn't going to, but now I will.

00:43:23.740 --> 00:43:24.140
All right.

00:43:24.140 --> 00:43:29.340
I think one of the best known ones has got to be little Bobby tables and friends.

00:43:29.600 --> 00:43:31.460
Number five, injection.

00:43:31.760 --> 00:43:32.160
Yes.

00:43:32.580 --> 00:43:43.200
So injection: tricking an application by putting code, the malicious actor's code, into a place where there should be data, but you've tricked it into thinking it's code.

00:43:43.260 --> 00:43:45.680
And then either it executes it or it interprets it.

00:43:45.860 --> 00:43:49.580
Like if there's an interpreter or a compiler, there's the potential for injection.

00:43:49.580 --> 00:43:53.820
And it, yeah, we don't want to mix data in with commands.

00:43:54.200 --> 00:43:57.940
We don't want to mix data in with anything that's going to be executed or interpreted.

00:43:58.220 --> 00:43:59.600
And we do it a lot, Michael.

00:43:59.980 --> 00:44:01.560
I know we make bad choices, don't we?

00:44:01.800 --> 00:44:02.780
We make bad choices.

00:44:02.880 --> 00:44:08.360
So obviously SQL injection is the number one in this world for sure.

00:44:08.760 --> 00:44:10.600
And still it's popular.

00:44:10.840 --> 00:44:11.800
Still people don't know.

00:44:11.860 --> 00:44:12.500
It's still tricky.

00:44:12.780 --> 00:44:19.180
I mean, we certainly have parameterized queries and ORMs and stuff that should be helping us, or does help us, if we choose to use them with this.

00:44:19.180 --> 00:44:23.320
However, I think the other ones deserve a quick shout-out.

00:44:23.440 --> 00:44:30.760
Like for example, if you're accepting JSON and converting it to a dictionary in Python, you can do MongoDB injection.

00:44:31.020 --> 00:44:39.260
Like, you know, the Bobby Tables one is, like, quote, semicolon, drop table, that kind of thing; that's what that looks like in T-SQL.

00:44:39.440 --> 00:44:43.020
But in MongoDB, you can do queries that are dictionaries.

00:44:43.240 --> 00:44:44.880
That's like kind of how you do your filtering.

00:44:44.880 --> 00:44:53.980
So if you take something that would be a password in a JSON document, the password could be curly brace, greater-than, you know, the one-equals-one equivalent.

00:44:54.140 --> 00:44:58.040
Like a really complicated JSON dictionary that is actually the equivalent query.

00:44:58.160 --> 00:44:59.800
So you got to be super careful there as well.

00:44:59.860 --> 00:45:02.220
And that's really tricky to spot.
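A minimal sketch of what that NoSQL injection shape looks like, and the type check that blocks it; the payload and function names here are made up for illustration:

```python
# The login handler expects {"user": "...", "password": "..."} from JSON.
# An attacker instead sends an operator object as the password:
malicious = {"user": "admin", "password": {"$gt": ""}}
# Passed straight into a MongoDB filter, {"$gt": ""} matches ANY password,
# the NoSQL cousin of ' OR 1=1 --.

def validated_credentials(payload: dict) -> tuple[str, str]:
    """Reject anything that isn't a plain string before it reaches the query."""
    user, password = payload.get("user"), payload.get("password")
    if not isinstance(user, str) or not isinstance(password, str):
        raise ValueError("credentials must be strings")
    return user, password
```

Because JSON lets values be objects, the fix is exactly the input validation discussed below: insist on the type you expect and reject everything else.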

00:45:02.220 --> 00:45:05.820
And then also like the pickles and like serialization, deserialization.

00:45:06.000 --> 00:45:08.880
There's a lot to this, not just SQL injection.

00:45:09.200 --> 00:45:11.260
It's a lot about input validation.

00:45:11.880 --> 00:45:17.480
So using a parameterized query, so stored procedures, prepared statements, whatever you want to call them.

00:45:17.800 --> 00:45:21.080
What that does is say: this is data, only treat it as data.

00:45:21.080 --> 00:45:23.880
And then the database can do that.
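A quick sketch of that difference with Python's built-in sqlite3, using little Bobby Tables himself:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")

# Bobby Tables shows up:
name = "Robert'); DROP TABLE users;--"

# BAD (don't do this): f-string splices attacker data into the SQL itself.
# conn.execute(f"INSERT INTO users VALUES ('{name}')")

# GOOD: the ? placeholder tells the driver "this is data, only data".
conn.execute("INSERT INTO users (name) VALUES (?)", (name,))

# The hostile string was stored as an ordinary value; the table survives.
row = conn.execute("SELECT name FROM users").fetchone()
```

The same placeholder idea carries across drivers, only the marker changes (`?`, `%s`, `:name`, depending on the library).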

00:45:24.520 --> 00:45:27.580
But if we, on top of that, do input validation.

00:45:27.580 --> 00:45:33.300
So, like, we're accepting the thing that looks correct and we're rejecting the rest; we're not trying to fix it.

00:45:33.420 --> 00:45:35.720
We're just rejecting everything that looks not correct.

00:45:36.060 --> 00:45:41.080
And then if we have to accept any special characters, we escape them or sanitize them out.

00:45:41.300 --> 00:45:42.240
I prefer escaping.

00:45:42.500 --> 00:45:43.700
I think it's weird to remove stuff.

00:45:43.800 --> 00:45:44.420
That's my data.

00:45:44.520 --> 00:45:45.260
I probably want it.

00:45:45.380 --> 00:45:50.020
So like if you have to accept single quotes because you know you're going to have users named O'Malley, let's say.

00:45:50.020 --> 00:45:50.460
Right.

00:45:50.860 --> 00:45:52.280
So we accept the letters.

00:45:52.520 --> 00:45:53.260
We accept the numbers.

00:45:53.480 --> 00:45:56.220
We accept a single quote and some dashes, even though those are dangerous.

00:45:56.220 --> 00:45:59.600
And then we escape those characters because we know they're potentially dangerous.

00:45:59.820 --> 00:46:05.760
And then we specify, by the way, that's definitely data and not code by making it a parameter.
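A minimal sketch of that reject-don't-repair allow-list approach, with the O'Malley case; the pattern and field name are illustrative:

```python
import re

# Allow list for a name field: letters, digits, spaces, apostrophes, hyphens.
NAME_OK = re.compile(r"[A-Za-z0-9' -]{1,64}")

def validate_name(value: str) -> str:
    """Reject (don't repair) anything outside the allow list, then escape
    the dangerous characters we had to accept. The query parameter then
    provides the final layer of defense."""
    if not NAME_OK.fullmatch(value):
        raise ValueError("invalid name")
    return value.replace("'", "''")  # SQL-style escaping of the single quote
```

So O'Malley gets through with his apostrophe escaped, while anything carrying parentheses, semicolons, or angle brackets is rejected outright rather than "cleaned up".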

00:46:06.060 --> 00:46:06.500
Right.

00:46:06.560 --> 00:46:08.040
And then we're locked down.

00:46:08.300 --> 00:46:09.380
We're in really good shape.

00:46:09.460 --> 00:46:21.400
If we added input validation on everything and then we just rejected everything that looked weird, especially against, like, a yes list, an allow list: this is what's allowed.

00:46:21.640 --> 00:46:24.640
Not a block list. When I was a pen tester, Michael, I was not.

00:46:24.840 --> 00:46:26.480
I was a pen tester for a year and a half.

00:46:26.840 --> 00:46:31.000
I had basically zero training and I could get around those in two seconds.

00:46:31.000 --> 00:46:33.960
And like I was not particularly superbly talented.

00:46:34.600 --> 00:46:37.640
And I was just like, pew, pew, pew, ha, ha, ha, ha, ha, block list.

00:46:37.720 --> 00:46:38.340
You just try.

00:46:38.600 --> 00:46:39.880
If you just know to look, right?

00:46:40.000 --> 00:46:40.260
Yeah.

00:46:40.420 --> 00:46:43.600
Well, and there's cheat sheets all over the internet of how to get around them.

00:46:43.760 --> 00:46:46.300
It's so everyone knows how to get around them.

00:46:46.400 --> 00:46:48.100
So, but you can't get around.

00:46:48.320 --> 00:46:49.820
Well, you're only allowed letters and numbers.

00:46:49.820 --> 00:46:50.280
It's like, well.

00:46:50.320 --> 00:46:54.980
Let's keep moving a little bit quickly so that we've got time for a little retrospective.

00:46:54.980 --> 00:46:57.460
So insecure design, that's a fun one.

00:46:57.720 --> 00:47:02.080
It does not matter how perfectly you follow the plan if the plan is bad.

00:47:02.340 --> 00:47:02.820
Right.

00:47:03.240 --> 00:47:10.280
And so this was new; the last time we released the list was the first time it was on there.

00:47:10.280 --> 00:47:14.420
And I'm really glad because all the other items are implementation.

00:47:14.420 --> 00:47:17.400
And this is the only one that is design the plan.

00:47:17.580 --> 00:47:22.360
And so essentially it means, you know, someone designed something and you don't talk about it.

00:47:22.360 --> 00:47:24.020
You don't analyze it for security.

00:47:24.020 --> 00:47:27.720
You don't intentionally apply secure design concepts.

00:47:27.940 --> 00:47:29.080
You don't do a threat model.

00:47:29.180 --> 00:47:30.400
There's no security review.

00:47:30.840 --> 00:47:31.660
You YOLO that.

00:47:31.900 --> 00:47:35.980
You don't even have a list of security requirements usually that you knew you should have added.

00:47:35.980 --> 00:47:44.020
Like if you're going to do an API and it's, you know, accessible from the internet, to me, it should be behind an API gateway, period.

00:47:44.440 --> 00:47:45.140
That is my opinion.

00:47:45.340 --> 00:47:47.660
No, I don't sell one, but I think we all need one.

00:47:47.880 --> 00:47:48.240
Right.

00:47:48.360 --> 00:47:51.240
And to me, that should just be a requirement up front.

00:47:51.240 --> 00:47:54.700
And then when I see your design document and it's there, I'm like, thumbs up, let's go.

00:47:55.020 --> 00:47:55.220
Right.

00:47:55.220 --> 00:48:03.440
But if you're not giving clear requirements and then you're not reviewing the design, you are getting a YOLO approach at stuff.

00:48:03.580 --> 00:48:05.760
And that doesn't mean developers don't care.

00:48:06.140 --> 00:48:09.600
But if no one's taught them this, no one's asked for this, and then no one checks this.

00:48:09.800 --> 00:48:11.740
The problem with this one is the code looks fine.

00:48:11.820 --> 00:48:12.920
It looks like you're doing it right.

00:48:12.920 --> 00:48:15.640
It's just there's something important that's just not even there.

00:48:15.640 --> 00:48:19.220
Like examples that I came up with were like no rate limiting would be one.

00:48:19.480 --> 00:48:20.440
The login looks fine.

00:48:20.520 --> 00:48:23.640
You're checking the person's not a duplicate, that they're there, et cetera, et cetera.

00:48:23.780 --> 00:48:23.860
Right.

00:48:23.980 --> 00:48:27.060
Or client-side enforcement in some kind of, like, Vue.js app.

00:48:27.140 --> 00:48:32.960
You've got all the validation there, but the API actually just assumes the client is doing it, which is never the way.

00:48:32.960 --> 00:48:39.820
And there's a lot of business logic issues with the way that you're solving the problem.

00:48:40.160 --> 00:48:43.120
If users do the thing you want, it's fine.

00:48:43.340 --> 00:48:45.520
But not all users do the thing you want.

00:48:45.580 --> 00:48:46.600
And some of them are Tanya.

00:48:46.820 --> 00:48:48.460
And they're like, well, I'm just going to click through your.

00:48:48.660 --> 00:48:49.020
Exactly.

00:48:49.260 --> 00:48:51.560
Some of us are curious and we like to click the buttons.

00:48:53.020 --> 00:48:54.580
We're like, oh, look at that.

00:48:54.900 --> 00:48:57.520
Oh, there's a next button, even though it says it's the last page.

00:48:57.600 --> 00:48:58.740
Well, what would happen if we click that?

00:48:58.840 --> 00:48:59.060
Absolutely.

00:48:59.140 --> 00:49:00.400
You've got to click that button, right?

00:49:01.280 --> 00:49:02.900
Authentication failures, number seven.

00:49:03.060 --> 00:49:09.520
So this is when an attacker can trick the app into thinking they're a different user, usually an admin user.

00:49:09.520 --> 00:49:13.580
Or if they're not a legitimate user, tricking them into being a legitimate user.

00:49:13.700 --> 00:49:15.460
But we all want to be admin, Michael.

00:49:15.600 --> 00:49:15.880
Oh, yeah.

00:49:15.960 --> 00:49:17.580
This can be caused by lots of things.

00:49:17.580 --> 00:49:29.120
We wrote our own authentication instead of buying a tried, tested, and true product that will do this for us easier, better, faster, and cheaper in the long run when we count maintenance.

00:49:29.120 --> 00:49:30.980
That's the biggest mistake.

00:49:31.100 --> 00:49:33.420
But we don't protect against credential stuffing.

00:49:33.540 --> 00:49:35.020
We don't protect against brute force.

00:49:35.240 --> 00:49:37.040
Those are the two super, super huge ones.

00:49:37.120 --> 00:49:43.880
We let people reuse passwords, use terribly insecure passwords, etc. We don't have multiple forms of authentication.

00:49:43.880 --> 00:49:45.860
So there's no second factor.

00:49:46.720 --> 00:49:46.900
Yeah.

00:49:46.900 --> 00:49:53.820
And, Michael, there are ways to do multi-factor that don't have to be awful, necessarily.

00:49:54.220 --> 00:49:57.780
Like, if you don't always require the same level of security posture.

00:49:57.780 --> 00:50:03.600
So, for instance, you know, you do have multi-factor authentication the first time.

00:50:03.780 --> 00:50:07.240
But then maybe you fingerprint their browser and their device and their network.

00:50:07.420 --> 00:50:14.580
And if they're going to log in from the same device, the same browser, and the same network, maybe you don't require an MFA challenge very often.

00:50:14.840 --> 00:50:15.400
A hundred percent.

00:50:15.540 --> 00:50:18.560
Like, you could say, trust this machine or this browser.

00:50:18.660 --> 00:50:21.560
And you're like, okay, we'll never ask you 2FA again.

00:50:21.740 --> 00:50:22.860
Just your username, password.

00:50:23.160 --> 00:50:26.120
Or we won't, unless you're doing something like deleting your account.

00:50:26.120 --> 00:50:29.660
There's ways to make this not necessarily always super painful.

00:50:30.140 --> 00:50:33.300
And I, you know, pass keys are so nice.

00:50:33.600 --> 00:50:35.080
Those are making things a lot nicer.

00:50:35.580 --> 00:50:38.020
But I still feel like there's a ways to go.

00:50:38.380 --> 00:50:43.660
I dream of the day where we trust our devices so well that, like, I can just touch the thing and I know it's okay.

00:50:44.020 --> 00:50:44.240
Right?

00:50:44.400 --> 00:50:51.660
And I know that someone can't just, like, XKCD hit me with a wrench until I put the thing in front of my face and then it unlocks.

00:50:52.100 --> 00:50:52.140
Exactly.

00:50:52.240 --> 00:50:54.840
That is one of the weakest parts of the security chain there.

00:50:54.840 --> 00:50:55.340
Mm-hmm.

00:50:55.460 --> 00:50:58.020
Software or data integrity failures?

00:50:58.100 --> 00:50:58.620
Number eight.

00:50:58.940 --> 00:51:00.640
So this is one we fought a lot about.

00:51:01.100 --> 00:51:01.520
Yeah.

00:51:01.580 --> 00:51:02.460
Especially me and Neil.

00:51:02.720 --> 00:51:07.960
Because I wrote this one and Neil wrote the supply chain one, or vice versa.

00:51:08.360 --> 00:51:10.280
And there's lots of arguments of how to differentiate.

00:51:10.280 --> 00:51:19.980
So we need to make sure that things we download are exactly what we think they are, and that the integrity holds; it's not been spoofed or tampered with in the meantime.

00:51:20.200 --> 00:51:22.060
So no one has changed it.

00:51:22.380 --> 00:51:24.240
And this is for data and for software.

00:51:24.400 --> 00:51:27.500
So, you know, third-party components that we're getting.

00:51:27.800 --> 00:51:28.960
Are we getting the thing we thought?

00:51:29.080 --> 00:51:30.040
Is there a typosquat?

00:51:30.280 --> 00:51:33.840
Has someone been able to intercept in between and change it out?

00:51:33.840 --> 00:51:35.320
Same with data.

00:51:35.680 --> 00:51:39.120
Like, did someone change the data on the way to us, et cetera.

00:51:39.400 --> 00:51:43.960
And this is really key, especially for anything medical.

00:51:44.300 --> 00:51:48.780
Like imagine the insulin pump that gets it wrong sometimes and people have comas.

00:51:48.780 --> 00:51:50.600
Like that would be so unbelievably bad.

00:51:50.600 --> 00:51:51.660
That's very bad.

00:51:51.660 --> 00:51:59.700
Yeah, I worked with a company that did security assurance for, like, all the medical devices and instruments in ORs and ERs.

00:51:59.840 --> 00:52:00.200
Hi.

00:52:01.060 --> 00:52:02.800
It was an awesome project.

00:52:02.980 --> 00:52:06.020
It was really cool, but it was also like, damn, your job's hard.

00:52:06.140 --> 00:52:07.640
It's tough to sleep at night in that one.

00:52:07.700 --> 00:52:07.840
Yeah.

00:52:07.940 --> 00:52:11.460
The thing is, is that private industry tends to really focus on availability.

00:52:11.920 --> 00:52:14.140
Like if their website's down, they can't sell their thing.

00:52:14.480 --> 00:52:16.260
Clients call, it costs them money.

00:52:16.560 --> 00:52:16.780
Right.

00:52:16.780 --> 00:52:21.660
But integrity is like more silent hurt, if that makes sense.

00:52:21.840 --> 00:52:21.980
Yeah.

00:52:22.100 --> 00:52:23.860
So many of these are like this, honestly.

00:52:24.060 --> 00:52:30.260
Like the whole top 10 is only, it only slows you down and it's sand in the gears until something happens.

00:52:30.260 --> 00:52:31.840
And then it's your fault for not doing it.

00:52:31.920 --> 00:52:34.160
But before that, it's like all this stuff is a hassle.

00:52:34.340 --> 00:52:41.700
And integrity comes into play in so many situations because it can fail silently.

00:52:42.380 --> 00:52:45.460
Things that fail silently are more scary for security teams.

00:52:45.840 --> 00:52:46.580
Does that make sense?

00:52:46.580 --> 00:52:51.480
People love to use CDNs for their JavaScript and their CSS and so on.

00:52:51.600 --> 00:53:01.580
And there've been examples where the CDN was taken over, or a developer was compromised and published malicious JavaScript.

00:53:02.240 --> 00:53:08.380
And the danger with this is if you make that hack go through, you don't just take over that app.

00:53:08.440 --> 00:53:12.720
You take over all the people who use that app and everyone who uses the CDN to pull it down.

00:53:12.720 --> 00:53:16.220
Like, the knock-on effects can be mega.

00:53:16.220 --> 00:53:16.280
Yeah.

00:53:16.280 --> 00:53:22.240
And like checking the sub-resource integrity, doing that check, that can help.

00:53:22.240 --> 00:53:26.500
Sometimes we can do all the right things and we still get hurt.

00:53:26.500 --> 00:53:43.980
Because like for instance, with SolarWinds, the compromise was so deep in the organization that they were able to not only push in like code that was malicious, have it pass all the security tests in the pipeline, then sign it and then release it.

00:53:43.980 --> 00:53:45.980
And then not have customers also notice the problem.

00:53:45.980 --> 00:53:48.980
Like that situation is rare.

00:53:48.980 --> 00:53:55.140
What we want to do with this one is raise awareness that you should just be checking the integrity of your stuff, period.

00:53:55.640 --> 00:53:55.820
Right?

00:53:56.000 --> 00:54:06.040
So the software composition analysis companies, the security researchers, they're on it finding those rare edge case zero day situations.

00:54:06.040 --> 00:54:10.440
What we need the average developer to do is just check integrity, period.

00:54:10.820 --> 00:54:14.840
Like the thing you've got is what you think you've got and it's from the right place.

00:54:15.200 --> 00:54:18.420
And if we could all do that, like life would improve greatly.

00:54:18.620 --> 00:54:20.040
There's defaults that are not great.

00:54:20.140 --> 00:54:21.000
For example, check this out.

00:54:21.060 --> 00:54:23.680
jsDelivr for Tailwind.

00:54:24.180 --> 00:54:28.120
So here's a real popular CDN delivering a very, very popular library.

00:54:28.480 --> 00:54:29.520
Here's how it tells me to use it.

00:54:29.720 --> 00:54:30.380
What's missing here?

00:54:30.520 --> 00:54:32.400
Sub-resource integrity check.

00:54:32.400 --> 00:54:32.800
Yes.

00:54:33.100 --> 00:54:38.260
So if I just say, I want to use this Tailwind and it says, great, source equals such and such.

00:54:38.380 --> 00:54:39.040
Good to go.

00:54:39.400 --> 00:54:39.720
You know what I mean?

00:54:39.760 --> 00:54:40.320
And that's it.

00:54:40.560 --> 00:54:47.620
So even the really popular CDNs and stuff are just encouraging you to scramble away from the pit of success.

00:54:47.840 --> 00:54:48.780
You know, it's not that at all.

00:54:48.860 --> 00:54:51.600
Maybe we should write them and be like, I want you to change this, please.

00:54:52.000 --> 00:54:54.060
When I worked at Microsoft, I did that all the time.

00:54:54.100 --> 00:54:56.760
I'd be like, you need to change your readme page.

00:54:56.820 --> 00:54:57.280
It's wrong.

00:54:57.660 --> 00:54:59.180
You forgot the security thing.

00:54:59.620 --> 00:55:01.680
And they'd be like, Tanya, just, it's a demo.

00:55:01.680 --> 00:55:02.260
I'm like, nope.

00:55:02.400 --> 00:55:04.000
Two more real quick before we run out of time.

00:55:04.500 --> 00:55:05.420
Logging and alerting.

00:55:05.760 --> 00:55:07.420
So security logging and alerting.

00:55:08.020 --> 00:55:14.980
So developers might be doing lots of logging and they might be doing some alerting for debugging, which is important and you should still do it.

00:55:15.180 --> 00:55:21.440
But this is more that we're not logging when security controls are called and especially pass or fail.

00:55:21.440 --> 00:55:28.560
So if someone tries to log in 100 times in one second, I don't just want to know about the 100th time, when they got in.

00:55:28.560 --> 00:55:32.460
I want to know all 99 times where they failed in the logs.

00:55:32.460 --> 00:55:32.980
Right.

00:55:32.980 --> 00:55:38.080
I want to have enough information in those logs that I can do a proper investigation.

00:55:38.080 --> 00:55:42.740
Like when I worked in AppSec, my job wasn't called incident responder.

00:55:42.740 --> 00:55:48.880
But every time an app got smashed, they're like, okay, Tanya, go do that weird thing that you do.

00:55:49.280 --> 00:55:50.900
And I would go look at the logs.

00:55:50.900 --> 00:55:58.940
And I remember a client calling me one day and they're like, Visa called us and 27 of our customers got popped and we need you to go investigate.

00:55:59.540 --> 00:56:01.400
And turns out they didn't have any logs.

00:56:01.860 --> 00:56:03.560
They didn't think they needed to log that.

00:56:03.560 --> 00:56:08.840
And so they had absolutely no application logs for that app.

00:56:08.880 --> 00:56:09.060
Sorry.

00:56:09.260 --> 00:56:15.040
And I was like, what am I supposed to investigate? Walk around the building with, like, a magnifying glass and just look cool with a hat on?

00:56:15.120 --> 00:56:16.640
Like there's nothing, there's no evidence.

00:56:16.920 --> 00:56:21.220
There's probably somebody in the corner with a hoodie, sunglasses looking sort of hacker-ish.

00:56:21.540 --> 00:56:21.900
Right.

00:56:22.280 --> 00:56:24.600
Like, I'm just like, what am I supposed to investigate guys?

00:56:24.660 --> 00:56:26.980
Like you have no logs at all.

00:56:27.060 --> 00:56:28.280
You got to just let it keep going.

00:56:28.280 --> 00:56:34.200
Basically, you got to say, well, now we add logging and then we can figure out if there's new stuff happening or something.

00:56:34.320 --> 00:56:35.000
It's really bad.

00:56:35.240 --> 00:56:36.480
It turns out it wasn't them.

00:56:36.560 --> 00:56:44.140
It turned out that there's a sandwich shop downstairs and an employee had been skimming cards, and everything from our end was fine.

00:56:44.540 --> 00:56:48.820
But then I was like, we are rewriting this app so that it does security logging on failure.

00:56:48.940 --> 00:56:51.460
So essentially, you're making it so we can't investigate.

00:56:51.740 --> 00:56:54.060
You're making it so there's no evidence that a thing happened.

00:56:54.320 --> 00:56:55.540
We can't press charges in court.

00:56:55.620 --> 00:56:56.600
There's no chain of custody.

00:56:56.600 --> 00:56:59.300
We'll never know what happened.

00:56:59.780 --> 00:57:02.340
And then that means we don't know how to protect ourselves in the future.

00:57:02.680 --> 00:57:04.080
And we really need these logs.

00:57:04.220 --> 00:57:14.640
So every time a security thing happens, input validation, output encoding, like anything that is security related, just log that the attempt was made and it worked or it didn't work.

00:57:14.960 --> 00:57:17.800
And, you know, which user ID, et cetera, things like that.

00:57:17.900 --> 00:57:18.540
And the timestamp.
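A minimal sketch of logging security-control pass/fail like that in Python; the event fields and logger name are illustrative:

```python
import logging

security_log = logging.getLogger("security")

def log_security_event(control: str, user_id: str, success: bool) -> dict:
    """Record every security control invocation: which control ran, for
    whom, and pass or fail. The timestamp comes from the logging framework."""
    event = {
        "control": control,                     # e.g. login, input_validation
        "user_id": user_id,
        "outcome": "pass" if success else "FAIL",
    }
    level = logging.INFO if success else logging.WARNING
    security_log.log(
        level, "control=%(control)s user=%(user_id)s outcome=%(outcome)s", event
    )
    return event
```

With 99 failed-login records in place of silence, the investigation above becomes possible: you can see who, when, from where, and how often.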

00:57:18.760 --> 00:57:18.900
All right.

00:57:18.920 --> 00:57:19.420
Last one.

00:57:19.520 --> 00:57:22.600
Let's round it out real quick with mishandling of exceptional conditions.

00:57:22.600 --> 00:57:27.660
So this is brand new and this one is related to the other one.

00:57:27.800 --> 00:57:33.540
So number nine was basically you're not doing logging when you should or your logs suck.

00:57:33.660 --> 00:57:34.080
They're incomplete.

00:57:34.480 --> 00:57:38.640
This one is errors happen and you just don't handle them properly.

00:57:38.640 --> 00:57:46.960
So I'm sure you've reviewed code and seen this where it's like try and it does a thing and then catch and then there's nothing and then end.

00:57:47.160 --> 00:57:47.720
I'm like, what?

00:57:47.880 --> 00:57:50.360
You didn't handle anything.

00:57:50.360 --> 00:57:58.080
Or the handling is just I'm going to print the entire system error to the screen with the stack trace and a mess.

00:57:58.180 --> 00:57:58.940
Nope, that's gross.

00:57:59.560 --> 00:58:02.260
I'm just going to not properly recover.

00:58:02.740 --> 00:58:02.860
Right.

00:58:02.980 --> 00:58:07.860
And so application resilience is important, but you can't have that at all.

00:58:07.860 --> 00:58:09.540
If you're not doing this, you can't recover.

00:58:09.920 --> 00:58:09.940
Yeah.

00:58:10.000 --> 00:58:14.380
Or you don't use a database transaction and your data is corrupted, something like that.
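A sketch of handling that properly in Python with sqlite3: a real transaction, plus an exception handler that logs server-side instead of dumping a stack trace at the user. The schema and function names are illustrative:

```python
import logging
import sqlite3

log = logging.getLogger("app")

def transfer(conn: sqlite3.Connection, src: int, dst: int, amount: int) -> bool:
    """Move money between accounts. The `with conn` block is a transaction:
    if anything fails mid-transfer, it rolls back instead of corrupting data.
    The caller only learns success/failure; details go to the logs."""
    try:
        with conn:  # commits on success, rolls back on exception
            conn.execute(
                "UPDATE accounts SET balance = balance - ? WHERE id = ?",
                (amount, src),
            )
            conn.execute(
                "UPDATE accounts SET balance = balance + ? WHERE id = ?",
                (amount, dst),
            )
        return True
    except sqlite3.Error:
        log.exception("transfer failed")  # full stack trace to logs, not users
        return False
```

No silent empty catch, no raw stack trace on screen, and the database is never left half-updated.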

00:58:14.380 --> 00:58:27.580
This is where a lot of business logic flaws, like really unique bugs happen that are harder to find because we are not handling our errors at all or we're handling them very, very poorly.

00:58:27.580 --> 00:58:36.320
And I was really excited to have this on here, because lack of application resilience tied with this one for spot number 10.

00:58:36.760 --> 00:58:40.700
But if you solve this, you almost always solve lack of application resilience.

00:58:40.700 --> 00:58:44.680
But if you solve lack of application resilience, you do not solve this.

00:58:45.220 --> 00:58:48.440
And so that's how I got them to agree to put this one on.

00:58:48.480 --> 00:58:53.680
And not the other one. Having technical discussions with really smart people, it's pretty cool.

00:58:54.240 --> 00:58:54.640
Absolutely.

00:58:55.340 --> 00:59:04.080
So I want to take a moment and talk about AI and security and give you a chance to talk about your prompt library and how people can get it.

00:59:04.180 --> 00:59:07.120
And while you're doing that, I'm going to pull up an example I can kick off.

00:59:07.240 --> 00:59:08.800
So tell people about it and I'll pull up the example.

00:59:08.800 --> 00:59:12.440
I give training and I do this bad, better, best thing where I give an example.

00:59:12.580 --> 00:59:17.260
So I'm like input validation or whatever the topic is, you know, like a brief lecture on it and best practices.

00:59:17.600 --> 00:59:19.460
Then I give an example of bad code.

00:59:19.600 --> 00:59:21.640
Then we fix that thing, better code.

00:59:21.700 --> 00:59:23.380
And then best codes, like layers of defenses.

00:59:23.740 --> 00:59:34.160
And when I was creating these examples with the AI, Michael, every time the example was bad code, like no security control whatsoever or completely incorrectly done.

00:59:34.160 --> 00:59:38.220
So like you get the input, you use it and then you validate it.

00:59:38.580 --> 00:59:38.960
Right.

00:59:39.120 --> 00:59:41.540
So it has gotten better.

00:59:41.740 --> 00:59:46.180
So over the past two years, I've seen it go from every time bad to maybe half the time.

00:59:46.240 --> 00:59:47.100
It's a bad example.

00:59:47.220 --> 00:59:49.640
Sometimes I have to dumb it down now, which is encouraging.

00:59:49.640 --> 00:59:51.940
But that's obviously not what we want.

00:59:52.560 --> 00:59:57.480
And so the AI, I think everyone knows, is not creating great code.

00:59:57.640 --> 01:00:01.840
And the reason is it was trained on not great code.

01:00:02.180 --> 01:00:03.380
Most code out there is not great code.

01:00:03.580 --> 01:00:11.900
The code specifically it used was demos, examples, things on GitHub, publicly available demos where there's no security team involved.

01:00:11.900 --> 01:00:12.380
Right.

01:00:12.380 --> 01:00:12.500
Right.

01:00:12.680 --> 01:00:25.400
So like if you went and scanned the code inside Microsoft that makes the Microsoft products, you better believe it, that'd probably be pretty darn good code versus some random crap Tanya did five years ago that's on her GitHub.

01:00:25.700 --> 01:00:29.100
That might be really crappy or it might even be intentionally vulnerable.

01:00:29.520 --> 01:00:29.700
Right.

01:00:30.120 --> 01:00:30.680
And it doesn't.

01:00:30.980 --> 01:00:31.120
Yeah.

01:00:31.380 --> 01:00:31.700
No.

01:00:31.920 --> 01:00:40.440
And so as a result, we have this thing that's trained that security is optional, it's low priority and it's missing.

01:00:40.440 --> 01:00:43.880
And so it is doing what it was trained to do.

01:00:44.420 --> 01:00:50.060
And developers and non-developers are constantly making apps now.

01:00:50.560 --> 01:00:54.460
We have CEOs making apps because they don't like what the marketing team did.

01:00:54.500 --> 01:00:55.940
And they're like, look what I did over the weekend.

01:00:56.940 --> 01:00:57.300
Boom.

01:00:57.500 --> 01:00:59.840
It's publish, please, because I'm the boss.

01:01:00.260 --> 01:01:00.940
Oh, I've seen it.

01:01:01.060 --> 01:01:01.480
Like literally.

01:01:01.800 --> 01:01:01.860
Yeah.

01:01:01.900 --> 01:01:02.760
Who's going to say no, right?

01:01:02.840 --> 01:01:03.320
Yeah, exactly.

01:01:03.320 --> 01:01:12.940
And so here we have very, very insecure code going onto the internet very, very quickly, often with no time for the security team to go look at it.

01:01:13.060 --> 01:01:13.140
All right.

01:01:13.160 --> 01:01:18.620
So you've got this prompt library that people can go and get from your website for free.

01:01:18.940 --> 01:01:22.360
You gave me an example to say, go find problems in this code.

01:01:22.400 --> 01:01:25.520
I took just some random code that I know has trouble in it and threw it in here.

01:01:25.520 --> 01:01:34.160
So the secure code prompt library, if you want to go, just go securemyvibe.ca and you do have to join my newsletter to get it.

01:01:34.220 --> 01:01:38.240
But I feel that's a reasonable price because my newsletter is awesome and you get memes.

01:01:38.480 --> 01:01:40.980
But anyway, so this is from that.

01:01:41.220 --> 01:01:46.140
So the prompt library has many things, but one of them is to review the code for security.

01:01:46.400 --> 01:01:47.980
So this is a code review prompt.

01:01:48.160 --> 01:01:50.480
So after you've generated the code, you would put this in.

01:01:50.480 --> 01:01:51.740
We have high risk findings.

01:01:52.220 --> 01:01:54.840
That looks like an, and what number was that?

01:01:55.080 --> 01:01:57.960
We got more findings, mass assignment, unvalidated JSON.

01:01:59.080 --> 01:01:59.320
Yeah.

01:01:59.400 --> 01:02:00.940
And look how short your code was.

01:02:01.020 --> 01:02:03.040
I gave it 62 lines of code here.

01:02:03.120 --> 01:02:07.060
And you are going to have more vulnerabilities than you have lines of code.

01:02:07.160 --> 01:02:07.420
Okay.

01:02:07.540 --> 01:02:11.740
So I'm not going to go into the details, but wow, I just gave it a little bit and it pulled up a whole bunch.

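[Editor's note: a small, hypothetical Python sketch of the "mass assignment" finding mentioned above — blindly copying client-supplied JSON fields onto a model lets a caller set fields they should never control, such as an admin flag. The class and field names are illustrative.]

```python
class User:
    def __init__(self):
        self.name = ""
        self.is_admin = False

# Allow-list of fields the client is permitted to write.
ALLOWED_FIELDS = {"name"}

def update_bad(user: User, payload: dict) -> None:
    # Vulnerable: every key in the JSON body is assigned to the model,
    # so {"is_admin": true} silently escalates privileges.
    for key, value in payload.items():
        setattr(user, key, value)

def update_better(user: User, payload: dict) -> None:
    # Fixed: only explicitly allow-listed fields are copied.
    for key in payload.keys() & ALLOWED_FIELDS:
        setattr(user, key, payload[key])
```

This is also where validating the JSON body against a schema (the "unvalidated JSON" finding) fits: reject unknown or mistyped fields before they ever reach the model.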
01:02:11.820 --> 01:02:13.560
So I think that that is.

01:02:13.940 --> 01:02:18.500
Did this find more than when you just asked it to review for vulnerability?

01:02:18.500 --> 01:02:19.240
Yeah, I think so.

01:02:19.320 --> 01:02:20.120
I think it did actually.

01:02:20.480 --> 01:02:23.080
Because if you put the AI in the right frame of mind, right?

01:02:23.240 --> 01:02:23.980
That's incredible.

01:02:24.200 --> 01:02:26.820
Well, and I gave it specific things that I wanted it to look for.

01:02:27.080 --> 01:02:28.960
So the prompt library has three levels.

01:02:29.120 --> 01:02:37.920
So prompt level one, you would add it to your memory or make a Claude Code skill, but you would make it run 100% of the time that you generate code.

01:02:38.260 --> 01:02:47.360
And it takes most of the first two thirds of my most recent book, Alice and Bob Learn Secure Coding, and it has condensed it into a set of prompts.

01:02:47.500 --> 01:02:48.140
Oh, that's awesome.

01:02:48.140 --> 01:02:51.800
When you build the code, these are the rules for doing so.

01:02:52.420 --> 01:02:54.800
Then, so that runs just every single time.

01:02:54.800 --> 01:03:00.600
And then it tells you all of its security assumptions and it flags any potential security issues for you automatically.

01:03:01.040 --> 01:03:03.880
So every time you generate code, it's like, I need you to know these things.

01:03:03.880 --> 01:03:05.500
And so then you can address them.

01:03:05.500 --> 01:03:12.640
And then level two prompts are, well, I'm going to build an API or I'm going to build a serverless app or I'm going to do this or I'm going to do that.

01:03:12.840 --> 01:03:19.280
And then you fill in the blanks and it helps basically set security requirements before the code's generated.

01:03:19.280 --> 01:03:23.460
So it does the first prompt and then that one as, like, a double check.

01:03:23.800 --> 01:03:26.740
Then after you can run the secure code review check.

01:03:26.980 --> 01:03:29.680
And then level three is like where you want to get nitty gritty.

01:03:29.880 --> 01:03:35.920
Like you're like, I'm doing a user login feature and I want to hash these passwords very securely.

01:03:36.280 --> 01:03:40.000
Like, and then it's very specific about exactly how to do that.

01:03:40.220 --> 01:03:40.580
And it's free.

01:03:40.660 --> 01:03:40.760
Yeah.

01:03:40.780 --> 01:03:41.940
People should definitely check this out.

01:03:41.980 --> 01:03:42.440
That's very cool.

01:03:42.520 --> 01:03:42.760
All right.

01:03:42.820 --> 01:03:44.100
We are out of time, Tanya.

01:03:44.200 --> 01:03:45.400
Thank you so much for being here.

01:03:45.400 --> 01:03:46.280
Final thoughts.

01:03:46.380 --> 01:03:49.040
People want to get going with the new top 10.

01:03:49.240 --> 01:03:50.760
Please go take a look at it.

01:03:50.820 --> 01:03:53.820
So just look up OWASP top 10 and that will be us.

01:03:53.940 --> 01:04:00.860
Like Google's very good at finding us and maybe give it a read and maybe think about it the next time you are building an app.

01:04:01.280 --> 01:04:04.360
Also, maybe consider visiting your local OWASP chapter.

01:04:05.100 --> 01:04:14.280
Next time you want to, you know, search the internet for how to do something that is security related, look up OWASP cheat sheets and then authentication, authorization or whatever you're doing.

01:04:14.280 --> 01:04:15.600
There's over 100 cheat sheets.

01:04:16.220 --> 01:04:19.820
We are a community that lives to serve and help you secure your software.

01:04:20.680 --> 01:04:21.800
And come check out me.

01:04:21.900 --> 01:04:26.700
If you look up SheHacksPurple, I am all the things: the newsletter, the podcast, the blog, et cetera.

01:04:26.980 --> 01:04:28.360
And I'm also here to help.

01:04:28.820 --> 01:04:30.040
Well, I know you're doing really good stuff.

01:04:30.140 --> 01:04:31.220
I really appreciate your time here.

01:04:31.520 --> 01:04:31.900
Thank you, Tanya.

01:04:32.020 --> 01:04:32.600
Thank you, Michael.

01:04:34.660 --> 01:04:36.980
This has been another episode of Talk Python To Me.

01:04:37.120 --> 01:04:38.080
Thank you to our sponsors.

01:04:38.280 --> 01:04:39.560
Be sure to check out what they're offering.

01:04:39.720 --> 01:04:41.120
It really helps support the show.

01:04:41.120 --> 01:04:45.700
This episode is brought to you by Temporal, durable workflows for Python.

01:04:45.960 --> 01:04:52.680
Write your workflows as normal Python code and Temporal ensures they run reliably, even across crashes and restarts.

01:04:52.940 --> 01:04:55.960
Get started at talkpython.fm/Temporal.

01:04:56.240 --> 01:05:08.740
If you or your team needs to learn Python, we have over 270 hours of beginner and advanced courses on topics ranging from complete beginners to async code, Flask, Django, HTML, and even LLMs.

01:05:08.740 --> 01:05:11.420
Best of all, there's no subscription in sight.

01:05:11.840 --> 01:05:13.600
Browse the catalog at talkpython.fm.

01:05:14.240 --> 01:05:18.920
And if you're not already subscribed to the show on your favorite podcast player, what are you waiting for?

01:05:19.380 --> 01:05:21.400
Just search for Python in your podcast player.

01:05:21.500 --> 01:05:22.360
We should be right at the top.

01:05:22.700 --> 01:05:25.680
If you enjoyed that geeky rap song, you can download the full track.

01:05:25.780 --> 01:05:27.680
The link is actually in your podcast player show notes.

01:05:28.400 --> 01:05:29.820
This is your host, Michael Kennedy.

01:05:30.020 --> 01:05:31.300
Thank you so much for listening.

01:05:31.500 --> 01:05:32.280
I really appreciate it.

01:05:32.680 --> 01:05:33.420
I'll see you next time.
