Monitor performance issues & errors in your code

#136: Secure code lessons from Have I Been Pwned Transcript

Recorded on Saturday, Oct 28, 2017.

00:00 Michael Kennedy: Do you run any code that listens on an open port on the Internet? This could be a website, a RESTful web service or, gasp, even a database endpoint. Troy Hunt, a renowned security expert likes to say that you're doing free penetration testing for that product right there. Join Troy and me on this episode of Talk Python To Me. We discuss lessons learned from running the vulnerability monitoring website, Have I Been Pwned. As well as other lessons for developers to keep your code safe while providing public services. This is Episode 136 recorded October 26th, 2017. Welcome to Talk Python to Me, a weekly podcast on Python, the language, the libraries, the ecosystem and the personalities. This is your host, Michael Kennedy. Follow me on Twitter where I'm @MKennedy. Keep up with the show and listen to past episodes at talkpython.fm and follow the show on Twitter via @talkpython. This episode has been sponsored by Rollbar and GoCD. Thank them both for supporting another podcast by checking out what they're offering during their segments. Troy, welcome to Talk Python.

01:15 Troy Hunt: Hey, thank you very much for having me.

01:17 Michael Kennedy: Yeah, it's really great to have you on as a guest. I've respected the work you've done in security space immensely. And I'm looking forward to sharing what you've learned about software security and developers with the audience.

01:28 Troy Hunt: Cool, awesome, let's do it.

01:30 Michael Kennedy: Yeah, let's do it. Now, before we get into all the details let's start with how you got into programming in the first place. What's your story?

01:35 Troy Hunt: Yeah, that's a good question. So I was a little bit anti-computer when I was a kid, probably up until about my teenage years. And I was actually very frustrated by my friends who'd be inside on their computer. And I'm like, "Come on, man, I want to go outside and play football or something like that. What are you nerds doing?" And then I guess I got into it by... I moved overseas when I was a kid. So this was when I was almost 14. My family moved to the Netherlands. And it's kind of cold over there. And like you get--

02:06 Michael Kennedy: A lot of dark evenings. The sun goes down at what? Like 2:30?

02:09 Troy Hunt: Yeah, well, it's a little later than that.

02:10 Michael Kennedy: I'm just kidding. Yeah, yeah, but it gets pretty dark in the winter there, right?

02:14 Troy Hunt: It is. I mean, for folks in the UK, it's basically like the UK but, well, actually, I won't say something derogatory about the UK. You can read my Twitter feed for that. No, we love them, honestly. So anyway, there's a lot of time spent indoors. And I just started getting involved. I think I must've started doing a bit of BASIC back then. So this would've been sort of early 90s as well. And I think really, for me, though, so I was doing a bunch of PC-related stuff in my teenage years and doing part-time jobs at PC repair stores and that sort of thing. But the thing that really hooked me was seeing the web. So I started using the web in '95 when I started uni. And it was just like, "Wow, this is awesome. This is amazing. I want to build stuff for this." And really that was the start of modern-day life as I know it, in terms of what I do.

02:57 Michael Kennedy: That's awesome, yeah, I had the same experience. I was a sophomore or junior at university when, was it Mozilla? What was the first? Before Netscape?

03:06 Troy Hunt: Before Netscape, yeah, something--

03:08 Michael Kennedy: Mosaic.

03:09 Troy Hunt: Oh, Mosaic.

03:10 Michael Kennedy: Yes, when Mosaic came out. And I was just like oh, my gosh, like the entire world opened up. It was such an amazing time. And it seemed like... It's funny because technology was so much more limited and what you can do on the web now is way more impressive, but the world seemed so open back then.

03:27 Troy Hunt: Yeah, yeah. I don't want to reminisce too much and say, oh, it was so awesome back then, 'cause frankly it's so awesome now and it's really a lot more awesome by any reasonable measure.

03:35 Michael Kennedy: Yeah, for sure.

03:36 Troy Hunt: But it was just an exciting time where we're sort of seeing stuff that didn't even resemble the things that we've done before.

03:42 Michael Kennedy: Yeah, I had the same feeling. That was awesome. So now what do you do day-to-day? More stuff on the web? More stuff in security?

03:50 Troy Hunt: Yeah, so, look, it's a bit of both. It's funny, in fact, I've been thinking the last couple days. 'Cause I've had, I don't want to say downtime, but I haven't been rushing somewhere. And I was sort of going, "What am I going to do today?" Like I'm entirely independent so I can do pretty much whatever I want. And that consists of sort of several main things. So I still do a lot of traveling and speaking at conferences. I'm trying to do less of the traveling, just because of how much I've been away. But I'm still doing a lot of that. I'm still doing a lot of workshops. I was down interstate last week doing a workshop. I think I've probably done the same workshop... I must've done it 25 times this year. 25 two-day events.

04:26 Michael Kennedy: What's the title?

04:28 Troy Hunt: The workshop is Hack Yourself First. So I usually go into... Say last week I was into a large financial institution and then another online retailer. And I'll go in there for two days and I'll sit down with usually developers but also regularly project management, or managers, QA people, DBAs, security folks, and go, "Okay, let's spend two days figuring out the mechanics of things like SQL injection and cross-site scripting. Here's how they work. Here's what you do to be resilient to them." And it's very, very sort of hands-on, active sort of stuff. Like it's a huge amount of fun for everyone 'cause we get to go in and break stuff in ways that they've never been able to before. And everyone leaves with this sort of new appreciation of the mechanics of how these attacks work and ends up building better software for it, which is a good win.

05:17 Michael Kennedy: Yeah, it's definitely a great win. And I think sometimes you just have to see these things in action to really appreciate how the code you write is not actually as secure as you think. I mean, even something as sort of old school as SQL injection attacks. Like when you first start writing SQL strings, you're like, "Well, I need to put the variable here, so plus, plus quote thing." You know? How could that be wrong? It just works. But seeing it in action, you totally go, "Oh, my gosh, did we do that? Because if we did that, I can just see how horrible this is going to be."

05:46 Troy Hunt: It's the hands-on, you know. It's like the light bulb moments when people do that. And SQL injection's a great example, because, like you said, it's something that's been around for a long time. And we sort of think of this as an old thing. It's still number one in the OWASP Top 10 Most Critical Web App Security Risks. You know, it's still up there, even in their 2017 revised edition, it's still number one. And when I do the workshop and demonstrate SQL injection, I show things like there's a blog post I refer to written by a guy who's trying to teach people how to do password resets in ASP.NET. And, literally, within the one screen, he's got good, resilient, parametrized SQL statements. And we're looking at this going, "Is this okay? Yeah, it's okay, 'cause you got parametrization." And then like the next line there's an update statement and it's just like in-line concatenated SQL with untrusted data that you can totally own. And then he connects to the database with a privileged account and I said, "Yep, goodnight." Like the whole thing's over. And there's still material, new material out there teaching people that. And here's another fun one. You can go and do, and be careful with this, folks, if you go and do this. You can go to Google and do a Google dork search. So one of these sort of searches that turn up things that aren't really meant to be there. And you can do a search for inurl:.php?id=. And I think about the first half a dozen results have got an integer somewhere in a query string. You load the page up, you put an apostrophe on the end of it. And, bang, there's an internal SQL exception. So, very, very high likelihood of having a SQL injection risk. And this sort of stuff is just absolutely rampant.
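To make the parametrization point concrete, here's a minimal Python sketch using the standard library's sqlite3 module; the table, column, and input are invented for illustration. The first query concatenates untrusted input straight into the SQL text and can be owned; the second binds it as a parameter so it can never change the shape of the query.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.execute("INSERT INTO users (email) VALUES ('alice@example.com')")

user_input = "' OR '1'='1"  # untrusted data, e.g. from a query string

# Vulnerable: the untrusted value becomes part of the SQL text itself,
# so an attacker controls the query structure.
vulnerable_sql = "SELECT * FROM users WHERE email = '" + user_input + "'"
print(conn.execute(vulnerable_sql).fetchall())  # returns every row

# Resilient: the value is bound as a parameter and never parsed as SQL.
safe_sql = "SELECT * FROM users WHERE email = ?"
print(conn.execute(safe_sql, (user_input,)).fetchall())  # returns nothing
```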

07:17 Michael Kennedy: Yeah, that's really scary. Why do you think it still is a problem? I mean, it's 2017, we've got ORMs and ODMs that generally protect us against these things. We've got XKCD's Little Bobby Tables. If that doesn't teach people the lesson, I don't know what will. Why do you think--

07:35 Troy Hunt: It really is. I've got it on a t-shirt, actually, that I used to wear at conferences; people love it.

07:39 Michael Kennedy: Oh, that's wonderful. I'll have to put a link to that in the show notes.

07:42 Troy Hunt: So why does it still happen? I think there are multiple factors. So I just gave one example. People are still creating training material that's vulnerable. And, in fact, one of the reasons why I show this blog post is that I actually left the guy a really nice comment. And I literally said, "Here's some friendly feedback," being very constructive. I gave him this feedback, and the comment's just sitting there, the guy's never replied, and then after that, there's been a whole bunch of people that have chimed in and said, "Thank you, this is very useful." And I'm going, "Didn't you just read, like, the big, long, friendly comment from the Australian guy saying this is really just full of holes, don't do this?" So we see this propagate over and over and over again. And ultimately so many of these risks just fundamentally boil down to the people building the software not being familiar with how something like SQL injection works. And that is just purely a competency issue.

08:33 Michael Kennedy: Yeah, I guess that part of it is, like you kind of hinted at, the copy and paste, Stack Overflow type of thing. Although I suspect that you take a pretty good beating on Stack Overflow. But, you know, this copy and paste sort of thing. And the technology and languages and libraries we use change so quickly that I think a lot of people are just scrambling to keep up to make things work, much less be secure.

08:55 Troy Hunt: So the copy and paste thing is funny. In January last year, I was running the workshop we just spoke about in Norway. And part of this workshop, there's a module on looking at mobile APIs. And one of the guys in my workshop says, "All right, look, I want to look at the way the app for my Nissan Leaf works." Or in the US, you'd say Nee-san, the car. Yeah, so okay, how does my app talk to my car? Because he could control features of the car from the app. Now this is not something we have a problem with in Australia, but apparently in some parts of the world, it is so cold you've got to turn your car on before you get in the car. You know, like turn the heating on. And this is one of the things they have to do in Norway 'cause it's so freaking cold there. So he's like, all right, pulls out his app, and he's figuring it out. And what he discovers is that the only thing that the app needs to know about the car is the car's VIN, the vehicle identification number. Now for folks who may not be familiar with what a VIN is, first of all, this was being used like an API key, it was a secret. Second of all, it's printed in the windscreen of every car. So you could literally walk past a car and it had its API key in the windscreen. And it was worse than that, too, because they're enumerable. So you can just take, in this case, I think we could take about the last five digits and just keep randomizing numbers and finding different cars. And based on what he found in the space of literally sort of single-digit minutes, we discovered that you could control the climate control feature of the car, you could pull back trip history, battery status, all sorts of things, it was just crazy. It shouldn't have happened.

10:25 Michael Kennedy: That's crazy. And if there are any vulnerabilities, you can basically own the car, right? 'Cause you're talking to it.

10:31 Troy Hunt: Yeah, God knows, you know what I mean? And when we say you're talking to the car, let's be clear. You're basically talking to a web server. And not a web server in a car. We can come back to the things people put web servers in. You're talking to a web server on the web, running an API, and that goes back over some proprietary GSM network, or something, to the vehicle itself. So there is a proxy in between.
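A hypothetical sketch of the defensive shape that kind of API should have: the server authenticates the caller and checks vehicle ownership on its own side, so a guessable, publicly visible identifier like the VIN grants nothing by itself. The Flask routes, tokens, and data below are invented purely for illustration and are not Nissan's actual API.

```python
# Hypothetical sketch, not the real vehicle API: the point is that the
# server decides which vehicles a request may touch, based on an
# authenticated session, never on a client-supplied VIN alone.
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Illustrative data: which authenticated account owns which vehicle.
OWNERSHIP = {"user-123": {"VIN0001"}}
API_TOKENS = {"secret-token-abc": "user-123"}  # token -> account id


def current_user():
    token = request.headers.get("Authorization", "").removeprefix("Bearer ")
    user = API_TOKENS.get(token)
    if user is None:
        abort(401)  # no valid credential at all
    return user


@app.route("/vehicle/<vin>/climate", methods=["POST"])
def set_climate(vin):
    user = current_user()
    # Authorization check: the VIN is just a resource name here,
    # not a secret. Knowing it grants nothing.
    if vin not in OWNERSHIP.get(user, set()):
        abort(403)
    return jsonify({"vin": vin, "climate": "on"})
```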

10:51 Michael Kennedy: Yeah, that's not too bad then, I guess.

10:53 Troy Hunt: Well, so here's where it kind of went worse. I disclosed it to Nissan and we had chats on the phone and you know, they're like, "Yeah, we probably should fix this." Yes, you probably should fix this. And then time goes by and they're not fixing it. And eventually we get to like a month after disclosure and they've stopped replying to messages and nothing's happening. So I write about it. And then suddenly they decide it's important. So they take the whole thing offline. And it's offline for about six weeks. And eventually it comes back online and they've got a new app and everything. The funny thing about the app, and this is the relevancy to Stack Overflow and code reuse, is that down the bottom of one of the screens there's this really odd text. And the text is something like: The spirit of Stack Overflow is developers helping developers. And this is in an app for your car. And we're looking at this going, "What are you doing? Why would you put that in there?" And then, of course, we found the Stack Overflow post where they've literally copied and pasted the text from the Stack Overflow post without understanding what it does, and put it in the app that controls features of your car.

11:52 Michael Kennedy: That's incredible.

11:53 Troy Hunt: Honestly, folks, if you Google for this, Google like Nissan Stack Overflow code reuse. And it's just like this is insane. How does this happen? I mean, how does this happen at all, let alone in a car, or software to control a car.

12:06 Michael Kennedy: Yeah, I feel like the companies that have these types of IoT things that are really valuable, you know, not a light bulb, but cars, and humans travel in them at dangerous speeds. They should really be careful about this, right?

12:20 Troy Hunt: Yeah, they probably should. They probably should.

12:23 Michael Kennedy: Maybe you should send 'em an email about that, or a bunch.

12:25 Troy Hunt: Yeah, well, afterwards, they're like "If you find any other stuff, please do send us an email." I've sent you an email! We had phone conversations, you were endorsed! What happened? Anyway.

12:35 Michael Kennedy: Wow. So do you know if any of these car companies run bug bounty-type things?

12:39 Troy Hunt: Yeah, and I know Tesla does. In fact, Tesla runs a bug bounty through Bugcrowd. Bugcrowd is a bug bounty as a service platform started by a mate of mine who's done extremely well moving over to the US and getting funded and doing wonderful things. And bug bounties are becoming absolutely massive now. It's great to see him doing well, but it's great to see this being a big thing in the industry now as well.

13:02 Michael Kennedy: Yeah, I totally think it's a very positive thing. I suspect most people listening know what bug bounties are, but maybe just define it for everyone.

13:08 Troy Hunt: Yeah, so a bug bounty is effectively acknowledging that... Maybe the right way to put this is everyone who has anything online is continually getting free penetration tests. There are always people out there probing away at your things. And a bug bounty is a means of saying to people, "Look, if you find vulnerabilities in our software, if you find things that could be dangerous, like SQL injection, submit them here." And there's usually a formal process, you know, this is the email address to send it to, this is the information we need. Here's how to encrypt your communications. And then depending on the vulnerability, you may be incentivized. So you may actually get anything from a T-shirt to a large amount of money. And this is a really neat way of recognizing that we do have flaws in software. This is the nature of building software. And that if someone finds it, it is actually worth something. And this sort of value proposition, the bug bounty, is that it incentivizes people to report these things responsibly and allows the organization to handle them and fix them before someone actually goes and exploits them to do nasty things. And, obviously, there's some incentivization for that in a monetary sense. And these are becoming really big. So you know we mentioned Tesla. The Pentagon has run a bug bounty. Like these are going really, really mainstream. And there's a lot more to it than just some random hacker sitting in a basement on the other side of the world trying to break into your things as well. They can be very, very carefully managed programs with well-selected testers.

14:29 Michael Kennedy: Yeah, like the Pentagon one, you had to kind of interview and be approved to be a part of it. It wasn't just "Now everybody go forth and attack the site."

14:37 Troy Hunt: You couldn't be Australian either, or anything else that wasn't American, as I understand it.

14:40 Michael Kennedy: Yeah, that's unfortunate.

14:42 Troy Hunt: Well, you know, like it's the freaking Pentagon. Like I kind of get that. And, frankly, just the fact that they ran that and it became such a more mainstream thing that entered into people's psyches in places it just wasn't before, I think is a very positive thing.

14:57 Michael Kennedy: Yeah, I totally agree it's a positive thing. And this money can be pretty large, like you said, it could be a t-shirt, but it could be $100,000. And that incentive kind of has to be there. Like if there's not a bug bounty, there's probably some other bad actor who's willing to pay $50,000 for a good 0-day, right?

15:11 Troy Hunt: That's the thing, and then we sort of get into this interesting space of who is competing for the dollars of the bugs. And, look, some people will argue that there are still actors out there that will pay a lot more than what the organization with the potential vulnerability will. But then at least you sort of get to have a little bit of weight on the other side of the scales in a monetary sense. And, of course, from a legal and an ethical perspective as well, there's always the incentive to try and report things through the formal channels and get money. But, yeah, look, there'll always be nefarious parties willing to pay for this stuff as well.

15:45 Michael Kennedy: Yeah, at least now there's some option to monetize that and make it part of your living and do the right thing.

15:50 Troy Hunt: Exactly.

15:51 Michael Kennedy: Yeah. This portion of Talk Python has been brought to you by Rollbar. One of the frustrating things about being a developer is dealing with errors. Ugh, relying on users to report errors, digging through log files, trying to debug issues or getting millions of alerts just flooding your inbox and ruining your day. With Rollbar's full stack error monitoring, you get the context, insight and control you need to find and fix bugs faster. Adding Rollbar to your Python app is as easy as pip install rollbar. You can start tracking production errors and deployments in eight minutes or less. Are you considering self-hosting tools for security or compliance reasons? Then you really should check out Rollbar's compliant SaaS option. Get advanced security features and meet compliance without the hassle of self-hosting, including HIPAA, ISO 27001, PrivacyShield and more. They'd love to give you a demo. Give Rollbar a try today. Go to talkpython.fm/rollbar and check 'em out. Speaking of looking at breaches and disclosure, you're running a pretty amazing website called Have I Been Pwned that has really grown in terms of awareness for when these breaches happen, right? Tell everyone about Have I Been Pwned.

17:04 Troy Hunt: Yeah, well, Have I Been Pwned has been running almost for four years now. I've got to think about a birthday thing to do, actually, 'cause it will be, I think, in late November, early December will be the four-year anniversary. And I started that after the Adobe data breach. So Adobe was about 150 million records from memory. And at that time I'd been doing some analysis across sort of different data breaches, looking for patterns. Are people using the same password? Well, guess what? Yeah, they are. Are people appearing in multiple different incidents? You know, like the same person. And I was sort of seeing stuff. I thought, I find this really interesting. I reckon other people would find it interesting if they could see where they were exposed as well. And particularly see things like you were exposed in these multiple places. So that was sort of the genesis for the project. And I built that out with, I think it was a total of 155 million records, so it was basically just about all Adobe when I launched it. And I was like, wow, this is massive. And I was actually... I built it all out on Microsoft's Azure platform as well. And I really wanted to spend time using Azure in anger. So, you know, how can I do something that actually uses a lot of storage? And how can I use things like the table storage construct instead of relational databases to save money and make it go faster and all the rest of it. And that was sort of a bit of a hobby project, too. And now sort of fast forward to nearly four years later and there's 4.8 billion accounts in there, which I still find to be a bit of an unfathomable number.

18:31 Michael Kennedy: That's really incredible, incredible. That's more than half the population of the earth. I mean, I realize it's not one to one person to account, but still, that's staggering.

18:41 Troy Hunt: It is crazy. And there are just sort of all sorts of metrics about the project that just exceeded any sort of form of expectation. So that's one of them. The visitor stats: an average day is somewhere between 50,000 and 70,000 people coming to the site. A big day is seven figures. So I think I had about 2.8 million in one day the other day.

18:59 Michael Kennedy: What was the driving factor behind that?

19:01 Troy Hunt: That's a good question, because there's always a trigger, right? So to sort of deviate from that baseline by an order of multiples, in that case, there was the spambot. So it was called the Onliner Spambot. And a security researcher in France discovered the spambot, where, unfortunately or fortunately, depending on perspective, whoever was running the spambot didn't do a great job of actually securing their data and left 711 million email addresses exposed. So he managed to grab all this data and sent it over to me. And I went, "Okay, well, let's load 711 million records in here." That's not a breach in the traditional sense, but it is data about individuals redistributed over the internet. Let's load this. And, of course, a huge number of people had an interest in that. And that sort of made a lot of news headlines. And this is the thing that really drives the traffic. It's news headlines. Because every time there's a data breach, I see a spike, because it's on CNN or the BBC, or something that faces the masses and not just the tech audiences. So it's sort of really interesting to see how broadly appealing the project has been.

20:05 Michael Kennedy: Yeah, I think it's a great service. And I've gotten probably five or six emails from your service saying: you've appeared in some breach or other. It usually doesn't freak me out too much 'cause I use 1Password and my passwords are 40 characters long, and they're unique per site.

20:21 Troy Hunt: So I hear a lot from people saying they hate getting email from me, so I apologize to everyone who's had an email from me.

20:28 Michael Kennedy: Don't hate the messenger, right? Come on.

20:30 Troy Hunt: Well, exactly, right?

20:31 Michael Kennedy: Yeah, cool. So the one that really stands out to me, that you were pretty well highlighted in, and that actually had some interesting ethical components to it, was the Addison Mashley? Massley? What?

20:44 Troy Hunt: Just make it look like you don't know the name, that is a great defense, well done.

20:47 Michael Kennedy: Thank you.

20:49 Troy Hunt: "Honestly, darling, I've never heard of this site before. I have no idea why my email address was there."

20:53 Michael Kennedy: So this is like an adult friend finder, let's have an affair type of thing. And it got hacked. And because not just people's passwords and possibly reused passwords were leaked, but just the fact that you even had an account there, it was a pretty bad data breach. You had to even be a little careful about letting people know or search for that data, right?

21:13 Troy Hunt: Yeah, well, look, I mean, Ashley Madison, just for context for everyone. So this was a data breach that happened in July 2015. And hackers said, hackers, hacker, we don't know, we still don't know who it was, said, "Look, we've got the Ashley Madison data. These guys, we don't agree with their business model." And, okay, many people took an ethical stance on the whole thing, because this is not like a dating site, it's not like an outright adult site. Literally the whole MO was helping people have affairs. The strapline used to be: Life is short, have an affair. So, I guess, trying to mainstream adultery. And, obviously, a lot of people took issue with that ethically. So whoever it was that got their data said, "Look, we got the data. These guys have been bad. We don't like the business model. If they don't shut down in the next month, we're going to dump the data publicly." And this was kind of interesting on many levels. I mean, we do see threats like this. But often we see threats that are more financially motivated, where we'll see threats that say, you know, "Give us Bitcoin or we'll dump your data publicly." But, in this case, they just obviously took an ethical dislike. And it was kind of interesting as well, because part of the original reasoning was what Ashley Madison did, and this was a really dick move in anyone's books, is they said, "Look, you sign up to this website. If you want to remove your data, you've got to pay for this full delete service." And, from memory, it was about $19 to delete your data.

22:38 Michael Kennedy: That's evil.

22:38 Troy Hunt: What would happen is, and I had literally hundreds of conversations with people who were on the site afterwards, I was learning there were a lot of people who'd maybe had a couple too many red wines one night, gone, "Hey, this might be fun," and signed up. And then gone, "Oh, this wasn't a good idea." And they'd forget about it. There were a lot of people in there who were single. There were a lot of people, look, again, regardless of your ethical position on it, who were consenting adults. And they decided they wanted to use the site. And then if they wanted to get off there, they were having to pay money, which is just, it's just a reprehensible move. But, obviously, they thought they could monetize that. But, anyway, one of the things the hackers said is that when you pay for your full delete, you're not actually getting deleted. And what we discovered after we actually saw the data is that paying the money would null out your record in the membership table. And then there would be a payment record with a foreign key back to the membership table. And the payment record would have your personal data on it. So it's like, "Yes, we removed you from the database; by the way, you just created a payment record and your payment record has your personal info on it." So they took an ethical dislike to it. And, in a way, it was kind of fortunate the way it panned out, in that we had a month's notice between them saying we're going to dump this and it actually happening. And I had time to think about how to handle this in Have I Been Pwned, 'cause it turned out to be more than 30 million records, so it was a very large breach. And I kind of went, look, this is going to be valuable data if it does turn up. And I want to make it searchable, but by the same token, this is the sort of thing where I don't want to be the vector through which, let's say, a rightly jealous wife discovers that her husband has been on the site. And, incidentally, this was a very, very heavily male-dominated service, which probably comes as no surprise to anyone. And of the women that were on there, a huge number of them had signed up from IP address 127.0.0.1, which is a little bit suspicious.
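A tiny sketch of the "full delete" anti-pattern Troy describes above, with an invented schema: nulling the membership row accomplishes little while a payment row keyed back to it still carries the personal data. A real delete has to purge or anonymize every table that holds the PII.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE members  (id INTEGER PRIMARY KEY, email TEXT);
    CREATE TABLE payments (id INTEGER PRIMARY KEY, member_id INTEGER,
                           email TEXT, amount REAL,
                           FOREIGN KEY (member_id) REFERENCES members(id));
    INSERT INTO members  VALUES (1, 'someone@example.com');
    INSERT INTO payments VALUES (1, 1, 'someone@example.com', 19.00);
""")

# The "full delete": null out the membership record only.
conn.execute("UPDATE members SET email = NULL WHERE id = 1")

# The personal data is still sitting in the payment record.
print(conn.execute(
    "SELECT email FROM payments WHERE member_id = 1").fetchone())
# -> ('someone@example.com',)

# An actual delete has to clear every table holding the PII as well.
conn.execute("UPDATE payments SET email = NULL WHERE member_id = 1")
```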

24:30 Michael Kennedy: That is a tiny bit suspicious, yes.

24:33 Troy Hunt: Yes, and actually, this is one of the things we learned, they were actually fembots. I always think Austin Powers every time I hear fembots. Don't know if it was exactly like that. However, you've got these accounts on there which are effectively bots, which are trying to engage with men, because the more you can engage with them, the more you can get them to pay, and all sorts of shady stuff. So, anyway, eventually the data does get dumped. And by virtue of having had time to think about it, I had decided that I'd introduce the concept of a sensitive breach. And a sensitive breach means the data still goes in Have I Been Pwned. But you can't publicly, anonymously search for someone else. So you've got to actually subscribe to the notification service, which is free. But the reason I use that mechanism is because it sends you a verification email with a unique link. You click the unique link. And then it says, okay, now I know that you control this email address, we'll show you everything. So the public ones and the sensitive ones.
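A minimal sketch of that verification-link mechanism, using only the Python standard library; the storage, URL, and email sending are stubbed out and the names are invented, but the shape is the same: an unguessable one-time token proves control of the address before any sensitive results are shown.

```python
import secrets

# token -> email address awaiting verification (a real service would
# persist this with an expiry instead of using an in-memory dict)
pending = {}
verified = set()


def send_email(to, body):
    print(f"(stub) emailing {to}: {body}")


def start_subscription(email):
    token = secrets.token_urlsafe(32)          # unguessable one-time token
    pending[token] = email
    link = f"https://example.com/verify?token={token}"
    send_email(email, f"Click to verify your subscription: {link}")


def verify(token):
    email = pending.pop(token, None)
    if email is None:
        return False                           # unknown or already-used token
    verified.add(email)                        # sensitive breaches may now be shown
    return True


start_subscription("someone@example.com")
```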

25:27 Michael Kennedy: Pretty legit way to handle it. Definitely the way the website itself handled the breach didn't sound like it went well. Let's talk about one more breach before we move on to some developer topics. One that you did write about was Disqus, the commenting service.

25:42 Troy Hunt: Mm, yeah.

25:45 Michael Kennedy: Which actually is at the bottom of, it's going to be at the bottom of your show page, right at talkpython.fm/136, I suspect you could find Disqus right there. And you said that these guys, while something happened in terms of a data breach, they handled it right.

26:00 Troy Hunt: Yeah, and you know, it's at the bottom of my blog as well. And it's also my data in the Disqus data breach. So I think that the macro picture here is that when data breaches happen there's a really broad range of reactions from the organizations involved. And they range from, on the one hand, being extremely difficult to get in touch with, sometimes denying it, often downplaying the severity, sometimes covering it up, like knowing there was a breach but not telling people, 'cause they're worried about reputation damage, and doing all the sorts of crappy things that you might expect an organization at the receiving end of one of these things might do. And then, on the other hand, there's sort of, this is exactly the way to handle it. And Disqus was very much on that right-hand side. And there are really only a couple of organizations I've dealt with that have been down there. And the things that Disqus did really well, one of them is the speed. So everyone would've seen Equifax in the news only last month. And Equifax took about five weeks after learning of the data breach to advise everyone.

27:02 Michael Kennedy: They needed to sell their stock.

27:03 Troy Hunt: Well, geez, oh, man. Allegedly. Well, they did sell stock allegedly because of the breach. You got to be--

27:11 Michael Kennedy: Right, right, right.

27:12 Troy Hunt: People get very litigious in America, I know this. Actually, I've heard this. Fortunately, I don't know this through personal experience. But the thing with Disqus is that I got in touch with them, and actually, just to be clear as well, the context with Disqus is someone popped up and gave me seven different data breaches. And these were things I had never seen anywhere before. It was things like ReverbNation, Kickstarter, Bitly, and all three of these had previously disclosed, where they'd said, "Hey, we've been hacked," but I'd never seen any data for them. And then suddenly they all turn up in this one place and they're all legit. And then one of them, as well, was this Disqus one. And I'm trying to find references to a data breach. I can't find anything. And there's 17 million email addresses in there. So, fortunately, I had a contact there, someone I had been chatting to only a couple months earlier on another topic. And I was like, "I think I have your data. And you probably should know about this." And from the point where I sent that first email to when they had made a public statement and had already reset impacted passwords as well, it was, I think, 23 hours and 43 minutes. It was just under a day. It's like, okay--

28:24 Michael Kennedy: That's awesome.

28:24 Troy Hunt: That is awesome. Like, "You guys have turned this around in less than a day." And when I spoke to them, so we jumped on the phone as well and had a good chat, they were sort of really, I mean, obviously, they weren't happy about the situation, no one's going to be. But they sort of understood that this is the reality of operating online today, where these things do happen very regularly. They prepared communications which were transparent, candid, honest. They worked with the media as well. So one of the things I was sort of impressing on them is I think it's really important to engage with the media, 'cause there's going to be stories on this. And you can either ensure that those stories are representing your point of view and your version of events, or you can say, "Look, we're too busy," or not respond to the media and let them form their own opinions. So, they just did all of that right, and the only negative feedback I saw out of any of this was people saying, well, how come it took you four years, because apparently it actually happened in late 2013. Well, they didn't know. We still don't know; I haven't seen anything as to how it happened or why it took this long to surface. But I guess, based on the hand that they were dealt a couple of weeks ago, the way they handled it was just exemplary.

29:33 Michael Kennedy: One of the things that's always kind of in the back of my mind, I run a number of websites, have a couple of servers, some backend servers, I talk to those servers, you know, how would I know if I've been hacked?

29:44 Troy Hunt: Well, as we were just saying, you might not, and this is sort of part of the problem. We've got to remember as well that the Disqus situation is far from being exceptional, so just off the top of my head, last year there was LinkedIn, Dropbox, MySpace, Last.fm, Tumblr, many others that were in similar boats, where they had had breaches years ago, and they're only just discovering it. And incidentally, that big stash of data that had Bitly, Kickstarter, and all the other ones in it, there was another one, and I'm thinking very carefully before I say these words 'cause two of them haven't been disclosed yet. But there's another one which was a service called We Heart It, which seems a little bit like Pinterest. That was in there. And there were two others, which we're still waiting on; one of them should be sending their message out any moment now, which is millions of accounts again. And then a final one, which I'm still trying to get in touch with some people about. Not everyone replies to an email when you say, "Hey, you've been breached."

30:34 Michael Kennedy: They might pretend it went to spam and just go dark, right?

30:37 Troy Hunt: Yeah, well, they won't be able to once I publish the data on Have I Been Pwned, but I'd really like to give them the opportunity to control the messaging themselves first. But what we've got to remember with all these cases is they happened years ago, very often with entirely different people in the organization as well, very often with different infrastructure, different code bases. Very often, you're not even going to have logs that go back, say, four years. You know, do you really keep web server logs that long? A lot of organizations don't. So it can be enormously difficult to know when an incident has actually happened. I guess, to your point about how do you know, one of the things that sort of continues to strike me is that there are all of these incidents that have already happened that we don't know about as the public, and then a subset of those, probably a very large subset, where the organization itself doesn't know about it at all. So we're yet to see so much stuff actually come out of the woodwork that we just haven't even begun to conceive of yet.

31:33 Michael Kennedy: Yeah that's kind of daunting.

31:35 Troy Hunt: It's sobering isn't it, you know?

31:36 Michael Kennedy: Yeah, it sure is. So it sounds to me like a lot of the time, this becomes something that we're aware of because the data is discovered. You're like, oh my gosh, these accounts are all coming from this one place. There might be people that use, like, their email address plus an alias, like my-email-address+linkedin@gmail.com. And you're like, well, I only used the plus-LinkedIn one at LinkedIn, and it's out in this pastebin or something, right?

32:00 Troy Hunt: Well when people do that, so when they use that sort of aliasing pattern where they have this sort of plus after the alias. Or when they have their own domain, and they create custom aliases for every service.

32:11 Michael Kennedy: Right, LinkedIn@mydomain, yeah.

32:13 Troy Hunt: Yeah, right, that does actually help. In fact, one of the ones I'm going through a disclosure with now, and this is sort of the last out of the seven that were in this stash that I spoke about, it wasn't immediately clear where all the data was from. It's labeled as one service, but I was actually checking with Have I Been Pwned subscribers, saying, hey, look, you're in here, have you used this service? And a bunch of them were saying, no, I've got no idea what it is. And it was only when I started going through, looking at things like the aliases on email addresses, that it was like, oh, okay, I can actually see this other thing now, and I was able to kind of join the dots and go, okay, well, I can see that this is actually from multiple different sources. I think it's actually from two different sources.
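As a rough illustration of the plus-addressing pattern being discussed, here's a small Python helper that pulls the "+label" out of an aliased address, which is the kind of signal that lets you join those dots; it's a simplification and ignores provider-specific aliasing rules.

```python
def alias_label(address):
    """Return the '+label' part of a plus-addressed email, if any.

    e.g. 'me+linkedin@gmail.com' -> 'linkedin'
    """
    local, _, domain = address.partition("@")
    if "+" not in local or not domain:
        return None
    return local.split("+", 1)[1].lower()


emails = ["me+linkedin@gmail.com", "someone@example.com", "a+shop-x@example.org"]
print([alias_label(e) for e in emails])   # ['linkedin', None, 'shop-x']
```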

32:51 Michael Kennedy: It does seem like that leaves a little bit of a bread crumb. I don't do it that often, but sometimes I do.

32:56 Troy Hunt: Yeah, now, it can be really useful. But the other thing that I hear people saying is, look, I use this aliasing pattern because then I can see when my data is, say, sold or redistributed, or something like that; I can see where the source was. The problem then, of course, is what are you going to do about it? Like we know that this happens, and if you go back to the organization and say, hey, I signed up on your service, and I signed up with my name plus your service name at gmail.com, and I've just seen, I've just gotten spam trying to sell me Viagra. Okay, now what? You know, there's not really anything that's actionable out of it either.

33:29 Michael Kennedy: Yeah, I've heard that, and actually I've been on both sides of that story, not with stuff that I'm doing these days but previous projects. And you know, how much weight should you put in that? Like on one hand, I feel like, yeah, okay, if somebody really got all the data out of the database, they would have those things and they could email people. But it seems like a weird thing to do, to just spam folks. On the other hand, if somebody breaks into that person's email and just harvests every email address they see and starts sending to them, maybe it's their account that got broken into and it just happened to be that address, to kind of loop back.

33:59 Troy Hunt: Well, you know this is also one of the great mysteries, right, where there are so many different ways that these things can go down, and very, very often you just simply can't get to the bottom of it. I mean, people say to me all the time, "I'm getting spam, where's it come from?" I don't know, it's like, you put your email address out all over the place. It could be from anywhere.

34:16 Michael Kennedy: Yeah, that's right. Get a good spam filter and just suck it up, I guess.

34:19 Troy Hunt: Exactly.

34:22 Michael Kennedy: This portion of Talk Python To Me is brought to you by GoCD from Thoughtworks. These are the people that literally invented continuous integration. They have a great open source, on-prem CI/CD server called GoCD, but rather than tell you about the server this week, I want to share a course they created for people who are new to continuous integration. Check out their course called Continuous Delivery 101 from GoCD at gocd.org/101. This video series covers the history of continuous delivery, concepts, best practices, how to get started, and popular tools. You'll gain a holistic view of continuous delivery and a deeper understanding and appreciation of the critical concepts. Be sure to try their course at gocd.org/101 and let them know that you appreciate their sponsorship of Talk Python. So, speaking to the web developers out there, what lessons have you learned from running Have I Been Pwned that you'd like to share with that audience?

35:17 Troy Hunt: There are a lot of things, you know. So, thinking about it from a pure dev perspective, going back to what I was saying earlier, one of the main reasons I wanted to do this is because I really wanted to do something in anger on Azure. And I've had a heap of fun, to be honest, building this service out on Azure and sort of experiencing the whole cloud-scale thing and commoditized pricing and all the other sort of promises of the cloud. And there's a heap of learning to that, both good and bad. I mean, some of the good stuff has been things like the ability to have auto scale, so actually provisioning more infrastructure on evidence of exceeding your existing infrastructure resources, which is fantastic. And I've spoken many times at events about how to run a project like this on a coffee budget. So how do I run a service with 4.8 billion records and sometimes millions of visitors a day for what you'd spend on coffee? And I don't always manage to do that, but I usually get pretty close. And things like really managing your scale very carefully have been great. Things like choosing the right data storage constructs for your use case have been great as well. So, particularly in the Microsoft world, there's been this traditional view of, you're going to store data? You've got to have a SQL database. And a SQL database is a behemoth of a thing, right? I mean it is a big, big thing. And it's expensive and it's-

36:41 Michael Kennedy: Especially when we have 4-point-something billion records.

36:44 Troy Hunt: Well, hey, look, if I got to that point, if I was actually trying to put that data in a relational database like SQL Server, I mean, the cost would just be astronomical. But it would also be really unnecessary, because the patterns with which the data is used just don't predispose it to needing a relational database. So one of the best things I ever did was to use Azure's table storage, which is basically just a key-value store. And I just partitioned it in a way that worked really well. So those 4.8 billion records are in there. You can create a partition key and then a row key. So my partition keys are the domain, so say gmail.com, and then the row key is the alias. And what that allowed me to do is super, super, super fast lookups, because when you're searching, you're literally searching by domain and alias. So a very specific partition and a very specific row key. And it also made it really easy to do entire domain-wide reports, because I just pull the entire partition. So that works awesomely, and that is still the single best decision I've ever made, I'd say, in terms of the architecture. So those 4.8 billion records, which on disk is some tens of gigabytes, they actually don't report the exact size, and it scales infinitely. It's platform as a service, so I'm never reaching scale capacity on the storage tier, and it usually returns records within, sort of, a single-digit millisecond range, and that costs me about $40 a month.
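For readers who want to picture that lookup pattern, here's roughly what a partition-key/row-key point read and a single-partition scan look like in Python with the azure-data-tables package; the connection string, table name, and keys are placeholders, and Have I Been Pwned itself is built on .NET, so this is a sketch of the pattern Troy describes rather than his actual code.

```python
from azure.data.tables import TableClient

# Placeholder connection string and table name.
table = TableClient.from_connection_string(
    conn_str="<storage-connection-string>", table_name="breachedaccounts"
)

# Point lookup: partition key = email domain, row key = alias.
# A single-entity read like this is why searches are so fast and cheap.
entity = table.get_entity(partition_key="gmail.com", row_key="someone")
print(entity)

# Domain-wide report: scan just that one partition.
for row in table.query_entities("PartitionKey eq 'gmail.com'"):
    print(row["RowKey"])
```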

38:06 Michael Kennedy: Awesome.

38:07 Troy Hunt: Yeah, which is just like, how cool is this? It just rocks. One of the other really, really big things I've learned is I started using Cloudflare for Have I Been Pwned about a year ago. And look, I've been using them on things like my blog and a couple of other little projects, and it was kind of cool for that, in that you get HTTPS for free and a few other little sort of bits and pieces that make things like my blog run a lot better. But it made a massive difference to Have I Been Pwned, and I originally did it because I was getting DDoSed and that wasn't fun. And Cloudflare put a stop to that. But then it became really, really awesome because you have things like a firewall that you can control programmatically. So one of the reasons I put it in place was the API I have. I introduced a rate limit, because I was seeing some fairly nefarious behavior against it. And I thought, I'll just put a rate limit on it, and if you exceed the rate limit I'll return a 429 Too Many Requests and say you can retry after two seconds. And they'll see that and they'll stop. And no, they don't. Like they just kept hammering it. And one of the pennies that dropped was that when you expose that sort of origin server to the world, you have to deal with everything there is. So you have to deal with every incoming request on that same infrastructure which is actually trying to serve legitimate requests. And when you put a service like Cloudflare in front, suddenly you have this other layer in front where you can start to programmatically exclude nasty stuff and actually free up the underlying resources to do the things they're meant to do. So I've got some great rate limiting implementations. I've got a really neat model I've written about before using Azure Functions, where even when I see behavior that's slightly nefarious on the website itself, I just drop in a JavaScript challenge rule on the Cloudflare edge node, so that if you go to the site it just makes sure you're in a browser and you can't automate it with an API. And then you basically get a 24-hour time out and then you can try again. So it's been awesome for that. It's also been super awesome for reducing my costs, because I cache the bejesus out of this thing. So Cloudflare has got 118 edge nodes as of today, around the world, and everything from the front page to the FAQs to every single image and JavaScript file and CSS file is served from those edge nodes. So the actual traffic that comes through to the site is usually just API requests and a couple of other dynamic things. So it's really, really dramatically reduced my costs and the frequency with which I need to scale my infrastructure.
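A minimal sketch of the kind of per-client rate limit and 429 response being described, using Flask and an in-memory counter; the route, window, and header handling are illustrative. A real deployment would share state in something like Redis and, as Troy points out, ideally enforce the limit at the edge rather than on the origin.

```python
import time
from collections import defaultdict
from flask import Flask, jsonify, request

app = Flask(__name__)

WINDOW_SECONDS = 2          # allow at most one request per client per window
last_seen = defaultdict(float)


@app.route("/api/breachedaccount/<account>")
def breached_account(account):
    # Prefer the client IP Cloudflare forwards, fall back to the socket peer.
    client = request.headers.get("CF-Connecting-IP", request.remote_addr)
    now = time.monotonic()
    if now - last_seen[client] < WINDOW_SECONDS:
        resp = jsonify(message="Rate limit exceeded, retry shortly")
        resp.status_code = 429                      # Too Many Requests
        resp.headers["Retry-After"] = str(WINDOW_SECONDS)
        return resp
    last_seen[client] = now
    return jsonify(account=account, breaches=[])    # placeholder payload
```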

40:33 Michael Kennedy: Now that's really cool. And Cloudflare has actually been in the news recently, just a couple of weeks ago, for announcing basically unlimited DDoS protection.

40:42 Troy Hunt: Yeah.

40:43 Michael Kennedy: So, yeah, that's cool.

40:44 Troy Hunt: It's the world we live in today, isn't it? And look, that was sort of their traditional thing, right? Like they made their name out of DDoS protection. But they do so much more now because they can sit on the wire and intercept that traffic, and that spins some people out as well. And if you are listening to this and you get spun out by it, I've got a blog post about security absolutism. So if you google my name and security absolutism you sort of see me trying to put things in perspective. But it means you can do stuff like, all these websites that are now going HTTPS because they're being forced down this route, you can go HTTPS for free via Cloudflare within about five minutes. And not only that, but they can also do things like add an HSTS header, because when you can intercept the traffic you can add headers. They can rewrite HTTP references to HTTPS. They can 301 all of your insecure requests over to secure requests. And they can do all this stuff because they're sitting there controlling the traffic. And that just makes a huge mountain of sense for many, many good reasons.
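To make the header and redirect behavior concrete, here's a small Flask sketch of what a reverse proxy like Cloudflare is effectively doing on your behalf: 301 insecure requests over to HTTPS and attach an HSTS header to responses. The max-age value and proxy-header handling are illustrative.

```python
from flask import Flask, redirect, request

app = Flask(__name__)


@app.before_request
def force_https():
    # Behind a proxy, the original scheme usually arrives in X-Forwarded-Proto.
    scheme = request.headers.get("X-Forwarded-Proto", request.scheme)
    if scheme != "https":
        return redirect(request.url.replace("http://", "https://", 1), code=301)


@app.after_request
def add_hsts(resp):
    # Tell browsers to use HTTPS for this host for the next year.
    resp.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
    return resp


@app.route("/")
def index():
    return "hello"
```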

41:41 Michael Kennedy: Yeah, that's cool and you don't have to worry about it.

41:43 Troy Hunt: Yeah, yeah, it is like literally a turnkey thing.

41:45 Michael Kennedy: That's cool. Yeah, I don't use it but I've considered adding it and it seems like it might be a good idea. I should be quiet. Somebody might try to attack my-

41:54 Troy Hunt: No, you know what? And it is like a five-minute job, too, and people go, ah yeah, but you can just go to Let's Encrypt and get a certificate. And they're right. And Let's Encrypt is absolutely awesome, but it is just certificates. And there is so much more that you can do with a reverse proxy like Cloudflare that, once you actually use it in anger on a large-scale site, you go, well, how would I ever not? You know?

42:15 Michael Kennedy: Right.

42:16 Troy Hunt: And even if it's a small site or static blog or something, it still makes an enormous amount of sense because of stuff like the caching.

42:21 Michael Kennedy: Cool, yeah. It definitely sounds like it's worth checking out. So, one thing I wanted to talk to you about is WannaCry, which was a really sad ransomware thing that went around just encrypting all the things, and it took out a bunch of places. The most notable one, I guess, was the National Health Service in the UK. That's right. But it also took out things like Maersk, FedEx, a bunch of places, right? WannaCry seems to me like a real lesson in just being vigilant and patching all of your stuff and keeping it up to date. But there were tons of things that were way out of date. What do you think some of the lessons are from WannaCry?

43:02 Troy Hunt: Well, just one thing on that, actually. That one was NotPetya. So NotPetya came, I think, about a month or six weeks after WannaCry. Wasn't that long.

43:11 Michael Kennedy: Right, right, they were similar times. Yeah, it was similar, okay.

43:14 Troy Hunt: But yeah, yeah it is similar times. And look, it's still ransomware. And look, you don't want either.

43:18 Michael Kennedy: All right then.

43:20 Troy Hunt: The thing that was really interesting about WannaCry is the sort of sequence of events leading up to it. Without having the exact dates in front of me, from memory, WannaCry hit us in May. And back in March we had seen Microsoft say, look, there are some critical patches. You really should take these critical patches.

43:37 Michael Kennedy: And they didn't really elaborate why they knew they were so important, so timely. But yeah, maybe you should really just do this right now.

43:43 Troy Hunt: Exactly. And then a month later we saw this sort of Shadow Brokers dump. So this collective that has collected themselves a bunch of zero-days. And one of the vulnerabilities in there was this EternalBlue vulnerability, which exploited SMB, which we'd normally use for sort of connecting file systems and sharing information across them. And the problem there was that you could remotely connect to a machine with a vulnerable SMB implementation and run remote code on it, which is really nasty stuff. Okay, so now we're saying, all right, a month ago Microsoft said patch your things. Now we know why it was important. And then another whole month went by and then WannaCry hit. So by the time someone got hit with WannaCry, which was exploiting EternalBlue, two months earlier they knew it was a big thing, a month earlier we knew why it was a big thing, and now you still haven't patched your things? And really the lesson out of this was around patching, and this was just such a poignant example of why it was so important. Because this was devastating. I mean, it hit particularly the NHS, but it hit everything from German trains through to other services in other parts of Europe in particular, really, really hard. And what sort of stunned me after that is that a lot of the stuff I was writing and talking to people about was this whole patching cycle, where there were still people going, I disabled Windows updates. And there are literally tutorials out there on how to disable Windows updates, because maybe you'll get a bad one. And I would have people sort of justifying turning it off. They're like, well, I don't like it because sometimes you go to shut down your PC and it says you've got to wait while updates install, and once I was about to get on an airplane and I couldn't close my laptop. I'm just like, that's your reason? But actually my favorite one was a bunch of people would say, I keep installing all these stupid updates and I've never even had a virus. I was like, well, yeah, but that's why you haven't, because you installed the updates. You know, it's certainly an important part of it.

45:38 Michael Kennedy: Yeah.

45:39 Troy Hunt: To have that happen and then have NotPetya occur just after that, which, again, was exploiting a number of vulnerabilities. But one of them was an unpatched, or rather a patched vulnerability which people had left unpatched and then got exploited.

45:51 Michael Kennedy: Right. If they didn't learn from the first two times around, they should have really learned from WannaCry.

45:55 Troy Hunt: I know. Exactly. Anyway.

45:57 Michael Kennedy: The reason I bring this up is not to just talk about these crazy viruses and patching, but I think there's a real tension in organizations, and the larger the organization, the greater the diffusion of responsibility in this. To say, look, there's this system that's running our invoices. Nobody knows how really to upgrade it and nobody wants to be the one to take the responsibility of patching it, because if it goes down, their weekend is toast. We're just going to leave it. Not my problem. Someone else's problem. Right? How do you think we address that?

46:30 Troy Hunt: It's a good observation, and we've got to be fair here that we don't sort of overly trivialize the complexity that can be involved in actually patching these things. I mean, think about the NHS for a moment and think about what hospitals run and some of the systems they have. Think about an MRI machine. You know? Imagine actually trying to patch that thing. And look, there's a bunch of them probably sitting out there still running Windows XP. You know? You don't just, like, whack in the DVD and upgrade to Windows 10. This is not a simple process. So I'm sympathetic to that, and I suspect that what happened in cases like the NHS is there were budget constraints. In fact, we know there are going to be budget constraints in a hospital. There are budget constraints. The IT managers had to make a call between where that budget gets spent, what's patched when, how much money they allocate into different areas. And it would have been a very hard problem. And large enterprises are not exactly just running Windows Update automatically across everything, either. I mean, these things are tested and rolled out through standard operating environments and they're a big thing. And I think, to be honest, the bigger picture here is that when there is a high friction of updates, it makes the uptake very difficult. I mean, one of the things that Apple's done really well is they've made it such a low-friction process. You know? Like, hey, iOS 11 landed the other day. A thing popped up on my screen and I said yes and I never had to worry that it wouldn't work. And if it didn't work, I would have restored from iCloud while I went out and kicked the ball with the kids or something, and I'd come back inside and it'd be done. So, yeah, that's really the model that we'd love to move towards, where these things are low friction and automated. And unfortunately that's just not the reality with a lot of systems today.

48:09 Michael Kennedy: Yeah, absolutely. The more complicated they get, like it could be a library that's compiled into your code that you've got to upgrade, for example, with the Equifax one, right. Yeah, Struts at Equifax, that's right. And they had to recompile and who knows what got deprecated, what had to be changed. Granted, that doesn't excuse them. They really messed that up, but still.

48:34 Troy Hunt: It's a little bit hard trying to throw Equifax a bone. Look, I mean, that is something where, okay, let's just be objective about it. I can see where it's Struts and you have to recompile some pretty serious stuff as well. It comes back to that point about friction, and when there is this high friction of change, well yeah, it's going to be hard to get this stuff done in a timely manner.

48:54 Michael Kennedy: Yeah, absolutely. So we don't have a lot of time left. Let's talk a little bit about IoT. You wrote an interesting blog post, I think it's called, What Would It Look Like If We Put Warnings on IoT Devices Like We Do Cigarette Packets? Tell us about that.

49:11 Troy Hunt: Yeah, yeah, that was fun. So the premise of this is a lot of IoT stuff's got some pretty crazy vulnerabilities in it. Now one of the ones I saw a couple of years ago is these kids' tablets. So imagine if Fisher-Price made an iPad. That's sort of my vision of it, so plastic and colorful and all that sort of thing. And these were made by a company called VTech, a Hong Kong-based toy maker. And VTech had a vulnerability that someone exploited, sucked out millions of parents' and children's records, and the kids' data included things like their names, their birth dates, their genders, their photos, and foreign keys to the parents with the parents' physical addresses as well. So from a stalker's perspective, it was kind of the worst possible thing you could imagine.

49:55 Michael Kennedy: Really creepy, yeah.

49:57 Troy Hunt: I know. It was like super, super creepy. And this was really bad news. Someone did break into their systems, but they had some shockingly bad aspects to their security there. I wrote a blog post at the time. You would log in via a little Flash emulator for the tablets, so okay, there's another hint things are going to be wrong. So the little Flash emulator calls an API, and the API returns a JSON response which contains the actual SQL query executed against the database. Really, really weird stuff like that. And it wouldn't surprise me at all if it was just classic SQL injection that the guy got in with. Anyway, they had a bad time out of it, and they were in the news only a couple of weeks ago because a class action against them didn't succeed, which frankly I agree with. I think people trying to mount a class action against a company where the data was exposed but contained, it was never spread, it was never abused, it was just a bit of a money grab. Whether the regulators should be pinging them is a different story. But anyway, I saw they're in the news, and this one story pointed out that their terms and conditions today effectively say you could get hacked, someone could get your data, and that's your problem, not ours. So I replied to them, or "Twittered" them, and said, look, how about you put this on the front of the pack? And it was a little bit tongue in cheek, right? 'Cause if you put that on the front of the pack no one's going to buy your freakin' tablet. And I was thinking about it later on, and it's almost like it's a dangerous good, you know? And in Australia what we do with dangerous goods like cigarettes is we put great big warning signs on the front of the packs. And they're very graphic, and they tell you how bad stuff can be when it goes wrong.
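To make the class of bug Troy is describing concrete, here is a minimal, hypothetical Python sketch of classic SQL injection alongside the parameterized query that prevents it. This is not VTech's actual code; the table and function names are made up purely for illustration.

```python
# Hypothetical illustration of classic SQL injection, not VTech's actual code.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT, name TEXT)")
conn.execute("INSERT INTO users VALUES ('kid@example.com', 'Some Kid')")

def find_user_vulnerable(email):
    # BAD: user input is pasted straight into the SQL text, so input like
    # "' OR '1'='1" rewrites the query and returns every row in the table.
    query = f"SELECT * FROM users WHERE email = '{email}'"
    return conn.execute(query).fetchall()

def find_user_safe(email):
    # GOOD: a parameterized query keeps the input as data, never as SQL.
    return conn.execute(
        "SELECT * FROM users WHERE email = ?", (email,)
    ).fetchall()

print(find_user_vulnerable("' OR '1'='1"))  # leaks everything
print(find_user_safe("' OR '1'='1"))        # returns nothing
```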

51:32 Michael Kennedy: It could kill you in slow and horrible ways. Let us list them.

51:35 Troy Hunt: I know, and they're super, super graphic in Australia as well. So I thought, all right, why don't we do this with IoT? We'll literally just put these things on the front of these IoT devices like we would a cigarette pack. So I did this blog post with a bunch of mock-ups of what it would look like. And there are warnings on everything from the VTech tablets to teddy bears to automated dog feeders. And that was a rather popular post.

52:02 Michael Kennedy: Yes, warning, you acknowledge and agree that your child's intimate voice recordings may be placed in an unsecured Amazon S3 bucket.

52:09 Troy Hunt: Oh, cloud. Oh, my god.

52:11 Michael Kennedy: Oh, yeah. That's pretty funny. I think it makes a good point and I guess maybe one final thought on IoT. Do you think things are going to get better or is it just going to continue to be a bunch of unpatched badness?

52:22 Troy Hunt: No, of course they're not going to get better. I mean, there's just nothing that predisposes it to getting better. When you look at the factors that are driving the growth of IoT and how we want to be first to market, we want to put internet in things that were never meant to have internet. And if you want to know what I mean by that, just google We-Vibe and we'll leave it at that.

52:42 Michael Kennedy: Wait, wait, wait, wait, wait. Do that in an incognito window.

52:45 Troy Hunt: When you look, you know what? You'll find news stories on it. And let's touch on that in a mature, responsible, adult fashion. So these are toys for adults, and they are internet connected. And what I find fascinating from this, in a very mature way, is that this is digitizing data we never had before. So it actually stored usage data, and while these sorts of devices have been around for eons, the concept of actually recording their use, everything from the modes that were used to the times of day to a reference to the identity of the person using it, we never had that before. And now we have this new class of data which is enormously sensitive by virtue of the fact that we've internet-connected the thing. And the reason you'll find them in the news is that they recently got fined up to $10,000 per owner of the device because they were collecting this data without consent. And you can imagine, for the owners of them, it must be absolutely gut-wrenching to think that this sort of data about them now exists on a server somewhere.

53:44 Michael Kennedy: Definitely, it's quite troubling. Yeah, so, I don't know, I think you're probably right. I think there's going to be a lot of trouble with these types of things, just the incentive to keep things updated is not very good. I did recently get an electric car, and I have a charger for the car that's on the internet which makes me a little nervous, but I logged into it the other day, and it had updated itself within the last two weeks, so maybe the higher-end devices will be a little bit better off.

54:11 Troy Hunt: Yeah, I think the devices that are doing these auto-updates are the way we're going, right? And some of them do it very well. Some of them do it still requiring user action but sort of very low friction. I hope that's going in the right direction, but jeez, there's so many things out there where something has gone fundamentally wrong in order to require the update in the first place, as well.

54:31 Michael Kennedy: Yeah, absolutely. Yeah, I mean, there's the danger that you could break the device, so there's that same type of hesitation to actually put that security patch in that you have with the big companies, right? Like no one wants to break everything they've sold.

54:44 Troy Hunt: I know. There's also that.

54:48 Michael Kennedy: Alright, I think we're just about out of time, and I have so many questions I'd love to chat with you about. Maybe someday we'll do a follow-up and I can ask the other ones. But I do want to give you a chance to talk about your courses, because you've written a ton of Pluralsight courses, and they're actually really, really valuable, I think. So maybe touch on some of the ones you feel are notable for my audience.

55:08 Troy Hunt: I think the one which is most notable at the moment is the one that's still pinned to my Twitter timeline, and I do intend to leave it there for a little bit, which is What Every Developer Must Know About HTTPS. And I love the HTTPS discussion at the moment because there are so many angles to it, and it's such an important time as well. So for example, back in January this year, we saw Chrome and Firefox start to warn anyone if they went to a login form or credit card form served over HTTP, even if it publishes, or posts, rather, to HTTPS. And that was an important change. We're here recording in October, and Chrome 62 has just hit, and in 62 they're gradually enabling a feature that does the same thing for any page with an input form, as soon as you type a character. So you can go to, like, CNN, click on the search link, type any character into the search box, and suddenly you get a big "not secure" warning. So that makes things really, really interesting. That's really starting to push HTTPS. And the sorts of stuff that I talk about in the course are things like, I think a lot of people know you're meant to make sure that all of your embedded resources on the page are loaded over a secure connection, otherwise you lose your padlock and your green text and your "secure" label in Chrome. But there are tricks to help you do this. So there's the upgrade-insecure-requests Content Security Policy directive. You can add it as a response header or you can put it in a meta tag, and if you accidentally embed anything insecurely on the page, then the request automatically gets upgraded to a secure one.
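As a concrete sketch of that trick, here is what sending the upgrade-insecure-requests Content Security Policy from a Python web app might look like. Flask is an arbitrary framework choice for illustration; the directive itself is standard and works the same way anywhere you can set a response header or a meta tag.

```python
# Minimal sketch: send the upgrade-insecure-requests CSP directive so the
# browser silently upgrades any http:// resource on the page to https://.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    # Imagine this page accidentally embeds an insecure resource.
    return "<img src='http://example.com/logo.png'>"

@app.after_request
def add_csp(response):
    # With this header in place, the stray http:// embed above no longer
    # breaks the padlock; the browser fetches it over https:// instead.
    response.headers["Content-Security-Policy"] = "upgrade-insecure-requests"
    return response

# The equivalent meta tag, if you can't change response headers:
# <meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
```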

56:36 Michael Kennedy: Oh, that's nice.

56:37 Troy Hunt: So there are all these neat little tricks like that you can do which make HTTPS so much more easily accessible. And a lot of the time when I hear people saying, "I'm having problems because of this or that or whatever," the answer is usually things like response headers. Like the CSP, and also HSTS, to enforce HTTPS connections for all requests. Those sorts of things are just fantastic, and that's what a lot of the course is about.
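For the HSTS header Troy mentions, here is a similarly hedged sketch, again using Flask purely for illustration; the Strict-Transport-Security header itself is standard and framework-agnostic.

```python
# Minimal sketch of an HSTS (Strict-Transport-Security) response header.
from flask import Flask

app = Flask(__name__)

@app.after_request
def add_hsts(response):
    # The browser remembers to use HTTPS for every request to this host for
    # the next year. Only send this once the whole site genuinely works over
    # HTTPS, and add includeSubDomains only if every subdomain does too.
    response.headers["Strict-Transport-Security"] = (
        "max-age=31536000; includeSubDomains"
    )
    return response
```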

57:01 Michael Kennedy: Okay, cool. I'll definitely link to that and a couple of the other ones in the show notes that I thought were pretty cool. You also have a nice article on getting started in ethical hacking as a career, so that's cool as well. Yeah, I guess we're going to have to leave it there for the topics. I always have two questions I ask at the end, so let me ask them to you now. One of them is, if you were going to write some code, what editor do you open up?

57:23 Troy Hunt: Usually Visual Studio.

57:24 Michael Kennedy: Okay. Right on. And now, normally I ask about libraries in Python. You don't do that much Python, so I'll give you a pass on that one. I'm going to give you a variation on this. So what password manager do you use?

57:37 Troy Hunt: So I'm like you. I use the password manager called 1Password. Now I said the password manager called that 'cause if you just say, "I use one password," people are like, "What is wrong with you?" Well, people should be like--

57:47 Michael Kennedy: "You're leaving it everywhere, what are you thinking?"

57:48 Troy Hunt: I know, I know. So look, I still use that. I'm a little bit agnostic insofar as I think, frankly, so long as you're using a mainstream, professional password manager, like that or LastPass, honestly, that's going to make your life so much better in so many ways. Use one of those things.

58:04 Michael Kennedy: Yeah. It definitely makes your blood pressure stay pretty cool when you find there's been a password breach. You're like, "Eh, it was 40 characters of random. I'm going to reset it, not a big deal."
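For a rough sense of what "40 characters of random" means in practice, here is a small sketch using Python's standard library. This is just an illustration of the idea, not how 1Password or any other manager generates passwords internally.

```python
# Illustrative only: generate a 40-character random password with the
# standard library. A password manager does this (and the storage) for you.
import secrets
import string

alphabet = string.ascii_letters + string.digits + string.punctuation
password = "".join(secrets.choice(alphabet) for _ in range(40))
print(password)
```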

58:14 Troy Hunt: That's it.

58:15 Michael Kennedy: Yeah. Right on. Okay. Well, Troy, thank you so much for being on the show. Any final call to action for everyone listening?

58:20 Troy Hunt: No, look, I mean, if you want to learn any more, go to TroyHunt.com or find me on the Twitters as @TroyHunt.

58:26 Michael Kennedy: Alright. Awesome. I'll be sure to put those links in the show notes, and thanks for being here.

58:29 Troy Hunt: Good on ya, thanks, man.

58:30 Michael Kennedy: Yeah, bye. This has been another episode of Talk Python to Me. Today's guest has been Troy Hunt, and this episode has been brought to you by Rollbar and GoCD. Rollbar takes the pain out of errors. They give you the context and insight you need to quickly locate and fix errors that might have gone unnoticed, until your users complained, of course. And as Talk Python to Me listeners, track a ridiculous number of errors for free at rollbar.com/talkpythontome. GoCD is the on-premise, open source continuous delivery server. Want to improve your deployment workflow but keep your code and builds in-house? Check out GoCD at talkpython.fm/gocd and take control over your process. Are you or a colleague trying to learn Python? Have you tried books and videos that just left you bored by covering topics point by point? Well, check out my online course, Python Jumpstart by Building Ten Apps, at talkpython.fm/course to experience a more engaging way to learn Python. And if you're looking for something a little more advanced, try my Write Pythonic Code course at talkpython.fm/pythonic. Be sure to subscribe to the show. Open your favorite podcatcher and search for Python. We should be right at the top. You can also find the iTunes feed at /itunes, the Google Play feed at /play, and the direct RSS feed at /rss on talkpython.fm. This is your host, Michael Kennedy. Thanks so much for listening. I really appreciate it. Now get out there and write some Python code!
