
#55: How our engineering environments are killing diversity (and how we can fix it) Transcript

Recorded on Wednesday, Apr 20, 2016.

00:00 In the software field, we pride ourselves on fairness, openness, and the fact that our workplaces

00:04 are largely meritocracies. And compared to other environments, I would say this is certainly true.

00:09 It's one of the reasons I love being a developer. And yet, if we look at programming jobs in Silicon

00:14 Valley, you'll see that over 85% of them are filled by men, which means that less than 15%

00:19 are filled by women. If we look at it from a race perspective, it's even more bleak.

00:24 Among the major tech companies like Google, Twitter, and LinkedIn, you'll see that over 90%

00:29 of the employees are either white or Asian. Black and Hispanic employees combined make up less than 4%

00:34 of the total employees, and it's probably an even smaller percent among developers within those

00:39 companies. Given that they represent 28% of the U.S. population, this means that there are seven times

00:45 fewer black and Hispanic developers than there should be. Folks, this is wrong. We should all be doing what

00:51 we can to help improve this balance, as it's not just the right thing to do, but we individually and as an

00:58 industry will be better for it. This week, you'll meet Kate Heddleston, who gave an excellent talk at

01:03 PyCon 2015 entitled, How Our Engineering Environments Are Killing Diversity and How We Can Fix It. What's

01:08 great about this is that it comes with ideas for improving things. I think you'll enjoy this

01:13 conversation. I know that I did. This is Talk Python to Me, episode 55, recorded April 20th, 2016.

01:24 Welcome to Talk Python to Me, a weekly podcast on Python, the language, the libraries, the ecosystem,

01:50 and the personalities. This is your host, Michael Kennedy. Follow me on Twitter where I'm

01:55 @mkennedy. Keep up with the show and listen to past episodes at talkpython.fm and follow the show on

02:00 Twitter via @talkpython. This episode is brought to you by SnapCI and Hired. Thank them for supporting

02:07 the show on Twitter via @snap_ci and @Hired_HQ. Kate, welcome to the show.

02:16 Hi, thank you for having me. Oh, I'm really excited about our topic today. We're going to talk about

02:21 creating, reinforcing positive environments for programmers. That's going to be great.

02:27 Yeah, I'm super excited. Yeah, yeah, cool. I saw you talk at PyCon and I'm really excited to share it

02:33 with my audience. But before we get into that, let's talk about your story. How'd you get into Python and

02:39 programming? So I got into programming back in college. My brother is four years older than me,

02:45 and he studied computer science and electrical engineering. And he told me that everyone who

02:52 studies CS gets a job no matter how mediocre their grades are, because everyone's in desperate need of

02:58 developers. And I was like, I can manage mediocrity. And so I tried out a few things. And then I took

03:05 an introductory computer science class. And I really enjoyed the material. And it turned out throughout the

03:11 whole degree, regardless of kind of what grades I was getting, I really enjoyed the material. And that

03:17 to me was a great sign.

03:18 That's really cool. What was your first course in? What language?

03:21 It was Java?

03:23 Yeah, yeah.

03:24 Yeah, yeah. Cool. Mine was Scheme. I'm glad to see Python sort of replacing some of these languages,

03:30 a little more practical and easy to learn.

03:33 I'm so jealous of colleges that teach Python as an introductory language. But I think getting

03:38 exposure to Java, and we went on to C++, and then C, I think that that was a really good background.

03:44 Yeah, nice. So how do you go from Java and C into Python?

03:47 You start working in web development out in the real world. And it turns out that in the real world,

03:53 people are like, yes, yes, computer science languages are nice. But really, we need the thing

03:58 that's easiest for us to build something and maintain it. And it turns out Python is a really

04:02 great maintainable language, you can build out entire engineering teams that use Python to build

04:09 really robust technologies. And that is more important when you're actually building companies,

04:14 than, I don't know, using some obscure language that is technically better on some minute detail.

04:22 Right, right. You know what I find really interesting, and it's been admittedly a while

04:27 since I've been in college, but just the disconnect between the academic environment and the sort of

04:35 software engineer professional environment, it seems like there's still a pretty big step change as you

04:41 go across that, which is kind of surprising, I guess.

04:43 Yeah, it's kind of like, I felt this way about math, too. I loved math. And then I got to

04:49 college, and it turned into symbols. And I was like, this isn't as much fun anymore. Like I liked

04:54 having numbers at the end. And so, in academia, it's, it's great to know the theory.

04:59 But there's a certain point at which I was like, I'm much more motivated by the goals around why I'm

05:05 building something. And if I need to go learn the theory around, you know, why something should be

05:10 implemented a certain way, I'll do it. But I care more about the problem that's being solved than just

05:15 learning academic things for the sake of knowing academic things.

05:19 Right. It's interesting to understand how you would create a relational database.

05:23 But if you actually went and did that, you might be kind of crazy and wasting your time, right?

05:27 Yeah, like I'm not interested. That was never my personal interest. I like,

05:30 I like building things that people use. And so programming to me is a tool that allows me to

05:36 solve human problems. And there's a whole camp of developers who love to build technical things

05:41 for purely technical reasons, which is fantastic. I just don't happen to be one of them.

05:46 Yeah, I don't think I would count myself in that space, either. I think, you know, people who are

05:54 not in programming often think about programming as a solitary thing. And it's really not. And there's

06:01 lots of layers to that, right? Like, we were talking about your blog, and, you know, apps you write,

06:06 and web apps, and so on. And you can reach 100,000 people quite quickly, you know, if you're lucky,

06:13 right, with programming. And that's, you know, there are not many specialties or jobs that give you that

06:19 power, right, especially as an individual, but also as teams, I think it's really interesting. And so

06:24 maybe that's a good segue to talk about these environments, right? So what's the story behind

06:31 your PyCon talk? And what was the title for everyone?

06:34 The title was a bit sensationalist. It's how our engineering environments are killing diversity

06:39 and how we can fix it. You know, a little bit of pessimism, a little bit of optimism. I actually do

06:45 believe that the problems we're facing, that are causing kind of like these diversity issues,

06:52 I believe that they're solvable. But I do think that we need to start to take a look at kind of the

06:57 environmental and institutional problems that exist. You know, traditionally, a lot of people

07:02 think of sexism or racism as one individual person being unkind towards another individual person in

07:10 some sort of obvious way. And it's very rarely the case that that's what it looks like at scale.

07:16 It's a whole bunch of little tiny details that kind of come from this environment and this institution

07:22 that we set up. And everyone perpetuates it. And everyone has a hand in creating these environments

07:28 that hurt some people more than others. And so yeah, that's the theory behind it.

07:32 Yeah, and they might not necessarily, you know, have malice in their intent, right? They just,

07:39 they might have negative effects that are not fully thought through or fully obvious.

07:44 Exactly. And that was the audience that I wanted to reach: there's a lot of people who have really

07:48 good intentions, but they don't actually know how to make things better. Because it's really,

07:53 really difficult to isolate some of these almost invisible problems, you know, they've become such a

08:00 fabric of the environments that we're in that they become really difficult to see. That's what I wanted

08:06 to do with each one of these is kind of isolate, like a core, fairly nuanced problem that can be solved,

08:12 that will actually have positive impacts on the environment and how diverse people are able to operate

08:18 in that environment.

08:19 Yeah, and I think some tips for people on things, you know, actual steps they can take would be helpful,

08:24 because environment and culture, especially as companies get larger and organizations get larger,

08:30 it's just so super hard to even influence, right? You go to some company like, well, this is what it's like here.

08:36 Like, I'm only one person out of this group, how am I supposed to change this, right? And I'm new, you know,

08:41 something like that, right?

08:42 Yeah. Or just, you know, if you haven't experienced some of these problems, it would be really hard to

08:48 identify them. And I know that there's a lot of rhetoric around, you know, how things need to change.

08:55 And a lot of people have bought into that. And then the next question, especially for me, as a software

08:59 engineer, I'm like, okay, great, we're going to do this. How do we implement it? You know, and that was,

09:05 I think that's where a lot of this movement right now is kind of stalling is like, what are all the,

09:10 if diversity issues are death by 1000 paper cuts, how can we get rid of a few paper cuts at a time,

09:17 basically?

09:17 Yeah, yeah, absolutely. I think as I look across the industry, I know, you know, you can't make

09:23 complete generalizations, but I think most people are super good intentioned. But you know,

09:29 they don't know what to do or how to make a change, or even that there is something that should change,

09:33 right? So you have a really interesting story, analogy, a couple of analogies you started your

09:39 talk with, and one of them was about mining. Can you tell that story?

09:42 Yeah, so basically, I wrote this in one of those moments of just like, sassiness. I was like,

09:52 women in tech, and marginalized groups in tech are the canary in the coal mine. And the canary in the

09:59 coal mine was used to indicate to miners when there's too much non-oxygen gas in the mines.

10:06 So basically, canaries were more sensitive to things like carbon monoxide and other types of

10:11 gas poisoning, and they would start to die before the miners were affected. And so if the canary was

10:16 dying, you need to get the hell out of the mine. But in tech, if these marginalized groups are the

10:22 canary in the coal mine, instead of recognizing that there's a huge problem, people are telling

10:28 the canaries to lean in, and, you know, replacing dead canaries with new canaries thinking, well,

10:34 if we just get more canaries in here, that will fix the problem. And that's not true. Like, if you

10:40 think about it as an environmental thing, you have to look at what's actually killing the canaries,

10:44 as opposed to looking at the canaries as the problem themselves.

10:48 Maybe we need more efficient logistics to bring in new canaries.

10:52 Truckloads of canaries. I'm totally, I'm sure that that will fix the problem with the mines.

10:58 Yeah, and it was just, I don't know. I did really like Sheryl Sandberg's book Lean In,

11:06 but as with any piece of literature, and ideology, it can be used both for positive and for negative. And so,

11:15 you know, sometimes there are problems that, like, marginalized groups themselves can try to fix.

11:21 Other times, telling these groups to lean in is a little bit like telling a canary that's dying that

11:25 it just needs to lean in a little bit more.

11:27 Just breathe harder, canaries.

11:29 Just breathe harder. I'm sure that you can make it.

11:32 And so, I think we need to be really conscious, too, when we look at kind of the ideologies and what

11:38 we tell people to do in the workplace to improve their station. I think we need to be really careful

11:43 that we're not putting the onus on these individuals who are in these environments and

11:50 these institutions that are perpetuating kind of systems that aren't fair. And then saying,

11:54 well, if you just worked harder, like, if you just did, if you just followed this list,

11:59 you could be successful.

12:01 I think that's a really interesting point. I think that scales up and down society, right?

12:07 Across being more successful than your parents, potentially, or the situation you found

12:14 yourself in. But the one that we have control over, the part of society that we have the most

12:18 control over is our engineering environments. And so, right, like, applying that idea there,

12:24 that's really cool.

12:24 So, you started out by talking about your environment. And you had two great analogies,

12:30 I thought. One was around how the environment is kind of, you just acclimate to things. You know,

12:38 I think it's, that resonates with me really well. Like, eight months ago, I moved to Germany

12:45 and spent a lot of time here. And when I first got here, it was like, everything was new

12:50 and everything was different. And the sounds and the smells and the sights. And now, you know,

12:55 I barely notice, like, people walk by, I'm like, was that in English or German? I kind of understood

13:00 what they said, but I don't even actually notice the language. You know, it just, it becomes,

13:04 no matter how different it is, at first, it becomes natural in a sense. So, you have this story

13:10 about fish, right?

13:11 Yeah, it's a David Foster Wallace quote. It's a, it's a story that he told giving a keynote speech.

13:18 There are these two young fish swimming along, and they happen to meet an older fish swimming the

13:22 other way, who nods at them and says, morning, boys, how's the water? And the two young fish swim

13:27 on for a bit. And then eventually, one of them looks at the other and goes, what the hell is water?

13:31 He tells it, he tells it really well. It's, his whole piece on that is amazing to listen to. It's

13:37 about, it's basically about consciousness, being conscious of the world around you and the

13:42 environment and the fact that people probably have legitimate motivations for what they're doing.

13:47 Yeah, that's cool.

13:51 It's, it's great. But yeah, I mean, for all of us, right? Like, we get acclimated to our

13:55 environments. It's, it's a lot of work to be conscious of every detail.

13:58 Yeah, that's how your brain basically lets you cope and actually think about stuff, right? If you had

14:03 to pay attention to everything, and didn't just sort of go on autopilot for, you know,

14:06 the normal stuff, then you wouldn't be able to get anything done. You're just like, you're like,

14:11 oh my gosh, there's a door. Oh, look, there's a window. Like, no, go back to what you're doing.

14:15 The doorknob is gold. Look at that.

14:18 Wow. It's amazing.

14:20 It's why when you travel and you have all of these overwhelming new details, that time feels really

14:25 long. I don't know if you've ever felt that where like, if you're paying attention to every

14:29 second, you're like, time is moving very slowly.

14:33 Yeah, I think that's a great analogy for like, our environment that we live in just fades to the

14:40 background. And it, it has an effect on us, right? Like whether or not we're always inside, or we get

14:46 to go out and get the sun, you know, get exposed to the sun and fresh air, whatever that is that we do,

14:50 that just becomes normal, but it still affects us in our day to day, right? So you started talking

14:55 about things that are common in engineering and software environments that are not necessarily

15:01 positive, right? So you had a list of them that you, you talked about, and you talked a little bit more

15:07 in your blog post series. But the first one was criticism.

15:10 Yeah, that one was the most popular blog post by far. It resonated with a lot of people completely

15:20 outside of the tech industry as well. The way that I kind of discovered

15:25 this, I mean, I should clarify, most of the things that I'm talking

15:31 about are not new ideas. I have found them in other places and kind of pieced them together to

15:31 write about the engineering culture specifically. But you know, you can find a lot of literature on

15:37 criticism. And so one of the things I noticed, though, in engineering cultures is that being

15:42 critical of other people or of other libraries and frameworks is kind of seen as this badge of honor.

15:47 There were these habits that I saw people having where they would really make fun

15:51 of other people's work, as though that was appropriate. And at first, I was like, Oh,

15:57 that's just how things are done. We're gonna make fun of this, this library. And I had this kind of

16:01 aha moment about criticism when I met the creator of a library that we had used at my company.

16:06 And we had used it, and it didn't work out for us. And we pulled it out. And when we were pulling the

16:11 library out, the engineers who were doing it would just rip on it. They would just sit

16:16 there and like absolutely rag on this library and why it was like the worst library that had ever

16:21 been written. And I had just met the creator at a conference the week before. And I remember sitting

16:27 in that meeting going, Oh, my God, this is horrible behavior. This is really, really mean.

16:33 And this is not okay. Why is this such a habit in engineering culture? And why do we take it for

16:39 granted that we should, that we can walk around being really critical of other people's work?

16:44 Yeah, it was just it was one of those moments where I woke up.

16:47 Yeah, sure. I imagine. And you're probably thinking, well, this person was really nice,

16:53 and they seem smart. And then here we are just completely deriding them sort of through their

16:58 work, not directly, but more or less, right?

17:00 Right. And, and every piece of code is just, it's just one person's opinion about how a problem should

17:08 be solved. And if you held me accountable for every opinion that I had ever had, like,

17:12 man, that would be brutal.

17:14 Yeah, yeah, it absolutely would for all of us, I'm sure, you know, just on focusing on the code part

17:21 of it for a moment, I think two things. One, you know, maybe that library was written with different

17:26 constraints than it was trying to be put under, right? And so it might completely nail some problem,

17:33 and it doesn't necessarily fit here. And so maybe, you know, a little less hubris,

17:39 and like a broader picture of the library. I, you know, we don't want to judge it, right? Like,

17:44 it doesn't make sense to do that. But just thinking like, well, maybe this was written under different

17:48 pressure. And for that pressure, it really solves the problem well.

17:51 Right, right.

17:52 Yeah. And the other thing, the other thought that comes to mind is, we in the software industry,

17:56 I think, pride ourselves on the fact that we try to make this a meritocracy, right? It's about the

18:04 quality of what we do. And I think you see that in a lot of ways. You see that in like hiring and

18:08 experience, like somebody who went to a boot camp versus somebody who has a PhD, if their work that

18:14 they can demonstrate to you is very similar, those people are basically considered equivalent,

18:19 right? Because it's not, you know, the degrees you have or whatever. It's like, what can you do?

18:23 Awesome. But this is what you have to bring, right? And that's really a positive thing. But

18:29 maybe there's something about where, because it's a meritocracy, people feel like if they attack a thing

18:35 like that, they're attacking the idea of it, and that they get sort of disconnected from the real

18:40 person, or the real feelings behind it. I don't know, that's just, what do you think?

18:45 Yeah. And I mean, it's interesting, I think that we have internalized this ideology that,

18:50 that we are somehow fair, that it is somehow a meritocracy. And it's, that's, it's not entirely

18:56 true, right? We're more likely to criticize libraries that people have built when that person is not

19:00 present. Companies are still far more likely to hire someone who has a PhD and pay them more,

19:06 and believe that they have more potential over their lifetime as an employee than someone who comes

19:10 out of a boot camp. And so it's interesting to kind of see these things and how they manifest. And

19:15 one of the things I found is that you can really see the cracks in the system when you start to look

19:19 at how people are treated the less privilege they have. A lot of these

19:26 environmental factors that we talk about, one of the reasons that they hurt kind of marginalized

19:31 groups more is that privilege is a shield. It's like, it's like armor. And the more privilege you have,

19:37 the more armor you have. And so some of these little environmental things aren't going to hurt you.

19:40 And I've seen this, like, with my male friends, my boyfriend, my brother, right? You know,

19:46 we'd be in the same environment. And I would be having a very different experience than they were

19:51 having. And, and so, you know, when it comes to things like criticism, it turns out that people

19:57 are far more likely to criticize women than men. They've done a bunch of research on it, people feel

20:03 much more comfortable. And we see this out in society as well, people are much more comfortable

20:07 criticizing women's appearance, women's eating habits, like all sorts of things. And this translates

20:14 into the workplace, people are much more comfortable criticizing women. And it has really negative

20:19 effects, because it turns out that all humans, regardless of what they look like on the outside,

20:25 really hate criticism, we really don't do well with it. As soon as someone starts to criticize us,

20:31 we go into kind of this defensive fight or flight mode, we hyper focus on the criticism and the negative

20:36 things that we've been told, we're less likely to progress in positive ways. And again, this is true

20:42 across, across genders, across races. And one of my blog posts on criticism and ineffective feedback

20:49 goes through some of the research about how that happens to people. But, you know, the biggest issue is

20:55 if, if different groups are getting more criticism, mostly because they have less protection,

21:00 then that's a huge problem. That's like a lot of subconscious bias going on right there. And so

21:07 it's really difficult for people who are receiving less criticism to see that other people might in private

21:13 instances be receiving more criticism, and kind of what those negative effects are.

21:17 Gone are the days of tweaking your server, merging your code, and just hoping it works in your production

21:37 environment. With SnapCI's cloud-based, hosted, continuous delivery tool, you simply do a get push,

21:43 and they auto-detect and run all the necessary tests through their multi-stage pipelines.

21:47 If something fails, you can even debug it directly in the browser. With a one-click deployment that you

21:53 can do from your desk or from 30,000 feet in the air, Snap offers flexibility and ease of mind.

21:59 Imagine all the time you'll save.

22:01 Thank SnapCI for sponsoring this episode by trying them for free at snap.ci/talkpython.

22:07 Yeah, that makes perfect sense to me. I personally hate criticism. I respond badly to it. I mean,

22:22 constructive criticism is maybe one thing. I still don't like it. I try to learn from it or whatever.

22:28 But, you know, one of my theories about work and careers is that we kind of go along more or less flat doing things,

22:35 and then we get inspired about something at some point. And you kind of do this step function jump of like,

22:42 oh my gosh, I'm going to do this thing that I didn't think I could do or whatever. And you go and do that.

22:47 And to me, constant criticism sort of kills inspiration. And I think inspiration is one of the things that really helps us grow

22:55 in creative type endeavors.

22:57 Yeah, yeah, absolutely. And you're not alone. Everyone, like literally everyone hates criticism.

23:03 There's no one out there. And if anyone's like, yes, I love criticism and do really well with it,

23:07 you should be like, well, shut up. Like, we don't need critical environments just because you for some

23:13 reason have figured out a way to not care about other people's criticism. One of the ways that I

23:17 discovered this is back when I was coaching sports. So I used to coach JV girls water polo and swimming.

23:23 I found that when I told the girls I was coaching what not to do, they completely shut off.

23:28 They basically, they looked at me, they had no idea what to do.

23:32 They felt bad because they had done something wrong. Because when you tell someone what they're doing

23:38 wrong, essentially, you're just, you're criticizing them, you're pointing out this flaw.

23:42 And I saw none of the behavior that I wanted, which is I really wanted them to be,

23:46 you know, more aggressive to go, go for things and, and to try harder. And instead, you know,

23:52 I saw them just shutting down. And so I switched all of my language over. And I was like, okay,

23:57 well, I'm going to keep yelling at them from the pool deck, but I'm going to yell at them all of the

24:02 things that I want them to do. So keep your elbow up. Keep your, keep your hips on top of the water.

24:09 Swim faster. I yelled swim faster a lot at them. As a result of that, I saw exactly the behavior that I

24:16 wanted, which is that if you tell people what to do, and you have good reasons for them doing it,

24:22 they're likely to do it. And they're likely to actually be pretty happy and pretty excited about

24:26 doing it. And so there is a positive way to actually give people feedback and to motivate them

24:33 in the right direction. And it generally doesn't involve telling them the things they're doing wrong.

24:38 That, that really resonates with me. And, you know, thinking of things like code review or

24:42 discussing in a group sort of whiteboarding architecture considerations and so on. I think,

24:48 I think there's a lot of ways in which you can make that positive as well, right? Instead of going,

24:52 oh, this code you wrote here sucks. Do you know how inefficient range is for this much data in Python

24:58 2.7 or whatever? Like, why are you not using xrange or, you know, some stupid thing like this,

25:03 right? Like you could say like, look, this is really good code you wrote, but if you actually did

25:07 this, you could actually make it better in this way or that way. Right? I think it, it sounds small,

25:11 but I think, especially when you're new, like if you've been programming for 20 years, you're like,

25:15 yeah, whatever. Right.
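For anyone who hasn't run into the Python 2.7 detail Michael is joking about, here is a minimal sketch of the difference (Python 2 only; xrange was removed in Python 3, where range is already lazy):

    # Python 2.7: range() materializes a full list up front,
    # while xrange() is a lazy sequence that produces values on demand.
    big = 10 ** 7
    nums_list = range(big)     # allocates a list of 10 million ints in memory
    nums_lazy = xrange(big)    # small, constant-size object

    total = sum(nums_lazy)     # iterates without ever building the list

    # The friendlier review comment from the conversation would read something like:
    # "This works well; switching range(big) to xrange(big) would cut the memory use."
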

25:17 If that's your first year, especially, you know, like you're talking about people coming into the

25:22 industry and then leaving, right? If that's your first year or two getting into the industry and

25:26 your experiences, people criticize you all the time. Like, okay, that's not fun.

25:30 Right. Exactly. And also when you're new, you can't make those cognitive leaps. So someone with 20 years

25:35 of experience, if someone goes, your code is really inefficient, they would look at it and they would

25:39 theoretically have all of the tools and experience to go, okay, well, here's all the ways my code is

25:43 inefficient. And someone who's new doesn't have those tools. So saying your code is inefficient,

25:49 you're coming in with this, this assumption that they were somehow bad because they wanted to be bad.

25:54 When in fact, most of the time people are trying to do their best, with only a rare few exceptions.

26:00 And so if you're like, okay, Hey, like, I think that this would make your code more efficient.

26:04 Most of the time people are, are receptive to that kind of dialogue. It's not criticism. It's,

26:09 it's like a, I think your code would be more efficient.

26:12 Mentoring. Yeah. It's like mentoring or support or something like this, right? Which is,

26:16 I think, really positively seen.

26:18 Yeah.

26:19 Yeah. So the first thing we should probably be on the lookout for, get it out of the water category

26:24 and into the more like, hey, there's something to do category, is criticism, right? And across the board

26:29 in a lot of ways and how we can turn that on its head positively. The other one is arguing,

26:33 right? Like argument culture, like we're going to basically, I think this sort of orbits around

26:40 the meritocracy as well, right? My idea of how this problem should be solved or what to do next

26:44 and your idea, they're going to fight. And the winning one, Darwin style is going to become our

26:50 application.

26:50 Yes. And of course the ideological fight between these two concepts is pure and it's, it's completely

27:00 based on the merit and the logic behind each of those concepts, right?

27:04 Yeah. Maybe until it goes through human beings and has to come out with like,

27:08 exactly until humans are involved in it. It has almost nothing to do with the logic.

27:12 One of the things I always tell new engineers and even new engineering managers is that

27:17 being right matters very little with humans. So most of what you do is you spend time managing

27:25 people's emotions and dealing with egos and argument cultures are really indicative of environments that

27:34 kind of let people be too emotional actually in their discussions, which is funny because

27:39 generally when people are telling someone they're being too emotional, that's criticism that's levied

27:44 against women. I can't tell you the number of times that I have been very passionate about an idea and

27:50 been told you seem really emotional right now. You seem very frustrated. And I'm like,

27:55 yeah, that's, that's horrible.

27:57 I'm like, I do. I have emotions right now. I can name all of them. Can you name yours? Like,

28:03 and that's part of the problem is that we, we actually, we don't value emotional literacy as much

28:09 as we should, but we have these environments that really allow people to try to express ideas and to win

28:15 arguments using a lot of emotional manipulation tactics, and aggression is one of them:

28:21 arguments where you raise your voice, you use your size, you yell at someone, you insult them.

28:26 I mean, these are all like non-logical emotional tactics to try to win an argument that have nothing

28:32 to do with the merit of your idea. And, unfortunately you see those tactics again, used against people who

28:39 have less kind of privilege and less defense against them.

28:43 Right. And even maybe physical stuff on your side, right? Like if, if you're a woman, you're on average,

28:49 probably smaller than the guys.

28:50 Right.

28:51 Maybe yell less loudly. I don't know. And so just, these are not necessarily a nice matchups, I guess,

28:59 to put, use a bad analogy. So you, you talked about some really interesting places in society where argument culture is deeply embraced.

29:11 One was the legal, legal courtroom area. And what was the other one? I don't recall. Maybe they both

29:16 are the same.

29:17 Arguments are kind of tantamount to competition. The legal system embraces arguments as

29:25 a form of competition, and then there's sports. Sports don't use arguments, but they're basically the classic

29:30 arenas for, for competition.

29:32 Right. And the real big difference you pointed out was those have rules, right?

29:36 Yes. And the reason that those, those arenas are important to look at is that the goal of argument,

29:42 the distinction that I make, and some people disagree with this distinction, which I mean,

29:46 is your prerogative. The distinction that I make to clarify is that an argument is an exchange in which

29:52 winning is the primary objective. And so those can look different. So I could

30:05 be having a very passionate discussion with someone because there's equal trust on both sides. We're,

30:09 we're actually trying to come to a common truth. And that's a discussion. And an argument, though,

30:15 is where one party wants to win or both parties want to win. And the problem with winning being

30:21 the goal is that people will engage in really bad behavior in order to win. So cheating, lying,

30:28 and that means that you need really strict rules around how people are allowed to argue.

30:34 So the legal system obviously has a very strict system with how each side is able to present its

30:40 argument, the types of words that are allowed to be used theoretically, how much they are allowed to

30:45 attack or not attack people who are participating in the case. There's a judge who presides over it.

30:50 I mean, there's a lot of rules around a system of arguing. Similarly, in sports, you've got refs.

30:56 They have a little less policing than they probably should have, but they, they have a

31:02 pretty strict system of rules, right? And people still cheat.

31:06 You have instant replay and you can actually have like a, a television review of being like

31:12 up close in super slow mo, right? Like it is. Yeah. And it just, you break the rule a little bit,

31:17 like no, the little bit of dust was kicked up as they stepped on the line. So we know this doesn't

31:22 sound right. Right. And people are very strict about it. And even still though, you mean you have,

31:26 what was it? Deflategate, the deflated football controversy, which I mean, I can't comment on what

31:31 happened, but like, I mean, there's still these massive controversies around cheating or lying or

31:36 lawyers being unethical. And so, you know, those rules have, have proven to be very important.

31:44 Because even if it's only a small percentage of people who are willing to lie and cheat,

31:47 my joke about this is how many assholes does it take to ruin the workplace? Like one, it takes one.

31:56 Yeah. Yeah. One, maybe two, depending on how big the workplace is. But yeah, it's, it's a very,

32:02 it's a very small number actually. So you really only need one person, maybe

32:07 two people in your proximity being really aggressive and being, you know, really kind of unkind and

32:14 in the pursuit of, of winning to make it a really miserable workplace for a lot of people.

32:19 The major essence of that, I think, as you pointed out is that there's no, or very limited

32:27 regulation and monitoring of sort of these types of arguments. You just, you had a meeting for two

32:32 hours. We all got into the conference room and argued about how we're going to build the next app.

32:37 And then we decided.

32:38 Yeah. And if someone's winning and they're willing to be unethical, then they're going to use kind of

32:43 whatever weapons they have against you. And it turns out that the marginalized groups in tech,

32:48 there's a lot more things that can be weaponized against them. So, you know, for women, the example

32:54 is emotions. Like when someone was like, Kate, you seem really frustrated. That's a way of weaponizing

33:00 this idea that women can be inherently too emotional to be logical. That like emotions undermine logic,

33:08 which is super funny because it turns out that you actually need emotions to make decisions.

33:13 So if the emotional part of your, your brain is compromised, you become incapable of making

33:18 decisions. So every decision is emotional, like period, whether you're male or female or whatever

33:24 it is. But it's a weapon that can be used against one group that, that isn't, isn't something

33:30 that's used against the other group. And so you start to see these kind of like tiered systems of how

33:36 fair or unfair it is for someone to kind of try to move up in that environment.

33:40 One of the things, you know, in a real broad sense, I could say, well, how can we make

33:45 this part of environments better? It's like, you know, argue less, right? But do you have more

33:51 concrete advice than that? Like what should people do?

33:54 Yeah, I would say like, there's a couple of different things you can do. The first is try to

33:59 have a culture of discussions where the idea is finding the best solution for everyone. You know,

34:05 a lot of companies try to do this; they call it egoless programming. So you basically set up a

34:10 system where you do actually have rules and you reward people for discussing and acknowledging other

34:15 people's ideas, for really bringing up data that supports their ideas. And then you have to have

34:22 some sort of system that deters or punishes people from using things like loudness and size and aggression

34:28 in order to get their point, their, their ideas pushed through. Setting up rules around how you

34:34 make decisions and how you communicate is not a bad thing. And there's a bunch of research out there on

34:39 things like brainstorms, setting up brainstorms, which is a certain type of discussion. And then as you

34:46 converge and make decisions, kind of like setting up rules about how decisions are made so that it's the

34:50 same every time. And people understand how to operate within the system is actually, I think,

34:54 healthy for everyone. And it's, it's a good practice within companies and it would greatly

34:58 reduce this kind of behavior. So you wouldn't actually have to worry about it.

35:02 That's way better than what I was thinking maybe is hire a bunch of lawyers to sit with all your

35:06 programmers to judge. So the next, the next topic that you brought up, or the next point in the

35:14 environment was about bringing people onto the team and this concept of team debt. Like we're all

35:20 familiar with technical debt where we kind of rush through writing code and we make shortcuts

35:25 for the sake of expediency. And then our software acquires this debt where it's, it's kind of,

35:31 it's kind of bad. There's something wrong with it, but it was worth it at the time. But if you do that

35:36 too much, it'll sort of crumble down and you, you're in trouble. But what's team debt?

35:41 Yeah. Team debt is, is the same concept. When you build a feature, there's this idea that

35:47 there's a certain amount of work that needs to be done in order for the feature to be truly finished.

35:51 And if you ship the feature prior to, you know, all of the monitoring and all of the testing and

35:56 kind of all of the optimizations being finished, then you have accrued technical debt and that will catch up

36:02 with you at some point. Team debt is the same idea that when you hire an engineer or, or a team

36:07 member, in any department, there's a certain amount of work that needs to be done for them to be a

36:12 fully integrated, productive, happy, functional member of this team. And any amount less than that

36:20 accrues as team debt.

36:30 This episode is brought to you by Hired. Hired is a two-sided curated marketplace that connects the

36:38 world's knowledge workers to the best opportunities. Each offer you receive has salary and equity

36:43 presented right up front, and you can view the offers to accept or reject them before you even

36:47 talk to the company. Typically candidates receive five or more offers within the first week, and there

36:52 are no obligations ever. Sounds awesome, doesn't it? Well, did I mention the signing bonus? Everyone who

36:58 accepts a job from Hired gets a $1,000 signing bonus. And for Talk Python listeners,

37:02 it gets way sweeter. Use the link hired.com/talkpythontome and Hired will double the signing

37:08 bonus to $2,000. Opportunities are knocking. Visit hired.com/talkpythontome and answer the call.

37:14 What's really interesting about team debt is that it actually topples startups. It's not something that

37:27 people talk about nearly enough. You know, we talk about how the technology can go down and how

37:31 our web applications didn't scale. And there's not enough people talking about how

37:36 team debt is, is one of these major things that can bring down engineering organizations.

37:42 Because if you can't scale your team of engineers, why, what are you going to do?

37:47 Like, I guess maybe optimize what you have.

37:49 Yeah, I think.

37:51 Yeah, I guess so. But yeah, that's a really good point. That maybe is a sub specialization or sub

37:58 point of this concept of culture. And you hear about companies growing too fast and their culture

38:04 comes unglued. But it's one thing to say, you know, the culture has changed and we don't quite value

38:10 this thing the same way we do, as opposed to the people who actually build the stuff don't really

38:16 know how to work together or how to collaborate or how all the pieces work in the system, basically, right?

38:23 That's bad news.

38:24 Yeah. And one of the things I used to tell the girls I coached is that your ability to win is the sum of

38:31 your talent multiplied by how well you work together as a team. So talent is just a factor of summation.

38:40 But teamwork, that's a multiplication factor. So a team that's theoretically less talented could beat a

38:46 team that's theoretically more talented if they work together much better. And it can be a huge factor

38:51 in the performance of a team. And so that's essentially what team debt is. It's this idea that your team is

38:58 not functioning together enough to even utilize their talent. So if teamwork falls below one, suddenly,

39:06 what you have is a team where the talent of the overall team is actually being underutilized. And

39:11 that's a team that leads to high attrition, unhappiness, because nobody likes to be less

39:17 productive than they are capable of. I mean, it's not a fun feeling to be like, I'm totally not fulfilling

39:23 my potential here.
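Kate's win-condition formula is easy to see with toy numbers; here is a minimal sketch in Python, where the skill values and teamwork multipliers are invented purely for illustration, not taken from the episode:

    # Toy model of "talent is additive, teamwork is a multiplier."
    talents = [7, 6, 8, 5]        # hypothetical individual skill levels
    teamwork = 0.8                # below 1.0: the team underuses its own talent

    raw_talent = sum(talents)                  # 26
    effective_output = raw_talent * teamwork   # 20.8

    # A less talented team with a teamwork factor of 1.2 beats this one:
    print(sum([6, 5, 6, 5]) * 1.2)             # 26.4 > 20.8

The specific numbers don't matter; the point is just that the multiplier dominates once it drops below one.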

39:24 Yeah. Unfortunately, I think there's a fair number of environments where people feel that way, right?

39:30 Like, it just, like you say, it kills inspiration. And it makes you feel like, well, this is definitely not a place

39:36 I want to stay, because I go to work. And instead of doing what I want to do and building amazing software, I'm like,

39:42 can't get anybody to close this pull request for this feature I just finished.

39:47 You know, it's easy to view that as like an optimization problem, right? Like, well, if we can get these people all

39:55 a little bit more efficient, then we can, you know, get the synergy and they're this much more efficient.

39:59 But the negative of it is, is maybe even more important, right? Like that you kill,

40:04 kill the joy.

40:05 Yeah, yeah, it really, it does kill a team. Everyone likes to be on a winning team. I mean,

40:09 what, who really enjoys being on a losing team? I had a coach in college who was like,

40:14 at the end of a game, the only thing anyone sees is the W or the L next to your team name.

40:20 And I was like, do you really need to tell a group of college athletes to win? I was like,

40:25 nobody here likes losing. Nobody here is like, yeah, I think that losing today is a good option.

40:31 And so, so, you know, like people want to be on winning teams, they want to be on high functioning

40:38 teams, and they want to be their best, at least as far as I have seen. And so it's, it's the company

40:44 and the management teams failing, if they aren't able to put in place programs that really allow

40:49 teams to function well. And onboarding is a huge part of that, because whenever you add new people

40:55 to your team, the team fundamentally changes, you need to realign values, you need to make sure that

41:00 everyone's on the same page so that that everyone works together really well as a team.

41:04 The reason that onboarding also has kind of a diversity factor is that when there is no onboarding, no, no

41:12 official onboarding, people still have to get onboarded. It's just a thing that has to happen, right?

41:18 It, onboarding is literally someone new coming on board and being integrated into the team.

41:24 And so if someone doesn't get onboarded, they'll leave the company. And if there is no institutional

41:31 onboarding, people have to figure out a way to do that on their own. And it turns out that people

41:36 who are more like the existing group have an easier time getting up to speed when there isn't any sort of

41:42 official institutional onboarding, which means that in an industry that is mostly white and mostly male,

41:49 anyone new coming on board who doesn't look like the existing group kind of falls by the wayside.

41:54 And this means in some, in some teams, this can mean parents. And depending on the team,

41:59 it can also mean that an introvert coming onto a team of extroverts is going to really

42:03 struggle to get up to speed because they're not like the existing group. And so onboarding is kind

42:09 of this, it's meant to be both a training program, but also an equalizer to try to make sure that

42:14 regardless of what someone looks like, when you bring a new person on board, they have more of an

42:18 equal chance at success in your company.

42:20 Yeah. And when you're part of that majority group, I think this is one of the places where the

42:25 "what is water" thing comes in really clearly. It's like, well, people just, they come in and they just start

42:30 working with us. Like, I don't know what's, what's the big deal. Why do we need to worry about this?

42:34 But that's a really good point.

42:35 And you don't notice. So, you know, everyone has their, their close friends. And a lot of times,

42:41 I mean, we, most people have fairly diverse friend groups, but it is easier to talk to

42:47 my female friends about certain things. Like our communication styles are just, it's just easier.

42:52 We've been conditioned to speak the same way. We have similar interests. We have similar intonations.

42:57 And it's one of those things where if you are the majority group, you don't really notice that

43:04 someone else might be having a really hard time with kind of this, this communication style and this

43:10 culture that you have that is really ingrained, not only in your ideologies, but also in like,

43:17 like your interests and your communication styles. And so I've seen that a lot at companies.

43:22 Yeah. Well, I agree. It would be really easy to classify that as, hey, what's wrong with this new

43:29 person, right? They just don't seem to fit in. Right. Or something like this, right? Where it's like,

43:33 yeah, why don't we try that? It didn't work out.

43:38 When in fact, like they might actually be a great culture fit. It's just that

43:42 they're a little different. And so like, like there needs to be a little bit more work to make sure that

43:48 the communication styles are, are leveled out and that they really get up to speed, that people's

43:54 hobbies and interests as individuals aren't too ingrained into the company as a whole. So I mean,

44:00 I've worked at companies where they love to play video games and video games are great, but I am not a

44:05 gamer. And so I was like, awesome. You spend a ton of time socializing on video games,

44:09 which means that like, I'm probably never going to fit in here. Like what am I, I'm not going to hang

44:14 out with you and play video games. Like that's just not, yeah, I'm never going to fit in. And

44:19 you know, whether it's across gender or just like across bizarre interests, it's important to be like

44:24 really cognizant of that.

44:25 Onboarding is sort of one part of like an overall set of processes within a company, right? Like how people

44:32 interact, how they get raises, whether or not they're a manager, whether or not they're a sort

44:37 of engineer or so on. Right. And you said a lot of these companies, they have what you called the

44:42 null process.

44:43 The null process. No process is one of my favorite things. It's like such a startup cliche where a

44:52 lot of people worked at companies where they didn't like some sort of process or they came from a big

44:56 company. And so they believe that having no process is like a better solution to whatever bad process

45:02 they experienced. And to try to convey this idea to engineers, I was like, well, having no process is

45:09 like having a null pointer. It's something that points definitively nowhere. But it can still be

45:16 misused and dereferenced. If it's dereferenced incorrectly, it can point to garbage. Right.
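Kate's analogy is C-flavored, but a rough Python rendering of the same idea might look like this; the process name and the improvised fallback are hypothetical, just to make the analogy concrete:

    # The "null process": a reference that points definitively nowhere.
    deploy_process = None

    try:
        deploy_process.run()   # "dereferencing" it blows up...
    except AttributeError:
        # ...or, in practice, everyone quietly substitutes their own garbage
        # value for what was supposed to happen.
        result = "whatever each engineer happens to improvise"
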

45:21 And so null processes are often kind of a subset of bad processes. Good process is really the gold

45:28 standard. And good process, actually, my definition of good process is that it's lightweight. It

45:35 conveys the fewest number of steps necessary for someone to complete the process. And it's just really

45:41 easy for people to access and to understand. And so one of the suggestions I give for a lot of startup

45:47 processes is write checklists. Checklists are an amazing way to just write down the steps necessary

45:52 to get a task done and to make them a lot more visible. Because yeah, even with no process, people still

45:59 have to get things done. So by definition, there is a process. It's just that everyone is doing a

46:03 different thing to get this done. And there's a lot of people who have no idea what to do. And if you're

46:08 really early, it might be the case that you just haven't had time. And that's fine. But don't be fooled

46:12 into thinking that no process is good process. Yeah, it's kind of like, you know, hate is not

46:18 necessarily the opposite of love. Like apathy is that sort of thing, right? Right. It doesn't

46:23 make it a good process just because you're avoiding this thing that was bad. Yeah, exactly.

46:28 You talked about automating stuff too. Like checklists are awesome because they make it easy

46:33 to automate humans, basically. But you could do things like automated pushes to QA. So a new person

46:42 comes on, makes some changes and just says, hey, here's my changes. And it flows just like everybody

46:49 else's. Things like that would be really helpful. Yeah, automating as much as possible. It can be hard

46:53 to know what to automate. And so that's why checklists are a good start. And then once the checklist

46:58 is really ironed out, because I imagine that you will go through some sort of iterations on

47:03 on how to get to the best process for something, if you can automate any of those steps away,

47:08 to reduce the number of things that a human has to do, that is always a best case scenario. Humans

47:14 are, we're wonderful, but we're, we have a really hard time remembering things. We overwhelm ourselves a

47:21 lot with a huge amount of complexity at our jobs. And anytime that you can reduce complexity and reduce the

47:26 number of steps that a person has to take, that is generally a good thing.
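As a concrete sketch of the checklist-then-automate idea they are describing, here is one way it could look in Python; every step name and command below is a made-up example, not something from the episode:

    # onboarding.py -- a checklist that has been partly turned into automation.
    import subprocess

    CHECKLIST = [
        ("Create a virtual environment", ["python", "-m", "venv", ".venv"]),
        ("Install project dependencies", [".venv/bin/pip", "install", "-r", "requirements.txt"]),
        ("Run the test suite", [".venv/bin/pytest"]),
    ]

    for description, command in CHECKLIST:
        print("-> " + description)
        subprocess.check_call(command)  # fail loudly so a new hire knows exactly which step broke
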

47:29 Yeah, absolutely. And if it's something that can be automated, and it has all these steps, right, like,

47:34 you're probably uninspired to do those steps. So if you could, if you could make that, you know, sort of

47:40 increase the joy by automating that stuff, obviously, that's, that's great.

47:44 Computers are really good at remembering steps in a sequential order.

47:48 And they do them really fast. I know, they're very quick. I love computers. Computers are really good at that

47:55 kind of thing. And they're really good at remembering, whereas humans are really good at

47:58 pattern recognition and subjective meaning.

48:00 I think a lot of these concepts that we talked about are really, you know, I've been in a lot of

48:07 companies, I think they're pretty broadly applicable for a lot of engineering teams. And so, you know,

48:13 hopefully people listening out there can reflect on their version of water and see if there's something

48:18 they can do to make it a little nicer.

48:20 Yeah, and I see kind of engineering process and engineering culture as, as iterative, you know,

48:27 progress is iterative, it's imperfect, and it's incremental, right? So if, again, if, if these

48:36 diversity issues are death by 1000 paper cuts, if you take away even just a small handful of paper cuts,

48:41 that is progress. And I think that's a really important thing to kind of support and be positive

48:47 about because it can feel really overwhelming when we talk about a lot of these issues and,

48:52 and it can kind of turn to negativity and hopelessness. And I would rather people felt

48:56 inspired and encouraged to try to make whatever small differences they can make in their own

49:02 companies.

49:02 Yeah, I think that's a great message. So one thing I wanted to ask you about, you know, we talked about

49:07 the better logistics for canaries. You know, there's a big push around the world, in the US as well as

49:15 like England and Australia to get computer science pushed farther ahead in the sort of education

49:23 space to like maybe high school has like a year of computer science, or at least it counts like a year

49:30 of math credit instead of doing something like geometry. My view of the world is we would all be

49:36 better logical thinkers if we solve problems through computer science rather than by trying to prove axioms

49:41 in geometry. So, you know, there's, there's definitely room in the curriculum for it.

49:47 But do you have, you know, if you view it through this lens of what we just talked about, like, do you have

49:52 thoughts on that other than just, Hey, you know, more, more people getting into programming is good, but

49:56 what's positive? Maybe what should we look out for in that space?

50:00 Yeah, I mean, I think it's great. Like I love, I love the idea of logic and computer science being added to education early.

50:07 I have done volunteering with like four to 12 year olds for the past few years. And it turns out like you can start introducing logic pretty young.

50:14 And so I think those education initiatives are great. I actually don't think it's just a pipeline problem, though.

50:21 A lot of people focus on the pipeline when they think about getting different groups into the industry. And one of the blog posts that I am about to write is,

50:31 it's that retention, retention is the beginning of the cross generational pipeline. So it's this idea that, you know, in the 80s, we've talked about there actually used to be more like 20 or 30% women in computer science programs and out in the industry.

50:44 If, and we'll use women as an example in this case, if women are unhappy and they leave the field, what do you think that means for the next generation? Who is going to convince young girls to go into the field?

51:04 Yeah, not only is there probably an issue there that made them leave, it's probably worse now that the women are not even there.

51:04 Right, right. And, and, you know, furthermore, having a mom who's like, well, I wouldn't really recommend this field is like, it's not a glowing recommendation for little girls.

51:14 And that's like one of my big theories about some of the problems is that retention is actually a really big deal. Because the more people that you can retain for an entire career, the more likely you are to kind of have this cross generational effect of, of they're going to convince, you know, their children or their nieces and nephews, or, you know, just people that they talk to, to go into this field, because they loved it. And they've had a wonderful experience.

51:40 Right. There's, you know, the people that you look up to as you're growing up. You're like, I'd like to be like this person, or, oh, that person is great, what did they do? You know, like, seeing that path for yourself in somebody else matters.

51:55 And if, if you don't see that in, if you're a young woman, and you don't see that in, in any of the women around you, like, well, maybe it seems interesting. I like computers, but you know, that's, maybe this actually is not for me, right? It could be a strong signal. And like you said, we're taking women to be concrete, right? It could be race. It could be class. It could be lots of things.

52:15 I do use, I use women as an example, but it generally is true across kind of like the different, the different metrics or the different dimensions that we should think about these things.

52:28 But, but yeah, I mean, we, we do, we look at the adults around us, even if it's a friend's parent. And we kind of try on, you know, what do we want to be when we grow up and seeing a positive example of someone who looks like you or has qualities like you is so important.

52:44 Right now, most women in the field are what I call first generation women. They have been pulled into the field by men. So I'm a perfect example. My brother convinced me to study computers. And there's a lot of women in the field like this, and getting more of what I call second generation women, women who were convinced to go into the field by another woman who was really happy, I think is just so important for the future of computer science and the future of diversity in our field.

53:12 We were talking earlier about computer science education. And I also volunteered at Hour of Code at my daughter's school. I have three daughters. One of them is seven. And we did an Hour of Code with first graders, second graders, third graders, fourth graders, and fifth graders, all separate. Like I think we did seven sessions or something throughout that week.

53:34 And certainly in first and second grade, there was basically no difference in the interest and the enthusiasm and effort and what came out the other side between the little girls who were programming and the little boys, right? But somewhere between there and now, like, things get separated, and I don't think they have to be.

53:56 And so the more that we can take away those things that strip some of the people out of the industry, I think we'll be all the better.

54:02 Oh, definitely. Yeah. And I think that like, you know, the media plays a big role in that. And there's, there's a lot of like messaging that I think is changing for the better around. Yeah. Basically kind of how we package up these ideas for little boys and little girls, you know, boys and girls toys. Like why, why is that an important distinction?

54:21 Yeah. I think there was just a nerdy Barbie doll came out and that was like a big deal.

54:27 We're not really big into the Barbie dolls right here, but I think I saw that somewhere.

54:31 The Reddit community, a couple of years ago, it would have been about five years ago now, got ahold of the voting for the next profession for Barbie. And they voted to have her be computer engineering Barbie. And one of my friends bought me one, which I thought was hilarious.

54:45 Oh, awesome. And I used to give them to, to a few of the women I knew who were like joining the company I was at or joining the industry. I would, I would give them a computer engineering Barbie as a joke. I don't know why, because it's funny.

55:00 And, and then they had an entrepreneur Barbie, which I also own. I call them my aspirational Barbies.

55:06 Oh, those should be the same Barbie. Yeah. Cause that's awesome. Those should be the same. Well, yeah. Yeah. I mean, computer engineering Barbie then became entrepreneur Barbie. And so.

55:15 Right. Exactly. She spent five years in the place with all the cubicles and she got inspired.

55:20 Yeah. Cause my goal when I was little was to grow up to be Barbie. Not at all. Not, not at all.

55:26 Yes. They only have a blonde one. I couldn't, I could never get into Barbie. I'm a brunette.

55:30 Barbie's a little bizarre and unattainable. There's some weird studies.

55:33 Oh, Barbie's so weird. Anyways, that's why, that's why we think Barbies are really funny is because like, I don't actually think encouraging little girls to play with Barbies is good. But then they had a computer engineering Barbie and I was like, I guess when I grow up, I want to be computer engineering Barbie.

55:49 Yeah. I guess that's funny. All right. So I guess maybe we'll leave it there for the environment stuff. But I think what I really liked about your presentation was it had concrete positive actions that you can take. Like I'm all about people criticizing stuff and saying, this is not good, this should be better. But if it often just comes with pure criticism and no sort of, and now we should do this, right, I think it's not necessarily helpful to the discourse. But I really liked that about yours.

56:49 And I'll link to all those. And I'll also link to your PyCon talk, which is on YouTube. People can check it out. Yeah. So two quick questions before you go. Okay. One, if when you write Python code, what editor do you open?

57:03 So right now I use Atom, the new editor that was made by some of the GitHub folks. And I use the Vim plugins. So I use the Vim bindings. I used Emacs for a while in college, but I got like, I felt like I was going to get carpal tunnel from like all of the control X's, which I know you can customize and change. But then I switched to Vim and I love Vim. It's just, it's my favorite.

57:26 Oh yeah. That's awesome. I'm definitely noticing a rise in the popularity of Atom. So that's cool. I like it myself.

57:31 It's really cool. I like it. It's easy to use, and it's like, it's an editor. It doesn't get away from the fact that it's just an editor. It just like lets you edit your code.

57:42 Yeah. Yeah. That's cool. And they have, if you go to, what is it? atom.io, there's a cool little like Jetsons futuristic video there that, you know, makes it fun to think about as well.

57:53 Then the other question I often ask on the way out is, there's over 75,000 libraries on PyPI and we all have experience with different ones. And is there one that you're like, oh, this is kind of cool, I use this and maybe it's not very popular, but it'd be cool if people knew about it?

58:12 Like I'm a huge fan of Flask. I really, for like web applications, I think Flask is just like a really nice, you can pick the different Flask libraries that you want to use. So you don't have to have everything.

58:23 That's interesting. So like recommending Flask is kind of like recommending a hundred.

58:27 Exactly. And it's recommending the idea that like you can pick and choose your libraries, which is its own mindset. But, but I really enjoy it because I like to build very service oriented web architecture.

58:40 And so for some services I need a lot less and for others, I need a few more things.

58:43 And so I really like kind of being able to like build your own web framework.

58:47 Yeah. Yeah. I like that philosophy as well.
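
[Editor's note: a minimal sketch, not from the episode, of the "pick and choose" style Kate describes, plain Flask plus only what a single small service needs. The route, port, and extension names mentioned in the comments are illustrative assumptions.]

```python
# A minimal "build your own framework" sketch: bare Flask plus only what this
# service needs. The /health route and port are hypothetical examples.
from flask import Flask, jsonify

app = Flask(__name__)

# A larger service might add extensions such as Flask-SQLAlchemy or
# Flask-Login here -- but only if that particular service needs them.

@app.route("/health")
def health():
    # A tiny JSON endpoint, typical of one small service in a
    # service-oriented architecture.
    return jsonify(status="ok")

if __name__ == "__main__":
    app.run(port=5000)
```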

58:50 Yeah. And then I'd say the other library that I really like is, I do a lot of work with like the AWS APIs, and I love, I love kind of like the AWS boto library.

59:02 They've actually done a really good job with it.

59:04 Yeah. Boto is sweet. I was using it just the other day for some S3 work and yeah, it's very clean.
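
[Editor's note: for context, a small hedged sketch of the kind of S3 work mentioned here, written against boto3, the current generation of the boto library. The bucket and file names are hypothetical, and credentials are assumed to come from the usual AWS configuration.]

```python
# A minimal S3 example with boto3; bucket and file names are hypothetical.
import boto3

s3 = boto3.client("s3")

# List the buckets visible to the configured AWS account.
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])

# Upload a local file to a bucket under a given key.
s3.upload_file("report.csv", "my-example-bucket", "reports/report.csv")
```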

59:09 It's good stuff. All right. Well, Kate, thanks so much for being on the show. Is there kind of a final call to action you have for people?

59:17 I mean, not really. I think, I think we covered a lot of different things.

59:20 Yeah. So maybe, maybe think about what's in your environment and, you know, what are the small steps you can do to make a change?

59:28 Yeah. I'd say that's the big thing. Incremental change is still important change. I think we all know that as engineers that, that you spend a lot of time doing incremental changes.

59:38 And that's huge for culture and for engineering environments in the same way that that's really important for, for developing code.

59:45 Yeah. Awesome. All right. Well, thanks for being on the show. It was great to talk to you.

59:49 Yeah. Thanks for having me.

59:50 You bet. Bye.

59:52 This has been another episode of Talk Python to Me. Today's guest was Kate Heddleston. And this episode has been sponsored by SnapCI and Hired. Thank you guys for supporting the show.

01:00:01 SnapCI is modern continuous integration and delivery. Build, test, and deploy your code directly from GitHub, all in your browser with debugging, Docker, and parallelism included.

01:00:12 Try them for free at snap.ci slash talkpython. Hired wants to help you find your next big thing. Visit hired.com slash talkpythontome to get five or more offers with salary and equity presented right up front and a special listener signing bonus of $2,000.

01:00:26 Are you or a colleague trying to learn Python? Have you tried boring books and videos that just cover the topic point by point?

01:00:33 Check out my online course, Python Jumpstart by Building 10 Apps at training.talkpython.fm and learn Python in a more engaging way.

01:00:41 You can find the links from today's show at talkpython.fm/episodes/show/55. And be sure to subscribe to the show. Open your favorite podcatcher and search for Python. We should be right at the top.

01:00:52 You can also find the iTunes and direct RSS feeds in the footer of the website. And starting just this week with the launch of Google Play for podcasts, you can also find the Google Play link in the footer.

01:01:03 Our theme music is Developers, Developers, Developers by Corey Smith, who goes by Smix. You can hear the entire song on talkpython.fm.

01:01:11 This is your host, Michael Kennedy. Thank you so much for listening. Smix, take us out of here.
