#45: The Python Testing Column, Now a Thing Transcript
00:00 What is the role, the core purpose of writing tests for your application?
00:03 Should you write more unit tests and fewer integration tests, or is it actually the other
00:07 way around? You may have heard of the test pyramid with unit tests building the foundation.
00:11 In this episode, we'll talk about a variation on that theme called the test column. We talk
00:16 about this and more with Brian Okken on episode number 45 of Talk Python to Me, recorded January
00:22 27th, 2016.
00:24 Welcome to Talk Python to Me, a weekly podcast on Python, the language, the libraries, the
00:53 ecosystem, and the personalities. This is your host, Michael Kennedy. Follow me on Twitter
00:57 where I'm @mkennedy. Keep up with the show and listen to past episodes at talkpython.fm
01:02 and follow the show on Twitter via @talkpython.
01:05 This episode is brought to you by Hired and SnapCI. Thank them for supporting the show on
01:11 Twitter via @Hired_HQ and @Snap_CI.
01:15 Hey, everyone. I hope you're looking forward to an interesting conversation on testing software
01:21 using Python. This week, our guest, Brian Okken, is giving away five copies of his Python testing
01:26 frameworks book. If you want to be in the running, be sure to visit talkpython.fm and enter your
01:31 email address to become a friend of the show. Now, let's get right to the interview with Brian.
01:36 Brian, welcome to the show.
01:37 Hey, thanks, Mike.
01:39 Yeah, it's really great to have another podcaster and a friend and a fellow Portlander here all
01:45 together on Talk Python to talk about testing in Python.
01:49 Yeah, I'm excited to be here.
01:50 Yeah, yeah. It's really cool. So for the listeners out there who don't know,
01:55 Brian is the host of Python Test Podcast, which is a fairly new, but how many episodes have you had?
02:03 Are you on the ninth or tenth?
02:05 Yeah, I can't remember. I think I did number nine recently.
02:08 Yeah, number nine with Harry Percival. Excellent. Listen, that one, that's great. So I'll put a link
02:14 to the podcast out there. So you spend a lot of time talking about testing and exploring sort of the
02:20 whole spectrum or pyramid of testing in Python on your podcast, on your show.
02:25 Yeah, I really want to cover... Actually, the podcast covers everything. I want to cover everything in
02:32 software development and software testing.
02:34 So we're definitely going to spend a good amount of time talking about that. But let's start at the
02:39 beginning. How do you get into programming, Python, that sort of thing?
02:42 Well, my first introduction to programming was in high school. I took a BASIC class.
02:48 The high school offered BASIC and Pascal, and I figured BASIC has to be easier, right? Because it's got "basic" in
02:54 the name. I didn't really get into it too much then. But at the same time,
03:00 I had a TRS-80 at home, and I spent a little time typing in games from listings in magazines,
03:08 like Lunar Lander was one of my favorites. And then, yeah, I didn't do much more until college. I entered
03:16 college intending to be an art and math major, switched about two or three years in to
03:24 computer science, and then finished up with a bachelor's and a master's in computer science.
03:29 That's cool. So where'd Python come into the story?
03:35 I learned C++. I learned all sorts of languages, Perl and C++ and others in college. I got a job at Hewlett-Packard doing,
03:44 well, at first test systems and then embedded programming of instruments. And that was all C++ and still is.
03:54 And then, I don't know, maybe four or five years later, I was in a group that had a Python test framework for testing,
04:03 system testing the instrument. And started, that was probably maybe 2000, 2002. And I've been using Python for testing instruments ever since then.
04:14 One thing that I think is interesting about that, a lot of times people feel like the programming language that the thing you're testing,
04:23 the application or whatever is built in, that's the language in which you have to write the test, right?
04:29 So if you're writing a C++ app, you have to fire up CppUnit and do C++ unit testing.
04:36 That's not really what you're doing, right? Like you're actually testing code that maybe isn't Python, right?
04:42 Yeah, it's all C++ and other languages inside. But the interface is available.
04:50 Anything with an interface that's available from Python, you can use Python test frameworks to test.
04:56 And, you know, right off the bat, I had frameworks that the company gave me and didn't even think about it.
05:04 But, yeah, it was probably 2010 when I started looking into other frameworks like unittest and pytest.
05:11 Yeah, okay. So let's dig into a little bit of the type of stuff that you test at work, and then we'll come back and talk about your podcast a little more.
05:19 So what exactly are you talking about instruments and physical things that you're testing?
05:25 And these are written in what, C++ or like embedded systems with C++ or C?
05:31 Yeah, it's C++. But these are, you know, there's lots of levels of embedded programming.
05:37 There's, you know, embedded programming in like, you know, your car or your phone or your watch or whatever.
05:45 But these are big instruments. They look like stereo components, but they've got many processors inside.
05:54 The box I work on right now runs Windows.
05:56 So I'm really writing a Windows application, but I don't really do any Windows stuff myself.
06:03 But it's all written in C++, all of my code.
06:06 Yeah, the user interface... the boxes I work on are mainly used in production lines.
06:14 And the production line control uses a string-based protocol called SCPI (pronounced "skippy"), a special language for controlling instruments.
06:26 Yeah, so that's how our users control the box. So that's how we test them.
06:30 Is that like for like managing factories? So if you're like building cars, like the machines, like put the door on or whatever, like those types of devices?
06:39 Or what do you, what exactly are you thinking?
06:41 Well, okay, so the box I work on right now is called a communications test box, a one-box tester. It tests all the RF, the transmit and receive abilities, of a mobile phone.
06:54 So pretty much every phone that sells in the world has to go through testing; since it's a transmitter, it has to follow FCC rules.
07:07 I see. So if I like lived down in San Diego and worked at Qualcomm, maybe I would buy your company's product to test the phone internals that we're building, something like that?
07:18 Yeah. And we do not just cellular, but also things like Wi-Fi broadcasting.
07:28 So we do Wi-Fi testing and Bluetooth and anything that transmits and receives this box will test pretty much.
07:35 Okay, cool. And so to test this embedded C++ app, basically, do you do some sort of C level interop with Python?
07:46 And do you expose some kind of API that Python can understand? Or do you use this SCPI API against it? Or what's the story there?
07:53 Well, so the magic that glues it all together is a library that's available on PyPI called PyVISA.
08:03 So, yeah, Rohde & Schwarz and National Instruments and others have these VISA DLLs that you can use to interact, to send SCPI over LAN.
08:17 And I use PyVISA to get at that. So it's really easy. You connect those up and you just have an object that you can send write and read commands to.
08:29 Works great.
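For readers who haven't seen it, here's a minimal sketch of what that write/read pattern can look like with PyVISA. The resource address and the measurement command are hypothetical placeholders, not the actual commands for Brian's instrument:

```python
import pyvisa

# Open a LAN-connected instrument (the VISA resource string is an example).
rm = pyvisa.ResourceManager()
inst = rm.open_resource("TCPIP0::192.168.1.42::INSTR")

print(inst.query("*IDN?"))   # standard SCPI identification query
inst.write("*RST")           # reset the instrument to a known state

# A made-up measurement command, just to show the query pattern:
power = float(inst.query("MEAS:POW?"))
print(f"measured power: {power}")
```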
08:30 Would you say that that kind of testing is sort of more holistic or is that more like unit testing?
08:36 It sounds to me like my first guess is like you're kind of hitting the outside, like so maybe an integration test type story, yeah?
08:43 Yeah, it's definitely like an end-to-end test.
08:45 We test, I'm testing our entire system.
08:50 But in looking at test frameworks, I'm way more interested in what the fixture model is like, because we use setup and teardown to actually do things like move signals around and hit switch boxes.
09:08 Definitely more of an end-to-end.
09:10 But we do use ideas like mock, like if I'm testing the measurement capabilities, for instance, it's difficult to test that against a, like a moving target, like an actual mobile.
09:23 So we'll have an arbitrary waveform generator generate a known signal and we test the receive side against the known signal.
09:32 And then we can do the other end also, we send our transmission to like a different instrument to test it against something else.
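The software analogy to that known-signal trick is replacing a dependency with a stand-in whose behavior you know exactly. A rough sketch in Python using unittest.mock, with entirely hypothetical names:

```python
from unittest.mock import Mock

def measure_peak(source):
    """Return the peak of whatever samples the source produces."""
    return max(source.read_samples())

def test_measure_peak_against_known_signal():
    # Stand-in for the arbitrary waveform generator: a source whose
    # output we control, so the expected answer is unambiguous.
    known_source = Mock()
    known_source.read_samples.return_value = [0.1, 0.9, 0.5]
    assert measure_peak(known_source) == 0.9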
09:40 When you think of software architectures, you often hear people talk about things like dependency injection and basically managing and decoupling the dependencies of your software so that you can test the individual pieces, right?
09:55 Yeah.
09:56 Yeah, definitely.
09:57 Yeah, but you, it sounds like your dependencies are more solid than, than maybe others, right?
10:04 Like you've got physical devices and like scientific information coming in and, you know, like waveforms.
10:11 And so you've, what kind of stuff do you do?
10:14 That part actually isn't that different, really.
10:16 I mean, if you had a big user database or something on a web application, the real database out in the world is going to be a lot different than your test database.
10:28 So that's similar to how our real signals are different from our test signals.
10:32 The difficulty is the fuzziness.
10:35 Because even with a pure signal from a generator, you've got noise in the system.
10:45 So you can't just get a number out of a measurement and say, if it's 3.2, then it's correct.
10:53 Otherwise it's wrong.
10:55 Almost all of our measurements have tolerances around them.
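In pytest, this kind of toleranced assertion is commonly written with pytest.approx. A tiny sketch, where the measurement function and the tolerance are invented for illustration:

```python
import pytest

def read_level():
    # Stand-in for a real measurement; imagine noise around a 3.2 target.
    return 3.21

def test_level_within_tolerance():
    # Not "== 3.2", but "equal to 3.2 within an absolute tolerance of 0.05".
    assert read_level() == pytest.approx(3.2, abs=0.05)
```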
10:59 That's really interesting.
11:00 A lot of people write software that is basically deterministic, in ways that are easy to assess.
11:09 So I get a Boolean back, or I query for a user and the user is either there or they're not, or they're an admin or they're a regular user, right?
11:19 These super clear distinctions.
11:21 But when you get into sort of sciencey spaces, it gets really hard to actually just know, know that you're right.
11:31 Yeah.
11:31 I mean, if you get back a number and it's off by a thousandth, is that still the same?
11:37 Does that count as okay or not?
11:39 Right.
11:39 That, that can be really challenging.
11:41 Yeah.
11:41 But there are ways to do it.
11:43 When we're testing the entire system, it's difficult to tell, because, you know,
11:51 if it's off by 10%, it's hard to visually know. But the different pieces that the actual numbers go through are tested with different rigor.
12:08 So it all works out.
12:09 Yeah.
12:09 That's an interesting point.
12:11 And I'm actually not the guy that tests the box to make sure that if we say we're putting out minus 10 dB, it's actually really minus 10 dB.
12:25 That's some other guy in, like, a manufacturing facility.
12:28 I'm mostly concerned with making sure the software didn't muck things up.
12:35 So in the end, I trust that the hardware is working correctly, and I just want to make sure that the software hasn't mucked up the pristine hardware.
12:45 Right, right, right.
12:46 So you assume that the hardware you've got is actually processing everything.
12:50 Okay.
12:51 But you've got to gather it, save it, interpret it, all that kind of stuff.
12:55 And so you kind of start at that level, right?
12:57 Yeah.
12:58 And then the measurement data that we're returning... we've got thousands of different measurement points that we're sending back, and I'm making sure that we don't muck one of those up,
13:10 and that we don't put some array in the wrong place.
13:14 That's the sort of stuff that I'm really concerned with.
13:17 Yeah, that's interesting.
13:19 One of my first jobs was working at a place called EyeTracking, Inc.,
13:23 eyetracking.com.
13:25 And we would build these scientific visualization tools and do data analysis on top of eye tracking data, right?
13:33 Pupil dilation over time, where you're looking, that, that kind of stuff.
13:36 And we would end up doing really complicated math sort of pipelines on it, you know, Fourier analysis,
13:43 wavelet decomposition, those kinds of things.
13:45 And you try to, you know, test the results of one of those things, you know, it's super hard.
13:51 So we basically would come up with a reference input and a reference output and go, if the deviation from this graph to that graph is not too large, we'll call it the same, right?
14:03 Because maybe they used something like MATLAB to generate the reference results,
14:08 and, you know, the way it deals with floating point numbers might be slightly different than the language we were using, or something like that.
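That reference-versus-result pattern is easy to express with NumPy; the data and tolerances below are invented for the sketch:

```python
import numpy as np

reference = np.array([0.0, 1.4142, 2.0, 1.4142])        # saved "golden" output
result = np.array([0.0, 1.41421356, 2.0, 1.41421356])   # output from a fresh run

# Pass if every point matches within a relative/absolute tolerance,
# which absorbs small floating-point differences between tools.
assert np.allclose(result, reference, rtol=1e-3, atol=1e-6)
```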
14:13 Yeah.
14:14 It's an interesting challenge that I think a lot of people building, I don't know, more standard business apps.
14:20 It doesn't really get on your radar, right?
14:22 Because the user is either an admin or not.
14:24 You know, assert true, done.
14:26 Yeah.
14:27 I mean, actually, people that have that sort of a test job, I'm like, what's so hard about this?
14:33 Just, you know, write the test and get on with life.
14:36 Yeah.
14:36 So, okay.
14:37 So let's, let's look at some of the frameworks and get some of your thoughts on that.
14:42 So let's say the majority of people listening know what unit testing is, what the goals are.
14:48 But I know a lot of my listeners are sort of scientists getting into programming or new programmers.
14:54 So maybe give us just the elevator pitch.
14:57 What is unit testing?
14:58 Why do we care?
14:59 I'm not sure why we care about unit testing, actually.
15:02 Or software testing, rather.
15:05 No, so the phrase "unit test" is one that's kind of a sticky point for me, because in extreme programming, they use the term unit test,
15:18 and that just meant any developer test, anything written by developers, not written by the QA team.
15:23 And in a lot of communities, it's stuck with that.
15:27 That's the use model.
15:31 What a lot of people commonly term a unit test is something written by a developer.
15:35 And then in the TDD community, that often isn't the definition.
15:41 A unit test there is often the smallest possible thing, like one function.
15:45 You're testing one function at a time.
15:48 The way I look at it, the most important thing is that the entire thing is tested.
15:56 Your entire software is tested.
15:57 And unit testing is, I think there's certain places where you want to go in and test.
16:04 I like to put tests anywhere where I have a promise.
16:07 If I'm promising anybody other than myself that things are going to work right, we need tests there.
16:12 So if I've got a library that other team members use, or other groups use, that needs tests around it.
16:19 And anywhere there's interfaces, interfaces between groups or between components.
16:26 Those are great places to put tests.
16:28 And then of course the entire system function tests.
16:31 I think the focus on unit testing takes some of the responsibility away from system-level testing, which I think is unfortunate.
16:43 I think there's different scenarios where things like unit tests matter more and other times they matter less.
16:49 So times I think they matter a lot are when you have a lot of people all in the same general code base making changes.
16:58 And you don't necessarily know everything that's changing.
17:00 Right.
17:01 So people are checking in changes, and you're doing code reviews and stuff, but you're not necessarily always aware that, oh, they actually changed this method that I was depending on.
17:12 And they didn't understand this like implicit contract I built around the return value here or something like that.
17:19 Right.
17:19 Yeah.
17:19 So I think it's really valuable there.
17:22 And I think it's valuable for helping build apps where fewer people are working on it,
17:28 where there's less churn.
17:29 But one of the things that I saw in the early 2000s, in response to extreme programming, was that people thought they had to do, you know, 95% code coverage.
17:44 Everything is TDD.
17:45 If you don't start that way, it's bad software.
17:49 You just can't write it.
17:50 How can you possibly stand up for it?
17:52 And, you know, over time I saw a lot of people saying, well, we can't do that.
17:57 Right.
17:58 We are in too much of a hurry or whatever.
18:00 You know, they have whatever excuse.
18:02 A lot of times it wasn't totally unreasonable.
18:04 We basically say we can't match that level of commitment to testing.
18:10 So we're not going to do any.
18:11 And I think that that's one of the places where it can actually, that message of like test everything can become negative.
18:19 Because, you know, if they had written, like you had said, the thing you're making a promise about, right?
18:23 Like if, if you're writing like stock trading software, the part that actually says buy or sell, there should probably be tests on that.
18:29 The part that logs that the app started probably doesn't need a test, right?
18:35 There's lots of parts of apps that are just there to support some sort of core.
18:39 And that core is much more important that it's tested, I think.
18:42 And what do you, what do you think?
18:44 Yeah, totally.
18:45 Actually, I think you brought it up in one of your podcasts recently: it's the thing that's making you money,
18:53 the reason why somebody is buying your thing.
18:55 That's what you should test the heck out of.
18:57 Yes, absolutely.
18:58 Like, if that is the thing that you build for the world, that little essence of it, you had better test that, right?
19:05 Yeah.
19:06 But there's, as you know, you ship real software.
19:09 It's like, that's like 20% of the code you write.
19:12 There's 80% that isn't that.
19:14 Well, and then, I like code coverage tools.
19:18 But a lot of times I think... there's two ways to get to 100%, right?
19:26 You can either write more tests or you can delete code.
19:31 I think you should delete more code.
19:33 If you've got a bunch of software that is not reachable from the user interface,
19:40 if I can't hit that piece of code with normal user input, maybe it doesn't need to be there.
19:49 That's a really interesting point.
19:50 Yeah.
19:50 Yeah.
19:51 Yeah.
19:51 This episode is brought to you by Hired.
20:04 Hired is a two-sided curated marketplace that connects the world's knowledge workers to the best opportunities.
20:10 Each offer you receive has salary and equity presented right up front, and you can view the offers to accept or reject them before you even talk to the company.
20:17 Typically, candidates receive five or more offers within the first week, and there are no obligations, ever.
20:23 Sounds awesome, doesn't it?
20:25 Well, did I mention the signing bonus?
20:26 Everyone who accepts a job from Hired gets a $1,000 signing bonus.
20:30 And as Talk Python listeners, it gets way sweeter.
20:32 Use the link Hired.com slash Talk Python to me, and Hired will double the signing bonus to $2,000.
20:38 Opportunity's knocking.
20:40 Visit Hired.com slash Talk Python to me and answer the call.
20:49 More than once, I've been given some big software project and told, you know, have a look at this, and then we're going to need you to help us add a feature or recommend some change.
21:01 And there'll be like some method.
21:02 I'm like, what does this do?
21:04 What does this have anything to do with this code?
21:07 It doesn't seem to do anything.
21:08 And I'll like study the code and then, you know, two hours later, I'm like, oh, it's never called at all.
21:15 Yeah.
21:15 Somebody should have just deleted it.
21:17 They should have just deleted it and the world would have been a better place.
21:20 But instead, you know, I wasted two hours.
21:23 Somebody else probably wasted two hours.
21:24 And it just gets kicked down the road.
21:26 The other thing: I'm writing more lower-level tests than I used to.
21:33 I used to mostly focus on high-level tests,
21:37 and I still do, but lower-level tests do have their place.
21:43 I like the red-green-refactor model in test-driven development.
21:48 But just don't forget the refactor part.
21:51 So if a test, putting a test in place makes it so that you cannot refactor something, then maybe that test isn't quite right.
22:00 I don't want to, I don't want to restrict my ability to change my mind and redesign something.
22:05 So.
22:05 I had brought up the, you know, the perfect being the enemy of the good sort of thing where people say, if I can't have really good test coverage, like what, what is it?
22:13 Well, I'm not even going to start.
22:14 Forget this.
22:15 It's too much work to do testing, right?
22:17 Yeah.
22:17 Well, you see that a lot when a team looks at their code and a new manager comes in, or somebody comes in and says, hey, we need to add a bunch of tests to this.
22:28 But where do you start when you have, like, 10 years of legacy code?
22:32 You can't just cover everything.
22:34 So yeah, you've got to just start somewhere.
22:38 You know, I have a book recommendation actually, now that you bring that up.
22:42 So there, you know, Michael Feathers is a guy that in the early days of test driven development and extreme programming and all that sort of stuff, he was really active.
22:52 And he wrote a book called Working Effectively with Legacy Code.
22:58 And, you know, given the time, it was a little more C++ focused, but it has some really amazing ideas about how do I take this app, which has a million lines of code and zero tests, and start carving out little pockets of code functionality that are supported by tests, and ways you can do that that are safe.
23:20 And definitely recommend it.
23:22 If people are dealing with like huge apps that are not yet tested and put together in a way that makes them basically untestable, check out Michael Feathers' book.
23:31 Okay.
23:32 Yeah.
23:32 Have you seen that one?
23:33 Definitely.
23:34 I've got a difference of opinion on that one, but I'll just let that go.
23:38 No, no, no, no, no.
23:39 I want to hear it.
23:40 Tell me.
23:40 I think it just seems silly to try to add a bunch of unit tests to a bunch of existing code.
23:46 I think it's way more profitable for your company to put functional testing around it,
23:52 the end-to-end functional testing of your system.
23:57 If you're going to refactor something, you need to know that the end functionality is still working solid.
24:06 And if you have tests in place to guarantee that, then what value is it to go in and add, like, thousands of unit tests?
24:16 Sure.
24:17 So I think that's a totally, totally valid point.
24:20 And, you know, gosh, it's probably been 12, 13 years since I read that book,
24:25 so I'm only going to get it a little bit right.
24:28 But I think the idea was, suppose you've got a library you're calling, and you hit the outside method, like some, you know, "do your thing" library.
24:37 And it gives you an answer back rather than trying to set up a whole bunch of tests inside there.
24:42 The question was like, well, how could I refactor some dependency or some bits?
24:48 Right.
24:48 So maybe what you do is you just come up with a bunch of inputs, save the outputs, and say as long as those don't change,
24:53 we're going to call it good.
24:55 It was some really basic scaffolding to say the thing still kind of looks okay.
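That style is often called a characterization or golden-master test. A tiny sketch of the idea, with a stand-in function name and file path:

```python
import json
import os

def legacy_convert(x):
    # Pretend this is the untested legacy logic we dare not rewrite yet.
    return x * 2 + 1

INPUTS = [0, 1, 5, 100]
GOLDEN_FILE = "golden_outputs.json"

def test_legacy_convert_unchanged():
    outputs = [legacy_convert(x) for x in INPUTS]
    if not os.path.exists(GOLDEN_FILE):
        # First run records the baseline; review it once by hand.
        with open(GOLDEN_FILE, "w") as f:
            json.dump(outputs, f)
    with open(GOLDEN_FILE) as f:
        assert outputs == json.load(f)  # later runs must match the baseline
```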
25:01 Yeah.
25:02 I mean, I think putting tests around the outside, and then also looking at the different interfaces inside the system, and maybe adding some interfaces that weren't there before to try to separate the design...
25:14 But all of that involves a redesign, and you're not going to guarantee that you're not breaking external functionality.
25:22 It's just that they don't test for that.
25:27 Yeah, that's right.
25:29 So I think that's probably an interesting thing to explore a bit. We've been talking about unit tests and the whole extreme programming origins of it and all that, but there's what you refer to on your show as, like, a pyramid of testing capabilities or functionality.
25:46 Can you go into that a bit?
25:47 Yeah, sure.
25:48 Well, I don't really like the test pyramid, but if you do much reading about software testing, you'll run into it.
25:55 All right.
25:56 So the idea of the test pyramid is a foundation of unit tests, some integration tests, and very few functional tests, the end-to-end tests.
26:17 Right.
26:17 Kind of like a food pyramid where like the dessert is at the top.
26:20 You should have just a little tiny bit and that's the functional bit at the top.
26:23 Yeah.
26:24 And the reason is because supposedly functional system testing doesn't give you much benefit, and the tests break, and they're brittle and fragile.
26:37 I just don't buy it.
26:39 And what happens, I think, is developers see that and they go, oh, okay,
26:44 focus on unit tests and leave the end-to-end tests to QA.
26:48 The problem with that big problem with that is a lot of teams don't have a QA anymore.
26:53 So there's nobody doing end to end tests.
26:56 Somebody's got to.
26:58 So if there's no QA, then it's your customers.
27:01 And I don't think that's the right person to do quality assurance.
27:05 So I think that maybe it should be a column of tests, not a pyramid.
27:16 Do the tests where they make sense.
27:18 You've got to have full behavior coverage, end to end wise to make sure that your system works the way you promised it's going to work.
27:28 And also, especially with continuous integration, if you're relying solely on automated tests, those tests have to be the promises that you're trying to keep to the customer.
27:40 And then, I don't really know what integration tests are.
27:44 There's a lot of definitions for it, but it's the middle part.
27:47 Sometimes integration test means integrating with the outside world.
27:52 So if you're integrating with a database library, it's tests against that.
27:56 But then also some people think of integration tests as between two components or between two functions.
28:02 And then unit tests are the, the low level while you're designing your code.
28:06 I don't know.
28:07 I think that you just need to use your own judgment.
28:10 I don't think there's a rule of thumb.
28:11 I think you got to use your judgment and put tests where it makes sense.
28:14 Yeah, I really feel that that's good advice, right?
28:19 The whole, let's make the top part of the pyramid a little fatter and make it more like a column.
28:24 Yeah.
28:24 So I think that makes a lot of sense because let's say you have a thousand unit tests.
28:29 I actually, I'm with you.
28:31 I don't really know what an integration test is either, but say you have a hundred of those and 10 functional tests, right?
28:35 Do you really need the thousand unit tests or are those 10 functional tests enough?
28:42 Maybe.
28:42 Well, a lot of the arguments for the unit tests are to make sure that, you know, somebody else doesn't break your units or something.
28:52 But I just think functional testing needs to be done more.
28:57 So, well, you know what I think the origins of it are.
29:00 I think it has to do with the TDD side of things; it's very difficult to do TDD with functional tests.
29:08 I think it's, I think it works great.
29:12 Well, Harry brings it up, and it's been brought up by others, as kind of a double-loop TDD, where you start with the functional test.
29:21 And then, while you're trying to fix that failing functional test, you write a bunch of unit tests.
29:29 I think that works well.
29:31 I just, you know, once you have the functional test working, do you need to keep the unit tests?
29:38 I don't know.
29:38 It's kind of up to you.
29:39 I think of a unit test as like scaffolding, right?
29:46 So if it helps you to build your software to put a bunch of scaffolding around it,
29:50 great.
29:51 But once it's built, I don't know, some of the scaffolding can come down maybe.
29:56 Yeah.
29:57 You're left with columns or something, right?
29:59 Interesting.
30:01 Yeah.
30:01 I think that's a good way to look at it.
30:03 I think one of the really big values that unit tests provide in to some degree, the functional
30:10 tests and integration tests is they get you to think about how your function or object or
30:17 API is used from the outside sooner.
30:20 Yeah, definitely.
30:21 And that's why I think focusing on interfaces matters: if you're really building an interface
30:25 that you need to have clean, then it needs to be changeable,
30:31 and tested. But let's say I'm writing a handful of classes that
30:36 all work together and I'm the main developer in this section of code, it's just going to
30:42 be me, right?
30:42 I don't think I'm going to put a unit test around every function, because that
30:49 means every time I want to refactor this system, I've got to change all the unit tests.
30:55 As long as this subsystem works from the outside world,
31:00 I think that's enough.
31:01 I think one of the big mistakes that people also make, you know, in addition to being,
31:06 letting like the perfect be the enemy of the good and just saying, well, we can't do perfect.
31:10 So we're not starting.
31:10 I think the other problem is that people write unit test code in a different mental state.
31:17 Let's just say test code, right?
31:19 Because, you know, it's up and down the column that you're defining here.
31:24 Yeah.
31:25 So I think they write those in a different mental state than the core of their application.
31:31 Like I've seen plenty of people just go to unit tests and just copy, paste, copy, paste,
31:35 like 10 lines of code.
31:37 And like, what are you doing?
31:39 Like, would you do that in normal code?
31:41 Of course you wouldn't, right?
31:42 You'd make that a function or some sort of object or something.
31:46 Right.
31:47 But that makes it way more painful to refactor or evolve your code when you've got 20 copies
31:55 of the thing that you have to maintain.
31:56 Well, yeah, that's another reason why I think you have to be careful with the whole, like,
32:00 "have as many unit tests as you can write in a day" thing, because it's
32:06 a code asset, but it's also a weight around you.
32:10 You've got to keep it all the time and you've got to change it.
32:13 It's code maintenance.
32:14 There's test code maintenance just as there is normal code maintenance.
32:17 Yeah.
32:18 So I think you can structure your test code.
32:20 Most people's test code could probably be refactored to be way more maintainable.
32:25 And so that's good, but, and people should do that.
32:28 But like you say, like more is not always better.
32:31 I think if you think of what are the main use cases that a user is going to do with your application,
32:36 if you can test those, chances are all those little edge cases that you're testing, if they
32:42 were to go wrong, your main user stories or whatever don't really work anymore.
32:48 Right.
32:48 So do you have to write those little tests?
32:50 That's, that's an interesting question.
32:51 Well, I think that all the edge cases from a user's perspective, from
32:57 like an end user, those should be tested.
32:59 I definitely think you ought to test the edge cases. But the edge cases of
33:05 every function?
33:06 I don't think so.
33:08 One of the examples I've given before is if I've got a username
33:14 that goes through the system, and some function along the way takes that as a
33:20 string, then...
33:20 Right.
33:21 So do I need to test to make sure that string can handle a thousand characters?
33:25 Maybe not.
33:27 It's interesting to question these assumptions about, you know, how many tests should you have?
33:33 How, how should they, should they be in a pyramid where it's, you know, decreasing as you go up
33:38 or should they be kind of more flat and so on?
33:42 Well, it's sort of theoretical, right?
33:46 Because it changes with every situation. The tests need to be an asset:
33:50 they need to be such that you can trust that if the tests pass, your code is good.
33:57 Whatever the tests have to be so that you can trust that you can check something in and it's
34:02 good because the tests pass, that's where you need to be, whatever it takes to get
34:07 there.
34:08 Yeah.
34:08 And related to that is if you have tests such that it takes 15 minutes to ask the question,
34:14 do my tests pass, people are going to stop running them and they just become stale.
34:19 All right.
34:34 This episode is brought to you by SnapCI, the only hosted cloud-based continuous integration
34:40 and delivery solution that offers multi-stage pipelines as a built-in feature.
34:45 SnapCI is built to follow best practices like automated builds, testing before integration,
34:50 and provides high visibility into who's doing what.
34:53 Just connect Snap to your GitHub repo and it automatically builds the first pipeline for you.
34:59 It's simple enough for those who are new to continuous integration, yet powerful enough to run
35:03 dozens of parallel pipelines.
35:05 More reliable and frequent releases.
35:07 That's Snap.
35:08 For a free, no obligation, 30-day trial, just go to snap.ci slash talkpython.
35:14 Well, that's the other beef I've got with a lot of the unit test stuff:
35:27 the testing in isolation.
35:29 I don't even get that.
35:30 One of the things is, if the test suite is slow, you're not
35:35 going to run it.
35:36 So, OK,
35:36 we want to try to speed up test suites.
35:38 But one of the ways to do that is to isolate every single function and every single test, so
35:44 that the test doesn't rely on anything else,
35:46 this function doesn't rely on anything else.
35:49 But then if I've got such isolated tests, why am I running the whole suite?
35:54 Just run the tests over the thing that I'm testing.
35:58 Right.
35:58 Because if you're not testing the integration, then you're doing like 95% rework that theoretically
36:05 doesn't affect you, right?
36:05 Right.
36:06 If I'm writing code that is modular and everything, like the stock example,
36:12 if I want to check to see if the buy button works, I don't need to test to make sure that
36:18 the functionality that I changed didn't break the printing function.
36:22 Yeah.
36:23 Yeah.
36:23 The about page shouldn't fail when I change the trading engine, right?
36:26 Yeah, exactly.
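pytest makes that kind of narrow run straightforward. pytest.main accepts the same arguments as the command line; the test path and keyword below are hypothetical:

```python
import pytest

# Equivalent to running "pytest tests/test_trading.py -k buy" in a shell:
# only tests in that file whose names match the keyword "buy" will run.
pytest.main(["tests/test_trading.py", "-k", "buy"])
```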
36:29 So one thing that you talked about on, I think it was show eight of your podcast, if I recall the
36:34 ordering correctly, was the theme that writing software is like
36:42 nothing else.
36:42 And I found that really interesting and it really resonated with me, not necessarily on a testing
36:49 level, but on a software engineering level, right?
36:53 Like, people who are typically not skilled in the art of writing software will
36:59 come and say, hey, look, your title is software engineer.
37:04 We have other types of engineers.
37:06 They build like bridges.
37:07 They can tell us within 10% accuracy of not just the cost of the project, but the duration
37:17 of the project and how many people need to be on it.
37:19 Why can't you do that with software?
37:21 And I think your sort of riff on "software is like nothing else"
37:24 was really neat, I thought.
37:26 Well, I appreciate that.
37:29 I like that as well.
37:30 That notion that it's like engineering, or it's like architecture, or it's like,
37:36 you know, whatever: it isn't. It's its own thing.
37:40 It's hard to define what, what it's like.
37:43 And now as I'm in a management role, I've got to do some of those things.
37:46 I've got to come up with estimates, and I'm often the evil guy that says, well, how long
37:51 is this going to take?
37:52 Are you going to be done next week?
37:53 And I know the answer is nobody knows, but, you know, you've got to ask
37:59 those questions anyway.
38:00 So.
38:01 Yeah.
38:01 I read a book by somebody and I totally forgot all the details.
38:04 So I think it's lost.
38:06 It was talking about this and it said, you know, people often refer to software engineering
38:14 and I actually personally don't really like the term software engineer much.
38:17 I call myself a software developer because I don't want to put it too near
38:21 architecture or engineering in people's minds.
38:24 But it was saying, you know, one of the fundamental differences between like a civil engineer that
38:28 builds a bridge and somebody that builds a rich and interesting piece of software is that
38:36 you have to build the bridge over and over and over.
38:39 If I'm going to build a bridge across this river, I'm going to build it here.
38:44 And then, oh, we need another bridge.
38:46 It's about like that bridge, but it goes across this river and it's just this much wider.
38:51 Right.
38:52 And so you can't, if this was software, you would just like take the bridge and change a
38:56 setting and move it over.
38:57 Like you would make a copy of it, but you can't copy bridges.
39:00 So you rebuild them over and over and over.
39:02 And in software, if a thing already exists, you just use it: you copy it, or you incorporate it
39:09 as a library.
39:10 I mean, I know there's a lot of wasteful
39:14 rewriting, you know, "not invented here,
39:16 so we're going to do it ourselves."
39:18 But generally in software, if you're writing software, it's because it doesn't exist.
39:24 It hasn't been done before.
39:25 And that really gave me a different perspective.
39:27 What do you think about that?
39:28 Yeah.
39:29 Well, there's a lot of "not invented here" that I run into as well.
39:35 And also, it takes a certain kind of person that's willing to dig through libraries
39:41 and see if it applies to your problem right now.
39:43 But yeah, the thing that I think software is most like actually is probably writing, like
39:53 technical writing or just any kind of creative writing.
39:56 Because in literature and lots of forms of writing, we often talk about the
40:04 first draft; I mean, we talk about it as a draft.
40:06 You have a first draft, and that's not even usually the first thing.
40:11 Usually you start with notes and then you go to a first draft and then, you know, you have
40:15 many drafts before you have something that's ready for somebody else to read.
40:19 That's probably closer to software, with the iterative approach. Except
40:24 there's a lot of software engineers, or software developers, that think they're awesome
40:30 and that their code is perfect all the time.
40:32 And that's just ridiculous.
40:34 You need to let go of stuff and be willing to throw stuff away and redo it.
40:38 Yeah.
40:38 Throwing away code that you've worked on is hard to do.
40:42 It's fun though.
40:44 You should do it more often.
40:46 And that's why I like to put tests around the things that I'm not
40:51 willing to throw away.
40:52 And those are the interfaces that I've made promises about.
40:54 So that's really where I like the creative freedom of software: to be able
41:00 to take an entire chunk of code, even a week before ship date, and say,
41:06 no, this stuff's just rotten.
41:08 We need to just rewrite it.
41:09 Instead of trying to figure out what it does, let's just reimplement it.
41:14 Certainly I've spent hours trying to figure out what a piece of software does and then
41:21 more time trying to understand, well, can it be adapted to this situation?
41:26 And is it sufficiently generalized?
41:28 And, you know, we're not even talking about like, is it thread safe?
41:33 And all these other like weird edge cases you might have to care about.
41:36 Right.
41:37 By the time you're done with that exercise, it was probably like two hours to write that
41:40 functionality from scratch, right?
41:42 Yeah.
41:42 Possibly, you know, I mean, not always, but possibly.
41:45 And then there are other things where I've spent a week writing something. But I like to try to think of the first time writing a piece of software as learning about what the problem is.
41:57 And so you don't have to relearn the problem.
42:00 You already know it.
42:01 So if you even like deleted all the files and started over, it would take you a day for the second time.
42:07 And then the third time, maybe an hour.
42:09 Just don't be afraid to throw things away.
42:11 That's interesting because you think I spent a week working on this, but it's understanding the problem, like you say, and like thinking through the trade-offs.
42:21 And, and a lot of that probably gets better with time as you understand it better, not worse.
42:26 Now, the place where people really want to throw code away is other people's code.
42:31 And that's where I think people should hold off a little bit.
42:34 And like the Michael Feathers book that you brought up: around that timeframe, there was a lot of talk about what to do with all this horrible legacy code.
42:43 And I never think of legacy code as a bad word.
42:48 It was a bad word for a while.
42:50 I don't know if it still is, but I think of legacy code as the stuff that's paying my salary.
42:55 So I really like legacy code.
42:58 So I treat it like that.
43:00 I treat it as the thing that's giving me money right now.
43:03 Legacy code that's still being used by definition was something that was super valuable and useful.
43:08 Right.
43:09 Or I guess somebody left some old website running, but generally speaking, it's, if it was not useful, it would have been turned off or somehow outdated.
43:18 Right.
43:19 Yeah.
43:19 Well, I'm thinking more along the lines of my job right now.
43:25 I mean, a test instrument's got thousands of lines of code, and it goes back.
43:31 Like, I started on the code base that I'm working on now maybe five years ago, but it had 10 years of history before I got there.
43:40 So this code is old. Even if the people are still around that made the decisions, they don't remember why they made certain decisions.
43:49 So if there's some weird switch-case statement, one of the hard parts is trying to find out, was this an intentional decision or just a whim of a developer?
44:01 How many different types of version control has it gone through?
44:04 Like, was it CVS in the beginning or something like this?
44:07 Or is it still on the same one?
44:09 I think it did switch.
44:11 I'm not sure what it was initially, because there are some remnants from the time when they would put the version string in the header of the file.
44:20 Those are still around, but they're not filled out anymore.
44:23 Right now we're actually using ClearCase, which I'm not real thrilled about, but it's not terrible.
44:31 One of the things that you have going on at pythontesting.net is a book that sort of compares the three major frameworks that people might use; you know, they're called unit testing frameworks.
44:44 But like you sort of point out, you can choose the level of aggregation or isolation you want to get into to sort of run the spectrum, right?
44:52 Yeah.
44:52 Yeah.
44:53 Thanks for bringing that up.
44:54 Yeah.
44:54 What's the story with your book?
44:55 It was around, like, 2010.
44:58 I found myself without a test framework, and I looked at those three.
45:01 I looked at unittest, nose, and pytest, and I had trouble.
45:06 What I mean is, I searched on the web and tried to read things about them, but I couldn't tell whether they were going to be useful for my job or not.
45:14 And I needed something right away.
45:15 So I wrote my own. And while I was maintaining my own test executive for work, I started researching these, and I wanted
45:28 to play with them and create something so that nobody else had to go through what I did,
45:34 trying to track down what's true information and what's old and what's new.
45:38 So I went through a simple math problem and a little string conversion problem, and used those three frameworks on the same problem.
45:53 So people can see apples to apples, compare the three frameworks, and then it extends from the basics of getting a test running through to how fixtures work.
46:07 And it's really the mechanics and syntax of test writing.
46:11 And I've written about other things, but the book is basically a collection of those mechanics posts, so that people can come up to speed really quickly on any of those frameworks and compare them as well.
46:28 Yeah.
46:29 That sounds really helpful if you're getting started and you're not really sure what to start with.
46:34 So do you pick a winner in the book?
46:36 You know, I didn't in the book, but I think pytest is a clear winner for me because of the fixture mechanism that's built in.
46:47 Plus it's a superset.
46:48 pytest really does everything that the others do and more.
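For anyone who hasn't seen that fixture mechanism, here's a minimal sketch; the "instrument" resource is a hypothetical stand-in for real setup like connecting hardware and routing signals:

```python
import pytest

@pytest.fixture
def instrument():
    inst = {"connected": True}   # setup: e.g., connect, route signals
    yield inst                   # the test body runs here
    inst["connected"] = False    # teardown: e.g., release the switch box

def test_instrument_is_ready(instrument):
    # The fixture is injected by name; setup/teardown happen automatically.
    assert instrument["connected"]
```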
46:51 There are definitely applications for each. When I got into it, I expected to pick one and not like the others.
47:00 I'm not there yet.
47:01 I think all three of those are reasonable.
47:03 The most reasonable are probably unittest and pytest.
47:08 Those are the two.
47:09 Yeah.
47:09 unittest is probably the most, what, like, straightforward?
47:13 I don't think it's straightforward.
47:15 I think it's the most confusing to start with, because it's this xUnit style that came from Java, where you have to derive from a TestCase class.
47:27 And also, they discourage you from using straight asserts.
47:32 You should use these assert helper functions.
47:34 And, you know, I think there's more to memorize.
47:38 If you're just starting out, pytest is a lot easier.
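The same check in both styles makes the difference concrete; this is a generic illustration, not an example from Brian's book:

```python
import unittest

class TestUpper(unittest.TestCase):             # unittest / xUnit style:
    def test_upper(self):                       # subclass TestCase and use
        self.assertEqual("abc".upper(), "ABC")  # the assert helper methods

def test_upper():                               # pytest style: a plain function
    assert "abc".upper() == "ABC"               # and a bare assert is enough
```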
47:41 And I don't think anybody new to this should use nose, because it's not really supported... I don't even know if it is supported anymore.
47:53 There has been a little bit of effort on it, but not much.
47:56 But I included writing about it because there are quite a few people that still use it.
48:02 And if it's not your decision and it's your job to use it, then I want to have a reference for people to be able to come up to speed quick.
48:11 I would like to write a follow-up book.
48:14 The book I have now, like I said, is a $5 collection of a whole bunch of posts.
48:22 But I read a lot on a Kindle,
48:27 so I wanted to make sure that those sorts of posts were available for somebody to read on an e-reader or something.
48:34 Yeah, nice.
48:35 And I'll be sure to put a link to the book and whatnot in the show notes.
48:38 But I do want to write a follow-up that, instead of just a collection of posts, is actually in a kind of technical book format, and walks through stuff and doesn't repeat things too much.
48:50 And I don't know if I'm going to include nose.
48:54 I think it would be sufficient to just include unittest and pytest in the discussion.
48:59 Yeah.
48:59 Okay.
49:00 So, you know, six years ago maybe the space looked different, and now you're like, all right, pytest is kind of the clear winner in some ways.
49:10 And then, well, unittest is built in, so people are choosing it because it's built in.
49:15 I mean, for a lot of the things I do that are heavy on fixture use, pytest is a clear winner.
49:24 If you've got a whole bunch of tight algorithms... the overhead of pytest isn't very much, but the overhead of unittest is less.
49:36 So if test speed really matters, if you're trying to squeeze every millisecond out of your test suite, unittest will be a little faster.
49:44 Right.
49:45 Okay.
49:46 Good to know.
49:46 I'll probably get a bunch of hate mail from the pytest people after this, but.
49:50 Oh yeah.
49:51 I think we've set ourselves up to receive a lot of feedback on various areas, but.
49:57 There's a lot of respect between the people that are developing unittest and the people that are developing pytest, because they're two ways to solve the same problem,
50:06 and they're learning from each other.
50:09 And, and I think having both around is a very good thing.
50:12 Yeah.
50:13 Yeah, absolutely.
50:14 All right.
50:15 A few more things I want to talk to you before, before I let you go.
50:18 Tell us a little bit about your podcast.
50:20 It was kind of a follow-on to the book, right?
50:23 You wrote the book and you've got the website that's doing a lot of good.
50:27 And you're like, you know, let's make this more of a weekly, semi-weekly sort of conversation with the world.
50:33 Right.
50:33 Right.
50:34 So the intent was actually to start off weekly, and maybe get to more than weekly, like a couple a week. I'm really trying to do one a week,
50:43 but it's often once a month.
50:46 I want to cover more topics than I can write blog posts about.
50:52 And that's the real reason why I'm doing the podcast: to try to cover more topics,
50:57 and also things that I think somebody wouldn't necessarily read about.
51:02 Like, I did an episode on the waterfall model that I think, you know, it'd probably be an interesting thing to listen to.
51:09 But, you know, somebody's probably not going to go read about the history of the waterfall model.
51:15 So I've noticed that a lot of my, my audience is coming from developers and from testers.
51:21 And I know from my own experience as a developer that there are huge gaps in my education when it comes to testing.
51:30 A developer's usual
51:31 introduction to testing is to talk about test-driven development.
51:36 And for the writing-a-test part, we get a statement that says "first write a failing test," but no information on how to do that or what the test should look like.
51:47 And then, on the other end, you've got testers that maybe come from a different kind of background, and they don't have a lot of the skills and education that developers had, but they're expected to write automated tests.
52:00 So on both sides, I want to fill the education gaps, in both testers and developers.
52:09 And I know that people are busy.
52:11 They've already got a ton of stuff to do.
52:12 So I thought an audio format would be a good way to do that.
52:15 So people can, you know, get a little bit on their way to work or whatever.
52:18 Yeah.
52:19 And it's definitely a different type of interaction than a blog or, or a book, right?
52:25 Even than it, even different than an audio book.
52:28 You know, you can just fire up your podcasts while you're driving, biking, walking, doing the dishes, whatever.
52:34 I really liked that format.
52:35 So obviously I guess I'm spending a lot of time on it, right?
52:38 Yeah.
52:39 And I also, I was finding that I wasn't writing as much as I wanted to.
52:43 And I have less time now as I go into a manager role.
52:49 And I thought maybe it'd be easier to do a podcast than it would be to write.
52:53 But I've found out that it actually takes more time than writing.
52:57 It's a lot of work, but it's, it's also a lot of fun, right?
52:59 To hear back from everybody.
53:00 So I, I just love it.
53:01 And one of the things I was worried about: since I'm not a web developer,
53:07 and I thought that a lot of the audience is from the web development community, I thought
53:12 maybe that would be a hindrance.
53:14 But when I was talking to Harry, I think there's a lot of similarities between how web systems
53:19 go together and complex embedded systems.
53:22 So I think, I think it all works out.
53:24 Yeah, I would say so.
53:26 And I think the diversity of viewpoints, it's super helpful.
53:29 Like not everybody's writing straight up web apps.
53:31 So, plus, I've heard from a lot of people that I would never have heard
53:36 from otherwise.
53:37 Yeah.
53:38 Yeah.
53:38 That's, that's really cool.
53:39 So I'm really glad to see it going well for you.
53:41 That's awesome.
53:42 I know we spoke just a little bit before you guys started.
53:44 So that's cool.
53:45 Good to, good to see it going.
53:46 And speaking of that, we are both from Portland and something awesome is happening in Portland
53:52 in June, right?
53:53 Yeah.
53:54 PyCon is here.
53:55 Yay.
53:55 Yay.
53:56 PyCon comes to our hometown.
53:58 I happen to be living in Germany for a year, so I'm flying like 12 hours to get there, but
54:02 nonetheless, I'm still going to go.
54:04 It's going to be awesome.
54:05 Are you going?
54:06 Yeah, definitely.
54:08 And I've actually submitted four proposals, three talks and one
54:15 tutorial.
54:15 I don't know if I'll get any, but we'll see.
54:18 Oh, very nice.
54:19 That's awesome.
54:19 So hopefully, hopefully they do get accepted.
54:21 That'd be great.
54:22 Yeah.
54:23 So if people are going, I'm sure a lot of listeners are going to go, feel free to reach
54:27 out to me and I'm sure Brian, you'd be interested in meeting people that want to chat as well.
54:31 So.
54:32 Yeah.
54:32 What, one of the things we talked about is giving away some copies of my book.
54:36 Yeah.
54:36 So if you go to talkpython.fm, at the top it says Friends of the Show. As long as you
54:42 are subscribed on that list, all it takes is an email address.
54:45 We will, well, I suppose only one of us will do this, randomly choose...
54:50 What do you say?
54:50 Five copies.
54:52 Yeah.
54:52 Let's give away five.
54:53 Awesome.
54:53 So on the week that this show comes out, I will randomly choose five listeners from the
54:59 friends of the show, and we will be shipping you eBooks of Brian's Python testing
55:05 book.
55:06 Yeah.
55:06 Unfortunately, it's got a really long title that I don't even remember.
55:09 We'll ship you the book about testing with a really long title.
55:15 No, no, it's good.
55:15 We'll definitely send you an eBook copy.
55:18 So sign up and you'll be in the running.
55:20 Thanks for doing that, Brian.
55:21 That's awesome.
55:21 All right.
55:23 Parting thoughts.
55:24 I always ask my guests, well, unless we run out of time.
55:28 If you're going to write some Python code, what editor do you use?
55:30 I use Sublime, but I use it for everything, not just Python code.
55:35 Sublime is good.
55:36 Very nice.
55:36 Very nice.
55:37 And PyPI packages.
55:38 What one's your favorite, or one people maybe don't know about but should?
55:41 Well, probably nobody's going to care about this, but my favorite is PyVISA, because
55:46 I couldn't do my job without PyVISA.
55:49 Nice, nice.
55:49 Yeah.
55:50 So PyVISA for talking to the machines.
55:52 That's cool.
55:53 And then maybe I'll throw in pytest as well, right?
55:56 Because that's not built in.
55:57 Oh, right.
55:58 Yeah.
55:58 I guess I should have brought that up.
56:00 Perhaps.
56:01 No, no.
56:02 It's all good.
56:02 They all go together.
56:03 One of the things... it's not a package, but I'd like to do a shout-out to the Anaconda
56:09 team, or whoever does that, because that's the distribution that I encourage people at work to use.
56:16 They're doing really amazing stuff.
56:17 Yeah.
56:18 That's Travis Oliphant.
56:19 And the Continuum guys, they were on show 34 and they definitely make Python in general
56:25 more accessible, especially on Windows.
56:26 So very cool.
56:27 Yeah.
56:28 It's actually that show that got me to download the Anaconda package and try it out.
56:32 Oh, nice.
56:33 Very cool.
56:34 So final call to action before we say goodbye.
56:36 Subscribe to your podcast.
56:38 How do they do it?
56:38 Oh, yeah.
56:40 At pythontesting.net/podcast, there's a bunch of information there.
56:48 And I also send people information:
56:51 I've got a little email newsletter that goes out whenever I post anything new.
56:56 You can reach me on Twitter @brianokken, or the podcast is @testpodcast.
57:02 Awesome.
57:03 Awesome.
57:03 And all that's going in the show notes.
57:04 Brian, this has been really fun and it's interesting to get different opinions on testing.
57:10 So thanks so much for sharing yours.
57:11 Oh, yeah.
57:12 Thanks a lot for having me on.
57:13 You bet.
57:14 Talk to you later.
57:16 This has been another episode of Talk Python to Me.
57:18 Today's guest was Brian Okken.
57:20 And this episode has been sponsored by Hired and SnapCI.
57:22 Thank you guys for supporting the show.
57:24 Hired wants to help you find your next big thing.
57:27 Visit Hired.com slash Talk Python to Me to get five or more offers with salary and equity presented right up front.
57:33 And a special listener signing bonus of $2,000.
57:35 SnapCI is modern, continuous integration and delivery.
57:39 Build, test, and deploy your code directly from GitHub.
57:42 All in your browser with debugging, Docker, and parallelism included.
57:46 Try them for free at snap.ci slash Talk Python.
57:49 You can find the links from today's show at talkpython.fm/episodes/show/45.
57:55 And be sure to subscribe.
57:57 Open your favorite podcatcher and search for Python.
58:00 We should be right at the top.
58:01 You can also find the iTunes and direct RSS feeds in the footer of the website.
58:05 Our theme music is Developers, Developers, Developers by Corey Smith, who goes by Smix.
58:10 You can hear the entire song on talkpython.fm.
58:13 And don't forget to check out the new podcast t-shirt at talkpython.fm/shirt
58:18 and share your love for Python with the whole world.
58:21 This is your host, Michael Kennedy.
58:23 Thanks for listening.
58:24 Smix, take us out of here.
58:30 Haven't been sleeping.
58:31 Haven't been sleeping.
58:32 I've been using lots of rest.
58:33 I'll pass the mic back to who rocked it best.
58:36 I'm first developers.
58:38 I'm first developers.
58:40 I'm first developers.
58:41 I'm first developers.
58:44 I'm first developers.
58:45 I'm first developers.
58:47 First of all, first of all, first of all, first of all.