Dean Karlan, professor at Northwestern


Dean Karlan is the Frederic Esser Nemmers Distinguished Professor of Economics and Finance at the Kellogg School of Management, and co-director of the Global Poverty Research Lab at Northwestern University. He is the Founder and President of Innovations for Poverty Action, a non-profit organization dedicated to discovering and promoting solutions to global poverty problems, and to scaling up successful ideas through implementation and dissemination to policymakers, practitioners, investors, and donors. His research focuses on development and behavioral economics, and he is on the Board of Directors of MIT's Abdul Latif Jameel Poverty Action Lab (J-PAL).

Dean joins me today to discuss his academic career and why he decided to become a founder. There is a theme running through Dean's academic work, and we discuss how it influences the organizations he has founded. We discuss the frustration that comes when the evidence you are generating is not changing policy. We also talk about his other startups, stickK and ImpactMatters.

“Know what problem you're trying to solve… as clear as you can in your mind, so that you are able to focus on thinking through whether what you're doing is really addressing that problem.” - Dean Karlan

Today on Startups for Good we cover:

  • How IPA (Innovations for Poverty Action) works
  • Randomized controlled trials - the need for them and the criticism they draw
  • The size and scale of IPA today and some of the challenges they have faced
  • The Lean Startup methodology and whether he used it for IPA
  • The acquisition of ImpactMatters by Charity Navigator
  • How to measure the impact of a charity
  • Lessons learned from being a founder

Connect with Professor Karlan on LinkedIn or Twitter

Subscribe, Rate & Share Your Favorite Episodes!
Thanks for tuning into today’s episode of Startups For Good with your host, Miles Lasater. If you enjoyed this episode, please subscribe and leave a rating and review on your favorite podcast listening app.

Don’t forget to visit our website, connect with Miles on Twitter or LinkedIn, and share your favorite episodes across social media. For more information about The Giving Circle

Transcript
Miles

Dean, welcome to Startups for Good. Thanks for coming on.

Dean

Sure. Thanks. Thanks for having me.

Miles

Yeah, really excited to get into this with you. Where I'd like to start is: you had an academic career that was going very well. Why did you choose to become a founder?

Dean

Well, you know, as a researcher, as an academic, we get involved in a lot of exciting ideas. I think there are two different spirits that drive some of us to think outside of pure academia. One is when the research leads to something that you think can actually be done in the real world, an idea that could have legs and help people in some way. And so: let's go try it, let's go do it for real. Let's not just set up lab experiments and clinical trials, where it's very controlled, but actually go test these ideas out in the wild.

Miles

And this seems to be a theme of your academic work, even, that line of: how can knowing better help us do better?

Dean

That's right. A lot of times I don't think about creating something; I'm able to just work with a partner who's doing something that's proximate, and then we can brainstorm together. They're the doers, and we're the studiers. But there are other cases where I've been involved in the actual doing. The other spirit, on the nonprofit side, is creating things because there's a void, a market failure for some reason: an institution does not exist that is doing something we think can be done, is scalable, can line up financing, and can provide value to the world in some way. And so: let's go do it.

Miles

And is that how IPA started?

Dean

That's exactly how it started. IPA is Innovations for Poverty Action, as you know, and I started it right when I finished graduate school. It was recognizing a few different things that were missing.

One was that I was just starting off as a researcher, and there are other researchers, who just won the Nobel Prize last year, who have been leaders in economics and social science in pushing out a movement for randomized controlled trials. But a lot of the work to do those randomized controlled trials requires really talented people working with partner organizations, collecting data, doing a lot of nitty-gritty, detailed work: working with partners to implement change internally, collecting data, managing surveyors, all of that. The reality is we needed an organization that was going to help us do it. We can't just have a rotating set of graduate students who go into the field for three months, and then three months later it's another set of grad students. That's very risky, it's not always in line with what a grad student should be doing, and there are lots of reasons why that's not a good, stable, long-run system.

The other thing we realized is that there are huge economies of scope. If there's a great team helping me in Peru or the Philippines, which were the first two countries where we set up research, or in India, where it was helping (unintelligible), the reality is those awesome people in those countries can help more than just that small set of researchers. Once they're there, experienced and knowledgeable in that kind of work, there's a huge economy of scale to be had by helping other researchers also use that knowledge, that infrastructure, that human capital to coordinate their similar studies. So that was one of the birthing ideas of IPA.

The second was recognizing that as academics we have huge incentives to write academic papers, but not so much to influence policy. And yet that's not why a lot of us got into this kind of work, including myself. We got into it because of a humanitarian interest: seeing that there are problems in the world, and a strong belief that more evidence can really improve the way policy is done by governments and by nonprofits. We want to see our research, when appropriate, change lives. But that's not going to get done by just writing academic papers and then thinking the world's going to download them and use them. That's just crazy. We need to work with people who are really committed to that policy process. And so that was the other motivation for creating IPA.

Miles

You heard it here first folks, a professor admitting that not everyone reads his papers.

Dean

Oh my gosh, far from that.

Miles

So if you were going to give an example of IPA's work so that people can understand it more tangibly, how would you do it?

Dean

So the lion's share of IPA's work has been supporting and conducting randomized controlled trials. In today's day, with COVID, everybody's quite familiar with the randomized controlled trials that were done for vaccines, and there's a reason why we waited for those trials to happen before everybody started feeling comfortable that this is a vaccine we can feel good about. It's taking that same discipline in thinking about safety and prudence and causality and applying it to the change we hope to create in the world, across many, many domains: from cash transfer programs, to programs like microcredit, to training programs, to communications and text messaging about health or savings, to financial services, corruption, education programs. There are a million questions like that where just following someone over time, or following a group of people, or communities, or schools over time, doesn't really tell you which program caused what change. A randomized controlled trial is better able to do that than most other methods in those contexts. Obviously there are a lot of contexts where you can't do it, and I want to be clear: it's not a one-size-fits-all panacea for all development questions. But when it is appropriate, it can really help move the needle in establishing that a certain policy, in this context, causes a certain set of changes to happen.
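To make the core idea concrete, here is a minimal simulation sketch in Python. The data, sample sizes, and effect size are invented purely for illustration; this is not any actual IPA study, just the logic of random assignment that Dean describes.

```python
# Toy illustration of a randomized controlled trial: random assignment
# lets a simple difference in group means be read as a causal effect.
# All numbers here are simulated, purely for illustration.
import random

random.seed(0)
people = list(range(1000))
random.shuffle(people)                       # random assignment to groups
treatment, control = people[:500], people[500:]

def outcome(treated: bool) -> float:
    # Simulated outcome: noisy baseline plus a true treatment effect of 2.0.
    baseline = random.gauss(10, 3)
    return baseline + (2.0 if treated else 0.0)

treat_mean = sum(outcome(True) for _ in treatment) / len(treatment)
control_mean = sum(outcome(False) for _ in control) / len(control)
print(f"Estimated effect: {treat_mean - control_mean:.2f}")  # close to 2.0
```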

Miles

And you've been a real pioneer in applying evidence to these types of situations. Although not everyone agrees. Even today you hear criticism pushing back: we don't have the money to spend on it, we don't have the time, the problem's too urgent. Or I've even heard: just because you do an RCT in a couple of different places doesn't mean it's going to apply to the next. Or someone saying it's common sense, we know what's right, we just need to do more of it.

Dean

There's a handful there I can...

Miles

Yeah, there's a lot there. But I'm curious whether you see people becoming more accepting of this idea over time. Has there been a change?

Dean

I mean, there's definitely been a shift over time. There's no doubt; I've seen a few graphs, though I realize this is a podcast, so we can't give you a graph. The data have been tracked, and there are just a lot more randomized controlled trials now than there were 20 years ago on these questions. Obviously there are a lot of situations where it's not the right tool to answer a question, but there are a lot of situations where it's become much more standard. When I first started in these kinds of conversations, roughly the year 2000, it was definitely the case that a lot of conversations had to start off by explaining what we mean by a randomized controlled trial and why we were proposing that kind of method for answering these questions. Whereas now, it's not that I never find myself in that conversation, but it's actually fairly rare. More often than not, a partner reaches out to us saying: hey, we saw a study that used this method, it seems very appealing to us, please work with us to implement a study of that nature. We're spending a lot less time convincing someone that randomization is good, and a lot more time helping figure out what the research question should be, what their operational challenges are, and how we can best make the research serve their purposes.

Miles

Right, and could you share with our listeners a little bit about the scale and size of IPA today?

Dean

Sure. We grew very fast in the early years; lately we've been a bit more stable. We've run randomized controlled trials in over 51 countries, and I believe the current count is 21 country offices, where we concentrate our work. The country office is actually a really important concept, because a lot of the relationships and the policy influence I mentioned earlier don't happen remotely very well. By having a country office, we're able to build better relationships with local government, local policymakers, local NGOs, and local academics. A lot of the use of evidence to influence what they're doing comes about through a relationship: through years and years of discussion, helping to make the research better suited to answering the questions they have, and building trust and conversation over a long time. So that's an important part of IPA's principles for how we go about doing our work: to really invest in these country offices and think for the long run.

In terms of size, at any one point in time we have roughly 1,000 people on payroll. About half of those are, in a sense, more permanent or long-term employees, and about half are responsible for data collection. At any point in time we have a lot of surveys happening, and as part of a survey there might be someone hired on a one-month contract to do the surveying. So that's about half of the employees, and the other half are more, you know, one-to-two-year kinds of engagements to help manage a project. We've now done over 800 randomized controlled trials, and about 400 or 500 researchers have worked with us.

So compared to what I mentioned earlier about our early days, when it was just a handful of researchers, one of the most exciting things we were able to do fairly early on was figure out how to build a community here, and help other people who just have one project and can't build infrastructure for one project. How can we support that? How can we invest a lot in things like methods, issues about data collection and data quality, things of this nature that might otherwise get overlooked and that can really help a lot of people?

Miles

That's wonderful. That's true scale, and it's so impressive to see a nonprofit in particular scale like that. What was one of the biggest challenges along the way?

Dean

So, um, I think the biggest challenge, actually one we still face, is that we're putting a lot of interested parties together who share a common vision, a common idea that evidence can really help move the needle on improving policy. But at the end of the day, everyone is not perfectly aligned. Academics have their needs and desires, implementers have theirs, there are funders, and juggling all of that can often be tricky. It's a good challenge to take on, it's one that we embrace, and I think doing it well is our strongest value added, but it does pose all sorts of operational challenges, everything from financing challenges to contracting issues that are too important not to deal with but can be a challenge. That's kind of a very inside-game answer for you. I feel like there's probably a better answer for a podcast, but I don't know why my mind went down to that operational level of challenge.

Miles

I think if you have something else you want to share, go ahead.

Dean

I think the broader challenge, I would say, is that despite all of the success we've had, those roughly 800 randomized controlled trials are still the tip of the iceberg. There are so many competing forces that fight against evidence, everything from our human nature of feeling like, well, if I see it with my own eyes, then I don't need evidence (which is just not true), to politics, to funding. It's not always as easy as saying: as long as we know that A works better than B, and we make that clear with a nice little bar chart where A's bar is higher than B's, then problem solved. We just show that to people and... that's just not how it works. So it's a continuous effort and a continuous struggle. We've had some exhilarating successes, but it's still the tip of the iceberg in terms of the kinds of problems out there where more evidence can go a long way toward a solution. You just keep at it.

Miles

No, that must be really frustrating when you feel like you're generating that evidence, but it's not influencing policymakers.

Dean

You know, usually it's for good reason, right? It's usually not arbitrary; there's some reason. There's an operational challenge to doing something, there's a funding bottleneck, or maybe the evidence is weak. And it goes to something you asked earlier: maybe the evidence is from one context and we're working in a different context. Will the evidence translate over here? We think it will, but there are reasons why it may not. So maybe you need more evidence. There can be good reasons why someone says: that's interesting, not going to do it. I'd like to think that if that's the case, their answer is: that's interesting, not going to do it without further evidence, so let's go get it, and then we can decide whether to scale it out. That's kind of the ideal, but it's not always that easy.

Miles

Do you think that the Lean Startup methodology can be used in this kind of context?

Dean

Broadly speaking, yes. A lot of the things that undergo the kind of evidence we're talking about, a randomized controlled trial, are testing something that came out of a Lean Startup kind of concept. In a sense, that's actually easier, because a lot of times as academics, when we're trying to study the impact of something, we like it when that something is super well defined. It makes it easier to get your head around what is happening, why it might be happening, and whether it's generalizable knowledge that can help in many other contexts or not. To do that, there's actually a push for things to be really focused and narrow, almost simple, and that's very much in the Lean Startup mindset of a very simple app or device or something of that nature. Things get really complicated when there's a program or policy that says: we're going to do 17 things; there's going to be some irrigation, some child nutrition, we're going to send messages about vaccines, we're going to do all these things at once. If you try evaluating that, you might be able to tell the stakeholder who paid for it what they got for their money, but you're going to be much harder pressed to create generalizable knowledge that helps inform other players about how they can address problems in their communities. That's the other end of the spectrum from the Lean Startup model. So in a sense I think the two worlds are fairly aligned: the Lean Startup world and the RCT world.

Miles

Gotcha. And IPA is not the only organization you've started. In fact, I think congratulations are in order for ImpactMatters being acquired by Charity Navigator.

Dean

Yeah, thank you. It's been an exciting year for us; that's probably the only good thing I'll ever think about from 2020. So I guess we got two good things.

So, backing up a bit: with IPA, as we've mentioned, you can build around the idea of generating knowledge about what works and what doesn't. A lot of times we would get an inquiry, I would get a personal one or IPA would get one, that says: hey, that's interesting. I'm a donor, and I'd like to support a nonprofit that does the work you've found good evidence for. Where do I send my money? And while occasionally there's a study focused on the entirety of what a nonprofit does, usually that's not the case. Usually the research is about some idea; it might be one program of an NGO, or something even narrower than that. So if someone called us up at IPA and said, hey, I have the name of a charity, should I support them?, there was almost zero probability that we would have a good, clean randomized controlled trial of exactly what that nonprofit happens to be doing, to be able to tell them what the evidence says. And IPA also did not want to be in the business of ranking charities. The charities were our partners, our implementers, and if we were at the same time in the business of having them compete with each other, that doesn't work well with our model of being the in-between between the researchers, the implementers, and the funders. So we wanted to create an entity whose task was: how can we help donors know which charities are more likely to have positive impact, which are the good charities, so to speak?

What was out there were basically two organizations operating at fairly large scale. One was GiveWell, which used a lot of randomized controlled trials from IPA, which made us very happy, but which would only name the very top charities. I admire them a lot for what they've accomplished and the money they've moved, but they really only helped the kind of individual who says: hey, I read about your philosophy and the way you think about impact, and I love it, so tell me where to donate. Great, never heard of them, but I believe you; I'm going to send my money. The reality is most donors don't behave that way. Most donors have a charity in mind. They want to support their local homeless shelter, or some scholarship program for minority kids in their community, and they want to know: is that charity any good? That's not helped when you're only naming six charities in the entire world; you can't help those kinds of donors.

The group that was out there helping those kinds of donors was Charity Navigator, which is the group we have now merged with. The historical challenge for Charity Navigator is that they would use overhead ratios as their proxy for impact. But there's really not much good research showing that overhead ratios are good proxies for impact; in fact, if anything, I think the evidence goes in the opposite direction. It's just really easy to use them: it's publicly available data filed with the IRS, so you can get it for thousands and thousands and thousands of nonprofits. But that doesn't mean it's actually a measure of impact.
And so we built a model for different cause areas to estimate impact, combining the social science research with each nonprofit's output data, and their own impact evidence if they actually had it. We built estimates for about 1,500 nonprofits, and that was about a year ago. We had always been talking with Charity Navigator over the years and partnering with them as much as we could, and they had always had an interest in including impact information; they just hadn't done it. So that's where the merger made total sense. We were like: well, we've done this for 1,500, we can do it for more. We didn't care about our brand. We saw it as: if we can get inside the market leader and help them put impact inside their rating system, then we've won. That's our goal. So we got rid of the brand, and we're now inside them, helping to integrate our methods into their impact ratings.

Miles

Well, congratulations again on that. Can you break it down in more basic terms: when you say overhead ratios and measuring impact, what do those two things mean?

Dean

Sure, sorry. So, an overhead ratio: when a nonprofit reports its financials to the IRS, it reports a lot of numbers, but there are three that are highlighted. Say its overall expenditures for the year are a million dollars. The first number is how much of that million was spent on what's called program services. That could be cash transfers, the cost of food if they're providing food, or the cost of labor; if they're providing entrepreneurship or job training, it's the cost of the employees leading those classes and sessions. The second line is what's called administrative: things like the accountants, the comptroller, the CFO, and usually part of the CEO. The third line is fundraising costs: your director of development, maybe the cost of direct mail, maybe part of your CEO's salary. The overhead ratio is basically the sum of those second two categories divided by the nonprofit's total overall expenditures.

The basic idea put forward years ago, which I think is a bit unfortunate for the space, is: hey, we want that ratio to be low, so that more of the money donated goes to program services. And that actually causes a fair amount of distortion. It's a good thing only for getting rid of fraud. I want to be clear: there are definitely some organizations out there that are basically fraudulent, where maybe more than half of their money is spent on fundraising and overhead, and that's a sign of something not right. But there are very few organizations like that; last I saw, about 2% of nonprofits spend more than half of their money that way. It's a very tiny problem, and those are not organizations most people have ever heard of. They're kind of weird organizations that do lots of phone banking and fundraising and things like this.

The problem, of course, is: in some sense, why do you care how the sausage is made? Suppose you know there are two organizations, both spending a million dollars. One says 80% of that goes to program services, and the other says 60%. But the one at 60% is actually having a bigger impact for its million dollars. I prefer that one: for the same million dollars, it's creating a bigger impact than the other, and what goes on which line of its tax return, I don't really care. The only reason to ever look at overhead is if you have no idea whatsoever what the impact is. If you actually have some estimate of impact, you should throw away all information about overhead. Well, I'll put a little asterisk on that, with the exception of fraud: you can use it as a first filter, like don't even bother looking at a nonprofit that's spending half its money on overhead. So that's the basic point: when people focus on overhead, it's not that they actually care about the overhead; they care about it as a proxy for impact.
So then why don't we tackle the measurement of impact better, and come up with more direct measures rather than using proxies? Particularly when there are lots of reasons to think the proxy might be completely flipped around in the wrong direction. An organization that is being really thoughtful about how it designs its program, hiring senior management who are really experienced and knowledgeable, and investing in evidence, going back to the first part of our conversation, investing in thinking about the right way to run the program, is probably going to have a bigger impact, I think. And yet its overhead ratio is going to look bigger. That's unfortunate, because then it gets hurt by someone looking only at overhead ratios, when in fact, for its overall budget, it's having a bigger impact.
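As a worked illustration of the arithmetic Dean describes, here is a small sketch. The dollar figures are hypothetical, not drawn from any real nonprofit's IRS filings.

```python
# Worked illustration of the overhead ratio Dean describes.
# Figures are hypothetical, not from any real nonprofit's Form 990.

def overhead_ratio(program_services: float, administrative: float,
                   fundraising: float) -> float:
    """Overhead ratio = (administrative + fundraising) / total expenditures."""
    total = program_services + administrative + fundraising
    return (administrative + fundraising) / total

# A nonprofit spending $1,000,000 total: $800,000 on program services,
# $120,000 on administration, $80,000 on fundraising.
ratio = overhead_ratio(800_000, 120_000, 80_000)
print(f"Overhead ratio: {ratio:.0%}")  # -> 20%
```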

Miles

So how do you measure impact?

Dean

So, broadly speaking, there's primary data, and then there's secondary data. If we go back to the first part of the conversation: with Innovations for Poverty Action, we would work with an organization to set up a primary data collection exercise, a randomized controlled trial, to measure its impact. ImpactMatters went through the secondary process: it did not work with organizations to collect new data. Instead, what ImpactMatters would do is look and see: do you have any impact evaluations we can use? We can assess those, and if they're good, use them to build an estimate of your impact. Or, in the case of a lot of organizations, they didn't have their own impact evaluation, but they did have good enough data about what their activities were, and the social science literature told us enough about the likely impact of those activities, that we were able to combine the two and basically build a spreadsheet that estimates their impact.

Take scholarships as an example: scholarships for children to go to college. A lot of these are very small programs, and I would not necessarily advise a tiny program offering 150 scholarships a year to conduct its own primary data collection exercise with a randomized controlled trial. But there have been a dozen or so studies out there that help us learn: when a high school student gets a scholarship, how does that change the likelihood that they go to college? And there are tons of other studies that tell us: if someone goes to college, compared to someone who doesn't, how much extra are they likely to earn later in life? So we can combine all of that to estimate the expected impact of a nonprofit that hands out a $10,000 scholarship. We can look to those other pieces of research and say: well, typically a $10,000 scholarship increases the likelihood of going to college by 3.4 percentage points, and going to college helps you earn $50,000 more a year. I have no idea what those numbers actually are, to be clear; I'm just giving an example. With that, we can build an estimate. Obviously we got a bit more nuanced and detailed than that; not all scholarship programs are the same, and we tried matching up as best we could between what a particular nonprofit is doing, who they're reaching, and how much they're giving away in scholarships, and the existing evidence from the social science literature.
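A minimal sketch of the kind of back-of-the-envelope estimate Dean describes might look like the following. Every parameter value here is a placeholder; as Dean says himself, his numbers were only examples.

```python
# Sketch of the back-of-the-envelope impact estimate Dean describes for a
# scholarship program. All parameter values are placeholders, not real data.

scholarships_granted = 150          # scholarships handed out per year
cost_per_scholarship = 10_000       # dollars

# Hypothetical figures standing in for the social science literature:
college_attendance_lift = 0.034     # +3.4 percentage points per scholarship
extra_earnings_from_college = 500_000   # extra lifetime earnings (placeholder)

# Expected impact: extra college-goers times the earnings gain.
extra_college_goers = scholarships_granted * college_attendance_lift
total_impact = extra_college_goers * extra_earnings_from_college
total_cost = scholarships_granted * cost_per_scholarship

print(f"Expected extra college-goers per year: {extra_college_goers:.1f}")
print(f"Estimated impact per dollar donated: ${total_impact / total_cost:.2f}")
```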

Miles

That makes sense. And so now you're expanding from that list of 1,500 to the larger universe of nonprofits. That's exciting.

Dean

It is. Well, you know, if we were going to go cause by cause and nonprofit by nonprofit, the team is working hard now within Charity Navigator to figure out how to go from there. One of the bigger bottlenecks is actually getting nonprofits to provide information. Homeless shelters and food banks were two of the other cause areas where we were able to see a lot of nonprofits in the US and build a model we could use for many, many of them. But I was blown away by how many food banks there are in America that do not report how many meals they gave out last year, and how many homeless shelters there are that don't report how many beds they provided the prior year. All you know is their annual budget was $7 million; you don't know whether that went to 100 people, 1,000 people, or 10,000 people. A food bank with a budget of $12 million, how many meals given? No idea. That's a shame. So one of the other plans, and it's not just a hope, we're already doing it, is working with Charity Navigator to build better incentives and a platform that makes it easier for organizations to report the essential information needed to do these kinds of estimates.

Miles

That's wonderful. Now, Charity Navigator itself is organized as a nonprofit, a 501(c)(3). Was ImpactMatters a nonprofit as well?

Dean

Yeah, ImpactMatters definitely was. I really don't think this is a space that's well suited to a for-profit exercise. The reality is there's a market failure in this kind of information. You want this information to be publicly available; we want retail donors to be able to access it. We never really considered doing this as a for-profit model. I don't think that would work in the long run.

Miles

And you have started at least one for-profit startup, in stickK?

Dean

Yes. Well, the incentive there makes sense. The problem with ImpactMatters being for-profit, I think, is that you also have perverse incentives that can really be detrimental to the credibility of what you're doing, and transparency in what you're doing is absolutely critical. One thing that was very important to the ethos of ImpactMatters was that everything be fully transparent and clear. We want to be criticized; it helps us improve our models. If you're a for-profit, you're like: oh, well, no, you can't see my secret sauce.

But stickK was a byproduct of theories from behavioral economics about how people often want to commit to engage in better behavior in the future: the kinds of things you say you want to do, but when the time comes, you find some excuse and don't do it, and then when you reflect back on it you think, oh gosh, I wish I'd done that. The most classic example is losing weight: I want to go on a diet, I'll start tomorrow; tomorrow comes, maybe the day after, and of course you regret it later. This fits smoking, it fits weight loss, for a lot of people it fits exercise, and a lot of other areas in life too: work patterns and habits, reading, maybe playing fewer video games, who knows. Everybody has their own area of weakness, so to speak.

So the idea behind stickK was to give people a tool to help them commit to the behavior they want, and they commit by putting up either their reputation or some money. If they put up their reputation, they're saying: I'm committed to doing a certain thing, and if I don't succeed, here are some email addresses of people who hopefully will keep me accountable. Tell them when I succeed or fail, and hopefully just knowing that they'll know will inspire me to do what I said I was going to do. The money can be even more binding, depending on the person: you put up a credit card, and if you don't do what you said you'd do, your money goes wherever you said it should go. That could be stickK itself, if you just forfeit the money, which is part of what keeps the lights on. It could also go to an anti-charity, which is one of the popular options: you choose a charity that you hate. We have both sides of a lot of politically hot issues, gun control, abortion, the political action committees on both sides, things like that.
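Here is a minimal sketch of the commitment-contract logic Dean describes, written only as an illustration of the concept; it is not stickK's actual code, API, or data model, and all names and values are hypothetical.

```python
# Illustration of a commitment contract: stake money or reputation on a goal;
# on failure, the stake is forfeited and referees are notified.
# Hypothetical sketch only, not stickK's real implementation.
from dataclasses import dataclass

@dataclass
class CommitmentContract:
    user: str
    goal: str
    stake: float            # dollars on the line (0.0 = reputation only)
    recipient: str          # e.g. the platform, or an "anti-charity" the user hates
    referees: list[str]     # people who hold the user accountable

    def resolve(self, succeeded: bool) -> str:
        if succeeded:
            return f"{self.user} met '{self.goal}'; stake returned."
        notice = f"referees {self.referees} notified"
        if self.stake > 0:
            return (f"{self.user} failed '{self.goal}'; "
                    f"${self.stake:.2f} forfeited to {self.recipient}; {notice}.")
        return f"{self.user} failed '{self.goal}'; {notice}."

contract = CommitmentContract("alice", "run 3x per week", 50.0,
                              "an anti-charity", ["bob@example.com"])
print(contract.resolve(succeeded=False))
```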

Miles

I can see how that could really motivate someone: not wanting to donate to the opposite side of their views.

Dean

Indeed, it's always been a popular option.

Miles

And how did you decide to start that company?

Dean

So that started basically because I had done research on the question of whether commitment devices can work. But there was a part of me that said: okay, it's interesting and exciting to have this study, and to see in this somewhat controlled process that it was effective, and other people have done research as well. It's another thing to ask: can this actually scale as a business, and be a bit more open about the kinds of contracts people can write? And originally, when I started it, I did have the vision of at some point doing research to help learn how to make these tools work better. I am now doing research with it. I did not in the early years, but I'm now doing some studies with it, to help learn how best to offer the tool, how to set it up, and what types of contract terms are more successful and help people more.

Miles

And what scale has stickK reached?

Dean

So I think the last time I looked, we have about 400,000 users over the life of stickK; we typically pick up about 30,000 to 40,000 a year, and obviously a lot of people come once and don't come back. So it's been a fairly slow, steady climb; no hockey stick. Needless to say, there's no IPO, unfortunately. Maybe someday.

Miles

What have you learned from that company?

Dean

Oh, so that's a tricky question, because of me as a researcher. I think I would say two things: one putting on my researcher hat and the other putting on my startup hat.

As a researcher, I actually wish we had done more research in the early days. We got so excited, and we got some early buzz that was kind of intoxicating, and it was hard to think about slowing things down and saying: let's just do a couple of studies first, and really dig in on whether the contract should look like this or that, in a kind of lean-startup, controlled way, not open to the public yet. We didn't do that, and I kind of wish we had; I think we might have been able to migrate to a more optimal contract faster. And once we were in the mode of being a publicly available website, it was harder to slow things down and do a side study.

From a startup perspective, I think in some sense we got pushed around in the early days in an unfortunate direction. There are a lot of employee wellness programs out there, and a white-label service stickK provides to companies, helping organize positive incentives to nudge employees toward healthy behaviors, is still a very active and vibrant part of stickK. I think that's a really important use case for stickK, but it's one that needs to be built on top of a foundation where the core website is more optimized. Instead, we shifted very quickly into that as the primary revenue model.

The other angst I have whenever I reflect back is that there are some gaps we were never able to close. The experimentation I mentioned, which I would have loved to do as a researcher, would also have helped stickK get to a better product faster: that early-stage tinkering, really simple randomized controlled trials, back to how we started the conversation with IPA, on what's going to help people succeed more, what's going to guide people to write a second contract and keep using the site in other domains of life. Had we done more of those kinds of studies in the early days, ironically, given that that's what my day job does in other domains, I think we could have had even more users.

Now, having said that, I shouldn't put this in such a negative way. We're quite proud of what we built. Jordan Goldberg and Ian Ayres were my co-founders; Jordan put his heart and soul into it and did an amazing job building what we built. And the fact that we're still continuing to gather more and more users every year is exciting.

Miles

So it sounds like you wish you'd done some A/B testing to tune the product and improve retention before you put the gas on sales and marketing?

Dean

Yep. Yep. That is true.

Miles

Do you have any other advice you would give to an aspiring founder? I mean, you've started multiple organizations here, nonprofit and for-profit. What advice would you give someone?

Dean

Um, know what problem you're trying to solve; I think that's the simplest advice. Have it as clear as you can in your mind, so that you're able to focus on thinking through whether what you're doing is really addressing that problem. It's easy to get pushed around. It's easy to start one thing and have someone, some partner, maybe a funder, maybe a client, suggest you do something slightly different: they want not what you're offering, but something else. And that's always tricky. Some of those moments can be life-changing; it can be exactly what needs to happen, and it can be eye-opening, shifting you in a totally new direction where you say: oh my gosh, it's not what I set out to do, but that's great, let's go there. On the other hand, it could be distracting and take you away from the core effort and the core idea. So I realize I've just argued both sides of that, maybe you should or shouldn't let yourself get pulled away. I guess my advice would be to think really hard about those moments in the early stages of an organization, when you're being pulled in a different direction than you set out in, and ask: are you just being opportunistic to keep the lights on, or is this actually a good direction to push your entire effort? Be thoughtful about that.

Miles

And I'd also be really curious to hear your advice for individual donors, both on a strategic and a tactical level.

Dean

Well, there, my advice, surprise, surprise, is: follow the evidence, if it's the kind of question that evidence can speak to. There are a lot of nonprofits, I want to be clear, that I think are amazing, doing awesome work that I personally would support, but that are not the kind of groups that could produce a randomized controlled trial. There you just have to use intuition and qualitative evidence about their activities and their place in the world, and make your decision. But for the ones doing something in what I would call the service delivery space, providing a service to people who, for whatever reason, are at a disadvantage, some group we care about from an egalitarian perspective, a humanitarian perspective, a justice perspective, whatever the case is, those usually can have some evidence on: is it actually working? Is it doing what you set out to do? So follow that evidence.

Miles

Follow the evidence. That's great advice. In closing, I'd love to know how people can follow you online.

Dean

Sure. On Twitter I'm @DeanKarlan, and IPA is at poverty-action.org and is also on Twitter. Those are the two best ways, basically. I'm not quite as active on Twitter as IPA is, so following us both probably makes a lot of sense.

Miles

Well, thank you so much for coming on Startups for Good. You're doing so many amazing things, it's hard to cover it all in this short amount of time, but hopefully people can follow up.

Dean

Thanks for having me. It was fun. Thank you. Good to catch up.