As Thinkful has evolved from an online coding course provider to a full-time, online coding bootcamp, they’ve tightened their focus on student outcomes. Thinkful published its first Jobs Report in 2015, and since then co-founder Darrell Silver has been outspoken about encouraging other bootcamps to do the same. We asked Darrell to discuss in detail how Thinkful collects and analyzes student outcome data, why auditing is crucial to the process, and why outcomes are essential to maintaining student trust.
For our readers who don't know, what is your role at Thinkful?
I'm the CEO and co-founder (along with Dan Friedman) at Thinkful. We're just about to hit our four-year anniversary.
How has Thinkful changed over the past four years? What does it look like compared to the original product?
It's surprisingly similar in its approach to education. We've always been an online school and we've always had one-on-one mentorship. Each one of our 7,000 students has worked with a mentor throughout the entirety of their Thinkful program.
What has changed and evolved the most is actually our focus on outcomes; our programs have become more intensive and rigorous, with a higher promise to students. Because of that trajectory, outcomes data and job guarantees have evolved as well.
More than half of our students are enrolled in the Full Time Web Development Bootcamp. Both Full Time and Flexible courses have a job guarantee.
We still have a lot of students in our Skills courses like backend development and mobile development, which promise proficiency in the topic. We also have a reasonable amount of enterprise training; for example, Uber uses Thinkful to train hundreds of their back end engineers.
Thinkful has also made it a point to report on student outcomes. Why was that important to you as an education company?
That’s almost like saying, “Why does your car need four wheels?” I suppose you could get away with three, but it's not quite as stable.
We've been talking about outcomes internally since we launched the Flexible Web Development Bootcamp program in April 2015. We knew this would be important, so we started tracking data from those first students. Once we finally got enough data — meaning enough students had graduated — and started seeing that students were graduating and getting jobs, we knew it was time to build a public tool to show those outcomes. So in November 2015, we put together the Transparency Report, then we kicked off the third-party audit in January, and published that in mid-April (PDF).
Is there something unique about the coding bootcamp industry that makes it difficult to report student outcomes?
One of the things that makes it really difficult, and it's a bit ironic, is that because the industry has grown so fast, up until recently bootcamps didn't actually have to try very hard to attract students. Between the demand for web development, under-employment, and all the major macro forces that drive the industry, as a founder, you could start a new bootcamp very, very easily and if you built it, people would come. While that’s an incredible opportunity it also means that when the industry needs to mature, the schools who haven’t planned for that just aren’t going to survive.
A coding bootcamp can't just tack “transparency” onto the end of their marketing website. They actually have to build the approach into the culture of the company, and into their relationships with their teachers and their mentors. We built significant tooling around those statistics, to make sure they’re accurate and we can back up where they come from. It’s why we can report outcomes each month when most schools report only annually.
When I see bootcamps today that still don't report outcomes in a rigorous way, I think it’s not only bad for the industry, but it's also going to be harder for them to catch up. There are some schools that will end up cornered if they don't start publishing their data soon. They're going to start doing things that reflect poorly on all of the other coding bootcamps.
In-person schools usually have to be approved by their state’s regulatory agency. For an online school, have you faced any of those types of regulation roadblocks?
That's a really good question. We haven’t yet. At the moment, we work with our preferred lending partner, Skills Fund, to make sure we’re in compliance with all consumer lending practices.
What metrics do you include in the Thinkful Web Development Bootcamp jobs report?
The jobs report includes enrollment, graduation rate, job placement rate, time to graduate, and demographics like gender, location, etc.
How frequently do you update the report?
We update Thinkful’s numbers publicly every month. We do updates at the end of each month, reported on the 15th, meaning we get two weeks to make sure that all the data is in the system properly. The Web Development Bootcamp jobs report publishes every month automatically; there's no button to press, it publishes on its own.
Will graduates from the Full Time Web Development Bootcamp be reported separately from the flexible students?
At the very beginning, we think those Full Time grads will be lumped into this one report. If you look at the most current report as of June 30th, 243 students have joined, and that includes Flexible and Full Time. The nature of a Full Time program is that students graduate faster, and don’t have jobs while enrolled. So, inevitably those stats will diverge. When they do, we'll split them apart. There's no question about it.
What do you predict will be the major differences in outcomes between the Full Time and Flexible students?
How long it takes to graduate is a stat that will start to become different in a meaningful way. Two-thirds of students in the Flexible bootcamp have jobs, while none in the Full Time program do. We don't think the placement rate is going to change, but again, these are all just predictions. If the numbers diverge we’ll report on that.
If somebody “pauses” a Thinkful course, do you count that paused time in the duration of the course?
We don't because you're not spending time or money on the course.
This is actually a big difference from some other schools. There are schools that offer a job guarantee, and in order to be eligible for it, you cannot take a vacation for more than two days (a weekend). That means you may have to be sending out interview requests and cold emails to employers every day for six months. I don’t know what those students do if they catch the flu. At Thinkful, if you pause, the guarantee pauses until you come back.
Since you started reporting student outcomes, have you found that you needed to add or remove certain metrics?
High level, the report looks basically the same. We haven't had to add or remove any data, aside from moving to monthly updates.
We’d like to add ROI: when does Thinkful actually pay off? We get asked that all the time – especially since with other programs, students would have to change cities, quit their jobs, or find daycare.
Thinkful has had the outcomes results audited, right?
Right. We had the first results audited as of February 29th, and we'll do it again in 2017 for 2016. It's a huge process, and we don’t take it lightly.
There are some schools that say auditing is unimportant. How important do you find auditing to successful outcomes?
Basically, I couldn't disagree with that more. I think when we talk about outcomes data and guarantees, you're not actually talking about the web page that you publish with graphs on it. What you're talking about is the culture of looking at data around student success, carefulness, and consistency across your entire team.
We had to build a web page to publish student outcomes on our website, but it's much more about the process of how you got to those outcomes. So when someone says the audit doesn't matter at all, that's really not understanding the underlying reason for outcomes data in the first place, which is that students need to be able to trust the school they're going to. Trust comes from consistently having a great experience, and consistency in a company comes from having an aligned team that's not overly dependent on any one person, but knows how to communicate as a group. Reports, audits, and a data-driven culture are what make that trust possible.
If your company has an underlying culture for clear data, then the auditor is going to succeed, and they're going to find clear and transparent answers throughout your organization.
In Thinkful’s case, we honestly thought we were pretty good going into the audit, but we definitely got tighter afterward. The audit is one of many ways schools can show they’re serious about each student’s success – not just to students but to the entire company.
What do you think about other frameworks (like Hack Reactor's SSOM) or methodologies that schools are releasing for other schools to adopt?
Frameworks without data are bullshit because they're a discussion about a discussion. A school needs to be rigorous and clear and transparent about how it came up with the numbers it came up with. As a community of schools, we need to debate which approach is right, why they differ, and so on. Hack Reactor has done a good job in that. But I don’t think it’s useful to publish a model for thinking without also getting into the nitty-gritty of answering each and every little question. The devil is in the details.
The proof is going to be whether schools release verifiable outcomes data, and I believe we’ll see schools actually doing that this year. There will be differences between schools, which are important, but relative to where we were in 2015, it’s going to be night and day.
In my opinion, the frameworks released by each school are just not going to get adopted. If I'm right, then in January 2017, every credible school will have an outcomes report but each framework will have been adopted by only the school that created the framework. Students see past that kind of grandiose talk – they want the facts.
The next conversation we as an industry need is about standardization, which will slowly start to take shape next year. First, we need to get the data out there, then we need to have a long debate about what the data needs to show. I think it's inevitable that there will be a standards body, but it's not going to happen this year. The most important thing is that we publish outcomes now, and separate the wheat from the chaff in terms of schools.
Do you think that online schools like Thinkful should be considered in that conversation about outcomes standardization?
Of course – when the school provides support, online education works and the students perform as well or better. All the money offline schools pay in rent, we pay in providing support for students: 40 hours of Q&A sessions each week, daily workshops, 1-on-1 mentorship. Those things just aren’t possible except online. This is the opposite of what many students think about online education (we have MOOCs and video courses to thank for that impression). But when students see the support they get at Thinkful, they come around. Thinkful has educated more students than almost any other bootcamp, but it's not because we're online. It's because we've been around for a while and our courses appeal to a lot of people. The biggest difference is really around admissions policies.
When calculating student outcomes, how do you decide whom to exclude from the calculation?
If you look at the most recent report, 81% indicate that their goal is to get a job as a developer. That's the group that we're including as “job seeking.”
The students who don’t indicate that they want a job as a developer tend to be really unique, fun cases. For example, Tristan Walker enrolled in the Web Development Bootcamp, but didn’t do it to get a development job.
How do you deal with someone like that who's just unmotivated to find a job?
If a student graduates and is job seeking, but fails to find a job, then that's still a failure for Thinkful, and it shows up in our stats. We can't say, "That student didn't put their heart into it. Therefore, they weren't really seeking a job." No, no, no. We ask you up front, on day one, when you enroll if you're job seeking. As a student, you’re allowed to change that status, but once you graduate, you can’t change that status and neither can Thinkful.
Why should outcomes matter to students when they’re researching a coding bootcamp?
Education is one of the biggest investments you're going to make, and it's one of the most consequential. As a student, you have to understand the purchase and the investment you’re making, and the only way to do that is to build trust with the school you choose. You build that trust by seeing the statistical likelihood that you will fit in with the students who have been successful, looking at the actual data and deciding whether it reflects your goals and chances for success.
There's a whole host of reasons why students choose a school: does it teach the way I want to learn? Do I want to learn in a classroom or do I want to learn with one-on-one mentorship? Can I make it affordable? Can I fit it into my schedule? Will it disrupt my life? And then, of course, you have to ask, "Am I going to succeed?” Those are the questions that our Outcomes Report answers. It’s completely shocking to us that anyone has attended a school without knowing the answer to those questions. As an industry, we’ve been really lucky, but students should really dig into those outcomes.