Turing School Job Placement Outcomes Spotlight

In this series, we’re spotlighting coding bootcamps that have released Outcomes Reports. This week, we talk with Turing School, a Denver coding bootcamp with 7-month immersives in Front-End Engineering and Back-End Engineering. Jorge Tellez from Turing chats candidly with us about views on job placement statistics, why they opened their raw data up to future students, and how they get 92.5% of Turing graduates employed as full-time software developers. Read the full Turing Outcomes Report here.

Q&A

Tell us about your role at Turing and how you've been involved in the push toward Turing’s Outcomes Report.

I'm the Director of Growth & Operations at Turing School, which means I'm in charge of all the business functions and growth opportunities at Turing. I’ve been at Turing since it was just Jeff and me in a basement; now we’ll be reaching 180 students by the end of the year between our two immersive programs.

How has Turing changed over the last two years in terms of outcomes?

Turing is a non-profit organization, which means that we already report publicly on a lot of metrics in our public filings. We felt that, in order to tell the full story, we needed to tell everyone how we're doing on outcomes as well. There was also an internal pressure because we wanted to see whether we were fulfilling the promise that we were making to students.

For example, in the past, we had a 95% placement rate within 90 days of graduation. Because the industry has transformed and students are now taking a little bit longer to get jobs, it’s only right to tell everyone our real, updated numbers. A lot of coding bootcamps are pursuing this holy grail of 90% placement in 90 days at $90K starting salaries. We don’t think those are accurate placement statistics, and we wanted to force ourselves to be super honest. Our average starting salary is ~$75,000.

How do you think schools like Hack Reactor are getting $104,000 salaries?  

I think it has to do with their market. $75,000 in Denver is the equivalent of over $100K in San Francisco. Students can understand those multipliers if they have the data available.

Sure, and why is it important now for coding bootcamps to start releasing outcomes data?

We've always thought about transparency, especially once we were part of NESTA (the New Economy Skills Training Association, which has since disbanded). At that NESTA meeting, all of the schools signed a letter of intent to release their placement numbers. Within the first year, Flatiron School released a very comprehensive report which we liked, and we saw that we needed to support this effort and push the industry to become more transparent.

This is especially important right now, with for-profit colleges entering this space. A lot of students are putting in their savings, time, and sacrificing a lot for their families, to attend Turing, and if schools are not fulfilling this promise, then they shouldn't be in this industry.  

Being in Colorado and working with the Department of Private Occupational Schools, did you have to release outcomes data in order to be approved by that regulatory agency?

Yes and no. Regulatory bodies are still trying to figure out what to do with coding bootcamps. For example, the Department of Private Occupational Schools reviews our program, but at the same time, they review nail polishing programs and plumbing programs. They regulate a varied set of schools with different outcomes, but they ask the same questions. There is no standard that is particular to coding bootcamps, and I do think that will come at some point.

Is there something unique about coding bootcamps that is making this a difficult challenge right now?

There are some challenges in terms of collecting the data because it's like building the bridge as you walk on it. We want to make sure that the indicators of success make sense and that the data being reported by the students is accurate. One of the challenges, for example, is defining what constitutes a software developer. Job descriptions are all over the place: you see job titles for “Fullstack Developers,” “Software Engineers,” “Web Application Developers,” even “Ninjas.” Some data sanitization is necessary.

But I think that overall, if a coding bootcamp is doing what they say they’re doing, then it should be easy to report on your outcomes. The problem or the moral dilemma occurs when what you're saying is not backed by data and facts, and you're trying to hide it somehow.

I’m curious: what counts as a software developer for Turing? Do you count QA Engineers as a success, for example?

“Software Developer” has a very clear definition; it means that on a daily basis, your work involves programming and adding features to software.

There are other categories (which are great jobs) that are not software developers, such as technical marketers, etc. There’s nothing bad about those different outcomes. If you go to a coding bootcamp because you want to be a product manager or a growth hacker, that's perfectly valid. What I don't like is programs telling students that if they come to their bootcamp, they’re going to become a software developer, and then not getting graduates jobs as developers.

At Turing, we tell students that we’ll train them to become a professional software developer in seven months. So then our success should be measured against that promise.

In Turing's outcomes report, do you ever exclude students from your calculations, and if so what's the threshold for excluding students?

Actually, we don't exclude anyone. We base our outcomes on total students enrolled, which means people who start on the first day of school. A lot of schools say, “if someone doesn't want a job, or drops out in the first week, then we’ll exclude that person.” We base our report on the number of students in the classroom on Day 1.

We’re aiming for transparency, so we want readers to see the full picture and make their own assumptions about how Turing is doing.

But in Turing's outcomes report, you only count students who actually graduate, right?

Yes. For example, 136 students enrolled in 2015, and then 101 students actually graduated. We are very open with the reasons that people didn’t graduate. And we also make demographics data available about students who didn’t graduate.

We want to be even more transparent: we want to provide our demographic data in further detail. Right now, a lot of our outcomes numbers are really focused on graduates, but we’re really interested in expanding and showing that data for enrollees, etc.

How important is auditing in outcomes reporting? At first, Turing's report was not audited, right?

Yeah. Eventually, Skills Fund did an independent review of our report.

Do you think that having Skills Fund look at your outcomes data changes the validity of it?

I think it adds another layer of security to the data. I don't think it's the most important part of the process; ultimately the raw data should be available. We allow readers to access a CSV of our data, which means that 150 million people can check your data and call you out on it. That's even more powerful than having three guys looking over it. However, auditing is the ultimate test, because it means that a third-party saw your data and said it’s accurate.

Auditing also puts pressure on the school, knowing that a third party will look at the data.

Speaking of Skills Fund, do you see Turing adopting a full industry-wide outcomes methodology any time soon?

We participated in some of the Skills Fund meetings, and I think they're doing it for the right reasons, and we had a really good relationship with them. But there were organizations at the meetings that weren’t there for the right reasons, and they have loud voices. There are schools with a lot of interest in preventing their real data from coming out. When we saw that the conversation was being moved away from students’ best interests, which was something we’d agreed upon at the beginning of our meeting, we decided to step out of the process. We’ll let the group fill in some of the missing pieces and then decide if it’s useful and truthful for our students.

Since that meeting, we’ve been talking with a lot of bootcamps that are aligned with us in terms of putting students first. We're working on putting together another methodology and will hopefully announce it soon.

Why should potential students be concerned with job placement outcomes to begin with?

Bootcampers are making a huge investment. When you buy a house, you should know if the foundation is solid, if the price is correct, etc. If you're investing in the next 30 years of your professional career, I think you have the right to know what that investment will look like – salary, success, employment opportunities, and ROI.

The only way you can make those decisions is by looking at the raw data. I always tell students that if you don't see a school’s data, ask for it; and if they are not willing to give it to you, then run away as fast as possible. Because that means that either the school doesn’t have the data to support their claims, or that they don't care, or that they have the data but don't want to share it with you. Just have those conversations.

Ultimately, you should also talk to graduates of that program. You need to be able to contact them at random. At Turing, we have a graduate who works at Pinterest, and of course we can tell potential students to go talk to him! But you should also be able to reach out to grads on LinkedIn, at Meetups etc.

To learn more, read outcomes data and reviews of Turing on Course Report, or visit the Turing website here!

About The Author


Liz is the cofounder of Course Report, the most complete resource for students considering a coding bootcamp. She loves breakfast tacos and spending time getting to know bootcamp alumni and founders all over the world. Check out Liz & Course Report on Twitter, Quora, and YouTube.
