
While a number of coding bootcamps have published their own student outcomes, Reactor Core recently released the Standard Student Outcome Methodology (SSOM), a methodology for calculating job placement outcomes that they're encouraging other bootcamps to adopt. As the coding bootcamp industry grows, transparency in marketing practices becomes integral to its success, so we spoke to Reactor Core CCO Shawn Drost about SSOM, the (un)importance of auditing outcomes, and how Reactor Core calculated Hack Reactor's 98% hiring rate and $104K average starting salary.

Shawn, what’s your role at Reactor Core?

I'm a Co-Founder of Hack Reactor, the San Francisco bootcamp. In 2015, we also started Reactor Core, which is a network of coding bootcamps made up of Hack Reactor, MakerSquare, Telegraph Academy and others. Now my official title is Chief Commercial Officer of Reactor Core.

As part of Reactor Core’s efforts, you’ve now released the Standard Student Outcome Methodology (SSOM). Tell us what SSOM is about.

SSOM is a document that helps coding bootcamps calculate a placement rate. If a school is focused on helping people get jobs as developers, then it's very important to communicate that success to the public. Hack Reactor and our other Reactor Core schools have all communicated placement rates from Day One, but over time, it became important to formally document what that means and how we calculate that number.

So what does go into the Placement Rate? Isn’t it simply the number of graduates who get jobs as developers?

People think that number is really straightforward, but it turns out there are a lot of edge cases. We have to think about who counts in the numerator and in the denominator, how to deal with international students, how to count graduates whom the school employs afterward, etc. How do you deal with somebody who gets a short-term, part-time contract?

People can disagree about how all of these things work, and it's really important to be transparent. In its infancy, the bootcamp industry has not really done that. And to be fair, neither have law schools, which are now in their adulthood.
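To make those edge-case decisions concrete, here is a minimal, hypothetical sketch of a placement-rate calculation. The field names, defaults, and rules below are invented for illustration; they are not taken from the actual SSOM document.

```python
# Hypothetical sketch of the edge-case decisions a placement-rate
# methodology has to make explicit. All fields and default rules are
# illustrative assumptions, not the real SSOM rules.
from dataclasses import dataclass

@dataclass
class Graduate:
    seeking_job: bool          # opted into the job search (denominator)?
    hired_as_engineer: bool    # job title contains "engineer"?
    hired_by_school: bool      # employed by the bootcamp itself?
    short_term_contract: bool  # short-term, part-time contract?

def placement_rate(grads, count_school_hires=False, count_short_contracts=False):
    """Return hired/seeking as a fraction, with each edge-case rule explicit."""
    denominator = [g for g in grads if g.seeking_job]
    numerator = [
        g for g in denominator
        if g.hired_as_engineer
        and (count_school_hires or not g.hired_by_school)
        and (count_short_contracts or not g.short_term_contract)
    ]
    return len(numerator) / len(denominator) if denominator else 0.0

cohort = [
    Graduate(True, True, False, False),   # counts in numerator and denominator
    Graduate(True, True, True, False),    # school hire: excluded by default
    Graduate(True, False, False, False),  # not hired as an engineer
    Graduate(False, False, False, False), # not job-seeking: out of denominator
]
print(placement_rate(cohort))  # 1 of 3 job-seekers counts under these defaults
```

The point of the sketch is that every toggle (school hires, short-term contracts, who belongs in the denominator) is a policy choice a school can disagree about, which is exactly why a written methodology matters.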

You've been running a coding bootcamp for three years. What are some of the challenges you faced as a bootcamp founder in reporting outcomes?

It's not especially difficult for the coding bootcamp industry to keep track of student outcomes, and this conversation is broader than the bootcamp industry. It turns out the entire higher ed system is bad at keeping track of student outcomes. When Reactor Core started building SSOM, the first thing we did was look at the existing standards in place at law schools, and those standards turned out to be very haphazard.

Did you look at College Scorecard when designing SSOM?

We did -- College Scorecard has a great approach that we can't implement on our own.  The government basically pulls the tax records of students, so they have great salary data for all alumni.  This is an approach we've recommended to the state of California, and they're looking at it.

So should there be different outcomes rubrics for different types of schools?

About 90% of coding bootcamps are organized on the promise of "you want a job as a developer. We'll get you there." And any school that is set up like that should follow the same basic principles that power SSOM. It's very strict in that it has a narrow view of what counts as success -- it only counts jobs with "engineer" in the title, and does not count edge cases like "what if I went back to my old job and did some coding," or "what if I got a job like 'product manager' that's tangentially related to coding."

But SSOM is not appropriate for a school that consciously decides it has different values. If Startup Institute, for instance, decides that they value a diversity of outcomes and they’re explicitly clear about that to students throughout the admissions process, then I think there's room for a different type of outcomes rubric.

How many schools in the US qualify to use SSOM to report outcomes?

Most of them. Any coding bootcamp that predominantly markets itself as "we can move you into software jobs" should take it as a principle that coding jobs count as success, and jobs in other fields do not.

When designing SSOM, how important did you find auditing to the validity of reported outcomes?

Zero percent. And we got our own Hack Reactor report audited through Frank, Rimerman + Co. LLP; they did great work.

So you'd say transparency is more important than auditing?

I think that it's important to build consumer trust through as many mechanisms as possible. I think auditing is likely to catch only the very worst cases of outright fraud, and I don't think that audits are a very effective system for ensuring that students are being served well.

The problem is that there's an inherent conflict of interest when you have a bootcamp paying a supposedly independent third party to hold that bootcamp accountable.

How about coding bootcamps that are “accredited” or “approved” by a government body (ie. the BPPE in California)? Does that mean anything about their outcomes?

The State would be the first to tell you that they don't have any means of tracking student outcomes in a rigorous way. We are in close touch with the California government, and when they asked for employment data and we started digging a little deeper, we found that the government knows this is a complicated process. We had to write SSOM from scratch because no such document already existed.

They don't have the funding or the operational bandwidth to assess the validity of a school's self-reported employment rate, so the state doesn't validate it. They send out a data request and then publish what the schools send back. But look through that data: there are typos and numbers that don't add up when you page through the government's list of vocational programs in California.

The government doesn’t have an accreditation process; they have a “permission to operate” process, where they determine whether or not you have the facilities and curriculum to run a coding bootcamp. Everyone I’ve spoken with is a dedicated public servant who really wants to do a good job, and I think they would also be the first to say that they have no idea whether or not a coding bootcamp curriculum is “good.” They’re looking at whether the school has the financial wherewithal to issue refunds, if the school has a fax machine, a library, etc.

So SSOM has been used to actually calculate and publish Hack Reactor and Hack Reactor Remote’s outcomes in the form of a Cohort Report. When will we see Cohort Reports for MakerSquare, Operation Spark, Telegraph Academy, etc?

We are on track for that. Keep an eye out for announcements.

How long does it take a school to use SSOM in order to actually create the Cohort Report?

If your dataset is really neat and the number of students is pretty small, it could take as little as 5-10 hours.

How heavy is the burden of documentation on the school?  

It's pretty substantial, because the documentation part of this assessment process is probably the hardest.

There are two phases for schools that adopt SSOM. The first report is published in "Onboarding Mode," which basically means that the school is collecting all of the hard data (anything that a third party created or signed -- for example, offer letters).

“Compliance Mode” is when the school is sending out confirmation surveys to students.

What is your advice to a new coding bootcamp who values outcomes and wants to start the process of reporting their student outcomes?  

We designed SSOM specifically around the fact that a school will not necessarily have all the data required by a methodology that Reactor Core just wrote in the last year. Any school can start using SSOM today, and there's a grace period to get your operations in order. Reactor Core also has a standing offer on our website to help onboard any school with SSOM at cost, and that includes whatever kind of support we can offer in terms of assembling the actual report.

Have any schools taken you up on that offer?

We don't have any announcement about that yet.

Skills Fund is attempting to get schools to come to consensus on an outcomes methodology. From those early meetings, do you think that SSOM could potentially just be adopted by Skills Fund or by all schools?

What we want is for the industry to have good standards and for consumers to have access to apples-to-apples comparisons. The goal of the Skills Fund process is not to produce a document like SSOM -- it is to produce a methodology minus the documentation standards.

SSOM is both a methodology for calculation and a set of documentation standards. I think it's correct for Skills Fund to first accomplish an easier set of standards given that they're trying to bring more coding bootcamps to the table.

Then would Hack Reactor theoretically also adopt that methodology if other schools agree on it?

Yeah. I'll say that if it is too flimsy for us to stand on, then we will try to kill it in committee. So far, though, the results are promising. Skills Fund isn't the first or second methodology we've tried to get adopted. I would love for Skills Fund to be successful, and if it is, we will switch over.

Could an online school use SSOM?

Totally. An online school does use SSOM -- Hack Reactor Remote!

Does it worry you that schools like DevMountain, NYCDA, and Hackbright Academy are now being acquired by for-profit education companies?

It is definitely two different worlds colliding. I don't really know yet. For-profit education companies certainly have a bad reputation, but I have a more nuanced view of that than others. And I think that generally, the safest person to trust with a responsibility is the person who just destroyed that thing. I'm at least curious to see what happens with the outcomes for those schools.

Why should students be concerned with seeing outcomes at coding bootcamps?

The first thing that a student should know is that we're in the early days of an industry that luckily is taking outcomes seriously. We care in a way that you don’t see even for other career-track programs like law schools. That’s lucky for students!

The bad news is that when it comes to bootcamp employment statistics, there's no apples-to-apples comparison right now, and different schools are in varying stages of taking their responsibilities seriously. Students should look at how each school calculates its placement rates, and continue to speak to alumni while making your decision. I encourage students to put pressure on schools that are not explaining how they calculated their placement rates. Tell them that it matters to you.

If I were a student right now, I would almost abandon the attempt to get an apples-to-apples comparison and instead just look at how rigorously and specifically schools can answer questions, and whether or not there is a document that explains it. That's a pretty meaningful signal.

To learn more about the Reactor Core network of schools, read Hack Reactor reviews, MakerSquare reviews, and Telegraph Academy reviews on Course Report. To check out Hack Reactor’s Student Outcomes, here is their 2015 report.

About The Author


Liz is the co-founder of Course Report, the most complete resource for students considering a coding bootcamp. She loves breakfast tacos and spending time getting to know bootcamp alumni and founders all over the world. Check out Liz & Course Report on Twitter, Quora, and YouTube.
