In this series, we’re spotlighting coding bootcamps that have released Outcomes Reports. This week, we talk with Juha Mikkola, co-founder of Florida coding bootcamp Wyncode. Not only has Wyncode recently released a verified Outcomes Report, but they’ve also built an app (you’ll need to enter your email to use it) to help students navigate the data. Read our Q&A for detailed answers about how Wyncode calculates their 84% job placement rate (within 120 days of graduation), how their process differs from other schools, and why they’re calling for transparency in the bootcamp industry.
Tell us about your role at Wyncode.
Johanna and I are co-founders of Wyncode (and a husband/wife duo). Johanna works closely with our team to make sure that we’re executing well, and that students have everything they need. From a product side, she also makes sure that we’re continually iterating and innovating.
I’m more involved in the admissions and hiring sides. I oversee the process of finding and identifying great students, and then I oversee our Wynterviews, and bring hiring companies into Wyncode. Students get to know me the best after graduation, because I’m in close touch with our hiring partners.
Wyncode has always been outcomes-focused, but you’re now about to release your first Outcomes Report. Why now?
Outcomes have always been important to us. Since we first started in May 2014, we’ve been releasing our placement numbers periodically.
Also, as we’ve seen the industry grow, this effort has become more important. There are very few schools making this effort, and we’ve always considered Wyncode to be quality-focused above all, so this is a way we can show students that they can trust and believe in our outcomes.
Wyncode was a founding member of NESTA. What happened there?
Earlier in 2015, as a founding member of NESTA, we wrote a letter to the President of the United States saying that we would make a commitment to demonstrate that we’re running Wyncode right. It was really cool to be a part of that process, but we realized that the process didn’t move quickly enough and eventually fell through. We hoped to report outcomes in aggregate as an industry, and that didn’t end up happening. But we didn’t want to move forward as an individual school too quickly, because we were still hopeful that it would happen as a group.
We have to give credit to the White House for igniting the process again. An article came out that questioned why schools weren’t fulfilling the commitment we made. Then we got an email from the White House asking about the status of the Outcomes Report. When the White House asks you for something, it’s a huge honor, but it also means we need to put the process in hyperdrive. It was a no-brainer for us to invest time, money, and resources into making this happen.
What is the methodology that went into creating Wyncode’s outcomes report?
We built it around the NESTA commitment letter, so the report fulfills those topics: completion rates, job placement, and average tuition. It’s a multi-step process. First, we already had data for our graduates, which we update as our students get jobs. We do WynWork meetings with students twice a week until they get a job, so when someone gets a job, we know about it immediately. Second, we’ve done verification campaigns, where we reach out to students via email, phone, and text message with the information we have on file for them and ask them to correct or approve it. Third, once the auditors (an accounting firm) took over, they selected a significant sample size, emailed and mailed a formal letter to the students, and checked the data that we already had.
From this information, we calculated an 84% job placement rate in technical roles within 120 days. A technical role is defined as working as part of, or directly with, an engineering team.
What did that response rate end up being?
About 10% of graduates didn’t respond with complete information. In cases where data was ambiguous, we were very conservative. For example, if we have someone’s employer but don’t have a confirmed hire date, we counted them as employed “after 6 months.” We wanted to be safe and make sure we didn’t overstate anything. We also asked for salary ranges, and took the midpoint of the range (instead of the top value). Those are little things that I think were the right thing to do, and they ensured we weren’t overstating our outcomes.
Overall we had 194 enrolled students in the survey period (May 1, 2014, to December 31, 2015). Of those students, 188 graduated, and 168 were considered job seeking.
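The arithmetic behind these figures can be sketched as follows. The enrolled, graduated, and job-seeking counts come from the interview; the placed count here is a hypothetical back-solved from the reported 84% figure, not an official number.

```python
# Illustrative sketch of how rates like Wyncode's might be computed.
# Counts for enrolled/graduated/job-seeking are from the interview;
# placed_within_120 is a hypothetical value (~84% of 168), not reported data.

enrolled = 194
graduated = 188
job_seeking = 168        # graduates counted as seeking a technical role
placed_within_120 = 141  # assumption: round(0.84 * 168)

completion_rate = graduated / enrolled
placement_rate = placed_within_120 / job_seeking

print(f"Completion rate: {completion_rate:.0%}")           # ~97%
print(f"Placement rate (120 days): {placement_rate:.0%}")  # ~84%
```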
You’ve worked with the Commission of Independent Education within the Florida Department of Education in the past. Do they ask about or verify outcomes information? Is that good enough?
We’re licensed (not accredited) by the Commission of Independent Education, and they do actually verify outcomes. The information we report to them is already available (or may become available in the future) publicly. However, the data they ask for is very top-level, and it isn’t specific to the bootcamp space. I wouldn’t say that being licensed by a regulatory agency means that your outcomes data is better. But licensing is important because there’s an oversight body that ensures things like the people you’re hiring being suitable to be teachers, an official refund policy, a way to complain to state authorities, and so on. A lot of schools see that process as very burdensome, but for us, it’s really helped us define our processes. We have a publicly available 33-page catalog that explains every process we have in place. Students are able to download that, and they may not read everything, but they do understand their rights if they want to leave after three weeks or if they get into a disagreement with someone at the school.
Some methodologies, like ReactorCore’s SSOM, consider documentation (i.e., collecting offer letters) crucial. How important is documentation in this process?
We put more emphasis on the student’s response. When contacted by the accounting firm, the student signs a letter that verifies that all information is correct. We haven’t had a single case of students lying about placement, but we do have a verification process (which we built in January 2016) which is based on Title IV, which covers US Federal Financial Aid. Now when the student gets hired, the employer also gets a verification letter to sign.
Unlike Hack Reactor, we don’t count job offers as “placement.” We only count students as “placed” when they start a job.
When you did the official audit, were the results what you expected? Did anything stand out?
We knew where we stood even before the verified report. The difference is that we had to define the time period that we’re looking at, so the numbers we see are for a certain time period (2014-2015 combined). The bottom line numbers for us are 84% job placement within 120 days, and 97% placement overall.
What we’ve noticed in our market is that starting salaries are affected by cost of living. Our average starting salary is $46,216, which is lower than a school in San Francisco or NYC. But when you use the Cost of Living Calculator, that $46,216 is equal to $93,500 in New York and $72,500 in San Francisco.
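The conversion described above can be sketched with a simple cost-of-living multiplier. The index values here are back-derived from the figures in the interview ($46,216 in Miami roughly matching $93,500 in New York and $72,500 in San Francisco); a real comparison would use a published cost-of-living index.

```python
# Hedged sketch of the cost-of-living adjustment described above.
# Index factors are assumptions back-derived from the interview's figures,
# not values from an official calculator.

miami_salary = 46_216

col_index = {  # cost of living relative to Miami (assumed values)
    "Miami": 1.00,
    "New York": 93_500 / 46_216,       # ~2.02x
    "San Francisco": 72_500 / 46_216,  # ~1.57x
}

for city, factor in col_index.items():
    equivalent = miami_salary * factor
    print(f"{city}: ${equivalent:,.0f}")
```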
Are there specific challenges as a coding bootcamp founder that you’ve faced in this process?
I wouldn’t say that this process is easy, but I think the main challenge is that other industries or educators are not doing this. I don’t see universities, colleges, or other vocational programs putting time and effort into publishing this data. So, I don’t think it’s a problem specific to bootcamps. I do think it’s an opportunity for our bootcamp industry. We’re calling on the rest of the industry to follow suit.
In addition to the Outcomes Report, Wyncode is releasing an app. Tell us why!
We tried to make the app more interesting than an explanation of our data. We built an app that allows you to navigate data by gender, ethnic group, and pre-Wyncode education level. Future students can get a great idea of the types of outcomes students achieve based on their background.
We’ve also incorporated different frameworks into the app: our own Wyncode framework, Flatiron School’s framework, ReactorCore SSOM, and General Assembly’s Measuring What Matters framework. Each of these frameworks makes different assumptions, so students can see how our data performs under each framework.
When you look at data under different frameworks, does it have a significant effect on Wyncode’s data?
There are definitely differences. One is that we’re really proud of the number of entrepreneurs at Wyncode. We’ve always encouraged entrepreneurs and helped them build their companies (which often go on to hire other Wyncoders)! Some frameworks don’t count entrepreneurs because that’s less of a focus at their programs, and under those frameworks, it has a negative effect on our placement rate.
Some frameworks also don’t count graduates who are hired into full-time teaching roles in their own schools. I totally understand that when you’re a big school with dozens of instructors, that may make a difference. At Wyncode, we’re a full-time team of 14 people, so to get a job teaching at Wyncode is probably more competitive than some of our hiring partners. We count full-time instructors (obviously we don’t count part-time TAs).
On the other hand, some schools like Hack Reactor count offers rather than jobs; we believe that if you get an offer but don’t take the job, there are reasons for that decision.
Do you have plans to ascribe to one methodology, like Skills Fund is proposing?
We’ve been in talks with Skills Fund, but we were unable to make it to the last in-person meeting in Austin. Once their framework gets finalized, we want to add it to our placement app so that users can see how our numbers fare under that framework. It’s something we want to support, and I think agreement on one methodology is important, but my advice to students is not to get too caught up in percentage points. Instead, look at what is being reported and how.
Outcomes are important to the industry’s survival as a whole. But why do you think individual students should be concerned with outcomes when they’re researching coding bootcamps?
I think that bootcamps are really cool because there is such a specific link between the skills that you learn and the outcomes that you see after a short, focused time period. The majority of bootcamp students are looking for that specific job, so your research should be focused on schools that can offer that outcome. Also, you have to research past students, the companies they work for, and the type of network your school has in the local community.
What’s one thing you’re excited about adding to the Wyncode Outcomes Report?
We’ll add 2016 stats as soon as they’re available, and at some point we may even consider making it real-time after every cohort.