Written By Liz Eggleston
In 2014, Flatiron School was the first coding bootcamp to release an Outcomes Report. While that report examined outcomes for their NYC programs, they’ve since launched their online campus and are starting to graduate developers around the world (remember our peek into the types of students who are succeeding online?). Today, we’re diving into Flatiron School’s most recent Outcomes Report, which focuses on their Online Web Developer Program grads. See how this school is placing 97% of their online students in developer jobs!
This is not Flatiron School’s first Outcomes Report – it’s actually the third! Why did Flatiron School feel that reporting student outcomes early on was an important responsibility?
By the end of 2014, there was a lot of attention on the bootcamps that were popping up. A lot of those schools were amazing, but some were clearly being less than genuine with the ways they claimed outcomes. The very obvious public example is Devschool, which claimed 100% job placement, but also said they don't consider you graduated until you get a job. That's a really shady marketing tactic.
We saw what was happening with for-profit universities, which started with great intentions (colleges were wasting tons of money and putting students into debt; for-profit colleges thought they could do a better job). For-profit universities started growing really fast without much regard for quality, and the bad players basically ended up defining the entire industry. Now when you say “for-profit university,” nobody remembers the names of the schools that were well-intentioned; all you think of is University of Phoenix.
What drove us to release audited reporting was that we saw what was happening with for-profit universities, and we wanted to preempt that. Our industry was growing super fast, and there was no way for students to judge between school outcomes.
We want to send the industry in a different direction than that of for-profit universities. So we released the first ever audited outcomes report for our NYC campus in 2014. We then created the NESTA standards with a group of other bootcamps and the White House so that students could actually see which schools are willing to stand behind their marketing claims.
After digging into your online students’ results in the 2016 Online Outcomes Report, was there anything different between tracking online and in-person students?
We invest a ton in career services and coaching, and so we work very closely with our students. When they get jobs, we have all of that data.
The average salary is lower ($67,607 for online vs $74,447 in NYC), but that is to be expected given geographic variation. Our in-person students are all attending Flatiron School in New York, but online students are spread out. In New York, the average salary for online grads is closer to our in-person average. In San Francisco, the average salary is what you might expect there. In North Dakota, it's lower.
One thing that was inspiring to me about the online Outcomes Report is seeing the average tuition paid by graduates. So many online bootcamps are just online versions of their in-person courses, but because we built Learn.co, our self-paced proprietary learning platform, our students are able to work full-time and learn on the nights and weekends. They paid an average $5,601 in tuition and these people were able to completely change their lives.
What numbers were you most surprised by, negatively or positively, when you actually looked at the success of the online students?
I was surprised that the salary was as high as it was. 42% of our students are in the Northeast, and only a fraction of those are in New York, but seeing a $67,000 average salary across the country and internationally is insane. I was blown away by the fact that demand for talent is still high enough that employers are willing to pay that much for junior developers across the country.
One stat we noticed is that 15% of your online students are women, while women make up 40% of your in-person classes. Why the difference there?
It's possible that when we launched the online program, we took our eye off the ball and didn't invest as heavily in trying to achieve the diversity we see on our NYC campus; but today, 40% of our online students are women.
Since we did that research, we've invested a lot more in creating more opportunities for women, including our Women Take Tech initiative, which we launched in partnership with Birchbox, and our Kode with Klossy scholarship with Karlie Kloss. In all, we’ve awarded over $300,000 in scholarships toward our online program for female students and collaborated on community events that take on issues facing women in tech. The response so far has been pretty inspiring and today those numbers are a lot more encouraging.
Arguably, the most important statistic here is the 97% job placement – can you take us through that conclusion and tell us how you got there?
The way to read this report is to first look at the 74 people who had graduated at the time of the report; then, of those people, 39 had finished a job search cycle (many of the graduates at the time we did the report were only a few weeks out from graduation). Of the 39 people who finished a job search cycle, only one person hadn’t yet gotten a job – 38 out of 39, or roughly 97%.
What does it mean to have completed a “Job Search Cycle”?
That’s simply defined as reaching six months after the beginning of your job search, or until you’ve accepted a job offer. And by the way, if you reach six months, have followed our job search standards, and haven’t gotten a job offer, we give you a full refund. But so far, everyone who has committed themselves fully to that process has been hired. The one graduate who hadn’t been hired had not, for a long period of time, been actively job seeking according to the standards our grads agree to follow.
Different schools decide to collect information from students in different ways (scanning LinkedIn for job titles, collecting job offer letters, etc). How does Flatiron School document a student’s job outcomes? Is that audited?
This is actually a huge differentiator and deserves to be discussed. We carefully collect comprehensive job data from our students, then we send our auditors a spreadsheet that has all the pertinent data: students’ names, contact information, their companies, titles and starting salaries, hire dates, and graduation dates – often with offer letters or contracts attached to further verify the information.
We also send the auditors our financials, and then the auditors call 30% of the students at random. They require a 100% response rate; otherwise, they won't certify the report. They also require 100% accuracy, meaning if anything is off by $1, they won't certify it. Think about that versus a report that only relies on a survey of students. There is a difference between a Review and an Examination by auditors, and that’s something students should be aware of when they’re looking at the different outcomes reports being produced.
In-person schools are regulated by regulatory agencies like the BPPE in California. Is there a similar agency for online schools?
Even though we work with regulatory agencies for the in-person program, there are still no outcomes reporting standards from those state agencies. The reality is that online education is moving faster than regulation can catch up with. So there aren't really strict standards and regulations. We let our lawyers make sure we're doing everything we need to do.
For that reason, do you think it’s even more important to release an audited outcomes report for Flatiron’s online program?
Maybe. The larger the industry gets, the bigger the responsibility. Flatiron School was also the first to offer an online money-back guarantee. We spent six months really analyzing our program and designing commitments for both our students and our career services team, and we worked hard to get to a place where we were comfortable offering a guarantee. And literally within two months, there were five schools copying it. That's scary to me because there's no way they had time to prepare the way we did in order to actually stand behind the guarantee.
The online job report was fueled by the same idea: "We need to establish a standard by which students can actually evaluate the program."
Is there any chance that Flatiron School will work with other schools on common reporting methodologies like CIRR, etc.?
We're always open to it. Since those standards were designed, the conversations I tend to hear about new methodologies are always about less strict standards. And you have to ask yourself why. The NESTA standards are pretty easy questions: how many students enrolled, how many graduated, how many accepted a job, the average salary of those who accepted a job, etc. The questions are really, really simple. I’d love to see more schools get on board with those to start. We've also had some conversations with schools that want to make that standard even higher, which is interesting.
Ultimately, until there are some unified standards or strict regulations, I advise all students to be skeptical. This is a big investment and they should take the time to do the research and ask all schools the tough questions.
Read more Flatiron School reviews, check out the Flatiron School website, and explore Flatiron’s full Student Outcomes for yourself!
Liz Eggleston is co-founder of Course Report, the most complete resource for students choosing a coding bootcamp.