While General Assembly has been training developers and designers since 2012, they released their first student Outcomes Report in 2016, which explores who was admitted to General Assembly's Web Development Immersive and User Experience Design Immersive, and who got jobs after working with GA's Career Services program. We talk to Liz Simon, General Counsel for General Assembly, about why it's important for students to consider outcomes as they research coding bootcamps, how they calculated their 99% job placement rate, and what they're working to add to future reports.
How were you involved in General Assembly’s first Outcomes Report?
My primary role is General Counsel for the company, and I also oversee our work with policymakers and government on regulatory issues. Given that work, my role was to help ensure our report took into account reporting requirements from a regulatory and consumer protection standpoint. I worked closely with our operational staff who are actually on the front lines helping our students with the job search process.
In April 2016, General Assembly released an open source framework for reporting student outcomes called Measuring What Matters. Was the intention for other coding bootcamps to use the same framework?
We released the open source framework because we wanted feedback on it. We wanted other schools and players to take pieces of it, give us criticism, feedback, take parts that worked for them, discard parts that didn't. We were very open about encouraging other schools to use it, while recognizing that there are other schools who are creating their own standards in parallel.
It was our hope that the framework could be a useful tool not only for other schools to pick up and run with, but also to explain to our students and other stakeholders where we're headed.
Did your team make any huge changes to the framework between April and November based on that feedback?
Not really. A lot of the feedback we received was about metrics not included in the framework. For example, people asked why we chose not to include certain metrics like salary or how we would approach ROI calculations in the future. So we had a lot of conversation around how we might add metrics in future reporting.
The Web Development and UX/UI Immersives have always been outcomes-oriented, right? Why start reporting student outcomes now in November 2016?
Quantifying a real return on investment for our students has always been a part of the mission and vision at General Assembly. And we have long wanted to create some mechanism that would allow our students and other stakeholders to have greater transparency into our outcomes.
Our framework and this report are the culmination of many conversations with researchers, policy experts, and others in our industry. While we didn't release the full report until 2016, we started this process in the fall of 2015 and it's something that we've been thinking about for a long time. We’re partly motivated by what we see as General Assembly’s obligation, as a leader in the industry, to act on reporting outcomes, even in the absence of broader collective action.
Reporting outcomes is a commitment that we've always wanted to make to our students and this was a good time to do it. It took us a long time to get to this point because of the way that we approached the process.
Is there something unique about coding bootcamps that makes the process of reporting student outcomes difficult?
We worked with two different Big 4 accounting firms – one to help us develop the framework that we released earlier this year and the other to actually conduct the review of our data – and from that process, we learned that the level of uniformity and precision required to report verifiable data at scale isn't necessarily built into a business like ours. I think that is the case for most higher education programs, not just coding bootcamps. The questions it raised ranged from how you operationally collect data, to how you get students to respond to surveys, to what third-party data (e.g., salary) you're able to draw from, if any.
For us, we felt that this was a positive for our business. Organizing and sorting out our data shone a light on some things we did well, and some areas for improvement. As we've learned, surveying 100 students is different than 1,000 (and different from 10,000), and all of this gets exponentially more complex as we scale. This is true not only for the number of students, but also in having 15+ campuses. When auditors test our data, they're examining campuses across the world—not just New York, but also London, Austin, and Seattle.
What are the most important metrics in this Outcomes Report?
We chose to initially focus most heavily on graduation rate and job placement, because those are the most important metrics to our students, and because those are the two most objective data points.
If a student is looking at GA, their number one concern is, “will I be able to get a job?” If that's their goal, are they going to be able to meet that goal? That's the foundation of the ROI for students. There are obviously lots of data points that go into that, so there's certainly lots of other data in the report, but those are the metrics that were verified by KPMG.
Why leave out salary in this version of the report? Does General Assembly plan to add salary in the future?
We absolutely plan to add salary. It's one of the things we most wanted to include. However, in this report we were looking in the past, reporting back to 2014 and 2015. While we had some salary data for students, the process by which it was collected and verified did not meet the rigorous standards that we were applying to the other metrics in the report, so we decided to hold off and draw much stronger conclusions about salary in the next phase. I certainly anticipate including salary in future reports.
One very fair question that students should ask about the salary data in other schools’ outcomes reports is, "What is the sample size?" For General Assembly, we’re not dealing with a population of 100-200 students. This effort illuminated where we can update our data collection processes so we will be able to report salary going forward.
What is your process when collecting data from students? Are you collecting offer letters or tax returns? Is it through a Google survey?
We collect data primarily via survey, so it is self-reported by our graduates. In the cases where we do not have survey data, we make a digital record of our communication (e.g., email, notes from a phone conversation) with graduates.
The accounting firm that helped us develop our framework gave us a lot of recommendations, from a controls and governance perspective, on implementing a stronger plan for collecting documentation. We've got a long checklist of future plans, and documentation and salary are high on that list.
What does a “full-time job in the field of study” mean? Like if someone got a full-time job as a product manager after the WDI, would they count as having a job “in-field”?
The technical answer is that “in-field” means an occupation for which students are trained, or a related, comparable, recognized occupation. This is one area where we could provide further clarification, but in your example, someone working as a product manager after WDI could be counted as full-time employment in our metrics. The new role has to be one in which they are utilizing their new skills, and/or a role they achieved as a result of being in the program. We have never limited this to specific titles since titles can vary so much.
What does “job placement” mean in the report? It looks like, if a student receives three job offers, they are still counted as “placed.”
I think that reasonable people can disagree on that, but we believe that this is about expectation matching between the student and the school. You do your part, we do our part. Students aren’t forced to take a job that they don't want to take – that's certainly not our goal. But while career services are always here to act as a resource, they can't dedicate endless time to any one student, because we’re operating at scale. So we had to draw the line somewhere. However, the situation you described is an edge case (less than 1% of our outcomes). It hardly ever comes up.
Another piece of feedback we received, which we will act on, was to break down all of the different categories that we consider full-time employment in future reports.
The most important statistic in this report is arguably the 99% placement rate. Can you walk us through how you got to that placement rate?
Absolutely. We start with the total population of students who enrolled: 2,080. Of those, 147 withdraw and another 30 make it to the end of the course but don’t graduate, leaving a pool of 1,903 graduates. Then, at the end of the program, our graduates choose whether or not to participate in career services.
Of the 1,455 graduates who participated in career services, 1,440 got jobs within 180 days of graduating. That’s 99% of job-seeking graduates who got a job.
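Using the figures quoted above, the funnel behind the 99% number can be sketched in a few lines. (The numbers come straight from the interview; the variable names are illustrative, not GA's actual reporting code.)

```python
# Placement funnel using the figures quoted in the interview (illustrative sketch).
enrolled = 2080        # total students enrolled
withdrew = 147         # withdrew before the end of the course
did_not_graduate = 30  # finished the course but did not graduate

graduates = enrolled - withdrew - did_not_graduate  # pool of graduates

participated = 1455    # graduates who opted into career services
placed = 1440          # of those, placed within 180 days of graduating

placement_rate = placed / participated
print(f"Graduates: {graduates}")                 # 1903
print(f"Placement rate: {placement_rate:.1%}")   # 99.0%
```

Note that the denominator is job-seeking graduates who participated in career services (1,455), not all enrollees (2,080) — which is exactly the distinction the report's funnel is meant to make transparent.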
Why might someone not participate in career services?
There are many reasons why students might not want to participate in career services. Some of them are edge-case reasons, but we felt that shining light on all of them was important to broader transparency. There are a lot of different things that come up in people's lives and reasons why they take a different path. But as you can see, the vast majority of graduates do participate in career services because that's what they're coming to General Assembly for.
Is General Assembly now licensed by the BPPE in California?
So does this Outcomes Report also satisfy the BPPE’s reporting requirements?
This report is different. As a licensed school, we have to report data in a specific way on performance fact sheets for the state of California. We make that available to all California students when they enroll.
However, the categories which the BPPE gives us are not very nuanced. We believe it's helpful for students to get a more complete picture of student intent and likely outcomes, which is why our report is more detailed than what the BPPE requires.
Additionally, we have campuses in 16 other cities, each with different reporting requirements. It didn't make sense to choose one state’s reporting standards. Instead, we decided to create a global standard on top of the required reporting in each state.
What's the biggest difference that you see between GA's Outcomes Report and those of other schools who report outcomes (e.g., Turing, Hack Reactor, Flatiron School)?
First, I do think that these outcomes reports are a net positive for this sector, even if schools are reporting differently. That being said, I think the biggest difference is our scale. It's fantastic that other schools are reporting on outcomes – many did it before GA and will continue to do it. That's all good for the industry, but these reports do put into perspective the relative scale of each school, and the rigor required to report on verifiable data as you continue to serve more and more students.
There are also metrics that I think some schools care more about than others. For example, some schools deeply care about a very specific definition of a “software developer” role.
One thing that I don’t necessarily see in a lot of outcomes reports, which we think is important, is a discussion of admissions. Who are the students you're admitting? What are the admissions standards and process? That certainly impacts what the results are. And sample size, again, is critical when a school is reporting on salary or other important data.
Why should students themselves be concerned with outcomes from coding bootcamps?
One of the things that we've learned is that the vast majority of students who sign up for an immersive bootcamp intend to get a job. I think it's particularly important for students to understand the investment they're making and to check whether their own goals match the outcomes of our students. That goal may be getting a job in a certain field, or it may be something else.
And I think students should care about that. They should know that. It's also important for them to understand that we use this information in a lot of ways: it helps inform our admissions process and the career support that we provide to students during the course. The support we give to students in career services after they graduate is all part of an ecosystem of information flow. So I think it's most important for students to be able to identify with their own circumstances and place them in context.
That's why we went so granular: even with those edge cases, you want students to be able to connect with them. While full-time employment in the field is an obvious goal, when you're dealing with thousands of students there are many different reasons why they participate in these programs, so understanding the nuance matters to ensure we can serve students well.
Is there anything important for students to know about the Outcomes Report that we didn’t cover?
The big thing is that we're still working on it. We're gathering more longitudinal data that will be reflected in future reports. You can expect to see metrics like salary growth, success of part-time students, etc., in the future, and we're going to keep iterating on our framework. This is a starting place, and we're excited to see it evolve over time.
To learn more, check out General Assembly reviews on Course Report, or read the full Outcomes Report on their website.
Liz Eggleston is co-founder of Course Report, the most complete resource for students choosing a coding bootcamp.