From Memberanda, Fall 2011
The Cate School, California
Cate’s Director of Admission, Charlotte Brownlee, and her talented team enjoy examining data and uncovering patterns that will aid them in their work. Recently, Cate conducted an Optimal Use Study (OUS), formerly called a “Validity Study,” to understand the relationship between SSAT scores and first-year academic success at Cate. (The OUS is a free data report service available to member schools.) Office Manager Lynn Dinning shouldered the task of gathering data by completing a spreadsheet for the last four freshman classes. The results were telling. For Cate, the SSAT Quantitative Score is by far the best indicator of first-year academic success, followed by the Reading Score, with the SSAT Verbal Score trailing far behind. To calculate a school-specific SSAT score, Cate staff simply applies the weights determined by the OUS to a student’s scores on each of the three sections.
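The weighted-composite idea can be sketched in a few lines. The weights below are hypothetical placeholders chosen only to mirror the ordering the OUS found at Cate (Quantitative strongest, then Reading, then Verbal); a school’s actual weights come from its own study.

```python
# Illustrative sketch of a school-specific weighted SSAT composite.
# These weights are HYPOTHETICAL -- real weights come from a school's
# own Optimal Use Study (OUS); only the ordering mirrors Cate's finding.
HYPOTHETICAL_WEIGHTS = {
    "quantitative": 0.55,  # strongest predictor in this sketch
    "reading": 0.30,
    "verbal": 0.15,
}

def composite_score(section_scores):
    """Weighted sum of a student's three SSAT section scores."""
    return sum(HYPOTHETICAL_WEIGHTS[section] * score
               for section, score in section_scores.items())

print(composite_score({"quantitative": 710, "reading": 680, "verbal": 650}))
# -> 692.0
```

The composite then serves as a single number for comparing applicants on the outcome the study actually measured: predicted first-year GPA.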
This new total score enables the admission staff to refine its use of SSAT scores, because it provides a comparison based on predicted first-year overall GPA at The Cate School. The OUS also provided Cate with data on how its freshmen compare with all SSAT test-takers at that age, offering the range of scores along with mean and quartile information. Additionally, when the parent of a prospective (or newly accepted) student wants to know how freshmen typically do at Cate, there is terrific data now on hand: 75% of Cate students have a GPA of 3.0 or higher at the end of their freshman year, and 25% have a GPA over 3.7.
Charlotte sees multiple opportunities to use these new data beyond the admission process, particularly as an early indicator for skills and support that can help all Cate freshmen. In addition, she is excited to start the next phase of analysis, where they will look at student performance throughout the four school years and evaluate how well the Cate curriculum advances them beyond the level indicated by their 8th grade SSAT scores.
Salisbury School, Connecticut
Salisbury’s longtime Director of Admission, Peter Gilbert, was confident that he knew the relationship of SSAT scores to student success – long believing that for his school the SSAT Quantitative Score was “king.” He was right. But while Salisbury’s Optimal Use Study (OUS) confirmed this, it also raised a few questions about the SSAT Verbal Score because its correlation—especially relative to the Quantitative Score—was very low.
Gilbert asked, “If I want to be efficient, should I just look at the Quantitative Score and disregard the Verbal Score completely?” While this would have been a logical assumption, he decided to test it. He took advantage of another free study offered by SSATB, the School Profile Report (SPR), which provided him with even more insight and information. The SPR examines students’ progress over time using multiple data points, including entering SSAT scores, GPA (entering and exiting), and SAT scores.
While the SSAT Verbal Score was not a strong predictor of first-year academic success at Salisbury, it was a very strong predictor of the SAT Critical Reading score. As Gilbert notes, “using that data allowed admissions to bridge/partner with the faculty. It gave us the ability to make informed decisions, but it also gave us a valuable evidence-based marketing tool. Boys who attend Salisbury for three or four years show success, and we have the data to prove it.”
Ransom Everglades School, Florida
Ransom Everglades School has been using the SSAT for over 30 years. As Director of Admission Amy Sayfie notes: “Since students apply from a multitude of schools, the report cards, grading scales and overall evaluation methodologies can vary tremendously from schools locally, domestically and abroad. Therefore, the SSAT allows us to have one consistent and valid evaluation tool, which we can utilize to assess all applicants and to predict their potential academic success at our school.”
Ransom has also sought to employ SSAT data in a study of college placement. Steve Frappier, Ransom’s Head of College Counseling, who has experience in university-level admissions, took a particular interest in comparing test scores, transcripts, and student success patterns across schools. He says, “Since our primary entry point is sixth grade, I wondered if our in-house data showed any trends that could shed light on ‘who performs best’ in the eight years between applying and graduating.” To do so, Steve isolated the attributes of feeder school, SSAT score, gender, eventual GPA, eventual standardized testing, and (believe it or not) college matriculation.
He found that, while outliers exist, the SSAT has been helpful and reliable in predicting eventual patterns in GPA, standardized testing, and other outcomes at Ransom Everglades. With inquiries and applications continuing to increase each year and selection becoming ever more challenging, the admission office counts on the availability of data to support its final assessments. Having tools that demonstrate how well the school can predict a student’s success (including GPA, SAT, ACT, and college matriculation) has been quite beneficial for delivering clear information to prospective students and parents. Amy concludes that “as admission professionals, it is imperative for us to be able to guide not only the students we will accept, but also those who may not be a right fit for our school. The SSAT helps us tremendously with our message points.”
Phillips Exeter Academy, New Hampshire
PEA’s Dean of Admission Michael Gary is an avid user of SSATB’s data/research services. Gary says, “Having people at SSATB ready to provide these services can’t be talked about enough. I can’t imagine myself not using these resources.” While working at Peddie School in New Jersey, Gary discovered that the best predictor of completed applications – even more than inquiries or visits – was receipt of SSAT scores. As Gary told attendees at a TABS Lab workshop this summer, “If students take the test, they are serious shoppers.”
After conducting an OUS and determining the relationship of SSAT scores to first-year academic success, the faculty challenged Gary—in the most positive sense of the word—to dig deeper: What about prediction beyond the first year? What about performance at graduation? How do you know that the rating system used to evaluate applicants is working? In response to faculty queries, PEA conducted a School Profile Report (SPR) examining five years of graduates who entered in the 9th or 10th grade.
What they found was that students receiving the top rating of “A1” (based in part on their SSAT scores) indeed outperformed their “A2” peers in terms of both SAT scores and GPA. Gary said this confirmation “really buttressed” their file reading/assessment process.
The biggest surprise was the SSAT/SAT prediction study results. Although the SSAT Verbal section accurately predicted SAT Critical Reading performance (within 2 or 3 points), the SAT Math scores of PEA students were significantly higher than predicted on the SSAT score reports. On average, and throughout every cohort, Exeter students scored between 1 and 3 intervals higher on the SAT M than was predicted by the SSAT Quantitative Score.
While this came as no surprise given Exeter’s renowned Mathematics Department (which uses a textbook only for Statistics and writes its own problems), it provided welcome confirmation of the strength and added value of its program – not only for those who compete in the Math Olympiads and other renowned competitions, but for all PEA graduates.
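The predicted-versus-actual comparison behind Exeter’s finding can be sketched as a simple per-student calculation: how many score-report intervals does the actual SAT Math score land above the SSAT-based prediction? Both the interval width and the score pairs below are hypothetical.

```python
# Sketch of the predicted-vs-actual comparison described above.
# The interval width and all score pairs are HYPOTHETICAL.
INTERVAL = 30  # assumed width of one score-report interval, in SAT points

# (predicted SAT Math, actual SAT Math) pairs -- invented for illustration
cohort = [(640, 700), (680, 720), (600, 690), (660, 700)]

def intervals_above_prediction(predicted, actual, interval=INTERVAL):
    """Round the surplus over the prediction to whole intervals."""
    return round((actual - predicted) / interval)

gaps = [intervals_above_prediction(p, a) for p, a in cohort]
print(gaps)  # -> [2, 1, 3, 1]
```

Aggregating these gaps across every cohort is what lets a school state a finding like “1 to 3 intervals higher, on average” rather than relying on anecdote.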
With an eye towards conscientious stewardship of the school’s resources, Gary also examined – via the SPR – the performance of students within five financial aid bands. One of the most compelling pieces of data was the strong performance of students whose families needed $0 – $9,999 to bridge the affordability gap. This group outperformed all other categories of students in terms of grades and SAT scores.