EPS: High-performing schools: Community characteristics formula



Re: Community characteristics formula

From: Brian Hubbell <sparkflashgap@gmail.com>

Tue, Nov 9, 2010 at 8:16 PM

To: David Silvernail

David,

 

I was just re-reading your 2007 report The Identification of Higher and Lower Performing Maine Schools: School Profiles and Characteristics, in particular this paragraph:

 

Using a value-added definition, a school is designated as higher performing only when its average performance score is higher than would be expected based on that community’s characteristics and students’ prior achievement. In essence, the school is defined as adding value beyond the community. For this project, these characteristics include a) the percentage of students who receive free or reduced lunch, b) the percentage of households in the community with at least one member who holds a bachelor’s degree, and c) for upper grade students, the average MEA score of the town or district’s earlier grade students (i.e. 4th or 8th graders). Numbers representing these characteristics were used in a mathematical formula to determine a predicted score for each school.

I was wondering if you could share this formula?

 

Thanks,

Brian

 

 

David Silvernail

Tue, Nov 9, 2010 at 8:48 PM

To: Brian Hubbell <sparkflashgap@gmail.com>

It was a standard multiple regression formula, where the variables were
the three variables for each school/district mentioned in the paragraph
below.

David

David L. Silvernail
Center for Education Policy, Applied Research, & Evaluation
University of Southern Maine

 

 

Brian Hubbell <sparkflashgap@gmail.com>

Tue, Nov 9, 2010 at 9:03 PM

To: David Silvernail

Thanks. Is there a straightforward way to describe how those three variables were quantified and weighted?

 

Brian

 

 

David Silvernail

Tue, Nov 9, 2010 at 9:39 PM

To: Brian Hubbell <sparkflashgap@gmail.com>

The variables were already "quantified" as percents and scale scores.
Nothing was weighted. The variables were simply entered until the most
variance was significantly accounted for.

David
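To make the mechanics concrete, here is a minimal sketch of the kind of calculation David describes: an ordinary least-squares regression that predicts each school's average score from the three community variables, with value-added read off as the gap between actual and predicted scores. The data, variable names, and use of plain least squares are my own illustration; the report's actual data and coefficients were not shared in this exchange.

```python
# Illustrative sketch only -- not MEPRI's code or data.
# Predict a school's average score from the three community characteristics
# named in the 2007 report, then compare actual scores to the prediction.
import numpy as np

# Hypothetical values for six schools (made up for illustration):
lunch  = np.array([12.0, 30.0, 45.0, 8.0, 22.0, 55.0])    # % free/reduced lunch
degree = np.array([55.0, 33.0, 24.0, 60.0, 40.0, 18.0])   # % households with a bachelor's degree
prior  = np.array([850., 844., 840., 851., 846., 835.])   # earlier-grade average score
actual = np.array([1152., 1140., 1124., 1150., 1138., 1112.])  # observed school averages

# Design matrix with an intercept column, then a least-squares fit.
X = np.column_stack([np.ones(len(actual)), lunch, degree, prior])
coef, *_ = np.linalg.lstsq(X, actual, rcond=None)

predicted = X @ coef             # the "predicted score" for each school
residual  = actual - predicted   # positive = performing above expectation

for i, r in enumerate(residual, start=1):
    print(f"School {i}: {r:+.1f} points relative to prediction")
```

On this reading, "nothing was weighted" simply means the coefficients fall out of the least-squares fit rather than being assigned by hand, and a school "adds value beyond the community" when its residual is positive.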

 

Brian Hubbell <sparkflashgap@gmail.com>

Wed, Nov 10, 2010 at 6:49 PM

To: David Silvernail

 

David,

 

I apologize for being so dense about your calculations for high-performing schools.  My brain is a bit better suited to the specific rather than the statistic.

 

So, if you'll bear with me, I'd like to work through a specific comparison using - let's say - MDI High School.

 

According to your 2007 report, The Identification of Higher and Lower Performing Maine Schools: School Profiles and Characteristics, each school's performance is based on six indices. As represented in one of the report's associated school performance profiles, MDI HS exceeds the threshold for high performance in five of the six indices. The single index by which MDI HS falls short is the one requiring that "average performance score is higher than would be expected based on that community's characteristics and students' prior achievement."

 

The report explains that this expectation is based on three variables:

a) the percentage of students who receive free or reduced lunch,

b) the percentage of households in the community with at least one member who holds a bachelor’s degree, and

c) for upper grade students, the average MEA score of the town or district’s earlier grade students

 

So, to try to understand how these variables affect this index, I've made a table of the underlying data for these variables for MDI High School and also for two other high schools which, in fact, do make the grade as high-performing: Cape Elizabeth High School and Yarmouth High School.

 

 

                                                  MDI HS    Cape Elizabeth HS    Yarmouth HS

% eligible for free or reduced lunch (10/09)       26.6%          5.5%               6.5%
% bachelor's degree +                               33.0%         58.7%              57.2%
Scaled score: 2009 8th-grade Math NECAP              844*           850                851
Scaled score: Math SAT, 2007-2010                    1145          1153               1151
"Value added" % increase in Math scaled
  scores, 8th to 11th grade                       135.74%       135.65%            135.25%
Scaled score: 2009 8th-grade Reading NECAP           848*           856                861
Scaled score: Reading SAT, 2007-2010                 1145          1154               1154
"Value added" % increase in Reading scaled
  scores, 8th to 11th grade                       134.98%       134.81%            134.03%

 

Noting that the "value added" represented by the increase in scale scores is essentially identical for all three schools, and that any effect from the other two variables would seem inclined to favor MDI's effort, I'm wondering whether I'm misunderstanding something. What is it that causes MDI to fall short on the 'expectations' index while the other two high schools exceed it?

 

 

*Baseline scale score calculated as an average of the sending schools' 8th-grade scores, weighted by the enrollment each school contributes to MDI HS:

 

 

                    HS enrollment   8th-grade NECAP        Reading score   8th-grade NECAP     Math score
                    (2010)          Reading scaled score   x enrollment    Math scaled score   x enrollment
                                    (Fall 2009)                            (Fall 2009)
Bar Harbor               204             853                  174012            845               172380
Mount Desert              79             854                   67466            845                66755
Southwest Harbor          92             842                   77464            841                77372
Tremont                   63             845                   53235            846                53298
Trenton                   51             839                   42789            837                42687
Lamoine                   26             843                   21918            844                21944
Hancock                    6             844                    5064            840                 5040
Total                    521                                  441948                              439476

Weighted reading scale score: 848
Weighted math scale score: 844
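For anyone checking my arithmetic, the footnote above (and the "value added" percentages in the comparison table) reduces to an enrollment-weighted average followed by a simple ratio. Here is a sketch that reproduces those figures; this is only my spreadsheet logic, not anything from the MEPRI analysis.

```python
# Enrollment-weighted average of the sending schools' 8th-grade NECAP scores,
# then MDI's SAT scaled score expressed as a percentage of that baseline.
# Figures are copied from the tables above.
enrollment = [204, 79, 92, 63, 51, 26, 6]          # Bar Harbor ... Hancock
reading    = [853, 854, 842, 845, 839, 843, 844]   # 8th-grade NECAP Reading, Fall 2009
math       = [845, 845, 841, 846, 837, 844, 840]   # 8th-grade NECAP Math, Fall 2009

def weighted_mean(scores, weights):
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

reading_baseline = weighted_mean(reading, enrollment)   # ~848.3, rounds to 848
math_baseline    = weighted_mean(math, enrollment)      # ~843.5, rounds to 844

# "Value added" as shown in the comparison table: MDI's SAT scaled score
# (1145 in both subjects) divided by the weighted 8th-grade baseline.
print(f"Reading: {1145 / reading_baseline:.2%}")   # ~134.98%
print(f"Math:    {1145 / math_baseline:.2%}")      # ~135.74%
```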

 

 

 

 

David Silvernail <davids@usm.maine.edu>

Thu, Nov 11, 2010 at 3:06 PM

To: Brian Hubbell <sparkflashgap@gmail.com>

Brian, you are generally on target in the way you are thinking about
this, but you "can't get there from here" because, for example,

1. the 2007 analysis was based on 2002-2005 data;

2. the data for all three grades were MEAs, not 11th grade SATs;

3. the value-added criterion is determined by the regression
analysis...using data from ALL students and schools;

4. the 8th grade scores need to be lagged three years so that they
represent the same cohort as the 11th graders.

Also, as an aside, if you look at the MDI high school profile, you will
see that the school was doing better than predicted, just not by a third
of a standard deviation as required by our definition.

David
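Restating David's points in concrete terms helped me. Here is a rough sketch of the decision rule as I understand it: the one-third-of-a-standard-deviation threshold and the three-year cohort lag come from his message, while everything else, including taking the standard deviation over the residuals rather than the raw scores, is my assumption.

```python
# Rough sketch of the higher-performing designation as I understand it:
# regress ALL schools' scores on the community variables plus prior-grade
# scores lagged three years (so both measures describe the same cohort),
# then flag a school only if it beats its prediction by more than a third
# of a standard deviation. Whether that standard deviation is taken over
# the residuals or the raw scores is my assumption.
import numpy as np

def higher_performing(actual, predicted):
    """Boolean array: True where a school exceeds its predicted score by
    more than one third of a standard deviation of the residuals."""
    residuals = np.asarray(actual, dtype=float) - np.asarray(predicted, dtype=float)
    threshold = residuals.std() / 3.0
    return residuals > threshold
```

On this rule, a school can sit above its prediction and still miss the designation if the margin is too small, which is exactly the situation David describes for MDI HS.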

 

 

Brian Hubbell <sparkflashgap@gmail.com>

Thu, Nov 11, 2010 at 4:18 PM

To: David Silvernail

David,

 

Thanks so much for sticking with me. I think I'm finally beginning to get it.

 

1) So, your work essentially represents schools which were exhibiting high-performing characteristics during the single three-year period from 2002-2005 based on the "value-added" for one rolling three-year cohort tested even earlier?

 

2) Since the report used uniform MEA data across all grade levels, is it no longer possible to replicate the analysis now that the disparate MEA, NECAP, and SAT assessments have been adopted?

 

3) Did your value-added calculation for MDI HS account for the varying 8th-grade baselines from the different sending schools - as mine did?

 

4) If the purpose of the exercise is to screen against the inherent advantages of wealth, doesn't the report's conclusion that seven out of 14 high-performing high schools are located in Maine's most affluent county, in communities with the lowest percentage of disadvantaged students and the highest percentage of college attainment, suggest that something might be incorrectly skewed in either the data or the regression?

 

5) Maybe it's just because I haven't seen your data, but if MDI HS appears to be boosting scaled scores by essentially the same percentage as Cape Elizabeth and Yarmouth while starting from a relatively greater disadvantage, I still don't get why your regression indicates a lower adjusted performance against expectations.

 

My hypothesis is that the success (which the report recognizes - and thank you very much for that) of two of MDI's K-8 schools is what set the performance expectations slightly out of reach at the high school.  So, I'd like to be reassured that your calculation accounted for the significant influx at the high school of students from other systems beyond the four MDI towns.

 

My apologies for continuing to impose on your time with these questions. But, as you can gather, it's a subject that interests me greatly, especially given the imminent trend toward applying value-added measures to more aspects of education.

 

-Brian

 

 

 

So many questions; here are a few:

I may be misreading this, but "The variables were simply entered until the most variance was significantly accounted for" appears to say that the statistics employed were manipulated in order to come up with an answer the authors wanted. If that's not so, what does the sentence mean? (One possible reading is sketched after these questions.)

If the elementary test scores were followed up by high school testing, how did that affect schools that are K-8 only? For example, the school in New Sweden sends its HS students to Caribou. Were those students' scores followed through at Caribou, or are they part of the Caribou statistics without taking into account the original schooling?

In the original report, there was a section entitled "Research Plan" with several characteristics of allegedly high-performing schools. Those characteristics were not defined in the report; are they defined elsewhere?

I may have missed this, but which schools were sampled (Status Report)?
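On the first question above: one plausible, non-sinister reading of "entered until the most variance was significantly accounted for" is a stepwise (forward) entry procedure, in which predictors are added one at a time and kept only if they meaningfully raise the explained variance. That reading is my guess, not David's confirmation; here is a sketch of what such a procedure looks like, with an arbitrary improvement threshold.

```python
# Illustrative sketch of forward stepwise entry -- one possible reading of
# "entered until the most variance was significantly accounted for."
# Not MEPRI's actual procedure; the 0.02 R^2 gain threshold is arbitrary.
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit of y on X (intercept included)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ coef
    return 1.0 - resid.var() / y.var()

def forward_select(candidates, y, min_gain=0.02):
    """Add the candidate column that most improves R^2 at each step;
    stop when no remaining column improves it by at least min_gain."""
    chosen, best_r2 = [], 0.0
    remaining = list(range(candidates.shape[1]))
    while remaining:
        r2, best_col = max((r_squared(candidates[:, chosen + [j]], y), j) for j in remaining)
        if r2 - best_r2 < min_gain:
            break
        chosen.append(best_col)
        remaining.remove(best_col)
        best_r2 = r2
    return chosen, best_r2
```

Selecting variables this way decides which predictors stay in the model; it does not hand-tune their influence, which is presumably what "nothing was weighted" means.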

Characteristics and individual school reports

You can see the individual reports for all the schools here:

The report covered most, but not all, Maine schools. The smallest schools were left out, I assume because their test samples are too small.

In this past Monday's status report to the Education Committee, MEPRI lists some specific characteristics of high-performing schools. The proposal says they intend the continued research to test this hypothesis.

Links to all the MEPRI reports and some more of my own analysis and comments are on this MDIschools.net EPS page.

Thanks!

Tracking down and posting all this information should not be the role of a private, concerned citizen. The Department of Education, MEPRI, the Ed Committee, and other public groups should be obligated to make sure all background information of this sort is posted at some comprehensive website, so that everyone who cares (even if it's only a few) can find it without asking.