
Journal of Cases in Educational Leadership
2015, Vol. 18(4) 350-364
© 2015 The University Council for Educational Administration
Reprints and permissions: sagepub.com/journalsPermissions.nav
DOI: 10.1177/1555458915609095
jcel.sagepub.com


Article

Analyzing Data and Asking Questions at Shell School, Sea County Florida

Charles Vanover1

Abstract
This case discusses early work to implement the Common Core State Standards at
a fictitious school in Florida. The case is designed to support students’ efforts to use
school accountability data for inquiry and to conceptualize change in schools where
previous leaders’ efforts were not successful. Shell Elementary is an exurban school
that serves a predominantly low-income, Caucasian community. Accountability data
show that Shell’s students learn below the achievement levels specified by the state.
The written text describes a dysfunctional school leadership system where data are
not used in a collaborative process. The Teaching Questions ask students to imagine
the initial steps in an inquiry that might surface the information necessary to set the
direction and lead a beneficial process of change.

Keywords
instructional leadership, data interpretation, educational accountability, educational
equity, case method (teaching technique)

Overview

This article presents a single, fictional case intended to support educational leadership
development students’ efforts to engage in one of the most challenging dimensions of
the practice of school leadership: the ability to use state accountability data to begin an
inquiry into a school’s challenges and strengths. The case is a work of the imagination
(Barone, 2001; Leavy, 2009) that combines observations from schools the author has encountered across his career.
1University of South Florida St. Petersburg, USA

Corresponding Author:
Charles Vanover, University of South Florida St. Petersburg, 140 7th Avenue South, St. Petersburg, FL 33701-5016, USA.
Email: vanover@mail.usf.edu



Quantitative and qualitative data are presented to allow students to make an initial set of inferences about Shell Elementary, a school where most evidence suggests students were not receiving a rich and cognitively demanding curriculum. The quantitative data presented in the case focus on reading achievement and do not provide the full range of information available to school leaders. This limited set of findings is offered to help students engage in the framing and questioning that will begin, but not conclude, an inquiry.

The qualitative data provided in the text of Analyzing Data and Asking Questions focus on the leadership system that structured teachers' work at the school. This system was dysfunctional in a manner familiar to many people who study schools that serve students growing up in poverty: At Shell Elementary School, during the 2012-2013 school year, teachers were told what to do, but what they were told to do did not work. In buildings engaged in such dysfunctional school improvement processes, accountability data are frequently used to game the system and to justify change programs that may be based on little, if any, evidence (Anderson & Schunn, 2000; Booher-Jennings, 2005; Neal & Schanzenbach, 2010). In the Teaching Questions connected to the case, students are asked to use the case material to imagine how to begin to solve one of the most complex problems in the field of educational leadership: How might school professionals use accountability data to create positive change in buildings where previous change efforts have not been successful?

Analyzing Data and Asking Questions

It is May of 2013. You are an assistant principal (AP) who transfers into another school district to work in a language-arts-focused AP position at a school in a county where members of your family have settled. Your paperwork gets hung up, and you spend your first week in the district central office where, through a series of fortuitous meetings, you are offered a 1-year, interim principalship of a kindergarten to fifth-grade school, 15 min away from the house you have just rented. The building has been classified as a Title I school by the state and federal government, and the district says that if the school begins to improve, you would likely be given a 3-year contract to continue in the position after your first year.

One day after you receive the offer, as soon as your last paperwork clears, you drive up to the school and arrive at 3:30 p.m. on Friday. Most students and teachers have left for the day, but the current principal meets you at the front office, welcomes you warmly, and shows you to an empty office next to hers. A large circular table takes up much of the space in this room. The principal shows you to a large, ergonomic office chair that swivels between the circular table and an empty desk that looks out into the school's back courtyard. The principal then sits in one of the smallest chairs surrounding the circular table and, after asking you politely about your previous position, spends the next 15 min telling you how much you are going to enjoy working at the school. Everyone she mentions is wonderful: The kids, the parents, the teachers, and the custodians are all wonderful, with one exception. The principal's sister is not wonderful, the sister is ill, and the principal is going to spend the next week taking care of her sister, before returning in 10 days to dismiss the kids for summer and to attend her retirement party that Wednesday. The principal gives you the keys to the building, gets up from the table, and tells you that the school secretary will come by in a couple of minutes to help you connect your new district laptop to the school's Wi-Fi. She waves for you to remain in your seat and tells you that someone from the district will be by early Monday to help you manage the school: "They might send Jean out."

The school secretary walks into your office with the person she introduces as the school "computer expert," her daughter. Once your Wi-Fi is set up, she gives you a quick tour of what appears to be a clean, well-organized school building. The school has a single story with a couple of large wings connected to the central area that contains the school offices and multipurpose rooms. After the tour, among the things you do is to introduce yourself to a few of the teachers who remain in the building. When you get back to your computer, you make sure that you can log into the school and district data systems, and then look at your district inbox, where you spot an email from the superintendent. He writes that there will be a fair number of people from the district at the farewell party a week from Wednesday, and he thought that it would be nice to combine the event with a presentation from you on your plans for the school: "We would like to get a sense of what you are seeing and where things might go, especially regarding Common Core implementation. Just you and the leadership team speaking to me, the area superintendent, and a couple of other folks. Think of it as a check-in, not a big event. Mostly, we are interested in learning more about the school so we can provide support to you and the staff during the transition. Please focus on reading achievement for this initial discussion."

You spend a couple of minutes reading this email and planning your week. You then put your new computer back in your briefcase and, after making sure you know which key opens which door, leave the building to spend the evening opening boxes and putting clothes into closets at the house you and your spouse have just rented.

You get to the school early the next day, Saturday, both because morning is the best time for you to think and because you are not sure whether the air conditioning stays on over the weekend. The first thing you do when you get to your desk is to chart out school data from the state's high-stakes tests over the last 5 years. You organize this first round of data quickly. You spent years charting data in your previous position, and you find it always helps to create your own charts. The student demographics show a decrease in enrollment but no major changes in the school's population (see Table A4 in the appendix), so when you begin charting the reading scores, you have a moment of panic at the steep decline in mean scale scores from 2011 to 2012 (see Figures A2 and A3). You then remember a major reason for this drop: the state had changed its high-stakes exam from an older test, the Florida Comprehensive Assessment Test (FCAT), to a new test, the FCAT 2.0. The new test was intended to act as a bridge between the old system and the Common Core State Standards (CCSS), and the new Florida exam was designed to measure instruction in higher-order thinking skills. Only about one third, at most, of the items on this new test measure basic skills; all the other questions are said to require deeper, more complex processing. As shown in Table A1, the percentage of students who scored above the cut scores the state used to determine reading proficiency dropped statewide when the FCAT 2.0 was implemented.


When you add district and state mean scores to what becomes Figures A2 and A3, you see a familiar pattern. Everything is not wonderful; the school is in the doldrums. Growth tends to be below state averages. While it is always difficult to figure out such things from an initial dig through the data, the worst of the school's decline seems to have coincided with the introduction of the new Florida test, the FCAT 2.0, in 2012. Once the state test began to measure the new standards, Shell's test scores began to fall.
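
A sketch of the kind of comparison chart the narrative describes is below. It is a minimal illustration only: the values are hypothetical placeholders rather than the case's figures, and the vertical line marks the FCAT-to-FCAT 2.0 transition that rescaled scores statewide.

```python
# Hypothetical sketch: plot a school's mean scale scores against district
# and state means, with the FCAT -> FCAT 2.0 break marked.
import matplotlib.pyplot as plt

years = [2009, 2010, 2011, 2012, 2013]      # FCAT through 2011; FCAT 2.0 after
series = {
    "School":   [295, 301, 304, 224, 218],  # placeholder mean scale scores
    "District": [298, 302, 305, 231, 229],
    "State":    [300, 303, 306, 232, 230],
}

fig, ax = plt.subplots()
for label, values in series.items():
    ax.plot(years, values, marker="o", label=label)

# The 2011 -> 2012 drop reflects the change of test and scale, not only
# a change in student learning.
ax.axvline(2011.5, linestyle="--", color="gray")
ax.annotate("FCAT -> FCAT 2.0", xy=(2011.5, 262), ha="center")

ax.set_title("3rd-Grade Reading Mean Scale Scores")
ax.set_xlabel("Year")
ax.set_ylabel("Mean scale score")
ax.set_xticks(years)
ax.legend()
plt.show()
```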

Your initial focus is the school’s reading scores, but from what you see, writing,
mathematics, and science all follow the same pattern. The school was on a bit of a roll
in 2010 and 2011, but once the state’s new test was introduced, its test scores began to
fall. While some of this decline is connected to changes in how the state measured
instruction, the results do seem to indicate problems in the school’s instructional
program.

The school lets community and athletic groups use the building during the weekend, and after briefly chatting with some of these folks, you begin to walk around the school to continue your investigation. You read and poke around until noon (yes, the air conditioning is working) and then go back home and finish the most urgent unpacking. Before you leave the school, you organize the data displays into a one-page handout that you might use for data chats with faculty next week, and then run copies to distribute.

You meet Jean, the person the district assigned to help you open the school, early Monday morning, and the two of you sit in your office and finalize a plan to manage the school. After you have finished discussing the major issues connected to the building, a teacher walks in and asks if you will be chairing the first-grade team meeting. You discover that Mrs. McGillicuddy, the school's former AP for curriculum, ran all of the school's grade-level meetings from her office, which you now occupy. Mrs. McGillicuddy, you find out, had transferred out of the school 2 weeks ago, when the building had finished its last round of benchmark assessments, and it had become clear that the district would not make her interim principal. You tell the teacher that you will step into the grade-level meeting for 5 min during the first part of the meeting to introduce yourself, but you would like to spend the morning helping to open the building and meeting the kids.

When you come back to the grade-level meeting for your quick hello, the first-grade team is assembled at your office's round table. You feel a little self-conscious as you sit down in the larger, ergonomic chair that is placed in front of your desk and swivel to the round table to talk to the group. You discover that the meeting agenda was created months in advance by Mrs. McGillicuddy, and that, in her absence, the teachers continue to follow the plan she laid out.

You spend the week being visible to staff and students and asking everyone questions about the school. When you find someone who seems interested, you pull out a printed copy of one of the data charts you have created and use that document to help structure the conversations. One initial and troubling discovery is that none of the teachers had seen the test scores charted out with the district and state comparisons you made in Figures A2 and A3. The teachers all know that Shell's scores were dropping, but when you sit with them, few seem able to offer a coherent opinion about the school's performance. They have trouble discussing how changes in the state testing system influenced the scores of students at Shell from one year to the next.

You find out from these discussions that all teacher meetings took place in Mrs. McGillicuddy's office and that she also led implementation of both the CCSS and the state/district's performance appraisal system. Mrs. McGillicuddy's computer seemed to have been the school's data room, and most teachers do not seem to spend time using assessment data for inquiry. A main focus of Mrs. McGillicuddy's efforts was to ensure that each grade level at Shell taught the same content at the same time and, when you pop into teachers' classrooms, you find teachers are not only following the district pacing guide, but they are also consciously trying to stay on the same section of the textbook. Sometimes when you pop in, you can see teachers adjusting their lesson to focus on the particular pages that Mrs. McGillicuddy had specified for that day. You also find out that Mrs. McGillicuddy had a "hot list" that targeted specific students for intervention. When you are able to track down a copy of this list, you discover, as you expected, that the vast majority of students on the list scored at one level below passing, Level 2, on the state's test of reading proficiency. Most of the others scored one level below passing in math. No first- or second-grade students are on the list.
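
The hot list's apparent selection rule is simple enough to state directly. Below is a hypothetical reconstruction (made-up students and field names) of how such a list might be pulled from a roster:

```python
# Hypothetical sketch of the hot list's selection rule: tested-grade students
# (grade 3 and up) who scored Level 2, one level below passing, in reading
# or in mathematics. Grades 1-2 are untested, so they never appear.
import pandas as pd

roster = pd.DataFrame({
    "student": ["A", "B", "C", "D", "E"],
    "grade": [3, 4, 5, 1, 2],
    "reading_level": [2, 3, 4, None, None],
    "math_level": [3, 2, 4, None, None],
})

tested = roster[roster["grade"] >= 3]
hot_list = tested[(tested["reading_level"] == 2) | (tested["math_level"] == 2)]
print(hot_list["student"].tolist())  # ['A', 'B']
```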

Over the course of the week, you find out that Mrs. McGillicuddy's efforts are widely credited for the increase in the school grade and the scores in 2010 and 2011 (see Table A1 and Figures A2 and A3). You also pick up a great deal of anger toward the state grading system, the new tests, the district's efforts to implement merit pay, and the CCSS. One of the coaches confides in you that many teachers are offended by the school's test results; these faculty members are said to be opposed to high-stakes testing and do not believe that the scores accurately reflect their teaching. Many teachers blame local families for the school's problems, and some tell you bluntly that education is no longer a priority for the people who live in the neighborhood.

For the most part, instruction is competent, but not of high quality. There are few
beginning teachers, and almost no classrooms are in turmoil. Some of the kids seem
withdrawn, some of the teachers seem to be going through the motions, but it is the end
of the year, and kids are reading and writing and working on projects. What you do not
see are attempts to move much beyond the textbook and ask kids to solve complex
problems. This means that despite what was described as almost a 4-year focus on
raising test scores, few teachers at the school are teaching the mid- to higher-level
thinking skills that are picked up by 65% to 70% of the items on the new Florida
assessments.

There are also some strange gaps in the school's instructional program. Vocabulary instruction in the regular classrooms seems perfunctory, and there are no mathematics and science word walls. While many teachers keep mathematics and science notebooks, you discover these materials are never discussed in their grade-level meetings. Teachers seem comfortable teaching comprehension strategies, but do not structure their instruction to build knowledge.

Every classroom has a desktop computer and a video projector. All of the teachers have laptops, but, for the most part, technology usage is weak and lacks rigor. Except for a couple of the science lessons, the only time you see students using computers is when they play instructional games at the back of the classroom. The one exception is the fifth grade, where Title funds have allowed the school to pay for a one-to-one laptop initiative for that grade level, and you do see the computers used in whole-class lessons. When you check in with one of the fifth-grade teachers about the technology, she tells you how overjoyed she is "to have the computers back." The laptops had been requisitioned for computerized testing, and most of the fifth-grade team did not have access to them during large portions of the school year.

In terms of CCSS implementation, all the teachers can at least talk about the reform and can tell you something about their efforts to teach complex text. Beyond that, much of the work teachers claim to be doing to implement the CCSS seems to be at a surface level. When you walk into classrooms, students are not creating arguments and synthesizing information from multiple texts. Teachers are not orchestrating lessons where kids solve complex problems and argue about whether it is appropriate to make a particular claim. Kids are not using digital sources. There is no school-wide push for problem solving, modeling, or quantitative literacy. Writing instruction seems rudimentary; most teachers' efforts do not seem to go beyond what is necessary to pass Florida's high-stakes assessment in writing, "Florida Writes."

There are some real strengths. Discipline at the school is fairly tight; kids might not be engaged, but they are also not acting out and disrupting their classrooms. The number of out-of-school suspensions is quite low. Best of all, it is the end of the school year and people are still teaching, and except for a couple of planned absences, they all show up for work. Instruction is more relaxed than it would be during the school's testing season, but in some ways this is a good thing. The fifth-grade teacher who acts as the school's lead science teacher has convinced some of the third- and fourth-grade teachers to teach science standards their kids had not studied in depth during "FCAT mania." These classrooms spend the last 90 min of the day engaging in hands-on, if not always minds-on, science lessons.

On Thursday afternoon, you, Jean, and one of the counselors go on a mini-retreat after school lets out. The three of you gather data from district benchmarks and other sources to attempt to cover most any question that might be asked at the district meeting on Wednesday. You tell Jean, "My goal is to have enough charts ready so that the district spends the entire time talking about my data. I've been here less than a week. I am not going to give them some big master plan about how I am going to implement the CCSS in a school with a bunch of experienced teachers who have spent the last 4 years 'implementing the textbook with fidelity.'"

Teaching Questions for Shell School

1. Please look at the charts in the appendix. If one removed the state and district
percentages from the tables and figures, what would these data imply about the
school?

2. Much of what the new interim principal does on the first Friday afternoon at Shell is not discussed. If you were dropped into a school as an interim leader, in late May, on a Friday afternoon, what are some sources of information you might draw on to quickly appraise the building's teachers, students, and curricula? You might organize your inquiry around Table 1.

356 Journal of Cases in Educational Leadership 18(4)

3. The new interim principal begins to analyze data on Saturday morning. The leader creates a set of big-picture charts, and then spends the rest of the time reading and walking around the school. Assuming time is limited, why does the new interim principal spend most of Saturday morning in the school, but out of the office? Is this choice an effective use of time?

4. Please use the data in the appendix to create the first draft of a set of slides you might use to present the school's strengths and weaknesses to members of the district leadership team. What other data might you wish to collect, analyze, and display? For this initial draft presentation, please write your questions directly on the slides.

5. What type of action planning process might help the school move forward into the next year? That is, what might be an initial principal entry plan to guide the interim leader's next steps during the summer? Please use Table 2 to structure this initial plan.

6. How might an appreciative approach (e.g., Carr-Stewart & Walker, 2003) to
improving Shell differ from the assessment-based approaches discussed, for
instance, in Boudett, City, and Murnane (2005) or Hamilton et al. (2008)?
Would the PowerPoint presentation and principal entry plan discussed in
Questions 4 and 5 be substantially different based on one approach rather than
the other?

7. Please comment on the use of the second person in the case and the lack of
information describing the protagonist, such as the interim principal’s age, sex,
and ethnicity.

Table 1. Asking Questions and Collecting Data.

What questions do we have?  |  What data might we collect to learn more about this issue?

Table 2. Draft Interim Principal Entry Plan, Shell Elementary School.

Month  |  Objectives; Tasks to be completed; Stakeholders to be consulted  |  Who will do the work?  |  How are implementation and assessment data collected and used?


Teaching Notes

I am a Caucasian, Episcopalian man who worked in the Chicago Public Schools for 8 years and supported my PhD studies by studying reform implementation in schools that served students living in poverty. I teach instructional leadership in a Master's in School Leadership program and, from this position, I designed Analyzing Data and Asking Questions to support discussion about assessment and leadership issues. The case was constructed out of encounters with a number of different schools in my career. It is intended to help teach students to take what Cochran-Smith and Lytle (2009) describe as an inquiry stance toward school improvement issues and to provide students with the opportunity to begin to learn two core competencies: (a) the capacity to notice critical school problems and (b) the ability to ask questions to begin inquiry and link data to school processes (e.g., Boudett et al., 2005; Coburn & Turner, 2011; Spillane & Miele, 2007).

Analyzing Data and Asking Questions is part of a combined case study that includes a second case that explicitly discusses some of the social justice issues (e.g., Black & Murtadha, 2007; Theoharis, 2010) connected to leading change. The class where I teach Analyzing Data and Asking Questions is part of a sequence my leadership development program developed to support students' efforts to use data to lead a capstone change project in the practicum and learn what Grossman, Hammerness, and McDonald (2009) might describe as the core practices of school leadership. As part of this curriculum, students will be asked to examine data from district benchmark assessments and other sources relevant to their own teaching and/or professional goals. Students will also use a process based on the work of Skrla, McKenzie, and Scheurich (2009) to revisit their school's data and conduct an equity audit. I run Analyzing Data and Asking Questions as part of my efforts to create a collaborative classroom culture that encourages students to work together to gain the skills they need to use data as committed school leaders. I find that time spent helping students learn to ask good questions helps students manage the demands of the assignments later in my program's curriculum.1

I begin teaching Analyzing Data and Asking Questions by describing what it means
to take an inquiry stance toward accountability data. I then divide students into small
groups and ask them to respond to the following prompt:

Imagine you have accepted an interim leadership position at a school with which you are not familiar, and you arrive at the school on Friday afternoon in late May, after the school has finished state testing for the year. What questions might you ask, and what data might you examine, to learn more about the school, and how might you plan to introduce yourself to the school community the next week?

I have found that these group responses provide a useful formative assessment for members' understanding of school data use. The responses are also an important teaching opportunity to help students envision inquiry into portfolios of student work, teachers' lesson plans, and the curricular content of the school's textbooks and instructional software.

Once the groups discuss the prompt and report out, I give students the appendix to Analyzing Data and Asking Questions. I discuss the leadership principles that support an effective data presentation and describe how the data in the appendix have been organized to communicate different types of information about the school and the state. Students are asked to use the school and state data from the charts to generate some inferences about the school and to discuss what data they need to deepen their inquiry. I ask each group to read the text of the case, go over the teaching questions, and come to the next class session with their initial version of the PowerPoint slides described in Teaching Question 4. I emphasize that the most important part of that assignment is the questions they are asked to write on each slide.

The next class usually begins with the student PowerPoint presentations described in Teaching Question 4. Students thus continue the action described in the case narrative, although they use a limited set of data. I ask each group to organize the figures and tables in the appendix into a coherent discussion of the state of the school, and I emphasize the importance of sharing their questions with the class. During these PowerPoint presentations, as the instructor, I strive to help the class take an inquiry stance and keep students focused on what they know, what they do not know, and what they hope to learn.

I have found that despite the fact that students use data taken from the same case, each presentation tends to focus on different issues. The class, as a whole, benefits from seeing and hearing alternative perspectives on Shell Elementary School. The presentations and discussions provide multiple opportunities for students to link data to instructional, professional, and cultural issues and to envision how to deepen their inquiry.

Once the class has developed an initial account of Shell's instructional and professional environments, I recommend the instructor guide the discussion toward a central issue in the case: the challenge of implementing a foundationally sound and cognitively demanding curriculum in schools that serve students living in poverty (Hernandez, 2011; Newmann, Marks, & Gamoran, 1996; Resnick, 2010).

The literature on school data use implies that the enactment of data-based improvement processes is an important means of building instructional capacity (Bryk, Sebring, Allensworth, Luppescu, & Easton, 2010; Hamilton et al., 2008; Paredes Scribner, 1999; Vanover & Hodges, 2015). There is also much evidence that accountability data may be used in ways that do not support all students' growth (for a review, see Jennings & Sohn, 2014). I have found that by asking students to inquire into a school where the previous leaders used data in a manner that does not meet state and national standards of best practice, I am able to create a space where students feel free to discuss the dangers and benefits of using accountability data for improvement.


Worthwhile readings that might help master's-level students grapple with issues connected to data use and assessment are Heritage (2010); Huguet, Marsh, and Farrell (2014); and Paredes Scribner (1999). Each piece discusses some of the issues involved in using data for beneficial, curricular improvement. Classes with a deep focus on curriculum might use some of Achieve the Core's (2014) alignment tools to discuss how students might investigate the quality of Shell's instructional materials.

Appendix

Data Displays for Shell Elementary2

Table A1. School Grades and Reading Learning Gains(a), Shell School, Sea County, Florida, 2008 to 2013.

Year        Statewide % of elementary    Shell school   Shell % reading at    Shell reading gains,   Shell reading gains,
            schools with a Grade C       grade          proficiency           school-wide            bottom 25%
            or better                                   (all Florida)         (all Florida)          (all Florida)
2009-2010   94                           B              71% (73%)             62% (65%)              62% (60%)
2010-2011   94                           B              72% (74%)             63% (65%)              66% (62%)
2011-2012   91                           C              53% (59%)             66% (68%)              66% (70%)
2012-2013   83                           C              51% (58%)             60% (65%)              69% (66%)

Note. (a) Learning gains is a measure of the Florida accountability system that provides the percentage of all students school-wide, and the percentage of students in the lowest quartile, who make what the state determines to be a full year's growth in a particular tested subject. For a document-based description of the Florida educational system during the years of the case, see Vanover (2015).
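
Because the note defines learning gains as a simple percentage, the computation is straightforward once student-level records are in hand. The sketch below uses hypothetical data and a simplified single-flag gain rule; Florida's actual growth rules are more detailed.

```python
# Hypothetical sketch of the Table A1 "learning gains" percentages:
# school-wide, and restricted to the lowest quartile of prior scores.
import pandas as pd

df = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6, 7, 8],
    "prior_scale_score": [182, 195, 210, 168, 240, 175, 205, 160],
    "made_gain": [True, False, True, True, False, True, False, True],
})

# School-wide gains: share of all tested students flagged as making a gain.
schoolwide_pct = df["made_gain"].mean() * 100

# Bottom-quartile gains: keep students whose prior score falls in the lowest
# 25% of the school's distribution, then take the same share.
cutoff = df["prior_scale_score"].quantile(0.25)
bottom25 = df[df["prior_scale_score"] <= cutoff]
bottom25_pct = bottom25["made_gain"].mean() * 100

print(f"School-wide gains: {schoolwide_pct:.0f}%")
print(f"Bottom 25% gains: {bottom25_pct:.0f}%")
```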

Table A2. Reading Proficiency and Gains, Student Subgroups, Shell School, 2013 School Year.

              ELL students: reading      Exceptional students           Free and reduced-price lunch
              proficiency and gains      (non-gifted): reading          students: reading proficiency
                                         proficiency and gains          and gains
              Level ≥ 3   Made           Level ≥ 3   Made               Level ≥ 3   Made
                          gains (%)                  gains (%)                      gains (%)
School        35%         65             17%         53                 48%         51
District      25%         52             12%         52                 46%         55

Note. ELL = English language learners.


Table A3. Faculty Demographics of Shell School, Sea County, Florida, 2013.

Number of teachers                                          52
Number of newly hired teachers                               3
Percentage of teachers with 0 to 5 years of experience      17
Percentage of teachers with 6 to 15 years of experience     45
Percentage of teachers with 16 to 30 years of experience    32
Percentage of White teachers                                85
Percentage of Hispanic teachers                              9
Percentage of African American teachers                      6
Percentage of reading-certified teachers                    15
Percentage of ELL-certified teachers                        38

Note. ELL = English language learners.

Table A4. Student Population by Ethnicity, Status, and Gender, Shell Elementary School, 2011 to 2013.

                                              2011        2012        2013
                                              M     F     M     F     M     F
White/Caucasian                               261   220   245   203   230   190
African American                               15     9    13    12    12    12
Hispanic/Latino                                98    83    98    90   102    96
Asian                                          14    12    14    12    15    12
Native American                                 3     2     1     2     3     2
Two or more races                              10     5     9     9    13    11
Students with exceptionalities (non-gifted)    46    27    40    23    41    24
Free or reduced-price lunch                   361   297   338   293   330   284
English language learners                      37    34    40    35    37    38
Total male/female                             401   331   380   328   375   323
Total number of students                         732         708         698

Table A5. Reading Proficiency Percentage by Major Subgroup, Shell Elementary School, 2011 to 2013.

                               2011 (%)   2012 (%)   2013 (%)
All elementary statewide           74         59         58
All Shell                          72         53         51
White/Caucasian, Shell             76         55         52
Hispanic/Latino, Shell             64         49         50
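
A Table A5-style row is just the share of tested students in a subgroup scoring at Level 3 or above, the state's proficiency threshold. A minimal sketch with hypothetical records:

```python
# Hypothetical sketch: percentage proficient (Level 3+) by subgroup.
import pandas as pd

scores = pd.DataFrame({
    "subgroup": ["White", "White", "White", "Hispanic", "Hispanic", "Hispanic"],
    "achievement_level": [3, 2, 5, 4, 1, 3],
})

scores["proficient"] = scores["achievement_level"] >= 3
pct_by_group = scores.groupby("subgroup")["proficient"].mean() * 100
print(pct_by_group.round(0))
```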


Figure A1. Number of students with attendance below 90% by grade level, Shell Elementary School, 2013.

Figure A2. Third-grade Florida high-stakes reading assessment data, Shell School, 2009 to 2013. (The chart plots 3rd-grade reading mean scale scores for the school, district, and state across the FCAT, 2009 to 2011, and the FCAT 2.0, 2012 to 2013.)

Figure A3. Fifth-grade Florida high-stakes reading assessment data, Shell School, 2009 to 2013.


Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship,
and/or publication of this article.

Funding

The author(s) received no financial support for the research, authorship, and/or publication of
this article.

Notes

1. Readers interested in using the second case or learning about this school leadership development curriculum might contact the author.

2. The accountability data for Shell were constructed from results obtained from several schools. State data in the tables and figures come from the online archive for Florida accountability information at http://schoolgrades.fldoe.org/reports/index.asp. Historical information on state tests, including specifications for the tests described in this article, may be found at http://fldoe.org/accountability/assessments/k-12-student-assessment/history-of-fls-statewide-assessment/fcat-2-0/archived-pubs-documents.stml

Additional Readings

Boudett, K. P., City, E. A., & Murnane, R. J. (Eds.). (2005). Data wise: A step-by-step guide
to using assessment results to improve teaching and learning. Cambridge, MA: Harvard
Education Press.

Cochran-Smith, M., & Lytle, S. L. (Eds.). (2009). Inquiry as stance: Practitioner research for
the next generation. New York, NY: Teachers College Press.

Earl, L. M., & Katz, S. (2006). Leading schools in a data-rich world: Harnessing data for
school improvement. Thousand Oaks, CA: Sage.

Heritage, M. (2010). Formative assessment: Making it happen in the classroom. Thousand Oaks, CA: Corwin.

Huguet, A., Marsh, J. A., & Farrell, C. (2014). Building teachers’ data-use capacity: Insights
from strong and developing coaches. Education Policy Analysis Archives, 22(52). Retrieved
from http://dx.doi.org/10.14507/epaa.v22n52.2014

Paredes Scribner, A. (1999). Using student advocacy assessment practices. In P. Reyes, J. D. Scribner, & A. Paredes Scribner (Eds.), Lessons from high-performing schools: Creating learning communities (pp. 169-187). New York, NY: Teachers College Press.

Skrla, L., McKenzie, K. B., & Scheurich, J. J. (2009). Using equity audits to create equitable and excellent schools. Thousand Oaks, CA: A joint publication with the National Association of Secondary School Principals and Corwin.

Vanover, C., & Hodges, O. (2015). Teaching data use and school leadership. School Leadership
& Management, 35(1), 17-38.

References

Achieve the Core. (2014). Instructional and assessment materials. Retrieved from http://achievethecore.org/dashboard/410/search/3/1/0/1/2/3/4/5/6/7/8/9/10/11/12

Anderson, J. R., & Schunn, C. D. (2000). Implications of the ACT-R learning theory: No magic bullets. In R. Glaser (Ed.), Advances in instructional psychology (Vol. 5, pp. 1-33). Mahwah, NJ: Lawrence Erlbaum.



Barone, T. (2001). Science, art, and the predispositions of educational researchers. Educational
Researcher, 30, 24-28.

Black, W. R., & Murtadha, K. (2007). Toward a signature pedagogy in educational leadership
preparation and program assessment. Journal of Research on Leadership Education, 2,
1-29.

Booher-Jennings, J. (2005). Below the bubble: "Educational triage" and the Texas accountability system. American Educational Research Journal, 42, 231-268.

Boudett, K. P., City, E. A., & Murnane, R. J. (Eds.). (2005). Data wise: A step-by-step guide
to using assessment results to improve teaching and learning. Cambridge, MA: Harvard
Education Press.

Bryk, A. S., Sebring, P. S., Allensworth, E., Luppescu, S., & Easton, J. Q. (2010). Organizing
schools for improvement: Lessons from Chicago. Chicago, IL: University of Chicago
Press.

Carr-Stewart, S., & Walker, K. (2003). Learning leadership through appreciative inquiry.
Management in Education, 17, 9-14.

Coburn, C. E., & Turner, E. O. (2011). Research on data use: A framework and analysis.
Measurement: Interdisciplinary Research and Perspectives, 9, 173-206.

Cochran-Smith, M., & Lytle, S. L. (Eds.). (2009). Inquiry as stance: Practitioner research for
the next generation. New York, NY: Teachers College Press.

Grossman, P., Hammerness, K., & McDonald, M. (2009). Redefining teaching, re-imagining teacher education. Teachers and Teaching, 15, 273-289.

Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J. (2008). Using student achievement data to support instructional decision making. Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Retrieved from http://ies.ed.gov/ncee/wwc/pdf/practice_guides/dddm_pg_092909

Heritage, M. H. (2010). Formative assessment: Making it happen in the classroom. Thousand
Oaks, CA: Corwin.

Hernandez, D. J. (2011). Double jeopardy: How third-grade reading skills and poverty influence high school graduation. Annie E. Casey Foundation. Retrieved from http://files.eric.ed.gov/fulltext/ED518818

Huguet, A., Marsh, J. A., & Farrell, C. (2014). Building teachers’ data-use capacity: Insights
from strong and developing coaches. Education Policy Analysis Archives, 22(52). Retrieved
from http://dx.doi.org/10.14507/epaa.v22n52.2014

Jennings, J., & Sohn, H. (2014). Measure for measure: How proficiency-based accountability systems affect inequality in academic achievement. Sociology of Education, 87, 125-141.

Leavy, P. (2009). Method meets art: Arts-based research practice. New York, NY: The Guilford
Press.

Neal, D., & Schanzenbach, D. W. (2010). Left behind by design: Proficiency counts and test-based accountability. The Review of Economics and Statistics, 92, 263-283.

Newmann, F. M., Marks, H. M., & Gamoran, A. (1996). Authentic pedagogy and student performance. American Journal of Education, 104, 280-312.

Paredes Scribner, A. (1999). Using student advocacy assessment practices. In P. Reyes, J. D. Scribner, & A. Paredes Scribner (Eds.), Lessons from high-performing schools: Creating learning communities (pp. 169-187). New York, NY: Teachers College Press.

Resnick, L. (2010). Nested learning systems for the thinking curriculum. Educational Researcher,
39, 183-197.



Skrla, L., McKenzie, K. B., & Scheurich, J. J. (2009). Using equity audits to create equitable and excellent schools. Thousand Oaks, CA: A joint publication with the National Association of Secondary School Principals and Corwin.

Spillane, J. P., & Miele, D. B. (2007). Evidence in practice: A framing of the terrain. In P. A.
Moss (Ed.), Evidence and decision making (pp. 46-73). Malden, MA: National Society for
the Study of Education.

Theoharis, G. (2010). Disrupting injustice: Principals narrate the strategies they use to improve
their schools and advance social justice. The Teachers College Record, 112, 331-373.

Vanover, C. (2015). A note on the Florida accountability system: Supplemental document for "Analyzing data and asking questions at Shell School, Sea County Florida." Retrieved from http://hdl.handle.net/10806/14186

Vanover, C., & Hodges, O. (2015). Teaching data use and school leadership. School Leadership
& Management, 35(1), 17-38.

Author Biography

Charles Vanover is an assistant professor in the Department of Educational Leadership Development at the University of South Florida St. Petersburg. He worked in the Chicago
Public Schools for 8 years and has studied school improvement, leadership preparation, and
policy implementation. His ethnodramas have been performed as part of the peer-reviewed
programs of annual conferences hosted by the American Educational Research Association, the
Ethnography in Education Research Forum, the Qualitative Report Conference, the International
Congress of Qualitative Inquiry, and the University Council of Educational Administration.

