The Bulletin
Division of School & District Effectiveness
December 2015 & January 2016
"Advancing leadership-- Transforming schools"
Purposes
The SDE Bulletin: to provide regular, timely information to increase the shared understanding of our team of School & District Effectiveness professionals
Our Shared SDE Purpose: to increase collective leadership capacity to understand what effective schools and districts know and do, and to support the leaders to own their improvement processes
Previous Editions of The Bulletin
August 2014- https://www.smore.com/700mx
September 2014- https://www.smore.com/huyyh
October 2014- https://www.smore.com/std20
November 2014- https://www.smore.com/09uva
December 2014/January 2015- https://www.smore.com/09uva
February 2015- https://www.smore.com/hrzfv
March 2015- https://www.smore.com/6wsrq
April 2015- https://www.smore.com/9vbmj
May 2015- https://www.smore.com/gwjuk
June 2015- https://www.smore.com/4suf4
July 2015- https://www.smore.com/kk5zr
August 2015- https://www.smore.com/uek4p
September 2015- https://www.smore.com/puabs
October 2015- https://www.smore.com/thryq
November 2015- https://www.smore.com/72hzp
This Month
You know about CCSSO's ISLLC leadership standards, right? Well, they have been revised and are now called the "Professional Standards for Educational Leaders." Check them out:
http://www.ccsso.org/Documents/2015/ProfessionalStandardsforEducationalLeaders2015forNPBEAFINAL.pdf
There are 10 standards, and the 10th is School Improvement. Review the components of the 10th standard below (I especially love "h"!). Consider these standards deeply, and think about how they shape your work with leadership in schools and districts.
#10. School Improvement:
Effective educational leaders act as agents of continuous improvement to promote each student’s academic success and well-being.
Effective leaders:
a) Seek to make school more effective for each student, teachers and staff, families, and the community.
b) Use methods of continuous improvement to achieve the vision, fulfill the mission, and promote the core values of the school.
c) Prepare the school and the community for improvement, promoting readiness, an imperative for improvement, instilling mutual commitment and accountability, and developing the knowledge, skills, and motivation to succeed in improvement.
d) Engage others in an ongoing process of evidence-based inquiry, learning, strategic goal setting, planning, implementation, and evaluation for continuous school and classroom improvement.
e) Employ situationally appropriate strategies for improvement, including transformational and incremental, adaptive approaches and attention to different phases of implementation.
f) Assess and develop the capacity of staff to assess the value and applicability of emerging educational trends and the findings of research for the school and its improvement.
g) Develop technically appropriate systems of data collection, management, analysis, and use, connecting as needed to the district office and external partners for support in planning, implementation, monitoring, feedback, and evaluation.
h) Adopt a systems perspective and promote coherence among improvement efforts and all aspects of school organization, programs, and services.
i) Manage uncertainty, risk, competing initiatives, and politics of change with courage and perseverance, providing support and encouragement, and openly communicating the need for, process for, and outcomes of improvement efforts.
j) Develop and promote leadership among teachers and staff for inquiry, experimentation and innovation, and initiating and implementing improvement.
From Areas/Regions
Phillip Luck, North; Sam Taylor, Metro; Patty Rooks, South
Our teams have been working very hard with our districts and schools over the past several months. By now, our districts and schools should have completed their self-assessments in QCIS (Indistar) and should be identifying and prioritizing tasks related to the standards rated below operational. This is where the real work begins!
So, in order to allow sufficient time for the SEA monitoring of the Priority schools, the deadline will be extended from January 31st to February 29th. Further, because many of our LEAs are new both to monitoring their Priority schools and to the work related to the District Standards, the LEA Monitoring submission due date is being moved to January 22nd. This will allow additional time for the DESs to support their districts in implementing a quality monitoring process.
Providing this focused support to our districts on the front end will go a long way in ensuring that the foundation has been established for the districts/schools to succeed as they move forward!
From the Atlanta Support Office
Professional Learning Support
Christy Jones & Andrea Cruz
Please mark your calendars and plan to attend our SDE PL on December 15th and 16th at the Decatur Courtyard Marriott. Content for the first day includes the GaDOE Strategic Plan, Key Standards and Processes, Processes for Achievement of Key Standards, Creating a Process, and Coaching Conversations. We will begin at 9:30 a.m. on the first day. Day two will focus on region work and job-alikes. We look forward to seeing everyone in December!
Professional Learning Tidbit
Our December SDE PL focuses on processes, so I wanted to share additional information about them. I found an interesting article that discusses the difference between processes and systems. Essentially, the article states that a process is a conceptual sequence of events that enables people in a business to do what they do. On the other hand, systems are what’s used to execute the process. In general, processes address effectiveness and systems address efficiency. Ideally, systems support processes which support people.
Source: http://www.productiveflourishing.com/whats-the-difference-between-a-system-and-a-process/
Strategy of the Month
Each month we’ll provide a PL strategy that could be used with adults or students. Our goal is to deepen learning and engage the learner.
Title: Rose, Bud, Thorn
Description: A short activity that encourages reflection on the day's learning and gives feedback to the facilitator.
Directions:
Materials Needed:
3 different colors of sticky notes
pink = rose > things that went well
green = bud > opportunities, ideas that have potential, things you want to follow up on
blue = thorn > challenges, what could be done differently
Write one insight per sticky note; participants may contribute up to three notes.
Going further: Group notes into themes to help reflect on key trends across the whole group.
Intended Audience: Students or Adults
Source: 2 Min PD: Rose, Bud, Thorn
Operational Support
Cindy Popp, Region Resources
IT Updates Webinar, Friday, December 4, 9:00 AM
SESI Updates – Please send any change or addition requests for the System for Effective School Instruction webpage to schoolimprovement@doe.k12.ga.us by Thursday, December 10. The SESI team will hold its quarterly webpage update meeting on Monday, December 14.
Document Revision Update
Thank you to everyone who volunteered to work on document revisions. The program managers have recommended additional staff to work on various teams. I will be sending emails over the next several weeks to each team as we move forward.
Leadership Guide
The Parent Engagement team is assisting SDE with the Family and Community Engagement (FCE) strand of the Leadership Guide. The goal is to have the FCE strand more closely aligned to the national standards.
The Leadership Guide team is working on the Professional Learning and Planning and Organization strands. We plan to have a draft of all eight strands for the Leadership Guide by early summer when we review the Georgia School Performance Standards for revisions. The goal is to have the Leadership Guide directly aligned to the Georgia School Performance Standards.
Gary Wenzel, Operations
The FY16 Title I, Part A 1003(a) School Improvement Grants for Priority, Focus, and Opportunity schools were approved by the State Board on September 25, 2015. A number of district budgets have been imported into the Consolidated Application by Title I Directors, with the Justification of Expenses completed and signed by the principal and the School Effectiveness Specialist (the GaDOE SES assigned to Priority schools, the RESA SES assigned to Focus schools; for Opportunity schools, only the principal needs to sign off).
Priority and Focus principals are required to attend Instructional Leadership Academy (ILA) professional learning conferences, as are the Opportunity principals for whom 1003(a) funding is provided. Once the district superintendent signs off on the budget (as well as the MOA), the approval process begins with the Regional Lead SES and the District Effectiveness Specialist (DES) reviewing each Priority budget and justification.
As Leads and DESs review budgets, they should be sure expenditures align with the identified needs of the schools and the School Improvement Plan (SIP). Funding should directly support increased student achievement.
Focus and Opportunity school budgets and justifications are also reviewed by Atlanta staff. Once budgets are approved in the Consolidated Application, the district Finance Department may pull down the 1003(a) funding from the Grants Accounting Online Reporting System (GAORS). District purchase orders may begin, as well as drawdowns. Drawdown deadlines are:
December 30, 30%;
March 30, 60%;
May 30, 80%.
If a majority of the funding is for summer programs or salaries/benefits, the district Title I Director may send an email to gwenzel@doe.k12.ga.us indicating how drawdowns will be affected. All funds must be expended no later than September 30, 2016. There is no carryover of FY16 1003(a) School Improvement funds.
Federal Support
Karen Suddeth and Melvina Crawl- SIG/1003(g)
Cohort 3 (July 1, 2013-September 30, 2016)
Bibb County- Matilda Hartley Elementary School; Westside High School
Fulton County- Frank McClarin High School
Gwinnett County- Meadowcreek High School
Quitman County- Quitman County High School
Twiggs County- Twiggs County High School
Wilkinson County- Wilkinson County High School
Cohort 4 (July 1, 2014-September 30, 2017)
Atlanta Public Schools- Frederick Douglass High School
Bibb County- Southwest Magnet High School and Law Academy
Dougherty County- Dougherty County Comprehensive High School; Monroe Comprehensive High School
Muscogee County- Fox Elementary School; Jordan Vocational High School; William H. Spencer High School
Fiscal Reminders
With the closeout of FY15 funds on October 30th, and completion reports now submitted, Cohort 3 and Cohort 4 schools are operating solely with FY 2016 funding. As the end of first semester is close at hand, the SIG schools should be well into the implementation of this year’s initiatives.
In order to maximize the impact of the resources provided by the grant, all equipment, supplies, and materials to be purchased with this year's SIG funds should have been purchased and in the schools by December 1st. In addition, the drawdown for these expenditures should have occurred by November 20th. Funds for all SIG salaries and associated benefits for July, August, September, and October were also expected to be drawn from the state no later than the November drawdown.
Drawdowns
The deadline for drawdowns is the 20th of each month. It is expected that the drawdown include all SIG expenditures from the previous month, and it may include expenditures up to the date of the current drawdown. It is important to note that the timeliness of drawdowns is a critical factor when considering the recommendation for continued funding.
Weekly Dashboard Reports
Beginning November 20th, Weekly Dashboard Reports completed by the SES are due to Leads. These reports use a green, yellow, or red color code to provide a quick assessment of each school's progress on assurances, non-negotiables, and SIG indicators. Green indicates that expected progress is being made; no comments are needed for these indicators. Yellow indicates a concern or barrier within the school or district that, if not addressed, will lead to red. Finally, red indicates that a barrier exists or that the school and/or district is in non-compliance. Yellow and red ratings require brief statements explaining why the school or district is not implementing, or is in non-compliance with, each indicator noted.
Leads compile the form for their assigned region and send the report to the SIG Program Specialists with a copy to the Area Program Managers. The Area Program Managers will inform the Division Director of all red and yellow concerns from the Weekly Dashboard Report.
Cross-Functional Monitoring
Cross-Functional Monitoring this year will include the APS, Bibb, Dougherty, Gwinnett, Muscogee, Quitman, and Twiggs districts. Cross-Functional Monitoring will include fiscal monitoring for 1003(g) SIG. A monitoring schedule for the SIG districts will be provided in the January Bulletin.
LEA Monitoring of SIG Schools
As noted last month, this year LEAs with SIG Cohort 3 and/or Cohort 4 schools will be responsible for submitting three (3) LEA Monitoring Reports in QCIS/Indistar for each of their SIG schools. The format and content of the monitoring report allows the SIG Coordinator, in collaboration with key leaders at the district level, to assess the level of progress of the LEA/school in implementing the SIG indicators.
The electronic LEA Monitoring Report forms can be accessed from the District Dashboard and are to be completed and submitted within QCIS/Indistar. The first LEA Monitoring Report was to have been submitted by September 30th. The remaining monitoring reports are due January 30th and April 30th.
In the event that an indicator is either not progressing at an expected rate or not evident, an interim or "follow-up" LEA monitoring of those indicators is required and must be submitted in QCIS/Indistar using the appropriate LEA Monitoring Report "follow-up" form. Schools or districts required to complete interim LEA monitoring have additional due dates of November 30th and March 31st, as applicable.
If all indicators are either progressing at an expected rate or fully implemented, completion and submission of the “follow-up” Monitoring Report form is not required.
2015-2016 Reward Incentive Plan
The 2015-2016 Reward Incentive Plans are being reviewed for approval. Please remember that schools should document in QCIS/Indistar both the process by which the 2015-2016 Reward Incentive Plan was determined and the notification of the GaDOE-approved Plan to school staff. This documentation should be reviewed during the second and third SEA and LEA monitoring.
Critical Dates for 1003(g) SIG Schools
November 20th Monthly drawdown by LEAs due for all SIG expenses (including salaries & benefits)
November 20th Weekly Dashboard due to Lead Effectiveness Specialists, weekly thereafter
November 30th If applicable, Interim LEA Monitoring Report submission due in QCIS/Indistar
December 1st All FY 2016 funds for supplies and equipment should be expended
December 5th Required Monthly Reports Due (Teacher and Student Attendance, Discipline)
December 10th SIG 1003(g) Webinar at 10:00. Registration sent through email on November 23rd.
December 14th SIG PL at 10:00 at Decatur Courtyard for SDE staff working with SIG 1003(g) Schools
(more details are forthcoming)
December 20th Monthly drawdown by LEAs due for all SIG expenses (including salaries & benefits)
December 30th 30% of FY 2016 funds expended
January 5th Required Monthly Reports Due (Teacher and Student Attendance, Discipline)
January 8th All SIG 1003(g) Indicators planned
January 20th Monthly drawdown by LEAs due for all SIG expenses (including salaries & benefits)
January 30th Second LEA Monitoring Report submission due in QCIS/Indistar
January TBA SIG 1003(g) Webinar
February 4th & 5th Sustainability Training for Cohort 3
February 5th Required Monthly Reports Due (Teacher and Student Attendance, Discipline)
February 20th Monthly drawdown by LEAs due for all SIG expenses (including salaries & benefits)
February 24th ILA (North Region), Decatur Courtyard
February 25th ILA (Metro Region), Decatur Courtyard; ILA (South Region), UGA Tifton
February 29th SEA Monitoring Report due in QCIS/Indistar
From the Literature
Eliminating the Blame Game
Kristen Swanson, Gayle Allen and Rob Mancabelli
When we use data to dig into problems—not judge colleagues—solutions often appear.
In many schools today, the mere word data puts people on edge. We assume the conversation will veer toward student test scores and fear we'll be judged by them. We expect statements about teacher evaluation analytics. We assume the worst.
Often, when educators think about data, we find ourselves feeling overwhelmed and disempowered. Certain phrases—and the emotions accompanying them—come to mind: invasive, evaluative, narrow, difficult to gather, hard to analyze, disconnected from the day-to-day.
What if the process of gathering and analyzing data were not only simpler, but more objective? What if our findings let us respond to a problem's root cause rather than a related symptom? And what if educators approached data in a way that helped us solve systemic problems rather than blame individuals?
A Very Human Error
When used systemically, data can make a push for change seem less personal. The issue isn't about you or me; it's about the goal we're trying to achieve.
Although this strategy sounds simple, it can be tricky to execute. Humans instinctively judge other humans; it's a survival trait. It takes careful planning and intention to avoid the fundamental attribution error.1 This concept, uncovered through psychology experiments in the 1960s, refers to the human tendency to fault people, not systems. Essentially, we're hardwired to overemphasize people's internal characteristics and minimize the impact of the system or situation at hand.
Consider this scenario: You're a (nearly) flawless driver. You always come to a full stop at stop signs, obey the speed limit, and maintain a safe following distance from other cars. You try hard to be considerate of others, realizing how difficult it can be to drive safely and patiently all the time. However, when someone cuts in front of you or blows through a red light, your immediate instinct is to think, "What an inconsiderate person!"
Are you correct? The data you have access to in this scenario provides a limited picture. For all you know, this driver's car is malfunctioning or the driver is experiencing a medical emergency. But it's unlikely that your first instinct would be to consider these possibilities. You'd assume the person is the problem.
Educators and school leaders can fall into the same trap when examining data. In an effort to act swiftly and decisively, we focus on what people are doing wrong. However, awareness is half the battle: Once you know your bias exists, it's easy to reframe your data-based explorations to focus on systems.
Fixing a Broken BYOD
In 2014, Mario and Kara, two leaders in a midwestern U.S. district of 12,000 students, faced a common instructional technology issue. Their bring your own device (BYOD) program was sputtering. Teachers were growing frustrated with students because they weren't bringing in the digital devices needed to engage in personalized learning—which discouraged teachers from trying the new personalized learning strategies the district encouraged. Mario and Kara partnered with our research organization in an effort to get to the bottom of their problem.
Mario and Kara had done their homework. They knew that the growing accessibility and decreasing cost of mobile devices gave most students access to devices at home. However, instead of blaming students (or teachers) for the fact that these devices weren't making it to school, Mario and Kara decided to undertake a close examination of the system operating around BYOD.
They studied how students and teachers were using instructional technology, looking at four main areas: the classroom, home and school access to technology, skill levels with technology, and the environment in the school district. Teachers, students, and school leaders all participated in the study, creating a holistic picture of the systems operating at each school. Hundreds of data points were collected, analyzed, and presented visually. Data came in many forms, including demographic data and information collected through questionnaires. These leaders took advantage of the fact that new technologies make data visualization and analysis easier than ever, making it possible for data to truly tell a story.
As they pored over infographics and visualizations, Mario and Kara discovered something unexpected: Although 95 percent of their students reported access to a digital device at home, almost half of those students were sharing those devices with at least three other family members. In short, those devices weren't always available for students to use at school.
The mounting frustration on the part of teachers and students was a classic case of the fundamental attribution error. The students weren't forgetful, as teachers assumed. And the teachers weren't resistant to personalized learning pedagogies. The devices required for the initiative just weren't available for school use within the parameters of the system.
Mario and Kara went back to the drawing board. They reduced the scope of the BYOD program and reallocated funds from some slated software purchases to bring additional netbook carts into many classrooms. As the system dynamics shifted, so did teacher and student behavior. Teachers began trying new personalized learning strategies. Morale improved.
Getting to the Bottom of a Backlog
As the new system nudged teachers to use technology in their classrooms more often, the number of tech support requests across the district multiplied. In less than two months, the number of requests assigned to each district technician doubled. Responsiveness plummeted, and grumbles surfaced across the district. Many of these complaints were directed at the technicians themselves, eroding a long-held trust between teachers and support staff.
Once again, Mario and Kara turned to data to fix the problem. As they started asking tough questions, they were careful to remember the ways bias or blame can creep into conversations about data. They knew that the district technicians were working as hard as they could, so they turned their attention to the systems causing so many requests.
A look at the data showed that 71 percent of school staff felt that the technology support across the district was inadequate and that requests took a long time to resolve. However, more than 50 percent of staff in the district felt "very confident" troubleshooting and fixing their own technology problems. Here was a mystery: If so many people felt comfortable working through technology glitches, why were the requests multiplying?
A brief meeting with the technicians revealed that all the devices, including those recently added to support the BYOD program, were locked down—meaning teachers didn't have permission to install updates or fix small problems on their computers or their students' computers.
Aha! The fundamental attribution error was at work again. The technicians weren't ineffective or slow. The existing system simply had created a bottleneck of requests for things that teachers and students would have fixed themselves if the system permitted. The team agreed to unlock the permissions on all district machines and provide short sessions after school to educate willing students and teachers about the most common issues. Within four weeks, the number of requests—and technician response times—had returned to normal.
Again, these small nudges across the system gradually brought about the change and the culture that everyone had wanted from the start. Thaler and Sunstein, professors at the University of Chicago and Harvard Law School, define nudges as gentle influences on the options and choices at our immediate disposal.2 When using nudges, we don't seek to directly change a person's choices. Instead, we strive to make it easier for that person to choose the desired option, or we increase their efficacy. Empowerment through small actions yields big results.
Facing Down Cyberbullying
The final problem Mario and Kara faced with the BYOD program would require much more than a nudge. With so many students logging on and connecting each day, cyberbullying started to increase. Behavior referrals, especially at the middle school level, were almost exclusively for misbehavior on digital devices and inappropriate sharing in online spaces. As these issues permeated the classroom, teachers became less patient with students. The infamous sayings about "kids these days" rang through the hallways.
It should be no surprise that Mario and Kara again probed the data. A review of their system analysis revealed that most students were receiving only 3–5 hours of digital citizenship instruction per year. In addition, more than 70 percent of instructional staff didn't believe it was their responsibility to teach students about digital citizenship.
Realizing that students needed more education, not more punishment, Mario and Kara faced this issue head on. They used freely available resources to put together a digital citizenship tool kit for teachers and ran sessions at faculty meetings to help teachers understand that promoting good digital behavior was everyone's responsibility. The campaign culminated with a parent night at which students anonymously shared their biggest digital blunders and asked for advice and help.
By the campaign's end, behavioral referrals had decreased and almost every staff member in the schools stated that it was their responsibility to help students become good digital citizens.
The Power of Systemic Change
Mario and Kara's ability to use data as a vehicle for creating small changes to the system made all the difference. Their solutions weren't fancy or expensive. In fact, they pursued many of their most innovative solutions because funding wasn't available. Over the past three years, we've studied thousands of U.S. school districts, and stories like Mario and Kara's rise to the top. Leaders who collect data, target the environment, and execute small changes successfully see healthy gains.
If we want our districts to rise to the top, we must help every member of our organization defeat his or her instinct toward the fundamental attribution error.
Here are three strategies that can help educators avoid this error and find pathways to success.
Raise Awareness
Combating biases around data begins with awareness. When we simply make colleagues aware of these underlying human tendencies, they become more likely to catch themselves engaging in ineffective, judgmental behavior.
Creating an active conversation about using data to change systems can have a tremendous impact on a school or district's process of improvement. After teaching her leadership team about the need to guard against bias, Kara noted, "It was amazing to hear my team pull themselves out of the 'blame game' and really start thinking about change. All I had to do was name it for them."
Tip: Teach your colleagues about the fundamental attribution error. Remind team members to focus on the system before engaging in a data dig.
Search for Root Causes
Sometimes our sense of urgency and predisposition toward action lead to solutions that don't actually acknowledge the root cause of problematic data results. By digging deeper into each issue, Kara and Mario were able to uncover and target systemic inefficiencies. One strategy Mario used was to keep asking why? about any new thing their exploration revealed. Although he joked that this was the same strategy his toddler employed, Mario found that it helped him get to the heart of the matter.
Tip: When trying to determine the root cause of a problem, ask why? at least five times.
Maintain a Formative Outlook
Celebrating improvement and growth, not just success, helps everyone maintain momentum. Even the words we use to talk about our data can signal our mindsets. For example, words such as yet, beginning, or emerging can help people identify growth opportunities instead of deficits. Mario and Kara never saw their work as complete. They viewed it as an ongoing journey toward excellence.
Tip: Use a phrase like not there yet, which suggests an expectation of progress, instead of a phrase like unsatisfactory, which indicates a status of failure.
No matter how hard we try, educators can't eliminate every bias we harbor. But when we shed light on our biases, we can use data to tackle system-level issues with success. When we honor people's work and assume positive intent, innovative solutions follow.
Author's note: All names are pseudonyms.
Endnotes
1 Malone, P. S., & Gilbert, D. T. (1995). The correspondence bias. Psychological Bulletin, 117(1), 21–38.
2 Thaler, R. H., & Sunstein, C. R. (2009). Nudge. New York: Penguin Books.
Upcoming Meetings & Events
Instructional Technology Webinar
Friday, Dec 4, 2015, 09:00 AM
SIG Team Meeting
Monday, Dec 14, 2015, 10:00 AM
Decatur Marriott Courtyard
Area Administrative Team Meetings
Monday, Dec 14, 2015, 02:00 PM
Decatur Marriott Courtyard
School & District Effectiveness Professional Learning
Tuesday, Dec 15, 2015, 08:30 AM
Decatur Marriott Courtyard
Your GaDOE SDE State Leadership Team
North Area
Area Program Manager- Phillip Luck
Area Program Assessment Specialist- Wendell Christian
Northwest Region:
District Effectiveness Specialist- Terri Gaspierik
Lead School Effectiveness Specialist- Amy Alderman
Northeast Region:
District Effectiveness Specialist- Susan White
Lead School Effectiveness Specialist- Kali Raju
Metro Area
Area Program Manager- Sam Taylor
Area Program Assessment Specialist- Mike O'Neal
Metro West Region:
District Effectiveness Specialist- Diana Forbes
Lead School Effectiveness Specialist- Lyn Wenzel
Metro East Region:
District Effectiveness Specialist- Iris Moran
Lead School Effectiveness Specialist- Paula Herrema
South Area
Area Program Manager- Patty Rooks
Area Program Assessment Specialist- Keith Barnett
Southwest Region:
District Effectiveness Specialist- Deborah McLendon
Lead School Effectiveness Specialist- Steve Olive
Southeast Region:
District Effectiveness Specialist- Darrel May
Lead School Effectiveness Specialist- Paula Cleckler
Atlanta Support Office
Program Manager- Joann Hooper
Director- Will Rumbaugh