
Science instruction in Hanover County, as throughout Virginia, is assessed against the Virginia Standards of Learning (SOL) and their associated end-of-course (EOC) tests, which are standardized, computer-based, end-of-year curriculum assessments. Throughout each year at Lee-Davis, we hold periodic content meetings, department meetings, and individual pre- and post-testing-window meetings to review SOL student data. These meetings matter because schools receive funding based on student SOL performance in every subject, not just science. Preparation for these tests is therefore a yearlong endeavor, and one in which I take pride in my students' success.


Since our state standards are written from the language and content tested on each SOL, it is paramount to match lesson and unit objectives with assessment methods that minimize sources of bias and prepare students to perform their best on the earth science EOC SOL. To minimize bias on my unit assessments, I review released SOL tests, associate each question with a particular SOL-based objective, compile the most relevant and common questions, and reformat them into a paper test. The ultimate goal of constructing and administering these tests is to gather student data, identify each student's learning needs, and develop differentiated instruction before the SOL if other instructional strategies have not yet proven effective.


Throughout the year I balance formative and summative assessments across units to holistically assess my students' learning and their progress toward accomplishing every learning objective. Each SOL-based summative assessment helps me understand and communicate where a student's content gaps lie as I evaluate and report learner progress against the standards. This allows me to plan remediation strategies at the end of the year, before our two-week SOL review period.

As each unit's assessment data is gathered and analyzed throughout the year, I prioritize the subsections of content in which each student most needs remediation and practice. Utilizing this year-long data, each student is prescribed a review in the areas where they require the most remediation, delivered through online practice tests and targeted content-based teaching built from previously released SOL questions. If instructional strategies such as inquiry-based laboratory investigations, direct instruction, or collaborative learning activities have not allowed a student to grasp the content, I individualize their SOL review by assigning a JLab SOL practice review packet, which instructs students on how to track their own progress as they review. During our end-of-year SOL review period, I model and structure this process to guide my students in examining their own thinking and learning as they prepare for the EOC SOL. Utilizing this online resource is one more example of how I continually seek appropriate ways to employ technology to support assessment practice, both to engage learners more fully and to assess and address learner needs.

The most beneficial aspect of creating and administering summative assessments of this quality and composition throughout the year is that this practice allows me to identify each student's learning needs as they relate to content-based statewide standardized testing. In the three weeks prior to the SOL, I rigorously prepare my students for the state standardized test, prescribing each student a review targeted to the areas their year-long data identifies. As the data below shows, this practice has proven highly effective as a final effort in preparing students for success on their EOC SOL test.

Above: Minerals and Rocks summative assessment. The cumulative summative assessment I created for the minerals and rocks unit contains questions that are identical or very similar to questions found on previously released SOLs. When I create a variation of a released SOL question, I ensure that it assesses the same skill or content knowledge as the state standard-based learning objective it was drawn from, as a way to minimize sources of bias. For example, questions 26 and 35 on the test above both mimic question 14 from the 2000 SOL (below, middle), which was released in 2007. Each of these questions assesses the student's ability to "understand the rock cycle as it relates to the origin and transformation of rock types," as stated in the Virginia Department of Education (2010) Earth Science Standard ES.5.

[Images: Example SOL question screenshots (edited)]

[Screenshot of 2010 Earth Science SOL]. Retrieved from https://www.solpass.org/released_sol_tests/EarthScienceSOL2015.pdf?section=study-0.

[Screenshot of 2007 Earth Science SOL]. Retrieved from https://www.solpass.org/released_sol_tests/EarthScienceSOL2007.pdf?section=study-0.

[Screenshot of 2000 Earth Science SOL]. Retrieved from https://www.solpass.org/released_sol_tests/EarthScienceSOL2000.pdf?section=study-0.

[Image: Sample SOL question screenshot (edited)]

[Screenshot of 2005 Earth Science SOL]. Retrieved from https://www.solpass.org/released_sol_tests/EarthScienceSOL2005.pdf?section=study-0.

Above: Jefferson Lab (JLab) Earth Science SOL test review instructions packet. During the final two weeks of review preceding the EOC earth science SOL, students receive this packet and use it to track their progress (last page) on an online test compiled from released SOL questions. In class, I model and structure this process to guide my students as they complete a mock SOL test, identifying their weakest areas to determine which strands and questions they should spend the most time practicing. Repeating this process several times familiarizes students with the styles and types of questions they will see on their SOL, remediates remaining gaps in content, and lets them examine their year-long progress, thinking, and learning as it will be measured by the test. Employing this technology-based resource shows that I continually seek appropriate ways to use technology to support assessment practice, both to engage learners more fully and to address learner needs.

[Image: SOL remediation data]

Above: Student grade report, first-semester tests. Because I track my students' progress through various formative and summative assessments as appropriate, by the end of the year I can anticipate where each student will need the most remediation and differentiated reteaching within the content. For the student above, I assigned the Earth & Space Systems (Unit 3 Test) and Scientific Investigations & the Nature of Science (Unit 1 Test) strands on the JLab practice test site. When he took his mock SOL online test, his results matched what I had determined from his year-long assessment data, as shown below.

[Images: JLab mock-SOL score; 2019 SOL first-attempt student scores]

The data table above is a snapshot of my effectiveness as a teacher in preparing students to pass their SOL on the first attempt. The students included are those from my 2018-2019 earth science classes who needed a passing score of 400 or above on their earth science SOL as a graduation requirement. First-attempt scores are highlighted in red (failing), yellow (within passing range), and green (clearly passing). These results could only have been accomplished through careful planning, year-long tracking of student data, and a balance of formative and summative assessment as appropriate to support, verify, and document learning.
