>> Good afternoon.
Thank you for joining us and welcome to the webinar.
This is a part of a series on progress monitoring for students
in special education with various needs.
My name is Dennis Cullen.
I'm one of the educational consultants with PaTTAN King of Prussia office.
And joining me today is Allen Muir who is a consultant with the PaTTAN Harrisburg office.
If you have attended any of our webinars
or any of our events before you are probably familiar with PaTTAN's mission statement.
The mission of the Pennsylvania Training and Technical Assistance Network,
or PaTTAN, is to support the efforts and initiatives of the Bureau of Special Education,
and to build the capacity of local education agencies to serve students
who receive special education services.
And here we see Pennsylvania Department of Education's commitment
to the Least Restrictive Environment.
Our goal for each child is to ensure Individualized Education Program
or IEP teams begin with the general education setting with the use of supplementary aids
and services before considering a more restrictive environment.
The objectives of today's webinar are listed here.
First participants will be able to select and use appropriate progress monitoring tools
to measure students' progress towards mathematics goals written into IEPs.
And participants will be able to use the progress monitoring tools
to document students' progress and use the data to make instructional adjustments if necessary.
So, with that we want to begin with the end in mind.
When we're talking about our goals for students we strive to ensure
that each student is proficient in core subjects, that they graduate from high school
and are ready for post-secondary education and/or careers.
We also want to make sure that they have access to equitable opportunities
for education regardless of their background, condition or circumstances.
So, in general we're saying that we want to have high expectations for all of our students.
So, let's talk about mathematics proficiency.
When students understand mathematics they're able to use their knowledge flexibly.
They combine factual knowledge, procedural fluency
and conceptual understanding in powerful ways.
So, on your screen here you see a mathematics rope if you will
and all of the strands put together help to develop mathematics proficiency.
So, we have conceptual understanding, and that refers to the integrated and functional grasp
of mathematical ideas, which enables students to learn new ideas
by connecting those ideas to what they already know.
A few of the benefits of building conceptual understanding are
that it supports retention and it prevents common errors.
We have procedural fluency and that's defined as the skill in carrying out procedures flexibly,
accurately, efficiently and appropriately.
Strategic competence is the ability to formulate,
represent and solve mathematical problems.
Adaptive reasoning is the capacity for logical thought,
reflection, explanation and justification.
Productive disposition is the inclination to see mathematics as sensible,
useful and worthwhile, coupled with a belief in diligence and one's own efficacy.
Education should equip all students with mathematical skills and provide them
with the flexibility, adaptability and creativity to function as productive citizens
in the changing technological society of the 21st century.
Mathematical skills must extend beyond the ability to calculate into the use of mathematics
to investigate, analyze and interpret.
Thinking mathematically is critical to every life skill
from balancing a checkbook to understanding the newspaper.
People use math skills daily to identify problems, look for information
that will help solve the problems, consider a variety of solutions
and communicate the best solution to others.
A math classroom should provide practical experiences in mathematical skills
that are a bridge to the real world as well as explorations, which develop an appreciation
of the beauty and value of mathematics.
Using a variety of tools such as calculators, computers and hands on materials
under the guidance of a skillful teacher creates a rich mathematical learning environment.
Such an environment will help to prepare students for a world where using calculators
and computers to carry out mathematical procedures is commonplace,
a world where mathematics is rapidly growing and extensively being applied in diverse fields.
Pennsylvania should expect its students to enjoy, appreciate and use mathematics just
as it expects them to enjoy, appreciate and use music, art and literature.
Students who are challenged to reach these goals will be better prepared for a future
in which mathematics will be increasingly important in all areas of endeavor.
Research has solidly established the importance of conceptual understanding.
Let's take a look for a moment at what we mean
when we're talking about conceptual understanding.
Students demonstrate conceptual understanding when they can recognize, label
and generate examples and non-examples of concepts,
can use and interrelate models, diagrams, manipulatives and so on, know and apply facts
and definitions, compare, contrast and integrate concepts and principles, recognize,
interpret and apply signs, symbols and terms, and interpret assumptions
and relationships in mathematical settings.
So, this goes beyond just the procedural knowledge.
This is the understanding of why things work the way they work.
One of the best examples that I can think of is in algebra many
of us were taught the FOIL method when multiplying binomials.
And we just knew that FOIL, first, outer, inner, last was the way to do it.
But, there was no conceptual understanding of why that worked, at least from my perspective.
Procedural knowledge: students demonstrate procedural knowledge in mathematics
when they select and apply appropriate procedures,
verify or justify a procedure using concrete models or symbolic methods,
extend or modify the procedures to deal with factors in problem settings,
use numerical algorithms, read and reproduce graphs and tables,
execute geometric constructions and perform non-computational skills
such as rounding and ordering.
We're putting this in here to refer you back to our common core standards,
or our Pennsylvania core standards.
This is an older slide with the older terminology, but these are the core standards.
And if you just take a look, we have numbers and operations,
algebraic concepts, geometry, measurement, and data and probability, and how all of those things
go across the grade spans, with differences in each of the grade spans.
And here we have Pennsylvania core standards in mathematics.
This is just a sample.
All of these can be found in the Pennsylvania core standards
on the web at the SAS portal.
The new standards for mathematical practice were based on the National Council of Teachers
of Mathematics process standards
and the National Research Council's strands of mathematical proficiency.
The strands for mathematical practice are vital with respect
to students developing a powerful set of core mathematical competencies.
These practices do not stand alone and are not intended
to be taught as standalone lessons.
But, they are an integral part of learning and doing mathematics and need to be taught
with the same intention and attention as the mathematical content.
They are in fact, the habits of mind required for understanding mathematics.
And so here we see what the core standards for mathematical practice are.
Students should be able to make sense of problems and persevere in solving them,
reason abstractly and quantitatively, construct viable arguments and critique the reasoning
of others, model with mathematics, use appropriate tools strategically,
attend to precision, look for and make use of structure,
look for and express regularity in repeated reasoning.
And again, this is not something that you're going to teach separately,
but all of this gets embedded into the mathematical teaching.
So, with that being said the reality is that some students struggle to learn mathematics
and some students require an IEP to support their learning.
Often with mathematics, and unlike reading, folks will justify their difficulty
by saying, oh, I'm just not a math person, or, I was never a math person.
My father wasn't a math person.
I just don't get it.
But, we would never say that about reading.
We would never say oh, I'm just not a reader.
So, it's a change of mindset for many folks that we want to embed in them that math is
for everybody and that everyone can learn math.
The reality however is that some students will require additional support beyond what we're
doing in our everyday instruction.
So, now let's shift our focus and talk about progress monitoring, okay.
According to IDEA and state rules progress monitoring procedures must be established
for each goal.
Progress monitoring is the method of formative assessment used
to measure students' progress towards meeting a goal.
The main purpose of progress monitoring is to describe the rate of response to instruction
and to build more effective programs for students.
Now remember, when we talk about formative assessment, the only way
that an assessment is formative is if it changes what we do
based on the results that we get.
So, if we see that a student is progressing and making gains at a rate
that suggests they will meet their goal, then we know to leave things as they are
or maybe even push a little bit to get the student beyond the goal.
But, if we're seeing that the student is not making progress through our use
of progress monitoring tools then we would use that information to make some changes.
Progress monitoring procedures guide how data will be collected in order
to make instructional decisions about the progress of the student
and establish a decision making plan for examining the data collected.
Progress monitoring assists the teachers or service providers
in making ongoing instructional decisions about the strategies being used.
And it also provides summative evidence that enables the IEP team
to determine whether the student has achieved his or her goal.
So remember, when we talk about summative evidence, it's what we collect at the end.
As a result of all of this, has the student met our goal?
Progress monitoring answers these questions.
Is the student making progress at an acceptable rate?
Is the student meeting short-term and long-term IEP goals?
And does the instruction or intervention need to be adjusted or changed as a result
of the progress that the student is making?
Progress monitoring links the pieces of the IEP.
It links the present levels and it links the measurable annual goal.
So, the present levels tell you where the student currently is.
The measurable annual goal tells you where you want the student
to be as a result of your instruction.
And the progress monitoring is the checks along the way to see
that the student is making progress towards that goal and what adjustments need to be made
in order to help the student to reach the goal.
In order to monitor the progress of a student toward their IEP goals the goals must be written
in a way that they can be measured.
The goals, aligned to the standards and based upon the student's identified needs
as stated in the present levels, must be written
so that the student will perform some behavior that can be measured.
For example, correctly answer four out of five problems.
If you participated in any of the previous webinars,
you know there's a seven-step progress monitoring approach.
It starts with having the measurable annual goals and making data collection decisions.
So, at what point are we going to stop and take a look at the data and make decisions?
What's going to be our rule around the data, around changing instruction if necessary?
What data collection tools will be used?
How are we going to represent the data, the evaluation of the data,
instructional adjustments, and the communication of progress?
The link that is here on the slide can take you to the handout section of the webinar
that was done on October 29th, Progress Monitoring for Students with IEPs: An Introduction.
So, that was the first one in the series.
The recorded webinar is not up yet, but it will be up shortly,
and I do know that the handouts for that one are currently available.
Effective progress monitoring includes all of these things: it measures the behaviors outlined
in the goal and uses an equivalent measure each time.
So, we're doing the same thing each time.
We're measuring the same thing each time. It provides regular and frequent data collection,
is easy to implement, takes a short amount of time away from instruction and allows
for analysis of performance over time.
As for the regulatory requirements of the IEP and the connections to the IEP,
we must state how the child's progress towards meeting the standards-aligned goal will
be measured.
We need to report when periodic reports
on progress will be provided to the parents.
The requirement in special education is that the student make progress
in the general education curriculum.
We look to align our IEP goals with the same content that the students
without disabilities have access to.
When the IEP goals are standards-anchored and aligned, the monitoring of progress is direct and purposeful
and focuses on progress in the general education curriculum.
So, the purpose of progress monitoring in a standards aligned system is
to determine the progress in the general education curriculum.
This means we are using mastery of subject content as defined
by the Pennsylvania Core Standards.
This is probably very different from what most have been using for progress monitoring in IEPs.
However, many of our students with IEPs have skill deficits in areas
that are not aligned to state standards.
So, we're going to be talking about different kinds
of measures in monitoring student progress.
One item or type of measure will not always be able to measure the entire goal.
So, the IEP team should be looking at summative, formative, diagnostic and benchmark data
as multiple ways to measure progress and report to parents.
More information about the four types of assessment can be found
on the SAS portal at the PDE website.
Also, we won't have time to review the development of standards-aligned IEPs,
but a webinar was recorded on the topic about a year ago and archived
on the PaTTAN website, and we encourage you to go back and view it.
Okay, research indicates that when we set appropriate goals and monitor progress
that we are likely to get large improvements in student outcomes
and that's what we're looking for from our students.
Alright, and I am going to at this point transfer the presentation over to Allen Muir
in Harrisburg and he's going to take us through approaches
to progress monitoring for-- specific to mathematics.
Mastery measures and general outcome measures are two common progress monitoring approaches.
Each approach is based on a fundamental set of assumptions with advantages and disadvantages
that need to be understood to make valid decisions about student progress.
A classic article comparing and contrasting these two approaches was published in 1991
by noted progress monitoring researchers Lynn Fuchs and Stanley Deno.
Mastery measurement in theory measures a skill in a validated instructional sequence repeatedly
and frequently until the student demonstrates mastery.
When a student demonstrates mastery, he or she moves to the next skill
in the instructional sequence, and this process is repeated.
General outcome measurement measures the essential big ideas repeatedly and frequently
to show progress in math over time.
For both approaches to progress monitoring all information is obtained
and all judgments are made with respect to the PA common core standards
for both starting and ending points.
One key difference between mastery measures
and general outcome measures is the ability to look at data across time.
With general outcome measures you can compare the score a student received in May
to a score he or she had in September.
This cannot be done with mastery measures,
because each sub-skill is tracked separately.
These sub skills do not necessarily correlate well with overall achievement.
When using mastery measurement knowledge about the sequence of skills is important,
because the test will be developed for each skill that provides information
on the performance for a student with respect to that specific skill.
So, when we're using mastery measurement we will identify the sequence of skills and then
for each one of the skills there'll be a test that is developed.
Let me give you an example.
Here's a sample goal dealing with third grade operations involving multi-digit arithmetic,
adding, subtracting, multiplying, dividing.
You'll notice that the standard, the core standard
and some corresponding eligible content is listed on the slide, but there's no requirement
for either to be included in the goal.
We just put these on the slide so that you can make the connection
to the general ED curriculum.
So, this is our measurable annual goal.
Now, this is a sequence of skills for instruction,
and these are examples for illustrative purposes only.
You'll notice that in this list of skills you have multi-digit addition with regrouping,
then multi-digit subtraction with regrouping.
Then the third skill would be multiplication facts, factors to nine and so on.
So, in this particular sequence we have 10 skills.
On our first assessment we will work with the students around multi-digit addition
with regrouping and then we'll give them an assessment on that.
This is an example of what the assessment could look like.
The sequence of skills for instruction has been created in this sample
for this first skill multi-digit addition with regrouping.
Notice there are 10 problems all asking the student to perform the same skill.
This is a sample graph that can be used to see a child's progress.
The number of problems correct is the dependent variable.
So, that's what we have over on the left.
The number of weeks is the independent variable, okay.
That's what's going along at the bottom.
The number of problems correct is recorded each time the test is given.
So, you can see over on the left we have a column called multi-digit addition.
And you can see that the test has been given numerous times.
Since the identified goal stated three consecutive trials of eight
or more problems correct, instruction and testing continue until the goal
for that particular skill is met.
The horizontal line at eight on the number-of-problems-correct axis is the mastery level.
And the vertical line between weeks four and five indicates that the next skill can be tested.
So, you can see at that horizontal line at eight that the student has correctly answered
eight or more problems on three consecutive trials.
Once that happens, we go back to our sequence of skills
and we start instruction on the next skill,
which in this case is multi-digit subtraction with regrouping.
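As a minimal sketch, the advancement rule just described (move to the next skill once the student scores eight or more correct on three consecutive trials) can be coded as a simple check. The threshold and trial count are the values from this sample goal, and the weekly scores below are hypothetical.

```python
def mastery_met(scores, threshold=8, consecutive=3):
    """Return True once the student scores at or above `threshold`
    on `consecutive` trials in a row."""
    streak = 0
    for score in scores:
        streak = streak + 1 if score >= threshold else 0
        if streak >= consecutive:
            return True
    return False

# Hypothetical weekly multi-digit addition scores (problems correct out of 10)
weekly_scores = [4, 5, 7, 8, 9, 8]
print(mastery_met(weekly_scores))  # True: the last three trials all reach 8
```

A team could, of course, choose a different mastery level; as noted later in this webinar, that is a team-based decision.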
Here's a sample multi-digit subtraction mastery test.
Again, notice there are 10 problems all asking the student to perform the same skill.
Numerous forms for this particular mastery test are created with the same type
and level of difficulty for the questions.
Here's the graph showing the progress for the multi-digit subtraction with regrouping.
Notice instruction and testing continues until the mastery level based on our goal,
eight problems correct over three consecutive trials is achieved.
The mastery level was a team based decision.
Also notice the amount of time required for this particular skill.
There is more time that's involved with multi-digit subtraction
than there was with multi-digit addition.
Instructional decisions were being made during that time
and we'll talk about some of those in a minute.
If you use mastery measures over a long period of time you'll be tracking different skills.
You cannot compare the scores for multi-digit subtraction to the scores
for multi-digit addition to see if a student's getting better
in overall mathematics across time.
Recall that mastery measurement utilizes a series of short-term skill tests
that changes as mastery is demonstrated.
Some consideration points for attempting to quantify progress
across skills: mastery measurement does not recognize maintenance of skills,
and mastery measurement objectives for skills are not equivalent units.
So, often or sometimes the big picture can be lost.
If you refer back to when Dennis was talking about what math looks
like in today's world you have concepts, you have procedures, you have word problems.
If we're just focused on a single skill sometimes we lose that big picture.
Other challenges include whether the sequencing of the skills is based on what we have
always done or on research into a true learning progression.
That's a big conversation.
Single skill measurement can be misleading.
Being able to successfully shoot a free throw
on your backyard hoop does not mean you can be successful during a game.
The primary advantage of mastery measurement is that it conveys important information
to teachers about the immediate impact of an intervention.
It answers the question: did the student learn what I'm teaching
in that specific timeframe, whether it's a day, a week or a quarter?
This is often not the best way to judge progress over the long haul.
For example, a student may pass an end of unit addition facts test and move
to the next unit on subtraction facts.
That's also tested to evaluate learning.
Once a student passes the addition facts test they may not systematically be evaluated
on that content again.
A student's inability to retain their addition facts might be missed.
Although when mastery measurement is done well the information provided can benefit both
students and teachers at that particular point in instruction,
a student's progress towards an expected general outcome is difficult to assess,
because our interest is in gauging progress towards standards rather
than just specific skills.
So, let's take a look at the other type of progress monitoring that we want to talk
about today, the general outcome measures.
Remember, general outcome measurement measures the essential big ideas repeatedly and frequently
to show progress in math over time.
These simple, efficient and short formative measures provide grade-level curriculum
[inaudible] assessment information that helps teachers plan better instruction.
They're comprised of tasks of about equal difficulty to be given throughout the year
so that growth toward a final goal may be measured.
They are sensitive to the improvement of students' achievement over time
and they're easily understood by teachers and parents.
One type of general outcome measure is known as curriculum-based measurement,
and it makes no assumption
about the instructional hierarchy for determining measurement.
It fits with any instructional approach.
Curriculum-based measurement incorporates automatic tests of retention and generalization.
As we mentioned before with general outcome measures curriculum-based measurement is used
to monitor student progress across the entire school year.
Students are given standardized probes at regular intervals whether that's weekly,
biweekly or monthly to produce accurate and meaningful results that teachers can use
to quantify short- and long-term student gains toward end-of-year goals.
With curriculum-based measurement teachers can establish long-term goals indicating the
level of proficiency that students will demonstrate by the end of the school year.
Curriculum-based measurement tests, also sometimes referred
to as probes, are relatively brief and easy to administer.
The probes assess the same skills at the same difficulty level.
Often the probes have been prepared by researchers or test developers
to represent curriculum skills and concepts and to be of equivalent difficulty
from test to test within each grade level.
Probes are scored for accuracy and sometimes speed and students' scores are graphed
for teachers to consider when making decisions about the instructional programs
and teaching methods for each student in the class.
Curriculum-based measurement provides a doable and technically strong approach
for quantifying student progress.
Using curriculum-based measurement, teachers can determine quickly whether an educational
intervention is helping a student.
Here are three reasons that teachers typically mention
when they are discussing using curriculum-based measures.
The first one: the academic health of a student can be viewed.
Remember, with mastery measurement only that particular skill is included in the snapshot.
With curriculum-based measures the snapshot provided shows a much bigger picture.
Not only can the big picture view be seen,
but the rate of academic development can be quantified as well.
The graph of the scores provides information about the rate at which the student is learning.
This type of progress monitoring in general can provide insight into the effectiveness
or ineffectiveness of a particular program or curriculum, which has an impact
on a student's work towards a goal.
So, we're going to look at some of these reasons as we go through and look at some examples.
But first, let's just make sure we understand some of the steps involved in conducting a CBM.
The skills are identified or the concepts or the types of problems
that will be involved in the year-long curriculum.
The weight of those skills or concepts or types of problems would also need to be determined,
which ones might have a little more importance than other ones.
Numerous alternate test forms are created.
Each test samples the entire year's curriculum.
Each test contains the same type of problems.
The tests are given on a scheduled basis.
Often the IEP team will be part of that decision making process
if this is for a student with an IEP.
Once the tests are given, the test data is graphed, the data is analyzed
and then instructional decisions are made as appropriate.
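The form-creation steps above can be sketched as a weighted sampler. The problem types and weights here are hypothetical, purely to illustrate how every alternate form samples the entire year's curriculum with the same weighted mix of skills.

```python
import random

# Hypothetical year-long curriculum: problem types and relative weights
# (the weights reflect which types carry a little more importance).
CURRICULUM = {
    "multi-digit addition": 3,
    "multi-digit subtraction": 3,
    "multiplication facts": 2,
    "division facts": 2,
}

def make_probe(n_items=25, seed=None):
    """Create one alternate test form: a weighted random sample of
    problem types drawn from the entire year's curriculum."""
    rng = random.Random(seed)
    return rng.choices(list(CURRICULUM),
                       weights=list(CURRICULUM.values()),
                       k=n_items)

# Each scheduled administration gets its own equivalent form.
week_1_form = make_probe(seed=1)
week_2_form = make_probe(seed=2)
```

In practice, published probes are built by test developers to guarantee equivalent difficulty from form to form; this sketch only mirrors the sampling idea.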
For computation and concepts-and-applications probes, teachers use curriculum-based measurement
probes for the student's current grade level.
However, if a student is well below grade-level expectations the teacher may need
to use lower grade probes, but always working towards the grade level goal
or goals on the IEP.
Here's a sample goal dealing with the sixth grade curriculum.
This is only one goal and the focus on this one is on computation.
Another goal might have a focus on problem solving or conceptual understanding.
As mentioned on the previous slide, while the standard
and some corresponding eligible content are listed, there's no requirement
for either to be included in the goal.
Again, in this sample it's shown just to make the connection to the general ed curriculum.
So, what we want Diane to do is to write 37 correct digits in two minutes and she's going
to be given 25 problems that represent the sixth grade curriculum.
Curriculum-based measurement makes no assumptions
about instructional hierarchy for determining measurement.
Remember, we mentioned that before, so it fits any instructional approach.
Curriculum-based measurement reflects all skills in a yearlong curriculum
with random placement of problem types.
By assessing all the objectives in the curriculum,
curriculum-based measurement is sensitive to growth as more skills are taught regardless
of the order in which they're taught.
Curriculum-based measurements also allow teachers to determine which students
retain taught skills and generalize to skills that they have not yet been taught.
Notice the test has all four operations and includes fractions.
The two problems show the features of retention and generalization.
Often the length of the computation test like this one would vary by grade.
Here's a graph showing the digits correct as the dependent variable
and time as the independent variable.
A correct digit is the right numeral in the right place.
Teachers are trained to score CBM
so that a consistent, reliable score can be acquired.
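One common digits-correct convention aligns the student's answer and the answer key from the rightmost (ones) place and counts matching digits. This is only a sketch of that idea; published CBM scoring guides differ on details such as omitted digits and remainders.

```python
def correct_digits(answer: str, key: str) -> int:
    """Count correct digits: the right numeral in the right place,
    aligning both numbers from the ones place."""
    pairs = zip(reversed(answer), reversed(key))
    return sum(1 for a, k in pairs if a == k)

# Key is 1317; a student who writes 1307 gets credit for the
# thousands, hundreds and ones digits but not the tens digit.
print(correct_digits("1307", "1317"))  # 3
```

Scoring digits rather than whole problems is what makes the measure sensitive to partial growth: a student can earn more digits correct week over week even before whole answers become correct.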
The graph by itself shows information, but the question that could be asked is:
is the student making adequate progress towards the goal?
An upward trend can be seen starting with the January scores,
but more information can be placed on the graph to create a clearer picture of the progress.
Before we look at that information let's review some of the benefits again
of curriculum-based measurement and look at another curriculum-based measurement
that is concept or problem solving based.
When compared to mastery measurement curriculum-based measurement avoids the need
for a sequence of skills and single skill tests.
Curriculum-based measurements assess understanding, maintenance and generalization,
and they are typically short in time, but they are effective measures.
Here's another sample of a curriculum-based measurement.
This sample assesses math concepts and applications.
And you can recall back to what Dennis said.
He mentioned that in math there are concepts.
There are applications.
There's problem solving.
So, while we're progress monitoring a student and we're looking at goals for that student,
some of the goals might deal with developing some of that conceptual understanding
or working on developing problem solving.
So, this particular CBM has information or will provide information
that contains important aspects to overall math literacy,
in this case concepts and applications.
Again, this is just a sample and this does not reflect the current PA core standards
for grade six.
This slide shows possible additional information
that could accompany a progress monitoring graph.
The boxes below indicate the type of questions and the degree of accuracy
for an individual question over a number of trials.
You'll see on the left hand column down below A1, S1, M1.
So for example, A1 could refer to adding.
S1 could refer to subtracting with regrouping.
S2 could refer to subtracting with regrouping and special case, maybe zero is being used.
M1 might be multiplying basic facts and so on.
So, this information provides information
about individual skills or concepts; the darker the box, the greater the level of mastery.
So, if I look at A1 and I go across, I see the first couple
of boxes have a little bit of shading.
That could mean that the student is trying these and starting to get it.
As you continue to go along you can see that the boxes then start to get a little bit darker.
Okay, there is a box that is almost completely filled in, but with a little white circle.
That would mean the student almost has it.
Then you have a couple of boxes there that are completely filled in, and then it goes back
to the white circle, and then a bit less.
Again, this provides more specific information about some of the individual skills or concepts.
Typically a suggested number of scores is required before making instructional decisions,
because we know time is needed to give the intervention an opportunity to work.
As we mentioned before, the graph by itself shows information.
But again, a question that could be asked is: is the student making adequate progress towards
the goal?
An upward trend can be seen starting with the January scores, but without an aim line
in the graph it's difficult to tell if the student is making enough progress.
We'll show an aim line in an upcoming example.
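As a side note on the mechanics, an aim line is just a straight line from a baseline score to the goal score over the monitoring period. Here is a minimal Python sketch; the function name and all numbers are hypothetical, not from the webinar materials:

```python
def aim_line(baseline_score, goal_score, num_weeks):
    """Return the expected score for each week along a straight aim line
    drawn from the baseline score to the end-of-period goal."""
    slope = (goal_score - baseline_score) / num_weeks  # expected growth per week
    return [baseline_score + slope * week for week in range(num_weeks + 1)]

# Hypothetical example: baseline of 10 correct digits, goal of 30, over 20 weeks.
expected = aim_line(10, 30, 20)
print(expected[0], expected[10], expected[20])  # 10.0 20.0 30.0
```

Plotting these expected scores alongside the actual probe scores makes it easy to see whether the student's trend is keeping pace with the aim line.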
When we're working with curriculum-based measurement, one of the decisions we have to make is which task is developmentally appropriate for each student to be monitored over the academic year.
For students who are developing at a typical rate in mathematics, the suggested curriculum-based measurement tasks are shown on this slide.
For kindergarten and first grade students the following probes will be administered alone
or in combination with one another.
Quantity array asks students to identify the number of items in a box.
Number identification asks students to identify numeric characters.
Quantity discrimination asks students to identify the bigger number in a pair of numbers.
Missing number asks students to identify the missing number in a sequence of four numbers.
According to Lembke and Foegen in 2005, the quantity array, number identification, quantity discrimination and missing number tasks had limited data related to their technical adequacy at that time.
In other words, how much information do these provide about possible future problems for students who struggle in math?
Early data indicate that the measures show promise as indicators
of student performance in mathematics.
Their research is still taking place.
Again, according to Lembke and Foegen in 2005, curriculum-based measurement computation (grades one through six) and curriculum-based measurement concepts and applications (grades two through six) can be administered alone or in combination with one another.
Students in the earlier grades should use computation probes until the concepts
and application probes are appropriate for their grade level material from the curriculum.
For grades one to six once you select a task for CBM progress monitoring stick with that task
and the level of probes for the entire year.
We've looked at an elementary sample.
We've looked at the sixth grade sample.
Now, let's go to a high school sample and continue looking
at some more curriculum-based measurements.
This sample goal deals with an algebra curriculum.
Notice the goal is based on more than computations.
Marianne will increase her ability to describe and make generalizations through patterns
and functions and represent them in multiple ways.
How will a student's progress toward meeting this goal be measured?
In this particular case it's the number
of correct answers tracked on a progress monitoring graph.
The previous example had the number of correct digits.
In this one it's the number of correct answers.
Additional data gathered from classroom formative assessments, in either oral or written response form, and quarterly benchmarks could also be used to monitor progress.
As mentioned on the previous slide, while the standard and corresponding eligible content are listed, there's no requirement that either be included for the goal.
In this sample we're just trying to make a connection to the general education curriculum.
Again, this sample is for illustrative purposes only.
This sample of questions came from Project AAIMS.
AAIMS stands for Algebra Assessment and Instruction: Meeting Standards.
It was funded from January 2004 through December 2007.
It was designed to achieve two objectives related to the teaching and learning of algebra
for students with and without disabilities.
First they examined algebra curriculum, instruction and assessment for students with and without disabilities and determined the extent to which these were aligned.
Then they developed the assessment tools that can be used for monitoring progress
of students with and without disabilities.
Then they made sure that reliability, validity and sensitivity to growth were there.
Notice this sample is based on work with expressions and equations like simplify,
evaluate and solve and those are often considered basic skills.
This particular assessment would have 60 items and the student would have about five minutes
to complete as many as they could.
Again, the focus is on skills for which some level of automaticity is desirable.
Problems like this include solving basic fact equations, applying the distributive property, working with integers, combining like terms and applying proportional reasoning.
Again, scoring is based on the number of problems correct.
Here is another sample from AAIMS, Algebra Assessment and Instruction: Meeting Standards.
This sample is from the foundations assessment.
Notice the students are asked to graph and translate
as well as evaluate, simplify and solve.
Other Project AAIMS measures, besides the basic skills probes we saw previously and this foundations probe, include the algebra content analysis.
Those problems are sampled from core concepts in the initial two thirds
of a traditional algebra one course up through systems of linear equations and inequalities.
Students are encouraged to show work to obtain partial credit for these particular problems.
Up to three points would be given per problem.
One point could be deducted for circling an incorrect answer without showing any work.
There is a scoring rubric.
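The partial-credit rule just described (up to three points per problem, with one point deducted for circling an incorrect answer without showing work) can be sketched as a small scoring helper. This is an illustrative Python sketch; the function name and interface are assumptions, not part of the AAIMS rubric itself:

```python
def score_problem(points_earned, circled_wrong_without_work=False):
    """Score one content-analysis problem on a 0-3 partial-credit scale.

    points_earned: points (0-3) awarded from the rubric for the work shown.
    circled_wrong_without_work: True if the student circled an incorrect
    answer without showing any supporting work (one-point deduction).
    """
    points = max(0, min(points_earned, 3))  # clamp to the 0-3 rubric range
    if circled_wrong_without_work:
        points = max(0, points - 1)  # deduct one point, but never go below zero
    return points

print(score_problem(3))                                   # 3
print(score_problem(0, circled_wrong_without_work=True))  # 0
```

A total score would then simply sum `score_problem` across all problems on the probe.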
So, as we continue to move along the spectrum of what we're progress monitoring, you can see that the grading and scoring become a little more complex.
There is also a translation task.
That task requires students to match varying representations for relationships
between two variables involving equations, data tables, graphs and story scenarios.
So, here's a graph in which we can see that Marianne is making progress
as indicated by the upward trend in data.
But the question is, is the trend on a path to reach the goal?
So, an aim line was placed on this graph.
The aim line shows that we expect Marianne
to reach the mastery criteria by the end of December.
So, you can see at the end of December is the star in the right hand corner.
If that were the case, we can see that she is not on track to do so, and instructional adjustments would have to be made along the way.
If the goal was for Marianne to reach mastery by May or June then the aim line slope would not be
as steep as it is in this graph and she may have been on track to meet the criteria for success.
So, that aim line depends on when the ending date is.
Again, this example is strictly for illustrative purposes.
Let's look at Marianne's case a little bit closer.
Mr. Campbell has been using CBM to monitor the progress of all his students
in his classroom for the entire school year.
He has one student, Ernie, who's been performing well below his classroom peers even after two instructional changes.
After eight weeks Mr. Campbell determined that Marianne's trend line was flatter than the goal line, or aim line.
In this case the graph is for Marianne; we also have Ernie, whom we talked about, but we're not going to show his graph.
So, you can see the trend lines in there.
So, that's the trend that's taking place
as the CBM assessments are scored and the data is recorded.
In this case the trend lines are not at the same rate as the goal line or the aim line.
Now remember, we talked about there needs to be a certain number of data points
that are collected before instructional changes are made.
So, the first instructional change would be
that first thick vertical line that's on Marianne's graph.
After another eight weeks Mr. Campbell realized
that Marianne's trend line was still flatter than the goal line.
And you can see the graph shows that Marianne has made no improvement in math.
So, Mr. Campbell made another instructional change to Marianne's math program, which may have included work with math fact flash cards.
It could also include some individual instruction, or perhaps some diagnostics to pinpoint exactly where the issue was.
The second instructional change is that second thick line.
Based on the student's trend line: if the trend line is steeper than the goal line, the end-of-year performance goal needs to be increased.
If the trend line is flatter than the goal line, the student's instructional program needs to be revised.
But again, we don't make that decision after just one assessment.
If the trend line and goal line are fairly equal then no change needs to be made.
So, you can see in this particular case that the trend line is a lot flatter than the goal line
so that's why Mr. Campbell is making some instructional changes.
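The decision rules Mr. Campbell is applying can be stated compactly. The sketch below assumes the trend line and goal line have already been reduced to numeric slopes, and the tolerance used for "fairly equal" is a made-up value:

```python
def instructional_decision(trend_slope, goal_slope, tolerance=0.05):
    """Apply the CBM decision rules described above:
    - trend steeper than the goal line  -> raise the end-of-year goal
    - trend flatter than the goal line  -> revise the instructional program
    - trend and goal roughly equal      -> no change needed
    """
    if abs(trend_slope - goal_slope) <= tolerance:
        return "no change needed"
    if trend_slope > goal_slope:
        return "increase the end-of-year performance goal"
    return "revise the student's instructional program"

# Marianne's trend line is much flatter than her goal line:
print(instructional_decision(trend_slope=0.1, goal_slope=0.5))
# -> revise the student's instructional program
```

Remember, as noted above, this rule should only be applied after a sufficient number of data points have been collected.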
This chart reflects realistic and ambitious rates of improvement
for math involving curriculum-based measurement, computation and fluency probes.
The metric used is digits correct per two minutes per week.
Probes that would go along with this can be found on Intervention Central
and we have a slide coming up with some resources.
So, you can see on this particular slide that a realistic growth rate would be .3, which gives the rate of change, or how steep the line is.
An ambitious growth rate would be .5.
So, once we have the data, we can go in and figure out what these growth rates are, and then we use the growth rates and the trends to help us make instructional decisions.
Do we need to change instruction?
Are we improving at a rate that will close the gap?
Do we need to increase the goal?
Different things like that.
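Projecting an end-of-year goal from these rates is simple arithmetic: baseline plus the weekly rate times the number of weeks. A minimal sketch using the realistic (.3) and ambitious (.5) rates from the slide, with a hypothetical baseline and week count:

```python
def end_of_year_goal(baseline, weekly_growth_rate, weeks_remaining):
    """Project an end-of-year score (digits correct per two minutes)
    from a baseline score and a weekly growth rate."""
    return baseline + weekly_growth_rate * weeks_remaining

baseline = 14  # hypothetical current digits correct per two minutes
weeks = 30     # hypothetical weeks left in the school year
print(end_of_year_goal(baseline, 0.3, weeks))  # realistic: 23.0
print(end_of_year_goal(baseline, 0.5, weeks))  # ambitious: 29.0
```

The projected score could serve as the goal-line endpoint when drawing the aim line on the student's graph.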
Here are some resources that you can use.
The first resource is from the Research Institute on Progress Monitoring.
Excuse me.
The Office of Special Education Programs funded the Research Institute on Progress Monitoring
to develop a system of progress monitoring to evaluate effects of individualized instruction
on access to and progress within the general education curriculum.
These funded activities have ended
and the center is now disseminating the findings from its five years of research.
This is still available for people to get information.
The second one that we have is the National Center on Student Progress Monitoring; it's housed at the American Institutes for Research, working in conjunction with researchers from Vanderbilt University.
And they're a national technical assistance and dissemination center dedicated
to the implementation of scientifically based student progress monitoring.
Again, this particular project came to an end of its five year contract with the U.S. Department
of Education, but they continue to maintain the site with all its valuable resources.
The third sample is the National Center on Response to Intervention's website.
It has funding from the U.S. Department of Education and involves the American Institutes for Research and researchers from Vanderbilt University and the University of Kansas.
This center provides technical assistance to states and districts and builds the capacity of states to assist districts in implementing proven response to intervention frameworks.
So, here are three samples or three resources or sources
where you can get more information about progress monitoring.
You'll notice that one of them, the National Center on Response to Intervention's website, actually covers universal screening and progress monitoring for different assessments, and the information about those assessments is based on the data that's been collected.
So, this would be a good resource to go to if you're looking for a particular CBM.
We just talked about CBM in general except for the algebra probes.
The RTI for Success center defines progress monitoring as repeated measurement of academic performance to inform instruction of individual students in general and special education.
It's conducted on a regular basis to estimate rates of improvement,
identify students who are not demonstrating adequate progress and/or compare the efficacy
of different forms of instruction to design more effective individualized instruction.
We just reviewed two methods that are available to analyze student math performance,
monitor progress and inform IEP goals.
Ultimately, the specific methods that you choose should be based on the purpose of your assessment, what you want to measure and why, your expectations for that performance, and the degree to which your measures will inform your instruction and intervention for the student.
Good progress monitoring tools are assessments that are reliable, valid and evidence-based, and they use repeated measures that can capture student ability.
They should be measures of age appropriate outcomes.
No one progress monitoring tool can monitor for all outcome areas.
Different tools may be necessary for different areas.
Progress monitoring allows us to make decisions based on a pattern of performance rather than on one or two isolated pieces of information.
Student outcomes improve when performance is assessed regularly.
I would like to thank you for joining us.
There are additional webinars as part of this progress monitoring series.
The next one is on February 13th on the topic of progress monitoring of goals for students
who use assistive technology tools in reaching their goals.
Registration is through the PaTTAN website, www.pattan.net.
You can go to the training calendar.
We do have a couple of minutes left so if there are any questions you can type the questions
into the chat box and we'll try to answer them.
If we don't have an opportunity to answer the questions we will get back
to anybody that did ask a question.
Gee I don't see any questions.
So again, I want to thank you.
>> Hey Allen can you hear me?
It's Mike.
>> Dennis or myself.
Yes Dennis?
>> It's Mike and we got actually a couple of questions here.
>> Okay.
>> I'll read them over to you.
Say you were using a progress monitoring program such as AIMSweb and that drives the IEP goal.
How would these programs correlate to the common core standards now?
>> That's a very good question.
With the PA Common Core and the Keystone Algebra exam, Project AAIMS has those basic algebraic concepts and skills embedded.
So, there should be a pretty good correlation there.
When you look at some of the other CBM-type probes, that's a question you want to ask the company: what kind of alignment is there to the core?
Because the core, you know, is relatively new.
So, that's a very good question.
>> Alright, cool.
We have two more for you.
Should the progress monitoring be done at grade level or instructional level?
>> When you look at progress monitoring, I'd refer back to some of the notes.
The attempt is to try to get it done at grade level.
But, if the grade level was not providing enough information or you're not able
to gather enough information then you go down to the instructional level.
But, you always try to work up towards the grade level.
>> Thank you and one more.
How many goals should an IEP have for math?
>> That's a loaded question.
It depends on the student.
It depends on, you know, the needs of the student.
And it depends on the team that's involved.
Dennis, would you add anything to that?
>> No. That's exactly what I was going to say.
It's an IEP team decision based upon the student's needs that have been gathered through assessment and that are identified in the present education levels.
Alright, thanks Allen.
Any other questions we'll make sure that we get it taken care of through email after the fact.
[ Silence ]
>> So again, thank you everybody.
Have a great evening, take care.