Monday, April 25, 2016

Self reflection quest


I am far from a perfect teacher.  I consider myself to be a good teacher, but there is always room for improvement.  For me, this is definitely true when it comes to teaching online.

This is my first year teaching an online course.  I am teaching AP Environmental Science Online for Floyd County Schools.  First, I will reflect on the few things that I have done well this year.

First, throughout the course, I made videos explaining various topics.  I think the presentation of the videos is clear and concise.  Here is an example video where I explain the Coriolis Effect and Prevailing Winds.


While I think my presentation of information is done well, there is plenty of room for improvement.  Both the video and audio quality are low, as I simply shot these with an old digital camera and uploaded them to YouTube.  The beginning and the end of the video could also use a little trimming.  This is something simple that I could do with the YouTube video editor, but like most teachers, I felt rushed for time throughout the semester and felt I could skip it.

Second, at times during the semester I felt that I did a good job of providing follow-up feedback when students were having difficulty with parts of the subject.  One thing that students struggle with in AP Environmental Science, whether in a live or an online classroom, is doing math calculations without a calculator.  Along the way in the semester, I assigned the students an AP DBQ (data-based question).  They did poorly with this assignment because of weak math skills and failure to show their work.  Here is the assignment.

As previously stated, the students did very poorly on this first math free response question.  I immediately made a video to give them feedback.  Here is the video.
This video addressed many of the problems the students had, and I know at least one of them found it helpful, as I got an email response from her telling me how much it helped.  Unfortunately, I think I permanently deleted this email, so you'll have to take my word for it.

Now to reflect on parts of my online teaching that need improvement.

Synchronous sessions - I did a very poor job of hosting synchronous sessions this semester.  There were two reasons for this.  One, I was not that comfortable with the platform that I was using to host the sessions (Google Hangouts).  More importantly, though, my students were scheduled for their classes at different times throughout the day and at different locations.  To get them all to a synchronous session at the same time, I had to schedule a time outside of school.  Kids' schedules are all different after school too, with sports, work, etc.  At any rate, I was never able to get the majority of my students to even attend the few sessions that I had.

I will make the synchronous sessions a more important part of the class next time, and to increase participation, I will set a time up front and require student participation.  I will tie this to their grade in order to encourage participation.  They will have a certain number of acceptable misses, but they will have to make up those misses by watching a recording of the synchronous session and leaving comments.

Individualized path to learning - As I built my course this year, I designed lessons to teach the desired material, but I did not allow for individual learning paths.  Next year, I will do a better job of that.  I will individualize learning paths in two different ways.

1 - On quizzes and tests, I will use the Moodle feedback option to give general and specific feedback.  In these, I will include links to help reteach material for those students who missed the questions.  I have started to do this some.  Here is an example.

I know that is still too small to read easily, but what it shows is a question with the individual answer choices.  Each choice has feedback.  Here is a screenshot of the most important part - general feedback with a link to a video to reteach what a student may have missed.
2 - I will prepare individual modules for students that need further instruction, either remediation or enrichment, and will selectively release those to only certain groups of students.

I could reflect on my learning experiences more, but I think this gives a sufficient representation of my self-reflection, noting what I did well and where I need improvement.

Thursday, April 21, 2016

Differentiation quest

For my online AP Environmental Science class, I use Moodle.  Moodle doesn't have hotzone displays like those cited on the Georgia TOOL page (at least not to my knowledge), which give an instant visual of student performance on questions.  What Moodle does give for each quiz is a student grade report for each question and a histogram of overall student grades.

Below you see a student grade report for a quiz that I gave early in the year on Earth Science Concepts.  After the picture, I will describe what it tells me as a teacher.
The first column of this table shows me the amount of time that each student spent on the quiz.  This allows me to see which students are fast and slow.  If someone had a superfast time (say 1 minute), I would know that the student either cheated or randomly guessed.  The student in this example that took only 9 minutes seems to have rushed.

This time info also allows me to see if I set my time restriction correctly.  I had set a 35 minute limit for this quiz.  Most students finished in plenty of time.  I did have two students that took most of the time - one at 34 minutes 9 seconds and the other at 31 minutes 20 seconds.  This tells me that the time I allowed for my quiz was about right.  I may shorten the time limit to 30 minutes next year.  This would rush a few students, but it would give most students time to answer and make sure that they weren't using their allotted time to look up answers.

The second column shows me the individual grades, and the bottom of the column shows me the class average.  The class average here was passing (70.81) but lower than I would have liked.  I may have rushed my teaching, but I'll analyze the scores more below when I look at individual items.  Certainly it was passable; 2 of the 7 students made Bs.

The remaining columns show student scores and the average number of points achieved on each question.  The possible points for each question is shown at the top of the column.  For example, on question 2, there were 4.35 points possible and the class average was 4.35.  No student missed the question.  I am more concerned with the questions where the students did poorly.
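
To make this concrete, here is a minimal sketch of how per-question class averages like these can be computed from a score table.  The scores below are invented for illustration, not my actual gradebook; only question 2's point value (4.35) comes from the quiz above.

```python
# Hypothetical quiz scores: each key is a question, each list holds the
# points earned by the seven students on that question.
scores = {
    "Q1": [3.0, 4.0, 4.0, 2.0, 4.0, 3.0, 4.0],  # made-up values
    "Q2": [4.35] * 7,  # everyone earned full credit (4.35 points possible)
}

def question_averages(scores):
    """Return the class average for each question."""
    return {q: sum(pts) / len(pts) for q, pts in scores.items()}

averages = question_averages(scores)
print(round(averages["Q2"], 2))  # -> 4.35, so no student missed question 2
```

When a question's average equals its points possible, as with question 2, no student missed it; the questions worth a closer look are the ones whose averages fall well below the possible points.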

Question 7 had more points available.  It was a matching question about the layers of the atmosphere.  Students averaged about 15 out of 22, so they got about 2/3 of it right.  That tells me that they probably knew some of the layers and guessed on others.  After this quiz, I should have posted a video link to review the layers of the atmosphere.  Maybe a video like this one.

However, it is clear to me by looking at the questions that the students had their worst performance on questions 11 and 12.  I'll talk about 12 first because I feel I actually understand why they did the worst on this question.

This (the earth science concepts quiz) was the first quiz the students took in the class this year.  Question 12 was the only question on the quiz that was open response.  Here is an image of the question as the students saw it.  
A number of students were intimidated by the question and just didn't try it.  Others wrote responses, but the responses were not on topic.  They tended to restate the question without giving an explanation.  It was clear to me after this quiz that I needed to spend some time explaining to students how to answer open response questions.  Specifically, I needed to make sure that they knew the meaning of important terms often used in free response questions - describe, explain, differentiate, etc.  Next year, I will be sure to give students examples of how to answer free response questions before their first quiz containing one.

Question 11 was about plate-plate interaction.  There were several other questions on this subject and the students did well on them.  I don't know why the students did poorly on this one.  Here is the question.


My only guess here is that they did not pay attention to the phrase "forced beneath the other".  I assume that they read that plates collide and thought it was mountain building.  

Moodle then summarizes the grade data in a histogram.  What I see here is that it is approximately bell shaped but slightly skewed to the right.  The majority of students would need remediation on this subject.  Clearly, one student did worse than the others, but the majority of students made 75 or less.  I would group these students for remediation before allowing them to go on to the next topic.  The students who made Bs could go ahead.
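
The grouping rule I describe above can be sketched as follows.  The student names, grades, and the 75-point cutoff are all invented for illustration; the real decision would use the actual histogram data.

```python
# Hypothetical overall quiz grades for a small class.
grades = {"Student A": 62, "Student B": 71, "Student C": 75,
          "Student D": 68, "Student E": 84, "Student F": 73,
          "Student G": 86}

# Assumption: 75 or below means the student joins the remediation group
# before moving on; above 75 (the B students) may go ahead.
REMEDIATION_CUTOFF = 75

remediate = sorted(s for s, g in grades.items() if g <= REMEDIATION_CUTOFF)
move_on = sorted(s for s, g in grades.items() if g > REMEDIATION_CUTOFF)

print(remediate)  # the majority of the class
print(move_on)    # the two B students
```

A simple threshold like this is easy to automate in a gradebook export, which matters when the same sorting has to happen after every quiz.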

I've already written about one modification that I'd make for all students: teaching them to answer free response style questions before the first quiz.  When I actually did this during the semester, it was after the fact, which is what the data suggested I needed to do.

Moodle also has a nice feature that allows for instant general or specific feedback on missed questions.  Many students got partial credit on the layers of the atmosphere matching question.  In their feedback from the test, I could post a video link going back over the layers of the atmosphere.  I mentioned that earlier and even put a link to the video.  What I could do, though, is set up a new module with restricted access just for those low performing students.  In addition to the video link, I could put a quiz over the topic and require successful completion before allowing the students to move on.  This differentiation would personalize the learning paths for the students.

Monday, April 18, 2016

Rubrics and competency quest

Competency instruction is a way of designing a course so that students get all essential elements rather than just 70% of all elements.  In the past, I have not structured my classes this way, but I may do so in the future.

To show that I am competent at designing course competencies, I have designed competencies for a physics lesson on analyzing graphs.  Here is the Georgia Performance Standard for which I have designed competencies.  In my example, I am teaching students how to compare the relationships among position, velocity, and acceleration graphically.

SP1.C Compare graphically and algebraically the relationships among position, velocity, acceleration, and time. 

Before designing my competencies, I will specify learning objectives.

1. Interpret points on graphs.
2. Interpret the slopes of graphs.
3. Interpret the areas of graphs.


In order to learn these skills and how they apply to physics, students will complete a number of assignments from the chart below.  They will either a) read the section in the book on interpreting graphs and write a summary of the reading or b) watch three videos - one on interpreting points, one on interpreting slopes, and one on interpreting areas - and write a summary of each.  All students will do the motion graph worksheets.  Students will either do the constant velocity lab or the constant acceleration lab.  These are open ended labs where the students design their own experiments.


Learning objective: Interpret points on a motion graph
Assignments:
*worksheet on reading points
*read section on interpreting graphs and write a summary
*watch video on interpreting points and write a summary
*motion graph worksheet
*constant velocity lab
*constant acceleration lab
Competency indicators:
*make a slideshow presentation on how to read points on a graph
*write a paragraph length essay explaining the meaning of indicated points on motion graphs

Learning objective: Interpret slope on a motion graph
Assignments:
*worksheet on reading slope
*read section on interpreting graphs and write a summary
*watch video on interpreting slope and write a summary
*motion graph worksheet
*constant velocity lab
*constant acceleration lab
Competency indicators:
*make a slideshow presentation on how to read slopes on motion graphs
*write a paragraph length essay explaining the meaning of indicated slopes and how slopes change on motion graphs

Learning objective: Interpret area on a motion graph
Assignments:
*worksheet on reading area
*read section on interpreting graphs and write a summary
*watch video on interpreting area and write a summary
*motion graph worksheet
*constant velocity lab
*constant acceleration lab
Competency indicators:
*make a slideshow presentation on how to read areas on motion graphs
*write a paragraph length essay explaining how to interpret areas on motion graphs


In the introductory learning, the students have the opportunity to read about graphing or to watch videos about graphing.  In either case, they have to produce something (a summary) to show their learning.  Below is a link to a YouTube video that I could use for part of the assignment.  (I did not create the video.)


All students would have to do separate assignments (worksheets, which could be converted to online quizzes with feedback for wrong answers) for interpreting points, interpreting slopes, and interpreting area.  All students would be required to do the motion graph worksheet as well.

The students could then choose between several lab activities.  These would all be open response, student produced labs.  The students would be given a task and have to design their own lab.  They may be asked, for example, to measure the speed of a constant velocity cart, to measure the acceleration due to gravity, or to analyze the motion of a ball rolling down an incline.  Below is a written text description of one such activity.


I would be checking skills for mastery from each assignment.  Each assignment would be categorized and a certain number of points for successfully completing it would be assigned to each competency (learning objective).  As a final competency indicator, students would either choose to write paragraphs explaining different ways to interpret graphs, make a slideshow presentation explaining how to interpret graphs, or some combination of the two.  These would also have point values assigned.  If a student reached a set number of points, he would receive credit for the competency.

It'd be possible for the student to get credit for reading the slope of a graph and not get credit for reading the area.  Hopefully, the student would master all three, but using competency indicators gives more information than simply one test on reading graphs.
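
The point-based credit check I describe above could be sketched like this.  The point values, the threshold of 10, and the assignment breakdown are all invented for illustration; the real values would come from the categorized assignments in the table above.

```python
# Assumption: a student earns credit for a competency once the points
# from completed assignments in that category reach a threshold.
CREDIT_THRESHOLD = 10  # invented value for this sketch

# Hypothetical points one student has earned, keyed by competency.
earned = {
    "interpret points": [4, 3, 5],   # worksheet, lab, slideshow
    "interpret slopes": [4, 3, 5],
    "interpret areas":  [4, 2],      # final indicator not yet completed
}

def credit_report(earned, threshold=CREDIT_THRESHOLD):
    """Return whether the student has earned credit for each competency."""
    return {comp: sum(pts) >= threshold for comp, pts in earned.items()}

report = credit_report(earned)
print(report)  # credit for points and slopes, but not yet for areas
```

This mirrors the situation described above: the student shows mastery of points and slopes but still owes work on areas, which a single overall test grade would hide.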



Thursday, April 14, 2016

Data driven instruction, analytics quest

Different learning management systems have different tools to analyze student data.  To teach this, Georgia TOOL has asked me to analyze some sample data from a course.  I will link each table below and explain what I learn by looking at the data.

This table shows
  • the courses being taught in the spring of 201
  • that each course starts on January 7
  • that each course ends on May 10
  • the name of the courses taught (AP micro, econ, and american government)
  • and the number of students enrolled in each course (23, 49, and 43 respectively)
The table above gives information about each student.  It shows whether the student must pay for the course or if the school is paying.  It shows if the student is classified as gifted or not.  It shows if the student took an orientation or not.  In this case, all students did and the date of each student's orientation is shown.  It shows if the students have dropped the course.  None have.  And it shows that all students will be evaluated by an end of course test.


The image above shows a communication log.  It shows the type of contact.  Each contact here was by phone.  A voicemail was left for each person contacted.  The first person on the list was a student.  The other two were parents.  The "nature" heading shows the reason for the call.  In the first one, it is "other", with the reason explained in the comments.  The student got a welcome call on January 11, four days after the class started.  This is a rather prompt contact - making a positive contact in the first week of the course.  The other two calls went to parents about failing students.  These calls were made in February, still rather early in the course, giving parents and students ample time to work on improving student performance.

The image above shows some examples of positive contacts.  The first log shows that a welcome call was made near the beginning of the course (January 16).  The second contact shown documents an email to parents of students who are doing well in the course.  This contact was made mid-semester, March 25.  This is a good way to encourage students who are doing well.  Too often, these students are overlooked in teacher contact.


The image above is a bit confusing to me, but I'll explain what I think it means.  The fraction at the left shows how many grades the students have scored.  The first two students have scores for 58/59 grades, so all they have left is the final exam.  The other students have missed one or more assignments.  The student at the bottom only has scores for 16 of the 59 assignments; he has not been doing his work.

The numbers along the bottom of the table show the dates of the assignments, and the numbers in the boxes show the number of kilobytes used on each assignment.  Thus, you can tell that the first two students have been active on nearly every assignment.  Most of their assignments have used multiple kilobytes, with the high being 12 kilobytes.  The students near the bottom who have completed few assignments have not been active.  Most of their assignments have used zero kilobytes.  One used 1.

Based on this information, the teacher may need to follow up with failing students and their parents earlier in the course.  Several of the students have just not done their work.


Students should also be taught to analyze data.  This is a report that a student might see.  On the first assignment, "Advertising Evaluation", the student should see his grade, 83%.  He should also see the teacher comment.  It starts with praise for what the student did well and then gives constructive criticism and a reminder of the instructions.

The student should see that he failed to do the 2nd assignment, "Multi Media Definitions".

The student should see that he did well on the 3rd assignment, 95%.  Again, the teacher starts with praise and then gives constructive criticism.

This image shows questions on a quiz that the student has missed.  The student can see each question and the answer he put (a wrong answer).  This will allow him to look up the correct answer.  This should help him on future assessments.

This image shows a student's assignments that he has submitted as files.  He can see his grades, 100 on the first assignment and 92 on the second.  Beside each assignment is a view button that the student can click to see comments on his assignment.


The image above shows what the student would see if he clicked on the view button beside the resume assignment.  Thus, the student now knows why he lost points on the assignment.


It is important in an online class that teachers, students, and parents be able to access and interpret data about assignments.

Summative assessment quest

One of the first assessments I made for my online environmental science class this semester was a quiz over earth science concepts.  You can see the first two questions in screenshots below.

These are there to give you an idea of what the quiz is about and to show that all the answers are plausible.  For example, all of the answers on question 1 are types of plate interaction - therefore they can all serve as distractors.  The answers on question 2 are choices from several different things they learned in the unit.
I gave this quiz on Moodle.  It does have some tools for analyzing scores, but not quite as nice as those illustrated in the Georgia TOOL lesson.  Here is what the Moodle analytics page looks like with the student names removed.  It took two screenshots to capture them.
This was one of the first quizzes that I graded, so I could not correlate these scores with the students' overall grades.  However, in hindsight, I can now.  Overall, the grades on this quiz correlate nicely with the students' overall grades.  One student, the first one on the list, did significantly worse on this first quiz than his overall grade would suggest.  I think this is because he then realized he needed to change his study habits and put in extra effort.  Anyway, one aberrant quiz isn't that unusual.  We all have bad days.

Some of the questions were so straightforward that no one missed them.  These were useless for discriminating between high and low performers.  Question 11 was the best for discriminating performance - about half the students got it right and about half got it wrong.
To me, this question is as straightforward as all the others.  It is a simple definition question.  One student missed it because of time (he did not answer).  The others seemed to confuse spreading plates with a fault.  This is something that I cleared up with simple feedback.
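
The discrimination idea can be made numerical: a question everyone answers correctly tells you nothing about who knows the material, while a question that roughly half the class gets right separates high and low performers best.  Here is a small sketch of that logic; the response data below is invented, not my actual class results.

```python
# Hypothetical right/wrong results per question (True = correct).
responses = {
    "Q2":  [True] * 8,                    # everyone correct: no information
    "Q11": [True, True, True, True,
            False, False, False, False],  # half correct
}

def proportion_correct(responses):
    """Fraction of the class answering each question correctly."""
    return {q: sum(r) / len(r) for q, r in responses.items()}

def best_discriminator(responses):
    """Question whose proportion correct is closest to 0.5."""
    p = proportion_correct(responses)
    return min(p, key=lambda q: abs(p[q] - 0.5))

print(best_discriminator(responses))  # -> Q11
```

Moodle's own statistics report computes a more sophisticated version of this (a facility index and discrimination index per question), but the intuition is the same.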

Question 12 did not coordinate well with the overall grades.  Almost everyone missed the question.  There were two reasons for this.  First, it was a multi-part matching question.    Missing any of it made the students miss it all.  While this accounted for some of the low scores, I also felt that I had not emphasized the layers of the earth as much.  I had taught this first in the unit and not come back to it.  Also, the earth is divided into layers two different ways - by physical properties and chemical properties.  This confused students.  The low score on this question told me that I needed to reteach this topic.

I know that the quiz was valid because these scores correlated fairly well with overall grades.  There were legitimate distractor answers in each question.  This test was fairly secure, as the students had one attempt and a strict time limit.  The time limit may have been a bit too short, as one student did not finish, but I'd rather have that than give the students so much time that they could just look up answers.

Wednesday, April 13, 2016

Quality feedback

Quality feedback is one of the hardest things to provide in an online classroom.  I have had success giving feedback with a couple of tools.

One tool that I like to use is the comment feature built into Google Classroom.  I believe that you can also get the same features using Read and Write for Google.  These work with Google Docs.  Below, you'll see a screenshot sample of some student work with my comments.




While Google makes written comments much easier than many programs, I still find it easier and quicker to give comments verbally.  This can still be done in the online classroom by using voice recordings.  I have used SoundCloud to make such recordings.

Below is a link to audio feedback that I gave to a student.  The assignment that he had completed was making a food chain.

Food Chain Feedback


While tools for giving feedback are improving, this is still one of the areas that I have the most difficulty with in online teaching.

Formative Assessment Quest

There are many ways to assess students' progress toward learning goals.  I teach science.  For my example, I will refer to teaching physics.

In teaching physics, I think it is important for students to get the physics concepts first.  Then, they can put quantitative computation (numbers) with their ideas.  This is particularly important in teaching students to understand graphs.

I think that teaching and assessment of new ideas should follow a pattern:

  • pre-assessment to see what students already know 
  • teach new concepts tying them to what students already know
  • assessment of understanding of concepts
  • teach how to represent the concepts mathematically
  • teach how to use formulas to quantitatively solve problems
  • assess problem solving
  • final assessment
In the above sequence, assessments are interspersed throughout the teaching process.  These assessments can be used to identify areas that need reteaching and students who need remediation.

To demonstrate how I'd use assessment, I have created a Socrative quiz to assess the following standard.

 SP1.C Compare graphically and algebraically the relationships among position, velocity, acceleration, and time



The assessment focuses on the graphing part of the standard.  Most of the questions focus on concepts, with one or two questions asking for calculation.  This quiz will be used to see if students understand the concepts needed to interpret graphs before focusing on the numbers.  It does have a couple of numbers questions to see if they are making the connection on their own.

Socrative allows for immediate feedback to questions.  The feedback that I have supplied is detailed enough to help them make the connections on the number problems if they are not already doing so.

Here is a screenshot of a couple of questions from the quiz edit mode.  This allows you to see the questions and the feedback that the students will get.  Below these two sample questions, you'll find a screenshot of the whole quiz.  It is a bit small to read; that is why I posted some individual questions.