Category Archives: .P1 Practice intentional inquiry and planning for instruction.

SnapChat Leak: An Educational Opportunity?

If you’re following this story, then you know that SnapChat, a super-popular App that a large number of my high school freshmen have on their phones, had a security problem that allowed a hacker to get the usernames and phone numbers of 4.6 million SnapChat users.

[Was your data leaked?  You can check using this look-up tool.]

I was eager to see if any of my students were in the set of leaked accounts.  I wanted to create conversation around why data leakers do this, and what appropriate responses would have been for the users and creators of such technology.

So I did some poking around.  I downloaded the data (46MB ZIP).  I tried to open it as a CSV in Excel 2013, but it couldn’t.  So I opened it in Notepad++ and searched for my number.  Not found.  I searched for anything in the 425 area code (Bellevue-Redmond).  Nothing.  Then anything in the 509 area code (eastern WA).  Nothing.  So none of my students were in the leaked data.
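If you’d rather not eyeball a 46MB file in a text editor, a few lines of Python do the same search.  This is a minimal sketch; the field layout (phone number first) is my assumption about the dump, so adjust the index if yours differs:

```python
import csv
import io

def rows_with_area_code(csv_text, area_code):
    """Return rows whose phone number starts with the given area code.

    Assumes each row's first field is a 10-digit US number; change the
    field index if the leaked file is laid out differently.
    """
    reader = csv.reader(io.StringIO(csv_text))
    return [row for row in reader if row and row[0].startswith(area_code)]

# Tiny made-up sample in the same rough shape as the dump.
sample = "2065550100,alice,Seattle\n5095550123,bob,Yakima\n4155550999,carol,SF\n"
print(rows_with_area_code(sample, "509"))
```

For the real file you would read it with `open(...)` instead of the in-memory sample, but the filtering is identical.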

It turns out only a select few numbers in 76 area codes were shared.

http://www.snapchatdb.info/img/count.jpg

http://mashable.com/2014/01/01/tool-snapchat-compromised/

And it’s interesting that only 10,623 numbers in the 206 area code (Seattle) were shared.  That’s only about 1 part-per-thousand of the total numbers in 206.  Which is either a comment on how little SnapChat matters in Seattle, or on the hacker underestimating which area codes to include.

Or take a look at the 815 area code in the picture above: if 215,953 numbers in 815 use SnapChat, that’s 21 out of every thousand phones (about 2%)!  Not bad for a small app that doesn’t care about security.
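The arithmetic behind those per-thousand figures, assuming the rough upper bound of 10 million possible numbers per area code (the assignable pool is actually smaller, so these slightly understate usage):

```python
NUMBERS_PER_AREA_CODE = 10_000_000  # rough upper bound; real assignable pool is smaller

def per_thousand(leaked_count):
    """Leaked numbers per thousand possible numbers in one area code."""
    return leaked_count / NUMBERS_PER_AREA_CODE * 1000

print(round(per_thousand(10_623), 1))   # 206 (Seattle): about 1 per thousand
print(round(per_thousand(215_953), 1))  # 815: about 21.6 per thousand, i.e. ~2%
```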

So, can someone get me all the 509 numbers at SnapChat please?  It would help me in lessons at school next week.  Smile

How I Got Some Freshman Science Students To Read “The Economist”

Last week I was grappling with a way to teach the Washington State Science Standards, in particular the INQUIRY A piece.

As is often the case, inspiration came in the nick of time.  I would have my students

  1. gain an appreciation for the breadth of science
  2. practice some literacy skills
  3. generate some “scientific questions”
  4. work in groups
  5. practice some creativity

Here’s how it went.  The room is arranged in groups.  At the beginning of class, we review science as a pervasive quest for knowledge, which often looks like questions.  Define/review scientific questions, and propose a form that students can use: “How does ____ affect ____?”  (Is this Act 1 for Science, a la Dan Meyer?)

Tell students that there are pages from a magazine (suitably shuffled) on their group tables and that they are to pair up and create a poster of 5 scientific questions, generated the following way: your partner takes a page and finds a noun on that page; you take a different page and find a noun on that page.  You then come together and form the question “How does noun #1 affect noun #2?”  (Act 2: you have a tool/method, now apply it.)
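The pairing procedure is easy to simulate.  A minimal Python sketch (the noun lists here are made up for illustration):

```python
import random

def scientific_questions(nouns_a, nouns_b, n=5, seed=None):
    """Pair one noun from each partner's page into 'How does X affect Y?' questions."""
    rng = random.Random(seed)
    questions = []
    for _ in range(n):
        x, y = rng.choice(nouns_a), rng.choice(nouns_b)
        questions.append(f"How does {x} affect {y}?")
    return questions

# Nouns a pair of students might pull from two different magazine pages.
page_one = ["bees", "inflation", "dry ice"]
page_two = ["cancer", "sand dunes", "elections"]
for q in scientific_questions(page_one, page_two, seed=1):
    print(q)
```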

Where this spins off into greatness is when students:

  • find themselves reading snippets of articles from The Economist for context, since they have been “struck in the curiosity bone”
  • find themselves posing questions like “do bees affect cancer?”, which might lead to a long and fruitful career in science for this 9th grader
  • realize that science questions sometimes look superficially quite silly but hide an incredible profundity, like “will dry ice slide down a sand dune?”

Finally for Act 3, we have a wall full of questions, from 4 periods of science students, which we can now take to the next level of refinement of the question, and posing more questions.  Take a look:

[image: wall of student questions]

[Book Review] Where the Rubber Meets the Road

I’m reading a book by Richard N. Steinberg entitled An Inquiry Into Science Education, Where the Rubber Meets the Road.

Professor Steinberg took a sabbatical (2007-2008) from the City College of New York to teach high school physics in Harlem.  This book is a reflection on his experiences.

His themes are predictable if you’ve been following current topics in education.

  • teacher preparation
  • student apathy
  • classroom management
  • abysmal math fluency
  • standardized testing
  • teaching is a lot of work!

His more hopeful and helpful themes are around how he has stood for true inquiry in his science classrooms, and some lessons that he taught.  That plus some other references he cites as resources are worth the price of the book.

Steinberg spoke at a conference in Washington DC in May for the Robert Noyce Scholarship folks at PhysTEC, since he is also involved in that program at CCNY.  He doesn’t talk about PhysTEC in his book, but I suppose it would be somewhat out of context.

Two Days of Professional Development

I spent the past two days in PD at my school.  The topic was formative assessment and the task was to build 5 formative assessments that can be used this semester in our classes.  We were broken out into our subject matter groups, which will also be our Professional Learning Communities for the year.

Our pacing guide for Algebra 1 [need link] seems reasonable, so we started with the 9 standards that we intend to cover before our first benchmark exam.  The general approach we are taking is that our formative assessments will be shared questions that we all ask for each standard.  We will record the results of these formative assessments and then discuss (in our PLCs) how our delivery of the content or the assessments for understanding could be improved based on the results.

Just how much process here should be shared was a topic of discussion.  Some people will do pre-test vs post-test comparisons, or exit tickets, or red-yellow-green paddles.
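For the pre-test vs post-test option, one common way to summarize the comparison is the Hake-style normalized gain: the fraction of possible improvement a student actually achieved.  A minimal sketch, with made-up numbers:

```python
def normalized_gain(pre, post, max_score=100):
    """Fraction of the available headroom (max_score - pre) that was gained."""
    if max_score == pre:
        return 0.0  # no headroom left to gain
    return (post - pre) / (max_score - pre)

# Hypothetical pre/post percentages for one student on one standard.
print(round(normalized_gain(40, 70), 2))  # gained half the available headroom: 0.5
```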

Naturally, if we are staying close to the standards, the question also comes up whether we shouldn’t just do standards-based grading as well.  I have to say it makes a lot of sense to me, but overall the team is still relatively new to formative assessment, so adding another layer of complexity to a year that starts very soon was deemed *not* ideal.  There was also some hesitation since administration will need to be informed of what we are thinking.

GREAT WORK:  results from the EOC tests are back, and it looks like Wapato students improved overall math scores from 20% to 51% [outstanding!]

EDU6978: Week 07: Due 2012-08-12

Reflection
TimeCheck
Only one more week of this course, and then our projects are due on 8/24!
Time is flying and classes will be starting soon enough after that…

Common Core
I had a mini-breakthrough this week with the Common Core State Standards in Math (CCSI, 2009).  It came while I was developing some flashcards (via quizlet.com) to drill myself on the abbreviations of the Domains used in the Standards.  For example, when I see CC.9-12.G.MG.3, I want to be able to quickly say “Common Core, High School, Geometry, Modeling with Geometry, Standard #3.”  If I want to be really insane, I could eventually learn that Standard #3 in that Cluster is:

CC.9-12.G.MG.3 Apply geometric concepts in modeling situations. Apply geometric methods to solve design problems (e.g., designing an object or structure to satisfy physical constraints or minimize cost; working with typographic grid systems based on ratios).*

The next breakthrough was doing an internet search on that string verbatim, “CC.9-12.G.MG.3”, and finding some really cool resources for lessons on that topic.  This is the power of having shared standards, and it suddenly dawned on me.  The other thing that dawned on me is that by drilling through 495 standards, I was getting more familiar with which topics are there and where the emphasis lies.  That was a total bonus.
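Once you know the pattern, an identifier like CC.9-12.G.MG.3 splits mechanically into its parts.  A small sketch (the field names are my reading of the pattern described above; real codes vary a bit, so this is not a full parser):

```python
def parse_ccss(code):
    """Split an identifier like 'CC.9-12.G.MG.3' into its named parts.

    Reads the pieces as: initiative, grade band, conceptual category,
    domain, standard number.  Real identifiers sometimes deviate from
    this five-part shape, so treat this as a sketch.
    """
    initiative, grades, category, domain, standard = code.split(".")
    return {"initiative": initiative, "grades": grades,
            "category": category, "domain": domain, "standard": int(standard)}

print(parse_ccss("CC.9-12.G.MG.3"))
```

This is exactly the decoding drill the flashcards were for: CC, high school (9-12), Geometry, Modeling with Geometry, Standard #3.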

Of course, this wild romp through the Standards means I am late in getting my PBL Final Project to my classmates.  (Please accept my apologies Cohort 10 A!)

Our discussion question this week was whether, in our view, the new Common Core or Next Generation Science Standards were moving toward or away from STEM.  Most folks felt that NGSS was definitely moving toward STEM, since the Crosscutting Concepts mention engineering and technology.

Schedule
[image: schedule]

Notes
(Verbatim from source unless italic)

Embedded Formative Assessment (Wiliam, 2011)

Chapter 6
Activating Students As Instructional Resources for One Another

[Introduction]

Even though there is a substantial body of research that demonstrates the extraordinary power of collaborative and cooperative learning, it is rarely deployed effectively in classrooms. This chapter explores the role that learners can play in improving the learning of their peers and concludes with a number of specific classroom techniques that can be used to put these principles into practice.

Wiliam, Dylan (2011-05-01). Embedded Formative Assessment (Kindle Locations 2685-2688). Ingram Distribution. Kindle Edition. 

Wiliam (2011) describes some great techniques for getting students to act as instructional resources for each other.  The author also makes a compelling argument for why this is necessary by citing his personal experience with both boys and girls who readily admitted to having “pretended that they understood something [the teacher had said in a 1-1 conversation] when in fact they didn’t.”

After cautioning that collaborative work is often not structured to demand both group and individual accountability at the same time, Wiliam describes some practical techniques for fostering true collaboration.

C3B4ME (“see three before me”): a strategy where the teacher reminds the students that he/she is not the only teacher in the classroom.

Peer Evaluation of Homework: a good trick for student-to-student formative feedback, with a side effect of getting students to do their homework both more consistently and more legibly.  (Huge pet peeve of mine!)

Homework Help Board: provides a means of connecting those who need help with those who might be able to provide it.

Two Stars and a Wish: encourages giving both positive and constructive feedback between peers on tasks and assignments.

End-of-Topic Questions: Uses groups to break through the "I don’t want to look silly" barrier, and also helps in literacy skills if questions need to be presented in written format.

Error Classification: When errors can be grouped easily, allows strong students to be paired with weaker students very readily.

What Did We Learn Today? The group gets together and forms consensus on what was clear and what wasn’t clear at the end of the day.

Student Reporter: One student is selected each day to summarize the day, or answer any remaining questions.

Preflight Checklist: a great way to get higher-quality work and to build accountability in students: have a checklist that they must go through before work is submitted.

I-You-We Checklist: is good for assessing how group dynamics are working and contributing to the learning process.

Reporter at Random: Ahh, this is the POGIL-style collaborative model, where each member of the group has a particular role; in this case, though, you don’t pick a reporter until one is needed, so students don’t tune out when they aren’t the reporter.

Group-Based Test Prep: By asking each member to prepare a section of the material on the test and then present it to the group, you build in some review skills, help peers give feedback on learning, and find some good questions for the test.

If You’ve Learned It, Help Someone Who Hasn’t: Wiliam saves the best for last, since this addresses a criticism often leveled at collaborative learning: namely, that the bright kids are held back and the kids who struggle aren’t helped.  We are reminded that an efficient group pairs those who know with those who don’t, and in the process both are well served.

This was a good chapter with a lot of practical techniques that I think I will try.

Conclusion

In this chapter, we have seen that activating students as learning resources for one another produces tangible and substantial increases in students’ learning. Every teacher I have ever met has acknowledged that you never really understand something until you try to teach it to someone else. And yet, despite this knowledge, we often fail to harness the power of peer tutoring and other forms of collaborative learning in our classrooms. This chapter has presented a number of classroom techniques that can be used with students of almost any age and that can readily be incorporated into practice. Many of these techniques focus specifically on peer assessment, which, provided it is geared toward improvement rather than evaluation, can be especially powerful—students tend to be much more direct with each other than any teacher would dare to be. However, it is important to realize that peer assessment is also beneficial for the individual who gives help. When students provide feedback to each other, they are forced to internalize the learning intentions and success criteria but in the context of someone else’s work, which is much less emotionally charged. Activating students as learning resources for one another can, therefore, be seen as a stepping-stone to students becoming owners of their own learning—the subject of the next chapter.

Wiliam, Dylan (2011-05-01). Embedded Formative Assessment (Kindle Locations 2898-2908). Ingram Distribution. Kindle Edition.

Standards

[NOTE:  I like to keep PDFs with my own annotations in Mendeley, a client that syncs your PDFs across machines, supports deep search, and keeps track of bibliographic information for each file.  You might want to check it out.]

They are listed in References, but here are some pretty accepted abbreviations of each with the most current dates for the governing documents.

CCSS-Math (2009):  Common Core State Standards-Math
CCSS-ELA (2010):  Common Core State Standards-English Language Arts
EdTech-WA (2008) :  Washington State Educational Technology Standards
NGSS (May 2012):  Next Generation Science Standards (May 2012 Draft)
ITEA-STL (2007):  Or maybe just STL, Standards for Technological Literacy

[No real significant Engineering Standards??]

Commentaries on Standards

I like to think of booklets such as “A Framework for K-12 Science Education” (NRC, 2012) as guides to help you interpret or apply the standards.  That source is listed below, but in addition we had some others in our Optional Readings for this week.

This is just to give you a flavor of what is out there.  I didn’t have time to digest all of these.  The titles are fairly descriptive; when I have time I will put my experiences with these in the comments.
[image: table of optional readings]

References

Common Core State Standards Initiative [CCSI]. (2009). Common Core State Standards for Mathematics. National Governors Association. Retrieved June 24, 2012 from http://corestandards.org/assets/CCSSI_Math%20Standards.pdf

Common Core State Standards Initiative [CCSI]. (2010). Common Core State Standards for English Language Arts & Literacy in History/Social Studies, Science, and Technical Subjects. National Governors Association. Retrieved August 7, 2012 from http://corestandards.org/assets/CCSSI_ELA%20Standards.pdf

International Technology Education Association [ITEA]. (2007).  Standards for Technological Literacy:  Content for the Study of Technology.  (3d ed.).  Reston, VA:  ITEA.  Retrieved August 9, 2012 from http://www.iteea.org/TAA/PDFs/xstnd.pdf

National Research Council [NRC]. (2012). A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas. Washington, DC: The National Academies Press.  Retrieved August 11, 2012 from http://www.nap.edu/catalog.php?record_id=13165

Next Generation Science Standards [NGSS].  (2012, May) [DRAFT]  Next Generation Science Standards.  Retrieved August 7, 2012 from http://www.cascience.org/csta/pdf/NGSS_Draft_May2012.pdf

Talbert, G. (2008). Washington State K-12 Educational Technology Learning Standards December 2008. Olympia, WA:  Office of the Superintendent of Public Instruction.  Retrieved August 8, 2012 from http://www.k12.wa.us/EdTech/Standards/pubdocs/K-12-EdTech-Standards_12-2008b.pdf

Wiliam, D. (2011). Embedded Formative Assessment. Bloomington, Indiana: Solution Tree Press.

Internship Reflection Week of 2012-04-02 [32] (the week before Spring Break)

My Monday SAT Prep class had some very interesting discussion.  Since I taped that session, I am able to go over the discussion which we had in class again, in greater detail.  I would like to reflect on the whole class, but highlight that discussion.

I was just wrapping up the following slide on proportions…

[image: slide on proportions]

Then KJ asked:  “With proportions, is cross-multiplication and division the only way to simplify?”  My answer:  “No,” and some elaboration led to a burst of student voice stemming from some mass confusion.

As I look back on the slide, the step where I multiply both sides by 12 could have been elaborated upon, or taken a little more slowly.  It is interesting to wonder whether that was the root of the ensuing 20-minute discussion.

A few students were confused about what it means to “multiply both sides by 12”, and one student, AO, asked whether we were multiplying by 12 in the numerator or in the denominator.

Another student was confused that we didn’t just compute (doughnuts/package) and then multiply by 5.  At that point I realized that students were not confident that I could take the reciprocal of both sides of the equation, i.e. have doughnuts/package on both sides, and still get the same answer for x.
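To see that the different routes land in the same place, assume the slide’s numbers were 12 doughnuts in 2 packages, with x doughnuts in 5 packages (the slide itself isn’t shown here, so those numbers are my assumption based on the “12 divided by 2 times 5” comment below):

```python
from fractions import Fraction

# Proportion: packages/doughnuts on both sides, 2/12 = 5/x.
x_cross = Fraction(12 * 5, 2)   # cross-multiply and divide: 2x = 12*5
x_unit  = Fraction(12, 2) * 5   # unit rate: (doughnuts per package) * 5 packages
x_flip  = 5 * Fraction(12, 2)   # reciprocal of both sides: 12/2 = x/5, so x = 5*(12/2)

# All three methods agree.
print(x_cross, x_unit, x_flip)
```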

[image: slide with both sides inverted]

When one student (F.R.) pointed out that this method would work even when the numbers weren’t so neat and tidy, I thought we were making headway, but just then a student asked:  “But why does it have to be that difficult?  Why can’t you just say 12 divided by 2 times 5?  Why is that so hard?”

And another student chimes in:  “I get what SL just said!”

“Maybe this question wouldn’t have been so hard if the numbers hadn’t been so easy,” said another student.

“Why do we have to be taught the more complex way?” says SL.

After about 8 minutes of students taking positions on cross-multiplication-and-division versus the algebraic method, we got at one root of the matter.

“When you write something over another number, it just looks so much more confusing than it has to be,” says SL.  We conclude that fractions are scary.  And that you have to work on them until they aren’t so scary.

“Fractions are, like, my worst enemy,” says SL.  And a couple of other students agree.

I have to say this animosity towards mathematics is very interesting, and a little dismaying.  No other subject seems so determined to make the learner feel stupid.  No other subject seems to offer simplicity and then, once a student is lulled into thinking they understand, spring a sudden change in difficulty.

Overall, I think the first half of the class was very valuable.  I think many students had chances to voice their frustrations or challenges with the content.  I need to keep those students in mind when I prepare a lesson.  I need to brainstorm other ways to connect the math to those students so that it feels authentic and non-threatening.  I am really drawn to a Mighton-esque approach, where the numbers are easier at first and the problems get only minutely harder as the student progresses.

The second part of the class (slightly better camera angle) was a little silly, but folks seemed attentive.  The break seems to be very helpful; students seemed refreshed and ready to go afterward.  After I gave out the homework handout, many people interpreted that as the end of class, which wasn’t so helpful, though some used the time to get work done.

This was the first class where I tried both a handout in class and giving out the homework with some class time to work on it.  I don’t think I will get a better return or completion rate on the homework by doing this, so I may not do it again.  I was able to collect quite a few worksheets that were completed in the first half of the class.

Internship Reflection Week of 2012-01-23 [22] (Unity3D in the Media Lab)

This week got off to a slow start when we found out that our campus had no power on Monday.  After that, however, things have been pretty busy.  As a preparatory exercise for the Computer Game elective that I will be teaching to high schoolers starting at the end of February, I have been enlisting some students to try out software packages that we might use.  That also gives me an idea of what learning activities we might use during the elective.

It was so exciting to see a student immediately start modifying the sample video game that came with the Unity3D software.  All the students were excited to play the game, a few were playing it over and over again until they had “solved” it.  This is the magic of facilitating an interest-based curriculum.

At various times I am fascinated to see students pull back from a project or task because they imagine that it is too hard, too much work, or too much math.  I interpret that fundamentally as a lack of confidence on their part, and a challenge to me as the teacher to make it accessible, i.e. to differentiate the exercises so that they feel confident and can go from success to success.

I had some good discussions with a couple of students (DM, MS) over their projects and interests that mesh with the CSI (Forensics) elective that I taught for the middle schoolers last semester.

Rough Timeline of Events [my notes only no need to evaluate]

Big Picture campus had no power on Monday 1/23.  I took my conditional certificate to district headquarters (ERAC) to get it registered (stamp on the back).

Sent mail to Interim Superintendent about Outlook 2010 access to work e-mail over the internet.

I finally got proxy permissions enabled for Outlook Web Access so I can read email 24×7; it was a bummer not to be able to read my school e-mail when I was working late or early in the morning.

Participated in portfolio surges for 201 (Wed) and 101 (Fri) classes.  The purpose of these was to get student portfolios into shape with documentation and other content in anticipation of upcoming exhibitions (Feb 6-10 and Feb 13-17).

Thursday I helped with a student project to get milk carton recycling going at the school during breakfast, middle school lunch, and high school lunch.

Sent a rough diary of interactions with students on 1/24

From: John Weisenfeld GMAIL [mailto:john.weisenfeld@gmail.com]
Sent: Tuesday, January 24, 2012 4:19 PM
To: Jessica Rottweiler; David Levine; ddundon@gmail.com
Cc: LD, DP, MJ, MS, KE
Subject: Rough Diary of Activities Tuesday 2012-01-24

Today KE helped me get Unity 3D game development software (http://unity3d.com/) installed on 8 computers in the media lab.

He then helped me debug the usage of that software when signed in as a student. We spent most of the day in that game development software playing a sample game that comes with that package (AngryBots).

MJ joined us around 10:00 or 10:30 am in the media lab and figured out how to use the software to modify AngryBots so that certain features of gameplay would change. He also mentioned that he would like to take a sample SAT test, and I wanted to point out that he can do that online for free at http://sat.collegeboard.org/practice/sat-practice-test

I went and got DP and MS from advisory at about 10:30a or 11am and they were also interested in seeing Unity 3D and playing the sample game.

LD joined us at about 11am, and also took a look at the sample game.

While a few folks were playing Doodle Jump (a vertical-scrolling game), others were looking for other Unity-based games whose source code we could download for modification and future experimentation.

All in all, I think this was a productive day, since I was unsure whether Unity 3D would be suitable for use in the Gaming Elective that I plan to start this Friday. I appreciate the help these scholars provided today and look forward to more experimentation with this tool in the coming weeks.

If they participate in the elective I have some ideas about reports or projects they could do on:

1. game companies

2. topics in the game industry

3. topics in game design

4. journal of games they play, for how long, what type of game (genre), and what they like and don’t like about the game

All of these will inform our work during the elective as we talk about the roles needed to produce a game (Engineering), and what we personally like about certain games (realism, i.e. Science/Physics) and the tools it takes to produce a game (Technology/Programming).

I would love to add comments like this to individual learning plans in the future, but I need to be added to the most current learning plans for these folks.  I will follow up with them, but it would be great if advisors could make sure that Dan Dundon (ddundon@gmail.com) and I (john.weisenfeld@gmail.com) are added to current learning plans.

Thanks!

John Weisenfeld

STEM Specialist/Intern

Highline Big Picture High School

206.631.7724 (work)

425.301.7404 (cell)

I also wrote a small report of student interactions on 1/26

From: John Weisenfeld GMAIL [mailto:john.weisenfeld@gmail.com]
Sent: Thursday, January 26, 2012 4:10 PM
To: Jessica Rottweiler; David Levine
Cc: Dan Dundon; LD; DP; MJ; KE
Subject: Quick Debrief on Today 2012-01-26

Hi Jessica, Hi Mr. Levine,

KE spent the day with us here in STEAM working on his Discovery Corps application.  His essays are looking good; he just has one more form that he needs to take home, get signed, and bring back, and then I think we can postmark his application to the Science Center by the deadline.  I had thought I might have a chance to get to some work in the Media Lab, but he spent a lot of time watching/helping with the pheasant dissection that Dan was doing today.

MJ spent some time in the Media Lab today working in Unity (game development software) and signing up for Microsoft DreamSpark.  I have been hoping that our students would take advantage of DreamSpark, since it provides free access to Microsoft professional software development tools and 90 days of free access to professional training (on how to build apps for iPhone, Windows Phone, etc.).  We watched a few minutes of sample videos, and although they are pretty high-level, they do have some demonstrations that our more technically adept students should be able to follow along with.

DP almost got signed up for DreamSpark and got especially excited that he could access training (Pluralsight) to help him get better at writing applications for Windows Phone (since that is the phone he has).  I spent much of the afternoon trying to get Microsoft Visual Studio 2010 Professional installed on a couple of machines in the Media Lab; as I type, the first one is still trying to finish…  Note that once these students have access to DreamSpark they can download and install software, or watch training, from home, the library, etc.

LD, MJ and DP all discovered a new type of game that none of them had played before, which allows for multiplayer basketball (third person: you look down on the court from midcourt, controlling your avatar).  The game was written in Unity 3D, which helps students make the connection that a tool they have access to can actually generate games they consider interesting or think have high play value.  We are still searching for a game whose source code we can get, so that we can tweak and hack and learn from it; no real luck yet beyond the AngryBots demo that comes with the free Unity 3D package.


John Weisenfeld
STEM Specialist/Intern
Highline Big Picture High School
206.631.7724 (work)
425.301.7404 (cell)

Twitter as Log of What I’m Reading

A little while ago I started using Twitter as a log of what I was reading on the web.  Most of what I am reading has to do with education, and although it is intriguing to think about how to solve all of education’s problems, I should focus my reading primarily on getting certified and stuff I need for my classes at SPU.

So check out my Twitter feed to the left here on my blog, and if you want to follow me, click here

Popham, Chapter 6 Pondertime & Chapter 7 Pondertime, Due February 8, 2012

Chapter 6 Pondertime (p. 161, #1, #3)
[image: Pondertime questions]
1.  If you were asked to take part in a mini-debate about the respective virtues of selected-response items versus constructed-response items, what do you think would be your major points if you were supporting the use of selected response test items?

Selected-response test items can be more numerous, i.e. students can complete them more quickly than constructed-response items, and selected-response tests can also be graded more quickly.  Although, I would readily admit that selected-response items do not give nearly the insight into the real thought processes of the test-taker that constructed-response items provide.  And there is more chance of guessing a correct answer on a selected-response test than on a constructed-response test.

3.  Why do you think that multiple-choice tests have been so widely used in nationally standardized norm-referenced achievement tests during the past half-century?

To put it quite simply, multiple-choice tests are easier to grade and easier to analyze.  The range of outputs of a multiple-choice test is only a function of the number of questions on the test.  Which is to say, you can explicitly determine all the possible scores of a multiple-choice test and then determine how many students fall into each score grouping.
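That finite range of outputs is easy to make concrete.  A sketch, assuming one point per question:

```python
from collections import Counter

def possible_scores(n_items):
    """Every achievable raw score on an n-item test is simply 0..n_items."""
    return list(range(n_items + 1))

def score_distribution(student_scores, n_items):
    """How many students landed on each possible score."""
    counts = Counter(student_scores)
    return {s: counts.get(s, 0) for s in possible_scores(n_items)}

# Hypothetical scores from five students on a 5-item test.
print(score_distribution([3, 5, 3, 4, 0], n_items=5))
```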

Chapter 7 Pondertime (p. 184, #1, #2)
[images: Pondertime questions]
1.  What do you think has been the instructional impact, if any, of the widespread incorporation of student writing samples in high-stakes educational achievement tests used in numerous states?

Teachers who want their students to do well on high-stakes educational achievement tests are under huge pressure to tailor their instruction to help their students succeed on such tests.  For student writing samples, this means that teachers deliver instruction that has students practice the skills needed to produce writing samples of high quality.  Tips such as the 5-paragraph essay format, narrowing the topic quickly and precisely, and writing with the right combination of detail and brevity are key.  Other instructional impacts may include the loss of time for other topics in the classroom as preparation is made for doing well on the writing-sample portions.

According to Wikipedia, the writing section of the SAT was added in March of 2005  (almost 20 years after I took the SAT).  I should do some research on whether data since that time has proven the Writing section to be valuable or useful.

References

SAT (2012) Retrieved February 10, 2012 from http://en.wikipedia.org/wiki/SAT

2.  How would you contrast short-answer items and essay items with respect to their elicited levels of cognitive behavior?  Are there differences in the kinds of cognitive demands called for by the two item types?  If so, what are they?

Short-answer items demand limited levels of cognitive behavior, at least in comparison to essay items.  The longer form demands, on average, more recall and reasoning from the student, or at least more regurgitation of opinions heard or given during classroom discussions.  The real issue, I think, is cognitive demand.  A short-answer item may, by definition, be a mere phrase or paragraph, and thus not require much application of new learning or thinking (i.e. extrapolation or interpolation of the sources and opinions of others).  An essay item forces a student to mentally perambulate through sources and opinions, hearsay and argumentation, and prove that they themselves can either replicate a standard argument or come up with a new one.  Here I am equating “argument” with a chain of assertions more or less supported by logic or sources, which necessitates some development (i.e. proposition, inference, conclusion).

That perambulation requires some extreme cognitive demand in order not to be found falling into some slough of whimsy or crevasse of fallacy.  A student successful on an essay item has definitely met higher cognitive demands than those of the short-answer question.  An interesting follow-up question might be:  “Is more better?”

According to the Wikipedia article on the SAT (2012), the writing section on that test has been studied since its inception in 2005, and there is some evidence that the longer the essay, the better the score.  This may be an artifact that favors the rambling student!

References

Popham, W.J. (2011). Classroom assessment: What teachers need to know (6th ed.). Boston, MA: Pearson Education, Inc.

SAT. (2012). Retrieved February 10, 2012, from http://en.wikipedia.org/wiki/SAT

Popham, Chapter 15 Pondertime & Chapter 16 Pondertime, Due March 7, 2012

Chapter 15 Pondertime (p. 384-385, #1, #3)
1.  If you were devising a plan to promote dramatically improved evaluation of the nation’s teachers, how would you go about doing so?

[hmm…where to start on one of the most hotly debated topics of our current political atmosphere.]

First, let’s get something straight: public school teachers are currently bureaucrats, which is to say that their environment is fundamentally *not* profit- or results-driven like private industry.  The private sector, the profit sector, is where I come from, having left Microsoft in 2011.

Second, we could dream up ways of improving evaluation, but until teachers overwhelmingly see the value of evaluation, they will be manipulated by districts, unions, and the media.  (Actually, I think the substance of this point goes back to a quote from Bill Gates.)

Third, New York recently (Feb 2012) published rankings on 18,000 public school teachers.  The “value added” plan for improving evaluation of the nation’s teachers seems to be engendering a lot of debate.  Despite the rhetoric, private and parochial schools seem to have no problem measuring teacher effectiveness based on the “product” or “outcome”: that is, whether students have learned and can prove it in some way, e.g., on a standardized test or exam.

Finally, all that is noble and virtuous about our education system, that children are treated fairly and that each one is nurtured to achieve his or her full potential, none of that can really be fostered in a cut-throat, competitively toxic teacher environment.  The best plan for improved evaluation is one that teachers themselves agree to, and that teachers themselves implement and believe in.  It must be peer-based, must have real rewards (and consequences!), and must focus on growth and improvement, not punishment and the status quo.

3.  When teachers evaluate themselves (or when supervisors evaluate teachers), various kinds of evidence can be used in order to arrive at an evaluative judgment.  If you were a teacher who was trying to decide on the relative worth of the following types of evidence, where would you rank the importance of students’ test results?
    a.  Systematic observations of the teacher’s classroom activities
    b.  Students’ anonymous evaluations of the teacher’s ability
    c.  Students’ pretest-to-posttest gain scores on instructionally sensitive standards-based tests
    d.  A teacher’s self-evaluation

Would your ranking of the importance of these four types of evidence be the same in all instructional contexts?  If not, what factors would make you change your rankings?

Below is my list of evidence that should be used in an evaluation of a teacher, in order from heaviest weight to lightest.

1.  I believe in teacher self-evaluation against clear, mutually acceptable criteria and reasonable expectations informed by workload and experience.  Teachers are making evaluations of their students in both significant and insignificant ways all day, every day.  Teachers are able to evaluate themselves.  If blind spots develop or are recognized, they should be highlighted in a coaching atmosphere.

2.  I put pretest-to-posttest gains next, because I think #1 actually drives #2.  If I wanted to prove in a self-evaluation that I was growing and having more impact on authentic student learning, then I would jump at the chance to present *data*.  That means I would pull out assessments showing that gains have been made, that I have added value.  Notice that I wouldn’t publish those numbers in the newspaper or broadcast them in the media, but I would use those data for self-evaluations.
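As an illustration of the kind of *data* I have in mind (the names and scores below are entirely hypothetical), here is a minimal sketch of computing pretest-to-posttest gains, including the normalized gain sometimes used in education research, which expresses the gain as a fraction of the improvement that was possible:

```python
# Hypothetical pretest/posttest percentage scores for a small class.
pre  = {"student_a": 45.0, "student_b": 60.0, "student_c": 70.0}
post = {"student_a": 75.0, "student_b": 80.0, "student_c": 85.0}

def raw_gain(pre_score, post_score):
    """Simple difference: posttest minus pretest."""
    return post_score - pre_score

def normalized_gain(pre_score, post_score, max_score=100.0):
    """Normalized gain: the fraction of the possible
    improvement (max_score - pre_score) actually achieved."""
    return (post_score - pre_score) / (max_score - pre_score)

for name in pre:
    g = raw_gain(pre[name], post[name])
    ng = normalized_gain(pre[name], post[name])
    print(f"{name}: raw gain {g:+.1f} points, normalized gain {ng:.2f}")

# A single class-average number a teacher could present
# in a self-evaluation.
avg = sum(normalized_gain(pre[n], post[n]) for n in pre) / len(pre)
print(f"class average normalized gain: {avg:.2f}")
```

The normalized form matters because a student who starts at 70% has less room to grow than one who starts at 45%; dividing by the headroom keeps the comparison fair.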

3.  I would put observations next.  Nothing beats peer review of teacher practice.  All of the high-flying charter and private schools use something like this as a method of continual process improvement.  It is, I believe, an essential piece of teacher evaluation.

4.  I would put students’ evaluations last.  Recall that students are minors and their maturity is often questionable.  To place high-stakes or career-impacting decisions in their hands seems foolhardy.  Nevertheless, I love getting the data; I would just de-emphasize it, hence it comes in last in my list of priorities.

I realize that instructional contexts vary, but I think my descriptions above are suitably general that the rankings would still hold.  All teachers should set goals and self-evaluate.  All teachers should give before-and-after exams to gauge their students’ improvement.  Observations are key because teachers should get peer and master-teacher feedback and challenges to improve.

Chapter 16 Pondertime (p. 409, #1, #3)
1.  If you were asked to make a presentation to a district school board in which you both defend and criticize a goal-attainment approach to grading students, what would your chief arguments be on both sides of this issue?

Goal Attainment Grading:  Pros and Cons
Pros: 
-  clear “goals” with clear definitions of “attainment” can be more readily communicated to students, parents, and other staff
-  potential to decrease the variability of grading between students, i.e., it may decrease some common tendencies (toward the norm, too harsh, too lenient) that sometimes exist in grading
-  since instruction is based on goals (standards), and assessment is ideally focused on goals, it is a logical extension that the communication back to students and parents (grades) should be based on goals attained or not

Cons:
- there is no accounting for effort in the goal-attainment approach; at some grade levels and in some situations, a notion of effort expended by students can be very informative
- given the number of goals needed, this could be a more labor-intensive grading system
- there are other interesting variables that teachers would like to report on for certain students, and goal attainment doesn’t capture them all

3.  If you were trying to help a new teacher adopt a defensible grading strategy, what advice would you give that teacher about the importance of descriptive feedback and how to communicate it to students and parents?

Grading by its very nature is a sorting process fraught with imprecision.  To reduce the perception that grading is arbitrary or subjective, and thus not defensible to the student or parent, it is very important that feedback be descriptive.  By descriptive we mean that any deviation from the standard that a teacher claims for a student is supported with evidence, and that the evidence inherently points to the improvements being requested of the student, i.e., toward how a grade can be improved or, conversely, worsened.

The beginning teacher should avoid thinking that grading is merely proof that they are doing their job, and instead treat grading as a communication of goal attainment (or lack of attainment) to all interested parties.  Once that groundwork is laid, the more interesting conversation about how attainment is measured can begin and become the focus of any improvement plans or rewards.

References

Popham, W.J. (2011). Classroom assessment: What teachers need to know (6th ed.). Boston, MA: Pearson Education, Inc.
