Students Are Not Customers

The latest trend among college administrators is to say we need to treat our students like customers. The idea is that educators need to improve “customer service” in order to give students a better educational experience. This model doesn’t make any sense. It almost feels like administrators have taken a few MBA courses and latched onto this idea of students-as-customers as the solution to the problems faced by higher ed. With the latest financial crisis reducing endowments, states continuing to cut funding to higher education, and a soon-to-be-shrinking pool of students, universities are looking for ways to keep enrollment up, but treating our students as customers is not the way.

What is the biggest problem with the students-as-customers model? It leads to a sense that professors somehow owe students a passing grade and the university owes the students a diploma. This sense of entitlement is psychologically damaging to the faculty. I know this isn’t the mindset most administrators start out with, but it is an unintended consequence of the students-as-customers idea. We already deal with students (and even parents) who feel entitled to good grades in our courses (“My special little snowflake, Jonny, should get an A because he is going to med school”). I think the business model further reinforces this sense of entitlement and can make faculty feel like they aren’t supported by administration.

Let’s stick to the overall business mindset for a moment and take another look at universities. First off, students are not customers, nor are they products. Graduates are what higher ed “produces” and it is employers and graduate schools that “purchase” the products. This makes students the raw materials that have the potential to be turned into finely crafted diploma recipients. To keep employers happy, we need to be turning out high-quality graduates. Following this idea, educators should be culling weaker students and admitting only the highest caliber of applicants. The problem with this model is that universities (especially public universities) have a larger responsibility to society to provide educational opportunities to residents. We have a responsibility to give students the best chance of success we reasonably can. The business model for universities fails because we have responsibilities to differing groups that sometimes conflict. What employers and grad schools need is different from what society needs, which is different from what individual students need.

Universities are definitely businesses, and they do need to learn how to be more efficient. There is much we could learn from the business world. However, treating our students as customers can lead to a dangerous sense of entitlement that can embolden students and undermine faculty. When students pay tuition they are buying an opportunity to learn; they are buying access to the infrastructure of the university. It is ultimately up to the students what they do with that opportunity, whether they make the most of their learning options or whether they spend their time on Facebook and Twitter. It is the university’s responsibility to make sure students have access to the best educational infrastructure possible: the best faculty using the best instructional techniques, the best classrooms, libraries, and student centers, and the most efficient administration that is responsive to their needs.

Posted in Just for Fun | Tagged | 3 Comments

How Many Objectives Should I Have?

One question I frequently see people asking about Standards Based Grading is how many standards or learning objectives are reasonable for a class. I came up with a back-of-the-envelope calculation to give you an idea of the maximum number of learning objectives your course should have.

\text{Learning Objectives} = \frac{\left(\text{Assessments per week}\right)\left(\text{Objectives per assessment}\right)\left(\text{Weeks per semester}\right)}{\pi\left(\text{Proficiencies per objective}\right)}

Most of the terms in the equation* are self-explanatory, but “proficiencies per objective” is how many times you expect a student to demonstrate proficiency in order to complete a particular learning objective. The factor of pi is thrown in because (1) without that factor the equation gives a number that is too large and (2) every good scientific equation needs a factor of pi in it somewhere.
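If you want to plug your own numbers into the equation, here is a minimal Python sketch of the calculation. The example values are purely illustrative, not recommendations.

```python
from math import pi

def max_learning_objectives(assessments_per_week, objectives_per_assessment,
                            weeks_per_semester, proficiencies_per_objective):
    """Back-of-the-envelope cap on the number of learning objectives for a course."""
    return (assessments_per_week * objectives_per_assessment * weeks_per_semester) / (
        pi * proficiencies_per_objective
    )

# Example: one assessment per week, 4 objectives per assessment,
# a 15-week semester, and 2 proficient demonstrations required per objective.
print(round(max_learning_objectives(1, 4, 15, 2)))  # roughly 10 objectives
```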

 

* The observant reader will note that the units of the above equation don’t work out. This is because “objectives per assessment” should really be “proficiencies per assessment.”

Posted in Uncategorized | Leave a comment

LOBA and Standards Based Grading

I should probably clarify the distinction between Learning Objectives Based Assessment (LOBA) and Standards Based Grading (SBG). LOBA is a particular implementation of SBG. I want to be clear that LOBA isn’t some brand new grading paradigm that I created out of the blue. It is my take on what many others have done in implementing SBG in their classrooms.

So the next question you will ask is “what is SBG?” The heart of SBG is (1) to get students focused on concepts and skills rather than points, (2) to make sure grades are an accurate measure of student understanding, and (3) to give students multiple chances to master material. This is essentially the same as the LOBA philosophy. The distinction between LOBA and SBG is really in the mechanics. There is no one correct way to implement SBG, and different instructors have tried a wide variety of techniques in terms of determining final grades, reassessing, the number of standards, and so forth. In LOBA, the final grade is determined by the fraction of learning objectives a student completes. There are typically A-level and C-level learning objectives (and sometimes more), and completion of most lower-level learning objectives is required to earn higher grades. The number of learning objectives tends to be higher than in some SBG implementations because LOBA focuses on more discrete skills and concepts, rather than big-picture standards. Reassessment opportunities have limitations to make the workload more manageable. I know there are SBG-ers out there using many and sometimes all of these features, so it is fair to say that LOBA is a subset of SBG.
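To make the grade-determination mechanics concrete, here is a toy Python sketch of how a final grade might be computed from completed objectives. The thresholds are made up for illustration; they are not the cutoffs from my course or anyone else’s.

```python
def loba_letter_grade(c_completed, c_total, a_completed, a_total):
    """Toy LOBA-style grade calculation with invented thresholds."""
    c_fraction = c_completed / c_total   # fraction of C-level objectives completed
    a_fraction = a_completed / a_total   # fraction of A-level objectives completed
    if c_fraction < 0.7:                 # too few basic objectives completed
        return "F"
    if c_fraction < 0.9:                 # most, but not nearly all, C-level objectives
        return "C"
    # Nearly all C-level objectives done; A-level work raises the grade.
    if a_fraction >= 0.8:
        return "A"
    if a_fraction >= 0.4:
        return "B"
    return "C"

print(loba_letter_grade(c_completed=18, c_total=20, a_completed=5, a_total=10))  # "B"
```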

When I started using SBG I ran into a number of faculty who would ask if SBG had anything to do with the state education standards, usually with the kind of pinched face you’d make when finding six-month-old leftovers in the back of your fridge. University faculty tend to be dismissive of, unhappy with, or skeptical about the state-mandated education standards. Learning objectives, on the other hand, are something every educator is aware of, and the term doesn’t carry the same stigma. I initially called my implementation Learning Objectives Based Grading (LOBG), but several of my friends made fun of how “LOB-G” sounded (I know peer pressure is a silly reason to change, but I wanted to be taken seriously. Sigh).

A lesser reason for using a different name than SBG was that I didn’t feel right speaking for the whole SBG community. I was new to this type of grading paradigm and I felt better talking about my particular implementation rather than seeming to speak for all SBG-ers. I realize that it might limit my reach, since people will be searching the internet for “SBG” and not “LOBA”, but hopefully people will eventually start to equate LOBA with a type of SBG. It may turn out that not using the SBG moniker is a mistake, and I may have to start referring to what I do as SBG at some point in the future.

Incidentally, according to the literature^1, what we have been calling standards-based grading is actually standards-referenced grading. In true standards-based grading, students do not progress to new material until they have successfully demonstrated proficiency on a particular standard. In standards-referenced grading, students’ performance is reported (or referenced) against the standards, but students are allowed to proceed to the next level regardless.

1 – Marzano, R. J. (2010). Formative Assessment & Standards-Based Grading. Solution Tree, p. 18.

Posted in Teaching Physics | Tagged , , , , | Leave a comment

Light Doesn’t Stop – “Stopped Light and Image Storage by Electromagnetically Induced Transparency up to the Regime of One Minute”

I still remember the group meeting in grad school when someone brought up a news article stating that a researcher had slowed light down to the speed of a bicycle. We got a good laugh at the inaccuracies and clear misunderstandings in the news article. It was a little disturbing, then, to see similar articles pop up all over the web. The main problem with the articles was that they claimed scientists were slowing light down, when in fact they were slowing light pulses down. What’s the difference, you ask? Stick around and I’ll show you. When I recently saw a news article claiming that scientists had brought light to a complete stop, I felt the veins in my head start to throb again, and I couldn’t let it go without saying something. Hence this post.

So what did they actually do^1? They were able to store the quantum properties of a light pulse for up to a minute and then retrieve the information in a new pulse that was identical to the original. Storing the pulse and then maintaining the integrity of the information was the exciting aspect of this experiment.

How do you go about storing the information from a pulse of light? You first need to compress the pulse so it is small enough to fit inside your apparatus. Since light moves so fast (3\times10^8 \frac{m}{s}), a pulse lasting just a few microseconds stretches out over a kilometer! It is rather difficult fitting that into a lab room, let alone the small crystal used for storage. The way you spatially compress the pulse is to slow it down as it enters the medium, in this case a yttrium silicate crystal doped with praseodymium (I’m not even sure where that is on the periodic table). Using something called electromagnetically induced transparency (EIT), the group velocity of the light slows down, so that as the light pulse hits the crystal it is just like traffic hitting construction: the kilometer-long pulse piles up into a centimeter-long pulse. It is important to note that it is the group velocity that slows down to around 1 km/s and not the phase velocity. What is the difference? I’m glad you asked, because this is at the heart of my complaint about the claims of “slowing light down to the speed of a bicycle”.
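(For the curious, that kilometer figure is just my own back-of-the-envelope arithmetic, not a number from the paper: the length of a pulse in vacuum is L = c\,\tau \approx \left(3\times10^{8}\ \frac{m}{s}\right)\left(3\times10^{-6}\ s\right) \approx 1\ km.)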

Phase Velocity and Group Velocity

Classically speaking, light is an electromagnetic wave, which means that the magnitudes of the electric and magnetic fields vary sinusoidally. The phase velocity is how fast a particular peak or valley of the light wave moves through space.

Figure showing group velocity and phase velocity. Red dots move with phase velocity (following a peak of the wave) and green dots move with group velocity (following the nodes between pulses)

The red dots in the image above are moving with the phase velocity of the waves. However, to have a wave pulse, you must combine many waves of different frequencies. When you combine multiple frequencies you end up with interference between the waves that results in a phenomenon called beating, which yields a slowly varying envelope superimposed on a rapidly varying wave. In the figure above you can see that the green dots sit between the larger envelope pulses and that the green dots travel slower than the red dots (notice how the red dots catch up to the green dots and pass them). This happens because, in some media, the phase velocity of light depends on the frequency. Introducing a very slight frequency dependence in the phase velocity can result in a huge change in the group velocity. The light pulses in the experiment are the envelopes you see in the figure and move at the group velocity. So the key is to find a medium where the phase velocity changes rapidly with the light frequency, so that the group velocity of the pulse can be very slow. This is where electromagnetically induced transparency (EIT) comes in.
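For the mathematically inclined, the standard textbook definitions (not anything specific to this paper) are

v_{phase} = \frac{\omega}{k} = \frac{c}{n(\omega)} \qquad \text{and} \qquad v_{group} = \frac{d\omega}{dk} = \frac{c}{n(\omega) + \omega\frac{dn}{d\omega}},

so even when n(\omega) itself stays close to 1, a steep slope \frac{dn}{d\omega} makes the denominator huge and drives the group velocity far below c.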

EIT

Atomic level configurations for electromagnetically induced transparency. I will only talk about the Lambda configuration

This image shows a couple of different energy level configurations, but I’ll focus on the lambda configuration (so named because it is shaped like the Greek letter lambda, \Lambda). An energy level is a possible state that an electron bound to an atom can occupy. Electrons orbiting closer to the nucleus of the atom have lower energies than electrons in orbitals farther away. On this diagram, states with higher energies sit above states with lower energies, and the vertical distance between states represents the energy it would take to go from one state to the other (don’t worry about what the horizontal displacement means; all we care about for now is the vertical arrangement). If an atom is in one energy level, it can be encouraged to transition to a different energy level by absorbing or emitting light. The experimenters used laser light to encourage transitions between different energy levels.

The medium you shine your light through (in this case the praseodymium-doped yttrium silicate crystal) consists of millions and millions of atoms with similar energy level configurations. This means that if you shine a light pulse through with an energy (which depends on frequency) corresponding to any of the transitions between two levels (say between the |1> and |3> states), the light pulse is going to get absorbed by the atoms and not make its way through. However, if you shine a strong control laser (shown in red in the figure, with frequency \omega_c) and a weaker probe beam (shown by the black arrow, with frequency \omega_p), and you tweak the lasers just right, you can get interference between the states that prevents light from being absorbed on the transition between states |1> and |3> (let’s call that frequency \omega_x). This is EIT; the medium is now transparent to light with an energy corresponding to \omega_x because that light is no longer absorbed. One interesting result of EIT is that, right around the frequency \omega_x, the phase velocity of light varies significantly with frequency.

Blue curve is index of refraction vs. frequency. Grey curve is absorption of light vs frequency

The blue line corresponds to the phase velocity (actually the index of refraction) and the horizontal axis is frequency. The grey curve shows the absorption of light, so you can see the transparency window (the dip in the curve) where the light is no longer absorbed. Since the phase velocity varies sharply across that window, any pulse of light with a frequency near the transparency window will have a very slow group velocity. By tweaking the properties of the system, the experimenters were able to get light pulses moving at about 1 km/s in this experiment (in other experiments, researchers have achieved speeds of 17 m/s in an ultracold atomic gas^2).
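To put some numbers on that (my own back-of-the-envelope estimates, not figures from the paper): a group velocity of 1 km/s corresponds to a group index of

n_g = n + \omega\frac{dn}{d\omega} = \frac{c}{v_g} \approx \frac{3\times10^{8}\ \frac{m}{s}}{10^{3}\ \frac{m}{s}} = 3\times10^{5},

which is enormous compared to the ordinary refractive index of a crystal (a few, at most); essentially all of it comes from the steep \frac{dn}{d\omega} inside the transparency window. And since the pulse compresses by the ratio \frac{v_g}{c} \approx 3\times10^{-6}, the kilometer-long pulse from earlier squeezes down to a few millimeters, small enough to fit inside the crystal.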

Storing the Pulse

Some of the popular articles make it sound like the experimenters slowed the light down enough to stop it, but that isn’t how it works. While the pulse is moving much slower than it did in air, it is by no means standing still. What you need to do is get the medium to absorb the photons, but in a way that preserves the quantum properties of the light. This is done by switching off the control laser beam, which means that the medium is again able to absorb light with energies corresponding to the |1> to |3> transition. As the control beam is turned off, the formerly transparent transition can now absorb light, and the quantum properties of the light are stored as spin excitations of the atoms. The important point here is that the energy from the electromagnetic field making up the light pulse is transferred to the atoms: the light gets absorbed and there is no more light pulse. Now comes the hard part, making sure that the stored information is not lost, and this is where this experiment shines. By applying various magnetic fields and a microwave pulse, the experimenters were able to keep the atoms isolated from the effects that cause decoherence (basically, things get bumped and jiggled enough that the information is lost). What is even more amazing is that they used an adaptive algorithm to tweak the parameters of the magnetic fields and microwave pulse to maximize the storage lifetime. They did such a good job that they got close to the theoretical maximum storage time for this particular type of crystal, and the technique can now be applied to other materials that might have longer possible storage times.

So why is this experiment exciting?

Overall, the experiment is about storing and retrieving the quantum mechanical properties of a light pulse. This has applications to quantum computing and quantum cryptography, which rely on quantum entanglement. The big deal here is that they managed to store light pulses in a solid-state device, a doped silicate crystal. Most of the previous storage experiments had focused on using cold gases, but a lot of people feel that solid-state devices will be needed to scale up to larger projects involving hundreds or even thousands of light pulses. Solid-state devices are more difficult because of the strong interaction between atoms of the crystal, which makes it harder to isolate the atoms that are used to store the pulse.

1 – Georg Heinze, Christian Hubrich, and Thomas Halfmann (2013). Stopped Light and Image Storage by Electromagnetically Induced Transparency up to the Regime of One Minute. Physical Review Letters. DOI: 10.1103/PhysRevLett.111.033601. Note: The article is behind a paywall, but you can read a summary at http://physics.aps.org/articles/v6/80?from_TRM_site=Yttrium

2 – Hau, L. V., Harris, S. E., Dutton, Z., & Behroozi, C. H. (1999). Light speed reduction to 17 metres per second in an ultracold atomic gas. Nature, 397(6720), 594–598. http://www.nature.com/nature/journal/v397/n6720/abs/397594a0.html Sorry, this one is behind a paywall too.

 

Edit: I found a nice article by Chad Orzel talking about slowing light.  Check it out:  http://scienceblogs.com/principles/2010/01/05/controlling-light-with-light/

Posted in Teaching Physics | Tagged , | 2 Comments

LOBA Gradebooks

I was looking back at my previous posts and noticed I hadn’t talked explicitly about how I keep track of grades for assessments, only about how I use Python and Excel to make my life easier.  I used to include learning objective numbers next to each problem so I could write scores directly on the sheet, but I found that was a problem because (1) I’d have to go back through each assessment and copy the scores to my grade book, and (2) students would learn which learning objective corresponded to each problem, which made it much easier for them to solve the problems since the objective number tipped them off about which skill to use.

Learning objectives listed next to the assessment problems.

My solution was to remove any reference to the learning objectives from the assessment and to enter scores directly into Excel.  Each assessment has its own workbook and each student has a single worksheet within that workbook.  As I grade each assessment, I type the scores directly into the Excel workbook for each student.  Any comments that require me to highlight a part of their work are written on the original assessment, but general comments about their work are put into the Excel spreadsheet next to their scores.  Additionally, if there are comments that apply to the whole class, I have Excel cells linked to a master spreadsheet that allow me to type those comments once and have them appear on each student’s grade sheet.  Once I’ve graded all of the assessments I print out the Excel workbook and staple the grade sheets to each assessment.

 

Front page of student grade sheet (top half of Excel worksheet)

Back of student grade sheet (bottom half of Excel worksheet)

The benefits of this system are that I’ve reduced the number of grading errors to almost zero, that I now have a record of what each student got on each assessment, and that I have a list of the comments I made to the students to reference later.

My only concern with this setup is that having the scores on a separate page from the problem may be an additional barrier to students.  Based on anecdotal evidence this doesn’t seem to be an issue (the second I hand back assessments I see students flipping through the pages to see which problems they got proficient marks for) but I must remind myself that the plural of “anecdote” is not “data”.

The other component to my grading system is using Python along with Excel to copy all of the scores from the student worksheets to a master Excel spreadsheet (see the previously mentioned post on using Excel and Python together).  If anyone needs help setting this up I am more than happy to help troubleshoot.
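If you would rather roll your own version of that copying step, here is a bare-bones Python sketch using the openpyxl library. The file names, sheet names, and cell ranges are placeholders rather than my actual setup (I run a Python plug-in inside Excel), so adjust them to match your own workbooks.

```python
from openpyxl import load_workbook

# Placeholder file names; substitute your own assessment workbook and master gradebook.
assessment = load_workbook("assessment_03.xlsx")
master = load_workbook("master_gradebook.xlsx")
summary = master["Scores"]

for row, sheet_name in enumerate(assessment.sheetnames, start=2):
    student_sheet = assessment[sheet_name]          # one worksheet per student
    summary.cell(row=row, column=1, value=sheet_name)
    # Assume the objective scores live in cells B2:B11 of each student worksheet.
    for col, cells in enumerate(student_sheet["B2":"B11"], start=2):
        summary.cell(row=row, column=col, value=cells[0].value)

master.save("master_gradebook.xlsx")
```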

I’ve attached a copy of a blank assessment workbook, which you are more than welcome to copy and use.  Let me know if you try it out:  Sample Blank Assessment Gradebook

Posted in Teaching Physics | Tagged , , , , , | Leave a comment

Failure is Always an Option

“Failure” is a bad word in our culture, but every teacher knows that students learn by making mistakes and learning from those mistakes. It’s the reason that students don’t learn much from watching us solve a problem correctly on the board – the most important skill is not just knowing how to solve a problem correctly, it’s also recognizing the ways NOT to solve it. Unfortunately, we as educators tend to perpetuate this fear of failing by deducting points when students get wrong answers.

How many of you have seen a student unwilling to even start a problem unless they already know how to solve it? I know I struggle to get students to start a problem by drawing a sketch and listing knowns when they don’t yet have some idea how to reach the end of the problem.  I think we need to teach our students that failing is part of learning and that what really matters is what they do when they fail.

Let me preface this by saying it applies to college education and not necessarily to K-12.  I think we need to let students know it is okay to fail a class.  I admit that it can delay graduation and make them run up more loans, but it is better to learn how to fail in college than to wait until they are on the job.  I should mention that every single student I’ve had who failed a LOBA class and who came back for a second try has taken the class seriously, worked hard, and passed the course.  Of course this is still a small sample of students, but it is the lesson that I want them to learn.  There are some students who haven’t (yet) retaken the course after failing.

I used to feel horrible giving a student an F.  I was afraid students would feel like failures and I didn’t want to do anything to discourage them.  I’m shifting my perspective and trying to convince my students and my colleagues that an F isn’t a Bad Thing; it is just an assessment of how well a student mastered the material in my course.  An F doesn’t mean a student is stupid or unable to learn anything, only that they didn’t master the material this time.

It is okay to fail.  There should be no stigma associated with failing.  The only Bad Thing is failing and then not taking advantage of the experience to learn.  Encourage your students to fail and learn from it.

Posted in Teaching Physics | Tagged | 2 Comments

Looking Back At LOBA – Fall 2012

With the new term about to start I’m spending time reflecting on what worked, what didn’t, and what changes I need to make to my grading scheme. I tend to focus on all the things that didn’t work, so it’s good for me to remind myself how well things actually do work out. In addition to talking about how things went in my class I’ll also mention what some of my colleagues have experienced. Currently there are seven other physics faculty at UW-Stout using LOBA besides myself. We’ve been meeting every other week, along with a mathematics faculty member and a faculty member from apparel design who will implement LOBA this coming term. It is pretty exciting that so many people saw what I did with my grading and wanted to jump on board.  My goal is to compile data from all the students and faculty involved to provide guidance for future adopters.

What Didn’t Work

  • The biggest issue I ran into was students putting off reassessment until later in the term. I had a policy requiring them to reassess at least once within three weeks of the second in-class assessment on a particular chapter, but I didn’t have a good system for keeping track of who did what when, so I wasn’t diligent about bugging students to reassess. I also have a philosophical qualm about hounding students to reassess because I want them to take responsibility for their learning. What I’ve come to decide is that LOBA is so different from the grading scheme they are used to that they need the extra push and external accountability provided by me, at least in the beginning. This semester I’m going to (1) shorten the window from three weeks down to two, (2) require that they reassess at least once within the two-week window after each in-class assessment (and not just after the second in-class assessment for each chapter), and (3) reduce their final grade by 1/6 of a letter grade for each time they miss the two-week window. Since letter grades come in steps of 1/3 (A, A-, B+, and so on), a 1/6 reduction means their grade will only drop a notch every other time they miss reassessing, so the consequences will be low enough not to discourage students who have trouble getting organized. Now I just need to figure out how to track the reassessment dates and automatically determine when students have missed the deadlines (I know I won’t do it by hand; a rough sketch of what such a script might look like appears after this list). I should note that students can continue to reassess after the two-week window as much as they need to; the only thing I’m trying to accomplish is to get students to stop procrastinating. Two of the other physics instructors actually set hard deadlines for reassessment (I think it was three weeks in both cases), and students who didn’t reassess within that time frame couldn’t reassess on the material. The results were very different. For one instructor, students managed to meet the deadlines and ended up doing fairly well in the course. The other instructor had many students miss the deadlines, and quite a few ended up dropping the course or receiving a failing grade. It isn’t clear what the difference was between the two classes, but we’ll be making tweaks and looking closely at this problem this coming term.
  • The second issue (and I’m not sure it is an issue) is that students weren’t studying before the in-class assessments but would wait for the reassessments to start studying in earnest. It’s my belief that students should be given the opportunity to manage their time how they want and to regulate their own learning, but the end result is that I end up having to write and grade more reassessments than if they had studied and demonstrated proficiency on the in-class assessments up front. Some of my colleagues do not share my hands-off approach and feel we should figure out how to get students to study for the in-class assessments. We haven’t come up with any stellar ideas on how to do this, short of reducing the reassessment opportunities, which removes one of the key benefits of LOBA – that students should be allowed to learn from their mistakes without fear of penalty.
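Here is the kind of thing I have in mind for tracking missed reassessment windows: a rough Python sketch that works from a CSV export of assessment and reassessment dates. The file name, column names, and penalty bookkeeping are placeholders, not a finished tool.

```python
import csv
from datetime import datetime, timedelta

WINDOW = timedelta(weeks=2)  # reassessment window after each in-class assessment

def count_missed_windows(path):
    """Count, per student, how many two-week reassessment windows were missed.

    Expects a CSV with columns: student, assessment_date, reassessment_date
    (reassessment_date left blank if the student never reassessed).
    """
    missed = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            student = row["student"]
            assessed = datetime.strptime(row["assessment_date"], "%Y-%m-%d")
            reassessed = row["reassessment_date"].strip()
            on_time = (
                reassessed
                and datetime.strptime(reassessed, "%Y-%m-%d") <= assessed + WINDOW
            )
            missed.setdefault(student, 0)
            if not on_time:
                missed[student] += 1
    return missed

# Each missed window costs 1/6 of a letter grade.
for student, misses in count_missed_windows("reassessments.csv").items():
    print(f"{student}: {misses} missed window(s), penalty {misses / 6:.2f} letter grades")
```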

What Did Work

  • I was very happy with my (mostly) automated grade book and had almost no grading errors this term. Inputting all the scores directly into Excel and using the Python plug-in has helped me stay organized. My only complaint is that the Python plug-in running in Excel tends to be slow for larger assessments.
  • I continue to require students to sign up for reassessment using a Google form, but the big change last term was that I managed to keep track of which version of each reassessment students took. It was easy enough to sort the Google spreadsheet to see which version a student had already taken, so I never gave them a duplicate version. It also cut down on the amount of printing I needed to do since I knew which versions I needed, thereby saving paper.
  • Dividing the learning objectives up into A-level and C-level learning objectives has worked pretty well. I think all of my colleagues have done something similar (one instructor actually has D-level and B-level learning objectives too) and it does a good job of ensuring students focus on the basic C-level material. The one snag we’ve run into is that a few students think A-levels should count as C-levels. I can understand students being disappointed when the A-levels they have completed don’t affect their grade until they’ve completed all the C-levels, but the goal of LOBA is to make sure they’ve mastered the basics first.
  • The positive student feedback is very encouraging and shows that, for these students, the grading system is doing what we want it to do – getting them to focus on practicing and mastering the material rather than worrying about how many points they need. Although we do get negative student feedback, most of it is a result of students not taking responsibility for their own learning (the rest of the negative comments typically result from a failure to clearly communicate how the grading system works, which is something we are still trying to perfect).

In Summary

I’m very happy with where LOBA is and how it has grown over the past three semesters. Next semester we’ll have seven different courses taught by eight different instructors, all using LOBA. When I developed LOBA a year and a half ago I didn’t imagine that it would grow so quickly. There are still a few hurdles (like how to introduce students to the grading system) and philosophical issues to discuss (is it right to force students to study before the in-class assessments?), but I think things are going strong.

Posted in Teaching Physics | Tagged , , , , | Leave a comment