
EBoard 14: Wrap-Up

This class will be recorded! Its use will be limited to members of the class. Please do not share with others.

Approximate overview

  • Preliminaries (3:00–3:10)
    • Administrative stuff
    • Q&A
    • Instructions for today
  • Heuristic group instructions (3:10–3:30)
  • Heuristic reports (3:30–3:40)
  • Heuristic reading (3:40–3:50)
  • Break (3:50–3:55)
  • Heuristic debrief (3:55–4:15)
  • On course evaluations (4:15–4:20)
  • Course evaluation time (4:20–4:35)
    • Please take the evaluation seriously
  • Discussion (4:35–4:45)
  • Wrapup (4:45–4:50)

Administrative stuff

General Notes

  • Happy Monday! Happy last day of class!
  • Last day of adjectives! This time for the term. Weird. Meaningful. Sad. Nostalgic [+1]. Fun. Eventful [+1]. Calm.
    Painful [+2]. Intermediate. Fast [+2]. Tiring. Magical. Busy. Rough. Long. Cool. Survival.
  • Congrats to Baseball for two more wins on Saturday and two on Sunday. That appears to make them the top team in MWC South going into the playoff!
  • Today’s UI blooper
  • Grading is taking me surprisingly long. I’ve sent out grade reports with 90 on investigations 1–3. You won’t get anything lower than that (even if your actual score is lower). I’ll keep working on it.
  • We won’t discuss the Fitts paper. I just thought it would be useful for you to think about broader notions of usability.
  • “Ghoti” (which, by these rules, spells “fish”)
    • gh as in rouGH
    • o as in wOmen
    • ti as in acTIon, moTIon

Upcoming Activities

  • Recording of Teach-in on Myanmar https://www.youtube.com/watch?v=FPvOBSU5-rM
  • Thursday CS Extra: A constrained K-Means clustering algorithm for improving the energy efficiency of many-core systems. In the CS Events channel.

Work for Thursday the 20th

Q&A

Heuristic group instructions (3:15–3:??)

Meet in Task 14 Channel, Post in Investigation 7

  1. Combine your spreadsheets.
  2. Talk through the problems. When multiple problems overlap, combine them.
  3. Make sure you understand problems and agree on severity.
  4. Post the new spreadsheet to your group on the Investigation 7 channel. (It’s not being scored as an Investigation.)
  5. Review the usability report form posted in the Investigation 7 channel.
  6. Identify the most severe usability problems. Distribute one to each group member. Make sure that each member feels like they have enough information to write and post a report.
  7. Each member writes a report and is ready to report back to the class.

We are using a form by Brad Myers and Bonnie John of CMU.

  • Name
  • Evidence
  • Explanation
  • Severity
    • Rating
    • Justification
  • Possible solution and/or trade-offs
  • Relationships
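
If it helps to see how those fields fit together, here is a minimal sketch of one report as a structured record. This is Python for illustration only; the field names and the 1–4 severity scale are assumptions (the debrief below mentions ratings of 1 through 4), not part of the Myers/John form itself.

    from dataclasses import dataclass

    @dataclass
    class UsabilityReport:
        """One usability problem, following the fields of the Myers/John form."""
        name: str                    # short name for the problem
        evidence: str                # what you observed and how to reproduce it
        explanation: str             # which heuristic is violated, and why
        severity_rating: int         # assumed scale: 1 (cosmetic) to 4 (catastrophic)
        severity_justification: str  # why you chose that rating
        possible_solution: str       # proposed fix and/or trade-offs
        relationships: str           # overlaps with other reported problems

    # A hypothetical example, echoing the search-bar discussion below:
    report = UsabilityReport(
        name="Inconsistent search bar",
        evidence="Search behaves differently on the home page than on course pages.",
        explanation="Violates the consistency-and-standards heuristic.",
        severity_rating=3,
        severity_justification="Search is a core task; users must relearn it per page.",
        possible_solution="Share one search component; trade-off: migration effort.",
        relationships="Overlaps with the flexibility observations about search.",
    )
    print(report.name, "- severity", report.severity_rating)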

Heuristic reports (3:50–4:00)

Time to work on your reports.

Heuristic reading (SKIP)

Read the reports from successive groups (A reads B/C/D/E, B reads C/D/E/A, C reads D/E/A/B, D reads E/A/B/C, E reads A/B/C/D)

Break (4:00–4:05)

Heuristic debrief (4:05–4:25)

What was the experience of observing the site and filling out the spreadsheet like?

  • Without having a particular task, you end up exploring randomly. Having tasks seems like a better thing.
    • Sam says: As a usability consultant, it’s your job to tell the client what to fix; it’s not their job to design tasks for you.
  • We rarely focus on Web sites so carefully. We generally say “I’m annoyed” and then move on.
  • It would be helpful to have a strategy. Should you start with a heuristic and look for everything with that or go through the site and observe things as you go?
  • SmartEvals may not have been the best choice, since you can’t go back to edit or change answers. (And some of you may not have access.)
  • It doesn’t take too long to discover simple errors, but deeper errors are harder to discover.
  • Some sites are harder to discover errors on.

What did you discover in talking to others?

  • Some elements had a lot of heuristic problems. For example, the search bar: it’s inconsistent, but it also emphasized flexibility.
  • In some cases, people had very similar issues.
  • For really bad software, there can be underlying design issues as well as micro-level issues.
  • Our exploration of a site we know is colored by past experiences.
  • There can be very different interpretations of severity. Some rate something as a 2 or 3 that others rate as a 1.
  • On some aesthetic issues, ratings can be very consistent. (Or maybe that’s just bad software, where everyone can agree it’s bad.)
  • Most web sites are really annoying.

What did you discover in deciding on the most important issues?

  • “Sort by severity.” The problems rated 4 were the most important. They were also the only ones with consensus.
  • Different clients will respond to some errors differently.

Other issues

  • Make sure you have the details to reproduce an error!
  • Screen shots are nice.

On course evaluations (4:25–4:30)

Sam rambles about the role of course evaluations.

  • Twenty-five years ago, the Trustees agreed to larger raises as long as the faculty had a way to determine “merit” for each faculty member.
  • Different departments designed different forms.
    • Math/CS had only one quantitative component: “How many hours a week did you work on this class?”
    • Different forms make comparisons for merit hard.
  • We moved to a common form.
  • Faculty, not trusting others with statistics, voted not to allow them to be used for salaries. Merit salaries became strange.
  • About three years ago, people (at least people at Grinnell) finally realized how biased EOCEs (end-of-course evaluations) are.
  • There is an ongoing project to understand bias in EOCEs and to think about how to use EOCEs in faculty evaluations.
    • Primary focus: “Growth”
  • Whether or not we should, we still use them for promotion and salary and such (sometimes well, sometimes not well).
  • Pay attention to implicit bias when filling out forms.

Course evaluation time (4:30–4:45)

Please take the evaluation seriously.

Wrapup (4:45–4:50)

I apologize if you’ve heard this before.

The last class is hard to design. How do you end?

  1. Acknowledge that the class is special.

  2. Acknowledge the relationships.

  3. Don’t forget to say goodbye.