Observing the user experience of software upgrades

How many times has your computer asked if you want to install an update, just for you to say, “Not today”? You’re not alone. Most people deal with these notifications far too often and, whether it’s a small update or a big upgrade, it’s common to just skip them. But why?

OS upgrades: who has time for them?

In this study, we wanted to understand how regular people experience operating system upgrades: what are the pain points of the installation process and what happens after the upgrade?

We ran a qualitative study, observing 14 participants while they installed an operating system upgrade on one of their own devices. Then, we followed the participants for a month through an online diary study, asking them to report any changes they noticed.

We found that the upgrade installation process is frustrating, with poorly designed feedback. After the upgrade, participants noticed more negative than positive changes.

My role

  • I was the lead investigator on the study, which was part of my master's thesis and lasted 6 months.
  • I recruited participants, arranged all the logistics for the diary study, and conducted 13 of the 14 observations and interviews.
  • I analyzed the data, collaborating with the rest of the team.
  • I wrote the final paper based on the project and presented its results at CHI 2017.

Struggling to find participants

One of the most challenging aspects of the project was recruiting. I recruited for over three months, using typical methods: mailing lists, flyers, word of mouth. But several potential participants refused to take part because they didn’t want to upgrade their systems.

“No, thank you, I am not interested in the new version of Windows,” said one potential participant.

This was disappointing, but we did not give up and, eventually, we managed to get 14 participants. Once I ran the study, though, I began to understand why people are not enthusiastic about installing software upgrades.

Multiple methods for data collection

Because we wanted to build a complex understanding of the issue and go beyond the time-limited experience of the upgrade itself, we decided to use multiple methods to collect data.

We focused the field study on asking participants to upgrade the operating system on one of their own devices, observing as the upgrade installation took place. Then, we followed the aftermath in a four-week diary study.

During the diary study, we also conducted two additional interviews: a short check-in halfway through and a slightly longer interview to conclude the study and reflect back on the upgrade experience.

  • Day 1: OS upgrade. Upgrade observation + contextual interview.
  • Day 2: Diary study starts. From this day until the end of the study, participants complete a daily diary entry describing changes they notice in their devices.
  • Day 14: Mid-study interview. We meet again with participants to check on their progress with the diary and do a short interview.
  • Day 28: Final interview. The diary study is over. We meet with participants for a final debriefing interview.

In addition to the field study, we also ran an online survey with 65 participants. We wanted to triangulate different types of data and compare the self-reported experience of upgrading with statistics about actual installations. So, we asked respondents to upload a file from their macOS systems containing installation dates for recent operating system versions, and we calculated how long they had waited to install each one. On average, participants delayed major upgrades by 80 days, and the delay has been growing in recent years.

The average installation delay for major OS X releases.
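As a rough illustration of that calculation, here is a minimal Python sketch. It assumes the uploaded file is macOS's standard install log at /Library/Receipts/InstallHistory.plist; the two release dates and the matching by display name are simplifications for illustration, not the study's actual analysis code.

```python
import plistlib
from datetime import datetime

# Official release dates for two major OS X versions (examples only).
RELEASE_DATES = {
    "OS X Yosemite": datetime(2014, 10, 16),
    "OS X El Capitan": datetime(2015, 9, 30),
}

def upgrade_delays(plist_path):
    """Return days elapsed between each major release and its installation."""
    with open(plist_path, "rb") as f:
        records = plistlib.load(f)  # on macOS, an array of install records
    delays = {}
    for record in records:
        name = record.get("displayName", "")
        installed = record.get("date")  # plist <date> values parse to datetime
        for release, released_on in RELEASE_DATES.items():
            if release in name and installed is not None:
                delays[release] = (installed - released_on).days
    return delays

if __name__ == "__main__":
    print(upgrade_delays("/Library/Receipts/InstallHistory.plist"))
```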

Analyzing all the data

From the observations, diaries, and interviews, we had a lot of data to untangle. I decided to use thematic coding to make sense of it.

The observations were the hardest component to analyze because so much was going on in them. In a first draft, I organized the results by importance or severity, but I realized it felt disjointed and failed to communicate the participants' experience. Ultimately, I decided to go with a plain chronological organization of the results, with separate sections for before, during, and after the upgrade. Simple, but clear.

For the online diary, by contrast, I took a systematic approach, coding each reported change as positive, neutral, or negative. I wanted to capture the main things participants noticed so that we could prioritize the key steps to improve in the upgrade process as a whole.
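To give a flavor of how such coded entries can be tallied, here is a tiny Python sketch; the entries and labels are invented for illustration, not data from the study.

```python
from collections import Counter

# Hypothetical coded diary entries: (valence, type of change, impact).
coded_entries = [
    ("negative", "UI change", "high"),
    ("negative", "performance", "medium"),
    ("neutral", "UI change", "low"),
    ("positive", "new feature", "medium"),
]

# Tallying by valence shows at a glance where the pain points concentrate.
valence_counts = Counter(valence for valence, _, _ in coded_entries)
print(valence_counts.most_common())  # [('negative', 2), ('neutral', 1), ('positive', 1)]
```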

I observed 14 participants install an OS upgrade on one of their devices.

Observing the upgrade

The upgrade observations were the biggest component of the study in terms of my involvement. They took place in the participants' homes, their offices, or our office. I was there, observing every upgrade from start to finish. Sometimes for hours.

In fact, the biggest frustration participants had with the upgrade process was how long it took. That is not entirely surprising, but it was problematic for two reasons.

First, participants had to babysit their devices throughout the process, because at any moment they might need to answer a question, click a button, or provide some other kind of input.

Second, participants had expectations for how long the process would take based on previous experience, and they had scheduled a window of time accordingly. But the upgrade often exceeded that window by a long shot.

“I was supposed to meet a friend, I have a social life!” said one participant, exhausted.

What made the long duration feel worse was not knowing what was going on in the process. Feedback was inconsistent, vague, or completely uninformative. For example, on Apple devices the estimated remaining time jumped up and down for no apparent reason, and there was no indication, even approximate, of which step the process was in.

Participants reported being frustrated with the visual feedback during the upgrade process.

After the upgrade

After the upgrade, participants noticed a mix of changes, from new functions to changes in the user interface. But, according to our coding, the majority were negative, followed by neutral and positive ones. This might be due to people's tendency to notice negative things more than positive ones, or to a general aversion to change.

It also seems that big operating system upgrades don't necessarily result in noticeable or radical improvements. Security-related improvements, in particular, are hard to notice. This is not ideal, because upgrades are often about security, and yet people can't really feel the difference. So there's definitely room for improvement in the upgrade process.

I categorized changes from the diary study according to their valence, type, and impact.

How to improve upgrades

Two of the design implications we drew from the study address some of these issues.

For example, we recommend using those long idle moments in the process for something that makes good use of people's time: show them the interface changes beforehand, and let them try the new version and get familiar with it. Or, hey, even go crazy and show them movies, as one participant proposed.

Then, we recommend informing people about security improvements. Of course, this information would need to come at an appropriate time, but it could make people more aware of the hidden, positive changes that came with the upgrade.

Funnily enough, after this study I've started avoiding my own updates, anticipating the long wait times and the poor feedback. Things seem to be heading in a better direction, though. For example, Microsoft is going to use machine learning to detect appropriate moments for updates. Maybe in the future, upgrades will be so great that we look forward to them. In the meantime, “remind me tomorrow.”

*

In the paper we published at CHI 2017, you'll find more details about the project, as well as an additional set of interviews with three security managers, who gave us a different take on software upgrades.