Summary: Everyday technology users are expected to regularly take care of their devices (smartphones, computers, etc.) by installing operating system upgrades and updates. Yet few people install them regularly or see them positively. We wanted to better understand the user experience of an upgrade: what are the pain points of the installation process, and what costs and benefits do upgrades bring in the longer term? We conducted a qualitative study, observing 14 participants while they installed an operating system upgrade on one of their own devices. After the upgrade, we followed participants for a month through an online diary study in which they reported changes they noticed in their devices. We found that the upgrade installation process is frustrating, with poorly designed feedback. After the upgrade, participants noticed more negative than positive changes.

My role: I was the lead investigator on the study. I recruited participants, conducted 13 of the 14 observations and interviews (Aurélien Tabard did one), analyzed the data together with the rest of the team, and wrote the final paper based on the project.

Methods: Observations, interviews, diary study, thematic analysis, survey.

When & where: The study was part of my master’s thesis at Université Paris-Saclay. I worked on it for six months in 2016. My supervisors were Wendy Mackay and Joanna McGrenere. Aurélien Tabard and Michel Beaudouin-Lafon were also part of the research team.

Publication: Francesco Vitale, Joanna McGrenere, Aurélien Tabard, Michel Beaudouin-Lafon, Wendy Mackay. 2017. High Costs and Small Benefits: A Field Study of How Users Experience Operating System Upgrades. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17).

· · ·

Multiple methods to collect data

The goal of this project was to understand how regular people experience operating system upgrades. To gain a more complete understanding of the issue and go beyond the time-limited experience of the upgrade itself, we used a multi-method setup spanning one month. We started the study by asking participants to upgrade the operating system on one of their own devices. Participants were already aware of having one or more upgrades to install. I observed each upgrade installation as it took place, asking participants questions and taking notes on what was happening.

OS upgrade

Upgrade observation + contextual interview.

Diary study starts

From this day and until the end of the study, participants complete a daily diary entry.

Mid-study interview

We meet again with participants to check on their progress with the diary.

Final interview

The diary study is over. We meet with participants for a final debriefing interview.

After the upgrade, participants kept track of any changes in a daily online diary for one month. I used Typeform to build a short survey that took just a couple of minutes to answer. Then I used IFTTT to send daily email reminders: participants could report changes or say they had nothing to report if they didn’t notice anything interesting. Answering a daily diary for four weeks is a lot to ask of people, but overall the response rate was good, and the small number of questions helped keep the task reasonable. During the four weeks that followed the upgrade, I also did two additional interviews: a short check-in halfway through and a slightly longer interview to conclude the study and reflect back on the upgrade.
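In code terms, the daily reminder logic boils down to something very simple. The real setup was Typeform plus IFTTT, not a script; the sketch below (function name, dates, and form URL are all hypothetical) just shows the idea of a bounded, four-week daily prompt:

```python
from datetime import date

STUDY_LENGTH_DAYS = 28  # four weeks of diary entries

def reminder_message(start, today, form_url):
    """Return the daily reminder text, or None once the diary window is over."""
    day = (today - start).days + 1
    if day < 1 or day > STUDY_LENGTH_DAYS:
        return None  # study hasn't started yet, or the four weeks are up
    return (
        f"Diary study, day {day} of {STUDY_LENGTH_DAYS}: please report any "
        f"changes you noticed today (or say 'nothing to report'): {form_url}"
    )
```

Keeping the message short and linking straight to the form is what kept the daily task down to a couple of minutes.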

Struggling to find participants

“No, thank you, I am not interested in the new version of Windows,” said one potential participant.

One of the most challenging aspects of the project was recruiting. I recruited for over three months, using typical methods: mailing lists, flyers, word of mouth. But several potential participants refused to take part because they didn’t want to upgrade their systems. “No, thank you, I am not interested in the new version of Windows,” said one person when I approached them. This was disappointing, but we did not give up and eventually found 14 participants. Once I ran the study, though, I began to understand why people are not enthusiastic about installing a software upgrade.

Analyzing data

The multiple methods we used for data collection left me with a lot of data to untangle. I decided to use thematic coding to make sense of it. The observations were probably the hardest component to analyze and organize into a reasonable structure. The online diary was somewhat easier because I could be quite systematic about it, focusing on codes for positive, neutral, or negative changes. Overall, I wanted to give a sense of the main things participants noticed while avoiding an incoherent laundry list of problems. In a first iteration, I organized the analysis of the different study components according to importance or severity, but I realized it felt disjointed. Ultimately, I decided to go with a plain chronological organization, with separate perspectives for before, during, and after the upgrade. I am not sure this approach avoided the laundry-list issue, but it seemed the most practical.
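The valence coding of diary entries was simple enough to tally programmatically once each entry had a code attached. A minimal sketch of that tally (the entries below are made up for illustration, not our actual data):

```python
from collections import Counter

# Hypothetical diary entries; each gets a valence code
# (positive / neutral / negative) during thematic coding.
entries = [
    {"participant": "P1", "code": "negative", "note": "battery drains faster"},
    {"participant": "P1", "code": "neutral",  "note": "settings icons moved around"},
    {"participant": "P2", "code": "positive", "note": "new multitasking gesture"},
    {"participant": "P3", "code": "negative", "note": "printer stopped working"},
]

# Tally how many reported changes fall under each valence code
counts = Counter(entry["code"] for entry in entries)
```

A tally like this is what let us say which valence dominated, while the qualitative themes came from reading the notes themselves.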

Observing the upgrade

Valerie Cherish from the TV show "The Comeback" saying: "wow, this is boring"
Installing an OS upgrade: it can get boring. (source)

The upgrade observations were the biggest component of the study in terms of involvement. They took place in participants’ homes, their offices, or our office. I was there, observing all 14 upgrades from start to finish. Sometimes for hours. There were definitely interesting things going on, but also several dead moments. One participant decided to cook dinner a couple of hours into the upgrade. Another managed to tidy up the bedroom, go out to buy cigarettes, and wash the dishes piled in the sink, all while the upgrade was still ongoing.


“I was supposed to meet a friend, I have a social life!”

In fact, participants’ biggest frustration with the upgrade process was how long it took. That is not entirely surprising: upgrades are supposed to be long, right? It’s a fair assumption, but it was problematic for two reasons. One, participants had to babysit their devices throughout the process, since at any moment they might need to answer questions, click a button, or provide some other input. And you never know what the system is going to ask for: Windows devices were especially deceptive in nudging people to agree to unwanted features. Two, participants did have expectations for how long the process would take, based on previous experience, and had scheduled a window of time for it. But the upgrade often exceeded those expectations by a long shot. “I was supposed to meet a friend, I have a social life!” said one participant, exhausted.

One participant’s phone updating and showing minimal feedback.

What made the long duration feel worse was not knowing what was actually going on in the process. Feedback was inconsistent, vague, or completely uninformative. For example, on Apple devices the estimated remaining time would jump up and down for no apparent reason, and there was no indication, even approximate, of which step the process was in.

After the upgrade

After the upgrade, participants noticed a mix of changes, from new functions to changes in the user interface. According to our coding, the majority were negative, followed by neutral and positive. This might be because of people’s tendency to notice negative things more than positive ones, or their general aversion to change. But it also seems that big operating system upgrades don’t necessarily bring noticeable or radical improvements. Security-related improvements, in particular, are hard to notice. That is not ideal, because upgrades are often about security, and yet people can’t really feel the difference. So there’s definitely room for improvement in the upgrade process.


Two of the design implications we drew from the study try to address these issues. For example, we recommend using the long dead moments in the process to make good use of people’s time: show people the upcoming interface changes beforehand and let them try and get familiar with them. Or, hey, even go crazy and show them movies, as one participant proposed. We also recommend informing people of security improvements. Of course, this information would need to come at an appropriate time, but it could make people more aware of the hidden, positive changes that came with the upgrade.

Funnily enough, since this study I’ve started avoiding my own updates, anticipating the long wait times and the poor feedback. I know I should install them, but it never seems like a good time. That said, I do hope that this and other studies on the topic encourage tech companies to improve the upgrade process. Things seem to be moving in a positive direction: Microsoft is going to use machine learning to detect appropriate moments for updates. Maybe in the future we’ll have upgrades so great that we look forward to them. In the meantime, “remind me tomorrow.”

· · ·

You will find more details about the project and the two additional, smaller studies that were part of it (a survey and another set of interviews) in the paper we published at CHI 2017.