Exploring data risks
All our online actions leave behind a footprint. I developed a tool to help people understand the risks of sharing personal data online. Supported by the EPSRC.
Context
Our personal data footprint grows constantly and remains beyond our control or understanding. This can have far-reaching consequences.
Project overview
- Designed an online activity to help people explore the risks around sharing personal data online
- Facilitated workshops with participants from a wide range of ages and backgrounds
- Thematically coded workshop transcripts
- Delivered presentations, progress updates, and next-step planning for stakeholders
- Published papers on human-data interaction at CHI Yokohama and contributed to papers written by project colleagues
My role
I created an online game to help people better understand and explore these risks.
A persona-based game
There were obvious ethical issues with asking participants to share their own online histories. Instead, I created a fictional character named “Alex Smith” and imagined Alex’s online footprint over a single day.
Participants explored ‘packs’ of data from different sources - beginning with Alex’s own social media posts, then choosing among other sources including location tracking services, Alex’s friends and family, and biometric data.
As participants accumulated more knowledge about Alex, seemingly innocuous data sources could be combined to reveal much more than Alex intended - including their home address, political associations, finances, health, and relationships.
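The mechanic described above - innocuous packs combining into sensitive inferences - can be illustrated with a minimal sketch. This is not the actual game code; the pack names, observations, and inference rules are entirely hypothetical:

```python
# Illustrative sketch of the pack-combination mechanic (hypothetical data):
# each "pack" is a set of observations about the fictional persona, and each
# rule models how opening certain packs together reveals something sensitive.

# Hypothetical packs for the "Alex Smith" persona
PACKS = {
    "social_media": {"gym selfie", "birthday post"},
    "location": {"regular 7am route", "weekend check-ins"},
    "friends_family": {"tagged family photo"},
    "biometric": {"resting heart rate log"},
}

# If a participant has opened all the packs on the left,
# the game can surface the inference on the right.
INFERENCE_RULES = [
    ({"social_media", "location"}, "home address"),
    ({"location", "friends_family"}, "family relationships"),
    ({"social_media", "biometric"}, "health condition"),
]

def inferences(opened_packs):
    """Return what can be inferred from the packs opened so far."""
    opened = set(opened_packs)
    return [fact for required, fact in INFERENCE_RULES if required <= opened]

print(inferences(["social_media"]))              # one pack alone reveals nothing sensitive
print(inferences(["social_media", "location"]))  # the combination reveals more
```

The point of the design is in the rules, not the packs: each source looks harmless on its own, and the reveal only happens once participants have accumulated enough of Alex’s footprint.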
Adding ambiguity to the game
Insights
Participant feedback was excellent. They found the game highly engaging and felt much more informed about online risks after playing. Each session was scheduled for one hour but typically ran far longer, with participants asking to see every data source and even returning to continue the game after the session. This may have been as much a result of social distancing as of the depth of the game itself.
I coded these lengthy workshop transcripts and shared them with the rest of the project team. There were several significant themes.
Impact
Before starting this research, we expected that people wanted some kind of tool that would let them scrub their data footprint entirely. This research showed that people didn’t necessarily want to opt out in this way, but instead wanted to be better informed about risks and to have more agile tools at their disposal. A digital privacy companion, for example, could assess personal, reputational, and social risks and offer ways to mitigate them before any data is shared. This approach could help people make more nuanced decisions about what to share and in which contexts to do so.
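The "privacy companion" idea above can be sketched as a pre-share risk check. This is a hypothetical illustration only - the risk categories come from the research, but the keyword rules and function names are invented for the example:

```python
# Minimal sketch of a pre-share risk check, assuming a simple keyword rule set.
# A real companion would need far richer models of personal, reputational,
# and social risk than keyword matching.

RISK_RULES = {
    "personal": ["address", "phone", "birthday"],
    "reputational": ["drunk", "rant"],
    "social": ["tagged", "with friends"],
}

def assess_post(text):
    """Flag risk categories a draft post might raise, before it is shared."""
    lowered = text.lower()
    flags = {category for category, words in RISK_RULES.items()
             if any(word in lowered for word in words)}
    hint = ("Consider removing identifying details before sharing."
            if flags else "No obvious risks found.")
    return flags, hint

flags, hint = assess_post("Birthday drinks at my new address tonight!")
print(sorted(flags), hint)
```

The key design choice, reflecting the research findings, is that the check runs *before* sharing and offers mitigation options, rather than scrubbing data after the fact.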
Limitations
- Including multiple voices and more content. As I designed the game, it inevitably reflected my own views towards social media and online information sharing. Having multiple authors could mitigate this and widen the scope of the game to address more data-sharing scenarios.
- Allowing people to play the game however they want. More insights might be gained if the game were unmoderated, multi-player, or physical instead of virtual.
- Spending more time considering future risks, such as biometric data leaks. Although I included these to a small extent, the scope of the project required greater focus on immediate risks. There is clearly scope for a speculative design project to help us better understand the potential issues around control and ownership of our biometric data.
Links
- Everyday digital traces (contributor), Big Data and Society
- Data, socially distanced: cumulative data disclosure and data narratives during Covid (contributor), BILETA, Newcastle upon Tyne
- Four Speculative Design F(r)ictions: Designing for Personal Data Awareness, CHI Yokohama
- Where is the Human in HDI?, CHI Yokohama