It’s a familiar frustration – why don’t people use the features innovators create? A 2019 Pendo report found that 80% of features in the average software product are rarely or never used, and that software companies had invested up to $29.5 billion in developing those little-used features.
There are a number of reasons why feature adoption rates are so low. Features may not be widely useful. They may not be easy to find. But they may also not be trusted, and that is the issue we will explore in this report.
The trust gap
In 2018, consumer organisation Which? published its policy paper ‘Ctrl, Alt or Delete? The Future of Consumer Data’, a survey of more than 2,000 adults designed to understand their attitudes to the way their data is used. The survey found that people generally fall into four categories.
Figure 1: Consumer Data segmentation attitudes
Which? found that:
“19% of the population are taking considerably more action than others to restrict what data can be observed about them and “dirtying” their data by putting incorrect information in forms and using separate email addresses for organisations they do not want to receive communications from.”
“24% of the population are characterised by how much more they take advantage of the shortcuts afforded to them online than others (for example saving their bank details in forms and logging into other services using their social media).”
Perhaps surprisingly, Which? noted that ‘there is a relative lack of a relationship between attitudes and behaviour.’ In other words – those choosing to lock down their data are not necessarily those who are considered ‘Anxious’ and those who use shortcuts are not necessarily ‘Liberal’.
Innovators who want to improve feature adoption should take this very seriously. User-centred privacy is an approach that organisations can apply to understand how different groups of users feel about features, address any concerns they may have, and recognise the potential impact of user behaviour such as providing dirty data.
What is user-centred privacy?
User-centred privacy is an approach to design that recognises that different individuals will be affected in different ways by the same processing activity. It is an approach that is at the heart of data protection laws such as the General Data Protection Regulation (GDPR).
The GDPR requires organisations to behave in certain ways when designing and improving processing activities. It requires them to carry out data protection impact assessments (DPIAs) to analyse the risks to individuals and to design controls to mitigate those risks. It gives individuals the right to explain the impact of a processing activity on them, and to prevent the processing taking place or have their data removed from a dataset. It requires organisations to provide user-friendly information to help users understand how and why the organisation wants to process their data, so they can make informed choices.
However, at present few organisations have a mature approach to this. Issues we see include treating DPIAs as a form to fill in just before launch, and after all the design decisions have already been taken. Data Protection Officers (DPOs) rarely have direct access to product and service users, with any focus groups and research typically managed by marketing or customer service teams. Privacy information is provided as a ‘one size fits all’ notice that is rarely drafted in the brand’s tone of voice and even more rarely takes into consideration the questions users might have about how their personal data is processed.
By contrast, applying a user-centred privacy approach means improving the project team’s understanding of the user’s hopes and fears in respect of the processing, ensuring that they are addressed throughout the build and communicating them in a way that is clear and builds trust. Doing this is not simply a more effective approach to compliance – it is also likely to improve feature adoption rates.
User-centred privacy in Ireland
Ireland is fast becoming a global hub for user-centred privacy research and development. Irish companies like Oblivious Software are driving privacy enhancing technologies (PETs) through initiatives like the UN Privacy-Preserving Technologies Task Team and the UN PET Lab. Meta (previously Facebook) is investing €500,000 with the Science Foundation Ireland to fund two initiatives looking at data sharing, privacy protection and governance.
These kinds of initiatives will enable Irish companies to take the lead in developing and using the technical solutions that will embed user-centred privacy practices into new innovations.
User-centred privacy in practice
Applying a user-centred privacy approach requires project teams to consider four pillars.
Figure 2: User-centred privacy
The goal for the user-centred privacy approach is to ensure that users and organisations understand the risks arising from the processing, the choices users want to make and the controls that need to be available to help them do that.
These pillars fall into two groups: information flows and operational delivery.
There are two core information flows required for user-centred privacy. First, user insights need to be received by the organisation. Project teams should consider how they can capture user insights. These may be surprising. At a recent Gemserv webinar discussing user-centred privacy in practice, Philippa Harvey of the Cabinet Office described how she sought user insights from participants at the COP26 climate change conference, to ensure individuals felt safe to participate in person. Harvey ran a series of Q&A sessions and uncovered a range of concerns – such as concerns that Covid tests amounted to providing DNA samples to the government – that were not anticipated by the project team. She was clear that, had she not taken such a structured approach to capturing user insights, participation rates would likely have been lower and therefore the impact of the conference could have been reduced.
The kinds of privacy enhancing technologies that are being developed in Ireland allow information to flow to different audiences at different levels of granularity. For example, they may allow data sets to be interrogated without the need for the underlying data to be surfaced, which in turn allows organisations to unlock the value and social benefits in personal data sets without compromising user trust and privacy.
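As an illustration only (the code below is not drawn from any of the PETs mentioned above), a minimal differentially private counting query conveys the general idea: an analyst learns an approximate aggregate while the underlying records stay hidden. The function name, data shape and epsilon value are all assumptions made for this sketch.

```python
import math
import random

def noisy_count(records, predicate, epsilon=1.0):
    """Differentially private count of records matching `predicate`.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so adding Laplace(1/epsilon) noise
    gives epsilon-differential privacy for this single query.
    """
    true_count = sum(1 for r in records if predicate(r))
    # Sample Laplace(0, 1/epsilon) noise via the inverse-CDF method.
    u = random.random() - 0.5
    scale = 1.0 / epsilon
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# The analyst sees only the noisy aggregate, never the raw rows.
users = [{"opted_out": i % 4 == 0} for i in range(100)]
estimate = noisy_count(users, lambda u: u["opted_out"], epsilon=0.5)
```

Production PETs (secure enclaves, multi-party computation, managed differential privacy platforms) are far more sophisticated than this; the sketch only shows the core trade of a little statistical accuracy for the ability to answer questions without surfacing individual records.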
The other core information flow follows naturally from the user insights. Having identified the hopes and concerns of individuals, organisations then need to address them in a transparent manner, so that users can see how their hopes and fears are taken into consideration. This may require more than simply updating the privacy notice, and we would recommend considering how the ‘7 Ps of Marketing’ might contribute to a consistent and transparent message. These are:
- Product – do the benefits of the feature outweigh the perceived risks?
- Price – if the product is free or very inexpensive, will people be concerned that collection of their personal data is part of the business model?
- Place – are there places the feature should and should not be associated with?
- Promotion – what messages around data privacy should be included in sales and marketing materials?
- People – are different user groups affected in different ways by the processing?
- Process – how should processes be designed to address users’ hopes and fears?
- Physical Evidence – what do users need to see and experience in order to trust that they can use the feature with confidence?
At the same webinar, Beverley Adams-Reynolds, the Data Protection Officer for homelessness charity Crisis, described how they built in privacy while trialling targeted online advertising as part of their Christmas fundraising campaign. Crisis considered the different ways their service users might be affected by the use of tracking cookies in comparison with their donors. They decided to set cookies only on appropriate pages, where they were unlikely to result in detriment to visitors, and tracked metrics to test whether the intrusion was justified by the outcomes.
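The kind of rule Crisis describes can be sketched as a simple gate: tracking runs only with consent and only on pages where it is unlikely to cause detriment. The section names and function below are hypothetical, not taken from the Crisis implementation.

```python
# Hypothetical page sections; neither list is taken from the Crisis site.
TRACKING_ALLOWED = ("/donate", "/christmas-appeal")  # fundraising pages
TRACKING_BLOCKED = ("/get-help", "/advice")          # service-user pages

def tracking_permitted(path: str, consent_given: bool) -> bool:
    """Decide whether a tracking cookie may be set on this page.

    Tracking requires explicit consent, must never run on pages used by
    people seeking help, and is otherwise limited to an allowlist.
    """
    if not consent_given:
        return False
    if any(path.startswith(p) for p in TRACKING_BLOCKED):
        return False
    return any(path.startswith(p) for p in TRACKING_ALLOWED)
```

Separating the blocklist from the allowlist makes the design decision explicit: pages serving vulnerable visitors are excluded outright, rather than simply being left off the allowlist.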
The goal for operational delivery is to ensure that the design of the service or feature incorporates data protection by design and default, as required under the GDPR. There are two aspects to this. First, the foundational design needs to address the user insights gathered through the project development phase. Then, organisations need to review user complaints and requests to identify any further user insights and make adjustments where appropriate. In order to do this, organisations must have appropriate processes in place to identify, capture and escalate complaints and requests relating to data privacy.
We recommend that organisations consider:
- How data privacy can be incorporated into user insight sessions
- The training frontline teams receive to support them to recognise and act on insights
- How complaints and requests are logged
- What management reporting is produced based on the logs
- How ongoing feature maintenance is resourced.
User-centred privacy could be your source of competitive advantage
Organisations that understand how to make data privacy an innovation enabler have a competitive advantage at present. There are clear actions organisations need to take in order to do this, and if truly incorporated into programme management methodologies, there is likely to be very low or no incremental cost associated with it. If your organisation is looking at how it can improve feature adoption, we recommend including this approach within your strategy.
References: McCarthy 1960; Booms & Bitner 1982