- Executive summary
- Research approach
- Who did we research with?
- What did we do?
- What did we test?
- Consent Score
- Consent Scores for Existing state (round 1) compared to Iterated simplified state (round 3)
- Changes to the Consent Flow
- Changes in data sharing landscape
- Empowering and voluntary
- Informed Consent and Comprehension
- Trustworthiness
- Findings
- What did we learn?
- Separation of consents (bundling)
- Pre-selected and actively selected options
- Withdrawal of consent information
- Supporting parties
- Data language standards
- 90-day notifications
- CDR receipts
- Dashboards for once-off consents
- De-identification and deletion by default
- Next steps
Executive summary
This report contains findings and recommendations based on three rounds of qualitative and quantitative Consumer Experience (CX) research, conducted from September to November 2022. In total, 290 consumers participated in research activities ranging from 1:1 moderated interview sessions to unmoderated surveys and prototype tasks.
The purpose of this research was to examine the viability of simplifying rules and standards for Consumer Data Right (CDR) consents and dashboards, as identified by the Consent Review Working Group.
Prototypes of a collection and use consent flow were used to facilitate discussion with consumer participants, and generate quantitative metrics relating to engagement, comprehension and sentiment.
Key research questions included:
- How might we simplify the consent flow while maintaining intuitive, informed, and trustworthy data sharing experiences? This includes:
  - Balancing the display of information with the need to maintain an informed and trustworthy experience; and
  - Balancing interaction loads while offering control and intuitive experiences.
- How might changes to the consent flow impact consumer empowerment and choice, comprehension and informed consent, as well as trustworthiness of the CDR?
Specific hypotheses, insights, and their results included:
The CX research suggests that the proposed changes to the consent flow would not meaningfully impact consumers’ comprehension, empowerment and trust.
Eight of the twelve research questions were strongly supported by the research evidence. While the evidence for the remaining four questions was indeterminate, this report provides recommendations based on current and past CX research.
Results for each research question (listed in full under ‘What did we test?’) were as follows:
- Separation of consents (bundling): the evidence suggests yes, for both questions.
- Pre-selected and actively selected options: the evidence is indeterminate for both questions, but this could still be done safely and intuitively.
- Withdrawal of consent information: the evidence suggests yes.
- Supporting parties: the evidence suggests yes.
- Data language standards: the evidence suggests yes.
- 90-day notifications: the evidence suggests yes.
- CDR receipts: the evidence suggests yes for the first question; the evidence for the second is indeterminate, but this could be explored further.
- Dashboards for once-off consents: the evidence is indeterminate, but this could be explored further.
- De-identification and deletion by default: the evidence suggests yes.
This research was also informed by earlier consultation and research conducted across 2020–2022 including the following:
- Noting Paper 273 consultation
- Phase 3 CX research reports
- Disclosure Consent research report
- Consumer Policy Research Centre (CPRC) report: My Data, My Choices
Full details on the public consultation and outcomes can be found in Design Paper 321: Consumer Data Right Consent Review.
The Consumer Data Right (CDR) aims to give consumers control over information about themselves and the ability to share that information with third parties. The CDR promotes competition, encourages innovation, and empowers consumers.
The CDR’s consent and transparency requirements facilitate more consumer control, privacy conscious behaviour, and the development of trust as a competitive advantage.
- For consumers, the CDR is a safe, secure, transparent, and government regulated ecosystem that consumers can opt in to.
- For accredited data recipients (ADRs), the CDR facilitates effective pathways to consumer outcomes by enabling access to machine-readable data for more accurate, tailored, and real-time insights.
The Data Standards Body’s Consumer Experience (CX) Working Group is helping organisations provide intuitive, informed, and trustworthy data sharing experiences with positive consumer outcomes in the short and long term.
The insights and recommendations found in this report are shared for general community knowledge; to inform the development of standards, guidelines, and the CDR more generally; and to support the CDR’s development in a way that is research-driven and centred on consumer consultation.
NB: This report does not necessarily reflect the position or direction of the government or the Data Standards Body. Recommendations found within these reports represent a set of possibilities that are reviewed and considered and may be subject to change. Reports inform rules and data standards development but should not be seen as indicative of the CDR’s direction.
- The data standards cover technical and CX requirements and are published on the data standards website.
- Standards are consulted on through GitHub and change requests can be made on the standards-maintenance page.
- CX Guidelines and other CX-related artefacts can be found on the CX guidelines website.
- CX and community engagement reports can be found in our CX Reports page.
- Keep up to date by signing up to our mailing lists and subscribing to our blog.
- Contact us at cx@consumerdatastandards.gov.au.
Research approach
Following the recommendation in the CDR Rules Design Review to examine the viability of simplifying the rules for CDR consents and dashboards, Treasury established the Consent Review Working Group with the Data Standards Body’s (DSB) Consumer Experience team. The Working Group’s aim is to review the CDR consent rules and standards, as well as potential future directions for CDR consents.
Overall, CX research aimed to:
- Inform the revision of existing rules and standards; or the development and proposal of new rules and standards.
- Identify appropriate CX metrics/criteria to benchmark and measure success of the changes to Consent.
The two key areas of focus were:
- Information presented to the consumer
- Consumer control and choice
- How might we simplify the consent flow while maintaining intuitive, informed, and trustworthy data sharing experiences? This includes:
- Balancing the display of information with the need to maintain an informed and trustworthy experience; and
- Balancing interaction loads while offering control and intuitive experiences.
- How might changes to the Consent flow impact consumer empowerment and choice, comprehension and informed consent, as well as trustworthiness of the CDR?
Separation of consents (bundling)
- Can collection and use consents be granted in a single action without reducing empowerment or comprehension?
- Can multiple use consents be requested in a single consent flow without impacting comprehension or trustworthiness?
Pre-selected and actively selected options
- Can required datasets be pre-selected or clearly indicated without impacting empowerment and comprehension?
- Where the consent duration is essential to the provision of the service, can this be pre-selected or clearly indicated without impacting empowerment and comprehension?
Withdrawal of consent information
- Can withdrawal information shown during consent be simplified without impacting comprehension and empowerment?
Supporting parties
- Does the consistent display of supporting parties better align with consumer expectations?
Data language standards
- Can the data language ‘permissions’ be referred to in a more conversational way?
90-day notifications
- Should the requirements for 90-day notifications be amended to provide clarity on their content, and to allow flexibility for consolidating them?
CDR receipts
- Would specific guidance on what to include in a CDR receipt help to better meet consumer expectations?
- Would further guidance on when to provide a CDR receipt better meet consumer expectations?
Dashboards for once-off consents
- Are dashboards necessary for once-off consents?
De-identification and deletion by default
- Would a deletion by default approach improve consumer control, empowerment and trust?
Who did we research with?
A broad and diverse range of participants was recruited to help reduce bias and to research out risk. A ‘no edge cases’ approach was taken to support the design of an inclusive CDR. Instead of focusing on those who are already likely and able to adopt the CDR, our research focuses on removing the barriers to the CDR being inclusive and accessible, which will make it easier and simpler to access for everyone.
The recruitment process strives to reflect the demographic percentages outlined in the Australian Bureau of Statistics 2016 Census Data, and explicitly recruits those who may be experiencing vulnerability or disadvantage.
Participants have varying levels of:
- Digital ability, financial and data literacies and experiences
- Privacy awareness
- Confidence in the English language
- Trust in Government and commercial organisations
Engagement
290 consumer participants were engaged in total across all round 1, 2 and 3 activities. The breakdown of participants by activity was as follows:
- Surveys: 160 participants
- Unmoderated prototype tasks: 114 participants
- Moderated prototype interviews: 16 participants
Each research activity was designed to produce different insights. The surveys focused on data sharing attitudes and preferences, while all prototype tasks were designed to capture metrics relating to the Consent Score, among other things.
Detailed demographics
The information below outlines the participants whose data contributed to the development of the Consent Score artefact and Proposal findings in this research report.
| Age group | Round 1: Survey research (Sep 2022) | Round 1: Unmoderated research † (Sep–Nov 2022) | Round 2: Moderated & unmoderated research † (Sep–Nov 2022) | Round 3: Moderated & unmoderated research (Oct 2022) | Rounds 1-3 overall % |
|---|---|---|---|---|---|
| 18-24 | 21 | 4 | 5 | 6 | 12.4% |
| 25-34 | 35 | 14 | 11 | 13 | 25.2% |
| 35-44 | 37 | 9 | 11 | 13 | 24.1% |
| 45-54 | 22 | 5 | 5 | 5 | 12.8% |
| 55-64 | 24 | 5 | 6 | 8 | 14.8% |
| 65-74 | 15 | 1 | 4 | 1 | 7.2% |
| 75+ | 6 | - | 3 | 1 | 3.4% |
| State or territory | Round 1: Survey research (Sep 2022) | Round 1: Unmoderated research † (Sep–Nov 2022) | Round 2: Moderated & unmoderated research † (Sep–Nov 2022) | Round 3: Moderated & unmoderated research (Oct 2022) | Rounds 1-3 overall % |
|---|---|---|---|---|---|
| ACT | 3 | 1 | - | 4 | 2.8% |
| NSW | 48 | 11 | 14 | 15 | 30.3% |
| NT | 1 | 1 | - | 2 | 1.4% |
| QLD | 41 | 7 | 9 | 9 | 22.8% |
| SA | 8 | 3 | 4 | 2 | 5.9% |
| TAS | 3 | - | - | 1 | 1.4% |
| VIC | 40 | 10 | 14 | 11 | 25.9% |
| WA | 16 | 5 | 4 | 3 | 9.7% |
| Rural vs metro | Round 1: Survey research (Sep 2022) | Round 1: Unmoderated research † (Sep–Nov 2022) | Round 2: Moderated & unmoderated research † (Sep–Nov 2022) | Round 3: Moderated & unmoderated research (Oct 2022) | Rounds 1-3 overall % |
|---|---|---|---|---|---|
| Metropolitan/Inner City | 41 | 11 | 8 | 18 | 26.9% |
| Suburban/Outer City | 75 | 17 | 24 | 15 | 45.2% |
| Large town | 21 | 7 | 5 | 5 | 13.1% |
| Small or remote town | 7 | 1 | 4 | 4 | 5.5% |
| Rural location | 16 | 2 | 4 | 5 | 9.3% |
| Gender | Round 1: Survey research (Sep 2022) | Round 1: Unmoderated research † (Sep–Nov 2022) | Round 2: Moderated & unmoderated research † (Sep–Nov 2022) | Round 3: Moderated & unmoderated research (Oct 2022) | Rounds 1-3 overall % |
|---|---|---|---|---|---|
| Man | 79 | 16 | 18 | 19 | 45.5% |
| Woman | 77 | 21 | 23 | 24 | 50% |
| Non-binary/gender fluid | 4 | 1 | 3 | 4 | 4.1% |
| I use a different term | - | - | 1 | - | 0.3% |
| Identity | Round 1: Survey research (Sep 2022) | Round 1: Unmoderated research † (Sep–Nov 2022) | Round 2: Moderated & unmoderated research † (Sep–Nov 2022) | Round 3: Moderated & unmoderated research (Oct 2022) |
|---|---|---|---|---|
| I am of Aboriginal and/or Torres Strait Islander descent | 12 | 1 | 0 | 5 |
| I have a non-English speaking background | 29 | 8 | 13 | 8 |
| My parents have a non-English speaking background | 46 | 15 | 18 | 17 |
| I migrated to Australia from another country | 52 | 15 | 16 | 13 |
| I have accessibility needs | 11 | 4 | 9 | 6 |
| I am LGBTQI+ | 19 | 8 | 6 | 12 |
Individual consumers
| Working | Round 1: Survey research (Sep 2022) | Round 1: Unmoderated research † (Sep–Nov 2022) | Round 2: Moderated & unmoderated research † (Sep–Nov 2022) | Round 3: Moderated & unmoderated research (Oct 2022) | Rounds 1-3 overall % |
|---|---|---|---|---|---|
| Employed by a company | 89 | 24 | 19 | 30 | 55.9% |
| Retired | 22 | 2 | 8 | 3 | 12.1% |
| Unemployed | 6 | 2 | 5 | 4 | 5.9% |
| Temporarily not working but has a job to go to | 7 | 1 | 1 | - | 3.1% |
| Permanently unable to work | 8 | - | 2 | - | 3.4% |
| Other | 6 | - | - | - | 2.1% |
Business consumers
| Working | Round 1: Survey research (Sep 2022) | Round 1: Unmoderated research † (Sep–Nov 2022) | Round 2: Moderated & unmoderated research † (Sep–Nov 2022) | Round 3: Moderated & unmoderated research (Oct 2022) | Rounds 1-3 overall % |
|---|---|---|---|---|---|
| Self-employed with a small business that employs other people | 5 | 3 | 3 | - | 3.8% |
| Self-employed as a sole trader, freelancer or contractor | 17 | 6 | 7 | 10 | 13.8% |
| Financial hardship in last 12 months | Round 1: Survey research (Sep 2022) | Round 1: Unmoderated research † (Sep–Nov 2022) | Round 2: Moderated & unmoderated research † (Sep–Nov 2022) | Round 3: Moderated & unmoderated research (Oct 2022) | Rounds 1-3 overall % |
|---|---|---|---|---|---|
| Yes | 80 | 19 | 21 | 24 | 49.7% |
| No | 80 | 19 | 24 | 23 | 50.3% |
What did we do?
Qualitative and quantitative research approaches were used across three rounds of research. This combination ensured a robust analysis that allowed for metrics to be collected at scale, supplemented with qualitative interviews to hear in-depth feedback from participants.
Consistent quantitative metrics were used between rounds to assess the impact of proposed changes as they relate to comprehension, trustworthiness, and empowerment. These metrics formed the basis of the Consent Score artefact, developed for the Consent Review project. For more about the Consent Score process and results, see the Consent Score section of this report.
290 consumer participants were engaged in total across all round 1, 2 and 3 activities. The breakdown was as follows:
| Round | Research activity | Research period | Number of participants |
|---|---|---|---|
| 1 | Survey | 2 Sep 2022 – 12 Sep 2022 | 160 |
| 1 | Unmoderated prototype task† | 5 Sep 2022 – 13 Sep 2022; 27 Oct 2022 – 3 Nov 2022 | 38 |
| 2 | Moderated prototype interview | 19 Sep – 26 Sep 2022 | 8 |
| 2 | Unmoderated prototype task† | 27 Sep – 4 Oct 2022; 28 Oct 2022 – 3 Nov 2022 | 37 |
| 3 | Moderated prototype interview | 17 Oct – 20 Oct 2022 | 8 |
| 3 | Unmoderated prototype task | 14 Oct – 4 Nov 2022 | 39 |
† Due to technical limitations, the round 1 and 2 unmoderated prototype tasks were conducted over multiple periods.
- Round 1: 11 participants responded in September 2022, and 27 participants responded in October to November 2022.
- Round 2: 18 participants responded in September to October 2022, and 27 participants responded in October to November 2022.
Each research activity was designed to produce different insights. The surveys focused on data sharing attitudes and preferences, while all prototype-based activities were designed to, among other things, capture metrics relating to the Consent Score.
Surveys were completed independently by consumer participants. Four different scenarios were used to capture data sharing attitudes and preferences in response to different value propositions – including scenarios that may be perceived as high/low risk and benefit. Surveys took approximately 20 minutes to complete.
Unmoderated prototype tasks were completed independently by consumer participants. All rounds focused on a finance management scenario requiring a Collect & Use consent using Banking data. Participants were asked to step through the prototype to share data from a known bank with a hypothetical Accredited Data Recipient (ADR). With their consent, participants’ screens, audio and video were recorded so that metrics relating to the Consent Score could be collected. Participants were asked to think aloud as they progressed through the task. After completing the prototype activity, they were asked to fill out a task recap survey, which gauged their comprehension levels and the perceived trustworthiness of the prototype. These activities took approximately 30 minutes to complete.
Moderated prototype interviews were completed via video call between a consumer participant, a researcher and a research observer. Moderated interviews allowed for direct feedback and insights from a diverse range of participants. Participants were asked interview questions to understand their current views on data sharing, consent and privacy risks. They were then asked to complete a prototype task using the same finance management scenario requiring a Collect & Use consent using Banking data, answering in-depth questions verbally throughout the task. After completing the prototype activity, participants were asked to fill out the task recap survey to gauge their comprehension levels and the trustworthiness of the prototype. Each interview session lasted approximately 90 minutes.
Participants were given a task recap survey to gauge their levels of comprehension, trustworthiness and propensity to share. These questions and methods were adapted from Greater than X’s Phase 2 research.
For full details about criteria and metrics methods, read our
In the survey component of the first round, participants were presented with one of four written scenarios that explored data sharing:
- Finance management – Collect & Use consent using Banking data
- Comparison tool – Collect & Use consent using Energy data
- Loan application – Trusted Advisor (TA) Disclosure consent using Banking data
- Lease application – Insight Disclosure consent using Banking data
Participants were asked to rate the service’s benefit and trustworthiness, as well as their comfort with the scenario.
They were then presented with a number of items relating to information displayed, as well as control and choice. They were asked to rank these as low, medium or high importance in relation to increasing their comfort. Participants were given a soft limit of 10 items in the “high importance” category, although this was not enforced.
- Respondents to the Collect & Use scenarios (both Energy and Banking) were presented with 18 options to rank
- Respondents to the TA Disclosure scenario were presented with 24 options to rank
- Respondents to the Insight Disclosure scenario were presented with 29 options to rank
The first round of CX research consisted of an unmoderated prototype task to establish a baseline score for the existing consent flow. This prototype represented the current CDR Rules and Standards and was largely consistent with existing wireframes used in CX Guidelines.
Consistent metrics were used between rounds to assess the impact of proposed changes as they relate to comprehension, trustworthiness, and empowerment.
The second and third rounds of CX research tested simplified versions of the consent flow, using the same metrics applied to the round 1 current state prototype. Round 3 iterated and addressed design issues and opportunities uncovered in round 2.
Rounds 2 and 3 consisted of unmoderated prototype tasks as well as moderated prototype interviews.
What did we test?
The primary research question was:
How might we simplify the consent model while maintaining intuitive, informed, and trustworthy consent experiences?
Survey focus
To answer this question, the CX survey research focused on Consent information and choice, including various consent types, use cases and sector data.
Prototype focus
To answer this question, the CX prototype research focused on:
- Content presentation, including existing requirements relating to withdrawal instructions, notifications, supporting parties, data handling, CDR receipts, and descriptions of datasets.
- Consent separation, including the current treatment of certain consents as 'separate', particularly collection and use consents.
- Interaction requirements, such as the active selection of each dataset, use, duration, and the right to delete functionality.
Scenario
- The actors for all three rounds were:
- A fictional Accredited Data Recipient (ADR) that provided a personal finance management (PFM) service, and
- A real-world bank as the Data Holder (DH).
- The consumer was told they’d been discussing finances with a friend, who told them about the ADR. The friend had used it to manage multiple bank accounts in one place, as well as to track and categorise incoming and outgoing expenses.
- They were then presented with the ADR website and application and asked how they would proceed.
Round 1
Round 2
Round 3
Consent Score
The Consent Score is an artefact developed to provide a visually simplified representation of a consent flow’s performance. This graph aggregates the various metrics used in research, based on a formula that considers several variables and areas.
The aim of the Consent Score is to visually demonstrate how well a consent flow performs against defined metrics in relation to the consent principles developed by the Consent Review Working Group. The consent flow is assessed in two degrees across three focus areas.
Degrees
Importantly, how well a consent flow performs depends on several variables, which have been categorised based on the following degrees:
- Engagement/interaction degree (what a consumer does): this relates to observable participant behaviour, including how participants act and engage with aspects of the consent flow during research.
- Subjective degree (what a consumer thinks): this relates to participant feedback, sentiment and understanding of the consent flow, and is primarily collected during a post-task survey.
Focus areas
Degrees are thematically grouped according to focus areas, which map directly to the Consent Principles:
Empowering and Voluntary
- Consent is inclusive, empowering, and creates positive outcomes
- Consent is given freely and enthusiastically
Informed and Comprehensible
- Consent is specific, current, and reversible
- The consent process is intuitive and comprehensible
Trustworthy
- The CDR is trustworthy and meets expectations
These focus areas map directly to the Consent Principles, which helps to establish a link between the Consent Score and how well a consent flow is performing in relation to these Consent Principles (which were themselves drawn from the objects of consent outlined in Rule 4.9 of the Competition and Consumer (Consumer Data Right) Rules 2020 Compilation No. 7 (the Rules) and the CX Principles). The criteria that underpin these focus areas allow the consent flow to be measured, tested, and refined based on consumer research.
Each area can be measured by making an assessment of a particular consent flow component (such as active selection of datasets or information regarding consent withdrawal). The approach to measuring each area was influenced by Alsop and Heinsohn’s (2005) paper titled ‘Measuring Empowerment in Practice: Structuring Analysis and Framing Indicators’. The overarching approach to measuring each focus area for the purposes of this research can be articulated as follows:
Empowering
- Whether a person uses the opportunity to choose (use of choice)
- Once the choice is made, whether it brings about the desired outcome (achievement of choice)
Voluntary
- Whether a person uses the opportunity to opt in or out of giving consent (use of choice)
- Once the choice is made, whether it brings about the desired outcome (achievement of choice)
Informed and Comprehensible
Based on the consent objects outlined in Rule 4.9, informed consent has been measured against comprehension and recollection of the following:
- Whether a person engages with information (use of information):
  - From (data holder)
  - To (data recipient)
  - Purpose
  - Datasets shared
  - Duration
  - Voluntary consent and consequences of not sharing data
  - Withdrawal methods
  - Redundant data handling
  - Risks associated with data sharing
- Once the information is engaged with, whether it has been understood (achievement of comprehension)
Trustworthy
Based on CX research to date, the following trust markers have been identified:
- Whether a person engages with ‘trust’ markers (use of trust elements):
  - Social proof: this may include ADR accreditation, external reviews, etc.
  - External links to government websites
  - Clear purpose and benefits statements
  - ‘What will and won’t be shared’ statements
  - Withdrawal information
  - Data handling information
  - Link to the CDR policy
- Once the ‘trust’ element is engaged with, whether it is deemed trustworthy (achievement of trustworthiness)
Consent Score calculation
The overall process for calculating the Consent Score can be explained as follows:
- A score for each degree is given based on the presence of and engagement with certain components. For example:
- Engagement/interaction: Did the participant engage with the component?
- Subjective: How did the participant respond to the component?
- Each score is added together in relation to its focus area
- The averages of the total score for each focus area are combined to arrive at the total Consent Score (e.g. 54 / 100, with 100 being a perfect, but virtually unattainable score).
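As an illustration only, the aggregation steps above can be sketched in Python. The component names, observation values and equal weighting below are hypothetical; the report does not publish the exact formula or weights used in the research.

```python
from statistics import mean

# Hypothetical component observations per focus area and degree.
# 1.0 = full engagement / fully positive response, 0.0 = none.
observations = {
    "Empowering and Voluntary": {
        "engagement": [1.0, 0.5],  # e.g. used choice, achieved choice
        "subjective": [0.8, 0.6],
    },
    "Informed and Comprehensible": {
        "engagement": [0.9, 0.7],
        "subjective": [0.6, 0.8],
    },
    "Trustworthy": {
        "engagement": [0.4, 0.5],
        "subjective": [0.5, 0.6],
    },
}

def consent_score(obs):
    """Average each degree, then each focus area, then combine to a /100 total."""
    focus_scores = {}
    for area, degrees in obs.items():
        degree_scores = [mean(components) for components in degrees.values()]
        focus_scores[area] = mean(degree_scores) * 100
    total = mean(focus_scores.values())
    return focus_scores, total

focus, total = consent_score(observations)
```

With these toy inputs the sketch yields a focus-area score per principle and a single combined score out of 100, mirroring the shape of the report's Consent Score (e.g. 54 / 100), though not its actual values.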
Consent Scores for Existing state (round 1) compared to Iterated simplified state (round 3)
For the above Consent Scores graph, the breakdown across round 1 and 3 is as follows:
| | Existing collect and use consent (round 1) | Simplified collect and use consent (round 3) | Difference |
|---|---|---|---|
| Empowering and Voluntary | 63.32% | 52.81% | -10.51% |
| Informed and Comprehensible | 74.10% | 66.31% | -7.79% |
| Trustworthy | 49.21% | 45.49% | -3.72% |
| Total Score | 62.21 / 100 | 54.87 / 100 | -7.34 |
At a glance there appears to be an overall decrease in Consent Scores between rounds 1 and 3. However, for the majority of round 1 and 3 scores, the differences were not statistically significant: there is not enough evidence to conclude that there is a real difference between the round 1 and round 3 results, so a cause-and-effect relationship cannot be established. These differences are likely to have occurred by chance (e.g. participant selection), rather than as a result of design changes.
Two degrees experienced a statistically significant decrease:
- Engagement/interaction degree for Empowering and Voluntary
- Subjective degree for Informed and Comprehensible
The large and statistically significant decrease can be explained by the removal of the ‘actively select’ requirement for datasets in the simplified consent flow, which automatically resulted in a lower score.
However, the presence of active selection functionality in the current state consent flow could be considered a false choice where a consumer cannot continue without selecting required datasets. As such, the simplified consent flow’s lower scores for ‘Empowering and Voluntary’ could be considered to reflect a more accurate and realistic baseline score for this aspect of consent in general.
The small but statistically significant decrease can be explained by:
Throughout all research to date, participants were asked ‘How would you stop the collection, use and/or disclosure of your data?’ which directly relates to Rule 4.11(g)(ii), ‘instructions for how the consent can be withdrawn.’
In round 3’s simplified Consent Flow, this information was omitted from the Consent step, and instead only accessed through the CDR policy and CDR receipt. As such, many consumers did not meaningfully engage with this information prior to questioning.
The simplified Consent step maintained information relating to Rule 4.11(g)(i), ‘a statement that, at any time, the consent can be withdrawn.’ The fact that they could withdraw consent at any time was broadly understood.
While dates for sharing duration and historical data collection dates were displayed in both rounds, the presentation of this information varied. In round 1’s existing Consent Flow, sharing duration and historical data dates were displayed on a separate screen during the Consent step. This design choice did not perform well, and some participants were unclear on the difference between the future-dated access period and the historical data access. Round 3 appeared to resolve most of these issues, with a much clearer information display, and historical data access information moved into the progressive disclosure pattern of the relevant data cluster (transaction details).
Further research should be conducted to review the display of this information to ensure comprehension.
Note: Consent Scores across round 1 and 3 were compared using Student’s t-test to assess the statistical significance of the different scores.
| | Engagement/interaction degree (what a consumer does) | Subjective degree (what a consumer thinks) |
|---|---|---|
| Empowering and Voluntary | Large and statistically significant decrease | Difference is not statistically significant |
| Informed and Comprehensible | Difference is not statistically significant | Small but statistically significant decrease |
| Trustworthy | Difference is not statistically significant | Difference is not statistically significant |
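For illustration, the Student's t-test comparison used for these round-on-round checks can be sketched with the Python standard library. The score arrays below are synthetic stand-ins, not the study's data, and the equal-variance pooling shown is the classic Student's t formulation noted in the report.

```python
from math import sqrt
from statistics import mean, variance

# Synthetic per-participant scores (0-100) for one degree of one focus
# area; the real round 1 and round 3 distributions are not reproduced here.
round1_scores = [68, 72, 60, 75, 66, 70, 64, 71, 69, 73]
round3_scores = [55, 61, 50, 58, 63, 52, 57, 60, 54, 59]

def students_t(a, b):
    """Two-sample Student's t statistic, assuming equal variances."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / sqrt(pooled_var * (1 / na + 1 / nb))

t_stat = students_t(round1_scores, round3_scores)

# For df = 18 the two-tailed 5% critical value is about 2.101, so a
# |t| above that threshold is read as statistically significant.
significant = abs(t_stat) > 2.101
```

In practice a statistics package would also report the exact p-value; the point here is only the shape of the comparison: pooled variance across the two rounds, then the standardised difference of means.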
Changes to the Consent Flow
| Proposal | Existing collect and use consent (round 1) | Simplified collect and use consent (round 3) |
|---|---|---|
| ❶ Pre-selected and actively selected options (Control and choice) | Data clusters required for the service are presented as ‘active selection’ options. Duration and Uses are clearly stated. | Data clusters required for the service are clearly stated. Duration and Uses are clearly stated. |
| ❷ Data language standards (The display of information) | Data language permissions are referred to in a list format, as per the CX Standards. | Data language permissions are referred to in a more conversational way. |
| ❸ Withdrawal of consent information (The display of information) | Withdrawal information as per Rule 4.11(g), i.e. (i) a statement that, at any time, the consent can be withdrawn; (ii) instructions for how the consent can be withdrawn; (iii) a statement indicating the consequences (if any) to the CDR consumer if they withdraw the consent. | Withdrawal information as per Rule 4.11(g)(i), i.e. ‘a statement that, at any time, the consent can be withdrawn.’ |
| ❹ Authentication information (The display of information) | Pre-authentication information includes statements about Redirection and ‘One Time Password’, as per CX Standards. | Pre-authentication information includes a statement about Redirection only. |
Across rounds, participants indicated that it is generally possible to simplify the consent flow without undermining consumer expectations and protections. Both examples were seen as “very fast”, “extremely easy” and “very trustworthy.”
| Criteria | Existing collect and use consent (round 1) | Simplified collect and use consent (round 3) | Difference |
|---|---|---|---|
| How time consuming they found the process to be (Time) | 4 - Very fast | 4 - Very fast | No change |
| Ease of granting access to data (Physical effort) | 5 - Extremely easy | 5 - Extremely easy | No change |
| Ease of understanding information presented throughout the Consent Model (Mental effort) | 5 - Extremely easy to understand | 4 - Very easy to understand | -1 point |
| Comparison between typical manual processes and CDR process (Routine) | 5 - New CDR way is much easier | 5 - New CDR way is much easier | No change |
| Criteria | Existing collect and use consent (round 1) | Simplified collect and use consent (round 3) | Difference |
|---|---|---|---|
| Trustworthiness of the CDR | 4 - Very trustworthy | 4 - Very trustworthy | No change |
| Risk in relation to data sharing and the use case | 3 - Moderately risky | 3 - Moderately risky | No change |
| Willingness to share data using the CDR for the use case | 3 - Moderately willing | 4 - Very willing | +1 point |
| Adoption of the CDR for data sharing | 4 - I’d probably use it | 3 - I may use it | -1 point |
Other changes between rounds
Other prototype changes were tested that may have had an impact on consent scores, but these changes were either de-prioritised as proposals, not supported by the research, or constituted superficial variations. Nevertheless, the inclusion of these variations and hypotheses may have affected the results of the research. They are outlined here for completeness:
- A “CDR dictionary” term: this will be considered as part of future work on consent.
- Porting accreditation information to the CDR policy: this adjustment was not supported by the research findings; consumer participants valued this information being shown up-front, and its absence was noted during testing.
- Progressive disclosure variation: minor changes were made to the progressive disclosure patterns used between rounds. The “information” button tested in round 2 performed poorly, with little engagement from participants; the “find out more” text link, tested in round 3, performed better, with higher engagement.
- Single versus multiple consent screens: while this change did not appear to negatively impact trust, displaying too much information on one screen could reduce consumer comprehension. This is not prescribed in the rules or standards, but the choice of implementation should consider the potential trade-offs for comprehension.
Changes in data sharing landscape
Multiple high-profile data breaches in 2022
Finding
Research spanned from early September to early November 2022, with some rounds taking place before multiple high-profile corporate data breaches in Australia, and others at the height of their media coverage.
- 2 Sep 2022: Round 1 research commenced
- 19 Sep 2022: Round 2 research commenced
- 22 Sep 2022: Optus data breach
- 30 Sep 2022: Energy Australia data breach
- 13 Oct 2022: Medibank data breach
- 14 Oct 2022: Woolworths MyDeal data breach
- 17 Oct 2022: Round 3 research commenced
- 27 Oct 2022: Repeated round 1 research commenced
- 28 Oct 2022: Repeated round 2 research commenced
While not all participants were directly affected by these data breaches, many cited these events as reasons why they perceived various levels of risk when data sharing, and why they would be slow to adopt the CDR data sharing method, if they would adopt it at all.
Evidence
"The recent Optus incident put safety at the top of most people’s minds […] the risk is really high and can be quite significant. […] I try to be as vigilant as I can." —R3P6, when asked about the risks of data sharing
"I'm open to the idea, however with the recent data breaches at Optus and Medibank, it does make you question how secure these systems are, especially if companies as large as that can be breached." —R3P33, when asked about social influence and the CDR process
"Giving the recent data breach events, I am just not placing a lot of trust on sharing my personal financial data to a site I know little about." —R2P27, when asked about CDR adoption
Shift towards behavioural archetypes that have a lower propensity to share data
User archetypes are useful tools to segment and succinctly describe the different drivers, behaviours and needs observed throughout research. CDR behavioural archetypes are representations of actions and general attitudes toward data sharing.
Participants were given questions to assess their attitude towards the CDR process and proposed use case.
Participants were asked:
- How trustworthy they deem the CDR and its actors to be
- How much benefit they see in using the CDR for this use case
- How much risk they feel exists in sharing their data through the CDR
- How willing they would be to use the CDR for this use case
- How important the privacy of their data is when using a digital app or service
- How likely they are to adopt new services such as the CDR
Participants responded to these questions by:
- Marking a Likert scale with a score from 1 to 5. ‘1’ being a negative indicator, ‘3’ being a neutral indicator, and ‘5’ being a positive indicator.
- Providing open-ended responses for more qualitative questions.
Participant responses were used to assign each participant to one of the four CDR behavioural archetypes:
- ⬛️ Sceptic
Low propensity to share
- 🟧 Assurance seeker
Medium Low propensity to share
- 🟨 Sensemaker
Medium High propensity to share
- 🟪 Enthusiast
High propensity to share
Sceptics are less trusting of organisations and/or technology. They generally value control, and are averse to data sharing based on experience with current practices.
Assurance seekers want to read additional information. They generally value familiarity and external reference/support, and are apprehensive to new experiences.
Sensemakers need to understand how the process works. They generally value details, and can trust the process if given enough valuable information.
Enthusiasts are excited to get the benefits of CDR. They generally value simple experiences once trust is established.
For more detail, see
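In code terms, the archetype assignment described above could be sketched as follows. This is a hypothetical illustration only: the report does not publish the actual scoring rules, so the cut-off thresholds below are assumptions.

```python
# Hypothetical sketch of assigning a CDR behavioural archetype from a
# participant's 1-5 Likert responses. The threshold values are assumed;
# the actual scoring rules are not specified in this report.

def assign_archetype(likert_scores: list[int]) -> str:
    """Map a set of 1-5 Likert responses to one of the four archetypes."""
    if not all(1 <= s <= 5 for s in likert_scores):
        raise ValueError("Likert scores must be between 1 and 5")
    mean = sum(likert_scores) / len(likert_scores)
    if mean < 2.0:        # low propensity to share
        return "Sceptic"
    if mean < 3.0:        # medium-low propensity
        return "Assurance seeker"
    if mean < 4.0:        # medium-high propensity
        return "Sensemaker"
    return "Enthusiast"   # high propensity
```

A participant answering mostly 2s and 3s would land in the “Assurance seeker” band under these assumed cut-offs, consistent with that archetype dominating the observed sample.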
During research rounds conducted after the high-profile data breaches, we saw the proportion of consumer participants scoring as "Sceptics" increase, and the proportion of "Assurance seekers" decrease. It’s difficult to gauge whether this shift was a result of the breaches or of random participant selection. However, participant feedback indicates that the breaches were on many participants’ minds during research.
While we do design with Sceptics in mind, we know that they are far less likely to participate in the CDR than any other archetype.
Archetype | Early September 2022 (Round 1) | Mid September 2022 (Round 2) | Mid October 2022 (Round 3) | Late October 2022 (Repeated Round 1-2) |
⬛️ Sceptic | 9.09% | 11.11% | 23.40% | 22.22% |
🟧 Assurance seeker | 72.73% | 55.56% | 63.83% | 59.26% |
🟨 Sensemaker | 18.18% | 33.33% | 12.77% | 18.52% |
🟪 Enthusiast | 0.00% | 0.00% | 0.00% | 0.00% |
General thoughts on existing methods vs CDR process
Finding
During round 2 and 3 interviews, some participants anecdotally compared their existing data sharing methods with the CDR process.
Some participants described their familiarity with sharing data through screen-scraping. Of these participants, a couple were able to identify differences between screen-scraping and the CDR process (as shown in the prototype).
- In particular, one participant commented positively on items they don’t normally see, such as being able to select which accounts to share, and the detailed information about what types of data would be shared, for how long, and the fact that they had the control to end the sharing arrangement.
- One participant commented that the CDR receipt was unexpected, as they had never received something like this when sharing data before. They felt positive about this, stating that they felt that the ADR was more legitimate and cared that they understood their data sharing arrangement.
One participant expressed a preference for manual entry, rather than providing access to their data holder account. They were concerned that the ADR would be able to steal their money if given access to their accounts.
Evidence
"I reckon [data sharing using CDR] was much different to what's in the market at the moment. Like when it comes to information, it's clear […] with the details itself, like what information exactly has been taken on, how it’s going to be used. And for how long. And I am in control. And so I like that because it is the unknown, it's always scary especially when it comes to privacy and especially if it was money, especially with finance and my own finance and you are in my bank account. […] And I only do it for a reason so if I have more security it'll be much better. […] I am more comfortable with it. It is really, yes, it is really good. I like to be more informed and there is more information in it and I like it." —R3P3
Empowering and voluntary
Participants were observed and questioned to understand their levels of control while using the CDR process.
Empowering and voluntary metrics relate to how the Consent Flow caters to the consumers’ sense of control, i.e.:
- whether they choose to exercise control over certain options presented, and
- whether they believe they are able to achieve positive outcomes through use of the CDR.
Empowerment, voluntary consent and positive outcomes data was collected using:
- quantitative assessments, such as component engagement and Likert scores; and
- qualitative research approaches, such as interviews and open response survey questions.
Finding
Overall, participants across rounds spoke positively of feeling in control throughout the consent process. Generally, participants did not express a strong desire for more control than was shown.
Control options that were available in the prototype were cited as inspiring trust. In particular, the ability to withdraw consent at any time and have data deleted was mentioned by many unmoderated and interviewed participants. While this is not a control enacted during the consent flow, it reflects a future realisation of benefit and the empowerment to take control of the data sharing arrangement. A few participants spoke more generally about feeling in control, or cited being able to choose which accounts are shared as inspiring trust.
Evidence
"I think that offering too many options of things to consent or not consent to starts to feel like there might be so many options because you are trying to hide something. I think condensing reasonable options together makes things more straight forward. My overall comfort and confidence would be increased if the process is streamlined and sensible. I feel like I might be getting tricked if the options are too wide spread. I think the best apps have considered all the potential risks and have made an interface with questions that reassure the user that the whole process is secure and safe." —S3P24
"That I had to confirm the process with my Bank. That I had control over every step of the process and knew how to cancel going forwards." —R2P26
"[…] I also liked that I could choose which accounts I linked." —R3P9
"The detailed information given to remove/stop sharing the data provides the confidence in me that the process is well thought [through] and trust." —R3P21
Informed Consent and Comprehension
Directly after completing the simulated scenario with the prototype, participants were given questions to assess their understanding and memory of the consent terms and prototype.
Informed and Comprehension metrics relate to how the Consent Flow caters to a consumer’s informed consent (as per the consent objects outlined in Rule 4.9). Participants were asked to recall:
Criteria | Measure | CDR Rule 4.9 criteria |
From (data holder) | Who they were sharing data from | Informed |
To (data recipient) | Who they were sharing data to | Informed |
Purpose | Why they were sharing their data | Specific as to purpose; Informed |
Datasets shared | What kind of data they elected to share | Informed |
Duration | How long they were sharing data for | Time limited; Informed |
Voluntary consent and consequences of not sharing data | What happens if they don’t share data | Voluntary; Informed |
Withdrawal methods | How they might stop sharing their data | Easily withdrawn; Informed |
Redundant data handling | What will happen when their data’s no longer needed | Informed |
Risks associated with data sharing | How much risk they saw in allowing their data to be accessed for the [service] | Informed |
Comprehension of consent terms
Based on the recollection of consent terms, most round 1 and 3 participants were considered well informed when they provided consent.
Overall, comprehension scores decreased between rounds 1 and 3 for all but two criteria (‘Voluntary consent and consequences of not sharing data’ and ‘Redundant data handling’). However, statistical analysis indicates that most of these changes are more likely to have occurred by chance than as a result of changes made to the Consent Flow. That is, for most criteria the differences in scores were not statistically significant, so we cannot attribute them to changes in the consent flow. Only two criteria (‘Withdrawal methods’ and ‘Duration’) showed small but statistically significant decreases.
Criteria | Comprehension of existing Consent Flow (round 1) | Comprehension of iterated simplified Consent Flow (round 3) | Difference |
From (data holder) | 97.4% | 89.4% | -8.0% |
To (data recipient) | 86.8% | 74.5% | -12.4% |
Purpose | 84.6% | 83.0% | -1.6% |
Datasets shared | 94.7% | 89.4% | -5.4% |
Duration | 84.2% | 66.0% | -18.3% |
Voluntary consent and consequences of not sharing data | 84.2% | 93.6% | +9.4% |
Withdrawal methods | 73.7% | 46.8% | -26.9% |
Redundant data handling | 81.6% | 87.2% | +5.7% |
Risks associated with data sharing | 97.4% | 95.7% | -1.6% |
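The kind of significance check described above can be illustrated with a two-proportion z-test. This is a sketch only: the report does not state which test was actually used, and the sample sizes (38 for round 1, 47 for round 3) are assumptions inferred from the reported percentages.

```python
# Illustrative two-proportion z-test for a change in comprehension scores.
# Assumptions: the report does not name the test used; sample sizes of
# 38 (round 1) and 47 (round 3) are inferred from the percentages above.
import math

def two_proportion_z(p1: float, n1: int, p2: float, n2: int) -> float:
    """Two-sided p-value for H0: the two underlying proportions are equal."""
    x1, x2 = p1 * n1, p2 * n2                      # approximate counts
    pooled = (x1 + x2) / (n1 + n2)                 # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # convert |z| to a two-sided p-value via the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# ‘Withdrawal methods’: 73.7% (round 1) vs 46.8% (round 3)
print(two_proportion_z(0.737, 38, 0.468, 47))  # below 0.05: unlikely to be chance
```

Under these assumed sample sizes, the ‘Withdrawal methods’ drop tests as significant while small movements such as ‘Purpose’ (84.6% vs 83.0%) do not, matching the pattern reported.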
Trustworthiness
Participants were observed and questioned to understand their sense of trust while using the CDR process.
Trustworthiness metrics relate to how the Consent Flow caters to the consumers’ trust, i.e.:
- whether they choose to engage with trust markers presented, and
- whether they deem the actors and process of the CDR to be trustworthy.
Based on CX research (to date), the following trust markers have been identified:
- Social proof: this may include ADR accreditation, external reviews, etc.
- External links to government websites
- Clear purpose and benefits statements
- What will and won't be shared statements
- Withdrawal information
- Data handling information
- Link to the CDR policy
Trustworthiness data was collected using:
- quantitative assessments, such as component engagement and Likert scores; and
- qualitative research approaches, such as interviews and open response survey questions.
Participants were asked:
- How trustworthy they deem the CDR and its actors to be
- What increases or decreases their propensity to share CDR data
Participants responded to these questions by:
- Marking a Likert scale with a score from 1 to 5. ‘1’ being a negative indicator, ‘3’ being a neutral indicator, and ‘5’ being a positive indicator.
- Providing open-ended responses for more qualitative questions.
Finding
Across all rounds, participants appeared to engage with trust markers less than with elements of empowerment or information about the terms of their consent. Nevertheless, when asked how trustworthy they found the process, the most common response was “very trustworthy” for both rounds 1 and 3.
For many, trust was reduced by a lack of familiarity with the fictional ADR. Some participants wanted further, specific information about the ADR to improve their trust. Government oversight, as well as the involvement of a reputable Data Holder, were cited by some as inspiring trust.
Many cited the information shown about the terms of consent, and security and privacy assurances, as inspiring trust. However, for many, concerns about security, data breaches and data sharing in general reduced trust. Some suggested enhancing security, or providing further details about the security measures in place.
Evidence
“simple breakdowns of their commitments and usage of my data.”—R3P11, when asked what inspired trust
“Ability to read side notes about the consumer data rights being an Australian government backed product.”—R2P23, when asked what inspired trust
“Does not define cybersecurity levels. Just telling me it is 'secure' [is] not sufficient.”—R3P35, when asked what reduced trust
“More information about who [ADR] is - who is running it? How long have they been running for? What kind of steps are taken to ensure data is secure?”—R3P25, when asked how to improve trust
Findings
What did we learn?
The participants in our research demonstrated various expectations and needs relating to:
- The display of information
- Control and choice
Separation of consents (bundling)
Can collection and use consents be granted in a single action without reducing empowerment or comprehension?
Finding
Many round 1 surveyed and round 2-3 moderated interview participants emphasised the importance of the link between the data requested and the service delivered. More than 65% of surveyed participants rated “tell me what the information will be used for” as “high importance”.
All moderated interview participants were presented with a bundled collection and use consent flow for a Personal Finance Management service. Participants were able to identify independently that the data requested through the collection consent would allow the service outlined under the use consent to function correctly. As a result of these consents being bundled and essential data clusters being presented as text-only information, the majority of participants believed that if they shared less information, the service delivery would be impacted. Some participants stated that explanations of what and why data was needed inspired trust.
Evidence
"I share my information then […] the app can process the data and help me understand my financial goals, my spending, etc."—R2P1
"If they're gonna process [my information] and offer me some advice, then they're gonna need some information. Otherwise they're blundering around in the dark. So it feels like they're offering me something and they need something from me in order to do that. Like hiring an architect and saying, ‘design me a really beautiful house’. And they go, ‘well how big's the block?’ And I go, ‘Oh, can't tell you that.’"—R3P2
"[The reason I need to share this information] seems to be so that they can analyse expenses and provide you with, as it says, tailored suggestions. So that kind of makes sense because I know, initially reading the text on what they need, I was like, ‘Oh, that's quite a lot of information.’ But then in terms of the service that's being provided, in terms of analysis and tailored suggestions, in terms of the financial position, that kind of makes sense."—R3P2
Despite collection and use consents being bundled, consumer participants engaged with the requisite details of both consents.
More than 70% of round 3 unmoderated prototype task participants engaged with content related to the purpose statement (e.g. reading it out loud, or referencing the purpose in their comments). This was also further evidenced by comments made during interview sessions, as well as the consistently high comprehension of the purpose of the consents.
As such, the evidence suggests that bundling collection and use consents does not meaningfully reduce engagement and informed consent.
To consumers, the data requested and the service being delivered are inextricably linked. Bundling of collection and use consents accurately reflects consumers’ mental model of providing access to data for a service. A use consent outlining a clear description of the service also provides consumers with reassurance and clarity to justify the data requested in the collection consent.
If use and collection were required to be granted in separate actions, this would break consumer mental models, resulting in a consent flow that may feel unnecessarily onerous. The separation of these consents may also negatively impact the comprehension that the data collection is needed for the service to operate effectively.
Similarly, 2021 research into disclosure consents suggested that bundling a collection, use, and disclosure consent aligned with consumer mental models where the disclosure consent was essential to the provision of the service. The research focussed on a rental application proposition, where the sole purpose of collecting data was to disclose insights to a real estate agency.
Abiding by the Data Minimisation Principle (DMP) will help make the link between the data requested and use case clear. A clear purpose statement also helps highlight the relevance and importance of the data requested.
Research shows that the Personal Finance Management service was easily understood by participants. Opportunities exist to conduct further research using other use cases or sectors.
Opportunities
Existing requirements could be reviewed to allow collection and use consents to be requested and granted in a single action.
Can multiple uses be requested in a single flow without impacting comprehension or trustworthiness?
Finding
During moderated interviews, participants were asked “What if you could choose to share more information to get extra services?” Some participants were also presented with examples verbally or visually. While a few participants were hesitant to share more data, overall feedback indicated that if consumers saw value in the additional services, they would be open to sharing more data.
A couple of participants reacted positively to being presented with additional services as “opt-in” during the consent flow, as it gave them the impression that the ADR was not asking for data unnecessarily.
Evidence
"Depends what the extra services are and if I think they would benefit me at all. If I’m increasing the information I’m providing, I would want the app to more accurately tailor solutions to me, tell me where my money goes and assist me in reaching my financial goals in other ways."—R2P6
"As long as they at the beginning stay unchecked and being like, ‘you can optionally add these if you want.’ […] From a consumer perspective it would be a like big green flag for me."—R3P1
"Yeah, I think that [extra data for extra services] could be helpful. That's probably something that I would do. I always like to have the full functionality of everything that I'm using."—R3P7
"If those services appeal to me, I probably would do it. […] It's good [that it’s asking for more information] because if you don't tick it, you know that it's not taking all that information at the same time. But yeah, I mean that makes sense that you would need more information to be able to actively do the gradual bill payments. […] If you don't need the automatic bill payments thing then [you] don't need to give access to the saved payees."—R3P8
Consumers are open to opting in to additional services at the time of consent if those services feel related, relevant and valuable. They expect any additional data requirements to be explicitly stated to allow them to make informed decisions.
Consumers expect additional uses to be presented as opt-in. Opt-in uses may also give consumers faith that the service is not asking for more control or more information than is necessary.
Opportunities
The research supports requests being made for related but non-essential “add-on” uses in a single consent flow, provided they require active selection by the consumer.
Pre-selected and actively selected options
Can required datasets be pre-selected or clearly indicated without impacting empowerment and comprehension?
Round 1 unmoderated research participants were presented with the option to actively select up to 3 datasets to be shared. Participants were able to proceed with any number of datasets selected (including none); there was no indication about what data was required. This is reflective of the current approach outlined in the CX Guidelines.
In round 2-3 interviews and unmoderated research, participants were presented with datasets as “information only”, with no option to actively select or deselect datasets. Round 2-3 interviewed participants were asked about their control preferences for datasets in the context of the PFM use case.
Finding
The link between datasets and uses is front of mind for consumers (for more information, see Proposal 10 in this report). Participants across all three rounds spoke to the connection between the data being requested and the Personal Finance Management use case. Many participants commented that all data requested would be needed for the service to operate effectively, even when presented with the option to actively select.
Most round 1 unmoderated prototype task participants opted to share all datasets, despite being able to proceed with fewer. As these participants were not interviewed, we don’t have definitive insight into their motivations for doing so. Some did comment out loud that they felt all datasets would be needed for the service to operate effectively, suggesting that they were weighing the data requested against the service they expected.
All round 2-3 moderated interview participants identified that the data requested would allow the service to function correctly. Many of them also believed that the service would be impacted if less data was shared than what was requested. Even with active selection removed, many participants were observed engaging with data cluster information, including reading clusters out loud or commenting on the logical link between clusters and the service on offer.
Evidence
"Let’s do everything [all datasets]… Not a big fan of sharing this, but to use a finance app like this it makes sense."—R1P22
"They want to be able to help you with your finances going forward. So they have to see what you're doing already, I guess, any patterns and where you spend most of your money and I can see why they're asking for the information."—R2P4
"If the app is kind of consolidating all my banking information, then it's gonna need to understand all the transactions and stuff in my bank."—R3P7
Finding
Half of round 2-3 moderated interview participants expressed a desire for control over datasets. Of these participants, most were keen for this level of control despite acknowledging that the service may be impacted if they shared less. A couple also acknowledged that they may not be allowed to proceed unless all datasets were selected, but still commented positively on what they called the “illusion of control” (R3P1).
Some participants stated they saw no need for control over datasets, as they believed all datasets listed were needed to deliver the service. One stated they’d like to start with less data, to trial the service, with the option to add more data clusters later to enhance the service if they felt it was worthwhile. One participant expressed a desire for control at a permission-level (rather than at the data-cluster level).
The fact that data was essential to the service was also echoed by a few round 1 surveyed participants. One participant stated that offering too many choices reduced their trust in the ADR, preferring a reasonable selection of consolidated options.
When asked about items that inspired or didn’t inspire trust, or how to improve trust, control over datasets was not front of mind for participants. No round 1 unmoderated prototype task participants mentioned having control over datasets as inspiring trust, and just one round 3 moderated interview participant suggested “giving people the option of what information can be provided” as a way to improve trust.
Evidence
"It says here though, if you want a proper analysis, it needs all this [data]. So I don't really want to [have control]. I don't think having three boxes to tick these would be possible without compromising the service. So there's no point."—R2P2
"I'd feel better about [being able to choose to share less]. Sometimes you need to try before you buy. Where you can try a little bit, see if it works for you, and then you can always build on it. That way you've got the capacity for the service to build trust and just sort of see how it goes without committing all that information so upfront. […] If you're not giving them as much information, then you're not actually gonna be able to get as much. The recommendations, I imagine, and the analysis is obviously gonna be restricted as well. So I guess initially I'd feel comfortable probably just sharing account balances and […] maybe scheduled payments. And not transaction details."—R3P6
"Well, [control over datasets] sounds good. Although, if I withheld certain information, I wouldn't get the right advice that I'm seeking. It would be incomplete. Because I think they do need all this information. […] So I would say that, I suppose it would make me feel good if they give me that option [to choose to share less]. Not that I would take up that option in view of what service I require, because they wouldn't be able to do their job properly."—R3P5
Few participants spontaneously mentioned a desire for control over what data would be shared. However, when probed during round 2-3 moderated interviews, many did express an interest in this control.
Most consumers were able to infer when data clusters are reasonably needed for the service. They understood that sharing less data would impact the service offering. The absence of control over datasets may have helped consumers draw this connection.
Active selection of datasets may be seen by some consumers as a marker of empowerment and control. This is despite an understanding that the service may be impacted by sharing less data, or that it may only be an illusion of control, where all data sets must be selected before they can proceed.
For others, active selection of data that is essential to the provision of the service is seen as a false choice, and an unnecessary step.
Heuristically, requiring consumers to actively select required datasets imposes an increased interaction burden. This increased load may be seen as worthwhile if it leads to better engagement with the information. However, the research indicates that removing the active selection requirement does not meaningfully reduce engagement with data clusters.
Further, technical limitations mean that certain data clusters are a pre-requisite for others (e.g. the Transaction Details cannot be accessed without the Account Balances and Details cluster). This places a burden on consumers to understand technical dependencies and service requirements.
Similar issues have been addressed in other jurisdictions, such as those subject to the GDPR. Cookie consent implementations, for example, include the pre-selection and disablement of required permissions, often with a label of ‘Necessary’ or ‘Essential’. This conveys that the pre-selected permission is not optional for the service to operate, and as such cannot be de-selected.
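To make this pattern concrete, the sketch below models essential data clusters as pre-selected and non-deselectable, mirroring the ‘Necessary’ cookie category. The cluster names and the interface are hypothetical illustrations, not drawn from the CDR standards.

```python
# Illustrative model of the 'essential vs optional' data-cluster pattern.
# Cluster names and this API are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class DataCluster:
    name: str
    essential: bool          # essential clusters cannot be deselected
    selected: bool = False

    def toggle(self) -> None:
        """Flip the selection state; refuse for essential clusters."""
        if self.essential:
            raise ValueError(f"'{self.name}' is essential and cannot be deselected")
        self.selected = not self.selected

clusters = [
    DataCluster("Account balances and details", essential=True, selected=True),
    DataCluster("Transaction details", essential=True, selected=True),
    DataCluster("Saved payees", essential=False),  # opt-in for an add-on service
]

# Only the optional cluster can change state; essential ones stay selected.
clusters[2].toggle()
```

In a consent screen built on such a model, essential clusters would render as plain information with an ‘Essential’ label, while optional clusters would render as unchecked controls requiring active selection.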
Opportunities
To support simplification and informed consent, data clusters that are essential to the provision of a service (that is, the service cannot be delivered without them) could be clearly indicated without the presence of an interactive component, such as a checkbox or toggle.
CDR participants could be allowed (but not required) to do this if the good or service cannot be delivered without the requested data. This would require a reconsideration of existing requirements that prohibit pre-selection and require active selection.
However, where datasets are genuinely optional because they are not essential for the service to function, maintaining existing requirements that prohibit pre-selection would better match consumer expectations and align with the DMP.
Allowing optional permissions to be requested alongside ‘essential’ permissions could also be considered (see findings related to multiple use consents being requested in a single flow).
The DMP will factor into understandings of what is ‘essential’ or ‘required’. Consideration could be given to the scope and definition of a “good or service”, and how this might be governed by the DMP.
Further research on consumer control would be beneficial, particularly as the CDR expands to support other sectors, use cases, and the initiation of payments and actions.
Where the consent duration is essential to the provision of the service, can this be pre-selected or clearly indicated without impacting empowerment and comprehension?
Round 1-2 unmoderated prototype task and round 2 moderated interviews participants were presented with duration as “information only”, with no option to actively select or deselect duration. This is reflective of the current approach outlined in the CX Guidelines.
Round 2 moderated interview participants were asked about their control preferences for duration in the context of the PFM use case.
In round 3 moderated interviews and unmoderated prototype tasks, participants were presented with duration as “information only”, with no option to actively select or deselect duration. Duration was clearly displayed alongside Collection consent information as well as Use consent information.
Round 3 moderated interview participants were also presented with an alternative scenario after completing the main prototype task. In this second scenario, duration was presented as options to actively select, and the durations for Collection consent and Use consent were conflated.
Participants were asked about their control preferences for duration in the context of the PFM use case.
Finding
When asked to rate the importance of control over consent duration, 33% of round 1 surveyed participants rated this as “high importance”. This represents the lowest “high importance” ranking of the survey round.
In the case of round 2-3 moderated interviews, participants shown a Personal Finance Management service requesting ongoing data access for 3 months were generally comfortable with this duration. However, many still expressed a desire to select a longer or shorter period. Some proposed a 6- or 12-month duration to gain better insight into their spending habits. Others suggested a 1-month trial would suit them better.
One participant explained that they were uncertain about the most applicable duration for the service, so if given the choice, they would ultimately select whichever duration was “recommended” by the ADR. This echoed findings from the round 1 surveys, where one participant similarly expressed a desire for the ADR to provide sensible options.
Evidence
"I probably wouldn't [want to choose a duration], just because at this point I don't really know what the app is doing so much and I don't really have a feel for it. […] So at the moment […] if you did offer me multiple options, I'd probably just choose whatever the recommended one is."—R3P1
"I think it's good to have options… people are at different points with different needs, so giving people the option for three months, six months, or 12 months, I think it would be worthwhile, definitely."—R3P6
Finding
Some moderated interview participants expressed a desire for control over the period of historical data that would be shared. These participants cited major life events such as moving overseas or buying property as not reflecting true spending habits and therefore impacting the accuracy of the service. In these instances, participants explained that they would like to be able to specify the historical data and duration timing (past or future dates) for the sharing arrangement.
Evidence
"I also think if they gave the option of different amounts of times that you could backdate the information, that would be better. Because I know with my finances, I wouldn't need access to 12 months back because I bought a house so my banking changed quite a bit."—R2P8, referring to the 12 months historic access
"It would be handy to specify the period that you want it to work for. Our current spending is extraordinary to our [usual] spending habits. I’m happy with the 3 months, but for us it wouldn’t be worth starting until we get back to something a bit more ‘normal’. If I had the option to do 3 months, but starting from 1st November - delay the starting time. Or possibly do it in arrears, back track it to a time where things were ‘normal’."—R2P7, referring to selecting a date range to analyse
Consumer participants thought critically about the link between the service on offer and the consent duration being requested. In the case of the Personal Finance Management service tested in the research, what was considered to be the most appropriate period was heavily influenced by personal circumstances.
Some consumer participants expected that the data recipient would determine the most appropriate duration option for the service.
Others, despite feeling comfortable with the suggested duration, wanted to be able to choose from different duration options.
Opportunities
While some goods and services can offer a range of consent duration options without resulting in a service impact, other use cases may require certain durations to function properly.
Where a specific duration is necessary for a service, data recipients could be allowed to pre-select the duration or specify the duration in text form. This would diverge from the current requirement for consumers to actively choose the period of the consent, or whether the consent would apply on a once-off or ongoing basis.
If optionality and flexibility exists, allowing consumers to choose a duration beyond the minimum duration would best support consumer empowerment. Consideration could be given to permitting data recipients to present the minimum required duration as “recommended”. However, allowing any duration to be presented as “recommended” other than the minimum required could mislead consumers.
Further research on minimum access periods for other goods and services would build on insights from the consent review research, which focused on a Personal Finance Management service.
Allowing for control over historic data access durations could be considered in the future, to empower consumers to adjust this depending on their individual circumstances.
Withdrawal of consent information
Can withdrawal information shown during consent be simplified without impacting comprehension and empowerment?
Finding
More than 60% of survey participants rated “before I give permission, let me know that I can withdraw it at any time” as “high importance”. This importance was echoed in feedback received during unmoderated prototype tasks and moderated interviews, with almost half of interviewees commenting positively on the withdrawal information shown during consent.
The fact that consent could be withdrawn was also cited by some as a trust marker when asked “What parts of the process inspired trust?”
Some participants also commented that the statement that consent could be withdrawn at any time gave them confidence to proceed, knowing they were empowered to withdraw consent later.
Evidence
"And also that we can stop sharing our data at any time. I think that's really good as well. So if you read something in the news about a certain company and the fact that they might have had a wormhole and been hacked or something like that. The fact that you can stop sharing your data and potentially re-share it again, when you have more confidence in the company is really good."—R2P6
"I guess if I'm at this point, I'm obviously quite interested and if my friend has made the recommendation and has spoken so highly of it, I think that would lend itself for me to feel a bit more comfortable doing it, knowing that I can withdraw consent at any time."—R3P6
Finding
In contrast to participants’ appetite for the statement that consent can be withdrawn at any time, being given information prior to consent on how to withdraw consent, or the consequences of withdrawing consent were rated as “high importance” by less than 50% of survey participants. Prototype testing supported this finding, with moderated interview participants across rounds 2 and 3 not commenting on the absence of this information during consent, nor did they express a desire to know this.
The absence of withdrawal instructions meant that participants in rounds 2 and 3 were less likely to be able to answer the question “How would you stop the collection, use and/or disclosure of your data?”. Half of these participants were able to correctly answer this question. It stands to reason that the absence of withdrawal instructions would impact participants’ recall of those instructions. Nevertheless, the fact that the majority of participants were able to identify that they had control to manage or withdraw their consent shows that this aspect of CDR was clearly understood.
Despite consequences not being shown during testing, unmoderated task and moderated interview participants were largely able to infer that not sharing their data would result in the ADR’s service being affected. More than 75% of participants were able to correctly state the consequences.
Evidence
"[…] Make sure withdrawal ability is available without complexity (I can do it myself, not contact a call centre/support)"—S2P3
"The detailed information given to remove/stop sharing the data provides the confidence in me that the process is well thought [through] and [trustworthy]."—R3P21
Finding
Many round 2-3 moderated interview participants commented on withdrawal information shown during pre-consent, authorisation and post-consent. Some accessed this information via the CDR policy during consent.
Research shows that consumers engage with information at different stages, depending on their comfort levels. Repeated inclusion of the statement that consent can be withdrawn at any time served as a trust marker.
The fact that consent could also be withdrawn via their data holder was valued by a few participants.
In the CDR receipt, consent management information and withdrawal instructions were most frequently mentioned by round 2-3 moderated interview participants when asked “what information is important to you”.
Evidence
"[…] then there's the withdrawal, which is good and important and explains how to do that […] So you can do it through the dashboard, the bank's dashboard or by writing to either party. So you've got three options. Okay. So that's, that's good to know"—R3P6, when exploring the CDR Policy
“it has the information about just in case if I wanna stop. I won't be wondering where is that setting in the app? It says that here […] exactly where it is just in case. If I still don't do that, I can also email, there's an email address, to reach out to. So I like that piece of information.“—R2P1, when exploring the CDR receipt
“[The CDR receipt email] gives you a reference if you wanna stop sharing. Now that I've got that email, I can hopefully see how to do it.“—R2P2, when exploring the CDR receipt
Communicating that consent can be withdrawn at any time is important for building trustworthiness and confidence. Consumer participants appreciated this being mentioned at various stages of the consent flow and throughout the consent model, with some stating that this gave them confidence to proceed.
Full withdrawal details in the CDR policy were appreciated. Likewise, withdrawal instructions in the CDR receipt reassured those who felt they may want to withdraw their consent before the end of the consent period.
Consumer participants expected the process for withdrawing consent to be intuitive, easily accessible, and self-service.
Opportunities
The existing withdrawal process largely meets consumer expectations, but certain requirements could be reconsidered.
The requirement to show withdrawal instructions in the consent flow could be removed and provided in the CDR receipt instead.
The requirement to state the consequences of withdrawal up front could instead be reserved for if a consumer decides to exit the consent process, at which point the CDR participant could contextually state the consequences of not proceeding.
Existing requirements to include full withdrawal details in the CDR policy and CDR receipt meet consumer expectations. Currently, CDR participants are required to include information provided when obtaining consent as part of their CDR receipt. If requirements for withdrawal instructions and consequences are removed from the consent flow, the CDR receipt requirements could be strengthened to explicitly include these elements.
The data holder dashboard requirement for withdrawal to be no more complicated than the process of giving the authorisation could be expanded to apply equally to consent withdrawals.
Supporting parties
Does the consistent display of supporting parties better align with consumer expectations?
Finding
Many round 1 surveyed and round 2-3 moderated interview participants wanted transparency and to know more about the ADR and their associated third parties before data sharing. Furthermore, a few wanted assurances that any parties accessing the data were reviewed or audited, “legitimate” and “Australian based”. This aligns with information found in the CDR policy, as per Rule 7.2(4).
Roughly 75% of round 1 surveyed participants rated being told whether “any third parties used by [ADR] will access their information” as “high importance.” By contrast, roughly 36% of surveyed participants wanted the “[ADR]'s policy so I can read more about how third parties access and use my information.” While the ‘how’ information was rated lower in round 1 surveys, it still acted as a trust marker for certain behavioural archetypes in subsequent round 2-3 moderated interviews and unmoderated prototype tasks.
Evidence
“In my opinion, basic requirement is for [ADR] to be transparent and let the user know what activities have been performed on the data, what data has been collected, what data has been shared, with whom the data has been shared and when the data is removed from their database.”—S3P12, rated “Tell me if any third parties used by [ADR] will access my information” as “high importance”
Finding
While round 2-3 moderated interview participants seemingly valued upfront transparency and information about the third parties involved, there was an overall negative sentiment towards third party access. They explained that their preference is to have their data accessed by as few parties as possible, believing that “it’s not as safe”. This echoes participant attitudes from 2020 research, Phase 3, Rounds 4-5. Some believed that “a lot of companies have 3rd party affiliates” and that this information is usually hidden “in the terms and conditions or small print.” This also garnered negative sentiment and fostered mistrust.
In addition to knowing if any third parties are associated with the ADR, round 1 surveyed participants wanted transparency and control around the parties who may access the data. This echoes 2020 Phase 3 Research R4-5, and CX Guidelines (Checklist references 1CO.03a.03, 1CO.03b.08, 1CO.03c.15).
Evidence
“To me that makes it a little bit more negative. That it means my data's been shared and is out there being shared by a lot more people than I originally thought. It's just how I'm thinking. It's not a positive thing […] in my head somehow that it's not as safe. That there's more ways that my taking information can be used fraudulently.”—R2P4
“This is the stuff I don't like, […] they can give your data to third parties. I'd still be willing to do it because I understand why they would have to share that information, at least for what it's saying. Obviously it'd be better if you only had to share it with [ADR]. […] I don’t know how much I'd look into it, but one thing is, […] do their third parties also need to give it out to someone else?”—R2P8
“If third parties are used give me details about each one with an explanation of what data they will receive and why it is required. I would also like the ability to allow/disallow third parties individually.”—S1P26
CX research has consistently shown the importance of outlining all parties involved in the process who may access the data.
Consumer participants expected transparency around any OSP/intermediary involvement to allow them to make informed decisions about their consent.
Opportunities
Existing requirements could be reviewed to consider a consistent presentation of information relating to sponsors, principals and OSPs alike.
As per the CX Guidelines (Checklist references 1CO.03a.04, 1CO.03b.02, 1CO.03c.12), this could include the name(s), related accreditation number(s), and links to the related CDR policy of any supporting parties.
In other jurisdictions, such as those governed by the GDPR, data recipients alert consumers periodically if/when supporting parties change. Such updates could be considered for CDR to ensure consumers are informed on an ongoing basis.
Data language standards
Can the data language ‘permissions’ be referred to in a more conversational way?
In round 1 surveys, participants were asked to rank information about “what will be accessed using consistent and standardised language” as low, medium or high importance when data sharing using the CDR across different use cases.
In the round 1 unmoderated prototype task, the Consent step initially presented data clusters with an optional progressive disclosure design pattern. Upon expansion of the progressive disclosure, permissions were presented as a bulleted list.
In round 2 moderated interviews and unmoderated prototype tasks, the Consent step initially presented data clusters in a conversational manner. Upon expansion of the progressive disclosure design pattern, data clusters were presented as a bulleted list. Individual permissions were omitted from the Consent step.
In round 3 moderated interviews and unmoderated prototype tasks, the Consent step initially presented data clusters as a bulleted list. Upon expansion of the progressive disclosure design pattern, permissions were presented conversationally in short paragraphs.
In rounds 1-3 moderated interviews and unmoderated prototype tasks, participants were also able to access data clusters and permissions as a bulleted list in the CDR policy, CDR receipt or during the Authorisation step with the data holder.
Finding
Roughly 56% of round 1 surveyed participants rated information about “what will be accessed using consistent and standardised language” as “high importance.”
Past research suggested that banking data language was comprehensible. This was further echoed in rounds 1-3 prototype research. As such, participants were not always compelled to dig into the permission-level details of the data clusters. Roughly 61% of rounds 1-3 moderated interview and unmoderated prototype task participants engaged with at least one data cluster accordion during the Consent Flow. Overall, very few comments were made about the way the data language information was displayed, which could imply that participants felt neutral about it. Of those that anecdotally provided feedback:
- Two participants commented on the difference between the conversational and bulleted list display of information:
  - one appreciated the permission-level detail of the bulleted list;
  - another commented they found the bullet points easier to scan.
- One participant believed that the conversational permission-level details on the Consent screen were “an explanation” of the data cluster, while on the Authorise screen they described the bulleted list of permissions as a “big list of stuff”.
- One participant felt that the permission-level detail may be overwhelming to other consumers, and suggested showing just data cluster level information.
In the scenario that included energy data, one surveyed participant suggested the inclusion of ‘additional descriptions’ to facilitate informed consent. This echoes 2020 research and recommendations in DP213 CX Standards | Energy Data Language.
Evidence
“[I want a] breakdown what each item does that they are looking at, e.g. what concessions mean, what usage means etc.”—S2P4
"The fact it tells me what’s been accessed is fantastic as well. Are you able to reduce the amount of information to make it less overwhelming? e.g. “we’ll access [data cluster level only] in order to give you accurate and tailored information in relation to your banking” This [permission level dot points] could create problems, it feels like a lot of information. If people want to go hunting, then they can. I feel like if it was simplified slightly. Same with [your consent screen]. This [data accessed section] probably feels like too much information and that people feel like they would probably try and cancel."—R2P6, when exploring the CDR Receipt
"Just showing a bit more of a breakdown in less of like this and more of a breakdown in like dot points I think would be good."—R3P1, when exploring the Consent step
“It's the beautiful dot points. This is what I was talking about, this is good. This is nice.“—R3P1, when exploring the Authorisation step
Consumers scan and process information differently.
- For some, data language lists made it easier for them to scan and understand permission details.
- For others, data cluster headings with short conversational paragraphs describing and explaining permissions were favourable.
Banking data language was easily understood by consumers in both formats. CX research on energy and telco language in 2020 and 2022 showed that some technical terms and jargon were unavoidable.
Opportunities
Flexibility in how data language is presented to consumers would help support different consumer preferences and comprehension of complex terms, which may differ by sector or target market.
The existing CX standards could be amended to explicitly allow flexibility in the format and presentation of the data language standards.
Further research could be conducted to refine CX guidelines on structure and content preferences for different sectors.
90-day notifications
Should the requirements for 90-day notifications be amended to provide clarity on their content, and to allow flexibility for consolidating them?
Finding
Round 3 moderated interview participants saw value in reminder notifications for their sharing arrangements. They explained that reminders bolstered their sense of control and encouraged them to make informed assessments about their consents.
Feedback from a few participants suggests that the timing and channel of notifications could be flexible, and tailored to suit the urgency and sensitivity of the consent. For example, one participant suggested that write-access consents may require expiry reminders with more notice than read-access consents, due to the higher effort required to set up other automations.
Many participants assumed that reminder notifications would contain actionable steps to provide them with control, such as calls to action to review or withdraw their consent.
Evidence
"It's really good [to receive reminder notifications] because it will give me the assurance that I am in control. And I can change things if I needed to or I wanted to."—R3P3
"[Reminder notifications are] a good way to keep it updated. And have an electronic reminder, like a text message. I think it's better than the email just because you can see it straight away. With the emails, you can see it but not as straight away. So I would, if that's something with the content and you need to renew some agreements, then I would definitely do it via a [SMS] message reminder."—R3P4
Finding
Where consumers may have multiple consents active with one ADR, round 3 moderated interview participants expressed concern about notification fatigue. Some suggested consolidated reminder notifications for all active consents to overcome this.
Evidence
"I guess [receiving notifications a few days or weeks apart] would be fine […] depending on how often they are […] if it were every week, it would be easier to ignore them."—R3P8
"[Getting multiple notifications] might start to get annoying. Maybe it could be a monthly email saying what access is gonna expire in the next month or something. Yeah, I feel like if you get multiple emails, particularly if you got a lot of banks, that might start to get a bit frustrating."—R3P7
The value of 90-day notifications is clear. However, the rigidity of the current requirements for their delivery schedule may result in notification fatigue, particularly as CDR adoption grows.
The lack of detail around notification content means notifications may arrive without an actionable next step, which can result in frustration and disengagement.
Opportunities
The requirements could be reviewed to allow for flexibility to consolidate notifications. This might include guidance around the timing of consolidated notifications, to ensure consumer protections are maintained.
Consumer control and empowerment could be improved if CDR requirements specified that 90-day notifications require an actionable step to review active consents.
Consideration could also be given to allowing CDR participants the flexibility to deliver notifications via different channels, depending on the urgency or sensitivity of the notification.
CDR receipts
Would specific guidance on what to include in a CDR receipt help to better meet consumer expectations?
Finding
Round 2-3 moderated interview participants were shown an example of the CDR receipt that included more detailed information, compared to what was presented during the Consent Flow. The example CDR receipt covered:
- Collection information outlining data accessed (using data language with data clusters as title and permissions as a list), as well as duration and frequency;
- Use information outlining the service provided, as well as duration and frequency;
- Redundant data handling information and a link to the CDR policy;
- Data management information outlining withdrawal timing, instruction and consequence.
Overall, consumer participants felt that the right amount of information was included. In particular, they placed value on information that improved transparency and their control.
Consent management and withdrawal information and instructions were most frequently cited as important information. A few participants also cited detail about what data was being accessed, duration information, and contact details for the ADR.
Evidence
"It's a confirmation of what I've just agreed to, but it's in black and white in case I didn't quite understand. And I've got it as an email there as a backup which is good."—R2P4
"I don't think more [information], but I think that this is just the right amount of information. I wouldn't want any less. Certainly, this is good. This is excellent"—R3P5
Finding
When asked whether they felt like any information was missing, some round 2-3 moderated interview participants felt that nothing was missing.
Where participants did identify missing information, it related mainly to gaps in knowledge they had expressed during the data sharing process. This perhaps reflects less a need for this information to be included in the CDR receipt specifically, and more a general desire for further information. The following were mentioned by a few participants:
- More specifics about what accounts were shared, such as account names (but not numbers)
- Samples of actual data accessed, such as actual fees and interest
- More contact details for the ADR, such as phone numbers
- Information about the ADR Accreditation, such as the accreditation number
- Information about why the data is being accessed, such as the purpose and the DMP
- Information about closing their account with the ADR, not just stopping data access
- Information about how and why data may need to be retained for legal reasons
CX research suggested that CDR receipts play an important role for informed consent and consent management.
The level of information provided in the research was broadly seen as sufficient and aligned with expectations, though some participants desired more detail.
Opportunities
Existing requirements could be revised to explicitly state what information to include in the CDR receipt. The specifications for a CDR receipt could be drawn from the artefact that tested successfully in CX research.
CDR receipts can continue to act as a record of the data sharing arrangement, with links to additional information (such as the CDR policy) as appropriate. This is especially important if critical information provided in the consent flow is reduced or only accessible behind a click.
Information relating to withdrawing consent was regarded as valuable. The research findings suggested that consumer expectations and control could be supported by providing the full details of a consumer’s right to withdraw, including instructions for how they can do so, in the CDR receipt.
Would further guidance on when to provide a CDR receipt better meet consumer expectations?
Finding
The majority of round 2-3 moderated interview participants valued receiving the CDR receipt, with just under half stating that they expected to receive a confirmation or record of the sharing arrangement. Of those that hadn’t expected a CDR receipt from the ADR:
- one explained that they expected it from their data holder;
- another stated they’d never received an email confirmation when authorising accounts with apps in the past;
- some suggested the CDR receipt provided transparency, which reassured them that the process was legitimate.
Echoing past research, it was believed the CDR receipt would be valuable as a reference point in the future.
Evidence
"The fact that I got a receipt saying they’re using my data is fantastic - no one sends this stuff. The fact it tells me what’s been accessed is fantastic as well."—R2P6
"[CDR receipt is] a good confirmation because the notification's not necessarily helpful, but the email is, because then I can go back and check on something, if anything doesn't match further down the track and go, I'm sure this was on the such and such today. And I can do a search in my inbox and find this. Where it might not be evident from the app."—R3P2
CDR receipts provide a point-in-time record of the consent given. This record is valued by consumers and expected by many. The CDR receipt provides another trust-marker for participants to feel reassured about their data sharing arrangement.
Opportunities
Further research could be conducted to understand meaningful triggers for CDR receipts and whether existing receipt delivery requirements could be expanded, but based on heuristic analysis the following could be considered:
- expiry (not just withdrawal);
- updates regarding redundant data handling, such as when data is expected to be deleted following consent expiry;
- the fact that data has been deleted once this has occurred.
Further research could be conducted to understand consumer appetite for CDR receipts when providing multiple consents in quick succession.
Dashboards for once-off consents
Are dashboards necessary for once-off consents?
Finding
When the consent was once-off, some round 3 moderated interview participants were comfortable with not having access to a dashboard to manage/withdraw their sharing arrangements. They explained that they “would feel okay with it because it's for such a short period of time.”
However some saw value in having a consumer dashboard.
- One participant wanted the option to manually withdraw their consent as soon as the benefit/value was realised. They explained that this was their standard behaviour and practice when data sharing to ensure that their data was no longer accessed.
- Another participant saw value in having a way to easily amend the once-off consent. They suggested the convenience of extending/initiating the same consent from the dashboard.
Evidence
“It seems pretty clear from that, well my interpretations of that, you wouldn't need to manage the settings. […] And maybe that's not the case, but I would assume that I don't need to manage it.”—R3P7
Initial evidence suggests that in circumstances where a consumer has only a single once-off consent, a dashboard may not be necessary and a CDR receipt may suffice.
To assure consumers that their data is no longer accessed, greater importance and value is placed on notifying them in writing that their consent has expired, as per Rule 4.18(3).
Opportunities
There may be merit in reviewing the need for once-off consent dashboards, but preliminary analysis suggests that the use cases supported by this change would be limited in scope.
If once-off dashboards are reconsidered, it would be prudent to emphasise other means of managing and withdrawing consent, such as the CDR receipt or, in the case of withdrawal or record access, using a simple alternative method of communication.
Further research on consumer dashboards for once-off sharing and analysis of downstream impacts is recommended.
De-identification and deletion by default
Would a deletion by default approach improve consumer control, empowerment and trust?
In Round 2, during the “Consent” step of the prototype, the following information was given about withdrawal and redundant data handling:
You can stop this at any time using our app. We’ll delete your data when we no longer need it for the service.
In Round 3, this language was refined based on findings from Round 2:
You can withdraw your consent at any time. We’ll delete your data when you withdraw your consent or cancel the service.
Participants were asked their thoughts on this and other information. They were then presented with a scenario of being given the option to have their data de-identified. Some participants were not given an explanation as to why their data would be de-identified and inferred that it would be used for commercial purposes. Others were given examples of how the de-identified data might be used, such as to understand market demographics and improve services, or to sell the data.
Finding
The majority of consumer participants did not mention or identify risks associated with de-identifying their data; one participant, for instance, stated that if de-identified data was used for market research purposes, it would have little direct effect on them. Just 3 consumer participants mentioned specific risks. These echoed past CX research, with one participant each mentioning that de-identified data could be re-identified, that the data could be hacked, and that there was a risk of misrepresentation with de-identified data. The participant concerned with misrepresentation highlighted concerns such as overgeneralised or incorrectly categorised data being used to influence targeted advertising, particularly for members of marginalised communities.
Evidence
“De-identifying is de-identifying but you still have the data. And you can re-identify [it] if you've de-identified. We de-identify at [my workplace]. But if you want to re-identify it, you can. If they wanted to do it […] they can very easily re-identify it.”—R2P5
“I would a hundred percent choose delete. I think, like I said earlier, when it comes to de-identification, I think it's the thing of misrepresentation. I am very skeptical of data collection in terms of marketing stuff. Like it's very frustrating cause I know it can be used in a lot of good ways, but I guess I just don't trust anyone to do that these days.”—R3P1
“I think I would be comfortable with just de-identifying it […] if it’s just for market research, it doesn’t really affect me that much directly”—R3P8
Finding
Comfort levels with data de-identification were mixed: some consumer participants stated that they’d prefer to have their data deleted rather than de-identified, while others were comfortable with de-identification.
Of those who were comfortable with de-identification, the most common reason given was to help improve products and services for themselves and others. This altruistic motivation has also been observed in past research.
Even amongst the participants who preferred their data to be deleted, a few spoke to the potential benefits of de-identified data to help improve services. While their own preference was for their data to be deleted, they believed that the option to de-identify data could be presented to consumers to allow others to choose this if they wanted to. They expected this to require an explicit opt-in from consumers.
A few participants expected details on how data would be de-identified, and specific details on the purpose of retaining de-identified data. This information would in turn help them determine their preference.
Evidence
“I actually am not bothered at the thought of that at all. ‘Cause I think it can help ongoing products and services. So I'd probably be happy to do that.”—R3P7
“Personally, if it was me, I would just want it deleted. But there might be people who are open to sharing that information and I suppose you can always give people the option to do that. But I think as a default, it should first be to delete and then the other alternative”—R3P6
“If I have control over my data, then I'm totally fine with it. […] I think it's that de-identification to me is a loss of control of that data. So I would actively be like, ‘absolutely not, delete my data’. But I guess that's the thing; if they were like, ‘you don't have to consent to this, but if you want to give us your data for [this purpose], here's a consent form.’ And I would be much more likely to be like, ‘Yeah, sure. Go ahead.’”—R3P1
“I would share it provided they could absolutely vouch that there would be no personal identification there. That there's not a chance of that happening. […] But if there were to be any chance [of there still being personal information], I'd say no. But I would say under normal circumstances with a good company, I would say yes. Provided it's anonymous and de-identified. […] I would expect them to tell me exactly what they're going to [keep] and what they're going to be deleting.”—R3P5
Research evidence to-date highlights that while consumers are open to their data being de-identified and used to help improve services, their understanding of the risks and consequences of de-identification is low.
A deletion by default approach, which requires consumers to expressly opt in to de-identification and retention of their data, would better align with consumer expectations. A deletion by default approach would also better protect consumers who may not understand the risks, by not automatically enrolling them in a system they don’t fully understand.
While some consumers are happy to have their data de-identified, particularly to help improve products and services, others would prefer to have their data deleted. The ability to make a selection that aligns with their preferences would better empower consumers and provide them with control.
Current requirements stipulate that ADRs who de-identify and retain redundant data must provide consumers with the option to elect to have their data deleted instead. However, research indicated that consumers expect their data to be deleted by default, and that any de-identification and retention of their data should require an explicit opt-in. A deletion by default approach, with a request for a de-identification consent, could improve consumers’ trust in CDR participants’ handling of their data.
The requirements for requesting a de-identification consent are similar to, but differ slightly from, those for de-identification of redundant data. The potential interactions between consumer elections to have their redundant data deleted and separately granted de-identification consents are complex and likely to lead to confusion. Consolidating these two requirements and processes could simplify consent processes, the rules, and compliance.
Findings from this research strongly align with de-identification and deletion findings from Phase 3 research.
Opportunities
Existing requirements should be reviewed. A policy position of deletion by default should be strongly considered to improve consumer empowerment and control, facilitate informed consent, and better align with consumer expectations.
Consumers should still be able to expressly opt in to having their data de-identified. This election could apply regardless of whether the data is redundant.
Allowing consumers to make granular selections when opting in to de-identification could improve consumer trust and empowerment. This granular control could allow consumers to opt in to uses they feel comfortable with, and not consent to those they don’t.
Any introduction of granular control should be balanced against the increased cognitive and interaction load, to reduce the risk of consent fatigue.
Next steps
The insights and considerations from this research have informed the development of a design paper for the consent review. This design paper will be consulted on publicly, followed by consultation on any proposed rules and standards.
Further CX research may be considered for future work on the consent model, including to support further simplification, ongoing review, and the expansion of the CDR to other sectors and functionality, such as action initiation.
Quick links to CX Guidelines: