Privacy Roadmap - Input from Testing team & Software Testing Community
A list of the key privacy issues that should be addressed in the product, based on input & concerns from the Software Testing community & other sources.
Items relating to location data stored on the user's phone are prioritized because that function is enabled globally from today, whereas the Health Authority data sharing will happen more slowly, as it depends on the rollout of infrastructure.
Reduce location data stored to the maximum required accuracy. Suggest 5 DP ≈ 1m (1 degree of latitude ≈ 111km, so 0.00001 degrees ≈ 1.1m). Anything below that is spurious accuracy anyway, but these excess meaningless DPs could act as a correlator to help de-anonymize data: hence not only functionally superfluous and data-inefficient but also a potential privacy risk.
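A minimal sketch of the truncation step above. The function name and precision default are illustrative, not from the app's codebase; the point is simply that stored coordinates are rounded before they are persisted, so spurious digits never exist to correlate on.

```python
def truncate_coordinate(value: float, decimal_places: int = 5) -> float:
    """Round a latitude/longitude value to the given number of decimal places.

    5 DP corresponds to roughly 1.1 m at the equator, the maximum accuracy
    the roadmap item above suggests storing.
    """
    return round(value, decimal_places)

# Hypothetical coordinates with spurious extra precision:
lat, lon = 42.3601234567, -71.0589876543
print(truncate_coordinate(lat), truncate_coordinate(lon))
# → 42.36012 -71.05899
```

Truncation should happen at write time, before the point reaches storage, so the full-precision value is never retained anywhere.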
The user should be advised to remove location-sharing permissions from other apps when they install Safe Paths on their phone (due to the risk of other apps leaking location data, which could be correlated back to their Safe Paths data).
Import of location history from Google or other services should be direct, via an API that does not leave a data trail in Google Drive or similar.
Private data should be encrypted when stored on the user's phone.
Infected patients' data should not be published in plain text. At a minimum it must be hashed. Further protection is desirable (e.g. PSI, as per https://arxiv.org/abs/2003.14412)
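A sketch of the minimum ("at least hashed") option above, under stated assumptions: the record format, salt handling, and function name are all illustrative, not the app's actual scheme. Note that because the space of plausible location/time records is small, an unsalted hash is brute-forceable; this is exactly why the roadmap calls for stronger protection such as PSI.

```python
import hashlib

def hash_location_point(lat: float, lon: float, time_bucket: int, salt: bytes) -> str:
    """Hash a quantized location/time record so raw coordinates are never published.

    Illustrative only: a salt known solely to the publishing/matching parties
    raises the cost of brute-force reversal, but schemes like PSI (see the
    paper referenced above) offer meaningfully stronger guarantees.
    """
    record = f"{lat:.5f},{lon:.5f},{time_bucket}".encode()
    return hashlib.sha256(salt + record).hexdigest()

# Hypothetical published entry: a SHA-256 digest, never the plain coordinates.
digest = hash_location_point(42.36012, -71.05899, 26511840, b"example-salt")
print(digest[:16], "...")
```

Matching on the client side would recompute the same digest from the user's own (quantized) history and compare, so neither side ever exchanges raw coordinates.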
Sharing of location history should be via a direct secure connection to the Health Authority that does not leave a data trail in email folders.
Consumers should provide two independent consents: one to share their data publicly, the other to share their data with a Health Authority for analysis. If they agree to the first, but not the second, the Health Authority should never have access to their non-hashed data.
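The two-consent model above can be sketched as a small data structure. All names here are hypothetical, invented for illustration: the point is that the two consents are independent fields, and public-publication consent alone never grants the Health Authority access to non-hashed data.

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    """Illustrative model of the two independent consents described above."""
    publish_hashed_publicly: bool = False       # consent 1: public (hashed) data set
    share_raw_with_health_authority: bool = False  # consent 2: raw data for analysis

def health_authority_may_see_raw(consent: ConsentRecord) -> bool:
    # Deliberately ignores the public-publication consent: agreeing to the
    # first consent must never imply the second.
    return consent.share_raw_with_health_authority

c = ConsentRecord(publish_hashed_publicly=True, share_raw_with_health_authority=False)
print(health_authority_may_see_raw(c))  # → False
```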
It should be possible for infected patients to remove either of these consents at any time. How to remove a hashed data set from the published data, where the Health Authority does not have the original data, and the patient might not either, may be a difficult problem to solve.
There should be a mechanism by which a user can review the redactions performed by a Health Official, and explicitly give their consent to publish, rather than the current model of assumed consent.
It should be possible for a user to redact their own data before sharing it with the Health Authority.
It should not be possible for a Health Authority to determine when a given user has or has not consented to sharing their data.
The user should be able to give not only binary yes/no consent to share their location data, but also consent to share it at a particular level of accuracy: e.g. 10m, 100m, 1km. If they do not share at 10m resolution, that probably means their data can't be used in a public data set for detecting possible COVID transmission, but it may nevertheless be useful for public health analyses.
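Graduated accuracy consent could be implemented by snapping coordinates to a grid of the user-chosen resolution before sharing. A minimal sketch, assuming the simple approximation of ~111km per degree (exact for latitude, approximate for longitude away from the equator); the function name and constant are illustrative.

```python
DEGREES_PER_METER = 1 / 111_000  # approx.; longitude shrinks with latitude

def quantize(value_deg: float, resolution_m: int) -> float:
    """Snap a coordinate to a grid of the user-chosen resolution (10m, 100m, 1km)."""
    step = resolution_m * DEGREES_PER_METER
    return round(round(value_deg / step) * step, 6)

# The same point shared at each consent level is progressively coarser:
for res in (10, 100, 1000):
    print(res, quantize(42.3601234, res))
```

Coarser grids still support aggregate public-health analyses while making individual-level transmission matching impossible, which is exactly the trade-off the consent levels above express.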
It may be helpful to compile the full set of standards that support each of these points. See: Privacy Standards, References & Papers
Privacy ideas rejected
Create geofenced areas (e.g. the home), where the App never stores location data at all. Rejected because there are two reasons to store location data: one is in case the user is infected in future, but the other is to let the user check their own level of risk against others who may be infected. This second case is why it is valuable to store this data.