Devices like the Fitbit, Samsung Gear, Microsoft Band and Jawbone are becoming increasingly popular among fitness fanatics.
The wearable technology can be used to monitor heart rates, sleep patterns, calories and even stress levels.
But a new report warns that companies could be feeding users’ personal information to private healthcare and insurance firms, for purposes ranging from the benign to the malignant.
Identity theft, data leaks, discrimination by employers and rising insurance costs are just some of the consequences predicted from the rise of wearable technology.
The new report was released by researchers at American University and the Center for Digital Democracy in Washington.
‘Some of the very features that make mobile and wearable devices so promising also raise serious concerns,’ report authors Kathryn Montgomery, Jeff Chester, and Katharina Kopp have said.
‘Because of their capacity to collect and use large amounts of personal data—and, in particular, sensitive health data—this new generation of digital tools brings with it a host of privacy, security, and other risks.
‘As the use of trackers, smart watches, Internet-connected clothing, and other wearables becomes more widespread, and as their functionalities become even more sophisticated, the extent and nature of data collection will be unprecedented.
‘These data can, in turn, be combined with personal information from other sources— including health-care providers and drug companies—raising such potential harms as discriminatory profiling, manipulative marketing, and data breaches.’
The report also warned of the danger of data falling into the hands of hackers and other unscrupulous individuals.
According to the Department of Health and Human Services’ Office for Civil Rights in Washington DC, there were 253 healthcare breaches across the United States in 2015 that each affected 500 individuals or more, resulting in a combined loss of over 112 million records.
‘The opportunities for data breaches will increase, with hackers accessing medical and health information at insurance companies, retail chains, and other businesses,’ the report added.
‘Even those institutions with the most benevolent of goals—such as public-health departments, law enforcement, and research entities—can misappropriate and misuse health data.
‘The risks extend beyond threats to individual privacy. Algorithmic classification systems could enable profiling and discrimination—based on ethnicity, age, gender, medical condition, and other information—across a spectrum of fields, such as employment, education, insurance, finance, criminal justice, and social services, affecting not only individuals but also groups and society at large.
‘Many of the harms associated with the collection and processing of such data, moreover, are likely to affect disproportionately the most vulnerable people in our society, including the sickest, the poorest, and those with the least education.’
Companies specialising in wearables have been quick to reassure customers that their data is safe.
A Jawbone spokesperson said: ‘At Jawbone, we fully respect the privacy of our users and Jawbone adheres to the best industry standards when it comes to protecting data and personal information. We only share user data if the user asks us to – for example to integrate with a 3rd party app.
‘We are custodians of the user’s data. We collect it, analyze it, and present it back to the user with meaning. The user may give us permission to share that data. They can download their data and take it somewhere else. And they can ask us to delete it (which we will do).
‘We also facilitate users to request deletion of their data from any third party apps that may be connected to UP by Jawbone.’