Authored by: Henri Kujala
Location technology has emerged as an integral part of the global COVID-19 response. Across the world, governments, frontline workers, and civic authorities are using smartphones and citizen data to identify possible infection outbreaks, deliver emergency services in hotspots, and keep people indoors. India is no different. Citizens are using government-mandated apps to track whether they have encountered infected people and to refer to official guidelines and prevention measures. These apps collect significant amounts of citizen data, such as location and mobility patterns. Information generated by smartphone technologies such as Bluetooth has also become essential to effective, smart governance systems.
In a world full of data, however, trust is a precious and high-value commodity. With the onslaught of new and emerging technologies, data safety remains a heated subject, with concerns about the privacy erosion these innovations bring. To address these concerns, governments, organisations, and technology players must collaborate to inform people's understanding of privacy, enabling them to make informed choices about their data and to protect it from potential mischief-makers.
As things stand, most contact-tracing apps and location-based tools rely on mobile technologies such as Bluetooth to detect when people meet and how close they are, based on the strength of the Bluetooth radio signal. When someone using the app tests positive for COVID-19, anyone identified as having been in proximity with that person is notified to stay at home. However, Bluetooth is not an ideal solution, because it provides very little context.
Bluetooth can detect another phone in its vicinity, but it cannot pinpoint the direction the other person is coming from, nor does it provide reliable timing context. Moreover, the risks associated with using Bluetooth for tracking do not just occur at the time the data is collected; they continue for as long as the data is stored, in particular once it has been linked to an individual. The privacy concerns stem from how governments and organisations could repurpose this data.
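To see why proximity from signal strength is only a rough estimate, consider a minimal sketch of how distance is typically inferred. This assumes a log-distance path-loss model and an illustrative 1-metre reference power; it is not the calibration any specific contact-tracing app uses, and note that nothing in it recovers direction or timing.

```python
def estimate_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Estimate distance in metres from a received Bluetooth signal
    strength (RSSI) using the log-distance path-loss model.
    tx_power_dbm is the expected RSSI at 1 m; both defaults are
    illustrative assumptions, not real device calibration."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# A stronger signal implies a shorter estimated distance.
near = estimate_distance(-59.0)  # 1.0 m at the reference power
far = estimate_distance(-79.0)   # 10.0 m under the same assumptions
```

In practice RSSI fluctuates with walls, pockets, and phone orientation, which is exactly why Bluetooth proximity is noisy and context-poor.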
Federated Learning (FL): the next frontier in location intelligence
Federated learning is one avenue in machine learning that can address this challenge while still giving those entities a degree of control and privacy. This is because FL removes the need for end-users and businesses to transfer their data to external third parties. To understand this better, let us first look at how machine learning typically works.
At present, a great deal of mobile data is collected directly by companies and governments through the apps we use, the platforms we search on, and the places we eat and shop. This information goes to a data center in the cloud, where the company that owns it can analyse the data to create new business models and produce fresh insights into people's behavior. It is this data model that raises the current security and privacy concerns.
With federated learning, the flow is reversed: the model is transferred to the user's device, where it trains on locally collected data. Only the derived knowledge is shared back to the cloud, without any personal information, adding an additional layer of security, while those updates are used to develop and improve the cloud-based service. In other words, the smartphone becomes an 'edge sensor', providing only distilled insights from the edge of the network rather than raw data.
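The round trip described above can be sketched in a few lines. This is a toy example with a one-parameter linear model and hypothetical function names; real FL frameworks (such as TensorFlow Federated) are far more elaborate, but the principle is the same: the raw data never leaves the phone.

```python
def local_update(weight, local_data, lr=0.1):
    """One local training step for a toy linear model y = weight * x,
    using (x, y) pairs that never leave the device. Returns only the
    weight delta: the 'derived knowledge' shared back to the cloud."""
    grad = sum(2 * (weight * x - y) * x for x, y in local_data) / len(local_data)
    return -lr * grad

# The cloud distributes the current model to the phone...
global_weight = 0.0
# ...the phone trains on its private data and returns only an update.
private_data = [(1.0, 2.0), (2.0, 4.0)]  # stays on the device
global_weight += local_update(global_weight, private_data)
```

The cloud sees a single number per round, not the user's locations or habits, and aggregates such updates from many devices.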
However, federated learning alone is not enough to fully protect privacy: it is known to be vulnerable to backdoor and inference attacks. To protect against these known weaknesses, federated learning should be combined with Differential Privacy or other available techniques where necessary.
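One common way to combine the two, sketched below: before an update leaves the device, its magnitude is clipped and random noise is added, so no single user's contribution can be reliably inferred from it. The parameters here are illustrative and not calibrated to any formal privacy budget.

```python
import random

def privatize_update(update, clip=1.0, noise_scale=0.5):
    """Clip an update's magnitude, then add Gaussian noise before it
    is shared, in the spirit of differentially private federated
    learning. clip and noise_scale are illustrative assumptions."""
    clipped = max(-clip, min(clip, update))
    return clipped + random.gauss(0.0, noise_scale)

# A large (potentially revealing) update is bounded, then blurred.
noisy = privatize_update(3.7)
```

Clipping bounds how much any one device can sway the model; the noise hides individual contributions at a modest cost to accuracy.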
Another use case of FL is its power to turn companies into edge sensors. In a world where multiple companies vie for a single customer's data in the same market, FL allows each business to learn locally from its own customer data pool without having to share any sensitive information or intermix it with other companies' data.
Simply put, FL allows machine learning algorithms to gain experience from a broad range of datasets held in different locations. Multiple organisations can collaborate on developing models without needing to share sensitive data directly with each other, and the shared models are exposed to a significantly wider range of data than any single organisation possesses in-house. In other words, FL decentralises machine learning by removing the need to pool data in a single location.
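This cross-organisation collaboration can be pictured as a weighted average of updates, in the style of federated averaging. A minimal sketch with made-up numbers, where each organisation contributes only its model update, never its dataset:

```python
def federated_average(updates, weights=None):
    """Merge per-organisation model updates into one global update
    (a weighted mean, as in federated averaging). Only the updates
    are shared; the underlying datasets stay with their owners."""
    if weights is None:
        weights = [1.0] * len(updates)
    return sum(u * w for u, w in zip(updates, weights)) / sum(weights)

# Three hypothetical organisations train locally, share only updates.
global_update = federated_average([0.8, 1.2, 1.0])  # mean: 1.0
```

In real deployments the weights are typically proportional to how much data each party trained on, so larger contributors count for more.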
To put things in perspective, consider how medical centers could use FL to improve models that detect suspicious injuries in images, and thereby enhance their services, without needing to share the personal details of the individuals being treated. Another industry where customer privacy is paramount, and where federated learning could unlock new services, is the automotive industry.
Imagine a business environment where auto insurers and auto manufacturers can study driver behavior and road conditions, and detect unusual weather or vehicle behavior across thousands of vehicles, to predict and model the required on-road service centers and insurance needs.
Although location technology promises unlimited possibilities, governments and technology companies have a mandate to balance these possibilities against concerns, at both the personal and the enterprise level, in ways that respect and accommodate people's fears about security compromise, data ownership, and the ability to control their own data. This is where FL offers a promising future, because it puts the privacy of the user at the center.
All said and done, emerging technology models such as FL combined with Differential Privacy are only taking their first steps in this direction, and privacy-conscious machine learning does allow companies to adopt a more 'consumer-centric' data strategy. Even so, it is only part of the solution, since half of the world's population has no regular internet access. The universal way forward might be more straightforward: rely on people's sense of goodness and responsibility.
—Henri Kujala is Chief Privacy Officer at HERE Technologies. The views expressed are personal