Location Privacy and Data Ethics

Denise McKenzie is all about location privacy. Like so many in GIS, she is an accidental geospatial professional: she did a degree in Public Policy and Politics and worked for the public service in Melbourne, Australia, where she stumbled upon the field.

The early 2000s were an exciting time: geo was moving out of GIS and onto the web, and Apple and Google were entering the marketplace. She got involved with innovation projects that put geospatial into non-technical people’s hands to help them make better decisions, working on projects such as the spatial smart tag and the Victoria mapping and address service in Australia.

Denise is now based in the UK and is the Chair of the Association for Geographic Information (UK); her passion areas are women in geospatial, diversity, location ethics, and the Benchmark Initiative.

IS IT STILL AN EXCITING TIME TO BE A GIS PRACTITIONER?

It is, but for a different reason from what it was in the early 2000s. Back then, we almost felt like high school students, eager to learn and grow up. So much to do and so many possibilities.

Now we’ve graduated; we’re in our early 20s and have matured. We’ve moved on to playing with the adults, partaking in global decision-making while being ever more aware of bigger issues, like ethics.

WHY IS LOCATION PRIVACY IMPORTANT?

Ten years ago, if anyone needed data from you, they’d ask you to take part in a survey and fill in a form.

Today, we carry mobile devices and sensors on our bodies. Data has become personal, granular, and real-time.

There’s a lot of data out there that can identify you, your children, your husband, and your grandmother. It can be uncomfortable to realize how much organizations know about you and how easily other people can find things out about you from your location data.

Are we mature enough to handle all that personal data about our location? Have governments done everything they can to reassure their citizens that the recent COVID-19 tracing apps they use and the data they collect are dealt with ethically?

DOESN’T GRANULAR LOCATION DATA ALSO DO GOOD? SURELY, IT’S NOT ALL BAD.

There is an opportunity for a greater good to be accomplished with data.

But do governments and organizations know how to use it in a way that will not end up exposing individuals? We need public data, and people sharing data, for the greater good. We also need transparency to make sure group data isn’t revealing patterns of individuals.
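
To make that concern concrete (this example is mine, not Denise’s): below is a minimal sketch of a small-count suppression rule, one common way to keep aggregated location data from singling out individuals. The threshold of five and the record layout are illustrative assumptions, not a standard.

```python
# Minimal sketch: suppress small groups before publishing aggregated
# location counts, so rare combinations can't identify a person.
# The threshold (5) and record layout are illustrative assumptions.
from collections import Counter

MIN_GROUP_SIZE = 5  # hypothetical k-anonymity-style cutoff


def safe_counts(records, key):
    """Count records per area, withholding any group below the cutoff."""
    counts = Counter(key(r) for r in records)
    return {area: (n if n >= MIN_GROUP_SIZE else None)  # None = suppressed
            for area, n in counts.items()}


# Example: trips tagged with a coarse grid cell
trips = [{"cell": "A1"}] * 12 + [{"cell": "B7"}] * 2
print(safe_counts(trips, key=lambda r: r["cell"]))
# -> {'A1': 12, 'B7': None}  (the two-person cell is withheld)
```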

At the Benchmark Initiative, we educate people about providing data, understanding how their data is used, and protecting themselves from data-associated risks.

WHAT IS THE BENCHMARK INITIATIVE?

It’s a thought leadership and entrepreneur program exploring the ethical challenges of using location data.

Over the last year, with several entrepreneurs’ help, we’ve looked at things like the ethics of location data in agriculture, tracing apps, and waste management. We’re currently looking at mobility data.

We need to educate people about the location data collected by their mobile phones and what can be done with it. Our team also looks at development data, improving the processes by which decision-makers understand where data is sourced and how ethically appropriate it is to use in their decision-making and tools.

The Benchmark Initiative is not a standalone project; it works in partnership with the American Geographical Society’s EthicalGeo program on the Locus Charter.

The draft Locus Charter is a set of principles to guide the ethical new world we need to work in. It’s already available, and it will be further updated soon.

Feedback and input are encouraged and very welcome from the global geospatial community.

ISN’T LOCATION DATA PART OF THE BROADER DATA ETHICS ALREADY BEING ADDRESSED?

I’m not sure that’s the case. A significant number of things are already being addressed, such as AI, with some great charters out there, but the conversation is still esoteric.

It’s a high-level conversation. Everyday GIS practitioners have a hard time translating these sets of principles into their daily work to avoid doing harm.

We need to look at our craft and activities and set the right behavior to protect privacy. We need skills and measures, and those can only come from practice.

HOW DOES THE TRANSPARENCY REGISTER HELP IN THIS?

The idea of the Transparency Register was born after speaking with someone working on a project in a developing country. They realized they needed to make so many trade-offs that the whys were inevitably lost in the process and hard to remember in the end.

Their task was to look at flooding. Then at fires. Then at other things. Each time they changed scope, they’d have to make a new set of decisions. Every one of those came with trade-offs; sometimes the data wasn’t available, the location wasn’t available, or the data infringed people’s privacy.

How do you transparently communicate why you’ve made those decisions?

You could write down and keep track of the critical decisions made during the project, with the aid of an agreed standard or template. Doing so keeps transparency and trust open around what you’re doing with people’s location data.

The Transparency Register is a communication tool for letting people know what you’re doing and why.
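
The interview describes the idea rather than a format, but a hypothetical sketch of what one register entry might look like, as a small Python structure, may help make it concrete. Every field name here is my assumption, not an official schema.

```python
# Hypothetical sketch of one Transparency Register entry.
# All field names are illustrative assumptions, not an official schema.
from dataclasses import dataclass, field


@dataclass
class DecisionRecord:
    date: str                          # when the decision was made
    decision: str                      # what was decided
    rationale: str                     # the "why" that is otherwise lost
    trade_offs: list[str] = field(default_factory=list)
    data_sources: list[str] = field(default_factory=list)
    privacy_impact: str = ""           # who could be affected, and how


entry = DecisionRecord(
    date="2021-02-15",
    decision="Aggregate flood-risk addresses to suburb level",
    rationale="Parcel-level data could identify individual households",
    trade_offs=["Coarser maps for emergency planners"],
    data_sources=["National address register"],
    privacy_impact="Low once aggregated; raw extract kept internal",
)
print(f"{entry.date}: {entry.decision} ({entry.rationale})")
```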

WE NEED TO THINK ABOUT THE TRADE-OFFS CAREFULLY

Data is currency. Whatever device you’re using today, your GPS-enabled mobile or watch, the data coming from it is valuable to someone out there.

The traffic authority, marketers, or insurance companies are all after and using your data.

Are you as careful about sharing it as you would be when choosing your bank or investments? Where is your data going? Who’s using it?

Educate yourself about data. Transparency and education should go hand in hand.

CAN EXCHANGING DATA EVER YIELD GOOD? IF SO, HOW?

Right now, cities and transport authorities desperately need data about who’s accessing the various measures they’re putting in place. The City of London, for example, has been looking at micro-mobility: rentable scooters, bikes, and the like.

What they really want to know is if everybody has equal access to these options. They need sensitive information, like the race and gender of the people who are using them. By knowing those, they can see whether anyone is missing from their data and work out why.

That’s really personal.

The only way to do that is to get people to opt in and share that information when they pick up their scooter.

There is a greater good that comes from sharing this type of data. It also happens to be very personal, and you wouldn’t want to make someone’s ethnicity information publicly available.

Our task is to treat data carefully and sensitively because it can do good, but it can do harm in the wrong hands.

IS THE GENERAL DATA PROTECTION REGULATION PIONEERING THESE EFFORTS?

GDPR has prompted us to change the way we look at data.

Back in the 2000s, it was all about innovation. We’ve got the tech and the cool tools; let’s push them as far as we can.

Now, we realize that we can do pretty much whatever we want. The technology is powerful, and we can’t even imagine the next thing we’ll be able to do.

With all that power comes an increased chance to do harm. Just because we can do something, should we?

GDPR was the first significant change in the European space.

No, you can’t just do whatever you want.

There are reasons why something is personal and why people should have control of it. The concept of the information being valuable to an individual needed to be recognized.

GDPR has great points, and it’s had a positive influence in the privacy space. But it’s a process, and there are challenges in how it’s been implemented in various places. That comes down to communication and people’s understanding of the regulations.

ARE WE AFRAID OF MAKING MISTAKES AND DOCUMENTING THEM?

That assumption holds for the tech sector in general and for the geospatial space in particular. We come up with amazing new things, and if someone is caught doing something less than desirable, the media will vilify them for it. No wonder there’s a great fear of doing something wrong and being caught.

There needs to be an acceptance.

No project is perfect. There’s always a trade-off. The data wasn’t quite what you needed, or you didn’t get enough of it. You’re still going to use it anyway, with the best intentions, to understand the impact of what you’re investigating.

In the process of doing so, you might even find yourself in a political situation. Maybe you’ve been looking at flooding, and you’ve accidentally exposed the reasons flooding is bad in a particular area. You’ve created tension between parts of society that you never expected.

SO LET’S START TALKING ABOUT THE THINGS THAT WENT WRONG

Take the Strava example, where soldiers unwittingly exposed military bases while using the app on their daily runs.

What can we learn from that? How can we build systems that don’t expose what they’re not supposed to?
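
The conversation doesn’t prescribe a fix, but one common safeguard is to redact track points near known sensitive sites before publishing anything like a heat map. A minimal sketch follows; the coordinates and the 2 km radius are chosen purely for illustration.

```python
# Minimal sketch: drop GPS points within a buffer of sensitive sites
# before publishing aggregated tracks. The coordinates and 2 km radius
# are illustrative assumptions, not real locations or policy.
from math import radians, sin, cos, asin, sqrt


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in km."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))


SENSITIVE_SITES = [(34.0, 44.0)]  # hypothetical site
BUFFER_KM = 2.0


def redact(points):
    """Keep only points outside every sensitive-site buffer."""
    return [(lat, lon) for lat, lon in points
            if all(haversine_km(lat, lon, s_lat, s_lon) > BUFFER_KM
                   for s_lat, s_lon in SENSITIVE_SITES)]


track = [(34.001, 44.001), (35.5, 45.5)]
print(redact(track))  # only the distant point survives
```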

Let’s get better at sharing the whys of our decisions through the Transparency Register. Let’s write down what we did and why, how it was done with the best intentions. If harm was done, let’s backtrack and work out how that happened. Others can learn from that and from raising the discussion of ethics and location privacy.

ISN’T THIS ALREADY OUT OF OUR CONTROL?

I am hopeful.

People have started to question governments and organizations. They want more information on what these entities are doing with their data, without necessarily implying that they’re doing something wrong.

What’s this new location tracing app going to do with my data? Who’s protecting my privacy, and how? Tell me more, and I can make an educated decision.

DOES ANYONE RESPOND TO QUESTIONS BEING ASKED AROUND LOCATION PRIVACY?

Apple and Google did when quizzed about their practices. Because of public questioning, they opened up about what they thought was the right way to protect privacy while still keeping the solution useful. They had to think through what they were going to do and respond to concerns by sharing that information openly, for everybody.

That’s a healthy response to the public using their operating systems.

I hope you enjoyed this conversation with Denise. It’s clear to me that there’s a real tension here.


On one side, we have the promise of personalization, of being seen and understood, provided we share data about how we interact with the world. That data can be used to make things better.


On the other side, we risk being exposed or manipulated and treated like the product instead of the customer. In terms of data privacy, location plays a massive role, and we, the geospatial community, have a part in this conversation.


I’m not expecting anyone to have the answers to this. Let’s make a continuous effort to educate ourselves and others on location privacy. I encourage you to reach out to Denise if you’re interested in carrying on the conversation. 

About the Author
I'm Daniel O'Donohue, the voice and creator behind The MapScaping Podcast (a podcast for the geospatial community). With a professional background as a geospatial specialist, I've spent years harnessing the power of spatial to unravel the complexities of our world, one layer at a time.
