
Privacy, Security, Empowerment, and Change (...why I'm going back...)

Pigs are on the wing... or hell is frozen over, or some other such metaphor for things that were never supposed to happen. I'm going back to work for a big Internet company that I left 12 years ago. At the time, I had a variety of personal reasons for leaving, but I also worried about the data stewardship issues that came with one company having so much access to so many people's deeply personal information.

I took some time off, did some volunteer work, played some music, and after some time passed, I went back to school to focus on Computer Science and Security. After finally getting a bachelor's degree at 33, I went to work as a Cyber Security Analyst at Argonne National Laboratory, mostly working on contracts for the Department of Homeland Security.

I guess I should take a moment to say that this article is 100% my own thoughts and opinions, and in no way reflects the views of, or is meant to intimate anything on behalf of, anyone else, including but not limited to: Argonne National Laboratory, the Department of Homeland Security, DomainTools, or Google.

Working for DHS was full of compromises and conflicts; it is a huge organization that does a variety of really good things and some really questionable ones. The work I did was mostly for the Office of Infrastructure Protection. The critical infrastructure in this country has a wide variety of cybersecurity profiles, but in general, none of it is as secure as it should be, and some of it is in a downright frightening state.

Even as DHS's ICE (Immigration and Customs Enforcement) was making (what still seem like illegal) indiscriminate arrests of undocumented immigrants, I was able to find a lot of purpose in working toward improving our ability to protect our critical infrastructure. As time went on, though, those compromises weighed more and more heavily on my personal ethics, and after four years at the national laboratory, I decided it was time to move on.

I spent a good deal of time looking for a new gig and interviewed with a wide variety of companies that deal with cybersecurity. I struggled to find a company that I felt shared my values in a way that would let me stay comfortably for the long term. I ended up spending a year at DomainTools, a smallish company that makes SaaS threat intelligence products. The mission space was right, and there was a good swath of folks who I felt were in the same place I was in terms of values, but the work ultimately wasn't what I wanted to be doing.

This led me to a place where I had to consider again... where do I fit in this industry? The cybersecurity landscape is still fairly immature. In general, we are all bad at security, and one of the ways we react to our inability to be as effective as we would like is to scoop up as much information as we can, so that we have more data with which to develop intelligence. While this approach has some merit, it means that cybersecurity as it exists today is largely user hostile. The user is a source of data at best, and an untrusted potential insider threat at worst. It also means that we largely treat cybersecurity, especially in the enterprise, as though any sort of personal privacy is in opposition to our security goals.

I think this is shortsighted on a number of levels. The user needs to be our first line of defense, but in order to be effective in this role, they need to be invested. One component of making that investment possible is trust. Especially in the modern working world in the US, where we expect more and more of our employees outside of normal business hours (a serious problem that is outside the scope of this post), an acceptable use policy for company-owned devices is a critical part of building user trust. This will feel like a radical departure to cybersecurity practitioners (and probably lawyers), but we need to allow our users some expectation of privacy.

It's a company device, so the company has a right to know what's happening on it, yes? This is true from a certain perspective, but we all know that complete knowledge is not possible, and in fact, EDR solutions and the like are often reducing the signal-to-noise ratio, especially in small- to medium-sized companies that don't have adequate dedicated staff to analyze the data. Also, in a BYOD world, the company-owned device isn't even the most dangerous part of your threat profile.

We need to start envisioning a bolder future where part of our job as security practitioners involves empowering users to be better sources of analysis for us. Notice I said analysis, not data. We need to build interfaces that allow users to be part of the feedback loop, so that indicators come with context. Imagine an EDR solution that a user was fully invested in because they had the ability to turn it off when using their company laptop for personal email, or to watch a movie in the hotel after work -- but when they turned it on and it found potentially anomalous activity, it could prompt the user to provide context on the activity, or even connect them to a security practitioner to walk through what was happening in real time.
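Here's a rough sketch of what that feedback loop might look like in code. Everything in it is hypothetical -- the names, the prompt, the whole flow -- it's only meant to illustrate the shape of the idea:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Alert:
    """A potentially anomalous event flagged by the (hypothetical) agent."""
    description: str
    timestamp: datetime = field(default_factory=datetime.now)
    user_context: Optional[str] = None  # supplied by the user, not scraped from them

class UserInLoopAgent:
    """Sketch of an endpoint agent that treats the user as an analyst, not a data source."""

    def __init__(self):
        self.monitoring_enabled = True  # user-controlled privacy toggle
        self.alerts = []

    def set_monitoring(self, enabled):
        # The user can pause collection (personal email, a movie in the hotel).
        # The pause itself could be recorded as a policy event without recording content.
        self.monitoring_enabled = enabled

    def observe(self, event_description, anomalous):
        if not self.monitoring_enabled:
            return  # respect the toggle: collect nothing
        if anomalous:
            alert = Alert(description=event_description)
            # Instead of silently shipping raw telemetry upstream, ask the
            # user for context so the indicator arrives already annotated.
            alert.user_context = self._prompt_user(alert)
            self.alerts.append(alert)

    def _prompt_user(self, alert):
        print(f"[agent] Possible anomaly: {alert.description}")
        return input("What were you doing? (type 'help' to reach a security practitioner) ")

if __name__ == "__main__":
    agent = UserInLoopAgent()
    agent.set_monitoring(False)  # personal time: nothing is observed
    agent.observe("outbound connection to an unknown host", anomalous=True)
    agent.set_monitoring(True)   # back on the clock
    agent.observe("shell spawned from a PDF reader", anomalous=True)
```

The key design choice is that the privacy toggle gates collection entirely, while the prompt turns raw telemetry into an indicator that arrives with the user's context already attached.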

This is just one (admittedly fairly half-baked) idea that puts the user back in the driver's seat while still empowering the security practitioner. The thrust of what I'm trying to get at is that we can't view users as our enemy, and we can't afford to say that all of their data belongs to us.

Users being invested in the flow of their data and the implications of its use is going to be a critical part of our ability to fight the next generation of threats. No amount of AI or machine learning or new tech is going to protect our enterprises from an uninformed idiot. The old joke here is that if we make an idiot-proof system, the world will make a better idiot. We need to stop assuming our users are idiots, and start finding ways to get them engaged in security.

There are very few companies I've found that are invested in this type of mission, and I definitely haven't found the right fit for myself in the industry as it is today, so the next phase of my journey is going to be about homing in on user privacy, user data, and user engagement.

Tomorrow I go back to work at Google as part of the Privacy Engineering team. I don't agree with everything the company is doing, but they have become a critical part of the way most people interface with the Internet. That necessarily gives them access to a large volume of data about all of us, even those of us who don't use their services. The events of the last several years have put them in a place where I think they have to take privacy more seriously, and so far everyone I have talked to on the team I'm joining shares that goal.

Friends who worked with me at Google in the past have challenged me with a variety of questions and comments... "Google is pretty evil these days." "How can they make privacy a priority when ad revenue is their goal?" "Everything they do is just for show." "They've lost the trust of the user." Some of these are true, but when I think about my own values, I realize that long ago I decided that it was more important for me to accept the compromises involved in trying to do good in a flawed system than to sit on the sidelines and yell about what is wrong.

I want an opportunity to learn from the smartest people in the industry... to be challenged by folks with enthusiasm, intellect, experience, and grit. Google attracts the best and the brightest, and I'm looking forward to finding the opportunities for growth that being a tiny fish in a big pond offers. I'm also ready to lean into the discomfort that this can breed.

So I'm going in with an open mind and an open heart, with an intention to do the best that I can to have a positive impact at a company I was once very invested in. In the mid-2000s we all felt like we were a part of something, that we were changing the world for the better. Maybe it can feel that way again. At the very least, some of the drive and passion I found in the mission space working for DHS might have parallels here. I feel conflicted, but that usually means I'm doing something that really matters. There is important work being done, and it requires people who care and are willing to think deeply about the problem space and then take action.

And maybe along the way I can help push my agenda of bringing the user back into the fold on security. Security and privacy are not mutually exclusive. The private data of users, improperly collected and stored, makes data breaches more catastrophic. More importantly, a user base that is plagued by security fatigue and feels helpless to control or protect its own information is our single biggest vulnerability, for the enterprise and for the Internet as a community. We need to empower users, to get them invested and excited; we need to be "We," not "Us and Them."
