The Privacy Field Needs More Diversity

On 29 Aug, 2020
By: theprivacyguru


Before Kamala Harris became the first woman of color nominated for Vice President of the United States, she was one of the few Black, Asian-American female attorneys focused on privacy issues. In 2013, as California’s attorney general, she sponsored, and California enacted, a law requiring tech companies to disclose in their privacy policies whether they honor do-not-track requests and what personally identifiable information they collect from users. Two years later, in 2015, she was influential in banning “revenge porn,” the posting of explicit images or videos of a person on the Internet without that person’s permission. More recently, she and other senators urged the inclusion of the Critical Health Care Privacy Bill in the Federal COVID-19 Relief Package.

Harris’ current stature in the public eye highlights the continued need to work towards gender, racial and ethnic diversity and representation in the legal and privacy professions. I’ve written previously about recognizing and advocating for women and the opportunities to improve diversity in the cybersecurity profession.

We can learn from and build on work done by the Minority Corporate Counsel Association (MCCA) and the American Bar Association (ABA). In 2015, the MCCA founded the Black General Counsel 2025 Initiative to track and promote Black legal leaders in Fortune 1000 companies. Over the subsequent four years, the initiative raised the percentage of Black general counsel from 3.8% to 5.3%, in part through conscious efforts to develop the pipeline of Black legal talent, prepare junior Black attorneys for leadership roles, and raise awareness of corporate diversity issues. Two ABA efforts of note are the Coalition on Racial and Ethnic Justice (COREJ), which “examines issues stemming from the intersection of race and ethnicity with the legal system,” and the association’s broader racial justice advocacy on “issues addressing bias, racism and prejudice in the justice system and society.”

The tracking that has been reported on diversity in the privacy field is largely around gender representation. Roughly 50% of privacy professionals are women, in sharp contrast to the gender imbalances that persist in many security and other tech fields. According to the 2017 Global Information Security Workforce Study: Women in Cybersecurity, only 11% of cybersecurity positions are held by women. Perhaps that is in part because the field has not been hospitable to women. The report also reveals substantial discrimination: 51% of women report discrimination in the cybersecurity workforce, and 87% report unconscious discrimination. Furthermore, 54% of women report unexplained delays or denials in career advancement, and 22% experience “tokenism” in their cybersecurity roles.

What makes these statistics about gender and race particularly pertinent is the fact that privacy is a vigorously growing field in need of talent. It’s predicted that by 2021, the field of cybersecurity, which is allied with privacy work, will have 3.5 million unfilled jobs. In the next 20 years, job opportunities in privacy work promise to be plentiful and rewarding. Hiring on LinkedIn for jobs with titles such as “chief privacy officer,” “privacy officer” or “data protection officer” increased 77% from 2016 to 2019, according to an analysis that LinkedIn conducted for Axios. Companies around the world are finding it challenging to recruit experienced privacy professionals with relevant technical, legal and engineering skills. According to the International Association of Privacy Professionals (IAPP), the largest and most comprehensive global information privacy community and resource, “privacy is now a necessity of doing business.”

Why is it important that women, and especially people of color, be adequately represented in professional privacy positions? In addition to the general principle that greater diversity in the workplace expands the range of perspectives and experience needed to solve complex problems and improves performance overall, two examples make the case especially vivid.


Facial recognition software

We now have the technology in place to identify a person from a digital image or video frame, usually by comparing individual facial features, including skin texture, to a large database. When surveillance monitors are installed in public places, the recording is done passively, without individual permission. Even worse, it’s become fairly well established that false-positive rates for this kind of one-to-one matching are higher for Asians, African Americans, and Native Americans. Another way of using facial recognition software, one-to-many matching to determine whether an individual has any matches in a database, is most likely to generate a false positive for African-American women.
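To make the two matching modes concrete, here is a minimal sketch, assuming hypothetical random face embeddings and an invented similarity threshold rather than any real recognition system. One-to-one verification checks a probe image against a single claimed identity; one-to-many identification searches an entire database, so a scoring function that is slightly less reliable for one demographic group produces disproportionately more spurious “hits” for that group.

```python
# Minimal sketch (hypothetical embeddings and threshold, not a real system):
# both matching modes reduce to comparing numeric face "embeddings" to a cutoff.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe, claimed, threshold=0.8):
    """One-to-one: does the probe match a single claimed identity?"""
    return cosine_similarity(probe, claimed) >= threshold

def identify(probe, gallery, threshold=0.8):
    """One-to-many: search a whole database; every score above the cutoff is a hit."""
    return [name for name, emb in gallery.items()
            if cosine_similarity(probe, emb) >= threshold]

# Even with purely random vectors, a loose cutoff applied across a large
# gallery turns up spurious matches; any group the model scores less
# accurately absorbs more of those errors.
rng = np.random.default_rng(0)
probe = rng.normal(size=128)
gallery = {f"person_{i}": rng.normal(size=128) for i in range(1000)}
print(identify(probe, gallery, threshold=0.2))
```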

Among other dangers, such false positives have all sorts of skewed ramifications when facial recognition software is used to identify a person who committed a crime. U.S. legislators have proposed several bills to limit or control its use. As Ed Markey, the Democratic senator from Massachusetts, has said:

“Facial recognition technology doesn’t just pose a grave threat to our privacy, it physically endangers Black Americans and other minority populations in our country.”


Bias in data analysis algorithms

Our lives and our society are increasingly shaped by automated analysis of vast amounts of data. Machines “learn” to analyze according to the humans who create and use that data, often by scraping it from the Internet. For example, if enough data points associate the words “male” or “man” with the phrase “privacy professional,” then the algorithm will conclude that privacy professionals are male. As you might imagine, the negative implicit and explicit biases that many of us hold, often unwittingly, toward women and people of color “teach” the algorithm to respond with similar bias. A diverse group of privacy professionals can play a role in reversing the kind of bias that leads to racial discrimination.
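A minimal sketch of that mechanism, using an invented toy corpus rather than any real training data: a “learner” driven purely by co-occurrence counts simply reproduces whatever skew exists in the text it was fed.

```python
# Minimal sketch (invented toy corpus): a learner driven only by
# co-occurrence counts reproduces the skew present in its training text.
from collections import Counter

# Imagine these sentences were scraped from the web.
corpus = [
    "the privacy professional said he reviewed the policy",
    "he is a privacy professional at the firm",
    "the privacy professional presented his findings",
    "she is a privacy professional focused on health data",
]

counts = Counter()
for sentence in corpus:
    if "privacy professional" in sentence:
        for word in sentence.split():
            if word in {"he", "his", "she", "her"}:
                counts[word] += 1

male = counts["he"] + counts["his"]      # 3 in this toy corpus
female = counts["she"] + counts["her"]   # 1 in this toy corpus
print(counts)
print("learned association:", "male" if male > female else "female")
# The skewed corpus "teaches" the model that privacy professionals are male,
# even though nothing about the profession implies gender.
```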

As privacy professional Christelle Kamaliza, CIPP/E, CIPM, so eloquently put it in a recent IAPP article:

“We need to look beyond processing our data to render it “clean” and “workable” — just because it is clean, doesn’t mean that it is not biased and potentially harmful. We need to question the classification or segmentation in the databases that we mine. There are a lot of opportunities and educational resources on how to generate and apply datasets that have, to the best of our human abilities, the least amount of bias possible. We have to approach all things data with a microscopic lens of data privacy and a great depth of social awareness. Understanding the power and equity dynamics between the data subjects themselves, the data we are collecting, and analyzing will make us better data professionals.”



Tags: ABA, AI, bias, diversity, IAPP, Kamala Harris, MCCA, privacy
