GGL

  • Section E
  • Section 1
  • Alumni

Activity Feed

On April 13, 2020, GGL commented on The Ethics of People Analytics:

Thanks for sharing this article. I particularly enjoyed your perspective that “Any algorithm that is trained with biased data will produce results that only further perpetuate those biases.”

Overall, I agree with your argument. However, as more and more of our lives are tracked through data, I wonder whether people will become increasingly comfortable sharing personal information. Or rather, I wonder whether they will simply care less and less about data privacy. And if people care less about their own data privacy, is there less of an ethical concern?

Thanks for sharing this article! I’m a huge fan of the searchlight.ai team. They’ve been super laser-focused even before they went through YC.

The biggest concern I have about this opportunity is whether companies would be willing to pay for reference checks. I think the hurdle for HR and team leaders is that they prefer to rely on in-person “reference checks” and “interviews.” Humans have a bias toward thinking that they make better people decisions than an algorithm.

On April 13, 2020, GGL commented on Data Transparency-Privacy Tradeoff During a Pandemic:

Great article! I was wondering about this same issue over the weekend, especially now that Apple and Google are joining forces with Washington to use Bluetooth to track whether people have come into contact with someone who has COVID-19 (link: https://www.bloomberg.com/opinion/articles/2020-04-13/coronavirus-apple-and-google-come-to-the-rescue-are-you-ready).

I do believe that in times of crisis, I’m more willing to give up data privacy to ensure the health and safety of my loved ones.