Jeff Mayolo

  • Section 1
  • Student

Activity Feed

On April 19, 2022, Jeff Mayolo commented on Data can build better businesses. :

Thanks for the post, Dimitrios, I really enjoyed it. It’s fascinating to read about and understand all the data that companies are collecting on you that you may not even realize. One thing that stood out to me in this post is your mention of data privacy. This semester I’m taking a class on Differential Privacy, which has become the new “gold standard” for data privacy. In your research, did you come across any discussion of McDonald’s providing data to third-party companies? It would be interesting to see, because I imagine there is a lot of value in the data they collect on us, but I also imagine that the primary parties interested in that data would be competitors of McDonald’s. If they do release or sell our data in any form, I wonder if they employ any rigorous differentially private algorithms on our data to ensure our privacy.
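For readers unfamiliar with differential privacy, here is a minimal sketch of its textbook building block, the Laplace mechanism: a data holder releases a count (say, how many customers ordered a given item) with calibrated noise added, so no single person's presence in the data can be confidently inferred. The function names and the epsilon value are my own illustrative choices, not anything McDonald's is known to use.

```python
import math
import random

def laplace_noise(scale):
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = 0.0
    while u == 0.0:          # avoid log(0) on the boundary draw
        u = random.random()
    u -= 0.5                  # u is now uniform on (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with Laplace(sensitivity / epsilon) noise.

    For a counting query, one person changes the result by at most 1
    (sensitivity = 1), and this release is epsilon-differentially private.
    Smaller epsilon means more noise and stronger privacy.
    """
    return true_count + laplace_noise(sensitivity / epsilon)

# e.g. release "number of customers who bought item X" with epsilon = 0.5
noisy = dp_count(1000, epsilon=0.5)
```

The key design point is the privacy/accuracy trade-off: the noise scale grows as epsilon shrinks, which is exactly the "rigor" a company would have to accept before selling aggregate statistics derived from customer data.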

On April 19, 2022, Jeff Mayolo commented on “Big Data” or “Big Brother”? :

Thanks for the post, Peizhen! I found it very interesting. As we discussed during the AirBnB case, a sense of comfort and belonging is very important to company culture. While analysis of private elements such as emails, trips, and calls can lead to informative inferences about employees, I think there is a very serious tradeoff between data collection and employee comfort. Even with the best intentions, rigorous data collection can feel like a form of spying and can corrode trust in an organization. I think you hit the nail on the head when you said that “Quantitative measures should never been built on sacrificing qualitative and unmeasurable factors like organization culture.”

On April 19, 2022, Jeff Mayolo commented on Balancing robots and humans in talent decisions :

Great analysis, Sofia. I love how you tied together things we’ve discussed in this class with multiple articles and things you’ve learned in other classes. One thing I would like to add to your analysis is the potential for hiring algorithms to be susceptible to historical bias. Even if an algorithm is fair when it is built, that does not mean it will still be fair a year later. As many companies turn toward algorithmic talent acquisition, it is likely that we will see a dramatic change in how candidates approach an application. Candidates will try to learn how an algorithm evaluates their application and adapt to optimize their chances of getting a position. Meanwhile, the data the algorithm was trained on will be old and outdated and will not reflect those changes in the candidate pool. This could lead to an inaccurate and possibly unfair algorithm. I think there’s great potential for algorithmic talent acquisition, but there are many issues, and I agree that a balance between humans and algorithms is needed.