RL10's Profile
Activity Feed
This is a super interesting post! I’m curious how you think companies should balance transparency with the desire to keep modifying metrics as they evolve. Should companies notify employees when they change how data is used? I’m especially curious in the context of a company like Amazon that is under heightened external scrutiny. Opening the “black box” of an algorithm risks exposing a company to criticism and liability over how it uses its data, but I think you’re totally right that some amount of transparency is critical to building trust. I’m also struggling a bit with how much workers “should” be willing to give up in terms of privacy and tracking in exchange for better pay and benefits. It’s clearly the argument that prevailed in this case, but I think there are real justice concerns about how much society “should” allow and what responsibility companies have to notify and explain things to their employees.
This is a really interesting post! I was especially struck by your discussion of quality, and I think your points of caution are really well articulated. It makes me wonder whether this tool is better thought of as a complement to managers rather than a replacement or standalone tool. For example, this kind of sentiment analysis could be positioned as a starting point for a discussion between managers and employees, or a way for shy or new managers to begin to get a sense of their teams. I also wonder if this could be useful for surfacing toxic teams led by high performers (the Uber example comes to mind) and thus forcing accountability within organizations. It seems like this would come with its own set of training requirements, though, so I wonder how it would affect the positioning and value proposition of the tool.