Very astute writing! I’m left with a more fundamental question: how much do we want to use machines to screen candidates for employers? Even if those machines are less susceptible to bias (assuming they can be programmed to overcome the bias in the data they are fed), to what extent can machines screen effectively for human potential, for how well different people work together, or for cultural fit? At a time when Harvard, in particular, is being challenged for perceived bias in its admissions process, is a more AI-centered approach the way to go?
Also, the acquisition by Microsoft seems promising for LinkedIn’s ability to maintain a competitive advantage in this space, given Microsoft’s ability to provide LinkedIn with the scale and technology brainpower to build an AI platform.
I learned a lot through this article – thanks for writing it! It seems like Capital One is an early mover in this direction of sharing data with other service providers. https://www.businessinsider.com/jpmorgan-announces-partnership-with-plaid-2018-10 presents a case of JP Morgan Chase working with a third party to do just that. It’s interesting to think about how this has become a way in which banking service providers compete. You start to see the downsides of this type of data sharing in other areas, primarily the hospitality space, where, in order to increase customer activity on their proprietary sites, airlines and hotel brands are becoming increasingly restrictive about partnering with third parties and are opting to build some of those capabilities in-house. It will be interesting to see if, in five years, this trend continues or begins to reverse.
I generally agree with N. and Alan here, except on the point of whether CAD drawings for printable guns can be protected as speech. The difference between those drawings and instructions to build bombs or explicit images of minors is precisely the 2nd Amendment (or at least the Supreme Court’s prevailing precedent on what the 2nd Amendment allows). This is why Defense Distributed is allowed to distribute its materials under the specific limitations the company faces today.
While I believe that there are absolutely responsible ways for DD to exist, it seems clear that their mandate is to make gun control useless or difficult to enforce. I think it’s on the government to place controls on gun manufacturers and to set safety limitations on what kinds of guns can legally be produced by 3D printers.
I’ll take Desmond Wolfe’s point a step further. The DOD has historically played a big role in investing in emerging technologies in a way that other organizations have not been able to, with the result being incredible learning externalities for the rest of the world. One could argue that the DOD has an obligation to invest in technologies with a direct military benefit primarily to unlock those externalities for the broader economy. Alex is right that the mechanism of this investment matters – grants, loans, investments, and tenders generally make more sense than acquisition for this type of development, in my mind.
This is a great piece – it also points to what is being done globally to address the size of the unbanked population in many developing countries. Like Emma, I wonder about the exit opportunities for this technology. Does it eventually get used by banks in these countries? Does the future of banking reject physical branch locations? Is this extendable beyond micro-loans? I agree with M.E. in worrying about the regulatory and privacy implications of such tech. There are a number of companies using these sorts of technologies to create alternative credit scores to guide lending decisions, and undoubtedly those scores lead to better decision-making by lenders. But whether those technologies will actually lead to lower interest rates, increased access to capital, or a reduced overall cost of lending in those markets remains, I believe, to be seen.