How sensors and software are solving the world’s greatest challenges

Imagine your day as a factory worker. You enter the factory in the morning and put on your overalls, safety shoes, goggles, and helmet. You proceed to the factory floor and start your daily routine at your station. It sounds like a perfectly ordinary day, one that could just as easily have taken place decades ago.

However, this is the future. Your overalls are not normal overalls, and your goggles are not normal glasses. They are filled with tiny sensors, woven into the fabric and invisible to the eye, that can measure all kinds of signals: your heart rate, your breathing, your hand and leg movements. The goggles track your eye movements and overlay enhanced information about the product or process you are working on. Tiny cameras mounted around the factory track your position and speed as you perform actions on the assembly line. Together, these tools collect and process data in real time, creating an immediate picture of how well you are performing on speed, cost, efficiency, and safety, and even how quickly you are learning.
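
To make that picture concrete, here is a minimal Python sketch of how such a system might fuse wearable and camera readings into a real-time performance snapshot. The reading types, field names, and thresholds are hypothetical illustrations, not any vendor's actual API.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical reading types; field names are illustrative only.
@dataclass
class WearableReading:
    heart_rate_bpm: float
    breaths_per_min: float
    hand_speed_m_s: float

@dataclass
class CameraReading:
    station_distance_m: float   # distance from the worker's assigned station
    walk_speed_m_s: float

def performance_snapshot(wearables: list[WearableReading],
                         cameras: list[CameraReading],
                         cycle_time_s: float,
                         target_cycle_time_s: float) -> dict:
    """Fuse a window of sensor readings into simple efficiency and safety indicators."""
    avg_hr = mean(r.heart_rate_bpm for r in wearables)
    avg_hand_speed = mean(r.hand_speed_m_s for r in wearables)
    return {
        "efficiency": round(target_cycle_time_s / cycle_time_s, 2),  # >1.0 means faster than target
        "exertion_flag": avg_hr > 120,          # crude illustrative safety heuristic
        "avg_hand_speed_m_s": round(avg_hand_speed, 2),
        "off_station": any(c.station_distance_m > 2.0 for c in cameras),
    }
```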

It sounds futuristic, but it isn’t. Industrial sponsors are funding research institutions across the globe to develop emerging technologies that deliver greater advances in efficiency, health, and safety, and innovations are already being developed and tested across several industries. In France, scientists embedded sensors near a paralyzed man’s brain and connected the signals generated by his thoughts to an exoskeleton, allowing him to walk. California-based Mojo Vision is testing an augmented reality contact lens with microdisplays, wireless radios, image sensors, and motion sensors that seamlessly fuses digital information onto the real world, allowing wearers to perform critical tasks without holding devices or looking at screens.

At the Massachusetts Institute of Technology’s lab for nanomechanical technology (the NanoLab), researchers are investigating a number of potential breakthrough sensor-based technologies. For example, Emerald is a radio-signal-based sensor that can detect motion, breathing, and a host of other human performance signals. Its current focus is on healthcare applications such as sleep apnea detection and elderly care. This type of sensor is particularly innovative because it is invisible: nothing is attached to the body, and motion and other signals are captured and processed by a router-like box in the room.
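
Emerald’s actual signal processing is proprietary, but the underlying idea can be sketched: each breath moves the chest a few millimetres, modulating the radio signal reflected back to the box, so the breathing rate shows up as a low-frequency peak in the reflected signal’s spectrum. The following Python snippet is a simplified illustration of that principle under assumed sampling parameters, not Emerald’s implementation.

```python
import numpy as np

def estimate_breathing_rate(reflected_amplitude: np.ndarray, sample_rate_hz: float) -> float:
    """Estimate breaths per minute from a contact-free radio reflection signal."""
    signal = reflected_amplitude - reflected_amplitude.mean()   # remove the static (DC) reflection
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    # Restrict to a plausible human breathing band, roughly 0.1-0.7 Hz (6-42 breaths/min).
    band = (freqs >= 0.1) & (freqs <= 0.7)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Example: a synthetic 0.25 Hz (15 breaths/min) chest-motion signal sampled at 20 Hz.
t = np.arange(0, 60, 1 / 20)
demo = 1.0 + 0.05 * np.sin(2 * np.pi * 0.25 * t) + 0.01 * np.random.randn(t.size)
print(round(estimate_breathing_rate(demo, 20.0), 1))  # ~15.0
```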

It’s not just the sensor that is different. It’s the combination of the hardware and the accompanying software which allows for the step-change in capabilities. The software is powerful because it is enabled by advancements in data aggregation, storage, and processing speed, combined with increasingly sophisticated algorithms that are programmed to learn.

One might assume that these types of technologies could easily be applied to any modernized manufacturing environment. Some estimate that by 2025 there will be 41.6 billion connected Internet of Things (IoT) devices, including sensors, cameras, and machines, generating 79.4 zettabytes of data. Yet both discrete manufacturing (e.g., automotive and equipment) and process manufacturing (e.g., chemicals and oil and gas) have been slower to embrace advances in big data, machine learning, predictive analytics, and cloud computing, largely because these industries historically could not sufficiently quantify or justify the return on investment of adopting emerging technologies. That is now changing: forward-looking companies are increasingly finding that new technologies deliver either higher efficiencies or outright step-change value creation opportunities.
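
A back-of-envelope calculation suggests the per-device scale implied by those estimates, treating the figures as decimal (SI) units and averaging evenly across devices, which real deployments certainly do not:

```python
# Rough scale check for the cited 2025 IoT estimates (illustrative only).
ZETTABYTE = 1e21  # bytes, decimal (SI) units

total_data_bytes = 79.4 * ZETTABYTE
devices = 41.6e9

per_device_per_year = total_data_bytes / devices   # ~1.9e12 bytes, roughly 1.9 TB
per_device_per_day = per_device_per_year / 365     # ~5.2e9 bytes, roughly 5.2 GB

print(f"{per_device_per_year / 1e12:.1f} TB per device per year")
print(f"{per_device_per_day / 1e9:.1f} GB per device per day")
```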

German company e.GO is disrupting the traditional automotive industry by building affordable, emission-free passenger cars as well as automated, on-demand rideshare vehicles (“people movers”). They are doing this through a tightly integrated digital thread of data-driven processes that encompass design, manufacturing, monitoring, and servicing. From the first steps in the design process to the building, operating, and servicing of the electric cars, e.GO collects, stores, processes, and makes sense of the data — all through a combination of physical sensors and software.

The product is designed in a parametric modeling computer-aided design (CAD) software package. The reams of data generated are managed by a product lifecycle management (PLM) software package. The operations, monitoring, and analytics of the factory that builds the car are run by an Industrial Internet of Things (IIoT) software platform. Finally, quality control and maintenance of the product and factory machinery are handled by an augmented reality software package that lets factory operators use digital twins of the machines to troubleshoot and repair defects or failures. It is not uncommon to see e.GO technicians pointing an iPad at a car and seeing a digital image or work instruction overlaid in real time.
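
One way to picture the digital thread behind that workflow is a simple data model that links each machine or vehicle back to its design revision and forward to its live telemetry and service history; an AR client then just renders whatever the model returns. The Python sketch below is an illustrative simplification with made-up names and thresholds, not a description of e.GO’s actual software stack.

```python
from dataclasses import dataclass, field

# Hypothetical "digital thread" record: every physical asset is linked back to its
# design revision and forward to its live telemetry and open defects.
@dataclass
class DigitalTwin:
    asset_id: str
    cad_revision: str                      # which parametric CAD revision it was built from
    build_record: dict                     # as-built data captured on the IIoT platform
    telemetry: dict = field(default_factory=dict)
    open_defects: list = field(default_factory=list)

    def update_telemetry(self, reading: dict) -> None:
        self.telemetry.update(reading)

    def troubleshoot(self) -> list[str]:
        """Return work instructions an AR client (e.g. a tablet) could overlay on the asset."""
        steps = []
        if self.telemetry.get("spindle_temp_c", 0) > 80:
            steps.append("Check spindle coolant loop (temperature above 80 °C).")
        for defect in self.open_defects:
            steps.append(f"Inspect and repair reported defect: {defect}")
        return steps

twin = DigitalTwin("press-07", "rev-42", {"torque_calibration": "2024-01-15"})
twin.update_telemetry({"spindle_temp_c": 85})
twin.open_defects.append("worn gripper pad")
print(twin.troubleshoot())
```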

It may seem futuristic, but this already exists in a production facility in Aachen, Germany. The pace of innovation continues to grow exponentially across all industries, and humans and machines are becoming more connected through these sensors and software. What we imagined to be science fiction is becoming more real by the day. The future is indeed now, and it’s being revolutionized by the increasingly sophisticated and seamless interaction between sensors and software.
