AI is spreading quickly into sensors and will drive an even greater appetite for data.
The mental health issues surrounding the pandemic are affecting people’s attitudes as they contemplate returning to work. Surveys have shown that people are somewhat concerned about their safety as they begin to mix with others in the workplace once more. For generations now, many of us have gone to work expecting to catch no more than a cold, at worst, from our colleagues. Our work environments have been designed accordingly: although conventional hygiene is catered to, there have been minimal precautions to prevent transmission of airborne viruses.
With the pandemic, measures were hastily put in place. Semi-permanent transparent screens have become commonplace in retail settings, as well as limitations on occupancy and direction of movement in stores and public places. Were they effective? Probably. Could they be better? Almost certainly.
We now have an opportunity – some may call it an imperative – to re-engineer our systems and practices with distancing and minimizing contact as a basic principle. The opportunity applies to almost every context, including retail, transportation, and work environments.
Although working from home has delivered great flexibility to large numbers of people, team building is most effective when everyone comes together in the same place at the same time. Businesses depend on this, yet people are understandably concerned for their safety and well-being. There is a feeling that smart buildings equipped to control ventilation, air quality and occupancy, as well as access to areas and resources such as meeting rooms and equipment, can offer a safe environment for employees to coexist.
Technologies like AI have a role in ensuring we get this re-engineering right. During the pandemic, AI-enhanced cameras were introduced in London to monitor social distancing on streets and in public spaces. Although used only for assessment and surveying, if facial recognition were added this kind of technology could be used to enforce distancing rules and bring prosecutions. Technologically speaking, this would be only a small step, although of course there are major ethical issues.
On the other hand, the same technology is being used to help with urban planning by analyzing the patterns of pedestrians and road users around features such as crossings and cycle lanes. The pattern-matching and anomaly-detection capabilities of AI can help to identify where features are being used as intended and where they are failing. With this information, planning and design can be improved so that systems serve users optimally and deliver the best results for all stakeholders. It could mean better urban schemes and more efficient local-government spending.
The maturing of AI and its infusion into the fabric of life is fundamentally changing computer and system architectures, from the cloud to the edge. Google’s Tensor Processing Unit (TPU) is one example, an architecture specifically developed to handle certain types of AI algorithms. Google points out that the venerable CPU is well suited to fast prototyping and to AI workloads that involve small and simple models, while larger models suit the inherently more parallel GPU architecture. Applications for the TPU include very large models that require a long training period. As the most ambitious applications migrate towards TPU-based platforms in the cloud, fewer and smaller data centers should be needed to provide cutting-edge, high-value services in the future, potentially saving significant energy and enhancing sustainability.
Now, hot on the heels of the cloud TPU, comes the Edge TPU, optimized for machine-learning inferencing on low-power devices. Its arrival is part of a migration of intelligence towards the endpoints of the IoT, also seen in the advent of intelligent inertial sensors that contain a small DSP optimized for machine-learning and deep-learning algorithms. These can perform tasks like sound classification and activity detection locally, consuming a fraction of the power needed to run a comparable application in the host controller.
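The power-saving pattern behind such in-sensor intelligence can be illustrated with a minimal sketch. Everything here is hypothetical (the window format, the `wake_host` callback, the energy threshold are illustrative assumptions, not any vendor's API): a tiny classifier runs on each window of accelerometer samples inside the sensor, and the host controller is woken only when something interesting happens.

```python
# Hypothetical sketch of in-sensor activity detection: a tiny classifier
# runs locally on each window of accelerometer magnitudes, and the host
# controller is woken only for windows classified as "active".
from statistics import mean

def activity_from_window(samples, threshold=0.5):
    """Classify one window of accelerometer magnitudes (in g)."""
    # Mean absolute deviation from 1 g (the magnitude at rest).
    energy = mean(abs(s - 1.0) for s in samples)
    return "active" if energy > threshold else "still"

def process_stream(windows, wake_host):
    """Loop that would run on the sensor's DSP; wakes the host rarely."""
    for window in windows:
        if activity_from_window(window) == "active":
            wake_host(window)  # only interesting events reach the host

events = []
process_stream(
    windows=[[1.00, 1.01, 0.99],   # at rest: stays local
             [2.50, 0.10, 1.80]],  # movement: host is woken
    wake_host=events.append,
)
print(len(events))  # only the active window was forwarded
```

The design choice mirrors the article's point: the cheap classification step executes next to the sensing element, so the comparatively power-hungry host processor sleeps through the uneventful majority of the data.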
Future generations, I am sure, will significantly extend the sensor’s local inferencing capabilities. By configuring networks of such sensors, developers will be able to unleash yet more of the potential of cyber-physical systems that bring together sensing, computation, control and networking in physical objects connected to the internet and to each other. They will transform the way we manage factories (in Industry 4.0 use cases), as well as our homes and buildings, services like healthcare and transportation, and smart cities. These smarter-than-smart sensors will also need to become physically more resilient as they penetrate uncontrolled industrial and street-level environments.
The ability to provide initial filtering and event classification will enable us to capture even larger quantities of data in almost any context and quickly separate the meaningful from the meaningless. Ultimately, that information can reveal much greater and deeper insights into our surroundings, bringing numerous benefits.
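The filtering idea above can be sketched in a few lines. This is an illustrative assumption, not a real device's firmware: readings are screened locally against an expected band, and only the out-of-band anomalies are forwarded upstream, shrinking the volume of data the network and cloud must carry.

```python
# Hypothetical sketch of edge-side filtering: raw sensor readings are
# screened locally and only anomalous events are forwarded upstream.
def filter_events(readings, low=15.0, high=30.0):
    """Keep only (timestamp, value) pairs outside the expected band
    (illustrated here as an indoor temperature range in degrees C)."""
    return [(t, v) for t, v in readings if not (low <= v <= high)]

raw = [(0, 21.5), (1, 22.0), (2, 48.3), (3, 21.8), (4, 3.1)]
meaningful = filter_events(raw)
print(meaningful)  # only the two out-of-band readings survive
```

In this sketch three of the five readings are discarded at the edge; at real sensor data rates the same separation of meaningful from meaningless is what makes capturing ever larger quantities of data practical.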
In our post-pandemic world, we can appreciate the opportunity for smarter, healthier buildings in which to live and work. But there is much more to come. Smart agriculture, for example, will benefit from greater intelligence about soil conditions for growing crops and managing water. Healthcare delivery can improve too: in-home behavioral monitors can help anticipate the changing support needs of elderly people.
AI has quickly spread from the data center to the IoT edge and, with the advent of machine-learning sensors, is poised to enhance our understanding and control of the world around us to an unprecedented degree. •