In this episode, you will learn how robotics coupled with remote sensing is transforming facilities management across the utility sector and other industries.
The Mars Rover Project. Autonomous robots monitoring substations. How is this all relevant to the geospatial community? Scott Nowicki is happy to clarify. He explains the technology that enables robots to integrate detailed maps, orient themselves, and move around their environments as they go about their daily business, building detailed change detection maps for substations and facilities management. But the question is: do they truly add value to operations where human presence is difficult or unnecessary?
Scott Nowicki is a remote sensing expert. He is the lead R&D scientist for Quantum Spatial with an intriguing background, which includes the Mars Rover project. After his geology studies at Arizona State University, he ended up working on several Mars missions as part of a remote sensing program within the geology department. Early on in his career, he spent a lot of time outside with rocks. He then designed instruments and installed them on either Mars exploration rovers or satellites that were being sent to Mars. All with the purpose of collecting and using the data that came back from those instruments to analyze the globe of Mars itself and to understand where the best landing sites were from safety and scientific perspectives.
This experience left Scott wanting to move into terrestrial remote sensing and work with engineers, software programmers, and operations people to solve scientific problems. At Quantum Spatial, he continues to tackle issues that need complex GIS solutions. When he is challenged by infrastructure, changing environmental characteristics, or invasive species, he specs out the type of instrumentation needed for each unique situation to harvest the information and data that will solve the problem. The methodologies and data-analysis experience from his time on Mars projects are now coming in handy, albeit on a smaller scale.
Spectroscopy and thermography allow measurements to be taken whether the object is 300 kilometers above the surface or just 20 meters away. Determining the characteristics of the material from a distance is one of the biggest challenges when it comes to remote sensing projects. Are things too hot, too cold, or maybe rusty? Interestingly, almost the exact same observations are being applied as on the Mars projects, but in a substation environment.
The job of these robots is to pinpoint any features that are out of spec: too hot, too cold, deformed, or spectrally significant in some way. They use the same quantitative remote sensing tools, such as thermal cameras, spectral cameras, and laser systems like LIDAR, that a drone or a plane would carry.
An electrical transmission substation is a switching station where high-voltage transmission lines meet local distribution lines. It's a relatively small area packed with electrical bus bars, switches, and voltage regulators. All this is needed to maintain the connection between transmission and distribution. Distribution substations are somewhat different in that they connect lower-voltage lines with local houses and businesses, converting electricity from one voltage to another and then distributing it locally.
In both cases, there are a bunch of moving parts, many of which could have been installed 50 years ago and exposed to the elements ever since. In these places, a rover (a surface exploration device, aka a robot) looks at small-scale changes, like degradation, or large-scale changes, such as a wall coming down. These are changes that on-site personnel would probably notice, but since it's not always viable to have someone on site at every substation 24 hours a day, an automated robot can do the job.
A robot in such an environment is an autonomous, four-wheeled, remote sensing “tool-kit” that moves around. It checks for snakes, iguanas, or birds entering the substation, potentially causing an outage. It checks for trees growing over the fence line and dropping limbs into the substations. It checks if components are wearing out over time from the elements, so they don’t cause a failure. It collects and sends information and data on a minute by minute, or sometimes monthly basis of the occurring scale changes that need to be monitored.
Technically, yes, it would be possible to map out an entire substation with cameras and monitor it on a minute by minute basis. But it would also be expensive to implement and hard to maintain.
Cameras can leave you with blind spots or fuzzy observations, and you end up missing something even when you think you're watching everything. The advantage of a robot is that it can see everything, as long as you move it to the right spot. It allows you to get a precise view of small or large-scale features. Robots are flexible in a changing environment and can switch easily between looking for an intruder in large-scale mode and spotting a hot surface that could potentially cause problems in fine-scale mode. Robots use computer vision to build a model of change from an observation pattern that's been programmed into them; the collected data is then analyzed by those same computer vision tools.
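The core of that change model is simple in principle: compare each new observation against a baseline taken from the same spot and flag the differences. The sketch below illustrates the idea with plain image differencing on a toy "thermal image"; the arrays, threshold value, and function name are illustrative assumptions, not details from Quantum Spatial's system.

```python
import numpy as np

def detect_change(baseline, current, threshold=5.0):
    """Flag pixels whose value differs from the baseline by more than
    `threshold` (e.g. degrees Celsius for a thermal image). Both inputs
    are assumed to be co-registered 2-D arrays of the same shape."""
    diff = np.abs(current.astype(float) - baseline.astype(float))
    mask = diff > threshold
    return mask, diff

# Toy example: a 4x4 "thermal image" with one hot spot.
baseline = np.full((4, 4), 20.0)
current = baseline.copy()
current[2, 1] = 35.0  # a component running 15 degrees hotter

mask, diff = detect_change(baseline, current, threshold=5.0)
print(np.argwhere(mask))  # coordinates of out-of-spec pixels
```

A real pipeline would first register the images (the robot parks at slightly different positions each visit) and filter out noise, but the compare-against-baseline step stays the same.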
Having a good map of the facility is essential before attempting to go in and start anything. Otherwise, it would be difficult to give a robot full autonomy to navigate without running into anything and to make the right observations and monitor change. Quantum Spatial does this regularly with new environments, flying LIDAR over the power lines and substations and building CAD models of these with the latest software and applications. A human then has to go in and look at this highly accurate three-dimensional point cloud and draw boxes around the features and classify the substation to provide context.
That’s the bird’s eye view and it’s incredibly useful for determining if that environment can be navigated or not. First, we need to understand the big picture before we can go and fine-tune the ground operations with a rover that’s equipped with the relevant tools to navigate that space, be it a substation or an office. Only then can we map out the three-dimensional landscape. The rover will add context by detecting physical shapes, materials, and important pieces of equipment. Either a human or a machine-learning algorithm will categorize objects and assign characteristics to them, such as shape, location, function, and so on. Building this type of spatial model is labor-intensive as it’s unlike modeling a natural environment with rocks, trees, and shrubs. Detecting a set of features from a point cloud and assigning attributes to those features is difficult and it takes a while to understand the environment.
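The "draw boxes around features and classify them" step can be pictured as assigning each point in the cloud to a labeled region of space. Here is a minimal sketch using axis-aligned bounding boxes; the equipment names, coordinates, and function are hypothetical examples, not an actual Quantum Spatial data model.

```python
import numpy as np

# Hypothetical labeled bounding boxes: (xmin, ymin, zmin, xmax, ymax, zmax),
# drawn by a human (or a machine-learning model) around features in the cloud.
equipment = {
    "transformer": (0, 0, 0, 2, 2, 3),
    "bus_bar":     (5, 0, 4, 9, 1, 5),
}

def classify_point(p, boxes):
    """Return the label of the first box containing point p, else 'unclassified'."""
    for label, (x0, y0, z0, x1, y1, z1) in boxes.items():
        if x0 <= p[0] <= x1 and y0 <= p[1] <= y1 and z0 <= p[2] <= z1:
            return label
    return "unclassified"

# Three points from a toy LIDAR point cloud.
points = np.array([[1.0, 1.0, 1.5], [7.0, 0.5, 4.5], [3.0, 3.0, 3.0]])
labels = [classify_point(p, equipment) for p in points]
print(labels)  # ['transformer', 'bus_bar', 'unclassified']
```

Production tools use oriented boxes, meshes, and semantic segmentation rather than simple axis-aligned tests, but the output is the same: every point carries a label that gives the robot context about what it is looking at.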
Generally speaking, it’s possible to assume different features based on a CAD model and build something along those typical operational assumptions.
But in reality, we don't know what those conditions or the specs are. You can't use a blanket assumption that a 20-degree temperature rise is an alarm for being too hot. Objects heat up and cool down at different rates based on various environmental conditions. In terms of physical movement, it's not always possible to determine the range for each component of that substation. Trees are growing over the fence lines. Conductors and other features are moving slightly. Seasons change and the occasional storm passes. The normal operating conditions and curves of a particular substation need to be generated by mapping things out not just once, but hundreds of times. Only then can you determine your normal range of variation in terms of temperature, location, deformation, or other reflectivity conditions. You average that range for a given day, month, or year, and there you have your out-of-spec event that has to be dealt with.
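In other words, "out of spec" is defined statistically, from the site's own history, rather than by a fixed threshold. A minimal sketch of that idea, assuming the normal band is mean ± k standard deviations over repeated observations (the toy readings and the choice of k = 3 are illustrative assumptions):

```python
import statistics

def normal_range(history, k=3.0):
    """Derive a normal operating band from repeated observations of the
    same component: mean plus/minus k standard deviations."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return mu - k * sigma, mu + k * sigma

def is_out_of_spec(value, history, k=3.0):
    lo, hi = normal_range(history, k)
    return not (lo <= value <= hi)

# Toy stand-in for hundreds of daily temperature readings of one component.
history = [20.1, 21.3, 19.8, 20.7, 22.0, 21.1, 20.4, 19.9, 21.6, 20.8]
print(is_out_of_spec(21.5, history))  # within normal variation -> False
print(is_out_of_spec(35.0, history))  # well outside the band   -> True
```

A real system would keep separate baselines per season and time of day, since the text notes that normal ranges shift with the environment; the per-component, history-driven threshold is the essential point.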
Indeed, it’s a huge task but the important things quickly bubble to the top. Ultimately, what we’re looking for are the critical things that a human being will need to be sent in to fix. A hurricane or big storm that throws around debris becomes a priority over small scale thermal changes that trend over time. We still need to think like a human and prioritize these events that may be happening at these different dimensions all the time. This adds real, practical value to the robot that’s working on that substation.
There is a decent amount of overlap in data but we still need to carefully divide different types of substations. It would be fair to compare smaller, urban distribution substations as they are very similar. But if we think about a location in the desert, it’s completely different when it comes to the range of changes. We can only compare apples with apples.
But then, even apples come in a wide variety. Substations can differ drastically based on their distinctive local environment. Vegetation, humidity, or aridity make each substation unique. Logically, you may think it would be easy to apply one urban model to another as the components are similar, but if you need to map and calculate real hazards, environmental factors need to be carefully examined.
Rovers collect data on a daily and weekly basis. They do scans: LIDAR, thermal, multispectral, and visible imagery. It takes months to turn this data into the right information and build up good maps of change. Exceptions are then created for anything out of spec for one reason or another. Right now, these exceptions are still sent for human analysis to someone who works at the substation, who decides if and how to act on them. The robot is not yet able to determine that a component needs to be replaced just because there is a physical change. Rather, it alerts someone, who then goes out, carries out an inspection, and decides the action. If enough data is collected on similar instances and enough of these exceptions are found, a library of options can be built up. If enough tree branches fall into substations and can be identified, then the right action can also be safely identified. But if a component is reported out of spec because it's too hot, we can't yet replace that component as a direct outcome of the observation.
In urban environments, robots will need to make a lot more decisions. Presently, the highest value for these robots is in the substations because they’re distributed all over the country, often many hours’ drive away from each other. Having robotic eyes on assets, components, and unexpected events is invaluable in these situations. If something goes wrong, it could affect the services provided to entire neighborhoods or an industrial client. There is a direct correlation between component or power failure and revenue loss.
As a next step, this technology can be extended to other types of facilities around the world, assuming a well-defined workflow and an infrastructure that is currently still using human inspection and repairs. Once you identify the requirements, conditions, and actions based on what those people in that specific plant do daily, you can then put a robot into that space to make similar observations.
To date, a few different types of facilities have been spec’d out like offshore oil and gas pumping stations where it’s not always feasible to have a human being present doing relatively simple inspections. These are ideal constrained environments and easy to program for robots.
Ground robots and flying drones have the advantage that they can navigate in three-dimensional space. The alternative, of course, is installing surveillance cameras, with their known limitations, to cover these areas. With robots, you can change the perspective, either by moving to a new position or by changing the scale of the observations they make. At the moment, their use is limited to substations. Soon, we expect to use drones, robots, and mobile mapping for the power network itself: transmission and distribution lines, as well as substations and power generation. Imagine when a hurricane hits Florida and destroys parts of the infrastructure. If you're using automated mobile networks with drones and robots, you can have an immediate, scalable response. You can start with a small problem area and expand by many orders of magnitude to encompass all of the problems that are potentially affecting the network.
The further we go in understanding systematic changes and trying to manage these large-scale infrastructure systems, the more spatial and temporal scaling we'll have to do. Our focus should be gathering the information required to solve today's issues on the ground.
Have you just imagined a robot zooming around your offices looking for issues and you thought, "Yeah, why not?" Do you think robots are the future but not quite this way? Or do you see potential pitfalls with using robots in urban environments and for facilities management? I’d love to hear your opinions.
Geospatial standards, like any standard, are what we agree on as a community. It’s a way to describe how we model geospatial data, exchange it, subset it, process it, visualize it, or reference it. We need standards because we share and integrate data, and we solve complex problems.
Google Earth Engine is a cloud computing platform for scientific analysis and visualization of geospatial data sets. It is free to use for research, education, and nonprofit purposes. Google Earth Engine is essentially streaming data. You don't need to download the data — you just need a browser, and you can access the entire Google Earth Engine data catalog and a bunch of tools to do the analysis and visualization.